WorldWideScience

Sample records for reservoirs detection optimization

  1. Naturally fractured tight gas reservoir detection optimization

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-06-01

    Building upon the partitioning of the Greater Green River Basin (GGRB) conducted last quarter, the goals of this quarter's work were to conclude the evaluation of the Stratos well and the prototypical Green River Deep partition, and to perform the full resource evaluation of the Upper Cretaceous tight gas play, with the aim of defining target areas of enhanced natural fracturing. The work plan for the quarter of November 1-December 31, 1998 comprised three tasks: (1) evaluation of the Green River Deep partition and the Stratos well, and examination of the potential for expanding the use of E and P technology to low-permeability, naturally fractured gas reservoirs; (2) gas field studies; and (3) resource analysis of the balance of the partitions.

  2. Naturally fractured tight gas reservoir detection optimization. Annual report, September 1993--September 1994

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1994-10-01

    This report is an annual summary of ongoing research on modeling and detecting naturally fractured gas reservoirs. The current research is in the Piceance basin of western Colorado. The aim is to use existing information to determine the optimal zone or area of fracturing using a unique reaction-transport-mechanical (RTM) numerical basin model. The RTM model will then help map subsurface lateral and vertical fracture geometries. The data collection techniques include in-situ fracture data, remote sensing, aeromagnetics, 2-D seismic, and regional geologic interpretations. Once zones are identified, high-resolution airborne and spaceborne imagery will be used to verify the RTM model by comparison with surficial fractures. If this imagery agrees with the model data, a three-dimensional seismic survey component will be added to the investigation. This report presents an overview of the Piceance Creek basin and then reviews work in the Parachute and Rulison fields and the results of the RTM models in these fields.

  3. Oil Reservoir Production Optimization using Optimal Control

    DEFF Research Database (Denmark)

    Völcker, Carsten; Jørgensen, John Bagterp; Stenby, Erling Halfdan

    2011-01-01

    Practical oil reservoir management involves the solution of large-scale constrained optimal control problems. In this paper we present a numerical method for the solution of such problems. The method is a single-shooting method that computes the gradients using the adjoint method. We demonstrate the approach on an oil reservoir using water flooding and smart well technology. Compared to the uncontrolled case, the optimal operation increases the net present value of the oil field by 10%.
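
The single-shooting idea in the abstract can be sketched on a toy problem: simulate the system forward for a fixed control sequence, evaluate the discounted objective, and hand the whole simulation to an optimizer. The decline model, prices, and bounds below are invented stand-ins (the paper uses adjoint gradients and a full two-phase simulator, not finite differences on a scalar surrogate):

```python
import numpy as np
from scipy.optimize import minimize

# Toy single-shooting setup: a scalar surrogate "reservoir" whose oil rate
# declines with cumulative recovery; u[k] is the production effort per period.
def simulate(u, x0=1.0, dt=1.0):
    x, rates = x0, []
    for uk in u:
        rate = x * uk          # produced oil proportional to remaining oil * effort
        x = x - dt * rate      # deplete the state
        rates.append(rate)
    return np.array(rates)

def neg_npv(u, oil_price=60.0, op_cost=10.0, r=0.08):
    # Negative discounted profit (minimized => NPV maximized).
    rates = simulate(u)
    disc = (1 + r) ** -np.arange(1, len(u) + 1)
    return -np.sum(disc * (oil_price * rates - op_cost * u))

n = 12
res = minimize(neg_npv, x0=np.full(n, 0.1), bounds=[(0.0, 0.5)] * n,
               method="L-BFGS-B")
optimal_npv = -res.fun
```

The optimized schedule improves on the uncontrolled (constant-effort) baseline, which is the comparison the abstract reports.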

  4. Naturally fractured tight gas: Gas reservoir detection optimization. Quarterly report, January 1--March 31, 1997

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    Economically viable natural gas production from the low permeability Mesaverde Formation in the Piceance Basin, Colorado requires the presence of an intense set of open natural fractures. Establishing the regional presence and specific location of such natural fractures is the highest priority exploration goal in the Piceance and other western US tight, gas-centered basins. Recently, Advanced Resources International, Inc. (ARI) completed a field program at Rulison Field, Piceance Basin, to test and demonstrate the use of advanced seismic methods to locate and characterize natural fractures. This project began with a comprehensive review of the tectonic history, state of stress and fracture genesis of the basin. A high resolution aeromagnetic survey, interpreted satellite and SLAR imagery, and 400 line miles of 2-D seismic provided the foundation for the structural interpretation. The central feature of the program was the 4.5 square mile multi-azimuth 3-D seismic P-wave survey to locate natural fracture anomalies. The interpreted seismic attributes are being tested against a control data set of 27 wells. Additional wells are currently being drilled at Rulison, on close 40 acre spacings, to establish the productivity from the seismically observed fracture anomalies. A similar regional prospecting and seismic program is being considered for another part of the basin. The preliminary results indicate that detailed mapping of fault geometries and use of azimuthally defined seismic attributes exhibit close correlation with high productivity gas wells. The performance of the ten new wells, being drilled in the seismic grid in late 1996 and early 1997, will help demonstrate the reliability of this natural fracture detection and mapping technology.

  5. Production Optimization of Oil Reservoirs

    DEFF Research Database (Denmark)

    Völcker, Carsten

    with emphasis on optimal control of water flooding with the use of smart-well technology. We have implemented immiscible flow of water and oil in isothermal reservoirs with isotropic heterogeneous permeability fields. We use the method of lines for solution of the partial differential equation (PDE) system that governs the fluid flow. We discretize the two-phase flow model spatially using the finite volume method (FVM), and we use the two-point flux approximation (TPFA) and the single-point upstream (SPU) scheme for computing the fluxes. We propose a new formulation of the differential equation system that arises as a consequence of the spatial discretization of the two-phase flow model. Upon discretization in time, the proposed equation system ensures the mass-conserving property of the two-phase flow model. For the solution of the spatially discretized two-phase flow model, we develop mass-conserving explicit singly diagonally implicit Runge-Kutta methods...

  6. Optimizing detectability

    International Nuclear Information System (INIS)

    Anon.

    1992-01-01

    HPLC is useful for trace and ultratrace analyses of a variety of compounds. For most applications, HPLC is useful for determinations in the nanogram-to-microgram range; however, detection limits of a picogram or less have been demonstrated in certain cases. These determinations require state-of-the-art capability; several examples of such determinations are provided in this chapter. As mentioned before, to detect and/or analyze low quantities of a given analyte at submicrogram or ultratrace levels, it is necessary to optimize the whole separation system, including the quantity and type of sample, sample preparation, HPLC equipment, chromatographic conditions (including column), choice of detector, and quantitation techniques. A limited discussion is provided here for optimization based on theoretical considerations, chromatographic conditions, detector selection, and miscellaneous approaches to detectability optimization. 59 refs

  7. Application of Integrated Reservoir Management and Reservoir Characterization to Optimize Infill Drilling

    Energy Technology Data Exchange (ETDEWEB)

    P. K. Pande

    1998-10-29

    Initial drilling of wells on a uniform spacing, without regard to reservoir performance and characterization, must become a process of the past. Such efforts do not optimize reservoir development as they fail to account for the complex nature of reservoir heterogeneities present in many low permeability reservoirs, and carbonate reservoirs in particular. These reservoirs are typically characterized by:
    o Large, discontinuous pay intervals
    o Vertical and lateral changes in reservoir properties
    o Low reservoir energy
    o High residual oil saturation
    o Low recovery efficiency

  8. Reservoir Operating Rule Optimization for California's Sacramento Valley

    Directory of Open Access Journals (Sweden)

    Timothy Nelson

    2016-03-01

    doi: http://dx.doi.org/10.15447/sfews.2016v14iss1art6 Reservoir operating rules for water resource systems are typically developed by combining intuition, professional discussion, and simulation modeling. This paper describes a joint optimization-simulation approach to develop preliminary economically based operating rules for major reservoirs in California's Sacramento Valley, based on optimized results from CALVIN, a hydro-economic optimization model. We infer strategic operating rules from the optimization model results, including storage allocation rules to balance storage among multiple reservoirs, and reservoir release rules to determine monthly releases for individual reservoirs. Results show the potential utility of considering the previous year's water-availability type and various system and sub-system storage conditions, in addition to the usual consideration of local reservoir storage, season, and current inflows. We create a simple simulation to further refine and test the derived operating rules. Optimization model results offer particular insights for balancing the allocation of water storage among Shasta, Trinity, and Oroville reservoirs over drawdown and refill seasons, as well as some insights for release rules at major reservoirs in the Sacramento Valley. We also discuss the applicability and limitations of developing reservoir operation rules from optimization model results.

  9. Optimal Operation of Hydropower Reservoirs under Climate Change: The Case of Tekeze Reservoir, Eastern Nile

    Directory of Open Access Journals (Sweden)

    Fikru Fentaw Abera

    2018-03-01

    Optimal operation of reservoirs is essential for water resource planning and management, but it is challenging and complicated when dealing with climate change impacts. The objective of this paper was to assess existing and future hydropower operation at the Tekeze reservoir in the face of climate change. In this study, a calibrated and validated Soil and Water Assessment Tool (SWAT) model was used to simulate runoff inflow into the Tekeze hydropower reservoir under present and future climate scenarios. Inflow to the reservoir was simulated using hydro-climatic data from an ensemble of downscaled climate data based on the Coordinated Regional Climate Downscaling Experiment over the African domain (CORDEX-Africa) with Coupled Model Intercomparison Project Phase 5 (CMIP5) simulations under the Representative Concentration Pathway RCP4.5 and RCP8.5 climate scenarios. Observed and projected inflows to the Tekeze hydropower reservoir were used as input to the US Army Corps of Engineers' Reservoir Evaluation System Prescriptive Reservoir Model (HEC-ResPRM), a reservoir operation model, to optimize hydropower reservoir release, storage, and pool level. Results indicated that climate change has a clear impact on reservoir inflow, showing increases in annual and monthly inflow into the reservoir except in the dry months from May to June under the RCP4.5 and RCP8.5 climate scenarios. HEC-ResPRM optimal operation results showed an increase in Tekeze reservoir power storage potential of up to 25% and 30% under the RCP4.5 and RCP8.5 climate scenarios, respectively. This implies that Tekeze hydropower production will be affected by climate change. This analysis can be used by water resources planners and managers to develop reservoir operation techniques that consider climate change impacts to increase power production.

  10. Application of integrated reservoir management and reservoir characterization to optimize infill drilling, Class II

    Energy Technology Data Exchange (ETDEWEB)

    Bergeron, Jack; Blasingame, Tom; Doublet, Louis; Kelkar, Mohan; Freeman, George; Callard, Jeff; Moore, David; Davies, David; Vessell, Richard; Pregger, Brian; Dixon, Bill; Bezant, Bryce

    2000-03-16

    The major purpose of this project was to demonstrate the use of cost effective reservoir characterization and management tools that will be helpful to both independent and major operators for the optimal development of heterogeneous, low permeability carbonate reservoirs such as the North Robertson (Clearfork) Unit.

  11. Multiobjective Optimization Modeling Approach for Multipurpose Single Reservoir Operation

    Directory of Open Access Journals (Sweden)

    Iosvany Recio Villa

    2018-04-01

    The water resources planning and management discipline recognizes the importance of a reservoir's carryover storage. However, mathematical models for reservoir operation that include carryover storage are scarce. This paper presents a novel multiobjective optimization modeling framework that uses the ε-constraint method and genetic algorithms as optimization techniques for the operation of multipurpose single reservoirs, including carryover storage. The carryover storage was conceived by modifying Kritsky and Menkel's method for reservoir design at the operational stage. The main objective function minimizes the cost of the total annual water shortage for irrigation areas connected to a reservoir, while the secondary one maximizes its energy production. The model includes operational constraints for the reservoir, Kritsky and Menkel's method, irrigation areas, and the hydropower plant. The study is applied to the Carlos Manuel de Céspedes reservoir, establishing a 12-month planning horizon and an annual reliability of 75%. The results demonstrate the applicability of the model, yielding monthly releases from the reservoir that include the carryover storage, the degree of reservoir inflow regulation, water shortages in irrigation areas, and the energy generated by the hydroelectric plant. The main product is an operational graph that includes zones as well as rule and guide curves, which are used as triggers for long-term reservoir operation.
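
The ε-constraint technique named in the abstract can be illustrated on a toy two-objective problem: minimize the shortage objective while constraining the energy objective above a sweep of thresholds, which traces out Pareto-optimal points. The objectives and numbers below are invented for illustration, not the paper's reservoir model:

```python
import numpy as np
from scipy.optimize import minimize

# Choose a release r to minimize irrigation-shortage cost f1 while the
# energy surrogate f2 = k*r is constrained above a threshold eps.
demand, k = 8.0, 1.0
f1 = lambda r: (demand - r[0]) ** 2      # shortage cost (primary objective)
f2 = lambda r: k * r[0]                  # energy production (secondary objective)

pareto = []
for eps in [2.0, 6.0, 9.0]:              # sweep of epsilon thresholds
    res = minimize(f1, x0=[0.0], bounds=[(0.0, 10.0)],
                   constraints=[{"type": "ineq",
                                 "fun": lambda r, e=eps: f2(r) - e}])
    pareto.append((f1(res.x), f2(res.x)))
```

Loose thresholds leave the shortage minimum untouched; only a binding threshold (here eps = 9) forces a trade-off, which is exactly the Pareto information the method extracts.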

  12. Modeling Reservoir-River Networks in Support of Optimizing Seasonal-Scale Reservoir Operations

    Science.gov (United States)

    Villa, D. L.; Lowry, T. S.; Bier, A.; Barco, J.; Sun, A.

    2011-12-01

    HydroSCOPE (Hydropower Seasonal Concurrent Optimization of Power and the Environment) is a seasonal time-scale tool for scenario analysis and optimization of reservoir-river networks. Developed in MATLAB, HydroSCOPE is an object-oriented model that simulates basin-scale dynamics with an objective of optimizing reservoir operations to maximize revenue from power generation, reliability in the water supply, environmental performance, and flood control. HydroSCOPE is part of a larger toolset that is being developed through a Department of Energy multi-laboratory project. This project's goal is to provide conventional hydropower decision makers with better information to execute their day-ahead and seasonal operations and planning activities by integrating water balance and operational dynamics across a wide range of spatial and temporal scales. This presentation details the modeling approach and functionality of HydroSCOPE. HydroSCOPE consists of a river-reservoir network model and an optimization routine. The river-reservoir network model simulates the heat and water balance of river-reservoir networks for time-scales up to one year. The optimization routine software, DAKOTA (Design Analysis Kit for Optimization and Terascale Applications - dakota.sandia.gov), is seamlessly linked to the network model and is used to optimize daily volumetric releases from the reservoirs to best meet a set of user-defined constraints, such as maximizing revenue while minimizing environmental violations. The network model uses 1-D approximations for both the reservoirs and river reaches and is able to account for surface and sediment heat exchange as well as ice dynamics for both models. The reservoir model also accounts for inflow, density, and withdrawal zone mixing, and diffusive heat exchange. Routing for the river reaches is accomplished using a modified Muskingum-Cunge approach that automatically calculates the internal timestep and sub-reach lengths to match the conditions of
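
Routing of the kind HydroSCOPE performs can be illustrated with the classic Muskingum scheme, a simplified relative of the modified Muskingum-Cunge approach the abstract describes (Muskingum-Cunge additionally derives K and X from channel properties and, per the abstract, adapts the timestep and sub-reach lengths). The parameters and hydrograph below are invented:

```python
# Classic Muskingum channel routing: O[t] = c0*I[t] + c1*I[t-1] + c2*O[t-1],
# with coefficients derived from storage constant K, weighting X, and step dt.
def muskingum_route(inflow, K=2.0, X=0.2, dt=1.0):
    denom = K * (1 - X) + 0.5 * dt
    c0 = (0.5 * dt - K * X) / denom
    c1 = (0.5 * dt + K * X) / denom
    c2 = (K * (1 - X) - 0.5 * dt) / denom   # c0 + c1 + c2 == 1
    out = [inflow[0]]                        # assume initial steady state
    for t in range(1, len(inflow)):
        out.append(c0 * inflow[t] + c1 * inflow[t - 1] + c2 * out[-1])
    return out

hydrograph = [10, 10, 30, 60, 40, 20, 10, 10]
routed = muskingum_route(hydrograph)         # attenuated, delayed peak
```

The routed peak is lower and later than the inflow peak, the attenuation behavior a reservoir-river network model relies on.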

  13. Using Chemicals to Optimize Conformance Control in Fractured Reservoirs; TOPICAL

    International Nuclear Information System (INIS)

    Seright, Randall S.; Liang, Jenn-Tai; Schrader, Richard; Hagstrom II, John; Wang, Ying; Kumar, Ananad; Wavrik, Kathryn

    2001-01-01

    This report describes work performed during the third and final year of the project, Using Chemicals to Optimize Conformance Control in Fractured Reservoirs. This research project had three objectives. The first objective was to develop a capability to predict and optimize the ability of gels to reduce permeability to water more than permeability to oil or gas. The second objective was to develop procedures for optimizing blocking agent placement in wells where hydraulic fractures cause channeling problems. The third objective was to develop procedures to optimize blocking agent placement in naturally fractured reservoirs.

  14. Optimizing withdrawal from drinking water reservoirs to reduce downstream temperature pollution and reservoir hypoxia.

    Science.gov (United States)

    Weber, M; Rinke, K; Hipsey, M R; Boehrer, B

    2017-07-15

    Sustainable management of drinking water reservoirs requires balancing the demands of water supply whilst minimizing environmental impact. This study numerically simulates the effect of an improved withdrawal scheme designed to alleviate the temperature pollution downstream of a reservoir. The aim was to identify an optimal withdrawal strategy such that water of a desirable discharge temperature can be supplied downstream without leading to unacceptably low oxygen concentrations within the reservoir. First, we calibrated a one-dimensional numerical model for hydrodynamics and oxygen dynamics (GLM-AED2), verifying that the model reproduced water temperatures and hypolimnetic dissolved oxygen concentrations accurately over a 5 year period. Second, the model was extended to include an adaptive withdrawal functionality, allowing for a prescribed withdrawal temperature to be found, with the potential constraint of hypolimnetic oxygen concentration. Scenario simulations on epi-/metalimnetic withdrawal demonstrate that the model is able to autonomously determine the best withdrawal height depending on the thermal structure and the hypolimnetic oxygen concentration thereby optimizing the ability to supply a desirable discharge temperature to the downstream river during summer. This new withdrawal strategy also increased the hypolimnetic raw water volume to be used for drinking water supply, but reduced the dissolved oxygen concentrations in the deep and cold water layers (hypolimnion). Implications of the results for reservoir management are discussed and the numerical model is provided for operators as a simple and efficient tool for optimizing the withdrawal strategy within different reservoir contexts. Copyright © 2017 Elsevier Ltd. All rights reserved.
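
The core of the adaptive withdrawal logic, stripped of the hydrodynamic model, is a constrained selection: choose the withdrawal depth whose water temperature best matches the discharge target, excluding depths where dissolved oxygen falls below a threshold. The profiles and threshold below are invented placeholders for GLM-AED2 output:

```python
# Invented vertical profiles standing in for simulated lake state.
depths = [2, 5, 10, 15, 20]                # m below surface
temp   = [22.0, 18.5, 12.0, 8.0, 6.5]      # degC at each depth
oxygen = [9.0, 8.0, 6.0, 3.0, 1.5]         # mg/L dissolved oxygen at each depth

def best_withdrawal(target_temp, min_do=4.0):
    """Depth whose temperature is closest to target, subject to a DO floor."""
    candidates = [(abs(t - target_temp), d)
                  for d, t, o in zip(depths, temp, oxygen) if o >= min_do]
    return min(candidates)[1] if candidates else None

depth = best_withdrawal(target_temp=13.0)   # selects the 10 m outlet here
```

In the paper this selection is made adaptively at each timestep as the thermal structure and hypolimnetic oxygen evolve; the sketch shows only the decision rule itself.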

  15. The Improvement of Particle Swarm Optimization: a Case Study of Optimal Operation in Goupitan Reservoir

    Science.gov (United States)

    Li, Haichen; Qin, Tao; Wang, Weiping; Lei, Xiaohui; Wu, Wenhui

    2018-02-01

    Due to its weakness in maintaining diversity and reaching the global optimum, standard particle swarm optimization has not performed well in reservoir operation optimization. To solve this problem, this paper introduces the downhill simplex method to work together with standard particle swarm optimization. The application of this approach to optimal operation of Goupitan reservoir shows that the improved method achieves better accuracy and higher reliability with only a small additional investment.
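
The hybrid idea, a coarse global search by particle swarm followed by a downhill-simplex (Nelder-Mead) polish, can be sketched on a stand-in objective. The PSO parameters and the test function below are illustrative, not the Goupitan operation model:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def f(x):
    # Smooth stand-in objective with its minimum (value 1.0) at x = 1.3.
    return np.sum((x - 1.3) ** 2) + 1.0

def pso(f, dim=2, n=20, iters=50, lo=-5.0, hi=5.0):
    # Bare-bones particle swarm: inertia 0.7, cognitive/social weights 1.5.
    x = rng.uniform(lo, hi, (n, dim)); v = np.zeros((n, dim))
    pbest, pval = x.copy(), np.array([f(p) for p in x])
    g = pbest[pval.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n, dim))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(p) for p in x])
        improved = fx < pval
        pbest[improved], pval[improved] = x[improved], fx[improved]
        g = pbest[pval.argmin()].copy()
    return g

coarse = pso(f)                                          # global search
polished = minimize(f, coarse, method="Nelder-Mead").x   # simplex refinement
```

The simplex stage never does worse than the swarm's best point, which is the reliability gain the abstract reports.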

  16. Optimal reservoir operation policies using novel nested algorithms

    Science.gov (United States)

    Delipetrev, Blagoj; Jonoski, Andreja; Solomatine, Dimitri

    2015-04-01

    Historically, the two most widely practiced methods for optimal reservoir operation have been dynamic programming (DP) and stochastic dynamic programming (SDP). These two methods suffer from the so-called "dual curse", which prevents their use in reasonably complex water systems. The first is the "curse of dimensionality", denoting an exponential growth of computational complexity with the dimension of the state-decision space. The second is the "curse of modelling", which requires an explicit model of each component of the water system to anticipate the effect of each system transition. We address the problem of optimal reservoir operation concerning multiple objectives related to (1) reservoir releases to satisfy several downstream users competing for water with dynamically varying demands, (2) deviations from the target minimum and maximum reservoir water levels, and (3) hydropower production, which is a combination of the reservoir water level and the reservoir releases. Addressing such a problem with classical methods (DP and SDP) requires a reasonably high level of discretization of the reservoir storage volume, which, in combination with the release discretization required for meeting the demands of downstream users, leads to computationally expensive formulations and causes the curse of dimensionality. We present a novel approach, named "nested", that is implemented in DP, SDP and reinforcement learning (RL); correspondingly, three new algorithms are developed, named nested DP (nDP), nested SDP (nSDP) and nested RL (nRL). The nested algorithms are composed of two algorithms: (1) DP, SDP or RL and (2) a nested optimization algorithm. Depending on how we formulate the objective function related to deficits in the allocation problem in the nested optimization, two methods are implemented: (1) Simplex for linear allocation problems, and (2) the quadratic Knapsack method for nonlinear problems. The novel idea is to include the nested
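
The discretization burden the abstract calls the curse of dimensionality is easiest to see in a minimal deterministic DP over a discretized storage grid: the nested loops over states and feasible releases below are exactly what multiplies out of control as state and decision dimensions are added. Inflows, demand, and the benefit function are invented:

```python
import numpy as np

# Backward dynamic programming on a discretized storage grid.
S = np.arange(0, 5)            # feasible storage levels (discretized)
inflow = [2, 1, 3, 1]          # known inflow per stage
demand = 2

def benefit(release):
    return -abs(release - demand)      # penalize deviation from demand

T = len(inflow)
V = np.zeros(len(S))                   # terminal value function
policy = []
for t in reversed(range(T)):
    Vt = np.full(len(S), -np.inf)
    best = np.zeros(len(S), dtype=int)
    for i, s in enumerate(S):                      # loop over states...
        for r in range(0, s + inflow[t] + 1):      # ...and feasible releases
            s_next = min(s + inflow[t] - r, S[-1]) # spill above capacity
            val = benefit(r) + V[s_next]
            if val > Vt[i]:
                Vt[i], best[i] = val, r
    V = Vt
    policy.insert(0, best)             # optimal release for each state, stage
```

With enough inflow, the optimal policy simply meets the demand at every stage; the point of the sketch is the state-by-release enumeration, whose cost grows exponentially with problem dimension.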

  17. Quantifying the robustness of optimal reservoir operation for the Xinanjiang-Fuchunjiang Reservoir Cascade

    NARCIS (Netherlands)

    Vonk, E.; Xu, YuePing; Booij, Martijn J.; Augustijn, Dionysius C.M.

    2016-01-01

    In this research we investigate the robustness of the common implicit stochastic optimization (ISO) method for dam reoperation. As a case study, we focus on the Xinanjiang-Fuchunjiang reservoir cascade in eastern China, for which adapted operating rules were proposed as a means to reduce the impact

  18. Application of an expert system to optimize reservoir performance

    International Nuclear Information System (INIS)

    Gharbi, Ridha

    2005-01-01

    The main challenge of oil displacement by an injected fluid, as in Enhanced Oil Recovery (EOR) processes, is to reduce cost and improve reservoir performance. An optimization methodology, combined with an economic model, is implemented in an expert system to optimize the net present value of full-field development with an EOR process. The approach is automated and combines an economic package with existing numerical reservoir simulators to optimize the design of a selected EOR process using sensitivity analysis. The EOR expert system includes three stages of consultation: (1) select an appropriate EOR process on the basis of the reservoir characteristics, (2) prepare appropriate input data sets to design the selected EOR process using existing numerical simulators, and (3) apply discounted-cash-flow methods to the optimization of the selected EOR process to find out under what conditions, at current oil prices, the EOR process might be profitable. The project profitability measures were used as decision-making variables in an iterative approach to optimize the design of the EOR process. The economic analysis is based on the estimated recovery, residual oil in place, oil price, and operating costs. Two case studies are presented for two reservoirs that have already been produced to their economic limits and are potential candidates for surfactant/polymer flooding and carbon-dioxide flooding, respectively, or are otherwise subject to abandonment. The effect of several design parameters on the profitability of these EOR processes was investigated.

  19. NMPC for Oil Reservoir Production Optimization

    DEFF Research Database (Denmark)

    Völcker, Carsten; Jørgensen, John Bagterp; Thomsen, Per Grove

    2011-01-01

    this problem numerically using a single shooting sequential quadratic programming (SQP) based optimization method. Explicit singly diagonally implicit Runge-Kutta (ESDIRK) methods are used for integration of the stiff system of differential equations describing the two-phase flow, and the adjoint method...

  20. Optimization of Multipurpose Reservoir Systems Using Power Market Models

    DEFF Research Database (Denmark)

    Pereira-Cardenal, S. J.; Mo, B.; Riegels, N.

    2015-01-01

    optimal operation rules that maximize current and expected future benefits as a function of reservoir level, week of the year, and inflow state. The method was tested on the Iberian Peninsula and performed better than traditional approaches that use exogenous prices: resulting operation rules were more...

  1. A Study of the Optimal Planning Model for Reservoir Sustainable Management- A Case Study of Shihmen Reservoir

    Science.gov (United States)

    Chen, Y. Y.; Ho, C. C.; Chang, L. C.

    2017-12-01

    Reservoir management in Taiwan faces many challenges. Massive sediment loads caused by landslides are flushed into reservoirs, which decreases capacity, raises turbidity, and increases supply risk. The sediment is usually accompanied by nutrients that cause eutrophication. Moreover, unevenly distributed rainfall causes water supply instability. Hence, ensuring the sustainable use of reservoirs has become an important task in reservoir management. The purpose of this study is to develop an optimal planning model for reservoir sustainable management that finds optimal operating rules for reservoir flood control and sediment sluicing. The model applies genetic algorithms combined with artificial neural networks for hydraulic analysis and reservoir sediment movement. The main objectives of the operating rules in this study are to prevent reservoir outflow from causing downstream overflow, to minimize the gap between the initial and final water levels of the reservoir, and to maximize sediment sluicing efficiency. A case study of Shihmen reservoir was used to explore the differences between the optimal operating rules and the current operation of the reservoir. The results indicate the optimal operating rules tend to open the desilting tunnel early and extend its open duration during the flood discharge period. The results also show the sediment sluicing efficiency of the optimal operating rules is 36%, 44%, and 54% during Typhoon Jangmi, Typhoon Fung-Wong, and Typhoon Sinlaku, respectively. These results demonstrate that the optimal operating rules can extend the service life of Shihmen reservoir while protecting the safety of downstream areas. The study introduces a low-cost strategy, altering reservoir operation rules, into reservoir sustainable management as an alternative to pump dredging, in order to address the problem of reservoir sediment removal and its high cost.

  2. Rule Optimization monthly reservoir operation Salvajina

    International Nuclear Information System (INIS)

    Sandoval Garcia, Maria Clemencia; Santacruz Salazar, Santiago; Ramirez Callejas, Carlos A

    2007-01-01

    In the present study a model was designed for optimization of the monthly operating rule of the Salvajina dam (Colombia), based on dynamic programming. The model maximizes the benefits from electric power generation while ensuring flood regulation in winter and pollution relief during the summer. For the optimization of the operating rule, it was necessary to define the levels and volumes of reserve and holding required for control of flood zones along the Cauca river, to provide a minimum effluent flow, and to assure a daily flow at the Juanchito station (located 141 km downstream from the dam) on the Cauca river 90% of the time during the most critical summer periods.

  3. Optimizing Reservoir Operation to Adapt to the Climate Change

    Science.gov (United States)

    Madadgar, S.; Jung, I.; Moradkhani, H.

    2010-12-01

    Climate change and the resulting shifts in flood timing necessitate adapting the current rule curves developed for operating water reservoirs, so as to reduce the potential damage from either flood or drought events. This study attempts to optimize the current rule curves of Cougar Dam on the McKenzie River in Oregon under several possible 21st-century climate conditions. The objective is to minimize the failure of operations to meet either designated demands or the flood limit at a downstream checkpoint. A simulation/optimization model, comprising the standard operating policy and a global optimization method, tunes the current rule curve for 8 GCMs and 2 greenhouse gas emission scenarios. The Precipitation Runoff Modeling System (PRMS) is used as the hydrologic model to project streamflow for the period 2000-2100 using downscaled precipitation and temperature forcing from the 8 GCMs and two emission scenarios. An ensemble of rule curves, each associated with an individual scenario, is obtained by optimizing the reservoir operation. The simulation of reservoir operation, for all scenarios and for the expected value of the ensemble, is conducted, and performance is assessed using statistical indices including reliability, resilience, vulnerability, and sustainability.
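
The standard operating policy used inside such simulation/optimization models has a simple closed form: release the demand if storage plus inflow can cover it, otherwise release whatever is available, and spill anything above capacity. The numbers below are illustrative, not Cougar Dam data:

```python
# One timestep of the standard operating policy (SOP) for a single reservoir.
def sop_step(storage, inflow, demand, capacity):
    available = storage + inflow
    release = min(demand, available)                  # meet demand if possible
    spill = max(available - release - capacity, 0.0)  # excess above capacity
    storage = min(available - release, capacity)      # carry over the rest
    return release, storage, spill

s = 50.0
trace = []
for q in [30, 10, 5, 80]:                             # invented inflow sequence
    r, s, sp = sop_step(s, q, demand=40.0, capacity=100.0)
    trace.append((r, s, sp))
```

The third step shows the characteristic SOP failure mode, a deficit release of 15 when storage runs out, which is exactly the kind of event the reliability/vulnerability indices in the abstract count.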

  4. An intelligent agent for optimal river-reservoir system management

    Science.gov (United States)

    Rieker, Jeffrey D.; Labadie, John W.

    2012-09-01

    A generalized software package is presented for developing an intelligent agent for stochastic optimization of complex river-reservoir system management and operations. Reinforcement learning is an approach to artificial intelligence for developing a decision-making agent that learns the best operational policies without the need for explicit probabilistic models of hydrologic system behavior. The agent learns these strategies experientially in a Markov decision process through observational interaction with the environment and simulation of the river-reservoir system using well-calibrated models. The graphical user interface for the reinforcement learning process controller includes numerous learning method options and dynamic displays for visualizing the adaptive behavior of the agent. As a case study, the generalized reinforcement learning software is applied to developing an intelligent agent for optimal management of water stored in the Truckee river-reservoir system of California and Nevada for the purpose of streamflow augmentation for water quality enhancement. The intelligent agent successfully learns long-term reservoir operational policies that specifically focus on mitigating water temperature extremes during persistent drought periods that jeopardize the survival of threatened and endangered fish species.
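
The reinforcement-learning mechanism described above can be reduced to a bare tabular Q-learning loop on a one-dimensional storage grid; the dynamics, reward, and grid sizes below are invented stand-ins for the calibrated river-reservoir simulators the paper couples to the agent:

```python
import numpy as np

rng = np.random.default_rng(1)

# Tabular Q-learning: 5 storage levels x 3 release actions.
n_states, n_actions = 5, 3
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.2, 0.9, 0.1      # learning rate, discount, exploration

def step(s, a):
    # Invented environment: stochastic inflow, reward for hitting a
    # target release of 1 unit (stand-in for streamflow-augmentation goals).
    inflow = rng.integers(0, 2)
    s_next = int(np.clip(s + inflow - a, 0, n_states - 1))
    reward = -abs(a - 1)
    return s_next, reward

s = 2
for _ in range(5000):
    # epsilon-greedy action selection, then the Q-learning update.
    a = rng.integers(n_actions) if rng.random() < eps else int(Q[s].argmax())
    s_next, r = step(s, a)
    Q[s, a] += alpha * (r + gamma * Q[s_next].max() - Q[s, a])
    s = s_next

greedy = Q.argmax(axis=1)   # learned policy: best release per storage level
```

The agent learns the policy purely from sampled transitions, with no explicit probabilistic model of the inflows, which is the "model-free" property the abstract emphasizes.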

  5. Design and development of bio-inspired framework for reservoir operation optimization

    Science.gov (United States)

    Asvini, M. Sakthi; Amudha, T.

    2017-12-01

    Frameworks for optimal reservoir operation play an important role in the management of water resources and delivery of economic benefits. Effective utilization and conservation of water from reservoirs helps to manage water deficit periods. The main challenge in reservoir optimization is to design operating rules that can be used to inform real-time decisions on reservoir release. We develop a bio-inspired framework for the optimization of reservoir release to satisfy the diverse needs of various stakeholders. In this work, single-objective optimization and multiobjective optimization problems are formulated using an algorithm known as "strawberry optimization" and tested with actual reservoir data. Results indicate that well planned reservoir operations lead to efficient deployment of the reservoir water with the help of optimal release patterns.

  6. EXPLOITATION AND OPTIMIZATION OF RESERVOIR PERFORMANCE IN HUNTON FORMATION, OKLAHOMA

    Energy Technology Data Exchange (ETDEWEB)

    Mohan Kelkar

    2004-10-01

    West Carney field--one of the newest fields discovered in Oklahoma--exhibits many unique production characteristics. These characteristics include: (1) decreasing water-oil ratio; (2) decreasing gas-oil ratio followed by an increase; (3) poor prediction capability of the reserves based on the log data; and (4) low geological connectivity but high hydrodynamic connectivity. The purpose of this investigation is to understand the principal mechanisms affecting the production, and to propose methods by which we can extend the phenomenon to other fields with similar characteristics. In our experimental investigation section, we present data on surfactant injection in the near-wellbore region. We demonstrate that by injecting the surfactant, the relative permeability of water could be decreased and that of gas increased, which should result in improved gas recovery from the reservoir. Our geological analysis develops a detailed stratigraphic description of the reservoir: two new, previously unrecognized stratigraphic units are identified, and additional lithofacies are recognized in new core descriptions. Our engineering analysis has determined that well density is an important parameter in optimally producing Hunton reservoirs; 160 acres appears to be an optimal spacing. The reservoir pressure appears to decline over time; however, recovery per well is only weakly influenced by the pressure, indicating that additional opportunity to drill wells exists in relatively depleted fields. A simple material balance technique is developed to validate the recovery of gas, oil and water. This technique can be used to further extrapolate recoveries from other fields with similar field characteristics.
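    The report does not specify its material balance technique; a common gas-side analogue is the p/z straight-line material balance, sketched here with invented depletion data.

```python
def gas_in_place_pz(pressures_psia, z_factors, cum_prod_bscf):
    """Estimate original gas in place G from the p/z material balance:
    for a volumetric dry-gas tank, p/z declines linearly with cumulative
    production Gp and extrapolates to zero at Gp = G. Fits
    p/z = b + m * Gp by least squares and returns G = -b / m."""
    pz = [p / z for p, z in zip(pressures_psia, z_factors)]
    n = len(pz)
    mean_x = sum(cum_prod_bscf) / n
    mean_y = sum(pz) / n
    m = sum((x - mean_x) * (y - mean_y) for x, y in zip(cum_prod_bscf, pz)) / \
        sum((x - mean_x) ** 2 for x in cum_prod_bscf)
    b = mean_y - m * mean_x
    return -b / m

# synthetic survey: p/z falls from 5000 by 50 per Bscf produced, so G = 100 Bscf
g_bscf = gas_in_place_pz([4500.0, 4050.0, 3600.0], [0.9, 0.9, 0.9],
                         [0.0, 10.0, 20.0])
```

Comparing G estimated this way against volumetric estimates is one way such a technique can "validate" reported recoveries.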

  7. Hydroeconomic optimization of reservoir management under downstream water quality constraints

    DEFF Research Database (Denmark)

    Davidsen, Claus; Liu, Suxia; Mo, Xingguo

    2015-01-01

    water quantity and water quality management and minimizes the total costs over a planning period assuming stochastic future runoff. The outcome includes cost-optimal reservoir releases, groundwater pumping, water allocation, wastewater treatments and water curtailments. The optimization model uses......), and the resulting minimum dissolved oxygen (DO) concentration is computed with the Streeter-Phelps equation and constrained to match Chinese water quality targets. The baseline water scarcity and operational costs are estimated to 15.6. billion. CNY/year. Compliance to water quality grade III causes a relatively...

  8. Constructing optimized binary masks for reservoir computing with delay systems

    Science.gov (United States)

    Appeltant, Lennert; van der Sande, Guy; Danckaert, Jan; Fischer, Ingo

    2014-01-01

    Reservoir computing is a novel bio-inspired computing method capable of solving complex tasks in a computationally efficient way. It has recently been successfully implemented using delayed feedback systems, allowing the hardware complexity of brain-inspired computers to be reduced drastically. In this approach, the pre-processing procedure relies on the definition of a temporal mask which serves as a scaled time-multiplexing of the input. Originally, random masks were chosen, motivated by the random connectivity in reservoirs. This random generation can sometimes fail. Moreover, for hardware implementations random generation is not ideal due to its complexity and the requirement for trial and error. We outline a procedure to reliably construct an optimal mask pattern in terms of multipurpose performance, derived from the concept of maximum-length sequences. Not only does this ensure the creation of the shortest possible mask that leads to maximum variability in the reservoir states for the given reservoir, it also allows for an interpretation of the statistical significance of the provided training samples for the task at hand.
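    A maximum-length sequence of the kind referenced can be generated with a linear-feedback shift register; the 4-bit register and tap polynomial x^4 + x^3 + 1 below are a small illustrative choice, not the mask length used in the paper.

```python
def mls_mask(nbits=4, taps=(4, 3), seed=0b1000):
    """Binary maximum-length sequence from a Fibonacci LFSR. With a
    primitive tap polynomial, an nbits-stage register cycles through all
    2**nbits - 1 nonzero states, the longest possible period. The output
    bits are mapped to +/-1 mask weights."""
    state = seed
    seq = []
    for _ in range(2 ** nbits - 1):
        seq.append(1 if state & 1 else -1)
        fb = 0
        for t in taps:                      # feedback = XOR of the tapped stages
            fb ^= (state >> (nbits - t)) & 1
        state = (state >> 1) | (fb << (nbits - 1))
    return seq

mask = mls_mask()
```

A defining MLS property visible even at this size: the period-15 sequence is nearly balanced, with 8 entries of one sign and 7 of the other.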

  9. Multipurpose Water Reservoir Management: An Evolutionary Multiobjective Optimization Approach

    Directory of Open Access Journals (Sweden)

    Luís A. Scola

    2014-01-01

    The reservoirs that feed large hydropower plants should be managed so as to provide other uses for the water resources, including, for instance, flood control and avoidance, irrigation, and navigability of the rivers. This work presents an evolutionary multiobjective optimization approach to the study of multiple water usages in multiple interlinked reservoirs, including both power generation objectives and objectives not related to energy generation. The classical evolutionary algorithm NSGA-II is employed as the basic multiobjective optimization machinery, modified to cope with specific problem features. The case studies, which include the analysis of a problem involving a navigability objective on the river, are tailored to illustrate the usefulness of the data generated by the proposed methodology for decision-making in the operation planning of multiple reservoirs with multiple usages. It is shown that the generated data can even be used to determine the cost of any new usage of the water, in terms of the opportunity cost measured on the revenues from electric energy sales.
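    The ranking step at the heart of NSGA-II, fast non-dominated sorting, can be sketched as follows; the objective pairs are invented, standing in for tradeoffs such as power deficit versus flood risk.

```python
def non_dominated_sort(points):
    """Fast non-dominated sorting (the ranking step of NSGA-II) for
    minimization objectives; returns fronts as lists of point indices."""
    n = len(points)

    def dominates(a, b):
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    dom_set = [[j for j in range(n) if dominates(points[i], points[j])]
               for i in range(n)]                      # points each i dominates
    count = [sum(dominates(points[j], points[i]) for j in range(n))
             for i in range(n)]                        # how many dominate each i
    fronts, current = [], [i for i in range(n) if count[i] == 0]
    while current:
        fronts.append(current)
        nxt = []
        for i in current:
            for j in dom_set[i]:
                count[j] -= 1
                if count[j] == 0:                      # j only dominated by earlier fronts
                    nxt.append(j)
        current = nxt
    return fronts

# invented (power deficit, flood risk) pairs, both minimized
fronts = non_dominated_sort([(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)])
```

The first front is the Pareto-optimal set presented to the decision maker; later fronts are successively worse layers.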

  10. An adaptive robust optimization scheme for water-flooding optimization in oil reservoirs using residual analysis

    NARCIS (Netherlands)

    Siraj, M.M.; Van den Hof, P.M.J.; Jansen, J.D.

    2017-01-01

    Model-based dynamic optimization of the water-flooding process in oil reservoirs is a computationally complex problem and suffers from high levels of uncertainty. A traditional way of quantifying uncertainty in robust water-flooding optimization is by considering an ensemble of uncertain model

  11. Applicability and optimization of SAGD in eastern Venezuela reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Pina R, J.A.; Bashbush, J.L.; Fernandez, E.A. [Schlumberger, Caracas (Venezuela)

    2008-10-15

    Steam-assisted gravity drainage (SAGD) is one of the most effective enhanced oil recovery (EOR) methods. In Venezuela, a significant amount of heavy oil in place has been mapped, but limited areas have been developed. Suitable EOR methods need to be applied to extend the productive life of these reservoirs and increase their recovery factors. This paper presented and described an evaluation and stepwise optimization process for a steam-assisted gravity drainage (SAGD) project using a representative sector model from a field with fluid and reservoir characteristics from an eastern Venezuela formation. The purpose of the study was to understand the impact of key parameters in the process specific to the selected area and to understand the effects on the recovery factor in these reservoirs, which have previously produced with primary recovery mechanisms. The paper discussed a sensitivity analysis that was performed using thermal simulation. Thermal simulation and pressure-volume-temperature (PVT) analysis were described. Parameters that were analyzed included vertical well spacing, injection steam rate, well flowing pressure, and horizontal length of the well pair. The paper also presented a brief analysis of the effect on oil recovery of the angle of dip in the reservoir and the orientation of the well pair with regard to the direction of dip. A comparison between two- and three-pseudocomponent model results was also provided. The authors recommended that economic analyses should accompany the final optimization sequence, to incorporate financial and technical considerations in the selection and design of the SAGD pilot. 7 refs., 12 tabs., 18 figs.

  12. Naturally fractured tight gas reservoir detection optimization

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1999-04-30

    In March, work continued on characterizing probabilities for natural fracturing associated with the GGRB Upper Cretaceous tight gas plays. An assessment of structural complexity, based on potential-field data and remote sensing data, was completed, as was a resource estimate for the Frontier and Mesa Verde play. Further work determined threshold economics for the play based on limited current production in the Wamsutter Ridge area. These analyses culminated in a presentation at FETC on 24 March 1999, where quantified natural fracture domains, mapped on a partition basis, which establish ''sweet spot'' probability for natural fracturing, were reviewed. That presentation is reproduced here as Appendix 1. The work plan for the quarter of January 1, 1999--March 31, 1999 comprised five tasks: (1) Evaluation of the GGRB partitions for structural complexity that can be associated with natural fractures, (2) Continued resource analysis of the balance of the partitions to determine areas with higher relative gas richness, (3) Gas field studies, (4) Threshold resource economics to determine which partitions would be the most prospective, and (5) Examination of the area around the Table Rock 4H well.

  13. PDVSA Petrolera Sinovensa reservoir engineering project and optimization study

    Energy Technology Data Exchange (ETDEWEB)

    Campos, O. [PDVSA Petroleos de Venezuela SA, Caracas (Venezuela, Bolivarian Republic of). Petrolera Sinovensa; Patino, J. [Kizer Energy Inc., Katy, TX (United States); Chalifoux, G.V. [Petrospec Engineering Ltd., Edmonton, AB (Canada)

    2009-07-01

    This paper presented a development plan for an extra-heavy oil field in Venezuela's Orinoco belt involving cold heavy oil production (CHOP) as well as a thermal follow-up process to increase the ultimate recovery factor. A reservoir simulation model was used to model various reservoir formations in order to assess their oil recovery potential. Several thermal recovery processes were considered, such as steam assisted gravity drainage (SAGD), horizontal alternate steam drive (HASD), cyclic steam stimulation (CSS), horizontal continuous steam drive, and combined drive drainage (CDD). A geological static model and dynamic reservoir model were coupled for the well optimization evaluation. Production data were used to identify trends related to specific geological conditions. The study also examined methods of improving slotted liner designs and evaluated the use of electric heating as a means of improving CHOP performance. Results of the study showed that CDD offered the highest recovery rates as a follow-up to CHOP. The CDD process allowed for the use of existing wells drilled in the field. New horizontal wells will be placed between the existing wells. It was concluded that a CDD pilot should be implemented in order to prepare for a commercial implementation plan. 8 refs., 2 tabs., 14 figs.

  14. Gradient-based methods for production optimization of oil reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Suwartadi, Eka

    2012-07-01

    Production optimization for water flooding in the secondary phase of oil recovery is the main topic of this thesis. The emphasis has been on numerical optimization algorithms, tested on case examples using simple hypothetical oil reservoirs. Gradient-based optimization, which utilizes adjoint-based gradient computation, is used to solve the optimization problems. The first contribution of this thesis is to address output-constraint problems. These kinds of constraints are natural in production optimization; limiting total water production and water cut at producer wells are examples. To maintain the feasibility of an optimization solution, a Lagrangian barrier method is proposed to handle the output constraints. This method incorporates the output constraints into the objective function, thus avoiding additional computations for the constraint gradients (Jacobian) which may be detrimental to the efficiency of the adjoint method. The second contribution is the study of the use of second-order adjoint-gradient information for production optimization. To speed up the convergence rate of the optimization, one usually uses quasi-Newton approaches such as the BFGS and SR1 methods. These methods compute an approximation of the inverse of the Hessian matrix given the first-order gradient from the adjoint method, but may not give significant speedup if the Hessian is ill-conditioned. We have developed and implemented Hessian matrix computation using the adjoint method. Due to the high computational cost of the Newton method itself, we instead compute the Hessian-times-vector product, which is used in a conjugate gradient algorithm. Finally, the last contribution of this thesis is surrogate optimization for water flooding in the presence of output constraints. Two kinds of model order reduction techniques are applied to build surrogate models: proper orthogonal decomposition (POD) and the discrete empirical interpolation method (DEIM).
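    Of the two reduction techniques named, POD has a compact sketch: collect state snapshots as columns, take an SVD, and keep the leading modes by energy. The synthetic rank-2 snapshot matrix below is an assumption for illustration only.

```python
import numpy as np

def pod_basis(snapshots, energy=0.99):
    """POD via SVD: columns of `snapshots` are state vectors; keep the
    leading left singular vectors whose squared singular values capture
    the requested fraction of the total energy."""
    U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
    cum = np.cumsum(s ** 2) / np.sum(s ** 2)
    r = int(np.searchsorted(cum, energy)) + 1   # smallest r reaching the energy target
    return U[:, :r], s

# synthetic snapshots lying in a 2-D subspace of R^4 (illustrative only)
t = np.linspace(0.0, 2.0 * np.pi, 10, endpoint=False)
snaps = 3.0 * np.outer([1, 0, 0, 0], np.cos(t)) + np.outer([0, 1, 0, 0], np.sin(t))
basis, svals = pod_basis(snaps, energy=0.999)
```

Projecting the full-order reservoir state onto `basis` gives the reduced coordinates of the surrogate model; DEIM then handles the nonlinear terms, which plain POD leaves expensive.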

  15. Optimization of fracture length in gas/condensate reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Mohan, J.; Sharma, M.M.; Pope, G.A. [Society of Petroleum Engineers, Richardson, TX (United States); Texas Univ., Austin, TX (United States)]

    2006-07-01

    A common practice for improving the productivity of gas-condensate reservoirs is hydraulic fracturing. Two important variables that determine the effectiveness of hydraulic fractures are fracture length and fracture conductivity. Although there are no simple guidelines for the optimization of fracture length and the factors that affect it, it is preferable to have an optimum fracture length for a given proppant volume in order to maximize productivity. An optimization study was presented in which the fracture length that maximizes well productivity was estimated. An analytical expression that takes into account non-Darcy flow and condensate banking was derived. This paper also reviewed the hydraulic fracturing process and discussed previous simulation studies that investigated the effects of well spacing and fracture length on well productivity in low-permeability gas reservoirs. The compositional simulation study, results, and discussion were also presented. The analytical expression for optimum fracture length, the analytical expression with condensate dropout, and the equations for optimum fracture length with non-Darcy flow in the fracture were included in an appendix. The Computer Modeling Group's GEM simulator, an equation-of-state compositional simulator, was used in this study. It was concluded that for cases with non-Darcy flow, the optimum fracture lengths are lower than those obtained with Darcy flow. 18 refs., 5 tabs., 22 figs., 1 appendix.
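    The idea of an optimum length for a fixed proppant volume can be illustrated with a textbook unified-fracture-design style heuristic (this is not the paper's analytical expression): hold the propped volume fixed and choose the half-length so the dimensionless conductivity hits a target optimum, conventionally about CfD = 1.6 for Darcy flow. All input numbers are hypothetical.

```python
import math

def ufd_fracture_size(v_prop_ft3, h_ft, k_res_md, k_frac_md, cfd_opt=1.6):
    """For a fixed propped volume V (both wings, pay height h), pick
    half-length xf and width w so the dimensionless fracture conductivity
    CfD = (kf * w) / (k * xf) equals the target optimum.
    V = 2 * xf * w * h  =>  xf = sqrt(kf * V / (2 * k * h * CfD))."""
    xf = math.sqrt(k_frac_md * v_prop_ft3 / (2.0 * k_res_md * h_ft * cfd_opt))
    w = v_prop_ft3 / (2.0 * xf * h_ft)
    return xf, w

# hypothetical tight-gas inputs: 4000 ft3 proppant, 50 ft pay,
# 0.05 md reservoir, 60,000 md proppant pack
xf_darcy, w_darcy = ufd_fracture_size(4000.0, 50.0, 0.05, 60000.0, cfd_opt=1.6)
# non-Darcy flow raises the effective optimum CfD: shorter, wider fracture
xf_nondarcy, _ = ufd_fracture_size(4000.0, 50.0, 0.05, 60000.0, cfd_opt=3.0)
```

The second call mirrors the paper's qualitative conclusion: accounting for non-Darcy flow shifts the optimum toward shorter fracture lengths.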

  16. Optimization In Searching Daily Rule Curve At Mosul Regulating Reservoir, North Iraq Using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Thair M. Al-Taiee

    2013-05-01

    To obtain optimal operating rules for storage reservoirs, large numbers of simulation and optimization models have been developed over the past several decades, varying significantly in their mechanisms and applications. Rule curves are guidelines for long-term reservoir operation, and an efficient technique is required to find the optimal rule curves that can mitigate water shortage in long-term operation. A Genetic Algorithm (GA) technique, an optimization approach based on the mechanics of natural selection and derived from the theory of natural evolution, was applied to predict the daily rule curve of the Mosul regulating reservoir in Iraq. Recorded daily inflows, outflows, and water levels in the reservoir over 19 years (1986-1990 and 1994-2007) were used in the developed model for assessing optimal reservoir operation. The objective function minimizes the annual sum of squared deviations from the desired downstream release and the desired storage volume in the reservoir. The decision variables are the releases, storage volume, water level, and outlet (demand) from the reservoir. The results of the GA model agreed well with the actual rule curve and the designed rating curve of the reservoir. The simulated results show that GA-derived policies are promising and competitive and can be effectively used for daily reservoir operation, in addition to rational monthly operation and prediction of reservoir rating curves.
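    The objective described, the sum of squared deviations from desired release and desired storage, can be exercised with a minimal GA. The mass-balance model, genetic operators, and numbers below are illustrative assumptions, far coarser than a daily Mosul model.

```python
import random

def evolve_releases(inflows, demand, s_target, s_max, s0,
                    pop=30, gens=120, seed=1):
    """Toy GA for a release schedule: minimize the sum of squared
    deviations from the demanded release and the target storage, subject
    to the mass balance s[t+1] = clip(s[t] + inflow[t] - release[t])."""
    rng = random.Random(seed)
    T = len(inflows)

    def cost(rel):
        s, c = s0, 0.0
        for t in range(T):
            r = min(rel[t], s + inflows[t])       # cannot release more than available
            s = min(s_max, s + inflows[t] - r)    # spill above capacity
            c += (r - demand[t]) ** 2 + (s - s_target) ** 2
        return c

    popl = [[rng.uniform(0.0, 2.0 * demand[t]) for t in range(T)]
            for _ in range(pop)]
    for _ in range(gens):
        popl.sort(key=cost)
        elite = popl[: pop // 2]                  # elitist selection
        children = []
        for _ in range(pop - len(elite)):
            a, b = rng.sample(elite, 2)           # arithmetic crossover + mutation
            children.append([max(0.0, (x + y) / 2.0 + rng.gauss(0.0, 0.1))
                             for x, y in zip(a, b)])
        popl = elite + children
    best = min(popl, key=cost)
    return best, cost(best)

best, c = evolve_releases(inflows=[2, 2, 2, 2], demand=[2, 2, 2, 2],
                          s_target=5, s_max=10, s0=5)
```

With inflow matching demand and storage starting on target, the ideal schedule releases the demand each step at zero cost, so the GA should drive the objective close to zero.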

  17. Nested algorithms for optimal reservoir operation and their embedding in a decision support platform

    NARCIS (Netherlands)

    Delipetrev, B.

    2016-01-01

    Reservoir operation is a multi-objective optimization problem traditionally solved with dynamic programming (DP) and stochastic dynamic programming (SDP) algorithms. The thesis presents novel algorithms for optimal reservoir operation named nested DP (nDP), nested SDP (nSDP), nested reinforcement

  18. Optimal nonlinear information processing capacity in delay-based reservoir computers

    Science.gov (United States)

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2015-09-01

    Reservoir computing is a recently introduced brain-inspired machine learning paradigm capable of excellent performance in the processing of empirical data. We focus on a particular kind of time-delay-based reservoir computer that has been physically implemented using optical and electronic systems and has shown unprecedented data processing rates. Reservoir computing is well known for the ease of its training scheme, but also for the problematic sensitivity of its performance to architecture parameters. This article addresses the reservoir design problem, which remains the biggest challenge to the applicability of this information processing scheme. More specifically, we use the information available regarding the optimal reservoir working regimes to construct a functional link between the reservoir parameters and its performance. This function is used to explore various properties of the device and to choose the optimal reservoir architecture, thus replacing the tedious and time-consuming parameter scans used so far in the literature.

  19. Optimal dynamic detection of explosives

    Energy Technology Data Exchange (ETDEWEB)

    Moore, David Steven [Los Alamos National Laboratory; Mcgrane, Shawn D [Los Alamos National Laboratory; Greenfield, Margo T [Los Alamos National Laboratory; Scharff, R J [Los Alamos National Laboratory; Rabitz, Herschel A [PRINCETON UNIV; Roslund, J [PRINCETON UNIV

    2009-01-01

    The detection of explosives is a notoriously difficult problem, especially at stand-off distances, due to their (generally) low vapor pressure, environmental and matrix interferences, and packaging. We are exploring optimal dynamic detection to exploit the best capabilities of recent advances in laser technology and recent discoveries in optimal shaping of laser pulses for control of molecular processes to significantly enhance the standoff detection of explosives. The core of the ODD-Ex technique is the introduction of optimally shaped laser pulses to simultaneously enhance sensitivity of explosives signatures while reducing the influence of noise and the signals from background interferents in the field (increase selectivity). These goals are being addressed by operating in an optimal nonlinear fashion, typically with a single shaped laser pulse inherently containing within it coherently locked control and probe sub-pulses. With sufficient bandwidth, the technique is capable of intrinsically providing orthogonal broad spectral information for data fusion, all from a single optimal pulse.

  20. Optimizing Wind And Hydropower Generation Within Realistic Reservoir Operating Policy

    Science.gov (United States)

    Magee, T. M.; Clement, M. A.; Zagona, E. A.

    2012-12-01

    Previous studies have evaluated the benefits of utilizing the flexibility of hydropower systems to balance the variability and uncertainty of wind generation. However, previous hydropower and wind coordination studies have simplified non-power constraints on reservoir systems. For example, some studies have only included hydropower constraints on minimum and maximum storage volumes and minimum and maximum plant discharges. The methodology presented here utilizes the pre-emptive linear goal programming optimization solver in RiverWare to model hydropower operations with a set of prioritized policy constraints and objectives based on realistic policies that govern the operation of actual hydropower systems, including licensing constraints, environmental constraints, water management and power objectives. This approach accounts for the fact that not all policy constraints are of equal importance. For example, target environmental flow levels may not be satisfied if doing so would require violating license minimum or maximum storages (pool elevations), but environmental flow constraints will be satisfied before optimizing power generation. Additionally, this work not only models the economic value of energy from the combined hydropower and wind system, it also captures the economic value of ancillary services provided by the hydropower resources. It is recognized that the increased variability and uncertainty inherent with increased wind penetration levels requires an increase in ancillary services. In regions with liberalized markets for ancillary services, a significant portion of hydropower revenue can result from providing ancillary services. Thus, ancillary services should be accounted for when determining the total value of a hydropower system integrated with wind generation. This research shows that the end value of integrated hydropower and wind generation is dependent on a number of factors that can vary by location. Wind factors include wind penetration level

  1. Optimization of Multipurpose Reservoir Operation with Application of the Particle Swarm Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Elahe Fallah Mehdipour

    2012-12-01

    Optimal operation of multipurpose reservoirs is a complex and sometimes nonlinear multi-objective optimization problem. Evolutionary algorithms are optimization tools that search the decision space by simulating natural biological evolution and present a set of points as the optimum solutions of the problem. This research considers the application of multi-objective particle swarm optimization (MOPSO) to the optimal operation of the Bazoft reservoir with different objectives, including hydropower generation, supplying downstream demands (drinking, industry, and agriculture), recreation, and flood control. Solution sets of the MOPSO algorithm for pairwise combinations of objectives were first compared with compromise programming (CP) under different weighting and power coefficients: in all combinations of objectives, the MOPSO algorithm was more capable than CP of finding solutions with an appropriate distribution, and these solutions dominated the CP solutions. The end points of the MOPSO solution sets were then compared with nonlinear programming (NLP) results. Results showed that the MOPSO algorithm, differing from the NLP results by only 0.3 percent, has more capability to present optimum solutions at the end points of the solution set.
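    The single-objective core underlying MOPSO is plain particle swarm optimization, sketched below on an invented two-variable quadratic standing in for a release-scheduling objective; the swarm parameters are common textbook defaults, not the paper's settings.

```python
import random

def pso_minimize(f, bounds, n_particles=20, iters=200,
                 w=0.7, c1=1.5, c2=1.5, seed=3):
    """Minimal PSO: each particle keeps a personal best, the swarm keeps
    a global best, and velocities blend inertia with random pulls toward
    both bests (the mechanics MOPSO extends with an external archive)."""
    rng = random.Random(seed)
    dim = len(bounds)
    X = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]                      # personal bests
    pbest = [f(x) for x in X]
    g = min(range(n_particles), key=lambda i: pbest[i])
    G, gbest = P[g][:], pbest[g]               # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                V[i][d] = (w * V[i][d]
                           + c1 * rng.random() * (P[i][d] - X[i][d])
                           + c2 * rng.random() * (G[d] - X[i][d]))
                X[i][d] = min(max(X[i][d] + V[i][d], bounds[d][0]), bounds[d][1])
            fx = f(X[i])
            if fx < pbest[i]:
                P[i], pbest[i] = X[i][:], fx
                if fx < gbest:
                    G, gbest = X[i][:], fx
    return G, gbest

# invented quadratic: penalize missing a release target of 3 and a storage target of 7
G, val = pso_minimize(lambda x: (x[0] - 3.0) ** 2 + (x[1] - 7.0) ** 2,
                      bounds=[(0.0, 10.0), (0.0, 10.0)])
```

MOPSO replaces the single global best with a Pareto archive and selects leaders from it, but the velocity update above is unchanged.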

  2. Geomechanical production optimization in faulted and fractured reservoirs

    NARCIS (Netherlands)

    Heege, J.H. ter; Pizzocolo, F.; Osinga, S.; Veer, E.F. van der

    2016-01-01

    Faults and fractures in hydrocarbon reservoirs are key to some major production issues including (1) varying productivity of different well sections due to intersection of preferential flow paths with the wellbore, (2) varying hydrocarbon column heights in different reservoir compartments due to

  3. Optimization of well placement in geothermal reservoirs using artificial intelligence

    Science.gov (United States)

    Akın, Serhat; Kok, Mustafa V.; Uraz, Irtek

    2010-06-01

    This research proposes a framework for determining the optimum location of an injection well using an inference method, artificial neural networks, and a search algorithm to create a search space and locate the global maxima. The production history of a complex carbonate geothermal reservoir (Kizildere Geothermal field, Turkey) is used to evaluate the proposed framework. Neural networks are used as a tool to replicate the behavior of commercial simulators by capturing the response of the field given a limited number of parameters such as temperature, pressure, injection location, and injection flow rate. A study on different network designs indicates that a combination of a neural network and an optimization algorithm (explicit search with variable stepping) to capture local maxima can be used to locate a region or a location for optimum well placement. Results also indicate shortcomings and possible pitfalls associated with the approach. With the flexibility of the proposed workflow, it is possible to incorporate various parameters including injection flow rate, temperature, and location. For the field of study, the optimum injection well location is found to be in the southeastern part of the field. Specific locations resulting from the workflow indicated a consistent search space, having higher values in that particular region. When studied with fixed flow rates (2500 and 4911 m3/day), a search run through the whole field located two locations in the very same region, resulting in consistent predictions. A further study incorporating the effect of different flow rates indicates that the algorithm can be run in a particular region of interest and that different flow rates may yield different locations; this analysis resulted in a new location in the same region and an optimum injection rate of 4000 m3/day. It is observed that use of a neural network as a proxy to the numerical simulator is viable for narrowing down or locating the area of interest for

  4. Management Optimization of Saguling Reservoir with Bellman Dynamic Programming and “Du Couloir” Iterative Method

    Directory of Open Access Journals (Sweden)

    Mariana Marselina

    2016-08-01

    The increasing growth of population and industry has led to rising demand for electrical energy. One of the electricity providers in the Java-Madura-Bali (Jamali) area is Saguling Reservoir, one of the three reservoirs that stem the flow of the Citarum River upstream of the Jatiluhur and Cirata Reservoirs. The average electricity production of Saguling Reservoir was 2,334,318.138 MWh/year in the period 1986-2014. The water intake of Saguling Reservoir is the upper Citarum watershed, with an area of 2340.88 km2, which also serves irrigation, inland fisheries, recreation, and other activities. One way to improve the function of Saguling Reservoir in producing electrical energy is to optimize its management. The optimization of Saguling Reservoir management in this study refers to Government Regulation No. 37/2010 on Dams/Reservoirs, Article 44, which states that the reservoir management system consists of operation systems for dry, normal, and wet years. In this research, trajectory guidelines for Saguling operation were determined separately for dry, normal, and wet years, based on the electricity price of turbine inflow, which varies from month to month. The guidelines were derived using Bellman Dynamic Programming (PD Bellman) and the "Du Couloir" iterative method, with the objective of maximizing the gain from electricity production. The "Du Couloir" method is a development of PD Bellman that can compute the gain effectively with a discretization as small as 0.1 million m3, whereas PD Bellman only calculates down to 10 million m3. The smaller discretization gives the maximum benefit from electricity production and brings the trajectory guideline closer to the actual trajectory, so that the optimization of Saguling operation is achieved.
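    The backward Bellman recursion underlying PD Bellman can be sketched on a coarse storage grid. The three-period inflow and price numbers are invented; the "Du Couloir" corridor refinement, which restricts each sweep to a narrow storage band around the previous trajectory while the discretization is refined, is omitted here.

```python
def bellman_reservoir(inflows, prices, s_max, release_max, s0):
    """Backward dynamic programming on an integer storage grid:
    V[t][s] = max over releases r of price[t]*r + V[t+1][s'], with
    s' = clip(s + inflow - r) and zero terminal value. A forward pass
    then recovers the optimal release trajectory from s0."""
    T = len(inflows)
    V = [[0.0] * (s_max + 1) for _ in range(T + 1)]
    best = [[0] * (s_max + 1) for _ in range(T)]
    for t in range(T - 1, -1, -1):
        for s in range(s_max + 1):
            for r in range(min(release_max, s + inflows[t]) + 1):
                s_next = min(s_max, s + inflows[t] - r)   # spill above capacity
                val = prices[t] * r + V[t + 1][s_next]
                if val > V[t][s]:
                    V[t][s], best[t][s] = val, r
    s, plan = s0, []
    for t in range(T):
        r = best[t][s]
        plan.append(r)
        s = min(s_max, s + inflows[t] - r)
    return plan, V[0][s0]

# invented data: the t=1 price spike rewards holding some water back early
plan, value = bellman_reservoir(inflows=[2, 2, 2], prices=[1, 3, 2],
                                s_max=4, release_max=3, s0=2)
```

On this toy instance the optimum releases 2 in the cheap first period, then the turbine cap of 3 in the two better-priced periods, for a value of 17.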

  5. SIMULATION AND OPTIMIZATION OF THE HYDRAULIC FRACTURING OPERATION IN A HEAVY OIL RESERVOIR IN SOUTHERN IRAN

    Directory of Open Access Journals (Sweden)

    REZA MASOOMI

    2017-01-01

    Oil recovery from some Iranian reservoirs is unsatisfactory because of high oil viscosity or formation permeability reduced by asphaltene precipitation and other problems. Hydraulic fracturing increases production in viscous-oil reservoirs where production rates are low, which makes it very important for Iranian reservoirs with these characteristics. In this study, hydraulic fracturing has been compositionally simulated in a heavy oil reservoir in southern Iran, considering the fracture half-length, the propagation direction of the cracks, and the depth of fracturing. The aim of this study is to find the scenario with the highest recovery factor in this oil reservoir; for this purpose, the length, propagation direction, and depth of fracturing have been optimized. Cumulative oil production has been evaluated with the compositional simulation for the next 10 years in this reservoir. At the end of the paper, the increase in the final production of this oil reservoir caused by optimized hydraulic fracturing is also evaluated.

  6. OPTIMIZATION OF INFILL DRILLING IN NATURALLY-FRACTURED TIGHT-GAS RESERVOIRS

    Energy Technology Data Exchange (ETDEWEB)

    Lawrence W. Teufel; Her-Yuan Chen; Thomas W. Engler; Bruce Hart

    2004-05-01

    A major goal of industry and the U.S. Department of Energy (DOE) fossil energy program is to increase gas reserves in tight-gas reservoirs. Infill drilling and hydraulic fracture stimulation in these reservoirs are important reservoir management strategies to increase production and reserves. Phase II of this DOE/cooperative industry project focused on optimization of infill drilling and evaluation of hydraulic fracturing in naturally-fractured tight-gas reservoirs. The cooperative project involved multidisciplinary reservoir characterization and simulation studies to determine infill well potential in the Mesaverde and Dakota sandstone formations at selected areas in the San Juan Basin of northwestern New Mexico. This work used the methodology and approach developed in Phase I. Integrated reservoir description and hydraulic fracture treatment analyses were also conducted in the Pecos Slope Abo tight-gas reservoir in southeastern New Mexico and the Lewis Shale in the San Juan Basin. This study has demonstrated a methodology to (1) describe reservoir heterogeneities and natural fracture systems, (2) determine reservoir permeability and permeability anisotropy, (3) define the elliptical drainage area and recoverable gas for existing wells, (4) determine the optimal location and number of new in-fill wells to maximize economic recovery, (5) forecast the increase in total cumulative gas production from infill drilling, and (6) evaluate hydraulic fracture simulation treatments and their impact on well drainage area and infill well potential. Industry partners during the course of this five-year project included BP, Burlington Resources, ConocoPhillips, and Williams.

  7. EXPLOITATION AND OPTIMIZATION OF RESERVOIR PERFORMANCE IN HUNTON FORMATION, OKLAHOMA

    Energy Technology Data Exchange (ETDEWEB)

    Mohan Kelkar

    2002-03-31

    The West Carney Field in Lincoln County, Oklahoma is one of the few newly discovered oil fields in Oklahoma. Although profitable, the field exhibits several unusual characteristics, including decreasing water-oil ratios, decreasing gas-oil ratios, decreasing bottomhole pressures during shut-ins in some wells, and transient water-production behavior in many wells. This report explains these unusual characteristics based on detailed geological and engineering analyses. We propose a geological history that explains the presence of mobile water and oil in the reservoir; the combination of matrix and fractures in the reservoir explains its flow behavior. We confirm our hypothesis by matching observed performance with a simulation model, and we develop procedures for correlating core data to log data so that the analysis can be extended to other, similar fields where core coverage may be limited.

  8. EXPLOITATION AND OPTIMIZATION OF RESERVOIR PERFORMANCE IN HUNTON FORMATION, OKLAHOMA

    Energy Technology Data Exchange (ETDEWEB)

    Mohan Kelkar

    2005-02-01

    The Hunton formation in Oklahoma has displayed some unique production characteristics. These include high initial water-oil and gas-oil ratios, decline in those ratios over time, and a temporary increase in gas-oil ratio during pressure build-up. The formation also displays highly complex geology, but surprising hydrodynamic continuity. This report addresses three key issues related specifically to the West Carney Hunton field and, in general, to any other Hunton formation exhibiting similar behavior: (1) What is the primary mechanism by which oil and gas are produced from the field? (2) How can the knowledge gained from studying the existing fields be extended to other fields with the potential to produce? (3) What can be done to improve the performance of this reservoir? We have developed a comprehensive model to explain the behavior of the reservoir. By using available production, geological, core and log data, we are able to develop a reservoir model that explains the production behavior in the reservoir. Using easily available information, such as log data, we have established the parameters needed for a field to be economically successful. We provide guidelines on what to look for in a new field and how to develop it. Finally, through laboratory experiments, we show that surfactants can be used to improve hydrocarbon recovery from the field. In addition, injection of CO{sub 2} or natural gas will also help recover additional oil from the field.

  9. Multiple shooting applied to robust reservoir control optimization including output constraints on coherent risk measures

    DEFF Research Database (Denmark)

    Codas, Andrés; Hanssen, Kristian G.; Foss, Bjarne

    2017-01-01

    The production life of oil reservoirs starts under significant uncertainty regarding the actual economic return of the recovery process due to the lack of oil field data. Consequently, investors and operators make management decisions based on a limited and uncertain description of the reservoir.... In this work, we propose a new formulation for robust optimization of reservoir well controls. It is inspired by the multiple shooting (MS) method, which permits a broad range of parallelization opportunities and output constraint handling. This formulation exploits coherent risk measures, a concept....

  10. Optimal model of radiocarbon residence time in exchange reservoir

    International Nuclear Information System (INIS)

    Dergachev, V.A.

    1977-01-01

    Radiocarbon content variations in the Earth's atmosphere were studied using a mathematical model. The so-called exchange reservoir was considered as consisting of layers, and the radiocarbon exchange rate at the interfaces between these layers was assumed to be constant. The process of 14C mixing and exchange in a dynamic system is described by a system of nonhomogeneous first-order differential equations. The model also accounts for the change in the rate of radiocarbon formation in the Earth's atmosphere due to cosmic and geophysical effects (solar activity, solar cycle, etc.). (J.P.)
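    The layered exchange-reservoir model described above reduces, in its simplest form, to a small system of coupled first-order ODEs. The sketch below integrates a two-box version (atmosphere and one mixed layer) by forward Euler; the exchange rates are illustrative assumptions, not Dergachev's calibrated values:

```python
import math

# Two-box sketch: atmosphere (n1) and a mixed layer (n2) exchange 14C
# at constant rates, with production q in the atmosphere and radioactive
# decay in both boxes. Rate constants are assumed for illustration.
LAMBDA = math.log(2) / 5730.0      # 14C decay constant, 1/yr
K12, K21 = 1.0 / 8.0, 1.0 / 80.0   # exchange rates, 1/yr (assumed)

def step(n1, n2, q, dt):
    """One forward-Euler step of the coupled first-order ODEs:
       dn1/dt = q - (K12 + LAMBDA)*n1 + K21*n2
       dn2/dt = K12*n1 - (K21 + LAMBDA)*n2
    """
    d1 = q - (K12 + LAMBDA) * n1 + K21 * n2
    d2 = K12 * n1 - (K21 + LAMBDA) * n2
    return n1 + dt * d1, n2 + dt * d2

# Relax toward steady state under constant production q = 1:
n1, n2 = 0.0, 0.0
for _ in range(100000):
    n1, n2 = step(n1, n2, 1.0, 1.0)
```

At steady state, total inventory balances production against decay, i.e. LAMBDA * (n1 + n2) ≈ q; a time-varying q (solar modulation) then drives the atmospheric 14C variations the paper models.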

  11. Oil Reservoir Production Optimization using Single Shooting and ESDIRK Methods

    DEFF Research Database (Denmark)

    Capolei, Andrea; Völcker, Carsten; Frydendall, Jan

    2012-01-01

    the injections and oil production such that flow is uniform in a given geological structure. Even in the case of conventional water flooding, feedback based optimal control technologies may enable higher oil recovery than with conventional operational strategies. The optimal control problems that must be solved...

  12. Hedging Rules for Water Supply Reservoir Based on the Model of Simulation and Optimization

    Directory of Open Access Journals (Sweden)

    Yi Ji

    2016-06-01

    Full Text Available This study proposes a hedging rule model composed of a two-period reservoir operation model considering damage depth and a hedging-rule parameter optimization model. The former solves the hedging rules based on a given period's water supply weighting factor and carryover storage target, while the latter optimizes the weighting factor and carryover storage target based on the hedging rules. The coupled model gives the optimal period's water supply weighting factor and carryover storage target to guide releases. The conclusions of this study are as follows: (1) the water supply weighting factor and carryover storage target have a direct impact on the three elements of the hedging rule; (2) after optimization by the simulation and optimization model, the parameters can guide reservoirs to supply water reasonably; and (3) to verify the utility of the hedging rule, the Heiquan reservoir is used as a case study, and a particle swarm optimization algorithm with a simulation model is adopted to optimize the parameters. The results show that the proposed hedging rule can improve the operational performance of the water supply reservoir.
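    The core of a hedging rule of the kind optimized above can be sketched as a simple release function: when available water falls short of current demand plus the carryover storage target, the deficit is rationed between the two according to the weighting factor. This is an illustrative form with assumed variable names, not the paper's exact equations:

```python
def hedged_release(storage, inflow, demand, carryover_target, w):
    """Two-period hedging sketch: available water is split between the
    current period's supply and carryover storage. The weighting factor
    w (0..1) expresses the priority of current supply over the carryover
    target; in the paper both parameters are optimized (e.g. by PSO)."""
    available = storage + inflow
    if available >= demand + carryover_target:
        return float(demand)               # no shortage: full supply
    # Hedge: ration the deficit between supply and carryover
    # in proportion to the weighting factor.
    shortage = demand + carryover_target - available
    release = demand - (1.0 - w) * shortage
    return max(0.0, min(release, available))

# Example: 80 units available against a demand of 60 plus a carryover
# target of 40; with w = 0.5 half the 20-unit deficit hits supply.
r = hedged_release(storage=50.0, inflow=30.0, demand=60.0,
                   carryover_target=40.0, w=0.5)  # -> 50.0
```

Rationing early (rather than delivering in full until storage empties) trades a small certain shortage now against a severe shortage later, which is exactly the "damage depth" logic the abstract describes.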

  13. Optimal Reoperation of Multi-Reservoirs for Integrated Watershed Management with Multiple Benefits

    Directory of Open Access Journals (Sweden)

    Xinyi Xu

    2014-04-01

    Full Text Available Constructing reservoirs can make more efficient use of water resources for human society. However, the negative impacts of these projects on the environment are often ignored. Optimal reoperation of reservoirs, which considers not only socio-economic values but also environmental benefits, is increasingly important. A model of optimal reoperation of multi-reservoirs for integrated watershed management with multiple benefits was proposed to alleviate the conflict between water use and environmental deterioration. The social, economic, water quality and ecological benefits were respectively taken into account as the scheduling objectives and quantified according to economic models. River minimum ecological flows and reservoir water levels based on flood control were taken as key constraints. Feasible search discrete differential dynamic programming (FS-DDDP) was used to run the model. The proposed model was applied to the upper reaches of the Nanpan River to quantitatively evaluate the difference between optimal reoperation and routine operation. The results indicated that reoperation could significantly increase the water quality benefit and have a minor effect on the benefits of power generation and irrigation under different hydrological years. The model can be readily adapted to other multi-reservoir systems for water resources management.

  14. Optimization of conventional rule curves coupled with hedging rules for reservoir operation

    DEFF Research Database (Denmark)

    Taghian, Mehrdad; Rosbjerg, Dan; Haghighi, Ali

    2014-01-01

    As a common approach to reservoir operating policies, water levels at the end of each time interval should be kept at or above the rule curve. In this study, the policy is captured using rationing of the target yield to reduce the intensity of severe water shortages. For this purpose, a hybrid...... to achieve the optimal water allocation and the target storage levels for reservoirs. As a case study, a multipurpose, multireservoir system in southern Iran is selected. The results show that the model has good performance in extracting the optimum policy for reservoir operation under both normal...... model is developed to optimize simultaneously both the conventional rule curve and the hedging rule. In the compound model, a simple genetic algorithm is coupled with a simulation program, including an inner linear programming algorithm. In this way, operational policies are imposed by priority concepts...

  15. Total output operation chart optimization of cascade reservoirs and its application

    International Nuclear Information System (INIS)

    Jiang, Zhiqiang; Ji, Changming; Sun, Ping; Wang, Liping; Zhang, Yanke

    2014-01-01

    Highlights: • We propose a new double-nested model for cascade reservoir operation optimization. • We use two methods to extract the output distribution ratios. • The adopted two methods perform better than the methods widely used at present. • The stepwise regression method performs better than the mean value method on the whole. - Abstract: With the rapid development of cascade hydropower stations in recent decades, cascade systems composed of multiple reservoirs need unified operation and management. However, the output distribution problem has not yet been solved reasonably once the total output of the cascade system is obtained, which makes full utilization of the hydropower resources of cascade reservoirs very difficult. The discriminant criterion method is the traditional and most common way to solve the output distribution problem at present, but it has shortcomings that cannot be ignored in practical application. In response to this concern, this paper proposes a new total output operation chart optimization model and a new optimal output distribution model; together the two models constitute a double-nested model with the goal of maximizing power generation. This paper takes the cascade reservoirs of the Li Xianjiang River in China as a case study, obtaining the optimal total output operation chart from the proposed double-nested model and 43 years of historical runoff data; the progressive searching method and the progressive optimality algorithm are used to solve the model. In order to put the obtained total output operation chart into practical operation, the mean value method and the stepwise regression method are adopted to extract the output distribution ratios on the basis of the optimal simulation intermediate data. Compared with the discriminant criterion method and the conventional method, the combined utilization of the total output operation chart and output distribution ratios presents better performance in terms of power generation and assurance rate, which proves it is an effective

  16. Optimization of the Infrastructure of Reinforced Concrete Reservoirs by a Particle Swarm Algorithm

    Directory of Open Access Journals (Sweden)

    Kia Saeed

    2015-03-01

    Full Text Available Optimization techniques may be effective in finding the best modeling and shapes for reinforced concrete reservoirs (RCRs) to improve their durability and mechanical behavior, particularly for avoiding or reducing the bending moments in these structures. RCRs are among the major structures used for storing fluids for drinking water networks. Usually, these structures have fixed shapes which are designed and calculated based on input discharges, the conditions of the structure's topology, and geotechnical locations with various combinations of static and dynamic loads. In this research, the elements of the reservoir walls are first typed according to the performance analyzed; then the range of the membrane, based on the thickness and the minimum and maximum cross sections of the reinforcing bars used, is determined for each element. This is done by considering the variable constraints, which are estimated from the maximum stress capacity. In the next phase, based on the reservoir analysis and using the PARIS connector algorithm, the related information is combined with the code for the PSO algorithm, a particle swarm search algorithm, to determine the optimum thickness of the cross sections of the reservoir membrane's elements and the optimum cross section of the bars used. Based on complex mathematical linear models for the correct embedding and angles related to a chain of peripheral strengthening membranes, which optimize the vibration of the structure, a mutual relation is established between the modeling software and the code for the particle swarm optimization algorithm. Finally, the comparative weight of the concrete reservoir optimized by the peripheral strengthening membrane is analyzed using common methods. This analysis shows a 19% decrease in the bar weight, a 20% decrease in the concrete weight, and a minimum 13% saving in construction costs according to the items of a checklist for a 10,000 m3 concrete reservoir.
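    A minimal global-best particle swarm optimizer of the kind used above can be sketched as follows; here it minimizes a generic function over box bounds with standard inertia/cognitive/social coefficients, not the paper's tuned settings or its structural objective:

```python
import random

def pso_minimize(f, bounds, n_particles=20, iters=200, seed=1):
    """Minimal global-best PSO sketch: particles move under inertia (w),
    attraction to their personal best (c1), and attraction to the swarm
    best (c2), with positions clamped to the box bounds."""
    rng = random.Random(seed)
    dim = len(bounds)
    w, c1, c2 = 0.7, 1.5, 1.5
    xs = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(x) for x in xs]
    pval = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = list(pbest[g]), pval[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vs[i][d] = (w * vs[i][d]
                            + c1 * rng.random() * (pbest[i][d] - xs[i][d])
                            + c2 * rng.random() * (gbest[d] - xs[i][d]))
                xs[i][d] = min(max(xs[i][d] + vs[i][d], bounds[d][0]),
                               bounds[d][1])
            v = f(xs[i])
            if v < pval[i]:
                pbest[i], pval[i] = list(xs[i]), v
                if v < gval:
                    gbest, gval = list(xs[i]), v
    return gbest, gval

# Toy objective standing in for the structural-weight objective:
best, val = pso_minimize(lambda x: (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2,
                         [(-10.0, 10.0), (-10.0, 10.0)])
```

In the paper's setting, the decision vector would hold the membrane thicknesses and bar cross sections, and f would return the structural weight subject to the stress-capacity constraints.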

  17. An ensemble-based method for constrained reservoir life-cycle optimization

    NARCIS (Netherlands)

    Leeuwenburgh, O.; Egberts, P.J.P.; Chitu, A.G.

    2015-01-01

    We consider the problem of finding optimal long-term (life-cycle) recovery strategies for hydrocarbon reservoirs by use of simulation models. In such problems the presence of operating constraints, such as for example a maximum rate limit for a group of wells, may strongly influence the range of

  18. A prediction of Power Duration Curve from the Optimal Operation of the Multi Reservoirs System

    Directory of Open Access Journals (Sweden)

    Abdul Wahab Younis

    2013-04-01

    Full Text Available This study aims at predicting the Power Duration Curves (PDC) resulting from the optimal operation of a multi-reservoir system comprising the reservoirs of the Bakhma, Dokan, and Makhool dams over a 30-year period. Discrete Differential Dynamic Programming (DDDP) has been employed to find the optimal operation of these reservoirs. A PDC represents the relationship between the generated hydroelectric power and the percentage of operation time during which that power is equaled or exceeded. The importance of these curves lies in knowing the volume of electric power available for a given percentage of operation time. The results show that the sum of yearly hydroelectric power for average release under single-reservoir operation was 5410, 1604, and 2929 MW for the Bakhma, Dokan, and Makhool reservoirs, obtained by applying the DDDP technique independently. The hydroelectric power whose generation can be guaranteed for 90% of the time is 344.91, 107.7, and 188.15 MW under single-reservoir operation and 309.1, 134.08, and 140.7 MW under operation as one system, for the Bakhma, Dokan, and Makhool reservoirs respectively.
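    Constructing a power duration curve from a simulated power series is mechanical: sort the values in descending order and pair each with the percentage of time it is equaled or exceeded. A sketch with hypothetical monthly values, not the study's data:

```python
def power_duration_curve(power_series):
    """Return (exceedance_percent, power) points: power sorted in
    descending order against the percentage of time each value is
    equaled or exceeded."""
    ordered = sorted(power_series, reverse=True)
    n = len(ordered)
    return [(100.0 * (i + 1) / n, p) for i, p in enumerate(ordered)]

def firm_power(power_series, percent):
    """Power that is equaled or exceeded at least `percent` of the
    time (e.g. percent=90 gives the 90%-dependable power)."""
    ordered = sorted(power_series, reverse=True)
    k = max(1, int(len(ordered) * percent / 100.0))
    return ordered[k - 1]

# Hypothetical monthly power values (MW):
series = [120, 300, 250, 90, 180, 60, 210, 150, 330, 75]
pdc = power_duration_curve(series)
p90 = firm_power(series, 90)  # -> 75
```

The 90%-dependable figures quoted in the abstract (e.g. 344.91 MW for Bakhma under single operation) are exactly this kind of exceedance statistic read off the optimized PDC.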

  19. Derivation of Optimal Operating Rules for Large-scale Reservoir Systems Considering Multiple Trade-off

    Science.gov (United States)

    Zhang, J.; Lei, X.; Liu, P.; Wang, H.; Li, Z.

    2017-12-01

    Flood control operation of multi-reservoir systems such as parallel reservoirs and hybrid reservoirs often suffers from complex interactions and trade-offs among tributaries and the mainstream. The optimization of such systems is computationally intensive due to nonlinear storage curves, numerous constraints and complex hydraulic connections. This paper aims to derive optimal flood control operating rules based on the trade-off among tributaries and the mainstream using a new algorithm known as the weighted non-dominated sorting genetic algorithm II (WNSGA II). WNSGA II can locate the Pareto frontier in the non-dominated region efficiently due to directed searching by the weighted crowding distance, and the results are compared with those of conventional operating rules (COR) and a single-objective genetic algorithm (GA). The Xijiang river basin in China is selected as a case study, with eight reservoirs and five flood control sections within four tributaries and the mainstream. Furthermore, the effects of inflow uncertainty have been assessed. Results indicate that: (1) WNSGA II locates the non-dominated solutions faster and provides a better Pareto frontier than the traditional non-dominated sorting genetic algorithm II (NSGA II) due to the weighted crowding distance; (2) WNSGA II outperforms COR and GA on flood control in the whole basin; (3) the multi-objective operating rules from WNSGA II deal with the inflow uncertainties better than COR. Therefore, WNSGA II can be used to derive stable operating rules for large-scale reservoir systems effectively and efficiently.
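    WNSGA II's weighted crowding distance builds on the standard NSGA-II crowding distance, which can be sketched as below; the weighting itself is the paper's contribution and is not reproduced here (illustrative code over assumed objective vectors):

```python
def crowding_distance(front):
    """Standard NSGA-II crowding distance over a list of objective
    vectors (one per non-dominated solution). For each objective the
    boundary solutions get infinite distance; interior solutions
    accumulate the normalized gap between their neighbors."""
    n = len(front)
    if n == 0:
        return []
    m = len(front[0])
    dist = [0.0] * n
    for k in range(m):
        order = sorted(range(n), key=lambda i: front[i][k])
        lo, hi = front[order[0]][k], front[order[-1]][k]
        dist[order[0]] = dist[order[-1]] = float("inf")
        if hi == lo:
            continue
        for j in range(1, n - 1):
            dist[order[j]] += (front[order[j + 1]][k]
                               - front[order[j - 1]][k]) / (hi - lo)
    return dist

# Three non-dominated points in a two-objective minimization problem:
d = crowding_distance([(1.0, 5.0), (2.0, 3.0), (4.0, 1.0)])
```

Selection prefers large crowding distance to keep the front spread out; weighting this quantity, as the abstract describes, biases the spread toward regions of the trade-off the analyst cares about.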

  20. Towards an optimal integrated reservoir system management for the Awash River Basin, Ethiopia

    Directory of Open Access Journals (Sweden)

    R. Müller

    2016-05-01

    Full Text Available Recently, the Kessem–Tendaho project was completed to bring about socioeconomic development and growth in the Awash River Basin, Ethiopia. To support the Koka reservoir, two new reservoirs were built, together with extensive infrastructure for new irrigation projects. For the best possible socioeconomic benefits under conflicting management goals, such as energy production at three hydropower stations and basin-wide water supply at various sites, an integrated reservoir system management is required. To satisfy the multi-purpose nature of the reservoir system, a multi-objective parameterization-simulation-optimization model is applied. Different Pareto-optimal trade-off solutions between water supply and hydropower generation are provided for two scenarios: (i) recent conditions and (ii) future planned increases for the Tendaho and Upper Awash irrigation projects. Reservoir performance is further assessed under (i) rule curves with a high degree of freedom, which allow for the best performance but may result in rule curves too variable for real-world operation, and (ii) smooth rule curves obtained by artificial neural networks. The results show no performance penalty for smooth rule curves under future conditions but a notable penalty under recent conditions.

  1. AI techniques for optimizing multi-objective reservoir operation upon human and riverine ecosystem demands

    Science.gov (United States)

    Tsai, Wen-Ping; Chang, Fi-John; Chang, Li-Chiu; Herricks, Edwin E.

    2015-11-01

    Flow regime is the key driver of riverine ecology. This study proposes a novel hybrid methodology based on artificial intelligence (AI) techniques for quantifying riverine ecosystem requirements and delivering suitable flow regimes that sustain river and floodplain ecology through optimizing reservoir operation. This approach addresses issues to better fit riverine ecosystem requirements with existing human demands. We first explored and characterized the relationship between flow regimes and fish communities through a hybrid artificial neural network (ANN). Then the non-dominated sorting genetic algorithm II (NSGA-II) was established for river flow management over the Shihmen Reservoir in northern Taiwan. The ecosystem requirement took the form of maximizing fish diversity, which could be estimated by the hybrid ANN. The human requirement was to provide a higher satisfaction degree of water supply. The results demonstrated that the proposed methodology could offer a number of diversified alternative strategies for reservoir operation and improve reservoir operational strategies, producing downstream flows that could meet both human and ecosystem needs. The wide spread of Pareto-front (optimal) solutions makes this methodology attractive to water resources managers, as it allows decision makers to easily determine the best compromise through the trade-off between reservoir operational strategies for human and ecosystem needs.
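    Maximizing fish diversity as an objective presupposes a diversity measure; a common choice for community data is the Shannon index. A minimal sketch (the study's hybrid ANN, which estimates diversity from flow regime, is not reproduced here):

```python
import math

def shannon_diversity(counts):
    """Shannon diversity index H' = -sum(p_i * ln p_i) over species
    abundance counts; higher H' means a richer, more even community."""
    total = sum(counts)
    ps = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in ps)

# Four equally abundant species maximize H' for a 4-species community:
h_even = shannon_diversity([10, 10, 10, 10])   # = ln(4)
h_skew = shannon_diversity([37, 1, 1, 1])      # dominated community
```

An objective of this shape, evaluated on ANN-predicted community composition for each candidate flow regime, is what the NSGA-II search trades off against the water-supply satisfaction objective.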

  2. Towards an optimal integrated reservoir system management for the Awash River Basin, Ethiopia

    Science.gov (United States)

    Müller, Ruben; Gebretsadik, Henok Y.; Schütze, Niels

    2016-05-01

    Recently, the Kessem-Tendaho project was completed to bring about socioeconomic development and growth in the Awash River Basin, Ethiopia. To support the Koka reservoir, two new reservoirs were built, together with extensive infrastructure for new irrigation projects. For the best possible socioeconomic benefits under conflicting management goals, such as energy production at three hydropower stations and basin-wide water supply at various sites, an integrated reservoir system management is required. To satisfy the multi-purpose nature of the reservoir system, a multi-objective parameterization-simulation-optimization model is applied. Different Pareto-optimal trade-off solutions between water supply and hydropower generation are provided for two scenarios (i) recent conditions and (ii) future planned increases for the Tendaho and Upper Awash irrigation projects. Reservoir performance is further assessed under (i) rule curves with a high degree of freedom, which allow for the best performance but may result in rule curves too variable for real-world operation, and (ii) smooth rule curves obtained by artificial neural networks. The results show no performance penalty for smooth rule curves under future conditions but a notable penalty under recent conditions.

  3. Detecting the leakage source of a reservoir using isotopes.

    Science.gov (United States)

    Yi, Peng; Yang, Jing; Wang, Yongdong; Mugwanezal, Vincent de Paul; Chen, Li; Aldahan, Ala

    2018-07-01

    A good monitoring method is vital for understanding the sources of water reservoir leakage and planning effective remediation. Here we present a combination of several tracers (222Rn, oxygen and hydrogen isotopes, anions and temperature) for identification of water leakage sources at the Pushihe pumped storage power station in Liaoning province, China. The results show an average 222Rn activity of 6843 Bq/m3 in the leakage water, 3034 Bq/m3 in the reservoir water, and 41,759 Bq/m3 in the groundwater. Considering that 222Rn activity in surface water is typically less than 5000 Bq/m3, the low average 222Rn activity in the leakage water suggests the reservoir water as its main source. Results of the oxygen and hydrogen isotopes show comparable ranges and values in the reservoir and the leakage water samples. However, an important contribution of groundwater (up to 36%) was present in some samples from the bottom and upper parts of the underground powerhouse, while the leakage water from some other parts indicates the reservoir water as the dominant source. The isotopic findings suggest that the reservoir water is the main source of the leakage water, which is confirmed by the analysis of anions (nitrate, sulfate, and chloride) in the water samples. The combination of these tracer methods for studying dam water leakage improves the accuracy of identifying the source of leaks and provides a scientific reference for engineering solutions to ensure dam safety. Copyright © 2018 Elsevier Ltd. All rights reserved.
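    The tracer interpretation above follows a two-endmember mass balance: the fraction of each source in a sample follows from a conservative tracer concentration. A sketch using the study's reported average 222Rn activities (a simple mixing model; the study combines several tracers):

```python
def mixing_fraction(tracer_sample, tracer_end1, tracer_end2):
    """Two-endmember mixing: fraction f of endmember 1 (e.g.
    groundwater) in a sample, from a conservative tracer:
        c_sample = f*c_1 + (1 - f)*c_2
        =>  f = (c_sample - c_2) / (c_1 - c_2)
    """
    return (tracer_sample - tracer_end2) / (tracer_end1 - tracer_end2)

# Average 222Rn activities from the abstract (Bq/m3):
f_gw = mixing_fraction(tracer_sample=6843.0,   # leakage water
                       tracer_end1=41759.0,    # groundwater
                       tracer_end2=3034.0)     # reservoir water
```

On these averages the groundwater fraction of the leakage is roughly 10%, consistent with the abstract's conclusion that the reservoir is the dominant source, while individual samples range up to 36%.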

  4. Balancing Exploration, Uncertainty Representation and Computational Time in Many-Objective Reservoir Policy Optimization

    Science.gov (United States)

    Zatarain-Salazar, J.; Reed, P. M.; Quinn, J.; Giuliani, M.; Castelletti, A.

    2016-12-01

    As we confront the challenges of managing river basin systems with a large number of reservoirs and increasingly uncertain tradeoffs impacting their operations (due to, e.g., climate change, changing energy markets, population pressures, ecosystem services, etc.), evolutionary many-objective direct policy search (EMODPS) solution strategies will need to address the computational demands associated with simulating more uncertainties and therefore optimizing over increasingly noisy objective evaluations. Diagnostic assessments of state-of-the-art many-objective evolutionary algorithms (MOEAs) to support EMODPS have highlighted that search time (or number of function evaluations) and auto-adaptive search are key features for successful optimization. Furthermore, auto-adaptive MOEA search operators are themselves sensitive to having a sufficient number of function evaluations to learn successful strategies for exploring complex spaces and for escaping from local optima when stagnation is detected. Fortunately, recent parallel developments allow coordinated runs that enhance auto-adaptive algorithmic learning and can handle scalable and reliable search within limited wall-clock time, but at the expense of the total number of function evaluations. In this study, we analyze this tradeoff between parallel coordination and depth of search using different parallelization schemes of the Multi-Master Borg on a many-objective stochastic control problem. We also consider the tradeoff between better representing uncertainty in the stochastic optimization and simplifying this representation to shorten the function evaluation time and allow for greater search. Our analysis focuses on the Lower Susquehanna River Basin (LSRB) system, where multiple competing objectives for hydropower production, urban water supply, recreation and environmental flows need to be balanced. Our results provide guidance for balancing exploration, uncertainty, and computational demands when using the EMODPS

  5. Exploring synergistic benefits of Water-Food-Energy Nexus through multi-objective reservoir optimization schemes.

    Science.gov (United States)

    Uen, Tinn-Shuan; Chang, Fi-John; Zhou, Yanlai; Tsai, Wen-Ping

    2018-08-15

    This study proposed a holistic three-fold scheme that synergistically optimizes the benefits of the Water-Food-Energy (WFE) Nexus by integrating the short/long-term joint operation of a multi-objective reservoir with irrigation ponds in response to urbanization. The three-fold scheme was implemented step by step: (1) optimizing short-term (daily scale) reservoir operation for maximizing hydropower output and final reservoir storage during typhoon seasons; (2) simulating long-term (ten-day scale) water shortage rates in consideration of the availability of irrigation ponds for both agricultural and public sectors during non-typhoon seasons; and (3) promoting the synergistic benefits of the WFE Nexus in a year-round perspective by integrating the short-term optimization and long-term simulation of reservoir operations. The pivotal Shihmen Reservoir and 745 irrigation ponds located in Taoyuan City of Taiwan, together with the surrounding urban areas, formed the study case. The results indicated that the optimal short-term reservoir operation obtained from the non-dominated sorting genetic algorithm II (NSGA-II) could largely increase hydropower output while only slightly affecting water supply. The simulation results of the reservoir coupled with irrigation ponds indicated that such joint operation could significantly reduce agricultural and public water shortage rates by 22.2% and 23.7% on average, respectively, compared to reservoir operation excluding irrigation ponds. The results of year-round short/long-term joint operation showed that in a wet year water shortage rates could be reduced by up to 10%, the food production rate could be increased by up to 47%, and the hydropower benefit could increase by up to 9.33 million USD per year. Consequently, the proposed methodology could be a viable approach to promoting the synergistic benefits of the WFE Nexus, and the results provided unique insights for stakeholders and policymakers to pursue

  6. Simulation-optimization model of reservoir operation based on target storage curves

    Directory of Open Access Journals (Sweden)

    Hong-bin Fang

    2014-10-01

    Full Text Available This paper proposes a new storage allocation rule based on target storage curves. Joint operating rules are also proposed to solve the operation problems of a multi-reservoir system with joint demands and water transfer-supply projects. The joint operating rules include a water diversion rule to determine the amount of diverted water in a period, a hedging rule based on an aggregated reservoir to determine the total release from the system, and a storage allocation rule to specify the release from each reservoir. A simulation-optimization model was established to optimize the key points of the water diversion curves, the hedging rule curves, and the target storage curves using the improved particle swarm optimization (IPSO) algorithm. The multi-reservoir water supply system located in Liaoning Province, China, including a water transfer-supply project, was employed as a case study to verify the effectiveness of the proposed joint operating rules and target storage curves. The results indicate that the proposed operating rules are suitable for the complex system. The storage allocation rule based on target storage curves shows improved performance with regard to system storage distribution.
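    A storage allocation rule in the spirit of the target-storage-curve approach can be sketched as a proportional split: reservoirs release in proportion to their storage above the target curve, falling back to storage-proportional shares when every reservoir is below target. Illustrative logic with assumed names, not the paper's exact rule:

```python
def allocate_release(total_release, storages, targets):
    """Split the system release (from the hedging rule) among
    reservoirs: shares are proportional to storage in excess of each
    reservoir's target storage curve; if no reservoir is above its
    target, the shortfall is shared in proportion to current storage."""
    excess = [max(s - t, 0.0) for s, t in zip(storages, targets)]
    weights = excess if sum(excess) > 0 else list(storages)
    total_w = sum(weights)
    return [total_release * w / total_w for w in weights]

# Two reservoirs, 20 and 10 units above their targets: a 30-unit
# system release drains both back exactly to their target curves.
rel = allocate_release(30.0, storages=[120.0, 80.0], targets=[100.0, 70.0])
```

Driving each reservoir toward its target curve is what gives the rule its balanced system storage distribution; the target curves themselves are the decision variables the IPSO search optimizes.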

  7. Monthly Optimal Reservoirs Operation for Multicrop Deficit Irrigation under Fuzzy Stochastic Uncertainties

    Directory of Open Access Journals (Sweden)

    Liudong Zhang

    2014-01-01

    Full Text Available A monthly reservoir operation and multicrop deficit irrigation model under uncertainty was proposed for water resources optimization management under conjunctive use of groundwater and surface water. The objective is to maximize the total crop yield of the entire irrigation districts, while ecological water is reserved for downstream demand. Because of the shortage of water resources, a monthly crop water production function was adopted for multiperiod deficit irrigation management. The model reflects the characteristics of repetitive water resources transformation in typical inland-river irrigation systems. The model was applied to water resources optimization management in the Shiyang River Basin, China. Uncertainties in reservoir management, expressed as fuzzy probabilities, were treated through chance-constrained parameters for decision makers. Necessity of dominance (ND) was used to analyse the advantages of the method. The optimization results, including real-time reservoir operation policy, deficit irrigation management, and the available water resource allocation, could be used to provide decision support for local irrigation management. Besides, the strategies obtained could support stochastic risk analysis of reservoir operation.

  8. Exploitation and Optimization of Reservoir Performance in Hunton Formation, Oklahoma

    Energy Technology Data Exchange (ETDEWEB)

    Mohan Kelkar

    2007-06-30

    Hunton formation in Oklahoma has been the subject of attention for the last ten years. The new interest started with the drilling of the West Carney field in 1995 in Lincoln County. Subsequently, many other operators have expanded the search for oil and gas in Hunton formation in other parts of Oklahoma. These fields exhibit many unique production characteristics, including: (1) decreasing water-oil or water-gas ratio over time; (2) decreasing gas-oil ratio followed by an increase; (3) poor prediction capability of the reserves based on the log data; and (4) low geological connectivity but high hydrodynamic connectivity. The purpose of this investigation is to understand the principal mechanisms affecting the production, and propose methods by which we can optimize the production from fields with similar characteristics.

  9. Estimating irrigation water demand using an improved method and optimizing reservoir operation for water supply and hydropower generation: a case study of the Xinfengjiang reservoir in southern China

    Science.gov (United States)

    Wu, Yiping; Chen, Ji

    2013-01-01

    The ever-increasing demand for water due to population growth and socioeconomic development in the past several decades has posed a worldwide threat to water supply security and to the environmental health of rivers. This study aims to derive reservoir operating rules by establishing a multi-objective optimization model for the Xinfengjiang (XFJ) reservoir in the East River Basin in southern China to minimize the water supply deficit and maximize hydropower generation. Additionally, to enhance the estimation of irrigation water demand from the agricultural area downstream of the XFJ reservoir, a conventional method for calculating crop water demand is improved using hydrological model simulation results. Although the optimal reservoir operating rules are derived for the XFJ reservoir with three priority scenarios (water supply only, hydropower generation only, and equal priority), river environmental health is set as the basic demand no matter which scenario is adopted. The results show that the new rules derived under the three scenarios can improve the reservoir operation for both water supply and hydropower generation when compared to the historical performance. Moreover, these alternative reservoir operating policies provide the flexibility for the reservoir authority to choose the most appropriate one. Although changing the current operating rules may influence the reservoir's hydropower-oriented functions, the new rules can be significant in coping with the increasingly prominent water shortage and the degradation of the aquatic environment. Overall, our results and methods (improved estimation of irrigation water demand and formulation of the reservoir optimization model) can be useful for local watershed managers and valuable for other researchers worldwide.

  10. Optimizing Reservoir-Stream-Aquifer Interactions for Conjunctive Use and Hydropower Production

    Directory of Open Access Journals (Sweden)

    Hala Fayad

    2012-01-01

    Full Text Available Conjunctive management of water resources involves coordinating the use of surface water and groundwater resources. Very few simulation/optimization (S-O) models for stream-aquifer system management have included detailed interactions between groundwater, streams, and reservoir storage. This paper presents an S-O model that does so, coupling artificial neural network simulators with a genetic algorithm optimizer for multiobjective conjunctive water use problems. The model simultaneously addresses all significant flows, including reservoir-stream-diversion-aquifer interactions, in more detail than previous models, and simultaneously maximizes total water provided and hydropower production. A penalty function implicitly poses constraints on state variables. The model effectively finds feasible optimal solutions and the Pareto optimum. An application to planning water resource and minihydropower system development is illustrated.
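
The coupling the abstract describes (a trained ANN standing in for the simulator, searched by a genetic algorithm) can be sketched as follows. The surrogate below is a hypothetical toy water-balance model, not the paper's trained network, and all coefficients, bounds, and weights are illustrative assumptions:

```python
import random

# Hypothetical stand-in for a trained ANN simulator: maps a candidate
# monthly diversion schedule to (water delivered, hydropower) with a
# penalty for infeasible draw-down, as in an S-O model.
def surrogate_simulator(diversions, inflow=100.0, capacity=80.0):
    storage, water, power, penalty = 50.0, 0.0, 0.0, 0.0
    for d in diversions:
        storage = storage + inflow - d
        if storage > capacity:          # spill
            storage = capacity
        if storage < 0:                 # infeasible draw-down
            penalty += -storage
            storage = 0.0
        water += d
        power += d * 0.9                # head assumed constant
    return water, power, penalty

def fitness(diversions, w=(0.5, 0.5)):
    water, power, pen = surrogate_simulator(diversions)
    return w[0] * water + w[1] * power - 1000.0 * pen

def genetic_search(n_periods=12, pop=40, gens=60, seed=1):
    rng = random.Random(seed)
    population = [[rng.uniform(0, 150) for _ in range(n_periods)]
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        parents = population[:pop // 2]          # elitist selection
        children = []
        while len(children) < pop - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_periods)    # one-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(n_periods)         # single-gene mutation
            child[i] = min(150.0, max(0.0, child[i] + rng.gauss(0, 10)))
            children.append(child)
        population = parents + children
    return max(population, key=fitness)

best = genetic_search()
```

A multiobjective version would keep the two objectives separate and retain non-dominated schedules instead of the weighted sum used here.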

  11. On-line Optimization-Based Simulators for Fractured and Non-fractured Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Milind D. Deo

    2005-08-31

    Oil field development is a multi-million dollar business. Reservoir simulation is often used to guide the field management and development process. Reservoir characterization and geologic modeling tools have become increasingly sophisticated, and as a result the geologic models produced are complex. Most reservoirs are fractured to a certain extent. The new geologic characterization methods are making it possible to map features such as faults and fractures field-wide, and significant progress has been made in predicting properties of the faults and of the fractured zones. Traditionally, finite-difference methods have been employed in discretizing the domains created by geologic means. For complex geometries, finite-element methods of discretization may be more suitable. Since reservoir simulation is a mature science, some of the advances in numerical methods (linear and nonlinear solvers and parallel computing) have not been fully realized in the implementation of most simulators. The purpose of this project was to address some of these issues: (1) develop a series of finite-element simulators to handle problems of complex geometry, including systems containing faults and fractures; (2) incorporate the most modern computing tools, including modular object-oriented computer languages, the most sophisticated linear and nonlinear solvers, parallel computing methods and good visualization tools; (3) demonstrate the construction of fractures and faults in a reservoir using the available data and assign properties to these features; and (4) once the reservoir model is in place, find the operating conditions that provide the best reservoir performance, which can be accomplished by utilizing optimization tools and coupling them with reservoir simulation. Optimization-based reservoir simulation was one of the

  12. Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Ettehadtavakkol, Amin, E-mail: amin.ettehadtavakkol@ttu.edu [Texas Tech University (United States); Jablonowski, Christopher [Shell Exploration and Production Company (United States); Lake, Larry [University of Texas at Austin (United States)

    2017-04-15

    Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum design concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.
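
The contrast between the two methods can be illustrated on a deliberately simple design problem. The capacity choices, gas price, and deliverability distribution below are hypothetical, but the sketch mirrors the paper's point that, on the same scenario set, both routes identify the same optimum design:

```python
import random
import statistics

# Toy design problem: choose plant capacity under uncertain deliverability.
random.seed(7)
scenarios = [random.uniform(50, 150) for _ in range(2000)]  # deliverability
designs = range(40, 160, 10)                                # capacity choices

def profit(capacity, deliverability, price=3.0, capex_per_unit=1.0):
    produced = min(capacity, deliverability)
    return price * produced - capex_per_unit * capacity

# Monte Carlo optimization: sample-average the profit of every candidate
# design, then pick the best. This yields the whole value-vs-design curve.
mc_values = {c: statistics.mean(profit(c, s) for s in scenarios)
             for c in designs}
mc_best = max(mc_values, key=mc_values.get)

# Stochastic programming (here reduced to an expected-value objective over
# the same scenario set) finds the optimum in a single optimization pass,
# without mapping out the full curve.
sp_best = max(designs, key=lambda c: sum(profit(c, s) for s in scenarios))
```

Because both formulations optimize the same sample-average objective, they agree on the optimum design, while the MC route additionally produces the value of every non-optimal candidate.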

  13. Development Optimization and Uncertainty Analysis Methods for Oil and Gas Reservoirs

    International Nuclear Information System (INIS)

    Ettehadtavakkol, Amin; Jablonowski, Christopher; Lake, Larry

    2017-01-01

    Uncertainty complicates the development optimization of oil and gas exploration and production projects, but methods have been devised to analyze uncertainty and its impact on optimal decision-making. This paper compares two methods for development optimization and uncertainty analysis: Monte Carlo (MC) simulation and stochastic programming. Two example problems for a gas field development and an oilfield development are solved and discussed to elaborate the advantages and disadvantages of each method. Development optimization involves decisions regarding the configuration of initial capital investment and subsequent operational decisions. Uncertainty analysis involves the quantification of the impact of uncertain parameters on the optimum design concept. The gas field development problem is designed to highlight the differences in the implementation of the two methods and to show that both methods yield the exact same optimum design. The results show that both MC optimization and stochastic programming provide unique benefits, and that the choice of method depends on the goal of the analysis. While the MC method generates more useful information, along with the optimum design configuration, the stochastic programming method is more computationally efficient in determining the optimal solution. Reservoirs comprise multiple compartments and layers with multiphase flow of oil, water, and gas. We present a workflow for development optimization under uncertainty for these reservoirs, and solve an example on the design optimization of a multicompartment, multilayer oilfield development.

  14. Optimization of analytical control over residues of active ingredients of modern pesticides in reservoir water

    Directory of Open Access Journals (Sweden)

    Semenenko V.M.

    2013-10-01

    Full Text Available A highly sensitive and selective method for the determination of pyraclostrobin, boscalid, tebufenpyrad and prohexadione-calcium in their combined presence in a water sample, using high-performance liquid chromatography, was developed. Pesticides based on the mentioned active ingredients (the combined fungicide Bellis, the insecto-acaricide Masai and the plant growth regulator Regalis) may be used in one vegetation season for fruit-tree protection. The method of co-determination of these substances is based on the preparation of water samples for extraction, extraction of pyraclostrobin, boscalid, tebufenpyrad and prohexadione-calcium, concentration of the extract of the substance mixture, and chromatographic determination with ultraviolet detection. A distinctive feature of this method is the changing of the ratio of the components of the mobile phase (a mixture of acetonitrile and a 0.1% aqueous solution of phosphoric acid) during the chromatographic analysis, which makes it possible to clearly resolve the test substances in the case of their joint presence in one sample. Implementation of the developed and patented method into practice optimizes control over the application of pesticides in agriculture and their monitoring in reservoir water by significantly accelerating the analysis and reducing its cost.

  15. Offset Risk Minimization for Open-loop Optimal Control of Oil Reservoirs

    DEFF Research Database (Denmark)

    Capolei, Andrea; Christiansen, Lasse Hjuler; Jørgensen, J. B.

    2017-01-01

    Simulation studies of oil field water flooding have demonstrated a significant potential of optimal control technology to improve industrial practices. However, real-life applications are challenged by unknown geological factors that make reservoir models highly uncertain. To minimize the associated financial risks, the oil literature has used ensemble-based methods to manipulate the net present value (NPV) distribution by optimizing sample-estimated risk measures. In general, such methods successfully reduce overall risk. However, as this paper demonstrates, ensemble-based control strategies … practices. The results suggest that it may be more relevant to consider the NPV offset distribution than the NPV distribution when minimizing risk in production optimization…
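
A minimal sketch of the distinction the abstract draws, with hypothetical NPV figures across geological realizations; `cvar` here is the usual sample lower-tail average, not necessarily the paper's exact risk measure:

```python
import statistics

def cvar(samples, alpha=0.1):
    """Average of the worst alpha-fraction of outcomes (lower tail)."""
    worst = sorted(samples)[:max(1, int(len(samples) * alpha))]
    return statistics.mean(worst)

# Hypothetical NPVs of a candidate control strategy and of a reference
# (e.g. reactive) strategy, evaluated realization by realization.
npv_ensemble = [105.0, 98.0, 120.0, 87.0, 110.0, 93.0, 101.0, 115.0]
reference    = [100.0, 96.0, 104.0, 95.0, 102.0, 97.0,  99.0, 103.0]

# Classical view: statistics of the NPV distribution itself.
mean_npv = statistics.mean(npv_ensemble)
risk_npv = cvar(npv_ensemble)

# Offset view suggested by the paper: the distribution of the difference
# against the reference strategy, taken realization by realization.
offsets = [a - b for a, b in zip(npv_ensemble, reference)]
mean_offset = statistics.mean(offsets)
risk_offset = cvar(offsets)
```

A strategy can look safe on the NPV distribution yet still underperform the reference in its worst realizations, which is exactly what the offset tail exposes (here the worst offset is negative even though every raw NPV is large).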

  16. Thermo-economic optimization of an endoreversible four-heat-reservoir absorption-refrigerator

    International Nuclear Information System (INIS)

    Qin Xiaoyong; Chen Lingen; Sun Fengrui; Wu Chih

    2005-01-01

    Based on an endoreversible four-heat-reservoir absorption-refrigeration-cycle model, the optimal thermo-economic performance of an absorption-refrigerator is analyzed and optimized assuming a linear (Newtonian) heat-transfer law applies. The optimal relation between the thermo-economic criterion and the coefficient of performance (COP), the maximum thermo-economic criterion, and the COP and specific cooling load for the maximum thermo-economic criterion of the cycle are derived using finite-time thermodynamics. Moreover, the effects of the cycle parameters on the thermo-economic performance of the cycle are studied by numerical examples

  17. Hybrid Multi-Objective Optimization of Folsom Reservoir Operation to Maximize Storage in Whole Watershed

    Science.gov (United States)

    Goharian, E.; Gailey, R.; Maples, S.; Azizipour, M.; Sandoval Solis, S.; Fogg, G. E.

    2017-12-01

    Drought incidents and growing water scarcity in California have a profound effect on human, agricultural, and environmental water needs. California has experienced multi-year droughts, which have caused groundwater overdraft, dropping groundwater levels, and dwindling major reservoirs. These concerns call for a stringent evaluation of future water resources sustainability and security in the state. In answer to this call, the Sustainable Groundwater Management Act (SGMA) was passed in 2014 to ensure sustainable groundwater management in California by 2042. SGMA refers to managed aquifer recharge (MAR) as a key management option, especially in areas with high intra- and inter-annual variation in water availability, to secure the refill of underground water storage and the return of groundwater quality to a desirable condition. Hybrid optimization of an integrated water resources system provides an opportunity to adapt surface reservoir operations to enhance groundwater recharge. Here, to re-operate Folsom Reservoir, the objectives are maximizing the storage in the whole American-Cosumnes watershed and maximizing hydropower generation from Folsom Reservoir. While a linear programming (LP) module maximizes the total groundwater recharge by distributing and spreading water over suitable lands in the basin, a genetic-algorithm layer above it, the Non-dominated Sorting Genetic Algorithm II (NSGA-II), controls releases from the reservoir to secure hydropower generation, carry-over storage in the reservoir, available water for replenishment, and downstream water requirements. The preliminary results show additional releases from the reservoir for groundwater recharge during high-flow seasons. Moreover, tradeoffs between the objectives show that the new operation performs satisfactorily in increasing the storage in the basin, with nonsignificant effects on the other objectives.

  18. Determination of the Cascade Reservoir Operation for Optimal Firm-Energy Using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Azmeri

    2013-08-01

    Full Text Available Indonesia today faces a new paradigm in water management, in which applying integrated water resources management has become an unavoidable task for achieving greater effectiveness and efficiency. One of the most interesting case studies is the Citarum River, one of the most promising rivers for water supply in West Java, Indonesia. Along the river, the Saguling, Cirata and Djuanda Reservoirs have been constructed in series (cascade). The Saguling and Cirata reservoirs are operated particularly for hydroelectric power, while Djuanda is a multipurpose reservoir operated mainly for irrigation that also contributes to domestic water supply for Jakarta (the capital city of Indonesia). Since all the reservoirs rely on the same resource, this situation raises management and operational problems. Therefore, a new management and operation system is urgently required in order to achieve effective and efficient output and to avoid conflicts over water use. This study aims to obtain the energy production of the Citarum Cascade Reservoir System using Genetic Algorithm optimization with the objective function of maximizing firm energy. Firm energy is the minimum energy that must be available in a given time period. The energy produced by the GA is then compared to that of the conventional search technique of Non-Linear Programming (NLP). The GA-derived operating curves reveal higher energy and firm energy than the NLP model.
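
The firm-energy objective (maximize the minimum energy over the horizon) can be sketched with a single-reservoir stand-in for the cascade. Inflows, capacity, efficiency, and the simple stochastic search below are illustrative assumptions, not the Citarum data or the paper's GA:

```python
import random

# Hypothetical monthly inflows and reservoir parameters.
INFLOWS = [120, 90, 60, 40, 30, 25, 30, 45, 70, 100, 130, 140]
CAPACITY, S0 = 300.0, 150.0

def firm_energy(releases, eff=0.85):
    """Minimum monthly energy over the horizon; -inf if storage runs dry."""
    storage, worst = S0, float("inf")
    for q, r in zip(INFLOWS, releases):
        storage = min(CAPACITY, storage + q - r)   # mass balance with spill
        if storage < 0:
            return float("-inf")
        worst = min(worst, eff * r)  # head assumed constant for simplicity
    return worst

def optimize(iters=5000, seed=3):
    """Stochastic search for the release schedule maximizing firm energy
    (the paper uses a full GA; this is a minimal stand-in)."""
    rng = random.Random(seed)
    best = [float(min(INFLOWS))] * 12      # feasible start: release min inflow
    best_val = firm_energy(best)
    for _ in range(iters):
        cand = best[:]
        i = rng.randrange(12)
        cand[i] = max(0.0, cand[i] + rng.gauss(0.0, 5.0))
        val = firm_energy(cand)
        if val >= best_val:                # accept sideways moves too
            best, best_val = cand, val
    return best, best_val

schedule, firm = optimize()
```

Because the objective is a minimum over periods, accepting sideways moves is essential here: raising one month's release does not lift the minimum until every month has been raised.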

  19. Problem of detecting inclusions by topological optimization

    Directory of Open Access Journals (Sweden)

    I. Faye

    2014-01-01

    Full Text Available In this paper we propose a new method to detect inclusions. The proposed method is based on shape and topological optimization tools. In fact, after presenting the problem, we use topological optimization tools to detect inclusions in the domain. Numerical results are presented.

  20. Production Optimization for Two-Phase Flow in an Oil Reservoir

    DEFF Research Database (Denmark)

    Völcker, Carsten; Jørgensen, John Bagterp; Thomsen, Per Grove

    2012-01-01

    framework to increase the production and economic value of an oil reservoir. Whether the objective is to maximize recovery or some financial measure like net present value, the increased production is achieved by manipulating the well rates and bottom-hole pressures of the injection and production wells. The optimal water injection rates and production-well bottom-hole pressures are computed by solution of a large-scale constrained optimal control problem. We describe a gradient-based method to compute the optimal control strategy of the water-flooding process. An explicit singly diagonally implicit Runge-Kutta (ESDIRK) method…

  1. Artificial intelligent techniques for optimizing water allocation in a reservoir watershed

    Science.gov (United States)

    Chang, Fi-John; Chang, Li-Chiu; Wang, Yu-Chung

    2014-05-01

    This study proposes a systematic water allocation scheme that integrates system analysis with artificial intelligence techniques for reservoir operation, in consideration of the great hydrometeorological uncertainty, to mitigate drought impacts on public and irrigation sectors. The AI techniques mainly include a genetic algorithm (GA) and an adaptive-network-based fuzzy inference system (ANFIS). We first derive evaluation diagrams through systematic interactive evaluations of long-term hydrological data to provide a clear simulation perspective of all possible drought conditions tagged with their corresponding water shortages; then search for the optimal reservoir operating histogram using the GA, based on given demands and hydrological conditions, which can be recognized as the optimal base of input-output training patterns for modelling; and finally build a suitable water allocation scheme by constructing an ANFIS model that learns the mechanism between designed inputs (water discount rates and hydrological conditions) and outputs (two scenarios: simulated and optimized water deficiency levels). The effectiveness of the proposed approach is tested on the operation of the Shihmen Reservoir in northern Taiwan for the first paddy crop in the study area to assess the water allocation mechanism during drought periods. We demonstrate that the proposed water allocation scheme reliably helps water managers determine a suitable discount rate on water supply for both irrigation and public sectors, and thus can reduce the drought risk and the compensation amount induced by imposing restrictions on agricultural water use.

  2. Genetic Algorithm (GA) Method for Optimization of Multi-Reservoir Systems Operation

    Directory of Open Access Journals (Sweden)

    Shervin Momtahen

    2006-01-01

    Full Text Available A Genetic Algorithm (GA) method for optimization of multi-reservoir systems operation is proposed in this paper. In this method, the parameters of operating policies are optimized using system simulation results; hence, any operating problem with any sort of objective function, constraints and structure of operating policy can be optimized by GA. The method is applied to a 3-reservoir system and is compared with two traditional methods, Stochastic Dynamic Programming and Dynamic Programming and Regression. The results show that GA is superior both in objective function value and in computational speed. The proposed method is further improved using a mutation power updating rule and a varying-period simulation method. The latter is a novel procedure proposed in this paper that is believed to help in solving the computational time problem in large systems. These revisions are evaluated and proved to be very useful in converging to better solutions in much less time. The final GA method is eventually evaluated as a very efficient procedure that is able to solve problems of large multi-reservoir systems, which is usually impossible by traditional methods. In fact, the real performance of the GA method starts where others fail to function.

  3. NN-Based Implicit Stochastic Optimization of Multi-Reservoir Systems Management

    Directory of Open Access Journals (Sweden)

    Matteo Sangiorgio

    2018-03-01

    Full Text Available Multi-reservoir systems management is complex because of the uncertainty of future events and the variety of usually conflicting purposes of the involved actors. Efficient management of these systems can help improve resource allocation, prevent political crises and reduce conflicts between stakeholders. Bellman stochastic dynamic programming (SDP) is the most famous among the many approaches proposed to solve this optimal control problem. Unfortunately, SDP is affected by the curse of dimensionality: the computational effort increases exponentially with the complexity of the considered system (i.e., the number of reservoirs), and the problem rapidly becomes intractable. This paper proposes an implicit stochastic optimization approach for the solution of the reservoir management problem. The core idea is to use extremely flexible functions, such as artificial neural networks (NN), to design release rules which approximate the optimal policies obtained by an open-loop approach. These trained NNs can then be used to take decisions in real time. The approach thus requires a sufficiently long series of historical or synthetic inflows, and the definition of a compromise solution to be approximated. This work analyzes with particular emphasis the importance of the information used as input to the control laws, investigating the effects of different degrees of completeness. The methodology is applied to the Nile River basin considering the main management objectives (minimization of the irrigation water deficit and maximization of hydropower production), but can easily be adopted in other cases as well.
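
The core idea, fitting a flexible function to (state, optimal release) pairs produced off-line, can be sketched with a linear rule standing in for the NN; the synthetic training pairs below are illustrative:

```python
def fit_release_rule(samples):
    """samples: list of (storage, inflow, optimal_release).
    Fits release ~ a*storage + b*inflow + c by ordinary least squares."""
    # Accumulate the normal equations X^T X and X^T y for features
    # (storage, inflow, 1).
    S = [[0.0] * 3 for _ in range(3)]
    t = [0.0] * 3
    for s, q, r in samples:
        x = (s, q, 1.0)
        for i in range(3):
            t[i] += x[i] * r
            for j in range(3):
                S[i][j] += x[i] * x[j]
    # Solve the 3x3 system by Gaussian elimination with partial pivoting.
    for col in range(3):
        piv = max(range(col, 3), key=lambda k: abs(S[k][col]))
        S[col], S[piv] = S[piv], S[col]
        t[col], t[piv] = t[piv], t[col]
        for k in range(col + 1, 3):
            f = S[k][col] / S[col][col]
            for j in range(col, 3):
                S[k][j] -= f * S[col][j]
            t[k] -= f * t[col]
    coef = [0.0] * 3
    for i in (2, 1, 0):                     # back substitution
        coef[i] = (t[i] - sum(S[i][j] * coef[j]
                              for j in range(i + 1, 3))) / S[i][i]
    return coef  # a, b, c

# Synthetic "optimized" pairs following release = 0.3*storage + 0.5*inflow,
# standing in for the open-loop optimization results the paper trains on.
pairs = [(s, q, 0.3 * s + 0.5 * q) for s in (50, 100, 150, 200)
         for q in (10, 30, 60)]
a, b, c = fit_release_rule(pairs)
```

The fitted rule can then be evaluated in real time from the current storage and inflow, exactly the role the trained NNs play in the paper; an NN simply replaces the linear form with a nonlinear one.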

  4. Technical Reviews on the Radioisotope Application for Leak Detection in Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jin Seop; Jung, Sung Hee; Kim, Jong Bum; Kim, Jae Ho

    2006-02-15

    Previous techniques for the detection of leaks from reservoirs make it difficult to identify the leak points and leak pathways. Additionally, the complexity and ambiguity of the resulting data analysis can increase the failure rate of leak detection. In contrast, the technique using a radioisotope as a tracer is considered very promising, and in the same context systematic studies led by the IAEA are being carried out by an organized task force team. The detection technique using a natural tracer gives information about the age of the groundwater, the interconnection between groundwater and reservoir water, and the seepage origin. On the other hand, the technique using an artificial tracer can identify the leak point in reservoirs directly; it includes the radioactive cloud migration method and the radioactive tracer adsorption method. The former uses a hydrophilic radioisotope tracer, and the latter an adsorptive radioisotope tracer emitting gamma rays. The radiotracer is injected at a point of the reservoir near the bottom. Afterwards, the migration of the radioactive tracer is followed by means of submerged scintillation detectors suspended from boats. Usually {sup 131}I, {sup 82}Br, {sup 46}Sc, and {sup 198}Au etc. can be used as tracers. The point reaching the maximum tracer concentration corresponds to the leak point in the reservoir.

  5. Technical Reviews on the Radioisotope Application for Leak Detection in Reservoirs

    International Nuclear Information System (INIS)

    Kim, Jin Seop; Jung, Sung Hee; Kim, Jong Bum; Kim, Jae Ho

    2006-02-01

    Previous techniques for the detection of leaks from reservoirs make it difficult to identify the leak points and leak pathways. Additionally, the complexity and ambiguity of the resulting data analysis can increase the failure rate of leak detection. In contrast, the technique using a radioisotope as a tracer is considered very promising, and in the same context systematic studies led by the IAEA are being carried out by an organized task force team. The detection technique using a natural tracer gives information about the age of the groundwater, the interconnection between groundwater and reservoir water, and the seepage origin. On the other hand, the technique using an artificial tracer can identify the leak point in reservoirs directly; it includes the radioactive cloud migration method and the radioactive tracer adsorption method. The former uses a hydrophilic radioisotope tracer, and the latter an adsorptive radioisotope tracer emitting gamma rays. The radiotracer is injected at a point of the reservoir near the bottom. Afterwards, the migration of the radioactive tracer is followed by means of submerged scintillation detectors suspended from boats. Usually ¹³¹I, ⁸²Br, ⁴⁶Sc, and ¹⁹⁸Au etc. can be used as tracers. The point reaching the maximum tracer concentration corresponds to the leak point in the reservoir
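
The final step of the artificial-tracer method, locating the leak as the point of maximum detected concentration, reduces to a peak search over the survey readings; the positions and count rates below are hypothetical:

```python
# Hypothetical scintillation-detector survey: (position along the
# reservoir bottom in metres, gamma count rate in counts per second).
readings = [
    (0.0, 210), (5.0, 240), (10.0, 390), (15.0, 980),
    (20.0, 1450), (25.0, 870), (30.0, 310), (35.0, 225),
]

def smooth(counts, k=1):
    """Moving average over a window of 2k+1 readings to damp counting noise."""
    out = []
    for i in range(len(counts)):
        window = counts[max(0, i - k):i + k + 1]
        out.append(sum(window) / len(window))
    return out

positions = [p for p, _ in readings]
rates = smooth([c for _, c in readings])
leak_position = positions[max(range(len(rates)), key=rates.__getitem__)]
```

Smoothing before taking the maximum guards against a single noisy spike being mistaken for the leak point.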

  6. a Stochastic Approach to Multiobjective Optimization of Large-Scale Water Reservoir Networks

    Science.gov (United States)

    Bottacin-Busolin, A.; Worman, A. L.

    2013-12-01

    A main challenge for the planning and management of water resources is the development of multiobjective strategies for operation of large-scale water reservoir networks. The optimal sequence of water releases from multiple reservoirs depends on the stochastic variability of correlated hydrologic inflows and on various processes that affect water demand and energy prices. Although several methods have been suggested, large-scale optimization problems arising in water resources management are still plagued by the high dimensional state space and by the stochastic nature of the hydrologic inflows. In this work, the optimization of reservoir operation is approached using approximate dynamic programming (ADP) with policy iteration and function approximators. The method is based on an off-line learning process in which operating policies are evaluated for a number of stochastic inflow scenarios, and the resulting value functions are used to design new, improved policies until convergence is attained. A case study is presented of a multi-reservoir system in the Dalälven River, Sweden, which includes 13 interconnected reservoirs and 36 power stations. Depending on the late spring and summer peak discharges, the lowlands adjacent to Dalälven can often be flooded during the summer period, and the presence of stagnating floodwater during the hottest months of the year is the cause of a large proliferation of mosquitos, which is a major problem for the people living in the surroundings. Chemical pesticides are currently being used as a preventive countermeasure, which do not provide an effective solution to the problem and have adverse environmental impacts. In this study, ADP was used to analyze the feasibility of alternative operating policies for reducing the flood risk at a reasonable economic cost for the hydropower companies. To this end, mid-term operating policies were derived by combining flood risk reduction with hydropower production objectives. The performance

  7. Trophic State and Toxic Cyanobacteria Density in Optimization Modeling of Multi-Reservoir Water Resource Systems

    Directory of Open Access Journals (Sweden)

    Andrea Sulis

    2014-04-01

    Full Text Available The definition of a synthetic index for classifying the quality of water bodies is a key aspect in integrated planning and management of water resource systems. In previous works [1,2], a water system optimization modeling approach that requires a single quality index for stored water in reservoirs has been applied to a complex multi-reservoir system. Considering the same modeling field, this paper presents an improved quality index estimated both on the basis of the overall trophic state of the water body and on the basis of the density values of the most potentially toxic Cyanobacteria. The implementation of the index into the optimization model makes it possible to reproduce the conditions limiting water use due to excessive nutrient enrichment in the water body and to the health hazard linked to toxic blooms. The analysis of an extended limnological database (1996–2012) in four reservoirs of the Flumendosa-Campidano system (Sardinia, Italy) provides useful insights into the strengths and limitations of the proposed synthetic index.
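
The kind of combined index the abstract describes can be sketched as follows; the class bands and thresholds are illustrative placeholders, not the paper's calibration:

```python
# Toy combined quality index: higher class = stronger restriction on the
# use of the stored water.
def trophic_class(total_phosphorus_ug_l):
    """Illustrative OECD-style bands: oligo-, meso-, eu-, hypertrophic."""
    for limit, cls in ((10, 0), (35, 1), (100, 2)):
        if total_phosphorus_ug_l < limit:
            return cls
    return 3

def cyano_class(toxic_cells_per_ml):
    """Illustrative bands loosely inspired by WHO recreational guidance."""
    for limit, cls in ((2_000, 0), (20_000, 1), (100_000, 2)):
        if toxic_cells_per_ml < limit:
            return cls
    return 3

def quality_index(tp, cells):
    # The limiting (worst) condition drives the restriction on water use,
    # so the two partial classifications are combined with a maximum.
    return max(trophic_class(tp), cyano_class(cells))
```

Taking the maximum of the two partial classes captures the abstract's point that either nutrient enrichment or a toxic bloom alone suffices to limit water use.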

  8. Trophic state and toxic cyanobacteria density in optimization modeling of multi-reservoir water resource systems.

    Science.gov (United States)

    Sulis, Andrea; Buscarinu, Paola; Soru, Oriana; Sechi, Giovanni M

    2014-04-22

    The definition of a synthetic index for classifying the quality of water bodies is a key aspect in integrated planning and management of water resource systems. In previous works [1,2], a water system optimization modeling approach that requires a single quality index for stored water in reservoirs has been applied to a complex multi-reservoir system. Considering the same modeling field, this paper presents an improved quality index estimated both on the basis of the overall trophic state of the water body and on the basis of the density values of the most potentially toxic Cyanobacteria. The implementation of the index into the optimization model makes it possible to reproduce the conditions limiting water use due to excessive nutrient enrichment in the water body and to the health hazard linked to toxic blooms. The analysis of an extended limnological database (1996-2012) in four reservoirs of the Flumendosa-Campidano system (Sardinia, Italy) provides useful insights into the strengths and limitations of the proposed synthetic index.

  9. Optimizing multiple reliable forward contracts for reservoir allocation using multitime scale streamflow forecasts

    Science.gov (United States)

    Lu, Mengqian; Lall, Upmanu; Robertson, Andrew W.; Cook, Edward

    2017-03-01

    Streamflow forecasts at multiple time scales provide a new opportunity for reservoir management to address competing objectives. Market instruments such as forward contracts with specified reliability are considered as a tool that may help address the perceived risk associated with the use of such forecasts in lieu of traditional operation and allocation strategies. A water allocation process that enables multiple contracts for water supply and hydropower production with different durations, while maintaining a prescribed level of flood risk reduction, is presented. The allocation process is supported by an optimization model that considers multitime scale ensemble forecasts of monthly streamflow and flood volume over the upcoming season and year, the desired reliability and pricing of proposed contracts for hydropower and water supply. It solves for the size of contracts at each reliability level that can be allocated for each future period, while meeting target end of period reservoir storage with a prescribed reliability. The contracts may be insurable, given that their reliability is verified through retrospective modeling. The process can allow reservoir operators to overcome their concerns as to the appropriate skill of probabilistic forecasts, while providing water users with short-term and long-term guarantees as to how much water or energy they may be allocated. An application of the optimization model to the Bhakra Dam, India, provides an illustration of the process. The issues of forecast skill and contract performance are examined. A field engagement of the idea is useful to develop a real-world perspective and needs a suitable institutional environment.
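
Sizing a contract at a prescribed reliability from an ensemble forecast amounts to reading off a lower quantile of available water; the forecast members below are hypothetical:

```python
def contract_size(ensemble, reliability):
    """Largest volume deliverable in at least `reliability` of the
    ensemble members: the (1 - reliability) lower quantile."""
    ordered = sorted(ensemble)
    shortfall_members = round(len(ordered) * (1.0 - reliability))
    return ordered[shortfall_members]

# Hypothetical ensemble of seasonal available-water volumes.
forecast = [820, 640, 910, 705, 560, 990, 745, 680, 870, 600]

firm = contract_size(forecast, 0.9)   # high-reliability contract
spot = contract_size(forecast, 0.5)   # lower-reliability tranche
```

A layered allocation would sell the high-reliability volume first and offer the difference up to the lower-reliability quantile as a cheaper, riskier tranche, which is the spirit of the multiple-contract scheme in the abstract.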

  10. Sensitivity analysis and economic optimization studies of inverted five-spot gas cycling in gas condensate reservoir

    Directory of Open Access Journals (Sweden)

    Shams Bilal

    2017-08-01

    Full Text Available Gas condensate reservoirs usually exhibit complex flow behavior as the pressure drop at the wellbore propagates into the reservoir. When the reservoir pressure drops below the dew point, gas and condensate flow as two phases and large amounts of condensate accumulate in the reservoir. The saturation of the accumulated condensate in volumetric gas condensate reservoirs is usually lower than the critical condensate saturation, which traps a large amount of condensate in the reservoir pores. The trapped condensate is often lost to condensate blockage caused by the high-molecular-weight, heavy condensate residue. Recovering this lost condensate economically and optimally has always been a challenging goal, so gas cycling is applied to alleviate such a drastic loss in resources.

  11. Optimizing Systems of Threshold Detection Sensors

    National Research Council Canada - National Science Library

    Banschbach, David C

    2008-01-01

    .... Below the threshold all signals are ignored. We develop a mathematical model for setting individual sensor thresholds to obtain optimal probability of detecting a significant event, given a limit on the total number of false positives allowed...

  12. New well pattern optimization methodology in mature low-permeability anisotropic reservoirs

    Science.gov (United States)

    Qin, Jiazheng; Liu, Yuetian; Feng, Yueli; Ding, Yao; Liu, Liu; He, Youwei

    2018-02-01

    In China, many well patterns were designed before the principal permeability direction of low-permeability anisotropic reservoirs was known. After several years of production, it turns out that the well line direction is not parallel to the principal permeability direction. However, traditional well location optimization methods (in terms of objective functions such as net present value and/or ultimate recovery) are inapplicable, since wells are not free to move around in a mature oilfield. Thus, well pattern optimization (WPO) of mature low-permeability anisotropic reservoirs is a significant but challenging task, since the original well pattern (WP) will be distorted and reconstructed due to permeability anisotropy. In this paper, we investigate the destruction and reconstruction of the WP when the principal permeability direction and the well line direction are not parallel. A new methodology was developed to quantitatively optimize the well locations of a mature large-scale WP through a WPO algorithm based on coordinate transformation (i.e. rotating and stretching). For a mature oilfield, the large-scale WP has settled, so it is not economically viable to carry out further infill drilling. This paper circumvents this difficulty by combining the WPO algorithm with well status (open or shut-in) and schedule adjustment. Finally, the methodology is applied to an example. Cumulative oil production rates of the optimized WP are higher, and water-cut is lower, which highlights the potential of the WPO methodology in mature large-scale field development projects.

  13. Completion of potential conflicts of interest through optimization of Rukoh reservoir operation in Pidie District, Aceh Province, Indonesia

    Science.gov (United States)

    Azmeri, Hadihardaja, Iwan K.; Shaskia, Nina; Admaja, Kamal Surya

    2017-11-01

    Rukoh Reservoir was planned to be built in the Krueng Rukoh watershed, with suppletion from the Krueng Tiro River. Operating Rukoh Reservoir as a multipurpose reservoir raised a potential conflict of interest between raw water and irrigation water. In this study, the operating system of Rukoh Reservoir was designed to supply raw water to Titeu Sub-District and to cover the water shortage in the Baro irrigation area that cannot be served by the Keumala Weir. The reservoir operating system should be planned optimally so that the water is used in accordance with service area demands. The reservoir operation was analyzed using an optimization technique with nonlinear programming, with the aim of minimizing potential conflicts of interest in the operation. Suppletion discharge from the Krueng Tiro River amounted to 46.62%, calculated based on the ratio of the Baro and Tiro irrigation areas. However, during dry seasons, water demands could not be fully met, so there was a shortage of water. By considering the rules to minimize potential conflicts of interest between raw water and irrigation water, a suppletion from Krueng Tiro of 52.30% would be required. The increased suppletion volume could minimize conflicts of interest: it produced 100% reservoir reliability for raw water and irrigation demands. Rukoh Reservoir could serve the raw water demands of Titeu Sub-District and the irrigation demands of the Baro irrigation area, which covers 6,047 hectares. The reservoir operation guidelines can specify reservoir water release to balance the demands and the target storage.

  14. Estimating the Optimal Capacity for Reservoir Dam based on Reliability Level for Meeting Demands

    Directory of Open Access Journals (Sweden)

    Mehrdad Taghian

    2017-02-01

    Full Text Available Introduction: One of the practical and classic problems in water resource studies is estimation of the optimal reservoir capacity to satisfy demands. However, fully supplying demands over all periods requires a very high dam to cover severe drought conditions. That means a major part of the reservoir capacity and costs is usable only for a short period of the reservoir lifetime, which would be unjustified in an economic analysis. Thus, in the proposed method and model, fully meeting demand is required only for a percentage of the time in the statistical period, according to a reliability constraint. Although this concept apparently seems simple, the general methods need binary variables for meeting or not meeting demands in the linear programming model structure, and with many binary variables, solving the problem becomes time consuming and difficult. Another way to solve the problem is the application of the yield model; this model involves simpler assumptions, but it is difficult to consider the details of the water resource system with it. The application of evolutionary algorithms to problems with many constraints is also very complicated. Therefore, this study pursues another solution. Materials and Methods: In this study, to develop and improve the usual methods, instead of mixed integer linear programming (MILP) and the above methods, a simulation model including flow network linear programming is used, coupled with an interface code in Matlab that computes the reliability based on the output file of the simulation model. The Acres Reservoir Simulation Program (ARSP) has been utilized as the simulation model. A major advantage of the ARSP is its inherent flexibility in defining operating policies through a penalty structure specified by the user. The ARSP utilizes network flow optimization techniques to handle a subset of general linear programming (LP) problems for individual time intervals

  15. Balancing exploration, uncertainty and computational demands in many objective reservoir optimization

    Science.gov (United States)

    Zatarain Salazar, Jazmin; Reed, Patrick M.; Quinn, Julianne D.; Giuliani, Matteo; Castelletti, Andrea

    2017-11-01

    Reservoir operations are central to our ability to manage river basin systems serving conflicting multi-sectoral demands under increasingly uncertain futures. These challenges motivate the need for new solution strategies capable of effectively and efficiently discovering the multi-sectoral tradeoffs that are inherent to alternative reservoir operation policies. Evolutionary many-objective direct policy search (EMODPS) is gaining importance in this context due to its capability of addressing multiple objectives and its flexibility in incorporating multiple sources of uncertainties. This simulation-optimization framework has high potential for addressing the complexities of water resources management, and it can benefit from current advances in parallel computing and meta-heuristics. This study contributes a diagnostic assessment of state-of-the-art parallel strategies for the auto-adaptive Borg Multi Objective Evolutionary Algorithm (MOEA) to support EMODPS. Our analysis focuses on the Lower Susquehanna River Basin (LSRB) system where multiple sectoral demands from hydropower production, urban water supply, recreation and environmental flows need to be balanced. Using EMODPS with different parallel configurations of the Borg MOEA, we optimize operating policies over different size ensembles of synthetic streamflows and evaporation rates. As we increase the ensemble size, we increase the statistical fidelity of our objective function evaluations at the cost of higher computational demands. This study demonstrates how to overcome the mathematical and computational barriers associated with capturing uncertainties in stochastic multiobjective reservoir control optimization, where parallel algorithmic search serves to reduce the wall-clock time in discovering high quality representations of key operational tradeoffs. Our results show that emerging self-adaptive parallelization schemes exploiting cooperative search populations are crucial. Such strategies provide a

  16. Detection of the water reservoir in a forming planetary system

    NARCIS (Netherlands)

    Hogerheijde, M.R.; Bergin, E.A.; Brinch, C.; Cleeves, L.I.; Fogel, J. K.J.; Blake, G.A.; Dominik, C.; Lis, D.C.; Melnick, G.; Neufeld, D.; Panić, O.; Pearson, J.C.; Kristensen, L.; Yıldız, U.A.; van Dishoeck, E.F.

    2011-01-01

    Icy bodies may have delivered the oceans to the early Earth, yet little is known about water in the ice-dominated regions of extrasolar planet-forming disks. The Heterodyne Instrument for the Far-Infrared on board the Herschel Space Observatory has detected emission lines from both spin isomers of

  17. Application of remote sensing methods for detection of water pollution degree in rivers and water reservoirs

    International Nuclear Information System (INIS)

    Krzyworzeka, M.; Piasek, Z.

    1997-01-01

    The paper presents non-contact registration methods of electromagnetic radiation which can be used for the detection of water pollution in rivers and water reservoirs. These methods include aerial photographs, satellite images and thermograms. The satellite images need reprocessing to obtain mutual comparability of images from the various multispectral scanners (TM and MSS).

  18. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used to assess the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws, such as cracks and crack-like flaws, are to be detected using these NDE methods, and a reliably detectable crack size is required for the safe-life analysis of fracture-critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls, while keeping the flaw sizes in the set as small as possible.
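    The binomial logic behind the 29-flaw demonstration can be sketched in a few lines. The function below is an illustrative sketch of the binomial model only, not NASA's qualification procedure or the mh1823 software.

```python
from math import comb

def prob_pass(pod, n_flaws, max_misses=0):
    """Probability of passing an n-flaw demonstration with at most
    max_misses missed detections, given the true POD (binomial model)."""
    return sum(comb(n_flaws, k) * (1 - pod) ** k * pod ** (n_flaws - k)
               for k in range(max_misses + 1))

# Classic 29-of-29 demonstration: if the true POD were only 0.90, the
# chance of detecting all 29 flaws is below 5%, so a clean pass
# demonstrates 90% POD at 95% confidence.
ppd_at_090 = prob_pass(0.90, 29)
assert ppd_at_090 < 0.05
```

    Evaluating prob_pass over candidate (n_flaws, max_misses) designs is one way to trade off the PPD against the false-call risk that the abstract describes.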

  19. The detection of leakages in open reservoirs by the radioisotope sorption method

    International Nuclear Information System (INIS)

    Owczarczyk, A.; Wierzchnicki, R.; Urbanski, T.; Chmielewski, A.G.; Szpilowski, S.

    1992-01-01

    Location of leakages in large hydro-engineering plants and industrial water reservoirs is of great importance from the viewpoint of both the safety and the economy of their exploitation. The large variety of water reservoirs encountered in hydro-engineering and industry calls for adaptation of the investigation methods to their specific features. In the paper, a number of methodological variants of the known radiotracer technique developed at the INCT are presented. They are intended to detect and locate leakages in hydro-engineering reservoirs and dams as well as large open industrial tanks. The radioisotopes Au-198 and In-113m used for that purpose show excellent sorption characteristics on the typical construction materials used to build such objects. (author). 8 refs, 8 figs

  20. Integrated method to optimize well connection and platform placement on a multi-reservoir scenario

    Energy Technology Data Exchange (ETDEWEB)

    Sousa, Sergio Henrique Guerra de; Madeira, Marcelo Gomes; Franca, Martha Salles [Halliburton, Rio de Janeiro, RJ (Brazil); Mota, Rosane Oliveira; Silva, Edilon Ribeiro da; King, Vanessa Pereira Spear [Petroleo Brasileiro S.A. (PETROBRAS), Rio de Janeiro, RJ (Brazil)

    2012-07-01

    This paper describes a workflow created to optimize platform placement and well-platform connections in a multi-reservoir scenario using an integrated reservoir simulator paired with an optimization engine. The proposed methodology describes how a new platform, being incorporated into a pre-existing asset, can be better used to develop newly-discovered fields, while helping increase the production of existing fields by sharing their production load. The sharing of production facilities is highly important in Brazilian offshore assets because of their high price (a few billion dollars per facility) and the fact that total production is usually limited to the installed capacity of liquid processing, an important constraint given the high water-cut well production rates typical of this region. The case study asset used to present the workflow consists of two deepwater oil fields, each one developed by its own production platform, and a newly-discovered field with strong aquifer support that will be entirely developed with a new production platform. Because this new field should not include injector wells owing to the strong aquifer presence, the idea is to consider reconnecting existing wells from the two pre-existing fields to better use the production resources. In this scenario, the platform location is an important optimization issue, as a balance between supporting the production of the planned wells on the new field and the production of re-routed wells from the existing fields must be reached to achieve improved overall asset production. If the new platform is too far away from any interconnected production well, pressure-drop issues along the pipeline might actually decrease production from the existing fields rather than augment it. The main contribution of this work is to give the reader insights on how to model and optimize these complex decisions to generate high-quality scenarios. (author)

  1. Optimizing and Quantifying CO2 Storage Resource in Saline Formations and Hydrocarbon Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Bosshart, Nicholas W. [Univ. of North Dakota, Grand Folks, ND (United States). Energy & Environmental Research Center; Ayash, Scott C. [Univ. of North Dakota, Grand Folks, ND (United States). Energy & Environmental Research Center; Azzolina, Nicholas A. [Univ. of North Dakota, Grand Folks, ND (United States). Energy & Environmental Research Center; Peck, Wesley D. [Univ. of North Dakota, Grand Folks, ND (United States). Energy & Environmental Research Center; Gorecki, Charles D. [Univ. of North Dakota, Grand Folks, ND (United States). Energy & Environmental Research Center; Ge, Jun [Univ. of North Dakota, Grand Folks, ND (United States). Energy & Environmental Research Center; Jiang, Tao [Univ. of North Dakota, Grand Folks, ND (United States). Energy & Environmental Research Center; Burton-Kelly, Matthew E. [Univ. of North Dakota, Grand Folks, ND (United States). Energy & Environmental Research Center; Anderson, Parker W. [Univ. of North Dakota, Grand Folks, ND (United States). Energy & Environmental Research Center; Dotzenrod, Neil W. [Univ. of North Dakota, Grand Folks, ND (United States). Energy & Environmental Research Center; Gorz, Andrew J. [Univ. of North Dakota, Grand Folks, ND (United States). Energy & Environmental Research Center

    2017-06-30

    In an effort to reduce carbon dioxide (CO2) emissions from large stationary sources, carbon capture and storage (CCS) is being investigated as one approach. This work assesses CO2 storage resource estimation methods for deep saline formations (DSFs) and hydrocarbon reservoirs undergoing CO2 enhanced oil recovery (EOR). Project activities were conducted using geologic modeling and simulation to investigate CO2 storage efficiency. CO2 storage rates and efficiencies in DSFs classified by interpreted depositional environment were evaluated at the regional scale over a 100-year time frame. A focus was placed on developing results applicable to future widespread commercial-scale CO2 storage operations in which an array of injection wells may be used to optimize storage in saline formations. The results of this work suggest future investigations of prospective storage resource in closed or semiclosed formations need not have a detailed understanding of the depositional environment of the reservoir to generate meaningful estimates. However, the results of this work also illustrate the relative importance of depositional environment, formation depth, structural geometry, and boundary conditions on the rate of CO2 storage in these types of systems. CO2 EOR occupies an important place in the realm of geologic storage of CO2, as it is likely to be the primary means of geologic CO2 storage during the early stages of commercial implementation, given the lack of a national policy and the viability of the current business case. This work estimates CO2 storage efficiency factors using a unique industry database of CO2 EOR sites and 18 different reservoir simulation models capturing fluvial clastic and shallow shelf carbonate depositional environments for reservoir depths of 1219 and 2438 meters (4000 and 8000 feet) and 7.6-, 20-, and 64-meter (25-, 66

  2. Optic disc detection using ant colony optimization

    Science.gov (United States)

    Dias, Marcy A.; Monteiro, Fernando C.

    2012-09-01

    The retinal fundus images are used in the treatment and diagnosis of several eye diseases, such as diabetic retinopathy and glaucoma. This paper proposes a new method to detect the optic disc (OD) automatically, because knowledge of the OD location is essential to the automatic analysis of retinal images. Ant Colony Optimization (ACO) is an optimization algorithm inspired by the foraging behaviour of some ant species that has been applied in image processing for edge detection. Recently, the ACO was used in fundus images to detect edges, and therefore, to segment the OD and other anatomical retinal structures. We present an algorithm for the detection of the OD in the retina which takes advantage of the Gabor wavelet transform, entropy and the ACO algorithm. Forty images of the retina from the DRIVE database were used to evaluate the performance of our method.

  3. Optimizing Water Use and Hydropower Production in Operational Reservoir System Scheduling with RiverWare

    Science.gov (United States)

    Magee, T. M.; Zagona, E. A.

    2017-12-01

    Practical operational optimization of multipurpose reservoir systems is challenging for several reasons. Each purpose has its own constraints, which may conflict with those of other purposes, and while hydropower generation typically provides the bulk of the revenue, it is also among the lowest-priority purposes. Each river system has important site-specific details such as hydrology, reservoir storage capacity, physical limitations, bottlenecks, and the continuing evolution of operational policy. In addition, reservoir operations models include discrete, nonlinear, and nonconvex physical processes and if-then operating policies. Typically, the forecast horizon for scheduling needs to be extended far into the future to avoid near-term (e.g., a few hours or a day) scheduling decisions that result in undesirable future states; this makes the computational effort much larger than may be expected. Put together, these challenges lead to large, customized mathematical optimization problems which must be solved efficiently to be of practical use, and the solution process must be robust in an operational setting. We discuss a unique modeling approach in RiverWare that meets these challenges in an operational setting. The approach combines a Preemptive Linear Goal Programming optimization model to handle prioritized policies, complemented by preprocessing and postprocessing with Rulebased Simulation to improve the solution with regard to nonlinearities, discrete issues, and if-then logic. An interactive policy language with a graphical user interface allows modelers to customize both the optimization and simulation based on the unique aspects of the policy for their system, while the routine physical aspects of operations are modeled automatically. The modeler is aided by a set of compiled predefined functions and functions shared by other modelers. We illustrate the success of the approach with examples from daily use at the Tennessee Valley

  4. Application of integrated reservoir management and reservoir characterization to optimize infill drilling. Quarterly progress report, June 13, 1995--September 12, 1995

    Energy Technology Data Exchange (ETDEWEB)

    Pande, P.K.

    1995-09-12

    At this stage of the reservoir characterization research, the main emphasis is on the geostatistics and reservoir simulation. Progress is reported on geological analysis, reservoir simulation, and reservoir management.

  5. Optimizing Fracture Treatments in a Mississippian "Chat" Reservoir, South-Central Kansas

    Energy Technology Data Exchange (ETDEWEB)

    K. David Newell; Saibal Bhattacharya; Alan Byrnes; W. Lynn Watney; Willard Guy

    2005-10-01

    This project is a collaboration of Woolsey Petroleum Corporation (a small independent operator) and the Kansas Geological Survey. The project will investigate geologic and engineering factors critical for designing hydraulic fracture treatments in Mississippian "chat" reservoirs. Mississippian reservoirs, including the chat, account for 159 million m3 (1 billion barrels) of the cumulative oil produced in Kansas. Mississippian reservoirs presently represent ~40% of the state's 5.6 million m3 (35 million barrels) annual production. Although geographically widespread, the "chat" is a heterogeneous reservoir composed of chert, cherty dolomite, and argillaceous limestone. Fractured chert with micro-moldic porosity is the best reservoir in this 18- to 30-m-thick (60- to 100-ft) unit. The chat will be cored in an infill well in the Medicine Lodge North field (417,638 m3 [2,626,858 bbls] oil; 217,811,000 m3 [7,692,010 mcf] gas cumulative production; discovered 1954). The core and modern wireline logs will provide geological and petrophysical data for designing a fracture treatment. Optimum hydraulic fracturing design is poorly defined in the chat, with poor correlation of treatment size to production increase. To establish new geologic and petrophysical guidelines for these treatments, data from core petrophysics, wireline logs, and oil-field maps will be input to a fracture-treatment simulation program. Parameters will be established for the optimal size of the treatment and the geologic characteristics of the predicted fracturing. The fracturing will be performed and subsequent wellsite tests will ascertain the results for comparison to predictions. A reservoir simulation program will then predict the rate and volumetric increase in production. Comparison of the predicted increase in production with that of reality, and the hypothetical fracturing behavior of the reservoir with that of its actual behavior, will serve as tests of

  6. A new optimization framework using genetic algorithm and artificial neural network to reduce uncertainties in petroleum reservoir models

    Science.gov (United States)

    Maschio, Célio; José Schiozer, Denis

    2015-01-01

    In this article, a new optimization framework to reduce uncertainties in petroleum reservoir attributes using artificial intelligence techniques (neural network and genetic algorithm) is proposed. Instead of using the deterministic values of the reservoir properties, as in a conventional process, the parameters of the probability density function of each uncertain attribute are set as design variables in an optimization process using a genetic algorithm. The objective function (OF) is based on the misfit of a set of models, sampled from the probability density function, and a symmetry factor (which represents the distribution of curves around the history) is used as weight in the OF. Artificial neural networks are trained to represent the production curves of each well and the proxy models generated are used to evaluate the OF in the optimization process. The proposed method was applied to a reservoir with 16 uncertain attributes and promising results were obtained.

  7. The Application of GA, SMPSO and HGAPSO in Optimal Reservoirs Operation

    Directory of Open Access Journals (Sweden)

    Alireza Moghaddam

    2017-02-01

    Full Text Available Introduction: Reservoir operation is a large-scale multi-objective optimization problem that must consider reliability and the demands of hydrology, energy, agriculture and the environment. Until now, no algorithm has been able to consider all of the above-mentioned demands; because of their limitations, existing algorithms usually solve a simplified form of the problem. In recent decades, meta-heuristic algorithms have been introduced into water resources problems to overcome complexities, such as non-linearity and non-convexity, that limit mathematical optimization methods. This paper presents a Simple Modified Particle Swarm Optimization (SMPSO) algorithm, obtained by applying a new factor to the Particle Swarm Optimization (PSO) algorithm. A new hybrid method, called HGAPSO, is then developed by combining it with the Genetic Algorithm (GA). Finally, the performance of the GA, SMPSO and HGAPSO algorithms on the reservoir operation problem is investigated, with water supply as the objective function over a period of 60 months of inflow data. Materials and Methods: The GA is one of the newer programming methods, which uses the theory of evolution and survival from biology and the principles of genetics. The GA has been developed as an effective method for optimization problems that are beyond the limitations of classical methods. The SMPSO algorithm belongs to the family of swarm intelligence methods, in which a solution is a member of a population of birds, each known as a particle. In this collection, the birds have individual artificial intelligence, develop social behavior, and coordinate their movement toward a specific destination. The goal of this process is to combine individual intelligence with social interaction. The new modification factor in SMPSO improves the speed of convergence to the optimal answer. The HGAPSO is a suggested combination of GA

  8. Correlation Analysis of Rainstorm Runoff and Density Current in a Canyon-Shaped Source Water Reservoir: Implications for Reservoir Optimal Operation

    Directory of Open Access Journals (Sweden)

    Yang Li

    2018-04-01

    Full Text Available Extreme weather has recently become frequent. Heavy rainfall forms storm runoff, which is usually very turbid and contains a high concentration of organic matter, therefore affecting water quality when it enters reservoirs. The large canyon-shaped Heihe Reservoir is the most important raw water source for the city of Xi’an. During the flood season, storm runoff flows into the reservoir as a density current. We determined the relationships among inflow peak discharge (Q), suspended sediment concentration, inflow water temperature, and undercurrent water density. The relationship between Q and inflow suspended sediment concentration (CS0) could be described by the equation CS0 = 0.3899 × e^(0.0025Q); that between CS0 and the suspended sediment concentration at the entrance of the main reservoir area S1 (CS1) was determined as CS1 = 0.0346 × e^(0.2335CS0); and air temperature (Ta) and inflow water temperature (Tw), based on the meteorological data, were related as Tw = 0.7718 × Ta + 1.0979. We then calculated the density of the undercurrent layer. By comparison with the vertical water density distribution at S1 before rainfall, the undercurrent elevation was determined based on the principle of equivalent density inflow. Based on our results, we proposed schemes for optimizing water intake selection and flood discharge during the flood season.
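    The three fitted relations can be chained to estimate undercurrent properties from a forecast storm. The sketch below simply evaluates the published regressions; the example discharge and air temperature, and the unit conventions, are illustrative assumptions.

```python
import math

def inflow_sediment(Q):
    """CS0 = 0.3899 * e^(0.0025 Q): inflow suspended sediment from peak discharge."""
    return 0.3899 * math.exp(0.0025 * Q)

def station_sediment(CS0):
    """CS1 = 0.0346 * e^(0.2335 CS0): sediment at the main reservoir entrance S1."""
    return 0.0346 * math.exp(0.2335 * CS0)

def inflow_temperature(Ta):
    """Tw = 0.7718 * Ta + 1.0979: inflow water temperature from air temperature."""
    return 0.7718 * Ta + 1.0979

# Illustrative rainstorm event: peak discharge 800 (flow units as in the
# paper) and air temperature 22 degrees C.
Q, Ta = 800.0, 22.0
CS0 = inflow_sediment(Q)
CS1 = station_sediment(CS0)
Tw = inflow_temperature(Ta)
```

    From CS1 and Tw, the density of the undercurrent layer follows, and the intake elevation can be chosen by the equivalent-density principle described above.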

  9. Optimization of European call options considering physical delivery network and reservoir operation rules

    Science.gov (United States)

    Cheng, Wei-Chen; Hsu, Nien-Sheng; Cheng, Wen-Ming; Yeh, William W.-G.

    2011-10-01

    This paper develops alternative strategies for European call options for water purchase under hydrological uncertainties that can be used by water resources managers for decision making. Each alternative strategy maximizes its own objective over a selected sequence of future hydrology that is characterized by exceedance probability. Water trade provides flexibility and enhances water distribution system reliability. However, water trade between two parties in a regional water distribution system involves many issues, such as delivery network, reservoir operation rules, storage space, demand, water availability, uncertainty, and any existing contracts. An option is a security giving the right to buy or sell an asset; in our case, the asset is water. We extend a flow path-based water distribution model to include reservoir operation rules. The model simultaneously considers both the physical distribution network as well as the relationships between water sellers and buyers. We first test the model extension. Then we apply the proposed optimization model for European call options to the Tainan water distribution system in southern Taiwan. The formulation lends itself to a mixed integer linear programming model. We use the weighting method to formulate a composite function for a multiobjective problem. The proposed methodology provides water resources managers with an overall picture of water trade strategies and the consequence of each strategy. The results from the case study indicate that the strategy associated with a streamflow exceedance probability of 50% or smaller should be adopted as the reference strategy for the Tainan water distribution system.

  10. Automatic detection of epileptic seizures on the intra-cranial electroencephalogram of rats using reservoir computing.

    Science.gov (United States)

    Buteneers, Pieter; Verstraeten, David; van Mierlo, Pieter; Wyckhuys, Tine; Stroobandt, Dirk; Raedt, Robrecht; Hallez, Hans; Schrauwen, Benjamin

    2011-11-01

    In this paper we propose a technique based on reservoir computing (RC) to mark epileptic seizures on the intra-cranial electroencephalogram (EEG) of rats. RC is a recurrent neural network training technique which has been shown to possess good generalization properties with limited training. The system is evaluated on data containing two different seizure types: absence seizures from genetic absence epilepsy rats from Strasbourg (GAERS) and tonic-clonic seizures from kainate-induced temporal-lobe epilepsy rats. The dataset consists of 452 hours from 23 GAERS and 982 hours from 15 kainate-induced temporal-lobe epilepsy rats. During the preprocessing stage, several features are extracted from the EEG. A feature selection algorithm selects the best features, which are then presented as input to the RC-based classification algorithm. To classify the output of this algorithm a two-threshold technique is used, and this technique is compared with other state-of-the-art techniques. A balanced error rate (BER) of 3.7% and 3.5% was achieved on the data from GAERS and kainate rats, respectively. This resulted in a sensitivity of 96% and 94% and a specificity of 96% and 99%, respectively. The state-of-the-art technique for GAERS achieved a BER of 4%, whereas the best technique to detect tonic-clonic seizures achieved a BER of 16%. Our method outperforms up-to-date techniques and only a few parameters need to be optimized on a limited training set. It is therefore suited as an automatic aid for epilepsy researchers and is able to eliminate the tedious manual review and annotation of EEG. Copyright © 2011 Elsevier B.V. All rights reserved.
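    The two-threshold classification of the RC output and the balanced error rate can be sketched as follows. This is a generic hysteresis detector on a made-up score trace, an assumption-laden illustration rather than the authors' exact post-processing.

```python
import numpy as np

def two_threshold_detect(score, high, low):
    """Mark samples as seizure when the classifier output crosses the
    high threshold, and keep each detection active while the output
    stays above the low threshold (hysteresis)."""
    out = np.zeros_like(score, dtype=bool)
    active = False
    for i, s in enumerate(score):
        active = s >= high or (active and s >= low)
        out[i] = active
    return out

def balanced_error_rate(pred, truth):
    """BER = 1 - (sensitivity + specificity) / 2."""
    tp = np.sum(pred & truth); fn = np.sum(~pred & truth)
    tn = np.sum(~pred & ~truth); fp = np.sum(pred & ~truth)
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return 1 - (sens + spec) / 2

# Made-up classifier output and ground-truth annotation.
score = np.array([0.1, 0.2, 0.9, 0.6, 0.55, 0.3, 0.1])
truth = np.array([0, 0, 1, 1, 1, 0, 0], dtype=bool)
pred = two_threshold_detect(score, high=0.8, low=0.5)
```

    The hysteresis keeps a seizure marked through dips of the score between the two thresholds, which is what makes the scheme less jittery than a single threshold.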

  11. Risk-weighted optimization for cyclic steam stimulation of water-underlain reservoirs

    International Nuclear Information System (INIS)

    Mehra, R.K.; Ding, L.Y.; Donnelly, J.K.

    1991-01-01

    The paper describes the procedure that was adopted to develop operating guidelines for the cyclic steam stimulation process in a water-underlain oil sands reservoir. The study consisted of three parts: i) the risks associated with fracturing into water sands were quantified by conducting a stochastic simulation; ii) numerical simulations were conducted to obtain correlations among the cumulative production volumes and six selected operating variables; iii) these correlations were subsequently incorporated into an economic model to maximize profits under different constraints. The study concluded that a wide range of operating conditions, rather than a unique value, maximized the project profitability. It recommended that a cautious approach to exploiting these resources was warranted, since the penalty for selecting non-optimal conditions was not high.

  12. Phase Behaviors of Reservoir Fluids with Capillary Effect Using Particle Swarm Optimization

    KAUST Repository

    Ma, Zhiwei

    2013-05-06

    The study of phase behavior is important for the oil and gas industry, and many approaches have been proposed and developed for phase behavior calculation. In this thesis, an alternative method is introduced to study phase behavior by means of minimization of the Helmholtz free energy. For a system at fixed volume, constant temperature, and constant number of moles, the Helmholtz free energy reaches its minimum at the equilibrium state. Based on this theory, a stochastic method, the Particle Swarm Optimization (PSO) algorithm, is implemented to compute the phase diagrams for several pure-component and mixture systems. After comparing with experimental data and the classical PT-flash calculation, we found that the phase diagrams obtained by minimization of the Helmholtz free energy match the experimental and theoretical diagrams very well. The capillary effect is also considered in this thesis because it has a significant influence on the phase behavior of reservoir fluids. In this part, we focus on computing the phase envelopes, which consist of bubble point and dew point lines. Cases with both fixed capillary pressure and capillary pressure calculated from the Young-Laplace equation are introduced to study their effects on phase envelopes. We found that the existence of capillary pressure changes the phase envelopes: positive capillary pressure reduces the dew point and bubble point temperatures at the same pressure, while negative capillary pressure increases them. In addition, changes in contact angle and pore radius affect the phase envelope, although the effect of the pore radius is insignificant when the radius is very large. These results may serve as a reference for future research. Keywords: Phase Behavior; Particle Swarm Optimization; Capillary Pressure; Reservoir Fluids; Phase Equilibrium; Phase Envelope.
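
The PSO loop the thesis relies on can be sketched generically. Here a toy two-variable quadratic stands in for the Helmholtz free-energy surface; the bounds, swarm size, and coefficients are illustrative assumptions, not the thesis settings.

```python
# Hedged, generic PSO sketch: minimize f over box bounds [lo, hi].
# The toy objective below is a stand-in for a free-energy surface.
import random

def pso_minimize(f, lo, hi, n_particles=20, iters=200,
                 w=0.7, c1=1.5, c2=1.5, seed=1):
    rnd = random.Random(seed)
    dim = len(lo)
    xs = [[rnd.uniform(lo[d], hi[d]) for d in range(dim)] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]
    pval = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                # inertia + cognitive pull (own best) + social pull (swarm best)
                vs[i][d] = (w * vs[i][d]
                            + c1 * rnd.random() * (pbest[i][d] - xs[i][d])
                            + c2 * rnd.random() * (gbest[d] - xs[i][d]))
                xs[i][d] = min(max(xs[i][d] + vs[i][d], lo[d]), hi[d])
            val = f(xs[i])
            if val < pval[i]:
                pbest[i], pval[i] = xs[i][:], val
                if val < gval:
                    gbest, gval = xs[i][:], val
    return gbest, gval

# Toy surface with a known minimum at (1, 2):
f = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2
best, val = pso_minimize(f, lo=[-5.0, -5.0], hi=[5.0, 5.0])
```

The same loop applies to a free-energy objective once `f` evaluates the Helmholtz free energy for a candidate phase split.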

  14. Optimization of Reinforced Concrete Reservoir with Circumferential Stiffeners Strips by Particle Swarm Algorithm

    Directory of Open Access Journals (Sweden)

    GholamReza Havaei

    2015-09-01

    Full Text Available Reinforced concrete reservoirs (RCR) have been used extensively in municipal and industrial facilities for several decades. The design of these structures requires that attention be given not only to strength requirements, but to serviceability requirements as well. These structures may be square, round, or oval reinforced concrete tanks located above, below, or partially below ground. The main challenge is to design concrete liquid-containing structures that resist extreme seasonal temperature changes and a variety of loading conditions, and remain liquid-tight over a useful life of 50 to 60 years. In this study, optimization is performed by a particle swarm algorithm based on structural design. First, structural analysis is used to determine the full range of shell thicknesses and rebar areas. Second, through a parameter-identification interchange scheme, source code implementing the particle swarm algorithm in MATLAB is linked to the analysis software, so that the optimal thickness and total bar area are found for each element. Finally, optimizing the structure with circumferential stiffeners yields a 19% decrease in rebar weight, a 20% decrease in concrete volume, and at least a 13% cost reduction in construction compared with conventional 10,000 m3 RCR structures.

  15. Multi-Objective Optimization of the Hedging Model for reservoir Operation Using Evolutionary Algorithms

    Directory of Open Access Journals (Sweden)

    sadegh sadeghitabas

    2015-12-01

    Full Text Available Multi-objective problems rarely provide a single optimal solution; rather, they yield an optimal set of outputs (Pareto fronts). Solving these problems was previously accomplished using simplifying methods such as the weighting coefficient method, which converts a multi-objective problem into a single objective function. However, robust tools such as multi-objective meta-heuristic algorithms have recently been developed for solving these problems. The hedging model is one of the classic problems of reservoir operation and is generally employed for mitigating drought impacts in water resources management. Under this method, although it may be possible to supply the total planned demands, only portions of the demands are met: water is saved by allowing small deficits under current conditions in order to avoid or reduce severe deficits in the future. The approach depends heavily on economic and social considerations. In the present study, the meta-heuristic algorithms NSGA-II, MOPSO, SPEA-II, and AMALGAM are used for the multi-objective optimization of the hedging model. For this purpose, the rationing factors involved in Taleghan dam operation are optimized over a 35-year statistical period of inflow. There are two objective functions: (a) minimizing the modified shortage index, and (b) maximizing the reliability index (i.e., two opposing objectives). The results show that the above algorithms can generate a wide range of optimal solutions. Among the algorithms, AMALGAM is found to produce a better Pareto front for the values of the objective functions, indicating its more satisfactory performance.
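
The Pareto-front idea underlying all four algorithms can be sketched in a few lines: keep only the candidate policies not dominated by any other. The candidate (shortage index, reliability) pairs below are illustrative, not Taleghan results.

```python
# Hedged sketch: extract the Pareto front from candidate hedging policies.
# Each candidate is (shortage_index, reliability): minimize the first,
# maximize the second. The numbers are illustrative.
def pareto_front(candidates):
    """A candidate is dominated if another is at least as good in both
    objectives and strictly better in one."""
    front = []
    for i, (s_i, r_i) in enumerate(candidates):
        dominated = any(
            (s_j <= s_i and r_j >= r_i) and (s_j < s_i or r_j > r_i)
            for j, (s_j, r_j) in enumerate(candidates) if j != i
        )
        if not dominated:
            front.append((s_i, r_i))
    return front

cands = [(0.10, 0.90), (0.20, 0.95), (0.15, 0.85), (0.10, 0.92)]
front = pareto_front(cands)
```

Algorithms such as NSGA-II apply exactly this non-domination test (plus diversity pressure) inside an evolutionary loop; the filter above is the comparison they all share.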

  16. Optimized Strategies for Detecting Extrasolar Space Weather

    Science.gov (United States)

    Hallinan, Gregg

    2018-06-01

    Fully understanding the implications of space weather for the young solar system, as well as the wider population of planet-hosting stars, requires remote sensing of space weather in other stellar systems. Solar coronal mass ejections can be accompanied by bright radio bursts at low frequencies (typically measurement of the magnetic field strength of the planet, informing on whether the atmosphere of the planet can survive the intense magnetic activity of its host star. However, both stellar and planetary radio emission are highly variable and optimal strategies for detection of these emissions requires the capability to monitor 1000s of nearby stellar/planetary systems simultaneously. I will discuss optimized strategies for both ground and space-based experiments to take advantage of the highly variable nature of the radio emissions powered by extrasolar space weather to enable detection of stellar CMEs and planetary magnetospheres.

  17. Fracture detection, mapping, and analysis of naturally fractured gas reservoirs using seismic technology. Final report, November 1995

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-10-01

    Many basins in the Rocky Mountains contain naturally fractured gas reservoirs. Production from these reservoirs is controlled primarily by the shape, orientation, and concentration of the natural fractures. The detection of gas-filled fractures prior to drilling can, therefore, greatly benefit the field development of the reservoirs. The objective of this project was to test and verify specific seismic methods to detect and characterize fractures in a naturally fractured reservoir. The Upper Green River tight gas reservoir in the Uinta Basin, Northeast Utah, was chosen for the project as a suitable reservoir to test the seismic technologies. Knowledge of the structural and stratigraphic geologic setting, the fracture azimuths, and estimates of the local in-situ stress field were used to guide the acquisition and processing of approximately ten miles of nine-component seismic reflection data and a nine-component Vertical Seismic Profile (VSP). Three sources (compressional P-wave, inline shear S-wave, and cross-line shear S-wave) were each recorded by 3-component (3C) geophones to yield a nine-component data set. Evidence of fractures from cores, borehole image logs, outcrop studies, and production data was integrated with the geophysical data to develop an understanding of how the seismic data relate to the fracture network, individual well production, and ultimately the preferred flow direction in the reservoir. The multi-disciplinary approach employed in this project is viewed as essential to the overall reservoir characterization, due to the interdependency of the above factors.

  18. Optimal Detection under the Restricted Bayesian Criterion

    Directory of Open Access Journals (Sweden)

    Shujun Liu

    2017-07-01

    Full Text Available This paper aims to find a suitable decision rule for a binary composite hypothesis-testing problem with a partial or coarse prior distribution. To alleviate the negative impact of this information uncertainty, a constraint is imposed that the maximum conditional risk cannot exceed a predefined value. The objective of this paper thus becomes finding the optimal decision rule that minimizes the Bayes risk under this constraint. By applying Lagrange duality, the constrained optimization problem is transformed into an unconstrained optimization problem. In doing so, the restricted Bayesian decision rule is obtained as a classical Bayesian decision rule corresponding to a modified prior distribution. Based on this transformation, the optimal restricted Bayesian decision rule is analyzed and the corresponding algorithm is developed. Furthermore, the relation between the Bayes risk and the predefined constraint value is also discussed: the Bayes risk obtained via the restricted Bayesian decision rule is a strictly decreasing and convex function of the constraint on the maximum conditional risk. Finally, numerical results, including a detection example, are presented and agree with the theoretical results.
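
A numerical sketch of the constrained problem for the simplest case: a scalar Gaussian mean-shift test under 0-1 loss, where a grid search over threshold tests minimizes the Bayes risk subject to a cap on the maximum conditional risk. The means, prior, and cap are illustrative assumptions; the paper's Lagrange-duality construction of the modified prior is not reproduced here.

```python
# Hedged sketch of a restricted-Bayes threshold test: H0 mean 0 vs H1 mean 1,
# unit variance, 0-1 loss. Minimize Bayes risk subject to
# max(conditional risks) <= alpha. All parameters are illustrative.
import math

def Phi(z):  # standard normal CDF
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def restricted_bayes_threshold(prior1=0.2, alpha=0.35, grid=2001):
    best = None
    for k in range(grid):
        t = -3.0 + 6.0 * k / (grid - 1)
        r0 = 1.0 - Phi(t)        # false alarm: decide H1 under H0
        r1 = Phi(t - 1.0)        # miss: decide H0 under H1
        if max(r0, r1) > alpha:  # restricted-Bayes feasibility
            continue
        bayes = prior1 * r1 + (1 - prior1) * r0
        if best is None or bayes < best[0]:
            best = (bayes, t, r0, r1)
    return best

best = restricted_bayes_threshold()
```

With this small prior on H1, the unconstrained Bayes threshold would accept a very large miss rate; the constraint pulls the threshold back until the miss-rate cap binds, which is the behavior the paper characterizes via the modified prior.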

  19. Reservoir management

    International Nuclear Information System (INIS)

    Satter, A.; Varnon, J.E.; Hoang, M.T.

    1992-01-01

    A reservoir's life begins with exploration leading to discovery, followed by delineation of the reservoir, development of the field, production by primary, secondary, and tertiary means, and finally abandonment. Sound reservoir management is the key to maximizing economic operation of the reservoir throughout its entire life. Technological advances and rapidly increasing computer power are providing tools to better manage reservoirs and are widening the gap between good and neutral reservoir management. The modern reservoir management process involves goal setting, planning, implementing, monitoring, evaluating, and revising plans. Setting a reservoir management strategy requires knowledge of the reservoir, availability of technology, and knowledge of the business, political, and environmental climate. Formulating a comprehensive management plan involves depletion and development strategies, data acquisition and analyses, geological and numerical model studies, production and reserves forecasts, facilities requirements, economic optimization, and management approval. This paper provides management, engineers, geologists, geophysicists, and field operations staff with a better understanding of the practical approach to reservoir management using a multidisciplinary, integrated team approach.

  1. Chemical Flooding in Heavy-Oil Reservoirs: From Technical Investigation to Optimization Using Response Surface Methodology

    Directory of Open Access Journals (Sweden)

    Si Le Van

    2016-09-01

    Full Text Available Heavy-oil resources represent a large percentage of global oil and gas reserves; however, owing to the high viscosity, enhanced oil recovery (EOR) techniques are critical for extracting this type of crude oil from the reservoir. According to survey data in the Oil & Gas Journal, thermal methods are the most widely utilized in EOR projects in heavy-oil fields in the US and Canada, and few successful chemical flooding projects for heavy oil have been reported elsewhere in the world. However, thermal methods such as steam injection may be restricted in cases of thin formations, overlying permafrost, or reservoir depths over 4500 ft, for which chemical flooding becomes a better option for recovering crude oil. Moreover, owing to considerable fluctuations in the oil price, chemical injection plans should be evaluated consistently from both technical and economic viewpoints. The numerical studies in this work aim to identify the predominant chemical injection schemes among the various combinations of chemical agents involving alkali (A), surfactant (S), and polymer (P) for specific heavy-oil reservoir conditions. The feasibility of all potential injection sequences is evaluated in a pre-evaluation stage in order to select the most efficient injection scheme according to the variation in the oil price, based on practical market values. Finally, optimization procedures in the post-evaluation stage are carried out for the most economic injection plan using an effective mathematical tool, with the purpose of attaining the highest Net Present Value (NPV) for the project. In technical terms, the numerical studies confirm the superior performance of sequences in which an alkali-surfactant-polymer (ASP) solution is injected after the first preflushing water, whereby the recovery factor can exceed 47%. In particular, oil production performance is improved by injecting a buffering viscous fluid immediately after the first chemical slug.

  2. Exploitation and Optimization of Reservoir Performance in Hunton Formation, Oklahoma, Budget Period I, Class Revisit

    Energy Technology Data Exchange (ETDEWEB)

    Kelkar, Mohan

    2002-04-02

    This report explains the unusual characteristics of West Carney Field based on detailed geological and engineering analyses. A geological history that explains the presence of mobile water and oil in the reservoir is proposed. The combination of matrix and fractures in the reservoir explains the reservoir's flow behavior. We confirm our hypothesis by matching observed performance with a simulation model and develop procedures for correlating core data to log data so that the analysis can be extended to other, similar fields where core coverage may be limited.

  3. An Optimization Model for Kardeh Reservoir Operation Using Interval-Parameter, Multi-stage, Stochastic Programming

    Directory of Open Access Journals (Sweden)

    Fatemeh Rastegaripour

    2010-09-01

    Full Text Available The present study investigates the allocation of water from the Kardeh Reservoir to domestic and agricultural users using interval-parameter, multi-stage, stochastic programming (IMSLP) under uncertainty. The advantages of the method include its dynamic nature, the use of a pre-defined policy in its optimization process, and the use of interval parameters and probabilities under uncertainty. Additionally, it offers different decision-making alternatives for different water-shortage scenarios. The required data were collected from the Khorasan Razavi Regional Water Organization and from the Water and Wastewater Co. for the period 1988-2007. Results showed that, under the worst conditions, the water deficits expected for each of the next 3 years will be 1.9, 2.55, and 3.11 million cubic meters for domestic use and 0.22, 0.32, and 0.75 million cubic meters for irrigation. Approximate reductions of 0.5, 0.7, and 1 million cubic meters in the monthly consumption of the urban community, together with enhanced irrigation efficiencies of about 6, 11, and 20% in the agricultural sector, are recommended as approaches to combat the water shortage over the next 3 years.

  4. Molecular detection of anaerobic ammonium-oxidizing (anammox) bacteria in high-temperature petroleum reservoirs.

    Science.gov (United States)

    Li, Hui; Chen, Shuo; Mu, Bo-Zhong; Gu, Ji-Dong

    2010-11-01

    The anaerobic ammonium-oxidizing (anammox) process plays an important role in the nitrogen cycle of anoxic and mesophilic habitats worldwide. Recently, the existence and activity of anammox bacteria have been detected in some thermophilic environments, but their existence in geothermal subterranean oil reservoirs had not been reported. This study investigated the abundance, distribution, and functional diversity of anammox bacteria in 17 high-temperature oil reservoirs by molecular ecology analysis. High concentrations (5.31-39.2 mg l⁻¹) of ammonium were detected in the production water from these oilfields, with temperatures between 55°C and 75°C. Both 16S rRNA and hzo molecular biomarkers indicated the occurrence of anammox bacteria in nine of the 17 samples. Most of the 16S rRNA gene phylotypes are closely related to the known anammox bacterial genera Candidatus Brocadia, Candidatus Kuenenia, Candidatus Scalindua, and Candidatus Jettenia, while the hzo gene phylotypes are closely related to the genera Candidatus Anammoxoglobus, Candidatus Kuenenia, Candidatus Scalindua, and Candidatus Jettenia. The total bacterial and anammox bacterial densities were 6.4 ± 0.5 × 10³ to 2.0 ± 0.18 × 10⁶ cells ml⁻¹ and 6.6 ± 0.51 × 10² to 4.9 ± 0.36 × 10⁴ cells ml⁻¹, respectively. Cluster I of the 16S rRNA gene sequences showed distant identity (<92%) to the known Candidatus Scalindua species, suggesting that this cluster of anammox bacteria represents a new species, for which the tentative name Candidatus "Scalindua sinooilfield" was proposed. The results extend the known range of anammox bacteria to high-temperature oil reservoirs.

  5. Optimal Operation of a Network of Multi-purpose Reservoir : A Review

    NARCIS (Netherlands)

    Nay Myo Lin, N.M.; Rutten, M.M.

    2016-01-01

    Due to the effects of climate change and population growth, reservoirs play a more and more important role in water resources management. The management of a multi-reservoir system is complex due to the curse of dimensionalities, nonlinearities and conflicts between different objectives. The

  6. Optimal Robust Fault Detection for Linear Discrete Time Systems

    Directory of Open Access Journals (Sweden)

    Nike Liu

    2008-01-01

    Full Text Available This paper considers robust fault-detection problems for linear discrete time systems. It is shown that the optimal robust detection filters for several well-recognized robust fault-detection problems, such as the ℋ−/ℋ∞, ℋ2/ℋ∞, and ℋ∞/ℋ∞ problems, are identical and can be obtained by solving a standard algebraic Riccati equation. Optimal filters are also derived for many other optimization criteria, and it is shown that some well-studied and seemingly sensible optimization criteria for fault-detection filter design can lead to optimal but useless fault-detection filters.
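
The abstract's filters come from a standard algebraic Riccati equation. As a loose, hedged illustration of the residual-generation idea behind such fault-detection filters, the scalar sketch below iterates the discrete Riccati recursion to a fixed point and flags a sensor fault when the observer residual exceeds a threshold; the system parameters, fault size, and threshold are invented for the example and are not from the paper.

```python
# Hedged scalar sketch: x_{k+1} = a x_k + w,  y_k = c x_k + v.
# The observer gain comes from iterating the discrete Riccati recursion
#   P <- a^2 P + q - (a P c)^2 / (c^2 P + r)
# to a fixed point; residuals (innovations) drive the fault flag.
def steady_state_gain(a, c, q, r, iters=500):
    P = 1.0
    for _ in range(iters):
        P = a * a * P + q - (a * P * c) ** 2 / (c * c * P + r)
    return a * P * c / (c * c * P + r), P

def residuals(ys, a, c, K):
    xhat, res = 0.0, []
    for y in ys:
        e = y - c * xhat        # innovation / residual
        res.append(e)
        xhat = a * xhat + K * e # observer update
    return res

K, P = steady_state_gain(a=0.9, c=1.0, q=0.1, r=0.1)
# Fault-free output, then an additive sensor fault of +2.0 from k = 10:
ys = [0.0] * 10 + [2.0] * 10
res = residuals(ys, 0.9, 1.0, K)
flags = [abs(e) > 1.0 for e in res]
```

The residual is near zero in normal operation and jumps at the fault onset; the robust designs in the paper shape exactly this residual's sensitivity to faults versus disturbances.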

  7. Well pattern optimization in a low permeability sandstone reservoir: a case study from Erlian Basin in China

    Science.gov (United States)

    Wang, Xia; Fu, Lixia; Yan, Aihua; Guo, Fajun; Wu, Cong; Chen, Hong; Wang, Xinying; Lu, Ming

    2018-02-01

    The optimization of development well patterns is a core element of oilfield development and a prerequisite for the rational and effective development of an oilfield. Well pattern optimization mainly comprises the type and the density of well patterns. This paper takes the Aer-3 fault block as an example. First, models were built for diamond-shaped inverted 9-spot, rectangular 5-spot, square inverted 9-spot, and inverted 7-spot patterns under the same well pattern density to compare the effect of different well patterns on development. Second, a comprehensive economic and technical analysis of well pattern density was conducted using methods such as reservoir engineering, numerical simulation, economic limits, and economic rationality. Finally, a development mode combining vertical and horizontal wells was proposed according to the characteristics of the oil reservoirs in some well blocks, enabling efficient development of this fault block.

  8. Detecting fluid leakage of a reservoir dam based on streaming self-potential measurements

    Science.gov (United States)

    Song, Seo Young; Kim, Bitnarae; Nam, Myung Jin; Lim, Sung Keun

    2015-04-01

    Water leakage has been reported several times among the many agricultural reservoir dams in suburban areas of South Korea. The dam under consideration in this study, located in Gyeong-buk in the south-east of the Korean Peninsula, was reported to have a large leakage at the right foot of the downstream side of the dam. Because such leakage can lead to dam failure, both geological surveys and geophysical explorations were carried out for a precision safety diagnosis. The geophysical exploration included electrical-resistivity and self-potential (SP) surveys, while the geological survey included water permeability tests, standard penetration tests, and sampling of undisturbed samples during the drilling investigation. The geophysical explorations were made not only along the top of the dam but also across the heel of the dam. Leakage in water installations can not only change the known heterogeneous structure of the dam body but also cause a streaming spontaneous (self) potential anomaly; these effects can be detected by electrical-resistivity and SP measurements, respectively. For the interpretation of the streaming SP, we used a trial-and-error method, comparing synthetic SP data with field SP data to update the model. For the computation, we first invert the resistivity data to obtain the distorted resistivity structure of the dam levee and then perform three-dimensional electrical-resistivity modeling for the streaming potential distribution. Our simulation algorithm for the streaming SP distribution, based on an integrated finite difference scheme, computes the two-dimensional (2D) SP distribution from the calculated fluid flow velocities for a given permeability structure together with the physical properties. This permeability is repeatedly updated based on the error between synthetic and field SP data until the synthetic data match the field data. Through this trial-and-error-based SP interpretation, we locate the leakage.

  9. On the application of artificial bee colony (ABC algorithm for optimization of well placements in fractured reservoirs; efficiency comparison with the particle swarm optimization (PSO methodology

    Directory of Open Access Journals (Sweden)

    Behzad Nozohour-leilabady

    2016-03-01

    Full Text Available The application of a recent optimization technique, the artificial bee colony (ABC), was investigated in the context of finding optimal well locations. The ABC performance was compared with corresponding results from the particle swarm optimization (PSO) algorithm under essentially similar conditions. Out-of-boundary solution vectors were treated via the periodic boundary condition (PBC), which presumably accelerates convergence towards the global optimum. Stochastic searches were initiated from several random starting points to minimize starting-point dependency in the established results. The optimizations aimed to maximize the Net Present Value (NPV) objective function over the considered oilfield production durations. To deal with reservoir heterogeneity, random permeability was applied via normal/uniform distribution functions. In addition, the issue of an increased number of optimization parameters was addressed by considering scenarios with multiple injector and producer wells, and cases with deviated wells in a real reservoir model. The typical results show that ABC outperforms PSO (in the cases studied) after relatively short optimization cycles, indicating the promise of the ABC methodology for well-optimization purposes.
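
A minimal, hedged sketch of the ABC loop with the periodic boundary condition mentioned above: out-of-range coordinates re-enter from the opposite side of the search box. A toy quadratic stands in for the (negated) NPV surface; colony size, limits, and bounds are illustrative assumptions, not the study's settings.

```python
# Hedged ABC sketch (employed / onlooker / scout phases) with periodic
# boundary handling of out-of-range coordinates. Minimizes f over [lo, hi].
import random

def wrap(x, lo, hi):
    """Periodic boundary condition: wrap back into [lo, hi)."""
    return lo + (x - lo) % (hi - lo)

def abc_minimize(f, lo, hi, n_food=15, iters=300, limit=20, seed=3):
    rnd = random.Random(seed)
    dim = len(lo)
    foods = [[rnd.uniform(lo[d], hi[d]) for d in range(dim)] for _ in range(n_food)]
    vals = [f(x) for x in foods]
    trials = [0] * n_food

    def neighbour(i):
        j, d = rnd.randrange(n_food), rnd.randrange(dim)
        x = foods[i][:]
        x[d] = wrap(x[d] + rnd.uniform(-1, 1) * (x[d] - foods[j][d]), lo[d], hi[d])
        return x

    def greedy(i, x):
        v = f(x)
        if v < vals[i]:
            foods[i], vals[i], trials[i] = x, v, 0
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_food):                  # employed bees
            greedy(i, neighbour(i))
        fit = [1.0 / (1.0 + v) for v in vals]    # onlooker probabilities
        s = sum(fit)
        for _ in range(n_food):                  # onlooker bees
            r, acc, i = rnd.random() * s, 0.0, 0
            for k, fk in enumerate(fit):
                acc += fk
                if acc >= r:
                    i = k
                    break
            greedy(i, neighbour(i))
        for i in range(n_food):                  # scouts abandon stale sources
            if trials[i] > limit:
                foods[i] = [rnd.uniform(lo[d], hi[d]) for d in range(dim)]
                vals[i], trials[i] = f(foods[i]), 0
    b = min(range(n_food), key=lambda i: vals[i])
    return foods[b], vals[b]

# Toy stand-in for a negated NPV surface, optimum at (3, -2):
npv_neg = lambda x: (x[0] - 3.0) ** 2 + (x[1] + 2.0) ** 2
best, val = abc_minimize(npv_neg, lo=[-10.0, -10.0], hi=[10.0, 10.0])
```

In a well-placement setting, each food source would encode well coordinates and `f` would call the reservoir simulator; the wrap step is what implements the PBC treatment of out-of-boundary vectors.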

  10. Sensitivity analysis and economic optimization studies of inverted five-spot gas cycling in gas condensate reservoir

    Science.gov (United States)

    Shams, Bilal; Yao, Jun; Zhang, Kai; Zhang, Lei

    2017-08-01

    Gas condensate reservoirs usually exhibit complex flow behaviors because of the propagation of the pressure-drop response from the wellbore into the reservoir. When reservoir pressure drops below the dew point, two-phase flow of gas and condensate develops and large amounts of condensate accumulate in the reservoir. Usually, the saturation of the accumulated condensate in volumetric gas condensate reservoirs is lower than the critical condensate saturation, which causes large amounts of condensate to be trapped in the reservoir pores. Trapped condensate is often lost to condensate blockage because of the high molecular weight of the heavy condensate residue. Recovering this lost condensate economically and optimally has always been a challenging goal, so gas cycling is applied to alleviate such a drastic loss of resources. In gas injection, the flooding pattern, injection timing, and injection duration are key parameters for designing an efficient EOR scenario to recover the lost condensate. This work contains a sensitivity analysis of different parameters to investigate the effects of different injection scenarios on the performance of a homogeneous gas condensate system. In this paper, the starting time of gas cycling and the injection period are the parameters used to influence the condensate recovery of a five-spot well pattern with an injection pressure constraint of 3000 psi and production wells constrained at a minimum BHP of 500 psi. Starting injection times of 1 month, 4 months, and 9 months after natural depletion are applied in the first study. The second study varies the injection duration, with three durations selected: 100 days, 400 days, and 900 days. In miscible gas injection, miscibility and vaporization of condensate by the injected gas are the more efficient mechanisms for condensate recovery. This study shows that the application of gas cycling to a five-spot well pattern greatly enhances condensate recovery.

  11. Performance assessment of deterministic and probabilistic weather predictions for the short-term optimization of a tropical hydropower reservoir

    Science.gov (United States)

    Mainardi Fan, Fernando; Schwanenberg, Dirk; Alvarado, Rodolfo; Assis dos Reis, Alberto; Naumann, Steffi; Collischonn, Walter

    2016-04-01

    Hydropower is the most important electricity source in Brazil. During recent years, it accounted for 60% to 70% of the total electric power supply. Marginal costs of hydropower are lower than those of thermal power plants; therefore, there is a strong economic motivation to maximize its share. On the other hand, hydropower depends on the availability of water, which has a natural variability. Its extremes lead to the risks of power production deficits during droughts and of safety issues in the reservoir and downstream river reaches during flood events. One building block of the proper management of hydropower assets is the short-term forecast of reservoir inflows as input for an online, event-based optimization of the release strategy. While deterministic forecasts and optimization schemes are the established techniques for short-term reservoir management, the use of probabilistic ensemble forecasts and stochastic optimization techniques is receiving growing attention, and a number of studies have shown their benefits. The present work shows one of the first hindcasting and closed-loop control experiments for a multi-purpose hydropower reservoir in a tropical region of Brazil. The case study is the hydropower project (HPP) Três Marias, located in southeast Brazil. The HPP reservoir is operated with two main objectives: (i) hydroelectricity generation and (ii) flood control at Pirapora City, located 120 km downstream of the dam. In the experiments, precipitation forecasts based on observed data, as well as deterministic forecasts and probabilistic forecasts with 50 ensemble members from ECMWF, are used as forcing for the MGB-IPH hydrological model to generate streamflow forecasts over a period of 2 years. The online optimization relies on deterministic and multi-stage stochastic versions of a model predictive control scheme. Results for the perfect forecasts show the potential benefit of the online optimization and indicate a desired forecast lead time of 30 days. In comparison, the use of
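
The receding-horizon (model predictive control) principle described above can be shown in a toy single-reservoir form: optimize releases over a short horizon of forecast inflows, apply only the first release, then roll forward. The storage target, release cap, inflows, and coarse grid search are illustrative assumptions; the paper's multi-stage stochastic MPC with ensemble forecasts is not reproduced.

```python
# Hedged toy MPC sketch for one reservoir: keep storage near a target while
# capping release (flood control proxy). All numbers are illustrative.
def mpc_release(storage, inflows, target, r_max, horizon=3):
    """Enumerate release sequences on a coarse grid over the horizon and
    return the first release of the cheapest sequence (squared storage
    deviation cost)."""
    grid = [r_max * k / 10.0 for k in range(11)]
    best_cost, best_first = None, 0.0

    def search(s, t, first, cost):
        nonlocal best_cost, best_first
        if t == horizon:
            if best_cost is None or cost < best_cost:
                best_cost, best_first = cost, first
            return
        for r in grid:
            s2 = s + inflows[t] - r
            if s2 < 0:           # cannot release more water than available
                continue
            search(s2, t + 1, r if t == 0 else first, cost + (s2 - target) ** 2)

    search(storage, 0, 0.0, 0.0)
    return best_first

# Closed loop over a (perfectly) forecast inflow sequence:
inflow = [4.0, 6.0, 10.0, 3.0, 2.0]
s, releases = 50.0, []
for t in range(len(inflow) - 2):
    r = mpc_release(s, inflow[t:t + 3], target=50.0, r_max=8.0)
    releases.append(r)
    s = s + inflow[t] - r
```

Note how the horizon lets the controller pre-release ahead of the large inflow at step 3 instead of reacting to it, which is the essence of forecast-driven reservoir MPC.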

  12. Seismic fracture detection of shale gas reservoir in Longmaxi formation, Sichuan Basin, China

    Science.gov (United States)

    Lu, Yujia; Cao, Junxing; Jiang, Xudong

    2017-11-01

    In shale reservoirs, fractures play an important role: they not only provide storage space for oil and gas but also offer favorable petroleum migration channels. It is therefore of great significance to study fracture characteristics in shale reservoirs for the exploration and development of shale gas. In this paper, four analysis technologies, namely coherence, curvature attributes, structural stress field simulation and pre-stack P-wave azimuthal anisotropy, have been applied to predict the fracture distribution in the Longmaxi formation, Silurian, southeastern Sichuan Basin, China. Using the coherence and curvature attributes, we obtained the spatial distribution characteristics of fractures in the study area. Structural stress field simulation helped us obtain the distribution characteristics of structural fractures. Using the azimuthal P-wave fracture detection technology, we obtained the fracture orientation and density of this region. Application results show that there are NW- and NE-trending fractures in the study block, which is basically consistent with the results of log interpretation. The results also provide a reliable geological basis for shale gas sweet spot prediction.

  13. A fast complex domain-matching pursuit algorithm and its application to deep-water gas reservoir detection

    Science.gov (United States)

    Zeng, Jing; Huang, Handong; Li, Huijie; Miao, Yuxin; Wen, Junxiang; Zhou, Fei

    2017-12-01

    The main emphasis of exploration and development is shifting from simple structural reservoirs to complex reservoirs, which are characterized by complex structure, thin reservoir thickness and large burial depth. Faced with these complex geological features, hydrocarbon detection technology gives a direct indication of changes in hydrocarbon reservoirs and is a good approach for delimiting the distribution of underground reservoirs. It is common to utilize the time-frequency (TF) features of seismic data in detecting hydrocarbon reservoirs. We therefore investigate the complex domain-matching pursuit (CDMP) method and propose some improvements. The first is the introduction of a scale parameter, which corrects the defect that atomic waveforms change only with the frequency parameter. Its introduction not only decomposes the seismic signal with high accuracy and efficiency but also reduces the number of iterations. We also integrate a jumping search with an ergodic search to improve computational efficiency while maintaining reasonable accuracy. We then combine the improved CDMP with the Wigner-Ville distribution to obtain a high-resolution TF spectrum. A one-dimensional modeling experiment has proved the validity of our method. Based on the low-frequency-domain reflection coefficient in fluid-saturated porous media, we finally derive an approximation formula for the mobility attributes of reservoir fluid. This approximation formula is used as a hydrocarbon identification factor to predict the deep-water gas-bearing sands of the M oil field in the South China Sea. The results are consistent with the actual well test results, and our method can help inform the future exploration of deep-water gas reservoirs.
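
    The greedy atom-selection step at the core of any matching pursuit variant can be sketched as follows. The Gabor-style dictionary (with an explicit scale parameter, echoing the modification described above), the test signal and all parameters are illustrative assumptions, not the authors' CDMP implementation.

```python
import numpy as np

def gabor_atom(n, f, scale, t0):
    # Gaussian-windowed cosine atom; `scale` controls the envelope width
    t = np.arange(n)
    g = np.exp(-((t - t0) ** 2) / (2 * scale ** 2)) * np.cos(2 * np.pi * f * t)
    return g / np.linalg.norm(g)

def matching_pursuit(signal, dictionary, n_iter):
    # greedily subtract the best-correlated atom from the residual
    residual = signal.astype(float).copy()
    approx = np.zeros_like(residual)
    for _ in range(n_iter):
        corrs = [abs(np.dot(residual, a)) for a in dictionary]
        k = int(np.argmax(corrs))
        c = np.dot(residual, dictionary[k])
        residual -= c * dictionary[k]
        approx += c * dictionary[k]
    return approx, residual

n = 128
atoms = [gabor_atom(n, f, s, t0)
         for f in (0.05, 0.1, 0.2)
         for s in (4, 8, 16)            # the added scale parameter
         for t0 in range(0, n, 16)]
rng = np.random.default_rng(0)
sig = 3.0 * gabor_atom(n, 0.1, 8, 64) + 0.01 * rng.normal(size=n)
approx, res = matching_pursuit(sig, atoms, n_iter=5)
```

    A TF spectrum would then be formed from the selected atoms, e.g. by summing the Wigner-Ville distributions of the individual atoms as the abstract describes.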

  14. Event detection and localization for small mobile robots using reservoir computing.

    Science.gov (United States)

    Antonelo, E A; Schrauwen, B; Stroobandt, D

    2008-08-01

    Reservoir Computing (RC) techniques use a fixed (usually randomly created) recurrent neural network, or more generally any dynamic system, which operates at the edge of stability, where only a linear static readout output layer is trained by standard linear regression methods. In this work, RC is used for detecting complex events in autonomous robot navigation. This can be extended to robot localization tasks which are solely based on a few low-range, high-noise sensory data. The robot thus builds an implicit map of the environment (after learning) that is used for efficient localization by simply processing the input stream of distance sensors. These techniques are demonstrated in both a simple simulation environment and in the physically realistic Webots simulation of the commercially available e-puck robot, using several complex and even dynamic environments.
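
    The defining feature of RC, a fixed random recurrent network with only a trained linear readout, can be sketched in a few lines. The echo state network update, the toy detection task and all parameters below are assumptions for illustration, not the authors' robot setup.

```python
import numpy as np

rng = np.random.default_rng(42)
n_in, n_res = 3, 100

# fixed random reservoir, scaled so the network operates near the edge of stability
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()   # spectral radius 0.9

def run_reservoir(inputs):
    # drive the untrained recurrent network and collect its states
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W_in @ u + W @ x)
        states.append(x.copy())
    return np.array(states)

# toy "event detection" task: flag when the summed sensor reading is high
T = 500
U = rng.uniform(0, 1, (T, n_in))
y = (U.sum(axis=1) > 1.5).astype(float)
X = run_reservoir(U)

# only the static linear readout is trained, here by ridge regression
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
pred = (X @ W_out > 0.5).astype(float)
accuracy = (pred == y).mean()
```

    In the robot setting, the inputs would be the stream of noisy distance sensors and the readout targets the event or location labels; the reservoir itself is never retrained.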

  15. Derivation of optimal joint operating rules for multi-purpose multi-reservoir water-supply system

    Science.gov (United States)

    Tan, Qiao-feng; Wang, Xu; Wang, Hao; Wang, Chao; Lei, Xiao-hui; Xiong, Yi-song; Zhang, Wei

    2017-08-01

    The derivation of joint operating policy is a challenging task for a multi-purpose multi-reservoir system. This study proposed an aggregation-decomposition model to guide the joint operation of multi-purpose multi-reservoir system, including: (1) an aggregated model based on the improved hedging rule to ensure the long-term water-supply operating benefit; (2) a decomposed model to allocate the limited release to individual reservoirs for the purpose of maximizing the total profit of the facing period; and (3) a double-layer simulation-based optimization model to obtain the optimal time-varying hedging rules using the non-dominated sorting genetic algorithm II, whose objectives were to minimize maximum water deficit and maximize water supply reliability. The water-supply system of Li River in Guangxi Province, China, was selected for the case study. The results show that the operating policy proposed in this study is better than conventional operating rules and aggregated standard operating policy for both water supply and hydropower generation due to the use of hedging mechanism and effective coordination among multiple objectives.
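
    A simple two-point hedging rule of the kind underlying such operating policies can be sketched as follows; the trigger, rationing ratio and inflow sequence are illustrative assumptions, not the study's optimized time-varying rules.

```python
def hedging_release(available, demand, trigger, ratio):
    """Two-point hedging: ration supply before the reservoir empties.

    available : water available this period (storage + inflow)
    trigger   : below this, start hedging (rationing)
    ratio     : slope of the rationing line (0 < ratio <= 1)
    """
    if available >= trigger:
        return min(demand, available)          # standard operating policy
    return min(demand, ratio * available)      # hedged (reduced) release

# illustrative comparison against the standard operating policy (SOP)
inflows = [5, 1, 1, 1, 6]
demand, capacity = 4, 10

def run(policy):
    s, deficits = 6, []
    for q in inflows:
        avail = min(capacity, s + q)
        r = policy(avail)
        deficits.append(demand - r)
        s = avail - r
    return max(deficits)

sop = run(lambda a: min(demand, a))
hedged = run(lambda a: hedging_release(a, demand, trigger=8, ratio=0.6))
```

    By releasing slightly less than demand in advance of a shortage, hedging trades small early deficits for a smaller maximum deficit, which is exactly the first objective optimized by NSGA-II in the study.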

  16. Optimal spatio-temporal design of water quality monitoring networks for reservoirs: Application of the concept of value of information

    Science.gov (United States)

    Maymandi, Nahal; Kerachian, Reza; Nikoo, Mohammad Reza

    2018-03-01

    This paper presents a new methodology for optimizing Water Quality Monitoring (WQM) networks of reservoirs and lakes using the concept of the value of information (VOI) and utilizing results of a calibrated numerical water quality simulation model. In the value of information framework, the water quality at every checkpoint, with a specific prior probability, varies in time. After analyzing water quality samples taken from potential monitoring points, the posterior probabilities are updated using Bayes' theorem, and the VOI of the samples is calculated. In the next step, the stations with maximum VOI are selected as optimal stations. This process is repeated for each sampling interval to obtain optimal monitoring network locations for each interval. The results of the proposed VOI-based methodology are compared with those obtained using an entropy-theoretic approach. As the results of the two methodologies can be partially different, in the next step the results are combined using a weighting method. Finally, the optimal sampling interval and locations of WQM stations are chosen using the Evidential Reasoning (ER) decision-making method. The efficiency and applicability of the methodology are evaluated using available water quantity and quality data of the Karkheh Reservoir in the southwestern part of Iran.
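
    The Bayes update and VOI computation for a single checkpoint can be sketched as follows. The two-state water quality model, the likelihoods and the 0/1 loss are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

# two hypotheses about water quality at a checkpoint: [clean, polluted]
prior = np.array([0.7, 0.3])

# assumed likelihood of observing a high-concentration sample under each state
likelihood_high = np.array([0.1, 0.8])

def posterior(prior, likelihood):
    # Bayes' theorem: P(state | sample) is proportional to P(sample | state) P(state)
    p = likelihood * prior
    return p / p.sum()

def expected_loss(p):
    # 0/1 loss for acting on the wrong state; best action picks the likeliest state
    return 1.0 - p.max()

# VOI = expected loss under the prior minus expected loss after sampling
p_high = (likelihood_high * prior).sum()       # marginal prob. of a high sample
post_high = posterior(prior, likelihood_high)
post_low = posterior(prior, 1.0 - likelihood_high)
voi = expected_loss(prior) - (p_high * expected_loss(post_high)
                              + (1 - p_high) * expected_loss(post_low))
```

    Repeating this for every candidate station and sampling interval, and keeping the stations with maximum VOI, reproduces the selection loop described above in miniature.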

  17. Application of advanced reservoir characterization, simulation and production optimization strategies to maximize recovery in slope and basin clastic reservoirs, West Texas (Delaware Basin). Annual report

    Energy Technology Data Exchange (ETDEWEB)

    Dutton, S.P.; Asquith, G.B.; Barton, M.D.; Cole, A.G.; Gogas, J.; Malik, M.A.; Clift, S.J.; Guzman, J.I.

    1997-11-01

    The objective of this project is to demonstrate that detailed reservoir characterization of slope and basin clastic reservoirs in sandstones of the Delaware Mountain Group in the Delaware Basin of West Texas and New Mexico is a cost-effective way to recover a higher percentage of the original oil in place through strategic placement of infill wells and geologically based field development. This project involves reservoir characterization of two Late Permian slope and basin clastic reservoirs in the Delaware Basin, West Texas, followed by a field demonstration in one of the fields. The fields being investigated are Geraldine Ford and Ford West fields in Reeves and Culberson Counties, Texas. Project objectives are divided into two major phases, reservoir characterization and implementation. The objectives of the reservoir characterization phase of the project were to provide a detailed understanding of the architecture and heterogeneity of the two fields, the Ford Geraldine unit and Ford West field. Reservoir characterization utilized 3-D seismic data, high-resolution sequence stratigraphy, subsurface field studies, outcrop characterization, and other techniques. Once reservoir characterization was completed, a pilot area of approximately 1 mi{sup 2} at the northern end of the Ford Geraldine unit was chosen for reservoir simulation. This report summarizes the results of the second year of reservoir characterization.

  18. modelling for optimal number of line storage reservoirs in a water

    African Journals Online (AJOL)

    user

    RESERVOIRS IN A WATER DISTRIBUTION SYSTEM. By. B.U. Anyata. Department ... water distribution systems, in order to balance the ... distribution line storage systems to meet peak demands at .... Evaluation Method. The criteria ... Pipe + Energy Cost (N). 191, 772 ... Economic Planning Model for Distributed information ...

  19. Stochastic reservoir optimization using El Niño information: case study of Daule Peripa, Ecuador

    DEFF Research Database (Denmark)

    Gelati, Emiliano; Madsen, Henrik; Rosbjerg, Dan

    2011-01-01

    , each corresponding to a hidden climate state. Climatic information is used as exogenous input and to condition state transitions. We apply the model to the inflow of the Daule Peripa reservoir in western Ecuador, where El Niño events cause anomalously heavy rainfall. El Niño–Southern Oscillation (ENSO...

  20. Modeling and optimizing the design of matrix treatments in carbonate reservoirs with self-diverting acid systems

    International Nuclear Information System (INIS)

    Bulgakova, G T; Kharisov, R Ya; Sharifullin, A R; Pestrikov, A V

    2015-01-01

    Application of a self-diverting acid based on viscoelastic surfactant (SDVA) is a promising technology for improving the efficacy of acid treatment in oil and gas-bearing carbonate reservoirs. In this study, we present a mathematical model for assessing SDVA flow and reaction with carbonate rock using the SDVA rheological characteristics. The model calculates the technological parameters for acidizing operations and the prediction of well productivity after acid treatment, in addition to technical and economic optimization of the acidizing process by modeling different acid treatment options with varying volumes, injection rates, process fluid stages and initial economic scenarios.

  1. Integrated petrophysical approach for determining reserves and reservoir characterization to optimize production of oil sands in northeastern Alberta

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, A.; Koch, J. [Weatherford Canada Partnership, Bonneyville, AB (Canada)

    2008-10-15

    This study used logging data, borehole imaging data, dipole sonic and magnetic resonance data to study a set of 6 wells in the McMurray Formation of northeastern Alberta. The data sets were used to understand the geologic settings, fluid properties, and rock properties of the area's geology as well as to more accurately estimate its reservoir and production potential. The study also incorporated data from electric, nuclear and acoustic measurements. A shaly sand analysis was used to provide key reservoir petrophysical data. Image data in the study was used to characterize the heterogeneity and permeability of the reservoir in order to optimize production. Results of the shaly sand analysis were then combined with core data and nuclear resonance data in order to determine permeability and lithology-independent porosity. Data sets were used to iteratively refine an integrated petrophysical analysis. Results of the analysis indicated that the depositional environment in which the wells were located did not match a typical fluvial-estuarine sands environment. A further interpretation of all data indicated that the wells were located in a shoreface environment. It was concluded that the integration of petrophysical measurements can enable geoscientists to more accurately characterize sub-surface environments. 3 refs., 7 figs.

  2. Optimal Scale Edge Detection Utilizing Noise within Images

    Directory of Open Access Journals (Sweden)

    Adnan Khashman

    2003-04-01

    Full Text Available Edge detection techniques have common problems that include poor edge detection in low contrast images, speed of recognition and high computational cost. An efficient solution to the edge detection of objects in low to high contrast images is scale space analysis. However, this approach is time consuming and computationally expensive. These expenses can be marginally reduced if an optimal scale is found in scale space edge detection. This paper presents a new approach to detecting objects within images using the noise within the images. The novel idea is based on selecting one optimal scale for the entire image at which scale space edge detection can be applied. The selection of an ideal scale is based on the hypothesis that "the optimal edge detection scale (ideal scale) depends on the noise within an image". This paper aims at providing experimental evidence on the relationship between the optimal scale and the noise within images.
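
    The hypothesis that the optimal scale depends on image noise can be sketched by estimating the noise level and mapping it to a Gaussian smoothing scale before gradient-based edge detection. The noise estimator, the noise-to-scale mapping and all parameters below are illustrative assumptions, not the author's method.

```python
import numpy as np

def estimate_noise(img):
    # robust noise proxy: median absolute deviation of a Laplacian-like
    # high-pass response (edges are sparse, so the median ignores them)
    hp = (img[1:-1, 1:-1] * 4 - img[:-2, 1:-1] - img[2:, 1:-1]
          - img[1:-1, :-2] - img[1:-1, 2:])
    return np.median(np.abs(hp)) / 0.6745

def gaussian_blur(img, sigma):
    # separable 1-D Gaussian convolution along rows, then columns
    r = int(3 * sigma) + 1
    x = np.arange(-r, r + 1)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    k /= k.sum()
    out = np.apply_along_axis(lambda row: np.convolve(row, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda col: np.convolve(col, k, mode="same"), 0, out)

def edges_at_noise_adapted_scale(img):
    sigma = 1.0 + 2.0 * estimate_noise(img)   # assumed noise-to-scale mapping
    sm = gaussian_blur(img, sigma)
    gy, gx = np.gradient(sm)
    return np.hypot(gx, gy), sigma

rng = np.random.default_rng(1)
img = np.zeros((64, 64))
img[:, 32:] = 1.0                              # a vertical step edge
noisy = img + 0.1 * rng.normal(size=img.shape)
mag, sigma = edges_at_noise_adapted_scale(noisy)
```

    Noisier images yield a larger estimated noise level and hence a coarser scale, so a single scale is selected for the whole image instead of searching the full scale space.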

  3. An Overview of Radar Waveform Optimization for Target Detection

    Directory of Open Access Journals (Sweden)

    Wang Lulu

    2016-10-01

    Full Text Available An optimal waveform design method that fully employs knowledge of the target and the environment can further improve target detection performance, and is thus of vital importance to research. In this paper, methods of radar waveform optimization for target detection are reviewed and summarized, providing a basis for further research.

  4. T-R Cycle Characterization and Imaging: Advanced Diagnostic Methodology for Petroleum Reservoir and Trap Detection and Delineation

    Energy Technology Data Exchange (ETDEWEB)

    Ernest A. Mancini

    2006-08-30

    Characterization of stratigraphic sequences (T-R cycles or sequences) included outcrop studies, well log analysis and seismic reflection interpretation. These studies were performed by researchers at the University of Alabama, Wichita State University and McGill University. The outcrop, well log and seismic characterization studies were used to develop a depositional sequence model, a T-R cycle (sequence) model, and a sequence stratigraphy predictive model. The sequence stratigraphy predictive model developed in this study is based primarily on the modified T-R cycle (sequence) model. The T-R cycle (sequence) model using transgressive and regressive systems tracts and aggrading, backstepping, and infilling intervals or sections was found to be the most appropriate sequence stratigraphy model for the strata in the onshore interior salt basins of the Gulf of Mexico to improve petroleum stratigraphic trap and specific reservoir facies imaging, detection and delineation. The known petroleum reservoirs of the Mississippi Interior and North Louisiana Salt Basins were classified using T-R cycle (sequence) terminology. The transgressive backstepping reservoirs have been the most productive of oil, and the transgressive backstepping and regressive infilling reservoirs have been the most productive of gas. Exploration strategies were formulated using the sequence stratigraphy predictive model and the classification of the known petroleum reservoirs utilizing T-R cycle (sequence) terminology. The well log signatures and seismic reflector patterns were determined to be distinctive for the aggrading, backstepping and infilling sections of the T-R cycle (sequence) and as such, well log and seismic data are useful for recognizing and defining potential reservoir facies. The use of the sequence stratigraphy predictive model, in combination with the knowledge of how the distinctive characteristics of the T-R system tracts and their subdivisions are expressed in well log patterns

  5. Conjunctively optimizing flash flood control and water quality in urban water reservoirs by model predictive control and dynamic emulation

    Science.gov (United States)

    Galelli, Stefano; Goedbloed, Albert; Schmitter, Petra; Castelletti, Andrea

    2014-05-01

    Urban water reservoirs are a viable adaptation option to account for increasing drinking water demand of urbanized areas as they allow storage and re-use of water that is normally lost. In addition, the direct availability of freshwater reduces pumping costs and diversifies the portfolios of drinking water supply. Yet, these benefits have an associated twofold cost. Firstly, the presence of large, impervious areas increases the hydraulic efficiency of urban catchments, with short time of concentration, increased runoff rates, losses of infiltration and baseflow, and higher risk of flash floods. Secondly, the high concentration of nutrients and sediments characterizing urban discharges is likely to cause water quality problems. In this study we propose a new control scheme combining Model Predictive Control (MPC), hydro-meteorological forecasts and dynamic model emulation to design real-time operating policies that conjunctively optimize water quantity and quality targets. The main advantage of this scheme stands in its capability of exploiting real-time hydro-meteorological forecasts, which are crucial in such fast-varying systems. In addition, the reduced computational requests of the MPC scheme allows coupling it with dynamic emulators of water quality processes. The approach is demonstrated on Marina Reservoir, a multi-purpose reservoir located in the heart of Singapore and characterized by a large, highly urbanized catchment with a short (i.e. approximately one hour) time of concentration. Results show that the MPC scheme, coupled with a water quality emulator, provides a good compromise between different operating objectives, namely flood risk reduction, drinking water supply and salinity control. Finally, the scheme is used to assess the effect of source control measures (e.g. green roofs) aimed at restoring the natural hydrological regime of Marina Reservoir catchment.

  6. Correcting underestimation of optimal fracture length by modeling proppant conductivity variations in hydraulically fractured gas/condensate reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Akram, A.H.; Samad, A. [Society of Petroleum Engineers, Richardson, TX (United States)]|[Schlumberger, Houston, TX (United States)

    2006-07-01

    A study was conducted in which a newly developed numerical simulator was used to forecast the productivity of a hydraulically fractured well in a retrograde gas-condensate sandstone reservoir. The effect of condensate dropout was modeled in both the reservoir and the proppant pack. The type of proppant and the stress applied to it are among the factors that determine proppant conductivity in a single-phase flow. Other factors include the high velocity of gas and the presence of liquid in the proppant pack. It was concluded that apparent proppant permeability in a gas condensate reservoir varies along the length of the hydraulic fracture and depends on the distance from the wellbore. It will increase towards the tip of the fracture where liquid ratio and velocity are lower. Apparent proppant permeability also changes with time. Forecasting is most accurate when these conditions are considered in the simulation. There are 2 problems associated with the use of a constant proppant permeability in a gas condensate reservoir. The first relates to the fact that it is impossible to obtain a correct single number that will mimic the drawdown of the real fracture at a particular rate without going through the process of determining the proppant permeability profile in a numerical simulator. The second problem relates to the fact that constant proppant permeability yields an optimal fracture length that is too short. Analytical modeling does not account for these complexities. It was determined that the only way to accurately simulate the behaviour of a hydraulic fracture in a high rate well, is by advanced numerical modeling that considers varying apparent proppant permeability in terms of time and distance along the fracture length. 10 refs., 2 tabs., 16 figs., 1 appendix.

  7. An effective streamflow process model for optimal reservoir operation using stochastic dual dynamic programming

    OpenAIRE

    Raso , L.; Malaterre , P.O.; Bader , J.C.

    2017-01-01

    This article presents an innovative streamflow process model for use in reservoir operational rule design in stochastic dual dynamic programming (SDDP). Model features, which can be applied independently, are (1) a multiplicative process model for the forward phase and its linearized version for the backward phase; and (2) a nonuniform time-step length that is inversely proportional to seasonal variability. The advantages are (1) guaranteeing positive streamflow values...

  8. Detection of urinary biomarkers in reservoir hosts of Leptospirosis by capillary electrophoresis mass spectrometry

    Science.gov (United States)

    Pathogenic leptospires colonize the renal tubules of reservoir hosts of infection and are excreted via urine into the environment. Reservoir hosts include a wide range of domestic and wild animal species and include cattle, dogs and rats which can persistently excrete large numbers of pathogenic lep...

  9. Insights into the Anaerobic Biodegradation Pathway of n-Alkanes in Oil Reservoirs by Detection of Signature Metabolites

    Science.gov (United States)

    Bian, Xin-Yu; Maurice Mbadinga, Serge; Liu, Yi-Fan; Yang, Shi-Zhong; Liu, Jin-Feng; Ye, Ru-Qiang; Gu, Ji-Dong; Mu, Bo-Zhong

    2015-01-01

    Anaerobic degradation of alkanes in hydrocarbon-rich environments has been documented and different degradation strategies proposed, of which the most encountered one is fumarate addition mechanism, generating alkylsuccinates as specific biomarkers. However, little is known about the mechanisms of anaerobic degradation of alkanes in oil reservoirs, due to low concentrations of signature metabolites and lack of mass spectral characteristics to allow identification. In this work, we used a multidisciplinary approach combining metabolite profiling and selective gene assays to establish the biodegradation mechanism of alkanes in oil reservoirs. A total of twelve production fluids from three different oil reservoirs were collected and treated with alkali; organic acids were extracted, derivatized with ethanol to form ethyl esters and determined using GC-MS analysis. Collectively, signature metabolite alkylsuccinates of parent compounds from C1 to C8 together with their (putative) downstream metabolites were detected from these samples. Additionally, metabolites indicative of the anaerobic degradation of mono- and poly-aromatic hydrocarbons (2-benzylsuccinate, naphthoate, 5,6,7,8-tetrahydro-naphthoate) were also observed. The detection of alkylsuccinates and genes encoding for alkylsuccinate synthase shows that anaerobic degradation of alkanes via fumarate addition occurs in oil reservoirs. This work provides strong evidence on the in situ anaerobic biodegradation mechanisms of hydrocarbons by fumarate addition. PMID:25966798

  10. Optimizing geologic CO2 sequestration by injection in deep saline formations below oil reservoirs

    International Nuclear Information System (INIS)

    Han, Weon Shik; McPherson, Brian J.

    2009-01-01

    The purpose of this research is to present a best-case paradigm for geologic CO2 storage: CO2 injection and sequestration in saline formations below oil reservoirs. This includes the saline-only section below the oil-water contact (OWC) in oil reservoirs, a storage target neglected in many current storage capacity assessments. This also includes saline aquifers (high porosity and permeability formations) immediately below oil-bearing formations. While this is a very specific injection target, we contend that most, if not all, oil-bearing basins in the US contain a great volume of such strata, which represent a rather large CO2 storage capacity option. We hypothesize that these are the best storage targets in those basins, and the purpose of this research is to evaluate this hypothesis. We quantitatively compared CO2 behavior in oil reservoirs and brine formations by examining the thermophysical properties of CO2, CO2-brine, and CO2-oil under various pressure, temperature, and salinity conditions. In addition, we compared the distributions of the gravity number (N), which characterizes the tendency towards buoyancy-driven CO2 migration, and the mobility ratio (M), which characterizes impeded CO2 migration, in oil reservoirs and brine formations. Our research suggests competing advantages and disadvantages of CO2 injection in oil reservoirs vs. brine formations: (1) CO2 solubility in oil is significantly greater than in brine (over 30 times); (2) the tendency towards buoyancy-driven CO2 migration is smaller in oil reservoirs because the density contrast between oil and CO2 is smaller than that between brine and CO2 (approximately ~100 kg/m3 between CO2 and crude oil, versus ~350 kg/m3 between CO2 and brine); (3) the increase in the density of oil and brine due to CO2 dissolution is not significant (about 7-15 kg/m3); (4) the viscosity reduction of oil due to CO2 dissolution is significant (from 5790 to 98 mPa·s). We compared these competing
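
    The density-contrast and mobility comparisons above can be reproduced with a few lines of arithmetic. The CO2 and brine densities are chosen so the contrasts match the ~100 and ~350 kg/m3 figures quoted in the abstract; the viscosities and the simplified definitions of the contrast and mobility ratio are assumptions for illustration.

```python
# illustrative fluid properties (CO2 density assumed; oil viscosity is the
# post-dissolution value quoted above)
rho_co2, rho_oil, rho_brine = 700.0, 800.0, 1050.0     # kg/m3
mu_co2, mu_oil, mu_brine = 0.06e-3, 98e-3, 0.6e-3      # Pa.s

def density_contrast(rho_fluid):
    # driver of buoyancy-driven CO2 migration (gravity number N scales with this)
    return rho_fluid - rho_co2

def mobility_ratio(mu_displaced):
    # M = mobility of injected CO2 / mobility of displaced fluid,
    # with unit relative permeabilities assumed
    return mu_displaced / mu_co2

drho_oil = density_contrast(rho_oil)       # ~100 kg/m3, as in the abstract
drho_brine = density_contrast(rho_brine)   # ~350 kg/m3
```

    The smaller contrast in oil reservoirs means weaker buoyant override there, while the much higher oil viscosity gives a larger mobility ratio, i.e. more strongly impeded (and less stable) displacement.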

  11. Energy optimization through probabilistic annual forecast water release technique for major storage hydroelectric reservoir

    International Nuclear Information System (INIS)

    Abdul Bahari Othman; Mohd Zamri Yusoff

    2006-01-01

    One of the important decisions to be made by the management of a hydroelectric power plant associated with a major storage reservoir is to determine the best turbine water release decision for the next financial year. The water release decision enables estimation of the firm energy to be generated in the coming financial year. This is usually a simple and straightforward task provided that the amount of turbine water release is known. The more challenging task is to determine the best water release decision that resolves two conflicting operational objectives: minimizing the drop of turbine gross head and maximizing the upper reserve margin of the reservoir. Most techniques from the literature emphasize statistical simulation approaches. Markovian models, for example, are a class of statistical model that utilizes the past and present system states as a basis for predicting the future [1]. This paper illustrates that a rigorous solution criterion can be mathematically proven to resolve these two conflicting operational objectives, so that the best water release decision, maximizing potential energy for the prevailing natural inflow, is met. It is shown that the annual water release decision shall be made in such a manner that annual return inflows with a return frequency smaller than the critical return frequency (f_c) are not considered. This criterion enables the target turbine gross head to be set to a well-defined elevation. In other words, the upper storage margin of the reservoir shall be made available to capture the magnitude of future inflows with a return frequency greater than or equal to f_c. A case study is shown to demonstrate practical application of the derived mathematical formulas.
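
    One plausible reading of this criterion, planning on the inflow whose exceedance (return) frequency equals f_c and reserving the upper storage margin for rarer, larger inflows, can be sketched as follows. The empirical-quantile estimator, the synthetic inflow record and all numbers are assumptions, not the paper's derived formulas.

```python
import numpy as np

def release_decision(annual_inflows, f_c, capacity):
    """Pick the planning inflow as the empirical quantile whose exceedance
    frequency equals f_c; rarer (larger) inflows are excluded from the release
    plan and the corresponding upper storage margin is kept free to capture them."""
    q = np.quantile(annual_inflows, 1.0 - f_c)     # inflow exceeded with frequency f_c
    margin = max(0.0, np.max(annual_inflows) - q)  # reserve for rarer inflows
    target_storage = capacity - min(margin, capacity)
    return q, target_storage

rng = np.random.default_rng(7)
history = rng.gamma(shape=4.0, scale=250.0, size=50)   # synthetic annual inflows
plan_inflow, target = release_decision(history, f_c=0.1, capacity=2000.0)
```

    The target storage then fixes the target turbine gross head: the release plan commits only the water that arrives at least as often as f_c, leaving headroom for wetter years.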

  12. Intelligent monitoring system for real-time geologic CO2 storage, optimization and reservoir management

    Science.gov (United States)

    Dou, S.; Commer, M.; Ajo Franklin, J. B.; Freifeld, B. M.; Robertson, M.; Wood, T.; McDonald, S.

    2017-12-01

    Archer Daniels Midland Company's (ADM) world-scale agricultural processing and biofuels production complex located in Decatur, Illinois, is host to two industrial-scale carbon capture and storage projects. The first operation within the Illinois Basin-Decatur Project (IBDP) is a large-scale pilot that injected 1,000,000 metric tons of CO2 over a three year period (2011-2014) in order to validate the Illinois Basin's capacity to permanently store CO2. Injection for the second operation, the Illinois Industrial Carbon Capture and Storage Project (ICCS), started in April 2017, with the purpose of demonstrating the integration of carbon capture and storage (CCS) technology at an ethanol plant. The capacity to store over 1,000,000 metric tons of CO2 per year is anticipated. The latter project is accompanied by the development of an intelligent monitoring system (IMS) that will, among other tasks, perform hydrogeophysical joint analysis of pressure, temperature and seismic reflection data. Using a preliminary radial model assumption, we carry out synthetic joint inversion studies of these data combinations. We validate the history-matching process to be applied to field data once CO2-breakthrough at observation wells occurs. This process will aid the estimation of permeability and porosity for a reservoir model that best matches monitoring observations. The reservoir model will further be used for forecasting studies in order to evaluate different leakage scenarios and develop appropriate early-warning mechanisms. Both the inversion and forecasting studies aim at building an IMS that will use the seismic and pressure-temperature data feeds for providing continuous model calibration and reservoir status updates.

  13. Application of Advanced Reservoir Characterization, Simulation, and Production Optimization Strategies to Maximize Recovery in Slope and Basin Clastic Reservoirs, West Texas (Delaware Basin), Class III

    Energy Technology Data Exchange (ETDEWEB)

    Dutton, Shirley P.; Flanders, William A.

    2001-11-04

    The objective of this Class III project was to demonstrate that reservoir characterization and enhanced oil recovery (EOR) by CO2 flood can increase production from slope and basin clastic reservoirs in sandstones of the Delaware Mountain Group in the Delaware Basin of West Texas and New Mexico. Phase 1 of the project, reservoir characterization, focused on Geraldine Ford and East Ford fields, which are Delaware Mountain Group fields that produce from the upper Bell Canyon Formation (Ramsey sandstone). The demonstration phase of the project was a CO2 flood conducted in East Ford field, which is operated by Orla Petco, Inc., as the East Ford unit.

  14. Optimization of Spore Forming Bacteria Flooding for Enhanced Oil Recovery in North Sea Chalk Reservoir

    DEFF Research Database (Denmark)

    Halim, Amalia Yunita; Nielsen, Sidsel Marie; Eliasson Lantz, Anna

    2015-01-01

    Little has been done to study microbial enhanced oil recovery (MEOR) in chalk reservoirs. The present study focused on core flooding experiments to see microbial plugging and its effect on oil recovery. A pressure tapped core holder with pressure ports at 1.2 cm, 3.8 cm, and 6.3 cm from the inlet ... (1.2-3.8 cm) during bacteria injection. Further seawater flooding after a three-day shut-in period showed that permeability gradually increased in the first two sections of the core and started to decrease in the third section of the core (3.8-6.3 cm). Complete plugging was never observed in our experiments.

  15. Reservoir Engineering Optimization Strategies for Subsurface CO{sub 2} Storage

    Energy Technology Data Exchange (ETDEWEB)

    McIntire, Blayde; McPherson, Brian

    2013-09-30

    The purpose of this report is to outline a methodology for calculating the optimum number of injection wells for geologic CCS. The methodology is intended primarily for reservoir pressure management, and factors in cost as well. Efficiency may come in many forms depending on project goals; therefore, various results are presented simultaneously. The developed methodology is illustrated via application in a case study of the Rocky Mountain Carbon Capture and Storage (RMCCS) project, including a CCS candidate site near Craig, Colorado, USA. The forecasting method provided reasonable estimates of cost and injection volume when compared to simulated results.
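    The trade-off the report studies can be caricatured in a few lines: given a target injection rate and a hypothetical per-well injectivity limit (set, in practice, by the maximum allowable bottomhole pressure), find the smallest well count and its capital cost. All numbers below are invented for illustration and are not taken from the RMCCS study.

```python
# Illustrative sketch only: assumes each well's injection rate is capped by a
# pressure-management limit, and capital cost scales linearly with well count.

def optimal_well_count(target_rate_mt_yr, per_well_limit_mt_yr, capex_per_well_musd):
    """Smallest number of wells meeting the target CO2 injection rate, and its cost."""
    n = 1
    while n * per_well_limit_mt_yr < target_rate_mt_yr:
        n += 1
    return n, n * capex_per_well_musd

wells, cost = optimal_well_count(target_rate_mt_yr=1.0,    # 1 Mt CO2/yr
                                 per_well_limit_mt_yr=0.3,  # assumed pressure-limited rate
                                 capex_per_well_musd=8.0)   # assumed drilling cost
print(wells, cost)  # 4 wells, 32.0 M USD
```

    A fuller version would add operating costs and a reservoir-pressure penalty, so that cost and efficiency results could be compared across project goals, as the report does.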

  16. Sensitivity analysis and economic optimization studies of inverted five-spot gas cycling in gas condensate reservoir

    OpenAIRE

    Shams Bilal; Yao Jun; Zhang Kai; Zhang Lei

    2017-01-01

    Gas condensate reservoirs usually exhibit complex flow behavior because of the propagation of the pressure drop from the wellbore into the reservoir. When reservoir pressure drops below the dew point, two-phase flow of gas and condensate develops and large amounts of condensate accumulate in the reservoir. Usually, the saturation of condensate accumulation in volumetric gas condensate reservoirs is lower than the critical condensate saturation that causes trapping of large...

  17. Marine controlled source electromagnetic (mCSEM) detects hydrocarbon reservoirs in the Santos Basin - Brazil

    Energy Technology Data Exchange (ETDEWEB)

    Buonora, Marco Polo Pereira; Rodrigues, Luiz Felipe [PETROBRAS, Rio de Janeiro, RJ (Brazil); Zerilli, Andrea; Labruzzo, Tiziano [WesternGeco, Houston, TX (United States)

    2008-07-01

    In recent years marine controlled source electromagnetic (mCSEM) surveying has drawn the attention of an increasing number of operators because of its sensitivity to resistive structures, such as hydrocarbon reservoirs beneath the ocean floor, and successful case histories have been reported. The Santos basin mCSEM survey was performed as part of a technical co-operation project between PETROBRAS and Schlumberger to assess the integration of selected deep-reading electromagnetic technologies into the full cycle of oil field exploration and development. The survey design was based on an in-depth sensitivity study, built on known reservoir parameters, such as thickness, lateral extent, overburden and resistivities derived from seismic and well data. In this context, the mCSEM data were acquired to calibrate the technology over the area's known reservoirs and quantify the resistivity anomalies associated with those reservoirs, with the expectation that new prospective locations could be found. We show that the mCSEM response of the known reservoirs yields signatures that can be clearly imaged and accurately quantified, and that there are evident correlations between the mCSEM anomalies and the reservoirs. (author)

  18. Single Shooting and ESDIRK Methods for adjoint-based optimization of an oil reservoir

    DEFF Research Database (Denmark)

    Capolei, Andrea; Völcker, Carsten; Frydendall, Jan

    2012-01-01

    the injections and oil production such that flow is uniform in a given geological structure. Even in the case of conventional water flooding, feedback-based optimal control technologies may enable higher oil recovery than conventional operational strategies. The optimal control problems that must be solved...

  19. A long-term optimization method for reservoir management in a market-oriented hydro-dominated power system

    International Nuclear Information System (INIS)

    Sultovic, E.; Sarajcev, I.; Majstrovic, M.

    2001-01-01

    This paper presents an optimization-based method for the long-term scheduling of a hydrothermal power system. The proposed method maximizes the profit of hydroelectric production in the power system based on the monthly energy requirement of the system and unit commitment calculations. The method allows precise hydro-chain modeling with numerous restrictions, as well as computation for multiple-reservoir river systems with multiple-purpose operation. The method has been implemented in a computer program and tested on a power system very similar to the Croatian power system in the year 2000. Several test results are given. The presented method is applicable to long-term planning in a competitive electricity market because it allows unit commitment calculations with different electrical energy prices over the scheduling horizon. (authors)

  20. Application of Advanced Reservoir Characterization, Simulation, and Production Optimization Strategies to Maximize Recovery in Slope and Basin Clastic Reservoirs, West Texas (Delaware Basin)

    Energy Technology Data Exchange (ETDEWEB)

    Dutton, S.P.; Flanders, W.A.; Guzman, J.I.; Zirczy, H.

    1999-06-08

    The objective of this Class III project is to demonstrate that detailed reservoir characterization of slope and basin clastic reservoirs in sandstones of the Delaware Mountain Group in the Delaware Basin of West Texas and New Mexico is a cost-effective way to recover a higher percentage of the original oil in place through geologically based field development. This year the project focused on reservoir characterization of the East Ford unit, a representative Delaware Mountain Group field that produces from the upper Bell Canyon Formation (Ramsey Sandstone). The field, discovered in 1960, is operated by Orla Petco, Inc., as the East Ford unit; it contained an estimated 19.8 million barrels (MMbbl) of original oil in place. Petrophysical characterization of the East Ford unit was accomplished by integrating core and log data and quantifying petrophysical properties from wireline logs. Most methods of petrophysical analysis that had been developed during an earlier study of the Ford Geraldine unit were successfully transferred to the East Ford unit. The approach used to interpret water saturation from resistivity logs, however, had to be modified because in some East Ford wells the log-calculated water saturation was too high and inconsistent with observations made during actual production. Log-porosity to core-porosity transforms and core-porosity to core-permeability transforms were derived for the East Ford reservoir. The petrophysical data were used to map porosity, permeability, net pay, water saturation, mobile-oil saturation, and other reservoir properties.

  1. The Role of Energy Reservoirs in Distributed Computing: Manufacturing, Implementing, and Optimizing Energy Storage in Energy-Autonomous Sensor Nodes

    Science.gov (United States)

    Cowell, Martin Andrew

    The world already hosts more internet-connected devices than people, and that ratio is only increasing. These devices seamlessly integrate with people's lives to collect rich data and give immediate feedback about complex systems in business, health care, transportation, and security. As every aspect of global economies integrates distributed computing into its industrial systems, these systems benefit from rich datasets. Managing the power demands of these distributed computers will be paramount to ensure the continued operation of these networks, and is elegantly addressed by including local energy harvesting and storage on a per-node basis. By replacing non-rechargeable batteries with energy harvesting, wireless sensor nodes will increase their lifetimes by an order of magnitude. This work investigates the coupling of high-power energy storage with energy harvesting technologies to power wireless sensor nodes, with sections covering device manufacturing, system integration, and mathematical modeling. First we consider the energy storage mechanisms of supercapacitors and batteries, and identify favorable characteristics in both reservoir types. We then discuss experimental methods used to manufacture high-power supercapacitors in our labs. We go on to detail the integration of our fabricated devices with collaborating labs to create functional sensor node demonstrations. With the practical knowledge gained through in-lab manufacturing and system integration, we build mathematical models to aid in device and system design. First, we model the mechanism of energy storage in porous graphene supercapacitors to aid in component architecture optimization. We then model the operation of entire sensor nodes for the purpose of optimally sizing the energy harvesting and energy reservoir components. In consideration of deploying these sensor nodes in real-world environments, we model the operation of our energy harvesting and power management systems subject to

  2. Hybrid ANFIS with ant colony optimization algorithm for prediction of shear wave velocity from a carbonate reservoir in Iran

    Directory of Open Access Journals (Sweden)

    Hadi Fattahi

    2016-12-01

    Full Text Available Shear wave velocity (Vs) data are key information for petrophysical, geophysical and geomechanical studies. Although compressional wave velocity (Vp) measurements exist in almost all wells, shear wave velocity is not recorded for most older wells, due to the lack of technological tools. Furthermore, measurement of shear wave velocity is somewhat costly. This study proposes a novel methodology to overcome these problems by use of a hybrid adaptive neuro-fuzzy inference system (ANFIS) with ant colony optimization algorithm (ACO), based on fuzzy c-means clustering (FCM) and subtractive clustering (SCM). The ACO is combined with two ANFIS models to determine the optimal values of their user-defined parameters. The optimization by the ACO significantly improves the generalization ability of the ANFIS models. These models are used in this study to formulate conventional well log data into Vs in a quick, cheap, and accurate manner. A total of 3030 data points were used for model construction and 833 data points were employed for assessment of the ANFIS models. Finally, a comparison between the ANFIS models and six well-known empirical correlations demonstrated that the ANFIS models outperformed the other methods. This strategy was successfully applied in the Marun reservoir, Iran.
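    For readers unfamiliar with the kind of empirical correlation the ANFIS models were benchmarked against, the best-known example is Castagna's mudrock line, which predicts Vs from Vp for water-saturated clastic rocks. It is shown here purely as a baseline illustration; carbonate-specific correlations (as would apply to the Marun reservoir) use different coefficients.

```python
# Castagna's mudrock-line correlation: Vs = 0.8621 * Vp - 1.1724 (km/s),
# derived for clastic silicate rocks, not carbonates.

def vs_castagna(vp_km_s):
    """Estimate shear-wave velocity (km/s) from compressional velocity (km/s)."""
    return 0.8621 * vp_km_s - 1.1724

for vp in (3.0, 4.5, 6.0):
    print(f"Vp={vp} km/s -> Vs={vs_castagna(vp):.3f} km/s")
```

    A data-driven model such as ANFIS replaces the fixed coefficients with a mapping learned from several well logs at once, which is why it can outperform single-variable correlations.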

  3. [Optimized application of nested PCR method for detection of malaria].

    Science.gov (United States)

    Yao-Guang, Z; Li, J; Zhen-Yu, W; Li, C

    2017-04-28

    Objective To optimize the application of the nested PCR method for the detection of malaria according to working practice, so as to improve the efficiency of malaria detection. Methods A PCR premix solution, internal primers for further amplification, and newly designed primers targeting the two Plasmodium ovale subspecies were employed to optimize the reaction system, reaction conditions and specific primers of P. ovale on the basis of routine nested PCR. Then the specificity and the sensitivity of the optimized method were analyzed. Positive blood samples and clinical malaria samples were tested by the routine nested PCR and the optimized method simultaneously, and the detection results were compared and analyzed. Results The optimized method showed good specificity, and its sensitivity reached the pg to fg level. When the two methods were used to test the same positive malarial blood samples simultaneously, the PCR products of the two methods showed no significant difference, but non-specific amplification was obviously reduced, the detection rate of the P. ovale subspecies improved, and the overall specificity also increased with the optimized method. The detection results for 111 malarial blood samples showed that the sensitivity and specificity of the routine nested PCR were 94.57% and 86.96%, respectively, and those of the optimized method were both 93.48%; there was no statistically significant difference between the two methods in sensitivity ( P > 0.05), but there was a statistically significant difference in specificity ( P < 0.05). Conclusion The optimized nested PCR can improve the specificity without reducing the sensitivity relative to the routine nested PCR; it can also save cost and increase the efficiency of malaria detection, as it involves fewer experimental steps.

  4. Molecular detection of Leishmania infection due to Leishmania major and Leishmania turanica in the vectors and reservoir host in Iran.

    Science.gov (United States)

    Rassi, Yavar; Oshaghi, Mohammad Ali; Azani, Sadegh Mohammadi; Abaie, Mohammad Reza; Rafizadeh, Sina; Mohebai, Mehdi; Mohtarami, Fatemeh; Zeinali, Mohammad kazem

    2011-02-01

    An epidemiological study was carried out on the vectors and reservoirs of cutaneous leishmaniasis in rural areas of Damghan district, Semnan province, central Iran, during 2008-2009. In total, 6110 sand flies were collected using sticky papers and were subjected to molecular methods for detection of Leishmania parasites. Phlebotomus papatasi Scopoli was the common species in outdoor and indoor resting places. The polymerase chain reaction technique showed that 24 out of 218 P. papatasi (11%) and 4 out of 62 Phlebotomus caucasicus Marzinovskyi (6.5%) were positive for the parasite Leishmania major Yakimoff and Schokhor. Twenty-one rodent reservoir hosts captured using Sherman traps were identified as Rhombomys opimus Lichtenstein (95%) and Meriones libycus Lichtenstein (5%). Microscopic investigation of blood smears of the animals for amastigote parasites revealed that 8 R. opimus (40%) were infected. L. major infection in these animals was then confirmed by polymerase chain reaction against the internal transcribed spacer ribosomal DNA (rDNA) loci of the parasite, followed by restriction fragment length polymorphism analysis. Further, sequence analysis of 297 bp of the ITS1-rDNA loci revealed the presence of L. major and Leishmania turanica in P. papatasi, and L. major in R. opimus. This is the first molecular report of L. major infection in both vectors (P. papatasi and P. caucasicus) and the reservoir host (R. opimus) in this region. The results indicated that P. papatasi was the primary vector of the disease, circulating the parasite between humans and reservoirs, and that P. caucasicus could be considered a secondary vector. Further, our study showed that R. opimus is the most important reservoir host for maintenance of the parasite source in the area.

  5. Application of advanced reservoir characterization, simulation, and production optimization strategies to maximize recovery in slope and basin clastic reservoirs, West Texas (Delaware Basin), Class III

    Energy Technology Data Exchange (ETDEWEB)

    Dutton, Shirley P.; Flanders, William A.; Zirczy, Helena H.

    2000-05-24

    The objective of this Class 3 project was to demonstrate that detailed reservoir characterization of slope and basin clastic reservoirs in sandstones of the Delaware Mountain Group in the Delaware Basin of West Texas and New Mexico is a cost-effective way to recover a higher percentage of the original oil in place through strategic placement of infill wells and geologically based field development. Phase 1 of the project, reservoir characterization, was completed this year, and Phase 2 began. The project is focused on East Ford field, a representative Delaware Mountain Group field that produces from the upper Bell Canyon Formation (Ramsey sandstone). The field, discovered in 1960, is operated by Orla Petco, Inc., as the East Ford unit. A CO{sub 2} flood is being conducted in the unit, and this flood is the Phase 2 demonstration for the project.

  6. Application of Advanced Reservoir Characterization, Simulation, and Production Optimization Strategies to Maximize Recovery in Slope and Basin Clastic Reservoirs, West Texas (Delaware Basin)

    International Nuclear Information System (INIS)

    Dutton, Shirley

    1999-01-01

    The objective of this Class 3 project was to demonstrate that detailed reservoir characterization of slope and basin clastic reservoirs in sandstones of the Delaware Mountain Group in the Delaware Basin of West Texas and New Mexico is a cost-effective way to recover a higher percentage of the original oil in place through strategic placement of infill wells and geologically based field development. Project objectives are divided into two main phases. The original objectives of the reservoir-characterization phase of the project were (1) to provide a detailed understanding of the architecture and heterogeneity of two representative fields of the Delaware Mountain Group, Geraldine Ford and Ford West, which produce from the Bell Canyon and Cherry Canyon Formations, respectively, (2) to choose a demonstration area in one of the fields, and (3) to simulate a CO2 flood in the demonstration area

  7. Optimal detection and control strategies for invasive species management

    Science.gov (United States)

    Shefali V. Mehta; Robert G. Haight; Frances R. Homans; Stephen Polasky; Robert C. Venette

    2007-01-01

    The increasing economic and environmental losses caused by non-native invasive species amplify the value of identifying and implementing optimal management options to prevent, detect, and control invasive species. Previous literature has focused largely on preventing introductions of invasive species and post-detection control activities; few have addressed the role of...

  8. A High-Precision Time-Frequency Entropy Based on Synchrosqueezing Generalized S-Transform Applied in Reservoir Detection

    Directory of Open Access Journals (Sweden)

    Hui Chen

    2018-06-01

    Full Text Available Seismic signals are abnormally attenuated at high frequencies when they travel across reservoirs; exploiting this fact, a new method, named high-precision time-frequency entropy based on the synchrosqueezing generalized S-transform, is proposed for hydrocarbon reservoir detection in this paper. First, the proposed method obtains time-frequency spectra by the synchrosqueezing generalized S-transform (SSGST), which are concentrated around the real instantaneous frequency of the signals. Then, considering the characteristics and effects of noise, a frequency constraint condition is imposed when calculating the entropy from the time-frequency spectra. A synthetic example verifies that the entropy is abnormally high where seismic signals are abnormally attenuated. Compared with GST time-frequency entropy and the original SSGST time-frequency entropy on field data, the proposed method shows higher precision. Moreover, the proposed method can not only accurately detect and locate hydrocarbon reservoirs, but also effectively suppress the impact of random noise.
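    The entropy-with-frequency-constraint idea can be demonstrated in isolation. The sketch below uses an ordinary windowed FFT spectrogram as a stand-in for the SSGST, which is not implemented here; only the band-limited Shannon-entropy step follows the paper's recipe.

```python
# Band-limited time-frequency entropy: a concentrated spectrum (one clean tone)
# gives low entropy, while spread spectral energy gives high entropy.

import numpy as np

def tf_entropy(signal, fs, fmin, fmax, win=64, hop=32):
    """Shannon entropy of the band-limited spectrum in each sliding window."""
    freqs = np.fft.rfftfreq(win, d=1.0 / fs)
    band = (freqs >= fmin) & (freqs <= fmax)   # frequency constraint
    taper = np.hanning(win)
    out = []
    for start in range(0, len(signal) - win + 1, hop):
        power = np.abs(np.fft.rfft(signal[start:start + win] * taper))[band] ** 2
        p = power / (power.sum() + 1e-12)       # normalize to a distribution
        out.append(-np.sum(p * np.log(p + 1e-12)))
    return np.array(out)

fs = 500.0
t = np.arange(0.0, 2.0, 1.0 / fs)
clean = np.sin(2 * np.pi * 60.0 * t)            # a clean narrowband "reflection"
noisy = clean + np.random.default_rng(7).normal(0.0, 1.0, t.size)
print(tf_entropy(clean, fs, 10, 120).mean(), tf_entropy(noisy, fs, 10, 120).mean())
```

    In the paper's setting, the spectral spreading comes from abnormal attenuation rather than added noise, and the SSGST sharpens the spectra before the entropy is taken.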

  9. Hydrocarbon production and reservoir management: recent advances in closed-loop optimization technology

    NARCIS (Netherlands)

    Abbink, O.A.; Hanea, R.G.; Nennie, E.D.; Peters, R.C.A.M.

    2009-01-01

    Petroleum production is a relatively inefficient process. Oil production is generally less than 60 % effective on a macro scale and less than 60 % effective on a micro scale, which commonly results in an actual oil recovery of less than 35 %. Optimization of the production process will,

  10. Integrating a Typhoon Event Database with an Optimal Flood Operation Model on the Real-Time Flood Control of the Tseng-Wen Reservoir

    Science.gov (United States)

    Chen, Y. W.; Chang, L. C.

    2012-04-01

    Typhoons, which normally bring large amounts of precipitation, are the primary natural hazard in Taiwan during the flooding season. Because the plentiful rainfall brought by typhoons is normally stored for use in the following drought period, determining release strategies for reservoir flood operation, which must simultaneously consider reservoir safety, flood damage in the plain area, and the water resource stored in the reservoir after the typhoon, becomes important. This study proposes a two-step process. First, an optimal flood operation model (OFOM) is developed for the planning of flood control and applied to the Tseng-wun reservoir and the downstream plain related to the reservoir. Second, integrating a typhoon event database with the OFOM gives the planning model the ability to deal with real-time flood control; this extension is named the real-time flood operation model (RTFOM). Three conditions are considered in the proposed models, OFOM and RTFOM: the safety of the reservoir itself, the reservoir storage after typhoons, and the impact of flooding in the plain area. The flood operation guideline announced by the government is also considered. These conditions and the guideline are formulated as an optimization problem, which is solved by a genetic algorithm (GA) in this study. Furthermore, a distributed runoff model, the kinematic-wave geomorphic instantaneous unit hydrograph (KW-GIUH), and a river flow simulation model, HEC-RAS, are used to simulate the river water level of the Tseng-wun basin in the plain area, and the simulated level serves as an index of the impact of flooding. Because the simulated levels must be recalculated iteratively in the optimization model, applying a recursive artificial neural network (recursive ANN) instead of the HEC-RAS model can significantly reduce the computational burden of
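    The GA-based formulation described above can be sketched with a toy reservoir: penalties for exceeding reservoir capacity and downstream channel capacity, plus a term rewarding post-typhoon storage. The hydrograph, capacities, and penalty weights below are invented, and a simple mutation-only evolutionary loop stands in for the full GA with hydraulic simulation.

```python
# Toy evolutionary optimization of a reservoir release schedule during a typhoon.

import numpy as np

rng = np.random.default_rng(0)
inflow = np.array([50, 120, 300, 450, 260, 90.0])   # synthetic typhoon hydrograph
S_MAX, S0, R_MAX, CHANNEL = 800.0, 400.0, 350.0, 250.0

def cost(releases):
    s, c = S0, 0.0
    for q, r in zip(inflow, releases):
        s = s + q - r
        c += 10 * max(0.0, s - S_MAX)     # reservoir-safety penalty (overtopping)
        c += 5 * max(0.0, r - CHANNEL)    # downstream-flooding penalty
        s = min(s, S_MAX)                 # excess water spills
    c += max(0.0, S0 - s)                 # keep post-typhoon storage high
    return c

# Mutation-only evolutionary loop with elitism (a stand-in for a full GA).
pop = rng.uniform(0, R_MAX, size=(40, inflow.size))
for _ in range(200):
    fit = np.array([cost(ind) for ind in pop])
    parents = pop[np.argsort(fit)[:10]]
    children = parents[rng.integers(0, 10, 30)] + rng.normal(0, 15, (30, inflow.size))
    pop = np.vstack([parents, np.clip(children, 0, R_MAX)])

best = pop[np.argmin([cost(ind) for ind in pop])]
print(cost(best))
```

    The real models replace the one-line storage balance with KW-GIUH runoff and HEC-RAS (or recursive-ANN) water levels inside the objective, which is exactly why the fitness evaluation becomes the computational bottleneck.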

  11. Application of Advanced Reservoir Characterization, Simulation, and Production Optimization Strategies to Maximize Recovery in Slope and Basin Clastic Reservoirs, West Texas (Delaware Basin)

    Energy Technology Data Exchange (ETDEWEB)

    Andrew G. Cole; George B. Asquith; Jose I. Guzman; Mark D. Barton; Mohammad A. Malik; Shirley P. Dutton; Sigrid J. Clift

    1998-04-01

    The objective of this Class III project is to demonstrate that detailed reservoir characterization of clastic reservoirs in basinal sandstones of the Delaware Mountain Group in the Delaware Basin of West Texas and New Mexico is a cost-effective way to recover more of the original oil in place by strategic infill-well placement and geologically based enhanced oil recovery. The study focused on the Ford Geraldine unit, which produces from the upper Bell Canyon Formation (Ramsey sandstone). Reservoirs in this and other Delaware Mountain Group fields have low producibility (average recovery <14 percent of the original oil in place) because of a high degree of vertical and lateral heterogeneity caused by depositional processes and post-depositional diagenetic modification. Outcrop analogs were studied to better interpret the depositional processes that formed the reservoirs at the Ford Geraldine unit and to determine the dimensions of reservoir sandstone bodies. Facies relationships and bedding architecture within a single genetic unit exposed in outcrop in Culberson County, Texas, suggest that the sandstones were deposited in a system of channels and levees with attached lobes that initially prograded basinward, aggraded, and then turned around and stepped back toward the shelf. Channel sandstones are 10 to 60 ft thick and 300 to 3,000 ft wide. The flanking levees have a wedge-shaped geometry and are composed of interbedded sandstone and siltstone; thickness varies from 3 to 20 ft and length from several hundred to several thousands of feet. The lobe sandstones are broad lens-shaped bodies; thicknesses range up to 30 ft with aspect ratios (width/thickness) of 100 to 10,000. Lobe sandstones may be interstratified with laminated siltstones.

  12. Optimal and centralized reservoir management for drought and flood protection via Stochastic Dual Dynamic Programming on the Upper Seine-Aube River system

    Science.gov (United States)

    Chiavico, Mattia; Raso, Luciano; Dorchies, David; Malaterre, Pierre-Olivier

    2015-04-01

    The Seine river region is an extremely important logistic and economic junction for France and Europe. The hydraulic protection of most of the region relies on four controlled reservoirs, managed by EPTB Seine-Grands Lacs. Presently, reservoir operations are not centrally coordinated, and release rules are based on empirical filling curves. In this study, we analyze how a centralized release policy can face flood and drought risks while optimizing water system efficiency. The optimal and centralized decision problem is solved by the Stochastic Dual Dynamic Programming (SDDP) method, minimizing an operational indicator for each planning objective. SDDP allows us to include in the system: 1) the hydrological discharge, specifically a stochastic semi-distributed auto-regressive model, 2) the hydraulic transfer model, represented by a linear lag-and-route model, and 3) reservoirs and diversions. The novelty of this study lies in the combination of reservoir and hydraulic models in SDDP for flood and drought protection problems. The case study covers the Seine basin down to the confluence with the Aube River: this system includes two reservoirs, the city of Troyes, and the nuclear power plant of Nogent-sur-Seine. The conflict among the interests of flood protection, drought protection, water use and ecology leads us to analyze the environmental system from a multi-objective perspective.

  13. The optimal number of surveys when detectability varies.

    Directory of Open Access Journals (Sweden)

    Alana L Moore

    Full Text Available The survey of plant and animal populations is central to undertaking field ecology. However, detection is imperfect, so the absence of a species cannot be determined with certainty. Methods developed to account for imperfect detectability during surveys do not yet account for stochastic variation in detectability over time or space. When each survey entails a fixed cost that is not spent searching (e.g., time required to travel to the site), stochastic detection rates result in a trade-off between the number of surveys and the length of each survey when surveying a single site. We present a model that addresses this trade-off and use it to determine the number of surveys that: (1) maximizes the expected probability of detection over the entire survey period; and (2) is most likely to achieve a minimally acceptable probability of detection. We illustrate the applicability of our approach using three practical examples (minimum survey effort protocols, number of frog surveys per season, and number of quadrats per site to detect a plant species) and test our model's predictions using data from experimental plant surveys. We find that when maximizing the expected probability of detection, the optimal survey design is most sensitive to the coefficient of variation in the rate of detection and the ratio of the search budget to the travel cost. When maximizing the likelihood of achieving a particular probability of detection, the optimal survey design is most sensitive to the required probability of detection, the expected number of detections if the budget were spent only on searching, and the expected number of detections that are missed due to travel costs. We find that accounting for stochasticity in detection rates is likely to be particularly important for designing surveys when detection rates are low. Our model provides a framework to do this.
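    One way to make the trade-off concrete (a toy formalization, not the paper's exact model): a fixed budget B is split across n surveys, each survey loses a fixed travel cost c before any searching happens, and the detection rate is gamma-distributed (shape k, scale theta) independently per survey, so a single survey of search length t misses the species with probability (1 + theta*t)^(-k).

```python
# Toy survey-design model: more surveys give more independent draws of the
# detection rate, but each one pays the travel cost out of the search budget.

def detection_prob(n, budget, travel, k, theta):
    t = budget / n - travel                  # search time per survey
    if t <= 0:
        return 0.0
    return 1 - (1 + theta * t) ** (-k * n)   # 1 - P(all n surveys miss)

budget, travel, k, theta = 10.0, 1.0, 0.5, 1.0   # assumed illustrative values
best_n = max(range(1, 10), key=lambda n: detection_prob(n, budget, travel, k, theta))
print(best_n, detection_prob(best_n, budget, travel, k, theta))  # 4 surveys, P = 0.84
```

    With these numbers, a single long survey yields P = 0.68, while four shorter surveys yield P = 0.84: because the rate varies stochastically, splitting the budget hedges against an unlucky low-rate survey, up to the point where travel costs eat the search time.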

  14. Optimal Noise Enhanced Signal Detection in a Unified Framework

    Directory of Open Access Journals (Sweden)

    Ting Yang

    2016-06-01

    Full Text Available In this paper, a new framework for variable detectors is formulated to solve different optimal noise-enhanced signal detection problems, in which six disjoint sets of detector and discrete vector pairs are defined according to two inequality constraints on the detection and false-alarm probabilities. Theorems and algorithms constructed on the new framework are then presented to search for the optimal noise-enhanced solutions that maximize the relative improvements of the detection and false-alarm probabilities, respectively. Further, the optimal noise-enhanced solution for the maximum overall improvement is obtained within the new framework, and the relationship among the three maxima is presented. In addition, sufficient conditions for improvability or non-improvability under the two constraints are given. Finally, numerous examples are presented to illustrate the theoretical results, and the proofs of the main theorems are given in the Appendix.

  15. Optimal Power Constrained Distributed Detection over a Noisy Multiaccess Channel

    Directory of Open Access Journals (Sweden)

    Zhiwen Hu

    2015-01-01

    Full Text Available The problem of optimal power-constrained distributed detection over a noisy multiaccess channel (MAC) is addressed. Under local power constraints, we define the transformation function for each sensor that realizes the mapping from the local decision to the transmitted waveform. Deflection coefficient maximization (DCM) is used to optimize the performance of the power-constrained fusion system. Using optimality conditions, we derive the closed-form solution to the considered problem. Monte Carlo simulations are carried out to evaluate the performance of the proposed method. Simulation results show that the proposed method can significantly improve the detection performance of the fusion system at low signal-to-noise ratio (SNR). We also show that the proposed method has robust detection performance over a broad SNR region.
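    The deflection coefficient the paper maximizes is straightforward to evaluate for a toy fusion system. The sketch below is not the paper's closed-form solution; it only computes the criterion itself for an amplify-and-forward MAC with hypothetical sensor operating points, and checks that using the full power budget increases the deflection.

```python
# Deflection coefficient of a fused statistic y = sum_i a_i * u_i + channel noise,
# where u_i in {+1, -1} is sensor i's local decision.

import numpy as np

def deflection(a, p1, p0, noise_var):
    """a: per-sensor amplitudes; p1/p0: P(decide +1 | H1/H0) at each sensor.
    E[u|H] = 2p - 1 and Var[u|H0] = 1 - (2p0 - 1)**2 for a +/-1 decision."""
    a = np.asarray(a, float)
    mean_shift = np.sum(a * ((2 * p1 - 1) - (2 * p0 - 1)))
    var0 = np.sum(a**2 * (1 - (2 * p0 - 1) ** 2)) + noise_var
    return mean_shift**2 / var0   # (E[y|H1] - E[y|H0])^2 / Var[y|H0]

a_full = [1.0] * 5    # all five sensors at the assumed power limit
a_half = [0.5] * 5
print(deflection(a_full, 0.9, 0.1, 1.0), deflection(a_half, 0.9, 0.1, 1.0))
```

    The deflection is a proxy for detection performance that is tractable to optimize under power constraints, which is why DCM is used in place of the exact error probabilities.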

  16. SEROPOSITIVITY TO LEPTOSPIROSIS IN DOMESTIC RESERVOIRS AND DETECTION OF Leptospira spp. IN WATER SOURCES, IN FARMS OF YUCATAN, MEXICO.

    Directory of Open Access Journals (Sweden)

    Maria Fidelia Cardenas-Marrufo

    2010-11-01

    Full Text Available Leptospirosis is a zoonotic infectious disease with a worldwide distribution. The WHO classifies this disease as reemergent, and it represents a risk to human health with economic repercussions for animal reproduction. Leptospirosis occurs with higher frequency in countries with tropical climates. A cross-sectional study was conducted to determine the frequency of infection with L. interrogans in 476 reservoir animals (212 bovines, 203 pigs, and 61 dogs) in 34 animal production units. The positivity frequency in the reservoirs was 30.5%, and 31 of the 34 animal units had positive reservoirs. The most frequent serovars were tarassovi (53.6%) and hardjo (31.6%) in cattle; bratislava (66%) and icterohaemorragiae (18.7%) in pigs; and canicola (79.8%) and icterohaemorragiae (9.8%) in dogs. 68 pools of water samples from water tanks were analyzed for L. interrogans by DNA amplification of a 16S rRNA fragment using Lepat1-Lepat2 primers. Preventive measures such as vaccination of domestic animals are recommended to reduce the risk of transmission to the human population.

  17. Application of Advanced Reservoir Characterization, Simulation, and Production Optimization Strategies to Maximize Recovery in Slope and Basin Clastic Reservoirs, West Texas (Delaware Basin), Class III

    Energy Technology Data Exchange (ETDEWEB)

    Dutton, Shirley P.; Flanders, William A.; Mendez, Daniel L.

    2001-05-08

    The objective of this Class 3 project was to demonstrate that detailed reservoir characterization of slope and basin clastic reservoirs in sandstones of the Delaware Mountain Group in the Delaware Basin of West Texas and New Mexico is a cost-effective way to recover oil more economically through geologically based field development. This project focused on East Ford field, a Delaware Mountain Group field that produced from the upper Bell Canyon Formation (Ramsey sandstone). The field, discovered in 1960, is operated by Orla Petco, Inc., as the East Ford unit. A CO2 flood was being conducted in the unit, and this flood is the Phase 2 demonstration for the project.

  18. Edge detection in digital images using Ant Colony Optimization

    Directory of Open Access Journals (Sweden)

    Marjan Kuchaki Rafsanjani

    2015-11-01

    Full Text Available Ant Colony Optimization (ACO is an optimization algorithm inspired by the behavior of real ant colonies to approximate the solutions of difficult optimization problems. In this paper, ACO is introduced to tackle the image edge detection problem. The proposed approach is based on the distribution of ants on an image; ants try to find possible edges by using a state transition function. Experimental results show that the proposed method compared to standard edge detectors is less sensitive to Gaussian noise and gives finer details and thinner edges when compared to earlier ant-based approaches.
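The ant-based mechanism described above can be sketched in a few lines. In the toy example below (the `aco_edges` helper and all parameter values are illustrative, not from the paper), ants random-walk over a synthetic step-edge image, with moves biased by pheromone and by a gradient-magnitude heuristic standing in for the state transition function:

```python
import numpy as np

def aco_edges(img, n_ants=50, n_steps=200, evap=0.05, alpha=1.0, beta=2.0, seed=0):
    """Toy ACO edge detector: ants random-walk over the image and deposit
    pheromone where the local gradient magnitude (the heuristic) is strong."""
    rng = np.random.default_rng(seed)
    h, w = img.shape
    gy, gx = np.gradient(img.astype(float))
    eta = np.hypot(gx, gy)
    eta /= eta.max() + 1e-12                     # heuristic in [0, 1]
    tau = np.full((h, w), 1e-3)                  # initial pheromone field
    pos = np.column_stack([rng.integers(0, h, n_ants),
                           rng.integers(0, w, n_ants)])
    moves = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
             (0, 1), (1, -1), (1, 0), (1, 1)]
    for _ in range(n_steps):
        for k in range(n_ants):
            i, j = pos[k]
            cand = [(i + di, j + dj) for di, dj in moves
                    if 0 <= i + di < h and 0 <= j + dj < w]
            # State transition: favor high pheromone and strong gradients.
            p = np.array([tau[a, b] ** alpha * (eta[a, b] + 1e-6) ** beta
                          for a, b in cand])
            a, b = cand[rng.choice(len(cand), p=p / p.sum())]
            tau[a, b] += eta[a, b]               # deposit only on strong edges
            pos[k] = (a, b)
        tau *= 1.0 - evap                        # evaporation
    return tau

img = np.zeros((16, 16))
img[:, 8:] = 1.0                                 # vertical step edge
tau = aco_edges(img)                             # pheromone concentrates near columns 7-8
```

A real detector would threshold the pheromone map; the noise robustness reported in the abstract comes from the heuristic averaging many ant visits rather than a single local operator.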

  19. Game theory and fuzzy programming approaches for bi-objective optimization of reservoir watershed management: a case study in Namazgah reservoir.

    Science.gov (United States)

    Üçler, N; Engin, G Onkal; Köçken, H G; Öncel, M S

    2015-05-01

    In this study, game theory and fuzzy programming approaches were used to balance economic and environmental impacts in the Namazgah reservoir, Turkey. The main goals identified were to maximize economic benefits of land use and to protect water quality of reservoir and land resources. Total phosphorous load (kg ha(-1) year(-1)) and economic income (USD ha(-1) year(-1)) from land use were determined as environmental value and economic value, respectively. The surface area of existing land use types, which are grouped under 10 headings according to the investigations on the watershed area, and the constraint values for the watershed were calculated using aerial photos, master plans, and basin slope map. The results of fuzzy programming approach were found to be very close to the results of the game theory model. It was concluded that the amount of fertilizer used in the current situation presents a danger to the reservoir and, therefore, unnecessary fertilizer use should be prevented. Additionally, nuts, fruit, and vegetable cultivation, instead of wheat and corn cultivation, was found to be more suitable due to their high economic income and low total phosphorus (TP) load. Apart from agricultural activities, livestock farming should also be considered in the area as a second source of income. It is believed that the results obtained in this study will help decision makers to identify possible problems of the watershed.

  20. Optimization of the Clarification System for Raw Water from the Pakra Reservoir Lake

    Directory of Open Access Journals (Sweden)

    Zečević, N.

    2011-10-01

    Full Text Available The first step in processing raw water from the Pakra lake for use in fertilizer production at Petrokemija is oxidation of total organic carbon matter with gaseous chlorine, Cl2. The water is then clarified and filtered with the help of a clarification reactor and sand filters. The construction of the clarification reactor and process sand filters enables only the removal of suspended matter from the raw water, without affecting its overall hardness. Process control of the clarification reactor and removal of the suspended matter from the raw water is achieved by adding water solutions of corresponding mass concentrations of aluminum sulphate, Al2(SO4)3 · 18 H2O, and organic polyelectrolyte. The effectiveness of flocculation is assessed by laboratory determination of the m-alkalinity difference between the inlet and outlet of raw water from the clarification reactor. For the most effective clarification of raw water, the optimal empirical value of the m-alkalinity difference is 0.65 mmol L-1 in the raw-water pH range from 7.0 to 8.0. Prior to processing clarified water by ionic decarbonatisation and demineralisation, a water solution of sodium bisulfite, NaHSO3, of corresponding mass concentration is added to protect the ion-exchange resin from excess free Cl2. An improved system is proposed for continuous measurement of mass concentrations of free Cl2 in raw and clarified water, and of the pH difference between the inlet and outlet of the clarification reactor. The proposed system can achieve optimal dosage of gaseous Cl2 in the raw water, improving the clarification process in the reactor, as well as optimal dosage of the NaHSO3 water solution. It is shown that an average pH difference of 0.65 to 0.75 between the inlet and outlet of the clarification reactor, in the raw-water pH range from 7.0 to 8.0, is an equally effective replacement for the laboratory determination of m-alkalinity. Also shown is the connection between dosage mass of the

  1. Numerical investigation and optimization of multiple fractures in tight gas reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Hou, M.Z. [Technische Univ. Clausthal, Clausthal-Zellerfeld (Germany). ITE; Energie-Forschungszentrum Niedersachsen, Goslar (Germany); Zhou, L. [Energie-Forschungszentrum Niedersachsen, Goslar (Germany)

    2013-08-01

    The main objective of the project DGMK-680 in phase 2 was to investigate the influence of fractures on each other in a multi-fracture system, including their spacing optimization, by using the numerical program FLAC3D with our own developments, which treats all fractures in one 3D geometric model under a 3D stress state with full hydro-mechanical coupling. The case study was conducted on a horizontal wellbore at location A, which was stimulated hydraulically with a total of eight transverse fractures in summer 2009. Transverse multiple fractures were simulated using the modified continuum method. In the simulation all fractures were generated in one single model comprising 22 different rock layers. Each layer was assumed to be homogeneous with regard to its rock and hydromechanical parameters. Thus the influence of the individual fractures on each other can be investigated. The simulation procedure applied, which is a consecutive execution of a hydraulic and a mechanical computation, is the same for all fractures. The only differences are the primary in-situ stresses, the initial pore pressure, and the injection parameters (location, rate, volume, duration), which lead to different patterns of fracture propagation. There are still some common points, such as irregular patterns of the fracture front, which reflect the heterogeneity of the model. All fractures (1 to 8) have average half-lengths between 70 m and 115 m, heights between 93 m and 114 m, and average widths between 18 mm and 31 mm. The percentage difference in fracture height for individual fractures is clearly smaller than that of the fracture half-lengths, because the fracture barriers at bottom and top limit fracture propagation in the z-direction. In comparison with the analytical simulator (FracPro), most results match well. Simulation of multiple fractures at location A with the newly developed algorithms shows that individual transverse multiple fractures at distances between 100

  2. Multi-Objective Reservoir Optimization Balancing Energy Generation and Firm Power

    Directory of Open Access Journals (Sweden)

    Fang-Fang Li

    2015-07-01

    Full Text Available Maximizing annual power generation and improving firm power are important but competing goals for hydropower stations. The firm power output is decisive for the installed capacity in design, and represents the reliability of power generation once the plant is in operation. To improve the firm power, the whole generation process needs to be as stable as possible, while maximizing power generation requires a rapid rise of the water level at the beginning of the storage period. Taking the minimal power output as the firm power, both the total amount and the reliability of hydropower generation are considered simultaneously in this study. A multi-objective model to improve the comprehensive benefits of hydropower stations is established and optimized by the Non-dominated Sorting Genetic Algorithm-II (NSGA-II). The Three Gorges Cascade Hydropower System (TGCHS) is taken as the study case, and the Pareto fronts in different search spaces are obtained. The results not only prove the effectiveness of the proposed method, but also provide operational references for the TGCHS, indicating that there is room for improvement in both the annual power generation and the firm power.
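The non-dominated sorting at the heart of NSGA-II can be sketched on a toy two-objective release-schedule problem (total generation vs. firm power, both synthetic; the population, objectives and sizes below are invented). NSGA-II proper repeats this ranking with crowding distance and genetic operators over many generations:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy problem: 12 monthly release fractions in [0, 1] per candidate schedule.
# f1: annual generation proxy (sum of releases); f2: firm power proxy (minimum).
pop = rng.random((200, 12))
f1 = pop.sum(axis=1)
f2 = pop.min(axis=1)

def pareto_front(f1, f2):
    """Indices of non-dominated candidates when both objectives are maximized."""
    front = []
    for i in range(len(f1)):
        dominated = np.any((f1 >= f1[i]) & (f2 >= f2[i]) &
                           ((f1 > f1[i]) | (f2 > f2[i])))
        if not dominated:
            front.append(i)
    return np.array(front)

front = pareto_front(f1, f2)   # trade-off set between generation and firm power
```

Each member of `front` is a schedule for which neither objective can be improved without degrading the other, which is exactly the kind of Pareto front the abstract reports for the TGCHS.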

  3. Signal processing for solar array monitoring, fault detection, and optimization

    CERN Document Server

    Braun, Henry; Spanias, Andreas

    2012-01-01

    Although the solar energy industry has experienced rapid growth recently, high-level management of photovoltaic (PV) arrays has remained an open problem. As sensing and monitoring technology continues to improve, there is an opportunity to deploy sensors in PV arrays in order to improve their management. In this book, we examine the potential role of sensing and monitoring technology in a PV context, focusing on the areas of fault detection, topology optimization, and performance evaluation/data visualization. First, several types of commonly occurring PV array faults are considered and detection algorithms are described. Next, the potential for dynamic optimization of an array's topology is discussed, with a focus on mitigation of fault conditions and optimization of power output under non-fault conditions. Finally, monitoring system design considerations such as type and accuracy of measurements, sampling rate, and communication protocols are considered. It is our hope that the benefits of monitoring presen...

  4. Daily Reservoir Runoff Forecasting Method Using Artificial Neural Network Based on Quantum-behaved Particle Swarm Optimization

    Directory of Open Access Journals (Sweden)

    Chun-tian Cheng

    2015-07-01

    Full Text Available Accurate daily runoff forecasting is of great significance for the operational control of hydropower stations and power grids. Conventional methods, including rainfall-runoff models and statistical techniques, usually rely on a number of assumptions, leading to some deviation from the exact results. The artificial neural network (ANN) has the advantages of high fault tolerance and strong nonlinear mapping and learning ability, which makes it an effective method for daily runoff forecasting. However, its training has drawbacks, such as long computation times, slow learning speed, and a tendency to fall into local optima, which cannot be ignored in real-world applications. To overcome these disadvantages, an artificial neural network model based on quantum-behaved particle swarm optimization (QPSO), ANN-QPSO for short, is presented for daily runoff forecasting in this paper: QPSO is employed to select the synaptic weights and thresholds of the ANN, while the ANN is used for the prediction. The proposed model combines the advantages of both QPSO and ANN to enhance the generalization performance of the forecasting model. The methodology is assessed using the daily runoff data of the Hongjiadu reservoir in southeast Guizhou province of China from 2006 to 2014. The results demonstrate that the proposed approach achieves much better forecast accuracy than the basic ANN model, and that the QPSO algorithm is an effective alternative training technique for ANN parameter selection.
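The QPSO update used to search the ANN's weight space can be sketched as follows. Here a simple sphere function stands in for the network's training loss, and all parameter values (swarm size, iteration count, contraction-expansion schedule) are illustrative assumptions, not the paper's settings:

```python
import numpy as np

def qpso(loss, dim, n_particles=30, iters=200, seed=2):
    """Minimal quantum-behaved PSO: particles jump around a local attractor
    with a spread set by the mean-best position (no velocity term)."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n_particles, dim))
    pbest = x.copy()
    pval = np.array([loss(p) for p in x])
    for t in range(iters):
        gbest = pbest[pval.argmin()]
        mbest = pbest.mean(axis=0)                 # mean of personal bests
        beta = 1.0 - 0.5 * t / iters               # contraction-expansion factor
        phi = rng.random((n_particles, dim))
        attractor = phi * pbest + (1.0 - phi) * gbest
        u = rng.random((n_particles, dim)) + 1e-12
        sign = np.where(rng.random((n_particles, dim)) < 0.5, 1.0, -1.0)
        x = attractor + sign * beta * np.abs(mbest - x) * np.log(1.0 / u)
        fv = np.array([loss(p) for p in x])
        improved = fv < pval
        pbest[improved], pval[improved] = x[improved], fv[improved]
    return pbest[pval.argmin()], float(pval.min())

# A sphere function stands in for the ANN training loss (optimum at the origin).
best_w, best_loss = qpso(lambda w: float(np.sum(w ** 2)), dim=5)
```

In the hybrid model, `loss` would be the forecast error of the network evaluated with the candidate weight vector `w`.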

  5. The potential of GRACE gravimetry to detect the heavy rainfall-induced impoundment of a small reservoir in the upper Yellow River

    OpenAIRE

    Yi, Shuang; Song, Chunqiao; Wang, Qiuyu; Wang, Linsong; Heki, Kosuke; Sun, Wenke

    2017-01-01

    Artificial reservoirs are important indicators of anthropogenic impacts on environments, and their cumulative influences on the local water storage will change the gravity signal. However, because of their small signal size, such gravity changes are seldom studied using satellite gravimetry from the Gravity Recovery and Climate Experiment (GRACE). Here we investigate the ability of GRACE to detect water storage changes in the Longyangxia Reservoir (LR), which is situated in the uppe...

  6. Exergoeconomic performance optimization of an endoreversible intercooled regenerative Brayton combined heat and power plant coupled to variable-temperature heat reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Bo; Chen, Lingen; Sun, Fengrui [College of Naval Architecture and Power, Naval University of Engineering, Wuhan 430033 (China)

    2012-07-01

    An endoreversible intercooled regenerative Brayton combined heat and power (CHP) plant model coupled to variable-temperature heat reservoirs is established. The exergoeconomic performance of the CHP plant is investigated using finite time thermodynamics. The analytical formulae about dimensionless profit rate and exergy efficiency of the CHP plant with the heat resistance losses in the hot-, cold- and consumer-side heat exchangers, the intercooler and the regenerator are deduced. By taking the maximum profit rate as the objective, the heat conductance allocation among the five heat exchangers and the choice of intercooling pressure ratio are optimized by numerical examples, the characteristic of the optimal dimensionless profit rate versus corresponding exergy efficiency is investigated. When the optimization is performed further with respect to the total pressure ratio, a double-maximum profit rate is obtained. The effects of the design parameters on the double-maximum dimensionless profit rate and corresponding exergy efficiency, optimal total pressure ratio and optimal intercooling pressure ratio are analyzed in detail, and it is found that there exist an optimal consumer-side temperature and an optimal thermal capacitance rate matching between the working fluid and the heat reservoir, respectively, corresponding to a thrice-maximum dimensionless profit rate.

  7. Liquid-Rich Shale Potential of Utah’s Uinta and Paradox Basins: Reservoir Characterization and Development Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Vanden Berg, Michael [Utah Geological Survey, Salt Lake City, UT (United States); Morgan, Craig [Utah Geological Survey, Salt Lake City, UT (United States); Chidsey, Thomas [Utah Geological Survey, Salt Lake City, UT (United States); McLennan, John [Univ. of Utah, Salt Lake City, UT (United States). Energy & Geoscience Inst.; Eby, David [Eby Petrography & Consulting, Littleton, CO (United States); Machel, Hans [Univ. of Alberta, Edmonton, AB (Canada); Schamel, Steve [GeoX Consulting, Salt Lake City, UT (United States); Birdwell, Justin [U.S. Geological Survey, Boulder, CO (United States); Johnson, Ron [U.S. Geological Survey, Boulder, CO (United States); Sarg, Rick [Colorado School of Mines, Golden, CO (United States)

    2017-08-31

    the undiscovered oil resource in the Cane Creek shale of the Paradox Basin at 103 million barrels at a 95 percent confidence level and 198 million barrels at a 50 percent confidence level. Nonetheless, limited research was available or published to further define the play and the reservoir characteristics. The specific objectives of the enclosed research were to (1) characterize geologic, geochemical, and geomechanical rock properties of target zones in the two designated basins by compiling data and by analyzing available cores, cuttings, and well logs; (2) describe outcrop reservoir analogs of GRF plays (Cane Creek shale is not exposed) and compare them to subsurface data; (3) map major regional trends for targeted intervals and identify “sweet spots” that have the greatest oil potential; (4) reduce exploration costs and drilling risks, especially in environmentally sensitive areas; (5) improve drilling and fracturing effectiveness by determining optimal well completion design; and (6) reduce field development costs, maximize oil recovery, and increase reserves. These objectives are all addressed in a series of nine publications that resulted from this extensive research project. Each publication is included in this report as an independent appendix.

  8. A statistical data assimilation method for seasonal streamflow forecasting to optimize hydropower reservoir management in data-scarce regions

    Science.gov (United States)

    Arsenault, R.; Mai, J.; Latraverse, M.; Tolson, B.

    2017-12-01

    Probabilistic ensemble forecasts generated by the ensemble streamflow prediction (ESP) methodology are subject to biases due to errors in the hydrological model's initial states. In day-to-day operations, hydrologists must compensate for discrepancies between observed and simulated states such as streamflow. However, in data-scarce regions, little to no information is available to guide the streamflow assimilation process. Manual assimilation can then add uncertainty because of the numerous options available to the forecaster; furthermore, the model's mass balance may be compromised, which could affect future forecasts. In this study we propose a data-driven approach in which specific variables that may be adjusted during assimilation are defined. The underlying principle was to identify the key variables most appropriate to modify during streamflow assimilation depending on the initial conditions, such as the time period of the assimilation, the snow water equivalent of the snowpack, and the meteorological conditions. The variables to adjust were determined by performing an automatic variational data assimilation on individual (or combinations of) model state variables and meteorological forcings. The assimilation aimed to simultaneously optimize (1) the error between the observed and simulated streamflow at the time point where the forecast starts, and (2) the bias between medium- to long-term observed and simulated flows, simulated by running the model with the observed meteorological data over a hindcast period. The optimal variables were then classified according to the initial conditions at the time period where the forecast is initiated. The proposed method was evaluated by measuring the average electricity generation of a hydropower complex in Québec, Canada, operated according to this method. A test-bed which simulates the real-world assimilation, forecasting, water release optimization and decision-making of a hydropower cascade was

  9. Cyanobacterial occurrence and detection of microcystins and saxitoxins in reservoirs of the Brazilian semi-arid

    Directory of Open Access Journals (Sweden)

    Jessica Roberts Fonseca

    2015-03-01

    Full Text Available Aim: The rapid spread of cyanobacteria in water sources and reservoirs has caused serious environmental damage and public health problems, and is a problem that challenges the institutions responsible for providing water to the population. In this study, microcystin, saxitoxin and cyanobacteria levels were quantified over 3 years in semi-arid reservoirs of Rio Grande do Norte (Brazil). In addition, we analyzed the seasonal distribution of cyanotoxins and the percentage of cyanobacteria and cyanotoxin samples above the limits established by Brazilian law. Methods: The study was conducted between 2009 and 2011 in four dams with six sites: Armando Ribeiro Gonçalves (ARG) in Itajá, San Rafael (SR) and Jucurutu; Passagem das Traíras (PT); Itans; and Gargalheiras (GARG). Cyanobacteria were identified and quantified, and the presence of microcystins (MCYs) and saxitoxins (STXs) was investigated by ELISA. Results: Cyanobacteria densities were above the permitted level in 76% of cases. The ELISA results showed that of the 128 samples analyzed, 27% were above the maximum allowed by the Brazilian Ministry of Health Order 2914/2011. A seasonal pattern for the presence of MCYs was found (0.00227 to 24.1954 µg.L–1), with the highest values in the rainy season. There was no clear seasonal pattern for STXs (0.003 to 0.766 µg.L–1). Conclusions: This study showed the importance of establishing water quality monitoring for human consumption and its potability standards, since the concentration of MCYs in some samples was above the maximum limit allowed by Brazilian law, thus posing a risk to public health, as conventional water treatment is not able to eliminate these potent hepatotoxins.

  10. Optimized velocity distributions for direct dark matter detection

    Energy Technology Data Exchange (ETDEWEB)

    Ibarra, Alejandro; Rappelt, Andreas, E-mail: ibarra@tum.de, E-mail: andreas.rappelt@tum.de [Physik-Department T30d, Technische Universität München, James-Franck-Straße, 85748 Garching (Germany)

    2017-08-01

    We present a method to calculate, without making assumptions about the local dark matter velocity distribution, the maximal and minimal number of signal events in a direct detection experiment given a set of constraints from other direct detection experiments and/or neutrino telescopes. The method also allows one to determine the velocity distribution that optimizes the signal rates. We illustrate our method with three concrete applications: i) to derive a halo-independent upper limit on the cross section from a set of null results, ii) to confront in a halo-independent way a detection claim with a set of null results and iii) to assess, in a halo-independent manner, the prospects for detection in a future experiment given a set of current null results.
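Because predicted event rates are linear in the (discretized) velocity distribution, the optimization over distributions reduces to a linear program. A hedged sketch with invented response functions and an invented exclusion bound, using `scipy.optimize.linprog` (none of these numbers come from the paper):

```python
import numpy as np
from scipy.optimize import linprog

# Discretized speed distribution: non-negative bin weights w_i summing to 1.
# Predicted rates are linear in w, so extremal rates come from a linear program.
v = np.linspace(50.0, 750.0, 15)            # bin centers in km/s (illustrative)
r_target = np.clip(v - 200.0, 0.0, None)    # invented response of a future experiment
r_null = np.clip(v - 400.0, 0.0, None)      # invented response of a null-result experiment

res = linprog(c=-r_target,                  # maximize target rate (linprog minimizes)
              A_ub=[r_null], b_ub=[50.0],   # invented bound from the null result
              A_eq=[np.ones_like(v)], b_eq=[1.0],
              bounds=[(0.0, None)] * len(v))
w_opt = res.x                               # rate-maximizing "distribution"
max_rate = -res.fun                         # halo-independent maximal signal
```

Consistent with the abstract's setting, the optimizer concentrates the weight on very few speed bins: the extremal distribution is highly degenerate rather than a smooth halo model.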

  11. Optimization of PET system design for lesion detection

    International Nuclear Information System (INIS)

    Qi, Jinyi

    2000-01-01

    Traditionally, the figures of merit used in designing a PET scanner are spatial resolution, noise equivalent count rate, noise equivalent sensitivity, etc. These measures, however, do not directly reflect the lesion detectability achievable with the scanner. Here we propose to optimize PET scanner design directly for lesion detection. The signal-to-noise ratio (SNR) of lesion detection can be easily computed using theoretical expressions that we have previously derived. Because no time-consuming Monte Carlo simulation is needed, the theoretical expressions allow evaluation of a large range of parameters. The PET system parameters can then be chosen to achieve the maximum SNR for lesion detection. The simulation study shown in this paper focused on a single-ring PET scanner without depth-of-interaction measurement. Randoms and scatters were also ignored.

  12. Combined Data with Particle Swarm Optimization for Structural Damage Detection

    Directory of Open Access Journals (Sweden)

    Fei Kang

    2013-01-01

    Full Text Available This paper proposes a damage detection method based on combined data of static and modal tests using particle swarm optimization (PSO. To improve the performance of PSO, some immune properties such as selection, receptor editing, and vaccination are introduced into the basic PSO and an improved PSO algorithm is formed. Simulations on three benchmark functions show that the new algorithm performs better than PSO. The efficiency of the proposed damage detection method is tested on a clamped beam, and the results demonstrate that it is more efficient than PSO, differential evolution, and an adaptive real-parameter simulated annealing genetic algorithm.
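A basic PSO loop of the kind used in such damage detection studies can be sketched on a 3-DOF spring-mass chain, where "damage" is a stiffness reduction recovered by matching natural frequencies. The model, all parameter values, and the omission of the paper's immune-inspired operators (selection, receptor editing, vaccination) are simplifying assumptions; this is plain PSO:

```python
import numpy as np

def chain_freqs(kfac, k0=1000.0):
    """Natural frequencies of a 3-DOF spring-mass chain (unit masses);
    kfac are per-element stiffness factors, 1 = healthy, < 1 = damaged."""
    k = k0 * np.asarray(kfac, dtype=float)
    K = np.array([[k[0] + k[1], -k[1],        0.0],
                  [-k[1],        k[1] + k[2], -k[2]],
                  [0.0,         -k[2],         k[2]]])
    return np.sqrt(np.linalg.eigvalsh(K))   # mass matrix is the identity

true_damage = np.array([1.0, 0.6, 1.0])     # element 2 has lost 40% stiffness
f_meas = chain_freqs(true_damage)           # "measured" natural frequencies
obj = lambda x: float(np.sum((chain_freqs(x) - f_meas) ** 2))

# Basic PSO over the three stiffness factors.
rng = np.random.default_rng(3)
n, iters = 40, 300
x = rng.uniform(0.3, 1.0, (n, 3))
vel = np.zeros((n, 3))
pb, pbv = x.copy(), np.array([obj(p) for p in x])
for _ in range(iters):
    gb = pb[pbv.argmin()]
    r1, r2 = rng.random((2, n, 3))
    vel = 0.7 * vel + 1.5 * r1 * (pb - x) + 1.5 * r2 * (gb - x)
    x = np.clip(x + vel, 0.3, 1.0)
    fv = np.array([obj(p) for p in x])
    better = fv < pbv
    pb[better], pbv[better] = x[better], fv[better]
best, best_val = pb[pbv.argmin()], float(pbv.min())
```

The paper's contribution is precisely to make this search more reliable: the immune operators perturb stagnating particles so the swarm escapes local minima that plain PSO can get stuck in.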

  13. Optimizing the Configuration of Sensor Networks to Detect Intruders.

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Nathanael J. K. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jones, Katherine A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nozick, Linda Karen [Cornell Univ., Ithaca, NY (United States); Xu, Ningxiong [Cornell Univ., Ithaca, NY (United States)

    2015-03-01

    This paper focuses on optimizing the selection and configuration of detection technologies to protect a target of interest. The ability of an intruder to simply reach the target is assumed to be sufficient to consider the security system a failure. To address this problem, we develop a game theoretic model of the strategic interactions between the system owner and a knowledgeable intruder. A decomposition-based exact method is used to solve the resultant model.

  14. Suitable or optimal noise benefits in signal detection

    International Nuclear Information System (INIS)

    Liu, Shujun; Yang, Ting; Tang, Mingchun; Wang, Pin; Zhang, Xinzheng

    2016-01-01

    Highlights: • Six intervals of additive noises divided according to the two constraints. • Derivation of the suitable additive noise to meet the two constraints. • Formulation of the suitable noise for improvability or nonimprovability. • Optimal noises to minimize P_FA, maximize P_D and maximize the overall improvement. - Abstract: We present an effective way to generate the suitable or the optimal additive noises which can achieve the three goals of noise-enhanced detectability, i.e., the maximum detection probability (P_D), the minimum false alarm probability (P_FA) and the maximum overall improvement of P_D and P_FA, without increasing P_FA or decreasing P_D in a binary hypothesis testing problem. The mechanism of our method is that we divide the discrete vectors into six intervals and choose the useful or partially useful vectors from these intervals to form the additive noise according to different requirements. The form of the optimal noise is derived and proven to be a randomization of no more than two discrete vectors. Moreover, how to choose suitable and optimal noises from the six intervals is described. Finally, numerous examples are presented to illustrate the theoretical analysis, where the background noises are Gaussian, and symmetric and asymmetric Gaussian mixture noise, respectively.

  15. Optimal sampling strategies for detecting zoonotic disease epidemics.

    Directory of Open Access Journals (Sweden)

    Jake M Ferguson

    2014-06-01

    Full Text Available The early detection of disease epidemics reduces the chance of successful introductions into new locales, minimizes the number of infections, and reduces the financial impact. We develop a framework to determine the optimal sampling strategy for disease detection in zoonotic host-vector epidemiological systems when a disease goes from below detectable levels to an epidemic. We find that if the time of disease introduction is known then the optimal sampling strategy can switch abruptly between sampling only from the vector population to sampling only from the host population. We also construct time-independent optimal sampling strategies when conducting periodic sampling that can involve sampling both the host and the vector populations simultaneously. Both time-dependent and -independent solutions can be useful for sampling design, depending on whether the time of introduction of the disease is known or not. We illustrate the approach with West Nile virus, a globally-spreading zoonotic arbovirus. Though our analytical results are based on a linearization of the dynamical systems, the sampling rules appear robust over a wide range of parameter space when compared to nonlinear simulation models. Our results suggest some simple rules that can be used by practitioners when developing surveillance programs. These rules require knowledge of transition rates between epidemiological compartments, which population was initially infected, and of the cost per sample for serological tests.

  16. Optimal sampling strategies for detecting zoonotic disease epidemics.

    Science.gov (United States)

    Ferguson, Jake M; Langebrake, Jessica B; Cannataro, Vincent L; Garcia, Andres J; Hamman, Elizabeth A; Martcheva, Maia; Osenberg, Craig W

    2014-06-01

    The early detection of disease epidemics reduces the chance of successful introductions into new locales, minimizes the number of infections, and reduces the financial impact. We develop a framework to determine the optimal sampling strategy for disease detection in zoonotic host-vector epidemiological systems when a disease goes from below detectable levels to an epidemic. We find that if the time of disease introduction is known then the optimal sampling strategy can switch abruptly between sampling only from the vector population to sampling only from the host population. We also construct time-independent optimal sampling strategies when conducting periodic sampling that can involve sampling both the host and the vector populations simultaneously. Both time-dependent and -independent solutions can be useful for sampling design, depending on whether the time of introduction of the disease is known or not. We illustrate the approach with West Nile virus, a globally-spreading zoonotic arbovirus. Though our analytical results are based on a linearization of the dynamical systems, the sampling rules appear robust over a wide range of parameter space when compared to nonlinear simulation models. Our results suggest some simple rules that can be used by practitioners when developing surveillance programs. These rules require knowledge of transition rates between epidemiological compartments, which population was initially infected, and of the cost per sample for serological tests.

  17. Detection of Ostreid herpesvirus-1 microvariants in healthy Crassostrea gigas following disease events and their possible role as reservoirs of infection.

    Science.gov (United States)

    Evans, Olivia; Hick, Paul; Whittington, Richard J

    2017-09-01

    Ostreid herpesvirus-1 microvariants (OsHV-1) cause severe mortalities in farmed Crassostrea gigas in Europe, New Zealand and Australia. Outbreaks are seasonal, recurring in the warmer months of the year in endemic estuaries. The reference genotype and microvariant genotypes of OsHV-1 have previously been detected in the tissues of apparently healthy adult oysters naturally exposed to OsHV-1 in the field. However, the role of such oysters as reservoirs of infection for subsequent mortality outbreaks remains unclear. The aims of this study were: (1) to identify the optimal sample type for the detection of OsHV-1 DNA in apparently healthy C. gigas; and (2) to assess whether live C. gigas maintained on-farm after an OsHV-1 related mortality event remain infected and could act as a reservoir host for subsequent outbreaks. OsHV-1 DNA was detected in the hemolymph, gill, mantle, adductor muscle, gonad and digestive gland of apparently healthy adult oysters. The likelihood of detecting OsHV-1 DNA in hemolymph was equivalent to that in gill and mantle, but the odds of detecting OsHV-1 DNA in hemolymph and gill were more than 8 times those of adductor muscle. Gill had the highest viral loads. Compared to testing whole gill homogenates, testing snippets of the gill improved the detection of OsHV-1 DNA by about four fold. The prevalence of OsHV-1 in gill and mantle was highest after the first season of OsHV-1 exposure; it then declined to low or negligible levels in the same cohorts in subsequent seasons, despite repeated seasonal exposure in monitoring lasting up to 4 years. The hemolymph of individually identified oysters was repeatedly sampled over 15 months, and OsHV-1 prevalence declined over that time frame in the youngest cohort, which had been exposed to OsHV-1 for the first time at the start of that season. In contrast, the prevalence in two cohorts of older oysters, which had been exposed to OsHV-1 in prior seasons, was consistently low (<10%). Viral loads were

  18. Detecting changes in real-time data: a user's guide to optimal detection.

    Science.gov (United States)

    Johnson, P; Moriarty, J; Peskir, G

    2017-08-13

    The real-time detection of changes in a noisily observed signal is an important problem in applied science and engineering. The study of parametric optimal detection theory began in the 1930s, motivated by applications in production and defence. Today this theory, which aims to minimize a given measure of detection delay under accuracy constraints, finds applications in domains including radar, sonar, seismic activity, global positioning, psychological testing, quality control, communications and power systems engineering. This paper reviews developments in optimal detection theory and sequential analysis, including sequential hypothesis testing and change-point detection, in both Bayesian and classical (non-Bayesian) settings. For clarity of exposition, we work in discrete time and provide a brief discussion of the continuous time setting, including recent developments using stochastic calculus. Different measures of detection delay are presented, together with the corresponding optimal solutions. We emphasize the important role of the signal-to-noise ratio and discuss both the underlying assumptions and some typical applications for each formulation.This article is part of the themed issue 'Energy management: flexibility, risk and optimization'. © 2017 The Author(s).
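The change-point formulations reviewed here are typified by the CUSUM procedure, which accumulates per-sample log-likelihood ratios and alarms once the statistic crosses a threshold, trading detection delay against false-alarm rate. A minimal discrete-time sketch for a Gaussian mean shift (the `cusum` helper and all parameters are illustrative):

```python
import numpy as np

def cusum(x, mu0, mu1, sigma, h):
    """One-sided CUSUM for a mean shift mu0 -> mu1 in Gaussian noise.
    Returns the first alarm index (or None) and the statistic path."""
    llr = (mu1 - mu0) / sigma ** 2 * (x - 0.5 * (mu0 + mu1))  # per-sample log-LR
    s, path = 0.0, []
    for t, inc in enumerate(llr):
        s = max(0.0, s + inc)            # reflect the statistic at zero
        path.append(s)
        if s >= h:                       # alarm: cumulative evidence exceeds h
            return t, np.array(path)
    return None, np.array(path)

rng = np.random.default_rng(4)
n, change = 300, 100
x = rng.normal(0.0, 1.0, n)
x[change:] += 1.0                        # mean shifts from 0 to 1 at t = 100
alarm, path = cusum(x, mu0=0.0, mu1=1.0, sigma=1.0, h=10.0)
```

Raising the threshold `h` lengthens the average run to a false alarm at the cost of a longer detection delay, which is exactly the delay-versus-accuracy trade-off the review formalizes.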

  19. Optimal and sub-optimal post-detection timing estimators for PET

    International Nuclear Information System (INIS)

    Hero, A.O.; Antoniadis, N.; Clinthorne, N.; Rogers, W.L.; Hutchins, G.D.

    1990-01-01

    In this paper the authors derive linear and non-linear approximations to the post-detection likelihood function for scintillator interaction time in nuclear particle detection systems. The likelihood function is the optimal statistic for performing detection and estimation of scintillator events and event times. The authors derive the likelihood function approximations from a statistical model for the post-detection waveform which is common in the optical communications literature and takes account of finite detector bandwidth, random gains, and thermal noise. They then present preliminary simulation results for the associated approximate maximum likelihood timing estimators which indicate that significant MSE improvements may be achieved for low post-detection signal-to-noise ratio.

  20. Adjoint based optimal control of partially miscible two-phase flow in porous media with applications to CO2 sequestration in underground reservoirs

    KAUST Repository

    Simon, Moritz

    2014-11-14

    © 2014, Springer Science+Business Media New York. With the target of optimizing CO2 sequestration in underground reservoirs, we investigate constrained optimal control problems with partially miscible two-phase flow in porous media. Our objective is to maximize the amount of trapped CO2 in an underground reservoir after a fixed period of CO2 injection, while time-dependent injection rates in multiple wells are used as control parameters. We describe the governing two-phase two-component Darcy flow PDE system, formulate the optimal control problem and derive the continuous adjoint equations. For the discretization we apply a variant of the so-called BOX method, a locally conservative control-volume FE method that we further stabilize by a periodic averaging feature to reduce oscillations. The timestep-wise Lagrange function of the control problem is implemented as a variational form in Sundance, a toolbox for rapid development of parallel FE simulations, which is part of the HPC software Trilinos. We discuss the BOX method and our implementation in Sundance. The MPI parallelized Sundance state and adjoint solvers are linked to the interior point optimization package IPOPT, using limited-memory BFGS updates for approximating second derivatives. Finally, we present and discuss different types of optimal control results.
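
    The adjoint mechanics underlying this approach, in which one backward sweep produces the gradient with respect to every time-dependent control at once, can be sketched on a deliberately tiny discrete analogue. The scalar linear dynamics, coefficients and regularized objective below are invented stand-ins for the two-phase Darcy system:

```python
# Tiny discrete analogue of adjoint-based control: maximize the terminal
# state (standing in for the trapped amount) minus a quadratic control
# penalty.  Dynamics and all parameter values are invented for this sketch.

A, B, LAM = 0.9, 0.5, 0.1  # state decay, control gain, control penalty

def forward(x0, us):
    """March the state x_{k+1} = A x_k + B u_k and return x_N."""
    x = x0
    for u in us:
        x = A * x + B * u
    return x

def objective(x0, us):
    return forward(x0, us) - LAM * sum(u * u for u in us)

def adjoint_gradient(us):
    """One backward sweep gives dJ/du_k for every control at once."""
    n = len(us)
    grads = [0.0] * n
    p = 1.0  # p_k = d x_N / d x_k, seeded with p_N = 1
    for k in reversed(range(n)):
        grads[k] = B * p - 2.0 * LAM * us[k]
        p = A * p  # adjoint recursion p_k = A * p_{k+1}
    return grads
```

    A finite-difference check confirms the adjoint gradient; in the paper, a gradient of this kind feeds the limited-memory BFGS updates inside IPOPT.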

  1. Optimization and Validation of kit of detection Antibiotics on Honey

    International Nuclear Information System (INIS)

    Hamza, Malek

    2013-01-01

    According to the Codex Alimentarius and the European Commission Directive, each food has a maximum residue limit (MRL) for antibiotics; for honey, however, no limit has yet been set. Among the main methods that guarantee the detection of antibiotic residues is the Premi Test, a qualitative method for the detection of antibiotics in honey, but it remains non-specific for this matrix and fairly long (three hours of incubation). Through this work, we were able to develop and optimize a new kit called the Honey Test. This kit is able to detect the presence of antibiotic residues in honey using a radiation-resistant bacterial strain called D.ra. The duration of the test is only 30 minutes, requiring incubation at 37 °C and treatment with UV at 366 nm. This work will be the subject of a national patent.

  2. Multi-time scale Climate Informed Stochastic Hybrid Simulation-Optimization Model (McISH model) for Multi-Purpose Reservoir System

    Science.gov (United States)

    Lu, M.; Lall, U.

    2013-12-01

    In order to mitigate the impacts of climate change, proactive management strategies to operate reservoirs and dams are needed. A multi-time scale climate informed stochastic model is developed to optimize the operations for a multi-purpose single reservoir by simulating decadal, interannual, seasonal and sub-seasonal variability. We apply the model to a setting motivated by the largest multi-purpose dam in N. India, the Bhakhra reservoir on the Sutlej River, a tributary of the Indus. This leads to a focus on the timing and amplitude of the flows for the monsoon and snowmelt periods. The flow simulations are constrained by multiple sources of historical data and GCM future projections that are being developed through a NSF funded project titled 'Decadal Prediction and Stochastic Simulation of Hydroclimate Over Monsoon Asia'. The model presented is a multilevel, nonlinear programming model that aims to optimize the reservoir operating policy on a decadal horizon and the operation strategy on an annually updated basis. The model is hierarchical, in the sense that two optimization models designated for different time scales are nested like matryoshka dolls. The two optimization models have similar mathematical formulations, with some modifications to meet the constraints within each time frame. The first level of the model provides an optimization solution for policy makers to determine contracted annual releases to different uses with a prescribed reliability; the second level is a within-the-period (e.g., year) operation optimization scheme that allocates the contracted annual releases on a subperiod (e.g., monthly) basis, with additional benefit for extra release and a penalty for failure. The model maximizes the net benefit of irrigation, hydropower generation and flood control in each of the periods. The model design thus facilitates the consistent application of weather and climate forecasts to improve operations of reservoir systems. The
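
    As a toy illustration of the lower, within-period level of such a model, allocating releases across sub-periods against a benefit function, the following dynamic program over a discretized storage grid is a drastically simplified, invented stand-in for the nonlinear programming model described:

```python
# Drastically simplified within-period allocation: choose integer monthly
# releases to maximize a benefit function, by dynamic programming over a
# discretized storage grid.  Units, grids and the benefit are invented.
from functools import lru_cache

def optimize_releases(inflows, s0, s_max, r_max, benefit):
    T = len(inflows)

    @lru_cache(maxsize=None)
    def best(t, s):
        """Return (best benefit-to-go from month t at storage s, first release)."""
        if t == T:
            return 0.0, None
        top = (float("-inf"), None)
        for r in range(r_max + 1):
            s_next = min(s + inflows[t] - r, s_max)  # water above capacity spills
            if s_next < 0:
                continue  # cannot release more water than is available
            val = benefit(r) + best(t + 1, s_next)[0]
            if val > top[0]:
                top = (val, r)
        return top

    plan, s = [], s0  # replay the optimal decisions forward
    for t in range(T):
        plan.append(best(t, s)[1])
        s = min(s + inflows[t] - plan[-1], s_max)
    return best(0, s0)[0], plan
```

    With a concave benefit in place of a linear one, the same program spreads releases across sub-periods, mirroring the hedging behavior the full model encodes.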

  3. Evaluation of the Chagas Stat-Paktm Assay for Detection of Trypanosoma cruzi Antibodies in Wildlife Reservoirs

    Science.gov (United States)

    Yabsley, Michael J.; Brown, Emily L.; Roellig, Dawn M.

    2010-01-01

    An immunochromatographic assay (Chagas Stat-Pak™) was evaluated for the detection of Trypanosoma cruzi antibodies in 4 species of wildlife reservoirs. Antibodies to T. cruzi were detected in raccoons (Procyon lotor) (naturally and experimentally infected) and degus (Octodon degu) (experimentally infected) using the Chagas Stat-Pak. In naturally exposed wild raccoons, the Chagas Stat-Pak had a sensitivity and specificity of 66.7–80.0% and 96.3%, respectively. Compared with indirect immunofluorescent antibody assay results, seroconversion as determined by Chagas Stat-Pak was delayed for experimentally infected raccoons, but occurred sooner in experimentally infected degus. The Chagas Stat-Pak did not detect antibodies in naturally or experimentally infected Virginia opossums (Didelphis virginiana) or in experimentally infected short-tailed opossums (Monodelphis domestica). These data suggest that the Chagas Stat-Pak might be useful in field studies of raccoons and degus when samples would not be available for more conventional serologic assays. Because this assay did not work on either species of marsupial, its applicability should be examined before it is used in other wild species. PMID:19016578

  4. Sequential Change-Point Detection via Online Convex Optimization

    Directory of Open Access Journals (Sweden)

    Yang Cao

    2018-02-01

    Full Text Available Sequential change-point detection when the distribution parameters are unknown is a fundamental problem in statistics and machine learning. When the post-change parameters are unknown, we consider a set of detection procedures based on sequential likelihood ratios with non-anticipating estimators constructed using online convex optimization algorithms such as online mirror descent, which provides a more versatile approach to tackling complex situations where recursive maximum likelihood estimators cannot be found. When the underlying distributions belong to an exponential family and the estimators satisfy the logarithmic regret property, we show that this approach is nearly second-order asymptotically optimal. This means that the upper bound for the false alarm rate of the algorithm (measured by the average run length) meets the lower bound asymptotically up to a log-log factor as the threshold tends to infinity. Our proof is achieved by making a connection between sequential change-point detection and online convex optimization and leveraging the logarithmic regret bound property of the online mirror descent algorithm. Numerical and real data examples validate our theory.
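
    The non-anticipating-estimator idea can be sketched for the Gaussian case, where a running mean (updated only after it has been used in the likelihood ratio) stands in for the paper's online mirror descent machinery:

```python
# Sketch of a sequential likelihood-ratio test with a non-anticipating
# estimate of the unknown post-change mean (Gaussian case).  The running
# mean below is a simple stand-in for the online mirror descent estimator.

def adaptive_detect(xs, mu0=0.0, sigma=1.0, h=8.0):
    s, mu1, n = 0.0, mu0, 0
    for t, x in enumerate(xs):
        # the likelihood ratio uses only the estimate built from past data
        llr = (mu1 - mu0) * (x - (mu0 + mu1) / 2.0) / sigma ** 2
        s = max(0.0, s + llr)
        if s >= h:
            return t
        n += 1
        mu1 += (x - mu1) / n  # update AFTER use: the estimator never peeks
    return None
```

    Because the estimate is built only from past observations, the likelihood ratio retains the martingale structure that the false-alarm analysis in the paper relies on.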

  5. ADAPTIVE ANT COLONY OPTIMIZATION BASED GRADIENT FOR EDGE DETECTION

    Directory of Open Access Journals (Sweden)

    Febri Liantoni

    2014-08-01

    Full Text Available Ant Colony Optimization (ACO) is a nature-inspired optimization algorithm motivated by ants' foraging behavior. Due to its favorable advantages, ACO has been widely used to solve several NP-hard problems, including edge detection. Since ACO initially distributes ants at random, it may cause an imbalanced ant distribution, which later affects the path-discovery process. In this paper an adaptive ACO is proposed to optimize edge detection by adaptively distributing ants according to gradient analysis. Ants are distributed according to the gradient ratio of each image region: regions with a larger gradient ratio receive more ants. Experiments are conducted using images from various datasets. Precision and recall are used to quantitatively evaluate the performance of the proposed algorithm. The precision and recall of adaptive ACO reach 76.98% and 96.8%, whereas the highest precision and recall for standard ACO are 69.74% and 74.85%. Experimental results show that the adaptive ACO outperforms standard ACO, which randomly distributes ants.
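
    The adaptive initialization step can be sketched as follows; the region partition and identifiers are hypothetical, and the pheromone update and path construction of the full algorithm are omitted:

```python
# Adaptive ant placement: each region's share of ants is proportional to its
# share of the total gradient magnitude.  Region ids and pixels are invented.

def distribute_ants(gradient, regions, n_ants):
    """gradient: pixel -> magnitude; regions: region id -> list of pixels."""
    totals = {r: sum(gradient[p] for p in pix) for r, pix in regions.items()}
    grand = sum(totals.values()) or 1.0  # guard against an all-flat image
    alloc = {r: int(n_ants * t / grand) for r, t in totals.items()}
    # hand ants lost to integer rounding to the strongest regions first
    leftover = n_ants - sum(alloc.values())
    for r in sorted(totals, key=totals.get, reverse=True)[:leftover]:
        alloc[r] += 1
    return alloc
```

    A region holding three quarters of the gradient mass thus receives three quarters of the colony, concentrating the search where edges are likely.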

  6. Detecting anthropogenic climate change with an optimal fingerprint method

    International Nuclear Information System (INIS)

    Hegerl, G.C.; Storch, H. von; Hasselmann, K.; Santer, B.D.; Jones, P.D.

    1994-01-01

    We propose a general fingerprint strategy to detect anthropogenic climate change and present an application to near-surface temperature trends. An expected time-space-variable pattern of anthropogenic climate change (the 'signal') is identified through application of an appropriate optimally matched space-time filter (the 'fingerprint') to the observations. The signal and the fingerprint are represented in a space for which sufficient observed and simulated data exist. The signal pattern is derived from a model-generated prediction of anthropogenic climate change. Application of the fingerprint filter to the data yields a scalar detection variable. The statistically optimal fingerprint is obtained by weighting the model-predicted pattern towards low-noise directions. A combination of model output and observations is used to estimate the noise characteristics of the detection variable, arising from the natural variability of climate in the absence of external forcing. We then test the null hypothesis that the observed climate change is part of natural climate variability. We conclude that a statistically significant externally induced warming has been observed, with the caveat of a possibly inadequate estimate of the internal climate variability. In order to attribute this warming uniquely to anthropogenic greenhouse gas forcing, more information on the climate's response to other forcing mechanisms (e.g. changes in solar radiation, volcanic or anthropogenic aerosols) and their interaction is needed. (orig./KW)
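
    The core projection can be sketched numerically; here the noise covariance is taken to be diagonal, a simplification of the estimated covariance used in the paper, and the pattern values are invented:

```python
# Optimal-fingerprint projection with a diagonal noise covariance: the
# fingerprint is the signal pattern down-weighted where natural variability
# is large.  The detection variable is 1.0 when obs matches the signal.

def detection_variable(obs, signal, noise_var):
    fingerprint = [s / v for s, v in zip(signal, noise_var)]
    norm = sum(f * s for f, s in zip(fingerprint, signal))
    return sum(f * o for f, o in zip(fingerprint, obs)) / norm
```

    Significance testing then asks whether the observed value of this scalar is consistent with its distribution under unforced natural variability.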

  7. Application of the Taguchi Method for Optimizing the Process Parameters of Producing Lightweight Aggregates by Incorporating Tile Grinding Sludge with Reservoir Sediments.

    Science.gov (United States)

    Chen, How-Ji; Chang, Sheng-Nan; Tang, Chao-Wei

    2017-11-10

    This study aimed to apply the Taguchi optimization technique to determine the process conditions for producing synthetic lightweight aggregate (LWA) by incorporating tile grinding sludge powder with reservoir sediments. An L16(4⁵) orthogonal array was adopted, which consisted of five controllable four-level factors (i.e., sludge content, preheat temperature, preheat time, sintering temperature, and sintering time). Moreover, the analysis of variance method was used to explore the effects of the experimental factors on the particle density, water absorption, bloating ratio, and loss on ignition of the produced LWA. Overall, the produced aggregates had particle densities ranging from 0.43 to 2.1 g/cm³ and water absorption ranging from 0.6% to 13.4%. These values are comparable to the requirements for ordinary and high-performance LWAs. The results indicated that it is considerably feasible to produce high-performance LWA by incorporating tile grinding sludge with reservoir sediments.
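
    Taguchi analysis of such an array reduces to computing a signal-to-noise ratio per run and averaging it per factor level; the sketch below uses the smaller-is-better S/N (appropriate for a response such as water absorption) with invented factor names:

```python
# Taguchi-style analysis: smaller-is-better S/N per run, then the level with
# the best mean S/N is chosen for each factor.  Factor names are invented.
import math

def sn_smaller_is_better(ys):
    """Smaller-is-better signal-to-noise ratio in dB."""
    return -10.0 * math.log10(sum(y * y for y in ys) / len(ys))

def best_levels(runs):
    """runs: list of (factor->level dict, S/N) pairs from an orthogonal array."""
    effects = {}
    for levels, sn in runs:
        for factor, level in levels.items():
            effects.setdefault(factor, {}).setdefault(level, []).append(sn)
    return {f: max(lv, key=lambda L: sum(lv[L]) / len(lv[L]))
            for f, lv in effects.items()}
```

    The orthogonality of the array is what allows each factor's level means to be compared independently, which is why only 16 runs suffice for five four-level factors.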

  8. A Joint Optimization Criterion for Blind DS-CDMA Detection

    Directory of Open Access Journals (Sweden)

    Sergio A. Cruces-Alvarez

    2007-01-01

    Full Text Available This paper addresses the problem of the blind detection of a desired user in an asynchronous DS-CDMA communications system with multipath propagation channels. Starting from the inverse filter criterion introduced by Tugnait and Li in 2001, we propose to tackle the problem in the context of the blind signal extraction methods for ICA. In order to improve the performance of the detector, we present a criterion based on the joint optimization of several higher-order statistics of the outputs. An algorithm that optimizes the proposed criterion is described, and its improved performance and robustness with respect to the near-far problem are corroborated through simulations. Additionally, a simulation using measurements on a real software-radio platform at 5 GHz has also been performed.

  9. A Joint Optimization Criterion for Blind DS-CDMA Detection

    Science.gov (United States)

    Durán-Díaz, Iván; Cruces-Alvarez, Sergio A.

    2006-12-01

    This paper addresses the problem of the blind detection of a desired user in an asynchronous DS-CDMA communications system with multipath propagation channels. Starting from the inverse filter criterion introduced by Tugnait and Li in 2001, we propose to tackle the problem in the context of the blind signal extraction methods for ICA. In order to improve the performance of the detector, we present a criterion based on the joint optimization of several higher-order statistics of the outputs. An algorithm that optimizes the proposed criterion is described, and its improved performance and robustness with respect to the near-far problem are corroborated through simulations. Additionally, a simulation using measurements on a real software-radio platform at 5 GHz has also been performed.

  10. Constructing an optimal decision tree for FAST corner point detection

    KAUST Repository

    Alkhalid, Abdulaziz; Chikalov, Igor; Moshkov, Mikhail

    2011-01-01

    In this paper, we consider a problem that originated in computer vision: determining an optimal testing strategy for the corner point detection problem that is part of the FAST algorithm [11,12]. The problem can be formulated as building a decision tree with the minimum average depth for a decision table with all discrete attributes. We experimentally compare the performance of an exact algorithm based on dynamic programming and several greedy algorithms that differ in the attribute selection criterion. © 2011 Springer-Verlag.
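
    The exact formulation can be sketched as a memoized search over subsets of a tiny, invented decision table, minimizing the total query count and hence the average depth; the real FAST tables are far larger, and the paper's greedy criteria are not reproduced here:

```python
# Exact minimum average depth of a decision tree over a small decision
# table, by memoized exhaustive search.  Rows are (attribute_tuple, label);
# the table must be consistent (equal attributes imply equal labels).
from functools import lru_cache

def min_average_depth(rows):
    n_attrs = len(rows[0][0])

    @lru_cache(maxsize=None)
    def total_depth(subset):
        """Sum over rows in `subset` of their depth in the best subtree."""
        if len({rows[i][1] for i in subset}) <= 1:
            return 0  # one label left: no further queries needed
        best = None
        for a in range(n_attrs):
            groups = {}
            for i in subset:
                groups.setdefault(rows[i][0][a], []).append(i)
            if len(groups) < 2:
                continue  # attribute does not split this subset
            # every row pays one query here, plus its subtree's queries
            cost = len(subset) + sum(total_depth(tuple(g)) for g in groups.values())
            if best is None or cost < best:
                best = cost
        return best

    return total_depth(tuple(range(len(rows)))) / len(rows)
```

    On a XOR-style table no single attribute decides any row, so every row needs two queries and the minimum average depth is exactly 2.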

  11. Three-Gorge Reservoir: A 'Controlled Experiment' for Calibration/Validation of Time-Variable Gravity Signals Detected from Space

    Science.gov (United States)

    Chao, Benjamin F.; Boy, J. P.

    2003-01-01

    With the advances of measurements, modern space geodesy has become a new type of remote sensing for Earth dynamics, especially for mass transports in the geophysical fluids on large spatial scales. A case in point is the space gravity mission GRACE (Gravity Recovery And Climate Experiment), which has been in orbit collecting gravity data since early 2002. The data promise to be able to detect changes of water mass equivalent to sub-cm thickness on spatial scales of several hundred km every month or so. China's Three-Gorge Reservoir has already started the process of water impoundment in phases. By 2009, 40 km3 of water will be stored behind one of the world's highest dams, spanning a section of the middle Yangtze River about 600 km in length. For the GRACE observations, the Three-Gorge Reservoir would represent a geophysical 'controlled experiment', one that offers a unique opportunity to do detailed geophysical studies. Assuming a complete documentation of the water level and history of the water impoundment process, and aided by continual monitoring of the lithospheric loading response (such as in area gravity and deformation), one has at hand basically a classical forward-inverse modeling problem of surface loading, where the input and certain output are known. The invisible portion of the impounded water, i.e. underground storage, poses either added value as an observable or a complication as an unknown to be modeled. Wang (2000) has studied the possible loading effects on a local scale; we here aim for larger spatial scales upwards from several hundred km, with emphasis on the time-variable gravity signals that can be detected by GRACE and follow-on missions. Results using the Green's function approach on the PREM elastic Earth model indicate geoid height variations reaching several millimeters on wavelengths of about a thousand kilometers. The corresponding vertical deformations have amplitudes of a few centimeters. In terms of long

  12. Optimal Background Attenuation for Fielded Spectroscopic Detection Systems

    International Nuclear Information System (INIS)

    Robinson, Sean M.; Ashbaker, Eric D.; Schweppe, John E.; Siciliano, Edward R.

    2007-01-01

    Radiation detectors are often placed in positions difficult to shield from the effects of terrestrial background gamma radiation. This is particularly true of Radiation Portal Monitor (RPM) systems, as their wide viewing angle and outdoor installations make them susceptible to radiation from the surrounding area. Reducing this source of background can improve gross-count detection capabilities in the current generation of non-spectroscopic RPMs as well as source identification capabilities in the next generation of spectroscopic RPMs. To provide guidance for designing such systems, the problem of shielding a general spectroscopic-capable RPM system from terrestrial gamma radiation is considered. This analysis is carried out with template matching algorithms, to determine and isolate a set of non-threat isotopes typically present in the commerce stream. Various model detector and shielding scenarios are calculated using the Monte Carlo N-Particle (MCNP) computer code. Amounts of nominal-density shielding needed to increase the probability of detection for an ensemble of illicit sources are given. Common shielding solutions such as steel plating are evaluated based on the probability of detection for 3 particular illicit sources of interest, and the benefits are weighed against the incremental cost of shielding. Previous work has provided optimal shielding scenarios for RPMs based on gross-counting measurements, and those same solutions (shielding the internal detector cavity, direct shielding of the ground between the detectors, and the addition of collimators) are examined with respect to their utility in improving spectroscopic detection.

  13. Biometric Quantization through Detection Rate Optimized Bit Allocation

    Directory of Open Access Journals (Sweden)

    C. Chen

    2009-01-01

    Full Text Available Extracting binary strings from real-valued biometric templates is a fundamental step in many biometric template protection systems, such as fuzzy commitment, fuzzy extractor, secure sketch, and helper data systems. Previous work has focused on the design of optimal quantization and coding for each single feature component, yet the resulting binary string (the concatenation of all coded feature components) is not optimal. In this paper, we present a detection rate optimized bit allocation (DROBA) principle, which assigns more bits to discriminative features and fewer bits to nondiscriminative features. We further propose a dynamic programming (DP) approach and a greedy search (GS) approach to achieve DROBA. Experiments with DROBA on the FVC2000 fingerprint database and the FRGC face database show good performance. As a universal method, DROBA is applicable to arbitrary biometric modalities, such as fingerprint texture, iris, signature, and face. DROBA will bring significant benefits not only to template protection systems but also to systems with fast matching requirements or constrained storage capability.
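
    The allocation principle can be sketched with a greedy pass that always spends the next bit where the overall detection rate (assumed here to factor over features) shrinks least; this is a simplified stand-in for the paper's DP and GS procedures, with an invented rate model:

```python
# Greedy bit allocation: assuming the overall detection rate is the product
# of per-feature rates, each new bit goes to the feature whose rate ratio
# rate(f, b+1) / rate(f, b) is largest, i.e. whose rate suffers least.

def droba_greedy(rate, n_features, total_bits, max_bits):
    """rate(f, b): probability feature f is reproduced correctly at b bits."""
    alloc = [0] * n_features
    for _ in range(total_bits):
        candidates = [f for f in range(n_features) if alloc[f] < max_bits]
        best_f = max(candidates,
                     key=lambda f: rate(f, alloc[f] + 1) / rate(f, alloc[f]))
        alloc[best_f] += 1
    return alloc
```

    With a discriminative feature whose rate decays slowly, the sketch reproduces the DROBA behavior described above: that feature absorbs the bit budget first.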

  14. Validation of qPCR Methods for the Detection of Mycobacterium in New World Animal Reservoirs.

    Directory of Open Access Journals (Sweden)

    Genevieve Housman

    2015-11-01

    Full Text Available Zoonotic pathogens that cause leprosy (Mycobacterium leprae) and tuberculosis (Mycobacterium tuberculosis complex, MTBC) continue to impact modern human populations. Therefore, methods able to survey mycobacterial infection in potential animal hosts are necessary for proper evaluation of human exposure threats. Here we tested for mycobacteria-specific single- and multi-copy loci using qPCR. In a trial study in which armadillos were artificially infected with M. leprae, these techniques were specific and sensitive to pathogen detection, while more traditional ELISAs were only specific. These assays were then employed in a case study to detect M. leprae as well as MTBC in wild marmosets. All marmosets were negative for M. leprae DNA, but 14 were positive in the mycobacterial rpoB gene assay. Targeted capture and sequencing of rpoB and other MTBC genes validated the presence of mycobacterial DNA in these samples and revealed that qPCR is useful for identifying mycobacteria-infected animal hosts.

  15. The potential of GRACE gravimetry to detect the heavy rainfall-induced impoundment of a small reservoir in the upper Yellow River

    Science.gov (United States)

    Yi, Shuang; Song, Chunqiao; Wang, Qiuyu; Wang, Linsong; Heki, Kosuke; Sun, Wenke

    2017-08-01

    Artificial reservoirs are important indicators of anthropogenic impacts on environments, and their cumulative influences on the local water storage will change the gravity signal. However, because of their small signal size, such gravity changes are seldom studied using satellite gravimetry from the Gravity Recovery and Climate Experiment (GRACE). Here we investigate the ability of GRACE to detect water storage changes in the Longyangxia Reservoir (LR), which is situated in the upper main stem of the Yellow River. Three different GRACE solutions from the CSR, GFZ, and JPL with three different processing filters are compared here. We find that heavy precipitation in the summer of 2005 caused the LR water storage to increase by 37.9 m in height, which is equivalent to 13.0 Gt in mass, and that the CSR solutions with a DDK4 filter show the best performance in revealing the synthetic gravity signals. We also obtain 109 pairs of reservoir inundation area measurements from satellite imagery and water level changes from laser altimetry and in situ observations to derive the area-height ratios for the LR. The root mean square of GRACE series in the LR is reduced by 39% after removing synthetic signals caused by mass changes in the LR or by 62% if the GRACE series is further smoothed. We conclude that GRACE data show promising potential in detecting water storage changes in this ∼400 km² reservoir and that a small signal size is not a restricting factor for detection using GRACE data.

  16. Ducks as a potential reservoir for Pasteurella multocida infection detected using a new rOmpH-based ELISA.

    Science.gov (United States)

    Liu, Rongchang; Chen, Cuiteng; Cheng, Longfei; Lu, Ronghui; Fu, Guanghua; Shi, Shaohua; Chen, Hongmei; Wan, Chunhe; Lin, Jiansheng; Fu, Qiuling; Huang, Yu

    2017-07-28

    Pasteurella multocida is an important pathogen of numerous domestic poultry and wild animals and is associated with a variety of diseases including fowl cholera. The aim of this study was to develop an indirect enzyme-linked immunosorbent assay (ELISA) based on recombinant outer-membrane protein H (rOmpH) for detection of anti-P. multocida antibodies in serum to determine their prevalence in Chinese ducks. The P. multocida ompH gene was cloned into pET32a, and rOmpH was expressed in Escherichia coli BL21 (DE3). Western blotting revealed that purified rOmpH was recognized by duck antisera against P. multocida, and an indirect ELISA was established. During analysis of serum samples (n=115) from ducks, the rOmpH ELISA showed 95.0% specificity, 100% sensitivity and a κ coefficient of 0.92 (95% confidence interval 0.844-0.997) as compared with a microtiter agglutination test. Among 165 randomly selected serum samples, which were collected in 2015 and originated from six duck farms across Fujian Province, China, anti-P. multocida antibodies were detected in 22.42% of apparently healthy ducks, including 25 of 90 sheldrakes (27.8%), eight of 50 Peking ducks (16.0%) and four of 25 Muscovy ducks (16.0%). Overall, the data suggest that rOmpH is a suitable candidate antigen for the development of an indirect ELISA for detection of P. multocida in ducks; moreover, our results showed that ducks could serve as a potential reservoir for P. multocida infection.

  17. A rationale for reservoir management economics

    International Nuclear Information System (INIS)

    Hickman, T.S.

    1995-01-01

    Significant economic benefits can be derived from the application of reservoir management. The key elements in economical reservoir management are the efficient use of available resources and optimization of reservoir exploitation through a multidisciplinary approach. This paper describes various aspects of and approaches to reservoir management and provides case histories that support the findings.

  18. AMICO: optimized detection of galaxy clusters in photometric surveys

    Science.gov (United States)

    Bellagamba, Fabio; Roncarelli, Mauro; Maturi, Matteo; Moscardini, Lauro

    2018-02-01

    We present Adaptive Matched Identifier of Clustered Objects (AMICO), a new algorithm for the detection of galaxy clusters in photometric surveys. AMICO is based on the Optimal Filtering technique, which makes it possible to maximize the signal-to-noise ratio (S/N) of the clusters. In this work, we focus on the new iterative approach to the extraction of cluster candidates from the map produced by the filter. In particular, we provide a definition of membership probability for the galaxies close to any cluster candidate, which allows us to remove its imprint from the map, enabling the detection of smaller structures. As demonstrated in our tests, this method allows the deblending of close-by and aligned structures in more than 50 per cent of the cases for objects at a radial distance equal to 0.5 × R200 or a redshift distance equal to 2 × σz, where σz is the typical uncertainty of photometric redshifts. Running AMICO on mocks derived from N-body simulations and semi-analytical modelling of galaxy evolution, we obtain a consistent mass-amplitude relation through the redshift range starting at 0.3, with a slope of ∼0.55 and a logarithmic scatter of ∼0.14. The fraction of false detections decreases steeply with S/N and is negligible at S/N > 5.

  19. Optimization of the digital Silicon Photomultiplier for Cherenkov light detection

    International Nuclear Information System (INIS)

    Frach, T

    2012-01-01

    The Silicon Photomultiplier is a promising alternative to fast vacuum photodetectors. We developed a fully digital implementation of the Silicon Photomultiplier. The sensor is based on a single photon avalanche photodiode (SPAD) integrated in a standard CMOS process. Photons are detected directly by sensing the voltage at the SPAD anode using a dedicated cell electronics block next to each diode. This block also contains active quenching and recharge circuits as well as a one bit memory for the selective inhibit of detector cells. A balanced trigger network is used to propagate the trigger signal from all cells to the integrated time-to-digital converter. Photons are detected and counted as digital signals, thus making the sensor less susceptible to temperature variations and electronic noise. The integration with CMOS logic has the added benefit of low power consumption and possible integration of data post-processing in the sensor. In this paper, we discuss the sensor architecture together with its characteristics, and its possible optimizations for applications requiring the detection of Cherenkov light.

  20. Towards Optimal Event Detection and Localization in Acyclic Flow Networks

    KAUST Repository

    Agumbe Suresh, Mahima

    2012-01-03

    Acyclic flow networks, present in many infrastructures of national importance (e.g., oil & gas and water distribution systems), have been attracting immense research interest. Existing solutions for detecting and locating attacks against these infrastructures have proven costly and imprecise, especially when dealing with large-scale distribution systems. In this paper, to the best of our knowledge for the first time, we investigate how mobile sensor networks can be used for optimal event detection and localization in acyclic flow networks. Sensor nodes move along the edges of the network and detect events (i.e., attacks) and proximity to beacon nodes with known placement in the network. We formulate the problem of minimizing the cost of the monitoring infrastructure (i.e., minimizing the number of sensor and beacon nodes deployed), while ensuring a degree of sensing coverage in a zone of interest and a required accuracy in locating events. We propose algorithms for solving these problems and demonstrate their effectiveness with results obtained from a high-fidelity simulator.
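
    The minimum-deployment objective is a set-cover-style problem; a greedy sketch with hypothetical site and edge identifiers, ignoring the sensing-accuracy constraint, might look like:

```python
# Greedy set cover for beacon placement: repeatedly pick the candidate site
# covering the most still-uncovered edges of the zone of interest.
# Site names and edge ids are hypothetical.

def greedy_beacon_placement(coverage, required):
    """coverage: site -> set of edges it can localize events on."""
    chosen, uncovered = [], set(required)
    while uncovered:
        site = max(coverage, key=lambda s: len(coverage[s] & uncovered))
        gained = coverage[site] & uncovered
        if not gained:
            raise ValueError("zone of interest cannot be fully covered")
        chosen.append(site)
        uncovered -= gained
    return chosen
```

    The greedy rule gives the classical logarithmic approximation guarantee for set cover, a common baseline against which specialized placement algorithms are compared.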

  1. Studies on the Optimal behavior of Energy Storage in Reservoirs of a Hydroelectric system; Estudios sobre el comportamiento optimo del almacenamiento de energia en embalses de sistema hidroelectrico

    Energy Technology Data Exchange (ETDEWEB)

    Macedo Faria, Breno; Franco Barbosa, Paulo Sergio [Universidad Estatal de Campinas (Brazil)

    2002-09-01

    This work studies the results of an optimization model applied to the Paranaiba river basin, Brazil. The system is formed by the confluence of three rivers in a region with a well-defined seasonal hydrological behavior. The ratio between the total energy stored in the system and the active storage of each reservoir is evaluated from the optimal operational results. This relationship makes it possible to recognize systematic patterns in the relative use of each reservoir compared to the system as a whole. The main parameters that define reservoir behavior are identified, highlighting the position of the power station in the cascade, the relationship between river flow and active storage, and the installed capacity of the power station. In addition, the hydrological scenario is another factor that determines the relative use of the reservoirs.

  2. Use of high-resolution satellite images for detection of geothermal reservoirs

    Science.gov (United States)

    Arellano-Baeza, A. A.

    2012-12-01

    Chile has enormous potential to use geothermal resources for electric energy generation. The main geothermal fields are located in the Central Andean Volcanic Chain in the north, between the Central Valley and the border with Argentina in the center of the country, and in the Liquiñe-Ofqui fault system in the south. High-resolution images from the LANDSAT and ASTER satellites have been used to delineate the geological structures related to the Calerias geothermal field, located at the northern end of the Southern Volcanic Zone of Chile, and the Puchuldiza geothermal field, located in the Region of Tarapaca. This was done by applying the lineament extraction technique developed by the author. These structures were compared with the distribution of the main geological structures mapped in the fields. It was found that lineament density increases in areas of major heat flux, indicating that lineament analysis could be a powerful tool for detecting faults and joint zones associated with geothermal fields.

  3. Optimization of Second Fault Detection Thresholds to Maximize Mission POS

    Science.gov (United States)

    Anzalone, Evan

    2018-01-01

    both magnitude and time. As such, the Navigation team is taking advantage of the INS's capability to schedule and change fault detection thresholds in flight. These values are optimized along a nominal trajectory in order to maximize the probability of mission success and reduce the probability of false positives (defined as cases where the INS would report a second fault condition resulting in loss of mission, even though the vehicle would still meet insertion requirements within system-level margins). This paper describes an optimization approach using Genetic Algorithms to tune the threshold parameters to maximize vehicle resilience to second fault events, as a function of potential fault magnitude and time of fault, over an ascent mission profile. The analysis approach and a performance assessment of the results are presented to demonstrate the applicability of this process to second fault detection to maximize mission probability of success.
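The genetic-algorithm threshold tuning described above can be sketched as follows. The real vehicle and fault models are not public, so the fitness function here is an invented stand-in that rewards detection while penalizing false alarms; only the GA mechanics (selection, crossover, mutation) are illustrative:

```python
import random

random.seed(1)

# Sketch of a genetic algorithm tuning a single fault-detection threshold t.
# fitness() is a made-up proxy for "probability of mission success": it peaks
# for a moderate threshold (good detection) and penalizes low thresholds
# (more false positives). The actual mission models are assumptions here.
def fitness(t):
    detection = 1.0 - abs(t - 3.0) / 10.0   # best detection near t = 3 sigma
    false_pos = 0.5 / (1.0 + t)             # false alarms drop as t grows
    return detection - false_pos

def evolve(pop_size=30, generations=60):
    pop = [random.uniform(0.5, 8.0) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]           # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = 0.5 * (a + b) + random.gauss(0.0, 0.1)  # crossover + mutation
            children.append(min(8.0, max(0.5, child)))
        pop = parents + children                 # elitism: parents survive
    return max(pop, key=fitness)

best_t = evolve()
```

In the paper's setting the chromosome would hold a threshold schedule (values per flight phase) rather than a single scalar, and the fitness would come from Monte Carlo runs of the ascent profile.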

  4. Optimization of multi-reservoir operation with a new hedging rule: application of fuzzy set theory and NSGA-II

    Science.gov (United States)

    Ahmadianfar, Iman; Adib, Arash; Taghian, Mehrdad

    2017-10-01

    Reservoir hedging rule curves are used to avoid severe water shortage during drought periods. In this method, reservoir storage is divided into several zones, and the rationing factors change immediately when the water storage level moves from one zone to another. In the present study, a hedging rule with fuzzy rationing factors was applied to create a transition zone above and below each rule curve, within which the rationing factor changes gradually. For this purpose, a monthly simulation model was developed and linked to the non-dominated sorting genetic algorithm to calculate the modified shortage index of two objective functions, involving water supply for minimum flow and agricultural demands, over a long-term simulation period. The Zohre multi-reservoir system in southern Iran was considered as a case study. The proposed hedging rule improved long-term system performance by 10 to 27 percent in comparison with the simple hedging rule, demonstrating that fuzzification of the hedging factors increases the applicability and efficiency of the new rule relative to the conventional rule curve for mitigating water shortage.
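The core idea, a rationing factor that transitions smoothly inside a band around the rule curve instead of jumping at it, can be sketched as follows (band width and factor values are invented for illustration, not taken from the paper):

```python
# Sketch of a fuzzy hedging rule (parameter values invented): instead of a
# rationing factor that jumps when storage crosses a rule curve, interpolate
# it linearly inside a transition band around the curve.
def rationing_factor(storage, curve, band=0.1, low=0.6, high=1.0):
    """Fraction of demand released, given storage and a rule-curve level.

    Below curve - band  -> hedge hard (factor = low).
    Above curve + band  -> release full demand (factor = high).
    In between          -> linear (fuzzy) transition, no sudden jump.
    """
    if storage <= curve - band:
        return low
    if storage >= curve + band:
        return high
    frac = (storage - (curve - band)) / (2.0 * band)
    return low + frac * (high - low)

# A crisp rule would jump from 0.6 to 1.0 exactly at the curve; the fuzzy
# rule passes smoothly through the midpoint instead.
mid = rationing_factor(0.5, curve=0.5)
```

In the study, NSGA-II searches over the rule-curve and factor parameters, with the monthly simulation model scoring each candidate via the modified shortage index.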

  5. Optimization and modification of the method for detection of rhamnolipids

    Directory of Open Access Journals (Sweden)

    Takeshi Tabuchi

    2015-10-01

    The use of biosurfactants in bioremediation facilitates and accelerates microbial degradation of hydrocarbons. The CTAB/MB agar method created by Siegmund & Wagner for screening rhamnolipid (RL)-producing strains has been widely used but has not been improved significantly in more than 20 years. To optimize the technique as a quantitative method, CTAB/MB agar plates were made and different variables were tested, such as incubation time, cooling, CTAB concentration, methylene blue presence, well diameter and inoculum volume. Furthermore, a new method for RL detection within halos was developed: precipitation of RL with HCl allows the formation of a new halo pattern that is easier to observe and measure. This research reaffirms that the method is not fully suitable for fine quantitative analysis, because of the difficulty of accurately correlating RL concentration with halo area. RL diffusion does not appear to follow a simple behavior, and many factors affect the RL migration rate.

  6. Optimization of Xenon Biosensors for Detection of Protein Interactions

    International Nuclear Information System (INIS)

    Lowery, Thomas J.; Garcia, Sandra; Chavez, Lana; Ruiz, E.Janette; Wu, Tom; Brotin, Thierry; Dutasta, Jean-Pierre; King, David S.; Schultz, Peter G.; Pines, Alex; Wemmer, David E.

    2005-08-01

    Hyperpolarized 129Xe NMR can detect the presence of specific low-concentration biomolecular analytes by means of the xenon biosensor, which consists of a water-soluble, targeted cryptophane-A cage that encapsulates xenon. In this work we use the prototypical biotinylated xenon biosensor to determine the relationship between the molecular composition of the xenon biosensor and the characteristics of protein-bound resonances. The effects of diastereomer overlap, dipole-dipole coupling, chemical shift anisotropy, xenon exchange, and biosensor conformational exchange on the protein-bound biosensor signal were assessed. It was found that optimal protein-bound biosensor signal can be obtained by minimizing the number of biosensor diastereomers and using a flexible linker of appropriate length. Both the linewidth and the sensitivity of chemical shift to protein binding of the xenon biosensor were found to be inversely proportional to linker length.

  7. Rural health centres, communities and malaria case detection in Zambia using mobile telephones: a means to detect potential reservoirs of infection in unstable transmission conditions.

    Science.gov (United States)

    Kamanga, Aniset; Moono, Petros; Stresman, Gillian; Mharakurwa, Sungano; Shiff, Clive

    2010-04-15

    Effective malaria control depends on timely acquisition of information on new cases, their location and their frequency, so as to deploy supplies, plan interventions or focus attention on specific locations appropriately to intervene and prevent an upsurge in transmission. The process is known as active case detection, but because the information is time sensitive, it is difficult to carry out. In Zambia, the rural health services are operating effectively and for the most part are provided with adequate supplies of rapid diagnostic tests (RDT) as well as effective drugs for the diagnosis and treatment of malaria. The tests are administered to all patients prior to treatment and appropriate records are kept. Data are obtained in a timely manner, and distribution of this information is important for the effective management of malaria control operations. The work reported here involves using positive diagnoses in rural health centres (passive case detection) to help detect potential outbreaks of malaria and target interventions to foci where parasite reservoirs are likely to occur. Twelve rural health centres in the Choma and Namwala Districts were recruited to send weekly information on rapid malaria tests used and the number of positive diagnoses to the Malaria Institute at Macha by mobile telephone SMS. Data were entered in Excel, expressed as the number of cases per rural health centre, and distributed weekly to interested parties. These data from each of the health centres, which were mapped using geographical positioning system (GPS) coordinates, were used in a time-sensitive manner to plot the patterns of malaria case detection in the vicinity of each location. The data were passed on to the appropriate authorities. The seasonal pattern of malaria transmission associated with local ecological conditions can be seen in the distribution of cases diagnosed. Adequate supplies of RDT are essential in health centres and the system can be expanded throughout the

  8. Software Piracy Detection Model Using Ant Colony Optimization Algorithm

    Science.gov (United States)

    Astiqah Omar, Nor; Zakuan, Zeti Zuryani Mohd; Saian, Rizauddin

    2017-06-01

    The Internet enables information to be accessed anytime and anywhere. This creates an environment in which information can easily be copied. Easy access to the Internet is one of the factors that contributes to piracy in Malaysia, as in the rest of the world. The 2013 BSA Global Software Survey (Compliance Gap) found that 43 percent of the software installed on PCs around the world was not properly licensed; the commercial value of these unlicensed installations worldwide was reported to be 62.7 billion dollars. Piracy can happen anywhere, including universities. Malaysia, like other countries, faces piracy committed by university students. Piracy in universities concerns the theft of intellectual property. It can take the form of software piracy, music piracy, movie piracy, and piracy of intellectual materials such as books, articles and journals. This puts the owners' intellectual property in jeopardy. This study developed a classification model for detecting software piracy. The model was developed using a swarm intelligence algorithm, the Ant Colony Optimization algorithm. The training data were collected in a study conducted at Universiti Teknologi MARA (Perlis). Experimental results show that the model's detection accuracy is better than that of the J48 algorithm.

  9. APPLICATION OF INTEGRATED RESERVOIR MANAGEMENT AND RESERVOIR CHARACTERIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Jack Bergeron; Tom Blasingame; Louis Doublet; Mohan Kelkar; George Freeman; Jeff Callard; David Moore; David Davies; Richard Vessell; Brian Pregger; Bill Dixon; Bryce Bezant

    2000-03-01

    Reservoir performance and characterization are vital parameters during the development phase of a project. Infill drilling of wells on a uniform spacing, without regard to characterization, does not optimize development because it fails to account for the complex reservoir heterogeneities present in many low permeability reservoirs, especially carbonate reservoirs. These reservoirs are typically characterized by: (1) large, discontinuous pay intervals; (2) vertical and lateral changes in reservoir properties; (3) low reservoir energy; (4) high residual oil saturation; and (5) low recovery efficiency. Operational problems encountered in these types of reservoirs include: (1) poor or inadequate completions and stimulations; (2) early water breakthrough; (3) poor reservoir sweep efficiency in contacting oil throughout the reservoir as well as in near-well regions; (4) channeling of injected fluids due to preferential fracturing caused by excessive injection rates; and (5) limited data availability and poor data quality. Infill drilling operations need only target areas of the reservoir that will be economically successful. If the most productive areas of a reservoir can be accurately identified by combining the results of geological, petrophysical, reservoir performance, and pressure transient analyses, then this "integrated" approach can be used to optimize reservoir performance during secondary and tertiary recovery operations without resorting to "blanket" infill drilling methods. New and emerging technologies such as geostatistical modeling, rock typing, and rigorous decline type curve analysis can be used to quantify reservoir quality and the degree of interwell communication. These results can then be used to develop a 3-D simulation model for predicting infill locations. The application of reservoir surveillance techniques to identify additional reservoir "pay" zones

  10. Encapsulated microsensors for reservoir interrogation

    Science.gov (United States)

    Scott, Eddie Elmer; Aines, Roger D.; Spadaccini, Christopher M.

    2016-03-08

    In one general embodiment, a system includes at least one microsensor configured to detect one or more conditions of a fluidic medium of a reservoir, and a receptacle, wherein the receptacle encapsulates the at least one microsensor. In another general embodiment, a method includes injecting the encapsulated at least one microsensor, as recited above, into a fluidic medium of a reservoir, and detecting one or more conditions of the fluidic medium of the reservoir.

  11. Fortescue reservoir development and reservoir studies

    Energy Technology Data Exchange (ETDEWEB)

    Henzell, S.T.; Hicks, G.J.; Horden, M.J.; Irrgang, H.R.; Janssen, E.J.; Kable, C.W.; Mitchell, R.A.H.; Morrell, N.W.; Palmer, I.D.; Seage, N.W.

    1985-03-01

    The Fortescue field in the Gippsland Basin, offshore southeastern Australia is being developed from two platforms (Fortescue A and Cobia A) by Esso Australia Ltd. (operator) and BHP Petroleum. The Fortescue reservoir is a stratigraphic trap at the top of the Latrobe Group of sediments. It overlies the western flank of the Halibut and Cobia fields and is separated from them by a non-net sequence of shales and coals which form a hydraulic barrier between the two systems. Development drilling into the Fortescue reservoir commenced in April 1983 with production coming onstream in May 1983. Fortescue, with booked reserves of 44 stock tank gigalitres (280 million stock tank barrels) of 43° API oil, is the seventh major oil reservoir to be developed in the offshore Gippsland Basin by Esso/BHP. In mid-1984, after drilling a total of 20 exploration and development wells, and after approximately one year of production, a detailed three-dimensional, two-phase reservoir simulation study was performed to examine the recovery efficiency, drainage patterns, pressure performance and production rate potential of the reservoir. The model was validated by history matching an extensive suite of Repeat Formation Test (RFT) pressure data. The results confirmed the reserves basis, and demonstrated that the ultimate oil recovery from the reservoir is not sensitive to production rate. This result is consistent with studies on other high quality Latrobe Group reservoirs in the Gippsland Basin which contain undersaturated crudes and receive very strong water drive from the Basin-wide aquifer system. With the development of the simulation model during the development phase, it has been possible to more accurately define the optimal well pattern for the remainder of the development.

  12. Optimal Background Attenuation for Fielded Radiation Detection Systems

    International Nuclear Information System (INIS)

    Robinson, Sean M.; Kaye, William R.; Schweppe, John E.; Siciliano, Edward R.

    2006-01-01

    Radiation detectors are often placed in positions that are difficult to shield from the effects of terrestrial background. This is particularly true of Radiation Portal Monitor (RPM) systems, as their wide viewing angle and outdoor installations make them susceptible to terrestrial background from the surrounding area. A low background is desired in most cases, especially when the background noise is of comparable strength to the signal of interest. The problem of shielding a generalized RPM from terrestrial background is considered. Various detector and shielding scenarios are modeled with the Monte Carlo N-Particle (MCNP) computer code. The amounts of nominal-density shielding needed to attenuate the terrestrial background to varying degrees are given, along with optimal shielding geometries for areas where natural shielding is limited and where radiation detection must occur in the presence of natural background. Common shielding solutions such as steel plating are evaluated based on the signal-to-noise ratio, and the benefits are weighed against the incremental cost.
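A back-of-the-envelope version of the shielding question can be worked with the simple narrow-beam attenuation law; the attenuation coefficient below is an assumed round number for ~1 MeV gammas in steel, for illustration only (the report's MCNP results account for geometry and scatter that this law ignores):

```python
import math

# Narrow-beam attenuation: I = I0 * exp(-mu * x). Solving for the thickness x
# that reduces intensity to a given fraction of I0. MU_STEEL is an assumed
# linear attenuation coefficient for ~1 MeV gammas in steel (illustrative).
MU_STEEL = 0.47  # cm^-1, assumed value

def thickness_for_fraction(fraction, mu=MU_STEEL):
    """Steel thickness (cm) that attenuates intensity to `fraction` of I0."""
    return -math.log(fraction) / mu

half_value_layer = thickness_for_fraction(0.5)   # thickness halving the flux
tenth_value_layer = thickness_for_fraction(0.1)  # thickness for a 10x cut
```

This kind of estimate gives a first guess at plate thickness; the paper's point is that full transport modeling (MCNP) and a signal-to-noise criterion are needed to decide whether the incremental cost of more shielding is worthwhile.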

  13. A multiscale optimization approach to detect exudates in the macula.

    Science.gov (United States)

    Agurto, Carla; Murray, Victor; Yu, Honggang; Wigdahl, Jeffrey; Pattichis, Marios; Nemeth, Sheila; Barriga, E Simon; Soliz, Peter

    2014-07-01

    Pathologies that occur on or near the fovea, such as clinically significant macular edema (CSME), represent a high risk for vision loss. The presence of exudates, lipid residues of serous leakage from damaged capillaries, has been associated with CSME, in particular when they are located within one optic disc diameter of the fovea. In this paper, we present an automatic system to detect exudates in the macula. Our approach uses optimal thresholding of instantaneous amplitude (IA) components that are extracted from multiple frequency scales to generate candidate exudate regions. For each candidate region, we extract color, shape, and texture features that are used for classification. Classification is performed using partial least squares (PLS). We tested the performance of the system on two different databases of 652 and 400 images. The system achieved an area under the receiver operating characteristic curve (AUC) of 0.96 for the combination of both databases and an AUC of 0.97 for each of them when they were evaluated independently.
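The "optimal thresholding" step can be illustrated with a generic Otsu-style threshold on a toy amplitude array. This is a stand-in, not the authors' exact method: their pipeline thresholds instantaneous-amplitude components at multiple frequency scales and then classifies candidates with PLS on color/shape/texture features.

```python
# Otsu's method: pick the threshold that maximizes the between-class variance
# of a histogram, splitting "background" from "candidate exudate" amplitudes.
def otsu_threshold(values, bins=64):
    lo, hi = min(values), max(values)
    width = (hi - lo) / bins
    hist = [0] * bins
    for v in values:
        hist[min(int((v - lo) / width), bins - 1)] += 1
    total = len(values)
    total_sum = sum((lo + (i + 0.5) * width) * h for i, h in enumerate(hist))
    best_t, best_var = lo, -1.0
    cum_n, cum_sum = 0, 0.0
    for i in range(bins - 1):
        cum_n += hist[i]
        cum_sum += (lo + (i + 0.5) * width) * hist[i]
        if cum_n == 0 or cum_n == total:
            continue
        w0, w1 = cum_n / total, 1 - cum_n / total
        mu0 = cum_sum / cum_n
        mu1 = (total_sum - cum_sum) / (total - cum_n)
        between = w0 * w1 * (mu0 - mu1) ** 2  # between-class variance
        if between > best_var:
            best_var, best_t = between, lo + (i + 1) * width
    return best_t

# Dark background pixels around 0.1, bright exudate-like pixels around 0.9.
sample = [0.1] * 80 + [0.12] * 40 + [0.88] * 15 + [0.9] * 10
t = otsu_threshold(sample)
```

On this synthetic bimodal sample the threshold lands between the two clusters, so only the bright "exudate-like" values survive as candidates.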

  14. Rural health centres, communities and malaria case detection in Zambia using mobile telephones: a means to detect potential reservoirs of infection in unstable transmission conditions

    Directory of Open Access Journals (Sweden)

    Kamanga Aniset

    2010-04-01

    Background: Effective malaria control depends on timely acquisition of information on new cases, their location and their frequency, so as to deploy supplies, plan interventions or focus attention on specific locations appropriately to intervene and prevent an upsurge in transmission. The process is known as active case detection, but because the information is time sensitive, it is difficult to carry out. In Zambia, the rural health services are operating effectively and for the most part are provided with adequate supplies of rapid diagnostic tests (RDT) as well as effective drugs for the diagnosis and treatment of malaria. The tests are administered to all patients prior to treatment and appropriate records are kept. Data are obtained in a timely manner, and distribution of this information is important for the effective management of malaria control operations. The work reported here involves using positive diagnoses in rural health centres (passive case detection) to help detect potential outbreaks of malaria and target interventions to foci where parasite reservoirs are likely to occur. Methods: Twelve rural health centres in the Choma and Namwala Districts were recruited to send weekly information on rapid malaria tests used and the number of positive diagnoses to the Malaria Institute at Macha by mobile telephone SMS. Data were entered in Excel, expressed as the number of cases per rural health centre, and distributed weekly to interested parties. Results: These data from each of the health centres, which were mapped using geographical positioning system (GPS) coordinates, were used in a time-sensitive manner to plot the patterns of malaria case detection in the vicinity of each location. The data were passed on to the appropriate authorities. The seasonal pattern of malaria transmission associated with local ecological conditions can be seen in the distribution of cases diagnosed. Conclusions: Adequate supplies of RDT are essential in

  15. Real-time detection of dielectric anisotropy or isotropy in unconventional oil-gas reservoir rocks supported by the oblique-incidence reflectivity difference technique.

    Science.gov (United States)

    Zhan, Honglei; Wang, Jin; Zhao, Kun; Lü, Huibin; Jin, Kuijuan; He, Liping; Yang, Guozhen; Xiao, Lizhi

    2016-12-15

    Current geological extraction theory and techniques are too limited to adequately characterize unconventional oil-gas reservoirs, because of the considerable complexity of the geological structures. Optical measurement has the advantage of not interfering with the Earth's magnetic field and is often useful for detecting various physical properties. One key parameter that can be detected optically is the dielectric permittivity, which reflects mineral and organic properties. Here we report an oblique-incidence reflectivity difference (OIRD) technique that is sensitive to dielectric and surface properties and can be applied to the characterization of reservoir rocks, such as shale and sandstone core samples extracted from the subsurface. The layered distribution of dielectric properties in shales and the uniform distribution in sandstones are clearly identified in the OIRD signals. In shales, micro-cracks and particle orientation produce directional changes in the dielectric and surface properties, so the isotropy or anisotropy of the rock can be characterized by OIRD. As the dielectric and surface properties are closely related to the hydrocarbon-bearing features of oil-gas reservoirs, we believe that precise measurements carried out with OIRD can help improve recovery efficiency in the well-drilling process.

  16. Optimal detection of burst events in gravitational wave interferometric observatories

    International Nuclear Information System (INIS)

    Vicere, Andrea

    2002-01-01

    We consider the problem of detecting a burst signal of unknown shape in the data from gravitational wave interferometric detectors. We introduce a statistic which generalizes the excess power statistic proposed first by Flanagan and Hughes, and then extended by Anderson et al. to the multiple detector case. The statistic that we propose is shown to be optimal for an arbitrary noise spectral characteristic, under the two hypotheses that the noise is Gaussian, albeit colored, and that the prior for the signal is uniform. The statistic derivation is based on the assumption that a signal affects only N parallel samples in the data stream, but that no other information is a priori available, and that the value of the signal at each sample can be arbitrary. This is the main difference from previous works, where different assumptions were made, such as a signal distribution uniform with respect to the metric induced by the (inverse) noise correlation matrix. The two choices are equivalent if the noise is white, and in that limit the two statistics do indeed coincide. In the general case, we believe that the statistic we propose may be more appropriate, because it does not impose the noise characteristics of the detector on the assumed distribution of the gravitational wave signal. Moreover, we show that the proposed statistic can be easily implemented in its exact form, combining standard time-series analysis tools which can be efficiently implemented. We generalize this version of an excess power statistic to the multiple detector case, considering first a noise uncorrelated among the different instruments, and then including the effect of correlated noise. We discuss exact and approximate forms of the statistic; the choice depends on the characteristics of the noise and on the assumed length of the burst event. As an example, we show the sensitivity of the network of interferometers to a δ-function burst.
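In the white-noise limit mentioned in the abstract, the excess power statistic reduces to the summed squared power in N samples, which is chi-squared with N degrees of freedom under the noise-only hypothesis. A minimal sketch of that limit (synthetic data, crude constant-offset "burst"; the paper's colored-noise, multi-detector statistic is far more general):

```python
import random

random.seed(0)

# Excess power in the white-Gaussian-noise limit: E = sum (x_i / sigma)^2
# is chi-squared with N degrees of freedom when the data are noise only.
def excess_power(x, sigma=1.0):
    return sum((v / sigma) ** 2 for v in x)

N = 64
noise = [random.gauss(0.0, 1.0) for _ in range(N)]
burst = [v + 2.0 for v in noise]   # noise plus a crude constant-offset burst

E_noise = excess_power(noise)
E_burst = excess_power(burst)

# Under H0, E has mean N and standard deviation sqrt(2N); set a ~5-sigma
# detection threshold above the noise-only mean.
threshold = N + 5.0 * (2 * N) ** 0.5
```

Any burst shape depositing enough total power in the N samples pushes E above the threshold, which is exactly why the statistic needs no template for the waveform.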

  17. Optimal Sample Size for Probability of Detection Curves

    International Nuclear Information System (INIS)

    Annis, Charles; Gandossi, Luca; Martin, Oliver

    2012-01-01

    The use of Probability of Detection (POD) curves to quantify NDT reliability is common in the aeronautical industry, but relatively less so in the nuclear industry. The European Network for Inspection Qualification's (ENIQ) Inspection Qualification Methodology is based on the concept of Technical Justification, a document assembling all the evidence to assure that the NDT system in focus is indeed capable of finding the flaws for which it was designed. This methodology has become widely used in many countries, but the assurance it provides is usually of a qualitative nature. The need to quantify the output of inspection qualification has become more important, especially as structural reliability modelling and quantitative risk-informed in-service inspection methodologies become more widely used. To credit the inspections in structural reliability evaluations, a measure of NDT reliability is necessary. A POD curve provides such a metric. In 2010 ENIQ developed a technical report on POD curves, reviewing the statistical models used to quantify inspection reliability. Further work was subsequently carried out to investigate the issue of optimal sample size for deriving a POD curve, so that adequate guidance could be given to the practitioners of inspection reliability. Manufacturing of test pieces with cracks that are representative of real defects found in nuclear power plants (NPP) can be very expensive. Thus there is a tendency to reduce sample sizes and in turn reduce the conservatism associated with the POD curve derived. Not much guidance on the correct sample size can be found in the published literature, where often qualitative statements are given with no further justification. The aim of this paper is to summarise the findings of such work. (author)
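A common POD model behind such curves is a logistic function of log flaw size fitted to hit/miss inspection data. The sketch below fits that model by plain gradient ascent on the Bernoulli log-likelihood with invented toy data; real POD studies use full maximum likelihood with confidence bounds, and the sample-size question the paper addresses is precisely how tight those bounds can be:

```python
import math

# Hit/miss POD model: POD(a) = logistic(b0 + b1 * ln a).
sizes = [0.5, 0.8, 1.0, 1.5, 2.0, 3.0, 4.0, 6.0]   # flaw sizes, mm (toy data)
hits  = [0,   0,   1,   0,   1,   1,   1,   1  ]   # detected? (toy data)

def fit_pod(sizes, hits, lr=0.5, steps=2000):
    """Gradient ascent on the log-likelihood of the logistic POD model."""
    b0, b1 = 0.0, 0.0
    for _ in range(steps):
        g0 = g1 = 0.0
        for a, y in zip(sizes, hits):
            x = math.log(a)
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
            g0 += (y - p)        # d(log-lik)/d b0
            g1 += (y - p) * x    # d(log-lik)/d b1
        b0 += lr * g0 / len(sizes)
        b1 += lr * g1 / len(sizes)
    return b0, b1

b0, b1 = fit_pod(sizes, hits)

def pod(a):
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * math.log(a))))
```

With only eight specimens, the fitted curve is highly uncertain; quantifying that uncertainty as a function of the number of cracked test pieces is the guidance question the ENIQ work set out to answer.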

  18. Optimizing Urine Processing Protocols for Protein and Metabolite Detection.

    Science.gov (United States)

    Siddiqui, Nazema Y; DuBois, Laura G; St John-Williams, Lisa; Thompson, J Will; Grenier, Carole; Burke, Emily; Fraser, Matthew O; Amundsen, Cindy L; Murphy, Susan K

    In urine, factors such as the timing of voids and duration at room temperature (RT) may affect the quality of recovered protein and metabolite data. Additives may aid detection but can add complexity to sample collection or analysis. We aimed to identify the optimal processing protocol for clinically obtained urine samples that allows the highest protein and metabolite yields with minimal degradation. Healthy women provided multiple urine samples during the same day. Women collected their first morning (1st AM) void and another "random void". Random voids were aliquoted with: 1) no additive; 2) boric acid (BA); 3) protease inhibitor (PI); or 4) both BA + PI. Some of these aliquots were immediately stored at 4°C, and some were left at RT for 4 hours. Proteins and individual metabolites were quantified, normalized to creatinine concentrations, and compared across processing conditions. Sample pools corresponding to each processing condition were analyzed using mass spectrometry to assess protein degradation. Ten Caucasian women between 35 and 65 years of age provided paired 1st morning and random-void urine samples. Normalized protein concentrations were slightly higher in 1st AM voids than in random "spot" voids. The addition of BA did not significantly change protein yields, while PI significantly improved normalized protein concentrations, regardless of whether samples were immediately cooled or left at RT for 4 hours. In pooled samples, there were minimal differences in protein degradation under the various conditions tested. In metabolite analyses, there were significant differences in individual amino acids based on the timing of the void. For comparative translational research using urine, information about void timing should be collected and standardized. For urine samples processed on the same day, BA does not appear to be necessary, while the addition of PI enhances protein yields, regardless of 4°C or RT storage temperature.

  19. Application of advanced reservoir characterization, simulation, and production optimization strategies to maximize recovery in slope and basin clastic reservoirs, West Texas (Delaware Basin). Quarterly report, October 1 - December 31, 1996

    Energy Technology Data Exchange (ETDEWEB)

    Dutton, S.P.

    1997-01-01

    The objective of this project is to demonstrate that detailed reservoir characterization of slope and basin clastic reservoirs in sandstones of the Delaware Mountain Group in the Delaware Basin of West Texas and New Mexico is a cost effective way to recover a higher percentage of the original oil in place through strategic placement of infill wells and geologically based field development. Project objectives are divided into two major phases. The objectives of the reservoir characterization phase of the project are to provide a detailed understanding of the architecture and heterogeneity of two fields, the Ford Geraldine unit and Ford West field, which produce from the Bell Canyon and Cherry Canyon Formations, respectively, of the Delaware Mountain Group and to compare Bell Canyon and Cherry Canyon reservoirs. Reservoir characterization will utilize 3-D seismic data, high-resolution sequence stratigraphy, subsurface field studies, outcrop characterization, and other techniques. Once the reservoir-characterization study of both fields is completed, a pilot area of approximately 1 mi² in one of the fields will be chosen for reservoir simulation. The objectives of the implementation phase of the project are to (1) apply the knowledge gained from reservoir characterization and simulation studies to increase recovery from the pilot area, (2) demonstrate that economically significant unrecovered oil remains in geologically resolvable untapped compartments, and (3) test the accuracy of reservoir characterization and flow simulation as predictive tools in resource preservation of mature fields. A geologically designed, enhanced-recovery program (CO₂ flood, waterflood, or polymer flood) and well-completion program will be developed, and one to three infill wells will be drilled and cored. Technical progress is summarized for: geophysical characterization; reservoir characterization; outcrop characterization; and recovery technology identification and analysis.

  20. Optimal processor for malfunction detection in operating nuclear reactor

    International Nuclear Information System (INIS)

    Ciftcioglu, O.

    1990-01-01

    An optimal processor for diagnosing operational transients in a nuclear reactor is described. The basic design of the processor involves real-time processing of the noise signal obtained from a particular in-core sensor; optimality is defined as minimum alarm failure, in contrast to the minimum false alarm criterion, from the viewpoint of safe and reliable plant operation.
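The trade-off named above, minimizing alarm failure (missed malfunctions) rather than false alarms, can be shown with a toy Gaussian detection problem (all numbers invented; the paper's processor works on reactor noise signals, not this model):

```python
import math

# Illustration (invented numbers): "normal" signal ~ N(0,1), "malfunction"
# signal ~ N(3,1). A detection threshold t fixes both error rates:
#   false alarm:   P(x > t | normal)
#   alarm failure: P(x < t | malfunction)
# Optimizing for minimum alarm failure pushes t lower than a
# minimum-false-alarm design would allow.
def Q(z):  # Gaussian upper-tail probability via the complementary error function
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def false_alarm(t):
    return Q(t)

def alarm_failure(t, shift=3.0):
    return 1.0 - Q(t - shift)

# Highest threshold whose alarm-failure probability stays below 1%:
t = 3.0
while alarm_failure(t) > 0.01:
    t -= 0.01
```

The resulting threshold catches 99% of malfunctions but tolerates a sizable false-alarm rate, which is exactly the preference the abstract states for safe plant operation.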

  1. Well Test Analysis of Naturally Fractured Vuggy Reservoirs with an Analytical Triple Porosity – Double Permeability Model and a Global Optimization Method

    Directory of Open Access Journals (Sweden)

    Gómez Susana

    2014-07-01

    The aim of this work is to study the automatic characterization of Naturally Fractured Vuggy Reservoirs via well test analysis, using a triple porosity-double permeability model. The inter-porosity flow parameters, the storativity ratios, the permeability ratio, the wellbore storage effect, the skin and the total permeability are identified as parameters of the model. The well test interpretation is performed in Laplace space, using numerical algorithms to transfer the discrete real data, given in fully dimensional time, to Laplace space. The interpretation problem in Laplace space is posed as a nonlinear least squares optimization problem with box constraints and a linear inequality constraint, which is usually solved using local Newton-type methods with a trust region. However, local methods such as TRON, the one used in our work, or the well-known Levenberg-Marquardt method, are often unable to find an optimal solution with a good fit to the data. Moreover, well test analysis with the triple porosity-double permeability model, like most inverse problems, can yield multiple solutions that match the data well. To deal with these characteristics, we use a global optimization algorithm called the Tunneling Method (TM). In the design of the algorithm we account for specific features of the problem: the parameter estimation must be done with high precision, the measurements are noisy, and the problem must be solved computationally fast. We demonstrate that the TM is an efficient and robust alternative for the well test characterization, as several optimal solutions with a very good match to the data were obtained.
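The failure mode that motivates a global method can be shown on a toy two-parameter misfit with a spurious local minimum. The Tunneling Method itself is not reproduced here; a crude multistart local descent stands in for any global strategy, and the objective is wholly invented:

```python
import random

random.seed(7)

# Toy multimodal least-squares misfit: a good minimum near (1, 2) and a
# spurious local minimum (floor 1.0) near (3, 3). Plain local descent from a
# bad start lands in the trap; restarting from many points escapes it.
def misfit(k1, k2):
    good = (k1 - 1.0) ** 2 + (k2 - 2.0) ** 2
    trap = 1.0 + (k1 - 3.0) ** 2 + (k2 - 3.0) ** 2
    return min(good, trap)

def local_descent(k1, k2, step=0.1, iters=500):
    """Greedy coordinate descent with step halving (a stand-in local solver)."""
    for _ in range(iters):
        best = (misfit(k1, k2), k1, k2)
        for dk1, dk2 in [(step, 0), (-step, 0), (0, step), (0, -step)]:
            f = misfit(k1 + dk1, k2 + dk2)
            if f < best[0]:
                best = (f, k1 + dk1, k2 + dk2)
        if best[1] == k1 and best[2] == k2:
            step *= 0.5          # no improving neighbor: refine the step
        k1, k2 = best[1], best[2]
    return misfit(k1, k2), k1, k2

single = local_descent(3.2, 3.2)   # starts in the trap's basin, gets stuck
multi = min(local_descent(random.uniform(0, 4), random.uniform(0, 4))
            for _ in range(30))    # crude global strategy: random restarts
```

The Tunneling Method is more systematic than random restarts: after each local minimum it constructs a tunneling function that excludes minima at the current level and searches for a lower basin, which suits the paper's need for several distinct well-matching parameter sets.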

  2. The role of reservoir characterization in the reservoir management process (as reflected in the Department of Energy's reservoir management demonstration program)

    Energy Technology Data Exchange (ETDEWEB)

    Fowler, M.L. [BDM-Petroleum Technologies, Bartlesville, OK (United States); Young, M.A.; Madden, M.P. [BDM-Oklahoma, Bartlesville, OK (United States)] [and others]

    1997-08-01

    Optimum reservoir recovery and profitability result from guidance of reservoir practices by an effective reservoir management plan. Success in developing the most appropriate reservoir management plan requires knowledge and consideration of (1) the reservoir system, including rocks and rock-fluid interactions (i.e., a characterization of the reservoir), as well as wellbores and associated equipment and surface facilities; (2) the technologies available to describe, analyze, and exploit the reservoir; and (3) the business environment under which the plan will be developed and implemented. Reservoir characterization is essential for gaining the knowledge of the reservoir needed to build a reservoir management plan. Reservoir characterization efforts can be appropriately scaled by considering the reservoir management context under which the plan is being built. Reservoir management plans de-optimize with time as technology and the business environment change or as new reservoir information indicates that the reservoir characterization models on which the current plan is based are inadequate. BDM-Oklahoma and the Department of Energy have implemented a program of reservoir management demonstrations to encourage operators with limited resources and experience to learn, implement, and disseminate sound reservoir management techniques through cooperative research and development projects whose objective is to develop reservoir management plans. In each of the three projects currently underway, careful attention to the reservoir management context ensures a reservoir characterization approach that is sufficient, but not in excess of, what is necessary to devise and implement an effective reservoir management plan.

  3. Upconverting nanoparticles for optimizing scintillator based detection systems

    Science.gov (United States)

    Kross, Brian; McKisson, John E; McKisson, John; Weisenberger, Andrew; Xi, Wenze; Zom, Carl

    2013-09-17

    An upconverting device for a scintillation detection system is provided. The detection system comprises a scintillator material, a sensor, a light transmission path between the scintillator material and the sensor, and a plurality of upconverting nanoparticles positioned in the light transmission path.

  4. Optimizing detection and analysis of slow waves in sleep EEG.

    Science.gov (United States)

    Mensen, Armand; Riedner, Brady; Tononi, Giulio

    2016-12-01

    Analysis of individual slow waves in EEG recordings during sleep provides greater sensitivity and specificity than spectral power measures. However, parameters for detection and analysis have not been widely explored and validated. We present a new, open-source, Matlab-based toolbox for the automatic detection and analysis of slow waves, with adjustable parameter settings, as well as manual correction and exploration of the results using a multi-faceted visualization tool. We explore a large search space of parameter settings for slow wave detection and measure their effects on a selection of outcome parameters. Every choice of parameter setting had some effect on at least one outcome parameter. In general, the largest effect sizes were found when choosing the EEG reference, the type of canonical waveform, and the amplitude threshold. Previously published methods accurately detect large, global waves but are conservative and miss smaller-amplitude, local slow waves. The toolbox has additional benefits in terms of speed, user interface, and visualization options to compare and contrast slow waves. The exploration of parameter settings in the toolbox highlights the importance of careful selection of detection methods. The sensitivity and specificity of the automated detection can be improved by manually adding or deleting entire waves and/or specific channels using the toolbox visualization functions. The toolbox standardizes the detection procedure, sets the stage for reliable results and comparisons, and is easy to use without previous programming experience. Copyright © 2016 Elsevier B.V. All rights reserved.
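A common core of such detectors is an amplitude-and-duration criterion applied to negative half-waves between zero crossings. The sketch below is a minimal illustration of that idea only, not the toolbox's algorithm; the threshold and duration defaults are assumptions.

```python
import math

def detect_slow_waves(signal, fs, amp_thresh=-40.0, dur_range=(0.25, 1.0)):
    """Detect negative half-waves between a downward and an upward zero
    crossing whose trough is below amp_thresh (uV) and whose duration
    falls within dur_range (seconds)."""
    waves = []
    start = None
    for i in range(1, len(signal)):
        if signal[i - 1] >= 0 > signal[i]:            # downward zero crossing
            start = i
        elif signal[i - 1] < 0 <= signal[i] and start is not None:
            dur = (i - start) / fs                    # upward zero crossing
            trough = min(signal[start:i])
            if trough <= amp_thresh and dur_range[0] <= dur <= dur_range[1]:
                waves.append({"start": start, "end": i,
                              "trough": trough, "dur": dur})
            start = None
    return waves

if __name__ == "__main__":
    fs = 100
    # flat trace with one 0.5 s, 80 uV negative half-wave embedded
    sig = ([0.0] * 50
           + [-80.0 * math.sin(math.pi * n / 50) for n in range(50)]
           + [0.0] * 50)
    print(detect_slow_waves(sig, fs))
```

Varying `amp_thresh` here mirrors the amplitude-thresholding choice the abstract identifies as one of the largest sources of variation between detectors.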

  5. Particle Swarm Optimization approach to defect detection in armour ceramics.

    Science.gov (United States)

    Kesharaju, Manasa; Nagarajah, Romesh

    2017-03-01

    In this research, various extracted features were used in the development of an automated ultrasonic-sensor-based inspection system that enables defect classification in each ceramic component prior to despatch to the field. Classification is an important task, and the large number of irrelevant and redundant features commonly introduced into a dataset reduces a classifier's performance. Feature selection aims to reduce the dimensionality of the dataset while improving the performance of a classification system. In the context of a multi-criteria optimization problem (i.e., minimizing the classification error rate while reducing the number of features) such as the one discussed in this research, the literature suggests that evolutionary algorithms offer good results. Moreover, Particle Swarm Optimization (PSO) has been little explored in the field of classification of high-frequency ultrasonic signals. Hence, a binary-coded Particle Swarm Optimization (BPSO) technique is investigated for feature subset selection and for optimizing the classification error rate. In the proposed method, the population data is used as input to an Artificial Neural Network (ANN) based classification system to obtain the error rate, with the ANN serving as the evaluator of the PSO fitness function. Copyright © 2016. Published by Elsevier B.V.
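The BPSO wrapper idea can be sketched as follows. For brevity the ANN evaluator is replaced here by a nearest-centroid classifier, and the synthetic dataset, size penalty, and swarm parameters are all illustrative assumptions rather than the authors' setup.

```python
import math
import random

def nearest_centroid_accuracy(X, y, mask):
    # Training accuracy of a nearest-centroid classifier restricted to the
    # features where mask[j] == 1 (a lightweight stand-in for the ANN).
    feats = [j for j, m in enumerate(mask) if m]
    if not feats:
        return 0.0
    centroids = {}
    for c in set(y):
        pts = [X[i] for i in range(len(X)) if y[i] == c]
        centroids[c] = [sum(p[j] for p in pts) / len(pts) for j in feats]
    correct = 0
    for p, label in zip(X, y):
        pred = min(centroids, key=lambda c: sum(
            (p[j] - centroids[c][k]) ** 2 for k, j in enumerate(feats)))
        correct += (pred == label)
    return correct / len(X)

def fitness(X, y, mask, penalty=0.01):
    # Reward accuracy, lightly penalize subset size (the two criteria).
    return nearest_centroid_accuracy(X, y, mask) - penalty * sum(mask)

def bpso(X, y, n_feats, particles=20, iters=30, seed=0):
    rng = random.Random(seed)
    sigmoid = lambda v: 1.0 / (1.0 + math.exp(-v))
    pos = [[rng.randint(0, 1) for _ in range(n_feats)] for _ in range(particles)]
    vel = [[0.0] * n_feats for _ in range(particles)]
    pbest = [p[:] for p in pos]
    pfit = [fitness(X, y, p) for p in pos]
    g = max(range(particles), key=lambda i: pfit[i])
    gbest, gfit = pbest[g][:], pfit[g]
    for _ in range(iters):
        for i in range(particles):
            for j in range(n_feats):
                vel[i][j] = (0.7 * vel[i][j]
                             + 1.5 * rng.random() * (pbest[i][j] - pos[i][j])
                             + 1.5 * rng.random() * (gbest[j] - pos[i][j]))
                # binary PSO update: velocity sets the probability of a 1
                pos[i][j] = 1 if rng.random() < sigmoid(vel[i][j]) else 0
            f = fitness(X, y, pos[i])
            if f > pfit[i]:
                pbest[i], pfit[i] = pos[i][:], f
                if f > gfit:
                    gbest, gfit = pos[i][:], f
    return gbest, gfit

if __name__ == "__main__":
    rng = random.Random(1)
    X, y = [], []
    for i in range(40):
        label = i % 2
        # feature 0 is informative; features 1-5 are uniform noise
        X.append([rng.gauss(5.0 * label, 0.5)]
                 + [rng.uniform(0.0, 5.0) for _ in range(5)])
        y.append(label)
    print(bpso(X, y, n_feats=6))
```

With this construction the swarm should converge on masks that keep the single informative feature and drop most of the noise dimensions.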

  6. Near-Optimal Detection in MIMO Systems using Gibbs Sampling

    DEFF Research Database (Denmark)

    Hansen, Morten; Hassibi, Babak; Dimakis, Georgios Alexandros

    2009-01-01

    In this paper we study a Markov Chain Monte Carlo (MCMC) Gibbs sampler for solving the integer least-squares problem. In digital communication the problem is equivalent to performing Maximum Likelihood (ML) detection in Multiple-Input Multiple-Output (MIMO) systems. While the use of MCMC methods...... sampler provides a computationally efficient way of achieving approximate ML detection in MIMO systems having a huge number of transmit and receive dimensions. In fact, they further suggest that the Markov chain is rapidly mixing. Thus, it has been observed that even in cases where ML detection using, e...
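For intuition, the integer least-squares problem the sampler targets can be sketched on a toy antipodal (±1) constellation: sweep the coordinates, resample each bit from its conditional distribution under exp(-||y - Hx||²/(2T²)), and track the best state visited. The channel matrix, temperature, and sweep count below are illustrative assumptions, not the paper's setup.

```python
import math
import random

def gibbs_detect(H, y, n, sweeps=100, temp=1.0, seed=0):
    """Approximate ML detection: run a Gibbs chain over x in {-1,+1}^n
    targeting exp(-||y - Hx||^2 / (2*temp^2)), tracking the best state."""
    rng = random.Random(seed)

    def residual(x):
        return sum((y[i] - sum(H[i][j] * x[j] for j in range(n))) ** 2
                   for i in range(len(y)))

    x = [rng.choice((-1, 1)) for _ in range(n)]
    best, best_r = x[:], residual(x)
    for _ in range(sweeps):
        for j in range(n):
            x[j] = 1
            r_plus = residual(x)
            x[j] = -1
            r_minus = residual(x)
            # conditional probability that bit j is +1 given all other bits
            p_plus = 1.0 / (1.0 + math.exp((r_plus - r_minus) / (2.0 * temp ** 2)))
            x[j] = 1 if rng.random() < p_plus else -1
            r = r_plus if x[j] == 1 else r_minus
            if r < best_r:
                best, best_r = x[:], r
    return best, best_r

if __name__ == "__main__":
    x_true = [1, -1, 1, -1]
    H = [[1.0 if i == j else 0.2 for j in range(4)] for i in range(4)]
    y = [sum(H[i][j] * x_true[j] for j in range(4)) for i in range(4)]
    print(gibbs_detect(H, y, n=4))
```

The temperature controls the trade-off the abstract alludes to: low temperatures make the chain greedy, while moderate ones keep it mixing rapidly across the constellation.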

  7. Efficient optimal joint channel estimation and data detection for massive MIMO systems

    KAUST Repository

    Alshamary, Haider Ali Jasim

    2016-08-15

    In this paper, we propose an efficient optimal joint channel estimation and data detection algorithm for massive MIMO wireless systems. Our algorithm is optimal in terms of the generalized likelihood ratio test (GLRT). For massive MIMO systems, we show that the expected complexity of our algorithm grows polynomially in the channel coherence time. Simulation results demonstrate significant performance gains of our algorithm compared with suboptimal non-coherent detection algorithms. To the best of our knowledge, this is the first algorithm that efficiently achieves GLRT-optimal non-coherent detection for massive MIMO systems with general constellations.

  8. Dembo polymerase chain reaction technique for detection of bovine abortion, diarrhea, and respiratory disease complex infectious agents in potential vectors and reservoirs.

    Science.gov (United States)

    Rahpaya, Sayed Samim; Tsuchiaka, Shinobu; Kishimoto, Mai; Oba, Mami; Katayama, Yukie; Nunomura, Yuka; Kokawa, Saki; Kimura, Takashi; Kobayashi, Atsushi; Kirino, Yumi; Okabayashi, Tamaki; Nonaka, Nariaki; Mekata, Hirohisa; Aoki, Hiroshi; Shiokawa, Mai; Umetsu, Moeko; Morita, Tatsushi; Hasebe, Ayako; Otsu, Keiko; Asai, Tetsuo; Yamaguchi, Tomohiro; Makino, Shinji; Murata, Yoshiteru; Abi, Ahmad Jan; Omatsu, Tsutomu; Mizutani, Tetsuya

    2018-05-31

    Bovine abortion, diarrhea, and respiratory disease complexes, caused by infectious agents, result in significant economic losses for the cattle industry. These pathogens are likely transmitted by various vectors and reservoirs, including insects, birds, and rodents. However, experimental data supporting this possibility are scarce. We collected 117 samples and screened them for 44 bovine abortive, diarrheal, and respiratory disease complex pathogens by using Dembo polymerase chain reaction (PCR), which is based on TaqMan real-time PCR. Fifty-seven samples were positive for at least one pathogen, including bovine viral diarrhea virus, bovine enterovirus, Salmonella enterica ser. Dublin, Salmonella enterica ser. Typhimurium, and Neospora caninum; some samples were positive for multiple pathogens. Bovine viral diarrhea virus and bovine enterovirus were the most frequently detected pathogens, especially in flies, suggesting an important role of flies in the transmission of these viruses. Additionally, we detected the N. caninum genome in a cockroach sample for the first time. Our data suggest that insects (particularly flies), birds, and rodents are potential vectors and reservoirs of abortion, diarrhea, and respiratory infectious agents, and that they may transmit more than one pathogen at the same time.

  9. Optimization of Graphene Sensors to Detect Biological Warfare Agents

    Science.gov (United States)

    2014-03-27

    variations that use detection elements such as glucose, cholesterol, NADH, hydrogen peroxide, nitrites, nitrous oxide and aptamers (such as ssDNA...electrical current [34]. The sensor materials and detection limits listed in Table 1 illustrate the types of processed graphene that can be used to...and a 1% mortality rate for those treated [28]. Gastrointestinal anthrax results when B. anthracis enters the body by eating infected meat and has

  10. Optimization of a quench detection system for superconducting magnets

    International Nuclear Information System (INIS)

    Borlein, M.

    2004-12-01

    The subject of this report is the detection of a quench in a superconducting magnet. For the safe operation of superconducting magnets, one of the most important issues is the quench detection system, which monitors the superconducting state of the magnet and triggers a safety discharge if necessary. If superconductivity breaks down (a quench), the magnet has to be discharged very quickly to avoid any damage or danger to the magnet or its environment. First, an introductory overview is given. Next, different methods of quench detection are presented, partly on the basis of existing quench detection systems, and the applicability of these methods in different states of magnet operation is shown. The different quench detection methods are compared and evaluated, partly by using the test experiments described in the appendix. As an application example, this report contains a proposal for the quench detection system of the Wendelstein 7-X facility, currently being built by the Institute for Plasma Physics, Garching.
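A standard ingredient of such systems is inductive compensation: the measured coil voltage is corrected by L·dI/dt so that only a resistive (quench) voltage remains, which is then held against a threshold for several consecutive samples to reject transient spikes. The sketch below illustrates that principle only; the inductance, threshold, and hold count are assumed values, not Wendelstein 7-X parameters.

```python
def detect_quench(u_coil, current, dt, L, threshold=0.1, hold=3):
    """Return the sample index at which the compensated (resistive) voltage
    u_res = u_coil - L * dI/dt exceeds `threshold` for `hold` consecutive
    samples, or None if no quench is seen."""
    count = 0
    for k in range(1, len(u_coil)):
        didt = (current[k] - current[k - 1]) / dt
        u_res = u_coil[k] - L * didt
        count = count + 1 if abs(u_res) > threshold else 0
        if count >= hold:
            return k
    return None

if __name__ == "__main__":
    dt, L = 0.001, 0.5                       # assumed sampling step and inductance
    current = [k * dt for k in range(100)]   # 1 A/s ramp -> 0.5 V inductive voltage
    u_coil = [0.5] * 50 + [0.8] * 50         # 0.3 V resistive voltage appears at k=50
    print(detect_quench(u_coil, current, dt, L))
```

Real systems often use a bridge circuit to cancel the inductive voltage in hardware; the subtraction above is the same idea done numerically.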

  11. Application of advanced reservoir characterization, simulation, and production optimization strategies to maximize recovery in slope and basin clastic reservoirs, west Texas (Delaware Basin). Annual progress report, March 31, 1995--March 31, 1996

    Energy Technology Data Exchange (ETDEWEB)

    Dutton, S.P.; Hovorka, S.D.; Cole, A.G.

    1996-08-01

    The objective of this Class III project is to demonstrate that detailed reservoir characterization of clastic reservoirs in basinal sandstones of the Delaware Mountain Group in the Delaware Basin of West Texas and New Mexico is a cost-effective way to recover more of the original oil in place through strategic infill-well placement and geologically based field development. Reservoirs in the Delaware Mountain Group have low producibility (average recovery <14 percent of the original oil in place) because of a high degree of vertical and lateral heterogeneity caused by depositional processes and post-depositional diagenetic modification. Detailed correlations of the Ramsey sandstone reservoirs in Geraldine Ford field suggest that lateral sandstone continuity is less than that interpreted by previous studies. The degree of lateral heterogeneity in the reservoir sandstones suggests that they were deposited by eolian-derived turbidites. According to the eolian-derived turbidite model, sand dunes migrated across the exposed shelf to the shelf break during sea-level lowstands and provided well-sorted sand for turbidity currents or grain flows into the deep basin.

  12. An optimized algorithm for detecting and annotating regional differential methylation.

    Science.gov (United States)

    Li, Sheng; Garrett-Bakelman, Francine E; Akalin, Altuna; Zumbo, Paul; Levine, Ross; To, Bik L; Lewis, Ian D; Brown, Anna L; D'Andrea, Richard J; Melnick, Ari; Mason, Christopher E

    2013-01-01

    DNA methylation profiling reveals important differentially methylated regions (DMRs) of the genome that are altered during development or perturbed by disease. To date, few programs exist for regional analysis of enriched or whole-genome bisulfite conversion sequencing data, even though such data are increasingly common. Here, we describe an open-source, optimized method for determining empirically based DMRs (eDMR) from high-throughput sequence data that is applicable to enriched and whole-genome methylation profiling datasets, as well as other globally enriched epigenetic modification data. We show that our bimodal distribution model and weighted cost function for optimized regional methylation analysis provide accurate boundaries of regions harboring significant epigenetic modifications. Our algorithm takes the spatial distribution of CpGs into account for the enrichment assay, allowing the definition of empirical regions for differential methylation to be optimized. Combined with the dependent adjustment for regional p-value combination and DMR annotation, we provide a method that may be applied to a variety of datasets for rapid DMR analysis. Our method classifies both the directionality of DMRs and their genome-wide distribution, which we have observed shows clinical relevance through the correct stratification of two Acute Myeloid Leukemia (AML) tumor sub-types. Our weighted optimization algorithm eDMR for calling DMRs extends an established DMR R pipeline (methylKit) and provides a needed resource in epigenomics. Our method enables an accurate and scalable way of finding DMRs in high-throughput methylation sequencing experiments. eDMR is available for download at http://code.google.com/p/edmr/.

  13. Optimization of Hydrogen Peroxide Detection for a Methyl Mercaptan Biosensor

    Directory of Open Access Journals (Sweden)

    Shi-Gang Sun

    2013-04-01

    Full Text Available Several kinds of modified carbon screen-printed electrodes (CSPEs) for amperometric detection of hydrogen peroxide (H2O2) are presented in order to propose a methyl mercaptan (MM) biosensor. Unmodified, carbon nanotube (CNT), cobalt phthalocyanine (CoPC), Prussian blue (PB), and Os-wired HRP modified CSPE sensors were fabricated and tested to detect H2O2, applying a potential of +0.6 V, +0.6 V, +0.4 V, −0.2 V and −0.1 V (versus Ag/AgCl), respectively. The limits of detection of these electrodes for H2O2 were 3.1 μM, 1.3 μM, 71 nM, 1.3 μM, and 13.7 nM, respectively. The results demonstrated that the Os-wired HRP modified CSPEs give the lowest limit of detection (LOD) for H2O2, at a working potential as low as −0.1 V. Os-wired HRP is the optimum choice for establishing a MM biosensor and gives a detection limit of 0.5 μM.

  14. Assay optimization for molecular detection of Zika virus

    NARCIS (Netherlands)

    Corman, Victor M.; Rasche, Andrea; Baronti, Cecile; Aldabbagh, Souhaib; Cadar, Daniel; Reusken, Chantal Bem; Pas, Suzan D.; Goorhuis, Abraham; Schinkel, Janke; Molenkamp, Richard; Kümmerer, Beate M.; Bleicker, Tobias; Brünink, Sebastian; Eschbach-Bludau, Monika; Eis-Hübinger, Anna M.; Koopmans, Marion P.; Schmidt-Chanasit, Jonas; Grobusch, Martin P.; de Lamballerie, Xavier; Drosten, Christian; Drexler, Jan Felix

    2016-01-01

    To examine the diagnostic performance of real-time reverse transcription (RT)-polymerase chain reaction (PCR) assays for Zika virus detection. We compared seven published real-time RT-PCR assays and two new assays that we have developed. To determine the analytical sensitivity of each assay, we

  15. Optimizing ultrasound detection for sensitive 3D photoacoustic breast tomography

    NARCIS (Netherlands)

    Xia, W.

    2013-01-01

    The standard modality for breast cancer detection is X-ray imaging. Diagnosis is performed after the triple assessment of X-ray mammography assisted by ultrasonography and biopsy. Magnetic resonance imaging (MRI) is sometimes used in specific problem solving such as contradictory results are

  16. Improved Genetic Algorithm Optimization for Forward Vehicle Detection Problems

    Directory of Open Access Journals (Sweden)

    Longhui Gang

    2015-07-01

    Full Text Available Automated forward vehicle detection is an integral component of many advanced driver-assistance systems. The method based on multi-visual information fusion, with its distinctive advantages, has become one of the important topics in this research field. During the whole detection process, there are two key points that should be resolved. One is to find robust features for identification, and the other is to apply an efficient algorithm for training the model designed with multiple sources of information. This paper presents an adaptive SVM (Support Vector Machine) model to detect vehicles with range estimation using an on-board camera. Because of extrinsic factors such as shadows and illumination, we pay particular attention to enhancing the system with several robust features extracted from a real driving environment. Then, with the introduction of an improved genetic algorithm, the features are fused efficiently by the proposed SVM model. In order to apply the model in a forward collision warning system, longitudinal distance information is provided simultaneously. The proposed method was successfully implemented on a test car, and experimental evaluation results show reliability in terms of both the detection rate and potential effectiveness in a real driving environment.

  17. Structural damage detection-oriented multi-type sensor placement with multi-objective optimization

    Science.gov (United States)

    Lin, Jian-Fu; Xu, You-Lin; Law, Siu-Seong

    2018-05-01

    A structural damage detection-oriented multi-type sensor placement method with multi-objective optimization is developed in this study. The multi-type response covariance sensitivity-based damage detection method is first introduced. Two objective functions for optimal sensor placement are then introduced in terms of the response covariance sensitivity and the response independence. The multi-objective optimization problem is formed from the two objective functions, and the non-dominated sorting genetic algorithm II (NSGA-II) is adopted to find the optimal multi-type sensor placement for the best structural damage detection. The proposed method is finally applied to a nine-bay three-dimensional frame structure. Numerical results show that the optimal multi-type sensor placement determined by the proposed method can avoid redundant sensors and provide satisfactory results for structural damage detection. Restricting the number of each type of sensor in the optimization reduces the search space and makes the proposed method more effective. Moreover, how to select the most suitable sensor placement from the Pareto solutions via the utility function and the knee point method is demonstrated in the case study.

  18. Multiunit water resource systems management by decomposition, optimization and emulated evolution : a case study of seven water supply reservoirs in Tunisia

    NARCIS (Netherlands)

    Milutin, D.

    1998-01-01

    Being one of the essential elements of almost any water resource system, reservoirs are indispensable in our struggle to harness, utilize and manage natural water resources. Consequently, the derivation of appropriate reservoir operating strategies draws significant attention in water

  19. A Hybrid Heuristic Optimization Approach for Leak Detection in Pipe Networks Using Ordinal Optimization Approach and the Symbiotic Organism Search

    Directory of Open Access Journals (Sweden)

    Chao-Chih Lin

    2017-10-01

    Full Text Available A new transient-based hybrid heuristic approach is developed to optimize a transient generation process and to detect leaks in pipe networks. The approach couples the ordinal optimization approach (OOA) and the symbiotic organism search (SOS) to solve the optimization problem by means of iterations. A pipe network analysis model (PNSOS) is first used to determine the steady-state head distribution and pipe flow rates. The best transient generation point and its relevant valve operation parameters are optimized by maximizing the objective function of transient energy. The transient event is created at the chosen point, and the method of characteristics (MOC) is used to analyze the transient flow. The OOA is applied to sift through the candidate pipes and the initial organisms with leak information. The SOS is employed to determine the leaks by minimizing the sum of differences between simulated and computed heads at the observation points. Two synthetic leaking scenarios, a simple pipe network and a water distribution network (WDN), are chosen to test the performance of leak detection ordinal symbiotic organism search (LDOSOS). Leak information can be accurately identified by the proposed approach in both scenarios. The presented technique makes a remarkable contribution to the success of leak detection in pipe networks.

  20. Spectrally optimal illuminations for diabetic retinopathy detection in retinal imaging

    Science.gov (United States)

    Bartczak, Piotr; Fält, Pauli; Penttinen, Niko; Ylitepsa, Pasi; Laaksonen, Lauri; Lensu, Lasse; Hauta-Kasari, Markku; Uusitalo, Hannu

    2017-04-01

    Retinal photography is a standard method for recording retinal diseases for subsequent analysis and diagnosis. However, the currently used white-light or red-free retinal imaging does not necessarily provide the best possible visibility of different types of retinal lesions, which is important when developing diagnostic tools for handheld devices, such as smartphones. Using specifically designed illumination, the visibility and contrast of retinal lesions could be improved. In this study, spectrally optimal illuminations for diabetic retinopathy lesion visualization are implemented using a spectrally tunable light source based on a digital micromirror device. The applicability of this method was tested in vivo by taking monochrome retinal images from the eyes of five diabetic volunteers and two non-diabetic control subjects. For comparison with existing methods, we evaluated the contrast of retinal images taken with our method and with red-free illumination. The preliminary results show that the use of optimal illuminations improved the contrast of diabetic lesions in retinal images by 30-70% compared to traditional red-free illumination imaging.

  1. Optimally Robust Redundancy Relations for Failure Detection in Uncertain Systems,

    Science.gov (United States)

    1983-04-01

    particular applications. While the general methods provide the basis for what in principle should be a widely applicable failure detection methodology...modifications to this result which overcome them at no fundamental increase in complexity. 4.1 Scaling A critical problem with the criteria of the preceding...criterion which takes scaling into account (45). As in (38), we can multiply the C. by positive scalars to take into account unequal weightings on

  2. Optimizing surface acoustic wave sensors for trace chemical detection

    Energy Technology Data Exchange (ETDEWEB)

    Frye, G.C.; Kottenstette, R.J.; Heller, E.J. [and others]

    1997-06-01

    This paper describes several recent advances for fabricating coated surface acoustic wave (SAW) sensors for applications requiring trace chemical detection. Specifically, we have demonstrated that high surface area microporous oxides can provide 100-fold improvements in SAW sensor responses compared with more typical polymeric coatings. In addition, we fabricated GaAs SAW devices with frequencies up to 500 MHz to provide greater sensitivity and an ideal substrate for integration with high-frequency electronics.

  3. Optimization of Quantitative PCR Methods for Enteropathogen Detection

    Science.gov (United States)

    Liu, Jie; Gratz, Jean; Amour, Caroline; Nshama, Rosemary; Walongo, Thomas; Maro, Athanasia; Mduma, Esto; Platts-Mills, James; Boisen, Nadia; Nataro, James; Haverstick, Doris M.; Kabir, Furqan; Lertsethtakarn, Paphavee; Silapong, Sasikorn; Jeamwattanalert, Pimmada; Bodhidatta, Ladaporn; Mason, Carl; Begum, Sharmin; Haque, Rashidul; Praharaj, Ira; Kang, Gagandeep; Houpt, Eric R.

    2016-01-01

    Detection and quantification of enteropathogens in stool specimens is useful for diagnosing the cause of diarrhea but is technically challenging. Here we evaluate several important determinants of quantification: specimen collection, nucleic acid extraction, and extraction and amplification efficiency. First, we evaluate the molecular detection and quantification of pathogens in rectal swabs versus stool, using paired flocked rectal swabs and whole stool collected from 129 children hospitalized with diarrhea in Tanzania. Swabs generally yielded a higher quantification cycle (Cq) (average 29.7, standard deviation 3.5 vs. 25.3 ± 2.9 from stool, P<0.001) but were still able to detect 80% of pathogens with a Cq < 30 in stool. Second, a simplified total nucleic acid (TNA) extraction procedure was compared to separate DNA and RNA extractions and showed 92% (318/344) sensitivity and 98% (951/968) specificity, with no difference in Cq value for the positive results (ΔCq(DNA+RNA-TNA) = -0.01 ± 1.17, P = 0.972, N = 318). Third, we devised a quantification scheme that adjusts pathogen quantity to the specimen’s extraction and amplification efficiency, and show that this better estimates the quantity of spiked specimens than the raw target Cq. In sum, these methods for enteropathogen quantification, stool sample collection, and nucleic acid extraction will be useful for laboratories studying enteric disease. PMID:27336160
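The efficiency-adjusted quantification scheme described above can be illustrated with a small helper. The standard-curve intercept and slope, and the idea of scaling by the Cq shift of a spiked control, are generic qPCR conventions used here as assumptions, not the paper's exact scheme.

```python
def adjusted_quantity(cq_target, cq_spike_observed, cq_spike_expected,
                      intercept=38.0, slope=-3.32):
    """Estimate target quantity (copies) from its Cq via a standard curve
    Cq = intercept + slope * log10(quantity), then scale the estimate by the
    recovery of a spiked extraction/amplification control."""
    log_qty = (cq_target - intercept) / slope
    # a late control (positive delta) signals losses, so scale the estimate up
    recovery_log = (cq_spike_observed - cq_spike_expected) / (-slope)
    return 10 ** (log_qty + recovery_log)

if __name__ == "__main__":
    # control on target: quantity read straight off the standard curve
    print(adjusted_quantity(31.36, 25.0, 25.0))
    # control 3.32 cycles late (~10x loss): estimate scaled up tenfold
    print(adjusted_quantity(31.36, 28.32, 25.0))
```

A slope of -3.32 corresponds to perfect doubling per cycle, so each cycle of control delay scales the estimate by a factor of ~2.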

  4. Swine and rabbits are the main reservoirs of hepatitis E virus in China: detection of HEV RNA in feces of farmed and wild animals.

    Science.gov (United States)

    Xia, Junke; Zeng, Hang; Liu, Lin; Zhang, Yulin; Liu, Peng; Geng, Jiabao; Wang, Lin; Wang, Ling; Zhuang, Hui

    2015-11-01

    Hepatitis E virus (HEV) infection is recognized as a zoonosis. The prevalence of HEV RNA and anti-HEV antibodies in many animal species has been reported, but the host range of HEV is unclear. The aims of this study were to investigate HEV infection in various animal species and to determine the reservoirs of HEV. Eight hundred twenty-two fecal samples from 17 mammal species and 67 fecal samples from 24 avian species were collected in China and tested for HEV RNA by RT-nPCR. The products of PCR were sequenced and analyzed phylogenetically. The positive rates of HEV RNA isolated from pigs in Beijing, Shandong, and Henan were 33%, 30%, and 92%, respectively, and that from rabbits in Beijing was 5%. HEV RNA was not detectable in farmed foxes, sheep or sika deer, or in wild animals in zoos, including wild boars, yaks, camels, Asiatic black bears, African lions, red pandas, civets, wolves, jackals and primates. Sequence analysis revealed that swine isolates had 97.8%-98.4% nucleotide sequence identity to genotype 4d isolates from patients in Shandong and Jiangsu of China. Phylogenetic analysis showed that swine HEV isolates belong to genotype 4, including subgenotype 4h in Henan and 4d in Beijing and Shandong. The rabbit HEV strains shared 93%-99% nucleotide sequence identity with rabbit strains isolated from Inner Mongolia. In conclusion, swine and rabbits have been confirmed to be the main reservoirs of HEV in China.

  5. TEM10 homodyne detection as an optimal small-displacement and tilt-measurement scheme

    DEFF Research Database (Denmark)

    Delaubert, Vincent; Treps, Nikolas; Lassen, Mikael Østergaard

    2006-01-01

    We report an experimental demonstration of optimal measurements of small displacement and tilt of a Gaussian beam - two conjugate variables - involving a homodyne detection with a TEM10 local oscillator. We verify that the standard split detection is only 64% efficient. We also show a displacement...

  6. Optimization of immunochemistry for sensing techniques to detect pesticide residues in water

    DEFF Research Database (Denmark)

    Uthuppu, Basil; Kostesha, Natalie; Jakobsen, Mogens Havsteen

    2011-01-01

    We are working on the development of a real-time electrochemical sensor based on an immunoassay detection system to detect and quantify the presence of pesticide residues in ground water. Highly selective and sensitive immuno-reactions are being investigated to be optimized in order to bring them...

  7. Optimization of Large Volume Injection for Improved Detection of Polycyclic Aromatic Hydrocarbons (PAH) in Mussels

    DEFF Research Database (Denmark)

    Duedahl-Olesen, Lene; Ghorbani, Faranak

    2008-01-01

    Detection of PAHs with six benzene rings is somewhat troublesome, and lowering the limits of detection (LODs) for these compounds in food is necessary. For this purpose, we optimized a Programmable-Temperature-Vaporisation (PTV) injection with Large Volume Injection (LVI) with regard to the GC-MS det...

  8. Symmetrized local co-registration optimization for anomalous change detection

    Energy Technology Data Exchange (ETDEWEB)

    Wohlberg, Brendt E [Los Alamos National Laboratory; Theiler, James P [Los Alamos National Laboratory

    2009-01-01

    The goal of anomalous change detection (ACD) is to identify what unusual changes have occurred in a scene, based on two images of the scene taken at different times and under different conditions. The actual anomalous changes need to be distinguished from the incidental differences that occur throughout the imagery; one of the most common and confounding of these incidental differences is misregistration of the images, arising from limitations of the registration pre-processing applied to the image pair. We propose a general method to compensate for residual misregistration in any ACD algorithm that constructs an estimate of the degree of 'anomalousness' for every pixel in the image pair. The method computes a modified, misregistration-insensitive anomalousness by making local re-registration adjustments to minimize the local anomalousness. In this paper we describe a symmetrized version of our initial algorithm, and find significant performance improvements in the anomalous change detection ROC curves for a number of real and synthetic data sets.
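The core idea, taking the minimum anomalousness over small local shifts and symmetrizing over the two images, can be sketched as below. The squared pixel difference standing in for a real anomalousness measure, the max-based symmetrization, and the search radius are illustrative assumptions, not the authors' formulation.

```python
def local_min_anomaly(img_a, img_b, r, c, radius):
    # Minimum anomalousness at pixel (r, c) when img_b is allowed to shift
    # locally by up to `radius` pixels; squared difference stands in for a
    # real per-pixel anomalousness measure.
    rows, cols = len(img_a), len(img_a[0])
    best = float("inf")
    for dr in range(-radius, radius + 1):
        for dc in range(-radius, radius + 1):
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols:
                best = min(best, (img_a[r][c] - img_b[rr][cc]) ** 2)
    return best

def symmetrized_anomaly(img_a, img_b, radius=1):
    # Symmetrize by shifting each image against the other and keeping the
    # larger of the two minimized values, so a change must persist both ways.
    rows, cols = len(img_a), len(img_a[0])
    return [[max(local_min_anomaly(img_a, img_b, r, c, radius),
                 local_min_anomaly(img_b, img_a, r, c, radius))
             for c in range(cols)] for r in range(rows)]

if __name__ == "__main__":
    img_a = [[0.0] * 5 for _ in range(5)]
    img_b = [[0.0] * 5 for _ in range(5)]
    img_a[2][2] = 10.0
    img_b[2][3] = 10.0   # same feature, shifted one pixel (misregistration)
    img_b[0][0] = 9.0    # a genuine change
    for row in symmetrized_anomaly(img_a, img_b):
        print(row)
```

In this toy example the shifted feature is suppressed to zero anomalousness, while the genuine change survives the local re-registration in both directions.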

  9. A Distributed Intrusion Detection Scheme about Communication Optimization in Smart Grid

    Directory of Open Access Journals (Sweden)

    Yunfa Li

    2013-01-01

    Full Text Available We first propose an efficient communication optimization algorithm for smart grids. Based on the optimization algorithm, we propose an intrusion detection algorithm to detect malicious data and possible cyberattacks. In this scheme, each node acts independently when it processes communication flows or cybersecurity threats, and neither special hardware nor node cooperation is needed. In order to demonstrate the feasibility and effectiveness of this scheme, a series of experiments was conducted. The results show that it is feasible and efficient to detect malicious data and possible cyberattacks with low computation and communication cost.

  10. Optimal sample size for probability of detection curves

    International Nuclear Information System (INIS)

    Annis, Charles; Gandossi, Luca; Martin, Oliver

    2013-01-01

    Highlights: • We investigate sample size requirements for developing probability of detection curves. • We develop simulations to determine effective inspection target sizes, number and distribution. • We summarize these findings and provide guidelines for the NDE practitioner. -- Abstract: The use of probability of detection curves to quantify the reliability of non-destructive examination (NDE) systems is common in the aeronautical industry, but relatively less so in the nuclear industry, at least in European countries. Due to the nature of the components being inspected, sample sizes tend to be much lower. This makes the manufacturing of test pieces with representative flaws, in sufficient numbers so as to draw statistical conclusions on the reliability of the NDT system under investigation, quite costly. The European Network for Inspection and Qualification (ENIQ) has developed an inspection qualification methodology, referred to as the ENIQ Methodology. It has become widely used in many European countries and provides assurance on the reliability of NDE systems, but only qualitatively. The need to quantify the output of inspection qualification has become more important as structural reliability modelling and quantitative risk-informed in-service inspection methodologies become more widely used. A measure of NDE reliability is necessary to quantify risk reduction after inspection, and probability of detection (POD) curves provide such a metric. The Joint Research Centre (Petten, The Netherlands) supported ENIQ by investigating the question of the sample size required to determine a reliable POD curve. As mentioned earlier, manufacturing of test pieces with defects that are typically found in nuclear power plants (NPPs) is usually quite expensive. Thus there is a tendency to reduce sample sizes, which in turn increases the uncertainty associated with the resulting POD curve. The main question in conjunction with POD curves is the appropriate sample size.
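    The kind of POD-curve estimation discussed above can be illustrated with a toy hit/miss data set. The flaw sizes, outcomes, and crude grid-search fit below are hypothetical stand-ins for a proper maximum-likelihood analysis with confidence bounds (e.g. an a90/95 value), not the ENIQ or JRC procedure:

```python
import numpy as np

# Hypothetical hit/miss inspection data: flaw sizes (mm) and detection outcomes.
sizes = np.array([0.5, 0.8, 1.0, 1.2, 1.5, 1.8, 2.0, 2.5, 3.0, 4.0])
hits = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])

def pod(a, mu, sigma):
    """Log-logistic POD model: probability of detection vs. flaw size a."""
    return 1.0 / (1.0 + np.exp(-(np.log(a) - mu) / sigma))

def neg_log_lik(mu, sigma):
    # Clip probabilities to keep the log-likelihood finite.
    p = np.clip(pod(sizes, mu, sigma), 1e-12, 1 - 1e-12)
    return -np.sum(hits * np.log(p) + (1 - hits) * np.log(1 - p))

# Crude grid-search MLE; a real analysis would use proper numerical
# optimization and put a confidence bound on a90 (giving a90/95).
best = min((neg_log_lik(m, s), m, s)
           for m in np.linspace(-1, 2, 61) for s in np.linspace(0.05, 1.0, 40))
_, mu_hat, sigma_hat = best
a90 = float(np.exp(mu_hat + sigma_hat * np.log(9.0)))  # size with POD = 0.90
print(round(a90, 2))
```

    With only ten samples, the fitted curve (and hence a90) carries large uncertainty, which is exactly the sample-size question the abstract raises.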

  11. Optimal threshold detection for Málaga turbulent optical links

    DEFF Research Database (Denmark)

    Jurado-Navas, Antonio; Garrido-Balsells, José María; del Castillo Vázquez, Miguel

    2016-01-01

    A new and generalized statistical model, called the Málaga distribution (M distribution), has been derived recently to characterize the irradiance fluctuations of an unbounded optical wave front propagating through a turbulent medium under all irradiance fluctuation conditions. Among the great advantages of that model, it is written in a simple, tractable closed-form expression, and it is able to unify most of the statistical models for free-space optical communications proposed until now in the scientific literature. Based on that Málaga model, we have analyzed in this paper the role of the detection threshold in a free-space optical system employing an on-off keying modulation technique, involved in different scenarios, and taking into account the extinction ratio associated with the employed laser. First we have derived analytical expressions for the lower…

  12. Analysis of real-time reservoir monitoring : reservoirs, strategies, & modeling.

    Energy Technology Data Exchange (ETDEWEB)

    Mani, Seethambal S.; van Bloemen Waanders, Bart Gustaaf; Cooper, Scott Patrick; Jakaboski, Blake Elaine; Normann, Randy Allen; Jennings, Jim (University of Texas at Austin, Austin, TX); Gilbert, Bob (University of Texas at Austin, Austin, TX); Lake, Larry W. (University of Texas at Austin, Austin, TX); Weiss, Chester Joseph; Lorenz, John Clay; Elbring, Gregory Jay; Wheeler, Mary Fanett (University of Texas at Austin, Austin, TX); Thomas, Sunil G. (University of Texas at Austin, Austin, TX); Rightley, Michael J.; Rodriguez, Adolfo (University of Texas at Austin, Austin, TX); Klie, Hector (University of Texas at Austin, Austin, TX); Banchs, Rafael (University of Texas at Austin, Austin, TX); Nunez, Emilio J. (University of Texas at Austin, Austin, TX); Jablonowski, Chris (University of Texas at Austin, Austin, TX)

    2006-11-01

    The project objective was to detail better ways to assess and exploit intelligent oil and gas field information through improved modeling, sensor technology, and process control to increase ultimate recovery of domestic hydrocarbons. To meet this objective, we investigated the use of permanent downhole sensor systems (Smart Wells) whose data are fed in real time into computational reservoir models that are integrated with optimized production control systems. The project utilized a three-pronged approach: (1) a value of information analysis to address the economic advantages, (2) reservoir simulation modeling and control optimization to prove the capability, and (3) evaluation of new generation sensor packaging to survive the borehole environment for long periods of time. The Value of Information (VOI) decision tree method was developed and used to assess the economic advantage of using the proposed technology; the VOI demonstrated the increased subsurface resolution through additional sensor data. Our findings show that VOI studies are a practical means of ascertaining the value associated with a technology, in this case the application of sensors to production. The procedure acknowledges the uncertainty in predictions but nevertheless assigns monetary value to the predictions. The best aspect of the procedure is that it builds consensus within interdisciplinary teams. The reservoir simulation and modeling aspect of the project was developed to show the capability of exploiting sensor information both for reservoir characterization and to optimize control of the production system. Our findings indicate history matching is improved as more information is added to the objective function, clearly indicating that sensor information can help in reducing the uncertainty associated with reservoir characterization. Additional findings and approaches used are described in detail within the report. The next generation sensors aspect of the project evaluated sensors and packaging

  13. Optimization of single photon detection model based on GM-APD

    Science.gov (United States)

    Chen, Yu; Yang, Yi; Hao, Peiyu

    2017-11-01

    High-precision laser ranging over one hundred kilometers requires a detector with a very strong ability to detect extremely weak light. At present, the Geiger-mode avalanche photodiode (GM-APD) is widely used; it has high sensitivity and high photoelectric conversion efficiency. Selecting and designing the detector parameters according to the system requirements is of great importance to the improvement of photon detection efficiency, and design optimization requires a good model. In this paper, we study the existing Poisson-distribution model and take into account the important detector parameters of dark count rate, dead time, quantum efficiency and so on. We improve the detection model and select the appropriate parameters to achieve optimal photon detection efficiency. The simulation is carried out using Matlab and compared with actual test results, verifying the rationality of the model. It has certain reference value in engineering applications.
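    A minimal Poisson sketch of this kind of detection model is shown below. The efficiency, dark-count rate, and gate width are assumed illustrative values, not the paper's, and dead-time and afterpulsing effects are omitted:

```python
import math

def detection_prob(n_signal, eta=0.3, dark_rate=1e3, gate_ns=100):
    """Detection probability for a Geiger-mode APD under a simple Poisson
    model: the APD fires if at least one primary event (photoelectron or
    dark count) occurs within the range gate.
    n_signal: mean signal photons per gate; eta: photon detection
    efficiency; dark_rate in counts/s; gate width in ns."""
    mean_dark = dark_rate * gate_ns * 1e-9
    mean_primary = eta * n_signal + mean_dark
    return 1.0 - math.exp(-mean_primary)

def false_alarm_prob(dark_rate=1e3, gate_ns=100):
    """Probability the gate fires with no signal photons at all."""
    return 1.0 - math.exp(-dark_rate * gate_ns * 1e-9)

print(detection_prob(5), false_alarm_prob())
```

    Sweeping eta, dark rate, and gate width in such a model is one simple way to trade detection efficiency against false alarms when choosing detector parameters.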

  14. Feasibility of Optimizing Recovery and Reserves from a Mature and Geological Complex Multiple Turbidite Offshore Calif. Reservoir through the Drilling and Completion of a Trilateral Horizontal Well

    International Nuclear Information System (INIS)

    Coombs, Steven F.

    1999-01-01

    The main objective of this project is to devise an effective redevelopment strategy to combat producibility problems related to the Repetto turbidite sequences of the Carpinteria Field. The lack of adequate reservoir characterization, high-water cut production, and scaling problems have in the past contributed to the field's low productivity. To improve productivity and enhance recoverable reserves, the following specific goals are proposed: (1) Develop an integrated database of all existing data from work done by the former ownership group. (2) Expand reservoir drainage and reduce sand problems through horizontal well drilling and completion. (3) Operate and validate reservoirs' conceptual model by incorporating new data from the proposed trilateral well. (4) Transfer methodologies employed in geologic modeling and drilling multilateral wells to other operators with similar reservoirs

  15. Fault detection of feed water treatment process using PCA-WD with parameter optimization.

    Science.gov (United States)

    Zhang, Shirong; Tang, Qian; Lin, Yu; Tang, Yuling

    2017-05-01

    Feed water treatment process (FWTP) is an essential part of utility boilers, and fault detection is expected for its reliability improvement. Classical principal component analysis (PCA) has been applied to FWTPs in our previous work; however, the noises of the T^2 and SPE statistics result in false detections and missed detections. In this paper, wavelet denoising (WD) is combined with PCA to form a new algorithm (PCA-WD), where WD is intentionally employed to deal with the noises. The parameter selection of PCA-WD is further formulated as an optimization problem, and PSO is employed for its solution. A FWTP, sustaining two 1000 MW generation units in a coal-fired power plant, is taken as a study case, and its operation data were collected for the following verification study. The results show that the optimized WD is effective in restraining the noises of the T^2 and SPE statistics, so as to improve the performance of the PCA-WD algorithm. Moreover, the parameter optimization enables PCA-WD to obtain its optimal parameters in an automatic way rather than from individual experience. The optimized PCA-WD is further compared with classical PCA and sliding window PCA (SWPCA) in terms of four cases: bias fault, drift fault, broken-line fault and normal condition. The advantages of the optimized PCA-WD over classical PCA and SWPCA are finally confirmed by the results. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
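    The T^2 and SPE statistics underlying this family of methods can be sketched on synthetic data as follows. The process variables and the injected fault are made up, and the wavelet-denoising and PSO-tuning steps of PCA-WD are omitted; only the classical PCA monitoring core is shown:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical training data: 500 samples of 6 correlated process variables
# (flows, conductivities, ...) recorded during normal operation.
base = rng.normal(size=(500, 2))
X = base @ rng.normal(size=(2, 6)) + 0.1 * rng.normal(size=(500, 6))

mean, std = X.mean(axis=0), X.std(axis=0)
Xs = (X - mean) / std
U, S, Vt = np.linalg.svd(Xs, full_matrices=False)
k = 2                                # number of retained principal components
P = Vt[:k].T                         # loading matrix
lam = (S[:k] ** 2) / (len(X) - 1)    # variances of the retained PCs

def t2_spe(x):
    """Hotelling T^2 and SPE (Q) fault-detection statistics for one sample."""
    xs = (x - mean) / std
    t = xs @ P                       # scores in the PC subspace
    t2 = float(np.sum(t ** 2 / lam))
    spe = float(np.sum((xs - t @ P.T) ** 2))
    return t2, spe

_, normal_spe = t2_spe(X[0])
faulty = X[0].copy()
faulty[3] += 8 * std[3]              # injected bias fault on variable 3
_, fault_spe = t2_spe(faulty)
print(fault_spe > normal_spe)
```

    A bias on one variable breaks the learned correlation structure, so the SPE statistic jumps well above its normal-operation level; in practice alarms are raised against control limits estimated from the training data.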

  16. Detection of Carious Lesions and Restorations Using Particle Swarm Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Mohammad Naebi

    2016-01-01

    Full Text Available Background/Purpose. To date, the detection of carious lesions and restorations in dental images has not been automated; dentists simply examine images and locate lesions based on their experience. Using new technologies, detection and repair planning can be made intelligent. In this paper, we introduce an intelligent detection method based on particle swarm optimization (PSO) and our mathematical formulation. The method was applied to 2D images; by extending it, lesions could be detected in both 2D and 3D images. Materials and Methods. In recent years, it has become possible to process images intelligently with high-efficiency optimization algorithms in many applications, especially for the detection of dental caries and restorations without human intervention. In the present work, we explain the PSO algorithm together with our detection formula for the detection of dental caries and restorations, supported by image processing of pictures taken by digital radiography systems. Results and Conclusion. We implemented a mathematical formula for the fitness of the PSO. Our results show that this method can detect dental caries and restorations in digital radiographs with good convergence. The error rate of the method was 8%, so it can be applied for the detection of dental caries and restorations. With suitable parameter choices, the error rate can potentially be reduced below 0.5%.
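    A minimal PSO sketch is given below. The fitness function is a toy stand-in (distance to a known lesion location in a 2-D image plane); the paper's actual image-based detection formula is not reproduced, and all parameter values are conventional defaults rather than the authors' settings:

```python
import numpy as np

def pso(f, dim, n_particles=30, iters=200, seed=1):
    """Minimal particle swarm optimization minimizing f over [-5, 5]^dim."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5, 5, (n_particles, dim))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
    g = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Standard velocity update: inertia + cognitive + social terms.
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, -5, 5)
        val = np.array([f(p) for p in x])
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, float(pbest_val.min())

# Toy "fitness": squared distance to a hypothetical lesion position.
target = np.array([3.2, -1.7])
best, best_val = pso(lambda p: float(np.sum((p - target) ** 2)), dim=2)
print(best.round(2))
```

    In the actual application the fitness would score candidate positions against image features (intensity, edges) rather than a known target.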

  17. Optimal Fluorescence Waveband Determination for Detecting Defective Cherry Tomatoes Using a Fluorescence Excitation-Emission Matrix

    Directory of Open Access Journals (Sweden)

    In-Suck Baek

    2014-11-01

    Full Text Available A multi-spectral fluorescence imaging technique was used to detect defective cherry tomatoes. The fluorescence excitation-emission matrix was measured for defect, sound-surface and stem areas to determine the optimal fluorescence excitation and emission wavelengths for discrimination. Two-way ANOVA revealed that the optimal excitation wavelength for detecting defect areas was 410 nm. Principal component analysis (PCA) was applied to the fluorescence emission spectra of all regions at 410 nm excitation to determine the emission wavelengths for defect detection. The major emission wavelengths for the detection were 688 nm and 506 nm. Fluorescence images combined with the determined emission wavebands demonstrated the feasibility of detecting defective cherry tomatoes with >98% accuracy. Multi-spectral fluorescence imaging has potential utility in the non-destructive quality sorting of cherry tomatoes.

  18. Optimized Swinging Door Algorithm for Wind Power Ramp Event Detection: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Mingjian; Zhang, Jie; Florita, Anthony R.; Hodge, Bri-Mathias; Ke, Deping; Sun, Yuanzhang

    2015-08-06

    Significant wind power ramp events (WPREs) are those that influence the integration of wind power, and they are a concern for the continued reliable operation of the power grid. As wind power penetration has increased in recent years, so has the importance of wind power ramps. In this paper, an optimized swinging door algorithm (SDA) is developed to improve ramp detection performance. Wind power time series data are segmented by the original SDA, and then all significant ramps are detected and merged through a dynamic programming algorithm. An application of the optimized SDA is provided to ascertain the optimal parameter of the original SDA. Measured wind power data from the Electric Reliability Council of Texas (ERCOT) are used to evaluate the proposed optimized SDA.
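    The classic swinging door segmentation that the optimized SDA builds on can be sketched as follows. The wind power trace and tolerance are made up, and the dynamic-programming ramp merging of the optimized SDA is not shown:

```python
def swinging_door(times, values, epsilon):
    """Classic swinging-door segmentation: returns indices of segment
    endpoints such that every point lies within +/- epsilon of the
    piecewise-linear fit. Ramps are then read off the segment slopes."""
    anchors = [0]
    up, low = float("inf"), float("-inf")
    for i in range(1, len(values)):
        dt = times[i] - times[anchors[-1]]
        up = min(up, (values[i] + epsilon - values[anchors[-1]]) / dt)
        low = max(low, (values[i] - epsilon - values[anchors[-1]]) / dt)
        if low > up:                # the doors have closed: start a new segment
            anchors.append(i - 1)
            dt = times[i] - times[anchors[-1]]
            up = (values[i] + epsilon - values[anchors[-1]]) / dt
            low = (values[i] - epsilon - values[anchors[-1]]) / dt
    anchors.append(len(values) - 1)
    return anchors

# Hypothetical wind power trace (MW): flat, ramp up, flat.
t = list(range(12))
p = [10, 10, 10, 10, 20, 30, 40, 50, 50, 50, 50, 50]
segs = swinging_door(t, p, epsilon=1.0)
print(segs)
```

    Here the middle segment (indices 3 to 7, 10 MW to 50 MW) is the candidate ramp; a significance rule on slope and magnitude would then classify it as a WPRE.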

  19. ALGORITHMS FOR OPTIMIZATION OF SYSTEM PERFORMANCE IN LAYERED DETECTION SYSTEMS UNDER DETECTOR CORRELATION

    International Nuclear Information System (INIS)

    Wood, Thomas W.; Heasler, Patrick G.; Daly, Don S.

    2010-01-01

    Almost all of the 'architectures' for radiation detection systems in Department of Energy (DOE) and other USG programs rely on some version of layered detector deployment. Efficacy analyses of layered (or, more generally, extended) detection systems in many contexts often assume statistical independence among detection events and thus predict monotonically increasing system performance with the addition of detection layers. We show this to be a false conclusion for the ROC curves typical of most current-technology gamma detectors, and more generally show that statistical independence is often an unwarranted assumption for systems in which there is ambiguity about the objects to be detected. In such systems, a model of correlation among detection events allows optimization of system algorithms for interpretation of detector signals. These algorithms are framed as optimal discriminant functions in joint signal space, and may be applied to gross counting or spectroscopic detector systems. We have shown how system algorithms derived from this model dramatically improve detection probabilities compared to the standard serial detection operating paradigm for these systems. These results would not surprise anyone who has confronted the problem of correlated errors (or failure rates) in analogous contexts, but it seems to be largely underappreciated among those analyzing the radiation detection problem: independence is widely assumed, and experimental studies typically fail to measure correlation. This situation, if not rectified, will lead to several unfortunate results, including overconfidence in system efficacy, overinvestment in layers of similar technology, and underinvestment in diversity among detection assets.
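    The central point, that assuming independence overstates layered performance, can be reproduced with a small Monte Carlo sketch. The Gaussian detector responses, correlation, threshold, and source strength below are all assumed values for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000
rho = 0.8        # correlation between the two layers' detector responses
thr = 1.5        # per-detector alarm threshold
shift = 1.5      # mean response to a genuine source

# Correlated standard-normal noise, shifted by the source signal.
z1 = rng.normal(size=n)
z2 = rho * z1 + np.sqrt(1 - rho ** 2) * rng.normal(size=n)
s1, s2 = z1 + shift, z2 + shift

p_single = float(np.mean(s1 > thr))                  # one-layer detection prob.
p_layered = float(np.mean((s1 > thr) | (s2 > thr)))  # two layers, actual (correlated)
p_indep = 1 - (1 - p_single) ** 2                    # what independence predicts

print(round(p_single, 3), round(p_layered, 3), round(p_indep, 3))
```

    With strongly correlated detectors the second layer recovers far less than the independence model predicts, which is why the optimal joint discriminant, rather than a serial AND/OR rule, matters.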

  20. A Time Domain Update Method for Reservoir History Matching of Electromagnetic Data

    KAUST Repository

    Katterbauer, Klemens; Hoteit, Ibrahim; Sun, Shuyu

    2014-01-01

    production forecasts and optimizing reservoir exploitation. Reservoir history matching has played here a key role incorporating production, seismic, electromagnetic and logging data for forecasting the development of reservoirs and its depletion

  1. Evaluation of Gaussian approximations for data assimilation in reservoir models

    KAUST Repository

    Iglesias, Marco A.; Law, Kody J H; Stuart, Andrew M.

    2013-01-01

    is fundamental for the optimal management of reservoirs. Unfortunately, due to the large-scale highly nonlinear properties of standard reservoir models, characterizing the posterior is computationally prohibitive. Instead, more affordable ad hoc techniques, based

  2. Hydrologic-agronomic-economic model for the optimal operation of the Yaqui river reservoir system using genetic algorithms; Modelo hidrologico-agronomico-economico para la operacion optima del sistema de presas del rio Yaqui, usando algoritmos geneticos

    Energy Technology Data Exchange (ETDEWEB)

    Minjares-Lugo, Jose Luis; Salmon-Castelo, Roberto Fernando; Oroz-Ramos, Lucas Antonio [Comision Nacional del Agua (Mexico); Cruz-Medina, Isidro Roberto [Instituto Tecnologico de Sonora (Mexico)

    2008-07-15

    The objective of this study is to develop an integrated hydrologic-agronomic-economic annual model for the optimal operation of the Yaqui River reservoir system to support irrigation and urban water supply in the watershed. The model solves for each year's water allocations by crop, maximizing annual agricultural income for a specified risk of reservoir shortages and spills. It accounts for adjustments in water supply arising from changes in precipitation and runoff uncertainty, and for changes in water demand arising from variations in crop prices and production costs. Model predictions for the agricultural year 2000-2001 are compared with observed results to test the model's predictive ability. Results demonstrate that the model can be used to optimize and analyze reservoir system operation and for water resources management in Irrigation District No. 041, providing a framework for improving the operation of a reservoir system, selecting an optimal cropping pattern according to its maximum economic benefits, and determining the optimal monthly water releases from the reservoir system. The model considers the simultaneous operation of three dams and is applied to Irrigation District No. 041, Rio Yaqui.

  3. Improvement of LOD in Fluorescence Detection with Spectrally Nonuniform Background by Optimization of Emission Filtering.

    Science.gov (United States)

    Galievsky, Victor A; Stasheuski, Alexander S; Krylov, Sergey N

    2017-10-17

    The limit of detection (LOD) in analytical instruments with fluorescence detection can be improved by reducing the noise of the optical background. Efficiently reducing optical background noise in systems with a spectrally nonuniform background requires complex optimization of an emission filter, the main element of spectral filtration. Here, we introduce a filter-optimization method which utilizes an expression for the signal-to-noise ratio (SNR) as a function of (i) all noise components (dark, shot, and flicker), (ii) the emission spectrum of the analyte, (iii) the emission spectrum of the optical background, and (iv) the transmittance spectrum of the emission filter. In essence, the noise components and the emission spectra are determined experimentally and substituted into the expression. This leaves a single variable, the transmittance spectrum of the filter, which is optimized numerically by maximizing the SNR. Maximizing the SNR provides an accurate way of filter optimization, while a previously used approach based on maximizing the signal-to-background ratio (SBR) is an approximation that can lead to a much poorer LOD, specifically in the detection of fluorescently labeled biomolecules. The proposed filter-optimization method will be an indispensable tool for developing new and improving existing fluorescence-detection systems aiming at an ultimately low LOD.
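    The optimization idea can be sketched numerically with synthetic spectra and an idealized rectangular passband. Everything below (spectra shapes, noise coefficients, grid) is an assumption standing in for the experimentally measured quantities, and the filter is restricted to a single band rather than an arbitrary transmittance spectrum:

```python
import numpy as np

wl = np.arange(480, 701)                          # wavelength grid, nm
analyte = np.exp(-0.5 * ((wl - 520) / 15) ** 2)   # analyte emission spectrum
background = 0.3 + 0.002 * (wl - 480)             # nonuniform optical background

dark_noise = 2.0   # dark-noise standard deviation (counts)
flicker = 0.05     # flicker (proportional) noise coefficient

def snr(lo, hi):
    """SNR for an ideal bandpass filter transmitting [lo, hi] nm:
    signal over the quadrature sum of dark, shot, and flicker noise."""
    band = (wl >= lo) & (wl <= hi)
    S = analyte[band].sum()
    B = background[band].sum()
    noise = np.sqrt(dark_noise ** 2 + (S + B) + (flicker * B) ** 2)
    return S / noise

# Grid search over passband edges, maximizing SNR.
best = max((snr(a, b), a, b)
           for a in range(480, 700, 5) for b in range(a + 5, 701, 5))
best_snr, lo, hi = best
print(lo, hi, round(float(best_snr), 2))
```

    The optimum straddles the analyte peak but is narrower than the full range, because admitting more of the flicker-dominated background hurts SNR more than the extra signal helps; maximizing SBR instead would pick a different, generally inferior band.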

  4. An Improved Fruit Fly Optimization Algorithm and Its Application in Heat Exchange Fouling Ultrasonic Detection

    Directory of Open Access Journals (Sweden)

    Xia Li

    2018-01-01

    Full Text Available Inspired by the basic theory of Fruit Fly Optimization Algorithm, in this paper, cat mapping was added to the original algorithm, and the individual distribution and evolution mechanism of fruit fly population were improved in order to increase the search speed and accuracy. The flowchart of the improved algorithm was drawn to show its procedure. Using classical test functions, simulation optimization results show that the improved algorithm has faster and more reliable optimization ability. The algorithm was then combined with sparse decomposition theory and used in processing fouling detection ultrasonic signals to verify the validity and practicability of the improved algorithm.

  5. Detection of colonic polyps in the elderly: Optimization of the single-contrast barium enema examination

    International Nuclear Information System (INIS)

    Gelfand, D.W.; Chen, Y.M.; Ott, D.J.; Munitz, H.A.

    1986-01-01

    Single-contrast studies account for 75% of barium enema examinations and are often performed in the elderly. By optimizing all factors, the following results were obtained: for polyps of less than 1 cm, 40 of 57 were detected (sensitivity, 70.2%); for polyps of 1 cm or larger, 33 of 35 were detected (sensitivity, 94%). Overall, 73 of 92 polyps were detected (sensitivity, 79.3%). These sensitivities result from meticulous preparation and the use of compression filming, low-density barium, moderate kilovoltages, high-resolution screens, remote control apparatus, and high-bandpass TV fluoroscopy. The authors conclude that an optimal single-contrast barium enema examination detects colonic polyps with a sensitivity approaching that of the double-contrast study and may be employed in elderly patients who cannot undergo the double-contrast study

  6. Optimization of crack detection in steam generator tubes using a punctual probe

    International Nuclear Information System (INIS)

    Levy, R.; Ferre, C.

    1985-01-01

    The existence of cracks at the upper end of the expanded zone of a steam generator tube is a recent problem. A differential pencil probe was used for the detection of those cracks with encouraging results. An optimization study has been necessary to solve the difficulties in the evaluation of defects, due to the design of the first probe; the result is a probe making possible a precise analysis of detected signals

  7. Optimizing signal recycling for detecting a stochastic gravitational-wave background

    Science.gov (United States)

    Tao, Duo; Christensen, Nelson

    2018-06-01

    Signal recycling is applied in laser interferometers such as the Advanced Laser Interferometer Gravitational-Wave Observatory (aLIGO) to increase their sensitivity to gravitational waves. In this study, signal recycling configurations for detecting a stochastic gravitational wave background are optimized based on aLIGO parameters. Optimal transmission of the signal recycling mirror (SRM) and detuning phase of the signal recycling cavity under a fixed laser power and low-frequency cutoff are calculated. Based on the optimal configurations, the compatibility with a binary neutron star (BNS) search is discussed. Then, different laser powers and low-frequency cutoffs are considered. Two models for the dimensionless energy density of gravitational waves Ω_GW, a flat model and a power-law model, are studied. For a stochastic background search, it is found that an interferometer using signal recycling has a better sensitivity than an interferometer not using it. The optimal stochastic search configurations are typically found when both the SRM transmission and the signal recycling detuning phase are low. In this region, the BNS range mostly lies between 160 and 180 Mpc. When a lower laser power is used, the optimal signal recycling detuning phase increases, the optimal SRM transmission increases and the optimal sensitivity improves. A reduced low-frequency cutoff gives a better sensitivity limit. For both models of Ω_GW, a typical optimal sensitivity limit on the order of 10^-10 is achieved at the reference frequency.

  8. A hybrid neural network – world cup optimization algorithm for melanoma detection

    Directory of Open Access Journals (Sweden)

    Razmjooy Navid

    2018-03-01

    Full Text Available One of the most dangerous cancers in humans is melanoma. However, early detection of melanoma can help to cure it completely. This paper presents a new efficient method to detect malignancy in melanoma via images. At first, the extra scales are eliminated by using edge detection and smoothing. Afterwards, the proposed method is utilized to segment the cancer images. Finally, the extra information is eliminated by morphological operations, so as to focus on the area in which the melanoma boundary potentially exists. To do this, the World Cup Optimization (WCO) algorithm is utilized to optimize an MLP neural network (ANN). WCO is a recently presented meta-heuristic algorithm with good performance on several optimization problems. It is a derivative-free global search algorithm, mimicking the world's FIFA competitions, whereas gradient-based back-propagation is a local search. In the proposed algorithm, the multi-layer perceptron (MLP) network handles the problem's constraints and the WCO algorithm attempts to minimize the root mean square error. Experimental results show that the proposed method improves the performance of the standard MLP algorithm significantly.

  9. Optimizing pulse-pileup detection for soft-x-ray spectroscopy

    International Nuclear Information System (INIS)

    Greenberger, A.J.

    1981-04-01

    The problem of optimizing detection of the pileup of randomly occurring exponential tail pulses in white noise is considered. An attempt is made to reduce the process to an algorithm that could practically be performed in real time. Quantitative estimates are made for the performance of such an optimum detector. The relation to a more general pattern recognition problem is mentioned

  10. Optimized enrichment for the detection of Escherichia coli O26 in French raw milk cheeses.

    Science.gov (United States)

    Savoye, F; Rozand, C; Bouvier, M; Gleizal, A; Thevenot, D

    2011-06-01

    Our main objective was to optimize the enrichment of Escherichia coli O26 in raw milk cheeses for their subsequent detection with a new automated immunological method. Ten enrichment broths were tested for the detection of E. coli O26. Two categories of experimentally inoculated raw milk cheeses, semi-hard uncooked cheese and 'Camembert' type cheese, were initially used to investigate the relative efficacy of the different enrichments. The enrichments that were considered optimal for the growth of E. coli O26 in these cheeses were then challenged with other types of raw milk cheeses. Buffered peptone water supplemented with cefixime-tellurite and acriflavin was shown to optimize the growth of E. coli O26 artificially inoculated in the cheeses tested. Despite the low inoculum level (1-10 CFU per 25 g) in the cheeses, E. coli O26 counts reached at least 5×10^4 CFU ml^-1 after 24-h incubation at 41.5 °C in this medium. All the experimentally inoculated cheeses were found positive by the immunological method in the enrichment broth selected. Optimized E. coli O26 enrichment and rapid detection constitute the first steps of a complete procedure that could be used routinely to detect E. coli O26 in raw milk cheeses. © 2011 The Authors. Letters in Applied Microbiology © 2011 The Society for Applied Microbiology.

  11. Selection of an optimal neural network architecture for computer-aided detection of microcalcifications - Comparison of automated optimization techniques

    International Nuclear Information System (INIS)

    Gurcan, Metin N.; Sahiner, Berkman; Chan, Heang-Ping; Hadjiiski, Lubomir; Petrick, Nicholas

    2001-01-01

    Many computer-aided diagnosis (CAD) systems use neural networks (NNs) for either detection or classification of abnormalities. Currently, most NNs are 'optimized' by manual search in a very limited parameter space. In this work, we evaluated the use of automated optimization methods for selecting an optimal convolution neural network (CNN) architecture. Three automated methods, the steepest descent (SD), the simulated annealing (SA), and the genetic algorithm (GA), were compared. We used as an example the CNN that classifies true and false microcalcifications detected on digitized mammograms by a prescreening algorithm. Four parameters of the CNN architecture were considered for optimization, the numbers of node groups and the filter kernel sizes in the first and second hidden layers, resulting in a search space of 432 possible architectures. The area A_z under the receiver operating characteristic (ROC) curve was used to design a cost function. The SA experiments were conducted with four different annealing schedules. Three different parent selection methods were compared for the GA experiments. An available data set was split into two groups with approximately equal numbers of samples. By using the two groups alternately for training and testing, two different cost surfaces were evaluated. For the first cost surface, the SD method was trapped in a local minimum 91% (392/432) of the time. The SA using the Boltzmann schedule selected the best architecture after evaluating, on average, 167 architectures. The GA achieved its best performance with linearly scaled roulette-wheel parent selection; however, it evaluated 391 different architectures, on average, to find the best one. The second cost surface contained no local minimum. For this surface, a simple SD algorithm could quickly find the global minimum, but the SA with the very fast reannealing schedule was still the most efficient. The same SA scheme, however, was trapped in a local minimum on the first cost
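    Simulated annealing over a discrete architecture grid of this kind can be sketched as follows. The grid sizes (chosen to give 432 configurations), the synthetic cost surface standing in for 1 - A_z, and the annealing schedule are all hypothetical; training a CNN per configuration is replaced by a closed-form cost:

```python
import math
import random

# Hypothetical discrete search space (6 * 6 * 3 * 4 = 432 architectures),
# standing in for {node groups, kernel sizes} in the two hidden layers.
space = [6, 6, 3, 4]

def cost(cfg):
    """Synthetic stand-in for 1 - A_z: smooth, minimized at a known optimum."""
    target = (4, 2, 1, 3)
    return sum((a - b) ** 2 for a, b in zip(cfg, target)) / 10.0

def anneal(seed, steps=600, t0=2.0, alpha=0.98):
    rng = random.Random(seed)
    cfg = tuple(rng.randrange(n) for n in space)
    best, best_c = cfg, cost(cfg)
    c, t = best_c, t0
    for _ in range(steps):
        i = rng.randrange(4)                 # perturb one parameter by +/- 1
        cand = list(cfg)
        cand[i] = min(max(cand[i] + rng.choice((-1, 1)), 0), space[i] - 1)
        cand = tuple(cand)
        cc = cost(cand)
        # Metropolis acceptance: always take improvements, sometimes worse moves.
        if cc <= c or rng.random() < math.exp((c - cc) / t):
            cfg, c = cand, cc
            if c < best_c:
                best, best_c = cfg, c
        t *= alpha                           # geometric cooling schedule
    return best, best_c

best, best_c = min((anneal(s) for s in range(5)), key=lambda r: r[1])
print(best, best_c)
```

    With a few random restarts the annealer reliably reaches the global optimum of this smooth toy surface; the paper's interest is precisely the harder case where the real cost surface has local minima that trap steepest descent.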

  12. Optimal Detection of a Localized Perturbation in Random Networks of Integrate-and-Fire Neurons

    Science.gov (United States)

    Bernardi, Davide; Lindner, Benjamin

    2017-06-01

    Experimental and theoretical studies suggest that cortical networks are chaotic and coding relies on averages over large populations. However, there is evidence that rats can respond to the short stimulation of a single cortical cell, a theoretically unexplained fact. We study effects of single-cell stimulation on a large recurrent network of integrate-and-fire neurons and propose a simple way to detect the perturbation. Detection rates obtained from simulations and analytical estimates are similar to experimental response rates if the readout is slightly biased towards specific neurons. Near-optimal detection is attained for a broad range of intermediate values of the mean coupling between neurons.

  13. Adjoint based optimal control of partially miscible two-phase flow in porous media with applications to CO2 sequestration in underground reservoirs

    KAUST Repository

    Simon, Moritz; Ulbrich, Michael

    2014-01-01

    The goal is to maximize the amount of trapped CO2 in an underground reservoir after a fixed period of CO2 injection, while time-dependent injection rates in multiple wells are used as control parameters. We describe the governing two-phase two-component Darcy flow PDE.

  14. Tailor-made surfactants for optimized chemical EOR. Meeting oil reservoir conditions by applied knowledge of structure-performance relationship in extended surfactants

    Energy Technology Data Exchange (ETDEWEB)

    Trahan, G.; Sorensen, W. [Sasol North America Inc., Westlake, LA (United States); Jakobs-Sauter, B. [Sasol Germany GmbH (Germany)

    2013-08-01

    Formulating the surfactant package for chemical EOR is a time-consuming and expensive process: the formulation needs to fit the specific reservoir conditions (oil type, temperature, salinity, etc.) to give optimum performance, and the number of formulation variables is virtually endless. This paper studies the impact of surfactant structure on EOR formulation ability and performance, and how to adjust the structure of the surfactant molecule to meet a specific reservoir's needs. Data from salinity phase boundary studies of alcohol propoxy sulfates illustrate how changes in alcohol structure as well as in propylene oxide level can shift optimum salinity and temperature to the desired range in a given model oil. From these data the impact of individual structural units was evaluated. Application of the HLD (Hydrophilic-Lipophilic Deviation) model shows how to extrapolate from the known data set to actual reservoir conditions. This is illustrated by studies on crude oil samples. Additional tests examine how effectively the selected surfactants perform. The HLD concept proves to be a valuable tool for selecting and tailoring surfactants to individual reservoir needs, thus simplifying the surfactant screening process for EOR formulations through pre-selection of suitable structures and ultimately reducing the cost and effort on the way to the most effective chemical EOR package. (orig.)
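The HLD framework mentioned here can be illustrated with the commonly cited form for ionic surfactants, HLD = ln(S) − k·EACN − αT·(T − 25) + Cc; setting HLD = 0 gives the optimum salinity. The constants k, αT and the characteristic-curvature value Cc below are illustrative placeholders, not fitted values from the paper.

```python
import math

def hld(salinity, eacn, temp_c, cc, k=0.17, alpha_t=0.01):
    """Hydrophilic-Lipophilic Deviation for an ionic surfactant.

    HLD = ln(S) - k*EACN - alpha_T*(T - 25) + Cc
    k, alpha_t and the characteristic curvature Cc are illustrative;
    real values are fitted per surfactant family.
    """
    return math.log(salinity) - k * eacn - alpha_t * (temp_c - 25.0) + cc

def optimal_salinity(eacn, temp_c, cc, k=0.17, alpha_t=0.01):
    """Salinity at which HLD = 0 (the middle-phase 'optimum' condition)."""
    return math.exp(k * eacn + alpha_t * (temp_c - 25.0) - cc)

# Example: a lighter oil (lower EACN) shifts the optimum salinity down,
# which is the kind of structure-performance shift the paper maps out.
s_heavy = optimal_salinity(eacn=12, temp_c=60, cc=-1.0)
s_light = optimal_salinity(eacn=8, temp_c=60, cc=-1.0)
print(round(s_heavy, 2), round(s_light, 2))
```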

  15. Optimization of a method for the detection of immunopotentiating antibodies against serotype 1 of dengue virus

    International Nuclear Information System (INIS)

    Soto Garita, Claudio

    2014-01-01

    An immunopotentiation assay was optimized for the detection and semi-quantification of immunopotentiating antibodies against dengue virus serotype 1, using sera from dengue-seropositive patients from endemic areas of Costa Rica. The K562 cell line (human erythromyeloblastoid leukemia cells) proved more efficient than U937 (human histiocytic lymphoma cells), allowing more reliable detection of immunopotentiating antibodies. Optimal infection and virus-antibody incubation parameters were established for detecting immunopotentiating antibodies with the immunostaining technique. The optimized assay allowed the detection and semi-quantification of immunopotentiating antibodies against dengue virus serotype 1 in strongly positive, weakly positive and dengue-negative serum samples, which were analyzed to evaluate its usefulness. The presence of immunopotentiating antibodies against dengue virus serotype 1 was demonstrated in endemic zones of Costa Rica; a complementary evaluation of the other existing serotypes is recommended.

  16. Optimal Attack Strategies Subject to Detection Constraints Against Cyber-Physical Systems

    International Nuclear Information System (INIS)

    Chen, Yuan; Kar, Soummya; Moura, Jose M. F.

    2017-01-01

    This paper studies an attacker against a cyber-physical system (CPS) whose goal is to move the state of the CPS to a target state while ensuring that his or her probability of being detected does not exceed a given bound. The attacker's probability of being detected is related to the nonnegative bias induced by his or her attack on the CPS's detection statistic. We formulate a linear quadratic cost function that captures the attacker's control goal and establish constraints on the induced bias that reflect the attacker's detection-avoidance objectives. When the attacker is constrained to be detected at the false-alarm rate of the detector, we show that the optimal attack strategy reduces to a linear feedback of the attacker's state estimate. In the case that the attacker's bias is upper bounded by a positive constant, we provide two algorithms, an optimal algorithm and a sub-optimal but less computationally intensive one, to find suitable attack sequences. Lastly, we illustrate our attack strategies in numerical examples based on a remotely controlled helicopter under attack.

  17. Optimal Joint Detection and Estimation That Maximizes ROC-Type Curves.

    Science.gov (United States)

    Wunderlich, Adam; Goossens, Bart; Abbey, Craig K

    2016-09-01

    Combined detection-estimation tasks are frequently encountered in medical imaging. Optimal methods for joint detection and estimation are of interest because they provide upper bounds on observer performance, and can potentially be utilized for imaging system optimization, evaluation of observer efficiency, and development of image formation algorithms. We present a unified Bayesian framework for decision rules that maximize receiver operating characteristic (ROC)-type summary curves, including ROC, localization ROC (LROC), estimation ROC (EROC), free-response ROC (FROC), alternative free-response ROC (AFROC), and exponentially-transformed FROC (EFROC) curves, succinctly summarizing previous results. The approach relies on an interpretation of ROC-type summary curves as plots of an expected utility versus an expected disutility (or penalty) for signal-present decisions. We propose a general utility structure that is flexible enough to encompass many ROC variants and yet sufficiently constrained to allow derivation of a linear expected utility equation that is similar to that for simple binary detection. We illustrate our theory with an example comparing decision strategies for joint detection-estimation of a known signal with unknown amplitude. In addition, building on insights from our utility framework, we propose new ROC-type summary curves and associated optimal decision rules for joint detection-estimation tasks with an unknown, potentially-multiple, number of signals in each observation.

  18. Robust Fault Detection for a Class of Uncertain Nonlinear Systems Based on Multiobjective Optimization

    Directory of Open Access Journals (Sweden)

    Bingyong Yan

    2015-01-01

    A robust fault detection scheme for a class of nonlinear systems with uncertainty is proposed. The proposed approach utilizes robust control theory and a parameter optimization algorithm to design the gain matrix of a fault tracking approximator (FTA) for fault detection. The gain matrix of the FTA is designed to minimize the effects of system uncertainty on residual signals while maximizing the effects of system faults on residual signals. The design of the gain matrix takes into account the robustness of residual signals to system uncertainty and the sensitivity of residual signals to system faults simultaneously, which leads to a multiobjective optimization problem. Then, the detectability of system faults is rigorously analyzed by investigating the threshold of residual signals. Finally, simulation results are provided to show the validity and applicability of the proposed approach.

  19. Improvement in minimum detectable activity for low energy gamma by optimization in counting geometry

    Directory of Open Access Journals (Sweden)

    Anil Gupta

    2017-01-01

    Gamma spectrometry of environmental samples with low specific activities demands low minimum detection levels of measurement. An attempt has been made to lower the gamma detection level by optimizing the sample geometry without compromising the sample size. The 50–200 keV gamma energy range was chosen for the study, since low-energy gamma photons suffer the most self-attenuation within the matrix. The simulation study was carried out using the MCNP-based software "EffCalcMC" for a silica matrix and cylindrical geometries. A 250 ml sample geometry of 9 cm diameter was identified as the most suitable geometry, against the in-practice 7 cm diameter geometry of the same volume. An increase in efficiency of 10%–23% was observed over the 50–200 keV gamma energy range, and a correspondingly lower minimum detectable activity of 9%–20% could be achieved.
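The efficiency-to-MDA relationship this record reports can be sketched with the standard Currie formulation, in which the minimum detectable activity scales inversely with detection efficiency. All numerical inputs below (background counts, efficiency, emission probability, counting time, sample mass) are illustrative assumptions, not values from the study.

```python
import math

def currie_mda(background_counts, efficiency, emission_prob, count_time_s, mass_kg):
    """Currie detection-limit MDA (Bq/kg) for one gamma line.

    MDA = (2.71 + 4.65 * sqrt(B)) / (eps * P_gamma * t * m),
    where B is the background count under the peak.
    """
    detection_limit = 2.71 + 4.65 * math.sqrt(background_counts)
    return detection_limit / (efficiency * emission_prob * count_time_s * mass_kg)

# A 20% relative gain in efficiency lowers the MDA by the factor 1/1.20,
# i.e. a roughly 17% reduction, in line with the 9%-20% range reported.
base = currie_mda(400, efficiency=0.05, emission_prob=0.85,
                  count_time_s=60000, mass_kg=0.25)
opt = currie_mda(400, efficiency=0.05 * 1.20, emission_prob=0.85,
                 count_time_s=60000, mass_kg=0.25)
print(base, opt, 1 - opt / base)
```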

  20. A simple optimization can improve the performance of single feature polymorphism detection by Affymetrix expression arrays

    Directory of Open Access Journals (Sweden)

    Fujisawa Hironori

    2010-05-01

    Background: High-density oligonucleotide arrays are effective tools for genotyping numerous loci simultaneously. In small genome species (genome size: Results: We compared the single feature polymorphism (SFP) detection performance of whole-genome and transcript hybridizations using the Affymetrix GeneChip® Rice Genome Array, using the rice cultivars with full genome sequence, the japonica cultivar Nipponbare and the indica cultivar 93-11. Both genomes were surveyed for all probe target sequences. Only completely matched 25-mer single-copy probes of the Nipponbare genome were extracted, and SFPs between them and 93-11 sequences were predicted. We investigated optimum conditions for SFP detection in both whole-genome and transcript hybridization using differences between perfect-match and mismatch probe intensities of non-polymorphic targets, assuming that these differences are representative of those between mismatch and perfect targets. Several statistical methods of SFP detection by whole-genome hybridization were compared under the optimized conditions. Causes of false positives and negatives in SFP detection in both types of hybridization were investigated. Conclusions: The optimizations allowed a more than 20% increase in true SFP detection in whole-genome hybridization and a large improvement of SFP detection performance in transcript hybridization. Significance analysis of the microarray for log-transformed raw intensities of PM probes gave the best performance in whole-genome hybridization, and 22,936 true SFPs were detected with 23.58% false positives. For transcript hybridization, stable SFP detection was achieved for highly expressed genes, and about 3,500 SFPs were detected at a high sensitivity (>50%) in both shoot and young panicle transcripts. The high SFP detection performance of both genome and transcript hybridizations indicated that microarrays of a complex genome (e.g., of Oryza sativa) can be

  1. An Optimized Clustering Approach for Automated Detection of White Matter Lesions in MRI Brain Images

    Directory of Open Access Journals (Sweden)

    M. Anitha

    2012-04-01

    White Matter Lesions (WMLs) are small areas of dead cells found in parts of the brain. In general, it is difficult for medical experts to accurately quantify WMLs due to the decreased contrast between White Matter (WM) and Grey Matter (GM). The aim of this paper is to automatically detect the White Matter Lesions present in the brains of elderly people. The WML detection process includes the following stages: 1. image preprocessing; 2. clustering (Fuzzy C-Means clustering (FCM), Geostatistical Possibilistic Clustering (GPC) and Geostatistical Fuzzy Clustering (GFCM)); and 3. optimization using Particle Swarm Optimization (PSO). The proposed system is tested on a database of 208 MRI images. GFCM yields a high sensitivity of 89%, specificity of 94% and overall accuracy of 93% over FCM and GPC. The clustered brain images are then subjected to Particle Swarm Optimization (PSO). The optimized result obtained from GFCM-PSO provides a sensitivity of 90%, specificity of 94% and accuracy of 95%. The detection results reveal that GFCM and GFCM-PSO better localize the large regions of lesions and give a lower false positive rate compared to GPC and GPC-PSO, which capture the largest loads of WMLs only in the upper ventral horns of the brain.

  2. Hybrid Bacterial Foraging and Particle Swarm Optimization for detecting Bundle Branch Block.

    Science.gov (United States)

    Kora, Padmavathi; Kalva, Sri Ramakrishna

    2015-01-01

    Abnormal cardiac beat identification is a key process in the detection of heart diseases. Our present study describes a procedure for the detection of left and right bundle branch block (LBBB and RBBB) Electrocardiogram (ECG) patterns. The electrical impulses that control the cardiac beat face difficulty in moving inside the heart. This problem is termed bundle branch block (BBB). BBB makes it harder for the heart to pump blood effectively through the heart circulatory system. ECG feature extraction is a key process in detecting heart ailments. Our present study proposes a hybrid method combining two heuristic optimization methods: Bacterial Foraging Optimization (BFO) and Particle Swarm Optimization (PSO) for the feature selection of ECG signals. One of the major controlling forces of the BFO algorithm is the chemotactic movement of a bacterium that models a test solution. The chemotaxis process of the BFO depends on random search directions, which may lead to a delay in achieving the global optimum solution. The hybrid technique, Bacterial Foraging-Particle Swarm Optimization (BFPSO), incorporates concepts from BFO and PSO and creates individuals in a new generation. This BFPSO method performs local search through the chemotactic movement of BFO, and the global search over the entire search domain is accomplished by a PSO operator. The BFPSO feature values are given as the input for the Levenberg-Marquardt Neural Network classifier.

  3. Optimal Seamline Detection for Orthoimage Mosaicking Based on DSM and Improved JPS Algorithm

    Directory of Open Access Journals (Sweden)

    Gang Chen

    2018-05-01

    Based on the digital surface model (DSM) and the jump point search (JPS) algorithm, this study proposed a novel approach to detect the optimal seamline for orthoimage mosaicking. By threshold segmentation, the DSM was first partitioned into ground regions and obstacle regions (e.g., buildings, trees, and cars). Then, a mathematical morphology method was used to make the edges of obstacles more prominent. Subsequently, the processed DSM was treated as a uniform-cost grid map, and the JPS algorithm was improved and employed to search for key jump points in the map. Meanwhile, the jump points were evaluated according to an optimized function, finally generating a minimum-cost path as the optimal seamline. Furthermore, the search strategy was modified to avoid search failure when the search map is completely blocked by obstacles in the search direction. A comparison of the proposed method and Dijkstra's algorithm was carried out on two groups of image data with different characteristics. Results showed the following: (1) the proposed method detected better seamlines near the centerlines of the overlap regions, crossing far fewer ground objects; (2) efficiency and resource consumption were greatly improved, since the improved JPS algorithm skips many image pixels without their being explicitly evaluated. In general, based on the DSM, the proposed method combining threshold segmentation, mathematical morphology, and the improved JPS algorithm is helpful for detecting the optimal seamline for orthoimage mosaicking.
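A minimum-cost seamline search on a uniform-cost grid can be sketched with the Dijkstra baseline the paper compares against (JPS adds jump-point pruning on top of this idea). The grid, per-cell costs, and endpoints below are illustrative assumptions.

```python
import heapq

def min_cost_path(cost_grid, start, goal):
    """Dijkstra baseline for seamline search on a cost grid.

    cost_grid[r][c] is the per-cell crossing cost (high for obstacle
    cells from the thresholded DSM); the seamline is the cheapest
    4-connected path from start to goal, including the start cell.
    """
    rows, cols = len(cost_grid), len(cost_grid[0])
    dist = {start: cost_grid[start[0]][start[1]]}
    prev = {}
    heap = [(dist[start], start)]
    while heap:
        d, (r, c) = heapq.heappop(heap)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue  # stale heap entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost_grid[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(heap, (nd, (nr, nc)))
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1], dist[goal]

# 1 = cheap ground cell, 50 = building/obstacle from the DSM threshold
grid = [
    [1, 1, 50, 1],
    [1, 1, 50, 1],
    [1, 1, 1, 1],
    [1, 50, 1, 1],
]
path, total = min_cost_path(grid, (0, 0), (0, 3))
print(total, path)
```

The seamline detours around the high-cost "building" column instead of crossing it, which is exactly the behavior the optimized cost function is meant to produce.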

  4. Computational intelligence-based optimization of maximally stable extremal region segmentation for object detection

    Science.gov (United States)

    Davis, Jeremy E.; Bednar, Amy E.; Goodin, Christopher T.; Durst, Phillip J.; Anderson, Derek T.; Bethel, Cindy L.

    2017-05-01

    Particle swarm optimization (PSO) and genetic algorithms (GAs) are two optimization techniques from the field of computational intelligence (CI) for search problems where a direct solution cannot easily be obtained. One such problem is finding an optimal set of parameters for the maximally stable extremal region (MSER) algorithm to detect areas of interest in imagery. Specifically, this paper describes the design of a GA and PSO for optimizing MSER parameters to detect stop signs in imagery produced via simulation for use in an autonomous vehicle navigation system. Several additions to the GA and PSO are required to successfully detect stop signs in simulated images. These additions are a primary focus of this paper and include: the identification of an appropriate fitness function, the creation of a variable mutation operator for the GA, an anytime algorithm modification to allow the GA to compute a solution quickly, the addition of an exponential velocity decay function to the PSO, the addition of an "execution best" omnipresent particle to the PSO, and the addition of an attractive force component to the PSO velocity update equation. Experimentation was performed with the GA using various combinations of selection, crossover, and mutation operators, and with the PSO using various combinations of neighborhood topologies, swarm sizes, cognitive influence scalars, and social influence scalars. The results of both the GA and PSO optimized parameter sets are presented. This paper details the benefits and drawbacks of each algorithm in terms of detection accuracy, execution speed, and additions required to generate successful problem-specific parameter sets.

  5. Optimization and characterization of condensation nucleation light scattering detection coupled with supercritical fluid chromatography

    Science.gov (United States)

    Yang, Shaoping

    This dissertation is an investigation of two aspects of coupling condensation nucleation light scattering detection (CNLSD) with supercritical fluid chromatography (SFC). In the first part, it was demonstrated that CNLSD was compatible with packed-column SFC using either pure CO2 or organic-solvent-modified CO2 as the mobile phase. Factors expected to affect the interface between SFC and CNLSD were optimized for the detector to reach low detection limits. With SFC using pure CO2 as the mobile phase, the detection limit of CNLSD was observed to be at low nanogram levels, the same level as flame ionization detection (FID) coupled with SFC. For SFC using modified CO2 as the mobile phase, detection limits at the picogram level were observed for CNLSD at optimal conditions, at least ten times lower than those reached by evaporative light scattering detection. In the second part, particle size distributions of aerosols produced from rapid expansion of supercritical solutions were measured with a scanning mobility particle sizer. The effects of the factors examined in the first part, for their influence on signal intensities and signal-to-noise ratios (S/N), on the particle size distributions (PSDs) of both analyte and background were investigated. Whenever possible, both particle sizes and particle numbers obtained from the PSDs were used to explain the optimization results. In general, the PSD data support the observations made in the first part. The detection limits of CNLSD obtained were much higher than predicted; the PSDs did not provide a direct explanation of this problem. The amount of analyte deposited in the transport tubing, evaporated to the gas phase, and condensed to form particles was determined experimentally. Almost no analyte was found in the gas phase, and less than 3% was found in particle form. The vast majority of the analyte was lost in the transport tubing, especially in the short distance after supercritical fluid expansion.

  6. Optimal surveillance strategy for invasive species management when surveys stop after detection.

    Science.gov (United States)

    Guillera-Arroita, Gurutzeta; Hauser, Cindy E; McCarthy, Michael A

    2014-05-01

    Invasive species are a cause for concern in natural and economic systems and require both monitoring and management. There is a trade-off between the amount of resources spent on surveying for the species and conducting early management of occupied sites, and the resources that are ultimately spent in delayed management at sites where the species was present but undetected. Previous work addressed this optimal resource allocation problem assuming that surveys continue despite detection until the initially planned survey effort is consumed. However, a more realistic scenario is often that surveys stop after detection (i.e., follow a "removal" sampling design) and then management begins. Such an approach will indicate a different optimal survey design and can be expected to be more efficient. We analyze this case and compare the expected efficiency of invasive species management programs under both survey methods. We also evaluate the impact of mis-specifying the type of sampling approach during the program design phase. We derive analytical expressions that optimize resource allocation between monitoring and management in surveillance programs when surveys stop after detection. We do this under a scenario of unconstrained resources and scenarios where survey budget is constrained. The efficiency of surveillance programs is greater if a "removal survey" design is used, with larger gains obtained when savings from early detection are high, occupancy is high, and survey costs are not much lower than early management costs at a site. Designing a surveillance program disregarding that surveys stop after detection can result in an efficiency loss. Our results help guide the design of future surveillance programs for invasive species. Addressing program design within a decision-theoretic framework can lead to a better use of available resources. We show how species prevalence, its detectability, and the benefits derived from early detection can be considered.
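The core trade-off of the removal design can be sketched numerically: if surveys stop at the first detection, the expected number of visits at an occupied site falls below the planned maximum, which is where the efficiency gain comes from. The per-visit detectability used below is an illustrative assumption, not a value from the paper.

```python
def expected_visits_removal(p_detect, max_visits):
    """Expected number of visits at an occupied site when surveying
    stops at the first detection (removal design), capped at max_visits."""
    expected, miss = 0.0, 1.0
    for k in range(1, max_visits + 1):
        expected += k * miss * p_detect   # detected exactly on visit k
        miss *= 1.0 - p_detect
    expected += max_visits * miss         # never detected: all visits used
    return expected

def prob_detected(p_detect, max_visits):
    """Probability the species is detected at least once if present."""
    return 1.0 - (1.0 - p_detect) ** max_visits

# With per-visit detectability 0.5 and 4 planned visits, the removal
# design spends fewer visits on average than the fixed design's 4,
# while the overall detection probability is unchanged.
print(expected_visits_removal(0.5, 4), prob_detected(0.5, 4))
```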

  7. A Time-Domain Structural Damage Detection Method Based on Improved Multiparticle Swarm Coevolution Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Shao-Fei Jiang

    2014-01-01

    Optimization techniques have been applied to structural health monitoring and damage detection of civil infrastructures for two decades. The standard particle swarm optimization (PSO) easily falls into local optima, and the same deficiency exists in multiparticle swarm coevolution optimization (MPSCO). This paper first presents an improved MPSCO algorithm (IMPSCO) and then integrates it with Newmark's algorithm to localize and quantify structural damage using a proposed damage threshold. To validate the proposed method, a numerical simulation and an experimental study of a seven-story steel frame were employed, and a comparison was made between the proposed method and the genetic algorithm (GA). The results are threefold: (1) the proposed method is not only capable of localizing and quantifying damage, but also has good noise tolerance; (2) the damage location can be accurately detected using the proposed damage threshold; and (3) compared with the GA, the IMPSCO algorithm is more efficient and accurate for damage detection problems in general. This implies that the proposed method is applicable and effective in the community of damage detection and structural health monitoring.

  8. A hardware-algorithm co-design approach to optimize seizure detection algorithms for implantable applications.

    Science.gov (United States)

    Raghunathan, Shriram; Gupta, Sumeet K; Markandeya, Himanshu S; Roy, Kaushik; Irazoqui, Pedro P

    2010-10-30

    Implantable neural prostheses that deliver focal electrical stimulation upon demand are rapidly emerging as an alternate therapy for roughly a third of the epileptic patient population that is medically refractory. Seizure detection algorithms enable feedback mechanisms to provide focally and temporally specific intervention. Real-time feasibility and computational complexity often limit most reported detection algorithms to implementations using computers for bedside monitoring or external devices communicating with the implanted electrodes. A comparison of algorithms based on detection efficacy does not present a complete picture of the feasibility of the algorithm with limited computational power, as is the case with most battery-powered applications. We present a two-dimensional design optimization approach that takes into account both detection efficacy and hardware cost in evaluating algorithms for their feasibility in an implantable application. Detection features are first compared for their ability to detect electrographic seizures from micro-electrode data recorded from kainate-treated rats. Circuit models are then used to estimate the dynamic and leakage power consumption of the compared features. A score is assigned based on detection efficacy and the hardware cost for each of the features, then plotted on a two-dimensional design space. An optimal combination of compared features is used to construct an algorithm that provides maximal detection efficacy per unit hardware cost. The methods presented in this paper would facilitate the development of a common platform to benchmark seizure detection algorithms for comparison and feasibility analysis in the next generation of implantable neuroprosthetic devices to treat epilepsy. Copyright © 2010 Elsevier B.V. All rights reserved.
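The efficacy-per-unit-hardware-cost ranking this abstract describes can be sketched as a simple scoring function over candidate detection features. The feature names, efficacy values, and power figures below are hypothetical, not measurements from the paper.

```python
def rank_features(features, w_efficacy=1.0, w_power=1.0):
    """Rank detection features by detection efficacy per unit hardware
    cost (here, estimated power draw). Each feature is a tuple
    (name, detection_efficacy, power_uW); weights are illustrative."""
    ranked = sorted(
        features,
        key=lambda f: (w_efficacy * f[1]) / (w_power * f[2]),
        reverse=True,
    )
    return [name for name, _, _ in ranked]

# Hypothetical candidate features for an implantable seizure detector:
features = [
    ("line_length", 0.90, 2.0),     # good efficacy, cheap to compute
    ("wavelet_energy", 0.95, 8.0),  # slightly better, far costlier
    ("mean_amplitude", 0.60, 2.0),  # cheap but much weaker
]
print(rank_features(features))
```

The point of the two-dimensional design space is visible even in this toy ranking: the costliest feature is not the best choice once hardware cost is in the denominator.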

  9. Naturally fractured reservoirs: Optimized E and P strategies using a reaction-transport-mechanical simulator in an integrated approach. Annual report, 1996--1997

    Energy Technology Data Exchange (ETDEWEB)

    Hoak, T.; Jenkins, R. [Science Applications International Corp., McLean, VA (United States); Ortoleva, P.; Ozkan, G.; Shebl, M.; Sibo, W.; Tuncay, K. [Laboratory for Computational Geodynamics (United States); Sundberg, K. [Phillips Petroleum Company (United States)

    1998-07-01

    The methodology and results of this project are being tested using the Andector-Goldsmith Field in the Permian Basin, West Texas. The study area includes the Central Basin Platform and the Midland Basin. The Andector-Goldsmith Field lies at the juncture of these two zones in the greater West Texas Permian Basin. Although the modeling is being conducted in this area, the results have widespread applicability to other fractured carbonate and other reservoirs throughout the world.

  10. Game theory and extremal optimization for community detection in complex dynamic networks.

    Science.gov (United States)

    Lung, Rodica Ioana; Chira, Camelia; Andreica, Anca

    2014-01-01

    The detection of evolving communities in dynamic complex networks is a challenging problem that recently received attention from the research community. Dynamics clearly add another complexity dimension to the difficult task of community detection. Methods should be able to detect changes in the network structure and produce a set of community structures corresponding to different timestamps and reflecting the evolution in time of network data. We propose a novel approach based on game theory elements and extremal optimization to address dynamic community detection. Thus, the problem is formulated as a mathematical game in which nodes take the role of players that seek to choose a community that maximizes their profit, viewed as a fitness function. Numerical results obtained for both synthetic and real-world networks illustrate the competitive performance of this game theoretical approach.
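The game formulation, where each node picks the community that maximizes its payoff, can be sketched with a best-response iteration. This is a label-propagation-style simplification, not the paper's extremal-optimization scheme; the payoff (number of neighbors sharing a label) and the example graph are assumptions.

```python
def best_response_communities(adj, max_rounds=20):
    """Each node greedily switches to the community label that maximizes
    its payoff, here the count of neighbors sharing that label, iterating
    until no node wants to deviate (a Nash-like equilibrium)."""
    label = {v: v for v in adj}  # every node starts in its own community
    for _ in range(max_rounds):
        changed = False
        for v in sorted(adj):
            counts = {}
            for u in adj[v]:
                counts[label[u]] = counts.get(label[u], 0) + 1
            if not counts:
                continue
            # Deterministic tie-break: smallest label among the maxima.
            best = max(sorted(counts), key=lambda c: counts[c])
            if counts[best] > counts.get(label[v], 0):
                label[v], changed = best, True
        if not changed:
            break  # equilibrium: no profitable deviation remains
    return label

# Two triangles joined by a bridge edge; nodes inside the same triangle
# always converge to a common label under this payoff.
adj = {
    0: [1, 2], 1: [0, 2], 2: [0, 1, 3],
    3: [2, 4, 5], 4: [3, 5], 5: [3, 4],
}
labels = best_response_communities(adj)
print(labels)
```

A payoff this simple can merge weakly separated communities across the bridge; the extremal-optimization component in the paper exists precisely to escape such poor equilibria.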

  11. Simplified Swarm Optimization-Based Function Module Detection in Protein–Protein Interaction Networks

    Directory of Open Access Journals (Sweden)

    Xianghan Zheng

    2017-04-01

    Proteomics research has become one of the most important topics in life science and natural science. At present, research on protein–protein interaction networks (PPIN) mainly focuses on detecting protein complexes or function modules. However, existing approaches are either ineffective or incomplete. In this paper, we investigate detection mechanisms for functional modules in PPIN, including open databases, existing detection algorithms, and recent solutions. After that, we describe the proposed approach based on the simplified swarm optimization (SSO) algorithm and knowledge from the Gene Ontology (GO). The proposed solution implements the SSO algorithm for clustering proteins with similar function, and incorporates biological gene ontology knowledge to further identify function complexes and improve detection accuracy. Furthermore, we use four different categories of species datasets for the experiments: fruitfly, mouse, scere, and human. The testing and analysis results show that the proposed solution is feasible and efficient, and achieves a higher prediction accuracy than existing approaches.

  12. Fast detection of genetic information by an optimized PCR in an interchangeable chip.

    KAUST Repository

    Wu, Jinbo

    2012-02-01

    In this paper, we report the construction of a polymerase chain reaction (PCR) device for fast amplification and detection of DNA. The device consists of an interchangeable PCR chamber, a temperature control component and an optical detection system. DNA amplification takes place on an interchangeable chip with volumes as low as 1.25 μl, while heating and cooling rates as fast as 12.7°C/second ensured that only 25 min were needed to complete a 35-cycle PCR amplification. An optimized PCR protocol with a two-temperature approach for denaturing and annealing (Td and Ta) was also formulated for the PCR chip, with which amplification of the male-specific sex-determining region Y (SRY) gene marker from raw saliva was successfully achieved, and the genetic identification was detected in situ right after PCR by the optical detection system.

  13. Optimal reconstructed section thickness for the detection of liver lesions with multidetector CT

    International Nuclear Information System (INIS)

    Soo, G.; Lau, K.K.; Yik, T.; Kutschera, P.

    2010-01-01

    Aim: To evaluate the impact of different reconstructed section thicknesses on liver lesion detection using multidetector computed tomography (CT). Methods: Fifty-three patients were examined using a 16-section CT machine with axial reconstructions provided at 2.5, 5, 7.5, and 10 mm section thicknesses. Images of different reconstructed section thicknesses from different patients were presented in random order to three independent, blinded radiologists for review at multiple sessions. All images were then reviewed by three radiologists in a common session. Consensus was reached following review of the previous interpretation results and results of follow-up imaging regarding the number of true liver lesions (n = 101) for comparison. Results: Mean detection rates were as follows: 93/101 lesions detected with the 2.5 mm section thickness, 98/101 lesions detected at the 5 mm section thickness, 78/101 lesions detected at the 7.5 mm section thickness, and 54/101 lesions detected at the 10 mm section thickness. Lesions missed at the 2.5 mm section thickness were due to masking by image noise. There was particular difficulty detecting subcapsular lesions and lesions adjacent to fissures or the gall bladder at the 7.5 mm and 10 mm section thicknesses. Conclusion: The optimal reconstructed section thickness for lesion detection in the liver was 5 mm.

  14. When is a species declining? Optimizing survey effort to detect population changes in reptiles.

    Directory of Open Access Journals (Sweden)

    David Sewell

    Full Text Available Biodiversity monitoring programs need to be designed so that population changes can be detected reliably. This can be problematical for species that are cryptic and have imperfect detection. We used occupancy modeling and power analysis to optimize the survey design for reptile monitoring programs in the UK. Surveys were carried out six times a year in 2009-2010 at multiple sites. Four of the six species--grass snake, adder, common lizard, and slow-worm--were encountered during every survey from March-September. The exceptions were the two rarest species--sand lizard and smooth snake--which were not encountered in July 2009 and March 2010, respectively. The most frequently encountered and most easily detected species was the slow-worm. For the four widespread reptile species in the UK, three to four survey visits that used a combination of directed transect walks and artificial cover objects resulted in 95% certainty that a species would be detected if present. Using artificial cover objects was an effective detection method for most species, considerably increased the detection rate of some, and reduced misidentifications. To achieve an 85% power to detect a decline in any of the four widespread species when the true decline is 15%, three surveys at a total of 886 sampling sites, or four surveys at a total of 688 sites, would be required. The sampling effort needed reduces to 212 sites surveyed three times, or 167 sites surveyed four times, if the target is to detect a true decline of 30% with the same power. The results obtained can be used to refine reptile survey protocols in the UK and elsewhere. On a wider scale, the occupancy study design approach can be used to optimize survey effort and help set targets for conservation outcomes for regional or national biodiversity assessments.
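
    The trade-off above between per-visit detectability and the number of survey visits can be sketched with a short calculation (an illustrative aid, not the authors' occupancy-modeling code; the per-visit detection probability below is a made-up example value):

```python
import math

def cumulative_detection_prob(p_visit, n_visits):
    """Probability of detecting a species that occupies a site at least
    once in n_visits independent surveys, given per-visit probability."""
    return 1.0 - (1.0 - p_visit) ** n_visits

def visits_needed(p_visit, target):
    """Smallest number of visits whose cumulative detection probability
    reaches the target (e.g. the 95% certainty cited in the abstract)."""
    return math.ceil(math.log(1.0 - target) / math.log(1.0 - p_visit))

# With an assumed per-visit detection probability of 0.6, four visits
# push the cumulative detection probability above 95%.
print(visits_needed(0.6, 0.95))                      # 4
print(round(cumulative_detection_prob(0.6, 4), 4))   # 0.9744
```

    The same arithmetic underlies the survey-effort figures in the abstract: more visits per site buy certainty of detection, which in turn lowers the number of sites needed for a given statistical power.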

  15. An optimized outlier detection algorithm for jury-based grading of engineering design projects

    DEFF Research Database (Denmark)

    Thompson, Mary Kathryn; Espensen, Christina; Clemmensen, Line Katrine Harder

    2016-01-01

    This work characterizes and optimizes an outlier detection algorithm to identify potentially invalid scores produced by jury members while grading engineering design projects. The paper describes the original algorithm and the associated adjudication process in detail. The impact of the various...... (the base rule and the three additional conditions) play a role in the algorithm's performance and should be included in the algorithm. Because there is significant interaction between the base rule and the additional conditions, many acceptable combinations that balance the FPR and FNR can be found......, but no true optimum seems to exist. The performance of the best optimizations and the original algorithm are similar. Therefore, it should be possible to choose new coefficient values for jury populations in other cultures and contexts logically and empirically without a full optimization as long...

  16. Segmentation of the Clustered Cells with Optimized Boundary Detection in Negative Phase Contrast Images.

    Directory of Open Access Journals (Sweden)

    Yuliang Wang

    Full Text Available Cell image segmentation plays a central role in numerous biology studies and clinical applications. As a result, the development of cell image segmentation algorithms with high robustness and accuracy is attracting more and more attention. In this study, an automated cell image segmentation algorithm is developed to improve cell boundary detection and the segmentation of clustered cells for all cells in the field of view in negative phase contrast images. A new method combining a thresholding method and an edge-based active contour method is proposed to optimize cell boundary detection. To segment clustered cells, the geographic peaks of cell light intensity are utilized to detect the number and locations of the clustered cells. In this paper, the working principles of the algorithms are described. The influence of parameters in cell boundary detection and the selection of the threshold value on the final segmentation results are investigated. Finally, the proposed algorithm is applied to negative phase contrast images from different experiments and its performance is evaluated. Results show that the proposed method can achieve optimized cell boundary detection and highly accurate segmentation of clustered cells.

  17. Segmentation of the Clustered Cells with Optimized Boundary Detection in Negative Phase Contrast Images.

    Science.gov (United States)

    Wang, Yuliang; Zhang, Zaicheng; Wang, Huimin; Bi, Shusheng

    2015-01-01

    Cell image segmentation plays a central role in numerous biology studies and clinical applications. As a result, the development of cell image segmentation algorithms with high robustness and accuracy is attracting more and more attention. In this study, an automated cell image segmentation algorithm is developed to improve cell boundary detection and the segmentation of clustered cells for all cells in the field of view in negative phase contrast images. A new method combining a thresholding method and an edge-based active contour method is proposed to optimize cell boundary detection. To segment clustered cells, the geographic peaks of cell light intensity are utilized to detect the number and locations of the clustered cells. In this paper, the working principles of the algorithms are described. The influence of parameters in cell boundary detection and the selection of the threshold value on the final segmentation results are investigated. Finally, the proposed algorithm is applied to negative phase contrast images from different experiments and its performance is evaluated. Results show that the proposed method can achieve optimized cell boundary detection and highly accurate segmentation of clustered cells.

  18. Feature selection and classifier parameters estimation for EEG signals peak detection using particle swarm optimization.

    Science.gov (United States)

    Adam, Asrul; Shapiai, Mohd Ibrahim; Tumari, Mohd Zaidi Mohd; Mohamad, Mohd Saberi; Mubin, Marizan

    2014-01-01

    Electroencephalogram (EEG) signal peak detection is widely used in clinical applications. The peak point can be detected using several approaches, including time, frequency, time-frequency, and nonlinear domains, depending on various peak features from several models. However, no study has established the importance of each peak feature in contributing to a good and generalized model. In this study, feature selection and classifier parameter estimation based on particle swarm optimization (PSO) are proposed as a framework for peak detection on EEG signals in time domain analysis. Two versions of PSO are used in the study: (1) standard PSO and (2) random asynchronous particle swarm optimization (RA-PSO). The proposed framework tries to find the best combination of all the available features that offers good peak detection and a high classification rate from the results in the conducted experiments. The evaluation results indicate that the accuracy of the peak detection can be improved up to 99.90% and 98.59% for training and testing, respectively, as compared to the framework without feature selection adaptation. Additionally, the proposed framework based on RA-PSO offers a better and more reliable classification rate than standard PSO, as it produces a low-variance model.
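
    A minimal sketch of binary PSO for feature selection, in the spirit of the framework above (an illustrative toy, not the authors' implementation; the fitness function, swarm size, and coefficients below are assumptions):

```python
import math
import random

def binary_pso(fitness, n_features, n_particles=10, n_iter=30,
               w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal binary PSO for feature selection: each particle is a 0/1
    feature mask; velocities are squashed by a sigmoid into per-bit
    selection probabilities."""
    rng = random.Random(seed)
    pos = [[rng.randint(0, 1) for _ in range(n_features)]
           for _ in range(n_particles)]
    vel = [[0.0] * n_features for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [fitness(p) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(n_features):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                prob = 1.0 / (1.0 + math.exp(-vel[i][d]))  # sigmoid squash
                pos[i][d] = 1 if rng.random() < prob else 0
            val = fitness(pos[i])
            if val > pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val > gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy fitness: agreement with a known-good mask stands in for the
# peak-detection classification rate of the paper.
target = [1, 0, 1, 1, 0, 0, 1, 0]
mask, score = binary_pso(lambda m: sum(a == b for a, b in zip(m, target)),
                         n_features=8)
```

    Since the global best is consulted as soon as any particle improves it, this loop is closer to the asynchronous RA-PSO variant than to a strictly synchronous update.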

  19. Global parameter optimization for maximizing radioisotope detection probabilities at fixed false alarm rates

    Energy Technology Data Exchange (ETDEWEB)

    Portnoy, David, E-mail: david.portnoy@jhuapl.edu [Johns Hopkins University Applied Physics Laboratory, 11100 Johns Hopkins Road, Laurel, MD 20723 (United States); Feuerbach, Robert; Heimberg, Jennifer [Johns Hopkins University Applied Physics Laboratory, 11100 Johns Hopkins Road, Laurel, MD 20723 (United States)

    2011-10-01

    Today there is a tremendous amount of interest in systems that can detect radiological or nuclear threats. Many of these systems operate in extremely high throughput situations where delays caused by false alarms can have a significant negative impact. Thus, calculating the tradeoff between detection rates and false alarm rates is critical for their successful operation. Receiver operating characteristic (ROC) curves have long been used to depict this tradeoff. The methodology was first developed in the field of signal detection. In recent years it has been used increasingly in machine learning and data mining applications. It follows that this methodology could be applied to radiological/nuclear threat detection systems. However, many of these systems do not fit into the classic principles of statistical detection theory because they tend to lack tractable likelihood functions and have many parameters, which, in general, do not have a one-to-one correspondence with the detection classes. This work proposes a strategy to overcome these problems by empirically finding parameter values that maximize the probability of detection for a selected number of probabilities of false alarm. To find these parameter values, a statistical global optimization technique that seeks to estimate portions of a ROC curve is proposed. The optimization combines elements of simulated annealing with elements of genetic algorithms. Genetic algorithms were chosen because they can reduce the risk of getting stuck in local minima. However, classic genetic algorithms operate on arrays of Boolean values or bit strings, so simulated annealing is employed to perform mutation in the genetic algorithm. The presented initial results were generated using an isotope identification algorithm developed at Johns Hopkins University Applied Physics Laboratory. The algorithm has 12 parameters: 4 real-valued and 8 Boolean. A simulated dataset was used for the optimization study; the 'threat' set of

  20. Global parameter optimization for maximizing radioisotope detection probabilities at fixed false alarm rates

    International Nuclear Information System (INIS)

    Portnoy, David; Feuerbach, Robert; Heimberg, Jennifer

    2011-01-01

    Today there is a tremendous amount of interest in systems that can detect radiological or nuclear threats. Many of these systems operate in extremely high throughput situations where delays caused by false alarms can have a significant negative impact. Thus, calculating the tradeoff between detection rates and false alarm rates is critical for their successful operation. Receiver operating characteristic (ROC) curves have long been used to depict this tradeoff. The methodology was first developed in the field of signal detection. In recent years it has been used increasingly in machine learning and data mining applications. It follows that this methodology could be applied to radiological/nuclear threat detection systems. However, many of these systems do not fit into the classic principles of statistical detection theory because they tend to lack tractable likelihood functions and have many parameters, which, in general, do not have a one-to-one correspondence with the detection classes. This work proposes a strategy to overcome these problems by empirically finding parameter values that maximize the probability of detection for a selected number of probabilities of false alarm. To find these parameter values, a statistical global optimization technique that seeks to estimate portions of a ROC curve is proposed. The optimization combines elements of simulated annealing with elements of genetic algorithms. Genetic algorithms were chosen because they can reduce the risk of getting stuck in local minima. However, classic genetic algorithms operate on arrays of Boolean values or bit strings, so simulated annealing is employed to perform mutation in the genetic algorithm. The presented initial results were generated using an isotope identification algorithm developed at Johns Hopkins University Applied Physics Laboratory. The algorithm has 12 parameters: 4 real-valued and 8 Boolean. A simulated dataset was used for the optimization study; the 'threat' set of spectra

  1. Global parameter optimization for maximizing radioisotope detection probabilities at fixed false alarm rates

    Science.gov (United States)

    Portnoy, David; Feuerbach, Robert; Heimberg, Jennifer

    2011-10-01

    Today there is a tremendous amount of interest in systems that can detect radiological or nuclear threats. Many of these systems operate in extremely high throughput situations where delays caused by false alarms can have a significant negative impact. Thus, calculating the tradeoff between detection rates and false alarm rates is critical for their successful operation. Receiver operating characteristic (ROC) curves have long been used to depict this tradeoff. The methodology was first developed in the field of signal detection. In recent years it has been used increasingly in machine learning and data mining applications. It follows that this methodology could be applied to radiological/nuclear threat detection systems. However, many of these systems do not fit into the classic principles of statistical detection theory because they tend to lack tractable likelihood functions and have many parameters, which, in general, do not have a one-to-one correspondence with the detection classes. This work proposes a strategy to overcome these problems by empirically finding parameter values that maximize the probability of detection for a selected number of probabilities of false alarm. To find these parameter values, a statistical global optimization technique that seeks to estimate portions of a ROC curve is proposed. The optimization combines elements of simulated annealing with elements of genetic algorithms. Genetic algorithms were chosen because they can reduce the risk of getting stuck in local minima. However, classic genetic algorithms operate on arrays of Boolean values or bit strings, so simulated annealing is employed to perform mutation in the genetic algorithm. The presented initial results were generated using an isotope identification algorithm developed at Johns Hopkins University Applied Physics Laboratory. The algorithm has 12 parameters: 4 real-valued and 8 Boolean. A simulated dataset was used for the optimization study; the "threat" set of spectra
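
    The hybrid idea described above, a genetic algorithm whose mutation step is driven by a simulated-annealing acceptance rule, can be sketched on a generic bit-string objective (an illustration of the general technique only, not the JHU/APL optimizer; the population size, cooling schedule, and objective are assumptions):

```python
import math
import random

def ga_sa_optimize(objective, n_bits, pop_size=20, n_gen=40,
                   t0=1.0, cooling=0.9, seed=0):
    """Genetic algorithm with simulated-annealing mutation: a bit flip
    that worsens fitness is still accepted with probability
    exp(-loss / T), and the temperature T cools each generation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    temp = t0
    for _ in range(n_gen):
        ranked = sorted(pop, key=objective, reverse=True)
        parents = ranked[:pop_size // 2]       # truncation selection
        children = [ranked[0][:]]              # elitism: keep the best
        while len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_bits)     # one-point crossover
            child = a[:cut] + b[cut:]
            trial = child[:]                   # SA-style single-bit mutation
            i = rng.randrange(n_bits)
            trial[i] ^= 1
            loss = objective(child) - objective(trial)
            if loss <= 0 or rng.random() < math.exp(-loss / temp):
                child = trial
            children.append(child)
        pop = children
        temp *= cooling
    return max(pop, key=objective)

# Toy objective (OneMax): maximize the number of set bits. In the
# paper's setting, the objective would be the estimated probability
# of detection at a fixed probability of false alarm.
best = ga_sa_optimize(sum, n_bits=16)
```

    Early in the run the high temperature lets mutations escape local optima; as it cools, the search becomes effectively greedy, mirroring the role simulated annealing plays in the paper's strategy.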

  2. Optimal Seamline Detection for Orthoimage Mosaicking by Combining Deep Convolutional Neural Network and Graph Cuts

    Directory of Open Access Journals (Sweden)

    Li Li

    2017-07-01

    Full Text Available When mosaicking orthoimages, especially in urban areas with various obvious ground objects like buildings, roads, cars or trees, the detection of optimal seamlines is one of the key technologies for creating seamless and pleasant image mosaics. In this paper, we propose a new approach to detect optimal seamlines for orthoimage mosaicking using a deep convolutional neural network (CNN) and graph cuts. Deep CNNs have been widely used in many fields of computer vision and photogrammetry in recent years, and graph cuts is one of the most widely used energy optimization frameworks. We first propose a deep CNN for land cover semantic segmentation in the overlap regions between two adjacent images. Then, the energy cost of each pixel in the overlap regions is defined based on the classification probabilities of belonging to each of the specified classes. To find the optimal seamlines globally, we fuse the CNN-classified energy costs of all pixels into the graph cuts energy minimization framework. The main advantage of our proposed method is that the pixel similarity energy costs between two images are defined using the classification results of the CNN-based semantic segmentation instead of the image information of color, gradient or texture, as traditional methods do. Another advantage of our proposed method is that the semantic information is fully used to guide the process of optimal seamline detection, which is more reasonable than only using hand-designed features to represent the image differences. Finally, the experimental results on several groups of challenging orthoimages show that the proposed method is capable of finding high-quality seamlines among urban and non-urban orthoimages, and outperforms the state-of-the-art algorithms and commercial software based on visual comparison, statistical evaluation and quantitative evaluation using the structural similarity (SSIM) index.

  3. Computing Optimal Mixed Strategies for Terrorist Plot Detection Games with the Consideration of Information Leakage

    OpenAIRE

    Li MingChu; Yang Zekun; Lu Kun; Guo Cheng

    2017-01-01

    Coordinated terrorist attacks are becoming an increasing threat to Western countries. By monitoring potential terrorists, security agencies are able to detect and destroy terrorist plots at their planning stage. An optimal monitoring strategy for the domestic security agency therefore becomes necessary. However, previous studies of monitoring strategy generation fail to consider information leakage due to hackers and insider threats. Such leakage events may lead to failure of watch...

  4. Optimal sensor placement for leakage detection and isolation in water distribution networks

    OpenAIRE

    Rosich Oliva, Albert; Sarrate Estruch, Ramon; Nejjari Akhi-Elarab, Fatiha

    2012-01-01

    In this paper, the problem of leakage detection and isolation in water distribution networks is addressed applying an optimal sensor placement methodology. The chosen technique is based on structural models and thus it is suitable to handle non-linear and large scale systems. A drawback of this technique arises when costs are assigned uniformly. A main contribution of this paper is the proposal of an iterative methodology that focuses on identifying essential sensors which ultimately leads to...

  5. Time efficient optimization of instance based problems with application to tone onset detection

    OpenAIRE

    Bauer, Nadja; Friedrichs, Klaus; Weihs, Claus

    2016-01-01

    A time efficient optimization technique for instance based problems is proposed, where for each parameter setting the target function has to be evaluated on a large set of problem instances. Computational time is reduced by beginning with a performance estimation based on the evaluation of a representative subset of instances. Subsequently, only promising settings are evaluated on the whole data set. As application a comprehensive music onset detection algorithm is introduce...

  6. Understanding the True Stimulated Reservoir Volume in Shale Reservoirs

    KAUST Repository

    Hussain, Maaruf

    2017-06-06

    hydraulic fracture were observed being stimulated. These results show the beginning of new understanding into the physical mechanisms responsible for greater disparity in stimulation results within the same shale reservoir and hence the SRV. Using the appropriate methodology, stimulation design can be controlled to optimize the responses of in-situ stresses and reservoir rock itself.

  7. Optimizing a neural network for detection of moving vehicles in video

    Science.gov (United States)

    Fischer, Noëlle M.; Kruithof, Maarten C.; Bouma, Henri

    2017-10-01

    In the field of security and defense, it is extremely important to reliably detect moving objects, such as cars, ships, drones and missiles. Detection and analysis of moving objects in cameras near borders could be helpful to reduce illicit trading, drug trafficking, irregular border crossing, trafficking in human beings and smuggling. Many recent benchmarks have shown that convolutional neural networks are performing well in the detection of objects in images. Most deep-learning research effort focuses on classification or detection on single images. However, the detection of dynamic changes (e.g., moving objects, actions and events) in streaming video is extremely relevant for surveillance and forensic applications. In this paper, we combine an end-to-end feedforward neural network for static detection with a recurrent Long Short-Term Memory (LSTM) network for multi-frame analysis. We present a practical guide with special attention to the selection of the optimizer and batch size. The end-to-end network is able to localize and recognize the vehicles in video from traffic cameras. We show an efficient way to collect relevant in-domain data for training with minimal manual labor. Our results show that the combination with LSTM improves performance for the detection of moving vehicles.

  8. Designing lymphocyte functional structure for optimal signal detection: voilà, T cells.

    Science.gov (United States)

    Noest, A J

    2000-11-21

    One basic task of immune systems is to detect signals from unknown "intruders" amidst a noisy background of harmless signals. To clarify the functional importance of many observed lymphocyte properties, I ask: What properties would a cell have if one designed it according to the theory of optimal detection, with minimal regard for biological constraints? Sparse and reasonable assumptions about the statistics of available signals prove sufficient for deriving many features of the optimal functional structure, in an incremental and modular design. The use of one common formalism guarantees that all parts of the design collaborate to solve the detection task. Detection performance is computed at several stages of the design. Comparison between design variants reveals e.g. the importance of controlling the signal integration time. This predicts that an appropriate control mechanism should exist. Comparing the design to reality, I find a striking similarity with many features of T cells. For example, the formalism dictates clonal specificity, serial receptor triggering, (grades of) anergy, negative and positive selection, co-stimulation, high-zone tolerance, and clonal production of cytokines. Serious mismatches should be found if T cells were hindered by mechanistic constraints or vestiges of their (co-)evolutionary history, but I have not found clear examples. By contrast, fundamental mismatches abound when comparing the design to immune systems of e.g. invertebrates. The wide-ranging differences seem to hinge on the (in)ability to generate a large diversity of receptors. Copyright 2000 Academic Press.

  9. Improving Accuracy of Intrusion Detection Model Using PCA and optimized SVM

    Directory of Open Access Journals (Sweden)

    Sumaiya Thaseen Ikram

    2016-06-01

    Full Text Available Intrusion detection is essential for providing security to different network domains and is mostly used for locating and tracing intruders. There are many problems with traditional intrusion detection models (IDS), such as low detection capability against unknown network attacks, a high false alarm rate and insufficient analysis capability. Hence the major scope of research in this domain is to develop an intrusion detection model with improved accuracy and reduced training time. This paper proposes a hybrid intrusion detection model integrating principal component analysis (PCA) and a support vector machine (SVM). The novelty of the paper is the optimization of the kernel parameters of the SVM classifier using an automatic parameter selection technique. This technique optimizes the punishment factor (C) and kernel parameter gamma (γ), thereby improving the accuracy of the classifier and reducing the training and testing time. The experimental results obtained on the NSL-KDD and gurekddcup datasets show that the proposed technique performs better, with higher accuracy, faster convergence speed and better generalization. Minimum resources are consumed as the classifier input requires a reduced feature set for optimum classification. A comparative analysis of hybrid models with the proposed model is also performed.
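
    One common way to realize a search over the punishment factor C and kernel parameter gamma is a log-spaced grid search (a generic sketch, not necessarily the authors' selection technique; the scoring function below is a stand-in for cross-validated classifier accuracy):

```python
import itertools
import math

def select_svm_params(evaluate, c_exponents=range(-2, 5),
                      gamma_exponents=range(-5, 2)):
    """Return the (C, gamma) pair from a log-spaced grid that maximizes
    the supplied cross-validation score evaluate(C, gamma)."""
    grid = itertools.product((10.0 ** e for e in c_exponents),
                             (10.0 ** e for e in gamma_exponents))
    return max(grid, key=lambda cg: evaluate(*cg))

# Stand-in score with a known optimum at C=10, gamma=0.01; in practice
# this would train and validate the SVM at each grid point.
score = lambda C, g: -(math.log10(C) - 1) ** 2 - (math.log10(g) + 2) ** 2
print(select_svm_params(score))  # (10.0, 0.01)
```

    Searching exponents rather than raw values matters because both C and gamma influence the SVM multiplicatively; a linear grid would waste most of its points at one end of the useful range.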

  10. Optimization of HPV DNA detection in urine by improving collection, storage, and extraction.

    Science.gov (United States)

    Vorsters, A; Van den Bergh, J; Micalessi, I; Biesmans, S; Bogers, J; Hens, A; De Coster, I; Ieven, M; Van Damme, P

    2014-11-01

    The benefits of using urine for the detection of human papillomavirus (HPV) DNA have been evaluated in disease surveillance, epidemiological studies, and screening for cervical cancers in specific subgroups. HPV DNA testing in urine is being considered for important purposes, notably the monitoring of HPV vaccination in adolescent girls and young women who do not wish to have a vaginal examination. The need to optimize and standardize sampling, storage, and processing has been reported. In this paper, we examined the impact of a DNA-conservation buffer, the extraction method, and urine sampling on the detection of HPV DNA and human DNA in urine provided by 44 women with a cytologically normal but HPV DNA-positive cervical sample. Ten women provided first-void and midstream urine samples. DNA analysis was performed using real-time PCR to allow quantification of HPV and human DNA. The results showed that an optimized method for HPV DNA detection in urine should (a) prevent DNA degradation during extraction and storage, (b) recover cell-free HPV DNA in addition to cell-associated DNA, (c) process a sufficient volume of urine, and (d) use a first-void sample. In addition, we found that detectable human DNA in urine may not be a good internal control for sample validity. HPV prevalence data that are based on urine samples collected, stored, and/or processed under suboptimal conditions may underestimate infection rates.

  11. Experimental study on the crack detection with optimized spatial wavelet analysis and windowing

    Science.gov (United States)

    Ghanbari Mardasi, Amir; Wu, Nan; Wu, Christine

    2018-05-01

    In this paper, highly sensitive crack detection is experimentally realized and presented on a beam under certain deflection by optimizing spatial wavelet analysis. Due to the presence of a crack in the beam structure, a perturbation/slope singularity is induced in the deflection profile. Spatial wavelet transformation works as a magnifier to amplify the small perturbation signal at the crack location, allowing the damage to be detected and localized. The profile of a deflected aluminum cantilever beam is obtained for both intact and cracked beams by a high-resolution laser profile sensor. Gabor wavelet transformation is applied to the subtraction of the intact and cracked data sets. To improve detection sensitivity, the scale factor in the spatial wavelet transformation and the number of transformation repetitions are optimized. Furthermore, to detect possible cracks close to the measurement boundaries, the wavelet transformation edge effect, which induces large values of the wavelet coefficient around the measurement boundaries, is efficiently reduced by introducing different windowing functions. The result shows that a small crack with a depth of less than 10% of the beam height can be localized with a clear perturbation. Moreover, the perturbation caused by a crack 0.85 mm away from one end of the measurement range, which is covered by the wavelet transform edge effect, emerges when proper window functions are applied.

  12. Optimal wavelet transform for the detection of microaneurysms in retina photographs.

    Science.gov (United States)

    Quellec, Gwénolé; Lamard, Mathieu; Josselin, Pierre Marie; Cazuguel, Guy; Cochener, Béatrice; Roux, Christian

    2008-09-01

    In this paper, we propose an automatic method to detect microaneurysms in retina photographs. Microaneurysms are the most frequent and usually the first lesions to appear as a consequence of diabetic retinopathy. So, their detection is necessary for both screening the pathology and follow up (progression measurement). Automating this task, which is currently performed manually, would bring more objectivity and reproducibility. We propose to detect them by locally matching a lesion template in subbands of wavelet transformed images. To improve the method performance, we have searched for the best adapted wavelet within the lifting scheme framework. The optimization process is based on a genetic algorithm followed by Powell's direction set descent. Results are evaluated on 120 retinal images analyzed by an expert and the optimal wavelet is compared to different conventional mother wavelets. These images are of three different modalities: there are color photographs, green filtered photographs, and angiographs. Depending on the imaging modality, microaneurysms were detected with a sensitivity of respectively 89.62%, 90.24%, and 93.74% and a positive predictive value of respectively 89.50%, 89.75%, and 91.67%, which is better than previously published methods.
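
    The lifting scheme mentioned above builds a wavelet transform from a predict step and an update step; a one-level sketch with the LeGall 5/3 coefficients is shown below (an illustration of the lifting framework itself, not the optimized wavelet found by the authors' genetic search):

```python
def lifting_forward(signal, predict=0.5, update=0.25):
    """One level of a lifting-scheme wavelet transform.
    predict=0.5, update=0.25 gives the LeGall 5/3 wavelet; the lifting
    framework lets these coefficients be tuned (as the genetic search
    in the paper does) without losing invertibility."""
    even = signal[0::2]
    odd = signal[1::2]
    n = len(odd)
    # Predict: detail = odd sample minus a prediction from even neighbours
    detail = [odd[i] - predict * (even[i] + even[min(i + 1, len(even) - 1)])
              for i in range(n)]
    # Update: smooth the even samples with the detail coefficients
    approx = [even[i] + update * (detail[max(i - 1, 0)] + detail[min(i, n - 1)])
              for i in range(len(even))]
    return approx, detail

# A locally constant signal produces zero detail coefficients, so a
# small lesion-like deviation stands out in the detail subband.
approx, detail = lifting_forward([3.0] * 8)
print(detail)  # [0.0, 0.0, 0.0, 0.0]
```

    Template matching then operates on the detail subbands, where smooth background structure has been suppressed and compact deviations such as microaneurysms remain.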

  13. Attacks and Intrusion Detection in Cloud Computing Using Neural Networks and Particle Swarm Optimization Algorithms

    Directory of Open Access Journals (Sweden)

    Ahmad Shokuh Saljoughi

    2018-01-01

    Full Text Available Today, cloud computing has become popular among users in organizations and companies. Security and efficiency are the two major issues facing cloud service providers and their customers. Since cloud computing is a virtual pool of resources provided in an open environment (the Internet), cloud-based services entail security risks. Detection of intrusions and attacks by unauthorized users is one of the biggest challenges for both cloud service providers and cloud users. In the present study, artificial intelligence techniques, e.g. MLP neural networks and the particle swarm optimization algorithm, were used to detect intrusions and attacks. The methods were tested on the NSL-KDD and KDD-CUP datasets. The results showed improved accuracy in detecting attacks and intrusions by unauthorized users.

  14. A Fluorescence Detection Module for Photodynamic Therapy Optimization by Measuring the Concentration of Photosensitizer

    International Nuclear Information System (INIS)

    Serrano Navarro, Joel; Stolik Isakina, Suren; La Rosa Vazquez, Jose M. de; Valor Reed, Alma

    2016-01-01

    In the present work, a portable fluorescence detection system designed and built for dosimetry control applications in photodynamic therapy is presented. The system excites the photosensitizer drug with a modulated laser light source and subsequently measures the radiance of the emitted fluorescent light. Since the fluorescent radiance is directly related to the photosensitizer concentration, this measurement allows real-time monitoring of the photosensitizer concentration in the treated tissue. The system is intended to permit adjusting the therapeutic regime in order to optimize the expected therapy results. In the developed system, a synchronous detection technique is employed to recover the fluorescence signals embedded in noisy backgrounds and lit environments. A scanning probe with a 405 nm diode laser is used to excite the photosensitizer, while a detection wavelength range from 590 nm to 700 nm has been implemented. (Author)

  15. A coupled classification - evolutionary optimization model for contamination event detection in water distribution systems.

    Science.gov (United States)

    Oliker, Nurit; Ostfeld, Avi

    2014-03-15

    This study describes a decision support system that issues alerts for contamination events in water distribution systems. The developed model comprises a weighted support vector machine (SVM) for the detection of outliers, followed by a sequence analysis for the classification of contamination events. The contribution of this study is an improvement in contamination event detection ability and a multi-dimensional analysis of the data, differing from the parallel one-dimensional analyses conducted so far. The multivariate analysis examines the relationships between water quality parameters and detects changes in their mutual patterns. The weights of the SVM model accomplish two goals: blurring the difference between the sizes of the two classes' data sets (as there are many more normal/regular than event time measurements), and incorporating the time factor through a time decay coefficient, ascribing higher importance to recent observations when classifying a time step measurement. All model parameters were determined by data-driven optimization, so the calibration of the model was completely autonomous. The model was trained and tested on a real water distribution system (WDS) data set with randomly simulated events superimposed on the original measurements. The model is notable for its ability to detect events that were only partly expressed in the data (i.e., affecting only some of the measured parameters). The model showed high accuracy and better detection ability compared to previous modeling attempts at contamination event detection. Copyright © 2013 Elsevier Ltd. All rights reserved.
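
    The two weighting goals described above, class-size balancing plus a time-decay coefficient favouring recent observations, can be sketched as one combined per-sample weight (a hypothetical formula for illustration; the paper's actual weights were set by data-driven optimization):

```python
import math

def sample_weight(n_class, n_total, age, decay=0.05):
    """Weight for one training measurement: a class-balance factor
    (samples of the rare event class count more) times an exponential
    time decay (recent time steps count more). `age` is measured in
    time steps; `decay` is an assumed coefficient."""
    balance = n_total / (2.0 * n_class)   # blurs the class-size gap
    return balance * math.exp(-decay * age)

# An event-class sample (50 of 10,000 measurements) observed just now
# far outweighs an old normal-class sample.
print(round(sample_weight(50, 10000, age=0), 2))     # 100.0
print(round(sample_weight(9950, 10000, age=60), 4))  # 0.025
```

    Feeding such weights into a weighted SVM makes misclassifying a rare, recent event sample far more costly than misclassifying an old routine measurement, which is the effect the abstract describes.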

  16. Detecting and locating light atoms from high-resolution STEM images: The quest for a single optimal design

    NARCIS (Netherlands)

    Gonnissen, J; De Backer, A; den Dekker, A.J.; Sijbers, J.; Van Aert, S.

    2016-01-01

    In the present paper, the optimal detector design is investigated for both detecting and locating light atoms from high resolution scanning transmission electron microscopy (HR STEM) images. The principles of detection theory are used to quantify the probability of error for the detection of

  17. Detecting and locating light atoms from high-resolution STEM images: The quest for a single optimal design.

    Science.gov (United States)

    Gonnissen, J; De Backer, A; den Dekker, A J; Sijbers, J; Van Aert, S

    2016-11-01

    In the present paper, the optimal detector design is investigated for both detecting and locating light atoms from high resolution scanning transmission electron microscopy (HR STEM) images. The principles of detection theory are used to quantify the probability of error for the detection of light atoms from HR STEM images. To determine the optimal experiment design for locating light atoms, use is made of the so-called Cramér-Rao Lower Bound (CRLB). It is investigated whether a single optimal design can be found for both the detection and location problem of light atoms. Furthermore, the incoming electron dose is optimised for both research goals and it is shown that picometre range precision is feasible for the estimation of the atom positions when using an appropriate incoming electron dose under the optimal detector settings to detect light atoms. Copyright © 2016 Elsevier B.V. All rights reserved.
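
    The dose dependence of the attainable position precision can be illustrated with a simplified bound (not the paper's full STEM image model): for a 1D Gaussian peak of width sigma estimated from N detected electrons with Poisson statistics, the Cramér-Rao lower bound on the position estimate scales as sigma/sqrt(N), so increasing the incoming dose tightens the attainable precision. The numbers are assumed for illustration.

    ```python
    import numpy as np

    sigma = 50.0                        # peak width in pm (assumed)
    for dose in (1e2, 1e4, 1e6):        # detected electrons per atom column
        crlb = sigma / np.sqrt(dose)    # lower bound on position precision, pm
        print(f"N={dose:.0e}: position precision bound ~ {crlb:.2f} pm")
    ```

    At the highest dose the bound drops into the picometre range, consistent with the feasibility claim above.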

  18. An Adaptive Cultural Algorithm with Improved Quantum-behaved Particle Swarm Optimization for Sonar Image Detection.

    Science.gov (United States)

    Wang, Xingmei; Hao, Wenqian; Li, Qiming

    2017-12-18

    This paper proposes an adaptive cultural algorithm with improved quantum-behaved particle swarm optimization (ACA-IQPSO) to detect objects in underwater sonar images. In the population space, to improve the searching ability of particles, the iteration number and the fitness values of particles are used as factors to adaptively adjust the contraction-expansion coefficient of the quantum-behaved particle swarm optimization algorithm (QPSO). The improved quantum-behaved particle swarm optimization algorithm (IQPSO) makes particles adjust their behaviour according to their quality. In the belief space, a new update strategy is adopted to update cultural individuals, following the idea of the update strategy in the shuffled frog leaping algorithm (SFLA). Moreover, to enhance the utilization of information in the population space and belief space, the acceptance function and influence function are redesigned in the new communication protocol. The experimental results show that ACA-IQPSO can obtain good clustering centres according to the grey-level distribution of underwater sonar images, and accurately complete underwater object detection. Compared with other algorithms, the proposed ACA-IQPSO has good effectiveness, excellent adaptability, a powerful searching ability and high convergence efficiency. Meanwhile, the experimental results on benchmark functions further demonstrate that the proposed ACA-IQPSO has better searching ability, convergence efficiency and stability.
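
    A minimal QPSO sketch on a benchmark function shows the role of the contraction-expansion coefficient. Here beta simply shrinks linearly with the iteration count; the ACA-IQPSO described above additionally adapts it per particle using fitness and adds a cultural belief space, neither of which is reproduced in this illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    def sphere(x):
        return float(np.sum(x * x))

    n_particles, dim, iters = 30, 5, 300
    X = rng.uniform(-5, 5, (n_particles, dim))
    pbest = X.copy()
    pbest_f = np.array([sphere(x) for x in X])
    gbest = pbest[np.argmin(pbest_f)].copy()

    for it in range(iters):
        beta = 1.0 - 0.5 * it / iters                  # contraction-expansion: 1.0 -> 0.5
        mbest = pbest.mean(axis=0)                     # mean of personal bests
        for i in range(n_particles):
            phi = rng.random(dim)
            p = phi * pbest[i] + (1 - phi) * gbest     # local attractor
            u = rng.random(dim) + 1e-12
            sign = np.where(rng.random(dim) < 0.5, 1.0, -1.0)
            X[i] = p + sign * beta * np.abs(mbest - X[i]) * np.log(1.0 / u)
            f = sphere(X[i])
            if f < pbest_f[i]:
                pbest_f[i], pbest[i] = f, X[i].copy()
        gbest = pbest[np.argmin(pbest_f)].copy()

    print(sphere(gbest))   # near 0 after convergence
    ```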

  19. Subpixel Mapping of Hyperspectral Image Based on Linear Subpixel Feature Detection and Object Optimization

    Science.gov (United States)

    Liu, Zhaoxin; Zhao, Liaoying; Li, Xiaorun; Chen, Shuhan

    2018-04-01

    Owing to the limited spatial resolution of the imaging sensor and the variability of ground surfaces, mixed pixels are widespread in hyperspectral imagery. Traditional subpixel mapping algorithms treat all mixed pixels as boundary-mixed pixels, ignoring the existence of linear subpixels. To address this problem, this paper proposes a new subpixel mapping method based on linear subpixel feature detection and object optimization. Firstly, the fraction value of each class is obtained by spectral unmixing. Secondly, candidate linear subpixel features are pre-determined based on the hyperspectral characteristics, and the remaining mixed pixels containing linear features are detected by maximum linearization index analysis. The classes of linear subpixels are then determined using a template matching method. Finally, the whole subpixel mapping result is iteratively optimized by a binary particle swarm optimization algorithm. The performance of the proposed subpixel mapping method is evaluated via experiments on simulated and real hyperspectral data sets. The experimental results demonstrate that the proposed method can improve the accuracy of subpixel mapping.

  20. Contrast based band selection for optimized weathered oil detection in hyperspectral images

    Science.gov (United States)

    Levaux, Florian; Bostater, Charles R., Jr.; Neyt, Xavier

    2012-09-01

    Hyperspectral imagery offers unique benefits for detection of land and water features due to the information contained in reflectance signatures such as the bi-directional reflectance distribution function (BRDF). The reflectance signature directly shows the relative absorption and backscattering features of targets. These features can be very useful in shoreline monitoring or surveillance applications, for example to detect weathered oil. In real-time detection applications, optimal band selection is important in order to select the essential bands using the absorption and backscatter information. In the present paper, band selection is based upon the optimization of target detection using contrast algorithms. The common definition of contrast (using only one band out of all possible combinations available within a hyperspectral image) is generalized in order to consider all possible combinations of wavelength-dependent contrasts in hyperspectral images. The inflection (defined here as an approximation of the second derivative) is also used to enhance the variations in the reflectance spectra, as well as in the contrast spectra, in order to assist in optimal band selection. The results of the selection in terms of target detection (false alarms and missed detections) are also compared with a previous feature detection method, namely the matched filter. In this paper, imagery is acquired using a pushbroom hyperspectral sensor mounted at the bow of a small vessel. The sensor is mechanically rotated using an optical rotation stage. This opto-mechanical scanning system produces hyperspectral images with pixel sizes on the order of mm to cm scales, depending upon the distance between the sensor and the shoreline being monitored. The motion of the platform during the acquisition induces distortions in the collected HSI imagery. It is therefore
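
    The single-band version of contrast-based selection can be sketched in a few lines: for each band, compute the contrast between mean target (e.g., weathered oil) and background reflectance and keep the band that maximizes it. The spectra below are synthetic stand-ins with an assumed target feature near 700-760 nm, not measured data.

    ```python
    import numpy as np

    wavelengths = np.linspace(400, 1000, 61)            # band centres, nm
    background = 0.30 + 0.05 * np.sin(wavelengths / 80) # smooth background spectrum
    target = background.copy()
    target[(wavelengths > 700) & (wavelengths < 760)] += 0.15   # assumed target feature

    # Michelson-style single-band contrast for every band
    contrast = np.abs(target - background) / (target + background)
    best_band = wavelengths[np.argmax(contrast)]
    print(f"best single band: {best_band:.0f} nm")
    ```

    Generalizing to multi-band combinations, as in the paper, amounts to scoring contrasts over pairs or tuples of bands instead of one band at a time.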

  1. Optimal Feature Space Selection in Detecting Epileptic Seizure based on Recurrent Quantification Analysis and Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Saleh LAshkari

    2016-06-01

    Full Text Available Selecting optimal features based on the nature of the phenomenon and high discriminant ability is very important in data classification problems. Since Recurrent Quantification Analysis (RQA) requires no assumptions about stationarity or the size of the signal and the noise, it may be useful for epileptic seizure detection. In this study, RQA was used to discriminate ictal EEG from normal EEG, with optimal features selected by a combination of a genetic algorithm and a Bayesian classifier. Recurrence plots of one hundred samples in each of the two categories were obtained with five distance norms: Euclidean, Maximum, Minimum, Normalized and Fixed Norm. In order to choose the optimal threshold for each norm, ten thresholds of ε were generated, and the best feature space was then selected by the genetic algorithm in combination with the Bayesian classifier. The results show that the proposed method is capable of discriminating ictal EEG from normal EEG; for the Minimum norm and 0.1˂ε˂1, accuracy was 100%. In addition, the sensitivity of the proposed framework to the ε and distance norm parameters was low. The optimal feature presented in this study is Trans, which was selected in most feature spaces with high accuracy.
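
    The recurrence matrix underlying RQA can be sketched directly: two points of the delay-embedded signal are "recurrent" when their distance under a chosen norm falls below a threshold ε. The norm and ε mirror the parameters scanned above; the signal, embedding parameters, and threshold below are illustrative assumptions (a noisy sine, not EEG).

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    t = np.linspace(0, 8 * np.pi, 400)
    x = np.sin(t) + 0.05 * rng.normal(size=t.size)       # stand-in signal

    # Delay embedding (dimension 2, delay 10) and recurrence matrix, "Maximum" norm
    delay, eps = 10, 0.3
    emb = np.column_stack([x[:-delay], x[delay:]])
    dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
    R = (dist < eps).astype(int)

    recurrence_rate = R.mean()    # simplest RQA measure
    print(recurrence_rate)
    ```

    Measures such as determinism or transitivity ("Trans") are then computed from the diagonal/neighbourhood structure of `R`.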

  2. An optimized ensemble local mean decomposition method for fault detection of mechanical components

    International Nuclear Information System (INIS)

    Zhang, Chao; Chen, Shuai; Wang, Jianguo; Li, Zhixiong; Hu, Chao; Zhang, Xiaogang

    2017-01-01

    Mechanical transmission systems have been widely adopted in most industrial applications, and issues related to the maintenance of these systems have attracted considerable attention in the past few decades. The recently developed ensemble local mean decomposition (ELMD) method shows satisfactory performance in fault detection of mechanical components for preventing catastrophic failures and reducing maintenance costs. However, the performance of ELMD often heavily depends on proper selection of its model parameters. To this end, this paper proposes an optimized ensemble local mean decomposition (OELMD) method to determine an optimum set of ELMD parameters for vibration signal analysis. In OELMD, an error index termed the relative root-mean-square error (Relative RMSE) is used to evaluate the decomposition performance of ELMD with a certain amplitude of the added white noise. Once a maximum Relative RMSE, corresponding to an optimal noise amplitude, is determined, OELMD then identifies optimal noise bandwidth and ensemble number based on the Relative RMSE and signal-to-noise ratio (SNR), respectively. Thus, all three critical parameters of ELMD (i.e. noise amplitude and bandwidth, and ensemble number) are optimized by OELMD. The effectiveness of OELMD was evaluated using experimental vibration signals measured from three different mechanical components (i.e. the rolling bearing, gear and diesel engine) under faulty operation conditions. (paper)

  3. An optimized ensemble local mean decomposition method for fault detection of mechanical components

    Science.gov (United States)

    Zhang, Chao; Li, Zhixiong; Hu, Chao; Chen, Shuai; Wang, Jianguo; Zhang, Xiaogang

    2017-03-01

    Mechanical transmission systems have been widely adopted in most industrial applications, and issues related to the maintenance of these systems have attracted considerable attention in the past few decades. The recently developed ensemble local mean decomposition (ELMD) method shows satisfactory performance in fault detection of mechanical components for preventing catastrophic failures and reducing maintenance costs. However, the performance of ELMD often heavily depends on proper selection of its model parameters. To this end, this paper proposes an optimized ensemble local mean decomposition (OELMD) method to determine an optimum set of ELMD parameters for vibration signal analysis. In OELMD, an error index termed the relative root-mean-square error (Relative RMSE) is used to evaluate the decomposition performance of ELMD with a certain amplitude of the added white noise. Once a maximum Relative RMSE, corresponding to an optimal noise amplitude, is determined, OELMD then identifies optimal noise bandwidth and ensemble number based on the Relative RMSE and signal-to-noise ratio (SNR), respectively. Thus, all three critical parameters of ELMD (i.e. noise amplitude and bandwidth, and ensemble number) are optimized by OELMD. The effectiveness of OELMD was evaluated using experimental vibration signals measured from three different mechanical components (i.e. the rolling bearing, gear and diesel engine) under faulty operation conditions.
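
    The parameter-scan idea can be illustrated generically: for a grid of added-noise amplitudes, run an ensemble decomposition and score each amplitude with a relative root-mean-square error. A moving-average "decomposition" stands in for ELMD here, and the Relative RMSE definition below is an assumption for illustration, not the paper's exact criterion.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    t = np.linspace(0, 1, 1000)
    x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)

    def smooth(sig, k=25):
        # stand-in for one noise-assisted decomposition pass
        return np.convolve(sig, np.ones(k) / k, mode="same")

    def relative_rmse(noise_amp, n_ensemble=50):
        rec = np.mean(
            [smooth(x + noise_amp * rng.normal(size=x.size)) for _ in range(n_ensemble)],
            axis=0,
        )
        return np.sqrt(np.mean((x - rec) ** 2)) / np.sqrt(np.mean(x ** 2))

    amps = [0.0, 0.1, 0.2, 0.4, 0.8]
    errors = [relative_rmse(a) for a in amps]
    print(dict(zip(amps, np.round(errors, 3))))
    ```

    In OELMD the extremum of this curve fixes the noise amplitude, after which bandwidth and ensemble number are selected analogously.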

  4. Crowded Field Photometry and Moving Object Detection with Optimal Image Subtraction

    Science.gov (United States)

    Lee, Austin A. T.; Scheulen, F.; Sauro, C. M.; McMahon, C. T.; Berry, S. J.; Robinson, C. H.; Buie, M. W.; Little, P.

    2010-05-01

    High precision photometry and moving object detection are essential in the study of Pluto and the Kuiper Belt. In particular, the New Horizons mission would benefit from an accurate and fast method of performing image subtraction to locate faint Kuiper Belt Objects (KBOs) among large data sets. The optimal image subtraction (OIS) algorithm was optimized for IDL to decrease execution time by a factor of about 140 from a previous implementation (Miller 2008, PASP, 120, 449). In addition, a powerful image transformation and interpolation routine was written to provide OIS with well-aligned input images using astrometric fit data. The first half of this project is complete, including the code optimization and the alignment routine. The second half of the project is focused on using these tools to search a 5 x 10 degree search area to find KBOs as possible targets for the New Horizons mission. We will present examples of how these tools work, along with resulting Pluto photometry and KBO target lists. The optimized OIS and transformation routines are available in Marc Buie's IDL library at http://www.boulder.swri.edu/~buie/idl/ as ois.pro and dewarp.pro. This project was conducted for Harvey Mudd College's Clinic Program with financial support from the NASA Planetary Astronomy Program grant number NNX09AB43G.

  5. Feature Optimize and Classification of EEG Signals: Application to Lie Detection Using KPCA and ELM

    Directory of Open Access Journals (Sweden)

    GAO Junfeng

    2014-04-01

    Full Text Available EEG signals have been widely used for lie detection in recent years. To overcome the shortcomings of current signal processing, kernel principal component analysis (KPCA) and an extreme learning machine (ELM) were combined to detect liars. We recorded EEG signals at Pz from 30 subjects randomly divided into guilty and innocent groups. Every five Probe responses were averaged within subject, and wavelet features were then extracted. KPCA was employed to select a feature subset of reduced dimension from the initial wavelet features, which was fed into the ELM. To date, there is no perfect solution for choosing the number of its hidden nodes (NHN). We used a grid searching algorithm to simultaneously select the optimal values of the feature subset dimension and the NHN based on the cross-validation method. The best classification mode was decided by the optimal searching values. Experimental results show that, for EEG signals from the lie detection experiment, KPCA_ELM achieves higher classification accuracy with faster training than other widely used classification modes, which makes it especially suitable for online EEG signal processing systems.
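
    The pipeline can be sketched with scikit-learn and a basic ELM (random hidden layer plus least-squares output weights): KPCA reduces the feature dimension, and a grid search over (dimension, NHN) is scored by cross-validation. The data below are synthetic stand-ins, not the paper's EEG wavelet features, and the ELM here is a generic textbook variant.

    ```python
    import numpy as np
    from sklearn.decomposition import KernelPCA
    from sklearn.model_selection import StratifiedKFold

    rng = np.random.default_rng(5)
    X = rng.normal(0, 1, (120, 20))
    y = (X[:, :3].sum(axis=1) > 0).astype(int)           # guilty vs innocent stand-in

    def elm_accuracy(Xtr, ytr, Xte, yte, nhn):
        W = rng.normal(0, 1, (Xtr.shape[1], nhn))        # random hidden weights
        b = rng.normal(0, 1, nhn)
        Htr, Hte = np.tanh(Xtr @ W + b), np.tanh(Xte @ W + b)
        beta = np.linalg.pinv(Htr) @ ytr                 # least-squares output weights
        return np.mean((Hte @ beta > 0.5) == yte)

    best = (-1.0, None)
    for dim in (2, 5, 10):                               # KPCA dimensions
        Z = KernelPCA(n_components=dim, kernel="rbf").fit_transform(X)
        for nhn in (10, 30, 60):                         # candidate NHN values
            accs = []
            for tr, te in StratifiedKFold(5, shuffle=True, random_state=0).split(Z, y):
                accs.append(elm_accuracy(Z[tr], y[tr], Z[te], y[te], nhn))
            best = max(best, (float(np.mean(accs)), (dim, nhn)))

    print("best CV accuracy, (dim, NHN):", best)
    ```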

  6. White Blood Cell Segmentation by Circle Detection Using Electromagnetism-Like Optimization

    Directory of Open Access Journals (Sweden)

    Erik Cuevas

    2013-01-01

    Full Text Available Medical imaging is a relevant field of application of image processing algorithms. In particular, the analysis of white blood cell (WBC images has engaged researchers from fields of medicine and computer vision alike. Since WBCs can be approximated by a quasicircular form, a circular detector algorithm may be successfully applied. This paper presents an algorithm for the automatic detection of white blood cells embedded into complicated and cluttered smear images that considers the complete process as a circle detection problem. The approach is based on a nature-inspired technique called the electromagnetism-like optimization (EMO algorithm which is a heuristic method that follows electromagnetism principles for solving complex optimization problems. The proposed approach uses an objective function which measures the resemblance of a candidate circle to an actual WBC. Guided by the values of such objective function, the set of encoded candidate circles are evolved by using EMO, so that they can fit into the actual blood cells contained in the edge map of the image. Experimental results from blood cell images with a varying range of complexity are included to validate the efficiency of the proposed technique regarding detection, robustness, and stability.
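
    The objective function described above can be sketched directly: a candidate circle (x0, y0, r) is scored by the fraction of points sampled on its circumference that land on edge pixels, so a circle matching a cell boundary scores near 1. The edge map below is synthetic, and no electromagnetism-like search is run; EMO would evolve candidate (x0, y0, r) triples guided by this score.

    ```python
    import numpy as np

    # Synthetic edge map with one circular "cell" boundary
    edge = np.zeros((100, 100), dtype=bool)
    theta = np.linspace(0, 2 * np.pi, 720)
    cx, cy, cr = 50, 50, 20
    edge[np.round(cy + cr * np.sin(theta)).astype(int),
         np.round(cx + cr * np.cos(theta)).astype(int)] = True

    def circle_score(x0, y0, r, n_test=180):
        a = np.linspace(0, 2 * np.pi, n_test, endpoint=False)
        xs = np.round(x0 + r * np.cos(a)).astype(int)
        ys = np.round(y0 + r * np.sin(a)).astype(int)
        inside = (xs >= 0) & (xs < 100) & (ys >= 0) & (ys < 100)
        return np.mean(edge[ys[inside], xs[inside]]) if inside.any() else 0.0

    print(circle_score(50, 50, 20))   # matching candidate: high score (near 1)
    print(circle_score(40, 55, 15))   # poor candidate: low score
    ```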

  7. Optimal sampling in damage detection of flexural beams by continuous wavelet transform

    International Nuclear Information System (INIS)

    Basu, B; Broderick, B M; Montanari, L; Spagnoli, A

    2015-01-01

    Modern measurement techniques are improving in their capability to capture spatial displacement fields occurring in deformed structures with high precision and in a quasi-continuous manner. This in turn has made the use of vibration-based damage identification methods more effective and reliable for real applications. However, practical measurement and data processing issues still present barriers to the application of these methods in identifying several types of structural damage. This paper deals with spatial Continuous Wavelet Transform (CWT) damage identification methods in beam structures with the aim of addressing the following key questions: (i) can the cost of damage detection be reduced by down-sampling? (ii) what is the minimum number of sampling intervals required for optimal damage detection? The first three free vibration modes of a cantilever and a simply supported beam with an edge open crack are numerically simulated. A thorough parametric study is carried out by taking into account the key parameters governing the problem, including the level of noise, crack depth and location, and mechanical and geometrical parameters of the beam. The results are employed to assess the optimal number of sampling intervals for effective damage detection. (paper)
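
    The core mechanism can be sketched numerically: a small crack adds a slope discontinuity to the beam's mode shape, and a Mexican-hat wavelet transform of the (damaged minus pristine) shape peaks at the crack. Using the difference shape is a simplification of the spatial-CWT approach above, and all numbers (crack position, severity, wavelet scale) are illustrative assumptions.

    ```python
    import numpy as np

    n = 500
    xi = np.linspace(0, 1, n)                           # normalized beam coordinate
    crack_at = 0.30
    pristine = np.sin(np.pi * xi)                       # 1st mode, simply supported beam
    damaged = pristine + 0.002 * np.abs(xi - crack_at)  # slope discontinuity at the crack

    def mexican_hat(scale, support=5):
        t = np.arange(-support * scale, support * scale + 1, dtype=float)
        psi = (1 - (t / scale) ** 2) * np.exp(-(t / scale) ** 2 / 2)
        return psi / np.sqrt(scale)

    coeffs = np.convolve(damaged - pristine, mexican_hat(scale=6), mode="same")
    interior = slice(50, n - 50)                        # discard boundary distortion
    peak = xi[interior][np.argmax(np.abs(coeffs[interior]))]
    print(peak)   # near the crack location 0.30
    ```

    Down-sampling the displacement field before the transform, and watching when this peak is lost, is the experiment the parametric study above formalizes.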

  8. Optimize the Coverage Probability of Prediction Interval for Anomaly Detection of Sensor-Based Monitoring Series

    Directory of Open Access Journals (Sweden)

    Jingyue Pang

    2018-03-01

    Full Text Available Effective anomaly detection of sensing data is essential for identifying potential system failures. Because they require no prior knowledge or accumulated labels, and provide a representation of uncertainty, probability prediction methods (e.g., Gaussian process regression (GPR) and the relevance vector machine (RVM)) are especially well suited to anomaly detection for sensing series. Generally, one key parameter of prediction models is the coverage probability (CP), which controls the judging threshold for a testing sample and is generally set to a default value (e.g., 90% or 95%). There are few criteria for determining the optimal CP for anomaly detection. Therefore, this paper designs a graphic indicator, the receiver operating characteristic curve of the prediction interval (ROC-PI), based on the definition of the ROC curve, which depicts the trade-off between the PI width and the PI coverage probability across a series of cut-off points. Furthermore, the Youden index is modified to assess the performance of different CPs; through its minimization, the optimal CP is derived by the simulated annealing (SA) algorithm. Experiments conducted on two simulation datasets demonstrate the validity of the proposed method. In particular, an actual case study on sensing series from an on-orbit satellite illustrates its significant performance in practical application.
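
    The CP-scan idea can be sketched with a plain grid search and the classical Youden index: for each CP, a Gaussian prediction interval flags points outside it as anomalies, and J = sensitivity + specificity - 1 scores the CP. The paper modifies J to also account for PI width and optimizes with simulated annealing; the residual distributions below are synthetic assumptions.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    normal = rng.normal(0, 1, 1000)                # in-family residuals
    anomalies = rng.normal(4, 1, 50)               # shifted anomalous residuals
    values = np.concatenate([normal, anomalies])
    labels = np.concatenate([np.zeros(1000), np.ones(50)])

    best_cp, best_j = None, -1.0
    for cp in (0.80, 0.90, 0.95, 0.99):
        lo, hi = stats.norm.interval(cp, loc=0, scale=1)   # prediction interval
        flagged = (values < lo) | (values > hi)
        sens = np.mean(flagged[labels == 1])               # true-positive rate
        spec = np.mean(~flagged[labels == 0])              # true-negative rate
        j = sens + spec - 1                                # Youden index
        if j > best_j:
            best_cp, best_j = cp, j

    print(best_cp, round(best_j, 3))
    ```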

  9. White Blood Cell Segmentation by Circle Detection Using Electromagnetism-Like Optimization

    Science.gov (United States)

    Oliva, Diego; Díaz, Margarita; Zaldivar, Daniel; Pérez-Cisneros, Marco; Pajares, Gonzalo

    2013-01-01

    Medical imaging is a relevant field of application of image processing algorithms. In particular, the analysis of white blood cell (WBC) images has engaged researchers from fields of medicine and computer vision alike. Since WBCs can be approximated by a quasicircular form, a circular detector algorithm may be successfully applied. This paper presents an algorithm for the automatic detection of white blood cells embedded into complicated and cluttered smear images that considers the complete process as a circle detection problem. The approach is based on a nature-inspired technique called the electromagnetism-like optimization (EMO) algorithm which is a heuristic method that follows electromagnetism principles for solving complex optimization problems. The proposed approach uses an objective function which measures the resemblance of a candidate circle to an actual WBC. Guided by the values of such objective function, the set of encoded candidate circles are evolved by using EMO, so that they can fit into the actual blood cells contained in the edge map of the image. Experimental results from blood cell images with a varying range of complexity are included to validate the efficiency of the proposed technique regarding detection, robustness, and stability. PMID:23476713

  10. The cryogenic photon detection system for the ALPS II experiment. Characterization, optimization and background rejection

    International Nuclear Information System (INIS)

    Bastidon, Noemi Alice Chloe

    2017-01-01

    The search for new fundamental bosons at very low mass is the central objective of the ALPS II experiment, which is currently being set up at the Deutsches Elektronen-Synchrotron (DESY, Hamburg). This experiment follows the light-shining-through-the-wall concept, where photons could oscillate into weakly interacting light bosons in front of a wall and back into photons behind the wall, giving the impression that light can shine through a light-tight barrier. In this concept, the background-free detection of near-infrared photons is required to fully exploit the sensitivity of the apparatus. High-efficiency single-photon detection in the near-infrared is challenging and requires a cryogenic detector. In this project, a Transition-Edge Sensor (TES) operated below 100 mK will be used to detect single photons. This thesis focuses on the characterization and optimization of the ALPS II detector system, including an Adiabatic Demagnetisation Refrigerator (ADR) with its two-stage pulse-tube cooler, two TES detectors and their Superconducting Quantum Interference Device (SQUID) read-out system. Stability of the detection system over time is a priority in the ALPS II experiment. It is in this context that the cooling system has been subjected to many upgrades. In the framework of this thesis, the cooling setup has been studied in detail in order to optimize its cooling performance. Furthermore, the stability of the detector has been studied according to various criteria. Other essential parameters of the ALPS II experiment are its detection efficiency and its background rate. Indeed, the sensitivity of the experiment directly depends on these two characteristics. Both elements have been studied in depth in order to determine whether the chosen TES detector will meet ALPS IIc specifications.

  11. The cryogenic photon detection system for the ALPS II experiment. Characterization, optimization and background rejection

    Energy Technology Data Exchange (ETDEWEB)

    Bastidon, Noemi Alice Chloe

    2017-01-12

    The search for new fundamental bosons at very low mass is the central objective of the ALPS II experiment, which is currently being set up at the Deutsches Elektronen-Synchrotron (DESY, Hamburg). This experiment follows the light-shining-through-the-wall concept, where photons could oscillate into weakly interacting light bosons in front of a wall and back into photons behind the wall, giving the impression that light can shine through a light-tight barrier. In this concept, the background-free detection of near-infrared photons is required to fully exploit the sensitivity of the apparatus. High-efficiency single-photon detection in the near-infrared is challenging and requires a cryogenic detector. In this project, a Transition-Edge Sensor (TES) operated below 100 mK will be used to detect single photons. This thesis focuses on the characterization and optimization of the ALPS II detector system, including an Adiabatic Demagnetisation Refrigerator (ADR) with its two-stage pulse-tube cooler, two TES detectors and their Superconducting Quantum Interference Device (SQUID) read-out system. Stability of the detection system over time is a priority in the ALPS II experiment. It is in this context that the cooling system has been subjected to many upgrades. In the framework of this thesis, the cooling setup has been studied in detail in order to optimize its cooling performance. Furthermore, the stability of the detector has been studied according to various criteria. Other essential parameters of the ALPS II experiment are its detection efficiency and its background rate. Indeed, the sensitivity of the experiment directly depends on these two characteristics. Both elements have been studied in depth in order to determine whether the chosen TES detector will meet ALPS IIc specifications.

  12. Computing Optimal Mixed Strategies for Terrorist Plot Detection Games with the Consideration of Information Leakage

    Directory of Open Access Journals (Sweden)

    Li MingChu

    2017-01-01

    Full Text Available Coordinated terrorist attacks are becoming an increasing threat to Western countries. By monitoring potential terrorists, security agencies are able to detect and destroy terrorist plots at their planning stage. Therefore, an optimal monitoring strategy for the domestic security agency becomes necessary. However, previous studies on monitoring strategy generation fail to consider information leakage due to hackers and insider threats. Such leakage events may lead to failure to watch potential terrorists and destroy the plot, and cause a huge risk to public security. This paper makes two major contributions. Firstly, we develop a new Stackelberg game model for the security agency to generate an optimal monitoring strategy that takes information leakage into account. Secondly, we provide a double-oracle framework, DO-TPDIL, for effective calculation. The experimental results show that our approach can obtain robust strategies against information leakage with high feasibility and efficiency.

  13. Power-limited low-thrust trajectory optimization with operation point detection

    Science.gov (United States)

    Chi, Zhemin; Li, Haiyang; Jiang, Fanghua; Li, Junfeng

    2018-06-01

    The power-limited solar electric propulsion system is considered more practical in mission design. An accurate mathematical model of the propulsion system, based on experimental data of the power generation system, is used in this paper. An indirect method is used to deal with the time-optimal and fuel-optimal control problems, in which the solar electric propulsion system is described by a finite number of operation points, characterized by different pairs of thruster input power. In order to guarantee integration accuracy for the discrete power-limited problem, a power operation-point detection technique is embedded in the fixed-step fourth-order Runge-Kutta algorithm. Moreover, the logarithmic homotopy method and a normalization technique are employed to overcome the difficulties caused by using indirect methods. Three numerical simulations with actual propulsion systems are given to substantiate the feasibility and efficiency of the proposed method.
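
    The detection step can be sketched on a toy problem: the thruster model switches operation point when available power crosses a threshold, and each fixed RK4 step checks for a crossing and refines it by bisection so the integrator never steps blindly across the discontinuity. The 1-D dynamics and power model below are illustrative assumptions, not the paper's trajectory equations.

    ```python
    import numpy as np

    def power(t):                       # available power decays with time (assumed)
        return 10.0 * np.exp(-0.3 * t)

    P_SWITCH = 5.0                      # operation-point threshold

    def thrust(t):
        return 1.0 if power(t) >= P_SWITCH else 0.4   # two discrete operation points

    def f(t, v):
        return thrust(t) - 0.1 * v      # toy dynamics: thrust minus drag

    def rk4_step(t, v, h):
        k1 = f(t, v)
        k2 = f(t + h / 2, v + h * k1 / 2)
        k3 = f(t + h / 2, v + h * k2 / 2)
        k4 = f(t + h, v + h * k3)
        return v + h * (k1 + 2 * k2 + 2 * k3 + k4) / 6

    t, v, h = 0.0, 0.0, 0.05
    switch_time = None
    while t < 10.0:
        if switch_time is None and power(t) >= P_SWITCH > power(t + h):
            a, b = t, t + h                        # bisection to locate the crossing
            for _ in range(40):
                m = (a + b) / 2
                a, b = (m, b) if power(m) >= P_SWITCH else (a, m)
            switch_time = (a + b) / 2
            v = rk4_step(t, v, switch_time - t)    # integrate exactly up to the switch
            t = switch_time
        v = rk4_step(t, v, h)
        t += h

    print(switch_time)   # analytic crossing: ln(2)/0.3 ~ 2.3105
    ```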

  14. Adaptive Near-Optimal Multiuser Detection Using a Stochastic and Hysteretic Hopfield Net Receiver

    Directory of Open Access Journals (Sweden)

    Gábor Jeney

    2003-01-01

    Full Text Available This paper proposes a novel adaptive MUD algorithm for a wide variety of interference-limited systems (practically any kind), for example, code division multiple access (CDMA). The algorithm is based on recently developed neural network techniques and can perform near-optimal detection in the case of unknown channel characteristics. The proposed algorithm consists of two main blocks: one estimates the symbols sent by the transmitters; the other identifies each channel of the corresponding communication links. The estimation of symbols is carried out either by a stochastic Hopfield net (SHN), by a hysteretic neural network (HyNN), or by both. The channel identification is based on either the self-organizing feature map (SOM) or learning vector quantization (LVQ). The combination of these two blocks yields a powerful real-time detector with near-optimal performance. The performance is analyzed by extensive simulations.

  15. Social Network Community Detection for DMA Creation: Criteria Analysis through Multilevel Optimization

    Directory of Open Access Journals (Sweden)

    Bruno M. Brentan

    2017-01-01

    Full Text Available Management of large water distribution systems can be improved by dividing their networks into so-called district metered areas (DMAs). However, such divisions must be based on appropriate technical criteria. Considering the importance of deeply understanding the relationship between DMA creation and these criteria, this work proposes a performance analysis of DMA generation that takes into account such indicators as the resilience index, demand similarity, pressure uniformity, water age (and thus water quality), solution implementation costs, and electrical consumption. To cope with the complexity of the problem, suitable mathematical techniques are proposed in this paper. We use a social community detection technique to define the sectors, and then a multilevel particle swarm optimization approach is applied to find the optimal placement and operating point of the necessary devices. The results obtained by implementing the methodology in a real water supply network show its validity and the meaningful influence of, especially, elevation and pipe length on the final result.
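
    The first stage can be sketched with networkx: treat the pipe network as a graph and let modularity-based community detection propose candidate DMAs; edges whose endpoints fall in different communities are the boundary pipes where meters or valves would go. The small grid network is a stand-in for a real WDS model, and the multilevel PSO placement stage is not shown.

    ```python
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    G = nx.grid_2d_graph(6, 6)                     # 36 junctions; each edge is a pipe

    communities = greedy_modularity_communities(G)
    print(len(communities), "candidate DMAs")
    for i, dma in enumerate(communities):
        print(f"DMA {i}: {len(dma)} junctions")

    # Boundary pipes: edges crossing between communities
    membership = {n: i for i, c in enumerate(communities) for n in c}
    boundary = [e for e in G.edges if membership[e[0]] != membership[e[1]]]
    print(len(boundary), "boundary pipes")
    ```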

  16. Optimal sensor configuration for complex systems with application to signal detection in structures

    DEFF Research Database (Denmark)

    Sadegh, Payman; Spall, J. C.

    2000-01-01

    The paper considers the problem of sensor configuration for complex systems. The contribution of the paper is twofold. Firstly, we define an appropriate criterion that is based on maximizing overall sensor responses while minimizing redundant information as measured by correlations between multiple sensor outputs. Secondly, we describe an efficient and practical algorithm to achieve the optimization goals, based on simultaneous perturbation stochastic approximation (SPSA). SPSA avoids the need for detailed modeling of the sensor response by simply relying on observed responses as obtained by limited experimentation with test sensor configurations. We illustrate the application of the approach to optimal placement of acoustic sensors for signal detection in structures. This includes both a computer simulation study for an aluminum plate, and real experimentations on a steel I-beam.
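
    A minimal SPSA sketch: each iteration perturbs all configuration parameters simultaneously with a random ±1 vector, so the gradient estimate needs only two evaluations of the (possibly noisy) criterion regardless of dimension. The quadratic loss below stands in for the sensor-placement criterion; the gain sequences are standard textbook choices, not the paper's.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    target = np.array([1.0, -2.0, 0.5, 3.0])       # unknown optimum (for the demo)

    def loss(theta):                               # noisy measured criterion
        return np.sum((theta - target) ** 2) + 0.01 * rng.normal()

    theta = np.zeros(4)
    for k in range(2000):
        a_k = 0.1 / (k + 50) ** 0.602              # step-size gain sequence
        c_k = 0.1 / (k + 1) ** 0.101               # perturbation-size sequence
        delta = rng.choice([-1.0, 1.0], size=4)    # simultaneous perturbation
        g_hat = (loss(theta + c_k * delta) - loss(theta - c_k * delta)) / (2 * c_k * delta)
        theta = theta - a_k * g_hat

    print(np.round(theta, 2))                      # approaches the optimum
    ```

    Only observed responses enter the update, which is exactly why SPSA suits configurations whose response model is unknown.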

  17. Optimization of detection system based on inorganic scintillation crystal coupled with a long lightguide

    CERN Document Server

    Globus, M; Ratner, M

    2002-01-01

    Operation characteristics of a scintillation crystal linked with the photomultiplier by a long transparent lightguide are considered (such detection systems are used for monitoring seawater pollution, scintillation measurements in magnetic fields, etc.). The system is optimized with respect to the refractive index of the liquid coupling the crystal with the lightguide, and the degree of roughness of the crystal surface. It is shown that the energy resolution of the system can be significantly improved by using a coupling liquid with a refractive index somewhat less than that of the lightguide (a difference of about 0.2 is optimal). Light output and especially energy resolution become better with an increase of the roughness degree of the reflecting surface.

  18. Optimizing US examination to detect the normal and abnormal appendix in children

    International Nuclear Information System (INIS)

    Peletti, Adriana B.; Baldisserotto, Matteo

    2006-01-01

    US detection of a normal appendix can safely rule out appendicitis; however, there is a wide range of reported accuracy in detecting a normal appendix. The aim of this study was to optimize the US examination to detect the normal and the abnormal appendix according to the potential positions of the appendix. This prospective study included 107 children who underwent gray-scale US scanning. Noncompressive and compressive graded sonography was performed to detect normal and abnormal appendices according to their potential positions. The maximum transverse diameter of the appendices was measured. Of the 107 children examined, 56 had a histologic diagnosis of acute appendicitis. Sonography had a sensitivity of 100% and a specificity of 98% for the diagnosis of appendicitis. A normal appendix was visualized in 44 (86.2%) of the 51 patients without acute appendicitis; of these 44, 43 were true-negative and 1 was false-positive. Normal and abnormal appendices, respectively, were positioned as follows: 54.4% and 39.3% mid-pelvic; 27.2% and 28.6% retrocecal; 11.4% and 17.8% deep pelvic; and 6.8% and 14.3% abdominal. US scanning according to the potential positions of the appendix was useful in the detection of normal appendices in children suspected of having appendicitis. (orig.)

  19. Detecting and locating light atoms from high-resolution STEM images: The quest for a single optimal design

    Energy Technology Data Exchange (ETDEWEB)

    Gonnissen, J.; De Backer, A. [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium); Dekker, A.J. den [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, 2610 Wilrijk (Belgium); Delft Center for Systems and Control (DCSC), Delft University of Technology, Mekelweg 2, 2628 CD Delft (Netherlands); Sijbers, J. [iMinds-Vision Lab, University of Antwerp, Universiteitsplein 1, 2610 Wilrijk (Belgium); Van Aert, S., E-mail: sandra.vanaert@uantwerpen.be [Electron Microscopy for Materials Science (EMAT), University of Antwerp, Groenenborgerlaan 171, B-2020 Antwerp (Belgium)

    2016-11-15

    In the present paper, the optimal detector design is investigated for both detecting and locating light atoms from high resolution scanning transmission electron microscopy (HR STEM) images. The principles of detection theory are used to quantify the probability of error for the detection of light atoms from HR STEM images. To determine the optimal experiment design for locating light atoms, use is made of the so-called Cramér–Rao Lower Bound (CRLB). It is investigated if a single optimal design can be found for both the detection and location problem of light atoms. Furthermore, the incoming electron dose is optimised for both research goals and it is shown that picometre range precision is feasible for the estimation of the atom positions when using an appropriate incoming electron dose under the optimal detector settings to detect light atoms. - Highlights: • The optimal detector design to detect and locate light atoms in HR STEM is derived. • The probability of error is quantified and used to detect light atoms. • The Cramér–Rao lower bound is calculated to determine the atomic column precision. • Both measures are evaluated and result in the single optimal LAADF detector regime. • The incoming electron dose is optimised for both research goals.
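As a hedged illustration of the Cramér–Rao Lower Bound mentioned above: the attainable precision on the position of a single Gaussian-shaped column under Poisson counting noise can be computed numerically from the Fisher information. All numbers below (column width, background, pixel size, doses) are made-up, and this is not the paper's actual STEM imaging model:

```python
import numpy as np

def crlb_x_position(dose, width=1.0, bg=0.05, pixel=0.2, n=64):
    """Numerical CRLB on the x-position of a Gaussian 'column' in Poisson noise.
    Expected counts per pixel: lam = dose*pixel^2*G + bg.
    Fisher information: I = sum over pixels of (d lam / d x0)^2 / lam."""
    x = (np.arange(n) - n / 2) * pixel
    X, Y = np.meshgrid(x, x)
    g = np.exp(-(X**2 + Y**2) / (2 * width**2)) / (2 * np.pi * width**2)
    lam = dose * pixel**2 * g + bg                  # expected counts per pixel
    dlam = dose * pixel**2 * g * (X / width**2)     # derivative w.r.t. column position
    fisher = np.sum(dlam**2 / lam)
    return 1.0 / np.sqrt(fisher)                    # lower bound on the position std

lo_dose, hi_dose = crlb_x_position(1e4), crlb_x_position(1e6)
```

In the shot-noise-limited regime the bound shrinks roughly as the inverse square root of the incoming dose, which is the mechanism behind the picometre-range precision quoted in the abstract.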

  20. Reservoir model for the Alameda Central waterflood

    Energy Technology Data Exchange (ETDEWEB)

    Randall, T E

    1968-01-01

    The basic approach used in developing the model to characterize the Alameda Central Unit Waterflood assumes continuity of the reservoir mechanics with time. The past performance was analyzed to describe the reservoir and future performance was assumed to follow the established patterns. To develop a mathematical picture of the Alameda Central Unit reservoir, a two-dimensional single-phase steady-state model was used in conjunction with material balance calculations, real-time conversion methods and oil-water interface advance calculations. The model was developed to optimize water injection allocation, determine the configuration of the frontal advance and evaluate the success of the waterflood. The model also provides a basis for continuing review and revision of the basic concepts of reservoir operation. The results of the reservoir study have confirmed the apparent lack of permeability orientation in the pool and indicate that the waterflood is progressing better than originally anticipated.

  1. Live Operation Data Collection Optimization and Communication for the Domestic Nuclear Detection Office's Rail Test Center

    International Nuclear Information System (INIS)

    Gelston, Gariann M.

    2010-01-01

    For the Domestic Nuclear Detection Office's Rail Test Center (i.e., DNDO's RTC), knowledge of port operations together with flexible collection tools and techniques is essential to both the design and the implementation of technology testing intended for live operational settings. Increased contextual data, flexibility in procedures, and rapid availability of information are key to addressing the challenges of optimization, validation, and analysis of data collected in live operational settings. These concepts need to be integrated into technology test designs and the data collection, validation, and analysis processes. A modified data collection technique with a two-phase live operation test method is proposed.

  2. Development of optimal strategies in executive management of special waste resulting from dredging of oil products reservoirs using SWOT and QSPM method in National Iranian Oil Product Distribution Company

    Directory of Open Access Journals (Sweden)

    Monireh Abbasi

    2017-09-01

    Full Text Available Mismanagement of special wastes can have destructive environmental effects, so the development of strategic solutions in this sector requires special attention. In this research, SWOT analysis was used as an instrument for planning a special waste management system. In order to reach an acceptable level of management of the special waste resulting from dredging of reservoirs, internal and external factors in the company were investigated. Optimal strategies were then developed and, in order to determine the relative attractiveness of these strategies, a Quantitative Strategic Planning Matrix (QSPM) was employed. Based on the Internal Factor Evaluation and External Factor Evaluation matrices, it was found that the strengths outweighed the weaknesses, while the available opportunities were fewer than the threats. Among the developed strategies, construction of a suitable site to store the oily sludges in accordance with environmental requirements is among the top priorities.
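A QSPM ranking of the kind described reduces to weighting each internal/external factor and summing per-strategy attractiveness scores. A minimal sketch follows; the factors, weights, attractiveness scores, and strategy names are entirely hypothetical and not taken from the study:

```python
# Hypothetical factors and weights (weights sum to 1.0), for illustration only
factors = {
    "environmental regulation pressure": 0.30,
    "availability of disposal-site land": 0.25,
    "in-house dredging expertise":        0.25,
    "budget constraints":                 0.20,
}
# Attractiveness Scores per strategy (1 = not attractive .. 4 = highly attractive),
# listed in the same factor order as above
strategies = {
    "build dedicated sludge storage site": [4, 4, 3, 2],
    "outsource waste treatment":           [3, 1, 2, 3],
}

def total_attractiveness(weights, scores):
    """QSPM Total Attractiveness Score: sum of weight * attractiveness score."""
    return sum(w * s for w, s in zip(weights, scores))

weights = list(factors.values())
tas = {name: total_attractiveness(weights, s) for name, s in strategies.items()}
best = max(tas, key=tas.get)   # strategy with the highest TAS wins the ranking
```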

  3. Optimization of metabolite detection by quantum mechanics simulations in magnetic resonance spectroscopy.

    Science.gov (United States)

    Gambarota, Giulio

    2017-07-15

    Magnetic resonance spectroscopy (MRS) is a well established modality for investigating tissue metabolism in vivo. In recent years, many efforts by the scientific community have been directed towards the improvement of metabolite detection and quantitation. Quantum mechanics simulations allow for investigations of the MR signal behaviour of metabolites; thus, they provide an essential tool in the optimization of metabolite detection. In this review, we will examine quantum mechanics simulations based on the density matrix formalism. The density matrix was introduced by von Neumann in 1927 to take into account statistical effects within the theory of quantum mechanics. We will discuss the main steps of the density matrix simulation of an arbitrary spin system and show some examples for the strongly coupled two spin system. Copyright © 2016 Elsevier Inc. All rights reserved.
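The density-matrix machinery the review describes can be reproduced compactly for a weakly coupled two-spin system: build product operators, apply a hard 90° pulse, and evolve ρ(t) = U ρ U† while recording the FID. The offsets and J value below are illustrative assumptions, not values from the review:

```python
import numpy as np

# single spin-1/2 operators (hbar = 1)
Ix = np.array([[0, 0.5], [0.5, 0]], dtype=complex)
Iy = np.array([[0, -0.5j], [0.5j, 0]], dtype=complex)
Iz = np.array([[0.5, 0], [0, -0.5]], dtype=complex)
E2 = np.eye(2, dtype=complex)

# two-spin product operators via Kronecker products
I1x, I1y, I1z = (np.kron(op, E2) for op in (Ix, Iy, Iz))
I2x, I2y, I2z = (np.kron(E2, op) for op in (Ix, Iy, Iz))

def U(A, theta):
    """exp(-1j*theta*A) for Hermitian A, via eigendecomposition."""
    w, V = np.linalg.eigh(A)
    return (V * np.exp(-1j * theta * w)) @ V.conj().T

# weak-coupling Hamiltonian: chemical-shift offsets and scalar coupling (Hz)
v1, v2, J = 100.0, -60.0, 7.0  # illustrative values
H = 2 * np.pi * (v1 * I1z + v2 * I2z + J * (I1z @ I2z))

rho = I1z + I2z                        # thermal equilibrium (high-temperature limit)
P90 = U(I1x + I2x, np.pi / 2)          # hard 90-degree pulse about x
rho = P90 @ rho @ P90.conj().T

dt, n = 1e-4, 512                      # dwell time (s) and number of FID points
Ut = U(H, dt)
Iplus = (I1x + I2x) + 1j * (I1y + I2y)
fid = np.empty(n, dtype=complex)
for k in range(n):
    fid[k] = np.trace(rho @ Iplus)     # detectable transverse magnetization
    rho = Ut @ rho @ Ut.conj().T       # free evolution over one dwell time

spectrum = np.fft.fftshift(np.fft.fft(fid))
```

Fourier transforming the FID yields the familiar doublet structure of the J-coupled pair; the trace of ρ is conserved throughout the propagation.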

  4. Optimization of Plastic Scintillator Thicknesses for Online Beta Detection in Mixed Fields

    Energy Technology Data Exchange (ETDEWEB)

    Pourtangestani, K.; Machrafi, R. [University of Ontario Institute of Technology, Oshawa, ON (Canada)

    2013-07-15

    For efficient beta detection in a mixed beta-gamma field, Monte Carlo simulation models have been developed to optimize the thickness of a plastic scintillator used in a whole-body monitor. The simulation has been performed using the MCNP/X code with plastic scintillator thicknesses ranging from 150 to 600 μm. The relationship between the thickness of the scintillator and the efficiency of the detector has been analysed. For the 150 μm thickness, an experimental investigation has been conducted with different beta sources at different positions on the scintillator, and the counting efficiency of the unit has been measured. The evaluated data, along with the experimental ones, have been discussed. A thickness of 300 μm to 500 μm has been found to be optimum for beta detection efficiency in the presence of low-energy gamma rays. (author)

  5. Optimization of plastic scintillator thicknesses for online beta/gamma detection

    Directory of Open Access Journals (Sweden)

    Pourtangestani K.

    2012-04-01

    Full Text Available For efficient beta detection in a mixed beta gamma field, Monte Carlo simulation models have been built to optimize the thickness of a plastic scintillator, used in a whole body monitor. The simulation has been performed using the MCNP/X code for different thicknesses of plastic scintillator from 150 μm to 600 μm. The relationship between the thickness of the scintillator and the efficiency of the detector has been analyzed. For 150 μm thickness, an experimental investigation has been conducted with different beta sources at different positions on the scintillator and the counting efficiency of the unit has been measured. Evaluated data along with experimental ones have been discussed. A thickness of 300 μm to 500 μm has been found to be the optimum thickness for high efficiency beta detection in the presence of low energy gamma-rays.

  6. SILTATION IN RESERVOIRS

    African Journals Online (AJOL)

    Keywords: reservoir model, siltation, sediment, catchment, sediment transport. 1. Introduction. Sediment ... rendered water storage structures useless in less than 25 years. ... reservoir, thus reducing the space available for water storage and ...

  7. Reservoir fisheries of Asia

    International Nuclear Information System (INIS)

    Silva, S.S. De.

    1990-01-01

    At a workshop on reservoir fisheries research, papers were presented on the limnology of reservoirs, the changes that follow impoundment, fisheries management and modelling, and fish culture techniques. Separate abstracts have been prepared for three papers from this workshop

  8. Large reservoirs: Chapter 17

    Science.gov (United States)

    Miranda, Leandro E.; Bettoli, Phillip William

    2010-01-01

    Large impoundments, defined as those with surface area of 200 ha or greater, are relatively new aquatic ecosystems in the global landscape. They represent important economic and environmental resources that provide benefits such as flood control, hydropower generation, navigation, water supply, commercial and recreational fisheries, and various other recreational and esthetic values. Construction of large impoundments was initially driven by economic needs, and ecological consequences received little consideration. However, in recent decades environmental issues have come to the forefront. In the closing decades of the 20th century societal values began to shift, especially in the developed world. Society is no longer willing to accept environmental damage as an inevitable consequence of human development, and it is now recognized that continued environmental degradation is unsustainable. Consequently, construction of large reservoirs has virtually stopped in North America. Nevertheless, in other parts of the world construction of large reservoirs continues. The emergence of systematic reservoir management in the early 20th century was guided by concepts developed for natural lakes (Miranda 1996). However, we now recognize that reservoirs are different and that reservoirs are not independent aquatic systems inasmuch as they are connected to upstream rivers and streams, the downstream river, other reservoirs in the basin, and the watershed. Reservoir systems exhibit longitudinal patterns both within and among reservoirs. Reservoirs are typically arranged sequentially as elements of an interacting network, filter water collected throughout their watersheds, and form a mosaic of predictable patterns. Traditional approaches to fisheries management such as stocking, regulating harvest, and in-lake habitat management do not always produce desired effects in reservoirs. As a result, managers may expend resources with little benefit to either fish or fishing. 

  9. Detection of imminent vein graft occlusion: what is the optimal surveillance program?

    Science.gov (United States)

    Tinder, Chelsey N; Bandyk, Dennis F

    2009-12-01

    The prediction of infrainguinal vein bypass failure remains an inexact judgment. Patient demographics, technical factors, and vascular laboratory graft surveillance testing are helpful in identifying a high-risk graft cohort. The optimal surveillance program to detect a bypass at risk for imminent occlusion continues to be developed, but the required elements are known and include clinical assessment for new or changed limb ischemia symptoms, measurement of ankle and/or toe systolic pressure, and duplex ultrasound imaging of the bypass graft. Duplex ultrasound assessment of bypass hemodynamics may be the most accurate method to detect imminent vein graft occlusion. The finding of low graft flow during intraoperative assessment or at a scheduled surveillance study predicts failure; if associated with an occlusive lesion, a graft revision can prolong patency. The most common abnormality producing graft failure is conduit stenosis caused by myointimal hyperplasia, and the majority can be repaired by an endovascular intervention. The frequency of testing to detect the failing bypass should be individualized to the patient, the type of arterial bypass, and prior duplex ultrasound scan findings. The focus of surveillance is on identification of the low-flow arterial bypass and timely repair of detected critical stenosis, defined by duplex velocity spectra criteria of a peak systolic velocity >300 cm/s and a peak systolic velocity ratio across the stenosis >3.5, correlating with a >70% diameter-reducing stenosis. When conducted appropriately, a graft surveillance program should result in an unexpected graft failure rate of <3% per year.
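The duplex velocity criteria quoted in this abstract (peak systolic velocity above 300 cm/s together with a velocity ratio above 3.5) translate directly into a trivial screening helper. Thresholds follow the abstract; this is an illustration of the rule, not clinical software:

```python
def critical_graft_stenosis(psv_cm_s: float, psv_ratio: float) -> bool:
    """Flag a duplex finding as a critical (>70% diameter-reducing) stenosis
    using the velocity-spectra criteria from the surveillance abstract:
    PSV > 300 cm/s AND PSV ratio across the stenosis > 3.5."""
    return psv_cm_s > 300.0 and psv_ratio > 3.5

# (PSV in cm/s, PSV ratio) triples: only the first meets both criteria
findings = [(350.0, 4.2), (250.0, 4.2), (350.0, 2.0)]
flags = [critical_graft_stenosis(v, r) for v, r in findings]
```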

  10. Optimization of surface network and platform location using a next generation reservoir simulator coupled with an integrated asset optimizer application to an offshore deep water oil field in Brazil; Otimizacao de redes de superficie e locacao da plataforma atraves do acoplamento de um simulador de reservatorios de nova geracao e um otmizador global de ativo: aplicacao em um campo offshore

    Energy Technology Data Exchange (ETDEWEB)

    Campozana, Fernando P.; Almeida, Renato L. [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil); Madeira, Marcelo G.; Sousa, Sergio H.G. de; Spinola, Marcio [Halliburton Servicos Ltda., Rio de Janeiro, RJ (Brazil)

    2008-07-01

    To design, modify, and expand surface facilities is a multidisciplinary task which involves substantial financial resources. It can take months or years to complete, depending on the size and level of detail of the project. Nowadays, the use of Next Generation Reservoir Simulators (NGRS) is the most sophisticated and reliable way of evaluating field performance, since they can couple surface and subsurface equations, thus eliminating the need for lengthy multiphase flow tables. Furthermore, coupling a NGRS with an optimizer is the best way to accomplish a large number of simulation runs in the search for optimized solutions when facilities are being modified and/or expanded. The suggested workflow is applied to a synthetic field which reproduces typical Brazilian offshore deep water scenarios. Hundreds of coupled simulation runs were performed, and the results show that it is possible to find optimal diameters for the production lines as well as the ideal location for the production/injection platform. (author)

  11. Selection of optimal multispectral imaging system parameters for small joint arthritis detection

    Science.gov (United States)

    Dolenec, Rok; Laistler, Elmar; Stergar, Jost; Milanic, Matija

    2018-02-01

    Early detection and treatment of arthritis is essential for a successful treatment outcome, but has proven very challenging with existing diagnostic methods. Novel methods based on optical imaging of the affected joints are becoming an attractive alternative. A non-contact multispectral imaging (MSI) system for imaging the small joints of human hands and feet is being developed. In this work, a numerical simulation of the MSI system is presented. The purpose of the simulation is to determine the optimal design parameters. Inflamed and unaffected human joint models were constructed with realistic geometry and tissue distributions, based on an MRI scan of a human finger with a spatial resolution of 0.2 mm. The light transport simulation is based on a weighted-photon 3D Monte Carlo method utilizing CUDA GPU acceleration. A uniform illumination of the finger within the 400-1100 nm spectral range was simulated, and the photons exiting the joint were recorded using different acceptance angles. From the obtained reflectance and transmittance images, the spectral and spatial features most indicative of inflammation were identified, and the optimal acceptance angle and spectral bands were determined. This study demonstrates that proper selection of MSI system parameters critically affects the ability of an MSI system to discriminate between unaffected and inflamed joints. The presented system design optimization approach could be applied to other pathologies.
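The weighted-photon Monte Carlo technique mentioned above can be illustrated with a drastically simplified version: a 1D slab with isotropic scattering and no refractive-index mismatch, rather than the authors' CUDA-accelerated 3D finger model. The optical coefficients are made-up; each photon carries a weight that is partially absorbed at every interaction:

```python
import numpy as np

def slab_reflect_transmit(mu_a, mu_s, thickness, n_photons=3000, seed=1):
    """Toy weighted-photon Monte Carlo in a 1D slab with isotropic scattering.
    mu_a, mu_s in 1/mm; thickness in mm. Returns (reflectance, transmittance)
    as escaped weight fractions."""
    rng = np.random.default_rng(seed)
    mu_t = mu_a + mu_s
    albedo = mu_s / mu_t
    refl = trans = 0.0
    for _ in range(n_photons):
        z, uz, w = 0.0, 1.0, 1.0                       # depth, direction cosine, weight
        while w > 1e-3:                                # terminate exhausted photons
            z += uz * (-np.log(rng.random()) / mu_t)   # sample a free path length
            if z <= 0.0:
                refl += w                              # escaped through the top
                break
            if z >= thickness:
                trans += w                             # escaped through the bottom
                break
            w *= albedo                                # deposit (1 - albedo) of weight
            uz = 2.0 * rng.random() - 1.0              # isotropic re-direction
    return refl / n_photons, trans / n_photons

r_thin, t_thin = slab_reflect_transmit(0.1, 5.0, 0.5)
r_thick, t_thick = slab_reflect_transmit(0.1, 5.0, 1.5)
```

Even this toy reproduces the qualitative behaviour the MSI simulation exploits: transmittance drops as tissue thickness (or absorption) grows, while the escaped-plus-absorbed weight stays conserved.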

  12. Fusion of optimized indicators from Advanced Driver Assistance Systems (ADAS) for driver drowsiness detection.

    Science.gov (United States)

    Daza, Iván García; Bergasa, Luis Miguel; Bronte, Sebastián; Yebes, Jose Javier; Almazán, Javier; Arroyo, Roberto

    2014-01-09

    This paper presents a non-intrusive approach for monitoring driver drowsiness using the fusion of several optimized indicators based on driver physical and driving performance measures, obtained from ADAS (Advanced Driver Assistant Systems) in simulated conditions. The paper is focused on real-time drowsiness detection technology rather than on long-term sleep/awake regulation prediction technology. We have developed our own vision system in order to obtain robust and optimized driver indicators able to be used in simulators and future real environments. These indicators are principally based on driver physical and driving performance skills. The fusion of several indicators, proposed in the literature, is evaluated using a neural network and a stochastic optimization method to obtain the best combination. We propose a new method for ground-truth generation based on a supervised Karolinska Sleepiness Scale (KSS). An extensive evaluation of indicators, derived from trials over a third generation simulator with several test subjects during different driving sessions, was performed. The main conclusions about the performance of single indicators and the best combinations of them are included, as well as the future works derived from this study.

  13. Fusion of Optimized Indicators from Advanced Driver Assistance Systems (ADAS for Driver Drowsiness Detection

    Directory of Open Access Journals (Sweden)

    Iván G. Daza

    2014-01-01

    Full Text Available This paper presents a non-intrusive approach for monitoring driver drowsiness using the fusion of several optimized indicators based on driver physical and driving performance measures, obtained from ADAS (Advanced Driver Assistant Systems in simulated conditions. The paper is focused on real-time drowsiness detection technology rather than on long-term sleep/awake regulation prediction technology. We have developed our own vision system in order to obtain robust and optimized driver indicators able to be used in simulators and future real environments. These indicators are principally based on driver physical and driving performance skills. The fusion of several indicators, proposed in the literature, is evaluated using a neural network and a stochastic optimization method to obtain the best combination. We propose a new method for ground-truth generation based on a supervised Karolinska Sleepiness Scale (KSS. An extensive evaluation of indicators, derived from trials over a third generation simulator with several test subjects during different driving sessions, was performed. The main conclusions about the performance of single indicators and the best combinations of them are included, as well as the future works derived from this study.

  14. Characterization and optimization of spiral eddy current coils for in-situ crack detection

    Science.gov (United States)

    Mandache, Catalin

    2018-03-01

    In-situ condition-based maintenance is making strides in the aerospace industry and it is seen as an alternative to scheduled, time-based maintenance. With fatigue cracks originating from fastener holes as the main reason for structural failures, embedded eddy current coils are a viable non-invasive solution for their timely detection. The development and potential broad use of these coils are motivated by a few consistent arguments: (i) inspection of structures of complicated geometries and hard to access areas, that often require disassembly, (ii) alternative to regular inspection actions that could introduce inadvertent damage, (iii) for structures that have short inspection intervals, and (iv) for repaired structures where fastener holes contain bushings and prevent further bolt-hole inspections. Since the spiral coils are aiming at detecting radial cracks emanating from the fastener holes, their design parameters should allow for high inductance, low ohmic losses and power requirements, as well as optimal size and high sensitivity to discontinuities. In this study, flexible, surface conformable, spiral eddy current coils are empirically investigated on mock-up specimens, while numerical analysis is performed for their optimization and design improvement.

  15. Optimal statistic for detecting gravitational wave signals from binary inspirals with LISA

    CERN Document Server

    Rogan, A

    2004-01-01

    A binary compact object early in its inspiral phase will be picked up by LISA through its nearly monochromatic gravitational radiation. But even this innocuous-appearing candidate poses interesting detection challenges. The data that will be scanned for such sources will be a set of three functions of LISA's twelve data streams, obtained through time-delay interferometry, which is necessary to cancel the noise contributions from laser-frequency fluctuations and optical-bench motions in these data streams. We call these three functions pseudo-detectors. The sensitivity of any pseudo-detector to a given sky position is a function of LISA's orbital position. Moreover, at a given point in LISA's orbit, each pseudo-detector has a different sensitivity to the same sky position. In this work, we obtain the optimal statistic for detecting gravitational wave signals, such as from compact binaries early in their inspiral stage, in LISA data. We also present how the sensitivity of LISA, defined by this optimal statistic, vari...

  16. Optimized Lateral Flow Immunoassay Reader for the Detection of Infectious Diseases in Developing Countries.

    Science.gov (United States)

    Pilavaki, Evdokia; Demosthenous, Andreas

    2017-11-20

    Detection and control of infectious diseases is a major problem, especially in developing countries. Lateral flow immunoassays can be used with great success for the detection of infectious diseases. However, for the quantification of their results an electronic reader is required. This paper presents an optimized handheld electronic reader for developing countries. It features a potentially low-cost, low-power, battery-operated device with no added optical accessories. The operation of this proof of concept device is based on measuring the reflected light from the lateral flow immunoassay and translating it into the concentration of the specific analyte of interest. Characterization of the surface of the lateral flow immunoassay has been performed in order to accurately model its response to the incident light. Ray trace simulations have been performed to optimize the system and achieve maximum sensitivity by placing all the components in optimum positions. A microcontroller enables all the signal processing to be performed on the device and a Bluetooth module allows transmission of the results wirelessly to a mobile phone app. Its performance has been validated using lateral flow immunoassays with influenza A nucleoprotein in the concentration range of 0.5 ng/mL to 200 ng/mL.

  17. Optimized Lateral Flow Immunoassay Reader for the Detection of Infectious Diseases in Developing Countries

    Directory of Open Access Journals (Sweden)

    Evdokia Pilavaki

    2017-11-01

    Full Text Available Detection and control of infectious diseases is a major problem, especially in developing countries. Lateral flow immunoassays can be used with great success for the detection of infectious diseases. However, for the quantification of their results an electronic reader is required. This paper presents an optimized handheld electronic reader for developing countries. It features a potentially low-cost, low-power, battery-operated device with no added optical accessories. The operation of this proof of concept device is based on measuring the reflected light from the lateral flow immunoassay and translating it into the concentration of the specific analyte of interest. Characterization of the surface of the lateral flow immunoassay has been performed in order to accurately model its response to the incident light. Ray trace simulations have been performed to optimize the system and achieve maximum sensitivity by placing all the components in optimum positions. A microcontroller enables all the signal processing to be performed on the device and a Bluetooth module allows transmission of the results wirelessly to a mobile phone app. Its performance has been validated using lateral flow immunoassays with influenza A nucleoprotein in the concentration range of 0.5 ng/mL to 200 ng/mL.

  18. X-ray backscatter imaging for radiography by selective detection and snapshot: Evolution, development, and optimization

    Science.gov (United States)

    Shedlock, Daniel

    Compton backscatter imaging (CBI) is a single-sided imaging technique that uses the penetrating power of radiation and unique interaction properties of radiation with matter to image subsurface features. CBI has a variety of applications that include non-destructive interrogation, medical imaging, security and military applications. Radiography by selective detection (RSD), lateral migration radiography (LMR) and shadow aperture backscatter radiography (SABR) are different CBI techniques that are being optimized and developed. Radiography by selective detection (RSD) is a pencil beam Compton backscatter imaging technique that falls between highly collimated and uncollimated techniques. Radiography by selective detection uses a combination of single- and multiple-scatter photons from a projected area below a collimation plane to generate an image. As a result, the image has a combination of first- and multiple-scatter components. RSD techniques offer greater subsurface resolution than uncollimated techniques, at speeds at least an order of magnitude faster than highly collimated techniques. RSD scanning systems have evolved from a prototype into near market-ready scanning devices for use in a variety of single-sided imaging applications. The design has changed to incorporate state-of-the-art detectors and electronics optimized for backscatter imaging with an emphasis on versatility, efficiency and speed. The RSD system has become more stable, about 4 times faster, and 60% lighter while maintaining or improving image quality and contrast over the past 3 years. A new snapshot backscatter radiography (SBR) CBI technique, shadow aperture backscatter radiography (SABR), has been developed from concept and proof-of-principle to a functional laboratory prototype. SABR radiography uses digital detection media and shaded aperture configurations to generate near-surface Compton backscatter images without scanning, similar to how transmission radiographs are taken. 

  19. Optimal Hotspots of Dynamic Surfaced-Enhanced Raman Spectroscopy for Drugs Quantitative Detection.

    Science.gov (United States)

    Yan, Xiunan; Li, Pan; Zhou, Binbin; Tang, Xianghu; Li, Xiaoyun; Weng, Shizhuang; Yang, Liangbao; Liu, Jinhuai

    2017-05-02

    Surface-enhanced Raman spectroscopy (SERS) is a powerful qualitative analysis method that has been widely applied in many fields. However, SERS for quantitative analysis still suffers from several challenges, partially because of the absence of a stable and credible analytical strategy. Here, we demonstrate that the optimal hotspots created by dynamic surface-enhanced Raman spectroscopy (D-SERS) can be used for quantitative SERS measurements. In situ small-angle X-ray scattering was carried out to monitor, in real time, the formation of the optimal hotspots, with the most efficient hotspots generated during the evaporation of the monodisperse Au sol. Importantly, the natural evaporation of the Au sol avoids the salt-induced instability of the nanoparticles, and the formation of ordered three-dimensional hotspots allows SERS detection with excellent reproducibility. To account for SERS signal variability in the D-SERS process, 4-mercaptopyridine (4-mpy) was used as an internal standard to correct the signals, improving stability and reducing signal fluctuations. The strongest SERS spectra at the optimal hotspots of D-SERS were extracted for statistical analysis. Using the SERS signal of 4-mpy as a stable internal calibration standard, the relative SERS intensity of the target molecules showed a linear response versus the negative logarithm of concentration at the point of strongest SERS signal, which illustrates the great potential for quantitative analysis. The drugs 3,4-methylenedioxymethamphetamine and α-methyltryptamine hydrochloride were precisely analyzed with the internal-standard D-SERS strategy. Consequently, there is reason to believe that this approach is promising for addressing quantitative problems in conventional SERS analysis.

  20. A new and fast image feature selection method for developing an optimal mammographic mass detection scheme.

    Science.gov (United States)

    Tan, Maxine; Pu, Jiantao; Zheng, Bin

    2014-08-01

    Selecting optimal features from a large image feature pool remains a major challenge in developing computer-aided detection (CAD) schemes for medical images. The objective of this study is to investigate a new approach to significantly improve the efficacy of image feature selection and classifier optimization in developing a CAD scheme for mammographic masses. An image dataset including 1600 regions of interest (ROIs), of which 800 are positive (depicting malignant masses) and 800 are negative (depicting CAD-generated false positive regions), was used in this study. After segmentation of each suspicious lesion by a multilayer topographic region growth algorithm, 271 features were computed in different feature categories, including shape, texture, contrast, isodensity, spiculation, and local topological features, as well as features related to the presence and location of fat and calcifications. Besides computing features from the original images, the authors also computed new texture features from the dilated lesion segments. In order to select optimal features from this initial feature pool and build a highly performing classifier, the authors examined and compared four feature selection methods to optimize an artificial neural network (ANN) based classifier, namely: (1) Phased Searching with NEAT in a Time-Scaled Framework, (2) a sequential floating forward selection (SFFS) method, (3) a genetic algorithm (GA), and (4) a sequential forward selection (SFS) method. Performances of the four approaches were assessed using a tenfold cross-validation method. Among these four methods, SFFS has the highest efficacy: it requires only 3%-5% of the computational time of the GA approach and yields the highest performance level, with an area under the receiver operating characteristic curve (AUC) of 0.864 ± 0.034. The results also demonstrated that, except when using GA, including the new texture features computed from the dilated mass segments improved the AUC results of the ANNs optimized
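Of the four methods compared, the winning SFFS alternates greedy forward additions with conditional backward removals whenever dropping a feature beats the best subset previously seen at that size. A compact, generic sketch follows, using a simple least-squares R² criterion on synthetic data (the actual study optimized an ANN classifier over 271 mammographic features; nothing below is from that study):

```python
import numpy as np

def r2(Xs, y):
    """Least-squares goodness-of-fit, a simple stand-in selection criterion."""
    A = np.column_stack([Xs, np.ones(len(y))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    resid = y - A @ coef
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

def sffs(X, y, score, k_max):
    """Sequential Floating Forward Selection (Pudil-style)."""
    selected, best = [], {}
    while len(selected) < k_max:
        # forward step: add the single best remaining feature
        remaining = [j for j in range(X.shape[1]) if j not in selected]
        j_add = max(remaining, key=lambda j: score(X[:, selected + [j]], y))
        selected = selected + [j_add]
        best[len(selected)] = (score(X[:, selected], y), list(selected))
        # floating step: drop a feature if that beats the best subset of that size
        improved = True
        while improved and len(selected) > 2:
            improved = False
            for j in selected:
                trial = [f for f in selected if f != j]
                s = score(X[:, trial], y)
                if s > best[len(trial)][0]:
                    selected, best[len(trial)] = trial, (s, trial)
                    improved = True
                    break
    return selected

rng = np.random.default_rng(42)
X = rng.normal(size=(200, 6))
y = X[:, 0] + 2.0 * X[:, 3] + 0.1 * rng.normal(size=200)  # only features 0 and 3 matter
picked = sffs(X, y, r2, k_max=2)
```

On this synthetic problem the search recovers exactly the two informative features; the floating (conditional-removal) step is what lets SFFS escape the nesting effect of plain SFS.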

  1. Optimization of Rb-82 PET acquisition and reconstruction protocols for myocardial perfusion defect detection

    Science.gov (United States)

    Tang, Jing; Rahmim, Arman; Lautamäki, Riikka; Lodge, Martin A.; Bengel, Frank M.; Tsui, Benjamin M. W.

    2009-05-01

    The purpose of this study is to optimize the dynamic Rb-82 cardiac PET acquisition and reconstruction protocols for maximum myocardial perfusion defect detection using realistic simulation data and task-based evaluation. Time activity curves (TACs) of different organs under both rest and stress conditions were extracted from dynamic Rb-82 PET images of five normal patients. Combined SimSET-GATE Monte Carlo simulation was used to generate nearly noise-free cardiac PET data from a time series of 3D NCAT phantoms with organ activities modeling different pre-scan delay times (PDTs) and total acquisition times (TATs). Poisson noise was added to the nearly noise-free projections, and the OS-EM algorithm was applied to generate noisy reconstructed images. The channelized Hotelling observer (CHO) with 32 × 32 spatial templates corresponding to four octave-wide frequency channels was used to evaluate the images. The area under the ROC curve (AUC) was calculated from the CHO rating data as an index of image quality in terms of myocardial perfusion defect detection. The 0.5 cycle cm⁻¹ Butterworth post-filtering on OS-EM (with 21 subsets) reconstructed images generates the highest AUC values, while iteration numbers 1 to 4 do not show different AUC values. The optimized PDTs for both rest and stress conditions are found to be close to the crossing points of the left ventricular chamber and myocardium TACs, which may motivate an individualized PDT for patient data processing and image reconstruction. Shortening the TATs for <~3 min from the clinically employed acquisition time does not significantly affect myocardial perfusion defect detection for either rest or stress studies.

  2. Optimization of illuminating system to detect optical properties inside a finger

    Science.gov (United States)

    Sano, Emiko; Shikai, Masahiro; Shiratsuki, Akihide; Maeda, Takuji; Matsushita, Masahito; Sasakawa, Koichi

    2007-01-01

    's surface condition because it detects the fingerprint pattern inside the finger using transmitted light. We examined optimization of the illumination system of this novel fingerprint sensor to detect a high-contrast fingerprint pattern over a wide area and to improve the image processing at (2).

  3. Evaluation on the detection limit of blood hemoglobin using photoplethysmography based on path-length optimization

    Science.gov (United States)

    Sun, Di; Guo, Chao; Zhang, Ziyang; Han, Tongshuai; Liu, Jin

    2016-10-01

    Measurement of the blood hemoglobin concentration (BHC) using photoplethysmography (PPG), which derives blood absorption of near-infrared light from the pulsatile component of the transmitted light intensity, has not been applied clinically because its precision is insufficient. The main challenge is the instability of the pulse signal, which is very weak and varies across human bodies and across physiological states within the same body. We evaluated the detection limit of BHC using PPG as the measurement precision level, which can be regarded as a best-case precision, because we recorded relatively stable pulse signals from subjects using a spectrometer with a high signal-to-noise ratio (SNR), about 30000:1 in the short term. Moreover, we optimized the pathlength, using optimum-pathlength theory, to obtain better sensitivity to absorption variations in blood. The best detection limit was evaluated as about 1 g/L for BHC, and the best SNR of the pulse for in vivo measurement was about 2000:1 at 1130 and 1250 nm. We also conclude that the SNR of the pulse signal should be better than 400:1 when the required detection limit is 5 g/L. Our results provide a good reference for achieving a desired BHC measurement precision in real applications.
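    The reported numbers are consistent with a simple inverse scaling between pulse SNR and achievable detection limit. A quick check, assuming that inverse scaling holds (the abstract implies it but does not state it outright):

```python
# Reported best case: pulse SNR of 2000:1 gives a ~1 g/L detection limit.
# Under an assumed inverse-scaling law, the SNR required for a relaxed
# target limit of 5 g/L is 2000 * (1 / 5) = 400, matching the paper's 400:1.
ref_snr, ref_limit = 2000.0, 1.0   # SNR ratio, detection limit in g/L
target_limit = 5.0                 # required detection limit in g/L
required_snr = ref_snr * ref_limit / target_limit
print(required_snr)  # 400.0
```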

  4. Optimized acoustic biochip integrated with microfluidics for biomarkers detection in molecular diagnostics.

    Science.gov (United States)

    Papadakis, G; Friedt, J M; Eck, M; Rabus, D; Jobst, G; Gizeli, E

    2017-09-01

    The development of integrated platforms incorporating an acoustic device as the detection element requires addressing simultaneously several challenges of technological and scientific nature. The present work was focused on the design of a microfluidic module, which, combined with a dual or array type Love wave acoustic chip could be applied to biomedical applications and molecular diagnostics. Based on a systematic study we optimized the mechanics of the flow cell attachment and the sealing material so that fluidic interfacing/encapsulation would impose minimal losses to the acoustic wave. We have also investigated combinations of operating frequencies with waveguide materials and thicknesses for maximum sensitivity during the detection of protein and DNA biomarkers. Within our investigations neutravidin was used as a model protein biomarker and unpurified PCR amplified Salmonella DNA as the model genetic target. Our results clearly indicate the need for experimental verification of the optimum engineering and analytical parameters, in order to develop commercially viable systems for integrated analysis. The good reproducibility of the signal together with the ability of the array biochip to detect multiple samples hold promise for the future use of the integrated system in a Lab-on-a-Chip platform for application to molecular diagnostics.

  5. Solar Power Ramp Events Detection Using an Optimized Swinging Door Algorithm: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Cui, Mingjian; Zhang, Jie; Florita, Anthony; Hodge, Bri-Mathias; Ke, Deping; Sun, Yuanzhang

    2015-08-07

    Solar power ramp events (SPREs) are those that significantly influence the integration of solar power on non-clear days and threaten the reliable and economic operation of power systems. Accurately extracting solar power ramps becomes more important with increasing levels of solar power penetration in power systems. In this paper, we develop an optimized swinging door algorithm (OpSDA) for SPRE detection. First, the swinging door algorithm (SDA) is utilized to segregate measured solar power generation into consecutive segments in a piecewise linear fashion. Then we use a dynamic programming approach to combine adjacent segments into significant ramps when the decision thresholds are met. In addition, the expected SPREs occurring under clear-sky solar power conditions are removed. Measured solar power data from Tucson Electric Power is used to assess the performance of the proposed methodology. OpSDA is compared to two other ramp detection methods: the SDA and the L1-Ramp Detect with Sliding Window (L1-SW) method. The statistical results show the validity and effectiveness of the proposed method. OpSDA can significantly improve the performance of the SDA, and it can perform as well as or better than L1-SW with substantially less computation time.
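    The SDA segmentation step can be sketched compactly. The following is a minimal, illustrative implementation that splits a series into piecewise-linear segments using the classic "door" slope envelope with tolerance eps; the dynamic-programming stage that merges adjacent segments into significant ramps, and the clear-sky filtering, are not reproduced here:

```python
def swinging_door(ts, eps):
    """Split a series into piecewise-linear segments: a new segment starts
    when no single line within +/- eps of every point fits since the pivot.
    Adjacent segments share their boundary index."""
    segments, start = [], 0
    up, low = float("inf"), float("-inf")  # slope envelope ("doors")
    i = 1
    while i < len(ts):
        dx = i - start
        up = min(up, (ts[i] + eps - ts[start]) / dx)
        low = max(low, (ts[i] - eps - ts[start]) / dx)
        if low > up:                # doors crossed: close the segment
            segments.append((start, i - 1))
            start = i - 1
            up, low = float("inf"), float("-inf")
            continue                # re-test point i against the new pivot
        i += 1
    segments.append((start, len(ts) - 1))
    return segments

# A gentle ramp followed by a steep one splits into separate segments.
print(swinging_door([0, 1, 2, 3, 10, 20, 30], 0.5))
```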

  6. Optimization of an Accelerometer and Gyroscope-Based Fall Detection Algorithm

    Directory of Open Access Journals (Sweden)

    Quoc T. Huynh

    2015-01-01

    Full Text Available Falling is a common and significant cause of injury in elderly adults (>65 yrs old), often leading to disability and death. In the USA, one in three of the elderly suffers fall injuries annually. This study's purpose is to develop, optimize, and assess the efficacy of a fall detection algorithm based upon a wireless, wearable sensor system (WSS) comprised of a 3-axis accelerometer and gyroscope. For this study, the WSS is placed at the center of the chest to collect real-time motion data of various simulated daily activities (i.e., walking, running, stepping, and falling). Tests were conducted on 36 human subjects with a total of 702 different movements collected in a laboratory setting. Half of the dataset was used for development of the fall detection algorithm, including investigation of critical sensor thresholds, and the remaining dataset was used for assessment of algorithm sensitivity and specificity. Experimental results show that the algorithm detects falls, compared to other daily movements, with a sensitivity and specificity of 96.3% and 96.2%, respectively. The addition of gyroscope information enhances sensitivity dramatically over results in the literature, as angular velocity changes provide further delineation of a fall event from other activities that may also exhibit high acceleration peaks.
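    A threshold-based accelerometer-plus-gyroscope detector of the kind described above can be sketched as follows. The threshold values and the sample data are illustrative placeholders, not the tuned values from the study:

```python
import math

def detect_fall(samples, acc_thresh=2.5, gyro_thresh=240.0):
    """Flag a fall when a high total-acceleration peak (in g) coincides
    with a high angular-velocity peak (in deg/s) in the same sample.
    Thresholds are hypothetical, chosen only for illustration."""
    for acc, gyro in samples:
        a_mag = math.sqrt(sum(x * x for x in acc))   # acceleration magnitude
        g_mag = math.sqrt(sum(x * x for x in gyro))  # angular rate magnitude
        if a_mag > acc_thresh and g_mag > gyro_thresh:
            return True
    return False

# Synthetic traces: steady walking vs. walking ending in an impact.
walking = [((0.1, 0.9, 0.2), (30.0, 10.0, 5.0))] * 50
fall = walking + [((2.0, 2.4, 1.1), (250.0, 90.0, 40.0))]
print(detect_fall(walking), detect_fall(fall))  # False True
```

    Requiring both peaks at once is what the abstract credits for the specificity gain: running also produces acceleration spikes, but without the accompanying rotation of a fall.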

  7. Planar Hall effect sensor bridge geometries optimized for magnetic bead detection

    DEFF Research Database (Denmark)

    Østerberg, Frederik Westergaard; Rizzi, Giovanni; Henriksen, Anders Dahl

    2014-01-01

    Novel designs of planar Hall effect bridge sensors optimized for magnetic bead detection are presented and characterized. By constructing the sensor geometries appropriately, the sensors can be tailored to be sensitive to an external magnetic field, the magnetic field due to beads being magnetized...... by the sensor self-field or a combination thereof. The sensors can be made nominally insensitive to small external magnetic fields, while being maximally sensitive to magnetic beads, magnetized by the sensor self-field. Thus, the sensor designs can be tailored towards specific applications with minimal...... of the dynamic magnetic response of suspensions of magnetic beads with a nominal diameter of 80 nm are performed. Furthermore, a method to amplify the signal by appropriate combinations of multiple sensor segments is demonstrated....

  8. Optimal filter design with progressive genetic algorithm for local damage detection in rolling bearings

    Science.gov (United States)

    Wodecki, Jacek; Michalak, Anna; Zimroz, Radoslaw

    2018-03-01

    Harsh industrial conditions in underground mining make local damage detection in heavy-duty machinery difficult. For vibration signals, one of the most intuitive approaches to obtaining a signal with the expected properties, such as clearly visible informative features, is prefiltration with an appropriately designed filter. The design of such filters is a very broad field of research in its own right. In this paper the authors propose a novel approach to dedicated optimal filter design using a progressive genetic algorithm. The presented method is fully data-driven and requires no prior knowledge of the signal. It has been tested against a set of real and simulated data, and its effectiveness has been proven for both the healthy and the damaged case. A termination criterion for the evolution process was developed, and a diagnostic decision-making feature is proposed for determining the final result.
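    A plain genetic algorithm of the kind underlying such filter-design methods can be sketched as below. The fitness is a toy stand-in: in the paper's setting it would filter the vibration signal with the candidate band and score impulsiveness (e.g., kurtosis), and the "progressive" elements of the authors' algorithm are not reproduced:

```python
import random

def genetic_algorithm(fitness, bounds, pop=30, gens=60, seed=1):
    """Plain GA (tournament selection, blend crossover, Gaussian mutation)
    maximizing `fitness` over a box-bounded parameter vector, e.g. the
    [low, high] edges of a filter passband."""
    rng = random.Random(seed)
    dim = len(bounds)
    P = [[rng.uniform(*bounds[d]) for d in range(dim)] for _ in range(pop)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop):
            # Two tournament-of-3 parents, midpoint crossover, small mutation.
            a, b = (max(rng.sample(P, 3), key=fitness) for _ in range(2))
            child = [(x + y) / 2 + rng.gauss(0, 0.05) for x, y in zip(a, b)]
            child = [min(max(c, lo), hi) for c, (lo, hi) in zip(child, bounds)]
            nxt.append(child)
        P = nxt
    return max(P, key=fitness)

# Toy stand-in fitness with its peak at the band (0.3, 0.5).
toy = lambda band: -((band[0] - 0.3) ** 2 + (band[1] - 0.5) ** 2)
best = genetic_algorithm(toy, [(0.0, 1.0), (0.0, 1.0)])
print(best)  # approximately [0.3, 0.5]
```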

  9. Applying the J-optimal channelized quadratic observer to SPECT myocardial perfusion defect detection

    Science.gov (United States)

    Kupinski, Meredith K.; Clarkson, Eric; Ghaly, Michael; Frey, Eric C.

    2016-03-01

    To evaluate performance on a perfusion defect detection task from 540 image pairs of myocardial perfusion SPECT image data, we apply the J-optimal channelized quadratic observer (J-CQO). We compare AUC values of the linear Hotelling observer and J-CQO when the defect location is fixed and when it occurs in one of two locations. As expected, when the location is fixed a single channel maximizes the AUC; location variability requires multiple channels to maximize the AUC. The AUC is estimated from both the projection data and reconstructed images. J-CQO is quadratic since it uses the first- and second-order statistics of the image data from both classes. The linear data reduction by the channels is described by an L x M channel matrix, and in prior work we introduced an iterative gradient-based method for calculating the channel matrix. The dimensionality reduction from M measurements to L channels yields better estimates of these sample statistics from smaller sample sizes, and since the channelized covariance matrix is L x L instead of M x M, the matrix inverse is easier to compute. The novelty of our approach is the use of Jeffrey's divergence (J) as the figure of merit (FOM) for optimizing the channel matrix. We previously showed that the J-optimal channels are also the optimum channels for the AUC and the Bhattacharyya distance when the channel outputs are Gaussian distributed with equal means. This work evaluates the use of J as a surrogate FOM (SFOM) for the AUC when these statistical conditions are not satisfied.
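    The linear Hotelling observer used as the comparison baseline is easy to demonstrate on synthetic channel outputs. A sketch under assumed Gaussian statistics (the data below are simulated low-dimensional channel outputs, not the SPECT data of the study):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical channel outputs: class 0 (no defect) and class 1 (defect)
# differ in mean and share a covariance.
n, L = 500, 4
cov = np.diag([1.0, 0.5, 0.25, 0.125])
g0 = rng.multivariate_normal(np.zeros(L), cov, n)
g1 = rng.multivariate_normal(np.array([0.8, 0.4, 0.2, 0.1]), cov, n)

# Hotelling template w = S^{-1} (mean1 - mean0); test statistic t = w . g.
S = 0.5 * (np.cov(g0.T) + np.cov(g1.T))
w = np.linalg.solve(S, g1.mean(0) - g0.mean(0))
t0, t1 = g0 @ w, g1 @ w

# Nonparametric AUC: fraction of (defect, no-defect) pairs ranked correctly.
auc = (t1[:, None] > t0[None, :]).mean()
print(round(auc, 3))
```

    The quadratic J-CQO extends this by also exploiting the difference between the two class covariances, which a purely linear template cannot use.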

  10. An optimized pentaplex PCR for detecting DNA mismatch repair-deficient colorectal cancers.

    Directory of Open Access Journals (Sweden)

    Ajay Goel

    2010-02-01

    Full Text Available Microsatellite instability (MSI) is used to screen colorectal cancers (CRC) for Lynch syndrome, and to predict outcome and response to treatment. The current technique for measuring MSI requires DNA from normal and neoplastic tissues, and fails to identify tumors with specific DNA mismatch repair (MMR) defects. We tested a panel of five quasi-monomorphic mononucleotide repeat markers amplified in a single multiplex PCR reaction (pentaplex PCR) to detect MSI. We investigated a cohort of 213 CRC patients, comprising 114 MMR-deficient and 99 MMR-proficient tumors. Immunohistochemical (IHC) analysis evaluated the expression of MLH1, MSH2, PMS2 and MSH6. MSI status was defined by differences in the quasi-monomorphic variation range (QMVR) from a pool of normal DNA samples, and by measuring differences in allele lengths in tumor DNA. Amplification of 426 normal alleles allowed optimization of the QMVR at each marker, and eliminated the requirement for matched reference DNA to define MSI in each sample. Using >= 2/5 unstable markers as the criterion for MSI resulted in a sensitivity of 95.6% (95% CI = 90.1%-98.1%) and a positive predictive value of 100% (95% CI = 96.6%-100%). Detection of MSH6-deficiency was limited using all techniques. Data analysis with a three-marker panel (BAT26, NR21 and NR27) was comparable in sensitivity (97.4%) and positive predictive value (96.5%) to the five-marker panel. Both approaches were superior to the standard approach to measuring MSI. An optimized pentaplex (or triplex) PCR offers a facile, robust, very inexpensive, highly sensitive, and specific assay for the identification of MSI in CRC.
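    The QMVR-based call reduces to a simple rule: a marker is unstable if the tumor allele length falls outside that marker's variation range from pooled normal DNA, and a tumor is MSI if at least two of the five markers are unstable. The marker sizes and ranges below are illustrative placeholders, not the calibrated values from the study:

```python
def msi_status(tumor_alleles, qmvr, min_unstable=2):
    """Call MSI when >= min_unstable markers fall outside each marker's
    quasi-monomorphic variation range (QMVR); otherwise call MSS."""
    unstable = [m for m, size in tumor_alleles.items()
                if not (qmvr[m][0] <= size <= qmvr[m][1])]
    return ("MSI" if len(unstable) >= min_unstable else "MSS"), unstable

# Hypothetical QMVRs (allele lengths in bp) for a standard pentaplex panel.
qmvr = {"BAT25": (120, 124), "BAT26": (114, 118), "NR21": (98, 102),
        "NR24": (130, 134), "NR27": (84, 88)}
# Hypothetical tumor allele lengths: two markers shifted out of range.
tumor = {"BAT25": 117, "BAT26": 110, "NR21": 100, "NR24": 132, "NR27": 86}
print(msi_status(tumor, qmvr))  # ('MSI', ['BAT25', 'BAT26'])
```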

  11. Optimal pcr primers for rapid and accurate detection of Aspergillus flavus isolates.

    Science.gov (United States)

    Al-Shuhaib, Mohammed Baqur S; Albakri, Ali H; Alwan, Sabah H; Almandil, Noor B; AbdulAzeez, Sayed; Borgio, J Francis

    2018-03-01

    Aspergillus flavus is among the most devastating opportunistic pathogens of several food crops including rice, due to its high production of carcinogenic aflatoxins. The presence of these organisms in economically important rice strip farming is a serious food safety concern. Several polymerase chain reaction (PCR) primers have been designed to detect this species; however, a comparative assessment of their accuracy has not been conducted. This study aims to identify the optimal diagnostic PCR primers for the identification of A. flavus, among widely available primers. We isolated 122 A. flavus native isolates from randomly collected rice strips (N = 300). We identified 109 isolates to the genus level using universal fungal PCR primer pairs. Nine pairs of primers were examined for their PCR diagnostic specificity on the 109 isolates. FLA PCR was found to be the optimal PCR primer pair for specific identification of the native isolates, over aflP(1), aflM, aflA, aflD, aflP(3), aflP(2), and aflR. The PEP primer pair was found to be the most unsuitable for A. flavus identification. In conclusion, the present study indicates the powerful specificity of the FLA PCR primer over other commonly available diagnostic primers for accurate, rapid, and large-scale identification of A. flavus native isolates. This study provides the first simple, practical comparative guide to PCR-based screening of A. flavus infection in rice strips. Copyright © 2018 Elsevier Ltd. All rights reserved.

  12. A Swarm Optimization Algorithm for Multimodal Functions and Its Application in Multicircle Detection

    Directory of Open Access Journals (Sweden)

    Erik Cuevas

    2013-01-01

    Full Text Available In engineering problems, due to physical and cost constraints, the best results obtained by a global optimization algorithm cannot always be realized. Under such conditions, if multiple solutions (local and global) are known, the implementation can be quickly switched to another solution without much interruption of the design process. This paper presents a new swarm multimodal optimization algorithm named collective animal behavior (CAB). Animal groups, such as schools of fish, flocks of birds, swarms of locusts, and herds of wildebeest, exhibit a variety of behaviors including swarming about a food source, milling around a central location, or migrating over large distances in aligned groups. These collective behaviors are often advantageous to groups, allowing them to increase their harvesting efficiency, follow better migration routes, improve their aerodynamics, and avoid predation. In the proposed algorithm, searcher agents emulate a group of animals which interact with each other based on simple biological laws that are modeled as evolutionary operators. Numerical experiments are conducted to compare the proposed method with state-of-the-art methods on benchmark functions. The proposed algorithm has also been applied to the engineering problem of multi-circle detection, achieving satisfactory results.

  13. Adaptive Spot Detection With Optimal Scale Selection in Fluorescence Microscopy Images.

    Science.gov (United States)

    Basset, Antoine; Boulanger, Jérôme; Salamero, Jean; Bouthemy, Patrick; Kervrann, Charles

    2015-11-01

    Accurately detecting subcellular particles in fluorescence microscopy is of primary interest for further quantitative analysis such as counting, tracking, or classification. Our primary goal is to segment vesicles likely to share nearly the same size in fluorescence microscopy images. Our method, termed adaptive thresholding of Laplacian of Gaussian (LoG) images with autoselected scale (ATLAS), automatically selects the optimal scale corresponding to the most frequent spot size in the image. Four criteria are proposed and compared to determine the optimal scale in a scale-space framework. Then, the segmentation stage amounts to thresholding the LoG of the intensity image. In contrast to other methods, the threshold is locally adapted given a probability of false alarm (PFA) specified by the user for the whole set of images to be processed. The local threshold is automatically derived from the PFA value and local image statistics estimated in a window whose size is not a critical parameter. We also propose a new data set for benchmarking, consisting of six collections of one hundred images each, which exploit backgrounds extracted from real microscopy images. We have carried out an extensive comparative evaluation on several data sets with ground truth, which demonstrates that ATLAS outperforms existing methods. ATLAS does not need any fine parameter tuning and requires very low computation time. Convincing results are also reported on real total internal reflection fluorescence microscopy images.

  14. Optimization of a tomosynthesis system for the detection of lung nodules

    International Nuclear Information System (INIS)

    Pineda, Angel R.; Yoon, Sungwon; Paik, David S.; Fahrig, Rebecca

    2006-01-01

    Mathematical observers that track human performance can be used to reduce the number of human observer studies needed to optimize imaging systems. The performance of human observers for the detection of a 3.6 mm lung nodule in anatomical backgrounds was measured as a function of varying tomosynthetic angle and compared with mathematical observers. The human observer results showed a dramatic increase in the percentage of correct responses, from 80% in the projection images to 96% with a tomosynthetic angle of just 3 degrees. This result suggests the potential usefulness of the scanned beam digital x-ray system for this application. Given the small number of images (40) used per tomosynthetic angle and the highly nonstationary statistical nature of the backgrounds, the nonprewhitening eye observer achieved a higher performance than the channelized Hotelling observer using a Laguerre-Gauss basis. The channelized Hotelling observer with internal noise and the eye filter matched to the projection data were shown to track human performance as the tomosynthetic angle changed. The validation of these mathematical observers extends their applicability to the optimization of tomosynthesis systems.

  15. An inequality for detecting financial fraud, derived from the Markowitz Optimal Portfolio Theory

    Science.gov (United States)

    Bard, Gregory V.

    2016-12-01

    The Markowitz Optimal Portfolio Theory, published in 1952, is well known and was often taught because it blends Lagrange multipliers, matrices, statistics, and mathematical finance. However, the theory faded from prominence in American investing as business departments at US universities shifted from techniques based on mathematics, finance, and statistics to focus instead on leadership, public speaking, interpersonal skills, advertising, etc. The author proposes a new application of Markowitz's theory: the detection of a fairly broad category of financial fraud (called "Ponzi schemes" in American newspapers) by examining a particular inequality derived from the Markowitz Optimal Portfolio Theory, relating volatility and expected rate of return. For example, one recent Ponzi scheme was that of Bernard Madoff, uncovered in December 2008, which comprised fraud totaling 64,800,000,000 US dollars [23]. The objective is to compare investments with the "efficient frontier" predicted by Markowitz's theory. Violations of the inequality should be impossible in theory; therefore, in practice, violations might indicate fraud.
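    The inequality test can be made concrete with the closed-form minimum-variance frontier (unconstrained weights, derived via Lagrange multipliers). The market data below are hypothetical: a fund reporting returns at a volatility below the frontier's minimum is, in theory, impossible, which is the red flag the paper describes:

```python
import numpy as np

def min_frontier_vol(mu, cov, target):
    """Minimum attainable volatility for a target expected return on the
    Markowitz frontier: var = (a*m^2 - 2*b*m + c) / (a*c - b^2), with
    a = 1'S^-1 1, b = 1'S^-1 mu, c = mu'S^-1 mu."""
    inv = np.linalg.inv(cov)
    ones = np.ones(len(mu))
    a = ones @ inv @ ones
    b = ones @ inv @ mu
    c = mu @ inv @ mu
    var = (a * target**2 - 2 * b * target + c) / (a * c - b**2)
    return float(np.sqrt(var))

# Hypothetical three-asset market (annual returns and covariance).
mu = np.array([0.04, 0.08, 0.12])
cov = np.array([[0.04, 0.01, 0.00],
                [0.01, 0.09, 0.02],
                [0.00, 0.02, 0.16]])

# A fund claiming 12% returns at 5% volatility: below the frontier bound?
reported_ret, reported_vol = 0.12, 0.05
bound = min_frontier_vol(mu, cov, reported_ret)
print(bound > reported_vol)  # True -> reported volatility is implausibly low
```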

  16. Gravity Recovery and Climate Experiment (GRACE) detection of water storage changes in the Three Gorges Reservoir of China and comparison with in situ measurements

    Science.gov (United States)

    Wang, Xianwei; de Linage, Caroline; Famiglietti, James; Zender, Charles S.

    2011-12-01

    Water impoundment in the Three Gorges Reservoir (TGR) of China caused a large mass redistribution from the oceans to a concentrated land area in a short time period. We show that this mass shift is captured by the Gravity Recovery and Climate Experiment (GRACE) unconstrained global solutions at a 400 km spatial resolution after removing correlated errors. The WaterGAP Global Hydrology Model (WGHM) is selected to isolate the TGR contribution from regional water storage changes. For the first time, this study compares the GRACE (minus WGHM) estimated TGR volume changes with in situ measurements from April 2002 to May 2010 at a monthly time scale. During the 8 year study period, GRACE-WGHM estimated TGR volume changes show an increasing trend consistent with the TGR in situ measurements and lead to similar estimates of impounded water volume. GRACE-WGHM estimated total volume increase agrees to within 14% (3.2 km3) of the in situ measurements. This indicates that GRACE can retrieve the true amplitudes of large surface water storage changes in a concentrated area that is much smaller than the spatial resolution of its global harmonic solutions. The GRACE-WGHM estimated TGR monthly volume changes explain 76% (r2 = 0.76) of in situ measurement monthly variability and have an uncertainty of 4.62 km3. Our results also indicate reservoir leakage and groundwater recharge due to TGR filling and contamination from neighboring lakes are nonnegligible in the GRACE total water storage changes. Moreover, GRACE observations could provide a relatively accurate estimate of global water volume withheld by newly constructed large reservoirs and their impacts on global sea level rise since 2002.

  17. YAHA: fast and flexible long-read alignment with optimal breakpoint detection.

    Science.gov (United States)

    Faust, Gregory G; Hall, Ira M

    2012-10-01

    With improved short-read assembly algorithms and the recent development of long-read sequencers, split mapping will soon be the preferred method for structural variant (SV) detection. Yet, current alignment tools are not well suited for this. We present YAHA, a fast and flexible hash-based aligner. YAHA is as fast and accurate as BWA-SW at finding the single best alignment per query and is dramatically faster and more sensitive than both SSAHA2 and MegaBLAST at finding all possible alignments. Unlike other aligners that report all, or one, alignment per query, or that use simple heuristics to select alignments, YAHA uses a directed acyclic graph to find the optimal set of alignments that cover a query using a biologically relevant breakpoint penalty. YAHA can also report multiple mappings per defined segment of the query. We show that YAHA detects more breakpoints in less time than BWA-SW across all SV classes, and especially excels at complex SVs comprising multiple breakpoints. YAHA is currently supported on 64-bit Linux systems. Binaries and sample data are freely available for download from http://faculty.virginia.edu/irahall/YAHA. imh4y@virginia.edu.

  18. A Pathological Brain Detection System based on Extreme Learning Machine Optimized by Bat Algorithm.

    Science.gov (United States)

    Lu, Siyuan; Qiu, Xin; Shi, Jianping; Li, Na; Lu, Zhi-Hai; Chen, Peng; Yang, Meng-Meng; Liu, Fang-Yuan; Jia, Wen-Juan; Zhang, Yudong

    2017-01-01

    It is beneficial to classify brain images as healthy or pathological automatically, because 3D brain images contain so much information that manual analysis is time-consuming and tedious. Among various 3D brain imaging techniques, magnetic resonance (MR) imaging is the most suitable for the brain and is now widely applied in hospitals, because it is helpful in diagnosis, prognosis, and pre-surgical and post-surgical procedures. Automatic detection methods exist; however, they suffer from low accuracy. Therefore, we proposed a novel approach which employed 2D discrete wavelet transform (DWT) and calculated the entropies of the subbands as features. Then, a bat algorithm optimized extreme learning machine (BA-ELM) was trained to identify pathological brains from healthy controls. A 10×10-fold cross-validation was performed to evaluate the out-of-sample performance. The method achieved a sensitivity of 99.04%, a specificity of 93.89%, and an overall accuracy of 98.33% over 132 MR brain images. The experimental results suggest that the proposed approach is accurate and robust in pathological brain detection. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
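    The ELM core, independent of the bat-algorithm tuning, is a one-shot least-squares fit on a random hidden layer: hidden weights are drawn at random and only the output weights are solved for, with no iterative backpropagation. A minimal sketch on a toy task (the bat algorithm, which the paper uses to tune the ELM, is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(42)

def elm_train(X, y, hidden=64):
    """Extreme learning machine: random hidden layer, output weights
    solved in closed form by least squares."""
    W = rng.normal(size=(X.shape[1], hidden))   # random, never trained
    b = rng.normal(size=hidden)
    H = np.tanh(X @ W + b)                      # hidden activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy binary task: label is the sign of x0 * x1 (not linearly separable).
X = rng.uniform(-1, 1, size=(400, 2))
y = np.sign(X[:, 0] * X[:, 1])
W, b, beta = elm_train(X, y)
acc = float(((elm_predict(X, W, b, beta) > 0) == (y > 0)).mean())
print(acc)
```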

  19. Root Exploit Detection and Features Optimization: Mobile Device and Blockchain Based Medical Data Management.

    Science.gov (United States)

    Firdaus, Ahmad; Anuar, Nor Badrul; Razak, Mohd Faizal Ab; Hashem, Ibrahim Abaker Targio; Bachok, Syafiq; Sangaiah, Arun Kumar

    2018-05-04

    The increasing demand for Android mobile devices and blockchain has motivated malware creators to develop mobile malware to compromise the blockchain. Although the blockchain is secure, attackers have managed to gain access to the blockchain as legal users, thereby compromising important and crucial information. Examples of mobile malware include root exploits, botnets, and Trojans, with root exploits among the most dangerous: they compromise the operating system kernel in order to gain root privileges, which are then used by attackers to bypass security mechanisms, gain complete control of the operating system, install other types of malware on the devices, and finally steal victims' private keys linked to the blockchain. For the purpose of maximizing the security of blockchain-based medical data management (BMDM), it is crucial to investigate the novel features and approaches contained in root exploit malware. This study proposes the bio-inspired method of particle swarm optimization (PSO), which automatically selects the exclusive features, including those of the novel Android Debug Bridge (ADB). This study also adopts boosting (AdaBoost, RealAdaBoost, LogitBoost, and MultiBoost) to enhance the machine learning prediction that detects unknown root exploits, and scrutinizes three categories of features: (1) system commands, (2) directory paths, and (3) code-based features. The evaluation suggests a marked accuracy value of 93% with LogitBoost in the simulation. LogitBoost also predicted all the root exploit samples in our developed system, the root exploit detection system (RODS).

  20. Transport of reservoir fines

    DEFF Research Database (Denmark)

    Yuan, Hao; Shapiro, Alexander; Stenby, Erling Halfdan

    Modeling transport of reservoir fines is of great importance for evaluating the damage of production wells and injectivity decline. The conventional methodology accounts for neither the formation heterogeneity around the wells nor the reservoir fines' heterogeneity. We have developed an integral...... dispersion equation in modeling the transport and the deposition of reservoir fines. It successfully predicts the unsymmetrical concentration profiles and the hyperexponential deposition in experiments....

  1. Segmentation and abnormality detection of cervical cancer cells using fast elm with particle swarm optimization

    Directory of Open Access Journals (Sweden)

    Sukumar P.

    2015-01-01

    Full Text Available Cervical cancer arises when abnormal cells on the cervix grow uncontrollably, most often in the transformation zone. In the routinely used detection methods it is difficult to distinguish abnormal from normal nuclei; after staining, abnormal nuclei appear brown while normal nuclei appear blue. The smear cells are examined, and image denoising is performed based on an iterative decision-based algorithm. Image segmentation is the method of partitioning a digital image into multiple sections; its major use is to simplify or modify the representation of an image. The images are segmented by applying anisotropic diffusion to the denoised image, and the image is enhanced using dark stretching to increase its quality. The method separates the cells into an all-nuclei region and an abnormal-nuclei region. The abnormal nuclei regions are further classified into touching and non-touching regions, and the touching regions undergo a feature selection process. The existing support vector machine (SVM) classifies only some nuclei regions, and its execution time is high. The abnormality detected from the image is calculated as 45% of the total abnormal nuclei. Thus the proposed method, Fast Particle Swarm Optimization with Extreme Learning Machine (Fast PSO-ELM), classifies all nuclei regions further into touching and separated regions, and the iterative method for training the ELM makes it more efficient than the SVM method. In the experimental results, the proposed Fast PSO-ELM shows an accuracy above 90%, with execution time calculated based on the abnormality (the ratio of abnormal nuclei regions to all nuclei regions) in the image. Therefore, Fast PSO-ELM helps to detect cervical cancer cells with maximum accuracy.

  2. Ecological operation for Three Gorges Reservoir

    Directory of Open Access Journals (Sweden)

    Wen-xian Guo

    2011-06-01

    Full Text Available The traditional operation of the Three Gorges Reservoir has mainly focused on water for flood control, power generation, navigation, water supply, and recreation, and given less attention to the negative impacts of reservoir operation on the river ecosystem. In order to reduce the negative influence of reservoir operation, ecological operation of the reservoir should be studied with a focus on maintaining a healthy river ecosystem. This study considered ecological operation targets, including maintaining the river environmental flow and protecting the spawning and reproduction of the Chinese sturgeon and four major Chinese carps. Using flow data from 1900 to 2006 at the Yichang gauging station as the control station data for the Yangtze River, the minimal and optimal river environmental flows were analyzed, and eco-hydrological targets for the Chinese sturgeon and four major Chinese carps in the Yangtze River were calculated. This paper proposes a reservoir ecological operation model, which comprehensively considers flood control, power generation, navigation, and the ecological environment. Three typical periods, wet, normal, and dry years, were selected, and the particle swarm optimization algorithm was used to analyze the model. The results show that ecological operation modes have different effects on the economic benefit of the hydropower station, and the reservoir ecological operation model can simulate the flood pulse for the requirements of spawning of the Chinese sturgeon and four major Chinese carps. According to the results, by adopting a suitable re-operation scheme, the hydropower benefit of the reservoir will not decrease dramatically while the ecological demand is met. The results provide a reference for designing reasonable operation schemes for the Three Gorges Reservoir.
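    The particle swarm optimization algorithm used to solve the reservoir operation model can be sketched generically as follows. This is a textbook global-best PSO applied to a toy release-scheduling cost, not the authors' multi-objective model; the bounds, coefficients, and target flow are invented for illustration.

```python
import numpy as np

def pso_minimize(f, bounds, n_particles=30, n_iter=200,
                 w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimal global-best particle swarm optimizer over box bounds."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    dim = len(lo)
    x = rng.uniform(lo, hi, size=(n_particles, dim))   # positions
    v = np.zeros_like(x)                               # velocities
    pbest, pbest_f = x.copy(), np.array([f(p) for p in x])
    g = pbest[np.argmin(pbest_f)].copy()               # global best
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(p) for p in x])
        better = fx < pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        g = pbest[np.argmin(pbest_f)].copy()
    return g, pbest_f.min()

# toy "release schedule": minimize deviation from an ecological target flow
target = np.array([3.0, 5.0, 4.0])
best, cost = pso_minimize(lambda q: ((q - target) ** 2).sum(),
                          bounds=[(0, 10)] * 3)
```

    A real reservoir model would replace the quadratic cost with the weighted flood-control, power-generation, navigation, and ecological-flow objectives described above.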

  3. Optimization of an NLEO-based algorithm for automated detection of spontaneous activity transients in early preterm EEG

    International Nuclear Information System (INIS)

    Palmu, Kirsi; Vanhatalo, Sampsa; Stevenson, Nathan; Wikström, Sverre; Hellström-Westas, Lena; Palva, J Matias

    2010-01-01

    We propose here a simple algorithm for automated detection of spontaneous activity transients (SATs) in early preterm electroencephalography (EEG). The parameters of the algorithm were optimized by supervised learning using a gold standard created from visual classification data obtained from three human raters. The generalization performance of the algorithm was estimated by leave-one-out cross-validation. The mean sensitivity of the optimized algorithm was 97% (range 91–100%) and specificity 95% (76–100%). The optimized algorithm makes it possible to systematically study brain state fluctuations of preterm infants. (note)
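    The abstract does not spell out the detector, but an NLEO-based transient detector of this general kind typically computes the nonlinear energy operator of the signal, smooths it, and thresholds the result. A minimal sketch follows; the smoothing window and the median-based threshold rule are assumptions, not the paper's optimized parameters.

```python
import numpy as np

def nleo(x):
    """Nonlinear energy operator: psi[n] = x[n]^2 - x[n-1]*x[n+1]."""
    psi = np.zeros_like(x, dtype=float)
    psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
    return psi

def detect_bursts(x, fs, win_s=1.0, thresh=None):
    """Smooth |NLEO| with a moving average and threshold it to mark
    high-activity segments (candidate transients)."""
    e = np.abs(nleo(x))
    k = max(1, int(win_s * fs))
    smooth = np.convolve(e, np.ones(k) / k, mode="same")
    if thresh is None:
        thresh = 3 * np.median(smooth)  # crude data-driven threshold
    return smooth > thresh

# synthetic trace: quiescence with one burst of higher-amplitude oscillation
fs = 100
t = np.arange(0, 10, 1 / fs)
x = 0.05 * np.sin(2 * np.pi * 1 * t)
x[400:600] += 1.0 * np.sin(2 * np.pi * 5 * t[400:600])
mask = detect_bursts(x, fs)
```

    In the paper, the free parameters of such a detector (window, threshold) are what supervised learning against the human-rated gold standard would tune.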

  4. Operational trade-offs in reservoir control

    Science.gov (United States)

    Georgakakos, Aris P.

    1993-11-01

    Reservoir operation decisions require constant reevaluation in the face of conflicting objectives, varying hydrologic conditions, and frequent operational policy changes. Optimality is a relative concept very much dependent on the circumstances under which a decision is made. More than anything else, reservoir management authorities need the means to assess the impacts of various operational options. It is their responsibility to define what is desirable after a thorough evaluation of the existing circumstances. This article presents a model designed to generate operational trade-offs common among reservoir systems. The model avoids an all-encompassing problem formulation and distinguishes three operational modes (levels) corresponding to normal, drought, and flood operations. Each level addresses only relevant system elements and uses a static and a dynamic control module to optimize turbine performance within each planning period and temporally. The model is used for planning the operation of the Savannah River System.

  5. Reservoir Engineering Management Program

    Energy Technology Data Exchange (ETDEWEB)

    Howard, J.H.; Schwarz, W.J.

    1977-12-14

    The Reservoir Engineering Management Program being conducted at Lawrence Berkeley Laboratory includes two major tasks: 1) the continuation of support to geothermal reservoir engineering related work, started under the NSF-RANN program and transferred to ERDA at the time of its formation; 2) the development and subsequent implementation of a broad plan for support of research in topics related to the exploitation of geothermal reservoirs. This plan is now known as the GREMP plan. Both the NSF-RANN legacies and GREMP are in direct support of the DOE/DGE mission in general and the goals of the Resource and Technology/Resource Exploitation and Assessment Branch in particular. These goals are to determine the magnitude and distribution of geothermal resources and reduce risk in their exploitation through improved understanding of generically different reservoir types. These goals are to be accomplished by: 1) the creation of a large data base about geothermal reservoirs, 2) improved tools and methods for gathering data on geothermal reservoirs, and 3) modeling of reservoirs and utilization options. The NSF legacies are more research and training oriented, and the GREMP is geared primarily to the practical development of the geothermal reservoirs. 2 tabs., 3 figs.

  6. Optimization and Evaluation of a PCR Assay for Detecting Toxoplasmic Encephalitis in Patients with AIDS

    Science.gov (United States)

    Joseph, Priya; Calderón, Maritza M.; Gilman, Robert H.; Quispe, Monica L.; Cok, Jaime; Ticona, Eduardo; Chavez, Victor; Jimenez, Juan A.; Chang, Maria C.; Lopez, Martín J.; Evans, Carlton A.

    2002-01-01

    Toxoplasma gondii is a common life-threatening opportunistic infection. We used experimental murine T. gondii infection to optimize the PCR for diagnostic use, define its sensitivity, and characterize the time course and tissue distribution of experimental toxoplasmosis. PCR conditions were adjusted until the assay reliably detected quantities of DNA derived from less than a single parasite. Forty-two mice were inoculated intraperitoneally with T. gondii tachyzoites and sacrificed from 6 to 72 h later. Examination of tissues with PCR and histology revealed progression of infection from blood to lung, heart, liver, and brain, with PCR consistently detecting parasites earlier than microscopy and with no false-positive results. We then evaluated the diagnostic value of this PCR assay in human patients. We studied cerebrospinal fluid and serum samples from 12 patients with AIDS and confirmed toxoplasmic encephalitis (defined as positive mouse inoculation and/or all of the Centers for Disease Control clinical diagnostic criteria), 12 human immunodeficiency virus-infected patients with suspected cerebral toxoplasmosis who had neither CDC diagnostic criteria nor positive mouse inoculation, 26 human immunodeficiency virus-infected patients with other opportunistic infections and no signs of cerebral toxoplasmosis, and 18 immunocompetent patients with neurocysticercosis. Eleven of the 12 patients with confirmed toxoplasmosis had positive PCR results in either blood or cerebrospinal fluid samples (6 of 9 blood samples and 8 of 12 cerebrospinal fluid samples). All samples from control patients were negative. This study demonstrates the high sensitivity, specificity, and clinical utility of PCR in the diagnosis of toxoplasmic encephalitis in a resource-poor setting. PMID:12454142

  7. Optimal use of land surface temperature data to detect changes in tropical forest cover

    Science.gov (United States)

    Van Leeuwen, T. T.; Frank, A. J.; Jin, Y.; Smyth, P.; Goulden, M.; van der Werf, G.; Randerson, J. T.

    2011-12-01

    Rapid and accurate assessment of global forest cover change is needed to focus conservation efforts and to better understand how deforestation is contributing to the build up of atmospheric CO2. Here we examined different ways to use remotely sensed land surface temperature (LST) to detect changes in tropical forest cover. In our analysis we used monthly 0.05×0.05 degree Terra MODerate Resolution Imaging Spectroradiometer (MODIS) observations of LST and PRODES (Program for the Estimation of Deforestation in the Brazilian Amazon) estimates of forest cover change. We also compared MODIS LST observations with an independent estimate of forest cover loss derived from MODIS and Landsat observations. Our study domain of approximately 10×10 degree included most of the Brazilian state of Mato Grosso. For optimal use of LST data to detect changes in tropical forest cover in our study area, we found that using data sampled during the end of the dry season (~1-2 months after minimum monthly precipitation) had the greatest predictive skill. During this part of the year, precipitation was low, surface humidity was at a minimum, and the difference between day and night LST was the largest. We used this information to develop a simple temporal sampling algorithm appropriate for use in pan-tropical deforestation classifiers. Combined with the normalized difference vegetation index (NDVI), a logistic regression model using day-night LST did moderately well at predicting forest cover change. Annual changes in day-night LST difference decreased during 2006-2009 relative to 2001-2005 in many regions within the Amazon, providing independent confirmation of lower deforestation levels during the latter part of this decade as reported by PRODES. The use of day-night LST differences may be particularly valuable for use with satellites that do not have spectral bands that allow for the estimation of NDVI or other vegetation indices.
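    A logistic-regression classifier of the kind described, using the day-night LST difference together with NDVI, can be sketched as follows. The feature values below are synthetic stand-ins, not MODIS data; the class means are invented solely to mimic the qualitative pattern that cleared pixels show a larger day-night LST spread and lower NDVI.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 200
# hypothetical per-pixel features: day-night LST difference (K) and NDVI
dlst_forest = rng.normal(6.0, 1.0, n)    # intact forest: small spread
ndvi_forest = rng.normal(0.85, 0.03, n)
dlst_clear = rng.normal(12.0, 1.5, n)    # cleared: larger spread, lower NDVI
ndvi_clear = rng.normal(0.55, 0.05, n)

X = np.column_stack([np.r_[dlst_forest, dlst_clear],
                     np.r_[ndvi_forest, ndvi_clear]])
y = np.r_[np.zeros(n), np.ones(n)]       # 1 = forest-cover change

clf = LogisticRegression().fit(X, y)
acc = clf.score(X, y)
```

    The temporal-sampling result above amounts to choosing *when* in the year these features are extracted so that the two classes separate as cleanly as in this toy example.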

  8. A comparative study of standard vs. high definition colonoscopy for adenoma and hyperplastic polyp detection with optimized withdrawal technique.

    Science.gov (United States)

    East, J E; Stavrindis, M; Thomas-Gibson, S; Guenther, T; Tekkis, P P; Saunders, B P

    2008-09-15

    Colonoscopy has a known miss rate for polyps and adenomas. High definition (HD) colonoscopes may allow detection of subtle mucosal change, potentially aiding detection of adenomas and hyperplastic polyps. To compare detection rates between HD and standard definition (SD) colonoscopy. Prospective, cohort study with optimized withdrawal technique (withdrawal time >6 min, antispasmodic, position changes, re-examining flexures and folds). One hundred and thirty patients attending for routine colonoscopy were examined with either SD (n = 72) or HD (n = 58) colonoscopes. Groups were well matched. Sixty per cent of patients had at least one adenoma detected with SD vs. 71% with HD, P = 0.20, relative risk (benefit) 1.32 (95% CI 0.85-2.04). Eighty-eight adenomas (mean +/- standard deviation 1.2 +/- 1.4) were detected using SD vs. 93 (1.6 +/- 1.5) with HD, P = 0.12; however more nonflat, diminutive (9 mm) hyperplastic polyps was 7% (0.09 +/- 0.36). High definition did not lead to a significant increase in adenoma or hyperplastic polyp detection, but may help where comprehensive lesion detection is paramount. High detection rates appear possible with either SD or HD, when using an optimized withdrawal technique.

  9. TRUSTWORTHY OPTIMIZED CLUSTERING BASED TARGET DETECTION AND TRACKING FOR WIRELESS SENSOR NETWORK

    Directory of Open Access Journals (Sweden)

    C. Jehan

    2016-06-01

    Full Text Available In this paper, an efficient approach is proposed to address the problem of target tracking in wireless sensor networks (WSNs). The approach uses an adaptive dynamic clustering scheme for tracking the target, a specific problem in object tracking, and proceeds in three steps. The first step identifies clusters and cluster heads using OGSAFCM, in which kernel fuzzy c-means (KFCM) and the gravitational search algorithm (GSA) are combined to create clusters: first, the oppositional gravitational search algorithm (OGSA) is used to optimize the initial cluster centers, and then the KFCM algorithm guides the classification and cluster formation process. The OGSA introduces opposition-based population initialization into the basic GSA to improve the convergence profile. The identified clusters change dynamically. The second step deals with data transmission to the cluster heads, and the third step with transmission of the aggregated data to the base station as well as detection of the target. The experimental results show that the proposed scheme identifies the target effectively and efficiently, and as a result the tracking error is minimized.
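    The opposition-based initialization that distinguishes OGSA from plain GSA can be sketched in isolation: generate a random population, reflect each point through the center of the search box (the "opposite" point lo + hi - x), evaluate both sets, and keep the fitter half. The fitness function and bounds below are toy examples, not the paper's clustering objective.

```python
import numpy as np

def opposition_init(f, lo, hi, n, dim, seed=0):
    """Opposition-based population initialization: keep the n fittest
    points out of a random population and its opposite population."""
    rng = np.random.default_rng(seed)
    pop = rng.uniform(lo, hi, size=(n, dim))
    opp = lo + hi - pop                       # opposite points
    both = np.vstack([pop, opp])
    fitness = np.array([f(p) for p in both])
    return both[np.argsort(fitness)[:n]]      # fittest half survives

# toy fitness: squared distance to the centre of the box [0, 10]^2
init = opposition_init(lambda p: ((p - 5.0) ** 2).sum(),
                       0.0, 10.0, n=20, dim=2)
```

    Because every candidate is paired with its mirror image, the kept half starts closer to the optimum on average than a purely random population of the same size.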

  10. Inter-laboratory optimization of protein extraction, separation, and fluorescent detection of endogenous rice allergens.

    Science.gov (United States)

    Satoh, Rie; Teshima, Reiko; Kitta, Kazumi; Lang, Gang-Hua; Schegg, Kathleen; Blumenthal, Kenneth; Hicks, Leslie; Labory-Carcenac, Bénédicte; Rouquié, David; Herman, Rod A; Herouet-Guicheney, Corinne; Ladics, Gregory S; McClain, Scott; Poulsen, Lars K; Privalle, Laura; Ward, Jason M; Doerrer, Nancy; Rascle, Jean-Baptiste

    2016-07-11

    In rice, several allergens have been identified such as the non-specific lipid transfer protein-1, the α-amylase/trypsin-inhibitors, the α-globulin, the 33 kDa glyoxalase I (Gly I), the 52-63 kDa globulin, and the granule-bound starch synthetase. The goal of the present study was to define optimal rice extraction and detection methods that would allow a sensitive and reproducible measure of several classes of known rice allergens. In a three-laboratory ring-trial experiment, several protein extraction methods were first compared and analyzed by 1D multiplexed SDS-PAGE. In a second phase, an inter-laboratory validation of 2D-DIGE analysis was conducted in five independent laboratories, focusing on three rice allergens (52 kDa globulin, 33 kDa glyoxalase I, and 14-16 kDa α-amylase/trypsin inhibitor family members). The results of the present study indicate that a combination of 1D multiplexed SDS-PAGE and 2D-DIGE methods would be recommended to quantify the various rice allergens.

  11. Hazardous Traffic Event Detection Using Markov Blanket and Sequential Minimal Optimization (MB-SMO

    Directory of Open Access Journals (Sweden)

    Lixin Yan

    2016-07-01

    Full Text Available The ability to identify hazardous traffic events is considered one of the most effective solutions for reducing the occurrence of crashes. Only certain particular hazardous traffic events have been studied previously, mainly based on dedicated video stream data and GPS data. The objective of this study is twofold: (1) the Markov blanket (MB) algorithm is employed to extract the main factors associated with hazardous traffic events; (2) a model is developed to identify hazardous traffic events using driving characteristics, vehicle trajectory, and vehicle position data. Twenty-two licensed drivers were recruited to carry out a natural driving experiment in Wuhan, China, and multi-sensor information data were collected for different types of traffic events. The results indicated that a vehicle's speed, the standard deviation of speed, the standard deviation of skin conductance, the standard deviation of brake pressure, turn signal, the acceleration of steering, the standard deviation of acceleration, and the acceleration in Z (G) have significant influences on hazardous traffic events. The sequential minimal optimization (SMO) algorithm was adopted to build the identification model, and the accuracy of prediction was higher than 86%. Moreover, compared with other detection algorithms, the MB-SMO algorithm was ranked best in terms of prediction accuracy. The conclusions can provide reference evidence for the development of dangerous-situation warning products and the design of intelligent vehicles.
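    Sequential minimal optimization is the solver behind common kernel-SVM implementations (for example libsvm, which scikit-learn's SVC wraps), so an SMO-style event classifier can be approximated as follows. The two driving features and their distributions are invented stand-ins for the paper's multi-sensor data, not the study's dataset.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(42)
n = 300
# hypothetical features: s.d. of speed and s.d. of brake pressure
normal = np.column_stack([rng.normal(2.0, 0.5, n), rng.normal(0.3, 0.1, n)])
hazard = np.column_stack([rng.normal(5.0, 0.8, n), rng.normal(1.2, 0.3, n)])
X = np.vstack([normal, hazard])
y = np.r_[np.zeros(n), np.ones(n)]  # 1 = hazardous traffic event

# SVC trains a kernel SVM with an SMO-type solver
clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)
acc = clf.score(X, y)
```

    In the MB-SMO pipeline, the Markov blanket step would first prune the feature set to the factors listed above before the SMO-trained SVM is fitted.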

  12. Detective quantum efficiency: a standard test to ensure optimal detector performance and low patient exposures

    Science.gov (United States)

    Escartin, Terenz R.; Nano, Tomi F.; Cunningham, Ian A.

    2016-03-01

    The detective quantum efficiency (DQE), expressed as a function of spatial frequency, describes the ability of an x-ray detector to produce high signal-to-noise ratio (SNR) images. While regulatory and scientific communities have used the DQE as a primary metric for optimizing detector design, the DQE is rarely used by end users to ensure high system performance is maintained. Of concern is that image quality varies across different systems for the same exposures with no current measures available to describe system performance. Therefore, here we conducted an initial DQE measurement survey of clinical x-ray systems using a DQE-testing instrument to identify their range of performance. Following laboratory validation, experiments revealed that the DQE of five different systems under the same exposure level (8.0 μGy) ranged from 0.36 to 0.75 at low spatial frequencies, and 0.02 to 0.4 at high spatial frequencies (3.5 cycles/mm). Furthermore, the DQE dropped substantially with decreasing detector exposure by a factor of up to 1.5x in the lowest spatial frequency, and a factor of 10x at 3.5 cycles/mm due to the effect of detector readout noise. It is concluded that DQE specifications in purchasing decisions, combined with periodic DQE testing, are important factors to ensure patients receive the health benefits of high-quality images for low x-ray exposures.
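    One common form of the frequency-dependent DQE for a linear detector (the IEC 62220-1 form) is DQE(f) = d²·MTF²(f) / (q·NPS(f)), where d is the mean output signal, q the incident photon fluence, MTF the modulation transfer function, and NPS the noise power spectrum. The sketch below evaluates it for a hypothetical unit-gain detector whose only noise is Poisson counting noise shaped by the same MTF, for which the DQE is 1 at every frequency; the MTF shape and fluence are arbitrary illustrative choices.

```python
import numpy as np

def dqe(mtf, nps, mean_signal, fluence):
    """DQE(f) = mean_signal^2 * MTF(f)^2 / (fluence * NPS(f))
    for a linear, shift-invariant detector."""
    return (mean_signal ** 2) * mtf ** 2 / (fluence * nps)

f = np.linspace(0, 3.5, 8)          # spatial frequency, cycles/mm
fluence = 1000.0                    # incident photons per unit area
mean_signal = fluence               # gain of 1
mtf = np.exp(-0.3 * f)              # assumed MTF roll-off
nps = fluence * mtf ** 2            # Poisson NPS filtered by the same MTF
print(dqe(mtf, nps, mean_signal, fluence))  # → all ones
```

    Detector readout noise adds a frequency-independent floor to the NPS, which is exactly why the measured DQE in the survey above drops so sharply at low exposures and high frequencies.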

  13. Automated Sperm Head Detection Using Intersecting Cortical Model Optimised by Particle Swarm Optimization.

    Science.gov (United States)

    Tan, Weng Chun; Mat Isa, Nor Ashidi

    2016-01-01

    In human sperm motility analysis, sperm segmentation plays an important role to determine the location of multiple sperms. To ensure an improved segmentation result, the Laplacian of Gaussian filter is implemented as a kernel in a pre-processing step before applying the image segmentation process to automatically segment and detect human spermatozoa. This study proposes an intersecting cortical model (ICM), which was derived from several visual cortex models, to segment the sperm head region. However, the proposed method suffered from parameter selection; thus, the ICM network is optimised using particle swarm optimization where feature mutual information is introduced as the new fitness function. The final results showed that the proposed method is more accurate and robust than four state-of-the-art segmentation methods. The proposed method resulted in rates of 98.14%, 98.82%, 86.46% and 99.81% in accuracy, sensitivity, specificity and precision, respectively, after testing with 1200 sperms. The proposed algorithm is expected to be implemented in analysing sperm motility because of the robustness and capability of this algorithm.
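    The intersecting cortical model iterates a leaky internal state against a dynamic threshold; a pixel "fires" when its state exceeds its threshold, and brighter regions fire in earlier passes, which is what makes the firing map usable for segmentation. A minimal sketch follows; the coupling kernel, decay constants, and toy image are assumptions rather than the paper's PSO-tuned parameters, and the wrap-around of np.roll is harmless here because the bright region sits away from the borders.

```python
import numpy as np

def icm_segment(S, n_iter=5, f=0.9, g=0.8, h=20.0):
    """Minimal Intersecting Cortical Model: the state F decays and is
    recharged by the stimulus S plus 4-neighbour firing; a pixel fires
    (Y=1) when F exceeds its threshold T, which then jumps by h."""
    F = np.zeros_like(S, dtype=float)
    T = np.ones_like(S, dtype=float)
    Y = np.zeros_like(S, dtype=float)
    first_fire = np.zeros(S.shape, dtype=int)
    for n in range(1, n_iter + 1):
        coupling = (np.roll(Y, 1, 0) + np.roll(Y, -1, 0) +
                    np.roll(Y, 1, 1) + np.roll(Y, -1, 1))
        F = f * F + S + coupling
        Y = (F > T).astype(float)
        T = g * T + h * Y                     # fired pixels become refractory
        newly = (Y > 0) & (first_fire == 0)
        first_fire[newly] = n                 # record first firing pass
    return first_fire

# a bright square (stand-in for a sperm head) on a dark background
img = np.zeros((20, 20))
img[8:12, 8:12] = 1.0
passes = icm_segment(img)
```

    The bright square fires on pass 2 while the far background never fires within the run, so thresholding the firing map recovers the head region; a PSO wrapper as in the paper would tune f, g, and h against a fitness function such as feature mutual information.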

  14. Resonant Frequency Calculation and Optimal Design of Peano Fractal Antenna for Partial Discharge Detection

    Directory of Open Access Journals (Sweden)

    Jian Li

    2012-01-01

    Full Text Available Ultra-high-frequency (UHF) approaches have attracted increasing attention recently and are considered a promising technology for online monitoring of partial discharge (PD) signals. This paper presents a Peano fractal antenna for UHF PD online monitoring of transformers, with small size and multiband behavior. An approximate formula for calculating the first resonant frequency of the Peano fractal antenna is presented. The results show that the first resonant frequency of the Peano fractal antenna is lower than that of the Hilbert fractal antenna when the outer dimensions are approximately equivalent. The optimal geometric parameters of the antenna were obtained through simulation. PD experiments were carried out on two typical artificial insulation defect models, with the proposed antenna and the existing Hilbert antenna both used for the PD measurement. The experimental results show that the Peano fractal antenna is qualified for online UHF PD monitoring and, judging by the waveforms of the detected UHF PD signals, somewhat more suitable than the Hilbert fractal antenna for pattern recognition.

  15. Optimization of PIXE-sensitivity for detection of Ti in thin human skin sections

    International Nuclear Information System (INIS)

    Pallon, Jan; Garmer, Mats; Auzelyte, Vaida; Elfman, Mikael; Kristiansson, Per; Malmqvist, Klas; Nilsson, Christer; Shariff, Asad; Wegden, Marie

    2005-01-01

    Modern sunscreens contain particles such as TiO2, with sizes of 25-70 nm, that act as a reflecting substance. For cosmetic reasons the particle size is minimized. Questions have been raised as to what degree these nanoparticles penetrate the skin barrier and how they affect humans. The EU-funded project 'Quality of skin as a barrier to ultra-fine particles' (NANODERM) was started to evaluate the possible risks of TiO2 penetration into vital skin layers. The purpose of the work presented here was to find the optimal conditions for micro-PIXE analysis of Ti in thin skin sections. In the skin region where Ti is expected to be found, the naturally occurring major elements phosphorus, chlorine, sulphur and potassium have steep gradients and thus influence the X-ray background in a non-predictable manner. Based on experimental studies of Ti-exposed human skin sections using proton energies ranging from 1.8 to 2.55 MeV, the corresponding PIXE detection limits for Ti were calculated. The most favourable energy, 1.9 MeV, was then selected for future studies.

  16. Optimization of band-pass filtering parameters of a Raman lidar detecting atmospheric water vapor

    International Nuclear Information System (INIS)

    Cao, Kai-Fa; Hu, Shun-Xing; Wang, Ying-jian

    2012-01-01

    For daytime Raman lidar measurement of water vapor it is very important to determine the parameters of the band-pass filter, which govern the lidar signal-to-noise ratio (SNR). The simulated annealing (SA) algorithm has an advantage in finding the extremum of a given cost function. In this paper, the Raman spectrum of water vapor is simulated and a first application of a simulated annealing algorithm to the optimization of the band-pass filter of a Raman lidar system designed to detect daytime water vapor is presented. The simulated results indicate that a narrow band-pass filter yields a higher SNR than a wide filter, but increases the temperature sensitivity of a narrowband Raman water vapor lidar in the upper troposphere. The numerical simulation indicates that the magnitude of the temperature-dependent effect can reach 3.5% or more for narrow band-pass Raman water vapor measurements, so it is necessary to consider a new water vapor Raman lidar equation that confines the temperature sensitivity to a single term. (paper)
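    The simulated annealing loop itself is standard: propose a random move, always accept improvements, and accept worse moves with probability exp(-delta/T) while the temperature T cools. Below is a sketch with a toy one-dimensional stand-in for the filter-SNR cost function; the cooling schedule and step size are arbitrary choices, not the paper's.

```python
import math
import random

def anneal(cost, x0, step, t0=1.0, cooling=0.995, n_iter=5000, seed=3):
    """Textbook simulated annealing for a 1-D parameter."""
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    T = t0
    for _ in range(n_iter):
        cand = x + rng.uniform(-step, step)       # random neighbour
        fc = cost(cand)
        # accept improvements, or worse moves with Boltzmann probability
        if fc < fx or rng.random() < math.exp(-(fc - fx) / T):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        T *= cooling                              # geometric cooling
    return best, fbest

# toy stand-in for the SNR cost: optimum filter parameter at w = 2.5
best, fbest = anneal(lambda w: (w - 2.5) ** 2, x0=0.0, step=0.5)
```

    In the lidar application the scalar parameter would be replaced by the filter's center wavelength and bandwidth, and the cost by the (negative) simulated daytime SNR.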

  17. Optimal statistical damage detection and classification in an experimental wind turbine blade using minimum instrumentation

    Science.gov (United States)

    Hoell, Simon; Omenzetter, Piotr

    2017-04-01

    The increasing demand for carbon-neutral energy in a challenging economic environment is a driving factor for erecting ever larger wind turbines in harsh environments, using novel wind turbine blade (WTB) designs characterized by high flexibility and lower buckling capacity. To counteract the resulting increase in operation and maintenance costs, efficient structural health monitoring systems can be employed to prevent dramatic failures and to schedule maintenance actions according to the true structural state. This paper presents a novel methodology for classifying structural damage using vibrational responses from a single sensor. The method is based on statistical classification using Bayes' theorem and an advanced statistic, which allows the performance to be controlled by varying the number of samples representing the current state. This is done for multivariate damage sensitive features (DSFs) defined as partial autocorrelation coefficients (PACCs) estimated from vibrational responses, and principal component analysis scores computed from the PACCs. Additionally, optimal DSFs are composed not only for damage classification but also for damage detection based on binary statistical hypothesis testing, where feature selections are found with a fast forward procedure. The method is applied to laboratory experiments on a small-scale WTB with wind-like excitation and non-destructive damage scenarios. The obtained results demonstrate the advantages of the proposed procedure and are promising for future applications of vibration-based structural health monitoring in WTBs.
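    Partial autocorrelation coefficients, the damage-sensitive features used here, can be computed from the sample autocorrelations with the Levinson-Durbin recursion. The sketch below is a generic implementation (not the authors' code), validated on a synthetic AR(1) series whose theoretical PACF is large at lag 1 and zero beyond.

```python
import numpy as np

def pacf(x, nlags):
    """Partial autocorrelation coefficients via the Levinson-Durbin
    recursion on the (biased) sample autocorrelation sequence."""
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(nlags + 1)])
    r /= r[0]                                   # normalize to rho(k)
    phi = np.zeros((nlags + 1, nlags + 1))
    pac = np.zeros(nlags + 1)
    var = 1.0
    for k in range(1, nlags + 1):
        a = (r[k] - np.dot(phi[k - 1, 1:k], r[1:k][::-1])) / var
        phi[k, k] = a
        phi[k, 1:k] = phi[k - 1, 1:k] - a * phi[k - 1, 1:k][::-1]
        var *= (1 - a ** 2)
        pac[k] = a                              # PACF at lag k
    return pac[1:]

# AR(1) series with coefficient 0.8
rng = np.random.default_rng(7)
e = rng.normal(size=5000)
x = np.zeros(5000)
for t in range(1, 5000):
    x[t] = 0.8 * x[t - 1] + e[t]
p = pacf(x, 5)
```

    Damage that changes the structure's dynamics shifts these coefficients, which is what makes vectors of PACCs usable as classification features.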

  18. Optimizing FRET-FLIM Labeling Conditions to Detect Nuclear Protein Interactions at Native Expression Levels in Living Arabidopsis Roots

    KAUST Repository

    Long, Yuchen

    2018-05-15

    Protein complex formation has been extensively studied using Förster resonance energy transfer (FRET) measured by fluorescence lifetime imaging microscopy (FLIM). However, applying this technology to detect protein interactions in living multicellular organisms at single-cell resolution and under native conditions is still difficult. Here we describe the optimization of labeling conditions to detect FRET-FLIM in living plants. This study exemplifies an optimization procedure involving identification of the optimal position for the labels, at either the N- or C-terminal region, and selection of bright and suitable fluorescent proteins as donor and acceptor labels for the FRET study. With an effective optimization strategy, we were able to detect the interaction between the stem cell regulators SHORT-ROOT and SCARECROW at endogenous expression levels in the root pole of living Arabidopsis embryos and in developing lateral roots by FRET-FLIM. Using this approach we show that the spatial profile of interaction between two transcription factors can be highly modulated in reoccurring and structurally resembling organs, thus providing new information on the dynamic redistribution of nuclear protein complex configurations at different developmental stages. In principle, our optimization procedure for transcription factor complexes is applicable to any biological system.

  19. Tracer applications in oil reservoirs in Brazil

    International Nuclear Information System (INIS)

    Moreira, R.M.; Ferreira Pinto, A.M.

    2004-01-01

    Radiotracer applications in oil reservoirs in Brazil started in 1997 at the request of the state oil company (Petrobras) at the Carmopolis oilfield. 1 Ci of HTO was injected in a regular five-spot plot and the results obtained were quite satisfactory. Shortly after this test, another request asked for distinguishing the contributions of different injection wells to a production well. It was then realized that other tracers should be made available. ³⁵SCN⁻ was selected as a first choice since it could be produced at CDTN, and an alternative synthesis path was defined which shortened post-irradiation manipulations. The tracer was tested in core samples, and a field injection, simultaneous with HTO, was carried out at the Buracica field; again the HTO performed well, but the ³⁵SCN⁻ arrived well ahead of it. Presently the HTO applications are done on a routine basis. All in all, four tests were performed (some are still ongoing), and the detection limits for both ³H and ³⁵S were optimized by refining the sample preparation stage. Lanthanide complexes used as activable tracers are also an appealing option; however, core tests performed so far with La-, Ce- and Eu-EDTA indicated some delay of the tracer, so other complexants such as DOTA are to be tried in further laboratory tests and in a field application. Thus, a deeper understanding of their complexation chemistry and carefully conducted tests must be performed before lanthanide complexes can be qualified as reliable oil reservoir tracers. More recently, Petrobras has been asking for partitioning tracers intended for SOR measurement.

  20. Harmonizing FDG PET quantification while maintaining optimal lesion detection: prospective multicentre validation in 517 oncology patients

    International Nuclear Information System (INIS)

    Quak, Elske; Le Roux, Pierre-Yves; Robin, Philippe; Bourhis, David; Salaun, Pierre-Yves; Hofman, Michael S.; Callahan, Jason; Binns, David; Hicks, Rodney J.; Desmonts, Cedric; Aide, Nicolas

    2015-01-01

    Point-spread function (PSF) or PSF + time-of-flight (TOF) reconstruction may improve lesion detection in oncologic PET, but can alter quantitation resulting in variable standardized uptake values (SUVs) between different PET systems. This study aims to validate a proprietary software tool (EQ.PET) to harmonize SUVs across different PET systems independent of the reconstruction algorithm used. NEMA NU2 phantom data were used to calculate the appropriate filter for each PSF or PSF+TOF reconstruction from three different PET systems, in order to obtain EANM compliant recovery coefficients. PET data from 517 oncology patients were reconstructed with a PSF or PSF+TOF reconstruction for optimal tumour detection and an ordered subset expectation maximization (OSEM3D) reconstruction known to fulfil EANM guidelines. Post-reconstruction, the proprietary filter was applied to the PSF or PSF+TOF data (PSF-EQ or PSF+TOF-EQ). SUVs for PSF or PSF+TOF and PSF-EQ or PSF+TOF-EQ were compared to SUVs for the OSEM3D reconstruction. The impact of potential confounders on the EQ.PET methodology including lesion and patient characteristics was studied, as was the adherence to imaging guidelines. For the 1380 tumour lesions studied, Bland-Altman analysis showed a mean ratio between PSF or PSF+TOF and OSEM3D of 1.46 (95% CI: 0.86-2.06) and 1.23 (95% CI: 0.95-1.51) for SUVmax and SUVpeak, respectively. Application of the proprietary filter improved these ratios to 1.02 (95% CI: 0.88-1.16) and 1.04 (95% CI: 0.92-1.17) for SUVmax and SUVpeak, respectively. The influence of the different confounding factors studied (lesion size, location, radial offset and patient's BMI) was less than 5%. Adherence to the European Association of Nuclear Medicine (EANM) guidelines for tumour imaging was good. These data indicate that it is not necessary to sacrifice the superior lesion detection and image quality achieved by newer reconstruction techniques in the quest for harmonizing quantitative

  1. Key seismic exploration technology for the Longwangmiao Fm gas reservoir in Gaoshiti–Moxi area, Sichuan Basin

    Directory of Open Access Journals (Sweden)

    Guangrong Zhang

    2016-10-01

    Full Text Available The dolomite reservoirs of the Lower Cambrian Longwangmiao Fm in the Gaoshiti–Moxi area, Sichuan Basin, are deeply buried (generally 4400–4900 m) and highly heterogeneous, making reservoir prediction difficult. In this regard, key seismic exploration technologies were developed through research. Firstly, through in-depth analysis of the existing geologic, drilling and seismic data and available research findings, the basic surface and subsurface structures and geologic conditions within the study area were clarified. Secondly, digital seismic data acquisition technologies with wide azimuth, wide frequency band and small bins were adopted to ensure even distribution of coverage of target formations through optimization of the 3D seismic geometry. In this way, high-accuracy 3D seismic data can be acquired through shallow, middle and deep formations. Thirdly, well-controlled seismic data processing technologies were applied to enhance the signal-to-noise ratio (SNR) of seismic data for deep formations. Fourthly, a seismic response model was established specifically for the Longwangmiao Fm reservoir, and quantitative prediction of the reservoir was performed through pre-stack geo-statistics; in this way, the planar distribution of reservoir thicknesses was mapped. Fifthly, core tests and logging data analysis were conducted to determine gas-sensitive elastic parameters, which were then used in pre-stack hydrocarbon detection to eliminate multiple solutions in seismic data interpretation. It is concluded that application of the above-mentioned key technologies effectively promoted the discovery of large-scale marine carbonate gas reservoirs of the Longwangmiao Fm.

  2. Integrated Optimization of Long-Range Underwater Signal Detection, Feature Extraction, and Classification for Nuclear Treaty Monitoring

    NARCIS (Netherlands)

    Tuma, M.; Rorbech, V.; Prior, M.; Igel, C.

    2016-01-01

    We designed and jointly optimized an integrated signal processing chain for detection and classification of long-range passive-acoustic underwater signals recorded by the global geophysical monitoring network of the Comprehensive Nuclear-Test-Ban Treaty Organization. Starting at the level of raw

  3. Identification of Swallowing Tasks from a Modified Barium Swallow Study That Optimize the Detection of Physiological Impairment

    Science.gov (United States)

    Hazelwood, R. Jordan; Armeson, Kent E.; Hill, Elizabeth G.; Bonilha, Heather Shaw; Martin-Harris, Bonnie

    2017-01-01

    Purpose: The purpose of this study was to identify which swallowing task(s) yielded the worst performance during a standardized modified barium swallow study (MBSS) in order to optimize the detection of swallowing impairment. Method: This secondary data analysis of adult MBSSs estimated the probability of each swallowing task yielding the derived…

  4. Transmission characteristics and optimal diagnostic samples to detect an FMDV infection in vaccinated and non-vaccinated sheep

    NARCIS (Netherlands)

    Eble, P.L.; Orsel, K.; Kluitenberg-van Hemert, F.; Dekker, A.

    2015-01-01

    We wanted to quantify transmission of FMDV Asia-1 in sheep and to evaluate which samples would be optimal for detection of an FMDV infection in sheep. For this, we used 6 groups of 4 non-vaccinated and 6 groups of 4 vaccinated sheep. In each group 2 sheep were inoculated and contact exposed to 2

  5. Improved detection of multiple environmental antibiotics through an optimized sample extraction strategy in liquid chromatography-mass spectrometry analysis.

    Science.gov (United States)

    Yi, Xinzhu; Bayen, Stéphane; Kelly, Barry C; Li, Xu; Zhou, Zhi

    2015-12-01

    A solid-phase extraction/liquid chromatography/electrospray ionization/multi-stage mass spectrometry (SPE-LC-ESI-MS/MS) method was optimized in this study for sensitive and simultaneous detection of multiple antibiotics in urban surface waters and soils. Among the seven classes of tested antibiotics, extraction efficiencies of macrolides, lincosamide, chloramphenicol, and polyether antibiotics were significantly improved under optimized sample extraction pH. Instead of only using acidic extraction as in many existing studies, the results indicated that antibiotics with low pKa values were extracted more efficiently under acidic conditions, while antibiotics with high pKa values (>7) were extracted more efficiently under neutral conditions. The effects of pH were more obvious on polar compounds than on non-polar compounds. Optimization of extraction pH resulted in significantly improved sample recovery and better detection limits. Compared with reported values in the literature, the average reduction of minimal detection limits obtained in this study was 87.6% in surface waters (0.06-2.28 ng/L) and 67.1% in soils (0.01-18.16 ng/g dry wt). This method was subsequently applied to detect antibiotics in environmental samples in a heavily populated urban city, and macrolides, sulfonamides, and lincomycin were frequently detected. The antibiotics with the highest detected concentrations were sulfamethazine (82.5 ng/L) in surface waters and erythromycin (6.6 ng/g dry wt) in soils. The optimized sample extraction strategy can be used to improve the detection of a variety of antibiotics in environmental surface waters and soils.

  6. Developing Novel Reservoir Rule Curves Using Seasonal Inflow Projections

    Science.gov (United States)

    Tseng, Hsin-yi; Tung, Ching-pin

    2015-04-01

    Due to significant seasonal rainfall variations, reservoirs and their flexible operational rules are indispensable to Taiwan. Furthermore, with the intensifying impacts of climate change on extreme climate, the frequency of droughts in Taiwan has been increasing in recent years. Drought is a creeping phenomenon; its slow onset makes it difficult to detect at an early stage and causes delays in making the best water-allocation decisions. For these reasons, novel reservoir rule curves using projected seasonal streamflow are proposed in this study, which can potentially reduce the adverse effects of drought. This study is dedicated to establishing new rule curves that consider both current available storage and anticipated monthly inflows with a lead time of two months, to reduce the risk of water shortage. The monthly inflows are projected based on the seasonal climate forecasts from the Central Weather Bureau (CWB), where a weather generation model is used to produce daily weather data for the hydrological component of GWLF. To incorporate future monthly inflow projections into the rule curves, this study designs a decision flow index that is a linear combination of current available storage and inflow projections with a lead time of 2 months. By optimizing the linear-combination coefficients of the decision flow index, the shape of the rule curves, and the percentage of water supplied in each zone, the best rule curves to decrease water shortage risk and impacts can be developed. The Shimen Reservoir in northern Taiwan is used as a case study to demonstrate the proposed method. The existing rule curves (M5 curves) of Shimen Reservoir are compared with two cases of new rule curves, based on hindcast simulations and historic seasonal forecasts. The results show that the new rule curves can decrease the total water shortage ratio and, in addition, can allocate the shortage amount to preceding months to avoid extreme shortage events. Even though some uncertainties in
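The decision flow index described above — a linear combination of current storage and projected inflows, mapped onto rule-curve zones that set the supply fraction — might look like the following sketch. The weights, zone thresholds, and supply ratios are illustrative assumptions, not values calibrated in the study:

```python
def decision_flow_index(storage, inflow_proj, weights):
    """Linear combination of current storage and projected monthly inflows
    (lead time = len(inflow_proj) months)."""
    w0, *wi = weights
    return w0 * storage + sum(w * q for w, q in zip(wi, inflow_proj))

def supply_ratio(dfi, zones):
    """zones: (lower_threshold, supply_fraction) pairs, sorted by descending
    threshold; return the supply fraction for the zone the index falls in."""
    for threshold, ratio in zones:
        if dfi >= threshold:
            return ratio
    return zones[-1][1]

# Hypothetical rule-curve zones and weights (units are arbitrary volumes)
zones = [(200.0, 1.0), (120.0, 0.8), (0.0, 0.5)]
dfi = decision_flow_index(storage=150.0, inflow_proj=[40.0, 30.0],
                          weights=(0.6, 0.3, 0.1))
ratio = supply_ratio(dfi, zones)
```

Optimizing the weights, thresholds, and ratios against historical shortage outcomes is what shapes the final rule curves.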

  7. Defect Detection of Adhesive Layer of Thermal Insulation Materials Based on Improved Particle Swarm Optimization of ECT.

    Science.gov (United States)

    Wen, Yintang; Jia, Yao; Zhang, Yuyan; Luo, Xiaoyuan; Wang, Hongrui

    2017-10-25

    This paper studies the defect detection problem for the adhesive layer of thermal insulation materials. A novel detection method based on an improved particle swarm optimization (PSO) algorithm for Electrical Capacitance Tomography (ECT) is presented. Firstly, a least-squares support vector machine is applied for data processing of the measured capacitance values. Then, the improved PSO algorithm is proposed and applied to image reconstruction. Finally, experiments are provided to verify the effectiveness of the proposed method in defect detection for the adhesive layer of thermal insulation materials. The performance comparisons demonstrate that the proposed method achieves higher precision than traditional ECT algorithms.
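A minimal particle swarm optimizer of the kind the paper builds on can be sketched as follows. The sphere function stands in for the ECT image-reconstruction residual (not shown in the abstract), and all parameter values are generic PSO defaults rather than the authors' improved variant:

```python
import random

def pso(f, dim, n_particles=30, iters=200, bounds=(-5.0, 5.0), seed=1):
    """Minimal PSO: inertia, cognitive, and social velocity terms."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal bests
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    w, c1, c2 = 0.7, 1.5, 1.5
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Sphere objective as a stand-in for the reconstruction residual
best, best_val = pso(lambda x: sum(v * v for v in x), dim=3)
```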

  8. Sediment management for reservoir

    International Nuclear Information System (INIS)

    Rahman, A.

    2005-01-01

    All natural lakes and reservoirs, whether on rivers, tributaries or off-channel storages, are doomed to silt up. Pakistan has two major reservoirs, Tarbela and Mangla, and a shallow lake created by Chashma Barrage. The Tarbela and Mangla lakes have been losing their capacities ever since first impounding, Tarbela since 1974 and Mangla since 1967. Tarbela Reservoir receives average annual flows of about 62 MAF and sediment deposits of 0.11 MAF, whereas Mangla gets about 23 MAF of average annual flows and is losing its storage at an average rate of 34,000 acre-feet annually. The loss of storage is a great concern; studies for Tarbela were carried out by TAMS and Wallingford to sustain its capacity, whereas no study has been done for Mangla as yet except as part of the study for Raised Mangla, which is only desk work. The delta of Tarbela Reservoir has advanced to about 6.59 miles (Pivot Point) from the power intakes. In case of liquefaction of the delta by a tremor with as low as 0.12 g peak ground acceleration, power tunnels 1, 2 and 3 will be blocked. The minimum pool of the reservoir is being raised so as to check the advance of the delta. The Mangla delta will follow the trend of Tarbela. Tarbela has a vast amount of data as the reservoir is surveyed every year, whereas the Mangla Reservoir survey was done at five-year intervals, which has now been proposed to be reduced to a three-year interval. In addition, suspended sediment sampling of inflow streams is being done by the Surface Water Hydrology Project of WAPDA, along with some bed-load sampling. The problem of Chashma Reservoir has also been highlighted, as it is indiscriminately filled up and drawn down several times a year without regard to its reaction to this treatment. Sediment management of these reservoirs is essential, and the paper discusses the pros and cons of various alternatives. (author)

  9. Optimal Detection Range of RFID Tag for RFID-based Positioning System Using the k-NN Algorithm

    Directory of Open Access Journals (Sweden)

    Joon Heo

    2009-06-01

    Full Text Available Positioning technology to track a moving object is an important and essential component of ubiquitous computing environments and applications. An RFID-based positioning system using the k-nearest neighbor (k-NN) algorithm can determine the position of a moving reader from observed reference data. In this study, the optimal detection range of an RFID-based positioning system was determined on the principle that tag spacing can be derived from the detection range. It was assumed that reference tags without signal-strength information are regularly distributed in 1-, 2- and 3-dimensional spaces. The optimal detection range was determined, through analytical and numerical approaches, to be 125% of the tag-spacing distance in 1-dimensional space. Through numerical approaches, the range was 134% in 2-dimensional space and 143% in 3-dimensional space.
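A 1-D sketch of the k-NN positioning idea: the reader's position is estimated as the centroid of the k nearest reference tags it detects. The detection range uses the study's 125%-of-tag-spacing result; the reader position and tag layout are invented for illustration:

```python
def knn_position(detected, k=3):
    """Estimate reader position as the centroid of the k nearest detected tags.
    detected: list of (tag_position, estimated_distance) pairs."""
    nearest = sorted(detected, key=lambda t: t[1])[:k]
    return sum(p for p, _ in nearest) / len(nearest)

# 1-D example: tags every 1.0 m along a corridor, reader at 2.1 m
tag_spacing = 1.0
detection_range = 1.25 * tag_spacing   # study's optimal range in 1-D
reader = 2.1
tags = [float(x) for x in range(6)]    # tags at 0.0 .. 5.0 m
detected = [(p, abs(p - reader)) for p in tags if abs(p - reader) <= detection_range]
est = knn_position(detected, k=3)
```

With this range the reader sees exactly the three nearest tags, and the centroid lands close to the true position.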

  10. SEISMIC ATTENUATION FOR RESERVOIR CHARACTERIZATION

    Energy Technology Data Exchange (ETDEWEB)

    Joel Walls; M.T. Taner; Naum Derzhi; Gary Mavko; Jack Dvorkin

    2003-12-01

    We have developed and tested technology for a new type of direct hydrocarbon detection. The method uses inelastic rock properties to greatly enhance the sensitivity of surface seismic methods to the presence of oil and gas saturation. These methods include use of energy absorption, dispersion, and attenuation (Q) along with traditional seismic attributes like velocity, impedance, and AVO. Our approach is to combine three elements: (1) a synthesis of the latest rock physics understanding of how rock inelasticity is related to rock type, pore fluid types, and pore microstructure, (2) synthetic seismic modeling that will help identify the relative contributions of scattering and intrinsic inelasticity to apparent Q attributes, and (3) robust algorithms that extract relative wave attenuation attributes from seismic data. This project provides: (1) Additional petrophysical insight from acquired data; (2) Increased understanding of rock and fluid properties; (3) New techniques to measure reservoir properties that are not currently available; and (4) Provide tools to more accurately describe the reservoir and predict oil location and volumes. These methodologies will improve the industry's ability to predict and quantify oil and gas saturation distribution, and to apply this information through geologic models to enhance reservoir simulation. We have applied for two separate patents relating to work that was completed as part of this project.

  11. Simulation of California's Major Reservoirs Outflow Using Data Mining Technique

    Science.gov (United States)

    Yang, T.; Gao, X.; Sorooshian, S.

    2014-12-01

    A reservoir's outflow, unlike its upstream inflow, is controlled by reservoir operators, and for downstream water users the outflow matters more than the inflow. In order to simulate complicated reservoir operations and extract the outflow decision-making patterns for California's 12 major reservoirs, we built a data-driven, computer-based ("artificial intelligence") reservoir decision-making tool, using a decision regression and classification tree approach. This is a well-developed statistical and graphical modeling methodology in the field of data mining. A shuffled cross-validation approach is also employed to extract the outflow decision-making patterns and rules based on the selected decision variables (inflow amount, precipitation, timing, water year type, etc.). To show the accuracy of the model, a verification study was carried out comparing the model-generated outflow decisions ("artificial intelligence" decisions) with those made by reservoir operators (human decisions). The simulation results show that the machine-generated outflow decisions are very similar to the real reservoir operators' decisions. This conclusion is based on statistical evaluations using the Nash-Sutcliffe test. The proposed model is able to detect the most influential variables and their weights when the reservoir operators make an outflow decision. While the proposed approach was first applied and tested on California's 12 major reservoirs, the method is universally adaptable to other reservoir systems.
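The Nash-Sutcliffe efficiency used for the statistical evaluation compares model decisions against observed operator decisions; a value of 1.0 is a perfect match, and values near zero or below indicate the model is no better than the mean of the observations. The outflow series below are hypothetical:

```python
def nash_sutcliffe(observed, simulated):
    """NSE = 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

# Hypothetical monthly outflows: operator decisions vs. model decisions
operator = [120.0, 95.0, 80.0, 150.0, 200.0, 170.0]
model = [115.0, 100.0, 85.0, 145.0, 190.0, 175.0]
nse = nash_sutcliffe(operator, model)
```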

  12. Optimising reservoir operation

    DEFF Research Database (Denmark)

    Ngo, Long le

    The application of optimization techniques to reservoir operation has become an essential element of water resources planning and management. Traditionally, reservoirs have been operated using heuristic procedures for water release, supplemented to some extent by subjective decisions. Reservoir operation involves a wide range of stakeholders with very different objectives (e.g. irrigation, hydropower, water supply), and optimization techniques are far better suited to finding balanced solutions to the often conflicting interests. The thesis proposes a number of measures by which traditional...

  13. Optimal search strategies for detecting cost and economic studies in EMBASE

    Directory of Open Access Journals (Sweden)

    Haynes R Brian

    2006-06-01

    Full Text Available Abstract Background Economic evaluations in the medical literature compare competing diagnosis or treatment methods for their use of resources and their expected outcomes. The best evidence currently available from research regarding both cost and economic comparisons will continue to expand as this type of information becomes more important in today's clinical practice. Researchers and clinicians need quick, reliable ways to access this information. A key source of this type of information is large bibliographic databases such as EMBASE. The objective of this study was to develop search strategies that optimize the retrieval of health cost and economics studies from EMBASE. Methods We conducted an analytic survey, comparing hand searches of journals with retrievals from EMBASE for candidate search terms and combinations. Six research assistants read all issues of 55 journals indexed by EMBASE for the publishing year 2000. We rated all articles using purpose and quality indicators and categorized them into clinically relevant original studies, review articles, general papers, or case reports. The original and review articles were then categorized for purpose (i.e., cost and economics, or other clinical topics) and, depending on the purpose, as 'pass' or 'fail' for methodologic rigor. Candidate search strategies were developed for economic and cost studies, then run against the 55 EMBASE journals, with the retrievals being compared with the hand-search data. The sensitivity, specificity, precision, and accuracy of the search strategies were calculated. Results Combinations of search terms for detecting both cost and economic studies attained levels of 100% sensitivity, with specificity levels of 92.9% and 92.3%, respectively. When maximizing for both sensitivity and specificity, the combination of terms for detecting cost studies increased sensitivity by 2.2% over the single term, but at a slight decrease in specificity of 0.9%.
The maximized combination of terms
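The retrieval metrics reported above all follow from a 2×2 confusion matrix of search-strategy hits against the hand-search gold standard. The counts below are invented for illustration, not the study's data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard retrieval metrics for a search strategy vs. a gold standard."""
    return {
        "sensitivity": tp / (tp + fn),          # fraction of relevant studies found
        "specificity": tn / (tn + fp),          # fraction of irrelevant articles excluded
        "precision": tp / (tp + fp),            # fraction of retrievals that are relevant
        "accuracy": (tp + tn) / (tp + fp + fn + tn),
    }

# Hypothetical counts: 40 relevant cost/economic studies among 1000 indexed articles
m = diagnostic_metrics(tp=40, fp=68, fn=0, tn=892)
```

A strategy tuned for 100% sensitivity, as in this sketch, typically trades away some precision: every relevant study is retrieved, but many retrievals are false positives.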

  14. Quantum separability and entanglement detection via entanglement-witness search and global optimization

    International Nuclear Information System (INIS)

    Ioannou, Lawrence M.; Travaglione, Benjamin C.

    2006-01-01

    We focus on determining the separability of an unknown bipartite quantum state ρ by invoking a sufficiently large subset of all possible entanglement witnesses given the expected value of each element of a set of mutually orthogonal observables. We review the concept of an entanglement witness from the geometrical point of view and use this geometry to show that the set of separable states is not a polytope and to characterize the class of entanglement witnesses (observables) that detect entangled states on opposite sides of the set of separable states. All this serves to motivate a classical algorithm which, given the expected values of a subset of an orthogonal basis of observables of an otherwise unknown quantum state, searches for an entanglement witness in the span of the subset of observables. The idea of such an algorithm, which is an efficient reduction of the quantum separability problem to a global optimization problem, was introduced by [Ioannou et al., Phys. Rev. A 70, 060303(R)], where it was shown to be an improvement on the naive approach for the quantum separability problem (exhaustive search for a decomposition of the given state into a convex combination of separable states). The last section of the paper discusses in more generality such algorithms, which, in our case, assume a subroutine that computes the global maximum of a real function of several variables. Despite this, we anticipate that such algorithms will perform sufficiently well on small instances that they will render a feasible test for separability in some cases of interest (e.g., in 3x3 dimensional systems)

  15. A New Method for Fracturing Wells Reservoir Evaluation in Fractured Gas Reservoir

    Directory of Open Access Journals (Sweden)

    Jianchun Guo

    2014-01-01

    Full Text Available Natural fractures are a geological phenomenon widely distributed in tight formations, and the stimulation effect in fractured gas reservoirs mainly depends on the communication of natural fractures. It is therefore necessary to evaluate such reservoirs and identify the wells with optimally developed natural fractures. By analyzing the interactions and nonlinear relationships among the parameters, this study establishes a three-level index system of reservoir evaluation and proposes a new gas-well reservoir evaluation model for fractured gas reservoirs on the basis of fuzzy logic theory and multilevel gray correlation. The method uses Gaussian membership functions to quantify the degree of every factor in the decision-making system and multilevel gray relations to determine the weight of each parameter on the stimulation effect. Finally, through a fuzzy arithmetic operator applied between the multilevel weights and the fuzzy evaluation matrix, the score, rank, reservoir quality, and predicted production are obtained. Results of this new method show that the production coincidence rate of the evaluation reaches 80%, which provides a new way for fractured gas reservoir evaluation.
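The Gaussian membership and weighted-scoring steps can be sketched as follows. The factor names, centers, sigmas, and weights are illustrative assumptions, not the paper's calibrated values:

```python
import math

def gaussian_membership(x, center, sigma):
    """Degree of membership in [0, 1] for a normalized factor value x."""
    return math.exp(-((x - center) ** 2) / (2.0 * sigma ** 2))

def weighted_score(memberships, weights):
    """Fuzzy arithmetic operator: weighted sum of membership degrees."""
    return sum(w * m for w, m in zip(weights, memberships))

# Hypothetical factors: fracture density, porosity, gas saturation (normalized to [0, 1])
values = [0.8, 0.55, 0.7]
centers = [1.0, 1.0, 1.0]    # "ideal" value of each factor
sigmas = [0.3, 0.3, 0.3]
weights = [0.5, 0.2, 0.3]    # e.g. from a gray relational analysis
ms = [gaussian_membership(v, c, s) for v, c, s in zip(values, centers, sigmas)]
score = weighted_score(ms, weights)
```

Wells would then be ranked by this score to pick stimulation candidates.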

  16. Optimization and evaluation of a method to detect adenoviruses in river water

    Data.gov (United States)

    U.S. Environmental Protection Agency — This dataset includes the recoveries of spiked adenovirus through various stages of experimental optimization procedures. This dataset is associated with the...

  17. Mapping of Reservoir Properties and Facies Through Integration of Static and Dynamic Data

    Energy Technology Data Exchange (ETDEWEB)

    Reynolds, Albert C.; Oliver, Dean S.; Zhang, Fengjun; Dong, Yannong; Skjervheim, Jan Arild; Liu, Ning

    2003-03-10

    The goal of this project was to develop computationally efficient automatic history matching techniques for generating geologically plausible reservoir models which honor both static and dynamic data. Solution of this problem was necessary for the quantification of uncertainty in future reservoir performance predictions and for the optimization of reservoir management.

  18. Mapping of Reservoir Properties and Facies Through Integration of Static and Dynamic Data

    Energy Technology Data Exchange (ETDEWEB)

    Oliver, Dean S.; Reynolds, Albert C.; Zhang, Fengjun; Li, Ruijian; Abacioglu, Yafes; Dong, Yannong

    2002-03-05

    The goal of this project was to develop computationally efficient automatic history matching techniques for generating geologically plausible reservoir models which honor both static and dynamic data. Solution of this problem is necessary for the quantification of uncertainty in future reservoir performance predictions and for the optimization of reservoir management.

  19. Reservoir characterization of Pennsylvanian sandstone reservoirs. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Kelkar, M.

    1995-02-01

    This final report summarizes the progress during the three years of a project on Reservoir Characterization of Pennsylvanian Sandstone Reservoirs. The report is divided into three sections: (i) reservoir description; (ii) scale-up procedures; (iii) outcrop investigation. The first section describes the methods by which a reservoir can be described in three dimensions. The next step in reservoir description is to scale up reservoir properties for flow simulation. The second section addresses the issue of scale-up of reservoir properties once the spatial descriptions of properties are created. The last section describes the investigation of an outcrop.

  20. Optimization of Scat Detection Methods for a Social Ungulate, the Wild Pig, and Experimental Evaluation of Factors Affecting Detection of Scat.

    Directory of Open Access Journals (Sweden)

    David A Keiter

    Full Text Available Collection of scat samples is common in wildlife research, particularly for genetic capture-mark-recapture applications. Due to high degradation rates of genetic material in scat, large numbers of samples must be collected to generate robust estimates. Optimization of sampling approaches to account for taxa-specific patterns of scat deposition is, therefore, necessary to ensure sufficient sample collection. While scat collection methods have been widely studied in carnivores, research to maximize scat collection and noninvasive sampling efficiency for social ungulates is lacking. Further, environmental factors or scat morphology may influence detection of scat by observers. We contrasted performance of novel radial search protocols with existing adaptive cluster sampling protocols to quantify differences in observed amounts of wild pig (Sus scrofa scat. We also evaluated the effects of environmental (percentage of vegetative ground cover and occurrence of rain immediately prior to sampling and scat characteristics (fecal pellet size and number on the detectability of scat by observers. We found that 15- and 20-m radial search protocols resulted in greater numbers of scats encountered than the previously used adaptive cluster sampling approach across habitat types, and that fecal pellet size, number of fecal pellets, percent vegetative ground cover, and recent rain events were significant predictors of scat detection. Our results suggest that use of a fixed-width radial search protocol may increase the number of scats detected for wild pigs, or other social ungulates, allowing more robust estimation of population metrics using noninvasive genetic sampling methods. Further, as fecal pellet size affected scat detection, juvenile or smaller-sized animals may be less detectable than adult or large animals, which could introduce bias into abundance estimates. Knowledge of relationships between environmental variables and scat detection may allow

  1. Value assessment for reservoir recovery optimization

    International Nuclear Information System (INIS)

    Saito, R.; De Castro, G.N.; Mezzomo, C.; Schiozer, D.J.

    2001-01-01

    This paper analyzes the managerial flexibility embedded in oil and gas exploration and production. The analysis includes the economic impact of using different production techniques on the valuation of oil reserves. Two methodologies are used to evaluate the simulation of engineering techniques: (1) the real option approach; and (2) the discounted cash flow (DCF) method. Given the external variables (e.g., oil price, interest rate), this paper evaluates the best engineering technique for oil recovery by using a valuation approach. We conclude that by appropriately combining different production techniques, the value of oil reserves can increase under the real option approach and can be higher than the value assessed under the DCF method. Since oil recovery includes many managerial choices, we argue that the real option approach is more appropriate than the DCF method. The paper concludes that concession time and dividend yield are the most sensitive parameters for the valuation of oil reserves

  2. Value assessment for reservoir recovery optimization

    Energy Technology Data Exchange (ETDEWEB)

    Saito, R.; De Castro, G.N. [EAESP/FGV, Av. Nove de Julho, 2029-10 andar, 01313-902, SP Sao Paulo (Brazil); Mezzomo, C.; Schiozer, D.J. [Fundacao Getulio Vargas, Avenida Nove de Julho, 2029, 10th floor, 01313-902, SP Sao Paulo (Brazil)

    2001-12-29

    This paper analyzes the managerial flexibility embedded in oil and gas exploration and production. The analysis includes the economic impact of using different production techniques on the valuation of oil reserves. Two methodologies are used to evaluate the simulation of engineering techniques: (1) the real option approach; and (2) the discounted cash flow (DCF) method. Given the external variables (e.g., oil price, interest rate), this paper evaluates the best engineering technique for oil recovery by using a valuation approach. We conclude that by appropriately combining different production techniques, the value of oil reserves can increase under the real option approach and can be higher than the value assessed under the DCF method. Since oil recovery includes many managerial choices, we argue that the real option approach is more appropriate than the DCF method. The paper concludes that concession time and dividend yield are the most sensitive parameters for the valuation of oil reserves.

  3. Collimator optimization in myocardial perfusion SPECT using the ideal observer and realistic background variability for lesion detection and joint detection and localization tasks

    Science.gov (United States)

    Ghaly, Michael; Du, Yong; Links, Jonathan M.; Frey, Eric C.

    2016-03-01

    In SPECT imaging, collimators are a major factor limiting image quality and largely determine the noise and resolution of SPECT images. In this paper, we seek the collimator with the optimal tradeoff between image noise and resolution with respect to performance on two tasks related to myocardial perfusion SPECT: perfusion defect detection and joint detection and localization. We used the Ideal Observer (IO) operating on realistic background-known-statistically (BKS) and signal-known-exactly (SKE) data. The areas under the receiver operating characteristic (ROC) and localization ROC (LROC) curves (AUCd, AUCd+l), respectively, were used as the figures of merit for both tasks. We used a previously developed population of 54 phantoms based on the eXtended Cardiac Torso Phantom (XCAT) that included variations in gender, body size, heart size and subcutaneous adipose tissue level. For each phantom, organ uptakes were varied randomly based on distributions observed in patient data. We simulated perfusion defects at six different locations with extents and severities of 10% and 25%, respectively, which represented challenging but clinically relevant defects. The extent and severity are, respectively, the perfusion defect’s fraction of the myocardial volume and reduction of uptake relative to the normal myocardium. Projection data were generated using an analytical projector that modeled attenuation, scatter, and collimator-detector response effects, a 9% energy resolution at 140 keV, and a 4 mm full-width at half maximum (FWHM) intrinsic spatial resolution. We investigated a family of eight parallel-hole collimators that spanned a large range of sensitivity-resolution tradeoffs. For each collimator and defect location, the IO test statistics were computed using a Markov Chain Monte Carlo (MCMC) method for an ensemble of 540 pairs of defect-present and -absent images that included the aforementioned anatomical and uptake variability. Sets of test statistics were
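The area under the ROC curve for a set of Ideal Observer test statistics can be estimated with the Mann-Whitney form: the probability that a defect-present statistic exceeds a defect-absent one. The test-statistic values below are invented for illustration, not outputs of the MCMC procedure:

```python
def auc_from_scores(present, absent):
    """Empirical AUC: fraction of (present, absent) pairs where the
    defect-present statistic wins (ties count 1/2) - the Mann-Whitney U estimator."""
    wins = sum((p > a) + 0.5 * (p == a) for p in present for a in absent)
    return wins / (len(present) * len(absent))

# Hypothetical IO test statistics for one collimator
present = [2.1, 3.4, 1.9, 2.8, 3.0]
absent = [1.0, 1.8, 2.2, 0.7, 1.5]
auc = auc_from_scores(present, absent)
```

Comparing this AUC across the eight collimators is what identifies the optimal sensitivity-resolution tradeoff.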

  4. Improved characterization of reservoir behavior by integration of reservoir performances data and rock type distributions

    Energy Technology Data Exchange (ETDEWEB)

    Davies, D.K.; Vessell, R.K. [David K. Davies & Associates, Kingwood, TX (United States)]; Doublet, L.E. [Texas A&M Univ., College Station, TX (United States)]; and others

    1997-08-01

    An integrated geological/petrophysical and reservoir engineering study was performed for a large, mature waterflood project (>250 wells, ~80% water cut) at the North Robertson (Clear Fork) Unit, Gaines County, Texas. The primary goal of the study was to develop an integrated reservoir description for "targeted" (economic) 10-acre (4-hectare) infill drilling and future recovery operations in a low-permeability, carbonate (dolomite) reservoir. Integration of the results from geological/petrophysical studies and reservoir performance analyses provides a rapid and effective method for developing a comprehensive reservoir description. This reservoir description can be used for reservoir flow simulation, performance prediction, infill targeting, waterflood management, and for optimizing well developments (patterns, completions, and stimulations). The following analyses were performed as part of this study: (1) Geological/petrophysical analyses (core and well log data): "rock typing" based on qualitative and quantitative visualization of pore-scale features; reservoir layering based on "rock typing" and hydraulic flow units; development of a "core-log" model to estimate permeability using porosity and other properties derived from well logs (the core-log model is based on "rock types"). (2) Engineering analyses (production and injection history, well tests): material balance decline type curve analyses to estimate total reservoir volume, formation flow characteristics (flow capacity, skin factor, and fracture half-length), and indications of well/boundary interference; estimated ultimate recovery analyses to yield movable oil (or injectable water) volumes, as well as indications of well and boundary interference.

  5. An optimized staining technique for the detection of Gram positive and Gram negative bacteria within tissue.

    Science.gov (United States)

    Becerra, Sandra C; Roy, Daniel C; Sanchez, Carlos J; Christy, Robert J; Burmeister, David M

    2016-04-12

    Bacterial infections are a common clinical problem in both acute and chronic wounds. With growing concerns over antibiotic resistance, treatment of bacterial infections should only occur after positive diagnosis. Currently, diagnosis is delayed due to lengthy culturing methods which may also fail to identify the presence of bacteria. While newer costly bacterial identification methods are being explored, a simple and inexpensive diagnostic tool would aid in immediate and accurate treatments for bacterial infections. Histologically, hematoxylin and eosin (H&E) and Gram stains have been employed, but are far from optimal when analyzing tissue samples due to non-specific staining. The goal of the current study was to develop a modification of the Gram stain that enhances the contrast between bacteria and host tissue. A modified Gram stain was developed and tested as an alternative to Gram stain that improves the contrast between Gram positive bacteria, Gram negative bacteria and host tissue. Initially, clinically relevant strains of Pseudomonas aeruginosa and Staphylococcus aureus were visualized in vitro and in biopsies of infected, porcine burns using routine Gram stain, and immunohistochemistry techniques involving bacterial strain-specific fluorescent antibodies as validation tools. H&E and Gram stain of serial biopsy sections were then compared to a modification of the Gram stain incorporating a counterstain that highlights collagen found in tissue. The modified Gram stain clearly identified both Gram positive and Gram negative bacteria, and when compared to H&E or Gram stain alone provided excellent contrast between bacteria and non-viable burn eschar. Moreover, when applied to surgical biopsies from patients that underwent burn debridement this technique was able to clearly detect bacterial morphology within host tissue. We describe a modification of the Gram stain that provides improved contrast of Gram positive and Gram negative microorganisms within host

  6. Missed, Misused, or Mismanaged: Improving Early Detection Systems to Optimize Child Outcomes

    Science.gov (United States)

    Macy, Marisa; Marks, Kevin; Towle, Alexander

    2014-01-01

    Early detection efforts have been shown to vary greatly in practice, and there is a general lack of systematic accountability built into monitoring early detection effort impact. This article reviews current early detection practices and the drawbacks of these practices, with particular attention given to prevalent issues of mismeasurement,…

  7. Optimal Matched Filter in the Low-number Count Poisson Noise Regime and Implications for X-Ray Source Detection

    Science.gov (United States)

    Ofek, Eran O.; Zackay, Barak

    2018-04-01

    Detection of templates (e.g., sources) embedded in low-number count Poisson noise is a common problem in astrophysics. Examples include source detection in X-ray, γ-ray, UV, and neutrino images, as well as searches for clusters of galaxies and stellar streams. However, the solutions in the X-ray-related literature are sub-optimal, in some cases by considerable factors. Using the Neyman–Pearson lemma, we derive the optimal statistic for template detection in the presence of Poisson noise. We demonstrate that, for a known template shape (e.g., point sources), this method provides higher completeness, at a fixed false-alarm probability, compared with filtering the image with the point-spread function (PSF). In turn, we find that filtering by the PSF is better than filtering the image with a Mexican-hat wavelet (as used by wavdetect). For some background levels, our method improves the sensitivity of source detection by more than a factor of two over the popular Mexican-hat wavelet filtering. This filtering technique can also be used for fast PSF photometry and flare detection; it is efficient and straightforward to implement. We provide an implementation in MATLAB. The development of a complete code that works on real data, including the complexities of background subtraction and PSF variations, is deferred to a future publication.
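    For a known template on a flat background, the Neyman–Pearson statistic described above has a simple form: cross-correlate the observed counts with ln(1 + s/b) rather than with the PSF itself. A minimal sketch of that idea (the flux, background level, and Gaussian PSF below are illustrative assumptions, not the authors' MATLAB implementation):

```python
import numpy as np
from scipy.signal import fftconvolve

def gaussian_psf(size=15, sigma=2.0):
    """Normalized 2-D Gaussian template standing in for the PSF."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    psf = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return psf / psf.sum()

def poisson_matched_filter(counts, template, background):
    """Neyman-Pearson detection statistic for Poisson noise:
    cross-correlate the counts with ln(1 + s/b)."""
    kernel = np.log1p(template / background)
    # correlation = convolution with a flipped kernel
    return fftconvolve(counts, kernel[::-1, ::-1], mode="same")

rng = np.random.default_rng(1)
b = 5.0                                     # flat background, counts/pixel
img = rng.poisson(b, size=(64, 64)).astype(float)
psf = gaussian_psf()
img[25:40, 25:40] += rng.poisson(300.0 * psf)   # inject a bright point source at (32, 32)
stat = poisson_matched_filter(img, psf, b)
peak = np.unravel_index(np.argmax(stat), stat.shape)
```

    With the injected source this bright, the statistic map peaks at (or within a pixel or two of) the true source position.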

  8. Performance evaluation and optimization of multiband phase-modulated radio over IsOWC link with balanced coherent homodyne detection

    Science.gov (United States)

    Zong, Kang; Zhu, Jiang

    2018-04-01

    In this paper, we present a multiband phase-modulated (PM) radio over intersatellite optical wireless communication (IsOWC) link with balanced coherent homodyne detection. The proposed system can provide transparent transport of multiband radio frequency (RF) signals with higher linearity and better receiver sensitivity than intensity-modulated direct-detection (IM/DD) systems. The expressions for RF gain, noise figure (NF) and third-order spurious-free dynamic range (SFDR) are derived considering the third-order intermodulation product and amplified spontaneous emission (ASE) noise. The optimal power of the local oscillator (LO) optical signal is also derived theoretically. Numerical results for RF gain, NF and third-order SFDR are given for demonstration. Results indicate that the gain of the optical preamplifier and the power of the LO optical signal should be optimized to obtain satisfactory performance.

  9. Tracing fluid flow in geothermal reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Rose, P.E.; Adams, M.C. [Univ. of Utah, Salt Lake City, UT (United States)

    1997-12-31

    A family of fluorescent compounds, the polycyclic aromatic sulfonates, was evaluated for application in intermediate- and high-temperature geothermal reservoirs. Whereas the naphthalene sulfonates were found to be very thermally stable and reasonably detectable, the amino-substituted naphthalene sulfonates were found to be somewhat less thermally stable, but much more detectable. A tracer test was conducted at the Dixie Valley, Nevada, geothermal reservoir using one of the substituted naphthalene sulfonates, amino G, and fluorescein. Four of nine production wells showed tracer breakthrough during the first 200 days of the test. Reconstructed tracer return curves are presented that correct for the thermal decay of tracer assuming an average reservoir temperature of 227°C. In order to examine the feasibility of using numerical simulation to model tracer flow, we developed simple, two-dimensional models of the geothermal reservoir using the numerical simulation programs TETRAD and TOUGH2. By fitting model outputs to measured return curves, we show that numerical reservoir simulations can be calibrated with the tracer data. Both models predict the same order of elution, approximate tracer concentrations, and return curve shapes. Using these results, we propose a method for using numerical models to design a tracer test.
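    The decay correction mentioned above amounts to multiplying each measured concentration by exp(k·t), where k is the tracer's first-order thermal-decay rate at reservoir temperature. A sketch of that correction (the rate constant below is an illustrative assumption, not a measured value for amino G at 227°C):

```python
import math

def decay_corrected(concentration, k_per_day, t_days):
    """Reconstruct the tracer concentration that would be observed
    without first-order thermal decay: C0 = C_measured * exp(k * t)."""
    return concentration * math.exp(k_per_day * t_days)

# e.g. a sample taken 100 days into the test, with an assumed
# k = 0.01 / day, is corrected upward by a factor of e ~ 2.718
factor = decay_corrected(1.0, 0.01, 100.0)
```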

  10. Characterization of oil and gas reservoir heterogeneity

    Energy Technology Data Exchange (ETDEWEB)

    Tyler, N.; Barton, M.D.; Bebout, D.G.; Fisher, R.S.; Grigsby, J.D.; Guevara, E.; Holtz, M.; Kerans, C.; Nance, H.S.; Levey, R.A.

    1992-10-01

    Research described in this report addresses the internal architecture of two specific reservoir types: restricted-platform carbonates and fluvial-deltaic sandstones. Together, these two reservoir types contain more than two-thirds of the unrecovered mobile oil remaining in Texas. The approach followed in this study was to develop a strong understanding of the styles of heterogeneity of these reservoir types based on detailed outcrop description and a translation of these findings into optimized recovery strategies in select subsurface analogs. Research targeted Grayburg Formation restricted-platform carbonate outcrops along the Algerita Escarpment and in Stone Canyon in southeastern New Mexico and Ferron deltaic sandstones in central Utah as analogs for the North Foster (Grayburg) and Lake Creek (Wilcox) units, respectively. In both settings, sequence-stratigraphic style profoundly influenced between-well architectural fabric and permeability structure. It is concluded that reservoirs of different depositional origins can therefore be categorized into a "heterogeneity matrix" based on varying intensity of vertical and lateral heterogeneity. The utility of the matrix is that it allows prediction of the nature and location of remaining mobile oil. Highly stratified reservoirs such as the Grayburg, for example, will contain a large proportion of vertically bypassed oil; thus, an appropriate recovery strategy will be waterflood optimization and profile modification. Laterally heterogeneous reservoirs such as deltaic distributary systems would benefit from targeted infill drilling (possibly with horizontal wells) and improved areal sweep efficiency. The potential for advanced recovery of remaining mobile oil through heterogeneity-based advanced secondary recovery strategies in Texas is projected to be an incremental 16 Bbbl. In the Lower 48 States this target may be as much as 45 Bbbl at low to moderate oil prices over the near- to mid-term.

  11. Detection of Giardia intestinalis in water samples collected from natural water reservoirs and wells in northern and north-eastern Poland using LAMP, real-time PCR and nested PCR.

    Science.gov (United States)

    Lass, Anna; Szostakowska, Beata; Korzeniewski, Krzysztof; Karanis, Panagiotis

    2017-10-01

    Giardia intestinalis is a protozoan parasite, transmitted to humans and animals by the faecal-oral route, mainly through contaminated water and food. Knowledge about the distribution of this parasite in surface water in Poland is fragmentary and incomplete. Accordingly, 36 environmental water samples were collected from surface water reservoirs and wells in the Pomerania and Warmia-Masuria provinces, Poland. The 50 L samples were filtered and subsequently analysed with three molecular detection methods: loop-mediated isothermal amplification (LAMP), real-time polymerase chain reaction (real-time PCR) and nested PCR. Of the samples examined, Giardia DNA was found in 15 (42%) with the use of LAMP; in 12 (33%) of these samples, DNA from this parasite was also detected using real-time PCR; and in 9 (25%) using nested PCR. Sequencing of selected positive samples confirmed that the PCR products were fragments of the Giardia intestinalis small subunit rRNA gene. Genotyping using multiplex real-time PCR indicated the presence of assemblages A and B, with the latter predominating. The results indicate that surface water in Poland, as well as water taken from surface wells, may be a source of Giardia strains which are potentially pathogenic for humans. It was also demonstrated that the LAMP assay is more sensitive than the other two molecular assays.

  12. Optimized Pan-species and speciation duplex real-time PCR assays for Plasmodium parasites detection in malaria vectors.

    Directory of Open Access Journals (Sweden)

    Maurice Marcel Sandeu

    Full Text Available BACKGROUND: An accurate method for detecting malaria parasites in the mosquito vector remains an essential component of vector control. The enzyme-linked immunosorbent assay specific for circumsporozoite protein (ELISA-CSP) is the gold standard method for the detection of malaria parasites in the vector, even if it presents some limitations. Here, we optimized multiplex real-time PCR assays to accurately detect minor populations in mixed infections with multiple Plasmodium species in the African malaria vectors Anopheles gambiae and Anopheles funestus. METHODS: Complementary TaqMan-based real-time PCR assays that detect Plasmodium species using specific primers and probes were first evaluated on artificial mixtures of different targets inserted in plasmid constructs. The assays were further validated, in comparison with ELISA-CSP, on 200 field-caught Anopheles gambiae and Anopheles funestus mosquitoes collected in two localities in southern Benin. RESULTS: The validation of the duplex real-time PCR assays on the plasmid mixtures demonstrated robust specificity and sensitivity for detecting distinct targets. Using a panel of mosquito specimens, the real-time PCR showed a relatively high sensitivity (88.6%) and specificity (98%) compared to ELISA-CSP as the referent standard. The agreement between both methods was "excellent" (κ=0.8, P<0.05). The relative quantification of Plasmodium DNA between the two Anopheles species analyzed showed no significant difference (P=0.2). All infected mosquito samples contained Plasmodium falciparum DNA, and mixed infections with P. malariae and/or P. ovale were observed in 18.6% and 13.6% of An. gambiae and An. funestus, respectively. Plasmodium vivax was found in none of the mosquito samples analyzed. CONCLUSION: This study presents an optimized method for detecting the four Plasmodium species in the African malaria vectors. The study highlights substantial discordance with traditional ELISA-CSP pointing out the
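    Agreement figures like the sensitivity, specificity, and Cohen's κ reported above come from a 2×2 table comparing the PCR calls against the ELISA-CSP reference. A sketch of that computation (the cell counts below are illustrative, chosen only to land near the reported percentages; they are not the study's actual table):

```python
def diagnostic_agreement(tp, fp, fn, tn):
    """Sensitivity, specificity and Cohen's kappa from a 2x2
    confusion table against a reference standard."""
    n = tp + fp + fn + tn
    sens = tp / (tp + fn)                    # true-positive rate
    spec = tn / (tn + fp)                    # true-negative rate
    po = (tp + tn) / n                       # observed agreement
    # chance agreement from the marginal totals of both methods
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n**2
    kappa = (po - pe) / (1 - pe)
    return sens, spec, kappa

sens, spec, kappa = diagnostic_agreement(tp=39, fp=3, fn=5, tn=153)
```

    With these hypothetical counts, sensitivity is 39/44 ≈ 88.6% and specificity 153/156 ≈ 98.1%, with κ in the "excellent" range.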

  13. A new framework for analysing automated acoustic species-detection data: occupancy estimation and optimization of recordings post-processing

    Science.gov (United States)

    Chambert, Thierry A.; Waddle, J. Hardin; Miller, David A.W.; Walls, Susan; Nichols, James D.

    2018-01-01

    The development and use of automated species-detection technologies, such as acoustic recorders, for monitoring wildlife are rapidly expanding. Automated classification algorithms provide a cost- and time-effective means to process information-rich data, but often at the cost of additional detection errors. Appropriate methods are necessary to analyse such data while dealing with the different types of detection errors. We developed a hierarchical modelling framework for estimating species occupancy from automated species-detection data. We explore design and optimization of data post-processing procedures to account for detection errors and generate accurate estimates. Our proposed method accounts for both imperfect detection and false positive errors and utilizes information about both occurrence and abundance of detections to improve estimation. Using simulations, we show that our method provides much more accurate estimates than models ignoring the abundance of detections. The same findings are reached when we apply the methods to two real datasets on North American frogs surveyed with acoustic recorders. When false positives occur, estimator accuracy can be improved when a subset of detections produced by the classification algorithm is post-validated by a human observer. We use simulations to investigate the relationship between accuracy and effort spent on post-validation, and find that very accurate occupancy estimates can be obtained with as little as 1% of data being validated. Automated monitoring of wildlife provides opportunities and challenges. Our methods for analysing automated species-detection data help to meet key challenges unique to these data and will prove useful for many wildlife monitoring programs.
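    The bias these models correct for is easy to reproduce: when a classifier produces false positives, the naive occupancy estimate (the fraction of sites with at least one detection) overestimates true occupancy. A minimal simulation under assumed occupancy, detection, and false-positive rates (all illustrative, not the paper's model):

```python
import numpy as np

rng = np.random.default_rng(0)
n_sites, n_surveys = 1000, 5
psi = 0.4      # true occupancy probability
p = 0.5        # per-survey detection prob at occupied sites
fp = 0.1       # per-survey false-positive prob (classifier error)

occupied = rng.random(n_sites) < psi
true_det = occupied[:, None] & (rng.random((n_sites, n_surveys)) < p)
false_det = rng.random((n_sites, n_surveys)) < fp
detected = true_det | false_det

# naive estimator: fraction of sites with >= 1 detection
naive_psi = detected.any(axis=1).mean()
```

    With these rates the naive estimate lands far above the true ψ = 0.4, because about 41% of unoccupied sites accumulate at least one false detection over five surveys.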

  14. DEVELOPMENT OF RESERVOIR CHARACTERIZATION TECHNIQUES AND PRODUCTION MODELS FOR EXPLOITING NATURALLY FRACTURED RESERVOIRS

    Energy Technology Data Exchange (ETDEWEB)

    Michael L. Wiggins; Raymon L. Brown; Faruk Civan; Richard G. Hughes

    2002-12-31

    optimizing the recovery from naturally fractured reservoir systems. The next logical extension of this work is to apply the proposed methods to an actual field case study to provide information for verification and modification of the techniques and simulator. This report provides the details of the proposed techniques and summarizes the activities undertaken during the course of this project. Technology transfer activities were highlighted by a two-day technical conference held in Oklahoma City in June 2002. This conference attracted over 90 participants and included the presentation of seventeen technical papers from researchers throughout the United States.

  15. An optimized one-tube, semi-nested PCR assay for Paracoccidioides brasiliensis detection

    Directory of Open Access Journals (Sweden)

    Amanda de Faveri Pitz

    2013-12-01

    Full Text Available Introduction: Herein, we report a one-tube, semi-nested-polymerase chain reaction (OTsn-PCR) assay for the detection of Paracoccidioides brasiliensis. Methods: We developed the OTsn-PCR assay for the detection of P. brasiliensis in clinical specimens and compared it with other PCR methods. Results: The OTsn-PCR assay was positive for all clinical samples, and the detection limit was better than or equivalent to the other nested or semi-nested PCR methods for P. brasiliensis detection. Conclusions: The OTsn-PCR assay described in this paper has a detection limit similar to other reactions for the molecular detection of P. brasiliensis, but this approach is faster and less prone to contamination than other conventional nested or semi-nested PCR assays.

  16. An optimized one-tube, semi-nested PCR assay for Paracoccidioides brasiliensis detection.

    Science.gov (United States)

    Pitz, Amanda de Faveri; Koishi, Andrea Cristine; Tavares, Eliandro Reis; Andrade, Fábio Goulart de; Loth, Eduardo Alexandre; Gandra, Rinaldo Ferreira; Venancio, Emerson José

    2013-01-01

    Herein, we report a one-tube, semi-nested-polymerase chain reaction (OTsn-PCR) assay for the detection of Paracoccidioides brasiliensis. We developed the OTsn-PCR assay for the detection of P. brasiliensis in clinical specimens and compared it with other PCR methods. The OTsn-PCR assay was positive for all clinical samples, and the detection limit was better or equivalent to the other nested or semi-nested PCR methods for P. brasiliensis detection. The OTsn-PCR assay described in this paper has a detection limit similar to other reactions for the molecular detection of P. brasiliensis, but this approach is faster and less prone to contamination than other conventional nested or semi-nested PCR assays.

  17. Optimal wavelet transform for the detection of microaneurysms in retina photographs

    OpenAIRE

    Quellec, Gwénolé; Lamard, Mathieu; Josselin, Pierre Marie; Cazuguel, Guy; Cochener, Béatrice; Roux, Christian

    2008-01-01

    11 pages; International audience; In this paper, we propose an automatic method to detect microaneurysms in retina photographs. Microaneurysms are the most frequent and usually the first lesions to appear as a consequence of diabetic retinopathy. So, their detection is necessary for both screening the pathology and follow up (progression measurement). Automating this task, which is currently performed manually, would bring more objectivity and reproducibility. We propose to detect them by loc...

  18. Detecting unstable periodic orbits of nonlinear mappings by a novel quantum-behaved particle swarm optimization non-Lyapunov way

    International Nuclear Information System (INIS)

    Gao Fei; Gao Hongrui; Li Zhuoqiu; Tong Hengqing; Lee, Ju-Jang

    2009-01-01

    It is well known that the set of unstable periodic orbits (UPOs) can be thought of as the skeleton of the dynamics. However, detecting the UPOs of a nonlinear map is one of the most challenging problems of nonlinear science, in both numerical computation and experimental measurement. In this paper, a new method is proposed to detect UPOs in a non-Lyapunov way. First, three special techniques are added to quantum-behaved particle swarm optimization (QPSO): a novel mbest particle, self-adaptive contraction of the search space, and boundary restriction (NCB); the resulting method is called NCB-QPSO. It maintains an effective search mechanism with a fine balance between exploitation and exploration. Second, the problem of detecting UPOs is converted into the minimization of non-negative functions through a proper transformation, in a non-Lyapunov way. Third, simulations on 6 benchmark optimization problems and different high-order UPOs of 5 classic nonlinear maps are carried out with the proposed method. The results show that NCB-QPSO is a successful method for detecting UPOs, with the advantages of fast convergence, high precision and robustness.
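    The non-Lyapunov conversion works like this: a period-p point of a map f satisfies f^p(x) = x, so UPO detection becomes minimization of the non-negative function (f^p(x) − x)². A toy sketch using a plain PSO (not the paper's NCB-QPSO) to locate the unstable fixed point x* = 0.75 of the logistic map x ↦ 4x(1−x), with the search restricted to [0.5, 1] so the other fixed point at 0 is excluded; all parameters are illustrative:

```python
import numpy as np

def logistic(x):
    return 4.0 * x * (1.0 - x)

def orbit_cost(x, period=1):
    """Non-negative objective whose zeros are period-`period` points."""
    y = x
    for _ in range(period):
        y = logistic(y)
    return (y - x) ** 2

def pso(cost, lo, hi, n=30, iters=200, seed=0):
    """Minimal particle swarm optimizer on a 1-D interval."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, n)               # particle positions
    v = np.zeros(n)                          # particle velocities
    pbest, pcost = x.copy(), cost(x)
    g = pbest[np.argmin(pcost)]              # global best
    for _ in range(iters):
        r1, r2 = rng.random(n), rng.random(n)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        c = cost(x)
        better = c < pcost
        pbest[better], pcost[better] = x[better], c[better]
        g = pbest[np.argmin(pcost)]
    return g

upo = pso(orbit_cost, 0.5, 1.0)              # converges near 0.75
```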

  19. Pressure optimization of an EC-QCL based cavity ring-down spectroscopy instrument for exhaled NO detection

    Science.gov (United States)

    Zhou, Sheng; Han, Yanling; Li, Bincheng

    2018-02-01

    Nitric oxide (NO) in exhaled breath has gained increasing interest in recent years, driven mainly by the clinical need to monitor inflammatory status in respiratory disorders such as asthma and other pulmonary conditions. Mid-infrared cavity ring-down spectroscopy (CRDS) using an external cavity, widely tunable continuous-wave quantum cascade laser operating at 5.3 µm was employed for NO detection. The detection pressure was reduced in steps to improve the sensitivity, and the optimal pressure was determined to be 15 kPa based on the fitting residual analysis of measured absorption spectra. A detection limit (1σ, one standard deviation) of 0.41 ppb was experimentally achieved for NO detection in human breath under the optimized condition in a total of 60 s acquisition time (2 s per data point). A diurnal measurement session was conducted for exhaled NO. The experimental results indicate that the mid-infrared CRDS technique has great potential for various applications in health diagnosis.
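    In CRDS the absorber signal comes from the change in cavity decay time: α = (1/c)(1/τ − 1/τ0), where τ0 is the empty-cavity ring-down time and τ the ring-down time with the absorber present. A minimal sketch with illustrative ring-down times (not values from this instrument):

```python
C_LIGHT = 2.998e8  # speed of light, m/s

def absorption_coefficient(tau_s, tau0_s):
    """Absorption coefficient (1/m) from ring-down times with and
    without the absorber: alpha = (1/c) * (1/tau - 1/tau0)."""
    return (1.0 / C_LIGHT) * (1.0 / tau_s - 1.0 / tau0_s)

# a hypothetical 10 us empty-cavity decay shortened to 9 us
alpha = absorption_coefficient(9e-6, 10e-6)   # ~3.7e-5 per metre
```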

  20. Technical note: Simultaneous carotenoid and vitamin analysis of milk from total mixed ration-fed cows optimized for xanthophyll detection.

    Science.gov (United States)

    Stout, M A; Benoist, D M; Drake, M A

    2018-06-01

    Concentrations of retinol, α-tocopherol, and major carotenoids in dairy products are often determined simultaneously by liquid chromatography. These compounds have different polarity and solubility; thus, extracting them simultaneously can be difficult and inefficient. In milks with low carotenoid concentrations, the xanthophylls lutein and zeaxanthin may not be completely resolved using common extraction techniques. A simplified method was developed to optimize extraction efficiency and the limit of detection (LoD) and limit of quantification (LoQ) of lutein and zeaxanthin in bovine milk without decreasing sensitivity to other vitamins or carotenoids. The developed method evaluates lutein, zeaxanthin, β-carotene, retinol, and α-tocopherol simultaneously by ultra-high performance liquid chromatography-photodiode array detection. Common saponification temperatures (40-60°C) and concentrations of KOH in water (10-50% KOH wt/vol) were evaluated. Multiple solvents were evaluated for optimal xanthophyll extraction (diethyl ether, dichloromethane, hexane, and tetrahydrofuran) following saponification. The LoD and LoQ were defined as 3:1 and 10:1 signal-to-noise ratios, respectively. All experiments were performed in triplicate. The optimal saponification procedure was a concentration of 25% KOH at either 40 or 50°C. Saponified extracts solubilized in solutions containing diethyl ether had greater concentrations of lutein than hexane- or tetrahydrofuran-based solutions, with peak areas above LoQ values. The solution containing diethyl ether solubilized similar concentrations of retinol, α-tocopherol, and β-carotene when compared with other solutions. The proposed optimized method allows for the simultaneous determination of carotenoids from milk with increased lutein and zeaxanthin sensitivity without sacrificing recovery of retinol, α-tocopherol, and β-carotene. Copyright © 2018 American Dairy Science Association. Published by Elsevier Inc. All rights

  1. Geothermal reservoir engineering

    CERN Document Server

    Grant, Malcolm Alister

    2011-01-01

    As nations struggle to diversify and secure their power portfolios, geothermal energy, the essentially limitless heat emanating from the earth itself, is being harnessed at an unprecedented rate.  For the last 25 years, engineers around the world tasked with taming this raw power have used Geothermal Reservoir Engineering as both a training manual and a professional reference.  This long-awaited second edition of Geothermal Reservoir Engineering is a practical guide to the issues and tasks geothermal engineers encounter in the course of their daily jobs. The bo

  2. Optimizing Case-based detection performance in a multiview CAD system for mammography

    NARCIS (Netherlands)

    Samulski, M.; Karssemeijer, N.

    2011-01-01

    When reading mammograms, radiologists combine information from multiple views to detect abnormalities. Most computer-aided detection (CAD) systems, however, use primitive methods for inclusion of multiview context or analyze each view independently. In previous research it was found that in

  3. Optimizing a neural network for detection of moving vehicles in video

    NARCIS (Netherlands)

    Fischer, N.M.; Kruithof, M.C.; Bouma, H.

    2017-01-01

    In the field of security and defense, it is extremely important to reliably detect moving objects, such as cars, ships, drones and missiles. Detection and analysis of moving objects in cameras near borders could be helpful to reduce illicit trading, drug trafficking, irregular border crossing,

  4. Converting structures to optimize the Synchrotron X radiation detection by CCD systems

    International Nuclear Information System (INIS)

    Zanella, G.; Zannoni, R.

    1987-01-01

    It is pointed out how the quantum efficiency of X-ray detection in CCD detecting systems can be improved by enlarging their sensitivity range by means of heavy-element converting structures. In this way, the problem of fabricating CCDs with a deep depletion layer is avoided.

  5. Bearing Fault Detection Based on Maximum Likelihood Estimation and Optimized ANN Using the Bees Algorithm

    Directory of Open Access Journals (Sweden)

    Behrooz Attaran

    2015-01-01

    Full Text Available Rotating machinery is the most common machinery in industry. The root cause of faults in rotating machinery is often faulty rolling element bearings. This paper presents a technique using an artificial neural network optimized by the Bees Algorithm for automated diagnosis of localized faults in rolling element bearings. The inputs of this technique are a number of features (maximum likelihood estimation values) derived from the vibration signals of the test data. The results show that the performance of the proposed optimized system is better than that of most previous studies, even though it uses only two features. The effectiveness of the above method is illustrated using the obtained bearing vibration data.

  6. An Optimized Structure on FPGA of Key Point Detection in SIFT Algorithm

    Directory of Open Access Journals (Sweden)

    Xu Chenyu

    2016-01-01

    Full Text Available The SIFT algorithm is the most efficient and powerful algorithm for describing image features, and it has been applied in many fields. In this paper, we propose an optimized method to realize the hardware implementation of the SIFT algorithm. We mainly discuss the structure of Data Generation here. A pipeline architecture is introduced to accelerate this optimized system. Parameter setting and approximation control under different image qualities and hardware resources are the focus of this paper. The results of experiments fully prove that this structure is real-time and effective, and provide guidance for different situations.

  7. On the Optimal Detection and Error Performance Analysis of the Hardware Impaired Systems

    KAUST Repository

    Javed, Sidrah; Amin, Osama; Ikki, Salama S.; Alouini, Mohamed-Slim

    2018-01-01

    The conventional minimum Euclidean distance (MED) receiver design is based on the assumption of ideal hardware transceivers and proper Gaussian noise in communication systems. Throughout this study, an accurate statistical model of various hardware impairments (HWIs) is presented. Then, an optimal maximum likelihood (ML) receiver is derived considering the distinct characteristics of the HWIs comprised of additive improper Gaussian noise and signal distortion. Next, the average error probability performance of the proposed optimal ML receiver is analyzed and tight bounds are derived. Finally, different numerical and simulation results are presented to support the superiority of the proposed ML receiver over MED receiver and the tightness of the derived bounds.

  9. Nagylengyel: an interesting reservoir. [Hungary

    Energy Technology Data Exchange (ETDEWEB)

    Dedinszky, J

    1971-04-01

    The Nagylengyel oil field, discovered in 1951, has oil-producing formations mostly in the Upper Triassic dolomites, in the Norian-Rhaetian transition formations, in the Upper Cretaceous limestones and shales, and in the Miocene. The reservoir space formed in many stages. A porous, cavernous, fractured reservoir developed in the Norian principal dolomite; a cavernous fractured reservoir exists in the Cretaceous limestone and shale; and a porous fractured reservoir developed in the Miocene. Derivation of a reservoir model and a conservative evaluation of the reservoir volume made it possible to apply secondary recovery.

  10. WE-EF-207-03: Design and Optimization of a CBCT Head Scanner for Detection of Acute Intracranial Hemorrhage

    Energy Technology Data Exchange (ETDEWEB)

    Xu, J; Sisniega, A; Zbijewski, W; Dang, H; Stayman, J; Aygun, N; Koliatsos, V; Siewerdsen, JH [Johns Hopkins University, Baltimore, MD (United States); Wang, X; Foos, DH [Carestream Health, Rochester, NY (United States)

    2015-06-15

    Purpose: To design a dedicated x-ray cone-beam CT (CBCT) system suitable for deployment at the point-of-care and offering reliable detection of acute intracranial hemorrhage (ICH), traumatic brain injury (TBI), stroke, and other head and neck injuries. Methods: A comprehensive task-based image quality model was developed to guide system design and optimization of a prototype head scanner suitable for imaging of acute TBI and ICH. Previously reported models were expanded to include the effects of x-ray scatter correction necessary for detection of low-contrast ICH and the contribution of bit depth (digitization noise) to imaging performance. The task-based detectability index provided the objective function for optimization of system geometry, x-ray source, detector type, anti-scatter grid, and technique at 10–25 mGy dose. Optimal characteristics were experimentally validated using a custom head phantom with 50 HU contrast ICH inserts imaged on a CBCT imaging bench allowing variation of system geometry, focal spot size, detector, grid selection, and x-ray technique. Results: The model guided selection of a system geometry with a nominal source-detector distance of 1100 mm and optimal magnification of 1.50. A focal spot size of ∼0.6 mm was sufficient for the spatial resolution requirements of ICH detection. Imaging at 90 kVp yielded the best tradeoff between noise and contrast. The model provided quantitation of tradeoffs between flat-panel and CMOS detectors with respect to electronic noise, field of view, and readout speed required for imaging of ICH. An anti-scatter grid was shown to provide modest benefit in conjunction with post-acquisition scatter correction. Images of the head phantom demonstrate visualization of millimeter-scale simulated ICH. Conclusions: Performance consistent with acute TBI and ICH detection is feasible with model-based system design and robust artifact correction in a dedicated head CBCT system. Further improvements can be achieved with incorporation of
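    A task-based detectability index of the kind used as the objective function above is typically computed by integrating the task function against the system's MTF and noise-power spectrum. A 1-D radial sketch of a prewhitening-observer detectability index; the Gaussian MTF, ramp-like NPS, and low-frequency task function below are illustrative stand-ins, not the authors' measured system models:

```python
import numpy as np

def detectability_index(freqs, task, mtf, nps):
    """Prewhitening-observer d'^2: integral of (W * MTF)^2 / NPS over
    2-D frequency, on a radial axis (2*pi*f Jacobian for isotropy)."""
    integrand = 2.0 * np.pi * freqs * (task * mtf) ** 2 / nps
    df = freqs[1] - freqs[0]
    return np.sum(integrand) * df            # simple Riemann sum

f = np.linspace(0.01, 2.0, 500)              # spatial frequency, cycles/mm
mtf = np.exp(-(f / 1.0) ** 2)                # illustrative Gaussian MTF
nps = 1e-6 * (0.2 + f)                       # illustrative ramp-like NPS
task = np.exp(-(f / 0.3) ** 2)               # low-contrast blob detection task
d_squared = detectability_index(f, task, mtf, nps)
```

    Design parameters (geometry, kVp, grid) are then chosen to maximize d' for the target task at fixed dose.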

  11. Feasibility of Optimizing Recovery and Reserves from a Mature and Geological Complex Multiple Turbidite Offshore Calif. Reservoir through the Drilling and Completion of a Trilateral Horizontal Well, Class III

    Energy Technology Data Exchange (ETDEWEB)

    Pacific Operators Offshore, Inc.

    2001-04-04

    The intent of this project was to increase production and extend the economic life of this mature field through the application of advanced reservoir characterization and drilling technology, demonstrating the efficacy of these technologies to other small operators of aging fields. Two study periods were proposed; the first to include data assimilation and reservoir characterization and the second to drill the demonstration well. The initial study period showed that a single tri-lateral well would not be economically efficient in redevelopment of Carpinteria's multiple deep water turbidite sand reservoirs, and the study was amended to include the drilling of a series of horizontal redrills from existing surplus well bores on Pacific Operators' Platform Hogan.

  12. Fault detection and isolation in GPS receiver autonomous integrity monitoring based on chaos particle swarm optimization-particle filter algorithm

    Science.gov (United States)

    Wang, Ershen; Jia, Chaoying; Tong, Gang; Qu, Pingping; Lan, Xiaoyu; Pang, Tao

    2018-03-01

    The receiver autonomous integrity monitoring (RAIM) is one of the most important parts of an avionic navigation system. Two problems need to be addressed to improve this system: the degeneracy phenomenon and the lack of samples in the standard particle filter (PF), where the number of samples cannot adequately express the real distribution of the probability density function (i.e., sample impoverishment). This study presents a GPS receiver autonomous integrity monitoring (RAIM) method based on a chaos particle swarm optimization particle filter (CPSO-PF) algorithm with a log likelihood ratio. The chaos sequence generates a set of chaotic variables, which are mapped to the interval of the optimization variables to improve particle quality. This chaos perturbation overcomes the potential for the search to become trapped in a local optimum in the particle swarm optimization (PSO) algorithm. Test statistics are configured based on a likelihood ratio, and satellite fault detection is then conducted by checking the consistency between the state estimate of the main PF and those of the auxiliary PFs. Based on GPS data, the experimental results demonstrate that the proposed algorithm can effectively detect and isolate satellite faults under conditions of non-Gaussian measurement noise. Moreover, the performance of the proposed novel method is better than that of RAIM based on the PF or PSO-PF algorithm.
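    The "chaos sequence" ingredient is typically a logistic-map iteration whose values are mapped onto the search interval, giving a deterministic but well-spread set of candidate particles. A sketch of that mapping (seed value and interval are illustrative; this is not the paper's full CPSO-PF):

```python
def chaotic_sequence(n, x0=0.345, r=4.0):
    """Logistic-map chaotic variables in (0, 1): x <- r * x * (1 - x)."""
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

def map_to_interval(xs, lo, hi):
    """Map chaotic variables onto the optimization interval [lo, hi]."""
    return [lo + (hi - lo) * x for x in xs]

# e.g. 20 candidate particle positions spread over [-5, 5]
particles = map_to_interval(chaotic_sequence(20), -5.0, 5.0)
```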

  13. Double-layer evolutionary algorithm for distributed optimization of particle detection on the Grid

    International Nuclear Information System (INIS)

    Padée, Adam; Zaremba, Krzysztof; Kurek, Krzysztof

    2013-01-01

    Reconstruction of particle tracks from information collected by position-sensitive detectors is an important procedure in HEP experiments. It is usually controlled by a set of numerical parameters which have to be manually optimized. This paper proposes an automatic approach to this task, utilizing an evolutionary algorithm (EA) operating on both real-valued and binary representations. Because of the computational complexity of the task, a special distributed architecture of the algorithm is proposed, designed to be run in a grid environment. It is a two-level hierarchical hybrid, utilizing an asynchronous master-slave EA on the level of clusters and an island-model EA on the level of the grid. The technical aspects of using production grid infrastructure are covered, including communication protocols on both levels. The paper also addresses the problem of heterogeneity of the resources, presenting efficiency tests on a benchmark function. These tests confirm that even relatively small islands (clusters) can be beneficial to the optimization process when connected to larger ones. Finally, a real-life usage example is presented: an optimization of track reconstruction in the Large Angle Spectrometer of the NA-58 COMPASS experiment held at CERN, using a sample of Monte Carlo simulated data. The overall reconstruction efficiency gain achieved by the proposed method is more than 4%, compared to the manually optimized parameters.
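    The island-model level of the hierarchy can be sketched in a few lines: each island evolves its own population independently (on the grid, each island would run on a separate cluster) and periodically migrates its best individual to a neighbour in a ring. The sphere objective, population sizes, and migration scheme below are illustrative assumptions, not the paper's configuration.

```python
import random

def evolve(pop, fitness, n_gen=20, sigma=0.3):
    """Evolve one island: mutate the best genome and replace the worst."""
    for _ in range(n_gen):
        parent = min(pop, key=fitness)
        child = [g + random.gauss(0.0, sigma) for g in parent]
        pop.sort(key=fitness)
        pop[-1] = child          # elitist: the best genome is never lost
    return pop

def island_model(n_islands=4, pop_size=8, rounds=5,
                 fitness=lambda g: sum(x * x for x in g)):
    """Island-model EA: independent evolution plus ring migration."""
    islands = [[[random.uniform(-5.0, 5.0) for _ in range(3)]
                for _ in range(pop_size)] for _ in range(n_islands)]
    for _ in range(rounds):
        islands = [evolve(pop, fitness) for pop in islands]
        for i, pop in enumerate(islands):      # migrate best to the next island
            best = min(pop, key=fitness)
            neighbour = islands[(i + 1) % n_islands]
            neighbour.sort(key=fitness)
            neighbour[-1] = list(best)
    return min((min(pop, key=fitness) for pop in islands), key=fitness)

random.seed(1)
best = island_model()
```

In the paper's setting the migrated individuals would travel between clusters over the grid's communication protocol rather than between lists in one process.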

  14. Fast detection of genetic information by an optimized PCR in an interchangeable chip.

    KAUST Repository

    Wu, Jinbo; Kodzius, Rimantas; Qin, Jianhua; Wen, Weijia; Xiao, Kang

    2012-01-01

    amplification. An optimized PCR with a two-temperature approach for denaturing and annealing (Td and Ta) of DNA was also formulated with the PCR chip, with which the amplification of the male-specific sex-determining region Y (SRY) gene marker by utilizing raw saliva

  15. Redefining the Viral Reservoirs That Prevent HIV-1 Eradication

    Science.gov (United States)

    Eisele, Evelyn; Siliciano, Robert F.

    2014-01-01

    This review proposes definitions for key terms in the field of HIV-1 latency and eradication. In the context of eradication, a reservoir is a cell type that allows persistence of replication-competent HIV-1 on a time scale of years in patients on optimal antiretroviral therapy. Reservoirs act as a barrier to eradication in the patient population in whom cure attempts will likely be made. Halting viral replication is essential to eradication, and definitions and criteria for assessing whether this goal has been achieved are proposed. The cell types that may serve as reservoirs for HIV-1 are discussed. Currently, only latently infected resting CD4+ T cells fit the proposed definition of a reservoir, and more evidence is necessary to demonstrate that other cell types, including hematopoietic stem cells and macrophages, fit this definition. Further research is urgently required on potential reservoirs in the gut-associated lymphoid tissue and the central nervous system. PMID:22999944

  16. Evaluation of swabs, transport media, and specimen transport conditions for optimal detection of viruses by PCR.

    Science.gov (United States)

    Druce, Julian; Garcia, Katherine; Tran, Thomas; Papadakis, Georgina; Birch, Chris

    2012-03-01

    Depletion of swabs and viral transport medium during epidemics may prompt the use of unvalidated alternatives. Swabs collected and transported dry or in saline were compared to commercially available swab/medium combinations for PCR detection of influenza, enterovirus, herpes simplex virus, and adenovirus. Each virus remained detectable after storage at ambient temperature (22°C) and at 4°C for 7 days. Detection of influenza on dry or saline swabs is important because of its capacity to cause outbreaks involving large numbers of cases.

  17. Parallel reservoir simulator computations

    International Nuclear Information System (INIS)

    Hemanth-Kumar, K.; Young, L.C.

    1995-01-01

    The adaptation of a reservoir simulator for parallel computations is described. The simulator was originally designed for vector processors. It performs approximately 99% of its calculations in vector/parallel mode and, relative to scalar calculations, achieves speedups of 65 and 81 for black oil and EOS simulations, respectively, on the CRAY C-90.
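    The relationship between the 99% vector/parallel fraction and the achievable speedup is Amdahl's law: with 1% of the work left at scalar speed, the overall speedup is capped at 100x no matter how fast the vector units are, which puts the reported speedups of 65 and 81 in context. A minimal sketch:

```python
def amdahl_speedup(parallel_fraction, s):
    """Amdahl's law: overall speedup when a fraction p of the work is
    accelerated by a factor s and the remainder runs at scalar speed."""
    p = parallel_fraction
    return 1.0 / ((1.0 - p) + p / s)

# With 99% of the work vectorized, the ceiling is 1 / (1 - 0.99) = 100x,
# so the reported speedups of 65 (black oil) and 81 (EOS) sit close to it.
ceiling = amdahl_speedup(0.99, float("inf"))
```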

  18. unconventional natural gas reservoirs

    International Nuclear Information System (INIS)

    Correa G, Tomas F; Osorio, Nelson; Restrepo R, Dora P

    2009-01-01

    This work is an exploration of different unconventional gas reservoirs worldwide: coal bed methane, tight gas, shale gas, and gas hydrates, describing aspects such as definition, reserves, production methods, environmental issues, and economics. The overview also mentions preliminary studies of these sources in Colombia.

  19. Development and optimization of the determination of pharmaceuticals in water samples by SPE and HPLC with diode-array detection.

    Science.gov (United States)

    Pavlović, Dragana Mutavdžić; Ašperger, Danijela; Tolić, Dijana; Babić, Sandra

    2013-09-01

    This paper describes the development, optimization, and validation of a method for the determination of five pharmaceuticals from different therapeutic classes (antibiotics, anthelmintics, glucocorticoids) in water samples. Water samples were prepared using SPE, and extracts were analyzed by HPLC with diode-array detection. The efficiency of 11 different SPE cartridges in extracting the investigated compounds from water was tested in preliminary experiments. Then, the pH of the water sample, the elution solvent, and the sorbent mass were optimized. In addition to optimization of the SPE procedure, HPLC columns with different stationary phases from different manufacturers were compared to select the optimal column. The developed method was validated using spring water samples spiked with appropriate concentrations of pharmaceuticals. Good linearity was obtained in the range of 2.4-200 μg/L, depending on the pharmaceutical, with correlation coefficients >0.9930 in all cases except ciprofloxacin (0.9866). The method also showed low LODs (0.7-3.9 μg/L), good intra- and interday precision with RSD below 17%, and recoveries above 98% for all pharmaceuticals. The method has been successfully applied to the analysis of production wastewater samples from the pharmaceutical industry. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
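    Figures of merit like the correlation coefficients and LODs reported above are typically derived from a calibration line. A minimal sketch with hypothetical calibration data spanning the reported 2.4-200 μg/L range; the 3.3σ/slope LOD estimate is one common convention, not necessarily the one used in this study.

```python
import numpy as np

# Hypothetical calibration data: concentration (ug/L) vs. detector response.
conc = np.array([2.4, 10.0, 25.0, 50.0, 100.0, 200.0])
resp = np.array([5.1, 21.0, 52.4, 104.9, 210.3, 419.8])

slope, intercept = np.polyfit(conc, resp, 1)    # least-squares line
r = np.corrcoef(conc, resp)[0, 1]               # correlation coefficient

residuals = resp - (slope * conc + intercept)
sigma = residuals.std(ddof=2)                   # residual standard deviation
lod = 3.3 * sigma / slope                       # LOD estimate (3.3 sigma / slope)
```

Linearity is judged by r against an acceptance criterion such as the >0.9930 used above, and the LOD falls out of the scatter of the residuals around the fitted line.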

  20. Environmental Factors Affecting Mercury in Camp Far West Reservoir, California, 2001-03

    Science.gov (United States)

    Alpers, Charles N.; Stewart, A. Robin; Saiki, Michael K.; Marvin-DiPasquale, Mark C.; Topping, Brent R.; Rider, Kelly M.; Gallanthine, Steven K.; Kester, Cynthia A.; Rye, Robert O.; Antweiler, Ronald C.; De Wild, John F.

    2008-01-01

    water were observed in samples collected during summer from deepwater stations in the anoxic hypolimnion. In the shallow (less than 14 meters depth) oxic epilimnion, concentrations of methylmercury in unfiltered water were highest during the spring and lowest during the fall. The ratio of methylmercury to total mercury (MeHg/HgT) increased systematically from winter to spring to summer, largely in response to the progressive seasonal decrease in total mercury concentrations, but also to some extent because of increases in MeHg concentrations during summer. Water-quality data for Camp Far West Reservoir are used in conjunction with data from linked studies of sediment and biota to develop and refine a conceptual model for mercury methylation and bioaccumulation in the reservoir and the lower Bear River watershed. It is hypothesized that MeHg is produced by sulfate-reducing bacteria in the anoxic parts of the water column and in shallow bed sediment. Conditions were optimal for this process during late summer and fall. Previous work has indicated that Camp Far West Reservoir is a phosphate-limited system: molar ratios of inorganic nitrogen to inorganic phosphorus in filtered water were consistently greater than 16 (the Redfield ratio), sometimes by orders of magnitude. As expected, therefore, concentrations of orthophosphate were very low or below detection at all stations during all seasons. It is further hypothesized that iron-reducing bacteria facilitate release of phosphorus from iron-rich sediments during summer and early fall, stimulating phytoplankton growth in the fall and winter, and that the MeHg produced in the hypolimnion and metalimnion is released to the entire water column in the late fall during reservoir destratification (vertical mixing). Mercury bioaccumulation factors (BAF) were computed using data from linked studies of biota spanning a range of trophic positions: zooplankton, midge larvae, mayfly nymphs, crayfish, threadfin shad, bluegill,
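    Two of the quantities above reduce to simple ratios: the bioaccumulation factor (BAF) is the biota concentration over the water concentration, and the Redfield comparison is a molar N:P ratio. A minimal sketch with hypothetical concentrations (the numbers are illustrative, not values from this study):

```python
import math

def bioaccumulation_factor(c_biota_ng_per_g, c_water_ng_per_L):
    """BAF in L/kg: tissue concentration (ng/g wet weight, converted to
    ng/kg) divided by aqueous concentration (ng/L)."""
    return (c_biota_ng_per_g * 1000.0) / c_water_ng_per_L

def molar_np_ratio(n_mg_per_L, p_mg_per_L):
    """Molar inorganic N:P ratio; values > 16 (the Redfield ratio)
    suggest phosphate limitation."""
    return (n_mg_per_L / 14.007) / (p_mg_per_L / 30.974)

# Hypothetical: 50 ng/g MeHg in fish tissue, 0.1 ng/L MeHg in water.
baf = bioaccumulation_factor(50.0, 0.1)
log_baf = math.log10(baf)          # log10 BAF, the usual reporting form
```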

  1. Efficient optimal joint channel estimation and data detection for massive MIMO systems

    KAUST Repository

    Alshamary, Haider Ali Jasim; Xu, Weiyu

    2016-01-01

    show that the expected complexity of our algorithm grows polynomially in the channel coherence time. Simulation results demonstrate significant performance gains of our algorithm compared with suboptimal non-coherent detection algorithms. To the best

  2. Muon Tomography of Deep Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Bonneville, Alain H.; Kouzes, Richard T.

    2016-12-31

    Imaging subsurface geological formations, oil and gas reservoirs, mineral deposits, cavities, or magma chambers under active volcanoes has for many years been a major quest of geophysicists and geologists. Since these objects cannot be observed directly, different indirect geophysical methods have been developed. They are all based on variations of certain physical properties of the subsurface that can be detected from the ground surface or from boreholes. Electrical resistivity, seismic wave velocities, and density are certainly the most used properties. For density, indirect estimates of density distributions are currently performed by seismic reflection methods, since the velocity of seismic waves also depends on density, but these are expensive and discontinuous in time. Direct estimates of density are performed using gravimetric data, looking at variations of the gravity field induced by density variations at depth, but this is not sufficiently accurate. A new imaging technique using cosmic-ray muon detectors has emerged during the last decade, and muon tomography, or muography, promises to provide, for the first time, a complete and precise image of the density distribution in the subsurface. Further, this novel approach has the potential to become a direct, real-time, and low-cost method for monitoring fluid displacement in subsurface reservoirs.
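    The density information in muography comes from comparing the muon flux surviving a rock path with the open-sky flux. The toy model below uses a single exponential attenuation length, which is an illustrative assumption; real analyses integrate the surviving flux over the cosmic-ray energy spectrum.

```python
import math

def transmitted_fraction(opacity_g_cm2, att_len_g_cm2=1800.0):
    """Toy model: fraction of the open-sky muon flux surviving an
    opacity X = integral of rho dl along the path (g/cm^2)."""
    return math.exp(-opacity_g_cm2 / att_len_g_cm2)

def mean_density(path_length_m, flux_ratio, att_len_g_cm2=1800.0):
    """Invert the toy model: average density (g/cm^3) along the path
    from the measured-to-open-sky flux ratio."""
    opacity = -att_len_g_cm2 * math.log(flux_ratio)
    return opacity / (path_length_m * 100.0)   # metres -> centimetres

# Example: 100 m of rock transmitting ~9e-7 of the open-sky flux
# corresponds to a mean density of about 2.5 g/cm^3 in this model.
rho = mean_density(100.0, 9e-7)
```

Anomalies such as fluid-filled pore space show up as lower mean density along the affected muon paths, which is the basis of the reservoir-monitoring application.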

  3. Optimization of Diagnostic Elisa - Based Tests for the Detection of Auto-Antibodies Against Tumor Antigens in Human Serum

    Directory of Open Access Journals (Sweden)

    Daria Štefatić

    2008-08-01

    Colorectal cancer is one of the most common cancer types worldwide and continues to be a serious public health problem. Early detection and diagnosis are of great importance in cancer management. At present, diagnostic blood tests are based on the detection of tumor-associated markers such as carcinoembryonic antigen (CEA), the cancer antigen CA19-9 for gastrointestinal cancer, CA15-3 for breast cancer, or CA125 for ovarian cancer. The lack of sensitivity and specificity of these markers prevents their general use in cancer screening of an average-risk population. Therefore, new cancer biomarkers or better screening methods are necessary to improve the diagnostics of the disease. This study was directed at the optimization of a diagnostic enzyme-linked immunosorbent assay (ELISA)-based test to identify and validate new serum markers, such as extracellular Protein Kinase A (ecPKA) and Nicotinamide N-Methyltransferase (NNMT). In this type of assay, the cancer antigens are quantified indirectly, by detecting the presence of auto-antibodies against tumor proteins in human serum. For ecPKA, the optimization and validation process yielded a reproducible and stable assay; for NNMT, the assay was probably not sensitive enough.

  4. Deriving Area-storage Curves of Global Reservoirs

    Science.gov (United States)

    Mu, M.; Tang, Q.

    2017-12-01

    Basic information on global reservoirs and dams, including capacity, dam height, and largest water area, is well documented in databases such as GRanD (Global Reservoirs and Dams) and ICOLD (International Commission on Large Dams). However, although they play a critical role in estimating reservoir storage variations from remote sensing or hydrological models, area-storage (or elevation-storage) curves of reservoirs are not publicly shared. In this paper, we combine Landsat surface water extent, the 1 arc-minute global relief model (ETOPO1), and the GRanD database to derive area-storage curves of global reservoirs whose area is larger than 1 km2 (more than 6,000 reservoirs are included). First, the coverage polygon of each reservoir in GRanD is extended to where water was detected by Landsat during 1985-2015. Second, the elevation of each pixel in the reservoir is extracted from resampled 30-meter ETOPO1, and the relative depth and frequency of each depth value are calculated. Third, cumulative storage is calculated with increasing water area at every one percent of reservoir coverage area, yielding the uncalibrated area-storage curve. Finally, the area-storage curve is linearly calibrated by the ratio of the calculated capacity to the reported capacity in GRanD. The derived curves are compared with in-situ reservoir data collected in the Great Plains region of the US, and the results show that in-situ records are well captured by the derived curves even for relatively small reservoirs (several square kilometers). The newly derived area-storage curves have the potential to be employed in global monitoring or modelling of reservoir storage and area variations.
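    The stepwise derivation described above can be sketched as follows; the bowl-shaped bathymetry, 30-m pixel size, and reported capacity are toy values, and the elevation handling is greatly simplified relative to the paper's Landsat/ETOPO1 processing.

```python
import numpy as np

def area_storage_curve(pixel_elev_m, pixel_area_km2, reported_capacity_km3):
    """Fill the reservoir in 1% area increments over the pixel bathymetry,
    integrate depth x area for uncalibrated storage, then scale linearly so
    the total matches the reported capacity."""
    z = np.sort(np.asarray(pixel_elev_m, dtype=float))
    n = len(z)
    areas, storages = [], []
    for pct in range(1, 101):
        k = max(1, int(round(n * pct / 100.0)))
        level = z[k - 1]                      # water surface at k-th lowest pixel
        depth_m = level - z[:k]               # depth over each submerged pixel
        storages.append(depth_m.sum() * pixel_area_km2 * 1e-3)  # km2*m -> km3
        areas.append(k * pixel_area_km2)
    storages = np.array(storages)
    storages *= reported_capacity_km3 / storages[-1]   # linear calibration
    return np.array(areas), storages

# Toy bathymetry: 400 pixels ramping from 100 m to 130 m, 30-m Landsat
# pixels (0.0009 km2 each), reported capacity 0.01 km3.
area, vol = area_storage_curve(np.linspace(100.0, 130.0, 400), 0.0009, 0.01)
```

The resulting `vol` is monotonic in `area` and, by construction of the final scaling step, reaches the reported capacity at full coverage.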