WorldWideScience

Sample records for optimal imaging strategy

  1. Optimal Segmentation Strategy for Compact Representation of Hyperspectral Image Cubes

    Energy Technology Data Exchange (ETDEWEB)

    Paglieroni, D; Roberts, R

    2000-02-08

    By producing compact representations of hyperspectral image cubes (hypercubes), image storage requirements and the amount of time it takes to extract essential elements of information can both be dramatically reduced. However, these compact representations must preserve the important spectral features within hypercube pixels and the spatial structure associated with background and objects or phenomena of interest. This paper describes a novel approach for automatically and efficiently generating a particular type of compact hypercube representation, referred to as a supercube. The hypercube is segmented into regions that contain pixels with similar spectral shapes that are spatially connected, and the pixel connectivity constraint can be relaxed. Thresholds of similarity in spectral shape between pairs of pixels are derived directly from the hypercube data. One superpixel is generated for each region as some linear combination of pixels belonging to that region. The superpixels are optimal in the sense that the linear combination coefficients are computed so as to minimize the level of noise. Each hypercube pixel is represented in the supercube by applying a gain and bias to the superpixel assigned to the region containing that pixel. Examples are provided.
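
    The record above describes superpixels as noise-minimizing linear combinations of a region's pixels, with each original pixel recovered from its superpixel through a gain and a bias. The sketch below illustrates that idea only; the inverse-variance weights, the toy spectra, and the least-squares gain/bias fit are my assumptions, not the authors' exact algorithm.

    ```python
    import numpy as np

    def region_superpixel(spectra, noise_var=None):
        """Combine the spectra of one region (n_pixels x n_bands) into a single
        superpixel. Inverse-variance weights serve here as a simple stand-in for
        the paper's noise-minimizing linear-combination coefficients."""
        n = spectra.shape[0]
        if noise_var is None:
            noise_var = np.ones(n)
        w = 1.0 / np.asarray(noise_var, dtype=float)
        w /= w.sum()
        return w @ spectra  # (n_bands,)

    def gain_bias(pixel, superpixel):
        """Least-squares gain a and bias b so that a*superpixel + b ~ pixel."""
        A = np.column_stack([superpixel, np.ones_like(superpixel)])
        (a, b), *_ = np.linalg.lstsq(A, pixel, rcond=None)
        return a, b

    # Toy region: 5 pixels sharing one spectral shape, scaled and slightly noisy
    rng = np.random.default_rng(0)
    base_shape = np.sin(np.linspace(0, np.pi, 64)) + 1.0
    spectra = np.array([1.5 * k * base_shape + 0.1 * rng.standard_normal(64)
                        for k in range(1, 6)])

    sp = region_superpixel(spectra)
    for i, px in enumerate(spectra):
        a, b = gain_bias(px, sp)
        recon = a * sp + b
        print(f"pixel {i}: gain={a:.2f} bias={b:.2f} "
              f"rms error={np.sqrt(np.mean((recon - px) ** 2)):.3f}")
    ```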

  2. An Image Enhancement Method Using the Quantum-Behaved Particle Swarm Optimization with an Adaptive Strategy

    Directory of Open Access Journals (Sweden)

    Xiaoping Su

    2013-01-01

    Image enhancement techniques are very important in image processing; they are used to improve image quality or extract fine details in degraded images. In this paper, two novel objective functions based on the normalized incomplete Beta transform function are proposed to evaluate the effectiveness of grayscale image enhancement and color image enhancement, respectively. Using these objective functions, the parameters of the transform functions are estimated by the quantum-behaved particle swarm optimization (QPSO) algorithm. We also propose an improved QPSO with an adaptive parameter control strategy (AQPSO). The QPSO and AQPSO algorithms, along with the genetic algorithm (GA) and particle swarm optimization (PSO), are tested on several benchmark grayscale and color images. The results show that QPSO and AQPSO perform better than GA and PSO for the enhancement of these images, and that AQPSO has some advantages over QPSO due to its adaptive parameter control strategy.
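
    A minimal sketch of the transform-based enhancement step, assuming scipy.special.betainc as the normalized incomplete Beta function. The contrast-based objective and the plain random search below are placeholders for the paper's objective functions and for QPSO/AQPSO.

    ```python
    import numpy as np
    from scipy.special import betainc  # regularized (normalized) incomplete Beta

    def beta_enhance(img, a, b):
        """Map normalized intensities through x -> I_x(a, b), then rescale to [0, 255]."""
        x = (img - img.min()) / max(np.ptp(img), 1e-12)
        return 255.0 * betainc(a, b, x)

    def objective(enhanced):
        """Toy enhancement score: reward high global contrast (std of the image).
        The paper's objective functions are more elaborate; this is a placeholder."""
        return enhanced.std()

    rng = np.random.default_rng(1)
    img = rng.normal(100, 15, size=(64, 64)).clip(0, 255)  # dull synthetic image

    # Plain random search over (a, b); QPSO/AQPSO would replace this loop.
    best = (None, -np.inf)
    for _ in range(200):
        a, b = rng.uniform(0.5, 10.0, size=2)
        score = objective(beta_enhance(img, a, b))
        if score > best[1]:
            best = ((a, b), score)

    print("best (a, b):", np.round(best[0], 2), " contrast score:", round(best[1], 2))
    ```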

  3. Medical Image Registration by means of a Bio-Inspired Optimization Strategy

    Directory of Open Access Journals (Sweden)

    Hariton Costin

    2012-07-01

    Medical imaging mainly treats and processes missing, ambiguous, complementary, redundant and distorted data. Biomedical image registration is the process of geometric overlaying or alignment of two or more 2D/3D images of the same scene, taken at different time slots, from different angles, and/or by different acquisition systems. In medical practice, it is becoming increasingly important in diagnosis, treatment planning, functional studies, computer-guided therapies, and in biomedical research. Technically, image registration implies a complex optimization of different parameters, performed at local and/or global levels. Local optimization methods frequently fail because functions of the involved metrics with respect to transformation parameters are generally nonconvex and irregular. Therefore, global methods are often required, at least at the beginning of the procedure. In this paper, a new evolutionary and bio-inspired approach -- bacterial foraging optimization -- is adapted for single-slice to 3-D PET and CT multimodal image registration. Preliminary results of optimizing the normalized mutual information similarity metric validated the efficacy of the proposed method by using a freely available medical image database.
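
    The registration itself is beyond a short example, but the similarity metric named in the record, normalized mutual information, is easy to sketch. The bin count and the toy images below are arbitrary choices, and the bacterial foraging search is omitted.

    ```python
    import numpy as np

    def normalized_mutual_information(a, b, bins=32):
        """NMI(A, B) = (H(A) + H(B)) / H(A, B), computed from a joint histogram.
        A common similarity metric for multimodal registration; it grows as the
        images become better aligned (2 for identical images, ~1 if independent)."""
        joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1)
        py = pxy.sum(axis=0)

        def entropy(p):
            p = p[p > 0]
            return -np.sum(p * np.log(p))

        return (entropy(px) + entropy(py)) / entropy(pxy.ravel())

    rng = np.random.default_rng(2)
    img = rng.random((128, 128))
    print("aligned:  ", round(normalized_mutual_information(img, img), 3))
    print("scrambled:", round(normalized_mutual_information(img, rng.permutation(img.ravel())), 3))
    ```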

  4. Living renal donors: optimizing the imaging strategy--decision- and cost-effectiveness analysis

    NARCIS (Netherlands)

    Y.S. Liem (Ylian Serina); M.C.J.M. Kock (Marc); W. Weimar (Willem); K. Visser (Karen); M.G.M. Hunink (Myriam); J.N.M. IJzermans (Jan)

    2003-01-01

    PURPOSE: To determine the most cost-effective strategy for preoperative imaging performed in potential living renal donors. MATERIALS AND METHODS: In a decision-analytic model, the societal cost-effectiveness of digital subtraction angiography (DSA), gadolinium-enhanced

  5. Optimal strategies for observation of active galactic nuclei variability with Imaging Atmospheric Cherenkov Telescopes

    CERN Document Server

    Giomi, Matteo; Maier, Gernot

    2016-01-01

    Variable emission is one of the defining characteristics of active galactic nuclei (AGN). While providing precious information on the nature and physics of the sources, variability is often challenging to observe with time- and field-of-view-limited astronomical observatories such as Imaging Atmospheric Cherenkov Telescopes (IACTs). In this work, we address two questions relevant for the observation of sources characterized by AGN-like variability: what is the most time-efficient way to detect such sources, and what observational bias can be introduced by the choice of observing strategy when conducting blind surveys of the sky. Different observing strategies are evaluated using simulated light curves and realistic instrument response functions of the Cherenkov Telescope Array (CTA), a future gamma-ray observatory. We show that strategies that make use of very small observing windows, spread over large periods of time, allow for a faster detection of the source, and are less influenced by the...

  6. Optimal strategies for observation of active galactic nuclei variability with Imaging Atmospheric Cherenkov Telescopes

    Science.gov (United States)

    Giomi, Matteo; Gerard, Lucie; Maier, Gernot

    2016-07-01

    Variable emission is one of the defining characteristics of active galactic nuclei (AGN). While providing precious information on the nature and physics of the sources, variability is often challenging to observe with time- and field-of-view-limited astronomical observatories such as Imaging Atmospheric Cherenkov Telescopes (IACTs). In this work, we address two questions relevant for the observation of sources characterized by AGN-like variability: what is the most time-efficient way to detect such sources, and what observational bias can be introduced by the choice of observing strategy when conducting blind surveys of the sky. Different observing strategies are evaluated using simulated light curves and realistic instrument response functions of the Cherenkov Telescope Array (CTA), a future gamma-ray observatory. We show that strategies that make use of very small observing windows, spread over large periods of time, allow for a faster detection of the source and are less influenced by the variability properties of the sources, as compared to strategies that concentrate the observing time in a small number of large observing windows. Although derived using CTA as an example, our conclusions are conceptually valid for any IACT facility and, in general, for all observatories with a small field of view and limited duty cycle.

  7. Optimal Strategy in Basketball

    CERN Document Server

    Skinner, Brian

    2015-01-01

    This book chapter reviews some of the major principles associated with optimal strategy in basketball. In particular, we consider the principles of allocative efficiency (optimal allocation of shots between offensive options), dynamic efficiency (optimal shot selection in the face of pressure from the shot clock), and the risk/reward tradeoff (strategic manipulation of outcome variance). For each principle, we provide a simple example of a strategic problem and show how it can be described analytically. We then review general analytical results and provide an overview of existing statistical studies. A number of open challenges in basketball analysis are highlighted.

  8. Optimal GENCO bidding strategy

    Science.gov (United States)

    Gao, Feng

    Electricity industries worldwide are undergoing a period of profound upheaval. The conventional vertically integrated mechanism is being replaced by a competitive market environment. Generation companies have incentives to apply novel technologies to lower production costs, for example: Combined Cycle units. Economic dispatch with Combined Cycle units becomes a non-convex optimization problem, which is difficult if not impossible to solve by conventional methods. Several techniques are proposed here: Mixed Integer Linear Programming, a hybrid method, as well as Evolutionary Algorithms. Evolutionary Algorithms share a common mechanism, stochastic searching per generation. The stochastic property makes evolutionary algorithms robust and adaptive enough to solve a non-convex optimization problem. This research implements GA, EP, and PS algorithms for economic dispatch with Combined Cycle units, and makes a comparison with classical Mixed Integer Linear Programming. The electricity market equilibrium model not only helps Independent System Operator/Regulator analyze market performance and market power, but also provides Market Participants the ability to build optimal bidding strategies based on Microeconomics analysis. Supply Function Equilibrium (SFE) is attractive compared to traditional models. This research identifies a proper SFE model, which can be applied to a multiple period situation. The equilibrium condition using discrete time optimal control is then developed for fuel resource constraints. Finally, the research discusses the issues of multiple equilibria and mixed strategies, which are caused by the transmission network. Additionally, an advantage of the proposed model for merchant transmission planning is discussed. A market simulator is a valuable training and evaluation tool to assist sellers, buyers, and regulators to understand market performance and make better decisions. A traditional optimization model may not be enough to consider the distributed

  9. Optimal imaging strategy for community-acquired Staphylococcus aureus musculoskeletal infections in children

    Energy Technology Data Exchange (ETDEWEB)

    Browne, Lorna P.; Cassady, Christopher I.; Krishnamurthy, Rajesh; Guillerman, R.P. [Texas Children's Hospital, Edward B. Singleton Department of Diagnostic Imaging, Houston, TX (United States); Mason, Edward O.; Kaplan, Sheldon L. [Texas Children's Hospital, Department of Pediatrics, Baylor College of Medicine, Infectious Disease Service, Houston, TX (United States)

    2008-08-15

    Invasive musculoskeletal infections from community-acquired methicillin-resistant and methicillin-susceptible Staphylococcus aureus (CA-SA) are increasingly encountered in children. Imaging is frequently requested in these children for diagnosis and planning of therapeutic interventions. To appraise the diagnostic efficacy of imaging practices performed for CA-SA osteomyelitis and its complications. A retrospective review was conducted of the clinical charts and imaging studies of CA-SA osteomyelitis cases since 2001 at a large children's hospital. Of 199 children diagnosed with CA-SA osteomyelitis, 160 underwent MRI examination and 35 underwent bone scintigraphy. The sensitivity of MRI and bone scintigraphy for CA-SA osteomyelitis was 98% and 53%, respectively. In all discordant cases, MRI was correct compared to bone scintigraphy. Extraosseous complications of CA-SA osteomyelitis detected only by MRI included subperiosteal abscesses (n = 77), pyomyositis (n = 43), septic arthritis (n = 31), and deep venous thrombosis (n = 12). MRI is the preferred imaging modality for the investigation of pediatric CA-SA musculoskeletal infection because it offers superior sensitivity for osteomyelitis compared to bone scintigraphy and detects extraosseous complications that occur in a substantial proportion of patients. (orig.)

  10. Evolution Strategies in Optimization Problems

    CERN Document Server

    Cruz, Pedro A F

    2007-01-01

    Evolution Strategies are inspired by biology and are part of a larger research field known as Evolutionary Algorithms. These strategies perform a random search in the space of admissible functions, aiming to optimize some given objective function. We show that simple evolution strategies are a useful tool in optimal control, allowing us to obtain, in an efficient way, good approximations to the solutions of some recent and challenging optimal control problems.
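
    As a concrete illustration of the kind of simple evolution strategy the record refers to, here is a (1+1)-ES with a success-based step-size rule (in the spirit of the 1/5th rule) on a sphere function. The step-size factors and the test function are illustrative choices, not taken from the paper.

    ```python
    import numpy as np

    def one_plus_one_es(f, x0, sigma=0.5, iters=2000, seed=0):
        """Minimal (1+1) evolution strategy: one parent, one Gaussian-mutated
        offspring per generation, with multiplicative step-size adaptation."""
        rng = np.random.default_rng(seed)
        x, fx = np.asarray(x0, dtype=float), f(x0)
        for _ in range(iters):
            y = x + sigma * rng.standard_normal(x.size)
            fy = f(y)
            if fy <= fx:                 # offspring wins: accept and grow the step
                x, fx, sigma = y, fy, sigma * 1.1
            else:                        # offspring loses: shrink the step
                sigma *= 0.9
        return x, fx

    sphere = lambda v: float(np.sum(np.square(v)))
    x_best, f_best = one_plus_one_es(sphere, x0=np.full(5, 3.0))
    print("best f:", round(f_best, 8))
    ```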

  11. Optimising Optimal Image Subtraction

    CERN Document Server

    Israel, Holger; Hessman, Frederic V.; Schuh, Sonja

    2006-01-01

    Difference imaging is a technique for obtaining precise relative photometry of variable sources in crowded stellar fields and, as such, constitutes a crucial part of the data reduction pipeline in surveys for microlensing events or transiting extrasolar planets. The Optimal Image Subtraction (OIS) algorithm permits the accurate differencing of images by determining convolution kernels which, when applied to reference images of particularly good quality, provide excellent matches to the point-spread functions (PSF) in other images of the time series to be analysed. The convolution kernels are built as linear combinations of a set of basis functions, conventionally bivariate Gaussians modulated by polynomials. The kernel parameters must be supplied by the user and should ideally be matched to the PSF, pixel-sampling, and S/N of the data to be analysed. We have studied the outcome of the reduction as a function of the kernel parameters using our implementation of OIS within the TRIPP package. From the analysis o...
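
    The core linear step of OIS, fitting kernel coefficients over a basis of polynomial-modulated Gaussians by least squares, can be sketched as follows. This is not the TRIPP implementation; the basis sizes, widths, and synthetic test images are assumptions for illustration.

    ```python
    import numpy as np
    from scipy.signal import fftconvolve

    def gaussian_poly_basis(size=11, sigmas=(1.0, 2.0), orders=(0, 1)):
        """Kernel basis functions: bivariate Gaussians modulated by x^i * y^j."""
        r = np.arange(size) - size // 2
        X, Y = np.meshgrid(r, r)
        basis = []
        for s in sigmas:
            g = np.exp(-(X**2 + Y**2) / (2 * s**2))
            for i in orders:
                for j in orders:
                    basis.append(g * (X**i) * (Y**j))
        return basis

    def fit_kernel(ref, target, basis):
        """Solve for coefficients a so that sum_k a_k * (ref conv B_k) ~ target
        in the least-squares sense (the linear step at the heart of OIS)."""
        cols = [fftconvolve(ref, B, mode="same").ravel() for B in basis]
        A = np.column_stack(cols)
        coeffs, *_ = np.linalg.lstsq(A, target.ravel(), rcond=None)
        kernel = sum(a * B for a, B in zip(coeffs, basis))
        return kernel, coeffs

    # Toy test: the "bad seeing" image is the reference blurred by a wider Gaussian.
    rng = np.random.default_rng(3)
    ref = rng.random((64, 64))
    true_kernel = gaussian_poly_basis(size=11, sigmas=(1.5,), orders=(0,))[0]
    true_kernel /= true_kernel.sum()
    target = fftconvolve(ref, true_kernel, mode="same")

    kernel, _ = fit_kernel(ref, target, gaussian_poly_basis())
    resid = fftconvolve(ref, kernel, mode="same") - target
    print("rms residual:", float(np.sqrt(np.mean(resid**2))))
    ```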

  12. Optimal temporal windows and dose-reducing strategy for coronary artery bypass graft imaging with 256-slice CT

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Kun-Mu [Department of Radiology, Shin Kong Wu Ho-Su Memorial Hospital, 95 Wen Chang Road, Shih Lin District, Taipei 111, Taiwan. (China); Lee, Yi-Wei [Department of Radiology, Kaohsiung Chang Gung Memorial Hospital and Chang Gung University College of Medicine, Kaohsiung, Taiwan (China); Department of Biomedical Imaging and Radiological Sciences, National Yang Ming University, Taipei, Taiwan (China); Guan, Yu-Xiang [Department of Biomedical Imaging and Radiological Sciences, National Yang Ming University, Taipei, Taiwan (China); Chen, Liang-Kuang [Department of Radiology, Shin Kong Wu Ho-Su Memorial Hospital, 95 Wen Chang Road, Shih Lin District, Taipei 111, Taiwan. (China); School of Medicine, Fu Jen Catholic University, Taipei, Taiwan (China); Law, Wei-Yip, E-mail: m002325@ms.skh.org.tw [Department of Radiology, Shin Kong Wu Ho-Su Memorial Hospital, 95 Wen Chang Road, Shih Lin District, Taipei 111, Taiwan. (China); Su, Chen-Tau, E-mail: m005531@ms.skh.org.tw [Department of Radiology, Shin Kong Wu Ho-Su Memorial Hospital, 95 Wen Chang Road, Shih Lin District, Taipei 111, Taiwan. (China); School of Medicine, Fu Jen Catholic University, Taipei, Taiwan (China)

    2013-12-11

    Objective: To determine the optimal image reconstruction windows in the assessment of coronary artery bypass grafts (CABGs) with 256-slice computed tomography (CT), and to assess their associated optimal pulsing windows for electrocardiogram-triggered tube current modulation (ETCM). Methods: We recruited 18 patients (three female; mean age 68.9 years) with a mean heart rate (HR) of 66.3 beats per minute (bpm) and a heart rate variability of 1.3 bpm for this study. A total of 36 CABGs with 168 segments were evaluated, including 12 internal mammary artery grafts (33.3%) and 24 saphenous vein grafts (66.7%). We reconstructed 20 data sets in 5% steps through 0–95% of the R–R interval. The image quality of the CABGs was assessed on a 5-point scale (1=excellent to 5=non-diagnostic) for each segment (proximal anastomosis, proximal, middle, and distal course of the graft body, and distal anastomosis). Two reviewers determined the optimal reconstruction intervals for each CABG segment in each temporal window. Optimal windows for ETCM were also evaluated. Results: The optimal systolic and diastolic reconstruction intervals could be divided into two groups with a threshold of HR=68. The best reconstruction intervals for the low heart rate (HR<68) and high heart rate (HR>68) groups were 76.0±2.5% and 45.0±0%, respectively. Average image quality scores were 1.7±0.6 with good inter-observer agreement (Kappa=0.79). Image quality was significantly better for saphenous vein grafts than for arterial grafts (P<0.001). The recommended ETCM windows for the low-HR, high-HR and all-HR groups were 40–50%, 71–81% and 40–96% of the R-R interval, respectively. The corresponding dose savings were about 60.8%, 58.7% and 22.7%, in that order. Conclusions: We determined optimal reconstruction intervals and ETCM windows representing a good compromise between radiation dose and image quality for the follow-up of bypass surgery using 256-slice CT.

  13. Optimal Strategy and Business Models

    DEFF Research Database (Denmark)

    Johnson, Peter; Foss, Nicolai Juul

    2016-01-01

    This study picks up on earlier suggestions that control theory may further the study of strategy. Strategy can be formally interpreted as an idealized path optimizing heterogeneous resource deployment to produce maximum financial gain. Using standard matrix methods to describe the firm Hamiltonian, it is possible to formalize useful notions of a business model, resources, and competitive advantage. The business model that underpins strategy may be seen as a set of constraints on resources that can be interpreted as controls in optimal control theory. Strategy then might be considered to be the control variable of firm path, suggesting in turn that the firm's business model is the codification of the application of investment resources used to control the strategic path of value realization.

  14. Optimal strategies for throwing accurately.

    Science.gov (United States)

    Venkadesan, M; Mahadevan, L

    2017-04-01

    The accuracy of throwing in games and sports is governed by how errors in planning and initial conditions are propagated by the dynamics of the projectile. In the simplest setting, the projectile path is typically described by a deterministic parabolic trajectory which has the potential to amplify noisy launch conditions. By analysing how parabolic trajectories propagate errors, we show how to devise optimal strategies for a throwing task demanding accuracy. Our calculations explain observed speed-accuracy trade-offs, preferred throwing style of overarm versus underarm, and strategies for games such as dart throwing, despite having left out most biological complexities. As our criteria for optimal performance depend on the target location, shape and the level of uncertainty in planning, they also naturally suggest an iterative scheme to learn throwing strategies by trial and error.
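
    A drag-free, ground-level version of the error-propagation argument can be written in a few lines: compute the landing range, estimate the Jacobian with respect to launch speed and angle numerically, and compare its norm (the largest singular value of a 1x2 Jacobian) for the flat and lofted throws that reach the same target. The speeds, target distance, and tolerances below are illustrative, not values from the paper.

    ```python
    import numpy as np

    G = 9.81  # m/s^2

    def landing_distance(v, theta):
        """Range of a point projectile launched from ground level (no drag)."""
        return v**2 * np.sin(2 * theta) / G

    def error_amplification(v, theta, eps=1e-6):
        """Norm of the 1x2 Jacobian d(range)/d(v, theta), i.e. how strongly small
        launch errors are amplified at landing (equals its largest singular value)."""
        dR_dv = (landing_distance(v + eps, theta) - landing_distance(v - eps, theta)) / (2 * eps)
        dR_dth = (landing_distance(v, theta + eps) - landing_distance(v, theta - eps)) / (2 * eps)
        return np.hypot(dR_dv, dR_dth)

    # For a 10 m target, compare the two launch angles that reach it.
    target, v = 10.0, 12.0
    thetas = np.linspace(0.01, np.pi / 2 - 0.01, 2000)
    feasible = thetas[np.abs(landing_distance(v, thetas) - target) < 0.02]
    for th in (feasible.min(), feasible.max()):   # flat vs lofted throw
        print(f"theta={np.degrees(th):5.1f} deg  amplification={error_amplification(v, th):.2f}")
    ```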

  15. Particle swarm optimization based optimal bidding strategy in an ...

    African Journals Online (AJOL)

    user

    Particle swarm optimization based optimal bidding strategy in an open ... relaxation-based approach for strategic bidding in England-Wales pool type electricity market has ... presents the mathematical formulation of optimal bidding problem.

  16. Optimal intervention strategies for tuberculosis

    Science.gov (United States)

    Bowong, Samuel; Aziz Alaoui, A. M.

    2013-06-01

    This paper deals with the problem of optimal control of a deterministic model of tuberculosis (abbreviated as TB for tubercle bacillus). We first present and analyze an uncontrolled tuberculosis model which incorporates the essential biological and epidemiological features of the disease. The model is shown to exhibit the phenomenon of backward bifurcation, where a stable disease-free equilibrium co-exists with one or more stable endemic equilibria when the associated basic reproduction number is less than unity. Based on this continuous model, tuberculosis control is formulated and solved as an optimal control problem, indicating how control terms on chemoprophylaxis and detection should be introduced in the population to reduce the number of individuals with active TB. The results provide a framework for designing cost-effective strategies for TB with two intervention methods.

  17. Hedging Strategies for Bayesian Optimization

    CERN Document Server

    Brochu, Eric; de Freitas, Nando

    2010-01-01

    Bayesian optimization with Gaussian processes has become an increasingly popular tool in the machine learning community. It is efficient and can be used when very little is known about the objective function, making it popular in expensive black-box optimization scenarios. It is able to do this by sampling the objective using an acquisition function which incorporates the model's estimate of the objective and the uncertainty at any given point. However, there are several different parameterized acquisition functions in the literature, and it is often unclear which one to use. Instead of using a single acquisition function, we adopt a portfolio of acquisition functions governed by an online multi-armed bandit strategy. We describe the method, which we call GP-Hedge, and show that this method almost always outperforms the best individual acquisition function.
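
    GP-Hedge proper runs a Gaussian-process model with a portfolio of acquisition functions such as PI, EI, and UCB. The toy below keeps only the Hedge selection layer, replacing the acquisition functions with hand-rolled proposal heuristics and rewarding each portfolio member by the objective value it attains; all function names and constants are assumptions for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    f = lambda x: -(x - 0.3) ** 2          # black-box objective to maximize on [0, 1]

    def exploit(best_x):  return np.clip(best_x + 0.02 * rng.standard_normal(), 0, 1)
    def explore(best_x):  return rng.random()
    def mixed(best_x):    return np.clip(best_x + 0.2 * rng.standard_normal(), 0, 1)

    portfolio = [exploit, explore, mixed]   # stand-ins for acquisition functions
    gains = np.zeros(len(portfolio))
    eta = 1.0                               # Hedge learning rate
    best_x, best_f = rng.random(), -np.inf

    for t in range(200):
        # Sample one member of the portfolio with probability softmax(eta * gains).
        p = np.exp(eta * (gains - gains.max()))
        p /= p.sum()
        k = rng.choice(len(portfolio), p=p)
        x = portfolio[k](best_x)
        y = f(x)
        gains[k] += y                       # reward the member that was used
        if y > best_f:
            best_x, best_f = x, y

    print("best x:", round(best_x, 3), " selection probs:", np.round(p, 2))
    ```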

  18. Mixed integer evolution strategies for parameter optimization.

    Science.gov (United States)

    Li, Rui; Emmerich, Michael T M; Eggermont, Jeroen; Bäck, Thomas; Schütz, M; Dijkstra, J; Reiber, J H C

    2013-01-01

    Evolution strategies (ESs) are powerful probabilistic search and optimization algorithms gleaned from biological evolution theory. They have been successfully applied to a wide range of real world applications. The modern ESs are mainly designed for solving continuous parameter optimization problems. Their ability to adapt the parameters of the multivariate normal distribution used for mutation during the optimization run makes them well suited for this domain. In this article we describe and study mixed integer evolution strategies (MIES), which are natural extensions of ES for mixed integer optimization problems. MIES can deal with parameter vectors consisting not only of continuous variables but also with nominal discrete and integer variables. Following the design principles of the canonical evolution strategies, they use specialized mutation operators tailored for the aforementioned mixed parameter classes. For each type of variable, the choice of mutation operators is governed by a natural metric for this variable type, maximal entropy, and symmetry considerations. All distributions used for mutation can be controlled in their shape by means of scaling parameters, allowing self-adaptation to be implemented. After introducing and motivating the conceptual design of the MIES, we study the optimality of the self-adaptation of step sizes and mutation rates on a generalized (weighted) sphere model. Moreover, we prove global convergence of the MIES on a very general class of problems. The remainder of the article is devoted to performance studies on artificial landscapes (barrier functions and mixed integer NK landscapes), and a case study in the optimization of medical image analysis systems. In addition, we show that with proper constraint handling techniques, MIES can also be applied to classical mixed integer nonlinear programming problems.
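
    The type-specific mutation operators at the heart of MIES can be sketched as follows. The exact parameterization of the integer step (difference of two geometric variables) and the nominal reset probability follow the MIES literature only loosely and are meant as illustration, not as the authors' implementation.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    def mutate_continuous(x, sigma):
        """Gaussian mutation, the standard ES operator for real-valued variables."""
        return x + sigma * rng.standard_normal()

    def mutate_integer(z, s):
        """Integer mutation as the difference of two geometric variables, giving a
        symmetric step distribution on the integers with mean step size ~ s."""
        p = 1.0 - s / (1.0 + np.sqrt(1.0 + s**2))
        g1 = rng.geometric(p) - 1
        g2 = rng.geometric(p) - 1
        return z + g1 - g2

    def mutate_nominal(d, domain, p_mut):
        """Nominal discrete mutation: with probability p_mut, resample uniformly."""
        if rng.random() < p_mut:
            return rng.choice(domain)
        return d

    # One mixed-integer individual: (real, integer, nominal)
    x, z, d = 0.7, 12, "linear"
    print(mutate_continuous(x, sigma=0.1),
          mutate_integer(z, s=2.0),
          mutate_nominal(d, domain=["linear", "cubic", "rbf"], p_mut=0.3))
    ```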

  19. Color Strategies for Image Databases

    OpenAIRE

    Süsstrunk, Sabine

    2001-01-01

    In this paper, color encoding strategies for different image database applications are discussed. The color image workflow is examined in detail, and master and derivative file encoding strategies are outlined in relation to capture, maintenance, and deployment of image files. For the most common image database purposes, recommendations are given as to which type of color encoding is most suitable. Advantages and disadvantages of sensor, input-referred, output-referred, and output device spec...

  20. Optimization of BEV Charging Strategy

    Science.gov (United States)

    Ji, Wei

    This paper presents different approaches to optimizing the fast-charging and workplace-charging strategies of battery electric vehicle (BEV) drivers. For the fast-charging analysis, a rule-based model was built to simulate BEV charging behavior. Monte Carlo analysis was performed to explore the potential range of congestion at fast charging stations, which could exceed four hours at the most crowded stations. A genetic algorithm was used to explore the theoretical minimum waiting time at fast charging stations, which can reduce the waiting time at the most crowded stations to under one hour. A deterministic approach was proposed as a feasible suggestion: drivers should consider fast charging when the state of charge approaches 40 miles of remaining range. This suggestion is intended to help minimize potential congestion at fast charging stations. For the workplace-charging analysis, scenario analysis was performed to simulate the temporal distribution of charging demand under different workplace charging strategies. It was found that if BEV drivers charge as much as possible and as late as possible at the workplace, this could increase the utilization of solar-generated electricity while relieving the grid stress from the extra, intensive electricity demand at night caused by charging electric vehicles at home.

  1. Topological Optimization of Artificial Microstructure Strategies

    Science.gov (United States)

    2015-04-02

    Topographic Optimization Through Artificial Microstructure Strategies (Yale and Johns Hopkins). During this project, as part of DARPA MCMA, we aimed to develop and demonstrate a 3D microstructural

  2. Optimal Investment Strategy for Risky Assets

    OpenAIRE

    Sergei Maslov; Yi-Cheng Zhang

    1998-01-01

    We design an optimal strategy for investment in a portfolio of assets subject to a multiplicative Brownian motion. The strategy provides the maximal typical long-term growth rate of investor's capital. We determine the optimal fraction of capital that an investor should keep in risky assets as well as weights of different assets in an optimal portfolio. In this approach both average return and volatility of an asset are relevant indicators determining its optimal weight. Our results are parti...

  3. Optimal network proxy caching for image-rich contents

    Science.gov (United States)

    Yang, Xuguang; Ramchandran, Kannan

    1999-12-01

    This paper addresses optimizing cache allocation in a distributed image database system over computer networks. We consider progressive image file formats, and 'soft' caching strategies, in which each image is allocated a variable amount of cache memory, in an effort to minimize the expected image transmission delay time. A simple and efficient optimization algorithm is proposed, and is generalized to include multiple proxies in a network scenario. With optimality proven, our algorithms are surprisingly simple, and are based on sorting the images according to a special priority index. We also present an adaptive cache allocation/replacement strategy that can be incorporated into web browsers with little computational overhead. Simulation results are presented.
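
    The priority-index sorting described above suggests a simple greedy picture: rank images by a priority index and fill the cache, letting the last image keep only a prefix of its progressive file. The popularity-per-kilobyte index and the toy catalog below are stand-ins, not the paper's exact index or delay model.

    ```python
    from dataclasses import dataclass

    @dataclass
    class Image:
        name: str
        size_kb: float        # full progressive file size
        popularity: float     # request probability

    def soft_cache_allocation(images, cache_kb):
        """Greedy 'soft' allocation: rank images by a simple priority index
        (popularity per kilobyte) and fill the cache, allowing a partial prefix
        of a progressive file for the image that no longer fits whole."""
        ranked = sorted(images, key=lambda im: im.popularity / im.size_kb, reverse=True)
        allocation, remaining = {}, cache_kb
        for im in ranked:
            granted = min(im.size_kb, remaining)
            allocation[im.name] = granted
            remaining -= granted
            if remaining <= 0:
                break
        return allocation

    catalog = [Image("map.jp2", 900, 0.50), Image("logo.jp2", 120, 0.30),
               Image("photo.jp2", 2400, 0.15), Image("chart.jp2", 400, 0.05)]
    print(soft_cache_allocation(catalog, cache_kb=1500))
    ```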

  4. Optimizing Ballistic Imaging Operations.

    Science.gov (United States)

    Wang, Can; Beggs-Cassin, Mardy; Wein, Lawrence M

    2017-04-03

    Ballistic imaging systems can help solve crimes by comparing images of cartridge cases, which are recovered from a crime scene or test-fired from a gun, to a database of images obtained from past crime scenes. Many U.S. municipalities lack the resources to process all of their cartridge cases. Using data from Stockton, CA, we analyze two problems: how to allocate limited capacity to maximize the number of cartridge cases that generate at least one hit, and how to prioritize the cartridge cases that are processed to maximize the usefulness (i.e., obtained before the corresponding criminal case is closed) of hits. The number of hits can be significantly increased by prioritizing crime scene evidence over test-fires, and by ranking calibers by their hit probability and processing only the higher ranking calibers. We also estimate that last-come first-served increases the proportion of hits that are useful by only 0.05 relative to first-come first-served.

  5. Enhanced Ocean Predictability Through Optimal Observing Strategies

    Science.gov (United States)

    2016-06-14

    Enhanced Ocean Predictability Through Optimal Observing Strategies. A. D. Kirwan, Jr., College of Marine Studies, University of Delaware, Robinson Hall... observation strategies that will maximize the capacity to predict mesoscale and submesoscale conditions so as to provide the best possible nowcasts and... systems approaches to developing optimal observing strategies. The common thread linking both approaches is Lagrangian data, so this phase of the work

  6. Determining an optimal supply chain strategy

    Directory of Open Access Journals (Sweden)

    Intaher M. Ambe

    2012-11-01

    In today’s business environment, many companies want to become efficient and flexible, but have struggled, in part, because they have not been able to formulate optimal supply chain strategies. Often this is as a result of insufficient knowledge about the costs involved in maintaining supply chains and the impact of the supply chain on their operations. Hence, these companies find it difficult to manufacture at a competitive cost and respond quickly and reliably to market demand. Mismatched strategies are the root cause of the problems that plague supply chains, and supply-chain strategies based on a one-size-fits-all strategy often fail. The purpose of this article is to suggest instruments to determine an optimal supply chain strategy. This article, which is conceptual in nature, provides a review of current supply chain strategies and suggests a framework for determining an optimal strategy.

  7. Optimality criteria solution strategies in multiple constraint design optimization

    Science.gov (United States)

    Levy, R.; Parzynski, W.

    1981-01-01

    Procedures and solution strategies are described to solve the conventional structural optimization problem using the Lagrange multiplier technique. The multipliers, obtained through solution of an auxiliary nonlinear optimization problem, lead to optimality criteria to determine the design variables. It is shown that this procedure is essentially equivalent to an alternative formulation using a dual method Lagrangian function objective. Although the mathematical formulations are straightforward, successful applications and computational efficiency depend upon execution procedure strategies. Strategies examined, with application examples, include selection of active constraints, move limits, line search procedures, and side constraint boundaries.

  8. Linear Tabling Strategies and Optimizations

    CERN Document Server

    Zhou, Neng-Fa; Shen, Yi-Dong

    2007-01-01

    Recently, the iterative approach named linear tabling has received considerable attention because of its simplicity, ease of implementation, and good space efficiency. Linear tabling is a framework from which different methods can be derived based on the strategies used in handling looping subgoals. One decision concerns when answers are consumed and returned. This paper describes two strategies, namely, lazy and eager strategies, and compares them both qualitatively and quantitatively. The results indicate that, while the lazy strategy has good locality and is well suited for finding all solutions, the eager strategy is comparable in speed with the lazy strategy and is well suited for programs with cuts. Linear tabling relies on depth-first iterative deepening rather than suspension to compute fixpoints. Each cluster of inter-dependent subgoals as represented by a top-most looping subgoal is iteratively evaluated until no subgoal in it can produce any new answers. Naive re-evaluation of all loopi...

  9. Multiobjective Optimization Based Vessel Collision Avoidance Strategy Optimization

    Directory of Open Access Journals (Sweden)

    Qingyang Xu

    2014-01-01

    Vessel collision accidents cause great loss of life and property. In order to reduce human error and greatly improve the safety of marine traffic, collision avoidance strategy optimization is proposed. In this paper, the multiobjective optimization algorithm NSGA-II is adopted to search for the optimal collision avoidance strategy, considering both the safety and the economy elements of collision avoidance. Ship domain and Arena are used to evaluate the collision risk in the simulation. Based on the optimization, an optimal rudder angle is recommended to the navigator for collision avoidance. In the simulation example, a crossing encounter situation is simulated, and NSGA-II searches for the optimal collision avoidance operation under the Convention on the International Regulations for Preventing Collisions at Sea (COLREGS). The simulation studies exhibit the validity of the method.
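
    The ranking step at the core of NSGA-II, fast non-dominated sorting of candidate manoeuvres by (collision risk, cost), can be sketched directly. Crowding distance and the genetic operators are omitted, and the candidate objective values below are made up for illustration.

    ```python
    def dominates(a, b):
        """Pareto dominance for minimization: a is no worse in every objective
        and strictly better in at least one."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def non_dominated_sort(points):
        """Split candidate solutions into successive Pareto fronts, the ranking
        step at the heart of NSGA-II."""
        remaining = dict(enumerate(points))
        fronts = []
        while remaining:
            front = [i for i in remaining
                     if not any(dominates(remaining[j], remaining[i])
                                for j in remaining if j != i)]
            fronts.append(front)
            for i in front:
                del remaining[i]
        return fronts

    # Objectives per candidate manoeuvre: (collision risk, economic cost)
    candidates = [(0.10, 8.0), (0.30, 3.0), (0.12, 7.5), (0.40, 2.5), (0.35, 4.0)]
    print(non_dominated_sort(candidates))
    ```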

  10. Developing & Optimizing a Logical Sourcing Strategy

    National Research Council Canada - National Science Library

    Lee S Scheible; Chris Bodurow; Karin Daun

    2015-01-01

      In order to optimize the benefit of the sourcing strategy and ensure delivery of the portfolio, a logical operational process flow must be developed and implemented consistently across all study...

  11. Image meshing via hierarchical optimization

    Institute of Scientific and Technical Information of China (English)

    Hao XIE; Ruo-feng TONG

    2016-01-01

    Vector graphics, as a kind of geometric representation of raster images, have many advantages, e.g., definition independence and editing facility. A popular way to convert raster images into vector graphics is image meshing, the aim of which is to find a mesh to represent an image as faithfully as possible. For traditional meshing algorithms, the crux of the problem resides mainly in the high non-linearity and non-smoothness of the objective, which makes it difficult to find a desirable optimal solution. To ameliorate this situation, we present a hierarchical optimization algorithm solving the problem from coarser levels to finer ones, providing initialization for each level with its coarser ascent. To further simplify the problem, the original non-convex problem is converted to a linear least squares one, and thus becomes convex, which makes the problem much easier to solve. A dictionary learning framework is used to combine geometry and topology elegantly. Then an alternating scheme is employed to solve both parts. Experiments show that our algorithm runs fast and achieves better results than existing ones for most images.

  12. Optimal strategies for flood prevention

    NARCIS (Netherlands)

    Eijgenraam, Carel; Brekelmans, Ruud; den Hertog, Dick; Roos, C.

    2016-01-01

    Flood prevention policy is of major importance to the Netherlands since a large part of the country is below sea level and high water levels in rivers may also cause floods. In this paper we propose a dike height optimization model to determine economically efficient flood protection standards. We i

  13. Efficient Computation of Optimal Trading Strategies

    CERN Document Server

    Boyarshinov, Victor

    2010-01-01

    Given the return series for a set of instruments, a trading strategy is a switching function that transfers wealth from one instrument to another at specified times. We present efficient algorithms for constructing (ex-post) trading strategies that are optimal with respect to the total return, the Sterling ratio and the Sharpe ratio. Such ex-post optimal strategies are useful analysis tools. They can be used to analyze the "profitability of a market" in terms of optimal trading; to develop benchmarks against which real trading can be compared; and, within an inductive framework, the optimal trades can be used to teach learning systems (predictors) which are then used to identify future trading opportunities.
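
    One of the ex-post problems described above, maximizing total return when switching between instruments carries a cost, has a simple dynamic-programming solution. The proportional switching fee and the random return series below are assumptions for illustration; the Sterling- and Sharpe-ratio variants from the paper are not covered.

    ```python
    import numpy as np

    def optimal_switching(returns, switch_cost=0.001):
        """Ex-post optimal total-return strategy over several instruments.
        returns: (T, n) per-period simple returns; switch_cost is a proportional
        fee charged whenever wealth moves between instruments. Dynamic program
        over log-wealth, with backtracking of the optimal switching sequence."""
        T, n = returns.shape
        log_r = np.log1p(returns)
        best = log_r[0].copy()                      # best log-wealth ending in each instrument
        back = np.zeros((T, n), dtype=int)
        for t in range(1, T):
            stay = best
            switch = best.max() + np.log1p(-switch_cost)
            prev = np.where(stay >= switch, np.arange(n), best.argmax())
            best = np.maximum(stay, switch) + log_r[t]
            back[t] = prev
        path = [int(best.argmax())]
        for t in range(T - 1, 0, -1):
            path.append(int(back[t, path[-1]]))
        return list(reversed(path)), float(np.expm1(best.max()))

    rng = np.random.default_rng(6)
    rets = rng.normal(0.0, 0.01, size=(20, 3))      # 20 periods, 3 instruments
    path, total = optimal_switching(rets)
    print("holdings per period:", path)
    print("ex-post total return:", round(total, 4))
    ```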

  14. Fuzzy entropy image segmentation based on particle Swarm optimization

    Institute of Scientific and Technical Information of China (English)

    Linyi Li; Deren Li

    2008-01-01

    Particle swarm optimization is a stochastic global optimization algorithm that is based on swarm intelligence. Because of its excellent performance, particle swarm optimization is introduced into fuzzy entropy image segmentation to select the optimal fuzzy parameter combination and fuzzy threshold adaptively. In this study, the particles in the swarm are constructed and the swarm search strategy is proposed to meet the needs of the segmentation application. Fuzzy entropy image segmentation based on particle swarm optimization is then implemented, and the proposed method obtains satisfactory results in the segmentation experiments. Compared with the exhaustive search method, particle swarm optimization can give the same optimal fuzzy parameter combination and fuzzy threshold while needing less search time in the segmentation experiments, and it also has good search stability in repeated experiments. Therefore, fuzzy entropy image segmentation based on particle swarm optimization is an efficient and promising segmentation method.
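
    A compact way to see what the swarm search replaces is to write out a fuzziness criterion and an exhaustive search over it. The sketch below uses a Huang-Wang style minimum-fuzziness criterion rather than the paper's exact fuzzy-entropy formulation; in the paper, PSO takes the place of the exhaustive loop over thresholds.

    ```python
    import numpy as np

    def shannon_fuzziness(mu):
        """De Luca-Termini measure: -mu*ln(mu) - (1-mu)*ln(1-mu), zero at mu in {0, 1}."""
        mu = np.clip(mu, 1e-12, 1 - 1e-12)
        return -(mu * np.log(mu) + (1 - mu) * np.log(1 - mu))

    def fuzzy_threshold(img, levels=256):
        """Exhaustive search for the gray level that minimizes total image
        fuzziness; a particle swarm would replace this loop in the paper."""
        hist, _ = np.histogram(img, bins=levels, range=(0, levels))
        grays = np.arange(levels)
        n, c = hist.sum(), levels - 1
        best_t, best_e = None, np.inf
        for t in range(1, levels - 1):
            w0, w1 = hist[:t].sum(), hist[t:].sum()
            if w0 == 0 or w1 == 0:
                continue
            m0 = (hist[:t] * grays[:t]).sum() / w0      # class means
            m1 = (hist[t:] * grays[t:]).sum() / w1
            mu = np.where(grays < t, 1 / (1 + np.abs(grays - m0) / c),
                                     1 / (1 + np.abs(grays - m1) / c))
            e = (hist * shannon_fuzziness(mu)).sum() / n
            if e < best_e:
                best_t, best_e = t, e
        return best_t

    rng = np.random.default_rng(7)
    img = np.concatenate([rng.normal(60, 10, 5000), rng.normal(170, 12, 5000)]).clip(0, 255)
    print("selected threshold:", fuzzy_threshold(img))
    ```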

  15. Optimal control of anthracnose using mixed strategies.

    Science.gov (United States)

    Fotsa Mbogne, David Jaures; Thron, Christopher

    2015-11-01

    In this paper we propose and study a spatial diffusion model for the control of anthracnose disease in a bounded domain. The model is a generalization of the one previously developed in [15]. We use the model to simulate two different types of control strategies against anthracnose disease. Strategies that employ chemical fungicides are modeled using a continuous control function; while strategies that rely on cultivational practices (such as pruning and removal of mummified fruits) are modeled with a control function which is discrete in time (though not in space). For comparative purposes, we perform our analyses for a spatially-averaged model as well as the space-dependent diffusion model. Under weak smoothness conditions on parameters we demonstrate the well-posedness of both models by verifying existence and uniqueness of the solution for the growth inhibition rate for given initial conditions. We also show that the set [0, 1] is positively invariant. We first study control by impulsive strategies, then analyze the simultaneous use of mixed continuous and pulse strategies. In each case we specify a cost functional to be minimized, and we demonstrate the existence of optimal control strategies. In the case of pulse-only strategies, we provide explicit algorithms for finding the optimal control strategies for both the spatially-averaged model and the space-dependent model. We verify the algorithms for both models via simulation, and discuss properties of the optimal solutions. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. Scanning strategies for imaging arrays

    CERN Document Server

    Kovács, A

    2008-01-01

    Large-format (sub)millimeter wavelength imaging arrays are best operated in scanning observing modes rather than traditional position-switched (chopped) modes. The choice of observing mode is critical for isolating source signals from various types of noise interference, especially for ground-based instrumentation operating under a bright atmosphere. Ideal observing strategies can combat 1/f noise, resist instrumental defects, sensitively recover emission on large scales, and provide an even field coverage -- all under feasible requirements of telescope movement. This work aims to guide the design of observing patterns that maximize scientific returns. It also compares some of the popular choices of observing modes for (sub)millimeter imaging, such as random, Lissajous, billiard, spiral, On-The-Fly (OTF), DREAM, chopped and stare patterns. Many of the conclusions are also applicable to other imaging applications and to imaging in one dimension (e.g. spectroscopic observations).
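
    Generating one of the scan patterns discussed in the record, a Lissajous track, takes only a few lines; the frequencies, amplitudes, and sampling rate below are illustrative rather than instrument-specific.

    ```python
    import numpy as np

    def lissajous_scan(duration_s, rate_hz, fx=0.11, fy=0.13, ax=120.0, ay=120.0):
        """Telescope offsets (arcsec) for a Lissajous scanning pattern. Choosing
        incommensurate x/y frequencies keeps the track from repeating, which
        spreads the coverage and moves source signal away from low frequencies."""
        t = np.arange(0, duration_s, 1.0 / rate_hz)
        x = ax * np.sin(2 * np.pi * fx * t)
        y = ay * np.sin(2 * np.pi * fy * t + np.pi / 2)
        return t, x, y

    t, x, y = lissajous_scan(duration_s=600, rate_hz=10)
    speed = np.hypot(np.diff(x), np.diff(y)) / np.diff(t)
    print(f"samples: {t.size}, peak scan speed: {speed.max():.1f} arcsec/s")
    ```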

  17. TU-G-204-04: A Unified Strategy for Bi-Factorial Optimization of Radiation Dose and Contrast Dose in CT Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Sahbaee, P; Zhang, Y; Solomon, J; Becchetti, M; Segars, P; Samei, E [Duke University Medical Center, Durham, NC (United States)

    2015-06-15

    Purpose: To substantiate the interdependency of contrast dose, radiation dose, and image quality in CT, towards patient-specific optimization of imaging protocols. Methods: The study deployed two phantom platforms. A variable-sized (12, 18, 23, 30, 37 cm) phantom (Mercury-3.0) containing an iodinated insert (8.5 mgI/ml) was imaged on a representative CT scanner at multiple CTDI values (0.7–22.6 mGy). The contrast and noise were measured from the reconstructed images for each phantom diameter. Contrast-to-noise ratios (CNR), which are linearly related to iodine concentration, were calculated for 16 iodine-concentration levels (0–8.5 mgI/ml). The analysis was extended to a recently developed suite of 58 virtual human models (5D XCAT) with added contrast dynamics. Emulating a contrast-enhanced abdominal imaging procedure and targeting a peak enhancement in the aorta, each XCAT phantom was “imaged” using a simulation platform (CatSim, GE). 3D surfaces for each patient/size established the relationship between iodine concentration, dose, and CNR. The ratios of change in iodine concentration versus dose (IDR) needed to yield a constant change in CNR were calculated for each patient size. Results: The Mercury phantom results show the size dependence of image quality on CTDI and iodine-concentration levels. For desired image-quality values, the iso-contour lines reflect the trade-off between contrast-material and radiation doses. For a fixed iodine concentration (4 mgI/mL), the IDR values for low (1.4 mGy) and high (11.5 mGy) dose levels were 1.02, 1.07, 1.19, 1.65, 1.54, and 3.14, 3.12, 3.52, 3.76, 4.06, respectively, across the five sizes. The simulation data from the XCAT models confirmed the empirical results from the Mercury phantom. Conclusion: Iodine concentration, image quality, and radiation dose are interdependent. Understanding the relationships between iodine concentration, image quality, and radiation dose will allow for a more comprehensive optimization of CT imaging devices and techniques.

  18. Optimal strategies for throwing accurately

    CERN Document Server

    Venkadesan, Madhusudhan

    2010-01-01

    Accuracy of throwing in games and sports is governed by how errors at projectile release are propagated by flight dynamics. To address the question of what governs the choice of throwing strategy, we use a simple model of throwing with an arm modelled as a hinged bar of fixed length that can release a projectile at any angle and angular velocity. We show that the amplification of deviations in launch parameters from a one parameter family of solution curves is quantified by the largest singular value of an appropriate Jacobian. This allows us to predict a preferred throwing style in terms of this singular value, which itself depends on target location and the target shape. Our analysis also allows us to characterize the trade-off between speed and accuracy despite not including any effects of signal-dependent noise. Using nonlinear calculations for propagating finite input-noise, we find that an underarm throw to a target leads to an undershoot, but an overarm throw does not. Finally, we consider the limit of...

  19. Optimal experimental design strategies for detecting hormesis.

    Science.gov (United States)

    Dette, Holger; Pepelyshev, Andrey; Wong, Weng Kee

    2011-12-01

    Hormesis is a widely observed phenomenon in many branches of life sciences, ranging from toxicology studies to agronomy, with obvious public health and risk assessment implications. We address optimal experimental design strategies for determining the presence of hormesis in a controlled environment using the recently proposed Hunt-Bowman model. We propose alternative models that have an implicit hormetic threshold, discuss their advantages over current models, and construct and study properties of optimal designs for (i) estimating model parameters, (ii) estimating the threshold dose, and (iii) testing for the presence of hormesis. We also determine maximin optimal designs that maximize the minimum of the design efficiencies when we have multiple design criteria or there is model uncertainty where we have a few plausible models of interest. We apply these optimal design strategies to a teratology study and show that the proposed designs outperform the implemented design by a wide margin for many situations.

  20. Optimal Deterministic Investment Strategies for Insurers

    Directory of Open Access Journals (Sweden)

    Ulrich Rieder

    2013-11-01

    We consider an insurance company whose risk reserve is given by a Brownian motion with drift and which is able to invest the money into a Black–Scholes financial market. As optimization criteria, we treat mean-variance problems, problems with other risk measures, exponential utility and the probability of ruin. Following recent research, we assume that investment strategies have to be deterministic. This leads to deterministic control problems, which are quite easy to solve. Moreover, it turns out that there are some interesting links between the optimal investment strategies of these problems. Finally, we also show that this approach works in the Lévy process framework.

  1. Optimization strategies for complex engineering applications

    Energy Technology Data Exchange (ETDEWEB)

    Eldred, M.S.

    1998-02-01

    LDRD research activities have focused on increasing the robustness and efficiency of optimization studies for computationally complex engineering problems. Engineering applications can be characterized by extreme computational expense, lack of gradient information, discrete parameters, non-converging simulations, and nonsmooth, multimodal, and discontinuous response variations. Guided by these challenges, the LDRD research activities have developed application-specific techniques, fundamental optimization algorithms, multilevel hybrid and sequential approximate optimization strategies, parallel processing approaches, and automatic differentiation and adjoint augmentation methods. This report surveys these activities and summarizes the key findings and recommendations.

  2. Optimizing Infant Development: Strategies for Day Care.

    Science.gov (United States)

    Chambliss, Catherine

    This guide for infant day care providers examines the importance of early experience for brain development and strategies for providing optimal infant care. The introduction discusses the current devaluation of day care and idealization of maternal care and identifies benefits of quality day care experience for intellectual development, sleep…

  3. Optimal Heating Strategies for a Convection Oven

    NARCIS (Netherlands)

    Stigter, J.D.; Scheerlinck, N.; Nicolai, B.M.; Impe, van J.F.

    2001-01-01

    In this study classical control theory is applied to a heat conduction model with convective boundary conditions. Optimal heating strategies are obtained through solution of an associated algebraic Riccati equation for a finite horizon linear quadratic regulator (LQR). The large dimensional system

  4. Instance Optimality of the Adaptive Maximum Strategy

    NARCIS (Netherlands)

    L. Diening; C. Kreuzer; R. Stevenson

    2016-01-01

    In this paper, we prove that the standard adaptive finite element method with a (modified) maximum marking strategy is instance optimal for the total error, being the square root of the squared energy error plus the squared oscillation. This result will be derived in the model setting of Poisson’s e

  5. Optimal inspection Strategies for Offshore Structural Systems

    DEFF Research Database (Denmark)

    Faber, M. H.; Sørensen, John Dalsgaard; Kroon, I. B.

    1992-01-01

    Optimal planning of inspection and maintenance strategies for structures has become a subject of increasing interest, especially for offshore structures, for which large costs are associated with structural failure, inspections and repairs. During the last five years a methodology has been formulated to perform optimal inspection and repair strategies for structural components subject to uncertain loading conditions and material behavior. In this paper this methodology is extended to include also system failure, i.e. failure of a given subset of all the structural components. This extension includes a mathematical framework for the estimation of the failure and repair costs associated with systems failure. Further, a strategy for selecting the components to inspect based on decision tree analysis is suggested. Methods and analysis schemes are illustrated by a simple example.

  6. Optimization of Synthetic Aperture Image Quality

    DEFF Research Database (Denmark)

    Moshavegh, Ramin; Jensen, Jonas; Villagómez Hoyos, Carlos Armando

    2016-01-01

    Synthetic Aperture (SA) imaging produces high-quality images and velocity estimates of both slow and fast flow at high frame rates. However, grating lobe artifacts can appear both in transmission and reception. These affect the image quality and the frame rate. Therefore optimization of the parameters affecting the image quality of SA is of great importance, and this paper proposes an advanced procedure for optimizing the parameters essential for acquiring an optimal image quality while generating high-resolution SA images. Optimization of the image quality is mainly performed based on measures such as the F-number, the number of emissions, and the aperture size. They are considered to be the acquisition factors that contribute most to the quality of the high-resolution images in SA. Therefore, the performance of image quality is quantified in terms of full-width at half maximum (FWHM) and the cystic...

  7. Image-driven mesh optimization

    Energy Technology Data Exchange (ETDEWEB)

    Lindstrom, P; Turk, G

    2001-01-05

    We describe a method of improving the appearance of a low vertex count mesh in a manner that is guided by rendered images of the original, detailed mesh. This approach is motivated by the fact that greedy simplification methods often yield meshes that are poorer than what can be represented with a given number of vertices. Our approach relies on edge swaps and vertex teleports to alter the mesh connectivity, and uses the downhill simplex method to simultaneously improve vertex positions and surface attributes. Note that this is not a simplification method--the vertex count remains the same throughout the optimization. At all stages of the optimization the changes are guided by a metric that measures the differences between rendered versions of the original model and the low vertex count mesh. This method creates meshes that are geometrically faithful to the original model. Moreover, the method takes into account more subtle aspects of a model such as surface shading or whether cracks are visible between two interpenetrating parts of the model.

  8. The Optimal Nash Equilibrium Strategies Under Competition

    Institute of Scientific and Technical Information of China (English)

    孟力; 王崇喜; 汪定伟; 张爱玲

    2004-01-01

    This paper presents a game-theoretic model to study competition for a single investment opportunity under uncertainty. It models the hazard rate of investment as a function of competitors' trigger levels. Under uncertainty and different information structures, option and game theory are applied to derive the optimal Nash equilibrium strategies of one or more firms. By means of Matlab software, the paper simulates a real-estate development project example and illustrates how the parameters affect investment strategies. The paper's work will contribute to present investment practice in China.

  9. Automatic CT simulation optimization for radiation therapy: A general strategy

    Energy Technology Data Exchange (ETDEWEB)

    Li, Hua, E-mail: huli@radonc.wustl.edu; Chen, Hsin-Chen; Tan, Jun; Gay, Hiram; Michalski, Jeff M.; Mutic, Sasa [Department of Radiation Oncology, Washington University, St. Louis, Missouri 63110 (United States); Yu, Lifeng [Department of Radiology, Mayo Clinic, Rochester, Minnesota 55905 (United States); Anastasio, Mark A. [Department of Biomedical Engineering, Washington University, St. Louis, Missouri 63110 (United States); Low, Daniel A. [Department of Radiation Oncology, University of California Los Angeles, Los Angeles, California 90095 (United States)

    2014-03-15

    Purpose: In radiation therapy, x-ray computed tomography (CT) simulation protocol specifications should be driven by the treatment planning requirements in lieu of duplicating diagnostic CT screening protocols. The purpose of this study was to develop a general strategy that allows for automatically, prospectively, and objectively determining the optimal patient-specific CT simulation protocols based on radiation-therapy goals, namely, maintenance of contouring quality and integrity while minimizing patient CT simulation dose. Methods: The authors proposed a general prediction strategy that provides automatic optimal CT simulation protocol selection as a function of patient size and treatment planning task. The optimal protocol is the one that delivers the minimum dose required to provide a CT simulation scan that yields accurate contours. Accurate treatment plans depend on accurate contours in order to conform the dose to actual tumor and normal organ positions. An image quality index, defined to characterize how simulation scan quality affects contour delineation, was developed and used to benchmark the contouring accuracy and treatment plan quality within the prediction strategy. A clinical workflow was developed to select the optimal CT simulation protocols incorporating patient size, target delineation, and radiation dose efficiency. An experimental study using an anthropomorphic pelvis phantom with added-bolus layers was used to demonstrate how the proposed prediction strategy could be implemented and how the optimal CT simulation protocols could be selected for prostate cancer patients based on patient size and treatment planning task. Clinical IMRT prostate treatment plans for seven CT scans with varied image quality indices were separately optimized and compared to verify the trace of target and organ dosimetry coverage. Results: Based on the phantom study, the optimal image quality index for accurate manual prostate contouring was 4.4. The optimal tube

  10. Buffer management optimization strategy for satellite ATM

    Institute of Scientific and Technical Information of China (English)

    Lu Rong; Cao Zhigang

    2006-01-01

    ECTD (erroneous cell tail drop), a buffer management optimization strategy, is suggested, which can improve the utilization of buffer resources in satellite ATM (asynchronous transfer mode) networks. In this strategy, erroneous cells caused by the satellite channel, together with the following cells that belong to the same PDU (protocol data unit), are discarded; it concerns non-real-time data services that use a higher-layer protocol for retransmission. Based on the EPD (early packet drop) policy, mathematical models are established with and without ECTD. The numerical results show that ECTD optimizes buffer management and improves effective throughput (goodput), and that the increase in goodput is related to the CER (cell error ratio) and the PDU length: the higher their values, the greater the increase. For example, when the average PDU length values are 30 and 90, the improvements in goodput are about 4% and 10%, respectively.
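
    The ECTD idea, dropping the rest of a PDU as soon as one of its cells is corrupted, can be illustrated with a toy simulation that counts how many cells are admitted to the buffer for roughly the same number of intact PDUs. The PDU length, cell error ratio, and single-flow setup are assumptions for illustration, and queueing effects are ignored.

    ```python
    import random

    def run(n_pdus, cells_per_pdu, cer, use_ectd, seed=0):
        """Return (fraction of PDUs delivered intact, cells admitted to the buffer).
        With ECTD, once a cell of a PDU is hit by a channel error, the remaining
        cells of that PDU are dropped instead of occupying buffer space, since the
        whole PDU will be retransmitted by the higher layer anyway."""
        rng = random.Random(seed)
        delivered, buffered_cells = 0, 0
        for _ in range(n_pdus):
            corrupted = False
            for _ in range(cells_per_pdu):
                if corrupted and use_ectd:
                    continue                     # tail of a doomed PDU: drop early
                buffered_cells += 1
                if rng.random() < cer:
                    corrupted = True
            if not corrupted:
                delivered += 1
        return delivered / n_pdus, buffered_cells

    for ectd in (False, True):
        ok, cells = run(n_pdus=20000, cells_per_pdu=30, cer=0.01, use_ectd=ectd)
        print(f"ECTD={ectd}:  intact PDUs={ok:.3f}  cells admitted to buffer={cells}")
    ```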

  11. Optimal Investment Strategy to Minimize Occupation Time

    CERN Document Server

    Bayraktar, Erhan

    2008-01-01

    We find the optimal investment strategy to minimize the expected time that an individual's wealth stays below zero, the so-called occupation time. The individual consumes at a constant rate and invests in a Black-Scholes financial market consisting of one riskless and one risky asset, with the risky asset's price process following a geometric Brownian motion. We also consider an extension of this problem by penalizing the occupation time for the degree to which wealth is negative.

  12. Minimax Strategy of Optimal Unambiguous State Discrimination

    Institute of Scientific and Technical Information of China (English)

    张文海; 余龙宝; 曹卓良; 叶柳

    2012-01-01

    In this paper, we consider the minimax strategy to unambiguously discriminate two pure nonorthogonal quantum states without knowing the a priori probabilities. By exploiting positive-operator valued measures, we derive the upper bound of the minimax measurement for optimal unambiguous state discrimination. Based on linear optical devices, we propose an experimentally feasible scheme to implement a minimax measurement of a general pair of nonorthogonal quantum states.

  13. Optimal experimental design strategies for detecting hormesis

    OpenAIRE

    2010-01-01

    Hormesis is a widely observed phenomenon in many branches of life sciences ranging from toxicology studies to agronomy with obvious public health and risk assessment implications. We address optimal experimental design strategies for determining the presence of hormesis in a controlled environment using the recently proposed Hunt-Bowman model. We propose alternative models that have an implicit hormetic threshold, discuss their advantages over current models, construct and study properties of...

  14. Optimal strategies for pricing general insurance

    OpenAIRE

    Emms, P.; Haberman, S.; Savoulli, I.

    2006-01-01

    Optimal premium pricing policies in a competitive insurance environment are investigated using approximation methods and simulation of sample paths. The market average premium is modelled as a diffusion process, with the premium as the control function and the maximization of the expected total utility of wealth, over a finite time horizon, as the objective. In order to simplify the optimisation problem, a linear utility function is considered and two particular premium strategies are adopted...

  15. Optimal network protection against diverse interdictor strategies

    Energy Technology Data Exchange (ETDEWEB)

    Ramirez-Marquez, Jose E., E-mail: jmarquez@stevens.ed [Systems Development and Maturity Lab, School of Systems and Enterprises, Stevens Institute of Technology, Castle Point on Hudson, Hoboken, NJ 07030 (United States); Rocco, Claudio M. [Facultad de Ingenieria, Universidad Central de Venezuela, Caracas (Venezuela, Bolivarian Republic of); Levitin, Gregory [Collaborative Autonomic Computing Laboratory, School of Computer Science, University of Electronic Science and Technology of China (China); Israel Electric Corporation, Reliability and Equipment Department, Haifa 31000 (Israel)

    2011-03-15

    The objective of this paper is to provide optimal protection configurations for a network with components vulnerable to an interdictor with potentially different attacking strategies. Under this new setting, a solution/configuration describes the defender's optimal amount of defense resources allocated to each link against a potential interdictor strategy. Prior to this research, decisions were of a binary nature, restricted to defending a component or not. Obtaining these configurations is important because, along with describing the protection scheme, they are also useful for identifying sets of components critical to the successful performance of the network. The application of the approach can be beneficial for networks in telecommunications, energy, and supply chains, to name a few. To obtain an optimal solution, the manuscript describes an evolutionary algorithm that considers continuous decision variables. The results obtained for different examples illustrate that equal resource allocation is optimal for the case of homogeneous component vulnerability. These findings are the basis for discussion and for describing future research directions in this area.

  16. On optimal strategies for upgrading networks

    Energy Technology Data Exchange (ETDEWEB)

    Krumke, S.O.; Noltemeier, H. [Wuerzburg Univ. (Germany). Dept. of Computer Science; Marathe, M.V. [Los Alamos National Lab., NM (United States); Ravi, S.S. [State Univ. of New York, Albany, NY (United States). Dept. of Computer Science; Ravi, R. [Carnegie-Mellon Univ., Pittsburgh, PA (United States). Graduate School of Industrial Administration; Sundaram, R. [Massachusetts Inst. of Tech., Cambridge, MA (United States)

    1996-07-02

    We study budget-constrained optimal network upgrading problems. Such problems aim at finding optimal strategies for improving a network under some cost measure subject to certain budget constraints. Given an edge-weighted graph G(V,E), in the edge-based upgrading model it is assumed that each edge e of the given network has an associated function c(e) that specifies the amount by which the length l(e) is to be reduced. In the node-based upgrading model, a node v can be upgraded at an expense of cost c(v). Such an upgrade reduces the cost of each edge incident on v by a fixed factor ρ, where 0 < ρ < 1. For a given budget B, the goal is to find an improvement strategy such that the total cost of reduction is at most the given budget B and the cost of a subgraph (e.g., minimum spanning tree) under the modified edge lengths is the best over all possible strategies which obey the budget constraint. Define an (α, β)-approximation algorithm as a polynomial-time algorithm that produces a solution within α times the optimal function value, violating the budget constraint by a factor of at most β. The results obtained in this paper include the following: (1) We show that in general the problem of computing an optimal reduction strategy for modifying the network as above is NP-hard. (2) In the node-based model, we show how to devise a near-optimal strategy for improving the bottleneck spanning tree; the algorithms have a performance guarantee of (2 ln n, 1). (3) For the edge-based improvement problems we present improved (in terms of performance and time) approximation algorithms. (4) We also present pseudo-polynomial-time algorithms (extendible to polynomial-time approximation schemes) for a number of edge/node-based improvement problems when restricted to the class of treewidth-bounded graphs.
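
    The node-based model above can be made concrete with a small brute-force sketch: upgrading a node multiplies every incident edge length by ρ (both endpoints upgraded gives ρ²), and we look for the cheapest minimum spanning tree over all upgrade sets within the budget. The toy graph, unit node costs, and ρ = 0.5 are invented for illustration; the paper's contribution is the approximation algorithms, not this enumeration.

```python
from itertools import chain, combinations

def mst_weight(n, edges):
    """Kruskal's algorithm; edges is a list of (weight, u, v) tuples."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    total = 0.0
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            total += w
    return total

def best_upgrade(n, edges, node_cost, rho, budget):
    """Enumerate all upgrade sets within budget; return (best MST cost, set)."""
    best = (float("inf"), frozenset())
    for S in chain.from_iterable(combinations(range(n), k) for k in range(n + 1)):
        S = frozenset(S)
        if sum(node_cost[v] for v in S) > budget:
            continue
        # upgrading an endpoint multiplies the edge length by rho
        modified = [(w * rho ** ((u in S) + (v in S)), u, v) for w, u, v in edges]
        cost = mst_weight(n, modified)
        if cost < best[0]:
            best = (cost, S)
    return best

# hypothetical 4-node graph, unit upgrade costs, rho = 0.5, budget = 2
edges = [(4.0, 0, 1), (3.0, 1, 2), (5.0, 2, 3), (6.0, 0, 3), (2.0, 1, 3)]
print(best_upgrade(4, edges, {0: 1, 1: 1, 2: 1, 3: 1}, rho=0.5, budget=2))
```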

  17. Trading Strategy Adapted Optimization of European Call Option

    OpenAIRE

    Fukumi, Toshio

    2005-01-01

    Optimal pricing of a European call option is described by a linear stochastic differential equation. A trading strategy given by a pair of stochastic variables was integrated with respect to the Black-Scholes formula in order to adapt optimal pricing to the trading strategy.

  18. Optimal growth strategies under divergent predation pressure.

    Science.gov (United States)

    Aikio, S; Herczeg, G; Kuparinen, A; Merilä, J

    2013-01-01

    The conditions leading to gigantism in nine-spined sticklebacks Pungitius pungitius were analysed by modelling fish growth with the von Bertalanffy model searching for the optimal strategy when the model's growth constant and asymptotic fish size parameters are negatively related to each other. Predator-related mortality was modelled through the increased risk of death during active foraging. The model was parameterized with empirical growth data of fish from four different populations and analysed for optimal growth strategy at different mortality levels. The growth constant and asymptotic fish size were negatively related in most populations. Optimal fish size, fitness and life span decreased with predator-induced mortality. At low mortality, the fitness of pond populations was higher than that of sea populations. The differences disappeared at intermediate mortalities, and sea populations had slightly higher fitness at extremely high mortalities. In the scenario where all populations mature at the same age, the pond populations perform better at low mortalities and the sea populations at high mortalities. It is concluded that a trade-off between growth constant and asymptotic fish size, together with different mortality rates, can explain a significant proportion of body size differentiation between populations. In the present case, it is a sufficient explanation of gigantism in pond P. pungitius. © 2012 The Authors. Journal of Fish Biology © 2012 The Fisheries Society of the British Isles.

  19. Affordance Learning Based on Subtask's Optimal Strategy

    Directory of Open Access Journals (Sweden)

    Huaqing Min

    2015-08-01

    Full Text Available Affordances define the relationships between the robot and environment, in terms of actions that the robot is able to perform. Prior work is mainly about predicting the possibility of a reactive action, and the object's affordance is invariable. However, in the domain of dynamic programming, a robot’s task could often be decomposed into several subtasks, and each subtask could limit the search space. As a result, the robot only needs to replan its sub strategy when an unexpected situation happens, and an object’s affordance might change over time depending on the robot’s state and current subtask. In this paper, we propose a novel affordance model linking the subtask, object, robot state and optimal action. An affordance represents the first action of the optimal strategy under the current subtask when detecting an object, and its influence is promoted from a primitive action to the subtask strategy. Furthermore, hierarchical reinforcement learning and state abstraction mechanism are introduced to learn the task graph and reduce state space. In the navigation experiment, the robot equipped with a camera could learn the objects’ crucial characteristics, and gain their affordances in different subtasks.

  20. Optimization Under Uncertainty for Wake Steering Strategies

    Energy Technology Data Exchange (ETDEWEB)

    Quick, Julian [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Annoni, Jennifer [National Renewable Energy Laboratory (NREL), Golden, CO (United States); King, Ryan N [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Dykes, Katherine L [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Fleming, Paul A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ning, Andrew [Brigham Young University

    2017-08-03

    Offsetting turbines' yaw orientations from incoming wind is a powerful tool that may be leveraged to reduce undesirable wake effects on downstream turbines. First, we examine a simple two-turbine case to gain intuition as to how inflow direction uncertainty affects the optimal solution. The turbines are modeled with unidirectional inflow such that one turbine directly wakes the other, using ten rotor diameter spacing. We perform optimization under uncertainty (OUU) via a parameter sweep of the front turbine. The OUU solution generally prefers less steering. We then do this optimization for a 60-turbine wind farm with unidirectional inflow, varying the degree of inflow uncertainty and approaching this OUU problem by nesting a polynomial chaos expansion uncertainty quantification routine within an outer optimization. We examined how different levels of uncertainty in the inflow direction affect the ratio of the expected values of deterministic and OUU solutions for steering strategies in the large wind farm, assuming the directional uncertainty used to reach said OUU solution (this ratio is defined as the value of the stochastic solution or VSS).
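
    The nesting of an uncertainty-quantification step inside the optimization can be sketched as follows: expected power over a Gaussian-uncertain inflow direction is computed by Gauss-Hermite quadrature, and a deterministic yaw optimum is compared with the OUU optimum. The two-turbine power surrogate farm_power() below is a made-up smooth stand-in, not the wake model used in the paper, and all parameter values are illustrative.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def farm_power(yaw_deg, inflow_deg):
    # hypothetical surrogate: steering deflects the wake away from the
    # downstream turbine (worst when inflow is near 0 deg) at a cosine loss
    wake_loss = 0.3 * np.exp(-((inflow_deg - 0.3 * yaw_deg) / 5.0) ** 2)
    upstream = np.cos(np.radians(yaw_deg)) ** 3
    return upstream + (1.0 - wake_loss)

def expected_power(yaw_deg, mu=0.0, sigma=5.0, order=16):
    # Gauss-Hermite quadrature for E[P(yaw, theta)], theta ~ N(mu, sigma^2)
    x, w = np.polynomial.hermite_e.hermegauss(order)
    return np.sum(w * farm_power(yaw_deg, mu + sigma * x)) / np.sqrt(2.0 * np.pi)

det = minimize_scalar(lambda y: -farm_power(y, 0.0), bounds=(0.0, 30.0), method="bounded")
ouu = minimize_scalar(lambda y: -expected_power(y), bounds=(0.0, 30.0), method="bounded")
print("deterministic yaw:", round(det.x, 1), "deg; OUU yaw:", round(ouu.x, 1), "deg")
```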

  1. Optimal defense strategy: storage vs. new production.

    Science.gov (United States)

    Shudo, Emi; Iwasa, Yoh

    2002-12-07

    If hosts produce defense proteins after they are infected by pathogens, it may take hours to days before defense becomes fully active. By producing defense proteins beforehand, and storing them until infection, the host can cope with pathogens with a short time delay. However, producing and storing defense proteins require energy, and the activated defense proteins often cause harm to the host's body as well as to pathogens. Here, we study the optimal strategy for a host who chooses the amount of stored defense proteins, the activation of the stored proteins upon infection, and the new production of the proteins. The optimal strategy is the one that minimizes the sum of the harm by pathogens and the cost of defense. The host chooses the storage size of defense proteins based on the probability distribution of the magnitude of pathogen infection. When the infection size is predictable, all the stored proteins are to be activated upon infection. The optimal strategy is to have no storage and to rely entirely on new production if the expected infection size n(0) is small, but to have a big storage without new production if n(0) is large. The transition from the "new production" phase to "storage" phase occurs at a smaller n(0) when storage cost is small, activation cost is large, pathogen toxicity is large, pathogen growth is fast, the defense is effective, the delay is long, and the infection is more likely. On the other hand, the storage size to produce for a large n(0) decreases with three cost parameters and the defense effectiveness, increases with the likelihood of infection, the toxicity and the growth rate of pathogens, and it is independent of the time delay. When infection size is much smaller than the expected size, some of the stored proteins may stay unused.

  2. An integral design strategy combining optical system and image processing to obtain high resolution images

    Science.gov (United States)

    Wang, Jiaoyang; Wang, Lin; Yang, Ying; Gong, Rui; Shao, Xiaopeng; Liang, Chao; Xu, Jun

    2016-05-01

    In this paper, an integral design that combines the optical system with image processing is introduced to obtain high resolution images, and its performance is evaluated and demonstrated. Traditional imaging methods often separate the two technical procedures of optical system design and image processing, resulting in failures of efficient cooperation between the optical and digital elements. Therefore, an innovative approach is presented to combine the merit function used during optical design with the constraint conditions of the image processing algorithms. Specifically, an optical imaging system with low resolution is designed to collect the image signals which are indispensable for image processing, while the ultimate goal is to obtain high resolution images from the final system. In order to optimize the global performance, the optimization function of the ZEMAX software is utilized and the number of optimization cycles is controlled. Then a Wiener filter algorithm is adopted to process the simulated images, and the mean squared error (MSE) is taken as the evaluation criterion. The results show that, although the optical figures of merit for the optical imaging system are not the best, it can provide image signals that are more suitable for image processing. In conclusion, the integral design of the optical system and image processing can search out the overall optimal solution which is missed by traditional design methods. Especially when designing complex optical systems, this integral design strategy has obvious advantages in simplifying structure and reducing cost, as well as gaining high resolution images simultaneously, which has promising prospects for industrial application.
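
    A minimal sketch of the digital half of such a joint design is given below: a known PSF blurs a toy scene, a Wiener filter restores it in the frequency domain, and the MSE against the ground truth serves as the evaluation criterion. The Gaussian PSF, noise level, and test image are placeholders for the low-resolution optical system designed in ZEMAX.

```python
import numpy as np

def gaussian_psf(shape, sigma):
    y, x = np.indices(shape)
    cy, cx = (shape[0] - 1) / 2.0, (shape[1] - 1) / 2.0
    psf = np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2.0 * sigma ** 2))
    return psf / psf.sum()

def wiener_restore(degraded, psf, nsr):
    """Frequency-domain Wiener filter; nsr = noise-to-signal power ratio."""
    H = np.fft.fft2(np.fft.ifftshift(psf))
    W = np.conj(H) / (np.abs(H) ** 2 + nsr)
    return np.real(np.fft.ifft2(W * np.fft.fft2(degraded)))

rng = np.random.default_rng(0)
truth = np.zeros((64, 64)); truth[24:40, 24:40] = 1.0          # toy scene
psf = gaussian_psf(truth.shape, sigma=2.0)
H = np.fft.fft2(np.fft.ifftshift(psf))
noisy = np.real(np.fft.ifft2(H * np.fft.fft2(truth))) + 0.01 * rng.standard_normal(truth.shape)
restored = wiener_restore(noisy, psf, nsr=1e-3)
print("MSE degraded:", np.mean((noisy - truth) ** 2))
print("MSE restored:", np.mean((restored - truth) ** 2))
```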

  3. Combined optimization model for sustainable energization strategy

    Science.gov (United States)

    Abtew, Mohammed Seid

    Access to energy is a foundation to establish a positive impact on multiple aspects of human development. Both developed and developing countries have a common concern of achieving a sustainable energy supply to fuel economic growth and improve the quality of life with minimal environmental impacts. The Least Developing Countries (LDCs), however, have different economic, social, and energy systems. Prevalence of power outage, lack of access to electricity, structural dissimilarity between rural and urban regions, and traditional fuel dominance for cooking and the resultant health and environmental hazards are some of the distinguishing characteristics of these nations. Most energy planning models have been designed for developed countries' socio-economic demographics and have missed the opportunity to address special features of the poor countries. An improved mixed-integer programming energy-source optimization model is developed to address limitations associated with using current energy optimization models for LDCs, tackle development of the sustainable energization strategies, and ensure diversification and risk management provisions in the selected energy mix. The Model predicted a shift from traditional fuels reliant and weather vulnerable energy source mix to a least cost and reliable modern clean energy sources portfolio, a climb on the energy ladder, and scored multifaceted economic, social, and environmental benefits. At the same time, it represented a transition strategy that evolves to increasingly cleaner energy technologies with growth as opposed to an expensive solution that leapfrogs immediately to the cleanest possible, overreaching technologies.

  4. Local Optimization Strategies in Urban Vehicular Mobility.

    Directory of Open Access Journals (Sweden)

    Pierpaolo Mastroianni

    Full Text Available The comprehension of vehicular traffic in urban environments is crucial to achieving good management of the complex processes arising from people's collective motion. Even allowing for the great complexity of human beings, human behavior turns out to be subject to strong constraints (physical, environmental, social, economic) that induce the emergence of common patterns. The observation and understanding of those patterns is key to setting up effective strategies to optimize the quality of life in cities while not frustrating the natural need for mobility. In this paper we focus on vehicular mobility with the aim of revealing the underlying patterns and uncovering the human strategies determining them. To this end we analyze a large dataset of GPS vehicle tracks collected in the Rome (Italy) district during one month. We demonstrate the existence of a local optimization of travel times that vehicle drivers perform while choosing their journey. This finding is mirrored by two additional important facts, i.e., the observation that the average vehicle velocity increases with increasing travel length and the emergence of a universal scaling law for the distribution of travel times at fixed traveled length. A simple modeling scheme confirms this scenario, opening the way to further predictions.

  5. Synthetic Imaging Maneuver Optimization (SIMO) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Aurora Flight Sciences (AFS), in collaboration with the MIT Space Systems Laboratory (MIT-SSL), proposed the Synthetic Imaging Maneuver Optimization (SIMO) program...

  6. Optimizing Synthetic Aperture Compound Imaging

    DEFF Research Database (Denmark)

    Hansen, Jens Munk; Jensen, Jørgen Arendt

    2012-01-01

    Spatial compound images are constructed from synthetic aperture data acquired using a linear phased-array transducer. Compound images of wires, tissue, and cysts are created using a method, which allows both transmit and receive compounding without any loss in temporal resolution. Similarly to co...

  7. Optimized absorption imaging of mesoscopic atomic clouds

    Science.gov (United States)

    Muessel, Wolfgang; Strobel, Helmut; Joos, Maxime; Nicklas, Eike; Stroescu, Ion; Tomkovič, Jiří; Hume, David B.; Oberthaler, Markus K.

    2013-10-01

    We report on the optimization of high-intensity absorption imaging for small Bose-Einstein condensates. The imaging calibration exploits the linear scaling of the quantum projection noise with the mean number of atoms for a coherent spin state. After optimization for atomic clouds containing up to 300 atoms, we find an atom number resolution of atoms, mainly limited by photon shot noise and radiation pressure.

  8. Applying BAT Evolutionary Optimization to Image-Based Visual Servoing

    Directory of Open Access Journals (Sweden)

    Marco Perez-Cisneros

    2015-01-01

    Full Text Available This paper presents a predictive control strategy for an image-based visual servoing scheme that employs evolutionary optimization. The visual control task is approached as a nonlinear optimization problem that naturally handles relevant visual servoing constraints such as workspace limitations and visibility restrictions. As the predictive scheme requires a reliable model, this paper uses a local model that is based on the visual interaction matrix and a global model that employs 3D trajectory data extracted from a quaternion-based interpolator. The work assumes a free-flying camera with 6-DOF simulation whose results support the discussion on the constraint handling and the image prediction scheme.

  9. Optimization of Sensor Monitoring Strategies for Emissions

    Science.gov (United States)

    Klise, K. A.; Laird, C. D.; Downey, N.; Baker Hebert, L.; Blewitt, D.; Smith, G. R.

    2016-12-01

    Continuous or regularly scheduled monitoring has the potential to quickly identify changes in air quality. However, even with low-cost sensors, only a limited number of sensors can be placed to monitor airborne pollutants. The physical placement of these sensors and the sensor technology used can have a large impact on the performance of a monitoring strategy. Furthermore, sensors can be placed for different objectives, including maximum coverage, minimum time to detection or exposure, or to quantify emissions. Different objectives may require different monitoring strategies, which need to be evaluated by stakeholders before sensors are placed in the field. In this presentation, we outline methods to enhance ambient detection programs through optimal design of the monitoring strategy. These methods integrate atmospheric transport models with sensor characteristics, including fixed and mobile sensors, sensor cost and failure rate. The methods use site specific pre-computed scenarios which capture differences in meteorology, terrain, concentration averaging times, gas concentration, and emission characteristics. The pre-computed scenarios become input to a mixed-integer, stochastic programming problem that solves for sensor locations and types that maximize the effectiveness of the detection program. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
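
    A scenario-based placement problem of this kind is often written as a small mixed-integer program: choose sensor locations within a budget so that the probability-weighted fraction of pre-computed release scenarios that are detected is maximized. The sketch below is a hypothetical toy instance (invented coverage sets, costs, and probabilities) and assumes the PuLP modeling package is available; it is not the formulation used at Sandia.

```python
import pulp

locations = ["L1", "L2", "L3", "L4"]
cost = {"L1": 1.0, "L2": 1.0, "L3": 2.0, "L4": 1.5}
budget = 2.5
scenarios = {   # scenario -> (probability, locations whose sensor would detect it)
    "S1": (0.4, {"L1", "L3"}),
    "S2": (0.3, {"L2"}),
    "S3": (0.2, {"L3", "L4"}),
    "S4": (0.1, {"L4"}),
}

prob = pulp.LpProblem("sensor_placement", pulp.LpMaximize)
x = pulp.LpVariable.dicts("place", locations, cat="Binary")
y = pulp.LpVariable.dicts("detected", list(scenarios), lowBound=0, upBound=1)

prob += pulp.lpSum(p * y[s] for s, (p, _) in scenarios.items())          # objective
prob += pulp.lpSum(cost[l] * x[l] for l in locations) <= budget          # budget
for s, (_, covers) in scenarios.items():
    prob += y[s] <= pulp.lpSum(x[l] for l in covers)                     # coverage

prob.solve(pulp.PULP_CBC_CMD(msg=False))
chosen = [l for l in locations if x[l].value() == 1]
print("sensors at:", chosen, "| expected detection probability:", pulp.value(prob.objective))
```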

  10. Optimal allocation of trend following strategies

    Science.gov (United States)

    Grebenkov, Denis S.; Serror, Jeremy

    2015-09-01

    We consider a portfolio allocation problem for trend following (TF) strategies on multiple correlated assets. Under simplifying assumptions of a Gaussian market and linear TF strategies, we derive analytical formulas for the mean and variance of the portfolio return. We then construct the optimal portfolio that maximizes risk-adjusted return by accounting for inter-asset correlations. The dynamic allocation problem for n assets is shown to be equivalent to the classical static allocation problem for n² virtual assets that include lead-lag corrections in the positions of TF strategies. The respective roles of asset auto-correlations and inter-asset correlations are investigated in depth for the two-asset case and a sector model. In contrast to the principle of diversification, which suggests holding uncorrelated assets, we show that inter-asset correlations allow one to estimate apparent trends more reliably and to adjust the TF positions more efficiently. If properly accounted for, inter-asset correlations are not detrimental but beneficial for portfolio management and can open new profit opportunities for trend followers. These concepts are illustrated using daily returns of three highly correlated futures markets: the E-mini S&P 500, the Euro Stoxx 50 index, and US 10-year T-note futures.
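
    An illustrative numerical sketch of the allocation step (not the paper's analytical formulas): simulate two correlated return series, build linear trend-following P&L streams whose position is an exponential moving average of strictly past returns, and allocate between the two strategies with the classical maximum-Sharpe weights w ∝ Σ⁻¹μ. All parameters below are invented.

```python
import numpy as np

rng = np.random.default_rng(1)
T, rho = 5000, 0.6
cov = np.array([[1.0, rho], [rho, 1.0]])
shocks = rng.multivariate_normal([0.0, 0.0], cov, size=T)
# weak common trend so that trend following has something to pick up
rets = 0.01 * shocks + 0.0005 * np.sin(np.arange(T)[:, None] / 200.0)

def tf_pnl(returns, span=50):
    """Linear TF: today's position is an EMA of strictly past returns."""
    alpha = 2.0 / (span + 1.0)
    ema = np.zeros_like(returns)
    for t in range(1, len(returns)):
        ema[t] = (1.0 - alpha) * ema[t - 1] + alpha * returns[t - 1]
    return ema * returns

pnl = np.column_stack([tf_pnl(rets[:, i]) for i in range(2)])
mu, Sigma = pnl.mean(axis=0), np.cov(pnl.T)
w = np.linalg.solve(Sigma, mu)          # maximum-Sharpe direction
w /= np.abs(w).sum()
print("TF strategy weights:", np.round(w, 3))
```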

  11. Modified Discrete Grey Wolf Optimizer Algorithm for Multilevel Image Thresholding

    Directory of Open Access Journals (Sweden)

    Linguo Li

    2017-01-01

    Full Text Available The computation of image segmentation has become more complicated with the increasing number of thresholds, and the selection and application of thresholds in image thresholding has at the same time become an NP-hard problem. The paper puts forward the modified discrete grey wolf optimizer algorithm (MDGWO), which improves the optimal-solution updating mechanism of the search agents by means of weights. Taking Kapur's entropy as the optimized function and based on the discreteness of thresholds in image segmentation, the paper first discretizes the grey wolf optimizer (GWO) and then proposes a new attack strategy that uses the weight coefficient to replace the search formula for the optimal solution used in the original algorithm. The experimental results show that MDGWO can search out the optimal thresholds efficiently and precisely, and that they are very close to the results obtained by exhaustive search. In comparison with electromagnetism optimization (EMO), differential evolution (DE), the Artificial Bee Colony (ABC), and the classical GWO, it is concluded that MDGWO has advantages over the latter four in terms of image segmentation quality, objective function values, and their stability.
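
    The objective that MDGWO maximizes, Kapur's entropy, is simply the sum of the Shannon entropies of the gray-level classes created by the thresholds. The sketch below evaluates that objective and, for brevity, finds two thresholds on a synthetic histogram by exhaustive search; the paper's modified grey wolf search itself is not reproduced here.

```python
import numpy as np
from itertools import combinations

def kapur_entropy(hist, thresholds):
    """Sum of Shannon entropies of the classes defined by the thresholds."""
    p = hist / hist.sum()
    edges = [0] + sorted(thresholds) + [len(hist)]
    total = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        w = p[lo:hi].sum()
        if w <= 0.0:
            return -np.inf                  # empty class: invalid split
        q = p[lo:hi] / w
        q = q[q > 0]
        total -= (q * np.log(q)).sum()
    return total

# synthetic three-mode gray-level histogram with 256 bins
rng = np.random.default_rng(0)
samples = np.concatenate([rng.normal(60, 10, 4000),
                          rng.normal(128, 12, 5000),
                          rng.normal(200, 8, 3000)])
hist, _ = np.histogram(np.clip(samples, 0, 255), bins=256, range=(0, 256))

best = max(combinations(range(1, 256), 2), key=lambda t: kapur_entropy(hist, t))
print("best pair of thresholds:", best)
```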

  12. Risky Arbitrage Strategies: Optimal Portfolio Choice and Economic Implications

    OpenAIRE

    Liu, Jun; Timmermann, Allan G

    2009-01-01

    We define risky arbitrages as self-financing trading strategies that have a strictly positive market price but a zero expected cumulative payoff. A continuous time cointegrated system is used to model risky arbitrages as arising from a mean-reverting mispricing component. We derive the optimal trading strategy in closed-form and show that the standard textbook arbitrage strategy is not optimal. In a calibration exercise, we show that the optimal strategy makes a sizeable difference in economi...

  13. Optimal portfolio strategies under a shortfall constraint

    Directory of Open Access Journals (Sweden)

    D Akuma

    2009-06-01

    Full Text Available We dynamically impose a shortfall constraint in terms of Tail Conditional Expectation on the continuous-time portfolio selection problem in order to obtain optimal strategies. The financial market is assumed to comprise n risky assets driven by geometric Brownian motion and one risk-free asset. The method of Lagrange multipliers is combined with the Hamilton-Jacobi-Bellman equation to insert the constraint into the resolution framework. The constraint is recalculated at short intervals of time throughout the investment horizon. A numerical method is applied to obtain an approximate solution to the problem. It is found that the imposition of the constraint curbs investment in the risky assets.

  14. Optimized imaging of the postoperative spine.

    Science.gov (United States)

    McLellan, Anne Marie; Daniel, Simon; Corcuera-Solano, Idoia; Joshi, Vivek; Tanenbaum, Lawrence N

    2014-05-01

    Few tasks in imaging are more challenging than that of optimizing evaluations of the instrumented spine. The authors describe how applying fundamental and more advanced principles to postoperative spine computed tomography and magnetic resonance examinations mitigates the challenges associated with metal implants and significantly improves image quality and consistency. Newer and soon-to-be-available enhancements should provide improved visualization of tissues and hardware as multispectral imaging sequences continue to develop.

  15. Optimization of Equipment Maintenance Strategy Based on Availability

    Institute of Scientific and Technical Information of China (English)

    张友诚

    2001-01-01

    It is very important to optimize the maintenance strategy in a maintenance plan, and proper parameters play a decisive role in the optimization. In the author's opinion, availability is a basic parameter, while failure consequence cost and failure characteristics are also important parameters. The maintenance strategy can be optimized on this basis by means of quantitative analysis and diagrams.

  16. Spaceborne SAR Imaging Algorithm for Coherence Optimized.

    Directory of Open Access Journals (Sweden)

    Zhiwei Qiu

    Full Text Available This paper proposes a SAR imaging algorithm with maximal coherence, based on existing SAR imaging algorithms. The basic idea of SAR imaging algorithms is that the output signal can attain the maximum signal-to-noise ratio (SNR) by using the optimal imaging parameters. Traditional imaging algorithms achieve the best focusing effect but introduce decoherence in the subsequent interferometric processing. In the algorithm proposed in this paper, the SAR echoes adopt consistent imaging parameters in the focusing processing. Although the SNR of the output signal is reduced slightly, the coherence is largely preserved, and an interferogram of high quality is finally obtained. Two scenes of Envisat ASAR data over Zhangbei are employed to test this algorithm. Compared with the interferogram from the traditional algorithm, the results show that this algorithm is more suitable for SAR interferometry (InSAR) research and applications.

  17. Swarm Optimization Methods in Microwave Imaging

    Directory of Open Access Journals (Sweden)

    Andrea Randazzo

    2012-01-01

    Full Text Available Swarm intelligence denotes a class of new stochastic algorithms inspired by the collective social behavior of natural entities (e.g., birds, ants, etc.). Such approaches have been proven to be quite effective in several application fields, ranging from intelligent routing to image processing. In recent years, they have also been successfully applied in electromagnetics, especially for antenna synthesis, component design, and microwave imaging. In this paper, the application of swarm optimization methods to microwave imaging is discussed, and some recent imaging approaches based on such methods are critically reviewed.

  18. Optimization of Neutral Atom Imagers

    Science.gov (United States)

    Shappirio, M.; Coplan, M.; Balsamo, E.; Chornay, D.; Collier, M.; Hughes, P.; Keller, J.; Ogilvie, K.; Williams, E.

    2008-01-01

    The interactions between plasma structures and neutral atom populations in interplanetary space can be effectively studied with energetic neutral atom imagers. For neutral atoms with energies less than 1 keV, the most efficient detection method that preserves direction and energy information is conversion to negative ions on surfaces. We have examined a variety of surface materials and conversion geometries in order to identify the factors that determine conversion efficiency. For chemically and physically stable surfaces, smoothness is of primary importance, while properties such as work function have no obvious correlation to conversion efficiency. For the noble metals, tungsten, silicon, and graphite with comparable smoothness, conversion efficiency varies by a factor of two to three. We have also examined the way in which surface conversion efficiency varies with the angle of incidence of the neutral atom and have found that the highest efficiencies are obtained at angles of incidence greater than 80°. The conversion efficiencies of silicon, tungsten, and graphite were examined most closely and the energy-dependent variation of conversion efficiency measured over a range of incident angles. We have also developed methods for micromachining silicon in order to reduce the volume-to-surface-area ratio relative to that of a single flat surface, and have been able to reduce volume-to-surface-area ratios by up to a factor of 60. With smooth micromachined surfaces of the optimum geometry, conversion efficiencies can be increased by an order of magnitude over instruments like LENA on the IMAGE spacecraft without increasing the instrument's mass or volume.

  20. Optimal restructuring strategies under various dynamic factors

    Institute of Scientific and Technical Information of China (English)

    MENG Qing-xuan

    2007-01-01

    Corporate restructuring was identified as a new industrial force that has a great impact on economic values and that therefore has become central in daily financial decision making. This article investigates the optimal restructuring strategies under different dynamic factors and their numerous impacts on firm value. The concept of quasi-leverage is introduced and valuation models are built for corporate debt and equity under imperfect market conditions. The model's input variables include the quasi-leverage and other firm-specific parameters; the output variables include multiple corporate security values. The restructuring cost is formulated in the form of an exponential function, which allows us to observe the sensitivity of the variation in security values. The unified model and its analytical solution developed in this research allow us to examine the continuous changes of security values by dynamically changing the coupon rates, riskless interest rate, bankruptcy cost, quasi-leverage, personal tax rate, corporate tax rate, transaction cost, firm risk, etc., so that the solutions provide useful guidance for financing and restructuring decisions.

  1. Mesh refinement strategy for optimal control problems

    Science.gov (United States)

    Paiva, L. T.; Fontes, F. A. C. C.

    2013-10-01

    Direct methods are becoming the most widely used technique to solve nonlinear optimal control problems. Regular time meshes with equidistant spacing are frequently used; however, in some cases these meshes cannot cope accurately with nonlinear behavior. One way to improve the solution is to select a new mesh with a greater number of nodes. Another way involves adaptive mesh refinement: the mesh nodes have non-equidistant spacing, which allows non-uniform node collocation. In the method presented in this paper, a time mesh refinement strategy based on the local error is developed. After computing a solution on a coarse mesh, the local error is evaluated, which gives information about the subintervals of the time domain where refinement is needed. This procedure is repeated until the local error falls below a user-specified threshold. The technique is applied to solve a car-like vehicle problem aiming at minimum consumption. The approach developed in this paper leads to results with greater accuracy and yet lower overall computational time compared to using a time mesh with equidistant spacing.
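
    The refinement loop itself can be sketched independently of the optimal control solver: solve on the current mesh, estimate a local error on each subinterval, and split the subintervals whose error exceeds the tolerance. In the illustration below each "solve" is just trapezoidal integration of a placeholder dynamic, and the local error is estimated by comparing one step against two half steps; only the mesh-refinement logic mirrors the strategy described above.

```python
import numpy as np

def f(t, x):                                  # placeholder dynamics
    return -5.0 * x + np.sin(10.0 * t)

def trapz_step(t0, t1, x0):
    """One trapezoidal step with an explicit predictor."""
    h = t1 - t0
    x_pred = x0 + h * f(t0, x0)
    return x0 + 0.5 * h * (f(t0, x0) + f(t1, x_pred))

def local_errors(mesh, x0):
    errs, x = [], x0
    for t0, t1 in zip(mesh[:-1], mesh[1:]):
        coarse = trapz_step(t0, t1, x)
        tm = 0.5 * (t0 + t1)
        fine = trapz_step(tm, t1, trapz_step(t0, tm, x))
        errs.append(abs(coarse - fine))       # step-doubling error estimate
        x = fine
    return np.array(errs)

mesh, tol = np.linspace(0.0, 2.0, 11), 1e-4
for it in range(20):
    errs = local_errors(mesh, x0=1.0)
    if errs.max() < tol:
        break
    midpoints = [0.5 * (a + b) for (a, b), e in
                 zip(zip(mesh[:-1], mesh[1:]), errs) if e >= tol]
    mesh = np.sort(np.concatenate([mesh, midpoints]))
print(f"{it} refinement passes, {len(mesh)} nodes, max local error {errs.max():.2e}")
```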

  2. Prosumers strategy for DHC energy flow optimization

    Directory of Open Access Journals (Sweden)

    Vasek Lubomir

    2016-01-01

    Full Text Available This article introduces a proposal for a discrete model of a district heating and cooling (DHC) system for energy flow optimization. The aim is to achieve the best solution of the objective function, usually determined by minimizing the production and distribution costs while meeting the needs of energy consumers. The model also introduces the idea of a general prosumer strategy, where all active elements within the modern DHC system are represented by prosumer objects. The prosumers are perceived as objects able to actively participate in the planning of production and consumption of energy. It is assumed that the general behaviour of the objects in a DHC system is the same, no matter how they differ in size and design. Thus, all the objects are defined by two characteristics: the ability to produce and the ability to consume. The model based on this basic principle, together with the most accurate available information about the particular values at a given time, object properties, and so on, should provide tools for simulation and control of modern DHC systems, and possibly of superior units such as Smart Energy Grids, understood as systems integrating Smart Grids (electricity) and Smart Thermal Grids (heating and cooling).

  3. Comparison of image quality in head CT studies with different dose-reduction strategies

    DEFF Research Database (Denmark)

    Johansen, Jeppe; Nielsen, Rikke; Fink-Jensen, Vibeke

    -reduction maneuvers is reduction of image quality due to image noise or artifacts. The aim of our study was therefore to find the best diagnostic images with lowest possible dose. We present results of dose- and image quality optimizing strategies of brain CT examinations at our institution. We compare sequential...

  4. Optimality of feedback control strategies for qubit purification

    OpenAIRE

    Wiseman, Howard M.; Bouten, Luc

    2007-01-01

    Recently two papers [K. Jacobs, Phys. Rev. A 67, 030301(R) (2003); H. M. Wiseman and J. F. Ralph, New J. Physics 8, 90 (2006)] have derived control strategies for rapid purification of qubits, optimized with respect to various goals. In the former paper the proof of optimality was not mathematically rigorous, while the latter gave only heuristic arguments for optimality. In this paper we provide rigorous proofs of optimality in all cases, by applying simple concepts from optimal c...

  5. Circular SAR Optimization Imaging Method of Buildings

    Directory of Open Access Journals (Sweden)

    Wang Jian-feng

    2015-12-01

    Full Text Available Circular Synthetic Aperture Radar (CSAR) can obtain the entire scattering properties of targets because of its great ability of 360° observation. In this study, an optimal-orientation CSAR imaging algorithm for buildings is proposed by applying a combination of coherent and incoherent processing techniques. FEKO software is used to construct the electromagnetic scattering models and simulate the radar echo. The FEKO imaging results are compared with the isotropic scattering results, and from this comparison the optimal azimuth coherent accumulation angle for CSAR imaging of buildings is obtained. In practice, the scattering directions of buildings are unknown; therefore, we divide the 360° CSAR echo into many overlapping small-angle echoes corresponding to sub-apertures and then perform an imaging procedure on each sub-aperture. The sub-aperture imaging results are used to obtain the all-around image using incoherent fusion techniques. A polarimetric decomposition method is used to decompose the all-around image and successfully retrieve the edge information of the buildings. The proposed method is validated with P-band airborne CSAR data from Sichuan, China.

  6. Optimization strategies for discrete multi-material stiffness optimization

    DEFF Research Database (Denmark)

    Hvejsel, Christian Frier; Lund, Erik; Stolpe, Mathias

    2011-01-01

    The design of composite laminated lay-ups is formulated as a discrete multi-material selection problem. The design problem can be modeled as a non-convex mixed-integer optimization problem. Such problems are in general only solvable to global optimality for small to moderate sized problems. To attack larger problem instances we formulate convex and non-convex continuous relaxations which can be solved using gradient based optimization algorithms. The convex relaxation yields a lower bound on the attainable performance. The optimal solution to the convex relaxation is used as a starting guess...

  7. Developing an Integrated Design Strategy for Chip Layout Optimization

    NARCIS (Netherlands)

    Wits, Wessel Willems; Jauregui Becker, Juan Manuel; van Vliet, Frank Edward; te Riele, G.J.

    2011-01-01

    This paper presents an integrated design strategy for chip layout optimization. The strategy couples both electric and thermal aspects during the conceptual design phase to improve chip performances; thermal management being one of the major topics. The layout of the chip circuitry is optimized acco

  8. How do I get an optimal image?

    Directory of Open Access Journals (Sweden)

    Anil Kumar H

    2009-01-01

    Full Text Available Trans-esophageal echocardiography (TEE) is fast becoming an indispensable monitoring and diagnostic modality in cardiac operating rooms. Its convenience and dependability for making important and crucial decisions intra-operatively, during cardiac operative procedures, make it one of the most useful weapons in a cardiac anesthesiologist's armory. But to make reliable inferences based on intra-operative TEE, creation and development of a proper image is one of the fundamental requirements. Image quality can be affected by factors like patient anatomy, the quality of the ultrasound system, and the skill of the echocardiographer. Since the first two cannot be changed in most cases, we will have to work on the third factor to optimize image quality. A working knowledge of the physics of ultrasound imaging and sufficient familiarity with the various knobs and controls on the machine will go a long way in helping one acquire an optimum image.

  9. Molecular imaging: current status and emerging strategies

    Energy Technology Data Exchange (ETDEWEB)

    Pysz, M.A. [Department of Radiology, Molecular Imaging Program at Stanford, Stanford University School of Medicine, Stanford, CA (United States); Gambhir, S.S. [Department of Radiology, Molecular Imaging Program at Stanford, Stanford University School of Medicine, Stanford, CA (United States); Departments of Bioengineering and Materials Science and Engineering, Stanford University, Stanford, CA (United States); Willmann, J.K., E-mail: willmann@stanford.ed [Department of Radiology, Molecular Imaging Program at Stanford, Stanford University School of Medicine, Stanford, CA (United States)

    2010-07-15

    In vivo molecular imaging has a great potential to impact medicine by detecting diseases in early stages (screening), identifying extent of disease, selecting disease- and patient-specific treatment (personalized medicine), applying a directed or targeted therapy, and measuring molecular-specific effects of treatment. Current clinical molecular imaging approaches primarily use positron-emission tomography (PET) or single photon-emission computed tomography (SPECT)-based techniques. In ongoing preclinical research, novel molecular targets of different diseases are identified and, sophisticated and multifunctional contrast agents for imaging these molecular targets are developed along with new technologies and instrumentation for multi-modality molecular imaging. Contrast-enhanced molecular ultrasound (US) with molecularly-targeted contrast microbubbles is explored as a clinically translatable molecular imaging strategy for screening, diagnosing, and monitoring diseases at the molecular level. Optical imaging with fluorescent molecular probes and US imaging with molecularly-targeted microbubbles are attractive strategies as they provide real-time imaging, are relatively inexpensive, produce images with high spatial resolution, and do not involve exposure to ionizing irradiation. Raman spectroscopy/microscopy has emerged as a molecular optical imaging strategy for ultrasensitive detection of multiple biomolecules/biochemicals with both in vivo and ex vivo versatility. Photoacoustic imaging is a hybrid of optical and US techniques involving optically-excitable molecularly-targeted contrast agents and quantitative detection of resulting oscillatory contrast agent movement with US. Current preclinical findings and advances in instrumentation, such as endoscopes and microcatheters, suggest that these molecular imaging methods have numerous potential clinical applications and will be translated into clinical use in the near future.

  10. Optimizing metapopulation sustainability through a checkerboard strategy.

    Science.gov (United States)

    Zion, Yossi Ben; Yaari, Gur; Shnerb, Nadav M

    2010-01-22

    The persistence of a spatially structured population is determined by the rate of dispersal among habitat patches. If the local dynamic at the subpopulation level is extinction-prone, the system viability is maximal at intermediate connectivity where recolonization is allowed, but full synchronization that enables correlated extinction is forbidden. Here we developed and used an algorithm for agent-based simulations in order to study the persistence of a stochastic metapopulation. The effect of noise is shown to be dramatic, and the dynamics of the spatial population differs substantially from the predictions of deterministic models. This has been validated for the stochastic versions of the logistic map, the Ricker map and the Nicholson-Bailey host-parasitoid system. To analyze the possibility of extinction, previous studies were focused on the attractiveness (Lyapunov exponent) of stable solutions and the structure of their basin of attraction (dependence on initial population size). Our results suggest that these features are of secondary importance in the presence of stochasticity. Instead, optimal sustainability is achieved when decoherence is maximal. Individual-based simulations of metapopulations of different sizes, dimensions and noise types, show that the system's lifetime peaks when it displays checkerboard spatial patterns. This conclusion is supported by the results of a recently published Drosophila experiment. The checkerboard strategy provides a technique for the manipulation of migration rates (e.g., by constructing corridors) in order to affect the persistence of a metapopulation. It may be used in order to minimize the risk of extinction of an endangered species, or to maximize the efficiency of an eradication campaign.
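
    A toy individual-based flavour of this question can be written in a few lines: stochastic Ricker dynamics on a ring of patches coupled by a migration rate, with the time to global extinction estimated by simulation. The parameter values below are invented and the model is far cruder than the paper's agent-based simulations and checkerboard analysis; it only illustrates how persistence can be probed as a function of connectivity.

```python
import numpy as np

def extinction_time(m, n_patches=16, r=2.5, K=20.0, sigma=0.6,
                    t_max=2000, rng=None):
    """Time until all patches are empty, for migration rate m on a ring."""
    rng = rng or np.random.default_rng()
    x = np.full(n_patches, K)
    for t in range(1, t_max + 1):
        # stochastic Ricker growth with lognormal environmental noise
        x = x * np.exp(r * (1.0 - x / K) + sigma * rng.standard_normal(n_patches))
        # nearest-neighbour migration on a ring
        x = (1.0 - m) * x + 0.5 * m * (np.roll(x, 1) + np.roll(x, -1))
        x[x < 0.5] = 0.0                      # local extinction threshold
        if x.sum() == 0.0:
            return t
    return t_max

rng = np.random.default_rng(3)
for m in (0.0, 0.05, 0.2, 0.8):
    times = [extinction_time(m, rng=rng) for _ in range(20)]
    print(f"migration {m:4.2f}: mean extinction time {np.mean(times):7.1f}")
```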

  11. Optimizing metapopulation sustainability through a checkerboard strategy.

    Directory of Open Access Journals (Sweden)

    Yossi Ben Zion

    2010-01-01

    Full Text Available The persistence of a spatially structured population is determined by the rate of dispersal among habitat patches. If the local dynamic at the subpopulation level is extinction-prone, the system viability is maximal at intermediate connectivity where recolonization is allowed, but full synchronization that enables correlated extinction is forbidden. Here we developed and used an algorithm for agent-based simulations in order to study the persistence of a stochastic metapopulation. The effect of noise is shown to be dramatic, and the dynamics of the spatial population differs substantially from the predictions of deterministic models. This has been validated for the stochastic versions of the logistic map, the Ricker map and the Nicholson-Bailey host-parasitoid system. To analyze the possibility of extinction, previous studies were focused on the attractiveness (Lyapunov exponent) of stable solutions and the structure of their basin of attraction (dependence on initial population size). Our results suggest that these features are of secondary importance in the presence of stochasticity. Instead, optimal sustainability is achieved when decoherence is maximal. Individual-based simulations of metapopulations of different sizes, dimensions and noise types, show that the system's lifetime peaks when it displays checkerboard spatial patterns. This conclusion is supported by the results of a recently published Drosophila experiment. The checkerboard strategy provides a technique for the manipulation of migration rates (e.g., by constructing corridors) in order to affect the persistence of a metapopulation. It may be used in order to minimize the risk of extinction of an endangered species, or to maximize the efficiency of an eradication campaign.

  12. Multi-Criteria Optimization for Image Guidance

    CERN Document Server

    Winey, Brian

    2011-01-01

    Purpose: To develop a multi-criteria optimization framework for image guided radiotherapy. Methods: An algorithm is proposed for a multi-criteria framework for patient setup verification decision processes. Optimal patient setup shifts and rotations are not always straightforward, particularly for deformable or moving targets of the spine, abdomen, thorax, breast, head and neck, and limbs. The algorithm relies upon dosimetric constraints and objectives to aid in the patient setup such that the patient is set up to maximize tumor dose coverage and minimize dose to organs at risk while allowing for daily clinical changes. A simple 1D model and a lung lesion are presented. Results: The algorithm delivers a multi-criteria optimization framework that allows for clinical decisions when patient target variations make setup decisions less straightforward. With dosimetric considerations, optimal patient positions can be derived. Conclusions: A multi-criteria framework is demonstrated to aid in the p...

  13. Research on Design Optimization Strategy in Virtual Product Development

    Institute of Scientific and Technical Information of China (English)

    潘军; 韩帮军; 范秀敏; 马登哲

    2004-01-01

    Simulation and optimization are the key points of virtual product development (VPD). Traditional engineering simulation software and optimization methods are inadequate for analyzing such optimization problems because of their computational inefficiency. A systematic design optimization strategy using statistical methods and mathematical optimization technologies is proposed. This method extends design of experiments (DOE) and simulation metamodel technologies. Metamodels are built in place of detailed simulation codes based on an effective DOE, and are then linked to optimization routines for fast analysis, or serve as a bridge for integrating simulation software across different domains. A design optimization of a composite material structure is used to demonstrate the newly introduced methodology.
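
    The DOE-plus-metamodel workflow can be sketched as follows: sample an "expensive" simulation on a small design of experiments, fit a quadratic response-surface metamodel by least squares, and optimize the metamodel instead of the simulation. The simulation here is a stand-in analytic function, and the random DOE is only a placeholder for a proper space-filling design.

```python
import numpy as np
from scipy.optimize import minimize

def expensive_simulation(x):                  # stand-in for the FEA/CFD code
    return (x[0] - 0.3) ** 2 + 2.0 * (x[1] + 0.2) ** 2 + 0.1 * x[0] * x[1]

def quad_features(X):
    """Quadratic response-surface basis in two design variables."""
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])

rng = np.random.default_rng(0)
X_doe = rng.uniform(-1.0, 1.0, size=(15, 2))          # simple random DOE
y_doe = np.array([expensive_simulation(x) for x in X_doe])
beta, *_ = np.linalg.lstsq(quad_features(X_doe), y_doe, rcond=None)

metamodel = lambda x: float(quad_features(np.atleast_2d(x)) @ beta)
res = minimize(metamodel, x0=[0.0, 0.0], bounds=[(-1.0, 1.0), (-1.0, 1.0)])
print("metamodel optimum:", np.round(res.x, 3),
      "| true objective there:", round(expensive_simulation(res.x), 4))
```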

  14. Optimal vaccination strategies and rational behaviour in seasonal epidemics.

    Science.gov (United States)

    Doutor, Paulo; Rodrigues, Paula; Soares, Maria do Céu; Chalub, Fabio A C C

    2016-12-01

    We consider a SIRS model with time dependent transmission rate. We assume time dependent vaccination which confers the same immunity as natural infection. We study two types of vaccination strategies: (i) optimal vaccination, in the sense that it minimizes the effort of vaccination in the set of vaccination strategies for which, for any sufficiently small perturbation of the disease free state, the number of infectious individuals is monotonically decreasing; (ii) Nash-equilibria strategies where all individuals simultaneously minimize the joint risk of vaccination versus the risk of the disease. The former case corresponds to an optimal solution for mandatory vaccinations, while the second corresponds to the equilibrium to be expected if vaccination is fully voluntary. We are able to show the existence of both optimal and Nash strategies in a general setting. In general, these strategies will not be functions but Radon measures. For specific forms of the transmission rate, we provide explicit formulas for the optimal and the Nash vaccination strategies.
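
    For concreteness, a minimal SIRS integration sketch with a seasonal transmission rate and a constant per-capita vaccination rate is given below; vaccination moves susceptibles directly to the recovered class, so vaccine immunity equals natural immunity as in the model above. The parameter values and the constant vaccination profile are illustrative only; the paper's optimal and Nash strategies are measures obtained analytically, not simulated here.

```python
import numpy as np
from scipy.integrate import solve_ivp

gamma, delta = 26.0, 1.0            # recovery (2-week infection) and immunity loss (per year)
beta = lambda t: 120.0 * (1.0 + 0.3 * np.cos(2.0 * np.pi * t))   # seasonal transmission

def sirs(t, y, vacc_rate):
    S, I, R = y
    dS = -beta(t) * S * I - vacc_rate * S + delta * R
    dI = beta(t) * S * I - gamma * I
    dR = gamma * I + vacc_rate * S - delta * R
    return [dS, dI, dR]

y0 = [0.79, 0.01, 0.20]
for v in (0.0, 2.0):                # per-capita vaccination rate (per year)
    sol = solve_ivp(sirs, (0.0, 10.0), y0, args=(v,), rtol=1e-8, atol=1e-10)
    S, I, R = sol.y[:, -1]
    print(f"vaccination rate {v}: infectious fraction after 10 years = {I:.5f}")
```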

  15. Estimation of optimal feeding strategies for fed-batch bioprocesses.

    Science.gov (United States)

    Franco-Lara, Ezequiel; Weuster-Botz, Dirk

    2005-07-01

    A generic methodology for feeding strategy optimization is presented. This approach uses a genetic algorithm to search for optimal feeding profiles represented by means of artificial neural networks (ANN). Exemplified on a fed-batch hybridoma cell cultivation, the approach has proven to be able to cope with complex optimization tasks handling intricate constraints and objective functions. Furthermore, the performance of the method is compared with other previously reported standard techniques like: (1) optimal control theory, (2) first order conjugate gradient, (3) dynamical programming, (4) extended evolutionary strategies. The methodology presents no restrictions concerning the number or complexity of the state variables and therefore constitutes a remarkable alternative for process development and optimization.

  16. An optimal replication strategy for data grid systems

    Institute of Scientific and Technical Information of China (English)

    JIANG Jianjin; YANG Guangwen

    2007-01-01

    Data access latency is an important metric of system performance in a data grid. By means of an efficient replication strategy, the amount of data transferred in a wide area network will decrease, and the average access latency of data will ultimately decrease. The motivation of our research is to solve the optimized replica distribution problem in a data grid; that is, the system should utilize multiple replicas for each data item, under storage constraints, to minimize the average access latency of data. This paper proposes a model of the replication strategy in a federated data grid and gives the optimized solution. The analysis and simulation results show that the optimized replication strategy proposed in this paper is superior to the LRU caching strategy, uniform replication strategy, proportional replication strategy, and square root replication strategy in terms of wide area network bandwidth requirements and the average access latency of data.

  17. An optimal tuning strategy for tidal turbines

    Science.gov (United States)

    Vennell, Ross

    2016-11-01

    Tuning wind and tidal turbines is critical to maximizing their power output. Adopting a wind turbine tuning strategy of maximizing the output at any given time is shown to be an extremely poor strategy for large arrays of tidal turbines in channels. This 'impatient-tuning strategy' results in far lower power output, much higher structural loads and greater environmental impacts due to flow reduction than an existing 'patient-tuning strategy' which maximizes the power output averaged over the tidal cycle. This paper presents a 'smart patient tuning strategy', which can increase array output by up to 35% over the existing strategy. This smart strategy forgoes some power generation early in the half tidal cycle in order to allow stronger flows to develop later in the cycle. It extracts enough power from these stronger flows to produce more power from the cycle as a whole than the existing strategy. Surprisingly, the smart strategy can often extract more power without increasing maximum structural loads on the turbines, while also maintaining stronger flows along the channel. This paper also shows that, counterintuitively, for some tuning strategies imposing a cap on turbine power output to limit loads can increase a turbine's average power output.

  19. Stable and Robust Sampling Strategies for Compressive Imaging.

    Science.gov (United States)

    Krahmer, Felix; Ward, Rachel

    2014-02-01

    In many signal processing applications, one wishes to acquire images that are sparse in transform domains such as spatial finite differences or wavelets using frequency domain samples. For such applications, overwhelming empirical evidence suggests that superior image reconstruction can be obtained through variable density sampling strategies that concentrate on lower frequencies. The wavelet and Fourier transform domains are not incoherent because low-order wavelets and low-order frequencies are correlated, so compressive sensing theory does not immediately imply sampling strategies and reconstruction guarantees. In this paper, we turn to a more refined notion of coherence-the so-called local coherence-measuring for each sensing vector separately how correlated it is to the sparsity basis. For Fourier measurements and Haar wavelet sparsity, the local coherence can be controlled and bounded explicitly, so for matrices comprised of frequencies sampled from a suitable inverse square power-law density, we can prove the restricted isometry property with near-optimal embedding dimensions. Consequently, the variable-density sampling strategy we provide allows for image reconstructions that are stable to sparsity defects and robust to measurement noise. Our results cover both reconstruction by ℓ1-minimization and total variation minimization. The local coherence framework developed in this paper should be of independent interest, as it implies that for optimal sparse recovery results, it suffices to have bounded average coherence from sensing basis to sparsity basis-as opposed to bounded maximal coherence-as long as the sampling strategy is adapted accordingly.
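    As a small, hedged illustration of the variable-density idea described above (not the authors' code), the sketch below draws a set of 2-D Fourier sample locations with probability proportional to an inverse-square power-law density, so low frequencies are sampled much more densely than high ones; the image size, sample budget and exact density constant are assumptions.

        import numpy as np

        rng = np.random.default_rng(1)

        def variable_density_mask(n=256, m=6000):
            """Pick m of the n*n Fourier samples with probability ~ 1/(kx^2 + ky^2 + 1)."""
            k = np.fft.fftfreq(n) * n                     # integer frequency indices
            kx, ky = np.meshgrid(k, k, indexing="ij")
            density = 1.0 / (kx ** 2 + ky ** 2 + 1.0)     # inverse-square power law
            p = (density / density.sum()).ravel()
            idx = rng.choice(n * n, size=m, replace=False, p=p)
            mask = np.zeros(n * n, dtype=bool)
            mask[idx] = True
            return mask.reshape(n, n)

        mask = variable_density_mask()
        k = np.fft.fftfreq(256) * 256
        kx, ky = np.meshgrid(k, k, indexing="ij")
        print("overall sampling rate:", round(mask.mean(), 3))
        print("sampling rate for |k| < 16:", round(mask[kx ** 2 + ky ** 2 < 16 ** 2].mean(), 3))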

  20. Under-Exposed Image Enhancement Based on Relaxed Luminance Optimization

    National Research Council Canada - National Science Library

    Chunxiao Liu; Feng Yang

    2013-01-01

    ... optimization based under-exposed image clearness enhancement algorithm, which treats it as the simultaneous augmentation of luminance and contrast, and combines them in an optimization framework under...

  1. Optimal relocation strategies for spatially mobile consumers

    CERN Document Server

    Iordanov, Iordan

    2007-01-01

    We develop a model of the behaviour of a dynamically optimizing economic agent who makes consumption-saving and spatial relocation decisions. We formulate an existence result for the model, derive the necessary conditions for optimality and study the behaviour of the economic agent, focusing on the case of a wage distribution with a single maximum.

  2. Strategies for Optimal Design of Structural Systems

    DEFF Research Database (Denmark)

    Enevoldsen, I.; Sørensen, John Dalsgaard

    1992-01-01

    Reliability-based design of structural systems is considered. Especially systems where the reliability model is a series system of parallel systems are analysed. A sensitivity analysis for this class of problems is presented. Direct and sequential optimization procedures to solve the optimization problems are described. Numerical tests indicate that a sequential technique called the bounds iteration method (BIM) is particularly fast and stable.

  3. Cloud Optimized Image Format and Compression

    Science.gov (United States)

    Becker, P.; Plesea, L.; Maurer, T.

    2015-04-01

    Cloud based image storage and processing requires re-evaluation of formats and processing methods. For the true value of the massive volumes of earth observation data to be realized, the image data needs to be accessible from the cloud. Traditional file formats such as TIF and NITF were developed in the heyday of the desktop and assumed fast low latency file access. Other formats such as JPEG2000 provide for streaming protocols for pixel data, but still require a server to have file access. These concepts no longer truly hold in cloud based elastic storage and computation environments. This paper will provide details of a newly evolving image storage format (MRF) and compression that is optimized for cloud environments. Although the cost of storage continues to fall for large data volumes, there is still significant value in compression. For imagery data to be used in analysis and exploit the extended dynamic range of the new sensors, lossless or controlled lossy compression is of high value. Compression decreases the data volumes stored and reduces the data transferred, but the reduced data size must be balanced with the CPU required to decompress. The paper also outlines a new compression algorithm (LERC) for imagery and elevation data that optimizes this balance. Advantages of the compression include its simple-to-implement algorithm that enables it to be efficiently accessed using JavaScript. Combining this new cloud based image storage format and compression will help resolve some of the challenges of big image data on the internet.

  4. Strategy of image management in retail shops

    Directory of Open Access Journals (Sweden)

    Sandra Soče Kraljević

    2007-12-01

    Full Text Available A sound positioning in consumers’ minds, along with strong promotion support, brought many retail shops to the top. This is mostly thanks to the image created in the consumers’ minds. A retail shop’s image may, but need not, conform to reality. Image often looks like a cliché. It overstates certain elements of the shop while simply omitting others. That is exactly why image is of great importance and often crucial to consumer behavior. This paper aims at determining the impact of image on customer behavior in the course of decision making about shopping and choosing a particular retail shop. Image is a significant factor in the success of every company, hence also of retail shops. It is a relatively strong value and a component of creating competitive advantage. But if we do not pay sufficient attention to image, it can become counterproductive. Instead of acting as an added value that helps create and maintain competitive advantage and achieve business aims, it turns into a limiting factor. Therefore, it is imperative to identify the elements of image that are of greatest importance to customers. Research has shown that customers choose the retail shop first and only after that the products and brands within this shop. When it comes to the supermarket, as a kind of retail shop, research has shown that two out of three shopping decisions are made by the customer on the spot, that is, without previous planning. That practically means that we can influence customers with different sales techniques. The paper suggests different strategies of image management for supermarkets and conventional shops. For supermarkets it is the “widest assortment” strategy, while for conventional shops the strategy is that of a “selected group of products”. Improvements to research methods will enable getting more information about customer behavior, while pressures of increased competition in the business environment will force retailers to get

  5. Strategies in tower solar power plant optimization

    Science.gov (United States)

    Ramos, A.; Ramos, F.

    2012-09-01

    A method for optimizing a central receiver solar thermal electric power plant is studied. We parametrize the plant design as a function of eleven design variables and reduce the problem of finding optimal designs to the numerical problem of finding the minimum of a function of several variables. This minimization problem is attacked with different algorithms both local and global in nature. We find that all algorithms find the same minimum of the objective function. The performance of each of the algorithms and the resulting designs are studied for two typical cases. We describe a method to evaluate the impact of design variables in the plant performance. This method will tell us what variables are key to the optimal plant design and which ones are less important. This information can be used to further improve the plant design and to accelerate the optimization procedure.

  6. Strategies in tower solar power plant optimization

    CERN Document Server

    Ramos, A

    2012-01-01

    A method for optimizing a central receiver solar thermal electric power plant is studied. We parametrize the plant design as a function of eleven design variables and reduce the problem of finding optimal designs to the numerical problem of finding the minimum of a function of several variables. This minimization problem is attacked with different algorithms both local and global in nature. We find that all algorithms find the same minimum of the objective function. The performance of each of the algorithms and the resulting designs are studied for two typical cases. We describe a method to evaluate the impact of design variables in the plant performance. This method will tell us what variables are key to the optimal plant design and which ones are less important. This information can be used to further improve the plant design and to accelerate the optimization procedure.

  7. Optimization Under Uncertainty for Wake Steering Strategies

    Energy Technology Data Exchange (ETDEWEB)

    Quick, Julian [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Annoni, Jennifer [National Renewable Energy Laboratory (NREL), Golden, CO (United States); King, Ryan N [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Dykes, Katherine L [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Fleming, Paul A [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-08-03

    This presentation covers the motivation for this research, optimization under the uncertainty problem formulation, a two-turbine case, the Princess Amalia Wind Farm case, and conclusions and next steps.

  8. Strategies in tower solar power plant optimization

    OpenAIRE

    RAMOS, A.; RAMOS, F.

    2012-01-01

    A method for optimizing a central receiver solar thermal electric power plant is studied. We parametrize the plant design as a function of eleven design variables and reduce the problem of finding optimal designs to the numerical problem of finding the minimum of a function of several variables. This minimization problem is attacked with different algorithms both local and global in nature. We find that all algorithms find the same minimum of the objective function. The performance of each of...

  9. An approximation based global optimization strategy for structural synthesis

    Science.gov (United States)

    Sepulveda, A. E.; Schmit, L. A.

    1991-01-01

    A global optimization strategy for structural synthesis based on approximation concepts is presented. The methodology involves the solution of a sequence of highly accurate approximate problems using a global optimization algorithm. The global optimization algorithm implemented consists of a branch and bound strategy based on the interval evaluation of the objective function and constraint functions, combined with a local feasible directions algorithm. The approximate design optimization problems are constructed using first order approximations of selected intermediate response quantities in terms of intermediate design variables. Some numerical results for example problems are presented to illustrate the efficacy of the design procedure set forth.
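    The record mentions a branch and bound strategy driven by interval evaluation of the objective. The following sketch is a minimal one-dimensional illustration of that mechanism (it is not the authors' structural-synthesis code): the natural interval extension of a fixed polynomial is written by hand, and boxes whose lower bound cannot beat the incumbent are pruned.

        import heapq

        def f(x):                                  # objective: f(x) = x^4 - 3x^3 + 2
            return x ** 4 - 3 * x ** 3 + 2

        def interval_pow(lo, hi, n):               # bounds of x**n over [lo, hi]
            cands = [lo ** n, hi ** n] + ([0.0] if lo < 0 < hi and n % 2 == 0 else [])
            return min(cands), max(cands)

        def f_bounds(lo, hi):                      # natural interval extension of f
            a4, b4 = interval_pow(lo, hi, 4)
            a3, b3 = interval_pow(lo, hi, 3)
            return a4 - 3 * b3 + 2, b4 - 3 * a3 + 2

        def branch_and_bound(lo, hi, tol=1e-6):
            best = min(f(lo), f(hi))                       # incumbent (upper bound)
            heap = [(f_bounds(lo, hi)[0], lo, hi)]         # boxes keyed by lower bound
            while heap:
                lb, a, b = heapq.heappop(heap)
                if lb > best - tol:                        # box cannot contain an improvement
                    continue
                m = 0.5 * (a + b)
                best = min(best, f(m))                     # update incumbent at the midpoint
                for a2, b2 in ((a, m), (m, b)):            # bisect the box
                    lb2 = f_bounds(a2, b2)[0]
                    if lb2 < best - tol and b2 - a2 > 1e-9:
                        heapq.heappush(heap, (lb2, a2, b2))
            return best

        print("global minimum of f on [-1, 4] ~", round(branch_and_bound(-1.0, 4.0), 6))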

  10. Optimal Approach to SAR Image Despeckling

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Speckle filtering of synthetic aperture radar (SAR) images while preserving the spatial signal variability (texture and fine structures) still remains a challenge. Many algorithms have been proposed for SAR imagery despeckling. However, the simulated annealing (SA) method is currently one of the best choices. A critical problem in the study of SA is to provide appropriate cooling schedules that ensure fast convergence to near-optimal solutions. This paper gives a new necessary and sufficient condition for the cooling schedule so that the algorithm state converges in all probability to the set of globally minimum cost states. Moreover, it constructs an appropriate objective function for SAR image despeckling. An experimental result on actual SAR image processing is presented.
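    To make the role of the cooling schedule concrete, here is a small, hedged sketch (not the paper's algorithm or objective) that runs simulated annealing with a slow logarithmic-type schedule on a toy despeckling energy combining data fidelity and a smoothness prior for a synthetic multiplicative-noise image; every constant is an illustrative assumption.

        import numpy as np

        rng = np.random.default_rng(2)

        # Toy "SAR-like" image: piecewise-constant scene times multiplicative speckle.
        scene = np.ones((32, 32))
        scene[8:24, 8:24] = 3.0
        speckled = scene * rng.gamma(shape=4.0, scale=0.25, size=scene.shape)

        def energy(u, y, lam=2.0):
            """Data fidelity plus an anisotropic smoothness prior."""
            fidelity = np.sum((u - y) ** 2)
            smooth = np.sum(np.abs(np.diff(u, axis=0))) + np.sum(np.abs(np.diff(u, axis=1)))
            return fidelity + lam * smooth

        def simulated_annealing(y, n_iter=20_000, T0=2.0):
            u, E = y.copy(), energy(y, y)
            for k in range(1, n_iter + 1):
                T = T0 / np.log(1.0 + k)                   # slow, log-type cooling schedule
                i, j = rng.integers(0, y.shape[0]), rng.integers(0, y.shape[1])
                old = u[i, j]
                u[i, j] = old + rng.normal(scale=0.2)      # local random perturbation
                E_new = energy(u, y)
                if E_new > E and rng.random() > np.exp(-(E_new - E) / T):
                    u[i, j] = old                          # reject the uphill move
                else:
                    E = E_new                              # accept (Metropolis rule)
            return u

        restored = simulated_annealing(speckled)
        print("energy before:", round(energy(speckled, speckled), 1),
              " after:", round(energy(restored, speckled), 1))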

  11. Optimizing the 3R study strategy to learn from text

    NARCIS (Netherlands)

    Reijners, Pauline; Kester, Liesbeth; Wetzels, Sandra; Kirschner, Paul A.

    2013-01-01

    Reijners, P. B. G., Kester, L., Wetzels, S. A. J., & Kirschner, P. A. (2013, 29 May). Optimizing the 3R study strategy to learn from text. Presentation at plenary meeting Learning & Cognition, Heerlen, The Netherlands.

  12. Optimizing the 3R study strategy to learn from text

    NARCIS (Netherlands)

    Reijners, Pauline; Kester, Liesbeth; Wetzels, Sandra; Kirschner, Paul A.

    2012-01-01

    Reijners, P. B. G., Kester, L., Wetzels, S. A. J., & Kirschner, P. A. (2012, 21 November). Optimizing the 3R study strategy to learn from text. Presentation at research meeting Educational and Developmental Psychology, Erasmus University, Rotterdam, The Netherlands.

  13. Optimizing the 3R study strategy to learn from text

    NARCIS (Netherlands)

    Reijners, Pauline; Kester, Liesbeth; Wetzels, Sandra; Kirschner, Paul A.

    2013-01-01

    Reijners, P. B. G., Kester, L., Wetzels, S. A. J., & Kirschner, P. A. (2013, 7 November). Optimizing the 3R study strategy to learn from text. Paper presented at the ICO National Fall School, Maastricht, The Netherlands.

  14. The construction of optimal hedging portfolio strategies of an investor

    African Journals Online (AJOL)

    We categorised the investor's portfolio into two parts: the initial investment and the capital gain ... We will also describe the dynamics of our stock price using a binomial lattice model ... equation to derive the optimal values of our trading strategies.

  15. Strategy optimization for controlled Markov process with descriptive complexity constraint

    Institute of Scientific and Technical Information of China (English)

    JIA QingShan; ZHAO QianChuan

    2009-01-01

    Due to various advantages in storage and implementation, simple strategies are usually preferred over complex strategies when their performances are close. Strategy optimization for controlled Markov processes with a descriptive complexity constraint provides a general framework for many such problems. In this paper, we first show by examples that the descriptive complexity and the performance of a strategy can be independent, and use the F-matrix in the No-Free-Lunch Theorem to show the risk that approximating complex strategies may lead to simple strategies that are unboundedly worse in cardinal performance than the original complex strategies. We then develop a method that handles the descriptive complexity constraint directly, which describes simple strategies exactly and only approximates complex strategies during the optimization. The ordinal performance difference between the resulting strategies of this selective approximation method and the global optimum is quantified. Numerical examples on an engine maintenance problem show how this method improves the solution quality. We hope this work sheds some insight into solving general strategy optimization for controlled Markov processes with a descriptive complexity constraint.

  16. Long-Run Savings and Investment Strategy Optimization

    Directory of Open Access Journals (Sweden)

    Russell Gerrard

    2014-01-01

    Full Text Available We focus on automatic strategies to optimize life cycle savings and investment. Classical optimal savings theory establishes that, given the level of risk aversion, a saver would keep the same relative amount invested in risky assets at any given time. We show that, when optimizing lifecycle investment, performance and risk assessment have to take into account the investor’s risk aversion and the maximum amount the investor could lose, simultaneously. When risk aversion and maximum possible loss are considered jointly, an optimal savings strategy is obtained, which follows from constant absolute rather than constant relative risk aversion. This result is fundamental to prove that if risk aversion and the maximum possible loss are both high, then holding a constant amount invested in the risky asset is optimal for a standard lifetime saving/pension process and outperforms some other simple strategies. Performance comparisons are based on downside risk-adjusted equivalence that is used in our illustration.
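    As a hedged numerical illustration of the comparison the record describes (not the authors' model), the Monte Carlo sketch below contrasts holding a constant absolute amount in the risky asset with holding a constant proportion of wealth; the return parameters, horizon and amounts are arbitrary assumptions.

        import numpy as np

        rng = np.random.default_rng(3)

        def simulate(strategy, years=30, n_paths=20_000, w0=100.0,
                     mu=0.05, sigma=0.18, r=0.02, amount=40.0, prop=0.4):
            """Terminal wealth under two simple lifecycle investment strategies."""
            w = np.full(n_paths, w0)
            for _ in range(years):
                z = rng.normal(size=n_paths)
                risky_ret = np.exp(mu - 0.5 * sigma ** 2 + sigma * z) - 1.0
                if strategy == "constant_amount":
                    risky = np.minimum(amount, w)          # never invest more than current wealth
                else:                                      # "constant_proportion"
                    risky = prop * w
                w = w + risky * risky_ret + (w - risky) * (np.exp(r) - 1.0)
            return w

        for s in ("constant_amount", "constant_proportion"):
            w = simulate(s)
            print(f"{s:20s} mean={w.mean():7.1f}  5th percentile={np.percentile(w, 5):7.1f}")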

  17. Power consumption optimization strategy for wireless networks

    DEFF Research Database (Denmark)

    Cornean, Horia; Kumar, Sanjay; Marchetti, Nicola

    2011-01-01

    in order to reduce the total power consumption in a multi cellular network. We present an algorithm for power optimization under no interference and in presence of interference conditions, targeting to maximize the network capacity. The convergence of the algorithm is guaranteed if the interference...

  18. Optimal Rate Based Image Transmission Scheme in Multi-rate Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Mr. Jayachandran.A ,

    2011-06-01

    Full Text Available In image transmission applications over WSN, energy efficiency and image quality are both important factors for joint optimization. The transmission of large images causes bottlenecks in WSN due to the limited energy resources and network capacity. Since some sensors are in similar viewing directions, the images they capture are likely to exhibit a certain level of correlation among themselves. This optimization scheme allows each image sensor to transmit optimal functions of the overlapped images through appropriate multiple rate oriented routing paths. Moreover, we use unused segment loss protection with erasure codes of different strength to maximize the expected quality at the destination and propose a fast algorithm that finds nearly optimal transmission strategies. Simulation results show that the proposed scheme achieves high energy efficiency in WSN, enhancing the image transmission quality.

  19. Optimal marker placement in hadrontherapy: intelligent optimization strategies with augmented Lagrangian pattern search.

    Science.gov (United States)

    Altomare, Cristina; Guglielmann, Raffaella; Riboldi, Marco; Bellazzi, Riccardo; Baroni, Guido

    2015-02-01

    In high precision photon radiotherapy and in hadrontherapy, it is crucial to minimize the occurrence of geometrical deviations with respect to the treatment plan in each treatment session. To this end, point-based infrared (IR) optical tracking for patient set-up quality assessment is performed. Such tracking depends on external fiducial points placement. The main purpose of our work is to propose a new algorithm based on simulated annealing and augmented Lagrangian pattern search (SAPS), which is able to take into account prior knowledge, such as spatial constraints, during the optimization process. The SAPS algorithm was tested on data related to head and neck and pelvic cancer patients, and that were fitted with external surface markers for IR optical tracking applied for patient set-up preliminary correction. The integrated algorithm was tested considering optimality measures obtained with Computed Tomography (CT) images (i.e. the ratio between the so-called target registration error and fiducial registration error, TRE/FRE) and assessing the marker spatial distribution. Comparison has been performed with randomly selected marker configuration and with the GETS algorithm (Genetic Evolutionary Taboo Search), also taking into account the presence of organs at risk. The results obtained with SAPS highlight improvements with respect to the other approaches: (i) TRE/FRE ratio decreases; (ii) marker distribution satisfies both marker visibility and spatial constraints. We have also investigated how the TRE/FRE ratio is influenced by the number of markers, obtaining significant TRE/FRE reduction with respect to the random configurations, when a high number of markers is used. The SAPS algorithm is a valuable strategy for fiducial configuration optimization in IR optical tracking applied for patient set-up error detection and correction in radiation therapy, showing that taking into account prior knowledge is valuable in this optimization process. Further work will be

  20. Optimal Dynamic Advertising Strategy Under Age-Specific Market Segmentation

    Science.gov (United States)

    Krastev, Vladimir

    2011-12-01

    We consider the model proposed by Faggian and Grosset for determining the advertising efforts and goodwill in the long run of a company under age segmentation of consumers. Reducing this model to optimal control subproblems, we find the optimal advertising strategy and goodwill.

  1. Existence of optimal consumption strategies in markets with longevity risk

    NARCIS (Netherlands)

    de Kort, Jan; Vellekoop, M.H.

    2017-01-01

    Survival bonds are financial instruments with a payoff that depends on human mortality rates. In markets that contain such bonds, agents optimizing expected utility of consumption and terminal wealth can mitigate their longevity risk. To examine how this influences optimal portfolio strategies and c

  2. Synthesis of Optimal Strategies Using HyTech

    DEFF Research Database (Denmark)

    Bouyer, Patricia; Cassez, Franck; Larsen, Kim Guldstrand

    2005-01-01

    Priced timed (game) automata extend timed (game) automata with costs on both locations and transitions. The problem of synthesizing an optimal winning strategy for a priced timed game under some hypotheses has been shown decidable in [P. Bouyer, F. Cassez, E. Fleury, and K.G. Larsen. Optimal...

  3. Health benefit modelling and optimization of vehicular pollution control strategies

    Science.gov (United States)

    Sonawane, Nayan V.; Patil, Rashmi S.; Sethi, Virendra

    2012-12-01

    This study asserts that the evaluation of pollution reduction strategies should be approached on the basis of health benefits. The framework presented could be used for decision making on the basis of cost effectiveness when the strategies are applied concurrently. Several vehicular pollution control strategies have been proposed in literature for effective management of urban air pollution. The effectiveness of these strategies has been mostly studied as a one at a time approach on the basis of change in pollution concentration. The adequacy and practicality of such an approach is studied in the present work. Also, the assessment of respective benefits of these strategies has been carried out when they are implemented simultaneously. An integrated model has been developed which can be used as a tool for optimal prioritization of various pollution management strategies. The model estimates health benefits associated with specific control strategies. ISC-AERMOD View has been used to provide the cause-effect relation between control options and change in ambient air quality. BenMAP, developed by U.S. EPA, has been applied for estimation of health and economic benefits associated with various management strategies. Valuation of health benefits has been done for impact indicators of premature mortality, hospital admissions and respiratory syndrome. An optimization model has been developed to maximize overall social benefits with determination of optimized percentage implementations for multiple strategies. The model has been applied for sub-urban region of Mumbai city for vehicular sector. Several control scenarios have been considered like revised emission standards, electric, CNG, LPG and hybrid vehicles. Reduction in concentration and resultant health benefits for the pollutants CO, NOx and particulate matter are estimated for different control scenarios. Finally, an optimization model has been applied to determine optimized percentage implementation of specific
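    A minimal sketch of the kind of prioritization problem described above, using a linear program that chooses percentage implementations of several control strategies to maximize total health benefit under a budget; the strategy names, benefit and cost figures are invented for illustration, and scipy's linprog is used only as a generic solver, not as the study's actual model.

        import numpy as np
        from scipy.optimize import linprog

        # Hypothetical strategies with benefit and cost per 1% implementation (illustrative).
        strategies = ["revised standards", "CNG buses", "electric vehicles", "LPG retrofits"]
        benefit_per_pct = np.array([1.2, 0.8, 1.5, 0.6])   # avoided health cost, million USD
        cost_per_pct = np.array([0.5, 0.3, 0.9, 0.2])      # implementation cost, million USD
        budget = 60.0                                      # million USD

        # Maximize total benefit  <=>  minimize -benefit, subject to budget and 0-100% bounds.
        res = linprog(c=-benefit_per_pct,
                      A_ub=cost_per_pct.reshape(1, -1), b_ub=[budget],
                      bounds=[(0, 100)] * len(strategies))

        for name, pct in zip(strategies, res.x):
            print(f"{name:18s} implement {pct:5.1f}%")
        print("total benefit:", round(-res.fun, 1), "million USD")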

  4. Mesh refinement strategy for optimal control problems

    OpenAIRE

    Paiva, Luis Tiago; Fontes, Fernando,

    2013-01-01

    Direct methods are becoming the most used technique to solve nonlinear optimal control problems. Regular time meshes having equidistant spacing are frequently used. However, in some cases these meshes cannot cope accurately with nonlinear behavior. One way to improve the solution is to select a new mesh with a greater number of nodes. Another way involves adaptive mesh refinement. In this case, the mesh nodes have non-equidistant spacing which allows a non-uniform node...

  5. Sleep As A Strategy For Optimizing Performance.

    Science.gov (United States)

    Yarnell, Angela M; Deuster, Patricia

    2016-01-01

    Recovery is an essential component of maintaining, sustaining, and optimizing cognitive and physical performance during and after demanding training and strenuous missions. Getting sufficient amounts of rest and sleep is key to recovery. This article focuses on sleep and discusses (1) why getting sufficient sleep is important, (2) how to optimize sleep, and (3) tools available to help maximize sleep-related performance. Insufficient sleep negatively impacts safety and readiness through reduced cognitive function, more accidents, and increased military friendly-fire incidents. Sufficient sleep is linked to better cognitive performance outcomes, increased vigor, and better physical and athletic performance as well as improved emotional and social functioning. Because Special Operations missions do not always allow for optimal rest or sleep, the impact of reduced rest and sleep on readiness and mission success should be minimized through appropriate preparation and planning. Preparation includes periods of "banking" or extending sleep opportunities before periods of loss, monitoring sleep by using tools like actigraphy to measure sleep and activity, assessing mental effectiveness, exploiting strategic sleep opportunities, and consuming caffeine at recommended doses to reduce fatigue during periods of loss. Together, these efforts may decrease the impact of sleep loss on mission and performance.

  6. Discuss Optimal Approaches to Learning Strategy Instruction for EFL Learners

    Institute of Scientific and Technical Information of China (English)

    邢菊如

    2009-01-01

    Numerous research studies reveal that learning strategies play an important role in language learning processes. This paper explores whether, as English teachers, we can improve students' language proficiency by giving them optimal learning strategy instruction, and which approaches are most effective and efficient.

  7. Optimal Portfolio Strategy under Rolling Economic Maximum Drawdown Constraints

    OpenAIRE

    Xiaojian Yu; Siyu Xie; Weijun Xu

    2014-01-01

    This paper deals with the problem of optimal portfolio strategy under the constraints of rolling economic maximum drawdown. A more practical strategy is developed by using rolling Sharpe ratio in computing the allocation proportion in contrast to existing models. Besides, another novel strategy named “REDP strategy” is further proposed, which replaces the rolling economic drawdown of the portfolio with the rolling economic drawdown of the risky asset. The simulation tests prove that REDP stra...

  8. On Global Optimal Sailplane Flight Strategy

    Science.gov (United States)

    Sander, G. J.; Litt, F. X.

    1979-01-01

    The derivation and interpretation of the necessary conditions that a sailplane cross-country flight has to satisfy to achieve the maximum global flight speed is considered. Simple rules are obtained for two specific meteorological models. The first one uses concentrated lifts of various strengths and unequal distance. The second one takes into account finite, nonuniform space amplitudes for the lifts and allows, therefore, for dolphin style flight. In both models, altitude constraints consisting of upper and lower limits are shown to be essential to model realistic problems. Numerical examples illustrate the difference with existing techniques based on local optimality conditions.

  9. Optimal Inspection and Maintenance Strategies for Structural Systems

    DEFF Research Database (Denmark)

    Sommer, A. M.

    The aim of this thesis is to give an overview of conventional and optimal reliability-based inspection and maintenance strategies and to examine for specific structures how the cost can be reduced and/or the safety can be improved by using optimal reliability-based inspection strategies. Furthermore, in relation to the calculations performed the intention is to modify an existing program for determination of optimal inspection strategies. The main purpose of inspection and maintenance of structural systems is to prevent or delay damage or deterioration to protect people, environment, and investments made in the structure. The inspection and maintenance should be performed so that the structural system is operating as much of the time as possible and the cost is kept at a minimum and so that the safety of the structure is satisfactory. Up till now inspection strategies have been based...

  10. Optimality of Spatially Inhomogeneous Search Strategies.

    Science.gov (United States)

    Schwarz, Karsten; Schröder, Yannick; Qu, Bin; Hoth, Markus; Rieger, Heiko

    2016-08-05

    We consider random search processes alternating stochastically between diffusion and ballistic motion, in which the distribution function of ballistic motion directions varies from point to point in space. The specific space dependence of the directional distribution together with the switching rates between the two modes of motion establishes a spatially inhomogeneous search strategy. We show that the mean first passage times for several standard search problems-narrow escape, reaction partner finding, reaction escape-can be minimized with a directional distribution that is reminiscent of the spatial organization of the cytoskeleton filaments of cells with a centrosome: radial ballistic transport from the center to the periphery and back, and ballistic transport in random directions within a concentric shell of thickness Δ_{opt} along the domain boundary. The results suggest that living cells realize efficient search strategies for various intracellular transport problems economically through a spatial cytoskeleton organization that involves radial microtubules in the central region and only a narrow actin cortex rather than a cell body filled with randomly oriented actin filaments.

  11. Optimality of Spatially Inhomogeneous Search Strategies

    Science.gov (United States)

    Schwarz, Karsten; Schröder, Yannick; Qu, Bin; Hoth, Markus; Rieger, Heiko

    2016-08-01

    We consider random search processes alternating stochastically between diffusion and ballistic motion, in which the distribution function of ballistic motion directions varies from point to point in space. The specific space dependence of the directional distribution together with the switching rates between the two modes of motion establishes a spatially inhomogeneous search strategy. We show that the mean first passage times for several standard search problems—narrow escape, reaction partner finding, reaction escape—can be minimized with a directional distribution that is reminiscent of the spatial organization of the cytoskeleton filaments of cells with a centrosome: radial ballistic transport from the center to the periphery and back, and ballistic transport in random directions within a concentric shell of thickness Δopt along the domain boundary. The results suggest that living cells realize efficient search strategies for various intracellular transport problems economically through a spatial cytoskeleton organization that involves radial microtubules in the central region and only a narrow actin cortex rather than a cell body filled with randomly oriented actin filaments.

  12. Optimal search strategies on complex networks

    CERN Document Server

    Di Patti, Francesca; Piazza, Francesco

    2014-01-01

    Complex networks are ubiquitous in nature and play a role of paramount importance in many contexts. Internet and the cyberworld, which permeate our everyday life, are self-organized hierarchical graphs. Urban traffic flows on intricate road networks, which impact both transportation design and epidemic control. In the brain, neurons are cabled through heterogeneous connections, which support the propagation of electric signals. In all these cases, the true challenge is to unveil the mechanisms through which specific dynamical features are modulated by the underlying topology of the network. Here, we consider agents randomly hopping along the links of a graph, with the additional possibility of performing long-range hops to randomly chosen disconnected nodes with a given probability. We show that an optimal combination of the two jump rules exists that maximises the efficiency of target search, the optimum reflecting the topology of the network.
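    The mechanism in this record (local hops mixed with occasional long-range jumps) can be illustrated with a few lines of simulation. The sketch below estimates mean search time on a small ring lattice as the long-range hop probability varies; the graph, sizes, and whether an interior optimum appears for this particular topology are assumptions of the illustration, not results from the paper.

        import numpy as np

        rng = np.random.default_rng(4)

        def ring_lattice(n=100, k=2):
            """Each node is linked to its k nearest neighbours on either side."""
            return {i: [(i + d) % n for d in range(-k, k + 1) if d != 0] for i in range(n)}

        def mean_search_time(p_jump, graph, n_trials=500, max_steps=100_000):
            n, steps = len(graph), []
            for _ in range(n_trials):
                node, target, t = 0, rng.integers(0, n), 0
                while node != target and t < max_steps:
                    if rng.random() < p_jump:
                        node = rng.integers(0, n)                              # long-range hop
                    else:
                        node = graph[node][rng.integers(0, len(graph[node]))]  # local hop
                    t += 1
                steps.append(t)
            return np.mean(steps)

        g = ring_lattice()
        for p in (0.0, 0.02, 0.1, 0.5, 1.0):
            print(f"p_jump={p:4.2f}  mean steps to reach target ~ {mean_search_time(p, g):7.1f}")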

  13. Optimization of energy planning strategies in municipalities

    DEFF Research Database (Denmark)

    Petersen, Jens-Phillip

    The paper evaluates the current status of community energy planning in northern Europe via a review of literature, practice and the performance of a barrier analysis for successful community energy planning. Main findings of the paper are that current community energy planning lacks a systematic approach, suffers from insufficient information, tools and resources. Municipalities are often unable to take on a steering role in community energy planning. To overcome these barriers and guide municipalities in the pre-project phase, a decision-support methodology, based on community energy profiles (CEP), is presented. The methodology was applied in a case study in Germany. With CEPs, a possibility to merge qualitative data from local settings into generic energy modelling is shown, which could contribute to improved community energy strategies.

  14. Optimal Portfolio Strategy under Rolling Economic Maximum Drawdown Constraints

    Directory of Open Access Journals (Sweden)

    Xiaojian Yu

    2014-01-01

    Full Text Available This paper deals with the problem of optimal portfolio strategy under the constraints of rolling economic maximum drawdown. A more practical strategy is developed by using rolling Sharpe ratio in computing the allocation proportion in contrast to existing models. Besides, another novel strategy named “REDP strategy” is further proposed, which replaces the rolling economic drawdown of the portfolio with the rolling economic drawdown of the risky asset. The simulation tests prove that REDP strategy can ensure the portfolio to satisfy the drawdown constraint and outperforms other strategies significantly. An empirical comparison research on the performances of different strategies is carried out by using the 23-year monthly data of SPTR, DJUBS, and 3-month T-bill. The investment cases of single risky asset and two risky assets are both studied in this paper. Empirical results indicate that the REDP strategy successfully controls the maximum drawdown within the given limit and performs best in both return and risk.
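    A hedged sketch of the central quantity in this record: the rolling economic drawdown of a wealth path over a fixed look-back window, together with a simplified rule that scales the risky allocation down as the drawdown approaches a limit. The allocation rule and all numbers are illustrative assumptions, not the paper's exact REDP formula.

        import numpy as np

        def rolling_economic_drawdown(wealth, window):
            """REDD_t = 1 - W_t / max(W_s : s in the last `window` periods)."""
            redd = np.zeros_like(wealth)
            for t in range(len(wealth)):
                peak = wealth[max(0, t - window + 1): t + 1].max()
                redd[t] = 1.0 - wealth[t] / peak
            return redd

        rng = np.random.default_rng(5)
        returns = rng.normal(0.005, 0.04, size=240)             # 20 years of monthly returns
        wealth = 100.0 * np.cumprod(1.0 + returns)

        redd = rolling_economic_drawdown(wealth, window=12)
        dd_limit = 0.2
        # Simplified rule: reduce risky exposure linearly as REDD approaches the limit.
        risky_fraction = np.clip((dd_limit - redd) / dd_limit, 0.0, 1.0)
        print("max 12-month REDD:", round(redd.max(), 3))
        print("risky fraction range:", round(risky_fraction.min(), 2), "to",
              round(risky_fraction.max(), 2))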

  15. Optimized Information Transmission Scheduling Strategy Oriented to Advanced Metering Infrastructure

    Directory of Open Access Journals (Sweden)

    Weiming Tong

    2013-01-01

    Full Text Available Advanced metering infrastructure (AMI) is considered to be the first step in constructing the smart grid. AMI allows customers to make real-time choices about power utilization and enables power utilities to increase the effectiveness of the regional power grids by managing demand load during peak times and reducing unneeded power generation. These initiatives rely heavily on prompt information transmission inside AMI. Aiming at the information transmission problem, this paper studies the communication scheduling strategy in AMI from a macroscopic view. First, the information flow of AMI is analyzed, and the power users are classified into several grades by their importance. Then, the defects of the conventional information transmission scheduling strategy are analyzed. On this basis, two optimized scheduling strategies are proposed. In the wide area, an optimized scheduling strategy based on user importance and time criticality is proposed to guarantee that important power users’ information transmission is handled promptly. In the local area, an optimized scheduling strategy based on device and information importance and time criticality is proposed to guarantee that important devices and information in the AMI user end system are handled promptly. Finally, the two optimized scheduling strategies are simulated. The simulation results show that they can effectively improve the real-time performance and reliability of AMI information transmission.

  16. Strategies for optimizing nitrogen use by ruminants

    DEFF Research Database (Denmark)

    Calsamiglia, S; Ferret, A; Reynolds, C K

    2010-01-01

    The efficiency of N utilization in ruminants is typically low (around 25%) and highly variable (10% to 40%) compared with the higher efficiency of other production animals. The low efficiency has implications for the production performance and environment. Many efforts have been devoted to improving the efficiency of N utilization in ruminants, and while major improvements in our understanding of N requirements and metabolism have been achieved, the overall efficiency remains low. In general, maximal efficiency of N utilization will only occur at the expense of some losses in production performance. However, optimal production and N utilization may be achieved through the understanding of the key mechanisms involved in the control of N metabolism. Key factors in the rumen include the efficiency of N capture in the rumen (grams of bacterial N per gram of rumen available N...

  17. Optimization of Secondary Concentrators with the Continuous Information Entropy Strategy

    Science.gov (United States)

    Schmidt, Tobias Christian; Ries, Harald

    2010-10-01

    In this contribution, a method for global optimization of noisy functions, the Continuous Information Entropy Strategy (CIES), is explained and its applicability for the optimization of solar concentrators is shown. The CIES is efficient because all decisions made during optimizations are based on criteria that are derived from the concept of information entropy. Two secondary concentrators have been optimized with the CIES. The optimized secondary concentrators convert circular light distributions of round focal spots to square light distributions to match with the shape of square PV cells. The secondary concentrators are highly efficient and have geometrical concentration ratios of 2.25 and 8 respectively. Part of this material has been published in: T. C. Schmidt, "Information Entropy-Based Decision Making in Optimization", Ph.D. Thesis, Philipps University Marburg, 2010.

  18. Body image inflexibility mediates the relationship between body image evaluation and maladaptive body image coping strategies.

    Science.gov (United States)

    Mancuso, Serafino G

    2016-03-01

    Body image inflexibility, the unwillingness to experience negative appearance-related thoughts and emotions, is associated with negative body image and eating disorder symptoms. The present study investigated whether body image inflexibility mediated the relationship between body image evaluation and maladaptive body image coping strategies (appearance-fixing and experiential avoidance) in a college and community sample comprising 156 females aged 18-51 years (M=22.76, SD=6.96). Controlling for recruitment source (college vs. community), body image inflexibility fully mediated the relationship between body image evaluation and maladaptive body image coping strategies. Results indicated that an unwillingness to experience negative appearance-related thoughts and emotions is likely responsible for negative body image evaluation's relationship to appearance-fixing behaviours and experiential avoidance. Findings support extant evidence that interventions that explicitly target body image inflexibility, such as Acceptance and Commitment Therapy, may have utility in treating body dissatisfaction in nonclinical populations.

  19. Optimization model of vaccination strategy for dengue transmission

    Science.gov (United States)

    Widayani, H.; Kallista, M.; Nuraini, N.; Sari, M. Y.

    2014-02-01

    Dengue fever is an emerging tropical and subtropical disease caused by dengue virus infection. Vaccination should be carried out to prevent an epidemic in the population. The host-vector model is modified to take a vaccination factor into account in order to prevent the occurrence of a dengue epidemic in a population. An optimal vaccination strategy using a non-linear objective function is proposed. Genetic algorithm programming techniques are combined with the fourth-order Runge-Kutta method to construct the optimal vaccination. In this paper, the appropriate vaccination strategy, obtained by minimizing the cost function and thus able to reduce the size of the epidemic, is analyzed. A numerical simulation for some specific cases of the vaccination strategy is shown.
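    As an illustrative sketch of the simulation layer such a study relies on (not the authors' model or parameters), the code below integrates a simplified host-vector model with a constant vaccination rate using classical fourth-order Runge-Kutta steps and evaluates a cost that trades infection burden against vaccination effort; a genetic algorithm would then search over vaccination schedules, which is omitted here.

        import numpy as np

        # Simplified host-vector (SIR host / SI vector) model with vaccination rate v.
        def rhs(y, v, beta_h=0.4, beta_v=0.3, gamma=0.1, mu_v=0.1):
            Sh, Ih, Rh, Sv, Iv = y
            dSh = -beta_h * Sh * Iv - v * Sh
            dIh = beta_h * Sh * Iv - gamma * Ih
            dRh = gamma * Ih + v * Sh
            dSv = mu_v * (Sv + Iv) - beta_v * Sv * Ih - mu_v * Sv
            dIv = beta_v * Sv * Ih - mu_v * Iv
            return np.array([dSh, dIh, dRh, dSv, dIv])

        def simulate(v, t_end=200.0, dt=0.1):
            y = np.array([0.99, 0.01, 0.0, 0.95, 0.05])
            infected_burden = 0.0
            for _ in range(int(t_end / dt)):              # classical RK4 steps
                k1 = rhs(y, v)
                k2 = rhs(y + 0.5 * dt * k1, v)
                k3 = rhs(y + 0.5 * dt * k2, v)
                k4 = rhs(y + dt * k3, v)
                y = y + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
                infected_burden += y[1] * dt              # integral of infectious hosts
            return infected_burden

        def cost(v, w_vacc=5.0):
            return simulate(v) + w_vacc * v               # infection burden + vaccination effort

        for v in (0.0, 0.01, 0.05, 0.1):
            print(f"vaccination rate {v:.2f}: cost {cost(v):6.2f}")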

  20. Turbine Control Strategies for Wind Farm Power Optimization

    DEFF Research Database (Denmark)

    Mirzaei, Mahmood; Göçmen Bozkurt, Tuhfe; Giebel, Gregor

    2015-01-01

    In recent decades there has been increasing interest in green energies, of which wind energy is the most important one. In order to improve the competitiveness of wind power plants, there is ongoing research to decrease the cost per energy unit and increase the efficiency of wind turbines and wind farms. One way of achieving these goals is to optimize the power generated by a wind farm. One optimization method is to choose appropriate operating points for the individual wind turbines in the farm. We have made three models of a wind farm based on three different control strategies. Basically, the control strategies determine the steady state operating points of the wind turbines. Except for the control strategies of the individual wind turbines, the wind farm models are similar. Each model consists of a row of 5MW reference wind turbines. In the models we are able to optimize...

  1. NEW VISUAL PERCEPTUAL POOLING STRATEGY FOR IMAGE QUALITY ASSESSMENT

    Institute of Scientific and Technical Information of China (English)

    Zhou Wujie; Jiang Gangyi; Yu Mei

    2012-01-01

    Most Image Quality Assessment (IQA) metrics consist of two processes. In the first process, a quality map of the image is measured locally. In the second process, the final quality score is computed from the quality map by using a pooling strategy. Significant progress has been made on the first process, while the second process has usually been handled in simple ways. In the pooling strategy of the second process, the optimal perceptual pooling weights should be determined and computed according to the Human Visual System (HVS). Thus, a reliable spatial pooling mathematical model based on the HVS is an important issue worthy of study. In this paper, a new Visual Perceptual Pooling Strategy (VPPS) for IQA is presented based on the contrast sensitivity and luminance sensitivity of the HVS. Experimental results with the LIVE database show that the visual perceptual weights obtained by the proposed pooling strategy can effectively and significantly improve the performance of IQA metrics based on Mean Structural SIMilarity (MSSIM) or Phase Quantization Code (PQC). It is confirmed that the proposed VPPS demonstrates promising results for improving the performance of existing IQA metrics.
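    A minimal, hedged sketch of perceptual pooling in the spirit of this record (the weight functions below are simple proxies, not the authors' VPPS model): a local quality map is averaged with weights built from a gradient-magnitude contrast term and a luminance term that peaks at mid-grey.

        import numpy as np

        rng = np.random.default_rng(6)

        def perceptual_pooling(quality_map, reference):
            """Weighted average of a local quality map using simple HVS-style weights."""
            ref = reference.astype(float)
            gy, gx = np.gradient(ref)                     # gradients along rows and columns
            contrast_w = np.hypot(gx, gy)                 # contrast sensitivity proxy
            lum = ref / 255.0
            luminance_w = 4.0 * lum * (1.0 - lum)         # luminance proxy, peaks at mid-grey
            w = 1e-6 + contrast_w * luminance_w
            return np.sum(w * quality_map) / np.sum(w)

        reference = rng.integers(0, 256, size=(64, 64))
        quality_map = np.clip(rng.normal(0.9, 0.05, size=(64, 64)), 0, 1)   # e.g. a local SSIM map
        print("plain mean score:   ", round(quality_map.mean(), 4))
        print("perceptually pooled:", round(perceptual_pooling(quality_map, reference), 4))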

  2. Optimal generator bidding strategies for power and ancillary services

    Science.gov (United States)

    Morinec, Allen G.

    As the electric power industry transitions to a deregulated market, power transactions are made upon price rather than cost. Generator companies are interested in maximizing their profits rather than overall system efficiency. A method to equitably compensate generation providers for real power, and ancillary services such as reactive power and spinning reserve, will ensure a competitive market with an adequate number of suppliers. Optimizing the generation product mix during bidding is necessary to maximize a generator company's profits. The objective of this research work is to determine and formulate appropriate optimal bidding strategies for a generation company in both the energy and ancillary services markets. These strategies should incorporate the capability curves of their generators as constraints to define the optimal product mix and price offered in the day-ahead and real time spot markets. In order to achieve such a goal, a two-player model was composed to simulate market auctions for power generation. A dynamic game methodology was developed to identify Nash Equilibria and Mixed-Strategy Nash Equilibria solutions as optimal generation bidding strategies for two-player non-cooperative variable-sum matrix games with incomplete information. These games integrated the generation product mix of real power, reactive power, and spinning reserve with the generators' capability curves as constraints. The research includes simulations of market auctions, where strategies were tested for generators with different unit constraints, costs, types of competitors, strategies, and demand levels. Studies on the capability of large hydrogen cooled synchronous generators were utilized to derive useful equations that define the exact shape of the capability curve from the intersections of the arcs defined by the centers and radial vectors of the rotor, stator, and steady-state stability limits. The available reactive reserve and spinning reserve were calculated given a

  3. Compact low field magnetic resonance imaging magnet: Design and optimization

    Science.gov (United States)

    Sciandrone, M.; Placidi, G.; Testa, L.; Sotgiu, A.

    2000-03-01

    Magnetic resonance imaging (MRI) is performed with a very large instrument that allows the patient to be inserted into a region of uniform magnetic field. The field is generated either by an electromagnet (resistive or superconductive) or by a permanent magnet. Electromagnets are designed as air cored solenoids of cylindrical symmetry, with an inner bore of 80-100 cm in diameter. In clinical analysis of peripheral regions of the body (legs, arms, foot, knee, etc.) it would be better to adopt much less expensive magnets leaving the most expensive instruments to applications that require the insertion of the patient in the magnet (head, thorax, abdomen, etc.). These "dedicated" apparati could be smaller and based on resistive magnets that are manufactured and operated at very low cost, particularly if they utilize an iron yoke to reduce power requirements. In order to obtain good field uniformity without the use of a set of shimming coils, we propose both particular construction of a dedicated magnet, using four independently controlled pairs of coils, and an optimization-based strategy for computing, a posteriori, the optimal current values. The optimization phase could be viewed as a low-cost shimming procedure for obtaining the desired magnetic field configuration. Some experimental measurements, confirming the effectiveness of the proposed approach (construction and optimization), have also been reported. In particular, it has been shown that the adoption of the proposed optimization based strategy has allowed the achievement of good uniformity of the magnetic field in about one fourth of the magnet length and about one half of its bore. On the basis of the good experimental results, the dedicated magnet can be used for MRI of peripheral regions of the body and for animal experimentation at very low cost.
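    To make the "optimization phase" concrete, here is a hedged on-axis sketch: the field of each symmetric coil pair per unit current is computed from the standard circular-loop formula, and the four pair currents are found by least squares so the field over a small central region is as uniform as possible. The geometry, target field and use of a plain least-squares fit are assumptions for illustration, not the authors' actual magnet design.

        import numpy as np

        MU0 = 4e-7 * np.pi

        def loop_field_on_axis(z, z0, radius, current=1.0):
            """Axial field (tesla) of a circular loop centred at z0, evaluated at z."""
            return MU0 * current * radius ** 2 / (2.0 * (radius ** 2 + (z - z0) ** 2) ** 1.5)

        # Four symmetric coil pairs (positions in metres); values are illustrative only.
        pair_positions = [0.05, 0.12, 0.20, 0.30]
        radius = 0.15
        z = np.linspace(-0.1, 0.1, 41)                    # region of interest on the axis

        # Field per unit current of each pair (both coils of a pair carry the same current).
        A = np.column_stack([loop_field_on_axis(z, +p, radius) +
                             loop_field_on_axis(z, -p, radius) for p in pair_positions])

        B_target = np.full_like(z, 0.1)                   # desired uniform field of 0.1 T
        currents, *_ = np.linalg.lstsq(A, B_target, rcond=None)

        B = A @ currents
        print("pair currents (ampere-turns):", np.round(currents, 1))
        print("field inhomogeneity (ppm):", round(1e6 * (B.max() - B.min()) / B.mean(), 1))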

  4. Optimal design of coordination control strategy for distributed generation system

    Institute of Scientific and Technical Information of China (English)

    WANG Ai-hua; Norapon Kanjanapadit

    2009-01-01

    This paper presents a novel design procedure for optimizing the power distribution strategy in a distributed generation system. A coordinating controller, responsible for distributing the total load power request among multiple DG units, is suggested based on the concept of a hierarchical control structure in the dynamic system. The optimal control problem was formulated as a nonlinear optimization problem subject to a set of constraints. The resulting problem was solved using the Kuhn-Tucker method. Computer simulation results demonstrate that the proposed method can provide better efficiency in terms of reducing total costs compared to existing methods. In addition, the proposed optimal load distribution strategy can be easily implemented in real time thanks to the simplicity of the closed-form solutions.
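    For quadratic generation cost curves, the Kuhn-Tucker conditions reduce to an equal-incremental-cost rule clipped to the unit limits, which the hedged sketch below solves by bisection on the marginal cost; the cost coefficients, limits and demand are invented numbers, not the paper's data.

        import numpy as np

        # Quadratic generation costs C_i(P) = a_i * P^2 + b_i * P (coefficients assumed).
        a = np.array([0.010, 0.015, 0.020])
        b = np.array([2.0, 1.8, 2.5])
        p_min = np.array([5.0, 5.0, 5.0])
        p_max = np.array([80.0, 60.0, 50.0])
        demand = 120.0

        def dispatch(demand, n_iter=100):
            """Equal marginal cost (lambda) dispatch, clipped to unit limits."""
            lo, hi = 0.0, 10.0                             # bracket for lambda
            for _ in range(n_iter):                        # bisection on total output
                lam = 0.5 * (lo + hi)
                P = np.clip((lam - b) / (2.0 * a), p_min, p_max)   # dC_i/dP_i = lambda
                if P.sum() > demand:
                    hi = lam
                else:
                    lo = lam
            return P, lam

        P, lam = dispatch(demand)
        print("unit outputs (kW):", np.round(P, 1), " total:", round(P.sum(), 1))
        print("marginal cost lambda:", round(lam, 3))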

  5. Optimal Watermark Embedding and Detection Strategies Under Limited Detection Resources

    CERN Document Server

    Merhav, Neri

    2007-01-01

    An information-theoretic approach is proposed to watermark embedding and detection under limited detector resources. First, we consider the attack-free scenario under which asymptotically optimal decision regions in the Neyman-Pearson sense are proposed, along with the optimal embedding rule. Later, we explore the case of zero-mean i.i.d. Gaussian covertext distribution with unknown variance under the attack-free scenario. For this case, we propose a lower bound on the exponential decay rate of the false-negative probability and prove that the optimal embedding and detecting strategy is superior to the customary linear, additive embedding strategy in the exponential sense. Finally, these results are extended to the case of memoryless attacks and general worst case attacks. Optimal decision regions and embedding rules are offered, and the worst attack channel is identified.

  6. Imaging strategies in pediatric urinary tract infection

    Energy Technology Data Exchange (ETDEWEB)

    Dacher, Jean-Nicolas [University of Rouen, Quant-IF Laboratory, School of Medicine and Pharmacy, Rouen (France); Rouen University Hospital Charles Nicolle, Department of Radiology, Rouen (France); UFR Medecine Pharmacie de Rouen, Laboratoire Quant-If, Rouen (France); Hitzel, Anne; Vera, Pierre [University of Rouen, Quant-IF Laboratory, School of Medicine and Pharmacy, Rouen (France); CRLCC Henri Becquerel, Department of Nuclear Medicine, Rouen (France); Avni, Fred E. [Free University of Brussels, Department of Radiology, Erasmus Hospital, Brussels (Belgium)

    2005-07-01

    This article is focused on the controversial topic of imaging strategies in pediatric urinary tract infection. A review of the recent literature illustrates the complementary roles of ultrasound, diagnostic radiology and nuclear medicine. The authors stress the key role of ultrasound which has recently been debated. The commonly associated vesicoureteric reflux has to be classified as congenital or secondary due to voiding dysfunction. A series of frequently asked questions are addressed in a second section. The proposed answers are not the product of a consensus but should rather be considered as proposals to enrich the ongoing debate concerning the evaluation of urinary tract infection in children. (orig.)

  7. NEW OPTIMAL LARGE ANGLE MANEUVER STRATEGY FOR SINGLE FLEXIBLE LINK

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A component synthesis vibration suppression (CSVS) method for flexible structures is put forward. It can eliminate any unwanted orders of flexible vibration modes while achieving the desired rigid motion. This method is robust to uncertainty in frequency, which makes it practical in engineering. Several time-optimal and time-fuel-optimal control strategies are designed for a kind of single flexible link. Simulation results validate the feasibility of our method.

  8. Optimal atlas construction through hierarchical image registration

    Science.gov (United States)

    Grevera, George J.; Udupa, Jayaram K.; Odhner, Dewey; Torigian, Drew A.

    2016-03-01

    Atlases (digital or otherwise) are common in medicine. However, there is no standard framework for creating them from medical images. One traditional approach is to pick a representative subject and then proceed to label structures/regions of interest in this image. Another is to create a "mean" or average subject. Atlases may also contain more than a single representative (e.g., the Visible Human contains both a male and a female data set). Other criteria besides gender may be used as well, and the atlas may contain many examples for a given criterion. In this work, we propose that atlases be created in an optimal manner using a well-established graph theoretic approach using a min spanning tree (or more generally, a collection of them). The resulting atlases may contain many examples for a given criterion. In fact, our framework allows for the addition of new subjects to the atlas to allow it to evolve over time. Furthermore, one can apply segmentation methods to the graph (e.g., graph-cut, fuzzy connectedness, or cluster analysis) which allow it to be separated into "sub-atlases" as it evolves. We demonstrate our method by applying it to 50 3D CT data sets of the chest region, and by comparing it to a number of traditional methods using measures such as Mean Squared Difference, Mattes Mutual Information, and Correlation, and for rigid registration. Our results demonstrate that optimal atlases can be constructed in this manner and outperform other methods of construction using freely available software.
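    As a hedged, toy-scale illustration of the graph-theoretic idea in this record (not the authors' pipeline), the sketch below computes pairwise mean-squared-difference distances between small synthetic "images" and extracts a minimum spanning tree with Prim's algorithm; real use would replace the synthetic arrays with registered CT volumes and a more suitable distance such as mutual information.

        import numpy as np

        rng = np.random.default_rng(7)

        # Synthetic stand-ins for registered images: groups of similar 16x16 arrays.
        images = [np.full((16, 16), mean) + rng.normal(0, 0.1, (16, 16))
                  for mean in (0.0, 0.0, 1.0, 1.0, 2.0, 2.0, 2.1)]

        def msd(a, b):
            return float(np.mean((a - b) ** 2))            # mean squared difference

        n = len(images)
        D = np.array([[msd(images[i], images[j]) for j in range(n)] for i in range(n)])

        def prim_mst(D):
            """Return the MST edges (i, j, weight) of a dense distance matrix."""
            n = D.shape[0]
            in_tree, edges = [0], []
            while len(in_tree) < n:
                best = None
                for i in in_tree:
                    for j in range(n):
                        if j not in in_tree and (best is None or D[i, j] < best[2]):
                            best = (i, j, D[i, j])
                edges.append(best)
                in_tree.append(best[1])
            return edges

        for i, j, w in prim_mst(D):
            print(f"edge {i}-{j}  distance {w:.3f}")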

  9. Optimal Control Strategies in Delayed Sharing Information Structures

    CERN Document Server

    Nayyar, Ashutosh; Teneketzis, Demosthenis

    2010-01-01

    The $n$-step delayed sharing information structure is investigated. This information structure comprises $K$ controllers that share their information with a delay of $n$ time steps. This information structure is a link between the classical information structure, where information is shared perfectly between the controllers, and a non-classical information structure, where there is no "lateral" sharing of information among the controllers. Structural results for optimal control strategies for systems with such information structures are presented. A sequential methodology for finding the optimal strategies is also derived. The solution approach provides an insight for identifying structural results and sequential decomposition for general decentralized stochastic control problems.

  10. Optimization Under Uncertainty for Wake Steering Strategies: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Quick, Julian [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Annoni, Jennifer [National Renewable Energy Laboratory (NREL), Golden, CO (United States); King, Ryan N [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Dykes, Katherine L [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Fleming, Paul A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ning, Andrew [Brigham Young University]

    2017-05-01

    Wind turbines in a wind power plant experience significant power losses because of aerodynamic interactions between turbines. One control strategy to reduce these losses is known as 'wake steering,' in which upstream turbines are yawed to direct wakes away from downstream turbines. Previous wake steering research has assumed perfect information; however, there can be significant uncertainty in many aspects of the problem, including wind inflow and various turbine measurements. Uncertainty has significant implications for the performance of wake steering strategies. Consequently, the authors formulate and solve an optimization under uncertainty (OUU) problem for finding optimal wake steering strategies in the presence of yaw angle uncertainty. The OUU wake steering strategy is demonstrated on a two-turbine test case and on the utility-scale, offshore Princess Amalia Wind Farm. When we accounted for yaw angle uncertainty in the Princess Amalia Wind Farm case, inflow-direction-specific OUU solutions produced between 0% and 1.4% more power than the deterministically optimized steering strategies, resulting in an overall annual average improvement of 0.2%. More importantly, the deterministic optimization is expected to perform worse and with more downside risk than the OUU result when realistic uncertainty is taken into account. Additionally, the OUU solution produces fewer extreme yaw situations than the deterministic solution.
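
    As a hedged illustration of the optimization-under-uncertainty idea only, the toy sketch below maximizes expected power over Gaussian yaw-angle errors using a made-up, asymmetric power surrogate rather than a real wake model; the OUU set-point backs away from the deterministic one:

      # Toy sketch of optimization under yaw-angle uncertainty (OUU): choose the yaw
      # set-point that maximizes *expected* power over Gaussian yaw errors.
      # The "power" surrogate below is hypothetical, not a wake model.
      import numpy as np
      from scipy.optimize import minimize_scalar

      def farm_power(yaw_deg):
          """Hypothetical surrogate: gain peaks at 20 deg, with a steep loss past 25 deg."""
          base = 1.0 + 0.003 * yaw_deg * (40.0 - yaw_deg)
          penalty = 0.01 * np.maximum(yaw_deg - 25.0, 0.0) ** 2
          return base - penalty

      def expected_power(yaw_set_point, sigma_deg=5.0, n_samples=2000, seed=0):
          rng = np.random.default_rng(seed)
          realized = yaw_set_point + rng.normal(0.0, sigma_deg, n_samples)
          return farm_power(realized).mean()

      det = minimize_scalar(lambda y: -farm_power(y), bounds=(0, 40), method="bounded")
      ouu = minimize_scalar(lambda y: -expected_power(y), bounds=(0, 40), method="bounded")
      print("deterministic yaw:", round(det.x, 2), "OUU yaw:", round(ouu.x, 2))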

  11. [Lead compound optimization strategy (1)--changing metabolic pathways and optimizing metabolism stability].

    Science.gov (United States)

    Wang, Jiang; Liu, Hong

    2013-10-01

    Lead compound optimization plays an important role in new drug discovery and development. Strategies for changing metabolic pathways can modulate pharmacokinetic properties, prolong the half-life, and improve the metabolism stability and bioavailability of lead compounds. The strategies for changing metabolic pathways and improving metabolism stability are reviewed. These methods include blocking the metabolic site, reducing lipophilicity, changing ring size, bioisosterism, and prodrug design.

  12. A Computationally Efficient Aggregation Optimization Strategy of Model Predictive Control

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Model Predictive Control (MPC) is a popular technique and has been successfully used in various industrial applications. However, a major drawback of MPC is the formidable on-line computational effort, which limits its applicability to relatively slow and/or small processes with a moderate number of inputs. This paper develops an aggregation optimization strategy for MPC that can improve its computational efficiency. For the regulation problem, an input decaying aggregation optimization algorithm is presented that aggregates all the original optimized variables on the control horizon into a decaying sequence with respect to the current control action.

  13. A Competitive and Experiential Assignment in Search Engine Optimization Strategy

    Science.gov (United States)

    Clarke, Theresa B.; Clarke, Irvine, III

    2014-01-01

    Despite an increase in ad spending and demand for employees with expertise in search engine optimization (SEO), methods for teaching this important marketing strategy have received little coverage in the literature. Using Bloom's cognitive goals hierarchy as a framework, this experiential assignment provides a process for educators who may be new…

  16. OFDM-ISAR Sparse Optimization Imaging and Motion Compensation

    Directory of Open Access Journals (Sweden)

    Wu Min

    2016-02-01

    Full Text Available Orthogonal Frequency Division Multiplexing (OFDM) technology has been utilized in radar imaging to obtain high-resolution range profiles without inter-range cell interference. In this study, we establish a novel algorithm for Inverse Synthetic Aperture Radar (ISAR) imaging of a non-cooperative target using OFDM waveforms. We also achieve motion compensation and image enhancement with sparse reconstruction optimization. Utilizing sparse reconstruction optimization, we can simultaneously achieve high-precision OFDM-ISAR imaging and also correct phase errors. Extensive experimentation confirms that the proposed method can effectively overcome range interference and phase errors in OFDM-ISAR imaging, providing optimal robustness and precision.

  17. Immune clonal selection optimization method with combining mutation strategies

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    In artificial immune optimization algorithms, the mutation of immune cells has been considered the key operator that determines algorithm performance. Traditional immune optimization algorithms have used a single mutation operator, typically a Gaussian. Using a variety of mutation operators that can be combined during evolution to generate different probability density functions could hold the potential for producing better solutions with less computational effort. In view of this, a linear combination mutation operator of Gaussian and Cauchy mutation is presented in this paper, and a novel clonal selection optimization method based on the clonal selection principle is also proposed. The simulation results show that the combining mutation strategy can obtain the same performance as the best of the pure strategies, or even better in some cases.
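
    A minimal sketch of the combining-mutation idea, assuming a convex combination of Gaussian and Cauchy perturbations with an illustrative mixing coefficient; this is not the paper's exact operator:

      # Assumed form of a linear-combination mutation mixing Gaussian and Cauchy noise.
      import numpy as np

      def combined_mutation(x, alpha=0.5, scale=0.1, rng=None):
          """Mutate antibody x with a convex combination of Gaussian and Cauchy noise.

          alpha = 1 gives a pure Gaussian mutation, alpha = 0 a pure Cauchy mutation.
          """
          rng = np.random.default_rng() if rng is None else rng
          gauss = rng.normal(0.0, 1.0, size=x.shape)
          cauchy = rng.standard_cauchy(size=x.shape)
          return x + scale * (alpha * gauss + (1.0 - alpha) * cauchy)

      # Clone-and-mutate step of a toy clonal selection loop on the sphere function.
      rng = np.random.default_rng(1)
      antibody = rng.uniform(-1, 1, size=5)
      clones = np.array([combined_mutation(antibody, rng=rng) for _ in range(10)])
      best = clones[np.argmin((clones ** 2).sum(axis=1))]
      print(best)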

  18. Optimal switching strategies for stochastic geocentric/egocentric navigation

    CERN Document Server

    Peleg, O

    2015-01-01

    Animals use a combination of egocentric navigation driven by the internal integration of environmental cues, interspersed with geocentric course correction and reorientation, often with uncertainty in sensory acquisition of information, planning and execution. Inspired directly by observations of dung beetle navigational strategies that show switching between geocentric and egocentric strategies, we consider the question of optimal strategies for the navigation of an agent along a preferred direction in the presence of multiple sources of noise. We address this using a model that takes the form of a correlated random walk at short time scales, interspersed with reorientation events, which yields a biased random walk at long time scales. We identify optimal alternation schemes and characterize their robustness in the context of noisy sensory acquisition and performance errors linked with variations in environmental conditions and agent-environment interactions.

  19. Solution of Chemical Dynamic Optimization Using the Simultaneous Strategies

    Institute of Scientific and Technical Information of China (English)

    LIU Xinggao; CHEN Long; HU Yunqing

    2013-01-01

    An approach of simultaneous strategies with two novel techniques is proposed to improve the solution accuracy of chemical dynamic optimization problems. The first technique is to handle constraints on control variables based on the finite-element collocation so as to control the approximation error for discrete optimal problems, where a set of control constraints at element knots are integrated with the optimization procedure, leading to a significant gain in the accuracy of the simultaneous strategies. The second technique is to make the mesh refinement more feasible and reliable by introducing length constraints and guidelines for designing appropriate element length boundaries, so that the proposed approach becomes more efficient in adjusting elements to track optimal control profile breakpoints and ensure accurate state and control profiles. Four classic benchmark dynamic optimization problems are used as illustrations, and the proposed approach is compared with literature reports. The results reveal that the proposed approach is preferable for improving the solution accuracy of chemical dynamic optimization problems.

  20. Optimal Scale Edge Detection Utilizing Noise within Images

    Directory of Open Access Journals (Sweden)

    Adnan Khashman

    2003-04-01

    Full Text Available Edge detection techniques have common problems that include poor edge detection in low contrast images, speed of recognition and high computational cost. An efficient solution to the edge detection of objects in low to high contrast images is scale space analysis. However, this approach is time consuming and computationally expensive. These expenses can be marginally reduced if an optimal scale is found in scale space edge detection. This paper presents a new approach to detecting objects within images using noise within the images. The novel idea is based on selecting one optimal scale for the entire image at which scale space edge detection can be applied. The selection of an ideal scale is based on the hypothesis that "the optimal edge detection scale (ideal scale) depends on the noise within an image". This paper aims at providing the experimental evidence on the relationship between the optimal scale and the noise within images.

  1. On the robust optimization to the uncertain vaccination strategy problem

    Energy Technology Data Exchange (ETDEWEB)

    Chaerani, D., E-mail: d.chaerani@unpad.ac.id; Anggriani, N., E-mail: d.chaerani@unpad.ac.id; Firdaniza, E-mail: d.chaerani@unpad.ac.id [Department of Mathematics, Faculty of Mathematics and Natural Sciences, University of Padjadjaran Indonesia, Jalan Raya Bandung Sumedang KM 21 Jatinangor Sumedang 45363 (Indonesia)

    2014-02-21

    In order to prevent an epidemic of infectious diseases, the vaccination coverage needs to be minimized while the basic reproduction number is kept below 1. This means that, while keeping the vaccination coverage as low as possible, the epidemic must still be prevented among the small number of people who are already infected. In this paper, we discuss the case of a vaccination strategy in terms of minimizing vaccination coverage when the basic reproduction number is assumed to be an uncertain parameter that lies between 0 and 1. We refer to the linear optimization model for vaccination strategy proposed by Becker and Starrzak (see [2]). Assuming that parameter uncertainty is involved, Tanner et al (see [9]) propose an optimal solution of the problem using stochastic programming. In this paper we discuss an alternative way of optimizing the uncertain vaccination strategy using Robust Optimization (see [3]). In this approach we assume that the parameter uncertainty lies within an ellipsoidal uncertainty set, such that the obtained result can be achieved by a polynomial time algorithm (as guaranteed by the RO methodology). The robust counterpart model is presented.

  2. An optimal routing strategy on scale-free networks

    Science.gov (United States)

    Yang, Yibo; Zhao, Honglin; Ma, Jinlong; Qi, Zhaohui; Zhao, Yongbin

    Traffic is one of the most fundamental dynamical processes in networked systems. With the traditional shortest path routing (SPR) protocol, traffic congestion is likely to occur at the hub nodes of scale-free networks. In this paper, we propose an improved optimal routing (IOR) strategy which is based on the betweenness centrality and the degree centrality of nodes in scale-free networks. With the proposed strategy, the routing paths can accurately bypass hub nodes in the network to enhance the transport efficiency. Simulation results show that the traffic capacity as well as some other indexes reflecting transportation efficiency are further improved with the IOR strategy. Owing to the significantly improved traffic performance, this study is helpful to design more efficient routing strategies in communication or transportation systems.
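
    A hedged sketch of the routing idea: edge weights are inflated where they touch nodes with high betweenness and degree, so shortest paths tend to bypass hubs. The weight formula and the parameter beta are assumptions for illustration, not the authors' exact IOR rule:

      # Toy hub-avoiding routing weights on a scale-free graph (NetworkX).
      import networkx as nx

      def ior_weights(G, beta=1.0):
          bc = nx.betweenness_centrality(G)
          for u, v in G.edges():
              # Edges touching central, high-degree nodes become expensive.
              G[u][v]["w"] = 1.0 + beta * (bc[u] * G.degree(u) + bc[v] * G.degree(v))
          return G

      G = ior_weights(nx.barabasi_albert_graph(200, 2, seed=0))
      path = nx.shortest_path(G, source=0, target=150, weight="w")
      print("hub-avoiding path length:", len(path) - 1)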

  3. DESTINATION MARKETING STRATEGY IN BALI THROUGH OPTIMIZING THE POTENTIAL OF LOCAL PRODUCTS

    OpenAIRE

    I Gusti Ayu Oka Suryawardani; Agung Suryawan Wiranatha; Petr, Christine

    2014-01-01

    This study was designed to examine destination marketing strategy in Bali through optimizing the potential of local products. Seventy-nine hotel managers were interviewed, selected by a cluster sampling method, to gain their points of view. The results show that a destination must build its image around unique attributes that provide a sustainable competitive advantage, including attractions that should be designed to meet the needs of the target market and should be served by local produc...

  4. Optimization of Segmentation Quality of Integrated Circuit Images

    Directory of Open Access Journals (Sweden)

    Gintautas Mušketas

    2012-04-01

    Full Text Available The paper presents an investigation into the application of genetic algorithms for the segmentation of the active regions of integrated circuit images. The article provides a theoretical examination of the applied methods (morphological dilation, erosion, hit-and-miss, threshold), describes genetic algorithms, and formulates image segmentation as an optimization problem. The genetic optimization of the parameters of a predefined filter sequence is carried out. The improvement in segmentation accuracy over a non-optimized filter sequence is 6%. Article in Lithuanian

  5. Research of stochastic weight strategy for extended particle swarm optimizer

    Institute of Scientific and Technical Information of China (English)

    XU Jun-jie; YUE Xin; XIN Zhan-hong

    2008-01-01

    To improve the performance of the extended particle swarm optimizer, a novel means of stochastic weight deployment is proposed for the iterative velocity update equation. In this scheme, one of the weights is set to a random number within the range of [0, 1] while the other two remain constant. The simulations show that this weight strategy outperforms the previous deterministic approach with respect to success rate and convergence speed. The experiments also reveal that if the weight for the global best neighbor is set to a stochastic number, the extended particle swarm optimizer achieves high and robust performance on the given multi-modal function.
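
    A minimal sketch of the stochastic-weight velocity update described above, with the global-best weight redrawn uniformly from [0, 1] at each iteration; the two fixed weights and the test function are illustrative assumptions:

      # Stochastic weight on the global-best term of the PSO velocity update.
      import numpy as np

      rng = np.random.default_rng(0)
      dim, n_particles = 2, 20
      w_inertia, w_personal = 0.6, 1.5          # fixed weights
      x = rng.uniform(-5, 5, (n_particles, dim))
      v = np.zeros_like(x)
      f = lambda p: (p ** 2).sum(axis=1)        # sphere test function
      pbest, pbest_val = x.copy(), f(x)

      for _ in range(200):
          gbest = pbest[pbest_val.argmin()]
          w_global = rng.uniform(0.0, 1.0)      # stochastic weight for the global-best term
          v = (w_inertia * v
               + w_personal * rng.random((n_particles, dim)) * (pbest - x)
               + w_global * rng.random((n_particles, dim)) * (gbest - x))
          x = x + v
          val = f(x)
          better = val < pbest_val
          pbest[better], pbest_val[better] = x[better], val[better]

      print("best value found:", pbest_val.min())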

  6. Optimal search strategies on complex multi-linked networks

    Science.gov (United States)

    Di Patti, Francesca; Fanelli, Duccio; Piazza, Francesco

    2015-01-01

    In this paper we consider the problem of optimal search strategies on multi-linked networks, i.e. graphs whose nodes are endowed with several independent sets of links. We focus preliminarily on agents randomly hopping along the links of a graph, with the additional possibility of performing non-local hops to randomly chosen nodes with a given probability. We show that an optimal combination of the two jump rules exists that maximises the efficiency of target search, the optimum reflecting the topology of the network. We then generalize our results to multi-linked networks with an arbitrary number of mutually interfering link sets. PMID:25950716
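
    A toy simulation of the two-rule search, assuming a walker that teleports to a random node with probability p and otherwise hops to a random neighbor on a single-layer graph; the graph model and the p values are illustrative, not the multi-linked setting of the paper:

      # Mean first-passage time to a target under mixed local / non-local hopping.
      import numpy as np
      import networkx as nx

      def mean_search_time(G, p, target, n_walks=300, seed=0):
          rng = np.random.default_rng(seed)
          nodes = list(G.nodes())
          times = []
          for _ in range(n_walks):
              node, steps = rng.choice(nodes), 0
              while node != target and steps < 10_000:
                  if rng.random() < p:
                      node = rng.choice(nodes)                     # non-local hop
                  else:
                      node = rng.choice(list(G.neighbors(node)))   # local hop
                  steps += 1
              times.append(steps)
          return np.mean(times)

      G = nx.connected_watts_strogatz_graph(100, 4, 0.1, seed=1)
      for p in (0.0, 0.05, 0.2, 0.5):
          print(p, mean_search_time(G, p, target=0))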

  7. Image Mosaic Techniques Optimization Using Wavelet

    Institute of Scientific and Technical Information of China (English)

    ZHOU An-qi; CUI Li

    2014-01-01

    This essay concentrates on two key procedures of image mosaic: image registration and image fusion. Because of the geometric transformation invariance of edge points, we calculate the angle difference of the direction vectors of edge points in different images and draw an angle difference histogram to adjust the rotation problem. In this way, the algorithm based on gray information is extended and can be used for images with both displacement and rotation. For image fusion, wavelet multi-scale analysis is used to fuse the spliced images. In order to choose the best method of image fusion, we evaluate the results of different image fusion methods by cross entropy.

  8. CT enterography for Crohn's disease: optimal technique and imaging issues.

    Science.gov (United States)

    Baker, Mark E; Hara, Amy K; Platt, Joel F; Maglinte, Dean D T; Fletcher, Joel G

    2015-06-01

    CT enterography (CTE) is a common examination for patients with Crohn's disease. In order to achieve high quality, diagnostic images, proper technique is required. The purpose of this treatise is to review the processes and techniques that can optimize CTE for patients with suspected or known Crohn's disease. We will review the following: (1) how to start a CT enterography program; (2) workflow issues, including patient and ordering physician education and preparation; (3) oral contrast media options and administration regimens; (4) intravenous contrast media injection for uniphasic and multiphasic studies; (5) CTE radiation dose reduction strategies and the use of iterative reconstruction in lower dose examinations; (6) image reconstruction and interpretation; (7) imaging Crohn's patients in the acute or emergency department setting; (8) limitations of CTE as well as alternatives such as MRE or barium fluoroscopic examinations; and (9) dictation templates and a common nomenclature for reporting findings of CTE in Crohn's disease. Many of the issues discussed are summarized in the Abdominal Radiology Society Consensus MDCT Enterography Acquisition Protocol for Crohn's Disease.

  9. Transitions in optimal adaptive strategies for populations in fluctuating environments

    Science.gov (United States)

    Mayer, Andreas; Mora, Thierry; Rivoire, Olivier; Walczak, Aleksandra M.

    2017-09-01

    Biological populations are subject to fluctuating environmental conditions. Different adaptive strategies can allow them to cope with these fluctuations: specialization to one particular environmental condition, adoption of a generalist phenotype that compromises between conditions, or population-wise diversification (bet hedging). Which strategy provides the largest selective advantage in the long run depends on the range of accessible phenotypes and the statistics of the environmental fluctuations. Here, we analyze this problem in a simple mathematical model of population growth. First, we review and extend a graphical method to identify the nature of the optimal strategy when the environmental fluctuations are uncorrelated. Temporal correlations in environmental fluctuations open up new strategies that rely on memory but are mathematically challenging to study: We present analytical results to address this challenge. We illustrate our general approach by analyzing optimal adaptive strategies in the presence of trade-offs that constrain the range of accessible phenotypes. Our results extend several previous studies and have applications to a variety of biological phenomena, from antibiotic resistance in bacteria to immune responses in vertebrates.

  10. Survey of E-Commerce Modeling and Optimization Strategies

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    Electronic commerce is impacting almost all commercial activities. The resulting emerging commercial activities bring with them many new modeling and optimization problems. This survey reviews pioneering works in this new area, covering topics in advertising strategy, web page design, automatic pricing, auction methods, brokerage strategy, and customer behavior analysis. Mathematical models for problems in these areas and their solution algorithms are discussed. In addition to presenting and commenting on these works, we also discuss possible extensions and related problems. The objective of this survey is to encourage more researchers to pay attention to this emerging area.

  11. Acceleration of quantum optimal control theory algorithms with mixing strategies.

    Science.gov (United States)

    Castro, Alberto; Gross, E K U

    2009-05-01

    We propose the use of mixing strategies to accelerate the convergence of the common iterative algorithms utilized in quantum optimal control theory (QOCT). We show how the nonlinear equations of QOCT can be viewed as a "fixed-point" nonlinear problem. The iterative algorithms for this class of problems may benefit from mixing strategies, as it happens, e.g., in the quest for the ground-state density in Kohn-Sham density-functional theory. We demonstrate, with some numerical examples, how the same mixing schemes utilized in this latter nonlinear problem may significantly accelerate the QOCT iterative procedures.
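
    A generic illustration of the mixing idea on a toy fixed-point problem x = F(x); linear mixing with a suitable mixing parameter changes the convergence rate, in the same spirit as the density mixing referred to above (the map F is illustrative, not a QOCT functional):

      # Linear mixing for a generic fixed-point iteration x = F(x).
      import numpy as np

      def solve_fixed_point(F, x0, mix=0.3, tol=1e-10, max_iter=500):
          """Simple linear mixing: x_{k+1} = (1 - mix) * x_k + mix * F(x_k)."""
          x = np.asarray(x0, dtype=float)
          for k in range(max_iter):
              x_new = (1.0 - mix) * x + mix * F(x)
              if np.linalg.norm(x_new - x) < tol:
                  return x_new, k
              x = x_new
          return x, max_iter

      F = lambda x: np.cos(x)              # toy map with fixed point near 0.739
      for mix in (1.0, 0.5, 0.3):          # mix = 1.0 is plain fixed-point iteration
          sol, iters = solve_fixed_point(F, x0=[0.0], mix=mix)
          print(f"mix={mix}: x*={sol[0]:.6f} in {iters} iterations")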

  12. Optimizing signal and image processing applications using Intel libraries

    Science.gov (United States)

    Landré, Jérôme; Truchetet, Frédéric

    2007-01-01

    This paper presents optimized signal and image processing libraries from Intel Corporation. Intel Performance Primitives (IPP) is a low-level signal and image processing library developed by Intel Corporation to optimize code on Intel processors. Open Computer Vision library (OpenCV) is a high-level library dedicated to computer vision tasks. This article describes the use of both libraries to build flexible and efficient signal and image processing applications.
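
    A minimal OpenCV (cv2) example of the kind of pipeline such applications are built from: a synthetic image is generated so the snippet is self-contained, then smoothed and edge-detected; in practice cv2.imread would load real data, and OpenCV builds may optionally link against IPP for acceleration:

      # Smooth a synthetic test image and extract its edges with OpenCV.
      import numpy as np
      import cv2

      # Synthetic 256x256 grayscale image: a bright square on a dark background.
      img = np.zeros((256, 256), dtype=np.uint8)
      img[64:192, 64:192] = 200

      blurred = cv2.GaussianBlur(img, ksize=(5, 5), sigmaX=1.5)  # denoise
      edges = cv2.Canny(blurred, threshold1=50, threshold2=150)  # edge map

      print("edge pixels:", int((edges > 0).sum()))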

  13. Optimal intervention strategies for cholera outbreak by education and chlorination

    Science.gov (United States)

    Bakhtiar, Toni

    2016-01-01

    This paper discusses the control of infectious diseases in the framework of an optimal control approach. A case study on cholera control was conducted by considering two control strategies, namely education and chlorination. We split the former control into one part regarding person-to-person behaviour and another concerning person-to-environment conduct. The model consists of two interacting populations: a human population, which follows an SIR model, and a pathogen population. Pontryagin's maximum principle was applied to derive a set of differential equations consisting of the dynamical and adjoint systems as optimality conditions. Then, the fourth-order Runge-Kutta method was used to numerically solve the equation system. An illustrative example is provided to assess the effectiveness of the control strategies under a set of control scenarios.

  14. A NEW STOCHASTIC OPTIMAL CONTROL STRATEGY FOR HYSTERETIC MR DAMPERS

    Institute of Scientific and Technical Information of China (English)

    Ying Zuguang; Ni Yiqing; Ko Janming

    2004-01-01

    A new stochastic optimal control strategy for randomly excited quasi-integrable Hamiltonian systems using magneto-rheological (MR) dampers is proposed. The dynamic behavior of an MR damper is characterized by the Bouc-Wen hysteretic model. The control force produced by the MR damper is separated into a passive part incorporated in the uncontrolled system and a semi-active part to be determined. The system combining the Bouc-Wen hysteretic force is converted into an equivalent non-hysteretic nonlinear stochastic control system. Then Ito stochastic differential equations are derived from the equivalent system by using the stochastic averaging method. A dynamical programming equation for the controlled diffusion processes is established based on the stochastic dynamical programming principle. The non-clipping nonlinear optimal control law is obtained for a certain performance index by minimizing the dynamical programming equation. Finally, an example is given to illustrate the application and effectiveness of the proposed control strategy.

  15. Using Cotton Model Simulations to Estimate Optimally Profitable Irrigation Strategies

    Science.gov (United States)

    Mauget, S. A.; Leiker, G.; Sapkota, P.; Johnson, J.; Maas, S.

    2011-12-01

    In recent decades irrigation pumping from the Ogallala Aquifer has led to declines in saturated thickness that have not been compensated for by natural recharge, which has led to questions about the long-term viability of agriculture in the cotton producing areas of west Texas. Adopting irrigation management strategies that optimize profitability while reducing irrigation waste is one way of conserving the aquifer's water resource. Here, a database of modeled cotton yields generated under drip and center pivot irrigated and dryland production scenarios is used in a stochastic dominance analysis that identifies such strategies under varying commodity price and pumping cost conditions. This database and analysis approach will serve as the foundation for a web-based decision support tool that will help producers identify optimal irrigation treatments under specified cotton price, electricity cost, and depth to water table conditions.

  16. Optimization of reliability allocation strategies through use of genetic algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, J.E.; Painton, L.A.

    1996-08-01

    This paper examines a novel optimization technique called genetic algorithms and its application to the optimization of reliability allocation strategies. Reliability allocation should occur in the initial stages of design, when the objective is to determine an optimal breakdown or allocation of reliability to certain components or subassemblies in order to meet system specifications. The reliability allocation optimization is applied to the design of a cluster tool, a highly complex piece of equipment used in semiconductor manufacturing. The problem formulation is presented, including decision variables, performance measures and constraints, and genetic algorithm parameters. Piecewise ``effort curves`` specifying the amount of effort required to achieve a certain level of reliability for each component or subassembly are defined. The genetic algorithm evolves or picks those combinations of ``effort`` or reliability levels for each component which optimize the objective of maximizing Mean Time Between Failures while staying within a budget. The results show that the genetic algorithm is very efficient at finding a set of robust solutions. A time history of the optimization is presented, along with histograms of the solution space fitness, MTBF, and cost for comparative purposes.
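
    A toy sketch of the formulation, assuming a series system, a made-up piecewise effort (cost) curve, and a penalty for exceeding the budget; the GA operators are generic, not the study's tuned configuration:

      # Each gene picks a reliability level for one component; fitness = MTBF with
      # a heavy penalty when the total "effort" cost exceeds the budget.
      import numpy as np

      rng = np.random.default_rng(3)
      n_comp, n_levels, budget = 6, 5, 40.0
      reliab = np.linspace(0.90, 0.999, n_levels)          # achievable reliability per level
      cost = np.array([1.0, 2.0, 4.0, 8.0, 16.0])          # assumed piecewise effort curve

      def mtbf(levels):
          # Series system: failure rates add; MTBF is the reciprocal of the total rate.
          lam = -np.log(reliab[levels]).sum()
          return 1.0 / lam

      def fitness(levels):
          penalty = 1e3 * max(0.0, cost[levels].sum() - budget)
          return mtbf(levels) - penalty

      pop = rng.integers(0, n_levels, size=(30, n_comp))
      for _ in range(200):
          scores = np.array([fitness(ind) for ind in pop])
          parents = pop[np.argsort(scores)[-10:]]          # truncation selection
          children = parents[rng.integers(0, 10, size=(30, n_comp)), np.arange(n_comp)]  # uniform crossover
          mutate = rng.random(children.shape) < 0.05
          children[mutate] = rng.integers(0, n_levels, size=mutate.sum())
          pop = children

      best = pop[np.array([fitness(ind) for ind in pop]).argmax()]
      print("levels:", best, "MTBF:", round(mtbf(best), 2), "cost:", cost[best].sum())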

  17. Investigation of Optimal Integrated Circuit Raster Image Vectorization Method

    Directory of Open Access Journals (Sweden)

    Leonas Jasevičius

    2011-03-01

    Full Text Available Visual analysis of an integrated circuit layer requires a raster image vectorization stage to extract layer topology data for CAD tools. In this paper, vectorization problems of raster IC layer images are presented. Various algorithms for line extraction from raster images and their properties are discussed. An optimal raster image vectorization method was developed which allows common vectorization algorithms to achieve the best possible match between the extracted vector data and perfect manual vectorization results. To develop the optimal method, the dependence of vectorized data quality on the selection of the initial raster image skeleton filter was assessed. Article in Lithuanian

  18. Optimal sampling strategies for detecting zoonotic disease epidemics.

    Science.gov (United States)

    Ferguson, Jake M; Langebrake, Jessica B; Cannataro, Vincent L; Garcia, Andres J; Hamman, Elizabeth A; Martcheva, Maia; Osenberg, Craig W

    2014-06-01

    The early detection of disease epidemics reduces the chance of successful introductions into new locales, minimizes the number of infections, and reduces the financial impact. We develop a framework to determine the optimal sampling strategy for disease detection in zoonotic host-vector epidemiological systems when a disease goes from below detectable levels to an epidemic. We find that if the time of disease introduction is known then the optimal sampling strategy can switch abruptly between sampling only from the vector population to sampling only from the host population. We also construct time-independent optimal sampling strategies when conducting periodic sampling that can involve sampling both the host and the vector populations simultaneously. Both time-dependent and -independent solutions can be useful for sampling design, depending on whether the time of introduction of the disease is known or not. We illustrate the approach with West Nile virus, a globally-spreading zoonotic arbovirus. Though our analytical results are based on a linearization of the dynamical systems, the sampling rules appear robust over a wide range of parameter space when compared to nonlinear simulation models. Our results suggest some simple rules that can be used by practitioners when developing surveillance programs. These rules require knowledge of transition rates between epidemiological compartments, which population was initially infected, and of the cost per sample for serological tests.

  19. Incorporate Energy Strategy into Particle Swarm Optimizer Algorithm

    Institute of Scientific and Technical Information of China (English)

    ZHANG Lun; DONG De-cun; LU Yan; CHEN Lan

    2008-01-01

    The issue of optimizing the dynamic parameters in the Particle Swarm Optimizer (PSO) is addressed in this paper. An algorithm is designed in which all particles are originally endowed with a certain level of energy, which we define as EPSO (Energy Strategy PSO). During the iterative process of the PSO algorithm, the inertia weight is updated according to the calculation of each particle's energy. The ratio of the current residual energy to the initially endowed energy is used as the inertia weight, which aims to update the particles' velocities efficiently. Simulations on a graph-theoretical problem and a functional optimization problem show that the rate of convergence of EPSO is clearly increased.
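
    A minimal sketch of the energy-based inertia weight, where each particle's weight is the ratio of its residual to initial energy; the rule for spending energy (proportional to the distance moved) and all constants are assumptions for illustration:

      # Energy-based inertia weight in a toy PSO loop.
      import numpy as np

      rng = np.random.default_rng(2)
      n, dim = 20, 3
      f = lambda p: (p ** 2).sum(axis=1)          # sphere test function
      x = rng.uniform(-5, 5, (n, dim))
      v = np.zeros((n, dim))
      e0 = np.full(n, 100.0)                      # initial endowed energy
      e = e0.copy()
      pbest, pbest_val = x.copy(), f(x)

      for _ in range(150):
          gbest = pbest[pbest_val.argmin()]
          w = e / e0                              # inertia weight = residual / initial energy
          v = (w[:, None] * v
               + 1.5 * rng.random((n, dim)) * (pbest - x)
               + 1.5 * rng.random((n, dim)) * (gbest - x))
          x = x + v
          e = np.maximum(e - np.linalg.norm(v, axis=1), 0.0)  # assumed energy cost of moving
          val = f(x)
          improved = val < pbest_val
          pbest[improved], pbest_val[improved] = x[improved], val[improved]

      print("best value:", pbest_val.min())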

  20. Infomax strategies for an optimal balance between exploration and exploitation

    CERN Document Server

    Reddy, Gautam; Vergassola, Massimo

    2016-01-01

    Proper balance between exploitation and exploration is what makes good decisions, which achieve high rewards like payoff or evolutionary fitness. The Infomax principle postulates that maximization of information directs the function of diverse systems, from living systems to artificial neural networks. While specific applications are successful, the validity of information as a proxy for reward remains unclear. Here, we consider the multi-armed bandit decision problem, which features arms (slot-machines) of unknown probabilities of success and a player trying to maximize cumulative payoff by choosing the sequence of arms to play. We show that an Infomax strategy (Info-p) which optimally gathers information on the highest mean reward among the arms saturates known optimal bounds and compares favorably to existing policies. The highest mean reward considered by Info-p is not the quantity actually needed for the choice of the arm to play, yet it allows for optimal tradeoffs between exploration and exploitation.

  1. Optimal Constrained Resource Allocation Strategies under Low Risk Circumstances

    CERN Document Server

    Andreica, Mugurel Ionut; Visan, Costel

    2009-01-01

    In this paper we consider multiple constrained resource allocation problems, where the constraints can be specified by formulating activity dependency restrictions or by using game-theoretic models. All the problems are focused on generic resources, with a few exceptions which consider financial resources in particular. The problems consider low-risk circumstances and the values of the uncertain variables which are used by the algorithms are the expected values of the variables. For each of the considered problems we propose novel algorithmic solutions for computing optimal resource allocation strategies. The presented solutions are optimal or near-optimal from the perspective of their time complexity. The considered problems have applications in a broad range of domains, like workflow scheduling in industry (e.g. in the mining and metallurgical industry) or the financial sector, motion planning, facility location and data transfer or job scheduling and resource management in Grids, clouds or other distribute...

  2. OPTIMIZED IMAGE RETRIEVAL SYSTEM USING MULTIPLE THREADS

    Directory of Open Access Journals (Sweden)

    S.P. Jeyapriyamvadha

    2012-05-01

    Full Text Available Content-Based Image Retrieval is a technique used to retrieve similar images, where the most challenging aspect is to bridge the gap between low-level feature layout and high-level semantic concepts. Efficient and effective retrieval techniques are desired so that, for a given query image, the returned results are more suitable than the input image alone. A novel method is proposed which links various images as threads, where the texture features and shape features of the images are extracted and stored. The minimum distance between the query image and a thread provides the output. These threads are based upon the query result. The resultant retrieval system is found to be beneficial and interactive.

  3. Optimal image-fusion method based on nonsubsampled contourlet transform

    Science.gov (United States)

    Dou, Jianfang; Li, Jianxun

    2012-10-01

    The optimization of image fusion is researched. Based on the properties of the nonsubsampled contourlet transform (NSCT), namely shift invariance and multiscale and multidirectional expansion, the fusion parameters of the multiscale decomposition scheme are optimized. In order to meet the requirement of feedback optimization, a new image fusion quality metric, the image quality index normalized edge association (IQI-NEA), is built. A polynomial model is adopted to establish the relationship between the IQI-NEA metric and several decomposition levels. The optimal fusion includes four steps. First, the source images are decomposed in the NSCT domain for several given levels. Second, principal component analysis is adopted to fuse the low frequency coefficients and the maximum fusion rule is utilized to fuse the high frequency coefficients, and the fused result is reconstructed from the obtained fused coefficients. Third, the fusion quality metric IQI-NEA is calculated for the source images and fused images. Finally, the optimal fused image and optimal level are obtained through the extremum properties of the polynomial function. The visual and statistical results show that the proposed method optimizes the fusion performance compared to existing fusion schemes, in terms of visual effects and quantitative fusion evaluation indexes.
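
    A hedged sketch of the fusion rules only, with PyWavelets' wavelet decomposition standing in for the NSCT: the low-frequency bands are combined with PCA-derived weights and the high-frequency bands with the maximum-magnitude rule; the metric-driven level selection is omitted and all parameters are illustrative:

      # PCA fusion of approximation bands, max-magnitude fusion of detail bands.
      import numpy as np
      import pywt

      def fuse(img_a, img_b, wavelet="db2", level=3):
          ca = pywt.wavedec2(img_a, wavelet, level=level)
          cb = pywt.wavedec2(img_b, wavelet, level=level)

          # PCA weights for the approximation (low-frequency) coefficients.
          data = np.stack([ca[0].ravel(), cb[0].ravel()])
          eigval, eigvec = np.linalg.eigh(np.cov(data))
          w = np.abs(eigvec[:, np.argmax(eigval)])
          w = w / w.sum()
          fused = [w[0] * ca[0] + w[1] * cb[0]]

          # Maximum-magnitude rule for the detail (high-frequency) coefficients.
          for details_a, details_b in zip(ca[1:], cb[1:]):
              fused.append(tuple(np.where(np.abs(da) >= np.abs(db), da, db)
                                 for da, db in zip(details_a, details_b)))
          return pywt.waverec2(fused, wavelet)

      rng = np.random.default_rng(0)
      a = rng.random((64, 64))
      b = rng.random((64, 64))
      print(fuse(a, b).shape)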

  4. Optimizing selection of decentralized stormwater management strategies in urbanized regions

    Science.gov (United States)

    Yu, Z.; Montalto, F.

    2011-12-01

    A variety of decentralized stormwater options are available for implementation in urbanized regions. These strategies, which include bio-retention, porous pavement, green roofs, etc., vary in terms of cost, ability to reduce runoff, and site applicability. This paper explores the tradeoffs between different types of stormwater control measures that could be applied in a typical urban study area. A nested optimization strategy first identifies the most cost-effective (e.g., runoff reduction per life cycle cost invested) options for individual land parcel typologies, and then scales up the results with detailed attention paid to uncertainty in adoption rates, life cycle costs, and hydrologic performance. The study is performed with a custom-built stochastic rainfall-runoff model (Monte Carlo techniques are used to quantify uncertainties associated with phased implementation of different strategies and different land parcel typologies under synthetic precipitation ensembles). The results are presented as a comparison of cost-effectiveness over a time span of 30 years, and an optimized strategy based on cumulative cost-effectiveness over the period is identified.

  5. Optimized Plane Wave Imaging for Fast and High-Quality Ultrasound Imaging

    DEFF Research Database (Denmark)

    Jensen, Jonas; Stuart, Matthias Bo; Jensen, Jørgen Arendt

    2016-01-01

    This paper presents a method for optimizing parameters affecting the image quality in plane wave imaging. More specifically, the number of emissions and steering angles is optimized to attain the best images with the highest frame rate possible. The method is applied to a specific problem, where ...

  6. VI International Workshop on Nature Inspired Cooperative Strategies for Optimization

    CERN Document Server

    Otero, Fernando; Masegosa, Antonio

    2014-01-01

    Biological and other natural processes have always been a source of inspiration for computer science and information technology. Many emerging problem solving techniques integrate advanced evolution and cooperation strategies, encompassing a range of spatio-temporal scales for visionary conceptualization of evolutionary computation. This book is a collection of research works presented in the VI International Workshop on Nature Inspired Cooperative Strategies for Optimization (NICSO) held in Canterbury, UK. Previous editions of NICSO were held in Granada, Spain (2006 & 2010), Acireale, Italy (2007), Tenerife, Spain (2008), and Cluj-Napoca, Romania (2011). NICSO 2013 and this book provide a place where state-of-the-art research, latest ideas and emerging areas of nature inspired cooperative strategies for problem solving are vigorously discussed and exchanged among the scientific community. The breadth and variety of articles in this book report on nature inspired methods and applications such as Swarm In...

  7. Optimal strategies for electric energy contract decision making

    Science.gov (United States)

    Song, Haili

    2000-10-01

    The power industry restructuring in various countries in recent years has created an environment where trading of electric energy is conducted in a market environment. In such an environment, electric power companies compete for the market share through spot and bilateral markets. Being profit driven, electric power companies need to make decisions on spot market bidding, contract evaluation, and risk management. New methods and software tools are required to meet these upcoming needs. In this research, bidding strategy and contract pricing are studied from a market participant's viewpoint; new methods are developed to guide a market participant in spot and bilateral market operation. A supplier's spot market bidding decision is studied. Stochastic optimization is formulated to calculate a supplier's optimal bids in a single time period. This decision making problem is also formulated as a Markov Decision Process. All the competitors are represented by their bidding parameters with corresponding probabilities. A systematic method is developed to calculate transition probabilities and rewards. The optimal strategy is calculated to maximize the expected reward over a planning horizon. Besides the spot market, a power producer can also trade in the bilateral markets. Bidding strategies in a bilateral market are studied with game theory techniques. Necessary and sufficient conditions of Nash Equilibrium (NE) bidding strategy are derived based on the generators' cost and the loads' willingness to pay. The study shows that in any NE, market efficiency is achieved. Furthermore, all Nash equilibria are revenue equivalent for the generators. The pricing of "Flexible" contracts, which allow delivery flexibility over a period of time with a fixed total amount of electricity to be delivered, is analyzed based on the no-arbitrage pricing principle. The proposed algorithm calculates the price based on the optimality condition of the stochastic optimization formulation

  8. Optimal Bidding Strategies using New Aggregated Demand Model with Particle Swarm Optimization Technique

    Directory of Open Access Journals (Sweden)

    Dr.B.Subramanyam

    2013-02-01

    Full Text Available In this paper, Particle Swarm Optimization (PSO) and Artificial Bee Colony (ABC) algorithms are used to determine the optimal bidding strategy in a competitive auction market implementation. The deregulated power industry faces the challenges of increasing profits while minimizing the associated risks of the system. The market includes generating companies (Gencos) and large consumers. The demand prediction of the system is determined by a neural network, which is trained using the previous day's demand dataset; the training process is achieved by the back propagation algorithm. The fitness of the system is compared between the PSO and ABC techniques, and the maximized fitness gives the optimal bidding strategy of the system. The results for the two techniques are analyzed in this paper. Both techniques are implemented on the MATLAB platform.

  9. Constrained optimization for image restoration using nonlinear programming

    Science.gov (United States)

    Yeh, C.-L.; Chin, R. T.

    1985-01-01

    The constrained optimization problem for image restoration, utilizing incomplete information and partial constraints, is formulated using nonlinear programming techniques. This method restores a distorted image by optimizing a chosen objective function subject to available constraints. The penalty function method of nonlinear programming is used. Both linear and nonlinear objective functions, and linear and nonlinear constraint functions, can be incorporated in the formulation. This formulation provides a generalized approach to solving constrained optimization problems for image restoration. Experiments using this scheme have been performed; the results are compared with those obtained from other restoration methods, and the comparative study is presented.
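
    A generic sketch of the penalty-function formulation: the constrained problem is replaced by an unconstrained one whose objective adds a quadratic penalty on constraint violations, with the penalty weight increased across a few solves; the tiny two-variable problem is illustrative, not an image restoration model:

      # Quadratic exterior penalty method on a toy constrained problem.
      import numpy as np
      from scipy.optimize import minimize

      def objective(x):
          return (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2

      def constraint_violation(x):
          # Feasible set: x0 + x1 <= 1  ->  violation is max(0, x0 + x1 - 1).
          return max(0.0, x[0] + x[1] - 1.0)

      def penalized(x, mu):
          return objective(x) + mu * constraint_violation(x) ** 2

      x = np.zeros(2)
      for mu in (1.0, 10.0, 100.0, 1000.0):       # increase the penalty weight
          x = minimize(penalized, x, args=(mu,), method="Nelder-Mead").x
      print("approximate constrained optimum:", x)   # close to (2.5, -1.5)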

  10. The Tonya Harding Controversy: An Analysis of Image Restoration Strategies.

    Science.gov (United States)

    Benoit, William L.; Hanczor, Robert S.

    1994-01-01

    Analyzes Tonya Harding's defense of her image in "Eye to Eye with Connie Chung," applying the theory of image restoration discourse. Finds that the principal strategies employed in her behalf were bolstering, denial, and attacking her accuser, but that these strategies were not developed very effectively in this instance. (SR)

  11. Synthetic Imaging Maneuver Optimization (SIMO) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Space-based interferometry missions have the potential to revolutionize imaging and astrometry, providing observations of unprecedented accuracy. Realizing the full...

  12. Particle Swarm Optimization With Interswarm Interactive Learning Strategy.

    Science.gov (United States)

    Qin, Quande; Cheng, Shi; Zhang, Qingyu; Li, Li; Shi, Yuhui

    2016-10-01

    The learning strategy in the canonical particle swarm optimization (PSO) algorithm is often blamed for being the primary reason for loss of diversity. Population diversity maintenance is crucial for preventing particles from being stuck into local optima. In this paper, we present an improved PSO algorithm with an interswarm interactive learning strategy (IILPSO) by overcoming the drawbacks of the canonical PSO algorithm's learning strategy. IILPSO is inspired by the phenomenon in human society that interactive learning behavior takes place among different groups. Particles in IILPSO are divided into two swarms. The interswarm interactive learning (IIL) behavior is triggered when the best particle's fitness value of both swarms does not improve for a certain number of iterations. According to the best particle's fitness value of each swarm, the softmax method and roulette method are used to determine the roles of the two swarms as the learning swarm and the learned swarm. In addition, the velocity mutation operator and global best vibration strategy are used to improve the algorithm's global search capability. The IIL strategy is applied to PSO with global star and local ring structures, which are termed the IILPSO-G and IILPSO-L algorithms, respectively. Numerical experiments are conducted to compare the proposed algorithms with eight popular PSO variants. From the experimental results, IILPSO demonstrates good performance in terms of solution accuracy, convergence speed, and reliability. Finally, the variations of the population diversity over the entire search process provide an explanation of why IILPSO performs effectively.

  13. Energy Optimal Control Strategy of PHEV Based on PMP Algorithm

    Directory of Open Access Journals (Sweden)

    Tiezhou Wu

    2017-01-01

    Full Text Available Against the global drive for energy saving and the current boom in the development of energy storage technology at home and abroad, energy-optimal control of the whole hybrid electric vehicle power system, as one of the core technologies of electric vehicles, is bound to become a key target of "clean energy" vehicle development and research. This paper considers the constraints on the performance of the energy storage system in a Parallel Hybrid Electric Vehicle (PHEV), in which the lithium-ion battery frequently charges/discharges, the PHEV consumes a large amount of fuel energy, and energy recovery within a single cycle is difficult, among other issues. The research uses a lithium-ion battery combined with a super-capacitor (SC), i.e., a hybrid energy storage system (Li-SC HESS), working together with an internal combustion engine (ICE) to drive the PHEV. Combined with a PSO-PI controller and a Li-SC HESS internal power-limited management approach, the research proposes a PHEV energy-optimal control strategy. It is based on a revised Pontryagin's minimum principle (PMP) algorithm, and the PHEV vehicle simulation model is established through the ADVISOR software to verify its effectiveness and feasibility. Finally, the results show that the energy optimization control strategy can improve the instantaneity of tracking the PHEV minimum fuel consumption track, save energy, and prolong the life of the lithium-ion batteries, thereby improving hybrid energy storage system performance.

  14. Toeplitz block circulant matrix optimized with particle swarm optimization for compressive imaging

    Science.gov (United States)

    Tao, Huifeng; Yin, Songfeng; Tang, Cong

    2016-10-01

    Compressive imaging is an imaging approach based on compressive sensing theory, which can capture a high resolution image through a small set of measurements. As the core of compressive imaging, the design of the measurement matrix is essential to ensure that the image can be recovered from the measurements. Owing to its fast computation and easy hardware implementation, the Toeplitz block circulant matrix is proposed to realize the encoded samples. The measurement matrix is usually optimized to improve the image reconstruction quality. However, the existing optimization methods can easily destroy the matrix structure when applied to the Toeplitz block circulant matrix, and their deterministic iterative processes are inflexible, because they require the optimization task to satisfy certain mathematical properties. To overcome this problem, a novel method of optimizing the Toeplitz block circulant matrix based on the particle swarm optimization intelligent algorithm is proposed in this paper. The objective function is established by approaching a target matrix, namely the Gram matrix truncated by the Welch threshold. The optimized object is the vector composed of the free entries instead of the Gram matrix. The experimental results indicate that the Toeplitz block circulant measurement matrix can be optimized while preserving the matrix structure by our method, resulting in improved reconstruction quality.
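
    A sketch of the quantity a PSO particle would minimize under these assumptions: the free entries define a partial circulant measurement matrix, and the objective is the distance between its column-normalized Gram matrix and a target whose large off-diagonal entries are shrunk to the Welch bound. The sizes and the single (non-block) circulant structure are simplifications, not the paper's exact construction:

      # Objective for optimizing the free entries of a partial circulant matrix.
      import numpy as np
      from scipy.linalg import circulant

      m, n = 16, 64        # keep the first m rows of an n x n circulant matrix

      def welch_bound(m, n):
          return np.sqrt((n - m) / (m * (n - 1)))

      def objective(free_entries):
          phi = circulant(free_entries)[:m, :]             # partial circulant measurement matrix
          phi = phi / np.linalg.norm(phi, axis=0)          # unit-norm columns
          gram = phi.T @ phi
          target = gram.copy()
          t = welch_bound(m, n)
          off = np.abs(gram) > t
          np.fill_diagonal(off, False)
          target[off] = np.sign(gram[off]) * t             # shrink large off-diagonal entries
          return np.linalg.norm(gram - target, "fro")      # value a PSO particle would minimize

      rng = np.random.default_rng(0)
      print(objective(rng.normal(size=n)))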

  15. Cost Effectiveness Analysis of Optimal Malaria Control Strategies in Kenya

    Directory of Open Access Journals (Sweden)

    Gabriel Otieno

    2016-03-01

    Full Text Available Malaria remains a leading cause of mortality and morbidity among children under five and pregnant women in sub-Saharan Africa, but it is preventable and controllable provided current recommended interventions are properly implemented. Better utilization of malaria intervention strategies will ensure value for money and produce health improvements in the most cost-effective way. The purpose of the value-for-money drive is to develop a better understanding (and better articulation) of costs and results so that more informed, evidence-based choices can be made. Cost effectiveness analysis is carried out to inform decision makers on how to determine where to allocate resources for malaria interventions. This study carries out a cost-effectiveness analysis of one or all possible combinations of the optimal malaria control strategies (Insecticide Treated Bednets (ITNs), Treatment, Indoor Residual Spray (IRS) and Intermittent Preventive Treatment for Pregnant Women (IPTp)) for four different transmission settings in order to assess the extent to which the intervention strategies are beneficial and cost effective. For the four different transmission settings in Kenya, the optimal solutions for the 15 strategies and their associated effectiveness are computed. Cost-effectiveness analysis using the Incremental Cost Effectiveness Ratio (ICER) was done after ranking the strategies in order of increasing effectiveness (total infections averted). The findings show that for the endemic regions the combination of ITNs, IRS, and IPTp was the most cost-effective of all the combined strategies developed in this study for malaria disease control and prevention; for the epidemic-prone areas it is the combination of treatment and IRS; for seasonal areas it is the use of ITNs plus treatment; and for the low-risk areas it is the use of treatment only. Malaria transmission in Kenya can be minimized through tailor-made intervention strategies for malaria control
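
    A minimal sketch of the ICER ranking step with made-up numbers, purely to illustrate the calculation; the strategies and figures are not the study's results:

      # ICER = extra cost per extra infection averted, relative to the previous,
      # less effective strategy in the effectiveness-ranked list.
      strategies = [                     # (name, total cost, infections averted) - illustrative
          ("treatment only",        1.0e6,  40_000),
          ("ITNs + treatment",      2.2e6,  70_000),
          ("ITNs + IRS + IPTp",     3.9e6,  95_000),
      ]

      strategies.sort(key=lambda s: s[2])            # rank by increasing effectiveness
      prev_cost, prev_eff = 0.0, 0
      for name, cost, averted in strategies:
          icer = (cost - prev_cost) / (averted - prev_eff)
          print(f"{name}: ICER = {icer:.2f} cost units per infection averted")
          prev_cost, prev_eff = cost, averted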

  16. Image Processing Oriented to Security Optimization

    Directory of Open Access Journals (Sweden)

    Ion Ivan

    2010-06-01

    Full Text Available This paper presents the main aspects of digital content security. It describes watermarking of content and presents the steganography concept. The SteganoGraphy application is presented and the algorithm used is analyzed. Optimization techniques are introduced to minimize the risk of discovering the information embedded into digital content by means of invisible watermarking. Techniques for analyzing the results on digital content and identifying possible countermeasures for optimizing the steganography algorithm are presented.

  17. Hemoglobin optimization and transfusion strategies in patients undergoing cardiac surgery

    Institute of Scientific and Technical Information of China (English)

    Mahdi Najafi; David Faraoni

    2015-01-01

    Although red blood cell (RBC) transfusion is sometimes associated with adverse reactions, anemia could also lead to increased morbidity and mortality in high-risk patients. For these reasons, the definition of perioperative strategies that aim to detect and treat preoperative anemia, prevent excessive blood loss, and define "optimal" transfusion algorithms is crucial. Although treatment with preoperative iron and erythropoietin has been recommended in some specific conditions, several controversies exist regarding the benefit-to-risk balance associated with these treatments. Further studies are needed to better define the indications, dosage, and route of administration for preoperative iron with or without erythropoietin supplementation. Although restrictive transfusion strategies in patients undergoing cardiac surgery have been shown to effectively reduce the incidence and the amount of RBC transfusion without an increase in side effects, some high-risk patients (e.g., symptomatic acute coronary syndrome) could benefit from higher hemoglobin concentrations. Despite all efforts made over the last decade, a significant amount of work remains to be done to improve hemoglobin optimization and transfusion strategies in patients undergoing cardiac surgery.

  18. Evolving Nash-optimal poker strategies using evolutionary computation

    Institute of Scientific and Technical Information of China (English)

    Hanyang QUEK; Chunghoong WOO; Kaychen TAN; Arthur TAY

    2009-01-01

    This paper focuses on the development of a competitive computer player for one-versus-one Texas Hold'em poker using evolutionary algorithms (EA). A Texas Hold'em game engine is first constructed, in which an efficient odds calculator is programmed to allow for the abstraction of a player's cards, which yield important but complex information. Effort is directed towards realizing an optimal player that will play close to the Nash equilibrium (NE) by proposing a new fitness criterion. Preliminary studies on a simplified version of poker highlighted the intransitive nature of poker. The evolved player displays strategies that are logical but reveals insights that are hard to comprehend, e.g., bluffing. The player is then benchmarked against Poki and PSOpti, which is the best heads-up Texas Hold'em artificial intelligence to date and plays closest to the optimal Nash equilibrium. Despite the much constrained chromosomal strategy representation, simulated results verify that evolutionary algorithms are effective in creating strategies that are comparable to Poki and PSOpti in the absence of expert knowledge.

  19. Optimal Bidding Strategy for Renewable Microgrid with Active Network Management

    Directory of Open Access Journals (Sweden)

    Seung Wan Kim

    2016-01-01

    Full Text Available Active Network Management (ANM) enables a microgrid to optimally dispatch the active/reactive power of its Renewable Distributed Generation (RDG) and Battery Energy Storage System (BESS) units in real time. Thus, a microgrid with high penetration of RDGs can handle their uncertainties and variabilities to achieve stable operation using ANM. However, the actual power flow in the line connecting the main grid and microgrid may deviate significantly from the day-ahead bids if the bids are determined without consideration of the real-time adjustment through ANM, which will lead to a substantial imbalance cost. Therefore, this study proposes a formulation for obtaining an optimal bid which reflects the change of power flow in the connecting line due to real-time adjustment using ANM. The proposed formulation maximizes the expected profit of the microgrid considering various network and physical constraints. The effectiveness of the proposed bidding strategy is verified through simulations with a 33-bus test microgrid. The simulation results show that the proposed bidding strategy improves the expected operating profit by reducing the imbalance cost to a greater degree compared to a basic bidding strategy without consideration of ANM.

  20. An improved technique for the prediction of optimal image resolution ...

    African Journals Online (AJOL)

    user

    2010-10-04

    Oct 4, 2010 ... Key words: Optimal resolution, savannah ecosystems, image noise index, land cover index, .... Most techniques, including those employed by Mugisha .... Resampling imagery using cubic convolution was used because it.

  1. Computer teaching process optimization strategy analysis of thinking ability

    Directory of Open Access Journals (Sweden)

    Luo Liang

    2016-01-01

    Full Text Available Computer fundamentals is one of the basic courses taken by university students and lays the theoretical foundation for their subsequent professional learning. In recent years, governments and universities have attached great importance to computer teaching for young undergraduates, with the aim of improving students' computational thinking and, ultimately, their ability to use computational thinking to analyze and solve problems of daily life. This article therefore further discusses and analyzes how to cultivate and optimize college students' computational thinking in the computer teaching process, and explores corresponding strategies and methods, so as to promote the cultivation of thinking ability and optimize computer teaching.

  2. Strategy of Concurrent Optimization for an Assembly Sequence

    Institute of Scientific and Technical Information of China (English)

    YANG Bo; LIU Lu-ning; ZE Xiang-bo

    2005-01-01

    An effective constraint-release-based approach to realize concurrent optimization of an assembly sequence is proposed. To quantify the measurement of assembly efficiency, a mathematical model of the concurrency evaluation index was first put forward, and then a technique to quantify assembly constraints was developed by applying fuzzy logic algorithms. In the process of concurrent optimization of the assembly sequence, two kinds of constraints were involved. One was the self-constraints of components, used to evaluate the assembly capability of components under the condition of full freedom. The other was the assembly constraints between components, represented by geometric constraints between points, lines and planes under physical restriction conditions. The concept of connection strength degree (CSD) was introduced as an efficiency indicator, and its value was evaluated from the intersection of the two constraints mentioned above. The equivalent constraints describing the connection weights between components were obtained by a well-designed constraint reduction, and the connection-weight-based complete assembly liaison graph was then applied to release virtual connections between components. Under a given threshold value, a decomposition and reconstitution strategy for the graph, focused on high assembly concurrency, was used to realize an optimized assembly concurrency evaluation index. Finally, the validity of the approach was illustrated with an example optimizing the assembly of a shift pump.

  3. Issues and Strategies in Solving Multidisciplinary Optimization Problems

    Science.gov (United States)

    Patnaik, Surya

    2013-01-01

    Optimization research at NASA Glenn Research Center has addressed the design of structures, aircraft and airbreathing propulsion engines. The accumulated multidisciplinary design activity is collected under a testbed entitled COMETBOARDS. Several issues were encountered during the solution of the problems. Four issues and the strategies adapted for their resolution are discussed. This is followed by a discussion on analytical methods that is limited to structural design application. An optimization process can lead to an inefficient local solution. This deficiency was encountered during design of an engine component. The limitation was overcome through an augmentation of animation into optimization. Optimum solutions obtained were infeasible for aircraft and airbreathing propulsion engine problems. Alleviation of this deficiency required a cascading of multiple algorithms. Profile optimization of a beam produced an irregular shape. Engineering intuition restored the regular shape for the beam. The solution obtained for a cylindrical shell by a subproblem strategy converged to a design that can be difficult to manufacture. Resolution of this issue remains a challenge. The issues and resolutions are illustrated through a set of problems: Design of an engine component, Synthesis of a subsonic aircraft, Operation optimization of a supersonic engine, Design of a wave-rotor-topping device, Profile optimization of a cantilever beam, and Design of a cylindrical shell. This chapter provides a cursory account of the issues. Cited references provide detailed discussion on the topics. Design of a structure can also be generated by traditional method and the stochastic design concept. Merits and limitations of the three methods (traditional method, optimization method and stochastic concept) are illustrated. In the traditional method, the constraints are manipulated to obtain the design and weight is back calculated. In design optimization, the weight of a structure becomes the

  4. Optimized Control Strategy For Over Loaded Offshore Wind Turbines

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Knudsen, Torben; Wisniewski, Rafal

    2015-01-01

    Abstract Optimized control strategy for overloaded offshore wind turbines Introduction Operation and maintenance costs are an important part of the cost of energy, especially for offshore wind farms. Typically, unplanned service is called for due to detection of excessive loads on components, e...... controller tuning for a given wind turbine. It also enables a very safe and robust comparison between a new control strategy and the present one. Main body of abstract Is power de-rating indeed the best way to reduce loads? The power de-rating approach has the drawback of only indirectly...... and service at offshore locations, where accessibility can be problematic. The controller objectives are focused directly on the actual objective, such as lowering of fore-aft fatigue loads, instead of using an indirect objective of de-rating the power production of the wind turbine. This means that the wind...

  5. Process of Market Strategy Optimization Using Distributed Computing Systems

    Directory of Open Access Journals (Sweden)

    Nowicki Wojciech

    2015-12-01

    Full Text Available If market repeatability is assumed, it is possible, with some real probability, to deduce short-term market changes by making some calculations. An algorithm based on a logical and statistically reasonable scheme for making decisions about opening or closing a position on a market is called an automated strategy. Due to market volatility, all parameters change from time to time, so there is a need to optimize them constantly. This article describes a team organization process for researching market strategies. Individual team members are merged into small groups according to their responsibilities. The team members perform data processing tasks through a cascade organization, providing solutions that speed up work related to the use of remote computing resources. They also work out how to store results in a suitable way, according to the type of task, and how to facilitate the publication of a large amount of results.

  6. Sequential optimizing strategy in multi-dimensional bounded forecasting games

    CERN Document Server

    Kumon, Masayuki; Takeuchi, Kei

    2009-01-01

    We propose a sequential optimizing betting strategy in the multi-dimensional bounded forecasting game in the framework of game-theoretic probability of Shafer and Vovk (2001). By studying the asymptotic behavior of its capital process, we prove a generalization of the strong law of large numbers, where the convergence rate of the sample mean vector depends on the growth rate of the quadratic variation process. The growth rate of the quadratic variation process may be slower than the number of rounds or may even be zero. We also introduce an information criterion for selecting efficient betting items. These results are then applied to multiple asset trading strategies in discrete-time and continuous-time games. In the case of continuous-time game we present a measure of the jaggedness of a vector-valued continuous process. Our results are examined by several numerical examples.

  7. A Novel Optimization-Based Approach for Content-Based Image Retrieval

    Directory of Open Access Journals (Sweden)

    Manyu Xiao

    2013-01-01

    Full Text Available Content-based image retrieval is nowadays one of the possible and promising solutions to manage image databases effectively. However, with the large number of images, there still exists a great discrepancy between the users' expectations (accuracy and efficiency) and the real performance in image retrieval. In this work, new optimization strategies are proposed for vocabulary tree building, retrieval, and matching methods. More precisely, a new clustering strategy combining classification and the conventional K-Means method is first defined. Then a new matching technique is built to eliminate the error caused by large-scale scale-invariant feature transform (SIFT). Additionally, a new unit mechanism is proposed to reduce the cost of indexing time. Finally, the numerical results show that excellent performance is obtained in both accuracy and efficiency based on the proposed improvements for image retrieval.
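    The record gives no implementation details, but the conventional-K-Means half of the clustering step can be illustrated with a short, self-contained sketch; the descriptor dimensionality, vocabulary size, and random data below are assumptions for illustration, not values from the paper.

```python
import numpy as np

def kmeans(descriptors, k, iters=20, seed=0):
    """Plain K-Means used to build a small visual vocabulary."""
    rng = np.random.default_rng(seed)
    centers = descriptors[rng.choice(len(descriptors), k, replace=False)]
    for _ in range(iters):
        # Assign each descriptor to its nearest visual word.
        dist = np.linalg.norm(descriptors[:, None, :] - centers[None, :, :], axis=2)
        labels = dist.argmin(axis=1)
        # Recompute each cluster center; keep the old one if a cluster empties.
        for j in range(k):
            members = descriptors[labels == j]
            if len(members):
                centers[j] = members.mean(axis=0)
    return centers, labels

# Hypothetical 128-D SIFT-like descriptors from a toy image collection.
rng = np.random.default_rng(1)
descs = rng.normal(size=(500, 128))
vocab, words = kmeans(descs, k=16)
print("vocabulary shape:", vocab.shape)           # (16, 128)
print("first 10 visual-word labels:", words[:10])
```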

  8. Determining the Bayesian optimal sampling strategy in a hierarchical system.

    Energy Technology Data Exchange (ETDEWEB)

    Grace, Matthew D.; Ringland, James T.; Boggs, Paul T.; Pebay, Philippe Pierre

    2010-09-01

    Consider a classic hierarchy tree as a basic model of a 'system-of-systems' network, where each node represents a component system (which may itself consist of a set of sub-systems). For this general composite system, we present a technique for computing the optimal testing strategy, which is based on Bayesian decision analysis. In previous work, we developed a Bayesian approach for computing the distribution of the reliability of a system-of-systems structure that uses test data and prior information. This allows for the determination of both an estimate of the reliability and a quantification of confidence in the estimate. Improving the accuracy of the reliability estimate and increasing the corresponding confidence require the collection of additional data. However, testing all possible sub-systems may not be cost-effective, feasible, or even necessary to achieve an improvement in the reliability estimate. To address this sampling issue, we formulate a Bayesian methodology that systematically determines the optimal sampling strategy under specified constraints and costs that will maximally improve the reliability estimate of the composite system, e.g., by reducing the variance of the reliability distribution. This methodology involves calculating the 'Bayes risk of a decision rule' for each available sampling strategy, where risk quantifies the relative effect that each sampling strategy could have on the reliability estimate. A general numerical algorithm is developed and tested using an example multicomponent system. The results show that the procedure scales linearly with the number of components available for testing.
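    As a rough illustration of selecting the next sub-system to test by its expected effect on the variance of the system reliability estimate (the report's Bayes-risk machinery is more general), the sketch below uses a two-component series system with Beta priors; the priors, the test budget, and the Monte Carlo sizes are all made-up values.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-component series system with Beta(alpha, beta) reliability
# priors; component 0 is already better characterized than component 1.
priors = [(8.0, 2.0), (3.0, 3.0)]
N_NEW_TESTS = 5                      # budget: 5 extra tests on a single component

def system_reliability_var(post, n_mc=5000):
    """Monte Carlo estimate of Var[R_system] for R_system = R1 * R2 (series)."""
    samples = np.prod([rng.beta(a, b, n_mc) for a, b in post], axis=0)
    return samples.var()

def expected_posterior_var(component, n_tests, n_outer=300):
    """Preposterior (expected) variance if the tests are spent on one component."""
    total = 0.0
    for _ in range(n_outer):
        a, b = priors[component]
        p = rng.beta(a, b)                       # draw a plausible 'true' reliability
        successes = rng.binomial(n_tests, p)     # simulate the test outcome
        post = list(priors)
        post[component] = (a + successes, b + n_tests - successes)
        total += system_reliability_var(post)
    return total / n_outer

scores = [expected_posterior_var(c, N_NEW_TESTS) for c in range(len(priors))]
best = int(np.argmin(scores))
print("expected posterior variance per strategy:", np.round(scores, 5))
print("test component", best, "next (largest expected variance reduction)")
```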

  9. Space, time, error, and power optimization of image compression transforms

    Science.gov (United States)

    Schmalz, Mark S.; Ritter, Gerhard X.; Caimi, Frank M.

    2000-11-01

    The implementation of an image compression transform on one or more small, embedded processors typically involves stringent constraints on power consumption and form factor. Traditional methods of optimizing compression algorithm performance typically emphasize joint minimization of space and time complexity, often without significant consideration of arithmetic accuracy or power consumption. However, small autonomous imaging platforms typically require joint optimization of space, time, error (or accuracy), and power (STEP) parameters, which the authors call STEP optimization. In response to implementational constraints on space and power consumption, the authors have developed systems and techniques for STEP optimization that are based on recent research in VLSI circuit design, as well as extensive previous work in system optimization. Building on the authors' previous research in embedded processors as well as adaptive or reconfigurable computing, it is possible to produce system-independent STEP optimization that can be customized for a given set of system-specific constraints. This approach is particularly useful when algorithms for image and signal processing (ISP) computer vision (CV), or automated target recognition (ATR), expressed in a machine- independent notation, are mapped to one or more heterogeneous processors (e.g., digital signal processors or DSPs, SIMD mesh processors, or reconfigurable logic). Following a theoretical summary, this paper illustrates various STEP optimization techniques via case studies, for example, real-time compression of underwater imagery on board an autonomous vehicle. Optimization algorithms are taken from the literature, and error profiling/analysis methodologies developed in the authors' previous research are employed. This yields a more rigorous basis for the simulation and evaluation of compression algorithms on a wide variety of hardware models. In this study, image algebra is employed as the notation of choice

  10. Increasing accuracy and precision of digital image correlation through pattern optimization

    Science.gov (United States)

    Bomarito, G. F.; Hochhalter, J. D.; Ruggles, T. J.; Cannon, A. H.

    2017-04-01

    The accuracy and precision of digital image correlation (DIC) is based on three primary components: image acquisition, image analysis, and the subject of the image. Focus on the third component, the image subject, has been relatively limited and primarily concerned with comparing pseudo-random surface patterns. In the current work, a strategy is proposed for the creation of optimal DIC patterns. In this strategy, a pattern quality metric is developed as a combination of quality metrics from the literature rather than optimization based on any single one of them. In this way, optimization produces a pattern which balances the benefits of multiple quality metrics. Specifically, sum of square of subset intensity gradients (SSSIG) was found to be the metric most strongly correlated to DIC accuracy and thus is the main component of the newly proposed pattern quality metric. A term related to the secondary auto-correlation peak height is also part of the proposed quality metric which effectively acts as a constraint upon SSSIG ensuring that a regular (e.g., checkerboard-type) pattern is not achieved. The combined pattern quality metric is used to generate a pattern that was on average 11.6% more accurate than a randomly generated pattern in a suite of numerical experiments. Furthermore, physical experiments were performed which confirm that there is indeed improvement of a similar magnitude in DIC measurements for the optimized pattern compared to a random pattern.
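    A minimal sketch of the metric's dominant term, SSSIG, evaluated for two toy subsets; the secondary auto-correlation term and the exact weighting of the combined metric are not reproduced here.

```python
import numpy as np

def sssig(subset):
    """Sum of square of subset intensity gradients (SSSIG) for one DIC subset."""
    gy, gx = np.gradient(subset.astype(float))
    return float((gx ** 2 + gy ** 2).sum())

rng = np.random.default_rng(0)
speckle = rng.integers(0, 256, size=(41, 41))        # speckle-like, high-contrast subset
ramp = np.tile(np.linspace(0, 255, 41), (41, 1))     # smooth, low-gradient subset

print("SSSIG, speckle subset:", sssig(speckle))
print("SSSIG, ramp subset   :", sssig(ramp))   # far smaller -> poorer expected DIC accuracy
```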

  11. New Optimal DWT Domain Image Watermarking Technique via Genetic Algorithm

    Institute of Scientific and Technical Information of China (English)

    ZHONG Ning; KUANG Jing-ming; HE Zun-wen

    2007-01-01

    A novel optimal image watermarking scheme is proposed in which the genetic algorithm (GA) is employed to obtain the improvement of algorithm performance. Arnold transform is utilized to obtain the scrambled watermark, and then the embedding and extraction of watermark are implemented in digital wavelet transform (DWT) domain. During the watermarking process, GA is employed to search optimal parameters of embedding strength and times of Arnold transform to gain the optimization of watermarking performance. Simulation results show that the proposed method can improve the quality of watermarked image and give almost the same robustness of the watermark.
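    The record does not give the GA settings, so the sketch below only illustrates the search loop: a small population evolves the embedding strength and the number of Arnold iterations against a smooth surrogate fitness that stands in for the real quality/robustness trade-off. The surrogate, the parameter bounds, and the mutation rates are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(alpha, arnold_iters):
    """Illustrative stand-in for the paper's objective. In the real scheme this
    would embed the Arnold-scrambled watermark into DWT coefficients with
    strength `alpha` and combine watermarked-image quality with watermark
    robustness after attacks; a smooth surrogate keeps the sketch short."""
    imperceptibility = -400.0 * (alpha - 0.05) ** 2     # image quality drops as alpha grows
    robustness = 1.0 - np.exp(-60.0 * alpha)            # robustness saturates with alpha
    scramble_term = -0.001 * abs(arnold_iters - 12)     # mild, toy preference
    return imperceptibility + robustness + scramble_term

POP, GENS = 30, 40
pop = np.column_stack([rng.uniform(0.0, 0.2, POP),      # embedding strength
                       rng.integers(1, 30, POP)])       # Arnold transform iterations

for _ in range(GENS):
    scores = np.array([fitness(a, int(k)) for a, k in pop])
    parents = pop[np.argsort(scores)[-POP // 2:]]        # keep the fitter half
    children = parents.copy()
    children[:, 0] = np.clip(children[:, 0] + rng.normal(0, 0.01, len(children)), 0.0, 0.2)
    children[:, 1] = np.clip(children[:, 1] + rng.integers(-2, 3, len(children)), 1, 30)
    pop = np.vstack([parents, children])

a_best, k_best = max(pop, key=lambda ind: fitness(ind[0], int(ind[1])))
print("best embedding strength %.3f, Arnold iterations %d" % (a_best, int(k_best)))
```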

  12. An approach to designing optimal imaging systems

    Energy Technology Data Exchange (ETDEWEB)

    Seeley, G.W.; Barrett, H.H.; Borgstrom, M.C.; Cargill, E.B.; Fiete, R.D.; Myers, K.J.; Patton, D.D.; Smith, W.E.; Stempski, M.O.; Paxman, R.G.

    1985-05-01

    This paper reports recent work by the authors to develop a systematic basis for the improvement of existing and the development of new imaging systems for nuclear medicine. Assessment of imaging systems is typically done by using the radiologists' perceptual skills in a number of tasks which approximate the clinical setting. For these psycho-physical experiments, an object class with a specified number of categories must first be selected (e.g., liver with or without lesions). Data collected by the system to be evaluated are then used to generate a set of images which are displayed to the observers. From the experiment comes a figure of merit that is used to evaluate the system. However, there is often no clear indication of how one should use the information from the psychophysical study to guide physicists and engineers toward specific improvements in the imaging system. Proposed here is a procedure which will provide a feedback loop for system improvement. A key part of this procedure involves identifying and selecting features that can be used to classify images into their respective categories. The human-evaluation segment of the paradigm, which makes use of signal-detection theory and multidimensional scaling techniques, serves as a verification of the computer-selected features.

  13. Infomax Strategies for an Optimal Balance Between Exploration and Exploitation

    Science.gov (United States)

    Reddy, Gautam; Celani, Antonio; Vergassola, Massimo

    2016-06-01

    Proper balance between exploitation and exploration is what makes good decisions that achieve high reward, like payoff or evolutionary fitness. The Infomax principle postulates that maximization of information directs the function of diverse systems, from living systems to artificial neural networks. While specific applications turn out to be successful, the validity of information as a proxy for reward remains unclear. Here, we consider the multi-armed bandit decision problem, which features arms (slot-machines) of unknown probabilities of success and a player trying to maximize cumulative payoff by choosing the sequence of arms to play. We show that an Infomax strategy (Info-p) which optimally gathers information on the highest probability of success among the arms, saturates known optimal bounds and compares favorably to existing policies. Conversely, gathering information on the identity of the best arm in the bandit leads to a strategy that is vastly suboptimal in terms of payoff. The nature of the quantity selected for Infomax acquisition is then crucial for effective tradeoffs between exploration and exploitation.

  14. Optimal measurement strategies for effective suppression of drift errors

    Energy Technology Data Exchange (ETDEWEB)

    Yashchuk, Valeriy V.

    2009-04-16

    Drifting of experimental set-ups with change of temperature or other environmental conditions is the limiting factor of many, if not all, precision measurements. The measurement error due to a drift is, in some sense, in-between random noise and systematic error. In the general case, the error contribution of a drift cannot be averaged out using a number of measurements identically carried out over a reasonable time. In contrast to systematic errors, drifts are usually not stable enough for a precise calibration. Here a rather general method for effective suppression of the spurious effects caused by slow drifts in a large variety of instruments and experimental set-ups is described. An analytical derivation of an identity, describing the optimal measurement strategies suitable for suppressing the contribution of a slow drift described with a certain order polynomial function, is presented. A recursion rule as well as a general mathematical proof of the identity is given. The effectiveness of the discussed method is illustrated with an application of the derived optimal scanning strategies to precise surface slope measurements with a surface profiler.
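    One elementary instance of such a strategy is the forward-plus-reversed scan: measuring the same points twice in opposite order and averaging removes the point-to-point variation caused by a drift that is linear in time (only a common offset remains, and higher-order drifts need longer symmetric sequences). The toy profile and drift rate below are assumed for illustration.

```python
import numpy as np

n = 11
true_profile = np.sin(np.linspace(0, np.pi, n))     # toy surface-slope profile
drift_rate = 0.03                                   # drift units per time step

t_forward = np.arange(n)                            # point i measured at time i
t_reverse = np.arange(n, 2 * n)[::-1]               # same points, reversed order

forward = true_profile + drift_rate * t_forward
reverse = true_profile + drift_rate * t_reverse
combined = 0.5 * (forward + reverse)                # average of the two runs

# Peak-to-peak spread of the error: the combined scan retains only a constant
# offset, so its spread collapses to (numerically) zero for a linear drift.
print("error spread, single forward scan :", np.ptp(forward - true_profile))
print("error spread, forward+reverse mean:", np.ptp(combined - true_profile))
```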

  15. Web malware spread modelling and optimal control strategies

    Science.gov (United States)

    Liu, Wanping; Zhong, Shouming

    2017-02-01

    The popularity of the Web improves the growth of web threats. Formulating mathematical models for accurate prediction of malicious propagation over networks is of great importance. The aim of this paper is to understand the propagation mechanisms of web malware and the impact of human intervention on the spread of malicious hyperlinks. Considering the characteristics of web malware, a new differential epidemic model which extends the traditional SIR model by adding another delitescent compartment is proposed to address the spreading behavior of malicious links over networks. The spreading threshold of the model system is calculated, and the dynamics of the model is theoretically analyzed. Moreover, the optimal control theory is employed to study malware immunization strategies, aiming to keep the total economic loss of security investment and infection loss as low as possible. The existence and uniqueness of the results concerning the optimality system are confirmed. Finally, numerical simulations show that the spread of malware links can be controlled effectively with proper control strategy of specific parameter choice.
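    The record does not reproduce the model equations; a generic SIR model extended with a delitescent (latent) compartment, integrated with SciPy, conveys the structure. All rates below are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

# S: susceptible, D: delitescent (exposed to a malicious link, not yet spreading),
# I: infective, R: recovered/immunized. Rates are made-up illustration values;
# in this simplified version beta/gamma plays the role of the spreading threshold.
beta, sigma, gamma = 0.5, 0.2, 0.1      # infection, activation, recovery rates

def rhs(t, y):
    s, d, i, r = y
    return [-beta * s * i,
            beta * s * i - sigma * d,
            sigma * d - gamma * i,
            gamma * i]

sol = solve_ivp(rhs, (0.0, 200.0), [0.99, 0.0, 0.01, 0.0], dense_output=True)
s, d, i, r = sol.sol(np.linspace(0.0, 200.0, 5))
print("infective fraction at t = 0, 50, 100, 150, 200:", np.round(i, 3))
```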

  16. The optimal polarizations for achieving maximum contrast in radar images

    Science.gov (United States)

    Swartz, A. A.; Yueh, H. A.; Kong, J. A.; Novak, L. M.; Shin, R. T.

    1988-01-01

    There is considerable interest in determining the optimal polarizations that maximize contrast between two scattering classes in polarimetric radar images. A systematic approach is presented for obtaining the optimal polarimetric matched filter, i.e., that filter which produces maximum contrast between two scattering classes. The maximization procedure involves solving an eigenvalue problem where the eigenvector corresponding to the maximum contrast ratio is an optimal polarimetric matched filter. To exhibit the physical significance of this filter, it is transformed into its associated transmitting and receiving polarization states, written in terms of horizontal and vertical vector components. For the special case where the transmitting polarization is fixed, the receiving polarization which maximizes the contrast ratio is also obtained. Polarimetric filtering is then applied to synthetic aperture radar images obtained from the Jet Propulsion Laboratory. It is shown, both numerically and through the use of radar imagery, that maximum image contrast can be realized when data is processed with the optimal polarimetric matched filter.
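    Maximizing the ratio of the mean powers returned by two scattering classes is a generalized eigenvalue problem, which a few lines of SciPy can illustrate; the 3x3 polarimetric covariance matrices below are randomly generated stand-ins, not real SAR statistics.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)

def random_cov(scale):
    """Random Hermitian positive-definite matrix standing in for a class covariance."""
    m = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
    return scale * (m @ m.conj().T) + np.eye(3)

cov_target, cov_clutter = random_cov(2.0), random_cov(0.5)

# The contrast ratio w^H A w / w^H B w is maximized by the eigenvector of the
# largest generalized eigenvalue of (A, B).
vals, vecs = eigh(cov_target, cov_clutter)
w_opt = vecs[:, -1]                               # eigenvalues come back in ascending order
contrast = np.real(w_opt.conj() @ cov_target @ w_opt) / \
           np.real(w_opt.conj() @ cov_clutter @ w_opt)
print("maximum contrast ratio:", round(contrast, 3), "=", round(float(vals[-1]), 3))
```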

  17. Global optimization for multisensor fusion in seismic imaging

    Energy Technology Data Exchange (ETDEWEB)

    Barhen, J.; Protopopescu, V.; Reister, D. [Oak Ridge National Lab., TN (United States). Center for Engineering Systems Advanced Research

    1997-06-01

    The accurate imaging of subsurface structures requires the fusion of data collected from large arrays of seismic sensors. The fusion process is formulated as an optimization problem and yields an extremely complex energy surface. Due to the very large number of local minima to be explored and escaped from, the seismic imaging problem has typically been tackled with stochastic optimization methods based on Monte Carlo techniques. Unfortunately, these algorithms are very cumbersome and computationally intensive. Here, the authors present TRUST--a novel deterministic algorithm for global optimization that they apply to seismic imaging. The excellent results demonstrate that TRUST may provide the necessary breakthrough to address major scientific and technological challenges in fields as diverse as seismic modeling, process optimization, and protein engineering.

  18. Method of Fire Image Identification Based on Optimization Theory

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In view of some distinctive characteristics of the early-stage flame image, a corresponding method of characteristic extraction is presented. Also introduced is the application of the improved BP algorithm based on the optimization theory to identifying fire image characteristics. First the optimization of BP neural network adopting Levenberg-Marquardt algorithm with the property of quadratic convergence is discussed, and then a new system of fire image identification is devised. Plenty of experiments and field tests have proved that this system can detect the early-stage fire flame quickly and reliably.

  19. Optimal Bidding Strategy in Power Market before and after Congestion Management Using Invasive Weed Optimization

    Directory of Open Access Journals (Sweden)

    Mohsen Khalilpour

    2013-02-01

    Full Text Available Power companies world-wide have been restructuring their electric power systems from a vertically integrated entity to a deregulated, open-market environment. Previously, electric utilities usually sought to maximize the social welfare of the system with distributional equity as its main operational criterion. The operating paradigm was based on achieving the least-cost system solution while meeting reliability and security margins. This often resulted in investments in generating capacity operating at very low capacity factors. Decommissioning of this type of generating capacity was a natural outcome when the vertically integrated utilities moved over to deregulated market operations. This study proposes an optimal bidding strategy, based on load demand, for pricing the generating power of the different units in the investigated system. The effect of congestion on this bidding strategy is then investigated. The described analysis is implemented on 5-bus and 9-bus test systems, with the Invasive Weed Optimization algorithm used as the optimization technique; the results are compared with those obtained by a GA. Finally, the examined systems are simulated using the Power World software; the experimental results show that the proposed Invasive Weed Optimization technique outperforms the GA for congestion management purposes.

  20. Improved Strategies for Parallel Medical Image Processing Applications

    Institute of Scientific and Technical Information of China (English)

    WANG Kun; WANG Xiao-ying; LI San-li; CHEN Ying

    2008-01-01

    In order to meet the demands of highly efficient and real-time computer-assisted diagnosis and screening in medicine, improving the efficacy of parallel medical image processing is of great importance. This article proposes improved strategies for parallel medical image processing applications, which are categorized into two classes. For each class an individual strategy is devised, including a theoretical algorithm for minimizing the execution time. An experiment using mammograms not only justifies the validity of the theoretical analysis, with reasonable agreement between the theoretical and measured values, but also shows that when the improved strategies are adopted, the efficacy of parallel medical image processing is greatly improved.

  1. Strategy Developed for Selecting Optimal Sensors for Monitoring Engine Health

    Science.gov (United States)

    2004-01-01

    Sensor indications during rocket engine operation are the primary means of assessing engine performance and health. Effective selection and location of sensors in the operating engine environment enables accurate real-time condition monitoring and rapid engine controller response to mitigate critical fault conditions. These capabilities are crucial to ensure crew safety and mission success. Effective sensor selection also facilitates postflight condition assessment, which contributes to efficient engine maintenance and reduced operating costs. Under the Next Generation Launch Technology program, the NASA Glenn Research Center, in partnership with Rocketdyne Propulsion and Power, has developed a model-based procedure for systematically selecting an optimal sensor suite for assessing rocket engine system health. This optimization process is termed the systematic sensor selection strategy. Engine health management (EHM) systems generally employ multiple diagnostic procedures including data validation, anomaly detection, fault-isolation, and information fusion. The effectiveness of each diagnostic component is affected by the quality, availability, and compatibility of sensor data. Therefore systematic sensor selection is an enabling technology for EHM. Information in three categories is required by the systematic sensor selection strategy. The first category consists of targeted engine fault information; including the description and estimated risk-reduction factor for each identified fault. Risk-reduction factors are used to define and rank the potential merit of timely fault diagnoses. The second category is composed of candidate sensor information; including type, location, and estimated variance in normal operation. The final category includes the definition of fault scenarios characteristic of each targeted engine fault. These scenarios are defined in terms of engine model hardware parameters. Values of these parameters define engine simulations that generate

  2. Optimal ship imaging for shore-based ISAR using DCF estimation

    Institute of Scientific and Technical Information of China (English)

    Ling Wang; Zhenxiao Cao; Ning Li; Teng Jing; Daiyin Zhu

    2015-01-01

    The optimal imaging time selection of ship targets for shore-based inverse synthetic aperture radar (ISAR) in high sea conditions is investigated. The optimal imaging time includes optimal imaging instants and optimal imaging duration. A novel method for optimal imaging instants selection based on the estimation of the Doppler centroid frequencies (DCFs) of a series of images obtained over continuous short durations is proposed. Combined with the optimal imaging duration selection scheme using the image contrast maximization criteria, this method can provide the ship images with the highest focus. Simulated and real data processing results verify the effectiveness of the proposed imaging method.

  3. PEMFC Optimization Strategy with Auxiliary Power Source in Fuel Cell Hybrid Vehicle

    Directory of Open Access Journals (Sweden)

    Tinton Dwi Atmaja

    2012-02-01

    Full Text Available One of the present-day implementations of fuel cells is acting as the main power source in a Fuel Cell Hybrid Vehicle (FCHV). This paper proposes strategies to optimize the performance of a Polymer Electrolyte Membrane Fuel Cell (PEMFC) combined with an auxiliary power source to construct a proper FCHV hybridization. The strategies consist of the most up-to-date optimization methods determined from three points of view, i.e., Energy Storage System (ESS), hybridization topology, and control system analysis. The goal of these strategies is to achieve an optimum hybridization with long lifetime, low cost, high efficiency, and an improved hydrogen consumption rate. The energy storage system strategy considers the battery, supercapacitor, and high-speed flywheel as the most promising alternative auxiliary power sources. The hybridization topology strategy analyzes the use of multiple storage devices combined with electronic components to achieve higher fuel economy and cost savings. The control system strategy employs a nonlinear control system to optimize the ripple factor of the voltage and the current.

  4. Survey strategy optimization for the Atacama Cosmology Telescope

    Science.gov (United States)

    De Bernardis, F.; Stevens, J. R.; Hasselfield, M.; Alonso, D.; Bond, J. R.; Calabrese, E.; Choi, S. K.; Crowley, K. T.; Devlin, M.; Dunkley, J.; Gallardo, P. A.; Henderson, S. W.; Hilton, M.; Hlozek, R.; Ho, S. P.; Huffenberger, K.; Koopman, B. J.; Kosowsky, A.; Louis, T.; Madhavacheril, M. S.; McMahon, J.; Næss, S.; Nati, F.; Newburgh, L.; Niemack, M. D.; Page, L. A.; Salatino, M.; Schillaci, A.; Schmitt, B. L.; Sehgal, N.; Sievers, J. L.; Simon, S. M.; Spergel, D. N.; Staggs, S. T.; van Engelen, A.; Vavagiakis, E. M.; Wollack, E. J.

    2016-07-01

    In recent years there have been significant improvements in the sensitivity and the angular resolution of the instruments dedicated to the observation of the Cosmic Microwave Background (CMB). ACTPol is the first polarization receiver for the Atacama Cosmology Telescope (ACT) and is observing the CMB sky with arcmin resolution over 2000 sq. deg. Its upgrade, Advanced ACTPol (AdvACT), will observe the CMB in five frequency bands and over a larger area of the sky. We describe the optimization and implementation of the ACTPol and AdvACT surveys. The selection of the observed fields is driven mainly by the science goals, that is, small angular scale CMB measurements, B-mode measurements and cross-correlation studies. For the ACTPol survey we have observed patches of the southern galactic sky with low galactic foreground emissions which were also chosen to maximize the overlap with several galaxy surveys to allow unique cross-correlation studies. A wider field in the northern galactic cap ensured significant additional overlap with the BOSS spectroscopic survey. The exact shapes and footprints of the fields were optimized to achieve uniform coverage and to obtain cross-linked maps by observing the fields with different scan directions. We have maximized the efficiency of the survey by implementing a close to 24 hour observing strategy, switching between daytime and nighttime observing plans and minimizing the telescope idle time. We describe the challenges represented by the survey optimization for the significantly wider area observed by AdvACT, which will observe roughly half of the low-foreground sky. The survey strategies described here may prove useful for planning future ground-based CMB surveys, such as the Simons Observatory and CMB Stage IV surveys.

  5. Using Chemical Reaction Kinetics to Predict Optimal Antibiotic Treatment Strategies

    Science.gov (United States)

    Abel zur Wiesch, Pia; Cohen, Ted

    2017-01-01

    Identifying optimal dosing of antibiotics has proven challenging—some antibiotics are most effective when they are administered periodically at high doses, while others work best when minimizing concentration fluctuations. Mechanistic explanations for why antibiotics differ in their optimal dosing are lacking, limiting our ability to predict optimal therapy and leading to long and costly experiments. We use mathematical models that describe both bacterial growth and intracellular antibiotic-target binding to investigate the effects of fluctuating antibiotic concentrations on individual bacterial cells and bacterial populations. We show that physicochemical parameters, e.g. the rate of drug transmembrane diffusion and the antibiotic-target complex half-life are sufficient to explain which treatment strategy is most effective. If the drug-target complex dissociates rapidly, the antibiotic must be kept constantly at a concentration that prevents bacterial replication. If antibiotics cross bacterial cell envelopes slowly to reach their target, there is a delay in the onset of action that may be reduced by increasing initial antibiotic concentration. Finally, slow drug-target dissociation and slow diffusion out of cells act to prolong antibiotic effects, thereby allowing for less frequent dosing. Our model can be used as a tool in the rational design of treatment for bacterial infections. It is easily adaptable to other biological systems, e.g. HIV, malaria and cancer, where the effects of physiological fluctuations of drug concentration are also poorly understood. PMID:28060813

  6. Using Chemical Reaction Kinetics to Predict Optimal Antibiotic Treatment Strategies.

    Science.gov (United States)

    Abel Zur Wiesch, Pia; Clarelli, Fabrizio; Cohen, Ted

    2017-01-01

    Identifying optimal dosing of antibiotics has proven challenging-some antibiotics are most effective when they are administered periodically at high doses, while others work best when minimizing concentration fluctuations. Mechanistic explanations for why antibiotics differ in their optimal dosing are lacking, limiting our ability to predict optimal therapy and leading to long and costly experiments. We use mathematical models that describe both bacterial growth and intracellular antibiotic-target binding to investigate the effects of fluctuating antibiotic concentrations on individual bacterial cells and bacterial populations. We show that physicochemical parameters, e.g. the rate of drug transmembrane diffusion and the antibiotic-target complex half-life are sufficient to explain which treatment strategy is most effective. If the drug-target complex dissociates rapidly, the antibiotic must be kept constantly at a concentration that prevents bacterial replication. If antibiotics cross bacterial cell envelopes slowly to reach their target, there is a delay in the onset of action that may be reduced by increasing initial antibiotic concentration. Finally, slow drug-target dissociation and slow diffusion out of cells act to prolong antibiotic effects, thereby allowing for less frequent dosing. Our model can be used as a tool in the rational design of treatment for bacterial infections. It is easily adaptable to other biological systems, e.g. HIV, malaria and cancer, where the effects of physiological fluctuations of drug concentration are also poorly understood.
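    A minimal sketch of the kind of drug-target binding model described above, with repeated bolus dosing and two different target-dissociation rates; all parameter values are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Bound target B follows dB/dt = k_on * A(t) * (T_tot - B) - k_off * B, where
# A(t) is the antibiotic concentration from repeated doses with first-order
# elimination. Units and numbers below are made up for illustration.
T_TOT, K_ON, HALF_LIFE = 1.0, 5.0, 2.0        # target pool, binding rate, drug half-life (h)
DOSE_INTERVAL, DOSE = 8.0, 4.0                # one dose every 8 h

def conc(t):
    """Drug concentration from all doses given up to time t."""
    k_el = np.log(2) / HALF_LIFE
    dose_times = np.arange(int(t // DOSE_INTERVAL) + 1) * DOSE_INTERVAL
    return float(np.sum(DOSE * np.exp(-k_el * (t - dose_times))))

def occupancy(k_off):
    rhs = lambda t, b: K_ON * conc(t) * (T_TOT - b) - k_off * b
    sol = solve_ivp(rhs, (0.0, 48.0), [0.0], max_step=0.1)
    late = sol.t >= 24.0                       # ignore the initial transient
    return sol.y[0][late].min(), sol.y[0][-1]

for k_off in (5.0, 0.05):                      # fast vs slow drug-target dissociation
    trough, final = occupancy(k_off)
    print(f"k_off = {k_off:>4}: trough occupancy = {trough:.2f}, occupancy at 48 h = {final:.2f}")
```

    With these toy numbers the slowly dissociating drug keeps its target nearly saturated between doses, while the fast-dissociating one loses most of its occupancy at every trough, mirroring the paper's argument for less frequent dosing when dissociation is slow.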

  7. Optimization of microsatellite DNA Gelred fluorescence imaging ...

    African Journals Online (AJOL)

    user1

    2012-10-11

    Oct 11, 2012 ... In order to explore the best microsatellite DNA Gelred imaging technology, this study .... The cycling parameters were: 4 min at 94°C, followed by 35 cycles of ..... double-strand breaks in mammalian cells. Nucleic Acids Res.

  8. Optimized imaging using non-rigid registration

    Energy Technology Data Exchange (ETDEWEB)

    Berkels, Benjamin, E-mail: berkels@aices.rwth-aachen.de [Interdisciplinary Mathematics Institute, 1523 Greene Street, University of South Carolina, Columbia, SC 29208 (United States); Binev, Peter, E-mail: binev@math.sc.edu [Interdisciplinary Mathematics Institute, 1523 Greene Street, University of South Carolina, Columbia, SC 29208 (United States); Department of Mathematics, 1523 Greene Street, University of South Carolina, Columbia, SC 29208 (United States); Blom, Douglas A., E-mail: doug.blom@sc.edu [NanoCenter, 1212 Greene Street, University of South Carolina, Columbia, SC 29208 (United States); Dahmen, Wolfgang, E-mail: dahmen@igpm.rwth-aachen.de [Interdisciplinary Mathematics Institute, 1523 Greene Street, University of South Carolina, Columbia, SC 29208 (United States); Institut für Geometrie und Praktische Mathematik, RWTH Aachen, Templergraben 55, 52056 Aachen (Germany); Sharpley, Robert C., E-mail: rcsharpley@gmail.com [Interdisciplinary Mathematics Institute, 1523 Greene Street, University of South Carolina, Columbia, SC 29208 (United States); Department of Mathematics, 1523 Greene Street, University of South Carolina, Columbia, SC 29208 (United States); Vogt, Thomas, E-mail: tvogt@mailbox.sc.edu [Interdisciplinary Mathematics Institute, 1523 Greene Street, University of South Carolina, Columbia, SC 29208 (United States); NanoCenter, 1212 Greene Street, University of South Carolina, Columbia, SC 29208 (United States); Department of Chemistry and Biochemistry, 631 Sumter Street, University of South Carolina, Columbia, SC 29208 (United States)

    2014-03-01

    The extraordinary improvements of modern imaging devices offer access to data with unprecedented information content. However, widely used image processing methodologies fall far short of exploiting the full breadth of information offered by numerous types of scanning probe, optical, and electron microscopies. In many applications, it is necessary to keep measurement intensities below a desired threshold. We propose a methodology for extracting an increased level of information by processing a series of data sets suffering, in particular, from high degree of spatial uncertainty caused by complex multiscale motion during the acquisition process. An important role is played by a non-rigid pixel-wise registration method that can cope with low signal-to-noise ratios. This is accompanied by formulating objective quality measures which replace human intervention and visual inspection in the processing chain. Scanning transmission electron microscopy of siliceous zeolite material exhibits the above-mentioned obstructions and therefore serves as orientation and a test of our procedures. - Highlights: • Developed a new process for extracting more information from a series of STEM images. • An objective non-rigid registration process copes with distortions. • Images of zeolite Y show retrieval of all information available from the data set. • Quantitative measures of registration quality were implemented. • Applicable to any serially acquired data, e.g. STM, AFM, STXM, etc.

  9. Comparison of image quality in head CT studies with different dose-reduction strategies

    DEFF Research Database (Denmark)

    Johansen, Jeppe; Nielsen, Rikke; Fink-Jensen, Vibeke;

    The number of multi-detector CT examinations is increasing rapidly. They allow high quality reformatted images providing accurate and precise diagnosis at maximum speed. Brain examinations are the most commonly requested studies, and although they come at a lower effective dose than body CT, can...... account to a considerable radiation dose as many patients undergo repeated studies. Therefore, various dose-reduction strategies are applied such as automated tube current and voltage modulation and recently different iterative reconstruction algorithms. However, the trade-off of all dose......-reduction maneuvers is reduction of image quality due to image noise or artifacts. The aim of our study was therefore to find the best diagnostic images with lowest possible dose. We present results of dose- and image quality optimizing strategies of brain CT examinations at our institution. We compare sequential...

  10. CUDA optimization strategies for compute- and memory-bound neuroimaging algorithms.

    Science.gov (United States)

    Lee, Daren; Dinov, Ivo; Dong, Bin; Gutman, Boris; Yanovsky, Igor; Toga, Arthur W

    2012-06-01

    As neuroimaging algorithms and technology continue to grow faster than CPU performance in complexity and image resolution, data-parallel computing methods will be increasingly important. The high performance, data-parallel architecture of modern graphical processing units (GPUs) can reduce computational times by orders of magnitude. However, its massively threaded architecture introduces challenges when GPU resources are exceeded. This paper presents optimization strategies for compute- and memory-bound algorithms for the CUDA architecture. For compute-bound algorithms, the registers are reduced through variable reuse via shared memory and the data throughput is increased through heavier thread workloads and maximizing the thread configuration for a single thread block per multiprocessor. For memory-bound algorithms, fitting the data into the fast but limited GPU resources is achieved through reorganizing the data into self-contained structures and employing a multi-pass approach. Memory latencies are reduced by selecting memory resources whose cache performance are optimized for the algorithm's access patterns. We demonstrate the strategies on two computationally expensive algorithms and achieve optimized GPU implementations that perform up to 6× faster than unoptimized ones. Compared to CPU implementations, we achieve peak GPU speedups of 129× for the 3D unbiased nonlinear image registration technique and 93× for the non-local means surface denoising algorithm. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  11. Uncooled Micro-Cantilever Infrared Imager Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Panagiotis, Datskos G. [ORNL

    2008-02-05

    We report on the development, fabrication and characterization of microcantilever based uncooled focal plane array (FPA) for infrared imaging. By combining a streamlined design of microcantilever thermal transducers with a highly efficient optical readout, we minimized the fabrication complexity while achieving a competitive level of imaging performance. The microcantilever FPAs were fabricated using a straightforward fabrication process that involved only three photolithographic steps (i.e. three masks). A designed and constructed prototype of an IR imager employed a simple optical readout based on a noncoherent low-power light source. The main figures of merit of the IR imager were found to be comparable to those of uncooled MEMS infrared detectors with a substantially higher degree of fabrication complexity. In particular, the NETD and the response time of the implemented MEMS IR detector were measured to be as low as 0.5 K and 6 ms, respectively. The potential of the implemented designs can also be concluded from the fact that the constructed prototype enabled IR imaging of close to room temperature objects without the use of any advanced data processing. The most unique and practically valuable feature of the implemented FPAs, however, is their scalability to high resolution formats, such as 2000 x 2000, without progressively growing device complexity and cost. The overall technical objective of the proposed work was to develop uncooled infrared arrays based on micromechanical sensors. Currently used miniature sensors use a number of different readout techniques to accomplish the sensing. The use of optical readout techniques for sensing requires the deposition of thin coatings on the surface of micromechanical thermal detectors. Oak Ridge National Laboratory (ORNL) is uniquely qualified to perform the required research and development (R&D) services that will assist our ongoing activities. Over the past decade ORNL has developed a number of unique methods and

  12. Optimal Power Management Strategy for Energy Storage with Stochastic Loads

    Directory of Open Access Journals (Sweden)

    Stefano Pietrosanti

    2016-03-01

    Full Text Available In this paper, a power management strategy (PMS) has been developed for the control of energy storage in a system subjected to loads of random duration. The PMS minimises the costs associated with the energy consumption of specific systems powered by a primary energy source and equipped with energy storage, under the assumption that the statistical distribution of load durations is known. By including the variability of the load in the cost function, it was possible to define the optimality criteria for the power flow of the storage. Numerical calculations have been performed obtaining the control strategies associated with the global minimum in energy costs, for a wide range of initial conditions of the system. The results of the calculations have been tested on a MATLAB/Simulink model of a rubber tyre gantry (RTG) crane equipped with a flywheel energy storage system (FESS) and subjected to a test cycle, which corresponds to the real operation of a crane in the Port of Felixstowe. The results of the model show increased energy savings and reduced peak power demand with respect to existing control strategies, indicating considerable potential savings for port operators in terms of energy and maintenance costs.

  13. OPTIMIZATION OF BIOCIDE STRATEGIES ON FINE PAPER MACHINES

    Directory of Open Access Journals (Sweden)

    Jani Kiuru

    2010-05-01

    Full Text Available In this study a rapid at-line ATP (adenosine triphosphate) analysis is applied in papermaking. This ATP analysis takes less than a minute, and the information can be utilized instantly to adapt the biocide program. The study shows the effect of different biocide strategies at paper mills. Comparison is made between oxidative and reductive biocides on the one hand, and on the other hand between continuous vs. batch additions of biocide. Continuous biocide addition keeps the microbial activity at a constant level. However, a long production period without a boil-out might result in accumulation of resistant bacteria, which cannot be eliminated without changing the biocide strategy. Batch addition of biocide creates a high temporary concentration of biocide in the process. This causes lower temporary microbial activity in the process, but between the doses the microbial activity may rise to an intolerable level. Batch addition causes chemical variation to the wet end of a paper machine more easily than continuous addition. This can affect the performance of papermaking chemicals and cause problems with retention, fixing, etc. Both biocide addition strategies can be used if they are monitored and optimized properly. Rapid ATP analysis is a suitable tool for both purposes.

  14. Scoring Strategies for the Underdog: A general, quantitative method for determining optimal sports strategies

    CERN Document Server

    Skinner, Brian

    2011-01-01

    When facing a heavily-favored opponent, an underdog must be willing to assume greater-than-average risk. In statistical language, one would say that an underdog must be willing to adopt a strategy whose outcome has a larger-than-average variance. The difficult question is how much risk a team should be willing to accept. This is equivalent to asking how much the team should be willing to sacrifice from its mean score in order to increase the score's variance. In this paper a general, analytical method is developed for addressing this question quantitatively. Under the assumption that every play in a game is statistically independent, both the mean and the variance of a team's offensive output can be described using the binomial distribution. This description allows for direct calculations of the winning probability when a particular strategy is employed, and therefore allows one to calculate optimal offensive strategies. This paper develops this method for calculating optimal strategies exactly and then prese...
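    A quick Monte Carlo illustration of the central point, using the binomial model of statistically independent possessions: with invented numbers in which the underdog's two strategies have the same mean score, the higher-variance option wins more often against the stronger favorite.

```python
import numpy as np

rng = np.random.default_rng(0)
N_POSS, N_GAMES = 100, 200000

# Favorite: 100 two-point attempts per game at 53% success (mean 106 points).
favorite = 2 * rng.binomial(N_POSS, 0.53, N_GAMES)

# Underdog option A: two-pointers at 50%  (mean 100, lower variance).
# Underdog option B: three-pointers at 1/3 (mean 100, higher variance).
under_a = 2 * rng.binomial(N_POSS, 0.50, N_GAMES)
under_b = 3 * rng.binomial(N_POSS, 1.0 / 3.0, N_GAMES)

for name, pts in [("low-variance 2-pt strategy ", under_a),
                  ("high-variance 3-pt strategy", under_b)]:
    print(f"{name}: underdog win probability ~ {(pts > favorite).mean():.3f}")
```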

  15. Status Survey and Optimization Strategies of Image Information Retrieval Behavior under the Network Environment

    Institute of Scientific and Technical Information of China (English)

    文洁

    2014-01-01

    Under the network environment, faced with massive image information resources, multiple image retrieval channels, and a dynamic network environment, learners' information retrieval behavior has changed profoundly. University students are currently one of the main groups acquiring image information. Taking university students as an example, this paper investigates the current state of image information retrieval behavior under the network environment and, on this basis, proposes optimization strategies for image information retrieval behavior suited to the practical situation in China, in order to encourage learners to actively use image media, better improve their knowledge of images, participate in social activities, and understand the real world.

  16. Optimal experimental design to position transducers in ultrasound breast imaging

    Science.gov (United States)

    Korta Martiartu, Naiara; Boehm, Christian; Vinard, Nicolas; Jovanović Balic, Ivana; Fichtner, Andreas

    2017-03-01

    We present methods to optimize the setup of a 3D ultrasound tomography scanner for breast cancer detection. This approach provides a systematic and quantitative tool to evaluate different designs and to optimize the configuration with respect to predefined design parameters. We consider both time-of-flight inversion using straight rays and time-domain waveform inversion governed by the acoustic wave equation for imaging the sound speed. In order to compare different designs, we measure their quality by extracting properties from the Hessian operator of the time-of-flight or waveform differences defined in the inverse problem, i.e., the second derivatives with respect to the sound speed. Spatial uncertainties and resolution can be related to the eigenvalues of the Hessian, which provide a good indication of the information contained in the data that is acquired with a given design. However, the complete spectrum is often prohibitively expensive to compute, thus suitable approximations have to be developed and analyzed. We use the trace of the Hessian operator as design criterion, which is equivalent to the sum of all eigenvalues and requires less computational effort. In addition, we suggest taking advantage of the spatial symmetry to extrapolate the 3D experimental design from a set of 2D configurations. In order to maximize the quality criterion, we use a genetic algorithm to explore the space of possible design configurations. Numerical results show that the proposed strategies are capable of improving an initial configuration with uniformly distributed transducers, clustering them around regions with poor illumination and improving the ray coverage of the domain of interest.
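    Because trace(J^T J) equals the squared Frobenius norm of the sensitivity matrix, the trace criterion can be evaluated without forming the Hessian; the toy straight-ray configuration study below (grid size, transducer ring, and the two candidate layouts are all assumptions) compares a uniform ring against transducers clustered on one arc.

```python
import numpy as np

GRID = 20                                      # 20 x 20 slowness pixels on the unit square

def ray_row(p0, p1, n_samples=200):
    """Approximate per-pixel path lengths of the straight ray p0 -> p1 by sampling."""
    pts = p0 + np.linspace(0.0, 1.0, n_samples)[:, None] * (p1 - p0)
    idx = np.clip((pts * GRID).astype(int), 0, GRID - 1)
    row = np.zeros(GRID * GRID)
    seg = np.linalg.norm(p1 - p0) / n_samples
    np.add.at(row, idx[:, 0] * GRID + idx[:, 1], seg)
    return row

def design_quality(angles):
    """trace(J^T J) = ||J||_F^2, where J[r, c] is the length of ray r in pixel c."""
    pos = 0.5 + 0.45 * np.column_stack([np.cos(angles), np.sin(angles)])
    rows = [ray_row(pos[i], pos[j])
            for i in range(len(pos)) for j in range(i + 1, len(pos))]
    J = np.array(rows)
    return float(np.sum(J * J))

uniform_ring = np.linspace(0.0, 2.0 * np.pi, 16, endpoint=False)
clustered_arc = np.linspace(0.0, np.pi / 2.0, 16)   # all transducers on a quarter arc
print("trace criterion, uniform ring :", round(design_quality(uniform_ring), 3))
print("trace criterion, clustered arc:", round(design_quality(clustered_arc), 3))
```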

  17. Optimal strategy for selling on group-buying website

    Directory of Open Access Journals (Sweden)

    Xuan Jiang

    2014-09-01

    Full Text Available Purpose: The purpose of this paper is to help business marketers with offline channels to make decisions on whether to sell through Group-buying (GB) websites and how to set the online price with the coordination of maximum deal size on GB websites. Design/methodology/approach: Considering the deal structure of GB websites, especially the service fee and minimum deal size limit required by GB websites, the advertising effect of selling on GB websites, and the interaction between online and offline markets, an analytical model is built to derive the optimal online price and maximum deal size for sellers selling through a GB website. This paper aims to answer four research questions: (1) How to make a decision on maximum deal size with coordination of the deal price? (2) Will selling on GB websites always be better than staying with the offline channel only? (3) What kind of products is more appropriate to sell on a GB website? (4) How could a GB website operator induce sellers to offer deep discounts in GB deals? Findings and Originality/value: This paper obtains optimal strategies for sellers selling on a GB website and finds that: Even if a seller has sufficient capacity, he/she may still set a maximum deal size on the GB deal to take advantage of the Advertisement with Limited Availability (ALA) effect; Selling through a GB website may not bring a higher profit than selling only through the offline channel when a GB site only has a small consumer base and/or if there is a big overlap between the online and offline markets; Low-margin products are more suitable for being sold online with ALA strategies (LP-ALA or HP-ALA) than high-margin ones; A GB site operator could set a small minimum deal size to induce deep discounts from the sellers selling through GB deals. Research limitations/implications: The present study assumed that the demand function is deterministic and linear. It will be interesting to study how stochastic demand and a more general demand function affect the optimal

  18. An adaptive fusion strategy of polarization image based on NSCT

    Science.gov (United States)

    Zhao, Chang-xia; Duan, Jin; Mo, Chun-he; Chen, Guang-qiu; Fu, Qiang

    2015-03-01

    An improved image fusion algorithm based on the NSCT is proposed in this paper. After NSCT decomposition over multiple scales and directions, the polarization image is decomposed into two parts: a low-frequency sub-band and high-frequency band-pass images. A fusion strategy combining local regional energy and gradient structure similarity was used for the low-frequency coefficients, while for the high-frequency band-pass coefficients a fusion strategy using the local spatial frequency as the correlation coefficient was used. The intensity image and the degree-of-polarization image are fused to improve the sharpness and contrast of the image. The experiments show that the algorithm effectively improves imaging quality in turbid media.
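    A hedged sketch of the low-frequency fusion rule only (choose, at each location, the source whose local regional energy is larger); the gradient-structure-similarity weighting and the actual NSCT decomposition are omitted, and plain random arrays stand in for the low-frequency sub-bands.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def fuse_lowpass(band_a, band_b, window=3):
    """Keep, pixel by pixel, the coefficient with the larger local regional energy."""
    energy_a = uniform_filter(band_a ** 2, size=window)
    energy_b = uniform_filter(band_b ** 2, size=window)
    return np.where(energy_a >= energy_b, band_a, band_b)

rng = np.random.default_rng(0)
intensity_ll = rng.normal(size=(64, 64))   # stand-in for the intensity-image LL band
dolp_ll = rng.normal(size=(64, 64))        # stand-in for the degree-of-polarization LL band
fused_ll = fuse_lowpass(intensity_ll, dolp_ll)
print("fused low-frequency band shape:", fused_ll.shape)
```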

  19. Optimized Discretization Schemes For Brain Images

    Directory of Open Access Journals (Sweden)

    USHA RANI.N,

    2011-02-01

    Full Text Available In medical image processing, the active contour method is an important technique for segmenting human organs. Geometric deformable curves known as level sets are widely used in segmenting medical images. In this modeling, the evolution of the curve is described by the basic Lagrangian PDE expressed as a function of space and time. This PDE can be solved either with continuous functions or with discrete numerical methods. This paper deals with the application of numerical methods such as finite difference and TVD-RK schemes to brain scans. The stability and accuracy of these methods are also discussed. The paper also considers more accurate higher-order non-linear interpolation techniques, such as ENO and WENO, for reconstructing brain scans including CT, MRI, PET and SPECT.
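
    As a minimal illustration of the kind of discretization discussed above, the sketch below evolves a level-set function under curvature flow using central finite differences in space and a two-stage TVD Runge-Kutta (Heun) step in time. It is not the paper's exact scheme; the grid size, time step and initialization are assumptions chosen for illustration.

```python
"""Minimal level-set evolution sketch: curvature flow with central finite
differences in space and a two-stage TVD Runge-Kutta (Heun) step in time.
Illustrative only; not the paper's exact scheme."""
import numpy as np

def curvature_speed(phi, eps=1e-8):
    # dphi/dt = kappa * |grad phi|, with kappa = div(grad phi / |grad phi|)
    gy, gx = np.gradient(phi)
    norm = np.sqrt(gx ** 2 + gy ** 2) + eps
    nxx = np.gradient(gx / norm, axis=1)
    nyy = np.gradient(gy / norm, axis=0)
    return (nxx + nyy) * norm

def tvd_rk2_step(phi, dt):
    k1 = phi + dt * curvature_speed(phi)          # forward Euler predictor
    k2 = k1 + dt * curvature_speed(k1)            # second Euler stage
    return 0.5 * (phi + k2)                       # TVD-RK2 (Heun) combination

# Signed-distance-like initialization: a circle of radius 20 on a 100x100 grid.
y, x = np.mgrid[0:100, 0:100]
phi = np.sqrt((x - 50.0) ** 2 + (y - 50.0) ** 2) - 20.0

for _ in range(100):
    phi = tvd_rk2_step(phi, dt=0.1)

print("zero level set now encloses", int((phi < 0).sum()), "pixels")
```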

  20. Imaging TMS: antidepressant mechanisms and treatment optimization.

    Science.gov (United States)

    Dubin, Marc

    2017-04-01

    With the antidepressant efficacy of Transcranial Magnetic Stimulation well-established by several meta-analyses, there is growing interest in its mechanism of action. TMS has been shown to engage, and in some cases normalize, functional connectivity and neurotransmitter levels within networks dysfunctional in the depressed state. In this review, I will suggest candidate biomarkers, based on neuroimaging, that may be predictive of response to TMS. I will then review the effects of TMS on networks and neurotransmitter systems involved in depression. Throughout, I will also discuss how our current understanding of response prediction and network engagement may be used to personalize treatment and optimize its efficacy.

  1. Image processing to optimize wave energy converters

    Science.gov (United States)

    Bailey, Kyle Marc-Anthony

    The world is turning to renewable energies as a means of ensuring the planet's future and well-being. There have been a few attempts in the past to utilize wave power as a means of generating electricity through the use of Wave Energy Converters (WEC), but only recently are they becoming a focal point in the renewable energy field. Over the past few years there has been a global drive to advance the efficiency of WECs. Wave power is produced by placing a mechanical device, either onshore or offshore, that captures the energy within ocean surface waves. This paper seeks to provide a novel and innovative way to estimate ocean wave frequency through the use of image processing. This will be achieved by applying a complex modulated lapped orthogonal transform filter bank to satellite images of ocean waves. The complex modulated lapped orthogonal transform filter bank provides an equal subband decomposition of the Nyquist-bounded discrete-time Fourier transform spectrum. The maximum energy of the 2D complex modulated lapped transform subband is used to determine the horizontal and vertical frequency, which subsequently can be used to determine the wave frequency in the direction of the WEC by a simple trigonometric scaling. The robustness of the proposed method is demonstrated by application to simulated and real satellite images where the frequency is known.
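
    A simplified sketch of the frequency-estimation idea is given below, using the peak of a plain 2D FFT magnitude spectrum as a stand-in for the complex modulated lapped orthogonal transform filter bank, followed by the trigonometric projection onto the WEC heading. The pixel size, heading angle and synthetic test wave are illustrative assumptions.

```python
"""Estimate the dominant ocean-wave spatial frequency from an image patch.

A plain 2D FFT peak is used here as a simplified stand-in for the complex
modulated lapped orthogonal transform filter bank described above; the final
projection onto the WEC heading is the same trigonometric-scaling idea.
All parameter values are illustrative."""
import numpy as np

def dominant_wave_frequency(patch, pixel_size_m, wec_heading_rad):
    """Return the spatial frequency (cycles/m) of the strongest component
    projected onto the direction the WEC faces."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(patch - patch.mean())))
    fy = np.fft.fftshift(np.fft.fftfreq(patch.shape[0], d=pixel_size_m))
    fx = np.fft.fftshift(np.fft.fftfreq(patch.shape[1], d=pixel_size_m))
    iy, ix = np.unravel_index(np.argmax(spectrum), spectrum.shape)
    # Project the (horizontal, vertical) frequency pair onto the WEC heading.
    return abs(fx[ix] * np.cos(wec_heading_rad) + fy[iy] * np.sin(wec_heading_rad))

# Synthetic "satellite" patch: a plane wave of 0.01 cycles/m travelling at ~30 degrees.
n, d = 256, 10.0                       # 256 x 256 pixels, 10 m per pixel
y, x = np.mgrid[0:n, 0:n] * d
truth = 0.01
patch = np.sin(2 * np.pi * truth * (x * np.cos(0.52) + y * np.sin(0.52)))

print("estimated:", round(dominant_wave_frequency(patch, d, 0.52), 4), "cycles/m; truth:", truth)
```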

  2. Optimal material discrimination using spectral x-ray imaging.

    Science.gov (United States)

    Nik, S J; Meyer, J; Watts, R

    2011-09-21

    Spectral x-ray imaging using novel photon counting x-ray detectors (PCDs) with energy resolving abilities is capable of providing energy-selective images. PCDs have energy thresholds, enabling the classification of photons into multiple energy bins. The extra energy information provided may allow materials such as iodine and calcium, or water and fat to be distinguishable. The information content of spectral x-ray images, however, depends on how the photons are grouped together. In this work, we present a model to optimize energy windows for maximum material discrimination. Multivariate statistics allows the confidence region of the correlated uncertainties to be mapped in the thickness space. Minimization of the uncertainties enables optimization of energy windows. Applications related to small animal imaging and breast imaging are considered.

  3. Proper image subtraction - optimal transient detection, photometry and hypothesis testing

    CERN Document Server

    Zackay, Barak; Gal-Yam, Avishay

    2016-01-01

    Transient detection and flux measurement via image subtraction stand at the base of time domain astronomy. Due to the varying seeing conditions, the image subtraction process is non-trivial, and existing solutions suffer from a variety of problems. Starting from basic statistical principles, we develop the optimal statistic for transient detection, flux measurement and any image-difference hypothesis testing. We derive a closed-form statistic that: (i) Is mathematically proven to be the optimal transient detection statistic in the limit of background-dominated noise; (ii) Is numerically stable; (iii) For accurately registered, adequately sampled images, does not leave subtraction or deconvolution artifacts; (iv) Allows automatic transient detection to the theoretical sensitivity limit by providing credible detection significance; (v) Has uncorrelated white noise; (vi) Is a sufficient statistic for any further statistical test on the difference image, and in particular, allows to distinguish particle hits and ...
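
    For orientation, the sketch below evaluates the Fourier-domain difference statistic in the background-dominated limit, following the closed form published by Zackay, Ofek and Gal-Yam as recalled here, with the flux zero-points set to one. The Gaussian PSFs, noise levels and toy scene are invented stand-ins for properly calibrated inputs.

```python
"""Sketch of the Fourier-domain "proper" difference image D in the
background-dominated limit, following the closed form of Zackay, Ofek &
Gal-Yam (2016) as recalled here (flux zero-points set to 1). The Gaussian
PSFs, noise levels and scene below are toy stand-ins for calibrated inputs."""
import numpy as np

def gaussian_psf(n, sigma):
    y, x = np.mgrid[0:n, 0:n] - n // 2
    psf = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return np.fft.ifftshift(psf / psf.sum())      # centred at pixel (0, 0) for FFT use

def proper_difference(ref, new, psf_ref, psf_new, sigma_ref, sigma_new):
    R, N = np.fft.fft2(ref), np.fft.fft2(new)
    Pr, Pn = np.fft.fft2(psf_ref), np.fft.fft2(psf_new)
    denom = np.sqrt(sigma_new ** 2 * np.abs(Pr) ** 2 + sigma_ref ** 2 * np.abs(Pn) ** 2)
    D_hat = (Pr * N - Pn * R) / denom
    return np.real(np.fft.ifft2(D_hat))

n = 128
rng = np.random.default_rng(2)
psf_r, psf_n = gaussian_psf(n, 1.5), gaussian_psf(n, 2.5)
scene = np.zeros((n, n)); scene[40, 40] = 5000.0           # one constant star
transient = np.zeros((n, n)); transient[90, 90] = 2000.0   # appears only in the new image
ref = np.real(np.fft.ifft2(np.fft.fft2(scene) * np.fft.fft2(psf_r))) + rng.normal(0, 3, (n, n))
new = np.real(np.fft.ifft2(np.fft.fft2(scene + transient) * np.fft.fft2(psf_n))) + rng.normal(0, 3, (n, n))

D = proper_difference(ref, new, psf_r, psf_n, sigma_ref=3.0, sigma_new=3.0)
print("difference peaks near the transient:", np.unravel_index(np.argmax(np.abs(D)), D.shape))
```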

  4. Image Filtering using All Neighbor Directional Weighted Pixels: Optimization using Particle Swarm Optimization

    CERN Document Server

    Mandal, J K

    2012-01-01

    In this paper a novel approach for denoising images corrupted by random-valued impulses is proposed. Noise suppression is done in two steps. The detection of noisy pixels is done using all neighbor directional weighted pixels (ANDWP) in a 5 x 5 window. The filtering scheme is based on the minimum variance of the four directional pixels. In this approach, a relatively recent stochastic global optimization technique, particle swarm optimization (PSO), is also used to search for the parameters of the detection and filtering operators required for optimal performance. The results obtained show better denoising and preservation of fine details for highly corrupted images.
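
    The sketch below shows a minimal particle swarm optimization loop of the kind referred to above, here tuning two illustrative parameters (a detection threshold and a blending weight) of a simple median-based impulse filter against a clean reference. The filter is a toy stand-in for the ANDWP detector and filter, and all parameter ranges and PSO constants are assumptions.

```python
"""Minimal particle swarm optimization (PSO) sketch: tuning the detection
threshold and blending weight of a simple median-based impulse filter by
minimising mean absolute error against a clean reference image. The filter
is a toy stand-in for the ANDWP operators of the paper; all parameter names
and ranges are illustrative."""
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(3)

# Toy data: a smooth image corrupted by random-valued impulse noise.
y, x = np.mgrid[0:64, 0:64]
clean = np.sin(x / 8.0) + np.cos(y / 11.0)
noisy = clean.copy()
mask = rng.random(clean.shape) < 0.15
noisy[mask] = rng.uniform(clean.min(), clean.max(), mask.sum())

def denoise(img, threshold, blend):
    med = median_filter(img, size=3)
    suspect = np.abs(img - med) > threshold          # crude impulse detector
    out = img.copy()
    out[suspect] = blend * med[suspect] + (1 - blend) * img[suspect]
    return out

def cost(params):
    threshold, blend = params
    return np.mean(np.abs(denoise(noisy, threshold, np.clip(blend, 0, 1)) - clean))

# Standard PSO with inertia and cognitive/social terms.
n_particles, iters = 20, 40
pos = rng.uniform([0.0, 0.0], [2.0, 1.0], (n_particles, 2))
vel = np.zeros_like(pos)
pbest, pbest_cost = pos.copy(), np.array([cost(p) for p in pos])
gbest = pbest[np.argmin(pbest_cost)]

for _ in range(iters):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
    pos = pos + vel
    costs = np.array([cost(p) for p in pos])
    improved = costs < pbest_cost
    pbest[improved], pbest_cost[improved] = pos[improved], costs[improved]
    gbest = pbest[np.argmin(pbest_cost)]

print("best (threshold, blend):", np.round(gbest, 3), "MAE:", round(cost(gbest), 4))
```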

  5. Strategies for optimizing algal biology for enhanced biomass production

    Directory of Open Access Journals (Sweden)

    Amanda N. Barry

    2015-02-01

    Full Text Available One of the more environmentally sustainable ways to produce high energy density (oils) feed stocks for the production of liquid transportation fuels is from biomass. Photosynthetic carbon capture combined with biomass combustion (point source) and subsequent carbon capture and sequestration (BECCS) has also been proposed in the Intergovernmental Panel on Climate Change Report as one of the most effective and economical strategies to remediate atmospheric greenhouse gases. To maximize photosynthetic carbon capture efficiency and energy-return-on-investment, we must develop biomass production systems that achieve the greatest yields with the lowest inputs. Numerous studies have demonstrated that microalgae have among the greatest potentials for biomass production. This is in part due to the fact that all algal cells are photoautotrophic, they have active carbon concentrating mechanisms to increase photosynthetic productivity, and all the biomass is harvestable, unlike plants. All photosynthetic organisms, however, convert only a fraction of the solar energy they capture into chemical energy (reduced carbon or biomass). To increase areal carbon capture rates and biomass productivity it will be necessary to identify the most robust algal strains and increase their biomass production efficiency, often by genetic manipulation. We review recent large-scale efforts to identify the best biomass producing strains and metabolic engineering strategies to improve areal productivity. These strategies include optimization of photosynthetic light-harvesting antenna size to increase energy capture and conversion efficiency and the potential development of advanced molecular breeding techniques. To date, these strategies have resulted in up to two-fold increases in biomass productivity.

  6. Orientation Strategies for Aerial Oblique Images

    Science.gov (United States)

    Wiedemann, A.; Moré, J.

    2012-07-01

    Oblique aerial images are becoming more and more widely distributed to fill the gap between vertical aerial images and mobile mapping systems. Different systems are on the market. For some applications, like texture mapping, precise orientation data are required. One requirement is a stable interior orientation, which can be achieved by stable camera systems; the other is a precise exterior orientation. A sufficient exterior orientation can be achieved with a large effort in direct sensor orientation, although minor errors in the angles have a larger effect than in vertical imagery. The more appropriate approach is to determine the precise orientation parameters by photogrammetric methods using an adapted aerial triangulation. Due to the different points of view towards the object, traditional aerotriangulation matching tools fail, as they produce many blunders and require a lot of manual work to achieve a sufficient solution. In this paper some approaches are discussed and results are presented for the most promising ones. We describe a single-step approach with an aerotriangulation using all available images; a two-step approach with an aerotriangulation of only the vertical images plus a mathematical transformation of the oblique images using the oblique cameras' eccentricity; and finally an extended functional model for a bundle block adjustment considering the mechanical connection between vertical and oblique images. Besides accuracy, other aspects such as efficiency and the required manual work also have to be considered.

  7. Bound Alternative Direction Optimization for Image Deblurring

    Directory of Open Access Journals (Sweden)

    Xiangrong Zeng

    2014-01-01

    the ℓp regularizer by a novel majorizer and then, based on a variable splitting, to reformulate the bound unconstrained problem into a constrained one, which is then addressed via an augmented Lagrangian method. The proposed algorithm actually combines the reweighted ℓ1 minimization method and the alternating direction method of multipliers (ADMM), such that it succeeds in extending the application of ADMM to ℓp minimization problems. The conducted experimental studies demonstrate the superiority of the proposed algorithm for the synthesis ℓp minimization over the state-of-the-art algorithms for the synthesis ℓ1 minimization on image deblurring.
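
    Because the record is truncated, the sketch below only illustrates the generic variable-splitting/ADMM machinery on the simpler ℓ1 deblurring problem with a periodic, FFT-diagonalized blur; it is not the paper's majorization-based ℓp algorithm, and the regularization weight, penalty parameter and toy sparse scene are assumptions.

```python
"""Generic ADMM sketch for l1-regularised deconvolution with a periodic
blur, min_x 0.5*||h*x - y||^2 + lam*||x||_1. It illustrates the variable
splitting / augmented-Lagrangian machinery the abstract refers to; it is not
the paper's majorisation-based lp algorithm. Parameters are illustrative."""
import numpy as np

def fft_blur_kernel(n, sigma=2.0):
    y, x = np.mgrid[0:n, 0:n] - n // 2
    h = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    return np.fft.fft2(np.fft.ifftshift(h / h.sum()))

def soft_threshold(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_l1_deblur(y_img, H, lam=0.02, rho=1.0, iters=200):
    Y = np.fft.fft2(y_img)
    z = np.zeros_like(y_img)
    u = np.zeros_like(y_img)
    for _ in range(iters):
        # x-update: quadratic subproblem, diagonal in the Fourier domain.
        rhs = np.conj(H) * Y + rho * np.fft.fft2(z - u)
        x = np.real(np.fft.ifft2(rhs / (np.abs(H) ** 2 + rho)))
        # z-update: proximal step of the l1 term (soft thresholding).
        z = soft_threshold(x + u, lam / rho)
        # dual update.
        u = u + x - z
    return z

n = 64
rng = np.random.default_rng(4)
truth = np.zeros((n, n))
truth[rng.integers(0, n, 20), rng.integers(0, n, 20)] = rng.uniform(1, 3, 20)  # sparse scene
H = fft_blur_kernel(n)
blurred = np.real(np.fft.ifft2(H * np.fft.fft2(truth))) + rng.normal(0, 0.01, (n, n))

recovered = admm_l1_deblur(blurred, H)
print("nonzeros in truth:", int((truth > 0).sum()),
      "| recovered support:", int((np.abs(recovered) > 0.1).sum()))
```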

  8. Optimizing urology group partnerships: collaboration strategies and compensation best practices.

    Science.gov (United States)

    Jacoby, Dana L; Maller, Bruce S; Peltier, Lisa R

    2014-10-01

    Market forces in health care have created substantial regulatory, legislative, and reimbursement changes that have had a significant impact on urology group practices. To maintain viability, many urology groups have merged into larger integrated entities. Although group operations vary considerably, the majority of groups have struggled with the development of a strong culture, effective decision-making, and consensus-building around shared resources, income, and expense. Creating a sustainable business model requires urology group leaders to allocate appropriate time and resources to address these issues in a proactive manner. This article outlines collaboration strategies for creating an effective culture, governance, and leadership, and provides practical suggestions for optimizing the performance of the urology group practice.

  9. An Overview of Optimizing Strategies for Flotation Banks

    Directory of Open Access Journals (Sweden)

    Miguel Maldonado

    2012-10-01

    Full Text Available A flotation bank is a serial arrangement of cells. How to optimally operate a bank remains a challenge. This article reviews three reported strategies: air profiling, mass-pull (froth velocity) profiling and Peak Air Recovery (PAR) profiling. These are all ways of manipulating the recovery profile down a bank, which may be the property being exploited. Mathematical analysis has shown that a flat cell-by-cell recovery profile maximizes the separation of two floatable minerals for a given target bank recovery when the relative floatability is constant down the bank. Available bank survey data are analyzed with respect to recovery profiling. Possible variations on recovery profile to minimize entrainment are discussed.

  10. Optimal retinal cyst segmentation from OCT images

    Science.gov (United States)

    Oguz, Ipek; Zhang, Li; Abramoff, Michael D.; Sonka, Milan

    2016-03-01

    Accurate and reproducible segmentation of cysts and fluid-filled regions from retinal OCT images is an important step allowing quantification of the disease status, longitudinal disease progression, and response to therapy in wet-pathology retinal diseases. However, segmentation of fluid-filled regions from OCT images is a challenging task due to their inhomogeneous appearance, the unpredictability of their number, size and location, as well as the intensity profile similarity between such regions and certain healthy tissue types. While machine learning techniques can be beneficial for this task, they require large training datasets and are often over-fitted to the appearance models of specific scanner vendors. We propose a knowledge-based approach that leverages a carefully designed cost function and graph-based segmentation techniques to provide a vendor-independent solution to this problem. We illustrate the results of this approach on two publicly available datasets with a variety of scanner vendors and retinal disease status. Compared to a previous machine-learning based approach, the volume similarity error was dramatically reduced from 81.3±56.4% to 22.2±21.3% (paired t-test, p << 0.001).

  11. Optimization of pediatric chest radiographic images using optical densities ratio

    Energy Technology Data Exchange (ETDEWEB)

    Souza, Rafael T.F.; Miranda, Jose R.A. [Universidade Estadual Paulista Julio de Mesquita Filho (UNESP), Botucatu, SP (Brazil). Inst. de Biociencias de Botucatu; Pina, Diana R. [Faculdade de Medicina de Botucatu (UNESP), Botucatu, SP (Brazil). Hospital das Clinicas. Dept. de Doencas Tropicais e Diagnostico por Imagem; Duarte, Sergio B. [Centro Brasileiro de Pesquisas Fisicas (CBPF/MCT), Rio de Janeiro, RJ (Brazil)

    2011-07-01

    The aim of this study is the optimization of radiographic images for pediatric patients aged between 0 and 1 year, through the Optical Density Ratio (ODR), considering that pediatric patients are overexposed to radiation in the repeated attempts to obtain radiographic images considered of good quality. The optimization of radiographic techniques was carried out with the RAP-PEPP (Realistic Analytical Phantom coupled to homogeneous Phantom Equivalent to Pediatric Patient) phantom in two incubators and one cradle. The data show that clinical routine radiographic techniques produce images of up to 18.8% lower quality, as evaluated by the ODRs, and doses up to 60% higher than those of the optimized techniques. (author)

  12. Tree-Based Visualization and Optimization for Image Collection.

    Science.gov (United States)

    Han, Xintong; Zhang, Chongyang; Lin, Weiyao; Xu, Mingliang; Sheng, Bin; Mei, Tao

    2016-06-01

    The visualization of an image collection is the process of displaying a collection of images on a screen under some specific layout requirements. This paper focuses on an important problem that is not well addressed by the previous methods: visualizing image collections into arbitrary layout shapes while arranging images according to user-defined semantic or visual correlations (e.g., color or object category). To this end, we first propose a property-based tree construction scheme to organize images of a collection into a tree structure according to user-defined properties. In this way, images can be adaptively placed with the desired semantic or visual correlations in the final visualization layout. Then, we design a two-step visualization optimization scheme to further optimize image layouts. As a result, multiple layout effects including layout shape and image overlap ratio can be effectively controlled to guarantee a satisfactory visualization. Finally, we also propose a tree-transfer scheme such that visualization layouts can be adaptively changed when users select different "images of interest." We demonstrate the effectiveness of our proposed approach through the comparisons with state-of-the-art visualization techniques.

  13. Optimization of wavelet decomposition for image compression and feature preservation.

    Science.gov (United States)

    Lo, Shih-Chung B; Li, Huai; Freedman, Matthew T

    2003-09-01

    A neural-network-based framework has been developed to search for an optimal wavelet kernel that can be used for a specific image processing task. In this paper, a linear convolution neural network was employed to seek a wavelet that minimizes errors and maximizes compression efficiency for an image or a defined image pattern such as microcalcifications in mammograms and bone in computed tomography (CT) head images. We have used this method to evaluate the performance of tap-4 wavelets on mammograms, CTs, magnetic resonance images, and Lena images. We found that the Daubechies wavelet or those wavelets with similar filtering characteristics can produce the highest compression efficiency with the smallest mean-square-error for many image patterns including general image textures as well as microcalcifications in digital mammograms. However, the Haar wavelet produces the best results on sharp edges and low-noise smooth areas. We also found that a special wavelet whose low-pass filter coefficients are (0.32252136, 0.85258927, 1.38458542, and -0.14548269) produces the best preservation outcomes in all tested microcalcification features including the peak signal-to-noise ratio, the contrast and the figure of merit in the wavelet lossy compression scheme. Having analyzed the spectrum of the wavelet filters, we can find the compression outcomes and feature preservation characteristics as a function of wavelets. This newly developed optimization approach can be generalized to other image analysis applications where a wavelet decomposition is employed.
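
    A small experiment in the same spirit can be run with PyWavelets: decompose a toy image, hard-threshold the detail coefficients, reconstruct, and report the PSNR for different kernels. The 'db2' and 'haar' kernels stand in for the tap-4 wavelets compared above; the paper's special low-pass kernel and the clinical test images are not reproduced here, and the toy image is an assumption.

```python
"""Wavelet-compression comparison sketch using PyWavelets: decompose a toy
image, hard-threshold the detail coefficients, reconstruct, and report PSNR
for a fixed fraction of coefficients kept. 'db2' and 'haar' stand in for the
tap-4 kernels discussed above; the paper's special kernel is not re-created."""
import numpy as np
import pywt

def compress_psnr(img, wavelet, keep=0.05, level=3):
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    details = np.concatenate([np.abs(d).ravel() for band in coeffs[1:] for d in band])
    thr = np.quantile(details, 1.0 - keep)          # keep the largest `keep` fraction
    new_coeffs = [coeffs[0]] + [
        tuple(np.where(np.abs(d) >= thr, d, 0.0) for d in band) for band in coeffs[1:]
    ]
    rec = pywt.waverec2(new_coeffs, wavelet)[: img.shape[0], : img.shape[1]]
    mse = np.mean((rec - img) ** 2)
    peak = img.max() - img.min()
    return 10 * np.log10(peak ** 2 / mse) if mse > 0 else np.inf

y, x = np.mgrid[0:128, 0:128]
img = np.sin(x / 9.0) * np.cos(y / 7.0) + 0.02 * np.random.default_rng(5).random((128, 128))

for w in ("db2", "haar"):
    print(f"{w}: PSNR with 5% of detail coefficients kept = {compress_psnr(img, w):.2f} dB")
```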

  14. Malignant tumours of the kidney: imaging strategy

    Energy Technology Data Exchange (ETDEWEB)

    Smets, Anne M. [Academic Medical Center, Department of Radiology G1, Amsterdam (Netherlands); Kraker, Jan de [Paediatric Oncology-Academic Medical Center, Amsterdam (Netherlands)

    2010-06-15

    Primary malignant renal tumours comprise 6% of all childhood cancers. Wilms tumour (WT) or nephroblastoma is the most frequent type, accounting for more than 90%. Imaging alone cannot differentiate between these tumours with certainty but it plays an important role in screening, diagnostic workup, assessment of therapy response, preoperative evaluation and follow-up. The outcome of WT after therapy is excellent with an overall survival around 90%. In tumours with such a good outcome, the focus can be shifted to a risk-based stratification to maintain excellent outcome in children with low-risk tumours while improving quality of life and decreasing toxicity and costs. This review will discuss the imaging issues for WT from the European perspective and briefly discuss the characteristics of other malignant renal tumours occurring in children and new imaging techniques with potential in this matter. (orig.)

  15. Optimizing clinical environments for knowledge translation: strategies for nursing leaders.

    Science.gov (United States)

    Scott, Shannon D; VandenBeld, Brenda; Cummings, Greta G

    2011-10-01

    Using findings from our recent study that found that a context of uncertainty in the work environment hindered nurses' research utilization, we suggest strategies for nurse managers and leaders to optimize clinical environments and support efforts to put research into clinical practice (knowledge translation). Two important sources of uncertainty were the complexity of teamwork and inconsistency in management and leadership styles. To reduce the uncertainty arising from teamwork, we propose (a) clarifying nurses' scopes of practice, (b) increasing knowledge sharing through supporting journal clubs and enhanced computer access and (c) creating safe venues for multidisciplinary dialogue. To reduce uncertainty arising from variations in management and leadership, we propose (a) developing policies that enhance the consistency of leadership and clarify the strategic direction of the management team, (b) clearly communicating those policies to nurses and (c) providing explicit rationales for treatment changes. Small, incremental steps can be taken to realize substantive changes in clinical environments in order to optimize nursing work environments for knowledge translation.

  16. [Optimal allocation of irrigation water resources based on systematical strategy].

    Science.gov (United States)

    Cheng, Shuai; Zhang, Shu-qing

    2015-01-01

    With the development of society and the economy, as well as the rapid increase in population, more and more water is needed by humans, which has intensified the shortage of water resources. The scarcity of water resources and the growing competition for water among different water-use sectors reduce the water available for irrigation, so it is important to plan and manage irrigation water resources scientifically and reasonably to improve water use efficiency (WUE) and ensure food security. Many investigations indicate that WUE can be increased by optimizing water use. However, present studies have focused primarily on a particular aspect or scale and lack a systematic analysis of the irrigation water allocation problem. By summarizing previous related studies, especially those based on intelligent algorithms, this article proposes a multi-level, multi-scale framework for allocating irrigation water and illustrates the basic theory of each component of the framework. A systematic strategy for optimal irrigation water allocation can not only control the total volume of irrigation water on the time scale, but also reduce water loss on the spatial scale. It can provide a scientific basis and technical support for improving irrigation water management and ensuring food security.

  17. Optimal Order Strategy in Uncertain Demands with Free Shipping Option

    Directory of Open Access Journals (Sweden)

    Qing-Chun Meng

    2014-01-01

    Full Text Available Free shipping with conditions has become one of the most effective marketing tools; more and more companies, especially e-business companies, prefer to offer free shipping to buyers whenever their orders exceed a minimum quantity they specify. In practice, however, the demands of buyers are uncertain, affected by weather, season, and many other factors. First, we model the centralized ordering problem of retailers who face stochastic demands when suppliers offer free shipping, in which only limited distributional information, such as the mean, support, and some deviation measures of the random data, is needed. Then, based on the linear decision rule commonly used in stochastic programming, we analyze the optimal order strategies of retailers and discuss the approximate solution. Further, we present the core allocation among all retailers via duality and cooperative game theory. The existence of the core shows that each retailer is willing to cooperate with the others in the centralized problem. Finally, a numerical example is used to discuss how uncertain data and parameters affect the optimal solution.

  18. Evolving strategies for optimal care management and plan benefit designs.

    Science.gov (United States)

    Cruickshank, John M

    2012-11-01

    As a prevalent, complex disease, diabetes presents a challenge to managed care. Strategies to optimize type 2 diabetes care management and treatment outcomes have been evolving over the past several years. Novel economic incentive programs (eg, those outlined in the Patient Protection and Affordable Care Act of 2010 that tie revenue from Medicare Advantage plans to the quality of healthcare delivered) are being implemented, as are evidence-based interventions designed to optimize treatment, reduce clinical complications, and lower the total financial burden of the disease. Another step that can improve outcomes is to align managed care diabetes treatment algorithms with national treatment guidelines. In addition, designing the pharmacy benefit to emphasize the overall value of treatment and minimize out-of-pocket expenses for patients can be an effective approach to reducing prescription abandonment. The implementation of emerging models of care that encourage collaboration between providers, support lifestyle changes, and engage patients to become partners in their own treatment also appears to be effective.

  19. Superiorization of incremental optimization algorithms for statistical tomographic image reconstruction

    Science.gov (United States)

    Helou, E. S.; Zibetti, M. V. W.; Miqueles, E. X.

    2017-04-01

    We propose the superiorization of incremental algorithms for tomographic image reconstruction. The resulting methods follow a better path on their way to the optimal solution of the maximum likelihood problem, in the sense that they are closer to the Pareto optimal curve than the non-superiorized techniques. A new scaled gradient iteration is proposed and three superiorization schemes are evaluated. Theoretical analysis of the methods as well as computational experiments with both synthetic and real data are provided.

  20. OPTIMIZATION OF DIAGNOSTIC IMAGING IN BREAST CANCER

    Directory of Open Access Journals (Sweden)

    S. A. Velichko

    2015-01-01

    Full Text Available The paper presents the results of breast imaging for 47200 women. Breast cancer was detected in 862 (1.9%) patients, fibroadenoma in 1267 (2.7%) patients and isolated breast cysts in 1162 (2.4%) patients. Different types of fibrocystic breast disease (adenosis, diffuse fibrocystic changes, local fibrosis and others) were observed in 60.1% of women. Problems of breast cancer visualization during mammography, characterized by the appearance of fibrocystic mastopathy (sclerosing adenosis, fibrous bands along the ducts), have been analyzed. Data on the development of diagnostic algorithms including the modern techniques for ultrasound and interventional radiology aimed at detecting early breast cancer have been presented.

  1. Sparse synthetic aperture radar imaging with optimized azimuthal aperture

    Institute of Scientific and Technical Information of China (English)

    ZENG Cao; WANG MinHang; LIAO GuiSheng; ZHU ShengQi

    2012-01-01

    To counter the problem of acquiring and processing huge amounts of data for synthetic aperture radar (SAR) using traditional sampling techniques, a method for sparse SAR imaging with an optimized azimuthal aperture is presented. The equivalence of an azimuthal match filter and synthetic array beamforming is shown so that optimization of the azimuthal sparse aperture can be converted to optimization of synthetic array beamforming. The azimuthal sparse aperture, which is composed of a middle aperture and symmetrical bilateral apertures, can be obtained by optimization algorithms (density weighting and simulated annealing algorithms, respectively). Furthermore, sparse imaging of spectrum analysis SAR based on the optimized sparse aperture is achieved by padding zeros at null samplings and using a non-uniform Taylor window. Compared with traditional sampling, this method has the advantages of reducing the amount of sampling and alleviating the computational burden with acceptable image quality. Unlike periodic sparse sampling, the proposed method exhibits no image ghosts. The results obtained from airborne measurements demonstrate the effectiveness and superiority of the proposed method.
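
    The following toy sketch shows the simulated-annealing ingredient in isolation: it selects a sparse subset of azimuthal samples so that the aperture response (the FFT of the binary sampling mask) has a low peak sidelobe level. The middle-plus-symmetric-bilateral structure of the paper's aperture is not enforced, and every parameter is an illustrative assumption.

```python
"""Toy simulated-annealing selection of a sparse azimuthal aperture: choose
K out of N uniform sampling positions so that the aperture response (FFT of
the binary sampling mask) has a low peak sidelobe level. A simplified
stand-in for the density-weighted / simulated-annealing designs mentioned
above; all parameters are illustrative."""
import numpy as np

rng = np.random.default_rng(6)
N, K, PAD, MAINLOBE_BINS = 128, 48, 8, 6

def peak_sidelobe_db(mask):
    resp = np.abs(np.fft.fft(mask, n=PAD * N))
    resp = np.fft.fftshift(resp) / resp.max()
    centre = len(resp) // 2
    resp[centre - MAINLOBE_BINS * PAD: centre + MAINLOBE_BINS * PAD] = 0.0  # blank main lobe
    return 20 * np.log10(resp.max())

mask = np.zeros(N)
mask[rng.choice(N, K, replace=False)] = 1.0
best, best_cost = mask.copy(), peak_sidelobe_db(mask)
cost, T = best_cost, 3.0

for step in range(4000):
    trial = mask.copy()
    on, off = rng.choice(np.flatnonzero(trial == 1)), rng.choice(np.flatnonzero(trial == 0))
    trial[on], trial[off] = 0.0, 1.0                    # move one sample to a new position
    trial_cost = peak_sidelobe_db(trial)
    if trial_cost < cost or rng.random() < np.exp((cost - trial_cost) / T):
        mask, cost = trial, trial_cost
        if cost < best_cost:
            best, best_cost = mask.copy(), cost
    T *= 0.999                                           # geometric cooling

print(f"peak sidelobe level after annealing: {best_cost:.1f} dB with {K}/{N} samples")
```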

  2. Optimizing the HLT Buffer Strategy with Monte Carlo Simulations

    CERN Document Server

    AUTHOR|(CDS)2266763

    2017-01-01

    This project aims to optimize the strategy of utilizing the disk buffer for the High Level Trigger (HLT) of the LHCb experiment with the help of Monte-Carlo simulations. A method is developed, which simulates the Event Filter Farm (EFF) -- a computing cluster for the High Level Trigger -- as a compound of nodes with different performance properties. In this way, the behavior of the computing farm can be analyzed at a deeper level than before. It is demonstrated that the current operating strategy might be improved when data taking is reaching a mid-year scheduled stop or the year-end technical stop. The processing time of the buffered data can be lowered by distributing the detector data according to the processing power of the nodes instead of the relative disk size as long as the occupancy level of the buffer is low enough. Moreover, this ensures that data taken and stored on the buffer at the same time is processed by different nodes nearly simultaneously, which reduces load on the infrastructure.
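
    A stripped-down illustration of the comparison made above is sketched below: events arriving during a fill are routed to heterogeneous nodes either in proportion to disk size or in proportion to processing power, and the time needed to drain the resulting backlog is compared. Node counts, rates and run length are invented, and buffer-overflow effects are ignored.

```python
"""Toy sketch of an HLT-style disk buffer with randomly drawn heterogeneous
nodes: compare the drain time when events are distributed in proportion to
disk size versus in proportion to processing power. All numbers are invented
for illustration, and buffer-overflow effects are not modelled."""
import numpy as np

rng = np.random.default_rng(7)
n_nodes = 200
power = rng.choice([1.0, 1.5, 2.0], n_nodes)          # relative processing capacity per node
disk = rng.choice([1.0, 2.0], n_nodes)                # relative buffer (disk) size per node

def drain_time(weights, taking_hours=12.0, total_rate=50_000.0):
    """Hours needed to clear what accumulates while the input outpaces processing."""
    share = weights / weights.sum()
    inflow = total_rate * share                        # events/s routed to each node
    backlog = np.maximum(inflow - power * 100.0, 0.0) * taking_hours * 3600
    return (backlog / (power * 100.0 * 3600)).max()    # the slowest node dominates

print("distribute by disk size :", round(drain_time(disk), 2), "h")
print("distribute by CPU power :", round(drain_time(power), 2), "h")
```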

  3. Optimizing Reinjection Strategy at Palinpinon, Philippines Based on Chloride Data

    Energy Technology Data Exchange (ETDEWEB)

    Urbino, Ma. Elena G.; Horne, Roland N.

    1992-03-24

    One of the guidelines established for the safe and efficient management of the Palinpinon Geothermal Field is to adopt a production and well utilization strategy such that the rapid rate and magnitude of reinjection fluid returns leading to premature thermal breakthrough would be minimized. To help achieve this goal, sodium fluorescein and radioactive tracer tests have been conducted to determine the rate and extent of communication between the reinjection and producing sectors of the field. The first objective of this paper is to show how the results of these tests, together with information on field geometry and operating conditions were used in algorithms developed in Operations Research to allocate production and reinjection rates among the different Palinpinon wells. Due to operational and economic constraints, such tracer tests were very limited in number and scope. This prevents obtaining information on the explicit interaction between each reinjection well and the producing wells. Hence, the chloride value of the producing well, was tested to determine if use of this parameter would enable identifying fast reinjection paths among different production/reinjection well pairs. The second aim, therefore, of this paper is to show the different methods of using the chloride data of the producing wells and the injection flow rates of the reinjection wells to provide a ranking of the pair of wells and, thereby, optimize the reinjection strategy of the field.

  4. Optimal Pharmacologic Treatment Strategies in Obesity and Type 2 Diabetes

    Directory of Open Access Journals (Sweden)

    Gayotri Goswami

    2014-06-01

    Full Text Available The prevalence of obesity has increased to pandemic levels worldwide and is related to increased risk of morbidity and mortality. Metabolic comorbidities are commonly associated with obesity and include metabolic syndrome, pre-diabetes, and type 2 diabetes. Even if the prevalence of obesity remains stable until 2030, the anticipated numbers of people with diabetes will more than double as a consequence of population aging and urbanization. Weight reduction is integral in the prevention of diabetes among obese adults with pre-diabetes. Lifestyle intervention and weight reduction are also key in the management of type 2 diabetes. Weight loss is challenging for most obese patients, but for those with diabetes, it can pose an even greater challenge due to the weight gain associated with many treatment regimens. This article will review optimal treatment strategies for patients with comorbid obesity and type 2 diabetes. The role of anti-obesity agents in diabetes will also be reviewed. This literature review will provide readers with current strategies for the pharmacologic treatment of obesity and diabetes with a focus on the weight outcomes related to diabetes treatments.

  5. Multiexposure imaging and parameter optimization for intensified star trackers.

    Science.gov (United States)

    Yu, Wenbo; Jiang, Jie; Zhang, Guangjun

    2016-12-20

    Due to the introduction of the intensified image detector, the dynamic performance of the intensified star tracker is effectively improved. However, its attitude update rate is still seriously restricted by the transmission and processing of pixel data. In order to break through the above limitation, a multiexposure imaging approach for intensified star trackers is proposed in this paper. One star image formed by this approach actually records N different groups of star positions, and then N corresponding groups of attitude information can be acquired. Compared with the existing exposure imaging approach, the proposed approach improves the attitude update rate by N times. Furthermore, for a dim star, the proposed approach can also accumulate the energy of its N positions and then effectively improve its signal-to-noise ratio. Subsequently, in order to obtain the optimal performance of the proposed approach, parameter optimization is carried out. First, the motion model of the star spot in the image plane is established, and then based on it, all the key parameters are optimized. Simulations and experiments demonstrate the feasibility and effectiveness of the proposed approach and parameter optimization.

  6. Optimization of a Biometric System Based on Acoustic Images

    Science.gov (United States)

    Izquierdo Fuente, Alberto; Del Val Puente, Lara; Villacorta Calvo, Juan J.; Raboso Mateos, Mariano

    2014-01-01

    On the basis of an acoustic biometric system that captures 16 acoustic images of a person for 4 frequencies and 4 positions, a study was carried out to improve the performance of the system. In a first stage, an analysis to determine which images provide more information to the system was carried out, showing that a set of 12 images allows the system to obtain results that are equivalent to using all of the 16 images. Finally, optimization techniques were used to obtain the set of weights associated with each acoustic image that maximizes the performance of the biometric system. These results significantly improve the performance of the preliminary system, while reducing the time of acquisition and computational burden, since the number of acoustic images was reduced. PMID:24616643

  7. Optimization of a Biometric System Based on Acoustic Images

    Directory of Open Access Journals (Sweden)

    Alberto Izquierdo Fuente

    2014-01-01

    Full Text Available On the basis of an acoustic biometric system that captures 16 acoustic images of a person for 4 frequencies and 4 positions, a study was carried out to improve the performance of the system. In a first stage, an analysis to determine which images provide more information to the system was carried out, showing that a set of 12 images allows the system to obtain results that are equivalent to using all of the 16 images. Finally, optimization techniques were used to obtain the set of weights associated with each acoustic image that maximizes the performance of the biometric system. These results significantly improve the performance of the preliminary system, while reducing the time of acquisition and computational burden, since the number of acoustic images was reduced.

  8. Optimizing Imaging Instruments for Emission Mammography

    Science.gov (United States)

    Weinberg, Irving N.

    1996-05-01

    Clinical studies have demonstrated that radiotracer methods can noninvasively detect breast cancers in vivo(L.P. Adler, J.P.Crowe, N.K. Al-Kaisis, et al, Radiology 187,743-750 (1993)) (I. Khalkhali, I. Mena, E. Jouanne, et al, J. Am. Coll. Surg. 178, 491-497 (1994)). Due to spatial resolution and count efficiency considerations, users of conventional nuclear medicine instruments have had difficulty in detecting subcentimeter cancers. This limitation is unfortunate, since cancer therapy is generally most efficacious when tumor diameter at detection is less than a centimeter. A more subtle limitation of conventional nuclear medicine imaging instruments is that they are poorly suited to guiding interventions. With the assistance of C.J. Thompson from McGill University, and the CEBAF Detector Physics Group, we have explored the possibility of configuring detectors for nuclear medicine imaging devices into geometries that resemble conventional x-ray mammography cameras(I.N. Weinberg, U.S.Patent 5,252,830 (1993)). Phantom and pilot clinical studies suggest that applying breast compression within such geometries may offer several advantages(C.J. Thompson, K. Murthy, I.N. Weinberg, et al, Med. Physics 21, 259-538 (1994)): For coincident detection of positron emitters, efficiency and spatial resolution are improved by bringing the detectors very close to the source (the breast tumor). For single-photon detection, attenuation due to overlying tissue is reduced. Since, for a high-efficiency collimator, spatial resolution worsens with increasing source to collimator distance, adoption of compression allows more efficient collimators to be employed. Economics are favorable in that detectors can be deployed in the region of interest, rather than around the entire body, and that such detectors can be mounted in conventional mammographic gantries. The application of conventional mammographic geometry promises to assist physicians in conducting radiotracer-guided biopsies, and in

  9. Split Bregman's optimization method for image construction in compressive sensing

    Science.gov (United States)

    Skinner, D.; Foo, S.; Meyer-Bäse, A.

    2014-05-01

    The theory of compressive sampling (CS) was reintroduced by Candes, Romberg and Tao, and D. Donoho in 2006. Using a priori knowledge that a signal is sparse, it has been mathematically proven that CS can defy the Nyquist sampling theorem. Theoretically, reconstruction of a CS image relies on minimization and optimization techniques to solve this complex, almost NP-complete problem. There are many paths to consider when compressing and reconstructing an image, but these methods have remained untested and unclear on natural images, such as underwater sonar images. The goal of this research is to perfectly reconstruct the original sonar image from a sparse signal while maintaining pertinent information, such as mine-like objects, in side-scan sonar (SSS) images. Goldstein and Osher have shown how to use an iterative method to reconstruct the original image through a method called Split Bregman iteration. This method "decouples" the energies using portions of the energy from both the ℓ1 and ℓ2 norms. Once the energies are split, Bregman iteration is used to solve the unconstrained optimization problem by recursively solving the problems simultaneously. The faster these two steps can be solved, the faster the overall method becomes. While the majority of CS research is still focused on the medical field, this paper will demonstrate the effectiveness of the Split Bregman method on sonar images.

  10. Optimizing the processing and presentation of PPCR imaging

    Science.gov (United States)

    Davies, Andrew G.; Cowen, Arnold R.; Parkin, Geoff J. S.; Bury, Robert F.

    1996-03-01

    Photostimulable phosphor computed radiography (CR) is becoming an increasingly popular image acquisition system. The acceptability of this technique, diagnostically, ergonomically and economically, is highly influenced by the method by which the image data is presented to the user. Traditional CR systems utilize an 11 x 14 inch film hardcopy format, and can place two images per exposure onto this film, which does not correspond to sizes and presentations provided by conventional techniques. It is also the authors' experience that the image enhancement algorithms provided by traditional CR systems do not provide optimal image presentation. An alternative image enhancement algorithm was developed, along with a number of hardcopy formats, designed to match the requirements of the image reporting process. The new image enhancement algorithm, called dynamic range reduction (DRR), is designed to provide a single presentation per exposure, maintaining the appearance of a conventional radiograph, while optimizing the rendition of diagnostically relevant features within the image. The algorithm was developed on a Sun SPARCstation, but later ported to a Philips' EasyVisionRAD workstation. Print formats were developed on the EasyVision to improve the acceptability of the CR hardcopy. For example, for mammographic examinations, four mammograms (a cranio-caudal and medio-lateral view of each breast) are taken for each patient, with all images placed onto a single sheet of 14 x 17 inch film. The new composite format provides a more suitable image presentation for reporting, and is more economical to produce. It is the use of enhanced image processing and presentation which has enabled all mammography undertaken within the general infirmary to be performed using the CR/EasyVisionRAD DRR/3M 969 combination, without recourse to conventional film/screen mammography.

  11. Temporal Parameter Optimization in Four-Dimensional Flash Trajectory Imaging

    Institute of Scientific and Technical Information of China (English)

    WANG Xin-Wei; ZHOU Yan; FAN Song-Tao; LIU Yu-Liang

    2011-01-01

    In four-dimensional flash trajectory imaging, temporal parameters include the time delay, laser pulse width, gate time, pulse-pair repetition frequency and the frame rate of the CCD, which directly impact the acquisition of target trajectories over time. We propose a method of optimizing the temporal parameters of flash trajectory imaging. All the temporal parameters can be estimated from the spatial parameters of the volumes of interest, the target scale and velocity, and the target sample number. The formulae for optimizing the temporal parameters are derived, and the method is demonstrated in an experiment with a ball oscillating as a pendulum.

  12. Fast algorithm for optimal graph-Laplacian based 3D image segmentation

    Science.gov (United States)

    Harizanov, S.; Georgiev, I.

    2016-10-01

    In this paper we propose an iterative steepest-descent-type algorithm that is observed to converge towards the exact solution of the ℓ0 discrete optimization problem, related to graph-Laplacian based image segmentation. Such an algorithm allows for significant additional improvements on the segmentation quality once the minimizer of the associated relaxed ℓ1 continuous optimization problem is computed, unlike the standard strategy of simply hard-thresholding the latter. Convergence analysis of the algorithm is not a subject of this work. Instead, various numerical experiments, confirming the practical value of the algorithm, are documented.

  13. Partial-Transfer Absorption Imaging: A versatile technique for optimal imaging of ultracold gases

    CERN Document Server

    Ramanathan, Anand; Wright, Kevin C; Anderson, Russell P; Phillips, William D; Helmerson, Kristian; Campbell, Gretchen K

    2012-01-01

    Partial-transfer absorption imaging is a tool that enables optimal imaging of atomic clouds for a wide range of optical depths. In contrast to standard absorption imaging, the technique can be minimally-destructive and can be used to obtain multiple successive images of the same sample. The technique involves transferring a small fraction of the sample from an initial internal atomic state to an auxiliary state and subsequently imaging that fraction absorptively on a cycling transition. The atoms remaining in the initial state are essentially unaffected. We demonstrate the technique, discuss its applicability, and compare its performance as a minimally-destructive technique to that of phase-contrast imaging.

  14. Adaptive hybrid optimization strategy for calibration and parameter estimation of physical models

    CERN Document Server

    Vesselinov, Velimir V

    2011-01-01

    A new adaptive hybrid optimization strategy, entitled squads, is proposed for complex inverse analysis of computationally intensive physical models. The new strategy is designed to be computationally efficient and robust in identification of the global optimum (e.g. maximum or minimum value of an objective function). It integrates a global Adaptive Particle Swarm Optimization (APSO) strategy with a local Levenberg-Marquardt (LM) optimization strategy using adaptive rules based on runtime performance. The global strategy optimizes the location of a set of solutions (particles) in the parameter space. The LM strategy is applied only to a subset of the particles at different stages of the optimization based on the adaptive rules. After the LM adjustment of the subset of particle positions, the updated particles are returned to the APSO strategy. The advantages of coupling APSO and LM in the manner implemented in squads is demonstrated by comparisons of squads performance against Levenberg-Marquardt (LM), Particl...

  15. Remote Sensing Image Fusion Using Ica and Optimized Wavelet Transform

    Science.gov (United States)

    Hnatushenko, V. V.; Vasyliev, V. V.

    2016-06-01

    In remote-sensing image processing, fusion (pan-sharpening) is a process of merging high-resolution panchromatic and lower resolution multispectral (MS) imagery to create a single high-resolution color image. Many methods exist to produce data fusion results with the best possible spatial and spectral characteristics, and a number have been commercially implemented. However, the pan-sharpened images produced by these methods suffer from high color distortion of the spectral information. In this paper, to minimize the spectral distortion we propose a remote sensing image fusion method which combines Independent Component Analysis (ICA) and an optimized wavelet transform. The proposed method is based on the selection of multiscale components obtained after the ICA of the images on the basis of their wavelet decomposition, and on the formation of linear combinations of the detail coefficients of the wavelet decomposition of the image brightness distributions in the spectral channels, with iteratively adjusted weights. These coefficients are determined by solving an optimization problem whose criterion is the maximization of the information entropy of the synthesized images formed by wavelet reconstruction. Further, the images of the spectral channels are reconstructed by the inverse wavelet transform, and the resulting image is formed by superposition of the obtained images. To verify its validity, the new proposed method is compared with several techniques using WorldView-2 satellite data in subjective and objective aspects. In experiments we demonstrate that our scheme provides good spectral quality and efficiency. Spectral and spatial quality metrics in terms of RASE, RMSE, CC, ERGAS and SSIM are used in our experiments. The synthesized MS images show better contrast and clarity at the boundaries between the objects of interest and the background. The results show that the proposed approach performs better than the compared methods according to the performance metrics.

  16. Illumination, color and imaging evaluation and optimization of visual displays

    CERN Document Server

    Bodrogi, P

    2012-01-01

    This comprehensive and modern reference on display technology, Illumination, color and imaging focuses on visual effects and how displayed images are best matched to the human visual system. It teaches how to exploit the knowledge of color information processing to design usable, ergonomic, and visually pleasing displays and display environments. The contents describe design principles and methods to optimize self-luminous visual technologies for the user using modern still and motion image displays and the whole range of indoor light sources. Design principles and methods are derived from

  17. Optimal Portfolios in Wishart Models and Effects of Discrete Rebalancing on Portfolio Distribution and Strategy Selection

    OpenAIRE

    Li, Zejing

    2012-01-01

    This dissertation is mainly devoted to the research of two problems - the continuous-time portfolio optimization in different Wishart models and the effects of discrete rebalancing on portfolio wealth distribution and optimal portfolio strategy.

  18. Optimal control strategies for tuberculosis treatment: a case study in Angola

    CERN Document Server

    Silva, Cristiana J

    2012-01-01

    We apply optimal control theory to a tuberculosis model given by a system of ordinary differential equations. Optimal control strategies are proposed to minimize the cost of interventions. Numerical simulations are given using data from Angola.

  19. Selection, optimization, and compensation strategies : Interactive effects on daily work engagement

    NARCIS (Netherlands)

    Zacher, Hannes; Chan, Felicia; Bakker, Arnold B.; Demerouti, Evangelia

    2015-01-01

    The theory of selective optimization with compensation (SOC) proposes that the "orchestrated" use of three distinct action regulation strategies (selection, optimization, and compensation) leads to positive employee outcomes. Previous research examined overall scores and additive models (i.e., main

  20. Computing Optimal Stochastic Portfolio Execution Strategies: A Parametric Approach Using Simulations

    Science.gov (United States)

    Moazeni, Somayeh; Coleman, Thomas F.; Li, Yuying

    2010-09-01

    Computing optimal stochastic portfolio execution strategies under appropriate risk consideration presents great computational challenge. We investigate a parametric approach for computing optimal stochastic strategies using Monte Carlo simulations. This approach allows reduction in computational complexity by computing coefficients for a parametric representation of a stochastic dynamic strategy based on static optimization. Using this technique, constraints can be similarly handled using appropriate penalty functions. We illustrate the proposed approach to minimize the expected execution cost and Conditional Value-at-Risk (CVaR).
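
    The evaluation step of such a parametric approach can be sketched as follows: a one-parameter family of execution schedules is simulated under a random-walk price with linear temporary impact, and the expected cost and CVaR are estimated from Monte Carlo samples. The impact model, the parameter family and all numbers are invented for illustration; the paper's full optimization over strategy coefficients is not reproduced.

```python
"""Toy Monte Carlo evaluation of parametric execution schedules: sell X
shares over T periods with a power-law schedule x_t ~ t^(a-1), under a
random-walk price with linear temporary impact, and compare expected cost
and CVaR across the parameter a. Illustrates only the simulation-based
evaluation idea; the impact model and parameters are invented."""
import numpy as np

rng = np.random.default_rng(8)
X, T, SIGMA, ETA, N_PATHS, ALPHA = 1_000_000, 20, 0.02, 2e-7, 20_000, 0.95

def schedule(a):
    w = np.arange(1, T + 1, dtype=float) ** (a - 1.0)
    return X * w / w.sum()                         # shares sold in each period

def cost_samples(a):
    trades = schedule(a)
    noise = rng.normal(0.0, SIGMA, (N_PATHS, T)).cumsum(axis=1)   # price random walk
    exec_price = 100.0 + noise - ETA * trades                     # linear temporary impact
    # implementation shortfall relative to selling everything at the initial price
    return X * 100.0 - (trades * exec_price).sum(axis=1)

def cvar(samples, alpha=ALPHA):
    tail = np.sort(samples)[int(alpha * len(samples)):]           # worst (1-alpha) outcomes
    return tail.mean()

for a in (0.5, 1.0, 2.0):                          # front-loaded, TWAP-like, back-loaded
    c = cost_samples(a)
    print(f"a={a:3.1f}  expected cost = {c.mean():12.0f}   CVaR{int(ALPHA*100)} = {cvar(c):12.0f}")
```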

  1. Analytical models integrated with satellite images for optimized pest management

    Science.gov (United States)

    The global field protection (GFP) system was developed to protect and optimize pest management resources by integrating satellite images for precise field demarcation with physical models of controlled-release pesticide devices to protect large fields. The GFP was implemented using a graphical user interf...

  2. Realizing sub-wavelength imaging with evolutionary optimization

    DEFF Research Database (Denmark)

    Zhang, Jingjing

    2015-01-01

    Here we propose an approach to realizing a far-field subwavelength imaging lens by combining the transformation optics methodology with an evolutionary optimization method. The lens is composed of an isotropic dielectric core and an anisotropic or isotropic dielectric matching layer, of which the parameters

  3. Imaging of soft tissue tumors: general imaging strategy and technical considerations.

    Science.gov (United States)

    Van Dyck, P; Gielen, J; Vanhoenacker, F M; De Schepper, A M; Parizel, P M

    2006-01-01

    This paper reviews the imaging strategy and protocol for detection, grading and staging, and posttherapeutic follow-up of soft tissue tumors (STT), used in our institution. The role of each imaging technique, with emphasis on magnetic resonance imaging, is highlighted.

  4. Optimal Point Spread Function Design for 3D Imaging

    Science.gov (United States)

    Shechtman, Yoav; Sahl, Steffen J.; Backer, Adam S.; Moerner, W. E.

    2015-01-01

    To extract from an image of a single nanoscale object maximum physical information about its position, we propose and demonstrate a framework for pupil-plane modulation for 3D imaging applications requiring precise localization, including single-particle tracking and super-resolution microscopy. The method is based on maximizing the information content of the system, by formulating and solving the appropriate optimization problem – finding the pupil-plane phase pattern that would yield a PSF with optimal Fisher information properties. We use our method to generate and experimentally demonstrate two example PSFs: one optimized for 3D localization precision over a 3 μm depth of field, and another with an unprecedented 5 μm depth of field, both designed to perform under physically common conditions of high background signals. PMID:25302889
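
    The Fisher-information objective can be illustrated with a much simpler calculation: the Cramér-Rao lower bound on the x-position of a pixelated Gaussian PSF under Poisson noise with uniform background. The optimized pupil-plane phase patterns of the paper are not reproduced; the photon counts, pixel size and PSF width below are invented.

```python
"""Sketch of the Fisher-information objective for localization: for a
pixelated Gaussian PSF with Poisson noise and uniform background, compute
the Cramer-Rao lower bound on the x-position estimate. The paper optimizes a
pupil-plane phase to maximize 3D Fisher information; here only the 1D
information of a fixed Gaussian PSF is evaluated, with invented numbers."""
import numpy as np

PIXEL = 100.0        # nm
SIGMA = 130.0        # nm, PSF width
N_PHOTONS = 3000.0   # signal photons
BACKGROUND = 20.0    # photons per pixel
GRID = 15            # pixels per side

def expected_counts(x0, y0):
    centers = (np.arange(GRID) - GRID // 2) * PIXEL
    xx, yy = np.meshgrid(centers, centers)
    psf = np.exp(-((xx - x0) ** 2 + (yy - y0) ** 2) / (2 * SIGMA ** 2))
    psf /= psf.sum()
    return N_PHOTONS * psf + BACKGROUND

def crlb_x(x0=0.0, y0=0.0, dx=0.1):
    mu = expected_counts(x0, y0)
    dmu_dx = (expected_counts(x0 + dx, y0) - expected_counts(x0 - dx, y0)) / (2 * dx)
    fisher = np.sum(dmu_dx ** 2 / mu)           # Poisson Fisher information for x
    return 1.0 / np.sqrt(fisher)                # lower bound on std of the x-estimate, nm

print(f"CRLB on x-localization: {crlb_x():.1f} nm")
```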

  5. Optimizing strategies to improve interprofessional practice for veterans, part 1

    Directory of Open Access Journals (Sweden)

    Bhattacharya SB

    2014-04-01

    Full Text Available Shelley B Bhattacharya,1–3 Michelle I Rossi,1,2 Jennifer M Mentz1 1Geriatric Research Education and Clinical Center (GRECC), Veterans Affairs Pittsburgh Healthcare System, 2University of Pittsburgh Medical Center, Pittsburgh, PA, USA; 3Albert Schweitzer Fellowship Program, Pittsburgh, PA, USA. Introduction: Interprofessional patient care is a well-recognized path that health care systems are striving toward. The Veterans Affairs (VA) system initiated interprofessional practice (IPP) models with their Geriatric Evaluation and Management (GEM) programs. GEM programs incorporate a range of specialties, including but not limited to medicine, nursing, social work, physical therapy and pharmacy, to collaboratively evaluate veterans. Despite being a valuable resource, they are now faced with significant cut-backs, including closures. The primary goal of this project was to assess how the GEM model could be optimized at the Pittsburgh, Pennsylvania VA to allow for the sustainability of this important IPP assessment. Part 1 of the study evaluated the IPP process using program, patient, and family surveys. Part 2 examined how well the geriatrician matched patients to specialists in the GEM model. This paper describes Part 1 of our study. Methods: Three strategies were used: (1) a national GEM program survey; (2) a veteran/family satisfaction survey; and (3) an absentee assessment. Results: Twenty-six of 92 programs responded to the GEM IPP survey. Six strategies were shared to optimize IPP models throughout the country. Of the 34 satisfaction surveys, 80% stated the GEM clinic was beneficial, 79% stated their concerns were addressed, and 100% would recommend GEM to their friends. Of the 24 absentee assessments, the top three reasons for missing the appointments were transportation, medical illnesses, and not knowing/remembering about the appointment. The absentee rate diminished from 41% to 19% after instituting a reminder phone call policy. Discussion: Maintaining the

  6. Optimal pandemic influenza vaccine allocation strategies for the Canadian population.

    Directory of Open Access Journals (Sweden)

    Ashleigh R Tuite

    Full Text Available BACKGROUND: The world is currently confronting the first influenza pandemic of the 21st century. Influenza vaccination is an effective preventive measure, but the unique epidemiological features of swine-origin influenza A (H1N1) (pH1N1) introduce uncertainty as to the best strategy for prioritization of vaccine allocation. We sought to determine optimal prioritization of vaccine distribution among different age and risk groups within the Canadian population, to minimize influenza-attributable morbidity and mortality. METHODOLOGY/PRINCIPAL FINDINGS: We developed a deterministic, age-structured compartmental model of influenza transmission, with key parameter values estimated from data collected during the initial phase of the epidemic in Ontario, Canada. We examined the effect of different vaccination strategies on attack rates, hospitalizations, intensive care unit admissions, and mortality. In all scenarios, prioritization of high-risk individuals (those with underlying chronic conditions and pregnant women), regardless of age, markedly decreased the frequency of severe outcomes. When individuals with underlying medical conditions were not prioritized and an age group-based approach was used, preferential vaccination of age groups at increased risk of severe outcomes following infection generally resulted in decreased mortality compared to targeting vaccine to age groups with higher transmission, at a cost of higher population-level attack rates. All simulations were sensitive to the timing of the epidemic peak in relation to vaccine availability, with vaccination having the greatest impact when it was implemented well in advance of the epidemic peak. CONCLUSIONS/SIGNIFICANCE: Our model simulations suggest that vaccine should be allocated to high-risk groups, regardless of age, followed by age groups at increased risk of severe outcomes. Vaccination may significantly reduce influenza-attributable morbidity and mortality, but the benefits are
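
    As a rough illustration of how such allocation questions are posed computationally, the sketch below compares two allocation strategies in a deliberately simplified two-group SIR model with vaccination at time zero. It is not the authors' age-structured, Ontario-calibrated model; the contact matrix, severity weights, and vaccine supply are invented values.

```python
import numpy as np

def attack_rates(vacc_frac, beta, gamma=1/3, days=300, dt=0.1):
    """Two-group SIR with a share of each group vaccinated at t=0.
    Returns the final attack rate (fraction ever infected) per group."""
    S = np.array([1.0, 1.0]) * (1 - vacc_frac)      # susceptible fractions per group
    I = np.array([1e-4, 1e-4])                      # small infection seed in both groups
    R = np.zeros(2)
    for _ in range(int(days / dt)):
        force = beta @ I                            # force of infection per group
        new_inf = force * S * dt
        recov = gamma * I * dt
        S -= new_inf
        I += new_inf - recov
        R += recov
    return R + I

# Hypothetical contact matrix: group 0 = high-transmission, group 1 = high-risk for severe outcomes.
beta = np.array([[0.55, 0.15],
                 [0.15, 0.25]])
severity = np.array([0.001, 0.02])                  # assumed severity weights, illustrative only
supply = 0.4                                        # vaccine enough for 40% of the population

for name, vf in [("prioritise high-transmission", np.array([supply * 2, 0.0])),
                 ("prioritise high-risk",         np.array([0.0, supply * 2]))]:
    ar = attack_rates(np.clip(vf, 0, 1), beta)
    print(f"{name}: attack rates {ar.round(3)}, severity-weighted burden {float(ar @ severity):.4f}")
```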

  7. Field of view selection for optimal airborne imaging sensor performance

    Science.gov (United States)

    Goss, Tristan M.; Barnard, P. Werner; Fildis, Halidun; Erbudak, Mustafa; Senger, Tolga; Alpman, Mehmet E.

    2014-05-01

    The choice of the Field of View (FOV) of imaging sensors used in airborne targeting applications has a major impact on the overall performance of the system. Conducting a market survey from published data on sensors used in stabilized airborne targeting systems shows a trend of ever narrowing FOVs housed in smaller and lighter volumes. This approach promotes the ever increasing geometric resolution provided by narrower FOVs, while it seemingly ignores the influences the FOV selection has on the sensor's sensitivity, the effects of diffraction, the influences of sight line jitter and collectively the overall system performance. This paper presents a trade-off methodology to select the optimal FOV for an imaging sensor that is limited in aperture diameter by mechanical constraints (such as space/volume available and window size) by balancing the influences FOV has on sensitivity and resolution and thereby optimizing the system's performance. The methodology may be applied to staring array based imaging sensors across all wavebands from visible/day cameras through to long wave infrared thermal imagers. Some examples of sensor analysis applying the trade-off methodology are given that highlight the performance advantages that can be gained by maximizing the aperture diameters and choosing the optimal FOV for an imaging sensor used in airborne targeting applications.

  8. SAR Image Segmentation Based On Hybrid PSOGSA Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Amandeep Kaur

    2014-09-01

    Full Text Available Image segmentation is useful in many applications. It can identify the regions of interest in a scene or annotate the data. Existing segmentation algorithms can be categorized into region-based segmentation, data clustering, and edge-based segmentation. Region-based segmentation includes the seeded and unseeded region growing algorithms, JSEG, and the fast scanning algorithm. Due to the presence of speckle noise, segmentation of Synthetic Aperture Radar (SAR) images is still a challenging problem. We propose a fast SAR image segmentation method based on the hybrid Particle Swarm Optimization-Gravitational Search Algorithm (PSO-GSA). In this method, threshold estimation is treated as a search procedure that looks for an appropriate value in a continuous grayscale interval. Hence, the PSO-GSA algorithm is employed to search for the optimal threshold. Experimental results indicate that our method is superior to GA-based, AFS-based and ABC-based methods in terms of segmentation accuracy, segmentation time, and thresholding quality.
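
    A minimal sketch of the threshold-search idea follows, using a plain particle swarm (not the hybrid PSO-GSA of the paper) to search a continuous gray-level interval for the threshold that maximizes Otsu's between-class variance; the test data is a synthetic bimodal sample rather than a SAR image.

```python
import numpy as np

def between_class_variance(image, t):
    """Otsu's between-class variance for threshold t (higher is better)."""
    fg, bg = image[image > t], image[image <= t]
    if fg.size == 0 or bg.size == 0:
        return 0.0
    w_fg, w_bg = fg.size / image.size, bg.size / image.size
    return w_fg * w_bg * (fg.mean() - bg.mean()) ** 2

def pso_threshold(image, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Standard PSO searching the continuous gray-level interval for the best threshold."""
    rng = np.random.default_rng(seed)
    lo, hi = float(image.min()), float(image.max())
    pos = rng.uniform(lo, hi, n_particles)
    vel = np.zeros(n_particles)
    pbest = pos.copy()
    pbest_val = np.array([between_class_variance(image, p) for p in pos])
    gbest = pbest[pbest_val.argmax()]
    for _ in range(iters):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([between_class_variance(image, p) for p in pos])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmax()]
    return gbest

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic bimodal "image": dark background and bright objects with noise.
    img = np.concatenate([rng.normal(60, 12, 4000), rng.normal(170, 15, 1000)]).clip(0, 255)
    print("PSO threshold: %.1f" % pso_threshold(img))
```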

  9. Leaf Area Adjustment As an Optimal Drought-Adaptation Strategy

    Science.gov (United States)

    Manzoni, S.; Beyer, F.; Thompson, S. E.; Vico, G.; Weih, M.

    2014-12-01

    Leaf phenology plays a major role in land-atmosphere mass and energy exchanges. Much work has focused on phenological responses to light and temperature, but less on leaf area changes during dry periods. Because the duration of droughts is expected to increase under future climates in seasonally-dry as well as mesic environments, it is crucial (i) to predict drought-related phenological changes and (ii) to develop physiologically-sound models of leaf area dynamics during dry periods. Several optimization criteria have been proposed to model leaf area adjustment as soil moisture decreases. Some theories are based on the plant carbon (C) balance, hypothesizing that leaf area will decline when instantaneous net photosynthetic rates become negative (equivalent to maximization of cumulative C gain). Other theories draw on hydraulic principles, suggesting that leaf area should adjust to either maintain a constant leaf water potential (isohydric behavior) or to avoid leaf water potentials with negative impacts on photosynthesis (i.e., minimization of water stress). Evergreen leaf phenology is considered as a control case. Merging these theories into a unified framework, we quantify the effect of phenological strategy and climate forcing on the net C gain over the entire growing season. By accounting for the C costs of leaf flushing and the gains stemming from leaf photosynthesis, this metric assesses the effectiveness of different phenological strategies, under different climatic scenarios. Evergreen species are favored only when the dry period is relatively short, as they can exploit most of the growing season, and only incur leaf maintenance costs during the short dry period. In contrast, deciduous species that lower maintenance costs by losing leaves are advantaged under drier climates. Moreover, among drought-deciduous species, isohydric behavior leads to the lowest C gains. Losing leaves gradually so as to maintain a net C uptake equal to zero during the driest period in

  10. Medical Image Classification Using Genetic Optimized Elman Network

    Directory of Open Access Journals (Sweden)

    T. Baranidharan

    2012-01-01

    Full Text Available Problem statement: Advancements in the internet and digital imaging have resulted in huge databases of images. Most current web search engines depend only on images that can be retrieved using metadata, which generates many unwanted results. A Content-Based Image Retrieval (CBIR) system is the application of computer vision techniques to the problem of image retrieval; in other words, it is used for searching for and retrieving the right digital image from a huge database using a query image. CBIR finds extensive application in the field of medicine, as it helps medical professionals in diagnosis and treatment planning. Approach: Various methods have been proposed for CBIR using the images' low-level features such as histogram, color, texture and shape. Similarly, various classification algorithms such as the Naive Bayes classifier, Support Vector Machines, decision tree induction algorithms and neural-network-based classifiers have been studied extensively. In this study it is proposed to extract global features using the Hilbert Transform (HT), to select features based on the correlation of the extracted vectors with respect to the class label, and to propose an enhanced Elman neural network, the Genetic Algorithm Optimized Elman (GAOE) neural network. Results and Conclusion: The proposed feature extraction method and classification algorithm were tested on a dataset consisting of 180 medical images. A classification accuracy of 92.22% was obtained with the proposed method.

  11. A Swarm Optimization Based Power Aware Clustering Strategy for WSNs

    Directory of Open Access Journals (Sweden)

    Harendra S. Jangwan

    2017-02-01

    Full Text Available The technique of dividing a wireless sensor network (WSN) into clusters has proved to be most suitable for reliable data communication inside the network. This approach also improves the throughput of the system along with other attributes such as the rate of delivering data packets to the base station (BS) and the overall energy dissipation of the sensor nodes in the network, which in turn results in increased network lifetime. As the sensor nodes are operated by battery or some other limited source, energy is a constrained resource. Therefore, there is a strong need to develop a novel approach to overcome this constraint, since it otherwise leads to the degradation of the network. The swarm intelligence approach is able to cope with such pitfalls of WSNs. In this paper, we present a cluster-head (CH) selection technique based on swarm optimization with the main aim of increasing the overall network lifetime. The proposed approach performs better with regard to power utilization of nodes, data packets received at the BS, and stability period, and for this reason is a better performer than the Stable Election Protocol (SEP) and the Enhanced Threshold Sensitive Stable Election Protocol (ETSSEP). MATLAB simulation outcomes show that the proposed clustering strategy outperforms SEP and ETSSEP with regard to the above-noted attributes.

  12. Solving Optimal Broadcasting Strategy in Metropolitan MANETs Using MOCELL Algorithm

    Directory of Open Access Journals (Sweden)

    M. Ghonamy

    2010-09-01

    Full Text Available Mobile ad-hoc networks (MANETs) are a set of communicating devices that are able to spontaneously interconnect without any pre-existing infrastructure. In such a scenario, broadcasting becomes very important to the existence and operation of the network. Optimizing the broadcast strategy of MANETs is a multi-objective problem with three objectives: (1) reaching as many stations as possible, (2) minimizing the network utilization, and (3) reducing the broadcasting duration. The main contribution of this paper is that it tackles this problem by using a multi-objective cellular genetic algorithm called MOCELL. MOCELL computes a Pareto front of solutions to empower a human designer with the ability to choose the preferred configuration for the network. Our results are compared with those obtained from the previous proposal used for solving the problem, a cellular multi-objective genetic algorithm called cMOGA (the old version of MOCELL). We conclude that MOCELL outperforms cMOGA with respect to the set coverage metric.
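
    The Pareto front that such a multi-objective optimizer hands back to the designer can be illustrated with a few lines of code. The sketch below is not MOCELL itself; it merely extracts the non-dominated set from a handful of hypothetical broadcast configurations scored on the three objectives (coverage to maximize, network utilization and duration to minimize).

```python
import numpy as np

def pareto_front(scores):
    """Return indices of non-dominated rows. Convention: every column is to be minimized."""
    n = scores.shape[0]
    keep = np.ones(n, dtype=bool)
    for i in range(n):
        if not keep[i]:
            continue
        # Row i is dominated if some row is no worse in every objective and strictly better in one.
        dominated = np.all(scores <= scores[i], axis=1) & np.any(scores < scores[i], axis=1)
        if dominated.any():
            keep[i] = False
    return np.flatnonzero(keep)

# Hypothetical candidate configurations: (coverage, utilization, duration).
cands = np.array([[0.95, 120, 8.0],
                  [0.90,  80, 6.5],
                  [0.95, 100, 9.0],
                  [0.70,  40, 3.0],
                  [0.90,  90, 7.0]])   # this last one is dominated by the second configuration
# Negate coverage so that all three objectives are minimized.
front = pareto_front(np.column_stack([-cands[:, 0], cands[:, 1], cands[:, 2]]))
print("Non-dominated configurations:", front)
```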

  13. Phenology as a strategy for carbon optimality: a global model

    Directory of Open Access Journals (Sweden)

    S. Caldararu

    2013-09-01

    Full Text Available Phenology is essential to our understanding of biogeochemical cycles and the climate system. We develop a global mechanistic model of leaf phenology based on the hypothesis that phenology is a strategy for optimal carbon gain at the canopy level so that trees adjust leaf gains and losses in response to environmental factors such as light, temperature and soil moisture, to achieve maximum carbon assimilation. We fit this model to five years of satellite observations of leaf area index (LAI using a Bayesian fitting algorithm. We show that our model is able to reproduce phenological patterns for all vegetation types and use it to explore variations in growing season length and the climate factors that limit leaf growth for different biomes. Phenology in wet tropical areas is limited by leaf age physiological constraints while at higher latitude leaf seasonality is limited by low temperature and light availability. Leaf growth in grassland regions is limited by water availability but often in combination with other factors. This model will advance the current understanding of phenology for ecosystem carbon models and our ability to predict future phenological behaviour.

  14. Strategy optimization for mask rule check in wafer fab

    Science.gov (United States)

    Yang, Chuen Huei; Lin, Shaina; Lin, Roger; Wang, Alice; Lee, Rachel; Deng, Erwin

    2015-07-01

    The photolithography process for wafer production is getting more and more sophisticated, following Moore's law. Therefore, for a wafer fab, consolidated and close cooperation with the mask house is key to silicon wafer success. However, generally speaking, it is not easy to preserve such a partnership because many engineering efforts and frequent communication are indispensable. This loose connection is most obvious in mask rule check (MRC). Mask houses do their own MRC at the job deck stage, but that checking only identifies mask process limitations, including writing, etching, inspection, metrology, etc. No further checking for wafer-process-related mask data errors is performed after the data files of the whole mask are composed in the mask house. Many potential data errors remain even after post-OPC verification has been done for the main circuits. The errors discussed here are those that only occur when the main circuits are combined with frame and dummy patterns to form the whole reticle. Therefore, strategy optimization is ongoing at UMC to evaluate MRC, especially for wafer-fab-concerned errors. The prerequisite is that this extra checking must not impact the mask delivery cycle time. A full-mask check based on the job deck in gds or oasis format is necessary in order to secure acceptable run time. The form of the summarized error report generated by this checking is also crucial, because a user-friendly interface will shorten engineers' judgment time to release the mask for writing. This paper surveys the key factors of MRC in a wafer fab.

  15. Optimal recruitment strategies for groups of interacting walkers with leaders

    Science.gov (United States)

    Martínez-García, Ricardo; López, Cristóbal; Vazquez, Federico

    2015-02-01

    We introduce a model of interacting random walkers on a finite one-dimensional chain with absorbing boundaries or targets at the ends. Walkers are of two types: informed particles that move ballistically towards a given target and diffusing uninformed particles that are biased towards close informed individuals. This model mimics the dynamics of hierarchical groups of animals, where an informed individual tries to persuade and lead the movement of its conspecifics. We characterize the success of this persuasion by the first-passage probability of the uninformed particle to the target, and we interpret the speed of the informed particle as a strategic parameter that the particle can tune to maximize its success. We find that the success probability is nonmonotonic, reaching its maximum at an intermediate speed whose value increases with the diffusing rate of the uninformed particle. When two different groups of informed leaders traveling in opposite directions compete, usually the largest group is the most successful. However, the minority can reverse this situation and become the most probable winner by following two different strategies: increasing its attraction strength or adjusting its speed to an optimal value relative to the majority's speed.

  16. Glycosylation of therapeutic proteins: an effective strategy to optimize efficacy.

    Science.gov (United States)

    Solá, Ricardo J; Griebenow, Kai

    2010-02-01

    During their development and administration, protein-based drugs routinely display suboptimal therapeutic efficacies due to their poor physicochemical and pharmacological properties. These innate liabilities have driven the development of molecular strategies to improve the therapeutic behavior of protein drugs. Among the currently developed approaches, glycoengineering is one of the most promising, because it has been shown to simultaneously afford improvements in most of the parameters necessary for optimization of in vivo efficacy while allowing for targeting to the desired site of action. These include increased in vitro and in vivo molecular stability (due to reduced oxidation, cross-linking, pH-, chemical-, heating-, and freezing-induced unfolding/denaturation, precipitation, kinetic inactivation, and aggregation), as well as modulated pharmacodynamic responses (due to altered potencies from diminished in vitro enzymatic activities and altered receptor binding affinities) and improved pharmacokinetic profiles (due to altered absorption and distribution behaviors, longer circulation lifetimes, and decreased clearance rates). This article provides an account of the effects that glycosylation has on the therapeutic efficacy of protein drugs and describes the current understanding of the mechanisms by which glycosylation leads to such effects.

  17. Noninfectious uveitis: strategies to optimize treatment compliance and adherence

    Directory of Open Access Journals (Sweden)

    Dolz-Marco R

    2015-08-01

    Full Text Available Rosa Dolz-Marco,1 Roberto Gallego-Pinazo,1 Manuel Díaz-Llopis,2 Emmett T Cunningham Jr,3–6 J Fernando Arévalo7,8 1Unit of Macula, Department of Ophthalmology, University and Polytechnic Hospital La Fe, 2Faculty of Medicine, University of Valencia, Spain; 3Department of Ophthalmology, California Pacific Medical Center, San Francisco, 4Department of Ophthalmology, Stanford University School of Medicine, Stanford, 5The Francis I Proctor Foundation, University of California San Francisco Medical Center, 6West Coast Retina Medical Group, San Francisco, CA, USA; 7Vitreoretina Division, King Khaled Eye Specialist Hospital, Riyadh, Saudi Arabia; 8Retina Division, Wilmer Eye Institute, Johns Hopkins University School of Medicine, Baltimore, MD, USA Abstract: Noninfectious uveitis includes a heterogenous group of sight-threatening ocular and systemic disorders. Significant progress has been made in the treatment of noninfectious uveitis in recent years, particularly with regard to the effective use of corticosteroids and non-corticosteroid immunosuppressive drugs, including biologic agents. All of these therapeutic approaches are limited, however, by any given patient’s ability to comply with and adhere to their prescribed treatment. In fact, compliance and adherence are among the most important patient-related determinants of treatment success. We discuss strategies to optimize compliance and adherence. Keywords: noninfectious uveitis, intraocular inflammation, immunosuppressive treatment, adherence, compliance, therapeutic failure

  18. Real-time implementation of optimized maximum noise fraction transform for feature extraction of hyperspectral images

    Science.gov (United States)

    Wu, Yuanfeng; Gao, Lianru; Zhang, Bing; Zhao, Haina; Li, Jun

    2014-01-01

    We present a parallel implementation of the optimized maximum noise fraction (G-OMNF) transform algorithm for feature extraction of hyperspectral images on commodity graphics processing units (GPUs). The proposed approach exploits the algorithm's data-level concurrency and optimizes the computing flow. We first define a three-dimensional grid, in which each thread calculates a sub-block of data, to facilitate the spatial and spectral neighborhood data searches in noise estimation, which is one of the most important steps involved in OMNF. Then, we optimize the processing flow and compute the noise covariance matrix before computing the image covariance matrix to reduce the original hyperspectral image data transmission. These optimization strategies can greatly improve the computing efficiency and can be applied to other feature extraction algorithms. The proposed parallel feature extraction algorithm was implemented on an Nvidia Tesla GPU using the compute unified device architecture and the basic linear algebra subroutines library. Through experiments on several real hyperspectral images, our GPU parallel implementation provides a significant speedup of the algorithm compared with the CPU implementation, especially for highly data-parallelizable and arithmetically intensive algorithm parts, such as noise estimation. In order to further evaluate the effectiveness of G-OMNF, we used two different applications for evaluation: spectral unmixing and classification. Considering the sensor scanning rate and the data acquisition time, the proposed parallel implementation meets the requirements of on-board real-time feature extraction.
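
    The ordering highlighted above (estimate the noise covariance first, then the image covariance) is the core of any MNF-style transform. The numpy sketch below is a plain CPU illustration of that idea on a synthetic cube, not the authors' GPU G-OMNF implementation, and it estimates noise simply from horizontal shift differences.

```python
import numpy as np

def mnf(cube, n_components=5):
    """Maximum noise fraction transform of a hyperspectral cube shaped (rows, cols, bands).

    The noise covariance is estimated first, from horizontal shift differences,
    mirroring the 'noise covariance before image covariance' ordering described above.
    """
    rows, cols, bands = cube.shape
    x = cube.reshape(-1, bands).astype(float)
    diff = (cube[:, 1:, :] - cube[:, :-1, :]).reshape(-1, bands) / np.sqrt(2.0)
    cov_noise = np.cov(diff, rowvar=False)
    cov_data = np.cov(x, rowvar=False)
    # Whiten the noise, then diagonalize the whitened data covariance.
    evals_n, evecs_n = np.linalg.eigh(cov_noise)
    whiten = evecs_n / np.sqrt(np.maximum(evals_n, 1e-12))
    evals_w, evecs_w = np.linalg.eigh(whiten.T @ cov_data @ whiten)
    order = np.argsort(evals_w)[::-1]                 # highest SNR components first
    transform = whiten @ evecs_w[:, order[:n_components]]
    return (x - x.mean(axis=0)) @ transform

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic 32x32x20 cube: two spectral endmembers mixed per pixel plus white noise.
    a = rng.random((32, 32, 1))
    cube = a * rng.random(20) + (1 - a) * rng.random(20) + rng.normal(0, 0.02, (32, 32, 20))
    components = mnf(cube, n_components=3)
    print("MNF-transformed pixel matrix shape:", components.shape)
```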

  19. A universal optimization strategy for ant colony optimization algorithms based on the Physarum-inspired mathematical model.

    Science.gov (United States)

    Zhang, Zili; Gao, Chao; Liu, Yuxin; Qian, Tao

    2014-09-01

    Ant colony optimization (ACO) algorithms often fall into local optima and have low search efficiency when solving the travelling salesman problem (TSP). To address these shortcomings, this paper proposes a universal optimization strategy for updating the pheromone matrix in ACO algorithms. The new optimization strategy takes advantage of a unique feature of the Physarum-inspired mathematical model (PMM): critical paths are preserved in the process of evolving adaptive networks. The optimized algorithms, denoted as PMACO algorithms, enhance the amount of pheromone on the critical paths and promote the exploitation of the optimal solution. Experimental results on synthetic and real networks show that the PMACO algorithms are more efficient and robust than the traditional ACO algorithms and are adaptable to solving the TSP with single or multiple objectives. Meanwhile, we further analyse the influence of parameters on the performance of the PMACO algorithms. Based on these analyses, the best values of these parameters are worked out for the TSP.
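
    The flavour of the pheromone-update idea can be sketched as follows. This is not the authors' exact PMACO formulation; it is a generic ACO pheromone step with an extra, assumed multiplicative boost on edges flagged as critical, standing in for the critical paths retained by the Physarum-inspired model.

```python
import numpy as np

def update_pheromone(tau, ant_tours, tour_lengths, critical_edges,
                     rho=0.5, q=1.0, boost=0.3):
    """Generic ACO pheromone update plus extra reinforcement of 'critical' edges.

    tau            : (n, n) pheromone matrix
    ant_tours      : list of city index sequences, one per ant
    tour_lengths   : list of corresponding tour lengths
    critical_edges : set of (i, j) pairs to be reinforced (standing in for the
                     critical paths kept by the Physarum-inspired model)
    """
    tau *= (1.0 - rho)                        # evaporation
    for tour, length in zip(ant_tours, tour_lengths):
        deposit = q / length                  # better (shorter) tours deposit more
        for a, b in zip(tour, np.roll(tour, -1)):
            tau[a, b] += deposit
            tau[b, a] += deposit
    for i, j in critical_edges:               # extra reinforcement of critical paths
        tau[i, j] *= (1.0 + boost)
        tau[j, i] *= (1.0 + boost)
    return tau

# Toy usage: 4 cities, two ants.
tau = np.ones((4, 4))
tours = [np.array([0, 1, 2, 3]), np.array([0, 2, 1, 3])]
lengths = [10.0, 12.0]
tau = update_pheromone(tau, tours, lengths, critical_edges={(0, 1), (2, 3)})
print(np.round(tau, 3))
```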

  20. Structure-Aware Nonlocal Optimization Framework for Image Colorization

    Institute of Scientific and Technical Information of China (English)

    Han-Li Zhao; Gui-Zhi Nie; Xu-Jie Li; CCF Xiao-Gang Jin; Zhi-Geng Pan

    2015-01-01

    This paper proposes a structure-aware nonlocal energy optimization framework for interactive image colorization with sparse scribbles. Our colorization technique propagates colors to both local intensity-continuous regions and remote texture-similar regions without explicit image segmentation. We implement the nonlocal principle by computing k nearest neighbors in the high-dimensional feature space. The feature space contains not only image coordinates and intensities but also statistical texture features obtained with the direction-aligned Gabor wavelet filter. Structure maps are utilized to scale texture features to avoid artifacts along high-contrast boundaries. We show various experimental results and comparisons on image colorization, selective recoloring and decoloring, and progressive color editing to demonstrate the effectiveness of the proposed approach.

  1. Optimization of Bit Plane Combination for Efficient Digital Image Watermarking

    CERN Document Server

    Kejgir, Sushma

    2009-01-01

    In view of the frequent transfer of multimedia data, authentication and protection of images have gained importance in today's world. In this paper we propose a new watermarking technique, based on bit planes, which enhances the robustness and capacity of the watermark while maintaining the transparency of the watermark and the fidelity of the image. In the proposed technique, a higher-strength bit plane of the digital signature watermark is embedded into a significant bit plane of the original image. The selection of the combination of bit planes (image and watermark) is an important issue. Therefore, a mechanism is developed for appropriate bit plane selection. Ten different attacks are selected to test the different alternatives. These attacks are given different weightings as appropriate to user requirements. A weighted correlation coefficient for the retrieved watermark is estimated for each of the alternatives. Based on these estimated values, the optimal bit plane combination is identified for a given user requirement. The proposed method is ...
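
    The basic bit-plane embedding operation described above can be sketched in a few lines of numpy. The plane choices below (host plane 2, watermark plane 7) are arbitrary illustrations, not the optimal combination the paper derives, and no attack or weighted-correlation evaluation is included.

```python
import numpy as np

def embed_bitplane(host, watermark, host_plane=2, wm_plane=7):
    """Replace bit `host_plane` of the host with bit `wm_plane` of the watermark image."""
    wm_bits = (watermark >> wm_plane) & 1                 # selected watermark bit plane (0/1)
    cleared = host & ~np.uint8(1 << host_plane)           # zero the target bit plane of the host
    return cleared | (wm_bits.astype(np.uint8) << host_plane)

def extract_bitplane(watermarked, host_plane=2):
    """Recover the embedded bit plane as a 0/255 binary image."""
    return (((watermarked >> host_plane) & 1) * 255).astype(np.uint8)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    host = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
    watermark = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
    marked = embed_bitplane(host, watermark)
    recovered = extract_bitplane(marked)
    original_plane = (((watermark >> 7) & 1) * 255).astype(np.uint8)
    print("Bit plane recovered exactly:", bool(np.array_equal(recovered, original_plane)))
    print("Max pixel change in host:", int(np.max(np.abs(host.astype(int) - marked.astype(int)))))
```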

  2. Optimization of an Image-Based Talking Head System

    Directory of Open Access Journals (Sweden)

    Kang Liu

    2009-01-01

    Full Text Available This paper presents an image-based talking head system, which includes two parts: analysis and synthesis. The audiovisual analysis part creates a face model of a recorded human subject, which is composed of a personalized 3D mask as well as a large database of mouth images and their related information. The synthesis part generates natural looking facial animations from phonetic transcripts of text. A critical issue of the synthesis is the unit selection which selects and concatenates these appropriate mouth images from the database such that they match the spoken words of the talking head. Selection is based on lip synchronization and the similarity of consecutive images. The unit selection is refined in this paper, and Pareto optimization is used to train the unit selection. Experimental results of subjective tests show that most people cannot distinguish our facial animations from real videos.

  3. Polymerase chain reaction: basic protocol plus troubleshooting and optimization strategies.

    Science.gov (United States)

    Lorenz, Todd C

    2012-05-22

    In the biological sciences there have been technological advances that catapult the discipline into golden ages of discovery. For example, the field of microbiology was transformed with the advent of Anton van Leeuwenhoek's microscope, which allowed scientists to visualize prokaryotes for the first time. The development of the polymerase chain reaction (PCR) is one of those innovations that changed the course of molecular science, with its impact spanning countless subdisciplines in biology. The theoretical process was outlined by Kleppe and coworkers in 1971; however, it was another 14 years until the complete PCR procedure was described and experimentally applied by Kary Mullis while at Cetus Corporation in 1985. Automation and refinement of this technique progressed with the introduction of a thermostable DNA polymerase from the bacterium Thermus aquaticus, hence the name Taq DNA polymerase. PCR is a powerful amplification technique that can generate an ample supply of a specific segment of DNA (i.e., an amplicon) from only a small amount of starting material (i.e., DNA template or target sequence). While straightforward and generally trouble-free, there are pitfalls that complicate the reaction, producing spurious results. When PCR fails it can lead to many non-specific DNA products of varying sizes that appear as a ladder or smear of bands on agarose gels. Sometimes no products form at all. Another potential problem occurs when mutations are unintentionally introduced in the amplicons, resulting in a heterogeneous population of PCR products. PCR failures can become frustrating unless patience and careful troubleshooting are employed to sort out and solve the problem(s). This protocol outlines the basic principles of PCR, provides a methodology that will result in amplification of most target sequences, and presents strategies for optimizing a reaction. By following this PCR guide, students should be able to: • Set up reactions and thermal cycling

  4. Imaging strategies for detection of urgent conditions in patients with acute abdominal pain: diagnostic accuracy study

    Science.gov (United States)

    Laméris, Wytze; van Randen, Adrienne; van Es, H Wouter; van Heesewijk, Johannes P M; van Ramshorst, Bert; Bouma, Wim H; ten Hove, Wim; van Leeuwen, Maarten S; van Keulen, Esteban M; Dijkgraaf, Marcel G W; Bossuyt, Patrick M M; Boermeester, Marja A

    2009-01-01

    Objective To identify an optimal imaging strategy for the accurate detection of urgent conditions in patients with acute abdominal pain. Design Fully paired multicentre diagnostic accuracy study with prospective data collection. Setting Emergency departments of two university hospitals and four large teaching hospitals in the Netherlands. Participants 1021 patients with non-traumatic abdominal pain of >2 hours’ and <5 days’ duration. Exclusion criteria were discharge from the emergency department with no imaging considered warranted by the treating physician, pregnancy, and haemorrhagic shock. Intervention All patients had plain radiographs (upright chest and supine abdominal), ultrasonography, and computed tomography (CT) after clinical and laboratory examination. A panel of experienced physicians assigned a final diagnosis after six months and classified the condition as urgent or non-urgent. Main outcome measures Sensitivity and specificity for urgent conditions, percentage of missed cases and false positives, and exposure to radiation for single imaging strategies, conditional imaging strategies (CT after initial ultrasonography), and strategies driven by body mass index and age or by location of pain. Results 661 (65%) patients had a final diagnosis classified as urgent. The initial clinical diagnosis resulted in many false positive urgent diagnoses, which were significantly reduced after ultrasonography or CT. CT detected more urgent diagnoses than did ultrasonography: sensitivity was 89% (95% confidence interval 87% to 92%) for CT and 70% (67% to 74%) for ultrasonography (P<0.001). A conditional strategy with CT only after negative or inconclusive ultrasonography yielded the highest sensitivity, missing only 6% of urgent cases. With this strategy, only 49% (46% to 52%) of patients would have CT. Alternative strategies guided by body mass index, age, or location of the pain would all result in a loss of sensitivity. Conclusion Although CT is the most

  5. Genetics algorithm optimization of DWT-DCT based image Watermarking

    Science.gov (United States)

    Budiman, Gelar; Novamizanti, Ledya; Iwut, Iwan

    2017-01-01

    Hiding data in image content is essential for establishing ownership of an image. The two-dimensional discrete wavelet transform (DWT) and discrete cosine transform (DCT) are proposed as the transform methods in this paper. First, the host image in RGB color space is converted to a selected color space, and the layer in which the watermark is embedded can also be selected. Next, a 2D-DWT transforms the selected layer into four subbands, of which only one is selected; a block-based 2D-DCT then transforms the selected subband. A binary watermark is embedded in the AC coefficients of each block after zigzag ordering and range-based coefficient selection: a delta parameter replacing the coefficients in each range represents the embedded bit, with +delta representing bit "1" and -delta representing bit "0". The parameters optimized by the Genetic Algorithm (GA) are the selected color space, layer, selected subband of the DWT decomposition, block size, embedding range, and delta. The simulation results show that the GA is able to determine parameters that obtain optimum imperceptibility and robustness whether or not the watermarked image is attacked. The DWT step in DCT-based image watermarking, optimized by the GA, has improved the performance of image watermarking. Under five attacks (JPEG 50%, resize 50%, histogram equalization, salt-and-pepper noise, and additive noise with variance 0.01), the robustness of the proposed method reaches perfect watermark quality with BER = 0, and the watermarked image quality measured by PSNR is also about 5 dB higher than that of the previous method.
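
    A much-reduced sketch of the embedding step is given below: it performs only the ±delta embedding on one mid-frequency coefficient of each 8×8 block DCT of the image itself, skipping the color-space conversion, DWT decomposition, zigzag range selection, and GA parameter search of the paper. Block size, coefficient position, and delta are arbitrary choices, and the marked image is kept as floats rather than being quantized back to 8-bit pixels.

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II matrix: C @ block @ C.T gives the 2-D DCT of a block."""
    k = np.arange(n)[:, None]
    x = np.arange(n)[None, :]
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * x + 1) * k / (2 * n))
    c[0, :] = np.sqrt(1.0 / n)
    return c

def embed_bits(image, bits, coeff=(3, 2), delta=12.0, n=8):
    """Embed one bit per 8x8 block by forcing a mid-frequency AC coefficient to +/- delta."""
    C = dct_matrix(n)
    out = image.astype(float).copy()
    idx = 0
    for r in range(0, image.shape[0] - n + 1, n):
        for s in range(0, image.shape[1] - n + 1, n):
            if idx >= len(bits):
                return out
            block = C @ out[r:r + n, s:s + n] @ C.T
            block[coeff] = delta if bits[idx] else -delta
            out[r:r + n, s:s + n] = C.T @ block @ C
            idx += 1
    return out

def extract_bits(image, n_bits, coeff=(3, 2), n=8):
    """Read back the embedded bits from the sign of the chosen DCT coefficient."""
    C = dct_matrix(n)
    bits = []
    for r in range(0, image.shape[0] - n + 1, n):
        for s in range(0, image.shape[1] - n + 1, n):
            if len(bits) >= n_bits:
                return bits
            block = C @ image[r:r + n, s:s + n].astype(float) @ C.T
            bits.append(1 if block[coeff] > 0 else 0)
    return bits

rng = np.random.default_rng(0)
host = rng.integers(0, 256, size=(64, 64)).astype(float)
payload = [1, 0, 1, 1, 0, 0, 1, 0]
marked = embed_bits(host, payload)
recovered = extract_bits(marked, len(payload))
print("recovered:", recovered, "matches payload:", recovered == payload)
```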

  6. A duple watermarking strategy for multi-channel quantum images

    Science.gov (United States)

    Yan, Fei; Iliyasu, Abdullah M.; Sun, Bo; Venegas-Andraca, Salvador E.; Dong, Fangyan; Hirota, Kaoru

    2015-05-01

    Utilizing a stockpile of efficient transformations consisting of channel of interest, channel swapping, and quantum Fourier transforms, a duple watermarking strategy on multi-channel quantum images is proposed. It embeds the watermark image both into the spatial domain and the frequency domain of the multi-channel quantum carrier image, while also providing a quantum measurement-based algorithm to generate an unknown key that is used to protect the color information, which accompanies another key that is mainly used to scramble the spatial content of the watermark image in order to further safeguard the copyright of the carrier image. Simulation-based experiments using a watermark logo and nine building images as watermark image and carrier images, respectively, offer a duple protection for the copyright of carrier images in terms of the visible quality of the watermarked images. The proposed stratagem advances available literature in the quantum watermarking research field and sets the stage for the applications aimed at quantum data protection.

  7. Spectrally optimal illuminations for diabetic retinopathy detection in retinal imaging

    Science.gov (United States)

    Bartczak, Piotr; Fält, Pauli; Penttinen, Niko; Ylitepsa, Pasi; Laaksonen, Lauri; Lensu, Lasse; Hauta-Kasari, Markku; Uusitalo, Hannu

    2017-01-01

    Retinal photography is a standard method for recording retinal diseases for subsequent analysis and diagnosis. However, the currently used white light or red-free retinal imaging does not necessarily provide the best possible visibility of different types of retinal lesions, important when developing diagnostic tools for handheld devices, such as smartphones. Using specifically designed illumination, the visibility and contrast of retinal lesions could be improved. In this study, spectrally optimal illuminations for diabetic retinopathy lesion visualization are implemented using a spectrally tunable light source based on digital micromirror device. The applicability of this method was tested in vivo by taking retinal monochrome images from the eyes of five diabetic volunteers and two non-diabetic control subjects. For comparison to existing methods, we evaluated the contrast of retinal images taken with our method and red-free illumination. The preliminary results show that the use of optimal illuminations improved the contrast of diabetic lesions in retinal images by 30-70%, compared to the traditional red-free illumination imaging.

  8. OPTIMAL WAVELET FILTER DESIGN FOR REMOTE SENSING IMAGE COMPRESSION

    Institute of Scientific and Technical Information of China (English)

    Yang Guoan; Zheng Nanning; Guo Shugang

    2007-01-01

    A new approach for designing the Biorthogonal Wavelet Filter Bank (BWFB) for the purpose of image compression is presented in this letter. The approach is decomposed into two steps. First, an optimal filter bank is designed in a theoretical sense, based on Vaidyanathan's coding gain criterion in a SubBand Coding (SBC) system. Then the above filter bank is optimized based on the criterion of Peak Signal-to-Noise Ratio (PSNR) in the JPEG2000 image compression system, resulting in a BWFB in a practical application sense. With this approach, a series of BWFBs for a specific class of applications related to image compression, such as remote sensing images, can be designed quickly. Here, new 5/3 BWFB and 9/7 BWFB are presented based on the above approach for remote sensing image compression applications. Experiments show that the two filter banks perform comparably to the CDF 9/7 and LT 5/3 filters in the JPEG2000 standard; at the same time, the coefficients and the lifting parameters of the lifting scheme are all rational, which brings computational advantages and ease of VLSI implementation.

  9. Optimized Optomechanical Micro-Cantilever Array for Uncooled Infrared Imaging

    Institute of Scientific and Technical Information of China (English)

    DONG Feng-Liang; ZHANG Qing-Chuan; CHEN Da-Peng; MIAO Zheng-Yu; XIONG Zhi-Ming; GUO Zhe-Ying; LI Chao-Bo; JIAO Bin-Bin; WU Xiao-Ping

    2007-01-01

    We present a new substrate-free bimaterial cantilever array made of SiNx and Au for an uncooled micro-optomechanical infrared imaging device. Each cantilever element has an optimized deformation magnification structure. A 160×160 array with a 120 μm×120 μm pitch is fabricated and an optical readout is used to collectively measure the deflections of all microcantilevers in the array. Thermal images of room-temperature objects with higher spatial resolution have been obtained, and the noise-equivalent temperature difference of the fabricated focal plane arrays is given statistically and is measured to be about 270 mK.

  10. Automated bone segmentation from dental CBCT images using patch-based sparse representation and convex optimization

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Li; Gao, Yaozong; Shi, Feng; Liao, Shu; Li, Gang [Department of Radiology and BRIC, University of North Carolina at Chapel Hill, North Carolina 27599 (United States); Chen, Ken Chung [Department of Oral and Maxillofacial Surgery, Houston Methodist Hospital Research Institute, Houston, Texas 77030 and Department of Stomatology, National Cheng Kung University Medical College and Hospital, Tainan, Taiwan 70403 (China); Shen, Steve G. F.; Yan, Jin [Department of Oral and Craniomaxillofacial Surgery and Science, Shanghai Ninth People' s Hospital, Shanghai Jiao Tong University College of Medicine, Shanghai, China 200011 (China); Lee, Philip K. M.; Chow, Ben [Hong Kong Dental Implant and Maxillofacial Centre, Hong Kong, China 999077 (China); Liu, Nancy X. [Department of Oral and Maxillofacial Surgery, Houston Methodist Hospital Research Institute, Houston, Texas 77030 and Department of Oral and Maxillofacial Surgery, Peking University School and Hospital of Stomatology, Beijing, China 100050 (China); Xia, James J. [Department of Oral and Maxillofacial Surgery, Houston Methodist Hospital Research Institute, Houston, Texas 77030 (United States); Department of Surgery (Oral and Maxillofacial Surgery), Weill Medical College, Cornell University, New York, New York 10065 (United States); Department of Oral and Craniomaxillofacial Surgery and Science, Shanghai Ninth People' s Hospital, Shanghai Jiao Tong University College of Medicine, Shanghai, China 200011 (China); Shen, Dinggang, E-mail: dgshen@med.unc.edu [Department of Radiology and BRIC, University of North Carolina at Chapel Hill, North Carolina 27599 and Department of Brain and Cognitive Engineering, Korea University, Seoul, 136701 (Korea, Republic of)

    2014-04-15

    Purpose: Cone-beam computed tomography (CBCT) is an increasingly utilized imaging modality for the diagnosis and treatment planning of the patients with craniomaxillofacial (CMF) deformities. Accurate segmentation of CBCT images is an essential step to generate three-dimensional (3D) models for the diagnosis and treatment planning of the patients with CMF deformities. However, due to the poor image quality, including very low signal-to-noise ratio and the widespread image artifacts such as noise, beam hardening, and inhomogeneity, it is challenging to segment the CBCT images. In this paper, the authors present a new automatic segmentation method to address these problems. Methods: To segment CBCT images, the authors propose a new method for fully automated CBCT segmentation by using patch-based sparse representation to (1) segment bony structures from the soft tissues and (2) further separate the mandible from the maxilla. Specifically, a region-specific registration strategy is first proposed to warp all the atlases to the current testing subject and then a sparse-based label propagation strategy is employed to estimate a patient-specific atlas from all aligned atlases. Finally, the patient-specific atlas is integrated into a maximum a posteriori probability-based convex segmentation framework for accurate segmentation. Results: The proposed method has been evaluated on a dataset with 15 CBCT images. The effectiveness of the proposed region-specific registration strategy and patient-specific atlas has been validated by comparing with the traditional registration strategy and population-based atlas. The experimental results show that the proposed method achieves the best segmentation accuracy by comparison with other state-of-the-art segmentation methods. Conclusions: The authors have proposed a new CBCT segmentation method by using patch-based sparse representation and convex optimization, which can achieve considerably accurate segmentation results in CBCT

  11. Task-Based Optimization of Computed Tomography Imaging Systems

    CERN Document Server

    Sanchez, Adrian A

    2015-01-01

    The goal of this thesis is to provide a framework for the use of task-based metrics of image quality to aid in the design, implementation, and evaluation of CT image reconstruction algorithms and CT systems in general. We support the view that task-based metrics of image quality can be useful in guiding the algorithm design and implementation process in order to yield images of objectively superior quality and higher utility for a given task. Further, we believe that metrics such as the Hotelling observer (HO) SNR can be used as summary scalar metrics of image quality for the evaluation of images produced by novel reconstruction algorithms. In this work, we aim to construct a concise and versatile formalism for image reconstruction algorithm design, implementation, and assessment. The bulk of the work focuses on linear analytical algorithms, specifically the ubiquitous filtered back-projection (FBP) algorithm. However, due to the demonstrated importance of optimization-based algorithms in a wide variety of CT...

  12. Joint optimization toward effective and efficient image search.

    Science.gov (United States)

    Wei, Shikui; Xu, Dong; Li, Xuelong; Zhao, Yao

    2013-12-01

    The bag-of-words (BoW) model has been known as an effective method for large-scale image search and indexing. Recent work shows that the performance of the model can be further improved by using the embedding method. While different variants of the BoW model and embedding method have been developed, less effort has been made to discover their underlying working mechanism. In this paper, we systematically investigate the image search performance variation with respect to a few factors of the BoW model, and study how to employ the embedding method to further improve the image search performance. Subsequently, we summarize several observations based on the experiments on descriptor matching. To validate these observations in a real image search, we propose an effective and efficient image search scheme, in which the BoW model and embedding method are jointly optimized in terms of effectiveness and efficiency by following these observations. Our comprehensive experiments demonstrate that it is beneficial to employ these observations to develop an image search algorithm, and the proposed image search scheme outperforms state-of-the-art methods in both effectiveness and efficiency.

  13. Chemical process dynamic optimization based on the differential evolution algorithm with an adaptive scheduling mutation strategy

    Science.gov (United States)

    Zhu, Jun; Yan, Xuefeng; Zhao, Weixiang

    2013-10-01

    To solve chemical process dynamic optimization problems, a differential evolution algorithm integrated with adaptive scheduling mutation strategy (ASDE) is proposed. According to the evolution feedback information, ASDE, with adaptive control parameters, adopts the round-robin scheduling algorithm to adaptively schedule different mutation strategies. By employing an adaptive mutation strategy and control parameters, the real-time optimal control parameters and mutation strategy are obtained to improve the optimization performance. The performance of ASDE is evaluated using a suite of 14 benchmark functions. The results demonstrate that ASDE performs better than four conventional differential evolution (DE) algorithm variants with different mutation strategies, and that the whole performance of ASDE is equivalent to a self-adaptive DE algorithm variant and better than five conventional DE algorithm variants. Furthermore, ASDE was applied to solve a typical dynamic optimization problem of a chemical process. The obtained results indicate that ASDE is a feasible and competitive optimizer for this kind of problem.
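
    The idea of scheduling mutation strategies from evolution feedback can be illustrated with a compact differential evolution sketch. The code below is not ASDE; it simply alternates between 'rand/1' and 'best/1' mutation, favouring whichever strategy has recently produced more successful trial vectors, with fixed F and CR and a made-up test function.

```python
import numpy as np

def adaptive_de(func, bounds, pop_size=30, iters=200, f=0.6, cr=0.9, seed=0):
    """Differential evolution that adaptively favours whichever mutation strategy
    ('rand/1' or 'best/1') has recently produced more successful trial vectors."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, (pop_size, dim))
    fit = np.array([func(x) for x in pop])
    success = np.ones(2)                              # one pseudo-success per strategy
    for _ in range(iters):
        best = pop[fit.argmin()]
        for i in range(pop_size):
            strat = rng.choice(2, p=success / success.sum())
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            base = a if strat == 0 else best          # rand/1 vs best/1 mutation
            mutant = np.clip(base + f * (b - c), lo, hi)
            cross = rng.random(dim) < cr
            cross[rng.integers(dim)] = True           # guarantee at least one mutated gene
            trial = np.where(cross, mutant, pop[i])
            tf = func(trial)
            if tf < fit[i]:                           # greedy selection, record the success
                pop[i], fit[i] = trial, tf
                success[strat] += 1
        success *= 0.9                                # gradually forget old successes
    return pop[fit.argmin()], fit.min()

if __name__ == "__main__":
    rosenbrock = lambda x: 100 * (x[1] - x[0] ** 2) ** 2 + (1 - x[0]) ** 2
    x, fx = adaptive_de(rosenbrock, bounds=[(-2, 2), (-2, 2)])
    print("best x:", np.round(x, 4), "f(x): %.3e" % fx)
```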

  14. Three-dimensional virtual surgery models for percutaneous coronary intervention (PCI) optimization strategies

    Science.gov (United States)

    Wang, Hujun; Liu, Jinghua; Zheng, Xu; Rong, Xiaohui; Zheng, Xuwei; Peng, Hongyu; Silber-Li, Zhanghua; Li, Mujun; Liu, Liyu

    2015-06-01

    Percutaneous coronary intervention (PCI), especially coronary stent implantation, has been shown to be an effective treatment for coronary artery disease. However, in-stent restenosis is one of the longstanding unsolvable problems following PCI. Although stents implanted inside narrowed vessels recover the normal flux of blood flow, they instantaneously change the wall shear stress (WSS) distribution on the vessel surface. Improper stent implantation positions bring high possibilities of restenosis, as they enlarge the low WSS regions and subsequently stimulate more epithelial cell outgrowth on vessel walls. To optimize the stent position for lowering the risk of restenosis, we successfully established a digital three-dimensional (3-D) model based on a real clinical coronary artery and analysed the optimal stenting strategies by computational simulation. Via microfabrication and 3-D printing technology, the digital model was also converted into in vitro microfluidic models with 3-D micro channels. Simultaneously, physicians placed real stents inside them; i.e., they performed “virtual surgeries”. The hydrodynamic experimental results showed that the microfluidic models closely matched the simulations. Therefore, our study not only demonstrated that the half-cross stenting strategy could maximally reduce restenosis risks but also indicated that 3-D printing combined with clinical image reconstruction is a promising method for future angiocardiopathy research.

  15. Data for evaluation of fast kurtosis strategies, b-value optimization and exploration of diffusion MRI contrast

    Science.gov (United States)

    Hansen, Brian; Jespersen, Sune Nørhøj

    2016-01-01

    Here we describe and provide diffusion magnetic resonance imaging (dMRI) data that was acquired in neural tissue and a physical phantom. Data acquired in biological tissue includes: fixed rat brain (acquired at 9.4 T) and spinal cord (acquired at 16.4 T) and in normal human brain (acquired at 3 T). This data was recently used for evaluation of diffusion kurtosis imaging (DKI) contrasts and for comparison to diffusion tensor imaging (DTI) parameter contrast. The data has also been used to optimize b-values for ex vivo and in vivo fast kurtosis imaging. The remaining data was obtained in a physical phantom with three orthogonal fiber orientations (fresh asparagus stems) for exploration of the kurtosis fractional anisotropy. However, the data may have broader interest and, collectively, may form the basis for image contrast exploration and simulations based on a wide range of dMRI analysis strategies. PMID:27576023

  17. Optimizing and extending light-sculpting microscopy for fast functional imaging in neuroscience

    CERN Document Server

    Rupprecht, Peter; Groessl, Florian; Haubensak, Wulf E; Vaziri, Alipasha

    2015-01-01

    A number of questions in systems biology, such as understanding how the dynamics of neuronal networks are related to brain function, require the ability to capture the functional dynamics of large cellular populations at high speed. Recently, this has driven the development of a number of parallel and high speed imaging techniques such as light-sculpting microscopy, which has been used to capture neuronal dynamics at the whole brain and single cell level in small model organisms. However, the broader applicability of light-sculpting microscopy is limited by the size of volumes for which high speed imaging can be obtained and by scattering in brain tissue. Here, we present strategies for optimizing the present tradeoffs in light-sculpting microscopy. Various scanning modalities in light-sculpting microscopy are theoretically and experimentally evaluated, and strategies to maximize the obtainable volume speeds and depth penetration in brain tissue using different laser systems are provided. Design-choices, important par...

  18. An embedded still image coder with rate-distortion optimization.

    Science.gov (United States)

    Li, J; Lei, S

    1999-01-01

    It is well known that the fixed rate coder achieves optimality when all coefficients are coded with the same rate-distortion (R-D) slope. In this paper, we show that the performance of the embedded coder can be optimized in a rate-distortion sense by coding the coefficients with decreasing R-D slope. We denote such a coding strategy as rate-distortion optimized embedding (RDE). RDE allocates the available coding bits first to the coefficient with the steepest R-D slope, i.e., the largest distortion decrease per coding bit. The resultant coding bitstream can be truncated at any point and still maintain an optimal R-D performance. To avoid the overhead of coding order transmission, we use the expected R-D slope, which can be calculated from the coded bits and is available in both the encoder and the decoder. With the probability estimation table of the QM-coder, the calculation of the R-D slope can be just a lookup table operation. Experimental results show that the rate-distortion optimization significantly improves the coding efficiency in a wide bit rate range.
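
    The ordering principle behind RDE can be illustrated with a toy allocation sketch: coding units are processed in decreasing order of rate-distortion slope, and the resulting stream can be truncated at any bit budget. The unit list and its rate/distortion numbers below are invented, and no probability estimation or QM-coding is modelled.

```python
def rd_optimized_order(units):
    """Sort coding units by decreasing R-D slope (distortion decrease per coded bit)."""
    slopes = [(d / r, name) for name, r, d in units]
    return [name for _, name in sorted(slopes, reverse=True)]

def truncate(units, order, bit_budget):
    """Code units in R-D order until the bit budget is exhausted."""
    lookup = {name: (r, d) for name, r, d in units}
    spent, gained, coded = 0.0, 0.0, []
    for name in order:
        r, d = lookup[name]
        if spent + r > bit_budget:
            break
        spent, gained = spent + r, gained + d
        coded.append(name)
    return coded, spent, gained

# Hypothetical coding units: (name, bits needed, distortion reduction).
units = [("c0-msb", 4, 400.0), ("c1-msb", 4, 250.0), ("c0-bit2", 4, 90.0), ("c2-msb", 6, 300.0)]
order = rd_optimized_order(units)
print("coding order:", order)
print("truncated at 12 bits:", truncate(units, order, 12))
```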

  19. Supply Chain Optimized Strategies in the Mode of External Financing

    Institute of Scientific and Technical Information of China (English)

    Wenyi; DU; Xingzheng; AI; Xiaowo; TANG

    2015-01-01

    Under the circumstance that market demand is uncertain, this paper studies the decision-making problem of a supply chain financial system consisting of a single supplier, a capital-constrained retailer, and a bank. Considering the mode of external financing, we obtain the optimal order decision of the capital-constrained retailer, the optimal financing rate, and the optimal wholesale price of the supplier, and analyze the effects of the retailer's own capital on the optimized decision-making of the supply chain financial system. Finally, the effectiveness of the conclusions is demonstrated by numerical examples.

  20. SAR image autofocus by sharpness optimization: a theoretical study.

    Science.gov (United States)

    Morrison, Robert L; Do, Minh N; Munson, David C

    2007-09-01

    Synthetic aperture radar (SAR) autofocus techniques that optimize sharpness metrics can produce excellent restorations in comparison with conventional autofocus approaches. To help formalize the understanding of metric-based SAR autofocus methods, and to gain more insight into their performance, we present a theoretical analysis of these techniques using simple image models. Specifically, we consider the intensity-squared metric, and a dominant point-targets image model, and derive expressions for the resulting objective function. We examine the conditions under which the perfectly focused image models correspond to stationary points of the objective function. A key contribution is that we demonstrate formally, for the specific case of intensity-squared minimization autofocus, the mechanism by which metric-based methods utilize the multichannel defocusing model of SAR autofocus to enforce the stationary point property for multiple image columns. Furthermore, our analysis shows that the objective function has a special separable property through which it can be well approximated locally by a sum of 1-D functions of each phase error component. This allows fast performance through solving a sequence of 1-D optimization problems for each phase component simultaneously. Simulation results using the proposed models and actual SAR imagery confirm that the analysis extends well to realistic situations.

  1. DETECTION OF MASSES IN MAMMOGRAM IMAGES USING ANT COLONY OPTIMIZATION

    Directory of Open Access Journals (Sweden)

    Varsha Patankar

    2014-04-01

    Full Text Available This paper presents advances in edge detection techniques applied to mammogram images for cancer diagnosis. It compares the evaluation of existing edge detectors with the proposed ant colony optimization (ACO) method. Edge detection is applied to mammogram images because it clearly identifies the masses in them, which helps to identify the type of cancer at an early stage. The ACO edge detector is the best at detecting edges when compared to the other edge detectors. The quality of the various edge detectors is assessed using parameters such as the Peak Signal-to-Noise Ratio (PSNR) and the Mean Square Error (MSE).
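
    For reference, the two quality measures named above can be computed as follows; the arrays are synthetic stand-ins for a reference image and a processed result.

```python
import numpy as np

def mse(a, b):
    """Mean squared error between two images of equal shape."""
    return float(np.mean((a.astype(float) - b.astype(float)) ** 2))

def psnr(a, b, peak=255.0):
    """Peak signal-to-noise ratio in dB (infinite for identical images)."""
    m = mse(a, b)
    return float("inf") if m == 0 else 10.0 * np.log10(peak ** 2 / m)

rng = np.random.default_rng(0)
reference = rng.integers(0, 256, (128, 128))
noisy = np.clip(reference + rng.normal(0, 10, reference.shape), 0, 255)
print("MSE: %.2f  PSNR: %.2f dB" % (mse(reference, noisy), psnr(reference, noisy)))
```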

  2. The optimal algorithm for Multi-source RS image fusion.

    Science.gov (United States)

    Fu, Wei; Huang, Shui-Guang; Li, Zeng-Shun; Shen, Hao; Li, Jun-Shuai; Wang, Peng-Yuan

    2016-01-01

    In order to address the issue that available fusion methods cannot self-adaptively adjust the fusion rules according to the subsequent processing requirements of Remote Sensing (RS) images, this paper puts forward GSDA (genetic-iterative self-organizing data analysis algorithm), which integrates the merits of the genetic algorithm with those of the iterative self-organizing data analysis algorithm for multi-source RS image fusion. The proposed algorithm takes the translation-invariant wavelet transform as the model operator and the contrast pyramid transform as the observation operator. The algorithm then designs the objective function as a weighted sum of evaluation indices and optimizes the objective function by employing GSDA so as to obtain a higher-resolution RS image. The bullet points of the text are summarized as follows. • The contribution proposes the iterative self-organizing data analysis algorithm for multi-source RS image fusion. • This article presents the GSDA algorithm for the self-adaptive adjustment of the fusion rules. • This text puts forward the model operator and the observation operator as the fusion scheme for RS images based on GSDA. The proposed algorithm opens up a novel algorithmic pathway for multi-source RS image fusion by means of GSDA.

  3. Optimization-Based Image Segmentation by Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    H. Laurent

    2008-05-01

    Full Text Available Many works in the literature focus on the definition of evaluation metrics and criteria that make it possible to quantify the performance of an image processing algorithm. These evaluation criteria can be used to define new image processing algorithms by optimizing them. In this paper, we propose a general scheme to segment images by a genetic algorithm. The developed method uses an evaluation criterion which quantifies the quality of an image segmentation result. The proposed segmentation method can integrate a local ground truth, when it is available, in order to set the desired level of precision of the final result. A genetic algorithm is then used in order to determine the best combination of information extracted by the selected criterion. We then show that this approach can be applied either to gray-level or to multicomponent images, in a supervised context or in an unsupervised one. Finally, we show the efficiency of the proposed method through experimental results on several gray-level and multicomponent images.

  4. Optimization-Based Image Segmentation by Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Rosenberger C

    2008-01-01

    Full Text Available Many works in the literature focus on the definition of evaluation metrics and criteria that make it possible to quantify the performance of an image processing algorithm. These evaluation criteria can be used to define new image processing algorithms by optimizing them. In this paper, we propose a general scheme to segment images with a genetic algorithm. The developed method uses an evaluation criterion which quantifies the quality of an image segmentation result. The proposed segmentation method can integrate a local ground truth, when it is available, in order to set the desired level of precision of the final result. A genetic algorithm is then used to determine the best combination of the information extracted by the selected criterion. We then show that this approach can be applied to gray-level or multicomponent images, in either a supervised or an unsupervised context. Finally, we show the efficiency of the proposed method through experimental results on several gray-level and multicomponent images.

  5. To denoise or deblur: parameter optimization for imaging systems

    Science.gov (United States)

    Mitra, Kaushik; Cossairt, Oliver; Veeraraghavan, Ashok

    2014-03-01

    In recent years smartphone cameras have improved considerably, but they still produce very noisy images in low-light conditions, mainly because of their small sensor size. Image quality can be improved by increasing the aperture size and/or exposure time; however, this makes the images susceptible to defocus and/or motion blur. In this paper, we analyze the trade-off between denoising and deblurring as a function of the illumination level. For this purpose we utilize a recently introduced framework for the analysis of computational imaging systems that takes into account the effect of (1) optical multiplexing, (2) the noise characteristics of the sensor, and (3) the reconstruction algorithm, which typically uses image priors. Following this framework, we model the image prior using a Gaussian Mixture Model (GMM), which allows us to analytically compute the Minimum Mean Squared Error (MMSE). We analyze the specific problem of motion and defocus deblurring, showing how to find the optimal exposure time and aperture setting as a function of illumination level. This framework gives us the machinery to answer an open question in computational imaging: to deblur or to denoise?
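
    A toy illustration of the exposure-time trade-off described above (not the paper's GMM/MMSE framework): longer exposure collects more photons and lowers photon noise, but increases motion blur. All constants and the error proxy below are assumptions made for the sake of the example:

```python
import numpy as np

def expected_rms_error(exposure_s: float, illumination: float,
                       motion_px_per_s: float = 40.0, read_noise: float = 5.0) -> float:
    """Crude error proxy: relative shot/read-noise term plus a motion-blur term."""
    photons = illumination * exposure_s                # signal grows with exposure
    noise_term = np.sqrt(photons + read_noise ** 2) / max(photons, 1e-9)
    blur_term = 0.01 * motion_px_per_s * exposure_s    # blur length grows with exposure
    return noise_term + blur_term

# Sweep exposure times for two illumination levels (photons per second, assumed):
exposures = np.linspace(0.001, 0.5, 500)
for illumination in (200.0, 5000.0):
    errors = [expected_rms_error(t, illumination) for t in exposures]
    best = exposures[int(np.argmin(errors))]
    print(f"illumination={illumination:6.0f} -> best exposure is roughly {best * 1000:.0f} ms")
```

    As expected, the lower the illumination, the longer the error-minimizing exposure, which is the qualitative trade-off the record above analyzes rigorously.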

  6. Object Detection In Image Using Particle Swarm Optimization

    Directory of Open Access Journals (Sweden)

    Nirbhowjap Singh

    2010-12-01

    Full Text Available Image matching is a key component in almost any image analysis process and is crucial to a wide range of applications, such as navigation, guidance, automatic surveillance, robot vision, and the mapping sciences. Any automated system for three-dimensional point positioning must include a potent procedure for image matching. Most biological vision systems have the ability to cope with a changing world, and computer vision systems have developed in the same way: the ability to cope with moving and changing objects, changing illumination, and changing viewpoints is essential to perform several tasks. Object detection is necessary for surveillance applications, for guidance of autonomous vehicles, for efficient video compression, for smart tracking of moving objects, for automatic target recognition (ATR) systems, and for many other applications. Cross-correlation and related techniques have dominated the field since the early fifties. Conventional template matching based on cross-correlation requires complex calculation and a large amount of time for object detection, which makes it difficult to use in real-time applications. The shortcomings of this class of image matching methods have caused a slow-down in the development of operational automated correlation systems. In the proposed work, an algorithm based on particle swarm optimization and its variants is used for detecting an object in an image. This implementation reduces the time required for object detection compared with the conventional template matching algorithm: the object is detected in fewer iterations, and hence in less time and with less energy, than the complexity of conventional template matching would require. This feature makes the method suitable for real-time implementation. A study of the particle swarm optimization algorithm is carried out, and the formulation of the algorithm for object detection using PSO and its variants is implemented to validate its effectiveness.
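
    A minimal sketch of the idea, assuming a simple sum-of-squared-differences match score rather than any specific PSO variant from the record above; the particle-update constants are typical textbook values and the test data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)

def match_cost(image, template, x, y):
    """Sum of squared differences between the template and the image patch at (x, y)."""
    h, w = template.shape
    patch = image[y:y + h, x:x + w]
    return float(np.sum((patch - template) ** 2))

def pso_detect(image, template, n_particles=40, iters=80, w=0.7, c1=1.5, c2=1.5):
    """Particle swarm search over top-left template positions."""
    h, w_t = template.shape
    x_max, y_max = image.shape[1] - w_t, image.shape[0] - h
    pos = rng.uniform(0, [x_max, y_max], size=(n_particles, 2))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_cost = np.array([match_cost(image, template, int(x), int(y)) for x, y in pos])
    gbest = pbest[np.argmin(pbest_cost)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, 0, [x_max, y_max])
        cost = np.array([match_cost(image, template, int(x), int(y)) for x, y in pos])
        improved = cost < pbest_cost
        pbest[improved], pbest_cost[improved] = pos[improved], cost[improved]
        gbest = pbest[np.argmin(pbest_cost)].copy()
    return int(gbest[0]), int(gbest[1])

# Illustrative usage: plant a bright template in a noisy image and recover its location.
image = rng.integers(0, 50, size=(80, 80)).astype(float)
template = rng.integers(150, 255, size=(20, 20)).astype(float)
image[30:50, 50:70] = template
print("detected top-left (x, y):", pso_detect(image, template))  # expected at or near (50, 30)
```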

  7. Offshore Wind Farm Layout Design Considering Optimized Power Dispatch Strategy

    DEFF Research Database (Denmark)

    Hou, Peng; Hu, Weihao; N. Soltani, Mohsen

    2017-01-01

    leading to energy losses. It is expected that the optimized placement of wind turbines (WT) over a large sea area can lead to the best tradeoff between energy yields and capital investment. This paper proposes a novel way to position offshore WTs for a regular shaped wind farm. In addition to optimizing...

  8. Educational Tool for Optimal Controller Tuning Using Evolutionary Strategies

    Science.gov (United States)

    Carmona Morales, D.; Jimenez-Hornero, J. E.; Vazquez, F.; Morilla, F.

    2012-01-01

    In this paper, an optimal tuning tool is presented for control structures based on multivariable proportional-integral-derivative (PID) control, using genetic algorithms as an alternative to traditional optimization algorithms. From an educational point of view, this tool provides students with the necessary means to consolidate their knowledge on…

  9. A trust-region strategy for manifold-mapping optimization

    NARCIS (Netherlands)

    Hemker, P.W.; Echeverria, D.

    2007-01-01

    Studying the space-mapping technique by Bandler et al. [J. Bandler, R. Biernacki, S. Chen, P. Grobelny, R.H. Hemmers, Space mapping technique for electromagnetic optimization, IEEE Trans. Microwave Theory Tech. 42 (1994) 2536–2544] for the solution of optimization problems, we observe the possible d

  10. A trust-region strategy for manifold mapping optimization.

    NARCIS (Netherlands)

    Hemker, P.W.; Echeverria, D.

    2006-01-01

    As a starting point we take the space-mapping iteration technique by Bandler et al. for the efficient solution of optimization problems. This technique achieves acceleration of accurate design processes with the help of simpler, easier to optimize models. We observe the difference between the soluti

  11. Model-based Optimization of Oil Recovery: Robust Operational Strategies

    NARCIS (Netherlands)

    Van Essen, G.M.

    2015-01-01

    The process of depleting an oil reservoir can be cast as an optimal control problem with the objective of maximizing economic performance over the life of the field. Despite its large potential, life-cycle optimization has not yet found its way into operational environments. The objective of this t

  12. Global Optimization strategies for two-mode clustering

    NARCIS (Netherlands)

    J.M. van Rosmalen (Joost); P.J.F. Groenen (Patrick); J. Trejos (Javier); W. Castilli

    2005-01-01

    Two-mode clustering is a relatively new form of clustering that clusters both rows and columns of a data matrix. To do so, a criterion similar to k-means is optimized. However, it is still unclear which optimization method should be used to perform two-mode clustering, as various meth

  13. A Particle Swarm Optimization with Adaptive Multi-Swarm Strategy for Capacitated Vehicle Routing Problem

    Directory of Open Access Journals (Sweden)

    Kui-Ting CHEN

    2015-12-01

    Full Text Available Capacitated vehicle routing problem with pickups and deliveries (CVRPPD) is one of the most challenging combinatorial optimization problems; it includes goods delivery/pickup optimization, vehicle number optimization, routing path optimization and transportation cost minimization. Conventional particle swarm optimization (PSO) has difficulty finding an optimal solution of the CVRPPD due to its simple search strategy. A PSO with an adaptive multi-swarm strategy (AMSPSO) is proposed to solve the CVRPPD in this paper. The proposed AMSPSO employs multiple PSO algorithms and an adaptive algorithm with a punishment mechanism to search for the optimal solution, which can deal with large-scale optimization problems. The simulation results show that the proposed AMSPSO can solve the CVRPPD with the fewest vehicles and lower transportation cost simultaneously.

  14. Optimal Strategy for Inspection and Repair of Structural Systems

    DEFF Research Database (Denmark)

    Thoft-Christensen, Palle; Sørensen, John Dalsgaard

    1987-01-01

    A new strategy for inspection and repair of structural elements and systems is presented. The total cost of inspection and repair is minimized with the constraints that the reliability of elements and/or of the structural system are acceptable. The design variables are the time intervals between...... inspections and the quality of the inspections. Numerical examples are presented to illustrate the performance of the strategy. The strategy can be used for any engineering system where inspection and repair are required....

  15. Body Image as Strategy for Engagement in Social Media

    Directory of Open Access Journals (Sweden)

    Tarcisio Torres Silva

    2015-06-01

    This work intends to analyze not only how communication technologies have contributed to the emergence of such events but also how image production can be interpreted in such environments. Since the use of social media in protests caught the attention of broadcasting media in 2009 during demonstrations in Iran, a strong connection can be noticed between the content circulating through digital communication technologies and the body. For images produced during the Arab Spring, the same is observed with a series of strategies connecting body image and social mobilization. Our intention is to contribute to the debate of political images, considering the way they have been produced in contemporary society, which deals with a complex environment composed of communication technologies, social organization, and the body itself.

  16. Optimization of super-resolution processing using incomplete image sets in PET imaging.

    Science.gov (United States)

    Chang, Guoping; Pan, Tinsu; Clark, John W; Mawlawi, Osama R

    2008-12-01

    Super-resolution (SR) techniques are used in PET imaging to generate a high-resolution image by combining multiple low-resolution images that have been acquired from different points of view (POVs). The number of low-resolution images used defines the processing time and memory storage necessary to generate the SR image. In this paper, the authors propose two optimized SR implementations (ISR-1 and ISR-2) that require only a subset of the low-resolution images (two sides and diagonal of the image matrix, respectively), thereby reducing the overall processing time and memory storage. In an N x N matrix of low-resolution images, ISR-1 would be generated using images from the two sides of the N x N matrix, while ISR-2 would be generated from images across the diagonal of the image matrix. The objective of this paper is to investigate whether the two proposed SR methods can achieve similar performance in contrast and signal-to-noise ratio (SNR) as the SR image generated from a complete set of low-resolution images (CSR) using simulation and experimental studies. A simulation, a point source, and a NEMA/IEC phantom study were conducted for this investigation. In each study, 4 (2 x 2) or 16 (4 x 4) low-resolution images were reconstructed from the same acquired data set while shifting the reconstruction grid to generate images from different POVs. SR processing was then applied in each study to combine all as well as two different subsets of the low-resolution images to generate the CSR, ISR-1, and ISR-2 images, respectively. For reference purpose, a native reconstruction (NR) image using the same matrix size as the three SR images was also generated. The resultant images (CSR, ISR-1, ISR-2, and NR) were then analyzed using visual inspection, line profiles, SNR plots, and background noise spectra. The simulation study showed that the contrast and the SNR difference between the two ISR images and the CSR image were on average 0.4% and 0.3%, respectively. Line profiles of
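
    A minimal sketch of the underlying super-resolution idea (interleaving low-resolution images reconstructed at shifted grid positions, i.e. different points of view, into one higher-resolution grid). This is a generic shift-and-add illustration with an assumed fallback for missing points of view, not the paper's PET-specific implementation:

```python
import numpy as np

def shift_and_add(lr_images, factor=2):
    """Interleave low-resolution images, keyed by their (dy, dx) sub-pixel shift,
    into a single high-resolution grid."""
    any_img = next(iter(lr_images.values()))
    hr = np.zeros((any_img.shape[0] * factor, any_img.shape[1] * factor))
    counts = np.zeros_like(hr)
    for (dy, dx), img in lr_images.items():
        hr[dy::factor, dx::factor] += img
        counts[dy::factor, dx::factor] += 1
    # Where a point of view is missing (the "incomplete set" case), fall back to
    # the mean of the grid sites that were filled.
    missing = counts == 0
    hr[~missing] /= counts[~missing]
    if missing.any():
        hr[missing] = hr[~missing].mean()
    return hr

# Illustrative 2 x 2 case: the complete set uses all four shifts,
# an incomplete subset (e.g. the diagonal of the matrix of shifts) uses only two.
rng = np.random.default_rng(3)
truth = rng.random((64, 64))
complete = {(dy, dx): truth[dy::2, dx::2] for dy in (0, 1) for dx in (0, 1)}
diagonal_subset = {k: complete[k] for k in [(0, 0), (1, 1)]}
print("complete-set max error:", float(np.abs(shift_and_add(complete) - truth).max()))
print("diagonal-subset mean error:", round(float(np.mean(np.abs(shift_and_add(diagonal_subset) - truth))), 3))
```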

  17. Open- and closed-loop multiobjective optimal strategies for HIV therapy using NSGA-II.

    Science.gov (United States)

    Heris, S Mostapha Kalami; Khaloozadeh, Hamid

    2011-06-01

    In this paper, multiobjective open- and closed-loop optimal treatment strategies for HIV/AIDS are presented. It is assumed that highly active antiretroviral therapy is available for the treatment of HIV infection. The amount of drug usage and the quality of treatment are defined as the two objectives of a biobjective optimization problem, and the Nondominated Sorting Genetic Algorithm II is used to solve this problem. Open- and closed-loop control strategies are used to produce optimal control inputs, and the Pareto frontiers obtained from these two strategies are compared. The Pareto frontier resulting from the optimization process suggests a set of treatment strategies, each of which is optimal from some perspective and can be used under different medical and economic conditions. The robustness of the closed-loop system in the presence of measurement noise is analyzed, assuming various levels of noise.
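
    A minimal sketch of the Pareto-front extraction at the heart of such biobjective formulations (drug usage versus treatment quality, both expressed as quantities to be minimized here purely for illustration); the candidate strategies and objective values are synthetic:

```python
import numpy as np

def pareto_front(objectives: np.ndarray) -> np.ndarray:
    """Return a boolean mask of non-dominated rows, assuming all objectives are minimized."""
    n = objectives.shape[0]
    nondominated = np.ones(n, dtype=bool)
    for i in range(n):
        if not nondominated[i]:
            continue
        # A point is dominated if another point is no worse in every objective
        # and strictly better in at least one.
        dominators = np.all(objectives <= objectives[i], axis=1) & \
                     np.any(objectives < objectives[i], axis=1)
        if dominators.any():
            nondominated[i] = False
    return nondominated

# Synthetic candidate strategies: column 0 = total drug usage, column 1 = 1 / treatment quality.
rng = np.random.default_rng(4)
candidates = rng.random((200, 2))
mask = pareto_front(candidates)
print(f"{mask.sum()} non-dominated strategies out of {len(candidates)}")
```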

  18. Pupil-phase optimization for extended-focus, aberration-corrected imaging systems

    Science.gov (United States)

    Prasad, Sudhakar; Pauca, V. Paul; Plemmons, Robert J.; Torgersen, Todd C.; van der Gracht, Joseph

    2004-10-01

    The insertion of a suitably designed phase plate in the pupil of an imaging system makes it possible to encode the depth dimension of an extended three-dimensional scene by means of an approximately shift-invariant PSF. The so-encoded image can then be deblurred digitally by standard image recovery algorithms to recoup the depth dependent detail of the original scene. A similar strategy can be adopted to compensate for certain monochromatic aberrations of the system. Here we consider two approaches to optimizing the design of the phase plate that are somewhat complementary - one based on Fisher information that attempts to reduce the sensitivity of the phase encoded image to misfocus and the other based on a minimax formulation of the sum of singular values of the system blurring matrix that attempts to maximize the resolution in the final image. Comparisons of these two optimization approaches are discussed. Our preliminary demonstration of the use of such pupil-phase engineering to successfully control system aberrations, particularly spherical aberration, is also presented.

  19. Optimal operation strategies of compressed air energy storage (CAES) on electricity spot markets with fluctuating prices

    DEFF Research Database (Denmark)

    Lund, Henrik; Salgi, Georges; Elmegaard, Brian;

    2009-01-01

    on electricity spot markets by storing energy when electricity prices are low and producing electricity when prices are high. In order to make a profit on such markets, CAES plant operators have to identify proper strategies to decide when to sell and when to buy electricity. This paper describes three...... plants will not be able to achieve such optimal operation, since the fluctuations of spot market prices in the coming hours and days are not known. Consequently, two simple practical strategies have been identified and compared to the results of the optimal strategy. This comparison shows that...... independent computer-based methodologies which may be used for identifying the optimal operation strategy for a given CAES plant, on a given spot market and in a given year. The optimal strategy is identified as the one which provides the best business-economic net earnings for the plant. In practice, CAES...
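
    A minimal sketch of the kind of simple practical strategy mentioned above: charge when the spot price falls below a low threshold, discharge when it rises above a high threshold, and grid-search the two thresholds. All plant parameters and the synthetic price series are assumptions made for illustration, not data from the paper:

```python
import numpy as np

def threshold_strategy_earnings(prices, low, high, capacity_mwh=300.0,
                                charge_mw=50.0, discharge_mw=60.0, efficiency=0.75):
    """Simulate hourly buy-low / sell-high operation and return net earnings."""
    stored, earnings = 0.0, 0.0
    for price in prices:
        if price <= low and stored < capacity_mwh:
            bought = min(charge_mw, capacity_mwh - stored)
            stored += bought
            earnings -= bought * price
        elif price >= high and stored > 0.0:
            sold = min(discharge_mw, stored)
            stored -= sold
            earnings += sold * price * efficiency  # round-trip losses charged on the sale
    return earnings

# Synthetic spot prices with a daily pattern plus noise (currency units per MWh, assumed):
rng = np.random.default_rng(5)
hours = np.arange(24 * 365)
prices = 40 + 15 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 5, hours.size)

# Crude grid search over the two thresholds of the practical strategy:
best = max((threshold_strategy_earnings(prices, lo, hi), lo, hi)
           for lo in range(20, 45, 5) for hi in range(45, 70, 5))
print(f"best earnings = {best[0]:,.0f} with buy below {best[1]} and sell above {best[2]}")
```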

  20. Optimizing the stirring strategy for the vibrating intrinsic reverberation chamber

    NARCIS (Netherlands)

    Serra, Ramiro; Leferink, Frank

    2010-01-01

    This work describes the definition, application and assessment of a factorial plan with the aim of gaining insight on what kind of stirring strategy could work the best in a vibrating intrinsic reverberation chamber. Three different stirring strategies were defined as factors of a factorial analysis

  1. A Cascade Optimization Strategy for Solution of Difficult Multidisciplinary Design Problems

    Science.gov (United States)

    Patnaik, Surya N.; Coroneos, Rula M.; Hopkins, Dale A.; Berke, Laszlo

    1996-01-01

    A research project to comparatively evaluate 10 nonlinear optimization algorithms was recently completed. A conclusion was that no single optimizer could successfully solve all 40 problems in the test bed, even though most optimizers successfully solved at least one-third of the problems. We realized that improved search directions and step lengths, available in the 10 optimizers compared, were not likely to alleviate the convergence difficulties. For the solution of those difficult problems we have devised an alternative approach called cascade optimization strategy. The cascade strategy uses several optimizers, one followed by another in a specified sequence, to solve a problem. A pseudorandom scheme perturbs design variables between the optimizers. The cascade strategy has been tested successfully in the design of supersonic and subsonic aircraft configurations and air-breathing engines for high-speed civil transport applications. These problems could not be successfully solved by an individual optimizer. The cascade optimization strategy, however, generated feasible optimum solutions for both aircraft and engine problems. This paper presents the cascade strategy and solutions to a number of these problems.
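
    A minimal sketch of the cascade idea using off-the-shelf SciPy optimizers run one after another, with a pseudorandom perturbation of the design variables between stages; the test function, method sequence and perturbation scale are assumptions, not the NASA implementation:

```python
import numpy as np
from scipy.optimize import minimize

def rosenbrock(x: np.ndarray) -> float:
    """A standard hard-to-optimize test function standing in for a design problem."""
    return float(np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2))

def cascade_optimize(fun, x0, methods=("Nelder-Mead", "BFGS", "Powell"),
                     perturbation=0.05, seed=0):
    """Run several optimizers in sequence, perturbing the design variables between
    stages so each optimizer starts from a slightly shifted point."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    for method in methods:
        result = minimize(fun, x, method=method)
        x = result.x + rng.normal(0.0, perturbation, size=x.size)
    # One final polish starting from the last (unperturbed) solution.
    return minimize(fun, result.x, method=methods[-1])

best = cascade_optimize(rosenbrock, x0=np.full(6, -1.5))
print("cascade solution:", np.round(best.x, 3), "objective:", round(float(best.fun), 6))
```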

  2. An Optimized Method for Terrain Reconstruction Based on Descent Images

    Directory of Open Access Journals (Sweden)

    Xu Xinchao

    2016-02-01

    Full Text Available An optimization method is proposed to perform high-accuracy terrain reconstruction of the landing area of Chang’e III. First, feature matching is conducted using geometric model constraints. The initial terrain is then obtained, and the initial normal vector of each point is solved on the basis of this terrain. By perturbing the initial normal vector in small steps, a set of new candidate vectors is obtained. Combining these vectors with the directions of the light source and the camera, functions are set up on the basis of a surface reflection model, and a series of gray values is derived by solving the resulting equations. The new optimized vector is recorded when the obtained gray value is closest to the corresponding pixel. Finally, the optimized terrain is obtained after iteration of the vector field. Experiments were conducted using laboratory images and descent images of Chang’e III. The results showed that the performance of the proposed method was better than that of the classical feature matching method. It can provide a reference for terrain reconstruction of the landing area in subsequent moon exploration missions.

  3. Design of Optimal Quincunx Filter Banks for Image Coding

    Directory of Open Access Journals (Sweden)

    Wu-Sheng Lu

    2007-01-01

    Full Text Available Two new optimization-based methods are proposed for the design of high-performance quincunx filter banks for the application of image coding. These new techniques are used to build linear-phase finite-length-impulse-response (FIR) perfect-reconstruction (PR) systems with high coding gain, good frequency selectivity, and certain prescribed vanishing-moment properties. A parametrization of quincunx filter banks based on the lifting framework is employed to structurally impose the PR and linear-phase conditions. Then, the coding gain is maximized subject to a set of constraints on vanishing moments and frequency selectivity. Examples of filter banks designed using the newly proposed methods are presented and shown to be highly effective for image coding. In particular, our new optimal designs are shown to outperform three previously proposed quincunx filter banks in 72% to 95% of our experimental test cases. Moreover, in some limited cases, our optimal designs are even able to outperform the well-known (separable) 9/7 filter bank (from the JPEG-2000 standard).

  4. Stochastic optimal phase retrieval algorithm for high-contrast imaging

    Science.gov (United States)

    Give'on, Amir; Kasdin, N. Jeremy; Vanderbei, Robert J.; Spergel, David N.; Littman, Michael G.; Gurfil, Pini

    2003-12-01

    The Princeton University Terrestrial Planet Finder (TPF) has been working on a novel method for direct imaging of extra solar planets using a shaped-pupil coronagraph. The entrance pupil of the coronagraph is optimized to have a point spread function (PSF) that provides the suppression level needed at the angular separation required for detection of extra solar planets. When integration time is to be minimized, the photon count at the planet location in the image plane is a Poisson distributed random process. The ultimate limitation of these high-dynamic-range imaging systems comes from scattering due to imperfections in the optical surfaces of the collecting system. The first step in correcting the wavefront errors is the estimation of the phase aberrations. The phase aberration caused by these imperfections is assumed to be a sum of two-dimensional sinusoidal functions. Its parameters are estimated using a global search with a genetic algorithm and a local optimization with the BFGS quasi-Newton method with a mixed quadratic and cubic line search procedure.

  5. Optimal strategy for controlling the spread of Plasmodium Knowlesi malaria: Treatment and culling

    Science.gov (United States)

    Abdullahi, Mohammed Baba; Hasan, Yahya Abu; Abdullah, Farah Aini

    2015-05-01

    Plasmodium knowlesi malaria is a parasitic mosquito-borne disease caused by the eukaryotic protist Plasmodium knowlesi, transmitted by the mosquito Anopheles leucosphyrus to humans and macaques. We developed and analyzed a deterministic mathematical model for the transmission of Plasmodium knowlesi malaria in humans and macaques. Optimal control theory is applied to investigate optimal strategies for controlling the spread of Plasmodium knowlesi malaria using treatment and culling as control strategies. The conditions for optimal control of Plasmodium knowlesi malaria are derived using Pontryagin's Maximum Principle. Finally, numerical simulations suggest that a combination of the control strategies is the best way to control the disease in any community.

  6. On Optimality of the Barrier Strategy for the Classical Risk Model with Interest

    Institute of Scientific and Technical Information of China (English)

    Ying Fang; Rong Wu

    2011-01-01

    In this paper, we consider the optimal dividend problem for a classical risk model with a constant force of interest. For such a risk model, a sufficient condition under which a barrier strategy is the optimal strategy is presented for general claim distributions. When claim sizes are exponentially distributed, it is shown that the optimal dividend policy is a barrier strategy and the maximal dividend-value function is a concave function. Finally, some known results relating to the distribution of aggregate dividends before ruin are extended.

  7. Immune clonal selection optimization method with combining mutation strategies

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    In artificial immune optimization algorithms, the mutation of immune cells has been considered the key operator that determines algorithm performance. Traditional immune optimization algorithms have used a single mutation operator, typically a Gaussian. Using a variety of mutation operators that can be combined during evolution to generate different probability density functions could hold the potential for producing better solutions with less computational effort. In view of this, a linear combination...

  8. Targeting Strategies for Multifunctional Nanoparticles in Cancer Imaging and Therapy

    Directory of Open Access Journals (Sweden)

    Mi Kyung Yu, Jinho Park, Sangyong Jon

    2012-01-01

    Full Text Available Nanomaterials offer new opportunities for cancer diagnosis and treatment. Multifunctional nanoparticles harboring various functions, including targeting, imaging and therapy, have been intensively studied with the aim of overcoming limitations associated with conventional cancer diagnosis and therapy. Among the various nanoparticles, magnetic iron oxide nanoparticles with superparamagnetic properties have shown potential as multifunctional nanoparticles for clinical translation because they have been used as magnetic resonance imaging (MRI) contrast agents in the clinic and their features can easily be tailored by including targeting moieties, fluorescent dyes, or therapeutic agents. This review summarizes targeting strategies for the construction of multifunctional nanoparticles, including magnetic nanoparticle-based theranostic systems, and the various surface engineering strategies of nanoparticles for in vivo applications.

  9. Optimizing brain tumor resection. Midfield interventional MR imaging.

    Science.gov (United States)

    Alexander, E

    2001-11-01

    The development of the intraoperative MR imager represents an important example of creative vision and interdisciplinary teamwork. The result is a remarkable tool for neurosurgical applications. MRT allows surgical manipulation under direct visualization of the intracranial contents through the eye of the surgeon and through the volumetric images of the MR imaging system. This technology can be applied to cranial and spinal cases, and foreseeably can encompass the entire gamut of neurosurgical efforts. The author's experience has been that this device is easy and comfortable for the surgeon to use. Image acquisition, giving views in the plane of choice, lasts no more than 2 to 60 seconds (depending on the imaging method) and does not substantially increase the duration of a given procedure. The author believes that the information received through intraoperative MR imaging scanning ultimately will contribute to decreasing the duration of surgery. Future possibilities include combining the intraoperative MR imager with other technologies, such as the endoscope, focused ultrasound, robotics, and the evaluation of brain function intraoperatively. The development of the intraoperative MR imager marks a significant advance in neurosurgery, an advance that will revolutionize intraoperative visualization as fully as the operating microscope did. The combination of intraoperative visualization and precise surgical navigation is unparalleled, and its enhancement of surgical applications will be widespread. Considering the remarkable potential of the intraoperative MR imager for neurosurgical applications, optimal magnet design, image quality, and navigational methods are necessary to capitalize on the advantages of this revolutionary tool. The intraoperative MR imaging system that the author's team has developed and used combines these features, and allows the performance of open surgical procedures without the need for patient or magnet repositioning. By

  10. Optimal segmentation of pupillometric images for estimating pupil shape parameters.

    Science.gov (United States)

    De Santis, A; Iacoviello, D

    2006-12-01

    The problem of determining the pupil morphological parameters from pupillometric data is considered. These characteristics are of great interest for non-invasive early diagnosis of the central nervous system response to environmental stimuli of different nature, in subjects suffering some typical diseases such as diabetes, Alzheimer disease, schizophrenia, drug and alcohol addiction. Pupil geometrical features such as diameter, area, centroid coordinates, are estimated by a procedure based on an image segmentation algorithm. It exploits the level set formulation of the variational problem related to the segmentation. A discrete set up of this problem that admits a unique optimal solution is proposed: an arbitrary initial curve is evolved towards the optimal segmentation boundary by a difference equation; therefore no numerical approximation schemes are needed, as required in the equivalent continuum formulation usually adopted in the relevant literature.

  11. Implementation and Optimization of Image Processing Algorithms on Embedded GPU

    Science.gov (United States)

    Singhal, Nitin; Yoo, Jin Woo; Choi, Ho Yeol; Park, In Kyu

    In this paper, we analyze the key factors underlying the implementation, evaluation, and optimization of image processing and computer vision algorithms on embedded GPU using OpenGL ES 2.0 shader model. First, we present the characteristics of the embedded GPU and its inherent advantage when compared to embedded CPU. Additionally, we propose techniques to achieve increased performance with optimized shader design. To show the effectiveness of the proposed techniques, we employ cartoon-style non-photorealistic rendering (NPR), speeded-up robust feature (SURF) detection, and stereo matching as our example algorithms. Performance is evaluated in terms of the execution time and speed-up achieved in comparison with the implementation on embedded CPU.

  12. User-driven sampling strategies in image exploitation

    Science.gov (United States)

    Harvey, Neal; Porter, Reid

    2013-12-01

    Visual analytics and interactive machine learning both try to leverage the complementary strengths of humans and machines to solve complex data exploitation tasks. These fields overlap most significantly when training is involved: the visualization or machine learning tool improves over time by exploiting observations of the human-computer interaction. This paper focuses on one aspect of the human-computer interaction that we call user-driven sampling strategies. Unlike relevance feedback and active learning sampling strategies, where the computer selects which data to label at each iteration, we investigate situations where the user selects which data is to be labeled at each iteration. User-driven sampling strategies can emerge in many visual analytics applications but they have not been fully developed in machine learning. User-driven sampling strategies suggest new theoretical and practical research questions for both visualization science and machine learning. In this paper we identify and quantify the potential benefits of these strategies in a practical image analysis application. We find user-driven sampling strategies can sometimes provide significant performance gains by steering tools towards local minima that have lower error than tools trained with all of the data. In preliminary experiments we find these performance gains are particularly pronounced when the user is experienced with the tool and application domain.

  13. Optimizing the 3R Study Strategy to Learn from Text

    NARCIS (Netherlands)

    Reijners, Pauline; Kester, Liesbeth; Wetzels, Sandra; Kirschner, Paul A.

    2014-01-01

    Learning from text is often very difficult for students. In this presentation the results of a study with the 3R study strategy are presented in which possible mechanisms for stimulating successful text learning are discussed.

  14. Optimal scan strategies for future CMB satellite experiments

    CERN Document Server

    Wallis, Christopher G R; Battye, Richard A; Delabrouille, Jacques

    2016-01-01

    The B-mode polarisation power spectrum in the Cosmic Microwave Background (CMB) is about four orders of magnitude fainter than the CMB temperature power spectrum. Any instrumental imperfections that couple temperature fluctuations to B-mode polarisation must therefore be carefully controlled and/or removed. We investigate the role that a scan strategy can have in mitigating certain common systematics by averaging systematic errors down with many crossing angles. We present approximate analytic forms for the error on the recovered B-mode power spectrum that would result from differential gain, differential pointing and differential ellipticity for the case where two detector pairs are used in a polarisation experiment. We use these analytic predictions to search the parameter space of common satellite scan strategies in order to identify those features of a scan strategy that have most impact in mitigating systematic effects. As an example we go on to identify a scan strategy suitable for the CMB satellite pro...

  15. Optimal scan strategies for future CMB satellite experiments

    Science.gov (United States)

    Wallis, Christopher G. R.; Brown, Michael L.; Battye, Richard A.; Delabrouille, Jacques

    2017-04-01

    The B-mode polarization power spectrum in the cosmic microwave background (CMB) is about four orders of magnitude fainter than the CMB temperature power spectrum. Any instrumental imperfections that couple temperature fluctuations to B-mode polarization must therefore be carefully controlled and/or removed. We investigate the role that a scan strategy can have in mitigating certain common systematics by averaging systematic errors down with many crossing angles. We present approximate analytic forms for the error on the recovered B-mode power spectrum that would result from differential gain, differential pointing and differential ellipticity for the case where two detector pairs are used in a polarization experiment. We use these analytic predictions to search the parameter space of common satellite scan strategies in order to identify those features of a scan strategy that have most impact in mitigating systematic effects. As an example, we go on to identify a scan strategy suitable for the CMB satellite proposed for the European Space Agency M5 call, considering the practical considerations of fuel requirement, data rate and the relative orientation of the telescope to the earth. Having chosen a scan strategy we then go on to investigate the suitability of the scan strategy.

  16. Optimal Energy Management Strategy of a Plug-in Hybrid Electric Vehicle Based on a Particle Swarm Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Zeyu Chen

    2015-04-01

    Full Text Available Plug-in hybrid electric vehicles (PHEVs) have been recognized as one of the most promising vehicle categories nowadays due to their low fuel consumption and reduced emissions. Energy management is critical for improving the performance of PHEVs. This paper proposes an energy management approach based on a particle swarm optimization (PSO) algorithm. The optimization objective is to minimize the total energy cost (the sum of oil and electricity) from vehicle utilization. A main drawback of optimal strategies is that they can hardly be used in real-time control. In order to solve this problem, a rule-based strategy containing three operation modes is proposed first, and then the PSO algorithm is applied to four threshold values in the presented rule-based strategy. The proposed strategy has been verified on the US06 driving cycle under the MATLAB/Simulink software environment. Two different driving cycles are adopted to evaluate the generalization ability of the proposed strategy. Simulation results indicate that the proposed PSO-based energy management method can achieve better energy efficiency compared with traditional blended strategies. The online control performance of the proposed approach has been demonstrated through a driver-in-the-loop real-time experiment.

  17. Optimized optical clearing method for imaging central nervous system

    Science.gov (United States)

    Yu, Tingting; Qi, Yisong; Gong, Hui; Luo, Qingming; Zhu, Dan

    2015-03-01

    The development of various optical clearing methods provides great potential for imaging the entire central nervous system when combined with multiple-labelling and microscopic imaging techniques. Each of these methods contributes to clearing but has its own weaknesses, including tissue deformation, fluorescence quenching, complex execution, and limited antibody penetration, which makes immunostaining of tissue blocks difficult. The passive clarity technique (PACT) bypasses those problems and clears samples with a simple implementation and excellent transparency with good fluorescence retention, but passive tissue clearing takes too long. In this study, we not only accelerate the clearing of brain blocks but also preserve GFP fluorescence well by screening for an optimal clearing temperature. The selection of a proper temperature makes PACT more applicable, which evidently broadens the application range of this method.

  18. Optimization strategy for element sizing in hybrid power systems

    Science.gov (United States)

    del Real, Alejandro J.; Arce, Alicia; Bordons, Carlos

    This paper presents a procedure to evaluate the optimal element sizing of hybrid power systems. In order to generalize the problem, this work exploits the "energy hub" formulation previously presented in the literature, defining an energy hub as an interface among energy producers, consumers and the transportation infrastructure. The resulting optimization minimizes an objective function which is based on costs and efficiencies of the system elements, while taking into account the hub model, energy and power constraints and estimated operational conditions, such as energy prices, input power flow availability and output energy demand. The resulting optimal architecture also constitutes a framework for further real-time control designs. Moreover, an example of a hybrid storage system is considered. In particular, the architecture of a hybrid plant incorporating a wind generator, batteries and intermediate hydrogen storage is optimized, based on real wind data and averaged residential demands, also taking into account possible estimation errors. The hydrogen system integrates an electrolyzer, a fuel cell stack and hydrogen tanks. The resulting optimal cost of such hybrid power plant is compared with the equivalent hydrogen-only and battery-only systems, showing improvements in investment costs of almost 30% in the worst case.

  19. Optimization strategy for element sizing in hybrid power systems

    Energy Technology Data Exchange (ETDEWEB)

    del Real, Alejandro J.; Arce, Alicia; Bordons, Carlos [Departamento de Ingenieria de Sistemas y Automatica, Universidad de Sevilla, 41092 Sevilla (Spain)

    2009-08-01

    This paper presents a procedure to evaluate the optimal element sizing of hybrid power systems. In order to generalize the problem, this work exploits the ''energy hub'' formulation previously presented in the literature, defining an energy hub as an interface among energy producers, consumers and the transportation infrastructure. The resulting optimization minimizes an objective function which is based on costs and efficiencies of the system elements, while taking into account the hub model, energy and power constraints and estimated operational conditions, such as energy prices, input power flow availability and output energy demand. The resulting optimal architecture also constitutes a framework for further real-time control designs. Moreover, an example of a hybrid storage system is considered. In particular, the architecture of a hybrid plant incorporating a wind generator, batteries and intermediate hydrogen storage is optimized, based on real wind data and averaged residential demands, also taking into account possible estimation errors. The hydrogen system integrates an electrolyzer, a fuel cell stack and hydrogen tanks. The resulting optimal cost of such hybrid power plant is compared with the equivalent hydrogen-only and battery-only systems, showing improvements in investment costs of almost 30% in the worst case. (author)

  20. Optimization of image processing algorithms on mobile platforms

    Science.gov (United States)

    Poudel, Pramod; Shirvaikar, Mukul

    2011-03-01

    This work presents a technique to optimize popular image processing algorithms on mobile platforms such as cell phones, net-books and personal digital assistants (PDAs). The increasing demand for video applications like context-aware computing on mobile embedded systems requires the use of computationally intensive image processing algorithms. The system engineer has a mandate to optimize them so as to meet real-time deadlines. A methodology to take advantage of the asymmetric dual-core processor, which includes an ARM and a DSP core supported by shared memory, is presented with implementation details. The target platform chosen is the popular OMAP 3530 processor for embedded media systems. It has an asymmetric dual-core architecture with an ARM Cortex-A8 and a TMS320C64x Digital Signal Processor (DSP). The development platform was the BeagleBoard with 256 MB of NAND RAM and 256 MB SDRAM memory. The basic image correlation algorithm is chosen for benchmarking as it finds widespread application for various template matching tasks such as face-recognition. The basic algorithm prototypes conform to OpenCV, a popular computer vision library. OpenCV algorithms can be easily ported to the ARM core which runs a popular operating system such as Linux or Windows CE. However, the DSP is architecturally more efficient at handling DFT algorithms. The algorithms are tested on a variety of images and performance results are presented measuring the speedup obtained due to dual-core implementation. A major advantage of this approach is that it allows the ARM processor to perform important real-time tasks, while the DSP addresses performance-hungry algorithms.

  1. Optimal embedding for shape indexing in medical image databases.

    Science.gov (United States)

    Qian, Xiaoning; Tagare, Hemant D; Fulbright, Robert K; Long, Rodney; Antani, Sameer

    2010-06-01

    This paper addresses the problem of indexing shapes in medical image databases. Shapes of organs are often indicative of disease, making shape similarity queries important in medical image databases. Mathematically, shapes with landmarks belong to shape spaces which are curved manifolds with a well defined metric. The challenge in shape indexing is to index data in such curved spaces. One natural indexing scheme is to use metric trees, but metric trees are prone to inefficiency. This paper proposes a more efficient alternative. We show that it is possible to optimally embed finite sets of shapes in shape space into a Euclidean space. After embedding, classical coordinate-based trees can be used for efficient shape retrieval. The embedding proposed in the paper is optimal in the sense that it least distorts the partial Procrustes shape distance. The proposed indexing technique is used to retrieve images by vertebral shape from the NHANES II database of cervical and lumbar spine X-ray images maintained at the National Library of Medicine. Vertebral shape strongly correlates with the presence of osteophytes, and shape similarity retrieval is proposed as a tool for retrieval by osteophyte presence and severity. Experimental results included in the paper evaluate (1) the usefulness of shape similarity as a proxy for osteophytes, (2) the computational and disk access efficiency of the new indexing scheme, (3) the relative performance of indexing with embedding to the performance of indexing without embedding, and (4) the computational cost of indexing using the proposed embedding versus the cost of an alternate embedding. The experimental results clearly show the relevance of shape indexing and the advantage of using the proposed embedding.
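
    A minimal sketch of the general idea of embedding a finite set of shapes, given only their pairwise distances, into a Euclidean space where coordinate-based indexing applies. Classical multidimensional scaling is used here as a generic stand-in for the paper's optimal Procrustes-based embedding, and the "shapes" are synthetic points:

```python
import numpy as np

def classical_mds(distances: np.ndarray, dim: int = 2) -> np.ndarray:
    """Embed items into `dim`-dimensional Euclidean space from a pairwise distance
    matrix, distorting the given distances as little as possible (in the
    classical-scaling sense)."""
    n = distances.shape[0]
    centering = np.eye(n) - np.ones((n, n)) / n
    # Double-centered squared-distance (Gram) matrix.
    gram = -0.5 * centering @ (distances ** 2) @ centering
    eigenvalues, eigenvectors = np.linalg.eigh(gram)
    order = np.argsort(eigenvalues)[::-1][:dim]
    return eigenvectors[:, order] * np.sqrt(np.maximum(eigenvalues[order], 0.0))

# Illustrative usage: distances between random planar items, then a Euclidean
# embedding whose pairwise distances approximately reproduce them.
rng = np.random.default_rng(6)
points = rng.random((10, 2))
d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
embedded = classical_mds(d, dim=2)
d_embedded = np.linalg.norm(embedded[:, None, :] - embedded[None, :, :], axis=-1)
print("max distance distortion:", float(np.abs(d - d_embedded).max()))
```

    Once the items have Euclidean coordinates, any classical coordinate-based index (e.g. a k-d tree) can serve the similarity queries, which is the efficiency argument made in the record above.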

  2. Matching suitable feature construction for SAR images based on evolutionary synthesis strategy

    Institute of Scientific and Technical Information of China (English)

    Bu Yanlong; Tang Geshi; Liu Hongfu; Pan Liang

    2013-01-01

    In this paper, a set of algorithms for constructing synthetic aperture radar (SAR) matching-suitable features is proposed based on an evolutionary synthesis strategy. In the process, on the one hand, the indexes of primary matching-suitable features (PMSFs) are designed based on the characteristics of image texture, SAR imaging and the SAR matching algorithm, a step that involves expertise; on the other hand, by designing a synthesized-operation expression tree based on PMSFs, a much more flexible expression form for synthesized features is built, which greatly expands the construction space. A genetic-algorithm-based search is then employed to find the synthesized matching-suitable feature (SMSF) with the highest efficiency, largely improving the efficiency of the optimized search. In addition, experimental results on airborne C-band and P-band SAR ortho-images show that the SMSFs obtained by the algorithms accurately reflect the matching suitability of SAR images, and the matching probabilities of the selected matching-suitable areas of the ortho-images reached 99 ± 0.5%.

  3. Optimal Phase Masks for High Contrast Imaging Applications

    Science.gov (United States)

    Ruane, Garreth J.

    2016-05-01

    Phase-only optical elements can provide a number of important functions for high-contrast imaging. This thesis presents analytical and numerical optical design methods for accomplishing specific tasks, the most significant of which is the precise suppression of light from a distant point source. Instruments designed for this purpose are known as coronagraphs. Here, advanced coronagraph designs are presented that offer improved theoretical performance in comparison to the current state-of-the-art. Applications of these systems include the direct imaging and characterization of exoplanets and circumstellar disks with high sensitivity. Several new coronagraph designs are introduced and, in some cases, experimental support is provided. In addition, two novel high-contrast imaging applications are discussed: the measurement of sub-resolution information using coronagraphic optics and the protection of sensors from laser damage. The former is based on experimental measurements of the sensitivity of a coronagraph to source displacement. The latter discussion presents the current state of ongoing theoretical work. Beyond the mentioned applications, the main outcome of this thesis is a generalized theory for the design of optical systems with one or more phase masks that provide precise control of radiation over a large dynamic range, which is relevant in various high-contrast imaging scenarios. The optimal phase masks depend on the necessary tasks, the maximum number of optics, and application-specific performance measures. The challenges and future prospects of this work are discussed in detail.

  4. Investment Strategies Optimization based on a SAX-GA Methodology

    CERN Document Server

    Canelas, António M L; Horta, Nuno C G

    2013-01-01

    This book presents a new computational finance approach combining a Symbolic Aggregate approXimation (SAX) technique with an optimization kernel based on genetic algorithms (GA). While the SAX representation is used to describe the financial time series, the evolutionary optimization kernel is used to identify the most relevant patterns and generate investment rules. The proposed approach considers several different chromosome structures in order to achieve better results on the trading platform. The methodology presented in this book has great potential in investment markets.
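
    A minimal sketch of the SAX step described above (piecewise aggregate approximation followed by symbol assignment via Gaussian breakpoints); the breakpoint values are the standard ones for a 4-symbol alphabet, and the price series is synthetic:

```python
import numpy as np

def sax(series: np.ndarray, n_segments: int, breakpoints=(-0.6745, 0.0, 0.6745)) -> str:
    """Symbolic Aggregate approXimation: z-normalize, average over equal-width
    segments (PAA), then map each segment mean to a letter via the breakpoints."""
    z = (series - series.mean()) / series.std()
    segments = np.array_split(z, n_segments)       # piecewise aggregate approximation
    paa = np.array([seg.mean() for seg in segments])
    alphabet = "abcd"                               # 4-symbol alphabet for 3 breakpoints
    symbols = np.searchsorted(breakpoints, paa)     # interval index for each segment mean
    return "".join(alphabet[i] for i in symbols)

# Illustrative usage on a synthetic "price" series with an upward drift:
rng = np.random.default_rng(7)
prices = np.cumsum(rng.normal(0.1, 1.0, size=128)) + 100.0
print("SAX word:", sax(prices, n_segments=8))
```

    In a SAX-GA scheme like the one described, such words become the raw material over which the genetic algorithm searches for recurring patterns and attaches trading rules to them.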

  5. From the risk-stratification of patients with dilated cardiomyopathy to the optimal treatment strategy

    Directory of Open Access Journals (Sweden)

    Frolov A.V.

    2016-03-01

    Conclusions. Application of the original risk-stratification model will make it possible to optimize the general management of DCM and the strategy for timely selection of potential candidates for cardioverter-defibrillator implantation for the primary prevention of SCD.

  6. Real-time optimization power-split strategy for hybrid electric vehicles

    Institute of Scientific and Technical Information of China (English)

    XIA ChaoYing; ZHANG Cong

    2016-01-01

    Energy management strategies based on optimal control theory can achieve minimum fuel consumption for hybrid electric vehicles, but the requirement that the driving cycle be known a priori leads to a real-time problem. A real-time optimization power-split strategy is proposed based on linear quadratic optimal control. Battery state-of-charge sustainability and fuel economy are ensured by designing a quadratic performance index combined with two rules. The engine power and motor power of this strategy are calculated in real time based on the current system state and command, and are not related to future driving conditions. Simulation results in ADVISOR demonstrate that, under various driving cycles, road slopes and vehicle parameters, the proposed strategy significantly improves fuel economy, coming very close to that of the optimal control based on Pontryagin's minimum principle, while greatly reducing computational complexity.

  7. Image Quality Modeling and Optimization for Non-Conventional Aperture Imaging Systems

    Science.gov (United States)

    Salvaggio, Philip S.

    The majority of image quality studies have been performed on systems with conventional aperture functions. These systems have straightforward aperture designs and well-understood behavior. Image quality for these systems can be predicted by the General Image Quality Equation (GIQE). However, in order to continue pushing the boundaries of imaging, more control over the point spread function of an imaging system may be necessary. This requires modifications in the pupil plane of a system, causing a departure from the realm of most image quality studies. Examples include sparse apertures, synthetic apertures, coded apertures and phase elements. This work will focus on sparse aperture telescopes and the image quality issues associated with them; however, the methods presented will be applicable to other non-conventional aperture systems. In this research, an approach for modeling the image quality of non-conventional aperture systems will be introduced. While the modeling approach is based in previous work, a novel validation study will be performed, which accounts for the effects of both broadband illumination and wavefront error. One of the key image quality challenges for sparse apertures is post-processing ringing artifacts. These artifacts have been observed in modeled data, but a validation study will be performed to observe them in measured data and to compare them to model predictions. Once validated, the modeling approach will be used to perform a small set of design studies for sparse aperture systems, including spectral bandpass selection and aperture layout optimization.

  8. Optimal Control Strategies in a Two Dimensional Differential Game Using Linear Equation under a Perturbed System

    Directory of Open Access Journals (Sweden)

    Musa Danjuma SHEHU

    2008-06-01

    Full Text Available This paper lays emphasis on the formulation of two-dimensional differential games via optimal control theory and on control systems whose dynamics are described by a system of ordinary differential equations in the form of a linear equation under the influence of two controls U(·) and V(·). Based on this, strategies were constructed. Hence we determine the optimal strategy for a control, say U(·), under a perturbation generated by the second control V(·) within a given manifold M.

  9. The CEV Model and Its Application in a Study of Optimal Investment Strategy

    Directory of Open Access Journals (Sweden)

    Aiyin Wang

    2014-01-01

    Full Text Available The constant elasticity of variance (CEV) model is used to describe the price of the risky asset. Maximizing the expected utility via the Hamilton-Jacobi-Bellman (HJB) equation, which characterizes the optimal investment strategies, we obtain a partial differential equation. Applying the Legendre transform, we transform the equation into a dual problem and obtain an approximate solution and optimal investment strategies for the exponential utility function.
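
    For reference, a hedged sketch of the standard ingredients behind such a study, written under common textbook conventions (the notation and exact formulation of the paper may differ): the CEV price dynamics, the wealth equation, and the generic HJB equation for the value function.

```latex
% Risky-asset price under the CEV model, with elasticity parameter \beta:
\[
  dS_t = \mu S_t\,dt + \sigma S_t^{\beta+1}\,dW_t .
\]
% With wealth X_t and the amount \pi_t invested in the risky asset (the remainder
% earning the risk-free rate r), the wealth dynamics are
\[
  dX_t = \bigl[\, r X_t + (\mu - r)\pi_t \,\bigr]dt + \sigma S_t^{\beta}\pi_t\,dW_t ,
\]
% and the value function V(t,s,x) = \sup_{\pi} \mathbb{E}\,[\,U(X_T)\mid S_t=s,\,X_t=x\,]
% satisfies an HJB equation of the form
\[
  V_t + \sup_{\pi}\Bigl\{ \bigl[r x + (\mu-r)\pi\bigr]V_x
      + \tfrac{1}{2}\sigma^2 s^{2\beta}\pi^2 V_{xx}
      + \sigma^2 s^{2\beta+1}\pi V_{sx} \Bigr\}
      + \mu s V_s + \tfrac{1}{2}\sigma^2 s^{2\beta+2}V_{ss} = 0 ,
  \qquad V(T,s,x) = U(x).
\]
```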

  10. AN ASSESSMENT AND OPTIMIZATION OF QUALITY OF STRATEGY PROCESS

    Directory of Open Access Journals (Sweden)

    Snezana Nestic

    2013-12-01

    Full Text Available In order to improve the quality of their processes, companies usually rely on quality management systems and the requirements of ISO 9001:2008. Small and medium-sized companies face a series of challenges in the objectification, evaluation and assessment of process quality. In this paper, the strategy process is decomposed for a typical medium-sized manufacturing company, and indicators for the defined sub-processes are developed based on the requirements of ISO 9001:2008. The weights of the sub-processes are calculated using a fuzzy set approach. Finally, the developed solution, based on a genetic algorithm approach, is presented and tested on data from 142 manufacturing companies. The presented solution enables assessment of the quality of a strategy process, ranks the indicators, and provides a basis for successful improvement of the quality of the strategy process.

  11. Novel Strategies to Optimize Targeted Molecular Imaging and Therapy

    NARCIS (Netherlands)

    J.P. Norenberg (Jeffrey)

    2013-01-01

    Improving patients’ clinical outcomes requires many levels of examination, owing to the enormous complexities of human disease and healthcare delivery. Our understanding of disease also requires many different levels of observation. The human experience preconditions us to see the whole

  12. Optimization Strategies to Increase Electrical Distribution Networks Robustness

    Directory of Open Access Journals (Sweden)

    Dorin Sarchiz

    2010-12-01

    Full Text Available The paper aims to present a mathematical model to optimize the power distribution network graph in terms of increasing its robustness, i.e. reducing the risk of its destruction (its removal from service, whether accidental or intentional), with applications to the 20 kV and 110 kV distribution networks of Mures County.

  13. Evolutionary and principled search strategies for sensornet protocol optimization.

    Science.gov (United States)

    Tate, Jonathan; Woolford-Lim, Benjamin; Bate, Iain; Yao, Xin

    2012-02-01

    Interactions between multiple tunable protocol parameters and multiple performance metrics are generally complex and unknown; finding optimal solutions is generally difficult. However, protocol tuning can yield significant gains in energy efficiency and resource requirements, which is of particular importance for sensornet systems in which resource availability is severely restricted. We address this multi-objective optimization problem for two dissimilar routing protocols and by two distinct approaches. First, we apply factorial design and statistical model fitting methods to reject insignificant factors and locate regions of the problem space containing near-optimal solutions by principled search. Second, we apply the Strength Pareto Evolutionary Algorithm 2 and Two-Archive evolutionary algorithms to explore the problem space, with each iteration potentially yielding solutions of higher quality and diversity than the preceding iteration. Whereas a principled search methodology yields a generally applicable survey of the problem space and enables performance prediction, the evolutionary approach yields viable solutions of higher quality and at lower experimental cost. This is the first study in which sensornet protocol optimization has been explicitly formulated as a multi-objective problem and solved with state-of-the-art multi-objective evolutionary algorithms.

  14. Optimization of Decommission Strategy for Offshore Wind Farms

    DEFF Research Database (Denmark)

    Hou, Peng; Hu, Weihao; Soltani, Mohsen

    2016-01-01

    The life time of offshore wind farm is around 20 years. After that, the whole farm should be decommissioned which is also one of the main factors that contribute to the high investment. In order to make a costeffective wind farm, a novel optimization method for decommission is addressed...

  15. Motion Structural Optimization Strategy for Rhombic Element Based Foldable Structure

    Directory of Open Access Journals (Sweden)

    Seung Hyun Jeong

    2015-02-01

    Full Text Available This research presents a new systematic design approach for foldable structures composed of several rhombic elements by applying a genetic algorithm. As the shapes represented by a foldable structure can be easily and dramatically morphed by manipulating the rotational directions and angles of its joints, foldable structures have been used for various elementary structural members and engineering mechanisms. However, a systematic design approach that determines the detailed rotational angles and directions of unit cells for arbitrarily shaped target areas has not yet been proposed. This research contributes to this by developing a new structural optimization method that determines the optimal angles and rotation directions needed to cover an arbitrarily shaped target area of interest with aggregated rhombic elements. To achieve this purpose, we present an optimization formulation minimizing the sum of distances between each reference joint of an arbitrarily shaped target area and its closest outer joint of the foldable structure. To find the outer joint set of a given foldable structure, an efficient geometric analysis method based on Delaunay triangulation is also developed and implemented. To show the validity and limitations of the present approach, several foldable structure design problems for two-dimensional arbitrarily shaped target areas are solved with the present optimization procedure.
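
    The outer-joint detection step mentioned above can be prototyped with SciPy's Delaunay triangulation. The sketch below is a generic illustration under stated assumptions (random points standing in for joint coordinates, an arbitrary alpha-shape-style edge threshold), not the authors' implementation: triangles with an overly long edge are discarded, and edges that belong to exactly one remaining triangle are reported as the outer boundary.

    ```python
    # Sketch: outer-boundary (outer joint) extraction from 2D points via
    # Delaunay triangulation with a simple long-edge filter. Illustrative only.
    import numpy as np
    from scipy.spatial import Delaunay

    def outer_edges(points, max_edge=None):
        tri = Delaunay(points)
        edge_count = {}
        for simplex in tri.simplices:
            pts = points[simplex]
            longest = max(np.linalg.norm(pts[i] - pts[j])
                          for i in range(3) for j in range(i + 1, 3))
            if max_edge is not None and longest > max_edge:
                continue                      # drop overly large triangles
            for i in range(3):
                e = tuple(sorted((simplex[i], simplex[(i + 1) % 3])))
                edge_count[e] = edge_count.get(e, 0) + 1
        return [e for e, c in edge_count.items() if c == 1]   # boundary edges

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        joints = rng.random((50, 2))                     # stand-in joint coordinates
        boundary = outer_edges(joints, max_edge=0.4)
        print(sorted({int(i) for e in boundary for i in e}))   # indices of outer joints
    ```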

  16. Taxing Strategies for Carbon Emissions: A Bilevel Optimization Approach

    Directory of Open Access Journals (Sweden)

    Wei Wei

    2014-04-01

    Full Text Available This paper presents a quantitative and computational method to determine the optimal tax rate among generating units. To strike a balance between the reduction of carbon emission and the profit of energy sectors, the proposed bilevel optimization model can be regarded as a Stackelberg game between the government agency and the generation companies. The upper-level, which represents the government agency, aims to limit total carbon emissions within a certain level by setting optimal tax rates among generators according to their emission performances. The lower-level, which represents decision behaviors of the grid operator, tries to minimize the total production cost under the tax rates set by the government. The bilevel optimization model is finally reformulated into a mixed integer linear program (MILP which can be solved by off-the-shelf MILP solvers. Case studies on a 10-unit system as well as a provincial power grid in China demonstrate the validity of the proposed method and its capability in practical applications.

  17. Optimal Input Strategy for Plug and Play Process Control Systems

    DEFF Research Database (Denmark)

    Kragelund, Martin Nygaard; Leth, John-Josef; Wisniewski, Rafal

    2010-01-01

    This paper considers the problem of optimal operation of a plant, whose goal is to maintain production at minimum cost. The system considered in this work consists of a joined plant and redundant input systems. It is assumed that each input system contributes to a flow of goods into the joined part...

  18. React or wait: which optimal culling strategy to control infectious diseases in wildlife.

    Science.gov (United States)

    Bolzoni, Luca; Tessoni, Valentina; Groppi, Maria; De Leo, Giulio A

    2014-10-01

    We applied optimal control theory to an SI epidemic model to identify optimal culling strategies for diseases management in wildlife. We focused on different forms of the objective function, including linear control, quadratic control, and control with limited amount of resources. Moreover, we identified optimal solutions under different assumptions on disease-free host dynamics, namely: self-regulating logistic growth, Malthusian growth, and the case of negligible demography. We showed that the correct characterization of the disease-free host growth is crucial for defining optimal disease control strategies. By analytical investigations of the model with negligible demography, we demonstrated that the optimal strategy for the linear control can be either to cull at the maximum rate at the very beginning of the epidemic (reactive culling) when the culling cost is low, or never to cull, when culling cost is high. On the other hand, in the cases of quadratic control or limited resources, we demonstrated that the optimal strategy is always reactive. Numerical analyses for hosts with logistic growth showed that, in the case of linear control, the optimal strategy is always reactive when culling cost is low. In contrast, if the culling cost is high, the optimal strategy is to delay control, i.e. not to cull at the onset of the epidemic. Finally, we showed that for diseases with the same basic reproduction number delayed control can be optimal for acute infections, i.e. characterized by high disease-induced mortality and fast dynamics, while reactive control can be optimal for chronic ones.
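
    As a toy illustration of the modelling setting described above (not the authors' parameterization; all rates below are made up), the sketch integrates an SI model with negligible host demography and non-selective culling that starts after a chosen delay, so that reactive and delayed strategies can be compared.

    ```python
    # Toy SI epidemic with non-selective culling switched on after a delay.
    # Parameter values are illustrative only.
    from scipy.integrate import solve_ivp

    beta, mu, c_max = 0.5, 0.1, 0.2      # transmission, disease mortality, culling rate

    def make_rhs(delay):
        def rhs(t, y):
            S, I = y
            c = c_max if t >= delay else 0.0
            return [-beta * S * I - c * S,
                    beta * S * I - mu * I - c * I]
        return rhs

    def affected_fraction(delay):
        sol = solve_ivp(make_rhs(delay), (0.0, 100.0), [0.99, 0.01], max_step=0.5)
        return 1.0 - sol.y[0, -1]        # hosts lost to infection or culling

    if __name__ == "__main__":
        for delay in (0.0, 5.0, 20.0):   # reactive vs. increasingly delayed culling
            print(f"culling delay {delay:>4}: affected fraction {affected_fraction(delay):.3f}")
    ```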

  19. Aircraft path planning for optimal imaging using dynamic cost functions

    Science.gov (United States)

    Christie, Gordon; Chaudhry, Haseeb; Kochersberger, Kevin

    2015-05-01

    Unmanned aircraft development has accelerated with recent technological improvements in sensing and communications, which has resulted in an "applications lag" for how these aircraft can best be utilized. The aircraft are becoming smaller, more maneuverable and have longer endurance to perform sensing and sampling missions, but operating them aggressively to exploit these capabilities has not been a primary focus in unmanned systems development. This paper addresses a means of aerial vehicle path planning to provide a realistic optimal path in acquiring imagery for structure from motion (SfM) reconstructions and performing radiation surveys. This method will allow SfM reconstructions to occur accurately and with minimal flight time so that the reconstructions can be executed efficiently. An assumption is made that we have 3D point cloud data available prior to the flight. A discrete set of scan lines are proposed for the given area that are scored based on visibility of the scene. Our approach finds a time-efficient path and calculates trajectories between scan lines and over obstacles encountered along those scan lines. Aircraft dynamics are incorporated into the path planning algorithm as dynamic cost functions to create optimal imaging paths in minimum time. Simulations of the path planning algorithm are shown for an urban environment. We also present our approach for image-based terrain mapping, which is able to efficiently perform a 3D reconstruction of a large area without the use of GPS data.

  20. A study of optimal abstract jamming strategies vs. noncoherent MFSK

    Science.gov (United States)

    Mceliece, R. J.; Rodemich, E. R.

    1983-01-01

    The present investigation is concerned with the performance of uncoded MFSK modulation in the presence of arbitrary additive jamming, taking into account the objective to devise robust antijamming strategies. An abstract model is considered, giving attention to the signal strength as a nonnegative real number X, the employment of X as a random variable, its distribution function G(x), the transmitter's strategy G, the jamming noise as an M-dimensional random vector Z, and the error probability. A summary of previous work on the considered problem is provided, and the results of the current study are presented.

  1. Event-based progression detection strategies using scanning laser polarimetry images of the human retina.

    Science.gov (United States)

    Vermeer, K A; Lo, B; Zhou, Q; Vos, F M; Vossepoel, A M; Lemij, H G

    2011-09-01

    Monitoring glaucoma patients and ensuring optimal treatment requires accurate and precise detection of progression. Many glaucomatous progression detection strategies may be formulated for Scanning Laser Polarimetry (SLP) data of the local nerve fiber thickness. In this paper, several strategies, all based on repeated GDx VCC SLP measurements, are tested to identify the optimal one for clinical use. The parameters of the methods were adapted to yield a set specificity of 97.5% on real image series. For a fixed sensitivity of 90%, the minimally detectable loss was subsequently determined for both localized and diffuse loss. Due to the large size of the required data set, a previously described simulation method was used for assessing the minimally detectable loss. The optimal strategy was identified and was based on two baseline visits and two follow-up visits, requiring two-out-of-four positive tests. Its associated minimally detectable loss was 5-12 μm, depending on the reproducibility of the measurements.

  2. Current imaging strategies for the evaluation of uterine cervical cancer.

    Science.gov (United States)

    Bourgioti, Charis; Chatoupis, Konstantinos; Moulopoulos, Lia Angela

    2016-04-28

    Uterine cervical cancer still remains an important socioeconomic issue because it largely affects women of reproductive age. Prognosis is highly dependent on the extent of the disease at diagnosis and, therefore, accurate staging is crucial for optimal management. Cervical cancer is clinically staged, according to International Federation of Gynecology and Obstetrics guidelines, but, currently, there is increased use of cross sectional imaging modalities [computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography-CT (PET-CT)] for the study of important prognostic factors like tumor size, parametrial invasion, endocervical extension, pelvic side wall or adjacent/distal organs involvement and lymph node status. Imaging indications also include cervical cancer follow-up, evaluation of tumor response to treatment and selection of suitable candidates for less radical surgeries like radical trachelectomy for fertility preservation. The preferred imaging method for local cervical cancer evaluation is MRI; CT is equally effective for evaluation of extrauterine spread of the disease. PET-CT shows high diagnostic performance for the detection of tumor relapse and metastatic lymph nodes. The aim of this review is to familiarize radiologists with the MRI appearance of cervical carcinoma and to discuss the indications of cross sectional imaging during the course of the disease in patients with cervical carcinoma.

  3. Current imaging strategies for the evaluation of uterine cervical cancer

    Institute of Scientific and Technical Information of China (English)

    Charis Bourgioti; Konstantinos Chatoupis; Lia Angela Moulopoulos

    2016-01-01

    Uterine cervical cancer still remains an important socioeconomic issue because it largely affects women of reproductive age. Prognosis is highly dependent on the extent of the disease at diagnosis and, therefore, accurate staging is crucial for optimal management. Cervical cancer is clinically staged, according to International Federation of Gynecology and Obstetrics guidelines, but, currently, there is increased use of cross sectional imaging modalities [computed tomography (CT), magnetic resonance imaging (MRI), positron emission tomography-CT (PET-CT)] for the study of important prognostic factors like tumor size, parametrial invasion, endocervical extension, pelvic side wall or adjacent/distal organs involvement and lymph node status. Imaging indications also include cervical cancer follow-up, evaluation of tumor response to treatment and selection of suitable candidates for less radical surgeries like radical trachelectomy for fertility preservation. The preferred imaging method for local cervical cancer evaluation is MRI; CT is equally effective for evaluation of extrauterine spread of the disease. PET-CT shows high diagnostic performance for the detection of tumor relapse and metastatic lymph nodes. The aim of this review is to familiarize radiologists with the MRI appearance of cervical carcinoma and to discuss the indications of cross sectional imaging during the course of the disease in patients with cervical carcinoma.

  4. MIMO-OFDM signal optimization for SAR imaging radar

    Science.gov (United States)

    Baudais, J.-Y.; Méric, S.; Riché, V.; Pottier, É.

    2016-12-01

    This paper investigates the optimization of the coded orthogonal frequency division multiplexing (OFDM) transmitted signal in a synthetic aperture radar (SAR) context. We propose to design OFDM signals to achieve range ambiguity mitigation. Indeed, range ambiguities are well known to be a limitation for SAR systems which operate with pulsed transmitted signals. The ambiguous reflected signal corresponding to one pulse is then detected when the radar has already transmitted the next pulse. In this paper, we demonstrate that range ambiguity mitigation is possible by using orthogonal transmitted waves as OFDM pulses. The coded OFDM signal is optimized through genetic optimization procedures based on radar image quality parameters. Moreover, we propose to design a multiple-input multiple-output (MIMO) configuration to enhance the noise robustness of a radar system, and this configuration is mainly efficient in the case of using orthogonal waves as OFDM pulses. The results we obtain show that OFDM signals outperform conventional radar chirps for range ambiguity suppression and for robustness enhancement in a 2×2 MIMO configuration.

  5. Using Evolution Strategy with Meta-models for Well Placement Optimization

    CERN Document Server

    Bouzarkouna, Zyed; Auger, Anne

    2010-01-01

    Optimum implementation of non-conventional wells allows us to increase considerably hydrocarbon recovery. By considering the high drilling cost and the potential improvement in well productivity, well placement decision is an important issue in field development. Considering complex reservoir geology and high reservoir heterogeneities, stochastic optimization methods are the most suitable approaches for optimum well placement. This paper proposes an optimization methodology to determine optimal well location and trajectory based upon the Covariance Matrix Adaptation - Evolution Strategy (CMA-ES) which is a variant of Evolution Strategies recognized as one of the most powerful derivative-free optimizers for continuous optimization. To improve the optimization procedure, two new techniques are investigated: (1). Adaptive penalization with rejection is developed to handle well placement constraints. (2). A meta-model, based on locally weighted regression, is incorporated into CMA-ES using an approximate ranking ...

  6. Optimized Power Dispatch Strategy for Offshore Wind Farms

    DEFF Research Database (Denmark)

    Hou, Peng; Hu, Weihao; Zhang, Baohua

    2016-01-01

    Maximizing the power production of offshore wind farms using proper control strategy has become an important issue for wind farm operators. However, the power transmitted to the onshore substation (OS) is not only related to the power production of each wind turbine (WT) but also the power losses...

  7. Closing the loop : optimal strategies for hybrid manufacturing /remanufacturing systems

    NARCIS (Netherlands)

    Caner Bulmus, Serra

    2013-01-01

    In her dissertation, Serra Caner Bulmus describes optimal strategies for the collection of used products and the remanufacturing of those products by companies. In remanufacturing, not only are the materials reused, but the added production value is also retained. This makes remanufactur

  8. Optimization of Key Parameters of Energy Management Strategy for Hybrid Electric Vehicle Using DIRECT Algorithm

    Directory of Open Access Journals (Sweden)

    Jingxian Hao

    2016-11-01

    Full Text Available The rule-based logic threshold control strategy has been frequently used in energy management strategies for hybrid electric vehicles (HEVs) owing to its convenience in adjusting parameters, real-time performance, stability, and robustness. However, the logic threshold control parameters cannot usually ensure the best vehicle performance at different driving cycles and conditions. For this reason, the optimization of key parameters is important to improve the fuel economy, dynamic performance, and drivability. In principle, this is a multiparameter nonlinear optimization problem. The logic threshold energy management strategy for an all-wheel-drive HEV is comprehensively analyzed and developed in this study. Seven key parameters to be optimized are extracted. The optimization model of key parameters is proposed from the perspective of fuel economy. The global optimization method, the DIRECT algorithm, which has good real-time performance, a low computational burden, and rapid convergence, is selected to optimize the extracted key parameters globally. The results show that with the optimized parameters, the engine operates more in the high-efficiency range, resulting in fuel savings of 7% compared with the non-optimized parameters. The proposed method can provide guidance for calibrating the parameters of the vehicle energy management strategy from the perspective of fuel economy.
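
    SciPy (version 1.9 and later) provides an implementation of the DIRECT algorithm as scipy.optimize.direct. The sketch below only illustrates the calling pattern on a stand-in objective with seven normalized parameters; the actual study evaluates a vehicle fuel-economy simulation, which is not reproduced here.

    ```python
    # Sketch: DIRECT global optimization of a 7-parameter box-constrained
    # objective (requires SciPy >= 1.9). The objective is a hypothetical
    # stand-in for a drive-cycle fuel-consumption simulation.
    import numpy as np
    from scipy.optimize import direct, Bounds

    def surrogate_fuel_cost(x):
        target = np.linspace(0.2, 0.8, x.size)           # made-up "good" settings
        return float(np.sum((x - target) ** 2) + 0.1 * np.sum(np.sin(5.0 * x)))

    bounds = Bounds(np.zeros(7), np.ones(7))             # normalized parameter ranges
    result = direct(surrogate_fuel_cost, bounds, maxfun=2000)
    print("best parameters:", np.round(result.x, 3), "cost:", round(result.fun, 4))
    ```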

  9. Optimization of holographic real images for subsea hologrammetry

    Science.gov (United States)

    Watson, John; Foster, E.; Ross, Gary A.

    1995-07-01

    Hologrammetry has many advantages over conventional imaging techniques for subsea visual inspection. Holograms recorded underwater can be replayed in the laboratory to provide an optical replica of the original subject. Real-image reconstruction allows planar 'optical sections' to be isolated and measured directly. However, these advantages can be removed by poor optimization of the reconstructed image. Furthermore, recording the hologram in water and replaying in air increases the magnitude of the optical aberrations which may be apparent. Such aberrations can be minimized using index compensation whereby the hologram is replayed in air with a wavelength which is equivalent to the effective wavelength of the beam in water. To monitor the influence of these effects and to establish the validity of the index compensation method, reconstruction takes place in a micrometer-controlled plate holder to allow precise positioning about all three rotational axes and the three translational axes. The image is viewed using a lensless TV camera or measuring microscope which is accurately moved through the image volume to provide dimensional information. Index compensation has been shown to work well for both back-lit and front-lit off-axis holograms and is effective over a wide range of field angles. Typically, an on-axis resolution of around 1 lp/mm for a front-lit hologram replayed at the recording wavelength will increase to over 20 lp/mm when reconstruction takes place at the compensation wavelength. The corresponding astigmatic difference reduces from around 100 mm to less than 2 mm on employing compensation.

  10. Design of optimal collimation for dedicated molecular breast imaging systems

    Energy Technology Data Exchange (ETDEWEB)

    Weinmann, Amanda L.; Hruska, Carrie B.; O'Connor, Michael K. [Department of Radiology, Division of Nuclear Medicine, Mayo Clinic, Rochester, Minnesota 55905 (United States)

    2009-03-15

    Molecular breast imaging (MBI) is a functional imaging technique that uses specialized small field-of-view gamma cameras to detect the preferential uptake of a radiotracer in breast lesions. MBI has potential to be a useful adjunct method to screening mammography for the detection of occult breast cancer. However, a current limitation of MBI is the high radiation dose (a factor of 7-10 times that of screening mammography) associated with current technology. The purpose of this study was to optimize the gamma camera collimation with the aim of improving sensitivity while retaining adequate resolution for the detection of sub-10-mm lesions. Square-hole collimators with holes matched to the pixelated cadmium zinc telluride detector elements of the MBI system were designed. Data from MBI patient studies and parameters of existing dual-head MBI systems were used to guide the range of desired collimator resolutions, source-to-collimator distances, pixel sizes, and collimator materials that were examined. General equations describing collimator performance for a conventional gamma camera were used in the design process along with several important adjustments to account for the specialized imaging geometry of the MBI system. Both theoretical calculations and a Monte Carlo model were used to measure the geometric efficiency (or sensitivity) and resolution of each designed collimator. Results showed that through optimal collimation, collimator sensitivity could be improved by factors of 1.5-3.2, while maintaining a collimator resolution of either ≤5 or ≤7.5 mm at a distance of 3 cm from the collimator face. These gains in collimator sensitivity permit an inversely proportional drop in the required dose to perform MBI.
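
    For context, the textbook relations for a parallel-hole collimator (as found in standard nuclear-medicine physics texts, not the MBI-specific adjusted equations developed in the paper) capture the sensitivity-resolution trade-off being exploited: geometric efficiency grows roughly with the square of the collimator resolution.

    ```latex
    % Generic parallel-hole collimator relations (textbook form).
    % d = hole width, t = septal thickness, l_eff = effective hole length,
    % b = source-to-collimator distance, K ~ 0.26-0.28 (hole-shape constant).
    \begin{align*}
      R_{\mathrm{coll}} &\approx \frac{d\,(l_{\mathrm{eff}} + b)}{l_{\mathrm{eff}}}, \\
      g &\approx K^{2}\left(\frac{d}{l_{\mathrm{eff}}}\right)^{2}\frac{d^{2}}{(d+t)^{2}},
        \qquad\text{so that approximately } g \propto R_{\mathrm{coll}}^{2}\ \text{at fixed } b.
    \end{align*}
    ```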

  11. OPTIMAL FEED STRATEGY FOR FED-BATCH GLYCEROL FERMENTATION DETERMINED BY MAXIMUM PRINCIPLE

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    1 Introduction Glycerol fed-batch fermentation is attractive for commercial application since it can control the glucose concentration by changing the feed rate and achieve a high glycerol yield; it is therefore essential to develop an optimal glucose feed strategy. For most fed-batch fermentations, optimization of the feed rate has been based on Pontryagin's maximum principle. Since the feed rate term appears linearly in the Hamiltonian, the optimal feed rate profile usually consists of bang-bang intervals and singular ...

  12. Alternative adhesive strategies to optimize bonding to radicular dentin.

    Science.gov (United States)

    Bouillaguet, Serge; Bertossa, Bruno; Krejci, Ivo; Wataha, John C; Tay, Franklin R; Pashley, David H

    2007-10-01

    This study tested the hypothesis that bond strengths of filling materials to radicular dentin might be optimized by using an indirect dentin bonding procedure with an acrylic core material. Roots of human teeth were endodontically prepared and obturated with EndoREZ, Epiphany, or the bonding of an acrylic point with SE Bond by using a direct or an indirect bonding technique. Bond strengths of endodontic sealers to radicular dentin were measured with a thin slice push-out test. Push-out strengths of EndoREZ and Epiphany to radicular dentin were less than 5 megapascals (MPa). The direct bonding technique with acrylic points and the self-etching adhesive had push-out strengths of 10 MPa, increasing to 18 MPa with the indirect technique. The use of the indirect bonding protocol with an acrylic point to compensate for polymerization stresses appears to be a viable means for optimizing bond strengths of endodontic filling materials to radicular dentin.

  13. Numerical Strategies for Stroke Optimization of Axisymmetric Microswimmers

    CERN Document Server

    Alouges, François; Heltai, Luca

    2009-01-01

    We propose a computational method to solve optimal swimming problems, based on the boundary integral formulation of the hydrodynamic interaction between swimmer and surrounding fluid and direct constrained minimization of the energy consumed by the swimmer. We apply our method to axisymmetric model examples. We consider a classical model swimmer (the three-sphere swimmer of Golestanian et al.) as well as a novel axisymmetric swimmer inspired by the observation of biological micro-organisms.

  14. Exploring optimal fertigation strategies for orange production, using soil-crop modelling

    NARCIS (Netherlands)

    Qin, Wei; Heinen, Marius; Assinck, Falentijn B.T.; Oenema, Oene

    2016-01-01

    Water and nitrogen (N) are two key limiting factors in orange (Citrus sinensis) production. The amount and the timing of water and N application are critical, but optimal strategies have not yet been well established. This study presents an analysis of 47 fertigation strategies examined by a coup

  15. Optimal marker-strategy clinical trial design to detect predictive markers for targeted therapy.

    Science.gov (United States)

    Zang, Yong; Liu, Suyu; Yuan, Ying

    2016-07-01

    In developing targeted therapy, the marker-strategy design (MSD) provides an important approach to evaluate the predictive marker effect. This design first randomizes patients into non-marker-based or marker-based strategies. Patients allocated to the non-marker-based strategy are then further randomized to receive either the standard or targeted treatments, while patients allocated to the marker-based strategy receive treatments based on their marker statuses. Little research has been done on the statistical properties of the MSD, which has led to some widespread misconceptions and placed clinical researchers at high risk of using inefficient designs. In this article, we show that the commonly used between-strategy comparison has low power to detect the predictive effect and is valid only under a restrictive condition that the randomization ratio within the non-marker-based strategy matches the marker prevalence. We propose a Wald test that is generally valid and also uniformly more powerful than the between-strategy comparison. Based on that, we derive an optimal MSD that maximizes the power to detect the predictive marker effect by choosing the optimal randomization ratios between the two strategies and treatments. Our numerical study shows that using the proposed optimal designs can substantially improve the power of the MSD to detect the predictive marker effect. We use a lung cancer trial to illustrate the proposed optimal designs.

  16. Integrated emission management strategy for cost-optimal engine-aftertreatment operation

    NARCIS (Netherlands)

    Cloudt, R.P.M.; Willems, F.P.T.

    2011-01-01

    A new cost-based control strategy is presented that optimizes engine-aftertreatment performance under all operating conditions. This Integrated Emission Management strategy minimizes fuel consumption within the set emission limits by on-line adjustment of air management based on the actual state of

  17. Optimization strategies with resource scarcity: From immunization of networks to the traveling salesman problem

    Science.gov (United States)

    Bellingeri, Michele; Agliari, Elena; Cassi, Davide

    2015-10-01

    The best strategy to immunize a complex network is usually evaluated in terms of the percolation threshold, i.e. the number of vaccine doses which make the largest connected cluster (LCC) vanish. The strategy inducing the minimum percolation threshold represents the optimal way to immunize the network. Here we show that the efficacy of the immunization strategies can change during the immunization process. This means that, if the number of doses is limited, the best strategy is not necessarily the one leading to the smallest percolation threshold. This outcome should warn about the adoption of global measures in order to evaluate the best immunization strategy.
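
    A minimal version of the kind of experiment described above can be run with NetworkX on a synthetic network (an arbitrary Barabási-Albert graph here, not the networks studied in the paper): nodes are immunized in descending order of initial degree, and the largest connected cluster is tracked after every dose so that strategy efficacy under a limited budget can be read off directly.

    ```python
    # Sketch: static degree-based immunization on a synthetic scale-free graph,
    # tracking the largest connected cluster (LCC) after each vaccinated node.
    import networkx as nx

    G = nx.barabasi_albert_graph(n=1000, m=3, seed=1)      # arbitrary test network
    order = sorted(G.nodes, key=G.degree, reverse=True)    # hubs first (initial degrees)

    H = G.copy()
    lcc_sizes = []
    for node in order[:200]:                               # limited budget of 200 doses
        H.remove_node(node)
        lcc_sizes.append(len(max(nx.connected_components(H), key=len)))

    print("LCC after 50/100/200 doses:",
          lcc_sizes[49], lcc_sizes[99], lcc_sizes[199])
    ```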

  18. Optimization of white matter fiber tractography with diffusional kurtosis imaging.

    Science.gov (United States)

    Glenn, G Russell; Helpern, Joseph A; Tabesh, Ali; Jensen, Jens H

    2015-10-01

    Diffusional kurtosis imaging (DKI) is a clinically feasible diffusion MRI technique for white matter (WM) fiber tractography (FT) with the ability to directly resolve intra-voxel crossing fibers by means of the kurtosis diffusion orientation distribution function (dODF). Here we expand on previous work by exploring properties of the kurtosis dODF and their subsequent effects on WM FT for in vivo human data. For comparison, the results are contrasted with fiber bundle orientation estimates provided by the diffusion tensor, which is the primary quantity obtained from diffusion tensor imaging. We also outline an efficient method for performing DKI-based WM FT that can substantially decrease the computational requirements. The recommended method for implementing the kurtosis ODF is demonstrated to optimize the reproducibility and sensitivity of DKI for detecting crossing fibers while reducing the occurrence of non-physically-meaningful, negative values in the kurtosis dODF approximation. In addition, DKI-based WM FT is illustrated for different protocols differing in image acquisition times from 48 to 5.3 min.

  19. Lossless coding using predictors and VLCs optimized for each image

    Science.gov (United States)

    Matsuda, Ichiro; Shirai, Noriyuki; Itoh, Susumu

    2003-06-01

    This paper proposes an efficient lossless coding scheme for still images. The scheme utilizes an adaptive prediction technique where a set of linear predictors are designed for a given image and an appropriate predictor is selected from the set block-by-block. The resulting prediction errors are encoded using context-adaptive variable-length codes (VLCs). Context modeling, or adaptive selection of VLCs, is carried out pel-by-pel and the VLC assigned to each context is designed on a probability distribution model of the prediction errors. In order to improve coding efficiency, a generalized Gaussian function is used as the model for each context. Moreover, not only the predictors but also parameters of the probability distribution models are iteratively optimized for each image so that the coding rate of the prediction errors is minimized. Experimental results show that the proposed coding scheme attains comparable coding performance to the state-of-the-art TMW scheme with much lower complexity in the decoding process.

  20. An optimized transport-of-intensity solution for phase imaging

    Science.gov (United States)

    Banerjee, Partha; Basunia, Mahmudunnabi; Poon, Ting-Chung; Zhang, Hongbo

    2016-05-01

    The transport-of-intensity equation (TIE) is often used to determine the phase and amplitude profile of a complex object by monitoring the intensities at different distances of propagation or around the image plane. TIE results from the imaginary part of the paraxial wave equation and is equivalent to the conservation of energy. The real part of the paraxial wave equation gives the eikonal equation in the presence of diffraction. Since propagation of the optical field between different planes is governed by the (paraxial) wave equation, both real and imaginary parts need to be satisfied at every propagation plane. In this work, the solution of the TIE is optimized by using the real part of the paraxial wave equation as a constraint. This technique is applied to the more exact determination of imaging the induced phase of a liquid heated by a focused laser beam, which has been previously computed using TIE only. Retrieval of imaged phase using the TIE is performed by using the constraint that naturally arises from the real part of the paraxial wave equation.
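
    For reference, the standard (Teague-type) form of the transport-of-intensity equation discussed above relates the axial intensity derivative to the transverse phase profile; the constraint added in the paper comes from the real part of the paraxial wave equation, which is not reproduced here.

    ```latex
    % Transport-of-intensity equation: I = intensity, phi = phase,
    % k = 2*pi/lambda, and the gradient/divergence operators are transverse (x, y).
    \[
      \nabla_{\perp} \cdot \bigl( I(x,y,z)\, \nabla_{\perp} \varphi(x,y,z) \bigr)
        = -\,k\, \frac{\partial I(x,y,z)}{\partial z}
    \]
    ```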

  1. Optimal Coordinated Strategy Analysis for the Procurement Logistics of a Steel Group

    Directory of Open Access Journals (Sweden)

    Lianbo Deng

    2014-01-01

    Full Text Available This paper focuses on the optimization of an internal coordinated procurement logistics system in a steel group and the decision on the coordinated procurement strategy by minimizing the logistics costs. Considering the coordinated procurement strategy and the procurement logistics costs, the aim of the optimization model was to maximize the degree of quality satisfaction and to minimize the procurement logistics costs. The model was transformed into a single-objective model and solved using a simulated annealing algorithm. In the algorithm, the supplier of each subsidiary was selected according to the evaluation result for independent procurement. Finally, the effect of different parameters on the coordinated procurement strategy was analysed. The results showed that the coordinated strategy can clearly save procurement costs; that the strategy appears to be more cooperative when the quality requirement is not stricter; and that the coordinated costs have a strong effect on the coordinated procurement strategy.
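
    The abstract reports solving the single-objective reformulation with simulated annealing. The generic acceptance loop below (toy one-dimensional cost function and arbitrary cooling schedule, not the authors' procurement model) shows the basic mechanics of the method.

    ```python
    # Generic simulated-annealing loop on a toy cost function; illustrative only.
    import math
    import random

    random.seed(0)

    def cost(x):
        # stand-in for the combined quality-satisfaction / logistics-cost objective
        return (x - 3.0) ** 2 + 2.0 * math.sin(5.0 * x)

    x = random.uniform(-10.0, 10.0)
    best, T = x, 5.0
    for _ in range(5000):
        cand = x + random.gauss(0.0, 0.5)                  # neighbour move
        delta = cost(cand) - cost(x)
        if delta < 0 or random.random() < math.exp(-delta / T):
            x = cand                                       # accept better (or sometimes worse)
        if cost(x) < cost(best):
            best = x
        T *= 0.999                                         # geometric cooling

    print(f"best x = {best:.3f}, cost = {cost(best):.3f}")
    ```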

  2. Improved quantum-behaved particle swarm optimization with local search strategy

    Directory of Open Access Journals (Sweden)

    Maolong Xi

    2017-03-01

    Full Text Available Quantum-behaved particle swarm optimization, which was motivated by analysis of particle swarm optimization and quantum systems, has shown competitive performance in finding optimal solutions for many optimization problems compared to other evolutionary algorithms. To address the problem of premature convergence, a local search strategy is proposed to improve the performance of quantum-behaved particle swarm optimization. In the proposed local search strategy, a super particle is constructed as a collection of dimension information drawn from randomly selected particles in the swarm. The selection probability of each particle is determined by its fitness value: for minimization problems, the smaller a particle's fitness value, the higher its selection probability and the more information it contributes to constructing the super particle. In addition, in order to investigate the influence of different local search spaces on algorithm performance, four methods of computing the local search radius are applied in the local search strategy, yielding four variants of local-search quantum-behaved particle swarm optimization. Empirical studies on a suite of well-known benchmark functions are undertaken in order to make an overall performance comparison among the proposed methods and other quantum-behaved particle swarm optimization algorithms. The simulation results show that the proposed quantum-behaved particle swarm optimization variants have advantages over the original quantum-behaved particle swarm optimization.
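
    For readers unfamiliar with the baseline algorithm, the sketch below implements a plain QPSO position update on a toy objective (standard contraction-expansion schedule, made-up swarm settings); the super-particle local search proposed in the paper is not reproduced here.

    ```python
    # Minimal quantum-behaved PSO (QPSO) on a toy sphere function.
    # Standard position update only; the paper's local-search extension is omitted.
    import numpy as np

    rng = np.random.default_rng(0)

    def sphere(x):                                         # toy objective
        return float(np.sum(x ** 2))

    n, dim, iters = 30, 10, 500
    X = rng.uniform(-5.0, 5.0, (n, dim))
    pbest, pbest_val = X.copy(), np.array([sphere(x) for x in X])
    gbest = pbest[np.argmin(pbest_val)].copy()

    for t in range(iters):
        beta = 1.0 - 0.5 * t / iters                       # contraction-expansion coefficient
        mbest = pbest.mean(axis=0)                         # mean of personal bests
        phi = rng.random((n, dim))
        p = phi * pbest + (1.0 - phi) * gbest              # per-particle local attractors
        u = np.clip(rng.random((n, dim)), 1e-12, 1.0)
        sign = np.where(rng.random((n, dim)) < 0.5, 1.0, -1.0)
        X = p + sign * beta * np.abs(mbest - X) * np.log(1.0 / u)
        vals = np.array([sphere(x) for x in X])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = X[better], vals[better]
        gbest = pbest[np.argmin(pbest_val)].copy()

    print("best value found:", pbest_val.min())
    ```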

  3. Multiresolution strategies for the numerical solution of optimal control problems

    Science.gov (United States)

    Jain, Sachin

    There exist many numerical techniques for solving optimal control problems but less work has been done in the field of making these algorithms run faster and more robustly. The main motivation of this work is to solve optimal control problems accurately in a fast and efficient way. Optimal control problems are often characterized by discontinuities or switchings in the control variables. One way of accurately capturing the irregularities in the solution is to use a high resolution (dense) uniform grid. This requires a large amount of computational resources both in terms of CPU time and memory. Hence, in order to accurately capture any irregularities in the solution using a few computational resources, one can refine the mesh locally in the region close to an irregularity instead of refining the mesh uniformly over the whole domain. Therefore, a novel multiresolution scheme for data compression has been designed which is shown to outperform similar data compression schemes. Specifically, we have shown that the proposed approach results in fewer grid points in the grid compared to a common multiresolution data compression scheme. The validity of the proposed mesh refinement algorithm has been verified by solving several challenging initial-boundary value problems for evolution equations in 1D. The examples have demonstrated the stability and robustness of the proposed algorithm. The algorithm adapted dynamically to any existing or emerging irregularities in the solution by automatically allocating more grid points to the region where the solution exhibited sharp features and fewer points to the region where the solution was smooth. Thereby, the computational time and memory usage has been reduced significantly, while maintaining an accuracy equivalent to the one obtained using a fine uniform mesh. Next, a direct multiresolution-based approach for solving trajectory optimization problems is developed. The original optimal control problem is transcribed into a

  4. Imaging Tasks Scheduling for High-Altitude Airship in Emergency Condition Based on Energy-Aware Strategy

    Directory of Open Access Journals (Sweden)

    Li Zhimeng

    2013-01-01

    Full Text Available Aiming at the imaging task scheduling problem for a high-altitude airship in emergency conditions, programming models are constructed by analyzing the main constraints, taking the maximum task benefit and the minimum energy consumption as two optimization objectives. Firstly, a hierarchical architecture is adopted to convert this scheduling problem into three subproblems: task ranking, valuable task detection, and energy conservation optimization. Then, algorithms are designed for the subproblems, and their solutions correspond to a feasible solution, an efficient solution, and an optimized solution of the original problem, respectively. This paper gives a detailed introduction to the energy-aware optimization strategy, which can rationally adjust the airship’s cruising speed based on the distribution of task deadlines, so as to decrease the total energy consumption caused by cruising activities. Finally, the application results and comparison analysis show that the proposed strategy and algorithm are effective and feasible.

  5. Imaging tasks scheduling for high-altitude airship in emergency condition based on energy-aware strategy.

    Science.gov (United States)

    Zhimeng, Li; Chuan, He; Dishan, Qiu; Jin, Liu; Manhao, Ma

    2013-01-01

    Aiming at the imaging task scheduling problem for a high-altitude airship in emergency conditions, programming models are constructed by analyzing the main constraints, taking the maximum task benefit and the minimum energy consumption as two optimization objectives. Firstly, a hierarchical architecture is adopted to convert this scheduling problem into three subproblems: task ranking, valuable task detection, and energy conservation optimization. Then, algorithms are designed for the subproblems, and their solutions correspond to a feasible solution, an efficient solution, and an optimized solution of the original problem, respectively. This paper gives a detailed introduction to the energy-aware optimization strategy, which can rationally adjust the airship's cruising speed based on the distribution of task deadlines, so as to decrease the total energy consumption caused by cruising activities. Finally, the application results and comparison analysis show that the proposed strategy and algorithm are effective and feasible.

  6. Optimal vaccination strategies against vector-borne diseases

    DEFF Research Database (Denmark)

    Græsbøll, Kaare; Enøe, Claes; Bødker, Rene

    2014-01-01

    Using a process oriented semi-agent based model, we simulated the spread of Bluetongue virus by Culicoides, biting midges, between cattle in Denmark. We evaluated the minimum vaccination cover and minimum cost for eight different preventive vaccination strategies in Denmark. The simulation model...... replicates both a passive and active flight of midges between cattle distributed on pastures and cattle farms in Denmark. A seasonal abundance of midges and temperature dependence of biological processes were included in the model. The eight vaccination strategies were investigated under four different...... grazing conditions. Furthermore, scenarios were tested with three different index locations stratified for cattle density. The cheapest way to vaccinate cattle with a medium risk profile (less than 1000 total affected cattle) was to vaccinate cattle on pasture. Regional vaccination displayed better...

  7. Optimization of wind farm power production using innovative control strategies

    DEFF Research Database (Denmark)

    Duc, Thomas

    Wind energy has experienced a very significant growth and cost reduction over the past decade, and is now able to compete with conventional power generation sources. New concepts are currently investigated to decrease costs of production of electricity even further. Wind farm coordinated control...... is one of them; it is aimed at increasing the efficiency of a wind farm and decreasing the fatigue loads faced by wind turbines by reducing aerodynamic interactions between them. These objectives are achieved considering two different strategies: curtailing an upwind turbine to reduce the wind speed...... conditions. It is therefore not known to what extent these gains can be reproduced in a real wind farm where wind conditions are very fluctuating. The French national project SMARTEOLE constitutes one of the first attempts of implementing these strategies on a full scale wind farm. A ten month measurement...

  8. Optimal Demand Execution Strategy for the Defense Logistics Agency

    Science.gov (United States)

    2014-12-01

    [Figure: current PR generation demand, average of March through June 2014, showing current values and a 5-day moving average.] The peak demand for daily PR execution per buyer is approximately 23 per day. At nearly 16,000 PRs per day, each of...

  9. Optimization Strategy to Capitalize on the Romanian Tourism Potential

    OpenAIRE

    PhD Lecturer Dindire Laura; PhD Reader Dugan Silvia

    2010-01-01

    An important direction for improving the promotional activities carried out both by governmental and non-governmental decision-making bodies within the tourist services sector and by tourism firms, at both the national and international level, is the promotional strategy. Consisting in the mastery of obtaining the best results through organizing, coordination, prediction, communication and control activities, promotional management means knowing and understanding the internal and in...

  10. A digital processing strategy to optimize hearing aid outputs directly.

    Science.gov (United States)

    Blamey, Peter J; Martin, Lois F A; Fiket, Hayley J

    2004-01-01

    A new amplification strategy (ADRO), based on 64 independently operating channels, was compared with a nine-channel wide dynamic range compression strategy (WDRC). Open-platform in-the-ear hearing instruments were configured either with ADRO or the manufacturer's WDRC strategy. Twenty-two subjects with mild to moderate hearing loss took home the ADRO or WDRC hearing aids. After three weeks' acclimatization, the aids were evaluated using monosyllables in quiet at 50 to 65 dB SPL and sentences in eight-talker babble. The acclimatization and evaluation were repeated in the second phase of the balanced reverse-block blind experimental design. The ADRO program showed a statistically significant mean advantage of 7.85% word score (95% confidence interval 3.19% to 12.51%; p = 0.002) and 6.41% phoneme score for the monosyllables in quiet (95% confidence interval 2.03% to 10.79%; p = 0.006). A statistically significant advantage of 7.25% was also found for the ADRO program in background noise (95% confidence interval 1.95% to 12.55%; p = 0.010). The results are consistent with earlier data for listeners with moderate to severe hearing loss.

  11. Cellular scanning strategy for selective laser melting: Generating reliable, optimized scanning paths and processing parameters

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2015-01-01

    to generate optimized cellular scanning strategies and processing parameters, with an objective of reducing thermal asymmetries and mechanical deformations. The optimized scanning strategies are used for selective laser melting of the standard samples, and experimental and numerical results are compared....... gradients that occur during the process. While process monitoring and control of selective laser melting is an active area of research, establishing the reliability and robustness of the process still remains a challenge. In this paper, a methodology for generating reliable, optimized scanning paths...

  12. Dispositional optimism and coping strategies in patients with a kidney transplant.

    Science.gov (United States)

    Costa-Requena, Gemma; Cantarell-Aixendri, M Carmen; Parramon-Puig, Gemma; Serón-Micas, Daniel

    2014-01-01

     Dispositional optimism is a personal resource that determines the coping style and adaptive response to chronic diseases. The aim of this study was to assess the correlations between dispositional optimism and coping strategies in patients with recent kidney transplantation and evaluate the differences in the use of coping strategies in accordance with the level of dispositional optimism.  Patients who were hospitalised in the nephrology department were selected consecutively after kidney transplantation was performed. The evaluation instruments were the Life Orientation Test-Revised, and the Coping Strategies Inventory. The data were analysed with central tendency measures, correlation analyses and means were compared using Student’s t-test.   66 patients with a kidney transplant participated in the study. The coping styles that characterised patients with a recent kidney transplantation were Social withdrawal and Problem avoidance. Correlations between dispositional optimism and coping strategies were significant in a positive direction in Problem-solving (p<.05) and Cognitive restructuring (p<.01), and inversely with Self-criticism (p<.05). Differences in dispositional optimism created significant differences in the Self-Criticism dimension (t=2.58; p<.01).  Dispositional optimism scores provide differences in coping responses after kidney transplantation. Moreover, coping strategies may influence the patient’s perception of emotional wellbeing after kidney transplantation.

  13. Optimal dispatch strategy for the agile virtual power plant

    DEFF Research Database (Denmark)

    Petersen, Mette Højgaard; Bendtsen, Jan Dimon; Stoustrup, Jakob

    2012-01-01

    of perfect prediction is unrealistic. This paper therefore introduces the Agile Virtual Power Plant. The Agile Virtual Power Plant assumes that the base load production planning based on best available knowledge is already given, so imbalances cannot be predicted. Consequently the Agile Virtual Power Plant...... attempts to preserve maneuverability (stay agile) rather than optimize performance according to predictions. In this paper the imbalance compensation problem for an Agile Virtual Power Plant is formulated. It is proved formally, that when local units are power and energy constrained integrators a dispatch...

  14. OPTIMAL POWER ALLOCATION WITH AF AND SDF STRATEGIES IN DUAL-HOP COOPERATIVE MIMO NETWORKS

    Institute of Scientific and Technical Information of China (English)

    Xu Xiaorong; Zheng Baoyu; Zhang Jianwu

    2010-01-01

    Dual-hop cooperative Multiple-Input Multiple-Output (MIMO) network with multi-relay cooperative communication is introduced. Power allocation problem with Amplify-and-Forward (AF) and Selective Decode-and-Forward (SDF) strategies in multi-node scenario are formulated and solved respectively. Optimal power allocation schemes that maximize system capacity with AF strategy are presented. In addition, optimal power allocation methods that minimize asymptotic Symbol Error Rate (SER) with SDF cooperative protocol in multi-node scenario are also proposed. Furthermore, performance comparisons are provided in terms of system capacity and approximate SER. Numerical and simulation results confirm our theoretical analysis. It is revealed that maximum system capacity could be obtained when powers are allocated optimally with AF protocol, while minimization of system's SER could also be achieved with optimum power allocation in SDF strategy. In multi-node scenario, those optimal power allocation algorithms are superior to conventional equal power allocation schemes.

  15. Stochastic Optimal Wind Power Bidding Strategy in Short-Term Electricity Market

    DEFF Research Database (Denmark)

    Hu, Weihao; Chen, Zhe; Bak-Jensen, Birgitte

    2012-01-01

    minimization problem for trading wind power in the short-term electricity market is described, to help the wind power owners optimize their bidding strategy. Stochastic optimization and a Monte Carlo method are adopted to find the optimal bidding strategy for trading wind power in the short-term electricity....... Simulation results show that the stochastic optimal bidding strategy for trading wind power in the Danish short-term electricity market is an effective measure to maximize the revenue of the wind power owners.......Due to the fluctuating nature and non-perfect forecast of the wind power, the wind power owners are penalized for the imbalance costs of the regulation, when they trade wind power in the short-term liberalized electricity market. Therefore, in this paper a formulation of an imbalance cost...

  16. Stochastic Optimal Wind Power Bidding Strategy in Short-Term Electricity Market

    DEFF Research Database (Denmark)

    Hu, Weihao; Chen, Zhe; Bak-Jensen, Birgitte

    2012-01-01

    minimization problem for trading wind power in the short-term electricity market is described, to help the wind power owners optimize their bidding strategy. Stochastic optimization and a Monte Carlo method are adopted to find the optimal bidding strategy for trading wind power in the short-term electricity...... market in order to deal with the uncertainty of the regulation price, the activated regulation of the power system and the forecasted wind power generation. The Danish short-term electricity market and a wind farm in western Denmark are chosen as study cases due to the high wind power penetration here....... Simulation results show that the stochastic optimal bidding strategy for trading wind power in the Danish short-term electricity market is an effective measure to maximize the revenue of the wind power owners....

  17. Imaging and power generation strategies for Chandrayaan-1

    Indian Academy of Sciences (India)

    Ananth Krishna; N S Gopinath; N S Hegde; N K Malik

    2005-12-01

    The Chandrayaan-1 mission proposes to put a 550 kg lunarcraft into Geostationary Transfer Orbit (GTO) using the Polar Satellite Launch Vehicle (PSLV), which will subsequently be transferred into a 100 km circular lunar polar orbit for imaging purposes. In this paper, we describe certain aspects of mission strategies which will allow optimum power generation and imaging of the lunar surface. The lunar orbit considered is circular and polar and therefore nearly perpendicular to the ecliptic plane. Unlike an Earth orbiting remote sensing satellite, the orbit plane of a lunar orbiter is inertially fixed as a consequence of the very small oblateness of the Moon. The Earth rotates around the Sun once a year, resulting in an apparent motion of the Sun around this orbit in a year. Two extreme situations can be identified concerning the solar illumination of the lunar orbit: the noon/midnight orbit, where the Sun vector is parallel to the spacecraft orbit plane, and the dawn/dusk orbit, where the Sun vector is perpendicular to the spacecraft orbit plane. This scenario directly affects the solar panel configuration. In case the solar panels are not canted, during the noon/midnight orbit 100% power is generated, whereas during the dawn/dusk orbit zero power is generated. Hence, for optimum power generation, canting of the panels is essential. Detailed analysis was carried out to fix the optimum canting and also to determine a strategy to maintain optimum power generation throughout the year. The analysis led to the strategy of a 180° yaw rotation at noon/midnight orbits and flipping the solar panel by 180° at dawn/dusk orbits. This also resulted in the negative pitch face of the lunarcraft being an anti-sun panel, which is very useful for thermal design, and further helps to meet the cooling requirements of the spectrometers. In principle the Moon's surface can be imaged in 28 days, because the orbit chosen and the payload swath provide adequate overlap. However, in reality it is not possible to complete the imaging in 28 days

  18. An Optimal Strategy for Accurate Bulge-to-disk Decomposition of Disk Galaxies

    Science.gov (United States)

    Gao, Hua; Ho, Luis C.

    2017-08-01

    The development of two-dimensional (2D) bulge-to-disk decomposition techniques has shown their advantages over traditional one-dimensional (1D) techniques, especially for galaxies with non-axisymmetric features. However, the full potential of 2D techniques has yet to be fully exploited. Secondary morphological features in nearby disk galaxies, such as bars, lenses, rings, disk breaks, and spiral arms, are seldom accounted for in 2D image decompositions, even though some image-fitting codes, such as GALFIT, are capable of handling them. We present detailed, 2D multi-model and multi-component decomposition of high-quality R-band images of a representative sample of nearby disk galaxies selected from the Carnegie-Irvine Galaxy Survey, using the latest version of GALFIT. The sample consists of five barred and five unbarred galaxies, spanning Hubble types from S0 to Sc. Traditional 1D decomposition is also presented for comparison. In detailed case studies of the 10 galaxies, we successfully model the secondary morphological features. Through a comparison of best-fit parameters obtained from different input surface brightness models, we identify morphological features that significantly impact bulge measurements. We show that nuclear and inner lenses/rings and disk breaks must be properly taken into account to obtain accurate bulge parameters, whereas outer lenses/rings and spiral arms have a negligible effect. We provide an optimal strategy to measure bulge parameters of typical disk galaxies, as well as prescriptions to estimate realistic uncertainties of them, which will benefit subsequent decomposition of a larger galaxy sample.

  19. How to coadd images? I. Optimal source detection and photometry using ensembles of images

    CERN Document Server

    Zackay, Barak

    2015-01-01

    Stacks of digital astronomical images are combined in order to increase image depth. The variable seeing conditions, sky background and transparency of ground-based observations make the coaddition process non-trivial. We present image coaddition methods optimized for source detection and flux measurement, that maximize the signal-to-noise ratio (S/N). We show that for these purposes the best way to combine images is to apply a matched filter to each image using its own point spread function (PSF) and only then to sum the images with the appropriate weights. Methods that either match filter after coaddition, or perform PSF homogenization prior to coaddition will result in loss of sensitivity. We argue that our method provides an increase of between a few and 25 percent in the survey speed of deep ground-based imaging surveys compared with weighted coaddition techniques. We demonstrate this claim using simulated data as well as data from the Palomar Transient Factory data release 2. We present a variant of thi...

  20. Optimizing Gear Shifting Strategy for Off-Road Vehicle with Dynamic Programming

    Directory of Open Access Journals (Sweden)

    Xinxin Zhao

    2014-01-01

    Full Text Available The gear shifting strategy of a vehicle is an important aid for achieving good dynamic performance and high fuel economy. A dynamic programming (DP) algorithm is used to optimize the gear shifting schedule for an off-road vehicle by using an objective function that weighs fuel use and trip time. The optimization is accomplished through discrete dynamic programming, and a trade-off between trip time and fuel consumption is analyzed. Using concave and convex road surfaces as the road profile, an optimal gear shifting strategy is used to control the longitudinal behavior of the vehicle. Simulation results show that the trip time can be reduced with a power-oriented gear shifting strategy, and high fuel economy can be achieved with an economical gear shifting strategy, under different initial conditions and route cases.
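
    The backward recursion below is a toy version of the dynamic program described above: road segments and gears are discretized, fuel and time costs per segment come from made-up tables rather than a vehicle model, and a fixed penalty discourages unnecessary shifts. It is meant only to show how the weighted fuel/time objective enters the recursion.

    ```python
    # Toy dynamic-programming gear schedule over discrete road segments.
    # Fuel/time tables and weights are illustrative, not a vehicle model.
    import numpy as np

    n_seg, n_gear = 20, 5
    rng = np.random.default_rng(2)
    fuel = rng.uniform(0.5, 2.0, (n_seg, n_gear))    # fuel used on segment k in gear g
    t_seg = rng.uniform(1.0, 3.0, (n_seg, n_gear))   # time spent on segment k in gear g
    shift_penalty, w_fuel, w_time = 0.2, 1.0, 0.5

    # cost_to_go[k, g]: minimal remaining cost from segment k when arriving in gear g
    cost_to_go = np.zeros((n_seg + 1, n_gear))
    choice = np.zeros((n_seg, n_gear), dtype=int)
    for k in range(n_seg - 1, -1, -1):
        for g in range(n_gear):
            stage = (w_fuel * fuel[k] + w_time * t_seg[k]
                     + shift_penalty * (np.arange(n_gear) != g))   # cost of each candidate gear
            total = stage + cost_to_go[k + 1]
            choice[k, g] = int(np.argmin(total))
            cost_to_go[k, g] = float(total.min())

    gear, schedule = 0, []                           # start in first gear
    for k in range(n_seg):
        gear = choice[k, gear]
        schedule.append(gear + 1)
    print("gear sequence:", schedule)
    print("total weighted cost:", round(cost_to_go[0, 0], 3))
    ```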

  1. The importance of optical optimization in whole slide imaging (WSI) and digital pathology imaging.

    Science.gov (United States)

    Yagi, Yukako; Gilbertson, John R

    2008-07-15

    In the last 10 years, whole slide imaging (WSI) has seen impressive progress not only in image quality and scanning speed but also in the variety of systems available to pathologists. However, we have noticed that most systems have relatively simple optical axes and rely on software to optimize image quality and colour balance. While much can be done in software, this study examines the importance of optics, in particular optical filters, in WSI. Optical resolution is a function of the wavelength of light used and the numerical aperture (NA) of the lens system (resolution = wavelength / (2 × NA)). When the illuminating light is not conditioned correctly with filters, there is a tendency for the wavelength to shift to longer values (more red) because of the characteristics of the lamps in common use. Most microscopes (but remarkably few WSI devices) correct for this with a neutral density (ND) filter for brightness and a blue filter (depending on the light source) for colour correction. Using H&E slides, research microscopes (Axiophot, Carl Zeiss MicroImaging, Inc., NY; Eclipse 50i, Nikon Inc., NY) at 20x, an attached digital camera (SPOT RT741 Slider Color, Diagnosis Instruments, MI, USA), and a filter set, we examined the effect of filters and software enhancement on digital image quality. The focus value (as evaluated by focus evaluation software developed in house and SPOT Imaging Software v4.6) was used as a proxy for image quality. Resolution of tissue features was best with the use of both the blue and ND filters (in addition to software enhancement). Images without filters but with software enhancement, while superficially good, lacked some details of specimen morphology and were unclear compared with the images with filters. The results indicate that the appropriate use of optical filters could measurably improve the appearance and resolution of WSI images.
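
    As a quick numeric check of the resolution relation quoted above (illustrative values, not measurements from the study): with green light and a typical high-NA 20x objective,

    ```latex
    % Illustrative only: lambda = 550 nm (green), NA = 0.75 (a typical 20x objective).
    \[
      r = \frac{\lambda}{2\,\mathrm{NA}}
        = \frac{550\ \mathrm{nm}}{2 \times 0.75}
        \approx 0.37\ \mu\mathrm{m},
      \qquad
      \lambda = 650\ \mathrm{nm}\ \text{(redder illumination)}
        \;\Rightarrow\; r \approx 0.43\ \mu\mathrm{m}.
    \]
    ```

    A shift of the effective illumination wavelength toward the red therefore measurably degrades the attainable resolution, which is the effect the blue filter counteracts.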

  2. Determination of an optimal control strategy for drug administration in tumor treatment using multi-objective optimization differential evolution.

    Science.gov (United States)

    Lobato, Fran Sérgio; Machado, Vinicius Silvério; Steffen, Valder

    2016-07-01

    The mathematical modeling of physical and biological systems represents an interesting alternative for studying the behavior of these phenomena. In this context, the development of mathematical models to simulate the dynamic behavior of tumors is an important current research theme. Among the advantages of using these models is their application to optimization and inverse problem approaches. Traditionally, the formulated Optimal Control Problem (OCP) has the objective of minimizing the size of the tumor cell population by the end of the treatment. In this case an important aspect is not considered, namely, that the optimal drug concentrations may significantly affect the patients' health. In this sense, the present work aims to obtain an optimal protocol for drug administration to patients with cancer, through the minimization of both the cancerous cell concentration and the prescribed drug concentration. This multi-objective problem is solved using the Multi-objective Optimization Differential Evolution (MODE) algorithm. The resulting Pareto curve supplies a set of optimal protocols from which an optimal strategy for drug administration can be chosen, according to a given criterion.
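
    A compact sketch of the dominance-based selection step at the core of a multi-objective differential evolution loop such as MODE, in Python; the DE constants and the toy pair of objectives (a tumour-burden proxy versus total drug dose) are illustrative assumptions, not the paper's model.

      import numpy as np

      rng = np.random.default_rng(0)

      def dominates(fa, fb):
          # fa Pareto-dominates fb (minimization): no worse in every objective, better in at least one.
          return np.all(fa <= fb) and np.any(fa < fb)

      def mode_generation(pop, objectives, F=0.5, CR=0.9, lo=0.0, hi=1.0):
          # One DE/rand/1/bin generation; a trial replaces its parent only if it dominates it.
          n, d = pop.shape
          new_pop = pop.copy()
          for i in range(n):
              idx = rng.choice([j for j in range(n) if j != i], size=3, replace=False)
              a, b, c = pop[idx]
              trial = np.clip(a + F * (b - c), lo, hi)
              cross = rng.random(d) < CR
              trial = np.where(cross, trial, pop[i])
              if dominates(objectives(trial), objectives(pop[i])):
                  new_pop[i] = trial
          return new_pop

      # Toy bi-objective for a 5-step dose schedule x: residual tumour proxy vs. total dose
      toy = lambda x: np.array([np.exp(-x.sum()), x.sum()])
      pop = rng.random((20, 5))
      for _ in range(50):
          pop = mode_generation(pop, toy)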

  3. Optimal inspection and Repair Strategies for Structural Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Faber, Michael Havbro

    1992-01-01

    A model for reliability-based repair and maintenance strategies of structural systems is described. The total expected costs in the lifetime of the structure are minimized with the number of inspections, the number and positions of the inspected points, the inspection efforts, the repair criteria...... to be inspected and to select the location of the points to be inspected. It is shown how information obtained through inspections and through the periods of normal operating of the structure can be used to update the inspection and maintenance planning. Finally, a small example is given illustrating...

  4. Generalized PSF modeling for optimized quantitation in PET imaging

    Science.gov (United States)

    Ashrafinia, Saeed; Mohy-ud-Din, Hassan; Karakatsanis, Nicolas A.; Jha, Abhinav K.; Casey, Michael E.; Kadrmas, Dan J.; Rahmim, Arman

    2017-06-01

    Point-spread function (PSF) modeling offers the ability to account for resolution degrading phenomena within the PET image generation framework. PSF modeling improves resolution and enhances contrast, but at the same time significantly alters image noise properties and induces an edge overshoot effect. Thus, studying the effect of PSF modeling on quantitation task performance can be very important. Frameworks explored in the past involved a dichotomy of PSF versus no-PSF modeling. By contrast, the present work focuses on quantitative performance evaluation of standard uptake value (SUV) PET images, while incorporating a wide spectrum of PSF models, including those that under- and over-estimate the true PSF, for the potential of enhanced quantitation of SUVs. The developed framework first analytically models the true PSF, considering a range of resolution degradation phenomena (including photon non-collinearity, inter-crystal penetration and scattering) as present in data acquisitions with modern commercial PET systems. In the context of oncologic liver FDG PET imaging, we generated 200 noisy datasets per image-set (with clinically realistic noise levels) using an XCAT anthropomorphic phantom with liver tumours of varying sizes. These were subsequently reconstructed using the OS-EM algorithm with varying PSF-modelled kernels. We focused on quantitation of both SUVmean and SUVmax, including assessment of contrast recovery coefficients, as well as noise-bias characteristics (including both image roughness and coefficient of variability), for different tumours/iterations/PSF kernels. It was observed that overestimated PSF yielded more accurate contrast recovery for a range of tumours, and typically improved quantitative performance. For a clinically reasonable number of iterations, edge enhancement due to PSF modeling (especially due to over-estimated PSF) was in fact seen to lower SUVmean bias in small tumours. Overall, the results indicate that exactly matched PSF

  5. Optimized Image Steganalysis through Feature Selection using MBEGA

    CERN Document Server

    Geetha, S

    2010-01-01

    Feature based steganalysis, an emerging branch in information forensics, aims at identifying the presence of a covert communication by employing the statistical features of the cover and stego image as clues/evidence. Due to the large volumes of security audit data as well as the complex and dynamic properties of steganogram behaviours, optimizing the performance of steganalysers becomes an important open problem. This paper focuses on fine-tuning the performance of six promising steganalysers in this field through feature selection. We propose to employ the Markov Blanket-Embedded Genetic Algorithm (MBEGA) for the stego-sensitive feature selection process. In particular, the embedded Markov blanket based memetic operators add or delete features (or genes) from a genetic algorithm (GA) solution so as to quickly improve the solution and fine-tune the search. Empirical results suggest that MBEGA is effective and efficient in eliminating irrelevant and redundant features based on both Markov blanket and predictive pow...

  6. Independent feature subspace iterative optimization based fuzzy clustering for synthetic aperture radar image segmentation

    Science.gov (United States)

    Yu, Hang; Xu, Luping; Feng, Dongzhu; He, Xiaochuan

    2015-01-01

    Synthetic aperture radar (SAR) image segmentation is investigated from feature extraction to algorithm design, which is characterized by two aspects: (1) multiple heterogeneous features are extracted to describe SAR images and the corresponding similarity measures are developed independently to avoid the mutual influences between different features in order to enhance the discriminability of the final similarity between objects. (2) A method called fuzzy clustering based on independent subspace iterative optimization (FCISIO) is proposed. FCISIO integrates multiple features into an objective function which is then iteratively optimized in each feature subspace to obtain final segmentation results. This strategy can protect the distribution structures of the data points in each feature subspace, which realizes an effective way to integrate multiple features of different properties. In order to improve the computation speed and the accuracy of feature description for FCISIO, we design a region merging algorithm before FCISIO which can use many kinds of information to quickly merge regions inside the true segments. Experiments on synthetic and real SAR images show that the proposed method is effective and robust and can obtain good segmentation results with a very short running time.

  7. Optimal processing for gel electrophoresis images: Applying Monte Carlo Tree Search in GelApp.

    Science.gov (United States)

    Nguyen, Phi-Vu; Ghezal, Ali; Hsueh, Ya-Chih; Boudier, Thomas; Gan, Samuel Ken-En; Lee, Hwee Kuan

    2016-08-01

    In biomedical research, gel band size estimation in electrophoresis analysis is a routine process. To facilitate and automate this process, numerous software tools have been released, notably the GelApp mobile app. However, the band detection accuracy is limited due to a band detection algorithm that cannot adapt to the variations in input images. To address this, we used the Monte Carlo Tree Search with Upper Confidence Bound (MCTS-UCB) method to efficiently search for optimal image processing pipelines for the band detection task, thereby improving the segmentation algorithm. Incorporating this into GelApp, we report a significant enhancement of gel band detection accuracy by 55.9 ± 2.0% for protein polyacrylamide gels, and 35.9 ± 2.5% for DNA SYBR green agarose gels. This implementation is a proof-of-concept in demonstrating MCTS-UCB as a strategy to optimize general image segmentation. The improved version of GelApp, GelApp 2.0, is freely available on both the Google Play Store (for the Android platform) and the Apple App Store (for the iOS platform). © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
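
    A minimal sketch of the UCB1 rule that MCTS-UCB uses to decide which image-processing operation to try next while searching pipelines, in Python; the candidate operation names and their running statistics are hypothetical placeholders.

      import math

      def ucb1(mean_score, visits, parent_visits, c=1.4):
          # Upper Confidence Bound: exploit high mean scores, explore rarely tried operations.
          if visits == 0:
              return float("inf")
          return mean_score + c * math.sqrt(math.log(parent_visits) / visits)

      def select_operation(stats, parent_visits):
          # stats: {operation_name: (mean_band_detection_score, visits)}
          return max(stats, key=lambda op: ucb1(stats[op][0], stats[op][1], parent_visits))

      stats = {"median_blur": (0.62, 10), "adaptive_threshold": (0.71, 4), "morph_open": (0.55, 2)}
      print(select_operation(stats, parent_visits=16))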

  8. Determining Optimal Fluorescent Agent Concentrations in Dental Adhesive Resins for Imaging the Tooth/Restoration Interface.

    Science.gov (United States)

    Bim Júnior, Odair; Cebim, Marco A; Atta, Maria T; Machado, Camila M; Francisconi-Dos-Rios, Luciana F; Wang, Linda

    2017-02-01

    Fluorescent dyes like Rhodamine B (RB) have been used to identify the spatial distribution of adhesive restorative materials in the tooth/restoration interface. Potential effects of the addition of RB to dental adhesives were addressed in the past, but no further information is available on how to determine suitable concentrations of RB in these bonding agents for imaging in the confocal laser scanning microscope. This study provides systematic strategies for adding RB to viscous dental adhesive resins, focusing on the determination of the lowest range of dye concentrations necessary to achieve an acceptable image of the dentin/adhesive interface. It was demonstrated that optimized images of the resin distribution in dentin can be produced with 0.1-0.02 mg/mL of RB in the (tested) adhesives. Our approach took into account the dye concentration, photophysical parameters in different host media, and specimen composition and morphology, in order to develop a rational use of the fluorescent agent with the resin-based materials. Information gained from this work can help optimize labeling methods using dispersions of low-molecular-weight dyes in different monomer blend systems.

  9. Optimal combined purchasing strategies for a risk-averse manufacturer under price uncertainty

    Directory of Open Access Journals (Sweden)

    Qiao Wu

    2015-09-01

    Full Text Available Purpose: The purpose of our paper is to analyze optimal purchasing strategies when a manufacturer can buy raw materials from a long-term contract supplier and a spot market under spot price uncertainty. Design/methodology/approach: This procurement model can be solved by using dynamic programming. First, we maximize the decision maker's (DM's) utility for the second period, obtaining the optimal contract quantity and spot quantity for the second period. Then we maximize the DM's utility over both periods, obtaining the optimal purchasing strategy for the first period. We use a numerical method to compare the performance level of a pure spot sourcing strategy with that of a mixed strategy. Findings: Our results show that optimal purchasing strategies vary with the trend of contract prices. If the contract price falls, the total quantity purchased in period 1 will decrease with the degree of risk aversion. If the contract price increases, the total quantity purchased in period 1 will increase with the degree of risk aversion. The relationship between the optimal contract quantity and the degree of risk aversion depends on whether the expected spot price or the contract price is larger in period 2. Finally, we compare the performance levels of a combined strategy and a spot sourcing strategy. It shows that a combined strategy is optimal for a risk-averse buyer. Originality/value: It is challenging to deal with a two-period procurement problem with risk considerations. We have obtained results for a two-period procurement problem with two sourcing options, namely contract procurement and spot purchases. Our model incorporates the buyer's risk aversion factor and the change of contract prices, which are not addressed in earlier studies.

  10. Distributed optimization using virtual and real game strategies for multi-criterion aerodynamic design

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    This paper introduces the virtual and real game concepts to investigate multi-criterion optimization for optimum shape design in aerodynamics. The constrained adjoint methodology is used as the basic optimizer. Furthermore, the above is combined with the virtual and real game strategies to treat single-point/multi-point airfoil optimization. In a symmetric Nash Game, each optimizer attempts to optimize one's own target with exchange of symmetric information with others. A Nash equilibrium is just the compromised solution among the multiple criteria. Several kinds of airfoil splitting and design cases are shown for the utility of virtual and real game strategies in aerodynamic design. Successful design results confirm the validity and efficiency of the present design method.

  11. Footprints of Optimal Protein Assembly Strategies in the Operonic Structure of Prokaryotes

    Directory of Open Access Journals (Sweden)

    Jan Ewald

    2015-04-01

    Full Text Available In this work, we investigate optimality principles behind synthesis strategies for protein complexes using a dynamic optimization approach. We show that the cellular capacity of protein synthesis has a strong influence on optimal synthesis strategies reaching from a simultaneous to a sequential synthesis of the subunits of a protein complex. Sequential synthesis is preferred if protein synthesis is strongly limited, whereas a simultaneous synthesis is optimal in situations with a high protein synthesis capacity. We confirm the predictions of our optimization approach through the analysis of the operonic organization of protein complexes in several hundred prokaryotes. Thereby, we are able to show that cellular protein synthesis capacity is a driving force in the dissolution of operons comprising the subunits of a protein complex. Thus, we also provide a tested hypothesis explaining why the subunits of many prokaryotic protein complexes are distributed across several operons despite the presumably less precise co-regulation.

  12. Distributed optimization using virtual and real game strategies for multi-criterion aerodynamic design

    Institute of Scientific and Technical Information of China (English)

    TANG ZhiLi; BAI Wen; DONG Jun

    2008-01-01

    This paper introduces the virtual and real game concepts to investigate multi-criterion optimization for optimum shape design in aerodynamics. The constrained adjoint methodology is used as the basic optimizer. Furthermore, the above is combined with the virtual and real game strategies to treat single-point/multi-point airfoil optimization. In a symmetric Nash Game, each optimizer attempts to optimize one's own target with exchange of symmetric information with others. A Nash equilibrium is just the compromised solution among the multiple criteria. Several kinds of airfoil splitting and design cases are shown for the utility of virtual and real game strategies in aerodynamic design. Successful design results confirm the validity and efficiency of the present design method.

  13. Brillouin endoscope, design and optimization strategies (Conference Presentation)

    Science.gov (United States)

    Xiang, Yuchen; Song, ChengZe; Wadsworth, William J.; Paterson, Carl; Török, Peter; Kabakova, Irina V.

    2017-02-01

    Brillouin imaging has recently emerged as a powerful technique for its ability to give insight into the mechanical properties of biomaterials. It exploits inelastic scattering of light by acoustic vibrations and maps the tissue stiffness point by point with micron resolution. The non-invasive, real-time nature of the measurements also makes it a potent candidate for in-vivo imaging of live cells and tissues. This, however, has to rely on a compact and flexible apparatus, a Brillouin endoscope, for remote access to specimen parts. One of the main challenges encountered in the construction of a Brillouin endoscope is that the inelastic scattering in the fibre conduit itself is orders of magnitude stronger than the Brillouin signal scattered by the specimen. This is because the length of the fibre endoscope (meters) is orders of magnitude larger than the imaging volume (microns). The problem can be overcome if the scattered light is collected by a separate fibre and does not mix with the fibre scattering inside the delivery channel. Here we present an all-fibre integrated Brillouin microspectroscopy system that exploits the path separation between the delivery and collection channels. The experimental setup consists of a pair of standard silica single-mode fibres coupled to a graded-index lens and illuminated with a 671 nm continuum wavelength source. We test our system performance on liquid samples of water and ethanol and confirm Brillouin shifts of 5.9 GHz and 4.6 GHz, respectively. More importantly, we do not observe any signals corresponding to a Brillouin shift in the fibre, in agreement with expectation.

  14. Optimization of remediation strategies using vadose zone monitoring systems

    Science.gov (United States)

    Dahan, Ofer

    2016-04-01

    In-situ bio-remediation of the vadose zone depends mainly on the ability to change the subsurface hydrological, physical and chemical conditions in order to enable development of specific, indigenous, pollutant-degrading bacteria. As such, the remediation efficiency depends greatly on the ability to implement optimal hydraulic and chemical conditions in deep sections of the vadose zone. These conditions are usually determined in laboratory experiments where parameters such as the chemical composition of the soil water solution, redox potential and water content of the sediment are fully controlled. Usually, implementation of the desired optimal degradation conditions in the deep vadose zone at full-scale field setups is achieved through infiltration of water enriched with chemical additives at the land surface. It is assumed that deep percolation into the vadose zone would create chemical conditions that promote biodegradation of specific compounds. However, application of water with specific chemical conditions near the land surface does not necessarily result in the desired chemical and hydraulic conditions in deep sections of the vadose zone. A vadose-zone monitoring system (VMS) that was recently developed allows continuous monitoring of the hydrological and chemical properties of deep sections of the unsaturated zone. The VMS includes flexible time-domain reflectometry (FTDR) probes which allow continuous monitoring of the temporal variation of the vadose zone water content, and vadose-zone sampling ports (VSPs) which are designed to allow frequent sampling of the sediment pore-water and gas at multiple depths. Implementation of the vadose zone monitoring system in sites that undergo active remediation provides real-time information on the actual chemical and hydrological conditions in the vadose zone as the remediation process progresses. To date, the system has been successfully implemented in several studies on water flow and contaminant transport in

  15. Signal/noise optimization strategies for stochastically estimated correlation functions

    CERN Document Server

    Detmold, William

    2014-01-01

    Numerical studies of quantum field theories usually rely upon an accurate determination of stochastically estimated correlation functions in order to extract information about the spectrum of the theory and matrix elements of operators. The reliable determination of such correlators is often hampered by an exponential degradation of signal/noise at late time separations. We demonstrate that it is sometimes possible to achieve significant enhancements of signal/noise by appropriately optimizing correlators with respect to the source and sink interpolating operators, and highlight the large range of possibilities that are available for this task. The ideas are discussed for both a toy model, and single hadron correlators in the context of quantum chromodynamics.

  16. Optimal control strategy of malaria vector using genetically modified mosquitoes.

    Science.gov (United States)

    Rafikov, M; Bevilacqua, L; Wyse, A P P

    2009-06-07

    The development of transgenic mosquitoes that are resistant to diseases may provide a new and effective weapon for disease control. Such an approach relies on transgenic mosquitoes being able to survive and compete with wild-type populations. These transgenic mosquitoes carry a specific code that inhibits the plasmodium evolution in their organism. It is said that this characteristic is hereditary and consequently the disease fades away after some time. Once transgenic mosquitoes are released, interactions between the two populations and inter-specific mating between the two types of mosquitoes take place. We present a mathematical model that considers generation overlapping and variable environmental factors. Based on this continuous model, the malaria vector control is formulated and solved as an optimal control problem, indicating how genetically modified mosquitoes should be introduced in the environment. Numerical simulations show the effectiveness of the proposed control.

  17. A Parameter Communication Optimization Strategy for Distributed Machine Learning in Sensors

    Directory of Open Access Journals (Sweden)

    Jilin Zhang

    2017-09-01

    Full Text Available In order to utilize the distributed characteristic of sensors, distributed machine learning has become the mainstream approach, but the different computing capability of sensors and network delays greatly influence the accuracy and the convergence rate of the machine learning model. Our paper describes a reasonable parameter communication optimization strategy to balance the training overhead and the communication overhead. We extend the fault tolerance of iterative-convergent machine learning algorithms and propose the Dynamic Finite Fault Tolerance (DFFT). Based on the DFFT, we implement a parameter communication optimization strategy for distributed machine learning, named Dynamic Synchronous Parallel Strategy (DSP), which uses the performance monitoring model to dynamically adjust the parameter synchronization strategy between worker nodes and the Parameter Server (PS). This strategy makes full use of the computing power of each sensor, ensures the accuracy of the machine learning model, and avoids the situation that the model training is disturbed by any tasks unrelated to the sensors.

  18. A Parameter Communication Optimization Strategy for Distributed Machine Learning in Sensors.

    Science.gov (United States)

    Zhang, Jilin; Tu, Hangdi; Ren, Yongjian; Wan, Jian; Zhou, Li; Li, Mingwei; Wang, Jue; Yu, Lifeng; Zhao, Chang; Zhang, Lei

    2017-09-21

    In order to utilize the distributed characteristic of sensors, distributed machine learning has become the mainstream approach, but the different computing capability of sensors and network delays greatly influence the accuracy and the convergence rate of the machine learning model. Our paper describes a reasonable parameter communication optimization strategy to balance the training overhead and the communication overhead. We extend the fault tolerance of iterative-convergent machine learning algorithms and propose the Dynamic Finite Fault Tolerance (DFFT). Based on the DFFT, we implement a parameter communication optimization strategy for distributed machine learning, named Dynamic Synchronous Parallel Strategy (DSP), which uses the performance monitoring model to dynamically adjust the parameter synchronization strategy between worker nodes and the Parameter Server (PS). This strategy makes full use of the computing power of each sensor, ensures the accuracy of the machine learning model, and avoids the situation that the model training is disturbed by any tasks unrelated to the sensors.

  19. Approximate representation of optimal strategies from influence diagrams

    DEFF Research Database (Denmark)

    Jensen, Finn V.

    2008-01-01

    of the advantages of influence diagrams (IDs) is that for small decision problems, the distinction between phases does not confront the decision maker with a problem; when the problem has been properly specified, the solution algorithms are so efficient that the ID can also be used as an on-line representation......, and where the policy functions for the decisions have so large do- mains that they cannot be represented directly in a strategy tree. The approach is to have separate ID representations for each decision variable. In each representation the actual information is fully exploited, however the representation...... of policies for future decisions are approximations. We call the approximation information abstraction. It consists in introducing a dummy structure connecting the past with the decision. We study how to specify, implement and learn information abstraction....

  20. Optimal swimming strategies in mate searching pelagic copepods

    DEFF Research Database (Denmark)

    Kiørboe, Thomas

    2008-01-01

    Male copepods must swim to find females, but swimming increases the risk of meeting predators and is expensive in terms of energy expenditure. Here I address the trade-offs between gains and risks and the question of how much and how fast to swim using simple models that optimise the number...... of lifetime mate encounters. Radically different swimming strategies are predicted for different feeding behaviours, and these predictions are tested experimentally using representative species. In general, male swimming speeds and the difference in swimming speeds between the genders are predicted...... and observed to increase with increasing conflict between mate searching and feeding. It is high in ambush feeders, where searching (swimming) and feeding are mutually exclusive and low in species, where the matured males do not feed at all. Ambush feeding males alternate between stationary ambush feeding...

  1. [Plastid genome engineering: novel optimization strategies and applications].

    Science.gov (United States)

    Zhou, Fei; Lu, Shizhan; Gao, Liang; Zhang, Juanjuan; Lin, Yongjun

    2015-08-01

    The plastid genome engineering system allows site-specific modifications via two homologous recombination events. It is much safer, more precise and more efficient compared with the nuclear transformation system. This technology can be applied to basic research to expand plastid genome function analysis, and it also provides an excellent platform not only for high-level production of recombinant proteins but also for plant breeding. In this review, we summarize the state of the art and progress in this field. We focus on novel strategies for improving the transformation system and new tools to enhance plastid transgene expression levels. In addition, we highlight selected applications in resistance engineering and quality improvement via metabolic engineering. We believe that overcoming the current technological limitations of the plastid transformation system could usher in another green revolution for crop breeding.

  2. Wake Mitigation Strategies for Optimizing Wind Farm Power Production

    Science.gov (United States)

    Dilip, Deepu; Porté-Agel, Fernando

    2016-04-01

    Although wind turbines are designed individually for optimum power production, they are often arranged into groups of closely spaced turbines in a wind farm rather than in isolation. Consequently, most turbines in a wind farm do not operate in unobstructed wind flows, but are affected by the wakes of turbines in front of them. Such wake interference significantly reduces the overall power generation from wind farms and hence, development of effective wake mitigation strategies is critical for improving wind farm efficiency. One approach towards this end is based on the notion that the operation of each turbine in a wind farm at its optimum efficiency might not lead to optimum power generation from the wind farm as a whole. This entails a down regulation of individual turbines from its optimum operating point, which can be achieved through different methods such as pitching the turbine blades, changing the turbine tip speed ratio or yawing of the turbine, to name a few. In this study, large-eddy simulations of a two-turbine arrangement with the second turbine fully in the wake of the first are performed. Different wake mitigation techniques are applied to the upstream turbine, and the effects of these on its wake characteristics are investigated. Results for the combined power from the two turbines for each of these methods are compared to a baseline scenario where no wake mitigation strategies are employed. Analysis of the results shows the potential for improved power production from such wake control methods. It should be noted, however, that the magnitude of the improvement is strongly affected by the level of turbulence in the incoming atmospheric flow.

  3. Optimal illumination and wave form design for imaging in random media.

    Science.gov (United States)

    Borcea, Liliana; Papanicolaou, George; Tsogka, Chrysoula

    2007-12-01

    The problem of optimal illumination for selective array imaging of small and not well separated scatterers in clutter is considered. The imaging algorithms introduced are based on the coherent interferometric (CINT) imaging functional, which can be viewed as a smoothed version of travel-time migration. The smoothing gives statistical stability to the image but it also causes blurring. The trade-off between statistical stability and blurring is optimized with an adaptive version of CINT. The algorithm for optimal illumination and for selective array imaging uses CINT. It is a constrained optimization problem that is based on the quality of the image obtained with adaptive CINT. The resulting optimal illuminations and selectivity improve the resolution of the images significantly, as can be seen in the numerical simulations presented in the paper.

  4. An optimizing start-up strategy for a bio-methanator.

    Science.gov (United States)

    Sbarciog, Mihaela; Loccufier, Mia; Vande Wouwer, Alain

    2012-05-01

    This paper presents an optimizing start-up strategy for a bio-methanator. The goal of the control strategy is to maximize the outflow rate of methane in anaerobic digestion processes, which can be described by a two-population model. The methodology relies on a thorough analysis of the system dynamics and involves the solution of two optimization problems: steady-state optimization for determining the optimal operating point, and transient optimization. The latter is a classical optimal control problem, which can be solved using the maximum principle of Pontryagin. The proposed control law is of the bang-bang type. The process is driven from an initial state to a small neighborhood of the optimal steady state by switching the manipulated variable (dilution rate) from the minimum to the maximum value at a certain time instant. Then the dilution rate is set to the optimal value and the system settles down in the optimal steady state. This control law ensures the convergence of the system to the optimal steady state and substantially increases its stability region. The region of attraction of the steady state corresponding to maximum production of methane is considerably enlarged. In some cases, which are related to the possibility of selecting the minimum dilution rate below a certain level, the stability region of the optimal steady state equals the interior of the state space. Aside from its efficiency, which is evaluated not only in terms of biogas production but also from the perspective of treatment of the organic load, the strategy is also characterized by simplicity, thus being appropriate for implementation in real-life systems. Another important advantage is its generality: this technique may be applied to any anaerobic digestion process for which the acidogenesis and methanogenesis are, respectively, characterized by Monod and Haldane kinetics.

  5. A Particle Swarm Optimization Variant with an Inner Variable Learning Strategy

    Directory of Open Access Journals (Sweden)

    Guohua Wu

    2014-01-01

    Full Text Available Although Particle Swarm Optimization (PSO) has demonstrated competitive performance in solving global optimization problems, it exhibits some limitations when dealing with optimization problems with high dimensionality and complex landscape. In this paper, we integrate some problem-oriented knowledge into the design of a certain PSO variant. The resulting novel PSO algorithm with an inner variable learning strategy (PSO-IVL) is particularly efficient for optimizing functions with symmetric variables. Symmetric variables of the optimized function have to satisfy a certain quantitative relation. Based on this knowledge, the inner variable learning (IVL) strategy helps the particle to inspect the relation among its inner variables, determine the exemplar variable for all other variables, and then make each variable learn from the exemplar variable in terms of their quantitative relations. In addition, we design a new trap detection and jumping out strategy to help particles escape from local optima. The trap detection operation is employed at the level of individual particles whereas the trap jumping out strategy is adaptive in its nature. Experimental simulations completed for some representative optimization functions demonstrate the excellent performance of PSO-IVL. The effectiveness of PSO-IVL stresses the usefulness of augmenting evolutionary algorithms with problem-oriented domain knowledge.
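
    A bare-bones global-best PSO loop of the kind this variant builds on, in Python; the inner-variable-learning and trap-escape operators described above are not reproduced here, and the inertia and acceleration constants are conventional assumptions.

      import numpy as np

      def pso_minimize(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
          rng = np.random.default_rng(1)
          x = rng.uniform(lo, hi, (n_particles, dim))        # positions
          v = np.zeros_like(x)                               # velocities
          pbest = x.copy()
          pbest_val = np.apply_along_axis(f, 1, x)
          gbest = pbest[np.argmin(pbest_val)].copy()
          for _ in range(iters):
              r1, r2 = rng.random(x.shape), rng.random(x.shape)
              v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
              x = np.clip(x + v, lo, hi)
              vals = np.apply_along_axis(f, 1, x)
              better = vals < pbest_val
              pbest[better], pbest_val[better] = x[better], vals[better]
              gbest = pbest[np.argmin(pbest_val)].copy()
          return gbest, float(pbest_val.min())

      # Example on a function with symmetric variables (sphere): all variables play the same role.
      print(pso_minimize(lambda z: float(np.sum(z ** 2)), dim=5))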

  6. An Equivalent Emission Minimization Strategy for Causal Optimal Control of Diesel Engines

    Directory of Open Access Journals (Sweden)

    Stephan Zentner

    2014-02-01

    Full Text Available One of the main challenges during the development of operating strategies for modern diesel engines is the reduction of the CO2 emissions, while complying with ever more stringent limits for the pollutant emissions. The inherent trade-off between the emissions of CO2 and pollutants renders a simultaneous reduction difficult. Therefore, an optimal operating strategy is sought that yields minimal CO2 emissions, while holding the cumulative pollutant emissions at the allowed level. Such an operating strategy can be obtained offline by solving a constrained optimal control problem. However, the final-value constraint on the cumulated pollutant emissions prevents this approach from being adopted for causal control. This paper proposes a framework for causal optimal control of diesel engines. The optimization problem can be solved online when the constrained minimization of the CO2 emissions is reformulated as an unconstrained minimization of the CO2 emissions and the weighted pollutant emissions (i.e., equivalent emissions). However, the weighting factors are not known a priori. A method for the online calculation of these weighting factors is proposed. It is based on the Hamilton–Jacobi–Bellman (HJB) equation and a physically motivated approximation of the optimal cost-to-go. A case study shows that the causal control strategy defined by the online calculation of the equivalence factor and the minimization of the equivalent emissions is only slightly inferior to the non-causal offline optimization, while being applicable to online control.
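
    A sketch of the equivalent-emissions idea in Python: at each step the controller minimizes CO2 plus weighted pollutant emissions, and the weighting factor is adapted online; the simple feedback update and the candidate-cost interface shown here are assumptions standing in for the paper's HJB-derived calculation.

      def pick_control(candidates, alpha):
          # candidates: iterable of (control, co2_rate, pollutant_rate); minimize equivalent emissions.
          return min(candidates, key=lambda cand: cand[1] + alpha * cand[2])

      def update_alpha(alpha, cumulative_pollutant, pollutant_budget_so_far, gain=0.05):
          # Raise the pollutant weight when cumulative emissions run above budget, lower it when below.
          return max(0.0, alpha + gain * (cumulative_pollutant - pollutant_budget_so_far))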

  7. Multilevel Optimization Framework for Hierarchical Stiffened Shells Accelerated by Adaptive Equivalent Strategy

    Science.gov (United States)

    Wang, Bo; Tian, Kuo; Zhao, Haixin; Hao, Peng; Zhu, Tianyu; Zhang, Ke; Ma, Yunlong

    2016-09-01

    In order to improve the post-buckling optimization efficiency of hierarchical stiffened shells, a multilevel optimization framework accelerated by an adaptive equivalent strategy is presented in this paper. Firstly, the Numerical-based Smeared Stiffener Method (NSSM) for hierarchical stiffened shells is derived by means of the numerical implementation of asymptotic homogenization (NIAH) method. Based on the NSSM, a reasonable adaptive equivalent strategy for hierarchical stiffened shells is developed from the concept of hierarchy reduction. Its core idea is to self-adaptively decide which hierarchy of the structure should be made equivalent according to the critical buckling mode rapidly predicted by NSSM. Compared with the detailed model, the high prediction accuracy and efficiency of the proposed model are highlighted. On the basis of this adaptive equivalent model, a multilevel optimization framework is then established by decomposing the complex entire optimization process into major-stiffener-level and minor-stiffener-level sub-optimizations, during which Fixed Point Iteration (FPI) is employed to accelerate convergence. Finally, illustrative examples of the multilevel framework are carried out to demonstrate its efficiency and effectiveness in searching for the global optimum, in contrast with the single-level optimization method. Remarkably, the high efficiency and flexibility of the adaptive equivalent strategy is indicated by comparison with the single equivalent strategy.

  8. Mapping Aboveground Biomass in the Amazon Basin: Exploring Sensors, Scales, and Strategies for Optimal Data Linkage

    Science.gov (United States)

    Walker, W. S.; Baccini, A.

    2013-05-01

    Information on the distribution and density of carbon in tropical forests is critical to decision-making on a host of globally significant issues ranging from climate stabilization and biodiversity conservation to poverty reduction and human health. Encouraged by recent progress at both the international and jurisdictional levels on the design of incentive-based policy mechanisms to compensate tropical nations for maintaining their forests intact, governments throughout the tropics are moving with urgency to implement robust national and sub-national forest monitoring systems for operationally tracking and reporting on changes in forest cover and associated carbon stocks. Monitoring systems will be required to produce results that are accurate, consistent, complete, transparent, and comparable at sub-national to pantropical scales, and satellite-based remote sensing supported by field observations is widely accepted as the most objective and cost-effective solution. The effectiveness of any system for large-area forest monitoring will necessarily depend on the capacity of current and near-future Earth observation satellites to provide information that meets the requirements of developing monitoring protocols. However, important questions remain regarding the role that spatially explicit maps of aboveground biomass and carbon can play in IPCC-compliant forest monitoring systems, with the majority of these questions stemming from doubts about the inherent sensitivity of satellite data to aboveground forest biomass, confusion about the relationship between accuracy and resolution, and a general lack of guidance on optimal strategies for linking field reference and remote sensing data sources. Here we demonstrate the ability of a state-of-the-art satellite radar sensor, the Japanese ALOS/PALSAR, and a venerable optical platform, Landsat 5, to support large-area mapping of aboveground tropical woody biomass across a 153,000-km2 region in the southwestern Amazon

  9. A Single-Degree-of-Freedom Energy Optimization Strategy for Power-Split Hybrid Electric Vehicles

    Directory of Open Access Journals (Sweden)

    Chaoying Xia

    2017-07-01

    Full Text Available This paper presents a single-degree-of-freedom energy optimization strategy to solve the energy management problem existing in power-split hybrid electric vehicles (HEVs). The proposed strategy is based on a quadratic performance index, which is innovatively designed to simultaneously restrict the fluctuation of battery state of charge (SOC) and reduce fuel consumption. An extended quadratic optimal control problem is formulated by approximating the fuel consumption rate as a quadratic polynomial of engine power. The approximated optimal control law is obtained by utilizing the solution properties of the Riccati equation and adjoint equation. It is easy to implement in real time and its engineering significance is explained in detail. In order to validate the effectiveness of the proposed strategy, the forward-facing vehicle simulation model is established based on the ADVISOR software (Version 2002, National Renewable Energy Laboratory, Golden, CO, USA). The simulation results show that there is only a little fuel consumption difference between the proposed strategy and the Pontryagin's minimum principle (PMP)-based global optimal strategy, and the proposed strategy also exhibits good adaptability under different initial battery SOC, cargo mass and road slope conditions.

  10. Survey strategy optimization for the Atacama Cosmology Telescope

    CERN Document Server

    De Bernardis, F; Hasselfield, M; Alonso, D; Bond, J R; Calabrese, E; Choi, S K; Crowley, K T; Devlin, M; Dunkley, J; Gallardo, P A; Henderson, S W; Hilton, M; Hlozek, R; Ho, S P; Huffenberger, K; Koopman, B J; Kosowsky, A; Louis, T; Madhavacheril, M S; McMahon, J; Naess, S; Nati, F; Newburgh, L; Niemack, M D; Page, L A; Salatino, M; Schillaci, A; Schmitt, B L; Sehgal, N; Sievers, J L; Simon, S M; Spergel, D N; Staggs, S T; van Engelen, A; Vavagiakis, E M; Wollack, E J

    2016-01-01

    In recent years there have been significant improvements in the sensitivity and the angular resolution of the instruments dedicated to the observation of the Cosmic Microwave Background (CMB). ACTPol is the first polarization receiver for the Atacama Cosmology Telescope (ACT) and is observing the CMB sky with arcmin resolution over about 2000 sq. deg. Its upgrade, Advanced ACTPol (AdvACT), will observe the CMB in five frequency bands and over a larger area of the sky. We describe the optimization and implementation of the ACTPol and AdvACT surveys. The selection of the observed fields is driven mainly by the science goals, that is, small angular scale CMB measurements, B-mode measurements and cross-correlation studies. For the ACTPol survey we have observed patches of the southern galactic sky with low galactic foreground emissions which were also chosen to maximize the overlap with several galaxy surveys to allow unique cross-correlation studies. A wider field in the northern galactic cap ensured significant...

  11. Complexity Reduction in the Use of Evolutionary Algorithms to Function Optimization: A Variable Reduction Strategy

    Directory of Open Access Journals (Sweden)

    Guohua Wu

    2013-01-01

    Full Text Available Discovering and utilizing problem domain knowledge is a promising direction towards improving the efficiency of evolutionary algorithms (EAs) when solving optimization problems. We propose a knowledge-based variable reduction strategy (VRS) that can be integrated into EAs to solve unconstrained and first-order derivative optimization functions more efficiently. VRS originates from the knowledge that, in an unconstrained and first-order derivative optimization function, the optimal solution is located at a local extreme point at which the partial derivative over each variable equals zero. Through this collection of partial derivative equations, some quantitative relations among different variables can be obtained. These variable relations have to be satisfied in the optimal solution. With the use of such relations, VRS can reduce the number of variables and shrink the solution space when using EAs to deal with the optimization function, thus improving the optimization speed and quality. When we apply VRS to optimization problems, we just need to modify the calculation approach of the objective function. Therefore, practically, it can be integrated with any EA. In this study, VRS is combined with particle swarm optimization variants and tested on several benchmark optimization functions and a real-world optimization problem. Computational results and a comparative study demonstrate the effectiveness of VRS.
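
    A tiny Python illustration of the variable-reduction idea on a function where setting a partial derivative to zero yields an explicit relation between variables, so the search only has to run over the remaining dimension; the quadratic example is mine, not one of the paper's benchmarks.

      def f(x, y):
          # f(x, y) = (x - 2y)^2 + (y - 3)^2; df/dx = 0 at the optimum gives the relation x = 2y.
          return (x - 2 * y) ** 2 + (y - 3) ** 2

      def f_reduced(y):
          # Substitute the relation x = 2y, leaving a one-variable objective for the EA (or any optimizer).
          return f(2 * y, y)

      best_y = min((k / 100.0 for k in range(-1000, 1001)), key=f_reduced)  # coarse 1-D search stands in for an EA
      print(best_y, 2 * best_y)   # y = 3.0, x = 6.0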

  12. Event-based progression detection strategies using scanning laser polarimetry images of the human retina

    NARCIS (Netherlands)

    Vermeer, K.A.; Lo, B.; Zhou, Q.; Vos, F.M.; Vossepoel, A.M.; Lemij, H.G.

    2011-01-01

    Monitoring glaucoma patients and ensuring optimal treatment requires accurate and precise detection of progression. Many glaucomatous progression detection strategies may be formulated for Scanning Laser Polarimetry (SLP) data of the local nerve fiber thickness. In this paper, several strategies, al

  14. Strategies to Optimize Adult Stem Cell Therapy for Tissue Regeneration

    Directory of Open Access Journals (Sweden)

    Shan Liu

    2016-06-01

    Full Text Available Stem cell therapy aims to replace damaged or aged cells with healthy functioning cells in congenital defects, tissue injuries, autoimmune disorders, and neurogenic degenerative diseases. Among various types of stem cells, adult stem cells (i.e., tissue-specific stem cells) commit to becoming the functional cells from their tissue of origin. These cells are the most commonly used in cell-based therapy since they do not confer risk of teratomas, do not require fetal stem cell maneuvers and thus are free of ethical concerns, and they confer low immunogenicity (even if allogenous). The goal of this review is to summarize the current state of the art and advances in using stem cell therapy for tissue repair in solid organs. Here we address key factors in cell preparation, such as the source of adult stem cells, optimal cell types for implantation (universal mesenchymal stem cells vs. tissue-specific stem cells, or induced vs. non-induced stem cells), early or late passages of stem cells, stem cells with endogenous or exogenous growth factors, preconditioning of stem cells (hypoxia, growth factors, or conditioned medium), using various controlled release systems to deliver growth factors with hydrogels or microspheres to provide apposite interactions of stem cells and their niche. We also review several approaches of cell delivery that affect the outcomes of cell therapy, including the appropriate routes of cell administration (systemic, intravenous, or intraperitoneal vs. local administration), timing for cell therapy (immediate vs. a few days after injury), single injection of a large number of cells vs. multiple smaller injections, a single site for injection vs. multiple sites and use of rodents vs. larger animal models. Future directions of stem cell-based therapies are also discussed to guide potential clinical applications.

  15. Strategies to Optimize Adult Stem Cell Therapy for Tissue Regeneration.

    Science.gov (United States)

    Liu, Shan; Zhou, Jingli; Zhang, Xuan; Liu, Yang; Chen, Jin; Hu, Bo; Song, Jinlin; Zhang, Yuanyuan

    2016-06-21

    Stem cell therapy aims to replace damaged or aged cells with healthy functioning cells in congenital defects, tissue injuries, autoimmune disorders, and neurogenic degenerative diseases. Among various types of stem cells, adult stem cells (i.e., tissue-specific stem cells) commit to becoming the functional cells from their tissue of origin. These cells are the most commonly used in cell-based therapy since they do not confer risk of teratomas, do not require fetal stem cell maneuvers and thus are free of ethical concerns, and they confer low immunogenicity (even if allogenous). The goal of this review is to summarize the current state of the art and advances in using stem cell therapy for tissue repair in solid organs. Here we address key factors in cell preparation, such as the source of adult stem cells, optimal cell types for implantation (universal mesenchymal stem cells vs. tissue-specific stem cells, or induced vs. non-induced stem cells), early or late passages of stem cells, stem cells with endogenous or exogenous growth factors, preconditioning of stem cells (hypoxia, growth factors, or conditioned medium), using various controlled release systems to deliver growth factors with hydrogels or microspheres to provide apposite interactions of stem cells and their niche. We also review several approaches of cell delivery that affect the outcomes of cell therapy, including the appropriate routes of cell administration (systemic, intravenous, or intraperitoneal vs. local administration), timing for cell therapy (immediate vs. a few days after injury), single injection of a large number of cells vs. multiple smaller injections, a single site for injection vs. multiple sites and use of rodents vs. larger animal models. Future directions of stem cell-based therapies are also discussed to guide potential clinical applications.

  16. A minimax optimal control strategy for uncertain quasi-Hamiltonian systems

    Institute of Scientific and Technical Information of China (English)

    Yong WANG; Zu-guang YING; Wei-qiu ZHU

    2008-01-01

    A minimax optimal control strategy for quasi-Hamiltonian systems with bounded parametric and/or external disturbances is proposed based on the stochastic averaging method and stochastic differential game. To conduct the system energy control, the partially averaged Itô stochastic differential equations for the energy processes are first derived by using the stochastic averaging method for quasi-Hamiltonian systems. Combining the above equations with an appropriate performance index, the proposed strategy searches for an optimal worst-case controller by solving a stochastic differential game problem. The worst-case disturbances and the optimal controls are obtained by solving a Hamilton-Jacobi-Isaacs (HJI) equation. Numerical results for a controlled and stochastically excited Duffing oscillator with uncertain disturbances exhibit the efficacy of the proposed control strategy.

  17. Optimal Strategy of Efficiency Power Plant with Battery Electric Vehicle in Distribution Network

    Science.gov (United States)

    Ma, Tao; Su, Su; Li, Shunxin; Wang, Wei; Yang, Tiantian; Li, Mengjuan; Ota, Yutaka

    2017-05-01

    With the popularity of electric vehicles (EVs), such as plug-in hybrid electric vehicles (PHEVs) and battery electric vehicles (BEVs), an optimal strategy for the coordination of BEV charging is proposed in this paper. The proposed approach incorporates the random behaviours and regular behaviours of BEV drivers in an urban environment. These behaviours lead to the stochastic nature of the charging demand. The optimal strategy is used to guide the coordinated charging at different times to maximize the efficiency of the virtual power plant (VPP). An innovative peer-to-peer system is used with BEVs to achieve the goals. The actual behaviours of vehicles on a campus are used to validate the proposed approach, and the simulation results show that the optimal strategy can not only maximize the utilization ratio of the efficiency power plant, but also does not need additional energy from the distribution grid.

  18. Optimal Investment-Consumption Strategy under Inflation in a Markovian Regime-Switching Market

    Directory of Open Access Journals (Sweden)

    Huiling Wu

    2016-01-01

    Full Text Available This paper studies an investment-consumption problem under inflation. The consumption price level, the prices of the available assets, and the coefficient of the power utility are assumed to be sensitive to the states of underlying economy modulated by a continuous-time Markovian chain. The definition of admissible strategies and the verification theory corresponding to this stochastic control problem are presented. The analytical expression of the optimal investment strategy is derived. The existence, boundedness, and feasibility of the optimal consumption are proven. Finally, we analyze in detail by mathematical and numerical analysis how the risk aversion, the correlation coefficient between the inflation and the stock price, the inflation parameters, and the coefficient of utility affect the optimal investment and consumption strategy.

  19. An Optimal Operating Strategy for Battery Life Cycle Costs in Electric Vehicles

    Directory of Open Access Journals (Sweden)

    Yinghua Han

    2014-01-01

    Full Text Available The impact of petroleum-based vehicles on the environment, together with the cost and availability of fuel, has led to an increased interest in electric vehicles as a means of transportation. The battery is a major component in an electric vehicle, and the economic viability of these vehicles depends on the availability of cost-effective batteries. This paper presents a generalized formulation for determining the optimal operating strategy and cost optimization for the battery. The deterioration of the battery is assumed to be stochastic. Under this assumption, the proposed operating strategy for the battery is formulated as a nonlinear optimization problem considering reliability and the number of failures, and an explicit expression for the average cost rate over the battery lifetime is derived. Results show that the proposed operating strategy enhances availability and reliability at a low cost.

  20. Optimal control strategies for hydrogen production when coupling solid oxide electrolysers with intermittent renewable energies

    Science.gov (United States)

    Cai, Qiong; Adjiman, Claire S.; Brandon, Nigel P.

    2014-12-01

    The penetration of intermittent renewable energies requires the development of energy storage technologies. High temperature electrolysis using solid oxide electrolyser cells (SOECs) as a potential energy storage technology, provides the prospect of a cost-effective and energy efficient route to clean hydrogen production. The development of optimal control strategies when SOEC systems are coupled with intermittent renewable energies is discussed. Hydrogen production is examined in relation to energy consumption. Control strategies considered include maximizing hydrogen production, minimizing SOEC energy consumption and minimizing compressor energy consumption. Optimal control trajectories of the operating variables over a given period of time show feasible control for the chosen situations. Temperature control of the SOEC stack is ensured via constraints on the overall temperature difference across the cell and the local temperature gradient within the SOEC stack, to link materials properties with system performance; these constraints are successfully managed. The relative merits of the optimal control strategies are analyzed.

  1. Performance Evaluation of Content Based Image Retrieval on Feature Optimization and Selection Using Swarm Intelligence

    Directory of Open Access Journals (Sweden)

    Kirti Jain

    2016-03-01

    Full Text Available The diversity and applicability of swarm intelligence is increasing every day in the fields of science and engineering. Swarm intelligence provides a dynamic approach to feature optimization. We have used swarm intelligence for the process of feature optimization and feature selection in content-based image retrieval. The performance of content-based image retrieval is limited by the achievable precision and recall, whose values depend on the retrieval capacity of the image features. The basic raw image content has visual features such as color, texture, shape and size. The partial feature extraction technique is based on a geometric invariant function. Three swarm intelligence algorithms were used for the optimization of features: ant colony optimization, particle swarm optimization (PSO), and the glowworm optimization algorithm. The Corel image dataset and MATLAB software were used for evaluating performance.
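
    The precision and recall measures referred to above, computed for a single query in Python as a quick reference; which images count as relevant is of course dataset-specific.

      def precision_recall(retrieved, relevant):
          # retrieved: list of image ids returned for a query; relevant: set of ground-truth relevant ids.
          retrieved_set, relevant_set = set(retrieved), set(relevant)
          hits = len(retrieved_set & relevant_set)
          precision = hits / len(retrieved_set) if retrieved_set else 0.0
          recall = hits / len(relevant_set) if relevant_set else 0.0
          return precision, recall

      print(precision_recall(["img3", "img7", "img9", "img12"], {"img3", "img9", "img21"}))  # (0.5, 0.666...)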

  2. Engineering a near-infrared dark chromoprotein optimized for photoacoustic imaging (Conference Presentation)

    Science.gov (United States)

    Li, Yan; Barber, Quinn; Paproski, Robert J.; Enterina, Jhon R.; Rodriguez, Erik A.; Tsien, Roger Y.; Campbell, Robert E.; Zemp, Roger J.

    2016-03-01

    An optimal genetically-encoded probe for photoacoustic (PA) imaging should exhibit high optical absorption, low fluorescence quantum yield, and an absorption maximum within the near-infrared (NIR) window. One promising candidate is a newly engineered chromoprotein (CP), designated dark small ultra-red fluorescent protein (dsmURFP), which is based on a cyanobacterial phycobiliprotein. To optimize dsmURFP characteristics for PA imaging, we have developed a directed evolution method to iteratively screen libraries of protein variants with three different screening systems. Firstly, we took inspiration from dark-acceptor (also known as dark-quencher)-based Förster resonance energy transfer (FRET) constructs, and used dsmURFP as a dark acceptor for an mCardinal fluorescent donor. The rationale for this design was that the higher the extinction coefficient of the dsmURFP, the more the emission of the donor would be quenched. In addition, more energy transferred to the dark acceptor would lead to more thermoelastic expansion and a stronger PA signal. Three rounds of evolution using this first strategy resulted in dsmURFP1.3, which quenched the emission of mCardinal ~2-fold more efficiently than dsmURFP. Secondly, an absorption-based screening based on visual inspection of plates led to the identification of the variant dsmURFP1.4, which exhibited a 2-fold higher absorbance and a 5 nm red shift. Thirdly, we developed a colony-based photoacoustic screening method. To demonstrate the utility of our optimized variants, we used photoacoustic imaging to visualize dsmURFP and its variants in phantom and in vivo experiments using chicken embryo models and murine bacterial bladder infection models.

  3. Design of Underwater Robot Lines Based on a Hybrid Automatic Optimization Strategy

    Institute of Scientific and Technical Information of China (English)

    Wenjing Lyu; Weilin Luo

    2014-01-01

    In this paper, a hybrid automatic optimization strategy is proposed for the design of underwater robot lines. Isight is introduced as an integration platform. The construction of this platform is based on user programming and several commercial software packages, including UG 6.0, GAMBIT 2.4.6 and FLUENT 12.0. An intelligent parameter optimization method, particle swarm optimization, is incorporated into the platform. To verify the proposed strategy, a simulation is conducted on the underwater robot model 5470, which originates from the DTRC SUBOFF project. With the automatic optimization platform, the minimal resistance is taken as the optimization goal; the wetted surface area as the constraint condition; and the length of the fore-body, the maximum body radius and the after-body's minimum radius as the design variables. In the CFD calculation, the RANS equations and the standard turbulence model are used for the flow simulation. Analysis of the simulation results indicates that the platform is highly efficient and feasible. Through the platform, a variety of schemes for the design of the lines are generated and the optimal solution is achieved. The combination of the intelligent optimization algorithm and the numerical simulation ensures a global optimal solution and improves the efficiency of the search for solutions.

  4. Optimal Attack Strategy in Random Scale-Free Networks Based on Incomplete Information

    Institute of Scientific and Technical Information of China (English)

    LI Jun; WU Jun; LI Yong; DENG Hong-Zhong; TAN Yue-Jin

    2011-01-01

    We introduce an attack model based on incomplete information, which means that information can be obtained only from a subset of nodes. We investigate the optimal attack strategy (OAS) in random scale-free networks both analytically and numerically. We show that the attack strategy can affect the attack effect remarkably and that the OAS achieves a better attack effect than other typical attack strategies. It is found that when the attack intensity is small, the attacker should attack more nodes in the "white area", in which attack information can be obtained; when the attack intensity is greater, the attacker should attack more nodes in the "black area", in which attack information cannot be obtained. Moreover, we show that there is an inflection point in the curve of the optimal attack proportion. For a given magnitude of attack information, the optimal attack proportion decreases with the attack intensity before the inflection point and then increases after it.

  5. Optimizing the Stark-decelerator beamline for the trapping of cold molecules using evolutionary strategies

    CERN Document Server

    Gilijamse, Joop J.; Küpper, Jochen; Hoekstra, Steven; Meerakker, Sebastiaan Y. T. van de; Meijer, Gerard

    2006-01-01

    We demonstrate feedback control optimization for the Stark deceleration and trapping of neutral polar molecules using evolutionary strategies. In a Stark-decelerator beamline, pulsed electric fields are used to decelerate OH radicals and subsequently store them in an electrostatic trap. The efficiency of the deceleration and trapping process is determined by the exact timings of the applied electric field pulses. Automated optimization of these timings yields an increase of 40% in the number of trapped OH radicals.
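
    The following toy Python sketch illustrates the general idea of a (1+λ) evolution strategy tuning a vector of switching timings against a smooth synthetic surrogate objective; the surrogate, the step-size rule and all numbers are assumptions and have nothing to do with the actual experimental feedback signal.

      # Toy (1+lambda) evolution strategy, standing in for feedback optimization of
      # Stark-decelerator switching times; the objective below is synthetic, not the experiment.
      import numpy as np

      rng = np.random.default_rng(1)

      def trapped_signal(timings):
          # hypothetical smooth surrogate: peaks when timings match a nominal schedule
          nominal = np.linspace(0.0, 1.0, timings.size)
          return np.exp(-50.0 * np.sum((timings - nominal) ** 2))

      def one_plus_lambda_es(n_timings=10, offspring=8, sigma=0.05, generations=200):
          parent = np.sort(rng.random(n_timings))
          parent_fit = trapped_signal(parent)
          for _ in range(generations):
              children = parent + sigma * rng.standard_normal((offspring, n_timings))
              fits = np.array([trapped_signal(c) for c in children])
              if fits.max() > parent_fit:
                  parent, parent_fit = children[fits.argmax()], fits.max()
                  sigma *= 1.2      # expand the mutation step on success
              else:
                  sigma *= 0.85     # shrink it on failure
          return parent, parent_fit

      best_timings, best_signal = one_plus_lambda_es()
      print(best_signal)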

  6. Novel Modeling Framework To Guide Design of Optimal Dosing Strategies for β-Lactamase Inhibitors

    OpenAIRE

    Bhagunde, Pratik; Chang, Kai-Tai; Hirsch, Elizabeth B.; Ledesma, Kimberly R.; Nikolaou, Michael; Tam, Vincent H.

    2012-01-01

    The scarcity of new antibiotics against drug-resistant bacteria has led to the development of inhibitors targeting specific resistance mechanisms, which aim to restore the effectiveness of existing agents. However, there are few guidelines for the optimal dosing of inhibitors. Extending the utility of mathematical modeling, which has been used as a decision support tool for antibiotic dosing regimen design, we developed a novel mathematical modeling framework to guide optimal dosing strategie...

  7. Implementation of Evolution Strategies (ES) Algorithm to Optimization Lovebird Feed Composition

    OpenAIRE

    Agung Mustika Rizki; Wayan Firdaus Mahmudy; Gusti Eka Yuliastuti

    2017-01-01

    The lovebird is currently popular in society, especially among bird lovers, and some people have begun to develop the cultivation of these birds. The cultivation process must consider the composition of the feed in order to produce a quality bird. Determining the feed is not easy because both the cost and the lovebird's vitamin requirements must be taken into account. This problem can be solved with the Evolution Strategies (ES) algorithm. Based on the test results, an optimal fitness value of 0.3125 was obtained using a population size of 100 and optimal ...

  8. The Development of an Optimal Control Strategy for a Series Hydraulic Hybrid Vehicle

    Directory of Open Access Journals (Sweden)

    Chih-Wei Hung

    2016-03-01

    Full Text Available In this work, a Truck Class II series hydraulic hybrid model is established. Dynamic Programming (DP) methodology is applied to derive the optimal power-splitting factor for the hybrid system for preselected driving schedules. Implementable rules are derived by extracting the optimal trajectory features from a DP scheme. The system behaviors illustrate that the improved control strategy gives a highly effective operation region for the engine and high power density characteristics for the hydraulic components.
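
    A minimal backward dynamic-programming sketch of this kind of power-split optimization is given below; the fuel map, accumulator dynamics, grid resolutions and demand profile are hypothetical placeholders rather than the Truck Class II model used in the paper.

      # Minimal backward dynamic-programming sketch for a series-hybrid power split.
      # Fuel-rate map, accumulator dynamics and drive-cycle demand are toy placeholders.
      import numpy as np

      soc_grid = np.linspace(0.2, 0.9, 71)          # accumulator state of charge
      u_grid = np.linspace(0.0, 60.0, 31)           # candidate engine powers [kW]
      demand = 20.0 + 15.0 * np.sin(np.linspace(0, 6, 120))   # demanded power [kW]
      dt, cap = 1.0, 600.0                          # time step [s], accumulator capacity [kJ]

      def fuel_rate(p_eng):                         # toy convex fuel map [g/s]
          return 0.05 * p_eng + 0.002 * p_eng ** 2

      J = np.zeros(soc_grid.size)                   # terminal cost-to-go
      policy = np.zeros((demand.size, soc_grid.size))
      for k in range(demand.size - 1, -1, -1):
          Jk = np.full(soc_grid.size, np.inf)
          for i, soc in enumerate(soc_grid):
              for u in u_grid:
                  soc_next = soc + (u - demand[k]) * dt / cap
                  if not (soc_grid[0] <= soc_next <= soc_grid[-1]):
                      continue                       # keep the accumulator within bounds
                  cost = fuel_rate(u) * dt + np.interp(soc_next, soc_grid, J)
                  if cost < Jk[i]:
                      Jk[i], policy[k, i] = cost, u
          J = Jk

      print("optimal engine power at t=0, SOC=0.5:",
            policy[0, np.abs(soc_grid - 0.5).argmin()])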

  9. Optimal investment policy and dividend payment strategy in an insurance company

    OpenAIRE

    Pablo Azcue; Nora Muler

    2010-01-01

    We consider in this paper the optimal dividend problem for an insurance company whose uncontrolled reserve process evolves as a classical Cramér–Lundberg process. The firm has the option of investing part of the surplus in a Black–Scholes financial market. The objective is to find a strategy consisting of both investment and dividend payment policies which maximizes the cumulative expected discounted dividend pay-outs until the time of bankruptcy. We show that the optimal value function...

  10. Linear dynamic model of production-inventory with debt repayment: optimal management strategies

    CERN Document Server

    Tuchnolobova, Ekaterina; Vasilieva, Olga

    2012-01-01

    In this paper, we present a simple microeconomic model with linear continuous-time dynamics that describes a production-inventory system with debt repayment. This model is formulated in terms of optimal control and its exact solutions are derived by prudent application of the maximum principle under different sets of initial conditions (scenarios). For a potentially profitable small firm, we also propose some alternative short-term control strategies resulting in a positive final profit and prove their optimality. Practical implementation of such strategies is also discussed.

  11. Convexity of Ruin Probability and Optimal Dividend Strategies for a General Lévy Process

    Science.gov (United States)

    Yuen, Kam Chuen; Shen, Ying

    2015-01-01

    We consider the optimal dividends problem for a company whose cash reserves follow a general Lévy process with certain positive jumps and arbitrary negative jumps. The objective is to find a policy which maximizes the expected discounted dividends until the time of ruin. Under appropriate conditions, we use some recent results in the theory of potential analysis of subordinators to obtain the convexity properties of probability of ruin. We present conditions under which the optimal dividend strategy, among all admissible ones, takes the form of a barrier strategy. PMID:26351655

  12. Optimal control strategies for deficit irrigation systems under different climate conditions

    Science.gov (United States)

    Schuetze, Niels; Wagner, Michael

    2017-04-01

    In this contribution, the suitability of different control strategies for the operation of irrigation systems under limited water and different climate conditions is investigated. To treat the climate uncertainty within a simulation-optimization framework for irrigation management, we formulated a probabilistic framework based on Monte Carlo simulations. The results show which control strategy can ensure food security, since higher quantiles (90% and above) are of interest. This study also demonstrates the efficiency of a stack-ordering technique, based on statistically appropriate sample sizes and reliable optimal management, for generating highly productive irrigation schedules.

  13. Convexity of Ruin Probability and Optimal Dividend Strategies for a General Lévy Process

    Directory of Open Access Journals (Sweden)

    Chuancun Yin

    2015-01-01

    Full Text Available We consider the optimal dividends problem for a company whose cash reserves follow a general Lévy process with certain positive jumps and arbitrary negative jumps. The objective is to find a policy which maximizes the expected discounted dividends until the time of ruin. Under appropriate conditions, we use some recent results in the theory of potential analysis of subordinators to obtain the convexity properties of probability of ruin. We present conditions under which the optimal dividend strategy, among all admissible ones, takes the form of a barrier strategy.

  14. Optimal transmission strategy for spatially correlated MIMO systems with channel statistical information

    Institute of Scientific and Technical Information of China (English)

    ZHAO Zhen-shan; XU Guo-zhi

    2007-01-01

    In real multiple-input multiple-output (MIMO) systems, perfect channel state information (CSI) may be costly or impossible to acquire, but the channel statistical information can be considered relatively stationary during long-term transmission. The statistical information can be obtained at the receiver and fed back to the transmitter, and it does not require frequent updates. By exploiting channel mean and covariance information at the transmitter simultaneously, this paper investigates the optimal transmission strategy for spatially correlated MIMO channels. An upper bound on the ergodic capacity is derived and taken as the performance criterion. Simulation results are also given to show the performance improvement of the optimal transmission strategy.

  15. Collision Distance Detection Based on Swept Volume Strategy for Optimal Motion Plan

    Science.gov (United States)

    Huang, Tsai-Jeon

    A swept-volume strategy to detect the collision distances between obstacles is presented in this paper for robot motion planning based on an optimization technique. The strategy utilizes the recursive quadratic programming optimization method to perform the motion planning. The approach is based on segmental swept volumes for convenient distance-to-contact calculation, and Hermite interpolation is used to approximate the envelope bounding the swept volume. The new method is capable of handling a modestly non-convex swept volume, and it has yielded accurate answers in distance calculations. Examples are presented to illustrate and demonstrate the approach.

  16. Developing optimal search strategies for retrieving clinically relevant qualitative studies in EMBASE.

    Science.gov (United States)

    Walters, Leslie A; Wilczynski, Nancy L; Haynes, R Brian

    2006-01-01

    Qualitative researchers address many issues relevant to patient health care. Their studies appear in an array of journals, making literature searching difficult. Large databases such as EMBASE provide a means of retrieving qualitative research, but these studies represent only a minuscule fraction of published articles, making electronic retrieval problematic. Little work has been done on developing search strategies for the detection of qualitative studies. The objective of this study was to develop optimal search strategies to retrieve qualitative studies in EMBASE for the 2000 publishing year. The authors conducted an analytic survey, comparing hand searches of journals with retrievals from EMBASE for candidate search terms and combinations. Search strategies reached peak sensitivities at 94.2% and peak specificities of 99.7%. Combining search terms to optimize the combination of sensitivity and specificity resulted in values over 89% for both. The authors identified search strategies with high performance for retrieving qualitative studies in EMBASE.

  17. Optimizing Patient-centered Communication and Multidisciplinary Care Coordination in Emergency Diagnostic Imaging: A Research Agenda.

    Science.gov (United States)

    Sabbatini, Amber K; Merck, Lisa H; Froemming, Adam T; Vaughan, William; Brown, Michael D; Hess, Erik P; Applegate, Kimberly E; Comfere, Nneka I

    2015-12-01

    Patient-centered emergency diagnostic imaging relies on efficient communication and multispecialty care coordination to ensure optimal imaging utilization. The construct of the emergency diagnostic imaging care coordination cycle with three main phases (pretest, test, and posttest) provides a useful framework to evaluate care coordination in patient-centered emergency diagnostic imaging. This article summarizes findings reached during the patient-centered outcomes session of the 2015 Academic Emergency Medicine consensus conference "Diagnostic Imaging in the Emergency Department: A Research Agenda to Optimize Utilization." The primary objective was to develop a research agenda focused on 1) defining component parts of the emergency diagnostic imaging care coordination process, 2) identifying gaps in communication that affect emergency diagnostic imaging, and 3) defining optimal methods of communication and multidisciplinary care coordination that ensure patient-centered emergency diagnostic imaging. Prioritized research questions provided the framework to define a research agenda for multidisciplinary care coordination in emergency diagnostic imaging.

  18. Improved Glowworm Swarm Optimization Algorithm for Multilevel Color Image Thresholding Problem

    Directory of Open Access Journals (Sweden)

    Lifang He

    2016-01-01

    Full Text Available The thresholding process finds the proper threshold values by optimizing a criterion, which can be considered a constrained optimization problem. The computation time of traditional thresholding techniques increases dramatically for multilevel thresholding. To overcome this problem, swarm intelligence algorithms are widely used to search for optimal thresholds. In this paper, an improved glowworm swarm optimization (IGSO) algorithm is presented to find the optimal multilevel thresholds of color images based on the between-class variance and minimum cross entropy (MCE). The proposed methods are examined on a standard set of color test images using various numbers of threshold values. The results are then compared with those of basic glowworm swarm optimization, adaptive particle swarm optimization (APSO), and self-adaptive differential evolution (SaDE). The simulation results show that the proposed method can find the optimal thresholds accurately and efficiently and is an effective multilevel thresholding method for color image segmentation.
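
    For orientation, the sketch below evaluates the between-class-variance (Otsu-style) criterion for a set of candidate thresholds and optimizes it with a plain random search; the random search merely stands in for the glowworm swarm optimizer of the paper, and the histogram is synthetic.

      # Between-class-variance objective for multilevel thresholding, with a simple
      # random search standing in for the swarm optimizer (illustrative only).
      import numpy as np

      rng = np.random.default_rng(2)

      def between_class_variance(hist, thresholds):
          levels = np.arange(256)
          p = hist / hist.sum()
          mu_total = (levels * p).sum()
          edges = [0, *sorted(int(t) for t in thresholds), 256]
          var = 0.0
          for lo, hi in zip(edges[:-1], edges[1:]):
              w = p[lo:hi].sum()
              if w > 0:
                  mu = (levels[lo:hi] * p[lo:hi]).sum() / w
                  var += w * (mu - mu_total) ** 2
          return var

      def random_search_thresholds(hist, k=3, iters=5000):
          best, best_val = None, -np.inf
          for _ in range(iters):
              cand = np.sort(rng.integers(1, 255, size=k))
              val = between_class_variance(hist, cand)
              if val > best_val:
                  best, best_val = cand, val
          return best, best_val

      # Histogram of a synthetic grayscale channel as a stand-in for a real color channel.
      hist, _ = np.histogram(rng.integers(0, 256, size=100_000), bins=256, range=(0, 256))
      print(random_search_thresholds(hist))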

  19. Optimizing 4-Dimensional Magnetic Resonance Imaging Data Sampling for Respiratory Motion Analysis of Pancreatic Tumors

    Energy Technology Data Exchange (ETDEWEB)

    Stemkens, Bjorn, E-mail: b.stemkens@umcutrecht.nl [Department of Radiotherapy, University Medical Center Utrecht, Utrecht (Netherlands); Tijssen, Rob H.N. [Department of Radiotherapy, University Medical Center Utrecht, Utrecht (Netherlands); Senneville, Baudouin D. de [Imaging Division, University Medical Center Utrecht, Utrecht (Netherlands); L'Institut de Mathématiques de Bordeaux, Unité Mixte de Recherche 5251, Centre National de la Recherche Scientifique/University of Bordeaux, Bordeaux (France); Heerkens, Hanne D.; Vulpen, Marco van; Lagendijk, Jan J.W.; Berg, Cornelis A.T. van den [Department of Radiotherapy, University Medical Center Utrecht, Utrecht (Netherlands)]

    2015-03-01

    Purpose: To determine the optimum sampling strategy for retrospective reconstruction of 4-dimensional (4D) MR data for nonrigid motion characterization of tumor and organs at risk for radiation therapy purposes. Methods and Materials: For optimization, we compared 2 surrogate signals (external respiratory bellows and internal MRI navigators) and 2 MR sampling strategies (Cartesian and radial) in terms of image quality and robustness. Using the optimized protocol, 6 pancreatic cancer patients were scanned to calculate the 4D motion. Region of interest analysis was performed to characterize the respiratory-induced motion of the tumor and organs at risk simultaneously. Results: The MRI navigator was found to be a more reliable surrogate for pancreatic motion than the respiratory bellows signal. Radial sampling is most benign for undersampling artifacts and intraview motion. Motion characterization revealed interorgan and interpatient variation, as well as heterogeneity within the tumor. Conclusions: A robust 4D-MRI method, based on clinically available protocols, is presented and successfully applied to characterize the abdominal motion in a small number of pancreatic cancer patients.

  20. Evaluation of optimal monetary policy strategy in Romania in the context of fulfilment of convergence criteria

    Directory of Open Access Journals (Sweden)

    Monica DAMIAN

    2011-12-01

    Full Text Available Adopting the euro implies the fulfilment of the Maastricht convergence criteria, which poses a number of challenges for the macroeconomic policy mix because of the conflicts among the criteria. The paper analyzes empirically the main monetary policy strategy in the context of euro adoption in Romania. The results of the study show that inflation targeting is an optimal monetary policy strategy for achieving the real and nominal convergence criteria.

  1. Optimal combinations of control strategies and cost-effective analysis for visceral leishmaniasis disease transmission

    Science.gov (United States)

    Biswas, Santanu; Subramanian, Abhishek; ELMojtaba, Ibrahim M.; Chattopadhyay, Joydev; Sarkar, Ram Rup

    2017-01-01

    Visceral leishmaniasis (VL) is a deadly neglected tropical disease that poses a serious problem in various countries all over the world. Implementation of various intervention strategies fails in controlling the spread of this disease due to issues of parasite drug resistance and resistance of sandfly vectors to insecticide sprays. Due to this, policy makers need to develop novel strategies or resort to a combination of multiple intervention strategies to control the spread of the disease. To address this issue, we propose an extensive SIR-type model for anthroponotic visceral leishmaniasis transmission with seasonal fluctuations modeled in the form of a periodic sandfly biting rate. Fitting the model to real data reported in South Sudan, we estimate the model parameters and compare the model predictions with known VL cases. Using optimal control theory, we study the effects of popular control strategies, namely drug-based treatment of symptomatic and PKDL-infected individuals, insecticide-treated bednets and insecticide sprays, on the dynamics of infected human and vector populations. We propose that the strategies remain ineffective in curbing the disease individually, as opposed to the use of optimal combinations of the mentioned strategies. Testing the model for different optimal combinations while considering periodic seasonal fluctuations, we find that the optimal combination of treatment of individuals and insecticide sprays performs well in controlling the disease for the time period of intervention introduced. Performing a cost-effectiveness analysis, we find that the same strategy also proves to be efficacious and cost-effective. Finally, we suggest that our model would be helpful for policy makers to predict the best intervention strategies for specific time periods and their appropriate implementation for elimination of visceral leishmaniasis. PMID:28222162
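
    As a much-simplified illustration of a host-vector model with a seasonally periodic biting rate and a transmission-reducing control, the following Python sketch integrates a toy system of ODEs; the equations and all parameter values are illustrative assumptions and do not reproduce the authors' extended SIR-type model or the optimal control computation.

      # Toy host-vector model with a periodic sandfly biting rate and a constant control
      # u in [0, 1] that reduces transmission; purely illustrative parameter values.
      import numpy as np
      from scipy.integrate import solve_ivp

      def vl_model(t, y, u, beta_h=0.0002, beta_v=0.0003, gamma=0.02, mu_v=0.1, b0=0.5, eps=0.4):
          Sh, Ih, Sv, Iv = y
          b = b0 * (1.0 + eps * np.cos(2.0 * np.pi * t / 365.0))   # seasonal biting rate
          new_h = (1.0 - u) * b * beta_h * Sh * Iv                 # new human infections
          new_v = (1.0 - u) * b * beta_v * Sv * Ih                 # new vector infections
          return [-new_h,
                  new_h - gamma * Ih,
                  mu_v * (Sv + Iv) - new_v - mu_v * Sv,            # vector births balance deaths
                  new_v - mu_v * Iv]

      y0 = [10_000, 10, 50_000, 100]
      for u in (0.0, 0.5):
          sol = solve_ivp(vl_model, (0, 730), y0, args=(u,), rtol=1e-8)
          print(f"u={u}: infected humans after 2 years = {sol.y[1, -1]:.1f}")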

  2. Improving water management efficiency by using optimization-based control strategies: the Barcelona case study

    OpenAIRE

    2009-01-01

    This paper describes the application of model-based predictive control (MPC) techniques to the flow management in large-scale drinking water networks including a telemetry/telecontrol system. MPC technique is used to generate flow control strategies from the sources to the consumer areas to meet future demands, optimizing performance indexes associated to operational goals such as economic cost, network safety volumes and flow control stability. The designed management strategies are...

  3. A strategy for multimodal deformable image registration to integrate PET/MR into radiotherapy treatment planning

    Energy Technology Data Exchange (ETDEWEB)

    Leibfarth, Sara; Moennich, David; Thorwarth, Daniela [Section for Biomedical Physics, Dept. of Radiation Oncology, Univ. Hospital Tuebingen, Tuebingen (Germany)], e-mail: Sara.Leibfarth@med.uni-tuebingen.de; Welz, Stefan; Siegel, Christine; Zips, Daniel [Dept. of Radiation Oncology, Univ. Hospital Tuebingen, Tuebingen (Germany); Schwenzer, Nina [Dept. of Diagnostic and Interventional Radiology, Univ. Hospital Tuebingen, Tuebingen (Germany); Schmidt, Holger [Dept. of Diagnostic and Interventional Radiology, Univ. Hospital Tuebingen, Tuebingen (Germany); Lab. for Preclinical Imaging and Imaging Technology of the Werner Siemens Foundation, Dept. of Preclinical Imaging and Radiopharmacy, Tuebingen (Germany)]

    2013-10-15

    Background: Combined positron emission tomography (PET)/magnetic resonance imaging (MRI) is highly promising for biologically individualized radiotherapy (RT). Hence, the purpose of this work was to develop an accurate and robust registration strategy to integrate combined PET/MR data into RT treatment planning. Material and methods: Eight patient datasets consisting of an FDG PET/computed tomography (CT) and a subsequently acquired PET/MR of the head and neck (HN) region were available. Registration strategies were developed based on CT and MR data only, whereas the PET components were fused with the resulting deformation field. Following a rigid registration, deformable registration was performed with a transform parametrized by B-splines. Three different optimization metrics were investigated: global mutual information (GMI), GMI combined with a bending energy penalty (BEP) for regularization (GMI + BEP) and localized mutual information with BEP (LMI + BEP). Different quantitative registration quality measures were developed, including volumetric overlap and mean distance measures for structures segmented on CT and MR as well as anatomical landmark distances. Moreover, the local registration quality in the tumor region was assessed by the normalized cross correlation (NCC) of the two PET datasets. Results: LMI + BEP yielded the most robust and accurate registration results. For GMI, GMI + BEP and LMI + BEP, mean landmark distances (standard deviations) were 23.9 mm (15.5 mm), 4.8 mm (4.0 mm) and 3.0 mm (1.0 mm), and mean NCC values (standard deviations) were 0.29 (0.29), 0.84 (0.14) and 0.88 (0.06), respectively. Conclusion: Accurate and robust multimodal deformable image registration of CT and MR in the HN region can be performed using a B-spline parametrized transform and LMI + BEP as optimization metric. With this strategy, biologically individualized RT based on combined PET/MRI in terms of dose painting is possible.
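
    A minimal sketch of a rigid-then-B-spline, mutual-information-driven CT-to-MR registration along these lines can be put together with SimpleITK, as below; the file names, mesh size and optimizer settings are assumptions, and the localized metric with bending-energy penalty (LMI + BEP) favoured by the authors is not reproduced by this plain global-MI setup.

      # Rigid pre-alignment followed by a B-spline stage driven by (global) Mattes MI.
      # Placeholder file names; a simplified stand-in, not the paper's LMI + BEP method.
      import SimpleITK as sitk

      fixed = sitk.ReadImage("ct_hn.nii.gz", sitk.sitkFloat32)    # planning CT
      moving = sitk.ReadImage("mr_hn.nii.gz", sitk.sitkFloat32)   # MR from the PET/MR session

      # --- rigid stage ---
      initial = sitk.CenteredTransformInitializer(
          fixed, moving, sitk.Euler3DTransform(),
          sitk.CenteredTransformInitializerFilter.GEOMETRY)
      rigid = sitk.ImageRegistrationMethod()
      rigid.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
      rigid.SetMetricSamplingStrategy(rigid.RANDOM)
      rigid.SetMetricSamplingPercentage(0.1)
      rigid.SetInterpolator(sitk.sitkLinear)
      rigid.SetOptimizerAsGradientDescent(learningRate=1.0, numberOfIterations=200)
      rigid.SetOptimizerScalesFromPhysicalShift()
      rigid.SetInitialTransform(initial, inPlace=False)
      rigid_tx = rigid.Execute(fixed, moving)
      moving_rigid = sitk.Resample(moving, fixed, rigid_tx, sitk.sitkLinear, 0.0)

      # --- deformable B-spline stage ---
      bspline_tx = sitk.BSplineTransformInitializer(fixed, transformDomainMeshSize=[8, 8, 8])
      deform = sitk.ImageRegistrationMethod()
      deform.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
      deform.SetMetricSamplingStrategy(deform.RANDOM)
      deform.SetMetricSamplingPercentage(0.1)
      deform.SetInterpolator(sitk.sitkLinear)
      deform.SetOptimizerAsLBFGSB(gradientConvergenceTolerance=1e-5, numberOfIterations=100)
      deform.SetInitialTransform(bspline_tx, inPlace=True)
      deform_tx = deform.Execute(fixed, moving_rigid)

      # The resulting deformation would then be applied to the PET component of the PET/MR data.
      mr_on_ct = sitk.Resample(moving_rigid, fixed, deform_tx, sitk.sitkLinear, 0.0)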

  4. An Optimal Charging Strategy for PV-Based Battery Swapping Stations in a DC Distribution System

    Directory of Open Access Journals (Sweden)

    Shengjun Wu

    2017-01-01

    Full Text Available Photovoltaic (PV)-based battery swapping stations (BSSs) utilize a typical integration of consumable renewable resources to supply power for electric vehicles (EVs). The charging strategy of PV-based BSSs directly influences the availability, cost, and carbon emissions of the swapping service. This paper proposes an optimal charging strategy to improve the self-consumption of PV-generated power and service availability while considering forecast errors. First, we introduce the typical structure and operation model of PV-based BSSs. Second, three indexes are presented to evaluate operational performance. Then, a particle swarm optimization (PSO) algorithm is developed to calculate the optimal charging power and to minimize the charging cost for each time slot. The proposed charging strategy helps decrease the impact of forecast uncertainties on the availability of the battery swapping service. Finally, a day-ahead operation schedule, a real-time decision-making strategy, and the proposed PSO charging strategy for PV-based BSSs are simulated in a case study. The simulation results show that the proposed strategy can effectively improve the self-consumption of PV-generated power and reduce charging cost.

  5. A hybrid genetic algorithm based on mutative scale chaos optimization strategy

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    In order to avoid problems such as low convergence speed and trapping in local optima in simple genetic algorithms, a new hybrid genetic algorithm is proposed. In this algorithm, a mutative-scale chaos optimization strategy is applied to the population after each genetic operation. As the search proceeds, the search space of the optimal variables is gradually diminished and the regulating coefficient of the secondary search process is gradually changed, which leads to quick evolution of the population. The algorithm has advantages such as fast search, precise results and ease of use. The simulation results show that the performance of the method is better than that of simple genetic algorithms.
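
    The following Python sketch illustrates a mutative-scale chaotic local search of the general kind described above: a logistic map drives candidate perturbations inside a window that contracts as the search proceeds. The toy objective, window schedule and constants are assumptions, and the genetic-algorithm part of the hybrid is omitted.

      # Mutative-scale chaotic local search (logistic map) refining one candidate solution
      # on a toy objective; in the hybrid algorithm this would follow the genetic operators.
      import numpy as np

      def sphere(x):                       # toy objective standing in for the GA fitness
          return float(np.sum(x ** 2))

      def chaotic_refine(x0, lower, upper, iters=300, shrink=0.97):
          z = np.full_like(x0, 0.37)       # chaotic variables in (0, 1), away from fixed points
          radius = (upper - lower) / 2.0   # search window, gradually diminished
          best, best_val = x0.copy(), sphere(x0)
          for _ in range(iters):
              z = 4.0 * z * (1.0 - z)      # logistic map in the chaotic regime
              cand = np.clip(best + radius * (2.0 * z - 1.0), lower, upper)
              val = sphere(cand)
              if val < best_val:
                  best, best_val = cand, val
              radius *= shrink             # mutative scale: contract the window over time
          return best, best_val

      x0 = np.array([0.8, -0.6, 0.3])
      print(chaotic_refine(x0, lower=np.full(3, -1.0), upper=np.full(3, 1.0)))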

  6. Optimal image interpolation using local low-order surfaces

    Science.gov (United States)

    Gustafson, Steven C.; Claypoole, Roger L., Jr.; Magee, Eric P.; Loomis, John S.

    2002-05-01

    Desirable features of any digital image resolution-enhancement algorithm include exact interpolation (for 'distortionless' or 'lossless' processing), adjustable resolution, adjustable smoothness, and ease of computation. A given low-order polynomial surface (linear, quadratic, cubic, etc.) optimally fit by least squares to a given local neighborhood of a pixel to be interpolated can enable all of these features. For example, if the surface is cubic, if a pixel and the 5-by-5 pixel array surrounding it are selected, and if interpolation of this pixel must yield a 4-by-4 array of sub-pixels, then the 10 coefficients that define the surface may be determined by the constrained least squares solution of 25 linear equations in 10 unknowns, where each equation sets the surface value at a pixel center equal to the pixel gray value and where the constraint is that the mean of the surface values at the sub-pixel centers equals the gray value of the interpolated pixel. Note that resolution is adjustable because the interpolating surface for each pixel may be subdivided arbitrarily, that smoothness is adjustable (within each pixel) because the polynomial order and the number of neighboring pixels may be selected, and that the most computationally demanding operation is solving a relatively small number of simultaneous linear equations for each pixel.
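
    A minimal sketch of the constrained least-squares fit described above is given below: a 10-coefficient cubic surface is fit to a 5-by-5 neighbourhood, subject to the constraint that the mean of the surface over the sub-pixel centres equals the centre pixel's gray value, and then evaluated on a 4-by-4 grid of sub-pixels. The direct KKT solve is one possible implementation choice, not necessarily the authors'.

      # Constrained least-squares cubic surface fit over a 5x5 neighbourhood, then
      # evaluation at subdiv x subdiv sub-pixel centres of the centre pixel.
      import numpy as np

      def cubic_terms(x, y):
          return np.stack([np.ones_like(x), x, y, x*x, x*y, y*y,
                           x**3, x*x*y, x*y*y, y**3], axis=-1)

      def interpolate_pixel(neigh5x5, subdiv=4):
          # pixel centres of the 5x5 neighbourhood, centre pixel at (0, 0)
          xc, yc = np.meshgrid(np.arange(-2, 3, dtype=float), np.arange(-2, 3, dtype=float))
          A = cubic_terms(xc.ravel(), yc.ravel())             # 25 x 10 design matrix
          b = neigh5x5.ravel().astype(float)                  # 25 pixel gray values
          # sub-pixel centres inside the centre pixel
          s = (np.arange(subdiv) + 0.5) / subdiv - 0.5
          xs, ys = np.meshgrid(s, s)
          C = cubic_terms(xs.ravel(), ys.ravel()).mean(axis=0)   # mean-value constraint row
          d = float(neigh5x5[2, 2])                               # centre pixel gray value
          # KKT system for: minimize ||A c - b||^2 subject to C c = d
          K = np.zeros((11, 11))
          K[:10, :10], K[:10, 10], K[10, :10] = A.T @ A, C, C
          rhs = np.concatenate([A.T @ b, [d]])
          coeffs = np.linalg.solve(K, rhs)[:10]
          return cubic_terms(xs, ys) @ coeffs                  # subdiv x subdiv sub-pixel values

      neigh = np.arange(25, dtype=float).reshape(5, 5)         # toy neighbourhood
      print(interpolate_pixel(neigh))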

  7. Time Reversal Reconstruction Algorithm Based on PSO Optimized SVM Interpolation for Photoacoustic Imaging

    Directory of Open Access Journals (Sweden)

    Mingjian Sun

    2015-01-01

    Full Text Available Photoacoustic imaging is an innovative technique for imaging biomedical tissues. The time reversal reconstruction algorithm, in which a numerical model of the acoustic forward problem is run backwards in time, is widely used. In this paper, a time reversal reconstruction algorithm based on particle swarm optimization (PSO)-optimized support vector machine (SVM) interpolation is proposed for photoacoustic imaging. Numerical results show that the reconstructed images of the proposed algorithm are more accurate than those of the time reversal algorithms based on nearest-neighbor, linear, and cubic convolution interpolation, and that it can provide higher imaging quality using significantly fewer measurement positions or scanning times.

  8. MO-PIS-Exhibit Hall-01: Imaging: CT Dose Optimization Technologies I

    Energy Technology Data Exchange (ETDEWEB)

    Denison, K; Smith, S [GE Healthcare, Waukesha, WI (United States)

    2014-06-15

    Partners in Solutions is an exciting new program in which AAPM partners with our vendors to present practical “hands-on” information about the equipment and software systems that we use in our clinics. The imaging topic this year is CT scanner dose optimization capabilities. Note that the sessions are being held in a special purpose room built on the Exhibit Hall Floor, to encourage further interaction with the vendors. Dose Optimization Capabilities of GE Computed Tomography Scanners. Presentation Time: 11:15 – 11:45 AM. GE Healthcare is dedicated to the delivery of high-quality clinical images through the development of technologies that optimize the application of ionizing radiation. In computed tomography, dose management solutions fall into four categories. The first employs projection data and statistical modeling to decrease noise in the reconstructed image, creating an opportunity for mA reduction in the acquisition of diagnostic images. Veo represents true Model Based Iterative Reconstruction (MBiR). Using high-level algorithms in tandem with advanced computing power, Veo enables lower pixel noise standard deviation and improved spatial resolution within a single image. Advanced Adaptive Image Filters allow for maintenance of spatial resolution while reducing image noise. Examples of adaptive image space filters include Neuro 3-D filters and Cardiac Noise Reduction Filters. AutomA adjusts mA along the z-axis and is the CT equivalent of auto exposure control in conventional x-ray systems. Dynamic Z-axis Tracking offers an additional opportunity for dose reduction in helical acquisitions while SmartTrack Z-axis Tracking serves to ensure beam, collimator and detector alignment during tube rotation. SmartmA provides angular mA modulation. ECG Helical Modulation reduces mA during the systolic phase of the heart cycle. SmartBeam optimization uses bowtie beam-shaping hardware and software to filter off-axis x-rays - minimizing dose and reducing x-ray scatter. The

  9. [Imaging strategy for children after a first episode of pyelonephritis].

    Science.gov (United States)

    Bocquet, N; Biebuyck, N; Lortat Jacob, S; Aigrain, Y; Salomon, R; Chéron, G

    2015-05-01

    Pyelonephritis is a common bacterial disease in young children and is a serious infection because of its potential to produce renal scarring. One of the concerns of physicians is therefore the diagnosis of uropathy at risk for recurrence of pyelonephritis, especially high-grade reflux. There are no French recommendations on imaging evaluation after a first episode of pyelonephritis. Voiding cystography was systematically proposed years ago and recommended by the American Academy of Pediatrics until 1999. This systematic strategy exposed all children to a painful, irradiating exam, and exposed them to urinary tract infection. The American recommendations changed in 2011 and cystography is now only proposed to children with recurrence of pyelonephritis or with ultrasound abnormalities. A collaborative review of the literature involving the Pediatric Emergency, Nephrology and Surgery Departments at Necker-Enfants-Malades Hospital led us to propose an algorithm for imaging after the first episode of pyelonephritis in children. This algorithm was based on data from the past medical history (results of prenatal ultrasonography or recurrence of pyelonephritis), the results of the ultrasound exam at the time of diagnosis, and the procalcitonin concentration, to limit the indications for voiding cystography, limiting risk for delaying high-grade reflux diagnosis. Children with low risk for high-grade reflux can be followed up with an ultrasound exam 6 months after acute infection. Copyright © 2015 Elsevier Masson SAS. All rights reserved.

  10. Optimal orientation in flows: providing a benchmark for animal movement strategies.

    Science.gov (United States)

    McLaren, James D; Shamoun-Baranes, Judy; Dokter, Adriaan M; Klaassen, Raymond H G; Bouten, Willem

    2014-10-06

    Animal movements in air and water can be strongly affected by experienced flow. While various flow-orientation strategies have been proposed and observed, their performance in variable flow conditions remains unclear. We apply control theory to establish a benchmark for time-minimizing (optimal) orientation. We then define optimal orientation for movement in steady flow patterns and, using dynamic wind data, for short-distance mass movements of thrushes (Turdus sp.) and 6000 km non-stop migratory flights by great snipes, Gallinago media. Relative to the optimal benchmark, we assess the efficiency (travel speed) and reliability (success rate) of three generic orientation strategies: full compensation for lateral drift, vector orientation (single-heading movement) and goal orientation (continually heading towards the goal). Optimal orientation is characterized by detours to regions of high flow support, especially when flow speeds approach and exceed the animal's self-propelled speed. In strong predictable flow (short distance thrush flights), vector orientation adjusted to flow on departure is nearly optimal, whereas for unpredictable flow (inter-continental snipe flights), only goal orientation was near-optimally reliable and efficient. Optimal orientation provides a benchmark for assessing efficiency of responses to complex flow conditions, thereby offering insight into adaptive flow-orientation across taxa in the light of flow strength, predictability and navigation capacity.

  11. Social Optimization and Pricing Policy in Cognitive Radio Networks with an Energy Saving Strategy

    Directory of Open Access Journals (Sweden)

    Shunfu Jin

    2016-01-01

    Full Text Available The rapid growth of wireless applications results in an increased demand for spectrum resources and communication energy. In this paper, we first introduce a novel energy saving strategy in cognitive radio networks (CRNs) and then propose an appropriate pricing policy for secondary user (SU) packets. We analyze the behavior of data packets in a discrete-time single-server priority queue under a multiple-vacation discipline. With the help of a Quasi-Birth-Death (QBD) process model, we obtain the joint distribution for the number of SU packets and the state of the base station (BS) via the Matrix-Geometric Solution method. We assess the average latency of SU packets and the energy saving ratio of the system. According to a natural reward-cost structure, we study the individually optimal behavior and the socially optimal behavior of the energy saving strategy and use an optimization algorithm based on the standard particle swarm optimization (SPSO) method to search for the socially optimal arrival rate of SU packets. By comparing the individually optimal and socially optimal behavior, we impose an appropriate admission fee on SU packets. Finally, we present numerical results to show the impacts of system parameters on the system performance and the pricing policy.

  12. Real-time inverse high-dose-rate brachytherapy planning with catheter optimization by compressed sensing-inspired optimization strategies

    Science.gov (United States)

    Guthier, C. V.; Aschenbrenner, K. P.; Müller, R.; Polster, L.; Cormack, R. A.; Hesser, J. W.

    2016-08-01

    This paper demonstrates that optimization strategies derived from the field of compressed sensing (CS) improve computational performance in inverse treatment planning (ITP) for high-dose-rate (HDR) brachytherapy. Following an approach applied to low-dose-rate brachytherapy, we developed a reformulation of the ITP problem with the same mathematical structure as standard CS problems. Two greedy methods, derived from hard thresholding and subspace pursuit, are presented and their performance is compared to state-of-the-art ITP solvers. Applied to clinical prostate brachytherapy plans, the methods achieve a speed-up by a factor of 56-350 compared to state-of-the-art methods. Based on a Wilcoxon signed rank test, the novel method statistically significantly decreases the final objective function value (p < 0.01). The optimization times were below one second and thus planning can be considered real-time capable. The novel CS-inspired strategy enables real-time ITP for HDR brachytherapy including catheter optimization. The generated plans are either clinically equivalent or show a better performance with respect to dosimetric measures.
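
    The sketch below shows a generic iterative hard thresholding loop for a sparse, non-negative dwell-time problem of the same broad shape; the dose matrix, prescription and sparsity level are random placeholders, and the actual reformulation and greedy variants used in the paper are not reproduced.

      # Generic iterative hard thresholding: find a sparse non-negative dwell-time vector t
      # such that D @ t approaches a prescription d. All data below are synthetic placeholders.
      import numpy as np

      rng = np.random.default_rng(3)

      def iht_dwell_times(D, d, sparsity, iters=500):
          t = np.zeros(D.shape[1])
          step = 1.0 / np.linalg.norm(D, ord=2) ** 2          # gradient step size
          for _ in range(iters):
              t = t + step * D.T @ (d - D @ t)                # gradient step on ||d - D t||^2
              t = np.maximum(t, 0.0)                          # dwell times are non-negative
              keep = np.argsort(t)[-sparsity:]                # hard thresholding: keep s largest
              mask = np.zeros_like(t)
              mask[keep] = 1.0
              t *= mask
          return t

      n_voxels, n_dwell, active = 400, 120, 25
      D = rng.random((n_voxels, n_dwell))                     # placeholder dose-rate kernel
      t_true = np.zeros(n_dwell)
      t_true[rng.choice(n_dwell, active, replace=False)] = rng.random(active)
      d = D @ t_true                                          # synthetic prescription
      t_est = iht_dwell_times(D, d, sparsity=active)
      print("relative dose error:", np.linalg.norm(D @ t_est - d) / np.linalg.norm(d))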

  13. Convex optimization problem prototyping for image reconstruction in computed tomography with the Chambolle–Pock algorithm

    DEFF Research Database (Denmark)

    Sidky, Emil Y.; Jørgensen, Jakob Heide; Pan, Xiaochuan

    2012-01-01

    The primal–dual optimization algorithm developed in Chambolle and Pock (CP) (2011 J. Math. Imag. Vis. 40 1–26) is applied to various convex optimization problems of interest in computed tomography (CT) image reconstruction. This algorithm allows for rapid prototyping of optimization problems for the purpose of designing iterative image reconstruction algorithms for CT. The primal–dual algorithm is briefly summarized in this paper, and its potential for prototyping is demonstrated by explicitly deriving CP algorithm instances for many optimization problems relevant to CT. An example application ...
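
    As a small, self-contained illustration of the prototyping idea, the sketch below writes down the CP iteration for the total-variation (ROF) denoising problem; the CT reconstruction instances derived in the paper are not reproduced, and the step sizes and iteration count are standard textbook choices.

      # Chambolle-Pock instance for ROF denoising:  min_u  lam * TV(u) + 0.5 * ||u - f||^2
      import numpy as np

      def grad(u):                                   # forward-difference gradient, K
          gx = np.roll(u, -1, axis=0) - u
          gy = np.roll(u, -1, axis=1) - u
          return np.stack([gx, gy])

      def div(p):                                    # negative adjoint of grad, i.e. -K^T p
          px, py = p
          return (px - np.roll(px, 1, axis=0)) + (py - np.roll(py, 1, axis=1))

      def chambolle_pock_tv(f, lam=0.1, iters=300):
          L2 = 8.0                                   # bound on ||K||^2 for this discretization
          tau = sigma = 1.0 / np.sqrt(L2)            # step sizes with tau * sigma * L2 <= 1
          u = f.copy(); u_bar = f.copy()
          p = np.zeros((2, *f.shape))
          for _ in range(iters):
              # dual ascent + projection onto the lam-ball (prox of F*)
              p += sigma * grad(u_bar)
              p /= np.maximum(1.0, np.sqrt(p[0] ** 2 + p[1] ** 2) / lam)
              # primal descent + prox of G(u) = 0.5 * ||u - f||^2
              u_old = u
              u = (u + tau * div(p) + tau * f) / (1.0 + tau)
              u_bar = 2.0 * u - u_old                # over-relaxation with theta = 1
          return u

      noisy = 0.5 + 0.1 * np.random.default_rng(4).standard_normal((64, 64))
      print(chambolle_pock_tv(noisy, lam=0.1).shape)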

  14. Multi-objective optimization of the control strategy of electric vehicle electro-hydraulic composite braking system with genetic algorithm

    OpenAIRE

    Zhang Fengjiao; Wei Minxiang

    2015-01-01

    Optimization of the control strategy plays an important role in improving the performance of electric vehicles. In order to improve the braking stability and recover the braking energy, a multi-objective genetic algorithm is applied to optimize the key parameters in the control strategy of electric vehicle electro-hydraulic composite braking system. Various limitations are considered in the optimization process, and the optimization results are verified by a software simulation platform of el...

  15. Multi-objective optimization of cellular scanning strategy in selective laser melting

    DEFF Research Database (Denmark)

    Ahrari, Ali; Deb, Kalyanmoy; Mohanty, Sankhya

    2017-01-01

    The scanning strategy for selective laser melting - an additive manufacturing process - determines the temperature fields during the manufacturing process, which in turn affect residual stresses and distortions, two of the main sources of process-induced defects. The goal of this study is to develop a multi-objective approach to optimize the cellular scanning strategy such that the two aforementioned defects are minimized. The decision variable in the chosen problem is a combination of the sequence in which cells are processed and one of six scanning strategies applied to each cell. Thus...

  16. Analysis for Influence of Market Information on Firms' Optimal Strategies in Multidimensional Bertrand Game

    Institute of Scientific and Technical Information of China (English)

    Deqing Tan; Guangzhong Liu

    2004-01-01

    The Bertrand model of a two-firm static multidimensional game with incomplete information, for two kinds of products with a certain degree of substitution, is discussed in this paper, which analyzes the influence of the firms' forecasts of total market demand on their optimal strategies according to market information. The conclusion is that the more market information a firm masters, the greater the influence that differences between forecasted and expected market demands have on the equilibrium strategies; conversely, the less information it masters, the smaller this influence.

  17. Optimization of the matrix inversion tomosynthesis (MITS) impulse response and modulation transfer function characteristics for chest imaging.

    Science.gov (United States)

    Godfrey, Devon J; McAdams, H P; Dobbins, James T

    2006-03-01

    Matrix inversion tomosynthesis (MITS) uses linear systems theory, along with a priori knowledge of the imaging geometry, to deterministically distinguish between true structure and overlying tomographic blur in a set of conventional tomosynthesis planes. In this paper we examine the effect of total scan angle (ANG), number of input projections (N), and plane separation/number of reconstructed planes (NP) on the MITS impulse response (IR) and modulation transfer function (MTF), with the purpose of optimizing MITS imaging of the chest. MITS IR and MTF data were generated by simulating the imaging of a very thin wire, using various combinations of ANG, N, and NP. Actual tomosynthesis data of an anthropomorphic chest phantom were acquired with a prototype experimental system, using the same imaging parameter combinations as those in the simulations. Thoracic projection data from two human subjects were collected for corroboration of the system response analysis in vivo. Results suggest that ANG=20 degrees, N=71, NP=69 is the optimal combination for MITS chest imaging given the inherent constraints of our prototype system. MITS chest data from human subjects demonstrates that the selected imaging strategy can effectively produce high-quality MITS thoracic images in vivo.

  18. Natural Image Enhancement Using a Biogeography Based Optimization Enhanced with Blended Migration Operator

    Directory of Open Access Journals (Sweden)

    J. Jasper

    2014-01-01

    Full Text Available This paper presents a novel and efficient algorithm for solving an optimization problem in image processing applications. Image enhancement (IE) is one of the complex optimization problems in image processing. The main goal of this paper is to enhance color images such that the quality of the enhanced image is more suitable than that of the original image from the human perceptual viewpoint. Traditional methods require prior knowledge of the image to be enhanced, whereas the aim of the proposed biogeography based optimization (BBO) enhanced with a blended migration operator (BMO) algorithm is to maximize an objective function in order to enhance the image contrast by maximizing parameters such as edge intensity, edge information, and entropy. Experimental results are compared with current state-of-the-art approaches and indicate the superiority of the proposed technique in terms of subjective and objective evaluation.

  19. Optimized protocols for cardiac magnetic resonance imaging in patients with thoracic metallic implants

    Energy Technology Data Exchange (ETDEWEB)

    Olivieri, Laura J.; Ratnayaka, Kanishka [Children's National Health System, Division of Cardiology, Washington, DC (United States); National Institutes of Health, National Heart, Lung and Blood Institute, Bethesda, MD (United States); Cross, Russell R.; O'Brien, Kendall E. [Children's National Health System, Division of Cardiology, Washington, DC (United States); Hansen, Michael S. [National Institutes of Health, National Heart, Lung and Blood Institute, Bethesda, MD (United States)]

    2015-09-15

    Cardiac magnetic resonance (MR) imaging is a valuable tool in congenital heart disease; however patients frequently have metal devices in the chest from the treatment of their disease that complicate imaging. Methods are needed to improve imaging around metal implants near the heart. Basic sequence parameter manipulations have the potential to minimize artifact while limiting effects on image resolution and quality. Our objective was to design cine and static cardiac imaging sequences to minimize metal artifact while maintaining image quality. Using systematic variation of standard imaging parameters on a fluid-filled phantom containing commonly used metal cardiac devices, we developed optimized sequences for steady-state free precession (SSFP), gradient recalled echo (GRE) cine imaging, and turbo spin-echo (TSE) black-blood imaging. We imaged 17 consecutive patients undergoing routine cardiac MR with 25 metal implants of various origins using both standard and optimized imaging protocols for a given slice position. We rated images for quality and metal artifact size by measuring metal artifact in two orthogonal planes within the image. All metal artifacts were reduced with optimized imaging. The average metal artifact reduction for the optimized SSFP cine was 1.5+/-1.8 mm, and for the optimized GRE cine the reduction was 4.6+/-4.5 mm (P < 0.05). Quality ratings favored the optimized GRE cine. Similarly, the average metal artifact reduction for the optimized TSE images was 1.6+/-1.7 mm (P < 0.05), and quality ratings favored the optimized TSE imaging. Imaging sequences tailored to minimize metal artifact are easily created by modifying basic sequence parameters, and images are superior to standard imaging sequences in both quality and artifact size. Specifically, for optimized cine imaging a GRE sequence should be used with settings that favor short echo time, i.e. flow compensation off, weak asymmetrical echo and a relatively high receiver bandwidth. For static

  20. Optimal Coaddition of Imaging Data for Rapidly Fading Gamma-Ray Burst Afterglows

    CERN Document Server

    Morgan, A N; Roming, P W A; Nousek, J A; Koch, T S; Breeveld, A A; de Pasquale, M; Holland, S T; Kuin, N P M; Page, M J; Still, M

    2008-01-01

    We present a technique for optimal coaddition of image data for rapidly varying sources, with specific application to gamma-ray burst (GRB) afterglows. Unweighted coaddition of rapidly fading afterglow lightcurve data becomes counterproductive relatively quickly. It is better to stop coaddition of the data once noise dominates late exposures. A better alternative is to optimally weight each exposure to maximize the signal-to-noise ratio (S/N) of the final coadded image data. By using information about GRB lightcurves and image noise characteristics, optimal image coaddition increases the probability of afterglow detection and places the most stringent upper limits on non-detections. For a temporal power law flux decay typical of GRB afterglows, optimal coaddition has the greatest potential to improve the S/N of afterglow imaging data (relative to unweighted coaddition), when the decay rate is high, the source count rate is low, and the background rate is high. The optimal coaddition technique is demonstrated ...
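
    The sketch below illustrates the weighting idea for a power-law fading source: each exposure is weighted in proportion to its expected source counts divided by its noise variance, which maximizes the signal-to-noise ratio of the weighted sum, and the result is compared with unweighted coaddition. The decay and background numbers are illustrative assumptions, not fitted GRB values.

      # S/N-optimal vs. unweighted coaddition of exposures of a power-law fading afterglow.
      import numpy as np

      def optimal_coadd_snr(t_start, t_stop, f0, alpha, background_rate):
          """Compare S/N of weighted and unweighted coaddition for exposures [t_start, t_stop] (s)."""
          exposure = t_stop - t_start
          t_mid = 0.5 * (t_start + t_stop)
          source_counts = f0 * t_mid ** (-alpha) * exposure     # power-law decay f0 * t^-alpha
          var = source_counts + background_rate * exposure      # Poisson source + background variance
          w_opt = source_counts / var                           # S/N-optimal weights
          snr_opt = (w_opt * source_counts).sum() / np.sqrt((w_opt ** 2 * var).sum())
          snr_unw = source_counts.sum() / np.sqrt(var.sum())    # unweighted coaddition
          return snr_opt, snr_unw

      # Ten consecutive 100 s exposures starting 300 s after the burst (illustrative numbers).
      starts = 300.0 + 100.0 * np.arange(10)
      print(optimal_coadd_snr(starts, starts + 100.0, f0=50.0, alpha=1.2, background_rate=0.8))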