WorldWideScience

Sample records for optimal imaging strategy

  1. An Image Enhancement Method Using the Quantum-Behaved Particle Swarm Optimization with an Adaptive Strategy

    Directory of Open Access Journals (Sweden)

    Xiaoping Su

    2013-01-01

    Full Text Available Image enhancement techniques are very important to image processing; they are used to improve image quality or to extract the fine details in degraded images. In this paper, two novel objective functions based on the normalized incomplete Beta transform function are proposed to evaluate the effectiveness of grayscale image enhancement and color image enhancement, respectively. Using these objective functions, the parameters of the transform functions are estimated by quantum-behaved particle swarm optimization (QPSO). We also propose an improved QPSO with an adaptive parameter control strategy (AQPSO). The QPSO and AQPSO algorithms, along with the genetic algorithm (GA) and particle swarm optimization (PSO), are tested on several benchmark grayscale and color images. The results show that QPSO and AQPSO perform better than GA and PSO for the enhancement of these images, and that AQPSO has some advantages over QPSO owing to its adaptive parameter control strategy.
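
    As a concrete illustration of the approach described in this record, the following minimal Python sketch applies a normalized incomplete Beta transform to a grayscale image and tunes its two shape parameters with a simplified quantum-behaved PSO. The entropy objective, parameter bounds, and population settings are assumptions made for illustration; the paper defines its own objective functions and an adaptive (AQPSO) parameter control scheme.

      import numpy as np
      from scipy.special import betainc

      def enhance(img, a, b):
          # Normalized incomplete Beta transform applied to intensities scaled to [0, 1].
          x = (img - img.min()) / (img.max() - img.min() + 1e-12)
          return betainc(a, b, x)

      def objective(enhanced):
          # Stand-in enhancement score (assumption): Shannon entropy of the result.
          hist, _ = np.histogram(enhanced, bins=256, range=(0.0, 1.0))
          p = hist[hist > 0] / hist.sum()
          return -np.sum(p * np.log2(p))

      def qpso(img, n_particles=20, n_iter=50, bounds=(0.1, 10.0), seed=0):
          # Simplified quantum-behaved PSO over the two Beta parameters (alpha, beta).
          rng = np.random.default_rng(seed)
          lo, hi = bounds
          x = rng.uniform(lo, hi, size=(n_particles, 2))
          pbest = x.copy()
          pbest_f = np.array([objective(enhance(img, *p)) for p in x])
          for t in range(n_iter):
              gbest = pbest[np.argmax(pbest_f)]
              mbest = pbest.mean(axis=0)                  # mean best position
              beta_ce = 1.0 - 0.5 * t / n_iter            # contraction-expansion coefficient
              phi = rng.uniform(size=x.shape)
              attractor = phi * pbest + (1.0 - phi) * gbest
              u = rng.uniform(1e-12, 1.0, size=x.shape)
              sign = np.where(rng.uniform(size=x.shape) < 0.5, -1.0, 1.0)
              x = np.clip(attractor + sign * beta_ce * np.abs(mbest - x) * np.log(1.0 / u), lo, hi)
              f = np.array([objective(enhance(img, *p)) for p in x])
              better = f > pbest_f
              pbest[better], pbest_f[better] = x[better], f[better]
          return pbest[np.argmax(pbest_f)]                # best (alpha, beta) found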

  2. Mixed-integer evolution strategies for parameter optimization and their applications to medical image analysis

    NARCIS (Netherlands)

    Li, Rui

    2009-01-01

    The goal of this work is to extend the canonical Evolution Strategies (ES) from the traditional real-valued parameter optimization domain to the mixed-integer parameter optimization domain. This is necessary because there exist numerous practical optimization problems in industry in which the set of

  3. Medical Image Registration by means of a Bio-Inspired Optimization Strategy

    Directory of Open Access Journals (Sweden)

    Hariton Costin

    2012-07-01

    Full Text Available Medical imaging mainly treats and processes missing, ambiguous, complementary, redundant and distorted data. Biomedical image registration is the process of geometric overlaying or alignment of two or more 2D/3D images of the same scene, taken at different time slots, from different angles, and/or by different acquisition systems. In medical practice, it is becoming increasingly important in diagnosis, treatment planning, functional studies, computer-guided therapies, and in biomedical research. Technically, image registration implies a complex optimization of different parameters, performed at local or/and global levels. Local optimization methods frequently fail because functions of the involved metrics with respect to transformation parameters are generally nonconvex and irregular. Therefore, global methods are often required, at least at the beginning of the procedure. In this paper, a new evolutionary and bio-inspired approach -- bacterial foraging optimization -- is adapted for single-slice to 3-D PET and CT multimodal image registration. Preliminary results of optimizing the normalized mutual information similarity metric validated the efficacy of the proposed method by using a freely available medical image database.
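
    The similarity metric named in this record, normalized mutual information, can be written down compactly. The sketch below is an assumption-level illustration rather than the authors' code: it computes Studholme's normalized MI from a joint intensity histogram of two overlapping images, which a bacterial foraging optimizer would then maximize over the transformation parameters.

      import numpy as np

      def normalized_mutual_information(a, b, bins=64):
          # Joint histogram of two already-resampled, overlapping images or volumes.
          joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
          pxy = joint / joint.sum()
          px, py = pxy.sum(axis=1), pxy.sum(axis=0)

          def entropy(p):
              p = p[p > 0]
              return -np.sum(p * np.log(p))

          # Studholme's normalized MI: (H(A) + H(B)) / H(A, B); largest at best alignment.
          return (entropy(px) + entropy(py)) / entropy(pxy.ravel())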

  4. Living renal donors: optimizing the imaging strategy--decision- and cost-effectiveness analysis

    NARCIS (Netherlands)

    Y.S. Liem (Ylian Serina); M.C.J.M. Kock (Marc); W. Weimar (Willem); K. Visser (Karen); M.G.M. Hunink (Myriam); J.N.M. IJzermans (Jan)

    2003-01-01

    PURPOSE: To determine the most cost-effective strategy for preoperative imaging performed in potential living renal donors. MATERIALS AND METHODS: In a decision-analytic model, the societal cost-effectiveness of digital subtraction angiography (DSA), gadolinium-enhanced

  5. Task-based strategy for optimized contrast enhanced breast imaging: Analysis of six imaging techniques for mammography and tomosynthesis

    Science.gov (United States)

    Ikejimba, Lynda C.; Kiarashi, Nooshin; Ghate, Sujata V.; Samei, Ehsan; Lo, Joseph Y.

    2014-01-01

    Purpose: The use of contrast agents in breast imaging has the capability of enhancing nodule detectability and providing physiological information. Accordingly, there has been a growing trend toward using iodine as a contrast medium in digital mammography (DM) and digital breast tomosynthesis (DBT). Widespread use raises concerns about the best way to use iodine in DM and DBT, and thus a comparison is necessary to evaluate typical iodine-enhanced imaging methods. This study used a task-based observer model to determine the optimal imaging approach by analyzing six imaging paradigms in terms of their ability to resolve iodine at a given dose: unsubtracted mammography and tomosynthesis, temporal subtraction mammography and tomosynthesis, and dual energy subtraction mammography and tomosynthesis. Methods: Imaging performance was characterized using a detectability index d′, derived from the system task transfer function (TTF), an imaging task, iodine signal difference, and the noise power spectrum (NPS). The task modeled a 10 mm diameter lesion containing iodine concentrations between 2.1 mg/cc and 8.6 mg/cc. TTF was obtained using an edge phantom, and the NPS was measured over several exposure levels, energies, and target-filter combinations. Using a structured CIRS phantom, d′ was generated as a function of dose and iodine concentration. Results: For all iodine concentrations and doses, temporal subtraction techniques for mammography and tomosynthesis yielded the highest d′, while dual energy techniques for both modalities demonstrated the next best performance. Unsubtracted imaging resulted in the lowest d′ values for both modalities, with unsubtracted mammography performing the worst of all six paradigms. Conclusions: At any dose, temporal subtraction imaging provides the greatest detectability, with temporally subtracted DBT performing best. The authors attribute the successful performance to excellent cancellation of in-plane structures and
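
    The detectability index used in this record and in the conference version below combines the measured TTF, the modeled task, and the NPS. The abstracts do not state which observer-model variant was used, so the expression below is given only as one common non-prewhitening form, for orientation:

      d'^2 = \frac{\left[\iint \lvert W_{\mathrm{task}}(u,v)\rvert^{2}\,\mathrm{TTF}^{2}(u,v)\,du\,dv\right]^{2}}
                  {\iint \lvert W_{\mathrm{task}}(u,v)\rvert^{2}\,\mathrm{TTF}^{2}(u,v)\,\mathrm{NPS}(u,v)\,du\,dv}

    Here W_task(u, v) is the Fourier transform of the modeled signal difference (the iodinated lesion against its background), so d' increases with iodine contrast and system resolution and decreases with image noise.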

  6. Task-based strategy for optimized contrast enhanced breast imaging: analysis of six imaging techniques for mammography and tomosynthesis

    Science.gov (United States)

    Ikejimba, Lynda; Kiarashi, Nooshin; Lin, Yuan; Chen, Baiyu; Ghate, Sujata V.; Zerhouni, Moustafa; Samei, Ehsan; Lo, Joseph Y.

    2012-03-01

    Digital breast tomosynthesis (DBT) is a novel x-ray imaging technique that provides 3D structural information of the breast. In contrast to 2D mammography, DBT minimizes tissue overlap, potentially improving cancer detection and reducing the number of unnecessary recalls. The addition of a contrast agent to DBT and mammography for lesion enhancement has the benefit of providing functional information about a lesion, as lesion contrast uptake and washout patterns may help differentiate between benign and malignant tumors. This study used a task-based method to determine the optimal imaging approach by analyzing six imaging paradigms in terms of their ability to resolve iodine at a given dose: contrast enhanced mammography and tomosynthesis, temporal subtraction mammography and tomosynthesis, and dual energy subtraction mammography and tomosynthesis. Imaging performance was characterized using a detectability index d', derived from the system task transfer function (TTF), an imaging task, iodine contrast, and the noise power spectrum (NPS). The task modeled a 5 mm lesion containing iodine concentrations between 2.1 mg/cc and 8.6 mg/cc. TTF was obtained using an edge phantom, and the NPS was measured over several exposure levels, energies, and target-filter combinations. Using a structured CIRS phantom, d' was generated as a function of dose and iodine concentration. In general, higher dose gave higher d', but for the lowest iodine concentration and lowest dose, dual energy subtraction tomosynthesis and temporal subtraction tomosynthesis demonstrated the highest performance.

  7. Optimal GENCO bidding strategy

    Science.gov (United States)

    Gao, Feng

    Electricity industries worldwide are undergoing a period of profound upheaval. The conventional vertically integrated mechanism is being replaced by a competitive market environment. Generation companies have incentives to apply novel technologies to lower production costs, for example Combined Cycle units. Economic dispatch with Combined Cycle units becomes a non-convex optimization problem, which is difficult if not impossible to solve by conventional methods. Several techniques are proposed here: Mixed Integer Linear Programming, a hybrid method, and Evolutionary Algorithms. Evolutionary Algorithms share a common mechanism: stochastic search at each generation. This stochastic property makes evolutionary algorithms robust and adaptive enough to solve a non-convex optimization problem. This research implements GA, EP, and PS algorithms for economic dispatch with Combined Cycle units, and compares them with classical Mixed Integer Linear Programming. The electricity market equilibrium model not only helps the Independent System Operator/Regulator analyze market performance and market power, but also gives Market Participants the ability to build optimal bidding strategies based on microeconomic analysis. Supply Function Equilibrium (SFE) is attractive compared to traditional models. This research identifies a proper SFE model that can be applied to a multiple-period situation. The equilibrium condition under fuel resource constraints is then developed using discrete-time optimal control. Finally, the research discusses the issues of multiple equilibria and mixed strategies, which are caused by the transmission network. Additionally, an advantage of the proposed model for merchant transmission planning is discussed. A market simulator is a valuable training and evaluation tool to assist sellers, buyers, and regulators in understanding market performance and making better decisions. A traditional optimization model may not be enough to consider the distributed

  8. Optimal intermittent search strategies

    International Nuclear Information System (INIS)

    Rojo, F; Budde, C E; Wio, H S

    2009-01-01

    We study the search kinetics of a single fixed target by a set of searchers performing an intermittent random walk, jumping between different internal states. Exploiting concepts of multi-state and continuous-time random walks we have calculated the survival probability of a target up to time t, and have 'optimized' (minimized) it with regard to the transition probability among internal states. Our model shows that intermittent strategies always improve target detection, even for simple diffusion states of motion.
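
    To make the optimization concrete, the toy Monte Carlo sketch below estimates the survival probability of a fixed target on a ring for a two-state searcher (a detecting diffusive phase and a blind ballistic phase) as a function of the switching probability; sweeping that probability and taking the minimum mimics the optimization described above. All model details (ring geometry, two states, equal switching rates) are simplifying assumptions, not the paper's general multi-state formulation.

      import numpy as np

      def survival_probability(p_switch, L=100, T=2000, n_runs=500, seed=1):
          """Monte Carlo estimate of the probability that a fixed target (site 0 on a
          ring of L sites) is still undetected at time T, for a searcher alternating
          between a detecting diffusive state and a non-detecting ballistic state."""
          rng = np.random.default_rng(seed)
          survived = 0
          for _ in range(n_runs):
              pos = rng.integers(1, L)          # random start, away from the target
              state = 0                         # 0 = diffusive/detecting, 1 = ballistic
              direction = 1
              detected = False
              for _ in range(T):
                  if state == 0:
                      pos = (pos + rng.choice((-1, 1))) % L
                      if pos == 0:
                          detected = True
                          break
                  else:
                      pos = (pos + direction) % L   # moves quickly but "blind"
                  if rng.random() < p_switch:
                      state = 1 - state
                      direction = rng.choice((-1, 1))
              survived += not detected
          return survived / n_runs

      # Sweep the switching probability to locate the minimum of the survival probability:
      # for p in (0.01, 0.05, 0.1, 0.2, 0.5):
      #     print(p, survival_probability(p))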

  9. Thoracic lymph node station recognition on CT images based on automatic anatomy recognition with an optimal parent strategy

    Science.gov (United States)

    Xu, Guoping; Udupa, Jayaram K.; Tong, Yubing; Cao, Hanqiang; Odhner, Dewey; Torigian, Drew A.; Wu, Xingyu

    2018-03-01

    Currently, many papers have been published on the detection and segmentation of lymph nodes from medical images. However, this is still a challenging problem owing to low contrast with surrounding soft tissues and to the variations of lymph node size and shape on computed tomography (CT) images. It is particularly difficult on the low-dose CT of PET/CT acquisitions. In this study, we utilize our previous automatic anatomy recognition (AAR) framework to recognize the thoracic lymph node stations defined by the International Association for the Study of Lung Cancer (IASLC) lymph node map. The lymph node stations themselves are viewed as anatomic objects and are localized by using a one-shot method within the AAR framework. Two strategies are taken in this paper for integration into the AAR framework. The first is to combine some lymph node stations into composite lymph node stations according to their geometric nearness. The second is to find the optimal parent (organ or union of organs) as an anchor for each lymph node station based on the recognition error, and thereby to find an overall optimal hierarchy arranging anchor organs and lymph node stations. Based on 28 contrast-enhanced thoracic CT image data sets for model building and 12 independent data sets for testing, our results show that thoracic lymph node stations can be localized to within 2-3 voxels of the ground truth.

  10. Optimal fuel inventory strategies

    International Nuclear Information System (INIS)

    Caspary, P.J.; Hollibaugh, J.B.; Licklider, P.L.; Patel, K.P.

    1990-01-01

    In an effort to maintain their competitive edge, most utilities are reevaluating many of their conventional practices and policies to further minimize customer revenue requirements without sacrificing system reliability. Over the past several years, Illinois Power has been rethinking its traditional fuel inventory strategies, recognizing that coal supplies are competitive and plentiful and that carrying charges on inventory are expensive. To help the Company achieve one of its strategic corporate goals, an optimal fuel inventory study was performed for its five major coal-fired generating stations. The purpose of this paper is to briefly describe Illinois Power's system and past practices concerning coal inventories, highlight the analytical process behind the optimal fuel inventory study, and discuss some of the recent experiences affecting coal deliveries and economic dispatch.

  11. Optimal intermittent search strategies

    Energy Technology Data Exchange (ETDEWEB)

    Rojo, F; Budde, C E [FaMAF, Universidad Nacional de Cordoba, Ciudad Universitaria, X5000HUA Cordoba (Argentina); Wio, H S [Instituto de Fisica de Cantabria, Universidad de Cantabria and CSIC E-39005 Santander (Spain)

    2009-03-27

    We study the search kinetics of a single fixed target by a set of searchers performing an intermittent random walk, jumping between different internal states. Exploiting concepts of multi-state and continuous-time random walks we have calculated the survival probability of a target up to time t, and have 'optimized' (minimized) it with regard to the transition probability among internal states. Our model shows that intermittent strategies always improve target detection, even for simple diffusion states of motion.

  12. Satellite image collection optimization

    Science.gov (United States)

    Martin, William

    2002-09-01

    Imaging satellite systems represent a high capital cost. Optimizing the collection of images is critical both for satisfying customer orders and for building a sustainable satellite operations business. We describe the functions of an operational, multivariable, time-dynamic optimization system that maximizes the daily collection of satellite images. A graphical user interface allows the operator to quickly see the results of what-if adjustments to an image collection plan. Used for both long-range planning and daily collection scheduling of Space Imaging's IKONOS satellite, the satellite control and tasking (SCT) software allows collection commands to be altered up to 10 min before upload to the satellite.

  13. [Imaging center - optimization of the imaging process].

    Science.gov (United States)

    Busch, H-P

    2013-04-01

    Hospitals around the world are under increasing pressure to optimize the economic efficiency of treatment processes. Imaging is responsible for a great part of the success, but also of the costs, of treatment. In routine work, an excessive supply of imaging methods leads to an "as well as" strategy up to the limit of capacity, without critical reflection. Exams that have no predictable influence on the clinical outcome are an unjustified burden for the patient. They are useless and threaten the financial situation and existence of the hospital. In recent years the focus of process optimization was exclusively on the quality and efficiency of performed single examinations. In the future, critical discussion of the effectiveness of single exams in relation to the clinical outcome will become more important. Unnecessary exams can be avoided only if, in addition to the optimization of single exams (efficiency), there is an optimization strategy for the total imaging process (efficiency and effectiveness). This requires a new definition of processes (Imaging Pathway), new structures for organization (Imaging Center) and a new kind of thinking on the part of the medical staff. Motivation has to be changed from gratification of performed exams to gratification of process quality (medical quality, service quality, economics), including the avoidance of additional (unnecessary) exams. © Georg Thieme Verlag KG Stuttgart · New York.

  14. Implementing optimal thinning strategies

    Science.gov (United States)

    Kurt H. Riitters; J. Douglas Brodie

    1984-01-01

    Optimal thinning regimes for achieving several management objectives were derived from two stand-growth simulators by dynamic programming. Residual mean tree volumes were then plotted against stand density management diagrams. The results supported the use of density management diagrams for comparing, checking, and implementing the results of optimization analyses....

  15. Optimizing decommissioning strategies

    International Nuclear Information System (INIS)

    Passant, F.H.

    1993-01-01

    Many different approaches can be considered for achieving satisfactory decommissioning of nuclear installations. These can embrace several different engineering actions at several stages, with time variations between the stages. Multi-attribute analysis can be used to help in the decision-making process and to establish the optimum strategy. It has been used in the USA and the UK to help in selecting preferred sites for radioactive waste repositories, in the UK to help with the choice of preferred sites for locating PWR stations, and in selecting optimum decommissioning strategies.

  16. Optimal temporal windows and dose-reducing strategy for coronary artery bypass graft imaging with 256-slice CT

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Kun-Mu [Department of Radiology, Shin Kong Wu Ho-Su Memorial Hospital, 95 Wen Chang Road, Shih Lin District, Taipei 111, Taiwan. (China); Lee, Yi-Wei [Department of Radiology, Kaohsiung Chang Gung Memorial Hospital and Chang Gung University College of Medicine, Kaohsiung, Taiwan (China); Department of Biomedical Imaging and Radiological Sciences, National Yang Ming University, Taipei, Taiwan (China); Guan, Yu-Xiang [Department of Biomedical Imaging and Radiological Sciences, National Yang Ming University, Taipei, Taiwan (China); Chen, Liang-Kuang [Department of Radiology, Shin Kong Wu Ho-Su Memorial Hospital, 95 Wen Chang Road, Shih Lin District, Taipei 111, Taiwan. (China); School of Medicine, Fu Jen Catholic University, Taipei, Taiwan (China); Law, Wei-Yip, E-mail: m002325@ms.skh.org.tw [Department of Radiology, Shin Kong Wu Ho-Su Memorial Hospital, 95 Wen Chang Road, Shih Lin District, Taipei 111, Taiwan. (China); Su, Chen-Tau, E-mail: m005531@ms.skh.org.tw [Department of Radiology, Shin Kong Wu Ho-Su Memorial Hospital, 95 Wen Chang Road, Shih Lin District, Taipei 111, Taiwan. (China); School of Medicine, Fu Jen Catholic University, Taipei, Taiwan (China)

    2013-12-11

    Objective: To determine the optimal image reconstruction windows for the assessment of coronary artery bypass grafts (CABGs) with 256-slice computed tomography (CT), and to assess the associated optimal pulsing windows for electrocardiogram-triggered tube current modulation (ETCM). Methods: We recruited 18 patients (three female; mean age 68.9 years) with a mean heart rate (HR) of 66.3 beats per minute (bpm) and a heart rate variability of 1.3 bpm for this study. A total of 36 CABGs with 168 segments were evaluated, including 12 internal mammary artery grafts (33.3%) and 24 saphenous vein grafts (66.7%). We reconstructed 20 data sets in 5% steps through 0–95% of the R–R interval. The image quality of the CABGs was assessed on a 5-point scale (1=excellent to 5=non-diagnostic) for each segment (proximal anastomosis, proximal, middle, and distal course of the graft body, and distal anastomosis). Two reviewers determined the optimal reconstruction interval for each CABG segment in each temporal window. Optimal windows for ETCM were also evaluated. Results: The optimal systolic and diastolic reconstruction intervals could be divided into two groups by a threshold of HR=68 bpm. The best reconstruction intervals for low heart rates (HR<68) and high heart rates (HR>68) were 76.0±2.5% and 45.0±0%, respectively. The average image quality score was 1.7±0.6, with good inter-observer agreement (kappa=0.79). Image quality was significantly better for saphenous vein grafts than for arterial grafts (P<0.001). The recommended ETCM windows for the low-HR, high-HR and all-HR groups were 40–50%, 71–81% and 40–96% of the R–R interval, respectively. The corresponding dose savings were about 60.8%, 58.7% and 22.7%, in that order. Conclusions: We determined optimal reconstruction intervals and ETCM windows representing a good compromise between radiation dose and image quality for the follow-up of bypass surgery using 256-slice CT.

  17. Optimization of a Pretargeted Strategy for the PET Imaging of Colorectal Carcinoma via the Modulation of Radioligand Pharmacokinetics.

    Science.gov (United States)

    Zeglis, Brian M; Brand, Christian; Abdel-Atti, Dalya; Carnazza, Kathryn E; Cook, Brendon E; Carlin, Sean; Reiner, Thomas; Lewis, Jason S

    2015-10-05

    Pretargeted PET imaging has emerged as an effective strategy for merging the exquisite selectivity of antibody-based targeting vectors with the rapid pharmacokinetics of radiolabeled small molecules. We previously reported the development of a strategy for the pretargeted PET imaging of colorectal cancer based on the bioorthogonal inverse electron demand Diels-Alder reaction between a tetrazine-bearing radioligand and a transcyclooctene-modified huA33 immunoconjugate. Although this method effectively delineated tumor tissue, its clinical potential was limited by the somewhat sluggish clearance of the radioligand through the gastrointestinal tract. Herein, we report the development and in vivo validation of a pretargeted strategy for the PET imaging of colorectal carcinoma with dramatically improved pharmacokinetics. Two novel tetrazine constructs, Tz-PEG7-NOTA and Tz-SarAr, were synthesized, characterized, and radiolabeled with (64)Cu in high yield (>90%) and radiochemical purity (>99%). PET imaging and biodistribution experiments in healthy mice revealed that although (64)Cu-Tz-PEG7-NOTA is cleared via both the gastrointestinal and urinary tracts, (64)Cu-Tz-SarAr is rapidly excreted by the renal system alone. On this basis, (64)Cu-Tz-SarAr was selected for further in vivo evaluation. To this end, mice bearing A33 antigen-expressing SW1222 human colorectal carcinoma xenografts were administered huA33-TCO, and the immunoconjugate was given 24 h to accumulate at the tumor and clear from the blood, after which (64)Cu-Tz-SarAr was administered via intravenous tail vein injection. PET imaging and biodistribution experiments revealed specific uptake of the radiotracer in the tumor at early time points (5.6 ± 0.7 %ID/g at 1 h p.i.), high tumor-to-background activity ratios, and rapid elimination of unclicked radioligand. Importantly, experiments with longer antibody accumulation intervals (48 and 120 h) yielded slight decreases in tumoral uptake but also concomitant

  18. Optimal coal import strategy

    International Nuclear Information System (INIS)

    Chen, C.Y.; Shih, L.H.

    1992-01-01

    Recently, the main power company in Taiwan has shifted its primary energy resource from oil to coal and has tried to diversify the coal supply across various sources. The company wants the imported coal to meet environmental standards and operational requirements as well as to have a high heating value. In order to achieve these objectives, the establishment of a coal blending system for Taiwan is necessary. A mathematical model using mixed integer programming is used to model the import strategy and the blending system. 6 refs., 1 tab
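
    A stripped-down version of the blending decision described in this record can be posed as a linear program over blend fractions. The sketch below uses illustrative, entirely hypothetical cost and quality figures and omits the integer (source-selection and shipment) variables that make the paper's model a true mixed integer program.

      import numpy as np
      from scipy.optimize import linprog

      # Illustrative data (hypothetical): cost ($/t), heating value (kcal/kg),
      # sulfur (%), and ash (%) for three candidate supply sources.
      cost   = np.array([55.0, 48.0, 62.0])
      heat   = np.array([6200.0, 5800.0, 6700.0])
      sulfur = np.array([0.8, 1.2, 0.5])
      ash    = np.array([12.0, 16.0, 9.0])

      # Decision variables: blend fractions x_i summing to 1.
      # Blend requirements (assumptions): heat >= 6000, sulfur <= 1.0, ash <= 13.
      A_ub = np.vstack([-heat, sulfur, ash])
      b_ub = np.array([-6000.0, 1.0, 13.0])
      A_eq = np.ones((1, 3))
      b_eq = np.array([1.0])

      res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                    bounds=[(0, 1)] * 3, method="highs")
      print(res.x, res.fun)   # optimal blend fractions and blended cost per tonne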

  19. Switching strategies to optimize search

    International Nuclear Information System (INIS)

    Shlesinger, Michael F

    2016-01-01

    Search strategies are explored when the search time is fixed, success is probabilistic and the estimate for success can diminish with time if there is not a successful result. Under the time constraint the problem is to find the optimal time to switch a search strategy or search location. Several variables are taken into account, including cost, gain, rate of success if a target is present and the probability that a target is present. (paper: interdisciplinary statistical mechanics)

  20. Optimal Strategy and Business Models

    DEFF Research Database (Denmark)

    Johnson, Peter; Foss, Nicolai Juul

    2016-01-01

    This study picks up on earlier suggestions that control theory may further the study of strategy. Strategy can be formally interpreted as an idealized path optimizing heterogeneous resource deployment to produce maximum financial gain. Using standard matrix methods to describe the firm Hamiltonian ... variable of firm path, suggesting in turn that the firm's business model is the codification of the application of investment resources used to control the strategic path of value realization....

  21. Optimization strategies in complex systems

    NARCIS (Netherlands)

    Bussolari, L.; Contucci, P.; Giardinà, C.; Giberti, C.; Unguendoli, F.; Vernia, C.

    2003-01-01

    We consider a class of combinatorial optimization problems that emerge in a variety of domains, among which are condensed matter physics, the theory of financial risks, error-correcting codes in information transmission, molecular and protein conformation, and image restoration. We show the performances of two

  22. A fast inverse treatment planning strategy facilitating optimized catheter selection in image-guided high-dose-rate interstitial gynecologic brachytherapy.

    Science.gov (United States)

    Guthier, Christian V; Damato, Antonio L; Hesser, Juergen W; Viswanathan, Akila N; Cormack, Robert A

    2017-12-01

    Interstitial high-dose-rate (HDR) brachytherapy is an important therapeutic strategy for the treatment of locally advanced gynecologic (GYN) cancers. The outcome of this therapy is determined by the quality of the dose distribution achieved. This paper focuses on a novel yet simple heuristic for catheter selection in GYN HDR brachytherapy and its comparison against state-of-the-art optimization strategies. The proposed technique is intended to act as a decision-support tool for selecting a favorable needle configuration. The presented heuristic for catheter optimization is based on a shrinkage-type algorithm (SACO). It is compared against state-of-the-art planning in a retrospective study of 20 patients who previously received image-guided interstitial HDR brachytherapy using a Syed Neblett template. From those plans, template orientation and position are estimated via a rigid registration of the template with the actual catheter trajectories. All potential straight trajectories intersecting the contoured clinical target volume (CTV) are considered for catheter optimization. Retrospectively generated plans and clinical plans are compared with respect to dosimetric performance and optimization time. All plans were generated with a single run of the optimizer lasting 0.6-97.4 s. Compared to manual optimization, SACO yields a statistically significant (P ≤ 0.05) improvement in target coverage while at the same time fulfilling all dosimetric constraints for organs at risk (OARs). Comparing inverse planning strategies, the dosimetric evaluation of SACO and "hybrid inverse planning and optimization" (HIPO), taken as the gold standard, shows no statistically significant difference (P > 0.05). However, SACO provides the potential to reduce the number of catheters used without compromising plan quality. The proposed heuristic for needle selection provides fast catheter selection, with optimization times suited for intraoperative treatment planning. Compared to manual optimization, the

  23. Mixed integer evolution strategies for parameter optimization.

    Science.gov (United States)

    Li, Rui; Emmerich, Michael T M; Eggermont, Jeroen; Bäck, Thomas; Schütz, M; Dijkstra, J; Reiber, J H C

    2013-01-01

    Evolution strategies (ESs) are powerful probabilistic search and optimization algorithms gleaned from biological evolution theory. They have been successfully applied to a wide range of real world applications. The modern ESs are mainly designed for solving continuous parameter optimization problems. Their ability to adapt the parameters of the multivariate normal distribution used for mutation during the optimization run makes them well suited for this domain. In this article we describe and study mixed integer evolution strategies (MIES), which are natural extensions of ES for mixed integer optimization problems. MIES can deal with parameter vectors consisting not only of continuous variables but also with nominal discrete and integer variables. Following the design principles of the canonical evolution strategies, they use specialized mutation operators tailored for the aforementioned mixed parameter classes. For each type of variable, the choice of mutation operators is governed by a natural metric for this variable type, maximal entropy, and symmetry considerations. All distributions used for mutation can be controlled in their shape by means of scaling parameters, allowing self-adaptation to be implemented. After introducing and motivating the conceptual design of the MIES, we study the optimality of the self-adaptation of step sizes and mutation rates on a generalized (weighted) sphere model. Moreover, we prove global convergence of the MIES on a very general class of problems. The remainder of the article is devoted to performance studies on artificial landscapes (barrier functions and mixed integer NK landscapes), and a case study in the optimization of medical image analysis systems. In addition, we show that with proper constraint handling techniques, MIES can also be applied to classical mixed integer nonlinear programming problems.
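
    The variable-class-specific mutation operators described above can be sketched compactly. The snippet below is a simplified, assumption-laden illustration (one global step size per variable class, no bound handling); the published MIES uses per-variable strategy parameters, and these operators sit inside a full evolution-strategy selection loop.

      import numpy as np

      rng = np.random.default_rng(0)

      def mutate(cont, ints, noms, sigma, step, p, nom_domains, tau=0.5):
          """One MIES-style mutation (simplified sketch).
          cont : np.ndarray of continuous variables, ints : np.ndarray of integers,
          noms : list of nominal values, nom_domains : list of candidate sets."""
          # Self-adaptation of the strategy parameters (log-normal / logistic rules).
          sigma = sigma * np.exp(tau * rng.standard_normal())
          step  = max(1.0, step * np.exp(tau * rng.standard_normal()))
          p     = 1.0 / (1.0 + (1.0 - p) / p * np.exp(-tau * rng.standard_normal()))
          # Continuous variables: Gaussian perturbation with the adapted step size.
          cont = cont + sigma * rng.standard_normal(len(cont))
          # Integer variables: difference of two geometric variates (symmetric step).
          q = 1.0 - 1.0 / step
          g1 = rng.geometric(1.0 - q, size=len(ints)) - 1
          g2 = rng.geometric(1.0 - q, size=len(ints)) - 1
          ints = ints + g1 - g2
          # Nominal discrete variables: resample uniformly with the adapted probability p.
          noms = [rng.choice(dom) if rng.random() < p else v
                  for v, dom in zip(noms, nom_domains)]
          return cont, ints, noms, sigma, step, p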

  24. Evolution strategies for robust optimization

    NARCIS (Netherlands)

    Kruisselbrink, Johannes Willem

    2012-01-01

    Real-world (black-box) optimization problems often involve various types of uncertainties and noise emerging in different parts of the optimization problem. When this is not accounted for, optimization may fail or may yield solutions that are optimal in the classical strict notion of optimality, but

  25. Determining an optimal supply chain strategy

    Directory of Open Access Journals (Sweden)

    Intaher M. Ambe

    2012-11-01

    Full Text Available In today’s business environment, many companies want to become efficient and flexible, but have struggled, in part, because they have not been able to formulate optimal supply chain strategies. Often this is a result of insufficient knowledge about the costs involved in maintaining supply chains and the impact of the supply chain on their operations. Hence, these companies find it difficult to manufacture at a competitive cost and to respond quickly and reliably to market demand. Mismatched strategies are the root cause of the problems that plague supply chains, and supply chain strategies based on a one-size-fits-all approach often fail. The purpose of this article is to suggest instruments to determine an optimal supply chain strategy. This article, which is conceptual in nature, provides a review of current supply chain strategies and suggests a framework for determining an optimal strategy.

  26. Optimal management strategies in variable environments: Stochastic optimal control methods

    Science.gov (United States)

    Williams, B.K.

    1985-01-01

    Dynamic optimization was used to investigate the optimal defoliation of salt desert shrubs in north-western Utah. Management was formulated in the context of optimal stochastic control theory, with objective functions composed of discounted or time-averaged biomass yields. Climatic variability and community patterns of salt desert shrublands make the application of stochastic optimal control both feasible and necessary. A primary production model was used to simulate shrub responses and harvest yields under a variety of climatic regimes and defoliation patterns. The simulation results then were used in an optimization model to determine optimal defoliation strategies. The latter model encodes an algorithm for finite state, finite action, infinite discrete time horizon Markov decision processes. Three questions were addressed: (i) What effect do changes in weather patterns have on optimal management strategies? (ii) What effect does the discounting of future returns have? (iii) How do the optimal strategies perform relative to certain fixed defoliation strategies? An analysis was performed for the three shrub species, winterfat (Ceratoides lanata), shadscale (Atriplex confertifolia) and big sagebrush (Artemisia tridentata). In general, the results indicate substantial differences among species in optimal control strategies, which are associated with differences in physiological and morphological characteristics. Optimal policies for big sagebrush varied less with variation in climate, reserve levels and discount rates than did either shadscale or winterfat. This was attributed primarily to the overwintering of photosynthetically active tissue and to metabolic activity early in the growing season. Optimal defoliation of shadscale and winterfat generally was more responsive to differences in plant vigor and climate, reflecting the sensitivity of these species to utilization and replenishment of carbohydrate reserves. Similarities could be seen in the influence of both
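
    The optimization model described above is a finite state, finite action, infinite-horizon Markov decision process. As a hedged illustration of the discounted-yield variant (array names and the use of value iteration are assumptions; the time-averaged criterion used in the study requires a different recursion), a compact solver looks like this:

      import numpy as np

      def value_iteration(P, R, gamma=0.95, tol=1e-8):
          """Solve a finite-state, finite-action, infinite-horizon discounted MDP.
          P[a, s, t] : probability of moving from state s to state t under defoliation action a
          R[a, s]    : expected (biomass yield) reward for action a taken in state s
          Returns the optimal value function and a greedy policy."""
          n_actions, n_states, _ = P.shape
          V = np.zeros(n_states)
          while True:
              Q = R + gamma * np.einsum("ast,t->as", P, V)   # action values
              V_new = Q.max(axis=0)
              if np.max(np.abs(V_new - V)) < tol:
                  return V_new, Q.argmax(axis=0)
              V = V_new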

  27. Optimal Advance Selling Strategy under Price Commitment

    OpenAIRE

    Chenhang Zeng

    2012-01-01

    This paper considers a two-period model with experienced consumers and inexperienced consumers. The retailer determines both the advance selling price and the regular selling price at the beginning of the first period. I show that advance selling weakly dominates no advance selling, and that the optimal advance selling price may be at a discount, at a premium, or at the regular selling price. To help the retailer choose the optimal pricing strategy, conditions for each possible advance selling strategy to ...

  28. Optimization strategies for ultrasound volume registration

    International Nuclear Information System (INIS)

    Ijaz, Umer Zeeshan; Prager, Richard W; Gee, Andrew H; Treece, Graham M

    2010-01-01

    This paper considers registration of 3D ultrasound volumes acquired in multiple views for display in a single image volume. One way to acquire 3D data is to use a mechanically swept 3D probe. However, the usefulness of these probes is restricted by their limited field of view. This problem can be overcome by attaching a six-degree-of-freedom (DOF) position sensor to the probe, and displaying the information from multiple sweeps in their proper positions. However, an external six-DOF position sensor can be an inconvenience in a clinical setting. The objective of this paper is to propose a hybrid strategy that replaces the sensor with a combination of three-DOF image registration and an unobtrusive inertial sensor for measuring orientation. We examine a range of optimization algorithms and similarity measures for registration and compare them in in vitro and in vivo experiments. We register based on multiple reslice images rather than a whole voxel array. In this paper, we use a large number of reslices for improved reliability at the expense of computational speed. We have found that the Levenberg–Marquardt method is very fast but is not guaranteed to give the correct solution all the time. We conclude that normalized mutual information used in the Nelder–Mead simplex algorithm is potentially suitable for the registration task with an average execution time of around 5 min, in the majority of cases, with two restarts in a C++ implementation on a 3.0 GHz Intel Core 2 Duo CPU machine
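
    A hedged sketch of the registration core discussed above: normalized mutual information scored over a candidate three-DOF translation (orientation assumed to come from the inertial sensor) and minimized with the Nelder-Mead simplex method. The volume names, histogram bin count, and convergence tolerances are illustrative assumptions; the paper evaluates the metric on multiple reslice images and uses restarts.

      import numpy as np
      from scipy.ndimage import shift
      from scipy.optimize import minimize

      def neg_nmi(params, fixed, moving, bins=64):
          # Apply a 3-DOF translation and score overlap with normalized mutual information.
          moved = shift(moving, params, order=1, mode="nearest")
          joint, _, _ = np.histogram2d(fixed.ravel(), moved.ravel(), bins=bins)
          pxy = joint / joint.sum()
          px, py = pxy.sum(axis=1), pxy.sum(axis=0)
          h = lambda p: -np.sum(p[p > 0] * np.log(p[p > 0]))
          return -(h(px) + h(py)) / h(pxy.ravel())

      # fixed_vol, moving_vol: overlapping 3D ultrasound sub-volumes (or stacks of reslices)
      # result = minimize(neg_nmi, x0=np.zeros(3), args=(fixed_vol, moving_vol),
      #                   method="Nelder-Mead", options={"xatol": 0.1, "fatol": 1e-4})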

  29. Tank Waste Remediation System optimized processing strategy

    International Nuclear Information System (INIS)

    Slaathaug, E.J.; Boldt, A.L.; Boomer, K.D.; Galbraith, J.D.; Leach, C.E.; Waldo, T.L.

    1996-03-01

    This report provides an alternative strategy evolved from the current Hanford Site Tank Waste Remediation System (TWRS) programmatic baseline for accomplishing the treatment and disposal of the Hanford Site tank wastes. This optimized processing strategy performs the major elements of the TWRS Program, but modifies the deployment of selected treatment technologies to reduce the program cost. The present program for development of waste retrieval, pretreatment, and vitrification technologies continues, but the optimized processing strategy reuses a single facility to accomplish the separations/low-activity waste (LAW) vitrification and the high-level waste (HLW) vitrification processes sequentially, thereby eliminating the need for a separate HLW vitrification facility

  30. Current strategy for the imaging of neuroblastoma

    International Nuclear Information System (INIS)

    Brisse, H.; Neuenschwander, S.; Edeline, V.; Michon, J.; Zucker, J.M.; Couanet, D.

    2001-01-01

    Advances in the management of neuroblastoma lead radiologists and nuclear medicine specialists to optimize their procedures in order to propose a rational use of their techniques, adjusted to the various clinical presentations and to therapeutic management. The aim of this paper is to assess the imaging procedures for the diagnosis and follow-up of neuroblastoma in children according to current European therapeutic protocols. An imaging strategy at diagnosis is first proposed: optimal assessment of the local extension of the primary tumour is made with MRI, or spiral CT when MRI is not available, for all locations except abdominal tumours, for which CT remains the best imaging modality. Metastatic extension is assessed with MIBG scan and liver sonography. Indications for bone metastasis evaluation with either radiological or radionuclide techniques are detailed. Imaging follow-up during treatment for metastatic or unresectable tumours is described. A check-list of the main radiological points to be evaluated before surgery is proposed for localized neuroblastoma. The imaging strategy for the diagnosis of 'occult' neuroblastoma is considered. Finally, we explain the management of neuroblastoma detected during the prenatal or neonatal period. (authors)

  31. TU-G-204-04: A Unified Strategy for Bi-Factorial Optimization of Radiation Dose and Contrast Dose in CT Imaging

    Energy Technology Data Exchange (ETDEWEB)

    Sahbaee, P; Zhang, Y; Solomon, J; Becchetti, M; Segars, P; Samei, E [Duke University Medical Center, Durham, NC (United States)

    2015-06-15

    Purpose: To substantiate the interdependency of contrast dose, radiation dose, and image quality in CT towards the patient-specific optimization of imaging protocols. Methods: The study deployed two phantom platforms. A variable-sized (12, 18, 23, 30, 37 cm) phantom (Mercury-3.0) containing an iodinated insert (8.5 mgI/ml) was imaged on a representative CT scanner at multiple CTDI values (0.7–22.6 mGy). The contrast and noise were measured from the reconstructed images for each phantom diameter. The contrast-to-noise ratio (CNR), which is linearly related to iodine concentration, was calculated for 16 iodine-concentration levels (0–8.5 mgI/ml). The analysis was extended to a recently developed suite of 58 virtual human models (5D XCAT) with added contrast dynamics. Emulating a contrast-enhanced abdominal imaging procedure and targeting a peak enhancement in the aorta, each XCAT phantom was “imaged” using a simulation platform (CatSim, GE). 3D surfaces for each patient/size established the relationship between iodine concentration, dose, and CNR. The ratios of change in iodine concentration versus dose (IDR) that yield a constant change in CNR were calculated for each patient size. Results: The Mercury phantom results show the size dependence of image quality on CTDI and iodine-concentration levels. For desired image-quality values, the iso-contour lines reflect the trade-off between contrast-material and radiation doses. For a fixed iodine concentration (4 mgI/mL), the IDR values for low (1.4 mGy) and high (11.5 mGy) dose levels were 1.02, 1.07, 1.19, 1.65, 1.54, and 3.14, 3.12, 3.52, 3.76, 4.06, respectively, across the five sizes. The simulation data from the XCAT models confirmed the empirical results from the Mercury phantom. Conclusion: Iodine concentration, image quality, and radiation dose are interdependent. Understanding the relationships between iodine concentration, image quality, and radiation dose will allow for a more comprehensive optimization of CT imaging devices and techniques
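
    The two quantities at the heart of this record can be written in a few lines. Below is a hedged sketch: CNR from signal and background regions of interest, and an iodine-versus-dose trade-off ratio estimated from a measured CNR surface; interpreting IDR as a ratio of local sensitivities is an assumption made for illustration.

      import numpy as np

      def cnr(signal_roi, background_roi):
          # Contrast-to-noise ratio from an iodinated-insert ROI and a background ROI.
          return (signal_roi.mean() - background_roi.mean()) / background_roi.std()

      def iodine_dose_ratio(c, d, cnr_grid):
          """Estimate how much added iodine concentration is equivalent to added
          radiation dose for the same CNR gain, from a measured surface
          cnr_grid[i, j] over concentrations c[i] (mgI/ml) and doses d[j] (mGy)."""
          dcnr_dc = np.gradient(cnr_grid, c, axis=0)   # sensitivity to concentration
          dcnr_dd = np.gradient(cnr_grid, d, axis=1)   # sensitivity to dose
          return dcnr_dd / dcnr_dc                     # mgI/ml per mGy at equal CNR benefit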

  32. Optimal control of anthracnose using mixed strategies.

    Science.gov (United States)

    Fotsa Mbogne, David Jaures; Thron, Christopher

    2015-11-01

    In this paper we propose and study a spatial diffusion model for the control of anthracnose disease in a bounded domain. The model is a generalization of the one previously developed in [15]. We use the model to simulate two different types of control strategies against anthracnose disease. Strategies that employ chemical fungicides are modeled using a continuous control function; while strategies that rely on cultivational practices (such as pruning and removal of mummified fruits) are modeled with a control function which is discrete in time (though not in space). For comparative purposes, we perform our analyses for a spatially-averaged model as well as the space-dependent diffusion model. Under weak smoothness conditions on parameters we demonstrate the well-posedness of both models by verifying existence and uniqueness of the solution for the growth inhibition rate for given initial conditions. We also show that the set [0, 1] is positively invariant. We first study control by impulsive strategies, then analyze the simultaneous use of mixed continuous and pulse strategies. In each case we specify a cost functional to be minimized, and we demonstrate the existence of optimal control strategies. In the case of pulse-only strategies, we provide explicit algorithms for finding the optimal control strategies for both the spatially-averaged model and the space-dependent model. We verify the algorithms for both models via simulation, and discuss properties of the optimal solutions. Copyright © 2015 Elsevier Inc. All rights reserved.

  33. Optimal Pricing Strategy in Marketing Research Consulting.

    OpenAIRE

    Chang, Chun-Hao; Lee, Chi-Wen Jevons

    1994-01-01

    This paper studies the optimal pricing scheme for a monopolistic marketing research consultant who sells high-cost proprietary marketing information to her oligopolistic clients in the manufacturing industry. In designing an optimal pricing strategy, the consultant needs to fully consider the behavior of her clients, the behavior of the existing and potential competitors to her clients, and the behavior of her clients' customers. The authors show how the environment uncertainty, the capabilit...

  34. Optimal Deterministic Investment Strategies for Insurers

    Directory of Open Access Journals (Sweden)

    Ulrich Rieder

    2013-11-01

    Full Text Available We consider an insurance company whose risk reserve is given by a Brownian motion with drift and which is able to invest the money into a Black–Scholes financial market. As optimization criteria, we treat mean-variance problems, problems with other risk measures, exponential utility and the probability of ruin. Following recent research, we assume that investment strategies have to be deterministic. This leads to deterministic control problems, which are quite easy to solve. Moreover, it turns out that there are some interesting links between the optimal investment strategies of these problems. Finally, we also show that this approach works in the Lévy process framework.

  35. Image registration via optimization over disjoint image regions

    Science.gov (United States)

    Pitts, Todd; Hathaway, Simon; Karelitz, David B.; Sandusky, John; Laine, Mark Richard

    2018-02-06

    Technologies pertaining to registering a target image with a base image are described. In a general embodiment, the base image is selected from a set of images, and the target image is an image in the set of images that is to be registered to the base image. A set of disjoint regions of the target image is selected, and a transform to be applied to the target image is computed based on the optimization of a metric over the selected set of disjoint regions. The transform is applied to the target image so as to register the target image with the base image.
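
    Because the record above describes the workflow only abstractly, the following sketch fills in one possible concrete reading: a translation-only transform scored by a sum-of-squared-differences metric evaluated inside the selected disjoint regions (boolean masks). The transform family, metric, and optimizer are all assumptions, not details taken from the record.

      import numpy as np
      from scipy.ndimage import shift
      from scipy.optimize import minimize

      def region_metric(params, base, target, regions):
          # Sum of squared differences evaluated only inside the selected disjoint
          # regions (list of boolean masks), for a candidate 2D shift of the target.
          warped = shift(target, params, order=1, mode="nearest")
          return sum(np.sum((base[m] - warped[m]) ** 2) for m in regions)

      def register(base, target, regions):
          # Optimize the (hypothetical, translation-only) transform over the regions,
          # then apply it to the full target image to register it with the base image.
          res = minimize(region_metric, x0=np.zeros(2), args=(base, target, regions),
                         method="Powell")
          return shift(target, res.x, order=1, mode="nearest"), res.x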

  36. Optimization strategies for complex engineering applications

    Energy Technology Data Exchange (ETDEWEB)

    Eldred, M.S.

    1998-02-01

    LDRD research activities have focused on increasing the robustness and efficiency of optimization studies for computationally complex engineering problems. Engineering applications can be characterized by extreme computational expense, lack of gradient information, discrete parameters, non-converging simulations, and nonsmooth, multimodal, and discontinuous response variations. Guided by these challenges, the LDRD research activities have developed application-specific techniques, fundamental optimization algorithms, multilevel hybrid and sequential approximate optimization strategies, parallel processing approaches, and automatic differentiation and adjoint augmentation methods. This report surveys these activities and summarizes the key findings and recommendations.

  37. Parallel strategy for optimal learning in perceptrons

    International Nuclear Information System (INIS)

    Neirotti, J P

    2010-01-01

    We developed a parallel strategy for optimally learning specific realizable rules by perceptrons, in an online learning scenario. Our result is a generalization of the Caticha-Kinouchi (CK) algorithm developed for learning a perceptron with a synaptic vector drawn from a uniform distribution over the N-dimensional sphere, the so-called typical case. Our method outperforms the CK algorithm in almost all possible situations, failing only in a denumerable set of cases. The algorithm is optimal in the sense that it saturates Bayesian bounds when it succeeds.

  38. The Optimal Nash Equilibrium Strategies Under Competition

    Institute of Scientific and Technical Information of China (English)

    孟力; 王崇喜; 汪定伟; 张爱玲

    2004-01-01

    This paper presents a game-theoretic model to study the competition for a single investment opportunity under uncertainty. It models the hazard rate of investment as a function of competitors' trigger levels. Under uncertainty and different information structures, option and game theory are applied to derive the optimal Nash equilibrium strategies of one or more firms. By means of Matlab software, the paper simulates a real estate development project example and illustrates how the parameters affect investment strategies. The paper's work will contribute to present investment practice in China.

  39. Automatic CT simulation optimization for radiation therapy: A general strategy.

    Science.gov (United States)

    Li, Hua; Yu, Lifeng; Anastasio, Mark A; Chen, Hsin-Chen; Tan, Jun; Gay, Hiram; Michalski, Jeff M; Low, Daniel A; Mutic, Sasa

    2014-03-01

    In radiation therapy, x-ray computed tomography (CT) simulation protocol specifications should be driven by the treatment planning requirements in lieu of duplicating diagnostic CT screening protocols. The purpose of this study was to develop a general strategy that allows for automatically, prospectively, and objectively determining the optimal patient-specific CT simulation protocols based on radiation-therapy goals, namely, maintenance of contouring quality and integrity while minimizing patient CT simulation dose. The authors proposed a general prediction strategy that provides automatic optimal CT simulation protocol selection as a function of patient size and treatment planning task. The optimal protocol is the one that delivers the minimum dose required to provide a CT simulation scan that yields accurate contours. Accurate treatment plans depend on accurate contours in order to conform the dose to actual tumor and normal organ positions. An image quality index, defined to characterize how simulation scan quality affects contour delineation, was developed and used to benchmark the contouring accuracy and treatment plan quality within the prediction strategy. A clinical workflow was developed to select the optimal CT simulation protocols incorporating patient size, target delineation, and radiation dose efficiency. An experimental study using an anthropomorphic pelvis phantom with added-bolus layers was used to demonstrate how the proposed prediction strategy could be implemented and how the optimal CT simulation protocols could be selected for prostate cancer patients based on patient size and treatment planning task. Clinical IMRT prostate treatment plans for seven CT scans with varied image quality indices were separately optimized and compared to verify the trace of target and organ dosimetry coverage. Based on the phantom study, the optimal image quality index for accurate manual prostate contouring was 4.4. The optimal tube potentials for patient sizes

  40. Automatic CT simulation optimization for radiation therapy: A general strategy

    Energy Technology Data Exchange (ETDEWEB)

    Li, Hua, E-mail: huli@radonc.wustl.edu; Chen, Hsin-Chen; Tan, Jun; Gay, Hiram; Michalski, Jeff M.; Mutic, Sasa [Department of Radiation Oncology, Washington University, St. Louis, Missouri 63110 (United States); Yu, Lifeng [Department of Radiology, Mayo Clinic, Rochester, Minnesota 55905 (United States); Anastasio, Mark A. [Department of Biomedical Engineering, Washington University, St. Louis, Missouri 63110 (United States); Low, Daniel A. [Department of Radiation Oncology, University of California Los Angeles, Los Angeles, California 90095 (United States)

    2014-03-15

    Purpose: In radiation therapy, x-ray computed tomography (CT) simulation protocol specifications should be driven by the treatment planning requirements in lieu of duplicating diagnostic CT screening protocols. The purpose of this study was to develop a general strategy that allows for automatically, prospectively, and objectively determining the optimal patient-specific CT simulation protocols based on radiation-therapy goals, namely, maintenance of contouring quality and integrity while minimizing patient CT simulation dose. Methods: The authors proposed a general prediction strategy that provides automatic optimal CT simulation protocol selection as a function of patient size and treatment planning task. The optimal protocol is the one that delivers the minimum dose required to provide a CT simulation scan that yields accurate contours. Accurate treatment plans depend on accurate contours in order to conform the dose to actual tumor and normal organ positions. An image quality index, defined to characterize how simulation scan quality affects contour delineation, was developed and used to benchmark the contouring accuracy and treatment plan quality within the prediction strategy. A clinical workflow was developed to select the optimal CT simulation protocols incorporating patient size, target delineation, and radiation dose efficiency. An experimental study using an anthropomorphic pelvis phantom with added-bolus layers was used to demonstrate how the proposed prediction strategy could be implemented and how the optimal CT simulation protocols could be selected for prostate cancer patients based on patient size and treatment planning task. Clinical IMRT prostate treatment plans for seven CT scans with varied image quality indices were separately optimized and compared to verify the trace of target and organ dosimetry coverage. Results: Based on the phantom study, the optimal image quality index for accurate manual prostate contouring was 4.4. The optimal tube

  41. Optimized Strategies for Detecting Extrasolar Space Weather

    Science.gov (United States)

    Hallinan, Gregg

    2018-06-01

    Fully understanding the implications of space weather for the young solar system, as well as the wider population of planet-hosting stars, requires remote sensing of space weather in other stellar systems. Solar coronal mass ejections can be accompanied by bright radio bursts at low frequencies (typically measurement of the magnetic field strength of the planet, informing on whether the atmosphere of the planet can survive the intense magnetic activity of its host star. However, both stellar and planetary radio emission are highly variable and optimal strategies for detection of these emissions requires the capability to monitor 1000s of nearby stellar/planetary systems simultaneously. I will discuss optimized strategies for both ground and space-based experiments to take advantage of the highly variable nature of the radio emissions powered by extrasolar space weather to enable detection of stellar CMEs and planetary magnetospheres.

  42. Optimizing color reproduction of natural images

    NARCIS (Netherlands)

    Yendrikhovskij, S.N.; Blommaert, F.J.J.; Ridder, de H.

    1998-01-01

    The paper elaborates on understanding, measuring and optimizing perceived color quality of natural images. We introduce a model for optimal color reproduction of natural scenes which is based on the assumption that color quality of natural images is constrained by perceived naturalness and

  43. Optimization of pocket machining strategy in HSM

    OpenAIRE

    Msaddek, El Bechir; Bouaziz, Zoubeir; Dessein, Gilles; Baili, Maher

    2012-01-01

    Our two major concerns, which should be taken into consideration as soon as we start selecting the machining parameters, are minimizing the machining time and keeping the high-speed machining machine in good condition. The machining strategy is one of the parameters that strongly influences the manufacturing time of the different geometrical forms, as well as the machine itself. In this article, we propose an optimization methodology of the ...

  44. Optimal strategies for pricing general insurance

    OpenAIRE

    Emms, P.; Haberman, S.; Savoulli, I.

    2006-01-01

    Optimal premium pricing policies in a competitive insurance environment are investigated using approximation methods and simulation of sample paths. The market average premium is modelled as a diffusion process, with the premium as the control function and the maximization of the expected total utility of wealth, over a finite time horizon, as the objective. In order to simplify the optimisation problem, a linear utility function is considered and two particular premium strategies are adopted...

  45. Optimization of Synthetic Aperture Image Quality

    DEFF Research Database (Denmark)

    Moshavegh, Ramin; Jensen, Jonas; Villagómez Hoyos, Carlos Armando

    2016-01-01

    Synthetic Aperture (SA) imaging produces high-quality images and velocity estimates of both slow and fast flow at high frame rates. However, grating lobe artifacts can appear both in transmission and reception. These affect the image quality and the frame rate. Therefore optimization of parameter...

  46. Strategies for Biologic Image-Guided Dose Escalation: A Review

    International Nuclear Information System (INIS)

    Sovik, Aste; Malinen, Eirik; Olsen, Dag Rune

    2009-01-01

    There is increasing interest in how to incorporate functional and molecular information obtained by noninvasive, three-dimensional tumor imaging into radiotherapy. The key issues are to identify radioresistant regions that can be targeted for dose escalation, and to develop radiation dose prescription and delivery strategies providing optimal treatment for the individual patient. In the present work, we review the proposed strategies for biologic image-guided dose escalation with intensity-modulated radiation therapy. Biologic imaging modalities and the derived images are discussed, as are methods for target volume delineation. Different dose escalation strategies and techniques for treatment delivery and treatment plan evaluation are also addressed. Furthermore, we consider the need for response monitoring during treatment. We conclude with a summary of the current status of biologic image-based dose escalation and of areas where further work is needed for this strategy to become incorporated into clinical practice

  7. Optimization Under Uncertainty for Wake Steering Strategies

    Energy Technology Data Exchange (ETDEWEB)

    Quick, Julian [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Annoni, Jennifer [National Renewable Energy Laboratory (NREL), Golden, CO (United States); King, Ryan N [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Dykes, Katherine L [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Fleming, Paul A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ning, Andrew [Brigham Young University]

    2017-08-03

    Offsetting turbines' yaw orientations from the incoming wind is a powerful tool that may be leveraged to reduce undesirable wake effects on downstream turbines. First, we examine a simple two-turbine case to gain intuition as to how inflow direction uncertainty affects the optimal solution. The turbines are modeled with unidirectional inflow such that one turbine directly wakes the other, using ten rotor diameters of spacing. We perform optimization under uncertainty (OUU) via a parameter sweep of the front turbine. The OUU solution generally prefers less steering. We then perform this optimization for a 60-turbine wind farm with unidirectional inflow, varying the degree of inflow uncertainty and approaching this OUU problem by nesting a polynomial chaos expansion uncertainty quantification routine within an outer optimization. We examine how different levels of uncertainty in the inflow direction affect the ratio of the expected values of the deterministic and OUU solutions for steering strategies in the large wind farm, assuming the directional uncertainty used to reach said OUU solution (this ratio is defined as the value of the stochastic solution, or VSS).
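
    As a loose illustration of the optimization-under-uncertainty idea described above (not the authors' code or wake model), the sketch below evaluates a toy two-turbine wake model and compares the yaw angle that maximizes deterministic power with the yaw angle that maximizes the expected power when the inflow direction is uncertain. The wake model, its constants, and the Gaussian discretization of the inflow error are all invented assumptions; the sketch only illustrates nesting an expectation inside the design search, not the paper's findings.

      import numpy as np

      def farm_power(yaw_deg, inflow_deg, spacing_d=10.0):
          """Toy two-turbine model (assumption): yawing the front turbine deflects
          its wake, so the rear turbine recovers power when the wake misses it."""
          yaw = np.radians(yaw_deg)
          p_front = np.cos(yaw) ** 3                       # front turbine loses ~cos^3(yaw)
          # Crude wake-center offset (in rotor diameters) from yaw and inflow error.
          offset = 0.3 * yaw_deg * spacing_d / 10.0 + spacing_d * np.tan(np.radians(inflow_deg))
          deficit = 0.5 * np.exp(-0.5 * offset ** 2)       # Gaussian wake deficit at the rear rotor
          return p_front + (1.0 - deficit)

      def expected_power(yaw_deg, sigma_deg, n_samples=41):
          """Expected power over Gaussian inflow-direction uncertainty
          (simple quadrature standing in for polynomial chaos)."""
          angles = np.linspace(-3 * sigma_deg, 3 * sigma_deg, n_samples)
          weights = np.exp(-0.5 * (angles / sigma_deg) ** 2)
          weights /= weights.sum()
          return sum(w * farm_power(yaw_deg, a) for w, a in zip(weights, angles))

      yaws = np.linspace(0.0, 30.0, 61)
      det_opt = yaws[np.argmax([farm_power(y, 0.0) for y in yaws])]
      ouu_opt = yaws[np.argmax([expected_power(y, sigma_deg=5.0) for y in yaws])]
      print(f"deterministic optimal yaw ~ {det_opt:.1f} deg, OUU optimal yaw ~ {ouu_opt:.1f} deg")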

  8. Combined optimization model for sustainable energization strategy

    Science.gov (United States)

    Abtew, Mohammed Seid

    Access to energy is a foundation for a positive impact on multiple aspects of human development. Both developed and developing countries share the concern of achieving a sustainable energy supply to fuel economic growth and improve the quality of life with minimal environmental impacts. The Least Developed Countries (LDCs), however, have different economic, social, and energy systems. Prevalence of power outages, lack of access to electricity, structural dissimilarity between rural and urban regions, and traditional fuel dominance for cooking, with the resultant health and environmental hazards, are some of the distinguishing characteristics of these nations. Most energy planning models have been designed for developed countries' socio-economic demographics and have missed the opportunity to address special features of the poor countries. An improved mixed-integer programming energy-source optimization model is developed to address limitations associated with using current energy optimization models for LDCs, tackle the development of sustainable energization strategies, and ensure diversification and risk management provisions in the selected energy mix. The model predicted a shift from a traditional-fuel-reliant and weather-vulnerable energy source mix to a least-cost and reliable portfolio of modern clean energy sources, a climb up the energy ladder, and multifaceted economic, social, and environmental benefits. At the same time, it represented a transition strategy that evolves to increasingly cleaner energy technologies with growth, as opposed to an expensive solution that leapfrogs immediately to the cleanest possible, overreaching technologies.

  9. Local Optimization Strategies in Urban Vehicular Mobility.

    Directory of Open Access Journals (Sweden)

    Pierpaolo Mastroianni

    Full Text Available The comprehension of vehicular traffic in urban environments is crucial to achieving good management of the complex processes arising from people's collective motion. Even allowing for the great complexity of human beings, human behavior turns out to be subject to strong constraints--physical, environmental, social, economic--that induce the emergence of common patterns. The observation and understanding of those patterns is key to setting up effective strategies to optimize the quality of life in cities while not frustrating the natural need for mobility. In this paper we focus on vehicular mobility with the aim of revealing the underlying patterns and uncovering the human strategies determining them. To this end we analyze a large dataset of GPS vehicle tracks collected in the Rome (Italy) district during one month. We demonstrate the existence of a local optimization of travel times that vehicle drivers perform while choosing their journey. This finding is mirrored by two additional important facts, i.e., the observation that the average vehicle velocity increases with increasing travel length and the emergence of a universal scaling law for the distribution of travel times at fixed traveled length. A simple modeling scheme confirms this scenario, opening the way to further predictions.

  10. An integral design strategy combining optical system and image processing to obtain high resolution images

    Science.gov (United States)

    Wang, Jiaoyang; Wang, Lin; Yang, Ying; Gong, Rui; Shao, Xiaopeng; Liang, Chao; Xu, Jun

    2016-05-01

    In this paper, an integral design that combines the optical system with image processing is introduced to obtain high-resolution images, and its performance is evaluated and demonstrated. Traditional imaging methods often separate the two technical procedures of optical system design and image processing, resulting in failures of efficient cooperation between the optical and digital elements. Therefore, an innovative approach is presented that combines the merit function during optical design with the constraint conditions of the image processing algorithms. Specifically, an optical imaging system with low resolution is designed to collect the image signals which are indispensable for image processing, while the ultimate goal is to obtain high-resolution images from the final system. In order to optimize the global performance, the optimization function of the ZEMAX software is utilized and the number of optimization cycles is controlled. Then a Wiener filter algorithm is adopted to process the simulated images, and the mean squared error (MSE) is taken as the evaluation criterion. The results show that, although the optical figures of merit for the optical imaging system are not the best, it can provide image signals that are more suitable for image processing. In conclusion, the integral design of the optical system and image processing can find the overall optimal solution that is missed by traditional design methods. Especially when designing complex optical systems, this integral design strategy has obvious advantages in simplifying structure and reducing cost, as well as in obtaining high-resolution images, which gives it a promising perspective for industrial application.
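
    A minimal sketch of the image-processing half of such a pipeline, under the assumption of a known Gaussian blur kernel standing in for the low-resolution optics: the blurred image is restored with a frequency-domain Wiener filter and scored with MSE, the evaluation criterion mentioned above. All names, constants and the synthetic scene are illustrative, not the authors' setup.

      import numpy as np

      def gaussian_psf(shape, sigma):
          """Centered Gaussian point-spread function (stand-in for the optics)."""
          y, x = np.indices(shape)
          cy, cx = (shape[0] - 1) / 2.0, (shape[1] - 1) / 2.0
          psf = np.exp(-((y - cy) ** 2 + (x - cx) ** 2) / (2.0 * sigma ** 2))
          return psf / psf.sum()

      def wiener_restore(blurred, psf, k=0.01):
          """Frequency-domain Wiener filter with a constant noise-to-signal ratio k."""
          H = np.fft.fft2(np.fft.ifftshift(psf))
          G = np.fft.fft2(blurred)
          W = np.conj(H) / (np.abs(H) ** 2 + k)
          return np.real(np.fft.ifft2(W * G))

      def mse(a, b):
          return float(np.mean((a - b) ** 2))

      # Synthetic test: blur a simple scene, restore it, compare MSE before and after.
      rng = np.random.default_rng(0)
      scene = np.zeros((128, 128)); scene[40:90, 40:90] = 1.0
      psf = gaussian_psf(scene.shape, sigma=2.0)
      blurred = np.real(np.fft.ifft2(np.fft.fft2(scene) * np.fft.fft2(np.fft.ifftshift(psf))))
      blurred += 0.01 * rng.standard_normal(scene.shape)
      restored = wiener_restore(blurred, psf, k=0.01)
      print(f"MSE blurred  : {mse(blurred, scene):.5f}")
      print(f"MSE restored : {mse(restored, scene):.5f}")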

  11. Optimal allocation of trend following strategies

    Science.gov (United States)

    Grebenkov, Denis S.; Serror, Jeremy

    2015-09-01

    We consider a portfolio allocation problem for trend following (TF) strategies on multiple correlated assets. Under the simplifying assumptions of a Gaussian market and linear TF strategies, we derive analytical formulas for the mean and variance of the portfolio return. We then construct the optimal portfolio that maximizes risk-adjusted return by accounting for inter-asset correlations. The dynamic allocation problem for n assets is shown to be equivalent to the classical static allocation problem for n² virtual assets that include lead-lag corrections in the positions of the TF strategies. The respective roles of asset auto-correlations and inter-asset correlations are investigated in depth for the two-asset case and a sector model. In contrast to the principle of diversification, which suggests treating uncorrelated assets, we show that inter-asset correlations allow one to estimate apparent trends more reliably and to adjust the TF positions more efficiently. If properly accounted for, inter-asset correlations are not detrimental but beneficial for portfolio management and can open new profit opportunities for trend followers. These concepts are illustrated using daily returns of three highly correlated futures markets: the E-mini S&P 500, Euro Stoxx 50 index, and the US 10-year T-note futures.
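
    The final portfolio construction step described above reduces, under Gaussian assumptions, to a mean-variance trade-off. The sketch below shows only the generic textbook closed form, weights proportional to the covariance inverse times the expected returns, applied to invented numbers for three correlated strategies; it is not the paper's derivation or its lead-lag correction.

      import numpy as np

      def max_sharpe_weights(mu, cov):
          """Weights maximizing expected return per unit volatility: w ~ inv(Sigma) @ mu,
          normalized to sum to one (generic mean-variance result, not the paper's formulas)."""
          raw = np.linalg.solve(cov, mu)
          return raw / raw.sum()

      # Illustrative inputs: expected returns and covariance of three correlated strategies.
      mu = np.array([0.04, 0.05, 0.03])
      cov = np.array([[0.040, 0.018, 0.012],
                      [0.018, 0.050, 0.015],
                      [0.012, 0.015, 0.030]])
      w = max_sharpe_weights(mu, cov)
      port_ret = float(w @ mu)
      port_vol = float(np.sqrt(w @ cov @ w))
      print("weights:", np.round(w, 3))
      print(f"expected return {port_ret:.3f}, volatility {port_vol:.3f}, ratio {port_ret / port_vol:.2f}")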

  12. Modified Discrete Grey Wolf Optimizer Algorithm for Multilevel Image Thresholding

    Directory of Open Access Journals (Sweden)

    Linguo Li

    2017-01-01

    Full Text Available The computation of image segmentation has become more complicated with the increasing number of thresholds, and the selection and application of thresholds in image thresholding has become an NP problem. The paper puts forward the modified discrete grey wolf optimizer algorithm (MDGWO), which improves the search agents' optimal-solution updating mechanism by means of weights. Taking Kapur's entropy as the optimized function and based on the discreteness of thresholds in image segmentation, the paper first discretizes the grey wolf optimizer (GWO) and then proposes a new attack strategy that uses a weight coefficient to replace the search formula for the optimal solution used in the original algorithm. The experimental results show that MDGWO can search out the optimal thresholds efficiently and precisely, and that they are very close to the result found by exhaustive search. In comparison with electromagnetism optimization (EMO), differential evolution (DE), the Artificial Bee Colony (ABC), and the classical GWO, it is concluded that MDGWO has advantages over the latter four in terms of image segmentation quality, objective function values, and their stability.
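
    The fitness function at the heart of the approach above, Kapur's entropy for a set of thresholds, can be written down compactly. The sketch below evaluates it for a grayscale histogram and runs a brute-force search for two thresholds so the objective is visible without the grey wolf machinery; the exhaustive search merely stands in for MDGWO and is only practical for very few thresholds, and the toy histogram is invented.

      import numpy as np
      from itertools import combinations

      def kapur_entropy(hist, thresholds):
          """Sum of Shannon entropies of the histogram segments defined by the thresholds."""
          p = hist / hist.sum()
          edges = [0, *sorted(thresholds), len(p)]
          total = 0.0
          for lo, hi in zip(edges[:-1], edges[1:]):
              seg = p[lo:hi]
              w = seg.sum()
              if w <= 0:
                  continue
              q = seg[seg > 0] / w
              total += -np.sum(q * np.log(q))
          return total

      def best_thresholds(hist, k=2):
          """Exhaustive search over k thresholds (stand-in for the MDGWO search)."""
          return max(combinations(range(1, len(hist)), k),
                     key=lambda t: kapur_entropy(hist, t))

      # Toy trimodal histogram over 64 gray levels.
      rng = np.random.default_rng(1)
      levels = np.concatenate([rng.normal(12, 3, 4000), rng.normal(32, 4, 5000), rng.normal(52, 3, 3000)])
      hist, _ = np.histogram(np.clip(levels, 0, 63), bins=64, range=(0, 64))
      print("optimal thresholds:", best_thresholds(hist.astype(float), k=2))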

  13. Asymptotic estimation of reactor fueling optimal strategy

    International Nuclear Information System (INIS)

    Simonov, V.D.

    1985-01-01

    The problem of improving the technical-economic factors of operating and designed nuclear power plant blocks by developing an internal fuel cycle strategy (reactor fueling regime optimization), taking into account the structural peculiarities of the energy system as a whole, is considered. It is shown that, in the search for asymptotic solutions of reactor fueling planning tasks, the model of fuel energy potential (FEP) is the most suitable and effective. FEP represents the energy which may be produced from the fuel in a reactor with real dimensions and power, but with a hypothetical fresh fuel supply regime providing similar burnup of all the fuel passing through the reactor, continuous overloading of infinitely small fuel portions at full power, and infinitely rapid mixing of fuel in the reactor core volume. The reactor fuel run with such a standard fuel cycle may serve as a quantitative measure of FEP. Assessment results for the optimal WWER-440 reactor fresh fuel supply periodicity are given as an example. The conclusion is drawn that, with fuel enrichment x=3.3%, a run of 300 days is economically justified, taking into account that the cost of producing one unit of energy is > 3 cop/kWh

  14. Optimal energy management strategy for self-reconfigurable batteries

    International Nuclear Information System (INIS)

    Bouchhima, Nejmeddine; Schnierle, Marc; Schulte, Sascha; Birke, Kai Peter

    2017-01-01

    This paper proposes a novel energy management strategy for multi-cell high-voltage batteries in which the current through each cell can be controlled, called self-reconfigurable batteries. An optimized control strategy further enhances the energy efficiency gained by the hardware architecture of these batteries. Currently, achieving cell equalization by using active balancing circuits is considered the best way to optimize the energy efficiency of the battery pack. This study demonstrates that optimizing the energy efficiency of self-reconfigurable batteries is no longer strongly correlated with cell balancing. According to the features of this novel battery architecture, the energy management strategy is formulated as a nonlinear dynamic optimization problem. To solve this optimal control problem, an optimization algorithm that generates the optimal discharge policy for a given driving cycle is developed based on dynamic programming and code vectorization. The simulation results show that the designed energy management strategy maximizes the system efficiency across the battery lifetime compared with conventional approaches. Furthermore, the present energy management strategy can be implemented online due to the reduced complexity of the optimization algorithm. - Highlights: • The energy efficiency of self-reconfigurable batteries is maximized. • The energy management strategy for the battery is formulated as an optimal control problem. • An optimization algorithm is developed using dynamic programming techniques and code vectorization. • Simulation studies are conducted to validate the proposed optimal strategy.
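
    A heavily simplified sketch of a dynamic-programming discharge policy in the spirit described above: a small pack of cells with unequal efficiencies must supply a per-step number of active cells over a short "driving cycle", and backward induction via memoized recursion picks which cells conduct at each step so that efficiency-weighted delivered energy is maximized. The state, demand profile and efficiencies are invented for illustration; the real formulation is far richer.

      import itertools
      from functools import lru_cache

      # Per-cell one-way efficiency (illustrative) and initial charge units per cell.
      EFFICIENCY = (0.96, 0.94, 0.92, 0.90)
      INITIAL = (3, 3, 3, 3)
      DEMAND = (2, 1, 3, 2, 1, 2)   # number of cells that must conduct at each step

      @lru_cache(maxsize=None)
      def best_value(step, charges):
          """Max efficiency-weighted energy deliverable from this state onward, plus the plan."""
          if step == len(DEMAND):
              return 0.0, ()
          need = DEMAND[step]
          candidates = [i for i, c in enumerate(charges) if c > 0]
          best = (float("-inf"), ())
          for chosen in itertools.combinations(candidates, need):
              gain = sum(EFFICIENCY[i] for i in chosen)
              next_charges = tuple(c - (i in chosen) for i, c in enumerate(charges))
              value, plan = best_value(step + 1, next_charges)
              if gain + value > best[0]:
                  best = (gain + value, (chosen,) + plan)
          return best

      value, plan = best_value(0, INITIAL)
      print(f"delivered (efficiency-weighted) energy: {value:.2f}")
      for t, cells in enumerate(plan):
          print(f"step {t}: discharge cells {cells}")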

  15. PACS strategy for imaging centers.

    Science.gov (United States)

    Bedel, Victoria; Zdanowicz, Mark

    2004-01-01

    Picture archiving and communications systems (PACS) have been available in imaging centers for many years, but they often were less functional, were not well integrated into patient information systems, and lacked the network backbone needed to implement a system. As modalities are replaced and technology improves, the ability and time for an imaging center to acquire, integrate, and utilize PACS has arrived. However, each imaging center must determine why it should invest in PACS. A business plan is the fundamental need. Each imaging center must understand its target market, growth rate, and staffing plans. Additional considerations lie in current and future modality availability, the need for offsite delivery of images and reports, and the potential need for remote transmission of images. These issues must be identified and prioritized. A multidisciplinary team is essential. The most successful PACS implementation begins with complete involvement from all levels. The team should be composed of people with complementary skills who are committed to a common purpose, set of performance goals, and approach for which they hold themselves mutually accountable. The team must jointly decide on the project's objectives. These objectives fall under four categories: clinical, service, financial, and performance. PACS must be considered a tool to help accomplish each objective. The imaging center must determine its top priorities, then translate them into a technology "wish list." The center can then list those pieces of technology that are most important and prioritize them. There are even more considerations for connecting multiple imaging centers. The team must create a comprehensive request for proposal (RFP) and determine the vendors that will receive the document. Once the RFP responses have been received and the vendor has been selected, an effective training plan must be executed. Training plans should be competency-based, ensuring comfort and competency among all staff. Upon

  16. Heuristic optimization in penumbral image for high resolution reconstructed image

    International Nuclear Information System (INIS)

    Azuma, R.; Nozaki, S.; Fujioka, S.; Chen, Y. W.; Namihira, Y.

    2010-01-01

    Penumbral imaging is a technique which uses the fact that spatial information can be recovered from the shadow, or penumbra, that an unknown source casts through a simple large circular aperture. The size of the penumbral image on the detector can be determined mathematically from the aperture size, object size, and magnification. Conventional reconstruction methods are very sensitive to noise. On the other hand, the heuristic reconstruction method is very tolerant of noise. However, the aperture size influences the accuracy and resolution of the reconstructed image. In this article, we propose the optimization of the aperture size for neutron penumbral imaging.

  17. Circular SAR Optimization Imaging Method of Buildings

    Directory of Open Access Journals (Sweden)

    Wang Jian-feng

    2015-12-01

    Full Text Available The Circular Synthetic Aperture Radar (CSAR) can obtain the entire scattering properties of targets because of its great ability of 360° observation. In this study, an optimized CSAR imaging method for buildings is proposed by applying a combination of coherent and incoherent processing techniques. FEKO software is used to construct the electromagnetic scattering models and simulate the radar echo. The FEKO imaging results are compared with the isotropic scattering results. From the comparison, the optimal azimuth coherent accumulation angle for CSAR imaging of buildings is obtained. In practice, the scattering directions of buildings are unknown; therefore, we divide the 360° CSAR echo into many overlapping small-angle echoes corresponding to sub-apertures and then perform an imaging procedure on each sub-aperture. The sub-aperture imaging results are combined to obtain the all-around image using incoherent fusion techniques. The polarimetric decomposition method is used to decompose the all-around image and further retrieve the edge information of buildings successfully. The proposed method is validated with P-band airborne CSAR data from Sichuan, China.

  18. Optimized nonorthogonal transforms for image compression.

    Science.gov (United States)

    Guleryuz, O G; Orchard, M T

    1997-01-01

    The transform coding of images is analyzed from a common standpoint in order to generate a framework for the design of optimal transforms. It is argued that all transform coders are alike in the way they manipulate the data structure formed by transform coefficients. A general energy compaction measure is proposed to generate optimized transforms with desirable characteristics particularly suited to the simple transform coding operations of scalar quantization and entropy coding. It is shown that the optimal linear decoder (inverse transform) must be an optimal linear estimator, independent of the structure of the transform generating the coefficients. A formulation that sequentially optimizes the transforms is presented, and design equations and algorithms for its computation are provided. The properties of the resulting transform systems are investigated. In particular, it is shown that the resulting bases are nonorthogonal and complete, producing energy-compaction-optimized, decorrelated transform coefficients. Quantization issues related to nonorthogonal expansion coefficients are addressed with a simple, efficient algorithm. Two implementations are discussed, and image coding examples are given. It is shown that the proposed design framework results in systems with superior energy compaction properties and excellent coding results.

  19. Image-Optimized Coronal Magnetic Field Models

    Science.gov (United States)

    Jones, Shaela I.; Uritsky, Vadim; Davila, Joseph M.

    2017-01-01

    We have reported previously on a new method we are developing for using image-based information to improve global coronal magnetic field models. In that work we presented early tests of the method which proved its capability to improve global models based on flawed synoptic magnetograms, given excellent constraints on the field in the model volume. In this follow-up paper we present the results of similar tests given field constraints of a nature that could realistically be obtained from quality white-light coronagraph images of the lower corona. We pay particular attention to difficulties associated with the line-of-sight projection of features outside of the assumed coronagraph image plane, and the effect on the outcome of the optimization of errors in localization of constraints. We find that substantial improvement in the model field can be achieved with this type of constraints, even when magnetic features in the images are located outside of the image plane.

  20. Image-optimized Coronal Magnetic Field Models

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Shaela I.; Uritsky, Vadim; Davila, Joseph M., E-mail: shaela.i.jones-mecholsky@nasa.gov, E-mail: shaela.i.jonesmecholsky@nasa.gov [NASA Goddard Space Flight Center, Code 670, Greenbelt, MD 20771 (United States)

    2017-08-01

    We have reported previously on a new method we are developing for using image-based information to improve global coronal magnetic field models. In that work, we presented early tests of the method, which proved its capability to improve global models based on flawed synoptic magnetograms, given excellent constraints on the field in the model volume. In this follow-up paper, we present the results of similar tests given field constraints of a nature that could realistically be obtained from quality white-light coronagraph images of the lower corona. We pay particular attention to difficulties associated with the line-of-sight projection of features outside of the assumed coronagraph image plane and the effect on the outcome of the optimization of errors in the localization of constraints. We find that substantial improvement in the model field can be achieved with these types of constraints, even when magnetic features in the images are located outside of the image plane.

  1. Guided color consistency optimization for image mosaicking

    Science.gov (United States)

    Xie, Renping; Xia, Menghan; Yao, Jian; Li, Li

    2018-01-01

    This paper studies the problem of color consistency correction for sequential images with diverse color characteristics. Existing algorithms try to adjust all images to minimize color differences among them under a unified energy framework; however, the results are prone to presenting a consistent but unnatural appearance when the color differences between images are large and diverse. In our approach, this problem is addressed effectively by providing a guided initial solution for the global consistency optimization, which avoids converging to a meaningless integrated solution. First of all, to obtain reliable intensity correspondences in overlapping regions between image pairs, we propose the histogram extreme point matching algorithm, which is robust to image geometrical misalignment to some extent. In the absence of extra reference information, the guided initial solution is learned from the major tone of the original images by searching for an image subset to serve as the reference, whose color characteristics are transferred to the others via the paths of a graph analysis. Thus, the final results of the global adjustment take on a consistent color similar to the appearance of the reference image subset. Several groups of convincing experiments on both a synthetic dataset and challenging real ones sufficiently demonstrate that the proposed approach can achieve as good or even better results compared with the state-of-the-art approaches.

  2. Optimizing metapopulation sustainability through a checkerboard strategy.

    Directory of Open Access Journals (Sweden)

    Yossi Ben Zion

    2010-01-01

    Full Text Available The persistence of a spatially structured population is determined by the rate of dispersal among habitat patches. If the local dynamic at the subpopulation level is extinction-prone, the system's viability is maximal at intermediate connectivity, where recolonization is allowed but full synchronization, which enables correlated extinction, is forbidden. Here we developed and used an algorithm for agent-based simulations in order to study the persistence of a stochastic metapopulation. The effect of noise is shown to be dramatic, and the dynamics of the spatial population differs substantially from the predictions of deterministic models. This has been validated for the stochastic versions of the logistic map, the Ricker map and the Nicholson-Bailey host-parasitoid system. To analyze the possibility of extinction, previous studies focused on the attractiveness (Lyapunov exponent) of stable solutions and the structure of their basin of attraction (dependence on initial population size). Our results suggest that these features are of secondary importance in the presence of stochasticity. Instead, optimal sustainability is achieved when decoherence is maximal. Individual-based simulations of metapopulations of different sizes, dimensions and noise types show that the system's lifetime peaks when it displays checkerboard spatial patterns. This conclusion is supported by the results of a recently published Drosophila experiment. The checkerboard strategy provides a technique for the manipulation of migration rates (e.g., by constructing corridors) in order to affect the persistence of a metapopulation. It may be used in order to minimize the risk of extinction of an endangered species, or to maximize the efficiency of an eradication campaign.
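
    As a rough illustration of this kind of simulation (not the authors' algorithm), the sketch below iterates a noisy Ricker map on a ring of patches coupled by a migration rate and records how long the whole metapopulation survives for a few migration strengths. The growth rate, noise level, extinction threshold and lattice are arbitrary choices, so the printed numbers are only indicative of the mechanics, not of the paper's results.

      import numpy as np

      def metapopulation_lifetime(n_patches=20, migration=0.1, r=3.0, noise=1.0,
                                  extinct_below=0.01, max_steps=2000, seed=0):
          """Steps until every patch of a noisy, diffusively coupled Ricker metapopulation is extinct."""
          rng = np.random.default_rng(seed)
          x = rng.uniform(0.5, 1.0, n_patches)
          for t in range(1, max_steps + 1):
              # Local stochastic Ricker dynamics (lognormal noise on the growth term).
              x = x * np.exp(r * (1.0 - x) + noise * rng.standard_normal(n_patches))
              # Nearest-neighbour migration on a ring (allows recolonization).
              x = (1.0 - migration) * x + migration * 0.5 * (np.roll(x, 1) + np.roll(x, -1))
              x[x < extinct_below] = 0.0
              if not x.any():
                  return t
          return max_steps   # censored: still alive at the end of the run

      for m in (0.0, 0.05, 0.2, 0.8):
          lifetimes = [metapopulation_lifetime(migration=m, seed=s) for s in range(10)]
          print(f"migration {m:.2f}: mean lifetime {np.mean(lifetimes):.0f} steps")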

  3. Developing an Integrated Design Strategy for Chip Layout Optimization

    NARCIS (Netherlands)

    Wits, Wessel Willems; Jauregui Becker, Juan Manuel; van Vliet, Frank Edward; te Riele, G.J.

    2011-01-01

    This paper presents an integrated design strategy for chip layout optimization. The strategy couples both electric and thermal aspects during the conceptual design phase to improve chip performances; thermal management being one of the major topics. The layout of the chip circuitry is optimized

  4. Optimal Spatial Harvesting Strategy and Symmetry-Breaking

    International Nuclear Information System (INIS)

    Kurata, Kazuhiro; Shi Junping

    2008-01-01

    A reaction-diffusion model with logistic growth and constant effort harvesting is considered. By minimizing an intrinsic biological energy function, we obtain an optimal spatial harvesting strategy which will benefit the population the most. The symmetry properties of the optimal strategy are also discussed, and related symmetry preserving and symmetry breaking phenomena are shown with several typical examples of habitats

  5. Effective Clipart Image Vectorization through Direct Optimization of Bezigons.

    Science.gov (United States)

    Yang, Ming; Chao, Hongyang; Zhang, Chi; Guo, Jun; Yuan, Lu; Sun, Jian

    2016-02-01

    Bezigons, i.e., closed paths composed of Bézier curves, have been widely employed to describe shapes in image vectorization results. However, most existing vectorization techniques infer the bezigons by simply approximating an intermediate vector representation (such as polygons). Consequently, the resultant bezigons are sometimes imperfect due to accumulated errors, fitting ambiguities, and a lack of curve priors, especially for low-resolution images. In this paper, we describe a novel method for vectorizing clipart images. In contrast to previous methods, we directly optimize the bezigons rather than using other intermediate representations; therefore, the resultant bezigons are not only of higher fidelity compared with the original raster image but also more reasonable because they were traced by a proficient expert. To enable such optimization, we have overcome several challenges and have devised a differentiable data energy as well as several curve-based prior terms. To improve the efficiency of the optimization, we also take advantage of the local control property of bezigons and adopt an overlapped piecewise optimization strategy. The experimental results show that our method outperforms both the current state-of-the-art method and commonly used commercial software in terms of bezigon quality.

  6. Synthesis of Optimal Strategies Using HyTech

    DEFF Research Database (Denmark)

    Bouyer, Patricia; Cassez, Franck; Larsen, Kim Guldstrand

    2005-01-01

    Priced timed (game) automata extend timed (game) automata with costs on both locations and transitions. The problem of synthesizing an optimal winning strategy for a priced timed game under some hypotheses has been shown decidable in [P. Bouyer, F. Cassez, E. Fleury, and K.G. Larsen. Optimal...... strategies in priced timed game automata. Research Report BRICS RS-04-4, Denmark, Feb. 2004. Available at http://www.brics.dk/RS/04/4/]. In this paper, we present an algorithm for computing the optimal cost and for synthesizing an optimal strategy in case there exists one. We also describe the implementation...

  7. Optimization Strategies for Hardware-Based Cofactorization

    Science.gov (United States)

    Loebenberger, Daniel; Putzka, Jens

    We use the specific structure of the inputs to the cofactorization step in the general number field sieve (GNFS) in order to optimize the runtime for the cofactorization step on a hardware cluster. An optimal distribution of bitlength-specific ECM modules is proposed and compared to existing ones. With our optimizations we obtain a speedup between 17% and 33% of the cofactorization step of the GNFS when compared to the runtime of an unoptimized cluster.

  8. Particle swarm optimization based optimal bidding strategy in an ...

    African Journals Online (AJOL)

    In an electricity market, generating companies and large consumers need suitable bidding models to maximize their profits. Therefore, each supplier and large consumer will bid strategically, choosing bidding coefficients to counter the competitors' bidding strategies. In this paper, the bidding strategy problem is modeled as an ...

  9. Molecular imaging: current status and emerging strategies

    International Nuclear Information System (INIS)

    Pysz, M.A.; Gambhir, S.S.; Willmann, J.K.

    2010-01-01

    In vivo molecular imaging has a great potential to impact medicine by detecting diseases in early stages (screening), identifying extent of disease, selecting disease- and patient-specific treatment (personalized medicine), applying a directed or targeted therapy, and measuring molecular-specific effects of treatment. Current clinical molecular imaging approaches primarily use positron-emission tomography (PET) or single photon-emission computed tomography (SPECT)-based techniques. In ongoing preclinical research, novel molecular targets of different diseases are identified, and sophisticated, multifunctional contrast agents for imaging these molecular targets are developed along with new technologies and instrumentation for multi-modality molecular imaging. Contrast-enhanced molecular ultrasound (US) with molecularly-targeted contrast microbubbles is explored as a clinically translatable molecular imaging strategy for screening, diagnosing, and monitoring diseases at the molecular level. Optical imaging with fluorescent molecular probes and US imaging with molecularly-targeted microbubbles are attractive strategies as they provide real-time imaging, are relatively inexpensive, produce images with high spatial resolution, and do not involve exposure to ionizing irradiation. Raman spectroscopy/microscopy has emerged as a molecular optical imaging strategy for ultrasensitive detection of multiple biomolecules/biochemicals with both in vivo and ex vivo versatility. Photoacoustic imaging is a hybrid of optical and US techniques involving optically-excitable molecularly-targeted contrast agents and quantitative detection of the resulting oscillatory contrast agent movement with US. Current preclinical findings and advances in instrumentation, such as endoscopes and microcatheters, suggest that these molecular imaging methods have numerous potential clinical applications and will be translated into clinical use in the near future.

  10. Optimization strategies for discrete multi-material stiffness optimization

    DEFF Research Database (Denmark)

    Hvejsel, Christian Frier; Lund, Erik; Stolpe, Mathias

    2011-01-01

    The design of composite laminated lay-ups is formulated as a discrete multi-material selection problem. The design problem can be modeled as a non-convex mixed-integer optimization problem. Such problems are in general only solvable to global optimality for small to moderately sized problems. To attack...... which numerically confirm the sought properties of the new scheme in terms of convergence to a discrete solution.

  11. Fetal DNA: strategies for optimal recovery

    NARCIS (Netherlands)

    Legler, Tobias J.; Heermann, Klaus-Hinrich; Liu, Zhong; Soussan, Aicha Ait; van der Schoot, C. Ellen

    2008-01-01

    For fetal DNA extraction, in principle each DNA extraction method can be used; however, because most methods have been optimized for genomic DNA from leucocytes, we describe here the methods that have been optimized for the extraction of fetal DNA from maternal plasma and validated for this purpose

  12. Strategies for Optimal Design of Structural Systems

    DEFF Research Database (Denmark)

    Enevoldsen, I.; Sørensen, John Dalsgaard

    1992-01-01

    Reliability-based design of structural systems is considered. Especially systems where the reliability model is a series system of parallel systems are analysed. A sensitivity analysis for this class of problems is presented. Direct and sequential optimization procedures to solve the optimization...

  13. Cloud Optimized Image Format and Compression

    Science.gov (United States)

    Becker, P.; Plesea, L.; Maurer, T.

    2015-04-01

    Cloud-based image storage and processing requires a re-evaluation of formats and processing methods. For the true value of the massive volumes of earth observation data to be realized, the image data needs to be accessible from the cloud. Traditional file formats such as TIF and NITF were developed in the heyday of the desktop and assumed fast, low-latency file access. Other formats such as JPEG2000 provide streaming protocols for pixel data, but still require a server to have file access. These concepts no longer truly hold in cloud-based elastic storage and computation environments. This paper will provide details of a newly evolving image storage format (MRF) and compression that is optimized for cloud environments. Although the cost of storage continues to fall for large data volumes, there is still significant value in compression. For imagery data to be used in analysis and to exploit the extended dynamic range of the new sensors, lossless or controlled lossy compression is of high value. Compression decreases the data volumes stored and reduces the data transferred, but the reduced data size must be balanced against the CPU required to decompress. The paper also outlines a new compression algorithm (LERC) for imagery and elevation data that optimizes this balance. Advantages of the compression include its simple-to-implement algorithm, which enables it to be accessed efficiently using JavaScript. Combining this new cloud-based image storage format and compression will help resolve some of the challenges of big image data on the internet.

  14. An optimal tuning strategy for tidal turbines.

    Science.gov (United States)

    Vennell, Ross

    2016-11-01

    Tuning wind and tidal turbines is critical to maximizing their power output. Adopting a wind turbine tuning strategy of maximizing the output at any given time is shown to be an extremely poor strategy for large arrays of tidal turbines in channels. This 'impatient-tuning strategy' results in far lower power output, much higher structural loads and greater environmental impacts due to flow reduction than an existing 'patient-tuning strategy' which maximizes the power output averaged over the tidal cycle. This paper presents a 'smart patient tuning strategy', which can increase array output by up to 35% over the existing strategy. This smart strategy forgoes some power generation early in the half tidal cycle in order to allow stronger flows to develop later in the cycle. It extracts enough power from these stronger flows to produce more power from the cycle as a whole than the existing strategy. Surprisingly, the smart strategy can often extract more power without increasing maximum structural loads on the turbines, while also maintaining stronger flows along the channel. This paper also shows that, counterintuitively, for some tuning strategies imposing a cap on turbine power output to limit loads can increase a turbine's average power output.

  15. Optimized Quasi-Interpolators for Image Reconstruction.

    Science.gov (United States)

    Sacht, Leonardo; Nehab, Diego

    2015-12-01

    We propose new quasi-interpolators for the continuous reconstruction of sampled images, combining a narrowly supported piecewise-polynomial kernel and an efficient digital filter. In other words, our quasi-interpolators fit within the generalized sampling framework and are straightforward to use. We go against standard practice and optimize for approximation quality over the entire Nyquist range, rather than focusing exclusively on the asymptotic behavior as the sample spacing goes to zero. In contrast to previous work, we jointly optimize with respect to all degrees of freedom available in both the kernel and the digital filter. We consider linear, quadratic, and cubic schemes, offering different tradeoffs between quality and computational cost. Experiments with compounded rotations and translations over a range of input images confirm that, due to the additional degrees of freedom and the more realistic objective function, our new quasi-interpolators perform better than the state of the art, at a similar computational cost.

  16. Particle swarm optimization based optimal bidding strategy in an ...

    African Journals Online (AJOL)

    user

    A considerable amount of work has also been reported on the game theory applications ... probability distribution function (Song et al, 1999) and as a continuous ..... compared with GA and Monte Carlo method, therefore the bidding strategies.

  17. Optimal portfolio strategies under a shortfall constraint

    African Journals Online (AJOL)

    we make precise the optimal control problem to be solved. .... is closely related to the concept of Value-at-Risk, but overcomes some of the conceptual .... We adapt a dynamic programming approach to solve the HJB equation associated with.

  18. Optimal Pricing Strategy for New Products

    OpenAIRE

    Trichy V. Krishnan; Frank M. Bass; Dipak C. Jain

    1999-01-01

    Robinson and Lakhani (1975) initiated a long research stream in marketing when they used the Bass model (1969) to develop optimal pricing path for a new product. A careful analysis of the extant literature reveals that the research predominantly suggests that the optimal price path should be largely based on the sales growth pattern. However, in the real world we rarely find new products that have such pricing pattern. We observe either a monotonically declining pricing pattern or an increase...

  19. An optimal tuning strategy for tidal turbines

    Science.gov (United States)

    2016-01-01

    Tuning wind and tidal turbines is critical to maximizing their power output. Adopting a wind turbine tuning strategy of maximizing the output at any given time is shown to be an extremely poor strategy for large arrays of tidal turbines in channels. This ‘impatient-tuning strategy’ results in far lower power output, much higher structural loads and greater environmental impacts due to flow reduction than an existing ‘patient-tuning strategy’ which maximizes the power output averaged over the tidal cycle. This paper presents a ‘smart patient tuning strategy’, which can increase array output by up to 35% over the existing strategy. This smart strategy forgoes some power generation early in the half tidal cycle in order to allow stronger flows to develop later in the cycle. It extracts enough power from these stronger flows to produce more power from the cycle as a whole than the existing strategy. Surprisingly, the smart strategy can often extract more power without increasing maximum structural loads on the turbines, while also maintaining stronger flows along the channel. This paper also shows that, counterintuitively, for some tuning strategies imposing a cap on turbine power output to limit loads can increase a turbine’s average power output. PMID:27956870

  20. Strategy of image management in retail shops

    Directory of Open Access Journals (Sweden)

    Sandra Soče Kraljević

    2007-12-01

    Full Text Available A sound positioning in consumers' minds, along with strong promotion support, has brought many retail shops to the top. This is mostly thanks to the image created in the consumers' minds. A retail shop's image may, but need not, conform to reality. Image often looks like a cliché. It overstates certain elements of the shop while simply omitting others. That is exactly why image is of great importance and often crucial to consumer behavior. This paper aims at determining the impact of image on customer behavior in the course of deciding about shopping and choosing a particular retail shop. Image is a significant factor in the success of every company, hence also of retail shops. It is a relatively strong value and a component of creating competitive advantage. But if we do not pay sufficient attention to image, it can become counterproductive. Instead of acting as added value that helps create and maintain competitive advantage and achieve business aims, it turns into a limiting factor. Therefore, it is imperative to identify the elements of image that are of greatest importance to customers. Research has shown that customers choose the retail shop first and only after that the products and brands within this shop. When it comes to the supermarket, as a kind of retail shop, research has shown that two out of three shopping decisions are made by the customer on the spot, that is, without previous planning. That practically means that we can influence customers with different sales techniques. The paper suggests different strategies of image management for supermarkets and conventional shops. For supermarkets it is the “widest assortment” strategy, while for conventional shops it is the strategy of a “selected group of products”. Improvements to research methods will enable getting more information about customer behavior, while pressures of increased competition in the business environment will force retailers to get

  1. Modular strategies for PET imaging agents

    International Nuclear Information System (INIS)

    Hooker, J.M.

    2010-01-01

    In recent years, modular and simplified chemical and biological strategies have been developed for the synthesis and implementation of positron emission tomography (PET) radiotracers. New developments in bioconjugation and synthetic methodologies, in combination with advances in macromolecular delivery systems and gene-expression imaging, reflect a need to reduce radiosynthesis burden in order to accelerate imaging agent development. These new approaches, which are often mindful of existing infrastructure and available resources, are anticipated to provide a more approachable entry point for researchers interested in using PET to translate in vitro research to in vivo imaging.

  2. Power consumption optimization strategy for wireless networks

    DEFF Research Database (Denmark)

    Cornean, Horia; Kumar, Sanjay; Marchetti, Nicola

    2011-01-01

    in order to reduce the total power consumption in a multi-cellular network. We present an algorithm for power optimization under no-interference and interference conditions, aiming to maximize the network capacity. The convergence of the algorithm is guaranteed if the interference...

  3. Intelligent fault recognition strategy based on adaptive optimized multiple centers

    Science.gov (United States)

    Zheng, Bo; Li, Yan-Feng; Huang, Hong-Zhong

    2018-06-01

    For the recognition principle based on a single optimized center, one important issue is that data with a nonlinear separatrix cannot be recognized accurately. In order to solve this problem, a novel recognition strategy based on adaptively optimized multiple centers is proposed in this paper. This strategy recognizes data sets with a nonlinear separatrix by means of multiple centers. Meanwhile, priority levels are introduced into the multi-objective optimization, covering recognition accuracy, the number of optimized centers, and the distance relationship. According to the characteristics of the various data, the priority levels are adjusted to set the number of optimized centers adaptively and to keep the original accuracy. The proposed method is compared with other methods, including the support vector machine (SVM), neural networks, and the Bayesian classifier. The results demonstrate that the proposed strategy has the same or even better recognition ability on different distribution characteristics of data.

  4. Long-run savings and investment strategy optimization.

    Science.gov (United States)

    Gerrard, Russell; Guillén, Montserrat; Nielsen, Jens Perch; Pérez-Marín, Ana M

    2014-01-01

    We focus on automatic strategies to optimize life cycle savings and investment. Classical optimal savings theory establishes that, given the level of risk aversion, a saver would keep the same relative amount invested in risky assets at any given time. We show that, when optimizing lifecycle investment, performance and risk assessment have to take into account the investor's risk aversion and the maximum amount the investor could lose, simultaneously. When risk aversion and maximum possible loss are considered jointly, an optimal savings strategy is obtained, which follows from constant rather than relative absolute risk aversion. This result is fundamental to prove that if risk aversion and the maximum possible loss are both high, then holding a constant amount invested in the risky asset is optimal for a standard lifetime saving/pension process and outperforms some other simple strategies. Performance comparisons are based on downside risk-adjusted equivalence that is used in our illustration.

  5. Long-Run Savings and Investment Strategy Optimization

    Directory of Open Access Journals (Sweden)

    Russell Gerrard

    2014-01-01

    Full Text Available We focus on automatic strategies to optimize life cycle savings and investment. Classical optimal savings theory establishes that, given the level of risk aversion, a saver would keep the same relative amount invested in risky assets at any given time. We show that, when optimizing lifecycle investment, performance and risk assessment have to take into account the investor’s risk aversion and the maximum amount the investor could lose, simultaneously. When risk aversion and maximum possible loss are considered jointly, an optimal savings strategy is obtained, which follows from constant rather than relative absolute risk aversion. This result is fundamental to prove that if risk aversion and the maximum possible loss are both high, then holding a constant amount invested in the risky asset is optimal for a standard lifetime saving/pension process and outperforms some other simple strategies. Performance comparisons are based on downside risk-adjusted equivalence that is used in our illustration.

  6. Optimal inspection strategies for offshore structural systems

    DEFF Research Database (Denmark)

    Faber, M. H.; Sorensen, J. D.; Kroon, I. B.

    1992-01-01

    a mathematical framework for the estimation of the failure and repair costs associated with systems failure. Further, a strategy for selecting the components to inspect based on decision tree analysis is suggested. Methods and analysis schemes are illustrated by a simple example.

  7. Control strategy optimization of HVAC plants

    Energy Technology Data Exchange (ETDEWEB)

    Facci, Andrea Luigi; Zanfardino, Antonella [Department of Engineering, University of Napoli “Parthenope” (Italy); Martini, Fabrizio [Green Energy Plus srl (Italy); Pirozzi, Salvatore [SIAT Installazioni spa (Italy); Ubertini, Stefano [School of Engineering (DEIM) University of Tuscia (Italy)

    2015-03-10

    In this paper we present a methodology to optimize the operating conditions of heating, ventilation and air conditioning (HVAC) plants to achieve a higher energy efficiency in use. Semi-empirical numerical models of the plant components are used to predict their performance as a function of their set-points and the environmental and occupied-space conditions. The optimization is performed through a graph-based algorithm that finds the set-points of the system components that minimize energy consumption and/or energy costs, while matching the user energy demands. The resulting model can be used with systems of almost any complexity, featuring both HVAC components and energy systems, and is sufficiently fast to make it applicable to real-time settings.

  8. Control strategy optimization of HVAC plants

    International Nuclear Information System (INIS)

    Facci, Andrea Luigi; Zanfardino, Antonella; Martini, Fabrizio; Pirozzi, Salvatore; Ubertini, Stefano

    2015-01-01

    In this paper we present a methodology to optimize the operating conditions of heating, ventilation and air conditioning (HVAC) plants to achieve a higher energy efficiency in use. Semi-empirical numerical models of the plant components are used to predict their performance as a function of their set-points and the environmental and occupied-space conditions. The optimization is performed through a graph-based algorithm that finds the set-points of the system components that minimize energy consumption and/or energy costs, while matching the user energy demands. The resulting model can be used with systems of almost any complexity, featuring both HVAC components and energy systems, and is sufficiently fast to make it applicable to real-time settings.
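
    A toy version of the graph-based set-point search described above, with invented component models and prices: each hour is a layer of a graph whose nodes are the possible active generators, edge weights combine running cost and a switching penalty, and a shortest-path pass over the layered graph returns the cheapest schedule that meets the hourly heat demand. It only illustrates the structure of such an optimizer, not the authors' plant models.

      # Illustrative plant: a heat pump whose COP falls with colder outdoor air,
      # and a gas boiler with constant efficiency. All numbers are invented.
      ELEC_PRICE = 0.25      # currency per kWh of electricity
      GAS_PRICE = 0.09       # currency per kWh of gas
      SWITCH_COST = 0.50     # penalty for changing the active generator between hours
      OUTDOOR_T = [2, 1, 0, -1, 3, 6, 8, 10]     # deg C per hour
      HEAT_DEMAND = [5, 6, 7, 7, 5, 4, 3, 3]     # kWh of heat per hour

      def hourly_cost(mode, hour):
          """Running cost of meeting the hour's heat demand with one generator."""
          q = HEAT_DEMAND[hour]
          if mode == "heat_pump":
              cop = max(1.5, 3.5 + 0.08 * OUTDOOR_T[hour])   # crude COP model
              return ELEC_PRICE * q / cop
          return GAS_PRICE * q / 0.92                         # boiler, 92% efficient

      def optimize_schedule():
          """Shortest path through the layered graph of (hour, active generator) states."""
          modes = ("heat_pump", "boiler")
          best = {m: (hourly_cost(m, 0), [m]) for m in modes}
          for hour in range(1, len(HEAT_DEMAND)):
              new_best = {}
              for m in modes:
                  run = hourly_cost(m, hour)
                  options = [(best[p][0] + run + (SWITCH_COST if p != m else 0.0), best[p][1] + [m])
                             for p in modes]
                  new_best[m] = min(options, key=lambda o: o[0])
              best = new_best
          return min(best.values(), key=lambda o: o[0])

      cost, schedule = optimize_schedule()
      print(f"minimum cost: {cost:.2f}")
      print("schedule:", schedule)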

  9. Optimal Licensing Strategy: Royalty or Fixed Fee?

    OpenAIRE

    Andrea Fosfuri; Esther Roca

    2004-01-01

    Licensing a cost-reducing innovation through a royalty has been shown to be superior to licensing by means of a fixed fee for an incumbent licensor. This note shows that this result relies crucially on the assumption that the incumbent licensor can sell its cost-reducing innovation to all industry players. If, for any reason, only some competitors could be reached through a licensing contract, then a fixed fee might be optimally chosen.

  10. Optimized Power Dispatch Strategy for Offshore Wind Farms

    DEFF Research Database (Denmark)

    Hou, Peng; Hu, Weihao; Zhang, Baohua

    2016-01-01

    Maximizing the power production of offshore wind farms using a proper control strategy has become an important issue for wind farm operators. However, the power transmitted to the onshore substation (OS) is not only related to the power production of each wind turbine (WT) but also to the power losses, which are related to the electrical system topology. This paper proposed an optimized power dispatch strategy (OPD) for minimizing the levelized production cost (LPC) of a wind farm. Particle swarm optimization (PSO) is employed to obtain the final solution for the optimization problem. Both a regular-shape and an irregular-shape wind farm are chosen for the case study. The proposed dispatch strategy is compared with two other control strategies. The simulation results show the effectiveness of the proposed strategy.

  11. Optimal wave focusing for seismic source imaging

    Science.gov (United States)

    Bazargani, Farhad

    In both global and exploration seismology, studying seismic sources provides geophysicists with invaluable insight into the physics of earthquakes and faulting processes. One way to characterize the seismic source is to directly image it. Time-reversal (TR) focusing provides a simple and robust solution to the source imaging problem. However, for recovering a well- resolved image, TR requires a full-aperture receiver array that surrounds the source and adequately samples the wavefield. This requirement often cannot be realized in practice. In most source imaging experiments, the receiver geometry, due to the limited aperture and sparsity of the stations, does not allow adequate sampling of the source wavefield. Incomplete acquisition and imbalanced illumination of the imaging target limit the resolving power of the TR process. The main focus of this thesis is to offer an alternative approach to source imaging with the goal of mitigating the adverse effects of incomplete acquisition on the TR modeling. To this end, I propose a new method, named Backus-Gilbert (BG) source imaging, to optimally focus the wavefield onto the source position using a given receiver geometry. I first introduce BG as a method for focusing waves in acoustic media at a desired location and time. Then, by exploiting the source-receiver reciprocity of the Green function and the linearity of the problem, I show that BG focusing can be adapted and used as a source-imaging tool. Following this, I generalize the BG theory for elastic waves. Applying BG formalism for source imaging requires a model for the wave propagation properties of the earth and an estimate of the source location. Using numerical tests, I next examine the robustness and sensitivity of the proposed method with respect to errors in the earth model, uncertainty in the source location, and noise in data. The BG method can image extended sources as well as point sources. It can also retrieve the source mechanism. These features of

  12. Sleep As A Strategy For Optimizing Performance.

    Science.gov (United States)

    Yarnell, Angela M; Deuster, Patricia

    2016-01-01

    Recovery is an essential component of maintaining, sustaining, and optimizing cognitive and physical performance during and after demanding training and strenuous missions. Getting sufficient amounts of rest and sleep is key to recovery. This article focuses on sleep and discusses (1) why getting sufficient sleep is important, (2) how to optimize sleep, and (3) tools available to help maximize sleep-related performance. Insufficient sleep negatively impacts safety and readiness through reduced cognitive function, more accidents, and increased military friendly-fire incidents. Sufficient sleep is linked to better cognitive performance outcomes, increased vigor, and better physical and athletic performance as well as improved emotional and social functioning. Because Special Operations missions do not always allow for optimal rest or sleep, the impact of reduced rest and sleep on readiness and mission success should be minimized through appropriate preparation and planning. Preparation includes periods of "banking" or extending sleep opportunities before periods of loss, monitoring sleep by using tools like actigraphy to measure sleep and activity, assessing mental effectiveness, exploiting strategic sleep opportunities, and consuming caffeine at recommended doses to reduce fatigue during periods of loss. Together, these efforts may decrease the impact of sleep loss on mission and performance.

  13. Seismic image watermarking using optimized wavelets

    International Nuclear Information System (INIS)

    Mufti, M.

    2010-01-01

    Geotechnical processes and technologies are becoming more and more sophisticated through the use of computer and information technology. This has made the availability, authenticity and security of geotechnical data even more important. One of the most common methods of storing and sharing seismic data images is through the standardized SEG-Y file format. The geotechnical industry is now primarily data centric. The analytic and detection capability of a seismic processing tool is heavily dependent on the correctness of the contents of the SEG-Y data file. This paper describes a method, based on an optimized wavelet transform technique, which prevents unauthorized alteration and/or use of seismic data. (author)

  14. Optimal Dynamic Advertising Strategy Under Age-Specific Market Segmentation

    Science.gov (United States)

    Krastev, Vladimir

    2011-12-01

    We consider the model proposed by Faggian and Grosset for determining the advertising efforts and goodwill in the long run of a company under age segmentation of consumers. Reducing this model to optimal control subproblems, we find the optimal advertising strategy and goodwill.

  15. A strategy for optimizing item-pool management

    NARCIS (Netherlands)

    Ariel, A.; van der Linden, Willem J.; Veldkamp, Bernard P.

    2006-01-01

    Item-pool management requires a balancing act between the input of new items into the pool and the output of tests assembled from it. A strategy for optimizing item-pool management is presented that is based on the idea of a periodic update of an optimal blueprint for the item pool to tune item

  16. Blackjack in Holland Casino's : Basic, optimal and winning strategies

    NARCIS (Netherlands)

    van der Genugten, B.B.

    1995-01-01

    This paper considers the card game Blackjack according to the rules of Holland Casino's in the Netherlands. Expected gains of strategies are derived with simulation and also with analytic tools. New efficiency concepts based on the gains of the basic and the optimal strategy are introduced. A general

  17. A new inertia weight control strategy for particle swarm optimization

    Science.gov (United States)

    Zhu, Xianming; Wang, Hongbo

    2018-04-01

    Particle Swarm Optimization is a member of the swarm intelligence algorithms, inspired by the behavior of bird flocks. The inertia weight, one of the most important parameters of PSO, is crucial because it balances the algorithm's exploration and exploitation. This paper proposes a new inertia weight control strategy, and PSO with this new strategy is tested on four benchmark functions. The results show that the new strategy provides PSO with better performance.
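
    The abstract does not spell out the new inertia weight schedule, so the following sketch only illustrates the mechanism it tunes: a standard PSO loop in which the inertia weight w scales the previous velocity, here with a simple linear decrease from w_max to w_min as a placeholder strategy. All parameter values and the benchmark function are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def sphere(x):
    """Benchmark objective: sum of squares (global minimum at the origin)."""
    return np.sum(x * x, axis=-1)

def pso(obj, dim=10, n_particles=30, iters=200, w_max=0.9, w_min=0.4,
        c1=2.0, c2=2.0, bounds=(-5.0, 5.0), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))   # particle positions
    v = np.zeros_like(x)                          # particle velocities
    pbest, pbest_val = x.copy(), obj(x)           # personal bests
    gbest = pbest[np.argmin(pbest_val)].copy()    # global best

    for t in range(iters):
        # Placeholder inertia-weight schedule: linear decrease from w_max to w_min.
        # A new control strategy would replace only this line.
        w = w_max - (w_max - w_min) * t / (iters - 1)
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        val = obj(x)
        improved = val < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], val[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return gbest, pbest_val.min()

best_x, best_f = pso(sphere)
print("best objective value:", best_f)
```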

  18. Noise-dependent optimal strategies for quantum metrology

    Science.gov (United States)

    Huang, Zixin; Macchiavello, Chiara; Maccone, Lorenzo

    2018-03-01

    For phase estimation using qubits, we show that for some noise channels, the optimal entanglement-assisted strategy depends on the noise level. We note that there is a nontrivial crossover between the parallel-entangled strategy and the ancilla-assisted strategy: in the former the probes are all entangled; in the latter the probes are entangled with a noiseless ancilla but not among themselves. The transition can be explained by the fact that separable states are more robust against noise and therefore are optimal in the high-noise limit, but they are in turn outperformed by ancilla-assisted ones.

  19. Optimal intermittent search strategies: smelling the prey

    International Nuclear Information System (INIS)

    Revelli, J A; Wio, H S; Rojo, F; Budde, C E

    2010-01-01

    We study the kinetics of the search of a single fixed target by a searcher/walker that performs an intermittent random walk, characterized by different states of motion. In addition, we assume that the walker has the ability to detect the scent left by the prey/target in its surroundings. Our results, in agreement with intuition, indicate that the prey's survival probability could be strongly reduced (increased) if the predator is attracted (or repelled) by the trace left by the prey. We have also found that, for a positive trace (the predator is guided towards the prey), increasing the inhomogeneity's size reduces the prey's survival probability, while the optimal value of α (the parameter that regulates intermittency) ceases to exist. The agreement between theory and numerical simulations is excellent.

  20. Optimal intermittent search strategies: smelling the prey

    Energy Technology Data Exchange (ETDEWEB)

    Revelli, J A; Wio, H S [Instituto de Fisica de Cantabria, Universidad de Cantabria and CSIC, E-39005 Santander (Spain); Rojo, F; Budde, C E [Fa.M.A.F., Universidad Nacional de Cordoba, Ciudad Universitaria, X5000HUA Cordoba (Argentina)

    2010-05-14

    We study the kinetics of the search of a single fixed target by a searcher/walker that performs an intermittent random walk, characterized by different states of motion. In addition, we assume that the walker has the ability to detect the scent left by the prey/target in its surroundings. Our results, in agreement with intuition, indicate that the prey's survival probability could be strongly reduced (increased) if the predator is attracted (or repelled) by the trace left by the prey. We have also found that, for a positive trace (the predator is guided towards the prey), increasing the inhomogeneity's size reduces the prey's survival probability, while the optimal value of α (the parameter that regulates intermittency) ceases to exist. The agreement between theory and numerical simulations is excellent.

  1. Optimal Inspection and Maintenance Strategies for Structural Systems

    DEFF Research Database (Denmark)

    Sommer, A. M.

    The aim of this thesis is to give an overview of conventional and optimal reliability-based inspection and maintenance strategies and to examine for specific structures how the cost can be reduced and/or the safety can be improved by using optimal reliability-based inspection strategies. ... For structures with several almost similar components it is suggested that individual inspection strategies should be determined for each component or a group of components based on the reliability of the actual component. The benefit of this procedure is assessed in connection with the structures considered. ... Furthermore, in relation to the calculations performed the intention is to modify an existing program for determination of optimal inspection strategies. The main purpose of inspection and maintenance of structural systems is to prevent or delay damage or deterioration to protect people, environment...

  2. A proposal of optimal sampling design using a modularity strategy

    Science.gov (United States)

    Simone, A.; Giustolisi, O.; Laucelli, D. B.

    2016-08-01

    Real water distribution networks (WDNs) contain thousands of nodes, and the optimal placement of pressure and flow observations is a relevant issue for different management tasks. The planning of pressure observations in terms of spatial distribution and number is called sampling design, and it has traditionally been addressed in the context of model calibration. Nowadays, the design of system monitoring is a relevant issue for water utilities, e.g., in order to manage background leakages, detect anomalies and bursts, guarantee service quality, etc. In recent years, the optimal location of flow observations, related to the design of optimal district metering areas (DMAs) and to leakage management, has been addressed using optimal network segmentation and the modularity index within a multiobjective strategy. Optimal network segmentation is the basis for identifying network modules by means of optimal conceptual cuts, which are the candidate locations of the closed gates or flow meters creating the DMAs. Starting from the WDN-oriented modularity index as a metric for WDN segmentation, this paper proposes a new way to perform the sampling design, i.e., the optimal location of pressure meters, using a newly developed sampling-oriented modularity index. The strategy optimizes the pressure monitoring system mainly on the basis of network topology and of weights assigned to pipes according to the specific technical tasks. A multiobjective optimization minimizes the cost of pressure meters while maximizing the sampling-oriented modularity index. The methodology is presented and discussed using the Apulian and Exnet networks.

  3. Optimal reactor strategy for commercializing fast breeder reactors

    International Nuclear Information System (INIS)

    Yamaji, Kenji; Nagano, Koji

    1988-01-01

    In this paper, a fuel cycle optimization model developed for analyzing the conditions under which fast breeder reactors are selected in the optimal reactor strategy is described. By dividing the planning period, 1966-2055, into nine ten-year periods, the model was formulated as a compact linear programming model. With the model, the best mix of reactor types, as well as the optimal timing of reprocessing spent fuel from LWRs, was found so as to minimize the total cost. The results of the analysis are summarized as follows. Fast breeder reactors could be introduced in the optimal strategy when they can economically compete with LWRs with 30-year storage of spent fuel. In order for fast breeder reactors to monopolize the new reactor market after the achievement of their technical availability, their capital cost should be less than 0.9 times that of LWRs. When a certain amount of reprocessing commitment is assumed, the condition for employing fast breeder reactors in the optimal strategy is relaxed. In the optimal strategy, reprocessing is done just to meet plutonium demand, and the storage of spent fuel is selected to adjust the mismatch between plutonium production and utilization. A price hike of uranium ore facilitates the commercial adoption of fast breeder reactors. (Kako, I.)

  4. Optimal Portfolio Strategy under Rolling Economic Maximum Drawdown Constraints

    Directory of Open Access Journals (Sweden)

    Xiaojian Yu

    2014-01-01

    Full Text Available This paper deals with the problem of optimal portfolio strategy under the constraint of rolling economic maximum drawdown. A more practical strategy is developed by using the rolling Sharpe ratio to compute the allocation proportion, in contrast to existing models. Besides, another novel strategy named “REDP strategy” is further proposed, which replaces the rolling economic drawdown of the portfolio with the rolling economic drawdown of the risky asset. The simulation tests show that the REDP strategy ensures that the portfolio satisfies the drawdown constraint and outperforms other strategies significantly. An empirical comparison of the performances of different strategies is carried out using 23 years of monthly data on SPTR, DJUBS, and the 3-month T-bill. The investment cases of a single risky asset and of two risky assets are both studied in this paper. Empirical results indicate that the REDP strategy successfully controls the maximum drawdown within the given limit and performs best in both return and risk.
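
    As a hedged illustration of the quantity being constrained, the sketch below computes the drawdown of a wealth series relative to its rolling maximum over a fixed look-back window. The risk-free growth adjustment used in the rolling economic drawdown literature, and the paper's allocation rule itself, are omitted; the window length and the simulated data are arbitrary.

```python
import numpy as np

def rolling_economic_drawdown(wealth, window):
    """Drawdown of a wealth series relative to its rolling maximum.

    Simplification: the risk-free growth adjustment used in the REDD
    literature is omitted here.
    """
    wealth = np.asarray(wealth, dtype=float)
    redd = np.empty_like(wealth)
    for t in range(len(wealth)):
        start = max(0, t - window + 1)
        redd[t] = 1.0 - wealth[t] / wealth[start:t + 1].max()
    return redd

# Toy example: 120 months of simulated wealth, 12-month look-back window.
rng = np.random.default_rng(1)
wealth = 100.0 * np.cumprod(1.0 + rng.normal(0.005, 0.04, 120))
print("max rolling drawdown:", rolling_economic_drawdown(wealth, 12).max().round(3))
```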

  5. Data Analysis Strategies in Medical Imaging.

    Science.gov (United States)

    Parmar, Chintan; Barry, Joseph D; Hosny, Ahmed; Quackenbush, John; Aerts, Hugo Jwl

    2018-03-26

    Radiographic imaging continues to be one of the most effective and clinically useful tools within oncology. Sophistication of artificial intelligence (AI) has allowed for detailed quantification of radiographic characteristics of tissues using predefined engineered algorithms or deep learning methods. Precedents in radiology as well as a wealth of research studies hint at the clinical relevance of these characteristics. However, there are critical challenges associated with the analysis of medical imaging data. While some of these challenges are specific to the imaging field, many others like reproducibility and batch effects are generic and have already been addressed in other quantitative fields such as genomics. Here, we identify these pitfalls and provide recommendations for analysis strategies of medical imaging data including data normalization, development of robust models, and rigorous statistical analyses. Adhering to these recommendations will not only improve analysis quality, but will also enhance precision medicine by allowing better integration of imaging data with other biomedical data sources. Copyright ©2018, American Association for Cancer Research.

  6. An unsupervised strategy for biomedical image segmentation

    Directory of Open Access Journals (Sweden)

    Roberto Rodríguez

    2010-09-01

    Full Text Available Roberto Rodríguez¹, Rubén Hernández²; ¹Digital Signal Processing Group, Institute of Cybernetics, Mathematics, and Physics, Havana, Cuba; ²Interdisciplinary Professional Unit of Engineering and Advanced Technology, IPN, Mexico. Abstract: Many segmentation techniques have been published, and some of them have been widely used in different application problems. Most of these segmentation techniques have been motivated by specific application purposes. Unsupervised methods, which do not assume that any prior scene knowledge can be learned to help the segmentation process, are obviously more challenging than supervised ones. In this paper, we present an unsupervised strategy for biomedical image segmentation using an algorithm based on recursively applying mean shift filtering, where entropy is used as a stopping criterion. This strategy is tested on many real images, and a comparison with manual segmentation is carried out. With the proposed strategy, errors of less than 20% for false positives and 0% for false negatives are obtained. Keywords: segmentation, mean shift, unsupervised segmentation, entropy

  7. Optimal generator bidding strategies for power and ancillary services

    Science.gov (United States)

    Morinec, Allen G.

    As the electric power industry transitions to a deregulated market, power transactions are made based on price rather than cost. Generator companies are interested in maximizing their profits rather than overall system efficiency. A method to equitably compensate generation providers for real power, and ancillary services such as reactive power and spinning reserve, will ensure a competitive market with an adequate number of suppliers. Optimizing the generation product mix during bidding is necessary to maximize a generator company's profits. The objective of this research work is to determine and formulate appropriate optimal bidding strategies for a generation company in both the energy and ancillary services markets. These strategies should incorporate the capability curves of their generators as constraints to define the optimal product mix and price offered in the day-ahead and real-time spot markets. In order to achieve such a goal, a two-player model was composed to simulate market auctions for power generation. A dynamic game methodology was developed to identify Nash Equilibria and Mixed-Strategy Nash Equilibria solutions as optimal generation bidding strategies for two-player non-cooperative variable-sum matrix games with incomplete information. These games integrated the generation product mix of real power, reactive power, and spinning reserve with the generators' capability curves as constraints. The research includes simulations of market auctions, where strategies were tested for generators with different unit constraints, costs, types of competitors, strategies, and demand levels. Studies on the capability of large hydrogen-cooled synchronous generators were utilized to derive useful equations that define the exact shape of the capability curve from the intersections of the arcs defined by the centers and radial vectors of the rotor, stator, and steady-state stability limits. The available reactive reserve and spinning reserve were calculated given a

  8. Optimal Image Data Compression For Whole Slide Images

    Directory of Open Access Journals (Sweden)

    J. Isola

    2016-06-01

    Differences in WSI file sizes of scanned images deemed “visually lossless” were significant. If we set the Hamamatsu Nanozoomer .NDPI file size (using its default “jpeg80” quality) as 100%, the size of a “visually lossless” JPEG2000 file was only 15-20% of that. Comparisons to Aperio and 3D-Histech files (.svs and .mrxs at their default settings) yielded similar results. A further optimization of JPEG2000 was done by treating empty slide area as a uniform white-grey surface, which could be maximally compressed. Using this algorithm, JPEG2000 file sizes were only half, or even less, of the original JPEG2000. The variation was due to the proportion of empty slide area on the scan. We anticipate that wavelet-based image compression methods, such as JPEG2000, have a significant advantage in saving storage costs of scanned whole slide images. In routine pathology laboratories applying WSI technology widely to their histology material, absolute cost savings can be substantial.

  9. An optimized strategy for real-time hemorrhage monitoring with electrical impedance tomography

    International Nuclear Information System (INIS)

    Xu, Canhua; Dai, Meng; You, Fusheng; Shi, Xuetao; Fu, Feng; Liu, Ruigang; Dong, Xiuzhen

    2011-01-01

    Delayed detection of an internal hemorrhage may result in serious disabilities and possibly death for a patient. Currently, there are no portable medical imaging instruments that are suitable for long-term monitoring of patients at risk of internal hemorrhage. Electrical impedance tomography (EIT) has the potential to monitor patients continuously as a novel functional image modality and instantly detect the occurrence of an internal hemorrhage. However, the low spatial resolution and high sensitivity to noise of this technique have limited its application in clinics. In addition, due to the circular boundary display mode used in current EIT images, it is difficult for clinicians to identify precisely which organ is bleeding using this technique. The aim of this study was to propose an optimized strategy for EIT reconstruction to promote the use of EIT for clinical studies, which mainly includes the use of anatomically accurate boundary shapes, rapid selection of optimal regularization parameters and image fusion of EIT and computed tomography images. The method was evaluated on retroperitoneal and intraperitoneal bleeding piglet data. Both traditional backprojection images and optimized images among different boundary shapes were reconstructed and compared. The experimental results demonstrated that EIT images with precise anatomical information can be reconstructed in which the image resolution and resistance to noise can be improved effectively

  10. Application of optimal iteration strategies to diffusion theory calculations

    International Nuclear Information System (INIS)

    Jones, R.B.

    1978-01-01

    The geometric interpretation of optimal (minimum computational time) iteration strategies is applied to one- and two-group, two-dimensional diffusion-theory calculations. The method is a ''spectral/time balance'' technique which weighs the convergence enhancement of the inner iteration procedure with that of the outer iteration loop and the time required to reconstruct the source. The diffusion-theory option of the discrete-ordinates transport code DOT3.5 was altered to incorporate the theoretical inner/outer decision logic. For the two-dimensional configuration considered, the optimal strategies reduced the total number of iterations performed for a given error criterion

  11. Optimization Under Uncertainty for Wake Steering Strategies: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Quick, Julian [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Annoni, Jennifer [National Renewable Energy Laboratory (NREL), Golden, CO (United States); King, Ryan N [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Dykes, Katherine L [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Fleming, Paul A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Ning, Andrew [Brigham Young University

    2017-05-01

    Wind turbines in a wind power plant experience significant power losses because of aerodynamic interactions between turbines. One control strategy to reduce these losses is known as 'wake steering,' in which upstream turbines are yawed to direct wakes away from downstream turbines. Previous wake steering research has assumed perfect information; however, there can be significant uncertainty in many aspects of the problem, including wind inflow and various turbine measurements. Uncertainty has significant implications for the performance of wake steering strategies. Consequently, the authors formulate and solve an optimization under uncertainty (OUU) problem for finding optimal wake steering strategies in the presence of yaw angle uncertainty. The OUU wake steering strategy is demonstrated on a two-turbine test case and on the utility-scale, offshore Princess Amalia Wind Farm. When we accounted for yaw angle uncertainty in the Princess Amalia Wind Farm case, inflow-direction-specific OUU solutions produced between 0% and 1.4% more power than the deterministically optimized steering strategies, resulting in an overall annual average improvement of 0.2%. More importantly, the deterministic optimization is expected to perform worse and with more downside risk than the OUU result when realistic uncertainty is taken into account. Additionally, the OUU solution produces fewer extreme yaw situations than the deterministic solution.

  12. Optimal football strategies: AC Milan versus FC Barcelona

    OpenAIRE

    Papahristodoulou, Christos

    2012-01-01

    In a recent UEFA Champions League game between AC Milan and FC Barcelona, played in Italy (final score 2-3), the collected match statistics, classified into four offensive and two defensive strategies, were in favour of FC Barcelona (by 13 versus 8 points). The aim of this paper is to examine to what extent the optimal game strategies derived from some deterministic, possibilistic, stochastic and fuzzy LP models would improve the payoff of AC Milan at the cost of FC Barcelona.

  13. Imaging strategies in pediatric urinary tract infection

    Energy Technology Data Exchange (ETDEWEB)

    Dacher, Jean-Nicolas [University of Rouen, Quant-IF Laboratory, School of Medicine and Pharmacy, Rouen (France); Rouen University Hospital Charles Nicolle, Department of Radiology, Rouen (France); UFR Medecine Pharmacie de Rouen, Laboratoire Quant-If, Rouen (France); Hitzel, Anne; Vera, Pierre [University of Rouen, Quant-IF Laboratory, School of Medicine and Pharmacy, Rouen (France); CRLCC Henri Becquerel, Department of Nuclear Medicine, Rouen (France); Avni, Fred E. [Free University of Brussels, Department of Radiology, Erasmus Hospital, Brussels (Belgium)

    2005-07-01

    This article is focused on the controversial topic of imaging strategies in pediatric urinary tract infection. A review of the recent literature illustrates the complementary roles of ultrasound, diagnostic radiology and nuclear medicine. The authors stress the key role of ultrasound which has recently been debated. The commonly associated vesicoureteric reflux has to be classified as congenital or secondary due to voiding dysfunction. A series of frequently asked questions are addressed in a second section. The proposed answers are not the product of a consensus but should rather be considered as proposals to enrich the ongoing debate concerning the evaluation of urinary tract infection in children. (orig.)

  14. Imaging strategies in pediatric urinary tract infection

    International Nuclear Information System (INIS)

    Dacher, Jean-Nicolas; Hitzel, Anne; Vera, Pierre; Avni, Fred E.

    2005-01-01

    This article is focused on the controversial topic of imaging strategies in pediatric urinary tract infection. A review of the recent literature illustrates the complementary roles of ultrasound, diagnostic radiology and nuclear medicine. The authors stress the key role of ultrasound which has recently been debated. The commonly associated vesicoureteric reflux has to be classified as congenital or secondary due to voiding dysfunction. A series of frequently asked questions are addressed in a second section. The proposed answers are not the product of a consensus but should rather be considered as proposals to enrich the ongoing debate concerning the evaluation of urinary tract infection in children. (orig.)

  15. Optimal decentralized valley-filling charging strategy for electric vehicles

    International Nuclear Information System (INIS)

    Zhang, Kangkang; Xu, Liangfei; Ouyang, Minggao; Wang, Hewu; Lu, Languang; Li, Jianqiu; Li, Zhe

    2014-01-01

    Highlights: • An implementable charging strategy is developed for electric vehicles connected to a grid. • A two-dimensional pricing scheme is proposed to coordinate charging behaviors. • The strategy works in a decentralized way but achieves systematic valley filling. • The strategy allows device-level charging autonomy, and does not require a bidirectional communication/control network. • The strategy can self-correct when confronted with adverse factors. - Abstract: Uncoordinated charging load of electric vehicles (EVs) increases the peak load of the power grid, thereby increasing the cost of electricity generation. The valley-filling charging scenario offers a cheaper alternative. This study proposes a novel decentralized valley-filling charging strategy, in which a day-ahead pricing scheme is designed by solving a minimum-cost optimization problem. The pricing scheme can be broadcast to EV owners, and the individual charging behaviors can be indirectly coordinated. EV owners respond to the pricing scheme by autonomously optimizing their individual charge patterns. This device-level response induces a valley-filling effect in the grid at the system level. The proposed strategy offers three advantages: coordination (by the valley-filling effect), practicality (no requirement for a bidirectional communication/control network between the grid and EV owners), and autonomy (user control of EV charge patterns). The proposed strategy is validated in simulations of typical scenarios in Beijing, China. According to the results, the strategy (1) effectively achieves the valley-filling charging effect at 28% less generation cost than the uncoordinated charging strategy, (2) is robust to several potential factors affecting the valley-filling effect, such as (system-level) inaccurate parameter estimation and (device-level) response capability and willingness (which cause less than 2% deviation in the minimal generation cost), and (3) is compatible with
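
    The paper coordinates charging indirectly through a day-ahead price signal; as a much simpler stand-in, the sketch below shows the valley-filling effect itself with a greedy heuristic that pours a required amount of EV charging energy into the lowest-load hours of an assumed 24-hour base-load profile. The load values, the energy requirement and the per-hour limit are made-up numbers, and no pricing scheme is modeled.

```python
import numpy as np

def valley_fill(base_load, energy_needed, max_charge_per_hour, step=0.1):
    """Greedy valley filling: pour EV charging energy into the lowest-load hours.

    base_load            hourly system load without EV charging (24 values)
    energy_needed        total EV energy to schedule (load units x 1 h)
    max_charge_per_hour  aggregate charging limit in any single hour
    """
    load = np.asarray(base_load, dtype=float).copy()
    charge = np.zeros_like(load)
    remaining = float(energy_needed)
    while remaining > 1e-9:
        free = charge < max_charge_per_hour          # hours with spare capacity
        if not free.any():
            raise ValueError("not enough charging capacity in 24 h")
        h = int(np.argmin(np.where(free, load, np.inf)))
        inc = min(step, remaining, max_charge_per_hour - charge[h])
        charge[h] += inc
        load[h] += inc
        remaining -= inc
    return charge

base = np.array([60, 55, 52, 50, 50, 55, 65, 80, 90, 95, 97, 96,
                 95, 94, 93, 94, 96, 100, 105, 102, 95, 85, 75, 65], float)
print(valley_fill(base, energy_needed=120, max_charge_per_hour=20).round(1))
```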

  16. A Competitive and Experiential Assignment in Search Engine Optimization Strategy

    Science.gov (United States)

    Clarke, Theresa B.; Clarke, Irvine, III

    2014-01-01

    Despite an increase in ad spending and demand for employees with expertise in search engine optimization (SEO), methods for teaching this important marketing strategy have received little coverage in the literature. Using Bloom's cognitive goals hierarchy as a framework, this experiential assignment provides a process for educators who may be new…

  17. Optimal portfolio strategies under a shortfall constraint | Akume ...

    African Journals Online (AJOL)

    We dynamically impose a shortfall constraint in terms of Tail Conditional Expectation on the portfolio selection problem in continuous time, in order to obtain optimal strategies. The financial market is assumed to comprise n risky assets driven by geometric Brownian motion and one risk-free asset. The method of Lagrange ...

  18. Optimal energy management strategy for battery powered electric vehicles

    International Nuclear Information System (INIS)

    Xi, Jiaqi; Li, Mian; Xu, Min

    2014-01-01

    Highlights: • The power usage for battery-powered electrical vehicles with in-wheel motors is maximized. • The battery and motor dynamics are examined emphasized on the power conversion and utilization. • The optimal control strategy is derived and verified by simulations. • An analytic expression of the optimal operating point is obtained. - Abstract: Due to limited energy density of batteries, energy management has always played a critical role in improving the overall energy efficiency of electric vehicles. In this paper, a key issue within the energy management problem will be carefully tackled, i.e., maximizing the power usage of batteries for battery-powered electrical vehicles with in-wheel motors. To this end, the battery and motor dynamics will be thoroughly examined with particular emphasis on the power conversion and power utilization. The optimal control strategy will then be derived based on the analysis. One significant contribution of this work is that an analytic expression for the optimal operating point in terms of the component and environment parameters can be obtained. Owing to this finding, the derived control strategy is also rendered a simple structure for real-time implementation. Simulation results demonstrate that the proposed strategy works both adaptively and robustly under different driving scenarios

  19. Validation of optimization strategies using the linear structured production chains

    Science.gov (United States)

    Kusiak, Jan; Morkisz, Paweł; Oprocha, Piotr; Pietrucha, Wojciech; Sztangret, Łukasz

    2017-06-01

    Different optimization strategies applied to a sequence of several stages of production chains were validated in this paper. Two benchmark problems described by ordinary differential equations (ODEs) were considered. A water tank and a passive CR-RC filter were used as the exemplary objects described by first- and second-order differential equations, respectively. The optimization problems considered in this work serve as validators of the strategies elaborated by the authors. However, the main goal of the research is the selection of the best strategy for the optimization of two real metallurgical processes which will be investigated in on-going projects. The first problem will be the oxidizing roasting process of zinc sulphide concentrate, where the sulphur from the input concentrate should be eliminated and a minimal concentration of sulphide sulphur in the roasted products has to be achieved. The second problem will be the lead refining process consisting of three stages: roasting to the oxide, oxide reduction to metal and the oxidizing refining. Strategies which appear the most effective in the considered benchmark problems will be candidates for the optimization of the industrial processes mentioned above.
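
    For the first benchmark, a water tank governed by a first-order ODE, a minimal validation setup might look like the sketch below: a classical fourth-order Runge-Kutta integrator plus a brute-force search over a constant inflow that drives the level to a target. The tank equation dh/dt = (q_in - k*sqrt(h))/A, all parameter values and the target are assumptions, not the formulation used by the authors.

```python
import numpy as np

def rk4(f, y0, t):
    """Classical fourth-order Runge-Kutta integration on a fixed time grid."""
    y = np.empty(len(t))
    y[0] = y0
    for i in range(len(t) - 1):
        h = t[i + 1] - t[i]
        k1 = f(t[i], y[i])
        k2 = f(t[i] + h / 2, y[i] + h / 2 * k1)
        k3 = f(t[i] + h / 2, y[i] + h / 2 * k2)
        k4 = f(t[i] + h, y[i] + h * k3)
        y[i + 1] = y[i] + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    return y

def final_level(q_in, A=1.0, k=0.3, h0=0.2, t_end=60.0, n=601):
    """Assumed tank model dh/dt = (q_in - k*sqrt(h)) / A; returns the level at t_end."""
    t = np.linspace(0.0, t_end, n)
    level = rk4(lambda _t, h: (q_in - k * np.sqrt(max(h, 0.0))) / A, h0, t)
    return level[-1]

# Brute-force "optimization": constant inflow whose final level is closest to 1.0.
candidates = np.linspace(0.05, 0.6, 56)
errors = [abs(final_level(q) - 1.0) for q in candidates]
print("best inflow:", round(float(candidates[int(np.argmin(errors))]), 3))
```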

  20. Optimal Scale Edge Detection Utilizing Noise within Images

    Directory of Open Access Journals (Sweden)

    Adnan Khashman

    2003-04-01

    Full Text Available Edge detection techniques have common problems that include poor edge detection in low contrast images, speed of recognition and high computational cost. An efficient solution to the edge detection of objects in low to high contrast images is scale space analysis. However, this approach is time consuming and computationally expensive. These expenses can be marginally reduced if an optimal scale is found in scale space edge detection. This paper presents a new approach to detecting objects within images using the noise within the images. The novel idea is based on selecting one optimal scale for the entire image at which scale space edge detection can be applied. The selection of an ideal scale is based on the hypothesis that "the optimal edge detection scale (ideal scale) depends on the noise within an image". This paper aims at providing experimental evidence of the relationship between the optimal scale and the noise within images.

  1. An optimal inspection strategy for randomly failing equipment

    International Nuclear Information System (INIS)

    Chelbi, Anis; Ait-Kadi, Daoud

    1999-01-01

    This paper addresses the problem of generating optimal inspection strategies for randomly failing equipment where imminent failure is not obvious and can only be detected through inspection. Inspections are carried out following a condition-based procedure. The equipment is replaced if it has failed or if it shows imminent signs of failure. The latter state is indicated by measuring certain predetermined control parameters during inspection. Costs are associated with inspection, idle time and preventive or corrective actions. An optimal inspection strategy is defined as the inspection sequence minimizing the expected total cost per time unit over an infinite span. A mathematical model and a numerical algorithm are developed to generate an optimal inspection sequence. As a practical example, the model is applied to provide a machine tool operator with a time sequence for inspecting the cutting tool. The tool life time distribution and the trend of one control parameter defining its actual condition are supposed to be known
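
    The criterion in the abstract is the expected total cost per time unit over an infinite span. For the simpler special case of equally spaced inspections of a component with a known lifetime distribution, that criterion can be evaluated by Monte Carlo with a renewal-reward ratio, as in the sketch below; the Weibull lifetimes, the cost figures and the assumption that a failure is only revealed at the next inspection are illustrative and not the paper's condition-based model.

```python
import numpy as np

def cost_rate(T, c_insp=50.0, c_down=20.0, c_repl=500.0,
              shape=2.0, scale=100.0, n=200_000, seed=0):
    """Long-run expected cost per unit time when inspecting every T time units.

    Assumptions (not the paper's model): Weibull lifetimes, a failure is only
    revealed at the first inspection after it occurs, and the cycle renews there.
    """
    rng = np.random.default_rng(seed)
    life = scale * rng.weibull(shape, n)              # sampled lifetimes
    k = np.ceil(life / T)                             # inspections per renewal cycle
    cycle_len = k * T                                 # cycle ends at the detecting inspection
    cycle_cost = c_insp * k + c_down * (cycle_len - life) + c_repl
    return cycle_cost.mean() / cycle_len.mean()       # renewal-reward ratio

intervals = np.arange(5.0, 105.0, 5.0)
rates = [cost_rate(T) for T in intervals]
print("best inspection interval:", intervals[int(np.argmin(rates))])
```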

  2. Optimal Energy Control Strategy Design for a Hybrid Electric Vehicle

    Directory of Open Access Journals (Sweden)

    Yuan Zou

    2013-01-01

    Full Text Available A heavy-duty parallel hybrid electric truck is modeled, and its optimal energy control is studied in this paper. The fundamental architecture of the parallel hybrid electric truck is modeled feed-forwardly, together with the necessary dynamic features of its subsystems and components. A dynamic programming (DP) technique is adopted to find the optimal control strategy, including the gear-shifting sequence and the power split between the engine and the motor, subject to a battery SOC-sustaining constraint. Improved control rules are extracted from the DP-based control solution, forming near-optimal control strategies. Simulation results demonstrate that a significant improvement in fuel economy can be achieved over the heavy-duty vehicle cycle derived from natural driving statistics.
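
    The following is a minimal sketch of the kind of backward dynamic program the abstract describes: a grid over battery state of charge, a discretized power split at each time step, and a cost-to-go recursion with a soft SOC-sustaining terminal penalty. The demand profile, the toy fuel-rate model and the battery parameters are invented for illustration; gear shifting and the real truck model are omitted.

```python
import numpy as np

P_dem = np.array([40, 60, 80, 50, 30, 70, 90, 60, 40, 20], float)  # demand [kW] per step
dt = 60.0                                      # step length [s]
soc_grid = np.linspace(0.4, 0.8, 201)          # battery state-of-charge grid
batt_kwh = 10.0                                # battery capacity [kWh]
splits = np.linspace(0.0, 1.0, 21)             # fraction of demand taken by the motor

def fuel(p_engine):
    """Toy convex fuel-rate model [g per step]; not a real engine map."""
    return 0.5 + 0.08 * p_engine + 0.0008 * p_engine ** 2

N = len(P_dem)
V = 1e5 * (soc_grid - 0.6) ** 2                # soft SOC-sustaining terminal penalty
policy = np.zeros((N, len(soc_grid)))

for k in range(N - 1, -1, -1):                 # backward recursion over time
    V_next = V.copy()
    for i, soc in enumerate(soc_grid):
        best = np.inf
        for s in splits:
            p_motor = s * P_dem[k]
            soc_new = soc - p_motor * dt / 3600.0 / batt_kwh
            if soc_new < soc_grid[0]:          # battery would be over-discharged
                continue
            j = int(np.abs(soc_grid - soc_new).argmin())   # nearest-grid transition
            c = fuel((1.0 - s) * P_dem[k]) + V_next[j]
            if c < best:
                best, policy[k, i] = c, s
        V[i] = best

i0 = int(np.abs(soc_grid - 0.6).argmin())
print("cost-to-go from SOC 0.6:", round(float(V[i0]), 2))
print("motor share at the first step:", policy[0, i0])
```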

  3. Growth or reproduction: emergence of an evolutionary optimal strategy

    International Nuclear Information System (INIS)

    Grilli, J; Suweis, S; Maritan, A

    2013-01-01

    Modern ecology has re-emphasized the need for a quantitative understanding of the original ‘survival of the fittest theme’ based on analysis of the intricate trade-offs between competing evolutionary strategies that characterize the evolution of life. This is key to the understanding of species coexistence and ecosystem diversity under the omnipresent constraint of limited resources. In this work we propose an agent-based model replicating a community of interacting individuals, e.g. plants in a forest, where all are competing for the same finite amount of resources and each competitor is characterized by a specific growth–reproduction strategy. We show that such an evolution dynamics drives the system towards a stationary state characterized by an emergent optimal strategy, which in turn depends on the amount of available resources the ecosystem can rely on. We find that the share of resources used by individuals is power-law distributed with an exponent directly related to the optimal strategy. The model can be further generalized to devise optimal strategies in social and economical interacting systems dynamics. (paper)

  4. Online gaming for learning optimal team strategies in real time

    Science.gov (United States)

    Hudas, Gregory; Lewis, F. L.; Vamvoudakis, K. G.

    2010-04-01

    This paper first presents an overall view of dynamical decision-making in teams, both cooperative and competitive. Strategies for team decision problems, including optimal control, zero-sum 2-player games (H-infinity control) and so on, are normally solved off-line by solving associated matrix equations such as the Riccati equation. However, using that approach, players cannot change their objectives online in real time without calling for a completely new off-line solution for the new strategies. Therefore, in this paper we give a method for learning optimal team strategies online in real time as team dynamical play unfolds. In the linear quadratic regulator case, for instance, the method learns the Riccati equation solution online without ever solving the Riccati equation. This allows for truly dynamical team decisions where objective functions can change in real time and the system dynamics can be time-varying.

  5. Artificial root foraging optimizer algorithm with hybrid strategies

    Directory of Open Access Journals (Sweden)

    Yang Liu

    2017-02-01

    Full Text Available In this work, a new plant-inspired optimization algorithm, namely the hybrid artificial root foraging optimization (HARFO), is proposed, which mimics iterative root foraging behaviors for complex optimization. In the HARFO model, two innovative strategies were developed: one is the root-to-root communication strategy, which enables individuals to exchange information with each other in different efficient topologies and can essentially improve the exploration ability; the other is the co-evolution strategy, which structures a hierarchical spatial population driven by the evolutionary pressure of multiple sub-populations, ensuring that the diversity of the root population is well maintained. The proposed algorithm is benchmarked against four classical evolutionary algorithms on well-designed test function suites including both classical and composition test functions. The rigorous performance analysis of all these tests highlights the significant performance improvement, and the comparative results show the superiority of the proposed algorithm.

  6. A new optimal seam method for seamless image stitching

    Science.gov (United States)

    Xue, Jiale; Chen, Shengyong; Cheng, Xu; Han, Ying; Zhao, Meng

    2017-07-01

    A novel optimal seam method which aims to stitch images with an overlapping area more seamlessly has been proposed. Because the traditional gradient-domain optimal seam method gives poor color-difference measurement and the fusion algorithm takes a long time, the input images are converted to HSV space and a new energy function is designed to seek the optimal stitching path. To smooth the optimal stitching path, a simplified pixel correction and a weighted average method are utilized individually. The proposed methods show better performance in eliminating the stitching seam than the traditional gradient optimal seam, and higher efficiency than the multi-band blending algorithm.
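
    The paper's energy function in HSV space is not reproduced here, but the optimal-seam search it feeds can be illustrated with the standard dynamic-programming formulation below, which finds the minimum-cost top-to-bottom path through an arbitrary energy map of the overlap region (for example, a squared difference of the two overlapping images).

```python
import numpy as np

def min_cost_vertical_seam(energy):
    """Minimum-cost top-to-bottom seam through an energy map (rows x cols)."""
    E = np.asarray(energy, dtype=float)
    rows, cols = E.shape
    cost = E.copy()
    back = np.zeros((rows, cols), dtype=int)
    for r in range(1, rows):
        for c in range(cols):
            lo, hi = max(0, c - 1), min(cols, c + 2)
            j = lo + int(np.argmin(cost[r - 1, lo:hi]))   # best parent among neighbours
            back[r, c] = j
            cost[r, c] += cost[r - 1, j]
    seam = np.empty(rows, dtype=int)
    seam[-1] = int(np.argmin(cost[-1]))                   # cheapest bottom-row pixel
    for r in range(rows - 1, 0, -1):
        seam[r - 1] = back[r, seam[r]]
    return seam

# Toy energy for the overlap region, e.g. squared difference of the two images.
rng = np.random.default_rng(0)
overlap_energy = rng.random((64, 32)) ** 2
print(min_cost_vertical_seam(overlap_energy)[:8])
```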

  7. Image Mosaic Techniques Optimization Using Wavelet

    Institute of Scientific and Technical Information of China (English)

    ZHOU An-qi; CUI Li

    2014-01-01

    This essay concentrates on two key procedures of image mosaicking: image registration and image fusion. Because of the geometric transformation invariance of edge points, we calculate the angle difference of the direction vectors of edge points in different images and draw an angle difference histogram to correct the rotation problem. In this way, an algorithm based on gray information is extended and can be used for images with displacement and rotation. In terms of image fusion, wavelet multi-scale analysis is used to fuse the spliced images. In order to choose the best method of image fusion, we evaluate the results of the different image fusion methods by cross entropy.
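
    A hedged sketch of the rotation-adjustment idea: build a histogram of gradient (edge) directions in each image and take the circular shift that maximizes the correlation between the two histograms as the rotation estimate. The gradient operator, the strong-edge threshold and the bin count are choices made here for illustration rather than details taken from the essay.

```python
import numpy as np

def gradient_angle_histogram(img, bins=360):
    """Histogram of gradient directions, restricted to strong-edge pixels."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = (np.degrees(np.arctan2(gy, gx)) + 360.0) % 360.0
    strong = mag > np.percentile(mag, 90)
    hist, _ = np.histogram(ang[strong], bins=bins, range=(0.0, 360.0))
    return hist / max(hist.sum(), 1)

def estimate_rotation(img_a, img_b, bins=360):
    """Rotation estimate [deg]: circular histogram shift with maximum correlation."""
    ha = gradient_angle_histogram(img_a, bins)
    hb = gradient_angle_histogram(img_b, bins)
    scores = [np.dot(ha, np.roll(hb, s)) for s in range(bins)]
    return float(np.argmax(scores)) * 360.0 / bins

# Toy check: a vertical bar versus the same bar rotated by 90 degrees.
img = np.zeros((64, 64))
img[20:44, 28:36] = 1.0
print(estimate_rotation(img, np.rot90(img)))   # expect roughly 90 or 270
```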

  8. Optimization of Segmentation Quality of Integrated Circuit Images

    Directory of Open Access Journals (Sweden)

    Gintautas Mušketas

    2012-04-01

    Full Text Available The paper presents an investigation into the application of genetic algorithms for the segmentation of the active regions of integrated circuit images. The article gives a theoretical examination of the applied methods (morphological dilation, erosion, hit-and-miss, threshold) and describes genetic algorithms and image segmentation as an optimization problem. The genetic optimization of the parameters of a predefined filter sequence is carried out. The improvement in segmentation accuracy compared with a non-optimized filter sequence is 6%. Article in Lithuanian
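
    The paper optimizes the parameters of a predefined morphological filter sequence; the sketch below shrinks that to a single threshold parameter optimized by a tiny genetic algorithm against a synthetic ground-truth mask, scored by pixel agreement. It only illustrates the GA-for-segmentation-parameters pattern; the filter sequence, fitness measure and data of the original work are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: an "IC layer image" and its ground-truth active-region mask.
truth = np.zeros((64, 64), dtype=bool)
truth[16:48, 16:48] = True
image = truth * 0.7 + rng.normal(0.0, 0.1, truth.shape)

def fitness(threshold):
    """Fraction of pixels where simple thresholding agrees with the reference mask."""
    return float(np.mean((image > threshold) == truth))

def ga(pop_size=20, gens=30, mut_sigma=0.05):
    pop = rng.uniform(0.0, 1.0, pop_size)                      # candidate thresholds
    for _ in range(gens):
        fit = np.array([fitness(t) for t in pop])
        parents = pop[np.argsort(fit)[-pop_size // 2:]]        # truncation selection
        children = rng.choice(parents, pop_size - len(parents))
        children = np.clip(children + rng.normal(0.0, mut_sigma, len(children)), 0, 1)
        pop = np.concatenate([parents, children])              # next generation
    fit = np.array([fitness(t) for t in pop])
    return float(pop[np.argmax(fit)]), float(fit.max())

best_t, best_fit = ga()
print(f"best threshold {best_t:.3f}, pixel agreement {best_fit:.3f}")
```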

  9. On the robust optimization to the uncertain vaccination strategy problem

    International Nuclear Information System (INIS)

    Chaerani, D.; Anggriani, N.; Firdaniza

    2014-01-01

    In order to prevent an epidemic of infectious diseases, the vaccination coverage needs to be minimized while the basic reproduction number is maintained below 1. This means that, while keeping the vaccination coverage as low as possible, we also need to confine the epidemic to the small number of people who are already infected. In this paper, we discuss the case of a vaccination strategy in terms of minimizing vaccination coverage when the basic reproduction number is assumed to be an uncertain parameter that lies between 0 and 1. We refer to the linear optimization model for vaccination strategy proposed by Becker and Starrzak (see [2]). Assuming that parameter uncertainty is involved, Tanner et al (see [9]) proposed an optimal solution of the problem using stochastic programming. In this paper we discuss an alternative way of optimizing the uncertain vaccination strategy using Robust Optimization (see [3]). In this approach we assume that the parameter uncertainty lies within an ellipsoidal uncertainty set, so that we can claim that the obtained result will be achieved by a polynomial time algorithm (as is guaranteed by the RO methodology). The robust counterpart model is presented

  10. On the robust optimization to the uncertain vaccination strategy problem

    Energy Technology Data Exchange (ETDEWEB)

    Chaerani, D., E-mail: d.chaerani@unpad.ac.id; Anggriani, N., E-mail: d.chaerani@unpad.ac.id; Firdaniza, E-mail: d.chaerani@unpad.ac.id [Department of Mathematics, Faculty of Mathematics and Natural Sciences, University of Padjadjaran Indonesia, Jalan Raya Bandung Sumedang KM 21 Jatinangor Sumedang 45363 (Indonesia)

    2014-02-21

    In order to prevent an epidemic of infectious diseases, the vaccination coverage needs to be minimized while the basic reproduction number is maintained below 1. This means that, while keeping the vaccination coverage as low as possible, we also need to confine the epidemic to the small number of people who are already infected. In this paper, we discuss the case of a vaccination strategy in terms of minimizing vaccination coverage when the basic reproduction number is assumed to be an uncertain parameter that lies between 0 and 1. We refer to the linear optimization model for vaccination strategy proposed by Becker and Starrzak (see [2]). Assuming that parameter uncertainty is involved, Tanner et al (see [9]) proposed an optimal solution of the problem using stochastic programming. In this paper we discuss an alternative way of optimizing the uncertain vaccination strategy using Robust Optimization (see [3]). In this approach we assume that the parameter uncertainty lies within an ellipsoidal uncertainty set, so that we can claim that the obtained result will be achieved by a polynomial time algorithm (as is guaranteed by the RO methodology). The robust counterpart model is presented.

  11. Generating optimized stochastic power management strategies for electric car components

    Energy Technology Data Exchange (ETDEWEB)

    Fruth, Matthias [TraceTronic GmbH, Dresden (Germany); Bastian, Steve [Technische Univ. Dresden (Germany)

    2012-11-01

    With the increasing prevalence of electric vehicles, reducing the power consumption of car components becomes a necessity. For the example of a novel traffic-light assistance system, which makes speed recommendations based on the expected length of red-light phases, power-management strategies are used to control under which conditions radio communication, positioning systems and other components are switched to low-power (e.g. sleep) or high-power (e.g. idle/busy) states. We apply dynamic power management, an optimization technique well-known from other domains, in order to compute energy-optimal power-management strategies, sometimes resulting in these strategies being stochastic. On the example of the traffic-light assistant, we present a MATLAB/Simulink-implemented framework for the generation, simulation and formal analysis of optimized power-management strategies, which is based on this technique. We study capabilities and limitations of this approach and sketch further applications in the automotive domain. (orig.)
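
    Dynamic power management of this kind is commonly cast as a Markov decision process over component power modes. As a hedged illustration (the traffic-light assistant's actual model is not given in the abstract), the sketch below runs value iteration on a two-mode sleep/idle component with a fixed request-arrival probability, a mode-switching cost and a penalty for requests that hit a sleeping component; all numbers are invented.

```python
# Two component power modes; the action chooses the mode for the next step.
modes = ["sleep", "idle"]
power = {"sleep": 0.1, "idle": 1.0}    # energy cost per step in each mode
switch_energy = 3.0                     # extra cost whenever the mode changes
p_request = 0.05                        # probability of a service request per step
miss_penalty = 8.0                      # penalty if a request arrives while sleeping
gamma = 0.95                            # discount factor

def step_cost(state, action):
    cost = power[action] + (switch_energy if action != state else 0.0)
    if action == "sleep":
        cost += p_request * miss_penalty   # expected penalty for requests missed while asleep
    return cost

# Value iteration; the next state is simply the chosen mode (deterministic here).
V = {m: 0.0 for m in modes}
for _ in range(500):
    V = {s: min(step_cost(s, a) + gamma * V[a] for a in modes) for s in modes}

policy = {s: min(modes, key=lambda a: step_cost(s, a) + gamma * V[a]) for s in modes}
print("value function:", {s: round(v, 2) for s, v in V.items()})
print("policy:", policy)
```

    With these illustrative numbers the component prefers to sleep; raising p_request above roughly 0.11 makes staying idle cheaper, which is the kind of trade-off the optimized (and possibly stochastic) strategies mentioned in the abstract resolve automatically.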

  12. An optimization strategy for a biokinetic model of inhaled radionuclides

    International Nuclear Information System (INIS)

    Shyr, L.J.; Griffith, W.C.; Boecker, B.B.

    1991-01-01

    Models for material disposition and dosimetry involve predictions of the biokinetics of the material among compartments representing organs and tissues in the body. Because of a lack of human data for most toxicants, many of the basic data are derived by modeling the results obtained from studies using laboratory animals. Such a biomathematical model is usually developed by adjusting the model parameters to make the model predictions match the measured retention and excretion data visually. The fitting process can be very time-consuming for a complicated model, and visual model selections may be subjective and easily biased by the scale or the data used. Due to the development of computerized optimization methods, manual fitting could benefit from an automated process. However, for a complicated model, an automated process without an optimization strategy will not be efficient, and may not produce fruitful results. In this paper, procedures for, and implementation of, an optimization strategy for a complicated mathematical model is demonstrated by optimizing a biokinetic model for 144Ce in fused aluminosilicate particles inhaled by beagle dogs. The optimized results using SimuSolv were compared to manual fitting results obtained previously using the model simulation software GASP. Also, statistical criteria provided by SimuSolv, such as likelihood function values, were used to help or verify visual model selections

  13. Control strategies for wind farm power optimization: LES study

    Science.gov (United States)

    Ciri, Umberto; Rotea, Mario; Leonardi, Stefano

    2017-11-01

    Turbines in wind farms operate in off-design conditions as wake interactions occur for particular wind directions. Advanced wind farm control strategies aim at coordinating and adjusting turbine operations to mitigate power losses in such conditions. Coordination is achieved by controlling on upstream turbines either the wake intensity, through the blade pitch angle or the generator torque, or the wake direction, through yaw misalignment. Downstream turbines can be adapted to work in waked conditions and limit power losses, using the blade pitch angle or the generator torque. As wind conditions in wind farm operations may change significantly, it is difficult to determine and parameterize the variations of the coordinated optimal settings. An alternative is model-free control and optimization of wind farms, which does not require any parameterization and can track the optimal settings as conditions vary. In this work, we employ a model-free optimization algorithm, extremum-seeking control, to find the optimal set-points of generator torque, blade pitch and yaw angle for a three-turbine configuration. Large-Eddy Simulations are used to provide a virtual environment to evaluate the performance of the control strategies under realistic, unsteady incoming wind. This work was supported by the National Science Foundation, Grants No. 1243482 (the WINDINSPIRE project) and IIP 1362033 (I/UCRC WindSTAR). TACC is acknowledged for providing computational time.
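
    A minimal sketch of the model-free extremum-seeking loop mentioned above, applied to a single yaw set-point: add a sinusoidal dither, measure the (black-box) farm power, demodulate with the same sinusoid to estimate the local gradient, and integrate. The toy power curve, the gains and the dither parameters are assumptions; the LES-coupled, multi-turbine setup used in the study is far richer.

```python
import numpy as np

def farm_power(yaw_offset):
    """Black-box plant as seen by the controller: toy power vs. yaw offset [deg]."""
    return 1.0 - 0.01 * (yaw_offset - 12.0) ** 2     # peak at 12 degrees

a = 1.0            # dither amplitude [deg]
omega = 0.5        # dither frequency [rad/step]
lr = 0.02          # integration gain
theta = 0.0        # current yaw set-point estimate

for t in range(40000):
    dither = a * np.sin(omega * t)
    J = farm_power(theta + dither)        # measured objective with dither applied
    theta += lr * J * np.sin(omega * t)   # demodulate and integrate the gradient estimate

print("estimated optimal yaw offset [deg]:", round(theta, 1))
```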

  14. Investigation of Optimal Integrated Circuit Raster Image Vectorization Method

    Directory of Open Access Journals (Sweden)

    Leonas Jasevičius

    2011-03-01

    Full Text Available Visual analysis of an integrated circuit layer requires a raster image vectorization stage to extract the layer topology data for CAD tools. In this paper, vectorization problems of raster IC layer images are presented. Various algorithms for line extraction from raster images and their properties are discussed. An optimal raster image vectorization method was developed which allows common vectorization algorithms to be utilized to achieve the best possible match between the extracted vector data and perfect manual vectorization results. To develop the optimal method, the dependence of vectorized data quality on the selection of the initial raster image skeleton filter was assessed. Article in Lithuanian

  15. Transitions in optimal adaptive strategies for populations in fluctuating environments

    Science.gov (United States)

    Mayer, Andreas; Mora, Thierry; Rivoire, Olivier; Walczak, Aleksandra M.

    2017-09-01

    Biological populations are subject to fluctuating environmental conditions. Different adaptive strategies can allow them to cope with these fluctuations: specialization to one particular environmental condition, adoption of a generalist phenotype that compromises between conditions, or population-wise diversification (bet hedging). Which strategy provides the largest selective advantage in the long run depends on the range of accessible phenotypes and the statistics of the environmental fluctuations. Here, we analyze this problem in a simple mathematical model of population growth. First, we review and extend a graphical method to identify the nature of the optimal strategy when the environmental fluctuations are uncorrelated. Temporal correlations in environmental fluctuations open up new strategies that rely on memory but are mathematically challenging to study: We present analytical results to address this challenge. We illustrate our general approach by analyzing optimal adaptive strategies in the presence of trade-offs that constrain the range of accessible phenotypes. Our results extend several previous studies and have applications to a variety of biological phenomena, from antibiotic resistance in bacteria to immune responses in vertebrates.

  16. Eye Movements Reveal Optimal Strategies for Analogical Reasoning.

    Science.gov (United States)

    Vendetti, Michael S; Starr, Ariel; Johnson, Elizabeth L; Modavi, Kiana; Bunge, Silvia A

    2017-01-01

    Analogical reasoning refers to the process of drawing inferences on the basis of the relational similarity between two domains. Although this complex cognitive ability has been the focus of inquiry for many years, most models rely on measures that cannot capture individuals' thought processes moment by moment. In the present study, we used participants' eye movements to investigate reasoning strategies in real time while solving visual propositional analogy problems (A:B::C:D). We included both a semantic and a perceptual lure on every trial to determine how these types of distracting information influence reasoning strategies. Participants spent more time fixating the analogy terms and the target relative to the other response choices, and made more saccades between the A and B items than between any other items. Participants' eyes were initially drawn to perceptual lures when looking at response choices, but they nonetheless performed the task accurately. We used participants' gaze sequences to classify each trial as representing one of three classic analogy problem solving strategies and related strategy usage to analogical reasoning performance. A project-first strategy, in which participants first extrapolate the relation between the AB pair and then generalize that relation for the C item, was both the most commonly used strategy as well as the optimal strategy for solving visual analogy problems. These findings provide new insight into the role of strategic processing in analogical problem solving.

  17. Eye Movements Reveal Optimal Strategies for Analogical Reasoning

    Directory of Open Access Journals (Sweden)

    Michael S. Vendetti

    2017-06-01

    Full Text Available Analogical reasoning refers to the process of drawing inferences on the basis of the relational similarity between two domains. Although this complex cognitive ability has been the focus of inquiry for many years, most models rely on measures that cannot capture individuals' thought processes moment by moment. In the present study, we used participants' eye movements to investigate reasoning strategies in real time while solving visual propositional analogy problems (A:B::C:D. We included both a semantic and a perceptual lure on every trial to determine how these types of distracting information influence reasoning strategies. Participants spent more time fixating the analogy terms and the target relative to the other response choices, and made more saccades between the A and B items than between any other items. Participants' eyes were initially drawn to perceptual lures when looking at response choices, but they nonetheless performed the task accurately. We used participants' gaze sequences to classify each trial as representing one of three classic analogy problem solving strategies and related strategy usage to analogical reasoning performance. A project-first strategy, in which participants first extrapolate the relation between the AB pair and then generalize that relation for the C item, was both the most commonly used strategy as well as the optimal strategy for solving visual analogy problems. These findings provide new insight into the role of strategic processing in analogical problem solving.

  18. Optimal intervention strategies for cholera outbreak by education and chlorination

    Science.gov (United States)

    Bakhtiar, Toni

    2016-01-01

    This paper discusses the control of infectious diseases in the framework of an optimal control approach. A case study on cholera control was carried out by considering two control strategies, namely education and chlorination. We split the former control into one component regarding person-to-person behaviour and another concerning person-to-environment conduct. The model is divided into two interacting populations: a human population, which follows an SIR model, and a pathogen population. The Pontryagin maximum principle was applied to derive a set of differential equations consisting of the dynamical and adjoint systems as optimality conditions. Then, the fourth-order Runge-Kutta method was used to numerically solve the system of equations. An illustrative example was provided to assess the effectiveness of the control strategies over a set of control scenarios.
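
    The forward (state) part of such an optimality system can be illustrated with the sketch below: an SIR human population coupled to a pathogen concentration, integrated with fourth-order Runge-Kutta under constant education and chlorination controls. The model form and every parameter value are illustrative guesses, and the adjoint equations and the forward-backward sweep needed to compute the actual optimal controls are omitted.

```python
import numpy as np

# SIR human population coupled to a pathogen concentration B (saturating incidence).
# All parameter values and the constant controls are illustrative, not calibrated.
beta, kappa, gamma, xi, delta = 0.4, 1.0e4, 0.2, 10.0, 0.33
u_edu, u_chl = 0.3, 0.2            # constant education / chlorination controls

def rhs(t, y):
    S, I, R, B = y
    force = beta * B / (kappa + B) * (1.0 - u_edu)   # education lowers effective contact
    return np.array([
        -force * S,                                   # susceptibles
        force * S - gamma * I,                        # infected
        gamma * I,                                    # recovered
        xi * I - (delta + u_chl) * B,                 # pathogen; chlorination adds decay
    ])

def rk4(f, y0, t):
    y = np.empty((len(t), len(y0)))
    y[0] = y0
    for i in range(len(t) - 1):
        h = t[i + 1] - t[i]
        k1 = f(t[i], y[i])
        k2 = f(t[i] + h / 2, y[i] + h / 2 * k1)
        k3 = f(t[i] + h / 2, y[i] + h / 2 * k2)
        k4 = f(t[i] + h, y[i] + h * k3)
        y[i + 1] = y[i] + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
    return y

t = np.linspace(0.0, 120.0, 1201)                    # days
traj = rk4(rhs, np.array([10000.0, 10.0, 0.0, 500.0]), t)
print("peak number of infected:", round(float(traj[:, 1].max()), 1))
```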

  19. TH-B-207B-00: Pediatric Image Quality Optimization

    International Nuclear Information System (INIS)

    2016-01-01

    This imaging educational program will focus on solutions to common pediatric image quality optimization challenges. The speakers will present collective knowledge on best practices in pediatric imaging from their experience at dedicated children’s hospitals. One of the most commonly encountered pediatric imaging requirements for the non-specialist hospital is pediatric CT in the emergency room setting. Thus, this educational program will begin with optimization of pediatric CT in the emergency department. Though pediatric cardiovascular MRI may be less common in the non-specialist hospitals, low pediatric volumes and unique cardiovascular anatomy make optimization of these techniques difficult. Therefore, our second speaker will review best practices in pediatric cardiovascular MRI based on experiences from a children’s hospital with a large volume of cardiac patients. Learning Objectives: To learn techniques for optimizing radiation dose and image quality for CT of children in the emergency room setting. To learn solutions for consistently high quality cardiovascular MRI of children

  20. TH-B-207B-00: Pediatric Image Quality Optimization

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2016-06-15

    This imaging educational program will focus on solutions to common pediatric image quality optimization challenges. The speakers will present collective knowledge on best practices in pediatric imaging from their experience at dedicated children’s hospitals. One of the most commonly encountered pediatric imaging requirements for the non-specialist hospital is pediatric CT in the emergency room setting. Thus, this educational program will begin with optimization of pediatric CT in the emergency department. Though pediatric cardiovascular MRI may be less common in the non-specialist hospitals, low pediatric volumes and unique cardiovascular anatomy make optimization of these techniques difficult. Therefore, our second speaker will review best practices in pediatric cardiovascular MRI based on experiences from a children’s hospital with a large volume of cardiac patients. Learning Objectives: To learn techniques for optimizing radiation dose and image quality for CT of children in the emergency room setting. To learn solutions for consistently high quality cardiovascular MRI of children.

  1. Turbine Control Strategies for Wind Farm Power Optimization

    DEFF Research Database (Denmark)

    Mirzaei, Mahmood; Göçmen Bozkurt, Tuhfe; Giebel, Gregor

    2015-01-01

    In recent decades there has been increasing interest in green energies, of which wind energy is the most important one. In order to improve the competitiveness of wind power plants, there is ongoing research to decrease the cost per energy unit and increase the efficiency of wind turbines ... and wind farms. One way of achieving these goals is to optimize the power generated by a wind farm. One optimization method is to choose appropriate operating points for the individual wind turbines in the farm. We have made three models of a wind farm based on three different control strategies ... the generated power by changing the power reference of the individual wind turbines. We use the optimization setup to compare the power production of the wind farm models. This paper shows that for the most frequent wind velocities (below and around the rated values), the generated powers of the wind farms...

  2. Integrated testing strategies can be optimal for chemical risk classification.

    Science.gov (United States)

    Raseta, Marko; Pitchford, Jon; Cussens, James; Doe, John

    2017-08-01

    There is an urgent need to refine strategies for testing the safety of chemical compounds. This need arises both from the financial and ethical costs of animal tests and from the opportunities presented by new in-vitro and in-silico alternatives. Here we explore the mathematical theory underpinning the formulation of optimal testing strategies in toxicology. We show how the costs and imprecisions of the various tests, and the variability in exposures and responses of individuals, can be assembled rationally to form a Markov Decision Problem. We compute the corresponding optimal policies using well-developed theory based on Dynamic Programming, thereby identifying and overcoming some methodological and logical inconsistencies which may exist in current toxicological testing. By illustrating our methods for two simple but readily generalisable examples we show how so-called integrated testing strategies, where information of different precisions from different sources is combined and where different initial test outcomes lead to different sets of future tests, can arise naturally as optimal policies.
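
    A minimal sketch of the dynamic-programming idea behind such testing policies: a toy Markov Decision Problem is solved by value iteration so that, in each evidence state, the cheapest expected course of action (run a cheap test, run a precise test, or classify immediately) is selected. The states, costs, and transition probabilities below are illustrative assumptions, not values from the paper.

      import numpy as np

      # Toy MDP: states are evidence levels about a chemical's hazard; actions are
      # "cheap_test", "precise_test", or "classify" (terminal). Costs and transition
      # probabilities are illustrative placeholders, not values from the paper.
      states = ["no evidence", "weak evidence", "strong evidence"]
      cost = {"cheap_test": 1.0, "precise_test": 5.0}
      classify_cost = np.array([20.0, 8.0, 0.5])   # expected misclassification cost per state

      # transition[a][s, s'] = probability of moving from state s to s' after action a
      transition = {
          "cheap_test":   np.array([[0.6, 0.3, 0.1], [0.1, 0.6, 0.3], [0.0, 0.2, 0.8]]),
          "precise_test": np.array([[0.3, 0.4, 0.3], [0.0, 0.3, 0.7], [0.0, 0.0, 1.0]]),
      }

      def q_values(V):
          q = {"classify": classify_cost}
          for a, P in transition.items():
              q[a] = cost[a] + P @ V               # expected cost-to-go if we test again
          return q

      V = np.zeros(len(states))
      for _ in range(200):                         # value iteration to a fixed point
          V = np.min(np.stack(list(q_values(V).values())), axis=0)

      Q = q_values(V)
      policy = {s: min(Q, key=lambda a: Q[a][i]) for i, s in enumerate(states)}
      print(policy, V.round(2))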

  3. Optimal sampling strategies for detecting zoonotic disease epidemics.

    Directory of Open Access Journals (Sweden)

    Jake M Ferguson

    2014-06-01

    Full Text Available The early detection of disease epidemics reduces the chance of successful introductions into new locales, minimizes the number of infections, and reduces the financial impact. We develop a framework to determine the optimal sampling strategy for disease detection in zoonotic host-vector epidemiological systems when a disease goes from below detectable levels to an epidemic. We find that if the time of disease introduction is known then the optimal sampling strategy can switch abruptly between sampling only from the vector population to sampling only from the host population. We also construct time-independent optimal sampling strategies when conducting periodic sampling that can involve sampling both the host and the vector populations simultaneously. Both time-dependent and -independent solutions can be useful for sampling design, depending on whether the time of introduction of the disease is known or not. We illustrate the approach with West Nile virus, a globally-spreading zoonotic arbovirus. Though our analytical results are based on a linearization of the dynamical systems, the sampling rules appear robust over a wide range of parameter space when compared to nonlinear simulation models. Our results suggest some simple rules that can be used by practitioners when developing surveillance programs. These rules require knowledge of transition rates between epidemiological compartments, which population was initially infected, and of the cost per sample for serological tests.

  4. Optimal sampling strategies for detecting zoonotic disease epidemics.

    Science.gov (United States)

    Ferguson, Jake M; Langebrake, Jessica B; Cannataro, Vincent L; Garcia, Andres J; Hamman, Elizabeth A; Martcheva, Maia; Osenberg, Craig W

    2014-06-01

    The early detection of disease epidemics reduces the chance of successful introductions into new locales, minimizes the number of infections, and reduces the financial impact. We develop a framework to determine the optimal sampling strategy for disease detection in zoonotic host-vector epidemiological systems when a disease goes from below detectable levels to an epidemic. We find that if the time of disease introduction is known then the optimal sampling strategy can switch abruptly between sampling only from the vector population to sampling only from the host population. We also construct time-independent optimal sampling strategies when conducting periodic sampling that can involve sampling both the host and the vector populations simultaneously. Both time-dependent and -independent solutions can be useful for sampling design, depending on whether the time of introduction of the disease is known or not. We illustrate the approach with West Nile virus, a globally-spreading zoonotic arbovirus. Though our analytical results are based on a linearization of the dynamical systems, the sampling rules appear robust over a wide range of parameter space when compared to nonlinear simulation models. Our results suggest some simple rules that can be used by practitioners when developing surveillance programs. These rules require knowledge of transition rates between epidemiological compartments, which population was initially infected, and of the cost per sample for serological tests.

  5. A New Hybrid Whale Optimizer Algorithm with Mean Strategy of Grey Wolf Optimizer for Global Optimization

    Directory of Open Access Journals (Sweden)

    Narinder Singh

    2018-03-01

    Full Text Available The quest for an efficient nature-inspired optimization technique has continued over the last few decades. In this paper, a hybrid nature-inspired optimization technique is proposed. The hybrid algorithm is constructed from the Mean Grey Wolf Optimizer (MGWO) and the Whale Optimizer Algorithm (WOA). We utilize the spiral equation of the Whale Optimizer Algorithm for two procedures in the Hybrid Approach GWO (HAGWO) algorithm: (i) first, the spiral equation is used in the Grey Wolf Optimizer algorithm to balance exploitation and exploration in the new hybrid approach; and (ii) second, the equation is also applied to the whole population in order to avoid premature convergence and trapping in local minima. The feasibility and effectiveness of the hybrid algorithm are tested by solving several standard benchmarks (XOR, Baloon, Iris, Breast Cancer, Welded Beam Design and Pressure Vessel Design problems) and comparing the results with those obtained through other metaheuristics. The solutions show that the new hybrid variant has stronger stability, a faster convergence rate and better computational accuracy than other nature-inspired metaheuristics on most of the problems, and can successfully solve constrained nonlinear optimization problems in practice.
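
    For reference, the WOA spiral position update referred to above is commonly written as X(t+1) = D'·e^(bl)·cos(2πl) + X*(t), where X* is the best solution found so far, D' = |X* − X|, b is a constant defining the spiral shape, and l is drawn uniformly from [−1, 1]. The sketch below shows this update in isolation; the constant b, the bounds and the objective are illustrative assumptions, not taken from the HAGWO paper.

      import numpy as np

      def spiral_update(x, x_best, b=1.0, rng=np.random.default_rng(0)):
          # WOA log-spiral move towards the current best solution:
          # X(t+1) = |X* - X| * exp(b*l) * cos(2*pi*l) + X*
          l = rng.uniform(-1.0, 1.0)
          d = np.abs(x_best - x)
          return d * np.exp(b * l) * np.cos(2.0 * np.pi * l) + x_best

      # Illustrative use on a sphere objective.
      objective = lambda z: np.sum(z**2)
      rng = np.random.default_rng(1)
      pop = rng.uniform(-5.0, 5.0, (20, 3))
      best = pop[np.argmin([objective(p) for p in pop])]
      pop = np.array([spiral_update(p, best, rng=rng) for p in pop])
      print("best objective after one spiral step:", min(objective(p) for p in pop))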

  6. Random mask optimization for fast neutron coded aperture imaging

    Energy Technology Data Exchange (ETDEWEB)

    McMillan, Kyle [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Univ. of California, Los Angeles, CA (United States); Marleau, Peter [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Brubaker, Erik [Sandia National Lab. (SNL-CA), Livermore, CA (United States)

    2015-05-01

    In coded aperture imaging, one of the most important factors determining the quality of reconstructed images is the choice of mask/aperture pattern. In many applications, uniformly redundant arrays (URAs) are widely accepted as the optimal mask pattern. Under ideal conditions (thin and highly opaque masks), URA patterns are mathematically constructed to provide artifact-free reconstruction; however, the number of URAs for a chosen number of mask elements is limited, and when highly penetrating particles such as fast neutrons and high-energy gamma-rays are being imaged, the optimum is seldom achieved. In this case more robust mask patterns that provide better reconstructed image quality may exist. Through the use of heuristic optimization methods and maximum likelihood expectation maximization (MLEM) image reconstruction, we show that for both point and extended neutron sources a random mask pattern can be optimized to provide better image quality than that of a URA.
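
    The MLEM reconstruction used to score candidate masks follows the standard multiplicative update λ(k+1) = (λ(k) / Aᵀ1) · Aᵀ(y / Aλ(k)), where A is the system (mask response) matrix and y the detector counts. Below is a minimal sketch with a made-up random system matrix, purely to illustrate the update; it is not the authors' forward model.

      import numpy as np

      def mlem(A, y, iters=100):
          # Maximum likelihood expectation maximization for Poisson data:
          # lambda <- (lambda / sensitivity) * A^T (y / (A @ lambda))
          lam = np.ones(A.shape[1])
          sensitivity = A.T @ np.ones(A.shape[0])
          for _ in range(iters):
              projection = A @ lam
              lam *= (A.T @ (y / np.maximum(projection, 1e-12))) / np.maximum(sensitivity, 1e-12)
          return lam

      rng = np.random.default_rng(0)
      A = rng.uniform(0.0, 1.0, (256, 64))              # made-up mask/detector response matrix
      x_true = np.zeros(64); x_true[[10, 40]] = 100.0   # two point-like sources
      y = rng.poisson(A @ x_true)                       # noisy detector counts
      x_hat = mlem(A, y)
      print("brightest reconstructed bins:", np.argsort(x_hat)[-2:])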

  7. VI International Workshop on Nature Inspired Cooperative Strategies for Optimization

    CERN Document Server

    Otero, Fernando; Masegosa, Antonio

    2014-01-01

    Biological and other natural processes have always been a source of inspiration for computer science and information technology. Many emerging problem solving techniques integrate advanced evolution and cooperation strategies, encompassing a range of spatio-temporal scales for visionary conceptualization of evolutionary computation. This book is a collection of research works presented at the VI International Workshop on Nature Inspired Cooperative Strategies for Optimization (NICSO) held in Canterbury, UK. Previous editions of NICSO were held in Granada, Spain (2006 & 2010), Acireale, Italy (2007), Tenerife, Spain (2008), and Cluj-Napoca, Romania (2011). NICSO 2013 and this book provide a forum where state-of-the-art research, the latest ideas and emerging areas of nature inspired cooperative strategies for problem solving are vigorously discussed and exchanged among the scientific community. The breadth and variety of articles in this book report on nature inspired methods and applications such as Swarm In...

  8. Nuclear Power Plant Outage Optimization Strategy. 2016 Edition

    International Nuclear Information System (INIS)

    2016-10-01

    This publication is an update of IAEA-TECDOC-1315, Nuclear Power Plant Outage Optimisation Strategy, which was published in 2002, and aims to communicate good outage management practices in a manner that can be used by operators and utilities in Member States. Nuclear power plant outage management is a key factor for safe and economic nuclear power plant performance. This publication discusses plant outage strategy and how this strategy is actually implemented. The main areas that are important for outage optimization that were identified by the utilities and government organizations participating in this report are: 1) organization and management; 2) outage planning and preparation; 3) outage execution; 4) safety outage review; and 5) counter measures to avoid the extension of outages and to facilitate the work in forced outages. Good outage management practices cover many different areas of work and this publication aims to communicate these good practices in a way that they can be used effectively by operators and utilities

  9. Evaluation of optimization strategies and the effect of initial conditions on IMAT optimization using a leaf position optimization algorithm

    International Nuclear Information System (INIS)

    Oliver, Mike; Jensen, Michael; Chen, Jeff; Wong, Eugene

    2009-01-01

    Intensity-modulated arc therapy (IMAT) is a rotational variant of intensity-modulated radiation therapy (IMRT) that can be implemented with or without angular dose rate variation. The purpose of this study is to assess optimization strategies and initial conditions using a leaf position optimization (LPO) algorithm altered for variable dose rate IMAT. A concave planning target volume (PTV) with a central cylindrical organ at risk (OAR) was used in this study. The initial IMAT arcs were approximated by multiple static beams at 5 deg. angular increments where multi-leaf collimator (MLC) leaf positions were determined from the beam's eye view to irradiate the PTV but avoid the OAR. For the optimization strategy, two arcs with arc ranges of 280 deg. and 150 deg. were employed and plans were created using LPO alone, variable dose rate optimization (VDRO) alone, simultaneous LPO and VDRO and sequential combinations of these strategies. To assess the MLC initialization effect, three single 360 deg. arc plans with different initial MLC configurations were generated using the simultaneous LPO and VDRO. The effect of changing optimization degrees of freedom was investigated by employing 3 deg., 5 deg. and 10 deg. angular sampling intervals for the two 280 deg., two 150 deg. and single arc plans using LPO and VDRO. The objective function value, a conformity index, a dose homogeneity index, mean dose to OAR and normal tissues were computed and used to evaluate the treatment plans. This study shows that the best optimization strategy for a concave target is to use simultaneous MLC LPO and VDRO. We found that the optimization result is sensitive to the choice of initial MLC aperture shapes suggesting that an LPO-based IMAT plan may not be able to overcome local minima for this geometry. In conclusion, simultaneous MLC leaf position and VDRO are needed with the most appropriate initial conditions (MLC positions, arc ranges and number of arcs) for IMAT.

  10. Image Edge Tracking via Ant Colony Optimization

    Science.gov (United States)

    Li, Ruowei; Wu, Hongkun; Liu, Shilong; Rahman, M. A.; Liu, Sanchi; Kwok, Ngai Ming

    2018-04-01

    A good edge plot should use continuous thin lines to describe the complete contour of the captured object. However, the detection of weak edges is a challenging task because of the associated low pixel intensities. Ant Colony Optimization (ACO) has been employed by many researchers to address this problem. The algorithm is a meta-heuristic method developed by mimicking the natural behaviour of ants. It uses iterative searches to find the optimal solution that cannot be found via traditional optimization approaches. In this work, ACO is employed to track and repair broken edges obtained via a conventional Sobel edge detector, producing a result with more connected edges.

  11. An Image Morphing Technique Based on Optimal Mass Preserving Mapping

    Science.gov (United States)

    Zhu, Lei; Yang, Yan; Haker, Steven; Tannenbaum, Allen

    2013-01-01

    Image morphing, or image interpolation in the time domain, deals with the metamorphosis of one image into another. In this paper, a new class of image morphing algorithms is proposed based on the theory of optimal mass transport. The L2 mass moving energy functional is modified by adding an intensity penalizing term, in order to reduce the undesired double exposure effect. It is an intensity-based approach and, thus, is parameter free. The optimal warping function is computed using an iterative gradient descent approach. This proposed morphing method is also extended to doubly connected domains using a harmonic parameterization technique, along with finite-element methods. PMID:17547128

  12. Diagnosis of scaphoid fracture: optimal imaging techniques

    Directory of Open Access Journals (Sweden)

    Geijer M

    2013-07-01

    Full Text Available This review aims to provide an overview of modern imaging techniques for evaluation of scaphoid fracture, with emphasis on occult fractures and an outlook on the possible evolution of imaging; it also gives an overview of the pathologic and anatomic basis for selection of techniques. Displaced scaphoid fractures detected by wrist radiography, with or without special scaphoid views, pose no diagnostic problems. After wrist trauma with clinically suspected scaphoid fracture and normal scaphoid radiography, most patients will have no clinically important fracture. Between 5% and 19% of patients (on average 16% in meta-analyses) will, however, have an occult scaphoid fracture which, untreated, may lead to later, potentially devastating, complications. Follow-up imaging may be done with repeat radiography, tomosynthesis, computed tomography, magnetic resonance imaging (MRI), or bone scintigraphy. However, no method is perfect, and choice of imaging may be based on availability, cost, perceived accuracy, or personal preference. Generally, MRI and bone scintigraphy are regarded as the most sensitive modalities, but both are flawed by false positive results at various rates. Keywords: occult fracture, wrist, radiography, computed tomography, magnetic resonance imaging, radionuclide imaging

  13. ProxImaL: efficient image optimization using proximal algorithms

    KAUST Repository

    Heide, Felix; Diamond, Steven; Nieß ner, Matthias; Ragan-Kelley, Jonathan; Heidrich, Wolfgang; Wetzstein, Gordon

    2016-01-01

    domain-specific language and compiler for image optimization problems that makes it easy to experiment with different problem formulations and algorithm choices. The language uses proximal operators as the fundamental building blocks of a variety

  14. Optimal Dynamic Strategies for Index Tracking and Algorithmic Trading

    Science.gov (United States)

    Ward, Brian

    In this thesis we study dynamic strategies for index tracking and algorithmic trading. Tracking problems have become ever more important in Financial Engineering as investors seek to precisely control their portfolio risks and exposures over different time horizons. This thesis analyzes various tracking problems and elucidates the tracking errors and strategies one can employ to minimize those errors and maximize profit. In Chapters 2 and 3, we study the empirical tracking properties of exchange traded funds (ETFs), leveraged ETFs (LETFs), and futures products related to spot gold and the Chicago Board Options Exchange (CBOE) Volatility Index (VIX), respectively. These two markets provide interesting and differing examples for understanding index tracking. We find that static strategies work well in the nonleveraged case for gold, but fail to track well in the corresponding leveraged case. For VIX, tracking succeeds via neither ETFs nor futures portfolios, even in the nonleveraged case. This motivates the need for dynamic strategies, some of which we construct in these two chapters and further expand on in Chapter 4. There, we analyze a framework for index tracking and risk exposure control through financial derivatives. We derive a tracking condition that restricts our exposure choices and also define a slippage process that characterizes the deviations from the index over longer horizons. The framework is applied to a number of models, for example, the Black-Scholes and Heston models for equity index tracking, as well as the Square Root (SQR) model and the Concatenated Square Root (CSQR) model for VIX tracking. By specifying how each of these models falls into our framework, we are able to understand the tracking errors in each of these models. Finally, Chapter 5 analyzes a tracking problem of a different kind that arises in algorithmic trading: schedule following for optimal execution. We formulate and solve a stochastic control problem to obtain the optimal

  15. Optimization of contrast of MR images in imaging of knee joint

    International Nuclear Information System (INIS)

    Szyblinski, K.; Bacic, G.

    1994-01-01

    The work describes a method of contrast optimization in magnetic resonance imaging. A computer program presented in the report allows analysis of the contrast in selected tissues as a function of the experiment parameters. An application to imaging of the knee joint is presented.
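
    As an illustration of the kind of analysis such a program performs, the sketch below evaluates the standard spin-echo signal model S = PD·(1 − e^(−TR/T1))·e^(−TE/T2) for two tissues and searches a grid of TR/TE values for the setting that maximizes their contrast. The tissue parameters and the grid are illustrative assumptions, not values from the report.

      import numpy as np

      def spin_echo_signal(pd, t1, t2, tr, te):
          # Standard spin-echo signal model: S = PD * (1 - exp(-TR/T1)) * exp(-TE/T2)
          return pd * (1.0 - np.exp(-tr / t1)) * np.exp(-te / t2)

      # Illustrative tissue parameters (T1/T2 in ms); not taken from the report.
      cartilage = dict(pd=1.0, t1=1000.0, t2=40.0)
      fluid = dict(pd=1.0, t1=3000.0, t2=200.0)

      trs = np.linspace(300, 4000, 50)
      tes = np.linspace(10, 120, 50)
      best = max((abs(spin_echo_signal(**cartilage, tr=tr, te=te) -
                      spin_echo_signal(**fluid, tr=tr, te=te)), tr, te)
                 for tr in trs for te in tes)
      print("max contrast %.3f at TR=%.0f ms, TE=%.0f ms" % best)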

  16. Particle Swarm Optimization With Interswarm Interactive Learning Strategy.

    Science.gov (United States)

    Qin, Quande; Cheng, Shi; Zhang, Qingyu; Li, Li; Shi, Yuhui

    2016-10-01

    The learning strategy in the canonical particle swarm optimization (PSO) algorithm is often blamed for being the primary reason for loss of diversity. Population diversity maintenance is crucial for preventing particles from getting stuck in local optima. In this paper, we present an improved PSO algorithm with an interswarm interactive learning strategy (IILPSO) by overcoming the drawbacks of the canonical PSO algorithm's learning strategy. IILPSO is inspired by the phenomenon in human society that interactive learning behavior takes place among different groups. Particles in IILPSO are divided into two swarms. The interswarm interactive learning (IIL) behavior is triggered when the best particle's fitness value of both swarms does not improve for a certain number of iterations. According to the best particle's fitness value of each swarm, the softmax method and roulette method are used to determine the roles of the two swarms as the learning swarm and the learned swarm. In addition, the velocity mutation operator and global best vibration strategy are used to improve the algorithm's global search capability. The IIL strategy is applied to PSO with global star and local ring structures, which are termed the IILPSO-G and IILPSO-L algorithms, respectively. Numerical experiments are conducted to compare the proposed algorithms with eight popular PSO variants. From the experimental results, IILPSO demonstrates good performance in terms of solution accuracy, convergence speed, and reliability. Finally, the variations of the population diversity in the entire search process provide an explanation of why IILPSO performs effectively.
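
    For context, the canonical global-best PSO update that the interswarm learning strategy modifies is sketched below: each particle's velocity combines an inertia term with attraction towards its personal best and the swarm's global best. The inertia weight, acceleration coefficients, bounds and objective are common illustrative defaults, not the paper's settings.

      import numpy as np

      def pso(objective, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
          lo, hi = bounds
          rng = np.random.default_rng(0)
          x = rng.uniform(lo, hi, (n_particles, dim))      # positions
          v = np.zeros_like(x)                             # velocities
          pbest, pbest_f = x.copy(), np.apply_along_axis(objective, 1, x)
          g = pbest[pbest_f.argmin()].copy()               # global best
          for _ in range(iters):
              r1, r2 = rng.random(x.shape), rng.random(x.shape)
              # Canonical velocity update: inertia + cognitive + social terms.
              v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
              x = np.clip(x + v, lo, hi)
              f = np.apply_along_axis(objective, 1, x)
              improved = f < pbest_f
              pbest[improved], pbest_f[improved] = x[improved], f[improved]
              g = pbest[pbest_f.argmin()].copy()
          return g, pbest_f.min()

      print(pso(lambda z: np.sum(z**2), dim=5))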

  17. Cost-effectiveness analysis of optimal strategy for tumor treatment

    International Nuclear Information System (INIS)

    Pang, Liuyong; Zhao, Zhong; Song, Xinyu

    2016-01-01

    We propose and analyze an antitumor model with combined immunotherapy and chemotherapy. Firstly, we explore the treatment effects of single immunotherapy and single chemotherapy, respectively. Results indicate that neither immunotherapy nor chemotherapy alone is adequate to cure a tumor. Hence, we apply optimal control theory to investigate how the combination of immunotherapy and chemotherapy should be implemented, for a certain time period, in order to reduce the number of tumor cells, while minimizing the implementation cost of the treatment strategy. Secondly, we establish the existence of the optimality system and use Pontryagin’s Maximum Principle to characterize the optimal levels of the two treatment measures. Furthermore, we calculate the incremental cost-effectiveness ratios to analyze the cost-effectiveness of all possible combinations of the two treatment measures. Finally, numerical results show that the combination of immunotherapy and chemotherapy is the most cost-effective strategy for tumor treatment, and is able to eliminate the entire tumor with size 4.470 × 10^8 in a year.
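
    The incremental cost-effectiveness ratio mentioned above compares each strategy against the next-less-effective alternative, ICER = (C1 − C0)/(E1 − E0). A minimal sketch of that ranking logic follows; the costs and effects are made-up placeholders, not the paper's values.

      # Minimal ICER sketch: order strategies by effectiveness, then compute
      # incremental cost-effectiveness ratios against the next-less-effective one.
      # Costs and effects below are illustrative placeholders, not the paper's values.
      strategies = [
          ("no treatment",         0.0,  0.0),
          ("chemotherapy only",   40.0, 55.0),
          ("immunotherapy only",  65.0, 70.0),
          ("combined therapy",    90.0, 95.0),
      ]  # (name, cost, effect), arbitrary units

      strategies.sort(key=lambda s: s[2])        # increasing effectiveness
      prev_cost, prev_eff = strategies[0][1], strategies[0][2]
      for name, cost, eff in strategies[1:]:
          icer = (cost - prev_cost) / (eff - prev_eff)
          print(f"{name}: ICER = {icer:.2f} cost units per unit of effect")
          prev_cost, prev_eff = cost, eff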

  18. Energy Optimal Control Strategy of PHEV Based on PMP Algorithm

    Directory of Open Access Journals (Sweden)

    Tiezhou Wu

    2017-01-01

    Full Text Available Under the global drive for “energy saving” and the current boom in the development of energy storage technology at home and abroad, energy optimal control of the whole hybrid electric vehicle power system, as one of the core technologies of electric vehicles, is bound to become a key target of “clean energy” vehicle development and research. This paper considers the constraints on the performance of the energy storage system in a Parallel Hybrid Electric Vehicle (PHEV), in which the lithium-ion battery frequently charges/discharges, the PHEV consumes a large amount of fuel, and there are difficulties in energy recovery, among other issues, within a single cycle; the research uses a lithium-ion battery combined with a super-capacitor (SC), that is, a hybrid energy storage system (Li-SC HESS), working together with an internal combustion engine (ICE) to drive the PHEV. Combined with a PSO-PI controller and a Li-SC HESS internal power-limited management approach, the research proposes a PHEV energy optimal control strategy. It is based on a revised Pontryagin’s minimum principle (PMP) algorithm; the PHEV simulation model is established in the ADVISOR software and the effectiveness and feasibility are verified. Finally, the results show that the energy optimization control strategy can improve the instantaneity of tracking the PHEV minimum fuel consumption trajectory, implement energy saving, and prolong the life of the lithium-ion batteries, thereby improving hybrid energy storage system performance.

  19. Optimal knockout strategies in genome-scale metabolic networks using particle swarm optimization.

    Science.gov (United States)

    Nair, Govind; Jungreuthmayer, Christian; Zanghellini, Jürgen

    2017-02-01

    Knockout strategies, particularly the concept of constrained minimal cut sets (cMCSs), are an important part of the arsenal of tools used in manipulating metabolic networks. Given a specific design, cMCSs can be calculated even in genome-scale networks. We would however like to find not only the optimal intervention strategy for a given design but the best possible design too. Our solution (PSOMCS) is to use particle swarm optimization (PSO) along with the direct calculation of cMCSs from the stoichiometric matrix to obtain optimal designs satisfying multiple objectives. To illustrate the working of PSOMCS, we apply it to a toy network. Next we show its superiority by comparing its performance against other comparable methods on a medium sized E. coli core metabolic network. PSOMCS not only finds solutions comparable to previously published results but also it is orders of magnitude faster. Finally, we use PSOMCS to predict knockouts satisfying multiple objectives in a genome-scale metabolic model of E. coli and compare it with OptKnock and RobustKnock. PSOMCS finds competitive knockout strategies and designs compared to other current methods and is in some cases significantly faster. It can be used in identifying knockouts which will force optimal desired behaviors in large and genome scale metabolic networks. It will be even more useful as larger metabolic models of industrially relevant organisms become available.

  20. Optimal Bidding Strategy for Renewable Microgrid with Active Network Management

    Directory of Open Access Journals (Sweden)

    Seung Wan Kim

    2016-01-01

    Full Text Available Active Network Management (ANM) enables a microgrid to optimally dispatch the active/reactive power of its Renewable Distributed Generation (RDG) and Battery Energy Storage System (BESS) units in real time. Thus, a microgrid with high penetration of RDGs can handle their uncertainties and variabilities to achieve stable operation using ANM. However, the actual power flow in the line connecting the main grid and the microgrid may deviate significantly from the day-ahead bids if the bids are determined without consideration of the real-time adjustment through ANM, which will lead to a substantial imbalance cost. Therefore, this study proposes a formulation for obtaining an optimal bid that reflects the change of power flow in the connecting line caused by real-time adjustment using ANM. The proposed formulation maximizes the expected profit of the microgrid considering various network and physical constraints. The effectiveness of the proposed bidding strategy is verified through simulations with a 33-bus test microgrid. The simulation results show that the proposed bidding strategy improves the expected operating profit by reducing the imbalance cost to a greater degree compared to the basic bidding strategy without consideration of ANM.

  1. Optimization of Butterworth filter for brain SPECT imaging

    International Nuclear Information System (INIS)

    Minoshima, Satoshi; Maruno, Hirotaka; Yui, Nobuharu

    1993-01-01

    A method has been described to optimize the cutoff frequency of the Butterworth filter for brain SPECT imaging. Since a computer simulation study has demonstrated that the separation between an object signal and the random noise in projection images in the spatial-frequency domain is influenced by the total number of counts, the cutoff frequency of the Butterworth filter should be optimized for individual subjects according to the total counts in a study. To reveal the relationship between the optimal cutoff frequencies and total counts in brain SPECT studies, we used a normal volunteer and 99mTc hexamethyl-propyleneamine oxime (HMPAO) to obtain projection sets with different total counts. High quality images were created from a projection set with an acquisition time of 300 seconds per projection. The filter was optimized by calculating mean square errors from the high quality images and by visually inspecting the filtered reconstructed images. The dependence between total counts and optimal cutoff frequencies was clearly demonstrated in a nomogram. Using this nomogram, the optimal cutoff frequency for each study can be estimated from the total counts, maximizing visual image quality. The results suggest that the cutoff frequency of the Butterworth filter should be determined by referring to the total counts in each study. (author)
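
    For reference, one common form of the Butterworth low-pass filter used in SPECT post-filtering is H(f) = 1 / (1 + (f/fc)^(2n)), with cutoff frequency fc and order n. The sketch below applies such a filter to a noisy 2D image in the frequency domain; the cutoff (in cycles per pixel), the order and the test image are illustrative assumptions, not the study's values.

      import numpy as np

      def butterworth_lowpass(shape, cutoff, order):
          # One common form of the Butterworth low-pass filter used in SPECT
          # post-filtering: H(f) = 1 / (1 + (f / fc)^(2 * order)).
          fy = np.fft.fftfreq(shape[0])[:, None]
          fx = np.fft.fftfreq(shape[1])[None, :]
          f = np.sqrt(fx**2 + fy**2)
          return 1.0 / (1.0 + (f / cutoff) ** (2 * order))

      def apply_filter(image, cutoff, order=4):
          H = butterworth_lowpass(image.shape, cutoff, order)
          return np.real(np.fft.ifft2(np.fft.fft2(image) * H))

      # Illustrative use: a lower cutoff suits noisier (lower-count) studies.
      noisy = np.random.default_rng(0).poisson(5.0, (128, 128)).astype(float)
      smoothed = apply_filter(noisy, cutoff=0.25, order=4)
      print(noisy.std(), smoothed.std())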

  2. Cost Effectiveness Analysis of Optimal Malaria Control Strategies in Kenya

    Directory of Open Access Journals (Sweden)

    Gabriel Otieno

    2016-03-01

    Full Text Available Malaria remains a leading cause of mortality and morbidity among children under five and pregnant women in sub-Saharan Africa, but it is preventable and controllable provided current recommended interventions are properly implemented. Better utilization of malaria intervention strategies will ensure value for money and produce health improvements in the most cost-effective way. The purpose of the value-for-money drive is to develop a better understanding (and better articulation) of costs and results so that more informed, evidence-based choices can be made. Cost-effectiveness analysis is carried out to inform decision makers on where to allocate resources for malaria interventions. This study carries out a cost-effectiveness analysis of one or all possible combinations of the optimal malaria control strategies (Insecticide Treated Bednets (ITNs), Treatment, Indoor Residual Spray (IRS) and Intermittent Preventive Treatment for Pregnant Women (IPTp)) for four different transmission settings, in order to assess the extent to which the intervention strategies are beneficial and cost-effective. For the four different transmission settings in Kenya, the optimal solutions for the 15 strategies and their associated effectiveness are computed. Cost-effectiveness analysis using the Incremental Cost Effectiveness Ratio (ICER) was done after ranking the strategies in order of increasing effectiveness (total infections averted). The findings show that for the endemic regions the combination of ITNs, IRS, and IPTp was the most cost-effective of all the combined strategies developed in this study for malaria disease control and prevention; for the epidemic-prone areas it is the combination of treatment and IRS; for seasonal areas it is the use of ITNs plus treatment; and for the low-risk areas it is the use of treatment only. Malaria transmission in Kenya can be minimized through tailor-made intervention strategies for malaria control

  3. Optimization of shearography image quality analysis

    International Nuclear Information System (INIS)

    Rafhayudi Jamro

    2005-01-01

    Shearography is an optical technique based on speckle patterns used to measure the deformation of an object surface, in which the fringe pattern is obtained through correlation analysis of the speckle pattern. Analysis of the fringe pattern for engineering applications is limited to qualitative measurement. Therefore, for further analysis leading to quantitative data, a series of image processing steps is involved. In this paper, the fringe pattern for qualitative analysis is discussed. Its principal field of application is qualitative non-destructive testing, such as detecting discontinuities and defects in the material structure, locating fatigue zones, etc., and all of these require image processing. In order to perform image optimisation successfully, the noise in the fringe pattern must be minimised and the fringe pattern itself must be maximised. This can be achieved by applying a filtering method with a kernel size ranging from 2 × 2 to 7 × 7 pixels and by also applying an equalizer in the image processing. (Author)

  4. Testing of Strategies for the Acceleration of the Cost Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Ponciroli, Roberto [Argonne National Lab. (ANL), Argonne, IL (United States); Vilim, Richard B. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-08-31

    The general problem addressed in the Nuclear-Renewable Hybrid Energy System (N-R HES) project is finding the optimum economical dispatch (ED) and capacity planning solutions for the hybrid energy systems. In the present test-problem configuration, the N-R HES unit is composed of three electrical power-generating components, i.e. the Balance of Plant (BOP), the Secondary Energy Source (SES), and the Energy Storage (ES). In addition, there is an Industrial Process (IP), which is devoted to hydrogen generation. At this preliminary stage, the goal is to find the power outputs of each one of the N-R HES unit components (BOP, SES, ES) and the IP hydrogen production level that maximizes the unit profit by simultaneously satisfying individual component operational constraints. The optimization problem is meant to be solved in the Risk Analysis Virtual Environment (RAVEN) framework. The dynamic response of the N-R HES unit components is simulated by using dedicated object-oriented models written in the Modelica modeling language. Though this code coupling provides for very accurate predictions, the ensuing optimization problem is characterized by a very large number of solution variables. To ease the computational burden and to improve the path to a converged solution, a method to better estimate the initial guess for the optimization problem solution was developed. The proposed approach led to the definition of a suitable Monte Carlo-based optimization algorithm (called the preconditioner), which provides an initial guess for the optimal N-R HES power dispatch and the optimal installed capacity for each one of the unit components. The preconditioner samples a set of stochastic power scenarios for each one of the N-R HES unit components, and then for each of them the corresponding value of a suitably defined cost function is evaluated. After having simulated a sufficient number of power histories, the configuration which ensures the highest profit is selected as the optimal
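
    A minimal sketch of the Monte Carlo preconditioning idea described above: sample candidate power scenarios for each unit component, score each with a profit function, and keep the best one as the initial guess for the full optimization. The component names follow the abstract, but the hourly price signal, bounds and profit model are illustrative assumptions, not the report's Modelica/RAVEN models.

      import numpy as np

      # Monte Carlo "preconditioner": sample candidate dispatch scenarios, score each
      # with an illustrative profit function, and keep the best as the initial guess.
      rng = np.random.default_rng(0)
      HOURS = 24
      price = 30.0 + 10.0 * np.sin(np.linspace(0.0, 2.0 * np.pi, HOURS))  # $/MWh, illustrative

      def profit(bop, ses, storage, h2):
          electricity = (bop + ses + storage) @ price      # revenue from grid sales
          hydrogen = 2.0 * h2.sum()                        # revenue from H2, illustrative price
          fuel_cost = 12.0 * bop.sum() + 25.0 * ses.sum()  # illustrative marginal costs
          return electricity + hydrogen - fuel_cost

      best = None
      for _ in range(5000):
          bop = rng.uniform(200.0, 600.0, HOURS)     # MW
          ses = rng.uniform(0.0, 100.0, HOURS)       # MW
          storage = rng.uniform(-50.0, 50.0, HOURS)  # MW (negative = charging)
          h2 = rng.uniform(0.0, 40.0, HOURS)         # MW diverted to hydrogen production
          p = profit(bop, ses, storage, h2)
          if best is None or p > best[0]:
              best = (p, bop, ses, storage, h2)

      print("best sampled profit:", round(best[0], 1))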

  5. ProxImaL: efficient image optimization using proximal algorithms

    KAUST Repository

    Heide, Felix

    2016-07-11

    Computational photography systems are becoming increasingly diverse, while computational resources-for example on mobile platforms-are rapidly increasing. As diverse as these camera systems may be, slightly different variants of the underlying image processing tasks, such as demosaicking, deconvolution, denoising, inpainting, image fusion, and alignment, are shared between all of these systems. Formal optimization methods have recently been demonstrated to achieve state-of-the-art quality for many of these applications. Unfortunately, different combinations of natural image priors and optimization algorithms may be optimal for different problems, and implementing and testing each combination is currently a time-consuming and error-prone process. ProxImaL is a domain-specific language and compiler for image optimization problems that makes it easy to experiment with different problem formulations and algorithm choices. The language uses proximal operators as the fundamental building blocks of a variety of linear and nonlinear image formation models and cost functions, advanced image priors, and noise models. The compiler intelligently chooses the best way to translate a problem formulation and choice of optimization algorithm into an efficient solver implementation. In applications to the image processing pipeline, deconvolution in the presence of Poisson-distributed shot noise, and burst denoising, we show that a few lines of ProxImaL code can generate highly efficient solvers that achieve state-of-the-art results. We also show applications to the nonlinear and nonconvex problem of phase retrieval.
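
    Proximal operators of the kind ProxImaL composes can be illustrated with the classic soft-thresholding operator and the proximal gradient (ISTA) iteration for an l1-regularized least-squares problem. The sketch below is generic and does not use the ProxImaL language or API; the matrix, sparsity level and regularization weight are illustrative assumptions.

      import numpy as np

      def prox_l1(v, t):
          # Proximal operator of t * ||x||_1 (soft-thresholding).
          return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

      def ista(A, b, lam, step, iters=300):
          # Proximal gradient (ISTA) for min_x 0.5*||Ax - b||^2 + lam*||x||_1.
          x = np.zeros(A.shape[1])
          for _ in range(iters):
              grad = A.T @ (A @ x - b)
              x = prox_l1(x - step * grad, step * lam)
          return x

      rng = np.random.default_rng(0)
      A = rng.standard_normal((80, 200))
      x_true = np.zeros(200); x_true[rng.choice(200, 10, replace=False)] = 1.0
      b = A @ x_true + 0.01 * rng.standard_normal(80)
      step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1/L, with L the squared spectral norm
      x_hat = ista(A, b, lam=0.1, step=step)
      print("recovered support size:", int((np.abs(x_hat) > 0.1).sum()))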

  6. The Tonya Harding Controversy: An Analysis of Image Restoration Strategies.

    Science.gov (United States)

    Benoit, William L.; Hanczor, Robert S.

    1994-01-01

    Analyzes Tonya Harding's defense of her image in "Eye to Eye with Connie Chung," applying the theory of image restoration discourse. Finds that the principal strategies employed in her behalf were bolstering, denial, and attacking her accuser, but that these strategies were not developed very effectively in this instance. (SR)

  7. Issues and Strategies in Solving Multidisciplinary Optimization Problems

    Science.gov (United States)

    Patnaik, Surya

    2013-01-01

    Optimization research at NASA Glenn Research Center has addressed the design of structures, aircraft and airbreathing propulsion engines. The accumulated multidisciplinary design activity is collected under a testbed entitled COMETBOARDS. Several issues were encountered during the solution of the problems. Four issues and the strategies adapted for their resolution are discussed. This is followed by a discussion on analytical methods that is limited to structural design application. An optimization process can lead to an inefficient local solution. This deficiency was encountered during design of an engine component. The limitation was overcome through an augmentation of animation into optimization. Optimum solutions obtained were infeasible for aircraft and airbreathing propulsion engine problems. Alleviation of this deficiency required a cascading of multiple algorithms. Profile optimization of a beam produced an irregular shape. Engineering intuition restored the regular shape for the beam. The solution obtained for a cylindrical shell by a subproblem strategy converged to a design that can be difficult to manufacture. Resolution of this issue remains a challenge. The issues and resolutions are illustrated through a set of problems: Design of an engine component, Synthesis of a subsonic aircraft, Operation optimization of a supersonic engine, Design of a wave-rotor-topping device, Profile optimization of a cantilever beam, and Design of a cylindrical shell. This chapter provides a cursory account of the issues. Cited references provide detailed discussion on the topics. Design of a structure can also be generated by traditional method and the stochastic design concept. Merits and limitations of the three methods (traditional method, optimization method and stochastic concept) are illustrated. In the traditional method, the constraints are manipulated to obtain the design and weight is back calculated. In design optimization, the weight of a structure becomes the

  8. Muon tomography imaging improvement using optimized limited angle data

    Science.gov (United States)

    Bai, Chuanyong; Simon, Sean; Kindem, Joel; Luo, Weidong; Sossong, Michael J.; Steiger, Matthew

    2014-05-01

    Image resolution of muon tomography is limited by the range of zenith angles of cosmic ray muons and the flux rate at sea level. Low flux rate limits the use of advanced data rebinning and processing techniques to improve image quality. By optimizing the limited angle data, however, image resolution can be improved. To demonstrate the idea, physical data of tungsten blocks were acquired on a muon tomography system. The angular distribution and energy spectrum of muons measured on the system were also used to generate simulation data of tungsten blocks of different arrangement (geometry). The data were grouped into subsets using the zenith angle and volume images were reconstructed from the data subsets using two algorithms. One was a distributed PoCA (point of closest approach) algorithm and the other was an accelerated iterative maximal likelihood/expectation maximization (MLEM) algorithm. Image resolution was compared for different subsets. Results showed that image resolution was better in the vertical direction for subsets with greater zenith angles and better in the horizontal plane for subsets with smaller zenith angles. The overall image resolution appeared to be a compromise between those of the different subsets. This work suggests that the acquired data can be grouped into different limited angle data subsets for optimized image resolution in desired directions. Use of multiple images with resolution optimized in different directions can improve overall imaging fidelity and the intended applications.
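
    The PoCA reconstruction named above assigns each muon's scattering to the point of closest approach between its incoming and outgoing tracks. A minimal geometric sketch of that computation is given below; the track endpoints and directions are illustrative, and the paper's distributed PoCA and MLEM algorithms are not reproduced here.

      import numpy as np

      def poca(p1, d1, p2, d2):
          # Point of closest approach between two lines: the incoming muon track
          # (point p1, direction d1) and the outgoing track (point p2, direction d2).
          d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
          w0 = p1 - p2
          a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
          d, e = d1 @ w0, d2 @ w0
          denom = a * c - b * b
          if np.isclose(denom, 0.0):            # nearly parallel tracks: no useful PoCA
              return None
          s = (b * e - c * d) / denom
          t = (a * e - b * d) / denom
          closest1, closest2 = p1 + s * d1, p2 + t * d2
          return 0.5 * (closest1 + closest2)    # midpoint of the closest segment

      # Illustrative tracks measured above and below the imaged volume.
      print(poca(np.array([0.0, 0.0, 1.0]), np.array([0.1, 0.0, -1.0]),
                 np.array([0.3, 0.0, -1.0]), np.array([0.2, 0.0, -1.0])))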

  9. Radiation dose optimization research: Exposure technique approaches in CR imaging – A literature review

    International Nuclear Information System (INIS)

    Seeram, Euclid; Davidson, Rob; Bushong, Stewart; Swan, Hans

    2013-01-01

    The purpose of this paper is to review the literature on exposure technique approaches in Computed Radiography (CR) imaging as a means of radiation dose optimization. Specifically, the review assessed three approaches: optimization of kVp; optimization of mAs; and optimization of the Exposure Indicator (EI) in practice. Only papers dating back to 2005 were described in this review. The major themes, patterns, and common findings from the literature reviewed showed that important features related to radiation dose management strategies for digital radiography include identification of the EI as a dose control mechanism and as a “surrogate for dose management”. In addition, the use of the EI has been viewed as an opportunity for dose optimization. Furthermore, optimization research has focussed mainly on optimizing the kVp in CR imaging as a means of implementing the ALARA philosophy, and studies have concentrated mainly on chest imaging using different CR systems such as those commercially available from Fuji, Agfa, Kodak, and Konica-Minolta. These studies have produced “conflicting results”. In addition, a common pattern was the use of automatic exposure control (AEC) and the measurement of constant effective dose, and the use of a dose-area product (DAP) meter.

  10. Edge detection in digital images using Ant Colony Optimization

    Directory of Open Access Journals (Sweden)

    Marjan Kuchaki Rafsanjani

    2015-11-01

    Full Text Available Ant Colony Optimization (ACO) is an optimization algorithm inspired by the behavior of real ant colonies to approximate the solutions of difficult optimization problems. In this paper, ACO is introduced to tackle the image edge detection problem. The proposed approach is based on the distribution of ants on an image; ants try to find possible edges by using a state transition function. Experimental results show that, compared to standard edge detectors, the proposed method is less sensitive to Gaussian noise, and it gives finer details and thinner edges when compared to earlier ant-based approaches.
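
    The state transition function mentioned above typically sends an ant to a neighbouring pixel with probability proportional to τ^α · η^β, where τ is the pheromone level and η a visibility (heuristic) term derived from local intensity variation. Below is a minimal sketch of one such step; the neighbourhood, the gradient-based heuristic and the parameter values are illustrative assumptions, not the paper's exact formulation.

      import numpy as np

      def heuristic(image):
          # Visibility term eta: local intensity variation (a simple gradient magnitude).
          gy, gx = np.gradient(image.astype(float))
          eta = np.hypot(gx, gy)
          return eta / (eta.max() + 1e-12)

      def ant_step(pos, tau, eta, alpha=1.0, beta=2.0, rng=np.random.default_rng(0)):
          # Choose the next pixel among the 8-neighbours with probability
          # proportional to tau^alpha * eta^beta (the ACO state transition rule).
          h, w = tau.shape
          y, x = pos
          nbrs = [(y + dy, x + dx) for dy in (-1, 0, 1) for dx in (-1, 0, 1)
                  if (dy, dx) != (0, 0) and 0 <= y + dy < h and 0 <= x + dx < w]
          weights = np.array([tau[n] ** alpha * eta[n] ** beta for n in nbrs])
          if weights.sum() == 0.0:
              return nbrs[rng.integers(len(nbrs))]
          return nbrs[rng.choice(len(nbrs), p=weights / weights.sum())]

      img = np.zeros((32, 32)); img[:, 16:] = 1.0   # illustrative step edge
      eta = heuristic(img)
      tau = np.full(img.shape, 0.1)                 # uniform initial pheromone
      print(ant_step((10, 14), tau, eta))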

  11. A Novel Optimization-Based Approach for Content-Based Image Retrieval

    Directory of Open Access Journals (Sweden)

    Manyu Xiao

    2013-01-01

    Full Text Available Content-based image retrieval is nowadays one of the possible and promising solutions to manage image databases effectively. However, with the large number of images, there still exists a great discrepancy between the users’ expectations (accuracy and efficiency) and the real performance in image retrieval. In this work, new optimization strategies are proposed for vocabulary tree building, retrieval, and matching methods. More precisely, a new clustering strategy combining classification and the conventional K-Means method is first defined. Then a new matching technique is built to eliminate the error caused by large-scale scale-invariant feature transform (SIFT). Additionally, a new unit mechanism is proposed to reduce the cost of indexing time. Finally, the numerical results show that excellent performance is obtained in both accuracy and efficiency based on the proposed improvements for image retrieval.

  12. Strategies for Optimizing Algal Biology for Enhanced Biomass Production

    Energy Technology Data Exchange (ETDEWEB)

    Barry, Amanda N.; Starkenburg, Shawn R.; Sayre, Richard T., E-mail: rsayre@newmexicoconsortium.org [Los Alamos National Laboratory, New Mexico Consortium, Los Alamos, NM (United States)

    2015-02-02

    One of the most environmentally sustainable ways to produce high-energy density (oils) feed stocks for the production of liquid transportation fuels is from biomass. Photosynthetic carbon capture combined with biomass combustion (point source) and subsequent carbon capture and sequestration has also been proposed in the intergovernmental panel on climate change report as one of the most effective and economical strategies to remediate atmospheric greenhouse gases. To maximize photosynthetic carbon capture efficiency and energy-return-on-investment, we must develop biomass production systems that achieve the greatest yields with the lowest inputs. Numerous studies have demonstrated that microalgae have among the greatest potentials for biomass production. This is in part due to the fact that all alga cells are photoautotrophic, they have active carbon concentrating mechanisms to increase photosynthetic productivity, and all the biomass is harvestable unlike plants. All photosynthetic organisms, however, convert only a fraction of the solar energy they capture into chemical energy (reduced carbon or biomass). To increase aerial carbon capture rates and biomass productivity, it will be necessary to identify the most robust algal strains and increase their biomass production efficiency often by genetic manipulation. We review recent large-scale efforts to identify the best biomass producing strains and metabolic engineering strategies to improve aerial productivity. These strategies include optimization of photosynthetic light-harvesting antenna size to increase energy capture and conversion efficiency and the potential development of advanced molecular breeding techniques. To date, these strategies have resulted in up to twofold increases in biomass productivity.

  13. Strategies for Optimizing Algal Biology for Enhanced Biomass Production

    International Nuclear Information System (INIS)

    Barry, Amanda N.; Starkenburg, Shawn R.; Sayre, Richard T.

    2015-01-01

    One of the most environmentally sustainable ways to produce high-energy density (oils) feed stocks for the production of liquid transportation fuels is from biomass. Photosynthetic carbon capture combined with biomass combustion (point source) and subsequent carbon capture and sequestration has also been proposed in the intergovernmental panel on climate change report as one of the most effective and economical strategies to remediate atmospheric greenhouse gases. To maximize photosynthetic carbon capture efficiency and energy-return-on-investment, we must develop biomass production systems that achieve the greatest yields with the lowest inputs. Numerous studies have demonstrated that microalgae have among the greatest potentials for biomass production. This is in part due to the fact that all alga cells are photoautotrophic, they have active carbon concentrating mechanisms to increase photosynthetic productivity, and all the biomass is harvestable unlike plants. All photosynthetic organisms, however, convert only a fraction of the solar energy they capture into chemical energy (reduced carbon or biomass). To increase aerial carbon capture rates and biomass productivity, it will be necessary to identify the most robust algal strains and increase their biomass production efficiency often by genetic manipulation. We review recent large-scale efforts to identify the best biomass producing strains and metabolic engineering strategies to improve aerial productivity. These strategies include optimization of photosynthetic light-harvesting antenna size to increase energy capture and conversion efficiency and the potential development of advanced molecular breeding techniques. To date, these strategies have resulted in up to twofold increases in biomass productivity.

  14. Multi-Objective Optimization of a Hybrid ESS Based on Optimal Energy Management Strategy for LHDs

    Directory of Open Access Journals (Sweden)

    Jiajun Liu

    2017-10-01

    Full Text Available Energy storage systems (ESS) play an important role in the performance of mining vehicles. A hybrid ESS combining both batteries (BTs) and supercapacitors (SCs) is one of the most promising solutions. As a case study, this paper discusses the optimal hybrid ESS sizing and energy management strategy (EMS) of 14-ton underground load-haul-dump vehicles (LHDs). Three novel contributions are added to the relevant literature. First, a multi-objective optimization is formulated regarding energy consumption and the total cost of a hybrid ESS, which are the key factors of LHDs, and a battery capacity degradation model is used. During the process, a dynamic programming (DP)-based EMS is employed to obtain the optimal energy consumption and hybrid ESS power profiles. Second, a 10-year life cycle cost model of a hybrid ESS for LHDs is established to calculate the total cost, including capital cost, operating cost, and replacement cost. According to the optimization results, three solutions chosen from the Pareto front are compared comprehensively, and the optimal one is selected. Finally, the optimal and battery-only options are compared quantitatively using the same objectives, and the hybrid ESS is found to be a more economical and efficient option.

  15. Optimized imaging using non-rigid registration

    International Nuclear Information System (INIS)

    Berkels, Benjamin; Binev, Peter; Blom, Douglas A.; Dahmen, Wolfgang; Sharpley, Robert C.; Vogt, Thomas

    2014-01-01

    The extraordinary improvements of modern imaging devices offer access to data with unprecedented information content. However, widely used image processing methodologies fall far short of exploiting the full breadth of information offered by numerous types of scanning probe, optical, and electron microscopies. In many applications, it is necessary to keep measurement intensities below a desired threshold. We propose a methodology for extracting an increased level of information by processing a series of data sets suffering, in particular, from high degree of spatial uncertainty caused by complex multiscale motion during the acquisition process. An important role is played by a non-rigid pixel-wise registration method that can cope with low signal-to-noise ratios. This is accompanied by formulating objective quality measures which replace human intervention and visual inspection in the processing chain. Scanning transmission electron microscopy of siliceous zeolite material exhibits the above-mentioned obstructions and therefore serves as orientation and a test of our procedures. - Highlights: • Developed a new process for extracting more information from a series of STEM images. • An objective non-rigid registration process copes with distortions. • Images of zeolite Y show retrieval of all information available from the data set. • Quantitative measures of registration quality were implemented. • Applicable to any serially acquired data, e.g. STM, AFM, STXM, etc

  16. Optimization of microsatellite DNA Gelred fluorescence imaging ...

    African Journals Online (AJOL)

    user1

    2012-10-11

    Oct 11, 2012 ... In order to explore the best microsatellite DNA Gelred imaging technology, this ... analysis and character identification breeding practice, because it is ... detection methods are agarose gel electrophoresis (AGE) with ethidium ... method (PG). Gelred 10000X stock reagent was diluted in the 1.5% agarose gel.

  17. Improving the automated optimization of profile extrusion dies by applying appropriate optimization areas and strategies

    Science.gov (United States)

    Hopmann, Ch.; Windeck, C.; Kurth, K.; Behr, M.; Siegbert, R.; Elgeti, S.

    2014-05-01

    The rheological design of profile extrusion dies is one of the most challenging tasks in die design. As no analytical solution is available, the quality of, and the development time for, a new design highly depend on the empirical knowledge of the die manufacturer. Usually, prior to starting production, several time-consuming, iterative running-in trials need to be performed to check the profile accuracy, and the die geometry is reworked accordingly. An alternative is numerical flow simulation. These simulations make it possible to calculate the melt flow through a die so that the quality of the flow distribution can be analyzed. The objective of a current research project is to improve the automated optimization of profile extrusion dies. Special emphasis is put on choosing a convenient starting geometry and parameterization, which allow for possible deformations. In this work, three commonly used design features are examined with regard to their influence on the optimization results. Based on the results, a strategy is derived to select the most relevant areas of the flow channels for the optimization. For these characteristic areas, recommendations are given concerning an efficient parameterization setup that still enables adequate deformations of the flow channel geometry. Exemplarily, this approach is applied to an L-shaped profile with different wall thicknesses. The die is optimized automatically and simulation results are qualitatively compared with experimental results. Furthermore, the strategy is applied to a complex extrusion die of a floor skirting profile to prove its universal adaptability.

  18. Optimization of fuel cycle strategies with constraints on uranium availability

    International Nuclear Information System (INIS)

    Silvennoinen, P.; Vira, J.; Westerberg, R.

    1982-01-01

    Optimization of nuclear reactor and fuel cycle strategies is studied under the influence of reduced availability of uranium. The analysis is separated into two distinct steps. First, the global situation is considered within given high and low projections of the installed capacity up to the year 2025. Uranium is regarded as an exhaustible resource whose production cost would increase in proportion to increasing cumulative exploitation. Based on the estimates obtained for the uranium cost, a global strategy is derived by splitting the installed capacity between light water reactor (LWR) once-through, LWR recycle, and fast breeder reactor (FBR) alternatives. In the second phase, the nuclear program of an individual utility is optimized within the constraints imposed by the global scenario. Results from the global scenarios indicate that in a reference case the uranium price would triple by the year 2000, and the price escalation would continue throughout the planning period. In a pessimistic growth scenario where the global nuclear capacity would not exceed 600 GW(electric) in 2025, the uranium price would almost double by 2000. In both global scenarios, FBRs would be introduced, in the reference case after 2000 and in the pessimistic case after 2010. In spite of the increases in the uranium prices, the levelized power production cost would increase by only 45% up to 2025 in the utility case, provided that the plutonium is incinerated as a substitute fuel.

  19. Web malware spread modelling and optimal control strategies

    Science.gov (United States)

    Liu, Wanping; Zhong, Shouming

    2017-02-01

    The growing popularity of the Web fuels the growth of web threats. Formulating mathematical models for accurate prediction of malicious propagation over networks is of great importance. The aim of this paper is to understand the propagation mechanisms of web malware and the impact of human intervention on the spread of malicious hyperlinks. Considering the characteristics of web malware, a new differential epidemic model which extends the traditional SIR model by adding another delitescent compartment is proposed to address the spreading behavior of malicious links over networks. The spreading threshold of the model system is calculated, and the dynamics of the model is theoretically analyzed. Moreover, optimal control theory is employed to study malware immunization strategies, aiming to keep the total economic loss from security investment and infection as low as possible. The existence and uniqueness of the results concerning the optimality system are confirmed. Finally, numerical simulations show that the spread of malware links can be controlled effectively with a proper control strategy and a specific parameter choice.
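
    A generic sketch of an SIR model extended with a latent ("delitescent") compartment, integrated with a simple Euler scheme, is given below to make the compartmental idea concrete. The parameter values and the exact compartment structure are illustrative assumptions, not the paper's model.

      import numpy as np

      # SIR-type model with an added latent compartment, Euler integration.
      beta, sigma, gamma = 0.4, 0.2, 0.1     # infection, activation, recovery rates (illustrative)
      S, E, I, R = 0.99, 0.0, 0.01, 0.0      # susceptible, latent, infectious, recovered fractions
      dt, T = 0.1, 200.0

      for _ in range(int(T / dt)):
          new_inf = beta * S * I
          S += dt * (-new_inf)
          E += dt * (new_inf - sigma * E)
          I += dt * (sigma * E - gamma * I)
          R += dt * (gamma * I)

      print(f"final fractions  S={S:.3f} E={E:.3f} I={I:.3f} R={R:.3f}")
      print("basic reproduction number R0 =", beta / gamma)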

  20. Using Chemical Reaction Kinetics to Predict Optimal Antibiotic Treatment Strategies.

    Science.gov (United States)

    Abel Zur Wiesch, Pia; Clarelli, Fabrizio; Cohen, Ted

    2017-01-01

    Identifying optimal dosing of antibiotics has proven challenging: some antibiotics are most effective when they are administered periodically at high doses, while others work best when minimizing concentration fluctuations. Mechanistic explanations for why antibiotics differ in their optimal dosing are lacking, limiting our ability to predict optimal therapy and leading to long and costly experiments. We use mathematical models that describe both bacterial growth and intracellular antibiotic-target binding to investigate the effects of fluctuating antibiotic concentrations on individual bacterial cells and bacterial populations. We show that physicochemical parameters, e.g. the rate of drug transmembrane diffusion and the antibiotic-target complex half-life, are sufficient to explain which treatment strategy is most effective. If the drug-target complex dissociates rapidly, the antibiotic must be kept constantly at a concentration that prevents bacterial replication. If antibiotics cross bacterial cell envelopes slowly to reach their target, there is a delay in the onset of action that may be reduced by increasing the initial antibiotic concentration. Finally, slow drug-target dissociation and slow diffusion out of cells act to prolong antibiotic effects, thereby allowing for less frequent dosing. Our model can be used as a tool in the rational design of treatment for bacterial infections. It is easily adaptable to other biological systems, e.g. HIV, malaria and cancer, where the effects of physiological fluctuations of drug concentration are also poorly understood.

  1. Survey Strategy Optimization for the Atacama Cosmology Telescope

    Science.gov (United States)

    De Bernardis, F.; Stevens, J. R.; Hasselfield, M.; Alonso, D.; Bond, J. R.; Calabrese, E.; Choi, S. K.; Crowley, K. T.; Devlin, M.; Wollack, E. J.

    2016-01-01

    In recent years there have been significant improvements in the sensitivity and the angular resolution of the instruments dedicated to the observation of the Cosmic Microwave Background (CMB). ACTPol is the first polarization receiver for the Atacama Cosmology Telescope (ACT) and is observing the CMB sky with arcmin resolution over approximately 2000 square degrees. Its upgrade, Advanced ACTPol (AdvACT), will observe the CMB in five frequency bands and over a larger area of the sky. We describe the optimization and implementation of the ACTPol and AdvACT surveys. The selection of the observed fields is driven mainly by the science goals, that is, small angular scale CMB measurements, B-mode measurements and cross-correlation studies. For the ACTPol survey we have observed patches of the southern galactic sky with low galactic foreground emissions which were also chosen to maximize the overlap with several galaxy surveys to allow unique cross-correlation studies. A wider field in the northern galactic cap ensured significant additional overlap with the BOSS spectroscopic survey. The exact shapes and footprints of the fields were optimized to achieve uniform coverage and to obtain cross-linked maps by observing the fields with different scan directions. We have maximized the efficiency of the survey by implementing a close to 24-hour observing strategy, switching between daytime and nighttime observing plans and minimizing the telescope idle time. We describe the challenges represented by the survey optimization for the significantly wider area observed by AdvACT, which will observe roughly half of the low-foreground sky. The survey strategies described here may prove useful for planning future ground-based CMB surveys, such as the Simons Observatory and CMB Stage IV surveys.

  2. Image processing to optimize wave energy converters

    Science.gov (United States)

    Bailey, Kyle Marc-Anthony

    The world is turning to renewable energies as a means of ensuring the planet's future and well-being. There have been a few attempts in the past to utilize wave power as a means of generating electricity through the use of Wave Energy Converters (WEC), but only recently are they becoming a focal point in the renewable energy field. Over the past few years there has been a global drive to advance the efficiency of WEC. Wave power is produced by placing a device, either onshore or offshore, that captures the energy within ocean surface waves and uses it to drive machinery. This paper seeks to provide a novel and innovative way to estimate ocean wave frequency through the use of image processing. This will be achieved by applying a complex modulated lapped orthogonal transform filter bank to satellite images of ocean waves. The complex modulated lapped orthogonal transform filter bank provides an equal subband decomposition of the Nyquist bounded discrete time Fourier Transform spectrum. The maximum energy of the 2D complex modulated lapped transform subband is used to determine the horizontal and vertical frequency, which subsequently can be used to determine the wave frequency in the direction of the WEC by a simple trigonometric scaling. The robustness of the proposed method is demonstrated by applications to simulated and real satellite images where the frequency is known.
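
    As a rough, simplified illustration of the frequency-estimation idea, the sketch below finds the dominant spatial frequency of a synthetic wave image with a plain 2D FFT and projects it onto the heading of the converter; the complex modulated lapped orthogonal transform of the paper is replaced by the FFT purely for brevity, and the pixel size, heading angle and deep-water dispersion step are added assumptions.

```python
# Estimate the dominant wavenumber of a wave image and project it onto a WEC heading (illustrative).
import numpy as np

dx = 5.0                                            # pixel size in metres (assumed)
x = np.arange(256) * dx
X, Y = np.meshgrid(x, x)
img = np.sin(2 * np.pi * (X * 0.01 + Y * 0.004))    # synthetic swell pattern

spec = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
fx = np.fft.fftshift(np.fft.fftfreq(img.shape[1], d=dx))
fy = np.fft.fftshift(np.fft.fftfreq(img.shape[0], d=dx))
spec[spec.shape[0] // 2, spec.shape[1] // 2] = 0.0  # drop the DC term
iy, ix = np.unravel_index(np.argmax(spec), spec.shape)
kx, ky = 2 * np.pi * fx[ix], 2 * np.pi * fy[iy]     # dominant wavenumber components (rad/m)

theta = np.deg2rad(30.0)                            # WEC heading relative to the image x-axis (assumed)
k_dir = abs(kx * np.cos(theta) + ky * np.sin(theta))   # trigonometric projection onto that heading
omega = np.sqrt(9.81 * np.hypot(kx, ky))            # deep-water dispersion relation (assumption)
print(f"projected wavenumber {k_dir:.4f} rad/m, wave frequency {omega / (2 * np.pi):.3f} Hz")
```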

  3. Developing optimized CT scan protocols: Phantom measurements of image quality

    International Nuclear Information System (INIS)

    Zarb, Francis; Rainford, Louise; McEntee, Mark F.

    2011-01-01

    Purpose: The increasing frequency of computerized tomography (CT) examinations is well documented, leading to concern about potential radiation risks for patients. However, the consequences of not performing the CT examination and missing injuries and disease are potentially serious, impacting upon correct patient management. The ALARA principle of dose optimization must be employed for all justified CT examinations. Dose indicators displayed on the CT console, such as the CT dose index (CTDI) and/or dose length product (DLP), are used to indicate dose and can quantify improvements achieved through optimization. Key scan parameters contributing to dose have been identified in previous literature and in previous work by our group. The aim of this study was to optimize the scan parameters mA, kV and pitch, whilst maintaining image quality and reducing dose. This research was conducted using psychophysical image quality measurements on a CT quality assurance (QA) phantom, establishing the impact of dose optimization on image quality parameters. Method: Current CT scan parameters for head (posterior fossa and cerebrum), abdomen and chest examinations were collected from 57% of CT suites available nationally in Malta (n = 4). Current scan protocols were used to image a Catphan 600 CT QA phantom, whereby image quality was assessed. Each scan parameter (mA, kV and pitch) was systematically reduced until the contrast resolution (CR), spatial resolution (SR) and noise were significantly lowered. The Catphan 600 images, produced by the range of protocols, were evaluated by 2 expert observers assessing CR, SR and noise. The protocol considered as the optimization threshold was just above the setting that resulted in a significant reduction in CR and noise without affecting SR at the 95% confidence interval. Results: The limit of the optimization threshold was determined for each CT suite. Employing optimized parameters, CTDI and DLP were both significantly reduced (p ≤ 0.001) by

  4. Identifying optimal agricultural countermeasure strategies for a hypothetical contamination scenario using the strategy model

    International Nuclear Information System (INIS)

    Cox, G.; Beresford, N.A.; Alvarez-Farizo, B.; Oughton, D.; Kis, Z.; Eged, K.; Thorring, H.; Hunt, J.; Wright, S.; Barnett, C.L.; Gil, J.M.; Howard, B.J.; Crout, N.M.J.

    2005-01-01

    A spatially implemented model designed to assist the identification of optimal countermeasure strategies for radioactively contaminated regions is described. Collective and individual ingestion doses for people within the affected area are estimated together with collective exported ingestion dose. A range of countermeasures are incorporated within the model, and environmental restrictions have been included as appropriate. The model evaluates the effectiveness of a given combination of countermeasures through a cost function which balances the benefit obtained through the reduction in dose with the cost of implementation. The optimal countermeasure strategy is the combination of individual countermeasures (and when and where they are implemented) which gives the lowest value of the cost function. The model outputs should not be considered as definitive solutions, rather as interactive inputs to the decision making process. As a demonstration the model has been applied to a hypothetical scenario in Cumbria (UK). This scenario considered a published nuclear power plant accident scenario with a total deposition of 1.7 × 10^14, 1.2 × 10^13, 2.8 × 10^10 and 5.3 × 10^9 Bq for Cs-137, Sr-90, Pu-239/240 and Am-241, respectively. The model predicts that if no remediation measures were implemented the resulting collective dose would be approximately 36 000 person-Sv (predominantly from Cs-137) over a 10-year period post-deposition. The optimal countermeasure strategy is predicted to avert approximately 33 000 person-Sv at a cost of approximately £160 million. The optimal strategy comprises a mixture of ploughing, AFCF (ammonium-ferric hexacyano-ferrate) administration, potassium fertiliser application, clean feeding of livestock and food restrictions. The model recommends specific areas within the contaminated area and time periods where these measures should be implemented.

  5. PEMFC Optimization Strategy with Auxiliary Power Source in Fuel Cell Hybrid Vehicle

    Directory of Open Access Journals (Sweden)

    Tinton Dwi Atmaja

    2012-02-01

    Full Text Available One of the present-day implementations of fuel cells is acting as the main power source in a Fuel Cell Hybrid Vehicle (FCHV). This paper proposes some strategies to optimize the performance of a Polymer Electrolyte Membrane Fuel Cell (PEMFC) combined with an auxiliary power source to construct a proper FCHV hybridization. The strategies consist of the most up-to-date optimization methods determined from three points of view, i.e., Energy Storage System (ESS), hybridization topology and control system analysis. The goal of these strategies is to achieve an optimum hybridization with long lifetime, low cost, high efficiency, and improved hydrogen consumption rate. The energy storage system strategy considers the battery, supercapacitor, and high-speed flywheel as the most promising alternative auxiliary power sources. The hybridization topology strategy analyzes the use of multiple storage devices combined with electronic components to deliver higher fuel economy and cost savings. The control system strategy employs a nonlinear control system to optimize the ripple factor of the voltage and the current

  6. Robust approximate optimal guidance strategies for aeroassisted orbital transfer missions

    Science.gov (United States)

    Ilgen, Marc R.

    This thesis presents the application of game theoretic and regular perturbation methods to the problem of determining robust approximate optimal guidance laws for aeroassisted orbital transfer missions with atmospheric density and navigated state uncertainties. The optimal guidance problem is reformulated as a differential game problem with the guidance law designer and Nature as opposing players. The resulting equations comprise the necessary conditions for the optimal closed loop guidance strategy in the presence of worst case parameter variations. While these equations are nonlinear and cannot be solved analytically, the presence of a small parameter in the equations of motion allows the method of regular perturbations to be used to solve the equations approximately. This thesis is divided into five parts. The first part introduces the class of problems to be considered and presents results of previous research. The second part then presents explicit semianalytical guidance law techniques for the aerodynamically dominated region of flight. These guidance techniques are applied to unconstrained and control constrained aeroassisted plane change missions and Mars aerocapture missions, all subject to significant atmospheric density variations. The third part presents a guidance technique for aeroassisted orbital transfer problems in the gravitationally dominated region of flight. Regular perturbations are used to design an implicit guidance technique similar to the second variation technique but that removes the need for numerically computing an optimal trajectory prior to flight. This methodology is then applied to a set of aeroassisted inclination change missions. In the fourth part, the explicit regular perturbation solution technique is extended to include the class of guidance laws with partial state information. This methodology is then applied to an aeroassisted plane change mission using inertial measurements and subject to uncertainties in the initial value

  7. Optimal strategy for selling on group-buying website

    Directory of Open Access Journals (Sweden)

    Xuan Jiang

    2014-09-01

    Full Text Available Purpose: The purpose of this paper is to help business marketers with offline channels to make decisions on whether to sell through Group-buying (GB) websites and how to set the online price with the coordination of maximum deal size on GB websites. Design/methodology/approach: Considering the deal structure of GB websites, especially the service fee and minimum deal size limit required by GB websites, the advertising effect of selling on GB websites, and the interaction between online and offline markets, an analytical model is built to derive the optimal online price and maximum deal size for sellers selling through a GB website. This paper aims to answer four research questions: (1) How to make a decision on maximum deal size with coordination of the deal price? (2) Will selling on GB websites always be better than staying with the offline channel only? (3) What kind of products is more appropriate to sell on GB websites? (4) How could a GB website operator induce sellers to offer deep discounts in GB deals? Findings and Originality/value: This paper obtains optimal strategies for sellers selling on a GB website and finds that: Even if a seller has sufficient capacity, he/she may still set a maximum deal size on the GB deal to take advantage of the Advertisement with Limited Availability (ALA) effect; Selling through a GB website may not bring a higher profit than selling only through the offline channel when a GB site only has a small consumer base and/or if there is a big overlap between the online and offline markets; Low margin products are more suitable for being sold online with ALA strategies (LP-ALA or HP-ALA) than high margin ones; A GB site operator could set a small minimum deal size to induce deep discounts from the sellers selling through GB deals. Research limitations/implications: The present study assumed that the demand function is determinate and linear. It will be interesting to study how stochastic demand and a more general demand function affect the optimal

  8. Bound Alternative Direction Optimization for Image Deblurring

    Directory of Open Access Journals (Sweden)

    Xiangrong Zeng

    2014-01-01

    The idea is first to bound the ℓp regularizer by a novel majorizer and then, based on a variable splitting, to reformulate the bound unconstrained problem into a constrained one, which is then addressed via an augmented Lagrangian method. The proposed algorithm actually combines the reweighted ℓ1 minimization method and the alternating direction method of multipliers (ADMM) such that it succeeds in extending the application of ADMM to ℓp minimization problems. The conducted experimental studies demonstrate the superiority of the proposed algorithm for synthesis ℓp minimization over the state-of-the-art algorithms for synthesis ℓ1 minimization on image deblurring.
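
    For readers unfamiliar with the ADMM machinery mentioned above, a bare-bones ℓ1 (rather than ℓp) deblurring loop is sketched below. It assumes a periodic (circular) blur so the quadratic subproblem can be solved in the Fourier domain; it is a generic ADMM illustration, not the bound-optimization algorithm proposed in the paper.

```python
# Generic ADMM for min_x 0.5*||h*x - y||^2 + lam*||x||_1 with a circular blur h (illustrative only).
import numpy as np

def admm_l1_deblur(y, h, lam=0.01, rho=1.0, iters=100):
    H = np.fft.fft2(h, s=y.shape)            # transfer function of the blur kernel
    HtY = np.conj(H) * np.fft.fft2(y)
    denom = np.abs(H) ** 2 + rho
    z = y.copy()
    u = np.zeros_like(y)
    soft = lambda v, t: np.sign(v) * np.maximum(np.abs(v) - t, 0.0)
    for _ in range(iters):
        # x-update: solve (H^T H + rho I) x = H^T y + rho (z - u) in the Fourier domain
        x = np.real(np.fft.ifft2((HtY + rho * np.fft.fft2(z - u)) / denom))
        z = soft(x + u, lam / rho)            # z-update: proximal step of the l1 term
        u = u + x - z                         # dual update
    return z

rng = np.random.default_rng(0)
truth = np.zeros((64, 64)); truth[20:24, 30:34] = 1.0        # sparse test "image"
kernel = np.ones((5, 5)) / 25.0
blurred = np.real(np.fft.ifft2(np.fft.fft2(truth) * np.fft.fft2(kernel, s=truth.shape)))
noisy = blurred + 0.01 * rng.standard_normal(truth.shape)
restored = admm_l1_deblur(noisy, kernel, lam=0.02)
print("relative restoration error:", np.linalg.norm(restored - truth) / np.linalg.norm(truth))
```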

  9. Optimization of MR imaging for extracranial head and neck lesions

    International Nuclear Information System (INIS)

    Dalley, R.W.; Maravilla, K.R.; Cohen, W.

    1989-01-01

    The authors have used a 1.5T MR imager to study 28 pathologically proven extracranial head and neck lesions. Multiple pulse sequences were performed pre- and/or post-gadolinium, including T1-weighted, short TI inversion-recovery (STIR), spin-density, and T2-weighted sequences. T1-weighted images provided excellent anatomic detail but relatively poor muscle/lesion contrast. Gadolinium often improved lesion visibility; however, discrimination from surrounding fat was impaired. Postcontrast T2-weighted images seemed to provide better lesion conspicuity than did pre-gadolinium images. STIR imaging provided the highest lesion conspicuity in fatty areas. No single sequence was optimal for all head and neck imaging. The authors analyze the advantages and limitations of each sequence and formulate rational imaging protocols based on the primary region of interest.

  10. Optimization of T2-weighted imaging for shoulder magnetic resonance arthrography by synthetic magnetic resonance imaging.

    Science.gov (United States)

    Lee, Seung Hyun; Lee, Young Han; Hahn, Seok; Yang, Jaemoon; Song, Ho-Taek; Suh, Jin-Suck

    2017-01-01

    Background: Synthetic magnetic resonance imaging (MRI) allows reformatting of various synthetic images by adjustment of scanning parameters such as repetition time (TR) and echo time (TE). Optimized MR images can be reformatted from T1, T2, and proton density (PD) values to achieve maximum tissue contrast between joint fluid and adjacent soft tissue. Purpose: To demonstrate the method for optimization of TR and TE by synthetic MRI and to validate the optimized images by comparison with conventional shoulder MR arthrography (MRA) images. Material and Methods: Thirty-seven shoulder MRA images acquired by synthetic MRI were retrospectively evaluated for PD, T1, and T2 values at the joint fluid and glenoid labrum. Differences in signal intensity between the fluid and labrum were observed between TR of 500-6000 ms and TE of 80-300 ms in T2-weighted (T2W) images. Conventional T2W and synthetic images were analyzed for diagnostic agreement of supraspinatus tendon abnormalities (kappa statistics) and image quality scores (one-way analysis of variance with post-hoc analysis). Results: Optimized mean values of TR and TE were 2724.7 ± 1634.7 ms and 80.1 ± 0.4 ms, respectively. Diagnostic agreement for supraspinatus tendon abnormalities between conventional and synthetic MR images was excellent (κ = 0.882). The mean image quality score of the joint space in optimized synthetic images was significantly higher than in conventional and synthetic images (2.861 ± 0.351 vs. 2.556 ± 0.607 vs. 2.750 ± 0.439). Conclusion: Synthetic MRI with optimized TR and TE for shoulder MRA enables optimization of soft-tissue contrast.
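
    The TR/TE search described above can be mimicked with the standard spin-echo signal approximation S = PD·(1 − exp(−TR/T1))·exp(−TE/T2). In the sketch below, the relaxation values for fluid and labrum are rough placeholder numbers, not the values measured in the study, and the grid limits simply mirror the ranges quoted in the abstract.

```python
# Grid-search TR/TE to maximize fluid-vs-labrum contrast with a simple spin-echo signal model.
# S = PD * (1 - exp(-TR/T1)) * exp(-TE/T2); tissue values are assumptions for illustration only.
import numpy as np

def signal(pd, t1, t2, tr, te):
    return pd * (1.0 - np.exp(-tr / t1)) * np.exp(-te / t2)

fluid = dict(pd=1.0, t1=3000.0, t2=1000.0)   # joint fluid (placeholder values, ms)
labrum = dict(pd=0.7, t1=1000.0, t2=30.0)    # glenoid labrum (placeholder values, ms)

tr_grid = np.arange(500.0, 6001.0, 50.0)     # TR range from the abstract
te_grid = np.arange(80.0, 301.0, 5.0)        # TE range from the abstract
TR, TE = np.meshgrid(tr_grid, te_grid, indexing="ij")
contrast = np.abs(signal(**fluid, tr=TR, te=TE) - signal(**labrum, tr=TR, te=TE))
i, j = np.unravel_index(np.argmax(contrast), contrast.shape)
print(f"max fluid/labrum contrast at TR={tr_grid[i]:.0f} ms, TE={te_grid[j]:.0f} ms")
```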

  11. Differential evolution optimization combined with chaotic sequences for image contrast enhancement

    Energy Technology Data Exchange (ETDEWEB)

    Santos Coelho, Leandro dos [Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Pontifical Catholic University of Parana, PUCPR Imaculada Conceicao, 1155, 80215-901 Curitiba, Parana (Brazil)], E-mail: leandro.coelho@pucpr.br; Sauer, Joao Guilherme [Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Pontifical Catholic University of Parana, PUCPR Imaculada Conceicao, 1155, 80215-901 Curitiba, Parana (Brazil)], E-mail: joao.sauer@gmail.com; Rudek, Marcelo [Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Pontifical Catholic University of Parana, PUCPR Imaculada Conceicao, 1155, 80215-901 Curitiba, Parana (Brazil)], E-mail: marcelo.rudek@pucpr.br

    2009-10-15

    Evolutionary Algorithms (EAs) are stochastic and robust meta-heuristics from the field of evolutionary computation, useful for solving optimization problems in image processing applications. Recently, as a special mechanism to avoid being trapped in local minima, the ergodicity property of chaotic sequences has been used in various designs of EAs. Three differential evolution approaches based on chaotic sequences generated by the logistic equation are proposed in this paper for the image enhancement process. Differential evolution is a simple yet powerful evolutionary optimization algorithm that has been successfully used in solving continuous problems. The proposed chaotic differential evolution schemes have a fast convergence rate while maintaining the diversity of the population so as to escape from local optima. In this paper, image contrast enhancement is approached as a constrained nonlinear optimization problem. The objective of the proposed chaotic differential evolution schemes is to maximize the fitness criterion in order to enhance the contrast and detail in the image by adapting the parameters using a contrast enhancement technique. The proposed chaotic differential evolution schemes are compared with classical differential evolution on two test images. Simulation results on three images show that the application of chaotic sequences instead of random sequences is a possible strategy to improve the performance of the classical differential evolution optimization algorithm.
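
    The core twist reported here, replacing uniform random draws with a chaotic logistic sequence inside differential evolution, can be sketched in a few lines. The objective function, population size and control parameters below are arbitrary stand-ins rather than the image-enhancement fitness used in the paper.

```python
# Differential evolution whose crossover draws come from the chaotic logistic map x_{n+1} = 4 x_n (1 - x_n).
import numpy as np

def logistic_stream(x0=0.7, skip=100):
    x = x0
    for _ in range(skip):                  # discard the transient
        x = 4.0 * x * (1.0 - x)
    while True:
        x = 4.0 * x * (1.0 - x)
        yield x

def chaotic_de(f, bounds, pop=20, gens=200, F=0.6, CR=0.9, seed=1):
    rng, chaos = np.random.default_rng(seed), logistic_stream()
    dim = len(bounds)
    lo, hi = np.array(bounds).T
    X = lo + (hi - lo) * rng.random((pop, dim))
    fit = np.array([f(x) for x in X])
    for _ in range(gens):
        for i in range(pop):
            r1, r2, r3 = rng.choice([j for j in range(pop) if j != i], 3, replace=False)
            mutant = np.clip(X[r1] + F * (X[r2] - X[r3]), lo, hi)
            cross = np.array([next(chaos) for _ in range(dim)]) < CR   # chaotic instead of uniform draws
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, X[i])
            ft = f(trial)
            if ft < fit[i]:
                X[i], fit[i] = trial, ft
    return X[np.argmin(fit)], fit.min()

best, val = chaotic_de(lambda x: np.sum(x ** 2), bounds=[(-5, 5)] * 4)   # toy objective
print(best, val)
```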

  12. Comparison of image quality in head CT studies with different dose-reduction strategies

    DEFF Research Database (Denmark)

    Johansen, Jeppe; Nielsen, Rikke; Fink-Jensen, Vibeke

    The number of multi-detector CT examinations is increasing rapidly. They allow high-quality reformatted images, providing accurate and precise diagnosis at maximum speed. Brain examinations are the most commonly requested studies, and although they come at a lower effective dose than body CT, they can account for a considerable radiation dose as many patients undergo repeated studies. Therefore, various dose-reduction strategies are applied, such as automated tube current and voltage modulation and, recently, different iterative reconstruction algorithms. However, the trade-off of all dose-reduction maneuvers is reduction of image quality due to image noise or artifacts. The aim of our study was therefore to find the best diagnostic images with the lowest possible dose. We present results of dose- and image quality optimizing strategies for brain CT examinations at our institution. We compare sequential

  13. Progresses in optimization strategy for radiolabeled molecular probes targeting integrin αvβ3

    International Nuclear Information System (INIS)

    Chen Haojun; Wu Hua

    2012-01-01

    Tumor angiogenesis is critical in the growth, invasion and metastasis of malignant tumors. The integrins, which are expressed on many types of tumor cells and activated vascular endothelial cells, play an important role in the regulation of tumor angiogenesis. The RGD peptide, which contains the Arg-Gly-Asp sequence, binds specifically to integrin αvβ3. Therefore, radiolabeled RGD peptides may have broad application prospects in radionuclide imaging and therapy. Major research interests include the selection of radionuclides and the modification and improvement of RGD structures. In this article, we give a review of research progress in optimization strategies for radiolabeled molecular probes targeting integrin αvβ3. (authors)

  14. Optimization of wavelet decomposition for image compression and feature preservation.

    Science.gov (United States)

    Lo, Shih-Chung B; Li, Huai; Freedman, Matthew T

    2003-09-01

    A neural-network-based framework has been developed to search for an optimal wavelet kernel that can be used for a specific image processing task. In this paper, a linear convolution neural network was employed to seek a wavelet that minimizes errors and maximizes compression efficiency for an image or a defined image pattern such as microcalcifications in mammograms and bone in computed tomography (CT) head images. We have used this method to evaluate the performance of tap-4 wavelets on mammograms, CTs, magnetic resonance images, and Lena images. We found that the Daubechies wavelet or those wavelets with similar filtering characteristics can produce the highest compression efficiency with the smallest mean-square-error for many image patterns including general image textures as well as microcalcifications in digital mammograms. However, the Haar wavelet produces the best results on sharp edges and low-noise smooth areas. We also found that a special wavelet whose low-pass filter coefficients are (0.32252136, 0.85258927, 1.38458542, and -0.14548269) produces the best preservation outcomes in all tested microcalcification features including the peak signal-to-noise ratio, the contrast and the figure of merit in the wavelet lossy compression scheme. Having analyzed the spectrum of the wavelet filters, we can find the compression outcomes and feature preservation characteristics as a function of wavelets. This newly developed optimization approach can be generalized to other image analysis applications where a wavelet decomposition is employed.
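
    A compression-versus-error comparison of the kind described above can be reproduced by thresholding the detail coefficients of a 2-D wavelet decomposition. The sketch below uses PyWavelets with stock wavelets ('haar', 'db2', 'sym2') as stand-ins for the learned tap-4 kernel; the keep-fraction thresholding rule and the random test image are assumptions for illustration.

```python
# Compare lossy wavelet compression for different kernels by thresholding transform coefficients.
import numpy as np
import pywt

def compress_psnr(img, wavelet, keep=0.05, level=3):
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    thresh = np.quantile(np.abs(arr), 1.0 - keep)        # keep only the largest 5% of coefficients
    arr = pywt.threshold(arr, thresh, mode="hard")
    rec = pywt.waverec2(pywt.array_to_coeffs(arr, slices, output_format="wavedec2"), wavelet)
    rec = rec[: img.shape[0], : img.shape[1]]
    mse = np.mean((img - rec) ** 2)
    return 10 * np.log10(img.max() ** 2 / mse)

rng = np.random.default_rng(0)
img = rng.random((128, 128))                             # stand-in image; use a mammogram ROI in practice
for w in ("haar", "db2", "sym2"):
    print(w, "PSNR at 5% coefficients:", round(compress_psnr(img, w), 2))
```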

  15. OPTIMIZATION OF DIAGNOSTIC IMAGING IN BREAST CANCER

    Directory of Open Access Journals (Sweden)

    S. A. Velichko

    2015-01-01

    Full Text Available The paper presents the results of breast imaging for 47,200 women. Breast cancer was detected in 862 (1.9%) patients, fibroadenoma in 1267 (2.7%) patients and isolated breast cysts in 1162 (2.4%) patients. Different types of fibrocystic breast disease (adenosis, diffuse fibrocystic changes, local fibrosis and others) were observed in 60.1% of women. Problems of breast cancer visualization during mammography, characterized by the appearance of fibrocystic mastopathy (sclerosing adenosis, fibrous bands along the ducts), have been analyzed. Data on the development of diagnostic algorithms including the modern techniques for ultrasound and interventional radiology aimed at detecting early breast cancer have been presented.

  16. An Overview of Optimizing Strategies for Flotation Banks

    Directory of Open Access Journals (Sweden)

    Miguel Maldonado

    2012-10-01

    Full Text Available A flotation bank is a serial arrangement of cells. How to optimally operate a bank remains a challenge. This article reviews three reported strategies: air profiling, mass-pull (froth velocity) profiling and Peak Air Recovery (PAR) profiling. These are all ways of manipulating the recovery profile down a bank, which may be the property being exploited. Mathematical analysis has shown that a flat cell-by-cell recovery profile maximizes the separation of two floatable minerals for a given target bank recovery when the relative floatability is constant down the bank. Available bank survey data are analyzed with respect to recovery profiling. Possible variations on the recovery profile to minimize entrainment are discussed.

  17. Gradient Material Strategies for Hydrogel Optimization in Tissue Engineering Applications

    Science.gov (United States)

    2018-01-01

    Although a number of combinatorial/high-throughput approaches have been developed for biomaterial hydrogel optimization, a gradient sample approach is particularly well suited to identify hydrogel property thresholds that alter cellular behavior in response to interacting with the hydrogel due to reduced variation in material preparation and the ability to screen biological response over a range instead of discrete samples each containing only one condition. This review highlights recent work on cell–hydrogel interactions using a gradient material sample approach. Fabrication strategies for composition, material and mechanical property, and bioactive signaling gradient hydrogels that can be used to examine cell–hydrogel interactions will be discussed. The effects of gradients in hydrogel samples on cellular adhesion, migration, proliferation, and differentiation will then be examined, providing an assessment of the current state of the field and the potential of wider use of the gradient sample approach to accelerate our understanding of matrices on cellular behavior. PMID:29485612

  18. Optimizing urology group partnerships: collaboration strategies and compensation best practices.

    Science.gov (United States)

    Jacoby, Dana L; Maller, Bruce S; Peltier, Lisa R

    2014-10-01

    Market forces in health care have created substantial regulatory, legislative, and reimbursement changes that have had a significant impact on urology group practices. To maintain viability, many urology groups have merged into larger integrated entities. Although group operations vary considerably, the majority of groups have struggled with the development of a strong culture, effective decision-making, and consensus-building around shared resources, income, and expense. Creating a sustainable business model requires urology group leaders to allocate appropriate time and resources to address these issues in a proactive manner. This article outlines collaboration strategies for creating an effective culture, governance, and leadership, and provides practical suggestions for optimizing the performance of the urology group practice.

  19. Optimal Strategies for Probing Terrestrial Exoplanet Atmospheres with JWST

    Science.gov (United States)

    Batalha, Natasha E.; Lewis, Nikole K.; Line, Michael

    2018-01-01

    It is imperative that the exoplanet community determines the feasibility and the resources needed to yield high-fidelity atmospheric compositions from terrestrial exoplanets. In particular, LHS 1140b and the TRAPPIST-1 system, already slated for observations by JWST's Guaranteed Time Observers, will be the first two terrestrial planets observed by JWST. I will discuss optimal observing strategies for these two systems, focusing on the NIRSpec Prism (1-5 μm) and the combination of NIRISS SOSS (1-2.7 μm) and NIRSpec G395H (3-5 μm). I will also introduce currently unsupported JWST read modes that have the potential to greatly increase the precision of our atmospheric spectra. Lastly, I will use information content theory to compute the expected confidence interval on the retrieved abundances of key molecular species and temperature profiles as a function of JWST observing cycles.

  20. Optimized control strategy for crowbarless solid state modular power supply

    International Nuclear Information System (INIS)

    Upadhyay, R.; Badapanda, M.K.; Tripathi, A.; Hannurkar, P.R.; Pithawa, C.K.

    2009-01-01

    A solid state modular power supply with series-connected IGBT-based power modules has been employed as the high voltage bias power supply of a klystron amplifier. Auxiliary compensation of the full wave inverter bridge with ZVS/ZCS operation of all IGBTs over the entire operating range is incorporated. An optimized control strategy, presented in this paper, has been adopted for this power supply; it needs no output filter, making the scheme crowbarless. DSP-based fully digital control with the same duty cycle for all power modules has been incorporated for regulating this power supply, along with adequate protection features. The input to this power supply is taken directly from an 11 kV line, and the input system is intentionally made 24-pulse to reduce the input harmonics and improve the input power factor significantly, thereby requiring no line filters. Various steps have been taken to increase the efficiency of major subsystems, so as to improve the overall efficiency of this power supply significantly. (author)

  1. Evolution strategy based optimal chiller loading for saving energy

    International Nuclear Information System (INIS)

    Chang, Y.-C.; Lee, C.-Y.; Chen, C.-R.; Chou, C.-J.; Chen, W.-H.; Chen, W.-H.

    2009-01-01

    This study employs an evolution strategy (ES) to solve the optimal chiller loading (OCL) problem. ES overcomes the flaw that the Lagrangian method is not suitable for solving OCL when the power consumption models, or kW-PLR (partial load ratio) curves, include convex and concave functions simultaneously. The complicated evolution process of the genetic algorithm (GA) method for solving OCL can also be simplified by the ES method. This study uses the PLR of each chiller as the variable to be solved for the decoupled air conditioning system. After analysis and comparison of the case study, it is concluded that this method not only avoids the problems of the Lagrangian and GA methods, but also produces highly accurate results within a short time. It can readily be applied to the operation of air conditioning systems.
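
    To make the optimal chiller loading (OCL) formulation concrete, the sketch below runs a small (mu+lambda) evolution strategy over chiller part-load ratios with a penalty for missing the cooling demand. The quadratic kW-PLR coefficients, capacities and load are invented for illustration and are not taken from the case study.

```python
# (mu+lambda) evolution strategy for optimal chiller loading: minimize total kW while meeting the load.
import numpy as np

capacity = np.array([800.0, 800.0, 500.0])                 # chiller capacities in RT (assumed)
kw_coeff = np.array([[120.0, 310.0, 95.0],                 # kW = a + b*PLR + c*PLR^2 per chiller (assumed)
                     [115.0, 300.0, 110.0],
                     [ 80.0, 220.0, 70.0]])
demand = 1500.0                                             # cooling load in RT (assumed)

def cost(plr):
    kw = kw_coeff[:, 0] + kw_coeff[:, 1] * plr + kw_coeff[:, 2] * plr ** 2
    penalty = 1e3 * abs(np.dot(plr, capacity) - demand)     # soft equality constraint on delivered load
    return kw.sum() + penalty

rng = np.random.default_rng(0)
mu, lam, sigma = 10, 40, 0.1
parents = rng.uniform(0.3, 1.0, size=(mu, 3))               # PLR bounded to [0.3, 1.0]
for _ in range(300):
    idx = rng.integers(mu, size=lam)
    children = np.clip(parents[idx] + sigma * rng.standard_normal((lam, 3)), 0.3, 1.0)
    pool = np.vstack([parents, children])
    parents = pool[np.argsort([cost(p) for p in pool])[:mu]]
    sigma *= 0.99                                            # simple step-size decay
best = parents[0]
print("PLR per chiller:", best.round(3), " total kW:", round(cost(best), 1))
```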

  2. Optimal Inspection and Repair Strategies for Structural Systems

    DEFF Research Database (Denmark)

    Sommer, A. M.; Nowak, A. S.; Thoft-Christensen, Palle

    1992-01-01

    A model for reliability-based repair and maintenance strategies of structural systems is described. The total expected costs in the lifetime of the structure are minimized with the number of inspections, the number and positions of the inspected points, the inspection efforts, the repair criteria and a design variable as optimization variables. A model for estimating the total expected costs for structural systems is given, including the costs associated with the loss of individual structural members, the costs associated with the loss of at least one element of a particular group of structural members, and the costs associated with the simultaneous loss of all members of a specific group of structural members. The approach is based on the pre-posteriori analysis from classical decision theory. Special emphasis is given to the problem of selecting the number of points in the structure

  3. Orientation Strategies for Aerial Oblique Images

    Science.gov (United States)

    Wiedemann, A.; Moré, J.

    2012-07-01

    Oblique aerial images are becoming more and more widely distributed, filling the gap between vertical aerial images and mobile mapping systems. Different systems are on the market. For some applications, like texture mapping, precise orientation data are required. One point is a stable interior orientation, which can be achieved by stable camera systems; the other is a precise exterior orientation. A sufficient exterior orientation can be achieved by a large effort in direct sensor orientation, whereas minor errors in the angles have a larger effect than in vertical imagery. The more appropriate approach is to determine the precise orientation parameters by photogrammetric methods using an adapted aerial triangulation. Due to the different points of view towards the object, the traditional aerotriangulation matching tools fail, as they produce many blunders and require a lot of manual work to achieve a sufficient solution. In this paper some approaches are discussed and results are presented for the most promising ones. We describe a single-step approach with an aerotriangulation using all available images; a two-step approach with an aerotriangulation of only the vertical images plus a mathematical transformation of the oblique images using the oblique cameras' eccentricity; and finally an extended functional model for a bundle block adjustment considering the mechanical connection between vertical and oblique images. Besides accuracy, other aspects such as efficiency and the required manual work also have to be considered.

  4. Optimization of behavioral, biobehavioral, and biomedical interventions the multiphase optimization strategy (MOST)

    CERN Document Server

    Collins, Linda M

    2018-01-01

    This book presents a framework for development, optimization, and evaluation of behavioral,  biobehavioral, and biomedical interventions.  Behavioral, biobehavioral, and biomedical interventions are programs with the objective of improving and maintaining human health and well-being, broadly defined, in individuals, families, schools, organizations, or communities.  These interventions may be aimed at, for example, preventing or treating disease, promoting physical and mental health, preventing violence, or improving academic achievement.   This volume introduces the Multiphase Optimization Strategy (MOST), pioneered at The Methodology Center at the Pennsylvania State University, as an alternative to the classical approach of relying solely on the randomized controlled trial (RCT).  MOST borrows heavily from perspectives taken and approaches used in engineering, and also integrates concepts from statistics and behavioral science, including the RCT.  As described in detail in this book, MOST consists of ...

  5. Optimal Order Strategy in Uncertain Demands with Free Shipping Option

    Directory of Open Access Journals (Sweden)

    Qing-Chun Meng

    2014-01-01

    Full Text Available Free shipping with conditions has become one of the most effective marketing tools; more and more companies, especially e-business companies, prefer to offer free shipping to buyers whenever their orders exceed the minimum quantity they specify. But in practice, the demands of buyers are uncertain, being affected by weather, season, and many other factors. Firstly, we model the centralized ordering problem of retailers who face stochastic demands when suppliers offer free shipping, in which only limited distributional information such as the known mean, support, and some deviation measures of the random data is needed. Then, based on the linear decision rule mainly used for stochastic programming, we analyze the optimal order strategies of retailers and discuss the approximate solution. Further, we present the core allocation between all retailers via duality and cooperative game theory. The existence of the core shows that each retailer is pleased to cooperate with others in the centralized problem. Finally, a numerical example is implemented to discuss how uncertain data and parameters affect the optimal solution.

  6. Optimization of a Biometric System Based on Acoustic Images

    Directory of Open Access Journals (Sweden)

    Alberto Izquierdo Fuente

    2014-01-01

    Full Text Available On the basis of an acoustic biometric system that captures 16 acoustic images of a person for 4 frequencies and 4 positions, a study was carried out to improve the performance of the system. In a first stage, an analysis was carried out to determine which images provide more information to the system, showing that a set of 12 images allows the system to obtain results that are equivalent to using all of the 16 images. Finally, optimization techniques were used to obtain the set of weights associated with each acoustic image that maximizes the performance of the biometric system. These results significantly improve the performance of the preliminary system, while reducing the time of acquisition and computational burden, since the number of acoustic images was reduced.

  7. Improved MR breast images by contrast optimization using artificial intelligence

    International Nuclear Information System (INIS)

    Konig, H.; Gohagan, J.; Laub, G.; Bachus, R.; Heywang, S.; Reinhardt, E.R.

    1986-01-01

    The clinical relevance of MR imaging of the breast is mainly related to the modality's ability to differentiate among normal, benign, and malignant tissue and to yield prognostic information. In addition to the MR imaging parameters, morphologic features of these images are calculated. Based on statistical information of a comprehensive, labeled image and knowledge of a database system, a numerical classifier is deduced. The application of this classifier to all cases leads to estimations of specific tissue types for each pixel. The method is sufficiently sensitive for grading a recognized tissue class. In this manner images with optimal contrast appropriate to particular diagnostic requirements are generated. The discriminant power of each MR imaging parameter as well as of a combination of parameters can be determined objectively with respect to tissue discrimination.

  8. Optimization of a Biometric System Based on Acoustic Images

    Science.gov (United States)

    Izquierdo Fuente, Alberto; Del Val Puente, Lara; Villacorta Calvo, Juan J.; Raboso Mateos, Mariano

    2014-01-01

    On the basis of an acoustic biometric system that captures 16 acoustic images of a person for 4 frequencies and 4 positions, a study was carried out to improve the performance of the system. In a first stage, an analysis was carried out to determine which images provide more information to the system, showing that a set of 12 images allows the system to obtain results that are equivalent to using all of the 16 images. Finally, optimization techniques were used to obtain the set of weights associated with each acoustic image that maximizes the performance of the biometric system. These results significantly improve the performance of the preliminary system, while reducing the time of acquisition and computational burden, since the number of acoustic images was reduced. PMID:24616643

  9. The PWR loading pattern optimization in X-IMAGE

    International Nuclear Information System (INIS)

    Stevens, J.G.; Smith, K.S.; Rempe, K.R.; Downar, T.J.

    1993-01-01

    The design of reactor core loading patterns is difficult due to the staggering number of possible patterns. The integer nature and nonlinear neutronic response of core design preclude simple prescriptions for generation of the feasible patterns, much less optimization among feasible candidates. Fortunately, recent developments in optimization, graphical user interfaces (GUIs), and the speed and low cost of engineering workstations combine to make loading pattern automation possible. The optimization module SIMAN has been added to X-IMAGE to automatically generate high-quality core loadings.

  10. Optimizing the HLT Buffer Strategy with Monte Carlo Simulations

    CERN Document Server

    AUTHOR|(CDS)2266763

    2017-01-01

    This project aims to optimize the strategy for utilizing the disk buffer of the High Level Trigger (HLT) of the LHCb experiment with the help of Monte Carlo simulations. A method is developed which simulates the Event Filter Farm (EFF) -- a computing cluster for the High Level Trigger -- as a collection of nodes with different performance properties. In this way, the behavior of the computing farm can be analyzed at a deeper level than before. It is demonstrated that the current operating strategy might be improved as data taking approaches a mid-year scheduled stop or the year-end technical stop. The processing time of the buffered data can be lowered by distributing the detector data according to the processing power of the nodes instead of the relative disk size, as long as the occupancy level of the buffer is low enough. Moreover, this ensures that data taken and stored on the buffer at the same time are processed by different nodes nearly simultaneously, which reduces load on the infrastructure.

  11. Effects of optimization and image processing in digital chest radiography

    International Nuclear Information System (INIS)

    Kheddache, S.; Maansson, L.G.; Angelhed, J.E.; Denbratt, L.; Gottfridsson, B.; Schlossman, D.

    1991-01-01

    A digital system for chest radiography based on a large image intensifier was compared to a conventional film-screen system. The digital system was optimized with regard to spatial and contrast resolution and dose. The images were digitally processed for contrast and edge enhancement. A simulated pneumothorax, together with two simulated nodules over the lungs and two over the mediastinum, was positioned on an anthropomorphic phantom. Observer performance was evaluated with Receiver Operating Characteristic (ROC) analysis. Five observers assessed the processed digital images and the conventional full-size radiographs. The time spent viewing the full-size radiographs and the digital images was recorded. For the simulated pneumothorax, the results showed perfect performance for the full-size radiographs, and detectability was high also for the processed digital images. No significant differences in the detectability of the simulated nodules were seen between the two imaging systems. The results for the digital images showed a significantly improved detectability for the nodules in the mediastinum as compared to a previous ROC study where no optimization or image processing was available. No significant difference in detectability was seen between the former and the present ROC study for small nodules in the lung. No difference was seen in the time spent assessing the conventional full-size radiographs and the digital images. The study indicates that processed digital images produced by a large image intensifier are equal in image quality to conventional full-size radiographs for low-contrast objects such as nodules. (author). 38 refs.; 4 figs.; 1 tab

  12. Malignant tumours of the kidney: imaging strategy

    International Nuclear Information System (INIS)

    Smets, Anne M.; Kraker, Jan de

    2010-01-01

    Primitive malignant renal tumours comprise 6% of all childhood cancers. Wilms tumour (WT) or nephroblastoma is the most frequent type accounting for more than 90%. Imaging alone cannot differentiate between these tumours with certainty but it plays an important role in screening, diagnostic workup, assessment of therapy response, preoperative evaluation and follow-up. The outcome of WT after therapy is excellent with an overall survival around 90%. In tumours such as those where the outcome is extremely good, focus can be shifted to a risk-based stratification to maintain excellent outcome in children with low risk tumours while improving quality of life and decreasing toxicity and costs. This review will discuss the imaging issues for WT from the European perspective and briefly discuss the characteristics of other malignant renal tumours occurring in children and new imaging techniques with potential in this matter. (orig.)

  13. Toward optimal color image quality of television display

    Science.gov (United States)

    MacDonald, Lindsay W.; Endrikhovski, Sergej N.; Bech, Soren; Jensen, Kaj

    1999-12-01

    A general framework and first experimental results are presented for the 'OPTimal IMage Appearance' (OPTIMA) project, which aims to develop a computational model for achieving optimal color appearance of natural images on adaptive CRT television displays. To achieve this goal we considered the perceptual constraints determining the quality of displayed images and how they could be quantified. The practical value of the notion of optimal image appearance was translated from the high level of the perceptual constraints into a method for setting the display's parameters at the physical level. In general, the whole framework of quality determination includes: (1) evaluation of perceived quality; (2) evaluation of the individual perceptual attributes; and (3) correlation between the physical measurements, psychometric parameters and the subjective responses. We performed a series of psychophysical experiments, with observers viewing a series of color images on a high-end consumer television display, to investigate the relationships between Overall Image Quality and four quality-related attributes: Brightness Rendering, Chromatic Rendering, Visibility of Details and Overall Naturalness. The results of the experiments presented in this paper suggest that these attributes are highly inter-correlated.

  14. Implementation of Enterprise Imaging Strategy at a Chinese Tertiary Hospital.

    Science.gov (United States)

    Li, Shanshan; Liu, Yao; Yuan, Yifang; Li, Jia; Wei, Lan; Wang, Yuelong; Fei, Xiaolu

    2018-01-04

    Medical images have become increasingly important in clinical practice and medical research, and the need to manage images at the hospital level has become urgent in China. To unify patient identification in examinations from different medical specialties, increase convenient access to medical images under authentication, and make medical images suitable for further artificial intelligence investigations, we implemented an enterprise imaging strategy by adopting an image integration platform as the main tool at Xuanwu Hospital. Workflow re-engineering and business system transformation were also performed to ensure the quality and content of the imaging data. By implementing the enterprise imaging strategy, more than 54 million medical images and approximately 1 million medical reports were integrated, and unified patient identification together with integrated images and reports was made available to the medical staff and accessible via a mobile application. However, to integrate all medical images of different specialties at a hospital and ensure that the images and reports are qualified for data mining, some further policy and management measures are still needed.

  15. Argument Strategies: Antidote to Tylenol's Poisoned Image.

    Science.gov (United States)

    Benoit, William L.; Lindsey, James J.

    1987-01-01

    Analyzes how the manufacturer dealt with the Tylenol poisonings: the link between Tylenol and the poisoning was denied, its image as a safe product was bolstered, capsules were differentiated from other products, and as a result, sales recovered. Extends the applicability of apologia as a way to analyze other media campaigns. (SKC)

  16. Imaging strategy in differentiated thyroid cancer

    NARCIS (Netherlands)

    Phan, Thi Thanh Ha

    2007-01-01

    This thesis focuses on clinical dilemmas, which the clinician faces in the management of patients with differentiated thyroid cancer (DTC) with a specific emphasis on the role of current and new diagnostic imaging. Thyroid cancer is a rare disease, but it is the most common endocrine malignancy of

  17. A Degree Distribution Optimization Algorithm for Image Transmission

    Science.gov (United States)

    Jiang, Wei; Yang, Junjie

    2016-09-01

    The Luby Transform (LT) code is the first practical implementation of a digital fountain code. The coding behavior of the LT code is mainly decided by the degree distribution, which determines the relationship between source data and codewords. Two degree distributions are suggested by Luby. They work well in typical situations but not optimally in the case of a finite number of encoding symbols. In this work, a degree distribution optimization algorithm is proposed to explore the potential of the LT code. First, a selection scheme of sparse degrees for LT codes is introduced. Then the probability distribution is optimized according to the selected degrees. In image transmission, the bit stream is sensitive to channel noise and even a single bit error may cause the loss of synchronization between the encoder and the decoder. Therefore the proposed algorithm is designed for the image transmission situation. Moreover, an optimal class partition is studied for image transmission with unequal error protection. The experimental results are quite promising. Compared with the LT code with the robust soliton distribution, the proposed algorithm clearly improves the final quality of recovered images with the same overhead.
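
    Since the abstract turns on the choice of degree distribution, the robust soliton distribution (one of the two distributions suggested by Luby) is reproduced below as the baseline the proposed optimization is compared against; the constants c and delta are typical textbook choices, not values from the paper.

```python
# Robust soliton degree distribution for an LT code with k input symbols (Luby's construction).
import numpy as np

def robust_soliton(k, c=0.1, delta=0.5):
    rho = np.zeros(k + 1)                        # ideal soliton part
    rho[1] = 1.0 / k
    d = np.arange(2, k + 1)
    rho[2:] = 1.0 / (d * (d - 1.0))
    S = c * np.log(k / delta) * np.sqrt(k)       # expected ripple size
    tau = np.zeros(k + 1)                        # robustness correction
    pivot = int(round(k / S))
    for i in range(1, min(pivot, k + 1)):
        tau[i] = S / (k * i)
    if 1 <= pivot <= k:
        tau[pivot] = S * np.log(S / delta) / k
    mu = rho + tau
    return mu[1:] / mu.sum()                     # probabilities for degrees 1..k

mu = robust_soliton(1000)
degrees = np.random.default_rng(0).choice(np.arange(1, 1001), size=5, p=mu)
print("sampled encoding degrees:", degrees)
```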

  18. An improved technique for the prediction of optimal image resolution ...

    African Journals Online (AJOL)

    2010-10-04

    Oct 4, 2010 ... Available online at http://www.academicjournals.org/AJEST ... robust technique for predicting optimal image resolution for the mapping of savannah ecosystems was developed. .... whether to purchase multi-spectral imagery acquired by GeoEye-2 ..... Analysis of the spectral behaviour of the pasture class in.

  19. An improved technique for the prediction of optimal image resolution ...

    African Journals Online (AJOL)

    Past studies to predict optimal image resolution required for generating spatial information for savannah ecosystems have yielded different outcomes, hence providing a knowledge gap that was investigated in the present study. The postulation, for the present study, was that by graphically solving two simultaneous ...

  20. Novel imaging strategies for upper gastrointestinal tract cancers

    DEFF Research Database (Denmark)

    Mortensen, Michael Bau

    2015-01-01

    Accurate pretherapeutic imaging is the cornerstone of all cancer treatment. Unfortunately, modern imaging modalities have several unsolved problems and limitations. The differentiation between inflammation and cancer infiltration, false positive and false negative findings, as well as a lack of confirming biopsies in suspected metastases may have serious negative consequences in cancer patients. This review describes some of these problems and challenges the use of conventional imaging by suggesting new combined strategies that include selective use of confirming biopsies and complementary methods...

  1. PET image reconstruction: mean, variance, and optimal minimax criterion

    International Nuclear Information System (INIS)

    Liu, Huafeng; Guo, Min; Gao, Fei; Shi, Pengcheng; Xue, Liying; Nie, Jing

    2015-01-01

    Given the noisy nature of positron emission tomography (PET) measurements, it is critical to know the image quality and reliability as well as the expected radioactivity map (mean image) for both qualitative interpretation and quantitative analysis. While existing efforts have often been devoted to providing only the reconstructed mean image, we present a unified framework for joint estimation of the mean and corresponding variance of the radioactivity map based on an efficient optimal min–max criterion. The proposed framework formulates the PET image reconstruction problem as a transformation from system uncertainties to estimation errors, where the minimax criterion is adopted to minimize the estimation errors with possibly maximized system uncertainties. The estimation errors, in the form of a covariance matrix, express the measurement uncertainties in a complete way. The framework is then optimized by ∞-norm optimization and solved with the corresponding H∞ filter. Unlike conventional statistical reconstruction algorithms that rely on statistical modeling of the measurement data or noise, the proposed joint estimation stands from the point of view of signal energies and can handle anything from imperfect statistical assumptions to no a priori statistical assumptions at all. The performance and accuracy of the reconstructed mean and variance images are validated using Monte Carlo simulations. Experiments on phantom scans with a small animal PET scanner and real patient scans are also conducted for assessment of clinical potential. (paper)

  2. Group search optimiser-based optimal bidding strategies with no Karush-Kuhn-Tucker optimality conditions

    Science.gov (United States)

    Yadav, Naresh Kumar; Kumar, Mukesh; Gupta, S. K.

    2017-03-01

    The general strategic bidding procedure has been formulated in the literature as a bi-level search problem, in which the offer curve tends to minimise the market clearing function and to maximise the profit. Computationally, this is complex and hence, researchers have adopted Karush-Kuhn-Tucker (KKT) optimality conditions to transform the model into a single-level maximisation problem. However, the profit maximisation problem with KKT optimality conditions poses a great challenge to classical optimisation algorithms. The problem has become more complex after the inclusion of transmission constraints. This paper simplifies the profit maximisation problem as a minimisation function, in which the transmission constraints, the operating limits and the ISO market clearing functions are considered with no KKT optimality conditions. The derived function is solved using the group search optimiser (GSO), a robust population-based optimisation algorithm. Experimental investigation is carried out on IEEE 14 as well as IEEE 30 bus systems, and the performance is compared against differential evolution-based strategic bidding, genetic algorithm-based strategic bidding and particle swarm optimisation-based strategic bidding methods. The simulation results demonstrate that the profit obtained through GSO-based bidding strategies is higher than with the other three methods.

  3. Undersampling strategies for compressed sensing accelerated MR spectroscopic imaging

    Science.gov (United States)

    Vidya Shankar, Rohini; Hu, Houchun Harry; Bikkamane Jayadev, Nutandev; Chang, John C.; Kodibagkar, Vikram D.

    2017-03-01

    Compressed sensing (CS) can accelerate magnetic resonance spectroscopic imaging (MRSI), facilitating its widespread clinical integration. The objective of this study was to assess the effect of different undersampling strategies on CS-MRSI reconstruction quality. Phantom data were acquired on a Philips 3 T Ingenia scanner. Four types of undersampling masks, one for each strategy, namely, low resolution, variable density, iterative design, and a priori, were simulated in Matlab and retrospectively applied to the test 1X MRSI data to generate undersampled datasets corresponding to the 2X-5X and 7X accelerations for each type of mask. Reconstruction parameters were kept the same in each case (all masks and accelerations) to ensure that any resulting differences can be attributed to the type of mask being employed. The reconstructed datasets from each mask were statistically compared with the reference 1X and assessed using metrics such as the root mean square error and metabolite ratios. Simulation results indicate that both the a priori and variable density undersampling masks maintain high fidelity with the 1X up to five-fold acceleration. The low resolution mask based reconstructions showed statistically significant differences from the 1X, with the reconstruction failing at 3X, while the iterative design reconstructions maintained fidelity with the 1X up to 4X acceleration. In summary, a pilot study was conducted to identify an optimal sampling mask in CS-MRSI. Simulation results demonstrate that the a priori and variable density masks can provide statistically similar results to the fully sampled reference. Future work would involve implementing these two masks prospectively on a clinical scanner.
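
    Of the four mask families compared, the variable-density one is straightforward to prototype. The sketch below draws phase-encode lines with a polynomial probability that decays away from the k-space centre; the decay power, fully sampled centre width and acceleration factor are arbitrary choices for illustration, not the study's actual masks.

```python
# Variable-density undersampling mask for one phase-encode dimension of a CS acquisition.
import numpy as np

def variable_density_mask(n_lines=64, accel=4, power=4, centre_fraction=0.08, seed=0):
    rng = np.random.default_rng(seed)
    k = np.linspace(-1.0, 1.0, n_lines)
    pdf = (1.0 - np.abs(k)) ** power                      # higher sampling probability near the centre
    centre = np.abs(k) <= centre_fraction
    pdf[centre] = pdf.max()                               # always favour the low-frequency core
    pdf /= pdf.sum()
    n_keep = max(int(centre.sum()), n_lines // accel)
    keep = rng.choice(n_lines, size=n_keep, replace=False, p=pdf)
    mask = np.zeros(n_lines, dtype=bool)
    mask[keep] = True
    mask[centre] = True                                   # enforce the fully sampled centre
    return mask

mask = variable_density_mask()
print(f"acceleration ~ {mask.size / mask.sum():.2f}x, sampled lines: {np.flatnonzero(mask)}")
```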

  4. Optimized multiple linear mappings for single image super-resolution

    Science.gov (United States)

    Zhang, Kaibing; Li, Jie; Xiong, Zenggang; Liu, Xiuping; Gao, Xinbo

    2017-12-01

    Learning piecewise linear regression has been recognized in the literature as an effective approach to example learning-based single image super-resolution (SR). In this paper, we employ an expectation-maximization (EM) algorithm to further improve the SR performance of our previous multiple linear mappings (MLM) based SR method. In the training stage, the proposed method starts with a set of linear regressors obtained by the MLM-based method, and then jointly optimizes the clustering results and the low- and high-resolution subdictionary pairs for the regression functions using the reconstruction error as the metric. In the test stage, we select the optimal regressor for SR reconstruction by accumulating the reconstruction errors of the m-nearest neighbors in the training set. Thorough experimental results carried out on six publicly available datasets demonstrate that the proposed SR method can yield high-quality images with finer details and sharper edges in terms of both quantitative and perceptual image quality assessments.
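
    The test-stage selection rule described above can be sketched as follows on synthetic data; the feature dimensions, the number of regressors and the linear mappings themselves are random placeholders rather than a trained MLM model.

```python
import numpy as np

rng = np.random.default_rng(1)
n_train, n_reg, d_lo, d_hi, m = 500, 8, 25, 100, 5

# Synthetic stand-ins: LR training features, their cluster labels and one
# linear mapping (HR = W @ LR) per cluster, mimicking an MLM-style model.
train_lo = rng.standard_normal((n_train, d_lo))
labels = rng.integers(0, n_reg, n_train)
mappings = 0.1 * rng.standard_normal((n_reg, d_hi, d_lo))
train_hi = np.einsum('nij,nj->ni', mappings[labels], train_lo)

# Precompute every regressor's reconstruction error on every training sample.
recon = np.einsum('rij,nj->rni', mappings, train_lo)         # (n_reg, n_train, d_hi)
errors = np.linalg.norm(recon - train_hi[None], axis=2)      # (n_reg, n_train)

def select_regressor(patch_lo):
    """Regressor with the smallest accumulated error over the m nearest neighbours."""
    idx = np.argsort(np.linalg.norm(train_lo - patch_lo, axis=1))[:m]
    return np.argmin(errors[:, idx].sum(axis=1))

patch = rng.standard_normal(d_lo)
r = select_regressor(patch)
print("chosen regressor:", r, "-> HR patch shape:", (mappings[r] @ patch).shape)
```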

  5. Image-Based Models Using Crowdsourcing Strategy

    Directory of Open Access Journals (Sweden)

    Antonia Spanò

    2016-12-01

    Full Text Available The conservation and valorization of Cultural Heritage require extensive documentation, both in properly historic-artistic terms and regarding the physical characteristics of position, shape, color, and geometry. With the use of digital photogrammetry, which allows the acquisition of overlapping images for 3D photo modeling, and with the development of dense and accurate 3D point models, it is possible to obtain high-resolution orthoprojections of surfaces. Recent years have seen a growing interest in crowdsourcing in the field of the protection and dissemination of cultural heritage; in parallel, there is an increasing awareness of the immense wealth of images available on the web, which can contribute to the generation of digital models useful for heritage documentation. In this way, the availability and ease of automation of SfM (Structure from Motion) algorithms enable the generation of digital models of the built heritage, which can be inserted positively into crowdsourcing processes. In fact, non-expert users can handle the technology in the acquisition process, which today is one of the fundamental points for involving the wider public in cultural heritage protection. To present the image-based models and their derivatives that can be made from such a rich digital resource, an emblematic case study was selected, an approach that is useful for little-known or not easily accessible heritage buildings. It is the Vank Cathedral in Isfahan, Iran: the availability of accurate point clouds and reliable orthophotos is very convenient, since the building of the Safavid epoch (cent. XVII-XVIII) is completely frescoed on its internal surfaces, in which the architecture and especially the architectural decoration reach their peak. The experimental part of the paper also explores some aspects of the usability of the digital output from the image-based modeling methods. The availability of orthophotos allows and facilitates the iconographic

  6. Exploring the Optimal Strategy to Predict Essential Genes in Microbes

    Directory of Open Access Journals (Sweden)

    Yao Lu

    2011-12-01

    Full Text Available Accurately predicting essential genes is important in many aspects of biology, medicine and bioengineering. In previous research, we have developed a machine learning based integrative algorithm to predict essential genes in bacterial species. This algorithm lends itself to two approaches for predicting essential genes: learning the traits from known essential genes in the target organism, or transferring essential gene annotations from a closely related model organism. However, for an understudied microbe, each approach has its potential limitations. The first is constricted by the often small number of known essential genes. The second is limited by the availability of model organisms and by evolutionary distance. In this study, we aim to determine the optimal strategy for predicting essential genes by examining four microbes with well-characterized essential genes. Our results suggest that, unless the known essential genes are few, learning from the known essential genes in the target organism usually outperforms transferring essential gene annotations from a related model organism. In fact, the required number of known essential genes is surprisingly small to make accurate predictions. In prokaryotes, when the number of known essential genes is greater than 2% of total genes, this approach already comes close to its optimal performance. In eukaryotes, achieving the same best performance requires over 4% of total genes, reflecting the increased complexity of eukaryotic organisms. Combining the two approaches resulted in an increased performance when the known essential genes are few. Our investigation thus provides key information on accurately predicting essential genes and will greatly facilitate annotations of microbial genomes.

  7. Optimizing strategies to improve interprofessional practice for veterans, part 1

    Directory of Open Access Journals (Sweden)

    Bhattacharya SB

    2014-04-01

    Full Text Available Shelley B Bhattacharya,1–3 Michelle I Rossi,1,2 Jennifer M Mentz1 1Geriatric Research Education and Clinical Center (GRECC), Veteran's Affairs Pittsburgh Healthcare System, 2University of Pittsburgh Medical Center, Pittsburgh, PA, USA; 3Albert Schweitzer Fellowship Program, Pittsburgh, PA, USA. Introduction: Interprofessional patient care is a well-recognized path that health care systems are striving toward. The Veteran's Affairs (VA) system initiated interprofessional practice (IPP) models with their Geriatric Evaluation and Management (GEM) programs. GEM programs incorporate a range of specialties, including, but not limited to, medicine, nursing, social work, physical therapy and pharmacy, to collaboratively evaluate veterans. Despite being a valuable resource, they are now faced with significant cut-backs, including closures. The primary goal of this project was to assess how the GEM model could be optimized at the Pittsburgh, Pennsylvania VA to allow for the sustainability of this important IPP assessment. Part 1 of the study evaluated the IPP process using program, patient, and family surveys. Part 2 examined how well the geriatrician matched patients to specialists in the GEM model. This paper describes Part 1 of our study. Methods: Three strategies were used: 1) a national GEM program survey; 2) a veteran/family satisfaction survey; and 3) an absentee assessment. Results: Twenty-six of 92 programs responded to the GEM IPP survey. Six strategies were shared to optimize IPP models throughout the country. Of the 34 satisfaction surveys, 80% stated the GEM clinic was beneficial, 79% stated their concerns were addressed, and 100% would recommend GEM to their friends. Of the 24 absentee assessments, the top three reasons for missing the appointments were transportation, medical illnesses, and not knowing/remembering about the appointment. The absentee rate diminished from 41% to 19% after instituting a reminder phone call policy. Discussion: Maintaining the

  8. Emergency strategy optimization for the environmental control system in manned spacecraft

    Science.gov (United States)

    Li, Guoxiang; Pang, Liping; Liu, Meng; Fang, Yufeng; Zhang, Helin

    2018-02-01

    It is very important for a manned environmental control system (ECS) to be able to reconfigure its operation strategy in emergency conditions. In this article, a multi-objective optimization is established to design the optimal emergency strategy for an ECS in an insufficient power supply condition. The maximum ECS lifetime and the minimum power consumption are chosen as the optimization objectives. Some adjustable key variables are chosen as the optimization variables, which finally represent the reconfigured emergency strategy. The non-dominated sorting genetic algorithm-II is adopted to solve this multi-objective optimization problem. Optimization processes are conducted at four different carbon dioxide partial pressure control levels. The study results show that the Pareto-optimal frontiers obtained from this multi-objective optimization can represent the relationship between the lifetime and the power consumption of the ECS. Hence, the preferred emergency operation strategy can be recommended for situations when there is suddenly insufficient power.
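
    For illustration, the non-dominated filtering at the heart of such a bi-objective search can be sketched as below for a hypothetical trade-off between ECS lifetime (to be maximised) and power consumption (to be minimised); the candidate strategies are random placeholders, not the ECS model of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
# Each row scores one candidate strategy as (lifetime [h], power [kW]); placeholders only.
candidates = np.column_stack([rng.uniform(10, 100, 50), rng.uniform(1, 5, 50)])

def pareto_front(points):
    """Keep points for which no other point has both longer lifetime and lower power."""
    keep = []
    for i, (life_i, pow_i) in enumerate(points):
        dominated = any(
            j != i and life_j >= life_i and pow_j <= pow_i
            and (life_j > life_i or pow_j < pow_i)
            for j, (life_j, pow_j) in enumerate(points)
        )
        if not dominated:
            keep.append(i)
    return points[keep]

front = pareto_front(candidates)
print(len(front), "non-dominated strategies out of", len(candidates))
```

    NSGA-II essentially repeats this sorting over successive generations (together with crowding-distance selection) to spread solutions along the Pareto frontier from which the preferred emergency strategy is picked.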

  9. Optimal Portfolios in Wishart Models and Effects of Discrete Rebalancing on Portfolio Distribution and Strategy Selection

    OpenAIRE

    Li, Zejing

    2012-01-01

    This dissertation is mainly devoted to the research of two problems - the continuous-time portfolio optimization in different Wishart models and the effects of discrete rebalancing on portfolio wealth distribution and optimal portfolio strategy.

  10. Verification and synthesis of optimal decision strategies for complex systems

    Energy Technology Data Exchange (ETDEWEB)

    Summers, S. J.

    2013-07-01

    that quantifies the probability of hitting a target set at some point during a finite time horizon, while avoiding an obstacle set during each time step preceding the target hitting time. In contrast with the general reach-avoid formulation, which assumes that the target and obstacle sets are constant and deterministic, we allow these sets to be both time-varying and probabilistic. An optimal reach-avoid control policy is derived as the solution to an optimal control problem via dynamic programming. A framework for analyzing probabilistic safety and reachability problems for discrete time stochastic hybrid systems in scenarios where system dynamics are affected by rational competing agents follows. We consider a zero sum game formulation of the probabilistic reach-avoid problem, in which the control objective is to maximize the probability of reaching a desired subset of the hybrid state space, while avoiding an unsafe set, subject to the worst case behavior of a rational adversary. Theoretical results are provided on a dynamic programming algorithm for computing the maximal reach-avoid probability under the worst-case adversary strategy, as well as the existence of a maxmin control policy that achieves this probability. Probabilistic Computation Tree Logic (PCTL) is a well-known modal logic that has become a standard for expressing temporal properties of finite state Markov chains in the context of automated model checking. Here we consider PCTL for non countable-space Markov chains, and we show that there is a substantial affinity between certain of its operators and problems of dynamic programming. We prove some basic properties of the solutions to the latter. The dissertation concludes with a collection of computational examples in the areas of ecology, robotics, aerospace, and finance. (author)

  11. Verification and synthesis of optimal decision strategies for complex systems

    International Nuclear Information System (INIS)

    Summers, S. J.

    2013-01-01

    that quantifies the probability of hitting a target set at some point during a finite time horizon, while avoiding an obstacle set during each time step preceding the target hitting time. In contrast with the general reach-avoid formulation, which assumes that the target and obstacle sets are constant and deterministic, we allow these sets to be both time-varying and probabilistic. An optimal reach-avoid control policy is derived as the solution to an optimal control problem via dynamic programming. A framework for analyzing probabilistic safety and reachability problems for discrete time stochastic hybrid systems in scenarios where system dynamics are affected by rational competing agents follows. We consider a zero sum game formulation of the probabilistic reach-avoid problem, in which the control objective is to maximize the probability of reaching a desired subset of the hybrid state space, while avoiding an unsafe set, subject to the worst case behavior of a rational adversary. Theoretical results are provided on a dynamic programming algorithm for computing the maximal reach-avoid probability under the worst-case adversary strategy, as well as the existence of a maxmin control policy that achieves this probability. Probabilistic Computation Tree Logic (PCTL) is a well-known modal logic that has become a standard for expressing temporal properties of finite state Markov chains in the context of automated model checking. Here we consider PCTL for non countable-space Markov chains, and we show that there is a substantial affinity between certain of its operators and problems of dynamic programming. We prove some basic properties of the solutions to the latter. The dissertation concludes with a collection of computational examples in the areas of ecology, robotics, aerospace, and finance. (author)

  12. Computing Optimal Stochastic Portfolio Execution Strategies: A Parametric Approach Using Simulations

    Science.gov (United States)

    Moazeni, Somayeh; Coleman, Thomas F.; Li, Yuying

    2010-09-01

    Computing optimal stochastic portfolio execution strategies under appropriate risk consideration presents a great computational challenge. We investigate a parametric approach for computing optimal stochastic strategies using Monte Carlo simulations. This approach allows a reduction in computational complexity by computing coefficients for a parametric representation of a stochastic dynamic strategy based on static optimization. Using this technique, constraints can similarly be handled using appropriate penalty functions. We illustrate the proposed approach by minimizing the expected execution cost and the Conditional Value-at-Risk (CVaR).
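
    A minimal sketch of the Monte Carlo evaluation step implied above, estimating expected cost and CVaR for a fixed parametric strategy; the one-parameter cost model is a random placeholder, not the execution-cost dynamics of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_costs(theta, n_paths=20000):
    # Placeholder: in the paper the cost comes from simulating price impact
    # along a parametric liquidation strategy; here it is a toy noisy function.
    noise = rng.standard_normal(n_paths)
    return 1.0 + 0.5 * theta**2 + (0.2 + 0.1 * abs(theta)) * noise

def cvar(samples, alpha=0.95):
    """Average of the worst (1 - alpha) fraction of simulated costs."""
    var = np.quantile(samples, alpha)
    return samples[samples >= var].mean()

for theta in (0.0, 0.5, 1.0):
    costs = simulate_costs(theta)
    print(f"theta={theta:.1f}  E[cost]={costs.mean():.3f}  CVaR95={cvar(costs):.3f}")
```

    Static optimization then searches over the strategy coefficients (theta here) using these simulated estimates, with constraints folded in through penalty terms.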

  13. Optimal Stochastic Advertising Strategies for the U.S. Beef Industry

    OpenAIRE

    Kun C. Lee; Stanley Schraufnagel; Earl O. Heady

    1982-01-01

    An important decision variable in the promotional strategy for the beef sector is the optimal level of advertising expenditures over time. Optimal stochastic and deterministic advertising expenditures are derived for the U.S. beef industry for the period 1966 through 1980. They are compared with historical levels, and the gains realized by optimal advertising strategies are measured. Finally, the optimal advertising expenditures in the future are forecasted.

  14. An optimization strategy for the control of small capacity heat pump integrated air-conditioning system

    International Nuclear Information System (INIS)

    Gao, Jiajia; Huang, Gongsheng; Xu, Xinhua

    2016-01-01

    Highlights: • An optimization strategy for a small-scale air-conditioning system is developed. • The optimization strategy aims at optimizing the overall system energy consumption. • The strategy may guarantee the robust control of the space air temperature. • The performance of the optimization strategy was tested on a simulation platform. - Abstract: This paper studies the optimization of a small-scale central air-conditioning system, in which the cooling is provided by a ground source heat pump (GSHP) equipped with an on/off capacity control. The optimization strategy aims to optimize the overall system energy consumption and simultaneously guarantee the robustness of the space air temperature control without violating the maximum allowed number of GSHP start-ups per hour specified by customers. The set-point of the chilled water return temperature and the width of the water temperature control band are used as the decision variables for the optimization. The performance of the proposed strategy was tested on a simulation platform. Results show that the optimization strategy can reduce energy consumption by 9.59% on a typical spring day and by 2.97% on a typical summer day. Meanwhile, it is able to enhance the space air temperature control robustness when compared with a basic control strategy without optimization.

  15. Noninfectious uveitis: strategies to optimize treatment compliance and adherence

    Directory of Open Access Journals (Sweden)

    Dolz-Marco R

    2015-08-01

    Full Text Available Rosa Dolz-Marco,1 Roberto Gallego-Pinazo,1 Manuel Díaz-Llopis,2 Emmett T Cunningham Jr,3–6 J Fernando Arévalo7,8 1Unit of Macula, Department of Ophthalmology, University and Polytechnic Hospital La Fe, 2Faculty of Medicine, University of Valencia, Spain; 3Department of Ophthalmology, California Pacific Medical Center, San Francisco, 4Department of Ophthalmology, Stanford University School of Medicine, Stanford, 5The Francis I Proctor Foundation, University of California San Francisco Medical Center, 6West Coast Retina Medical Group, San Francisco, CA, USA; 7Vitreoretina Division, King Khaled Eye Specialist Hospital, Riyadh, Saudi Arabia; 8Retina Division, Wilmer Eye Institute, Johns Hopkins University School of Medicine, Baltimore, MD, USA Abstract: Noninfectious uveitis includes a heterogenous group of sight-threatening ocular and systemic disorders. Significant progress has been made in the treatment of noninfectious uveitis in recent years, particularly with regard to the effective use of corticosteroids and non-corticosteroid immunosuppressive drugs, including biologic agents. All of these therapeutic approaches are limited, however, by any given patient’s ability to comply with and adhere to their prescribed treatment. In fact, compliance and adherence are among the most important patient-related determinants of treatment success. We discuss strategies to optimize compliance and adherence. Keywords: noninfectious uveitis, intraocular inflammation, immunosuppressive treatment, adherence, compliance, therapeutic failure

  16. Optimal breast cancer screening strategies for older women: current perspectives

    Directory of Open Access Journals (Sweden)

    Braithwaite D

    2016-02-01

    Full Text Available Dejana Braithwaite,1 Joshua Demb,1 Louise M Henderson2 1Department of Epidemiology and Biostatistics, University of California, San Francisco, CA, 2Department of Radiology, University of North Carolina, Chapel Hill, NC, USA Abstract: Breast cancer is a major cause of cancer-related deaths among older women, aged 65 years or older. Screening mammography has been shown to be effective in reducing breast cancer mortality in women aged 50–74 years but not among those aged 75 years or older. Given the large heterogeneity in comorbidity status and life expectancy among older women, controversy remains over screening mammography in this population. Diminished life expectancy with aging may decrease the potential screening benefit and increase the risk of harms. In this review, we summarize the evidence on screening mammography utilization, performance, and outcomes and highlight evidence gaps. Optimizing the screening strategy will involve separating older women who will benefit from screening from those who will not benefit by using information on comorbidity status and life expectancy. This review has identified areas related to screening mammography in older women that warrant additional research, including the need to evaluate emerging screening technologies, such as tomosynthesis among older women and precision cancer screening. In the absence of randomized controlled trials, the benefits and harms of continued screening mammography in older women need to be estimated using both population-based cohort data and simulation models. Keywords: aging, breast cancer, precision cancer screening

  17. An Optimal Investment Strategy for Insurers in Incomplete Markets

    Directory of Open Access Journals (Sweden)

    Mohamed Badaoui

    2018-04-01

    Full Text Available In this paper we consider the problem of an insurance company where the wealth of the insurer is described by a Cramér-Lundberg process. The insurer is allowed to invest in a risky asset with stochastic volatility subject to the influence of an economic factor, with the remaining surplus held in a bank account. The price of the risky asset and the economic factor are modeled by a system of correlated stochastic differential equations. In a finite horizon framework and assuming that the market is incomplete, we study the problem of maximizing the expected utility of terminal wealth. When the insurer's preferences are exponential, an existence and uniqueness theorem is proven for the non-linear Hamilton-Jacobi-Bellman (HJB) equation. The optimal strategy and the value function are produced in closed form. In addition, in order to show the connection between the insurer's decision and the correlation coefficient, we present two numerical approaches: a Monte-Carlo method based on the stochastic representation of the solution of the insurer problem via Feynman-Kac's formula, and a mixed Finite Difference Monte-Carlo one. Finally, the results are presented in the case of the Scott model.
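
    For reference, the classical Cramér-Lundberg surplus process and the exponential-utility objective referred to above can be written as follows; the notation is generic and not necessarily that of the paper.

```latex
R_t = x + ct - \sum_{i=1}^{N_t} Y_i, \qquad
U(w) = -\frac{1}{\eta}\, e^{-\eta w}, \qquad
V(t, x) = \sup_{\pi} \mathbb{E}\!\left[\, U\!\left(X_T^{\pi}\right) \,\middle|\, X_t^{\pi} = x \right],
```

    where x is the initial surplus, c the premium rate, N_t a Poisson claim-arrival process with claim sizes Y_i, eta > 0 the risk-aversion parameter, and X_T^pi the terminal wealth under investment strategy pi.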

  18. Optimizing individual iron deficiency prevention strategies in physiological pregnancy

    Directory of Open Access Journals (Sweden)

    Kramarskiy V.A.

    2018-04-01

    Full Text Available Sideropenia by the end of pregnancy takes place in all mothers without exception. Moreover, the selective administration of iron preparations, in contrast to the routine one, makes it possible to avoid hemochromatosis, the frequency of which in the general population ranges from 0.5 to 13%. The aim of the study was to optimize the individual strategy for the prevention of iron deficiency in physiological pregnancy. A prospective pre-experimental study was conducted, the inclusion criteria being the mother's extragenital and obstetrical pathology during the first half of pregnancy and a burdened obstetric and gynecological anamnesis. The study group of 98 women with a physiological pregnancy at 20 to 24 weeks of gestation was recruited by simple random selection. Serum ferritin, hemoglobin, and serum iron were used to estimate iron deficiency. In the latent stage of iron deficiency, against a background of monthly correction with Fenules® at a dose of 90 mg of elemental iron per day, there was a significant increase in ferritin and iron in the blood. In healthy mothers, during the gestational period of 20–24 weeks, a regular need arises for the replenishment of iron status, especially in the case of repeated pregnancy, which is successfully satisfied during a month of Fenules® intake at doses of 45 mg or 90 mg per day for a serum ferritin level of, respectively, 30 up to 70 μg/l or less than 30 μg/l.

  19. Optimization for PET imaging based on phantom study and NEC density

    International Nuclear Information System (INIS)

    Daisaki, Hiromitsu; Shimada, Naoki; Shinohara, Hiroyuki

    2012-01-01

    In consideration of the requirement for global standardization and quality control of PET imaging, the present studies give an outline of a phantom study to decide both scan and reconstruction parameters based on the FDG-PET/CT procedure guideline in Japan, followed by an optimization of scan duration based on NEC density. In the phantom study, scan and reconstruction parameters were decided by visual assessment and physical indexes (N10mm, NECphantom, QH,10mm/N10mm) so as to visualize a hot spot of 10 mm diameter with standardized uptake value (SUV) = 4 explicitly. Simultaneously, the Recovery Coefficient (RC) was evaluated to confirm that the PET images were sufficiently quantitative. Scan durations were then optimized by Body Mass Index (BMI) based on a retrospective analysis of NEC density. The correlation between the visual score in clinical FDG-PET images and NEC density fell after the optimization of scan duration. Both inter-institution and inter-patient variability were decreased by performing the phantom study based on the procedure guideline and by optimizing the scan duration based on NEC density, which finally seems useful for practicing highly precise examinations and promoting high-quality controlled studies. (author)
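
    For context, the standardized uptake value used to specify the phantom hot-spot contrast (SUV = 4) is conventionally defined as

```latex
\mathrm{SUV} = \frac{\text{activity concentration in the region}\ [\mathrm{kBq/mL}]}{\text{injected activity}\ [\mathrm{kBq}] \,/\, \text{body weight}\ [\mathrm{g}]}
```

    so that, in a phantom, a 10 mm sphere is typically filled to about four times the background activity concentration to emulate an SUV of 4 against an SUV-1 background.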

  20. Real-time implementation of optimized maximum noise fraction transform for feature extraction of hyperspectral images

    Science.gov (United States)

    Wu, Yuanfeng; Gao, Lianru; Zhang, Bing; Zhao, Haina; Li, Jun

    2014-01-01

    We present a parallel implementation of the optimized maximum noise fraction (G-OMNF) transform algorithm for feature extraction of hyperspectral images on commodity graphics processing units (GPUs). The proposed approach exploited the algorithm's data-level concurrency and optimized the computing flow. We first defined a three-dimensional grid, in which each thread calculates a sub-block of data, to easily facilitate the spatial and spectral neighborhood data searches in noise estimation, which is one of the most important steps involved in OMNF. Then, we optimized the processing flow and computed the noise covariance matrix before computing the image covariance matrix to reduce the original hyperspectral image data transmission. These optimization strategies can greatly improve the computing efficiency and can be applied to other feature extraction algorithms. The proposed parallel feature extraction algorithm was implemented on an Nvidia Tesla GPU using the compute unified device architecture and the basic linear algebra subroutines library. Through experiments on several real hyperspectral images, our GPU parallel implementation provides a significant speedup of the algorithm compared with the CPU implementation, especially for highly data-parallelizable and arithmetically intensive algorithm parts, such as noise estimation. In order to further evaluate the effectiveness of G-OMNF, we used two different applications, spectral unmixing and classification, for evaluation. Considering the sensor scanning rate and the data acquisition time, the proposed parallel implementation met the on-board real-time feature extraction requirement.
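
    As a plain-NumPy illustration of the noise-estimation and transform steps that dominate the computation (the paper implements them on the GPU), the noise covariance can be approximated from differences of spatially neighbouring pixels and combined with the image covariance in a generalized eigenproblem; the data cube below is a random placeholder.

```python
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
rows, cols, bands = 64, 64, 32
cube = rng.standard_normal((rows, cols, bands)).cumsum(axis=2)  # toy hyperspectral cube

X = cube.reshape(-1, bands)
X = X - X.mean(axis=0)

# Noise estimate: scaled differences between horizontally adjacent pixels.
diff = (cube[:, 1:, :] - cube[:, :-1, :]).reshape(-1, bands) / np.sqrt(2)
cov_noise = np.cov(diff, rowvar=False)
cov_image = np.cov(X, rowvar=False)

# MNF transform: generalized eigenvectors ordered by decreasing signal-to-noise ratio.
eigvals, eigvecs = eigh(cov_image, cov_noise)
order = np.argsort(eigvals)[::-1]
mnf_components = X @ eigvecs[:, order]
print("variance of the first MNF component:", mnf_components[:, 0].var())
```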

  1. GPU based Monte Carlo for PET image reconstruction: parameter optimization

    International Nuclear Information System (INIS)

    Cserkaszky, Á; Légrády, D.; Wirth, A.; Bükki, T.; Patay, G.

    2011-01-01

    This paper presents the optimization of a fully Monte Carlo (MC) based iterative image reconstruction of Positron Emission Tomography (PET) measurements. With our MC reconstruction method, all the physical effects in a PET system are taken into account, thus superior image quality is achieved in exchange for increased computational effort. The method is feasible because we utilize the enormous processing power of Graphical Processing Units (GPUs) to solve the inherently parallel problem of photon transport. The MC approach regards the simulated positron decays as samples in the mathematical sums required in the iterative reconstruction algorithm, so to complement the fast architecture, our optimization work focuses on the number of simulated positron decays required to obtain sufficient image quality. We have achieved significant results in determining the optimal number of samples for arbitrary measurement data, which allows the best image quality to be achieved with the least possible computational effort. Based on this research, recommendations can be given for effectively partitioning the computational effort across the iterations in limited-time reconstructions. (author)

  2. Optimization of an Image-Based Talking Head System

    Directory of Open Access Journals (Sweden)

    Kang Liu

    2009-01-01

    Full Text Available This paper presents an image-based talking head system, which includes two parts: analysis and synthesis. The audiovisual analysis part creates a face model of a recorded human subject, which is composed of a personalized 3D mask as well as a large database of mouth images and their related information. The synthesis part generates natural looking facial animations from phonetic transcripts of text. A critical issue of the synthesis is the unit selection which selects and concatenates these appropriate mouth images from the database such that they match the spoken words of the talking head. Selection is based on lip synchronization and the similarity of consecutive images. The unit selection is refined in this paper, and Pareto optimization is used to train the unit selection. Experimental results of subjective tests show that most people cannot distinguish our facial animations from real videos.

  3. Optimization of the alpha image reconstruction. An iterative CT-image reconstruction with well-defined image quality metrics

    International Nuclear Information System (INIS)

    Lebedev, Sergej; Sawall, Stefan; Knaup, Michael; Kachelriess, Marc

    2017-01-01

    Optimization of the AIR algorithm for improved convergence and performance. The AIR method is an iterative algorithm for CT image reconstruction. As a result of its linearity with respect to the basis images, the AIR algorithm possesses well-defined, regular image quality metrics, e.g. point spread function (PSF) or modulation transfer function (MTF), unlike other iterative reconstruction algorithms. The AIR algorithm computes weighting images α to blend between a set of basis images that preferably have mutually exclusive properties, e.g. high spatial resolution or low noise. The optimized algorithm uses an approach that alternates between the optimization of rawdata fidelity using an OSSART-like update and regularization using gradient descent, as opposed to the initially proposed AIR using a straightforward gradient descent implementation. A regularization strength for a given task is chosen by formulating a requirement for the noise reduction and checking whether it is fulfilled for different regularization strengths, while monitoring the spatial resolution using the voxel-wise defined modulation transfer function for the AIR image. The optimized algorithm computes similar images in a shorter time compared to the initial gradient descent implementation of AIR. The result can be influenced by multiple parameters that can be narrowed down to a relatively simple framework to compute high quality images. The AIR images, for instance, can have at least a 50% lower noise level compared to the sharpest basis image, while the spatial resolution is mostly maintained. The optimization improves performance by a factor of 6, while maintaining image quality. Furthermore, it was demonstrated that the spatial resolution for AIR can be determined using regular image quality metrics, given smooth weighting images. This is not possible for other iterative reconstructions as a result of their non-linearity. A simple set of parameters for the algorithm is discussed that provides
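
    As a toy sketch of the central idea only (not the optimized algorithm), the AIR image is a voxel-wise blend of basis images, and the weighting image can be updated by alternating a data-fidelity step with a gradient-descent smoothing step; the basis images, the "measured" values and all step sizes below are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (64, 64)

# Two synthetic basis reconstructions: sharp/noisy and smooth/low-noise.
sharp = rng.normal(100, 20, shape)
smooth = rng.normal(100, 5, shape)
measured = np.full(shape, 100.0)        # stand-in for rawdata-consistent voxel values

alpha = np.full(shape, 0.5)             # weighting image to be optimized

for _ in range(300):
    blend = alpha * sharp + (1 - alpha) * smooth   # AIR image, linear in the basis images
    # Toy data-fidelity step (surrogate for the OSSART-like rawdata update).
    alpha = np.clip(alpha - 2e-4 * (blend - measured) * (sharp - smooth), 0, 1)
    # Regularization step: gradient descent on a smoothness penalty for alpha.
    lap = (np.roll(alpha, 1, 0) + np.roll(alpha, -1, 0)
           + np.roll(alpha, 1, 1) + np.roll(alpha, -1, 1) - 4 * alpha)
    alpha = np.clip(alpha + 0.05 * lap, 0, 1)

air = alpha * sharp + (1 - alpha) * smooth
print("std: sharp %.1f, smooth %.1f, AIR %.1f" % (sharp.std(), smooth.std(), air.std()))
```

    The point is simply that, for a fixed weighting image, the result is linear in the basis images, which is why PSF- and MTF-style metrics remain well defined for AIR.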

  4. Optimization of the alpha image reconstruction. An iterative CT-image reconstruction with well-defined image quality metrics

    Energy Technology Data Exchange (ETDEWEB)

    Lebedev, Sergej; Sawall, Stefan; Knaup, Michael; Kachelriess, Marc [German Cancer Research Center, Heidelberg (Germany).

    2017-10-01

    Optimization of the AIR algorithm for improved convergence and performance. The AIR method is an iterative algorithm for CT image reconstruction. As a result of its linearity with respect to the basis images, the AIR algorithm possesses well-defined, regular image quality metrics, e.g. point spread function (PSF) or modulation transfer function (MTF), unlike other iterative reconstruction algorithms. The AIR algorithm computes weighting images α to blend between a set of basis images that preferably have mutually exclusive properties, e.g. high spatial resolution or low noise. The optimized algorithm uses an approach that alternates between the optimization of rawdata fidelity using an OSSART-like update and regularization using gradient descent, as opposed to the initially proposed AIR using a straightforward gradient descent implementation. A regularization strength for a given task is chosen by formulating a requirement for the noise reduction and checking whether it is fulfilled for different regularization strengths, while monitoring the spatial resolution using the voxel-wise defined modulation transfer function for the AIR image. The optimized algorithm computes similar images in a shorter time compared to the initial gradient descent implementation of AIR. The result can be influenced by multiple parameters that can be narrowed down to a relatively simple framework to compute high quality images. The AIR images, for instance, can have at least a 50% lower noise level compared to the sharpest basis image, while the spatial resolution is mostly maintained. The optimization improves performance by a factor of 6, while maintaining image quality. Furthermore, it was demonstrated that the spatial resolution for AIR can be determined using regular image quality metrics, given smooth weighting images. This is not possible for other iterative reconstructions as a result of their non-linearity. A simple set of parameters for the algorithm is discussed that provides

  5. An optimal big data workflow for biomedical image analysis

    Directory of Open Access Journals (Sweden)

    Aurelle Tchagna Kouanou

    Full Text Available Background and objective: In the medical field, data volume is growing rapidly, and traditional methods cannot manage it efficiently. In biomedical computation, the continuous challenges are the management, analysis, and storage of biomedical data. Nowadays, big data technology plays a significant role in the management, organization, and analysis of data, using machine learning and artificial intelligence techniques. It also allows quick access to data using NoSQL databases. Thus, big data technologies include new frameworks to process medical data such as biomedical images. It becomes very important to develop methods and/or architectures based on big data technologies for a complete processing of biomedical image data. Method: This paper describes big data analytics for biomedical images, shows examples reported in the literature, briefly discusses new methods used in processing, and offers conclusions. We argue for adapting and extending related work methods in the field of big data software, using the Hadoop and Spark frameworks. These provide an optimal and efficient architecture for biomedical image analysis. This paper thus gives a broad overview of big data analytics to automate biomedical image diagnosis. A workflow with optimal methods and algorithms for each step is proposed. Results: Two architectures for image classification are suggested. We use the Hadoop framework to design the first, and the Spark framework for the second. The proposed Spark architecture allows us to develop appropriate and efficient methods to leverage a large number of images for classification, which can be customized with respect to each other. Conclusions: The proposed architectures are more complete, easier to use, and adaptable at all of the steps from conception. The obtained Spark architecture is the most complete, because it facilitates the implementation of algorithms with its embedded libraries. Keywords: Biomedical images, Big

  6. Comparison of various online IGRT strategies: The benefits of online treatment plan re-optimization

    International Nuclear Information System (INIS)

    Schulze, Derek; Liang, Jian; Yan, Di; Zhang Tiezhi

    2009-01-01

    Purpose: To compare the dosimetric differences of various online IGRT strategies and to predict potential benefits of online re-optimization techniques in prostate cancer radiation treatments. Materials and methods: Nine prostate patients were recruited in this study. Each patient had one treatment-planning CT image and 10 treatment-day CT images. Five different online IGRT strategies were evaluated, which include 3D conformal with bone alignment, 3D conformal re-planning via aperture changes, intensity modulated radiation treatment (IMRT) with bone alignment, IMRT with target alignment, and IMRT daily re-optimization. Treatment planning and virtual treatment delivery were performed. The delivered doses were obtained using in-house deformable dose mapping software. The results were analyzed using the equivalent uniform dose (EUD). Results: With the same margin, rectum and bladder doses in IMRT plans were about 10% and 5% less than those in CRT plans, respectively. Rectum and bladder doses were reduced by as much as 20% if the motion margin is reduced by 1 cm. IMRT is more sensitive to organ motion: large discrepancies were observed between the bladder and rectum doses predicted by the treatment plan and the actual delivered doses. The therapeutic ratio can be improved by 14% and 25% for rectum and bladder, respectively, if IMRT online re-planning is employed compared to the IMRT bone alignment approach. The improvement of the target alignment approach is similar, with 11% and 21% dose reductions to rectum and bladder, respectively. However, underdosing of the seminal vesicles was observed in certain patients. Conclusions: Online treatment plan re-optimization may significantly improve the therapeutic ratio in prostate cancer treatments, mostly due to the reduction of the PTV margin. However, for low-risk patients with only the prostate involved, online target alignment IMRT treatment would achieve similar results to online re-planning. For all IGRT approaches, the delivered organ-at-risk doses may be
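
    For reference, the (generalized) equivalent uniform dose used to score the plans is commonly computed from the dose-volume distribution as

```latex
\mathrm{EUD} = \left( \sum_{i} v_i \, D_i^{\,a} \right)^{1/a}
```

    where v_i is the fractional organ volume receiving dose D_i and a is a tissue-specific parameter (large and positive for serial organs such as the rectum, negative for targets); the specific a values used in this study are not given in the abstract.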

  7. Acoustic-noise-optimized diffusion-weighted imaging.

    Science.gov (United States)

    Ott, Martin; Blaimer, Martin; Grodzki, David M; Breuer, Felix A; Roesch, Julie; Dörfler, Arnd; Heismann, Björn; Jakob, Peter M

    2015-12-01

    This work was aimed at reducing acoustic noise in diffusion-weighted MR imaging (DWI) that might reach acoustic noise levels of over 100 dB(A) in clinical practice. A diffusion-weighted readout-segmented echo-planar imaging (EPI) sequence was optimized for acoustic noise by utilizing small readout segment widths to obtain low gradient slew rates and amplitudes instead of faster k-space coverage. In addition, all other gradients were optimized for low slew rates. Volunteer and patient imaging experiments were conducted to demonstrate the feasibility of the method. Acoustic noise measurements were performed and analyzed for four different DWI measurement protocols at 1.5T and 3T. An acoustic noise reduction of up to 20 dB(A) was achieved, which corresponds to a fourfold reduction in acoustic perception. The image quality was preserved at the level of a standard single-shot (ss)-EPI sequence, with a 27-54% increase in scan time. The diffusion-weighted imaging technique proposed in this study allowed a substantial reduction in the level of acoustic noise compared to standard single-shot diffusion-weighted EPI. This is expected to afford considerably more patient comfort, but a larger study would be necessary to fully characterize the subjective changes in patient experience.

  8. Spectrally optimal illuminations for diabetic retinopathy detection in retinal imaging

    Science.gov (United States)

    Bartczak, Piotr; Fält, Pauli; Penttinen, Niko; Ylitepsa, Pasi; Laaksonen, Lauri; Lensu, Lasse; Hauta-Kasari, Markku; Uusitalo, Hannu

    2017-04-01

    Retinal photography is a standard method for recording retinal diseases for subsequent analysis and diagnosis. However, the currently used white light or red-free retinal imaging does not necessarily provide the best possible visibility of different types of retinal lesions, important when developing diagnostic tools for handheld devices, such as smartphones. Using specifically designed illumination, the visibility and contrast of retinal lesions could be improved. In this study, spectrally optimal illuminations for diabetic retinopathy lesion visualization are implemented using a spectrally tunable light source based on digital micromirror device. The applicability of this method was tested in vivo by taking retinal monochrome images from the eyes of five diabetic volunteers and two non-diabetic control subjects. For comparison to existing methods, we evaluated the contrast of retinal images taken with our method and red-free illumination. The preliminary results show that the use of optimal illuminations improved the contrast of diabetic lesions in retinal images by 30-70%, compared to the traditional red-free illumination imaging.

  9. Polymerase chain reaction: basic protocol plus troubleshooting and optimization strategies.

    Science.gov (United States)

    Lorenz, Todd C

    2012-05-22

    In the biological sciences there have been technological advances that catapult the discipline into golden ages of discovery. For example, the field of microbiology was transformed with the advent of Anton van Leeuwenhoek's microscope, which allowed scientists to visualize prokaryotes for the first time. The development of the polymerase chain reaction (PCR) is one of those innovations that changed the course of molecular science, with its impact spanning countless subdisciplines in biology. The theoretical process was outlined by Kleppe and coworkers in 1971; however, it was another 14 years until the complete PCR procedure was described and experimentally applied by Kary Mullis while at Cetus Corporation in 1985. Automation and refinement of this technique progressed with the introduction of a thermostable DNA polymerase from the bacterium Thermus aquaticus, hence the name Taq DNA polymerase. PCR is a powerful amplification technique that can generate an ample supply of a specific segment of DNA (i.e., an amplicon) from only a small amount of starting material (i.e., DNA template or target sequence). While straightforward and generally trouble-free, there are pitfalls that complicate the reaction, producing spurious results. When PCR fails, it can lead to many non-specific DNA products of varying sizes that appear as a ladder or smear of bands on agarose gels. Sometimes no products form at all. Another potential problem occurs when mutations are unintentionally introduced in the amplicons, resulting in a heterogeneous population of PCR products. PCR failures can become frustrating unless patience and careful troubleshooting are employed to sort out and solve the problem(s). This protocol outlines the basic principles of PCR, provides a methodology that will result in amplification of most target sequences, and presents strategies for optimizing a reaction. By following this PCR guide, students should be able to: • Set up reactions and thermal cycling

  10. Software optimization for electrical conductivity imaging in polycrystalline diamond cutters

    Energy Technology Data Exchange (ETDEWEB)

    Bogdanov, G.; Ludwig, R. [Department of Electrical and Computer Engineering, Worcester Polytechnic Institute, 100 Institute Rd, Worcester, MA 01609 (United States); Wiggins, J.; Bertagnolli, K. [US Synthetic, 1260 South 1600 West, Orem, UT 84058 (United States)

    2014-02-18

    We previously reported on an electrical conductivity imaging instrument developed for measurements on polycrystalline diamond cutters. These cylindrical cutters for oil and gas drilling feature a thick polycrystalline diamond layer on a tungsten carbide substrate. The instrument uses electrical impedance tomography to profile the conductivity in the diamond table. Conductivity images must be acquired quickly, on the order of 5 sec per cutter, to be useful in the manufacturing process. This paper reports on successful efforts to optimize the conductivity reconstruction routine, porting major portions of it to NVIDIA GPUs, including a custom CUDA kernel for Jacobian computation.

  11. Towards optimized naphthalocyanines as sonochromes for photoacoustic imaging in vivo

    Directory of Open Access Journals (Sweden)

    Mitchell J. Duffy

    2018-03-01

    Full Text Available In this paper we establish a methodology to predict photoacoustic imaging capabilities from the structure of absorber molecules (sonochromes). The comparative in vitro and in vivo screening of naphthalocyanines and cyanine dyes has shown a substitution-pattern-dependent shift in photoacoustic excitation wavelength, with distal substitution producing the preferred maximum around 800 nm. Changing the central ion showed variable production of photoacoustic signals, as well as singlet oxygen photoproduction and fluorescence, with the optimum for photoacoustic imaging being nickel(II). Our approach paves the way for the design, evaluation and realization of optimized sonochromes as photoacoustic contrast agents. Keywords: Naphthalocyanines, Spectroscopy

  12. Otsu Based Optimal Multilevel Image Thresholding Using Firefly Algorithm

    Directory of Open Access Journals (Sweden)

    N. Sri Madhava Raja

    2014-01-01

    Full Text Available A histogram-based multilevel thresholding approach is proposed using a Brownian distribution (BD) guided firefly algorithm (FA). A bounded search technique is also presented to improve the optimization accuracy with fewer search iterations. Otsu's between-class variance function is maximized to obtain optimal threshold levels for gray-scale images. The performance of the proposed algorithm is demonstrated on twelve benchmark images and compared with existing FA variants such as Lévy flight (LF) guided FA and random operator guided FA. The performance assessment comparison between the proposed and existing firefly algorithms is carried out using prevailing parameters such as the objective function, standard deviation, peak signal-to-noise ratio (PSNR), structural similarity (SSIM) index, and CPU search time. The results show that BD guided FA provides a better objective function, PSNR, and SSIM, whereas LF based FA provides faster convergence with relatively lower CPU time.
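
    For context, the objective being maximised is Otsu's between-class variance; a brute-force two-threshold version (exhaustive search standing in for the firefly-driven search, for illustration only) can be written as:

```python
import numpy as np

def between_class_variance(hist, thresholds):
    """Otsu's between-class variance for a histogram and sorted threshold levels."""
    p = hist / hist.sum()
    levels = np.arange(len(hist))
    mu_total = (p * levels).sum()
    edges = [0, *thresholds, len(hist)]
    var = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        w = p[lo:hi].sum()
        if w > 0:
            mu = (p[lo:hi] * levels[lo:hi]).sum() / w
            var += w * (mu - mu_total) ** 2
    return var

rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(60, 10, 3000), rng.normal(130, 12, 3000),
                      rng.normal(200, 8, 3000)]).clip(0, 255).astype(int)
hist = np.bincount(img, minlength=256)

# Exhaustive search over two thresholds; FA/PSO-style optimizers replace this loop.
best = max(((t1, t2) for t1 in range(1, 255) for t2 in range(t1 + 1, 256)),
           key=lambda t: between_class_variance(hist, t))
print("optimal thresholds:", best)
```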

  13. Strategies towards an optimized use of the shallow geothermal potential

    Science.gov (United States)

    Schelenz, S.; Firmbach, L.; Kalbacher, T.; Goerke, U.; Kolditz, O.; Dietrich, P.; Vienken, T.

    2013-12-01

    Thermal use of the shallow subsurface for heat generation, cooling and thermal energy storage is gaining importance in the reconsideration of future energy supplies, e.g. in the course of the German energy transition, with application shifting from isolated to intensive use. The planning and dimensioning of (geo-)thermal applications is strongly influenced by the availability of exploration data. Hence, reliable site-specific dimensioning of systems for the thermal use of the shallow subsurface will contribute to an increase in resource efficiency, cost reduction during installation and operation, as well as a reduction of environmental impacts and the prevention of resource over-exploitation. Despite the large cumulative investments being made in the utilization of the shallow thermal potential, thermal energy is in many cases exploited without prior on-site exploration and investigation of the local geothermal potential, due to the lack of adequate and cost-efficient exploration techniques. We will present new strategies for an optimized utilization of the urban thermal potential, showcased at a currently developed residential neighborhood with high demand for shallow geothermal applications, based on a) enhanced site characterization and b) simulation of different site-specific application scenarios. For enhanced site characterization, surface geophysics and vertical high-resolution direct-push profiling were combined for reliable determination of the aquifer structure and aquifer parameterization. Based on the site characterization, different site-specific geothermal application scenarios, including different system types and system configurations, were simulated using OpenGeoSys to guarantee an environmentally and economically sustainable thermal use of the shallow subsurface.

  14. Bacterial Quorum Sensing Stabilizes Cooperation by Optimizing Growth Strategies.

    Science.gov (United States)

    Bruger, Eric L; Waters, Christopher M

    2016-11-15

    Communication has been suggested as a mechanism to stabilize cooperation. In bacteria, chemical communication, termed quorum sensing (QS), has been hypothesized to fill this role, and extracellular public goods are often induced by QS at high cell densities. Here we show, with the bacterium Vibrio harveyi, that QS provides strong resistance against invasion of a QS defector strain by maximizing the cellular growth rate at low cell densities while achieving maximum productivity through protease upregulation at high cell densities. In contrast, QS mutants that act as defectors or unconditional cooperators maximize either the growth rate or the growth yield, respectively, and thus are less fit than the wild-type QS strain. Our findings provide experimental evidence that regulation mediated by microbial communication can optimize growth strategies and stabilize cooperative phenotypes by preventing defector invasion, even under well-mixed conditions. This effect is due to a combination of responsiveness to environmental conditions provided by QS, lowering of competitive costs when QS is not induced, and pleiotropic constraints imposed on defectors that do not perform QS. Cooperation is a fundamental problem for evolutionary biology to explain. Conditional participation through phenotypic plasticity driven by communication is a potential solution to this dilemma. Thus, among bacteria, QS has been proposed to be a proximate stabilizing mechanism for cooperative behaviors. Here, we empirically demonstrate that QS in V. harveyi prevents cheating and subsequent invasion by nonproducing defectors by maximizing the growth rate at low cell densities and the growth yield at high cell densities, whereas an unconditional cooperator is rapidly driven to extinction by defectors. Our findings provide experimental evidence that QS regulation prevents the invasion of cooperative populations by QS defectors even under unstructured conditions, and they strongly support the role of

  15. Selecting optimal monochromatic level with spectral CT imaging for improving imaging quality in hepatic venography

    International Nuclear Information System (INIS)

    Sun Jun; Luo Xianfu; Wang Shou'an; Wang Jun; Sun Jiquan; Wang Zhijun; Wu Jingtao

    2013-01-01

    Objective: To investigate the effect of spectral CT monochromatic images on improving image quality in hepatic venography. Methods: Thirty patients underwent spectral CT examination on a GE Discovery CT 750 HD scanner. During the portal phase, 1.25 mm slice thickness polychromatic images and optimal monochromatic images were obtained, and volume rendering and maximum intensity projection images were created to show the hepatic veins, respectively. The overall image quality was evaluated on a five-point scale by two radiologists. Inter-observer agreement in subjective image quality grading was assessed by Kappa statistics. Paired-sample t tests were used to compare hepatic vein attenuation, hepatic parenchyma attenuation, the CT value difference between the hepatic vein and the liver parenchyma, image noise, vein-to-liver contrast-to-noise ratio (CNR), and the image quality score of hepatic venography between the two image data sets. Results: The monochromatic images at 50 keV were found to demonstrate the best CNR for the hepatic vein. The hepatic vein attenuation [(329 ± 47) HU], hepatic parenchyma attenuation [(178 ± 33) HU], CT value difference between the hepatic vein and the liver parenchyma [(151 ± 33) HU], image noise (17.33 ± 4.18), CNR (9.13 ± 2.65), and image quality score (4.2 ± 0.6) of the optimal monochromatic images were significantly higher than those of the polychromatic images [(149 ± 18) HU], [(107 ± 14) HU], [(43 ± 11) HU], 12.55 ± 3.02, 3.53 ± 1.03, 3.1 ± 0.8 (t values were 24.79, 13.95, 18.85, 9.07, 13.25 and 12.04, respectively, P < 0.01). In the comparison of image quality, the Kappa value was 0.81 for the optimal monochromatic images and 0.69 for the polychromatic images. Conclusion: Monochromatic images from spectral CT could improve the CNR for displaying the hepatic vein and improve the image quality compared to conventional polychromatic images. (authors)
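
    For reference, the vein-to-liver contrast-to-noise ratio reported above follows the usual definition

```latex
\mathrm{CNR} = \frac{\mathrm{HU}_{\mathrm{vein}} - \mathrm{HU}_{\mathrm{liver}}}{\sigma_{\mathrm{noise}}}
```

    which is broadly consistent with the reported group means (about 151/17.33 ≈ 8.7 for the 50 keV images versus the stated 9.13, the difference reflecting per-patient averaging).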

  16. What marketing strategy for destinations with a negative image?

    OpenAIRE

    Seraphin, Hugues; Gowreesunkar, Vanessa; Hugues Seraphin

    2017-01-01

    Purpose: This concluding article filters out meaningful marketing strategies that aim at re-positioning and re-establishing struggling tourism destinations with a negative image. Drawing from a collection of case studies around the world, the article provides evidence from post-colonial, post-conflict and post-disaster destinations to finally anchor the overall conclusion of the theme issue. Design: The article summarizes key issues faced by destinations plagued with a negative image...

  17. Optimal context quantization in lossless compression of image data sequences

    DEFF Research Database (Denmark)

    Forchhammer, Søren; Wu, X.; Andersen, Jakob Dahl

    2004-01-01

    In image compression, context-based entropy coding is commonly used. A critical issue for the performance of context-based image coding is how to resolve the conflict between the desire for large templates to model high-order statistical dependency of the pixels and the problem of context dilution due to insufficient sample statistics of a given input image. We consider the problem of finding the optimal quantizer Q that quantizes the K-dimensional causal context C_t = (X_{t-t1}, X_{t-t2}, ..., X_{t-tK}) of a source symbol X_t into one of a set of conditioning states. The optimality of context quantization is defined as the minimum static or minimum adaptive code length for a given data set. For a binary source alphabet, an optimal context quantizer can be computed exactly by a fast dynamic programming algorithm. Faster approximation solutions are also proposed. In the case of an m-ary source alphabet...

  18. 3-D brain image registration using optimal morphological processing

    International Nuclear Information System (INIS)

    Loncaric, S.; Dhawan, A.P.

    1994-01-01

    The three-dimensional (3-D) registration of Magnetic Resonance (MR) and Positron Emission Tomographic (PET) images of the brain is important for the analysis of the human brain and its diseases. A procedure for the optimization of 3-D morphological structuring elements, based on a genetic algorithm, is presented in the paper. The registration of the MR and PET images is done by means of a registration procedure with two major phases. In the first phase, the Iterative Principal Axis Transform (IPAR) is used for initial registration. In the second phase, the optimal shape description method based on the Morphological Signature Transform (MST) is used for final registration. The morphological processing is used to improve the accuracy of the basic IPAR method. The brain ventricle is used as a landmark for MST registration. A near-optimal structuring element obtained by means of a genetic algorithm is used in the MST to describe the shape of the ventricle. The method has been tested on a set of brain images, demonstrating the feasibility of the approach. (author). 11 refs., 3 figs
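
    As a minimal sketch of the principal-axis idea behind IPAR (omitting the iterative refinement and all morphological processing), the centroid and principal axes of a binary volume follow from the eigen-decomposition of its second-moment matrix; the ellipsoidal test volume below is synthetic.

```python
import numpy as np

def principal_axes(mask):
    """Centroid and principal axes (rows, longest first) of a 3-D binary mask."""
    coords = np.argwhere(mask).astype(float)
    centroid = coords.mean(axis=0)
    centered = coords - centroid
    cov = centered.T @ centered / len(coords)
    eigvals, eigvecs = np.linalg.eigh(cov)
    return centroid, eigvecs[:, ::-1].T        # order axes by decreasing extent

vol = np.zeros((40, 40, 40), dtype=bool)
z, y, x = np.mgrid[:40, :40, :40]
vol[((z - 20) / 15) ** 2 + ((y - 20) / 10) ** 2 + ((x - 20) / 6) ** 2 <= 1] = True

centroid, axes = principal_axes(vol)
print("centroid:", np.round(centroid, 1))
print("longest axis direction (z, y, x):", np.round(axes[0], 2))
```

    Aligning the centroids and rotating one volume's axes onto the other's gives the kind of initial registration that the MST stage then refines.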

  19. Optimization of image quality and patient dose in mammography

    International Nuclear Information System (INIS)

    Shafqat Faaruq; Jaferi, R.A.; Nafeesa Nazlee

    2007-01-01

    Complete text of publication follows. Optimization of patient dose and image quality can be defined as obtaining the best image quality with the minimum possible radiation dose to the patient by setting the various parameters and modes of operation available in mammography machines. The optimization procedures were performed on two mammography units, from M/S GE and Metaltronica, available at NORI, using a standard mammographic accreditation phantom (Model: BR-156) and acrylic sheets of variable thicknesses. Quality assurance and quality control (QC) tests are an essential part of optimization. The QC tests, as recommended by the American College of Radiology (ACR), were first performed on both machines as well as on the X-ray film processor. In the second step, different parameters affecting the image quality and radiation dose to the patient, such as film-screen combination (FSC), phantom optical density (PD), kVp, mAs etc., were adjusted for various phantom thicknesses ranging from 3 cm to 6.5 cm in the various modes of operation of the machines (semi-auto and manual in the GE unit; auto, semi-auto and manual in the Metaltronica unit). The image quality was studied for these optimized parameters on the basis of the number of test objects of the phantom visible in the images. Finally, the linear relationship between mAs and skin entrance dose (mGy) was verified using an ionization chamber with the phantom and with actual patients. Despite some practical limitations, the results of the quality assurance tests were within the acceptable limits defined by the ACR. The dose factor was 68.0 μGy/mAs for the GE unit and 76.0 mGy/mAs for the Metaltronica unit at 25 kVp. Before the start of this study only one mammography unit, the GE, was routinely used at NORI, and its normal mode of operation was the semi-auto mode with fixed kVp independent of compressed breast thickness; in this study it was concluded that selecting kVp according to breast thickness results in an appreciable dose reduction (4-5 times less) without any compromise in image quality. The

  20. Automated bone segmentation from dental CBCT images using patch-based sparse representation and convex optimization

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Li; Gao, Yaozong; Shi, Feng; Liao, Shu; Li, Gang [Department of Radiology and BRIC, University of North Carolina at Chapel Hill, North Carolina 27599 (United States); Chen, Ken Chung [Department of Oral and Maxillofacial Surgery, Houston Methodist Hospital Research Institute, Houston, Texas 77030 and Department of Stomatology, National Cheng Kung University Medical College and Hospital, Tainan, Taiwan 70403 (China); Shen, Steve G. F.; Yan, Jin [Department of Oral and Craniomaxillofacial Surgery and Science, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University College of Medicine, Shanghai, China 200011 (China); Lee, Philip K. M.; Chow, Ben [Hong Kong Dental Implant and Maxillofacial Centre, Hong Kong, China 999077 (China); Liu, Nancy X. [Department of Oral and Maxillofacial Surgery, Houston Methodist Hospital Research Institute, Houston, Texas 77030 and Department of Oral and Maxillofacial Surgery, Peking University School and Hospital of Stomatology, Beijing, China 100050 (China); Xia, James J. [Department of Oral and Maxillofacial Surgery, Houston Methodist Hospital Research Institute, Houston, Texas 77030 (United States); Department of Surgery (Oral and Maxillofacial Surgery), Weill Medical College, Cornell University, New York, New York 10065 (United States); Department of Oral and Craniomaxillofacial Surgery and Science, Shanghai Ninth People's Hospital, Shanghai Jiao Tong University College of Medicine, Shanghai, China 200011 (China); Shen, Dinggang, E-mail: dgshen@med.unc.edu [Department of Radiology and BRIC, University of North Carolina at Chapel Hill, North Carolina 27599 and Department of Brain and Cognitive Engineering, Korea University, Seoul, 136701 (Korea, Republic of)

    2014-04-15

    Purpose: Cone-beam computed tomography (CBCT) is an increasingly utilized imaging modality for the diagnosis and treatment planning of patients with craniomaxillofacial (CMF) deformities. Accurate segmentation of CBCT images is an essential step to generate three-dimensional (3D) models for the diagnosis and treatment planning of patients with CMF deformities. However, due to the poor image quality, including very low signal-to-noise ratio and widespread image artifacts such as noise, beam hardening, and inhomogeneity, it is challenging to segment CBCT images. In this paper, the authors present a new automatic segmentation method to address these problems. Methods: To segment CBCT images, the authors propose a new method for fully automated CBCT segmentation by using patch-based sparse representation to (1) segment bony structures from the soft tissues and (2) further separate the mandible from the maxilla. Specifically, a region-specific registration strategy is first proposed to warp all the atlases to the current testing subject and then a sparse-based label propagation strategy is employed to estimate a patient-specific atlas from all aligned atlases. Finally, the patient-specific atlas is integrated into a maximum a posteriori probability-based convex segmentation framework for accurate segmentation. Results: The proposed method has been evaluated on a dataset with 15 CBCT images. The effectiveness of the proposed region-specific registration strategy and patient-specific atlas has been validated by comparing with the traditional registration strategy and population-based atlas. The experimental results show that the proposed method achieves the best segmentation accuracy by comparison with other state-of-the-art segmentation methods. Conclusions: The authors have proposed a new CBCT segmentation method by using patch-based sparse representation and convex optimization, which can achieve considerably accurate segmentation results in CBCT
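
    The following is a minimal, hypothetical sketch of the sparse label-propagation idea described above, using scikit-learn's Lasso for non-negative sparse coding of one target patch against aligned atlas patches; the patch size, regularization weight and toy data are illustrative and do not reproduce the authors' pipeline.

```python
import numpy as np
from sklearn.linear_model import Lasso

def sparse_label_fusion(target_patch, atlas_patches, atlas_labels, alpha=0.01):
    """Estimate a patient-specific (probabilistic) label for one voxel by sparsely
    reconstructing its surrounding patch from aligned atlas patches.
    target_patch:  (D,) intensities around the voxel in the test image
    atlas_patches: (N, D) corresponding patches from the warped atlases
    atlas_labels:  (N,) label of the centre voxel of each atlas patch (0 or 1)
    Returns a value in [0, 1] interpretable as a prior probability of bone."""
    A = np.asarray(atlas_patches, dtype=float).T           # D x N dictionary
    y = np.asarray(target_patch, dtype=float)
    # non-negative sparse coding: y ~ A @ w with few non-zero weights
    model = Lasso(alpha=alpha, positive=True, fit_intercept=False, max_iter=5000)
    model.fit(A, y)
    w = model.coef_
    if w.sum() <= 0:
        return float(np.mean(atlas_labels))                # fall back to a majority vote
    w = w / w.sum()
    return float(np.dot(w, atlas_labels))

# toy example: 5 atlas patches of length 9, three labelled "bone" (1), two "soft tissue" (0)
rng = np.random.default_rng(0)
bone = rng.normal(1.0, 0.1, size=(3, 9))
soft = rng.normal(0.2, 0.1, size=(2, 9))
atlas_patches = np.vstack([bone, soft])
atlas_labels = np.array([1, 1, 1, 0, 0])
target = rng.normal(0.95, 0.1, size=9)                     # looks like bone
print("bone probability:", sparse_label_fusion(target, atlas_patches, atlas_labels))
```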

  1. Automated bone segmentation from dental CBCT images using patch-based sparse representation and convex optimization

    International Nuclear Information System (INIS)

    Wang, Li; Gao, Yaozong; Shi, Feng; Liao, Shu; Li, Gang; Chen, Ken Chung; Shen, Steve G. F.; Yan, Jin; Lee, Philip K. M.; Chow, Ben; Liu, Nancy X.; Xia, James J.; Shen, Dinggang

    2014-01-01

    Purpose: Cone-beam computed tomography (CBCT) is an increasingly utilized imaging modality for the diagnosis and treatment planning of patients with craniomaxillofacial (CMF) deformities. Accurate segmentation of CBCT images is an essential step to generate three-dimensional (3D) models for the diagnosis and treatment planning of patients with CMF deformities. However, due to the poor image quality, including very low signal-to-noise ratio and widespread image artifacts such as noise, beam hardening, and inhomogeneity, it is challenging to segment CBCT images. In this paper, the authors present a new automatic segmentation method to address these problems. Methods: To segment CBCT images, the authors propose a new method for fully automated CBCT segmentation by using patch-based sparse representation to (1) segment bony structures from the soft tissues and (2) further separate the mandible from the maxilla. Specifically, a region-specific registration strategy is first proposed to warp all the atlases to the current testing subject and then a sparse-based label propagation strategy is employed to estimate a patient-specific atlas from all aligned atlases. Finally, the patient-specific atlas is integrated into a maximum a posteriori probability-based convex segmentation framework for accurate segmentation. Results: The proposed method has been evaluated on a dataset with 15 CBCT images. The effectiveness of the proposed region-specific registration strategy and patient-specific atlas has been validated by comparing with the traditional registration strategy and population-based atlas. The experimental results show that the proposed method achieves the best segmentation accuracy by comparison with other state-of-the-art segmentation methods. Conclusions: The authors have proposed a new CBCT segmentation method by using patch-based sparse representation and convex optimization, which can achieve considerably accurate segmentation results in CBCT

  2. Model-based optimization strategy of chiller driven liquid desiccant dehumidifier with genetic algorithm

    International Nuclear Information System (INIS)

    Wang, Xinli; Cai, Wenjian; Lu, Jiangang; Sun, Youxian; Zhao, Lei

    2015-01-01

    This study presents a model-based optimization strategy for an actual chiller-driven dehumidifier of a liquid desiccant dehumidification system operating with lithium chloride solution. By analyzing the characteristics of the components, energy predictive models for the components in the dehumidifier are developed. To minimize the energy usage while maintaining the outlet air conditions at the pre-specified set-points, an optimization problem is formulated with an objective function and constraints representing mechanical limitations and component interactions. A model-based optimization strategy using a genetic algorithm is proposed to obtain the optimal set-points for desiccant solution temperature and flow rate that minimize the energy usage in the dehumidifier. Experimental studies on an actual system are carried out to compare energy consumption between the proposed optimization and conventional strategies. The results demonstrate that energy consumption using the proposed optimization strategy can be reduced by 12.2% in the dehumidifier operation. - Highlights: • Presents a model-based optimization strategy for energy saving in LDDS. • Energy predictive models for components in the dehumidifier are developed. • The optimization strategy is applied and tested in an actual LDDS. • The optimization strategy achieves energy savings of 12% during operation

  3. Optimal image resolution for digital storage of radiotherapy-planning images

    International Nuclear Information System (INIS)

    Baba, Yuji; Furusawa, Mitsuhiro; Murakami, Ryuji; Baba, Takashi; Yokoyama, Toshimi; Nishimura, Ryuichi; Takahashi, Mutsumasa

    1998-01-01

    Purpose: To evaluate the quality of digitized radiation-planning images at different resolutions and to determine the optimal resolution for digital storage. Methods and Materials: Twenty-five planning films were scanned and digitized using a film scanner at a resolution of 72 dots per inch (dpi) with 8-bit depth. The resolution of the scanned images was reduced to 48, 36, 24, and 18 dpi using computer software. The image quality of these five images (72, 48, 36, 24, and 18 dpi) was evaluated and given scores (4 = excellent; 3 = good; 2 = fair; and 1 = poor) by three radiation oncologists. An image data compression algorithm from the Joint Photographic Experts Group (JPEG) (not reversible, so some information is lost) was also evaluated. Results: The scores of digitized images with 72, 48, 36, 24, and 18 dpi resolution were 3.8 ± 0.3, 3.5 ± 0.3, 3.3 ± 0.5, 2.7 ± 0.5, and 1.6 ± 0.3, respectively. The quality of 36-dpi images was definitely worse compared to 72-dpi images, but was good enough for planning films. Digitized planning images with 72- and 36-dpi resolution require about 800 and 200 KBytes, respectively. The JPEG compression algorithm produces little degradation in 36-dpi images at compression ratios of 5:1. Conclusion: The quality of digitized images with 36-dpi resolution was good enough for radiation-planning images and required 200 KBytes/image

  4. Information theoretic methods for image processing algorithm optimization

    Science.gov (United States)

    Prokushkin, Sergey F.; Galil, Erez

    2015-01-01

    Modern image processing pipelines (e.g., those used in digital cameras) are full of advanced, highly adaptive filters that often have a large number of tunable parameters (sometimes > 100). This makes the calibration procedure for these filters very complex, and optimal results are barely achievable by manual calibration; thus an automated approach is a must. We will discuss an information theory based metric for evaluation of algorithm adaptive characteristics ("adaptivity criterion") using noise reduction algorithms as an example. The method allows finding an "orthogonal decomposition" of the filter parameter space into the "filter adaptivity" and "filter strength" directions. This metric can be used as a cost function in automatic filter optimization. Since it is a measure of physical "information restoration" rather than perceived image quality, it helps to reduce the set of filter parameters to a smaller subset that is easier for a human operator to tune and achieve a better subjective image quality. With appropriate adjustments, the criterion can be used for assessment of the whole imaging system (sensor plus post-processing).

  5. Optimization of Iron Oxide Tracer Synthesis for Magnetic Particle Imaging

    Directory of Open Access Journals (Sweden)

    Sabina Ziemian

    2018-03-01

    Full Text Available The optimization of iron oxide nanoparticles as tracers for magnetic particle imaging (MPI) alongside the development of data acquisition equipment and image reconstruction techniques is crucial for the required improvements in image resolution and sensitivity of MPI scanners. We present a large-scale water-based synthesis of multicore superparamagnetic iron oxide nanoparticles stabilized with dextran (MC-SPIONs). We also demonstrate the preparation of single core superparamagnetic iron oxide nanoparticles in organic media, subsequently coated with a poly(ethylene glycol) gallic acid polymer and phase transferred to water (SC-SPIONs). Our aim was to obtain long-term stable particles in aqueous media with high MPI performance. We found that the amplitude of the third harmonic measured by magnetic particle spectroscopy (MPS) at 10 mT is 2.3- and 5.8-fold higher than Resovist for the MC-SPIONs and SC-SPIONs, respectively, revealing excellent MPI potential as compared to other reported MPI tracer particle preparations. We show that the reconstructed MPI images of phantoms using optimized multicore and specifically single-core particles are superior to those of commercially available Resovist, which we utilize as a reference standard, as predicted by MPS.

  6. Optimization-Based Image Segmentation by Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Rosenberger C

    2008-01-01

    Full Text Available Many works in the literature focus on the definition of evaluation metrics and criteria that make it possible to quantify the performance of an image processing algorithm. These evaluation criteria can be used to define new image processing algorithms by optimizing them. In this paper, we propose a general scheme to segment images by a genetic algorithm. The developed method uses an evaluation criterion which quantifies the quality of an image segmentation result. The proposed segmentation method can integrate a local ground truth when it is available in order to set the desired level of precision of the final result. A genetic algorithm is then used in order to determine the best combination of information extracted by the selected criterion. Then, we show that this approach can be applied to either gray-level or multicomponent images, in a supervised or an unsupervised context. Last, we show the efficiency of the proposed method through experimental results on several gray-level and multicomponent images.

  7. Optimization-Based Image Segmentation by Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    H. Laurent

    2008-05-01

    Full Text Available Many works in the literature focus on the definition of evaluation metrics and criteria that make it possible to quantify the performance of an image processing algorithm. These evaluation criteria can be used to define new image processing algorithms by optimizing them. In this paper, we propose a general scheme to segment images by a genetic algorithm. The developed method uses an evaluation criterion which quantifies the quality of an image segmentation result. The proposed segmentation method can integrate a local ground truth when it is available in order to set the desired level of precision of the final result. A genetic algorithm is then used in order to determine the best combination of information extracted by the selected criterion. Then, we show that this approach can be applied to either gray-level or multicomponent images, in a supervised or an unsupervised context. Last, we show the efficiency of the proposed method through experimental results on several gray-level and multicomponent images.

  8. The optimal algorithm for Multi-source RS image fusion.

    Science.gov (United States)

    Fu, Wei; Huang, Shui-Guang; Li, Zeng-Shun; Shen, Hao; Li, Jun-Shuai; Wang, Peng-Yuan

    2016-01-01

    To address the issue that the fusion rules of available fusion methods cannot be self-adaptively adjusted according to the subsequent processing requirements of Remote Sensing (RS) images, this paper puts forward GSDA (genetic-iterative self-organizing data analysis algorithm), which integrates the merits of the genetic algorithm with the advantages of the iterative self-organizing data analysis algorithm for multi-source RS image fusion. The proposed algorithm takes the translation-invariant wavelet transform as the model operator and the contrast pyramid conversion as the observation operator. The algorithm then designs the objective function as a weighted sum of evaluation indices, and optimizes this objective function by employing GSDA so as to obtain a higher-resolution RS image. The main points of the text are summarized as follows. • The contribution proposes the iterative self-organizing data analysis algorithm for multi-source RS image fusion. • This article presents the GSDA algorithm for the self-adaptive adjustment of the fusion rules. • This text proposes the model operator and the observation operator as the fusion scheme for RS images based on GSDA. The proposed algorithm opens up a novel algorithmic pathway for multi-source RS image fusion by means of GSDA.

  9. Optimal strategy analysis based on robust predictive control for inventory system with random demand

    Science.gov (United States)

    Saputra, Aditya; Widowati, Sutrisno

    2017-12-01

    In this paper, the optimal strategy for a single-product, single-supplier inventory system with random demand is analyzed by using robust predictive control with an additive random parameter. We formulate the dynamics of this system as a linear state space model with an additive random parameter. To determine and analyze the optimal strategy for the given inventory system, we use a robust predictive control approach, which gives the optimal strategy, i.e., the optimal product volume that should be purchased from the supplier in each time period so that the expected cost is minimal. A numerical simulation is performed with some generated random inventory data. The simulation is run in MATLAB, where the inventory level must be controlled as close as possible to a given set point. From the results, the robust predictive control model provides the optimal strategy, i.e., the optimal product volume that should be purchased, and the inventory level followed the given set point.
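
    A minimal certainty-equivalence sketch of the predictive-control idea is given below: at each period, expected demand over a short horizon is used to choose non-negative order volumes that keep the predicted inventory close to the set point. The dynamics, cost weights and demand model are illustrative assumptions, not the paper's robust formulation.

```python
import numpy as np
from scipy.optimize import minimize

def plan_orders(x0, demand_forecast, set_point, order_weight=0.01, horizon=5):
    """Choose order volumes u[0..H-1] minimizing sum (x_k - set_point)^2 + w*u_k^2
    for the inventory dynamics x_{k+1} = x_k + u_k - d_k (certainty equivalence:
    the random demand is replaced by its forecast)."""
    d = np.asarray(demand_forecast[:horizon], dtype=float)

    def cost(u):
        x = x0 + np.cumsum(u - d)                # predicted inventory trajectory
        return np.sum((x - set_point) ** 2) + order_weight * np.sum(u ** 2)

    res = minimize(cost, x0=np.full(len(d), d.mean()),
                   bounds=[(0.0, None)] * len(d), method="L-BFGS-B")
    return res.x

# receding-horizon simulation with random demand (illustrative numbers)
rng = np.random.default_rng(1)
x, set_point = 40.0, 50.0
for t in range(10):
    forecast = np.full(5, 20.0)                  # expected demand per period
    u = plan_orders(x, forecast, set_point)[0]   # apply only the first planned order
    d = rng.poisson(20)                          # realized random demand
    x = x + u - d
    print(f"t={t}: order={u:5.1f}  demand={d:3d}  inventory={x:6.1f}")
```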

  10. Modeling digital breast tomosynthesis imaging systems for optimization studies

    Science.gov (United States)

    Lau, Beverly Amy

    Digital breast tomosynthesis (DBT) is a new imaging modality for breast imaging. In tomosynthesis, multiple images of the compressed breast are acquired at different angles, and the projection view images are reconstructed to yield images of slices through the breast. One of the main problems to be addressed in the development of DBT is finding the optimal parameter settings to obtain images ideal for the detection of cancer. Since it would be unethical to irradiate women multiple times to explore potentially optimum geometries for tomosynthesis, it is ideal to use a computer simulation to generate projection images. Existing tomosynthesis models have modeled scatter and the detector without accounting for the oblique angles of incidence that tomosynthesis introduces. Moreover, these models frequently use geometry-specific physical factors measured from real systems, which severely limits the robustness of their algorithms for optimization. The goal of this dissertation was to design the framework for a computer simulation of tomosynthesis that would produce images that are sensitive to changes in acquisition parameters, so that an optimization study would be feasible. A computer physics simulation of the tomosynthesis system was developed. The x-ray source was modeled as a polychromatic spectrum based on published spectral data, and the inverse-square law was applied. Scatter was applied using a convolution method with angle-dependent scatter point spread functions (sPSFs), followed by scaling using an angle-dependent scatter-to-primary ratio (SPR). Monte Carlo simulations were used to generate sPSFs for a 5-cm breast with a 1-cm air gap. Detector effects were included through geometric propagation of the image onto layers of the detector, which were blurred using depth-dependent detector point-response functions (PRFs). Depth-dependent PRFs were calculated every 5 microns through a 200-micron-thick CsI detector using Monte Carlo simulations. Electronic noise was added as Gaussian noise as a
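
    A minimal sketch of the scatter step described above is shown below, with a simple Gaussian kernel standing in for the Monte Carlo-derived, angle-dependent scatter point spread function; the kernel widths and scatter-to-primary ratios are placeholders rather than the dissertation's values.

```python
import numpy as np
from scipy.signal import fftconvolve

def gaussian_kernel(sigma_px, size=31):
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2.0 * sigma_px**2))
    return k / k.sum()

def add_scatter(primary, projection_angle_deg, spr_by_angle, sigma_by_angle):
    """Scatter estimate = primary convolved with an angle-dependent scatter PSF,
    scaled so that scatter / primary equals the angle-dependent SPR."""
    spr = spr_by_angle(projection_angle_deg)
    sigma = sigma_by_angle(projection_angle_deg)
    scatter = fftconvolve(primary, gaussian_kernel(sigma), mode="same")
    scatter *= spr * primary.sum() / scatter.sum()     # enforce the target SPR
    return primary + scatter

# placeholder angle dependence: wider, stronger scatter at oblique angles
spr_by_angle = lambda a: 0.4 + 0.002 * abs(a)
sigma_by_angle = lambda a: 8.0 + 0.05 * abs(a)

primary = np.zeros((128, 128))
primary[60:68, 60:68] = 1000.0                          # toy primary projection
total = add_scatter(primary, projection_angle_deg=20.0,
                    spr_by_angle=spr_by_angle, sigma_by_angle=sigma_by_angle)
print("scatter fraction:", 1 - primary.sum() / total.sum())
```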

  11. Implementation of an optimal control energy management strategy in a hybrid truck

    NARCIS (Netherlands)

    Mullem, D. van; Keulen, T. van; Kessels, J.T.B.A.; Jager, B. de; Steinbuch, M.

    2010-01-01

    Energy Management Strategies for hybrid powertrains control the power split between the engine and electric motor of a hybrid vehicle, with fuel consumption or emission minimization as the objective. Optimal control theory can be applied to rewrite the optimization problem to an optimization

  12. Image Registration for PET/CT and CT Images with Particle Swarm Optimization

    International Nuclear Information System (INIS)

    Lee, Hak Jae; Kim, Yong Kwon; Lee, Ki Sung; Choi, Jong Hak; Kim, Chang Kyun; Moon, Guk Hyun; Joo, Sung Kwan; Kim, Kyeong Min; Cheon, Gi Jeong

    2009-01-01

    Image registration is a fundamental task in image processing used to match two or more images. It gives new information to radiologists by matching images from different modalities. The objective of this study is to develop a 2D image registration algorithm for PET/CT and CT images acquired by different systems at different times. We first matched the two CT images (one from the standalone CT and the other from PET/CT), which contain abundant anatomical information. Then, we geometrically transformed the PET image according to the transformation parameters calculated in the previous step. We used an affine transform to match the target and reference images. For the similarity measure, mutual information was explored. Use of the particle swarm optimization algorithm optimized the performance by finding the best matched parameter set within a reasonable amount of time. The results show good agreement between the PET/CT and CT images. We expect the proposed algorithm can be used not only for PET/CT and CT image registration but also for different multi-modality imaging systems such as SPECT/CT, MRI/PET and so on.
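
    The sketch below illustrates the general approach rather than the authors' code: a plain global-best particle swarm search over a 2D rigid transform (translation plus rotation), scored by mutual information computed from a joint histogram. Search ranges, swarm parameters and the toy images are assumptions.

```python
import numpy as np
from scipy import ndimage

def mutual_information(a, b, bins=32):
    """Mutual information of two images from their joint histogram."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def transform(image, params):
    """Rigid transform: params = (tx, ty, angle_deg)."""
    tx, ty, ang = params
    rotated = ndimage.rotate(image, ang, reshape=False, order=1)
    return ndimage.shift(rotated, (ty, tx), order=1)

def pso_register(fixed, moving, n_particles=20, iters=40, seed=0):
    """Global-best PSO maximizing MI(fixed, transform(moving, params))."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array([-10, -10, -15.0]), np.array([10, 10, 15.0])
    x = rng.uniform(lo, hi, size=(n_particles, 3))
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_val = np.array([mutual_information(fixed, transform(moving, p)) for p in x])
    gbest = pbest[np.argmax(pbest_val)].copy()
    w, c1, c2 = 0.7, 1.5, 1.5
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([mutual_information(fixed, transform(moving, p)) for p in x])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmax(pbest_val)].copy()
    return gbest

# toy test: a synthetic slice and a shifted/rotated copy of it
fixed = np.zeros((96, 96)); fixed[30:66, 40:56] = 1.0
moving = transform(fixed, (4, -3, 7))                   # misaligned copy
# PSO should find roughly the inverse of the applied misalignment
print("aligning parameters (tx, ty, angle):", np.round(pso_register(fixed, moving), 1))
```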

  13. Anodic Cyclization Reactions and the Mechanistic Strategies That Enable Optimization.

    Science.gov (United States)

    Feng, Ruozhu; Smith, Jake A; Moeller, Kevin D

    2017-09-19

    Oxidation reactions are powerful tools for synthesis because they allow us to reverse the polarity of electron-rich functional groups, generate highly reactive intermediates, and increase the functionality of molecules. For this reason, oxidation reactions have been and continue to be the subject of intense study. Central to these efforts is the development of mechanism-based strategies that allow us to think about the reactive intermediates that are frequently central to the success of the reactions and the mechanistic pathways that those intermediates trigger. For example, consider oxidative cyclization reactions that are triggered by the removal of an electron from an electron-rich olefin and lead to cyclic products that are functionalized for further elaboration. For these reactions to be successful, the radical cation intermediate must first be generated using conditions that limit its polymerization and then channeled down a productive desired pathway. Following the cyclization, a second oxidation step is necessary for product formation, after which the resulting cation must be quenched in a controlled fashion to avoid undesired elimination reactions. Problems can arise at any one or all of these steps, a fact that frequently complicates reaction optimization and can discourage the development of new transformations. Fortunately, anodic electrochemistry offers an outstanding opportunity to systematically probe the mechanism of oxidative cyclization reactions. The use of electrochemical methods allows for the generation of radical cations under neutral conditions in an environment that helps prevent polymerization of the intermediate. Once the intermediates have been generated, a series of "telltale indicators" can be used to diagnose which step in an oxidative cyclization is problematic for less successful transformation. A set of potential solutions to address each type of problem encountered has been developed. For example, problems with the initial

  14. Color standardization and optimization in Whole Slide Imaging

    Directory of Open Access Journals (Sweden)

    Yagi Yukako

    2011-03-01

    Full Text Available Introduction Standardization and validation of the color displayed by digital slides is an important aspect of digital pathology implementation. While the most common reason for color variation is the variance in the protocols and practices in the histology lab, the color displayed can also be affected by variation in capture parameters (for example, illumination and filters), image processing and display factors in the digital systems themselves. Method We have been developing techniques for color validation and optimization along two paths. The first was based on two standard slides that are scanned and displayed by the imaging system in question. In this approach, one slide is embedded with nine filters with colors selected especially for H&E stained slides (looking like a tiny Macbeth color chart); the specific colors of the nine filters were determined in our previous study and modified for whole slide imaging (WSI). The other slide is an H&E stained mouse embryo. Both of these slides were scanned and the displayed images were compared to a standard. The second approach was based on our previous multispectral imaging research. Discussion As a first step, the two-slide method (above) was used to identify inaccurate display of color and its cause, and to understand the importance of accurate color in digital pathology. We have also improved the multispectral-based algorithm for more consistent results in stain standardization. In the near future, the results of the two-slide and multispectral techniques can be combined and will be widely available. We have been conducting a series of research and development projects to improve image quality to establish image quality standardization. This paper discusses one of the most important aspects of image quality – color.

  15. Parameter Optimization of Multi-Element Synthetic Aperture Imaging Systems

    Directory of Open Access Journals (Sweden)

    Vera Behar

    2007-03-01

    Full Text Available In conventional ultrasound imaging systems with phased arrays, further improvement of lateral resolution requires enlarging the number of array elements, which in turn increases both the complexity and the cost of imaging systems. Multi-element synthetic aperture focusing (MSAF) systems are a very good alternative to conventional systems with phased arrays. The benefit of the synthetic aperture is a reduction of system complexity, cost and acquisition time. In the MSAF system considered in the paper, a group of elements transmits and receives signals simultaneously, and the transmit beam is defocused to emulate a single element response. The echo received at each element of a receive sub-aperture is recorded in the computer memory. The process of transmission/reception is repeated for all positions of the transmit sub-aperture. All the data recordings associated with each corresponding transmit-receive sub-aperture pair are then focused synthetically, producing a low-resolution image. The final high-resolution image is formed by summing all the low-resolution images associated with the transmit/receive sub-apertures. A problem of parameter optimization of an MSAF system is considered in this paper. The quality of imaging (lateral resolution and contrast) is expressed in terms of the beam characteristics - beam width and side lobe level. The comparison between the MSAF system described in the paper and an equivalent conventional phased array system shows that the MSAF system acquires images of equivalent quality much faster, using only a small part of the power per image.

  16. Offshore Wind Farm Layout Design Considering Optimized Power Dispatch Strategy

    DEFF Research Database (Denmark)

    Hou, Peng; Hu, Weihao; N. Soltani, Mohsen

    2017-01-01

    Offshore wind farms have drawn more and more attention recently due to their higher energy capacity and greater freedom in the area they can occupy. However, the investment is higher. In order to make a cost-effective wind farm, the wind farm layout should be optimized. The wake effect is one of the dominant factors leading to energy losses. It is expected that the optimized placement of wind turbines (WT) over a large sea area can lead to the best tradeoff between energy yields and capital investment. This paper proposes a novel way to position offshore WTs for a regular shaped wind farm. In addition to optimizing the direction of wind farm placement and the spacing between WTs, the control strategy's impact on energy yields is also discussed. Since the problem is non-convex and lots of optimization variables are involved, an evolutionary algorithm, the particle swarm optimization algorithm (PSO), is adopted to find...

  17. Optimization of super-resolution processing using incomplete image sets in PET imaging.

    Science.gov (United States)

    Chang, Guoping; Pan, Tinsu; Clark, John W; Mawlawi, Osama R

    2008-12-01

    Super-resolution (SR) techniques are used in PET imaging to generate a high-resolution image by combining multiple low-resolution images that have been acquired from different points of view (POVs). The number of low-resolution images used defines the processing time and memory storage necessary to generate the SR image. In this paper, the authors propose two optimized SR implementations (ISR-1 and ISR-2) that require only a subset of the low-resolution images (two sides and diagonal of the image matrix, respectively), thereby reducing the overall processing time and memory storage. In an N x N matrix of low-resolution images, ISR-1 would be generated using images from the two sides of the N x N matrix, while ISR-2 would be generated from images across the diagonal of the image matrix. The objective of this paper is to investigate whether the two proposed SR methods can achieve similar performance in contrast and signal-to-noise ratio (SNR) as the SR image generated from a complete set of low-resolution images (CSR) using simulation and experimental studies. A simulation, a point source, and a NEMA/IEC phantom study were conducted for this investigation. In each study, 4 (2 x 2) or 16 (4 x 4) low-resolution images were reconstructed from the same acquired data set while shifting the reconstruction grid to generate images from different POVs. SR processing was then applied in each study to combine all as well as two different subsets of the low-resolution images to generate the CSR, ISR-1, and ISR-2 images, respectively. For reference purpose, a native reconstruction (NR) image using the same matrix size as the three SR images was also generated. The resultant images (CSR, ISR-1, ISR-2, and NR) were then analyzed using visual inspection, line profiles, SNR plots, and background noise spectra. The simulation study showed that the contrast and the SNR difference between the two ISR images and the CSR image were on average 0.4% and 0.3%, respectively. Line profiles of
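
    The interleaving idea behind combining shifted low-resolution reconstructions can be sketched as follows for a 2 x 2 grid of half-pixel-shifted images (the complete-set case in the terminology above); the toy data and the exact-recovery setting are illustrative and do not reflect the PET reconstruction chain or the ISR subsets.

```python
import numpy as np

def super_resolve_2x2(lr_images):
    """Combine a 2 x 2 set of low-resolution images, each reconstructed on a grid
    shifted by half a low-resolution pixel, into one high-resolution image by
    interleaving: lr_images[(dy, dx)] is the image whose grid was shifted by
    (dy, dx) half-pixels.  The output is twice the size in each dimension."""
    h, w = lr_images[(0, 0)].shape
    hr = np.zeros((2 * h, 2 * w))
    for (dy, dx), img in lr_images.items():
        hr[dy::2, dx::2] = img
    return hr

# toy demonstration with a synthetic high-resolution "truth"
truth = np.zeros((64, 64)); truth[20:44, 28:36] = 1.0
lr = {(dy, dx): truth[dy::2, dx::2] for dy in (0, 1) for dx in (0, 1)}   # 4 shifted POVs
recovered = super_resolve_2x2(lr)
print("exact recovery from 4 shifted low-resolution images:", np.allclose(recovered, truth))
```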

  18. An Optimized Method for Terrain Reconstruction Based on Descent Images

    Directory of Open Access Journals (Sweden)

    Xu Xinchao

    2016-02-01

    Full Text Available An optimization method is proposed to perform high-accuracy terrain reconstruction of the landing area of Chang’e III. First, feature matching is conducted using geometric model constraints. Then, the initial terrain is obtained and the initial normal vector of each point is solved on the basis of the initial terrain. By perturbing the vector around the initial normal vector in small steps, a set of new vectors is obtained. By combining these vectors with the directions of the light and the camera, functions are set up on the basis of a surface reflection model. Then, a series of gray values is derived by solving the equations. The new optimized vector is recorded when the obtained gray value is closest to the corresponding pixel. Finally, the optimized terrain is obtained after iteration of the vector field. Experiments were conducted using laboratory images and descent images of Chang’e III. The results showed that the performance of the proposed method was better than that of the classical feature matching method. It can provide a reference for terrain reconstruction of the landing area in subsequent moon exploration missions.

  19. Finding optimal vaccination strategies for pandemic influenza using genetic algorithms.

    Science.gov (United States)

    Patel, Rajan; Longini, Ira M; Halloran, M Elizabeth

    2005-05-21

    In the event of pandemic influenza, only limited supplies of vaccine may be available. We use stochastic epidemic simulations, genetic algorithms (GA), and random mutation hill climbing (RMHC) to find optimal vaccine distributions to minimize the number of illnesses or deaths in the population, given limited quantities of vaccine. Due to the non-linearity, complexity and stochasticity of the epidemic process, it is not possible to solve for optimal vaccine distributions mathematically. However, we use GA and RMHC to find near optimal vaccine distributions. We model an influenza pandemic that has age-specific illness attack rates similar to the Asian pandemic in 1957-1958 caused by influenza A(H2N2), as well as a distribution similar to the Hong Kong pandemic in 1968-1969 caused by influenza A(H3N2). We find the optimal vaccine distributions given that the number of doses is limited over the range of 10-90% of the population. While GA and RMHC work well in finding optimal vaccine distributions, GA is significantly more efficient than RMHC. We show that the optimal vaccine distribution found by GA and RMHC is up to 84% more effective than random mass vaccination in the mid range of vaccine availability. GA is generalizable to the optimization of stochastic model parameters for other infectious diseases and population structures.

  20. Task-based optimization of image reconstruction in breast CT

    Science.gov (United States)

    Sanchez, Adrian A.; Sidky, Emil Y.; Pan, Xiaochuan

    2014-03-01

    We demonstrate a task-based assessment of image quality in dedicated breast CT in order to optimize the number of projection views acquired. The methodology we employ is based on the Hotelling Observer (HO) and its associated metrics. We consider two tasks: the Rayleigh task of discerning between two resolvable objects and a single larger object, and the signal detection task of classifying an image as belonging to either a signal-present or signal-absent hypothesis. HO SNR values are computed for 50, 100, 200, 500, and 1000 projection view images, with the total imaging radiation dose held constant. We use the conventional fan-beam FBP algorithm and investigate the effect of varying the width of a Hanning window used in the reconstruction, since this affects both the noise properties of the image and the under-sampling artifacts which can arise in the case of sparse-view acquisitions. Our results demonstrate that fewer projection views should be used in order to increase HO performance, which in this case constitutes an upper bound on human observer performance. However, the impact on HO SNR of using fewer projection views, each with a higher dose, is not as significant as the impact of employing regularization in the FBP reconstruction through a Hanning filter.
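
    A minimal numerical sketch of the Hotelling observer figure of merit for the detection task is given below, using SNR^2 = ds^T K^{-1} ds with a toy one-dimensional signal and noise covariance; the breast-CT reconstructions and covariance estimation used in the paper are not reproduced.

```python
import numpy as np

def hotelling_snr(signal_present, signal_absent, covariance):
    """Hotelling observer SNR for a binary detection task:
    SNR^2 = ds^T K^{-1} ds, where ds is the mean signal difference and K the
    (average) covariance of the image data under the two hypotheses."""
    ds = np.asarray(signal_present, float) - np.asarray(signal_absent, float)
    template = np.linalg.solve(covariance, ds)       # Hotelling template w = K^{-1} ds
    return float(np.sqrt(ds @ template))

# toy 1-D "image": a small bump on a flat background with correlated noise
n = 64
x = np.arange(n)
signal_absent = np.zeros(n)
signal_present = 0.5 * np.exp(-(x - 32) ** 2 / (2 * 3.0 ** 2))
# stationary noise with exponentially decaying correlations (illustrative)
covariance = 0.1 * np.exp(-np.abs(x[:, None] - x[None, :]) / 2.0)
print(f"Hotelling SNR = {hotelling_snr(signal_present, signal_absent, covariance):.2f}")
```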

  1. Endoscopic hyperspectral imaging: light guide optimization for spectral light source

    Science.gov (United States)

    Browning, Craig M.; Mayes, Samuel; Rich, Thomas C.; Leavesley, Silas J.

    2018-02-01

    Hyperspectral imaging (HSI) is a technology used in remote sensing, food processing and documentation recovery. Recently, this approach has been applied in the medical field to spectrally interrogate regions of interest within respective substrates. In spectral imaging, a two-dimensional (spatial) image is collected at many different (spectral) wavelengths, to sample spectral signatures from different regions and/or components within a sample. Here, we report on the use of hyperspectral imaging for endoscopic applications. Colorectal cancer is the third leading cancer in incidence and deaths in the US. One factor of severity is the miss rate of precancerous/flat lesions (~65% accuracy). Integrating HSI into colonoscopy procedures could minimize misdiagnosis and unnecessary resections. We have previously reported a working prototype light source with 16 high-powered light emitting diodes (LEDs) capable of high speed cycling and imaging. In recent testing, we have found our current prototype is limited by transmission loss (~99%) through the multi-furcated solid light guide (lightpipe), and the desired framerate (20-30 fps) could not be achieved. Here, we report on a series of experimental and modeling studies to better optimize the lightpipe and the spectral endoscopy system as a whole. The lightpipe was experimentally evaluated using an integrating sphere and spectrometer (Ocean Optics). Modeling of the lightpipe was performed using Monte Carlo optical ray tracing in TracePro (Lambda Research Corp.). Results of these optimization studies will aid in manufacturing a revised prototype with the newly designed light guide and increased sensitivity. Once the desired optical output (5-10 mW) is achieved, the HSI endoscope system will be able to be implemented without adding to the procedure time.

  2. Cross-layer Energy Optimization Under Image Quality Constraints for Wireless Image Transmissions.

    Science.gov (United States)

    Yang, Na; Demirkol, Ilker; Heinzelman, Wendi

    2012-01-01

    Wireless image transmission is critical in many applications, such as surveillance and environment monitoring. In order to make the best use of the limited energy of the battery-operated cameras, while satisfying the application-level image quality constraints, cross-layer design is critical. In this paper, we develop an image transmission model that allows the application layer (e.g., the user) to specify an image quality constraint, and optimizes the lower layer parameters of transmit power and packet length, to minimize the energy dissipation in image transmission over a given distance. The effectiveness of this approach is evaluated by applying the proposed energy optimization to a reference ZigBee system and a WiFi system, and also by comparing to an energy optimization study that does not consider any image quality constraint. Evaluations show that our scheme outperforms the default settings of the investigated commercial devices and saves a significant amount of energy at middle-to-large transmission distances.
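
    A minimal grid-search sketch of the cross-layer idea follows: choose transmit power and payload length to minimize energy per correctly delivered bit subject to a delivery-rate constraint. The link and energy models, and all constants, are illustrative assumptions rather than the paper's model or the ZigBee/WiFi parameters.

```python
import numpy as np

def packet_success(p_tx_dbm, payload_bits, distance_m, path_loss_exp=3.0):
    """Toy link model: the per-bit error rate falls with received SNR, and a packet
    succeeds only if every payload bit is correct (no retransmissions)."""
    snr_db = p_tx_dbm - 10 * path_loss_exp * np.log10(distance_m) + 70.0
    ber = 0.5 * np.exp(-0.5 * 10 ** (snr_db / 10))
    return (1.0 - ber) ** payload_bits

def energy_per_useful_bit(p_tx_dbm, payload_bits, distance_m,
                          header_bits=128, bit_rate=250e3, e_elec_j_per_bit=5e-8):
    p_tx_w = 1e-3 * 10 ** (p_tx_dbm / 10)
    t_packet = (header_bits + payload_bits) / bit_rate
    e_packet = p_tx_w * t_packet + e_elec_j_per_bit * (header_bits + payload_bits)
    succ = packet_success(p_tx_dbm, payload_bits, distance_m)
    return e_packet / (payload_bits * succ), succ

def optimize(distance_m, min_success=0.9):
    """Grid search over power and payload length under a quality (loss-rate) constraint."""
    best = None
    for p in np.arange(-10, 11, 1.0):                 # dBm
        for n in (256, 512, 1024, 2048, 4096):        # payload bits
            e, succ = energy_per_useful_bit(p, n, distance_m)
            if succ >= min_success and (best is None or e < best[0]):
                best = (e, p, n, succ)
    return best

for d in (10, 50, 100):
    e, p, n, succ = optimize(d)
    print(f"d={d:3d} m: power={p:+.0f} dBm, payload={n} bits, "
          f"success={succ:.2f}, energy={e*1e9:.1f} nJ/bit")
```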

  3. Optimal Software Strategies in the Presence of Network Externalities

    Science.gov (United States)

    Liu, Yipeng

    2009-01-01

    Network externalities, alternatively termed network effects, are pervasive in computer software markets. While software vendors consider pricing strategies, they must also take into account the impact of network externalities on their sales. My main interest in this research is to describe a firm's strategies and behaviors in the presence of…

  4. Optimizing the stirring strategy for the vibrating intrinsic reverberation chamber

    NARCIS (Netherlands)

    Serra, Ramiro; Serra, Ramiro; Leferink, Frank Bernardus Johannes

    2010-01-01

    This work describes the definition, application and assessment of a factorial plan with the aim of gaining insight into which kind of stirring strategy could work best in a vibrating intrinsic reverberation chamber. Three different stirring strategies were defined as factors of a factorial

  5. An Adaptive Cultural Algorithm with Improved Quantum-behaved Particle Swarm Optimization for Sonar Image Detection.

    Science.gov (United States)

    Wang, Xingmei; Hao, Wenqian; Li, Qiming

    2017-12-18

    This paper proposes an adaptive cultural algorithm with improved quantum-behaved particle swarm optimization (ACA-IQPSO) to detect objects in underwater sonar images. In the population space, to improve the searching ability of particles, the number of iterations and the fitness values of particles are used as factors to adaptively adjust the contraction-expansion coefficient of the quantum-behaved particle swarm optimization algorithm (QPSO). The improved quantum-behaved particle swarm optimization algorithm (IQPSO) can make particles adjust their behaviour according to their quality. In the belief space, a new update strategy is adopted to update cultural individuals according to the idea of the update strategy in the shuffled frog leaping algorithm (SFLA). Moreover, to enhance the utilization of information in the population space and belief space, the accept function and influence function are redesigned in the new communication protocol. The experimental results show that ACA-IQPSO can obtain good clustering centres according to the grey-level distribution information of underwater sonar images, and accurately complete underwater object detection. Compared with other algorithms, the proposed ACA-IQPSO has good effectiveness, excellent adaptability, a powerful searching ability and high convergence efficiency. Meanwhile, the experimental results on benchmark functions further demonstrate that the proposed ACA-IQPSO has better searching ability, convergence efficiency and stability.
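
    The sketch below shows the basic QPSO update with a contraction-expansion coefficient that decays over iterations and varies with particle quality (a simple rank-based stand-in for the paper's adaptive rule), applied to finding gray-level cluster centres of a one-dimensional intensity distribution; the cultural-algorithm framework and the SFLA-style belief-space update are not reproduced.

```python
import numpy as np

def fitness(centres, data):
    """Within-cluster sum of absolute distances to the nearest centre (lower is better)."""
    d = np.abs(data[None, :] - np.sort(centres)[:, None])
    return d.min(axis=0).sum()

def qpso_cluster(data, n_clusters=3, n_particles=25, iters=60, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = data.min(), data.max()
    x = rng.uniform(lo, hi, size=(n_particles, n_clusters))
    pbest = x.copy()
    pbest_val = np.array([fitness(p, data) for p in x])
    gbest = pbest[np.argmin(pbest_val)].copy()

    for it in range(iters):
        mbest = pbest.mean(axis=0)                        # mean best position
        # adaptive contraction-expansion: decays with iterations and is larger
        # for poorly performing particles (rank-based stand-in for fitness value)
        rank = np.argsort(np.argsort(pbest_val)) / max(n_particles - 1, 1)
        beta = (1.0 - 0.5 * it / iters) * (0.5 + 0.5 * rank)
        phi = rng.random(x.shape)
        p = phi * pbest + (1 - phi) * gbest               # local attractor
        u = rng.random(x.shape)
        sign = np.where(rng.random(x.shape) < 0.5, -1.0, 1.0)
        x = p + sign * beta[:, None] * np.abs(mbest - x) * np.log(1.0 / u)
        x = np.clip(x, lo, hi)
        vals = np.array([fitness(pt, data) for pt in x])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = x[improved], vals[improved]
        gbest = pbest[np.argmin(pbest_val)].copy()
    return np.sort(gbest)

# toy "sonar image" gray levels: shadow, background and highlight populations
rng = np.random.default_rng(1)
pixels = np.concatenate([rng.normal(30, 5, 2000), rng.normal(120, 10, 5000), rng.normal(210, 8, 1000)])
print("cluster centres:", np.round(qpso_cluster(pixels, n_clusters=3), 1))
```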

  6. Optimal Pricing Strategies for New Products in Dynamic Oligopolies

    OpenAIRE

    Engelbert Dockner; Steffen Jørgensen

    1988-01-01

    This paper deals with the determination of optimal pricing policies for firms in oligopolistic markets. The problem is studied as a differential game and optimal pricing policies are established as Nash open-loop controls. Cost learning effects are assumed such that unit costs are decreasing with cumulative output. Discounting of future profits is also taken into consideration. Initially, the problem is addressed in a general framework, and we proceed to study some specific cases that are rel...

  7. Optimal scanning and image processing with the STEM

    International Nuclear Information System (INIS)

    Crewe, A.V.; Ohtsuki, M.

    1981-01-01

    We have recently published a theory of an optimal scanning system which is particularly suited for the STEM. One concludes from the theory that the diffraction limit of the electron probe should be a fixed fraction of the full-scale deflection in order to avoid scanning artifacts. More recently, we have confirmed the value of this technique by direct experiments. Our program now is to combine the use of optimal scanning with the use of a programmable digital refresh memory for image analysis. Limited experience to date indicates that false color conversion is probably more useful than histogram equalization in black and white and that this system is particularly valuable for rotational averaging and selected area Fourier transforms. (orig.)

  8. Multi-Objective Optimization of Start-up Strategy for Pumped Storage Units

    Directory of Open Access Journals (Sweden)

    Jinjiao Hou

    2018-05-01

    Full Text Available This paper proposes, for the first time, a multi-objective optimization method for the start-up strategy of pumped storage units (PSU). In the multi-objective optimization method, the speed rise time and the overshoot during the start-up process are taken as the objectives. A precise simulation platform is built for simulating the transient process of start-up and for calculating the objectives based on this process. The Multi-objective Particle Swarm Optimization algorithm (MOPSO) is adopted to optimize the widely applied start-up strategies based on one-stage direct guide vane control (DGVC) and two-stage DGVC. Based on the Pareto Front obtained, a multi-objective decision-making method based on the relative objective proximity is used to sort the solutions in the Pareto Front. Start-up strategy optimization for a PSU of a pumped storage power station in Jiangxi Province in China is conducted in experiments. The results show that: (1) compared with single-objective optimization, the proposed multi-objective optimization of the start-up strategy not only greatly reduces the speed rise time and the speed overshoot, but also makes the speed curve stabilize quickly; (2) multi-objective optimization of the strategy based on two-stage DGVC achieves a better solution for a quick and smooth start-up of PSUs than that of the strategy based on one-stage DGVC.

  9. Simulation Modeling to Compare High-Throughput, Low-Iteration Optimization Strategies for Metabolic Engineering.

    Science.gov (United States)

    Heinsch, Stephen C; Das, Siba R; Smanski, Michael J

    2018-01-01

    Increasing the final titer of a multi-gene metabolic pathway can be viewed as a multivariate optimization problem. While numerous multivariate optimization algorithms exist, few are specifically designed to accommodate the constraints posed by genetic engineering workflows. We present a strategy for optimizing expression levels across an arbitrary number of genes that requires few design-build-test iterations. We compare the performance of several optimization algorithms on a series of simulated expression landscapes. We show that optimal experimental design parameters depend on the degree of landscape ruggedness. This work provides a theoretical framework for designing and executing numerical optimization on multi-gene systems.

  10. Body Image as Strategy for Engagement in Social Media

    Directory of Open Access Journals (Sweden)

    Tarcisio Torres Silva

    2015-06-01

    This work intends to analyze not only how communication technologies have contributed to the emergence of such events but also how image production can be interpreted in such environments. Since the use of social media in protests caught the attention of broadcasting media in 2009 during demonstrations in Iran, a strong connection can be noticed between the content circulating through digital communication technologies and the body. For images produced during the Arab Spring, the same is observed with a series of strategies connecting body image and social mobilization. Our intention is to contribute to the debate of political images, considering the way they have been produced in contemporary society, which deals with a complex environment composed of communication technologies, social organization, and the body itself.

  11. Optimal operation strategies of compressed air energy storage (CAES) on electricity spot markets with fluctuating prices

    DEFF Research Database (Denmark)

    Lund, Henrik; Salgi, Georges; Elmegaard, Brian

    2009-01-01

    on electricity spot markets by storing energy when electricity prices are low and producing electricity when prices are high. In order to make a profit on such markets, CAES plant operators have to identify proper strategies to decide when to sell and when to buy electricity. This paper describes three independent computer-based methodologies which may be used for identifying the optimal operation strategy for a given CAES plant, on a given spot market and in a given year. The optimal strategy is identified as the one which provides the best business-economic net earnings for the plant. In practice, CAES plants will not be able to achieve such optimal operation, since the fluctuations of spot market prices in the coming hours and days are not known. Consequently, two simple practical strategies have been identified and compared to the results of the optimal strategy. This comparison shows that...

  12. Computational assessment of visual search strategies in volumetric medical images.

    Science.gov (United States)

    Wen, Gezheng; Aizenman, Avigael; Drew, Trafton; Wolfe, Jeremy M; Haygood, Tamara Miner; Markey, Mia K

    2016-01-01

    When searching through volumetric images [e.g., computed tomography (CT)], radiologists appear to use two different search strategies: "drilling" (restrict eye movements to a small region of the image while quickly scrolling through slices), or "scanning" (search over large areas at a given depth before moving on to the next slice). To computationally identify the type of image information that is used in these two strategies, 23 naïve observers were instructed with either "drilling" or "scanning" when searching for target T's in 20 volumes of faux lung CTs. We computed saliency maps using both classical two-dimensional (2-D) saliency, and a three-dimensional (3-D) dynamic saliency that captures the characteristics of scrolling through slices. Comparing observers' gaze distributions with the saliency maps showed that search strategy alters the type of saliency that attracts fixations. Drillers' fixations aligned better with dynamic saliency and scanners with 2-D saliency. The computed saliency was greater for detected targets than for missed targets. Similar results were observed in data from 19 radiologists who searched five stacks of clinical chest CTs for lung nodules. Dynamic saliency may be superior to the 2-D saliency for detecting targets embedded in volumetric images, and thus "drilling" may be more efficient than "scanning."

  13. Optimizing Nanoscale Quantitative Optical Imaging of Subfield Scattering Targets

    Science.gov (United States)

    Henn, Mark-Alexander; Barnes, Bryan M.; Zhou, Hui; Sohn, Martin; Silver, Richard M.

    2016-01-01

    The full 3-D scattered field above finite sets of features has been shown to contain a continuum of spatial frequency information, and with novel optical microscopy techniques and electromagnetic modeling, deep-subwavelength geometrical parameters can be determined. Similarly, by using simulations, scattering geometries and experimental conditions can be established to tailor scattered fields that yield lower parametric uncertainties while decreasing the number of measurements and the area of such finite sets of features. Such optimized conditions are reported through quantitative optical imaging in 193 nm scatterfield microscopy using feature sets up to four times smaller in area than state-of-the-art critical dimension targets. PMID:27805660

  14. Accuracy optimization with wavelength tunability in overlay imaging technology

    Science.gov (United States)

    Lee, Honggoo; Kang, Yoonshik; Han, Sangjoon; Shim, Kyuchan; Hong, Minhyung; Kim, Seungyoung; Lee, Jieun; Lee, Dongyoung; Oh, Eungryong; Choi, Ahlin; Kim, Youngsik; Marciano, Tal; Klein, Dana; Hajaj, Eitan M.; Aharon, Sharon; Ben-Dov, Guy; Lilach, Saltoun; Serero, Dan; Golotsvan, Anna

    2018-03-01

    As semiconductor manufacturing technology progresses and the dimensions of integrated circuit elements shrink, the overlay budget is accordingly being reduced. The overlay budget closely approaches the scale of measurement inaccuracies due to both optical imperfections of the measurement system and the interaction of light with geometrical asymmetries of the measured targets. Measurement inaccuracies can no longer be ignored due to their significant effect on the resulting device yield. In this paper we investigate a new approach for imaging based overlay (IBO) measurements by optimizing accuracy rather than contrast precision, including its effect on the total target performance, using wavelength tunable overlay imaging metrology. We present new accuracy metrics based on theoretical development and show their quality in identifying the measurement accuracy when compared to CD-SEM overlay measurements. The paper presents the theoretical considerations and simulation work, as well as measurement data, for which tunability combined with the new accuracy metrics is shown to improve accuracy performance.

  15. Optimized optical clearing method for imaging central nervous system

    Science.gov (United States)

    Yu, Tingting; Qi, Yisong; Gong, Hui; Luo, Qingming; Zhu, Dan

    2015-03-01

    The development of various optical clearing methods provides great potential for imaging the entire central nervous system by combining them with multiple-labelling and microscopic imaging techniques. These methods have made certain contributions to clearing, but with respective weaknesses, including tissue deformation, fluorescence quenching, execution complexity and limited antibody penetration that makes immunostaining of tissue blocks difficult. The passive clarity technique (PACT) bypasses those problems and clears samples with simple implementation and excellent transparency with fine fluorescence retention, but passive tissue clearing takes too long. In this study, we not only accelerate the clearing speed of brain blocks but also preserve GFP fluorescence well by screening for an optimal clearing temperature. The selection of a proper temperature makes PACT more applicable, which evidently broadens the application range of this method.

  16. Improved sliced velocity map imaging apparatus optimized for H photofragments.

    Science.gov (United States)

    Ryazanov, Mikhail; Reisler, Hanna

    2013-04-14

    Time-sliced velocity map imaging (SVMI), a high-resolution method for measuring kinetic energy distributions of products in scattering and photodissociation reactions, is challenging to implement for atomic hydrogen products. We describe an ion optics design aimed at achieving SVMI of H fragments in a broad range of kinetic energies (KE), from a fraction of an electronvolt to a few electronvolts. In order to enable consistently thin slicing for any imaged KE range, an additional electrostatic lens is introduced in the drift region for radial magnification control without affecting temporal stretching of the ion cloud. Time slices of ∼5 ns out of a cloud stretched to ⩾50 ns are used. An accelerator region with variable dimensions (using multiple electrodes) is employed for better optimization of radial and temporal space focusing characteristics at each magnification level. The implemented system was successfully tested by recording images of H fragments from the photodissociation of HBr, H2S, and the CH2OH radical, with kinetic energies ranging from 3 eV. It demonstrated KE resolution ≲1%-2%, similar to that obtained in traditional velocity map imaging followed by reconstruction, and to KE resolution achieved previously in SVMI of heavier products. We expect it to perform just as well up to at least 6 eV of kinetic energy. The tests showed that numerical simulations of the electric fields and ion trajectories in the system, used for optimization of the design and operating parameters, provide an accurate and reliable description of all aspects of system performance. This offers the advantage of selecting the best operating conditions in each measurement without the need for additional calibration experiments.

  17. Optimization of image processing algorithms on mobile platforms

    Science.gov (United States)

    Poudel, Pramod; Shirvaikar, Mukul

    2011-03-01

    This work presents a technique to optimize popular image processing algorithms on mobile platforms such as cell phones, netbooks and personal digital assistants (PDAs). The increasing demand for video applications like context-aware computing on mobile embedded systems requires the use of computationally intensive image processing algorithms. The system engineer has a mandate to optimize them so as to meet real-time deadlines. A methodology to take advantage of the asymmetric dual-core processor, which includes an ARM and a DSP core supported by shared memory, is presented with implementation details. The target platform chosen is the popular OMAP 3530 processor for embedded media systems. It has an asymmetric dual-core architecture with an ARM Cortex-A8 and a TMS320C64x Digital Signal Processor (DSP). The development platform was the BeagleBoard with 256 MB of NAND RAM and 256 MB SDRAM memory. The basic image correlation algorithm is chosen for benchmarking as it finds widespread application in various template matching tasks such as face recognition. The basic algorithm prototypes conform to OpenCV, a popular computer vision library. OpenCV algorithms can be easily ported to the ARM core, which runs a popular operating system such as Linux or Windows CE. However, the DSP is architecturally more efficient at handling DFT algorithms. The algorithms are tested on a variety of images and performance results are presented, measuring the speedup obtained due to the dual-core implementation. A major advantage of this approach is that it allows the ARM processor to perform important real-time tasks, while the DSP addresses performance-hungry algorithms.
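
    Since the record benchmarks a basic image correlation (template matching) kernel, a minimal NumPy sketch of the FFT route favoured by a DSP core is shown below; it is not the OpenCV/OMAP implementation from the paper, and the test data are synthetic.

```python
import numpy as np

def correlate_fft(image, template):
    """Cross-correlation of a template with an image via the FFT
    (the template is made zero-mean so flat regions do not dominate)."""
    t = template - template.mean()
    H, W = image.shape
    h, w = t.shape
    F_img = np.fft.rfft2(image, s=(H + h - 1, W + w - 1))
    F_tpl = np.fft.rfft2(t[::-1, ::-1], s=(H + h - 1, W + w - 1))   # flipping = correlation
    corr = np.fft.irfft2(F_img * F_tpl, s=(H + h - 1, W + w - 1))
    return corr[h - 1:H, w - 1:W]                  # keep only fully overlapping positions

# toy test: locate an 8x8 patch inside a noisy 64x64 image
rng = np.random.default_rng(0)
image = rng.normal(0, 0.2, (64, 64))
template = rng.normal(0, 1.0, (8, 8))
image[20:28, 33:41] += template                    # embed the patch at (row, col) = (20, 33)
corr = correlate_fft(image, template)
print("true location: (20, 33)  found:", np.unravel_index(corr.argmax(), corr.shape))
```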

  18. Optimal Strategy Analysis of a Competing Portfolio Market with a Polyvariant Profit Function

    International Nuclear Information System (INIS)

    Bogolubov, Nikolai N. Jr.; Kyshakevych, Bohdan Yu.; Blackmore, Denis; Prykarpatsky, Anatoliy K.

    2010-12-01

    A competing market model with a polyvariant profit function that assumes 'zeitnot' stock behavior of clients is formulated within the banking portfolio medium and then analyzed from the perspective of devising optimal strategies. An associated Markov process method for finding an optimal choice strategy for monovariant and bivariant profit functions is developed. Under certain conditions on the bank 'promotional' parameter with respect to the 'fee' for a missed share package transaction and at an asymptotically large enough portfolio volume, universal transcendental equations - determining the optimal share package choice among competing strategies with monovariant and bivariant profit functions - are obtained. (author)

  19. Application of evolution strategy algorithm for optimization of a single-layer sound absorber

    Directory of Open Access Journals (Sweden)

    Morteza Gholamipoor

    2014-12-01

    Full Text Available Depending on different design parameters and limitations, optimization of sound absorbers has always been a challenge in the field of acoustic engineering. Various methods of optimization have evolved over the past decades, with the innovative method of evolution strategy gaining more attention in recent years. Owing to their simplicity and straightforward mathematical representation, single-layer absorbers have been widely used in both engineering and industrial applications, and an optimized design for these absorbers has become vital. In the present study, the evolution strategy algorithm is used for optimization of a single-layer absorber at both a particular frequency and an arbitrary frequency band. Results of the optimization are compared against genetic algorithm and penalty function methods and prove favorable in both effectiveness and accuracy. Finally, a single-layer absorber is optimized over a desired range of frequencies, which is the main goal of an industrial and engineering optimization process.
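
    A hedged sketch of the underlying idea: a simple (1+1)-evolution strategy with the 1/5th success rule minimizing a placeholder objective. The paper's acoustic absorber model (material parameters, target frequency band) is not reproduced; the objective and parameter ranges below are invented for illustration only.

```python
# Minimal (1+1)-evolution-strategy sketch with the 1/5th success rule.
import numpy as np

def es_1plus1(objective, x0, sigma=0.5, iters=500, seed=1):
    rng = np.random.default_rng(seed)
    x, fx = np.asarray(x0, float), objective(x0)
    successes = 0
    for t in range(1, iters + 1):
        y = x + sigma * rng.standard_normal(x.shape)   # mutate the parent
        fy = objective(y)
        if fy < fx:                                    # keep the better point
            x, fx, successes = y, fy, successes + 1
        if t % 20 == 0:                                # 1/5th success rule
            sigma *= 1.5 if successes / 20.0 > 0.2 else 0.6
            successes = 0
    return x, fx

if __name__ == "__main__":
    # Stand-in objective: distance of normalised (thickness, resistivity) from
    # a fictitious optimum; NOT the paper's absorption model.
    target = np.array([0.5, 0.4])
    f = lambda p: float(np.sum((np.asarray(p) - target) ** 2))
    best, val = es_1plus1(f, x0=[0.9, 0.1])
    print("best normalised parameters:", np.round(best, 3), "objective:", round(val, 5))
```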

  20. Scout or Cavalry? Optimal Discovery Strategies for GRBs

    International Nuclear Information System (INIS)

    Nemiroff, Robert J.

    2004-01-01

    Many present and past gamma-ray burst (GRB) detectors try to be not only a 'scout', discovering new GRBs, but also the 'cavalry', simultaneously optimizing on-board science return. Recently, however, most GRB science return has moved out from the gamma-ray energy bands where discovery usually occurs. Therefore a future gamma-ray instrument that is only a scout might best optimize future GRB science. Such a scout would specialize solely in the initial discovery of GRBs, determining only those properties that would allow an unambiguous handoff to waiting cavalry instruments. Preliminary general principles of scout design and cadence are discussed. Scouts could implement observing algorithms optimized for finding GRBs with specific attributes of duration, location, or energy. Scout sky-scanning algorithms utilizing a return cadence near to desired durations of short GRBs are suggested as a method of discovering GRBs in the unexplored short duration part of the GRB duration distribution

  1. Investment Strategies Optimization based on a SAX-GA Methodology

    CERN Document Server

    Canelas, António M L; Horta, Nuno C G

    2013-01-01

    This book presents a new computational finance approach combining a Symbolic Aggregate approXimation (SAX) technique with an optimization kernel based on genetic algorithms (GA). While the SAX representation is used to describe the financial time series, the evolutionary optimization kernel is used to identify the most relevant patterns and generate investment rules. The proposed approach considers several different chromosome structures in order to achieve better results on the trading platform. The methodology presented in this book has great potential in investment markets.

  2. Targeting Strategies for Multifunctional Nanoparticles in Cancer Imaging and Therapy

    Science.gov (United States)

    Yu, Mi Kyung; Park, Jinho; Jon, Sangyong

    2012-01-01

    Nanomaterials offer new opportunities for cancer diagnosis and treatment. Multifunctional nanoparticles harboring various functions, including targeting, imaging, and therapy, have been intensively studied with the aim of overcoming limitations associated with conventional cancer diagnosis and therapy. Of the various nanoparticles, magnetic iron oxide nanoparticles with superparamagnetic properties have shown potential as multifunctional nanoparticles for clinical translation because they have been used as magnetic resonance imaging (MRI) contrast agents in the clinic and their features can be easily tailored by including targeting moieties, fluorescent dyes, or therapeutic agents. This review summarizes targeting strategies for the construction of multifunctional nanoparticles, including magnetic nanoparticle-based theranostic systems, and the various surface engineering strategies of nanoparticles for in vivo applications. PMID:22272217

  3. Analyzing “Etka Chain Stores” Strategies and Proposing Optimal Strategies; Using SWOT Model based on Fuzzy Logic

    OpenAIRE

    Mohammad Aghaei; Amin Asadollahi; Elham Vahedi; Mahdi Pirooz

    2013-01-01

    To maintain and achieve optimal growth and development and to be more competitive, organizations need a comprehensive and coherent plan compatible with their objectives and goals, which is called strategic planning. This research aims to strategically analyse “Etka Chain Stores” and to propose optimal strategies using the SWOT model based on fuzzy logic. The scope of this research is limited to “Etka Chain stores in Tehran”. As instrumentation, a questionnaire consisting of 138 questions was us...

  4. Aircraft path planning for optimal imaging using dynamic cost functions

    Science.gov (United States)

    Christie, Gordon; Chaudhry, Haseeb; Kochersberger, Kevin

    2015-05-01

    Unmanned aircraft development has accelerated with recent technological improvements in sensing and communications, which has resulted in an "applications lag" for how these aircraft can best be utilized. The aircraft are becoming smaller, more maneuverable and have longer endurance to perform sensing and sampling missions, but operating them aggressively to exploit these capabilities has not been a primary focus in unmanned systems development. This paper addresses a means of aerial vehicle path planning to provide a realistic optimal path in acquiring imagery for structure from motion (SfM) reconstructions and performing radiation surveys. This method will allow SfM reconstructions to occur accurately and with minimal flight time so that the reconstructions can be executed efficiently. An assumption is made that we have 3D point cloud data available prior to the flight. A discrete set of scan lines are proposed for the given area that are scored based on visibility of the scene. Our approach finds a time-efficient path and calculates trajectories between scan lines and over obstacles encountered along those scan lines. Aircraft dynamics are incorporated into the path planning algorithm as dynamic cost functions to create optimal imaging paths in minimum time. Simulations of the path planning algorithm are shown for an urban environment. We also present our approach for image-based terrain mapping, which is able to efficiently perform a 3D reconstruction of a large area without the use of GPS data.
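
    As a toy illustration of one step described above, ordering a discrete set of scan lines into a time-efficient pass, the sketch below uses a greedy nearest-endpoint rule. The line endpoints, flight speed and cost model are assumptions; the paper's visibility scoring and dynamics-aware trajectory generation are not reproduced.

```python
# Greedy ordering of scan lines; cost = transit plus scan distance over speed.
import numpy as np

def order_scan_lines(lines, speed=5.0):
    remaining = list(range(len(lines)))
    order, pos, total_time = [], np.array(lines[0][0], float), 0.0
    while remaining:
        # Pick the line whose nearest endpoint is closest to the current position.
        best = min(remaining,
                   key=lambda i: min(np.linalg.norm(pos - np.array(e)) for e in lines[i]))
        start, end = sorted(lines[best], key=lambda e: np.linalg.norm(pos - np.array(e)))
        total_time += np.linalg.norm(pos - np.array(start)) / speed           # transit
        total_time += np.linalg.norm(np.array(end) - np.array(start)) / speed  # scan
        pos = np.array(end, float)
        order.append(best)
        remaining.remove(best)
    return order, total_time

if __name__ == "__main__":
    # Three hypothetical parallel scan lines over a small area (metres).
    lines = [((0, 0), (0, 100)), ((20, 0), (20, 100)), ((40, 0), (40, 100))]
    order, t = order_scan_lines(lines)
    print("visit order:", order, "| estimated time (s):", round(t, 1))
```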

  5. Optimizing torque vectoring strategies for an electric vehicle concept

    NARCIS (Netherlands)

    van Boekel, J.J.P.; Besselink, I.J.M.; Nijmeijer, H.; Rauh, J.; Knorr, S.; Durnberger, J.

    2013-01-01

    As part of the internship project carried out at Daimler AG, this report describes the application and optimization of torque vectoring on a research vehicle based on the Mercedes- Benz SLS AMG E-CELL. A concise introduction is given regarding the MATLAB scripts and Simulink models that were used

  6. Taxing Strategies for Carbon Emissions: A Bilevel Optimization Approach

    Directory of Open Access Journals (Sweden)

    Wei Wei

    2014-04-01

    Full Text Available This paper presents a quantitative and computational method to determine the optimal tax rate among generating units. To strike a balance between the reduction of carbon emissions and the profit of the energy sector, the proposed bilevel optimization model can be regarded as a Stackelberg game between the government agency and the generation companies. The upper level, which represents the government agency, aims to limit total carbon emissions within a certain level by setting optimal tax rates for generators according to their emission performance. The lower level, which represents the decision behavior of the grid operator, tries to minimize the total production cost under the tax rates set by the government. The bilevel optimization model is finally reformulated into a mixed integer linear program (MILP) which can be solved by off-the-shelf MILP solvers. Case studies on a 10-unit system as well as a provincial power grid in China demonstrate the validity of the proposed method and its capability in practical applications.
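
    A hedged sketch of the lower level only: the grid operator dispatches generators to minimise production cost plus carbon tax under tax rates handed down by the upper level. The bilevel-to-MILP reformulation is not reproduced, and all unit data (costs, emission rates, capacities, demand, tax rates) are hypothetical.

```python
# Lower-level economic dispatch under given per-unit carbon tax rates.
import numpy as np
from scipy.optimize import linprog

marginal_cost = np.array([20.0, 35.0, 50.0])     # $/MWh per unit (hypothetical)
emission_rate = np.array([0.9, 0.6, 0.3])        # tCO2/MWh per unit
capacity      = np.array([200.0, 150.0, 150.0])  # MW
demand        = 400.0                            # MW
tax_rate      = np.array([30.0, 20.0, 10.0])     # $/tCO2, set by the upper level

# Objective: sum_i (cost_i + tax_i * emission_i) * p_i
c = marginal_cost + tax_rate * emission_rate
res = linprog(c,
              A_eq=np.ones((1, 3)), b_eq=[demand],          # meet demand exactly
              bounds=[(0.0, cap) for cap in capacity])

print("dispatch (MW):", np.round(res.x, 1))
print("total emissions (tCO2):", round(float(emission_rate @ res.x), 1))
```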

  7. An Optimal Stochastic Investment and Consumption Strategy with ...

    African Journals Online (AJOL)

    This paper considers a single investor who owns a production plant that generates units of consumption goods in a capitalist economy. The goal is to choose optimal investment and consumption policies that maximize the finite horizon expected discounted logarithmic utility of consumption and terminal wealth. A dynamical ...

  8. Optimal detection and control strategies for invasive species management

    Science.gov (United States)

    Shefali V. Mehta; Robert G. Haight; Frances R. Homans; Stephen Polasky; Robert C. Venette

    2007-01-01

    The increasing economic and environmental losses caused by non-native invasive species amplify the value of identifying and implementing optimal management options to prevent, detect, and control invasive species. Previous literature has focused largely on preventing introductions of invasive species and post-detection control activities; few have addressed the role of...

  9. Two-dimensional pixel image lag simulation and optimization in a 4-T CMOS image sensor

    Energy Technology Data Exchange (ETDEWEB)

    Yu Junting; Li Binqiao; Yu Pingping; Xu Jiangtao [School of Electronics Information Engineering, Tianjin University, Tianjin 300072 (China); Mou Cun, E-mail: xujiangtao@tju.edu.c [Logistics Management Office, Hebei University of Technology, Tianjin 300130 (China)

    2010-09-15

    Pixel image lag in a 4-T CMOS image sensor is analyzed and simulated in a two-dimensional model. Strategies of reducing image lag are discussed from transfer gate channel threshold voltage doping adjustment, PPD N-type doping dose/implant tilt adjustment and transfer gate operation voltage adjustment for signal electron transfer. With the computer analysis tool ISE-TCAD, simulation results show that minimum image lag can be obtained at a pinned photodiode n-type doping dose of 7.0 × 10^12 cm^-2, an implant tilt of -2°, a transfer gate channel doping dose of 3.0 × 10^12 cm^-2 and an operation voltage of 3.4 V. The conclusions of this theoretical analysis can be a guideline for pixel design to improve the performance of 4-T CMOS image sensors. (semiconductor devices)

  10. User-driven sampling strategies in image exploitation

    Science.gov (United States)

    Harvey, Neal; Porter, Reid

    2013-12-01

    Visual analytics and interactive machine learning both try to leverage the complementary strengths of humans and machines to solve complex data exploitation tasks. These fields overlap most significantly when training is involved: the visualization or machine learning tool improves over time by exploiting observations of the human-computer interaction. This paper focuses on one aspect of the human-computer interaction that we call user-driven sampling strategies. Unlike relevance feedback and active learning sampling strategies, where the computer selects which data to label at each iteration, we investigate situations where the user selects which data is to be labeled at each iteration. User-driven sampling strategies can emerge in many visual analytics applications but they have not been fully developed in machine learning. User-driven sampling strategies suggest new theoretical and practical research questions for both visualization science and machine learning. In this paper we identify and quantify the potential benefits of these strategies in a practical image analysis application. We find user-driven sampling strategies can sometimes provide significant performance gains by steering tools towards local minima that have lower error than tools trained with all of the data. In preliminary experiments we find these performance gains are particularly pronounced when the user is experienced with the tool and application domain.

  11. Optimizing the fabrication of carbon nanotube electrode for effective capacitive deionization via electrophoretic deposition strategy

    Directory of Open Access Journals (Sweden)

    Simeng Zhang

    2018-04-01

    Full Text Available In order to obtain superior electrode performance in capacitive deionization (CDI), electrophoretic deposition (EPD) was introduced as a novel strategy for the fabrication of carbon nanotube (CNT) electrodes. Preparation parameters, including the concentration of slurry components, deposition time and electric field intensity, were mainly investigated and optimized in terms of the electrochemical characteristics and desalination performance of the deposited CNT electrode. The SEM image shows that the CNT material was deposited homogeneously on the current collector and a crack-free electrode surface was obtained. An optimal preparation condition for the deposited CNT electrode was obtained and specified as an Al(NO3)3 concentration of 1.3 × 10−2 mol/L, a deposition time of 30 min and an electric field intensity of 15 V/cm. The obtained electrode exhibits a specific mass capacitance of 33.36 F/g and a specific adsorption capacity of 23.93 mg/g, which are 1.62 and 1.85 times those of the coated electrode, respectively. The good performance of the deposited CNT electrode indicates the promising application of the EPD methodology in subsequent research and fabrication of CDI electrodes for the CDI process. Keywords: Carbon nanotube, Water treatment, Desalination, Capacitive deionization, Electrode fabrication, Electrophoretic deposition

  12. Optimization of image quality in diagnostic radiology associated with exposure

    International Nuclear Information System (INIS)

    Fulea, C.; Ramboiu, S.

    1996-01-01

    Optimal parameters for a high-quality image and minimal radiation risk for the patients were established. The characteristics that affect image quality (speed, contrast factor, latitude, base density, fog density, reciprocity law failure and latent image fading) were analyzed. The base density of the radiographic image was measured for Azoix film and is 0.1 log units. Fog density is a function of development time and increases by about 20% when the development time is increased by 2 minutes. The Hurter-Driffield curve was used to characterize the photographic emulsion of Azoix film. The latitude value of 0.7 log units is in the normally useful range of densities found in radiographs. The speed of Azoix film, as a function of the time interval between exposure and development, increases by 10% over the first 24 hours. The reduction in patient exposure that could be effected by delaying the development of Azoix is so small that it is far outweighed by the possible disadvantage of a delay in the acquisition of diagnostic information; therefore, latent image fading is not very important from the point of view of patient exposure. The speeds evaluated for exposure times of 0.08 s, 0.16 s and 0.64 s were unmodified, that is, reciprocity law failure was unimportant for the Azoix film-Perlux screen combination. The Romanian Azoix films used with the Perlux screen and processed with the original solutions give an optical density of 1.0 (the average density of a medical radiograph) at a minimum radiation exposure for the patient (59·10^-7 C/kg). (author)

  13. Commentary: progress in optimization of patient dose and image quality in x-ray diagnostics

    International Nuclear Information System (INIS)

    Carlsson, G.A.; Chan, H.-P.

    1999-01-01

    X-ray diagnostics gives the largest contribution to the population dose from man-made radiation sources. Strategies for reduction of patient doses without loss of diagnostic accuracy are therefore of great interest to society and have been focussed in general terms by the ICRP (ICRP 1996) through the introduction of the concept of diagnostic reference levels. The European Union has stimulated research in the field, and, based on patient dose measurements and radiologists' appreciation of acceptable image quality, good radiographic techniques have been identified and recommended (EUR 1996a, b) for conventional screen-film imaging. These efforts have resulted in notable dose reductions in clinical practices (Hart et al 1996). In spite of 100 years of use of x-rays for diagnostics, the choice of technique parameters still relies to a great extent on experience. Scientific efforts to optimize the choice in terms of finding the parameter settings which yield sufficient image quality at the lowest possible cost in dose are still rare. True optimization requires (1) estimation of the image quality needed to make a correct diagnosis and (2) methods to investigate all possible means of achieving this image quality in order to be able to decide which of them gives the lowest dose. The paper by Tapiovaara, Sandborg and Dance published in this issue of Physics in Medicine and Biology (pages 537-559) addresses the optimization of paediatric fluoroscopy, a timely and important topic. Fluoroscopy procedures, used to guide x-ray examinations or interventional procedures, are little standardized and may result in high dose levels; radiation exposure in childhood is likely to result in a higher lifetime risk than the same exposure later in life. The authors represent an interesting mix of expertise within various scientific fields: the theory of medical imaging and assessment of image quality, the physics of diagnostic radiology and radiation dosimetry. They provide good insights

  14. Investigating the Optimal Management Strategy for a Healthcare Facility Maintenance Program

    National Research Council Canada - National Science Library

    Gaillard, Daria

    2004-01-01

    ...: strategic partnering with an equipment management firm. The objective of this study is to create a decision-model for selecting the optimal management strategy for a healthcare organization's facility maintenance program...

  15. An advanced Lithium-ion battery optimal charging strategy based on a coupled thermoelectric model

    International Nuclear Information System (INIS)

    Liu, Kailong; Li, Kang; Yang, Zhile; Zhang, Cheng; Deng, Jing

    2017-01-01

    Lithium-ion batteries are widely adopted as the power supplies for electric vehicles. A key but challenging issue is to achieve optimal battery charging while taking into account various constraints for safe, efficient and reliable operation. In this paper, a triple-objective function is first formulated for battery charging based on a coupled thermoelectric model. An advanced optimal charging strategy is then proposed to develop the optimal constant-current-constant-voltage (CCCV) charge current profile, which gives the best trade-off among three conflicting but important objectives for battery management. To be specific, a coupled thermoelectric battery model is first presented. Then, a specific triple-objective function consisting of three objectives, namely charging time, energy loss, and temperature rise (both interior and surface), is proposed. Heuristic methods such as teaching-learning-based optimization (TLBO) and particle swarm optimization (PSO) are applied to optimize the triple-objective function, and their optimization performances are compared. The impacts of the weights of the different terms in the objective function are then assessed. Experimental results show that the proposed optimal charging strategy is capable of producing effective optimal charging current profiles and a proper trade-off among the conflicting objectives. Further, the proposed optimal charging strategy can be easily extended to other battery types.
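
    A hedged sketch of the optimization step only: a small particle swarm searches the constant-current level against a weighted sum of charging time, energy loss and temperature rise. The coupled thermoelectric battery model is not reproduced; the three surrogate terms and all constants below are crude algebraic stand-ins chosen purely for illustration.

```python
# Weighted triple-objective with a minimal PSO; all physical terms are surrogates.
import numpy as np

def objective(i_cc, w=(0.5, 0.3, 0.2), capacity_ah=2.0, r_int=0.05):
    t_charge = capacity_ah / i_cc                 # hours, ~ inverse of current
    e_loss   = (i_cc ** 2) * r_int * t_charge     # ohmic-loss surrogate
    temp_rise = 0.8 * i_cc ** 1.5                 # heating surrogate
    return w[0] * t_charge + w[1] * e_loss + w[2] * temp_rise

def pso(f, lo, hi, n=20, iters=100, seed=2):
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi, n)                    # particle positions (currents, A)
    v = np.zeros(n)
    pbest, pbest_f = x.copy(), np.array([f(xi) for xi in x])
    gbest = pbest[np.argmin(pbest_f)]
    for _ in range(iters):
        r1, r2 = rng.random(n), rng.random(n)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        fx = np.array([f(xi) for xi in x])
        improved = fx < pbest_f
        pbest[improved], pbest_f[improved] = x[improved], fx[improved]
        gbest = pbest[np.argmin(pbest_f)]
    return gbest, f(gbest)

if __name__ == "__main__":
    best_current, best_cost = pso(objective, lo=0.5, hi=4.0)
    print(f"best CC current ~ {best_current:.2f} A, weighted cost {best_cost:.3f}")
```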

  16. Mean-variance Optimal Reinsurance-investment Strategy in Continuous Time

    Directory of Open Access Journals (Sweden)

    Daheng Peng

    2017-10-01

    Full Text Available In this paper, the Lagrange method is used to solve the continuous-time mean-variance reinsurance-investment problem. Proportional reinsurance, multiple risky assets and a risk-free asset are considered jointly in the optimal strategy for insurers. By solving the backward stochastic differential equation for the Lagrange multiplier, we obtain the mean-variance optimal reinsurance-investment strategy and its efficient frontier in explicit form.

  17. Optimization of control strategies for epidemics in heterogeneous populations with symmetric and asymmetric transmission

    OpenAIRE

    Ndeffo Mbah , Martial L.; Gilligan , Christopher A.

    2010-01-01

    There is growing interest in incorporating economic factors into epidemiological models in order to identify optimal strategies for disease control when resources are limited. In this paper we consider how to optimize the control of a pathogen that is capable of infecting multiple hosts with different rates of transmission within and between species. Our objective is to find control strategies that maximize the discounted number of healthy individuals. We consider two clas...

  18. Optimal Control Strategies in a Two Dimensional Differential Game Using Linear Equation under a Perturbed System

    Directory of Open Access Journals (Sweden)

    Musa Danjuma SHEHU

    2008-06-01

    Full Text Available This paper lays emphasis on the formulation of two-dimensional differential games via optimal control theory and the consideration of control systems whose dynamics are described by a system of ordinary differential equations in the form of a linear equation under the influence of two controls U(·) and V(·). Based on this, strategies were constructed. Hence we determine the optimal strategy for a control, say U(·), under a perturbation generated by the second control V(·) within a given manifold M.

  19. The CEV Model and Its Application in a Study of Optimal Investment Strategy

    Directory of Open Access Journals (Sweden)

    Aiyin Wang

    2014-01-01

    Full Text Available The constant elasticity of variance (CEV) model is used to describe the price of the risky asset. Maximizing the expected utility via the Hamilton-Jacobi-Bellman (HJB) equation, which describes the optimal investment strategy, we obtain a partial differential equation. Applying the Legendre transform, we transform the equation into a dual problem and obtain an approximate solution and the optimal investment strategy for the exponential utility function.

  20. Mean-variance Optimal Reinsurance-investment Strategy in Continuous Time

    OpenAIRE

    Daheng Peng; Fang Zhang

    2017-01-01

    In this paper, the Lagrange method is used to solve the continuous-time mean-variance reinsurance-investment problem. Proportional reinsurance, multiple risky assets and a risk-free asset are considered jointly in the optimal strategy for insurers. By solving the backward stochastic differential equation for the Lagrange multiplier, we obtain the mean-variance optimal reinsurance-investment strategy and its efficient frontier in explicit form.

  1. Optimized Control Strategy For Over Loaded Offshore Wind Turbines

    DEFF Research Database (Denmark)

    Odgaard, Peter Fogh; Knudsen, Torben; Wisniewski, Rafal

    2015-01-01

    controller tuning for a given wind turbine. It also enables a very safe and robust comparison between a new control strategy and the present one. Is power de-rating indeed the best way to reduce loads? The power de-rating approach has the drawback of only indirectly...

  2. Optimizing social participation in community-dwelling older adults through the use of behavioral coping strategies.

    Science.gov (United States)

    Provencher, Véronique; Desrosiers, Johanne; Demers, Louise; Carmichael, Pierre-Hugues

    2016-01-01

    This study aimed to (1) determine the categories of behavioral coping strategies most strongly correlated with seniors' optimal social participation in different activity and role domains and (2) identify the demographic, health and environmental factors associated with the use of these coping strategies optimizing social participation. The sample consisted of 350 randomly recruited community-dwelling older adults (≥65 years). Coping strategies and social participation were measured, respectively, using the Inventory of Coping Strategies Used by the Elderly and the Assessment of Life Habits questionnaires. Information about demographic, health and environmental factors was also collected during the interview. Regression analyses showed a strong relationship between the use of cooking- and transportation-related coping strategies and optimal participation in the domains of nutrition and community life, respectively. Older age and living alone were associated with increased use of cooking-related strategies, while good self-rated health and not living in a seniors' residence were correlated with greater use of transportation-related strategies. Our study helped to identify useful behavioral coping strategies that should be incorporated in disability prevention programs designed to promote community-dwelling seniors' social participation. However, the appropriateness of these strategies depends on whether they are used in relevant contexts and tailored to specific needs. Our results support the relevance of including behavioral coping strategies related to cooking and transportation in disability prevention programs designed to promote community-dwelling seniors' social participation in the domains of nutrition and community life, respectively. Older age and living alone were associated with increased use of cooking-related strategies, while good self-rated health and not living in a seniors' residence were correlated with greater use of transportation-related strategies.

  3. Maintenance and test strategies to optimize NPP equipment performance

    International Nuclear Information System (INIS)

    Mayer, S.; Tomic, B.

    2000-01-01

    This paper proposes an approach to maintenance optimization of nuclear power plant components, which can help to increase both safety and availability. In order to evaluate the benefits of preventive maintenance on a quantitative basis, a software code has been developed for component performance and reliability simulation of safety related nuclear power plant equipment. A three state Markov model will be introduced, considering a degraded state in addition to an operational state and a failed state. (author)
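
    As a hedged illustration of the three-state idea (operational, degraded, failed), the sketch below computes steady-state availability for different preventive-maintenance repair rates. The transition probabilities are invented placeholders, not values from the paper or its software.

```python
# Three-state discrete-time Markov chain: 0 = operational, 1 = degraded, 2 = failed.
import numpy as np

def steady_state(P):
    """Stationary distribution of a discrete-time Markov chain."""
    vals, vecs = np.linalg.eig(P.T)
    pi = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return pi / pi.sum()

def transition_matrix(pm_rate):
    # pm_rate: probability per step that preventive maintenance restores a
    # degraded component to the operational state (hypothetical numbers).
    return np.array([
        [0.95, 0.04, 0.01],                 # operational
        [pm_rate, 0.90 - pm_rate, 0.10],    # degraded -> repaired by PM
        [0.50, 0.00, 0.50],                 # failed -> corrective repair
    ])

for pm in (0.0, 0.05, 0.20):
    pi = steady_state(transition_matrix(pm))
    print(f"PM repair rate {pm:.2f}: availability (operational + degraded) = {pi[0] + pi[1]:.3f}")
```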

  4. Optimal Input Strategy for Plug and Play Process Control Systems

    DEFF Research Database (Denmark)

    Kragelund, Martin Nygaard; Leth, John-Josef; Wisniewski, Rafal

    2010-01-01

    This paper considers the problem of optimal operation of a plant, which goal is to maintain production at minimum cost. The system considered in this work consists of a joined plant and redundant input systems. It is assumed that each input system contributes to a flow of goods into the joined pa...... the performance of the plant. The results are applied to a coal fired power plant where an additional new fuel system, gas, becomes available....

  5. Systems analysis as a tool for optimal process strategy

    International Nuclear Information System (INIS)

    Ditterich, K.; Schneider, J.

    1975-09-01

    For the description and the optimal treatment of complex processes, the methods of Systems Analysis are used as the most promising approach in recent times. In general every process should be optimised with respect to reliability, safety, economy and environmental pollution. In this paper the complex relations between these general optimisation postulates are established in qualitative form. These general trend relations have to be quantified for every particular system studied in practice

  6. Optimal Quality Strategy and Matching Service on Crowdfunding Platforms

    Directory of Open Access Journals (Sweden)

    Wenqing Wu

    2018-04-01

    Full Text Available This paper develops a crowdfunding platform model incorporating quality and a matching service from the perspective of a two-sided market. It aims to explore the impact of different factors on the optimal quality threshold and matching service in a crowdfunding context. Two important factors are considered simultaneously: one is the quality threshold of admission and the other is the matching efficiency on crowdfunding platforms. The model incorporates quality, a matching service, and the characteristics of crowdfunding campaigns. After solving the model by the derivative method, the paper identifies the mechanism by which the parameters influence the optimal quality threshold and matching service. Additionally, it compares the platform profits in scenarios with and without an exclusion policy. The results demonstrate that excluding low-quality projects is profitable when funder preference for project quality is substantial enough; crowdfunding platform managers would be unwise to impose a quality threshold on crowdfunding projects and charge entrance fees when funder preference for project quality is small.

  7. A universal optimization strategy for ant colony optimization algorithms based on the Physarum-inspired mathematical model

    International Nuclear Information System (INIS)

    Zhang, Zili; Gao, Chao; Liu, Yuxin; Qian, Tao

    2014-01-01

    Ant colony optimization (ACO) algorithms often fall into local optima and have low search efficiency when solving the travelling salesman problem (TSP). To address these shortcomings, this paper proposes a universal optimization strategy for updating the pheromone matrix in ACO algorithms. The new strategy takes advantage of the critical paths preserved in the process of evolving adaptive networks in the Physarum-inspired mathematical model (PMM). The optimized algorithms, denoted PMACO algorithms, enhance the amount of pheromone on the critical paths and promote exploitation of the optimal solution. Experimental results on synthetic and real networks show that the PMACO algorithms are more efficient and robust than traditional ACO algorithms and are adaptable for solving the TSP with single or multiple objectives. We further analyse the influence of parameters on the performance of the PMACO algorithms and, based on these analyses, work out the best values of these parameters for the TSP. (paper)
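
    A hedged sketch of the pheromone-update idea only: edges flagged as critical by a Physarum-style conductivity measure receive an extra deposit on top of the standard evaporation/reinforcement rule. The Physarum model itself is not implemented here; the `criticality` matrix is a given placeholder.

```python
# ACO pheromone update with an extra boost on Physarum-derived critical edges.
import numpy as np

def update_pheromone(tau, ant_tours, tour_lengths, criticality,
                     rho=0.1, q=1.0, epsilon=0.5):
    """tau: pheromone matrix; criticality: placeholder PMM weights in [0, 1]."""
    tau = (1.0 - rho) * tau                           # evaporation
    for tour, length in zip(ant_tours, tour_lengths):
        deposit = q / length
        for a, b in zip(tour, tour[1:] + tour[:1]):   # edges of a closed TSP tour
            # Standard deposit, boosted on edges the Physarum model keeps.
            tau[a, b] += deposit * (1.0 + epsilon * criticality[a, b])
            tau[b, a] = tau[a, b]
    return tau

if __name__ == "__main__":
    n = 5
    rng = np.random.default_rng(3)
    tau = np.ones((n, n))
    criticality = rng.random((n, n))                  # stand-in for the PMM output
    tours = [[0, 2, 4, 1, 3], [0, 1, 2, 3, 4]]
    lengths = [12.0, 15.0]
    tau = update_pheromone(tau, tours, lengths, criticality)
    print(np.round(tau, 3))
```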

  8. Multiresolution strategies for the numerical solution of optimal control problems

    Science.gov (United States)

    Jain, Sachin

    There exist many numerical techniques for solving optimal control problems but less work has been done in the field of making these algorithms run faster and more robustly. The main motivation of this work is to solve optimal control problems accurately in a fast and efficient way. Optimal control problems are often characterized by discontinuities or switchings in the control variables. One way of accurately capturing the irregularities in the solution is to use a high resolution (dense) uniform grid. This requires a large amount of computational resources both in terms of CPU time and memory. Hence, in order to accurately capture any irregularities in the solution using a few computational resources, one can refine the mesh locally in the region close to an irregularity instead of refining the mesh uniformly over the whole domain. Therefore, a novel multiresolution scheme for data compression has been designed which is shown to outperform similar data compression schemes. Specifically, we have shown that the proposed approach results in fewer grid points in the grid compared to a common multiresolution data compression scheme. The validity of the proposed mesh refinement algorithm has been verified by solving several challenging initial-boundary value problems for evolution equations in 1D. The examples have demonstrated the stability and robustness of the proposed algorithm. The algorithm adapted dynamically to any existing or emerging irregularities in the solution by automatically allocating more grid points to the region where the solution exhibited sharp features and fewer points to the region where the solution was smooth. Thereby, the computational time and memory usage has been reduced significantly, while maintaining an accuracy equivalent to the one obtained using a fine uniform mesh. Next, a direct multiresolution-based approach for solving trajectory optimization problems is developed. The original optimal control problem is transcribed into a

  9. Optimization of Key Parameters of Energy Management Strategy for Hybrid Electric Vehicle Using DIRECT Algorithm

    Directory of Open Access Journals (Sweden)

    Jingxian Hao

    2016-11-01

    Full Text Available The rule-based logic threshold control strategy has been frequently used in energy management strategies for hybrid electric vehicles (HEVs) owing to its convenience in adjusting parameters, real-time performance, stability, and robustness. However, the logic threshold control parameters cannot usually ensure the best vehicle performance at different driving cycles and conditions. For this reason, the optimization of key parameters is important to improve the fuel economy, dynamic performance, and drivability. In principle, this is a multiparameter nonlinear optimization problem. The logic threshold energy management strategy for an all-wheel-drive HEV is comprehensively analyzed and developed in this study. Seven key parameters to be optimized are extracted. The optimization model of the key parameters is proposed from the perspective of fuel economy. The global optimization method, the DIRECT algorithm, which has good real-time performance, a low computational burden, and rapid convergence, is selected to optimize the extracted key parameters globally. The results show that, with the optimized parameters, the engine operates more often in the high-efficiency range, resulting in fuel savings of 7% compared with the non-optimized parameters. The proposed method can provide guidance for calibrating the parameters of the vehicle energy management strategy from the perspective of fuel economy.
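
    A hedged sketch of plugging seven bounded parameters into a DIRECT search using SciPy (requires SciPy >= 1.8 for scipy.optimize.direct). The real objective would run a drive-cycle simulation and return fuel consumption; here a smooth placeholder function stands in for it, and the parameter bounds are illustrative only.

```python
# DIRECT global search over seven hypothetical logic-threshold parameters.
import numpy as np
from scipy.optimize import direct

# Hypothetical bounds (e.g. SOC limits, torque thresholds, engine on/off speeds).
bounds = [(0.3, 0.8), (0.5, 0.9), (10.0, 60.0), (20.0, 120.0),
          (0.0, 1.0), (800.0, 2000.0), (0.1, 0.5)]

def fuel_consumption(params):
    """Placeholder for a drive-cycle simulation returning L/100 km."""
    p = np.asarray(params)
    centre = np.array([b[0] + 0.6 * (b[1] - b[0]) for b in bounds])
    scale = np.array([b[1] - b[0] for b in bounds])
    return 6.0 + float(np.sum(((p - centre) / scale) ** 2))

result = direct(fuel_consumption, bounds, maxfun=2000)
print("best parameters:", np.round(result.x, 3))
print("simulated fuel consumption:", round(result.fun, 3), "L/100 km")
```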

  10. A two-level strategy to realize life-cycle production optimization in an operational setting

    NARCIS (Netherlands)

    Essen, van G.M.; Hof, Van den P.M.J.; Jansen, J.D.

    2012-01-01

    We present a two-level strategy to improve robustness against uncertainty and model errors in life-cycle flooding optimization. At the upper level, a physics-based large-scale reservoir model is used to determine optimal life-cycle injection and production profiles. At the lower level these profiles

  11. A two-level strategy to realize life-cycle production optimization in an operational setting

    NARCIS (Netherlands)

    Essen, van G.M.; Hof, Van den P.M.J.; Jansen, J.D.

    2013-01-01

    We present a two-level strategy to improve robustness against uncertainty and model errors in life-cycle flooding optimization. At the upper level, a physics-based large-scale reservoir model is used to determine optimal life-cycle injection and production profiles. At the lower level these profiles

  12. Dynamic optimal strategies in transboundary pollution game under learning by doing

    Science.gov (United States)

    Chang, Shuhua; Qin, Weihua; Wang, Xinyu

    2018-01-01

    In this paper, we present a transboundary pollution game, in which emission permits trading and pollution abatement costs under learning by doing are considered. In this model, the abatement cost mainly depends on the level of pollution abatement and the experience of using pollution abatement technology. We use optimal control theory to investigate the optimal emission paths and the optimal pollution abatement strategies under cooperative and noncooperative games, respectively. Additionally, the effects of parameters on the results have been examined.

  13. The Optimal Strategy to Research Pension Funds in China Based on the Loss Function

    Directory of Open Access Journals (Sweden)

    Jian-wei Gao

    2007-10-01

    Full Text Available Based on the theory of actuarial present value, a pension fund investment goal can be formulated as an objective function. The mean-variance model is extended by defining the objective loss function. Furthermore, using the theory of stochastic optimal control, an optimal investment model is established under the minimum expectation of the loss function. In the light of the Hamilton-Jacobi-Bellman (HJB) equation, the analytic solution of the optimal investment strategy problem is derived.

  14. The Optimal Strategy to Research Pension Funds in China Based on the Loss Function

    OpenAIRE

    Gao, Jian-wei; Guo, Hong-zhen; Ye, Yan-cheng

    2007-01-01

    Based on the theory of actuarial present value, a pension fund investment goal can be formulated as an objective function. The mean-variance model is extended by defining the objective loss function. Furthermore, using the theory of stochastic optimal control, an optimal investment model is established under the minimum expectation of loss function. In the light of the Hamilton-Jacobi-Bellman (HJB) equation, the analytic solution of the optimal investment strategy problem is derived.

  15. Optimal vaccination strategies against vector-borne diseases

    DEFF Research Database (Denmark)

    Græsbøll, Kaare; Enøe, Claes; Bødker, Rene

    2014-01-01

    Using a process oriented semi-agent based model, we simulated the spread of Bluetongue virus by Culicoides, biting midges, between cattle in Denmark. We evaluated the minimum vaccination cover and minimum cost for eight different preventive vaccination strategies in Denmark. The simulation model ...... results when index cases were in the vaccinated areas. However, given that the long-range spread of midge borne disease is still poorly quantified, more robust national vaccination schemes seem preferable....

  16. Optimal Pricing Strategy for Wireless Social Community Networks

    OpenAIRE

    Mazloumian, Amin; Manshaei, Mohammad Hossein; Felegyhazi, Mark; Hubaux, Jean-Pierre

    2008-01-01

    The increasing number of mobile applications fuels the demand for affordable and ubiquitous wireless access. The traditional wireless network technologies such as EV-DO or WiMAX provide this service but require a huge upfront investment in infrastructure and spectrum. On the contrary, as they do not have to face such an investment, social community operators rely on subscribers who constitute a community of users. The pricing strategy of the provided wireless access is an open problem for thi...

  17. In-operation learning of optimal wind farm operation strategy

    OpenAIRE

    Oliva Gratacós, Joan

    2017-01-01

    In a wind farm, power losses due to wind turbine wake effects can be up to 30-40% under certain conditions. As the global installed wind power capacity increases, the mitigation of wake effects in wind farms is gaining more importance. Following a conventional control strategy, each individual turbine maximizes its own power production without taking into consideration its effects on the performance of downstream turbines. Therefore, this control scheme results in operation con...

  18. Optimization Strategy to Capitalize on the Romanian Tourism Potential

    OpenAIRE

    PhD Lecturer Dindire Laura; PhD Reader Dugan Silvia

    2010-01-01

    An important direction for improving the promotional activities carried out both by governmental and non-governmental decision-making bodies within the tourist services sector and by tourism firms, at both the internal and international level, is the promotional strategy. Consisting in the mastery of obtaining the best results through organizing, coordination, prediction, communication and control activities, promotional management means knowing and understanding the internal and in...

  19. Fueling strategies to optimize performance: training high or training low?

    Science.gov (United States)

    Burke, L M

    2010-10-01

    Availability of carbohydrate as a substrate for the muscle and central nervous system is critical for the performance of both intermittent high-intensity work and prolonged aerobic exercise. Therefore, strategies that promote carbohydrate availability, such as ingesting carbohydrate before, during and after exercise, are critical for the performance of many sports and a key component of current sports nutrition guidelines. Guidelines for daily carbohydrate intakes have evolved from the "one size fits all" recommendation for a high-carbohydrate diets to an individualized approach to fuel needs based on the athlete's body size and exercise program. More recently, it has been suggested that athletes should train with low carbohydrate stores but restore fuel availability for competition ("train low, compete high"), based on observations that the intracellular signaling pathways underpinning adaptations to training are enhanced when exercise is undertaken with low glycogen stores. The present literature is limited to studies of "twice a day" training (low glycogen for the second session) or withholding carbohydrate intake during training sessions. Despite increasing the muscle adaptive response and reducing the reliance on carbohydrate utilization during exercise, there is no clear evidence that these strategies enhance exercise performance. Further studies on dietary periodization strategies, especially those mimicking real-life athletic practices, are needed. © 2010 John Wiley & Sons A/S.

  20. Improved quantum-behaved particle swarm optimization with local search strategy

    Directory of Open Access Journals (Sweden)

    Maolong Xi

    2017-03-01

    Full Text Available Quantum-behaved particle swarm optimization, which was motivated by analysis of particle swarm optimization and quantum systems, has shown performance comparable to other evolutionary algorithms in finding the optimal solutions of many optimization problems. To address the problem of premature convergence, a local search strategy is proposed to improve the performance of quantum-behaved particle swarm optimization. In the proposed local search strategy, a super particle is constructed as a collection of dimension information randomly selected from particles in the swarm. The selection probability of each particle is different and is determined by its fitness value: for minimization problems, the smaller a particle's fitness value, the larger its selection probability and the more information it contributes to constructing the super particle. In addition, in order to investigate the influence of different local search spaces on algorithm performance, four methods of computing the local search radius are applied in the local search strategy, yielding four variants of local search quantum-behaved particle swarm optimization. Empirical studies on a suite of well-known benchmark functions are undertaken in order to make an overall performance comparison between the proposed methods and other quantum-behaved particle swarm optimization algorithms. The simulation results show that the proposed quantum-behaved particle swarm optimization variants have clear advantages over the original quantum-behaved particle swarm optimization.
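
    A hedged sketch of the "super particle" construction described above: each dimension is copied from a swarm member chosen with probability proportional to its fitness quality, so better particles contribute more dimensions. This is an illustrative reading of the strategy, not the authors' exact code, and the benchmark objective is a generic sphere function.

```python
# Fitness-proportional construction of a super particle from swarm dimensions.
import numpy as np

def build_super_particle(positions, fitness, rng):
    """positions: (n_particles, n_dims); fitness: smaller is better."""
    n, d = positions.shape
    # Convert minimisation fitness to selection weights (best -> largest weight).
    weights = fitness.max() - fitness + 1e-12
    probs = weights / weights.sum()
    donors = rng.choice(n, size=d, p=probs)        # one donor particle per dimension
    return positions[donors, np.arange(d)]

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    sphere = lambda x: np.sum(x ** 2, axis=1)      # generic benchmark objective
    swarm = rng.uniform(-5, 5, size=(10, 3))
    fit = sphere(swarm)
    super_particle = build_super_particle(swarm, fit, rng)
    print("super particle:", np.round(super_particle, 3),
          "f =", round(float(np.sum(super_particle ** 2)), 3))
```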

  1. Integrated Emission Management strategy for cost-optimal engine-aftertreatment operation

    NARCIS (Netherlands)

    Cloudt, R.P.M.; Willems, F.P.T.

    2011-01-01

    A new cost-based control strategy is presented that optimizes engine-aftertreatment performance under all operating conditions. This Integrated Emission Management strategy minimizes fuel consumption within the set emission limits by on-line adjustment of air management based on the actual state of

  2. Optimization as a Reasoning Strategy for Dealing with Socioscientific Decision-Making Situations

    Science.gov (United States)

    Papadouris, Nicos

    2012-01-01

    This paper reports on an attempt to help 12-year-old students develop a specific optimization strategy for selecting among possible solutions in socioscientific decision-making situations. We have developed teaching and learning materials for elaborating this strategy, and we have implemented them in two intact classes (N = 48). Prior to and after…

  3. Exploring optimal fertigation strategies for orange production, using soil-crop modelling

    NARCIS (Netherlands)

    Qin, Wei; Heinen, Marius; Assinck, Falentijn B.T.; Oenema, Oene

    2016-01-01

    Water and nitrogen (N) are two key limiting factors in orange (Citrus sinensis) production. The amount and the timing of water and N application are critical, but optimal strategies have not yet been well established. This study presents an analysis of 47 fertigation strategies examined by a

  4. Generalized PSF modeling for optimized quantitation in PET imaging.

    Science.gov (United States)

    Ashrafinia, Saeed; Mohy-Ud-Din, Hassan; Karakatsanis, Nicolas A; Jha, Abhinav K; Casey, Michael E; Kadrmas, Dan J; Rahmim, Arman

    2017-06-21

    Point-spread function (PSF) modeling offers the ability to account for resolution-degrading phenomena within the PET image generation framework. PSF modeling improves resolution and enhances contrast, but at the same time significantly alters image noise properties and induces an edge overshoot effect. Thus, studying the effect of PSF modeling on quantitation task performance can be very important. Frameworks explored in the past involved a dichotomy of PSF versus no-PSF modeling. By contrast, the present work focuses on quantitative performance evaluation of standard uptake value (SUV) PET images, while incorporating a wide spectrum of PSF models, including those that under- and over-estimate the true PSF, for the potential of enhanced quantitation of SUVs. The developed framework first analytically models the true PSF, considering a range of resolution degradation phenomena (including photon non-collinearity, inter-crystal penetration and scattering) as present in data acquisitions with modern commercial PET systems. In the context of oncologic liver FDG PET imaging, we generated 200 noisy datasets per image-set (with clinically realistic noise levels) using an XCAT anthropomorphic phantom with liver tumours of varying sizes. These were subsequently reconstructed using the OS-EM algorithm with varying modelled PSF kernels. We focused on quantitation of both SUV_mean and SUV_max, including assessment of contrast recovery coefficients, as well as noise-bias characteristics (including both image roughness and coefficient of variability), for different tumours/iterations/PSF kernels. It was observed that an overestimated PSF yielded more accurate contrast recovery for a range of tumours, and typically improved quantitative performance. For a clinically reasonable number of iterations, edge enhancement due to PSF modeling (especially due to an over-estimated PSF) was in fact seen to lower SUV_mean bias in small tumours. Overall, the results indicate that exactly matched PSF
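
    A toy one-dimensional illustration of the idea: the "scanner" blurs a phantom with a true Gaussian PSF, and an EM-style reconstruction may assume a narrower, matched, or wider kernel. This is a conceptual sketch only, not the OS-EM/XCAT framework of the study; all widths and intensities are invented.

```python
# 1D EM-style deconvolution with an assumed (possibly mismatched) Gaussian PSF.
import numpy as np
from scipy.ndimage import gaussian_filter1d

rng = np.random.default_rng(5)
true_sigma = 2.0
phantom = np.zeros(128)
phantom[40:48] = 8.0        # small "tumour"
phantom[70:100] = 3.0       # broader background region

def blur(x, sigma):
    return gaussian_filter1d(x, sigma) if sigma > 0 else x

measured = rng.poisson(blur(phantom, true_sigma) + 0.1).astype(float)

def em_reconstruct(data, model_sigma, iters=50):
    recon = np.ones_like(data)
    for _ in range(iters):
        forward = blur(recon, model_sigma) + 1e-9
        recon *= blur(data / forward, model_sigma)   # multiplicative EM-style update
    return recon

for sigma in (0.0, 2.0, 3.0):   # no PSF, matched PSF, over-estimated PSF
    rec = em_reconstruct(measured, sigma)
    print(f"model sigma {sigma:.1f}: recovered tumour peak {rec[40:48].max():.2f} (true 8.0)")
```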

  5. Optimal Coordinated Strategy Analysis for the Procurement Logistics of a Steel Group

    Directory of Open Access Journals (Sweden)

    Lianbo Deng

    2014-01-01

    Full Text Available This paper focuses on the optimization of an internal coordinated procurement logistics system in a steel group and the decision on the coordinated procurement strategy by minimizing the logistics costs. Considering the coordinated procurement strategy and the procurement logistics costs, the aim of the optimization model was to maximize the degree of quality satisfaction and to minimize the procurement logistics costs. The model was transformed into a single-objective model and solved using a simulated annealing algorithm. In the algorithm, the supplier of each subsidiary was selected according to the evaluation result for independent procurement. Finally, the effect of different parameters on the coordinated procurement strategy was analysed. The results showed that the coordinated strategy can clearly save procurement costs; that the strategy appears to be more cooperative when the quality requirement is less strict; and that the coordinated costs have a strong effect on the coordinated procurement strategy.
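
    An illustrative simulated-annealing sketch for the supplier-assignment idea: each subsidiary is assigned one supplier, and a weighted single objective trades quality satisfaction against procurement logistics cost. All supplier data and weights are hypothetical, not taken from the steel-group case study.

```python
# Simulated annealing over supplier assignments with a weighted cost/quality objective.
import math
import random

random.seed(6)
N_SUBSIDIARIES, N_SUPPLIERS = 4, 3
cost = [[8, 6, 9], [5, 7, 6], [9, 8, 7], [6, 5, 8]]       # logistics cost (hypothetical)
quality = [[0.9, 0.7, 0.8], [0.6, 0.9, 0.7],
           [0.8, 0.8, 0.9], [0.7, 0.6, 0.9]]              # quality satisfaction

def energy(assign, w_cost=1.0, w_quality=10.0):
    c = sum(cost[s][assign[s]] for s in range(N_SUBSIDIARIES))
    q = sum(quality[s][assign[s]] for s in range(N_SUBSIDIARIES))
    return w_cost * c - w_quality * q            # minimise cost, maximise quality

def simulated_annealing(t0=10.0, t_end=0.01, alpha=0.95, moves=50):
    state = [random.randrange(N_SUPPLIERS) for _ in range(N_SUBSIDIARIES)]
    e, t = energy(state), t0
    while t > t_end:
        for _ in range(moves):
            cand = state[:]
            cand[random.randrange(N_SUBSIDIARIES)] = random.randrange(N_SUPPLIERS)
            de = energy(cand) - e
            if de < 0 or random.random() < math.exp(-de / t):
                state, e = cand, e + de          # accept improving or some worsening moves
        t *= alpha                               # geometric cooling
    return state, e

best, best_e = simulated_annealing()
print("supplier per subsidiary:", best, "objective:", round(best_e, 2))
```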

  6. Optimization Strategy of the APR+ BOP Technical Specifications

    International Nuclear Information System (INIS)

    Cho, Yoon Sang; Lee, Jae Gon; Han, Sung Heum

    2016-01-01

    The BOP is one of the key factors for successful project implementation of an NPP. In constructing the APR1400 NPP, BOP procurement has been one of the biggest concerns. Due to design changes and the increased capacity of equipment in the NPP, many BOP items have to be ‘first supplied equipment’ [hereinafter, ‘FSE’]. Because the manufacturability and performance of FSEs have not been fully proven and tested, manufacturers and suppliers are requested to submit Reports for Equipment Qualification Evaluation in accordance with 10 CFR 50.49 and IEEE 323. They need at least 1-2 years of tests for Environmental Qualification (EQ) and Seismic Qualification (SQ). This study focuses on how to prepare the BOP purchase specifications in order to control the FSEs, especially for safety-class equipment. With the optimization plan for BOP packages presented in this study, the occurrence of FSEs can be reasonably controlled to be as low as possible. For a successful NPP project, the concerns in procuring BOPs must be fully analyzed beforehand. Korea is now preparing for the era of the APR+ as the APR1400 era draws to a close. The technical specifications of APR+ BOPs can be developed and prepared successfully and very effectively according to this optimization plan. This will be a great contribution not only to constructing the APR+ on time, but also to exporting the APR+ overseas in the future.

  7. Optimization Strategy of the APR+ BOP Technical Specifications

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Yoon Sang; Lee, Jae Gon; Han, Sung Heum [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    The BOP is one of the key factors for successful project implementation of an NPP. In constructing the APR1400 NPP, BOP procurement has been one of the biggest concerns. Due to design changes and the increased capacity of equipment in the NPP, many BOP items have to be ‘first supplied equipment’ [hereinafter, ‘FSE’]. Because the manufacturability and performance of FSEs have not been fully proven and tested, manufacturers and suppliers are requested to submit Reports for Equipment Qualification Evaluation in accordance with 10 CFR 50.49 and IEEE 323. They need at least 1-2 years of tests for Environmental Qualification (EQ) and Seismic Qualification (SQ). This study focuses on how to prepare the BOP purchase specifications in order to control the FSEs, especially for safety-class equipment. With the optimization plan for BOP packages presented in this study, the occurrence of FSEs can be reasonably controlled to be as low as possible. For a successful NPP project, the concerns in procuring BOPs must be fully analyzed beforehand. Korea is now preparing for the era of the APR+ as the APR1400 era draws to a close. The technical specifications of APR+ BOPs can be developed and prepared successfully and very effectively according to this optimization plan. This will be a great contribution not only to constructing the APR+ on time, but also to exporting the APR+ overseas in the future.

  8. Optimal investment strategies and hedging of derivatives in the presence of transaction costs (Invited Paper)

    Science.gov (United States)

    Muratore-Ginanneschi, Paolo

    2005-05-01

    Investment strategies in multiplicative Markovian market models with transaction costs are defined using growth optimal criteria. The optimal strategy is shown to consist in holding the amount of capital invested in stocks within an interval around an ideal optimal investment. The size of the holding interval is determined by the intensity of the transaction costs and the time horizon. The inclusion of financial derivatives in the models is also considered. All the results presented in this contribution were previously derived in collaboration with E. Aurell.

  9. Optimization of maintenance strategies in case of data uncertainties; Optimierung von Instandhaltungsstrategien bei unscharfen Eingangsdaten

    Energy Technology Data Exchange (ETDEWEB)

    Aha, Ulrich

    2013-07-01

    Maintenance strategies are aimed to keep a technical facility functioning in spite of damaging processes (wear, corrosion, fatigue) with simultaneous control of these processes. The project optimization of maintenance strategies in case of data uncertainties is aimed to optimize maintenance measures like preventive measures (lubrication etc.), inspections and replacements to keep the facility/plant operating including the minimization of financial costs. The report covers the following topics: modeling assumptions, model development and optimization procedure, results for a conventional power plant and an oxyfuel plant.

  10. Multi-step optimization strategy for fuel-optimal orbital transfer of low-thrust spacecraft

    Science.gov (United States)

    Rasotto, M.; Armellin, R.; Di Lizia, P.

    2016-03-01

    An effective method for the design of fuel-optimal transfers in two- and three-body dynamics is presented. The optimal control problem is formulated using calculus of variation and primer vector theory. This leads to a multi-point boundary value problem (MPBVP), characterized by complex inner constraints and a discontinuous thrust profile. The first issue is addressed by embedding the MPBVP in a parametric optimization problem, thus allowing a simplification of the set of transversality constraints. The second problem is solved by representing the discontinuous control function by a smooth function depending on a continuation parameter. The resulting trajectory optimization method can deal with different intermediate conditions, and no a priori knowledge of the control structure is required. Test cases in both the two- and three-body dynamics show the capability of the method in solving complex trajectory design problems.
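
    A hedged sketch of the control-smoothing step only: the bang-bang thrust magnitude is written as a smooth function of the switching function S with a continuation parameter eps, which is driven towards zero across successive solves. The switching-function values and the eps schedule below are invented, and the orbital dynamics and MPBVP solver are not reproduced.

```python
# Smooth continuation approximation of a discontinuous (bang-bang) thrust profile.
import numpy as np

def thrust_fraction(S, eps):
    """Smooth approximation of u = 1 if S < 0 else 0 (switching-function logic)."""
    return 1.0 / (1.0 + np.exp(S / eps))

S = np.linspace(-1.0, 1.0, 9)                 # sample switching-function values
for eps in (0.5, 0.1, 0.01):                  # continuation: sharpen the profile
    u = thrust_fraction(S, eps)
    print(f"eps={eps:<5} u={np.round(u, 3)}")
```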

  11. Optimal control of stretching process of flexible solar arrays on spacecraft based on a hybrid optimization strategy

    Directory of Open Access Journals (Sweden)

    Qijia Yao

    2017-07-01

    Full Text Available The optimal control of multibody spacecraft during the stretching process of solar arrays is investigated, and a hybrid optimization strategy based on Gauss pseudospectral method (GPM and direct shooting method (DSM is presented. First, the elastic deformation of flexible solar arrays was described approximately by the assumed mode method, and a dynamic model was established by the second Lagrangian equation. Then, the nonholonomic motion planning problem is transformed into a nonlinear programming problem by using GPM. By giving fewer LG points, initial values of the state variables and control variables were obtained. A serial optimization framework was adopted to obtain the approximate optimal solution from a feasible solution. Finally, the control variables were discretized at LG points, and the precise optimal control inputs were obtained by DSM. The optimal trajectory of the system can be obtained through numerical integration. Through numerical simulation, the stretching process of solar arrays is stable with no detours, and the control inputs match the various constraints of actual conditions. The results indicate that the method is effective with good robustness. Keywords: Motion planning, Multibody spacecraft, Optimal control, Gauss pseudospectral method, Direct shooting method

  12. Least median of squares filtering of locally optimal point matches for compressible flow image registration

    International Nuclear Information System (INIS)

    Castillo, Edward; Guerrero, Thomas; Castillo, Richard; White, Benjamin; Rojo, Javier

    2012-01-01

    Compressible flow based image registration operates under the assumption that the mass of the imaged material is conserved from one image to the next. Depending on how the mass conservation assumption is modeled, the performance of existing compressible flow methods is limited by factors such as image quality, noise, large magnitude voxel displacements, and computational requirements. The Least Median of Squares Filtered Compressible Flow (LFC) method introduced here is based on a localized, nonlinear least squares, compressible flow model that describes the displacement of a single voxel that lends itself to a simple grid search (block matching) optimization strategy. Spatially inaccurate grid search point matches, corresponding to erroneous local minimizers of the nonlinear compressible flow model, are removed by a novel filtering approach based on least median of squares fitting and the forward search outlier detection method. The spatial accuracy of the method is measured using ten thoracic CT image sets and large samples of expert determined landmarks (available at www.dir-lab.com). The LFC method produces an average error within the intra-observer error on eight of the ten cases, indicating that the method is capable of achieving a high spatial accuracy for thoracic CT registration. (paper)
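
    A hedged sketch of the outlier-filtering idea: candidate displacements from a grid search are screened by a least-median-of-squares fit (here a simple translation model chosen by random sampling), and matches with large residuals against that robust fit are discarded. This is a simplified, translation-only illustration, not the paper's forward-search procedure; all data are synthetic.

```python
# Least-median-of-squares screening of block-matching displacement vectors.
import numpy as np

def lmeds_translation(displacements, n_trials=200, seed=7):
    rng = np.random.default_rng(seed)
    best_t, best_med = None, np.inf
    for _ in range(n_trials):
        t = displacements[rng.integers(len(displacements))]   # one-point sample
        residuals = np.sum((displacements - t) ** 2, axis=1)
        med = np.median(residuals)
        if med < best_med:
            best_t, best_med = t, med
    # Robust scale estimate and inlier mask (2.5-sigma style cut-off).
    scale = 1.4826 * np.sqrt(best_med) + 1e-12
    resid = np.sqrt(np.sum((displacements - best_t) ** 2, axis=1))
    return best_t, resid < 2.5 * scale

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    good = np.array([2.0, -1.0]) + 0.1 * rng.standard_normal((40, 2))  # true shift
    bad = rng.uniform(-8, 8, (10, 2))                                  # spurious matches
    matches = np.vstack([good, bad])
    t_hat, inliers = lmeds_translation(matches)
    print("estimated shift:", np.round(t_hat, 2), "| inliers kept:", int(inliers.sum()), "/ 50")
```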

  13. An Optimal Strategy for Accurate Bulge-to-disk Decomposition of Disk Galaxies

    Energy Technology Data Exchange (ETDEWEB)

    Gao Hua [Department of Astronomy, School of Physics, Peking University, Beijing 100871 (China); Ho, Luis C. [Kavli Institute for Astronomy and Astrophysics, Peking University, Beijing 100871 (China)

    2017-08-20

    The development of two-dimensional (2D) bulge-to-disk decomposition techniques has shown their advantages over traditional one-dimensional (1D) techniques, especially for galaxies with non-axisymmetric features. However, the full potential of 2D techniques has yet to be fully exploited. Secondary morphological features in nearby disk galaxies, such as bars, lenses, rings, disk breaks, and spiral arms, are seldom accounted for in 2D image decompositions, even though some image-fitting codes, such as GALFIT, are capable of handling them. We present detailed, 2D multi-model and multi-component decomposition of high-quality R-band images of a representative sample of nearby disk galaxies selected from the Carnegie-Irvine Galaxy Survey, using the latest version of GALFIT. The sample consists of five barred and five unbarred galaxies, spanning Hubble types from S0 to Sc. Traditional 1D decomposition is also presented for comparison. In detailed case studies of the 10 galaxies, we successfully model the secondary morphological features. Through a comparison of best-fit parameters obtained from different input surface brightness models, we identify morphological features that significantly impact bulge measurements. We show that nuclear and inner lenses/rings and disk breaks must be properly taken into account to obtain accurate bulge parameters, whereas outer lenses/rings and spiral arms have a negligible effect. We provide an optimal strategy to measure bulge parameters of typical disk galaxies, as well as prescriptions to estimate realistic uncertainties of them, which will benefit subsequent decomposition of a larger galaxy sample.
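
    For readers who want to experiment with the underlying parametrization, the following sketch fits a one-dimensional Sersic bulge plus exponential disk profile with SciPy. It is only a toy analogue of the 2D GALFIT decompositions discussed above; the profile values, noise level and initial guesses are invented for the example.

```python
# Hedged 1D illustration only (GALFIT itself performs full 2D fits): decompose
# a surface-brightness profile into a Sersic bulge plus an exponential disk.
import numpy as np
from scipy.optimize import curve_fit

def sersic(r, I_e, r_e, n):
    b_n = 2.0 * n - 1.0 / 3.0                      # common approximation for b(n)
    return I_e * np.exp(-b_n * ((r / r_e) ** (1.0 / n) - 1.0))

def disk(r, I_0, h):
    return I_0 * np.exp(-r / h)

def bulge_plus_disk(r, I_e, r_e, n, I_0, h):
    return sersic(r, I_e, r_e, n) + disk(r, I_0, h)

r = np.linspace(0.5, 30.0, 120)                    # radii in arbitrary units
truth = bulge_plus_disk(r, 50.0, 2.0, 3.0, 20.0, 8.0)
obs = truth * (1.0 + 0.02 * np.random.default_rng(1).normal(size=r.size))

p0 = [40.0, 1.5, 2.5, 15.0, 6.0]                   # initial guesses
popt, _ = curve_fit(bulge_plus_disk, r, obs, p0=p0)
print("best-fit (I_e, r_e, n, I_0, h):", np.round(popt, 2))
```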

  14. An Optimal Strategy for Accurate Bulge-to-disk Decomposition of Disk Galaxies

    Science.gov (United States)

    Gao, Hua; Ho, Luis C.

    2017-08-01

    The development of two-dimensional (2D) bulge-to-disk decomposition techniques has shown their advantages over traditional one-dimensional (1D) techniques, especially for galaxies with non-axisymmetric features. However, the full potential of 2D techniques has yet to be fully exploited. Secondary morphological features in nearby disk galaxies, such as bars, lenses, rings, disk breaks, and spiral arms, are seldom accounted for in 2D image decompositions, even though some image-fitting codes, such as GALFIT, are capable of handling them. We present detailed, 2D multi-model and multi-component decomposition of high-quality R-band images of a representative sample of nearby disk galaxies selected from the Carnegie-Irvine Galaxy Survey, using the latest version of GALFIT. The sample consists of five barred and five unbarred galaxies, spanning Hubble types from S0 to Sc. Traditional 1D decomposition is also presented for comparison. In detailed case studies of the 10 galaxies, we successfully model the secondary morphological features. Through a comparison of best-fit parameters obtained from different input surface brightness models, we identify morphological features that significantly impact bulge measurements. We show that nuclear and inner lenses/rings and disk breaks must be properly taken into account to obtain accurate bulge parameters, whereas outer lenses/rings and spiral arms have a negligible effect. We provide an optimal strategy to measure bulge parameters of typical disk galaxies, as well as prescriptions to estimate realistic uncertainties of them, which will benefit subsequent decomposition of a larger galaxy sample.

  15. An Optimal Strategy for Accurate Bulge-to-disk Decomposition of Disk Galaxies

    International Nuclear Information System (INIS)

    Gao Hua; Ho, Luis C.

    2017-01-01

    The development of two-dimensional (2D) bulge-to-disk decomposition techniques has shown their advantages over traditional one-dimensional (1D) techniques, especially for galaxies with non-axisymmetric features. However, the full potential of 2D techniques has yet to be fully exploited. Secondary morphological features in nearby disk galaxies, such as bars, lenses, rings, disk breaks, and spiral arms, are seldom accounted for in 2D image decompositions, even though some image-fitting codes, such as GALFIT, are capable of handling them. We present detailed, 2D multi-model and multi-component decomposition of high-quality R-band images of a representative sample of nearby disk galaxies selected from the Carnegie-Irvine Galaxy Survey, using the latest version of GALFIT. The sample consists of five barred and five unbarred galaxies, spanning Hubble types from S0 to Sc. Traditional 1D decomposition is also presented for comparison. In detailed case studies of the 10 galaxies, we successfully model the secondary morphological features. Through a comparison of best-fit parameters obtained from different input surface brightness models, we identify morphological features that significantly impact bulge measurements. We show that nuclear and inner lenses/rings and disk breaks must be properly taken into account to obtain accurate bulge parameters, whereas outer lenses/rings and spiral arms have a negligible effect. We provide an optimal strategy to measure bulge parameters of typical disk galaxies, as well as prescriptions to estimate realistic uncertainties of them, which will benefit subsequent decomposition of a larger galaxy sample.

  16. Dispositional optimism and coping strategies in patients with a kidney transplant.

    Science.gov (United States)

    Costa-Requena, Gemma; Cantarell-Aixendri, M Carmen; Parramon-Puig, Gemma; Serón-Micas, Daniel

    2014-01-01

    Dispositional optimism is a personal resource that determines the coping style and adaptive response to chronic diseases. The aim of this study was to assess the correlations between dispositional optimism and coping strategies in patients with recent kidney transplantation and to evaluate the differences in the use of coping strategies in accordance with the level of dispositional optimism. Patients who were hospitalised in the nephrology department were selected consecutively after kidney transplantation was performed. The evaluation instruments were the Life Orientation Test-Revised and the Coping Strategies Inventory. The data were analysed with central tendency measures and correlation analyses, and means were compared using Student’s t-test. A total of 66 patients with a kidney transplant participated in the study. The coping styles that characterised patients with a recent kidney transplantation were Social withdrawal and Problem avoidance. Correlations between dispositional optimism and coping strategies were significant in a positive direction for Problem-solving (p<.05) and Cognitive restructuring (p<.01), and inversely for Self-criticism (p<.05). Differences in dispositional optimism created significant differences in the Self-criticism dimension (t=2.58; p<.01). Dispositional optimism scores produce differences in coping responses after kidney transplantation. Moreover, coping strategies may influence the patient’s perception of emotional wellbeing after kidney transplantation.

  17. Tank waste remediation system optimized processing strategy with an altered treatment scheme

    International Nuclear Information System (INIS)

    Slaathaug, E.J.

    1996-03-01

    This report provides an alternative strategy evolved from the current Hanford Site Tank Waste Remediation System (TWRS) programmatic baseline for accomplishing the treatment and disposal of the Hanford Site tank wastes. This optimized processing strategy with an altered treatment scheme performs the major elements of the TWRS Program, but modifies the deployment of selected treatment technologies to reduce the program cost. The present program for development of waste retrieval, pretreatment, and vitrification technologies continues, but the optimized processing strategy reuses a single facility to accomplish the separations/low-activity waste (LAW) vitrification and the high-level waste (HLW) vitrification processes sequentially, thereby eliminating the need for a separate HLW vitrification facility

  18. Optimization of remediation strategies using vadose zone monitoring systems

    Science.gov (United States)

    Dahan, Ofer

    2016-04-01

    In-situ bio-remediation of the vadose zone depends mainly on the ability to change the subsurface hydrological, physical and chemical conditions in order to enable the development of specific, indigenous, pollutant-degrading bacteria. As such, the remediation efficiency depends strongly on the ability to implement optimal hydraulic and chemical conditions in deep sections of the vadose zone. These conditions are usually determined in laboratory experiments where parameters such as the chemical composition of the soil water solution, redox potential and water content of the sediment are fully controlled. Usually, implementation of the desired optimal degradation conditions in the deep vadose zone at full-scale field setups is achieved through infiltration of water enriched with chemical additives at the land surface. It is assumed that deep percolation into the vadose zone would create chemical conditions that promote biodegradation of specific compounds. However, application of water with specific chemical conditions near the land surface does not necessarily promote the desired chemical and hydraulic conditions in deep sections of the vadose zone. A vadose-zone monitoring system (VMS) that was recently developed allows continuous monitoring of the hydrological and chemical properties of deep sections of the unsaturated zone. The VMS includes flexible time-domain reflectometry (FTDR) probes, which allow continuous monitoring of the temporal variation of the vadose zone water content, and vadose-zone sampling ports (VSPs), which are designed to allow frequent sampling of the sediment pore-water and gas at multiple depths. Implementation of the vadose zone monitoring system at sites that undergo active remediation provides real-time information on the actual chemical and hydrological conditions in the vadose zone as the remediation process progresses. To date, the system has been successfully implemented in several studies on water flow and contaminant transport in

  19. Multiple-point statistical simulation for hydrogeological models: 3-D training image development and conditioning strategies

    Science.gov (United States)

    Høyer, Anne-Sophie; Vignoli, Giulio; Mejer Hansen, Thomas; Thanh Vu, Le; Keefer, Donald A.; Jørgensen, Flemming

    2017-12-01

    Most studies on the application of geostatistical simulations based on multiple-point statistics (MPS) to hydrogeological modelling focus on relatively fine-scale models and concentrate on the estimation of facies-level structural uncertainty. Much less attention is paid to the use of input data and optimal construction of training images. For instance, even though the training image should capture a set of spatial geological characteristics to guide the simulations, the majority of the research still relies on 2-D or quasi-3-D training images. In the present study, we demonstrate a novel strategy for 3-D MPS modelling characterized by (i) realistic 3-D training images and (ii) an effective workflow for incorporating a diverse group of geological and geophysical data sets. The study covers an area of 2810 km2 in the southern part of Denmark. MPS simulations are performed on a subset of the geological succession (the lower to middle Miocene sediments) which is characterized by relatively uniform structures and dominated by sand and clay. The simulated domain is large and each of the geostatistical realizations contains approximately 45 million voxels with size 100 m × 100 m × 5 m. Data used for the modelling include water well logs, high-resolution seismic data, and a previously published 3-D geological model. We apply a series of different strategies for the simulations based on data quality, and develop a novel method to effectively create observed spatial trends. The training image is constructed as a relatively small 3-D voxel model covering an area of 90 km2. We use an iterative training image development strategy and find that even slight modifications in the training image create significant changes in simulations. Thus, this study shows how to include both the geological environment and the type and quality of input information in order to achieve optimal results from MPS modelling. We present a practical workflow to build the training image and

  20. Optimal processing for gel electrophoresis images: Applying Monte Carlo Tree Search in GelApp.

    Science.gov (United States)

    Nguyen, Phi-Vu; Ghezal, Ali; Hsueh, Ya-Chih; Boudier, Thomas; Gan, Samuel Ken-En; Lee, Hwee Kuan

    2016-08-01

    In biomedical research, gel band size estimation in electrophoresis analysis is a routine process. To facilitate and automate this process, numerous software tools have been released, notably the GelApp mobile app. However, the band detection accuracy is limited due to a band detection algorithm that cannot adapt to the variations in input images. To address this, we used the Monte Carlo Tree Search with Upper Confidence Bound (MCTS-UCB) method to efficiently search for optimal image processing pipelines for the band detection task, thereby improving the segmentation algorithm. Incorporating this into GelApp, we report a significant enhancement of gel band detection accuracy by 55.9 ± 2.0% for protein polyacrylamide gels, and 35.9 ± 2.5% for DNA SYBR green agarose gels. This implementation is a proof-of-concept demonstrating MCTS-UCB as a strategy to optimize general image segmentation. The improved version of GelApp, GelApp 2.0, is freely available on both the Google Play Store (for the Android platform) and the Apple App Store (for the iOS platform). © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
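
    A stripped-down illustration of the Upper Confidence Bound selection that drives such a search is given below. It treats each candidate pipeline as a bandit arm and is not the GelApp implementation; the pipeline names, scores and evaluation budget are placeholders.

```python
# Minimal UCB1 sketch (a simplification of MCTS-UCB): treat each candidate
# image-processing pipeline as an arm and spend a fixed evaluation budget
# where the upper confidence bound is largest.
import math
import random

def evaluate(pipeline):
    # Placeholder scorer: in practice this would run the pipeline on a
    # validation gel image and return a band-detection accuracy in [0, 1].
    return random.gauss({"A": 0.60, "B": 0.75, "C": 0.70}[pipeline], 0.05)

pipelines = ["A", "B", "C"]
counts = {p: 0 for p in pipelines}
totals = {p: 0.0 for p in pipelines}

for t in range(1, 201):
    def ucb(p):
        if counts[p] == 0:
            return float("inf")                    # force one trial of each arm
        mean = totals[p] / counts[p]
        return mean + math.sqrt(2.0 * math.log(t) / counts[p])
    choice = max(pipelines, key=ucb)
    reward = evaluate(choice)
    counts[choice] += 1
    totals[choice] += reward

print(max(pipelines, key=lambda p: totals[p] / counts[p]))
```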

  1. Optimizing noise control strategy in a forging workshop.

    Science.gov (United States)

    Razavi, Hamideh; Ramazanifar, Ehsan; Bagherzadeh, Jalal

    2014-01-01

    In this paper, a computer program based on a genetic algorithm is developed to find an economic solution for noise control in a forging workshop. Initially, input data, including characteristics of sound sources, human exposure, abatement techniques, and production plans are inserted into the model. Using sound pressure levels at working locations, the operators who are at higher risk are identified and picked out for the next step. The program is devised in MATLAB such that the parameters can be easily defined and changed for comparison. The final results are structured into 4 sections that specify an appropriate abatement method for each operator and machine, minimum allowance time for high-risk operators, required damping material for enclosures, and minimum total cost of these treatments. The validity of input data in addition to proper settings in the optimization model ensures the final solution is practical and economically reasonable.
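
    The genetic-algorithm skeleton below conveys the flavour of such a tool. It is a hedged sketch rather than the study's MATLAB program: the machine count, abatement options, costs and noise levels are invented, and the fitness simply penalizes residual exposure above a limit on top of the treatment cost.

```python
# Illustrative genetic-algorithm skeleton: each gene assigns one abatement
# option to one machine; fitness trades treatment cost against residual
# worker noise exposure above the allowed limit.
import random

MACHINES = 6
OPTIONS = [("none", 0, 0), ("damping", 6, 3), ("enclosure", 15, 10)]  # (name, dB cut, cost)

def fitness(chromosome, limit_db=85, base_db=95, penalty=100):
    cost = sum(OPTIONS[g][2] for g in chromosome)
    exceed = sum(max(0, base_db - OPTIONS[g][1] - limit_db) for g in chromosome)
    return -(cost + penalty * exceed)              # higher is better

def evolve(pop_size=40, generations=100, p_mut=0.1):
    pop = [[random.randrange(len(OPTIONS)) for _ in range(MACHINES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, MACHINES)
            child = a[:cut] + b[cut:]              # one-point crossover
            if random.random() < p_mut:
                child[random.randrange(MACHINES)] = random.randrange(len(OPTIONS))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

print(evolve())
```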

  2. Global optimization numerical strategies for rate-independent processes

    Czech Academy of Sciences Publication Activity Database

    Benešová, Barbora

    2011-01-01

    Roč. 50, č. 2 (2011), s. 197-220 ISSN 0925-5001 R&D Projects: GA ČR GAP201/10/0357 Grant - others:GA MŠk(CZ) LC06052 Program:LC Institutional research plan: CEZ:AV0Z20760514 Keywords : rate-independent processes * numerical global optimization * energy estimates based algorithm Subject RIV: BA - General Mathematics Impact factor: 1.196, year: 2011 http://math.hnue.edu.vn/portal/rss.viewpage.php?id=0000037780&ap=L3BvcnRhbC9ncmFiYmVyLnBocD9jYXRpZD0xMDEyJnBhZ2U9Mg==

  3. Optimization of wind farm power production using innovative control strategies

    DEFF Research Database (Denmark)

    Duc, Thomas

    Wind energy has experienced a very significant growth and cost reduction over the past decade, and is now able to compete with conventional power generation sources. New concepts are currently investigated to decrease costs of production of electricity even further. Wind farm coordinated control...... deficit caused by the wake downstream, or yawing the turbine to deflect the wake away from the downwind turbine. Simulation results found in the literature indicate that an increase in overall power production can be obtained. However they underline the high sensitivity of these gains to incoming wind...... aligned wind turbines. The experimental results show that the scenarios implemented during the first measurement campaign did not achieve an increase in overall power production, which confirms the difficulty to realize wind farm power optimization in real operating conditions. In the curtailment field...

  4. Optimal dispatch strategy for the agile virtual power plant

    DEFF Research Database (Denmark)

    Petersen, Mette Højgaard; Bendtsen, Jan Dimon; Stoustrup, Jakob

    2012-01-01

    The introduction of large ratios of renewable energy into the existing power system is complicated by the inherent variability of production technologies, which harvest energy from wind, sun and waves. Fluctuations of renewable power production can be predicted to some extent, but the assumption...... of perfect prediction is unrealistic. This paper therefore introduces the Agile Virtual Power Plant. The Agile Virtual Power Plant assumes that the base load production planning based on best available knowledge is already given, so imbalances cannot be predicted. Consequently the Agile Virtual Power Plant...... attempts to preserve maneuverability (stay agile) rather than optimize performance according to predictions. In this paper the imbalance compensation problem for an Agile Virtual Power Plant is formulated. It is proved formally, that when local units are power and energy constrained integrators a dispatch...

  5. Experimental transport phenomena and optimization strategies for thermoelectrics

    Energy Technology Data Exchange (ETDEWEB)

    Ehrlich, A C; Gillespie, D J

    1997-07-01

    When a new and promising thermoelectric material is discovered, an effort is undertaken to improve its figure of merit. If the effort is to be more efficient than one of trial and error, with perhaps some rule-of-thumb guidance, then it is important to be able to make the connection between experimental data and the underlying material characteristics, electronic and phononic, that influence the figure of merit. Transport and fermiology experimental data can be used to evaluate these material characteristics and thus establish trends as a function of some controllable parameter, such as composition. In this paper some of the generic materials characteristics generally believed to be required for a high figure of merit will be discussed in terms of the experimental approach to their evaluation and optimization. Transport and fermiology experiments will be emphasized, and both will be outlined in terms of what they can reveal and what can be obscured by the simplifying assumptions generally used in their interpretation.

  6. Optimization of coronary optical coherence tomography imaging using the attenuation-compensated technique: a validation study.

    NARCIS (Netherlands)

    Teo, Jing Chun; Foin, Nicolas; Otsuka, Fumiyuki; Bulluck, Heerajnarain; Fam, Jiang Ming; Wong, Philip; Low, Fatt Hoe; Leo, Hwa Liang; Mari, Jean-Martial; Joner, Michael; Girard, Michael J A; Virmani, Renu

    2016-01-01

    PURPOSE To optimize conventional coronary optical coherence tomography (OCT) images using the attenuation-compensated technique to improve identification of plaques and the external elastic lamina (EEL) contour. METHOD The attenuation-compensated technique was optimized via manipulating contrast

  7. Simultaneous topography and recognition imaging: physical aspects and optimal imaging conditions

    International Nuclear Information System (INIS)

    Preiner, Johannes; Ebner, Andreas; Zhu Rong; Hinterdorfer, Peter; Chtcheglova, Lilia

    2009-01-01

    Simultaneous topography and recognition imaging (TREC) allows for the investigation of receptor distributions on natural biological surfaces under physiological conditions. Based on atomic force microscopy (AFM) in combination with a cantilever tip carrying a ligand molecule, it enables us to sense topography and recognition of receptor molecules simultaneously with nanometre accuracy. In this study we introduce optimized handling conditions and investigate the physical properties of the cantilever-tip-sample ensemble, which is essential for the interpretation of the experimental data gained from this technique. In contrast to conventional AFM methods, TREC is based on a more sophisticated feedback loop, which enables us to discriminate topographical contributions from recognition events in the AFM cantilever motion. The features of this feedback loop were investigated through a detailed analysis of the topography and recognition data obtained on a model protein system. Single avidin molecules immobilized on a mica substrate were imaged with an AFM tip functionalized with a biotinylated IgG. A simple procedure for adjusting the optimal amplitude for TREC imaging is described by exploiting the sharp localization of the TREC signal within a small range of oscillation amplitudes. This procedure can also be used for proving the specificity of the detected receptor-ligand interactions. For understanding and eliminating topographical crosstalk in the recognition images we developed a simple theoretical model, which nicely explains its origin and its dependence on the excitation frequency.

  8. Optimal Sizing and Control Strategy Design for Heavy Hybrid Electric Truck

    Directory of Open Access Journals (Sweden)

    Yuan Zou

    2012-01-01

    Full Text Available Due to the complexity of the hybrid powertrain, the control design is highly involved in coordinating the different components. For a specific powertrain, the components' sizing only provides the capability to propel the vehicle, while the control realizes the propulsion function. The components' sizing also imposes constraints on the control design, which causes a close coupling between sizing and control strategy design. This paper presents a parametric study focused on the sizing of the powertrain components and the optimization of the power split between the engine and the electric motor for minimizing fuel consumption. A framework is put forward to accomplish the optimal sizing and control design for a heavy parallel pre-AMT hybrid truck under a natural driving schedule. An iterative plant-controller combined optimization methodology is adopted to optimize the key parameters of the plant and the control strategy simultaneously. A scalable powertrain model based on a bilevel optimization framework is built. Dynamic programming is applied to find the optimal control in the inner loop with a prescribed cycle, and the parameters are optimized in the outer loop. The results are analysed, and the optimal sizing and control strategy are obtained simultaneously.

  9. Footprints of Optimal Protein Assembly Strategies in the Operonic Structure of Prokaryotes

    Directory of Open Access Journals (Sweden)

    Jan Ewald

    2015-04-01

    Full Text Available In this work, we investigate optimality principles behind synthesis strategies for protein complexes using a dynamic optimization approach. We show that the cellular capacity of protein synthesis has a strong influence on optimal synthesis strategies reaching from a simultaneous to a sequential synthesis of the subunits of a protein complex. Sequential synthesis is preferred if protein synthesis is strongly limited, whereas a simultaneous synthesis is optimal in situations with a high protein synthesis capacity. We confirm the predictions of our optimization approach through the analysis of the operonic organization of protein complexes in several hundred prokaryotes. Thereby, we are able to show that cellular protein synthesis capacity is a driving force in the dissolution of operons comprising the subunits of a protein complex. Thus, we also provide a tested hypothesis explaining why the subunits of many prokaryotic protein complexes are distributed across several operons despite the presumably less precise co-regulation.

  10. Optimizing Lidar Scanning Strategies for Wind Energy Measurements (Invited)

    Science.gov (United States)

    Newman, J. F.; Bonin, T. A.; Klein, P.; Wharton, S.; Chilson, P. B.

    2013-12-01

    Environmental concerns and rising fossil fuel prices have prompted rapid development in the renewable energy sector. Wind energy, in particular, has become increasingly popular in the United States. However, the intermittency of available wind energy makes it difficult to integrate wind energy into the power grid. Thus, the expansion and successful implementation of wind energy requires accurate wind resource assessments and wind power forecasts. The actual power produced by a turbine is affected by the wind speeds and turbulence levels experienced across the turbine rotor disk. Because of the range of measurement heights required for wind power estimation, remote sensing devices (e.g., lidar) are ideally suited for these purposes. However, the volume averaging inherent in remote sensing technology produces turbulence estimates that are different from those estimated by a sonic anemometer mounted on a standard meteorological tower. In addition, most lidars intended for wind energy purposes utilize a standard Doppler beam-swinging or Velocity-Azimuth Display technique to estimate the three-dimensional wind vector. These scanning strategies are ideal for measuring mean wind speeds but are likely inadequate for measuring turbulence. In order to examine the impact of different lidar scanning strategies on turbulence measurements, a WindCube lidar, a scanning Halo lidar, and a scanning Galion lidar were deployed at the Southern Great Plains Atmospheric Radiation Measurement (ARM) site in Summer 2013. Existing instrumentation at the ARM site, including a 60-m meteorological tower and an additional scanning Halo lidar, were used in conjunction with the deployed lidars to evaluate several user-defined scanning strategies. For part of the experiment, all three scanning lidars were pointed at approximately the same point in space and a tri-Doppler analysis was completed to calculate the three-dimensional wind vector every 1 second. In another part of the experiment, one of
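
    The tri-Doppler step mentioned above reduces to linear algebra once the beam geometry is fixed: each lidar measures the projection of the wind vector onto its beam direction, so three independent beams determine the full vector. The sketch below assumes illustrative azimuth and elevation angles rather than the geometry used in the campaign.

```python
# Hedged sketch of the tri-Doppler retrieval idea: three lidars staring at the
# same point each measure a radial (along-beam) velocity, so the 3-D wind
# vector follows from solving a small linear system of beam-direction rows.
import numpy as np

def unit_vector(azimuth_deg, elevation_deg):
    az, el = np.radians(azimuth_deg), np.radians(elevation_deg)
    return np.array([np.cos(el) * np.sin(az),      # east component
                     np.cos(el) * np.cos(az),      # north component
                     np.sin(el)])                  # vertical component

# Assumed beam geometries (azimuth, elevation) for the three lidars.
beams = np.vstack([unit_vector(30.0, 10.0),
                   unit_vector(150.0, 12.0),
                   unit_vector(270.0, 8.0)])

true_wind = np.array([8.0, 3.0, 0.2])              # u, v, w in m/s
radial_velocities = beams @ true_wind              # what each lidar would measure

estimated_wind = np.linalg.solve(beams, radial_velocities)
print(np.round(estimated_wind, 3))                 # recovers [8.0, 3.0, 0.2]
```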

  11. The Bio-Inspired Optimization of Trading Strategies and Its Impact on the Efficient Market Hypothesis and Sustainable Development Strategies

    Directory of Open Access Journals (Sweden)

    Rafał Dreżewski

    2018-05-01

    Full Text Available In this paper, the evolutionary algorithm for the optimization of Forex market trading strategies is proposed. The introduction to issues related to the financial markets and the evolutionary algorithms precedes the main part of the paper, in which the proposed trading system is presented. The system uses the evolutionary algorithm for optimization of a parameterized greedy strategy, which is then used as an investment strategy on the Forex market. In the proposed system, a model of the Forex market was developed, including all elements that are necessary for simulating realistic trading processes. The proposed evolutionary algorithm contains several novel mechanisms that were introduced to optimize the greedy strategy. The most important of the proposed techniques are the mechanisms for maintaining the population diversity, a mechanism for protecting the best individuals in the population, the mechanisms preventing the excessive growth of the population, the mechanisms of the initialization of the population after moving the time window and a mechanism of choosing the best strategies used for trading. The experiments, conducted with the use of real-world Forex market data, were aimed at testing the quality of the results obtained using the proposed algorithm and comparing them with the results obtained by the buy-and-hold strategy. By comparing our results with the results of the buy-and-hold strategy, we attempted to verify the validity of the efficient market hypothesis. The credibility of the hypothesis would have more general implications for many different areas of our lives, including future sustainable development policies.

  12. Optimal combined purchasing strategies for a risk-averse manufacturer under price uncertainty

    Directory of Open Access Journals (Sweden)

    Qiao Wu

    2015-09-01

    Full Text Available Purpose: The purpose of our paper is to analyze optimal purchasing strategies when a manufacturer can buy raw materials from a long-term contract supplier and a spot market under spot price uncertainty. Design/methodology/approach: This procurement model can be solved by using dynamic programming. First, we maximize the DM’s utility of the second period, obtaining the optimal contract quantity and spot quantity for the second period. Then, maximize the DM’s utility of both periods, obtaining the optimal purchasing strategy for the first period. We use a numerical method to compare the performance level of a pure spot sourcing strategy with that of a mixed strategy. Findings: Our results show that optimal purchasing strategies vary with the trend of contract prices. If the contract price falls, the total quantity purchased in period 1 will decrease in the degree of risk aversion. If the contract price increases, the total quantity purchased in period 1 will increase in the degree of risk aversion. The relationship between the optimal contract quantity and the degree of risk aversion depends on whether the expected spot price or the contract price is larger in period 2. Finally, we compare the performance levels between a combined strategy and a spot sourcing strategy. It shows that a combined strategy is optimal for a risk-averse buyer. Originality/value: It’s challenging to deal with a two-period procurement problem with risk consideration. We have obtained results of a two-period procurement problem with two sourcing options, namely contract procurement and spot purchases. Our model incorporates the buyer’s risk aversion factor and the change of contract prices, which are not addressed in early studies.

  13. Optimal swimming strategies in mate searching pelagic copepods

    DEFF Research Database (Denmark)

    Kiørboe, Thomas

    2008-01-01

    Male copepods must swim to find females, but swimming increases the risk of meeting predators and is expensive in terms of energy expenditure. Here I address the trade-offs between gains and risks and the question of how much and how fast to swim using simple models that optimise the number...... of lifetime mate encounters. Radically different swimming strategies are predicted for different feeding behaviours, and these predictions are tested experimentally using representative species. In general, male swimming speeds and the difference in swimming speeds between the genders are predicted...... and observed to increase with increasing conflict between mate searching and feeding. It is high in ambush feeders, where searching (swimming) and feeding are mutually exclusive and low in species, where the matured males do not feed at all. Ambush feeding males alternate between stationary ambush feeding...

  14. Approximate representation of optimal strategies from influence diagrams

    DEFF Research Database (Denmark)

    Jensen, Finn V.

    2008-01-01

    , and where the policy functions for the decisions have so large domains that they cannot be represented directly in a strategy tree. The approach is to have separate ID representations for each decision variable. In each representation the actual information is fully exploited; however, the representation...... of policies for future decisions are approximations. We call the approximation information abstraction. It consists of introducing a dummy structure connecting the past with the decision. We study how to specify, implement and learn information abstraction.......There are three phases in the life of a decision problem: specification, solution, and representation of the solution. The specification and solution phases are off-line, while the representation of the solution often has to serve an on-line situation with rather tough constraints on time and space. One...

  15. An Improved Ensemble of Random Vector Functional Link Networks Based on Particle Swarm Optimization with Double Optimization Strategy.

    Science.gov (United States)

    Ling, Qing-Hua; Song, Yu-Qing; Han, Fei; Yang, Dan; Huang, De-Shuang

    2016-01-01

    For ensemble learning, how to select and how to combine the candidate classifiers are two key issues that dramatically influence the performance of the ensemble system. Random vector functional link (RVFL) networks without direct input-to-output links are suitable base classifiers for ensemble systems because of their fast learning speed, simple structure and good generalization performance. In this paper, to obtain a more compact ensemble system with improved convergence performance, an improved ensemble of RVFL based on attractive and repulsive particle swarm optimization (ARPSO) with a double optimization strategy is proposed. In the proposed method, ARPSO is applied to select and combine the candidate RVFLs. When using ARPSO to select the optimal base RVFLs, ARPSO considers both the convergence accuracy on the validation data and the diversity of the candidate ensemble system to build the RVFL ensembles. In the process of combining the RVFLs, the ensemble weights corresponding to the base RVFLs are initialized by the minimum norm least-squares method and then further optimized by ARPSO. Finally, a few redundant RVFLs are pruned, and thus a more compact ensemble of RVFL is obtained. Moreover, in this paper, theoretical analysis and justification of how to prune the base classifiers for classification problems are presented, and a simple and practically feasible strategy for pruning redundant base classifiers for both classification and regression problems is proposed. Since the double optimization is performed on the basis of the single optimization, the ensemble of RVFL built by the proposed method outperforms that built by some single optimization methods. Experimental results on function approximation and classification problems verify that the proposed method can improve convergence accuracy as well as reduce the complexity of the ensemble system.
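
    The minimum-norm least-squares initialization of the ensemble weights mentioned in the abstract can be sketched as follows; the base-learner predictions here are synthetic stand-ins, and the subsequent ARPSO refinement is omitted.

```python
# Sketch of one step named in the abstract, under simplified assumptions: the
# ensemble weights over base-learner outputs are initialized with a minimum-
# norm least-squares fit (np.linalg.lstsq), before any swarm-based refinement.
import numpy as np

rng = np.random.default_rng(42)
n_samples, n_learners = 200, 5

# Columns = validation-set predictions of each base learner (here synthetic).
targets = rng.normal(size=n_samples)
predictions = np.column_stack(
    [targets + rng.normal(scale=0.3 + 0.1 * k, size=n_samples)
     for k in range(n_learners)])

# Minimum-norm least-squares solution for the combination weights.
weights, *_ = np.linalg.lstsq(predictions, targets, rcond=None)
ensemble_output = predictions @ weights

print("weights:", np.round(weights, 3))
print("ensemble MSE:", round(float(np.mean((ensemble_output - targets) ** 2)), 4))
```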

  16. System Characterizations and Optimized Reconstruction Methods for Novel X-ray Imaging Modalities

    Science.gov (United States)

    Guan, Huifeng

    In the past decade, many new X-ray based imaging technologies have emerged for different diagnostic purposes or imaging tasks. However, there exist one or more specific problems that prevent them from being effectively or efficiently employed. In this dissertation, four different novel X-ray based imaging technologies are discussed, including propagation-based phase-contrast (PB-XPC) tomosynthesis, differential X-ray phase-contrast tomography (D-XPCT), projection-based dual-energy computed radiography (DECR), and tetrahedron beam computed tomography (TBCT). System characteristics are analyzed or optimized reconstruction methods are proposed for these imaging modalities. In the first part, we investigated the unique properties of the propagation-based phase-contrast imaging technique when combined with X-ray tomosynthesis. The Fourier slice theorem implies that the high frequency components collected in the tomosynthesis data can be more reliably reconstructed. It is observed that the fringes or boundary enhancement introduced by the phase-contrast effects can serve as an accurate indicator of the true depth position in the tomosynthesis in-plane image. In the second part, we derived a sub-space framework to reconstruct images from a few-view D-XPCT data set. By introducing a proper mask, the high frequency contents of the image can be theoretically preserved in a certain region of interest. A two-step reconstruction strategy is developed to mitigate the risk of subtle structures being oversmoothed when the commonly used total-variation regularization is employed in the conventional iterative framework. In the third part, we proposed a practical method to improve the quantitative accuracy of the projection-based dual-energy material decomposition. It is demonstrated that applying a total-projection-length constraint along with the dual-energy measurements can achieve a stabilized numerical solution of the decomposition problem, thus overcoming the

  17. Optimized protocols for cardiac magnetic resonance imaging in patients with thoracic metallic implants.

    Science.gov (United States)

    Olivieri, Laura J; Cross, Russell R; O'Brien, Kendall E; Ratnayaka, Kanishka; Hansen, Michael S

    2015-09-01

    Cardiac magnetic resonance (MR) imaging is a valuable tool in congenital heart disease; however, patients frequently have metal devices in the chest from the treatment of their disease, and these complicate imaging. Methods are needed to improve imaging around metal implants near the heart. Basic sequence parameter manipulations have the potential to minimize artifact while limiting effects on image resolution and quality. Our objective was to design cine and static cardiac imaging sequences that minimize metal artifact while maintaining image quality. Using systematic variation of standard imaging parameters on a fluid-filled phantom containing commonly used metal cardiac devices, we developed optimized sequences for steady-state free precession (SSFP) cine imaging, gradient recalled echo (GRE) cine imaging, and turbo spin-echo (TSE) black-blood imaging. We imaged 17 consecutive patients undergoing routine cardiac MR with 25 metal implants of various origins, using both standard and optimized imaging protocols for a given slice position. We rated images for quality and metal artifact size by measuring the metal artifact in two orthogonal planes within the image. All metal artifacts were reduced with optimized imaging. The average metal artifact reduction was 1.5+/-1.8 mm for the optimized SSFP cine, 4.6+/-4.5 mm for the optimized GRE cine, and 1.6+/-1.7 mm for the optimized TSE images. Cine and static sequences with reduced metal artifact are easily created by modifying basic sequence parameters, and the resulting images are superior to standard imaging sequences in both quality and artifact size. Specifically, for optimized cine imaging a GRE sequence should be used with settings that favor short echo time, i.e. flow compensation off, weak asymmetrical echo and a relatively high receiver bandwidth. For static black-blood imaging, a TSE sequence should be used with fat saturation turned off and high receiver bandwidth.

  18. Combining two strategies to optimize biometric decisions against spoofing attacks

    Science.gov (United States)

    Li, Weifeng; Poh, Norman; Zhou, Yicong

    2014-09-01

    A spoof attack by replicating biometric traits represents a real threat to an automatic biometric verification/authentication system. This is because the system, originally designed to distinguish genuine users from impostors, simply cannot distinguish a replicated biometric sample (replica) from a live sample. An effective solution is to obtain some measures that can indicate whether or not a biometric trait has been tampered with, e.g., liveness detection measures. These measures are referred to as evidence of spoofing or anti-spoofing measures. In order to make the final accept/reject decision, a straightforward solution is to define two thresholds: one for the anti-spoofing measure, and another for the verification score. We compared two variants of a method that relies on applying two thresholds - one to the verification (matching) score and another to the anti-spoofing measure. Our experiments, carried out using a signature database as well as by simulation, show that both the brute-force approach and its probabilistic variant turn out to be optimal under different operating conditions.
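
    The two-threshold rule itself is a one-screen function. The sketch below is a minimal illustration of the idea, not the authors' brute-force or probabilistic variant, and the threshold values are arbitrary placeholders.

```python
# Minimal two-threshold decision sketch (illustrative thresholds only):
# accept only if the match score is high enough AND the anti-spoofing
# (liveness) measure indicates a live sample.
def decide(match_score, liveness_score,
           match_threshold=0.62, liveness_threshold=0.50):
    """Return 'accept' only when both evidences pass their thresholds."""
    if liveness_score < liveness_threshold:
        return "reject (suspected spoof)"
    if match_score < match_threshold:
        return "reject (non-match)"
    return "accept"

print(decide(0.80, 0.90))   # genuine live user             -> accept
print(decide(0.85, 0.20))   # good match but likely replica -> reject
print(decide(0.40, 0.95))   # live but wrong person         -> reject
```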

  19. Energy Optimization Using a Case-Based Reasoning Strategy.

    Science.gov (United States)

    González-Briones, Alfonso; Prieto, Javier; De La Prieta, Fernando; Herrera-Viedma, Enrique; Corchado, Juan M

    2018-03-15

    At present, the domotization of homes and public buildings is becoming increasingly popular. Domotization is most commonly applied to the field of energy management, since it gives the possibility of managing the consumption of the devices connected to the electric network, the way in which the users interact with these devices, as well as other external factors that influence consumption. In buildings, Heating, Ventilation and Air Conditioning (HVAC) systems have the highest consumption rates. The systems proposed so far have not succeeded in optimizing the energy consumption associated with a HVAC system because they do not monitor all the variables involved in electricity consumption. For this reason, this article presents an agent approach that benefits from the advantages provided by a Multi-Agent architecture (MAS) deployed in a Cloud environment with a wireless sensor network (WSN) in order to achieve energy savings. The agents of the MAS learn social behavior thanks to the collection of data and the use of an artificial neural network (ANN). The proposed system has been assessed in an office building achieving an average energy savings of 41% in the experimental group offices.

  20. An optimal strategy for functional mapping of dynamic trait loci.

    Science.gov (United States)

    Jin, Tianbo; Li, Jiahan; Guo, Ying; Zhou, Xiaojing; Yang, Runqing; Wu, Rongling

    2010-02-01

    As an emerging powerful approach for mapping quantitative trait loci (QTLs) responsible for dynamic traits, functional mapping models the time-dependent mean vector with biologically meaningful equations and is likely to generate biologically relevant and interpretable results. Given the autocorrelated nature of a dynamic trait, functional mapping also requires models for the structure of the covariance matrix. In this article, we have provided a comprehensive set of approaches for modelling the covariance structure and incorporated each of these approaches into the framework of functional mapping. The Bayesian information criterion (BIC) values are used as a model selection criterion to choose the optimal combination of the submodels for the mean vector and covariance structure. In an example for leaf age growth from a rice molecular genetic project, the best submodel combination was found to be the Gaussian model for the correlation structure, the power equation of order 1 for the variance, and the power curve for the mean vector. Under this combination, several significant QTLs for leaf age growth trajectories were detected on different chromosomes. Our model can be well used to study the genetic architecture of dynamic traits of agricultural value.
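
    A hedged sketch of the covariance-structure selection step is shown below: candidate covariance models for repeated measurements are scored by BIC under a Gaussian likelihood. It is a simplified stand-in for the functional-mapping machinery, with plugged-in rather than ML-estimated parameters and synthetic data.

```python
# BIC-based selection among candidate covariance structures for a repeated-
# measures trait: BIC = -2 log L + k log n, lower is better. Parameters are
# plugged in (not ML-estimated) to keep the sketch short.
import numpy as np
from scipy.stats import multivariate_normal

rng = np.random.default_rng(3)
n, T = 80, 6                                   # individuals, time points
times = np.arange(T)
true_cov = 1.5 * 0.6 ** np.abs(np.subtract.outer(times, times))  # AR(1)-like
data = rng.multivariate_normal(np.zeros(T), true_cov, size=n)

def bic(cov, n_params):
    loglik = multivariate_normal(mean=np.zeros(T), cov=cov).logpdf(data).sum()
    return -2.0 * loglik + n_params * np.log(n)

s2_hat = data.var()
candidates = {
    "independence": (s2_hat * np.eye(T), 1),
    "compound symmetry": (s2_hat * (0.5 * np.ones((T, T)) + 0.5 * np.eye(T)), 2),
    "AR(1)": (s2_hat * 0.6 ** np.abs(np.subtract.outer(times, times)), 2),
}
for name, (cov, k) in candidates.items():
    print(name, round(bic(cov, k), 1))         # the AR(1)-type structure should win
```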

  1. Advanced Variance Reduction Strategies for Optimizing Mesh Tallies in MAVRIC

    International Nuclear Information System (INIS)

    Peplow, Douglas E.; Blakeman, Edward D; Wagner, John C

    2007-01-01

    More often than in the past, Monte Carlo methods are being used to compute fluxes or doses over large areas using mesh tallies (a set of region tallies defined on a mesh that overlays the geometry). For problems that demand that the uncertainty in each mesh cell be less than some set maximum, computation time is controlled by the cell with the largest uncertainty. This issue becomes quite troublesome in deep-penetration problems, and advanced variance reduction techniques are required to obtain reasonable uncertainties over large areas. The CADIS (Consistent Adjoint Driven Importance Sampling) methodology has been shown to very efficiently optimize the calculation of a response (flux or dose) for a single point or a small region using weight windows and a biased source based on the adjoint of that response. This has been incorporated into codes such as ADVANTG (based on MCNP) and the new sequence MAVRIC, which will be available in the next release of SCALE. In an effort to compute lower uncertainties everywhere in the problem, Larsen's group has also developed several methods to help distribute particles more evenly, based on forward estimates of flux. This paper focuses on the use of a forward estimate to weight the placement of the source in the adjoint calculation used by CADIS, which we refer to as a forward-weighted CADIS (FW-CADIS)
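
    The bookkeeping behind CADIS source biasing can be illustrated on a toy mesh, as below. The arrays stand in for a real forward/adjoint transport calculation, and the FW-CADIS modification is only indicated in a comment.

```python
# Hedged numerical sketch of the CADIS bookkeeping on a 1-D mesh (illustrative
# arrays, not a transport calculation): given a source distribution q and an
# adjoint-flux importance map phi_adj, form the biased source and the
# consistent weight-window target weights w = R / phi_adj.
import numpy as np

q = np.array([1.0, 0.5, 0.2, 0.0, 0.0])            # true source per mesh cell
phi_adj = np.array([0.01, 0.05, 0.2, 1.0, 5.0])    # adjoint flux (importance)

R = np.sum(q * phi_adj)                            # estimated detector response
q_biased = q * phi_adj / R                         # sample source where it matters
weights = R / phi_adj                              # birth weights keep the game fair

print("biased source pdf:", np.round(q_biased, 3))
print("weight-window centers:", np.round(weights, 3))

# FW-CADIS twist (as described above): to flatten uncertainty over a mesh
# tally rather than a single detector, the adjoint source would itself be
# weighted by 1/(forward flux estimate) before phi_adj is computed.
```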

  2. Optimal investment strategies in decentralized renewable power generation under uncertainty

    International Nuclear Information System (INIS)

    Fleten, S.-E.; Maribu, K.M.; Wangensteen, I.

    2007-01-01

    This paper presents a method for evaluating investments in decentralized renewable power generation under price uncertainty. The analysis is applicable for a client with an electricity load and a renewable resource that can be utilized for power generation. The investor has a deferrable opportunity to invest in one local power generating unit, with the objective of maximizing the profits from the opportunity. Renewable electricity generation can serve the local load when generation and load coincide in time, and surplus power can be exported to the grid. The problem is to find the price intervals and the capacity of the generator at which to invest. Results from a case with wind power generation for an office building suggest that it is optimal to wait for higher prices than the net present value break-even price under price uncertainty, and that the capacity choice can depend on the current market price and the price volatility. With low price volatility there can be more than one investment price interval for different units, with intermediate waiting regions between them. High price volatility increases the value of the investment opportunity and therefore makes it more attractive to postpone investment until larger units are profitable. (author)

  3. Energy Optimization Using a Case-Based Reasoning Strategy

    Directory of Open Access Journals (Sweden)

    Alfonso González-Briones

    2018-03-01

    Full Text Available At present, the domotization of homes and public buildings is becoming increasingly popular. Domotization is most commonly applied to the field of energy management, since it gives the possibility of managing the consumption of the devices connected to the electric network, the way in which the users interact with these devices, as well as other external factors that influence consumption. In buildings, Heating, Ventilation and Air Conditioning (HVAC) systems have the highest consumption rates. The systems proposed so far have not succeeded in optimizing the energy consumption associated with a HVAC system because they do not monitor all the variables involved in electricity consumption. For this reason, this article presents an agent approach that benefits from the advantages provided by a Multi-Agent architecture (MAS) deployed in a Cloud environment with a wireless sensor network (WSN) in order to achieve energy savings. The agents of the MAS learn social behavior thanks to the collection of data and the use of an artificial neural network (ANN). The proposed system has been assessed in an office building achieving an average energy savings of 41% in the experimental group offices.

  4. Energy Optimization Using a Case-Based Reasoning Strategy

    Science.gov (United States)

    Herrera-Viedma, Enrique

    2018-01-01

    At present, the domotization of homes and public buildings is becoming increasingly popular. Domotization is most commonly applied to the field of energy management, since it gives the possibility of managing the consumption of the devices connected to the electric network, the way in which the users interact with these devices, as well as other external factors that influence consumption. In buildings, Heating, Ventilation and Air Conditioning (HVAC) systems have the highest consumption rates. The systems proposed so far have not succeeded in optimizing the energy consumption associated with a HVAC system because they do not monitor all the variables involved in electricity consumption. For this reason, this article presents an agent approach that benefits from the advantages provided by a Multi-Agent architecture (MAS) deployed in a Cloud environment with a wireless sensor network (WSN) in order to achieve energy savings. The agents of the MAS learn social behavior thanks to the collection of data and the use of an artificial neural network (ANN). The proposed system has been assessed in an office building achieving an average energy savings of 41% in the experimental group offices. PMID:29543729

  5. A Parameter Communication Optimization Strategy for Distributed Machine Learning in Sensors.

    Science.gov (United States)

    Zhang, Jilin; Tu, Hangdi; Ren, Yongjian; Wan, Jian; Zhou, Li; Li, Mingwei; Wang, Jue; Yu, Lifeng; Zhao, Chang; Zhang, Lei

    2017-09-21

    In order to utilize the distributed nature of sensors, distributed machine learning has become the mainstream approach, but the differing computing capabilities of sensors and network delays greatly influence the accuracy and the convergence rate of the machine learning model. Our paper describes a parameter communication optimization strategy that balances the training overhead and the communication overhead. We extend the fault tolerance of iterative-convergent machine learning algorithms and propose the Dynamic Finite Fault Tolerance (DFFT). Based on the DFFT, we implement a parameter communication optimization strategy for distributed machine learning, named the Dynamic Synchronous Parallel Strategy (DSP), which uses a performance monitoring model to dynamically adjust the parameter synchronization strategy between worker nodes and the Parameter Server (PS). This strategy makes full use of the computing power of each sensor, ensures the accuracy of the machine learning model, and avoids situations in which model training is disturbed by tasks unrelated to the sensors.

  6. Optimal bidding strategies for competitive generators and large consumers

    International Nuclear Information System (INIS)

    Fushuan Wen; David, A.K.

    2001-01-01

    There exists the potential for gaming such as strategic bidding by participants (power suppliers and large consumers) in a deregulated power market, which is more an oligopoly than a laissez-faire market. Each participant can increase his or her own profit through strategic bidding but this has a negative effect on maximising social welfare. A method to build bidding strategies for both power suppliers and large consumers in a poolco-type electricity market is presented in this paper. It is assumed that each supplier/large consumer bids a linear supply/demand function, and the system is dispatched to maximise social welfare. Each supplier/large consumer chooses the coefficients in the linear supply/demand function to maximise benefits, subject to expectations about how rival participants will bid. The problem is formulated as a stochastic optimisation problem, and solved by a Monte Carlo approach. A numerical example with six suppliers and two large consumers serves to illustrate the essential features of the method. (author)
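
    A toy Monte Carlo version of such a bidding study is sketched below for a two-supplier, single-block market; the cost, demand, capacity and rival-bid distribution are assumptions made for the example, not values from the paper.

```python
# Toy Monte Carlo bidding sketch: a supplier scans its own bid price, samples
# rival bids from an assumed distribution, and keeps the price with the
# highest expected profit under a uniform-price clearing rule.
import numpy as np

rng = np.random.default_rng(7)
cost, demand, capacity = 20.0, 150.0, 100.0        # $/MWh, MWh, MWh per supplier

def expected_profit(own_price, n_draws=5000):
    profits = []
    for _ in range(n_draws):
        rival_price = rng.normal(25.0, 3.0)          # assumed rival bid ($/MWh)
        clearing_price = max(own_price, rival_price) # marginal bid sets the price
        if own_price <= rival_price:
            dispatched = min(capacity, demand)                       # scheduled first
        else:
            dispatched = max(0.0, min(capacity, demand - capacity))  # residual demand
        profits.append(dispatched * (clearing_price - cost))
    return float(np.mean(profits))

prices = np.arange(21.0, 35.0, 1.0)
best = max(prices, key=expected_profit)
print("best bid price:", best)
```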

  7. Communication strategies to optimize commitments and investments in iron programming.

    Science.gov (United States)

    Griffiths, Marcia

    2002-04-01

    There is consensus that a communications component is crucial to the success of iron supplementation and fortification programs. However, in many instances, we have not applied what we know about successful advocacy and program communications to iron programs. Communication must play a larger and more central role in iron programs to overcome several common shortcomings and allow the use of new commitments and investments in iron programming to optimum advantage. One shortcoming is that iron program communication has been driven primarily by the supply side of the supply-demand continuum. That is, technical information has been given without thought for what people want to know or do. To overcome this, the communication component, which should be responsive to the consumer perspective, must be considered at program inception, not enlisted late in the program cycle as a remedy when interventions fail to reach their targets. Another shortcoming is the lack of program focus on behavior. Because the "technology" of iron, a supplement, or fortified or specific local food must be combined with appropriate consumer behavior, it is not enough to promote the technology. The appropriate use of technology must be ensured, and this requires precise and strategically crafted communications. A small number of projects from countries as diverse as Indonesia, Egypt, Nicaragua and Peru offer examples of successful communications efforts and strategies for adaptation by other countries.

  8. An Equivalent Emission Minimization Strategy for Causal Optimal Control of Diesel Engines

    Directory of Open Access Journals (Sweden)

    Stephan Zentner

    2014-02-01

    Full Text Available One of the main challenges during the development of operating strategies for modern diesel engines is the reduction of the CO2 emissions, while complying with ever more stringent limits for the pollutant emissions. The inherent trade-off between the emissions of CO2 and pollutants renders a simultaneous reduction difficult. Therefore, an optimal operating strategy is sought that yields minimal CO2 emissions, while holding the cumulative pollutant emissions at the allowed level. Such an operating strategy can be obtained offline by solving a constrained optimal control problem. However, the final-value constraint on the cumulated pollutant emissions prevents this approach from being adopted for causal control. This paper proposes a framework for causal optimal control of diesel engines. The optimization problem can be solved online when the constrained minimization of the CO2 emissions is reformulated as an unconstrained minimization of the CO2 emissions and the weighted pollutant emissions (i.e., equivalent emissions). However, the weighting factors are not known a priori. A method for the online calculation of these weighting factors is proposed. It is based on the Hamilton–Jacobi–Bellman (HJB) equation and a physically motivated approximation of the optimal cost-to-go. A case study shows that the causal control strategy defined by the online calculation of the equivalence factor and the minimization of the equivalent emissions is only slightly inferior to the non-causal offline optimization, while being applicable to online control.
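
    A minimal numerical sketch of the equivalent-emissions idea is given below. It is not the authors' HJB-based controller: the candidate operating points, emission rates and the simple multiplicative update of the weighting factor are all assumptions chosen to make the mechanism visible.

```python
# Equivalent-emissions sketch under assumed numbers: instead of a hard NOx
# constraint, each candidate operating point is scored by CO2 + lambda * NOx;
# lambda is then nudged up or down depending on whether NOx is on budget.
candidates = [
    # (label, CO2 [g/s], NOx [g/s]) -- illustrative operating points
    ("early injection", 2.9, 0.040),
    ("baseline",        3.0, 0.025),
    ("late injection",  3.3, 0.012),
]

def pick_operating_point(lmbda):
    return min(candidates, key=lambda c: c[1] + lmbda * c[2])

lmbda, nox_budget_rate = 20.0, 0.020
for step in range(5):
    label, co2, nox = pick_operating_point(lmbda)
    # Simple causal update: emitting too much NOx makes NOx more expensive.
    lmbda *= 1.2 if nox > nox_budget_rate else 0.9
    print(step, label, "lambda =", round(lmbda, 2))
```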

  9. Multilevel Optimization Framework for Hierarchical Stiffened Shells Accelerated by Adaptive Equivalent Strategy

    Science.gov (United States)

    Wang, Bo; Tian, Kuo; Zhao, Haixin; Hao, Peng; Zhu, Tianyu; Zhang, Ke; Ma, Yunlong

    2017-06-01

    In order to improve the post-buckling optimization efficiency of hierarchical stiffened shells, a multilevel optimization framework accelerated by an adaptive equivalent strategy is presented in this paper. Firstly, the Numerical-based Smeared Stiffener Method (NSSM) for hierarchical stiffened shells is derived by means of the numerical implementation of asymptotic homogenization (NIAH) method. Based on the NSSM, a reasonable adaptive equivalent strategy for hierarchical stiffened shells is developed from the concept of hierarchy reduction. Its core idea is to self-adaptively decide which hierarchy of the structure should be equivalent according to the critical buckling mode rapidly predicted by NSSM. Compared with the detailed model, the high prediction accuracy and efficiency of the proposed model are highlighted. On the basis of this adaptive equivalent model, a multilevel optimization framework is then established by decomposing the complex entire optimization process into major-stiffener-level and minor-stiffener-level sub-optimizations, during which Fixed Point Iteration (FPI) is employed to accelerate convergence. Finally, illustrative examples of the multilevel framework are carried out to demonstrate its efficiency and effectiveness in searching for the global optimum, in contrast with the single-level optimization method. Remarkably, the high efficiency and flexibility of the adaptive equivalent strategy are demonstrated by comparison with the single equivalent strategy.

  10. A Particle Swarm Optimization Variant with an Inner Variable Learning Strategy

    Directory of Open Access Journals (Sweden)

    Guohua Wu

    2014-01-01

    Full Text Available Although Particle Swarm Optimization (PSO has demonstrated competitive performance in solving global optimization problems, it exhibits some limitations when dealing with optimization problems with high dimensionality and complex landscape. In this paper, we integrate some problem-oriented knowledge into the design of a certain PSO variant. The resulting novel PSO algorithm with an inner variable learning strategy (PSO-IVL) is particularly efficient for optimizing functions with symmetric variables. Symmetric variables of the optimized function have to satisfy a certain quantitative relation. Based on this knowledge, the inner variable learning (IVL) strategy helps the particle to inspect the relation among its inner variables, determine the exemplar variable for all other variables, and then make each variable learn from the exemplar variable in terms of their quantitative relations. In addition, we design a new trap detection and jumping out strategy to help particles escape from local optima. The trap detection operation is employed at the level of individual particles whereas the trap jumping out strategy is adaptive in its nature. Experimental simulations completed for some representative optimization functions demonstrate the excellent performance of PSO-IVL. The effectiveness of PSO-IVL underscores the usefulness of augmenting evolutionary algorithms with problem-oriented domain knowledge.

  11. Optimization strategies based on sequential quadratic programming applied for a fermentation process for butanol production.

    Science.gov (United States)

    Pinto Mariano, Adriano; Bastos Borba Costa, Caliane; de Franceschi de Angelis, Dejanira; Maugeri Filho, Francisco; Pires Atala, Daniel Ibraim; Wolf Maciel, Maria Regina; Maciel Filho, Rubens

    2009-11-01

    In this work, the mathematical optimization of a continuous flash fermentation process for the production of biobutanol was studied. The process consists of three interconnected units, as follows: fermentor, cell-retention system (tangential microfiltration), and vacuum flash vessel (responsible for the continuous recovery of butanol from the broth). The objective of the optimization was to maximize butanol productivity for a desired substrate conversion. Two strategies were compared for the optimization of the process. In one of them, the process was represented by a deterministic model with kinetic parameters determined experimentally and, in the other, by a statistical model obtained using the factorial design technique combined with simulation. For both strategies, the problem was written as a nonlinear programming problem and was solved with the sequential quadratic programming technique. The results showed that, despite the very similar solutions obtained with both strategies, the problems encountered with the deterministic-model strategy, such as lack of convergence and high computational time, make the statistical-model strategy, which proved to be robust and fast, more suitable for the flash fermentation process; it is therefore recommended for real-time applications coupling optimization and control.
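
    The nonlinear-programming formulation can be prototyped with an off-the-shelf SQP solver. The sketch below maximises a made-up productivity surrogate subject to a substrate-conversion constraint using SciPy's SLSQP routine; the decision variables, response-surface model, bounds and target value are chosen purely for illustration and are not the models used in the study.

```python
import numpy as np
from scipy.optimize import minimize

def productivity(x):
    # x = [dilution_rate, flash_recycle_fraction]; quadratic response-surface
    # stand-in for the statistical model fitted in the study
    d, r = x
    return 4.0 * d + 6.0 * r - 3.0 * d**2 - 4.0 * r**2 + 2.0 * d * r

def conversion(x):
    d, r = x
    return 0.95 - 0.30 * d + 0.10 * r        # illustrative conversion model

target_conversion = 0.90

res = minimize(
    lambda x: -productivity(x),              # SLSQP minimises, so negate the objective
    x0=np.array([0.3, 0.3]),
    method="SLSQP",
    bounds=[(0.05, 0.6), (0.0, 0.8)],
    constraints=[{"type": "ineq", "fun": lambda x: conversion(x) - target_conversion}],
)
print("optimal variables:", res.x, " productivity:", -res.fun)
```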

  12. Strategies to Optimize Adult Stem Cell Therapy for Tissue Regeneration

    Directory of Open Access Journals (Sweden)

    Shan Liu

    2016-06-01

    Full Text Available Stem cell therapy aims to replace damaged or aged cells with healthy functioning cells in congenital defects, tissue injuries, autoimmune disorders, and neurogenic degenerative diseases. Among various types of stem cells, adult stem cells (i.e., tissue-specific stem cells) commit to becoming the functional cells from their tissue of origin. These cells are the most commonly used in cell-based therapy since they do not confer risk of teratomas, do not require fetal stem cell maneuvers and thus are free of ethical concerns, and they confer low immunogenicity (even if allogenous). The goal of this review is to summarize the current state of the art and advances in using stem cell therapy for tissue repair in solid organs. Here we address key factors in cell preparation, such as the source of adult stem cells, optimal cell types for implantation (universal mesenchymal stem cells vs. tissue-specific stem cells, or induced vs. non-induced stem cells), early or late passages of stem cells, stem cells with endogenous or exogenous growth factors, preconditioning of stem cells (hypoxia, growth factors, or conditioned medium), and using various controlled release systems to deliver growth factors with hydrogels or microspheres to provide apposite interactions of stem cells and their niche. We also review several approaches of cell delivery that affect the outcomes of cell therapy, including the appropriate routes of cell administration (systemic, intravenous, or intraperitoneal vs. local administration), timing for cell therapy (immediate vs. a few days after injury), single injection of a large number of cells vs. multiple smaller injections, a single site for injection vs. multiple sites, and use of rodents vs. larger animal models. Future directions of stem cell-based therapies are also discussed to guide potential clinical applications.

  13. Development of a codon optimization strategy using the efor RED reporter gene as a test case

    Science.gov (United States)

    Yip, Chee-Hoo; Yarkoni, Orr; Ajioka, James; Wan, Kiew-Lian; Nathan, Sheila

    2018-04-01

    Synthetic biology is a platform that enables high-level synthesis of useful products such as pharmaceutically related drugs, bioplastics and green fuels from synthetic DNA constructs. Large-scale expression of these products can be achieved in an industrial compliant host such as Escherichia coli. To maximise the production of recombinant proteins in a heterologous host, the genes of interest are usually codon optimized based on the codon usage of the host. However, the bioinformatics freeware available for standard codon optimization might not be ideal in determining the best sequence for the synthesis of synthetic DNA. Synthesis of incorrect sequences can prove to be a costly error and to avoid this, a codon optimization strategy was developed based on the E. coli codon usage using the efor RED reporter gene as a test case. This strategy replaces codons encoding for serine, leucine, proline and threonine with the most frequently used codons in E. coli. Furthermore, codons encoding for valine and glycine are substituted with the second highly used codons in E. coli. Both the optimized and original efor RED genes were ligated to the pJS209 plasmid backbone using Gibson Assembly and the recombinant DNAs were transformed into E. coli E. cloni 10G strain. The fluorescence intensity per cell density of the optimized sequence was improved by 20% compared to the original sequence. Hence, the developed codon optimization strategy is proposed when designing an optimal sequence for heterologous protein production in E. coli.
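
    A minimal version of such a substitution rule is sketched below. The codon choices in the replacement table are illustrative assumptions and should be checked against an up-to-date E. coli codon-usage table before any real design work; handling of start/stop codons, restriction sites and secondary structure that a full workflow needs is omitted.

```python
# Hypothetical replacement table following the strategy described above:
# Ser/Leu/Pro/Thr -> (assumed) most frequent E. coli codon,
# Val/Gly         -> (assumed) second most frequent E. coli codon.
REPLACEMENT = {
    "S": "AGC", "L": "CTG", "P": "CCG", "T": "ACC",   # assumed top-ranked codons
    "V": "GTT", "G": "GGT",                           # assumed second-ranked codons
}

# Codon -> amino acid map restricted to the six residues we touch (standard genetic code).
CODON_TO_AA = {}
for aa, codons in {
    "S": ["TCT", "TCC", "TCA", "TCG", "AGT", "AGC"],
    "L": ["TTA", "TTG", "CTT", "CTC", "CTA", "CTG"],
    "P": ["CCT", "CCC", "CCA", "CCG"],
    "T": ["ACT", "ACC", "ACA", "ACG"],
    "V": ["GTT", "GTC", "GTA", "GTG"],
    "G": ["GGT", "GGC", "GGA", "GGG"],
}.items():
    for c in codons:
        CODON_TO_AA[c] = aa

def optimize_cds(seq):
    """Replace codons of the six targeted amino acids; leave all other codons untouched."""
    seq = seq.upper()
    assert len(seq) % 3 == 0, "coding sequence length must be a multiple of 3"
    out = []
    for i in range(0, len(seq), 3):
        codon = seq[i:i + 3]
        aa = CODON_TO_AA.get(codon)
        out.append(REPLACEMENT[aa] if aa else codon)
    return "".join(out)

print(optimize_cds("ATGTCACTTCCAACAGTAGGATAA"))  # toy ORF, not the eforRed gene
```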

  14. Optimization of reference library used in content-based medical image retrieval scheme

    International Nuclear Information System (INIS)

    Park, Sang Cheol; Sukthankar, Rahul; Mummert, Lily; Satyanarayanan, Mahadev; Zheng Bin

    2007-01-01

    Building an optimal image reference library is a critical step in developing the interactive computer-aided detection and diagnosis (I-CAD) systems of medical images using content-based image retrieval (CBIR) schemes. In this study, the authors conducted two experiments to investigate (1) the relationship between I-CAD performance and size of reference library and (2) a new reference selection strategy to optimize the library and improve I-CAD performance. The authors assembled a reference library that includes 3153 regions of interest (ROI) depicting either malignant masses (1592) or CAD-cued false-positive regions (1561) and an independent testing data set including 200 masses and 200 false-positive regions. A CBIR scheme using a distance-weighted K-nearest neighbor algorithm is applied to retrieve references that are considered similar to the testing sample from the library. The area under the receiver operating characteristic curve (A_z) is used as an index to evaluate the I-CAD performance. In the first experiment, the authors systematically increased reference library size and tested I-CAD performance. The result indicates that scheme performance improves initially from A_z = 0.715 to 0.874 and then plateaus when the library size reaches approximately half of its maximum capacity. In the second experiment, based on the hypothesis that a ROI should be removed if it performs poorly compared to a group of similar ROIs in a large and diverse reference library, the authors applied a new strategy to identify 'poorly effective' references. By removing 174 identified ROIs from the reference library, I-CAD performance significantly increases to A_z = 0.914 (p<0.01). The study demonstrates that increasing reference library size and removing poorly effective references can significantly improve I-CAD performance.
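
    The retrieval step of such a CBIR scheme can be illustrated in a few lines of NumPy. The feature vectors, distance metric and weighting below are generic stand-ins rather than the authors' exact implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Reference library: feature vectors plus a label (1 = malignant mass, 0 = CAD false positive)
ref_features = rng.normal(size=(500, 16))
ref_labels = rng.integers(0, 2, size=500)

def likelihood_score(query, k=15, eps=1e-6):
    """Distance-weighted K-nearest-neighbour estimate of the probability
    that the query ROI depicts a true mass."""
    d = np.linalg.norm(ref_features - query, axis=1)
    idx = np.argsort(d)[:k]                  # the K most similar references
    w = 1.0 / (d[idx] + eps)                 # closer references get larger weights
    return float(np.sum(w * ref_labels[idx]) / np.sum(w))

print(likelihood_score(rng.normal(size=16)))
```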

  15. A Single-Degree-of-Freedom Energy Optimization Strategy for Power-Split Hybrid Electric Vehicles

    Directory of Open Access Journals (Sweden)

    Chaoying Xia

    2017-07-01

    Full Text Available This paper presents a single-degree-of-freedom energy optimization strategy to solve the energy management problem existing in power-split hybrid electric vehicles (HEVs). The proposed strategy is based on a quadratic performance index, which is innovatively designed to simultaneously restrict the fluctuation of battery state of charge (SOC) and reduce fuel consumption. An extended quadratic optimal control problem is formulated by approximating the fuel consumption rate as a quadratic polynomial of engine power. The approximated optimal control law is obtained by utilizing the solution properties of the Riccati equation and the adjoint equation. It is easy to implement in real time, and its engineering significance is explained in detail. In order to validate the effectiveness of the proposed strategy, the forward-facing vehicle simulation model is established based on the ADVISOR software (Version 2002, National Renewable Energy Laboratory, Golden, CO, USA). The simulation results show that there is only a small difference in fuel consumption between the proposed strategy and the Pontryagin’s minimum principle (PMP)-based global optimal strategy, and the proposed strategy also exhibits good adaptability under different initial battery SOC, cargo mass and road slope conditions.

  16. Physics-based optimization of image quality in 3D X-ray flat-panel cone-beam imaging

    NARCIS (Netherlands)

    Snoeren, R.M.

    2012-01-01

    This thesis describes the techniques for modeling and control of 3D X-ray cardiovascular systems in terms of Image Quality and patient dose, aiming at optimizing the diagnostic quality. When aiming at maximum Image Quality (IQ), a cascaded system constituted from inter-dependent imaging components,

  17. An optimal staggered harvesting strategy for herbaceous biomass energy crops

    Energy Technology Data Exchange (ETDEWEB)

    Bhat, M.G.; English, B.C. [Univ. of Tennessee, Knoxville, TN (United States)

    1993-12-31

    Biofuel research over the past two decades indicates that lignocellulosic crops are a reliable source of feedstock for alternative energy. However, under the current technology for producing, harvesting and converting biomass crops, the cost of biofuel is not competitive with conventional biofuel. The cost of harvesting biomass feedstock is the single largest component of feedstock cost, so there is a cost advantage in carefully designing the biomass harvesting system. The traditional farmer-initiated harvesting operation causes over-investment. This study develops a least-cost, time-distributed (staggered) harvesting system, using switchgrass as an example, that calls for effective coordination between farmers, the processing plant and a single third-party custom harvester. A linear programming model explicitly accounts for the trade-off between yield loss and the benefit of reduced machinery overhead cost associated with the staggered harvesting system. The total cost of producing and harvesting switchgrass declines by 17.94 percent from the conventional non-staggered strategy to the proposed staggered harvesting strategy. Harvesting machinery cost alone experiences a significant reduction of 39.68 percent when moving from the former to the latter. The net return to farmers is estimated to increase by 160.40 percent. Per-tonne and per-hectare costs of feedstock production decline by 17.94 percent and 24.78 percent, respectively. These results clearly support the view that the traditional system of single-period harvesting calls for over-investment in agricultural machinery, which escalates the feedstock cost. This loss to society in the form of escalated harvesting cost can be avoided if there is proper coordination among farmers, the processing plant and custom harvesters as to when and how the biomass crop is to be planted and harvested. Such an institutional arrangement benefits producers, the processing plant and, in turn, end users of biofuels.
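
    The trade-off the linear programme captures can be reproduced in a toy model: harvesting later loses yield, but spreading the harvest over more periods shrinks the machinery capacity (and hence overhead) that must be paid for. All coefficients below are invented for illustration, and the problem is solved with SciPy's linprog.

```python
import numpy as np
from scipy.optimize import linprog

# Illustrative data: 4 harvest periods, yield (t/ha) drops the later we harvest.
yields = np.array([10.0, 9.5, 8.8, 8.0])   # dry tonnes per hectare in each period
price = 40.0                               # $ per tonne delivered
var_cost = 120.0                           # $ per hectare harvested (fuel, labour)
cap_cost = 300.0                           # $ per hectare of per-period machinery capacity
area = 1000.0                              # total hectares to harvest

# Decision vector: [x1, x2, x3, x4, M] where x_t = ha harvested in period t and
# M = machinery capacity in ha/period. Maximise net return = minimise its negative.
c = np.append(-(price * yields - var_cost), cap_cost)

A_eq = [[1, 1, 1, 1, 0]]                   # every hectare is harvested exactly once
b_eq = [area]
A_ub = [[1, 0, 0, 0, -1],                  # x_t <= M in every period
        [0, 1, 0, 0, -1],
        [0, 0, 1, 0, -1],
        [0, 0, 0, 1, -1]]
b_ub = [0, 0, 0, 0]

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 5)
x, M = res.x[:4], res.x[4]
print("hectares per period:", np.round(x, 1),
      " capacity:", round(M, 1), " net return: $", round(-res.fun))
```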

  18. Optimization of PET image quality by means of 3D data acquisition and iterative image reconstruction

    International Nuclear Information System (INIS)

    Doll, J.; Zaers, J.; Trojan, H.; Bellemann, M.E.; Adam, L.E.; Haberkorn, U.; Brix, G.

    1998-01-01

    The experiments were performed on the latest-generation whole-body PET system ECAT EXACT HR+. For 2D data acquisition, a collimator of thin tungsten septa was positioned in the field-of-view. Prior to image reconstruction, the measured 3D data were sorted into 2D sinograms by using the Fourier rebinning (FORE) algorithm developed by M. Defrise. The standard filtered backprojection (FBP) method and an optimized ML/EM algorithm with overrelaxation for accelerated convergence were employed for image reconstruction. The spatial resolution of both methods as well as the convergence and noise properties of the ML/EM algorithm were studied in phantom measurements. Furthermore, patient data were acquired in the 2D mode as well as in the 3D mode and reconstructed with both techniques. At the same spatial resolution, the ML/EM-reconstructed images showed fewer and less prominent artefacts than the FBP-reconstructed images. The resulting improvement in detail conspicuity was achieved for data acquired in the 2D mode as well as in the 3D mode. The best image quality was obtained by iterative 2D reconstruction of 3D data sets which were previously rebinned into 2D sinograms with the help of the FORE algorithm. The phantom measurements revealed that 50 iteration steps with the optimized ML/EM algorithm were sufficient to keep the relative quantitation error below 5%. (orig./MG) [de]
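
    For readers unfamiliar with the ML/EM iteration referred to above, the fragment below implements the basic multiplicative update for a tiny random toy system; it omits the overrelaxation, the FORE rebinning step and all physical corrections, and the system matrix and data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy system: 40 image pixels, 60 detector bins, random non-negative system matrix A.
A = rng.random((60, 40))
x_true = rng.random(40) * 10.0
y = rng.poisson(A @ x_true)                  # noisy projection data

x = np.ones(40)                              # uniform initial image
sens = A.T @ np.ones(60)                     # sensitivity image A^T 1
for _ in range(50):                          # 50 iterations, as in the phantom study
    forward = A @ x
    ratio = y / np.maximum(forward, 1e-12)   # compare measured and estimated projections
    x *= (A.T @ ratio) / sens                # multiplicative ML-EM update

print(np.round(x[:5], 2), np.round(x_true[:5], 2))
```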

  19. Visual Communications for Heterogeneous Networks/Visually Optimized Scalable Image Compression. Final Report for September 1, 1995 - February 28, 2002

    Energy Technology Data Exchange (ETDEWEB)

    Hemami, S. S.

    2003-06-03

    The authors developed image and video compression algorithms that provide scalability, reconstructibility, and network adaptivity, and developed compression and quantization strategies that are visually optimal at all bit rates. The goal of this research is to enable reliable "universal access" to visual communications over the National Information Infrastructure (NII). All users, regardless of their individual network connection bandwidths, qualities-of-service, or terminal capabilities, should have the ability to access still images, video clips, and multimedia information services, and to use interactive visual communications services. To do so requires special capabilities for image and video compression algorithms: scalability, reconstructibility, and network adaptivity. Scalability allows an information service to provide visual information at many rates, without requiring additional compression or storage after the stream has been compressed the first time. Reconstructibility allows reliable visual communications over an imperfect network. Network adaptivity permits real-time modification of compression parameters to adjust to changing network conditions. Furthermore, to optimize the efficiency of the compression algorithms, they should be visually optimal, where each bit expended reduces the visual distortion. Visual optimality is achieved through first extensive experimentation to quantify human sensitivity to supra-threshold compression artifacts and then incorporation of these experimental results into quantization strategies and compression algorithms.

  20. Dual-source CT coronary imaging in heart transplant recipients: image quality and optimal reconstruction interval

    International Nuclear Information System (INIS)

    Bastarrika, Gorka; Arraiza, Maria; Pueyo, Jesus C.; Cecco, Carlo N. de; Ubilla, Matias; Mastrobuoni, Stefano; Rabago, Gregorio

    2008-01-01

    The image quality and optimal reconstruction interval for coronary arteries in heart transplant recipients undergoing non-invasive dual-source computed tomography (DSCT) coronary angiography were evaluated. Twenty consecutive heart transplant recipients who underwent DSCT coronary angiography were included (19 male, one female; mean age 63.1±10.7 years). Data sets were reconstructed in 5% steps from 30% to 80% of the R-R interval. Two blinded independent observers assessed the image quality of each coronary segment using a five-point scale (from 0 = not evaluative to 4 = excellent quality). A total of 289 coronary segments in 20 heart transplant recipients were evaluated. Mean heart rate during the scan was 89.1±10.4 bpm. At the best reconstruction interval, diagnostic image quality (score ≥2) was obtained in 93.4% of the coronary segments (270/289) with a mean image quality score of 3.04±0.63. Systolic reconstruction intervals provided better image quality scores than diastolic reconstruction intervals (overall mean quality scores obtained with the systolic and diastolic reconstructions 3.03±1.06 and 2.73±1.11, respectively; P<0.001). Different systolic reconstruction intervals (35%, 40%, 45% of the RR interval) did not yield significant differences in image quality scores for the coronary segments (P=0.74). Reconstructions obtained at the systolic phase of the cardiac cycle allowed excellent diagnostic image quality coronary angiograms in heart transplant recipients undergoing DSCT coronary angiography. (orig.)

  1. An Optimal Operating Strategy for Battery Life Cycle Costs in Electric Vehicles

    Directory of Open Access Journals (Sweden)

    Yinghua Han

    2014-01-01

    Full Text Available The impact of petroleum-based vehicles on the environment, together with the cost and availability of fuel, has led to an increased interest in electric vehicles as a means of transportation. The battery is a major component in an electric vehicle, and the economic viability of these vehicles depends on the availability of cost-effective batteries. This paper presents a generalized formulation for determining the optimal operating strategy and cost optimization for the battery. The deterioration of the battery is assumed to be stochastic. Under these assumptions, the proposed operating strategy for the battery is formulated as a nonlinear optimization problem considering reliability and the number of failures, and an explicit expression of the average cost rate over the battery lifetime is derived. Results show that the proposed operating strategy enhances availability and reliability at a low cost.

  2. Optimal Investment-Consumption Strategy under Inflation in a Markovian Regime-Switching Market

    Directory of Open Access Journals (Sweden)

    Huiling Wu

    2016-01-01

    Full Text Available This paper studies an investment-consumption problem under inflation. The consumption price level, the prices of the available assets, and the coefficient of the power utility are assumed to be sensitive to the states of underlying economy modulated by a continuous-time Markovian chain. The definition of admissible strategies and the verification theory corresponding to this stochastic control problem are presented. The analytical expression of the optimal investment strategy is derived. The existence, boundedness, and feasibility of the optimal consumption are proven. Finally, we analyze in detail by mathematical and numerical analysis how the risk aversion, the correlation coefficient between the inflation and the stock price, the inflation parameters, and the coefficient of utility affect the optimal investment and consumption strategy.

  3. Integrated Optimization of Bus Line Fare and Operational Strategies Using Elastic Demand

    Directory of Open Access Journals (Sweden)

    Chunyan Tang

    2017-01-01

    Full Text Available An optimization approach for designing a transit service system is proposed. Its objective is the maximization of total social welfare by providing a profitable fare structure and tailoring operational strategies to passenger demand. These operational strategies include full route operation (FRO), limited stop, short turn, and a mix of the latter two strategies. The demand function is formulated to reflect the attributes of these strategies, in-vehicle crowding, and fare effects on demand variation. The fare is either a flat fare or a differential fare structure; the latter is based on trip distance and achieved service levels. The proposed methodology is applied to a case study of Dalian, China. The results indicate that an optimal combination of operational strategies integrated with a differential fare structure offers the highest potential for increasing total social welfare if the value of the parameter ε related to the additional service fee is low. When this value increases beyond a threshold, strategies with a flat fare show greater benefits. If this value increases beyond yet another threshold, the use of skipped-stop strategies is not recommended.

  4. Design of Underwater Robot Lines Based on a Hybrid Automatic Optimization Strategy

    Institute of Scientific and Technical Information of China (English)

    Wenjing Lyu; Weilin Luo

    2014-01-01

    In this paper, a hybrid automatic optimization strategy is proposed for the design of underwater robot lines. Isight is introduced as an integration platform. The construction of this platform is based on user programming and several commercial software packages, including UG 6.0, GAMBIT 2.4.6 and FLUENT 12.0. An intelligent parameter optimization method, particle swarm optimization, is incorporated into the platform. To verify the proposed strategy, a simulation is conducted on the underwater robot model 5470, which originates from the DTRC SUBOFF project. With the automatic optimization platform, the minimal resistance is taken as the optimization goal; the wet surface area as the constraint condition; and the length of the fore-body, the maximum body radius and the after-body’s minimum radius as the design variables. In the CFD calculation, the RANS equations and the standard turbulence model are used for the numerical simulation. Analysis of the simulation results shows that the platform is highly efficient and feasible. Through the platform, a variety of schemes for the design of the lines are generated and the optimal solution is achieved. The combination of the intelligent optimization algorithm and the numerical simulation ensures a global optimal solution and improves the efficiency of the search for solutions.

  5. White blood cell counting analysis of blood smear images using various segmentation strategies

    Science.gov (United States)

    Safuan, Syadia Nabilah Mohd; Tomari, Razali; Zakaria, Wan Nurshazwani Wan; Othman, Nurmiza

    2017-09-01

    In white blood cell (WBC) diagnosis, the most crucial measurement parameter is the WBC count. Such information is widely used to evaluate the effectiveness of cancer therapy and to diagnose several hidden infections within the human body. The current practice of manual WBC counting is laborious and highly subjective, which has led to the development of computer-aided systems (CAS) with rigorous image processing solutions. In CAS counting, segmentation is the crucial step for ensuring the accuracy of the counted cells. An optimal segmentation strategy that can work under various blood smear image acquisition conditions remains a great challenge. In this paper, a comparison between different segmentation methods based on color space analysis is elaborated to obtain the best counting outcome. Initially, color space correction is applied to the original blood smear image to standardize the image color intensity level. Next, white blood cell segmentation is performed by using a combination of several color analysis subtractions, namely RGB, CMYK and HSV, together with Otsu thresholding. Noise and unwanted regions that remain after the segmentation process are eliminated by applying a combination of morphological and Connected Component Labelling (CCL) filters. Eventually, the Circle Hough Transform (CHT) method is applied to the segmented image to estimate the number of WBCs, including those within clumped regions. From the experiment, it is found that G-S yields the best performance.
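
    A stripped-down version of such a pipeline, here using an HSV saturation channel with Otsu thresholding, morphological clean-up, connected-component filtering and a circular Hough transform (rather than the paper's exact G-S colour subtraction), could look like the following; the file name, kernel size and area/radius thresholds are placeholders.

```python
import cv2
import numpy as np

img = cv2.imread("blood_smear.png")                      # placeholder file name
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
sat = hsv[:, :, 1]                                       # stained WBC nuclei are strongly saturated

# Otsu threshold on the saturation channel to isolate candidate WBC regions.
_, mask = cv2.threshold(sat, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Morphological opening removes small specks; closing fills holes inside cells.
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

# Connected-component labelling discards regions smaller than a plausible cell.
n, labels, stats, _ = cv2.connectedComponentsWithStats(mask)
clean = np.zeros_like(mask)
for i in range(1, n):
    if stats[i, cv2.CC_STAT_AREA] > 200:                 # area threshold is a placeholder
        clean[labels == i] = 255

# Circular Hough transform estimates the cell count, including cells in clumps.
circles = cv2.HoughCircles(clean, cv2.HOUGH_GRADIENT, dp=1.2, minDist=20,
                           param1=50, param2=15, minRadius=10, maxRadius=40)
count = 0 if circles is None else circles.shape[1]
print("estimated WBC count:", count)
```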

  6. An Optimal Investment Strategy and Multiperiod Deposit Insurance Pricing Model for Commercial Banks

    Directory of Open Access Journals (Sweden)

    Grant E. Muller

    2018-01-01

    Full Text Available We employ the method of stochastic optimal control to derive the optimal investment strategy for maximizing an expected exponential utility of a commercial bank’s capital at some future date T>0. In addition, we derive a multiperiod deposit insurance (DI pricing model that incorporates the explicit solution of the optimal control problem and an asset value reset rule comparable to the typical practice of insolvency resolution by insuring agencies. By way of numerical simulations, we study the effects of changes in the DI coverage horizon, the risk associated with the asset portfolio of the bank, and the bank’s initial leverage level (deposit-to-asset ratio on the DI premium while the optimal investment strategy is followed.

  7. Multi-objective optimal strategy for generating and bidding in the power market

    International Nuclear Information System (INIS)

    Peng Chunhua; Sun Huijuan; Guo Jianfeng; Liu Gang

    2012-01-01

    Highlights: ► A new benefit/risk/emission comprehensive generation optimization model is established. ► A hybrid multi-objective differential evolution optimization algorithm is designed. ► Fuzzy set theory and entropy weighting method are employed to extract the general best solution. ► The proposed approach of generating and bidding is efficient for maximizing profit and minimizing both risk and emissions. - Abstract: Based on the coordinated interaction between unit output and electricity market prices, the benefit/risk/emission comprehensive generation optimization model with objectives of maximal profit and minimal bidding risk and emissions is established. A hybrid multi-objective differential evolution optimization algorithm, which successfully integrates Pareto non-dominated sorting with the differential evolution algorithm and improves the individual crowding distance mechanism and mutation strategy to avoid premature convergence and uneven search, is designed to obtain the Pareto optimal set of this model. Moreover, fuzzy set theory and the entropy weighting method are employed to extract one of the Pareto optimal solutions as the general best solution. Several optimization runs have been carried out on different cases of generation bidding and scheduling. The results confirm the potential and effectiveness of the proposed approach in solving the multi-objective optimization problem of generation bidding and scheduling. In addition, the comparison with classical optimization algorithms demonstrates the advantages of the proposed algorithm, such as the integrity of the Pareto front, well-distributed Pareto-optimal solutions, and high search speed.

  8. Optimized Management of Groundwater Resources in Kish Island: A Sensitivity Analysis of Optimal Strategies in Response to Environmental Changes

    Directory of Open Access Journals (Sweden)

    Davood Mahmoodzadeh

    2016-05-01

    Full Text Available Groundwater in coastal areas is an essential source of freshwater that warrants protection from seawater intrusion as a priority based on an optimal management plan. Proper optimal management strategies can be developed using a variety of decision-making models. The present study aims to investigate the impacts of environmental changes on groundwater resources. For this purpose, a combined simulation-optimization model is employed that incorporates the SUTRA numerical model and the evolutionary method of ant colony optimization. The fresh groundwater lens in Kish Island is used as a case study and different scenarios are considered for the likely environmental changes. Results indicate that while variations in recharge rate form an important factor in the fresh groundwater lens, land-surface inundation due to rises in seawater level, especially in low-lying lands, is the major factor affecting the lens. Furthermore, the impacts of environmental changes, when incorporated into the Kish Island aquifer optimization management plan, lead to a reduction of more than 20% in the allowable water extraction, indicating the high sensitivity of groundwater resources management plans in small islands to such variations.

  9. Magnetic resonance imaging of vulnerable atherosclerotic plaques: current imaging strategies and molecular imaging probes

    NARCIS (Netherlands)

    Briley-Saebo, Karen C.; Mulder, Willem J. M.; Mani, Venkatesh; Hyafil, Fabien; Amirbekian, Vardan; Aguinaldo, Juan Gilberto S.; Fisher, Edward A.; Fayad, Zahi A.

    2007-01-01

    The vulnerability or destabilization of atherosclerotic plaques has been directly linked to plaque composition. Imaging modalities, such as magnetic resonance (MR) imaging, that allow for evaluation of plaque composition at a cellular and molecular level, could further improve the detection of

  10. Optimizing 4-Dimensional Magnetic Resonance Imaging Data Sampling for Respiratory Motion Analysis of Pancreatic Tumors

    Energy Technology Data Exchange (ETDEWEB)

    Stemkens, Bjorn, E-mail: b.stemkens@umcutrecht.nl [Department of Radiotherapy, University Medical Center Utrecht, Utrecht (Netherlands); Tijssen, Rob H.N. [Department of Radiotherapy, University Medical Center Utrecht, Utrecht (Netherlands); Senneville, Baudouin D. de [Imaging Division, University Medical Center Utrecht, Utrecht (Netherlands); L' Institut de Mathématiques de Bordeaux, Unité Mixte de Recherche 5251, Centre National de la Recherche Scientifique/University of Bordeaux, Bordeaux (France); Heerkens, Hanne D.; Vulpen, Marco van; Lagendijk, Jan J.W.; Berg, Cornelis A.T. van den [Department of Radiotherapy, University Medical Center Utrecht, Utrecht (Netherlands)

    2015-03-01

    Purpose: To determine the optimum sampling strategy for retrospective reconstruction of 4-dimensional (4D) MR data for nonrigid motion characterization of tumor and organs at risk for radiation therapy purposes. Methods and Materials: For optimization, we compared 2 surrogate signals (external respiratory bellows and internal MRI navigators) and 2 MR sampling strategies (Cartesian and radial) in terms of image quality and robustness. Using the optimized protocol, 6 pancreatic cancer patients were scanned to calculate the 4D motion. Region of interest analysis was performed to characterize the respiratory-induced motion of the tumor and organs at risk simultaneously. Results: The MRI navigator was found to be a more reliable surrogate for pancreatic motion than the respiratory bellows signal. Radial sampling is most benign for undersampling artifacts and intraview motion. Motion characterization revealed interorgan and interpatient variation, as well as heterogeneity within the tumor. Conclusions: A robust 4D-MRI method, based on clinically available protocols, is presented and successfully applied to characterize the abdominal motion in a small number of pancreatic cancer patients.

  11. Optimal Control and Operation Strategy for Wind Turbines Contributing to Grid Primary Frequency Regulation

    Directory of Open Access Journals (Sweden)

    Mun-Kyeom Kim

    2017-09-01

    Full Text Available This study introduces a frequency regulation strategy to enable the participation of wind turbines with permanent magnet synchronous generators (PMSGs). The optimal strategy focuses on developing the frequency support capability of PMSGs connected to the power system. Active power control is performed using maximum power point tracking (MPPT) and de-loaded control to supply the required power reserve following a disturbance. A kinetic energy (KE) reserve control is developed to enhance the frequency regulation capability of wind turbines. The coordination with the de-loaded control prevents instability in the PMSG wind system due to excessive KE discharge. A KE optimization method that maximizes the sum of the KE reserves at wind farms is also adopted to determine the de-loaded power reference for each PMSG wind turbine using the particle swarm optimization (PSO) algorithm. To validate the effectiveness of the proposed optimal control and operation strategy, three different case studies are conducted using the PSCAD/EMTDC simulation tool. The results demonstrate that the optimal strategy enhances the frequency support contribution from PMSG wind turbines.

  12. Dynamic contrast-enhanced MR imaging of endometrial cancer. Optimizing the imaging delay for tumour-myometrium contrast

    International Nuclear Information System (INIS)

    Park, Sung Bin; Moon, Min Hoan; Sung, Chang Kyu; Oh, Sohee; Lee, Young Ho

    2014-01-01

    To investigate the optimal imaging delay time of dynamic contrast-enhanced magnetic resonance (MR) imaging in women with endometrial cancer. This prospective single-institution study was approved by the institutional review board, and informed consent was obtained from the participants. Thirty-five women (mean age, 54 years; age range, 29-66 years) underwent dynamic contrast-enhanced MR imaging with a temporal resolution of 25-40 seconds. The signal intensity difference ratios between the myometrium and endometrial cancer were analyzed to investigate the optimal imaging delay time using single change-point analysis. The optimal imaging delay time for appropriate tumour-myometrium contrast ranged from 31.7 to 268.1 seconds. The median optimal imaging delay time was 91.3 seconds, with an interquartile range of 46.2 to 119.5 seconds. The median signal intensity difference ratios between the myometrium and endometrial cancer were 0.03, with an interquartile range of -0.01 to 0.06, on the pre-contrast MR imaging and 0.20, with an interquartile range of 0.15 to 0.25, on the post-contrast MR imaging. An imaging delay of approximately 90 seconds after initiating contrast material injection may be optimal for obtaining appropriate tumour-myometrium contrast in women with endometrial cancer. (orig.)

  13. Optimal operation strategy of battery energy storage system to real-time electricity price in Denmark

    DEFF Research Database (Denmark)

    Hu, Weihao; Chen, Zhe; Bak-Jensen, Birgitte

    2010-01-01

    Since the hourly spot market price is available one day ahead, the price could be transferred to the consumers and they may have some motivation to install an energy storage system in order to save on their energy costs. This paper presents an optimal operation strategy for a battery energy storage system (BESS). The Danish power system, whose electricity market differs from other markets in some ways, is chosen as the studied power system in this paper. Two kinds of BESS, based on polysulfide-bromine (PSB) and vanadium redox (VRB) battery technologies, are studied in the paper. Simulation results show that the proposed optimal operation strategy is an effective measure to achieve ...

  14. Energy evaluation of optimal control strategies for central VWV chiller systems

    International Nuclear Information System (INIS)

    Jin Xinqiao; Du Zhimin; Xiao Xiaokun

    2007-01-01

    Under various conditions, the actual load of heating, ventilation and air conditioning (HVAC) systems is less than the originally designed load in most operation periods. To save energy and to optimize the controls of chilling systems, the performance of variable water volume (VWV) systems and the characteristics of the control systems are analyzed, and three strategies are presented and tested by simulation in this paper. Energy evaluation of the three strategies shows that they can save energy to some extent, and that further potential remains. To minimize the energy consumption of the chilling system, the setpoints of the supply chilled water temperature and the supply head of the secondary pump should be optimized simultaneously.

  15. Distributed Strategy for Optimal Dispatch of Unbalanced Three-Phase Islanded Microgrids

    DEFF Research Database (Denmark)

    Vergara Barrios, Pedro Pablo; Rey-López, Juan Manuel; Shaker, Hamid Reza

    2018-01-01

    This paper presents a distributed strategy for the optimal dispatch of islanded microgrids, modeled as unbalanced three-phase electrical distribution systems (EDS). To set the dispatch of the distributed generation (DG) units, an optimal generation problem is stated and solved distributively based......-phase microgrid. According to the obtained results, the proposed strategy achieves a lower cost solution when compared with a centralized approach based on a static droop framework, with a considerable reduction on the communication system complexity. Additionally, it corrects the mismatch between generation...

  16. Convexity of Ruin Probability and Optimal Dividend Strategies for a General Lévy Process

    Directory of Open Access Journals (Sweden)

    Chuancun Yin

    2015-01-01

    Full Text Available We consider the optimal dividends problem for a company whose cash reserves follow a general Lévy process with certain positive jumps and arbitrary negative jumps. The objective is to find a policy which maximizes the expected discounted dividends until the time of ruin. Under appropriate conditions, we use some recent results in the theory of potential analysis of subordinators to obtain the convexity properties of probability of ruin. We present conditions under which the optimal dividend strategy, among all admissible ones, takes the form of a barrier strategy.

  17. Convexity of Ruin Probability and Optimal Dividend Strategies for a General Lévy Process

    Science.gov (United States)

    Yuen, Kam Chuen; Shen, Ying

    2015-01-01

    We consider the optimal dividends problem for a company whose cash reserves follow a general Lévy process with certain positive jumps and arbitrary negative jumps. The objective is to find a policy which maximizes the expected discounted dividends until the time of ruin. Under appropriate conditions, we use some recent results in the theory of potential analysis of subordinators to obtain the convexity properties of probability of ruin. We present conditions under which the optimal dividend strategy, among all admissible ones, takes the form of a barrier strategy. PMID:26351655

  18. Optimal Claiming Strategies in Bonus Malus Systems and Implied Markov Chains

    Directory of Open Access Journals (Sweden)

    Arthur Charpentier

    2017-11-01

    Full Text Available In this paper, we investigate the impact of the accident reporting strategy of drivers within a Bonus-Malus system. We exhibit the induced modification of the corresponding class level transition matrix and derive the optimal reporting strategy for rational drivers. The hunger for bonuses induces optimal thresholds under which drivers do not claim their losses. Mathematical properties of the induced level class process are studied. A convergent numerical algorithm is provided for computing such thresholds, and realistic numerical applications are discussed.

  19. Evolution strategies and multi-objective optimization of permanent magnet motor

    DEFF Research Database (Denmark)

    Andersen, Søren Bøgh; Santos, Ilmar

    2012-01-01

    When designing a permanent magnet motor, several geometry and material parameters are to be defined. This is not an easy task, as material properties and magnetic fields are highly non-linear and the design of a motor is therefore often an iterative process. From an engineering point of view, we...... of evolution strategies (ES) to effectively design and optimize parameters of permanent magnet motors. Single as well as multi-objective optimization procedures are carried out. A modified way of creating the strategy parameters for the ES algorithm is also proposed and has together with the standard ES...

  20. Optimization of hybrid imaging systems based on maximization of kurtosis of the restored point spread function

    DEFF Research Database (Denmark)

    Demenikov, Mads

    2011-01-01

    I propose a novel, yet simple, no-reference objective image quality measure based on the kurtosis of the restored point spread function. Using this measure, I optimize several phase masks for extended-depth-of-field in hybrid imaging systems and obtain results that are identical to optimization results based on full-reference image measures of restored images. In comparison with full-reference measures, the kurtosis measure is fast to compute and requires no images, noise distributions, or alignment of restored images, but only the signal-to-noise ratio. © 2011 Optical Society of America.
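
    To illustrate how such a no-reference measure can be computed from the point spread function and the signal-to-noise ratio alone, the sketch below restores a blurred PSF with a Wiener-type filter and reports the kurtosis of the result. The Gaussian PSF and SNR value are arbitrary stand-ins for an actual phase-mask design.

```python
import numpy as np
from scipy.stats import kurtosis

# Arbitrary optical PSF (Gaussian blur) standing in for a phase-mask PSF.
x = np.arange(-32, 32)
X, Y = np.meshgrid(x, x)
psf = np.exp(-(X**2 + Y**2) / (2 * 4.0**2))
psf /= psf.sum()

snr = 100.0                                        # assumed signal-to-noise ratio
H = np.fft.fft2(np.fft.ifftshift(psf))             # optical transfer function

# Wiener-type restoration filter applied to the PSF itself yields the
# "restored PSF" whose peakedness (kurtosis) serves as the quality measure.
W = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)
restored_psf = np.real(np.fft.ifft2(H * W))

score = kurtosis(restored_psf, axis=None, fisher=True)
print("kurtosis of restored PSF:", round(float(score), 2))
```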

  1. Time Reversal Reconstruction Algorithm Based on PSO Optimized SVM Interpolation for Photoacoustic Imaging

    Directory of Open Access Journals (Sweden)

    Mingjian Sun

    2015-01-01

    Full Text Available Photoacoustic imaging is an innovative imaging technique to image biomedical tissues. The time reversal reconstruction algorithm, in which a numerical model of the acoustic forward problem is run backwards in time, is widely used. In this paper, a time reversal reconstruction algorithm based on particle swarm optimization (PSO) optimized support vector machine (SVM) interpolation is proposed for photoacoustic imaging. Numerical results show that the reconstructed images of the proposed algorithm are more accurate than those of the nearest neighbor interpolation, linear interpolation, and cubic convolution interpolation based time reversal algorithms, which means it can provide higher imaging quality while using significantly fewer measurement positions or scanning times.
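
    The interpolation step can be mimicked with an off-the-shelf support vector regressor: the sketch below fits an SVR to pressure samples at sparsely measured detector angles and predicts the signal at unmeasured angles for use by a time-reversal reconstruction. The test signal and hyperparameters are chosen arbitrarily, and the paper's PSO-based hyperparameter tuning is omitted.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(3)

# Simulated detector signal sampled at a sparse set of angular positions.
angles_measured = np.sort(rng.uniform(0, 2 * np.pi, size=40))
signal = np.sin(3 * angles_measured) * np.exp(-0.2 * angles_measured)
signal += rng.normal(scale=0.02, size=signal.shape)

# Hyperparameters fixed by hand here; the paper tunes them with PSO instead.
svr = SVR(kernel="rbf", C=10.0, gamma=2.0, epsilon=0.01)
svr.fit(angles_measured.reshape(-1, 1), signal)

# Predict the signal at a dense set of "virtual" detector positions.
angles_dense = np.linspace(0, 2 * np.pi, 400).reshape(-1, 1)
signal_dense = svr.predict(angles_dense)
print(signal_dense[:5])
```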

  2. Utility of BRDF Models for Estimating Optimal View Angles in Classification of Remotely Sensed Images

    Science.gov (United States)

    Valdez, P. F.; Donohoe, G. W.

    1997-01-01

    Statistical classification of remotely sensed images attempts to discriminate between surface cover types on the basis of the spectral response recorded by a sensor. It is well known that surfaces reflect incident radiation as a function of wavelength producing a spectral signature specific to the material under investigation. Multispectral and hyperspectral sensors sample the spectral response over tens and even hundreds of wavelength bands to capture the variation of spectral response with wavelength. Classification algorithms then exploit these differences in spectral response to distinguish between materials of interest. Sensors of this type, however, collect detailed spectral information from one direction (usually nadir); consequently, do not consider the directional nature of reflectance potentially detectable at different sensor view angles. Improvements in sensor technology have resulted in remote sensing platforms capable of detecting reflected energy across wavelengths (spectral signatures) and from multiple view angles (angular signatures) in the fore and aft directions. Sensors of this type include: the moderate resolution imaging spectroradiometer (MODIS), the multiangle imaging spectroradiometer (MISR), and the airborne solid-state array spectroradiometer (ASAS). A goal of this paper, then, is to explore the utility of Bidirectional Reflectance Distribution Function (BRDF) models in the selection of optimal view angles for the classification of remotely sensed images by employing a strategy of searching for the maximum difference between surface BRDFs. After a brief discussion of directional reflectance in Section 2, attention is directed to the Beard-Maxwell BRDF model and its use in predicting the bidirectional reflectance of a surface. The selection of optimal viewing angles is addressed in Section 3, followed by conclusions and future work in Section 4.

  3. Computing Optimal Mixed Strategies for Terrorist Plot Detection Games with the Consideration of Information Leakage

    Directory of Open Access Journals (Sweden)

    Li MingChu

    2017-01-01

    Full Text Available The terrorist’s coordinated attack is becoming an increasing threat to western countries. By monitoring potential terrorists, security agencies are able to detect and destroy terrorist plots at their planning stage. Therefore, an optimal monitoring strategy for the domestic security agency becomes necessary. However, previous studies on monitoring strategy generation fail to consider information leakage due to hackers and insider threats. Such leakage events may lead to failure to watch potential terrorists and destroy the plot, and cause a huge risk to public security. This paper makes two major contributions. Firstly, we develop a new Stackelberg game model for the security agency to generate an optimal monitoring strategy with the consideration of information leakage. Secondly, we provide a double-oracle framework, DO-TPDIL, for effective calculation. The experimental results show that our approach can obtain robust strategies against information leakage with high feasibility and efficiency.

  4. The Development and Empirical Validation of an E-based Supply Chain Strategy Optimization Model

    DEFF Research Database (Denmark)

    Kotzab, Herbert; Skjoldager, Niels; Vinum, Thorkil

    2003-01-01

    Examines the formulation of supply chain strategies in complex environments. Argues that current state‐of‐the‐art e‐business and supply chain management, combined into the concept of e‐SCM, as well as the use of transaction cost theory, network theory and resource‐based theory, altogether can...... be used to form a model for analyzing supply chains with the purpose of reducing the uncertainty of formulating supply chain strategies. Presents e‐supply chain strategy optimization model (e‐SOM) as a way to analyze supply chains in a structured manner as regards strategic preferences for supply chain...... design, relations and resources in the chains with the ultimate purpose of enabling the formulation of optimal, executable strategies for specific supply chains. Uses research results for a specific supply chain to validate the usefulness of the model....

  5. Reliability–redundancy allocation problem considering optimal redundancy strategy using parallel genetic algorithm

    International Nuclear Information System (INIS)

    Kim, Heungseob; Kim, Pansoo

    2017-01-01

    To maximize the reliability of a system, the traditional reliability–redundancy allocation problem (RRAP) determines the component reliability and level of redundancy for each subsystem. This paper proposes an advanced RRAP that also considers the optimal redundancy strategy, either active or cold standby. In addition, new examples are presented for it. Furthermore, the exact reliability function for a cold standby redundant subsystem with an imperfect detector/switch is suggested, and is expected to replace the previous approximating model that has been used in most related studies. A parallel genetic algorithm for solving the RRAP as a mixed-integer nonlinear programming model is presented, and its performance is compared with those of previous studies by using numerical examples on three benchmark problems. - Highlights: • Optimal strategy is proposed to solve reliability redundancy allocation problem. • The redundancy strategy uses parallel genetic algorithm. • Improved reliability function for a cold standby subsystem is suggested. • Proposed redundancy strategy enhances the system reliability.

  6. The topography of the environment alters the optimal search strategy for active particles

    Science.gov (United States)

    Volpe, Giorgio; Volpe, Giovanni

    2017-10-01

    In environments with scarce resources, adopting the right search strategy can make the difference between succeeding and failing, even between life and death. At different scales, this applies to molecular encounters in the cell cytoplasm, to animals looking for food or mates in natural landscapes, to rescuers during search and rescue operations in disaster zones, and to genetic computer algorithms exploring parameter spaces. When looking for sparse targets in a homogeneous environment, a combination of ballistic and diffusive steps is considered optimal; in particular, more ballistic Lévy flights with exponent α≤1 are generally believed to optimize the search process. However, most search spaces present complex topographies. What is the best search strategy in these more realistic scenarios? Here, we show that the topography of the environment significantly alters the optimal search strategy toward less ballistic and more Brownian strategies. We consider an active particle performing a blind cruise search for nonregenerating sparse targets in a 2D space with steps drawn from a Lévy distribution with the exponent varying from α=1 to α=2 (Brownian). We show that, when boundaries, barriers, and obstacles are present, the optimal search strategy depends on the topography of the environment, with α assuming intermediate values in the whole range under consideration. We interpret these findings using simple scaling arguments and discuss their robustness to varying searcher's size. Our results are relevant for search problems at different length scales from animal and human foraging to microswimmers' taxis to biochemical rates of reaction.

  7. MO-PIS-Exhibit Hall-01: Imaging: CT Dose Optimization Technologies I

    International Nuclear Information System (INIS)

    Denison, K; Smith, S

    2014-01-01

    Partners in Solutions is an exciting new program in which AAPM partners with our vendors to present practical “hands-on” information about the equipment and software systems that we use in our clinics. The imaging topic this year is CT scanner dose optimization capabilities. Note that the sessions are being held in a special purpose room built on the Exhibit Hall Floor, to encourage further interaction with the vendors. Dose Optimization Capabilities of GE Computed Tomography Scanners Presentation Time: 11:15 – 11:45 AM GE Healthcare is dedicated to the delivery of high quality clinical images through the development of technologies, which optimize the application of ionizing radiation. In computed tomography, dose management solutions fall into four categories: employs projection data and statistical modeling to decrease noise in the reconstructed image - creating an opportunity for mA reduction in the acquisition of diagnostic images. Veo represents true Model Based Iterative Reconstruction (MBiR). Using high-level algorithms in tandem with advanced computing power, Veo enables lower pixel noise standard deviation and improved spatial resolution within a single image. Advanced Adaptive Image Filters allow for maintenance of spatial resolution while reducing image noise. Examples of adaptive image space filters include Neuro 3-D filters and Cardiac Noise Reduction Filters. AutomA adjusts mA along the z-axis and is the CT equivalent of auto exposure control in conventional x-ray systems. Dynamic Z-axis Tracking offers an additional opportunity for dose reduction in helical acquisitions while SmartTrack Z-axis Tracking serves to ensure beam, collimator and detector alignment during tube rotation. SmartmA provides angular mA modulation. ECG Helical Modulation reduces mA during the systolic phase of the heart cycle. SmartBeam optimization uses bowtie beam-shaping hardware and software to filter off-axis x-rays - minimizing dose and reducing x-ray scatter. The

  8. MO-PIS-Exhibit Hall-01: Imaging: CT Dose Optimization Technologies I

    Energy Technology Data Exchange (ETDEWEB)

    Denison, K; Smith, S [GE Healthcare, Waukesha, WI (United States)

    2014-06-15

    Partners in Solutions is an exciting new program in which AAPM partners with our vendors to present practical “hands-on” information about the equipment and software systems that we use in our clinics. The imaging topic this year is CT scanner dose optimization capabilities. Note that the sessions are being held in a special purpose room built on the Exhibit Hall Floor, to encourage further interaction with the vendors. Dose Optimization Capabilities of GE Computed Tomography Scanners Presentation Time: 11:15 – 11:45 AM GE Healthcare is dedicated to the delivery of high quality clinical images through the development of technologies, which optimize the application of ionizing radiation. In computed tomography, dose management solutions fall into four categories: employs projection data and statistical modeling to decrease noise in the reconstructed image - creating an opportunity for mA reduction in the acquisition of diagnostic images. Veo represents true Model Based Iterative Reconstruction (MBiR). Using high-level algorithms in tandem with advanced computing power, Veo enables lower pixel noise standard deviation and improved spatial resolution within a single image. Advanced Adaptive Image Filters allow for maintenance of spatial resolution while reducing image noise. Examples of adaptive image space filters include Neuro 3-D filters and Cardiac Noise Reduction Filters. AutomA adjusts mA along the z-axis and is the CT equivalent of auto exposure control in conventional x-ray systems. Dynamic Z-axis Tracking offers an additional opportunity for dose reduction in helical acquisitions while SmartTrack Z-axis Tracking serves to ensure beam, collimator and detector alignment during tube rotation. SmartmA provides angular mA modulation. ECG Helical Modulation reduces mA during the systolic phase of the heart cycle. SmartBeam optimization uses bowtie beam-shaping hardware and software to filter off-axis x-rays - minimizing dose and reducing x-ray scatter. The

  9. Optimal combinations of control strategies and cost-effective analysis for visceral leishmaniasis disease transmission.

    Directory of Open Access Journals (Sweden)

    Santanu Biswas

    Full Text Available Visceral leishmaniasis (VL) is a deadly neglected tropical disease that poses a serious problem in various countries all over the world. Implementation of various intervention strategies fails to control the spread of this disease due to issues of parasite drug resistance and resistance of sandfly vectors to insecticide sprays. Due to this, policy makers need to develop novel strategies or resort to a combination of multiple intervention strategies to control the spread of the disease. To address this issue, we propose an extensive SIR-type model for anthroponotic visceral leishmaniasis transmission with seasonal fluctuations modeled in the form of a periodic sandfly biting rate. Fitting the model to real data reported in South Sudan, we estimate the model parameters and compare the model predictions with known VL cases. Using optimal control theory, we study the effects of popular control strategies, namely drug-based treatment of symptomatic and PKDL-infected individuals, insecticide-treated bednets and spraying of insecticides, on the dynamics of infected human and vector populations. We propose that the strategies remain ineffective in curbing the disease individually, as opposed to the use of optimal combinations of the mentioned strategies. Testing the model for different optimal combinations while considering periodic seasonal fluctuations, we find that the optimal combination of treatment of individuals and insecticide sprays performs well in controlling the disease for the time period of intervention introduced. Performing a cost-effectiveness analysis, we identify that the same strategy also proves to be efficacious and cost-effective. Finally, we suggest that our model would be helpful for policy makers to predict the best intervention strategies for specific time periods and their appropriate implementation for the elimination of visceral leishmaniasis.
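
    The record above describes an SIR-type host-vector model driven by a periodic sandfly biting rate. A minimal sketch of such a seasonally forced model is given below; it is not the authors' full model, and the compartment structure, the sinusoidal biting rate and all parameter values are illustrative assumptions.

        import numpy as np
        from scipy.integrate import solve_ivp

        def vl_model(t, y, beta0=0.25, eps=0.4, period=365.0, gamma=1/120, mu_v=1/14):
            """Toy host-vector SIR model with a periodic (seasonal) sandfly biting rate."""
            Sh, Ih, Rh, Sv, Iv = y
            b = beta0 * (1.0 + eps * np.sin(2 * np.pi * t / period))  # seasonal biting rate
            new_h = b * Sh * Iv                 # human infections from infectious sandflies
            new_v = b * Sv * Ih                 # sandfly infections from infectious humans
            return [-new_h,                     # susceptible humans
                    new_h - gamma * Ih,         # infectious humans
                    gamma * Ih,                 # recovered humans
                    mu_v * (1.0 - Sv) - new_v,  # susceptible sandflies (birth/death balance)
                    new_v - mu_v * Iv]          # infectious sandflies

        # Two years of simulation from an almost fully susceptible population.
        sol = solve_ivp(vl_model, (0, 730), [0.99, 0.01, 0.0, 0.95, 0.05], max_step=1.0)
        print("human prevalence after two years:", float(sol.y[1, -1]))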

  10. Computing Optimal Mixed Strategies for Terrorist Plot Detection Games with the Consideration of Information Leakage

    OpenAIRE

    Li MingChu; Yang Zekun; Lu Kun; Guo Cheng

    2017-01-01

    The terrorist’s coordinated attack is becoming an increasing threat to western countries. By monitoring potential terrorists, security agencies are able to detect and destroy terrorist plots at their planning stage. Therefore, an optimal monitoring strategy for the domestic security agency becomes necessary. However, previous study about monitoring strategy generation fails to consider the information leakage, due to hackers and insider threat. Such leakage events may lead to failure of watch...

  11. Convex optimization problem prototyping for image reconstruction in computed tomography with the Chambolle–Pock algorithm

    DEFF Research Database (Denmark)

    Sidky, Emil Y.; Jørgensen, Jakob Heide; Pan, Xiaochuan

    2012-01-01

    The primal–dual optimization algorithm developed in Chambolle and Pock (CP) (2011 J. Math. Imag. Vis. 40 1–26) is applied to various convex optimization problems of interest in computed tomography (CT) image reconstruction. This algorithm allows for rapid prototyping of optimization problems...... for the purpose of designing iterative image reconstruction algorithms for CT. The primal–dual algorithm is briefly summarized in this paper, and its potential for prototyping is demonstrated by explicitly deriving CP algorithm instances for many optimization problems relevant to CT. An example application...
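
    The Chambolle-Pock (CP) primal-dual iteration referenced here has a compact generic form. The sketch below applies it to a tiny least-squares problem, min_x 0.5*||Kx - b||^2, purely to show the update structure (dual proximal step, primal proximal step, over-relaxation, and the step-size condition sigma*tau*||K||^2 <= 1); the problem instance and step sizes are illustrative assumptions, not one of the CT reconstruction problems derived in the paper.

        import numpy as np

        rng = np.random.default_rng(0)
        K = rng.normal(size=(30, 10))   # stand-in forward operator (e.g. a tiny system matrix)
        b = rng.normal(size=30)         # measured data

        L = np.linalg.norm(K, 2)        # operator norm of K
        sigma = tau = 0.95 / L          # then sigma * tau * L**2 < 1, as required for convergence
        theta = 1.0

        x = np.zeros(10); x_bar = x.copy(); y = np.zeros(30)
        for _ in range(5000):
            # dual step: proximal operator of sigma*F* with F(z) = 0.5*||z - b||^2
            y = (y + sigma * (K @ x_bar) - sigma * b) / (1.0 + sigma)
            # primal step: proximal operator of tau*G with G = 0 is the identity
            x_new = x - tau * (K.T @ y)
            # over-relaxation of the primal variable
            x_bar = x_new + theta * (x_new - x)
            x = x_new

        x_ls = np.linalg.lstsq(K, b, rcond=None)[0]
        print("distance to the least-squares solution:", np.linalg.norm(x - x_ls))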

  12. Strategies to minimize sedation in pediatric body magnetic resonance imaging

    International Nuclear Information System (INIS)

    Jaimes, Camilo; Gee, Michael S.

    2016-01-01

    The high soft-tissue contrast of MRI and the absence of ionizing radiation make it a valuable tool for assessment of body pathology in children. Infants and young children are often unable to cooperate with awake MRI so sedation or general anesthesia might be required. However, given recent data on the costs and potential risks of anesthesia in young children, there is a need to try to decrease or avoid sedation in this population when possible. Child life specialists in radiology frequently use behavioral techniques and audiovisual support devices, and they practice with children and families using mock scanners to improve child compliance with MRI. Optimization of the MR scanner environment is also important to create a child-friendly space. If the child can remain inside the MRI scanner, a variety of emerging techniques can reduce the effect of involuntary motion. Using sequences with short acquisition times such as single-shot fast spin echo and volumetric gradient echo can decrease artifacts and improve image quality. Breath-holding, respiratory triggering and signal averaging all reduce respiratory motion. Emerging techniques such as radial and multislice k-space acquisition, navigator motion correction, as well as parallel imaging and compressed sensing reconstruction methods can further accelerate acquisition and decrease motion. Collaboration among radiologists, anesthesiologists, technologists, child life specialists and families is crucial for successful performance of MRI in young children. (orig.)

  13. Conditions for characterizing the structure of optimal strategies in infinite-horizon dynamic programs

    International Nuclear Information System (INIS)

    Porteus, E.

    1982-01-01

    The study of infinite-horizon nonstationary dynamic programs using the operator approach is continued. The point of view here differs slightly from that taken by others, in that Denardo's local income function is not used as a starting point. Infinite-horizon values are defined as limits of finite-horizon values, as the horizons get long. Two important conditions of an earlier paper are weakened, yet the optimality equations, the optimality criterion, and the existence of optimal "structured" strategies are still obtained.

  14. Risk-Averse Suppliers’ Optimal Pricing Strategies in a Two-Stage Supply Chain

    Directory of Open Access Journals (Sweden)

    Rui Shen

    2013-01-01

    Full Text Available Risk-averse suppliers’ optimal pricing strategies in two-stage supply chains under a competitive environment are discussed. The suppliers in this paper focus more on losses than on profits, and they care about their long-term relationship with their customers. We introduce for the suppliers a loss function, which covers both current loss and future loss. The optimal wholesale price is solved under the situations of risk neutrality, risk aversion, and a combination of minimizing loss and controlling risk, respectively. Besides, some properties of and relations among these optimal wholesale prices are given as well. A numerical example is given to illustrate the performance of the proposed method.

  15. Optimization of cooling strategy and seeding by FBRM analysis of batch crystallization

    Science.gov (United States)

    Zhang, Dejiang; Liu, Lande; Xu, Shijie; Du, Shichao; Dong, Weibing; Gong, Junbo

    2018-03-01

    A method is presented for optimizing the cooling strategy and seed loading simultaneously. Focused beam reflectance measurement (FBRM) was used to determine the approximate optimal cooling profile. Using these results in conjunction with a constant growth rate assumption, a modified Mullin-Nyvlt trajectory could be calculated. This trajectory can suppress secondary nucleation and has the potential to control the product's polymorph distribution. Compared with linear and two-step cooling, the modified Mullin-Nyvlt trajectory yields a larger size distribution and a better morphology. Based on these results, an optimized seed loading policy was also developed. This policy could be useful for guiding the batch crystallization process.
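
    For context, under the constant-growth-rate assumption mentioned above, the classical Mullin-Nyvlt "programmed cooling" analysis for a seeded batch leads to a cubic temperature profile, T(t) = T0 - (T0 - Tf)*(t/tau)^3. The helper below merely evaluates that textbook profile with assumed temperatures and batch time; it is not the modified trajectory derived from the FBRM data in the paper.

        import numpy as np

        def programmed_cooling(t, T0=45.0, Tf=20.0, batch_time=7200.0):
            """Cubic programmed-cooling profile (seeded batch, constant growth rate assumed).

            T0 and Tf are the start and end temperatures in deg C, batch_time is the batch
            duration in seconds; all values here are illustrative.
            """
            t = np.asarray(t, dtype=float)
            return T0 - (T0 - Tf) * (t / batch_time) ** 3

        times = np.linspace(0.0, 7200.0, 7)
        for t, T in zip(times, programmed_cooling(times)):
            print(f"t = {t:6.0f} s  ->  T = {T:5.2f} degC")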

  16. Optimized bolt tightening strategies for gasketed flanged pipe joints of different sizes

    International Nuclear Information System (INIS)

    Abid, Muhammad; Khan, Ayesha; Nash, David Hugh; Hussain, Masroor; Wajid, Hafiz Abdul

    2016-01-01

    Achieving a proper preload in the bolts of a gasketed bolted flanged pipe joint during joint assembly is considered important for its optimized performance. This paper presents results of a detailed non-linear finite element analysis of an optimized bolt tightening strategy for different joint sizes for achieving a preload close to the target stress values. Industrial guidelines are considered for applying recommended target stress values with TCM (torque control method) and SCM (stretch control method) using a customized optimization algorithm. The performance of the different joint components is observed and discussed in detail.

  17. Strategies to optimize MEDLINE and EMBASE search strategies for anesthesiology systematic reviews. An experimental study.

    Science.gov (United States)

    Volpato, Enilze de Souza Nogueira; Betini, Marluci; Puga, Maria Eduarda; Agarwal, Arnav; Cataneo, Antônio José Maria; Oliveira, Luciane Dias de; Bazan, Rodrigo; Braz, Leandro Gobbo; Pereira, José Eduardo Guimarães; El Dib, Regina

    2018-01-15

    A high-quality electronic search is essential for ensuring accuracy and comprehensiveness among the records retrieved when conducting systematic reviews. Therefore, we aimed to identify the most efficient method for searching in both MEDLINE (through PubMed) and EMBASE, covering search terms with variant spellings, direct and indirect orders, and associations with MeSH and EMTREE terms (or lack thereof). Experimental study. UNESP, Brazil. We selected and analyzed 37 search strategies that had specifically been developed for the field of anesthesiology. These search strategies were adapted in order to cover all potentially relevant search terms, with regard to variant spellings and direct and indirect orders, in the most efficient manner. When the strategies included variant spellings and direct and indirect orders, these adapted versions of the search strategies selected retrieved the same number of search results in MEDLINE (mean of 61.3%) and a higher number in EMBASE (mean of 63.9%) in the sample analyzed. The numbers of results retrieved through the searches analyzed here were not identical with and without associated use of MeSH and EMTREE terms. However, association of these terms from both controlled vocabularies retrieved a larger number of records than did the use of either one of them. In view of these results, we recommend that the search terms used should include both preferred and non-preferred terms (i.e. variant spellings and direct/indirect order of the same term) and associated MeSH and EMTREE terms, in order to develop highly-sensitive search strategies for systematic reviews.

  18. Natural Image Enhancement Using a Biogeography Based Optimization Enhanced with Blended Migration Operator

    Directory of Open Access Journals (Sweden)

    J. Jasper

    2014-01-01

    Full Text Available This paper addresses a novel and efficient algorithm for solving an optimization problem in image processing applications. Image enhancement (IE) is one of the complex optimization problems in image processing. The main goal of this paper is to enhance color images such that the enhanced image is more suitable than the original from the perceptual viewpoint of a human observer. Traditional methods require prior knowledge of the image to be enhanced, whereas the aim of the proposed biogeography based optimization (BBO) enhanced with blended migration operator (BMO) algorithm is to maximize the objective function in order to enhance the image contrast by maximizing parameters such as edge intensity, edge information, and entropy. Experimental results are compared with the current state-of-the-art approaches and indicate the superiority of the proposed technique in terms of subjective and objective evaluation.
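
    As a rough illustration of the blended migration idea named above, the sketch below runs a tiny biogeography-based optimization loop in which a migrated decision variable becomes a blend of the immigrating and emigrating habitats, x_i <- alpha*x_i + (1 - alpha)*x_j. The test objective, population size, rates and blending coefficient are illustrative assumptions; the paper applies the operator to image-contrast objectives such as edge intensity and entropy instead.

        import numpy as np

        rng = np.random.default_rng(1)

        def objective(x):
            # stand-in objective to maximize; the paper uses image contrast/entropy terms
            return -np.sum((x - 0.3) ** 2)

        pop_size, dim, alpha, mut_prob = 20, 5, 0.5, 0.05
        pop = rng.random((pop_size, dim))

        for generation in range(200):
            fitness = np.array([objective(h) for h in pop])
            pop = pop[np.argsort(-fitness)]          # sort habitats, best first
            ranks = np.arange(pop_size)
            lam = ranks / (pop_size - 1)             # immigration rate: worse habitats immigrate more
            mu = 1.0 - lam                           # emigration rate: better habitats emigrate more
            new_pop = pop.copy()
            for i in range(pop_size):
                for d in range(dim):
                    if rng.random() < lam[i]:
                        j = rng.choice(pop_size, p=mu / mu.sum())   # pick an emigrating habitat
                        # blended migration operator
                        new_pop[i, d] = alpha * pop[i, d] + (1 - alpha) * pop[j, d]
                    if rng.random() < mut_prob:
                        new_pop[i, d] = rng.random()                # simple mutation
            new_pop[0] = pop[0]                      # elitism: keep the best habitat
            pop = new_pop

        best = max(pop, key=objective)
        print("best habitat:", np.round(best, 3), "fitness:", float(objective(best)))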

  19. Optimized protocols for cardiac magnetic resonance imaging in patients with thoracic metallic implants

    Energy Technology Data Exchange (ETDEWEB)

    Olivieri, Laura J.; Ratnayaka, Kanishka [Children' s National Health System, Division of Cardiology, Washington, DC (United States); National Institutes of Health, National Heart, Lung and Blood Institute, Bethesda, MD (United States); Cross, Russell R.; O' Brien, Kendall E. [Children' s National Health System, Division of Cardiology, Washington, DC (United States); Hansen, Michael S. [National Institutes of Health, National Heart, Lung and Blood Institute, Bethesda, MD (United States)

    2015-09-15

    Cardiac magnetic resonance (MR) imaging is a valuable tool in congenital heart disease; however patients frequently have metal devices in the chest from the treatment of their disease that complicate imaging. Methods are needed to improve imaging around metal implants near the heart. Basic sequence parameter manipulations have the potential to minimize artifact while limiting effects on image resolution and quality. Our objective was to design cine and static cardiac imaging sequences to minimize metal artifact while maintaining image quality. Using systematic variation of standard imaging parameters on a fluid-filled phantom containing commonly used metal cardiac devices, we developed optimized sequences for steady-state free precession (SSFP), gradient recalled echo (GRE) cine imaging, and turbo spin-echo (TSE) black-blood imaging. We imaged 17 consecutive patients undergoing routine cardiac MR with 25 metal implants of various origins using both standard and optimized imaging protocols for a given slice position. We rated images for quality and metal artifact size by measuring metal artifact in two orthogonal planes within the image. All metal artifacts were reduced with optimized imaging. The average metal artifact reduction for the optimized SSFP cine was 1.5+/-1.8 mm, and for the optimized GRE cine the reduction was 4.6+/-4.5 mm (P < 0.05). Quality ratings favored the optimized GRE cine. Similarly, the average metal artifact reduction for the optimized TSE images was 1.6+/-1.7 mm (P < 0.05), and quality ratings favored the optimized TSE imaging. Imaging sequences tailored to minimize metal artifact are easily created by modifying basic sequence parameters, and images are superior to standard imaging sequences in both quality and artifact size. Specifically, for optimized cine imaging a GRE sequence should be used with settings that favor short echo time, i.e. flow compensation off, weak asymmetrical echo and a relatively high receiver bandwidth. For static

  20. Social Optimization and Pricing Policy in Cognitive Radio Networks with an Energy Saving Strategy

    Directory of Open Access Journals (Sweden)

    Shunfu Jin

    2016-01-01

    Full Text Available The rapid growth of wireless applications results in an increase in demand for spectrum resources and communication energy. In this paper, we first introduce a novel energy saving strategy in cognitive radio networks (CRNs) and then propose an appropriate pricing policy for secondary user (SU) packets. We analyze the behavior of data packets in a discrete-time single-server priority queue under a multiple-vacation discipline. With the help of a Quasi-Birth-Death (QBD) process model, we obtain the joint distribution for the number of SU packets and the state of the base station (BS) via the Matrix-Geometric Solution method. We assess the average latency of SU packets and the energy saving ratio of the system. According to a natural reward-cost structure, we study the individually optimal behavior and the socially optimal behavior of the energy saving strategy and use an optimization algorithm based on the standard particle swarm optimization (SPSO) method to search for the socially optimal arrival rate of SU packets. By comparing the individually optimal behavior and the socially optimal behavior, we impose an appropriate admission fee on SU packets. Finally, we present numerical results to show the impacts of system parameters on the system performance and the pricing policy.

  1. Optimal robust control strategy of a solid oxide fuel cell system

    Science.gov (United States)

    Wu, Xiaojuan; Gao, Danhui

    2018-01-01

    Optimal control can ensure safe system operation with high efficiency. However, only a few papers discuss optimal control strategies for solid oxide fuel cell (SOFC) systems. Moreover, the existing methods ignore the impact of parameter uncertainty on instantaneous system performance. In real SOFC systems, several parameters may vary with the operating conditions and cannot be identified exactly, such as the load current. Therefore, a robust optimal control strategy is proposed, which involves three parts: a SOFC model with parameter uncertainty, a robust optimizer and robust controllers. During the model building process, boundaries of the uncertain parameter are extracted based on a Monte Carlo algorithm. To achieve the maximum efficiency, a two-space particle swarm optimization approach is employed to obtain optimal operating points, which are used as the set points of the controllers. To ensure safe SOFC operation, two feed-forward controllers and a higher-order robust sliding mode controller are then presented to control the fuel utilization ratio, air excess ratio and stack temperature. The results show the proposed optimal robust control method can maintain safe operation of the SOFC system with maximum efficiency under load and uncertainty variations.

  2. Two-objective on-line optimization of supervisory control strategy

    Energy Technology Data Exchange (ETDEWEB)

    Nassif, N.; Kajl, S.; Sabourin, R. [Ecole de Technologie Superieure, Montreal (Canada)

    2004-09-01

    The set points of supervisory control strategy are optimized with respect to energy use and thermal comfort for existing HVAC systems. The set point values of zone temperatures, supply duct static pressure, and supply air temperature are the problem variables, while energy use and thermal comfort are the objective functions. The HVAC system model includes all the individual component models developed and validated against the monitored data of an existing VAV system. It serves to calculate energy use during the optimization process, whereas the actual energy use is determined by using monitoring data and the appropriate validated component models. A comparison, done for one summer week, of actual and optimal energy use shows that the on-line implementation of a genetic algorithm optimization program to determine the optimal set points of supervisory control strategy could save energy by 19.5%, while satisfying the minimum zone airflow rates and the thermal comfort. The results also indicate that the application of the two-objective optimization problem can help control daily energy use or daily building thermal comfort, thus saving more energy than the application of the one-objective optimization problem. (Author)

  3. Sequentially optimized reconstruction strategy: A meta-strategy for perimetry testing.

    Directory of Open Access Journals (Sweden)

    Şerife Seda Kucur

    Full Text Available Perimetry testing is an automated method to measure visual function and is heavily used for diagnosing ophthalmic and neurological conditions. Its working principle is to sequentially query a subject about perceived light using different brightness levels at different visual field locations. At a given location, this query-patient-feedback process is expected to converge at a perceived sensitivity, such that a shown stimulus intensity is observed and reported 50% of the time. Given this inherently time-intensive and noisy process, fast testing strategies are necessary in order to measure existing regions more effectively and reliably. In this work, we present a novel meta-strategy which relies on the correlative nature of visual field locations in order to strongly reduce the necessary number of locations that need to be examined. To do this, we sequentially determine locations that most effectively reduce visual field estimation errors in an initial training phase. We then exploit these locations at examination time and show that our approach can easily be combined with existing perceived sensitivity estimation schemes to speed up the examinations. Compared to state-of-the-art strategies, our approach shows marked performance gains with a better accuracy-speed trade-off regime for both mixed and sub-populations.

  4. A strategy for multimodal deformable image registration to integrate PET/MR into radiotherapy treatment planning

    International Nuclear Information System (INIS)

    Leibfarth, Sara; Moennich, David; Thorwarth, Daniela; Welz, Stefan; Siegel, Christine; Zips, Daniel; Schwenzer, Nina; Schmidt, Holger

    2013-01-01

    Background: Combined positron emission tomography (PET)/magnetic resonance imaging (MRI) is highly promising for biologically individualized radiotherapy (RT). Hence, the purpose of this work was to develop an accurate and robust registration strategy to integrate combined PET/MR data into RT treatment planning. Material and methods: Eight patient datasets consisting of an FDG PET/computed tomography (CT) and a subsequently acquired PET/MR of the head and neck (HN) region were available. Registration strategies were developed based on CT and MR data only, whereas the PET components were fused with the resulting deformation field. Following a rigid registration, deformable registration was performed with a transform parametrized by B-splines. Three different optimization metrics were investigated: global mutual information (GMI), GMI combined with a bending energy penalty (BEP) for regularization (GMI + BEP) and localized mutual information with BEP (LMI + BEP). Different quantitative registration quality measures were developed, including volumetric overlap and mean distance measures for structures segmented on CT and MR as well as anatomical landmark distances. Moreover, the local registration quality in the tumor region was assessed by the normalized cross correlation (NCC) of the two PET datasets. Results: LMI + BEP yielded the most robust and accurate registration results. For GMI, GMI + BEP and LMI + BEP, mean landmark distances (standard deviations) were 23.9 mm (15.5 mm), 4.8 mm (4.0 mm) and 3.0 mm (1.0 mm), and mean NCC values (standard deviations) were 0.29 (0.29), 0.84 (0.14) and 0.88 (0.06), respectively. Conclusion: Accurate and robust multimodal deformable image registration of CT and MR in the HN region can be performed using a B-spline parametrized transform and LMI + BEP as optimization metric. With this strategy, biologically individualized RT based on combined PET/MRI in terms of dose painting is possible
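
    A deformable CT-to-MR registration of the kind described above can be prototyped with off-the-shelf tools. The sketch below uses SimpleITK with a B-spline transform and (global) Mattes mutual information; the file names are placeholders, the rigid pre-alignment step performed in the study is omitted, and the metric is plain mutual information rather than the localized mutual information with a bending-energy penalty (LMI + BEP) that the study found most robust.

        import SimpleITK as sitk

        # Placeholder file names; the CT is treated as the fixed image, the MR as moving.
        fixed = sitk.ReadImage("planning_ct.nii.gz", sitk.sitkFloat32)
        moving = sitk.ReadImage("petmr_t1.nii.gz", sitk.sitkFloat32)

        # B-spline transform with a coarse control-point grid over the fixed image.
        tx = sitk.BSplineTransformInitializer(fixed, transformDomainMeshSize=[8, 8, 8])

        reg = sitk.ImageRegistrationMethod()
        reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
        reg.SetMetricSamplingStrategy(reg.RANDOM)
        reg.SetMetricSamplingPercentage(0.05)
        reg.SetInterpolator(sitk.sitkLinear)
        reg.SetOptimizerAsLBFGSB(gradientConvergenceTolerance=1e-5, numberOfIterations=100)
        reg.SetInitialTransform(tx, inPlace=True)

        out_tx = reg.Execute(fixed, moving)
        resampled = sitk.Resample(moving, fixed, out_tx, sitk.sitkLinear, 0.0, moving.GetPixelID())
        sitk.WriteImage(resampled, "mr_deformed_to_ct.nii.gz")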

  5. Image reconstruction for a Positron Emission Tomograph optimized for breast cancer imaging

    International Nuclear Information System (INIS)

    Virador, Patrick R.G.

    2000-01-01

    The author performs image reconstruction for a novel Positron Emission Tomography camera that is optimized for breast cancer imaging. This work addresses for the first time, the problem of fully-3D, tomographic reconstruction using a septa-less, stationary, (i.e. no rotation or linear motion), and rectangular camera whose Field of View (FOV) encompasses the entire volume enclosed by detector modules capable of measuring Depth of Interaction (DOI) information. The camera is rectangular in shape in order to accommodate breasts of varying sizes while allowing for soft compression of the breast during the scan. This non-standard geometry of the camera exacerbates two problems: (a) radial elongation due to crystal penetration and (b) reconstructing images from irregularly sampled data. Packing considerations also give rise to regions in projection space that are not sampled which lead to missing information. The author presents new Fourier Methods based image reconstruction algorithms that incorporate DOI information and accommodate the irregular sampling of the camera in a consistent manner by defining lines of responses (LORs) between the measured interaction points instead of rebinning the events into predefined crystal face LORs which is the only other method to handle DOI information proposed thus far. The new procedures maximize the use of the increased sampling provided by the DOI while minimizing interpolation in the data. The new algorithms use fixed-width evenly spaced radial bins in order to take advantage of the speed of the Fast Fourier Transform (FFT), which necessitates the use of irregular angular sampling in order to minimize the number of unnormalizable Zero-Efficiency Bins (ZEBs). In order to address the persisting ZEBs and the issue of missing information originating from packing considerations, the algorithms (a) perform nearest neighbor smoothing in 2D in the radial bins (b) employ a semi-iterative procedure in order to estimate the unsampled data

  6. Image reconstruction for a Positron Emission Tomograph optimized for breast cancer imaging

    Energy Technology Data Exchange (ETDEWEB)

    Virador, Patrick R.G. [Univ. of California, Berkeley, CA (United States)

    2000-04-01

    The author performs image reconstruction for a novel Positron Emission Tomography camera that is optimized for breast cancer imaging. This work addresses for the first time, the problem of fully-3D, tomographic reconstruction using a septa-less, stationary, (i.e. no rotation or linear motion), and rectangular camera whose Field of View (FOV) encompasses the entire volume enclosed by detector modules capable of measuring Depth of Interaction (DOI) information. The camera is rectangular in shape in order to accommodate breasts of varying sizes while allowing for soft compression of the breast during the scan. This non-standard geometry of the camera exacerbates two problems: (a) radial elongation due to crystal penetration and (b) reconstructing images from irregularly sampled data. Packing considerations also give rise to regions in projection space that are not sampled which lead to missing information. The author presents new Fourier Methods based image reconstruction algorithms that incorporate DOI information and accommodate the irregular sampling of the camera in a consistent manner by defining lines of responses (LORs) between the measured interaction points instead of rebinning the events into predefined crystal face LORs which is the only other method to handle DOI information proposed thus far. The new procedures maximize the use of the increased sampling provided by the DOI while minimizing interpolation in the data. The new algorithms use fixed-width evenly spaced radial bins in order to take advantage of the speed of the Fast Fourier Transform (FFT), which necessitates the use of irregular angular sampling in order to minimize the number of unnormalizable Zero-Efficiency Bins (ZEBs). In order to address the persisting ZEBs and the issue of missing information originating from packing considerations, the algorithms (a) perform nearest neighbor smoothing in 2D in the radial bins (b) employ a semi-iterative procedure in order to estimate the unsampled data

  7. Analysis for Influence of Market Information on Firms' Optimal Strategies in Multidimensional Bertrand Game

    Institute of Scientific and Technical Information of China (English)

    DeqingTan; GuangzhongLiu

    2004-01-01

    The Bertrand model of a two-firm static multidimensional game with incomplete information, for two kinds of product with a certain degree of substitution, is discussed in this paper, which analyzes how the firms' forecasts of total market demand influence their optimal strategies given the market information they hold. The conclusion is that the more market information a firm masters, the greater the influence that differences between forecasted and expected market demands have upon the equilibrium strategies; conversely, the less information it masters, the smaller this influence.

  8. Optimism, pain coping strategies and pain intensity among women with rheumatoid arthritis

    Directory of Open Access Journals (Sweden)

    Zuzanna Kwissa-Gajewska

    2014-07-01

    Full Text Available Objectives: According to the biopsychosocial model of pain, it is a multidimensional phenomenon, which comprises physiological (sensation-related) factors, psychological (affective) and social (socio-economic status, social support) factors. Researchers have mainly focused on phenomena increasing the pain sensation; very few studies have examined psychological factors preventing pain. The aim of the research is to assess chronic pain intensity as determined by level of optimism, and to identify pain coping strategies in women with rheumatoid arthritis (RA). Material and methods: A survey was carried out among 54 women during a 7-day period of hospitalisation. The following questionnaires were used: LOT-R (optimism; Scheier, Carver and Bridges), the Coping Strategies Questionnaire (CSQ; Rosenstiel and Keefe) and the 10-point visual-analogue pain scale (VAS). Results: The research findings indicate the significance of optimism in the experience of chronic pain, and in the pain coping strategies. Optimists felt a significantly lower level of pain than pessimists. Patients with positive outcome expectancies (optimists) experienced less pain thanks to replacing catastrophizing (negative concentration on pain) with an increased activity level. Regardless of personality traits, active coping strategies (e.g. ignoring pain sensations, coping self-statements – appraising pain as a challenge, a belief in one's ability to manage pain) resulted in a decrease in pain, whilst catastrophizing contributed to its intensification. The most common coping strategies included praying and hoping. Employment was an important demographic variable: the unemployed experienced less pain than those who worked. Conclusions: The research results indicate that optimism and pain coping strategies should be taken into account in clinical practice. Particular attention should be given to those who have negative outcome expectations, which in turn determine strong chronic pain.

  9. Reducing image interpretation errors – Do communication strategies undermine this?

    International Nuclear Information System (INIS)

    Snaith, B.; Hardy, M.; Lewis, E.F.

    2014-01-01

    Introduction: Errors in the interpretation of diagnostic images in the emergency department are a persistent problem internationally. To address this issue, a number of risk reduction strategies have been suggested but only radiographer abnormality detection schemes (RADS) have been widely implemented in the UK. This study considers the variation in RADS operation and communication in light of technological advances and changes in service operation. Methods: A postal survey of all NHS hospitals operating either an Emergency Department or Minor Injury Unit and a diagnostic imaging (radiology) department (n = 510) was undertaken between July and August 2011. The questionnaire was designed to elicit information on emergency service provision and details of RADS. Results: 325 questionnaires were returned (n = 325/510; 63.7%). The majority of sites (n = 288/325; 88.6%) operated a RADS with the majority (n = 227/288; 78.8%) employing a visual ‘flagging’ system as the only method of communication although symbols used were inconsistent and contradictory across sites. 61 sites communicated radiographer findings through a written proforma (paper or electronic) but this was run in conjunction with a flagging system at 50 sites. The majority of sites did not have guidance on the scope or operation of the ‘flagging’ or written communication system in use. Conclusions: RADS is an established clinical intervention to reduce errors in diagnostic image interpretation within the emergency setting. The lack of standardisation in communication processes and practices alongside the rapid adoption of technology has increased the potential for error and miscommunication

  10. A new reconstruction strategy for image improvement in pinhole SPECT

    International Nuclear Information System (INIS)

    Zeniya, Tsutomu; Watabe, Hiroshi; Kim, Kyeong Min; Teramoto, Noboru; Hayashi, Takuya; Iida, Hidehiro; Aoi, Toshiyuki; Sohlberg, Antti; Kudo, Hiroyuki

    2004-01-01

    Pinhole single-photon emission computed tomography (SPECT) is able to provide information on the biodistribution of several radioligands in small laboratory animals, but has limitations associated with non-uniform spatial resolution or axial blurring. We have hypothesised that this blurring is due to incompleteness of the projection data acquired by a single circular pinhole orbit, and have evaluated a new strategy for accurate image reconstruction with better spatial resolution uniformity. A pinhole SPECT system using two circular orbits and a dedicated three-dimensional ordered subsets expectation maximisation (3D-OSEM) reconstruction method were developed. In this system, not the camera but the object rotates, and the two orbits are at 90° and 45° relative to the object's axis. This system satisfies Tuy's condition, and is thus able to provide complete data for 3D pinhole SPECT reconstruction within the whole field of view (FOV). To evaluate this system, a series of experiments was carried out using a multiple-disk phantom filled with 99mTc solution. The feasibility of the proposed method for small animal imaging was tested with a mouse bone study using 99mTc-hydroxymethylene diphosphonate. Feldkamp's filtered back-projection (FBP) method and the 3D-OSEM method were applied to these data sets, and the visual and statistical properties were examined. Axial blurring, which was still visible at the edge of the FOV even after applying the conventional 3D-OSEM instead of FBP for single-orbit data, was not visible after application of 3D-OSEM using two-orbit data. 3D-OSEM using two-orbit data dramatically reduced the resolution non-uniformity and statistical noise, and also demonstrated considerably better image quality in the mouse scan. This system may be of use in quantitative assessment of bio-physiological functions in small animals. (orig.)
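
    To make the reconstruction step concrete, the sketch below runs a bare-bones ordered-subsets EM (OSEM) update, x <- x * A^T(y / Ax) / A^T 1, on a tiny random system matrix. It is a generic OSEM illustration under assumed dimensions and noise, not the authors' dedicated 3D pinhole geometry or two-orbit system model.

        import numpy as np

        rng = np.random.default_rng(2)
        n_bins, n_voxels, n_subsets = 120, 40, 4

        A = rng.random((n_bins, n_voxels))           # toy non-negative system matrix
        x_true = rng.random(n_voxels)
        y = rng.poisson(A @ x_true * 50) / 50.0      # noisy projection data

        x = np.ones(n_voxels)                        # positive initial estimate
        subsets = np.array_split(np.arange(n_bins), n_subsets)

        for iteration in range(20):
            for idx in subsets:
                A_s, y_s = A[idx], y[idx]
                ratio = y_s / np.maximum(A_s @ x, 1e-12)    # measured / estimated projections
                x *= (A_s.T @ ratio) / np.maximum(A_s.T @ np.ones(len(idx)), 1e-12)

        print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))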

  11. Reliability optimization of series–parallel systems with mixed redundancy strategy in subsystems

    International Nuclear Information System (INIS)

    Abouei Ardakan, Mostafa; Zeinal Hamadani, Ali

    2014-01-01

    Traditionally in redundancy allocation problem (RAP), it is assumed that the redundant components are used based on a predefined active or standby strategies. Recently, some studies consider the situation that both active and standby strategies can be used in a specific system. However, these researches assume that the redundancy strategy for each subsystem can be either active or standby and determine the best strategy for these subsystems by using a proper mathematical model. As an extension to this assumption, a novel strategy, that is a combination of traditional active and standby strategies, is introduced. The new strategy is called mixed strategy which uses both active and cold-standby strategies in one subsystem simultaneously. Therefore, the problem is to determine the component type, redundancy level, number of active and cold-standby units for each subsystem in order to maximize the system reliability. To have a more practical model, the problem is formulated with imperfect switching of cold-standby redundant components and k-Erlang time-to-failure (TTF) distribution. As the optimization of RAP belongs to NP-hard class of problems, a genetic algorithm (GA) is developed. The new strategy and proposed GA are implemented on a well-known test problem in the literature which leads to interesting results. - Highlights: • In this paper the redundancy allocation problem (RAP) for a series–parallel system is considered. • Traditionally there are two main strategies for redundant component namely active and standby. • In this paper a new redundancy strategy which is called “Mixed” redundancy strategy is introduced. • Computational experiments demonstrate that implementing the new strategy lead to interesting results

  12. Stochastic Optimized Relevance Feedback Particle Swarm Optimization for Content Based Image Retrieval

    Directory of Open Access Journals (Sweden)

    Muhammad Imran

    2014-01-01

    Full Text Available One of the major challenges for CBIR is to bridge the gap between low-level features and high-level semantics according to the needs of the user. To overcome this gap, relevance feedback (RF) coupled with a support vector machine (SVM) has been applied successfully. However, when the feedback sample is small, the performance of SVM-based RF is often poor. To improve the performance of RF, this paper proposes a new technique, namely PSO-SVM-RF, which combines SVM-based RF with particle swarm optimization (PSO). The aims of this proposed technique are to enhance the performance of SVM-based RF and also to minimize user interaction with the system by minimizing the number of RF iterations. PSO-SVM-RF was tested on the Corel photo gallery containing 10908 images. The results obtained from the experiments showed that the proposed PSO-SVM-RF achieved 100% accuracy in 8 feedback iterations for the top 10 retrievals and 80% accuracy in 6 iterations for the top 100 retrievals. This implies that with the PSO-SVM-RF technique a high accuracy rate is achieved in a small number of iterations.
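
    To illustrate the PSO-plus-SVM combination named above, the sketch below uses a plain PSO loop to tune the (C, gamma) hyperparameters of a scikit-learn SVM on a toy dataset. It mirrors only the general idea; the paper's scheme optimizes the SVM inside a relevance-feedback loop on image features, and the dataset, swarm size and search ranges here are assumptions.

        import numpy as np
        from sklearn.datasets import load_digits
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        X, y = load_digits(return_X_y=True)
        rng = np.random.default_rng(3)

        def fitness(p):
            # p = (log10 C, log10 gamma); fitness = 3-fold cross-validated accuracy
            clf = SVC(C=10.0 ** p[0], gamma=10.0 ** p[1])
            return cross_val_score(clf, X, y, cv=3).mean()

        n_particles, n_iter = 8, 10
        lo, hi = np.array([-2.0, -5.0]), np.array([3.0, 0.0])   # search box in log10 space
        pos = rng.uniform(lo, hi, size=(n_particles, 2))
        vel = np.zeros_like(pos)
        pbest = pos.copy()
        pbest_val = np.array([fitness(p) for p in pos])
        gbest = pbest[pbest_val.argmax()].copy()

        for _ in range(n_iter):
            r1, r2 = rng.random((2, n_particles, 1))
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
            pos = np.clip(pos + vel, lo, hi)
            vals = np.array([fitness(p) for p in pos])
            improved = vals > pbest_val
            pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
            gbest = pbest[pbest_val.argmax()].copy()

        print("best (C, gamma):", 10.0 ** gbest, "cross-validated accuracy:", pbest_val.max())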

  13. A characteristic study of CCF modeling techniques and optimization of CCF defense strategies

    International Nuclear Information System (INIS)

    Kim, Min Chull

    2000-02-01

    Common Cause Failures (CCFs) are among the major contributors to risk and core damage frequency (CDF) in operating nuclear power plants (NPPs). Our study on CCF focused on the following aspects: 1) a characteristic study of the CCF modeling techniques and 2) development of the optimal CCF defense strategy. Firstly, the characteristics of CCF modeling techniques were studied through a sensitivity study of the CCF occurrence probability with respect to system redundancy. The modeling techniques considered in this study include those most widely used worldwide, i.e., the beta factor, MGL, alpha factor, and binomial failure rate models. We found that the MGL and alpha factor models are essentially identical in terms of the CCF probability. Secondly, in the study of CCF defense, the various methods identified in previous studies for defending against CCF were classified into five different categories. Based on these categories, we developed a generic method by which the optimal CCF defense strategy can be selected. The method is not only qualitative but also quantitative in nature: the selection of the optimal strategy among candidates is based on the use of the analytic hierarchy process (AHP). We applied this method to two motor-driven valves for containment sump isolation in the Ulchin 3 and 4 nuclear power plants. The result indicates that the method for developing an optimal CCF defense strategy is effective.
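
    As a reminder of how the simplest of the models listed above works, the snippet below evaluates the beta-factor approximation for a redundant group: each component has total failure probability Q_t, a fraction beta of which is a common-cause failure defeating all trains at once, so the group fails with probability roughly beta*Q_t + ((1 - beta)*Q_t)**n. This is a textbook beta-factor sketch with illustrative numbers, not the MGL/alpha-factor comparison or the AHP-based defense-strategy selection carried out in the study.

        def beta_factor_group_failure(q_total, beta, n):
            """Approximate probability that all n redundant trains fail (beta-factor model)."""
            q_ccf = beta * q_total            # common-cause part: fails all trains together
            q_ind = (1.0 - beta) * q_total    # independent part, per train
            return q_ccf + q_ind ** n         # rare-event approximation

        for n in (2, 3, 4):
            print(n, "trains:", beta_factor_group_failure(q_total=1e-3, beta=0.1, n=n))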

  14. Beyond the drugs : non-pharmacological strategies to optimize procedural care in children

    NARCIS (Netherlands)

    Leroy, Piet L.; Costa, Luciane R.; Emmanouil, Dimitris; van Beukering, Alice; Franck, Linda S.

    2016-01-01

    Purpose of review Painful and/or stressful medical procedures mean a substantial burden for sick children. There is good evidence that procedural comfort can be optimized by a comprehensive comfort-directed policy containing the triad of non-pharmacological strategies (NPS) in all cases, timely or

  15. Optimal bidding strategies in oligopoly markets considering bilateral contracts and transmission constraints

    Energy Technology Data Exchange (ETDEWEB)

    Badri, A.; Jadid, S. [Department of Electrical Engineering, Iran University of Science and Technology (Iran); Rashidinejad, M. [Shahid Bahonar University, Kerman (Iran); Moghaddam, M.P. [Tarbiat Modarres University, Tehran (Iran)

    2008-06-15

    In an electricity industry with transmission constraints and a limited number of producers, Generation Companies (GenCos) are facing an oligopoly market rather than a perfectly competitive one. Under an oligopoly market environment, each GenCo may increase its own profit through a favorable bidding strategy. This paper investigates the problem of developing optimal bidding strategies for GenCos, considering bilateral contracts and transmission constraints. The problem is modeled with a bi-level optimization algorithm, where in the first level each GenCo maximizes its payoff and in the second level a system dispatch is accomplished through an OPF problem in which transmission constraints are taken into account. It is assumed that each GenCo has information about the initial bidding strategies of other competitors. Impacts of exercising market power due to transmission constraints as well as irrational bidding by some generators are studied, and the interactions of different bidding strategies on participants' corresponding payoffs are presented. Furthermore, a risk management-based method to obtain GenCos' optimal bilateral contracts is proposed and the impacts of these contracts on GenCos' optimal bids and obtained payoffs are investigated. At the end, the IEEE 30-bus test system is used for the case study in order to demonstrate the simulation results and support the effectiveness of the proposed model. (author)

  16. Optimal Control Strategy Search Using a Simplest 3-D PWR Xenon Oscillation Simulator

    International Nuclear Information System (INIS)

    Yoichiro, Shimazu

    2004-01-01

    Power spatial oscillations due to the transient xenon spatial distribution are well known as xenon oscillations in large PWRs. When the reactor size becomes larger than in current designs, even radial oscillations can become divergent. Even if the radial oscillation is convergent, when a control rod malfunction occurs it is necessary to suppress the oscillation in as short a time as possible. In such cases, an optimal control strategy is required. Generally speaking, optimality searches based on modern control theory require a large amount of calculation for the evaluation of state variables. In the case of control rod malfunctions the xenon oscillation could be three-dimensional. In such cases, direct core calculations would be inevitable. From this point of view a very simple model, a four-point reactor model, has been developed and verified. In this paper, an example of a procedure and the results for an optimal control strategy search are presented. It is shown that there is only one optimal strategy within a half cycle of the oscillation with fixed control strength. It is also shown that a 3-D xenon oscillation introduced by a control rod malfunction cannot be controlled by only one control step, as can be done for axial oscillations. These might be quite strong limitations for operators. Thus it is recommended that a strategy generator, which is quick in analyzing and easy to use, be installed in a monitoring system or operator guiding system. (author)

  17. Research of Ant Colony Optimized Adaptive Control Strategy for Hybrid Electric Vehicle

    Directory of Open Access Journals (Sweden)

    Linhui Li

    2014-01-01

    Full Text Available Energy management control strategy of hybrid electric vehicle has a great influence on the vehicle fuel consumption with electric motors adding to the traditional vehicle power system. As vehicle real driving cycles seem to be uncertain, the dynamic driving cycles will have an impact on control strategy’s energy-saving effect. In order to better adapt the dynamic driving cycles, control strategy should have the ability to recognize the real-time driving cycle and adaptively adjust to the corresponding off-line optimal control parameters. In this paper, four types of representative driving cycles are constructed based on the actual vehicle operating data, and a fuzzy driving cycle recognition algorithm is proposed for online recognizing the type of actual driving cycle. Then, based on the equivalent fuel consumption minimization strategy, an ant colony optimization algorithm is utilized to search the optimal control parameters “charge and discharge equivalent factors” for each type of representative driving cycle. At last, the simulation experiments are conducted to verify the accuracy of the proposed fuzzy recognition algorithm and the validity of the designed control strategy optimization method.

  18. A Regional Time-of-Use Electricity Price Based Optimal Charging Strategy for Electrical Vehicles

    Directory of Open Access Journals (Sweden)

    Jun Yang

    2016-08-01

    Full Text Available With the popularization of electric vehicles (EVs), the uncoordinated charging behaviors of large numbers of EVs will bring new challenges to the safe and economic operation of power systems. This paper studies an optimal charging strategy for EVs. A typical urban zone is divided into four regions, and a regional time-of-use (RTOU) electricity price model is proposed to guide EVs on when and where to charge, considering spatial and temporal characteristics. In light of the elasticity coefficient, the user response to the RTOU electricity price is analyzed, and a bilayer optimization charging strategy including regional-layer and node-layer models is suggested to schedule the EVs. On the one hand, the regional-layer model is designed to coordinate the EVs located at different times and places. On the other hand, the node-layer model is built to schedule the EVs to charge at certain nodes. According to simulations of an IEEE 33-bus distribution network, the performance of the proposed optimal charging strategy is verified. The results demonstrate that the proposed bilayer optimization strategy can effectively decrease the charging cost of users and mitigate the peak-valley load difference and the network loss. Besides, the RTOU electricity price shows better performance than the time-of-use (TOU) electricity price.

  19. Aggregators’ Optimal Bidding Strategy in Sequential Day-Ahead and Intraday Electricity Spot Markets

    Directory of Open Access Journals (Sweden)

    Xiaolin Ayón

    2017-04-01

    Full Text Available This paper proposes a probabilistic optimization method that produces optimal bidding curves to be submitted by an aggregator to the day-ahead electricity market and the intraday market, considering the flexible demand of its customers (based on time-dependent resources such as batteries and shiftable demand) and taking into account the possible imbalance costs as well as the uncertainty of forecasts (market prices, demand, and renewable energy source (RES) generation). The optimization strategy aims to minimize the total cost of the traded energy over a whole day, taking into account the intertemporal constraints. The proposed formulation leads to the solution of different linear optimization problems, following the natural temporal sequence of electricity spot markets. Intertemporal constraints regarding time-dependent resources are fulfilled through a scheduling process performed after the day-ahead market clearing. Each of the different problems is of moderate dimension and requires short computation times. The benefits of the proposed strategy are assessed by comparing the payments made by an aggregator over a sample period of one year following different deterministic and probabilistic strategies. Results show that the probabilistic strategy yields better benefits for aggregators participating in power markets.

  20. An Optimal Portfolio and Capital Management Strategy for Basel III Compliant Commercial Banks

    Directory of Open Access Journals (Sweden)

    Grant E. Muller

    2014-01-01

    Full Text Available We model a Basel III compliant commercial bank that operates in a financial market consisting of a treasury security, a marketable security, and a loan, and we regard the interest rate in the market as being stochastic. We find the investment strategy that maximizes an expected utility of the bank's asset portfolio at a future date. This entails obtaining formulas for the optimal amounts of bank capital invested in different assets. Based on the optimal investment strategy, we derive a model for the Capital Adequacy Ratio (CAR), which the Basel Committee on Banking Supervision (BCBS) introduced as a measure against banks' susceptibility to failure. Furthermore, we consider the optimal investment strategy subject to a constant CAR at the minimum prescribed level. We derive a formula for the bank's asset portfolio at a constant (minimum) CAR value and present numerical simulations of different scenarios. Under the optimal investment strategy, the CAR is above the minimum prescribed level. The value of the asset portfolio is improved if the CAR is at its (constant) minimum value.

  1. Optimal orientation in flows : Providing a benchmark for animal movement strategies

    NARCIS (Netherlands)

    McLaren, James D.; Shamoun-Baranes, Judy; Dokter, Adriaan M.; Klaassen, Raymond H. G.; Bouten, Willem

    2014-01-01

    Animal movements in air and water can be strongly affected by experienced flow. While various flow-orientation strategies have been proposed and observed, their performance in variable flow conditions remains unclear. We apply control theory to establish a benchmark for time-minimizing (optimal)

  2. Stability Analysis and Optimal Control Strategy for Prevention of Pine Wilt Disease

    Directory of Open Access Journals (Sweden)

    Kwang Sung Lee

    2014-01-01

    Full Text Available We propose a mathematical model of pine wilt disease (PWD), which is caused by pine sawyer beetles carrying the pinewood nematode (PWN). We calculate the basic reproduction number R0 and investigate the stability of a disease-free and endemic equilibrium in a given mathematical model. We show that the stability of the equilibrium in the proposed model can be controlled through the basic reproduction number R0. We then discuss effective optimal control strategies for the proposed PWD mathematical model. We demonstrate the existence of a control problem, and then we apply both analytical and numerical techniques to demonstrate effective control methods to prevent the transmission of the PWD. In order to do this, we apply two control strategies: tree-injection of nematicide and the eradication of adult beetles through aerial pesticide spraying. Optimal prevention strategies can be determined by solving the corresponding optimality system. Numerical simulations of the optimal control problem using a set of reasonable parameter values suggest that reducing the number of pine sawyer beetles is more effective than the tree-injection strategy for controlling the spread of PWD.

  3. Optimal household appliances scheduling under day-ahead pricing and load-shaping demand response strategies

    NARCIS (Netherlands)

    Paterakis, N.G.; Erdinç, O.; Bakirtzis, A.G.; Catalao, J.P.S.

    2015-01-01

    In this paper, a detailed home energy management system structure is developed to determine the optimal day-ahead appliance scheduling of a smart household under hourly pricing and peak power-limiting (hard and soft power limitation)-based demand response strategies. All types of controllable assets

  4. Optimal and Robust Switching Control Strategies : Theory, and Applications in Traffic Management

    NARCIS (Netherlands)

    Hajiahmadi, M.

    2015-01-01

    Macroscopic modeling, predictive and robust control and route guidance for large-scale freeway and urban traffic networks are the main focus of this thesis. In order to increase the efficiency of our control strategies, we propose several mathematical and optimization techniques. Moreover, in the

  5. Reproduction now or later: optimal host-handling strategies in the whitefly parasitoid Encarsia formosa

    NARCIS (Netherlands)

    Burger, J.M.S.; Hemerik, L.; Lenteren, van J.C.; Vet, L.E.M.

    2004-01-01

    We developed a dynamic state variable model for studying optimal host-handling strategies in the whitefly parasitoid Encarsia formosa Gahan (Hymenoptera: Aphelinidae). We assumed that (a) the function of host feeding is to gain nutrients that can be matured into eggs, (b) oogenesis is continuous and

  6. Reproduction now or later: optimal host-handling strategies in the whitefly parasitoid Encarsia formosa

    NARCIS (Netherlands)

    Burger, J.S.M.; Hemerik, L.; Van Lenteren, J.C.; Vet, L.E.M.

    2004-01-01

    We developed a dynamic state variable model for studying optimal host-handling strategies in the whitefly parasitoid Encarsia formosa Gahan (Hymenoptera: Aphelinidae). We assumed that (a) the function of host feeding is to gain nutrients that can be matured into eggs, (b) oögenesis is continuous and

  7. Optimal bidding strategies in oligopoly markets considering bilateral contracts and transmission constraints

    International Nuclear Information System (INIS)

    Badri, A.; Jadid, S.; Rashidinejad, M.; Moghaddam, M.P.

    2008-01-01

    In an electricity industry with transmission constraints and a limited number of producers, Generation Companies (GenCos) are facing an oligopoly market rather than a perfectly competitive one. Under an oligopoly market environment, each GenCo may increase its own profit through a favorable bidding strategy. This paper investigates the problem of developing optimal bidding strategies for GenCos, considering bilateral contracts and transmission constraints. The problem is modeled with a bi-level optimization algorithm, where in the first level each GenCo maximizes its payoff and in the second level a system dispatch is accomplished through an OPF problem in which transmission constraints are taken into account. It is assumed that each GenCo has information about the initial bidding strategies of other competitors. Impacts of exercising market power due to transmission constraints as well as irrational bidding by some generators are studied, and the interactions of different bidding strategies on participants' corresponding payoffs are presented. Furthermore, a risk management-based method to obtain GenCos' optimal bilateral contracts is proposed and the impacts of these contracts on GenCos' optimal bids and obtained payoffs are investigated. At the end, the IEEE 30-bus test system is used for the case study in order to demonstrate the simulation results and support the effectiveness of the proposed model. (author)

  8. Self-Regulatory Strategies in Daily Life: Selection, Optimization, and Compensation and Everyday Memory Problems

    Science.gov (United States)

    Robinson, Stephanie A.; Rickenbach, Elizabeth H.; Lachman, Margie E.

    2016-01-01

    The effective use of self-regulatory strategies, such as selection, optimization, and compensation (SOC), requires resources. However, it is theorized that SOC use is most advantageous for those experiencing losses and diminishing resources. The present study explored this seeming paradox within the context of limitations or constraints due to…

  9. Multi-objective Optimization Strategies Using Adjoint Method and Game Theory in Aerodynamics

    Science.gov (United States)

    Tang, Zhili

    2006-08-01

    There are currently three different game strategies originated in economics: (1) Cooperative games (Pareto front), (2) Competitive games (Nash game) and (3) Hierarchical games (Stackelberg game). Each game achieves different equilibria with different performance, and their players play different roles in the games. Here, we introduced game concept into aerodynamic design, and combined it with adjoint method to solve multi-criteria aerodynamic optimization problems. The performance distinction of the equilibria of these three game strategies was investigated by numerical experiments. We computed Pareto front, Nash and Stackelberg equilibria of the same optimization problem with two conflicting and hierarchical targets under different parameterizations by using the deterministic optimization method. The numerical results show clearly that all the equilibria solutions are inferior to the Pareto front. Non-dominated Pareto front solutions are obtained, however the CPU cost to capture a set of solutions makes the Pareto front an expensive tool to the designer.
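
    The performance gap between the game equilibria compared above can be seen even in a two-player toy problem. The sketch below computes a Nash equilibrium of a Cournot-style duopoly by best-response iteration and one Pareto point by maximizing an equal-weighted sum of the two payoffs; the payoff functions and parameters are illustrative assumptions taken from the economics origin of these games, not the aerodynamic objectives of the paper.

        import numpy as np
        from scipy.optimize import minimize

        a, c = 10.0, 2.0   # inverse-demand intercept and marginal cost (illustrative)

        def profit(i, q):
            price = a - (q[0] + q[1])
            return q[i] * (price - c)

        # Nash equilibrium via best-response iteration (Cournot duopoly).
        q = np.array([1.0, 1.0])
        for _ in range(100):
            q[0] = 0.5 * (a - c - q[1])    # player 0's best response to q[1]
            q[1] = 0.5 * (a - c - q[0])    # player 1's best response to q[0]
        print("Nash:", q, "profits:", [profit(0, q), profit(1, q)])

        # One Pareto point: maximize the equal-weighted sum of both profits.
        res = minimize(lambda x: -(profit(0, x) + profit(1, x)), x0=[1.0, 1.0],
                       bounds=[(0, None), (0, None)])
        print("Pareto (equal weights):", res.x, "profits:", [profit(0, res.x), profit(1, res.x)])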

  10. Development of an evaluation method for optimization of maintenance strategy in commercial plant

    International Nuclear Information System (INIS)

    Ito, Satoshi; Shiraishi, Natsuki; Yuki, Kazuhisa; Hashizume, Hidetoshi

    2006-01-01

    In this study, a new simulation method is developed for optimization of maintenance strategy in NPP as a multiple-objective optimization problem (MOP). The result of operation is evaluated as the average of the following three measures in 3,000 trials: Cost of Electricity (COE) as economic risk, Frequency of unplanned shutdown as plant reliability, and Unavailability of Regular Service System (RSS) and Engineering Safety Features (ESF) as safety measures. The following maintenance parameters are considered to evaluate several risk in plant operation by changing maintenance strategy: planned outage cycle, surveillance cycle, major inspection cycle, and surveillance cycle depending on the value of Fussel-Vesely importance measure. By using the Decision-Making method based on AHP, there are individual tendencies depending on individual decision-maker. Therefore this study could be useful for resolving the problem of maintenance optimization as a MOP. (author)

  11. Multi-objective optimization strategies using adjoint method and game theory in aerodynamics

    Institute of Scientific and Technical Information of China (English)

    Zhili Tang

    2006-01-01

    There are currently three different game strategies originating in economics: (1) Cooperative games (Pareto front), (2) Competitive games (Nash game) and (3) Hierarchical games (Stackelberg game). Each game achieves a different equilibrium with different performance, and the players play different roles in each game. Here, we introduced the game concept into aerodynamic design and combined it with the adjoint method to solve multi-criteria aerodynamic optimization problems. The performance distinction between the equilibria of these three game strategies was investigated by numerical experiments. We computed the Pareto front and the Nash and Stackelberg equilibria of the same optimization problem, with two conflicting and hierarchical targets, under different parameterizations using the deterministic optimization method. The numerical results show clearly that all the equilibrium solutions are inferior to the Pareto front. Non-dominated Pareto front solutions are obtained; however, the CPU cost to capture a set of solutions makes the Pareto front an expensive tool for the designer.

  12. Stochastic Optimal Wind Power Bidding Strategy in Short-Term Electricity Market

    DEFF Research Database (Denmark)

    Hu, Weihao; Chen, Zhe; Bak-Jensen, Birgitte

    2012-01-01

    Due to the fluctuating nature and imperfect forecasting of wind power, wind power owners are penalized for the imbalance costs of regulation when they trade wind power in the short-term liberalized electricity market. Therefore, in this paper a formulation of an imbalance cost minimization problem for trading wind power in the short-term electricity market is described, to help wind power owners optimize their bidding strategy. Stochastic optimization and a Monte Carlo method are adopted to find the optimal bidding strategy for trading wind power in the short-term electricity market, in order to deal with the uncertainty of the regulation price, the activated regulation of the power system and the forecasted wind power generation. The Danish short-term electricity market and a wind farm in western Denmark are chosen as study cases due to the high wind power penetration here...
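
    A minimal Monte Carlo sketch of the idea in Python: candidate day-ahead bids are scored by expected revenue net of imbalance costs over simulated wind and regulation-price scenarios, and the best-scoring bid is kept. The prices, distributions and settlement rule below are invented for illustration and do not reproduce the Danish market rules.

      import numpy as np

      rng = np.random.default_rng(0)
      N = 10_000                          # Monte Carlo scenarios
      spot = 50.0                         # day-ahead price, EUR/MWh (assumed)
      wind = np.clip(rng.normal(60.0, 15.0, N), 0.0, 100.0)   # realized wind, MWh
      # Assumed regulation penalties, not the actual market prices:
      price_short = rng.uniform(55.0, 80.0, N)   # paid when delivering less than bid
      price_long = rng.uniform(20.0, 45.0, N)    # received for surplus energy

      def expected_revenue(bid):
          shortfall = np.maximum(bid - wind, 0.0)
          surplus = np.maximum(wind - bid, 0.0)
          revenue = bid * spot - shortfall * price_short + surplus * price_long
          return revenue.mean()

      bids = np.linspace(0.0, 100.0, 101)
      best = max(bids, key=expected_revenue)
      print(f"best bid = {best:.1f} MWh, expected revenue = {expected_revenue(best):.0f} EUR")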

  13. Study on the Optimal Charging Strategy for Lithium-Ion Batteries Used in Electric Vehicles

    Directory of Open Access Journals (Sweden)

    Shuo Zhang

    2014-10-01

    Full Text Available The charging method of lithium-ion batteries used in electric vehicles (EVs) significantly affects their commercial application. This paper aims to make three contributions to the existing literature. (1) In order to achieve an efficient charging strategy for lithium-ion batteries with shorter charging time and lower charging loss, the trade-off between charging loss and charging time is analyzed in detail through a dynamic programming (DP) optimization algorithm. (2) To reduce the computation time consumed during the optimization process, a database-based optimization approach is proposed; after off-line calculation, the results can be applied to on-line charging. (3) The novel database-based DP method is proposed, and the simulation results illustrate that this method can effectively find suboptimal charging strategies under a given balance between charging loss and charging time.
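
    As a sketch of the charging-time/charging-loss trade-off solved by dynamic programming, the Python fragment below minimizes resistive loss over a discretized SOC and time grid subject to a total-time budget. The capacity, candidate currents, resistance model and time budget are all assumed values for illustration, not the paper's battery model.

      import numpy as np

      CAPACITY_AH = 40.0
      SOC_STEPS = 20                       # charge delivered in equal SOC increments
      DQ = CAPACITY_AH / SOC_STEPS         # Ah per increment
      CURRENTS = [10.0, 20.0, 40.0, 80.0]  # candidate charging currents, A
      T_BUDGET_H = 1.5                     # total charging time allowed, hours
      T_GRID = 0.05                        # time resolution of the DP grid, hours

      def resistance(soc):                 # assumed SOC-dependent internal resistance
          return 0.02 + 0.03 * soc ** 2    # ohms

      n_t = round(T_BUDGET_H / T_GRID) + 1
      INF = float("inf")
      # cost[k][j] = minimal loss (Wh) to finish charging from SOC step k
      # with j time-grid units still available.
      cost = np.full((SOC_STEPS + 1, n_t), INF)
      cost[SOC_STEPS, :] = 0.0
      choice = np.zeros((SOC_STEPS + 1, n_t), dtype=int)

      for k in range(SOC_STEPS - 1, -1, -1):
          soc = k / SOC_STEPS
          for j in range(n_t):
              for c, i_amp in enumerate(CURRENTS):
                  dt = DQ / i_amp                        # hours for this increment
                  steps = int(np.ceil(dt / T_GRID))
                  if steps > j:
                      continue                           # would exceed the time budget
                  loss = i_amp ** 2 * resistance(soc) * dt   # Wh dissipated in R
                  total = loss + cost[k + 1, j - steps]
                  if total < cost[k, j]:
                      cost[k, j], choice[k, j] = total, c

      # Recover the charging profile from a full time budget.
      profile, j = [], n_t - 1
      for k in range(SOC_STEPS):
          c = choice[k, j]
          profile.append(CURRENTS[c])
          j -= int(np.ceil(DQ / CURRENTS[c] / T_GRID))
      print("total loss (Wh):", round(cost[0, n_t - 1], 2))
      print("current per SOC step:", profile)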

  14. An Image Filter Based on Shearlet Transformation and Particle Swarm Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Kai Hu

    2015-01-01

    Full Text Available Digital images are often polluted by noise, which makes data postprocessing difficult. To remove noise while preserving as much image detail as possible, this paper proposes an image filter algorithm that combines the merits of the Shearlet transformation and the particle swarm optimization (PSO) algorithm. Firstly, we use the classical Shearlet transform to decompose the noised image into many subwavelets over multiple scales and orientations. Secondly, we assign a weighting factor to each of the obtained subwavelets. Then, using the classical inverse Shearlet transform, we obtain a composite image composed of those weighted subwavelets. After that, we design a fast, rough evaluation method to estimate the noise level of the new image; using this measure as the fitness, we adopt PSO to find the optimal weighting factors. After many iterations, applying the optimal factors and the inverse Shearlet transform yields the best denoised image. Experimental results show that the proposed algorithm eliminates noise effectively and yields a good peak signal-to-noise ratio (PSNR).
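
    The PSO part of such a scheme is easy to sketch. The Python fragment below is a bare-bones particle swarm searching for subband weights; the fitness function is only a placeholder (in the method above it would score the noise level of the image rebuilt from the weighted shearlet subwavelets), and the swarm constants are standard textbook values rather than the paper's settings.

      import numpy as np

      rng = np.random.default_rng(1)

      def fitness(weights):
          # Placeholder fitness: pretend the "cleanest" reconstruction shrinks the
          # three finest (noisiest) subbands towards a weight of 0.3.
          target = np.array([1.0, 1.0, 0.3, 0.3, 0.3])
          return float(np.sum((weights - target) ** 2))

      DIM, N_PARTICLES, N_ITER = 5, 30, 200
      W, C1, C2 = 0.72, 1.49, 1.49            # inertia and acceleration constants

      pos = rng.uniform(0.0, 1.0, (N_PARTICLES, DIM))
      vel = rng.uniform(-0.1, 0.1, (N_PARTICLES, DIM))
      pbest = pos.copy()
      pbest_val = np.array([fitness(p) for p in pos])
      gbest = pbest[np.argmin(pbest_val)].copy()

      for _ in range(N_ITER):
          r1, r2 = rng.random((N_PARTICLES, DIM)), rng.random((N_PARTICLES, DIM))
          vel = W * vel + C1 * r1 * (pbest - pos) + C2 * r2 * (gbest - pos)
          pos = np.clip(pos + vel, 0.0, 1.0)  # keep weights in [0, 1]
          vals = np.array([fitness(p) for p in pos])
          improved = vals < pbest_val
          pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
          gbest = pbest[np.argmin(pbest_val)].copy()

      print("best subband weights:", np.round(gbest, 3))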

  15. Modelling and Optimal Control of Typhoid Fever Disease with Cost-Effective Strategies.

    Science.gov (United States)

    Tilahun, Getachew Teshome; Makinde, Oluwole Daniel; Malonza, David

    2017-01-01

    We propose and analyze a compartmental nonlinear deterministic mathematical model for the typhoid fever outbreak and optimal control strategies in a community with varying population. The model is studied qualitatively using stability theory of differential equations and the basic reproductive number that represents the epidemic indicator is obtained from the largest eigenvalue of the next-generation matrix. Both local and global asymptotic stability conditions for disease-free and endemic equilibria are determined. The model exhibits a forward transcritical bifurcation and the sensitivity analysis is performed. The optimal control problem is designed by applying Pontryagin maximum principle with three control strategies, namely, the prevention strategy through sanitation, proper hygiene, and vaccination; the treatment strategy through application of appropriate medicine; and the screening of the carriers. The cost functional accounts for the cost involved in prevention, screening, and treatment together with the total number of the infected persons averted. Numerical results for the typhoid outbreak dynamics and its optimal control revealed that a combination of prevention and treatment is the best cost-effective strategy to eradicate the disease.

  16. Applying GA for Optimizing the User Query in Image and Video Retrieval

    OpenAIRE

    Ehsan Lotfi

    2014-01-01

    In an information retrieval system, the query can be made by a user sketch. The new method presented here optimizes the user sketch and applies the optimized query to retrieve the information. This optimization may be used in Content-Based Image Retrieval (CBIR) and Content-Based Video Retrieval (CBVR), which is based on trajectory extraction. To optimize the retrieval process, one stage of retrieval is performed by the user sketch. The retrieval criterion is based on the proposed distance met...

  17. Digital radiography: optimization of image quality and dose using multi-frequency software.

    Science.gov (United States)

    Precht, H; Gerke, O; Rosendahl, K; Tingberg, A; Waaler, D

    2012-09-01

    New developments in the processing of digital radiographs (DR), including multi-frequency processing (MFP), allow optimization of image quality and radiation dose. This is particularly promising in children, as they are believed to be more sensitive to ionizing radiation than adults. The aim was to examine whether the use of MFP software reduces the radiation dose without compromising quality at DR of the femur in 5-year-old-equivalent anthropomorphic and technical phantoms. A total of 110 images of an anthropomorphic phantom were acquired on a DR system (Canon DR with CXDI-50 C detector and MLT[S] software) and analyzed by three pediatric radiologists using Visual Grading Analysis. In addition, 3,500 images of a technical contrast-detail phantom (CDRAD 2.0) provided an objective image-quality assessment. Optimal image quality was maintained at a dose reduction of 61% with MLT(S)-optimized images. Even for images of diagnostic quality, MLT(S) provided a dose reduction of 88% as compared to the reference image. The software's impact on image quality was found to be significant for dose (mAs), dynamic range dark region and frequency band. By optimizing image processing parameters, a significant dose reduction is possible without significant loss of image quality.

  18. Optimal offering and operating strategies for wind-storage systems with linear decision rules

    DEFF Research Database (Denmark)

    Ding, Huajie; Pinson, Pierre; Hu, Zechun

    2016-01-01

    The participation of wind farm-energy storage systems (WF-ESS) in electricity markets calls for an integrated view of day-ahead offering strategies and real-time operation policies. Such an integrated strategy is proposed here by co-optimizing the offering at the day-ahead stage and the operation policy to be used at the balancing stage. Linear decision rules are seen as a natural approach to model and optimize the real-time operation policy. These allow enhancing profits from balancing markets based on updated information on prices and wind power generation. Our integrated strategies for WF...

  19. Optimal Search Strategy of Robotic Assembly Based on Neural Vibration Learning

    Directory of Open Access Journals (Sweden)

    Lejla Banjanovic-Mehmedovic

    2011-01-01

    Full Text Available This paper presents the implementation of an optimal search strategy (OSS) in the verification of an assembly process based on neural vibration learning. The application problem is the complex robotic assembly of miniature parts, exemplified by mating the gears of a multistage planetary speed reducer. Assembly of the tube over the planetary gears was identified as the most difficult part of the overall assembly. The favourable influence of vibration and rotational movement on the compensation of tolerances was also observed. With the proposed neural-network-based learning algorithm, it is possible to find an extended scope of the vibration state parameters. Using an optimal search strategy based on the minimal-distance path between vibration parameter stage sets (amplitudes and frequencies of the robot's grip vibration) and a recovery parameter algorithm, we can improve the robot assembly behaviour, that is, allow the fastest possible way of mating. We have verified using simulation programs that the search strategy is suitable for situations with unexpected events due to uncertainties.

  20. Operational Analysis of Time-Optimal Maneuvering for Imaging Spacecraft

    Science.gov (United States)

    2013-03-01

    Front-matter fragments from the report's list of figures: Figure 1, In-Track Stereo Satellite Image Collection (from [7]); Figure 2, NASA MODIS Terra Satellite Image of Oil... The use of satellites for remote sensing ranges from military applications to tracking global weather patterns, tectonic activity, surface vegetation and ocean imagery [22]. Figure 2 shows an example of a satellite image captured by the Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA's Terra satellite.

  1. Optimal energy window setting depending on the energy resolution for radionuclides used in gamma camera imaging. Planar imaging evaluation

    International Nuclear Information System (INIS)

    Kojima, Akihiro; Watanabe, Hiroyuki; Arao, Yuichi; Kawasaki, Masaaki; Takaki, Akihiro; Matsumoto, Masanori

    2007-01-01

    In this study, we examined whether the optimal energy window (EW) setting depending on the energy resolution of a gamma camera, which we previously proposed, is valid for planar scintigraphic imaging using Tl-201, Ga-67, Tc-99m, and I-123. Image acquisitions of line sources and paper-sheet phantoms containing each radionuclide were performed in air and with scattering materials. For the six photopeaks other than that of the Hg-201 characteristic x-rays, the conventional 20%-width energy window (EW20%) setting and the optimal energy window (optimal EW) setting (15%-width below 100 keV and 13%-width above 100 keV) were compared. For the Hg-201 characteristic x-rays' photopeak, the conventional on-peak EW20% setting was compared with an off-peak EW setting (73 keV-25%) and a wider off-peak EW setting (77 keV-29%). The image-count ratio (defined as the ratio of the image counts obtained with an EW to the total image counts obtained with an EW covering the whole photopeak, for a line source in air), image quality, spatial resolutions (full width at half maximum (FWHM) and full width at tenth maximum (FWTM) values), count-profile curves, and defect-contrast values were compared between the conventional EW setting and the optimal EW setting. Except for the Hg-201 characteristic x-rays, the image-count ratios were 94-99% for the EW20% setting, but 78-89% for the optimal EW setting. However, the optimal EW setting reduced the scatter fraction (defined as the scattered-to-primary counts ratio) effectively, as compared with the EW20% setting. Consequently, all the images with the optimal EW setting gave better image quality than those with the EW20% setting. For the Hg-201 characteristic x-rays, the off-peak EW setting showed a great improvement in image quality in comparison with the EW20% setting, and the wider off-peak EW setting gave the best results. In conclusion, our planar imaging study showed that although the optimal EW setting proposed by us gives a lower image-count ratio by
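
    For readers unfamiliar with the window notation, a symmetric energy window of fractional width w centred on energy E spans E(1 - w/2) to E(1 + w/2). The short Python snippet below prints the limits for a few of the settings mentioned above, using the well-known Tc-99m photopeak of 140.5 keV as the example energy; it is only arithmetic, not part of the study.

      def window(center_kev, width_percent):
          # Symmetric window limits: center +/- half of the fractional width.
          half = center_kev * width_percent / 200.0
          return center_kev - half, center_kev + half

      for label, e, w in [("Tc-99m, conventional 20%", 140.5, 20),
                          ("Tc-99m, optimal 13% (above 100 keV)", 140.5, 13),
                          ("Hg-201 x-rays, off-peak 73 keV-25%", 73.0, 25),
                          ("Hg-201 x-rays, wider off-peak 77 keV-29%", 77.0, 29)]:
          lo, hi = window(e, w)
          print(f"{label}: {lo:.1f}-{hi:.1f} keV")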

  2. Optimization of selective inversion recovery magnetization transfer imaging for macromolecular content mapping in the human brain.

    Science.gov (United States)

    Dortch, Richard D; Bagnato, Francesca; Gochberg, Daniel F; Gore, John C; Smith, Seth A

    2018-03-24

    To optimize a selective inversion recovery (SIR) sequence for macromolecular content mapping in the human brain at 3.0T. SIR is a quantitative method for measuring magnetization transfer (qMT) that uses a low-power, on-resonance inversion pulse. This results in a biexponential recovery of free water signal that can be sampled at various inversion/predelay times (tI/tD) to estimate a subset of qMT parameters, including the macromolecular-to-free pool-size-ratio (PSR), the R1 of free water (R1f), and the rate of MT exchange (kmf). The adoption of SIR has been limited by long acquisition times (≈4 min/slice). Here, we use Cramér-Rao lower bound theory and data reduction strategies to select optimal tI/tD combinations to reduce imaging times. The schemes were experimentally validated in phantoms, and tested in healthy volunteers (N = 4) and a multiple sclerosis patient. Two optimal sampling schemes were determined: (i) a 5-point scheme (kmf estimated) and (ii) a 4-point scheme (kmf assumed). In phantoms, the 5/4-point schemes yielded parameter estimates with similar SNRs as our previous 16-point scheme, but with 4.1/6.1-fold shorter scan times. Pair-wise comparisons between schemes did not detect significant differences for any scheme/parameter. In humans, parameter values were consistent with published values, and similar levels of precision were obtained from all schemes. Furthermore, fixing kmf reduced the sensitivity of PSR to partial-volume averaging, yielding more consistent estimates throughout the brain. qMT parameters can be robustly estimated in ≤1 min/slice (without independent measures of ΔB0, B1+, and T1) when optimized tI-tD combinations are selected. © 2018 International Society for Magnetic Resonance in Medicine.
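
    The Cramér-Rao machinery used for this kind of sampling-scheme selection is straightforward to prototype. The Python sketch below scores two candidate inversion-time sets by the CRLB of a generic biexponential recovery model with additive Gaussian noise; the signal equation, parameter values and candidate schemes are invented stand-ins, not the actual SIR model or the schemes reported above.

      import numpy as np

      def signal(t, theta):
          # Generic biexponential inversion-recovery stand-in; (2 - b_fast) forces
          # S(0) = -amp, i.e. a full inversion at t = 0.
          amp, b_fast, r_fast, r_slow = theta
          return amp * (1.0 - b_fast * np.exp(-r_fast * t)
                        - (2.0 - b_fast) * np.exp(-r_slow * t))

      def crlb(t_samples, theta, sigma=0.01):
          # Fisher information from a central-difference Jacobian, Gaussian noise.
          t = np.asarray(t_samples, dtype=float)
          eps = 1e-6
          jac = np.empty((t.size, len(theta)))
          for j in range(len(theta)):
              up, dn = np.array(theta, float), np.array(theta, float)
              up[j] += eps
              dn[j] -= eps
              jac[:, j] = (signal(t, up) - signal(t, dn)) / (2 * eps)
          fisher = jac.T @ jac / sigma ** 2
          return np.sqrt(np.diag(np.linalg.inv(fisher)))   # std-dev lower bounds

      theta0 = [1.0, 0.8, 20.0, 1.0]        # invented "true" parameters
      dense = np.geomspace(0.005, 3.0, 16)  # a 16-point reference scheme (s)
      sparse = [0.01, 0.05, 0.3, 1.0, 3.0]  # a candidate reduced 5-point scheme (s)
      print("16-point CRLB:", np.round(crlb(dense, theta0), 4))
      print(" 5-point CRLB:", np.round(crlb(sparse, theta0), 4))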

  3. Optimization of contrast of MR images in imaging of knee joint; Optymalizacja kontrastu obrazow MR na przykladzie obrazow stawu kolanowego

    Energy Technology Data Exchange (ETDEWEB)

    Szyblinski, K. [Institute of Nuclear Physics, Cracow (Poland); Bacic, G. [Dartmouth College, Hanover, NH (United States)

    1994-12-31

    The work describes a method of contrast optimization in magnetic resonance imaging. The computer program presented in the report allows analysis of contrast in selected tissues as a function of experimental parameters. An application to imaging of the knee joint is presented. 2 refs, 4 figs.

  4. Optimal recharge and driving strategies for a battery-powered electric vehicle

    Directory of Open Access Journals (Sweden)

    Lee W. R.

    1999-01-01

    Full Text Available A major problem facing battery-powered electric vehicles lies in their batteries: weight and charge capacity. Thus, a battery-powered electric vehicle has only a short driving range, and to travel longer distances the batteries must be recharged frequently. In this paper, we construct a model for a battery-powered electric vehicle in which a driving strategy is to be obtained such that the total travelling time between two locations is minimized. The problem is formulated as an optimization problem with switching times and speed as decision variables. This is an unconventional optimization problem. However, by using the control parametrization enhancing technique (CPET), it is shown that this unconventional problem is equivalent to a conventional optimal parameter selection problem. Numerical examples are solved using the proposed method.

  5. A novel optimal coordinated control strategy for the updated robot system for single port surgery.

    Science.gov (United States)

    Bai, Weibang; Cao, Qixin; Leng, Chuntao; Cao, Yang; Fujie, Masakatsu G; Pan, Tiewen

    2017-09-01

    Research into robotic systems for single port surgery (SPS) has become widespread around the world in recent years. A new robot arm system for SPS was developed, but its positioning platform and other hardware components were not efficient. Special features of the developed surgical robot system make safe and efficient teleoperation difficult. A robot arm is combined and used as the new positioning platform, and remote center motion is realized by a new method using active motion control. A new mapping strategy based on kinematics computation is developed, together with a novel optimal coordinated control strategy based on real-time approach toward a defined anthropopathic criterion configuration, which refers to the customary relaxed state of human arms and especially the configuration of boxers' habitual preparation posture. The hardware components, control architecture, control system, and mapping strategy of the robotic system have been updated. A novel optimal coordinated control strategy is proposed and tested. The new robot system can be more dexterous, intelligent, convenient and safer for preoperative positioning and intraoperative adjustment. The mapping strategy can achieve good following and representation for the slave manipulator arms, and the proposed control strategy enables them to complete tasks with higher maneuverability, a lower possibility of self-interference, and singularity-free motion while teleoperating. Copyright © 2017 John Wiley & Sons, Ltd.

  6. Real-time SPARSE-SENSE cardiac cine MR imaging: optimization of image reconstruction and sequence validation.

    Science.gov (United States)

    Goebel, Juliane; Nensa, Felix; Bomas, Bettina; Schemuth, Haemi P; Maderwald, Stefan; Gratz, Marcel; Quick, Harald H; Schlosser, Thomas; Nassenstein, Kai

    2016-12-01

    Improved real-time cardiac magnetic resonance (CMR) sequences have recently been introduced, but so far only limited practical experience exists. This study aimed at image reconstruction optimization and clinical validation of a new highly accelerated real-time cine SPARSE-SENSE sequence. Left ventricular (LV) short-axis stacks of a real-time free-breathing SPARSE-SENSE sequence with high spatiotemporal resolution and of a standard segmented cine SSFP sequence were acquired at 1.5 T in 11 volunteers and 15 patients. To determine the optimal number of iterations, all volunteers' SPARSE-SENSE images were reconstructed using 10-200 iterations, and contrast ratios, image entropies, and reconstruction times were assessed. Subsequently, the patients' SPARSE-SENSE images were reconstructed with the clinically optimal number of iterations. LV volumetric values were evaluated and compared between both sequences. Sufficient image quality and acceptable reconstruction times were achieved when using 80 iterations. Bland-Altman plots and Passing-Bablok regression showed good agreement for all volumetric parameters. 80 iterations are recommended for iterative SPARSE-SENSE image reconstruction in clinical routine. Real-time cine SPARSE-SENSE yielded volumetric results comparable to the current standard SSFP sequence. Due to its intrinsically low image acquisition times, real-time cine SPARSE-SENSE imaging with iterative image reconstruction seems to be an attractive alternative for LV function analysis. • A highly accelerated real-time CMR sequence using SPARSE-SENSE was evaluated. • SPARSE-SENSE allows free breathing in real-time cardiac cine imaging. • For clinically optimal SPARSE-SENSE image reconstruction, 80 iterations are recommended. • Real-time SPARSE-SENSE imaging yielded volumetric results comparable to the reference SSFP sequence. • The fast SPARSE-SENSE sequence is an attractive alternative to standard SSFP sequences.

  7. Establishment of an immortalized mouse dermal papilla cell strain with optimized culture strategy

    Directory of Open Access Journals (Sweden)

    Haiying Guo

    2018-01-01

    Full Text Available Dermal papilla (DP) plays important roles in hair follicle regeneration. Long-term culture of mouse DP cells can provide enough cells for research on and application of DP cells. We optimized the culture strategy for DP cells in three respects: stepwise dissection, collagen I coating, and an optimized culture medium. Based on the optimized culture strategy, we immortalized primary DP cells with the SV40 large T antigen and established several immortalized DP cell strains. By comparing molecular expression and morphologic characteristics with primary DP cells, we found that one cell strain, named iDP6, was similar to primary DP cells. Further characterization showed that iDP6 expresses FGF7 and α-SMA and has alkaline phosphatase activity. During the characterization of the immortalized DP cell strains, we also found that cells in the DP were heterogeneous. We successfully optimized the culture strategy for DP cells and established an immortalized DP cell strain suitable for research on and application of DP cells.

  8. Applying the Taguchi method to river water pollution remediation strategy optimization.

    Science.gov (United States)

    Yang, Tsung-Ming; Hsu, Nien-Sheng; Chiu, Chih-Chiang; Wang, Hsin-Ju

    2014-04-15

    Optimization methods usually obtain the travel direction of the solution by substituting the solutions into the objective function. However, if the solution space is too large, this search method may be time consuming. In order to address this problem, this study incorporated the Taguchi method into the solution space search process of the optimization method, and used the characteristics of the Taguchi method to sequence the effects of the variation of decision variables on the system. Based on the level of effect, this study determined the impact factor of decision variables and the optimal solution for the model. The integration of the Taguchi method and the solution optimization method successfully obtained the optimal solution of the optimization problem, while significantly reducing the solution computing time and enhancing the river water quality. The results suggested that the basin with the greatest water quality improvement effectiveness is the Dahan River. Under the optimal strategy of this study, the severe pollution length was reduced from 18 km to 5 km.
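
    A toy version of the Taguchi screening step is easy to write down: run the response function over an L9(3^4) orthogonal array, compute a smaller-the-better S/N ratio per run, and rank factors by the spread of their level means. In the Python sketch below only the L9 array and the ranking procedure follow the standard Taguchi recipe; the response function and the level settings are invented and have nothing to do with the Dahan River model.

      import numpy as np

      # Standard L9(3^4) orthogonal array (levels coded 0, 1, 2).
      L9 = np.array([[0, 0, 0, 0], [0, 1, 1, 1], [0, 2, 2, 2],
                     [1, 0, 1, 2], [1, 1, 2, 0], [1, 2, 0, 1],
                     [2, 0, 2, 1], [2, 1, 0, 2], [2, 2, 1, 0]])
      LEVELS = np.array([[0.2, 0.5, 0.8]] * 4)     # candidate settings per variable

      def response(x):
          # Stand-in "pollution index" to be minimized; factor 0 dominates by design.
          return 10.0 * (1.0 - x[0]) + 3.0 * x[1] ** 2 + abs(x[2] - 0.5) + 0.2 * x[3]

      runs = np.array([response(LEVELS[np.arange(4), row]) for row in L9])
      sn = -10.0 * np.log10(runs ** 2)             # smaller-the-better S/N ratio

      effects = []
      for factor in range(4):
          level_means = [sn[L9[:, factor] == lev].mean() for lev in range(3)]
          effects.append(max(level_means) - min(level_means))

      order = np.argsort(effects)[::-1]
      print("factor ranking (most influential first):", order.tolist())
      print("effect sizes:", np.round(np.array(effects)[order], 2).tolist())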

  9. Applying the Taguchi Method to River Water Pollution Remediation Strategy Optimization

    Directory of Open Access Journals (Sweden)

    Tsung-Ming Yang

    2014-04-01

    Full Text Available Optimization methods usually obtain the travel direction of the solution by substituting the solutions into the objective function. However, if the solution space is too large, this search method may be time consuming. In order to address this problem, this study incorporated the Taguchi method into the solution space search process of the optimization method, and used the characteristics of the Taguchi method to sequence the effects of the variation of decision variables on the system. Based on the level of effect, this study determined the impact factor of decision variables and the optimal solution for the model. The integration of the Taguchi method and the solution optimization method successfully obtained the optimal solution of the optimization problem, while significantly reducing the solution computing time and enhancing the river water quality. The results suggested that the basin with the greatest water quality improvement effectiveness is the Dahan River. Under the optimal strategy of this study, the severe pollution length was reduced from 18 km to 5 km.

  10. Optimization strategies for cask design and container loading in long term spent fuel storage

    International Nuclear Information System (INIS)

    2006-12-01

    As delays are incurred in implementing reprocessing and in planning for geologic repositories, storage of increasing quantities of spent fuel for extended durations is becoming a growing reality. Accordingly, effective management of spent fuel continues to be a priority topic. In response, the IAEA has organized a series of meetings to identify cask loading optimisation issues in preparation for a technical publication on Optimization Strategies for Cask/Container Loading in Long Term Spent Fuel Storage. This publication outlines the optimisation process for cask design, licensing and utilization, describing three principal groups of optimization activities in terms of relevant technical considerations such as criticality, shielding, structural design, operations, maintenance and retrievability. The optimization process for cask design, licensing, and utilization is outlined. The general objectives for the design of storage casks, including storage casks that are intended to be transportable, are summarized. The nature of optimization within the design process is described. The typical regulatory and licensing process is outlined, focusing on the roles of safety regulations, the regulator, and the designer/applicant in the optimization process. Based on the foregoing, a description of the three principal groups of optimization activities is provided. The subsequent chapters of this document then describe the specific optimization activities within these three activity groups, in each of the several design disciplines

  11. Ultrafuzziness Optimization Based on Type II Fuzzy Sets for Image Thresholding

    Directory of Open Access Journals (Sweden)

    Hudan Studiawan

    2010-11-01

    Full Text Available Image thresholding is one of the processing techniques used to provide a high-quality preprocessed image. Image vagueness and bad illumination are common obstacles that yield poor image thresholding output. By treating an image as a fuzzy set, several different fuzzy thresholding techniques have been proposed to overcome these obstacles during threshold selection. In this paper, we propose an algorithm for thresholding images using ultrafuzziness optimization to decrease uncertainty in the fuzzy system by means of type II fuzzy sets. Optimization was conducted by measuring ultrafuzziness for the background and object fuzzy sets separately. Experimental results demonstrate that the proposed image thresholding method performs well for images with high vagueness, low contrast, and grayscale ambiguity.
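
    A simplified sketch of such a threshold scan is given below in Python: for every candidate threshold, background and object memberships are built around the class means, the type II footprint of uncertainty is taken as mu**0.5 and mu**2, and an ultrafuzziness index is accumulated from the histogram. The synthetic data, the triangular membership shape and the rule of keeping the threshold with the smallest total ultrafuzziness are all simplifying assumptions and may differ from the paper's formulation.

      import numpy as np

      rng = np.random.default_rng(2)
      # Synthetic 8-bit "image": a dark background class and a brighter object class.
      pixels = np.concatenate([rng.normal(70, 18, 6000), rng.normal(170, 22, 4000)])
      pixels = np.clip(pixels, 0, 255).astype(np.uint8)
      hist = np.bincount(pixels, minlength=256).astype(float)
      levels = np.arange(256, dtype=float)

      def membership(mean, spread=60.0):
          # Triangular membership of each gray level to a class centred on `mean`.
          return np.clip(1.0 - np.abs(levels - mean) / spread, 1e-6, 1.0)

      def ultrafuzziness(mask, mu):
          upper, lower = mu ** 0.5, mu ** 2       # type II footprint of uncertainty
          return float(np.sum(hist[mask] * (upper - lower)[mask]) / hist.sum())

      scores = {}
      for t in range(20, 236):
          below, above = levels <= t, levels > t
          if hist[below].sum() == 0 or hist[above].sum() == 0:
              continue
          mean_bg = np.average(levels[below], weights=hist[below])
          mean_ob = np.average(levels[above], weights=hist[above])
          scores[t] = (ultrafuzziness(below, membership(mean_bg)) +
                       ultrafuzziness(above, membership(mean_ob)))

      threshold = min(scores, key=scores.get)     # least ambiguous split (assumed rule)
      print("selected threshold:", threshold)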

  12. DESTINATION MARKETING STRATEGY IN BALI THROUGH OPTIMIZING THE POTENTIAL OF LOCAL PRODUCTS

    Directory of Open Access Journals (Sweden)

    I Gusti Ayu Oka Suryawardani

    2014-03-01

    Full Text Available This study was designed to examine destination marketing strategy in Bali through optimizing the potential of local products. Seventy-nine hotel managers were interviewed, selected by a cluster sampling method, to obtain their points of view. The results show that a destination must build its image around unique attributes that provide a sustainable competitive advantage, including attractions that should be designed to meet the needs of the target market and should be served by local products. The results also show that hotel managers thought that foreign tourists always preferred imported products, whereas previous statistical results indicate that foreign tourists significantly look for local products. There is a need to encourage hotel managers to change their perceptions and attitudes about local and imported products. In fact, hoteliers expressed willingness to use local products as long as these meet the quality standard. As tourism involves four types of activities, namely something to see, something to do, something to buy, and something to learn, destination product development could focus on these activities by offering foreign tourists the opportunity to stay in hotels, homestays or villas owned by Balinese; to eat in restaurants owned by Balinese, choosing authentic local foods that use local meat, seafood and vegetables, and exotic local fruits and beverages; and to buy products that are produced by the Balinese. By promoting vacations in a genuinely Balinese atmosphere, such as staying in accommodation owned by the Balinese, supported by the authenticity of local Balinese foods, fruits and beverages, tourism will strengthen the local economy, so that the benefits of tourism development flow more to the local Balinese. The results suggest that destination management related to the improvement of service and hospitality is very important, through the improvement of human resources by giving training to employees, educate

  13. Health-related quality of life, optimism, and coping strategies in persons suffering from localized scleroderma.

    Science.gov (United States)

    Szramka-Pawlak, B; Dańczak-Pazdrowska, A; Rzepa, T; Szewczyk, A; Sadowska-Przytocka, A; Żaba, R

    2013-01-01

    The clinical course of localized scleroderma may consist of bodily deformations, and bodily functions may also be affected. Additionally, the secondary lesions, such as discoloration, contractures, and atrophy, are unlikely to regress. The aforementioned symptoms and functional disturbances may decrease one's quality of life (QoL). Although much has been mentioned in the medical literature regarding QoL in persons suffering from dermatologic diseases, no data specifically describing patients with localized scleroderma exist. The aim of the study was to explore QoL in localized scleroderma patients and to examine their coping strategies in regard to optimism and QoL. The study included 41 patients with localized scleroderma. QoL was evaluated using the SKINDEX questionnaire, and levels of dispositional optimism were assessed using the Life Orientation Test-Revised. In addition, individual coping strategy was determined using the Mini-MAC scale and physical condition was assessed using the Localized Scleroderma Severity Index. The mean QoL score amounted to 51.10 points, with mean scores for individual components as follows: symptoms = 13.49 points, emotions = 21.29 points, and functioning = 16.32 points. A relationship was detected between QoL and the level of dispositional optimism as well as with coping strategies known as anxious preoccupation and helplessness-hopelessness. Higher levels of optimism predicted a higher general QoL. In turn, greater intensity of anxious preoccupied and helpless-hopeless behaviors predicted a lower QoL. Based on these results, it may be stated that localized scleroderma patients have a relatively high QoL, which is accompanied by optimism as well as a lower frequency of behaviors typical of emotion-focused coping strategies.

  14. A Dynamic Optimization Strategy for the Operation of Large Scale Seawater Reverses Osmosis System

    Directory of Open Access Journals (Sweden)

    Aipeng Jiang

    2014-01-01

    Full Text Available In this work, an efficient strategy is proposed for solving the dynamic model of a SWRO system. Since the dynamic model is formulated as a set of differential-algebraic equations, simultaneous strategies based on collocation on finite elements were used to transform the DAOP into a large-scale nonlinear programming problem named Opt2. Then, simulation of the RO process and storage tanks was carried out element by element and step by step with fixed control variables. All the obtained values of these variables were then used as the initial values for the optimal solution of the SWRO system. Finally, in order to accelerate computation while keeping sufficient accuracy in the solution of Opt2, a simple but efficient finite element refinement rule was used to reduce the scale of Opt2. The proposed strategy was applied to a large-scale SWRO system with 8 RO plants and 4 storage tanks as a case study. The computational results show that the proposed strategy is effective for the optimal operation of the large-scale SWRO system; the optimization problem can be solved within tens of iterations and several minutes even when the load and other operating parameters fluctuate.

  15. Closed-loop optimization of chromatography column sizing strategies in biopharmaceutical manufacture.

    Science.gov (United States)

    Allmendinger, Richard; Simaria, Ana S; Turner, Richard; Farid, Suzanne S

    2014-10-01

    This paper considers a real-world optimization problem involving the identification of cost-effective equipment sizing strategies for the sequence of chromatography steps employed to purify biopharmaceuticals. Tackling this problem requires solving a combinatorial optimization problem subject to multiple constraints, uncertain parameters, and time-consuming fitness evaluations. An industrially-relevant case study is used to illustrate that evolutionary algorithms can identify chromatography sizing strategies with significant improvements in performance criteria related to process cost, time and product waste over the base case. The results demonstrate also that evolutionary algorithms perform best when infeasible solutions are repaired intelligently, the population size is set appropriately, and elitism is combined with a low number of Monte Carlo trials (needed to account for uncertainty). Adopting this setup turns out to be more important for scenarios where less time is available for the purification process. Finally, a data-visualization tool is employed to illustrate how user preferences can be accounted for when it comes to selecting a sizing strategy to be implemented in a real industrial setting. This work demonstrates that closed-loop evolutionary optimization, when tuned properly and combined with a detailed manufacturing cost model, acts as a powerful decisional tool for the identification of cost-effective purification strategies. © 2013 The Authors. Journal of Chemical Technology & Biotechnology published by John Wiley & Sons Ltd on behalf of Society of Chemical Industry.

  16. Using and comparing metaheuristic algorithms for optimizing bidding strategy viewpoint of profit maximization of generators

    Science.gov (United States)

    Mousavi, Seyed Hosein; Nazemi, Ali; Hafezalkotob, Ashkan

    2015-03-01

    With the formation of competitive electricity markets around the world, optimization of bidding strategies has become one of the main topics in studies related to market design. Market design is challenged by multiple objectives that need to be satisfied. The solution of those multi-objective problems is often searched over the combined strategy space, and thus requires the simultaneous optimization of multiple parameters. The problem is formulated analytically using the Nash equilibrium concept for games composed of large numbers of players having discrete and large strategy spaces. The solution methodology is based on a characterization of the Nash equilibrium in terms of minima of a function and relies on a metaheuristic optimization approach to find these minima. This paper presents several metaheuristic algorithms to simulate how generators bid in the spot electricity market from the viewpoint of profit maximization given the other generators' strategies, namely a genetic algorithm (GA), simulated annealing (SA) and a hybrid simulated annealing genetic algorithm (HSAGA), and compares their results. As both GA and SA are generic search methods, HSAGA is also a generic search method. The model, based on actual data, is implemented for a peak hour of Tehran's wholesale spot market in 2012. The simulation results show that GA outperforms SA and HSAGA in computing time, number of function evaluations and computational stability, and that the Nash equilibria calculated by GA vary less from each other than those obtained by the other algorithms.

  17. A strategy for field shape evaluation in digital portal imaging

    International Nuclear Information System (INIS)

    Vos, P.H.; Quist, M.; Weistra, J.; Vossepoel, A.M.

    1995-01-01

    Digital portal imagers allow accurate measurement of the field shape in radiotherapy. A strategy is introduced to determine the origin and magnitude of discrepancies between the prescribed and measured field outlines. After measurement of the actual detector position relative to the beam, a conversion is made from pixels in the image matrix to mm in the plane of the isocenter, without using information from the imaged field. Using a distance transform, a quick check is performed: the outline is accepted if all outline points deviate less than a predefined minimum (usually 5 mm). Subsequent evaluation starts if somewhere in the outline this minimum is exceeded. The collimator-defined parts of the field outline are discriminated from the shielding blocks using an enclosing rectangle of the portal outline. This rectangle is found by minimization of the area as a function of rotation. If more than one solution is available, minimization of the entropy of the field outline projections determines which rectangle corresponds best to the field outline. A check of the validity of the determined collimator parts is performed with a separate linear fit through these parts. An outline part is accepted as a collimator outline part if it is longer than a predefined length. Using this procedure the position of each of the collimator jaws can be individually measured and compared with its prescription, thus allowing discrimination between symmetric and asymmetric collimator set-ups. Using the distance transform again, for each of the detected (secondary) shielding blocks the largest discrepancy or the area giving underdosage or overdosage can be computed to evaluate their shape and position. The parameter(s) and criteria that should be used to evaluate the field set-up are specified in clinical protocols. For standard shielding blocks usually only a maximum tolerated difference is specified, whereas for mantle fields also maximum allowed over- and underdose areas are specified. The
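
    The distance-transform check described above is easy to reproduce in a few lines. The Python sketch below builds synthetic prescribed and measured field masks (assuming 1 mm pixels in the isocenter plane and a 5 mm tolerance), computes the Euclidean distance to the prescribed edge with scipy, and reads off the deviation of every measured outline point; the shapes and numbers are invented for illustration.

      import numpy as np
      from scipy.ndimage import binary_erosion, distance_transform_edt

      PIXEL_MM = 1.0        # assumed pixel size in the isocenter plane
      TOLERANCE_MM = 5.0    # acceptance threshold from the clinical protocol

      # Prescribed field: a 100 x 80 mm rectangle inside a 200 x 200 mm image.
      prescribed = np.zeros((200, 200), dtype=bool)
      prescribed[50:150, 60:140] = True

      # Measured field: shifted by 3 mm with one jaw 7 mm too wide (synthetic error).
      measured = np.zeros_like(prescribed)
      measured[53:153, 60:147] = True

      # Outlines as the one-pixel-wide boundary of each mask.
      prescribed_edge = prescribed & ~binary_erosion(prescribed)
      measured_edge = measured & ~binary_erosion(measured)

      # Distance (mm) from every pixel to the nearest prescribed-edge pixel.
      dist_to_edge = distance_transform_edt(~prescribed_edge) * PIXEL_MM
      deviations = dist_to_edge[measured_edge]

      print("largest outline deviation: %.1f mm" % deviations.max())
      print("outline accepted:", bool(deviations.max() <= TOLERANCE_MM))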

  18. A Power System Optimal Dispatch Strategy Considering the Flow of Carbon Emissions and Large Consumers

    Directory of Open Access Journals (Sweden)

    Jun Yang

    2015-08-01

    Full Text Available The carbon emissions trading market and direct power purchases by large consumers are two promising directions of power system development. To trace the carbon emission flow in the power grid, the theory of carbon emission flow is improved by allocating power loss to the load side. Based on the improved carbon emission flow theory, an optimal dispatch model is proposed to optimize the cost of both large consumers and the power grid, which will benefit from the carbon emissions trading market. Moreover, to better simulate reality, the direct purchase of power by large consumers is also considered in this paper. The OPF (optimal power flow) method is applied to solve the problem. To evaluate the proposed optimal dispatch strategy, an IEEE 30-bus system is used to test the performance. The effects of the price of carbon emissions and the price of electricity from normal generators and low-carbon generators with regard to the optimal dispatch are analyzed. The simulation results indicate that the proposed strategy can significantly reduce both the operation cost of the power grid and the power utilization cost of large consumers.

  19. Pareto Optimal Solutions for Network Defense Strategy Selection Simulator in Multi-Objective Reinforcement Learning

    Directory of Open Access Journals (Sweden)

    Yang Sun

    2018-01-01

    Full Text Available Using Pareto optimization in Multi-Objective Reinforcement Learning (MORL) leads to better learning results for network defense games. This is particularly useful for network security agents, who must often balance several goals when choosing what action to take in defense of a network. If the defender knows his preferred reward distribution, the advantages of Pareto optimization can be retained by using a scalarization algorithm prior to the implementation of the MORL. In this paper, we simulate a network defense scenario by creating a multi-objective zero-sum game and using Pareto optimization and MORL to determine optimal solutions and compare those solutions to different scalarization approaches. We build a Pareto Defense Strategy Selection Simulator (PDSSS) system for assisting network administrators in decision-making, specifically in defense strategy selection, and the experiment results show that the Satisficing Trade-Off Method (STOM) scalarization approach performs better than linear scalarization or the GUESS method. The results of this paper can aid network security agents attempting to find an optimal defense policy for network security games.

  20. Hybrid collaborative optimization based on selection strategy of initial point and adaptive relaxation

    Energy Technology Data Exchange (ETDEWEB)

    Ji, Aimin; Yin, Xu; Yuan, Minghai [Hohai University, Changzhou (China)

    2015-09-15

    There are two problems in Collaborative optimization (CO): (1) local optima arising from the selection of an inappropriate initial point; (2) low efficiency and accuracy rooted in inappropriate relaxation factors. To solve these problems, we first develop the Latin hypercube design (LHD) to determine an initial point of the optimization, and then use non-linear programming by quadratic Lagrangian (NLPQL) to search for the global solution. The effectiveness of the initial-point selection strategy is verified on three benchmark functions with various dimensions and complexities. Then we propose the Adaptive relaxation collaborative optimization (ARCO) algorithm to resolve the inconsistency between the system level and the discipline level; in this method, the relaxation factors are determined according to the three separate stages of CO. The performance of the ARCO algorithm is compared with the standard collaborative algorithm and the constant-relaxation collaborative algorithm on a typical numerical example, which indicates that the ARCO algorithm is more efficient and accurate. Finally, we propose a Hybrid collaborative optimization (HCO) approach, which integrates the initial-point selection strategy with the ARCO algorithm. The results show that HCO can achieve the globally optimal solution without depending on the initial value, and it also has advantages in convergence, accuracy and robustness. Therefore, the proposed HCO approach can solve CO problems, with applications to the spindle and the speed reducer.
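
    The first of these ideas, sampling starting points with a Latin hypercube before handing the best one to a gradient-based solver, can be sketched in a few lines of Python. SLSQP stands in here for NLPQL (a commercial solver), and the multimodal test objective is invented for the example rather than one of the paper's benchmark functions.

      import numpy as np
      from scipy.stats import qmc
      from scipy.optimize import minimize

      def objective(x):                       # Rastrigin-like, many local minima
          x = np.asarray(x)
          return float(10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)))

      lower, upper = -5.12, 5.12
      sampler = qmc.LatinHypercube(d=4, seed=0)
      candidates = qmc.scale(sampler.random(n=64), [lower] * 4, [upper] * 4)

      x0 = min(candidates, key=objective)     # best Latin-hypercube sample
      result = minimize(objective, x0, method="SLSQP",
                        bounds=[(lower, upper)] * 4)
      print("LHD start:", np.round(x0, 3), "f =", round(objective(x0), 3))
      print("refined:  ", np.round(result.x, 3), "f =", round(result.fun, 3))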