WorldWideScience

Sample records for selection optimization program

  1. Optimal selection for shielding materials by fuzzy linear programming

    International Nuclear Information System (INIS)

    Kanai, Y.; Miura, N.; Sugasawa, S.

    1996-01-01

An application of fuzzy linear programming methods to the optimization of a radiation shield is presented. The main purpose of the present study is the choice of materials and the search for the mixture-component ratio as the first stage of a methodology for optimum shielding design according to the individual requirements of a nuclear reactor, reprocessing facility, shipping cask holding spent fuel, etc. The characteristic values for shield optimization may include cost, available space, weight, and shielding qualities such as activation rate and total dose rate for neutrons and gamma rays (including secondary gamma rays). This new approach reduces the huge combinatorial calculations of conventional two-valued logic approaches to a single representative shielding calculation using group-wise optimization parameters determined in advance. Using the fuzzy linear programming method, possibilities for reducing radiation effects attainable with optimal compositions of hydrated, lead- and boron-containing materials are investigated.
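A common way to cast such a problem is as a max-min membership optimization: each fuzzy goal (low dose, low weight) gets a linear membership function, and the mix maximizing the smallest membership is chosen. The sketch below is illustrative only; the two-goal trade-off, the linear dose/weight models, and the membership bounds are all invented for the example, not taken from the study.

```python
# Hypothetical fuzzy shielding sketch: choose the lead fraction x of a
# two-component mix by maximizing the minimum fuzzy membership (max-min).

def membership(value, good, bad):
    """Linear fuzzy membership: 1 at 'good', 0 at 'bad', clipped in between."""
    t = (value - bad) / (good - bad)
    return max(0.0, min(1.0, t))

def best_mix(steps=100):
    """Grid search over the mix fraction; returns (membership, fraction)."""
    best = (-1.0, None)
    for i in range(steps + 1):
        x = i / steps                      # fraction of lead in the mix
        dose = 10.0 - 6.0 * x              # assumed: lead lowers total dose
        weight = 2.0 + 8.0 * x             # assumed: lead adds weight
        mu = min(membership(dose, 2.0, 10.0),    # goal: low dose
                 membership(weight, 2.0, 10.0))  # goal: low weight
        if mu > best[0]:
            best = (mu, x)
    return best

mu, x = best_mix()
```

In a full fuzzy LP the same max-min objective is solved exactly as a linear program with an auxiliary variable; the grid search here just makes the idea visible.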

  2. Leakage characterization of top select transistor for program disturbance optimization in 3D NAND flash

    Science.gov (United States)

    Zhang, Yu; Jin, Lei; Jiang, Dandan; Zou, Xingqi; Zhao, Zhiguo; Gao, Jing; Zeng, Ming; Zhou, Wenbin; Tang, Zhaoyun; Huo, Zongliang

    2018-03-01

In order to optimize program disturbance characteristics effectively, a characterization approach that measures top select transistor (TSG) leakage from the bit-line is proposed to quantify TSG leakage under the program-inhibit condition in 3D NAND flash memory. Based on this approach, the effect of Vth modulation of a two-cell TSG on leakage is evaluated. The approach is validated by checking the dependence of the leakage, and of the corresponding program disturbance, on the upper and lower TSG Vth. An optimal Vth pattern with a high upper TSG Vth and a low lower TSG Vth is suggested to achieve low leakage current and a high boosted channel potential. It is found that the upper TSG plays the dominant role in preventing drain-induced barrier lowering (DIBL) leakage from the boosted channel to the bit-line, while the lower TSG further suppresses TSG leakage by providing a smooth potential drop from the dummy WL to the edge of the TSG, thereby suppressing trap-assisted band-to-band tunneling (BTBT) current between the dummy WL and the TSG.

  3. Drug efficiency: a new concept to guide lead optimization programs towards the selection of better clinical candidates.

    Science.gov (United States)

    Braggio, Simone; Montanari, Dino; Rossi, Tino; Ratti, Emiliangelo

    2010-07-01

As a result of their wide acceptance and conceptual simplicity, drug-like concepts are having a major influence on the drug discovery process, particularly in the selection of the 'optimal' space of absorption, distribution, metabolism, excretion, toxicity and physicochemical parameters. While they have undisputable value when assessing the potential of lead series or evaluating the inherent risk of a portfolio of drug candidates, they prove much less useful when weighing up compounds to select the best potential clinical candidate. We introduce the concept of drug efficiency as a new tool both to guide drug discovery program teams during the lead optimization phase and to better assess the developability potential of a drug candidate.

  4. Optimal Strategy for Integrated Dynamic Inventory Control and Supplier Selection in Unknown Environment via Stochastic Dynamic Programming

    International Nuclear Information System (INIS)

    Sutrisno; Widowati; Solikhin

    2016-01-01

In this paper, we propose a mathematical model in stochastic dynamic optimization form to determine the optimal strategy for an integrated single-product inventory control and supplier selection problem in which the demand and purchasing cost parameters are random. For each time period, the proposed model selects the optimal supplier and calculates the optimal product volume to purchase from that supplier so that the inventory level stays as close as possible to the reference point with minimal cost. We use stochastic dynamic programming to solve this problem and present several numerical experiments to evaluate the model. The results show that, for each time period, the proposed model selected the optimal supplier and the inventory level tracked the reference point well.
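The period-by-period supplier and order-quantity decision can be illustrated with a small backward-recursive stochastic dynamic program. Every number below (prices, demand scenarios, horizon, the quadratic tracking penalty) is assumed for illustration; this is not the paper's model.

```python
# Toy stochastic DP: each period, pick a supplier and order quantity to
# minimize expected purchase cost plus a penalty for deviating from a
# reference inventory level. All parameters are invented for the example.

import functools

SUPPLIERS = {"A": 3.0, "B": 3.5}      # assumed unit prices per supplier
DEMANDS = [(2, 0.5), (4, 0.5)]        # (demand, probability) scenarios
REF = 5                                # reference inventory level to track
T = 3                                  # planning horizon (periods)

@functools.lru_cache(maxsize=None)
def value(t, inv):
    """Return (expected cost-to-go, best (supplier, qty)) from state (t, inv)."""
    if t == T:
        return 0.0, None
    best = (float("inf"), None)
    for name, price in SUPPLIERS.items():
        for q in range(0, 9):          # candidate order quantities
            exp_cost = price * q
            for d, p in DEMANDS:       # expectation over demand scenarios
                nxt = max(0, inv + q - d)
                exp_cost += p * ((nxt - REF) ** 2 + value(t + 1, nxt)[0])
            if exp_cost < best[0]:
                best = (exp_cost, (name, q))
    return best

cost, (supplier, qty) = value(0, 0)
```

The recursion evaluates each (supplier, quantity) action against the demand distribution and memoizes the cost-to-go, which is the standard backward-induction structure of stochastic dynamic programming.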

  5. Selection on Optimal Haploid Value Increases Genetic Gain and Preserves More Genetic Diversity Relative to Genomic Selection.

    Science.gov (United States)

    Daetwyler, Hans D; Hayden, Matthew J; Spangenberg, German C; Hayes, Ben J

    2015-08-01

    Doubled haploids are routinely created and phenotypically selected in plant breeding programs to accelerate the breeding cycle. Genomic selection, which makes use of both phenotypes and genotypes, has been shown to further improve genetic gain through prediction of performance before or without phenotypic characterization of novel germplasm. Additional opportunities exist to combine genomic prediction methods with the creation of doubled haploids. Here we propose an extension to genomic selection, optimal haploid value (OHV) selection, which predicts the best doubled haploid that can be produced from a segregating plant. This method focuses selection on the haplotype and optimizes the breeding program toward its end goal of generating an elite fixed line. We rigorously tested OHV selection breeding programs, using computer simulation, and show that it results in up to 0.6 standard deviations more genetic gain than genomic selection. At the same time, OHV selection preserved a substantially greater amount of genetic diversity in the population than genomic selection, which is important to achieve long-term genetic gain in breeding populations. Copyright © 2015 by the Genetics Society of America.
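The core OHV computation is simple to state: because a doubled haploid fixes one of the plant's two haplotypes in each segment, the best achievable line takes the better haplotype segment by segment. The segment effects below are hypothetical numbers chosen only to contrast OHV with a conventional breeding-value average.

```python
# Toy OHV illustration with made-up per-segment haplotype effects.

hap1 = [0.4, -0.1, 0.3]   # haplotype 1: contribution of each segment
hap2 = [0.2, 0.5, -0.2]   # haplotype 2: contribution of each segment

# A conventional genomic breeding value averages the two haplotypes...
gebv = sum((a + b) / 2 for a, b in zip(hap1, hap2))

# ...whereas OHV assumes the best segment from either haplotype can be fixed
# in a doubled haploid, so it sums the per-segment maximum.
ohv = sum(max(a, b) for a, b in zip(hap1, hap2))
```

By construction OHV can never be below the averaged breeding value, which is why selecting on it focuses the program on the best fixable haplotype combination rather than the current heterozygous mean.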

  6. Optimization of temperature-programmed GC separations. II. Off-line simplex optimization and column selection

    NARCIS (Netherlands)

    Snijders, H.M.J.; Janssen, J.G.M.; Cramers, C.A.M.G.; Sandra, P; Bertsch, W.; Sandra, P.; Devos, G.

    1996-01-01

    In this work a method is described which allows off-line optimization of temperature programmed GC separations. Recently, we described a new numerical method to predict off-line retention times and peak widths of a mixture containing components with known identities in capillary GC. In the present

  7. Diet selection of African elephant over time shows changing optimization currency

    NARCIS (Netherlands)

    Pretorius, Y.; Stigter, J.D.; Boer, de W.F.; Wieren, van S.E.; Jong, de C.B.; Knegt, de H.J.; Grant, R.C.; Heitkonig, I.M.A.; Knox, N.; Kohi, E.; Mwakiwa, E.; Peel, M.J.S.; Skidmore, A.K.; Slotow, R.; Waal, van der C.; Langevelde, van F.; Prins, H.H.T.

    2012-01-01

    Multiple factors determine diet selection of herbivores. However, in many diet studies selection of single nutrients is studied or optimization models are developed using only one currency. In this paper, we use linear programming to explain diet selection by African elephant based on plant

  8. Pipe degradation investigations for optimization of flow-accelerated corrosion inspection location selection

    International Nuclear Information System (INIS)

    Chandra, S.; Habicht, P.; Chexal, B.; Mahini, R.; McBrine, W.; Esselman, T.; Horowitz, J.

    1995-01-01

A large amount of piping in a typical nuclear power plant is susceptible to Flow-Accelerated Corrosion (FAC) wall thinning to varying degrees. A typical FAC monitoring program includes wall thickness measurement of a select number of components in order to judge the structural integrity of entire systems. In order to appropriately allocate resources and maintain an adequate FAC program, it is necessary to optimize the selection of components for inspection by focusing on those components which provide the best indication of system susceptibility to FAC. A better understanding of system FAC predictability and the types of FAC damage encountered can provide some of the insight needed to better focus and optimize the inspection plan for an upcoming refueling outage. Laboratory examination of FAC-damaged components removed from service at Northeast Utilities' (NU) nuclear power plants provides a better understanding of the damage mechanisms involved and contributing causes. Selected results of this ongoing study are presented with specific conclusions which will help NU to better focus inspections and thus optimize the ongoing FAC inspection program

  9. Optimal set of selected uranium enrichments that minimizes blending consequences

    International Nuclear Information System (INIS)

    Nachlas, J.A.; Kurstedt, H.A. Jr.; Lobber, J.S. Jr.

    1977-01-01

    Identities, quantities, and costs associated with producing a set of selected enrichments and blending them to provide fuel for existing reactors are investigated using an optimization model constructed with appropriate constraints. Selected enrichments are required for either nuclear reactor fuel standardization or potential uranium enrichment alternatives such as the gas centrifuge. Using a mixed-integer linear program, the model minimizes present worth costs for a 39-product-enrichment reference case. For four ingredients, the marginal blending cost is only 0.18% of the total direct production cost. Natural uranium is not an optimal blending ingredient. Optimal values reappear in most sets of ingredient enrichments
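The blending step underlying such a model is a U-235 mass balance: the fraction of the higher-enrichment ingredient follows directly from the target enrichment. The enrichment values below are illustrative, not the study's 39-product reference case.

```python
# Two-ingredient blending by U-235 mass balance:
#   x * e_high + (1 - x) * e_low = e_target  =>  solve for x.

def blend_fraction(e_low, e_high, e_target):
    """Mass fraction of the high-enrichment ingredient needed for the target."""
    if not e_low <= e_target <= e_high:
        raise ValueError("target enrichment outside ingredient range")
    return (e_target - e_low) / (e_high - e_low)

# e.g. blend 5.0 w/o and 2.0 w/o material to obtain a 3.2 w/o product
x = blend_fraction(2.0, 5.0, 3.2)
```

The mixed-integer program in the abstract layers the discrete choice of which ingredient enrichments to produce on top of many such balances, minimizing present-worth cost.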

  10. Optimal Sensor Selection for Health Monitoring Systems

    Science.gov (United States)

    Santi, L. Michael; Sowers, T. Shane; Aguilar, Robert B.

    2005-01-01

    Sensor data are the basis for performance and health assessment of most complex systems. Careful selection and implementation of sensors is critical to enable high fidelity system health assessment. A model-based procedure that systematically selects an optimal sensor suite for overall health assessment of a designated host system is described. This procedure, termed the Systematic Sensor Selection Strategy (S4), was developed at NASA John H. Glenn Research Center in order to enhance design phase planning and preparations for in-space propulsion health management systems (HMS). Information and capabilities required to utilize the S4 approach in support of design phase development of robust health diagnostics are outlined. A merit metric that quantifies diagnostic performance and overall risk reduction potential of individual sensor suites is introduced. The conceptual foundation for this merit metric is presented and the algorithmic organization of the S4 optimization process is described. Representative results from S4 analyses of a boost stage rocket engine previously under development as part of NASA's Next Generation Launch Technology (NGLT) program are presented.

  11. Optimization methods for activities selection problems

    Science.gov (United States)

    Mahad, Nor Faradilah; Alias, Suriana; Yaakop, Siti Zulaika; Arshad, Norul Amanina Mohd; Mazni, Elis Sofia

    2017-08-01

Co-curricular activities must be joined by every student in Malaysia, and these activities bring many benefits to the students. By joining these activities, students can learn time management and develop many useful skills. This project focuses on the selection of co-curricular activities in a secondary school using two optimization methods: the Analytic Hierarchy Process (AHP) and Zero-One Goal Programming (ZOGP). A secondary school in Negeri Sembilan, Malaysia was chosen as a case study. A set of questionnaires was distributed randomly to calculate the weight for each activity based on three chosen criteria: soft skills, interesting activities and performance. The weights were calculated using AHP, and the results showed that the most important criterion is soft skills. The ZOGP model was then analyzed using LINGO Software version 15.0, with two priorities considered. The first priority, minimizing the budget for the activities, is achieved since the total budget can be reduced by RM233.00; the total budget to implement the selected activities is therefore RM11,195.00. The second priority, selecting the co-curricular activities, is also achieved: 9 out of 15 activities were selected. Thus, it can be concluded that the AHP and ZOGP approach can be used as an optimization method for activity selection problems.
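The AHP weighting step can be sketched with the common geometric-mean approximation of the priority vector. The pairwise comparison matrix below is an assumption for illustration; the paper's actual judgments are not given in the abstract.

```python
# AHP criterion weights via the geometric-mean (row) approximation.
# pairwise[i][j] = how much more important criterion i is than criterion j.

import math

criteria = ["soft skills", "interesting activities", "performance"]
pairwise = [
    [1.0, 3.0, 5.0],   # assumed judgments, not from the paper
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]

geo = [math.prod(row) ** (1 / len(row)) for row in pairwise]
total = sum(geo)
weights = [g / total for g in geo]   # normalized priority vector
```

With a consistent matrix the geometric-mean vector coincides with the principal eigenvector that AHP formally prescribes; here it ranks soft skills first, matching the abstract's finding.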

  12. Programmed evolution for optimization of orthogonal metabolic output in bacteria.

    Directory of Open Access Journals (Sweden)

    Todd T Eckdahl

Full Text Available Current use of microbes for metabolic engineering suffers from loss of metabolic output due to natural selection. Rather than combat the evolution of bacterial populations, we chose to embrace what makes biological engineering unique among engineering fields - evolving materials. We harnessed bacteria to compute solutions to the biological problem of metabolic pathway optimization. Our approach is called Programmed Evolution to capture two concepts. First, a population of cells is programmed with DNA code to enable it to compute solutions to a chosen optimization problem. As analog computers, bacteria process known and unknown inputs and direct the output of their biochemical hardware. Second, the system employs the evolution of bacteria toward an optimal metabolic solution by imposing fitness defined by metabolic output. The current study is a proof-of-concept for Programmed Evolution applied to the optimization of a metabolic pathway for the conversion of caffeine to theophylline in E. coli. Introduced genotype variations included strength of the promoter and ribosome binding site, plasmid copy number, and chaperone proteins. We constructed 24 strains using all combinations of the genetic variables. We used a theophylline riboswitch and a tetracycline resistance gene to link theophylline production to fitness. After subjecting the mixed population to selection, we measured a change in the distribution of genotypes in the population and an increased conversion of caffeine to theophylline among the most fit strains, demonstrating Programmed Evolution. Programmed Evolution inverts the standard paradigm in metabolic engineering by harnessing evolution instead of fighting it. Our modular system enables researchers to program bacteria and use evolution to determine the combination of genetic control elements that optimizes catabolic or anabolic output and to maintain it in a population of cells. Programmed Evolution could be used for applications in

  13. Programmed Evolution for Optimization of Orthogonal Metabolic Output in Bacteria

    Science.gov (United States)

    Eckdahl, Todd T.; Campbell, A. Malcolm; Heyer, Laurie J.; Poet, Jeffrey L.; Blauch, David N.; Snyder, Nicole L.; Atchley, Dustin T.; Baker, Erich J.; Brown, Micah; Brunner, Elizabeth C.; Callen, Sean A.; Campbell, Jesse S.; Carr, Caleb J.; Carr, David R.; Chadinha, Spencer A.; Chester, Grace I.; Chester, Josh; Clarkson, Ben R.; Cochran, Kelly E.; Doherty, Shannon E.; Doyle, Catherine; Dwyer, Sarah; Edlin, Linnea M.; Evans, Rebecca A.; Fluharty, Taylor; Frederick, Janna; Galeota-Sprung, Jonah; Gammon, Betsy L.; Grieshaber, Brandon; Gronniger, Jessica; Gutteridge, Katelyn; Henningsen, Joel; Isom, Bradley; Itell, Hannah L.; Keffeler, Erica C.; Lantz, Andrew J.; Lim, Jonathan N.; McGuire, Erin P.; Moore, Alexander K.; Morton, Jerrad; Nakano, Meredith; Pearson, Sara A.; Perkins, Virginia; Parrish, Phoebe; Pierson, Claire E.; Polpityaarachchige, Sachith; Quaney, Michael J.; Slattery, Abagael; Smith, Kathryn E.; Spell, Jackson; Spencer, Morgan; Taye, Telavive; Trueblood, Kamay; Vrana, Caroline J.; Whitesides, E. Tucker

    2015-01-01

    Current use of microbes for metabolic engineering suffers from loss of metabolic output due to natural selection. Rather than combat the evolution of bacterial populations, we chose to embrace what makes biological engineering unique among engineering fields – evolving materials. We harnessed bacteria to compute solutions to the biological problem of metabolic pathway optimization. Our approach is called Programmed Evolution to capture two concepts. First, a population of cells is programmed with DNA code to enable it to compute solutions to a chosen optimization problem. As analog computers, bacteria process known and unknown inputs and direct the output of their biochemical hardware. Second, the system employs the evolution of bacteria toward an optimal metabolic solution by imposing fitness defined by metabolic output. The current study is a proof-of-concept for Programmed Evolution applied to the optimization of a metabolic pathway for the conversion of caffeine to theophylline in E. coli. Introduced genotype variations included strength of the promoter and ribosome binding site, plasmid copy number, and chaperone proteins. We constructed 24 strains using all combinations of the genetic variables. We used a theophylline riboswitch and a tetracycline resistance gene to link theophylline production to fitness. After subjecting the mixed population to selection, we measured a change in the distribution of genotypes in the population and an increased conversion of caffeine to theophylline among the most fit strains, demonstrating Programmed Evolution. Programmed Evolution inverts the standard paradigm in metabolic engineering by harnessing evolution instead of fighting it. Our modular system enables researchers to program bacteria and use evolution to determine the combination of genetic control elements that optimizes catabolic or anabolic output and to maintain it in a population of cells. Programmed Evolution could be used for applications in energy

  14. Investigating the Optimal Management Strategy for a Healthcare Facility Maintenance Program

    National Research Council Canada - National Science Library

    Gaillard, Daria

    2004-01-01

    ...: strategic partnering with an equipment management firm. The objective of this study is to create a decision-model for selecting the optimal management strategy for a healthcare organization's facility maintenance program...

  15. OPTIMIZING ANTIMICROBIAL PHARMACODYNAMICS: A GUIDE FOR YOUR STEWARDSHIP PROGRAM

    Directory of Open Access Journals (Sweden)

    Joseph L. Kuti, PharmD

    2016-09-01

Full Text Available Pharmacodynamic concepts should be applied to optimize antibiotic dosing regimens, particularly in the face of some multidrug-resistant bacterial infections. Although the pharmacodynamics of most antibiotic classes used in the hospital setting are well described, guidance on how to select regimens and implement them in an antimicrobial stewardship program at one's institution is more limited. The role of the antibiotic MIC is paramount in understanding which regimens might benefit from implementation as a protocol or use in individual patients. This review article outlines the pharmacodynamics of aminoglycosides, beta-lactams, fluoroquinolones, tigecycline, vancomycin, and polymyxins with the goal of providing a basis for selecting an optimized antibiotic regimen in your hospital setting.

  16. Industrial cogeneration optimization program. Final report, September 1979

    Energy Technology Data Exchange (ETDEWEB)

    Davis, Jerry; McWhinney, Jr., Robert T.

    1980-01-01

    This study program is part of the DOE Integrated Industry Cogeneration Program to optimize, evaluate, and demonstrate cogeneration systems, with direct participation of the industries most affected. One objective is to characterize five major energy-intensive industries with respect to their energy-use profiles. The industries are: petroleum refining and related industries, textile mill products, paper and allied products, chemicals and allied products, and food and kindred products. Another objective is to select optimum cogeneration systems for site-specific reference case plants in terms of maximum energy savings subject to given return on investment hurdle rates. Analyses were made that define the range of optimal cogeneration systems for each reference-case plant considering technology applicability, economic factors, and energy savings by type of fuel. This study also provides guidance to other parts of the program through information developed with regard to component development requirements, institutional and regulatory barriers, as well as fuel use and environmental considerations. (MCW)

  17. Fire-tube immersion heater optimization program and field heater audit program

    Energy Technology Data Exchange (ETDEWEB)

    Croteau, P. [Petro-Canada, Calgary, AB (Canada)

    2007-07-01

This presentation provided an overview of the top five priorities for emission reduction and eco-efficiency identified by the Petroleum Technology Alliance of Canada (PTAC): venting of methane emissions; fuel consumption in reciprocating engines; fuel consumption in fired heaters; flaring and incineration; and fugitive emissions. Energy consumption associated with immersion heaters was described as a common concern for many upstream operating companies. PTAC fire-tube heater and line heater studies were presented. Combustion efficiency was discussed in terms of excess air, fire-tube selection, heat flux rate, and reliability guidelines. Other topics included heat transfer and fire-tube design; burner selection; burner duty cycle; heater tune-up inspection procedure; and insulation. Two other programs were also discussed, notably a Petro-Canada fire-tube immersion heater optimization program and the field audit program run by Natural Resources Canada. It was concluded that improved efficiency involves training; managing excess air in combustion; managing the burner duty cycle; striving for 82 per cent combustion efficiency; and providing adequate insulation to reduce energy demand. tabs., figs.

  18. Optimal and Suboptimal Finger Selection Algorithms for MMSE Rake Receivers in Impulse Radio Ultra-Wideband Systems

    Directory of Open Access Journals (Sweden)

    Chiang Mung

    2006-01-01

Full Text Available The problem of choosing the optimal multipath components to be employed at a minimum mean square error (MMSE) selective Rake receiver is considered for an impulse radio ultra-wideband system. First, the optimal finger selection problem is formulated as an integer programming problem with a nonconvex objective function. Then, the objective function is approximated by a convex function and the integer programming problem is solved by means of constraint relaxation techniques. The proposed algorithms are suboptimal due to the approximate objective function and the constraint relaxation steps. However, they perform better than the conventional finger selection algorithm, which is suboptimal since it ignores the correlation between multipath components, and they can get quite close to the optimal scheme, which cannot be implemented in practice due to its complexity. In addition to the convex relaxation techniques, a genetic algorithm (GA) based approach is proposed, which does not need any approximations or integer relaxations. This iterative algorithm is based on the direct evaluation of the objective function and can achieve near-optimal performance with a reasonable number of iterations. Simulation results are presented to compare the performance of the proposed finger selection algorithms with that of the conventional and the optimal schemes.
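The gap between the conventional and the optimal selection can be shown on a toy instance. The path energies and the correlation penalty below are invented, and the objective is a simplified stand-in for the true MMSE criterion: it only captures that selecting two strongly correlated paths is wasteful.

```python
# Simplified finger selection: "conventional" picks the K strongest paths and
# ignores correlation; exhaustive search over subsets accounts for it.

from itertools import combinations

energies = [1.0, 0.9, 0.8, 0.3]        # assumed multipath component energies
corr_penalty = {(0, 1): 0.7}           # assumed: paths 0 and 1 overlap heavily

def score(subset):
    """Toy objective: total energy minus penalties for correlated pairs."""
    s = sum(energies[i] for i in subset)
    for pair, pen in corr_penalty.items():
        if set(pair) <= set(subset):
            s -= pen
    return s

K = 2                                   # number of Rake fingers to assign
conventional = tuple(sorted(range(len(energies)),
                            key=lambda i: -energies[i])[:K])
optimal = max(combinations(range(len(energies)), K), key=score)
```

Exhaustive search is exponential in the number of paths, which is exactly why the paper resorts to convex relaxation and a genetic algorithm for realistic sizes.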

  19. Optimal processing pathway selection for microalgae-based biorefinery under uncertainty

    DEFF Research Database (Denmark)

    Rizwan, Muhammad; Zaman, Muhammad; Lee, Jay H.

    2015-01-01

We propose a systematic framework for the selection of optimal processing pathways for a microalgae-based biorefinery under techno-economic uncertainty. The proposed framework promotes robust decision making by taking into account the uncertainties that arise due to inconsistencies among...... and shortage in the available technical information. A stochastic mixed integer nonlinear programming (sMINLP) problem is formulated for determining the optimal biorefinery configurations based on a superstructure model where parameter uncertainties are modeled and included as sampled scenarios. The solution...... the accounting of uncertainty are compared with respect to different objectives. (C) 2015 Elsevier Ltd. All rights reserved....

  20. Efficient dynamic optimization of logic programs

    Science.gov (United States)

    Laird, Phil

    1992-01-01

    A summary is given of the dynamic optimization approach to speed up learning for logic programs. The problem is to restructure a recursive program into an equivalent program whose expected performance is optimal for an unknown but fixed population of problem instances. We define the term 'optimal' relative to the source of input instances and sketch an algorithm that can come within a logarithmic factor of optimal with high probability. Finally, we show that finding high-utility unfolding operations (such as EBG) can be reduced to clause reordering.

  1. Dermatology Residency Selection Criteria with an Emphasis on Program Characteristics: A National Program Director Survey

    Directory of Open Access Journals (Sweden)

    Farzam Gorouhi

    2014-01-01

    Full Text Available Background. Dermatology residency programs are relatively diverse in their resident selection process. The authors investigated the importance of 25 dermatology residency selection criteria focusing on differences in program directors’ (PDs’ perception based on specific program demographics. Methods. This cross-sectional nationwide observational survey utilized a 41-item questionnaire that was developed by literature search, brainstorming sessions, and online expert reviews. The data were analyzed utilizing the reliability test, two-step clustering, and K-means methods as well as other methods. The main purpose of this study was to investigate the differences in PDs’ perception regarding the importance of the selection criteria based on program demographics. Results. Ninety-five out of 114 PDs (83.3% responded to the survey. The top five criteria for dermatology residency selection were interview, letters of recommendation, United States Medical Licensing Examination Step I scores, medical school transcripts, and clinical rotations. The following criteria were preferentially ranked based on different program characteristics: “advanced degrees,” “interest in academics,” “reputation of undergraduate and medical school,” “prior unsuccessful attempts to match,” and “number of publications.” Conclusions. Our survey provides up-to-date factual data on dermatology PDs’ perception in this regard. Dermatology residency programs may find the reported data useful in further optimizing their residency selection process.

  2. Feature Selection via Chaotic Antlion Optimization.

    Directory of Open Access Journals (Sweden)

    Hossam M Zawbaa

Full Text Available Selecting a subset of relevant properties from a large set of features that describe a dataset is a challenging machine learning task. In biology, for instance, advances in the available technologies enable the generation of a very large number of biomarkers that describe the data. Choosing the more informative markers along with performing a high-accuracy classification over the data can be a daunting task, particularly if the data are high dimensional. An often adopted approach is to formulate the feature selection problem as a biobjective optimization problem, with the aim of maximizing the performance of the data analysis model (the quality of fitting to the training data) while minimizing the number of features used. We propose an optimization approach for the feature selection problem that considers a "chaotic" version of the antlion optimizer method, a nature-inspired algorithm that mimics the hunting mechanism of antlions in nature. The balance between exploration of the search space and exploitation of the best solutions is a challenge in multi-objective optimization. The exploration/exploitation rate is controlled by the parameter I, which limits the random walk range of the ants/prey. This variable is increased iteratively in a quasi-linear manner to decrease the exploration rate as the optimization progresses. The quasi-linear increase in the variable I may lead to premature convergence in some cases and trapping in local minima in others. The chaotic system proposed here attempts to improve the tradeoff between exploration and exploitation. The methodology is evaluated using different chaotic maps on a number of feature selection datasets. To ensure generality, we used ten biological datasets, but we also used other types of data from various sources. The results are compared with the particle swarm optimizer and with genetic algorithm variants for feature selection using a set of quality metrics.
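The "chaotic" modification can be sketched by perturbing the quasi-linear schedule for I with a logistic map, one of the standard chaotic maps used in such variants. The schedule shape, bounds, and jitter range below are illustrative assumptions, not the paper's settings.

```python
# Sketch: modulate the exploration-limiting parameter I with a logistic
# chaotic map instead of using a purely quasi-linear schedule.

def quasi_linear_I(t, t_max, i_min=1.0, i_max=1e4):
    """Baseline schedule: I grows steadily, shrinking the random-walk range."""
    return i_min + (i_max - i_min) * t / t_max

def chaotic_I(t, t_max, x0=0.7):
    """Chaotic variant: a logistic map (r = 4) jitters the baseline schedule."""
    x = x0
    for _ in range(t):              # iterate the map once per optimizer step
        x = 4.0 * x * (1.0 - x)     # fully chaotic regime for r = 4
    return quasi_linear_I(t, t_max) * (0.5 + x)   # jitter factor in [0.5, 1.5]

vals = [chaotic_I(t, 100) for t in range(101)]
```

The jitter occasionally re-enlarges the random-walk range late in the run, which is the mechanism by which the chaotic variant resists premature convergence.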

  3. Optimizing the allocation of resources for genomic selection in one breeding cycle.

    Science.gov (United States)

    Riedelsheimer, Christian; Melchinger, Albrecht E

    2013-11-01

We developed a universally applicable planning tool for optimizing the allocation of resources for one cycle of genomic selection in a biparental population. The framework combines selection theory with constrained numerical optimization and considers genotype × environment interactions. Genomic selection (GS) is increasingly implemented in plant breeding programs to increase selection gain, but little is known about how to optimally allocate resources under a given budget. We investigated this problem with model calculations combining quantitative genetic selection theory with constrained numerical optimization. We assumed one selection cycle in which both the training and prediction sets comprised doubled haploid (DH) lines from the same biparental population. Grain yield for testcrosses of maize DH lines was used as a model trait, but all parameters can be adjusted in a freely available software implementation. An extension of the expected selection accuracy given by Daetwyler et al. (2008) was developed to correctly balance the number of environments for phenotyping the training set against its population size in the presence of genotype × environment interactions. Under a small budget, genotyping costs mainly determine whether GS is superior to phenotypic selection. With increasing budget, flexibility in resource allocation increases greatly, but selection gain levels off quickly, requiring the number of populations to be balanced against the budget spent on each population. The use of an index combining phenotypic and GS-predicted values in the training set was especially beneficial under limited resources and large genotype × environment interactions. Once a sufficiently high selection accuracy is achieved in the prediction set, further selection gain can be achieved most efficiently by massively expanding its size. Thus, with increasing budget, reducing the costs for producing a DH line becomes increasingly crucial for successfully exploiting the

  4. Research on numerical method for multiple pollution source discharge and optimal reduction program

    Science.gov (United States)

    Li, Mingchang; Dai, Mingxin; Zhou, Bin; Zou, Bin

    2018-03-01

In this paper, an optimal reduction program is determined using a nonlinear optimization algorithm, the genetic algorithm. The four main rivers in Jiangsu Province, China are selected for reducing environmental pollution in the nearshore district. Dissolved inorganic nitrogen (DIN) is studied as the only pollutant. The environmental status and standards in the nearshore district are used to set the reductions in discharge from the multiple river pollution sources. The results of the reduction program form a basis for marine environmental management.

  5. Optimized remedial groundwater extraction using linear programming

    International Nuclear Information System (INIS)

    Quinn, J.J.

    1995-01-01

    Groundwater extraction systems are typically installed to remediate contaminant plumes or prevent further spread of contamination. These systems are expensive to install and maintain. A traditional approach to designing such a wellfield uses a series of trial-and-error simulations to test the effects of various well locations and pump rates. However, the optimal locations and pump rates of extraction wells are difficult to determine when objectives related to the site hydrogeology and potential pumping scheme are considered. This paper describes a case study of an application of linear programming theory to determine optimal well placement and pump rates. The objectives of the pumping scheme were to contain contaminant migration and reduce contaminant concentrations while minimizing the total amount of water pumped and treated. Past site activities at the area under study included disposal of contaminants in pits. Several groundwater plumes have been identified, and others may be present. The area of concern is bordered on three sides by a wetland, which receives a portion of its input budget as groundwater discharge from the pits. Optimization of the containment pumping scheme was intended to meet three goals: (1) prevent discharge of contaminated groundwater to the wetland, (2) minimize the total water pumped and treated (cost benefit), and (3) avoid dewatering of the wetland (cost and ecological benefits). Possible well locations were placed at known source areas. To constrain the problem, the optimization program was instructed to prevent any flow toward the wetland along a user-specified border. In this manner, the optimization routine selects well locations and pump rates so that a groundwater divide is produced along this boundary.
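    The containment formulation can be illustrated with a toy linear program: minimize total pumping subject to head-response constraints at control points along the wetland boundary. The response matrix and drawdown targets below are invented, and the tiny two-well LP is solved by brute-force vertex enumeration rather than a production LP solver.

```python
from itertools import combinations

# Hypothetical superposition model: R[i][j] is the head response at
# wetland control point i per unit pump rate of candidate well j;
# G[i] is the drawdown needed there to reverse flow toward the wetland.
R = [[0.9, 0.3],
     [0.4, 0.8],
     [0.2, 0.5]]
G = [1.0, 1.0, 0.6]

def feasible(q, tol=1e-9):
    return all(qi >= -tol for qi in q) and all(
        sum(rij * qj for rij, qj in zip(row, q)) >= gi - tol
        for row, gi in zip(R, G))

# Vertices of the feasible region lie on intersections of pairs of
# constraint lines a*q1 + b*q2 = c (including the axes q1 = 0, q2 = 0).
lines = [(row[0], row[1], gi) for row, gi in zip(R, G)]
lines += [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
best = None
for (a1, b1, c1), (a2, b2, c2) in combinations(lines, 2):
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        continue
    q = ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
    if feasible(q) and (best is None or sum(q) < sum(best)):
        best = q

print([round(qi, 3) for qi in best], round(sum(best), 3))  # → [0.821, 0.872] 1.692
```

    The binding constraints at the optimum identify which control points govern the design, mirroring how the case-study LP trades total pumping against the no-flow condition along the wetland border.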

  6. Conjugate gradient optimization programs for shuttle reentry

    Science.gov (United States)

    Powers, W. F.; Jacobson, R. A.; Leonard, D. A.

    1972-01-01

    Two computer programs for shuttle reentry trajectory optimization are listed and described. Both programs use the conjugate gradient method as the optimization procedure. The Phase 1 Program is developed in Cartesian coordinates for a rotating spherical earth, and crossrange, downrange, maximum deceleration, total heating, and terminal speed, altitude, and flight path angle are included in the performance index. The programs make extensive use of subroutines so that they may be easily adapted to other atmospheric trajectory optimization problems.

  7. Design optimization and analysis of selected thermal devices using self-adaptive Jaya algorithm

    International Nuclear Information System (INIS)

    Rao, R.V.; More, K.C.

    2017-01-01

    Highlights: • Self-adaptive Jaya algorithm is proposed for optimal design of thermal devices. • Optimization of heat pipe, cooling tower, heat sink and thermo-acoustic prime mover is presented. • Results of the proposed algorithm are better than those of other optimization techniques. • The proposed algorithm may be conveniently used for the optimization of other devices. - Abstract: The present study explores the use of an improved Jaya algorithm, called the self-adaptive Jaya algorithm, for optimal design of selected thermal devices, viz. heat pipe, cooling tower, honeycomb heat sink and thermo-acoustic prime mover. Four different optimization case studies of the selected thermal devices are presented. Researchers had attempted the same design problems in the past using the niched pareto genetic algorithm (NPGA), response surface method (RSM), leap-frog optimization program with constraints (LFOPC) algorithm, teaching-learning based optimization (TLBO) algorithm, grenade explosion method (GEM) and multi-objective genetic algorithm (MOGA). The results achieved by using the self-adaptive Jaya algorithm are compared with those achieved by using the NPGA, RSM, LFOPC, TLBO, GEM and MOGA algorithms. The self-adaptive Jaya algorithm proved superior to the other optimization methods in terms of results, computational effort and function evaluations.

  8. Comparison of Optimal Portfolios Selected by Multicriterial Model Using Absolute and Relative Criteria Values

    Directory of Open Access Journals (Sweden)

    Branka Marasović

    2009-03-01

    Full Text Available In this paper we select an optimal portfolio on the Croatian capital market by using multicriterial programming. In accordance with modern portfolio theory, maximisation of returns at minimal risk should be the investment goal of any successful investor. However, contrary to the expectations of modern portfolio theory, tests carried out on a number of financial markets reveal the existence of other indicators important in portfolio selection. Considering the importance of variables other than return and risk, selection of the optimal portfolio becomes a multicriterial problem which should be solved by using the appropriate techniques. In order to select an optimal portfolio, absolute values of criteria, like return, risk, price to earning value ratio (P/E), price to book value ratio (P/B) and price to sale value ratio (P/S), are included in our multicriterial model. However, a problem might occur because the mean values of some criteria are significantly different for different sectors, and financial managers emphasize that comparison of the same criteria across different sectors could lead to wrong conclusions. In the second part of the paper, relative values of the previously stated criteria (in relation to the mean value of the sector) are included in the model for selecting the optimal portfolio. Furthermore, the paper shows that if relative values of criteria are included in the multicriterial model for selecting the optimal portfolio, the return in the subsequent period is considerably higher than if absolute values of the same criteria were used.

  9. Optimal Quadratic Programming Algorithms

    CERN Document Server

    Dostal, Zdenek

    2009-01-01

    Quadratic programming (QP) is one technique that allows for the optimization of a quadratic function in several variables in the presence of linear constraints. This title presents various algorithms for solving large QP problems. It is suitable as an introductory text on quadratic programming for graduate students and researchers.

  10. Optimal selection of Orbital Replacement Unit on-orbit spares - A Space Station system availability model

    Science.gov (United States)

    Schwaab, Douglas G.

    1991-01-01

    A mathematical programming model is presented to optimize the selection of Orbital Replacement Unit on-orbit spares for the Space Station. The model maximizes system availability under the constraints of logistics resupply-cargo weight and volume allocations.

  11. Optimal Diet Planning for Eczema Patient Using Integer Programming

    Science.gov (United States)

    Zhen Sheng, Low; Sufahani, Suliadi

    2018-04-01

    Human diet planning is conducted by choosing appropriate food items that fulfill the nutritional requirements of the diet formulation. This paper discusses the application of integer programming to build a mathematical model of diet planning for eczema patients. The model developed is used to solve the diet problem of eczema patients from the young age group. Integer programming is a systematic approach to selecting suitable food items that seeks to minimize cost subject to meeting desired nutrient quantities, avoiding food allergens, and including certain foods that bring relief to eczema conditions. This paper illustrates that the integer programming approach is able to produce an optimal and feasible solution to the diet problem of eczema patients.
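    A minimal sketch of such a diet model, with invented foods, nutrient values, allergen flags, and requirements. For clarity it solves the small integer program by exhaustive enumeration rather than branch-and-bound, which is only viable because the search space is tiny.

```python
from itertools import product

# Hypothetical food table: cost per serving, protein (g), energy (kcal),
# and whether the item is an allergen to be excluded for eczema patients.
FOODS = {
    "rice":     {"cost": 0.40, "protein": 3,  "kcal": 200, "allergen": False},
    "egg":      {"cost": 0.30, "protein": 6,  "kcal": 70,  "allergen": True},
    "chicken":  {"cost": 1.20, "protein": 25, "kcal": 170, "allergen": False},
    "broccoli": {"cost": 0.50, "protein": 3,  "kcal": 50,  "allergen": False},
}
NEED = {"protein": 50, "kcal": 1400}   # invented daily minimums
MAX_SERVINGS = 8                        # per item, keeps enumeration small

allowed = {n: f for n, f in FOODS.items() if not f["allergen"]}
names = sorted(allowed)
best = None
for servings in product(range(MAX_SERVINGS + 1), repeat=len(names)):
    protein = sum(s * allowed[n]["protein"] for s, n in zip(servings, names))
    kcal = sum(s * allowed[n]["kcal"] for s, n in zip(servings, names))
    if protein < NEED["protein"] or kcal < NEED["kcal"]:
        continue  # infeasible: nutrient minimums not met
    cost = sum(s * allowed[n]["cost"] for s, n in zip(servings, names))
    if best is None or cost < best[0]:
        best = (cost, dict(zip(names, servings)))

print(best)
```

    The allergen filter plays the role of the model's exclusion constraints; a real formulation would hand the same objective and constraints to an integer programming solver.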

  12. Hybrid collaborative optimization based on selection strategy of initial point and adaptive relaxation

    Energy Technology Data Exchange (ETDEWEB)

    Ji, Aimin; Yin, Xu; Yuan, Minghai [Hohai University, Changzhou (China)

    2015-09-15

    There are two problems in Collaborative optimization (CO): (1) local optima arising from the selection of an inappropriate initial point; (2) low efficiency and accuracy rooted in inappropriate relaxation factors. To solve these problems, we first develop the Latin hypercube design (LHD) to determine an initial point of optimization, and then use non-linear programming by quadratic Lagrangian (NLPQL) to search for the global solution. The effectiveness of the initial point selection strategy is verified on three benchmark functions of various dimensions and complexities. Then we propose the Adaptive relaxation collaborative optimization (ARCO) algorithm to solve the inconsistency between the system level and the discipline level; in this method, the relaxation factors are determined according to the three separate stages of CO respectively. The performance of the ARCO algorithm is compared with the standard collaborative algorithm and the constant relaxation collaborative algorithm on a typical numerical example, which indicates that the ARCO algorithm is more efficient and accurate. Finally, we propose a Hybrid collaborative optimization (HCO) approach, which integrates the selection strategy of the initial point with the ARCO algorithm. The results show that HCO can achieve the global optimal solution without requiring a suitable initial value, and it also has advantages in convergence, accuracy and robustness. Therefore, the proposed HCO approach can solve CO problems with applications such as the spindle and the speed reducer.

  14. Optimal installation program for reprocessing plants

    International Nuclear Information System (INIS)

    Kubokawa, Toshihiko; Kiyose, Ryohei

    1976-01-01

    Optimization of the program of installation of reprocessing plants is mathematically formulated as a mixed integer programming problem, which is numerically solved by the branch-and-bound method. A new concept of quasi-penalty is used to obviate the difficulties associated with dual degeneracy. The finiteness of the useful life of the plant is also taken into consideration. It is shown that an analogous formulation is possible for cases in which the demand forecasts and expected plant lives cannot be predicted with certainty. The problem has kN binary variables, (k+2)N continuous variables, and (k+3)N constraint conditions, where k is the number of intervals used in the piecewise linear approximation of a nonlinear objective function, and N the overall duration of the period covered by the installation program. Calculations are made for N=24 yr and k=3, with the assumption that the plant life is 15 yr, the plant scale factor 0.5, and the maximum plant capacity 900 t/yr. The results are calculated and discussed for four different demand forecasts. The difference in net profit between optimal and non-optimal installation programs is found to be in the range of 50-100 M$. The pay-off matrix is calculated, and the optimal choice of action when demand cannot be forecast with certainty is determined by applying Bayes' theory. The optimal installation program under such conditions of uncertainty is also obtained with a stochastic mixed integer programming model. (auth.)

  15. Dynamic programming for QFD in PES optimization

    Energy Technology Data Exchange (ETDEWEB)

    Sorrentino, R. [Mediterranean Univ. of Reggio Calabria, Reggio Calabria (Italy). Dept. of Computer Science and Electrical Technology

    2008-07-01

    Quality function deployment (QFD) is a method for linking the needs of the customer with design, development, engineering, manufacturing, and service functions. In the electric power industry, QFD is used to help designers concentrate on the most important technical attributes to develop better electrical services. Most optimization approaches used in QFD analysis have been based on integer or linear programming. These approaches perform well in certain circumstances, but there are problems that hinder their practical use. This paper proposed an approach to optimize Power and Energy Systems (PES). A dynamic programming approach was used along with an extended House of Quality to gather information. Dynamic programming was used to allocate the limited resources to the technical attributes. The approach integrated dynamic programming into the electrical service design process. The dynamic programming approach did not require the full relationship curve between technical attributes and customer satisfaction, or the relationship between technical attributes and cost. It only used a group of discrete points containing information about customer satisfaction, technical attributes, and the cost to find the optimal product design. Therefore, it required less time and resources than other approaches. At the end of the optimization process, the value of each technical attribute, the related cost, and the overall customer satisfaction were obtained at the same time. It was concluded that compared with other optimization methods, the dynamic programming method requires less information and the optimal results are more relevant. 21 refs., 2 tabs., 2 figs.
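    The budget-allocation idea described above can be sketched as a classic multistage dynamic program over discrete budget units. The satisfaction scores below are invented stand-ins for the discrete House of Quality points the method works from.

```python
# Hypothetical discrete data: satisfaction[i][b] is the customer
# satisfaction gained when b budget units are spent on technical
# attribute i (rows are non-decreasing, as more spending never hurts).
satisfaction = [
    [0, 3, 5, 6],   # attribute 0
    [0, 4, 6, 7],   # attribute 1
    [0, 2, 5, 8],   # attribute 2
]
BUDGET = 5  # total budget units available

# dp[b] = best total satisfaction achievable with at most b budget
# units over the attributes processed so far (one DP stage per attribute).
dp = [0] * (BUDGET + 1)
choice = []
for row in satisfaction:
    new_dp = [0] * (BUDGET + 1)
    pick = [0] * (BUDGET + 1)
    for b in range(BUDGET + 1):
        for spend, gain in enumerate(row[:b + 1]):
            if dp[b - spend] + gain > new_dp[b]:
                new_dp[b] = dp[b - spend] + gain
                pick[b] = spend
    dp, choice = new_dp, choice + [pick]

# Backtrack the optimal spending per attribute.
alloc, b = [], BUDGET
for pick in reversed(choice):
    alloc.append(pick[b])
    b -= pick[b]
alloc.reverse()
print(alloc, dp[BUDGET])  # → [1, 1, 3] 15
```

    As the record notes, the DP needs only these discrete (spend, satisfaction) points per attribute, not full relationship curves between attributes, satisfaction, and cost.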

  16. Optimal selection of TLD chips

    International Nuclear Information System (INIS)

    Phung, P.; Nicoll, J.J.; Edmonds, P.; Paris, M.; Thompson, C.

    1996-01-01

    Large sets of TLD chips are often used to measure beam dose characteristics in radiotherapy. A sorting method is presented to allow optimal selection of chips from a chosen set. This method considers the variation

  17. Combinatorial Optimization in Project Selection Using Genetic Algorithm

    Science.gov (United States)

    Dewi, Sari; Sawaluddin

    2018-01-01

    This paper discusses the problem of project selection with two objective functions, maximizing profit and minimizing cost, subject to limited resource availability and limited time, so that resources must be allocated to each project. These resources are human resources, machine resources, and raw material resources, and the allocation must not exceed the predetermined budget. The problem can therefore be formulated mathematically as a multi-objective program whose constraints must be satisfied. To assist the project selection process, a multi-objective combinatorial optimization approach is used to obtain an optimal solution for selecting the right projects. A multi-objective genetic algorithm is then described as one such combinatorial optimization method that simplifies the project selection process at large scale.

  18. Mathematical programming model for heat exchanger design through optimization of partial objectives

    International Nuclear Information System (INIS)

    Onishi, Viviani C.; Ravagnani, Mauro A.S.S.; Caballero, José A.

    2013-01-01

    Highlights: • Rigorous design of shell-and-tube heat exchangers according to TEMA standards. • Division of the problem into sets of equations that are easier to solve. • Selected heuristic objective functions based on the physical behavior of the problem. • Sequential optimization approach to avoid solutions stuck in local minima. • The results obtained with this model improved the values reported in the literature. - Abstract: Mathematical programming can be used for the optimal design of shell-and-tube heat exchangers (STHEs). This paper proposes a mixed integer non-linear programming (MINLP) model for the design of STHEs, following rigorously the standards of the Tubular Exchanger Manufacturers Association (TEMA). The Bell–Delaware method is used for the shell-side calculations. This approach produces a large and non-convex model that cannot be solved to global optimality with current state-of-the-art solvers. Notwithstanding, it is proposed to perform a sequential optimization of partial objective targets through the division of the problem into sets of related equations that are easier to solve. For each of these problems a heuristic objective function is selected based on the physical behavior of the problem. The global optimal solution of the original problem cannot be ensured even in the case in which each of the sub-problems is solved to global optimality, but at least a very good solution is always guaranteed. Three cases extracted from the literature were studied. The results showed that in all cases the values obtained using the proposed MINLP model containing multiple objective functions improved the values presented in the literature.

  19. Optimization of control poison management by dynamic programming

    International Nuclear Information System (INIS)

    Ponzoni Filho, P.

    1974-01-01

    A dynamic programming approach was used to optimize the poison distribution in the core of a nuclear power plant between reloadings. The method was applied to a 500 MWe PWR subject to two different fuel management policies. The beginning of a stage is marked by a fuel management decision. The state vector of the system is defined by the burnups in the three fuel zones of the core. The change of the state vector is computed in several time steps. A criticality-conserving poison management pattern is chosen at the beginning of each step. The burnups at the end of a step are obtained by means of depletion calculations, assuming a constant neutron distribution during the step. Violation of burnup or power-peaking constraints during the step eliminates the corresponding end states. In the case of identical end states, all except the one that produced the largest amount of energy are eliminated. Among the several end states one is selected for the subsequent stage, where it is subjected to a fuel management decision. This selection is based on an optimality criterion chosen in advance, such as maximization of discharged fuel burnup or minimization of energy generation cost. (author)

  20. Fuzzy Random λ-Mean SAD Portfolio Selection Problem: An Ant Colony Optimization Approach

    Science.gov (United States)

    Thakur, Gour Sundar Mitra; Bhattacharyya, Rupak; Mitra, Swapan Kumar

    2010-10-01

    To reach an investment goal, one has to select a combination of securities among different portfolios containing a large number of securities. Past records of each security alone do not guarantee future returns. As there are many uncertain factors that directly or indirectly influence the stock market, and as some newer stock markets do not have enough historical data, experts' expectations and experience must be combined with past records to generate an effective portfolio selection model. In this paper the return of a security is assumed to be a Fuzzy Random Variable Set (FRVS), where returns are sets of random numbers which are in turn fuzzy numbers. A new λ-Mean Semi Absolute Deviation (λ-MSAD) portfolio selection model is developed. The subjective opinions of investors on the rates of return of each security are taken into consideration by introducing a pessimistic-optimistic parameter vector λ. The λ-MSAD model is preferred as it uses the semi-absolute deviation of the rate of return of a portfolio instead of the variance as the measure of risk. As this model can be reduced to a Linear Programming Problem (LPP), it can be solved much faster than quadratic programming problems. Ant Colony Optimization (ACO) is used to solve the portfolio selection problem. ACO is a paradigm for designing meta-heuristic algorithms for combinatorial optimization problems. Data from the BSE are used for illustration.
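    The semi-absolute deviation risk measure at the heart of the MSAD model can be sketched directly; the fuzzy-random machinery, the λ vector, and the ACO solver are omitted, and the scenario returns below are invented.

```python
from statistics import mean

# Hypothetical scenario returns (rows: periods, columns: 3 securities).
R = [
    [0.02, 0.01, -0.01],
    [0.03, -0.02, 0.02],
    [-0.01, 0.04, 0.01],
    [0.01, 0.00, 0.03],
]

def portfolio_returns(weights):
    return [sum(w * r for w, r in zip(weights, row)) for row in R]

def semi_absolute_deviation(weights):
    """Downside risk used by MSAD models: the average shortfall of the
    portfolio return below its own mean. Unlike variance, this is
    piecewise linear in the weights, which is what lets the portfolio
    problem be rewritten as a linear program."""
    rs = portfolio_returns(weights)
    m = mean(rs)
    return mean(max(0.0, m - r) for r in rs)

w = [0.4, 0.3, 0.3]  # example portfolio weights, summing to 1
print(f"{mean(portfolio_returns(w)):.6f} {semi_absolute_deviation(w):.6f}")
```

    Because only shortfalls below the mean enter the measure, each scenario contributes a max(0, ·) term that an LP formulation replaces with an auxiliary variable and two linear constraints.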

  1. Optimal Bandwidth Selection for Kernel Density Functionals Estimation

    Directory of Open Access Journals (Sweden)

    Su Chen

    2015-01-01

    Full Text Available The choice of bandwidth is crucial to kernel density estimation (KDE) and kernel-based regression. Various bandwidth selection methods for KDE and local least squares regression have been developed in the past decade. It is known that scale and location parameters are proportional to density functionals ∫γ(x)f²(x)dx with an appropriate choice of γ(x), and furthermore that equality-of-scale and location tests can be transformed into comparisons of the density functionals among populations. ∫γ(x)f²(x)dx can be estimated nonparametrically via kernel density functionals estimation (KDFE). However, optimal bandwidth selection for the KDFE of ∫γ(x)f²(x)dx has not been examined. We propose a method to select the optimal bandwidth for the KDFE. The idea underlying this method is to search for the optimal bandwidth by minimizing the mean square error (MSE) of the KDFE. Two main practical bandwidth selection techniques for the KDFE of ∫γ(x)f²(x)dx are provided: normal scale bandwidth selection (namely, “Rule of Thumb”) and direct plug-in bandwidth selection. Simulation studies show that our proposed bandwidth selection methods are superior to existing density estimation bandwidth selection methods in estimating density functionals.
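    For reference, the classic normal-scale ("rule of thumb") bandwidth for ordinary KDE looks like this. The paper derives an analogous normal-scale rule for density functionals estimation, which differs from the standard KDE version shown here.

```python
import math
import random
from statistics import stdev

random.seed(0)

def normal_scale_bandwidth(data):
    """Classic normal-scale ("rule of thumb") bandwidth for a Gaussian
    kernel density estimate: h = 1.06 * sigma * n**(-1/5).
    The KDFE-specific rule in the paper has a different constant and
    rate; this is only the familiar KDE baseline."""
    return 1.06 * stdev(data) * len(data) ** (-1 / 5)

sample = [random.gauss(0.0, 1.0) for _ in range(400)]
h = normal_scale_bandwidth(sample)
print(round(h, 3))
```

    The n**(-1/5) rate comes from minimizing the asymptotic MSE of the density estimate itself; minimizing the MSE of a functional of the density, as the paper does, generally changes both the constant and the rate.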

  2. Integration of genomic information into sport horse breeding programs for optimization of accuracy of selection.

    Science.gov (United States)

    Haberland, A M; König von Borstel, U; Simianer, H; König, S

    2012-09-01

    Reliable selection criteria are required for young riding horses to increase genetic gain by increasing accuracy of selection and decreasing generation intervals. In this study, selection strategies incorporating genomic breeding values (GEBVs) were evaluated. Relevant stages of selection in sport horse breeding programs were analyzed by applying selection index theory. Results in terms of accuracies of indices (r(TI)) and relative selection response indicated that information on single nucleotide polymorphism (SNP) genotypes considerably increases the accuracy of breeding values estimated for young horses without own or progeny performance. In a first scenario, the correlation between the breeding value estimated from the SNP genotype and the true breeding value (= accuracy of GEBV) was fixed to a relatively low value of r(mg) = 0.5. For a low heritability trait (h(2) = 0.15), and an index for a young horse based only on information from both parents, additional genomic information doubles r(TI) from 0.27 to 0.54. Including the conventional information source 'own performance' in the aforementioned index, additional SNP information increases r(TI) by 40%. Thus, particularly with regard to traits of low heritability, genomic information can provide a tool for well-founded selection decisions early in life. In a further approach, different sources of breeding values (e.g. GEBV and estimated breeding values (EBVs) from different countries) were combined into an overall index when altering accuracies of EBVs and correlations between traits. In summary, we showed that genomic selection strategies have the potential to contribute to a substantial reduction in generation intervals in horse breeding programs.

  3. Graphic Interface for LCP2 Optimization Program

    DEFF Research Database (Denmark)

    Nicolae, Taropa Laurentiu; Gaunholt, Hans

    1998-01-01

    This report provides information about the software interface programmed for the optimization program LCP2. The first part gives a general description of the program, followed by a guide to using the interface. The last chapters contain a discussion of problems and possible future extensions of the project. The program is written in Visual C++ 5.0 on a Windows NT 4.0 operating system.

  4. Parameter Selection and Performance Comparison of Particle Swarm Optimization in Sensor Networks Localization.

    Science.gov (United States)

    Cui, Huanqing; Shu, Minglei; Song, Min; Wang, Yinglong

    2017-03-01

    Localization is a key technology in wireless sensor networks. Faced with the challenges of the sensors' memory, computational constraints, and limited energy, particle swarm optimization has been widely applied in the localization of wireless sensor networks, demonstrating better performance than other optimization methods. In particle swarm optimization-based localization algorithms, the variants and parameters should be chosen elaborately to achieve the best performance. However, there is a lack of guidance on how to choose these variants and parameters. Further, there is no comprehensive performance comparison among particle swarm optimization algorithms. The main contribution of this paper is three-fold. First, it surveys the popular particle swarm optimization variants and particle swarm optimization-based localization algorithms for wireless sensor networks. Secondly, it presents parameter selection of nine particle swarm optimization variants and six types of swarm topologies by extensive simulations. Thirdly, it comprehensively compares the performance of these algorithms. The results show that the particle swarm optimization with constriction coefficient using ring topology outperforms other variants and swarm topologies, and it performs better than the second-order cone programming algorithm.
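    A minimal sketch of the winning configuration reported in this record: constriction-coefficient PSO with a ring topology, shown on a toy sphere function rather than a localization objective. The constriction setting follows Clerc's usual φ = 4.1; swarm size, iteration count, and bounds are illustrative.

```python
import math
import random

random.seed(42)

def sphere(x):
    # Toy objective standing in for a localization error function.
    return sum(xi * xi for xi in x)

def pso_ring(f, dim=2, n=20, iters=300):
    """Particle swarm with Clerc's constriction coefficient and a ring
    topology: each particle is guided by the best particle among its
    two ring neighbours and itself, not by a single global best."""
    phi = 4.1
    chi = 2 / abs(2 - phi - math.sqrt(phi * phi - 4 * phi))  # ≈ 0.7298
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    for _ in range(iters):
        for i in range(n):
            # Best personal best in the ring neighbourhood {i-1, i, i+1}.
            nb = min((i - 1) % n, i, (i + 1) % n, key=lambda j: pbest_val[j])
            for d in range(dim):
                vel[i][d] = chi * (vel[i][d]
                    + 2.05 * random.random() * (pbest[i][d] - pos[i][d])
                    + 2.05 * random.random() * (pbest[nb][d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            v = f(pos[i])
            if v < pbest_val[i]:
                pbest_val[i], pbest[i] = v, pos[i][:]
    return min(pbest_val)

best_val = pso_ring(sphere)
print(best_val)
```

    The ring topology slows information flow through the swarm, which trades raw convergence speed for robustness against premature convergence, consistent with the comparison reported here.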

  6. A novel variable selection approach that iteratively optimizes variable space using weighted binary matrix sampling.

    Science.gov (United States)

    Deng, Bai-chuan; Yun, Yong-huan; Liang, Yi-zeng; Yi, Lun-zhao

    2014-10-07

    In this study, a new optimization algorithm called the Variable Iterative Space Shrinkage Approach (VISSA) that is based on the idea of model population analysis (MPA) is proposed for variable selection. Unlike most of the existing optimization methods for variable selection, VISSA statistically evaluates the performance of variable space in each step of optimization. Weighted binary matrix sampling (WBMS) is proposed to generate sub-models that span the variable subspace. Two rules are highlighted during the optimization procedure. First, the variable space shrinks in each step. Second, the new variable space outperforms the previous one. The second rule, which is rarely satisfied in most of the existing methods, is the core of the VISSA strategy. Compared with some promising variable selection methods such as competitive adaptive reweighted sampling (CARS), Monte Carlo uninformative variable elimination (MCUVE) and iteratively retaining informative variables (IRIV), VISSA showed better prediction ability for the calibration of NIR data. In addition, VISSA is user-friendly; only a few insensitive parameters are needed, and the program terminates automatically without any additional conditions. The Matlab codes for implementing VISSA are freely available on the website: https://sourceforge.net/projects/multivariateanalysis/files/VISSA/.
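    The weighted binary matrix sampling idea can be sketched with a toy scoring function in place of a real NIR calibration model. The variable set, inclusion weights, shrinkage schedule, and score below are all simplified assumptions, not the published VISSA procedure.

```python
import random

random.seed(7)

N_VARS = 10
RELEVANT = {0, 2, 5}  # hypothetical truly informative variables

def score(subset):
    # Toy stand-in for cross-validated model performance: reward
    # informative variables, penalize noise variables.
    return sum(1 for v in subset if v in RELEVANT) - 0.3 * sum(
        1 for v in subset if v not in RELEVANT)

weights = [0.5] * N_VARS  # inclusion probability of each variable
for step in range(20):
    # Weighted binary matrix sampling: each row is a random sub-model
    # drawn according to the current inclusion weights.
    rows = [[v for v in range(N_VARS) if random.random() < weights[v]]
            for _ in range(200)]
    rows.sort(key=score, reverse=True)
    top = rows[:20]  # keep the best 10% of sub-models
    # New weight = inclusion frequency among the best sub-models, so the
    # sampled variable space shrinks toward informative variables and
    # each step's space outperforms the previous one on average.
    weights = [sum(v in row for row in top) / len(top) for v in range(N_VARS)]

selected = [v for v, w in enumerate(weights) if w > 0.5]
print(selected)
```

    Once a weight reaches 0 or 1 the corresponding variable is permanently excluded or included, which is the shrinking-space behaviour the method's two rules describe.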

  7. SOCP relaxation bounds for the optimal subset selection problem applied to robust linear regression

    OpenAIRE

    Flores, Salvador

    2015-01-01

    This paper deals with the problem of finding the globally optimal subset of h elements from a larger set of n elements in d space dimensions so as to minimize a quadratic criterion, with a special emphasis on applications to computing the Least Trimmed Squares Estimator (LTSE) for robust regression. The computation of the LTSE is a challenging subset selection problem involving a nonlinear program with continuous and binary variables, linked in a highly nonlinear fashion. The selection of a ...

  8. Optimal decisions principles of programming

    CERN Document Server

    Lange, Oskar

    1971-01-01

    Optimal Decisions: Principles of Programming deals with all important problems related to programming. This book provides a general interpretation of the theory of programming based on the application of Lagrange multipliers, followed by a presentation of marginal and linear programming as special cases of this general theory. The praxeological interpretation of the method of Lagrange multipliers is also discussed. This text covers Koopmans' model of transportation, the geometric interpretation of the programming problem, and the nature of activity analysis. The solution of t

  9. Pareto-Optimal Model Selection via SPRINT-Race.

    Science.gov (United States)

    Zhang, Tiantian; Georgiopoulos, Michael; Anagnostopoulos, Georgios C

    2018-02-01

    In machine learning, the notion of multi-objective model selection (MOMS) refers to the problem of identifying the set of Pareto-optimal models that jointly optimize more than one predefined objective. This paper introduces SPRINT-Race, the first multi-objective racing algorithm in a fixed-confidence setting, which is based on the sequential probability ratio with indifference zone test. SPRINT-Race addresses the problem of MOMS with multiple stochastic optimization objectives in the proper Pareto-optimality sense. In SPRINT-Race, a pairwise dominance or non-dominance relationship is statistically inferred via a non-parametric, ternary-decision, dual-sequential probability ratio test. The overall probability of falsely eliminating any Pareto-optimal models or mistakenly returning any clearly dominated models is strictly controlled by a sequential Holm's step-down family-wise error rate control method. As a fixed-confidence model selection algorithm, the objective of SPRINT-Race is to minimize the computational effort required to achieve a prescribed confidence level about the quality of the returned models. The performance of SPRINT-Race is first examined via an artificially constructed MOMS problem with known ground truth. Subsequently, SPRINT-Race is applied to two real-world applications: 1) hybrid recommender system design and 2) multi-criteria stock selection. The experimental results verify that SPRINT-Race is an effective and efficient tool for such MOMS problems. The code of SPRINT-Race is available at https://github.com/watera427/SPRINT-Race.

  10. An Optimization Model for the Selection of Bus-Only Lanes in a City.

    Science.gov (United States)

    Chen, Qun

    2015-01-01

    The planning of urban bus-only lane networks is an important measure to improve bus service and bus priority. To determine the effective arrangement of bus-only lanes, a bi-level programming model for urban bus lane layout is developed in this study that considers accessibility and budget constraints. The goal of the upper-level model is to minimize the total travel time, and the lower-level model is a capacity-constrained traffic assignment model that describes the passenger flow assignment on bus lines, in which the priority sequence of the transfer times is reflected in the passengers' route-choice behaviors. Using the proposed bi-level programming model, optimal bus lines are selected from a set of candidate bus lines; thus, the corresponding bus lane network on which the selected bus lines run is determined. The solution method using a genetic algorithm in the bi-level programming model is developed, and two numerical examples are investigated to demonstrate the efficacy of the proposed model.

  11. An Optimization Model for the Selection of Bus-Only Lanes in a City.

    Directory of Open Access Journals (Sweden)

    Qun Chen

    Full Text Available The planning of urban bus-only lane networks is an important measure to improve bus service and bus priority. To determine the effective arrangement of bus-only lanes, a bi-level programming model for urban bus lane layout is developed in this study that considers accessibility and budget constraints. The goal of the upper-level model is to minimize the total travel time, and the lower-level model is a capacity-constrained traffic assignment model that describes the passenger flow assignment on bus lines, in which the priority sequence of the transfer times is reflected in the passengers' route-choice behaviors. Using the proposed bi-level programming model, optimal bus lines are selected from a set of candidate bus lines; thus, the corresponding bus lane network on which the selected bus lines run is determined. The solution method using a genetic algorithm in the bi-level programming model is developed, and two numerical examples are investigated to demonstrate the efficacy of the proposed model.

  12. A first formal link between the price equation and an optimization program.

    Science.gov (United States)

    Grafen, Alan

    2002-07-07

    The Darwin unification project is pursued. A meta-model encompassing an important class of population genetic models is formed by adding an abstract model of the number of successful gametes to the Price equation under uncertainty. A class of optimization programs is defined to represent the "individual-as-maximizing-agent analogy" in a general way. It is then shown that for each population genetic model there is a corresponding optimization program with which formal links can be established. These links provide a secure logical foundation for the commonplace biological principle that natural selection leads organisms to act as if maximizing their "fitness", provide a definition of "fitness", and clarify the limitations of that principle. The situations covered do not include frequency dependence or social behaviour, but the approach is capable of extension.

  13. Managing the Public Sector Research and Development Portfolio Selection Process: A Case Study of Quantitative Selection and Optimization

    Science.gov (United States)

    2016-09-01

    A case study by Jason A. Schwartz describing how public sector organizations can implement a research and development (R&D) portfolio optimization strategy to maximize the cost ...

  14. Software for selection of optimal layouts of fast reactors

    International Nuclear Information System (INIS)

    Geraskin, N.I.; Kuz'min, A.M.; Morin, D.V.

    1983-01-01

    A program package for the calculation and optimization of a two-dimensional cylindrical fast reactor, consisting of two axial layers and having up to 10 zones of different compositions in each layer, is described. The search for optimal parameters is performed by the successive linearization method, based on small perturbation theory and linear programming. The package is written for the BESM-6 computer in FORTRAN.

  15. Optimizing the hydraulic program of cementing casing strings

    Energy Technology Data Exchange (ETDEWEB)

    Novakovic, M

    1984-01-01

    A technique is described for calculating the optimal parameters of the flow of plugging mud, taking into consideration the geometry of the annular space and the rheological characteristics of the muds. The optimization algorithm is illustrated by a block diagram. Examples are given of practical application of the optimization programs in production conditions. It is stressed that optimizing the hydraulic cementing program is effective if the other technical and technological problems in cementing casing strings have been resolved.

  16. Portfolio selection problem: a comparison of fuzzy goal programming and linear physical programming

    Directory of Open Access Journals (Sweden)

    Fusun Kucukbay

    2016-04-01

    Full Text Available Investors have a limited budget and try to maximize their return with minimum risk; this study therefore deals with the portfolio selection problem. Two criteria are considered: expected return and risk. In this respect, the linear physical programming (LPP) technique is applied to Bist 100 stocks to find the optimum portfolio. The analysis covers the period April 2009 - March 2015. This period is divided into two: April 2009 - March 2014 and April 2014 - March 2015. The April 2009 - March 2014 period is used as data to find an optimal solution. The April 2014 - March 2015 period is used to test the real performance of the portfolios. The performance of the obtained portfolio is compared with that obtained from fuzzy goal programming (FGP). Then the performances of both methods, LPP and FGP, are compared with BIST 100 in terms of their Sharpe indexes. The findings reveal that LPP is a good alternative to FGP for the portfolio selection problem.

  17. Optimal portfolio selection between different kinds of Renewable energy sources

    Energy Technology Data Exchange (ETDEWEB)

    Zakerinia, MohammadSaleh; Piltan, Mehdi; Ghaderi, Farid

    2010-09-15

    In this paper, the selection of the optimal energy supply system for an industrial unit is considered. The study takes environmental, economic and social parameters into consideration in the modeling, along with technical factors. Several alternatives, including renewable energy sources, micro-CHP systems and conventional systems, are compared by means of an integrated model of linear programming and three multi-criteria approaches (AHP, TOPSIS and ELECTRE III). Besides traditional factors, new parameters such as availability of sources and fuel price volatility are considered in different scenarios. Results show that, with environmental preferences, renewable sources and micro-CHP are good alternatives to conventional systems.

  18. Ant colony optimization and constraint programming

    CERN Document Server

    Solnon, Christine

    2013-01-01

    Ant colony optimization is a metaheuristic which has been successfully applied to a wide range of combinatorial optimization problems. The author describes this metaheuristic and studies its efficiency for solving some hard combinatorial problems, with a specific focus on constraint programming. The text is organized into three parts. The first part introduces constraint programming, which provides high level features to declaratively model problems by means of constraints. It describes the main existing approaches for solving constraint satisfaction problems, including complete tree search

  19. A Simulation Modeling Framework to Optimize Programs Using Financial Incentives to Motivate Health Behavior Change.

    Science.gov (United States)

    Basu, Sanjay; Kiernan, Michaela

    2016-01-01

    While increasingly popular among mid- to large-size employers, using financial incentives to induce health behavior change among employees has been controversial, in part due to poor quality and generalizability of studies to date. Thus, fundamental questions have been left unanswered: To generate positive economic returns on investment, what level of incentive should be offered for any given type of incentive program and among which employees? We constructed a novel modeling framework that systematically identifies how to optimize marginal return on investment from programs incentivizing behavior change by integrating commonly collected data on health behaviors and associated costs. We integrated "demand curves" capturing individual differences in response to any given incentive with employee demographic and risk factor data. We also estimated the degree of self-selection that could be tolerated: that is, the maximum percentage of already-healthy employees who could enroll in a wellness program while still maintaining positive absolute return on investment. In a demonstration analysis, the modeling framework was applied to data from 3000 worksite physical activity programs across the nation. For physical activity programs, the incentive levels that would optimize marginal return on investment ($367/employee/year) were higher than average incentive levels currently offered ($143/employee/year). Yet a high degree of self-selection could undermine the economic benefits of the program; if more than 17% of participants came from the top 10% of the physical activity distribution, the cost of the program would be expected to always be greater than its benefits. Our generalizable framework integrates individual differences in behavior and risk to systematically estimate the incentive level that optimizes marginal return on investment. © The Author(s) 2015.

  20. Optimal control of bond selectivity in unimolecular reactions

    International Nuclear Information System (INIS)

    Shi Shenghua; Rabitz, H.

    1991-01-01

    The optimal control theory approach to designing optimal fields for bond-selective unimolecular reactions is presented. A set of equations is developed for determining the optimal fields that lead to the objective of bond-selective dissociation. The numerical procedure given for solving these equations requires repeated calculation of the time propagator for a system with a time-dependent Hamiltonian. The splitting approximation, combined with the fast Fourier transform algorithm, is used for computing the short-time propagator. As an illustrative example, a model linear triatomic molecule is treated. The model system consists of two Morse oscillators coupled via kinetic coupling. The magnitudes of the dipoles of the two Morse oscillators are the same and the fundamental frequencies are almost the same, but the dissociation energies are different. The rather demanding objective under these conditions is to break the stronger bond while leaving the weaker one intact. It is encouraging that the present computational method efficiently yields the optimal field, which leads to excellent achievement of the objective of bond-selective dissociation. (orig.)
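
The short-time propagator built from the splitting approximation and the fast Fourier transform can be illustrated with a generic one-dimensional sketch (a single harmonic well rather than the coupled Morse oscillators of the paper; the grid, time step and units are assumptions):

```python
import numpy as np

# Minimal split-operator propagator for the time-dependent Schrodinger equation,
# here for a 1-D harmonic well with hbar = m = omega = 1.
N = 256
x = np.linspace(-10, 10, N, endpoint=False)
dx = x[1] - x[0]
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)       # momentum grid for the FFT
V = 0.5 * x ** 2
dt = 0.01

psi = np.pi ** -0.25 * np.exp(-x ** 2 / 2)    # ground state of the well

# Short-time propagator: exp(-i V dt/2) exp(-i T dt) exp(-i V dt/2)
half_v = np.exp(-0.5j * V * dt)
kin = np.exp(-0.5j * k ** 2 * dt)
for _ in range(1000):
    psi = half_v * psi
    psi = np.fft.ifft(kin * np.fft.fft(psi))  # kinetic step in momentum space
    psi = half_v * psi

norm = float(np.sum(np.abs(psi) ** 2) * dx)   # unitary propagation keeps this at 1
```

Each factor of the split propagator is a pure phase, so the scheme is exactly unitary and the wavefunction norm is conserved to machine precision even over many steps.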

  1. Feature selection for portfolio optimization

    DEFF Research Database (Denmark)

    Bjerring, Thomas Trier; Ross, Omri; Weissensteiner, Alex

    2016-01-01

    Most portfolio selection rules based on the sample mean and covariance matrix perform poorly out-of-sample. Moreover, there is a growing body of evidence that such optimization rules are not able to beat simple rules of thumb, such as 1/N. Parameter uncertainty has been identified as one major ... While most of the diversification benefits are preserved, the parameter estimation problem is alleviated. We conduct out-of-sample back-tests to show that in most cases different well-established portfolio selection rules applied on the reduced asset universe are able to improve alpha relative ...

  2. Portfolio optimization using fuzzy linear programming

    Science.gov (United States)

    Pandit, Purnima K.

    2013-09-01

    Portfolio Optimization (PO) is a problem in finance in which an investor tries to maximize return and minimize risk by carefully choosing different assets. Expected return and risk are the most important parameters with regard to optimal portfolios. In its simple form, PO can be modeled as a quadratic programming problem, which can be put into an equivalent linear form. PO problems with fuzzy parameters can be solved as multi-objective fuzzy linear programming problems. In this paper we give the solution to such problems with an illustrative example.
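
The crisp quadratic-programming core of PO (before fuzzy parameters enter) can be sketched as follows; the return vector, covariance matrix and risk-aversion coefficient are invented illustrative numbers, and a general-purpose solver stands in for a dedicated QP routine:

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative inputs: expected returns and covariance for three assets
mu = np.array([0.10, 0.12, 0.07])
cov = np.array([[0.09, 0.01, 0.00],
                [0.01, 0.16, 0.02],
                [0.00, 0.02, 0.04]])
risk_aversion = 3.0

# Mean-variance objective: risk penalty minus expected return
def objective(w):
    return risk_aversion * w @ cov @ w - mu @ w

cons = ({'type': 'eq', 'fun': lambda w: np.sum(w) - 1.0},)  # fully invested
bounds = [(0.0, 1.0)] * 3                                   # no short selling
res = minimize(objective, x0=np.full(3, 1 / 3), bounds=bounds, constraints=cons)
w = res.x   # optimal portfolio weights
```

Replacing the quadratic risk term with a piecewise-linear proxy (e.g. mean absolute deviation) is what makes the equivalent linear form mentioned in the abstract possible.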

  3. Optimization Research of Generation Investment Based on Linear Programming Model

    Science.gov (United States)

    Wu, Juan; Ge, Xueqian

    Linear programming is an important branch of operational research and a mathematical method to assist people in carrying out scientific management. GAMS is an advanced simulation and optimization modeling language that combines complex mathematical programming formulations, such as linear programming (LP), nonlinear programming (NLP) and mixed-integer programming (MIP), with system simulation. In this paper, based on a linear programming model, optimized generation investment decision-making is simulated and analyzed. Finally, the optimal installed capacity of power plants and the final total cost are obtained, which provides a rational decision-making basis for optimized investments.
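
A toy version of such a generation-investment LP can be sketched with an open-source solver standing in for GAMS; the plant costs, demand figure and capacity cap below are invented for illustration:

```python
from scipy.optimize import linprog

# Hypothetical generation-investment LP: choose installed capacity (MW) of two
# plant types to cover peak demand at minimum investment cost.
cost = [1.2, 0.8]        # cost per MW for plant A and plant B (made-up figures)
peak_demand = 500        # MW of capacity required
b_max = 300              # cap on plant B capacity, e.g. a fuel-supply limit

res = linprog(
    c=cost,
    A_ub=[[-1, -1],      # -(xA + xB) <= -peak_demand, i.e. xA + xB >= 500
          [0, 1]],       # xB <= 300
    b_ub=[-peak_demand, b_max],
    bounds=[(0, None), (0, None)],
)
capacity, total_cost = res.x, res.fun
# Optimal plan: build all 300 MW of the cheaper plant B, cover the rest with A
```

The solver fills the cheaper technology to its cap first, which is exactly the corner-point behavior LP theory predicts for this kind of investment model.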

  4. Optimization of biotechnological systems through geometric programming

    Directory of Open Access Journals (Sweden)

    Torres Nestor V

    2007-09-01

    Full Text Available Abstract Background In the past, tasks of model based yield optimization in metabolic engineering were either approached with stoichiometric models or with structured nonlinear models such as S-systems or linear-logarithmic representations. These models stand out among most others, because they allow the optimization task to be converted into a linear program, for which efficient solution methods are widely available. For pathway models not in one of these formats, an Indirect Optimization Method (IOM was developed where the original model is sequentially represented as an S-system model, optimized in this format with linear programming methods, reinterpreted in the initial model form, and further optimized as necessary. Results A new method is proposed for this task. We show here that the model format of a Generalized Mass Action (GMA system may be optimized very efficiently with techniques of geometric programming. We briefly review the basics of GMA systems and of geometric programming, demonstrate how the latter may be applied to the former, and illustrate the combined method with a didactic problem and two examples based on models of real systems. The first is a relatively small yet representative model of the anaerobic fermentation pathway in S. cerevisiae, while the second describes the dynamics of the tryptophan operon in E. coli. Both models have previously been used for benchmarking purposes, thus facilitating comparisons with the proposed new method. In these comparisons, the geometric programming method was found to be equal or better than the earlier methods in terms of successful identification of optima and efficiency. Conclusion GMA systems are of importance, because they contain stoichiometric, mass action and S-systems as special cases, along with many other models. 
Furthermore, it was previously shown that algebraic equivalence transformations of variables are sufficient to convert virtually any types of dynamical models into

  5. Enhanced index tracking modeling in portfolio optimization with mixed-integer programming approach

    Science.gov (United States)

    Siew, Lam Weng; Jaaman, Saiful Hafizah Hj.; Ismail, Hamizun bin

    2014-09-01

    Enhanced index tracking is a popular form of portfolio management in stock market investment. Enhanced index tracking aims to construct an optimal portfolio that generates excess return over the return achieved by the stock market index, without purchasing all of the stocks that make up the index. The objective of this paper is to construct an optimal portfolio using a mixed-integer programming model that adopts a regression approach in order to generate a higher portfolio mean return than the stock market index return. In this study, the data consist of 24 component stocks of the Malaysia market index, the FTSE Bursa Malaysia Kuala Lumpur Composite Index, from January 2010 until December 2012. The results of this study show that the optimal portfolio of the mixed-integer programming model is able to generate a higher mean return than the FTSE Bursa Malaysia Kuala Lumpur Composite Index return while selecting only 30% of the total stock market index components.
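
The cardinality-constrained selection at the heart of such a model can be sketched on synthetic data; exhaustive enumeration replaces the mixed-integer solver, and none of the numbers below come from the FTSE Bursa Malaysia data used in the paper:

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for the real data set: 6 candidate stocks over 100 periods,
# with the "index" taken as their equally weighted average.
T, n, k = 100, 6, 2                      # select k of n stocks to track the index
stocks = rng.normal(0.001, 0.02, size=(T, n))
index = stocks.mean(axis=1)

# Cardinality-constrained tracking: exhaustive enumeration of subsets stands in
# for the branch-and-bound search a mixed-integer programming solver performs.
best_err, best_subset, best_w = np.inf, None, None
for subset in itertools.combinations(range(n), k):
    A = stocks[:, subset]
    w, *_ = np.linalg.lstsq(A, index, rcond=None)   # regression tracking weights
    err = float(np.mean((A @ w - index) ** 2))
    if err < best_err:
        best_err, best_subset, best_w = err, subset, w
```

Enumeration is only feasible for tiny universes; with 24 candidate stocks and a 30% cardinality limit, the binary selection variables of a MIP formulation do this search implicitly.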

  6. Decision-making methodology of optimal shielding materials by using fuzzy linear programming

    International Nuclear Information System (INIS)

    Kanai, Y.; Miura, T.; Hirao, Y.

    2000-01-01

    The main purpose of our studies is to select materials and determine the ratio of constituent materials as the first stage of optimum shielding design to suit the individual requirements of nuclear reactors, reprocessing facilities, casks for shipping spent fuel, etc. The parameters of the shield optimization are cost, space, weight and some shielding properties such as activation rates for individual irradiation and cooling times, and the total dose rate for neutrons (including secondary gamma rays) and for primary gamma rays. Using conventional two-valued (i.e. crisp) logic approaches, huge numbers of combination calculations are needed to identify suitable materials for an optimum shielding design. Also, re-computation is required for minor changes, as the approach does not react sensitively to the computation result. In the present approach, using a fuzzy linear programming method, much of the decision-making toward a satisfying solution can take place in a fuzzy environment, and it can quickly and easily provide a guiding principle for the optimal selection of shielding materials under the above-mentioned conditions. The possibility of reducing radiation effects by optimizing the ratio of constituent materials is investigated. (author)

  7. SU-F-R-10: Selecting the Optimal Solution for Multi-Objective Radiomics Model

    International Nuclear Information System (INIS)

    Zhou, Z; Folkert, M; Wang, J

    2016-01-01

    Purpose: To develop an evidential reasoning approach for selecting the optimal solution from a Pareto solution set obtained by a multi-objective radiomics model for predicting distant failure in lung SBRT. Methods: In the multi-objective radiomics model, both sensitivity and specificity are considered as objective functions simultaneously. A Pareto solution set with many feasible solutions results from the multi-objective optimization. In this work, an optimal solution Selection methodology for Multi-Objective radiomics Learning model using the Evidential Reasoning approach (SMOLER) was proposed to select the optimal solution from the Pareto solution set. The proposed SMOLER method used the evidential reasoning approach to calculate the utility of each solution based on pre-set optimal solution selection rules. The solution with the highest utility was chosen as the optimal solution. In SMOLER, an optimal learning model coupled with a clonal selection algorithm was used to optimize model parameters. In this study, PET and CT image features and clinical parameters were utilized for predicting distant failure in lung SBRT. Results: A total of 126 solution sets were generated by adjusting predictive model parameters. Each Pareto set contains 100 feasible solutions. The solution selected by SMOLER within each Pareto set was compared to the manually selected optimal solution. Five-fold cross-validation was used to evaluate the optimal solution selection accuracy of SMOLER. The selection accuracies for the five folds were 80.00%, 69.23%, 84.00%, 84.00%, and 80.00%, respectively. Conclusion: An optimal solution selection methodology for a multi-objective radiomics learning model using the evidential reasoning approach (SMOLER) was proposed. Experimental results show that the optimal solution can be found in approximately 80% of cases.

  8. SU-F-R-10: Selecting the Optimal Solution for Multi-Objective Radiomics Model

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Z; Folkert, M; Wang, J [UT Southwestern Medical Center, Dallas, TX (United States)]

    2016-06-15

    Purpose: To develop an evidential reasoning approach for selecting the optimal solution from a Pareto solution set obtained by a multi-objective radiomics model for predicting distant failure in lung SBRT. Methods: In the multi-objective radiomics model, both sensitivity and specificity are considered as objective functions simultaneously. A Pareto solution set with many feasible solutions results from the multi-objective optimization. In this work, an optimal solution Selection methodology for Multi-Objective radiomics Learning model using the Evidential Reasoning approach (SMOLER) was proposed to select the optimal solution from the Pareto solution set. The proposed SMOLER method used the evidential reasoning approach to calculate the utility of each solution based on pre-set optimal solution selection rules. The solution with the highest utility was chosen as the optimal solution. In SMOLER, an optimal learning model coupled with a clonal selection algorithm was used to optimize model parameters. In this study, PET and CT image features and clinical parameters were utilized for predicting distant failure in lung SBRT. Results: A total of 126 solution sets were generated by adjusting predictive model parameters. Each Pareto set contains 100 feasible solutions. The solution selected by SMOLER within each Pareto set was compared to the manually selected optimal solution. Five-fold cross-validation was used to evaluate the optimal solution selection accuracy of SMOLER. The selection accuracies for the five folds were 80.00%, 69.23%, 84.00%, 84.00%, and 80.00%, respectively. Conclusion: An optimal solution selection methodology for a multi-objective radiomics learning model using the evidential reasoning approach (SMOLER) was proposed. Experimental results show that the optimal solution can be found in approximately 80% of cases.

  9. Expected value based fuzzy programming approach to solve integrated supplier selection and inventory control problem with fuzzy demand

    Science.gov (United States)

    Sutrisno; Widowati; Sunarsih; Kartono

    2018-01-01

    In this paper, a mathematical model in quadratic programming with fuzzy parameters is proposed to determine the optimal strategy for an integrated inventory control and supplier selection problem with fuzzy demand. To solve the corresponding optimization problem, we use expected-value-based fuzzy programming. Numerical examples are performed to evaluate the model. From the results, the optimal amount of each product that has to be purchased from each supplier for each time period and the optimal amount of each product that has to be stored in the inventory for each time period were determined with minimum total cost, and the inventory level was sufficiently close to the reference level.
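
The expected-value defuzzification step can be sketched as follows; a single-period linear cost stands in for the paper's multi-period quadratic model, one common triangular expected-value formula is used, and the demand, prices and capacities are all hypothetical:

```python
# One common expected value of a triangular fuzzy number (a, m, b): (a + 2m + b) / 4
def expected_value(tri):
    a, m, b = tri
    return (a + 2 * m + b) / 4.0

fuzzy_demand = (80, 100, 130)      # pessimistic, most likely, optimistic demand
demand = expected_value(fuzzy_demand)

# Two hypothetical suppliers: (unit price, capacity limit)
suppliers = [(4.0, 60), (5.0, 100)]

# With a linear cost and a crisp (defuzzified) demand, filling the order from the
# cheapest supplier first is optimal
order, total_cost, remaining = [], 0.0, demand
for price, cap in sorted(suppliers):
    qty = min(cap, remaining)
    order.append(qty)
    total_cost += price * qty
    remaining -= qty
```

Once the fuzzy demand is replaced by its expected value, the remaining problem is an ordinary crisp program; the paper's full model additionally carries inventory across periods and a quadratic cost.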

  10. An intuitionistic fuzzy optimization approach to the vendor selection problem

    Directory of Open Access Journals (Sweden)

    Prabjot Kaur

    2016-09-01

    Full Text Available Selecting the right vendor is an important business decision made by any organization. The decision involves multiple criteria, and if the objectives vary in preference and scope, the nature of the decision becomes multiobjective. In this paper, a vendor selection problem has been formulated as an intuitionistic fuzzy multiobjective optimization in which an appropriate number of vendors is to be selected and orders allocated to them. The multiobjective problem includes three objectives: minimizing the net price, maximizing the quality, and maximizing the on-time deliveries, subject to suppliers' constraints. The objective functions and the demand are treated as intuitionistic fuzzy sets. An intuitionistic fuzzy set has the ability to handle uncertainty with additional degrees of freedom. The intuitionistic fuzzy optimization (IFO) problem is converted into a crisp linear form and solved using the optimization software Tora. The advantage of IFO is that it gives better results than fuzzy/crisp optimization. The proposed approach is explained by a numerical example.

  11. On Optimal Input Design and Model Selection for Communication Channels

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yanyan [ORNL]; Djouadi, Seddik M [ORNL]; Olama, Mohammed M [ORNL]

    2013-01-01

    In this paper, the optimal model (structure) selection and input design which minimize the worst-case identification error for communication systems are provided. The problem is formulated using metric complexity theory in a Hilbert space setting. It is pointed out that model selection and input design can be handled independently. The Kolmogorov n-width is used to characterize the representation error introduced by model selection, while Gelfand and time n-widths are used to represent the inherent error introduced by input design. After the model is selected, an optimal input which minimizes the worst-case identification error is shown to exist. In particular, it is proven that the optimal model for reducing the representation error is a Finite Impulse Response (FIR) model, and the optimal input is an impulse at the start of the observation interval. FIR models are widely popular in communication systems, for example in Orthogonal Frequency Division Multiplexing (OFDM) systems.

  12. Spectral Quantitative Analysis Model with Combining Wavelength Selection and Topology Structure Optimization

    Directory of Open Access Journals (Sweden)

    Qian Wang

    2016-01-01

    Full Text Available Spectroscopy is an efficient and widely used quantitative analysis method. In this paper, a spectral quantitative analysis model combining wavelength selection and topology structure optimization is proposed. In the proposed method, a backpropagation neural network is adopted for building the component prediction model, and the simultaneous optimization of the wavelength selection and the topology structure of the neural network is realized by nonlinear adaptive evolutionary programming (NAEP). The hybrid chromosome, in the binary scheme of NAEP, has three parts: the first part represents the topology structure of the neural network, the second part represents the selection of wavelengths in the spectral data, and the third part represents the mutation parameters of NAEP. Two real flue gas datasets are used in the experiments. To demonstrate the effectiveness of the method, partial least squares with the full spectrum, partial least squares combined with a genetic algorithm, the uninformative variable elimination method, a backpropagation neural network with the full spectrum, a backpropagation neural network combined with a genetic algorithm, and the proposed method are each used for building the component prediction model. Experimental results verify that the proposed method predicts more accurately and robustly as a practical spectral analysis tool.

  13. Optimal Implantable Cardioverter Defibrillator Programming.

    Science.gov (United States)

    Shah, Bindi K

    Optimal programming of implantable cardioverter defibrillators (ICDs) is essential to appropriately treat ventricular tachyarrhythmias and to avoid unnecessary and inappropriate shocks. There have been a series of large clinical trials evaluating tailored programming of ICDs. We reviewed the clinical trials evaluating ICD therapies and detection, and the consensus statement on ICD programming. In doing so, we found that prolonged ICD detection times, higher rate cutoffs, and antitachycardia pacing (ATP) programming decreases inappropriate and painful therapies in a primary prevention population. The use of supraventricular tachyarrhythmia discriminators can also decrease inappropriate shocks. Tailored ICD programming using the knowledge gained from recent ICD trials can decrease inappropriate and unnecessary ICD therapies and decrease mortality.

  14. Optimal Operation of Radial Distribution Systems Using Extended Dynamic Programming

    DEFF Research Database (Denmark)

    Lopez, Juan Camilo; Vergara, Pedro P.; Lyra, Christiano

    2018-01-01

    An extended dynamic programming (EDP) approach is developed to optimize the ac steady-state operation of radial electrical distribution systems (EDS). Based on the optimality principle of the recursive Hamilton-Jacobi-Bellman equations, the proposed EDP approach determines the optimal operation of the EDS by setting the values of the controllable variables at each time period. A suitable definition for the stages of the problem makes it possible to represent the optimal ac power flow of radial EDS as a dynamic programming problem, wherein the 'curse of dimensionality' is a minor concern. The proposed approach is illustrated using real-scale systems and comparisons with commercial programming solvers. Finally, generalizations to consider other EDS operation problems are also discussed.

  15. A Scheme to Optimize Flow Routing and Polling Switch Selection of Software Defined Networks.

    Directory of Open Access Journals (Sweden)

    Huan Chen

    This paper aims at minimizing the communication cost for collecting flow information in Software Defined Networks (SDN). Since the flow-based information collecting method requires too much communication cost, and the switch-based method proposed recently cannot benefit from controlling flow routing, we propose to jointly optimize flow routing and polling switch selection to reduce the communication cost. To this end, the joint optimization problem is first formulated as an Integer Linear Programming (ILP) model. Since the ILP model is intractable in large networks, we also design an optimal algorithm for multi-rooted tree topologies and an efficient heuristic algorithm for general topologies. Extensive simulations show that our method can save up to 55.76% of the communication cost compared with the state-of-the-art switch-based scheme.

  16. A Scheme to Optimize Flow Routing and Polling Switch Selection of Software Defined Networks.

    Science.gov (United States)

    Chen, Huan; Li, Lemin; Ren, Jing; Wang, Yang; Zhao, Yangming; Wang, Xiong; Wang, Sheng; Xu, Shizhong

    2015-01-01

    This paper aims at minimizing the communication cost for collecting flow information in Software Defined Networks (SDN). Since the flow-based information collecting method requires too much communication cost, and the switch-based method proposed recently cannot benefit from controlling flow routing, we propose to jointly optimize flow routing and polling switch selection to reduce the communication cost. To this end, the joint optimization problem is first formulated as an Integer Linear Programming (ILP) model. Since the ILP model is intractable in large networks, we also design an optimal algorithm for multi-rooted tree topologies and an efficient heuristic algorithm for general topologies. Extensive simulations show that our method can save up to 55.76% of the communication cost compared with the state-of-the-art switch-based scheme.
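
    The joint optimization the abstract describes can be made concrete with a toy model (a hypothetical sketch, not the paper's ILP): each flow picks one of its candidate routes, and a set of switches is polled so that every flow traverses at least one polled switch, at minimum total polling cost. For tiny instances the joint choice can be found by exhaustive search:

```python
from itertools import product, chain, combinations

def min_polling_cost(flows, poll_cost):
    """Brute-force joint choice of one route per flow and a set of polled
    switches covering every flow, minimizing total polling cost.
    flows: list of candidate routes per flow; each route is a set of switches.
    poll_cost: dict switch -> cost of polling that switch."""
    switches = set(chain.from_iterable(chain.from_iterable(flows)))

    def subsets(s):
        s = list(s)
        return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))

    best = (float("inf"), None, None)
    for routes in product(*flows):            # one route per flow
        for polled in subsets(switches):      # candidate polling set
            polled = set(polled)
            # every flow must cross at least one polled switch
            if all(route & polled for route in routes):
                cost = sum(poll_cost[s] for s in polled)
                if cost < best[0]:
                    best = (cost, routes, polled)
    return best
```

    On a two-flow example, routing both flows through the cheap switch 'b' lets a single polled switch observe all traffic, which is the kind of gain the joint formulation captures over fixing routes first.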

  17. Polyhedral and semidefinite programming methods in combinatorial optimization

    CERN Document Server

    Tunçel, Levent

    2010-01-01

    Since the early 1960s, polyhedral methods have played a central role in both the theory and practice of combinatorial optimization. Since the early 1990s, a new technique, semidefinite programming, has been increasingly applied to some combinatorial optimization problems. The semidefinite programming problem is the problem of optimizing a linear function of matrix variables, subject to finitely many linear inequalities and the positive semidefiniteness condition on some of the matrix variables. On certain problems, such as maximum cut, maximum satisfiability, maximum stable set and geometric r

  18. Bias due to sample selection in propensity score matching for a supportive housing program evaluation in New York City.

    Directory of Open Access Journals (Sweden)

    Sungwoo Lim

    OBJECTIVES: Little is known about influences of sample selection on estimation in propensity score matching. The purpose of the study was to assess potential selection bias using one-to-one greedy matching versus optimal full matching as part of an evaluation of supportive housing in New York City (NYC). STUDY DESIGN AND SETTINGS: Data came from administrative data for 2 groups of applicants who were eligible for an NYC supportive housing program in 2007-09, including chronically homeless adults with a substance use disorder and young adults aging out of foster care. We evaluated the 2 matching methods in their ability to balance covariates and represent the original population, and in how those methods affected outcomes related to Medicaid expenditures. RESULTS: In the population with a substance use disorder, only optimal full matching performed well in balancing covariates, whereas both methods created representative populations. In the young adult population, both methods balanced covariates effectively, but only optimal full matching created representative populations. In the young adult population, the impact of the program on Medicaid expenditures was attenuated when one-to-one greedy matching was used, compared with optimal full matching. CONCLUSION: Given covariate balancing with both methods, attenuated program impacts in the young adult population indicated that one-to-one greedy matching introduced selection bias.
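
    One-to-one greedy matching of the kind the study evaluates can be sketched as follows (a minimal illustration with made-up propensity scores, not the study's implementation). Treated units left without an unused control inside the caliper are dropped, which is exactly how sample selection arises:

```python
def greedy_match(treated, controls, caliper=0.05):
    """One-to-one greedy nearest-neighbor matching on propensity scores.
    Each control is used at most once; treated units with no available
    control within the caliper are dropped from the matched sample."""
    available = dict(enumerate(controls))
    pairs, dropped = [], []
    # Greedy: match treated units in the given order to the nearest control.
    for ti, t in enumerate(treated):
        if not available:
            dropped.append(ti)
            continue
        ci = min(available, key=lambda i: abs(available[i] - t))
        if abs(available[ci] - t) <= caliper:
            pairs.append((ti, ci))
            del available[ci]          # control consumed, cannot be reused
        else:
            dropped.append(ti)
    return pairs, dropped
```

    Because an early treated unit can consume the only nearby control, a later unit is discarded even though a full-matching scheme (which allows variable matching ratios) would have retained it.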

  19. Doctoral Program Selection Using Pairwise Comparisons.

    Science.gov (United States)

    Tadisina, Suresh K.; Bhasin, Vijay

    1989-01-01

    The application of a pairwise comparison methodology (Saaty's Analytic Hierarchy Process) to the doctoral program selection process is illustrated. A hierarchy for structuring and facilitating the doctoral program selection decision is described. (Author/MLW)
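
    Saaty's pairwise comparison step can be illustrated with the row geometric-mean approximation of the priority vector (a common shortcut for the principal eigenvector; a sketch, not the authors' procedure):

```python
import math

def ahp_weights(M):
    """Approximate AHP priority weights from a pairwise comparison matrix M,
    where M[i][j] states how strongly criterion i is preferred over j,
    using the row geometric-mean method."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]   # geometric mean per row
    total = sum(gm)
    return [g / total for g in gm]                    # normalize to sum to 1
```

    For two criteria where the first is judged 3 times as important as the second, the matrix [[1, 3], [1/3, 1]] yields weights of 0.75 and 0.25, which is how a program-selection hierarchy turns judgments into scores.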

  20. Optimal tariff design under consumer self-selection

    Energy Technology Data Exchange (ETDEWEB)

    Raesaenen, M.; Ruusunen, J.; Haemaelaeinen, R.

    1995-12-31

    This report considers the design of electricity tariffs which guide an individual consumer to select the tariff designed for his consumption pattern. In the model, the utility maximizes the weighted sum of individual consumers' benefits of electricity consumption subject to the utility's revenue requirement constraints. The consumers' free choice of tariffs is ensured with the so-called self-selection constraints. The relationship between the consumers' optimal choice of tariffs and the weights in the aggregated consumers' benefit function is analyzed. If such weights exist, they will guarantee both the consumers' optimal choice of tariffs and efficient consumption patterns. The welfare effects are also analyzed using demand parameters estimated from a Finnish dynamic pricing experiment. The results indicate that it is possible to design an efficient tariff menu with the welfare losses caused by the self-selection constraints being small compared with the costs created when some consumers choose tariffs other than those assigned to them. (author)

  1. A multi-fidelity analysis selection method using a constrained discrete optimization formulation

    Science.gov (United States)

    Stults, Ian C.

    The purpose of this research is to develop a method for selecting the fidelity of contributing analyses in computer simulations. Model uncertainty is a significant component of result validity, yet it is neglected in most conceptual design studies. When it is considered, it is done so in only a limited fashion, and therefore brings the validity of selections made based on these results into question. Neglecting model uncertainty can potentially cause costly redesigns of concepts later in the design process or can even cause program cancellation. Rather than neglecting it, if one were to instead not only realize the model uncertainty in tools being used but also use this information to select the tools for a contributing analysis, studies could be conducted more efficiently and trust in results could be quantified. Methods for performing this are generally not rigorous or traceable, and in many cases the improvement and additional time spent performing enhanced calculations are washed out by less accurate calculations performed downstream. The intent of this research is to resolve this issue by providing a method which will minimize the amount of time spent conducting computer simulations while meeting accuracy and concept resolution requirements for results. In many conceptual design programs, only limited data is available for quantifying model uncertainty. Because of this data sparsity, traditional probabilistic means for quantifying uncertainty should be reconsidered. This research proposes to instead quantify model uncertainty using an evidence theory formulation (also referred to as Dempster-Shafer theory) in lieu of the traditional probabilistic approach. Specific weaknesses in using evidence theory for quantifying model uncertainty are identified and addressed for the purposes of the Fidelity Selection Problem. A series of experiments was conducted to address these weaknesses using n-dimensional optimization test functions. 
These experiments found that model

  2. Behavioral optimization models for multicriteria portfolio selection

    Directory of Open Access Journals (Sweden)

    Mehlawat Mukesh Kumar

    2013-01-01

    In this paper, the behavioral construct of suitability is used to develop a multicriteria decision-making framework for portfolio selection. To achieve this purpose, we rely on multiple methodologies. The analytic hierarchy process technique is used to model the suitability considerations with a view to obtaining a suitability performance score for each asset. A fuzzy multiple criteria decision-making method is used to obtain the financial quality score of each asset based upon the investor's ratings on the financial criteria. Two optimization models are developed for optimal asset allocation, considering financial and suitability criteria simultaneously. An empirical study is conducted on randomly selected assets from the National Stock Exchange, Mumbai, India, to demonstrate the effectiveness of the proposed methodology.

  3. GPAW optimized for Blue Gene/P using hybrid programming

    DEFF Research Database (Denmark)

    Kristensen, Mads Ruben Burgdorff; Happe, Hans Henrik; Vinter, Brian

    2009-01-01

    In this work we present optimizations of a grid-based projector-augmented wave method software, GPAW, for the Blue Gene/P architecture. The improvements are achieved by exploiting the advantages of shared and distributed memory programming, also known as hybrid programming. The work focuses on optimizing a very time-consuming operation in GPAW, the finite-difference stencil operation, and different hybrid programming approaches are evaluated. The work succeeds in demonstrating a hybrid programming model which is clearly beneficial compared to the original flat programming model. In total, an improvement of 1.94 compared to the original implementation is obtained. The results we demonstrate here are reasonably general and may be applied to other finite difference codes.

  4. Portfolio optimization by using linear programming models based on genetic algorithm

    Science.gov (United States)

    Sukono; Hidayat, Y.; Lesmana, E.; Putra, A. S.; Napitupulu, H.; Supian, S.

    2018-01-01

    In this paper, we discuss investment portfolio optimization using a linear programming model based on genetic algorithms. It is assumed that the portfolio risk is measured by absolute standard deviation, and each investor has a risk tolerance on the investment portfolio. To solve the investment portfolio optimization problem, the issue is arranged into a linear programming model. Furthermore, the optimum solution of the linear program is determined by using a genetic algorithm. As a numerical illustration, we analyze some of the stocks traded on the capital market in Indonesia. Based on the analysis, it is shown that the portfolio optimization performed by the genetic algorithm approach produces a more efficient optimal portfolio than the portfolio optimization performed by a linear programming algorithm approach. Therefore, genetic algorithms can be considered as an alternative for determining the optimal investment portfolio, particularly using linear programming models.
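
    A genetic algorithm for this kind of portfolio problem can be sketched in a few lines (illustrative only; the fitness below scores mean return minus the mean absolute deviation risk measure named in the abstract, and the population size, mutation scale, and other parameters are hypothetical):

```python
import random

def ga_portfolio(returns, lam=1.0, pop=40, gens=60, seed=1):
    """Toy genetic algorithm for long-only portfolio weights.
    returns: list of periods, each a list of per-asset returns.
    Fitness = mean portfolio return - lam * mean absolute deviation."""
    rng = random.Random(seed)
    n_assets = len(returns[0])

    def normalize(w):
        s = sum(w)
        return [x / s for x in w] if s > 0 else [1.0 / n_assets] * n_assets

    def fitness(w):
        series = [sum(wi * r for wi, r in zip(w, period)) for period in returns]
        mean = sum(series) / len(series)
        mad = sum(abs(x - mean) for x in series) / len(series)
        return mean - lam * mad

    population = [normalize([rng.random() for _ in range(n_assets)])
                  for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop // 2]          # elitist selection
        children = []
        while len(survivors) + len(children) < pop:
            a, b = rng.sample(survivors, 2)
            child = [(x + y) / 2 for x, y in zip(a, b)]         # crossover
            i = rng.randrange(n_assets)
            child[i] = max(0.0, child[i] + rng.gauss(0, 0.05))  # mutation
            children.append(normalize(child))
        population = survivors + children
    return max(population, key=fitness)
```

    With one steady asset and one zero-mean volatile asset, the population drifts toward the steady asset, as the risk-adjusted fitness would suggest.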

  5. Dynamic optimization approach for integrated supplier selection and tracking control of single product inventory system with product discount

    Science.gov (United States)

    Sutrisno; Widowati; Heru Tjahjana, R.

    2017-01-01

    In this paper, we propose a mathematical model in the form of dynamic/multi-stage optimization to solve an integrated supplier selection problem and tracking control problem of single product inventory system with product discount. The product discount will be stated as a piece-wise linear function. We use dynamic programming to solve this proposed optimization to determine the optimal supplier and the optimal product volume that will be purchased from the optimal supplier for each time period so that the inventory level tracks a reference trajectory given by decision maker with minimal total cost. We give a numerical experiment to evaluate the proposed model. From the result, the optimal supplier was determined for each time period and the inventory level follows the given reference well.
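
    The multi-stage structure the abstract describes can be sketched as a small dynamic program (a simplified, hypothetical instance, not the paper's model): the state is the inventory level, the decision at each period is which supplier to buy from and how much, the purchase cost is piecewise linear to model the discount, and a tracking penalty pulls the inventory toward the reference trajectory:

```python
def plan(periods, suppliers, demand, ref, max_inv=10):
    """DP sketch of joint supplier selection and inventory tracking.
    suppliers: list of (name, unit_price, discount_threshold, discount_price);
    buying q units costs unit_price up to the threshold, discount_price after.
    Returns (minimal total cost, [(supplier, quantity) per period])."""
    def buy_cost(sup, q):
        name, price, thr, dprice = sup
        return price * min(q, thr) + dprice * max(0, q - thr)

    # best[inv] = (minimal cost to reach inventory level inv, decisions so far)
    best = {0: (0.0, [])}
    for t in range(periods):
        nxt = {}
        for inv, (cost, decisions) in best.items():
            for sup in suppliers:
                for q in range(0, max_inv + 1):
                    new_inv = inv + q - demand[t]
                    if not 0 <= new_inv <= max_inv:
                        continue          # demand unmet or storage exceeded
                    c = cost + buy_cost(sup, q) + abs(new_inv - ref[t])
                    if new_inv not in nxt or c < nxt[new_inv][0]:
                        nxt[new_inv] = (c, decisions + [(sup[0], q)])
        best = nxt
    return min(best.values())
```

    With a single supplier, unit demand, and a zero reference level, the optimum is simply to buy one unit per period; richer instances trade purchase discounts against tracking penalties.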

  6. Probabilistic methods for maintenance program optimization

    International Nuclear Information System (INIS)

    Liming, J.K.; Smith, M.J.; Gekler, W.C.

    1989-01-01

    In today's regulatory and economic environments, it is more important than ever that managers, engineers, and plant staff join together in developing and implementing effective management plans for safety and economic risk. This need applies to both power generating stations and other process facilities. One of the most critical parts of these management plans is the development and continuous enhancement of a maintenance program that optimizes plant or facility safety and profitability. The ultimate objective is to maximize the potential for station or facility success, usually measured in terms of projected financial profitability, while meeting or exceeding meaningful and reasonable safety goals, usually measured in terms of projected damage or consequence frequencies. This paper describes the use of the latest concepts in developing and evaluating maintenance programs to achieve maintenance program optimization (MPO). These concepts are based on significant field experience gained through the integration and application of fundamentals developed for industry and Electric Power Research Institute (EPRI)-sponsored projects on preventive maintenance (PM) program development and reliability-centered maintenance (RCM)

  7. Optimizing Energy and Modulation Selection in Multi-Resolution Modulation For Wireless Video Broadcast/Multicast

    KAUST Repository

    She, James

    2009-11-01

    Emerging technologies in Broadband Wireless Access (BWA) networks and video coding have enabled high-quality wireless video broadcast/multicast services in metropolitan areas. Joint source-channel coded wireless transmission, especially using hierarchical/superposition coded modulation at the channel, is recognized as an effective and scalable approach to increase the system scalability while tackling the multi-user channel diversity problem. The power allocation and modulation selection problem, however, is subject to a high computational complexity due to the nonlinear formulation and huge solution space. This paper introduces a dynamic programming framework with conditioned parsing, which significantly reduces the search space. The optimized result is further verified with experiments using real video content. The proposed approach effectively serves as a generalized and practical optimization framework that can gauge and optimize a scalable wireless video broadcast/multicast based on multi-resolution modulation in any BWA network.

  8. Optimizing Energy and Modulation Selection in Multi-Resolution Modulation For Wireless Video Broadcast/Multicast

    KAUST Repository

    She, James; Ho, Pin-Han; Shihada, Basem

    2009-01-01

    Emerging technologies in Broadband Wireless Access (BWA) networks and video coding have enabled high-quality wireless video broadcast/multicast services in metropolitan areas. Joint source-channel coded wireless transmission, especially using hierarchical/superposition coded modulation at the channel, is recognized as an effective and scalable approach to increase the system scalability while tackling the multi-user channel diversity problem. The power allocation and modulation selection problem, however, is subject to a high computational complexity due to the nonlinear formulation and huge solution space. This paper introduces a dynamic programming framework with conditioned parsing, which significantly reduces the search space. The optimized result is further verified with experiments using real video content. The proposed approach effectively serves as a generalized and practical optimization framework that can gauge and optimize a scalable wireless video broadcast/multicast based on multi-resolution modulation in any BWA network.

  9. An Optimization Model For Strategy Decision Support to Select Kind of CPO’s Ship

    Science.gov (United States)

    Suaibah Nst, Siti; Nababan, Esther; Mawengkang, Herman

    2018-01-01

    The selection of marine transport for the distribution of crude palm oil (CPO) is one strategy that can be considered for reducing transport costs. The cost of transporting CPO from one area to a CPO factory located at the port of destination may affect the level of CPO prices and the number of demands. In order to maintain the availability of CPO, a strategy is required to minimize the cost of transporting. In this study, the strategy is to select the kind of chartered ship, either a barge or a chemical tanker. This study aims to determine an optimization model for strategy decision support in selecting the kind of CPO ship by minimizing transport costs. Because the selection of ship involves randomness, a two-stage stochastic programming model is used to select the kind of ship. The model can help decision makers select either a barge or a chemical tanker to distribute CPO.

  10. Designing optimal food intake patterns to achieve nutritional goals for Japanese adults through the use of linear programming optimization models.

    Science.gov (United States)

    Okubo, Hitomi; Sasaki, Satoshi; Murakami, Kentaro; Yokoyama, Tetsuji; Hirota, Naoko; Notsu, Akiko; Fukui, Mitsuru; Date, Chigusa

    2015-06-06

    Simultaneous dietary achievement of a full set of nutritional recommendations is difficult. Diet optimization model using linear programming is a useful mathematical means of translating nutrient-based recommendations into realistic nutritionally-optimal food combinations incorporating local and culture-specific foods. We used this approach to explore optimal food intake patterns that meet the nutrient recommendations of the Dietary Reference Intakes (DRIs) while incorporating typical Japanese food selections. As observed intake values, we used the food and nutrient intake data of 92 women aged 31-69 years and 82 men aged 32-69 years living in three regions of Japan. Dietary data were collected with semi-weighed dietary record on four non-consecutive days in each season of the year (16 days total). The linear programming models were constructed to minimize the differences between observed and optimized food intake patterns while also meeting the DRIs for a set of 28 nutrients, setting energy equal to estimated requirements, and not exceeding typical quantities of each food consumed by each age (30-49 or 50-69 years) and gender group. We successfully developed mathematically optimized food intake patterns that met the DRIs for all 28 nutrients studied in each sex and age group. Achieving nutritional goals required minor modifications of existing diets in older groups, particularly women, while major modifications were required to increase intake of fruit and vegetables in younger groups of both sexes. Across all sex and age groups, optimized food intake patterns demanded greatly increased intake of whole grains and reduced-fat dairy products in place of intake of refined grains and full-fat dairy products. Salt intake goals were the most difficult to achieve, requiring marked reduction of salt-containing seasoning (65-80%) in all sex and age groups. Using a linear programming model, we identified optimal food intake patterns providing practical food choices and

  11. Computer program for optimal BWR control rod programming

    International Nuclear Information System (INIS)

    Taner, M.S.; Levine, S.H.; Carmody, J.M.

    1995-01-01

    A fully automated computer program has been developed for designing optimal control rod (CR) patterns for boiling water reactors (BWRs). The new program, called OCTOPUS-3, is based on the OCTOPUS code and employs SIMULATE-3 (Ref. 2) for the analysis. There are three aspects of OCTOPUS-3 that make it successful for use at PECO Energy. It incorporates a new feasibility algorithm that makes the CR design meet all constraints, it has been coupled to a Bourne Shell program (Ref. 3) to allow the user to run the code interactively without the need for a manual, and it develops a low axial peak to extend the cycle. For PECO Energy Co.'s Limerick units, it increased the energy output by 1 to 2% over the traditional PECO Energy design. The objective of the optimization in OCTOPUS-3 is to approximate a very low axially peaked target power distribution while maintaining criticality, keeping the nodal and assembly peaks below the allowed maximum, and meeting the other constraints. The user-specified input for each exposure point includes: CR groups allowed to move, target k-eff, and amount of core flow. The OCTOPUS-3 code uses the CR pattern from the previous step as the initial guess unless indicated otherwise

  12. Stress-constrained truss topology optimization problems that can be solved by linear programming

    DEFF Research Database (Denmark)

    Stolpe, Mathias; Svanberg, Krister

    2004-01-01

    We consider the problem of simultaneously selecting the material and determining the area of each bar in a truss structure in such a way that the cost of the structure is minimized subject to stress constraints under a single load condition. We show that such problems can be solved by linear programming to give the global optimum, and that two different materials are always sufficient in an optimal structure.

  13. Training set optimization under population structure in genomic selection.

    Science.gov (United States)

    Isidro, Julio; Jannink, Jean-Luc; Akdemir, Deniz; Poland, Jesse; Heslot, Nicolas; Sorrells, Mark E

    2015-01-01

    Population structure must be evaluated before optimization of the training set population. Maximizing the phenotypic variance captured by the training set is important for optimal performance. The optimization of the training set (TRS) in genomic selection has received much interest in both animal and plant breeding, because it is critical to the accuracy of the prediction models. In this study, five different TRS sampling algorithms, stratified sampling, mean of the coefficient of determination (CDmean), mean of predictor error variance (PEVmean), stratified CDmean (StratCDmean) and random sampling, were evaluated for prediction accuracy in the presence of different levels of population structure. In the presence of population structure, the most phenotypic variation captured by a sampling method in the TRS is desirable. The wheat dataset showed mild population structure, and CDmean and stratified CDmean methods showed the highest accuracies for all the traits except for test weight and heading date. The rice dataset had strong population structure and the approach based on stratified sampling showed the highest accuracies for all traits. In general, CDmean minimized the relationship between genotypes in the TRS, maximizing the relationship between TRS and the test set. This makes it suitable as an optimization criterion for long-term selection. Our results indicated that the best selection criterion used to optimize the TRS seems to depend on the interaction of trait architecture and population structure.

  14. Lean and Efficient Software: Whole-Program Optimization of Executables

    Science.gov (United States)

    2015-09-30

    "Lean and Efficient Software: Whole-Program Optimization of Executables." Project Summary Report #5 (Report Period: 7/1/2015 to 9/30/2015).

  15. Multi-Objective Stochastic Optimization Programs for a Non-Life Insurance Company under Solvency Constraints

    Directory of Open Access Journals (Sweden)

    Massimiliano Kaucic

    2015-09-01

    In the paper, we introduce a multi-objective scenario-based optimization approach for chance-constrained portfolio selection problems. More specifically, a modified version of the normal constraint method is implemented with a global solver in order to generate a dotted approximation of the Pareto frontier for bi- and tri-objective programming problems. Numerical experiments are carried out on a set of portfolios to be optimized for an EU-based non-life insurance company. Both performance indicators and risk measures are managed as objectives. Results show that this procedure is effective and readily applicable to achieve suitable risk-reward tradeoff analysis.

  16. Dynamic Programming Optimization of Multi-rate Multicast Video-Streaming Services

    Directory of Open Access Journals (Sweden)

    Nestor Michael Caños Tiglao

    2010-06-01

    In large-scale IP Television (IPTV) and Mobile TV distributions, the video signal is typically encoded and transmitted using several quality streams, over IP Multicast channels, to several groups of receivers, which are classified in terms of their reception rate. As the number of video streams is usually constrained by both the number of TV channels and the maximum capacity of the content distribution network, it is necessary to find the selection of video stream transmission rates that maximizes the overall user satisfaction. In order to efficiently solve this problem, this paper proposes the Dynamic Programming Multi-rate Optimization (DPMO) algorithm. The latter was comparatively evaluated considering several user distributions, featuring different access rate patterns. The experimental results reveal that DPMO is significantly more efficient than exhaustive search, while presenting slightly higher execution times than the non-optimal Multi-rate Step Search (MSS) algorithm.
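
    The underlying selection problem can be stated compactly (a toy exhaustive-search baseline, not the paper's DPMO algorithm): pick k stream rates from a candidate set so that each user receives the fastest chosen stream not exceeding its access rate, maximizing total received rate:

```python
from itertools import combinations

def best_rates(candidate_rates, user_caps, k):
    """Exhaustive-search baseline for multi-rate stream selection.
    Each user receives the highest selected rate <= its access capacity
    (0 if no selected stream fits); maximize the sum over users."""
    def utility(rates):
        total = 0
        for cap in user_caps:
            feasible = [r for r in rates if r <= cap]
            total += max(feasible, default=0)
        return total
    return max(combinations(sorted(candidate_rates), k), key=utility)
```

    For candidate rates {1, 2, 4, 8} Mbps, users with capacities {2, 2, 8}, and k = 2 streams, the best choice serves the two slow users at 2 and the fast user at 8. The paper's dynamic program reaches the same optimum without enumerating all rate combinations.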

  17. Mathematical programming methods for large-scale topology optimization problems

    DEFF Research Database (Denmark)

    Rojas Labanda, Susana

    for mechanical problems, but has rapidly extended to many other disciplines, such as fluid dynamics and biomechanical problems. However, the novelty and improvement of optimization methods have been very limited. It is, indeed, necessary to develop new optimization methods to improve the final designs......, and at the same time, reduce the number of function evaluations. Nonlinear optimization methods, such as sequential quadratic programming and interior point solvers, have almost not been embraced by the topology optimization community. Thus, this work is focused on the introduction of this kind of second...... for the classical minimum compliance problem. Two of the state-of-the-art optimization algorithms are investigated and implemented for this structural topology optimization problem. A Sequential Quadratic Programming (TopSQP) and an interior point method (TopIP) are developed exploiting the specific mathematical...

  18. Optimal infrastructure selection to boost regional sustainable economy

    OpenAIRE

    Martín Utrillas, Manuel Guzmán; Juan-Garcia, F.; Cantó Perelló, Julián; Curiel Esparza, Jorge

    2015-01-01

    The role of infrastructures in boosting the economic growth of regions is widely recognized. In many cases, an infrastructure is selected for subjective reasons. Selection of the optimal infrastructure for the sustainable economic development of a region should be based on objective and well-founded reasons, not only economic, but also environmental and social. In this paper, such a selection is developed through a hybrid method based on Delphi, the analytical hierarchy process (AHP), and VIKOR (from Se...

  19. Optimal Contracting under Adverse Selection

    DEFF Research Database (Denmark)

    Lenells, Jonatan; Stea, Diego; Foss, Nicolai Juul

    2015-01-01

    We study a model of adverse selection, hard and soft information, and mentalizing ability--the human capacity to represent others' intentions, knowledge, and beliefs. By allowing for a continuous range of different information types, as well as for different means of acquiring information, we dev...... of that information. This strategy affects the properties of the optimal contract, which grows closer to the first best. This research provides insights into the implications of mentalizing for agency theory....

  20. Pareto optimization in algebraic dynamic programming.

    Science.gov (United States)

    Saule, Cédric; Giegerich, Robert

    2015-01-01

    Pareto optimization combines independent objectives by computing the Pareto front of its search space, defined as the set of all solutions for which no other candidate solution scores better under all objectives. This gives, in a precise sense, better information than an artificial amalgamation of different scores into a single objective, but is more costly to compute. Pareto optimization naturally occurs with genetic algorithms, albeit in a heuristic fashion. Non-heuristic Pareto optimization so far has been used only with a few applications in bioinformatics. We study exact Pareto optimization for two objectives in a dynamic programming framework. We define a binary Pareto product operator [Formula: see text] on arbitrary scoring schemes. Independent of a particular algorithm, we prove that for two scoring schemes A and B used in dynamic programming, the scoring scheme [Formula: see text] correctly performs Pareto optimization over the same search space. We study different implementations of the Pareto operator with respect to their asymptotic and empirical efficiency. Without artificial amalgamation of objectives, and with no heuristics involved, Pareto optimization is faster than computing the same number of answers separately for each objective. For RNA structure prediction under the minimum free energy versus the maximum expected accuracy model, we show that the empirical size of the Pareto front remains within reasonable bounds. Pareto optimization lends itself to the comparative investigation of the behavior of two alternative scoring schemes for the same purpose. For the above scoring schemes, we observe that the Pareto front can be seen as a composition of a few macrostates, each consisting of several microstates that differ in the same limited way. We also study the relationship between abstract shape analysis and the Pareto front, and find that they extract information of a different nature from the folding space and can be meaningfully combined.
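
    The core definition in the abstract, the set of solutions not dominated under all objectives, can be sketched directly (a quadratic-time illustration for two objectives to be maximized, assuming distinct candidate points; the paper's contribution is computing this inside dynamic programming rather than by filtering):

```python
def pareto_front(points):
    """Pareto front for two objectives, both maximized: keep every point p
    for which no other point q scores at least as well on both objectives
    (and therefore at least as well overall)."""
    front = []
    for p in points:
        dominated = any(q != p and q[0] >= p[0] and q[1] >= p[1]
                        for q in points)
        if not dominated:
            front.append(p)
    return front
```

    For instance, (2, 2) is dominated by (2, 4) and drops out, while incomparable points such as (3, 1) and (0, 6) all remain on the front.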

  1. A program package for solving linear optimization problems

    International Nuclear Information System (INIS)

    Horikami, Kunihiko; Fujimura, Toichiro; Nakahara, Yasuaki

    1980-09-01

    Seven computer programs for the solution of linear, integer and quadratic programming problems (four programs for linear programming, one for integer programming and two for quadratic programming) have been prepared and tested on the FACOM M200 computer, and auxiliary programs have been written to make the optimization program package easy to use. The characteristics of each program are explained, and detailed input/output descriptions are given so that users know how to use them. (author)

  2. Generic Optimization Program User Manual Version 3.0.0

    International Nuclear Information System (INIS)

    Wetter, Michael

    2009-01-01

    GenOpt is an optimization program for the minimization of a cost function that is evaluated by an external simulation program. It has been developed for optimization problems where the cost function is computationally expensive and its derivatives are not available or may not even exist. GenOpt can be coupled to any simulation program that reads its input from text files and writes its output to text files. The independent variables can be continuous variables (possibly with lower and upper bounds), discrete variables, or both. Constraints on dependent variables can be implemented using penalty or barrier functions. GenOpt uses parallel computing to evaluate the simulations. GenOpt has a library of local and global multi-dimensional and one-dimensional optimization algorithms, and algorithms for parametric runs. An algorithm interface allows adding new minimization algorithms without knowledge of the details of the program structure. GenOpt is written in Java, so it is platform independent. The platform independence and the general interface make GenOpt applicable to a wide range of optimization problems. GenOpt has not been designed for linear programming problems, quadratic programming problems, or problems where the gradient of the cost function is available; for such problems, specially tailored software exists that is more efficient.

  3. Generic Optimization Program User Manual Version 3.0.0

    Energy Technology Data Exchange (ETDEWEB)

    Wetter, Michael

    2009-05-11

    GenOpt is an optimization program for the minimization of a cost function that is evaluated by an external simulation program. It has been developed for optimization problems where the cost function is computationally expensive and its derivatives are not available or may not even exist. GenOpt can be coupled to any simulation program that reads its input from text files and writes its output to text files. The independent variables can be continuous variables (possibly with lower and upper bounds), discrete variables, or both. Constraints on dependent variables can be implemented using penalty or barrier functions. GenOpt uses parallel computing to evaluate the simulations. GenOpt has a library of local and global multi-dimensional and one-dimensional optimization algorithms, and algorithms for parametric runs. An algorithm interface allows adding new minimization algorithms without knowledge of the details of the program structure. GenOpt is written in Java, so it is platform independent. The platform independence and the general interface make GenOpt applicable to a wide range of optimization problems. GenOpt has not been designed for linear programming problems, quadratic programming problems, or problems where the gradient of the cost function is available; for such problems, specially tailored software exists that is more efficient.
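    The penalty-function treatment of dependent-variable constraints mentioned in both records can be sketched as follows; the objective, constraint, and penalty weight here are hypothetical and unrelated to GenOpt's actual implementation:

```python
def penalized(f, g, mu):
    # augment objective f with a quadratic penalty for violating g(x) <= 0
    return lambda x: f(x) + mu * max(0.0, g(x)) ** 2

f = lambda x: (x - 3.0) ** 2     # unconstrained minimum at x = 3
g = lambda x: x - 2.0            # constraint g(x) <= 0, i.e. x <= 2
F = penalized(f, g, mu=100.0)

# coarse parametric scan of [0, 4], step 0.01 (stand-in for a real optimizer)
xs = [i / 100.0 for i in range(401)]
x_best = min(xs, key=F)          # lands just above the boundary x = 2
```

    With a finite penalty weight the minimizer slightly violates the constraint (here x_best = 2.01 on this grid); increasing mu, or switching to a barrier function, pushes the solution toward strict feasibility.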

  4. Discrete Biogeography Based Optimization for Feature Selection in Molecular Signatures.

    Science.gov (United States)

    Liu, Bo; Tian, Meihong; Zhang, Chunhua; Li, Xiangtao

    2015-04-01

    Biomarker discovery from high-dimensional data is a complex task in the development of efficient cancer diagnosis and classification. However, these data are usually redundant and noisy, and only a subset of them presents distinct profiles for different classes of samples. Thus, selecting highly discriminative genes from gene expression data has become increasingly interesting in the field of bioinformatics. In this paper, a discrete biogeography based optimization is proposed to select a good subset of informative genes relevant to the classification. In the proposed algorithm, firstly, the Fisher-Markov selector is used to choose a fixed number of gene data. Secondly, to make biogeography based optimization suitable for the feature selection problem, a discrete migration model and a discrete mutation model are proposed to balance the exploration and exploitation ability. Discrete biogeography based optimization, called DBBO, is then constructed by integrating the discrete migration and mutation models. Finally, the DBBO method is used for feature selection, with three classifiers evaluated under 10-fold cross-validation. To show the effectiveness and efficiency of the algorithm, it is tested on four breast cancer benchmark datasets. In comparison with a genetic algorithm, particle swarm optimization, a differential evolution algorithm and hybrid biogeography based optimization, experimental results demonstrate that the proposed method is better than, or at least comparable with, previous methods from the literature when considering the quality of the solutions obtained. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Fuzzy Goal Programming Approach in Selective Maintenance Reliability Model

    Directory of Open Access Journals (Sweden)

    Neha Gupta

    2013-12-01

    Full Text Available In the present paper, we have considered the allocation problem of repairable components for a parallel-series system as a multi-objective optimization problem and have discussed two different models. In the first model the reliabilities of the subsystems are considered as different objectives. In the second model the cost and time spent on repairing the components are considered as two different objectives. These two models are formulated as multi-objective nonlinear programming problems (MONLPP), and a fuzzy goal programming method is used to work out the compromise allocation in the multi-objective selective maintenance reliability model: we define the membership function of each objective, transform the membership functions into equivalent linear ones by a first-order Taylor series, and finally, by forming a fuzzy goal programming model, obtain a desired compromise allocation of maintenance components. A numerical example is also worked out to illustrate the computational details of the method.

  6. Programming for Sparse Minimax Optimization

    DEFF Research Database (Denmark)

    Jonasson, K.; Madsen, Kaj

    1994-01-01

    We present an algorithm for nonlinear minimax optimization which is well suited for large and sparse problems. The method is based on trust regions and sequential linear programming. On each iteration, a linear minimax problem is solved for a basic step. If necessary, this is followed by the determination of a minimum norm corrective step based on a first-order Taylor approximation. No Hessian information needs to be stored. Global convergence is proved. This new method has been extensively tested and compared with other methods, including two well known codes for nonlinear programming.

  7. Optimal timing of joint replacement using mathematical programming and stochastic programming models.

    Science.gov (United States)

    Keren, Baruch; Pliskin, Joseph S

    2011-12-01

    The optimal timing for performing radical medical procedures such as joint (e.g., hip) replacement must be seriously considered. In this paper we show that under deterministic assumptions the optimal timing for joint replacement is the solution of a mathematical programming problem, and under stochastic assumptions the optimal timing can be formulated as a stochastic programming problem. We formulate deterministic and stochastic models that can serve as decision support tools. The results show that the benefit from joint replacement surgery is heavily dependent on timing. Moreover, for a special case where the patient's remaining life is normally distributed along with a normally distributed survival of the new joint, the expected benefit function from surgery is completely solved. This enables practitioners to draw the expected benefit graph, find the optimal timing, evaluate the benefit for each patient, set priorities among patients and decide whether and when joint replacement should be performed.

  8. Compensatory Analysis and Optimization for MADM for Heterogeneous Wireless Network Selection

    Directory of Open Access Journals (Sweden)

    Jian Zhou

    2016-01-01

    Full Text Available In next-generation heterogeneous wireless networks, a mobile terminal with multiple interfaces may have network access from different service providers using various technologies. In spite of this heterogeneity, seamless intersystem mobility is a mandatory requirement. One of the major challenges for seamless mobility is the design of a network selection scheme, by which users select the network with the best overall performance among the different types of networks. However, the network selected may not be the most reasonable one due to the compensation inherent in MADM (Multiple Attribute Decision Making); such a network is called a pseudo-optimal network. This paper conducts a performance evaluation of a number of widely used MADM-based methods for network selection that aim to keep mobile users always best connected anywhere and anytime, where both subjective and objective weights are considered. The performance analysis shows that the selection scheme based on MEW (weighted multiplicative method) and combination weight can better avoid accessing a pseudo-optimal network, balancing network load and reducing the ping-pong effect in comparison with three other MADM solutions.
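    The weighted multiplicative scoring that underlies MEW can be sketched as below; the attribute values and weights are hypothetical, and cost-type attributes (e.g. delay) are inverted so that larger scores are always better:

```python
def mew_score(attrs, weights, cost_attrs=()):
    # weighted multiplicative score: product of attribute**weight terms;
    # cost attributes are inverted so smaller raw values score higher
    score = 1.0
    for j, (x, w) in enumerate(zip(attrs, weights)):
        value = 1.0 / x if j in cost_attrs else x
        score *= value ** w
    return score

# two candidate networks: (bandwidth in Mbps, delay in ms); delay is a cost
net_a = mew_score([100.0, 20.0], [0.6, 0.4], cost_attrs={1})
net_b = mew_score([50.0, 10.0], [0.6, 0.4], cost_attrs={1})
```

    Because the score is a product, a very poor attribute cannot be fully compensated by a strong one; a zero-valued benefit attribute zeroes the entire score, which is why multiplicative methods compensate less than additive scoring and are less prone to picking pseudo-optimal networks.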

  9. Numerical methods of mathematical optimization with Algol and Fortran programs

    CERN Document Server

    Künzi, Hans P; Zehnder, C A; Rheinboldt, Werner

    1971-01-01

    Numerical Methods of Mathematical Optimization: With ALGOL and FORTRAN Programs reviews the theory and the practical application of the numerical methods of mathematical optimization. An ALGOL and a FORTRAN program were developed for each of the algorithms described in the theoretical section. This should result in easy access to the application of the different optimization methods. Comprised of four chapters, this volume begins with a discussion on the theory of linear and nonlinear optimization, with the main stress on an easily understood, mathematically precise presentation. In addition

  10. Optimized Power Allocation and Relay Location Selection in Cooperative Relay Networks

    Directory of Open Access Journals (Sweden)

    Jianrong Bao

    2017-01-01

    Full Text Available An incremental selection hybrid decode-amplify-forward (ISHDAF) scheme for two-hop single-relay systems and a relay selection strategy based on the hybrid decode-amplify-and-forward (HDAF) scheme for multirelay systems are proposed, along with an optimized power allocation, for the Internet of Things (IoT). Given total power as the constraint and outage probability as the objective function, the proposed scheme achieves power efficiency better than that of equal power allocation. By the ISHDAF scheme and the HDAF relay selection strategy, an optimized power allocation for both the source and relay nodes is obtained, as well as an effective reduction of outage probability. In addition, the optimal relay location for maximizing the gain of the proposed algorithm is investigated and designed. Simulation results show that, in both single-relay and multirelay selection systems, the proposed scheme obtains outage probability gains. Compared with equal power allocation, the optimized power allocation obtains gains of nearly 0.1695 in the ISHDAF single-relay network at a total power of 2 dB, and about 0.083 in the HDAF relay selection system with 2 relays at a total power of 2 dB.

  11. Optimality Theory and Lexical Interpretation and Selection

    NARCIS (Netherlands)

    Hogeweg, L.; Legendre, G.; Putnam, M.T.; de Swart, H.; Zaroukian, E.

    2016-01-01

    This chapter argues for an optimization approach to the selection and interpretation of words. Several advantages of such an approach to lexical semantics are discussed. First of all, it will be argued that competition, entailing that words and interpretations are always judged in relation to other

  12. A Pareto-optimal moving average multigene genetic programming model for daily streamflow prediction

    Science.gov (United States)

    Danandeh Mehr, Ali; Kahya, Ercan

    2017-06-01

    Genetic programming (GP) is able to systematically explore alternative model structures of different accuracy and complexity from observed input and output data. The effectiveness of GP in hydrological system identification has been recognized in recent studies. However, selecting a parsimonious (accurate and simple) model from such alternatives still remains a question. This paper proposes a Pareto-optimal moving average multigene genetic programming (MA-MGGP) approach to develop a parsimonious model for single-station streamflow prediction. The three main components of the approach that take us from observed data to a validated model are: (1) data pre-processing, (2) system identification and (3) system simplification. The data pre-processing component uses a simple moving average filter to diminish the lagged prediction effect of stand-alone data-driven models. The multigene component tends to identify the underlying nonlinear system with expressions simpler than classical monolithic GP, and finally the simplification component exploits a Pareto front plot to select a parsimonious model through an interactive complexity-efficiency trade-off. The approach was tested using the daily streamflow records from a station on Senoz Stream, Turkey. Compared to the efficiency results of stand-alone GP, MGGP, and conventional multiple linear regression prediction models as benchmarks, the proposed Pareto-optimal MA-MGGP model put forward a parsimonious solution of noteworthy practical importance. In addition, the approach allows the user to bring human insight into the problem, examine the evolved models and pick out the best performing programs for further analysis.
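    The moving-average pre-processing step can be sketched as a simple filter over the streamflow series (the window length here is hypothetical; the paper's choice may differ):

```python
def moving_average(series, window):
    # simple moving average: mean of each length-`window` slice,
    # used to smooth the series before model identification
    if not 1 <= window <= len(series):
        raise ValueError("window must be between 1 and len(series)")
    return [sum(series[i:i + window]) / window
            for i in range(len(series) - window + 1)]

smoothed = moving_average([1.0, 2.0, 3.0, 4.0, 5.0], window=3)  # [2.0, 3.0, 4.0]
```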

  13. Optimizing Event Selection with the Random Grid Search

    Energy Technology Data Exchange (ETDEWEB)

    Bhat, Pushpalatha C. [Fermilab; Prosper, Harrison B. [Florida State U.; Sekmen, Sezen [Kyungpook Natl. U.; Stewart, Chip [Broad Inst., Cambridge

    2017-06-29

    The random grid search (RGS) is a simple, but efficient, stochastic algorithm to find optimal cuts that was developed in the context of the search for the top quark at Fermilab in the mid-1990s. The algorithm, and associated code, have been enhanced recently with the introduction of two new cut types, one of which has been successfully used in searches for supersymmetry at the Large Hadron Collider. The RGS optimization algorithm is described along with the recent developments, which are illustrated with two examples from particle physics. One explores the optimization of the selection of vector boson fusion events in the four-lepton decay mode of the Higgs boson and the other optimizes SUSY searches using boosted objects and the razor variables.
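    The core idea of the random grid search, as described above, is to take observed signal events themselves as candidate cut points rather than scanning a fixed grid. A minimal two-variable sketch, using a hypothetical s/sqrt(s+b) figure of merit and made-up event lists:

```python
import random

def random_grid_search(signal, background, n_trials=200, seed=1):
    # each trial draws a signal event (cx, cy) and uses it as a rectangular
    # cut "keep events with x >= cx and y >= cy"; cuts are scored by the
    # significance-like figure of merit s / sqrt(s + b)
    rng = random.Random(seed)
    best_cut, best_score = None, -1.0
    for _ in range(n_trials):
        cx, cy = rng.choice(signal)
        s = sum(1 for x, y in signal if x >= cx and y >= cy)
        b = sum(1 for x, y in background if x >= cx and y >= cy)
        score = s / (s + b) ** 0.5 if s + b else 0.0
        if score > best_score:
            best_cut, best_score = (cx, cy), score
    return best_cut, best_score

cut, score = random_grid_search(
    signal=[(0.8, 0.9), (0.7, 0.8), (0.9, 0.7), (0.6, 0.9)],
    background=[(0.1, 0.2), (0.2, 0.1), (0.3, 0.3)])
```

    Sampling cut points from the signal sample concentrates the trials where signal efficiency is nonzero, which is what makes RGS cheap compared with an exhaustive grid.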

  14. An efficient particle swarm approach for mixed-integer programming in reliability-redundancy optimization applications

    International Nuclear Information System (INIS)

    Santos Coelho, Leandro dos

    2009-01-01

    Reliability-redundancy optimization problems can involve the selection of components with multiple choices and redundancy levels that produce maximum benefits, subject to cost, weight, and volume constraints. Many classical mathematical methods have failed in handling nonconvexities and nonsmoothness in reliability-redundancy optimization problems. As an alternative to the classical optimization approaches, meta-heuristics have been given much attention by many researchers due to their ability to find almost globally optimal solutions. One of these meta-heuristics is particle swarm optimization (PSO). PSO is a population-based heuristic optimization technique inspired by the social behavior of bird flocking and fish schooling. This paper presents an efficient PSO algorithm based on Gaussian distribution and chaotic sequence (PSO-GC) to solve reliability-redundancy optimization problems. In this context, two examples of reliability-redundancy design problems are evaluated. Simulation results demonstrate that the proposed PSO-GC is a promising optimization technique. PSO-GC performs well for the two examples of mixed-integer programming in reliability-redundancy applications considered in this paper. The solutions obtained by the PSO-GC are better than the previously best-known solutions available in the recent literature.

  15. An efficient particle swarm approach for mixed-integer programming in reliability-redundancy optimization applications

    Energy Technology Data Exchange (ETDEWEB)

    Santos Coelho, Leandro dos [Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Pontifical Catholic University of Parana, PUCPR, Imaculada Conceicao, 1155, 80215-901 Curitiba, Parana (Brazil)], E-mail: leandro.coelho@pucpr.br

    2009-04-15

    Reliability-redundancy optimization problems can involve the selection of components with multiple choices and redundancy levels that produce maximum benefits, subject to cost, weight, and volume constraints. Many classical mathematical methods have failed in handling nonconvexities and nonsmoothness in reliability-redundancy optimization problems. As an alternative to the classical optimization approaches, meta-heuristics have been given much attention by many researchers due to their ability to find almost globally optimal solutions. One of these meta-heuristics is particle swarm optimization (PSO). PSO is a population-based heuristic optimization technique inspired by the social behavior of bird flocking and fish schooling. This paper presents an efficient PSO algorithm based on Gaussian distribution and chaotic sequence (PSO-GC) to solve reliability-redundancy optimization problems. In this context, two examples of reliability-redundancy design problems are evaluated. Simulation results demonstrate that the proposed PSO-GC is a promising optimization technique. PSO-GC performs well for the two examples of mixed-integer programming in reliability-redundancy applications considered in this paper. The solutions obtained by the PSO-GC are better than the previously best-known solutions available in the recent literature.

  16. Defense Acquisitions: Assessments of Selected Weapon Programs

    Science.gov (United States)

    2017-03-01

    [Text extraction fragments from GAO-17-333SP, Assessments of Selected Weapon Programs] Figure 17: Examples of Knowledge Scorecards. A partial passage notes that program officials had direct access to the USD AT&L and other senior acquisition officials, and that some approval authorities were delegated to lower levels.

  17. The New Multipoint Relays Selection in OLSR using Particle Swarm Optimization

    Directory of Open Access Journals (Sweden)

    Razali Ngah

    2012-06-01

    Full Text Available The standard Optimized Link State Routing (OLSR) protocol introduces an interesting concept, multipoint relays (MPRs), to mitigate message overhead during the flooding process. We propose a new algorithm for MPR selection to enhance the performance of OLSR using particle swarm optimization with a sigmoid increasing inertia weight (PSO-SIIW). The sigmoid increasing inertia weight significantly improves particle swarm optimization (PSO) in terms of simplicity and quick convergence towards the optimum solution. A new fitness function for PSO-SIIW, incorporating the packet delay of each node and the degree of willingness, is introduced to support MPR selection in OLSR. We examine the throughput, packet loss and end-to-end delay of the proposed method using network simulator 2 (ns2). Overall results indicate that OLSR-PSOSIIW shows good performance compared to the standard OLSR and OLSR-PSO, particularly for throughput and end-to-end delay. Generally the proposed OLSR-PSOSIIW shows the advantage of using PSO for optimizing routing paths in the MPR selection algorithm.

  18. A hybrid agent-based computational economics and optimization approach for supplier selection problem

    Directory of Open Access Journals (Sweden)

    Zahra Pourabdollahi

    2017-12-01

    Full Text Available Supplier evaluation and selection is among the most important logistics decisions and has been addressed extensively in supply chain management. The same logistics decision is also important in freight transportation, since it identifies trade relationships between business establishments and determines commodity flows between production and consumption points. The commodity flows are then used as input to freight transportation models to determine cargo movements and their characteristics, including mode choice and shipment size. Various approaches have been proposed to explore this latter problem in previous studies. Traditionally, potential suppliers are evaluated and selected using only price/cost as the influential criterion, together with state-of-practice methods. This paper introduces a hybrid agent-based computational economics and optimization approach for supplier selection. The proposed model combines an agent-based multi-criteria supplier evaluation approach with a multi-objective optimization model to capture both behavioral and economic aspects of the supplier selection process. The model uses a system of ordered response models to determine the importance weights of the different criteria in supplier evaluation from a buyer's point of view. The estimated weights are then used to calculate a utility for each potential supplier in the market and rank them. The calculated utilities are then entered into a mathematical programming model in which the best suppliers are selected by maximizing the total accrued utility for all buyers and minimizing total shipping costs, while balancing the capacity of potential suppliers to ensure market clearing. The proposed model was implemented under an operational agent-based supply chain and freight transportation framework for the Chicago Metropolitan Area.

  19. Optimality Conditions for Fuzzy Number Quadratic Programming with Fuzzy Coefficients

    Directory of Open Access Journals (Sweden)

    Xue-Gang Zhou

    2014-01-01

    Full Text Available The purpose of the present paper is to investigate optimality conditions and duality theory in fuzzy number quadratic programming (FNQP), in which the objective function is a fuzzy quadratic function with fuzzy number coefficients and the constraint set is given by fuzzy linear functions with fuzzy number coefficients. Firstly, the equivalent quadratic programming problem of FNQP is presented by utilizing a linear ranking function, and the dual of the fuzzy number quadratic programming primal problem is introduced. Secondly, we present optimality conditions for fuzzy number quadratic programming. We then prove several duality results for fuzzy number quadratic programming problems with fuzzy coefficients.

  20. Adaptive feature selection using v-shaped binary particle swarm optimization.

    Science.gov (United States)

    Teng, Xuyang; Dong, Hongbin; Zhou, Xiurong

    2017-01-01

    Feature selection is an important preprocessing method in machine learning and data mining. This process can be used not only to reduce the amount of data to be analyzed but also to build models with stronger interpretability based on fewer features. Traditional feature selection methods evaluate the dependency and redundancy of features separately, which leads to a lack of measurement of their combined effect. Moreover, a greedy search considers only the optimization of the current round and thus cannot be a global search. To evaluate the combined effect of different subsets in the entire feature space, an adaptive feature selection method based on V-shaped binary particle swarm optimization is proposed. In this method, the fitness function is constructed using the correlation information entropy. Feature subsets are regarded as individuals in a population, and the feature space is searched using V-shaped binary particle swarm optimization. The above procedure overcomes the hard constraint on the number of features, enables the combined evaluation of each subset as a whole, and improves the search ability of conventional binary particle swarm optimization. The proposed algorithm is an adaptive method with respect to the number of feature subsets. The experimental results show the advantages of optimizing the feature subsets using the V-shaped transfer function and confirm the effectiveness and efficiency of the feature subsets obtained under different classifiers.
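    The distinguishing element of V-shaped binary PSO is the transfer function: velocity magnitude is mapped to a probability of flipping a feature bit, rather than of setting it to 1 as in S-shaped variants. A sketch using |tanh| as one common V-shaped choice (the paper's exact transfer function may differ):

```python
import math
import random

def v_transfer(v):
    # V-shaped transfer function: symmetric in v, 0 at v = 0,
    # approaching 1 for large |v|; here |tanh(v)| is used
    return abs(math.tanh(v))

def update_bits(bits, velocities, rng=random):
    # flip each feature-selection bit with probability v_transfer(v);
    # zero velocity therefore leaves the current subset unchanged
    return [1 - b if rng.random() < v_transfer(v) else b
            for b, v in zip(bits, velocities)]
```

    The flip rule preserves the current subset when velocities are small, which is what gives V-shaped variants their stronger exploitation behavior compared with S-shaped binary PSO.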

  1. Review. Promises, pitfalls and challenges of genomic selection in breeding programs

    Energy Technology Data Exchange (ETDEWEB)

    Ibanez-Escriche, N.; Gonzalez-Recio, O.

    2011-07-01

    The aim of this work was to review the main challenges and pitfalls of the implementation of genomic selection in the breeding programs of different livestock species. Genomic selection is now one of the main challenges in animal breeding and genetics. Its application could considerably increase the genetic gain in traits of interest. However, the success of its practical implementation depends on the characteristics of the selection scheme, and these must be studied for each particular case. In dairy cattle, especially in Holsteins, genomic selection is a reality. However, in other livestock species (beef cattle, small ruminants, monogastrics and fish) genomic selection has mainly been used experimentally. The main limitation for its implementation in these livestock species is the high genotyping cost compared to the low selection value of the candidate. Nevertheless, the possibility of using single-nucleotide polymorphism (SNP) chips of low density to make genomic selection applications economically feasible is now under study. Economic studies may optimize the benefits of genomic selection (GS) to include new traits in the breeding goals. It is evident that genomic selection offers great potential; however, a suitable genotyping strategy and recording system is needed for each case in order to properly exploit it. (Author) 50 refs.

  2. SMART Optimization of a Parenting Program for Active Duty Families

    Science.gov (United States)

    2017-10-01

    [Report documentation fragments] Award number: W81XWH-16-1-0407; title: SMART Optimization of a Parenting Program for Active Duty Families; principal investigator: Abigail... The study tracks child and caregiver outcomes over time, based on a sample of 200 military personnel and their co-parents who have recently separated or will soon separate from...

  3. Parameter Optimization for Selected Correlation Analysis of Intracranial Pathophysiology

    Directory of Open Access Journals (Sweden)

    Rupert Faltermeier

    2015-01-01

    Full Text Available Recently we proposed a mathematical tool set, called selected correlation analysis, that reliably detects positive and negative correlations between arterial blood pressure (ABP) and intracranial pressure (ICP). Such correlations are associated with severe impairment of the cerebral autoregulation and intracranial compliance, as predicted by a mathematical model. The time-resolved selected correlation analysis is based on a windowing technique combined with Fourier-based coherence calculations and therefore depends on several parameters. For real-time application of this method in an ICU, it is essential to adjust this mathematical tool for high sensitivity and distinct reliability. In this study, we introduce a method to optimize the parameters of the selected correlation analysis by correlating an index, called selected correlation positive (SCP), with the outcome of the patients as represented by the Glasgow Outcome Scale (GOS). For that purpose, the data of twenty-five patients were used to calculate the SCP value for each patient and for a multitude of feasible parameter sets of the selected correlation analysis. It could be shown that an optimized set of parameters is able to improve the sensitivity of the method by a factor greater than four in comparison to our first analyses.

  4. Parameter Optimization for Selected Correlation Analysis of Intracranial Pathophysiology.

    Science.gov (United States)

    Faltermeier, Rupert; Proescholdt, Martin A; Bele, Sylvia; Brawanski, Alexander

    2015-01-01

    Recently we proposed a mathematical tool set, called selected correlation analysis, that reliably detects positive and negative correlations between arterial blood pressure (ABP) and intracranial pressure (ICP). Such correlations are associated with severe impairment of the cerebral autoregulation and intracranial compliance, as predicted by a mathematical model. The time-resolved selected correlation analysis is based on a windowing technique combined with Fourier-based coherence calculations and therefore depends on several parameters. For real-time application of this method in an ICU, it is essential to adjust this mathematical tool for high sensitivity and distinct reliability. In this study, we introduce a method to optimize the parameters of the selected correlation analysis by correlating an index, called selected correlation positive (SCP), with the outcome of the patients as represented by the Glasgow Outcome Scale (GOS). For that purpose, the data of twenty-five patients were used to calculate the SCP value for each patient and for a multitude of feasible parameter sets of the selected correlation analysis. It could be shown that an optimized set of parameters is able to improve the sensitivity of the method by a factor greater than four in comparison to our first analyses.
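    The time-resolved, windowed part of the analysis can be illustrated with a plain sliding-window Pearson correlation between the two pressure signals; this is a simplified time-domain stand-in for the Fourier-based coherence actually used, and the window and step sizes are hypothetical:

```python
def pearson(x, y):
    # sample Pearson correlation coefficient of two equal-length series
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return sxy / (sx * sy)

def windowed_correlation(abp, icp, window, step):
    # time-resolved correlation: one coefficient per window position
    return [pearson(abp[i:i + window], icp[i:i + window])
            for i in range(0, len(abp) - window + 1, step)]
```

    The window length and step are exactly the kind of parameters the study tunes against patient outcome: longer windows smooth out transient correlations, while shorter ones raise sensitivity at the cost of noise.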

  5. Optimized Policies for Improving Fairness of Location-based Relay Selection

    DEFF Research Database (Denmark)

    Nielsen, Jimmy Jessen; Olsen, Rasmus Løvenstein; Madsen, Tatiana Kozlova

    2013-01-01

    For WLAN systems in which relaying is used to improve throughput performance for nodes located at the cell edge, node mobility and information collection delays can have a significant impact on the performance of a relay selection scheme. In this paper we extend our existing Markov Chain modeling...... framework for relay selection to allow for efficient calculation of relay policies given either mean throughput or kth throughput percentile as optimization criterium. In a scenario with static access point, static relay, and a mobile destination node, the kth throughput percentile optimization...

  6. Quantum dot laser optimization: selectively doped layers

    Science.gov (United States)

    Korenev, Vladimir V.; Konoplev, Sergey S.; Savelyev, Artem V.; Shernyakov, Yurii M.; Maximov, Mikhail V.; Zhukov, Alexey E.

    2016-08-01

    Edge emitting quantum dot (QD) lasers are discussed. It has recently been proposed to use modulation p-doping of the layers adjacent to the QD layers in order to control the QDs' charge state. Experimentally, this has proven useful to enhance ground state lasing and suppress the onset of excited state lasing at high injection. These results have also been confirmed by numerical calculations involving the solution of drift-diffusion equations. However, a deep understanding of the physical reasons for such behavior, and laser optimization, require analytical approaches to the problem. In this paper, under a set of assumptions, we provide an analytical model that explains the major effects of selective p-doping. Capture rates of electrons and holes can be calculated by solving Poisson equations for electrons and holes around the charged QD layer. The charge itself is governed by the capture rates and the selective doping concentration. We analyzed this self-consistent set of equations and showed that it can be used to optimize QD laser performance and to explain the underlying physics.

  7. Quantum dot laser optimization: selectively doped layers

    International Nuclear Information System (INIS)

    Korenev, Vladimir V; Konoplev, Sergey S; Savelyev, Artem V; Shernyakov, Yurii M; Maximov, Mikhail V; Zhukov, Alexey E

    2016-01-01

Edge emitting quantum dot (QD) lasers are discussed. It has recently been proposed to use modulation p-doping of the layers adjacent to the QD layers in order to control the QDs' charge state. Experimentally this has proven useful for enhancing ground-state lasing and suppressing the onset of excited-state lasing at high injection. These results have also been confirmed by numerical calculations involving the solution of drift-diffusion equations. However, a deep understanding of the physical reasons for this behavior, and laser optimization, require analytical approaches to the problem. In this paper, under a set of assumptions, we provide an analytical model that explains the major effects of selective p-doping. Capture rates of electrons and holes can be calculated by solving Poisson equations for electrons and holes around the charged QD layer. The charge itself is governed by the capture rates and the selective doping concentration. We analyzed this self-consistent set of equations and showed that it can be used to optimize QD laser performance and to explain the underlying physics. (paper)

  8. A parallel optimization method for product configuration and supplier selection based on interval

    Science.gov (United States)

    Zheng, Jian; Zhang, Meng; Li, Guoxi

    2017-06-01

    In the process of design and manufacturing, product configuration is an important way of product development, and supplier selection is an essential component of supply chain management. To reduce the risk of procurement and maximize the profits of enterprises, this study proposes to combine the product configuration and supplier selection, and express the multiple uncertainties as interval numbers. An integrated optimization model of interval product configuration and supplier selection was established, and NSGA-II was put forward to locate the Pareto-optimal solutions to the interval multiobjective optimization model.

  9. Lean and Efficient Software: Whole Program Optimization of Executables

    Science.gov (United States)

    2016-12-31

Final Technical Report (Phase I, Base Period), 30-06-2014 to 31-12-2016: "Lean and Efficient Software: Whole-Program Optimization of Executables". Evan Driscoll and Tom Johnson, GrammaTech, Inc., 531 Esty Street, Ithaca, NY 14850.

  10. Exploration of automatic optimization for CUDA programming

    KAUST Repository

    Al-Mouhamed, Mayez; Khan, Ayaz ul Hassan

    2012-01-01

Graphics processing units (GPUs) are gaining ground in high-performance computing. CUDA (an extension to C) is the most widely used parallel programming framework for general-purpose GPU computation. However, writing an optimized CUDA program is complex even for experts. We present a method for restructuring loops into optimized CUDA kernels based on a three-step algorithm: loop tiling, coalesced memory access, and resource optimization. We also establish the relationships between the influencing parameters and propose a method for finding possible tiling solutions with coalesced memory access that best meet the identified constraints. We also present a simplified algorithm for restructuring loops and rewriting them as efficient CUDA kernels. The execution model of a synthesized kernel distributes the kernel threads uniformly to keep all cores busy, while staging data with tailored locality that is accessed in a coalesced pattern to amortize the long latency of the secondary memory. In the evaluation, we implement some simple applications using the proposed restructuring strategy and evaluate the performance in terms of execution time and GPU throughput. © 2012 IEEE.

  11. Exploration of automatic optimization for CUDA programming

    KAUST Repository

    Al-Mouhamed, Mayez

    2012-12-01

Graphics processing units (GPUs) are gaining ground in high-performance computing. CUDA (an extension to C) is the most widely used parallel programming framework for general-purpose GPU computation. However, writing an optimized CUDA program is complex even for experts. We present a method for restructuring loops into optimized CUDA kernels based on a three-step algorithm: loop tiling, coalesced memory access, and resource optimization. We also establish the relationships between the influencing parameters and propose a method for finding possible tiling solutions with coalesced memory access that best meet the identified constraints. We also present a simplified algorithm for restructuring loops and rewriting them as efficient CUDA kernels. The execution model of a synthesized kernel distributes the kernel threads uniformly to keep all cores busy, while staging data with tailored locality that is accessed in a coalesced pattern to amortize the long latency of the secondary memory. In the evaluation, we implement some simple applications using the proposed restructuring strategy and evaluate the performance in terms of execution time and GPU throughput. © 2012 IEEE.

  12. Portfolios with fuzzy returns: Selection strategies based on semi-infinite programming

    Science.gov (United States)

    Vercher, Enriqueta

    2008-08-01

    This paper provides new models for portfolio selection in which the returns on securities are considered fuzzy numbers rather than random variables. The investor's problem is to find the portfolio that minimizes the risk of achieving a return that is not less than the return of a riskless asset. The corresponding optimal portfolio is derived using semi-infinite programming in a soft framework. The return on each asset and their membership functions are described using historical data. The investment risk is approximated by mean intervals which evaluate the downside risk for a given fuzzy portfolio. This approach is illustrated with a numerical example.
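To illustrate the flavor of this approach (a minimal sketch, not the paper's actual model), the fragment below fits a triangular fuzzy number to each asset's historical returns, combines them with portfolio weights, and approximates downside risk by an interval of shortfalls below a riskless rate r0. The risk proxy, the data, and all names are assumptions made for illustration.

```python
import numpy as np

def triangular_from_history(returns):
    """Fit a triangular fuzzy number (min, median, max) to historical returns."""
    r = np.asarray(returns, dtype=float)
    return r.min(), float(np.median(r)), r.max()

def portfolio_fuzzy_return(weights, fuzzy_returns):
    """Weighted sum of triangular fuzzy numbers (non-negative weights)."""
    w = np.asarray(weights, dtype=float)
    return tuple(sum(wi * fr[k] for wi, fr in zip(w, fuzzy_returns))
                 for k in range(3))

def downside_risk(fuzzy_ret, r0):
    """Crude interval proxy: [worst-case, typical] shortfall below the riskless rate."""
    a, b, _ = fuzzy_ret
    return max(r0 - a, 0.0), max(r0 - b, 0.0)

history = {"A": [0.02, 0.05, -0.01, 0.04], "B": [0.01, 0.00, 0.02, 0.01]}
fuzzy = [triangular_from_history(v) for v in history.values()]
ret = portfolio_fuzzy_return([0.6, 0.4], fuzzy)   # fuzzy portfolio return (a, b, c)
risk = downside_risk(ret, r0=0.01)                 # shortfall interval below r0
```

The interval [worst, typical] plays the role of the paper's mean-interval downside risk evaluation; an optimizer would then search for weights that shrink this interval.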

  13. AHP-Based Optimal Selection of Garment Sizes for Online Shopping

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

Garment online shopping has been accepted by more and more consumers in recent years. In online shopping, a buyer chooses a garment size judged only by his own experience, without trying it on, so the selected garment may not be the best fit due to the variety of body figures. Thus, we propose a method for the optimal selection of garment sizes for online shopping based on the Analytic Hierarchy Process (AHP). The hierarchical structure model for optimal selection of garment sizes is constructed, and the best-fitting garment for a buyer is found by calculating the matching degrees between the individual's measurements and the corresponding key-part values of ready-to-wear clothing sizes. In order to demonstrate its feasibility, we provide an example of selecting the best-fitting sizes of men's bottoms. The result shows that the proposed method is useful in online clothing sales applications.
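The matching idea can be sketched as follows. This is a minimal illustration: the pairwise comparison matrix, key parts and size chart are invented, and geometric-mean weighting is used as a common stand-in for the paper's AHP details.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from a pairwise comparison matrix (geometric-mean method)."""
    M = np.asarray(pairwise, dtype=float)
    g = np.prod(M, axis=1) ** (1.0 / M.shape[0])
    return g / g.sum()

def matching_degree(body, size, weights):
    """Weighted closeness in (0, 1] between body measurements and a size's key parts."""
    body, size = np.asarray(body, float), np.asarray(size, float)
    closeness = 1.0 - np.abs(body - size) / np.maximum(body, size)
    return float(np.dot(weights, closeness))

# Illustrative key parts: waist, hip, inseam (cm); waist judged most important.
pairwise = [[1.0, 2.0, 3.0],
            [0.5, 1.0, 2.0],
            [1/3, 0.5, 1.0]]
w = ahp_weights(pairwise)
sizes = {"M": [78, 96, 78], "L": [84, 102, 80]}
body = [80, 97, 79]
best_size = max(sizes, key=lambda s: matching_degree(body, sizes[s], w))
```

The size with the highest weighted matching degree is recommended to the buyer.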

  14. Uncertain and multi-objective programming models for crop planting structure optimization

    Directory of Open Access Journals (Sweden)

    Mo LI,Ping GUO,Liudong ZHANG,Chenglong ZHANG

    2016-03-01

Full Text Available Crop planting structure optimization is a significant way to increase agricultural economic benefits and improve agricultural water management. The complexities of fluctuating stream conditions, varying economic profits, uncertainties and errors in estimated modeling parameters, as well as the interactions among economic, social, natural-resource and environmental aspects, have led to the necessity of developing optimization models for crop planting structure that consider uncertainty and multiple objectives. In this study, three single-objective programming models under uncertainty for crop planting structure optimization were developed: an interval linear programming model, an inexact fuzzy chance-constrained programming (IFCCP) model and an inexact fuzzy linear programming (IFLP) model. Each of the three models takes grey uncertainty into account. Moreover, the IFCCP model considers fuzzy uncertainty of parameters/variables and stochastic characteristics of constraints, while the IFLP model takes into account the fuzzy uncertainty of both constraints and objective functions. To satisfy the sustainable development of crop planting structure planning, a fuzzy linear multi-objective programming model based on fuzzy optimization theory was developed, capable of reflecting both uncertainties and multiple objectives. In addition, a multi-objective fractional programming model for crop structure optimization was developed to express the multiple objectives quantitatively in one optimization model, with the numerator representing maximum economic benefits and the denominator representing minimum crop planting area allocation. These models better reflect actual situations, considering the uncertainties and multiple objectives of crop planting structure optimization systems. The five models developed were then applied to a real case study in Minqin County, north-west China.
The advantages, the applicable conditions and the solution methods

  15. Selecting Optimal Subset of Security Controls

    OpenAIRE

Yevseyeva, I.; Basto-Fernandes, V.; Emmerich, Michael T. M.; van Moorsel, A.

    2015-01-01

Choosing an optimal investment in information security is an issue most companies face these days. Which security controls to buy to protect the IT system of a company in the best way? Selecting a subset of security controls among many available ones can be seen as a resource allocation problem that should take into account conflicting objectives and constraints of the problem. In particular, the security of the system should be improved without hindering productivity, ...

  16. Optimal Parameter Selection of Power System Stabilizer using Genetic Algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Hyeng Hwan; Chung, Dong Il; Chung, Mun Kyu [Dong-AUniversity (Korea); Wang, Yong Peel [Canterbury Univeristy (New Zealand)

    1999-06-01

In this paper, a method is suggested for selecting optimal parameters of a power system stabilizer (PSS) robust to low-frequency oscillation in power systems, using a real-variable elitism genetic algorithm (RVEGA). The optimal parameters were selected for a PSS with one lead compensator and with two lead compensators. The frequency response characteristics of the PSS, the system eigenvalue criterion and the dynamic characteristics were also considered under normal load and heavy load, which proved the usefulness of RVEGA compared with Yu's compensator design theory. (author). 20 refs., 15 figs., 8 tabs.

  17. Post optimization paradigm in maximum 3-satisfiability logic programming

    Science.gov (United States)

    Mansor, Mohd. Asyraf; Sathasivam, Saratha; Kasihmuddin, Mohd Shareduwan Mohd

    2017-08-01

Maximum 3-Satisfiability (MAX-3SAT) is a counterpart of the Boolean satisfiability problem that can be treated as a constraint optimization problem: finding an assignment that satisfies the maximum number of clauses in a given 3-SAT formula. This paper presents the implementation of an enhanced Hopfield network to accelerate MAX-3SAT logic programming. Four post-optimization techniques are investigated: the Elliot symmetric activation function, the Gaussian activation function, the Wavelet activation function and the hyperbolic tangent activation function. The performance of these post-optimization techniques in accelerating MAX-3SAT logic programming is discussed in terms of the ratio of maximum satisfied clauses, Hamming distance and computation time. Dev-C++ was used as the platform for training, testing and validating our proposed techniques. The results show that the hyperbolic tangent activation function and the Elliot symmetric activation function are suitable for MAX-3SAT logic programming.

  18. Differential Spatio-temporal Multiband Satellite Image Clustering using K-means Optimization With Reinforcement Programming

    Directory of Open Access Journals (Sweden)

    Irene Erlyn Wina Rachmawan

    2015-06-01

Full Text Available Deforestation is one of the crucial issues in Indonesia, which now has the world's highest deforestation rate. On the other hand, multispectral imagery delivers a great source of data for studying spatial and temporal variability of the environment, such as deforested areas. This research presents differential image processing methods for detecting deforestation, extracting and indicating affected areas automatically. The proposed approach extracts information from multiband satellite images and calculates the deforested area by year using a temporal dataset. Yet multiband satellite images are large, which makes them difficult to handle for segmentation. K-Means clustering is commonly considered a powerful clustering algorithm because of its ability to cluster big data; however, K-Means is sensitive to its initial centroids, which can lead to poor performance. In this paper we propose a new approach that optimizes K-Means clustering using Reinforcement Programming in order to cluster multispectral images. We build a new mechanism for generating initial centroids by applying exploration and exploitation knowledge from Reinforcement Programming, and this optimization leads to better K-Means clusters. We select multispectral images from Landsat 7 over the past ten years in Medawai, Borneo, Indonesia, and segment two areas: deforested land and forest. We made a series of experiments and compared K-Means using Reinforcement Programming to optimize the initial centroids against normal K-Means without the optimization process. Keywords: deforestation, multispectral images, Landsat, automatic clustering, K-Means.
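The centroid-seeding idea can be sketched as below. The exploration/exploitation rule here (epsilon-greedy farthest-point seeding) is an assumed stand-in for the paper's Reinforcement Programming mechanism, which the abstract does not specify, and the two-blob data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_centroids(X, k, eps=0.2):
    """Epsilon-greedy seeding: exploit by taking the point farthest from the
    centroids chosen so far; explore (with probability eps) by taking a
    uniformly random point."""
    centroids = [X[rng.integers(len(X))]]
    while len(centroids) < k:
        if rng.random() < eps:
            centroids.append(X[rng.integers(len(X))])
        else:
            d = np.min([np.linalg.norm(X - c, axis=1) for c in centroids], axis=0)
            centroids.append(X[np.argmax(d)])
    return np.array(centroids)

def kmeans(X, k, iters=50, eps=0.2):
    C = init_centroids(X, k, eps)
    for _ in range(iters):
        labels = np.argmin([np.linalg.norm(X - c, axis=1) for c in C], axis=0)
        # keep a centroid in place if its cluster ever empties out
        C = np.array([X[labels == j].mean(axis=0) if np.any(labels == j) else C[j]
                      for j in range(k)])
    return labels, C

data_rng = np.random.default_rng(42)
X = np.vstack([data_rng.normal(0.0, 0.5, (20, 2)),
               data_rng.normal(10.0, 0.5, (20, 2))])
labels, centers = kmeans(X, 2, eps=0.0)   # eps=0 keeps the demo deterministic
```

Compared with purely random seeding, the farthest-point (exploit) step spreads the initial centroids across the data, which is the property the paper's optimized initialization is aiming for.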

  19. Selective Segmentation for Global Optimization of Depth Estimation in Complex Scenes

    Directory of Open Access Journals (Sweden)

    Sheng Liu

    2013-01-01

Full Text Available This paper proposes a segmentation-based global optimization method for depth estimation. Firstly, to obtain an accurate matching cost, the original local stereo matching approach based on a self-adapting matching window is integrated with two matching cost optimization strategies aimed at handling both borders and occlusion regions. Secondly, we employ a comprehensive smoothness term to satisfy the diverse smoothness requirements of real scenes. Thirdly, a selective segmentation term is used to enforce plane-trend constraints selectively on the corresponding segments, further improving the accuracy of the depth results at the object level. Experiments on the Middlebury image pairs show that the proposed global optimization approach is considerably competitive with other state-of-the-art matching approaches.

  20. Path selection and bandwidth allocation in MPLS networks: a nonlinear programming approach

    Science.gov (United States)

    Burns, J. E.; Ott, Teunis J.; de Kock, Johan M.; Krzesinski, Anthony E.

    2001-07-01

    Multi-protocol Label Switching extends the IPv4 destination-based routing protocols to provide new and scalable routing capabilities in connectionless networks using relatively simple packet forwarding mechanisms. MPLS networks carry traffic on virtual connections called label switched paths. This paper considers path selection and bandwidth allocation in MPLS networks in order to optimize the network quality of service. The optimization is based upon the minimization of a non-linear objective function which under light load simplifies to OSPF routing with link metrics equal to the link propagation delays. The behavior under heavy load depends on the choice of certain parameters: It can essentially be made to minimize maximal expected utilization, or to maximize minimal expected weighted slacks (both over all links). Under certain circumstances it can be made to minimize the probability that a link has an instantaneous offered load larger than its transmission capacity. We present a model of an MPLS network and an algorithm to find and capacitate optimal LSPs. The algorithm is an improvement of the well-known flow deviation non-linear programming method. The algorithm is applied to compute optimal LSPs for several test networks carrying a single traffic class.

  1. A combined stochastic programming and optimal control approach to personal finance and pensions

    DEFF Research Database (Denmark)

    Konicz, Agnieszka Karolina; Pisinger, David; Rasmussen, Kourosh Marjani

    2015-01-01

The paper presents a model that combines a dynamic programming (stochastic optimal control) approach with a multi-stage stochastic linear programming (SLP) approach, integrated into one SLP formulation. Stochastic optimal control produces an optimal policy that is easy to understand and implement....

  2. 3Es System Optimization under Uncertainty Using Hybrid Intelligent Algorithm: A Fuzzy Chance-Constrained Programming Model

    Directory of Open Access Journals (Sweden)

    Jiekun Song

    2016-01-01

Full Text Available Harmonious development of the 3Es (economy-energy-environment) system is the key to realizing regional sustainable development. The structure and components of the 3Es system are analyzed. Based on the analysis of a causality diagram, GDP and industrial structure are selected as the target parameters of the economy subsystem, energy consumption intensity is selected as the target parameter of the energy subsystem, and the emissions of COD, ammonia nitrogen, SO2 and NOX, together with CO2 emission intensity, are selected as the target parameters of the environment subsystem. Fixed-asset investment in the three industries, total energy consumption, and investment in environmental pollution control are selected as the decision variables. By regarding the parameters of 3Es system optimization as fuzzy numbers, a fuzzy chance-constrained goal programming (FCCGP) model is constructed, and a hybrid intelligent algorithm combining fuzzy simulation and a genetic algorithm is proposed for solving it. The results of an empirical analysis of Shandong province, China, show that the FCCGP model can reflect the inherent relationships and evolution law of the 3Es system and provide effective decision-making support for 3Es system optimization.

  3. Optimization-Based Selection of Influential Agents in a Rural Afghan Social Network

    Science.gov (United States)

    2010-06-01

3) the nonlethal targeting model, a nonlinear programming (NLP) optimization formulation that identifies the k US agent assignment strategy producing the greatest ... leader social network ... NATO Coalition in Afghanistan ([54], [31], [48], [55], [30]). While Arab tribes tend to be more hierarchical, Pashtun tribes are

  4. Optimization of the annual construction program solutions

    Directory of Open Access Journals (Sweden)

    Oleinik Pavel

    2017-01-01

Full Text Available The article considers potential optimization solutions in scheduling when forming the annual production programs of construction complex organizations. The optimization instrument is represented as a two-component system. As a fundamentally new approach in the first block of the annual program solutions, the authors propose to use a scientifically grounded methodology for determining the scope of work permissible for transfer to a subcontractor without the risk of the General Contractor losing management control over the construction site. For this purpose, a special indicator characterizing the activity of the general construction organization is introduced: the coefficient of construction production management. In the second block, the principal methods for forming calendar plans for fulfilling the critical work effort by the leading stream are proposed, depending on the intensity characteristic.

  5. A Study of Joint Cost Inclusion in Linear Programming Optimization

    Directory of Open Access Journals (Sweden)

    P. Armaos

    2013-08-01

Full Text Available The concept of structural optimization has been a topic of research over the past century. Linear programming optimization has proved to be the most reliable method of structural optimization. Recent global advances in linear programming optimization, driven by University of Sheffield researchers, include joint cost, self-weight and buckling considerations. Joint cost inclusion aims to reduce the number of joints in an optimized structural solution, transforming it into a practically viable one. The topic of the current paper is to investigate the effects of joint cost inclusion as currently implemented in the optimization code. An extended literature review on this subject was conducted prior to familiarization with small-scale optimization software. Using IntelliFORM software, a structured series of problems was set up and analyzed. The joint cost tests examined benchmark problems and the consequent changes in member topology as the design domain expanded. The findings of the analyses were remarkable and are commented on further. The distinct topologies of solutions created by the optimization processes are also recognized. Finally, an alternative strategy of penalizing joints is presented.

  6. Simulation and Optimization of Control of Selected Phases of Gyroplane Flight

    Directory of Open Access Journals (Sweden)

    Wienczyslaw Stalewski

    2018-02-01

    Full Text Available Optimization methods are increasingly used to solve problems in aeronautical engineering. Typically, optimization methods are utilized in the design of an aircraft airframe or its structure. The presented study is focused on improvement of aircraft flight control procedures through numerical optimization. The optimization problems concern selected phases of flight of a light gyroplane—a rotorcraft using an unpowered rotor in autorotation to develop lift and an engine-powered propeller to provide thrust. An original methodology of computational simulation of rotorcraft flight was developed and implemented. In this approach the aircraft motion equations are solved step-by-step, simultaneously with the solution of the Unsteady Reynolds-Averaged Navier–Stokes equations, which is conducted to assess aerodynamic forces acting on the aircraft. As a numerical optimization method, the BFGS (Broyden–Fletcher–Goldfarb–Shanno algorithm was adapted. The developed methodology was applied to optimize the flight control procedures in selected stages of gyroplane flight in direct proximity to the ground, where proper control of the aircraft is critical to ensure flight safety and performance. The results of conducted computational optimizations proved the qualitative correctness of the developed methodology. The research results can be helpful in the design of easy-to-control gyroplanes and also in the training of pilots for this type of rotorcraft.

  7. An opinion formation based binary optimization approach for feature selection

    Science.gov (United States)

    Hamedmoghadam, Homayoun; Jalili, Mahdi; Yu, Xinghuo

    2018-02-01

This paper proposes a novel optimization method based on opinion formation in complex network systems. The proposed technique mimics the human-human interaction mechanism based on a mathematical model derived from the social sciences. Our method encodes a subset of selected features as the opinion of an artificial agent and simulates the opinion formation process among a population of agents to solve the feature selection problem. The agents interact over an underlying network structure and reach consensus in their opinions while finding better solutions to the problem. A number of mechanisms are employed to avoid getting trapped in local minima. We compare the performance of the proposed method with a number of classical population-based optimization methods and a state-of-the-art opinion formation based method. Our experiments on a number of high-dimensional datasets show that the proposed algorithm outperforms the others.
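A toy sketch of the encoding: each agent's opinion is a binary feature mask, a worse agent adopts part of a better neighbor's opinion, and occasional bit flips help escape local minima. The interaction rule and the synthetic fitness (features 0 and 2 informative, a 0.1 cost per selected feature) are illustrative assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)

INFORMATIVE = {0, 2}        # synthetic ground truth for the toy fitness

def fitness(mask):
    """Reward informative features, charge 0.1 per selected feature."""
    gain = sum(1.0 for i, b in enumerate(mask) if b and i in INFORMATIVE)
    return gain - 0.1 * int(mask.sum())

def opinion_search(n_feat=6, n_agents=10, steps=60):
    ops = rng.integers(0, 2, size=(n_agents, n_feat))   # one opinion per agent
    for _ in range(steps):
        i, j = rng.choice(n_agents, size=2, replace=False)
        lo, hi = (i, j) if fitness(ops[i]) < fitness(ops[j]) else (j, i)
        adopt = rng.random(n_feat) < 0.5     # worse agent copies ~half the bits
        ops[lo, adopt] = ops[hi, adopt]
        if rng.random() < 0.1:               # rare mutation avoids stagnation
            ops[lo, rng.integers(n_feat)] ^= 1
    return max(ops, key=fitness)

best_mask = opinion_search()
```

The pairwise interactions here amount to a complete interaction graph; the paper's method would instead run the consensus dynamics over a given network structure and use a classifier-based fitness.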

  8. Propositional Optimal Trajectory Programming for Improving Stability ...

    African Journals Online (AJOL)

Propositional Optimal Trajectory Programming for Improving Stability of Hermite Definite Control System. Knowledge of systems operation subjected to heat diffusion constraints is required of systems analysts. In an instance that ...

  9. Partner Selection Optimization Model of Agricultural Enterprises in Supply Chain

    OpenAIRE

    Feipeng Guo; Qibei Lu

    2013-01-01

As correctly selecting partners in the supply chains of agricultural enterprises becomes more and more important, a large number of partner evaluation techniques are widely used in agricultural science research. This study established a partner selection model to optimize the issue of agricultural supply chain partner selection. Firstly, it constructed a comprehensive evaluation index system after analyzing the real characteristics of the agricultural supply chain. Secondly, a heuristic met...

  10. Opportunistic relaying in multipath and slow fading channel: Relay selection and optimal relay selection period

    KAUST Repository

Sungjoon Park

    2011-11-01

In this paper we present opportunistic relay communication strategies for decode-and-forward relaying. The channel we consider includes pathloss, shadowing, and fast fading effects. We find a simple outage probability formula for opportunistic relaying in this channel and validate it by comparison with the exact outage probability. We also suggest a new relay selection algorithm that incorporates shadowing, based on a protocol that broadcasts the channel gain of the previously selected relay; this saves resources in slow-fading channels by reducing collisions during relay selection. We further investigate the optimal relay selection period that maximizes throughput while avoiding selection overhead. © 2011 IEEE.
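The textbook decode-and-forward opportunistic relaying outage formula (without the paper's shadowing and pathloss refinements) can be sanity-checked by Monte Carlo; the parameters below are illustrative, and Rayleigh fading is assumed so per-hop SNRs are exponential.

```python
import numpy as np

rng = np.random.default_rng(3)

def outage_analytic(n_relays, mean_snr, threshold):
    """Best-relay DF outage: every relay must fail, and a relay's two-hop SNR
    is the min of two Exp(mean_snr) variables, i.e. Exp with twice the rate."""
    p_link = 1.0 - np.exp(-2.0 * threshold / mean_snr)
    return p_link ** n_relays

def outage_montecarlo(n_relays, mean_snr, threshold, trials=200_000):
    src = rng.exponential(mean_snr, (trials, n_relays))   # source->relay SNRs
    dst = rng.exponential(mean_snr, (trials, n_relays))   # relay->destination SNRs
    best = np.minimum(src, dst).max(axis=1)               # opportunistic selection
    return float(np.mean(best < threshold))

an_out = outage_analytic(3, mean_snr=10.0, threshold=2.0)
mc_out = outage_montecarlo(3, mean_snr=10.0, threshold=2.0)
```

Adding shadowing would correlate the per-hop means across time, which is precisely what makes the paper's "reuse the previously selected relay" protocol worthwhile.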

  11. Nonlinear Time Series Prediction Using LS-SVM with Chaotic Mutation Evolutionary Programming for Parameter Optimization

    International Nuclear Information System (INIS)

    Xu Ruirui; Chen Tianlun; Gao Chengfeng

    2006-01-01

Nonlinear time series prediction is studied by using an improved least squares support vector machine (LS-SVM) regression based on a chaotic mutation evolutionary programming (CMEP) approach for parameter optimization. We analyze how the prediction error varies with the parameters (σ, γ) of the LS-SVM. In order to select appropriate parameters for the prediction model, we employ the CMEP algorithm. Finally, Nasdaq stock data are predicted by using this CMEP-based LS-SVM regression, and satisfactory results are obtained.
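The (σ, γ) dependence analyzed here can be reproduced with the standard LS-SVM regression KKT linear system; the grid search below is only a simple stand-in for the CMEP optimizer, and the sine data are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

def lssvm_fit(X, y, sigma, gamma):
    """LS-SVM regression: solve the (n+1)x(n+1) KKT linear system for (b, alpha)."""
    n = len(X)
    K = np.exp(-((X[:, None] - X[None, :]) ** 2) / (2 * sigma ** 2))
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma      # gamma regularizes the fit
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]

def lssvm_predict(Xtr, b, alpha, sigma, Xte):
    K = np.exp(-((Xte[:, None] - Xtr[None, :]) ** 2) / (2 * sigma ** 2))
    return K @ alpha + b

x = np.linspace(0.0, 2.0 * np.pi, 40)
y = np.sin(x)
idx = rng.permutation(40)
tr, va = idx[:30], idx[30:]

def val_error(sigma, gamma):
    b, alpha = lssvm_fit(x[tr], y[tr], sigma, gamma)
    return float(np.mean((lssvm_predict(x[tr], b, alpha, sigma, x[va]) - y[va]) ** 2))

# Grid search stands in for the paper's CMEP parameter optimization.
best_sigma, best_gamma = min(
    ((s, g) for s in (0.1, 0.5, 1.0, 2.0) for g in (1.0, 10.0, 100.0)),
    key=lambda p: val_error(*p))
```

CMEP would replace the grid with an evolving population of (σ, γ) candidates perturbed by chaotic mutation, but the validation-error objective is the same.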

  12. Selection and optimization of extracellular lipase production using ...

    African Journals Online (AJOL)

    The aim of this study was to isolate and select lipase-producing microorganisms originated from different substrates, as well as to optimize the production of microbial lipase by submerged fermentation under different nutrient conditions. Of the 40 microorganisms isolated, 39 showed a halo around the colonies and 4 were ...

  13. BILGO: Bilateral greedy optimization for large scale semidefinite programming

    KAUST Repository

    Hao, Zhifeng; Yuan, Ganzhao; Ghanem, Bernard

    2013-01-01

    Many machine learning tasks (e.g. metric and manifold learning problems) can be formulated as convex semidefinite programs. To enable the application of these tasks on a large-scale, scalability and computational efficiency are considered as desirable properties for a practical semidefinite programming algorithm. In this paper, we theoretically analyze a new bilateral greedy optimization (denoted BILGO) strategy in solving general semidefinite programs on large-scale datasets. As compared to existing methods, BILGO employs a bilateral search strategy during each optimization iteration. In such an iteration, the current semidefinite matrix solution is updated as a bilateral linear combination of the previous solution and a suitable rank-1 matrix, which can be efficiently computed from the leading eigenvector of the descent direction at this iteration. By optimizing for the coefficients of the bilateral combination, BILGO reduces the cost function in every iteration until the KKT conditions are fully satisfied, thus, it tends to converge to a global optimum. In fact, we prove that BILGO converges to the global optimal solution at a rate of O(1/k), where k is the iteration counter. The algorithm thus successfully combines the efficiency of conventional rank-1 update algorithms and the effectiveness of gradient descent. Moreover, BILGO can be easily extended to handle low rank constraints. To validate the effectiveness and efficiency of BILGO, we apply it to two important machine learning tasks, namely Mahalanobis metric learning and maximum variance unfolding. Extensive experimental results clearly demonstrate that BILGO can solve large-scale semidefinite programs efficiently.
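The bilateral rank-1 update is easy to sketch on a toy problem. Below, BILGO-style iterations minimize ||X − T||_F² over PSD matrices with bounded trace; the toy objective, the exact line search and the stopping rule are illustrative assumptions, not the paper's general algorithm.

```python
import numpy as np

def bilgo_toy(T, tau, iters=200):
    """Minimize ||X - T||_F^2 over PSD X with trace(X) <= tau, by mixing the
    current iterate with a rank-1 matrix built from the leading eigenvector
    of the negative gradient (exact line search on the mixing coefficient)."""
    X = np.zeros_like(T)
    for _ in range(iters):
        w, V = np.linalg.eigh(T - X)          # negative gradient up to a factor
        v = V[:, np.argmax(w)]                # leading eigenvector
        D = tau * np.outer(v, v) if w.max() > 0 else np.zeros_like(X)
        G = D - X
        denom = float(np.sum(G * G))
        if denom < 1e-12:
            break
        alpha = float(np.clip(np.sum((T - X) * G) / denom, 0.0, 1.0))
        X = X + alpha * G                     # stays PSD with trace <= tau
    return X

T = np.diag([3.0, 1.0, -1.0])
X = bilgo_toy(T, tau=5.0)      # optimum is diag(3, 1, 0), objective value 1
```

Because each iterate is a convex combination of PSD matrices with trace at most tau, feasibility is maintained for free, which is the appeal of the bilateral rank-1 scheme.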

  14. BILGO: Bilateral greedy optimization for large scale semidefinite programming

    KAUST Repository

    Hao, Zhifeng

    2013-10-03

    Many machine learning tasks (e.g. metric and manifold learning problems) can be formulated as convex semidefinite programs. To enable the application of these tasks on a large-scale, scalability and computational efficiency are considered as desirable properties for a practical semidefinite programming algorithm. In this paper, we theoretically analyze a new bilateral greedy optimization (denoted BILGO) strategy in solving general semidefinite programs on large-scale datasets. As compared to existing methods, BILGO employs a bilateral search strategy during each optimization iteration. In such an iteration, the current semidefinite matrix solution is updated as a bilateral linear combination of the previous solution and a suitable rank-1 matrix, which can be efficiently computed from the leading eigenvector of the descent direction at this iteration. By optimizing for the coefficients of the bilateral combination, BILGO reduces the cost function in every iteration until the KKT conditions are fully satisfied, thus, it tends to converge to a global optimum. In fact, we prove that BILGO converges to the global optimal solution at a rate of O(1/k), where k is the iteration counter. The algorithm thus successfully combines the efficiency of conventional rank-1 update algorithms and the effectiveness of gradient descent. Moreover, BILGO can be easily extended to handle low rank constraints. To validate the effectiveness and efficiency of BILGO, we apply it to two important machine learning tasks, namely Mahalanobis metric learning and maximum variance unfolding. Extensive experimental results clearly demonstrate that BILGO can solve large-scale semidefinite programs efficiently.

  15. optimal selection of hydraulic turbines for small hydro electric power

    African Journals Online (AJOL)

    eobe

    Keywords: optimal selection, SHP turbine, flow duration curve, energy efficiency, annual capacity factor.

  16. Methods for optimizing over the efficient and weakly efficient sets of an affine fractional vector optimization program

    DEFF Research Database (Denmark)

    Le, T.H.A.; Pham, D. T.; Canh, Nam Nguyen

    2010-01-01

    Both the efficient and weakly efficient sets of an affine fractional vector optimization problem, in general, are neither convex nor given explicitly. Optimization problems over one of these sets are thus nonconvex. We propose two methods for optimizing a real-valued function over the efficient and weakly efficient sets of an affine fractional vector optimization problem. The first method is a local one. By using a regularization function, we reformulate the problem into a standard smooth mathematical programming problem that allows applying available methods for smooth programming. In case the objective function is linear, we have investigated a global algorithm based upon a branch-and-bound procedure. The algorithm uses a Lagrangian bound coupled with a simplicial bisection in the criteria space. Preliminary computational results show that the global algorithm is promising.

  17. Optimization of Algorithms Using Extensions of Dynamic Programming

    KAUST Repository

    AbouEisha, Hassan M.

    2017-04-09

    We study and answer questions related to the complexity of various important problems such as: multi-frontal solvers of hp-adaptive finite element method, sorting and majority. We advocate the use of dynamic programming as a viable tool to study optimal algorithms for these problems. The main approach used to attack these problems is modeling classes of algorithms that may solve this problem using a discrete model of computation then defining cost functions on this discrete structure that reflect different complexity measures of the represented algorithms. As a last step, dynamic programming algorithms are designed and used to optimize those models (algorithms) and to obtain exact results on the complexity of the studied problems. The first part of the thesis presents a novel model of computation (element partition tree) that represents a class of algorithms for multi-frontal solvers along with cost functions reflecting various complexity measures such as: time and space. It then introduces dynamic programming algorithms for multi-stage and bi-criteria optimization of element partition trees. In addition, it presents results based on optimal element partition trees for famous benchmark meshes such as: meshes with point and edge singularities. New improved heuristics for those benchmark meshes were ob- tained based on insights of the optimal results found by our algorithms. The second part of the thesis starts by introducing a general problem where different problems can be reduced to and show how to use a decision table to model such problem. We describe how decision trees and decision tests for this table correspond to adaptive and non-adaptive algorithms for the original problem. We present exact bounds on the average time complexity of adaptive algorithms for the eight elements sorting problem. Then bounds on adaptive and non-adaptive algorithms for a variant of the majority problem are introduced. Adaptive algorithms are modeled as decision trees whose depth

  18. Grid-Optimization Program for Photovoltaic Cells

    Science.gov (United States)

    Daniel, R. E.; Lee, T. S.

    1986-01-01

    CELLOPT program developed to assist in designing grid pattern of current-conducting material on photovoltaic cell. Analyzes parasitic resistance losses and shadow loss associated with metallized grid pattern on both round and rectangular solar cells. Though it performs sensitivity studies, it is used primarily to optimize grid design in terms of bus bar and grid lines by minimizing power loss. CELLOPT written in APL.
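
    The trade-off CELLOPT explores can be illustrated with a toy model (the coefficients and function below are made up, not CELLOPT's actual loss equations): resistive loss falls as the number of grid lines grows, while shadow loss grows with it, so the total fractional power loss has an interior minimum.

```python
def fractional_power_loss(n_lines, resistive=0.08, shadow_per_line=0.002):
    """Toy model: resistive loss ~ 1/n (shorter lateral current paths),
    shadow loss ~ n (each line blocks a fixed fraction of the cell)."""
    return resistive / n_lines + shadow_per_line * n_lines

# Exhaustive search over a plausible range of grid-line counts.
best_n = min(range(1, 101), key=fractional_power_loss)
```

    With these coefficients the continuous optimum sits near sqrt(0.08/0.002) ≈ 6.3 lines, so the integer search settles on 6.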

  19. An Improved Particle Swarm Optimization for Selective Single Machine Scheduling with Sequence Dependent Setup Costs and Downstream Demands

    Directory of Open Access Journals (Sweden)

    Kun Li

    2015-01-01

    Full Text Available This paper investigates a special single machine scheduling problem derived from practical industries, namely, the selective single machine scheduling with sequence dependent setup costs and downstream demands. Different from traditional single machine scheduling, this problem further takes into account the selection of jobs and the demands of downstream lines. This problem is formulated as a mixed integer linear programming model and an improved particle swarm optimization (PSO is proposed to solve it. To enhance the exploitation ability of the PSO, an adaptive neighborhood search with different search depth is developed based on the decision characteristics of the problem. To improve the search diversity and make the proposed PSO algorithm capable of getting out of local optimum, an elite solution pool is introduced into the PSO. Computational results based on extensive test instances show that the proposed PSO can obtain optimal solutions for small size problems and outperform the CPLEX and some other powerful algorithms for large size problems.
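
    The abstract does not spell out the encoding, so here is a plain random-key PSO sketch for the sequencing part of the problem, without the paper's adaptive neighborhood search, elite pool, or job-selection and demand constraints: continuous particle positions are decoded into job orders by argsort, and the cost is the sum of sequence-dependent setup costs. The instance is randomly generated for illustration.

```python
import itertools
import numpy as np

def setup_cost(perm, S):
    """Total sequence-dependent setup cost of processing jobs in this order."""
    return sum(S[perm[i], perm[i + 1]] for i in range(len(perm) - 1))

def pso_sequence(S, particles=30, iters=100, seed=0):
    """Random-key PSO: positions in R^n are decoded to permutations via argsort."""
    rng = np.random.default_rng(seed)
    n = S.shape[0]
    X = rng.random((particles, n))                 # positions (random keys)
    V = np.zeros((particles, n))                   # velocities
    cost = np.array([setup_cost(np.argsort(x), S) for x in X])
    P, pcost = X.copy(), cost.copy()               # personal bests
    g, gcost = P[pcost.argmin()].copy(), pcost.min()
    for _ in range(iters):
        r1, r2 = rng.random((2, particles, n))
        V = 0.7 * V + 1.5 * r1 * (P - X) + 1.5 * r2 * (g - X)
        X = X + V
        cost = np.array([setup_cost(np.argsort(x), S) for x in X])
        better = cost < pcost
        P[better], pcost[better] = X[better], cost[better]
        if pcost.min() < gcost:
            gcost = pcost.min()
            g = P[pcost.argmin()].copy()
    return np.argsort(g), gcost

# A random 5-job instance, small enough to check against brute force.
rng = np.random.default_rng(3)
S = rng.integers(1, 20, size=(5, 5)).astype(float)
np.fill_diagonal(S, 0.0)
perm, c = pso_sequence(S)
exhaustive = min(setup_cost(p, S) for p in itertools.permutations(range(5)))
```

    The exhaustive minimum is a lower bound on the PSO result; on instances this small the two usually coincide, which mirrors the paper's observation that PSO reaches optimal solutions for small problems.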

  20. ROTAX: a nonlinear optimization program by axes rotation method

    International Nuclear Information System (INIS)

    Suzuki, Tadakazu

    1977-09-01

    A nonlinear optimization program employing the axes rotation method has been developed for solving nonlinear problems subject to nonlinear inequality constraints and its stability and convergence efficiency were examined. The axes rotation method is a direct search of the optimum point by rotating the orthogonal coordinate system in a direction giving the minimum objective. The searching direction is rotated freely in multi-dimensional space, so the method is effective for the problems represented with the contours having deep curved valleys. In application of the axes rotation method to the optimization problems subject to nonlinear inequality constraints, an improved version of R.R. Allran and S.E.J. Johnsen's method is used, which deals with a new objective function composed of the original objective and a penalty term to consider the inequality constraints. The program is incorporated in optimization code system SCOOP. (auth.)

  1. An Improved Particle Swarm Optimization for Solving Bilevel Multiobjective Programming Problem

    Directory of Open Access Journals (Sweden)

    Tao Zhang

    2012-01-01

    Full Text Available An improved particle swarm optimization (PSO algorithm is proposed for solving bilevel multiobjective programming problem (BLMPP. For such problems, the proposed algorithm directly simulates the decision process of bilevel programming, which is different from most traditional algorithms designed for specific versions or based on specific assumptions. The BLMPP is transformed to solve multiobjective optimization problems in the upper level and the lower level interactively by an improved PSO. And a set of approximate Pareto optimal solutions for BLMPP is obtained using the elite strategy. This interactive procedure is repeated until the accurate Pareto optimal solutions of the original problem are found. Finally, some numerical examples are given to illustrate the feasibility of the proposed algorithm.

  2. A man in the loop trajectory optimization program (MILTOP)

    Science.gov (United States)

    Reinfields, J.

    1974-01-01

    An interactive trajectory optimization program is developed for use in initial fixing of launch configurations. The program is called MILTOP, for Man-In-the-Loop Trajectory Optimization Program. The program is designed to facilitate quick-look studies using man-machine decision combinations to reduce the time required to solve a given problem. MILTOP integrates the equations of motion of a point mass in three dimensions with drag as the only aerodynamic force present. Any point in time at which an integration step terminates may be used as a decision break point, with complete user control over all variables and routines at this point. Automatic phases are provided for different modes of control: vertical rise, pitch-over, gravity turn, chi-freeze and control turn. Stage parameters are initialized from a separate routine, so the user may fly as many stages as his problem demands. The MILTOP system runs either interactively on storage scope consoles or in batch mode with numerical output on the line printer.

  3. Project Selection for NASA's R&D Programs

    Science.gov (United States)

    Jones, Harry

    2005-01-01

    The purpose of NASA's Research and Development (R&D) programs is to provide advanced human support technologies for the Exploration Systems Mission Directorate (ESMD). The new technologies must be sufficiently attractive and proven to be selectable for future missions. This requires identifying promising candidate technologies and advancing them in technology readiness until they are likely options for flight. The R&D programs must select an array of technology development projects, manage them, and either terminate or continue them, so as to maximize the delivered number of potentially usable advanced human support technologies. This paper proposes an effective project selection methodology to help manage NASA R&D project portfolios.

  4. Age-Related Differences in Goals: Testing Predictions from Selection, Optimization, and Compensation Theory and Socioemotional Selectivity Theory

    Science.gov (United States)

    Penningroth, Suzanna L.; Scott, Walter D.

    2012-01-01

    Two prominent theories of lifespan development, socioemotional selectivity theory and selection, optimization, and compensation theory, make similar predictions for differences in the goal representations of younger and older adults. Our purpose was to test whether the goals of younger and older adults differed in ways predicted by these two theories.

  5. A compensatory approach to optimal selection with mastery scores

    NARCIS (Netherlands)

    van der Linden, Willem J.; Vos, Hendrik J.

    1994-01-01

    This paper presents some Bayesian theories of simultaneous optimization of decision rules for test-based decisions. Simultaneous decision making arises when an institution has to make a series of selection, placement, or mastery decisions with respect to subjects from a population. An obvious

  6. Optimization theory for large systems

    CERN Document Server

    Lasdon, Leon S

    2002-01-01

    Important text examines most significant algorithms for optimizing large systems and clarifying relations between optimization procedures. Much data appear as charts and graphs and will be highly valuable to readers in selecting a method and estimating computer time and cost in problem-solving. Initial chapter on linear and nonlinear programming presents all necessary background for subjects covered in rest of book. Second chapter illustrates how large-scale mathematical programs arise from real-world problems. Appendixes. List of Symbols.

  7. MULTI-CRITERIA PROGRAMMING METHODS AND PRODUCTION PLAN OPTIMIZATION PROBLEM SOLVING IN METAL INDUSTRY

    OpenAIRE

    Tunjo Perić; Željko Mandić

    2017-01-01

    This paper presents the production plan optimization in the metal industry considered as a multi-criteria programming problem. We first provided the definition of the multi-criteria programming problem and classification of the multicriteria programming methods. Then we applied two multi-criteria programming methods (the STEM method and the PROMETHEE method) in solving a problem of multi-criteria optimization production plan in a company from the metal industry. The obtained resul...

  8. An Improved Test Selection Optimization Model Based on Fault Ambiguity Group Isolation and Chaotic Discrete PSO

    Directory of Open Access Journals (Sweden)

    Xiaofeng Lv

    2018-01-01

    Full Text Available Sensor data-based test selection optimization is the basis for designing a test work, which ensures that the system is tested under the constraint of the conventional indexes such as fault detection rate (FDR and fault isolation rate (FIR. From the perspective of equipment maintenance support, the ambiguity isolation has a significant effect on the result of test selection. In this paper, an improved test selection optimization model is proposed by considering the ambiguity degree of fault isolation. In the new model, the fault test dependency matrix is adopted to model the correlation between the system fault and the test group. The objective function of the proposed model is minimizing the test cost with the constraint of FDR and FIR. The improved chaotic discrete particle swarm optimization (PSO algorithm is adopted to solve the improved test selection optimization model. The new test selection optimization model is more consistent with real complicated engineering systems. The experimental result verifies the effectiveness of the proposed method.

  9. Non-linear programming method in optimization of fast reactors

    International Nuclear Information System (INIS)

    Pavelesku, M.; Dumitresku, Kh.; Adam, S.

    1975-01-01

    Application of non-linear programming methods to the optimization of nuclear materials distribution in a fast reactor is discussed. The programming task is formulated on the basis of reactor calculations that depend on the fuel distribution strategy. As an illustration of the method's application, the solution of a simple example is given. The non-linear program is solved using the numerical method SUMT. (I.T.)
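
    SUMT (Sequential Unconstrained Minimization Technique) replaces a constrained problem by a sequence of unconstrained ones with growing penalty weight. A minimal sketch of the exterior quadratic-penalty variant, on a made-up toy problem rather than the reactor model from the paper: minimize (x−2)² + (y−1)² subject to x + y ≤ 2, whose solution is (1.5, 0.5).

```python
def sumt_minimize(x0, r_schedule=(1.0, 10.0, 100.0, 1000.0), inner=300):
    """Exterior-penalty SUMT: for each penalty weight r, minimize
    f(x, y) + r * max(0, g(x, y))^2 by gradient descent, warm-starting
    each stage from the previous stage's solution."""
    x, y = x0
    for r in r_schedule:
        lr = 1.0 / (2.0 + 4.0 * r)          # 1/L for the penalized quadratic
        for _ in range(inner):
            s = max(0.0, x + y - 2.0)       # constraint violation g(x, y)
            gx = 2.0 * (x - 2.0) + 2.0 * r * s
            gy = 2.0 * (y - 1.0) + 2.0 * r * s
            x, y = x - lr * gx, y - lr * gy
    return x, y
```

    As r grows, the unconstrained minimizers approach the constrained optimum; for this problem the residual violation shrinks like 1/(1 + 2r).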

  10. Structural optimization of static power control programs of nuclear power plants with WWER-1000

    International Nuclear Information System (INIS)

    Kokol, E.O.

    2015-01-01

    The possibility of switching between power control programs for WWER-1000 reactors is considered. The aim of this research is to determine the best program for controlling the reactor power under cyclic diurnal variation of electrical generation, as well as how the switching should be implemented. The problem of finding the best control program belongs to the class of multicriteria optimization problems. Operation of the nuclear power generation system was simulated using the following power control programs: constant average coolant temperature, constant pressure in the reactor secondary circuit, and constant temperature at the reactor inlet. A target function was proposed, consisting of three normalized criteria: the burnup fraction, the damage level of the fuel rod cladding, and the changes in power values. When the operation of the nuclear power generation system over its life was simulated, the values of the selected criteria were obtained and inserted into the target function. The minimum of the three values of the target function, depending on the control program at the current time, defined the criterion for switching between the considered static power control programs.

  11. Optimization of Product Instantiation using Integer Programming

    NARCIS (Netherlands)

    van den Broek, P.M.; Botterweck, Goetz; Jarzabek, Stan; Kishi, Tomoji

    2010-01-01

    We show that Integer Programming (IP) can be used as an optimization technique for the instantiation of products of feature models. This is done by showing that the constraints of feature models can be written in linear form. As particular IP technique, we use Gomory cutting planes. We have applied
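
    As a sketch of how feature-model constraints become linear: each feature is a 0/1 variable, and mandatory, optional, requires, and excludes relations all turn into linear (in)equalities over those variables. The tiny model and costs below are invented for illustration, and the "solver" is brute-force enumeration rather than Gomory cutting planes:

```python
import itertools

features = ["root", "engine", "gui", "stats", "themes"]   # hypothetical model
cost = [0, 3, 2, 1, 4]

# Each constraint is a linear row (coeffs, sense, rhs) over x in {0,1}^5.
rows = [
    ([1, 0, 0, 0, 0], "==", 1),    # root is always selected
    ([-1, 1, 0, 0, 0], "==", 0),   # engine is mandatory under root
    ([-1, 0, 1, 0, 0], "<=", 0),   # gui is optional under root
    ([-1, 0, 0, 1, 0], "<=", 0),   # stats is optional under root
    ([0, 0, -1, 1, 0], "<=", 0),   # stats requires gui
    ([0, 0, 0, 1, 1], "<=", 1),    # stats excludes themes
    ([0, 0, 0, 1, 0], "==", 1),    # user requirement: stats must be selected
]

def feasible(x):
    for a, sense, b in rows:
        v = sum(ai * xi for ai, xi in zip(a, x))
        if sense == "==" and v != b:
            return False
        if sense == "<=" and v > b:
            return False
    return True

# Cheapest feasible product; an IP solver would handle the same rows.
best = min((x for x in itertools.product((0, 1), repeat=5) if feasible(x)),
           key=lambda x: sum(c * xi for c, xi in zip(cost, x)))
```

    Here "stats requires gui" forces gui in, and "stats excludes themes" forces themes out, so the unique feasible product is {root, engine, gui, stats} with cost 6.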

  12. Optimized ONO thickness for multi-level and 2-bit/cell operation for wrapped-select-gate (WSG) SONOS memory

    International Nuclear Information System (INIS)

    Wu, Woei-Cherng; Chao, Tien-Sheng; Yang, Tsung-Yu; Peng, Wu-Chin; Yang, Wen-Luh; Chen, Jian-Hao; Ma, Ming Wen; Lai, Chao-Sung; Lee, Chien-Hsing; Hsieh, Tsung-Min; Liou, Jhyy Cheng; Chen, Tzu Ping; Chen, Chien Hung; Lin, Chih Hung; Chen, Hwi Huang; Ko, Joe

    2008-01-01

    In this paper, highly reliable wrapped-select-gate (WSG) silicon–oxide–nitride–oxide–silicon (SONOS) memory cells with multi-level and 2-bit/cell operation have been successfully demonstrated. The source-side injection mechanism for WSG-SONOS memory with different ONO thickness was thoroughly investigated. The different programming efficiencies of the WSG-SONOS memory under different ONO thicknesses are explained by the lateral electrical field extracted from the simulation results. Furthermore, multi-level storage is easily obtained, and a good VTH distribution is presented, for the WSG-SONOS memory with optimized ONO thickness. High program/erase speed (10 µs/5 ms) and low programming current (3.5 µA) are used to achieve the multi-level operation with tolerable gate and drain disturbance, negligible second-bit effect, excellent data retention and good endurance performance.

  13. Noninvasive imaging systems for gametes and embryo selection in IVF programs: a review.

    Science.gov (United States)

    Omidi, Marjan; Faramarzi, Azita; Agharahimi, Azam; Khalili, Mohammad Ali

    2017-09-01

    Optimizing the efficiency of the in vitro fertilization procedure by improving pregnancy rates while simultaneously reducing the risks of multiple pregnancies is the primary goal of the current assisted reproductive technology program. With the move to single embryo transfers, the need for more cost-effective and noninvasive methods for embryo selection prior to transfer is paramount. These aims require advances in more accurate gamete/embryo testing and selection procedures using high-tech devices. Therefore, the aim of the present review is to evaluate the efficacy of noninvasive imaging systems in the current literature, focusing on their potential clinical application in infertile patients undergoing assisted reproductive technology treatments. In this regard, three advanced imaging systems for the best selection of gametes and preimplantation embryos are introduced in full: motile sperm organelle morphology examination, polarization microscopy and time-lapse monitoring. © 2017 The Authors Journal of Microscopy © 2017 Royal Microscopical Society.

  14. Dynamic Programming Approach for Exact Decision Rule Optimization

    KAUST Repository

    Amin, Talha M.; Chikalov, Igor; Moshkov, Mikhail; Zielosko, Beata

    2013-01-01

    This chapter is devoted to the study of an extension of dynamic programming approach that allows sequential optimization of exact decision rules relative to the length and coverage. It contains also results of experiments with decision tables from

  15. A Sequential Convex Semidefinite Programming Algorithm for Multiple-Load Free Material Optimization

    Czech Academy of Sciences Publication Activity Database

    Stingl, M.; Kočvara, Michal; Leugering, G.

    2009-01-01

    Roč. 20, č. 1 (2009), s. 130-155 ISSN 1052-6234 R&D Projects: GA AV ČR IAA1075402 Grant - others:commision EU(XE) EU-FP6-30717 Institutional research plan: CEZ:AV0Z10750506 Keywords : structural optimization * material optimization * semidefinite programming * sequential convex programming Subject RIV: BA - General Mathematics Impact factor: 1.429, year: 2009

  16. Selecting Optimal Parameters of Random Linear Network Coding for Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Heide, J; Zhang, Qi; Fitzek, F H P

    2013-01-01

    This work studies how to select optimal code parameters of Random Linear Network Coding (RLNC) in Wireless Sensor Networks (WSNs). With Rateless Deluge [1] the authors proposed to apply Network Coding (NC) for Over-the-Air Programming (OAP) in WSNs, and demonstrated that with NC a significant reduction in the number of transmitted packets can be achieved. However, NC introduces additional computations and potentially a non-negligible transmission overhead, both of which depend on the chosen coding parameters. Therefore it is necessary to consider the trade-off that these coding parameters present in order to obtain the lowest energy consumption per transmitted bit. This problem is analyzed and suitable coding parameters are determined for the popular Tmote Sky platform. Compared to the use of traditional RLNC, these parameters enable a reduction in the energy spent per bit which grows...
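
    One of the coding-parameter trade-offs, the expected transmission overhead of RLNC over GF(2), can be estimated with a short Monte Carlo sketch (a toy model, not the authors' Tmote Sky energy model): count how many random GF(2) coding vectors a receiver needs before it holds g linearly independent ones.

```python
import random

def transmissions_until_full_rank(g, rng):
    """Draw random coding vectors in GF(2)^g (as g-bit ints) until the
    collected set has rank g; return how many vectors were drawn."""
    pivots = {}                      # leading-bit position -> basis vector
    sent = 0
    while len(pivots) < g:
        v = rng.getrandbits(g)
        sent += 1
        while v:                     # Gaussian elimination against the basis
            h = v.bit_length() - 1
            if h in pivots:
                v ^= pivots[h]       # eliminate the current leading bit
            else:
                pivots[h] = v        # vector is independent: record new pivot
                break
    return sent
```

    For GF(2) the expected overhead is roughly 1.6 extra packets independent of the generation size g, so larger generations amortize it better at the price of more decoding work per packet, which is the kind of trade-off the paper balances.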

  17. Postdoctoral periodontal program directors' perspectives of resident selection.

    Science.gov (United States)

    Khan, Saba; Carmosino, Andrew J; Yuan, Judy Chia-Chun; Lucchiari, Newton; Kawar, Nadia; Sukotjo, Cortino

    2015-02-01

    Applications for postdoctoral periodontal programs have recently increased. The National Board Dental Examinations (NBDE) has adopted a pass/fail format. The purpose of this study is to examine the criteria used by accredited postdoctoral periodontal programs in the United States to evaluate potential applicants. A secondary purpose was to determine whether the absence of NBDE scores would change program directors' selection process. Basic demographic information of the program directors was also collected. A questionnaire was sent to all 54 program directors of accredited postdoctoral periodontal programs in the United States. The raw data were compiled, descriptive analyses were performed, and results were tabulated and ranked when applicable. Thirty-five of 54 program directors (64.8%) responded to the survey. The five most important factors in selecting residents were: 1) interview ratings; 2) dental school clinical grades; 3) dental school periodontics grades; 4) personal statement; and 5) letters of recommendation. The majority of the programs (94%; n = 33) require an interview, and many (86%; n = 30) have a committee that makes the final decision on candidate acceptance. More than half of the respondents (56%; n = 17) stated that the pass/fail format of the NBDE would affect the decision-making process. This study describes the criteria used by postdoctoral periodontal programs to help select applicants. Interview ratings, dental school grades, personal statements, and letters of recommendation were found to be the most important factors. Results from this study may be helpful for prospective postdoctoral periodontal program applicants in the United States.

  18. Optimal Placement of Phasor Measurement Units (PMU) Using Integer Programming

    OpenAIRE

    Amrulloh, Yunan Helmy

    2013-01-01

    A Phasor Measurement Unit (PMU) is a device capable of providing real-time measurements of voltage and current phasors. PMUs can be used for monitoring, protection and control of electric power systems. This final project discusses the optimal placement of PMUs based on the network topology so that the power system is observable. The optimal PMU placement is formulated as a Binary Integer Programming (BIP) problem that yields variables taking the values (0,1) which ...
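
    In its simplest form, the BIP places a PMU at bus i (x_i = 1) so that every bus is either metered or adjacent to a metered bus: minimize Σ x_i subject to x_i + Σ_{j∈N(i)} x_j ≥ 1 for every bus i. The sketch below encodes that observability rule for a made-up 7-bus radial feeder (ignoring zero-injection buses) and solves it by enumeration instead of a BIP solver:

```python
import itertools

# Hypothetical 7-bus radial feeder: bus i is connected to bus i + 1.
n = 7
adj = [set() for _ in range(n)]
for i in range(n - 1):
    adj[i].add(i + 1)
    adj[i + 1].add(i)

def observable(x):
    """Every bus must host a PMU or neighbour one: x_i + sum_j x_j >= 1."""
    return all(x[i] or any(x[j] for j in adj[i]) for i in range(n))

# Binary integer program solved by exhaustive search over {0,1}^n.
best = min((x for x in itertools.product((0, 1), repeat=n) if observable(x)),
           key=sum)
```

    Three PMUs suffice on this feeder: each PMU covers at most three buses, so two could cover at most six of the seven.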

  19. Portfolio optimization in enhanced index tracking with goal programming approach

    Science.gov (United States)

    Siew, Lam Weng; Jaaman, Saiful Hafizah Hj.; Ismail, Hamizun bin

    2014-09-01

    Enhanced index tracking is a popular form of passive fund management in the stock market. Enhanced index tracking aims to generate excess return over the return achieved by the market index without purchasing all of the stocks that make up the index. This can be done by establishing an optimal portfolio to maximize the mean return and minimize the risk. The objective of this paper is to determine the portfolio composition and performance using the goal programming approach in enhanced index tracking and to compare it to the market index. Goal programming is a branch of multi-objective optimization which can handle decision problems that involve two different goals in enhanced index tracking, a trade-off between maximizing the mean return and minimizing the risk. The results of this study show that the optimal portfolio obtained with the goal programming approach is able to outperform the Malaysian market index, the FTSE Bursa Malaysia Kuala Lumpur Composite Index, because of higher mean return and lower risk without purchasing all the stocks in the market index.
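
    The two-goal structure can be sketched directly: penalize the unwanted deviations (return shortfall below a target and risk above a target) and search over portfolio weights. The returns, covariance, targets, and 5% weight grid below are all invented for illustration; the paper works with actual FTSE Bursa Malaysia KLCI constituents and a proper optimizer.

```python
import numpy as np

# Hypothetical data: expected returns and covariance of three stocks.
mu = np.array([0.10, 0.07, 0.04])
cov = np.array([[0.10, 0.02, 0.01],
                [0.02, 0.05, 0.01],
                [0.01, 0.01, 0.02]])
target_return, target_risk = 0.08, 0.15

def gp_objective(w, w_ret=1.0, w_risk=1.0):
    """Weighted sum of the two unwanted deviations: return shortfall
    below the target and risk excess above the target."""
    ret = float(mu @ w)
    risk = float(np.sqrt(w @ cov @ w))
    return (w_ret * max(0.0, target_return - ret)
            + w_risk * max(0.0, risk - target_risk))

# Enumerate long-only portfolio weights on a 5% grid of the simplex.
step = 0.05
grid = [np.array([a, b, 1.0 - a - b])
        for a in np.arange(0.0, 1.0 + 1e-9, step)
        for b in np.arange(0.0, 1.0 - a + 1e-9, step)]
best = min(grid, key=gp_objective)
```

    Changing the goal weights w_ret and w_risk shifts the chosen portfolio along the return/risk trade-off, which is exactly the lever goal programming exposes.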

  20. Program Characteristics Influencing Allopathic Students' Residency Selection.

    Science.gov (United States)

    Stillman, Michael D; Miller, Karen Hughes; Ziegler, Craig H; Upadhyay, Ashish; Mitchell, Charlene K

    2016-04-01

    Medical students must consider many overt variables when entering the National Resident Matching Program. However, changes with the single graduate medical education accreditation system have caused a gap in knowledge about more subtle considerations, including what, if any, influence the presence of osteopathic physician (ie, DO) and international medical graduate (IMG) house officers has on allopathic students' residency program preferences. Program directors and selection committee members may assume students' implicit bias without substantiating evidence. To reexamine which program characteristics affect US-trained allopathic medical students' residency selection, and to determine whether the presence of DO and IMG house officers affects the program choices of allopathic medical students. Fourth-year medical students from 4 allopathic medical schools completed an online survey. The Pearson χ² statistic was used to compare demographic and program-specific traits that influence ranking decisions and to determine whether school type (private vs public), valuing a residency program's prestige, or interest in a competitive specialty dictated results. Qualitative data were analyzed using the Pandit variation of the Glaser and Strauss constant comparison. Surveys were completed by 323 of 577 students (56%). Students from private vs public institutions were more likely to value a program's prestige (160 [93%] vs 99 [72%]; P<.001) and research opportunities (114 [66%] vs 57 [42%]; P<.001), and they were less likely to consider their prospects of being accepted (98 [57%] vs 111 [81%]; P<.001). A total of 33 (10%) and 52 (16%) students reported that the presence of DO or IMG trainees, respectively, would influence their final residency selection, and these percentages were largely unchanged among students interested in programs' prestige or in entering a competitive specialty. Open-ended comments were generally optimistic about diversification of the physician

  1. New approach to the optimization of nuclear fuel cycle - application of the goal programming and the AHP

    International Nuclear Information System (INIS)

    Kim, Poong Oh

    1998-02-01

    The front-end fuel cycle, from mining to enrichment, is mature. Unlike the front-end fuel cycle, there are several pathways in the back-end fuel cycle. In this study, five fuel cycle scenarios derived from Korea's unique position of having a two-reactor programme of PWRs and PHWRs are proposed. In selecting an optimal fuel cycle for a country, a number of attributes and factors that interact with each other should be taken into account. The factors considered in this study are categorized into two groups: tangible factors and intangible factors. The major factors are minimizing fuel cycle cost, maximizing resource utilization, minimizing environmental impact and satisfying domestic and international politics. The long-term consequences of any decision on the back-end fuel cycle require sophisticated decision making tools. In this paper the Goal Programming method in combination with the Analytic Hierarchy Process (AHP) is applied in the decision making process. Goal Programming is a very useful decision making tool for solving complex and multi-objective problems. The AHP, a method of solving complex decision problems with multiple attributes or objectives, shows its strength in measuring the preferences among attributes. In this study, the AHP is used for quantification of the intangible factors, which are evaluated by a team of nuclear experts. A model for fuel cycle selection is established in accordance with the logic of Goal Programming. Also, an interactive computer program is developed to obtain a solution for the optimal fuel cycle in Korea.

  2. Optimal control of a programmed motion of a rigid spacecraft using redundant kinematics parameterizations

    International Nuclear Information System (INIS)

    El-Gohary, Awad

    2005-01-01

    This paper considers the problem of optimal control of a programmed motion of a rigid spacecraft. Given a cost for the spacecraft as a quadratic function of the state and control variables, we seek optimal control laws, as functions of the state variables and the angle of programmed rotation, that minimize this cost and asymptotically stabilize the required programmed motion. The stabilizing properties of the proposed controllers are proved using optimal Liapunov techniques. A numerical simulation study is presented.

  3. Optimal Portfolio Selection Under Concave Price Impact

    International Nuclear Information System (INIS)

    Ma Jin; Song Qingshuo; Xu Jing; Zhang Jianfeng

    2013-01-01

    In this paper we study an optimal portfolio selection problem under instantaneous price impact. Based on some empirical analysis in the literature, we model such impact as a concave function of the trading size when the trading size is small. The price impact can be thought of as either a liquidity cost or a transaction cost, but the concavity nature of the cost leads to some fundamental difference from those in the existing literature. We show that the problem can be reduced to an impulse control problem, but without fixed cost, and that the value function is a viscosity solution to a special type of Quasi-Variational Inequality (QVI). We also prove directly (without using the solution to the QVI) that the optimal strategy exists and more importantly, despite the absence of a fixed cost, it is still in a “piecewise constant” form, reflecting a more practical perspective.

  4. Optimal Portfolio Selection Under Concave Price Impact

    Energy Technology Data Exchange (ETDEWEB)

    Ma Jin, E-mail: jinma@usc.edu [University of Southern California, Department of Mathematics (United States); Song Qingshuo, E-mail: songe.qingshuo@cityu.edu.hk [City University of Hong Kong, Department of Mathematics (Hong Kong); Xu Jing, E-mail: xujing8023@yahoo.com.cn [Chongqing University, School of Economics and Business Administration (China); Zhang Jianfeng, E-mail: jianfenz@usc.edu [University of Southern California, Department of Mathematics (United States)

    2013-06-15

    In this paper we study an optimal portfolio selection problem under instantaneous price impact. Based on some empirical analysis in the literature, we model such impact as a concave function of the trading size when the trading size is small. The price impact can be thought of as either a liquidity cost or a transaction cost, but the concavity nature of the cost leads to some fundamental difference from those in the existing literature. We show that the problem can be reduced to an impulse control problem, but without fixed cost, and that the value function is a viscosity solution to a special type of Quasi-Variational Inequality (QVI). We also prove directly (without using the solution to the QVI) that the optimal strategy exists and more importantly, despite the absence of a fixed cost, it is still in a 'piecewise constant' form, reflecting a more practical perspective.

  5. FIRE: an SPSS program for variable selection in multiple linear regression analysis via the relative importance of predictors.

    Science.gov (United States)

    Lorenzo-Seva, Urbano; Ferrando, Pere J

    2011-03-01

    We provide an SPSS program that implements currently recommended techniques and recent developments for selecting variables in multiple linear regression analysis via the relative importance of predictors. The approach consists of: (1) optimally splitting the data for cross-validation, (2) selecting the final set of predictors to be retained in the regression equation, and (3) assessing the behavior of the chosen model using standard indices and procedures. The SPSS syntax, a short manual, and data files related to this article are available as supplemental materials from brm.psychonomic-journals.org/content/supplemental.

  6. The optimization of demand response programs in smart grids

    International Nuclear Information System (INIS)

    Derakhshan, Ghasem; Shayanfar, Heidar Ali; Kazemi, Ahad

    2016-01-01

    The potential to schedule a portion of the electricity demand in smart energy systems is widely recognized as a significant opportunity to enhance the efficiency of the grids. Demand response is one of the new developments in the field of electricity, meant to engage consumers in improving their energy consumption patterns. We used Teaching & Learning based Optimization (TLBO) and Shuffled Frog Leaping (SFL) algorithms to propose an optimization model for consumption scheduling in a smart grid when payment costs of different periods are reduced. The study was conducted on four types of residential consumers, using summer data for residential houses located in the centre of Tehran, Iran: the first with time-of-use pricing, the second with real-time pricing, the third with critical peak pricing, and the last with no pricing tariff. The results demonstrate that the adoption of demand response programs can reduce total payment costs and determine a more efficient use of optimization techniques. - Highlights: •An optimization model for the demand response program is developed. •TLBO and SFL algorithms are applied to reduce payment costs in smart grid. •The optimal condition is provided for the maximization of the social welfare problem. •An application to some residential houses located in the centre of Tehran city in Iran is demonstrated.

  7. Optimal investment in a portfolio of HIV prevention programs.

    Science.gov (United States)

    Zaric, G S; Brandeau, M L

    2001-01-01

    In this article, the authors determine the optimal allocation of HIV prevention funds and investigate the impact of different allocation methods on health outcomes. The authors present a resource allocation model that can be used to determine the allocation of HIV prevention funds that maximizes quality-adjusted life years (or life years) gained or HIV infections averted in a population over a specified time horizon. They apply the model to determine the allocation of a limited budget among 3 types of HIV prevention programs in a population of injection drug users and nonusers: needle exchange programs, methadone maintenance treatment, and condom availability programs. For each prevention program, the authors estimate a production function that relates the amount invested to the associated change in risky behavior. The authors determine the optimal allocation of funds for both objective functions for a high-prevalence population and a low-prevalence population. They also consider the allocation of funds under several common rules of thumb that are used to allocate HIV prevention resources. It is shown that simpler allocation methods (e.g., allocation based on HIV incidence or notions of equity among population groups) may lead to allocations that do not yield the maximum health benefit. The optimal allocation of HIV prevention funds in a population depends on HIV prevalence and incidence, the objective function, the production functions for the prevention programs, and other factors. Consideration of cost, equity, and social and political norms may be important when allocating HIV prevention funds. The model presented in this article can help decision makers determine the health consequences of different allocations of funds.
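As a rough illustration of the kind of allocation model the abstract describes, the sketch below maximizes a toy concave benefit function over discrete splits of a budget among three hypothetical prevention programs. The weights, functional form, and budget are invented for the example and are not taken from the article.

```python
from itertools import product

# Hypothetical concave "production functions": dollars invested -> health benefit.
# Weights and the x/(x+1) saturation form are illustrative assumptions only.
def averted(alloc):
    """Toy benefit (e.g. infections averted) of a ($M needle, methadone, condom) split."""
    weights = (50, 30, 20)          # assumed relative effectiveness of the programs
    return sum(w * x / (x + 1) for w, x in zip(weights, alloc))  # concave in each x

budget = 6  # $M, split in whole millions among the three programs
feasible = (a for a in product(range(budget + 1), repeat=3) if sum(a) == budget)
best = max(feasible, key=averted)
print(best)
```

For realistic problem sizes, greedy marginal-benefit allocation or nonlinear programming would replace the brute-force enumeration; because the benefit is concave, allocating each next dollar to the program with the highest marginal benefit reaches the same optimum.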

  8. Artificial intelligence programming with LabVIEW: genetic algorithms for instrumentation control and optimization.

    Science.gov (United States)

    Moore, J H

    1995-06-01

    A genetic algorithm for instrumentation control and optimization was developed using the LabVIEW graphical programming environment. The usefulness of this methodology for the optimization of a closed loop control instrument is demonstrated with minimal complexity and the programming is presented in detail to facilitate its adaptation to other LabVIEW applications. Closed loop control instruments have a variety of applications in the biomedical sciences including the regulation of physiological processes such as blood pressure. The program presented here should provide a useful starting point for those wishing to incorporate genetic algorithm approaches to LabVIEW mediated optimization of closed loop control instruments.
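The article's implementation is in LabVIEW's graphical language; the following Python sketch shows the same genetic-algorithm ingredients (tournament selection, one-point crossover, bit-flip mutation, elitism) on a toy maximization problem. All parameter values are illustrative assumptions, not taken from the article.

```python
import random

def genetic_optimize(fitness, n_genes=8, pop_size=20, generations=60,
                     crossover_rate=0.8, mutation_rate=0.05, seed=1):
    """Maximize `fitness` over bit strings with a simple generational GA."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_genes)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        next_pop = scored[:2]                     # elitism: carry the two best over
        while len(next_pop) < pop_size:
            # tournament selection of two parents
            p1 = max(rng.sample(pop, 3), key=fitness)
            p2 = max(rng.sample(pop, 3), key=fitness)
            child = p1[:]
            if rng.random() < crossover_rate:     # one-point crossover
                cut = rng.randrange(1, n_genes)
                child = p1[:cut] + p2[cut:]
            # bit-flip mutation
            child = [g ^ 1 if rng.random() < mutation_rate else g for g in child]
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

# Toy "instrument setpoint" objective: maximize the number of ones (onemax).
best = genetic_optimize(sum)
print(best)
```

In a real closed-loop setting the fitness function would query the instrument (e.g. read a sensor after applying the candidate settings) instead of being a pure function of the bit string.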

  9. Optimizing the creation of base populations for aquaculture breeding programs using phenotypic and genomic data and its consequences on genetic progress.

    Science.gov (United States)

    Fernández, Jesús; Toro, Miguel Á; Sonesson, Anna K; Villanueva, Beatriz

    2014-01-01

    The success of an aquaculture breeding program critically depends on the way in which the base population of breeders is constructed since all the genetic variability for the traits included originally in the breeding goal as well as those to be included in the future is contained in the initial founders. Traditionally, base populations were created from a number of wild strains by sampling equal numbers from each strain. However, for some aquaculture species improved strains are already available and, therefore, mean phenotypic values for economically important traits can be used as a criterion to optimize the sampling when creating base populations. Also, the increasing availability of genome-wide genotype information in aquaculture species could help to refine the estimation of relationships within and between candidate strains and, thus, to optimize the percentage of individuals to be sampled from each strain. This study explores the advantages of using phenotypic and genome-wide information when constructing base populations for aquaculture breeding programs in terms of initial and subsequent trait performance and genetic diversity level. Results show that a compromise solution between diversity and performance can be found when creating base populations. Up to 6% higher levels of phenotypic performance can be achieved at the same level of global diversity in the base population by optimizing the selection of breeders instead of sampling equal numbers from each strain. The higher performance observed in the base population persisted during 10 generations of phenotypic selection applied in the subsequent breeding program.

  10. A metaheuristic optimization framework for informative gene selection

    Directory of Open Access Journals (Sweden)

    Kaberi Das

    Full Text Available This paper presents a metaheuristic framework using Harmony Search (HS) with a Genetic Algorithm (GA) for gene selection. The internal architecture of the proposed model works in two phases: in the first phase, the model hybridizes HS with GA to compute and evaluate the fitness of randomly selected solutions of binary strings, and HS then ranks the solutions in descending order of fitness. In the second phase, offspring are generated using the crossover and mutation operations of GA, and those offspring whose fitness, evaluated by an SVM classifier, exceeds that of their parents are selected for the next generation. The accuracy of the final gene subsets obtained from this model has been evaluated using SVM classifiers. The merit of this approach is analyzed through experimental results on five benchmark datasets, and the results show an impressive accuracy over existing feature selection approaches. The occurrence of gene subsets selected by this model has also been computed, and the most often selected gene subsets with a probability of [0.1–0.9] have been chosen as optimal sets of informative genes. Finally, the performance of those selected informative gene subsets has been measured and established through probabilistic measures. Keywords: Gene Selection, Metaheuristic, Harmony Search Algorithm, Genetic Algorithm, SVM

  11. A fuzzy multi-criteria decision model for integrated suppliers selection and optimal order allocation in the green supply chain

    Directory of Open Access Journals (Sweden)

    Hamzeh Amin-Tahmasbi

    2018-09-01

    Full Text Available Today, with the advancement of technology in the production process of various products, the achievement of sustainable production and development has become one of the main concerns of factories and manufacturing organizations. In the same vein, many manufacturers try to select suppliers in their upstream supply chains that have the best performance in terms of sustainable development criteria. In this research, a new multi-criteria decision-making model for selecting suppliers and assigning orders in the green supply chain is presented with a fuzzy optimization approach. Due to uncertainty in supplier capacity as well as customer demand, the problem is formulated as fuzzy multi-objective linear programming (FMOLP). The proposed model is evaluated on supplier selection for SAPCO Corporation. First, in order to select and rank suppliers in a green supply chain, a network structure of criteria is defined with five main criteria: cost, quality, delivery, technology and environmental benefits. Subsequently, using incomplete fuzzy linguistic relationships, pairwise comparisons between the criteria and sub-criteria, as well as the performance of the alternatives, are assessed. The results of these comparisons rank the existing suppliers in terms of performance and determine their utility. The output of these calculations (the utility index) is used in the optimization model. In the order allocation process, the two objective functions of purchase cost and purchase value are then optimized simultaneously. Finally, the order quantity is determined for each supplier in each period.

  12. Project evaluation and selection using fuzzy Delphi method and zero - one goal programming

    Science.gov (United States)

    Alias, Suriana; Adna, Nofarziah; Arsad, Roslah; Soid, Siti Khuzaimah; Ali, Zaileha Md

    2014-12-01

    Project evaluation and selection is an important concern for a board of directors trying to maximize all possible goals. Assessment of the problems occurring in the organization's plan is the first phase of the decision-making process. The company needs a group of experts to evaluate the problems. The Fuzzy Delphi Method (FDM) is a systematic procedure to elicit the group's opinion in order to best evaluate the project performance. This paper proposes an evaluation and selection of the best alternative project based on a combination of FDM and Zero-One Goal Programming (ZOGP). ZOGP is used to solve the multi-criteria decision making for the final decision part using the optimization software LINDO 6.1. An empirical example of an ongoing decision-making project in Johor, Malaysia is implemented as a case study.
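Zero-one goal programming can be sketched at toy scale by brute force: binary project-selection variables, two goals (budget and benefit), and a weighted sum of unwanted deviations to minimize. The project data, goal levels, and deviation weights below are hypothetical; a solver such as LINDO handles the same formulation at realistic scale.

```python
from itertools import product

# Hypothetical data: three candidate projects with cost and benefit scores.
cost    = [40, 70, 50]
benefit = [30, 60, 45]
budget_goal, benefit_goal = 100, 90   # target levels for the two goals

def zogp_deviation(x):
    """Weighted sum of unwanted deviations from both goals (to minimize)."""
    total_cost = sum(c * xi for c, xi in zip(cost, x))
    total_ben  = sum(b * xi for b, xi in zip(benefit, x))
    over_budget  = max(0, total_cost - budget_goal)   # penalize overspending
    under_target = max(0, benefit_goal - total_ben)   # penalize missing the benefit goal
    return 1.0 * over_budget + 1.0 * under_target     # equal weights (assumed)

# Zero-one decision variables: enumerate all 2^3 selections.
best = min(product([0, 1], repeat=3), key=zogp_deviation)
print(best, zogp_deviation(best))
```

Changing the deviation weights expresses the decision makers' priorities among goals, which is where the FDM-elicited expert opinion would enter.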

  13. Optimization of a pump-pipe system by dynamic programming

    DEFF Research Database (Denmark)

    Vidal, Rene Victor Valqui; Ferreira, Jose S.

    1984-01-01

    In this paper the problem of minimizing the total cost of a pump-pipe system in series is considered. The route of the pipeline and the number of pumping stations are known. The optimization then consists in determining the control variables: the diameter and thickness of the pipe and the size of the pumps. A general mathematical model is formulated and Dynamic Programming is used to find an optimal solution.
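A minimal dynamic-programming sketch of such a staged pipe-sizing problem: each segment picks a diameter from a discrete set, cheaper pipe loses more pumping head, and a total head budget couples the stages. All numbers are invented for illustration and do not come from the paper's model.

```python
from functools import lru_cache

# Hypothetical discrete options: (diameter in m, pipe cost per segment, head loss).
options = [(0.3, 50, 12), (0.4, 80, 6), (0.5, 120, 3)]
n_segments, head_budget = 4, 30   # total allowable head loss along the line

@lru_cache(maxsize=None)
def min_cost(segment, head_left):
    """Cheapest way to build segments `segment..n_segments-1` with `head_left` head."""
    if segment == n_segments:
        return 0
    best = float("inf")
    for _diameter, cost, loss in options:
        if loss <= head_left:                 # this diameter is feasible here
            best = min(best, cost + min_cost(segment + 1, head_left - loss))
    return best

print(min_cost(0, head_budget))
```

The state (remaining head) is what makes the segments interdependent; without it, each segment could be sized independently and no dynamic program would be needed.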

  14. Selection of magnetorheological brake types via optimal design considering maximum torque and constrained volume

    International Nuclear Information System (INIS)

    Nguyen, Q H; Choi, S B

    2012-01-01

    This research focuses on optimal design of different types of magnetorheological brakes (MRBs), from which an optimal selection of MRB types is identified. In the optimization, common types of MRB such as disc-type, drum-type, hybrid-types, and T-shaped type are considered. The optimization problem is to find the optimal value of significant geometric dimensions of the MRB that can produce a maximum braking torque. The MRB is constrained in a cylindrical volume of a specific radius and length. After a brief description of the configuration of MRB types, the braking torques of the MRBs are derived based on the Herschel–Bulkley model of the MR fluid. The optimal design of MRBs constrained in a specific cylindrical volume is then analysed. The objective of the optimization is to maximize the braking torque while the torque ratio (the ratio of maximum braking torque and the zero-field friction torque) is constrained to be greater than a certain value. A finite element analysis integrated with an optimization tool is employed to obtain optimal solutions of the MRBs. Optimal solutions of MRBs constrained in different volumes are obtained based on the proposed optimization procedure. From the results, discussions on the optimal selection of MRB types depending on constrained volumes are given. (paper)

  15. The stock selection problem: Is the stock selection approach more important than the optimization method? Evidence from the Danish stock market

    OpenAIRE

    Grobys, Klaus

    2011-01-01

    Passive investment strategies basically aim to replicate an underlying benchmark. Thereby, the management usually selects a subset of stocks being employed in the optimization procedure. Apart from the optimization procedure, the stock selection approach determines the stock portfolios' out-of-sample performance. The empirical study here takes into account the Danish stock market from 2000-2010 and gives evidence that stock portfolios including small companies' stocks being estimated via coin...

  16. Selective Placement Program Coordinator (SPPC) Directory

    Data.gov (United States)

    Office of Personnel Management — List of the Selective Placement Program Coordinators (SPPC) in Federal agencies, updated as needed. Users can filter the list by choosing a state and/or agency name.

  17. A Linear Programming Model to Optimize Various Objective Functions of a Foundation Type State Support Program.

    Science.gov (United States)

    Matzke, Orville R.

    The purpose of this study was to formulate a linear programming model to simulate a foundation type support program and to apply this model to a state support program for the public elementary and secondary school districts in the State of Iowa. The model was successful in producing optimal solutions to five objective functions proposed for…

  18. An Examination of Program Selection Criteria for Part-Time MBA Students

    Science.gov (United States)

    Colburn, Michael; Fox, Daniel E.; Westerfelt, Debra Kay

    2011-01-01

    Prospective graduate students select a graduate program as a result of a multifaceted decision-making process. This study examines the selection criteria that part-time MBA students used in selecting a program at a private university. Further, it analyzes the methods by which the students first learned of the MBA program. The authors posed the…

  19. MULTI-CRITERIA PROGRAMMING METHODS AND PRODUCTION PLAN OPTIMIZATION PROBLEM SOLVING IN METAL INDUSTRY

    Directory of Open Access Journals (Sweden)

    Tunjo Perić

    2017-09-01

    Full Text Available This paper presents production plan optimization in the metal industry considered as a multi-criteria programming problem. We first provide the definition of the multi-criteria programming problem and a classification of multi-criteria programming methods. We then apply two multi-criteria programming methods (the STEM method and the PROMETHEE method) to solve a multi-criteria production plan optimization problem in a company from the metal industry. The obtained results indicate a high efficiency of the applied methods in solving the problem.

  20. Averaging and Linear Programming in Some Singularly Perturbed Problems of Optimal Control

    Energy Technology Data Exchange (ETDEWEB)

    Gaitsgory, Vladimir, E-mail: vladimir.gaitsgory@mq.edu.au [Macquarie University, Department of Mathematics (Australia); Rossomakhine, Sergey, E-mail: serguei.rossomakhine@flinders.edu.au [Flinders University, Flinders Mathematical Sciences Laboratory, School of Computer Science, Engineering and Mathematics (Australia)

    2015-04-15

    The paper aims at the development of an apparatus for analysis and construction of near optimal solutions of singularly perturbed (SP) optimal control problems (that is, problems of optimal control of SP systems) considered on the infinite time horizon. We mostly focus on problems with time discounting criteria but a possibility of the extension of results to periodic optimization problems is discussed as well. Our consideration is based on earlier results on averaging of SP control systems and on linear programming formulations of optimal control problems. The idea that we exploit is to first asymptotically approximate a given problem of optimal control of the SP system by a certain averaged optimal control problem, then reformulate this averaged problem as an infinite-dimensional linear programming (LP) problem, and then approximate the latter by semi-infinite LP problems. We show that the optimal solutions of these semi-infinite LP problems and their duals (which can be found with the help of a modification of available LP software) allow one to construct near optimal controls of the SP system. We demonstrate the construction with two numerical examples.

  1. Optimal Design of Measurement Programs for the Parameter Identification of Dynamic Systems

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Sørensen, John Dalsgaard; Brincker, Rune

    The design of measurement programs devoted to parameter identification of structural dynamic systems is considered. The design problem is formulated as an optimization problem to minimize the total expected cost, that is, the cost of failure plus the cost of the measurement program. All the calculations are based on a priori knowledge and engineering judgement. One of the contributions of the approach is that the optimal number of sensors can be estimated. This is shown in a numerical example where the proposed approach is demonstrated. The example is concerned with design of a measurement program...

  2. Optimal Design of Measurement Programs for the Parameter Identification of Dynamic Systems

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Sørensen, John Dalsgaard; Brincker, Rune

    The design of a measurement program devoted to parameter identification of structural dynamic systems is considered. The design problem is formulated as an optimization problem to minimize the total expected cost of the measurement program. All the calculations are based on a priori knowledge and engineering judgement. One of the contributions of the approach is that the optimal number of sensors can be estimated. This is shown in a numerical example where the proposed approach is demonstrated. The example is concerned with design of a measurement program for estimating the modal damping parameters...

  3. Review: Optimization methods for groundwater modeling and management

    Science.gov (United States)

    Yeh, William W.-G.

    2015-09-01

    Optimization methods have been used in groundwater modeling as well as for the planning and management of groundwater systems. This paper reviews and evaluates the various optimization methods that have been used for solving the inverse problem of parameter identification (estimation), experimental design, and groundwater planning and management. Various model selection criteria are discussed, as well as criteria used for model discrimination. The inverse problem of parameter identification concerns the optimal determination of model parameters using water-level observations. In general, the optimal experimental design seeks to find sampling strategies for the purpose of estimating the unknown model parameters. A typical objective of optimal conjunctive-use planning of surface water and groundwater is to minimize the operational costs of meeting water demand. The optimization methods include mathematical programming techniques such as linear programming, quadratic programming, dynamic programming, stochastic programming, nonlinear programming, and the global search algorithms such as genetic algorithms, simulated annealing, and tabu search. Emphasis is placed on groundwater flow problems as opposed to contaminant transport problems. A typical two-dimensional groundwater flow problem is used to explain the basic formulations and algorithms that have been used to solve the formulated optimization problems.

  4. Optimal Risk Reduction in the Railway Industry by Using Dynamic Programming

    OpenAIRE

    Michael Todinov; Eberechi Weli

    2013-01-01

    The paper suggests for the first time the use of dynamic programming techniques for optimal risk reduction in the railway industry. It is shown that by using the concept ‘amount of removed risk by a risk reduction option’, the problem related to optimal allocation of a fixed budget to achieve a maximum risk reduction in the railway industry can be reduced to an optimisation problem from dynamic programming. For n risk reduction options and size of the available risk reduction budget B (expres...
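The "amount of removed risk" formulation reduces, in its simplest discrete form, to a knapsack-style dynamic program: with a fixed budget, select risk reduction options to maximize the total removed risk. The option costs, risk reductions, and budget below are hypothetical, and each option is assumed selectable at most once.

```python
# Hypothetical risk reduction options: (cost, amount of risk removed).
options = [(2, 30), (3, 45), (4, 55), (5, 70)]
budget = 7   # available risk reduction budget, in the same units as the costs

# removed[b] = maximum risk removable with budget b (0/1 choice per option)
removed = [0] * (budget + 1)
for cost, reduction in options:
    for b in range(budget, cost - 1, -1):   # iterate downwards: each option used once
        removed[b] = max(removed[b], removed[b - cost] + reduction)

print(removed[budget])
```

The table also yields the whole budget-versus-removed-risk frontier for free: `removed[b]` for every b up to the budget, which is useful when the budget itself is negotiable.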

  5. Selective waste collection optimization in Romania and its impact to urban climate

    Science.gov (United States)

    Šercăianu, Mihai; Iacoboaea, Cristina; Petrescu, Florian; Aldea, Mihaela; Luca, Oana; Gaman, Florian; Parlow, Eberhard

    2016-08-01

    According to European Directives, transposed into national legislation, the Member States should organize separate collection systems at least for paper, metal, plastic, and glass until 2015. In Romania, since 2011 only 12% of collected municipal waste has been recovered, the rest being stored in landfills, although storage is considered the last option in the waste hierarchy. At the same time, only 4% of municipal waste has been collected selectively. Surveys have shown that Romanian people do not have selective collection bins close to their residences. The article aims to analyze the current situation in Romania in the field of waste collection and management and to propose a layout of selective collection containers, using geographic information systems tools, for a case study in Romania. Route optimization is performed based on remote sensing technologies and network analyst protocols. By optimizing the selective collection system, greenhouse gas, particle and dust emissions can be reduced.

  6. Adaptive dynamic programming with applications in optimal control

    CERN Document Server

    Liu, Derong; Wang, Ding; Yang, Xiong; Li, Hongliang

    2017-01-01

    This book covers the most recent developments in adaptive dynamic programming (ADP). The text begins with a thorough background review of ADP making sure that readers are sufficiently familiar with the fundamentals. In the core of the book, the authors address first discrete- and then continuous-time systems. Coverage of discrete-time systems starts with a more general form of value iteration to demonstrate its convergence, optimality, and stability with complete and thorough theoretical analysis. A more realistic form of value iteration is studied where value function approximations are assumed to have finite errors. Adaptive Dynamic Programming also details another avenue of the ADP approach: policy iteration. Both basic and generalized forms of policy-iteration-based ADP are studied with complete and thorough theoretical analysis in terms of convergence, optimality, stability, and error bounds. Among continuous-time systems, the control of affine and nonaffine nonlinear systems is studied using the ADP app...

  7. Iterative Selection of Unknown Weights in Direct Weight Optimization Identification

    Directory of Open Access Journals (Sweden)

    Xiao Xuan

    2014-01-01

    Full Text Available For direct weight optimization identification of a nonlinear system, we add linear terms involving the input sequences to the former linear affine function so as to better approximate the nonlinear behavior. This paper derives, from theoretical analysis and engineering practice respectively, a detailed procedure for choosing the two classes of unknown weights in the added linear terms, and establishes their key roles. From the theoretical analysis, the auxiliary role of the added unknown weights in approximating the nonlinear system can be understood. From the practical analysis, we learn how to transform a complex optimization problem into a corresponding common quadratic programming problem. The quadratic programming problem can then be solved by the basic interior point method. Finally, the efficiency and feasibility of the proposed strategies are confirmed by the simulation results.

  8. Markdown Optimization via Approximate Dynamic Programming

    Directory of Open Access Journals (Sweden)

    Coşgun

    2013-02-01

    Full Text Available We consider the markdown optimization problem faced by a leading apparel retail chain. Because of substitution among products, the markdown policy of one product affects the sales of other products. Therefore, markdown policies for product groups having a significant cross-price elasticity among each other should be jointly determined. Since the state space of the problem is very large, we use Approximate Dynamic Programming. Finally, we provide insights on how each product's price affects the markdown policy.
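For intuition, a markdown problem with a manageable state space can be solved exactly by backward induction over (week, inventory); the approximate-DP machinery of the paper is needed when cross-product substitution blows this state space up. The single-product price levels and demand figures below are invented for illustration.

```python
from functools import lru_cache

# Toy single-product markdown: pick a price each week; demand rises as price falls.
prices = [100, 80, 60]                 # allowed price levels (markdown rules ignored)
demand = {100: 1, 80: 2, 60: 3}        # hypothetical units demanded per week
weeks, stock = 3, 4

@lru_cache(maxsize=None)
def revenue(week, inventory):
    """Maximum revenue obtainable from `week` onward with `inventory` units left."""
    if week == weeks or inventory == 0:
        return 0
    best = 0
    for p in prices:
        sold = min(demand[p], inventory)           # cannot sell more than is on hand
        best = max(best, p * sold + revenue(week + 1, inventory - sold))
    return best

print(revenue(0, stock))
```

With n substitutable products, the state becomes a vector of n inventories (and demand a function of all n prices), which is exactly the curse of dimensionality that motivates value-function approximation.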

  9. Statistical models for optimizing mineral exploration

    International Nuclear Information System (INIS)

    Wignall, T.K.; DeGeoffroy, J.

    1987-01-01

    The primary purpose of mineral exploration is to discover ore deposits. The emphasis of this volume is on the mathematical and computational aspects of optimizing mineral exploration. The seven chapters that make up the main body of the book are devoted to the description and application of various types of computerized geomathematical models. These chapters include: (1) the optimal selection of ore deposit types and regions of search, as well as prospecting selected areas, (2) designing airborne and ground field programs for the optimal coverage of prospecting areas, and (3) delineating and evaluating exploration targets within prospecting areas by means of statistical modeling. Many of these statistical programs are innovative and are designed to be useful for mineral exploration modeling. Examples of geomathematical models are applied to exploring for six main types of base and precious metal deposits, as well as other mineral resources (such as bauxite and uranium)

  10. Mathematical Optimization Algorithm for Minimizing the Cost Function of GHG Emission in AS/RS Using Positive Selection Based Clonal Selection Principle

    Science.gov (United States)

    Mahalakshmi; Murugesan, R.

    2018-04-01

    This paper concerns the minimization of the total cost of Greenhouse Gas (GHG) emission in an Automated Storage and Retrieval System (AS/RS). A mathematical model is constructed based on the tax cost, penalty cost and discount cost of GHG emission of the AS/RS. A two-stage algorithm, namely the positive selection based clonal selection principle (PSBCSP), is used to find the optimal solution of the constructed model. In the first stage, the positive selection principle is used to reduce the search space of the optimal solution by fixing a threshold value. In the second stage, the clonal selection principle is used to generate the best solutions. The obtained results are compared with those of other existing algorithms in the literature, showing that the proposed algorithm yields a better result.

  11. Project STOP (Spectral Thermal Optimization Program)

    Science.gov (United States)

    Goldhammer, L. J.; Opjorden, R. W.; Goodelle, G. S.; Powe, J. S.

    1977-01-01

    The spectral thermal optimization of solar cell configurations for various solar panel applications is considered. The method of optimization depends upon varying the solar cell configuration's optical characteristics to minimize panel temperatures, maximize power output and decrease the power delta from beginning of life to end of life. Four areas of primary investigation are: (1) testing and evaluation of ultraviolet resistant coverslide adhesives, primarily FEP as an adhesive; (2) examination of solar cell absolute spectral response and corresponding cell manufacturing processes that affect it; (3) experimental work with solar cell manufacturing processes that vary cell reflectance (solar absorptance); and (4) experimental and theoretical studies with various coverslide filter designs, mainly a red rejection filter. The Hughes' solar array prediction program has been modified to aid in evaluating the effect of each of the above four areas on the output of a solar panel in orbit.

  12. Optimization programs for reactor core fuel loading exhibiting reduced neutron leakage

    International Nuclear Information System (INIS)

    Darilek, P.

    1991-01-01

    The program MAXIM was developed for the optimization of the fuel loading of WWER-440 reactors. It enables the reactor core reactivity to be maximized by modifying the arrangement of the fuel assemblies. The procedure is divided into three steps. The first step includes the passage from the three-dimensional model of the reactor core to the two-dimensional model. In the second step, the solution to the problem is sought assuming that the multiplying properties, or the reactivity in the zones of the core, vary continuously. In the third step, parameters of actual fuel assemblies are inserted in the ''continuous'' solution obtained. Combined with the program PROPAL for a detailed refinement of the loading, the program MAXIM forms a basis for the development of programs for the optimization of fuel loading with burnable poisons. (Z.M.). 16 refs

  13. Dynamic Programming Approach for Exact Decision Rule Optimization

    KAUST Repository

    Amin, Talha

    2013-01-01

    This chapter is devoted to the study of an extension of dynamic programming approach that allows sequential optimization of exact decision rules relative to the length and coverage. It contains also results of experiments with decision tables from UCI Machine Learning Repository. © Springer-Verlag Berlin Heidelberg 2013.

  14. Optimal portfolio selection for general provisioning and terminal wealth problems

    NARCIS (Netherlands)

    van Weert, K.; Dhaene, J.; Goovaerts, M.

    2010-01-01

    In Dhaene et al. (2005), multiperiod portfolio selection problems are discussed, using an analytical approach to find optimal constant mix investment strategies in a provisioning or a savings context. In this paper we extend some of these results, investigating some specific, real-life situations.

  15. Optimal portfolio selection for general provisioning and terminal wealth problems

    NARCIS (Netherlands)

    van Weert, K.; Dhaene, J.; Goovaerts, M.

    2009-01-01

    In Dhaene et al. (2005), multiperiod portfolio selection problems are discussed, using an analytical approach to find optimal constant mix investment strategies in a provisioning or savings context. In this paper we extend some of these results, investigating some specific, real-life situations. The

  16. Selection of an optimal neural network architecture for computer-aided detection of microcalcifications - Comparison of automated optimization techniques

    International Nuclear Information System (INIS)

    Gurcan, Metin N.; Sahiner, Berkman; Chan, Heang-Ping; Hadjiiski, Lubomir; Petrick, Nicholas

    2001-01-01

    Many computer-aided diagnosis (CAD) systems use neural networks (NNs) for either detection or classification of abnormalities. Currently, most NNs are 'optimized' by manual search in a very limited parameter space. In this work, we evaluated the use of automated optimization methods for selecting an optimal convolution neural network (CNN) architecture. Three automated methods, steepest descent (SD), simulated annealing (SA), and a genetic algorithm (GA), were compared. We used as an example the CNN that classifies true and false microcalcifications detected on digitized mammograms by a prescreening algorithm. Four parameters of the CNN architecture were considered for optimization: the numbers of node groups and the filter kernel sizes in the first and second hidden layers, resulting in a search space of 432 possible architectures. The area Az under the receiver operating characteristic (ROC) curve was used to design a cost function. The SA experiments were conducted with four different annealing schedules. Three different parent selection methods were compared for the GA experiments. An available data set was split into two groups with approximately equal numbers of samples. By using the two groups alternately for training and testing, two different cost surfaces were evaluated. For the first cost surface, the SD method was trapped in a local minimum 91% (392/432) of the time. The SA using the Boltzman schedule selected the best architecture after evaluating, on average, 167 architectures. The GA achieved its best performance with linearly scaled roulette-wheel parent selection; however, it evaluated 391 different architectures, on average, to find the best one. The second cost surface contained no local minimum. For this surface, a simple SD algorithm could quickly find the global minimum, but the SA with the very fast reannealing schedule was still the most efficient. The same SA scheme, however, was trapped in a local minimum on the first cost surface.
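    The simulated annealing search over a discrete architecture grid can be sketched as follows. The four parameters mirror the abstract, but the grid values and the stand-in cost function (a proxy for 1 − Az that would normally require training and testing the CNN) are invented for illustration:

```python
import math
import random

random.seed(0)

# Hypothetical discrete search space mirroring the four CNN architecture
# parameters (node groups and filter kernel sizes in two hidden layers).
SPACE = {
    "groups1": [4, 8, 12],
    "groups2": [4, 8, 12, 16],
    "kernel1": [3, 5, 7, 9],
    "kernel2": [3, 5, 7, 9],
}

def cost(arch):
    # Stand-in for 1 - Az; any function over the grid serves to
    # demonstrate the search itself.
    return ((arch["groups1"] - 8) ** 2 + (arch["groups2"] - 12) ** 2
            + (arch["kernel1"] - 5) ** 2 + (arch["kernel2"] - 7) ** 2)

def neighbour(arch):
    # Move one randomly chosen parameter to an adjacent grid value.
    key = random.choice(list(SPACE))
    values = SPACE[key]
    i = values.index(arch[key])
    j = max(0, min(len(values) - 1, i + random.choice([-1, 1])))
    new = dict(arch)
    new[key] = values[j]
    return new

def anneal(t0=10.0, alpha=0.95, steps=500):
    arch = {k: random.choice(v) for k, v in SPACE.items()}
    best, best_c, t = arch, cost(arch), t0
    for _ in range(steps):
        cand = neighbour(arch)
        delta = cost(cand) - cost(arch)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if delta <= 0 or random.random() < math.exp(-delta / t):
            arch = cand
        if cost(arch) < best_c:
            best, best_c = arch, cost(arch)
        t *= alpha  # geometric cooling schedule
    return best, best_c

print(anneal())
```

    The cooling schedule and acceptance rule are the generic Boltzmann form; the paper compares four schedules, which here would just be different update rules for `t`.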

  17. Computer-Aided Communication Satellite System Analysis and Optimization.

    Science.gov (United States)

    Stagl, Thomas W.; And Others

    Various published computer programs for fixed/broadcast communication satellite system synthesis and optimization are discussed. The rationale for selecting General Dynamics/Convair's Satellite Telecommunication Analysis and Modeling Program (STAMP) in modified form to aid in the system costing and sensitivity analysis work in the Program on…

  18. Joint Optimization of Receiver Placement and Illuminator Selection for a Multiband Passive Radar Network.

    Science.gov (United States)

    Xie, Rui; Wan, Xianrong; Hong, Sheng; Yi, Jianxin

    2017-06-14

    The performance of a passive radar network can be greatly improved by an optimal radar network structure. Generally, radar network structure optimization consists of two aspects, namely the placement of receivers in suitable places and selection of appropriate illuminators. The present study investigates issues concerning the joint optimization of receiver placement and illuminator selection for a passive radar network. Firstly, the required radar cross section (RCS) for target detection is chosen as the performance metric, and the joint optimization model boils down to the partition p-center problem (PPCP). The PPCP is then solved by a proposed bisection algorithm. The key of the bisection algorithm lies in solving the partition set covering problem (PSCP), which can be solved by a hybrid algorithm developed by coupling convex optimization with a greedy dropping algorithm. Finally, the performance of the proposed algorithm is validated via numerical simulations.
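    The set-covering core of the PSCP can be illustrated with the textbook greedy heuristic for set covering, a close relative of the greedy dropping step the abstract mentions; the candidate receiver sites and the grid cells they cover below are invented:

```python
def greedy_set_cover(universe, subsets):
    """Classic greedy approximation: repeatedly pick the set that
    covers the most still-uncovered elements."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        best = max(subsets, key=lambda name: len(subsets[name] & uncovered))
        if not subsets[best] & uncovered:
            raise ValueError("universe cannot be covered")
        chosen.append(best)
        uncovered -= subsets[best]
    return chosen

# Hypothetical coverage map: candidate receiver site -> grid cells covered.
sites = {"A": {1, 2, 3}, "B": {3, 4}, "C": {4, 5, 6}, "D": {1, 6}}
print(greedy_set_cover({1, 2, 3, 4, 5, 6}, sites))  # ['A', 'C']
```

    The greedy rule gives the well-known ln(n) approximation guarantee; the paper's hybrid couples such a combinatorial step with a convex relaxation.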

  19. Optimal relay selection and power allocation for cognitive two-way relaying networks

    KAUST Repository

    Pandarakkottilil, Ubaidulla

    2012-06-01

    In this paper, we present an optimal scheme for power allocation and relay selection in a cognitive radio network where a pair of cognitive (or secondary) transceiver nodes communicate with each other assisted by a set of cognitive two-way relays. The secondary nodes share the spectrum with a licensed primary user (PU), and each node is assumed to be equipped with a single transmit/receive antenna. The interference to the PU resulting from the transmission from the cognitive nodes is kept below a specified limit. We propose joint relay selection and optimal power allocation among the secondary user (SU) nodes achieving maximum throughput under transmit power and PU interference constraints. A closed-form solution for optimal allocation of transmit power among the SU transceivers and the SU relay is presented. Furthermore, numerical simulations and comparisons are presented to illustrate the performance of the proposed scheme. © 2012 IEEE.

  20. Asymptotic Normality of the Optimal Solution in Multiresponse Surface Mathematical Programming

    OpenAIRE

    Díaz-García, José A.; Caro-Lopera, Francisco J.

    2015-01-01

    An explicit form for the perturbation effect on the matrix of regression coefficients on the optimal solution in multiresponse surface methodology is obtained in this paper. Then, the sensitivity analysis of the optimal solution is studied and the critical point characterisation of the convex program, associated with the optimum of a multiresponse surface, is also analysed. Finally, the asymptotic normality of the optimal solution is derived by the standard methods.

  1. Global Optimal Energy Management Strategy Research for a Plug-In Series-Parallel Hybrid Electric Bus by Using Dynamic Programming

    Directory of Open Access Journals (Sweden)

    Hongwen He

    2013-01-01

    Full Text Available Energy management strategy greatly influences the power performance and fuel economy of plug-in hybrid electric vehicles. To explore the fuel-saving potential of a plug-in hybrid electric bus (PHEB), this paper searched for the global optimal energy management strategy using a dynamic programming (DP) algorithm. Firstly, the simplified backward model of the PHEB was built, which is necessary for the DP algorithm. Then the torque and speed of the engine and the torque of the motor were selected as the control variables, and the battery state of charge (SOC) was selected as the state variable. The DP solution procedure was outlined, and the procedure for finding all feasible control variables at every state of each stage was presented in detail. Finally, the appropriate SOC increment is determined after quantizing the state variable, and the optimal control over a long driving distance of a specific driving cycle is replaced with the optimal control of one driving cycle, which reduces the computational time significantly while keeping the precision. The simulation results show that the fuel economy of the PHEB with the optimal energy management strategy is improved by 53.7% compared with that of the conventional bus, which can be a benchmark for the assessment of other control strategies.
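    The backward DP over a quantized SOC grid can be sketched on a toy problem. The demand profile, the convex engine fuel model, and the grid resolution below are all illustrative assumptions, not the paper's actual PHEB model:

```python
DEMAND = [30, 50, 20, 40]                # kW required at each stage
SOC_GRID = [i / 10 for i in range(11)]   # battery SOC, 0.0 .. 1.0
BATT_STEP = 10                           # kW delivered per 0.1 SOC drop

def fuel(engine_kw):
    # Toy convex fuel-rate model for the engine.
    return 0.1 * engine_kw + 0.002 * engine_kw ** 2

def optimal_fuel(soc0=1.0):
    # Backward DP: value[s] = minimal remaining fuel when SOC equals s.
    value = {s: 0.0 for s in SOC_GRID}   # terminal cost is zero
    for t in reversed(range(len(DEMAND))):
        new = {}
        for s in SOC_GRID:
            best = float("inf")
            for s_next in SOC_GRID:
                batt = round((s - s_next) * 10) * BATT_STEP  # battery kW
                engine = DEMAND[t] - batt
                if engine < 0:
                    continue             # engine cannot absorb power here
                best = min(best, fuel(engine) + value[s_next])
            new[s] = best
        value = new
    return value[soc0]

print(round(optimal_fuel(), 2))  # 4.8
```

    Because the fuel model is convex, the optimum spreads the engine load evenly (10 kW per stage) and spends the whole battery, exactly the smoothing behaviour DP is used for in the paper.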

  2. POBE: A Computer Program for Optimal Design of Multi-Subject Blocked fMRI Experiments

    Directory of Open Access Journals (Sweden)

    Bärbel Maus

    2014-01-01

    For functional magnetic resonance imaging (fMRI) studies, researchers can use multi-subject blocked designs to identify active brain regions for a certain stimulus type of interest. Before performing such an experiment, careful planning is necessary to obtain efficient stimulus effect estimators within the available financial resources. The optimal number of subjects and the optimal scanning time for a multi-subject blocked design with fixed experimental costs can be determined using optimal design methods. In this paper, the user-friendly computer program POBE 1.2 (program for optimal design of blocked experiments, version 1.2) is presented. POBE provides a graphical user interface for fMRI researchers to easily and efficiently design their experiments. The computer program POBE calculates the optimal number of subjects and the optimal scanning time for user-specified experimental factors and model parameters so that the statistical efficiency is maximised for a given study budget. POBE can also be used to determine the minimum budget for a given power. Furthermore, a maximin design can be determined as an efficient design for a possible range of values for the unknown model parameters. In this paper, the computer program is described and illustrated with typical experimental factors for a blocked fMRI experiment.

  3. Optimal Licensing Contracts with Adverse Selection and Informational Rents

    Directory of Open Access Journals (Sweden)

    Daniela MARINESCU

    2011-06-01

    In the paper we analyse a model for determining the optimal licensing contract in both situations of symmetric and asymmetric information between the license’s owner and the potential buyer. Next we present another way of solving the corresponding adverse selection model, using the informational rents as variables. This approach is different from that of Macho-Stadler and Perez-Castrillo.

  4. Optimal Subinterval Selection Approach for Power System Transient Stability Simulation

    Directory of Open Access Journals (Sweden)

    Soobae Kim

    2015-10-01

    Power system transient stability analysis requires an appropriate integration time step to avoid numerical instability as well as to reduce computational demands. For fast system dynamics, which vary more rapidly than what the time step covers, a fraction of the time step, called a subinterval, is used. However, the optimal value of this subinterval is not easily determined because analysis of the system dynamics might be required. This selection is usually made from engineering experience, and perhaps trial and error. This paper proposes an optimal subinterval selection approach for power system transient stability analysis, which is based on modal analysis using a single machine infinite bus (SMIB) system. Fast system dynamics are identified with the modal analysis, and the SMIB system is used focusing on fast local modes. An appropriate subinterval time step from the proposed approach can reduce computational burden and achieve accurate simulation responses as well. The performance of the proposed method is demonstrated with the GSO 37-bus system.
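    A simplified version of the idea, assuming the fast local modes have already been identified, is to size the subinterval so the fastest oscillatory mode is resolved with a fixed number of integration points per cycle. The mode frequencies and the points-per-cycle figure below are invented, not the paper's criterion:

```python
def subinterval(mode_freqs_hz, points_per_cycle=20):
    """Pick an integration subinterval that resolves the fastest mode
    with a fixed number of points per oscillation cycle."""
    return 1.0 / (points_per_cycle * max(mode_freqs_hz))

# Hypothetical local modes at 1.0 Hz and 12.5 Hz: the 12.5 Hz mode governs.
print(subinterval([1.0, 12.5]))  # 0.004
```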

  5. Comparative evaluation of various optimization methods and the development of an optimization code system SCOOP

    International Nuclear Information System (INIS)

    Suzuki, Tadakazu

    1979-11-01

    Thirty-two programs for linear and nonlinear optimization problems with or without constraints have been developed or incorporated, and their stability, convergence and efficiency have been examined. On the basis of these evaluations, the first version of the optimization code system SCOOP-I has been completed. SCOOP-I is designed to be an efficient, reliable, useful and also flexible system for general applications. The system enables one to find the global optimum for a wide class of problems by selecting the most appropriate optimization method built into it. (author)

  6. Signal Timing Optimization Based on Fuzzy Compromise Programming for Isolated Signalized Intersection

    Directory of Open Access Journals (Sweden)

    Dexin Yu

    2016-01-01

    In order to optimize the signal timing for an isolated intersection, a new method based on a fuzzy programming approach is proposed in this paper. Considering the whole operational efficiency of the intersection, traffic capacity, vehicle cycle delay, cycle stops, and exhaust emission are first chosen as optimization goals to establish a multiobjective function. Then the fuzzy compromise programming approach is employed to give different weight coefficients to the various optimization objectives for different traffic flow ratio states, and the multiobjective function is converted to a single objective function. By using a genetic algorithm, the optimized signal cycle and effective green time can be obtained. Finally, the performance of the traditional method and the new method proposed in this paper is compared and analyzed through VISSIM software. It can be concluded that the signal timing optimized in this paper can effectively reduce vehicle delays and stops, and improve the traffic capacity of the intersection.

  7. A multi-objective model for closed-loop supply chain optimization and efficient supplier selection in a competitive environment considering quantity discount policy

    Science.gov (United States)

    Jahangoshai Rezaee, Mustafa; Yousefi, Samuel; Hayati, Jamileh

    2017-06-01

    Supplier selection and allocation of optimal order quantities are two of the most important processes in closed-loop supply chains (CLSC) and reverse logistics (RL): providing high-quality raw material is a basic requirement for a manufacturer to produce popular products and gain market share. On the other hand, in a competitive environment, suppliers have to offer customers incentives such as discounts and enhance the quality of their products in competition with other manufacturers. Therefore, in this study, a model is presented for CLSC optimization, efficient supplier selection, and order allocation considering a quantity discount policy. It is formulated as a multi-objective program based on an integrated simultaneous data envelopment analysis-Nash bargaining game. The objective functions maximize profit and efficiency and minimize defect and delivery delay rates. Besides supplier selection, the suggested model selects refurbishing sites and determines the number of products and parts in each sector of the network. The suggested model is solved using the global criteria method. Furthermore, based on related studies, a numerical example is examined to validate it.

  8. On the Lasserre hierarchy of semidefinite programming relaxations of convex polynomial optimization problems

    NARCIS (Netherlands)

    de Klerk, E.; Laurent, M.

    2011-01-01

    The Lasserre hierarchy of semidefinite programming approximations to convex polynomial optimization problems is known to converge finitely under some assumptions [J. B. Lasserre, Convexity in semialgebraic geometry and polynomial optimization, SIAM J. Optim., 19 (2009), pp. 1995–2014].

  9. An ILP based Algorithm for Optimal Customer Selection for Demand Response in SmartGrids

    Energy Technology Data Exchange (ETDEWEB)

    Kuppannagari, Sanmukh R. [Univ. of Southern California, Los Angeles, CA (United States); Kannan, Rajgopal [Louisiana State Univ., Baton Rouge, LA (United States); Prasanna, Viktor K. [Univ. of Southern California, Los Angeles, CA (United States)

    2015-12-07

    Demand Response (DR) events are initiated by utilities during peak demand periods to curtail consumption. They ensure system reliability and minimize the utility’s expenditure. Selection of the right customers and strategies is critical for a DR event. An effective DR scheduling algorithm minimizes the curtailment error, which is the absolute difference between the achieved curtailment value and the target. State-of-the-art heuristics exist for customer selection, however their curtailment errors are unbounded and can be as high as 70%. In this work, we develop an Integer Linear Programming (ILP) formulation for optimally selecting customers and curtailment strategies that minimize the curtailment error during DR events in SmartGrids. We perform experiments on real world data obtained from the University of Southern California’s SmartGrid and show that our algorithm achieves near exact curtailment values with errors in the range of 10^-7 to 10^-5, which are within the range of numerical errors. We compare our results against the state-of-the-art heuristic being deployed in practice in the USC SmartGrid. We show that for the same set of available customer strategy pairs our algorithm performs 10^3 to 10^7 times better in terms of the curtailment errors incurred.
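    The optimization target — choosing the customer set whose combined curtailment is closest to the target — can be shown with an exhaustive stand-in for the ILP, practical only for a handful of customers (the ILP is what scales this up). The per-customer curtailment figures are invented:

```python
from itertools import combinations

def best_selection(curtailments, target):
    """Pick the customer subset whose total curtailment is closest to
    the target, i.e. the subset minimizing the curtailment error."""
    names = list(curtailments)
    best, best_err = (), float("inf")
    for r in range(len(names) + 1):
        for combo in combinations(names, r):
            err = abs(sum(curtailments[c] for c in combo) - target)
            if err < best_err:
                best, best_err = combo, err
    return set(best), best_err

# Hypothetical per-customer curtailment estimates in kW.
kw = {"c1": 12, "c2": 7, "c3": 5, "c4": 9}
sel, err = best_selection(kw, 21)
print(sorted(sel), err)  # ['c1', 'c4'] 0
```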

  10. Dynamic programming approach to optimization of approximate decision rules

    KAUST Repository

    Amin, Talha M.; Chikalov, Igor; Moshkov, Mikhail; Zielosko, Beata

    2013-01-01

    This paper is devoted to the study of an extension of the dynamic programming approach which allows sequential optimization of approximate decision rules relative to length and coverage. We introduce an uncertainty measure R(T).

  11. Iteration particle swarm optimization for contract capacities selection of time-of-use rates industrial customers

    International Nuclear Information System (INIS)

    Lee, Tsung-Ying; Chen, Chun-Lung

    2007-01-01

    This paper presents a new algorithm for solving the optimal contract capacities of a time-of-use (TOU) rates industrial customer. The algorithm is named iteration particle swarm optimization (IPSO). A new index, called iteration best, is incorporated into particle swarm optimization (PSO) to improve solution quality and computation efficiency. Expanding line construction cost and contract recovery cost are considered, as well as demand contract capacity cost and penalty bill, in the selection of the optimal contract capacities. The resulting optimal contract capacity effectively reaches the minimum electricity charge for TOU rates users, and a significant reduction in electricity costs is observed. The effects of expanding line construction cost and contract recovery cost on the selection of optimal contract capacities can also be estimated. The feasibility of the new algorithm is demonstrated by a numerical example, and the IPSO solution quality and computation efficiency are compared to those of other algorithms.

  12. Status of selected air pollution control programs, February 1992

    International Nuclear Information System (INIS)

    1992-02-01

    The collection of status reports has been prepared in order to provide a timely summary of selected EPA air pollution control activities to those individuals who are involved with the implementation of these programs. The report contains ozone/carbon monoxide (CO) programs; mobile sources programs; particulate matter nominally 10 μm and less (PM-10), sulfur dioxide (SO2) and lead programs; New Source Review (NSR); economics programs; emission standards programs; Indian activity programs; air toxics programs; acid rain programs; permits programs; chlorofluorocarbons programs; enforcement programs; and other programs

  13. Risk-Constrained Dynamic Programming for Optimal Mars Entry, Descent, and Landing

    Science.gov (United States)

    Ono, Masahiro; Kuwata, Yoshiaki

    2013-01-01

    A chance-constrained dynamic programming algorithm was developed that is capable of making optimal sequential decisions within a user-specified risk bound. This work handles stochastic uncertainties over multiple stages in the CEMAT (Combined EDL-Mobility Analyses Tool) framework. It was demonstrated by a simulation of Mars entry, descent, and landing (EDL) using real landscape data obtained from the Mars Reconnaissance Orbiter. Although standard dynamic programming (DP) provides a general framework for optimal sequential decision-making under uncertainty, it typically achieves risk aversion by imposing an arbitrary penalty on failure states. Such a penalty-based approach cannot explicitly bound the probability of mission failure. A key idea behind the new approach is called risk allocation, which decomposes a joint chance constraint into a set of individual chance constraints and distributes risk over them. The joint chance constraint was reformulated into a constraint on an expectation over a sum of an indicator function, which can be incorporated into the cost function by dualizing the optimization problem. As a result, the chance-constrained optimization problem can be turned into an unconstrained optimization over a Lagrangian, which can be solved efficiently using a standard DP approach.

  14. Large-scale hydropower system optimization using dynamic programming and object-oriented programming: the case of the Northeast China Power Grid.

    Science.gov (United States)

    Li, Ji-Qing; Zhang, Yu-Shan; Ji, Chang-Ming; Wang, Ai-Jing; Lund, Jay R

    2013-01-01

    This paper examines long-term optimal operation using dynamic programming for a large hydropower system of 10 reservoirs in Northeast China. Besides considering flow and hydraulic head, the optimization explicitly includes time-varying electricity market prices to maximize benefit. Two techniques are used to reduce the 'curse of dimensionality' of dynamic programming with many reservoirs. Discrete differential dynamic programming (DDDP) reduces the search space and computer memory needed. Object-oriented programming (OOP) and the ability to dynamically allocate and release memory with the C++ language greatly reduces the cumulative effect of computer memory for solving multi-dimensional dynamic programming models. The case study shows that the model can reduce the 'curse of dimensionality' and achieve satisfactory results.

  15. Optimization of axial enrichment and gadolinia distributions for BWR fuel under control rod programming, (2)

    International Nuclear Information System (INIS)

    Hida, Kazuki; Yoshioka, Ritsuo

    1992-01-01

    A method has been developed for optimizing the axial enrichment and gadolinia distributions for the reload BWR fuel under control rod programming. The problem was to minimize the enrichment requirement subject to the criticality and axial power peaking constraints. The optimization technique was based on the successive linear programming method, each linear programming problem being solved by a goal programming algorithm. A rapid and practically accurate core neutronics model, named the modified one-dimensional core model, was developed to describe the batch-averaged burnup behavior of the reload fuel. A core burnup simulation algorithm, employing a burnup-power-void iteration, was also developed to calculate the rigorous equilibrium cycle performance. This method was applied to the optimization of axial two- and 24-region fuels for demonstrative purposes. The optimal solutions for both fuels have proved the optimality of what is called burnup shape optimization spectral shift. For the two-region fuel with a practical power peaking of 1.4, the enrichment distribution was nearly uniform, because a bottom-peaked burnup shape flattens the axial power shape. Optimization of the 24-region fuel has shown a potential improvement in BWR fuel cycle economics, which will guide future advancement in BWR fuel designs. (author)

  16. Multi-Objective Particle Swarm Optimization Approach for Cost-Based Feature Selection in Classification.

    Science.gov (United States)

    Zhang, Yong; Gong, Dun-Wei; Cheng, Jian

    2017-01-01

    Feature selection is an important data-preprocessing technique in classification problems such as bioinformatics and signal processing. Generally, there are situations where a user is interested in not only maximizing the classification performance but also minimizing the cost that may be associated with features. This kind of problem is called cost-based feature selection. However, most existing feature selection approaches treat this task as a single-objective optimization problem. This paper presents the first study of multi-objective particle swarm optimization (PSO) for cost-based feature selection problems. The task of this paper is to generate a Pareto front of nondominated solutions, that is, feature subsets, to meet different requirements of decision-makers in real-world applications. In order to enhance the search capability of the proposed algorithm, a probability-based encoding technology and an effective hybrid operator, together with the ideas of the crowding distance, the external archive, and the Pareto domination relationship, are applied to PSO. The proposed PSO-based multi-objective feature selection algorithm is compared with several multi-objective feature selection algorithms on five benchmark datasets. Experimental results show that the proposed algorithm can automatically evolve a set of nondominated solutions, and it is a highly competitive feature selection method for solving cost-based feature selection problems.
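    The Pareto domination relationship at the heart of the method can be illustrated with a minimal nondominance filter over (error, cost) pairs, both minimized; the candidate values below are invented:

```python
def nondominated(points):
    """Pareto front for two objectives, both minimized
    (e.g. classification error and total feature cost).
    A point is kept unless some other point is at least as good
    in both objectives and not identical."""
    return [p for p in points
            if not any(q != p and q[0] <= p[0] and q[1] <= p[1]
                       for q in points)]

# Hypothetical (error, cost) pairs for candidate feature subsets.
cands = [(0.10, 9), (0.12, 5), (0.08, 14), (0.15, 4), (0.12, 7)]
print(nondominated(cands))  # keeps all but the dominated (0.12, 7)
```

    The full algorithm evolves a swarm toward this front and maintains it in an external archive; the filter above is only the acceptance test.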

  17. TRU Waste Management Program. Cost/schedule optimization analysis

    International Nuclear Information System (INIS)

    Detamore, J.A.; Raudenbush, M.H.; Wolaver, R.W.; Hastings, G.A.

    1985-10-01

    This Current Year Work Plan presents in detail a description of the activities to be performed by the Joint Integration Office Rockwell International (JIO/RI) during FY86. It breaks down the activities into two major work areas: Program Management and Program Analysis. Program Management is performed by the JIO/RI by providing technical planning and guidance for the development of advanced TRU waste management capabilities. This includes equipment/facility design, engineering, construction, and operations. These functions are integrated to allow transition from interim storage to final disposition. JIO/RI tasks include program requirements identification, long-range technical planning, budget development, program planning document preparation, task guidance development, task monitoring, task progress information gathering and reporting to DOE, interfacing with other agencies and DOE lead programs, integrating public involvement with program efforts, and preparation of reports for DOE detailing program status. Program Analysis is performed by the JIO/RI to support identification and assessment of alternatives, and development of long-term TRU waste program capabilities. These analyses include short-term analyses in response to DOE information requests, along with performing an RH Cost/Schedule Optimization report. Systems models will be developed, updated, and upgraded as needed to enhance JIO/RI's capability to evaluate the adequacy of program efforts in various fields. A TRU program data base will be maintained and updated to provide DOE with timely responses to inventory related questions

  18. Optimal Selection of the Sampling Interval for Estimation of Modal Parameters by an ARMA- Model

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning

    1993-01-01

    Optimal selection of the sampling interval for estimation of the modal parameters by an ARMA-model for a white noise loaded structure modelled as a single degree-of-freedom linear mechanical system is considered. An analytical solution for an optimal uniform sampling interval is presented.

  19. Portfolio optimization based on nonparametric estimation methods

    Directory of Open Access Journals (Sweden)

    Mahsa Ghandehari

    2017-03-01

    One of the major issues investors face in capital markets is deciding which stocks to invest in and selecting an optimal portfolio. This process is done through risk and expected-return assessment. In the portfolio selection problem, if the asset returns are normally distributed, variance and standard deviation are used as risk measures. However, expected returns on assets are not necessarily normal and sometimes differ dramatically from the normal distribution. This paper introduces conditional value at risk (CVaR) as a risk measure in a nonparametric framework and, for a given expected return, offers the optimal portfolio; the method is compared with the linear programming method. The data used in this study consist of the monthly returns of 15 companies selected during the winter of 1392 from the top 50 companies in the Tehran Stock Exchange, with returns covering April 1388 to June 1393 in the Iranian calendar. The results of this study show the superiority of the nonparametric method over the linear programming method, and the nonparametric method is much faster.
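    The nonparametric (historical) CVaR estimate at the core of the approach can be sketched directly from a return sample; the monthly returns and confidence level below are invented:

```python
def historical_cvar(returns, alpha=0.8):
    """Nonparametric CVaR: the average loss over the worst (1 - alpha)
    fraction of the empirical return sample (loss = -return)."""
    losses = sorted((-r for r in returns), reverse=True)
    k = max(1, round(len(losses) * (1 - alpha)))
    return sum(losses[:k]) / k

# Hypothetical monthly returns for one candidate portfolio.
monthly = [0.03, -0.08, 0.01, 0.02, -0.01, 0.04, -0.06, 0.02, 0.00, 0.01]
print(round(historical_cvar(monthly), 4))  # mean of the two worst losses
```

    For a portfolio, the same function is applied to the weighted sum of asset returns; the paper then minimizes this quantity subject to an expected-return constraint.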

  20. Optimal blood glucose level control using dynamic programming based on minimal Bergman model

    Science.gov (United States)

    Rettian Anggita Sari, Maria; Hartono

    2018-03-01

    The purpose of this article is to simulate the glucose dynamics and insulin kinetics of a diabetic patient. The model used in this research is the nonlinear Bergman minimal model. Optimal control theory is then applied to determine the optimal dose of insulin in the treatment of diabetes mellitus, such that the glucose level stays in the normal range over a specific time horizon. The optimization problem is solved using dynamic programming. The result shows that dynamic programming is quite reliable in representing the interaction between glucose and insulin levels in a diabetes mellitus patient.
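    The plant model that the dynamic program would optimize over can be sketched with a forward-Euler simulation of the Bergman minimal model. The parameter values and the constant infusion input u are illustrative textbook-style assumptions, not fitted to any patient data:

```python
def simulate(g0=300.0, u=0.0, dt=0.1, t_end=60.0):
    """Euler integration of the Bergman minimal model:
    g = plasma glucose, x = remote insulin action, i = plasma insulin."""
    p1, p2, p3, n = 0.028, 0.025, 1.3e-5, 0.23   # illustrative parameters
    gb, ib = 80.0, 7.0       # basal glucose (mg/dL) and insulin (mU/L)
    g, x, i, t = g0, 0.0, ib, 0.0
    while t < t_end:
        dg = -p1 * (g - gb) - x * g       # glucose dynamics
        dx = -p2 * x + p3 * (i - ib)      # remote insulin compartment
        di = -n * (i - ib) + u            # plasma insulin with infusion u
        g, x, i = g + dt * dg, x + dt * dx, i + dt * di
        t += dt
    return g

# Glucose relaxes toward basal; insulin infusion speeds the decline.
print(round(simulate(), 1), round(simulate(u=2.0), 1))
```

    In the DP formulation, u would be the stage-wise control variable and the state (g, x, i) would be discretized, with a cost penalizing deviation of g from the normal range.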

  1. A new and fast image feature selection method for developing an optimal mammographic mass detection scheme.

    Science.gov (United States)

    Tan, Maxine; Pu, Jiantao; Zheng, Bin

    2014-08-01

    Selecting optimal features from a large image feature pool remains a major challenge in developing computer-aided detection (CAD) schemes of medical images. The objective of this study is to investigate a new approach to significantly improve the efficacy of image feature selection and classifier optimization in developing a CAD scheme of mammographic masses. An image dataset including 1600 regions of interest (ROIs), in which 800 are positive (depicting malignant masses) and 800 are negative (depicting CAD-generated false positive regions), was used in this study. After segmentation of each suspicious lesion by a multilayer topographic region growth algorithm, 271 features were computed in different feature categories including shape, texture, contrast, isodensity, spiculation, local topological features, as well as features related to the presence and location of fat and calcifications. Besides computing features from the original images, the authors also computed new texture features from the dilated lesion segments. In order to select optimal features from this initial feature pool and build a high-performing classifier, the authors examined and compared four feature selection methods to optimize an artificial neural network (ANN) based classifier, namely: (1) Phased Searching with NEAT in a Time-Scaled Framework, (2) a sequential floating forward selection (SFFS) method, (3) a genetic algorithm (GA), and (4) a sequential forward selection (SFS) method. Performances of the four approaches were assessed using a tenfold cross validation method. Among these four methods, SFFS has the highest efficacy: it takes 3%-5% of the computational time of the GA approach and yields the highest performance level, with an area under the receiver operating characteristic curve (AUC) = 0.864 ± 0.034. The results also demonstrated that, except when using the GA, including the new texture features computed from the dilated mass segments improved the AUC results of the optimized ANNs.
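    The simplest of the four methods, sequential forward selection, can be sketched with a greedy loop; the toy "coverage" scoring function and feature names below are invented stand-ins for a cross-validated classifier score:

```python
def sfs(features, score, k):
    """Greedy sequential forward selection: repeatedly add the feature
    that most improves the score of the growing subset."""
    selected = []
    while len(selected) < k:
        best_f = max((f for f in features if f not in selected),
                     key=lambda f: score(selected + [f]))
        selected.append(best_f)
    return selected

# Hypothetical features, each "covering" some informative aspects;
# the score rewards coverage, so redundant features rank low.
covers = {"contrast": {1, 2}, "texture": {2, 3}, "shape": {4},
          "spiculation": {1}}
score = lambda subset: len(set().union(*[covers[f] for f in subset]))
print(sfs(list(covers), score, 2))  # ['contrast', 'texture']
```

    SFFS extends this with a backward "floating" step that may drop previously added features, which is why it escapes the greedy traps SFS falls into.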

  2. Optimal foraging in marine ecosystem models: selectivity, profitability and switching

    DEFF Research Database (Denmark)

    Visser, Andre W.; Fiksen, Ø.

    2013-01-01

    ecological mechanics and evolutionary logic as a solution to diet selection in ecosystem models. When a predator can consume a range of prey items it has to choose which foraging mode to use, which prey to ignore and which ones to pursue, and animals are known to be particularly skilled in adapting...... to the preference functions commonly used in models today. Indeed, depending on prey class resolution, optimal foraging can yield feeding rates that are considerably different from the ‘switching functions’ often applied in marine ecosystem models. Dietary inclusion is dictated by two optimality choices: 1...... by letting predators maximize energy intake or more properly, some measure of fitness where predation risk and cost are also included. An optimal foraging or fitness maximizing approach will give marine ecosystem models a sound principle to determine trophic interactions...

  3. Optimization model for the design of distributed wastewater treatment networks

    Directory of Open Access Journals (Sweden)

    Ibrić Nidret

    2012-01-01

    Full Text Available In this paper we address the synthesis problem of distributed wastewater networks using a mathematical programming approach based on superstructure optimization. We present a generalized superstructure and optimization model for the design of distributed wastewater treatment networks. The superstructure includes splitters, treatment units and mixers, with all feasible interconnections including water recirculation. Based on the superstructure, the optimization model is presented. The optimization model is given as a nonlinear programming (NLP) problem where the objective function can be defined to minimize the total amount of wastewater treated in treatment operations or to minimize the total treatment costs. The NLP model is extended to a mixed integer nonlinear programming (MINLP) problem where binary variables are used for the selection of the wastewater treatment technologies. The bounds for all flowrates and concentrations in the wastewater network are specified as general equations. The proposed models are solved using the global optimization solvers BARON and LINDOGlobal. The application of the proposed models is illustrated on two wastewater network problems of different complexity. The first is formulated as an NLP and the second as an MINLP. For the second, parametric and structural optimization are performed simultaneously, where the optimal flowrates and concentrations as well as the optimal technologies for wastewater treatment are selected. Using the proposed model, both problems are solved to global optimality.

  4. An Optimization Model for Expired Drug Recycling Logistics Networks and Government Subsidy Policy Design Based on Tri-level Programming.

    Science.gov (United States)

    Huang, Hui; Li, Yuyu; Huang, Bo; Pi, Xing

    2015-07-09

    In order to recycle and dispose of all of the public's expired drugs, the government should design a subsidy policy to stimulate users to return their expired drugs, and drugstores should take on the responsibility of recycling expired drugs, in other words, serve as recycling stations. For this purpose it is necessary for the government to select the right recycling stations and treatment stations to optimize the expired drug recycling logistics network and minimize the total costs of recycling and disposal. This paper establishes a tri-level programming model to study how the government can optimize an expired drug recycling logistics network and the appropriate subsidy policies. Furthermore, a Hybrid Genetic Simulated Annealing Algorithm (HGSAA) is proposed to search for the optimal solution of the model. An experiment is discussed to illustrate the good quality of the recycling logistics network and government subsidies obtained by the HGSAA. The HGSAA is proven to have the ability to converge on the global optimal solution and to act as an effective algorithm for solving the optimization problem of expired drug recycling logistics networks and government subsidies.

  5. Development of Base Transceiver Station Selection Algorithm for ...

    African Journals Online (AJOL)

    TEMS) equipment was carried out on the existing BTSs, and a linear algorithm optimization program based on the spectral link efficiency of each BTS was developed, the output of this site optimization gives the selected number of base station sites ...

  6. Optimization of Artificial Neural Network using Evolutionary Programming for Prediction of Cascading Collapse Occurrence due to the Hidden Failure Effect

    Science.gov (United States)

    Idris, N. H.; Salim, N. A.; Othman, M. M.; Yasin, Z. M.

    2018-03-01

    This paper presents Evolutionary Programming (EP), which is proposed to optimize the training parameters of an Artificial Neural Network (ANN) for predicting cascading collapse occurrence due to the effect of protection system hidden failure. The data have been collected from simulation of the hidden failure probability model based on historical data. The training parameters of a multilayer feedforward network with backpropagation have been optimized with the objective of minimizing the Mean Square Error (MSE). The optimal training parameters, consisting of the momentum rate, the learning rate and the numbers of neurons in the first and second hidden layers, are selected by EP-ANN. The IEEE 14-bus system has been tested as a case study to validate the proposed technique. The results show reliable prediction performance, validated through the MSE and the Correlation Coefficient (R).
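As a rough illustration of the approach in this record, the EP loop below tunes two hypothetical training parameters (learning rate and momentum) by Gaussian mutation and elitist (mu + mu) selection; the smooth quadratic surrogate stands in for the MSE that would be obtained by actually retraining the ANN, and all constants are assumptions:

```python
import random

def evolve(objective, bounds, pop_size=20, generations=60, seed=7):
    """Minimal evolutionary programming loop: Gaussian mutation plus
    (mu + mu) truncation selection, minimizing `objective`."""
    rng = random.Random(seed)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(generations):
        offspring = []
        for parent in pop:
            # mutate each coordinate, clamped back into its bounds
            child = [min(max(x + rng.gauss(0, 0.1 * (hi - lo)), lo), hi)
                     for x, (lo, hi) in zip(parent, bounds)]
            offspring.append(child)
        pop = sorted(pop + offspring, key=objective)[:pop_size]
    return pop[0]

# Hypothetical smooth surrogate for the MSE surface over
# (learning rate, momentum); the true objective would retrain the ANN.
def surrogate_mse(p):
    lr, mom = p
    return (lr - 0.05) ** 2 + (mom - 0.9) ** 2

best = evolve(surrogate_mse, bounds=[(0.0, 1.0), (0.0, 1.0)])
```

Because selection is elitist, the best candidate is never lost, so the surrogate error decreases monotonically over generations.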

  7. Selection of the optimal Box-Cox transformation parameter for modelling and forecasting age-specific fertility

    OpenAIRE

    Shang, Han Lin

    2015-01-01

    The Box-Cox transformation can sometimes yield noticeable improvements in model simplicity, variance homogeneity and precision of estimation, such as in modelling and forecasting age-specific fertility. Despite its importance, there have been few studies focusing on the optimal selection of Box-Cox transformation parameters in demographic forecasting. A simple method is proposed for selecting the optimal Box-Cox transformation parameter, along with an algorithm based on an in-sample forecast ...
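The grid-based selection of the Box-Cox parameter described in this record can be sketched directly from the profile log-likelihood under the normality assumption; the data below are log-normal by construction (so a lambda near zero should be chosen), and both the grid and the data are illustrative assumptions:

```python
import math

def boxcox(y, lam):
    """Box-Cox transform; lam = 0 is the log transform by continuity."""
    if abs(lam) < 1e-12:
        return [math.log(v) for v in y]
    return [(v ** lam - 1.0) / lam for v in y]

def profile_loglik(y, lam):
    """Profile log-likelihood of lam assuming normality after transform."""
    n = len(y)
    z = boxcox(y, lam)
    mu = sum(z) / n
    var = sum((v - mu) ** 2 for v in z) / n
    # Jacobian term plus the Gaussian likelihood with plugged-in variance
    return (lam - 1.0) * sum(math.log(v) for v in y) - 0.5 * n * math.log(var)

def best_lambda(y, grid):
    return max(grid, key=lambda lam: profile_loglik(y, lam))

# Toy data that are log-normal by construction, so lam near 0 should win.
y = [math.exp(z) for z in (-1.2, -0.8, -0.4, 0.0, 0.4, 0.8, 1.2)]
lam = best_lambda(y, grid=[-1.0, -0.5, 0.0, 0.5, 1.0])
```

In practice (as the abstract suggests) the parameter would be chosen by an in-sample forecast criterion rather than likelihood alone, but the maximization scaffold is the same.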

  8. Nonlinear programming analysis and methods

    CERN Document Server

    Avriel, Mordecai

    2012-01-01

    This text provides an excellent bridge between principal theories and concepts and their practical implementation. Topics include convex programming, duality, generalized convexity, analysis of selected nonlinear programs, techniques for numerical solutions, and unconstrained optimization methods.

  9. Computer Program for Analysis, Design and Optimization of Propulsion, Dynamics, and Kinematics of Multistage Rockets

    Science.gov (United States)

    Lali, Mehdi

    2009-03-01

    A comprehensive computer program is designed in MATLAB to analyze, design and optimize the propulsion, dynamics, thermodynamics, and kinematics of any serial multi-staging rocket for a set of given data. The program is quite user-friendly. It comprises two main sections: "analysis and design" and "optimization." Each section has a GUI (Graphical User Interface) in which the rocket's data are entered by the user and by which the program is run. The first section analyzes the performance of a rocket previously devised by the user. Numerous plots and subplots are provided to display the performance of the rocket. The second section of the program finds the "optimum trajectory" via billions of iterations and computations, done through sophisticated algorithms using numerical methods and incremental integrations. Innovative techniques are applied to calculate the optimal parameters for the engine and to design the "optimal pitch program." This computer program is stand-alone in that it calculates almost every design parameter with regard to rocket propulsion and dynamics. It is meant to be used for actual launch operations as well as educational and research purposes.

  10. Artificial Intelligence Based Selection of Optimal Cutting Tool and Process Parameters for Effective Turning and Milling Operations

    Science.gov (United States)

    Saranya, Kunaparaju; John Rozario Jegaraj, J.; Ramesh Kumar, Katta; Venkateshwara Rao, Ghanta

    2016-06-01

    With the increasing automation of the modern manufacturing industry, human intervention in routine, repetitive and data-specific manufacturing activities is greatly reduced. In this paper, an attempt has been made to reduce human intervention in the selection of optimal cutting tools and process parameters for metal cutting applications, using Artificial Intelligence techniques. Generally, the selection of an appropriate cutting tool and parameters in metal cutting is carried out by an experienced technician or cutting tool expert based on his knowledge base or an extensive search of a huge cutting tool database. The proposed approach replaces the existing practice of physically searching for tools in databooks/tool catalogues with an intelligent knowledge-based selection system. This system employs artificial intelligence techniques such as artificial neural networks, fuzzy logic and genetic algorithms for decision making and optimization. This intelligence-based optimal tool selection strategy was developed and implemented using MathWorks MATLAB Version 7.11.0. The cutting tool database was obtained from the tool catalogues of different tool manufacturers. This paper discusses in detail the methodology and strategies employed for the selection of appropriate cutting tools and the optimization of process parameters based on multi-objective optimization criteria considering material removal rate, tool life and tool cost.

  11. Pareto Optimal Solutions for Network Defense Strategy Selection Simulator in Multi-Objective Reinforcement Learning

    Directory of Open Access Journals (Sweden)

    Yang Sun

    2018-01-01

    Full Text Available Using Pareto optimization in Multi-Objective Reinforcement Learning (MORL) leads to better learning results for network defense games. This is particularly useful for network security agents, who must often balance several goals when choosing what action to take in defense of a network. If the defender knows his preferred reward distribution, the advantages of Pareto optimization can be retained by using a scalarization algorithm prior to the implementation of the MORL. In this paper, we simulate a network defense scenario by creating a multi-objective zero-sum game and using Pareto optimization and MORL to determine optimal solutions and compare those solutions to different scalarization approaches. We build a Pareto Defense Strategy Selection Simulator (PDSSS) system for assisting network administrators on decision-making, specifically, on defense strategy selection, and the experiment results show that the Satisficing Trade-Off Method (STOM) scalarization approach performs better than linear scalarization or the GUESS method. The results of this paper can aid network security agents attempting to find an optimal defense policy for network security games.

  12. An efficient scenario-based stochastic programming framework for multi-objective optimal micro-grid operation

    International Nuclear Information System (INIS)

    Niknam, Taher; Azizipanah-Abarghooee, Rasoul; Narimani, Mohammad Rasoul

    2012-01-01

    Highlights: ► Proposes a stochastic model for optimal energy management. ► Consider uncertainties related to the forecasted values for load demand. ► Consider uncertainties of forecasted values of output power of wind and photovoltaic units. ► Consider uncertainties of forecasted values of market price. ► Present an improved multi-objective teaching–learning-based optimization. -- Abstract: This paper proposes a stochastic model for optimal energy management with the goal of cost and emission minimization. In this model, the uncertainties related to the forecasted values for load demand, available output power of wind and photovoltaic units and market price are modeled by a scenario-based stochastic programming. In the presented method, scenarios are generated by a roulette wheel mechanism based on probability distribution functions of the input random variables. Through this method, the inherent stochastic nature of the proposed problem is released and the problem is decomposed into a deterministic problem. An improved multi-objective teaching–learning-based optimization is implemented to yield the best expected Pareto optimal front. In the proposed stochastic optimization method, a novel self adaptive probabilistic modification strategy is offered to improve the performance of the presented algorithm. Also, a set of non-dominated solutions are stored in a repository during the simulation process. Meanwhile, the size of the repository is controlled by usage of a fuzzy-based clustering technique. The best expected compromise solution stored in the repository is selected via the niching mechanism in a way that solutions are encouraged to seek the lesser explored regions. The proposed framework is applied in a typical grid-connected micro grid in order to verify its efficiency and feasibility.
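The scenario-generation step described in this record (roulette-wheel sampling from the probability distributions of the uncertain inputs) can be sketched with the cumulative-probability rule; the discretized load-error outcomes and probabilities below are invented for illustration:

```python
import bisect
import itertools
import random

def roulette(outcomes, probs, n, rng):
    """Draw n scenarios from a discrete distribution using the
    cumulative-probability (roulette wheel) rule."""
    cum = list(itertools.accumulate(probs))
    return [outcomes[bisect.bisect_left(cum, rng.random() * cum[-1])]
            for _ in range(n)]

rng = random.Random(42)
# Hypothetical discretized per-unit forecast errors for load demand:
load_err = [-0.10, 0.0, 0.10]
load_p = [0.25, 0.50, 0.25]
scenarios = roulette(load_err, load_p, n=1000, rng=rng)
```

Each sampled scenario then fixes the random inputs, which is how the stochastic problem is decomposed into deterministic subproblems in the paper's framework.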

  13. Using linear programming to analyze and optimize stochastic flow lines

    DEFF Research Database (Denmark)

    Helber, Stefan; Schimmelpfeng, Katja; Stolletz, Raik

    2011-01-01

    This paper presents a linear programming approach to analyze and optimize flow lines with limited buffer capacities and stochastic processing times. The basic idea is to solve a huge but simple linear program that models an entire simulation run of a multi-stage production process in discrete time...... programming and hence allows us to solve buffer allocation problems. We show under which conditions our method works well by comparing its results to exact values for two-machine models and approximate simulation results for longer lines....

  14. An optimal maintenance policy for machine replacement problem using dynamic programming

    Directory of Open Access Journals (Sweden)

    Mohsen Sadegh Amalnik

    2017-06-01

    Full Text Available In this article, we present an acceptance sampling plan for the machine replacement problem based on a backward dynamic programming model. Discounted dynamic programming is used to solve a two-state machine replacement problem. We plan to design a model for maintenance by considering the quality of the item produced. The purpose of the proposed model is to determine the optimal threshold policy for maintenance over a finite time horizon. We create a decision tree based on sequential sampling with the actions renew, repair and do nothing, and seek an optimal threshold for deciding whether to renew, repair or continue production, in order to minimize the expected cost. Results show that the optimal policy is sensitive to the data, i.e., to the probability of defective machines and the parameters defined in the model. This can be clearly demonstrated by a sensitivity analysis technique.
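The backward recursion in this record can be illustrated with a two-state (good/worn) machine and three actions; the transition probability, action costs and quality losses below are assumptions for the sketch, not the paper's calibrated values:

```python
# States: 0 = good, 1 = worn; all numbers below are illustrative.
P_WEAR = 0.3                            # chance good -> worn if we do nothing
ACTION_COST = {"nothing": 0.0, "repair": 40.0, "renew": 100.0}
DEFECT_LOSS = {0: 5.0, 1: 60.0}         # expected cost of defective output

def solve(horizon, discount=0.95):
    """Backward dynamic programming for the two-state replacement problem."""
    value = {0: 0.0, 1: 0.0}            # terminal values
    policy = []
    for _ in range(horizon):
        new_value, decision = {}, {}
        for s in (0, 1):
            stay = (value[1] if s == 1 else
                    (1 - P_WEAR) * value[0] + P_WEAR * value[1])
            options = {
                "nothing": DEFECT_LOSS[s] + discount * stay,
                # repair and renew both restore the good state:
                "repair": ACTION_COST["repair"] + DEFECT_LOSS[0]
                          + discount * value[0],
                "renew": ACTION_COST["renew"] + DEFECT_LOSS[0]
                         + discount * value[0],
            }
            decision[s] = min(options, key=options.get)
            new_value[s] = options[decision[s]]
        value, policy = new_value, policy + [decision]
    return value, policy

value, policy = solve(horizon=20)
```

With these numbers the threshold structure emerges directly: do nothing while the machine is good, repair as soon as it is worn, and never pay the renewal premium.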

  15. Use of social media by residency program directors for resident selection.

    Science.gov (United States)

    Cain, Jeff; Scott, Doneka R; Smith, Kelly

    2010-10-01

    Pharmacy residency program directors' attitudes and opinions regarding the use of social media in residency recruitment and selection were studied. A 24-item questionnaire was developed, pilot tested, revised, and sent to 996 residency program directors via SurveyMonkey.com. Demographic, social media usage, and opinions on social media data were collected and analyzed. A total of 454 residency program directors completed the study (response rate, 46.4%). The majority of respondents were women (58.8%), were members of Generation X (75.4%), and worked in a hospital or health system (80%). Most respondents (73%) rated themselves as either nonusers or novice users of social media. Twenty percent indicated that they had viewed a pharmacy residency applicant's social media information. More than half (52%) had encountered e-professionalism issues, including questionable photos and posts revealing unprofessional attitudes, and 89% strongly agreed or agreed that information voluntarily published online was fair game for judgments on character, attitudes, and professionalism. Only 4% of respondents had reviewed applicants' profiles for residency selection decisions. Of those respondents, 52% indicated that the content had no effect on resident selection. Over half of residency program directors were unsure whether they will use social media information for future residency selection decisions. Residency program directors from different generations had different views regarding social media information and its use in residency applicant selections. Residency program directors anticipated using social media information to aid in future decisions for resident selection and hiring.

  16. Optimal design and selection of magneto-rheological brake types based on braking torque and mass

    International Nuclear Information System (INIS)

    Nguyen, Q H; Lang, V T; Choi, S B

    2015-01-01

    In developing magnetorheological brakes (MRBs), it is well known that the braking torque and the mass of the MRBs are important factors that should be considered in the product’s design. This research focuses on the optimal design of different types of MRBs, from which we identify an optimal selection of MRB types, considering braking torque and mass. In the optimization, common types of MRBs such as disc-type, drum-type, hybrid-type, and T-shaped types are considered. The optimization problem is to find an optimal MRB structure that can produce the required braking torque while minimizing its mass. After a brief description of the configuration of the MRBs, the MRBs’ braking torque is derived based on the Herschel-Bulkley rheological model of the magnetorheological fluid. Then, the optimal designs of the MRBs are analyzed. The optimization objective is to minimize the mass of the brake while the braking torque is constrained to be greater than a required value. In addition, the power consumption of the MRBs is also considered as a reference parameter in the optimization. A finite element analysis integrated with an optimization tool is used to obtain optimal solutions for the MRBs. Optimal solutions of MRBs with different required braking torque values are obtained based on the proposed optimization procedure. From the results, we discuss the optimal selection of MRB types, considering braking torque and mass. (technical note)

  17. APPLYING ROBUST RANKING METHOD IN TWO PHASE FUZZY OPTIMIZATION LINEAR PROGRAMMING PROBLEMS (FOLPP

    Directory of Open Access Journals (Sweden)

    Monalisha Pattnaik

    2014-12-01

    Full Text Available Background: This paper explores solutions to fuzzy optimization linear programming problems (FOLPP) in which some parameters are fuzzy numbers. In practice, there are many problems in which all decision parameters are fuzzy numbers, and such problems are usually solved by either probabilistic programming or multi-objective programming methods. Methods: In this paper, using the concept of comparison of fuzzy numbers, a very effective method is introduced for solving these problems. This paper extends the linear programming problem to a fuzzy environment. Under the problem assumptions, the optimal solution can still be theoretically obtained using the two-phase simplex method in a fuzzy environment. The fuzzy decision variables can be initially generated and then solved and improved sequentially using a fuzzy decision approach, by introducing a robust ranking technique. Results and conclusions: The model is illustrated with an application, and a post-optimal analysis approach is presented. The proposed procedure was programmed with MATLAB (R2009a) software to plot the four-dimensional slice diagram for the application. Finally, a numerical example is presented to illustrate the effectiveness of the theoretical results, and to gain additional managerial insights.
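For a triangular fuzzy number (a, b, c), the robust ranking index used in this line of work (the integral over alpha in [0, 1] of the midpoint of the alpha-cut) reduces to the closed form (a + 2b + c) / 4, so fuzzy coefficients can be compared by a single crisp value. A minimal sketch, with invented fuzzy coefficients:

```python
def robust_rank(tri):
    """Robust ranking index of a triangular fuzzy number (a, b, c):
    integral over alpha of (lower_alpha + upper_alpha) / 2,
    which reduces to (a + 2b + c) / 4 in closed form."""
    a, b, c = tri
    return (a + 2.0 * b + c) / 4.0

# Hypothetical fuzzy objective coefficients (triangular fuzzy numbers):
c1 = (1.0, 2.0, 3.0)   # "about 2"
c2 = (2.0, 4.0, 5.0)   # "about 4", skewed left
ranks = [robust_rank(c1), robust_rank(c2)]
```

Replacing each fuzzy coefficient by its rank turns the FOLPP into an ordinary linear program that the two-phase simplex method can handle.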

  18. Optimal Design of Measurement Programs for the Parameter Identification of Dynamic Systems

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Sørensen, John Dalsgaard; Brincker, Rune

    1993-01-01

    The design of a measurement program devoted to parameter identification of structural dynamic systems is considered. The design problem is formulated as an optimization problem to minimize the total expected cost, that is, the cost of failure and the cost of the measurement program. All...... the calculations are based on a priori knowledge and engineering judgement. One of the contributions of the approach is that the optimal number of sensors can be estimated. This is shown in a numerical example where the proposed approach is demonstrated. The example is concerned with design of a measurement...

  19. Optimal Design of Measurement Programs for the Parameter Identification of Dynamic Systems

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Sørensen, John Dalsgaard; Brincker, Rune

    1991-01-01

    The design of a measurement program devoted to parameter identification of structural dynamic systems is considered. The design problem is formulated as an optimization problem to minimize the total expected cost, i.e. the cost of failure and the cost of the measurement program. All...... the calculations are based on a priori knowledge and engineering judgement. One of the contributions of the approach is that the optimal number of sensors can be estimated. This is shown in a numerical example where the proposed approach is demonstrated. The example is concerned with design of a measurement...

  20. Application of multi-objective optimization based on genetic algorithm for sustainable strategic supplier selection under fuzzy environment

    Energy Technology Data Exchange (ETDEWEB)

    Hashim, M.; Nazam, M.; Yao, L.; Baig, S.A.; Abrar, M.; Zia-ur-Rehman, M.

    2017-07-01

    The incorporation of environmental objectives into conventional supplier selection practices is crucial for corporations seeking to promote green supply chain management (GSCM). Challenges and risks associated with green supplier selection have been broadly recognized by procurement and supplier management professionals. This paper aims to solve a Tetra “S” (SSSS) problem based on a fuzzy multi-objective optimization with genetic algorithm in a holistic supply chain environment. In this empirical study, a mathematical model with fuzzy coefficients is considered for the sustainable strategic supplier selection (SSSS) problem and a corresponding model is developed to tackle this problem. Design/methodology/approach: Sustainable strategic supplier selection (SSSS) decisions are typically multi-objective in nature and are an important part of green production and supply chain management for many firms. The proposed uncertain model is transformed into a deterministic model by applying the expected value measurement (EVM) and a genetic algorithm with a weighted sum approach for solving the multi-objective problem. This research focuses on a multi-objective optimization model for minimizing lean cost, maximizing sustainable service and greener product quality level. Finally, a mathematical case from the textile sector is presented to exemplify the effectiveness of the proposed model with a sensitivity analysis. Findings: This study makes a certain contribution by introducing the Tetra ‘S’ concept in both the theoretical and practical research related to multi-objective optimization as well as in the study of sustainable strategic supplier selection (SSSS) under an uncertain environment. Our results suggest that decision makers tend to select the strategic supplier first and then enhance sustainability. Research limitations/implications: Although the fuzzy expected value model (EVM) with fuzzy coefficients constructed in the present research should be helpful for solving real world

  1. Application of multi-objective optimization based on genetic algorithm for sustainable strategic supplier selection under fuzzy environment

    Directory of Open Access Journals (Sweden)

    Muhammad Hashim

    2017-05-01

    Full Text Available Purpose: The incorporation of environmental objectives into conventional supplier selection practices is crucial for corporations seeking to promote green supply chain management (GSCM). Challenges and risks associated with green supplier selection have been broadly recognized by procurement and supplier management professionals. This paper aims to solve a Tetra “S” (SSSS) problem based on a fuzzy multi-objective optimization with genetic algorithm in a holistic supply chain environment. In this empirical study, a mathematical model with fuzzy coefficients is considered for the sustainable strategic supplier selection (SSSS) problem and a corresponding model is developed to tackle this problem. Design/methodology/approach: Sustainable strategic supplier selection (SSSS) decisions are typically multi-objective in nature and are an important part of green production and supply chain management for many firms. The proposed uncertain model is transformed into a deterministic model by applying the expected value measurement (EVM) and a genetic algorithm with a weighted sum approach for solving the multi-objective problem. This research focuses on a multi-objective optimization model for minimizing lean cost, maximizing sustainable service and greener product quality level. Finally, a mathematical case from the textile sector is presented to exemplify the effectiveness of the proposed model with a sensitivity analysis. Findings: This study makes a certain contribution by introducing the Tetra ‘S’ concept in both the theoretical and practical research related to multi-objective optimization as well as in the study of sustainable strategic supplier selection (SSSS) under an uncertain environment. Our results suggest that decision makers tend to select the strategic supplier first and then enhance sustainability. Research limitations/implications: Although the fuzzy expected value model (EVM) with fuzzy coefficients constructed in present research should be helpful for

  2. Application of multi-objective optimization based on genetic algorithm for sustainable strategic supplier selection under fuzzy environment

    International Nuclear Information System (INIS)

    Hashim, M.; Nazam, M.; Yao, L.; Baig, S.A.; Abrar, M.; Zia-ur-Rehman, M.

    2017-01-01

    The incorporation of environmental objectives into conventional supplier selection practices is crucial for corporations seeking to promote green supply chain management (GSCM). Challenges and risks associated with green supplier selection have been broadly recognized by procurement and supplier management professionals. This paper aims to solve a Tetra “S” (SSSS) problem based on a fuzzy multi-objective optimization with genetic algorithm in a holistic supply chain environment. In this empirical study, a mathematical model with fuzzy coefficients is considered for the sustainable strategic supplier selection (SSSS) problem and a corresponding model is developed to tackle this problem. Design/methodology/approach: Sustainable strategic supplier selection (SSSS) decisions are typically multi-objective in nature and are an important part of green production and supply chain management for many firms. The proposed uncertain model is transformed into a deterministic model by applying the expected value measurement (EVM) and a genetic algorithm with a weighted sum approach for solving the multi-objective problem. This research focuses on a multi-objective optimization model for minimizing lean cost, maximizing sustainable service and greener product quality level. Finally, a mathematical case from the textile sector is presented to exemplify the effectiveness of the proposed model with a sensitivity analysis. Findings: This study makes a certain contribution by introducing the Tetra ‘S’ concept in both the theoretical and practical research related to multi-objective optimization as well as in the study of sustainable strategic supplier selection (SSSS) under an uncertain environment. Our results suggest that decision makers tend to select the strategic supplier first and then enhance sustainability. Research limitations/implications: Although the fuzzy expected value model (EVM) with fuzzy coefficients constructed in the present research should be helpful for solving real world

  3. Application Of Database Program in selecting Sorghum (Sorghum bicolor L) Mutant Lines

    International Nuclear Information System (INIS)

    H, Soeranto

    2000-01-01

    Computer database software, namely MSTAT and Paradox, has been applied in the field of mutation breeding, especially in the process of selecting plant mutant lines of sorghum. In MSTAT, selecting mutant lines can be done by activating the SELECTION function and then entering mathematical formulas for the selection criterion. Another alternative is defining the desired selection intensity on the analysis results of the subprogram SORT. Including the selected plant mutant lines in the BRSERIES program makes their progenies easier to trace in subsequent generations. In Paradox, an application program for selecting mutant lines can be made by combining the facilities of tables, forms and reports. Selecting mutant lines with a defined selection criterion can easily be done through filtering data. As a relational database, Paradox ensures that the application program for selecting mutant lines and tracking progenies can be made easier, more efficient and interactive.

  4. Power Grid Construction Project Portfolio Optimization Based on Bi-level programming model

    Science.gov (United States)

    Zhao, Erdong; Li, Shangqi

    2017-08-01

    As the main bodies of power grid operation, county-level power supply enterprises undertake an important mission to guarantee the security of power grid operation and to safeguard the order of public electricity use. The optimization of grid construction projects has been a key issue for the power supply capacity and service level of grid enterprises. According to the actual situation of power grid construction project optimization in county-level power enterprises, and on the basis of qualitative analysis of the projects, this paper builds a bi-level programming model based on quantitative analysis. The upper layer of the model is the target constraint of the optimal portfolio; the lower layer is the enterprise's financial restrictions on the size of the project portfolio. Finally, a real example illustrates the operation and the optimization results of the model. Through qualitative and quantitative analysis, the bi-level programming model improves the accuracy and standardization of power grid enterprises' project optimization.

  5. SELECTION OF CHEMICAL TREATMENT PROGRAM FOR OILY WASTEWATER

    Directory of Open Access Journals (Sweden)

    Miguel Díaz

    2017-04-01

    Full Text Available When selecting a chemical treatment program for wastewater, it is crucial to understand how individual colloids interact in order to achieve effective coagulation and flocculation. The coagulation process requires rapid mixing, while the flocculation process needs slow mixing. The behavior of colloids in water is strongly influenced by their electrokinetic charge; each colloidal particle carries its own charge, which is usually negative in nature. Polymers, which are long chains of high molecular weight and high charge, begin to form longer chains when added to water, allowing the removal of numerous particles of suspended matter. A study of physico-chemical treatment by addition of coagulant and flocculant was carried out in order to determine a chemical program for oily wastewater coming from the gravity separation process in a crude oil refinery. The tests were carried out in jar test equipment, where the commercial products aluminum polychloride (PAC), aluminum sulfate and Sintec D50 were evaluated with five different flocculants. The selected chemical program was evaluated with fluids at three temperatures to determine its sensitivity to this parameter and to the mixing energy in coagulation and flocculation. The chemical program and operational characteristics for physico-chemical treatment with PAC were determined, obtaining removal of more than 93% of suspended matter and 96% of total hydrocarbons for the selected coagulant/flocculant combination.

  6. Near-Optimal Tracking Control of Mobile Robots Via Receding-Horizon Dual Heuristic Programming.

    Science.gov (United States)

    Lian, Chuanqiang; Xu, Xin; Chen, Hong; He, Haibo

    2016-11-01

    Trajectory tracking control of wheeled mobile robots (WMRs) has been an important research topic in control theory and robotics. Although various tracking control methods with guaranteed stability have been developed for WMRs, it is still difficult to design optimal or near-optimal tracking controllers under uncertainties and disturbances. In this paper, a near-optimal tracking control method is presented for WMRs based on receding-horizon dual heuristic programming (RHDHP). In the proposed method, a backstepping kinematic controller is designed to generate desired velocity profiles, and the receding-horizon strategy is used to decompose the infinite-horizon optimal control problem into a series of finite-horizon optimal control problems. In each horizon, a closed-loop tracking control policy is successively updated using a class of approximate dynamic programming algorithms called finite-horizon dual heuristic programming (DHP). The convergence property of the proposed method is analyzed, and it is shown via the Lyapunov approach that the tracking control system based on RHDHP is asymptotically stable. Simulation results on three tracking control problems demonstrate that the proposed method achieves improved control performance compared with conventional model predictive control (MPC) and DHP. It is also shown that the proposed method has a lower computational burden than conventional MPC, which is very beneficial for real-time tracking control.
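    The receding-horizon decomposition itself can be illustrated on a scalar linear-quadratic problem (a stand-in for the WMR dynamics, not the paper's model): at every step a finite-horizon problem is solved by backward recursion and only the first control is applied:

```python
def riccati_gains(a, b, q, r, horizon):
    """Backward Riccati recursion for a scalar finite-horizon LQR problem
    x' = a*x + b*u with stage cost q*x^2 + r*u^2."""
    p = q                                  # terminal cost weight
    gains = []
    for _ in range(horizon):
        k = a * b * p / (r + b * b * p)    # stage feedback gain
        p = q + a * a * p - a * b * p * k  # cost-to-go update
        gains.append(k)
    return gains[::-1]                     # ordered from stage 0 to horizon-1

def receding_horizon(a, b, q, r, horizon, x0, steps):
    """Re-solve the finite-horizon problem at every step and apply only
    the first control, as in a receding-horizon scheme."""
    x, traj = x0, [x0]
    for _ in range(steps):
        k0 = riccati_gains(a, b, q, r, horizon)[0]
        x = a * x - b * k0 * x             # u = -k0 * x
        traj.append(x)
    return traj

# Open-loop unstable system (a = 1.1) driven toward the origin.
traj = receding_horizon(a=1.1, b=1.0, q=1.0, r=1.0, horizon=10, x0=1.0, steps=30)
```

RHDHP replaces the exact backward recursion with learned approximations, but the decompose-then-apply-first-control structure is the same.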

  7. Cancer Feature Selection and Classification Using a Binary Quantum-Behaved Particle Swarm Optimization and Support Vector Machine

    Directory of Open Access Journals (Sweden)

    Maolong Xi

    2016-01-01

    Full Text Available This paper focuses on feature gene selection for cancer classification, which employs an optimization algorithm to select a subset of the genes. We propose a binary quantum-behaved particle swarm optimization (BQPSO) for cancer feature gene selection, coupling a support vector machine (SVM) for cancer classification. First, the proposed BQPSO algorithm is described; it is a discretized version of the original QPSO for binary 0-1 optimization problems. Then, we present the principle and procedure for cancer feature gene selection and cancer classification based on BQPSO and SVM with leave-one-out cross validation (LOOCV). Finally, the BQPSO coupling SVM (BQPSO/SVM), binary PSO coupling SVM (BPSO/SVM), and genetic algorithm coupling SVM (GA/SVM) are tested for feature gene selection and cancer classification on five microarray data sets, namely, Leukemia, Prostate, Colon, Lung, and Lymphoma. The experimental results show that BQPSO/SVM has significant advantages in accuracy, robustness, and the number of feature genes selected compared with the other two algorithms.
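    A minimal sketch of the binary-PSO half of such a method, with a synthetic fitness function standing in for the SVM/LOOCV accuracy (the "informative" gene indices are invented for illustration):

```python
import math
import random

random.seed(0)

def bpso(fitness, n_bits, n_particles=20, iters=60):
    """Minimal binary PSO: real-valued velocities, bit positions resampled
    through a sigmoid transfer function (Kennedy & Eberhart's binary PSO)."""
    w, c1, c2 = 0.7, 1.5, 1.5
    pos = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(n_particles)]
    vel = [[0.0] * n_bits for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [fitness(p) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_bits):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                prob = 1.0 / (1.0 + math.exp(-vel[i][d]))
                pos[i][d] = 1 if random.random() < prob else 0
            f = fitness(pos[i])
            if f > pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f > gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# Synthetic stand-in for SVM/LOOCV accuracy: reward picking the (invented)
# informative genes, lightly penalise selecting uninformative ones.
INFORMATIVE = {1, 3, 5, 8}
def fitness(bits):
    chosen = {i for i, b in enumerate(bits) if b}
    return len(chosen & INFORMATIVE) - 0.2 * len(chosen - INFORMATIVE)

best, score = bpso(fitness, n_bits=12)
```

BQPSO differs in how velocities/positions are updated (quantum-behaved sampling), but the bit-vector encoding of a gene subset is the same.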

  8. Cancer Feature Selection and Classification Using a Binary Quantum-Behaved Particle Swarm Optimization and Support Vector Machine

    Science.gov (United States)

    Sun, Jun; Liu, Li; Fan, Fangyun; Wu, Xiaojun

    2016-01-01

    This paper focuses on feature gene selection for cancer classification, which employs an optimization algorithm to select a subset of the genes. We propose a binary quantum-behaved particle swarm optimization (BQPSO) for cancer feature gene selection, coupling support vector machine (SVM) for cancer classification. First, the proposed BQPSO algorithm is described, which is a discretized version of the original QPSO for binary 0-1 optimization problems. Then, we present the principle and procedure for cancer feature gene selection and cancer classification based on BQPSO and SVM with leave-one-out cross validation (LOOCV). Finally, the BQPSO coupling SVM (BQPSO/SVM), binary PSO coupling SVM (BPSO/SVM), and genetic algorithm coupling SVM (GA/SVM) are tested for feature gene selection and cancer classification on five microarray data sets, namely, Leukemia, Prostate, Colon, Lung, and Lymphoma. The experimental results show that BQPSO/SVM has significant advantages in accuracy, robustness, and the number of feature genes selected compared with the other two algorithms. PMID:27642363

  9. Exploiting variability for energy optimization of parallel programs

    Energy Technology Data Exchange (ETDEWEB)

    Lavrijsen, Wim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Iancu, Costin [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); de Jong, Wibe [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Chen, Xin [Georgia Inst. of Technology, Atlanta, GA (United States); Schwan, Karsten [Georgia Inst. of Technology, Atlanta, GA (United States)

    2016-04-18

    In this paper we present optimizations that use DVFS mechanisms to reduce the total energy usage in scientific applications. Our main insight is that noise is intrinsic to large-scale parallel executions and appears whenever shared resources are contended. The presence of noise allows us to identify and manipulate any program regions amenable to DVFS. Compared to previous energy optimizations that make per-core decisions using predictions of the running time, our scheme uses a qualitative approach to recognize the signature of executions amenable to DVFS. By recognizing the "shape of variability" we can optimize codes with highly dynamic behavior, which pose challenges to all existing DVFS techniques. We validate our approach using offline and online analyses for one-sided and two-sided communication paradigms. We have applied our methods to NWChem, and we show best-case improvements in energy use of 12% at no loss in performance when using online optimizations running on 720 Haswell cores with one-sided communication. With NWChem on MPI two-sided and offline analysis capturing the initialization, we find energy savings of up to 20%, with less than 1% performance cost.

  10. Continuous-Time Mean-Variance Portfolio Selection under the CEV Process

    OpenAIRE

    Ma, Hui-qiang

    2014-01-01

    We consider a continuous-time mean-variance portfolio selection model when stock price follows the constant elasticity of variance (CEV) process. The aim of this paper is to derive an optimal portfolio strategy and the efficient frontier. The mean-variance portfolio selection problem is formulated as a linearly constrained convex program problem. By employing the Lagrange multiplier method and stochastic optimal control theory, we obtain the optimal portfolio strategy and mean-variance effici...

  11. Selective Redundancy Removal: A Framework for Data Hiding

    Directory of Open Access Journals (Sweden)

    Ugo Fiore

    2010-02-01

    Full Text Available Data hiding techniques have so far concentrated on adding or modifying irrelevant information in order to hide a message. However, files in widespread use, such as HTML documents, usually exhibit high redundancy levels, caused by code-generation programs. Such redundancy may be removed by means of optimization software. Redundancy removal, if applied selectively, enables information hiding. This work introduces Selective Redundancy Removal (SRR) as a framework for hiding data. An example application of the framework is given in terms of hiding information in HTML documents. Non-uniformity across documents may raise alarms. Nevertheless, selective application of optimization techniques might be attributed to the legitimate use of optimization software that does not support all optimization methods, or that is configured not to use all of them.
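    The idea can be sketched in a few lines with a toy scheme (not from the paper): the two equivalent spellings of an HTML line-break tag serve as redundancy sites, and removing or keeping the redundant space encodes one bit per site:

```python
import re

def embed(template, bits):
    """Each '<br/>' in the template is a redundancy site: rewriting it in the
    longer form '<br />' encodes a 0; keeping the compact form encodes a 1."""
    parts = template.split("<br/>")
    out = [parts[0]]
    for i, part in enumerate(parts[1:]):
        bit = bits[i] if i < len(bits) else 1
        out.append("<br/>" if bit else "<br />")
        out.append(part)
    return "".join(out)

def extract(html):
    """Recover the bits by checking which spelling each site uses."""
    return [1 if m.group() == "<br/>" else 0 for m in re.finditer(r"<br ?/>", html)]

page = "line1<br/>line2<br/>line3<br/>line4"
stego = embed(page, [1, 0, 1])
```

Both spellings render identically, which is exactly why selectively "optimizing" one of them can carry hidden information.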

  12. Performance comparison of genetic algorithms and particle swarm optimization for model integer programming bus timetabling problem

    Science.gov (United States)

    Wihartiko, F. D.; Wijayanti, H.; Virgantari, F.

    2018-03-01

    Genetic Algorithm (GA) is a common algorithm used to solve optimization problems with an artificial intelligence approach, as is the Particle Swarm Optimization (PSO) algorithm. The two algorithms have different advantages and disadvantages when applied to the optimization of the Model Integer Programming for Bus Timetabling Problem (MIPBTP), in which the optimal number of trips must be found subject to various constraints. The comparison results show that the PSO algorithm is superior in terms of complexity, accuracy, iteration count, and program simplicity in finding the optimal solution.

  13. Natural selection and optimality

    International Nuclear Information System (INIS)

    Torres, J.L.

    1989-01-01

    It is assumed that Darwin's principle translates into optimal regimes of operation along metabolic pathways in an ecological system. Fitness is then defined in terms of the distance of a given individual's thermodynamic parameters from their optimal values. The method is illustrated by testing maximum power as a criterion of merit satisfied in ATP synthesis. (author). 26 refs, 2 figs

  14. Log-Optimal Portfolio Selection Using the Blackwell Approachability Theorem

    OpenAIRE

    V'yugin, Vladimir

    2014-01-01

    We present a method for constructing the log-optimal portfolio using the well-calibrated forecasts of market values. Dawid's notion of calibration and the Blackwell approachability theorem are used for computing well-calibrated forecasts. We select a portfolio using this "artificial" probability distribution of market values. Our portfolio performs asymptotically at least as well as any stationary portfolio that redistributes the investment at each round using a continuous function of side in...

  15. Optimal Placement of Phasor Measurement Units (PMU) with Integer Programming

    Directory of Open Access Journals (Sweden)

    Yunan Helmy Amrulloh

    2013-09-01

    Full Text Available A Phasor Measurement Unit (PMU) is a device capable of providing real-time measurements of voltage and current phasors. PMUs can be used for monitoring, protection, and control of electric power systems. This final project discusses the optimal placement of PMUs based on network topology so that the power system is observable. Optimal PMU placement is formulated as a Binary Integer Programming (BIP) problem whose variables take values in (0, 1), indicating the locations where PMUs must be installed. In this final project, BIP is applied to solve the optimal PMU placement problem for the 500 kV Java-Bali power system, and the approach is then extended with the concept of incomplete observability. The simulation results show that applying BIP to the system with incomplete observability requires fewer PMUs than solving the system without the incomplete observability concept.
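    The BIP formulation can be illustrated on a small hypothetical network: minimize the number of PMUs subject to every bus being observed by a PMU at itself or at a neighbour, here solved by brute force instead of an integer-programming solver:

```python
from itertools import combinations

# Hypothetical 7-bus network: bus -> neighbouring buses (invented topology).
ADJ = {1: [2], 2: [1, 3, 6, 7], 3: [2, 4, 6], 4: [3, 5, 7],
       5: [4], 6: [2, 3], 7: [2, 4]}

def observable(pmus):
    """Full observability: every bus hosts a PMU or neighbours one."""
    seen = set()
    for b in pmus:
        seen.add(b)
        seen.update(ADJ[b])
    return seen == set(ADJ)

def min_pmu_placement():
    """Smallest 0/1 placement satisfying the coverage constraints, found by
    brute force (a BIP solver does the same job at realistic scale)."""
    buses = sorted(ADJ)
    for r in range(1, len(buses) + 1):
        for combo in combinations(buses, r):
            if observable(combo):
                return combo

placement = min_pmu_placement()
```

Incomplete observability relaxes the coverage constraints (buses may be observable at depth > 1), which is why it admits fewer PMUs.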

  16. Fuzzy preference based interactive fuzzy physical programming and its application in multi-objective optimization

    International Nuclear Information System (INIS)

    Zhang, Xu; Huang, Hong Zhong; Yu, Lanfeng

    2006-01-01

    Interactive Fuzzy Physical Programming (IFPP), developed in this paper, is a new efficient multi-objective optimization method which retains the advantages of physical programming while considering the fuzziness of the designer's preferences. The fuzzy preference function is introduced based on the model of linear physical programming and is used to guide the search for improved solutions by interactive decision analysis. The example of the multi-objective optimization design of the spindle of an internal grinder demonstrates that the improved preference conforms to the subjective desires of the designer.

  17. ADAM: A computer program to simulate selective-breeding schemes for animals

    DEFF Research Database (Denmark)

    Pedersen, L D; Sørensen, A C; Henryon, M

    2009-01-01

    ADAM is a computer program that models selective breeding schemes for animals using stochastic simulation. The program simulates a population of animals and traces the genetic changes in the population under different selective breeding scenarios. It caters to different population structures, genetic models, selection strategies, and mating designs. ADAM can be used to evaluate breeding schemes and generate genetic data to test statistical tools...

  18. How to Use Linear Programming for Information System Performances Optimization

    Directory of Open Access Journals (Sweden)

    Hell Marko

    2014-09-01

    Full Text Available Background: Organisations nowadays operate in a very dynamic environment, and therefore their ability to continuously adjust the strategic plan to new conditions is a must for achieving their strategic objectives. BSC is a well-known methodology for measuring performance that enables organizations to learn how well they are doing. In this paper, “BSC for IS” will be proposed in order to measure the IS impact on the achievement of organizations’ business goals. Objectives: The objective of this paper is to present an original procedure used to enhance the BSC methodology in planning the optimal target values of IS performance in order to maximize the organization's effectiveness. Methods/Approach: The method used in this paper is a quantitative one - linear programming. In the case study, linear programming is used for optimizing the organization’s strategic performance. Results: Results are shown for the case study of a national park. An optimal performance value has been calculated for the strategic objective, as well as for each derived objective (DO). Results are calculated in Excel, using the Solver add-in. Conclusions: The presentation of the methodology through the case study of a national park shows that, though it requires a high level of formalisation, it provides a very transparent performance calculation.
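    The kind of linear program involved can be sketched with invented numbers: two derived-objective performance levels x and y, linear resource constraints, and a weighted strategic objective, solved here by enumerating constraint-line intersections rather than with Excel's Solver:

```python
from itertools import combinations

# x, y = performance levels of two derived objectives (numbers invented).
# Each constraint reads a*x + b*y <= c, including non-negativity.
CONS = [(1, 1, 10),    # shared capacity: x + y <= 10
        (1, 0, 6),     # x <= 6
        (0, 1, 7),     # y <= 7
        (-1, 0, 0),    # x >= 0
        (0, -1, 0)]    # y >= 0

def lp_max(weights, cons):
    """Enumerate pairwise constraint-line intersections (the candidate LP
    vertices) and keep the feasible point with the best objective value."""
    best, best_z = None, float("-inf")
    for (a1, b1, c1), (a2, b2, c2) in combinations(cons, 2):
        det = a1 * b2 - a2 * b1
        if abs(det) < 1e-12:
            continue                       # parallel boundaries, no vertex
        x = (c1 * b2 - c2 * b1) / det      # Cramer's rule
        y = (a1 * c2 - a2 * c1) / det
        if all(a * x + b * y <= c + 1e-9 for a, b, c in cons):
            z = weights[0] * x + weights[1] * y
            if z > best_z:
                best, best_z = (x, y), z
    return best, best_z

point, value = lp_max((3, 2), CONS)   # maximise 3x + 2y
```

Vertex enumeration only scales to a handful of variables; Solver or any LP library applies the simplex method to the same formulation.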

  19. Tank Waste Remediation System optimized processing strategy

    International Nuclear Information System (INIS)

    Slaathaug, E.J.; Boldt, A.L.; Boomer, K.D.; Galbraith, J.D.; Leach, C.E.; Waldo, T.L.

    1996-03-01

    This report provides an alternative strategy evolved from the current Hanford Site Tank Waste Remediation System (TWRS) programmatic baseline for accomplishing the treatment and disposal of the Hanford Site tank wastes. This optimized processing strategy performs the major elements of the TWRS Program, but modifies the deployment of selected treatment technologies to reduce the program cost. The present program for development of waste retrieval, pretreatment, and vitrification technologies continues, but the optimized processing strategy reuses a single facility to accomplish the separations/low-activity waste (LAW) vitrification and the high-level waste (HLW) vitrification processes sequentially, thereby eliminating the need for a separate HLW vitrification facility

  20. Optimal Multi-Interface Selection for Mobile Video Streaming in Efficient Battery Consumption and Data Usage

    Directory of Open Access Journals (Sweden)

    Seonghoon Moon

    2016-01-01

    Full Text Available With the proliferation of high-performance, large-screen mobile devices, users’ expectations of having access to high-resolution video content in smooth network environments are steadily growing. To guarantee such stable streaming, a high cellular network bandwidth is required; yet network providers often charge high prices for even limited data plans. Moreover, the costs of smoothly streaming high-resolution videos are not merely monetary; the device’s battery life must also be accounted for. To resolve these problems, we design an optimal multi-interface selection system for streaming video over HTTP/TCP. An optimization problem including battery life and LTE data constraints is derived and then solved using binary integer programming. Additionally, the system is designed with the adoption of split-layer scalable video coding, which provides direct adaptation of video quality and prevents out-of-order packet delivery problems. The proposed system is evaluated using a prototype application on a real iOS-based device as well as through experiments conducted in heterogeneous mobile scenarios. Results show that the system not only guarantees the highest possible video quality, but also prevents reckless consumption of LTE data and battery life.
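    The per-segment interface choice can be written as a tiny binary program (all quality, data, and battery figures invented): enumerate Wi-Fi/LTE assignments and keep the best one satisfying the LTE data and battery caps:

```python
from itertools import product

# Per-segment options (quality, LTE data in MB, battery units); index 0 is
# Wi-Fi, index 1 is LTE. All figures are invented for illustration.
OPTIONS = [[(2, 0, 1), (5, 8, 2)],
           [(4, 0, 1), (5, 8, 2)],
           [(1, 0, 1), (5, 8, 2)],
           [(3, 0, 1), (4, 8, 2)]]
DATA_CAP, BATTERY_CAP = 20, 8

def best_plan():
    """Enumerate all binary interface assignments and keep the feasible one
    with the highest total quality (what a BIP solver would return)."""
    best, best_q = None, -1
    for choice in product((0, 1), repeat=len(OPTIONS)):
        q = sum(OPTIONS[s][c][0] for s, c in enumerate(choice))
        data = sum(OPTIONS[s][c][1] for s, c in enumerate(choice))
        batt = sum(OPTIONS[s][c][2] for s, c in enumerate(choice))
        if data <= DATA_CAP and batt <= BATTERY_CAP and q > best_q:
            best, best_q = choice, q
    return best, best_q

plan, quality = best_plan()
```

With the data cap allowing at most two LTE segments, the optimizer spends them where the quality gain over Wi-Fi is largest.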

  1. A particle swarm optimization algorithm for beam angle selection in intensity-modulated radiotherapy planning

    International Nuclear Information System (INIS)

    Li Yongjie; Yao Dezhong; Yao, Jonathan; Chen Wufan

    2005-01-01

    Automatic beam angle selection is an important but challenging problem for intensity-modulated radiation therapy (IMRT) planning. Though many efforts have been made, it is still not very satisfactory in clinical IMRT practice because of the extensive computation required by the inverse problem. In this paper, a new technique named BASPSO (Beam Angle Selection with a Particle Swarm Optimization algorithm) is presented to improve the efficiency of the beam angle optimization problem. Originally developed as a tool for simulating social behaviour, the particle swarm optimization (PSO) algorithm is a relatively new population-based evolutionary optimization technique first introduced by Kennedy and Eberhart in 1995. In the proposed BASPSO, the beam angles are optimized using PSO by treating each beam configuration as a particle (individual), and the beam intensity maps for each beam configuration are optimized using the conjugate gradient (CG) algorithm. These two optimization processes are implemented iteratively. The performance of each individual is evaluated by a fitness value calculated with a physical objective function. A population of these individuals is evolved through generations by cooperation and competition among the individuals themselves. The optimization results for a simulated case with known optimal beam angles and two clinical cases (a prostate case and a head-and-neck case) show that PSO is valid and efficient and can speed up the beam angle optimization process. Furthermore, performance comparisons based on the preliminary results indicate that, as a whole, the PSO-based algorithm seems to outperform, or at least compete with, the GA-based algorithm in computation time and robustness. In conclusion, the reported work suggests that the introduced PSO algorithm could act as a promising new solution to the beam angle optimization problem and potentially other optimization problems in IMRT, though further studies are needed.
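    A bare-bones continuous PSO over beam angles, with a synthetic surrogate objective in place of the dose-based physical objective function (the "ideal" angles are invented for the demonstration):

```python
import random

random.seed(1)

def circ_dist(a, b):
    """Distance between two gantry angles on the 360-degree circle."""
    d = abs(a - b) % 360.0
    return min(d, 360.0 - d)

# Surrogate plan-quality objective: pretend three equispaced beams are ideal.
# A real implementation would score a dose distribution instead.
IDEAL = (0.0, 120.0, 240.0)
def objective(angles):
    return sum(circ_dist(a, t) ** 2 for a, t in zip(angles, IDEAL))

def pso(n_particles=30, iters=150, dim=3):
    """Plain global-best PSO over a set of beam angles."""
    w, c1, c2 = 0.72, 1.49, 1.49
    pos = [[random.uniform(0, 360) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [objective(p) for p in pos]
    gi = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[gi][:], pbest_f[gi]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            f = objective(pos[i])
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

angles, score = pso()
```

In BASPSO each such objective evaluation is itself expensive, since it runs a CG intensity-map optimization for the candidate beam configuration.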

  2. Parallel algorithms for islanded microgrid with photovoltaic and energy storage systems planning optimization problem: Material selection and quantity demand optimization

    Science.gov (United States)

    Cao, Yang; Liu, Chun; Huang, Yuehui; Wang, Tieqiang; Sun, Chenjun; Yuan, Yue; Zhang, Xinsong; Wu, Shuyun

    2017-02-01

    With the development of roof photovoltaic power (PV) generation technology and the increasingly urgent need to improve supply reliability in remote areas, the islanded microgrid with photovoltaic and energy storage systems (IMPE) is developing rapidly. The high costs of photovoltaic panel materials and energy storage battery materials have become the primary factors hindering the development of the IMPE. The advantages and disadvantages of different types of photovoltaic panel materials and energy storage battery materials are analyzed in this paper, and guidance is provided on material selection for IMPE planners. The time-sequential simulation method is applied to optimize the material demands of the IMPE. The model is solved by parallel algorithms provided by a commercial solver named CPLEX. Finally, to verify the model, an actual IMPE is selected as a case system. Simulation results on the case system indicate that the optimization model and the corresponding algorithm are feasible. Guidance on material selection and quantity demand for IMPEs in remote areas is provided by this method.

  3. Extensions of Dynamic Programming: Decision Trees, Combinatorial Optimization, and Data Mining

    KAUST Repository

    Hussain, Shahid

    2016-01-01

    This thesis is devoted to the development of extensions of dynamic programming to the study of decision trees. The considered extensions allow us to make multi-stage optimization of decision trees relative to a sequence of cost functions, to count the number of optimal trees, and to study relationships: cost vs cost and cost vs uncertainty for decision trees by construction of the set of Pareto-optimal points for the corresponding bi-criteria optimization problem. The applications include study of totally optimal (simultaneously optimal relative to a number of cost functions) decision trees for Boolean functions, improvement of bounds on complexity of decision trees for diagnosis of circuits, study of time and memory trade-off for corner point detection, study of decision rules derived from decision trees, creation of new procedure (multi-pruning) for construction of classifiers, and comparison of heuristics for decision tree construction. Part of these extensions (multi-stage optimization) was generalized to well-known combinatorial optimization problems: matrix chain multiplication, binary search trees, global sequence alignment, and optimal paths in directed graphs.
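    One of the combinatorial problems named at the end, matrix chain multiplication, is the canonical example of such a multi-stage dynamic program; a standard sketch:

```python
def matrix_chain(dims):
    """Minimal scalar multiplications to evaluate a matrix chain, where
    matrix i has shape dims[i] x dims[i+1] -- the classic dynamic program
    over increasing subchain lengths."""
    n = len(dims) - 1                      # number of matrices
    cost = [[0] * n for _ in range(n)]
    for length in range(2, n + 1):         # subchain length
        for i in range(n - length + 1):
            j = i + length - 1
            # Best split point k between matrices i..k and k+1..j.
            cost[i][j] = min(
                cost[i][k] + cost[k + 1][j] + dims[i] * dims[k + 1] * dims[j + 1]
                for k in range(i, j))
    return cost[0][n - 1]
```

The extensions described in the thesis would, for instance, optimize such tables against several cost functions in sequence or count the number of optimal parenthesizations; the sketch above shows only the single-criterion base case.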

  4. Extensions of Dynamic Programming: Decision Trees, Combinatorial Optimization, and Data Mining

    KAUST Repository

    Hussain, Shahid

    2016-07-10

    This thesis is devoted to the development of extensions of dynamic programming to the study of decision trees. The considered extensions allow us to make multi-stage optimization of decision trees relative to a sequence of cost functions, to count the number of optimal trees, and to study relationships: cost vs cost and cost vs uncertainty for decision trees by construction of the set of Pareto-optimal points for the corresponding bi-criteria optimization problem. The applications include study of totally optimal (simultaneously optimal relative to a number of cost functions) decision trees for Boolean functions, improvement of bounds on complexity of decision trees for diagnosis of circuits, study of time and memory trade-off for corner point detection, study of decision rules derived from decision trees, creation of new procedure (multi-pruning) for construction of classifiers, and comparison of heuristics for decision tree construction. Part of these extensions (multi-stage optimization) was generalized to well-known combinatorial optimization problems: matrix chain multiplication, binary search trees, global sequence alignment, and optimal paths in directed graphs.

  5. Optimizing diffusion of an online computer tailored lifestyle program: a study protocol

    Directory of Open Access Journals (Sweden)

    Schulz Daniela N

    2011-06-01

    Full Text Available Abstract Background Although the Internet is a promising medium for offering lifestyle interventions to large numbers of people at relatively low cost and effort, actual exposure rates of these interventions fail to meet the high expectations. Since the public health impact of an intervention is determined by its efficacy and the level of exposure to it, it is imperative to put effort into optimal dissemination. The present project attempts to optimize the dissemination process of a new online computer-tailored generic lifestyle program by carefully studying the adoption process and developing a strategy to achieve sustained use of the program. Methods/Design A prospective study will be conducted to yield relevant information concerning the adoption process by studying the level of adoption of the program, the determinants involved in adoption, and the characteristics of adopters and non-adopters as well as of satisfied and unsatisfied users. Furthermore, a randomized controlled trial will be conducted to test the effectiveness of a proactive strategy using periodic e-mail prompts in optimizing sustained use of the new program. Discussion Closely mapping the adoption process will provide insight into the characteristics of adopters and non-adopters and of satisfied and unsatisfied users. This insight can be used to further optimize the program by making it more suitable for a wider range of users, or to develop adjusted interventions to attract subgroups of users who are not reached by or satisfied with the initial intervention. Furthermore, by studying the effect of a proactive strategy using periodic prompts compared to a reactive strategy in stimulating sustained use of the intervention and, possibly, behaviour change, specific recommendations on the use and application of prompts in online lifestyle interventions can be developed.
Trial registration Dutch Trial Register NTR1786 and Medical Ethics Committee of Maastricht University and the University Hospital

  6. Optimization of a genomic breeding program for a moderately sized dairy cattle population.

    Science.gov (United States)

    Reiner-Benaim, A; Ezra, E; Weller, J I

    2017-04-01

    Although it is now standard practice to genotype thousands of female calves, genotyping of bull calves is generally limited to the progeny of elite cows. In addition to genotyping costs, increasing the pool of candidate sires requires purchase, isolation, and identification of calves until selection decisions are made. We economically optimized, via simulation, a genomic breeding program for a population of approximately 120,000 milk-recorded cows, corresponding to the Israeli Holstein population. All 30,000 heifers and 60,000 older cows of parities 1 to 3 were potential bull dams. Animals were assumed to have genetic evaluations for a trait with heritability of 0.25 derived by an animal model evaluation of the population. Only bull calves were assumed to be genotyped. A pseudo-phenotype corresponding to each animal's genetic evaluation was generated, consisting of the animal's genetic value plus a residual with variance set to obtain the assumed reliability for each group of animals. Between 4 and 15 bulls and between 200 and 27,000 cows with the highest pseudo-phenotypes were selected as candidate bull parents. For all progeny of the founder animals, genetic values were simulated as the mean of the parental values plus a Mendelian sampling effect with variance of 0.5. A probability of 0.3 for a healthy bull calf per mating and a genomic reliability of 0.43 were assumed. The 40 bull calves with the highest genomic evaluations were selected for general service for 1 yr. Costs included genotyping of candidate bulls and their dams, purchase of the calves from the farmers, and identification. Costs of raising culled calves were partially recovered by resale for beef. Annual costs were estimated as $10,922 + $305 × candidate bulls. Nominal profit per cow per genetic standard deviation was $106. The economic optimum with a discount rate of 5%, first returns after 4 yr, and a profit horizon of 15 yr was obtained by genotyping 1,620 to 1,750 calves for all numbers of bull sires
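    The core selection step can be sketched as a Monte Carlo simulation (heavily simplified from the paper's design; the reliability value 0.43 is taken from the abstract, everything else is a toy): candidates are ranked on a noisy evaluation, and a larger candidate pool yields higher true genetic merit among the selected:

```python
import random
import statistics

random.seed(2)
RELIABILITY = 0.43   # genomic reliability, as stated in the abstract

def mean_selected_value(pool_size, n_select=40, reps=30):
    """Average true genetic value of the top n_select candidates when
    ranking on a noisy evaluation with the given reliability."""
    acc = RELIABILITY ** 0.5          # accuracy = corr(evaluation, true value)
    gains = []
    for _ in range(reps):
        cands = []
        for _ in range(pool_size):
            true_value = random.gauss(0.0, 1.0)
            evaluation = (acc * true_value
                          + (1 - RELIABILITY) ** 0.5 * random.gauss(0.0, 1.0))
            cands.append((evaluation, true_value))
        top = sorted(cands, reverse=True)[:n_select]
        gains.append(statistics.mean(t for _, t in top))
    return statistics.mean(gains)

gain_small_pool = mean_selected_value(200)    # genotype few candidates
gain_large_pool = mean_selected_value(2000)   # genotype many candidates
```

The economic optimization in the paper balances this diminishing genetic gain against the per-calf genotyping and handling costs.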

  7. Selecting Optimal Feature Set in High-Dimensional Data by Swarm Search

    Directory of Open Access Journals (Sweden)

    Simon Fong

    2013-01-01

    Full Text Available Selecting the right set of features from data of high dimensionality for inducing an accurate classification model is a tough computational challenge. It is almost an NP-hard problem, as the combinations of features escalate exponentially as the number of features increases. Unfortunately in data mining, as well as in other engineering applications and bioinformatics, some data are described by a long array of features. Many feature subset selection algorithms have been proposed in the past, but not all of them are effective. Since it would take seemingly forever to exhaustively try every possible combination of features by brute force, stochastic optimization may be a solution. In this paper, we propose a new feature selection scheme called Swarm Search to find an optimal feature set by using metaheuristics. The advantage of Swarm Search is its flexibility in integrating any classifier into its fitness function and plugging in any metaheuristic algorithm to facilitate heuristic search. Simulation experiments are carried out by testing Swarm Search over some high-dimensional datasets, with different classification algorithms and various metaheuristic algorithms. The comparative experiment results show that Swarm Search is able to attain relatively low error rates in classification without shrinking the size of the feature subset to its minimum.

  8. An Optimization Model for Expired Drug Recycling Logistics Networks and Government Subsidy Policy Design Based on Tri-level Programming

    Directory of Open Access Journals (Sweden)

    Hui Huang

    2015-07-01

    Full Text Available In order to recycle and dispose of all expired drugs held by the public, the government should design a subsidy policy to stimulate users to return their expired drugs, and drugstores should take on the responsibility of recycling expired drugs, in other words, act as recycling stations. For this purpose it is necessary for the government to select the right recycling stations and treatment stations so as to optimize the expired drug recycling logistics network and minimize the total costs of recycling and disposal. This paper establishes a tri-level programming model to study how the government can optimize an expired drug recycling logistics network and the appropriate subsidy policies. Furthermore, a Hybrid Genetic Simulated Annealing Algorithm (HGSAA) is proposed to search for the optimal solution of the model. An experiment is discussed to illustrate the good quality of the recycling logistics network and government subsidies obtained by the HGSAA. The HGSAA is proven to converge on the global optimal solution and to act as an effective algorithm for solving the optimization problem of the expired drug recycling logistics network and government subsidies.
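    The simulated-annealing component of such a hybrid can be sketched on a toy facility-selection instance (locations and costs invented): candidate stations are toggled open/closed, and worse configurations are accepted with a temperature-controlled probability:

```python
import math
import random

random.seed(3)

# Toy instance on a line: user locations and candidate station opening costs.
USERS = [1, 2, 4, 7, 8, 12]
STATIONS = {0: 5.0, 5: 5.0, 9: 5.0, 13: 5.0}   # location -> opening cost

def total_cost(open_set):
    """Opening costs plus each user's travel to the nearest open station."""
    if not open_set:
        return float("inf")
    travel = sum(min(abs(u - s) for s in open_set) for u in USERS)
    return travel + sum(STATIONS[s] for s in open_set)

def anneal(iters=2000, t0=10.0, cooling=0.995):
    """Flip one station open/closed per move; accept uphill moves with
    probability exp(-delta / T) under a geometric cooling schedule."""
    cur = set(random.sample(sorted(STATIONS), 2))
    cur_c = total_cost(cur)
    best, best_c = set(cur), cur_c
    t = t0
    for _ in range(iters):
        cand = set(cur)
        cand.symmetric_difference_update({random.choice(sorted(STATIONS))})
        c = total_cost(cand)
        if c < cur_c or random.random() < math.exp(-(c - cur_c) / t):
            cur, cur_c = cand, c
            if cur_c < best_c:
                best, best_c = set(cur), cur_c
        t *= cooling
    return best, best_c

stations_open, cost = anneal()
```

In the HGSAA, such annealing moves are combined with genetic crossover and mutation over a population of candidate networks; this sketch shows the annealing half only.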

  9. Program management aid for redundancy selection and operational guidelines

    Science.gov (United States)

    Hodge, P. W.; Davis, W. L.; Frumkin, B.

    1972-01-01

    Although this criterion was developed specifically for use on the shuttle program, it has application to many other multi-mission programs (e.g., aircraft or mechanisms). The methodology employed is directly applicable even if the tools (nomographs and equations) are for mission-peculiar cases. The redundancy selection criterion was developed to ensure that both the design and operational cost impacts (life cycle costs) are considered in selecting the quantity of operational redundancy. These tools were developed as aids to expedite the decision process, not as automatic decision makers. This approach to redundancy selection is unique in that it enables a pseudo systems analysis to be performed on an equipment basis without waiting for all designs to be hardened.

  10. Optimized bioregenerative space diet selection with crew choice

    Science.gov (United States)

    Vicens, Carrie; Wang, Carolyn; Olabi, Ammar; Jackson, Peter; Hunter, Jean

    2003-01-01

    Previous studies on the optimization of crew diets have not accounted for choice. A diet selection model with crew choice was developed. Scenario analyses were conducted to assess the feasibility and cost of certain crew preferences, such as preferences for numerous desserts, high-sodium foods, and high-acceptability foods. For comparison purposes, a no-choice and a random-choice scenario were considered. The model was found to be feasible in terms of food variety and overall costs. The numerous-desserts, high-acceptability, and random-choice scenarios all resulted in feasible solutions costing between 13.2 and 17.3 kg ESM/person-day. Only the high-sodium scenario yielded an infeasible solution. This occurred when the foods highest in salt content were selected for the crew-choice portion of the diet. This infeasibility can be avoided by limiting the total sodium content of the crew-choice portion of the diet. Cost savings were found by reducing food variety in scenarios where the preference bias strongly affected nutritional content.

  11. Selecting, adapting, and sustaining programs in health care systems

    Directory of Open Access Journals (Sweden)

    Zullig LL

    2015-04-01

    Full Text Available Leah L Zullig,1,2 Hayden B Bosworth1–4. 1Center for Health Services Research in Primary Care, Durham Veterans Affairs Medical Center, Durham, NC, USA; 2Department of Medicine, Duke University Medical Center, Durham, NC, USA; 3School of Nursing, 4Department of Psychiatry and Behavioral Sciences, Duke University, Durham, NC, USA. Abstract: Practitioners and researchers often design behavioral programs that are effective for a specific population or problem. Despite their success in a controlled setting, relatively few programs are scaled up and implemented in health care systems. Planning for scale-up is a critical, yet often overlooked, element in the process of program design. Equally important is understanding how to select a program that has already been developed, and adapt and implement the program to meet specific organizational goals. This adaptation and implementation requires attention to organizational goals, available resources, and program cost. We assert that translational behavioral medicine necessitates expanding successful programs beyond a stand-alone research study. This paper describes key factors to consider when selecting, adapting, and sustaining programs for scale-up in large health care systems and applies the Knowledge to Action (KTA) Framework to a case study, illustrating knowledge creation and an action cycle of implementation and evaluation activities. Keywords: program sustainability, diffusion of innovation, information dissemination, health services research, intervention studies

  12. Hyperopt: a Python library for model selection and hyperparameter optimization

    Science.gov (United States)

    Bergstra, James; Komer, Brent; Eliasmith, Chris; Yamins, Dan; Cox, David D.

    2015-01-01

    Sequential model-based optimization (also known as Bayesian optimization) is one of the most efficient methods (per function evaluation) of function minimization. This efficiency makes it appropriate for optimizing the hyperparameters of machine learning algorithms that are slow to train. The Hyperopt library provides algorithms and parallelization infrastructure for performing hyperparameter optimization (model selection) in Python. This paper presents an introductory tutorial on the usage of the Hyperopt library, including the description of search spaces, minimization (in serial and parallel), and the analysis of the results collected in the course of minimization. This paper also gives an overview of Hyperopt-Sklearn, a software project that provides automatic algorithm configuration of the Scikit-learn machine learning library. Following Auto-Weka, we take the view that the choice of classifier and even the choice of preprocessing module can be taken together to represent a single large hyperparameter optimization problem. We use Hyperopt to define a search space that encompasses many standard components (e.g. SVM, RF, KNN, PCA, TFIDF) and common patterns of composing them together. We demonstrate, using search algorithms in Hyperopt and standard benchmarking data sets (MNIST, 20-newsgroups, convex shapes), that searching this space is practical and effective. In particular, we improve on best-known scores for the model space for both MNIST and convex shapes. The paper closes with some discussion of ongoing and future work.
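The core loop of sequential minimization over a search space can be illustrated with a minimal pure-Python sketch. This is not Hyperopt's TPE algorithm, only plain random search presented through an fmin-style interface; the names (`fmin_random`, `sample_space`) are illustrative and not part of the Hyperopt API.

```python
import random

def fmin_random(objective, sample_space, max_evals, seed=0):
    """Minimal random-search stand-in for sequential minimization:
    draw candidates from the search space, keep the best point seen."""
    rng = random.Random(seed)
    best_x, best_loss = None, float("inf")
    for _ in range(max_evals):
        x = sample_space(rng)          # draw one candidate from the space
        loss = objective(x)            # one (possibly expensive) evaluation
        if loss < best_loss:
            best_x, best_loss = x, loss
    return best_x, best_loss

# Minimize (x - 3)^2 over a uniform search space on [-10, 10].
best_x, best_loss = fmin_random(lambda x: (x - 3.0) ** 2,
                                lambda rng: rng.uniform(-10.0, 10.0),
                                max_evals=200)
```

In Hyperopt itself the same shape appears as `fmin(fn, space, algo=tpe.suggest, max_evals=...)`, with the model-based `tpe.suggest` replacing the blind sampling above.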

  13. SEWER NETWORK DISCHARGE OPTIMIZATION USING THE DYNAMIC PROGRAMMING

    Directory of Open Access Journals (Sweden)

    Viorel MINZU

    2015-12-01

    Full Text Available It is necessary to adopt an optimal control that allows efficient usage of existing sewer networks, in order to avoid building new retention facilities. The main objective of the control action is to minimize the overflow volume of a sewer network. This paper proposes a method to apply a solution obtained by discrete dynamic programming in a realistic closed-loop system.

  14. Optimal Corridor Selection for a Road Space Management Strategy: Methodology and Tool

    Directory of Open Access Journals (Sweden)

    Sushant Sharma

    2017-01-01

    Full Text Available Nationwide, there is a growing realization that there are valuable benefits to using the existing roadway facilities to their full potential rather than expanding capacity in a traditional way. Currently, state DOTs are looking for cost-effective transportation solutions to mitigate the growing congestion and increasing funding gaps. Innovative road space management strategies like narrowing multiple lanes (three or more) and the shoulder to add a lane enhance utilization while eliminating the costs associated with constructing new lanes. Although this strategy (among many) generally leads to better mobility, identifying optimal corridors is a challenge and may affect the benefits. Further, there is a likelihood that added capacity may provide localized benefits at the expense of system-level performance measures (travel time and crashes) because of the relocation of traffic operational bottlenecks. This paper develops a novel transportation programming and investment decision method to identify optimal corridors for adding capacity in the network by leveraging lane widths. The methodology explicitly takes into consideration system-level benefits and safety. The programming compares the two conflicting objectives of system travel time and safety benefits to find an optimal solution.

  15. Integrated Method for Optimizing Connection Layout and Cable Selection for an Internal Network of a Wind Farm

    Directory of Open Access Journals (Sweden)

    Andrzej Wędzik

    2015-09-01

    Full Text Available An internal network of a wind farm is similar to a wide network structure. Wind turbines are deployed over a vast area, and cable lines used to interconnect them may have lengths reaching tens of kilometres. The cost of constructing such a network is a major component of the entire investment. Therefore, it is advisable to develop a configuration of such a farm’s internal connections which will minimise the cost, while complying with technical requirements, even at the design stage. So far this has usually been done within two independent processes. First, the network structure ensuring the shortest possible connections between the turbines is determined. Then appropriate cables compliant with technical regulations are selected for the specified structure. But does this design approach ensure the optimal (lowest) investment cost? This paper gives an answer to this question. A method for accomplishing the task given in the title is presented. Examples of calculations are presented and results are compared for the two methods of optimal wind farm internal connection structure design and cable cross-section dimensioning: two-stage and integrated. The usefulness of employing the Mixed Integer Nonlinear Programming (MINLP) method in the process of determining the optimal structure of a wind farm’s cable network is demonstrated.

  16. Optimal Feature Space Selection in Detecting Epileptic Seizure based on Recurrent Quantification Analysis and Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Saleh Lashkari

    2016-06-01

    Full Text Available Selecting optimal features based on the nature of the phenomenon and high discriminant ability is very important in data classification problems. Since Recurrence Quantification Analysis (RQA) requires no assumptions about stationarity or the size of the signal and the noise, it may be useful for epileptic seizure detection. In this study, RQA was used to discriminate ictal EEG from normal EEG, with optimal features selected by a combination of a genetic algorithm and a Bayesian classifier. Recurrence plots of a hundred samples in each of the two categories were obtained with five distance norms: Euclidean, Maximum, Minimum, Normalized and Fixed Norm. In order to choose the optimal threshold for each norm, ten thresholds of ε were generated, and the best feature space was then selected by the genetic algorithm in combination with the Bayesian classifier. The results show that the proposed method is capable of discriminating ictal EEG from normal EEG; for the Minimum norm and 0.1˂ε˂1, accuracy was 100%. In addition, the sensitivity of the proposed framework to the ε and distance-norm parameters was low. The optimal feature presented in this study is Trans, which was selected in most feature spaces with high accuracy.

  17. SVM-RFE based feature selection and Taguchi parameters optimization for multiclass SVM classifier.

    Science.gov (United States)

    Huang, Mei-Ling; Hung, Yung-Hsiang; Lee, W M; Li, R K; Jiang, Bo-Ru

    2014-01-01

    Recently, the support vector machine (SVM) has shown excellent performance in classification and prediction and is widely used in disease diagnosis and medical assistance. However, SVM functions well only on two-group classification problems. This study combines feature selection with SVM recursive feature elimination (SVM-RFE) to investigate the classification accuracy of multiclass problems for the Dermatology and Zoo databases. The Dermatology dataset contains 33 feature variables, 1 class variable, and 366 testing instances; the Zoo dataset contains 16 feature variables, 1 class variable, and 101 testing instances. The feature variables in the two datasets were sorted in descending order by explanatory power, and different feature sets were selected by SVM-RFE to explore classification accuracy. Meanwhile, the Taguchi method was combined with the SVM classifier in order to optimize the parameters C and γ and increase classification accuracy for multiclass classification. The experimental results show that the classification accuracy can exceed 95% after SVM-RFE feature selection and Taguchi parameter optimization for the Dermatology and Zoo databases.

  18. Dynamic programming for optimization of timber production and grazing in ponderosa pine

    Science.gov (United States)

    Kurt H. Riitters; J. Douglas Brodie; David W. Hann

    1982-01-01

    Dynamic programming procedures are presented for optimizing thinning and rotation of even-aged ponderosa pine using four descriptors: age, basal area, number of trees, and time since thinning. Because both timber yield and grazing yield are functions of stand density, the two outputs, forage and timber, can both be optimized. The soil expectation values for single...

  19. Impact of Demand Response Programs on Optimal Operation of Multi-Microgrid System

    Directory of Open Access Journals (Sweden)

    Anh-Duc Nguyen

    2018-06-01

    Full Text Available The increased penetration of renewables is beneficial for power systems, but it poses several challenges, i.e., uncertainty in power supply, power quality issues, and other technical problems. Backup generators or storage systems have been proposed to solve this problem, but limitations remain due to high installation and maintenance costs. Furthermore, peak load is also an issue in the power distribution system. Due to the adjustable characteristics of loads, demand-side strategies such as demand response (DR) are more appropriate for dealing with these challenges. Therefore, this paper studies how DR programs influence the operation of the multi-microgrid (MMG). The implementation is executed based on a hierarchical energy management system (HiEMS) including microgrid EMSs (MG-EMSs) responsible for local optimization in each MG and a community EMS (C-EMS) responsible for community optimization in the MMG. Mixed integer linear programming (MILP)-based mathematical models are built for MMG optimal operation. Five scenarios consisting of single DR programs and DR groups are tested in an MMG test system to evaluate their impact on MMG operation. Among the five scenarios, some DR programs apply curtailing strategies, resulting in a study of the influence of base load value and curtailable load percentage on the amount of curtailed and shifted load as well as the operation cost of the MMG. Furthermore, the impact of DR programs on the amount of external and internal trading power in the MMG is also examined. In summary, each individual DR program or group could be useful in certain situations depending on the interest of the MMG, such as external trading, self-sufficiency or operation cost minimization.

  20. Optimal relay selection and power allocation for cognitive two-way relaying networks

    KAUST Repository

    Pandarakkottilil, Ubaidulla; Aïssa, Sonia

    2012-01-01

    In this paper, we present an optimal scheme for power allocation and relay selection in a cognitive radio network where a pair of cognitive (or secondary) transceiver nodes communicate with each other assisted by a set of cognitive two-way relays

  1. Optimal individual supervised hyperspectral band selection distinguishing savannah trees at leaf level

    CSIR Research Space (South Africa)

    Debba, Pravesh

    2009-08-01

    Full Text Available computer-intensive search technique to find the bands optimizing the value of TSAM as a function of the bands, by continually updating this function at successive steps. Band selection by means of minimizing the total accumulated correlation...

  2. Ckmeans.1d.dp: Optimal k-means Clustering in One Dimension by Dynamic Programming.

    Science.gov (United States)

    Wang, Haizhou; Song, Mingzhou

    2011-12-01

    The heuristic k-means algorithm, widely used for cluster analysis, does not guarantee optimality. We developed a dynamic programming algorithm for optimal one-dimensional clustering. The algorithm is implemented as an R package called Ckmeans.1d.dp. We demonstrate its advantage in optimality and runtime over the standard iterative k-means algorithm.
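The dynamic-programming idea behind such optimal 1-D clustering can be sketched in a few lines: sort the data, precompute prefix sums so the within-cluster cost of any contiguous segment is O(1), then fill a table D[m][j] holding the optimal cost of splitting the first j+1 sorted points into m clusters. The following is a hypothetical O(k·n²) Python version; the R package uses a faster formulation.

```python
def ckmeans_1d(data, k):
    """Optimal 1-D k-means by dynamic programming (O(k*n^2) sketch).
    Returns sorted data, cluster labels, and the minimal total SSE."""
    x = sorted(data)
    n = len(x)
    # Prefix sums of x and x^2: segment SSE in O(1).
    s = [0.0] * (n + 1)
    s2 = [0.0] * (n + 1)
    for i, v in enumerate(x):
        s[i + 1] = s[i] + v
        s2[i + 1] = s2[i] + v * v

    def cost(i, j):  # within-cluster SSE of x[i..j], inclusive
        m = j - i + 1
        seg = s[j + 1] - s[i]
        return s2[j + 1] - s2[i] - seg * seg / m

    INF = float("inf")
    D = [[INF] * n for _ in range(k + 1)]     # D[m][j]: best cost, m clusters on x[0..j]
    back = [[0] * n for _ in range(k + 1)]    # start index of the last cluster
    for j in range(n):
        D[1][j] = cost(0, j)
    for m in range(2, k + 1):
        for j in range(m - 1, n):
            for i in range(m - 1, j + 1):     # last cluster is x[i..j]
                c = D[m - 1][i - 1] + cost(i, j)
                if c < D[m][j]:
                    D[m][j] = c
                    back[m][j] = i
    # Recover labels by walking the backpointers right to left.
    labels, j = [0] * n, n - 1
    for m in range(k, 0, -1):
        i = back[m][j] if m > 1 else 0
        for t in range(i, j + 1):
            labels[t] = m - 1
        j = i - 1
    return x, labels, D[k][n - 1]
```

For example, clustering [10, 1, 2, 11, 3, 12] with k=2 recovers the two groups {1, 2, 3} and {10, 11, 12} with a total within-cluster sum of squares of 4.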

  3. 3rd International Conference on Modelling, Computation and Optimization in Information Systems and Management Sciences

    CERN Document Server

    Dinh, Tao; Nguyen, Ngoc

    2015-01-01

    This proceedings set contains 85 selected full papers presented at the 3rd International Conference on Modelling, Computation and Optimization in Information Systems and Management Sciences (MCO 2015), held on May 11–13, 2015 at Lorraine University, France. The present Part I of the two-volume set includes articles devoted to Combinatorial Optimization and Applications; DC Programming and DCA: Thirty Years of Developments; Dynamic Optimization; Modelling and Optimization in Financial Engineering; Multiobjective Programming; Numerical Optimization; Spline Approximation and Optimization; as well as Variational Principles and Applications.

  4. A Constraint Programming Model for Fast Optimal Stowage of Container Vessel Bays

    DEFF Research Database (Denmark)

    Delgado-Ortegon, Alberto; Jensen, Rune Møller; Janstrup, Kira

    2012-01-01

    Container vessel stowage planning is a hard combinatorial optimization problem with both high economic and environmental impact. We have developed an approach that often is able to generate near-optimal plans for large container vessels within a few minutes. It decomposes the problem into a master planning phase that distributes the containers to bay sections and a slot planning phase that assigns containers of each bay section to slots. In this paper, we focus on the slot planning phase of this approach and present a constraint programming and integer programming model for stowing a set of containers in a single bay section. This so-called slot planning problem is NP-hard and often involves stowing several hundred containers. Using state-of-the-art constraint solvers and modeling techniques, however, we were able to solve 90% of 236 real instances from our industrial collaborator to optimality.

  5. Optimal Channel Selection Based on Online Decision and Offline Learning in Multichannel Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Mu Qiao

    2017-01-01

    Full Text Available We propose a channel selection strategy with a hybrid architecture, which combines the centralized method and the distributed method to alleviate the overhead of the access point and at the same time provide more flexibility in network deployment. With this architecture, we make use of game theory and reinforcement learning to fulfill optimal channel selection under different communication scenarios. Particularly, when the network can satisfy the requirements of energy and computational costs, the online decision algorithm based on a noncooperative game can help each individual sensor node immediately select the optimal channel. Alternatively, when the network cannot satisfy the requirements of energy and computational costs, the offline learning algorithm based on reinforcement learning can help each individual sensor node learn from its experience and iteratively adjust its behavior toward the expected target. Extensive simulation results validate the effectiveness of our proposal and also prove that higher system throughput can be achieved by our channel selection strategy than by conventional off-policy channel selection approaches.
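The offline-learning branch described above can be sketched as a simple epsilon-greedy bandit learner; this is a generic stand-in for the paper's reinforcement-learning scheme, and the function name and reward model (`reward_prob` as an unknown per-channel success probability) are illustrative assumptions.

```python
import random

def learn_channel(reward_prob, episodes=5000, eps=0.1, seed=0):
    """Epsilon-greedy channel learning sketch: a node estimates each
    channel's quality from trial-and-error transmission feedback.
    reward_prob[c] is the (unknown to the node) success probability."""
    rng = random.Random(seed)
    n_arms = len(reward_prob)
    q = [0.0] * n_arms        # running estimate of each channel's quality
    count = [0] * n_arms
    for _ in range(episodes):
        if rng.random() < eps:
            c = rng.randrange(n_arms)                      # explore
        else:
            c = max(range(n_arms), key=q.__getitem__)      # exploit best estimate
        r = 1.0 if rng.random() < reward_prob[c] else 0.0  # noisy channel feedback
        count[c] += 1
        q[c] += (r - q[c]) / count[c]                      # sample-average update
    return max(range(n_arms), key=q.__getitem__)

best = learn_channel([0.2, 0.8, 0.5])  # node converges on the best channel
```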

  6. Optimal traffic control in highway transportation networks using linear programming

    KAUST Repository

    Li, Yanning; Canepa, Edward S.; Claudel, Christian G.

    2014-01-01

    of the Hamilton-Jacobi PDE, the problem of controlling the state of the system on a network link in a finite horizon can be posed as a Linear Program. Assuming all intersections in the network are controllable, we show that the optimization approach can

  7. A Quantitative Optimization Framework for Market-Driven Academic Program Portfolios

    NARCIS (Netherlands)

    Burgher, Joshua; Hamers, Herbert

    2017-01-01

    We introduce a quantitative model that can be used for decision support for planning and optimizing the composition of portfolios of market-driven academic programs within the context of higher education. This model is intended to enable leaders in colleges and universities to maximize financial

  8. Optimized Irregular Low-Density Parity-Check Codes for Multicarrier Modulations over Frequency-Selective Channels

    Directory of Open Access Journals (Sweden)

    Valérian Mannoni

    2004-09-01

    Full Text Available This paper deals with optimized channel coding for OFDM transmissions (COFDM over frequency-selective channels using irregular low-density parity-check (LDPC codes. Firstly, we introduce a new characterization of the LDPC code irregularity called “irregularity profile.” Then, using this parameterization, we derive a new criterion based on the minimization of the transmission bit error probability to design an irregular LDPC code suited to the frequency selectivity of the channel. The optimization of this criterion is done using the Gaussian approximation technique. Simulations illustrate the good performance of our approach for different transmission channels.

  9. The Effect of Exit Strategy on Optimal Portfolio Selection with Birandom Returns

    OpenAIRE

    Cao, Guohua; Shan, Dan

    2013-01-01

    The aims of this paper are to use a birandom variable to denote the stock return selected by some recurring technical patterns and to study the effect of exit strategy on optimal portfolio selection with birandom returns. Firstly, we propose a new method to estimate the stock return and use birandom distribution to denote the final stock return which can reflect the features of technical patterns and investors' heterogeneity simultaneously; secondly, we build a birandom safety-first model and...

  10. Fuzzy 2-partition entropy threshold selection based on Big Bang–Big Crunch Optimization algorithm

    Directory of Open Access Journals (Sweden)

    Baljit Singh Khehra

    2015-03-01

    Full Text Available The fuzzy 2-partition entropy approach has been widely used to select a threshold value for image segmentation. This approach uses two parameterized fuzzy membership functions to form a fuzzy 2-partition of the image. The optimal threshold is selected by searching for an optimal combination of parameters of the membership functions such that the entropy of the fuzzy 2-partition is maximized. In this paper, a new fuzzy 2-partition entropy thresholding approach based on the Big Bang–Big Crunch Optimization (BBBCO) technique is proposed. The new thresholding approach is called the BBBCO-based fuzzy 2-partition entropy thresholding algorithm. BBBCO is used to search for an optimal combination of parameters of the membership functions that maximizes the entropy of the fuzzy 2-partition. BBBCO is inspired by a theory of the evolution of the universe, namely the Big Bang and Big Crunch Theory. The proposed algorithm is tested on a number of standard test images. For comparison, three other approaches, Genetic Algorithm (GA)-based, Biogeography-based Optimization (BBO)-based and recursive, are also implemented. From the experimental results, it is observed that the proposed algorithm is more effective than the GA-based, BBO-based and recursion-based approaches.
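The underlying objective, choosing membership-function parameters that maximize the entropy of the resulting fuzzy 2-partition, can be sketched with a brute-force search standing in for the BBBCO metaheuristic. The linear (ramp) membership function, the candidate parameter list, and the crossover-point threshold rule are illustrative assumptions.

```python
import math

def fuzzy_2partition_entropy_threshold(hist, spans):
    """Pick membership parameters (a, c) maximizing the entropy of a fuzzy
    dark/bright 2-partition of a grayscale histogram; exhaustive search
    stands in for the BBBCO metaheuristic. Returns a threshold level."""
    total = sum(hist)
    best = (-1.0, None)
    for a, c in spans:            # candidate parameters a < c of a ramp membership
        p_dark = 0.0
        for g in range(len(hist)):
            if g <= a:
                mu = 1.0          # fully "dark"
            elif g >= c:
                mu = 0.0          # fully "bright"
            else:
                mu = (c - g) / (c - a)   # linear transition
            p_dark += mu * hist[g] / total
        p_bright = 1.0 - p_dark
        h = 0.0                   # Shannon entropy of the fuzzy 2-partition
        for p in (p_dark, p_bright):
            if p > 0:
                h -= p * math.log(p)
        if h > best[0]:
            best = (h, (a, c))
    a, c = best[1]
    return (a + c) // 2           # mu = 0.5 crossover taken as the threshold

# Bimodal 8-level histogram: mass at both ends, valley in the middle.
t = fuzzy_2partition_entropy_threshold([4, 4, 0, 0, 0, 0, 4, 4],
                                       [(0, 3), (2, 5), (4, 7)])
```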

  11. Portfolio optimization for seed selection in diverse weather scenarios.

    Science.gov (United States)

    Marko, Oskar; Brdar, Sanja; Panić, Marko; Šašić, Isidora; Despotović, Danica; Knežević, Milivoje; Crnojević, Vladimir

    2017-01-01

    The aim of this work was to develop a method for selection of optimal soybean varieties for the American Midwest using data analytics. We extracted the knowledge about 174 varieties from the dataset, which contained information about weather, soil, yield and regional statistical parameters. Next, we predicted the yield of each variety in each of 6,490 observed subregions of the Midwest. Furthermore, yield was predicted for all the possible weather scenarios approximated by 15 historical weather instances contained in the dataset. Using predicted yields and covariance between varieties through different weather scenarios, we performed portfolio optimisation. In this way, for each subregion, we obtained a selection of varieties, that proved superior to others in terms of the amount and stability of yield. According to the rules of Syngenta Crop Challenge, for which this research was conducted, we aggregated the results across all subregions and selected up to five soybean varieties that should be distributed across the network of seed retailers. The work presented in this paper was the winning solution for Syngenta Crop Challenge 2017.
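The mean-variance trade-off behind such portfolio selection can be sketched for a two-variety case: treat each variety's predicted yields across weather scenarios as the return samples of an asset, and grid-search the blending weight that maximizes mean yield minus a risk penalty. The function name and the scalar risk-aversion parameter `lam` are illustrative assumptions, not the authors' formulation.

```python
def portfolio_select(yields, lam=1.0, steps=100):
    """Mean-variance selection over two varieties: yields[v] lists the
    predicted yield of variety v under each historical weather scenario.
    Grid-search the weight w on variety 0 maximizing mean - lam * variance."""
    a, b = yields
    n = len(a)
    best_w, best_score = 0.0, float("-inf")
    for k in range(steps + 1):
        w = k / steps
        mix = [w * a[i] + (1 - w) * b[i] for i in range(n)]  # blended yield per scenario
        mean = sum(mix) / n
        var = sum((y - mean) ** 2 for y in mix) / n          # yield instability penalty
        score = mean - lam * var
        if score > best_score:
            best_w, best_score = w, score
    return best_w, best_score

# Two varieties with equal mean yield but opposite weather response:
# an even split cancels the scenario risk entirely.
best_w, best_score = portfolio_select([[10.0, 14.0], [14.0, 10.0]])
```

The diversification effect is the point: neither variety alone is stable across the two scenarios, but the 50/50 blend yields 12 in both.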

  12. Portfolio optimization for seed selection in diverse weather scenarios.

    Directory of Open Access Journals (Sweden)

    Oskar Marko

    Full Text Available The aim of this work was to develop a method for selection of optimal soybean varieties for the American Midwest using data analytics. We extracted the knowledge about 174 varieties from the dataset, which contained information about weather, soil, yield and regional statistical parameters. Next, we predicted the yield of each variety in each of 6,490 observed subregions of the Midwest. Furthermore, yield was predicted for all the possible weather scenarios approximated by 15 historical weather instances contained in the dataset. Using predicted yields and covariance between varieties through different weather scenarios, we performed portfolio optimisation. In this way, for each subregion, we obtained a selection of varieties, that proved superior to others in terms of the amount and stability of yield. According to the rules of Syngenta Crop Challenge, for which this research was conducted, we aggregated the results across all subregions and selected up to five soybean varieties that should be distributed across the network of seed retailers. The work presented in this paper was the winning solution for Syngenta Crop Challenge 2017.

  13. Reducing residual stresses and deformations in selective laser melting through multi-level multi-scale optimization of cellular scanning strategy

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2016-01-01

    …A multilevel optimization strategy is adopted using a customized genetic algorithm developed for optimizing cellular scanning strategy for selective laser melting, with an objective of reducing residual stresses and deformations. The resulting thermo-mechanically optimized cellular scanning strategies… A calibrated, fast, multiscale thermal model coupled with a 3D finite element mechanical model is used to simulate residual stress formation and deformations during selective laser melting. The resulting reduction in thermal model computation time allows evolutionary algorithm-based optimization of the process…

  14. Regulation of Dynamical Systems to Optimal Solutions of Semidefinite Programs: Algorithms and Applications to AC Optimal Power Flow

    Energy Technology Data Exchange (ETDEWEB)

    Dall'Anese, Emiliano; Dhople, Sairaj V.; Giannakis, Georgios B.

    2015-07-01

    This paper considers a collection of networked nonlinear dynamical systems, and addresses the synthesis of feedback controllers that seek optimal operating points corresponding to the solution of pertinent network-wide optimization problems. Particular emphasis is placed on the solution of semidefinite programs (SDPs). The design of the feedback controller is grounded on a dual ε-subgradient approach, with the dual iterates utilized to dynamically update the dynamical-system reference signals. Global convergence is guaranteed for diminishing stepsize rules, even when the reference inputs are updated at a faster rate than the dynamical-system settling time. The application of the proposed framework to the control of power-electronic inverters in AC distribution systems is discussed. The objective is to bridge the time-scale separation between real-time inverter control and network-wide optimization. Optimization objectives assume the form of SDP relaxations of prototypical AC optimal power flow problems.

  15. C-program LINOP for the evaluation of film dosemeters by linear optimization. User manual

    International Nuclear Information System (INIS)

    Kragh, P.

    1995-11-01

    Linear programming yields an optimal measuring value for film dosemeters. The LINOP program was developed for this linear programming task. The program permits the evaluation and control of film dosemeters and of all other multi-component dosemeters. This user manual for the LINOP program contains the source program, a description of the program, and installation and use instructions. The data sets with programs and examples are available upon request.

  16. A Constraint programming-based genetic algorithm for capacity output optimization

    Directory of Open Access Journals (Sweden)

    Kate Ean Nee Goh

    2014-10-01

    Full Text Available Purpose: The manuscript presents an investigation into a constraint programming-based genetic algorithm (CPGA) for capacity output optimization in a back-end semiconductor manufacturing company. Design/methodology/approach: In the first stage, constraint programming defining the relationships between variables was formulated into the objective function. A genetic algorithm model was created in the second stage to optimize capacity output. Three demand scenarios were applied to test the robustness of the proposed algorithm. Findings: CPGA improved both machine utilization and capacity output once the minimum requirements of a demand scenario were fulfilled. Capacity outputs of the three scenarios were improved by 157%, 7%, and 69%, respectively. Research limitations/implications: The work relates to aggregate planning of machine capacity in a single case study. The constraints and constructed scenarios were therefore industry-specific. Practical implications: Capacity planning in a semiconductor manufacturing facility needs to consider multiple mutually influencing constraints in resource availability, process flow and product demand. The findings prove that CPGA is a practical and efficient alternative for optimizing capacity output and allows the company to review its capacity with quick feedback. Originality/value: The work integrates two contemporary computational methods for a real industry application conventionally reliant on human judgement.

  17. Particle swarm optimization for programming deep brain stimulation arrays.

    Science.gov (United States)

    Peña, Edgar; Zhang, Simeng; Deyo, Steve; Xiao, YiZi; Johnson, Matthew D

    2017-02-01

    Deep brain stimulation (DBS) therapy relies on both precise neurosurgical targeting and systematic optimization of stimulation settings to achieve beneficial clinical outcomes. One recent advance to improve targeting is the development of DBS arrays (DBSAs) with electrodes segmented both along and around the DBS lead. However, increasing the number of independent electrodes creates the logistical challenge of optimizing stimulation parameters efficiently. Solving such complex problems with multiple solutions and objectives is well known to occur in biology, in which complex collective behaviors emerge out of swarms of individual organisms engaged in learning through social interactions. Here, we developed a particle swarm optimization (PSO) algorithm to program DBSAs using a swarm of individual particles representing electrode configurations and stimulation amplitudes. Using a finite element model of motor thalamic DBS, we demonstrate how the PSO algorithm can efficiently optimize a multi-objective function that maximizes predictions of axonal activation in regions of interest (ROI, cerebellar-receiving area of motor thalamus), minimizes predictions of axonal activation in regions of avoidance (ROA, somatosensory thalamus), and minimizes power consumption. The algorithm solved the multi-objective problem by producing a Pareto front. ROI and ROA activation predictions were consistent across swarms (<1% median discrepancy in axon activation). The algorithm was able to accommodate for (1) lead displacement (1 mm) with relatively small ROI (⩽9.2%) and ROA (⩽1%) activation changes, irrespective of shift direction; (2) reduction in maximum per-electrode current (by 50% and 80%) with ROI activation decreasing by 5.6% and 16%, respectively; and (3) disabling electrodes (n  =  3 and 12) with ROI activation reduction by 1.8% and 14%, respectively. Additionally, comparison between PSO predictions and multi-compartment axon model simulations showed discrepancies
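A basic single-objective PSO loop, of the kind extended to multiple objectives and a Pareto front in the work above, can be sketched as follows; the coefficient values and function name are illustrative assumptions, not the authors' implementation.

```python
import random

def pso_minimize(f, dim, bounds, n_particles=20, iters=100, seed=1):
    """Basic particle swarm optimization sketch: each particle tracks its
    personal best and is attracted toward the swarm's global best."""
    rng = random.Random(seed)
    lo, hi = bounds
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia, cognitive and social coefficients
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))  # clamp to bounds
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Minimize a shifted sphere function; the swarm converges near (1, 1).
gbest, gbest_val = pso_minimize(lambda p: sum((x - 1.0) ** 2 for x in p),
                                dim=2, bounds=(-5.0, 5.0))
```

In the multi-objective DBSA setting, the scalar comparison `val < pbest_val[i]` is replaced by Pareto dominance checks, yielding a front of non-dominated electrode configurations rather than a single best point.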

  18. Particle Swarm Optimization for Programming Deep Brain Stimulation Arrays

    Science.gov (United States)

    Peña, Edgar; Zhang, Simeng; Deyo, Steve; Xiao, YiZi; Johnson, Matthew D.

    2017-01-01

    Objective Deep brain stimulation (DBS) therapy relies on both precise neurosurgical targeting and systematic optimization of stimulation settings to achieve beneficial clinical outcomes. One recent advance to improve targeting is the development of DBS arrays (DBSAs) with electrodes segmented both along and around the DBS lead. However, increasing the number of independent electrodes creates the logistical challenge of optimizing stimulation parameters efficiently. Approach Solving such complex problems with multiple solutions and objectives is well known to occur in biology, in which complex collective behaviors emerge out of swarms of individual organisms engaged in learning through social interactions. Here, we developed a particle swarm optimization (PSO) algorithm to program DBSAs using a swarm of individual particles representing electrode configurations and stimulation amplitudes. Using a finite element model of motor thalamic DBS, we demonstrate how the PSO algorithm can efficiently optimize a multi-objective function that maximizes predictions of axonal activation in regions of interest (ROI, cerebellar-receiving area of motor thalamus), minimizes predictions of axonal activation in regions of avoidance (ROA, somatosensory thalamus), and minimizes power consumption. Main Results The algorithm solved the multi-objective problem by producing a Pareto front. ROI and ROA activation predictions were consistent across swarms (<1% median discrepancy in axon activation). The algorithm was able to accommodate for (1) lead displacement (1 mm) with relatively small ROI (≤9.2%) and ROA (≤1%) activation changes, irrespective of shift direction; (2) reduction in maximum per-electrode current (by 50% and 80%) with ROI activation decreasing by 5.6% and 16%, respectively; and (3) disabling electrodes (n=3 and 12) with ROI activation reduction by 1.8% and 14%, respectively. Additionally, comparison between PSO predictions and multi-compartment axon model simulations

  19. Optimization of multi-environment trials for genomic selection based on crop models.

    Science.gov (United States)

    Rincent, R; Kuhn, E; Monod, H; Oury, F-X; Rousset, M; Allard, V; Le Gouis, J

    2017-08-01

    We propose a statistical criterion to optimize multi-environment trials to predict genotype × environment interactions more efficiently, by combining crop growth models and genomic selection models. Genotype × environment interactions (GEI) are common in plant multi-environment trials (METs). In this context, models developed for genomic selection (GS), which refers to the use of genome-wide information for predicting breeding values of selection candidates, need to be adapted. One promising way to increase prediction accuracy in various environments is to combine ecophysiological and genetic modelling through crop growth models (CGM) incorporating genetic parameters. The efficiency of this approach relies on the quality of the parameter estimates, which depends on the environments composing the MET used for calibration. The objective of this study was to determine a method for optimizing the set of environments composing the MET for estimating genetic parameters in this context. A criterion called OptiMET was defined to this aim and was evaluated on simulated and real data, with the example of wheat phenology. The MET defined with OptiMET allowed the genetic parameters to be estimated with lower error, leading to higher QTL detection power and higher prediction accuracies. A MET defined with OptiMET was on average more efficient, in terms of the quality of the parameter estimates, than a random MET composed of twice as many environments. OptiMET is thus a valuable tool for determining optimal experimental conditions to best exploit METs and the phenotyping tools that are currently being developed.

  20. Optimization of programming parameters in children with the advanced bionics cochlear implant.

    Science.gov (United States)

    Baudhuin, Jacquelyn; Cadieux, Jamie; Firszt, Jill B; Reeder, Ruth M; Maxson, Jerrica L

    2012-05-01

    -to-noise ratio). Outcomes were analyzed using a paired t-test and a mixed-model repeated measures analysis of variance (ANOVA). T-levels set 10 CUs below "soft" resulted in significantly lower detection thresholds for all six Ling sounds and FM tones at 250, 1000, 3000, 4000, and 6000 Hz. When comparing programs differing by IDR and sensitivity, a 50 dB IDR with a 0 sensitivity setting showed significantly poorer thresholds for low frequency FM tones and voiced Ling sounds. Analysis of group mean scores for CNC words in quiet or HINT-C sentences in noise indicated no significant differences across IDR/sensitivity settings. Individual data, however, showed significant differences between IDR/sensitivity programs in noise; the optimal program differed across participants. In pediatric recipients of the Advanced Bionics cochlear implant device, manually setting T-levels with ascending loudness judgments should be considered when possible or when low-level sounds are inaudible. Study findings confirm the need to determine program settings on an individual basis as well as the importance of speech recognition verification measures in both quiet and noise. Clinical guidelines are suggested for selection of programming parameters in both young and older children. American Academy of Audiology.

  1. [Hyperspectral remote sensing image classification based on SVM optimized by clonal selection].

    Science.gov (United States)

    Liu, Qing-Jie; Jing, Lin-Hai; Wang, Meng-Fei; Lin, Qi-Zhong

    2013-03-01

    Model selection for the support vector machine (SVM), involving selection of the kernel and margin parameter values, is usually time-consuming and greatly impacts both the training efficiency of the SVM model and the final classification accuracy of an SVM hyperspectral remote sensing image classifier. First, based on combinatorial optimization theory and the cross-validation method, an artificial immune clonal selection algorithm is introduced for the optimal selection of the SVM kernel parameter and margin parameter C (CSSVM) to improve the training efficiency of the SVM model. An experiment classifying an AVIRIS image of the Indian Pines site, USA, was then performed to test the novel CSSVM against a traditional SVM classifier tuned by the general grid-searching cross-validation method (GSSVM) for comparison. Evaluation indexes, including SVM model training time, classification overall accuracy (OA) and Kappa index, of both CSSVM and GSSVM were analyzed quantitatively. The OA of CSSVM on the test samples and the whole image are 85.1% and 81.58%, differing from those of GSSVM by no more than 0.08%; the Kappa indexes reach 0.8213 and 0.7728, differing from those of GSSVM by less than 0.001; and the ratio of model training time of CSSVM to GSSVM is between 1/6 and 1/10. Therefore, CSSVM is a fast and accurate algorithm for hyperspectral image classification and is superior to GSSVM.
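    The clonal selection idea (clone good antibodies, mutate the clones, keep the best) can be sketched for (C, gamma) tuning. To keep the example self-contained, a smooth stand-in surface replaces real k-fold cross-validation of an SVM; the surface shape, population sizes, and mutation schedule are all illustrative assumptions, not the paper's settings.

```python
import random

random.seed(1)

def cv_accuracy(log_c, log_g):
    # Stand-in for cross-validation accuracy of an SVM at
    # (C, gamma) = (2**log_c, 2**log_g): a smooth surface peaking
    # at log_c = 3, log_g = -2. A real run would train the SVM here.
    return 1.0 / (1.0 + 0.1 * ((log_c - 3) ** 2 + (log_g + 2) ** 2))

def clonalg(pop_size=10, n_gen=40, n_clones=5, lo=-5.0, hi=5.0):
    # CLONALG-style loop: each antibody is a (log_c, log_g) pair.
    pop = [(random.uniform(lo, hi), random.uniform(lo, hi))
           for _ in range(pop_size)]
    for _ in range(n_gen):
        scored = sorted(pop, key=lambda p: -cv_accuracy(*p))
        parents = scored[: pop_size // 2]      # keep the best antibodies
        new_pop = parents[:]
        for rank, (c, g) in enumerate(parents):
            for _ in range(n_clones):
                step = 0.5 * (rank + 1)        # worse parents mutate more
                cc = min(hi, max(lo, c + random.gauss(0, step)))
                gg = min(hi, max(lo, g + random.gauss(0, step)))
                new_pop.append((cc, gg))
        pop = sorted(new_pop, key=lambda p: -cv_accuracy(*p))[:pop_size]
    return max(pop, key=lambda p: cv_accuracy(*p))
```

    Compared with an exhaustive grid search over the same box, the swarm concentrates evaluations near promising regions, which is the source of the training-time savings the record reports.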

  2. Space-planning and structural solutions of low-rise buildings: Optimal selection methods

    Science.gov (United States)

    Gusakova, Natalya; Minaev, Nikolay; Filushina, Kristina; Dobrynina, Olga; Gusakov, Alexander

    2017-11-01

    The present study is devoted to a methodology for appropriately selecting space-planning and structural solutions for low-rise buildings. The objective of the study is to work out a system of criteria influencing the selection of the space-planning and structural solutions most suitable for low-rise buildings and structures. Applying the defined criteria in practice aims to enhance the efficiency of capital investments, save energy and resources, and create comfortable conditions for the population, considering the climatic zoning of the construction site. The project's developments can be applied when implementing investment-construction projects of low-rise housing in different kinds of territories based on local building materials. A system of criteria influencing the optimal selection of space-planning and structural solutions of low-rise buildings has been developed. A methodological basis has also been elaborated for assessing the optimal selection of space-planning and structural solutions of low-rise buildings satisfying the requirements of energy efficiency, comfort, safety, and economic efficiency. The elaborated methodology makes it possible to intensify the development of low-rise construction in different types of territories, taking into account the climatic zoning of the construction site. Stimulation of low-rise construction should be based on a system of scientifically justified approaches, thus enhancing the energy efficiency, comfort, safety, and economic effectiveness of low-rise buildings.

  3. An overview of the Douglas Aircraft Company Aeroelastic Design Optimization Program (ADOP)

    Science.gov (United States)

    Dodd, Alan J.

    1989-01-01

    From a program manager's viewpoint, the history, scope and architecture of a major structural design program at Douglas Aircraft Company called Aeroelastic Design Optimization Program (ADOP) are described. ADOP was originally intended for the rapid, accurate, cost-effective evaluation of relatively small structural models at the advanced design level, resulting in improved proposal competitiveness and avoiding many costly changes later in the design cycle. Before release of the initial version in November 1987, however, the program was expanded to handle very large production-type analyses.

  4. Reserve selection with land market feedbacks.

    Science.gov (United States)

    Butsic, Van; Lewis, David J; Radeloff, Volker C

    2013-01-15

    How best to site reserves is a leading question for conservation biologists. Recently, reserve selection has emphasized efficient conservation: maximizing conservation goals given the reality of limited conservation budgets. Prior work indicates that land markets can potentially undermine the conservation benefits of reserves by increasing property values and development probabilities near reserves. Here we propose a reserve selection methodology which optimizes conservation given both a budget constraint and land market feedbacks by using a combination of econometric models along with stochastic dynamic programming. We show that amenity-based feedbacks can be accounted for in optimal reserve selection by choosing property price and land development models which exogenously estimate the effects of reserve establishment. In our empirical example, we use previously estimated models of land development and property prices to select parcels to maximize coarse woody debris along 16 lakes in Vilas County, WI, USA. Using each lake as an independent experiment, we find that including land market feedbacks in the reserve selection algorithm has only small effects on conservation efficacy. Likewise, we find that in our setting heuristic (minloss and maxgain) algorithms perform nearly as well as the optimal selection strategy. We emphasize that land market feedbacks can be included in optimal reserve selection; the extent to which this improves reserve placement will likely vary across landscapes. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. A reliable computational workflow for the selection of optimal screening libraries.

    Science.gov (United States)

    Gilad, Yocheved; Nadassy, Katalin; Senderowitz, Hanoch

    2015-01-01

    The experimental screening of compound collections is a common starting point in many drug discovery projects. Successes of such screening campaigns critically depend on the quality of the screened library. Many libraries are currently available from different vendors, yet the selection of the optimal screening library for a specific project is challenging. We have devised a novel workflow for the rational selection of project-specific screening libraries. The workflow accepts as input a set of virtual candidate libraries and applies the following steps to each library: (1) data curation; (2) assessment of ADME/T profile; (3) assessment of the number of promiscuous binders/frequent HTS hitters; (4) assessment of internal diversity; (5) assessment of similarity to known active compound(s) (optional); (6) assessment of similarity to in-house or otherwise accessible compound collections (optional). For ADME/T profiling, Lipinski's and Veber's rule-based filters were implemented and a new blood-brain barrier permeation model was developed and validated (85% and 74% success rates for the training and test sets, respectively). Diversity and similarity descriptors which demonstrated the best performance in terms of their ability to select either diverse or focused sets of compounds from three databases (Drug Bank, CMC and CHEMBL) were identified and used for diversity and similarity assessments. The workflow was used to analyze nine common screening libraries available from six vendors. The results of this analysis are reported for each library, providing an assessment of its quality. Furthermore, a consensus approach was developed to combine the results of these analyses into a single score for selecting the optimal library under different scenarios. We have devised and tested a new workflow for the rational selection of screening libraries under different scenarios. The current workflow was implemented using the Pipeline Pilot software yet due to the usage of generic
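    The rule-based ADME/T step of such a workflow is straightforward to sketch. The filter below applies Lipinski's rule of five and Veber's rules to precomputed molecular descriptors; the compound names and descriptor values are hypothetical, and in practice the descriptors would come from a cheminformatics toolkit rather than a hand-written table.

```python
# Rule-based ADME/T filters over precomputed molecular descriptors:
# mw = molecular weight, logp = calculated logP, hbd/hba = H-bond
# donors/acceptors, rotb = rotatable bonds, tpsa = topological polar
# surface area. All values below are illustrative.

def passes_lipinski(d):
    # Strict form of Lipinski's rule of five (in practice one violation
    # is often tolerated; the strict form suits library triage).
    return (d["mw"] <= 500 and d["logp"] <= 5
            and d["hbd"] <= 5 and d["hba"] <= 10)

def passes_veber(d):
    # Veber's rules for oral bioavailability.
    return d["rotb"] <= 10 and d["tpsa"] <= 140

def curate(library):
    # Keep only compounds passing both filters, preserving input order.
    return [name for name, d in library.items()
            if passes_lipinski(d) and passes_veber(d)]

LIB = {
    "cpd_a": {"mw": 342.0, "logp": 2.1, "hbd": 2, "hba": 5,
              "rotb": 4, "tpsa": 78.0},
    "cpd_b": {"mw": 712.0, "logp": 6.3, "hbd": 5, "hba": 12,
              "rotb": 14, "tpsa": 190.0},
    "cpd_c": {"mw": 480.0, "logp": 4.8, "hbd": 1, "hba": 9,
              "rotb": 8, "tpsa": 120.0},
}
```

    The remaining workflow steps (promiscuity flags, diversity, similarity) would slot in as further predicates or scores applied per compound or per library.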

  6. SeGRAm - A practical and versatile tool for spacecraft trajectory optimization

    Science.gov (United States)

    Rishikof, Brian H.; Mccormick, Bernell R.; Pritchard, Robert E.; Sponaugle, Steven J.

    1991-01-01

    An implementation of the Sequential Gradient/Restoration Algorithm, SeGRAm, is presented along with selected examples. This spacecraft trajectory optimization and simulation program uses variational calculus to solve problems of spacecraft flying under the influence of one or more gravitational bodies. It produces a series of feasible solutions to problems involving a wide range of vehicles, environments and optimization functions, until an optimal solution is found. The examples included highlight the various capabilities of the program and emphasize in particular its versatility over a wide spectrum of applications from ascent to interplanetary trajectories.

  7. Optimization of fuel-cell tram operation based on two dimension dynamic programming

    Science.gov (United States)

    Zhang, Wenbin; Lu, Xuecheng; Zhao, Jingsong; Li, Jianqiu

    2018-02-01

    This paper proposes an optimal control strategy based on the two-dimension dynamic programming (2DDP) algorithm targeting minimization of operation energy consumption for a fuel-cell tram. The energy consumption model with the tram dynamics is first deduced. The optimal control problem is analyzed and the 2DDP strategy is applied to solve it. Optimal tram speed profiles are obtained for each interstation section; each profile consists of three stages: accelerate to the set speed with the maximum traction power, dynamically adjust to maintain a uniform speed, and decelerate to zero speed with the maximum braking power at a suitable timing. The optimal control curves of all the interstation sections are connected with the parking times to form the optimal control method for the whole line. The optimized speed profiles are also simplified for drivers to follow.

  8. IMI Workshop on Optimization in the Real World

    CERN Document Server

    Shinano, Yuji; Waki, Hayato

    2016-01-01

    This book clearly shows the importance, usefulness, and power of current optimization technologies, in particular, mixed-integer programming and its remarkable applications. It is intended to be the definitive study of state-of-the-art optimization technologies for students, academic researchers, and non-professionals in industry. The chapters of this book are based on a collection of selected and extended papers from the “IMI Workshop on Optimization in the Real World” held in October 2014 in Japan.

  9. Weight optimization of plane truss using genetic algorithm

    Science.gov (United States)

    Neeraja, D.; Kamireddy, Thejesh; Santosh Kumar, Potnuru; Simha Reddy, Vijay

    2017-11-01

    Optimization of structures on the basis of weight has many practical benefits in every engineering field. Efficiency is proportionally related to weight, and hence weight optimization gains prime importance. In the field of civil engineering, weight-optimized structural elements are economical and easier to transport to the site. In this study, a genetic optimization algorithm for weight optimization of a steel truss, considering its shape, size and topology aspects, has been developed in MATLAB. Material strength and buckling stability criteria have been adopted from the IS 800-2007 code for construction steel. The constraints considered in the present study are fabrication, basic nodes, displacements, and compatibility. Genetic optimization is a natural-selection search technique intended to combine good solutions to a problem over many generations to improve the results. All solutions are generated randomly and each is represented by a binary string, by analogy with natural chromosomes. The outcome of the study is a MATLAB program which can optimise a steel truss and display the optimised topology along with element shapes, deflections, and stress results.
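    The binary-string encoding the record describes can be illustrated on a toy sizing problem: each member's cross-section is picked from a discrete catalogue via a few bits, and overstressed designs are penalized. The member forces, section catalogue, and allowable stress below are invented, and the real program's IS 800-2007 checks, shape and topology variables are omitted.

```python
import random

random.seed(3)

LENGTHS = [2.0, 2.0, 2.83, 2.83]     # member lengths, m (toy 4-bar truss)
FORCES = [40.0, 40.0, 56.6, 56.6]    # member axial forces, kN (assumed)
AREAS = [4.0, 6.0, 9.0, 13.0, 19.0, 28.0, 41.0, 60.0]  # cm^2 catalogue
SIGMA_ALLOW = 8.0                    # allowable stress, kN/cm^2 (assumed)
DENSITY = 7.85e-3                    # steel density, kg/cm^3
BITS = 3                             # 3 bits pick one of 8 areas per member

def decode(chrom):
    return [AREAS[int("".join(map(str, chrom[i * BITS:(i + 1) * BITS])), 2)]
            for i in range(len(LENGTHS))]

def fitness(chrom):
    areas = decode(chrom)
    weight = sum(DENSITY * 100 * L * A for L, A in zip(LENGTHS, areas))
    # Penalize any member whose stress F/A exceeds the allowable value.
    penalty = sum(max(0.0, F / A - SIGMA_ALLOW)
                  for F, A in zip(FORCES, areas))
    return weight + 1000.0 * penalty

def ga(pop=30, gens=60, pm=0.05):
    n = BITS * len(LENGTHS)
    P = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop)]
    for _ in range(gens):
        P.sort(key=fitness)
        nxt = P[:2]                              # elitism: keep two best
        while len(nxt) < pop:
            a, b = random.sample(P[:10], 2)      # mate among the fittest
            cut = random.randrange(1, n)
            child = a[:cut] + b[cut:]            # one-point crossover
            child = [g ^ (random.random() < pm) for g in child]  # mutation
            nxt.append(child)
        P = nxt
    return min(P, key=fitness)
```

    Here the global optimum is simply the lightest catalogue section per member that keeps stress within the allowable limit; the GA finds it without enumerating all 2^12 chromosomes.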

  10. The Retrofit Puzzle Extended: Optimal Fleet Owner Behavior over Multiple Time Periods

    Science.gov (United States)

    2009-08-04

    In "The Retrofit Puzzle: Optimal Fleet Owner Behavior in the Context of Diesel Retrofit Incentive Programs" (1) an integer program was developed to model profit-maximizing diesel fleet owner behavior when selecting pollution reduction retrofits. Flee...

  11. Policy Gradient Adaptive Dynamic Programming for Data-Based Optimal Control.

    Science.gov (United States)

    Luo, Biao; Liu, Derong; Wu, Huai-Ning; Wang, Ding; Lewis, Frank L

    2017-10-01

    The model-free optimal control problem of general discrete-time nonlinear systems is considered in this paper, and a data-based policy gradient adaptive dynamic programming (PGADP) algorithm is developed to design an adaptive optimal controller. Using offline and online data rather than a mathematical system model, the PGADP algorithm improves the control policy with a gradient descent scheme. The convergence of the PGADP algorithm is proved by demonstrating that the constructed Q-function sequence converges to the optimal Q-function. Based on the PGADP algorithm, an adaptive control method is developed with an actor-critic structure and the method of weighted residuals. Its convergence properties are analyzed, where the approximate Q-function converges to its optimum. Computer simulation results demonstrate the effectiveness of the PGADP-based adaptive control method.
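    The core idea, improving a Q-function from (state, action, reward, next-state) samples without touching the system model, is easiest to see in tabular form. The sketch below is plain Q-learning on an invented chain MDP; the paper's PGADP instead uses policy gradients with function approximation and an actor-critic structure, so this is only the simplest data-based relative of the method.

```python
import random

random.seed(7)

N_STATES, ACTIONS = 4, (0, 1)   # actions: 0 = left, 1 = right
GAMMA, ALPHA, EPS = 0.9, 0.5, 0.2

def step(s, a):
    # Toy deterministic chain MDP: being in state 3 yields reward 1.
    # The learner only ever sees this function's outputs, not its form.
    s2 = min(N_STATES - 1, s + 1) if a == 1 else max(0, s - 1)
    return s2, (1.0 if s2 == N_STATES - 1 else 0.0)

Q = [[0.0, 0.0] for _ in range(N_STATES)]
for episode in range(200):
    s = 0
    for _ in range(20):
        # Epsilon-greedy exploration.
        if random.random() < EPS:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda a: Q[s][a])
        s2, r = step(s, a)
        # Data-based update from the (s, a, r, s') sample alone.
        Q[s][a] += ALPHA * (r + GAMMA * max(Q[s2]) - Q[s][a])
        s = s2

policy = [max(ACTIONS, key=lambda a: Q[s][a]) for s in range(N_STATES)]
```

    After training, the greedy policy with respect to the learned Q-function moves right along the chain, the optimal behavior, illustrating how a Q-function sequence can converge to the optimum from data alone.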

  12. The relationship between PMI (manA) gene expression and optimal selection pressure in Indica rice transformation.

    Science.gov (United States)

    Gui, Huaping; Li, Xia; Liu, Yubo; Han, Kai; Li, Xianggan

    2014-07-01

    An efficient mannose selection system was established for transformation of the Indica cultivar IR58025B. Different selection pressures were required to achieve optimum transformation frequency for different PMI selectable marker cassettes. This study was conducted to establish an efficient transformation system for Indica rice, cultivar IR58025B. Four combinations of two promoters, rice Actin 1 and maize Ubiquitin 1, and two manA genes, the native gene from E. coli (PMI-01) and a synthetic maize codon-optimized gene (PMI-09), were compared under various concentrations of mannose. Different selection pressures were required for different gene cassettes to achieve the corresponding optimum transformation frequency (TF). TFs as high as 54% and 53% were obtained when 5 g/L mannose was used for selection of the prActin-PMI-01 cassette and 7.5 g/L mannose for selection of prActin-PMI-09, respectively. TFs of 67% and 56% were obtained when 7.5 and 15 g/L mannose were used for selection of prUbi-PMI-01 and prUbi-PMI-09, respectively. We conclude that higher TFs can be achieved for different gene cassettes when an optimum selection pressure is applied. By investigating the PMI expression level in transgenic calli and leaves, we found a significant positive correlation between the protein expression level and the optimal selection pressure: higher optimal selection pressure is required for constructs that confer higher expression of PMI protein. The single-copy rate of transgenic events for the prActin-PMI-01 cassette is lower than that for the other three cassettes. We speculate that some low-copy events with low protein expression levels might not have been able to survive mannose selection.

  13. Chiral stationary phase optimized selectivity liquid chromatography: A strategy for the separation of chiral isomers.

    Science.gov (United States)

    Hegade, Ravindra Suryakant; De Beer, Maarten; Lynen, Frederic

    2017-09-15

    Chiral Stationary-Phase Optimized Selectivity Liquid Chromatography (SOSLC) is proposed as a tool to optimally separate mixtures of enantiomers on a set of commercially available coupled chiral columns. This approach allows prediction of the separation profiles on any possible combination of the chiral stationary phases based on a limited number of preliminary analyses, followed by automated selection of the optimal column combination. Both the isocratic and gradient SOSLC approaches were implemented to predict the retention times of a mixture of 4 chiral pairs on all possible combinations of the 5 commercial chiral columns. Predictions in isocratic and gradient mode were performed with a commercially available algorithm and with an in-house developed Microsoft Visual Basic algorithm, respectively. Optimal predictions in the isocratic mode required the coupling of 4 columns, whereby relative deviations between the predicted and experimental retention times ranged between 2 and 7%. Gradient predictions led to the coupling of 3 chiral columns allowing baseline separation of all solutes, whereby differences between predictions and experiments ranged between 0 and 12%. The methodology is a novel tool for optimizing the separation of mixtures of optical isomers. Copyright © 2017 Elsevier B.V. All rights reserved.
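    In isocratic mode, the prediction step rests on additivity: the retention time on coupled columns is the sum of the per-segment contributions t0·(1 + k) measured on each column alone. The sketch below enumerates column combinations and scores them by the worst-separated pair; the retention factors, column names, and equal segment dead times are invented illustrations, not measured data.

```python
from itertools import combinations

# Hypothetical retention factors k for two enantiomer pairs (R1/S1, R2/S2)
# on three commercial chiral columns. Values are illustrative.
K = {
    "col_A": {"R1": 2.0, "S1": 2.2, "R2": 5.0, "S2": 5.1},
    "col_B": {"R1": 3.0, "S1": 3.0, "R2": 1.0, "S2": 1.6},
    "col_C": {"R1": 1.2, "S1": 1.9, "R2": 2.0, "S2": 2.0},
}
T0_SEGMENT = 1.0   # dead time per column segment, min (assumed equal)

def predict_times(cols):
    # Isocratic additivity: total retention time on coupled columns is
    # the sum over segments of t0 * (1 + k_i).
    return {s: sum(T0_SEGMENT * (1 + K[c][s]) for c in cols)
            for s in K["col_A"]}

def best_combination(max_len=3):
    # Score each combination by its worst-separated adjacent pair and
    # return the combination that maximizes that minimum spacing.
    best, best_gap = None, -1.0
    for n in range(1, max_len + 1):
        for cols in combinations(K, n):
            t = sorted(predict_times(cols).values())
            gap = min(b - a for a, b in zip(t, t[1:]))
            if gap > best_gap:
                best, best_gap = cols, gap
    return best, best_gap
```

    With these numbers, no single column separates both pairs well, but a coupled combination does, which is exactly the situation SOSLC exploits. A real implementation would score combinations by chromatographic resolution (including peak widths) rather than raw spacing.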

  14. Application of linear programming and perturbation theory in optimization of fuel utilization in a nuclear reactor

    International Nuclear Information System (INIS)

    Zavaljevski, N.

    1985-01-01

    The proposed optimization procedure is fast owing to the application of linear programming; non-linear constraints, which demand iterative application of linear programming, slow down the calculation. Linearization can be performed by different procedures, ranging from simple empirical rules for in-core fuel management to general perturbation theory with higher-order corrections. A mathematical model was formulated for optimization of the improved fuel cycle. A detailed algorithm for determining the minimum amount of fresh fuel at the beginning of each fuel cycle is shown; the problem is linearized by first-order perturbation theory and optimized by linear programming. A numerical illustration of the proposed method was given for an experimental reactor, mainly to save computer time.

  15. Applying Four Different Risk Models in Local Ore Selection

    International Nuclear Information System (INIS)

    Richmond, Andrew

    2002-01-01

    Given the uncertainty in grade at a mine location, a financially risk-averse decision-maker may prefer to incorporate this uncertainty into the ore selection process. A FORTRAN program, risksel, is presented to calculate local risk-adjusted optimal ore selections using a negative exponential utility function and three dominance models: mean-variance, mean-downside risk, and stochastic dominance. All four methods are demonstrated in a grade control environment. In the case study, the optimal selections vary with the magnitude of financial risk that the decision-maker is prepared to accept. Except for the stochastic dominance method, the risk models reassign material from higher-cost to lower-cost processing options as the aversion to financial risk increases. The stochastic dominance model was usually unable to determine the optimal local selection.
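    Of the four criteria, the negative exponential utility rule is the simplest to sketch: for each block, average the utility of profit over the simulated grade realizations and send the block to the destination with the highest expected utility. The grade realizations, prices, and costs below are invented stand-ins for geostatistical simulations, and only the utility criterion (not the three dominance models) is shown.

```python
import math

# Simulated grade realizations (g/t) for two mining blocks: block_1 has
# high grade uncertainty, block_2 is low but consistent. Illustrative.
REALIZATIONS = {
    "block_1": [1.1, 0.4, 2.0, 0.6, 1.4],
    "block_2": [0.9, 1.0, 1.1, 0.95, 1.05],
}
PRICE, MILL_COST, RECOVERY = 40.0, 30.0, 0.9  # $/g, $/t, fraction (assumed)

def profit(grade, choice):
    # "mill" pays for recovered metal minus processing cost; "waste" is 0.
    return RECOVERY * PRICE * grade - MILL_COST if choice == "mill" else 0.0

def expected_utility(grades, choice, risk_aversion):
    # Negative exponential utility U(p) = 1 - exp(-lambda * p):
    # larger risk_aversion penalizes downside spread more heavily.
    us = [1.0 - math.exp(-risk_aversion * profit(g, choice))
          for g in grades]
    return sum(us) / len(us)

def select(risk_aversion):
    return {b: max(("waste", "mill"),
                   key=lambda c: expected_utility(g, c, risk_aversion))
            for b, g in REALIZATIONS.items()}
```

    With a nearly risk-neutral decision-maker both blocks go to the mill, but as risk aversion grows the volatile block is diverted to waste even though its mean profit is higher, the reassignment behavior the record describes.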

  16. Optimizing Biorefinery Design and Operations via Linear Programming Models

    Energy Technology Data Exchange (ETDEWEB)

    Talmadge, Michael; Batan, Liaw; Lamers, Patrick; Hartley, Damon; Biddy, Mary; Tao, Ling; Tan, Eric

    2017-03-28

    The ability to assess and optimize economics of biomass resource utilization for the production of fuels, chemicals and power is essential for the ultimate success of a bioenergy industry. The team of authors, consisting of members from the National Renewable Energy Laboratory (NREL) and the Idaho National Laboratory (INL), has developed simple biorefinery linear programming (LP) models to enable the optimization of theoretical or existing biorefineries. The goal of this analysis is to demonstrate how such models can benefit the developing biorefining industry. It focuses on a theoretical multi-pathway, thermochemical biorefinery configuration and demonstrates how the biorefinery can use LP models for operations planning and optimization in comparable ways to the petroleum refining industry. Using LP modeling tools developed under U.S. Department of Energy's Bioenergy Technologies Office (DOE-BETO) funded efforts, the authors investigate optimization challenges for the theoretical biorefineries such as (1) optimal feedstock slate based on available biomass and prices, (2) breakeven price analysis for available feedstocks, (3) impact analysis for changes in feedstock costs and product prices, (4) optimal biorefinery operations during unit shutdowns / turnarounds, and (5) incentives for increased processing capacity. These biorefinery examples are comparable to crude oil purchasing and operational optimization studies that petroleum refiners perform routinely using LPs and other optimization models. It is important to note that the analyses presented in this article are strictly theoretical and they are not based on current energy market prices. The pricing structure assigned for this demonstrative analysis is consistent with $4 per gallon gasoline, which clearly assumes an economic environment that would favor the construction and operation of biorefineries. The analysis approach and examples provide valuable insights into the usefulness of analysis tools for
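    The feedstock-slate question reduces, in its simplest form, to a linear program with one shared throughput constraint, for which a greedy fill by margin is provably optimal (it is a fractional knapsack). The feedstock names, availabilities, margins, and plant capacity below are invented for illustration; a real biorefinery LP, like the refinery LPs it mirrors, would add yield, blending, and unit-capacity rows and use a proper solver.

```python
# Feedstock slate selection as a one-constraint LP: maximize total margin
# subject to plant throughput, respecting each feedstock's availability.
FEEDSTOCKS = {
    # name: (available tons/day, margin $/ton = value * yield - cost)
    "corn_stover": (600.0, 18.0),
    "switchgrass": (500.0, 22.0),
    "forest_residue": (400.0, 12.0),
}
CAPACITY = 900.0   # plant throughput, tons/day

def optimal_slate():
    # Greedy by margin is optimal when a single resource (throughput)
    # is binding: fill with the highest-margin feedstock first.
    slate, room = {}, CAPACITY
    for name, (avail, margin) in sorted(FEEDSTOCKS.items(),
                                        key=lambda kv: -kv[1][1]):
        take = min(avail, room)
        if take > 0:
            slate[name] = take
        room -= take
    total = sum(FEEDSTOCKS[n][1] * t for n, t in slate.items())
    return slate, total
```

    A breakeven-price analysis, one of the uses listed above, falls out of the same structure: the margin of the last feedstock admitted to the slate sets the price at which an excluded feedstock would start to enter.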

  17. A concurrent optimization model for supplier selection with fuzzy quality loss

    International Nuclear Information System (INIS)

    Rosyidi, C.; Murtisari, R.; Jauhari, W.

    2017-01-01

    The purpose of this research is to develop a concurrent supplier selection model to minimize the purchasing cost and fuzzy quality loss considering process capability and assembled product specification. Design/methodology/approach: This research integrates fuzzy quality loss in the model to concurrently solve the decision making in detailed design stage and manufacturing stage. Findings: The resulted model can be used to concurrently select the optimal supplier and determine the tolerance of the components. The model balances the purchasing cost and fuzzy quality loss. Originality/value: An assembled product consists of many components which must be purchased from the suppliers. Fuzzy quality loss is integrated in the supplier selection model to allow the vagueness in final assembly by grouping the assembly into several grades according to the resulted assembly tolerance.

  18. A concurrent optimization model for supplier selection with fuzzy quality loss

    Energy Technology Data Exchange (ETDEWEB)

    Rosyidi, C.; Murtisari, R.; Jauhari, W.

    2017-07-01

    The purpose of this research is to develop a concurrent supplier selection model to minimize the purchasing cost and fuzzy quality loss considering process capability and assembled product specification. Design/methodology/approach: This research integrates fuzzy quality loss in the model to concurrently solve the decision making in detailed design stage and manufacturing stage. Findings: The resulted model can be used to concurrently select the optimal supplier and determine the tolerance of the components. The model balances the purchasing cost and fuzzy quality loss. Originality/value: An assembled product consists of many components which must be purchased from the suppliers. Fuzzy quality loss is integrated in the supplier selection model to allow the vagueness in final assembly by grouping the assembly into several grades according to the resulted assembly tolerance.

  19. Selection of the optimal combination of water vapor absorption lines for detection of temperature in combustion zones of mixing supersonic gas flows by diode laser absorption spectrometry

    International Nuclear Information System (INIS)

    Mironenko, V.R.; Kuritsyn, Yu.A.; Bolshov, M.A.; Liger, V.V.

    2017-01-01

    Determination of a gas medium temperature by diode laser absorption spectrometry (DLAS) is based on the measurement of the integral intensities of the absorption lines of a test molecule (generally the water vapor molecule). In the case of local thermodynamic equilibrium, temperature is inferred from the ratio of the integral intensities of two lines with different lower energy levels. For total gas pressures above 1 atm the absorption lines are broadened, and one cannot find isolated, well-resolved water vapor absorption lines within the relatively narrow spectral interval of fast diode laser (DL) tuning (about 3 cm−1). For diagnostics of a gas object at high temperature and pressure, the DLAS technique can be realized with two diode lasers working in different spectral regions with strong absorption lines. In such a situation the criteria for optimal line selection differ significantly from the narrow-line case. These criteria are discussed in our work. Software for selecting the optimal spectral regions using the HITRAN-2012 and HITEMP databases has been developed. The program selects spectral regions of DL tuning that minimize the error of temperature determination δT/T, based on the attainable experimental error of line intensity measurement δS. Two combinations of optimal spectral regions were selected: (1.392 & 1.343 μm) and (1.392 & 1.339 μm). Different algorithms of experimental data processing are discussed.
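    The two-line principle, and why the lower-state energy gap drives the selection criterion, can be shown with the Boltzmann factor alone. In the sketch below, the intensity ratio keeps only the exp(-c2·E″/T) dependence (partition-function and stimulated-emission terms are neglected), and the reference normalization R0 and energy gaps are illustrative rather than taken from HITRAN.

```python
import math

C2 = 1.4388  # second radiation constant hc/k, cm*K

def ratio(T, dE, T0=296.0, R0=1.0):
    # Intensity ratio of two lines whose lower-state energies differ by
    # dE (cm^-1), normalized so the ratio equals R0 at reference T0.
    # Boltzmann factor only; a sketch, not a full HITRAN line model.
    return R0 * math.exp(-C2 * dE * (1.0 / T - 1.0 / T0))

def invert_temperature(R, dE, T0=296.0, R0=1.0):
    # Solve the ratio expression for T.
    return 1.0 / (1.0 / T0 - math.log(R / R0) / (C2 * dE))

def temperature_sensitivity(T, dE):
    # Error propagation from the ratio to temperature:
    # |dT/T| = (T / (C2 * dE)) * |dR/R|,
    # so a larger lower-state energy gap dE reduces the temperature
    # error for a given intensity-measurement error, which is the core
    # of the line-selection criterion the record describes.
    return T / (C2 * dE)
```

    This also shows why line selection changes at high temperature: the pair must keep dE large while both lines remain strong and unblended within the two DL tuning windows.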

  20. An improved chaotic fruit fly optimization based on a mutation strategy for simultaneous feature selection and parameter optimization for SVM and its applications.

    Science.gov (United States)

    Ye, Fei; Lou, Xin Yuan; Sun, Lin Fu

    2017-01-01

    This paper proposes a new support vector machine (SVM) optimization scheme based on an improved chaotic fruit fly optimization algorithm (FOA) with a mutation strategy to simultaneously perform parameter tuning for the SVM and feature selection. In the improved FOA, a chaotic particle initializes the fruit fly swarm location and replaces the expression of distance for the fruit fly to find the food source. The proposed mutation strategy uses two distinct generative mechanisms for new food sources at the osphresis phase, allowing the algorithm to search for the optimal solution both in the whole solution space and within the local solution space containing the fruit fly swarm location. In an evaluation based on a group of ten benchmark problems, the proposed algorithm's performance is compared with that of other well-known algorithms, and the results support the superiority of the proposed algorithm. Moreover, this algorithm is successfully applied in an SVM to perform both parameter tuning and feature selection to solve real-world classification problems. This method, called the chaotic improved fruit fly optimization algorithm (CIFOA)-SVM, has been shown to be a more robust and effective optimization method than other well-known methods, particularly in terms of solving the medical diagnosis problem and the credit card problem.

  1. TH-EF-BRB-05: 4pi Non-Coplanar IMRT Beam Angle Selection by Convex Optimization with Group Sparsity Penalty

    International Nuclear Information System (INIS)

    O’Connor, D; Nguyen, D; Voronenko, Y; Yin, W; Sheng, K

    2016-01-01

    Purpose: Integrated beam orientation and fluence map optimization is expected to be the foundation of robust automated planning but existing heuristic methods do not promise global optimality. We aim to develop a new method for beam angle selection in 4π non-coplanar IMRT systems based on solving (globally) a single convex optimization problem, and to demonstrate the effectiveness of the method by comparison with a state of the art column generation method for 4π beam angle selection. Methods: The beam angle selection problem is formulated as a large scale convex fluence map optimization problem with an additional group sparsity term that encourages most candidate beams to be inactive. The optimization problem is solved using an accelerated first-order method, the Fast Iterative Shrinkage-Thresholding Algorithm (FISTA). The beam angle selection and fluence map optimization algorithm is used to create non-coplanar 4π treatment plans for several cases (including head and neck, lung, and prostate cases) and the resulting treatment plans are compared with 4π treatment plans created using the column generation algorithm. Results: In our experiments the treatment plans created using the group sparsity method meet or exceed the dosimetric quality of plans created using the column generation algorithm, which was shown superior to clinical plans. Moreover, the group sparsity approach converges in about 3 minutes in these cases, as compared with runtimes of a few hours for the column generation method. Conclusion: This work demonstrates the first non-greedy approach to non-coplanar beam angle selection, based on convex optimization, for 4π IMRT systems. The method given here improves both treatment plan quality and runtime as compared with a state of the art column generation algorithm. When the group sparsity term is set to zero, we obtain an excellent method for fluence map optimization, useful when beam angles have already been selected. NIH R43CA183390, NIH R01CA
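    The mechanism that lets a single convex problem select beams is the group-sparsity proximal step: all fluence variables of one candidate beam form a group, and group soft-thresholding drives whole groups to zero, removing those beams. The sketch below runs FISTA on a tiny invented instance (3 voxels, 2 candidate beams of 2 beamlets each); the dose matrix, penalty weight, and step size are illustrative stand-ins for the clinical-scale problem.

```python
import math

# Tiny group-sparse fluence problem: minimize
#   ||A x - b||^2 + lam * sum_g ||x_g||_2
# where each group g holds one candidate beam's fluence variables.
A = [[1.0, 0.5, 0.0, 0.0],
     [0.0, 1.0, 1.0, 0.2],
     [0.2, 0.0, 0.5, 1.0]]
b = [1.0, 2.0, 0.5]
GROUPS = [(0, 1), (2, 3)]   # beam 1 -> x0, x1; beam 2 -> x2, x3

def grad(x):
    # Gradient of the least-squares term: 2 * A^T (A x - b).
    r = [sum(A[i][j] * x[j] for j in range(4)) - b[i] for i in range(3)]
    return [2 * sum(A[i][j] * r[i] for i in range(3)) for j in range(4)]

def prox_group(x, t, lam):
    # Group soft-thresholding: shrink each beam's block toward zero;
    # a block that reaches zero is a beam dropped from the plan.
    y = x[:]
    for g in GROUPS:
        norm = math.sqrt(sum(x[j] ** 2 for j in g))
        scale = max(0.0, 1.0 - t * lam / norm) if norm > 0 else 0.0
        for j in g:
            y[j] = scale * x[j]
    return y

def objective(x, lam=0.5):
    r = [sum(A[i][j] * x[j] for j in range(4)) - b[i] for i in range(3)]
    reg = sum(math.sqrt(sum(x[j] ** 2 for j in g)) for g in GROUPS)
    return sum(ri ** 2 for ri in r) + lam * reg

def fista(lam=0.5, step=0.1, n_iter=300):
    # FISTA: accelerated proximal gradient on the composite objective.
    x = [0.0] * 4
    z, tk = x[:], 1.0
    for _ in range(n_iter):
        g = grad(z)
        x_new = prox_group([z[j] - step * g[j] for j in range(4)], step, lam)
        t_new = (1 + math.sqrt(1 + 4 * tk * tk)) / 2
        z = [x_new[j] + (tk - 1) / t_new * (x_new[j] - x[j])
             for j in range(4)]
        x, tk = x_new, t_new
    return x
```

    Sweeping the penalty weight lam trades off beam count against dose fidelity; with lam set to zero, the same loop reduces to plain fluence map optimization for a fixed beam set, as noted in the conclusion.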

  2. Global blending optimization of laminated composites with discrete material candidate selection and thickness variation

    DEFF Research Database (Denmark)

    Sørensen, Søren N.; Stolpe, Mathias

    2015-01-01

    rate. The capabilities of the method and the effect of active versus inactive manufacturing constraints are demonstrated on several numerical examples of limited size, involving at most 320 binary variables. Most examples are solved to guaranteed global optimality and may constitute benchmark examples...... but is, however, convex in the original mixed binary nested form. Convexity is the foremost important property of optimization problems, and the proposed method can guarantee the global or near-global optimal solution; unlike most topology optimization methods. The material selection is limited...... for popular topology optimization methods and heuristics based on solving sequences of non-convex problems. The results will among others demonstrate that the difficulty of the posed problem is highly dependent upon the composition of the constitutive properties of the material candidates....

  3. Selection on Optimal Haploid Value Increases Genetic Gain and Preserves More Genetic Diversity Relative to Genomic Selection

    OpenAIRE

    Daetwyler, Hans D.; Hayden, Matthew J.; Spangenberg, German C.; Hayes, Ben J.

    2015-01-01

    Doubled haploids are routinely created and phenotypically selected in plant breeding programs to accelerate the breeding cycle. Genomic selection, which makes use of both phenotypes and genotypes, has been shown to further improve genetic gain through prediction of performance before or without phenotypic characterization of novel germplasm. Additional opportunities exist to combine genomic prediction methods with the creation of doubled haploids. Here we propose an extension to genomic selec...

  4. Bi-objective optimization for multi-modal transportation routing planning problem based on Pareto optimality

    Directory of Open Access Journals (Sweden)

    Yan Sun

    2015-09-01

    Full Text Available Purpose: The purpose of this study is to solve the multi-modal transportation routing planning problem, which aims to select an optimal route for moving a consignment of goods from its origin to its destination through the multi-modal transportation network, optimizing from two viewpoints: cost and time. Design/methodology/approach: A bi-objective mixed integer linear programming model is proposed to optimize the multi-modal transportation routing planning problem, with minimization of the total transportation cost and the total transportation time as the optimization objectives. To balance the benefit between the two objectives, Pareto optimality is utilized: the Pareto frontier of the model, computed by the normalized normal constraint method, provides the multi-modal transportation operator (MTO) and customers with better decision support. Then, an experimental case study is designed to verify the feasibility of the model and the Pareto approach using the mathematical programming software Lingo. Finally, a sensitivity analysis of demand and supply in the multi-modal transportation organization is performed based on the designed case. Findings: The calculation results indicate that the proposed model and the Pareto approach perform well in dealing with the bi-objective optimization. The sensitivity analysis also clearly shows the influence of variations in demand and supply on the multi-modal transportation organization. The method can therefore be applied in practice. Originality/value: A bi-objective mixed integer linear programming model is proposed to optimize the multi-modal transportation routing planning problem, and a Pareto frontier based sensitivity analysis of demand and supply in the multi-modal transportation organization is performed on the designed case.
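The Pareto frontier idea can be illustrated with an ε-constraint sweep (a simpler scalarization than the normalized normal constraint method used in the paper): minimize cost subject to a cap on time, then sweep the cap. All mode data below are invented; the tiny LP is solved exactly by enumerating its basic feasible solutions.

```python
def min_cost_given_time(cost, time, t_max):
    """Exact minimizer of cost.x s.t. time.x <= t_max, sum(x) = 1, x >= 0,
    found by enumerating basic feasible solutions: pure modes and two-mode
    mixes on which the time constraint binds."""
    best, n = None, len(cost)
    for i in range(n):                        # pure modes
        if time[i] <= t_max:
            best = cost[i] if best is None else min(best, cost[i])
    for i in range(n):                        # binding two-mode mixes
        for j in range(n):
            if time[i] < t_max < time[j]:
                a = (time[j] - t_max) / (time[j] - time[i])  # share of mode i
                mix = a * cost[i] + (1.0 - a) * cost[j]
                best = mix if best is None else min(best, mix)
    return best

# hypothetical unit costs and transit times for rail, road, barge
cost, time = [4.0, 9.0, 2.0], [6.0, 2.0, 9.0]
frontier = [(t, min_cost_given_time(cost, time, t)) for t in (2, 3, 4, 5, 6, 7, 8, 9)]
```

Sweeping the time cap traces the cost/time Pareto frontier: the tightest cap forces the fast, expensive mode, and relaxing it lets cheaper mixes enter.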

  5. Contrast based band selection for optimized weathered oil detection in hyperspectral images

    Science.gov (United States)

    Levaux, Florian; Bostater, Charles R., Jr.; Neyt, Xavier

    2012-09-01

    Hyperspectral imagery offers unique benefits for detection of land and water features due to the information contained in reflectance signatures such as the bi-directional reflectance distribution function or BRDF. The reflectance signature directly shows the relative absorption and backscattering features of targets. These features can be very useful in shoreline monitoring or surveillance applications, for example to detect weathered oil. Processing of hyperspectral data can be an important tool in real-time detection applications, and optimal band selection is thus important in order to select the essential bands using the absorption and backscatter information. In the present paper, band selection is based upon the optimization of target detection using contrast algorithms. The common definition of the contrast (using only one band out of all possible combinations available within a hyperspectral image) is generalized in order to consider all the possible combinations of wavelength dependent contrasts using hyperspectral images. The inflection (defined here as an approximation of the second derivative) is also used in order to enhance the variations in the reflectance spectra as well as in the contrast spectra, in order to assist in optimal band selection. The results of the selection in terms of target detection (false alarms and missed detections) are also compared with a previous feature detection method, namely the matched filter. In this paper, imagery is acquired using a pushbroom hyperspectral sensor mounted at the bow of a small vessel. The sensor is mechanically rotated using an optical rotation stage. This opto-mechanical scanning system produces hyperspectral images with pixel sizes on the order of mm to cm scales, depending upon the distance between the sensor and the shoreline being monitored. The motion of the platform during the acquisition induces distortions in the collected HSI imagery. It is therefore
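The generalization from single-band to band-pair contrast can be sketched as an exhaustive search over wavelength pairs. The Michelson-style contrast of band ratios used here is an illustrative choice, not necessarily the paper's exact definition, and the spectra are synthetic.

```python
import numpy as np

def best_band_pair(target_spectra, background_spectra):
    """Pick the band pair (i, j) maximizing a Michelson-type contrast between
    the mean target and mean background band-ratio signatures (a sketch)."""
    t = target_spectra.mean(axis=0)          # mean target reflectance per band
    b = background_spectra.mean(axis=0)      # mean background reflectance per band
    n = len(t)
    best, best_pair = -1.0, None
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            rt, rb = t[i] / t[j], b[i] / b[j]    # band-ratio features
            c = abs(rt - rb) / (rt + rb)         # Michelson contrast of the ratios
            if c > best:
                best, best_pair = c, (i, j)
    return best_pair, best
```

On synthetic spectra that differ only in one band, the search correctly returns a pair involving that band, which is the behavior a band-selection step needs before real-time detection.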

  6. Optimal selection of major equipment in dual purpose plants

    International Nuclear Information System (INIS)

    Gabbrielli, E.

    1981-01-01

    Simulation of different operational conditions with the aid of a computer program is one of the best ways of assisting decision-makers in the selection of the most economic mix of equipment for a dual purpose plant. Using this approach this paper deals with the economic comparison of plants consisting of MSF desalinators and combustion gas or back pressure steam turbines coupled to low capacity electric power generators. The comparison is performed on the basis of the data made available by the OPTDIS computer program and the results are given in terms of yearly cost of production as the sum of capital, manpower, maintenance, fuel and chemical costs. (orig.)

  7. A feasibility study: Selection of a personalized radiotherapy fractionation schedule using spatiotemporal optimization

    International Nuclear Information System (INIS)

    Kim, Minsun; Stewart, Robert D.; Phillips, Mark H.

    2015-01-01

    Purpose: To investigate the impact of using spatiotemporal optimization, i.e., intensity-modulated spatial optimization followed by fractionation schedule optimization, to select the patient-specific fractionation schedule that maximizes the tumor biologically equivalent dose (BED) under dose constraints for multiple organs-at-risk (OARs). Methods: Spatiotemporal optimization was applied to a variety of lung tumors in a phantom geometry using a range of tumor sizes and locations. The optimal fractionation schedule for a patient using the linear-quadratic cell survival model depends on the tumor and OAR sensitivity to fraction size (α/β), the effective tumor doubling time (Td), and the size and location of tumor target relative to one or more OARs (dose distribution). The authors used a spatiotemporal optimization method to identify the optimal number of fractions N that maximizes the 3D tumor BED distribution for 16 lung phantom cases. The selection of the optimal fractionation schedule used equivalent (30-fraction) OAR constraints for the heart (Dmean ≤ 45 Gy), lungs (Dmean ≤ 20 Gy), cord (Dmax ≤ 45 Gy), esophagus (Dmax ≤ 63 Gy), and unspecified tissues (D05 ≤ 60 Gy). To assess plan quality, the authors compared the minimum, mean, maximum, and D95 of tumor BED, as well as the equivalent uniform dose (EUD), for optimized plans to conventional intensity-modulated radiation therapy plans prescribing 60 Gy in 30 fractions. A sensitivity analysis was performed to assess the effects of Td (3–100 days), tumor lag-time (Tk = 0–10 days), and the size of tumors on optimal fractionation schedule. Results: Using an α/β ratio of 10 Gy, the average values of tumor max, min, mean BED, and D95 were up to 19%, 21%, 20%, and 19% larger than those from conventional prescription, depending on the Td and Tk used. Tumor EUD was up to 17% larger than the conventional prescription. For fast proliferating tumors with Td less than 10 days, there was no
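The trade-off driving the fractionation choice (fraction-size sensitivity via α/β, OAR constraints, and proliferation after the lag time Tk) can be reduced to a one-dimensional toy: for each candidate fraction number N, size the fraction dose so a single OAR BED cap binds, then score the tumor BED. All parameter values are illustrative assumptions, and this scalar sketch stands in for the paper's 3D spatiotemporal optimization.

```python
import math

def optimal_fractionation(ab_tumor=10.0, ab_oar=3.0, sparing=0.7,
                          bed_oar_max=100.0, alpha=0.3, t_d=30.0, t_k=7.0,
                          n_range=range(1, 61)):
    """Pick the fraction number N maximizing tumor BED under one OAR BED cap.
    sparing = OAR dose per tumor dose; one fraction per day is assumed."""
    best = None
    for n in n_range:
        # largest d with OAR BED = n*(s*d)*(1 + s*d/ab_oar) <= cap:
        # solve (s^2/ab_oar)*d^2 + s*d - cap/n = 0 for the positive root
        a, b, c = sparing**2 / ab_oar, sparing, -bed_oar_max / n
        d = (-b + math.sqrt(b * b - 4.0 * a * c)) / (2.0 * a)
        t_treat = n                                    # days of treatment
        prolif = (math.log(2) / (alpha * t_d)) * max(0.0, t_treat - t_k)
        bed_tumor = n * d * (1.0 + d / ab_tumor) - prolif
        if best is None or bed_tumor > best[1]:
            best = (n, bed_tumor, d)
    return best
```

Consistent with the abstract's sensitivity analysis, a short doubling time Td pushes the optimum toward fewer, larger fractions, while slowly proliferating tumors favor more fractions.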

  8. A Hybrid Programming Framework for Modeling and Solving Constraint Satisfaction and Optimization Problems

    OpenAIRE

    Sitek, Paweł; Wikarek, Jarosław

    2016-01-01

    This paper proposes a hybrid programming framework for modeling and solving of constraint satisfaction problems (CSPs) and constraint optimization problems (COPs). Two paradigms, CLP (constraint logic programming) and MP (mathematical programming), are integrated in the framework. The integration is supplemented with the original method of problem transformation, used in the framework as a presolving method. The transformation substantially reduces the feasible solution space. The framework a...

  9. Optimization of in-vivo monitoring program for radiation emergency response

    Energy Technology Data Exchange (ETDEWEB)

    Ha, Wi Ho; Kim, Jong Kyung [Dept. of Nuclear Engineering, Hanyang University, Seoul (Korea, Republic of)

    2016-12-15

    In case of radiation emergencies, internal exposure monitoring for members of the public will be required to confirm the internal contamination of each individual. The in-vivo monitoring technique using a portable gamma spectrometer can be easily applied for internal exposure monitoring in the vicinity of the on-site area. In this study, minimum detectable doses (MDDs) for {sup 134}Cs, {sup 137}Cs, and {sup 131}I were calculated, adjusting minimum detectable activities (MDAs) from 50 to 1,000 Bq, to find the optimal in-vivo counting condition. DCAL software was used to derive the retention fractions of Cs and I isotopes in the whole body and thyroid, respectively. The minimum detectable level was set at a committed effective dose of 0.1 mSv for emergency response. We found that MDDs at each MDA increased with elapsed time. MDAs of 1,000 Bq for {sup 134}Cs and {sup 137}Cs, and 100 Bq for {sup 131}I, were suggested as optimal for providing an in-vivo monitoring service in case of radiation emergencies. An in-vivo monitoring program for emergency response should be designed to achieve the optimal MDA suggested in the present work. We expect that the reduction of counting time compared with a routine monitoring program can provide the high throughput needed in radiation emergencies.
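The link between MDA and MDD can be sketched with a one-compartment retention model: the later the measurement, the larger the intake consistent with a given MDA, hence the larger the minimum detectable dose. The dose coefficient and effective half-life below are illustrative round numbers for {sup 137}Cs ingestion, not values taken from the paper (which uses the full DCAL biokinetic curves).

```python
import math

def mdd_msv(mda_bq, t_days, dose_coeff_sv_per_bq, t_eff_days):
    """Minimum detectable dose (mSv) for a measurement t_days after intake,
    using a one-compartment exponential retention model."""
    retention = math.exp(-math.log(2) * t_days / t_eff_days)
    intake_bq = mda_bq / retention            # intake that is just detectable
    return intake_bq * dose_coeff_sv_per_bq * 1000.0   # Sv -> mSv

# illustrative Cs-137 numbers: e(50) ~ 1.3e-8 Sv/Bq, effective half-life ~ 110 d
example = mdd_msv(1000.0, 30.0, 1.3e-8, 110.0)
```

The monotone growth of MDD with elapsed time is exactly the behavior reported in the abstract, and it is why the optimal MDA must be fixed with the expected measurement delay in mind.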

  10. Optimal contracts decision of industrial customers

    International Nuclear Information System (INIS)

    Tsay, M.-T.; Lin, W.-M.; Lee, J.-L.

    2001-01-01

    This paper develops a software package to calculate the optimal contract capacities for industrial customers. Based on the time-of-use (TOU) rates employed by the Taiwan Power Company, the objective function is formulated to minimize the electricity bill of industrial customers over the whole year. Evolutionary programming (EP) was adopted to solve this problem. Users can get the optimal contract capacities for the peak load, semi-peak load, and off-peak load, respectively. Practical load consumption data were used to validate the program. Results show that the software developed in this paper can be used as a useful tool by industrial customers in selecting contract capacities to curtail the electricity bill. (author)
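The essence of the EP approach can be sketched on a toy tariff with a single contract capacity: a demand charge on the contracted kW plus a penalty rate on monthly peaks exceeding it. The tariff structure and rates below are invented, not Taiwan Power Company's, and the real problem optimizes three capacities (peak, semi-peak, off-peak) rather than one.

```python
import random

def annual_bill(c, monthly_peaks, cap_rate=223.6, penalty_factor=3.0):
    """Hypothetical TOU-style demand charge: contracted capacity c (kW) is
    billed every month; demand above c is billed at penalty_factor times the rate."""
    bill = 12.0 * cap_rate * c
    for p in monthly_peaks:
        bill += cap_rate * penalty_factor * max(0.0, p - c)
    return bill

def ep_optimize(monthly_peaks, pop=30, gens=200, seed=1):
    """Bare-bones evolutionary programming: Gaussian mutation + (mu+mu) selection."""
    rng = random.Random(seed)
    lo, hi = 0.0, max(monthly_peaks)
    parents = [rng.uniform(lo, hi) for _ in range(pop)]
    for _ in range(gens):
        children = [max(lo, p + rng.gauss(0.0, 0.05 * hi)) for p in parents]
        ranked = sorted(parents + children, key=lambda c: annual_bill(c, monthly_peaks))
        parents = ranked[:pop]
    return parents[0]
```

With this piecewise-linear bill, the optimum is the flat region where raising the contract saves exactly as much penalty as it adds in demand charge; EP finds it without needing that derivative argument, which is its appeal for messier real tariffs.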

  11. [Selection of indicators for continuous monitoring of the impact of programs optimizing antimicrobial use in Primary Care].

    Science.gov (United States)

    Fernández-Urrusuno, Rocío; Flores-Dorado, Macarena; Moreno-Campoy, Eva; Montero-Balosa, M Carmen

    2015-05-01

    To determine evidence-based core indicators for monitoring the quality of prescribing in Primary Care, and to assess the feasibility of these indicators for monitoring the use of antibiotics. A literature review was carried out on quality indicators for antimicrobial prescribing through an electronic search limited to the period 2001-2012. It was completed with an "ad hoc" search on the websites of public national and international health services. Finally, indicators were chosen by consensus by a multidisciplinary group of professionals dedicated to managing infections from several areas. The feasibility and applicability of these indicators was verified through the reporting and use of data in the prescription database. Twenty-two indicators were found. The consensus group selected 16 indicators: eleven of them measure specific antimicrobial selection, and five are consumption rates. The indicators were successfully applied to the prescription database, enabling comparisons between different geographical areas and observation of prescription trends. The definition of a basic set of indicators to monitor antibiotic use adapted to local conditions is required. The results of these indicators can be used for feedback to professionals and for evaluating the impact of programs aimed at improving antimicrobial use. Copyright © 2014 Elsevier España, S.L.U. y Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.

  12. Mass Optimization of Battery/Supercapacitors Hybrid Systems Based on a Linear Programming Approach

    Science.gov (United States)

    Fleury, Benoit; Labbe, Julien

    2014-08-01

    The objective of this paper is to show that, on a specific launcher-type mission profile, a 40% gain of mass is expected using a battery/supercapacitors active hybridization instead of a single battery solution. This result is based on the use of a linear programming optimization approach to perform the mass optimization of the hybrid power supply solution.
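The sizing problem behind the reported 40% mass gain is, at its core, a small linear program: choose battery and supercapacitor masses so that peak-power and energy requirements are jointly met at minimum total mass. The two-variable sketch below uses invented specific powers and energies and solves the LP by vertex enumeration; the paper's mission-profile LP is far larger.

```python
def min_mass_hybrid(p_req, e_req, batt=(0.3, 0.15), sc=(5.0, 0.005)):
    """Two-variable LP: minimize m_batt + m_sc subject to
    p_batt*m_batt + p_sc*m_sc >= p_req (peak power, kW) and
    e_batt*m_batt + e_sc*m_sc >= e_req (energy, kWh).
    batt/sc give illustrative (specific power kW/kg, specific energy kWh/kg)."""
    pb, eb = batt
    ps, es = sc
    verts = []
    verts.append((max(p_req / pb, e_req / eb), 0.0))   # battery-only corner
    verts.append((0.0, max(p_req / ps, e_req / es)))   # supercap-only corner
    det = pb * es - ps * eb                            # both constraints binding
    if det != 0.0:
        mb = (p_req * es - ps * e_req) / det
        ms = (pb * e_req - p_req * eb) / det
        if mb >= 0.0 and ms >= 0.0:
            verts.append((mb, ms))
    return min(verts, key=lambda v: v[0] + v[1])       # LP optimum is a vertex
```

With these illustrative numbers the hybrid vertex beats the battery-only corner by a wide margin: the supercapacitor covers the power peak while the battery covers the energy, which is the qualitative mechanism behind the paper's result.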

  13. Applications of sub-optimality in dynamic programming to location and construction of nuclear fuel processing plant

    International Nuclear Information System (INIS)

    Thiriet, L.; Deledicq, A.

    1968-09-01

    First, the relevance of applying Dynamic Programming to optimization and Operational Research problems in the chemical industries is recalled, as well as the conditions in which a dynamic program is illustrated by a sequential graph. A new algorithm for the determination of sub-optimal policies in a sequential graph is then developed. Finally, the application of the sub-optimality concept is shown when taking into account the indirect effects related to possible strategies, or in the case of stochastic choices and of problems of the siting of plants... application examples are given. (authors) [fr
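The sub-optimal-policy idea on a sequential graph can be sketched as a dynamic program that keeps the k cheapest partial paths at every node instead of only the best one. This k-best recursion is an illustrative stand-in for the authors' algorithm, with invented arc costs.

```python
def k_best_policies(costs, k=3):
    """Dynamic program over a staged (sequential) graph keeping, at every node,
    the k cheapest partial paths, yielding the optimal policy and its nearest
    sub-optimal alternatives. costs[t][i][j] is the arc cost from node i at
    stage t to node j at stage t+1."""
    # table[i] = up to k (cost, path) candidates ending at node i of the stage
    table = [[(0.0, [i])] for i in range(len(costs[0]))]
    for arcs in costs:
        nxt = [[] for _ in range(len(arcs[0]))]
        for i, candidates in enumerate(table):
            for c, path in candidates:
                for j, w in enumerate(arcs[i]):
                    nxt[j].append((c + w, path + [j]))
        table = [sorted(cand)[:k] for cand in nxt]      # prune to k best per node
    return sorted(c for cands in table for c in cands)[:k]
```

Keeping more than one candidate per node is what exposes the near-optimal policies, which matter when indirect effects or stochastic choices can overturn a narrowly optimal plan.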

  14. Conditions for characterizing the structure of optimal strategies in infinite-horizon dynamic programs

    International Nuclear Information System (INIS)

    Porteus, E.

    1982-01-01

    The study of infinite-horizon nonstationary dynamic programs using the operator approach is continued. The point of view here differs slightly from that taken by others, in that Denardo's local income function is not used as a starting point. Infinite-horizon values are defined as limits of finite-horizon values, as the horizons get long. Two important conditions of an earlier paper are weakened, yet the optimality equations, the optimality criterion, and the existence of optimal ''structured'' strategies are still obtained

  15. Evaluating Varied Label Designs for Use with Medical Devices: Optimized Labels Outperform Existing Labels in the Correct Selection of Devices and Time to Select.

    Directory of Open Access Journals (Sweden)

    Laura Bix

    Full Text Available Effective standardization of medical device labels requires objective study of varied designs. Insufficient empirical evidence exists regarding how practitioners utilize and view labeling. Measure the effect of graphic elements (boxing information, grouping information, symbol use and color-coding) to optimize a label for comparison with those typical of commercial medical devices. Participants viewed 54 trials on a computer screen. Each trial comprised two labels that were identical with regard to graphics, but differed in one aspect of information (e.g., one had latex, the other did not). Participants were instructed to select the label meeting a given criterion (e.g., latex-containing) as quickly as possible. Dependent variables were binary (correct selection) and continuous (time to correct selection). Eighty-nine healthcare professionals were recruited at Association of Surgical Technologists (AST) conferences, and using a targeted e-mail to AST members. Symbol presence, color coding and grouping critical pieces of information all significantly improved selection rates and sped time to correct selection (α = 0.05). Conversely, when critical information was graphically boxed, probability of correct selection and time to selection were impaired (α = 0.05). Subsequently, responses from trials containing optimal treatments (color coded, critical information grouped, with symbols) were compared to two labels created based on a review of those commercially available. Optimal labels yielded a significant positive benefit regarding the probability of correct choice (P<0.0001; LSM; UCL, LCL: 97.3%; 98.4%, 95.5%), as compared to the two labels we created based on commercial designs (92.0%; 94.7%, 87.9% and 89.8%; 93.0%, 85.3%), and time to selection. Our study provides data regarding design factors, namely color coding, symbol use and grouping of critical information, that can be used to significantly enhance the performance of medical device labels.

  16. A note on “An alternative multiple attribute decision making methodology for solving optimal facility layout design selection problems”

    OpenAIRE

    R. Venkata Rao

    2012-01-01

    A paper published by Maniya and Bhatt (2011) (An alternative multiple attribute decision making methodology for solving optimal facility layout design selection problems, Computers & Industrial Engineering, 61, 542-549) proposed an alternative multiple attribute decision making method, named the “Preference Selection Index (PSI) method”, for selection of an optimal facility layout design. The authors claimed that the method is logical and more appropriate, and that it gives directly the o...

  17. MATLAB-based program for optimization of quantum cascade laser active region parameters and calculation of output characteristics in magnetic field

    Science.gov (United States)

    Smiljanić, J.; Žeželj, M.; Milanović, V.; Radovanović, J.; Stanković, I.

    2014-03-01

    A strong magnetic field applied along the growth direction of a quantum cascade laser (QCL) active region gives rise to a spectrum of discrete energy states, the Landau levels. By combining quantum engineering of a QCL with a static magnetic field, we can selectively inhibit/enhance non-radiative electron relaxation processes between the relevant Landau levels of a triple quantum well and realize a tunable surface emitting device. An efficient numerical implementation is presented for the optimization of GaAs/AlGaAs QCL active region parameters and the calculation of output properties in the magnetic field. Both the theoretical analysis and the MATLAB implementation are given for the effect of LO-phonon and interface roughness scattering mechanisms on the operation of the QCL. At elevated temperatures, electrons in the relevant laser states absorb/emit more LO-phonons, which results in a reduction of the optical gain. The decrease in the optical gain is moderated by the occurrence of interface roughness scattering, which remains unchanged with increasing temperature. Using the calculated scattering rates as input data, the rate equations can be solved and the population inversion and optical gain obtained. Incorporation of the interface roughness scattering mechanism into the model did not create new resonant peaks of the optical gain. However, it resulted in shifting the positions of the existing peaks and an overall reduction of the optical gain. Catalogue identifier: AERL_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERL_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 37763 No. of bytes in distributed program, including test data, etc.: 2757956 Distribution format: tar.gz Programming language: MATLAB. Computer: Any capable of running MATLAB version R2010a or higher. Operating system: Any platform

  18. Optimization of environmental management strategies through a dynamic stochastic possibilistic multiobjective program

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xiaodong, E-mail: xiaodong.zhang@beg.utexas.edu [Bureau of Economic Geology, Jackson School of Geosciences, The University of Texas at Austin, Austin, TX 78713 (United States); Huang, Gordon [Institute of Energy, Environment and Sustainable Communities, University of Regina, Regina, Saskatchewan S4S 0A2 (Canada)

    2013-02-15

    Highlights: ► A dynamic stochastic possibilistic multiobjective programming model is developed. ► Greenhouse gas emission control is considered. ► Three planning scenarios are analyzed and compared. ► Optimal decision schemes under three scenarios and different p{sub i} levels are obtained. ► Tradeoffs between economics and environment are reflected. -- Abstract: Greenhouse gas (GHG) emissions from municipal solid waste (MSW) management facilities have become a serious environmental issue. In MSW management, not only economic objectives but also environmental objectives should be considered simultaneously. In this study, a dynamic stochastic possibilistic multiobjective programming (DSPMP) model is developed for supporting MSW management and associated GHG emission control. The DSPMP model improves upon the existing waste management optimization methods through incorporation of fuzzy possibilistic programming and chance-constrained programming into a general mixed-integer multiobjective linear programming (MOP) framework where various uncertainties expressed as fuzzy possibility distributions and probability distributions can be effectively reflected. Two conflicting objectives are integrally considered, including minimization of total system cost and minimization of total GHG emissions from waste management facilities. Three planning scenarios are analyzed and compared, representing different preferences of the decision makers for economic development and environmental-impact (i.e. GHG-emission) issues in integrated MSW management. Optimal decision schemes under three scenarios and different p{sub i} levels (representing the probability that the constraints would be violated) are generated for planning waste flow allocation and facility capacity expansions as well as GHG emission control. The results indicate that economic and environmental tradeoffs can be effectively reflected through the proposed DSPMP model. The generated decision variables can help

  19. Optimal Selection of AC Cables for Large Scale Offshore Wind Farms

    DEFF Research Database (Denmark)

    Hou, Peng; Hu, Weihao; Chen, Zhe

    2014-01-01

    The investment in large scale offshore wind farms is high, and the electrical system makes a significant contribution to the total cost. As one of the key components, the cost of the connection cables strongly affects the initial investment. The development of cable manufacturing provides a vast...... and systematic way for the optimal selection of cables in large scale offshore wind farms....

  20. Metrics Evolution in an Energy Research and Development Program

    International Nuclear Information System (INIS)

    Dixon, Brent

    2011-01-01

    All technology programs progress through three phases: Discovery, Definition, and Deployment. The form and application of program metrics needs to evolve with each phase. During the discovery phase, the program determines what is achievable. A set of tools is needed to define program goals, to analyze credible technical options, and to ensure that the options are compatible and meet the program objectives. A metrics system that scores the potential performance of technical options is part of this system of tools, supporting screening of concepts and aiding in the overall definition of objectives. During the definition phase, the program defines what specifically is wanted. What is achievable is translated into specific systems and specific technical options are selected and optimized. A metrics system can help with the identification of options for optimization and the selection of the option for deployment. During the deployment phase, the program shows that the selected system works. Demonstration projects are established and classical systems engineering is employed. During this phase, the metrics communicate system performance. This paper discusses an approach to metrics evolution within the Department of Energy's Nuclear Fuel Cycle R and D Program, which is working to improve the sustainability of nuclear energy.

  1. Optimal Input Design for Aircraft Parameter Estimation using Dynamic Programming Principles

    Science.gov (United States)

    Morelli, Eugene A.; Klein, Vladislav

    1990-01-01

    A new technique was developed for designing optimal flight test inputs for aircraft parameter estimation experiments. The principles of dynamic programming were used for the design in the time domain. This approach made it possible to include realistic practical constraints on the input and output variables. A description of the new approach is presented, followed by an example for a multiple input linear model describing the lateral dynamics of a fighter aircraft. The optimal input designs produced by the new technique demonstrated improved quality and expanded capability relative to the conventional multiple input design method.

  2. A Two-Stage Robust Optimization for Centralized-Optimal Dispatch of Photovoltaic Inverters in Active Distribution Networks

    DEFF Research Database (Denmark)

    Ding, Tao; Li, Cheng; Yang, Yongheng

    2017-01-01

    Optimally dispatching Photovoltaic (PV) inverters is an efficient way to avoid overvoltage in active distribution networks, which may occur in the case of PV generation surplus load demand. Typically, the dispatching optimization objective is to identify critical PV inverters that have the most...... nature of solar PV energy may affect the selection of the critical PV inverters and also the final optimal objective value. In order to address this issue, a two-stage robust optimization model is proposed in this paper to achieve a robust optimal solution to the PV inverter dispatch, which can hedge...... against any possible realization within the uncertain PV outputs. In addition, the conic relaxation-based branch flow formulation and second-order cone programming based column-and-constraint generation algorithm are employed to deal with the proposed robust optimization model. Case studies on a 33-bus...

  3. Minimizing transient influence in WHPA delineation: An optimization approach for optimal pumping rate schemes

    Science.gov (United States)

    Rodriguez-Pretelin, A.; Nowak, W.

    2017-12-01

    For most groundwater protection management programs, Wellhead Protection Areas (WHPAs) have served as the primary protection measure. In their delineation, the influence of time-varying groundwater flow conditions is often underestimated because steady-state assumptions are commonly made. However, it has been demonstrated that temporal variations lead to significant changes in the required size and shape of WHPAs. Apart from natural transient groundwater drivers (e.g., changes in the regional angle of flow direction and seasonal natural groundwater recharge), anthropogenic causes such as transient pumping rates are among the most influential factors that require larger WHPAs. We hypothesize that WHPA programs that integrate adaptive and optimized pumping-injection management schemes can counter transient effects and thus reduce the additional areal demand in well protection under transient conditions. The main goal of this study is to present a novel management framework that optimizes pumping schemes dynamically, in order to minimize the impact triggered by transient conditions in WHPA delineation. For optimizing pumping schemes, we consider three objectives: 1) to minimize the risk of pumping water from outside a given WHPA, 2) to maximize the groundwater supply and 3) to minimize the involved operating costs. Transient groundwater flow is solved with an available groundwater model coupled to Lagrangian particle tracking. The optimization problem is formulated as a dynamic programming problem. Two different optimization approaches are explored: I) the first approach performs single-objective optimization under objective (1) only; II) the second approach performs multiobjective optimization under all three objectives, where compromise pumping rates are selected from the current Pareto front. Finally, we look for WHPA outlines that are as small as possible, yet allow the optimization problem to find the most suitable solutions.

  4. An Efficacious Multi-Objective Fuzzy Linear Programming Approach for Optimal Power Flow Considering Distributed Generation.

    Science.gov (United States)

    Warid, Warid; Hizam, Hashim; Mariun, Norman; Abdul-Wahab, Noor Izzri

    2016-01-01

    This paper proposes a new formulation for the multi-objective optimal power flow (MOOPF) problem for meshed power networks considering distributed generation. An efficacious multi-objective fuzzy linear programming optimization (MFLP) algorithm is proposed to solve the aforementioned problem with and without considering the distributed generation (DG) effect. A varied combination of objectives is considered for simultaneous optimization, including power loss, voltage stability, and shunt capacitor MVAR reserve. Fuzzy membership functions for these objectives are designed with extreme targets, whereas the inequality constraints are treated as hard constraints. The multi-objective fuzzy optimal power flow (OPF) formulation was converted into a crisp OPF in a successive linear programming (SLP) framework and solved using an efficient interior point method (IPM). To test the efficacy of the proposed approach, simulations are performed on the IEEE 30-bus and IEEE 118-bus test systems. The MFLP optimization is solved for several optimization cases. The obtained results are compared with those presented in the literature. A unique solution with high satisfaction of the assigned targets is gained. Results demonstrate the effectiveness of the proposed MFLP technique in terms of solution optimality and rapid convergence. Moreover, the results indicate that using the optimal DG location with the MFLP algorithm provides the solution with the highest quality.
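The fuzzy max-min construction (linear memberships with extreme targets, hard inequality constraints) can be shown on a toy: each objective gets a membership equal to 1 at its best value and 0 at its worst, and the decision maximizing the smallest membership is picked. The exhaustive search below replaces the SLP/interior-point machinery of the paper and uses invented objectives.

```python
def membership(value, best, worst):
    """Linear fuzzy membership: 1 at the aspiration level (best), 0 at worst."""
    if best == worst:
        return 1.0
    return max(0.0, min(1.0, (worst - value) / (worst - best)))

def max_min_satisfaction(objectives, candidates):
    """Fuzzy max-min scalarization over a sampled decision set: choose the
    candidate maximizing the smallest membership across all objectives.
    objectives is a list of (function, best_value, worst_value) triples."""
    best_x, best_lam = None, -1.0
    for x in candidates:
        lam = min(membership(f(x), b, w) for f, b, w in objectives)
        if lam > best_lam:
            best_x, best_lam = x, lam
    return best_x, best_lam
```

For two perfectly conflicting linear objectives the max-min compromise lands exactly halfway, with an overall satisfaction level of 0.5, which illustrates how the crisp OPF balances competing targets.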

  5. An efficient inverse radiotherapy planning method for VMAT using quadratic programming optimization.

    Science.gov (United States)

    Hoegele, W; Loeschel, R; Merkle, N; Zygmanski, P

    2012-01-01

    The purpose of this study is to investigate the feasibility of an inverse planning optimization approach for the Volumetric Modulated Arc Therapy (VMAT) based on quadratic programming and the projection method. The performance of this method is evaluated against a reference commercial planning system (eclipse(TM) for rapidarc(TM)) for clinically relevant cases. The inverse problem is posed in terms of a linear combination of basis functions representing arclet dose contributions and their respective linear coefficients as degrees of freedom. MLC motion is decomposed into basic motion patterns in an intuitive manner leading to a system of equations with a relatively small number of equations and unknowns. These equations are solved using quadratic programming under certain limiting physical conditions for the solution, such as the avoidance of negative dose during optimization and Monitor Unit reduction. The modeling by the projection method assures a unique treatment plan with beneficial properties, such as the explicit relation between organ weightings and the final dose distribution. Clinical cases studied include prostate and spine treatments. The optimized plans are evaluated by comparing isodose lines, DVH profiles for target and normal organs, and Monitor Units to those obtained by the clinical treatment planning system eclipse(TM). The resulting dose distributions for a prostate (with rectum and bladder as organs at risk), and for a spine case (with kidneys, liver, lung and heart as organs at risk) are presented. Overall, the results indicate that similar plan qualities for quadratic programming (QP) and rapidarc(TM) could be achieved at significantly more efficient computational and planning effort using QP. Additionally, results for the quasimodo phantom [Bohsung et al., "IMRT treatment planning: A comparative inter-system and inter-centre planning exercise of the estro quasimodo group," Radiother. Oncol. 76(3), 354-361 (2005)] are presented as an example
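    The nonnegativity condition mentioned above (avoiding negative dose weights) turns the least-squares fit into a bound-constrained quadratic program. A minimal sketch, with an invented 3-voxel, 2-arclet dose matrix standing in for the paper's arclet basis:

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Invented example: columns of A are arclet dose contributions per unit
    # weight at three voxels; d is the prescribed dose at those voxels.
    A = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [1.0, 1.0]])
    d = np.array([2.0, 3.0, 5.0])
    # Solve min ||A w - d||^2 subject to w >= 0 (a QP with bound constraints).
    w, residual = nnls(A, d)
    ```

    Here an exact nonnegative solution exists (w = [2, 3]), so the residual vanishes; in a real plan the residual measures the dose-prescription mismatch.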

  6. Stan: A Probabilistic Programming Language for Bayesian Inference and Optimization

    Science.gov (United States)

    Gelman, Andrew; Lee, Daniel; Guo, Jiqiang

    2015-01-01

    Stan is a free and open-source C++ program that performs Bayesian inference or optimization for arbitrary user-specified models and can be called from the command line, R, Python, Matlab, or Julia. It has great promise for fitting large and complex statistical models in many areas of application. We discuss Stan from users' and developers'…

  7. An improved chaotic fruit fly optimization based on a mutation strategy for simultaneous feature selection and parameter optimization for SVM and its applications

    Science.gov (United States)

    Lou, Xin Yuan; Sun, Lin Fu

    2017-01-01

    This paper proposes a new support vector machine (SVM) optimization scheme based on an improved chaotic fruit fly optimization algorithm (FOA) with a mutation strategy to simultaneously perform parameter tuning for the SVM and feature selection. In the improved FOA, the chaotic particle initializes the fruit fly swarm location and replaces the expression of distance for the fruit fly to find the food source. In addition, the proposed mutation strategy uses two distinct generative mechanisms for new food sources at the osphresis phase, allowing the algorithm to search for the optimal solution both in the whole solution space and within the local solution space containing the fruit fly swarm location. In an evaluation based on a group of ten benchmark problems, the proposed algorithm’s performance is compared with that of other well-known algorithms, and the results support the superiority of the proposed algorithm. Moreover, this algorithm is successfully applied to an SVM to perform both parameter tuning and feature selection to solve real-world classification problems. This method, called chaotic fruit fly optimization algorithm (CIFOA)-SVM, has been shown to be a more robust and effective optimization method than other well-known methods, particularly in terms of solving the medical diagnosis problem and the credit card problem. PMID:28369096
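    The joint search over feature subsets and SVM hyperparameters can be sketched as follows, with plain random sampling standing in for the paper's chaotic FOA and synthetic data in place of the medical/credit data sets:

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.svm import SVC

    # Jointly sample a feature mask and the SVM penalty C, keep the best
    # cross-validated candidate (random search stand-in for CIFOA).
    X, y = make_classification(n_samples=200, n_features=10, n_informative=4,
                               random_state=0)
    rng = np.random.default_rng(0)
    best_score, best_mask, best_C = -1.0, None, None
    for _ in range(30):
        mask = rng.random(X.shape[1]) < 0.5       # candidate feature subset
        if not mask.any():
            continue
        C = 10 ** rng.uniform(-2, 2)              # candidate penalty parameter
        score = cross_val_score(SVC(C=C), X[:, mask], y, cv=3).mean()
        if score > best_score:
            best_score, best_mask, best_C = score, mask, C
    ```

    A metaheuristic such as the paper's CIFOA replaces the random candidate generation with guided moves, but the evaluation loop (mask + C → cross-validated accuracy) is the same.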

  8. A Generalized Measure for the Optimal Portfolio Selection Problem and its Explicit Solution

    Directory of Open Access Journals (Sweden)

    Zinoviy Landsman

    2018-03-01

    Full Text Available In this paper, we offer a novel class of utility functions applied to optimal portfolio selection. This class incorporates as special cases important measures such as the mean-variance, Sharpe ratio, mean-standard deviation and others. We provide an explicit solution to the problem of optimal portfolio selection based on this class. Furthermore, we show that each measure in this class generally reduces to an efficient frontier that coincides with, or belongs to, the classical mean-variance efficient frontier. In addition, a condition is provided for the existence of a one-to-one correspondence between the parameter of this class of utility functions and the trade-off parameter λ in the mean-variance utility function. This correspondence essentially provides insight into the choice of this parameter. We illustrate our results by taking a portfolio of stocks from the National Association of Securities Dealers Automated Quotation (NASDAQ).
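    For the classical mean-variance member of this class, the optimal weights under a full-investment constraint have a well-known closed form; the expected returns, covariance matrix, and trade-off parameter below are illustrative, not from the paper:

    ```python
    import numpy as np

    # Maximize mu'w - (lam/2) w' Sigma w subject to sum(w) == 1
    # (invented three-asset example).
    mu = np.array([0.08, 0.12, 0.10])
    Sigma = np.array([[0.04, 0.01, 0.00],
                      [0.01, 0.09, 0.02],
                      [0.00, 0.02, 0.06]])
    lam = 3.0                                  # risk-aversion trade-off
    ones = np.ones(3)
    Sinv = np.linalg.inv(Sigma)
    # Lagrangian stationarity: mu - lam*Sigma@w - gamma*ones = 0, with gamma
    # chosen so that the budget constraint sum(w) == 1 holds.
    gamma = (ones @ Sinv @ mu - lam) / (ones @ Sinv @ ones)
    w = Sinv @ (mu - gamma * ones) / lam
    ```

    Varying lam traces out the mean-variance efficient frontier that the paper's generalized measures reduce to.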

  9. Selecting an optimal mixed products using grey relationship model

    Directory of Open Access Journals (Sweden)

    Farshad Faezy Razi

    2013-06-01

    Full Text Available This paper presents an integrated supplier selection and inventory management using the grey relationship model (GRM) as well as a multi-objective decision making process. The proposed model of this paper first ranks different suppliers based on the GRM technique and then determines the optimum level of inventory by considering different objectives. To show the implementation of the proposed model, we use some benchmark data presented by Talluri and Baker [Talluri, S., & Baker, R. C. (2002). A multi-phase mathematical programming approach for effective supply chain design. European Journal of Operational Research, 141(3), 544-558.]. The preliminary results indicate that the proposed model of this paper is capable of handling different criteria for supplier selection.
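    The grey relational ranking step can be sketched as follows; the supplier scores are hypothetical, and the distinguishing coefficient ζ = 0.5 is the conventional choice:

    ```python
    import numpy as np

    # Hypothetical supplier scores on three benefit criteria (rows = suppliers).
    scores = np.array([[0.7, 0.9, 0.6],
                       [0.9, 0.6, 0.8],
                       [0.5, 0.7, 0.9]])
    # Normalize each criterion to [0, 1] (larger-is-better).
    norm = (scores - scores.min(0)) / (scores.max(0) - scores.min(0))
    delta = np.abs(1.0 - norm)              # distance to the ideal sequence
    zeta = 0.5                              # distinguishing coefficient
    xi = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    grade = xi.mean(axis=1)                 # grey relational grade per supplier
    ranking = np.argsort(-grade)            # best supplier first
    ```

    With these numbers the second supplier ranks first; the grades would then feed the multi-objective inventory stage described in the abstract.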

  10. Soft computing approach for reliability optimization: State-of-the-art survey

    International Nuclear Information System (INIS)

    Gen, Mitsuo; Yun, Young Su

    2006-01-01

    In the broadest sense, reliability is a measure of performance of systems. As systems have grown more complex, the consequences of their unreliable behavior have become severe in terms of cost, effort, lives, etc., and the interest in assessing system reliability and the need for improving the reliability of products and systems have become very important. Most solution methods for reliability optimization assume that systems have redundancy components in series and/or parallel systems and alternative designs are available. Reliability optimization problems concentrate on optimal allocation of redundancy components and optimal selection of alternative designs to meet system requirements. In the past two decades, numerous reliability optimization techniques have been proposed. Generally, these techniques can be classified as linear programming, dynamic programming, integer programming, geometric programming, heuristic methods, the Lagrangean multiplier method and so on. A Genetic Algorithm (GA), as a soft computing approach, is a powerful tool for solving various reliability optimization problems. In this paper, we briefly survey GA-based approaches for various reliability optimization problems, such as reliability optimization of redundant systems, reliability optimization with alternative design, reliability optimization with time-dependent reliability, reliability optimization with interval coefficients, bicriteria reliability optimization, and reliability optimization with fuzzy goals. We also introduce hybrid approaches for combining GA with fuzzy logic, neural networks and other conventional search techniques. Finally, we present experiments with examples of various reliability optimization problems using the hybrid GA approach.
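    The redundancy-allocation problem the survey describes can be illustrated with a toy instance (invented reliabilities, costs, and budget), here solved with a simple (1+1) mutation-only evolutionary search rather than a full GA:

    ```python
    import random

    # Choose redundancy levels n_j in 1..4 for a 3-unit series system to
    # maximize reliability under a cost budget (toy numbers).
    r = [0.80, 0.90, 0.85]                 # component reliabilities
    c = [2.0, 3.0, 2.5]                    # per-unit costs
    budget = 18.0

    def fitness(n):
        if sum(cj * nj for cj, nj in zip(c, n)) > budget:
            return 0.0                     # infeasible allocations are rejected
        rel = 1.0
        for rj, nj in zip(r, n):
            rel *= 1 - (1 - rj) ** nj      # parallel redundancy per stage
        return rel

    random.seed(1)
    best = [1, 1, 1]
    for _ in range(2000):                  # (1+1) evolutionary search
        cand = [min(4, max(1, nj + random.choice((-1, 0, 1)))) for nj in best]
        if fitness(cand) >= fitness(best):
            best = cand
    ```

    A full GA adds a population, crossover, and selection, but the encoding (one redundancy level per stage) and the penalized fitness are the same.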

  11. Optimal traffic control in highway transportation networks using linear programming

    KAUST Repository

    Li, Yanning

    2014-06-01

    This article presents a framework for the optimal control of boundary flows on transportation networks. The state of the system is modeled by a first order scalar conservation law (Lighthill-Whitham-Richards PDE). Based on an equivalent formulation of the Hamilton-Jacobi PDE, the problem of controlling the state of the system on a network link in a finite horizon can be posed as a Linear Program. Assuming all intersections in the network are controllable, we show that the optimization approach can be extended to an arbitrary transportation network, preserving linear constraints. Unlike previously investigated transportation network control schemes, this framework leverages the intrinsic properties of the Hamilton-Jacobi equation and does not require any discretization or boolean variables on the link. Hence this framework is computationally very efficient and provides the globally optimal solution. The feasibility of this framework is illustrated by an on-ramp metering control example.

  12. A feasibility study: Selection of a personalized radiotherapy fractionation schedule using spatiotemporal optimization

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Minsun, E-mail: mk688@uw.edu; Stewart, Robert D. [Department of Radiation Oncology, University of Washington, Seattle, Washington 98195-6043 (United States); Phillips, Mark H. [Departments of Radiation Oncology and Neurological Surgery, University of Washington, Seattle, Washington 98195-6043 (United States)

    2015-11-15

    Purpose: To investigate the impact of using spatiotemporal optimization, i.e., intensity-modulated spatial optimization followed by fractionation schedule optimization, to select the patient-specific fractionation schedule that maximizes the tumor biologically equivalent dose (BED) under dose constraints for multiple organs-at-risk (OARs). Methods: Spatiotemporal optimization was applied to a variety of lung tumors in a phantom geometry using a range of tumor sizes and locations. The optimal fractionation schedule for a patient using the linear-quadratic cell survival model depends on the tumor and OAR sensitivity to fraction size (α/β), the effective tumor doubling time (T{sub d}), and the size and location of tumor target relative to one or more OARs (dose distribution). The authors used a spatiotemporal optimization method to identify the optimal number of fractions N that maximizes the 3D tumor BED distribution for 16 lung phantom cases. The selection of the optimal fractionation schedule used equivalent (30-fraction) OAR constraints for the heart (D{sub mean} ≤ 45 Gy), lungs (D{sub mean} ≤ 20 Gy), cord (D{sub max} ≤ 45 Gy), esophagus (D{sub max} ≤ 63 Gy), and unspecified tissues (D{sub 05} ≤ 60 Gy). To assess plan quality, the authors compared the minimum, mean, maximum, and D{sub 95} of tumor BED, as well as the equivalent uniform dose (EUD) for optimized plans to conventional intensity-modulated radiation therapy plans prescribing 60 Gy in 30 fractions. A sensitivity analysis was performed to assess the effects of T{sub d} (3–100 days), tumor lag-time (T{sub k} = 0–10 days), and the size of tumors on optimal fractionation schedule. Results: Using an α/β ratio of 10 Gy, the average values of tumor max, min, mean BED, and D{sub 95} were up to 19%, 21%, 20%, and 19% larger than those from conventional prescription, depending on T{sub d} and T{sub k} used. Tumor EUD was up to 17% larger than the conventional prescription. For fast proliferating

  13. A screening method for the optimal selection of plate heat exchanger configurations

    Directory of Open Access Journals (Sweden)

    Pinto J.M.

    2002-01-01

    Full Text Available An optimization method for determining the best configuration(s of gasketed plate heat exchangers is presented. The objective is to select the configuration(s with the minimum heat transfer area that still satisfies constraints on the number of channels, the pressure drop of both fluids, the channel flow velocities and the exchanger thermal effectiveness. The configuration of the exchanger is defined by six parameters, which are as follows: the number of channels, the numbers of passes on each side, the fluid locations, the feed positions and the type of flow in the channels. The resulting configuration optimization problem is formulated as the minimization of the exchanger heat transfer area and a screening procedure is proposed for its solution. In this procedure, subsets of constraints are successively applied to eliminate infeasible and nonoptimal solutions. Examples show that the optimization method is able to successfully determine a set of optimal configurations with a minimum number of exchanger evaluations. Approximately 5 % of the pressure drop and channel velocity calculations and 1 % of the thermal simulations are required for the solution.

  14. An optimal maintenance policy for machine replacement problem using dynamic programming

    OpenAIRE

    Mohsen Sadegh Amalnik; Morteza Pourgharibshahi

    2017-01-01

    In this article, we present an acceptance sampling plan for the machine replacement problem based on the backward dynamic programming model. Discounted dynamic programming is used to solve a two-state machine replacement problem. We plan to design a model for maintenance by considering the quality of the item produced. The purpose of the proposed model is to determine the optimal threshold policy for maintenance in a finite time horizon. We create a decision tree based on a sequential sampling inc...
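    A backward-DP recursion for a two-state replacement problem can be sketched with invented transition probabilities, rewards, and costs (not the paper's model):

    ```python
    # Backward dynamic programming for a toy two-state machine replacement
    # problem. States: 0 = good, 1 = worn. Actions: keep, or replace (which
    # resets the machine to state 0 before production). Discounted horizon.
    P_KEEP = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.0, 1: 1.0}}  # transition probs
    REWARD = {0: 100.0, 1: 40.0}     # per-period production value by state
    REPLACE_COST = 70.0
    BETA = 0.9                       # discount factor
    T = 12                           # horizon length

    V = {0: 0.0, 1: 0.0}             # terminal values
    policy = {}
    for t in range(T):               # sweep backward from the horizon
        newV, act = {}, {}
        for s in (0, 1):
            keep = REWARD[s] + BETA * sum(p * V[s2]
                                          for s2, p in P_KEEP[s].items())
            rep = REWARD[0] - REPLACE_COST + BETA * sum(
                p * V[s2] for s2, p in P_KEEP[0].items())
            newV[s], act[s] = max((keep, 'keep'), (rep, 'replace'))
        V, policy = newV, act
    ```

    With these numbers the recursion yields a threshold policy: keep a good machine, replace a worn one whenever enough periods remain for the reset to pay off.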

  15. Using Random Forests to Select Optimal Input Variables for Short-Term Wind Speed Forecasting Models

    Directory of Open Access Journals (Sweden)

    Hui Wang

    2017-10-01

    Full Text Available Achieving relatively high-accuracy short-term wind speed forecasts is a precondition for the construction and grid-connected operation of wind power forecasting systems for wind farms. Currently, most research is focused on the structure of forecasting models and does not consider the selection of input variables, which can have significant impacts on forecasting performance. This paper presents an input variable selection method for wind speed forecasting models. The candidate input variables for various leading periods are selected, and random forests (RF) is employed to evaluate the importance of all variables as features. The feature subset with the best evaluation performance is selected as the optimal feature set. Then, a kernel-based extreme learning machine is constructed to evaluate the performance of input variable selection based on RF. The results of the case study show that by removing the uncorrelated and redundant features, RF effectively extracts the most strongly correlated set of features from the candidate input variables. By finding the optimal feature combination to represent the original information, RF simplifies the structure of the wind speed forecasting model, shortens the training time required, and substantially improves the model’s accuracy and generalization ability, demonstrating that the input variables selected by RF are effective.
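    The RF-based ranking step can be sketched on synthetic data (not wind-speed series): rank candidate inputs by impurity importance and keep the top subset.

    ```python
    import numpy as np
    from sklearn.datasets import make_regression
    from sklearn.ensemble import RandomForestRegressor

    # With shuffle=False, the 3 informative inputs are the first 3 columns;
    # the remaining 5 columns are pure noise.
    X, y = make_regression(n_samples=300, n_features=8, n_informative=3,
                           noise=1.0, shuffle=False, random_state=0)
    rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)
    order = np.argsort(rf.feature_importances_)[::-1]   # most important first
    selected = order[:3]                                # retained feature subset
    ```

    In the paper, the retained subset then feeds a kernel-based extreme learning machine; here any downstream regressor trained on `X[:, selected]` would play that role.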

  16. Cost-benefit study of consumer product take-back programs using IBM's WIT reverse logistics optimization tool

    Science.gov (United States)

    Veerakamolmal, Pitipong; Lee, Yung-Joon; Fasano, J. P.; Hale, Rhea; Jacques, Mary

    2002-02-01

    In recent years, there has been increased focus by regulators, manufacturers, and consumers on the issue of product end-of-life management for electronics. This paper presents an overview of a conceptual study designed to examine the costs and benefits of several different Product Take Back (PTB) scenarios for used electronics equipment. The study utilized a reverse logistics supply chain model to examine the effects of several different factors in PTB programs. The model was built using the IBM supply chain optimization tool known as WIT (Watson Implosion Technology). Using the WIT tool, we were able to determine a theoretical optimal cost scenario for PTB programs. The study was designed to assist IBM internally in determining theoretical optimal Product Take Back program models and determining potential incentives for increasing participation rates.

  17. Stochastic optimization: beyond mathematical programming

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Stochastic optimization, including bio-inspired algorithms, is gaining momentum in areas where more classical optimization algorithms fail to deliver satisfactory results, or simply cannot be directly applied. This presentation will introduce baseline stochastic optimization algorithms and illustrate their efficiency in different domains, from continuous non-convex problems to combinatorial optimization problems, to problems for which a non-parametric formulation can help explore unforeseen possible solution spaces.

  18. Optimality and stability of symmetric evolutionary games with applications in genetic selection.

    Science.gov (United States)

    Huang, Yuanyuan; Hao, Yiping; Wang, Min; Zhou, Wen; Wu, Zhijun

    2015-06-01

    Symmetric evolutionary games, i.e., evolutionary games with symmetric fitness matrices, have important applications in population genetics, where they can be used to model for example the selection and evolution of the genotypes of a given population. In this paper, we review the theory for obtaining optimal and stable strategies for symmetric evolutionary games, and provide some new proofs and computational methods. In particular, we review the relationship between the symmetric evolutionary game and the generalized knapsack problem, and discuss the first and second order necessary and sufficient conditions that can be derived from this relationship for testing the optimality and stability of the strategies. Some of the conditions are given in different forms from those in previous work and can be verified more efficiently. We also derive more efficient computational methods for the evaluation of the conditions than conventional approaches. We demonstrate how these conditions can be applied to justifying the strategies and their stabilities for a special class of genetic selection games including some in the study of genetic disorders.

  19. Efficient Iris Recognition Based on Optimal Subfeature Selection and Weighted Subregion Fusion

    Science.gov (United States)

    Deng, Ning

    2014-01-01

    In this paper, we propose three discriminative feature selection strategies and a weighted subregion matching method to improve the performance of an iris recognition system. Firstly, we introduce the process of feature extraction and representation based on scale invariant feature transformation (SIFT) in detail. Secondly, three strategies are described, which are the orientation probability distribution function (OPDF) based strategy to delete some redundant feature keypoints, the magnitude probability distribution function (MPDF) based strategy to reduce the dimensionality of feature elements, and a compounded strategy combining OPDF and MPDF to further select an optimal subfeature. Thirdly, to make matching more effective, this paper proposes a novel matching method based on weighted subregion matching fusion. Particle swarm optimization is utilized to determine the different subregions' weights, and the weighted matching scores of the subregions are then fused to generate the final decision. The experimental results, on three public and renowned iris databases (CASIA-V3 Interval, Lamp, and MMU-V1), demonstrate that our proposed methods outperform some of the existing methods in terms of correct recognition rate, equal error rate, and computation complexity. PMID:24683317

  20. Efficient Iris Recognition Based on Optimal Subfeature Selection and Weighted Subregion Fusion

    Directory of Open Access Journals (Sweden)

    Ying Chen

    2014-01-01

    Full Text Available In this paper, we propose three discriminative feature selection strategies and a weighted subregion matching method to improve the performance of an iris recognition system. Firstly, we introduce the process of feature extraction and representation based on scale invariant feature transformation (SIFT) in detail. Secondly, three strategies are described, which are the orientation probability distribution function (OPDF) based strategy to delete some redundant feature keypoints, the magnitude probability distribution function (MPDF) based strategy to reduce the dimensionality of feature elements, and a compounded strategy combining OPDF and MPDF to further select an optimal subfeature. Thirdly, to make matching more effective, this paper proposes a novel matching method based on weighted subregion matching fusion. Particle swarm optimization is utilized to determine the different subregions’ weights, and the weighted matching scores of the subregions are then fused to generate the final decision. The experimental results, on three public and renowned iris databases (CASIA-V3 Interval, Lamp, and MMU-V1), demonstrate that our proposed methods outperform some of the existing methods in terms of correct recognition rate, equal error rate, and computation complexity.

  1. Efficient iris recognition based on optimal subfeature selection and weighted subregion fusion.

    Science.gov (United States)

    Chen, Ying; Liu, Yuanning; Zhu, Xiaodong; He, Fei; Wang, Hongye; Deng, Ning

    2014-01-01

    In this paper, we propose three discriminative feature selection strategies and a weighted subregion matching method to improve the performance of an iris recognition system. Firstly, we introduce the process of feature extraction and representation based on scale invariant feature transformation (SIFT) in detail. Secondly, three strategies are described, which are the orientation probability distribution function (OPDF) based strategy to delete some redundant feature keypoints, the magnitude probability distribution function (MPDF) based strategy to reduce the dimensionality of feature elements, and a compounded strategy combining OPDF and MPDF to further select an optimal subfeature. Thirdly, to make matching more effective, this paper proposes a novel matching method based on weighted subregion matching fusion. Particle swarm optimization is utilized to determine the different subregions' weights, and the weighted matching scores of the subregions are then fused to generate the final decision. The experimental results, on three public and renowned iris databases (CASIA-V3 Interval, Lamp, and MMU-V1), demonstrate that our proposed methods outperform some of the existing methods in terms of correct recognition rate, equal error rate, and computation complexity.
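    The particle-swarm step used above to learn subregion weights can be sketched generically (this is a minimal textbook PSO, not the paper's exact variant); a toy quadratic with known minimizer 0.3 per dimension stands in for the recognition-error objective over weighted subregion match scores:

    ```python
    import numpy as np

    # Minimal particle swarm optimizer: inertia 0.7, cognitive/social pulls 1.5.
    def pso(f, dim, n=20, iters=100, seed=0):
        rng = np.random.default_rng(seed)
        x = rng.uniform(-1, 1, (n, dim))              # particle positions
        v = np.zeros((n, dim))                        # particle velocities
        pbest = x.copy()                              # personal bests
        pval = np.apply_along_axis(f, 1, x)
        g = pbest[pval.argmin()].copy()               # global best
        for _ in range(iters):
            r1, r2 = rng.random((2, n, dim))
            v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (g - x)
            x = x + v
            val = np.apply_along_axis(f, 1, x)
            better = val < pval
            pbest[better], pval[better] = x[better], val[better]
            g = pbest[pval.argmin()].copy()
        return g

    best_w = pso(lambda z: ((z - 0.3) ** 2).sum(), dim=3)
    ```

    In the paper's setting, `f` would score a candidate weight vector by the recognition error of the weighted-fusion matcher.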

  2. Programming in the Zone: Repertoire Selection for the Large Ensemble

    Science.gov (United States)

    Hopkins, Michael

    2013-01-01

    One of the great challenges ensemble directors face is selecting high-quality repertoire that matches the musical and technical levels of their ensembles. Thoughtful repertoire selection can lead to increased student motivation as well as greater enthusiasm for the music program from parents, administrators, teachers, and community members. Common…

  3. TreePOD: Sensitivity-Aware Selection of Pareto-Optimal Decision Trees.

    Science.gov (United States)

    Muhlbacher, Thomas; Linhardt, Lorenz; Moller, Torsten; Piringer, Harald

    2018-01-01

    Balancing accuracy gains with other objectives such as interpretability is a key challenge when building decision trees. However, this process is difficult to automate because it involves know-how about the domain as well as the purpose of the model. This paper presents TreePOD, a new approach for sensitivity-aware model selection along trade-offs. TreePOD is based on exploring a large set of candidate trees generated by sampling the parameters of tree construction algorithms. Based on this set, visualizations of quantitative and qualitative tree aspects provide a comprehensive overview of possible tree characteristics. Along trade-offs between two objectives, TreePOD provides efficient selection guidance by focusing on Pareto-optimal tree candidates. TreePOD also conveys the sensitivities of tree characteristics on variations of selected parameters by extending the tree generation process with a full-factorial sampling. We demonstrate how TreePOD supports a variety of tasks involved in decision tree selection and describe its integration in a holistic workflow for building and selecting decision trees. For evaluation, we illustrate a case study for predicting critical power grid states, and we report qualitative feedback from domain experts in the energy sector. This feedback suggests that TreePOD enables users both with and without a statistical background to identify suitable decision trees confidently and efficiently.
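    The candidate-generation and Pareto-filtering idea can be sketched in a heavily simplified form (synthetic data; a small parameter grid in place of TreePOD's full-factorial sampling; tree size as the interpretability proxy):

    ```python
    from sklearn.datasets import make_classification
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    # Sample tree-construction parameters, score each candidate on
    # accuracy vs. tree size, keep the (weakly) Pareto-optimal set.
    X, y = make_classification(n_samples=300, random_state=0)
    cands = []
    for depth in (2, 3, 4, 6, 8):
        for leaf in (1, 5, 20):
            tree = DecisionTreeClassifier(max_depth=depth,
                                          min_samples_leaf=leaf,
                                          random_state=0)
            acc = cross_val_score(tree, X, y, cv=3).mean()
            size = tree.fit(X, y).get_n_leaves()
            cands.append((acc, size))

    # A candidate is kept unless another is strictly more accurate AND smaller.
    pareto = [cnd for cnd in cands
              if not any(o[0] > cnd[0] and o[1] < cnd[1] for o in cands)]
    ```

    TreePOD then layers interactive visualization and sensitivity analysis on top of such a candidate set; the Pareto filter is what narrows the selection to the accuracy-vs-interpretability trade-off curve.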

  4. Comonotonic approximations for a generalized provisioning problem with application to optimal portfolio selection

    NARCIS (Netherlands)

    van Weert, K.; Dhaene, J.; Goovaerts, M.

    2011-01-01

    In this paper we discuss multiperiod portfolio selection problems related to a specific provisioning problem. Our results are an extension of Dhaene et al. (2005) [14], where optimal constant mix investment strategies are obtained in a provisioning and savings context, using an analytical approach

  5. Plant breeding with marker-assisted selection in Brazil

    Directory of Open Access Journals (Sweden)

    Ney Sussumu Sakiyama

    2014-03-01

    Full Text Available Over the past three decades, molecular marker studies have made extraordinary advances, especially in sequencing and bioinformatics techniques. Marker-assisted selection has become part of the breeding program routines of important seed companies, in order to accelerate and optimize the cultivar development process. Private seed companies increasingly use marker-assisted selection, especially for species of great importance to the seed market, e.g. corn, soybean, cotton, and sunflower. In Brazilian public institutions, few breeding programs use it efficiently. The possible reasons are: lack of know-how, lack of appropriate laboratories, few validated markers, high cost, and lack of urgency in obtaining cultivars. In this article we analyze the use and the constraints of marker-assisted selection in the plant breeding programs of Brazilian public institutions.

  6. Fuzzy linear programming based optimal fuel scheduling incorporating blending/transloading facilities

    Energy Technology Data Exchange (ETDEWEB)

    Djukanovic, M.; Babic, B.; Milosevic, B. [Electrical Engineering Inst. Nikola Tesla, Belgrade (Yugoslavia); Sobajic, D.J. [EPRI, Palo Alto, CA (United States). Power System Control; Pao, Y.H. [Case Western Reserve Univ., Cleveland, OH (United States)]|[AI WARE, Inc., Cleveland, OH (United States)

    1996-05-01

    In this paper the blending/transloading facilities are modeled using interactive fuzzy linear programming (FLP), in order to allow the decision-maker to solve the problem of uncertainty of input information within the fuel scheduling optimization. An interactive decision-making process is formulated in which the decision-maker can learn to recognize good solutions by considering all possibilities of fuzziness. The application of the fuzzy formulation is accompanied by a careful examination of the definition of fuzziness, the appropriateness of the membership function and the interpretation of results. The proposed concept provides a decision support system with integration-oriented features, whereby the decision-maker can learn to recognize the relative importance of factors in the specific domain of the optimal fuel scheduling (OFS) problem. The formulation of a fuzzy linear programming problem to obtain a reasonable nonfuzzy solution under consideration of the ambiguity of parameters, represented by fuzzy numbers, is introduced. An additional advantage of the FLP formulation is its ability to deal with multi-objective problems.

  7. Optimisation of selective breeding program for Nile tilapia (Oreochromis niloticus)

    NARCIS (Netherlands)

    Trong, T.Q.

    2013-01-01

    The aim of this thesis was to optimise the selective breeding program for Nile tilapia in the Mekong Delta region of Vietnam. Two breeding schemes, the “classic” BLUP scheme following the GIFT method (with pair mating) and a rotational mating scheme with own performance selection and

  8. Clinical implementation of stereotaxic brain implant optimization

    International Nuclear Information System (INIS)

    Rosenow, U.F.; Wojcicka, J.B.

    1991-01-01

    This optimization method for stereotaxic brain implants is based on seed/strand configurations of the basic type developed for the National Cancer Institute (NCI) atlas of regular brain implants. Irregular target volume shapes are determined from delineation in a stack of contrast enhanced computed tomography scans. The neurosurgeon may then select up to ten directions, or entry points, of surgical approach of which the program finds the optimal one under the criterion of smallest target volume diameter. Target volume cross sections are then reconstructed in 5-mm-spaced planes perpendicular to the implantation direction defined by the entry point and the target volume center. This information is used to define a closed line in an implant cross section along which peripheral seed strands are positioned and which has now an irregular shape. Optimization points are defined opposite peripheral seeds on the target volume surface to which the treatment dose rate is prescribed. Three different optimization algorithms are available: linear least-squares programming, quadratic programming with constraints, and a simplex method. The optimization routine is implemented into a commercial treatment planning system. It generates coordinate and source strength information of the optimized seed configurations for further dose rate distribution calculation with the treatment planning system, and also the coordinate settings for the stereotaxic Brown-Roberts-Wells (BRW) implantation device

  9. JuPOETs: a constrained multiobjective optimization approach to estimate biochemical model ensembles in the Julia programming language.

    Science.gov (United States)

    Bassen, David M; Vilkhovoy, Michael; Minot, Mason; Butcher, Jonathan T; Varner, Jeffrey D

    2017-01-25

    Ensemble modeling is a promising approach for obtaining robust predictions and coarse-grained population behavior in deterministic mathematical models. Ensemble approaches address model uncertainty by using parameter or model families instead of single best-fit parameters or fixed model structures. Parameter ensembles can be selected based upon simulation error, along with other criteria such as diversity or steady-state performance. Simulations using parameter ensembles can estimate confidence intervals on model variables, and robustly constrain model predictions, despite having many poorly constrained parameters. In this software note, we present a multiobjective based technique to estimate parameter or model ensembles, the Pareto Optimal Ensemble Technique in the Julia programming language (JuPOETs). JuPOETs integrates simulated annealing with Pareto optimality to estimate ensembles on or near the optimal tradeoff surface between competing training objectives. We demonstrate JuPOETs on a suite of multiobjective problems, including test functions with parameter bounds and system constraints as well as for the identification of a proof-of-concept biochemical model with four conflicting training objectives. JuPOETs identified optimal or near optimal solutions approximately six-fold faster than a corresponding implementation in Octave for the suite of test functions. For the proof-of-concept biochemical model, JuPOETs produced an ensemble of parameters that captured the mean of the training data for conflicting data sets, while simultaneously estimating parameter sets that performed well on each of the individual objective functions. JuPOETs is a promising approach for the estimation of parameter and model ensembles using multiobjective optimization. JuPOETs can be adapted to solve many problem types, including mixed binary and continuous variable types, bilevel optimization problems and constrained problems without altering the base algorithm. JuPOETs is open

  10. Computer-assisted optimization of reversed-phase HPLC isocratic separations of neutral compounds

    NARCIS (Netherlands)

    Baczek, T.; Kaliszan, R.; Claessens, H.A.; Straten, van M.A.

    2001-01-01

    Rational selection of optimized experimental conditions for chromatographic separation of analytes is realized nowadays by means of specialized computer programs. Two such programs, DryLab (LC Resources, Walnut Creek, California, USA) and ChromSword (Merck KGaA, Darmstadt, Germany), were compared. The aim of the

  11. Optimization Techniques for Design Problems in Selected Areas in WSNs: A Tutorial.

    Science.gov (United States)

    Ibrahim, Ahmed; Alfa, Attahiru

    2017-08-01

    This paper is intended to serve as an overview of, and mostly a tutorial to illustrate, the optimization techniques used in several different key design aspects that have been considered in the literature of wireless sensor networks (WSNs). It targets researchers who are new to the mathematical optimization tool and wish to apply it to WSN design problems. We hence divide the paper into two main parts. One part is dedicated to introducing optimization theory and an overview of some of its techniques that could be helpful in design problems in WSNs. In the second part, we present a number of design aspects that we came across in the WSN literature in which mathematical optimization methods have been used in the design. For each design aspect, a key paper is selected, and we explain the formulation techniques and the solution methods it implements. We also provide in-depth analyses and assessments of the problem formulations, the corresponding solution techniques and experimental procedures in some of these papers. The analyses and assessments, which are provided in the form of comments, are meant to reflect the points that we believe should be taken into account when using optimization as a tool for design purposes.

  12. Selective pressures on C4 photosynthesis evolution in grasses through the lens of optimality

    OpenAIRE

    Akcay, Erol; Zhou, Haoran; Helliker, Brent

    2016-01-01

    CO2, temperature, water availability and light intensity were potential selective pressures to propel the initial evolution and global expansion of C4 photosynthesis in grasses. To tease apart the primary selective pressures along the evolutionary trajectory, we coupled photosynthesis and hydraulics models and optimized photosynthesis over stomatal resistance and leaf/fine-root allocation. We also examined the importance of nitrogen reallocation from the dark to the light reactions. Our resul...

  13. Optimality Conditions in Vector Optimization

    CERN Document Server

    Jiménez, Manuel Arana; Lizana, Antonio Rufián

    2011-01-01

    Vector optimization is continuously needed in several science fields, particularly in economy, business, engineering, physics and mathematics. The evolution of these fields depends, in part, on the improvements in vector optimization in mathematical programming. The aim of this Ebook is to present the latest developments in vector optimization. The contributions have been written by some of the most eminent researchers in this field of mathematical programming. The Ebook is considered essential for researchers and students in this field.

  14. The use of linear programming in optimization of HDR implant dose distributions

    International Nuclear Information System (INIS)

    Jozsef, Gabor; Streeter, Oscar E.; Astrahan, Melvin A.

    2003-01-01

    The introduction of high dose rate brachytherapy enabled optimization of dose distributions to be used on a routine basis. The objective of optimization is to homogenize the dose distribution within the implant while simultaneously satisfying dose constraints on certain points. This is accomplished by varying the time the source dwells at different locations. As the dose at any point is a linear function of the dwell times, a linear programming approach seems to be a natural choice. The dose constraints are inherently linear inequalities. Homogeneity requirements are linearized by minimizing the maximum deviation of the doses at points inside the implant from a prescribed dose. The revised simplex method was applied for the solution of this linear programming problem. In the homogenization process the possible source locations were chosen as optimization points. To avoid the problem of the singular value of the dose at a source location from the source itself we define the 'self-contribution' as the dose at a small distance from the source. The effect of varying this distance is discussed. Test cases were optimized for planar, biplanar and cylindrical implants. A semi-irregular, fan-like implant with diverging needles was also investigated. Mean central dose calculation based on 3D Delaunay-triangulation of the source locations was used to evaluate the dose distributions. The optimization method resulted in homogeneous distributions (for brachytherapy). Additional dose constraints--when applied--were satisfied. The method is flexible enough to include other linear constraints such as the inclusion of the centroids of the Delaunay-triangulation for homogenization, or limiting the maximum allowable dwell time
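    The homogenization objective described above, minimizing the largest deviation of dose from the prescription over nonnegative dwell times, can be illustrated on a toy instance. The dose-rate kernel and grid below are invented, and the brute-force search merely stands in for the revised simplex solver used in the paper:

```python
import itertools

# toy dose-rate kernel: dose at point i per unit dwell time at source j
# (values are illustrative, not clinical data)
D = [[1.0, 0.2, 0.1],
     [0.2, 1.0, 0.2],
     [0.1, 0.2, 1.0]]
prescription = 5.0

def max_deviation(t):
    """Largest |dose - prescription| over all optimization points."""
    doses = [sum(dij * tj for dij, tj in zip(row, t)) for row in D]
    return max(abs(d - prescription) for d in doses)

# brute-force stand-in for the LP solver: search dwell times on a coarse grid
grid = [i * 0.25 for i in range(0, 41)]          # 0.0 .. 10.0 time units
best_t = min(itertools.product(grid, repeat=3), key=max_deviation)
best_dev = max_deviation(best_t)
```

    In the LP formulation the same minimax objective is linearized by introducing a bound variable z with constraints -z <= dose_i - prescription <= z and minimizing z.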

  15. Optimal Training for Time-Selective Wireless Fading Channels Using Cutoff Rate

    Directory of Open Access Journals (Sweden)

    Tong Lang

    2006-01-01

    Full Text Available We consider the optimal allocation of resources—power and bandwidth—between training and data transmissions for single-user time-selective Rayleigh flat-fading channels under the cutoff rate criterion. The transmitter exploits statistical channel state information (CSI) in the form of the channel Doppler spectrum to embed pilot symbols into the transmission stream. At the receiver, instantaneous, though imperfect, CSI is acquired through minimum mean-square estimation of the channel based on some set of pilot observations. We compute the ergodic cutoff rate for this scenario. Assuming estimator-based interleaving and M-PSK inputs, we study two special cases in depth. First, we derive the optimal resource allocation for the Gauss-Markov correlation model. Next, we validate and refine these insights by studying resource allocation for the Jakes model.

  16. Optimal local dimming for LED-backlit LCD displays via linear programming

    DEFF Research Database (Denmark)

    Shu, Xiao; Wu, Xiaolin; Forchhammer, Søren

    2012-01-01

    and the attenuations of LCD pixels. The objective is to minimize the distortion in luminance reproduction due to the leakage of LCD and the coarse granularity of the LED lights. The optimization problem is formulated as one of linear programming, and both exact and approximate algorithms are proposed. Simulation...

  17. Energy-efficient relay selection and optimal power allocation for performance-constrained dual-hop variable-gain AF relaying

    KAUST Repository

    Zafar, Ammar

    2013-12-01

    This paper investigates the energy-efficiency enhancement of a variable-gain dual-hop amplify-and-forward (AF) relay network utilizing selective relaying. The objective is to minimize the total consumed power while keeping the end-to-end signal-to-noise-ratio (SNR) above a certain peak value and satisfying the peak power constraints at the source and relay nodes. To achieve this objective, an optimal relay selection and power allocation strategy is derived by solving the power minimization problem. Numerical results show that the derived optimal strategy enhances the energy-efficiency as compared to a benchmark scheme in which both the source and the selected relay transmit at peak power. © 2013 IEEE.

  18. Optimal Selection Method of Process Patents for Technology Transfer Using Fuzzy Linguistic Computing

    Directory of Open Access Journals (Sweden)

    Gangfeng Wang

    2014-01-01

    Full Text Available Under the open innovation paradigm, technology transfer of process patents is one of the most important mechanisms for manufacturing companies to implement process innovation and enhance the competitive edge. To achieve promising technology transfers, we need to evaluate the feasibility of process patents and optimally select the most appropriate patent according to the actual manufacturing situation. Hence, this paper proposes an optimal selection method of process patents using multiple criteria decision-making and 2-tuple fuzzy linguistic computing to avoid information loss during the processes of evaluation integration. An evaluation index system for technology transfer feasibility of process patents is designed initially. Then, fuzzy linguistic computing approach is applied to aggregate the evaluations of criteria weights for each criterion and corresponding subcriteria. Furthermore, performance ratings for subcriteria and fuzzy aggregated ratings of criteria are calculated. Thus, we obtain the overall technology transfer feasibility of patent alternatives. Finally, a case study of aeroengine turbine manufacturing is presented to demonstrate the applicability of the proposed method.
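    The 2-tuple linguistic representation used by such methods converts linguistic ratings to numeric values, aggregates them, and converts the result back as a (term, offset) pair so that no information is rounded away. A minimal sketch, with a term set, ratings and weights invented for illustration:

```python
# linguistic term set (index = position on the scale)
TERMS = ["very poor", "poor", "fair", "good", "very good"]

def delta(beta):
    """Convert a numeric value on [0, len(TERMS)-1] to a 2-tuple (term, alpha)."""
    i = min(len(TERMS) - 1, max(0, round(beta)))
    return TERMS[i], beta - i

def delta_inv(term, alpha=0.0):
    """Inverse conversion: 2-tuple back to its numeric value."""
    return TERMS.index(term) + alpha

def weighted_2tuple(ratings, weights):
    """Weighted aggregation of 2-tuple linguistic ratings (weights sum to 1)."""
    beta = sum(delta_inv(t, a) * w for (t, a), w in zip(ratings, weights))
    return delta(beta)

ratings = [("good", 0.0), ("fair", 0.0), ("very good", 0.0)]
weights = [0.5, 0.3, 0.2]
term, alpha = weighted_2tuple(ratings, weights)   # ("good", -0.1): slightly below "good"
```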

  19. Dynamic supplier selection problem considering full truck load in probabilistic environment

    Science.gov (United States)

    Sutrisno, Wicaksono, Purnawan Adi

    2017-11-01

    In this paper, we propose a probabilistic dynamic optimization model to solve a dynamic supplier selection problem considering full truck load in a probabilistic environment where some parameters are uncertain. We determine the optimal strategy for this problem by using stochastic dynamic programming. We give some numerical experiments to evaluate and analyze the model. From the results, the optimal supplier and the optimal product volume from that supplier were determined for each time period.
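    A stochastic dynamic program of this kind can be sketched as a backward recursion over inventory states, with ordering restricted to whole truck loads; all prices, capacities and demand scenarios below are invented for illustration, not taken from the paper:

```python
import functools

# illustrative data: two suppliers, full-truck-load ordering, random demand
TRUCK = 10                        # units per truck; orders are whole trucks
SUPPLIERS = {"A": 3.0, "B": 2.5}  # unit price per supplier
FIXED = {"A": 5.0, "B": 12.0}     # fixed cost per truck (e.g. shipping)
HOLD = 0.5                        # holding cost per unit per period
DEMAND = [(8, 0.5), (12, 0.5)]    # demand scenarios (value, probability)
PERIODS = 3
MAX_INV = 30

@functools.lru_cache(maxsize=None)
def cost_to_go(t, inv):
    """Minimum expected cost from period t onward with `inv` units on hand."""
    if t == PERIODS:
        return 0.0
    best = float("inf")
    for name, price in SUPPLIERS.items():
        for trucks in range(0, 3):
            q = trucks * TRUCK
            if inv + q > MAX_INV:
                break
            order_cost = trucks * FIXED[name] + q * price
            expected = 0.0
            feasible = True
            for d, p in DEMAND:
                left = inv + q - d
                if left < 0:          # demand must be met (no backorders)
                    feasible = False
                    break
                expected += p * (HOLD * left + cost_to_go(t + 1, left))
            if feasible:
                best = min(best, order_cost + expected)
    return best

total = cost_to_go(0, 0)   # optimal expected cost starting with empty inventory
```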

  20. 8th Workshop on Computational Optimization

    CERN Document Server

    2016-01-01

    This volume is a comprehensive collection of extended contributions from the Workshop on Computational Optimization 2015. It presents recent advances in computational optimization. The volume includes important real-life problems such as parameter settings for controlling processes in a bioreactor, control of ethanol production, minimal convex hull with application in routing algorithms, graph coloring, flow design in photonic data transport systems, predicting indoor temperature, crisis control center monitoring, fuel consumption of helicopters, portfolio selection, GPS surveying and so on. It shows how to develop algorithms for them based on new metaheuristic methods like evolutionary computation, ant colony optimization, constraint programming and others. This research demonstrates how some real-world problems arising in engineering, economics, medicine and other domains can be formulated as optimization problems.

  1. CALIBRATION, OPTIMIZATION, AND SENSITIVITY AND UNCERTAINTY ALGORITHMS APPLICATION PROGRAMMING INTERFACE (COSU-API)

    Science.gov (United States)

    The Application Programming Interface (API) for Uncertainty Analysis, Sensitivity Analysis, and Parameter Estimation (UA/SA/PE API) tool development, hereafter referred to as the Calibration, Optimization, and Sensitivity and Uncertainty Algorithms API (COSU-API), was initially d...

  2. Automated design and optimization of flexible booster autopilots via linear programming. Volume 2: User's manual

    Science.gov (United States)

    Hauser, F. D.; Szollosi, G. D.; Lakin, W. S.

    1972-01-01

    COEBRA, the Computerized Optimization of Elastic Booster Autopilots, is an autopilot design program. The bulk of the design criteria is presented in the form of minimum allowed gain/phase stability margins. COEBRA has two optimization phases: (1) a phase to maximize stability margins; and (2) a phase to optimize structural bending moment load relief capability in the presence of minimum requirements on gain/phase stability margins.

  3. Parameter identification using optimization techniques in the continuous simulation programs FORSIM and MACKSIM

    International Nuclear Information System (INIS)

    Carver, M.B.; Austin, C.F.; Ross, N.E.

    1980-02-01

    This report discusses the mechanics of automated parameter identification in simulation packages, and reviews available integration and optimization algorithms and their interaction within the recently developed optimization options in the FORSIM and MACKSIM simulation packages. In the MACKSIM mass-action chemical kinetics simulation package, the form and structure of the ordinary differential equations involved is known, so the implementation of an optimizing option is relatively straightforward. FORSIM, however, is designed to integrate ordinary and partial differential equations of arbitrary definition. As the form of the equations is not known in advance, the design of the optimizing option is more intricate, but the philosophy could be applied to most simulation packages. In either case, however, the invocation of the optimizing interface is simple and user-oriented. Full details for the use of the optimizing mode for each program are given; specific applications are used as examples. (O.T.)

  4. On the non-stationarity of financial time series: impact on optimal portfolio selection

    International Nuclear Information System (INIS)

    Livan, Giacomo; Inoue, Jun-ichi; Scalas, Enrico

    2012-01-01

    We investigate the possible drawbacks of employing the standard Pearson estimator to measure correlation coefficients between financial stocks in the presence of non-stationary behavior, and we provide empirical evidence against the well-established common knowledge that using longer price time series provides better, more accurate, correlation estimates. Then, we investigate the possible consequences of instabilities in empirical correlation coefficient measurements on optimal portfolio selection. We rely on previously published works which provide a framework allowing us to take into account possible risk underestimations due to the non-optimality of the portfolio weights being used in order to distinguish such non-optimality effects from risk underestimations genuinely due to non-stationarities. We interpret such results in terms of instabilities in some spectral properties of portfolio correlation matrices. (paper)
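    The instability the authors study can be reproduced qualitatively with synthetic data: when the dependence structure changes midway through the series, a Pearson correlation estimated over a longer window is not closer to the correlation of the current regime. A minimal sketch (the two-regime model below is an illustration, not the paper's data):

```python
import math, random

def pearson(xs, ys):
    """Sample Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

rng = random.Random(1)
# synthetic "returns": driven by a common factor in the first regime,
# independent in the second (a simple non-stationarity)
common = [rng.gauss(0, 1) for _ in range(250)]
a = [c + 0.5 * rng.gauss(0, 1) for c in common] + [rng.gauss(0, 1) for _ in range(250)]
b = [c + 0.5 * rng.gauss(0, 1) for c in common] + [rng.gauss(0, 1) for _ in range(250)]

rho_first = pearson(a[:250], b[:250])   # estimated on the first regime only
rho_full = pearson(a, b)                # the longer series mixes both regimes
```

    The full-series estimate is pulled toward zero by the uncorrelated regime, so "more data" here means a worse estimate of the first regime's correlation.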

  5. Selection criteria of residents for residency programs in Kuwait.

    Science.gov (United States)

    Marwan, Yousef; Ayed, Adel

    2013-01-19

    In Kuwait, 21 residency training programs were offered in the year 2011; however, no data is available regarding the criteria of selecting residents for these programs. This study aims to provide information about the importance of these criteria. A self-administered questionnaire was used to collect data from members (e.g., chairmen, directors, assistants, etc.) of residency programs in Kuwait. A total of 108 members were invited to participate. They were asked to rate the importance level (scale from 1 to 5) of criteria that may affect the acceptance of an applicant to their residency programs. Average scores were calculated for each criterion. Of the 108 members invited to participate, only 12 (11.1%) declined. Interview performance was ranked as the most important criterion for selecting residents (average score: 4.63/5.00), followed by grade point average (average score: 3.78/5.00) and honors during medical school (average score: 3.67/5.00). On the other hand, receiving disciplinary action during medical school and failure in a required clerkship were considered the most concerning among the criteria used to reject applicants (average scores: 3.83/5.00 and 3.54/5.00 respectively). Minor differences regarding the importance level of each criterion were noted across different programs. This study provided general information about the criteria that are used to accept/reject applicants to residency programs in Kuwait. Future studies should be conducted to investigate each criterion individually, and to assess if these criteria are related to residents' success during their training.

  6. Evaluation and optimization of LWR fuel cycles

    International Nuclear Information System (INIS)

    Akbas, T.; Zabunoglu, O.; Tombakoglu, M.

    2001-01-01

    There are several options in the back-end of the nuclear fuel cycle. Discharge burn-up, length of interim storage period, choice of direct disposal or recycling and method of reprocessing in case of recycling affect the options and determine/define the fuel cycle scenarios. These options have been evaluated from the viewpoint of some tangible factors (fuel cycle cost, natural uranium requirement, decay heat of high level waste, radiological ingestion and inhalation hazards) and intangible factors (technological feasibility, nonproliferation aspect, etc.). Neutronic parameters are calculated using the versatile fuel depletion code ORIGEN2.1. A program is developed for calculation of cost related parameters. The analytic hierarchy process is used to transform the intangible factors into tangible ones. Then all these tangible and intangible factors are incorporated into a form suitable for goal programming, a linear optimization technique used to determine the optimal option among alternatives. According to the specified objective function and constraints, the optimal fuel cycle scenario is determined using GPSYS (a linear programming software) as a goal programming tool. In addition, a sensitivity analysis is performed for some selected important parameters

  7. Accommodation of practical constraints by a linear programming jet select. [for Space Shuttle

    Science.gov (United States)

    Bergmann, E.; Weiler, P.

    1983-01-01

    An experimental spacecraft control system will be incorporated into the Space Shuttle flight software and exercised during a forthcoming mission to evaluate its performance and handling qualities. The control system incorporates a 'phase space' control law to generate rate change requests and a linear programming jet select to compute jet firings. Posed as a linear programming problem, jet selection must represent the rate change request as a linear combination of jet acceleration vectors where the coefficients are the jet firing times, while minimizing the fuel expended in satisfying that request. This problem is solved in real time using a revised Simplex algorithm. In order to implement the jet selection algorithm in the Shuttle flight control computer, it was modified to accommodate certain practical features of the Shuttle such as limited computer throughput, lengthy firing times, and a large number of control jets. To the authors' knowledge, this is the first such application of linear programming. It was made possible by careful consideration of the jet selection problem in terms of the properties of linear programming and the Simplex algorithm. These modifications to the jet select algorithm may be useful for the design of reaction-controlled spacecraft.
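    Posed this way, jet selection asks for nonnegative firing times t_j such that the jet acceleration vectors combine to the requested rate change at minimum fuel. For a tiny two-axis example, the simplex optimum lies at a basic solution with at most two active jets, so enumerating jet pairs suffices; the jet table below is invented for illustration:

```python
import itertools

# each jet: (acceleration vector per second of firing, fuel cost per second)
# values are made up for this sketch
jets = {
    "F1": ((1.0, 0.0), 1.0),
    "F2": ((0.0, 1.0), 1.0),
    "F3": ((-0.7, 0.7), 1.2),
    "F4": ((0.6, 0.8), 1.1),
}
request = (2.0, 3.0)   # desired rate change about the two axes

def solve_pair(a1, a2, v):
    """Solve t1*a1 + t2*a2 = v by Cramer's rule; None if the pair is singular."""
    det = a1[0] * a2[1] - a1[1] * a2[0]
    if abs(det) < 1e-12:
        return None
    t1 = (v[0] * a2[1] - v[1] * a2[0]) / det
    t2 = (a1[0] * v[1] - a1[1] * v[0]) / det
    return t1, t2

best = None
for (n1, (a1, c1)), (n2, (a2, c2)) in itertools.combinations(jets.items(), 2):
    sol = solve_pair(a1, a2, request)
    if sol and sol[0] >= 0 and sol[1] >= 0:       # firing times must be nonnegative
        fuel = c1 * sol[0] + c2 * sol[1]
        if best is None or fuel < best[0]:
            best = (fuel, {n1: sol[0], n2: sol[1]})

fuel_used, firings = best
```

    A production solver would use the revised simplex method rather than enumeration, but the optimum it finds is exactly such a basic feasible solution.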

  8. Worst-Case Execution Time Based Optimization of Real-Time Java Programs

    DEFF Research Database (Denmark)

    Hepp, Stefan; Schoeberl, Martin

    2012-01-01

    optimization is method inlining. It is especially important for languages, like Java, where small setter and getter methods are considered good programming style. In this paper we present and explore WCET-driven inlining of Java methods. We use the WCET analysis tool for the Java processor JOP to guide

  9. Optimization of hot water transport and distribution networks by analytical method: OPTAL program

    International Nuclear Information System (INIS)

    Barreau, Alain; Caizergues, Robert; Moret-Bailly, Jean

    1977-06-01

    This report presents optimization studies of hot water transport and distribution network by minimizing operating cost. Analytical optimization is used: Lagrange's method of undetermined multipliers. Optimum diameter of each pipe is calculated for minimum network operating cost. The characteristics of the computer program used for calculations, OPTAL, are given in this report. An example of network is calculated and described: 52 branches and 27 customers. Results are discussed [fr

  10. A Semidefinite Programming Based Search Strategy for Feature Selection with Mutual Information Measure.

    Science.gov (United States)

    Naghibi, Tofigh; Hoffmann, Sarah; Pfister, Beat

    2015-08-01

    Feature subset selection, as a special case of the general subset selection problem, has been the topic of a considerable number of studies due to the growing importance of data-mining applications. In the feature subset selection problem there are two main issues that need to be addressed: (i) Finding an appropriate measure function that can be fairly fast and robustly computed for high-dimensional data. (ii) A search strategy to optimize the measure over the subset space in a reasonable amount of time. In this article mutual information between features and class labels is considered to be the measure function. Two series expansions for mutual information are proposed, and it is shown that most heuristic criteria suggested in the literature are truncated approximations of these expansions. It is well-known that searching the whole subset space is an NP-hard problem. Here, instead of the conventional sequential search algorithms, we suggest a parallel search strategy based on semidefinite programming (SDP) that can search through the subset space in polynomial time. By exploiting the similarities between the proposed algorithm and an instance of the maximum-cut problem in graph theory, the approximation ratio of this algorithm is derived and is compared with the approximation ratio of the backward elimination method. The experiments show that it can be misleading to judge the quality of a measure solely based on the classification accuracy, without taking the effect of the non-optimum search strategy into account.
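    The SDP search itself is too involved for a short sketch, but the measure function, mutual information between a discrete feature and the class labels, is compact; the toy feature vectors below are illustrative:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """I(X;Y) in bits, estimated from paired discrete samples."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    mi = 0.0
    for (x, y), c in pxy.items():
        pj = c / n
        # pj * log2( p(x,y) / (p(x) p(y)) ), with counts folded in
        mi += pj * math.log2(pj * n * n / (px[x] * py[y]))
    return mi

# a feature identical to the label carries maximal information;
# a constant feature carries none
labels = [0, 0, 1, 1, 0, 1, 0, 1]
f_good = labels[:]                  # perfectly informative feature
f_const = [0] * len(labels)         # uninformative feature

scores = {"good": mutual_information(f_good, labels),
          "const": mutual_information(f_const, labels)}
```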

  11. Optimizing selective cutting strategies for maximum carbon stocks and yield of Moso bamboo forest using BIOME-BGC model.

    Science.gov (United States)

    Mao, Fangjie; Zhou, Guomo; Li, Pingheng; Du, Huaqiang; Xu, Xiaojun; Shi, Yongjun; Mo, Lufeng; Zhou, Yufeng; Tu, Guoqing

    2017-04-15

    The selective cutting method currently used in Moso bamboo forests has resulted in a reduction of stand productivity and carbon sequestration capacity. Given the time and labor expense involved in addressing this problem manually, simulation using an ecosystem model is the most suitable approach. The BIOME-BGC model was improved to suit managed Moso bamboo forests, adapting it to include the age structure, specific ecological processes and management measures of Moso bamboo forests. A field selective cutting experiment was conducted in nine plots with three cutting intensities (high-intensity, moderate-intensity and low-intensity) during 2010-2013, and biomass of these plots was measured for model validation. Then four selective cutting scenarios were simulated by the improved BIOME-BGC model to optimize the selective cutting timings, intervals, retained ages and intensities. The improved model matched the observed aboveground carbon density and yield of different plots, with a range of relative error from 9.83% to 15.74%. The results of different selective cutting scenarios suggested that the optimal selective cutting measure should be cutting 30% of culms of age 6, 80% of culms of age 7, and all culms thereafter (above age 8) in winter every other year. The vegetation carbon density and harvested carbon density of this selective cutting method can increase by 74.63% and 21.5%, respectively, compared with the current selective cutting measure. The optimized selective cutting measure developed in this study can significantly promote carbon density, yield, and carbon sink capacity in Moso bamboo forests. Copyright © 2017 Elsevier Ltd. All rights reserved.

  12. Meeting the challenges with the Douglas Aircraft Company Aeroelastic Design Optimization Program (ADOP)

    Science.gov (United States)

    Rommel, Bruce A.

    1989-01-01

    An overview of the Aeroelastic Design Optimization Program (ADOP) at the Douglas Aircraft Company is given. A pilot test program involving the animation of mode shapes with solid rendering as well as wire frame displays, a complete aircraft model of a high-altitude hypersonic aircraft to test ADOP procedures, a flap model, and an aero-mesh modeler for doublet lattice aerodynamics are discussed.

  13. Optimal load suitability based RAT selection for HSDPA and IEEE 802.11e

    DEFF Research Database (Denmark)

    Prasad, Ramjee; Cabral, O.; Felez, F.J.

    2009-01-01

    are a premium. This paper investigates cooperation between networks based Radio Access Technology (RAT) selection algorithm that uses suitability to optimize the choice between WiFi and High Speed Downlink Packet Access (HSDPA). It has been shown that this approach has the potential to provide gain...... by allocating a user terminal to the most preferred network based on traffic type and network load. Optimal load threshold values that maximise the total QoS throughput for the given interworking scenario are 0.6 and 0.53 for HSDPA and WiFi, respectively. This corresponds to a CRRM gain on throughput of 80...

  14. Optimizing antiretroviral product selection: a sample approach to improving patient outcomes, saving money, and scaling-up health services in developing countries.

    Science.gov (United States)

    Amole, Carolyn D; Brisebois, Catherine; Essajee, Shaffiq; Koehler, Erin; Levin, Andrew D; Moore, Meredith C; Brown Ripin, David H; Sickler, Joanna J; Singh, Inder R

    2011-08-01

    Over the last decade, increased funding to support HIV treatment programs has enabled millions of new patients in developing countries to access the medications they need. Today, although demand for antiretrovirals continues to grow, the financial crisis has severely constrained funding, leaving countries with difficult choices on program prioritization. Product optimization is one solution countries can pursue to continue to improve patient care while also uncovering savings that can be used for further scale-up or other health system needs. Program managers can reduce program costs by considering additional factors beyond World Health Organization guidelines when making procurement decisions. These include in-country product availability, convenience, price, and logistics such as supply chain implications and laboratory testing requirements. Three immediate product selection opportunities in the HIV space include using boosted atazanavir in place of lopinavir for second-line therapy, lamivudine instead of emtricitabine in both first-line and second-line therapy, and tenofovir + lamivudine over abacavir + didanosine in second-line therapy. If these 3 opportunities were broadly implemented in sub-Saharan Africa and India today, approximately $300 million of savings would be realized over the next 5 years, enabling hundreds of thousands of additional patients to be treated. Although the discussion herein is specific to antiretrovirals, the principles of product selection are generalizable to diseases with multiple treatment options and fungible commodity procurement. Identifying and implementing approaches to overcome health system inefficiencies will help sustain and may expand quality care in resource-limited settings.

  15. Analysis and design optimization of flexible pavement

    Energy Technology Data Exchange (ETDEWEB)

    Mamlouk, M.S.; Zaniewski, J.P.; He, W.

    2000-04-01

    A project-level optimization approach was developed to minimize total pavement cost within an analysis period. Using this approach, the designer is able to select the optimum initial pavement thickness, overlay thickness, and overlay timing. The model in this approach is capable of predicting both pavement performance and condition in terms of roughness, fatigue cracking, and rutting. The developed model combines the American Association of State Highway and Transportation Officials (AASHTO) design procedure and the mechanistic multilayer elastic solution. The Optimization for Pavement Analysis (OPA) computer program was developed using the prescribed approach. The OPA program incorporates the AASHTO equations, the multilayer elastic system ELSYM5 model, and the nonlinear dynamic programming optimization technique. The program is PC-based and can run in either a Windows 3.1 or a Windows 95 environment. Using the OPA program, a typical pavement section was analyzed under different traffic volumes and material properties. The optimum design strategy that produces the minimum total pavement cost in each case was determined. The initial construction cost, overlay cost, highway user cost, and total pavement cost were also calculated. The methodology developed during this research should lead to more cost-effective pavements for agencies adopting the recommended analysis methods.

  16. Optimal Electrode Selection for Electrical Resistance Tomography in Carbon Fiber Reinforced Polymer Composites

    Science.gov (United States)

    Escalona Galvis, Luis Waldo; Diaz-Montiel, Paulina; Venkataraman, Satchi

    2017-01-01

    Electrical Resistance Tomography (ERT) offers a non-destructive evaluation (NDE) technique that takes advantage of the inherent electrical properties in carbon fiber reinforced polymer (CFRP) composites for internal damage characterization. This paper investigates a method of optimum selection of sensing configurations for delamination detection in thick cross-ply laminates using ERT. Reduction in the number of sensing locations and measurements is necessary to minimize hardware and computational effort. The present work explores the use of an effective independence (EI) measure originally proposed for sensor location optimization in experimental vibration modal analysis. The EI measure is used for selecting the minimum set of resistance measurements among all possible combinations resulting from selecting sensing electrode pairs. Singular Value Decomposition (SVD) is applied to obtain a spectral representation of the resistance measurements in the laminate for subsequent EI based reduction to take place. The electrical potential field in a CFRP laminate is calculated using finite element analysis (FEA) applied on models for two different laminate layouts considering a set of specified delamination sizes and locations with two different sensing arrangements. The effectiveness of the EI measure in eliminating redundant electrode pairs is demonstrated by performing inverse identification of damage using the full set and the reduced set of resistance measurements. This investigation shows that the EI measure is effective for optimally selecting the electrode pairs needed for resistance measurements in ERT based damage detection. PMID:28772485
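    The EI measure ranks candidate sensing locations by their contribution to the independence of the retained mode (or singular vector) columns, then iteratively prunes the lowest-ranked location. A minimal sketch for a two-column mode matrix, with invented values:

```python
# candidate sensing locations (rows) x retained modes/singular vectors (cols);
# values are illustrative, not from an actual CFRP model
PHI = [
    [1.0, 0.1],
    [0.8, -0.5],
    [0.1, 1.0],
    [0.05, 0.05],   # nearly uninformative location
]

def effective_independence(phi):
    """EI per row: diag(Phi (Phi^T Phi)^-1 Phi^T), for a 2-column Phi."""
    a = sum(r[0] * r[0] for r in phi)
    b = sum(r[0] * r[1] for r in phi)
    d = sum(r[1] * r[1] for r in phi)
    det = a * d - b * b                  # 2x2 inverse of Phi^T Phi
    inv = ((d / det, -b / det), (-b / det, a / det))
    out = []
    for x, y in phi:
        gx = inv[0][0] * x + inv[0][1] * y
        gy = inv[1][0] * x + inv[1][1] * y
        out.append(x * gx + y * gy)
    return out

def prune(phi, keep):
    """Iteratively drop the row contributing least independence until `keep` remain."""
    rows = list(phi)
    while len(rows) > keep:
        ei = effective_independence(rows)
        rows.pop(ei.index(min(ei)))
    return rows

kept = prune(PHI, 2)   # the near-zero row is eliminated first
```

    The EI values always sum to the number of retained columns (the trace of a projection matrix), which is a useful sanity check.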

  17. Optimization methods applied to hybrid vehicle design

    Science.gov (United States)

    Donoghue, J. F.; Burghart, J. H.

    1983-01-01

    The use of optimization methods as an effective design tool in the design of hybrid vehicle propulsion systems is demonstrated. Optimization techniques were used to select values for three design parameters (battery weight, heat engine power rating and power split between the two on-board energy sources) such that various measures of vehicle performance (acquisition cost, life cycle cost and petroleum consumption) were optimized. The approach produced designs which were often significant improvements over hybrid designs already reported on in the literature. The principal conclusions are as follows. First, it was found that the strategy used to split the required power between the two on-board energy sources can have a significant effect on life cycle cost and petroleum consumption. Second, the optimization program should be constructed so that performance measures and design variables can be easily changed. Third, the vehicle simulation program has a significant effect on the computer run time of the overall optimization program; run time can be significantly reduced by proper design of the types of trips the vehicle takes in a one year period. Fourth, care must be taken in designing the cost and constraint expressions which are used in the optimization so that they are relatively smooth functions of the design variables. Fifth, proper handling of constraints on battery weight and heat engine rating, variables which must be large enough to meet power demands, is particularly important for the success of an optimization study. Finally, the principal conclusion is that optimization methods provide a practical tool for carrying out the design of a hybrid vehicle propulsion system.
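    The setup described, three design variables optimized against a performance measure under a power-demand constraint, can be sketched with a coarse grid search; the cost surrogate and every coefficient below are invented for illustration, not the paper's vehicle model:

```python
import itertools

POWER_DEMAND = 80.0   # kW required at the wheels (assumed)

def life_cycle_cost(battery_kg, engine_kw, split):
    """Toy cost surrogate: acquisition plus fuel, penalizing heavy batteries."""
    acquisition = 20.0 * battery_kg + 90.0 * engine_kw
    fuel = 4000.0 * split                  # more engine use -> more petroleum
    mass_penalty = 3.0 * battery_kg        # extra mass raises consumption
    return acquisition + fuel + mass_penalty

def feasible(battery_kg, engine_kw, split):
    """The chosen power split must cover the demand (0.3 kW/kg assumed)."""
    battery_kw = 0.3 * battery_kg
    return split * engine_kw + (1 - split) * battery_kw >= POWER_DEMAND

candidates = itertools.product(
    range(50, 501, 25),                  # battery weight, kg
    range(20, 121, 5),                   # heat engine rating, kW
    [i / 10 for i in range(0, 11)],      # fraction of demand met by the engine
)
best = min((c for c in candidates if feasible(*c)),
           key=lambda c: life_cycle_cost(*c))
```

    With these made-up coefficients the optimizer pushes the power split toward the battery; changing the fuel weight flips the conclusion, which echoes the paper's point that the split strategy drives life cycle cost.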

  18. A fast inverse treatment planning strategy facilitating optimized catheter selection in image-guided high-dose-rate interstitial gynecologic brachytherapy.

    Science.gov (United States)

    Guthier, Christian V; Damato, Antonio L; Hesser, Juergen W; Viswanathan, Akila N; Cormack, Robert A

    2017-12-01

    Interstitial high-dose rate (HDR) brachytherapy is an important therapeutic strategy for the treatment of locally advanced gynecologic (GYN) cancers. The outcome of this therapy is determined by the quality of dose distribution achieved. This paper focuses on a novel yet simple heuristic for catheter selection for GYN HDR brachytherapy and its comparison against state-of-the-art optimization strategies. The proposed technique is intended to act as a decision-supporting tool to select a favorable needle configuration. The presented heuristic for catheter optimization is based on a shrinkage-type algorithm (SACO). It is compared against state-of-the-art planning in a retrospective study of 20 patients who previously received image-guided interstitial HDR brachytherapy using a Syed Neblett template. From those plans, template orientation and position are estimated via a rigid registration of the template with the actual catheter trajectories. All potential straight trajectories intersecting the contoured clinical target volume (CTV) are considered for catheter optimization. Retrospectively generated plans and clinical plans are compared with respect to dosimetric performance and optimization time. All plans were generated with one single run of the optimizer lasting 0.6-97.4 s. Compared to manual optimization, SACO yields a statistically significant (P ≤ 0.05) improved target coverage while at the same time fulfilling all dosimetric constraints for organs at risk (OARs). Comparing inverse planning strategies, dosimetric evaluation for SACO and "hybrid inverse planning and optimization" (HIPO), as gold standard, shows no statistically significant difference (P > 0.05). However, SACO provides the potential to reduce the number of used catheters without compromising plan quality. The proposed heuristic for needle selection provides fast catheter selection with optimization times suited for intraoperative treatment planning. Compared to manual optimization, the

  19. Efficient C/C++ programming smaller, faster, better

    CERN Document Server

    Heller, Steve

    1994-01-01

    Efficient C/C++ Programming describes a practical, real-world approach to efficient C/C++ programming. Topics covered range from saving storage by using a restricted character set to speeding up access to records by employing hash coding and caching. A selective mailing list system is used to illustrate rapid access to and rearrangement of information selected by criteria specified at runtime. Comprised of eight chapters, this book begins by discussing factors to consider when deciding whether a program needs optimization. In the next chapter, a supermarket price lookup system is used to

  20. Analysis of multicriteria models application for selection of an optimal artificial lift method in oil production

    Directory of Open Access Journals (Sweden)

    Crnogorac Miroslav P.

    2016-01-01

    Full Text Available Around the world, oil reservoirs are exploited by artificial lift methods using different types of deep pumps (piston, centrifugal, screw, hydraulic and water jet pumps) and gas lift (continuous, intermittent and plunger). Maximum oil production rates achieved by these exploitation methods differ significantly. In order to select the optimal exploitation method for an oil well, multicriteria analysis models are used. This paper presents an analysis of the application of the multicriteria models known as VIKOR, TOPSIS, ELECTRE, AHP and PROMETHEE to the selection of the optimal exploitation method for a typical oil well in the Serbian exploration area. The ranking results for deep piston pumps, hydraulic pumps, screw pumps, the gas lift method and electric submersible centrifugal pumps indicated that in all the above multicriteria models except PROMETHEE, the optimal methods of exploitation are deep piston pumps and gas lift.

  1. Local beam angle optimization with linear programming and gradient search

    International Nuclear Information System (INIS)

    Craft, David

    2007-01-01

    The optimization of beam angles in IMRT planning is still an open problem, with literature focusing on heuristic strategies and exhaustive searches on discrete angle grids. We show how a beam angle set can be locally refined in a continuous manner using gradient-based optimization in the beam angle space. The gradient is derived using linear programming duality theory. Applying this local search to 100 random initial angle sets of a phantom pancreatic case demonstrates the method, and highlights the many-local-minima aspect of the BAO problem. Due to this function structure, we recommend a search strategy of a thorough global search followed by local refinement at promising beam angle sets. Extensions to nonlinear IMRT formulations are discussed. (note)
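    The many-local-minima behavior that motivates the recommended global-then-local strategy can be reproduced on a toy one-dimensional objective. The sketch below uses plain numerical-gradient descent on a made-up multimodal function; it is an illustration only, not the paper's LP-duality gradient:

```python
import math

def local_refine(f, x0, step=0.01, iters=500, h=1e-5):
    """Gradient descent using a central-difference numerical gradient."""
    x = x0
    for _ in range(iters):
        grad = (f(x + h) - f(x - h)) / (2 * h)
        x -= step * grad
    return x

# Made-up multimodal surrogate for plan quality as a function of beam angle
f = lambda x: math.sin(3 * x) + 0.1 * x * x

# Refining from different starting angles lands in different local minima,
# which is why a thorough global search should precede local refinement.
starts = [-2.0, 0.0, 2.0]
minima = sorted(round(local_refine(f, s), 1) for s in starts)
print(minima)  # three distinct local minima
```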

  2. Selecting Operations for Assembler Encoding

    Directory of Open Access Journals (Sweden)

    Tomasz Praczyk

    2010-04-01

    Full Text Available Assembler Encoding is a neuro-evolutionary method in which a neural network is represented in the form of a simple program called Assembler Encoding Program. The task of the program is to create the so-called Network Definition Matrix which maintains all the information necessary to construct the network. To generate Assembler Encoding Programs and the subsequent neural networks, evolutionary techniques are used.
    The performance of Assembler Encoding strongly depends on operations used in Assembler Encoding Programs. To select the most effective operations, experiments in the optimization and the predator-prey problem were carried out. In the experiments, Assembler Encoding Programs equipped with different types of operations were tested. The results of the tests are presented at the end of the paper.

  3. Dynamic programming approach for partial decision rule optimization

    KAUST Repository

    Amin, Talha

    2012-10-04

    This paper is devoted to the study of an extension of the dynamic programming approach which allows optimization of partial decision rules relative to length or coverage. We introduce an uncertainty measure J(T) which is the difference between the number of rows in a decision table T and the number of rows with the most common decision for T. For a nonnegative real number γ, we consider γ-decision rules (partial decision rules) that localize rows in subtables of T with uncertainty at most γ. The presented algorithm constructs a directed acyclic graph Δγ(T) whose nodes are subtables of the decision table T given by systems of equations of the kind "attribute = value". This algorithm finishes the partitioning of a subtable when its uncertainty is at most γ. The graph Δγ(T) allows us to describe the whole set of so-called irredundant γ-decision rules. We can optimize such a set of rules according to length or coverage. This paper also contains results of experiments with decision tables from the UCI Machine Learning Repository.
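    The uncertainty measure J(T) described above is simple to compute: it is the number of rows that do not carry the table's most common decision. A minimal sketch in Python (the decision table is a made-up toy example, not from the paper):

```python
from collections import Counter

def uncertainty(table):
    """J(T): number of rows in T minus rows carrying the most common decision."""
    decisions = [row[-1] for row in table]  # decision is the last column
    most_common_count = Counter(decisions).most_common(1)[0][1]
    return len(table) - most_common_count

# Toy decision table: rows of (attribute values..., decision)
T = [
    (0, 1, "yes"),
    (0, 0, "yes"),
    (1, 1, "no"),
    (1, 0, "yes"),
]
print(uncertainty(T))  # 4 rows, 3 share "yes", so J(T) = 1
```

A subtable with J(T) ≤ γ is one the algorithm stops partitioning.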

  4. Dynamic programming approach for partial decision rule optimization

    KAUST Repository

    Amin, Talha M.; Chikalov, Igor; Moshkov, Mikhail; Zielosko, Beata

    2012-01-01

    This paper is devoted to the study of an extension of the dynamic programming approach which allows optimization of partial decision rules relative to length or coverage. We introduce an uncertainty measure J(T) which is the difference between the number of rows in a decision table T and the number of rows with the most common decision for T. For a nonnegative real number γ, we consider γ-decision rules (partial decision rules) that localize rows in subtables of T with uncertainty at most γ. The presented algorithm constructs a directed acyclic graph Δγ(T) whose nodes are subtables of the decision table T given by systems of equations of the kind "attribute = value". This algorithm finishes the partitioning of a subtable when its uncertainty is at most γ. The graph Δγ(T) allows us to describe the whole set of so-called irredundant γ-decision rules. We can optimize such a set of rules according to length or coverage. This paper also contains results of experiments with decision tables from the UCI Machine Learning Repository.

  5. Optimizing a mobile robot control system using GPU acceleration

    Science.gov (United States)

    Tuck, Nat; McGuinness, Michael; Martin, Fred

    2012-01-01

    This paper describes our attempt to optimize a robot control program for the Intelligent Ground Vehicle Competition (IGVC) by running computationally intensive portions of the system on a commodity graphics processing unit (GPU). The IGVC Autonomous Challenge requires a control program that performs a number of different computationally intensive tasks ranging from computer vision to path planning. For the 2011 competition our Robot Operating System (ROS) based control system would not run comfortably on the multicore CPU on our custom robot platform. The process of profiling the ROS control program and selecting appropriate modules for porting to run on a GPU is described. A GPU-targeting compiler, Bacon, is used to speed up development and help optimize the ported modules. The impact of the ported modules on overall performance is discussed. We conclude that GPU optimization can free a significant amount of CPU resources with minimal effort for expensive user-written code, but that replacing heavily-optimized library functions is more difficult, and a much less efficient use of time.

  6. Bandgap optimization of two-dimensional photonic crystals using semidefinite programming and subspace methods

    International Nuclear Information System (INIS)

    Men, H.; Nguyen, N.C.; Freund, R.M.; Parrilo, P.A.; Peraire, J.

    2010-01-01

    In this paper, we consider the optimal design of photonic crystal structures for two-dimensional square lattices. The mathematical formulation of the bandgap optimization problem leads to an infinite-dimensional Hermitian eigenvalue optimization problem parametrized by the dielectric material and the wave vector. To make the problem tractable, the original eigenvalue problem is discretized using the finite element method into a series of finite-dimensional eigenvalue problems for multiple values of the wave vector parameter. The resulting optimization problem is large-scale and non-convex, with low regularity and non-differentiable objective. By restricting to appropriate eigenspaces, we reduce the large-scale non-convex optimization problem via reparametrization to a sequence of small-scale convex semidefinite programs (SDPs) for which modern SDP solvers can be efficiently applied. Numerical results are presented for both transverse magnetic (TM) and transverse electric (TE) polarizations at several frequency bands. The optimized structures exhibit patterns which go far beyond typical physical intuition on periodic media design.

  7. A New Methodology to Select the Preferred Solutions from the Pareto-optimal Set: Application to Polymer Extrusion

    International Nuclear Information System (INIS)

    Ferreira, Jose C.; Gaspar-Cunha, Antonio; Fonseca, Carlos M.

    2007-01-01

    Most of the real world optimization problems involve multiple, usually conflicting, optimization criteria. Generating Pareto optimal solutions plays an important role in multi-objective optimization, and the problem is considered to be solved when the Pareto optimal set is found, i.e., the set of non-dominated solutions. Multi-Objective Evolutionary Algorithms based on the principle of Pareto optimality are designed to produce the complete set of non-dominated solutions. However, this is not always enough, since the aim is not only to know the Pareto set but also to obtain one solution from this Pareto set. Thus, the definition of a methodology able to select a single solution from the set of non-dominated solutions (or a region of the Pareto frontier), taking into account the preferences of a Decision Maker (DM), is necessary. A different method, based on a weighted stress function, is proposed. It is able to integrate the user's preferences in order to find the best region of the Pareto frontier according to these preferences. This method was tested on some benchmark test problems, with two and three criteria, and on a polymer extrusion problem. The methodology is able to efficiently select the best Pareto-frontier region for the specified relative importance of the criteria.
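    As background for the selection step, the non-dominated set itself can be extracted from a finite set of candidate solutions in a few lines. This is a generic minimization sketch; it does not reproduce the paper's weighted stress function:

```python
def dominates(a, b):
    """a dominates b: no worse in every criterion, strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_set(points):
    """Keep only the candidates not dominated by any other candidate."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Toy two-criteria candidates (both criteria minimized)
candidates = [(1.0, 5.0), (2.0, 3.0), (4.0, 1.0), (3.0, 4.0), (5.0, 5.0)]
print(pareto_set(candidates))  # (3.0, 4.0) and (5.0, 5.0) are dominated
```

A decision-maker preference method such as the one proposed then picks a single solution (or region) from this set.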

  8. Green supplier development program selection using NGT and VIKOR under fuzzy environment

    DEFF Research Database (Denmark)

    Awasthi, Anjali; Govindan, Kannan

    2016-01-01

    Developing environmental performance of suppliers is critical for green supply chain management. Organizations are nowadays investing in various green supplier development programs to enhance their supplier performances. The decision to select the right program for green supplier development is often a challenging decision due to lack of prior experience, limited quantitative information, specific context of the organization, and varying supplier backgrounds. This paper addresses the problem of evaluating green supplier development programs and proposes a fuzzy NGT (Nominal Group Technique)-VIKOR (VlseKriterijumska Optimizacija I Kompromisno Resenje) based solution approach. NGT is used to identify criteria for evaluating green supplier development programs. Fuzzy theory is used to address qualitative (linguistic) ratings for the alternatives and the selected criteria used under lack...

  9. The optimal hormonal replacement modality selection for multiple organ procurement from brain-dead organ donors

    Directory of Open Access Journals (Sweden)

    Mi Z

    2014-12-01

    Full Text Available Zhibao Mi,1 Dimitri Novitzky,2 Joseph F Collins,1 David KC Cooper3 1Cooperative Studies Program Coordinating Center, VA Maryland Health Care Systems, Perry Point, MD, USA; 2Department of Cardiothoracic Surgery, University of South Florida, Tampa, FL, USA; 3Thomas E Starzl Transplantation Institute, University of Pittsburgh, Pittsburgh, PA, USA Abstract: The management of brain-dead organ donors is complex. The use of inotropic agents and replacement of depleted hormones (hormonal replacement therapy) is crucial for successful multiple organ procurement, yet the optimal hormonal replacement has not been identified, and the statistical adjustment to determine the best selection is not trivial. Traditional pair-wise comparisons between every pair of treatments, and multiple comparisons to all (MCA), are statistically conservative. Hsu's multiple comparisons with the best (MCB) – adapted from Dunnett's multiple comparisons with control (MCC) – has been used for selecting the best treatment based on continuous variables. We selected the best hormonal replacement modality for successful multiple organ procurement using a two-step approach. First, we estimated the predicted margins by constructing generalized linear models (GLM) or generalized linear mixed models (GLMM), and then we applied the multiple comparison methods to identify the best hormonal replacement modality given that the testing of hormonal replacement modalities is independent. Based on 10-year data from the United Network for Organ Sharing (UNOS), among 16 hormonal replacement modalities, and using the 95% simultaneous confidence intervals, we found that the combination of thyroid hormone, a corticosteroid, antidiuretic hormone, and insulin was the best modality for multiple organ procurement for transplantation. Keywords: best treatment selection, brain-dead organ donors, hormonal replacement, multiple binary endpoints, organ procurement, multiple comparisons

  10. Policy Iteration for H∞ Optimal Control of Polynomial Nonlinear Systems via Sum of Squares Programming.

    Science.gov (United States)

    Zhu, Yuanheng; Zhao, Dongbin; Yang, Xiong; Zhang, Qichao

    2018-02-01

    Sum of squares (SOS) polynomials have provided a computationally tractable way to deal with inequality constraints appearing in many control problems. It can also act as an approximator in the framework of adaptive dynamic programming. In this paper, an approximate solution to the optimal control of polynomial nonlinear systems is proposed. Under a given attenuation coefficient, the Hamilton-Jacobi-Isaacs equation is relaxed to an optimization problem with a set of inequalities. After applying the policy iteration technique and constraining inequalities to SOS, the optimization problem is divided into a sequence of feasible semidefinite programming problems. With the converged solution, the attenuation coefficient is further minimized to a lower value. After iterations, approximate solutions to the smallest L2-gain and the associated optimal controller are obtained. Four examples are employed to verify the effectiveness of the proposed algorithm.

  11. Optimization of radioactive waste management system by application of multiobjective linear programming

    International Nuclear Information System (INIS)

    Shimizu, Yoshiaki

    1981-01-01

    A mathematical procedure is proposed to make a radioactive waste management plan comprehensively. Since such planning is relevant to some different goals in management, decision making has to be formulated as a multiobjective optimization problem. A mathematical programming method was introduced to make a decision through an interactive manner which enables us to assess the preference of the decision maker step by step among the conflicting objectives. The reference system taken as an example is the radioactive waste management system at the Research Reactor Institute of Kyoto University (KUR). Its linear model was built based on the experience in the actual management at KUR. The best-compromise model was then formulated as a multiobjective linear programming by the aid of the computational analysis through a conventional optimization. It was shown from the numerical results that the proposed approach could provide useful information for making an actual management plan. (author)

  12. An approach to optimization of the choice of boiler steel grades as to a mixed-integer programming problem

    International Nuclear Information System (INIS)

    Kler, Alexandr M.; Potanina, Yulia M.

    2017-01-01

    One of the ways to enhance the energy efficiency of thermal power plants is to increase thermodynamic parameters of steam. A sufficient level of reliability and longevity can be provided by the application of advanced construction materials (in particular, high-alloy steel can be used to manufacture the most loaded heating surfaces of a boiler unit). A rational choice of technical and economic parameters of energy plants as the most complex technical systems should be made using the methods of mathematical modeling and optimization. The paper considers an original approach to an economically sound optimal choice of steel grade to manufacture heating surfaces for boiler units. A case study of optimization of the discrete-continuous parameters of an energy unit operating at ultra-supercritical steam parameters, in combination with construction of a variant selection tree is presented. - Highlights: • A case study on optimization of an ultra-supercritical power plant is demonstrated. • Optimization is based on the minimization of electricity price. • An approach is proposed to optimize the selection of boiler steel grades. • The approach is based on the construction of a variant tree. • The selection of steel grades for a boiler unit is shown.

  13. An ant colony optimization based feature selection for web page classification.

    Science.gov (United States)

    Saraç, Esra; Özel, Selma Ayşe

    2014-01-01

    The increased popularity of the web has caused the inclusion of huge amount of information to the web, and as a result of this explosive information growth, automated web page classification systems are needed to improve search engines' performance. Web pages have a large number of features such as HTML/XML tags, URLs, hyperlinks, and text contents that should be considered during an automated classification process. The aim of this study is to reduce the number of features to be used to improve runtime and accuracy of the classification of web pages. In this study, we used an ant colony optimization (ACO) algorithm to select the best features, and then we applied the well-known C4.5, naive Bayes, and k nearest neighbor classifiers to assign class labels to web pages. We used the WebKB and Conference datasets in our experiments, and we showed that using the ACO for feature selection improves both accuracy and runtime performance of classification. We also showed that the proposed ACO based algorithm can select better features with respect to the well-known information gain and chi square feature selection methods.
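    The pheromone mechanism behind ACO-based feature selection can be sketched in miniature. The score function, constants, and inclusion rule below are all invented for illustration; they do not reproduce the authors' algorithm or the C4.5/naive Bayes evaluation step:

```python
import random

def aco_feature_selection(n_features, score, n_ants=10, iters=30, rho=0.2, q=1.0):
    """Toy ant colony optimization over feature subsets.
    Each ant includes feature j with probability pheromone[j] / (1 + pheromone[j])."""
    pheromone = [1.0] * n_features
    best_subset, best_score = [], float("-inf")
    for _ in range(iters):
        for _ in range(n_ants):
            subset = [j for j in range(n_features)
                      if random.random() < pheromone[j] / (1.0 + pheromone[j])]
            s = score(subset)
            if s > best_score:
                best_subset, best_score = subset, s
            # Evaporate everywhere, then deposit on the features this ant used
            for j in range(n_features):
                pheromone[j] *= 1.0 - rho
            for j in subset:
                pheromone[j] += q * max(s, 0.0)
    return sorted(best_subset), best_score

# Toy evaluator standing in for classifier accuracy: features 0 and 3 are
# informative, and every selected feature costs 0.1 (favoring small subsets)
informative = {0, 3}
score = lambda subset: len(informative & set(subset)) - 0.1 * len(subset)

random.seed(1)
selected, best = aco_feature_selection(6, score)
print(selected, best)
```

In a real setup the score would be the cross-validated accuracy of the downstream classifier on the candidate subset.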

  14. Optimization programs of radiation protection applied to post-graduation and encouraging research

    International Nuclear Information System (INIS)

    Levy, Denise S.; Sordi, Gian Maria A.A.

    2013-01-01

    In 2011 we started the automation and integration of radiological protection optimization programs, in order to offer unified programs and inter-related information in Portuguese, providing Brazilian radioactive facilities a complete repository for research, consultation and information. The authors of this project extended it to postgraduate education, in order to encourage postgraduate students researches, expanding methods for enhancing student learning through the use of different combined resources, such as educational technology, information technology and group dynamics. This new methodology was applied in a postgraduate discipline at Instituto de Pesquisas Energeticas e Nucleares (IPEN), Brazil, in the postgraduate discipline entitled Fundamental Elements of Radiological Protection (TNA-5732). Students have six weeks to assimilate a complex content of optimization, considering national and international standards, guidelines and recommendations published by different organizations over the past decades. Unlike traditional classes, in which students receive prompt responses, this new methodology stimulates discussion, encouraging collective thinking processes and promoting ongoing personal reflection and researches. Case-oriented problem-solving permitted students to play different roles, promoting whole-group discussions and cooperative learning, approaching theory and practical applications. Students discussed different papers, published in international conferences, and their implications according to current standards. The automation of optimization programs was essential as a research tool during the course. The results of this experience were evaluated in two consecutive years. We had excellent results compared to the previous 14 years. The methodology has exceeded expectations and will be also applied in 2013 to ionizing radiation monitoring postgraduate classes. (author)

  15. Statistical mechanical analysis of linear programming relaxation for combinatorial optimization problems

    Science.gov (United States)

    Takabe, Satoshi; Hukushima, Koji

    2016-05-01

    Typical behavior of the linear programming (LP) problem is studied as a relaxation of the minimum vertex cover (min-VC), a type of integer programming (IP) problem. A lattice-gas model on the Erdös-Rényi random graphs of α -uniform hyperedges is proposed to express both the LP and IP problems of the min-VC in the common statistical mechanical model with a one-parameter family. Statistical mechanical analyses reveal for α =2 that the LP optimal solution is typically equal to that given by the IP below the critical average degree c =e in the thermodynamic limit. The critical threshold for good accuracy of the relaxation extends the mathematical result c =1 and coincides with the replica symmetry-breaking threshold of the IP. The LP relaxation for the minimum hitting sets with α ≥3 , minimum vertex covers on α -uniform random graphs, is also studied. Analytic and numerical results strongly suggest that the LP relaxation fails to estimate optimal values above the critical average degree c =e /(α -1 ) where the replica symmetry is broken.

  16. Statistical mechanical analysis of linear programming relaxation for combinatorial optimization problems.

    Science.gov (United States)

    Takabe, Satoshi; Hukushima, Koji

    2016-05-01

    Typical behavior of the linear programming (LP) problem is studied as a relaxation of the minimum vertex cover (min-VC), a type of integer programming (IP) problem. A lattice-gas model on the Erdös-Rényi random graphs of α-uniform hyperedges is proposed to express both the LP and IP problems of the min-VC in the common statistical mechanical model with a one-parameter family. Statistical mechanical analyses reveal for α=2 that the LP optimal solution is typically equal to that given by the IP below the critical average degree c=e in the thermodynamic limit. The critical threshold for good accuracy of the relaxation extends the mathematical result c=1 and coincides with the replica symmetry-breaking threshold of the IP. The LP relaxation for the minimum hitting sets with α≥3, minimum vertex covers on α-uniform random graphs, is also studied. Analytic and numerical results strongly suggest that the LP relaxation fails to estimate optimal values above the critical average degree c=e/(α-1) where the replica symmetry is broken.
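    The LP/IP gap discussed above can be seen on the smallest interesting graph. The sketch below relies on the half-integrality of the vertex-cover LP (there is always an optimal solution with values in {0, 1/2, 1}), so for tiny graphs both optima can be found by brute force without an LP solver; a real study would of course use one:

```python
from itertools import product

def lp_and_ip_optima(n, edges):
    """Return (LP optimum, IP optimum) of min vertex cover by brute force.
    The LP search uses half-integrality: some optimum lies in {0, 1/2, 1}^n."""
    def best(levels):
        opt = float("inf")
        for x in product(levels, repeat=n):
            if all(x[u] + x[v] >= 1 for u, v in edges):
                opt = min(opt, sum(x))
        return opt
    return best((0, 0.5, 1)), best((0, 1))

# Triangle: the LP assigns 1/2 everywhere (cost 1.5), while the IP needs
# two whole vertices (cost 2) -- the relaxation underestimates the optimum.
print(lp_and_ip_optima(3, [(0, 1), (1, 2), (0, 2)]))  # (1.5, 2)
```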

  17. Optimizing Water Allocation under Uncertain System Conditions for Water and Agriculture Future Scenarios in Alfeios River Basin (Greece—Part B: Fuzzy-Boundary Intervals Combined with Multi-Stage Stochastic Programming Model

    Directory of Open Access Journals (Sweden)

    Eleni Bekri

    2015-11-01

    Full Text Available Optimal water allocation within a river basin still remains a great modeling challenge for engineers due to various hydrosystem complexities, parameter uncertainties and their interactions. Conventional deterministic optimization approaches have given way to stochastic, fuzzy and interval-parameter programming approaches and their hybrid combinations for overcoming these difficulties. In many countries, including Mediterranean countries, water resources management is characterized by uncertain, imprecise and limited data because of the absence of permanent measuring systems, inefficient river monitoring and fragmentation of authority responsibilities. A fuzzy-boundary-interval linear programming methodology developed by Li et al. (2010) is selected and applied in the Alfeios river basin (Greece) for optimal water allocation under uncertain system conditions. This methodology combines an ordinary multi-stage stochastic programming with uncertainties expressed as fuzzy-boundary intervals. Upper- and lower-bound solution intervals for optimized water allocation targets and probabilistic water allocations and shortages are estimated under a baseline scenario and four water and agricultural policy future scenarios for an optimistic and a pessimistic attitude of the decision makers. In this work, the uncertainty of the random water inflows is incorporated through the simultaneous generation of stochastic equal-probability hydrologic scenarios at various inflow positions instead of using a scenario-tree approach in the original methodology.

  18. Analysis Methodology for Optimal Selection of Ground Station Site in Space Missions

    Science.gov (United States)

    Nieves-Chinchilla, J.; Farjas, M.; Martínez, R.

    2013-12-01

    Optimization of ground station sites is especially important in complex missions that include several small satellites (clusters or constellations) such as the QB50 project, where one ground station would be able to track several space vehicles, even simultaneously. In this regard the design of the communication system has to carefully take into account the ground station site and relevant signal phenomena, depending on the frequency band. To propose the optimal location of the ground station, these aspects become even more relevant to establish a trusted communication link due to the ground segment site in urban areas and/or selection of low orbits for the space segment. In addition, updated cartography with high resolution data of the location and its surroundings helps to develop recommendations in the design of its location for space vehicle tracking and hence to improve effectiveness. The objectives of this analysis methodology are: completion of cartographic information, modelling the obstacles that hinder communication between the ground and space segment, and representation in the generated 3D scene of the degree of impairment in the signal/noise of the phenomena that interfere with communication. The integration of new technologies of geographic data capture, such as 3D Laser Scan, determines that increased optimization of the antenna elevation mask, in its AOS and LOS azimuths along the visible horizon, maximizes visibility time with space vehicles. Furthermore, from the three-dimensional cloud of points captured, specific information is selected and, using 3D modeling techniques, the 3D scene of the antenna location site and surroundings is generated. The resulting 3D model reveals nearby obstacles related to the cartographic conditions such as mountain formations and buildings, and any additional obstacles that interfere with the operational quality of the antenna (other antennas and electronic devices that emit or receive in the same bandwidth

  19. Feature Selection and Parameters Optimization of SVM Using Particle Swarm Optimization for Fault Classification in Power Distribution Systems.

    Science.gov (United States)

    Cho, Ming-Yuan; Hoang, Thi Thom

    2017-01-01

    Fast and accurate fault classification is essential to power system operations. In this paper, in order to classify electrical faults in radial distribution systems, a particle swarm optimization (PSO) based support vector machine (SVM) classifier has been proposed. The proposed PSO based SVM classifier is able to select appropriate input features and optimize SVM parameters to increase classification accuracy. Further, a time-domain reflectometry (TDR) method with a pseudorandom binary sequence (PRBS) stimulus has been used to generate a dataset for purposes of classification. The proposed technique has been tested on a typical radial distribution network to identify ten different types of faults considering 12 given input features generated by using Simulink software and MATLAB Toolbox. The success rate of the SVM classifier is over 97%, which demonstrates the effectiveness and high efficiency of the developed method.
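    The PSO component of such a classifier can be sketched independently of the SVM. The quadratic objective below stands in for the cross-validation error surface over two SVM hyperparameters, and the inertia/acceleration constants are conventional textbook choices, not the authors' settings:

```python
import random

def pso(objective, dim=2, n_particles=20, iters=100, bounds=(-5.0, 5.0),
        w=0.7, c1=1.5, c2=1.5):
    """Minimize `objective` with a basic global-best particle swarm."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

random.seed(0)
# Surrogate for (C, gamma) tuning: a smooth bowl with its minimum at (1.0, 0.1)
best, best_val = pso(lambda x: (x[0] - 1.0) ** 2 + (x[1] - 0.1) ** 2)
print(best, best_val)
```

In the actual method the objective would train and validate an SVM for each candidate parameter vector (and feature mask) instead of evaluating a closed-form function.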

  20. Feature Selection and Parameters Optimization of SVM Using Particle Swarm Optimization for Fault Classification in Power Distribution Systems

    Directory of Open Access Journals (Sweden)

    Ming-Yuan Cho

    2017-01-01

    Full Text Available Fast and accurate fault classification is essential to power system operations. In this paper, in order to classify electrical faults in radial distribution systems, a particle swarm optimization (PSO) based support vector machine (SVM) classifier has been proposed. The proposed PSO based SVM classifier is able to select appropriate input features and optimize SVM parameters to increase classification accuracy. Further, a time-domain reflectometry (TDR) method with a pseudorandom binary sequence (PRBS) stimulus has been used to generate a dataset for purposes of classification. The proposed technique has been tested on a typical radial distribution network to identify ten different types of faults considering 12 given input features generated by using Simulink software and MATLAB Toolbox. The success rate of the SVM classifier is over 97%, which demonstrates the effectiveness and high efficiency of the developed method.

  1. Optimal design of distributed energy resource systems based on two-stage stochastic programming

    International Nuclear Information System (INIS)

    Yang, Yun; Zhang, Shijie; Xiao, Yunhan

    2017-01-01

    Highlights: • A two-stage stochastic programming model is built to design DER systems under uncertainties. • Uncertain energy demands have a significant effect on the optimal design. • Uncertain energy prices and renewable energy intensity have little effect on the optimal design. • The economy is overestimated if the system is designed without considering the uncertainties. • The uncertainty in energy prices has the significant and greatest effect on the economy. - Abstract: Multiple uncertainties exist in the optimal design of distributed energy resource (DER) systems. The expected energy, economic, and environmental benefits may not be achieved and a deficit in energy supply may occur if the uncertainties are not handled properly. This study focuses on the optimal design of DER systems with consideration of the uncertainties. A two-stage stochastic programming model is built in consideration of the discreteness of equipment capacities, equipment partial load operation and output bounds as well as of the influence of ambient temperature on gas turbine performance. The stochastic model is then transformed into its deterministic equivalent and solved. For an illustrative example, the model is applied to a hospital in Lianyungang, China. Comparative studies are performed to evaluate the effect of the uncertainties in load demands, energy prices, and renewable energy intensity separately and simultaneously on the system’s economy and optimal design. Results show that the uncertainties in load demands have a significant effect on the optimal system design, whereas the uncertainties in energy prices and renewable energy intensity have almost no effect. Results regarding economy show that it is obviously overestimated if the system is designed without considering the uncertainties.
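A two-stage stochastic program of this kind can be illustrated in miniature. The candidate capacities, demand scenarios, and unit costs below are invented, and brute-force enumeration of the deterministic equivalent stands in for a real solver:

```python
# Hypothetical data: candidate generator capacities (kW), three demand
# scenarios with probabilities, and unit costs (all values illustrative).
capacities = [0, 100, 200, 300, 400]
scenarios = [(150, 0.3), (250, 0.5), (350, 0.2)]   # (demand kW, probability)
capital_per_kw, gen_cost, grid_price = 0.2, 0.5, 1.5

def expected_total_cost(cap):
    # First stage: pay for installed capacity. Second stage (recourse), per
    # scenario: serve what we can from the generator, buy the shortfall.
    cost = capital_per_kw * cap
    for demand, prob in scenarios:
        served = min(cap, demand)
        cost += prob * (gen_cost * served + grid_price * (demand - served))
    return cost

best_cap = min(capacities, key=expected_total_cost)
print(best_cap, expected_total_cost(best_cap))
```

The key structural point matches the paper: the capacity decision is fixed before the uncertainty resolves, while operation adapts to each scenario, so the design is judged by its expected cost over all scenarios rather than by any single forecast.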

  2. Optimization strategies based on sequential quadratic programming applied for a fermentation process for butanol production.

    Science.gov (United States)

    Pinto Mariano, Adriano; Bastos Borba Costa, Caliane; de Franceschi de Angelis, Dejanira; Maugeri Filho, Francisco; Pires Atala, Daniel Ibraim; Wolf Maciel, Maria Regina; Maciel Filho, Rubens

    2009-11-01

In this work, the mathematical optimization of a continuous flash fermentation process for the production of biobutanol was studied. The process consists of three interconnected units, as follows: fermentor, cell-retention system (tangential microfiltration), and vacuum flash vessel (responsible for the continuous recovery of butanol from the broth). The objective of the optimization was to maximize butanol productivity for a desired substrate conversion. Two strategies were compared for the optimization of the process. In one of them, the process was represented by a deterministic model with kinetic parameters determined experimentally and, in the other, by a statistical model obtained using the factorial design technique combined with simulation. For both strategies, the problem was written as a nonlinear programming problem and was solved with the sequential quadratic programming technique. The results showed that despite the very similar solutions obtained with both strategies, the problems found with the strategy using the deterministic model, such as lack of convergence and high computational time, make the use of the optimization strategy with the statistical model, which proved to be robust and fast, more suitable for the flash fermentation process, being recommended for real-time applications coupling optimization and control.
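The core of a sequential quadratic programming iteration, solving the KKT system of the local quadratic subproblem, can be sketched on a toy equality-constrained problem. This is a generic illustration, not the authors' fermentation model; the objective, constraint, and starting point are invented.

```python
def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bv] for row, bv in zip(A, b)]
    for k in range(n):
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            m = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= m * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def sqp_step(x, grad, hess, cons, jac):
    """One SQP iteration: solve the KKT system of the local QP subproblem."""
    n, m = len(x), len(cons(x))
    H, A, g, c = hess(x), jac(x), grad(x), cons(x)
    KKT = [[H[i][j] for j in range(n)] + [A[k][i] for k in range(m)] for i in range(n)]
    KKT += [[A[k][j] for j in range(n)] + [0.0] * m for k in range(m)]
    rhs = [-v for v in g] + [-v for v in c]
    sol = gauss_solve(KKT, rhs)
    return [xi + di for xi, di in zip(x, sol[:n])]

# Toy problem: minimize (x-2)^2 + (y-1)^2  subject to  x + y = 2.
step = sqp_step(
    [0.0, 0.0],
    grad=lambda p: [2 * (p[0] - 2), 2 * (p[1] - 1)],
    hess=lambda p: [[2.0, 0.0], [0.0, 2.0]],
    cons=lambda p: [p[0] + p[1] - 2],
    jac=lambda p: [[1.0, 1.0]],
)
print(step)
```

Because this toy objective is quadratic and the constraint linear, a single step lands on the constrained optimum (1.5, 0.5); on a nonlinear fermentation model the step would be repeated until convergence.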

  3. Multi-objective optimization of cellular scanning strategy in selective laser melting

    DEFF Research Database (Denmark)

    Ahrari, Ali; Deb, Kalyanmoy; Mohanty, Sankhya

    2017-01-01

The scanning strategy for selective laser melting - an additive manufacturing process - determines the temperature fields during the manufacturing process, which in turn affect residual stresses and distortions, two of the main sources of process-induced defects. The goal of this study is to develop a multi-objective optimization approach for the cellular scanning strategy ... the problem is a combination of combinatorial and choice optimization, which makes it difficult to solve. On a process simulation domain consisting of 32 cells, our multi-objective evolutionary method is able to find a set of trade-off solutions for the defined conflicting objectives, which cannot ...

  4. AN APPLICATION OF FUZZY PROMETHEE METHOD FOR SELECTING OPTIMAL CAR PROBLEM

    Directory of Open Access Journals (Sweden)

    SERKAN BALLI

    2013-06-01

Full Text Available Most economic, industrial, financial, or political decision problems are multi-criteria. In such multi-criteria problems, the optimal selection among alternatives is a hard and complex process. Recently, several methods have been developed to solve these problems. Promethee is one of the most efficient and easiest of these methods, and it solves problems that consist of quantitative criteria. However, in daily life there are criteria that are expressed linguistically and cannot be modeled numerically, so the Promethee method is incomplete for such imprecise linguistic criteria. To remedy this deficiency, a fuzzy set approximation can be used. The Promethee method, extended with fuzzy inputs, is applied to car selection among seven different cars in the same class using the criteria price, fuel, performance and security. The obtained results are appropriate and consistent.
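The crisp core of the method (before the fuzzy extension) can be sketched as a Promethee II ranking with the "usual" preference function; the car scores, weights, and cost/benefit labels below are invented for illustration:

```python
# Hypothetical scores for four cars on (price, fuel use, performance, security);
# the first two are cost criteria (lower is better), the last two benefits.
cars = {
    "A": [22000, 6.5, 7.0, 8.0],
    "B": [18000, 5.0, 6.0, 7.0],
    "C": [30000, 8.0, 9.0, 9.0],
    "D": [25000, 7.0, 8.0, 8.5],
}
weights = [0.4, 0.2, 0.2, 0.2]
is_cost = [True, True, False, False]

def usual_preference(d):
    # "Usual" criterion: strict preference as soon as a beats b on the criterion.
    return 1.0 if d > 0 else 0.0

def net_flows(alts):
    names = list(alts)
    n = len(names)
    phi = dict.fromkeys(names, 0.0)
    for a in names:
        for b in names:
            if a == b:
                continue
            # Weighted multicriteria preference index pi(a, b).
            pi = sum(w * usual_preference((sb - sa) if cost else (sa - sb))
                     for w, cost, sa, sb in zip(weights, is_cost, alts[a], alts[b]))
            phi[a] += pi / (n - 1)   # contributes to a's positive flow
            phi[b] -= pi / (n - 1)   # and to b's negative flow
    return phi

phi = net_flows(cars)
ranking = sorted(phi, key=phi.get, reverse=True)
print(ranking)
```

The fuzzy extension in the record replaces the crisp scores and preference degrees with fuzzy numbers; the flow computation itself keeps the same shape.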

  5. Linear Optimization of Frequency Spectrum Assignments Across System

    Science.gov (United States)

    2016-03-01

Keywords: selection tools, frequency allocation, transmission optimization, electromagnetic maneuver warfare, electronic protection, assignment model. [The remainder of this record is report front matter: table-of-contents entries ("Characteristics Modeled"; "Table 10. Antenna Systems Modeled, Number of Systems") and an acronym list: EW, early warning; GAMS, general algebraic modeling system; GHz, gigahertz; IDE, integrated development environment; ILP, integer linear program.]

  6. An Algebraic Programming Style for Numerical Software and Its Optimization

    Directory of Open Access Journals (Sweden)

    T.B. Dinesh

    2000-01-01

    Full Text Available The abstract mathematical theory of partial differential equations (PDEs is formulated in terms of manifolds, scalar fields, tensors, and the like, but these algebraic structures are hardly recognizable in actual PDE solvers. The general aim of the Sophus programming style is to bridge the gap between theory and practice in the domain of PDE solvers. Its main ingredients are a library of abstract datatypes corresponding to the algebraic structures used in the mathematical theory and an algebraic expression style similar to the expression style used in the mathematical theory. Because of its emphasis on abstract datatypes, Sophus is most naturally combined with object-oriented languages or other languages supporting abstract datatypes. The resulting source code patterns are beyond the scope of current compiler optimizations, but are sufficiently specific for a dedicated source-to-source optimizer. The limited, domain-specific, character of Sophus is the key to success here. This kind of optimization has been tested on computationally intensive Sophus style code with promising results. The general approach may be useful for other styles and in other application domains as well.

  7. Logic hybrid simulation-optimization algorithm for distillation design

    OpenAIRE

    Caballero Suárez, José Antonio

    2014-01-01

    In this paper, we propose a novel algorithm for the rigorous design of distillation columns that integrates a process simulator in a generalized disjunctive programming formulation. The optimal distillation column, or column sequence, is obtained by selecting, for each column section, among a set of column sections with different number of theoretical trays. The selection of thermodynamic models, properties estimation etc., are all in the simulation environment. All the numerical issues relat...

  8. CiOpt: a program for optimization of the frequency response of linear circuits

    OpenAIRE

    Miró Sans, Joan Maria; Palà Schönwälder, Pere

    1991-01-01

    An interactive personal-computer program for optimizing the frequency response of linear lumped circuits (CiOpt) is presented. CiOpt has proved to be an efficient tool in improving designs where the inclusion of more accurate device models distorts the desired frequency response, as well as in device modeling. The outputs of CiOpt are the element values which best match the obtained and the desired frequency response. The optimization algorithms used (the Fletcher-Powell and Newton's methods,...

  9. Optimization Models for Reaction Networks: Information Divergence, Quadratic Programming and Kirchhoff’s Laws

    Directory of Open Access Journals (Sweden)

    Julio Michael Stern

    2014-03-01

    Full Text Available This article presents a simple derivation of optimization models for reaction networks leading to a generalized form of the mass-action law, and compares the formal structure of Minimum Information Divergence, Quadratic Programming and Kirchhoff type network models. These optimization models are used in related articles to develop and illustrate the operation of ontology alignment algorithms and to discuss closely connected issues concerning the epistemological and statistical significance of sharp or precise hypotheses in empirical science.

  10. A Selection Approach for Optimized Problem-Solving Process by Grey Relational Utility Model and Multicriteria Decision Analysis

    Directory of Open Access Journals (Sweden)

    Chih-Kun Ke

    2012-01-01

Full Text Available In business enterprises, especially the manufacturing industry, various problem situations may occur during the production process. A situation denotes an evaluation point to determine the status of a production process. A problem may occur if there is a discrepancy between the actual situation and the desired one. Thus, a problem-solving process is often initiated to achieve the desired situation. In the process, how to determine the action that needs to be taken to resolve the situation becomes an important issue. Therefore, this work uses a selection approach for an optimized problem-solving process to assist workers in taking a reasonable action. A grey relational utility model and a multicriteria decision analysis are used to determine the optimal selection order of candidate actions. The selection order is presented to the worker as an adaptive recommended solution. The worker chooses a reasonable problem-solving action based on the selection order. This work uses a high-tech company’s knowledge base log as the analysis data. Experimental results demonstrate that the proposed selection approach is effective.
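The grey relational ranking step can be sketched as follows; the candidate actions, criteria, and 0-10 scores are hypothetical, and the conventional distinguishing coefficient of 0.5 is assumed:

```python
# Hypothetical candidate actions scored on three benefit criteria
# (effectiveness, speed, reusability), each on a 0-10 scale.
actions = {
    "restart service": [6.0, 9.0, 5.0],
    "patch config":    [8.0, 6.0, 8.0],
    "replace module":  [9.0, 3.0, 7.0],
}
zeta = 0.5  # distinguishing coefficient, conventionally 0.5

def grey_relational_grades(data):
    names = list(data)
    n_crit = len(next(iter(data.values())))
    # Normalize each benefit criterion to [0, 1] (larger is better); assumes
    # no criterion column is constant.
    cols = [[data[a][j] for a in names] for j in range(n_crit)]
    norm = {a: [(data[a][j] - min(cols[j])) / (max(cols[j]) - min(cols[j]))
                for j in range(n_crit)] for a in names}
    # Deviation from the ideal (all-ones) reference sequence.
    dev = {a: [1.0 - v for v in norm[a]] for a in names}
    dmin = min(min(d) for d in dev.values())
    dmax = max(max(d) for d in dev.values())
    # Grey relational coefficient per criterion, then the mean grade.
    coef = {a: [(dmin + zeta * dmax) / (d + zeta * dmax) for d in dev[a]]
            for a in names}
    return {a: sum(coef[a]) / n_crit for a in names}

grades = grey_relational_grades(actions)
order = sorted(grades, key=grades.get, reverse=True)
print(order, grades)
```

The resulting order is what would be shown to the worker as the recommended action ranking; the paper additionally folds in a utility model and multicriteria decision analysis on top of this core.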

  11. A singular value decomposition linear programming (SVDLP) optimization technique for circular cone based robotic radiotherapy

    Science.gov (United States)

    Liang, Bin; Li, Yongbao; Wei, Ran; Guo, Bin; Xu, Xuang; Liu, Bo; Li, Jiafeng; Wu, Qiuwen; Zhou, Fugen

    2018-01-01

    With robot-controlled linac positioning, robotic radiotherapy systems such as CyberKnife significantly increase freedom of radiation beam placement, but also impose more challenges on treatment plan optimization. The resampling mechanism in the vendor-supplied treatment planning system (MultiPlan) cannot fully explore the increased beam direction search space. Besides, a sparse treatment plan (using fewer beams) is desired to improve treatment efficiency. This study proposes a singular value decomposition linear programming (SVDLP) optimization technique for circular collimator based robotic radiotherapy. The SVDLP approach initializes the input beams by simulating the process of covering the entire target volume with equivalent beam tapers. The requirements on dosimetry distribution are modeled as hard and soft constraints, and the sparsity of the treatment plan is achieved by compressive sensing. The proposed linear programming (LP) model optimizes beam weights by minimizing the deviation of soft constraints subject to hard constraints, with a constraint on the l 1 norm of the beam weight. A singular value decomposition (SVD) based acceleration technique was developed for the LP model. Based on the degeneracy of the influence matrix, the model is first compressed into lower dimension for optimization, and then back-projected to reconstruct the beam weight. After beam weight optimization, the number of beams is reduced by removing the beams with low weight, and optimizing the weights of the remaining beams using the same model. This beam reduction technique is further validated by a mixed integer programming (MIP) model. The SVDLP approach was tested on a lung case. The results demonstrate that the SVD acceleration technique speeds up the optimization by a factor of 4.8. Furthermore, the beam reduction achieves a similar plan quality to the globally optimal plan obtained by the MIP model, but is one to two orders of magnitude faster. Furthermore, the SVDLP

  12. A singular value decomposition linear programming (SVDLP) optimization technique for circular cone based robotic radiotherapy.

    Science.gov (United States)

    Liang, Bin; Li, Yongbao; Wei, Ran; Guo, Bin; Xu, Xuang; Liu, Bo; Li, Jiafeng; Wu, Qiuwen; Zhou, Fugen

    2018-01-05

    With robot-controlled linac positioning, robotic radiotherapy systems such as CyberKnife significantly increase freedom of radiation beam placement, but also impose more challenges on treatment plan optimization. The resampling mechanism in the vendor-supplied treatment planning system (MultiPlan) cannot fully explore the increased beam direction search space. Besides, a sparse treatment plan (using fewer beams) is desired to improve treatment efficiency. This study proposes a singular value decomposition linear programming (SVDLP) optimization technique for circular collimator based robotic radiotherapy. The SVDLP approach initializes the input beams by simulating the process of covering the entire target volume with equivalent beam tapers. The requirements on dosimetry distribution are modeled as hard and soft constraints, and the sparsity of the treatment plan is achieved by compressive sensing. The proposed linear programming (LP) model optimizes beam weights by minimizing the deviation of soft constraints subject to hard constraints, with a constraint on the l 1 norm of the beam weight. A singular value decomposition (SVD) based acceleration technique was developed for the LP model. Based on the degeneracy of the influence matrix, the model is first compressed into lower dimension for optimization, and then back-projected to reconstruct the beam weight. After beam weight optimization, the number of beams is reduced by removing the beams with low weight, and optimizing the weights of the remaining beams using the same model. This beam reduction technique is further validated by a mixed integer programming (MIP) model. The SVDLP approach was tested on a lung case. The results demonstrate that the SVD acceleration technique speeds up the optimization by a factor of 4.8. Furthermore, the beam reduction achieves a similar plan quality to the globally optimal plan obtained by the MIP model, but is one to two orders of magnitude faster. Furthermore, the SVDLP

  13. Dynamic programming approach to optimization of approximate decision rules

    KAUST Repository

    Amin, Talha

    2013-02-01

This paper is devoted to the study of an extension of the dynamic programming approach which allows sequential optimization of approximate decision rules relative to the length and coverage. We introduce an uncertainty measure R(T) which is the number of unordered pairs of rows with different decisions in the decision table T. For a nonnegative real number β, we consider β-decision rules that localize rows in subtables of T with uncertainty at most β. Our algorithm constructs a directed acyclic graph Δβ(T) whose nodes are subtables of the decision table T given by systems of equations of the kind "attribute = value". This algorithm finishes the partitioning of a subtable when its uncertainty is at most β. The graph Δβ(T) allows us to describe the whole set of so-called irredundant β-decision rules. We can describe all irredundant β-decision rules with minimum length, and after that, among these rules, describe all rules with maximum coverage. We can also change the order of optimization. The consideration of irredundant rules only does not change the results of optimization. This paper also contains results of experiments with decision tables from the UCI Machine Learning Repository. © 2012 Elsevier Inc. All rights reserved.
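The uncertainty measure R(T) defined above is straightforward to compute from the decision column alone, as this small sketch (with a made-up toy table) shows:

```python
from collections import Counter

def uncertainty(decisions):
    """R(T): number of unordered row pairs of the table with different decisions."""
    n = len(decisions)
    total_pairs = n * (n - 1) // 2
    # Pairs sharing the same decision, summed over each decision class.
    same = sum(c * (c - 1) // 2 for c in Counter(decisions).values())
    return total_pairs - same

# Decision column of a toy table: 4 rows labeled "yes", 2 labeled "no",
# giving 4 * 2 = 8 mixed pairs.
print(uncertainty(["yes", "yes", "no", "yes", "no", "yes"]))
```

In the paper's algorithm, a subtable whose value of this measure is at most β needs no further partitioning, which is exactly the stopping rule used when building the graph Δβ(T).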

  14. OPTIMAL AIRCRAFT TRAJECTORIES FOR SPECIFIED RANGE

    Science.gov (United States)

    Lee, H.

    1994-01-01

    cruise cost is specified, an optimum trajectory can easily be generated; however, the range obtained for a particular optimum cruise cost is not known a priori. For short range flights, the program iteratively varies the optimum cruise cost until the computed range converges to the specified range. For long-range flights, iteration is unnecessary since the specified range can be divided into a cruise segment distance and full climb and descent distances. The user must supply the program with engine fuel flow rate coefficients and an aircraft aerodynamic model. The program currently includes coefficients for the Pratt-Whitney JT8D-7 engine and an aerodynamic model for the Boeing 727. Input to the program consists of the flight range to be covered and the prevailing flight conditions including pressure, temperature, and wind profiles. Information output by the program includes: optimum cruise tables at selected weights, optimal cruise quantities as a function of cruise weight and cruise distance, climb and descent profiles, and a summary of the complete synthesized optimal trajectory. This program is written in FORTRAN IV for batch execution and has been implemented on a CDC 6000 series computer with a central memory requirement of approximately 100K (octal) of 60 bit words. This aircraft trajectory optimization program was developed in 1979.

  15. Variationally optimal selection of slow coordinates and reaction coordinates in macromolecular systems

    Science.gov (United States)

    Noe, Frank

To efficiently simulate and generate understanding from simulations of complex macromolecular systems, the concept of slow collective coordinates or reaction coordinates is of fundamental importance. Here we will introduce variational approaches to approximate the slow coordinates and the reaction coordinates between selected end-states, given MD simulations of the macromolecular system and a (possibly large) basis set of candidate coordinates. We will then discuss how to select physically intuitive order parameters that are good surrogates of this variationally optimal result. These results can be used to construct Markov state models or other models of the stationary and kinetic properties, and to parametrize low-dimensional / coarse-grained models of the dynamics. Deutsche Forschungsgemeinschaft, European Research Council.

  16. Overview of ONWI'S Salt site selection program

    International Nuclear Information System (INIS)

    Madia, W.J.

    1983-01-01

In the past year, activities in the salt site selection program of the Office of Nuclear Waste Isolation (ONWI) have focused on narrowing the number and size of areas under consideration as candidate repository sites. The progressive focusing is illustrated. Bedded salt, in the Permian Basin of West Texas and the Paradox Basin of Utah, and salt domes in the Gulf Coast Salt Dome Region (including parts of East Texas, Louisiana, and Mississippi) have been the subjects of geologic, environmental, and socioeconomic characterization in progressively greater detail as the screening process has proceeded. Detailed, field-oriented research and testing have superseded broad-based studies relying heavily on literature and other existing data. Coinciding with the increased field activities has been the publication of results and recommendations resulting from earlier program efforts.

  17. Making the Optimal Decision in Selecting Protective Clothing

    International Nuclear Information System (INIS)

    Price, J. Mark

    2008-01-01

Protective Clothing plays a major role in the decommissioning and operation of nuclear facilities. Literally thousands of dress-outs occur over the life of a decommissioning project and during outages at operational plants. In order to make the optimal decision on which type of protective clothing is best suited for decommissioning or for maintenance and repair work on radioactive systems, a number of interrelating factors must be considered. This article discusses these factors as well as surveys of plants regarding their level of usage of single use protective clothing (SUPC) and should help individuals making decisions about protective clothing as it applies to their application. Individuals considering using SUPC should not jump to conclusions. The survey conducted clearly indicates that plants have different drivers. An evaluation should be performed to understand the facility's true drivers for selecting clothing. It is recommended that an interdisciplinary team be formed, including representatives from budgets and cost, safety, radwaste, health physics, and key user groups, to perform the analysis. The right questions need to be asked and answered by the company providing the clothing to formulate a proper perspective and conclusion. The conclusions and recommendations need to be shared with senior management so that the drivers, expected results, and associated costs are understood and endorsed. In the end, the individual making the recommendation should ask himself/herself: 'Is my decision emotional, or logical and economical?' 'Have I reached the optimal decision for my plant?'

  18. Can Programmed or Self-Selected Physical Activity Affect Physical Fitness of Adolescents?

    Directory of Open Access Journals (Sweden)

    Neto Cláudio F.

    2014-12-01

Full Text Available The aim of this study was to verify the effects of programmed and self-selected physical activities on the physical fitness of adolescents. High school adolescents, aged between 15 and 17 years, were divided into two experimental groups: a) a self-selected physical activity group (PAS) with 55 students (aged 15.7 ± 0.7 years), who performed physical activities with self-selected rhythm at the following sports: basketball, volleyball, handball, futsal and swimming; and b) a physical fitness training group (PFT) with 53 students (aged 16.0 ± 0.7 years), who performed programmed physical fitness exercises. Both types of activity were developed during 60 min classes. To assess physical fitness the PROESP-BR protocol was used. The statistical analysis was performed by repeated measures ANOVA. The measurements of pre and post-tests showed significantly different values after PFT in: 9 minute running test, medicine ball throw, horizontal jump, abdominal endurance, running speed and flexibility. After PAS differences were detected in abdominal endurance, agility, running speed and flexibility. The intervention with programmed physical activity promoted more changes in the physical abilities; however, in the self-selected program, agility was improved probably because of the practice of sports. Therefore, physical education teachers can use PFT to improve cardiorespiratory fitness and power of lower and upper limbs and PAS to improve agility of high school adolescents.

  19. Can programmed or self-selected physical activity affect physical fitness of adolescents?

    Science.gov (United States)

    Neto, Cláudio F; Neto, Gabriel R; Araújo, Adenilson T; Sousa, Maria S C; Sousa, Juliana B C; Batista, Gilmário R; Reis, Victor M M R

    2014-09-29

    The aim of this study was to verify the effects of programmed and self-selected physical activities on the physical fitness of adolescents. High school adolescents, aged between 15 and 17 years, were divided into two experimental groups: a) a self-selected physical activity group (PAS) with 55 students (aged 15.7 ± 0.7 years), who performed physical activities with self-selected rhythm at the following sports: basketball, volleyball, handball, futsal and swimming; and b) a physical fitness training group (PFT) with 53 students (aged 16.0 ± 0.7 years), who performed programmed physical fitness exercises. Both types of activity were developed during 60 min classes. To assess physical fitness the PROESP-BR protocol was used. The statistical analysis was performed by repeated measures ANOVA. The measurements of pre and post-tests showed significantly different values after PFT in: 9 minute running test, medicine ball throw, horizontal jump, abdominal endurance, running speed and flexibility. After PAS differences were detected in abdominal endurance, agility, running speed and flexibility. The intervention with programmed physical activity promoted more changes in the physical abilities; however, in the self-selected program, agility was improved probably because of the practice of sports. Therefore, physical education teachers can use PFT to improve cardiorespiratory fitness and power of lower and upper limbs and PAS to improve agility of high school adolescents.

  20. A Dynamic Programming Solution for Energy-Optimal Video Playback on Mobile Devices

    Directory of Open Access Journals (Sweden)

    Minseok Song

    2016-01-01

Full Text Available Due to the development of mobile technology and wide availability of smartphones, the Internet of Things (IoT) starts to handle high volumes of video data to facilitate multimedia-based services, which requires energy-efficient video playback. In video playback, frames have to be decoded and rendered at a high playback rate, increasing the computation cost on the CPU. To save CPU power, dynamic voltage and frequency scaling (DVFS) dynamically adjusts the operating voltage of the processor along with the frequency; appropriate selection of the frequency achieves a balance between performance and power. We present a decoding model that allows buffering frames to let the CPU run at low frequency and then propose an algorithm that determines the CPU frequency needed to decode each frame in a video, with the aim of minimizing power consumption while meeting buffer size and deadline constraints, using a dynamic programming technique. We finally extend this algorithm to optimize CPU frequencies over a short sequence of frames, producing a practical method of reducing the energy required for video decoding. Experimental results show a system-wide reduction in energy of 27%, compared with a processor running at full speed.
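A minimal version of such a dynamic program can be sketched as follows. The frame sizes, frequency levels, playback period, and the energy-grows-as-f² per-cycle model are illustrative assumptions, not the paper's exact formulation:

```python
import math

def min_energy_schedule(cycles, freqs, period, dt=0.05):
    """DP over (frame, quantized finish time): pick a CPU frequency per frame
    so every frame meets its playback deadline at minimum total energy."""
    dp = {0: (0.0, [])}            # finish-time quantum -> (energy, freq plan)
    for i, c in enumerate(cycles):
        deadline = int(round((i + 1) * period / dt))   # frame i display time
        ndp = {}
        for t, (e, plan) in dp.items():
            for f in freqs:
                # decode time c/f seconds, rounded up to the time grid
                tf = t + math.ceil(round(c / f / dt, 9))
                ef = e + c * (f / freqs[0]) ** 2        # energy ~ f^2 per cycle
                if tf <= deadline and (tf not in ndp or ef < ndp[tf][0]):
                    ndp[tf] = (ef, plan + [f])
        dp = ndp
    return min(dp.values())

# Two frames of 60 and 150 mega-cycles, 0.5 s playback period, 200/400 MHz levels.
energy, plan = min_energy_schedule([60, 150], freqs=[200, 400], period=0.5)
print(plan, energy)
```

On this toy input the DP decodes the small frame fast to leave slack for the big one, which is cheaper overall than any uniform-frequency schedule that still meets both deadlines; the paper's buffer constraint would add an upper bound on how far ahead of its deadline a frame may finish.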

  1. Direct and Mediated Relationships Between Participation in a Telephonic Health Coaching Program and Health Behavior, Life Satisfaction, and Optimism.

    Science.gov (United States)

    Sears, Lindsay E; Coberley, Carter R; Pope, James E

    2016-07-01

    The aim of this study was to examine the direct and mediated effects of a telephonic health coaching program on changes to healthy behaviors, life satisfaction, and optimism. This longitudinal correlational study of 4881 individuals investigated simple and mediated relationships between participation in a telephonic health risk coaching program and outcomes from three annual Well-being Assessments. Program participation was directly related to improvements in healthy behaviors, life satisfaction and optimism, and indirect effects of coaching on these variables concurrently and over a one-year time lag were also supported. Given previous research that improvements to life satisfaction, optimism, and health behaviors are valuable for individuals, employers, and communities, a clearer understanding of intervention approaches that may impact these outcomes simultaneously can drive greater program effectiveness and value on investment.

  2. Genetic Particle Swarm Optimization-Based Feature Selection for Very-High-Resolution Remotely Sensed Imagery Object Change Detection.

    Science.gov (United States)

    Chen, Qiang; Chen, Yunhao; Jiang, Weiguo

    2016-07-30

In the field of multiple features Object-Based Change Detection (OBCD) for very-high-resolution remotely sensed images, image objects have abundant features and feature selection affects the precision and efficiency of OBCD. Through object-based image analysis, this paper proposes a Genetic Particle Swarm Optimization (GPSO)-based feature selection algorithm to solve the optimization problem of feature selection in multiple features OBCD. We select the Ratio of Mean to Variance (RMV) as the fitness function of GPSO, and apply the proposed algorithm to the object-based hybrid multivariate alternative detection model. Two experiment cases on Worldview-2/3 images confirm that GPSO can significantly improve the speed of convergence, and effectively avoid the problem of premature convergence, relative to other feature selection algorithms. According to the accuracy evaluation of OBCD, GPSO is superior to the other algorithms in overall accuracy (84.17% and 83.59%) and Kappa coefficient (0.6771 and 0.6314). Moreover, the sensitivity analysis results show that the proposed algorithm is not easily influenced by the initial parameters, but the number of features to be selected and the size of the particle swarm would affect the algorithm. The comparison experiment results reveal that RMV is more suitable than other functions as the fitness function of GPSO-based feature selection algorithm.
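The fitness idea can be illustrated in isolation: score a candidate feature subset by the ratio of mean to variance of the selected columns. The data, the binary-mask encoding, and this exact pooled-RMV definition are assumptions made for the sketch, not the paper's formula:

```python
# Toy object-feature matrix: three samples, three features; column 2 is noisy.
samples = [
    [2.0, 8.1, 4.0],
    [2.2, 7.9, 9.0],
    [1.8, 8.0, 1.0],
]

def rmv(mask):
    """Ratio of mean to variance of all values in the selected feature columns."""
    cols = [i for i, keep in enumerate(mask) if keep]
    values = [row[i] for row in samples for i in cols]
    mean = sum(values) / len(values)
    var = sum((v - mean) ** 2 for v in values) / len(values)
    return mean / var if var > 0 else float("inf")

# A stable feature pair scores higher than a subset including the noisy column.
print(rmv([1, 1, 0]), rmv([1, 1, 1]))
```

In the GPSO setting, each particle carries such a 0/1 mask, and the swarm plus genetic operators search for the mask maximizing this fitness.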

  3. Portfolio selection problem with liquidity constraints under non-extensive statistical mechanics

    International Nuclear Information System (INIS)

    Zhao, Pan; Xiao, Qingxian

    2016-01-01

In this study, we consider the optimal portfolio selection problem with liquidity limits. A portfolio selection model is proposed in which the risky asset price is driven by a process based on non-extensive statistical mechanics instead of the classic Wiener process. Using dynamic programming and Lagrange multiplier methods, we obtain the optimal policy and value function. Moreover, the numerical results indicate that this model differs considerably from the model based on the classic Wiener process: the optimal strategy is affected by the non-extensive parameter q, the investment in the risky asset increases faster at a larger parameter q, and the increase in wealth behaves similarly.

  4. Fusion of remote sensing images based on pyramid decomposition with Baldwinian Clonal Selection Optimization

    Science.gov (United States)

    Jin, Haiyan; Xing, Bei; Wang, Lei; Wang, Yanyan

    2015-11-01

    In this paper, we put forward a novel fusion method for remote sensing images based on the contrast pyramid (CP) using the Baldwinian Clonal Selection Algorithm (BCSA), referred to as CPBCSA. Compared with classical methods based on the transform domain, the method proposed in this paper adopts an improved heuristic evolutionary algorithm, wherein the clonal selection algorithm includes Baldwinian learning. In the process of image fusion, BCSA automatically adjusts the fusion coefficients of different sub-bands decomposed by CP according to the value of the fitness function. BCSA also adaptively controls the optimal search direction of the coefficients and accelerates the convergence rate of the algorithm. Finally, the fusion images are obtained via weighted integration of the optimal fusion coefficients and CP reconstruction. Our experiments show that the proposed method outperforms existing methods in terms of both visual effect and objective evaluation criteria, and the fused images are more suitable for human visual or machine perception.

  5. 3rd World Congress on Global Optimization in Engineering & Science

    CERN Document Server

    Ruan, Ning; Xing, Wenxun; WCGO-III; Advances in Global Optimization

    2015-01-01

    This proceedings volume addresses advances in global optimization—a multidisciplinary research field that deals with the analysis, characterization, and computation of global minima and/or maxima of nonlinear, non-convex, and nonsmooth functions in continuous or discrete forms. The volume contains selected papers from the third biennial World Congress on Global Optimization in Engineering & Science (WCGO), held in the Yellow Mountains, Anhui, China on July 8-12, 2013. The papers fall into eight topical sections: mathematical programming; combinatorial optimization; duality theory; topology optimization; variational inequalities and complementarity problems; numerical optimization; stochastic models and simulation; and complex simulation and supply chain analysis.

  6. Tank waste remediation system optimized processing strategy with an altered treatment scheme

    International Nuclear Information System (INIS)

    Slaathaug, E.J.

    1996-03-01

    This report provides an alternative strategy evolved from the current Hanford Site Tank Waste Remediation System (TWRS) programmatic baseline for accomplishing the treatment and disposal of the Hanford Site tank wastes. This optimized processing strategy with an altered treatment scheme performs the major elements of the TWRS Program, but modifies the deployment of selected treatment technologies to reduce the program cost. The present program for development of waste retrieval, pretreatment, and vitrification technologies continues, but the optimized processing strategy reuses a single facility to accomplish the separations/low-activity waste (LAW) vitrification and the high-level waste (HLW) vitrification processes sequentially, thereby eliminating the need for a separate HLW vitrification facility

  7. A Mixed Integer Linear Programming Approach to Electrical Stimulation Optimization Problems.

    Science.gov (United States)

    Abouelseoud, Gehan; Abouelseoud, Yasmine; Shoukry, Amin; Ismail, Nour; Mekky, Jaidaa

    2018-02-01

    Electrical stimulation optimization is a challenging problem. Even when a single region is targeted for excitation, the problem remains a constrained multi-objective optimization problem. The constrained nature of the problem results from safety concerns while its multi-objectives originate from the requirement that non-targeted regions should remain unaffected. In this paper, we propose a mixed integer linear programming formulation that can successfully address the challenges facing this problem. Moreover, the proposed framework can conclusively check the feasibility of the stimulation goals. This helps researchers to avoid wasting time trying to achieve goals that are impossible under a chosen stimulation setup. The superiority of the proposed framework over alternative methods is demonstrated through simulation examples.
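
    The integer-variable flavor of such a formulation can be sketched with a toy feasibility check: integer electrode current levels, a linear "excite the target" constraint and a linear "spare the non-target" constraint. All gain coefficients, thresholds and bounds below are invented for illustration, and exhaustive enumeration stands in for a real MILP solver.

```python
from itertools import product

# Toy feasibility check in the spirit of a stimulation MILP: choose integer
# current levels for three electrodes so the induced field at the target
# region exceeds a threshold while a non-target region stays below its
# threshold, within per-electrode safety bounds. All numbers are invented.
gain_target    = (0.9, 0.4, 0.1)   # field per unit current at the target
gain_nontarget = (0.3, 0.5, 0.6)   # field per unit current at a non-target
MAX_CURRENT = 3                    # per-electrode safety bound (integer levels)

feasible = [
    c for c in product(range(-MAX_CURRENT, MAX_CURRENT + 1), repeat=3)
    if sum(g * i for g, i in zip(gain_target, c)) >= 2.0            # excite target
    and abs(sum(g * i for g, i in zip(gain_nontarget, c))) <= 0.5   # spare non-target
]
print(len(feasible), feasible[:3])
```

    If `feasible` comes back empty, the stimulation goals are provably unattainable under the chosen setup, which mirrors the conclusive feasibility check the abstract emphasizes.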

  8. Selecting electrode configurations for image-guided cochlear implant programming using template matching.

    Science.gov (United States)

    Zhang, Dongqing; Zhao, Yiyuan; Noble, Jack H; Dawant, Benoit M

    2018-04-01

    Cochlear implants (CIs) are neural prostheses that restore hearing using an electrode array implanted in the cochlea. After implantation, the CI processor is programmed by an audiologist. One factor that negatively impacts outcomes and can be addressed by programming is cross-electrode neural stimulation overlap (NSO). We have proposed a system to assist the audiologist in programming the CI that we call image-guided CI programming (IGCIP). IGCIP permits using CT images to detect NSO and recommend deactivation of a subset of electrodes to avoid NSO. We have shown that IGCIP significantly improves hearing outcomes. Most of the IGCIP steps are robustly automated but electrode configuration selection still sometimes requires manual intervention. With expertise, distance-versus-frequency curves, which are a way to visualize the spatial relationship learned from CT between the electrodes and the nerves they stimulate, can be used to select the electrode configuration. We propose an automated technique for electrode configuration selection. A comparison between this approach and one we have previously proposed shows that our method produces results that are as good as those obtained with our previous method while being generic and requiring fewer parameters.

  9. Comparing the Selected Transfer Functions and Local Optimization Methods for Neural Network Flood Runoff Forecast

    Directory of Open Access Journals (Sweden)

    Petr Maca

    2014-01-01

    Full Text Available The presented paper aims to analyze the influence of the selection of the transfer function and training algorithms on neural network flood runoff forecasts. Nine of the most significant flood events, caused by extreme rainfall, were selected from 10 years of measurements on a small headwater catchment in the Czech Republic, and the flood runoff forecast was investigated using an extensive set of multilayer perceptrons with one hidden layer of neurons. The analyzed artificial neural network models with 11 different activation functions in the hidden layer were trained using 7 local optimization algorithms. The results show that the Levenberg-Marquardt algorithm was superior to the remaining tested local optimization methods. When comparing the 11 nonlinear transfer functions used in hidden-layer neurons, the RootSig function was superior to the rest of the analyzed activation functions.

  10. Stochastic optimal control in infinite dimension dynamic programming and HJB equations

    CERN Document Server

    Fabbri, Giorgio; Święch, Andrzej

    2017-01-01

    Providing an introduction to stochastic optimal control in infinite dimension, this book gives a complete account of the theory of second-order HJB equations in infinite-dimensional Hilbert spaces, focusing on its applicability to associated stochastic optimal control problems. It features a general introduction to optimal stochastic control, including basic results (e.g. the dynamic programming principle) with proofs, and provides examples of applications. A complete and up-to-date exposition of the existing theory of viscosity solutions and regular solutions of second-order HJB equations in Hilbert spaces is given, together with an extensive survey of other methods, with a full bibliography. In particular, Chapter 6, written by M. Fuhrman and G. Tessitore, surveys the theory of regular solutions of HJB equations arising in infinite-dimensional stochastic control, via BSDEs. The book is of interest to both pure and applied researchers working in the control theory of stochastic PDEs, and in PDEs in infinite ...

  11. Self-Regulatory Strategies in Daily Life: Selection, Optimization, and Compensation and Everyday Memory Problems

    Science.gov (United States)

    Robinson, Stephanie A.; Rickenbach, Elizabeth H.; Lachman, Margie E.

    2016-01-01

    The effective use of self-regulatory strategies, such as selection, optimization, and compensation (SOC) requires resources. However, it is theorized that SOC use is most advantageous for those experiencing losses and diminishing resources. The present study explored this seeming paradox within the context of limitations or constraints due to…

  12. Optimization of Selected RFID System Parameters

    Directory of Open Access Journals (Sweden)

    Peter Vestenicky

    2004-01-01

    Full Text Available This paper describes a procedure for maximizing the RFID transponder read range. This is done by optimizing the magnetic field intensity at the transponder location and the coupling factor between the antenna and transponder coils. The results of this paper can be used for RFID systems with an inductive loop, i.e. systems working in the near electromagnetic field.

  13. A generalized fuzzy credibility-constrained linear fractional programming approach for optimal irrigation water allocation under uncertainty

    Science.gov (United States)

    Zhang, Chenglong; Guo, Ping

    2017-10-01

    The vague and fuzzy parametric information is a challenging issue in irrigation water management problems. In response to this problem, a generalized fuzzy credibility-constrained linear fractional programming (GFCCFP) model is developed for optimal irrigation water allocation under uncertainty. The model can be derived from integrating generalized fuzzy credibility-constrained programming (GFCCP) into a linear fractional programming (LFP) optimization framework. Therefore, it can solve ratio optimization problems associated with fuzzy parameters, and examine the variation of results under different credibility levels and weight coefficients of possibility and necessity. It has advantages in: (1) balancing the economic and resources objectives directly; (2) analyzing system efficiency; (3) generating more flexible decision solutions by giving different credibility levels and weight coefficients of possibility and necessity; and (4) supporting in-depth analysis of the interrelationships among system efficiency, credibility level and weight coefficient. The model is applied to a case study of irrigation water allocation in the middle reaches of the Heihe River Basin, northwest China, from which optimal irrigation water allocation solutions are obtained. Moreover, factorial analysis of the two parameters (i.e. λ and γ) indicates that the weight coefficient is a main factor, compared with the credibility level, for system efficiency. These results can effectively support reasonable irrigation water resources management and agricultural production.
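
    The ratio objectives that an LFP framework handles can be illustrated with a minimal brute-force sketch, where system efficiency is modeled as net benefit per unit of water under supply and minimum-demand constraints. All coefficients are invented, and the exhaustive search merely stands in for the paper's programming model.

```python
from itertools import product

# Hypothetical two-crop irrigation example: choose integer water allocations
# to maximize system efficiency = net benefit / water used, a linear
# fractional objective of the kind the GFCCFP model optimizes.
benefit = (1.8, 1.2)      # net benefit per unit of water for crops 1 and 2
fixed_cost = 0.5          # fixed system cost subtracted from the benefit
supply = 10               # total water available
min_alloc = (2, 3)        # minimum allocations to meet crop demand

best = None
for x1, x2 in product(range(supply + 1), repeat=2):
    if x1 + x2 > supply or x1 < min_alloc[0] or x2 < min_alloc[1]:
        continue  # infeasible allocation
    ratio = (benefit[0] * x1 + benefit[1] * x2 - fixed_cost) / (x1 + x2)
    if best is None or ratio > best[0]:
        best = (ratio, x1, x2)

print(best)  # (efficiency, allocation to crop 1, allocation to crop 2)
```

    Note that the fixed cost makes the optimum non-obvious: spreading it over more water can raise the ratio even when the marginal benefit of extra water is low.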

  14. Optimization of environmental management strategies through a dynamic stochastic possibilistic multiobjective program.

    Science.gov (United States)

    Zhang, Xiaodong; Huang, Gordon

    2013-02-15

    Greenhouse gas (GHG) emissions from municipal solid waste (MSW) management facilities have become a serious environmental issue. In MSW management, not only economic objectives but also environmental objectives should be considered simultaneously. In this study, a dynamic stochastic possibilistic multiobjective programming (DSPMP) model is developed for supporting MSW management and associated GHG emission control. The DSPMP model improves upon the existing waste management optimization methods through incorporation of fuzzy possibilistic programming and chance-constrained programming into a general mixed-integer multiobjective linear programming (MOP) framework where various uncertainties expressed as fuzzy possibility distributions and probability distributions can be effectively reflected. Two conflicting objectives are integrally considered, including minimization of total system cost and minimization of total GHG emissions from waste management facilities. Three planning scenarios are analyzed and compared, representing different preferences of the decision makers for economic development and environmental-impact (i.e. GHG-emission) issues in integrated MSW management. Optimal decision schemes under three scenarios and different p(i) levels (representing the probability that the constraints would be violated) are generated for planning waste flow allocation and facility capacity expansions as well as GHG emission control. The results indicate that economic and environmental tradeoffs can be effectively reflected through the proposed DSPMP model. The generated decision variables can help the decision makers justify and/or adjust their waste management strategies based on their implicit knowledge and preferences. Copyright © 2012 Elsevier B.V. All rights reserved.

  15. Identification of a selective small molecule inhibitor of breast cancer stem cells.

    Science.gov (United States)

    Germain, Andrew R; Carmody, Leigh C; Morgan, Barbara; Fernandez, Cristina; Forbeck, Erin; Lewis, Timothy A; Nag, Partha P; Ting, Amal; VerPlank, Lynn; Feng, Yuxiong; Perez, Jose R; Dandapani, Sivaraman; Palmer, Michelle; Lander, Eric S; Gupta, Piyush B; Schreiber, Stuart L; Munoz, Benito

    2012-05-15

    A high-throughput screen (HTS) with the National Institute of Health-Molecular Libraries Small Molecule Repository (NIH-MLSMR) compound collection identified a class of acyl hydrazones to be selectively lethal to breast cancer stem cell (CSC) enriched populations. Medicinal chemistry efforts were undertaken to optimize potency and selectivity of this class of compounds. The optimized compound was declared as a probe (ML239) with the NIH Molecular Libraries Program and displayed greater than 20-fold selective inhibition of the breast CSC-like cell line (HMLE_sh_Ecad) over the isogenic control line (HMLE_sh_GFP). Copyright © 2012 Elsevier Ltd. All rights reserved.

  16. EABOT - Energetic analysis as a basis for robust optimization of trigeneration systems by linear programming

    International Nuclear Information System (INIS)

    Piacentino, A.; Cardona, F.

    2008-01-01

    The optimization of synthesis, design and operation in trigeneration systems for building applications is a quite complex task, due to the high number of decision variables, the presence of irregular heat, cooling and electric load profiles and the variable electricity price. Consequently, computer-aided techniques are usually adopted to achieve the optimal solution, based either on iterative techniques, linear or non-linear programming or evolutionary search. Large efforts have been made in improving algorithm efficiency, which have resulted in an increasingly rapid convergence to the optimal solution and in reduced calculation time; robust algorithms have also been formulated, assuming stochastic behaviour for energy loads and prices. This paper is based on the assumption that margins for improvement in the optimization of trigeneration systems still exist, which require an in-depth understanding of the plant's energetic behaviour. Robustness in the optimization of trigeneration systems has more to do with a 'correct and comprehensive' than with an 'efficient' modelling, requiring larger efforts from energy specialists rather than from experts in efficient algorithms. With reference to a mixed integer linear programming model implemented in MatLab for a trigeneration system including a pressurized (medium temperature) heat storage, the relevant contribution of thermoeconomics and energo-environmental analysis in the phases of mathematical modelling and code testing is shown

  17. Pyrochemical and Dry Processing Methods Program. A selected bibliography

    Energy Technology Data Exchange (ETDEWEB)

    McDuffie, H.F.; Smith, D.H.; Owen, P.T.

    1979-03-01

    This selected bibliography with abstracts was compiled to provide information support to the Pyrochemical and Dry Processing Methods (PDPM) Program sponsored by DOE and administered by the Argonne National Laboratory. Objectives of the PDPM Program are to evaluate nonaqueous methods of reprocessing spent fuel as a route to the development of proliferation-resistant and diversion-resistant methods for widespread use in the nuclear industry. Emphasis was placed on the literature indexed in the ERDA--DOE Energy Data Base (EDB). The bibliography includes indexes to authors, subject descriptors, EDB subject categories, and titles.

  18. Pyrochemical and Dry Processing Methods Program. A selected bibliography

    International Nuclear Information System (INIS)

    McDuffie, H.F.; Smith, D.H.; Owen, P.T.

    1979-03-01

    This selected bibliography with abstracts was compiled to provide information support to the Pyrochemical and Dry Processing Methods (PDPM) Program sponsored by DOE and administered by the Argonne National Laboratory. Objectives of the PDPM Program are to evaluate nonaqueous methods of reprocessing spent fuel as a route to the development of proliferation-resistant and diversion-resistant methods for widespread use in the nuclear industry. Emphasis was placed on the literature indexed in the ERDA--DOE Energy Data Base (EDB). The bibliography includes indexes to authors, subject descriptors, EDB subject categories, and titles

  19. Optimal Allocation of Static Var Compensator via Mixed Integer Conic Programming

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xiaohu [ORNL; Shi, Di [Global Energy Interconnection Research Institute North America (GEIRI North America), California; Wang, Zhiwei [Global Energy Interconnection Research Institute North America (GEIRI North America), California; Huang, Junhui [Global Energy Interconnection Research Institute North America (GEIRI North America), California; Wang, Xu [Global Energy Interconnection Research Institute North America (GEIRI North America), California; Liu, Guodong [ORNL; Tomsovic, Kevin [University of Tennessee, Knoxville (UTK)

    2017-01-01

    Shunt FACTS devices, such as, a Static Var Compensator (SVC), are capable of providing local reactive power compensation. They are widely used in the network to reduce the real power loss and improve the voltage profile. This paper proposes a planning model based on mixed integer conic programming (MICP) to optimally allocate SVCs in the transmission network considering load uncertainty. The load uncertainties are represented by a number of scenarios. Reformulation and linearization techniques are utilized to transform the original non-convex model into a convex second order cone programming (SOCP) model. Numerical case studies based on the IEEE 30-bus system demonstrate the effectiveness of the proposed planning model.

  20. Moral Hazard, Adverse Selection and the Optimal Consumption-Leisure Choice under Equilibrium Price Dispersion

    Directory of Open Access Journals (Sweden)

    Sergey Malakhov

    2017-09-01

    Full Text Available The analysis of the optimal consumption-leisure choice under equilibrium price dispersion reveals the methodological difference between the problems of moral hazard and adverse selection. While the phenomenon of moral hazard represents the individual behavioral reaction to the marginal rate of substitution of leisure for consumption proposed by the insurance policy, adverse selection can take place on any imperfect market under equilibrium price dispersion, and it looks like a market phenomenon of natural selection between consumers with different incomes and different propensities to search. The analysis of health insurance, where the propensity to search takes the form of the propensity to seek healthcare, demonstrates that moral hazard takes place when the insurance policy proposes a suboptimal consumption-leisure choice; the increase in consumption of medical services with the reduction of leisure time then represents not an unlimited demand for “free goods” but the simple process of consumption-leisure optimization. The path of consumerism with consumer-directed plans can partly solve the problem of moral hazard because, in order to eliminate moral hazard, this trend should lead to the re-sale of medical services under health vouchers, as takes place in the life settlement market.

  1. A Goal Programming Optimization Model for The Allocation of Liquid Steel Production

    Science.gov (United States)

    Hapsari, S. N.; Rosyidi, C. N.

    2018-03-01

    This research was conducted in one of the largest steel companies in Indonesia, which has several production units and produces a wide range of steel products. One of the important products of the company is billet steel. The company has four Electric Arc Furnaces (EAF) which produce liquid steel that must be processed further into billet steel. The billet steel plant needs to make its production process more efficient to increase productivity. The management has five goals to be achieved, and hence the optimal allocation of liquid steel production is needed to achieve those goals. In this paper, a goal programming optimization model is developed to determine the optimal allocation of liquid steel production in each EAF, to satisfy demand in 3 periods and the company goals, namely maximizing the volume of production, minimizing the cost of raw materials, minimizing maintenance costs, maximizing sales revenues, and maximizing production capacity. From the results of the optimization, only the production capacity maximization goal cannot achieve its target. However, the model developed in this paper can optimally allocate liquid steel so that the allocation of production does not exceed the maximum capacity of machine work hours and the maximum production capacity.
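
    The deviation-variable mechanics of such a goal program can be sketched with a toy two-furnace example (all numbers invented, not the company's data): each goal contributes a weighted penalty for its undesirable deviation, and the allocation minimizing the total penalty is sought.

```python
from itertools import product

# Minimal weighted goal-programming sketch. Two furnaces, integer
# production levels 0..5; two goals with deviation penalties:
#   goal 1: total production of at least 8 units   (penalize shortfall, d1^-)
#   goal 2: raw-material cost of at most 20 units  (penalize excess, d2^+)
unit_cost = (2, 3)        # raw-material cost per unit at each furnace
w_volume, w_cost = 2, 1   # relative importance of the two goals

def penalty(q1, q2):
    shortfall = max(0, 8 - (q1 + q2))                            # d1^-
    cost_over = max(0, unit_cost[0]*q1 + unit_cost[1]*q2 - 20)   # d2^+
    return w_volume * shortfall + w_cost * cost_over

# Exhaustive search stands in for an LP/MILP goal-programming solver.
best = min(product(range(6), repeat=2), key=lambda q: penalty(*q))
print(best, penalty(*best))
```

    A zero total penalty means every goal is met exactly or over-achieved in the desirable direction; a positive value shows which compromises the weights force.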

  2. Optimization of ISSR-PCR reaction system and selection of primers in Bryum argenteum

    Directory of Open Access Journals (Sweden)

    Ma Xiaoying

    2017-02-01

    Full Text Available In order to determine the optimum ISSR-PCR reaction system for the moss Bryum argenteum, the concentrations of template DNA, primers, dNTPs, Mg2+ and Taq DNA polymerase were optimized at four levels by an orthogonal PCR experimental design. Appropriate primers were screened from 100 candidates by temperature-gradient PCR, and the optimal annealing temperature of the screened primers was determined. The results showed that the optimized 20 μL ISSR-PCR reaction system was as follows: template DNA 20 ng/20 μL, primers 0.45 μmol/L, Mg2+ 2.65 mmol/L, Taq DNA polymerase 0.4 U/20 μL, dNTPs 0.45 mmol/L. Using this system, 50 primers with clear bands, good repeatability and high polymorphism were selected from the 100 primers. The established system, the screened primers and the annealing temperatures provide a theoretical basis for further research on the genetic diversity of bryophytes using ISSR molecular markers.

  3. Cost-Effectiveness of a Community Exercise and Nutrition Program for Older Adults: Texercise Select.

    Science.gov (United States)

    Akanni, Olufolake Odufuwa; Smith, Matthew Lee; Ory, Marcia G

    2017-05-20

    The wide-spread dissemination of evidence-based programs that can improve health outcomes among older populations often requires an understanding of factors influencing community adoption of such programs. One such program is Texercise Select, a community-based health promotion program previously shown to improve functional health, physical activity, nutritional habits and quality of life among older adults. This paper assesses the cost-effectiveness of Texercise Select in the context of supportive environments to facilitate its delivery and statewide sustainability. Participants were surveyed using self-reported instruments distributed at program baseline and conclusion. Program costs were based on actual direct costs of program implementation and included costs of recruitment and outreach, personnel costs and participant incentives. Program effectiveness was measured using quality-adjusted life years (QALY) gained, as well as health outcomes, such as healthy days, weekly physical activity and Timed Up-and-Go (TUG) test scores. Preference-based EuroQol (EQ-5D) scores were estimated from the number of healthy days reported by participants and converted into QALYs. There was a significant increase in the number of healthy days among participants. Together with the physical activity and nutrition-related outcomes, this study supports the use of Texercise Select as an intervention with substantial health and cost benefits.

  4. Optimization of contrast of MR images in imaging of knee joint

    International Nuclear Information System (INIS)

    Szyblinski, K.; Bacic, G.

    1994-01-01

    The work describes a method of contrast optimization in magnetic resonance imaging. The computer program presented in the report allows analysis of the contrast of selected tissues as a function of the experiment parameters. Its application to imaging of the knee joint is presented.
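
    As a rough sketch of this kind of analysis, the standard spin-echo signal equation S = PD·(1 − exp(−TR/T1))·exp(−TE/T2) can be scanned over TR/TE to find the setting that maximizes the contrast between two tissues. The relaxation times below are illustrative placeholders, not values from the report.

```python
import math

# Contrast optimization sketch using the standard spin-echo signal model.
# Tissue parameters are illustrative placeholders (times in ms).
tissues = {
    "cartilage": {"PD": 1.0, "T1": 1000.0, "T2": 40.0},
    "fluid":     {"PD": 1.0, "T1": 3000.0, "T2": 200.0},
}

def signal(t, TR, TE):
    # S = PD * (1 - exp(-TR/T1)) * exp(-TE/T2)
    return t["PD"] * (1 - math.exp(-TR / t["T1"])) * math.exp(-TE / t["T2"])

# Scan a TR/TE grid for the parameter pair maximizing absolute contrast.
best = max(
    ((TR, TE) for TR in range(200, 4001, 200) for TE in range(10, 201, 10)),
    key=lambda p: abs(signal(tissues["cartilage"], *p)
                      - signal(tissues["fluid"], *p)),
)
print(best)  # (TR, TE) with the largest cartilage-fluid contrast on this grid
```

    With these placeholder values the grid search lands on a long-TR, long-TE (T2-weighted) setting, where fluid stays bright while cartilage has decayed.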

  5. Fruit Phenolic Profiling: A New Selection Criterion in Olive Breeding Programs.

    Science.gov (United States)

    Pérez, Ana G; León, Lorenzo; Sanz, Carlos; de la Rosa, Raúl

    2018-01-01

    Olive growing is mainly based on traditional varieties selected by the growers across the centuries. The few attempts so far reported to obtain new varieties by systematic breeding have been mainly focused on improving the olive adaptation to different growing systems, the productivity and the oil content. However, the improvement of oil quality has rarely been considered as selection criterion and only in the latter stages of the breeding programs. Due to their health promoting and organoleptic properties, phenolic compounds are one of the most important quality markers for Virgin olive oil (VOO) although they are not commonly used as quality traits in olive breeding programs. This is mainly due to the difficulties for evaluating oil phenolic composition in large number of samples and the limited knowledge on the genetic and environmental factors that may influence phenolic composition. In the present work, we propose a high throughput methodology to include the phenolic composition as a selection criterion in olive breeding programs. For that purpose, the phenolic profile has been determined in fruits and oils of several breeding selections and two varieties ("Picual" and "Arbequina") used as control. The effect of three different environments, typical for olive growing in Andalusia, Southern Spain, was also evaluated. A high genetic effect was observed on both fruit and oil phenolic profile. In particular, the breeding selection UCI2-68 showed an optimum phenolic profile, which sums up to a good agronomic performance previously reported. A high correlation was found between fruit and oil total phenolic content as well as some individual phenols from the two different matrices. The environmental effect on phenolic compounds was also significant in both fruit and oil, although the low genotype × environment interaction allowed similar ranking of genotypes on the different environments. In summary, the high genotypic variance and the simplified procedure of the

  6. Fruit Phenolic Profiling: A New Selection Criterion in Olive Breeding Programs

    Directory of Open Access Journals (Sweden)

    Ana G. Pérez

    2018-02-01

    Full Text Available Olive growing is mainly based on traditional varieties selected by the growers across the centuries. The few attempts so far reported to obtain new varieties by systematic breeding have been mainly focused on improving the olive adaptation to different growing systems, the productivity and the oil content. However, the improvement of oil quality has rarely been considered as selection criterion and only in the latter stages of the breeding programs. Due to their health promoting and organoleptic properties, phenolic compounds are one of the most important quality markers for Virgin olive oil (VOO) although they are not commonly used as quality traits in olive breeding programs. This is mainly due to the difficulties for evaluating oil phenolic composition in large number of samples and the limited knowledge on the genetic and environmental factors that may influence phenolic composition. In the present work, we propose a high throughput methodology to include the phenolic composition as a selection criterion in olive breeding programs. For that purpose, the phenolic profile has been determined in fruits and oils of several breeding selections and two varieties (“Picual” and “Arbequina”) used as control. The effect of three different environments, typical for olive growing in Andalusia, Southern Spain, was also evaluated. A high genetic effect was observed on both fruit and oil phenolic profile. In particular, the breeding selection UCI2-68 showed an optimum phenolic profile, which sums up to a good agronomic performance previously reported. A high correlation was found between fruit and oil total phenolic content as well as some individual phenols from the two different matrices. The environmental effect on phenolic compounds was also significant in both fruit and oil, although the low genotype × environment interaction allowed similar ranking of genotypes on the different environments. In summary, the high genotypic variance and the

  7. Progressive sampling-based Bayesian optimization for efficient and automatic machine learning model selection.

    Science.gov (United States)

    Zeng, Xueqiang; Luo, Gang

    2017-12-01

    Machine learning is broadly used for clinical data analysis. Before training a model, a machine learning algorithm must be selected. Also, the values of one or more model parameters termed hyper-parameters must be set. Selecting algorithms and hyper-parameter values requires advanced machine learning knowledge and many labor-intensive manual iterations. To lower the bar to machine learning, miscellaneous automatic selection methods for algorithms and/or hyper-parameter values have been proposed. Existing automatic selection methods are inefficient on large data sets. This poses a challenge for using machine learning in the clinical big data era. To address the challenge, this paper presents progressive sampling-based Bayesian optimization, an efficient and automatic selection method for both algorithms and hyper-parameter values. We report an implementation of the method. We show that compared to a state-of-the-art automatic selection method, our method can significantly reduce search time, classification error rate, and standard deviation of error rate due to randomization. This is major progress towards enabling fast turnaround in identifying high-quality solutions required by many machine learning-based clinical data analysis tasks.
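
    The core progressive-sampling loop can be sketched as follows: evaluate a candidate configuration on geometrically growing samples and stop once the learning curve flattens, instead of always training on the full data set. The closed-form accuracy curve below is a synthetic stand-in for actually training and validating a model.

```python
import math

# Progressive-sampling sketch: double the sample size until the estimated
# accuracy gain drops below a tolerance, then stop early.
def estimated_accuracy(n_samples):
    # Synthetic learning curve; in practice this would train and validate
    # the candidate model/hyper-parameter configuration on n_samples records.
    return 0.90 - 0.5 / math.sqrt(n_samples)

n, full_size, tol = 100, 100_000, 0.005
history = [(n, estimated_accuracy(n))]
while n < full_size:
    n = min(2 * n, full_size)
    acc = estimated_accuracy(n)
    gain = acc - history[-1][1]
    history.append((n, acc))
    if gain < tol:  # the learning curve has flattened: stop early
        break

print(history[-1])  # (samples actually used, estimated accuracy there)
```

    The early stop is what makes the approach viable on large clinical data sets: poor configurations are discarded after seeing only a small fraction of the records.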

  8. Modeling and prioritizing demand response programs in power markets

    International Nuclear Information System (INIS)

    Aalami, H.A.; Moghaddam, M. Parsa; Yousefi, G.R.

    2010-01-01

    One of the responsibilities of power market regulator is setting rules for selecting and prioritizing demand response (DR) programs. There are many different alternatives of DR programs for improving load profile characteristics and achieving customers' satisfaction. Regulator should find the optimal solution which reflects the perspectives of each DR stakeholder. Multi Attribute Decision Making (MADM) is a proper method for handling such optimization problems. In this paper, an extended responsive load economic model is developed. The model is based on price elasticity and customer benefit function. Prioritizing of DR programs can be realized by means of Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) method. Considerations of ISO/utility/customer regarding the weighting of attributes are encountered by entropy method. An Analytical Hierarchy Process (AHP) is used for selecting the most effective DR program. Numerical studies are conducted on the load curve of the Iranian power grid in 2007. (author)
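
    The TOPSIS step can be sketched in a few lines: vector-normalize the decision matrix, weight it, and rank alternatives by closeness to the ideal solution. The decision matrix, weights and attribute choices below are invented, not taken from the paper's Iranian-grid case study.

```python
import math

# Minimal TOPSIS sketch for ranking demand-response programs. Rows are
# candidate DR programs, columns are attributes (e.g. peak reduction,
# customer bill saving, utility benefit), all treated as benefit criteria.
matrix = [
    [40, 12, 7],   # program A
    [25, 18, 9],   # program B
    [35, 10, 5],   # program C
]
weights = [0.5, 0.3, 0.2]

# Vector-normalize each column, then apply the attribute weights.
norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(3)]
v = [[w * x / n for x, w, n in zip(row, weights, norms)] for row in matrix]

ideal = [max(col) for col in zip(*v)]   # best value per attribute
anti = [min(col) for col in zip(*v)]    # worst value per attribute

def closeness(row):
    d_best = math.dist(row, ideal)
    d_worst = math.dist(row, anti)
    return d_worst / (d_best + d_worst)

ranking = sorted(range(3), key=lambda i: closeness(v[i]), reverse=True)
print(ranking)  # program indices, best first
```

    In the paper the attribute weights come from the entropy method and the final program choice from AHP; here they are fixed by hand to keep the sketch self-contained.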

  9. Algorithms for optimizing drug therapy

    Directory of Open Access Journals (Sweden)

    Martin Lene

    2004-07-01

    Full Text Available Abstract Background Drug therapy has become increasingly efficient, with more drugs available for treatment of an ever-growing number of conditions. Yet, drug use is reported to be suboptimal in several aspects, such as dosage, patients' adherence and outcome of therapy. The aim of the current study was to investigate the possibility of optimizing drug therapy using computer programs, available on the Internet. Methods One hundred and ten officially endorsed text documents, published between 1996 and 2004, containing guidelines for drug therapy in 246 disorders, were analyzed with regard to information about patient-, disease- and drug-related factors and relationships between these factors. This information was used to construct algorithms for identifying optimum treatment in each of the studied disorders. These algorithms were categorized in order to define as few models as possible that still could accommodate the identified factors and the relationships between them. The resulting program prototypes were implemented in HTML (user interface) and JavaScript (program logic). Results Three types of algorithms were sufficient for the intended purpose. The simplest type is a list of factors, each of which implies that the particular patient should or should not receive treatment. This is adequate in situations where only one treatment exists. The second type, a more elaborate model, is required when treatment can be provided using drugs from different pharmacological classes and the selection of drug class is dependent on patient characteristics. An easily implemented set of if-then statements was able to manage the identified information in such instances. The third type was needed in the few situations where the selection and dosage of drugs depended on the degree to which one or more patient-specific factors were present. In these cases the implementation of an established decision model based on fuzzy sets was required. Computer programs

  10. Cancer microarray data feature selection using multi-objective binary particle swarm optimization algorithm

    Science.gov (United States)

    Annavarapu, Chandra Sekhara Rao; Dara, Suresh; Banka, Haider

    2016-01-01

    Cancer investigations in microarray data play a major role in cancer analysis and treatment. Cancer microarray data consist of complex gene expression patterns of cancer. In this article, a Multi-Objective Binary Particle Swarm Optimization (MOBPSO) algorithm is proposed for analyzing cancer gene expression data. Because such data are high dimensional, a fast heuristic-based pre-processing technique is employed to remove some of the crude domain features from the initial feature set. Since these pre-processed and reduced features are still high dimensional, the proposed MOBPSO algorithm is used for finding further feature subsets. The objective functions are modeled on two conflicting objectives, i.e., the cardinality of the feature subsets and the distinctive capability of the selected subsets. As these two objectives are conflicting in nature, they are well suited to multi-objective modeling. The experiments are carried out on benchmark gene expression datasets available in the literature, i.e., Colon, Lymphoma and Leukaemia. The performance of the selected feature subsets is evaluated by their classification accuracy, validated using 10-fold cross-validation. A detailed comparative study is also made to show the competitiveness of the proposed algorithm. PMID:27822174
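    The core mechanic of binary PSO — 0/1 feature masks as particle positions, with velocities squashed through a sigmoid into bit-flip probabilities — can be sketched in a few lines. This is a simplified single-objective version (the paper's MOBPSO optimizes two objectives as a Pareto front); the toy fitness folds subset cardinality and an error proxy into one score, and all data are synthetic:

    ```python
    import math
    import random

    def bpso_feature_select(n_features, fitness, n_particles=10, iters=30, seed=0):
        """Minimal binary PSO: positions are 0/1 masks, velocities are mapped
        through a sigmoid to per-bit probabilities of setting the bit."""
        rng = random.Random(seed)
        pos = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(n_particles)]
        vel = [[0.0] * n_features for _ in range(n_particles)]
        pbest = [p[:] for p in pos]                       # personal bests
        pbest_f = [fitness(p) for p in pos]
        g = min(range(n_particles), key=lambda i: pbest_f[i])
        gbest, gbest_f = pbest[g][:], pbest_f[g]          # global best
        for _ in range(iters):
            for i in range(n_particles):
                for d in range(n_features):
                    r1, r2 = rng.random(), rng.random()
                    vel[i][d] = (0.7 * vel[i][d]
                                 + 1.4 * r1 * (pbest[i][d] - pos[i][d])
                                 + 1.4 * r2 * (gbest[d] - pos[i][d]))
                    prob = 1.0 / (1.0 + math.exp(-vel[i][d]))   # sigmoid transfer
                    pos[i][d] = 1 if rng.random() < prob else 0
                f = fitness(pos[i])
                if f < pbest_f[i]:
                    pbest[i], pbest_f[i] = pos[i][:], f
                    if f < gbest_f:
                        gbest, gbest_f = pos[i][:], f
        return gbest, gbest_f

    # Toy objective: features 0 and 3 are "informative"; missing one costs 10,
    # and each selected feature adds 1 (the cardinality penalty).
    informative = {0, 3}
    def fitness(mask):
        miss = sum(1 for j in informative if mask[j] == 0)
        return 10 * miss + sum(mask)

    best, f = bpso_feature_select(8, fitness)
    ```

    On this toy problem the swarm reliably recovers the two informative features while pruning most of the rest; a true multi-objective variant would instead keep an archive of non-dominated (cardinality, accuracy) trade-offs.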

  11. Pretreatment of wastewater: Optimal coagulant selection using Partial Order Scaling Analysis (POSA)

    International Nuclear Information System (INIS)

    Tzfati, Eran; Sein, Maya; Rubinov, Angelika; Raveh, Adi; Bick, Amos

    2011-01-01

    Jar-testing is a well-known tool for chemical selection in physical-chemical wastewater treatment. Jar-test results show treatment efficiency in terms of suspended matter and organic matter removal. However, in spite of having all these results, coagulant selection is not an easy task, because one coagulant may remove the suspended solids efficiently but at the same time increase the conductivity. This makes the final selection of coagulants very dependent on the relative importance assigned to each measured parameter. In this paper, the use of Partial Order Scaling Analysis (POSA) and multi-criteria decision analysis is proposed to support the selection of the coagulant and its concentration in a sequencing batch reactor (SBR). Starting from the parameters fixed by the jar-test results, these techniques make it possible to weight the parameters according to the judgments of wastewater experts and to establish priorities among coagulants. An evaluation of two commonly used coagulation/flocculation aids (Alum and Ferric Chloride) was conducted; based on the jar tests and the POSA model, Ferric Chloride (100 ppm) was the best choice. The results obtained show that POSA and multi-criteria techniques are useful tools for selecting the optimal chemicals for physical-chemical treatment.
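    The trade-off described above — one coagulant wins on solids removal but loses on conductivity — is the reason a weighting step is needed at all. POSA proper builds a partial order over the alternatives; the sketch below only illustrates the simpler weighted-sum idea, and every number (jar-test values and expert weights alike) is hypothetical:

    ```python
    # Hypothetical jar-test results per alternative:
    # (TSS removal %, COD removal %, added conductivity in uS/cm)
    jar_tests = {
        "Alum 100 ppm":  (82.0, 55.0, 140.0),
        "FeCl3 100 ppm": (92.0, 68.0, 150.0),
        "FeCl3 50 ppm":  (75.0, 48.0,  95.0),
    }
    # Expert weights: removal is rewarded, added conductivity is penalized.
    weights = (0.5, 0.3, -0.2)

    def score(values):
        return sum(w * v for w, v in zip(weights, values))

    best = max(jar_tests, key=lambda k: score(jar_tests[k]))
    print(best)   # FeCl3 100 ppm
    ```

    With a different weight vector the ranking can flip, which is precisely why the paper stresses eliciting the weights from wastewater experts rather than fixing them ad hoc.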

  12. Optimizing Crawler4j using MapReduce Programming Model

    Science.gov (United States)

    Siddesh, G. M.; Suresh, Kavya; Madhuri, K. Y.; Nijagal, Madhushree; Rakshitha, B. R.; Srinivasa, K. G.

    2017-06-01

    The World Wide Web is a decentralized system consisting of a repository of information in the form of web pages. These web pages act as a source of information or data in the present analytics world. Web crawlers are used for extracting useful information from web pages for different purposes. Firstly, they are used in web search engines, where web pages are indexed to form a corpus of information that users can query. Secondly, they are used for web archiving, where web pages are stored for later analysis. Thirdly, they can be used for web mining, where web pages are monitored for copyright purposes. The amount of information processed by a web crawler can be increased by using the capabilities of modern parallel processing technologies. To address parallelism and crawling throughput, this work proposes to optimize Crawler4j using the Hadoop MapReduce programming model by parallelizing the processing of large input data. Crawler4j is a web crawler that retrieves useful information about the pages it visits. Crawler4j coupled with the data and computational parallelism of the Hadoop MapReduce programming model improves the throughput and accuracy of web crawling. The experimental results demonstrate that the proposed solution achieves significant improvements in performance and throughput. Hence the proposed approach carves out a new methodology for optimizing web crawling by achieving significant performance gains.
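    The MapReduce flow that makes this parallelization possible has three phases: map each input record to key-value pairs, shuffle the pairs by key, and reduce each group to a result. The in-process sketch below simulates that flow for a toy indexing task (word counts over fetched pages); on Hadoop the map and reduce functions would run as distributed tasks, and the page contents here are made-up placeholders:

    ```python
    from collections import defaultdict
    from itertools import chain

    # Placeholder "fetched pages": URL -> page text.
    pages = {
        "http://a.example": "web crawler indexes web pages",
        "http://b.example": "crawler retrieves pages",
    }

    def map_phase(url, text):
        """Map: emit (word, 1) for every word on the page."""
        return [(word, 1) for word in text.split()]

    def shuffle(pairs):
        """Shuffle: group emitted values by key."""
        groups = defaultdict(list)
        for key, value in pairs:
            groups[key].append(value)
        return groups

    def reduce_phase(groups):
        """Reduce: sum the counts for each word."""
        return {key: sum(values) for key, values in groups.items()}

    counts = reduce_phase(shuffle(chain.from_iterable(
        map_phase(u, t) for u, t in pages.items())))
    print(counts["pages"])   # 2
    ```

    Because each mapper sees only its own slice of the URL list and the reducers only see per-key groups, the same structure scales from this toy loop to a cluster without changing the user-written functions.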

  13. Comparing the Selection and Placement of Best Management Practices in Improving Water Quality Using a Multiobjective Optimization and Targeting Method

    Directory of Open Access Journals (Sweden)

    Li-Chi Chiang

    2014-03-01

    Full Text Available Suites of Best Management Practices (BMPs are usually selected to be economically and environmentally efficient in reducing nonpoint source (NPS pollutants from agricultural areas in a watershed. The objective of this research was to compare the selection and placement of BMPs in a pasture-dominated watershed using multiobjective optimization and targeting methods. Two objective functions were used in the optimization process, which minimize pollutant losses and the BMP placement areas. The optimization tool was an integration of a multi-objective genetic algorithm (GA and a watershed model (Soil and Water Assessment Tool—SWAT. For the targeting method, an optimum BMP option was implemented in the critical areas of the watershed that contribute the greatest pollutant losses. A total of 171 BMP combinations, consisting of grazing management, vegetated filter strips (VFS, and poultry litter applications, were considered. The results showed that the optimization is less effective when VFS are not considered, and that it requires much longer computation times than the targeting method to search for optimum BMPs. Although the targeting method is effective in selecting and placing an optimum BMP, larger areas are needed for BMP implementation to achieve the same pollutant reductions as the optimization method.

  14. DETERMINATION OF OPTIMAL CONTOURS OF OPEN PIT MINE DURING OIL SHALE EXPLOITATION, BY MINEX 5.2.3. PROGRAM

    Directory of Open Access Journals (Sweden)

    Miroslav Ignjatović

    2013-04-01

    Full Text Available Examination and determination of the optimal technological processes for exploitation and processing of oil shale from the Aleksinac site, together with the adopted technical solution for oil shale exploitation, yielded a technical solution that optimizes the contour of the newly defined open pit mine. Worldwide, this problem is solved using computer programs that have become the established standard for quick and efficient solutions. One such program for determining the optimal contours of open pit mines is Minex 5.2.3, produced in Australia by the Surpac Minex Group Pty Ltd and used at the Mining and Metallurgy Institute Bor (license nos. SSI-24765 and SSI-24766). In this study, the authors performed 11 optimizations of the deposit geo-models in Minex 5.2.3, based on test results obtained in the soil mechanics laboratory of the Mining and Metallurgy Institute Bor on samples from the Aleksinac deposit.

  15. Revealing metabolite biomarkers for acupuncture treatment by linear programming based feature selection.

    Science.gov (United States)

    Wang, Yong; Wu, Qiao-Feng; Chen, Chen; Wu, Ling-Yun; Yan, Xian-Zhong; Yu, Shu-Guang; Zhang, Xiang-Sun; Liang, Fan-Rong

    2012-01-01

    Acupuncture has been practiced in China for thousands of years as part of Traditional Chinese Medicine (TCM) and has gradually been accepted in western countries as an alternative or complementary treatment. However, the underlying mechanism of acupuncture, especially whether there is any difference between various acupoints, remains largely unknown, which hinders its widespread use. In this study, we develop a novel Linear Programming based Feature Selection method (LPFS) to understand the mechanism of the acupuncture effect at the molecular level by revealing the metabolite biomarkers of acupuncture treatment. Specifically, we generate and investigate high-throughput metabolic profiles of acupuncture treatment at several acupoints in humans. To select the subsets of metabolites that best characterize the acupuncture effect for each meridian point, an optimization model is proposed to identify biomarkers from high-dimensional metabolic data from case and control samples. Importantly, we use the nearest centroid as the prototype to simultaneously minimize the number of selected features and the leave-one-out cross-validation error of the classifier. We compared the performance of LPFS to several state-of-the-art methods, such as SVM recursive feature elimination (SVM-RFE) and the sparse multinomial logistic regression approach (SMLR). We find that our LPFS method tends to reveal a small set of metabolites with small standard deviations and large shifts, which exactly meets our requirements for a good biomarker. Biologically, several metabolite biomarkers of acupuncture treatment are revealed and serve as candidates for further mechanistic investigation. Also, biomarkers derived from five meridian points, Zusanli (ST36), Liangmen (ST21), Juliao (ST3), Yanglingquan (GB34), and Weizhong (BL40), are compared for their similarity and difference, which provides evidence for the specificity of acupoints. Our results demonstrate that metabolic profiling might be a promising method to
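    The nearest-centroid prototype with leave-one-out cross-validation that LPFS minimizes can be sketched directly: classify each held-out sample by the closest class centroid computed from the remaining samples. The tiny 2-D "metabolite profiles" below are synthetic, and this sketch omits the linear-programming feature-selection layer the paper wraps around it:

    ```python
    def centroid(rows):
        """Component-wise mean of a list of equal-length feature vectors."""
        n = len(rows)
        return [sum(r[d] for r in rows) / n for d in range(len(rows[0]))]

    def dist2(a, b):
        """Squared Euclidean distance."""
        return sum((x - y) ** 2 for x, y in zip(a, b))

    def loocv_error(samples):
        """samples: list of (feature_vector, label); returns the
        leave-one-out error rate of a nearest-centroid classifier."""
        errors = 0
        for i, (x, y) in enumerate(samples):
            rest = samples[:i] + samples[i + 1:]
            cents = {label: centroid([f for f, lab in rest if lab == label])
                     for label in {lab for _, lab in rest}}
            pred = min(cents, key=lambda lab: dist2(x, cents[lab]))
            errors += pred != y
        return errors / len(samples)

    data = [([0.1, 0.2], "case"), ([0.2, 0.1], "case"),
            ([0.9, 1.0], "control"), ([1.0, 0.8], "control")]
    print(loocv_error(data))   # 0.0
    ```

    In LPFS this error, evaluated on the candidate feature subset, is traded off against the subset's cardinality inside the linear program, so that small, well-separated marker sets win.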

  16. Optimal placement of capacitors in a radial network using conic and mixed integer linear programming

    Energy Technology Data Exchange (ETDEWEB)

    Jabr, R.A. [Electrical, Computer and Communication Engineering Department, Notre Dame University, P.O. Box: 72, Zouk Mikhael, Zouk Mosbeh (Lebanon)

    2008-06-15

    This paper considers the problem of optimally placing fixed and switched type capacitors in a radial distribution network. The aim of this problem is to minimize the costs associated with capacitor banks, peak power, and energy losses whilst satisfying a pre-specified set of physical and technical constraints. The proposed solution is obtained using a two-phase approach. In phase-I, the problem is formulated as a conic program in which all nodes are candidates for placement of capacitor banks whose sizes are considered as continuous variables. A global solution of the phase-I problem is obtained using an interior-point based conic programming solver. Phase-II seeks a practical optimal solution by considering capacitor sizes as discrete variables. The problem in this phase is formulated as a mixed integer linear program based on minimizing the L1-norm of deviations from the phase-I state variable values. The solution to the phase-II problem is obtained using a mixed integer linear programming solver. The proposed method is validated via extensive comparisons with previously published results. (author)
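    The phase-II idea — snap the continuous phase-I capacitor sizes to a discrete bank catalogue while minimizing the L1-norm of the deviation — reduces to independent per-node choices when no coupling constraints bind (the paper's full formulation is a mixed integer linear program precisely because network constraints do couple the nodes). A toy sketch of the uncoupled special case, with hypothetical node sizes and bank catalogue:

    ```python
    # Phase-I (continuous, conic relaxation) output: optimal kvar per node.
    continuous_kvar = {"node3": 412.0, "node7": 885.0, "node12": 150.0}
    # Discrete capacitor bank sizes available (kvar).
    bank_sizes = [0, 150, 300, 450, 600, 900]

    def phase2_round(cont, banks):
        """Per-node L1-optimal rounding: with no coupling constraints, the
        L1-minimizing discrete choice is simply the nearest bank size."""
        return {n: min(banks, key=lambda b: abs(b - q)) for n, q in cont.items()}

    chosen = phase2_round(continuous_kvar, bank_sizes)
    print(chosen)   # {'node3': 450, 'node7': 900, 'node12': 150}
    ```

    Once voltage and flow constraints are reinstated, the nearest size at one node may force a different choice at another, which is what pushes the real phase-II problem into a MILP solver.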

  17. Effect of Selection of Design Parameters on the Optimization of a Horizontal Axis Wind Turbine via Genetic Algorithm

    International Nuclear Information System (INIS)

    Alpman, Emre

    2014-01-01

    The effect of the selection of the twist angle and chord length distributions on wind turbine blade design was investigated by performing aerodynamic optimization of a two-bladed, stall-regulated horizontal axis wind turbine. Twist angle and chord length distributions were defined using Bezier curves with 3, 5, 7 and 9 control points uniformly distributed along the span. Optimizations performed using a micro-genetic algorithm with populations of 5, 10, 15 and 20 individuals showed that the number of control points clearly affected the outcome of the process; however, the effects differed for different population sizes. The results also showed the superiority of the micro-genetic algorithm over a standard genetic algorithm for the selected population sizes. Optimizations were also performed using a macroevolutionary algorithm, and the resulting best blade design was compared with that yielded by the micro-genetic algorithm.
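    Parameterizing a spanwise distribution by Bezier control points means the optimizer searches over a handful of numbers while the curve stays smooth. A Bezier curve can be evaluated with de Casteljau's algorithm; the control-point values below are made up for illustration:

    ```python
    def bezier(control_points, t):
        """De Casteljau evaluation of a 1-D Bezier curve at t in [0, 1]:
        repeatedly interpolate adjacent points until one value remains."""
        pts = list(control_points)
        while len(pts) > 1:
            pts = [(1 - t) * a + t * b for a, b in zip(pts, pts[1:])]
        return pts[0]

    # Hypothetical twist distribution (degrees) from 5 control points,
    # root (t = 0) to tip (t = 1).
    twist_ctrl = [20.0, 12.0, 6.0, 2.0, 0.0]
    root_twist = bezier(twist_ctrl, 0.0)   # 20.0: the curve interpolates its endpoints
    tip_twist = bezier(twist_ctrl, 1.0)    # 0.0
    mid_twist = bezier(twist_ctrl, 0.5)
    ```

    With 3, 5, 7 or 9 control points the genome length changes accordingly, which is exactly the design choice whose effect on the genetic algorithm's outcome the study measures.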

  18. Solving Bilevel Multiobjective Programming Problem by Elite Quantum Behaved Particle Swarm Optimization

    Directory of Open Access Journals (Sweden)

    Tao Zhang

    2012-01-01

    Full Text Available An elite quantum-behaved particle swarm optimization (EQPSO) algorithm is proposed, in which an elite strategy is applied to the global best particle to prevent premature convergence of the swarm. The EQPSO algorithm is employed in this study for solving the bilevel multiobjective programming problem (BLMPP), which has not previously been reported in the literature. Finally, we use eight different test problems to evaluate the proposed algorithm, including low-dimensional and high-dimensional BLMPPs, as well as BLMPPs whose theoretical Pareto optimal front is not known. The experimental results show that the proposed algorithm is a feasible and efficient method for solving BLMPPs.

  19. [A program for optimizing the use of antimicrobials (PROA): experience in a regional hospital].

    Science.gov (United States)

    Ugalde-Espiñeira, J; Bilbao-Aguirregomezcorta, J; Sanjuan-López, A Z; Floristán-Imízcoz, C; Elorduy-Otazua, L; Viciola-García, M

    2016-08-01

    Programs for optimizing the use of antibiotics (PROA), or antimicrobial stewardship programs, are multidisciplinary programs developed in response to the increase in antibiotic-resistant bacteria, the objective of which is to improve clinical results, minimize adverse events and reduce the costs associated with the use of antimicrobials. The implementation of a PROA program in a 128-bed general hospital and the results obtained at 6 months are reported here. A quasi-experimental intervention study with a historical control group was designed with the objective of assessing the impact of a PROA program with a non-restrictive, prescription-support intervention model involving direct and bidirectional intervention. The basis of the program is an audit of antimicrobial use with personalized, non-imposed recommendations, together with the use of information technologies applied to this setting. The impact on pharmaceutical consumption and costs, cost per process, mean hospital stay and percentage of hospital readmissions is described. A total of 307 audits were performed. In 65.8% of cases, treatment was discontinued between the 7th and the 10th day. The main reasons for treatment discontinuation were completion of treatment (43.6%) and lack of indication (14.7%). The reduction in pharmaceutical expenditure was 8.59% (P = 0.049), and consumption fell by 5.61% in DDD/100 stays (P = 0.180). The cost per process in general surgery showed a 3.14% decrease (P = 0.000). The results obtained support the efficiency of these programs in small hospitals with limited resources.

  20. Factors that influence medical student selection of an emergency medicine residency program: implications for training programs.

    Science.gov (United States)

    Love, Jeffrey N; Howell, John M; Hegarty, Cullen B; McLaughlin, Steven A; Coates, Wendy C; Hopson, Laura R; Hern, Gene H; Rosen, Carlo L; Fisher, Jonathan; Santen, Sally A

    2012-04-01

    An understanding of student decision-making when selecting an emergency medicine (EM) training program is essential for program directors as they enter interview season. To build upon preexisting knowledge, a survey was created to identify and prioritize the factors influencing candidate decision-making of U.S. medical graduates. This was a cross-sectional, multi-institutional study that anonymously surveyed U.S. allopathic applicants to EM training programs. It took place in the 3-week period between the 2011 National Residency Matching Program (NRMP) rank list submission deadline and the announcement of match results. Of 1,525 invitations to participate, 870 candidates (57%) completed the survey. Overall, 96% of respondents stated that both geographic location and individual program characteristics were important to decision-making, with approximately equal numbers favoring location when compared to those who favored program characteristics. The most important factors in this regard were preference for a particular geographic location (74.9%, 95% confidence interval [CI] = 72% to 78%) and to be close to spouse, significant other, or family (59.7%, 95% CI = 56% to 63%). Factors pertaining to geographic location tend to be out of the control of the program leadership. The most important program factors include the interview experience (48.9%, 95% CI = 46% to 52%), personal experience with the residents (48.5%, 95% CI = 45% to 52%), and academic reputation (44.9%, 95% CI = 42% to 48%). Unlike location, individual program factors are often either directly or somewhat under the control of the program leadership. Several other factors were ranked as the most important factor a disproportionate number of times, including a rotation in that emergency department (ED), orientation (academic vs. community), and duration of training (3-year vs. 4-year programs). For a subset of applicants, these factors had particular importance in overall decision-making. 
The vast majority