Directory of Open Access Journals (Sweden)
Christopher Expósito-Izquierdo
2017-02-01
Full Text Available This paper summarizes the main contributions of the Ph.D. thesis of Christopher Expósito-Izquierdo. The thesis develops a wide set of intelligent heuristic and meta-heuristic algorithms aimed at solving some of the most prominent optimization problems associated with the transshipment and storage of containers at conventional maritime container terminals. Under the premise that no optimization technique can outperform every other technique under all possible assumptions, the main point of interest in the domain of maritime logistics is to propose optimization techniques that are superior, in terms of effectiveness and computational efficiency, to previous proposals in the scientific literature when solving individual optimization problems under realistic scenarios. At the same time, these optimization techniques should be competitive enough to be potentially implemented in practice.
DEFF Research Database (Denmark)
Ding, Yi; Goel, Lalit; Wang, Peng
2012-01-01
the required level of supply reliability to its customers. In previous research, Genetic Algorithm (GA) has been used to solve most reliability optimization problems. However, the GA is not very computationally efficient in some cases. In this chapter a new heuristic optimization technique—the particle swarm...
Directory of Open Access Journals (Sweden)
Jeevanandham Arumugam
2009-01-01
In this paper a classical lead-lag power system stabilizer is used for demonstration. The stabilizer parameters are selected so as to damp the rotor oscillations. The problem of selecting the stabilizer parameters is converted into a simple optimization problem with an eigenvalue-based objective function, and simulated annealing and particle swarm optimization are employed to solve it. The objective function allows the selection of stabilizer parameters that optimally place the closed-loop eigenvalues in the left half of the complex s-plane. A single machine connected to an infinite bus and a 10-machine, 39-bus system are considered in this study. The effectiveness of the tuned stabilizer in enhancing power system stability is confirmed through eigenvalue analysis and simulation results, and the most suitable heuristic technique is selected for the best performance of the system.
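As a rough illustration of the eigenvalue-based tuning idea (not the authors' implementation), the sketch below uses simulated annealing to pick two stabilizer parameters that push the rightmost closed-loop eigenvalue leftward in the s-plane. The 2x2 state matrix and all numerical values are hypothetical stand-ins for the linearized machine model.

```python
import numpy as np

rng = np.random.default_rng(0)

def closed_loop_matrix(k_gain, t_lead):
    # Hypothetical 2x2 closed-loop state matrix; a stand-in for the
    # linearized single-machine-infinite-bus model.
    return np.array([[0.0, 1.0],
                     [-8.0 - k_gain, -0.5 - k_gain * t_lead]])

def objective(params):
    # Damping objective: minimize the real part of the rightmost eigenvalue.
    eig = np.linalg.eigvals(closed_loop_matrix(*params))
    return max(eig.real)

def simulated_annealing(x0, steps=2000, temp0=1.0):
    x, fx = np.array(x0, float), objective(x0)
    best, fbest = x.copy(), fx
    for i in range(steps):
        temp = temp0 * (1 - i / steps) + 1e-9       # linear cooling schedule
        cand = np.clip(x + rng.normal(scale=0.1, size=x.size), 0.0, 50.0)
        fc = objective(cand)
        # Accept improvements always; accept worsening moves with
        # Boltzmann probability exp(-delta / temp).
        if fc < fx or rng.random() < np.exp((fx - fc) / temp):
            x, fx = cand, fc
        if fx < fbest:
            best, fbest = x.copy(), fx
    return best, fbest

best, fbest = simulated_annealing([1.0, 0.1])
```

The same objective function could equally be handed to a particle swarm optimizer, which is the comparison the paper performs.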
A novel hybrid meta-heuristic technique applied to the well-known benchmark optimization problems
Abtahi, Amir-Reza; Bijari, Afsane
2017-09-01
In this paper, a hybrid meta-heuristic algorithm based on the imperialist competitive algorithm (ICA), harmony search (HS), and simulated annealing (SA) is presented. The body of the proposed hybrid algorithm is based on ICA. The hybrid inherits the harmony-creation process of HS to improve the exploitation phase of ICA, and it uses SA to balance the exploration and exploitation phases. The proposed hybrid algorithm is compared with several meta-heuristic methods, including the genetic algorithm (GA), HS, and ICA, on several well-known benchmark instances. Comprehensive experiments and statistical analysis on standard benchmark functions certify the superiority of the proposed method over the other algorithms. The efficacy of the proposed hybrid algorithm is promising, and it can be applied to several real-life engineering and management problems.
Investigations of quantum heuristics for optimization
Rieffel, Eleanor; Hadfield, Stuart; Jiang, Zhang; Mandra, Salvatore; Venturelli, Davide; Wang, Zhihui
We explore the design of quantum heuristics for optimization, focusing on the quantum approximate optimization algorithm, a metaheuristic developed by Farhi, Goldstone, and Gutmann. We develop specific instantiations of the quantum approximate optimization algorithm for a variety of challenging combinatorial optimization problems. Through theoretical analyses and numerical investigations of select problems, we provide insight into parameter setting and Hamiltonian design for quantum approximate optimization algorithms and related quantum heuristics, and into their implementation on hardware realizable in the near term.
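The following minimal sketch simulates a depth-1 quantum approximate optimization circuit for MaxCut on a triangle with a plain statevector, setting the two parameters by grid search. The graph and the grid resolution are illustrative choices, not taken from the paper.

```python
import numpy as np

# MaxCut on a triangle: 3 qubits, an edge between every pair of vertices.
n = 3
edges = [(0, 1), (1, 2), (0, 2)]
# Diagonal cost operator: number of cut edges for each of the 2^n bitstrings.
cut = np.array([sum((z >> i & 1) != (z >> j & 1) for i, j in edges)
                for z in range(2 ** n)], dtype=float)

def apply_rx(psi, beta, qubit):
    """Apply the mixer exp(-i*beta*X) to one qubit of the statevector."""
    rx = np.array([[np.cos(beta), -1j * np.sin(beta)],
                   [-1j * np.sin(beta), np.cos(beta)]])
    psi = np.moveaxis(psi.reshape([2] * n), qubit, 0)
    psi = np.tensordot(rx, psi, axes=1)
    return np.moveaxis(psi, 0, qubit).reshape(-1)

def qaoa_expectation(gamma, beta):
    psi = np.full(2 ** n, 1 / np.sqrt(2 ** n), dtype=complex)  # |+>^n
    psi = psi * np.exp(-1j * gamma * cut)       # phase-separation (cost) layer
    for q in range(n):                          # transverse-field mixer layer
        psi = apply_rx(psi, beta, q)
    return float(np.abs(psi) ** 2 @ cut)        # expected number of cut edges

# Crude depth-1 parameter setting by grid search.
grid = np.linspace(0, np.pi, 40)
best = max(qaoa_expectation(g, b) for g in grid for b in grid)
```

At zero angles the state stays uniform and the expectation equals the average cut (1.5 here); good angles push it toward the maximum cut of 2, illustrating why parameter setting matters.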
Comparison of Heuristics for Inhibitory Rule Optimization
Alsolami, Fawaz
2014-09-13
Knowledge representation and extraction are very important tasks in data mining. In this work, we propose a variety of rule-based greedy algorithms that are able to express the knowledge contained in a given dataset as a series of inhibitory rules, each containing an expression "attribute ≠ value" on the right-hand side. The main goal of this paper is to determine, based on the rule characteristics of length and coverage, whether the proposed rule heuristics are statistically significantly different; if so, we aim to identify the best-performing heuristics for minimizing rule length and maximizing rule coverage. The Friedman test with the Nemenyi post-hoc test is used to compare the greedy algorithms statistically against each other for length and coverage. The experiments are carried out on real datasets from the UCI Machine Learning Repository. For the leading heuristics, the constructed rules are compared with optimal ones obtained by a dynamic programming approach. The results seem to be promising for the best heuristics: the average relative difference between the length (coverage) of constructed and optimal rules is at most 2.27% (7%, respectively). Furthermore, the quality of classifiers based on sets of inhibitory rules constructed by the considered heuristics is compared, and the results show that the three best heuristics from the point of view of classification accuracy coincide with the three best-performing heuristics from the point of view of rule length minimization.
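A minimal sketch of one plausible greedy heuristic of this family (the exact selection criteria in the paper may differ): conditions taken from a target row are added one at a time, each chosen to eliminate as many covered rows with the forbidden decision as possible, until the rule "IF conditions THEN class ≠ forbidden" holds on the dataset. The toy dataset is invented.

```python
def greedy_inhibitory_rule(rows, target, forbidden, attrs, decision="class"):
    """Greedily build an inhibitory rule for `target` whose right-hand side
    is 'class != forbidden'. Assumes `target[decision] != forbidden`."""
    conds = {}
    covered = list(rows)
    while any(r[decision] == forbidden for r in covered):
        def eliminated(a):
            # How many still-covered rows with the forbidden decision would
            # adding the condition a == target[a] exclude?
            return sum(1 for r in covered
                       if r[decision] == forbidden and r[a] != target[a])
        a = max((a for a in attrs if a not in conds), key=eliminated)
        conds[a] = target[a]
        covered = [r for r in covered if r[a] == target[a]]
    return conds   # read as: IF conds THEN class != forbidden

rows = [{"a": 1, "b": 1, "class": 0},
        {"a": 1, "b": 2, "class": 1},
        {"a": 2, "b": 1, "class": 1}]
rule = greedy_inhibitory_rule(rows, rows[0], forbidden=1, attrs=["a", "b"])
```

Rule length is the number of conditions in `conds`; coverage is the number of rows the left-hand side matches, the two characteristics the statistical comparison is based on.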
Directory of Open Access Journals (Sweden)
Mohammad Dreidy
2017-01-01
Recently, environmental problems have begun to affect all aspects of life. For this reason, many governments and international agencies have expressed great interest in using more renewable energy sources (RESs). However, integrating more RESs into distribution networks has resulted in several critical problems regarding frequency stability, which might lead to a complete blackout if not properly treated. Therefore, this paper proposes a new Under Frequency Load Shedding (UFLS) scheme for islanded distribution networks. The scheme uses three meta-heuristic techniques, binary evolutionary programming (BEP), the binary genetic algorithm (BGA), and binary particle swarm optimization (BPSO), to determine the optimal combination of loads that need to be shed from the islanded distribution network. Compared with existing UFLS schemes using fixed-priority loads, the proposed scheme is able to restore the network frequency without any overshoot. Furthermore, in terms of execution time, the simulation results show that the BEP technique is fast enough to shed the optimal combination of loads compared with the BGA and BPSO techniques.
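To illustrate the flavor of the binary-PSO variant (load sizes, shed target, swarm parameters, and the penalty weight are all invented for this example), the sketch below searches for a subset of loads whose total shed power meets a target, penalizing under-shedding more heavily since the frequency must recover.

```python
import numpy as np

rng = np.random.default_rng(1)

loads = np.array([1.2, 0.8, 2.5, 0.6, 1.9, 0.4])   # MW, illustrative values
target = 3.1                                        # MW that must be shed

def deficit(bits):
    shed = float(loads @ bits)
    # Under-shedding is penalized ten times harder than over-shedding,
    # because the frequency must be arrested first.
    return (target - shed) * 10 if shed < target else shed - target

def binary_pso(n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    dim = loads.size
    x = rng.integers(0, 2, (n_particles, dim)).astype(float)
    v = rng.normal(0, 1, (n_particles, dim))
    pbest, pcost = x.copy(), np.array([deficit(p) for p in x])
    g = pbest[pcost.argmin()].copy()
    for _ in range(iters):
        v = (w * v + c1 * rng.random((n_particles, dim)) * (pbest - x)
                   + c2 * rng.random((n_particles, dim)) * (g - x))
        # Binary PSO: velocities feed a sigmoid that gives bit probabilities.
        x = (rng.random((n_particles, dim)) < 1 / (1 + np.exp(-v))).astype(float)
        cost = np.array([deficit(p) for p in x])
        better = cost < pcost
        pbest[better], pcost[better] = x[better], cost[better]
        g = pbest[pcost.argmin()].copy()
    return g, deficit(g)

best, cost = binary_pso()
```

A binary evolutionary-programming or binary-GA variant would differ only in how new bit vectors are generated; the fitness function stays the same.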
Heuristic Kalman algorithm for solving optimization problems.
Toscano, Rosario; Lyonnet, Patrick
2009-10-01
The main objective of this paper is to present a new optimization approach, which we call the heuristic Kalman algorithm (HKA). We propose it as a viable approach for solving continuous nonconvex optimization problems. The principle of the approach is to consider the optimization problem explicitly as a measurement process designed to produce an estimate of the optimum. A specific procedure, based on the Kalman method, was developed to improve the quality of the estimate obtained through the measurement process. The efficiency of HKA is evaluated in detail through several nonconvex test problems, both in the unconstrained and constrained cases. The results are then compared to those obtained via other metaheuristics. These numerical experiments show that HKA has considerable potential for solving nonconvex optimization problems, notably with respect to computation time and success ratio.
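A minimal sketch of the HKA idea, with a damping coefficient and all parameter values chosen for illustration rather than taken from the paper: a Gaussian generator proposes candidates, the best ones act as a "measurement" of the optimum, and a Kalman-style gain updates the generator's mean and spread.

```python
import numpy as np

rng = np.random.default_rng(0)

def rastrigin(x):
    # A standard nonconvex test function (global minimum 0 at the origin).
    return 10 * x.size + float(np.sum(x**2 - 10 * np.cos(2 * np.pi * x)))

def hka(f, dim=2, n=50, n_best=10, slow=0.5, iters=200):
    m = rng.uniform(-5, 5, dim)        # mean of the Gaussian generator
    s = np.full(dim, 5.0)              # its standard deviation
    best_x, best_f = m.copy(), f(m)
    for _ in range(iters):
        pts = rng.normal(m, s, (n, dim))           # "measurement" samples
        vals = np.array([f(p) for p in pts])
        i = int(vals.argmin())
        if vals[i] < best_f:
            best_x, best_f = pts[i].copy(), float(vals[i])
        elite = pts[np.argsort(vals)[:n_best]]     # best candidates
        meas, var = elite.mean(0), elite.var(0) + 1e-12
        gain = slow * s**2 / (s**2 + var)          # damped Kalman-style gain
        m = m + gain * (meas - m)                  # update generator mean
        s = np.sqrt(1 - gain) * s                  # shrink generator spread
    return best_x, best_f

xb, fb = hka(rastrigin)
```

The `slow` factor plays the role of a slowdown coefficient: without it the spread collapses after a handful of iterations and the search becomes purely local too early.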
DEFF Research Database (Denmark)
Vlachogiannis, Ioannis (John); Lee, KY
2009-01-01
In this paper an improved coordinated aggregation-based particle swarm optimization (ICA-PSO) algorithm is introduced for solving the optimal economic load dispatch (ELD) problem in power systems. In the ICA-PSO algorithm each particle in the swarm retains a memory of its best position ever...... encountered, and is attracted only by other particles with better achievements than its own, with the exception of the particle with the best achievement, which moves randomly. Moreover, the population size is increased adaptively, the number of search intervals for the particles is selected adaptively...... and the particles search the decision space with an accuracy of up to two decimal digits, resulting in the improved convergence of the process. The ICA-PSO algorithm is tested on a number of power systems, including the systems with 6, 13, 15, and 40 generating units, the island power system of Crete in Greece...
Superiorization: an optimization heuristic for medical physics.
Herman, Gabor T; Garduno, Edgar; Davidi, Ran; Censor, Yair
2012-09-01
To describe and mathematically validate the superiorization methodology, which is a recently developed heuristic approach to optimization, and to discuss its applicability to medical physics problem formulations that specify the desired solution (of physically given or otherwise obtained constraints) by an optimization criterion. The superiorization methodology is presented as a heuristic solver for a large class of constrained optimization problems. The constraints come from the desire to produce a solution that is constraints-compatible, in the sense of meeting the requirements provided by physically or otherwise obtained constraints. The underlying idea is that many iterative algorithms for finding such a solution are perturbation resilient in the sense that, even if certain kinds of changes are made at the end of each iterative step, the algorithm still produces a constraints-compatible solution. This property is exploited by using permitted changes to steer the algorithm to a solution that is not only constraints-compatible, but is also desirable according to a specified optimization criterion. The approach is very general: it is applicable to many iterative procedures and optimization criteria used in medical physics. The main practical contribution is a procedure for automatically producing, from any given iterative algorithm, its superiorized version, which will supply solutions that are superior according to a given optimization criterion. It is shown that if the original iterative algorithm satisfies certain mathematical conditions, then the output of its superiorized version is guaranteed to be as constraints-compatible as the output of the original algorithm, but superior to the latter according to the optimization criterion. This intuitive description is made precise in the paper and the stated claims are rigorously proved. Superiorization is illustrated on simulated computerized tomography data of a head cross section and, in spite of its generality
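A toy instance of the superiorization pattern (the constraints, criterion, and step sizes are illustrative, not from the paper): the basic algorithm is a sweep of orthogonal projections onto two half-spaces, and the superiorized version interleaves shrinking perturbations that reduce the Euclidean norm of the iterate while the feasibility-seeking steps keep it constraints-compatible.

```python
import numpy as np

# Feasibility: two half-space constraints  a_i . x <= b_i  in R^2.
A = np.array([[1.0, 1.0],
              [-1.0, 2.0]])
b = np.array([4.0, 2.0])

def project(x, a, bi):
    """Orthogonal projection onto the half-space a . x <= bi."""
    v = a @ x - bi
    return x if v <= 0 else x - v * a / (a @ a)

def basic_step(x):
    # One sweep of sequential projections: a perturbation-resilient
    # feasibility-seeking algorithm.
    for a, bi in zip(A, b):
        x = project(x, a, bi)
    return x

def superiorize(x0, iters=60):
    x = np.asarray(x0, float)
    beta = 1.0
    for _ in range(iters):
        g = x / (np.linalg.norm(x) + 1e-12)   # subgradient of criterion ||x||
        x = basic_step(x - beta * g)          # perturb, then feasibility step
        beta *= 0.9                           # summable perturbation sizes
    return x

x_feas = basic_step(np.array([5.0, 5.0]))   # feasibility-seeking alone
x_sup = superiorize([5.0, 5.0])             # superiorized version
```

Both runs end constraints-compatible, but the superiorized iterate has a markedly smaller norm, which is exactly the "superior, not necessarily optimal" behavior the methodology promises.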
Deterministic oscillatory search: a new meta-heuristic optimization ...
Indian Academy of Sciences (India)
heuristic optimization; power system problem. Abstract. The paper proposes a new optimization algorithm that is extremely robust in solving mathematical and engineering problems. The algorithm combines the deterministic nature of classical ...
Proximity search heuristics for wind farm optimal layout
DEFF Research Database (Denmark)
Fischetti, Martina; Monaci, Michele
2016-01-01
A heuristic framework for turbine layout optimization in a wind farm is proposed that combines ad-hoc heuristics and mixed-integer linear programming. In our framework, large-scale mixed-integer programming models are used to iteratively refine the current best solution according to the recently...
HEURISTIC OPTIMIZATION AND ALGORITHM TUNING APPLIED TO SORPTIVE BARRIER DESIGN
While heuristic optimization is applied in environmental applications, ad-hoc algorithm configuration is typical. We use a multi-layer sorptive barrier design problem as a benchmark for an algorithm-tuning procedure, as applied to three heuristics (genetic algorithms, simulated ...
It looks easy! Heuristics for combinatorial optimization problems.
Chronicle, Edward P; MacGregor, James N; Ormerod, Thomas C; Burr, Alistair
2006-04-01
Human performance on instances of computationally intractable optimization problems, such as the travelling salesperson problem (TSP), can be excellent. We have proposed a boundary-following heuristic to account for this finding. We report three experiments with TSPs where the capacity to employ this heuristic was varied. In Experiment 1, participants free to use the heuristic produced solutions significantly closer to optimal than did those prevented from doing so. Experiments 2 and 3 together replicated this finding in larger problems and demonstrated that a potential confound had no effect. In all three experiments, performance was closely matched by a boundary-following model. The results implicate global rather than purely local processes. Humans may have access to simple, perceptually based, heuristics that are suited to some combinatorial optimization tasks.
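A sketch of a boundary-first construction in the spirit of (though not identical to) the boundary-following account: start from the convex hull of the points, then insert each interior point where it lengthens the cycle least. The instance is a hand-picked toy.

```python
import numpy as np

def convex_hull(pts):
    """Andrew's monotone chain; returns hull vertex indices in boundary order."""
    idx = sorted(range(len(pts)), key=lambda i: pts[i])
    if len(idx) <= 2:
        return idx
    def cross(o, a, b):
        return ((pts[a][0] - pts[o][0]) * (pts[b][1] - pts[o][1])
              - (pts[a][1] - pts[o][1]) * (pts[b][0] - pts[o][0]))
    lower = []
    for i in idx:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], i) <= 0:
            lower.pop()
        lower.append(i)
    upper = []
    for i in reversed(idx):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], i) <= 0:
            upper.pop()
        upper.append(i)
    return lower[:-1] + upper[:-1]

def dist(pts, i, j):
    return float(np.hypot(pts[i][0] - pts[j][0], pts[i][1] - pts[j][1]))

def boundary_first_tour(pts):
    tour = convex_hull(pts)
    rest = [i for i in range(len(pts)) if i not in tour]
    while rest:
        # Cheapest insertion of an interior point into the current cycle.
        _, i, k = min((dist(pts, tour[k - 1], i) + dist(pts, i, tour[k])
                       - dist(pts, tour[k - 1], tour[k]), i, k)
                      for i in rest for k in range(len(tour)))
        tour.insert(k, i)
        rest.remove(i)
    return tour

def length(pts, tour):
    return sum(dist(pts, tour[k - 1], tour[k]) for k in range(len(tour)))

pts = [(0, 0), (0, 10), (10, 10), (10, 0), (5, 5), (2, 6)]
tour = boundary_first_tour(pts)
```

On this instance the hull (the four corners) fixes the global shape of the tour and the interior points are slotted in locally, mirroring the claim that boundary information drives good human solutions.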
Two Effective Heuristics for Beam Angle Optimization in Radiation Therapy
Yarmand, Hamed
2013-01-01
In radiation therapy, mathematical methods have been used for optimizing treatment planning for delivery of sufficient dose to the cancerous cells while keeping the dose to critical surrounding structures minimal. This optimization problem can be modeled using mixed integer programming (MIP) whose solution gives the optimal beam orientation as well as optimal beam intensity. The challenge, however, is the computation time for this large scale MIP. We propose and investigate two novel heuristic approaches to reduce the computation time considerably while attaining high-quality solutions. We introduce a family of heuristic cuts based on the concept of 'adjacent beams' and a beam elimination scheme based on the contribution of each beam to deliver the dose to the tumor in the ideal plan in which all potential beams can be used simultaneously. We show the effectiveness of these heuristics for intensity modulated radiation therapy (IMRT) and stereotactic body radiation therapy (SBRT) on a clinical liver case.
Heuristic possibilistic clustering for detecting optimal number of elements in fuzzy clusters
Directory of Open Access Journals (Sweden)
Viattchenin Dmitri A.
2016-03-01
The paper deals with the problem of discovering fuzzy clusters with an optimal number of elements in heuristic possibilistic clustering. The relational clustering procedure using a parameter that controls cluster sizes is considered, and a technique for detecting the optimal number of elements in fuzzy clusters is proposed. The effectiveness of the proposed technique is illustrated through numerical examples. Experimental results are discussed and some preliminary conclusions are formulated.
Optimizing Linear Functions with Randomized Search Heuristics - The Robustness of Mutation
DEFF Research Database (Denmark)
Witt, Carsten
2012-01-01
The analysis of randomized search heuristics on classes of functions is fundamental for the understanding of the underlying stochastic process and the development of suitable proof techniques. Recently, remarkable progress has been made in bounding the expected optimization time of the simple (1...
Tight Bounds on the Optimization Time of a Randomized Search Heuristic on Linear Functions
DEFF Research Database (Denmark)
Witt, Carsten
2013-01-01
The analysis of randomized search heuristics on classes of functions is fundamental to the understanding of the underlying stochastic process and the development of suitable proof techniques. Recently, remarkable progress has been made in bounding the expected optimization time of a simple...
Energy Technology Data Exchange (ETDEWEB)
Castillo M, J.A.; Ortiz S, J.J.; Perusquia C, R.; Montes T, J.L.; Hernandez M, J.L. [ININ, Carretera Mexico-Toluca s/n, 52750 La Marquesa, Ocoyoacac, Estado de Mexico (Mexico)]. e-mail: jacm@nuclear.inin.mx
2007-07-01
In this work, a comparison of the results obtained with different methodologies for the optimization of control rod pattern design for boiling water reactors is carried out. The results were obtained under the same conditions for all the methodologies, which belong to the field of combinatorial optimization and were programmed in FORTRAN 77 under UNIX on an ALPHA workstation. The techniques used to carry out the optimization are Genetic Algorithms, Scatter Search, Tabu Search, Ant Colonies, and Neural Networks. The objective function is the same in all cases; it includes the thermal limits MFLPD (maximum fraction of limiting power density at the end of the operation cycle), MPGR (maximum power generation rate at the end of the operation cycle), and MFLCPR (maximum fraction of limiting critical power ratio at the end of the operation cycle), as well as the criticality condition for the reactor, and it requires the axial power profile at each burnup step to fit a proposed axial power profile. The CM-PRESTO code (Scandpower) was used to evaluate the proposed designs. The criterion used for the comparison is essentially the k-eff at the end of the cycle, as well as the performance with respect to the thermal limits; the number of exchanges between shallow and deep positions and the total number of movements carried out during the complete cycle are also analyzed. (Author)
Multiobjective hyper heuristic scheme for system design and optimization
Rafique, Amer Farhan
2012-11-01
As system design becomes more and more multifaceted, integrated, and complex, the traditional single-objective approach to optimal design is becoming less efficient and effective. Single-objective optimization methods produce a unique optimal solution, whereas multiobjective methods produce a Pareto front. The foremost intent is to predict a reasonably distributed Pareto-optimal solution set, independent of the problem instance, through a multiobjective scheme. A further objective of the intended approach is to improve the worth of the outputs of the complex engineering system design process at the conceptual design phase. The process is automated in order to give the system designer the leverage of studying and analyzing a large number of possible solutions in a short time. This article presents a Multiobjective Hyper-Heuristic Optimization Scheme based on low-level meta-heuristics developed for application in engineering system design. Herein, we present a stochastic function to manage the low-level meta-heuristics in order to increase the likelihood of reaching a globally optimal solution. The Genetic Algorithm, Simulated Annealing, and Swarm Intelligence are used as low-level meta-heuristics in this study. The performance of the proposed scheme is investigated through a comprehensive empirical analysis yielding acceptable results. One of the primary motives for performing multiobjective optimization is that current engineering systems require the simultaneous optimization of multiple conflicting objectives. Random decision making makes the implementation of this scheme attractive and easy. Injecting feasible solutions significantly alters the search direction and also adds population diversity, resulting in the accomplishment of the pre-defined goals set in the proposed scheme.
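The Pareto-front notion underlying such multiobjective schemes can be stated compactly; the objective values below are invented for illustration (minimization assumed in both objectives):

```python
def dominates(a, b):
    """a dominates b when a is no worse in every objective and strictly
    better in at least one (minimization in all objectives)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Return the non-dominated subset, preserving input order."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

designs = [(1, 5), (2, 2), (5, 1), (3, 3), (4, 4)]   # e.g. (cost, mass)
front = pareto_front(designs)
```

A multiobjective optimizer aims to return a set like `front`: no member can be improved in one objective without worsening another, and the set should be spread evenly along the trade-off curve.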
Energy Technology Data Exchange (ETDEWEB)
Cruz Castrejon, J. A; Islas Perez, E; Espinosa Reza, A; Garcia Mendoza, R [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)]. E-mails: adrian.cruz@iie.org.mx; eislas@iie.org.mx; aer@iie.org.mx; rgarcia@iie.org.mx
2013-03-15
In this paper we present a proposed solution to the problem of searching for restoration alternatives for faults in radial distribution networks in electric power systems. The solution uses a deterministic method based on defined heuristics, whose main objectives are to improve execution time and solution quality. The search is based on the alternating repetition of two stages: one stage that attempts to restore the disconnected areas, and another that attempts to shed load in the overloaded areas.
Heuristics for NP-hard optimization problems - simpler is better!?
Directory of Open Access Journals (Sweden)
Žerovnik Janez
2015-11-01
We provide several examples showing that local search, the most basic metaheuristic, may be a very competitive choice for solving computationally hard optimization problems. In addition, the generation of starting solutions by greedy heuristics should at least be considered as one very natural possibility. In this critical survey, the selected examples discussed include the traveling salesman problem, resource-constrained project scheduling, channel assignment, and the computation of bounds for the Shannon capacity.
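The survey's point, greedy construction plus plain local search, can be sketched on the traveling salesman problem: a nearest-neighbor start followed by 2-opt moves, with random points as a stand-in instance. By construction the local search never returns a longer tour than its starting solution.

```python
import numpy as np

rng = np.random.default_rng(7)
pts = rng.random((30, 2))
D = np.linalg.norm(pts[:, None] - pts[None, :], axis=2)   # distance matrix

def tour_len(t):
    return float(sum(D[t[k - 1], t[k]] for k in range(len(t))))

def nearest_neighbor(start=0):
    """Greedy construction: always visit the closest unvisited city."""
    unvisited, tour = set(range(1, len(D))), [start]
    while unvisited:
        nxt = min(unvisited, key=lambda j: D[tour[-1], j])
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def two_opt(t):
    """Local search: reverse segments while that shortens the tour."""
    t, n = t[:], len(t)
    improved = True
    while improved:
        improved = False
        for i in range(1, n - 1):
            for j in range(i + 1, n):
                a, b = t[i - 1], t[i]
                c, d = t[j], t[(j + 1) % n]
                # Replacing edges (a,b) and (c,d) with (a,c) and (b,d)
                # is an improvement exactly when the sum of lengths drops.
                if D[a, c] + D[b, d] < D[a, b] + D[c, d] - 1e-12:
                    t[i:j + 1] = t[i:j + 1][::-1]
                    improved = True
    return t

greedy = nearest_neighbor()
improved = two_opt(greedy)
```

This pairing, a cheap greedy start refined by the simplest neighborhood, is precisely the baseline the survey argues should not be dismissed.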
DEFF Research Database (Denmark)
Vlachogiannis, Ioannis (John); Lee, K. Y.
2010-01-01
In this paper an improved coordinated aggregation-based particle swarm optimization algorithm is introduced for solving the optimal economic load dispatch problem in power systems. In the improved coordinated aggregation-based particle swarm optimization algorithm each particle in the swarm retains...... a memory of its best position ever encountered, and is attracted only by other particles with better achievements than its own with the exception of the particle with the best achievement, which moves randomly.The ICA-PSO algorithm is tested on a number of power systems, including the systems with 6, 13...
An Approach to Optical Network Design using General Heuristic Optimization Framework
Directory of Open Access Journals (Sweden)
Marko Lacković
2010-12-01
The article tackles the problem of optimization methods in the optical network design process, based on optimal traffic routing with the goal of minimizing the utilized network resources for a given topology and traffic demands. An optimization framework, Nyx, has been developed with a focus on flexibility in solving optimization problems by implementing general heuristic search techniques. Nyx's modular organization is described, including the coding types for solutions and a genetic algorithm as the optimization method. Optimal routing has been implemented to demonstrate the use of Nyx in the optical network design process. The optimal routing procedure has been applied to the Pan-European optical network with variations of the routing procedures and the number of wavelengths. The analysis included a no-protection scenario, 1+1 protection, and path restoration. The routing was performed using shortest-path routing and optimal routing, which minimizes the use of optical network resources, namely network multiplexers, amplifiers, and fibers.
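The shortest-path routing baseline mentioned above is classical Dijkstra; the toy topology and link costs below are invented (in a framework like Nyx the cost could encode fibre length or amplifier count):

```python
import heapq

def dijkstra(graph, src, dst):
    """Shortest path by total link cost; returns (cost, path)."""
    pq, seen = [(0.0, src, [src])], set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == dst:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(pq, (cost + w, nxt, path + [nxt]))
    return float("inf"), []

# Hypothetical 4-node topology with symmetric link costs.
graph = {
    "A": [("B", 1.0), ("C", 4.0)],
    "B": [("A", 1.0), ("C", 2.0), ("D", 5.0)],
    "C": [("A", 4.0), ("B", 2.0), ("D", 1.0)],
    "D": [("B", 5.0), ("C", 1.0)],
}
cost, path = dijkstra(graph, "A", "D")
```

Optimal routing in the article goes further than this per-demand baseline: it trades off all demands jointly, which is where the heuristic search framework earns its keep.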
Heuristic Optimization for the Discrete Virtual Power Plant Dispatch Problem
DEFF Research Database (Denmark)
Petersen, Mette Kirschmeyer; Hansen, Lars Henrik; Bendtsen, Jan Dimon
2014-01-01
First, NP-completeness of the Discrete Virtual Power Plant Dispatch Problem is proved formally. We then proceed to develop tailored versions of the meta-heuristic algorithms Hill Climber and Greedy Randomized Adaptive Search Procedure (GRASP). The algorithms are tuned and tested on portfolios of varying sizes. We find that all the tailored algorithms perform satisfactorily in the sense that they are able to find sub-optimal, but usable, solutions to very large problems (on the order of 10^5 units) at computation times on the scale of just 10 seconds, which is far beyond the capabilities of the optimal algorithms we have tested. In particular, GRASP Sorted shows the most promising performance, as it is able to find solutions that are both agile (sorted) and well balanced, and consistently yields the best numerical performance among the developed algorithms.
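A minimal GRASP sketch on a stand-in balancing problem (partitioning unit capacities into two equally loaded halves; the capacities are invented): a randomized greedy construction followed by a move-and-swap local search, with the best solution kept across restarts.

```python
import random

random.seed(3)
units = [4, 5, 6, 7, 8]   # illustrative unit capacities

def imbalance(group_a):
    sa = sum(group_a)
    return abs(sa - (sum(units) - sa))

def construct(p_greedy=0.7):
    """Randomized greedy: place each unit (largest first) on the lighter
    side with probability p_greedy, otherwise on the heavier side."""
    a, b = [], []
    for u in sorted(units, reverse=True):
        lighter, heavier = sorted((a, b), key=sum)
        (lighter if random.random() < p_greedy else heavier).append(u)
    return a

def local_search(a):
    """Best-improvement moves and swaps between the two groups."""
    a = list(a)
    b = [u for u in units if u not in a]
    while True:
        best, move = imbalance(a), None
        for u in a:                       # try moving u from a to b
            c = imbalance([x for x in a if x != u])
            if c < best:
                best, move = c, ("mv_ab", u)
        for u in b:                       # try moving u from b to a
            c = imbalance(a + [u])
            if c < best:
                best, move = c, ("mv_ba", u)
        for u in a:                       # try swapping u and v
            for v in b:
                c = imbalance([x for x in a if x != u] + [v])
                if c < best:
                    best, move = c, ("swap", (u, v))
        if move is None:
            return a
        kind, arg = move
        if kind == "mv_ab":
            a.remove(arg); b.append(arg)
        elif kind == "mv_ba":
            b.remove(arg); a.append(arg)
        else:
            u, v = arg
            a.remove(u); b.remove(v); a.append(v); b.append(u)

def grasp(iters=20):
    best_a, best_val = None, float("inf")
    for _ in range(iters):
        a = local_search(construct())
        if imbalance(a) < best_val:
            best_a, best_val = a, imbalance(a)
    return best_a, best_val

best_a, best_val = grasp()
```

The restart loop is what distinguishes GRASP from plain greedy-plus-local-search: each restart samples a different greedy trajectory, so the search escapes constructions that local search alone cannot repair.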
Implementing and Bounding a Cascade Heuristic for Large-Scale Optimization
2017-06-01
Master's thesis: "Implementing and Bounding a Cascade Heuristic for Large-Scale Optimization," by Katherine H. Guthrie, June 2017. Thesis Advisor: Robert F. Dell. Approved for public release; distribution is unlimited. Abstract: A cascade heuristic appeals when we are faced with a
Directory of Open Access Journals (Sweden)
Ziauddin Ursani
2016-01-01
In this paper, complexity-curtailing techniques are introduced to create faster versions of insertion heuristics, namely the cheapest insertion heuristic (CIH) and the largest insertion heuristic (LIH), effectively reducing their complexities from O(n³) to O(n²) with no significant effect on solution quality. This paper also examines the relatively little-known heuristic concept of max difference and shows that it can be developed into a full-fledged max difference insertion heuristic (MDIH) by defining its missing steps. The paper further extends the complexity-curtailing techniques to MDIH to create a faster version. The resulting heuristic, the fast max difference insertion heuristic (FMDIH), outperforms the farthest insertion heuristic (FIH) across a wide spectrum of popular datasets with statistical significance, even though both heuristics have the same worst-case complexity of O(n²). Note that FIH is considered the best among lowest-order-complexity heuristics. The complexity-curtailing techniques presented here open up a new area of research for possible extension to other heuristics.
Heuristic optimization of the scanning path of particle therapy beams
Energy Technology Data Exchange (ETDEWEB)
Pardo, J.; Donetti, M.; Bourhaleb, F.; Ansarinejad, A.; Attili, A.; Cirio, R.; Garella, M. A.; Giordanengo, S.; Givehchi, N.; La Rosa, A.; Marchetto, F.; Monaco, V.; Pecka, A.; Peroni, C.; Russo, G.; Sacchi, R. [Istituto Nazionale di Fisica Nucleare, Sezione di Torino, Via P. Giuria 1, I-10125 Torino (Italy); Dipartimento di Fisica Sperimentale, Universita di Torino, Via P. Giuria 1, I-10125 Torino (Italy); Fondazione CNAO, Via Caminadella 16, I-20123 Milano (Italy)]
2009-06-15
Quasidiscrete scanning is a delivery strategy for proton and ion beam therapy in which the beam is turned off when a slice is finished and a new energy must be set, but not during the scanning between consecutive spots. Different scanning paths lead to different dose distributions due to the contribution of the unintended transit dose between spots. In this work an algorithm to optimize the scanning path for quasidiscrete scanned beams is presented. The classical simulated annealing algorithm is used: a heuristic algorithm frequently applied to combinatorial optimization problems, which allows nearly optimal solutions to be obtained in acceptable running times. A study of the best choice of the operational parameters on which the algorithm's performance depends is presented. The convergence properties of the algorithm have been further improved by using the nearest-neighbor algorithm to generate the starting paths. Scanning paths for two clinical treatments have been optimized. The optimized paths are found to be shorter than the back-and-forth, top-to-bottom (zigzag) paths generally provided by treatment planning systems. The gamma method has been applied to quantify the improvement achieved in the dose distribution. Results show a reduction of the transit dose when the optimized paths are used. The benefit is especially clear when the fluence per spot is low, as in the case of repainting. The minimization of the transit dose can potentially allow the use of higher beam intensities, thus decreasing the treatment time. The algorithm implemented for this work can efficiently optimize the scanning path of quasidiscrete scanned particle beams. Optimized scanning paths decrease the transit dose and lead to better dose distributions.
Using Heuristic Algorithms to Optimize Observing Target Sequences
Sosnowska, D.; Ouadahi, A.; Buchschacher, N.; Weber, L.; Pepe, F.
2014-05-01
The preparation of observations is normally carried out at the telescope by the visiting observer. In order to help the observer, we propose several algorithms to automatically optimize the sequence of targets. The optimization consists of ensuring that all the chosen targets are observable within the given time interval, and of finding their best execution order in terms of observation quality and the shortest telescope displacement time. Since an exhaustive search is too expensive in time, we investigated heuristic algorithms, specifically Min-Conflict, Non-Sorting Genetic Algorithms and Simulated Annealing. Multiple metaheuristics are used in parallel to swiftly give an approximation of the best solution, with all the constraints satisfied and the total execution time minimized. The optimization process has a duration on the order of tens of seconds, allowing quick re-adaptation in case of changing atmospheric conditions. The graphical user interface allows the user to control the parameters of the optimization process, so the search can be adjusted in real time. The module was coded in a way that easily allows the addition of new constraints, thus ensuring its compatibility with different instruments. For now, the application runs as a plug-in to the observation preparation tool called New Short Term Scheduler, which is used on three spectrographs dedicated to the exoplanet search: HARPS at the La Silla observatory, HARPS North at the La Palma observatory and SOPHIE at the Observatoire de Haute-Provence.
Optimal Control of Complex Systems Based on Improved Dual Heuristic Dynamic Programming Algorithm
Directory of Open Access Journals (Sweden)
Hui Li
2017-01-01
Full Text Available When applied to solving the data modeling and optimal control problems of complex systems, the dual heuristic dynamic programming (DHP) technique based on the BP neural network algorithm (BP-DHP) suffers from limited prediction accuracy, slow convergence, poor stability, and so forth. In this paper, a DHP technique based on the Extreme Learning Machine (ELM) algorithm (ELM-DHP) is proposed. By constructing three kinds of network structures, the paper gives the detailed realization process of the DHP technique with the ELM. The controller designed upon the ELM-DHP algorithm controlled a molecular distillation system with complex features, such as multivariability, strong coupling, and nonlinearity. Finally, the effectiveness of the algorithm is verified by a simulation comparing DHP and HDP algorithms based on the ELM and BP neural networks. The algorithm can also be applied to solve the data modeling and optimal control problems of similar complex systems.
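A minimal sketch of the Extreme Learning Machine idea at the heart of ELM-DHP: the hidden layer is drawn at random and left fixed, and only the output weights are solved for in closed form. The network size, Gaussian weight initialization, and sigmoid activation are illustrative assumptions, not taken from the paper.

```python
import numpy as np

class ELM:
    """Minimal Extreme Learning Machine regressor: the hidden layer is
    random and fixed; only the output weights are trained, in closed form."""
    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden = n_hidden
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        # Sigmoid activations of the fixed random hidden layer
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = self._hidden(X)
        # Output weights via the Moore-Penrose pseudoinverse: beta = H^+ y
        self.beta = np.linalg.pinv(H) @ y
        return self

    def predict(self, X):
        return self._hidden(X) @ self.beta
```

Because training reduces to one least-squares solve, there is no iterative backpropagation, which is the source of the speed advantage over BP-based networks.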
Directory of Open Access Journals (Sweden)
Ricardo Faia
2017-06-01
Full Text Available The deregulation of the electricity sector has culminated in the introduction of competitive markets. In addition, the emergence of new forms of electric energy production, namely renewable energy production, has brought additional changes in electricity market operation. Renewable energy has significant advantages, but at the cost of an intermittent character. The generation variability adds new challenges for negotiating players, as they have to deal with a new level of uncertainty. Decision support tools that assist players in their negotiations are therefore crucial. Artificial intelligence techniques play an important role in this decision support, as they can provide valuable results in rather small execution times, namely regarding the problem of optimizing the electricity market participation portfolio. This paper proposes a heuristic method that provides an initial solution allowing metaheuristic techniques to improve their results through a good initialization of the optimization process. Results show that by using the proposed heuristic, multiple metaheuristic optimization methods are able to improve their solutions in a faster execution time, thus providing a valuable contribution to players' support in energy market negotiations.
A new iterative heuristic to solve the joint replenishment problem using a spreadsheet technique
Nilsson, A.; Segerstedt, A.; van der Sluis, E.
2007-01-01
In this paper, a heuristic method is presented which gives a novel approach to solve joint replenishment problems (JRP) with strict cycle policies. The heuristic solves the JRP in an iterative procedure and is based on a spreadsheet technique. The principle of the recursion procedure is to find a
Theory of Randomized Search Heuristics in Combinatorial Optimization
DEFF Research Database (Denmark)
The rigorous mathematical analysis of randomized search heuristics (RSHs) with respect to their expected runtime is a growing research area where many results have been obtained in recent years. This class of heuristics includes well-known approaches such as Randomized Local Search (RLS) and evolutionary algorithms, which are typically applied when there are not enough resources such as time, money, or knowledge to obtain good specific algorithms. It is widely acknowledged that a solid mathematical foundation for such heuristics is needed. Most designers of RSHs, however, focused on mimicking processes in nature (such as evolution) rather than on making the heuristics amenable to a mathematical analysis. This is different from the classical design of (randomized) algorithms, which are developed with their theoretical analysis of runtime (and proof of correctness) in mind. Despite these obstacles, research from roughly the last 15 years has shown how to apply the methods for the probabilistic analysis of algorithms to RSHs.
Optimization techniques in statistics
Rustagi, Jagdish S
1994-01-01
Statistics help guide us to optimal decisions under uncertainty. A large variety of statistical problems are essentially solutions to optimization problems. The mathematical techniques of optimization are fundamental to statistical theory and practice. In this book, Jagdish Rustagi provides full-spectrum coverage of these methods, ranging from classical optimization and Lagrange multipliers, to numerical techniques using gradients or direct search, to linear, nonlinear, and dynamic programming using the Kuhn-Tucker conditions or the Pontryagin maximal principle. Variational methods and optimiza
Sabar, Nasser R; Ayob, Masri; Kendall, Graham; Qu, Rong
2015-02-01
Hyper-heuristics are search methodologies that aim to provide high-quality solutions across a wide variety of problem domains, rather than developing tailor-made methodologies for each problem instance/domain. A traditional hyper-heuristic framework has two levels, namely, the high-level strategy (heuristic selection mechanism and acceptance criterion) and the low-level heuristics (a set of problem-specific heuristics). Due to the different landscape structures of different problem instances, the high-level strategy plays an important role in the design of a hyper-heuristic framework. In this paper, we propose a new high-level strategy for a hyper-heuristic framework. The proposed high-level strategy utilizes a dynamic multi-armed bandit, extreme-value-based reward as an online heuristic selection mechanism to select the appropriate heuristic to be applied at each iteration. In addition, we propose a gene expression programming framework to automatically generate the acceptance criterion for each problem instance, instead of using human-designed criteria. Two well-known, and very different, combinatorial optimization problems, one static (exam timetabling) and one dynamic (dynamic vehicle routing), are used to demonstrate the generality of the proposed framework. Compared with state-of-the-art hyper-heuristics and other bespoke methods, empirical results demonstrate that the proposed framework is able to generalize well across both domains. We obtain competitive, if not better, results compared to the best known results obtained by other methods presented in the scientific literature. We also compare our approach against the recently released hyper-heuristic competition test suite. We again demonstrate the generality of our approach when we compare against other methods that have utilized the same six benchmark datasets from this test suite.
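The bandit-based heuristic selection can be illustrated with a UCB1-style selector. The paper's actual mechanism is a dynamic multi-armed bandit with an extreme-value-based reward, so the scoring rule below is a simplified stand-in, not the authors' formulation.

```python
import math

class BanditSelector:
    """UCB1-style online selection among a pool of low-level heuristics:
    each heuristic is an arm; its reward is the improvement it produced."""
    def __init__(self, n_heuristics, c=2.0):
        self.counts = [0] * n_heuristics
        self.rewards = [0.0] * n_heuristics
        self.c = c
        self.t = 0

    def select(self):
        self.t += 1
        for i, n in enumerate(self.counts):  # play each arm once first
            if n == 0:
                return i
        # Exploit mean reward, explore under-tried arms via the bonus term
        return max(range(len(self.counts)),
                   key=lambda i: self.rewards[i] / self.counts[i]
                   + math.sqrt(self.c * math.log(self.t) / self.counts[i]))

    def update(self, i, reward):
        self.counts[i] += 1
        self.rewards[i] += reward
```

In a hyper-heuristic loop, `reward` could be the normalized objective improvement achieved by heuristic `i` at that iteration; the selector then concentrates trials on the heuristics that have been paying off.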
Directory of Open Access Journals (Sweden)
Fawad Zaman
2015-01-01
Full Text Available We assume a bistatic phased Multiple-Input Multiple-Output (MIMO) radar with a passive Centrosymmetric Cross-Shaped Sensor Array (CSCA) at its receiver. The transmitter of this bistatic radar sends coherent signals using a subarray that gives a fairly wide beam with a large solid angle, so as to cover any potentially relevant target in the near field. We develop Heuristic Computational Intelligence (HCI) based techniques to jointly estimate the range, amplitude, and elevation and azimuth angles of the multiple targets impinging on the CSCA. In this connection, first the global search optimizers Particle Swarm Optimization (PSO) and Differential Evolution (DE) are developed separately and, to enhance their performance further, both are hybridized with a local search optimizer called the Active Set Algorithm (ASA). The performances of PSO, DE, PSO hybridized with ASA, and DE hybridized with ASA are first compared with each other and then with some traditional techniques available in the literature, using the root mean square error (RMSE) as the figure of merit.
An Empirical Comparison of Seven Iterative and Evolutionary Function Optimization Heuristics
Baluja, Shumeet
1995-01-01
This report is a repository of the results obtained from a large-scale empirical comparison of seven iterative and evolution-based optimization heuristics. Twenty-seven static optimization problems, spanning six sets of problem classes commonly explored in the genetic algorithm literature, are examined. The problem sets include job-shop scheduling, traveling salesman, knapsack, bin packing, neural network weight optimization, and standard numerical optimization. The search spaces in these problems range in size from 2^368 to 2^2040. The results indicate that using genetic algorithms for the optimization of static functions does not yield a benefit, in terms of the final answer obtained, over simpler optimization heuristics. The algorithms tested and the encodings of the problems are described in detail for reproducibility.
A nuclear heuristic for application to metaheuristics in-core fuel management optimization
Energy Technology Data Exchange (ETDEWEB)
Meneses, Anderson Alvarenga de Moura, E-mail: ameneses@lmp.ufrj.b [COPPE/Federal University of Rio de Janeiro, RJ (Brazil). Nuclear Engineering Program; Dalle Molle Institute for Artificial Intelligence (IDSIA), Manno-Lugano, TI (Switzerland); Gambardella, Luca Maria, E-mail: luca@idsia.c [Dalle Molle Institute for Artificial Intelligence (IDSIA), Manno-Lugano, TI (Switzerland); Schirru, Roberto, E-mail: schirru@lmp.ufrj.b [COPPE/Federal University of Rio de Janeiro, RJ (Brazil). Nuclear Engineering Program
2009-07-01
The In-Core Fuel Management Optimization (ICFMO) is a well-known problem of nuclear engineering characterized by complexity, a high number of feasible solutions, and a complex evaluation process with high computational cost, which makes a great number of evaluations during an optimization process prohibitive. Heuristics are criteria or principles for deciding which among several alternative courses of action is most effective with respect to some goal. In this paper, we propose a new approach for the use of relational heuristics for the search in the ICFMO. The heuristic is based on the reactivity of the fuel assemblies and their positions in the reactor core. It was applied to random search, resulting in less computational effort in terms of the number of evaluations of loading patterns during the search. The experiments demonstrate that it is possible to achieve results comparable to those in the literature, for future application to metaheuristics in the ICFMO. (author)
Directory of Open Access Journals (Sweden)
Chao-Chih Lin
2017-10-01
Full Text Available A new transient-based hybrid heuristic approach is developed to optimize a transient generation process and to detect leaks in pipe networks. The approach couples the ordinal optimization approach (OOA) and the symbiotic organism search (SOS) to solve the optimization problem by means of iterations. A pipe network analysis model (PNSOS) is first used to determine the steady-state head distribution and pipe flow rates. The best transient generation point and its relevant valve operation parameters are optimized by maximizing the objective function of transient energy. The transient event is created at the chosen point, and the method of characteristics (MOC) is used to analyze the transient flow. The OOA is applied to sift through the candidate pipes and the initial organisms with leak information. The SOS is employed to determine the leaks by minimizing the sum of differences between simulated and computed heads at the observation points. Two synthetic leaking scenarios, a simple pipe network and a water distribution network (WDN), are chosen to test the performance of leak detection ordinal symbiotic organism search (LDOSOS). Leak information can be accurately identified by the proposed approach for both scenarios. The presented technique makes a remarkable contribution to the success of leak detection in pipe networks.
Directory of Open Access Journals (Sweden)
Lopez-Loces Mario C.
2016-06-01
Full Text Available Internet shopping is one of the most common online activities, carried out by millions of users every day. As the number of available offers grows, so does the difficulty of finding the best one among all the shops. In this paper we propose an integer linear programming (ILP) model and two heuristic solutions, the MinMin algorithm and the cellular processing algorithm, to tackle the Internet shopping optimization problem with delivery costs. The obtained results improve those achieved by the state-of-the-art heuristics, and for small real case scenarios the ILP model delivers exact solutions in a reasonable amount of time.
New Heuristics for global optimization of complex bioprocesses
Egea Larrosa, Jose Alberto
2008-01-01
[ENG] Optimization problems arising from the biotechnological and food industries are usually of non-convex nature and often exhibit several local minima. Even though advances in global optimization research have been outstanding in recent years, the current state-of-the-art is not completely satisfactory, especially when one considers the global optimization of complex process models (typical of the biotechnological and food industries). These models are complex due to their dynamic beha...
Othman, Muhammad Murtadha; Abd Rahman, Nurulazmi; Musirin, Ismail; Fotuhi-Firuzabad, Mahmud; Rajabi-Ghahnavieh, Abbas
2015-01-01
This paper introduces a novel multiobjective approach for capacity benefit margin (CBM) assessment taking into account the tie-line reliability of interconnected systems. CBM is essential information used as a reference by load-serving entities (LSE) to estimate a certain margin of transfer capability so that reliable access to generation through the interconnected system can be attained. A new Pareto-based evolutionary programming (EP) technique is used to perform a simultaneous determination of CBM for all areas of the interconnected system. The selection of CBM at the Pareto optimal front is proposed to be performed by referring to a heuristic ranking index that takes into account the system loss of load expectation (LOLE) in various conditions. Eventually, the power-transfer-based available transfer capability (ATC) is determined by considering the firm and nonfirm transfers of CBM. A comprehensive set of numerical studies is conducted on the modified IEEE-RTS79, and the performance of the proposed method is numerically investigated in detail. The main advantage of the proposed technique is the flexibility offered to an independent system operator in selecting an appropriate solution of CBM simultaneously for all areas.
Automated generation of constructive ordering heuristics for educational timetabling
Pillay, Nelishia; Özcan, Ender
2017-01-01
Construction heuristics play an important role in solving combinatorial optimization problems. These heuristics are usually used to create an initial solution to the problem, which is then improved using optimization techniques such as metaheuristics. For examination timetabling and university course timetabling problems, essentially graph colouring heuristics have been used for this purpose. The process of deriving heuristics manually for educational timetabling is a time-consuming task. Furthermor...
Triangular Geometrized Sampling Heuristics for Fast Optimal Motion Planning
Directory of Open Access Journals (Sweden)
Ahmed Hussain Qureshi
2015-02-01
Full Text Available Rapidly-exploring Random Tree (RRT)-based algorithms have become increasingly popular due to their lower computational complexity as compared with other path planning algorithms. The recently presented RRT* motion planning algorithm improves upon the original RRT algorithm by providing optimal path solutions. While RRT determines an initial collision-free path fairly quickly, RRT* guarantees almost certain convergence to an optimal, obstacle-free path from the start to the goal points for any given geometrical environment. However, the main limitations of RRT* include its slow processing rate and high memory consumption, due to the large number of iterations required for calculating the optimal path. In order to overcome these limitations, we present a further improvement, i.e., the Triangular Geometrized-RRT* (TG-RRT*) algorithm, which utilizes triangular geometrical methods to improve the performance of the RRT* algorithm in terms of processing time and a decreased number of iterations required for an optimal path solution. Simulations comparing the performance of the improved TG-RRT* with RRT* are presented to demonstrate the overall improvement in performance and optimal path detection.
Directory of Open Access Journals (Sweden)
Tashkova Katerina
2011-10-01
Full Text Available Abstract Background We address the task of parameter estimation in models of the dynamics of biological systems based on ordinary differential equations (ODEs) from measured data, where the models are typically non-linear and have many parameters, the measurements are imperfect due to noise, and the studied system can often be only partially observed. A representative task is to estimate the parameters in a model of the dynamics of endocytosis, i.e., endosome maturation, reflected in a cut-out switch transition between the Rab5 and Rab7 domain protein concentrations, from experimental measurements of these concentrations. The general parameter estimation task and the specific instance considered here are challenging optimization problems, calling for the use of advanced meta-heuristic optimization methods, such as evolutionary or swarm-based methods. Results We apply three global-search meta-heuristic algorithms for numerical optimization, i.e., the differential ant-stigmergy algorithm (DASA), particle-swarm optimization (PSO), and differential evolution (DE), as well as a local-search derivative-based algorithm 717 (A717), to the task of estimating parameters in ODEs. We evaluate their performance on the considered representative task along a number of metrics, including the quality of reconstructing the system output and the complete dynamics, as well as the speed of convergence, both on real-experimental data and on artificial pseudo-experimental data with varying amounts of noise. We compare the four optimization methods under a range of observation scenarios, where data of different completeness and accuracy of interpretation are given as input. Conclusions Overall, the global meta-heuristic methods (DASA, PSO, and DE) clearly and significantly outperform the local derivative-based method (A717). Among the three meta-heuristics, differential evolution (DE) performs best in terms of the objective function, i.e., reconstructing the output, and in terms of convergence.
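The DE approach to ODE parameter estimation can be sketched in a self-contained way: a minimal DE/rand/1/bin optimizer fits the two parameters of a logistic-growth ODE to simulated data. The model, the forward-Euler integrator, and all settings are illustrative assumptions, far simpler than the endocytosis model in the study.

```python
import numpy as np

def simulate(params, y0=0.1, t_end=10.0, n=100):
    """Forward-Euler integration of logistic growth dy/dt = r*y*(1 - y/K)."""
    r, K = params
    dt = t_end / n
    ys = [y0]
    for _ in range(n):
        y = ys[-1]
        ys.append(y + dt * r * y * (1 - y / K))
    return np.array(ys)

def differential_evolution(cost, bounds, pop_size=20, n_gen=100,
                           F=0.7, CR=0.9, seed=0):
    """Minimal DE/rand/1/bin optimizer over box-constrained parameters."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds).T
    pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
    costs = np.array([cost(x) for x in pop])
    for _ in range(n_gen):
        for i in range(pop_size):
            # Mutate three distinct other members, then binomial crossover
            a, b, c = pop[rng.choice([j for j in range(pop_size) if j != i],
                                     3, replace=False)]
            mutant = np.clip(a + F * (b - c), lo, hi)
            cross = rng.random(len(bounds)) < CR
            trial = np.where(cross, mutant, pop[i])
            tc = cost(trial)
            if tc < costs[i]:          # greedy one-to-one replacement
                pop[i], costs[i] = trial, tc
    best = int(np.argmin(costs))
    return pop[best], costs[best]
```

Fitting proceeds by minimizing the mean squared error between the simulated trajectory and the observed one; with noiseless synthetic data the true parameters are recoverable.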
Smooth Constrained Heuristic Optimization of a Combinatorial Chemical Space
2015-05-01
[Abstract not available; only front-matter residue was extracted. The recoverable figure captions describe the optimization framework, in which each substituent site X may be replaced by -H, -F, -Cl, or -Br for a total of 216 possible molecules (Fig. 1), and a flowchart of the algorithm (Fig. 2).]
Efficient heuristic algorithm used for optimal capacitor placement in distribution systems
Energy Technology Data Exchange (ETDEWEB)
Segura, Silvio; Rider, Marcos J. [Department of Electric Energy Systems, University of Campinas, Campinas, Sao Paulo (Brazil); Romero, Ruben [Faculty of Engineering of Ilha Solteira, Paulista State University, Ilha Solteira, Sao Paulo (Brazil)
2010-01-15
An efficient heuristic algorithm is presented in this work to solve the optimal capacitor placement problem in radial distribution systems. The proposal uses the solution of the mathematical model, after relaxing the integrality of the discrete variables, as a strategy to identify the most attractive bus at which to add capacitors in each step of the heuristic algorithm. The relaxed mathematical model is a non-linear programming problem and is solved using a specialized interior point method. The algorithm also incorporates an additional local search strategy that enables finding a group of high-quality solutions after small alterations in the optimization strategy. The proposed solution methodology has been implemented and tested on electric systems known in the specialized literature, yielding satisfactory results compared with metaheuristic methods. (author)
Heuristic Optimization Approach to Selecting a Transport Connection in City Public Transport
Kul’ka Jozef; Mantič Martin; Kopas Melichar; Faltinová Eva; Kachman Daniel
2017-01-01
The article presents a heuristic optimization approach to select a suitable transport connection in the framework of a city public transport system. This methodology was applied to part of the public transport system in Košice, because it is the second largest city in the Slovak Republic and its public transport network creates a complex transport system consisting of three different transport modes, namely bus, tram and trolley-bus transport. This solution focuse...
A heuristic risk assessment technique for birdstrike management at airports.
Allan, John
2006-06-01
Collisions between birds and aircraft (birdstrikes) have caused the loss of at least 88 aircraft and 243 lives in world civil aviation. Conservative estimates suggest that more routine damage and delays following birdstrikes cost the industry and its insurers US$1.2-1.5 billion per year. The majority of strikes happen close to airports and most countries have regulations that require airport managers to control the birdstrike risk on their property. Birdstrike prevention has, however, lagged behind other aspects of flight safety in the development and implementation of risk assessment protocols, possibly because of the inherent difficulty in quantifying the variability in the populations and behavior of the various bird species involved. This article presents a technique that uses both national and airport-specific data to evaluate risk by creating a simple probability-times-severity matrix. It uses the frequency of strikes reported for different bird species at a given airport over the preceding five years as a measure of strike probability, and the proportion of strikes with each species that result in damage to aircraft, in the national birdstrike database, as a measure of likely severity. Action thresholds for risk levels for particular bird species are then defined, above which the airport should take action to reduce the risk further. The assessment is designed for airports where the reporting and collation of birdstrike events is reasonably consistent over time and where a bird hazard management program of some sort is already in place. This risk assessment is designed to measure risk to the airport as a business rather than risk to the traveling passenger individually. It therefore takes no account of aircraft movement rate in the calculations and is aimed at minimizing the number of damaging incidents rather than concentrating on catastrophic events. Once set up at an airport, the technique is simple to implement for nonexperts, and it allows managers to
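The probability-times-severity matrix described above can be sketched as below. The band boundaries and the action threshold are invented for illustration; the article defines its own thresholds, and a real assessment would calibrate them to airport and national data.

```python
def birdstrike_risk(local_strikes_5yr, national_damage_rate, threshold=0.5):
    """Score = strike-frequency band x damage-likelihood band; species whose
    normalized score meets the action threshold are flagged for management.
    Band boundaries here are illustrative, not from the cited study."""
    def prob_band(n):    # strikes with this species at the airport, last 5 years
        return 1 if n < 5 else 2 if n < 20 else 3

    def sev_band(d):     # national share of strikes with this species causing damage
        return 1 if d < 0.02 else 2 if d < 0.10 else 3

    results = {}
    for species, n in local_strikes_5yr.items():
        score = prob_band(n) * sev_band(national_damage_rate.get(species, 0.0))
        results[species] = (score, score / 9 >= threshold)  # (score, needs action)
    return results
```

For example, a species struck 30 times locally whose strikes cause damage 15% of the time nationally lands in the top cell of the matrix, while a rarely struck, rarely damaging species does not trigger action.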
Sun, Jian
2010-09-01
A new methodology for adapting rigorous simulation programs to the optimal supervisory control of a central chilled water plant is proposed in this article, which solves plant operation mode optimization and set-point optimization by combining heuristic search with sequential quadratic programming. The mathematical basis of this algorithm is developed. A new derivative calculation strategy is introduced for set-point optimization. This approach is applied to a central chilled water plant which consists of three chillers, two 3-cell cooling towers, three chilled water pumps and three condenser water pumps. A model verification study is performed. The optimal sequence of operation and the set points of the decision variables at a given load demand and weather condition are calculated. The plant performance and optimal control results are discussed.
Application of Fuzzy Sets for the Improvement of Routing Optimization Heuristic Algorithms
Directory of Open Access Journals (Sweden)
Mattas Konstantinos
2016-12-01
Full Text Available The determination of the optimal circular path has become widely known for its difficulty in producing a solution and for its numerous applications in the organization and management of passenger and freight transport. It is a mathematical combinatorial optimization problem for which several deterministic and heuristic models have been developed in recent years, applicable to route organization issues, passenger and freight transport, storage and distribution of goods, waste collection, supply and control of terminals, as well as human resource management. The scope of the present paper is the development, using fuzzy sets, of a practical, comprehensible and fast heuristic algorithm that improves the ability of classical deterministic algorithms to identify the optimal circular route, symmetric or asymmetric. The proposed fuzzy heuristic algorithm is compared to the corresponding deterministic ones with regard to the deviation of the proposed solution from the best known solution and the complexity of the calculations needed to obtain this solution. It is shown that the use of fuzzy sets reduced the deviation of the solution identified by the classical deterministic algorithms from the best known solution by up to 35%.
Performance of Optimization Heuristics for the Operational Planning of Multi-energy Storage Systems
Haas, J.; Schradi, J.; Nowak, W.
2016-12-01
In the transition to low-carbon energy sources, energy storage systems (ESS) will play an increasingly important role. Particularly in the context of solar power challenges (variability, uncertainty), ESS can provide valuable services: energy shifting, ramping, robustness against forecast errors, frequency support, etc. However, these qualities are rarely modelled in the operational planning of power systems because of the computational burden involved, especially when multiple ESS technologies are present. This work assesses two optimization heuristics for speeding up the optimal operation problem. It compares their accuracy (in terms of costs) and speed against a reference solution. The first heuristic (H1) is based on a merit order. Here, the ESS are sorted from lower to higher operational costs (including cycling costs). For each time step, the cheapest available ESS is used first, followed by the second one and so on, until the net load (demand minus available renewable generation) is matched. The second heuristic (H2) uses the Fourier transform to detect the main frequencies that compose the net load. A specific ESS is assigned to each frequency range, aiming to smooth the net load. Finally, the reference solution is obtained with a mixed integer linear program (MILP). H1, H2 and the MILP are subject to technical constraints (energy/power balance, ramping rates, on/off states...). Costs due to operation, replacement (cycling) and unserved energy are considered. Four typical days of a system with a high share of solar energy were used in several test cases, varying the resolution from one second to fifteen minutes. H1 and H2 achieve accuracies of about 90% and 95% on average, and speed-ups of two to three and one to two orders of magnitude, respectively. The use of these heuristics looks promising in the context of planning the expansion of power systems, especially when their loss of accuracy is outweighed by solar or wind forecast errors.
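Heuristic H1 (merit-order dispatch) can be sketched as follows; the storage data structure, the field names, and the power/energy limits are illustrative assumptions rather than the authors' model, and ramping constraints are omitted for brevity.

```python
def merit_order_dispatch(net_load, storages):
    """H1 sketch: at each time step, discharge (or charge) the cheapest
    storage first, then the next, until the net load is covered.
    Each storage is a dict with name, cost, p_max, e_max, and soc
    (state of charge); soc is updated in place. Positive power means
    discharging to serve load; negative means charging from a surplus."""
    units = sorted(storages, key=lambda s: s["cost"])  # cheapest first
    schedule = []
    for load in net_load:
        step, residual = {}, load
        for s in units:
            if residual > 0:    # discharge to serve the remaining load
                p = min(residual, s["p_max"], s["soc"])
                s["soc"] -= p
            else:               # surplus: charge into remaining headroom
                p = -min(-residual, s["p_max"], s["e_max"] - s["soc"])
                s["soc"] += -p
            residual -= p
            step[s["name"]] = p
        step["unserved"] = residual  # load (or surplus) no unit could absorb
        schedule.append(step)
    return schedule
```

Because the ordering is fixed in advance, each time step costs only one pass over the units, which is the source of the orders-of-magnitude speed-up over the MILP reference.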
Françoise Benz
2004-01-01
ACADEMIC TRAINING LECTURE REGULAR PROGRAMME 1, 2, 3 and 4 June From 11:00 hrs to 12:00 hrs - Main Auditorium bldg. 500 Evolutionary Heuristic Optimization: Genetic Algorithms and Estimation of Distribution Algorithms V. Robles Forcada and M. Perez Hernandez / Univ. de Madrid, Spain In the real world, there exist a huge number of problems that require getting an optimum or near-to-optimum solution. Optimization can be used to solve a lot of different problems such as network design, sets and partitions, storage and retrieval or scheduling. On the other hand, in nature, there exist many processes that seek a stable state. These processes can be seen as natural optimization processes. Over the last 30 years several attempts have been made to develop optimization algorithms, which simulate these natural optimization processes. These attempts have resulted in methods such as Simulated Annealing, based on natural annealing processes or Evolutionary Computation, based on biological evolution processes. Geneti...
Near-Optimal Tracking Control of Mobile Robots Via Receding-Horizon Dual Heuristic Programming.
Lian, Chuanqiang; Xu, Xin; Chen, Hong; He, Haibo
2016-11-01
Trajectory tracking control of wheeled mobile robots (WMRs) has been an important research topic in control theory and robotics. Although various tracking control methods with stability guarantees have been developed for WMRs, it is still difficult to design an optimal or near-optimal tracking controller under uncertainties and disturbances. In this paper, a near-optimal tracking control method is presented for WMRs based on receding-horizon dual heuristic programming (RHDHP). In the proposed method, a backstepping kinematic controller is designed to generate desired velocity profiles, and the receding-horizon strategy is used to decompose the infinite-horizon optimal control problem into a series of finite-horizon optimal control problems. In each horizon, a closed-loop tracking control policy is successively updated using a class of approximate dynamic programming algorithms called finite-horizon dual heuristic programming (DHP). The convergence property of the proposed method is analyzed and it is shown that the tracking control system based on RHDHP is asymptotically stable using the Lyapunov approach. Simulation results on three tracking control problems demonstrate that the proposed method has improved control performance when compared with conventional model predictive control (MPC) and DHP. It is also illustrated that the proposed method has a lower computational burden than conventional MPC, which is very beneficial for real-time tracking control.
A Modified Particle Swarm Optimization Technique for Finding Optimal Designs for Mixture Models
Wong, Weng Kee; Chen, Ray-Bing; Huang, Chien-Chih; Wang, Weichung
2015-01-01
Particle Swarm Optimization (PSO) is a meta-heuristic algorithm that has been shown to be successful in solving a wide variety of real and complicated optimization problems in engineering and computer science. This paper introduces a projection-based PSO technique, named ProjPSO, to efficiently find different types of optimal designs, or nearly optimal designs, for mixture models with and without constraints on the components, and also for related models, like the log contrast models. We also compare the modified PSO's performance with Fedorov's algorithm, a popular algorithm used to generate optimal designs, the Cocktail algorithm, and the recent algorithm proposed by [1]. PMID:26091237
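The core PSO loop the paper builds on can be sketched as follows; the renormalization step standing in for ProjPSO's projection onto the mixture simplex is an assumption for illustration, as are all parameter values:

```python
import random

def pso_mixture(objective, n_components, n_particles=30, iters=200):
    """Minimal PSO sketch over the probability simplex (mixture weights sum to 1)."""
    def project(x):
        # Clip to positive values and renormalize so the weights sum to 1.
        x = [max(xi, 1e-9) for xi in x]
        s = sum(x)
        return [xi / s for xi in x]

    pos = [project([random.random() for _ in range(n_components)])
           for _ in range(n_particles)]
    vel = [[0.0] * n_components for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]

    w, c1, c2 = 0.7, 1.5, 1.5  # generic inertia / attraction weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_components):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
            pos[i] = project([pos[i][d] + vel[i][d] for d in range(n_components)])
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

Every candidate is projected back to the simplex after each move, so the swarm only ever explores feasible mixture weights.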
Heuristic Optimization Approach to Selecting a Transport Connection in City Public Transport
Directory of Open Access Journals (Sweden)
Kul'ka, Jozef; Mantič, Martin; Kopas, Melichar; Faltinová, Eva; Kachman, Daniel
2017-02-01
Full Text Available The article presents a heuristic optimization approach for selecting a suitable transport connection within a city public transport network. The methodology was applied to part of the public transport system in Košice, the second largest city in the Slovak Republic, whose public transport network forms a complex transport system consisting of three transport modes, namely bus, tram and trolley-bus transport. The solution focuses on examining the individual transport services and their interconnection at the relevant interchange points.
Directory of Open Access Journals (Sweden)
Muhammad Farhan Ausaf
2015-12-01
Full Text Available Process planning and scheduling are two important components of a manufacturing setup. It is important to integrate them to achieve better global optimality and improved system performance. Numerous algorithm-based approaches exist for finding optimal solutions to the integrated process planning and scheduling (IPPS) problem, most of which apply existing meta-heuristic algorithms. Although these approaches have been shown to be effective in optimizing the IPPS problem, there is still room for improvement in terms of solution quality and algorithm efficiency, especially for more complicated problems. Dispatching rules have been successfully utilized for solving complicated scheduling problems, but have not been considered extensively for the IPPS problem. The approach presented here incorporates dispatching rules with the concept of prioritizing jobs in an algorithm called the priority-based heuristic algorithm (PBHA). PBHA establishes job and machine priorities for selecting operations. Priority assignment and a set of dispatching rules are used simultaneously to generate both the process plans and the schedules for all jobs and machines. The algorithm was tested on a series of benchmark problems. It was able to achieve superior results for most complex problems presented in recent literature while utilizing fewer computational resources.
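As a minimal illustration of a dispatching rule (not PBHA itself, whose priority scheme is more elaborate), the classical shortest-processing-time rule on a single machine might look like:

```python
def dispatch_spt(jobs):
    """Dispatching-rule sketch: schedule jobs on one machine by the
    shortest-processing-time (SPT) rule, which minimizes mean flow time.

    `jobs` maps a job id to its processing time (an illustrative data layout).
    Returns the sequence of (job id, completion time) pairs.
    """
    t, schedule = 0, []
    for job, p in sorted(jobs.items(), key=lambda kv: kv[1]):
        t += p
        schedule.append((job, t))
    return schedule
```

In an integrated setting, several such rules would be evaluated per decision point and combined with job/machine priorities, as the abstract describes.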
An adaptive heuristic cross-entropy algorithm for optimal design of water distribution systems
Perelman, Lina; Ostfeld, Avi
2007-06-01
The optimal design problem of a water distribution system is to find the water distribution system component characteristics (e.g. pipe diameters, pump heads and maximum power, reservoir storage volumes, etc.) which minimize the system's capital and operational costs such that the system hydraulic laws are maintained (i.e. Kirchhoff's first and second laws), and constraints on quantities and pressures at the consumer nodes are fulfilled. In this study, an adaptive stochastic algorithm for water distribution systems optimal design based on the heuristic cross-entropy method for combinatorial optimization is presented. The algorithm is demonstrated using two well-known benchmark examples from the water distribution systems research literature for single loading gravitational systems, and an example of multiple loadings, pumping, and storage. The results show the cross-entropy dominance over previously published methods.
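The heuristic cross-entropy method for combinatorial optimization mentioned above can be sketched for a generic 0/1 design vector; the Bernoulli sampling model, smoothing constant, and elite fraction are illustrative assumptions, not the paper's adaptive scheme:

```python
import random

def cross_entropy_binary(cost, n, samples=100, elite_frac=0.1, iters=50, smooth=0.7):
    """Cross-entropy sketch: sample 0/1 vectors from independent Bernoulli(p[i])
    distributions, then refit the probabilities to the elite (lowest-cost)
    samples each iteration, with smoothing to avoid premature convergence."""
    p = [0.5] * n
    best, best_cost = None, float("inf")
    n_elite = max(1, int(samples * elite_frac))
    for _ in range(iters):
        pop = [[1 if random.random() < p[i] else 0 for i in range(n)]
               for _ in range(samples)]
        pop.sort(key=cost)
        if cost(pop[0]) < best_cost:
            best, best_cost = pop[0], cost(pop[0])
        elite = pop[:n_elite]
        for i in range(n):
            freq = sum(x[i] for x in elite) / n_elite
            p[i] = smooth * freq + (1 - smooth) * p[i]  # smoothed update
    return best, best_cost
```

For a pipe-sizing problem, each bit pattern would encode discrete diameter choices and `cost` would call a hydraulic simulation plus penalty terms.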
A Heuristic Optimal Discrete Bit Allocation Algorithm for Margin Maximization in DMT Systems
Directory of Open Access Journals (Sweden)
Dong Shi-Wei
2007-01-01
Full Text Available A heuristic optimal discrete bit allocation algorithm is proposed for solving the margin maximization problem in discrete multitone (DMT) systems. Starting from an initial equal-power-assignment bit distribution, the proposed algorithm employs a multistage bit rate allocation scheme to meet the target rate. If the total bit rate is far from the target rate, a multiple-bits loading procedure is used to obtain a bit allocation close to the target rate. When close to the target rate, a parallel bit-loading procedure is used to achieve the target rate; this is computationally more efficient than the conventional greedy bit-loading algorithm. Finally, the target bit rate distribution is checked: if it is efficient, it is also the optimal solution; otherwise, the optimal bit distribution can be obtained with only a few bit swaps. Simulation results using the standard asymmetric digital subscriber line (ADSL) test loops show that the proposed algorithm is efficient for practical DMT transmissions.
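For context, the conventional greedy bit-loading baseline that the proposed algorithm is compared against can be sketched as follows; the incremental-power model is a simplified assumption:

```python
import heapq

def greedy_bit_loading(gains, target_bits, noise=1.0):
    """Greedy bit-loading sketch: repeatedly add one bit to the subchannel with
    the smallest incremental power cost until the target rate is met.

    The cost of raising subchannel k from b to b+1 bits is modeled here as
    noise * (2**(b+1) - 2**b) / gains[k], an illustrative stand-in for the
    usual gap-approximation formula.
    """
    bits = [0] * len(gains)
    # Min-heap of (incremental power for the next bit, channel index).
    heap = [(noise * (2 ** 1 - 2 ** 0) / g, k) for k, g in enumerate(gains)]
    heapq.heapify(heap)
    for _ in range(target_bits):
        _, k = heapq.heappop(heap)
        bits[k] += 1
        inc = noise * (2 ** (bits[k] + 1) - 2 ** bits[k]) / gains[k]
        heapq.heappush(heap, (inc, k))
    return bits
```

Because each of the `target_bits` steps costs one heap operation, the run time grows with the target rate, which is what the parallel procedure above avoids.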
Wibisono, E.; Santoso, A.; Sunaryo, M. A.
2017-11-01
XYZ is a distributor of various consumer goods products. The company plans its delivery routes daily and, in order to obtain route constructions in a short amount of time, it simplifies the process by assigning drivers based on geographic regions. This approach results in inefficient use of vehicles, leading to imbalanced workloads. In this paper, we propose a combined method involving heuristics and optimization to obtain better solutions in acceptable computation time. The heuristic is based on a time-oriented nearest neighbor (TONN) rule used to form clusters if the number of locations is higher than a certain value. The optimization part uses a mathematical formulation of the vehicle routing problem that considers heterogeneous vehicles, time windows, and fixed costs (HVRPTWF) and is used to solve the routing problem within clusters. A case study using data from one month of the company's operations is analyzed, and data from one day of operations are detailed in this paper. The analysis shows that the proposed method yields 24% cost savings in that month, and savings can be as high as 54% on a single day.
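A time-oriented nearest-neighbor clustering step of the kind described above might be sketched like this; the combined space-time closeness measure and its weight are assumptions, not the paper's exact TONN metric:

```python
import math

def tonn_clusters(customers, max_size, w_time=0.5):
    """Time-oriented nearest-neighbor (TONN) clustering sketch: seed each cluster
    with the earliest-ready unassigned customer, then grow it with the nearest
    customer under a combined space-time metric.

    `customers` is a list of (x, y, ready_time) tuples; `w_time` weights the
    temporal term against Euclidean distance (both are illustrative choices).
    """
    def closeness(a, b):
        (xa, ya, ta), (xb, yb, tb) = a, b
        return math.hypot(xa - xb, ya - yb) + w_time * abs(ta - tb)

    remaining = list(range(len(customers)))
    clusters = []
    while remaining:
        seed = min(remaining, key=lambda i: customers[i][2])  # earliest ready time
        cluster = [seed]
        remaining.remove(seed)
        while remaining and len(cluster) < max_size:
            last = customers[cluster[-1]]
            nxt = min(remaining, key=lambda i: closeness(last, customers[i]))
            cluster.append(nxt)
            remaining.remove(nxt)
        clusters.append(cluster)
    return clusters
```

Each resulting cluster would then be routed exactly by the HVRPTWF model, keeping the exact optimization small enough to solve quickly.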
Mechanical Design Optimization Using Advanced Optimization Techniques
Rao, R Venkata
2012-01-01
Mechanical design includes an optimization process in which designers always consider objectives such as strength, deflection, weight, wear, corrosion, etc. depending on the requirements. However, design optimization for a complete mechanical assembly leads to a complicated objective function with a large number of design variables. It is good practice to apply optimization techniques to individual components or intermediate assemblies rather than to a complete assembly. Analytical or numerical methods for calculating the extreme values of a function may perform well in many practical cases, but may fail in more complex design situations. In real design problems, the number of design parameters can be very large and their influence on the value to be optimized (the goal function) can be very complicated, having a nonlinear character. In these complex cases, advanced optimization algorithms offer solutions to the problems, because they find a solution near to the global optimum within reasonable time and computational ...
Bellala, Djamel; Smadi, Hacene; Medjghou, Aicha
Exact solutions to the TSP are typically difficult to obtain from a computational point of view, because of the problem's size and time complexity. That is why heuristics are substituted for exact algorithms in order to provide a good solution to the problem. In this paper two heuristics, the nearest-neighbor and the subtour-reversal algorithms, are used to solve an industrial problem. The first algorithm produces a tour by which the industrial process can be carried out, while the second algorithm generally provides an improvement to that tour.
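The two heuristics named above can be sketched as follows, assuming Euclidean distances between points (the industrial problem's actual distance data is not given here):

```python
import math

def nearest_neighbor_tour(points, start=0):
    """Nearest-neighbor construction: always visit the closest unvisited city."""
    dist = lambda a, b: math.dist(points[a], points[b])
    unvisited = set(range(len(points))) - {start}
    tour = [start]
    while unvisited:
        nxt = min(unvisited, key=lambda j: dist(tour[-1], j))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def reverse_improve(tour, points):
    """Subtour reversal (2-opt style): reverse any segment that shortens the
    tour, repeating until no improving reversal remains."""
    dist = lambda a, b: math.dist(points[a], points[b])
    n = len(tour)
    improved = True
    while improved:
        improved = False
        for i in range(1, n - 1):
            for j in range(i + 1, n):
                a, b = tour[i - 1], tour[i]
                c, d = tour[j], tour[(j + 1) % n]
                if dist(a, c) + dist(b, d) < dist(a, b) + dist(c, d) - 1e-12:
                    tour[i:j + 1] = reversed(tour[i:j + 1])
                    improved = True
    return tour
```

Run together, `reverse_improve(nearest_neighbor_tour(pts), pts)` mirrors the construct-then-improve pairing the abstract describes.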
A Multi-Inner-World Genetic Algorithm Using Multiple Heuristics to Optimize Delivery Schedule
Sakurai, Yoshitaka; Onoyama, Takashi; Tsukamoto, Natsuki; Takada, Kouhei; Tsuruta, Setsuo
A delivery route optimization that improves the efficiency of real-time delivery or a distribution network requires solving Traveling Salesman Problems (TSP) (1)(2) of several tens to hundreds of cities within interactive response time, with expert-level accuracy (an error rate of less than about 3%). To meet these requirements, a multi-inner-world Genetic Algorithm (Miw-GA) method is developed. This method combines several types of GA inner worlds. Each world uses a different type of heuristic, such as a 2-opt type mutation world and a block (Nearest Insertion) type mutation world. Comparison based on experimental results proved the method superior to other methods, including our previously proposed one.
Transport Logistics Optimization Model Of Passenger Car Based On Heuristic Algorithm
Directory of Open Access Journals (Sweden)
Zhou Juan
2016-01-01
Full Text Available Passenger car logistics transportation is still at an early stage of development in China; great economic losses and waste of resources arise in the transportation process because most enterprises in China currently rely on manual experience for guidance. Previous attempts to solve this problem with specific genetic algorithms also suffer from many defects, such as programming complexity, instability and low efficiency. In view of this, a heuristic algorithm based on greedy and 0-1 knapsack ideas is proposed, and a general model of passenger car logistics transportation is built accordingly. Experimental results show that the model not only records the concrete loading scheme, but can also optimize it by changing the loading sequence. The model is applicable to similar transport logistics problems and has great practical and economic value.
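A greedy 0-1 loading step in the spirit of the described heuristic might look like the following sketch; the value-density sorting criterion and the data layout are assumptions:

```python
def greedy_load(cars, capacity):
    """Greedy 0-1 knapsack-style loading sketch: sort items by value density
    (value per unit of occupied space) and load while capacity remains.

    `cars` is a list of (name, size, value) tuples; names and values here are
    illustrative, standing in for car models and their shipping priorities.
    """
    loaded, used, value = [], 0.0, 0.0
    for name, size, val in sorted(cars, key=lambda c: c[2] / c[1], reverse=True):
        if used + size <= capacity:
            loaded.append(name)
            used += size
            value += val
    return loaded, value
```

Re-running the same routine with different loading sequences is the kind of cheap re-optimization the abstract attributes to the model.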
Optimal and heuristic algorithms of planning of low-rise residential buildings
Kartak, V. M.; Marchenko, A. A.; Petunin, A. A.; Sesekin, A. N.; Fabarisova, A. I.
2017-10-01
The problem of the optimal layout of a low-rise residential building is considered. Each apartment must be no smaller than the corresponding apartment from the proposed list, all requests must be satisfied, and the excess of the total area over the total area of the apartments in the list must be minimized. The difference in areas arises due to the discreteness of distances between bearing walls and a number of other technological constraints. It is shown that this problem is NP-hard. The authors built an integer linear model and conducted its qualitative analysis, and also developed a heuristic algorithm for solving high-dimensional instances. A computational experiment was conducted, confirming the efficiency of the proposed approach. Practical recommendations on the use of the proposed algorithms are given.
Optimized LTE cell planning for multiple user density subareas using meta-heuristic algorithms
Ghazzai, Hakim
2014-09-01
Base station deployment in cellular networks is one of the most fundamental problems in network design. This paper proposes a novel method for the cell planning problem for fourth generation 4G-LTE cellular networks using meta-heuristic algorithms. In this approach, we aim to satisfy both coverage and cell capacity constraints simultaneously by formulating a practical optimization problem. We start by performing a typical coverage and capacity dimensioning to identify the initial required number of base stations. Afterwards, we implement a Particle Swarm Optimization algorithm or a recently-proposed Grey Wolf Optimizer to find the optimal base station locations that satisfy both problem constraints in the area of interest, which can be divided into several subareas with different user densities. Subsequently, an iterative approach is executed to eliminate eventual redundant base stations. We have also performed Monte Carlo simulations to study the performance of the proposed scheme and computed the average number of users in outage. Results show that our proposed approach respects in all cases the desired network quality of service, even for large-scale dimension problems.
Directory of Open Access Journals (Sweden)
Moreno-Pérez José A.
2005-01-01
Full Text Available In this paper we discuss the application of a meta-heuristic approach based on Scatter Search to deal with robust optimization of the planning problem in deploying the Dense Wavelength Division Multiplexing (DWDM) technology on an existing optical fiber network, taking into account, in addition to the forecasted demands, the uncertainty in the survivability requirements.
Index Fund Optimization Using a Genetic Algorithm and a Heuristic Local Search
Orito, Yukiko; Inoguchi, Manabu; Yamamoto, Hisashi
It is well known that index funds are popular passively managed portfolios and have been used very extensively for hedge trading. Index funds consist of a certain number of stocks of companies listed on a stock market, such that the fund's return rates follow a path similar to the changing rates of the market indices. However, it is hard to make a perfect index fund consisting of all companies included in the given market index. Thus, index fund optimization can be viewed as a combinatorial optimization problem for portfolio management. In this paper, we propose an optimization method that consists of a genetic algorithm and a heuristic local search algorithm to create a strong linear association between the fund's return rates and the changing rates of the market index. We apply the method to the Tokyo Stock Exchange and make index funds whose return rates follow a path similar to the changing rates of the Tokyo Stock Price Index (TOPIX). The results show that our proposed method makes index funds with strong linear association to the market index in a small computing time.
Directory of Open Access Journals (Sweden)
Vinicius Amorim Sobreiro
2013-06-01
Full Text Available The definition of the product mix determines the allocation of productive resources in the manufacturing process and the optimization of the productive system. However, defining the product mix is an NP-complete problem, in other words, one that is difficult to solve. Taking this into account, with the aid of the Theory of Constraints (TOC), some constructive heuristics have been presented to help solve this problem. Thus, the objective of this paper is to propose a new heuristic that provides better solutions than the main heuristic presented in the literature, the TOC-h of Fredendall and Lea. To perform this comparison, simulations were carried out with the objective of identifying the product mix with the best throughput, considering CPU time and the characteristics of the productive environment. The results show that the proposed heuristic was more satisfactory when compared to TOC-h and yields good solutions when compared with the optimum solution. This demonstrates the importance of the proposed heuristic in the definition of the product mix.
Energy Technology Data Exchange (ETDEWEB)
Pholdee, Nantiwat; Bureerat, Su Jin [Khon Kaen University, Khon Kaen (Thailand); Baek, Hyun Moo [DTaQ, Changwon (Korea, Republic of); Im, Yong Taek [KAIST, Daejeon (Korea, Republic of)
2015-08-15
Process optimization of a Non-circular drawing (NCD) sequence of a pearlitic steel wire was performed to improve the mechanical properties of the drawn wire based on surrogate-assisted meta-heuristic algorithms. The objective function was introduced to minimize inhomogeneity of the effective strain distribution at the cross-section of the drawn wire, which could deteriorate the delamination characteristics of drawn wires. The design variables introduced were the die geometry and the reduction of area of the NCD sequence. Several surrogate models and their combinations with the weighted sum technique were utilized. In the process optimization of the NCD sequence, the surrogate models were used to predict effective strain distributions at the cross-section of the drawn wire. Optimization using the Differential evolution (DE) algorithm was performed, while the objective function was calculated from the predicted effective strains. The accuracy of all surrogate models was investigated, and the optimum results were compared with a previous study available in the literature. It was found that hybrid surrogate models can improve prediction accuracy compared to a single surrogate model. The best result was obtained from the combination of Kriging (KG) and Support vector regression (SVR) models, while the second best was obtained from the combination of four surrogate models: Polynomial response surface (PRS), Radial basis function (RBF), KG, and SVR. The optimum results found in this study showed better effective strain homogeneity at the cross-section of the drawn wire with the same total reduction of area as the previous work, using fewer passes. The multi-surrogate models with the weighted sum technique were found to be powerful in improving the delamination characteristics of the drawn wire and reducing the production cost.
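The plain DE/rand/1/bin scheme underlying the study can be sketched as below; in the paper the objective is evaluated through surrogate models rather than directly, and the control parameters here are generic defaults:

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9, iters=100):
    """DE/rand/1/bin sketch: mutate with a scaled difference of two random
    members added to a third, binomially cross over with the current member,
    and accept the trial if it is no worse."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(iters):
        for i in range(pop_size):
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            jrand = random.randrange(dim)  # guarantees at least one mutated gene
            trial = []
            for d in range(dim):
                if random.random() < CR or d == jrand:
                    v = pop[a][d] + F * (pop[b][d] - pop[c][d])
                    lo, hi = bounds[d]
                    trial.append(min(max(v, lo), hi))
                else:
                    trial.append(pop[i][d])
            ft = f(trial)
            if ft <= fit[i]:
                pop[i], fit[i] = trial, ft
    best = min(range(pop_size), key=lambda i: fit[i])
    return pop[best], fit[best]
```

In the surrogate-assisted variant, `f` would be a cheap model (e.g. Kriging plus SVR) fitted to a handful of expensive finite-element simulations.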
Roozitalab, Ali; Asgharizadeh, Ezzatollah
2013-12-01
Warranty is now an integral part of each product. Since its length is directly related to the cost of production, it should be set in such a way that it maximizes revenue generation and customers' satisfaction. Furthermore, based on the behavior of customers, it is assumed that increasing the warranty period to earn the trust of more customers leads to more sales until the market is saturated. We should bear in mind that different groups of consumers have different consumption behaviors and that the performance of the product has a direct impact on the failure rate over the life of the product. Therefore, the optimum duration for every group is different. In practice, however, we cannot offer different warranty periods to different customer groups. Consequently, using the cuckoo meta-heuristic optimization algorithm, we try to find a common period for the entire population. Results with high convergence offer a term length that maximizes the aforementioned goals simultaneously. The approach was tested using real data from an appliance company. The results indicate a significant increase in sales when the optimization approach was applied; the longer warranty is supported by increased sales revenue, increasing profit margins rather than reducing them.
Wu, Hao; Wan, Zhong
2018-02-01
In this paper, a multiobjective mixed-integer piecewise nonlinear programming model (MOMIPNLP) is built to formulate the management problem of an urban mining system, where the decision variables are associated with buy-back pricing, choices of sites, transportation planning, and adjustment of production capacity. Different from existing approaches, the social negative effect generated from structural optimization of the recycling system is minimized in our model, while the total recycling profit and the utility from environmental improvement are jointly maximized. For solving the problem, the MOMIPNLP model is first transformed into an ordinary mixed-integer nonlinear programming model by variable substitution, such that the piecewise feature of the model is removed. Then, based on the technique of orthogonal design, a hybrid heuristic algorithm is developed to find an approximate Pareto-optimal solution, where a genetic algorithm is used to optimize the structure of the search neighborhood, and both the local branching algorithm and the relaxation-induced neighborhood search algorithm are employed to cut the search branches and reduce the number of variables in each branch. Numerical experiments indicate that this algorithm spends less CPU (central processing unit) time in solving large-scale regional urban mining management problems, especially in comparison with similar ones available in the literature. By case study and sensitivity analysis, a number of practical managerial implications are revealed from the model. Since the metal stocks in society are reliable overground mineral sources, urban mining has received great attention as an emerging strategic resource in an era of resource shortage. By mathematical modeling and the development of efficient algorithms, this paper provides decision makers with useful suggestions on the optimal design of recycling systems in urban mining. For example, this paper can answer how to encourage enterprises to join the recycling activities.
Ortiz-Matos, L.; Aguila-Tellez, A.; Hincapié-Reyes, R. C.; González-Sanchez, J. W.
2017-07-01
In order to design electrification systems, recent mathematical models solve the problem of location, the type of electrification components, and the design of possible distribution microgrids. However, as the number of points to be electrified increases, solving these models requires high computational times, making them impractical. This study proposes a new heuristic method for the electrification of rural areas in order to overcome this problem. The heuristic algorithm addresses the deployment of rural electrification microgrids by finding routes for the optimal placement of lines and transformers in transmission and distribution microgrids. The challenge is to obtain a deployment with equity in losses, considering the capacity constraints of the devices and the topology of the terrain, at minimal economic cost. An optimal scenario ensures the electrification of all neighbourhoods at a minimum investment cost in terms of the distance between electric conductors and the number of transformation devices.
A Heuristic Design Information Sharing Framework for Hard Discrete Optimization Problems
National Research Council Canada - National Science Library
Jacobson, Sheldon H
2007-01-01
.... This framework has been used to gain new insights into neighborhood structure designs that allow different neighborhood functions to share information when using the same heuristic applied to the same problem...
Heuristic Optimization Applied to an Intrinsically Difficult Problem: Birds Formation Flight
DEFF Research Database (Denmark)
Filippone, Antonino
1996-01-01
The birds formation flight is studied by means of theoretical aerodynamics, heuristic methods and distributed systems. A simplified aerodynamic analog is presented, and calculations of drag savings and flight range are shown for some typical cases, including the line abreast flight with various...
FocusHeuristics - expression-data-driven network optimization and disease gene prediction.
Ernst, Mathias; Du, Yang; Warsow, Gregor; Hamed, Mohamed; Endlich, Nicole; Endlich, Karlhans; Murua Escobar, Hugo; Sklarz, Lisa-Madeleine; Sender, Sina; Junghanß, Christian; Möller, Steffen; Fuellen, Georg; Struckmann, Stephan
2017-02-16
To identify genes contributing to disease phenotypes remains a challenge for bioinformatics. Static knowledge on biological networks is often combined with the dynamics observed in gene expression levels over disease development to find markers for diagnostics and therapy, and also putative disease-modulatory drug targets and drugs. The basis of current methods ranges from a focus on expression levels (Limma) to concentrating on network characteristics (PageRank, HITS/Authority Score), and both (DeMAND, Local Radiality). We present an integrative approach (the FocusHeuristics) that is thoroughly evaluated based on public expression data and molecular disease characteristics provided by DisGeNet. The FocusHeuristics combines three scores: the log fold change, and two more based on the sum and the difference of log fold changes of genes/proteins linked in a network. A gene is kept when one of the scores to which it contributes is above a threshold. Our FocusHeuristics is both a predictor of gene-disease association and a bioinformatics method to reduce biological networks to their disease-relevant parts by highlighting the dynamics observed in expression data. The FocusHeuristics is slightly, but significantly, better than other methods, as measured by its more successful identification of disease-associated genes (AUC), and it delivers mechanistic explanations for its choice of genes.
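The three-score filtering idea can be paraphrased in a short sketch; the function and score names below are illustrative, not the tool's actual API:

```python
def focus_scores(lfc, edges, threshold):
    """Sketch of the three-score filter described above: a gene is kept if its
    own log fold change, or the sum or the difference of log fold changes across
    any of its network edges, exceeds the threshold in absolute value.

    `lfc` maps gene -> log fold change; `edges` is a list of (gene, gene) pairs.
    """
    keep = set()
    for gene, value in lfc.items():
        if abs(value) >= threshold:  # score 1: the gene's own log fold change
            keep.add(gene)
    for a, b in edges:
        s = lfc[a] + lfc[b]          # score 2: concordant change along an edge
        d = lfc[a] - lfc[b]          # score 3: discordant change along an edge
        if abs(s) >= threshold or abs(d) >= threshold:
            keep.update((a, b))
    return keep
```

The returned set is the disease-relevant subnetwork's node set; everything else is pruned away.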
Harmonic Optimization in Voltage Source Inverter for PV Application using Heuristic Algorithms
Kandil, Shaimaa A.; Ali, A. A.; El Samahy, Adel; Wasfi, Sherif M.; Malik, O. P.
2016-12-01
Selective Harmonic Elimination (SHE) is a fundamental switching frequency scheme that is used to eliminate specific order harmonics. Its application to minimize low order harmonics in a three-level inverter is proposed in this paper. The modulation strategy used here is SHEPWM, and the nonlinear equations that characterize the low order harmonics are solved using the Harmony Search Algorithm (HSA) to obtain the optimal switching angles that minimize the required harmonics and maintain the fundamental at the desired value. The Total Harmonic Distortion (THD) of the output voltage is minimized while maintaining the selected harmonics within allowable limits. A comparison has been drawn between HSA, a Genetic Algorithm (GA) and the Newton-Raphson (NR) technique using MATLAB software to determine their effectiveness in obtaining optimized switching angles.
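A generic Harmony Search loop of the kind applied to the SHE equations can be sketched as follows; all parameter values are common defaults, not those of the paper:

```python
import random

def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.05, iters=2000):
    """Harmony Search sketch: build each new candidate dimension-wise from
    harmony memory (probability hmcr), optionally pitch-adjust it (probability
    par, bandwidth bw), or draw it at random; replace the worst memory entry
    whenever the candidate improves on it."""
    dim = len(bounds)
    memory = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    fit = [f(x) for x in memory]
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if random.random() < hmcr:
                v = memory[random.randrange(hms)][d]      # memory consideration
                if random.random() < par:                 # pitch adjustment
                    v += random.uniform(-bw, bw) * (hi - lo)
            else:
                v = random.uniform(lo, hi)                # random selection
            new.append(min(max(v, lo), hi))
        fn = f(new)
        worst = max(range(hms), key=lambda i: fit[i])
        if fn < fit[worst]:
            memory[worst], fit[worst] = new, fn
    best = min(range(hms), key=lambda i: fit[i])
    return memory[best], fit[best]
```

For SHE, `f` would be the squared residual of the harmonic equations over the switching angles, with bounds keeping the angles ordered within a quarter period.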
Directory of Open Access Journals (Sweden)
Camilo Caraveo
2017-07-01
Full Text Available Fuzzy logic is a soft computing technique that has been very successful in recent years when used as a complement to improve meta-heuristic optimization. In this paper, we present a new variant of the bio-inspired optimization algorithm based on the self-defense mechanisms of plants in nature. The optimization algorithm proposed in this work is based on the predator-prey model originally presented by Lotka and Volterra, where two populations interact with each other and the objective is to maintain a balance. The system of predator-prey equations uses four variables (α, β, λ, δ), and the values of these variables are very important since they are in charge of maintaining a balance between the pair of equations. In this work, we propose the use of Type-2 fuzzy logic for the dynamic adaptation of the variables of the system. This time a fuzzy controller is in charge of finding the optimal values for the model variables; the use of this technique allows the algorithm to achieve higher performance and accuracy in the exploration of the values.
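For reference, the classical Lotka-Volterra predator-prey system referred to above is, with x the prey and y the predator population (mapping λ and δ to the conventional symbols is our reading of the abstract's notation):

```latex
\frac{dx}{dt} = \alpha x - \beta x y, \qquad
\frac{dy}{dt} = \delta x y - \lambda y
```

The balance the abstract mentions corresponds to the nontrivial equilibrium x* = λ/δ, y* = α/β, which is why tuning the four parameters well is critical to the algorithm's behavior.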
Directory of Open Access Journals (Sweden)
S. López-Ruiz
2016-01-01
Full Text Available This paper presents the design of a three-section corrugated horn antenna with a modified linear profile, using NURBS, suitable for radio-astronomy applications. The operating band ranges from 4.5 to 8.8 GHz. The aperture efficiency is higher than 84% and the return losses are greater than 20 dB across the whole bandwidth. The antenna optimization has been carried out with multiobjective versions of an evolutionary algorithm (EA) and a particle swarm optimization (PSO) algorithm. We show that both techniques provide good antenna designs, but the experiments carried out show that the results of the evolutionary algorithm outperform the particle swarm results.
Directory of Open Access Journals (Sweden)
Dawid Połap
2017-09-01
Full Text Available In the proposed article, we present a nature-inspired optimization algorithm, which we called the Polar Bear Optimization Algorithm (PBO). The inspiration to develop the algorithm comes from the way polar bears hunt to survive in harsh arctic conditions. These carnivorous mammals are active all year round. A frosty climate, unfavorable to other animals, has made polar bears adapt to a specific mode of exploration and hunting in large areas, not only over ice but also in water. The proposed novel mathematical model of the way polar bears move in the search for food and hunt can be a valuable method of optimization for various theoretical and practical problems. Optimization is very similar to processes in nature: just as algorithms search for optimal solutions to mathematical models, animals search for optimal conditions in which to develop in their natural environments. In this method, we have used a model of polar bear behaviors as a search engine for optimal solutions. The proposed simulated adaptation to harsh winter conditions is an advantage for local and global search, while a birth and death mechanism controls the population. The proposed PBO was evaluated and compared to other meta-heuristic algorithms using sample test functions and some classical engineering problems. Experimental results were compared to those of other algorithms and analyzed using various parameters. The analysis allowed us to identify the leading advantages, which are rapid recognition of the area by the relevant population and an efficient birth and death mechanism that improves global and local search within the solution space.
Simulation-based optimization parametric optimization techniques and reinforcement learning
Gosavi, Abhijit
2003-01-01
Simulation-Based Optimization: Parametric Optimization Techniques and Reinforcement Learning introduces the evolving area of simulation-based optimization. The book's objective is two-fold: (1) It examines the mathematical governing principles of simulation-based optimization, thereby providing the reader with the ability to model relevant real-life problems using these techniques. (2) It outlines the computational technology underlying these methods. Taken together these two aspects demonstrate that the mathematical and computational methods discussed in this book do work. Broadly speaking, the book has two parts: (1) parametric (static) optimization and (2) control (dynamic) optimization. Some of the book's special features are: *An accessible introduction to reinforcement learning and parametric-optimization techniques. *A step-by-step description of several algorithms of simulation-based optimization. *A clear and simple introduction to the methodology of neural networks. *A gentle introduction to converg...
Bakar, Sumarni Abu; Ibrahim, Milbah
2017-08-01
The shortest path problem is a popular problem in graph theory: finding a path of minimum length between a specified pair of vertices. In a network, the weight of each edge is usually represented as a crisp real number, and these weights are subsequently used when computing shortest paths with deterministic algorithms. In practice, however, uncertainty is often encountered, whereby the weight of an edge of the network is uncertain and imprecise. In this paper, a modified algorithm combining a heuristic shortest path method with a fuzzy approach is proposed for solving networks with imprecise arc lengths. Both interval numbers and triangular fuzzy numbers are considered for representing arc lengths. The modified algorithm is then applied to a specific instance of the Travelling Salesman Problem (TSP). The total shortest distance obtained from this algorithm is compared with the total distance obtained from the traditional nearest neighbour heuristic. The results show that the modified algorithm yields a sequence of visited cities similar to that of the traditional approach while producing a shorter total distance. Hence, this research contributes to enriching the methods used for solving the TSP.
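As a point of reference for the comparison above, the traditional nearest neighbour heuristic for the TSP can be sketched as follows. This is a minimal illustration with crisp Euclidean arc lengths, not the paper's fuzzy variant; the function names are our own:

```python
import math

def nearest_neighbour_tour(coords, start=0):
    """Greedy TSP heuristic: repeatedly visit the closest unvisited city."""
    unvisited = set(range(len(coords))) - {start}
    tour, current = [start], start
    while unvisited:
        nxt = min(unvisited, key=lambda c: math.dist(coords[current], coords[c]))
        unvisited.remove(nxt)
        tour.append(nxt)
        current = nxt
    tour.append(start)  # close the tour by returning to the start city
    return tour

def tour_length(coords, tour):
    """Total Euclidean length of a closed tour."""
    return sum(math.dist(coords[a], coords[b]) for a, b in zip(tour, tour[1:]))
```

A fuzzy variant would replace `math.dist` with a ranking function over interval or triangular fuzzy arc lengths; the greedy structure of the heuristic stays the same.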
Heuristics: The good, the bad, and the biased. What value can bias have for decision makers?
Curley, Lee J.; Murray, Jennifer.; MacLean, Rory.
2017-01-01
This discussion paper will look at heuristics (rule of thumb techniques for decision making), (Tversky & Kahneman, 1974) and their potential value. Typically, heuristics have been viewed negatively (Gigerenzer & Goldstein, 1996), with research suggesting that heuristics bias how individuals think, which may create sub-optimal performance (Tversky & Kahneman, 1974). However, researchers, such as Gigerenzer and Goldstein (1996), have highlighted that a bias in decision making may not necessaril...
Directory of Open Access Journals (Sweden)
Markowski Marcin
2017-09-01
Full Text Available In recent years, elastic optical networks have been perceived as a prospective choice for future optical networks due to better adjustment and utilization of optical resources than is the case with traditional wavelength division multiplexing networks. In this paper we investigate the elastic architecture as the communication network for distributed data centers. We address the problem of optimizing routing and spectrum assignment for large-scale computing systems based on an elastic optical architecture; in particular, we concentrate on anycast user-to-data-center traffic optimization. We assume that the computational resources of the data centers are limited. For this offline problem we formulate an integer linear programming model and propose several heuristics, including a meta-heuristic algorithm based on tabu search. We report computational results, presenting the quality of the approximate solutions and the efficiency of the proposed heuristics, and we also analyze and compare several data center allocation scenarios.
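The tabu search meta-heuristic mentioned above can be sketched generically as follows. This is a simplified illustration, not the paper's routing and spectrum assignment implementation; the toy objective in the test, balancing requests across two data centers, is our own:

```python
import random

def tabu_search(cost, neighbours, start, tenure=5, iters=200, seed=0):
    """Generic tabu search: each step moves to the best non-tabu neighbour,
    keeping a short-term memory of recently visited solutions."""
    rng = random.Random(seed)
    current = best = start
    tabu = []  # recently visited solutions (short-term memory)
    for _ in range(iters):
        cands = [n for n in neighbours(current, rng) if n not in tabu]
        if not cands:
            break
        current = min(cands, key=cost)  # may be worse than current: escapes local optima
        tabu.append(current)
        if len(tabu) > tenure:
            tabu.pop(0)  # expire the oldest tabu entry
        if cost(current) < cost(best):
            best = current
    return best
```

In a routing and spectrum assignment setting, a solution would encode the route and spectrum slice per demand, and the neighbourhood would reroute or reassign single demands; here the skeleton only assumes hashable solutions and a neighbourhood generator.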
On the manifold-mapping optimization technique
D. Echeverria (David); P.W. Hemker (Piet)
2006-01-01
In this paper, we study in some detail the manifold-mapping optimization technique introduced in an earlier paper. Manifold mapping aims at accelerating optimal design procedures that otherwise require many evaluations of time-expensive cost functions. We give a proof of convergence for
Demand Side Management in Nearly Zero Energy Buildings Using Heuristic Optimizations
Directory of Open Access Journals (Sweden)
Nadeem Javaid
2017-08-01
Full Text Available Today’s buildings are responsible for about 40% of total energy consumption and 30–40% of carbon emissions, which are key concerns for the sustainable development of any society. The excessive usage of grid energy raises sustainability issues in the face of global changes, such as climate change, population and economic growth, etc. Traditionally, the power systems that deliver this commodity are fuel operated and lead towards high carbon emissions and global warming. To overcome these issues, the recent concept of the nearly zero energy building (nZEB) has attracted numerous researchers and industry for the construction and management of the new generation of buildings. In this regard, this paper proposes various demand side management (DSM) programs using the genetic algorithm (GA), teaching learning-based optimization (TLBO), the enhanced differential evolution (EDE) algorithm and the proposed enhanced differential teaching learning algorithm (EDTLA) to manage energy and comfort, while taking human preferences into consideration. Power consumption patterns of shiftable home appliances are modified in response to the real-time price signal in order to obtain monetary benefits. To further improve the cost and user discomfort objectives along with reduced carbon emission, renewable energy sources (RESs) are also integrated into the microgrid (MG). The proposed model is implemented in a smart residential complex of multiple homes under a real-time pricing environment. We identify two feasible regions: one for electricity cost and the other for user discomfort. The proposed model aims to deal with the stochastic nature of RESs while introducing the battery storage system (BSS). The main objectives of this paper include: (1) integration of RESs; (2) minimization of the electricity bill (cost) and discomfort; and (3) minimization of the peak to average ratio (PAR) and carbon emission. Additionally, we also analyze the tradeoff between these two conflicting objectives.
Directory of Open Access Journals (Sweden)
MUDASIR AHMED MEMON
2017-01-01
Full Text Available In this paper, a PSO (Particle Swarm Optimization)-based technique is proposed to derive optimized switching angles that minimize the THD (Total Harmonic Distortion) and reduce the effect of selected low-order non-triplen harmonics at the output of the multilevel inverter. Conventional harmonic elimination techniques have many limitations, and other heuristic techniques also fail to provide satisfactory results. In this paper, a single-phase symmetrical cascaded H-bridge 11-level multilevel inverter is considered, and the proposed algorithm is used to obtain optimized switching angles that reduce the effect of the 5th, 7th, 11th and 13th non-triplen harmonics in the output voltage of the multilevel inverter. Simulation results indicate that this technique outperforms other methods in terms of minimizing THD and provides a high-quality output voltage waveform.
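The core PSO loop used for such switching-angle problems can be sketched as follows. This is a generic minimizer demonstrated on a simple test function, not the paper's inverter THD objective; the parameter values are common PSO defaults, not values from the paper:

```python
import random

def pso(f, dim, bounds, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=1):
    """Minimize f over a box [lo, hi]^dim with a basic global-best PSO."""
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]               # personal best positions
    gbest = min(pbest, key=f)[:]            # global best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # inertia + cognitive pull (pbest) + social pull (gbest)
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])
                           + c2 * r2 * (gbest[d] - X[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))  # clamp to bounds
            if f(X[i]) < f(pbest[i]):
                pbest[i] = X[i][:]
                if f(X[i]) < f(gbest):
                    gbest = X[i][:]
    return gbest
```

For the inverter application, `f` would be a THD expression over the switching angles with ordering constraints on the angles; here a smooth test function stands in.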
Recursive heuristic classification
Wilkins, David C.
1994-01-01
The author will describe a new problem-solving approach called recursive heuristic classification, whereby a subproblem of heuristic classification is itself formulated and solved by heuristic classification. This allows the construction of more knowledge-intensive classification programs in a way that yields a clean organization. Further, standard knowledge acquisition and learning techniques for heuristic classification can be used to create, refine, and maintain the knowledge base associated with the recursively called classification expert system. The method of recursive heuristic classification was used in the Minerva blackboard shell for heuristic classification. Minerva recursively calls itself every problem-solving cycle to solve the important blackboard scheduler task, which involves assigning a desirability rating to alternative problem-solving actions. Knowing these ratings is critical to the use of an expert system as a component of a critiquing or apprenticeship tutoring system. One innovation of this research is a method called dynamic heuristic classification, which allows selection among dynamically generated classification categories instead of requiring them to be pre-enumerated.
Combat Identification Modeling Using Robust Optimization Techniques
2008-03-01
...to identify friendly forces, warfighters must use a combination of on-board Cooperative and Non-cooperative Identification systems, along with Tactics, Techniques... [Thesis by TaeHo Kim, Captain, ROKA; report AFIT/GOR/ENS/08-11]
Optimal Formation Trajectory-Planning Using Parameter Optimization Technique
Directory of Open Access Journals (Sweden)
Hyung-Chul Lim
2004-09-01
Full Text Available Several methods have been presented to obtain optimal formation trajectories in the configuration or reconfiguration step, subject to constraints on collision avoidance and final configuration. In this study, a method for optimal formation trajectory-planning is introduced with a view to fuel/time minimization, using a parameter optimization technique that has not previously been applied to optimal trajectory-planning for satellite formation flying. New nonlinear equality constraints are derived for the final configuration, and nonlinear inequality constraints are used for collision avoidance. The final configuration constraints require that three or more satellites be placed in an equilateral polygon in the circular horizontal-plane orbit. Several examples are given of optimal trajectories obtained from the parameter optimization problem subject to collision avoidance and final configuration constraints. They show that the introduced trajectory-planning method is well suited to trajectory design problems for formation flying missions.
Scaling Up Optimal Heuristic Search in Dec-POMDPs via Incremental Expansion (extended abstract)
Spaan, M.T.J.; Oliehoek, F.A.; Amato, C.
2011-01-01
We advance the state of the art in optimal solving of decentralized partially observable Markov decision processes (Dec-POMDPs), which provide a formal model for multiagent planning under uncertainty.
A heuristic approach to optimization of structural topology including self-weight
Tajs-Zielińska, Katarzyna; Bochenek, Bogdan
2018-01-01
Topology optimization of structures under a design-dependent self-weight load is investigated in this paper. The problem deserves attention because of its significant importance in the engineering practice, especially nowadays as topology optimization is more often applied when designing large engineering structures, for example, bridges or carrying systems of tall buildings. It is worth noting that well-known approaches of topology optimization which have been successfully applied to structures under fixed loads cannot be directly adapted to the case of design-dependent loads, so that topology generation can be a challenge also for numerical algorithms. The paper presents the application of a simple but efficient non-gradient method to topology optimization of elastic structures under self-weight loading. The algorithm is based on the Cellular Automata concept, the application of which can produce effective solutions with low computational cost.
Srinivas, B; Kulick, S N; Doran, Christine; Kulick, Seth
1995-01-01
There are currently two philosophies for building grammars and parsers -- Statistically induced grammars and Wide-coverage grammars. One way to combine the strengths of both approaches is to have a wide-coverage grammar with a heuristic component which is domain independent but whose contribution is tuned to particular domains. In this paper, we discuss a three-stage approach to disambiguation in the context of a lexicalized grammar, using a variety of domain independent heuristic techniques. We present a training algorithm which uses hand-bracketed treebank parses to set the weights of these heuristics. We compare the performance of our grammar against the performance of the IBM statistical grammar, using both untrained and trained weights for the heuristics.
Shaped reflector antenna designed using optimization techniques
Hall, W. J.; Curtis, I.; Tennant, A.; Wilcockson, P. C.
A technique used to optimize the design of a shaped reflector antenna is presented. The antenna described is designed by the addition to the main reflector of sets of distortion functions which are optimized to achieve an approximation to a specified gain distribution. The distortions which match the radiation pattern to the coverage zones are applied as two sets of Fourier coefficients referred to as the 'coarse' and the 'fine' distortions; the values of the coefficients in the series are determined using the Madsen (1975) minimax optimization technique. The optimization points include the band edge frequencies and the two zones of different gain. The antenna design provides a coverage matched to the Eutelsat 2 continental coverage zone.
Meta-heuristic algorithms as tools for hydrological science
Yoo, Do Guen; Kim, Joong Hoon
2014-12-01
In this paper, meta-heuristic optimization techniques and their applications to water resources engineering, particularly in hydrological science, are reviewed. In recent years, meta-heuristic optimization techniques have been introduced that can overcome the problems inherent in iterative simulations. These methods are able to find good solutions and require limited computation time and memory use, without requiring complex derivatives. Simulation-based meta-heuristic methods such as Genetic Algorithms (GAs) and Harmony Search (HS) have powerful searching abilities, which can occasionally overcome several drawbacks of traditional mathematical methods. For example, HS algorithms are conceptualized from the process by which musicians improvise to achieve better harmony; such optimization algorithms seek a near-global optimum determined by the value of an objective function, providing a more robust determination than typical aesthetic estimation. In this paper, meta-heuristic algorithms and their applications (with a focus on GAs and HS) in hydrological science are discussed by subject, including a review of the existing literature in the field. Recent trends in optimization are then presented, and a relatively new technique, the Smallest Small World Cellular Harmony Search (SSWCHS), is briefly introduced, with a summary of promising results obtained in previous studies. Overall, previous studies have demonstrated that meta-heuristic algorithms are effective tools for the development of hydrological models and the management of water resources.
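The basic Harmony Search improvisation loop referred to above can be sketched as follows. This is a minimal continuous-variable version shown on a simple test function; the parameter names `hmcr` (harmony memory considering rate), `par` (pitch adjusting rate) and `bw` (bandwidth) follow common HS usage, and the values are illustrative defaults rather than values from any particular hydrological study:

```python
import random

def harmony_search(f, dim, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.05,
                   iters=5000, seed=0):
    """Minimize f over [lo, hi]^dim with a basic Harmony Search."""
    rng = random.Random(seed)
    lo, hi = bounds
    # harmony memory: hms candidate solutions
    hm = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if rng.random() < hmcr:         # consider a value from memory
                x = rng.choice(hm)[d]
                if rng.random() < par:      # pitch adjustment (small tweak)
                    x += rng.uniform(-bw, bw)
            else:                           # random improvisation
                x = rng.uniform(lo, hi)
            new.append(min(hi, max(lo, x)))
        worst = max(hm, key=f)
        if f(new) < f(worst):               # replace the worst harmony
            hm[hm.index(worst)] = new
    return min(hm, key=f)
```

In a water resources application, `f` would typically be a calibration error or design cost evaluated by a hydrological simulation; the improvisation loop itself is unchanged.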
Heuristic techniques for the analysis of variability as a dynamic aspect of change
Van Dijk, M.W.G.; Van Geert, P.
Due to the influence of dynamic systems and microgenetic perspectives, variability is nowadays often seen as an important phenomenon that helps us understand the underlying mechanisms of development. This paper aims at demonstrating several simple techniques that can be used to analyse variability.
Solving non-standard packing problems by global optimization and heuristics
Fasano, Giorgio
2014-01-01
This book results from a long-term research effort aimed at tackling complex non-standard packing issues which arise in space engineering. The main research objective is to optimize cargo loading and arrangement, in compliance with a set of stringent rules. Complicated geometrical aspects are also taken into account, in addition to balancing conditions based on attitude control specifications. Chapter 1 introduces the class of non-standard packing problems studied. Chapter 2 gives a detailed explanation of a general model for the orthogonal packing of tetris-like items in a convex domain. A number of additional conditions are looked at in depth, including the prefixed orientation of subsets of items, the presence of unusable holes, separation planes and structural elements, relative distance bounds as well as static and dynamic balancing requirements. The relative feasibility sub-problem which is a special case that does not have an optimization criterion is discussed in Chapter 3. This setting can be exploit...
Meta-Heuristics in Short Scale Construction: Ant Colony Optimization and Genetic Algorithm.
Schroeders, Ulrich; Wilhelm, Oliver; Olaru, Gabriel
2016-01-01
The advent of large-scale assessment, but also the more frequent use of longitudinal and multivariate approaches to measurement in psychological, educational, and sociological research, caused an increased demand for psychometrically sound short scales. Shortening scales economizes on valuable administration time, but might result in inadequate measures because reducing an item set could: a) change the internal structure of the measure, b) result in poorer reliability and measurement precision, c) deliver measures that cannot effectively discriminate between persons on the intended ability spectrum, and d) reduce test-criterion relations. Different approaches to abbreviate measures fare differently with respect to the above-mentioned problems. Therefore, we compare the quality and efficiency of three item selection strategies to derive short scales from an existing long version: a Stepwise COnfirmatory Factor Analytical approach (SCOFA) that maximizes factor loadings and two metaheuristics, specifically an Ant Colony Optimization (ACO) with a tailored user-defined optimization function and a Genetic Algorithm (GA) with an unspecific cost-reduction function. SCOFA compiled short versions were highly reliable, but had poor validity. In contrast, both metaheuristics outperformed SCOFA and produced efficient and psychometrically sound short versions (unidimensional, reliable, sensitive, and valid). We discuss under which circumstances ACO and GA produce equivalent results and provide recommendations for conditions in which it is advisable to use a metaheuristic with an unspecific out-of-the-box optimization function.
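A genetic algorithm for subset selection of the kind compared above can be sketched as follows. This is a deliberately simplified illustration that maximizes a summed item score, not the psychometric optimization functions used in the study; all names and parameter values are our own:

```python
import random

def ga_select(scores, k, pop_size=30, gens=100, mut=0.1, seed=0):
    """Toy GA: choose k of n items maximizing the summed item score."""
    rng = random.Random(seed)
    n = len(scores)

    def fitness(ind):
        return sum(scores[i] for i in ind)

    pop = [frozenset(rng.sample(range(n), k)) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]          # elitist truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            pool = list(a | b)                    # crossover: recombine parent items
            child = set(rng.sample(pool, k))
            if rng.random() < mut:                # mutation: swap one item at random
                child.discard(rng.choice(list(child)))
                while len(child) < k:
                    child.add(rng.randrange(n))
            children.append(frozenset(child))
        pop = survivors + children
    return max(pop, key=fitness)
```

In short-scale construction, `fitness` would instead score reliability, factor structure, and validity of the selected item subset, which is where a tailored optimization function (as in the ACO variant above) or an out-of-the-box cost function would plug in.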
Numerical derivative techniques for trajectory optimization
Hallman, Wayne P.
1990-01-01
The adoption of robust numerical optimization techniques in trajectory simulation programs has resulted in powerful design and analysis tools. These trajectory simulation/optimization programs are widely used, and a representative list includes the GTS system, the POST program, and newer collocation methods such as OTIS and FONPAC. All of these programs rely on optimization algorithms which require objective function and constraint gradient data during the iteration process. However, most trajectory optimization problems lack simple analytical expressions for these derivatives. In the general case a function evaluation involves integrating aerodynamic, propulsive, and gravity forces over multiple trajectory phases with complex control models. With the newer collocation methods, the integration is replaced by defect constraints and cubic approximations for the state. While analytic gradient expressions can sometimes be derived for trajectory optimization problems, the derivation is cumbersome, time consuming, and prone to mistakes. Fortunately, an alternate method exists for the gradient evaluation, namely finite difference approximations. In this paper some finite difference gradient techniques developed for use with the GTS system are presented. These techniques include methods for computing first and second partial derivatives of single and multiple sets of functions. A key feature of these methods is an error control mechanism which automatically adjusts the perturbation size to obtain accurate derivative values.
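The finite-difference approach described above can be illustrated with a central-difference gradient. This is a minimal sketch with a fixed step size; the error-control mechanism that automatically adjusts the perturbation size, as developed for the GTS system, is not reproduced here:

```python
def central_diff_gradient(f, x, h=1e-6):
    """Approximate the gradient of f at x using central differences,
    one coordinate perturbation at a time."""
    grad = []
    for i in range(len(x)):
        xp = list(x); xp[i] += h   # forward-perturbed point
        xm = list(x); xm[i] -= h   # backward-perturbed point
        grad.append((f(xp) - f(xm)) / (2 * h))
    return grad
```

In a trajectory context each call to `f` is expensive (a full trajectory integration), so a gradient costs 2n function evaluations per iterate; this is exactly why the perturbation size and the choice between one-sided and central differences matter in practice.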
Directory of Open Access Journals (Sweden)
Sheraz Aslam
2017-12-01
Full Text Available The smart grid plays a vital role in decreasing electricity cost through Demand Side Management (DSM). Smart homes, as part of the smart grid, contribute greatly to minimizing electricity consumption cost via scheduling home appliances. However, user waiting time increases due to the scheduling of home appliances. This scheduling problem motivates the search for an optimal solution that minimizes the electricity cost and Peak to Average Ratio (PAR) with minimum user waiting time. There are many studies on Home Energy Management (HEM) for cost minimization and peak load reduction. However, none of these systems gave sufficient attention to tackling multiple parameters (i.e., electricity cost and peak load reduction) at the same time while keeping user waiting time minimal for residential consumers with multiple homes. Hence, in this work, we propose an efficient HEM scheme using the well-known meta-heuristic Genetic Algorithm (GA), the recently developed Cuckoo Search Optimization Algorithm (CSOA) and the Crow Search Algorithm (CSA), which can be used for electricity cost and peak load alleviation with minimum user waiting time. The integration of a smart Electricity Storage System (ESS) is also taken into account for more efficient operation of the Home Energy Management System (HEMS). Furthermore, we use the real-time electricity consumption pattern of every residence, i.e., every home has its own living pattern. The proposed scheme is implemented in a smart building comprising thirty smart homes (apartments); Real-Time Pricing (RTP) and Critical Peak Pricing (CPP) signals are examined in terms of electricity cost estimation for both a single smart home and a smart building. In addition, feasible regions are presented for single and multiple smart homes, which show the relationship among electricity cost, electricity consumption and user waiting time. Experimental results demonstrate the effectiveness of our proposed scheme for single and multiple smart homes.
Tanjung, WN; Nurhasanah, N.; Suri, QA; Jingga; Aribowo, B.; Mardhika, DA; Gayatri, AM; Safitri, R.; Supriyanto, A.
2017-12-01
The textile industry is one of the 10 industrial commodities that survived in Indonesia through the crisis years of 2009 to 2016; in 2017, demand increased by approximately 3% compared with the previous year. This research was conducted in a Small Medium Enterprise (SME) called FBS. SMEs are business groups that absorb a large amount of labor and serve as a source of income for society. SME FBS produces boys' clothing and is domiciled in Jakarta. To complete FBS products, the WIP products are sent to CMTs, or depots, in Sukabumi. This study aims to determine the shortest route for distributing WIP products to the 10 CMTs scattered across the Sukabumi area. After optimization, the route starts from the Depot SME FBS Sukabumi-Shell Sand Village – village of Sukamaju Village – Margaluyu Village – Narogong Cicurug – the village of Parakanlima, Cuguha – Padabeunghar Village – Sagaranten Village – village of Ciherang, Ciguyang, Sagaranten – the village of Bojong Waru, Pasirsalam Village, Purabaya – Students – and returns to Depot SME FBS Sukabumi, covering 403.6 kilometers in a single trip. The trip takes 10 hours and 9 minutes, with a distribution cost of IDR 296,928.52. The route length was reduced by 47%, from 759.1 to 403.6 kilometers.
Sunstein, Cass R
2005-08-01
With respect to questions of fact, people use heuristics--mental short-cuts, or rules of thumb, that generally work well, but that also lead to systematic errors. People use moral heuristics too--moral short-cuts, or rules of thumb, that lead to mistaken and even absurd moral judgments. These judgments are highly relevant not only to morality, but to law and politics as well. Examples are given from a number of domains, including risk regulation, punishment, reproduction and sexuality, and the act/omission distinction. In all of these contexts, rapid, intuitive judgments make a great deal of sense, but sometimes produce moral mistakes that are replicated in law and policy. One implication is that moral assessments ought not to be made by appealing to intuitions about exotic cases and problems; those intuitions are particularly unlikely to be reliable. Another implication is that some deeply held moral judgments are unsound if they are products of moral heuristics. The idea of error-prone heuristics is especially controversial in the moral domain, where agreement on the correct answer may be hard to elicit; but in many contexts, heuristics are at work and they do real damage. Moral framing effects, including those in the context of obligations to future generations, are also discussed.
Energy Technology Data Exchange (ETDEWEB)
Konishi, M. [Kobe Steel, Ltd., Kobe (Japan)
1996-09-01
This paper introduces meta-heuristic methods for optimizing material flow scheduling systems, together with their applications. The systems are intended to optimize combinatorial decisions such as the selection of transport routes into and out of a factory, the ratio of transport vehicles, and the ratio of product orders assigned to facilities. The meta-heuristic methods include the simulated annealing (SA) algorithm and the genetic algorithm (GA). The SA method searches for an optimal solution by exploiting an analogy between combinatorial optimization and statistical mechanics. Although the method is limited in its scope of application, its strength is that neighbourhood structures can be set effectively based on experience. The GA method is a population-based search method modeled on the evolutionary mechanism of living organisms, characterized by parallel search over a plurality of search points. Applications of the SA method include a system to optimize the limits on accepted product orders (shipment plans) in an expanded copper plate manufacturing factory. Applications of the GA method include the problem of allotting a plurality of orders to a plurality of slabs. A method comparable to the GA and SA methods is the expert system. 11 refs., 8 figs., 1 tab.
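The simulated annealing acceptance rule described above (always accept improving moves, accept worsening moves with probability exp(-Δ/T) as the temperature T cools) can be sketched as follows. The one-dimensional toy cost function in the test is our own illustration, not a scheduling objective from the paper:

```python
import math
import random

def simulated_annealing(cost, neighbour, start, t0=10.0, cooling=0.995,
                        iters=5000, seed=0):
    """Minimize cost with SA: a geometric cooling schedule and the
    Metropolis acceptance criterion."""
    rng = random.Random(seed)
    current = best = start
    t = t0
    for _ in range(iters):
        cand = neighbour(current, rng)
        delta = cost(cand) - cost(current)
        # accept improvements always; accept worsening moves with prob exp(-delta/t)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            current = cand
            if cost(current) < cost(best):
                best = current
        t *= cooling  # geometric cooling
    return best
```

For an order-allotment problem, the solution would encode an assignment of orders to slabs and `neighbour` would move one order; the experience-based neighbourhood design noted above amounts to choosing that move set well.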
Computational optimization techniques applied to microgrids planning
DEFF Research Database (Denmark)
Gamarra, Carlos; Guerrero, Josep M.
2015-01-01
Microgrids are expected to become part of the next electric power system evolution, not only in rural and remote areas but also in urban communities. Since microgrids are expected to coexist with traditional power grids (such as district heating does with traditional heating systems...... appear along the planning process. In this context, the technical literature on optimization techniques applied to microgrid planning has been reviewed, and guidelines for innovative planning methodologies focused on economic feasibility can be defined. Finally, some trending techniques and new...
Directory of Open Access Journals (Sweden)
Cenk Demirkır
2014-04-01
Full Text Available Plywood, one of the most important wood-based panels, has many application areas in many countries, ranging from traffic signs to building construction. It is known that high-quality plywood panel manufacturing is achieved with good bonding under optimum pressing conditions, depending on the adhesive type. This is a study of the possibilities of using modern meta-heuristic hybrid artificial intelligence techniques, such as the IKE and AANN methods, for predicting the bonding strength of plywood panels. The study is composed of two main parts: experimental and analytical. Scots pine, maritime pine and European black pine logs were used as wood species. The pine veneers, peeled at 32°C and 50°C, were dried at temperatures of 110°C, 140°C and 160°C. Phenol formaldehyde and melamine urea formaldehyde resins were used as adhesives. The EN 314-1 standard was used to determine the bonding shear strength values of the plywood panels in the experimental part of this study. The intuitive k-nearest neighbor estimator (IKE) and adaptive artificial neural network (AANN) were then used to estimate the bonding strength of the plywood panels. The best estimation performance was obtained from the MA metric for k = 10. The most effective factor on bonding strength was found to be the adhesive type. Error rates were below 5% for both IKE and AANN. The proposed methods can therefore be recommended for estimating the bonding strength of plywood panels.
Heuristic computation of the rovibrational G matrix in optimized molecule-fixed axes. Gmat 2.1
Castro, M. E.; Niño, A.; Muñoz-Caro, C.
2010-08-01
Gmat 2.1 is a program able to compute the rovibrational G matrix in different molecule-fixed axes, extending the capabilities of Gmat 1.0. The present version is able to select optimal molecule-fixed axes minimizing the pure rotational kinetic elements, the rovibrational kinetic elements or both simultaneously. To this end, it uses a hybrid minimization approach, combining a global search heuristic based on simulated annealing with a gradient-free local minimization. As in the previous version, the program handles the structural results of potential energy hypersurface mappings computed in computer clusters or computational Grid environments. However, since more general molecule-fixed axes can now be defined, a procedure is implemented to ensure that the same minimum of the cost function is used for all the molecular structures. In addition, an algorithm for the unambiguous definition of the molecule-fixed axes orientation is used. Program summaryProgram title: Gmat 2.1 Catalogue identifier: AECZ_v2_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AECZ_v2_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 52 555 No. of bytes in distributed program, including test data, etc.: 932 366 Distribution format: tar.gz Programming language: Standard ANSI C++ Computer: All Operating system: Linux, Windows Classification: 16.2 Catalogue identifier of previous version: AECZ_v1_0 Journal reference of previous version: Comput. Phys. Comm. 180 (2009) 1183 Does the new version supersede the previous version?: Yes Nature of problem: When building molecular rovibrational Hamiltonians, the kinetic terms depend on the molecule-fixed axes orientation. Thus, an appropriate orientation can significantly simplify the treatment of pure rotation and rovibrational coupling. The kinetic terms
Directory of Open Access Journals (Sweden)
Wei Tu
2015-10-01
Full Text Available Vehicle routing optimization (VRO) designs the best routes to reduce travel cost, energy consumption, and carbon emission. Due to non-deterministic polynomial-time hard (NP-hard) complexity, many VROs involved in real-world applications require too much computing effort. Shortening computing time for VRO is a great challenge for state-of-the-art spatial optimization algorithms. From a spatial-temporal perspective, this paper presents a spatial-temporal Voronoi diagram-based heuristic approach for large-scale vehicle routing problems with time windows (VRPTW). Considering time constraints, a spatial-temporal Voronoi distance is derived from the spatial-temporal Voronoi diagram to find near neighbors in the space-time searching context. A Voronoi distance decay strategy that integrates a time warp operation is proposed to accelerate local search procedures. A spatial-temporal feature-guided search is developed to improve unpromising micro route structures. Experiments on VRPTW benchmarks and real-world instances are conducted to verify performance. The results demonstrate that the proposed approach is competitive with state-of-the-art heuristics and achieves high-quality solutions for large-scale instances of VRPTWs in a short time. This novel approach will contribute to the spatial decision support community by providing an effective vehicle routing optimization method for large transportation applications in both the public and private sectors.
Machine Learning Techniques in Optimal Design
Cerbone, Giuseppe
1992-01-01
Many important applications can be formalized as constrained optimization tasks. For example, we are studying the engineering domain of two-dimensional (2-D) structural design. In this task, the goal is to design a structure of minimum weight that bears a set of loads. A solution to a design problem with a single load (L) and two stationary support points (S1 and S2), consisting of four members (E1, E2, E3, and E4) that connect the load to the support points, is discussed. In principle, optimal solutions to problems of this kind can be found by numerical optimization techniques. However, in practice [Vanderplaats, 1984] these methods are slow and they can produce different local solutions whose quality (ratio to the global optimum) varies with the choice of starting points. Hence, their applicability to real-world problems is severely restricted. To overcome these limitations, we propose to augment numerical optimization by first performing a symbolic compilation stage to produce: (a) objective functions that are faster to evaluate and that depend less on the choice of the starting point and (b) selection rules that associate problem instances to a set of recommended solutions. These goals are accomplished by successive specializations of the problem class and of the associated objective functions. In the end, this process reduces the problem to a collection of independent functions that are fast to evaluate, that can be differentiated symbolically, and that represent smaller regions of the overall search space. However, the specialization process can produce a large number of sub-problems. This is overcome by deriving inductively selection rules which associate problems to small sets of specialized independent sub-problems. Each set of candidate solutions is chosen to minimize a cost function which expresses the tradeoff between the quality of the solution that can be obtained from the sub-problem and the time it takes to produce it. The overall solution
Cache Energy Optimization Techniques For Modern Processors
Energy Technology Data Exchange (ETDEWEB)
Mittal, Sparsh [ORNL]
2013-01-01
newcomers and veterans in the field of cache power management. It will help graduate students, CAD tool developers and designers in understanding the need for energy efficiency in modern computing systems. Further, it will be useful for researchers in gaining insights into algorithms and techniques for micro-architectural and system-level energy optimization using dynamic cache reconfiguration. We sincerely believe that the "food for thought" presented in this book will inspire the readers to develop even better ideas for designing "green" processors of tomorrow.
Jack F. Williams
1981-01-01
Seven heuristic algorithms are discussed. Each can be used for production scheduling in an assembly network (a network where each work station has at most one immediate successor work station, but may have any number of immediate predecessor work stations), distribution scheduling in an arborescence network (a network where each warehouse or stocking point is supplied by at most one immediate predecessor stocking point, but may itself supply any number of immediate successor stocking points),...
Evolutionary optimization technique for site layout planning
El Ansary, Ayman M.
2014-02-01
Solving the site layout planning problem is a challenging task. It requires an iterative approach to satisfy design requirements (e.g. energy efficiency, skyview, daylight, road networks, visual privacy, and clear access to favorite views). These design requirements vary from one project to another based on location and client preferences. In the Gulf region, the most important socio-cultural factor is visual privacy in indoor spaces. Hence, most residential houses in this region are surrounded by high fences to provide privacy, which has a direct impact on other requirements (e.g. daylight and direction to a favorite view). This paper introduces a novel technique to optimally locate and orient residential buildings so as to satisfy a set of design requirements. The developed technique is based on a genetic algorithm which explores the search space for possible solutions. This study considers two-dimensional site planning problems; however, the approach can be extended to solve three-dimensional cases. A case study is presented to demonstrate the efficiency of this technique in solving the site layout planning of simple residential dwellings. © 2013 Elsevier B.V. All rights reserved.
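The genetic-algorithm search described can be sketched on a toy problem. Here a one-dimensional separation objective stands in for the paper's multi-criteria site requirements; the population size, operators, and parameters are illustrative assumptions, not the paper's.

```python
import random
random.seed(3)

N_BUILDINGS, SITE = 4, 10.0

def fitness(layout):
    # Toy stand-in for the paper's requirements: reward layouts whose
    # buildings are well separated (a crude proxy for privacy/daylight).
    return min(abs(a - b) for i, a in enumerate(layout) for b in layout[i + 1:])

def genetic_algorithm(pop_size=40, gens=80, mut=0.2):
    pop = [sorted(random.uniform(0, SITE) for _ in range(N_BUILDINGS))
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]                  # selection
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            cut = random.randrange(1, N_BUILDINGS)    # one-point crossover
            child = sorted(a[:cut] + b[cut:])
            if random.random() < mut:                 # mutation: jitter one gene
                k = random.randrange(N_BUILDINGS)
                child[k] = min(SITE, max(0.0, child[k] + random.gauss(0, 1)))
                child.sort()
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

best = genetic_algorithm()
print(round(fitness(best), 2))   # best minimum separation found
```

The real problem replaces `fitness` with the weighted design requirements and a 2-D (position, orientation) encoding per building; the evolutionary loop is unchanged.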
Parallel halftoning technique using dot diffusion optimization
Molina-Garcia, Javier; Ponomaryov, Volodymyr I.; Reyes-Reyes, Rogelio; Cruz-Ramos, Clara
2017-05-01
In this paper, a novel approach for generating halftone images via the Dot Diffusion (DD) method is proposed and implemented. The designed technique is based on optimizing the so-called class matrix used in the DD algorithm: new versions of the class matrix are generated that contain no baron or near-baron entries, in order to minimize inconsistencies during the distribution of the error. The proposed class matrices have different properties, each designed for one of two applications: those where inverse halftoning is necessary and those where it is not required. The proposed method has been implemented on a GPU (NVIDIA GeForce GTX 750 Ti) and on multicore processors (an AMD FX-6300 six-core processor and an Intel Core i5-4200U), using CUDA and OpenCV on a Linux PC. Experimental results show that the novel framework generates halftone images, and inverse halftone images, of good quality. Simulation results on parallel architectures demonstrate the efficiency of the novel technique for real-time processing.
Heuristic Methods for Security Protocols
Directory of Open Access Journals (Sweden)
Qurat ul Ain Nizamani
2009-10-01
Full Text Available Model checking is an automatic verification technique for hardware and software systems. However, it suffers from the state-space explosion problem. In this paper we address this problem in the context of cryptographic protocols by proposing a security-property-dependent heuristic. The heuristic weights the state space by exploiting the security formulae; the weights may then be used to explore the state space when searching for attacks.
2015-01-01
How can we advance knowledge? Which methods do we need in order to make new discoveries? How can we rationally evaluate, reconstruct and offer discoveries as a means of improving the ‘method’ of discovery itself? And how can we use findings about scientific discovery to boost funding policies, thus fostering a deeper impact of scientific discovery itself? The respective chapters in this book provide readers with answers to these questions. They focus on a set of issues that are essential to the development of types of reasoning for advancing knowledge, such as models for both revolutionary findings and paradigm shifts; ways of rationally addressing scientific disagreement, e.g. when a revolutionary discovery sparks considerable disagreement inside the scientific community; frameworks for both discovery and inference methods; and heuristics for economics and the social sciences.
Efficient reanalysis techniques for robust topology optimization
DEFF Research Database (Denmark)
Amir, Oded; Sigmund, Ole; Lazarov, Boyan Stefanov
2012-01-01
The article focuses on the reduction of the computational effort involved in robust topology optimization procedures. The performance of structures designed by means of topology optimization may be seriously degraded due to fabrication errors. Robust formulations of the optimization problem were shown to yield optimized designs that are tolerant with respect to such manufacturing uncertainties. The main drawback of such procedures is the added computational cost associated with the need to evaluate a set of designs by performing multiple finite element analyses. In this article, we propose...
Directory of Open Access Journals (Sweden)
Khurram Hammed
2016-01-01
Full Text Available This paper presents a stochastic global optimization technique known as Particle Swarm Optimization (PSO) for joint estimation of the amplitude and direction of arrival (DOA) of targets in a RADAR communication system. The proposed scheme is an effective optimization methodology and a promising approach for solving DOA problems in communication systems. Moreover, PSO is well suited to real-time scenarios and easy to implement in hardware. In this study, a uniform linear array is used and the targets are assumed to be in the far field of the array. The fitness function is formulated from the mean square error and requires only a single snapshot to obtain the best possible solution. To check the accuracy of the algorithm, results are obtained for varying numbers of antenna elements and targets. Finally, these results are compared with existing heuristic techniques to show the accuracy of PSO.
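A minimal sketch of the PSO loop on this kind of fitness function. The array geometry, single-source snapshot model (amplitude fixed to one for brevity), and PSO constants are illustrative assumptions, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(theta, snapshot, array_pos, wavelength=1.0):
    # Mean-square error between one observed snapshot and the response a
    # far-field source at angle theta would produce on a uniform linear array.
    k = 2 * np.pi / wavelength
    model = np.exp(1j * k * array_pos * np.sin(theta))
    return np.mean(np.abs(snapshot - model) ** 2)

def pso(cost, lo, hi, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5):
    # Minimal global-best PSO over a scalar decision variable.
    x = rng.uniform(lo, hi, n_particles)
    v = np.zeros(n_particles)
    pbest, pbest_f = x.copy(), np.array([cost(xi) for xi in x])
    g = pbest[np.argmin(pbest_f)]
    for _ in range(iters):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = np.clip(x + v, lo, hi)
        f = np.array([cost(xi) for xi in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        g = pbest[np.argmin(pbest_f)]
    return g

# Synthetic single snapshot from a source at 20 degrees, 8-element array
pos = np.arange(8) * 0.5                      # half-wavelength spacing
snap = np.exp(1j * 2 * np.pi * pos * np.sin(np.deg2rad(20.0)))
est = pso(lambda t: fitness(t, snap, pos), -np.pi / 2, np.pi / 2)
print(round(float(np.rad2deg(est)), 1))
```

Joint amplitude-and-angle estimation simply enlarges the particle to a vector (amplitude, angle) with the same update equations.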
Advanced Aerostructural Optimization Techniques for Aircraft Design
Directory of Open Access Journals (Sweden)
Yingtao Zuo
2015-01-01
Full Text Available Traditional coupled aerostructural design optimization (ASDO) of aircraft based on high-fidelity models is computationally expensive and inefficient. To improve efficiency, the key is to predict the aerostructural performance of the aircraft efficiently. The cruise shape of the aircraft is parameterized and optimized in this paper, and a methodology named reverse iteration of structural model (RISM) is adopted to obtain the aerostructural performance of the cruise shape efficiently. A new mathematical explanation of RISM is presented in this paper. RISM is four times more efficient than traditional static aeroelastic analysis. General-purpose computing on graphics processing units (GPGPU) is adopted to accelerate RISM further, yielding a GPU-accelerated RISM whose efficiency is about 239 times that of loosely coupled aeroelastic analysis. Tests show that the fidelity of GPU-accelerated RISM is high enough for optimization. An optimization framework based on the Kriging model is constructed; its efficiency improves greatly with the aid of GPU-accelerated RISM. An unmanned aerial vehicle (UAV) is optimized using this framework and its range is improved by 4.67% after optimization, which shows the effectiveness and efficiency of this framework.
Heuristics and Biases in Retirement Savings Behavior
Shlomo Benartzi; Richard Thaler
2007-01-01
Standard economic theories of saving implicitly assume that households have the cognitive ability to solve the relevant optimization problem and the willpower to execute the optimal plan. Both of the implicit assumptions are suspect. Even among economists, few spend much time calculating a personal optimal savings rate. Instead, most people cope by adopting simple heuristics, or rules of thumb. In this paper, we investigate both the heuristics and the biases that emerge in the area of retirem...
OPTIMAL DATA REPLACEMENT TECHNIQUE FOR COOPERATIVE CACHING IN MANET
Directory of Open Access Journals (Sweden)
P. Kuppusamy
2014-09-01
Full Text Available A cooperative caching approach improves data accessibility and reduces query latency in a Mobile Ad hoc Network (MANET). Maintaining the cache is a challenging issue in a large MANET due to mobility, cache size, and power. Previous research on caching has primarily dealt with the LRU, LFU, and LRU-MIN cache replacement algorithms, which offer low query latency and greater data accessibility in sparse MANETs. This paper proposes a Memetic Algorithm (MA) to locate better replacement candidates, based on neighbours' interests and the fitness value of cached data, when storing newly arrived data. This work also elects an ideal cluster head (CH) using the meta-heuristic Ant Colony Optimization algorithm. Simulation results show that the proposed algorithm reduces latency and control overhead, and increases the packet delivery rate, compared with the existing approach as the number of nodes and their speed increase.
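The memetic replacement policy itself is not detailed in the abstract, but the LRU baseline it is compared against can be sketched in a few lines:

```python
from collections import OrderedDict

class LRUCache:
    # Least-Recently-Used replacement: on overflow, evict the entry that
    # has gone longest without access.
    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()

    def get(self, key):
        if key not in self.store:
            return None
        self.store.move_to_end(key)          # mark as most recently used
        return self.store[key]

    def put(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)   # evict least recently used

cache = LRUCache(2)
cache.put("a", 1); cache.put("b", 2)
cache.get("a")                # "a" is now most recent
cache.put("c", 3)             # evicts "b"
print(sorted(cache.store))    # ['a', 'c']
```

The paper's point is that pure recency ignores neighbours' interests; the memetic algorithm replaces the `popitem` choice with a fitness-guided one.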
Optimization techniques using MODFLOW-GWM
Grava, Anna; Feinstein, Daniel T.; Barlow, Paul M.; Bonomi, Tullia; Buarne, Fabiola; Dunning, Charles; Hunt, Randall J.
2015-01-01
An important application of optimization codes such as MODFLOW-GWM is to maximize water supply from unconfined aquifers subject to constraints involving surface-water depletion and drawdown. In optimizing pumping for a fish hatchery in a bedrock aquifer system overlain by glacial deposits in eastern Wisconsin, various features of the GWM-2000 code were used to overcome difficulties associated with: 1) Non-linear response matrices caused by unconfined conditions and head-dependent boundaries; 2) Efficient selection of candidate well and drawdown constraint locations; and 3) Optimizing against water-level constraints inside pumping wells. Features of GWM-2000 were harnessed to test the effects of systematically varying the decision variables and constraints on the optimized solution for managing withdrawals. An important lesson of the procedure, similar to lessons learned in model calibration, is that the optimized outcome is non-unique, and depends on a range of choices open to the user. The modeler must balance the complexity of the numerical flow model used to represent the groundwater-flow system against the range of options (decision variables, objective functions, constraints) available for optimizing the model.
National Research Council Canada - National Science Library
Berns, Eric
2002-01-01
The technical objectives of this study are to determine optimum techniques for a flat-panel Cesium- iodide silicon-diode full-field digital mammography system and to compare those optimized techniques...
National Research Council Canada - National Science Library
Berns, Eric
2001-01-01
The technical objectives of this study are to determine optimum techniques for a flat-panel Cesium-iodide silicon-diode full-field digital mammography system and to compare those optimized techniques...
9th International Conference on Optimization : Techniques and Applications
Wang, Song; Wu, Soon-Yi
2015-01-01
This book presents the latest research findings and state-of-the-art solutions on optimization techniques and provides new research directions and developments. Both the theoretical and practical aspects of the book will be of great benefit to experts and students in the optimization and operations research community. It collects high-quality papers from The International Conference on Optimization: Techniques and Applications (ICOTA2013). The conference is an official conference series of POP (The Pacific Optimization Research Activity Group, which has over 500 active members). These state-of-the-art works, authored by recognized experts, will contribute to the development of optimization and its applications.
Optimal Technique in Cardiac Anesthesia Recovery
Svircevic, V.
2014-01-01
The aim of this thesis is to evaluate fast-track cardiac anesthesia techniques and investigate their impact on postoperative mortality, morbidity and quality of life. The following topics will be discussed in the thesis. (1.) Is fast track cardiac anesthesia a safe technique for cardiac surgery?
Query Optimization Techniques in Microsoft SQL Server
Directory of Open Access Journals (Sweden)
Costel Gabriel CORLATAN
2014-09-01
Full Text Available Microsoft SQL Server is a relational database management system, with MS-SQL and Transact-SQL as its primary structured programming languages. They rely on relational algebra, which is mainly used for data insertion, modification, deletion and retrieval, as well as for data access control. Producing the expected results efficiently is handled by the management system, whose purpose is to find the best execution plan; this process is called optimization. The most frequently used queries are data retrievals through the SELECT command. We have to take into consideration that not only SELECT queries need optimization, but also other objects, such as indexes, views, and statistics.
A direct heuristic algorithm for linear programming
Indian Academy of Sciences (India)
Abstract. A mathematically non-iterative heuristic procedure that needs no artificial variables is presented for solving linear programming problems. An optimality test is included. Numerical experiments demonstrate the utility and scope of such a procedure.
A novel technique for active vibration control, based on optimal ...
Indian Academy of Sciences (India)
In the last few decades, researchers have proposed many control techniques to suppress unwanted vibrations in a structure. In this work, a novel and simple technique is proposed for the active vibration control. In this technique, an optimal tracking control is employed to suppress vibrations in a structure by simultaneously ...
A novel technique for active vibration control, based on optimal ...
Indian Academy of Sciences (India)
BEHROUZ KHEIRI SARABI
2017-07-11
Abstract. In the last few decades, researchers have proposed many control techniques to suppress unwanted vibrations in a structure. In this work, a novel and simple technique is proposed for the active vibration control. In this technique, an optimal tracking control is employed to suppress vibrations in a ...
Dawid Połap; Marcin Woźniak
2017-01-01
In this article, we present a nature-inspired optimization algorithm which we call the Polar Bear Optimization Algorithm (PBO). The inspiration for the algorithm comes from the way polar bears hunt to survive in harsh arctic conditions. These carnivorous mammals are active all year round. The frosty climate, unfavorable to other animals, has made polar bears adapt to a specific mode of exploration and hunting over large areas, not only on ice but also in water. The proposed novel mat...
Complex energy system management using optimization techniques
Energy Technology Data Exchange (ETDEWEB)
Bridgeman, Stuart; Hurdowar-Castro, Diana; Allen, Rick; Olason, Tryggvi; Welt, Francois
2010-09-15
Modern energy systems are often very complex with respect to the mix of generation sources, energy storage, transmission, and avenues to market. Historically, power was provided by government organizations to load centers, and pricing was set in a regulatory manner. In recent years, this process has been displaced by the independent system operator (ISO). This complexity makes the operation of these systems very difficult, since the components of the system are interdependent. Consequently, computer-based large-scale simulation and optimization methods such as Decision Support Systems (DSS) are now being used. This paper discusses the application of a DSS to operations and planning systems.
Nonparametric Comparison of Two Dynamic Parameter Setting Methods in a Meta-Heuristic Approach
Directory of Open Access Journals (Sweden)
Seyhun HEPDOGAN
2007-10-01
Full Text Available Meta-heuristics are commonly used to solve combinatorial problems in practice. Many approaches provide very good quality solutions in a short amount of computational time; however, most meta-heuristics use parameters to tune performance for particular problems, and selecting these parameters before solving the problem can require much time. This paper investigates the problem of setting parameters in a typical meta-heuristic called Meta-RaPS (Meta-heuristic for Randomized Priority Search). Meta-RaPS is a promising meta-heuristic optimization method that has been applied to different types of combinatorial optimization problems and has achieved very good performance compared to other meta-heuristic techniques. To solve a combinatorial problem, Meta-RaPS uses two well-defined stages at each iteration: construction and local search. After a number of iterations, the best solution is reported. Meta-RaPS performance depends on the fine tuning of two main parameters, the priority percentage and the restriction percentage, which are used during the construction stage. This paper presents two different dynamic parameter setting methods for Meta-RaPS, which tune the parameters while a solution is being found. To compare the two approaches, nonparametric statistical methods are utilized, since the solutions are not normally distributed. Results from both dynamic parameter setting methods are reported.
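The construction stage and its two parameters can be sketched on a toy 0/1 knapsack (the problem, item data, and parameter values are illustrative; Meta-RaPS is problem-independent, and the local-search stage is omitted here):

```python
import random
random.seed(1)

def metaraps_construct(items, capacity, priority_pct=70, restriction_pct=20):
    # One construction pass: with probability priority_pct% take the
    # best-priority feasible item; otherwise pick randomly among items
    # whose priority is within restriction_pct% of the best.
    remaining, solution, load = list(items), [], 0
    while True:
        feasible = [(v, w) for v, w in remaining if load + w <= capacity]
        if not feasible:
            break
        feasible.sort(key=lambda it: it[0] / it[1], reverse=True)  # priority rule
        best_ratio = feasible[0][0] / feasible[0][1]
        if random.uniform(0, 100) <= priority_pct:
            pick = feasible[0]
        else:
            cand = [it for it in feasible
                    if it[0] / it[1] >= best_ratio * (1 - restriction_pct / 100)]
            pick = random.choice(cand)
        solution.append(pick)
        load += pick[1]
        remaining.remove(pick)
    return solution

items = [(60, 10), (100, 20), (120, 30), (40, 25)]      # (value, weight)
best = max((metaraps_construct(items, 50) for _ in range(50)),
           key=lambda s: sum(v for v, _ in s))
print(sum(v for v, _ in best))
```

The dynamic variants the paper studies adjust `priority_pct` and `restriction_pct` between iterations instead of fixing them up front.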
Directory of Open Access Journals (Sweden)
Xuejun Li
2015-01-01
Full Text Available A cloud workflow system is a kind of platform service based on cloud computing. It facilitates the automation of workflow applications. Among the factors distinguishing cloud workflow systems from their counterparts, the market-oriented business model is one of the most prominent. The optimization of task-level scheduling in cloud workflow systems is a hot topic. As the scheduling is an NP-hard problem, Ant Colony Optimization (ACO) and Particle Swarm Optimization (PSO) have been proposed to optimize the cost. However, they suffer from premature convergence in the optimization process and therefore cannot effectively reduce the cost. To solve these problems, a Chaotic Particle Swarm Optimization (CPSO) algorithm with a chaotic sequence and an adaptive inertia weight factor is applied to task-level scheduling. The chaotic sequence, with its high randomness, improves the diversity of solutions, and its regularity assures good global convergence. The adaptive inertia weight factor depends on the estimated value of the cost; it lets the scheduling avoid premature convergence by properly balancing global and local exploration. The experimental simulation shows that the cost obtained by our scheduling is always lower than that of the two representative counterparts.
Li, Xuejun; Xu, Jia; Yang, Yun
2015-01-01
A cloud workflow system is a kind of platform service based on cloud computing. It facilitates the automation of workflow applications. Among the factors distinguishing cloud workflow systems from their counterparts, the market-oriented business model is one of the most prominent. The optimization of task-level scheduling in cloud workflow systems is a hot topic. As the scheduling is an NP-hard problem, Ant Colony Optimization (ACO) and Particle Swarm Optimization (PSO) have been proposed to optimize the cost. However, they suffer from premature convergence in the optimization process and therefore cannot effectively reduce the cost. To solve these problems, a Chaotic Particle Swarm Optimization (CPSO) algorithm with a chaotic sequence and an adaptive inertia weight factor is applied to task-level scheduling. The chaotic sequence, with its high randomness, improves the diversity of solutions, and its regularity assures good global convergence. The adaptive inertia weight factor depends on the estimated value of the cost; it lets the scheduling avoid premature convergence by properly balancing global and local exploration. The experimental simulation shows that the cost obtained by our scheduling is always lower than that of the two representative counterparts.
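The two CPSO ingredients named in this abstract are easy to illustrate. The logistic map is the standard way to generate a chaotic sequence, and the inertia rule below is one common adaptive form, shown as an illustrative choice rather than the paper's exact formula:

```python
def logistic_map(x0=0.37, n=10, mu=4.0):
    # Chaotic sequence x_{k+1} = mu * x_k * (1 - x_k). With mu = 4 the
    # iterates stay in (0, 1) and are highly irregular; CPSO uses them
    # in place of uniform random numbers to diversify the swarm.
    seq, x = [], x0
    for _ in range(n):
        x = mu * x * (1.0 - x)
        seq.append(x)
    return seq

def adaptive_inertia(cost, cost_min, cost_avg, w_max=0.9, w_min=0.4):
    # Illustrative adaptive rule: particles whose cost is better than the
    # swarm average get a smaller, more exploitative inertia weight.
    if cost <= cost_avg and cost_avg > cost_min:
        return w_min + (w_max - w_min) * (cost - cost_min) / (cost_avg - cost_min)
    return w_max

seq = logistic_map()
print(all(0.0 < x < 1.0 for x in seq))           # stays inside (0, 1)
print(adaptive_inertia(2.0, 1.0, 3.0))           # good particle: small w
print(adaptive_inertia(4.0, 1.0, 3.0))           # poor particle: w_max
```

In the full CPSO these two pieces drop into the standard PSO velocity update: the chaotic values replace the random factors, and `adaptive_inertia` replaces the constant weight.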
Heuristics for Multidimensional Packing Problems
DEFF Research Database (Denmark)
Egeblad, Jens
for a three-dimensional knapsack packing problem involving furniture is presented in the fourth paper. The heuristic is based on a variety of techniques including tree-search, wall-building, and sequential placement. The solution process includes considerations regarding stability and load bearing strength...
Heuristic Search Theory and Applications
Edelkamp, Stefan
2011-01-01
Search has been vital to artificial intelligence from the very beginning as a core technique in problem solving. The authors present a thorough overview of heuristic search with a balance of discussion between theoretical analysis and efficient implementation and application to real-world problems. Current developments in search such as pattern databases and search with efficient use of external memory and parallel processing units on main boards and graphics cards are detailed. Heuristic search as a problem solving tool is demonstrated in applications for puzzle solving, game playing, constra
The analytical representation of viscoelastic material properties using optimization techniques
Hill, S. A.
1993-02-01
This report presents a technique to model viscoelastic material properties with a function in the form of a Prony series. Generally, the method employed to determine the function constants requires assuming values for the exponential constants and then resolving the remaining constants through linear least-squares techniques. The technique presented here allows all the constants to be determined analytically through optimization techniques. It is implemented in a computer program named PRONY and makes use of a commercially available optimization tool developed by VMA Engineering, Inc. The PRONY program was used to compare the technique against previously determined models for the solid rocket motor TP-H1148 propellant and the V747-75 Viton fluoroelastomer. In both cases, the optimization technique generated functions that modeled the test data with at least an order of magnitude better correlation. The technique has demonstrated the capability to use small or large data sets, with uniformly or nonuniformly spaced data pairs. The reduction of experimental data to accurate mathematical models is a vital part of most scientific and engineering research. This technique of regression through optimization can be applied to other mathematical models that are difficult to fit to experimental data through traditional regression techniques.
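The contrast the report draws can be sketched on synthetic data. `fit_linear` is the traditional approach (exponential constants assumed, coefficients from linear least squares); `fit_optimized` is a crude grid-search stand-in for the report's optimizer, which also searches the exponential constants. The data, grid, and term count are illustrative assumptions.

```python
import numpy as np
from itertools import combinations

def prony_basis(t, taus):
    # Columns: [1, exp(-t/tau_1), ..., exp(-t/tau_n)]
    return np.column_stack([np.ones_like(t)] + [np.exp(-t / tau) for tau in taus])

def fit_linear(t, y, taus):
    # Traditional step: with the exponential constants assumed, the
    # remaining Prony coefficients follow from linear least squares.
    A = prony_basis(t, taus)
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef, float(np.linalg.norm(A @ coef - y))

def fit_optimized(t, y, tau_grid, n_terms=2):
    # Stand-in for the report's optimizer: search candidate relaxation
    # times, solving the linear subproblem at each candidate pair.
    best = None
    for taus in combinations(tau_grid, n_terms):
        coef, res = fit_linear(t, y, list(taus))
        if best is None or res < best[2]:
            best = (list(taus), coef, res)
    return best

# Synthetic relaxation data: G(t) = 1 + 2 exp(-t/0.5) + 3 exp(-t/5)
t = np.linspace(0.0, 20.0, 200)
y = 1.0 + 2.0 * np.exp(-t / 0.5) + 3.0 * np.exp(-t / 5.0)

_, assumed_res = fit_linear(t, y, [1.0, 10.0])     # guessed exponents
taus, coef, opt_res = fit_optimized(t, y, [0.25, 0.5, 1.0, 2.0, 5.0, 10.0])
print(sorted(taus), opt_res < assumed_res)
```

Optimizing the relaxation times recovers the true pair (0.5, 5) with essentially zero residual, while the guessed exponents leave a visible misfit, mirroring the order-of-magnitude improvement the report measures.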
Advanced memory optimization techniques for low-power embedded processors
Verma, Manish
2007-01-01
- The complete application, including data variables and code segments, is optimized
- Comprehensive architecture-level exploration for real-life applications
- Demonstration of architecture-aware compilation techniques
de Jong, Menno D.T.; van der Geest, Thea
2000-01-01
This article is intended to make Web designers more aware of the qualities of heuristics by presenting a framework for analyzing the characteristics of heuristics. The framework is meant to support Web designers in choosing among alternative heuristics. We hope that better knowledge of the
Directory of Open Access Journals (Sweden)
Fanrong Kong
2017-09-01
Full Text Available To alleviate greenhouse gas emissions and the dependence on fossil fuel, Plug-in Hybrid Electric Vehicles (PHEVs) have gained increasing popularity in recent decades. Because electricity prices fluctuate in the power market, the charging schedule strongly influences driving cost. Although next-day electricity prices can be obtained in a day-ahead power market, a driving plan is not easily made in advance. PHEV owners can input a next-day plan into a charging system (e.g., an aggregator) a day ahead, but doing so every day is a tedious task, and the driving plan may not be very accurate. To address this problem, in this paper we analyze energy demands from a PHEV owner's historical driving records and build a personalized statistical driving model. Based on the model and the electricity spot prices, a rolling optimization strategy is proposed to help make a charging decision in the current time slot. On the one hand, by employing a heuristic algorithm, the schedule is made according to the situations in the following time slots; on the other hand, after the current time slot, the schedule is remade according to the next tens of time slots. Hence, the schedule is made by a dynamic rolling optimization, but it only commits the charging decision for the current time slot. In this way, both the fluctuation of electricity prices and the driving routine are involved in the scheduling, and it is not necessary for PHEV owners to input a day-ahead driving plan. Simulation results demonstrate that the proposed method helps owners save charging costs while meeting their driving requirements.
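The rolling mechanism, decide only the current slot, then slide the window forward, can be sketched with a simple price-threshold heuristic. This is an illustrative stand-in for the paper's heuristic, with made-up prices and a 1 kWh-per-slot charger.

```python
import math

def rolling_charge(prices, horizon, need, capacity, soc=0.0):
    # Rolling-horizon sketch: at each slot, look `horizon` slots ahead
    # and charge 1 kWh only if the current price is among the cheapest
    # slots still needed; only the current slot's decision is committed.
    schedule = []
    for t, price in enumerate(prices):
        window = prices[t : t + horizon]
        slots_needed = math.ceil(max(0.0, need - soc))
        if soc < capacity and slots_needed > 0:
            threshold = sorted(window)[:slots_needed][-1]
            charge = price <= threshold
        else:
            charge = False
        if charge:
            soc = min(capacity, soc + 1.0)
        schedule.append(charge)
    return schedule, soc

hourly = [30, 28, 25, 22, 20, 21, 26, 35, 40, 38, 33, 29]   # spot prices
sched, soc = rolling_charge(hourly, horizon=6, need=4.0, capacity=6.0)
print(sum(sched), [p for p, c in zip(hourly, sched) if c])
```

On this price series the rule charges exactly during the four cheapest hours (25, 22, 20, 21) without ever committing more than one slot at a time; the paper's method additionally draws `need` from the statistical driving model instead of a fixed constant.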
Directory of Open Access Journals (Sweden)
Vinícius Vilar Jacob
2016-01-01
Full Text Available This paper addresses a single-machine scheduling problem with sequence-dependent family setup times. In this problem the jobs are classified into families according to their similarity characteristics. Setup times are required whenever the machine switches from processing jobs in one family to jobs in another family. The performance measure to be minimized is the total tardiness with respect to the given due dates of the jobs. The problem is NP-hard in the ordinary sense. Since the computational complexity associated with the mathematical formulation of the problem makes it difficult for optimization solvers to deal with large-sized instances in reasonable solution time, efficient heuristic algorithms are needed to obtain near-optimal solutions. In this work we propose three heuristics based on the Iterated Local Search (ILS) metaheuristic. The first heuristic is a basic ILS, the second uses a dynamic perturbation size, and the third uses a Path Relinking (PR) technique as an intensification strategy. We carry out comprehensive computational and statistical experiments in order to analyze the performance of the proposed heuristics. The computational experiments show that the ILS heuristics outperform a genetic algorithm proposed in the literature. The ILS heuristic with dynamic perturbation size and PR intensification has a superior performance compared to the other heuristics.
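The ILS skeleton with a dynamic perturbation size can be sketched on a simplified instance (plain single-machine total tardiness, without the paper's family setups; jobs and the growth rule are illustrative assumptions):

```python
import random
random.seed(7)

# jobs: (processing_time, due_date); single machine, minimize total tardiness
JOBS = [(4, 6), (2, 4), (6, 14), (3, 7), (5, 20), (1, 3)]

def tardiness(order):
    t = total = 0
    for j in order:
        p, d = JOBS[j]
        t += p
        total += max(0, t - d)
    return total

def local_search(order):
    # First-improvement pairwise-swap neighbourhood.
    improved = True
    while improved:
        improved = False
        for i in range(len(order) - 1):
            for j in range(i + 1, len(order)):
                cand = order[:]
                cand[i], cand[j] = cand[j], cand[i]
                if tardiness(cand) < tardiness(order):
                    order, improved = cand, True
    return order

def ils(iters=60):
    # ILS with dynamic perturbation size: the number of random swaps
    # grows while no improvement is found and resets on improvement
    # (the general idea behind the paper's second heuristic).
    cur = local_search(list(range(len(JOBS))))
    best, strength = cur, 1
    for _ in range(iters):
        pert = cur[:]
        for _ in range(strength):                       # perturbation
            i, j = random.sample(range(len(pert)), 2)
            pert[i], pert[j] = pert[j], pert[i]
        pert = local_search(pert)
        if tardiness(pert) < tardiness(best):
            best, cur, strength = pert, pert, 1
        else:
            strength = min(strength + 1, len(JOBS) // 2)
    return best

best = ils()
print(tardiness(best))
```

The third heuristic adds Path Relinking between `best` and elite solutions as an intensification step after the local search.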
Manifold mapping: a two-level optimization technique
Echeverría, D.; Hemker, P.W.
2008-01-01
In this paper, we analyze in some detail the manifold-mapping optimization technique introduced recently [Echeverría and Hemker in space mapping and defect correction. Comput Methods Appl Math 5(2): 107--136, 2005]. Manifold mapping aims at accelerating optimal design procedures that otherwise
Manifold mapping: a two-level optimization technique
D. Echeverria (David); P.W. Hemker (Piet)
2008-01-01
In this paper, we analyze in some detail the manifold-mapping optimization technique introduced recently [Echeverría and Hemker in space mapping and defect correction. Comput Methods Appl Math 5(2): 107–136, 2005]. Manifold mapping aims at accelerating optimal design procedures
A GIS-Based Optimization Technique for Spatial Location of ...
African Journals Online (AJOL)
A Geographic Information System (GIS)-based package, TransCAD v. 5.0, was used to determine the optimal locations of one to ten waste bins. This optimization technique requires less computational time, and the output of ten computer runs showed that partial service coverage ...
Decomposition Techniques and Effective Algorithms in Reliability-Based Optimization
DEFF Research Database (Denmark)
Enevoldsen, I.; Sørensen, John Dalsgaard
1995-01-01
The common problem of an extensive number of limit state function calculations in the various formulations and applications of reliability-based optimization is treated. It is suggested to use a formulation based on decomposition techniques so the nested two-level optimization problem can be solv...
Modeling reproductive decisions with simple heuristics
Directory of Open Access Journals (Sweden)
Peter Todd
2013-10-01
Full Text Available BACKGROUND Many of the reproductive decisions that humans make happen without much planning or forethought, arising instead through the use of simple choice rules or heuristics that involve relatively little information and processing. Nonetheless, these heuristic-guided decisions are typically beneficial, owing to humans' ecological rationality - the evolved fit between our constrained decision mechanisms and the adaptive problems we face. OBJECTIVE This paper reviews research on the ecological rationality of human decision making in the domain of reproduction, showing how fertility-related decisions are commonly made using various simple heuristics matched to the structure of the environment in which they are applied, rather than being made with information-hungry mechanisms based on optimization or rational economic choice. METHODS First, heuristics for sequential mate search are covered; these heuristics determine when to stop the process of mate search by deciding that a good-enough mate who is also mutually interested has been found, using a process of aspiration-level setting and assessing. These models are tested via computer simulation and comparison to demographic age-at-first-marriage data. Next, a heuristic process of feature-based mate comparison and choice is discussed, in which mate choices are determined by a simple process of feature-matching with relaxing standards over time. Parental investment heuristics used to divide resources among offspring are summarized. Finally, methods for testing the use of such mate choice heuristics in a specific population over time are then described.
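The aspiration-level search described in the Methods can be simulated directly. The sketch below uses illustrative parameters (uniform candidate qualities, a 12-candidate learning phase), not the paper's fitted demographic values:

```python
import random
random.seed(42)

def search_with_aspiration(qualities, learning_phase=12):
    # Satisficing mate search: assess the first `learning_phase`
    # candidates without committing, set the aspiration level to the
    # best quality seen, then accept the first later candidate who
    # exceeds it; settle for the last one if nobody clears the bar.
    aspiration = max(qualities[:learning_phase])
    for q in qualities[learning_phase:]:
        if q > aspiration:
            return q
    return qualities[-1]

# How often does the simple rule land the best of 100 candidates?
results = []
for _ in range(2000):
    pool = [random.random() for _ in range(100)]
    results.append(search_with_aspiration(pool) == max(pool))
print(round(sum(results) / len(results), 2))
```

Despite using almost no information, the rule picks the single best candidate in a substantial fraction of runs, which is the paper's point about ecological rationality: frugal heuristics matched to the environment perform surprisingly well.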
2008-06-01
such as intensification, diversification, and strategic oscillation (Harder et al., 2004). Harder et al.'s results are comparable to results achieved ... were obtained by running both solutions using solvers on a Dell Precision PWS690 Intel® Xeon™ CPU 3.37 GHz processor with 3.00 GB of RAM. In some ... there are multiple optimal solutions, but given the problem description, efficient routing procedures are desired and minor reward diversification
Optimization using surrogate models - by the space mapping technique
DEFF Research Database (Denmark)
Søndergaard, Jacob
2003-01-01
Surrogate modelling and optimization techniques are intended for engineering design in the case where an expensive physical model is involved. This thesis provides a literature overview of the field of surrogate modelling and optimization. The space mapping technique is one such method for constructing surrogates ... conditions are satisfied. So hybrid methods, combining the space mapping technique with classical optimization methods, should be used if convergence to high accuracy is wanted. Approximation abilities of the space mapping surrogate are compared with those of a Taylor model of the expensive model. The space mapping surrogate has a lower approximation error for long steps. For short steps, however, the Taylor model of the expensive model is best, due to exact interpolation at the model origin. Five algorithms for space mapping optimization are presented and the numerical performance is evaluated. Three...
Optimization of detector positioning in the radioactive particle tracking technique.
Dubé, Olivier; Dubé, David; Chaouki, Jamal; Bertrand, François
2014-07-01
The radioactive particle tracking (RPT) technique is a non-intrusive experimental velocimetry and tomography technique extensively applied to the study of hydrodynamics in a great variety of systems. In this technique, arrays of scintillation detectors are used to track the motion of a single radioactive tracer particle emitting isotropic γ-rays. This work describes and applies an optimization strategy developed to find an optimal set of positions for the scintillation detectors used in the RPT technique. This strategy employs the overall resolution of the detectors as the objective function and a mesh adaptive direct search (MADS) algorithm to solve the optimization problem. More precisely, NOMAD, a C++ implementation of the MADS algorithm, is used. First, the optimization strategy is validated using simple cases with known optimal detector configurations. Next, it is applied to a three-dimensional axisymmetric system (i.e. a vertical cylinder, which could represent a fluidized bed, bubble column, riser, or similar equipment). The results obtained using the optimization strategy are in agreement with what was previously recommended by Roy et al. (2002) for a similar system. Finally, the optimization strategy is used for a system consisting of a partially filled cylindrical tumbler. The application of insights gained from the optimization strategy is shown to lead to a significant reduction in the error made when reconstructing the position of a tracer particle. The results of this work show that the optimization strategy developed is sensitive to both the type of objective function used and the experimental conditions. The limitations and drawbacks of the optimization strategy are also discussed.
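The poll-and-refine logic of a mesh adaptive direct search can be illustrated with a much simpler coordinate pattern search. NOMAD implements the full MADS algorithm with richer poll sets; the objective below is a toy stand-in for the detector-resolution criterion.

```python
def pattern_search(f, x0, step=1.0, tol=1e-6, max_iter=1000):
    """Poll +/- step along each coordinate; move to any improving point,
    otherwise shrink the mesh. A simplified stand-in for MADS."""
    x, fx = list(x0), f(list(x0))
    for _ in range(max_iter):
        if step < tol:
            break
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = x[:]
                trial[i] += d
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
                    break
            if improved:
                break
        if not improved:
            step *= 0.5  # refine the mesh around the incumbent point
    return x, fx

# toy objective standing in for the detector-resolution criterion
best, val = pattern_search(lambda p: (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2,
                           [0.0, 0.0])
```

Like MADS, this derivative-free scheme only needs objective evaluations, which is why it suits objectives computed from simulated detector responses.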
Meta-heuristics in cellular manufacturing: A state-of-the-art review
Directory of Open Access Journals (Sweden)
Tamal Ghosh
2011-01-01
Full Text Available Meta-heuristic approaches are general algorithmic frameworks, often nature-inspired, designed to solve NP-complete optimization problems in cellular manufacturing systems; they have been a growing research area for the past two decades. This paper discusses various meta-heuristic techniques, such as evolutionary approaches, ant colony optimization, simulated annealing, tabu search and other recent approaches, and their applications to the group technology/cell formation (GT/CF) problem in cellular manufacturing. The novelty of this paper is to incorporate various prevailing issues and open problems of meta-heuristic approaches, their usage, comparison and hybridization, and the scope of future research in the aforesaid area.
Automatic Choice of Scheduling Heuristics for Parallel/Distributed Computing
Directory of Open Access Journals (Sweden)
Clayton S. Ferner
1999-01-01
Full Text Available Task mapping and scheduling are two very difficult problems that must be addressed when a sequential program is transformed into a parallel program. Since these problems are NP-hard, compiler writers have opted to concentrate their efforts on optimizations that produce immediate gains in performance. As a result, current parallelizing compilers either use very simple methods to deal with task scheduling or simply ignore it altogether. Unfortunately, the programmer does not have this luxury: the burden of repartitioning or rescheduling, should the compiler produce inefficient parallel code, lies entirely with the programmer. We were able to create an algorithm (called a metaheuristic) that automatically chooses a scheduling heuristic for each input program. The metaheuristic produces better schedules in general than the heuristics upon which it is based. This technique was tested on a suite of real scientific programs written in SISAL and simulated on four different network configurations. Averaged over all of the test cases, the metaheuristic outperformed all eight underlying scheduling algorithms, beating the best one by 2%, 12%, 13%, and 3% on the four separate network configurations. It is able to do this not always by picking the best heuristic, but rather by avoiding the heuristics when they would produce very poor schedules. For example, while the metaheuristic only picked the best algorithm about 50% of the time for the 100 Gbps Ethernet, its worst decision was only 49% away from optimal. In contrast, the best of the eight scheduling algorithms was optimal 30% of the time, but its worst decision was 844% away from optimal.
Optimization of segmented thermoelectric generator using Taguchi and ANOVA techniques.
Kishore, Ravi Anant; Sanghadasa, Mohan; Priya, Shashank
2017-12-01
Recent studies have demonstrated that segmented thermoelectric generators (TEGs) can operate over a large thermal gradient and thus provide better performance (reported efficiency up to 11%) than traditional TEGs comprising a single thermoelectric (TE) material. However, segmented TEGs are still in the early stages of development due to the inherent complexity of their design optimization and manufacturability. In this study, we demonstrate physics-based numerical techniques along with analysis of variance (ANOVA) and the Taguchi optimization method for optimizing the performance of segmented TEGs. We have considered a comprehensive set of design parameters, such as the geometrical dimensions of the p-n legs, the height of segmentation, the hot-side temperature, and the load resistance, in order to optimize the output power and efficiency of segmented TEGs. Using state-of-the-art TE material properties and appropriate statistical tools, we provide a near-optimum TEG configuration with only 25 experiments, as compared to the 3125 experiments needed by conventional optimization methods. The effect of environmental factors on the optimization of segmented TEGs is also studied. The Taguchi results are validated against those obtained using the traditional full factorial optimization technique, and a TEG configuration for simultaneous optimization of power and efficiency is obtained.
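The mechanics of a Taguchi analysis (orthogonal array, signal-to-noise ratios, main effects) can be sketched on a tiny hypothetical example. The paper uses a five-level L25 array; the two-level L4 array and the power readings below are invented purely to show the computation.

```python
import math

# Hypothetical L4(2^3) orthogonal array: three two-level factors in four runs.
L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
# Hypothetical measured output power (W), two repetitions per run.
power = [(2.1, 2.0), (2.8, 2.7), (3.0, 3.2), (2.4, 2.5)]

def sn_larger_better(ys):
    """Taguchi larger-the-better signal-to-noise ratio."""
    return -10.0 * math.log10(sum(1.0 / y ** 2 for y in ys) / len(ys))

sn = [sn_larger_better(ys) for ys in power]

def main_effects(factor):
    """Mean S/N ratio at each level of one factor."""
    levels = {}
    for run, s in zip(L4, sn):
        levels.setdefault(run[factor], []).append(s)
    return {lvl: sum(v) / len(v) for lvl, v in levels.items()}

# pick the level with the highest mean S/N for each factor
best_levels = [max(main_effects(f), key=main_effects(f).get) for f in range(3)]
```

Because the array is orthogonal, each factor's main effect can be estimated independently, which is what lets 25 runs stand in for a 3125-run full factorial.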
Cache Memory: An Analysis on Replacement Algorithms and Optimization Techniques
Directory of Open Access Journals (Sweden)
QAISAR JAVAID
2017-10-01
Full Text Available Caching strategies can improve the overall performance of a system by allowing the fast processor and the slow memory to operate at the same pace. One important factor in caching is the replacement policy. Advancement in technology has resulted in the evolution of a huge number of techniques and algorithms implemented to improve cache performance. In this paper, analysis is done on different cache optimization techniques as well as replacement algorithms. Furthermore, this paper presents a comprehensive statistical comparison of cache optimization techniques. To the best of our knowledge there is no numerical measure which can tell us the rating of a specific cache optimization technique; we tried to come up with such a numerical figure. By statistical comparison we find out which technique is most consistent among the others. For this purpose we calculated the mean and the CV (coefficient of variation); the CV tells us which technique is more consistent. Comparative analysis of the different techniques shows that the victim cache is the most consistent technique among all.
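The mean/CV comparison described above is a one-liner per technique. The hit-rate numbers below are hypothetical, purely to show how a lower coefficient of variation identifies the more consistent technique.

```python
import statistics

# Hypothetical hit rates (%) of two cache optimization techniques
# across several benchmark traces.
hit_rates = {
    "victim_cache": [91.0, 90.5, 91.2, 90.8],
    "prefetching":  [95.0, 82.0, 97.5, 78.0],
}

def coefficient_of_variation(xs):
    """CV = sample standard deviation / mean; lower means more consistent."""
    return statistics.stdev(xs) / statistics.mean(xs)

cv = {name: coefficient_of_variation(xs) for name, xs in hit_rates.items()}
most_consistent = min(cv, key=cv.get)
```

Note that the CV is dimensionless, so it allows comparing techniques whose mean performance levels differ.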
Process sequence optimization for digital microfluidic integration using EWOD technique
Yadav, Supriya; Joyce, Robin; Sharma, Akash Kumar; Sharma, Himani; Sharma, Niti Nipun; Varghese, Soney; Akhtar, Jamil
2016-04-01
Micro/nano-fluidic MEMS biosensors are devices that detect biomolecules. The emerging micro/nano-fluidic devices provide high throughput and high repeatability with very low response time and reduced device cost as compared to traditional devices. This article presents the experimental details for the process sequence optimization of digital microfluidics (DMF) using "electrowetting-on-dielectric" (EWOD). Stress-free thick-film deposition of silicon dioxide using PECVD and the subsequent processes for the EWOD technique have been optimized in this work.
Machine learning techniques for the optimization of joint replacements
Cilla, Myriam; Borgiani, Edoardo; Martinez, Javier; Duda, Georg N.; Checa, Sara
2017-01-01
Today, different implant designs exist on the market; however, there is no clear understanding of which implant design parameters achieve mechanically optimal conditions. Therefore, the aim of this project was to investigate whether the geometry of a commercial short-stem hip prosthesis can be further optimized to reduce stress shielding effects and achieve better short-stemmed implant performance. To reach this aim, the potential of machine learning techniques combined with param...
Operation optimization of distributed generation using artificial intelligent techniques
Directory of Open Access Journals (Sweden)
Mahmoud H. Elkazaz
2016-06-01
Full Text Available Future smart grids will require an observable, controllable and flexible network architecture for reliable and efficient energy delivery. The use of artificial intelligence and advanced communication technologies is essential in building a fully automated system. This paper introduces a new technique for the online optimal operation of distributed generation (DG) resources, i.e. a hybrid fuel cell (FC) and photovoltaic (PV) system for residential applications. The proposed technique aims to minimize the total daily operating cost of a group of residential homes by managing the operation of embedded DG units remotely from a control centre. The target is formed as an objective function that is solved using a genetic algorithm (GA) optimization technique. The optimal settings of the DG units obtained from the optimization process are sent to each DG unit through a fully automated system. The results show that the proposed technique succeeded in defining the optimal operating points of the DGs, which directly affect the total operating cost of the entire system.
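A GA minimizing a daily operating cost can be sketched as follows. The load/PV profiles, tariffs, and fuel-cell limits are hypothetical; the paper's actual objective and system model are more detailed.

```python
import random

random.seed(0)

# Hypothetical hourly data: residential load (kW) and PV output (kW).
load = [3.0, 4.5, 5.0, 4.0]
pv = [0.5, 2.0, 2.5, 1.0]
FC_COST, GRID_COST, FC_MAX = 0.08, 0.20, 3.0  # $/kWh and kW, illustrative

def cost(fc):
    """Daily cost of a fuel-cell dispatch schedule (one setpoint per hour)."""
    total = 0.0
    for h, p in enumerate(fc):
        residual = max(load[h] - pv[h] - p, 0.0)  # shortfall bought from the grid
        total += FC_COST * p + GRID_COST * residual
    return total

def ga(pop_size=40, gens=200):
    pop = [[random.uniform(0.0, FC_MAX) for _ in load] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)
        elite = pop[:pop_size // 2]  # truncation selection
        children = []
        while len(children) < pop_size - len(elite):
            a, b = random.sample(elite, 2)
            child = [random.choice(pair) for pair in zip(a, b)]  # uniform crossover
            if random.random() < 0.3:  # gaussian mutation, clamped to bounds
                i = random.randrange(len(child))
                child[i] = min(FC_MAX, max(0.0, child[i] + random.gauss(0.0, 0.3)))
            children.append(child)
        pop = elite + children
    return min(pop, key=cost)

best = ga()
```

In a deployment such as the one described, the optimized setpoints would then be dispatched to each DG unit by the control centre.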
Energy Technology Data Exchange (ETDEWEB)
Gonzalez C, J.; Martin del Campo M, C.; Francois L, J.L. [Facultad de Ingenieria, UNAM, Paseo Cuauhnahuac 8532, 62550 Jiutepec, Morelos (Mexico)
2004-07-01
The objective of this work is to verify the validity of the heuristic rules that have been applied in the radial optimization of fuel cells. The rule concerning the placement of fuel in the corners of the cell was examined, with special attention to the influence of the position and concentration of the gadolinium-bearing pellets on the reactivity of the cell and on the safety parameters. The evaluation was carried out on cells designed in violation of the heuristic rules. For both cases the cells were analyzed in an infinite lattice using the HELIOS code. Additionally, for the second case, a more exhaustive stage was carried out in which one of the studied cells that satisfied the safety and reactivity parameters was used to generate the design of an assembly; this assembly was then used with CM-PRESTO to calculate the behavior of the core during three operation cycles. (Author)
Archer, Charles J [Rochester, MN; Blocksome, Michael A [Rochester, MN; Heidelberger, Philip [Cortlandt Manor, NY; Kumar, Sameer [White Plains, NY; Parker, Jeffrey J [Rochester, MN; Ratterman, Joseph D [Rochester, MN
2011-06-07
Methods, compute nodes, and computer program products are provided for heuristic status polling of a component in a computing system. Embodiments include receiving, by a polling module from a requesting application, a status request requesting status of a component; determining, by the polling module, whether an activity history for the component satisfies heuristic polling criteria; polling, by the polling module, the component for status if the activity history for the component satisfies the heuristic polling criteria; and not polling, by the polling module, the component for status if the activity history for the component does not satisfy the heuristic polling criteria.
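One possible embodiment of the receive/determine/poll flow can be sketched as below. The specific criterion (a minimum number of recent activity events inside a time window) is hypothetical; the patent text does not fix the criteria in this abstract.

```python
class HeuristicPoller:
    """Poll a component for status only when its recent activity history
    suggests the status may have changed (illustrative criteria)."""

    def __init__(self, min_events=3, window=60.0):
        self.min_events = min_events  # hypothetical criterion: enough
        self.window = window          # recent events inside the window
        self.history = []             # recorded event timestamps

    def record_activity(self, t):
        self.history.append(t)

    def satisfies_criteria(self, now):
        recent = [t for t in self.history if now - t <= self.window]
        return len(recent) >= self.min_events

    def poll(self, component, now):
        if self.satisfies_criteria(now):
            return component()  # actually query the component for status
        return None             # skip the poll to save bandwidth/CPU

poller = HeuristicPoller()
for t in (1.0, 2.0, 3.0):
    poller.record_activity(t)
status = poller.poll(lambda: "READY", now=10.0)
```

Skipping polls for quiescent components is what makes the approach attractive on large parallel machines, where naive periodic polling of every component scales poorly.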
DEFF Research Database (Denmark)
Sousa, Tiago; Morais, Hugo; Castro, Rui
2016-01-01
will turn the day-ahead optimal resource scheduling problem into an even more difficult optimization problem. Under these circumstances, metaheuristics can be used to address this optimization problem. An adequate algorithm for generating a good initial solution can improve the metaheuristic's performance of finding a final solution nearer to the optimal than using a random initial solution. This paper proposes two initial solution algorithms to be used by a metaheuristic technique (simulated annealing). These algorithms are tested and evaluated against other published algorithms that obtain initial solutions. The proposed algorithms have been developed as modules so that their use by metaheuristics other than simulated annealing is more flexible. The simulated annealing with different initial solution algorithms has been tested in a 37-bus distribution network with distributed resources, especially electric...
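The role of the initial solution in simulated annealing can be illustrated on a toy nonconvex objective. The objective and the "seeded" starting point below are stand-ins: a real day-ahead scheduling problem would replace both the cost function and the constructive initial-solution module.

```python
import math
import random

random.seed(4)

def objective(x):
    """Toy nonconvex cost standing in for the day-ahead scheduling cost."""
    return sum(xi * xi - 3.0 * math.cos(2.0 * math.pi * xi) + 3.0 for xi in x)

def anneal(start, steps=3000, t0=2.0):
    x, fx = start[:], objective(start)
    best, fbest = x[:], fx
    for k in range(steps):
        t = t0 * (1.0 - k / steps) + 1e-9  # linear cooling schedule
        y = [xi + random.gauss(0.0, 0.2) for xi in x]
        fy = objective(y)
        # Metropolis acceptance: always take improvements, sometimes worse moves
        if fy < fx or random.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x[:], fx
    return best, fbest

dim = 5
random_start = [random.uniform(-4.0, 4.0) for _ in range(dim)]
seeded_start = [0.1] * dim  # stands in for a constructive initial-solution module

_, f_random = anneal(random_start)
_, f_seeded = anneal(seeded_start)
```

Writing the initial-solution generator as a separate module, as the paper proposes, means the same seed can feed tabu search or other metaheuristics unchanged.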
Hip joint contact forces calculated using different muscle optimization techniques
Wesseling, M.; Derikx, L.C.; de Groote, F.; Bartels, W.; Meyer, C.; Verdonschot, Nicolaas Jacobus Joseph; Jonkers, I.
2013-01-01
The goal of this study was to calculate muscle forces using different optimization techniques and investigate their effect on hip joint contact forces in gait and sit to stand. These contact forces were compared to measured hip contact forces [3]. The results showed that contact forces were
Optimization of machining techniques – A retrospective and ...
Indian Academy of Sciences (India)
Abstract. In this paper an attempt is made to review the literature on optimizing machining .... of classic logical systems, which impose inherent restrictions on the representation of imprecise concepts. Vagueness ..... A review of the literature shows that various traditional machining optimization techniques like Lagrange's method ...
An improved technique for the prediction of optimal image resolution ...
African Journals Online (AJOL)
2010-10-04
... two simultaneous equations of values of the image noise index (INI) and the degradation level index (LDI), a robust technique for predicting optimal image resolution for the mapping of savannah ecosystems was developed. ..... of aerial photography, Landsat TM and SPOT satellite imagery. Int. J. Remote Sens.
Case Based Heuristic Selection for Timetabling Problems
Burke, Edmund; Petrovic, Sanja; Qu, Rong
2006-01-01
This paper presents a case-based heuristic selection approach for automated university course and exam timetabling. The method described in this paper is motivated by the goal of developing timetabling systems that are fundamentally more general than the current state of the art. Heuristics that worked well in previous similar situations are memorized in a case base and are retrieved for solving the problem in hand. Knowledge discovery techniques are employed in two distinct scenarios. Firstl...
DESIGN OF MICROSTRIP RADIATOR USING PARTICLE SWARM OPTIMIZATION TECHNIQUE
Directory of Open Access Journals (Sweden)
Yogesh Kumar Choukiker
2011-09-01
Full Text Available An inset-fed microstrip radiator has been designed and developed for operation at the 2.4 GHz frequency. The microstrip patch antenna (MPA) parameters were designed using the IE3D™ EM simulator (version 14.0) and optimized with an evolutionary stochastic optimizer, i.e. the Particle Swarm Optimization (PSO) technique. Optimized results show that the antenna has a bandwidth of 33.54 MHz (< -10 dB) in the range 2.38355 GHz to 2.41709 GHz and a maximum return loss of -43.87 dB at the resonant frequency of 2.4 GHz. The patch antenna was fabricated and the important parameters like return loss, VSWR etc. were measured. The measured parameters match the simulated results well within tolerable limits.
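A minimal global-best PSO has the structure below. It is illustrative only: in the antenna design described, the objective function would be a return-loss figure evaluated by the EM simulator rather than the analytic test function used here.

```python
import random

random.seed(7)

def pso(f, dim, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, bounds=(-5.0, 5.0)):
    """Minimal global-best particle swarm optimizer (minimization)."""
    lo, hi = bounds
    xs = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vs = [[0.0] * dim for _ in range(n)]
    pbest = [x[:] for x in xs]
    pval = [f(x) for x in xs]
    g = min(range(n), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])   # cognitive pull
                            + c2 * r2 * (gbest[d] - xs[i][d]))     # social pull
                xs[i][d] += vs[i][d]
            v = f(xs[i])
            if v < pval[i]:
                pbest[i], pval[i] = xs[i][:], v
                if v < gval:
                    gbest, gval = xs[i][:], v
    return gbest, gval

best, val = pso(lambda x: sum(xi * xi for xi in x), dim=2)
```

Because each PSO iteration needs one objective evaluation per particle, swarm size and iteration count are the main cost knobs when every evaluation is a full EM simulation.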
Directory of Open Access Journals (Sweden)
Samiran Karmakar
2014-07-01
Full Text Available An alternative optimization technique via multiobjective programming for constrained optimization problems with interval-valued objectives has been proposed. The reduction of interval objective functions to noninterval (crisp) ones is the main ingredient of the proposed technique. At first, the significance of interval-valued objective functions, along with the meaning of interval-valued solutions of the proposed problem, is explained graphically. Generally, the proposed problems have infinitely many compromise solutions. The objective is to obtain one such solution with higher accuracy and lower computational effort. An adequate number of numerical examples has been solved in support of this technique.
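One common way to reduce an interval objective to crisp ones, which may differ in detail from the paper's reduction, is to minimize the interval's center and width as two objectives and then scalarize. Everything below (the interval cost, the weights) is a hypothetical illustration.

```python
def interval_objective(x):
    """Hypothetical interval-valued cost [lower, upper] for decision x."""
    return (x * x - 1.0, x * x + 0.5 * abs(x) + 1.0)

def center_width(x):
    """Reduce the interval to two crisp objectives: its center and its width."""
    lo, hi = interval_objective(x)
    return (lo + hi) / 2.0, hi - lo

def scalarized(x, w=0.5):
    """Weighted-sum compromise between the two crisp objectives."""
    c, wd = center_width(x)
    return w * c + (1.0 - w) * wd

# crude grid search on [-3, 3] for one compromise solution
best_x = min((i / 100.0 for i in range(-300, 301)), key=scalarized)
```

Varying the weight w traces out different compromise solutions, matching the observation that such problems generally have infinitely many of them.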
Improved Differential Evolution with Shrinking Space Technique for Constrained Optimization
Fu, Chunming; Xu, Yadong; Jiang, Chao; Han, Xu; Huang, Zhiliang
2017-05-01
Most current evolutionary algorithms for constrained optimization have low computational efficiency. In order to improve efficiency, an improved differential evolution with a shrinking space technique and an adaptive trade-off model, named ATMDE, is proposed to solve constrained optimization problems. The proposed ATMDE algorithm employs an improved differential evolution as the search optimizer to generate new offspring individuals in the evolutionary population. For the constraints, the adaptive trade-off model, one of the most important constraint-handling techniques, is employed to select better individuals to retain in the next population, which can effectively handle multiple constraints. The shrinking space technique is then designed to shrink the search region according to feedback information in order to improve computational efficiency without losing accuracy. The improved DE algorithm introduces three different mutant strategies to generate different offspring in the evolutionary population. Moreover, a new mutant strategy called "DE/rand/best/1" is constructed to generate new individuals according to the feasibility proportion of the current population. Finally, the effectiveness of the proposed method is verified on a suite of benchmark functions and practical engineering problems. This research presents a constrained evolutionary algorithm with high efficiency and accuracy for constrained optimization problems.
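For context, the DE skeleton that ATMDE builds on looks as follows. This sketch uses plain DE/rand/1/bin with a penalty for the constraint; the paper's adaptive trade-off model, shrinking space technique, and DE/rand/best/1 strategy are not reproduced here.

```python
import random

random.seed(3)

def de(f, bounds, pop_size=30, gens=150, F=0.6, CR=0.9):
    """Basic DE/rand/1/bin differential evolution (minimization)."""
    dim = len(bounds)
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            jr = random.randrange(dim)  # guaranteed crossover index
            trial = []
            for d in range(dim):
                if random.random() < CR or d == jr:
                    v = pop[a][d] + F * (pop[b][d] - pop[c][d])  # rand/1 mutation
                    v = min(max(v, bounds[d][0]), bounds[d][1])
                    trial.append(v)
                else:
                    trial.append(pop[i][d])
            ft = f(trial)
            if ft <= fit[i]:  # greedy one-to-one selection
                pop[i], fit[i] = trial, ft
    k = min(range(pop_size), key=lambda i: fit[i])
    return pop[k], fit[k]

# toy constrained problem: minimize x + y subject to x*y >= 1 (penalty method)
def penalized(x):
    violation = max(0.0, 1.0 - x[0] * x[1])
    return x[0] + x[1] + 1e3 * violation

best, val = de(penalized, [(0.0, 5.0), (0.0, 5.0)])
```

Replacing the static penalty with an adaptive trade-off model is precisely the kind of refinement the abstract describes for handling multiple constraints.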
Local search-based heuristics for the multiobjective multidimensional knapsack problem
Directory of Open Access Journals (Sweden)
Dalessandro Soares Vianna
2012-01-01
Full Text Available In real optimization problems it is generally desirable to optimize more than one performance criterion (or objective) at the same time. The goal of multiobjective combinatorial optimization (MOCO) is to optimize simultaneously r > 1 objectives. As in the single-objective case, the use of heuristic/metaheuristic techniques seems to be the most promising approach to MOCO problems because of their efficiency, generality and relative simplicity of implementation. In this work, we develop algorithms based on the Greedy Randomized Adaptive Search Procedure (GRASP) and Iterated Local Search (ILS) metaheuristics for the multiobjective knapsack problem. Computational experiments on benchmark instances show that the proposed algorithms are very robust and outperform other heuristics in terms of solution quality and running times.
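The two phases of GRASP (greedy randomized construction, then local search) can be shown on a single-objective knapsack toy. The instance is invented, and the multiobjective machinery of the paper (maintaining a set of non-dominated solutions) is omitted for brevity.

```python
import random

random.seed(11)

values = [10, 7, 4, 9, 6, 3]
weights = [5, 4, 2, 6, 3, 1]
CAP = 10

def construct(alpha=0.3):
    """Greedy randomized construction: pick randomly from a restricted
    candidate list (RCL) of the best value/weight ratios."""
    sol, room = set(), CAP
    while True:
        cand = [i for i in range(len(values))
                if i not in sol and weights[i] <= room]
        if not cand:
            return sol
        cand.sort(key=lambda i: values[i] / weights[i], reverse=True)
        rcl = cand[:max(1, int(alpha * len(cand)))]
        pick = random.choice(rcl)
        sol.add(pick)
        room -= weights[pick]

def local_search(sol):
    """First-improvement swap neighbourhood: trade one item for a better one."""
    improved = True
    while improved:
        improved = False
        for i in list(sol):
            for j in range(len(values)):
                if j in sol:
                    continue
                w = sum(weights[k] for k in sol) - weights[i] + weights[j]
                if w <= CAP and values[j] > values[i]:
                    sol.remove(i)
                    sol.add(j)
                    improved = True
                    break
            if improved:
                break
    return sol

best = max((local_search(construct()) for _ in range(30)),
           key=lambda s: sum(values[i] for i in s))
```

ILS differs mainly in the outer loop: instead of independent restarts, it perturbs the incumbent solution and reapplies local search.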
A method to objectively optimize coral bleaching prediction techniques
van Hooidonk, R. J.; Huber, M.
2007-12-01
Thermally induced coral bleaching is a global threat to coral reef health. Methodologies, e.g. the Degree Heating Week technique, have been developed to predict bleaching induced by thermal stress by utilizing remotely sensed sea surface temperature (SST) observations. These techniques can be used as a management tool for Marine Protected Areas (MPAs). Predictions are valuable to decision makers and stakeholders on weekly to monthly time scales and can be employed to build public awareness and support for mitigation. The bleaching problem is only expected to worsen because global warming poses a major threat to coral reef health. Indeed, predictive bleaching methods combined with climate model output have been used to forecast the global demise of coral reef ecosystems within coming decades due to climate change. The accuracy of these predictive techniques has not been quantitatively characterized despite the critical role they play. Assessments have typically been limited, qualitative or anecdotal, or more frequently they are simply unpublished. Quantitative accuracy assessment, using well established methods and skill scores often used in meteorology and the medical sciences, will enable objective optimization of existing predictive techniques. To accomplish this, we will use existing remotely sensed data sets of sea surface temperature (AVHRR and TMI), and predictive values from techniques such as the Degree Heating Week method. We will compare these predictive values with observations of coral reef health and calculate applicable skill scores (Peirce Skill Score, Hit Rate and False Alarm Rate). We will (a) quantitatively evaluate the accuracy of existing coral reef bleaching predictive methods against state-of-the-art reef health databases, and (b) present a technique that will objectively optimize the predictive method for any given location. We will illustrate this optimization technique for reefs located in Puerto Rico and the US Virgin Islands.
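The skill scores named above come from a standard 2x2 contingency table of forecasts versus observations. The counts below are hypothetical, purely to show the arithmetic.

```python
# Hypothetical contingency counts for bleaching forecasts vs. observations.
hits, misses, false_alarms, correct_negatives = 18, 6, 9, 67

hit_rate = hits / (hits + misses)                    # probability of detection
false_alarm_rate = false_alarms / (false_alarms + correct_negatives)
peirce = hit_rate - false_alarm_rate                 # Peirce skill score (PSS)
# PSS = 1 for a perfect forecast; PSS = 0 for a forecast with no skill.
```

Optimizing a predictor for a given location then amounts to tuning its thresholds (e.g. the Degree Heating Week cutoff) to maximize such a score.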
An image morphing technique based on optimal mass preserving mapping.
Zhu, Lei; Yang, Yan; Haker, Steven; Tannenbaum, Allen
2007-06-01
Image morphing, or image interpolation in the time domain, deals with the metamorphosis of one image into another. In this paper, a new class of image morphing algorithms is proposed based on the theory of optimal mass transport. The L2 mass moving energy functional is modified by adding an intensity penalizing term, in order to reduce the undesired double exposure effect. It is an intensity-based approach and, thus, is parameter free. The optimal warping function is computed using an iterative gradient descent approach. The proposed morphing method is also extended to doubly connected domains using a harmonic parameterization technique, along with finite-element methods.
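For reference, the standard L2 Monge-Kantorovich energy over mass-preserving maps, which this class of methods minimizes, can be written as follows; the illustrative intensity penalty in the comment is an assumption, and the exact term used in the paper may differ.

```latex
% L^2 mass transport energy between intensity densities \mu_0 and \mu_1,
% minimized over mass-preserving warps u:
M(u) \;=\; \int_{\Omega} \lVert u(x) - x \rVert^{2}\, \mu_0(x)\, dx ,
\qquad \text{subject to } \mu_0 = \lvert Du \rvert \,(\mu_1 \circ u),
% where |Du| denotes the Jacobian determinant of u.  An intensity penalty,
% e.g. \lambda \int_{\Omega} \bigl(\mu_1(u(x)) - \mu_0(x)\bigr)^{2}\, dx,
% can be added to discourage double-exposure artifacts.
```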
TECHNIQUE OF OPTIMAL AUDIT PLANNING FOR INFORMATION SECURITY MANAGEMENT SYSTEM
Directory of Open Access Journals (Sweden)
F. N. Shago
2014-03-01
Full Text Available The growing complexity of information security management systems leads to the necessity of improving the scientific and methodological apparatus for auditing these systems. Planning is an important and determining part of information security management system (ISMS) auditing. The efficiency of an audit is defined by the relation of the achieved quality indicators to the resources spent. Thus, developing methods and techniques for optimizing audit planning, making it possible to increase its effectiveness, is an important and urgent task. The proposed technique makes it possible to implement an optimal distribution of planning time and material resources over audit stages on the basis of a dynamics model for ISMS quality. A special feature of the proposed approach is the usage of both a priori and a posteriori data for initial audit planning, as well as plan adjustment after each audit event. This makes it possible to optimize the usage of audit resources in accordance with the selected criteria. Application examples of the technique are given for planning an audit of the information security management system of an organization. The results of a computational experiment based on the proposed technique showed that audit time (cost) can be reduced by 10-15% and, consequently, the quality assessments obtained through audit resource allocation can be improved with respect to well-known methods of audit planning.
Structural Sustainability - Heuristic Approach
Rostański, Krzysztof
2017-10-01
Nowadays, we are faced with a challenge of having to join building structures with elements of nature, which seems to be the paradigm of modern planning and design. The questions arise, however, with reference to the following categories: the leading idea, the relation between elements of nature and buildings, the features of a structure combining such elements and, finally, our perception of this structure. If we consider both the overwhelming globalization and our attempts to preserve local values, the only reasonable solution is to develop naturalistic greenery. It can add its uniqueness to any building and to any developed area. Our holistic model, presented in this paper, contains the above mentioned categories within the scope of naturalism. The model is divided into principles, actions related, and possible effects to be obtained. It provides a useful tool for determining the ways and priorities of our design. Although it is not possible to consider all possible actions and solutions in order to support sustainability in any particular design, we can choose, however, a proper mode for our design according to the local conditions by turning to the heuristic method, which helps to choose priorities and targets. Our approach is an attempt to follow the ways of nature as in the natural environment it is optimal solutions that appear and survive, idealism being the domain of mankind only. We try to describe various natural processes in a manner comprehensible to us, which is always a generalization. Such definitions, however, called artificial by naturalists, are presented as art or the current state of knowledge by artists and engineers. Reality, in fact, is always more complicated than its definitions. The heuristic method demonstrates the way how to optimize our design. It requires that all possible information about the local environment should be gathered, as the more is known, the fewer mistakes are made. Following the unquestionable principles, we can
Optimization of Hydraulic Machinery Bladings by Multilevel CFD Techniques
Directory of Open Access Journals (Sweden)
Thum Susanne
2005-01-01
Full Text Available The numerical design optimization of complex hydraulic machinery bladings requires a high number of design parameters, and the use of a precise CFD solver yields high computational costs. To reduce the CPU time needed, a multilevel CFD method has been developed. First of all, the 3D blade geometry is parametrized by means of a geometric design tool to reduce the number of design parameters. To keep geometric accuracy, a special B-spline modification technique has been developed. On the first optimization level, a quasi-3D Euler code (EQ3D) is applied. To guarantee a sufficiently accurate result, the code is calibrated by a Navier-Stokes recalculation of the initial design and can be recalibrated after a number of optimization steps by another Navier-Stokes computation. After having obtained a convergent solution, the optimization process is repeated on the second level using a full 3D Euler code yielding a more accurate flow prediction. Finally, a 3D Navier-Stokes code is applied on the third level to search for the optimum optimorum by means of a fine-tuning of the geometrical parameters. To show the potential of the developed optimization system, the runner blading of a water turbine having a specific speed n_q = 41 1/min was optimized applying the multilevel approach.
Age Effects and Heuristics in Decision Making*
Besedeš, Tibor; Deck, Cary; Sarangi, Sudipta; Shor, Mikhael
2011-01-01
Using controlled experiments, we examine how individuals make choices when faced with multiple options. Choice tasks are designed to mimic the selection of health insurance, prescription drug, or retirement savings plans. In our experiment, available options can be objectively ranked allowing us to examine optimal decision making. First, the probability of a person selecting the optimal option declines as the number of options increases, with the decline being more pronounced for older subjects. Second, heuristics differ by age with older subjects relying more on suboptimal decision rules. In a heuristics validation experiment, older subjects make worse decisions than younger subjects. PMID:22544977
Heuristic-based scheduling algorithm for high level synthesis
Mohamed, Gulam; Tan, Han-Ngee; Chng, Chew-Lye
1992-01-01
A new scheduling algorithm is proposed that uses a combination of a resource utilization chart, a heuristic algorithm to estimate the minimum number of hardware units based on operator mobilities, and a list-scheduling technique to achieve fast and near-optimal schedules. The scheduling time of this algorithm is almost independent of the length of the operator mobilities, as can be seen from the benchmark example (fifth-order digital elliptic wave filter) presented, where the cycle time was increased from 17 to 18 and then to 21 cycles. It is implemented in C on a SUN3/60 workstation.
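The list-scheduling component can be sketched as below. This minimal version schedules ready operations in FIFO order under a unit limit; the paper's algorithm additionally prioritizes by operator mobility and estimates the unit count from a resource utilization chart. The dataflow graph is a made-up example.

```python
def list_schedule(ops, deps, n_units):
    """ops: operation ids; deps: op -> prerequisite ops;
    n_units: functional units available per cycle."""
    done, schedule, cycle = set(), {}, 0
    while len(done) < len(ops):
        ready = [o for o in ops
                 if o not in done and all(p in done for p in deps.get(o, []))]
        for o in ready[:n_units]:  # issue at most n_units ops this cycle
            schedule[o] = cycle
        done.update(ready[:n_units])
        cycle += 1
    return schedule, cycle  # cycle == total schedule length

# tiny dataflow graph: c needs a; d needs a and b; e needs c and d
deps = {"c": ["a"], "d": ["a", "b"], "e": ["c", "d"]}
sched, length = list_schedule(["a", "b", "c", "d", "e"], deps, n_units=2)
```

Swapping the FIFO ready list for a mobility-ordered priority queue is the usual way such schedulers are tightened toward optimal cycle counts.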
The Generalized Direct Optimization Technique for Printed Reflectarrays
DEFF Research Database (Denmark)
Zhou, Min; Sørensen, Stig Busk; Kim, Oleksiy S.
2014-01-01
A generalized direct optimization technique (GDOT) for the design of printed reflectarrays using arbitrarily shaped elements with irregular orientation and position is presented. The GDOT is based on the spectral domain method of moments (SDMoM) assuming local periodicity (LP) and a minimax optimization algorithm. The accuracy of the LP-SDMoM for the design of reflectarrays with irregularly positioned and oriented array elements has been verified by comparisons with a full-wave method of moments. Three contoured beam reflectarrays, forming a high-gain beam on a European coverage area, have been designed: a broadband design, a circularly polarized design using the variable rotation technique, and a design with irregularly positioned array elements. The latter has been manufactured and measured at the DTU-ESA Spherical Near-Field Antenna Test Facility. A very good agreement between simulated ...
Material saving by means of CWR technology using optimization techniques
Pérez, Iñaki; Ambrosio, Cristina
2017-10-01
Material saving is currently a must for forging companies, as material costs amount to up to 50% of the cost for parts made of steel and up to 90% for other materials like titanium. For long products, cross wedge rolling (CWR) technology can be used to obtain forging preforms with a suitable distribution of the material along their axis. However, defining the correct preform dimensions is not an easy task and can require an intensive trial-and-error campaign. To speed up the preform definition, it is necessary to apply optimization techniques to Finite Element Models (FEM) able to reproduce the material behaviour when being rolled. Meta-models Assisted Evolution Strategies (MAES), which combine evolutionary algorithms with Kriging meta-models, are implemented in the FORGE® software and allow reducing optimization computation costs in a relevant way. The paper shows the application of these optimization techniques to the definition of the right preform for a shaft from a vehicle of the agricultural sector. First, the current forging process, based on obtaining the forging preform by means of an open die forging operation, is shown. Then, the CWR preform optimization is developed using the above-mentioned optimization techniques. The objective is to reduce the initial billet weight as much as possible, so a calculation of the flash weight reduction due to the use of the proposed preform is stated. Finally, a simulation of the CWR process for the defined preform is carried out to check that the most common failures (necking, spirals, ...) in CWR do not appear in this case.
Novel optimization technique of isolated microgrid with hydrogen energy storage.
Beshr, Eman Hassan; Abdelghany, Hazem; Eteiba, Mahmoud
2018-01-01
This paper presents a novel optimization technique for energy management studies of an isolated microgrid. The system is supplied by various Distributed Energy Resources (DERs): a Diesel Generator (DG), a Wind Turbine Generator (WTG), and Photovoltaic (PV) arrays, supported by a fuel cell/electrolyzer hydrogen storage system for short-term storage. Multi-objective optimization is performed with a non-dominated sorting genetic algorithm to meet the load requirements under the given constraints, and a novel multi-objective flower pollination algorithm is utilized to check the results. The pros and cons of the two optimization techniques are compared and evaluated. An isolated microgrid is modelled in the MATLAB software package; dispatch of active/reactive power and optimal load flow analysis with slack bus selection are carried out to minimize fuel cost and line losses under realistic constraints. The performance of the system is studied and analyzed under both summer and winter conditions, and three case studies are presented for each condition. The modified IEEE 15-bus system is used to validate the proposed algorithm.
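At the heart of the non-dominated sorting genetic algorithm mentioned above is a Pareto dominance test and the peeling of successive non-dominated fronts. A minimal sketch (a simple O(n²) version with minimization of objective tuples assumed, not the paper's actual implementation):

```python
def dominates(a, b):
    # a dominates b if a is no worse in every objective and strictly
    # better in at least one (minimization assumed).
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(points):
    # Peel off successive non-dominated fronts, as done at the core of NSGA-II.
    fronts, remaining = [], list(points)
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q != p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts
```

For the dispatch problem, each tuple would hold the conflicting objectives (e.g. fuel cost and line losses), and the first front is the Pareto set offered to the operator.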
Heuristics Considering UX and Quality Criteria for Heuristics
Directory of Open Access Journals (Sweden)
Frederik Bader
2017-12-01
Full Text Available Heuristic evaluation is a cheap tool with which one can take qualitative measures of a product’s usability. However, since the methodology was first presented, User Experience (UX) has become more popular, but the heuristics have remained the same. In this paper, we analyse the current state of heuristic evaluation in terms of heuristics for measuring UX. To do so, we carried out a literature review. In addition, we examined different heuristics and mapped them to the UX dimensions of the User Experience Questionnaire (UEQ). Moreover, we propose a quality model for heuristic evaluation and a list of quality criteria for heuristics.
Simple heuristics and rules of thumb: where psychologists and behavioural biologists might meet.
Hutchinson, John M C; Gigerenzer, Gerd
2005-05-31
The Centre for Adaptive Behaviour and Cognition (ABC) has hypothesised that much human decision-making can be described by simple algorithmic process models (heuristics). This paper explains this approach and relates it to research in biology on rules of thumb, which we also review. As an example of a simple heuristic, consider the lexicographic strategy of Take The Best for choosing between two alternatives: cues are searched in turn until one discriminates, then search stops and all other cues are ignored. Heuristics consist of building blocks, and building blocks exploit evolved or learned abilities such as recognition memory; it is the complexity of these abilities that allows the heuristics to be simple. Simple heuristics have an advantage in making decisions fast and with little information, and in avoiding overfitting. Furthermore, humans are observed to use simple heuristics. Simulations show that the statistical structures of different environments affect which heuristics perform better, a relationship referred to as ecological rationality. We contrast ecological rationality with the stronger claim of adaptation. Rules of thumb from biology provide clearer examples of adaptation because animals can be studied in the environments in which they evolved. The range of examples is also much more diverse. To investigate them, biologists have sometimes used similar simulation techniques to ABC, but many examples depend on empirically driven approaches. ABC's theoretical framework can be useful in connecting some of these examples, particularly the scattered literature on how information from different cues is integrated. Optimality modelling is usually used to explain less detailed aspects of behaviour but might more often be redirected to investigate rules of thumb.
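The lexicographic Take The Best strategy described above is simple enough to state directly in code: cues are checked in order of validity, and search stops at the first cue that discriminates. A sketch (cue names and the validity order are hypothetical illustrations):

```python
def take_the_best(cue_values_a, cue_values_b, cue_order):
    """Lexicographic Take The Best for choosing between alternatives A and B.
    Cue values are 1 (positive), 0 (negative) or missing (unknown).
    Cues are searched in validity order; the first discriminating cue
    decides and all remaining cues are ignored."""
    for cue in cue_order:
        a = cue_values_a.get(cue)
        b = cue_values_b.get(cue)
        if a is not None and b is not None and a != b:
            return 'A' if a > b else 'B'
    return 'guess'  # no cue discriminates: guess
```

The frugality comes from the stopping rule: once a cue discriminates, any remaining cues are never looked up.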
Methodology and Implementation on DSP of Heuristic Multiuser DS/CDMA Detectors
Directory of Open Access Journals (Sweden)
Alex Miyamoto Mussi
2010-12-01
Full Text Available The growing number of users of mobile communication networks and the scarcity of the electromagnetic spectrum give diversity and efficient detection/decoding techniques, such as multiple antennas at the transmitter and/or receiver and multiuser detection (MuD), an increasingly prominent role in the telecommunications landscape. This paper presents a design methodology based on digital signal processors (DSP) for the implementation of heuristic multiuser detectors in DS/CDMA (Direct Sequence Code Division Multiple Access) systems. Heuristic detection techniques achieve near-optimal performance, approaching that of maximum-likelihood (ML) detection. In this work, the DSP development platform C6713 DSK, based on the Texas Instruments TMS320C6713 processor, was employed. The proposed heuristic techniques are based on algorithms well established in the literature. The efficiency of the algorithms implemented on the DSP was evaluated numerically by computing the bit error rate (BER). The feasibility of the DSP implementation was then verified by comparing results from Monte Carlo simulations in Matlab with those obtained on the DSP. The results also demonstrate the effective increase in performance and system capacity of DS/CDMA with the use of heuristic multiuser detection techniques implemented directly on the DSP.
Energy Technology Data Exchange (ETDEWEB)
Souza Lima, Carlos A. [Instituto de Engenharia Nuclear - Divisao de Reatores/PPGIEN, Rua Helio de Almeida 75, Cidade Universitaria - Ilha do Fundao, P.O. Box: 68550 - Zip Code: 21941-972, Rio de Janeiro (Brazil); Instituto Politecnico, Universidade do Estado do Rio de Janeiro, Pos-Graduacao em Modelagem Computacional, Rua Alberto Rangel - s/n, Vila Nova, Nova Friburgo, Zip Code: 28630-050, Nova Friburgo (Brazil); Lapa, Celso Marcelo F.; Pereira, Claudio Marcio do N.A. [Instituto de Engenharia Nuclear - Divisao de Reatores/PPGIEN, Rua Helio de Almeida 75, Cidade Universitaria - Ilha do Fundao, P.O. Box: 68550 - Zip Code: 21941-972, Rio de Janeiro (Brazil); Instituto Nacional de Ciencia e Tecnologia de Reatores Nucleares Inovadores (INCT) (Brazil); Cunha, Joao J. da [Eletronuclear Eletrobras Termonuclear - Gerencia de Analise de Seguranca Nuclear, Rua da Candelaria, 65, 7 andar. Centro, Zip Code: 20091-906, Rio de Janeiro (Brazil); Alvim, Antonio Carlos M. [Universidade Federal do Rio de Janeiro, COPPE/Nuclear, Cidade Universitaria - Ilha do Fundao s/n, P.O.Box 68509 - Zip Code: 21945-970, Rio de Janeiro (Brazil); Instituto Nacional de Ciencia e Tecnologia de Reatores Nucleares Inovadores (INCT) (Brazil)
2011-06-15
Research highlights: > Performance of PSO and GA techniques applied to a similar system design. > This work uses the ANGRA 1 (two-loop PWR) core as a prototype. > Results indicate that the PSO technique is more adequate than GA to solve this kind of problem. - Abstract: This paper compares the performance of two optimization techniques, particle swarm optimization (PSO) and genetic algorithm (GA), applied to the design of a typical reduced-scale two-loop Pressurized Water Reactor (PWR) core at full power in single-phase forced circulation flow. This comparison aims at analyzing the performance in reaching the global optimum, considering that both heuristics are population-based search methods, that is, methods whose population (candidate solution set) evolves from one generation to the next using a combination of deterministic and probabilistic rules. The simulated PWR, similar to the ANGRA 1 power plant, was used as a case example to compare the performance of PSO and GA. Results from simulations indicated that PSO is more adequate to solve this kind of problem.
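The PSO update underlying this comparison is compact: each particle moves under inertia plus stochastic attraction to its own best position and the swarm's global best. A generic sketch on a toy objective (not the reactor design model; the parameter values are typical textbook choices, not those of the study):

```python
import random

def pso(f, dim=2, n_particles=20, iters=100, seed=1,
        w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    """Minimize f over [lo, hi]^dim with a basic particle swarm."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]                 # personal best positions
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # inertia + cognitive pull + social pull
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val
```

Unlike a GA, there is no crossover or explicit selection: information flows through the shared global best, which is one reason PSO often converges faster on smooth design landscapes.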
Using Tabu Search Heuristics in Solving the Vehicle Routing ...
African Journals Online (AJOL)
Nafiisah
Tabu Search, according to Glover, is a meta-heuristic that guides a local heuristic to explore the solution space beyond local optimality. Tabu Search starts just as an ordinary local search, proceeding iteratively from one solution to the next until some stopping criterion is satisfied, while making use of some strategies to avoid getting trapped ...
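The mechanism just described, local search plus a short-term memory that forbids recently reversed moves, with an aspiration criterion to override the memory, can be sketched on a small travelling-salesman instance. This is an illustration of the meta-heuristic, not the article's vehicle routing formulation:

```python
def tour_length(tour, dist):
    n = len(tour)
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

def tabu_search(dist, iters=50, tenure=5):
    n = len(dist)
    current = list(range(n))                  # start from the identity tour
    best, best_len = current[:], tour_length(current, dist)
    tabu = {}                                 # move -> iteration until which it is tabu
    for it in range(iters):
        candidates = []
        for i in range(1, n):                 # swap neighbourhood (city 0 fixed)
            for j in range(i + 1, n):
                move = (current[i], current[j])
                neigh = current[:]
                neigh[i], neigh[j] = neigh[j], neigh[i]
                length = tour_length(neigh, dist)
                # Aspiration criterion: a tabu move is allowed anyway
                # if it improves on the best tour found so far.
                if tabu.get(move, -1) < it or length < best_len:
                    candidates.append((length, neigh, move))
        if not candidates:
            continue
        length, current, move = min(candidates, key=lambda c: c[0])
        tabu[move] = it + tenure              # forbid re-swapping this pair for a while
        if length < best_len:
            best, best_len = current[:], length
    return best, best_len
```

Note that the best *non-improving* neighbour is accepted when no improving one exists; that, plus the tabu list, is what lets the search climb out of local optima instead of cycling.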
Efficient Heuristics for Simulating Population Overflow in Parallel Networks
Zaburnenko, T.S.; Nicola, V.F.
2006-01-01
In this paper we propose a state-dependent importance sampling heuristic to estimate the probability of population overflow in networks of parallel queues. This heuristic approximates the "optimal" state-dependent change of measure without the need for costly optimization involved in other ...
On the empirical performance of (T,s,S) heuristics
Babai, M. Zied; Syntetos, Aris A.; Teunter, Ruud
2010-01-01
The periodic (T,s,S) policies have received considerable attention from the academic literature. Determination of the optimal parameters is computationally prohibitive, and a number of heuristic procedures have been put forward. However, these heuristics have never been compared in an extensive
Pitfalls in Teaching Judgment Heuristics
Shepperd, James A.; Koch, Erika J.
2005-01-01
Demonstrations of judgment heuristics typically focus on how heuristics can lead to poor judgments. However, exclusive focus on the negative consequences of heuristics can prove problematic. We illustrate the problem with the representativeness heuristic and present a study (N = 45) that examined how examples influence understanding of the…
National Research Council Canada - National Science Library
Keedwell, Edward; Narayanan, Ajit
2005-01-01
... Intelligence and Computer Science 3.1 Introduction to search 3.2 Search algorithms 3.3 Heuristic search methods 3.4 Optimal search strategies 3.5 Problems with search techniques 3.6 Complexity of...
Directory of Open Access Journals (Sweden)
P. Ajay - D - Vimal Raj
2007-03-01
Full Text Available This paper presents a Particle Swarm Optimization (PSO) based algorithm for Optimal Power Flow (OPF) in a Combined Economic Emission Dispatch (CEED) environment of thermal units while satisfying constraints such as generator capacity limits, power balance and line flow limits. Particle Swarm Optimization is a population-based stochastic optimization technique, developed by Kennedy and Eberhart [12], in which members within a group share information among themselves to achieve the global best position. This method is dynamic in nature, overcomes shortcomings of other evolutionary computation techniques such as premature convergence, and provides high-quality solutions. The performance of the proposed method has been demonstrated on the IEEE 30-bus system with six generating units. The problem has been formulated as a single optimization problem to obtain the solution for the optimal power flow problem with combined fuel cost and environmental impact as objectives. The results obtained by the proposed method are better than those of other evolutionary computation techniques proposed so far.
Villar-Rodriguez, Esther
2015-01-01
According to the report published by the online protection firm Iovation in 2012, cyber fraud ranged from 1 percent of Internet transactions in North America to 7 percent in Africa, most of it involving credit card fraud, identity theft, and account takeover or hacking attempts. This kind of crime is still growing due to the advantages offered by a non-face-to-face channel where an increasing number of unsuspecting victims divulge sensitive information. Interpol...
Techniques for developing reliability-oriented optimal microgrid architectures
Patra, Shashi B.
2007-12-01
Alternative generation technologies such as fuel cells, micro-turbines, solar etc. have been the focus of active research in the past decade. These energy sources are small and modular. Because of these advantages, these sources can be deployed effectively at or near locations where they are actually needed, i.e. in the distribution network. This is in contrast to the traditional electricity generation which has been "centralized" in nature. The new technologies can be deployed in a "distributed" manner. Therefore, they are also known as Distributed Energy Resources (DER). It is expected that the use of DER, will grow significantly in the future. Hence, it is prudent to interconnect the energy resources in a meshed or grid-like structure, so as to exploit the reliability and economic benefits of distributed deployment. These grids, which are smaller in scale but similar to the electric transmission grid, are known as "microgrids". This dissertation presents rational methods of building microgrids optimized for cost and subject to system-wide and locational reliability guarantees. The first method is based on dynamic programming and consists of determining the optimal interconnection between microsources and load points, given their locations and the rights of way for possible interconnections. The second method is based on particle swarm optimization. This dissertation describes the formulation of the optimization problem and the solution methods. The applicability of the techniques is demonstrated in two possible situations---design of a microgrid from scratch and expansion of an existing distribution system.
Optimization Techniques for 3D Graphics Deployment on Mobile Devices
Koskela, Timo; Vatjus-Anttila, Jarkko
2015-03-01
3D Internet technologies are becoming essential enablers in many application areas including games, education, collaboration, navigation and social networking. The use of 3D Internet applications with mobile devices provides location-independent access and richer use context, but also performance issues. Therefore, one of the important challenges facing 3D Internet applications is the deployment of 3D graphics on mobile devices. In this article, we present an extensive survey on optimization techniques for 3D graphics deployment on mobile devices and qualitatively analyze the applicability of each technique from the standpoints of visual quality, performance and energy consumption. The analysis focuses on optimization techniques related to data-driven 3D graphics deployment, because it supports off-line use, multi-user interaction, user-created 3D graphics and creation of arbitrary 3D graphics. The outcome of the analysis facilitates the development and deployment of 3D Internet applications on mobile devices and provides guidelines for future research.
Iterative metal artifact reduction: Evaluation and optimization of technique
Energy Technology Data Exchange (ETDEWEB)
Subhas, Naveen; Gupta, Amit; Polster, Joshua M. [Imaging Institute, Cleveland Clinic, Cleveland, OH (United States); Primak, Andrew N. [Siemens Medical Solutions USA Inc., Malvern, PA (United States); Obuchowski, Nancy A. [Quantitative Health Sciences, Cleveland Clinic, Cleveland, OH (United States); Krauss, Andreas [Siemens Healthcare, Forchheim (Germany); Iannotti, Joseph P. [Orthopaedic and Rheumatologic Institute, Cleveland Clinic, Cleveland, OH (United States)
2014-12-15
Iterative metal artifact reduction (IMAR) is a sinogram inpainting technique that incorporates high-frequency data from standard weighted filtered back projection (WFBP) reconstructions to reduce metal artifact on computed tomography (CT). This study was designed to compare the image quality of IMAR and WFBP in total shoulder arthroplasties (TSA); determine the optimal amount of WFBP high-frequency data needed for IMAR; and compare image quality of the standard 3D technique with that of a faster 2D technique. Eight patients with nine TSA underwent CT with standardized parameters: 140 kVp, 300 mAs, 0.6 mm collimation and slice thickness, and B30 kernel. WFBP, three 3D IMAR algorithms with different amounts of WFBP high-frequency data (IMARlo, lowest; IMARmod, moderate; IMARhi, highest), and one 2D IMAR algorithm were reconstructed. Differences in attenuation near hardware and away from hardware were measured and compared using repeated measures ANOVA. Five readers independently graded image quality; scores were compared using Friedman's test. Attenuation differences were smaller with all 3D IMAR techniques than with WFBP (p < 0.0063). With increasing high-frequency data, the attenuation difference increased slightly (differences not statistically significant). All readers ranked IMARmod and IMARhi more favorably than WFBP (p < 0.05), with IMARmod ranked highest for most structures. The attenuation difference was slightly higher with 2D than with 3D IMAR, with no significant reader preference for 3D over 2D. IMAR significantly decreases metal artifact compared to WFBP both objectively and subjectively in TSA. The incorporation of a moderate amount of WFBP high-frequency data and use of a 2D reconstruction technique optimize image quality and allow for relatively short reconstruction times. (orig.)
Machine learning techniques for energy optimization in mobile embedded systems
Donohoo, Brad Kyoshi
Mobile smartphones and other portable battery operated embedded systems (PDAs, tablets) are pervasive computing devices that have emerged in recent years as essential instruments for communication, business, and social interactions. While performance, capabilities, and design are all important considerations when purchasing a mobile device, a long battery lifetime is one of the most desirable attributes. Battery technology and capacity has improved over the years, but it still cannot keep pace with the power consumption demands of today's mobile devices. This key limiter has led to a strong research emphasis on extending battery lifetime by minimizing energy consumption, primarily using software optimizations. This thesis presents two strategies that attempt to optimize mobile device energy consumption with negligible impact on user perception and quality of service (QoS). The first strategy proposes an application and user interaction aware middleware framework that takes advantage of user idle time between interaction events of the foreground application to optimize CPU and screen backlight energy consumption. The framework dynamically classifies mobile device applications based on their received interaction patterns, then invokes a number of different power management algorithms to adjust processor frequency and screen backlight levels accordingly. The second strategy proposes the usage of machine learning techniques to learn a user's mobile device usage pattern pertaining to spatiotemporal and device contexts, and then predict energy-optimal data and location interface configurations. By learning where and when a mobile device user uses certain power-hungry interfaces (3G, WiFi, and GPS), the techniques, which include variants of linear discriminant analysis, linear logistic regression, non-linear logistic regression, and k-nearest neighbor, are able to dynamically turn off unnecessary interfaces at runtime in order to save energy.
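Of the predictors listed, k-nearest neighbour is the simplest to sketch: store labelled context vectors and let the k closest ones vote on the energy-optimal configuration. The feature encoding below (hour of day, cell id) and the labels are hypothetical illustrations, not the thesis's actual feature set:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Plain k-nearest-neighbour vote. `train` is a list of
    (feature_vector, label) pairs; features are numeric tuples."""
    neighbours = sorted(train, key=lambda s: math.dist(s[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Hypothetical usage history: (hour_of_day, cell_id) -> interface decision.
history = [
    ((9, 1), 'wifi_on'), ((10, 1), 'wifi_on'), ((11, 1), 'wifi_on'),
    ((20, 3), 'wifi_off'), ((21, 3), 'wifi_off'), ((22, 3), 'wifi_off'),
]
```

At runtime the device would query its current spatiotemporal context and switch off any interface the model predicts is unnecessary.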
Using tree diversity to compare phylogenetic heuristics.
Sul, Seung-Jin; Matthews, Suzanne; Williams, Tiffani L
2009-04-29
Evolutionary trees are family trees that represent the relationships between a group of organisms. Phylogenetic heuristics are used to search stochastically for the best-scoring trees in tree space. Given that better tree scores are believed to be better approximations of the true phylogeny, traditional evaluation techniques have used tree scores to determine the heuristics that find the best scores in the fastest time. We develop new techniques to evaluate phylogenetic heuristics based on both tree scores and topologies to compare Pauprat and Rec-I-DCM3, two popular Maximum Parsimony search algorithms. Our results show that although Pauprat and Rec-I-DCM3 find trees with the same best scores, topologically these trees are quite different. Furthermore, the Rec-I-DCM3 trees cluster distinctly from the Pauprat trees. In addition to heatmap visualizations that use parsimony scores and the Robinson-Foulds distance to compare the best-scoring trees found by the two heuristics, we also develop entropy-based methods to show the diversity of the trees found. Overall, Pauprat identifies more diverse trees than Rec-I-DCM3. Our work shows that there is value in comparing heuristics beyond the parsimony scores that they find. Pauprat is a slower heuristic than Rec-I-DCM3. However, there is tremendous value in using Pauprat to reconstruct trees, especially since it finds identically scoring but topologically distinct trees. Hence, instead of discounting Pauprat, effort should go into improving its implementation. Ultimately, improved performance measures lead to better phylogenetic heuristics and will result in better approximations of the true evolutionary history of the organisms of interest.
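The Robinson-Foulds distance used above counts how many groupings appear in one tree but not in the other. A minimal sketch for rooted trees given as nested tuples (this is the cluster-based rooted variant; production phylogenetics tools operate on unrooted bipartitions):

```python
def clusters(tree):
    # Collect the leaf set under every internal node of a rooted tree
    # given as nested tuples, e.g. ((('A', 'B'), 'C'), ('D', 'E')).
    out = set()

    def walk(node):
        if isinstance(node, tuple):
            s = frozenset().union(*(walk(child) for child in node))
            out.add(s)
            return s
        return frozenset([node])  # a leaf

    walk(tree)
    return out

def rf_distance(t1, t2):
    # Symmetric-difference (Robinson-Foulds) distance between the
    # cluster sets of two rooted trees over the same leaves.
    return len(clusters(t1) ^ clusters(t2))
```

Two trees with the same parsimony score can still have a large RF distance, which is exactly the phenomenon the paper exploits when comparing Pauprat and Rec-I-DCM3 beyond their scores.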
Multivariate Analysis Techniques for Optimal Vision System Design
DEFF Research Database (Denmark)
Sharifzadeh, Sara
The present thesis considers optimization of the spectral vision systems used for quality inspection of food items. The relationship between food quality, vision-based techniques and spectral signatures is described, as are the vision instruments for food analysis and the datasets of the food items... Feature selection strategies based on existing sparse regression methods (EN and lasso) and one unsupervised feature selection strategy based on the local maxima of the spectral 1D/2D signals of food items are proposed. In addition, two novel feature extraction and selection strategies are introduced: sparse supervised PCA (SSPCA) and DCT-based characterization of the spectral diffused reflectance images for wavelength selection and discrimination. These methods, together with some other state-of-the-art statistical and mathematical analysis techniques, are applied on datasets of different food items: meat, dairy, fruits...
A genetic algorithm selection perturbative hyper-heuristic for solving ...
African Journals Online (AJOL)
Hyper-heuristics, on the other hand, search a heuristic space with the aim of providing a more generalized solution to the particular optimisation problem. This is a fairly new technique that has proven to be successful in solving various combinatorial optimisation problems. There has not been much research into the use of ...
High-level power analysis and optimization techniques
Raghunathan, Anand
1997-12-01
This thesis combines two ubiquitous trends in the VLSI design world--the move towards designing at higher levels of design abstraction, and the increasing importance of power consumption as a design metric. Power estimation and optimization tools are becoming an increasingly important part of design flows, driven by a variety of requirements such as prolonging battery life in portable computing and communication devices, thermal considerations and system cooling and packaging costs, reliability issues (e.g. electromigration, ground bounce, and I-R drops in the power network), and environmental concerns. This thesis presents a suite of techniques to automatically perform power analysis and optimization for designs at the architecture or register-transfer, and behavior or algorithm levels of the design hierarchy. High-level synthesis refers to the process of synthesizing, from an abstract behavioral description, a register-transfer implementation that satisfies the desired constraints. High-level synthesis tools typically perform one or more of the following tasks: transformations, module selection, clock selection, scheduling, and resource allocation and assignment (also called resource sharing or hardware sharing). High-level synthesis techniques for minimizing the area, maximizing the performance, and enhancing the testability of the synthesized designs have been investigated. This thesis presents high-level synthesis techniques that minimize power consumption in the synthesized data paths. This thesis investigates the effects of resource sharing on the power consumption in the data path, provides techniques to efficiently estimate power consumption during resource sharing, and resource sharing algorithms to minimize power consumption. The RTL circuit that is obtained from the high-level synthesis process can be further optimized for power by applying power-reducing RTL transformations. This thesis presents macro-modeling and estimation techniques for switching
A Deep-Cutting-Plane Technique for Reverse Convex Optimization.
Moshirvaziri, K; Amouzegar, M A
2011-08-01
A large number of problems in engineering design and in many areas of the social and physical sciences and technology lend themselves to particular instances of the problems studied in this paper. Cutting-plane methods have traditionally been used as an effective tool in devising exact algorithms for solving convex and large-scale combinatorial optimization problems. Their utilization in nonconvex optimization has also been promising. A cutting plane, essentially a hyperplane defined by a linear inequality, can be used to effectively reduce the computational effort in the search for a global solution. Each cut is generated in order to eliminate a large portion of the search domain. Thus, a deep cut is intuitively superior in that it excludes a larger set of extraneous points from consideration. This paper is concerned with the development of deep-cutting-plane techniques applied to reverse-convex programs. An upper bound and a lower bound for the optimal value are found, updated, and improved at each iteration. The algorithm terminates when the two bounds collapse or all the generated subdivisions have been fathomed. Finally, computational considerations and numerical results on a set of test problems are discussed. An illustrative example, walking through the steps of the algorithm and explaining the computational process, is presented.
Babaveisi, Vahid; Paydar, Mohammad Mahdi; Safaei, Abdul Sattar
2017-07-01
This study aims to discuss the solution methodology for a closed-loop supply chain (CLSC) network that includes the collection of used products as well as distribution of the new products. This supply chain is presented on behalf of the problems that can be solved by the proposed meta-heuristic algorithms. A mathematical model is designed for a CLSC that involves three objective functions: maximizing the profit and minimizing the total risk and the shortages of products. Since three objective functions are considered, a multi-objective solution methodology is advantageous. Therefore, several approaches have been studied: an NSGA-II algorithm is first utilized, and the results are then validated using MOSA and MOPSO algorithms. Priority-based encoding, which is used in all the algorithms, is the core of the solution computations. To compare the performance of the meta-heuristics, random numerical instances are evaluated using four criteria: mean ideal distance, spread of non-dominated solutions, number of Pareto solutions, and CPU time. To enhance the performance of the algorithms, the Taguchi method is used for parameter tuning. Finally, sensitivity analyses are performed and the computational results are presented based on the sensitivity analyses in parameter tuning.
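Of the four comparison criteria, mean ideal distance (MID) is the most direct to state: the average Euclidean distance of the obtained non-dominated solutions from the ideal point. A sketch (here the ideal point defaults to the component-wise best of the front itself; the exact construction of the ideal point is an assumption, as papers vary on this):

```python
import math

def mean_ideal_distance(front, ideal=None):
    # Average Euclidean distance of front members from the ideal point;
    # a lower MID indicates a front lying closer to the ideal.
    if ideal is None:
        # Component-wise best (minimum, assuming minimization) over the front.
        ideal = tuple(min(p[k] for p in front) for k in range(len(front[0])))
    return sum(math.dist(p, ideal) for p in front) / len(front)
```

The three meta-heuristics would each produce a front for the same instance, and the one with the lowest MID (balanced against spread, front size and CPU time) is preferred.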
Teo, Jing Chun; Foin, Nicolas; Otsuka, Fumiyuki; Bulluck, Heerajnarain; Fam, Jiang Ming; Wong, Philip; Low, Fatt Hoe; Leo, Hwa Liang; Mari, Jean-Martial; Joner, Michael; Girard, Michael J A; Virmani, Renu
2016-01-01
PURPOSE To optimize conventional coronary optical coherence tomography (OCT) images using the attenuation-compensated technique to improve identification of plaques and the external elastic lamina (EEL) contour. METHOD The attenuation-compensated technique was optimized via manipulating contrast
Heuristics for the inversion median problem
2010-01-01
Background The study of genome rearrangements has become a mainstay of phylogenetics and comparative genomics. Fundamental in such a study is the median problem: given three genomes find a fourth that minimizes the sum of the evolutionary distances between itself and the given three. Many exact algorithms and heuristics have been developed for the inversion median problem, of which the best known is MGR. Results We present a unifying framework for median heuristics, which enables us to clarify existing strategies and to place them in a partial ordering. Analysis of this framework leads to a new insight: the best strategies continue to refer to the input data rather than reducing the problem to smaller instances. Using this insight, we develop a new heuristic for inversion medians that uses input data to the end of its computation and leverages our previous work with DCJ medians. Finally, we present the results of extensive experimentation showing that our new heuristic outperforms all others in accuracy and, especially, in running time: the heuristic typically returns solutions within 1% of optimal and runs in seconds to minutes even on genomes with 25'000 genes--in contrast, MGR can take days on instances of 200 genes and cannot be used beyond 1'000 genes. Conclusion Finding good rearrangement medians, in particular inversion medians, had long been regarded as the computational bottleneck in whole-genome studies. Our new heuristic for inversion medians, ASM, which dominates all others in our framework, puts that issue to rest by providing near-optimal solutions within seconds to minutes on even the largest genomes. PMID:20122203
Airfoil shape optimization using non-traditional optimization technique and its validation
Directory of Open Access Journals (Sweden)
R. Mukesh
2014-07-01
Full Text Available Computational fluid dynamics (CFD) is one of the computer-based solution methods most widely employed in aerospace engineering. The computational power and time required to carry out the analysis increase as the fidelity of the analysis increases. Aerodynamic shape optimization has become a vital part of aircraft design in recent years. To optimize an airfoil, its shape must first be described, which ordinarily requires at least a hundred x and y coordinate points; optimizing over such a large number of coordinates is impractical. Nowadays many different parameterization schemes, such as B-spline and PARSEC, are used to describe general airfoils. The main goal of these parameterization schemes is to reduce the number of parameters to as few as possible while still controlling the important aerodynamic features effectively. Here the work has been done on the PARSEC geometry representation method. The objective of this work is to describe a general airfoil using twelve parameters by representing its shape as a polynomial function, and to apply a Genetic Algorithm to optimize the aerodynamic characteristics of the airfoil for specific conditions. A MATLAB program has been developed to implement PARSEC, the panel technique, and the Genetic Algorithm. This program has been tested on a standard NACA 2411 airfoil and used to improve its coefficient of lift. Pressure distribution and coefficient of lift for the airfoil geometries have been calculated using the panel method. The optimized airfoil has an improved coefficient of lift compared with the original one, and is validated using wind tunnel data.
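The PARSEC idea of describing a surface with a short polynomial can be sketched as follows. The coefficients below are hypothetical, chosen only for illustration; in PARSEC proper, the six coefficients of each surface are solved from the twelve geometric parameters (leading-edge radius, crest location and curvature, trailing-edge data, and so on).

```python
def parsec_surface(x, coeffs):
    """PARSEC-style surface polynomial: y(x) = sum_n a_n * x**(n - 1/2),
    evaluated at chordwise position x in [0, 1]."""
    return sum(a * x ** (n - 0.5) for n, a in enumerate(coeffs, start=1))

# Hypothetical coefficients for one surface, for illustration only.
upper = [0.17, -0.08, -0.05, 0.02, -0.01, 0.001]

# Sample the surface at 101 chordwise stations from x = 0 to x = 1.
ys = [parsec_surface(i / 100, upper) for i in range(101)]
```

The half-integer exponents give the rounded leading edge at x = 0, and a Genetic Algorithm can then search over the small parameter vector instead of hundreds of raw coordinates.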
Local search heuristics for the probabilistic dial-a-ride problem
DEFF Research Database (Denmark)
Ho, Sin C.; Haugland, Dag
2011-01-01
evaluation procedure in a pure local search heuristic and in a tabu search heuristic. The quality of the solutions obtained by the two heuristics has been compared experimentally. Computational results confirm that our neighborhood evaluation technique is much faster than the straightforward one...
Techniques for optimizing nanotips derived from frozen Taylor cones
Hirsch, Gregory
2017-12-05
Optimization techniques are disclosed for producing sharp and stable tips/nanotips relying on liquid Taylor cones created from electrically conductive materials with high melting points. A wire substrate of such a material with a preform end in the shape of a regular or concave cone, is first melted with a focused laser beam. Under the influence of a high positive potential, a Taylor cone in a liquid/molten state is formed at that end. The cone is then quenched upon cessation of the laser power, thus freezing the Taylor cone. The tip of the frozen Taylor cone is reheated by the laser to allow its precise localized melting and shaping. Tips thus obtained yield desirable end-forms suitable as electron field emission sources for a variety of applications. In-situ regeneration of the tip is readily accomplished. These tips can also be employed as regenerable bright ion sources using field ionization/desorption of introduced chemical species.
Design Optimization of a Speed Reducer Using Deterministic Techniques
Directory of Open Access Journals (Sweden)
Ming-Hua Lin
2013-01-01
Full Text Available The optimal design problem of minimizing the total weight of a speed reducer under constraints is a generalized geometric programming problem. Since metaheuristic approaches cannot guarantee finding the global optimum of a generalized geometric programming problem, this paper applies an efficient deterministic approach to globally solve speed reducer design problems. The original problem is converted by variable transformations and piecewise linearization techniques. The reformulated problem is a convex mixed-integer nonlinear programming problem that can be solved to an approximate global solution within an acceptable error. Experimental results from solving a practical speed reducer design problem indicate that this study obtains a better solution compared with other existing methods.
Review of optimization techniques of polygeneration systems for building applications
Rong, A.; Su, Y.; Lahdelma, R.
2016-08-01
Polygeneration means the simultaneous production of two or more energy products in a single integrated process. Polygeneration is an energy-efficient technology and plays an important role in the transition to future low-carbon energy systems. It can find wide application in utilities, different types of industrial sectors, and the building sector. This paper mainly focuses on polygeneration applications in the building sector. The scales of polygeneration systems in the building sector range from the micro level for a single home to the large level for residential districts. The development of polygeneration microgrids is also related to building applications. The paper aims at giving a comprehensive review of optimization techniques for designing, synthesizing and operating different types of polygeneration systems for building applications.
Optimal exposure techniques for iodinated contrast enhanced breast CT
Glick, Stephen J.; Makeev, Andrey
2016-03-01
Screening for breast cancer using mammography has been very successful in the effort to reduce breast cancer mortality, and its use has largely driven the 30% reduction in breast cancer mortality observed since 1990 [1]. However, diagnostic mammography remains an area of breast imaging in great need of improvement. One imaging modality proposed for improving the accuracy of diagnostic workup is iodinated contrast-enhanced breast CT [2]. In this study, a mathematical framework is used to evaluate optimal exposure techniques for contrast-enhanced breast CT. The ideal observer signal-to-noise ratio (i.e., d') figure of merit is used to provide a task-performance-based assessment of optimal acquisition parameters under the assumptions of a linear, shift-invariant imaging system. A parallel-cascade model was used to estimate signal and noise propagation through the detector, and a realistic lesion model with iodine uptake was embedded into a structured breast background. Ideal observer performance was investigated across kVp settings, filter materials, and filter thicknesses. Results indicated that many kVp spectrum/filter combinations can improve performance over currently used x-ray spectra.
Optimal technique for maximal forward rotating vaults in men's gymnastics.
Hiley, Michael J; Jackson, Monique I; Yeadon, Maurice R
2015-08-01
In vaulting a gymnast must generate sufficient linear and angular momentum during the approach and table contact to complete the rotational requirements in the post-flight phase. This study investigated the optimization of table touchdown conditions and table contact technique for the maximization of rotation potential for forwards rotating vaults. A planar seven-segment torque-driven computer simulation model of the contact phase in vaulting was evaluated by varying joint torque activation time histories to match three performances of a handspring double somersault vault by an elite gymnast. The closest matching simulation was used as a starting point to maximize post-flight rotation potential (the product of angular momentum and flight time) for a forwards rotating vault. It was found that the maximized rotation potential was sufficient to produce a handspring double piked somersault vault. The corresponding optimal touchdown configuration exhibited hip flexion in contrast to the hyperextended configuration required for maximal height. Increasing touchdown velocity and angular momentum lead to additional post-flight rotation potential. By increasing the horizontal velocity at table touchdown, within limits obtained from recorded performances, the handspring double somersault tucked with one and a half twists, and the handspring triple somersault tucked became theoretically possible. Copyright © 2015 Elsevier B.V. All rights reserved.
Optimization Techniques for Dimensionally Truncated Sparse Grids on Heterogeneous Systems
Deftu, A.
2013-02-01
Given the existing heterogeneous processor landscape dominated by CPUs and GPUs, topics such as programming productivity and performance portability have become increasingly important. In this context, an important question is how we can develop optimization strategies that cover both CPUs and GPUs. We answer this for fastsg, a library that provides functionality for handling high-dimensional functions efficiently. As it can be employed for compressing and decompressing large-scale simulation data, it finds itself at the core of a computational steering application which serves us as a test case. We describe our experience with implementing fastsg's time-critical routines for Intel CPUs and Nvidia Fermi GPUs. We show the differences and especially the similarities between our optimization strategies for the two architectures. With regard to our test case, for which achieving high speedups is a "must" for real-time visualization, we report a speedup of up to 6.2x compared to the state-of-the-art implementation of the sparse grid technique for GPUs. © 2013 IEEE.
Conspicuous Waste and Representativeness Heuristic
Directory of Open Access Journals (Sweden)
Tatiana M. Shishkina
2017-12-01
Full Text Available The article deals with the similarities between conspicuous waste and the representativeness heuristic. Conspicuous waste is analyzed according to the classic Veblenian interpretation, as a strategy to increase social status through conspicuous consumption and conspicuous leisure. In "The Theory of the Leisure Class" Veblen introduced two different types of utility, conspicuous and functional. The article focuses on the possible benefits of analyzing conspicuous utility not only in terms of institutional economic theory, but also in terms of behavioral economics. To this end, the representativeness heuristic is considered, on the one hand, as a way to optimize the decision-making process, which allows it to be examined alongside Simon's procedural rationality. On the other hand, it is also analyzed as a cognitive bias within Kahneman and Tversky's approach. The article analyzes the patterns in deviations from rational behavior that can be observed in cases of conspicuous waste, both in modern market economies in the form of conspicuous consumption and in archaic economies in the form of gift exchange. The article also considers marketing strategies for advertising luxury consumption. It highlights the impact of symbolic capital (in Bourdieu's sense) on the social and symbolic payoffs that actors obtain from acts of conspicuous waste. This makes it possible to analyze conspicuous consumption both as a rational way to obtain a particular kind of payoff and, at the same time, as a form of institutionalized cognitive bias.
A Monte Carlo simulation technique to determine the optimal portfolio
Directory of Open Access Journals (Sweden)
Hassan Ghodrati
2014-03-01
Full Text Available During the past few years, there have been several studies on portfolio management. One of the primary concerns in any stock market is to assess the risk associated with various assets. One recognized method for measuring, forecasting, and managing risk is Value at Risk (VaR), which has drawn much attention from financial institutions in recent years. VaR is a method for recognizing and evaluating risk using standard statistical techniques, and it has increasingly been applied in other fields as well. The present study measures the value at risk of 26 companies from the chemical industry on the Tehran Stock Exchange over the period 2009-2011 using Monte Carlo simulation at the 95% confidence level. The variable used in this study is the daily return derived from daily stock price changes. Moreover, the optimal investment weight of each selected stock has been determined using a hybrid Markowitz and Winker model. The results show that, at the 95% confidence level, the maximum one-day loss would not exceed 1259432 Rials.
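A one-day Monte Carlo VaR at the 95% confidence level can be sketched as follows. The return series and portfolio value here are made up for illustration; they are not the study's Tehran Stock Exchange data.

```python
import random
import statistics

def monte_carlo_var(returns, portfolio_value, n_sims=10000,
                    confidence=0.95, seed=42):
    """Estimate one-day Value at Risk: fit a normal distribution to
    historical daily returns, simulate many next-day returns, and read
    off the loss at the (1 - confidence) quantile."""
    mu = statistics.mean(returns)
    sigma = statistics.stdev(returns)
    rng = random.Random(seed)
    simulated = sorted(rng.gauss(mu, sigma) for _ in range(n_sims))
    cutoff = simulated[int((1 - confidence) * n_sims)]
    return -cutoff * portfolio_value   # loss reported as a positive amount

# Hypothetical daily returns, for illustration only.
hist = [0.01, -0.02, 0.005, -0.01, 0.015, -0.005, 0.02, -0.015, 0.0, 0.01]
var_95 = monte_carlo_var(hist, portfolio_value=1_000_000)
```

Raising the confidence level moves the cutoff deeper into the loss tail, so the reported VaR grows with confidence.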
A Heuristic Bioinspired for 8-Piece Puzzle
Machado, M. O.; Fabres, P. A.; Melo, J. C. L.
2017-10-01
This paper investigates a mathematical model inspired by nature and presents a meta-heuristic that is efficient in improving the performance of an informed search, when using the A* strategy with a general search tree as the data structure. The working hypothesis suggests that the investigated meta-heuristic is optimal in nature and may be promising in minimizing the computational resources required by a goal-based agent in solving problems of high computational complexity (the n-piece puzzle), as well as in optimizing objective functions for local search agents. The objective of this work is to describe qualitatively the characteristics and properties of the mathematical model investigated, correlating the main concepts of the A* function with the significant variables of the meta-heuristic used. The article shows that the amount of memory required to perform the search when using the meta-heuristic is less than when using the A* function to evaluate the nodes of a general search tree for the eight-piece puzzle. It is concluded that the meta-heuristic must be parameterized according to the chosen heuristic and the level of the tree that contains the possible solutions to the chosen problem.
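The A* baseline against which such a meta-heuristic is compared can be sketched for the eight-piece puzzle as follows; this is a standard textbook implementation with the Manhattan-distance heuristic, not the authors' code.

```python
import heapq

GOAL = (1, 2, 3, 4, 5, 6, 7, 8, 0)  # 0 marks the blank

def manhattan(state):
    """Admissible heuristic: sum of Manhattan distances of each tile
    from its goal cell on the 3x3 board."""
    dist = 0
    for i, tile in enumerate(state):
        if tile:
            goal = tile - 1
            dist += abs(i // 3 - goal // 3) + abs(i % 3 - goal % 3)
    return dist

def neighbors(state):
    """States reachable by sliding one tile into the blank."""
    i = state.index(0)
    r, c = divmod(i, 3)
    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nr, nc = r + dr, c + dc
        if 0 <= nr < 3 and 0 <= nc < 3:
            j = nr * 3 + nc
            s = list(state)
            s[i], s[j] = s[j], s[i]
            yield tuple(s)

def a_star(start):
    """Classic A*: expand by f(n) = g(n) + h(n); returns the number of
    moves in an optimal solution."""
    frontier = [(manhattan(start), 0, start)]
    best_g = {start: 0}
    while frontier:
        f, g, state = heapq.heappop(frontier)
        if state == GOAL:
            return g
        for nxt in neighbors(state):
            if g + 1 < best_g.get(nxt, float("inf")):
                best_g[nxt] = g + 1
                heapq.heappush(frontier, (g + 1 + manhattan(nxt), g + 1, nxt))
    return None

moves = a_star((1, 2, 3, 4, 5, 6, 0, 7, 8))
```

The memory cost the paper targets lives in `frontier` and `best_g`, which is exactly what a memory-conscious meta-heuristic tries to shrink.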
Efficient heuristics for maximum common substructure search.
Englert, Péter; Kovács, Péter
2015-05-26
Maximum common substructure search is a computationally hard optimization problem with diverse applications in the field of cheminformatics, including similarity search, lead optimization, molecule alignment, and clustering. Most of these applications have strict constraints on running time, so heuristic methods are often preferred. However, the development of an algorithm that is both fast enough and accurate enough for most practical purposes is still a challenge. Moreover, in some applications, the quality of a common substructure depends not only on its size but also on various topological features of the one-to-one atom correspondence it defines. Two state-of-the-art heuristic algorithms for finding maximum common substructures have been implemented at ChemAxon Ltd., and effective heuristics have been developed to improve both their efficiency and the relevance of the atom mappings they provide. The implementations have been thoroughly evaluated and compared with existing solutions (KCOMBU and Indigo). The heuristics have been found to greatly improve the performance and applicability of the algorithms. The purpose of this paper is to introduce the applied methods and present the experimental results.
Directory of Open Access Journals (Sweden)
Hediyeh Karimi
2013-01-01
Full Text Available It has been predicted that graphene nanomaterials will be among the candidate materials for post-silicon electronics due to astonishing properties such as high carrier mobility, thermal conductivity, and biocompatibility. Graphene is a zero-gap semimetal nanomaterial with a demonstrated ability to serve as an excellent candidate for DNA sensing. Graphene-based DNA sensors have been used to detect DNA adsorption and thereby determine the DNA concentration in an analyte solution. In particular, there is an essential need for developing cost-effective DNA sensors, given their suitability for the diagnosis of genetic and pathogenic diseases. In this paper, the particle swarm optimization technique is employed to optimize the analytical model of a graphene-based DNA sensor used for electrical detection of DNA molecules. The results are reported for 5 different concentrations, covering a range from 0.01 nM to 500 nM. The comparison of the optimized model with the experimental data shows an accuracy of more than 95%, which verifies that the optimized model is reliable for use in any application of the graphene-based DNA sensor.
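A minimal particle swarm optimizer in the spirit of the technique used here can be sketched as follows. The calibration target is a made-up least-squares fit of two parameters, standing in for the sensor model, which is not reproduced.

```python
import random

def pso(objective, dim, bounds, n_particles=20, iters=100, seed=1):
    """Minimal particle swarm optimizer: each particle is pulled toward
    its personal best and the swarm's global best."""
    rng = random.Random(seed)
    lo, hi = bounds
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5          # inertia and acceleration weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Hypothetical calibration: recover slope 2 and intercept 1 from a line.
best, err = pso(lambda p: sum((p[0] * x + p[1] - (2 * x + 1)) ** 2
                              for x in range(5)), dim=2, bounds=(-10, 10))
```

Fitting an analytical sensor model would simply swap the lambda for the squared error between the model's predicted current and the measured values.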
Directory of Open Access Journals (Sweden)
A. Baskar
2016-04-01
Full Text Available Permutation flow shop scheduling problems have been an interesting area of research for over six decades. Of the several objectives, minimization of makespan has been studied the most over the years. The problems are widely regarded as NP-complete if the number of machines is more than three. As the computation time grows exponentially with respect to the problem size, heuristics and meta-heuristics that give reasonably accurate and acceptable results have been proposed by many authors. The NEH algorithm proposed in 1983 is still considered one of the best simple, constructive heuristics for the minimization of makespan. This paper analyses the powerful job insertion technique used by the NEH algorithm and proposes seven new variants with the same level of complexity. The 120 benchmark problem instances proposed by Taillard have been used to validate the algorithms. Of the seven variants, three produce better results than the original NEH algorithm.
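The job insertion idea at the heart of NEH can be sketched as follows, on a small hypothetical instance rather than a Taillard benchmark.

```python
def makespan(seq, p):
    """Flow shop makespan: p[j][m] is the processing time of job j on
    machine m; tail[m] tracks the completion time on machine m."""
    m = len(p[0])
    tail = [0] * m
    for j in seq:
        for k in range(m):
            prev = tail[k - 1] if k else 0
            tail[k] = max(tail[k], prev) + p[j][k]
    return tail[-1]

def neh(p):
    """NEH: order jobs by decreasing total processing time, then insert
    each job at the position minimizing the partial-sequence makespan."""
    order = sorted(range(len(p)), key=lambda j: -sum(p[j]))
    seq = [order[0]]
    for j in order[1:]:
        seq = min((seq[:i] + [j] + seq[i:] for i in range(len(seq) + 1)),
                  key=lambda s: makespan(s, p))
    return seq, makespan(seq, p)

# Small hypothetical 4-job, 3-machine instance (not a Taillard benchmark).
p = [[3, 4, 6], [5, 4, 2], [1, 2, 3], [4, 5, 1]]
seq, cmax = neh(p)
```

On this instance NEH improves on the identity sequence, whose makespan is 20; the variants in the paper change the initial ordering and tie-breaking rules while keeping this insertion loop.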
Optimization technique for problems with an inequality constraint
Russell, K. J.
1972-01-01
The general technique uses a modified version of an existing method termed the pattern search technique. A new procedure, called the parallel move strategy, permits the pattern search technique to be used on problems involving an inequality constraint.
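A minimal sketch of the idea pairs a coordinate-wise pattern search with a quadratic penalty for the inequality constraint, one common way to handle constraints; the report's parallel move strategy itself is not reproduced here.

```python
def pattern_search(f, x0, step=1.0, tol=1e-6, shrink=0.5):
    """Coordinate pattern search: probe +/- step along each axis, keep
    any improving move, and halve the step once no probe helps."""
    x, fx = list(x0), f(x0)
    while step > tol:
        improved = False
        for d in range(len(x)):
            for delta in (step, -step):
                trial = x[:]
                trial[d] += delta
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= shrink
    return x, fx

# The inequality constraint g(x) <= 0 is folded in as a quadratic
# penalty; the unconstrained minimum (3, 2) violates x <= 1, so the
# search should settle near the constrained optimum (1, 2) with f = 4.
def penalized(x, rho=1e4):
    g = x[0] - 1.0                               # constraint: x <= 1
    return (x[0] - 3) ** 2 + (x[1] - 2) ** 2 + rho * max(0.0, g) ** 2

x_best, f_best = pattern_search(penalized, [0.0, 0.0])
```

The penalty weight trades a small constraint violation against fidelity to the objective; larger rho pushes the solution closer to the feasible boundary.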
Technique to optimize magnetic response of gelatin coated magnetic nanoparticles.
Parikh, Nidhi; Parekh, Kinnari
2015-07-01
The paper describes the results of optimizing the magnetic response of a highly stable bio-functionalized magnetic nanoparticle dispersion. The concentration of gelatin during in situ co-precipitation synthesis was varied over 8, 23 and 48 mg/mL to optimize the magnetic properties. This variation results in a change in crystallite size from 10.3 to 7.8 ± 0.1 nm. TEM measurement of the G3 sample shows highly crystalline spherical nanoparticles with a mean diameter of 7.2 ± 0.2 nm and a diameter distribution (σ) of 0.27. FTIR spectra show a shift of 22 cm(-1) at the C=O stretch with the absence of N-H stretching, confirming the chemical binding of gelatin on the magnetic nanoparticles. The concept of the lone pair electrons of the amide group explains the binding mechanism. TGA shows 32.8-25.2% weight loss at 350 °C, substantiating decomposition of the chemically bound gelatin. The magnetic response shows that for the 8 mg/mL gelatin concentration, the initial susceptibility and saturation magnetization are maximal. The cytotoxicity of the G3 sample was assessed in Normal Rat Kidney Epithelial Cells (NRK line) by MTT assay. The results show an increase in viability at all concentrations, indicating a possible stimulating action of these particles in the nontoxic range. This shows the potential of this technique for biological applications, as the coated particles are (i) superparamagnetic, (ii) highly stable in physiological media, (iii) able to carry other drugs via the free functional groups of gelatin, and (iv) non-toxic.
Heuristic decision making in medicine.
Marewski, Julian N; Gigerenzer, Gerd
2012-03-01
Can less information be more helpful when it comes to making medical decisions? Contrary to the common intuition that more information is always better, the use of heuristics can help both physicians and patients to make sound decisions. Heuristics are simple decision strategies that ignore part of the available information, basing decisions on only a few relevant predictors. We discuss: (i) how doctors and patients use heuristics; and (ii) when heuristics outperform information-greedy methods, such as regressions in medical diagnosis. Furthermore, we outline those features of heuristics that make them useful in health care settings. These features include their surprising accuracy, transparency, and wide accessibility, as well as the low costs and little time required to employ them. We close by explaining one of the statistical reasons why heuristics are accurate, and by pointing to psychiatry as one area for future research on heuristics in health care. PMID:22577307
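As a concrete example of such a strategy, a fast-and-frugal tree asks one cue at a time and exits as soon as a cue decides. The sketch below is a simplified, hypothetical rendering of the often-cited coronary care unit allocation rule; the cue names and the collapsed third cue are illustrative, not the published instrument.

```python
def ccu_triage(st_segment_changes, chest_pain_is_chief_complaint,
               any_other_risk_factor):
    """Hypothetical fast-and-frugal tree: each question can trigger an
    immediate decision, so most patients are classified after one or
    two cues instead of a full regression over all predictors."""
    if st_segment_changes:                     # first cue decides alone
        return "coronary care unit"
    if not chest_pain_is_chief_complaint:      # second cue can exit early
        return "regular nursing bed"
    # Third cue: any remaining risk factor tips the decision.
    return "coronary care unit" if any_other_risk_factor else "regular nursing bed"
```

The tree deliberately ignores most of the chart, which is exactly the property the authors argue can make such rules accurate, transparent, and fast at the bedside.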
A HYBRID OPTIMIZATION TECHNIQUE FOR EFFECTIVE DOCUMENT CLUSTERING IN QUESTION ANSWERING SYSTEM
Directory of Open Access Journals (Sweden)
K Karpagam
2017-04-01
Full Text Available Today, information is growing enormously, and it is a difficult and tedious task to retrieve the necessary information from this pool. The field concerned with retrieving relevant answers is intelligent information retrieval, and question answering systems are used to achieve it. Question answering plays a major role in user query processing, information retrieval and the extraction of related information from the information pool. Recently, a number of optimization algorithms have been introduced to obtain more accurate and better results. Genetic Algorithm and Cuckoo Search are nature-inspired meta-heuristic optimization algorithms. In this paper, a combination of Genetic Algorithm with Cuckoo Search is applied to the question answering system. The proposed algorithm is tested with the Amazon review, TripAdvisor and 20 Newsgroups datasets. The results are compared with the Genetic Algorithm and Cuckoo Search algorithms individually.
A Meta-Heuristic Regression-Based Feature Selection for Predictive Analytics
Directory of Open Access Journals (Sweden)
Bharat Singh
2014-11-01
Full Text Available High-dimensional feature selection, finding an optimal feature subset among a very large number of features, is an NP-complete problem. Because conventional optimization techniques are unable to tackle large-scale feature selection problems, meta-heuristic algorithms are widely used. In this paper, we propose a particle swarm optimization technique that utilizes regression techniques for feature selection. We then use the selected features to classify the data. Classification accuracy is used as a criterion to evaluate classifier performance, and classification is accomplished through the use of k-nearest neighbour (KNN) and Bayesian techniques. Various high-dimensional data sets are used to evaluate the usefulness of the proposed approach. Results show that our approach gives better results when compared with other conventional feature selection algorithms.
Directory of Open Access Journals (Sweden)
Viktor Ivanovich Petrov
2017-01-01
Full Text Available The article considers data safety issues for the onboard computers of civil aviation aircraft. Undeclared capabilities are capabilities of technical equipment or software that are not mentioned in the documentation. Requirements on documentation and test content are imposed during software certification. Documentation requirements cover the composition of documents and the content of controls (the specification, description and program code, and the source code). Test requirements include static analysis of program code (including verifying that the sources correspond to their load modules) and dynamic analysis of the source code (including monitoring of execution paths). Currently, there are no comprehensive measures for checking onboard computer software. There are no rules and regulations that allow controlling the software of foreign-produced aircraft, and actually obtaining the software is difficult. Consequently, the author suggests developing the basics of aviation rules and regulations that allow analyzing the programs of civil aviation onboard computers. If software source codes are unavailable, two approaches to code analysis are used: structural static and dynamic analysis of the source code, and signature-heuristic analysis of potentially dangerous operations. Static analysis determines the behavior of the program by reading the program code without running it, represented in assembler language as a disassembly listing. Program tracing is performed by dynamic analysis. The ability to detect undeclared capabilities in aircraft software using an interactive disassembler is considered in this article.
Intelligent System Design Using Hyper-Heuristics
Directory of Open Access Journals (Sweden)
Nelishia Pillay
2015-07-01
Full Text Available Determining the most appropriate search method or artificial intelligence technique to solve a problem is not always evident and usually requires implementation of the different approaches to ascertain this. In some instances a single approach may not be sufficient and hybridization of methods may be needed to find a solution. This process can be time consuming. The paper proposes the use of hyper-heuristics as a means of identifying which method or combination of approaches is needed to solve a problem. The research presented forms part of a larger initiative aimed at using hyper-heuristics to develop intelligent hybrid systems. As an initial step in this direction, this paper investigates this for classical artificial intelligence uninformed and informed search methods, namely depth first search, breadth first search, best first search, hill-climbing and the A* algorithm. The hyper-heuristic determines the search or combination of searches to use to solve the problem. An evolutionary algorithm hyper-heuristic is implemented for this purpose and its performance is evaluated in solving the 8-Puzzle, Towers of Hanoi and Blocks World problems. The hyper-heuristic employs a generational evolutionary algorithm which iteratively refines an initial population using tournament selection to select parents, which the mutation and crossover operators are applied to for regeneration. The hyper-heuristic was able to identify a search or combination of searches to produce solutions for the twenty 8-Puzzle, five Towers of Hanoi and five Blocks World problems. Furthermore, admissible solutions were produced for all problem instances.
Optimizing the technique of laparoscopic splenectomy in children
Directory of Open Access Journals (Sweden)
Józef Dzielicki
2010-03-01
Full Text Available Introduction: The first laparoscopic splenectomy in children was performed in 1993 by Tulman. Over the past 15 years, laparoscopic splenectomy has become the gold standard in many centres. Aim: To establish an optimal surgical technique for laparoscopic splenectomy based on our own experience and on clinical and economic aspects. Material and methods: In 1996-2007, 77 laparoscopic splenectomies were carried out in our department. Most of the indications were haematological diseases. Three groups were created from the study population according to the instruments used for preparation and closure of the vessels. Attention was paid to the duration of the procedure, complications, conversions, volume of postoperative blood loss, duration of analgesic therapy and length of hospital stay. Results: Mean duration of the procedure was 112 ± 53.11 min. The complication rate was 12.98%. There were no statistically significant differences among the three studied groups concerning the number and character of complications. Costs differed significantly among the three groups. Conclusions: 1. There are no statistically significant differences in the clinical aspects of the perioperative course among the three studied groups. 2. The use of disposable instruments, and especially LigaSure high-energy coagulation, substantially lowers the overall costs of the procedure.
Solving Large Clustering Problems with Meta-Heuristic Search
DEFF Research Database (Denmark)
Turkensteen, Marcel; Andersen, Kim Allan; Bang-Jensen, Jørgen
In Clustering Problems, groups of similar subjects are to be retrieved from data sets. In this paper, Clustering Problems with the frequently used Minimum Sum-of-Squares Criterion are solved using meta-heuristic search. Tabu search has proved to be a successful methodology for solving optimization...... problems, but applications to large clustering problems are rare. The simulated annealing heuristic has mainly been applied to relatively small instances. In this paper, we implement tabu search and simulated annealing approaches and compare them to the commonly used k-means approach. We find that the meta-heuristic...
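A minimal simulated annealing approach to the minimum sum-of-squares criterion can be sketched as follows, on a tiny made-up data set; the paper's tabu search and benchmark instances are not reproduced.

```python
import math
import random

def sse(points, assign, k):
    """Minimum sum-of-squares criterion: total squared distance of each
    point to its cluster centroid."""
    total = 0.0
    for c in range(k):
        members = [p for p, a in zip(points, assign) if a == c]
        if not members:
            continue
        cx = sum(x for x, _ in members) / len(members)
        cy = sum(y for _, y in members) / len(members)
        total += sum((x - cx) ** 2 + (y - cy) ** 2 for x, y in members)
    return total

def anneal_clustering(points, k, iters=5000, t0=1.0, cooling=0.999, seed=0):
    """Simulated annealing over assignments: propose moving one random
    point to a random cluster; accept worsening moves with probability
    exp(-delta / T) as the temperature T decays."""
    rng = random.Random(seed)
    assign = [rng.randrange(k) for _ in points]
    cost = sse(points, assign, k)
    t = t0
    for _ in range(iters):
        t *= cooling
        i, c = rng.randrange(len(points)), rng.randrange(k)
        old = assign[i]
        if c == old:
            continue
        assign[i] = c
        new_cost = sse(points, assign, k)
        if new_cost < cost or rng.random() < math.exp(-(new_cost - cost) / t):
            cost = new_cost
        else:
            assign[i] = old
    return assign, cost

# Two well-separated hypothetical blobs; a good clustering splits them.
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
labels, cost = anneal_clustering(pts, k=2)
```

The early high-temperature phase tolerates uphill moves, which is what lets annealing escape the local optima that trap plain k-means reassignment.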
Heuristic of radiodiagnostic systems
Energy Technology Data Exchange (ETDEWEB)
Wackenheim, A.
1986-12-01
In the practice of creating expert systems, the radiologist and his team are considered the experts who lead the work of the knowledge engineer (the "cognitician"). Different kinds of expert systems can be imagined. The author describes the main characteristics of heuristics in redefining semiology, semantics and the rules of picture reading. Finally, it is the experience of the expert-cognitician pair that will in the future determine the success of expert systems in radiology.
DEFF Research Database (Denmark)
Larsen, Anders Astrup; Bendsøe, Martin P.; Schmidt, Henrik Nikolaj Blicher
2007-01-01
The aim of this paper is to optimize a thermal model of a friction stir welding process. The optimization is performed using a space mapping technique in which an analytical model is used along with the FEM model to be optimized. The results are compared to traditional gradient based optimization...
A heuristic and hybrid method for the tank allocation problem in maritime bulk shipping
DEFF Research Database (Denmark)
Vilhelmsen, Charlotte; Larsen, Jesper; Lusby, Richard Martin
2016-01-01
finding a feasible solution. We have developed a heuristic that can efficiently find feasible cargo allocations. Computational results show that it can solve 99 % of the considered instances within 0.4 s and all of them if allowed longer time. We have also modified an optimality based method from...... the literature. The heuristic is much faster than this modified method on the vast majority of considered instances. However, the heuristic struggles on two instances which are relatively quickly solved by the modified optimality based method. These two methods therefore complement each other nicely and so, we...... have created a hybrid method that first runs the heuristic and if the heuristic fails to solve the problem, then runs the modified optimality based method on the parts of the problem that the heuristic did not solve. This hybrid method cuts between 90 and 94 % of the average running times compared...
Statistical designs and response surface techniques for the optimization of chromatographic systems.
Ferreira, Sergio Luis Costa; Bruns, Roy Edward; da Silva, Erik Galvão Paranhos; Dos Santos, Walter Nei Lopes; Quintella, Cristina Maria; David, Jorge Mauricio; de Andrade, Jailson Bittencourt; Breitkreitz, Marcia Cristina; Jardim, Isabel Cristina Sales Fontes; Neto, Benicio Barros
2007-07-27
This paper describes the fundamentals and applications of multivariate statistical techniques for the optimization of chromatographic systems. The response surface methodologies (central composite design, Doehlert matrix and Box-Behnken design) are discussed, and applications of these techniques for the optimization of sample preparation steps (extractions) and the determination of experimental conditions for chromatographic separations are presented. The use of mixture designs for the optimization of mobile phases is also discussed. An optimization example involving a real separation process is exhaustively described. A discussion of model validation is presented. Some applications of other multivariate techniques for the optimization of chromatographic methods are also summarized.
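As a rough illustration of the response surface idea described above (a sketch under assumptions: the design points, the toy response function and its optimum at (0.5, -0.3) are invented for this example, not taken from the paper), a two-factor central composite design can be fitted with an ordinary least-squares quadratic model and its stationary point located analytically:

```python
import numpy as np

# Hypothetical true response with a maximum at (x1, x2) = (0.5, -0.3)
def response(x1, x2):
    return 10 - (x1 - 0.5) ** 2 - 2 * (x2 + 0.3) ** 2

a = np.sqrt(2)  # axial distance for a two-factor central composite design
design = [(-1, -1), (1, -1), (-1, 1), (1, 1),
          (-a, 0), (a, 0), (0, -a), (0, a), (0, 0)]

# Full quadratic model: y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2
X = np.array([[1, x1, x2, x1**2, x2**2, x1*x2] for x1, x2 in design])
y = np.array([response(x1, x2) for x1, x2 in design])

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2, b11, b22, b12 = beta

# Stationary point: set the gradient of the fitted quadratic to zero
H = np.array([[2 * b11, b12], [b12, 2 * b22]])
opt = np.linalg.solve(H, -np.array([b1, b2]))
print(np.round(opt, 3))
```

Because the toy response is itself quadratic, the fit is exact and the recovered stationary point matches the true optimum.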
Heuristic for two-dimensional homogeneous two-segment cutting patterns
Cui, Yaodong
2013-01-01
This article deals with the guillotine-constrained two-dimensional cutting problem, where a guillotine is used to cut the stock plate into rectangular pieces, such that the pattern value (the total value of the pieces produced) is maximized, observing the constraint that the frequency of each piece type should not exceed the demand. Homogeneous two-segment (HTS) cutting patterns are considered to simplify the cutting process. Each HTS pattern includes two segments, each segment contains homogeneous strips of the same direction, and each homogeneous strip contains pieces of the same type. A heuristic is presented for generating HTS patterns. It is based on dynamic programming and branch-and-bound techniques. The computational results indicate that the heuristic is able to generate solutions close to optimal, and is adequate for solving large-scale instances.
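A minimal sketch of the segment-building idea in the abstract above (illustrative only: the piece data are invented, the demand is enforced only per strip rather than globally, and this is a simplification of, not the paper's, algorithm): the best homogeneous strip for each piece type is priced first, and a knapsack-style dynamic program then stacks strips into a segment.

```python
def best_strip(length, piece, demand):
    # A homogeneous strip holds copies of a single piece type laid side by side.
    # Simplification: the demand cap is applied per strip, not across the pattern.
    count = min(length // piece["w"], demand)
    return count * piece["v"]

def best_segment(seg_len, seg_height, pieces, demands):
    # dp[h] = best total value of homogeneous strips stacked into height h
    dp = [0] * (seg_height + 1)
    for h in range(1, seg_height + 1):
        for p, d in zip(pieces, demands):
            if p["h"] <= h:
                dp[h] = max(dp[h], dp[h - p["h"]] + best_strip(seg_len, p, d))
    return dp[seg_height]

# Invented piece types: width w, height h, value v
pieces = [{"w": 3, "h": 2, "v": 5}, {"w": 4, "h": 3, "v": 9}]
print(best_segment(10, 5, pieces, [10, 10]))
```

The full heuristic in the paper additionally enforces the demand constraint across both segments via branch-and-bound, which this sketch omits.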
Directory of Open Access Journals (Sweden)
Qi Xu
2016-01-01
Full Text Available This paper proposes an economic production quantity problem with a maximal production run time and a minimal preventive maintenance time over a finite planning horizon. The objective is to find an efficient production and maintenance policy to minimize the total cost, composed of production, maintenance, shortage, and holding costs, under the restrictions on the production run time and the preventive maintenance time. The production and maintenance decisions include the production and maintenance frequencies and the production run and maintenance times. The variability and boundedness of the production run and maintenance times make the problem difficult to solve. Two heuristic algorithms are developed using different techniques based on the optimal properties of the relaxed problem. The performance comparison between the two algorithms is illustrated by numerical examples. The numerical results show that, in most cases, one of the two heuristic algorithms is more effective than the other.
Optimal Design of Composite Structures Under Manufacturing Constraints
DEFF Research Database (Denmark)
Marmaras, Konstantinos
This thesis considers discrete multi-material and thickness optimization of laminated composite structures including local failure criteria and manufacturing constraints. Our models closely follow an immediate extension of the Discrete Material Optimization scheme, which allows simultaneous... mixed integer 0–1 programming problems. The manufacturing constraints have been treated by developing explicit models with favorable properties. In this thesis we have developed and implemented special purpose global optimization methods and heuristic techniques for solving this class of problems... to react swiftly to changes of scale in the problem. As opposed to the original Discrete Material Optimization methodology, we obtain discrete feasible solutions to the stated mixed 0–1 convex problems by the application of advanced heuristic techniques. Our heuristics are based on solving a finite...
Optimization of Hydraulic Machinery Bladings by Multilevel CFD Techniques
Directory of Open Access Journals (Sweden)
Susanne Thum
2005-01-01
prediction. Finally, a 3D Navier-Stokes code is applied on the third level to search for the optimum optimorum by means of a fine-tuning of the geometrical parameters. To show the potential of the developed optimization system, the runner blading of a water turbine having a specific speed nq=411/min was optimized applying the multilevel approach.
Optimizing feed force for turned parts through the Taguchi technique
Indian Academy of Sciences (India)
Abstract. The objective of the paper is to obtain an optimal setting of turning process parameters (cutting speed, feed rate and depth of cut) resulting in an optimal value of the feed force when machining EN24 steel with TiC-coated tungsten-carbide inserts. The effects of the selected turning process parameters on feed force.
A Tutorial on Heuristic Methods
DEFF Research Database (Denmark)
Vidal, Rene Victor Valqui; Werra, D. de; Silver, E.
1980-01-01
In this paper we define a heuristic method as a procedure for solving a well-defined mathematical problem by an intuitive approach in which the structure of the problem can be interpreted and exploited intelligently to obtain a reasonable solution. Issues discussed include: (i) the measurement of the quality of a heuristic method, (ii) different types of heuristic procedures, (iii) the interactive role of human beings and (iv) factors that may influence the choice or testing of heuristic methods. A large number of references are included.
Bio-Inspired Meta-Heuristics for Emergency Transportation Problems
Directory of Open Access Journals (Sweden)
Min-Xia Zhang
2014-02-01
Full Text Available Emergency transportation plays a vital role in the success of disaster rescue and relief operations, but its planning and scheduling often involve complex objectives and search spaces. In this paper, we conduct a survey of recent advances in bio-inspired meta-heuristics, including genetic algorithms (GA), particle swarm optimization (PSO), ant colony optimization (ACO), etc., for solving emergency transportation problems. We then propose a new hybrid biogeography-based optimization (BBO) algorithm, which outperforms some state-of-the-art heuristics on a typical transportation planning problem.
Bashiri, Mahdi; Karimi, Hossein
2012-07-01
Quadratic assignment problem (QAP) is a well-known problem in facility location and layout. It belongs to the NP-complete class. There are many heuristic and meta-heuristic methods presented for the QAP in the literature. In this paper, we applied 2-opt, greedy 2-opt, 3-opt, greedy 3-opt, and VNZ as heuristic methods and tabu search (TS), simulated annealing, and particle swarm optimization as meta-heuristic methods for the QAP. This research compares the relative percentage deviation of these solution qualities from the best known solution, which is introduced in QAPLIB. Furthermore, a tuning method is applied for the meta-heuristic parameters. Results indicate that TS is the best in 31% of QAPs, and the IFLS method, which is in the literature, is the best in 58% of QAPs; these two methods are the same in 11% of the test problems. Also, TS has a better computational time among the heuristic and meta-heuristic methods.
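The pairwise-exchange (2-opt) local search named in the abstract can be sketched as follows (the flow and distance matrices are invented toy data; real benchmark instances come from QAPLIB):

```python
import itertools

def qap_cost(perm, flow, dist):
    # Total cost: flow between facilities i, j times the distance between
    # their assigned locations perm[i], perm[j]
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]]
               for i in range(n) for j in range(n))

def two_opt(perm, flow, dist):
    # Repeatedly swap two facility assignments while the swap lowers the cost
    best = list(perm)
    best_cost = qap_cost(best, flow, dist)
    improved = True
    while improved:
        improved = False
        for i, j in itertools.combinations(range(len(best)), 2):
            cand = best[:]
            cand[i], cand[j] = cand[j], cand[i]
            c = qap_cost(cand, flow, dist)
            if c < best_cost:
                best, best_cost = cand, c
                improved = True
    return best, best_cost

# Invented 3-facility instance
flow = [[0, 3, 1], [3, 0, 2], [1, 2, 0]]
dist = [[0, 1, 4], [1, 0, 2], [4, 2, 0]]
print(two_opt([0, 1, 2], flow, dist))
```

The "greedy" variants in the paper differ in accepting the first improving swap rather than scanning all pairs; 3-opt exchanges three assignments at a time.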
Techniques for Component-Based Software Architecture Optimization
Directory of Open Access Journals (Sweden)
Adil Ali Abdelaziz
2012-06-01
Full Text Available Although Component-Based System (CBS) development increases the efficiency of development and reduces the need for maintenance, even good quality components can fail to compose a good product if the composition is not managed appropriately. In the real world, such as the industrial automation domain, this probability is unacceptable because additional measures, time, effort, and costs are required to minimize its impacts. Many general optimization approaches have been proposed in the literature to manage the composition of a system at an early stage of development. This paper investigates recent approaches used to optimize software architecture. The results of this study are important since they will be used to develop an efficient optimization framework to optimize software architecture in the next step of our ongoing research.
Complex Chemical Reaction Networks from Heuristics-Aided Quantum Chemistry.
Rappoport, Dmitrij; Galvin, Cooper J; Zubarev, Dmitry Yu; Aspuru-Guzik, Alán
2014-03-11
While structures and reactivities of many small molecules can be computed efficiently and accurately using quantum chemical methods, heuristic approaches remain essential for modeling complex structures and large-scale chemical systems. Here, we present a heuristics-aided quantum chemical methodology applicable to complex chemical reaction networks such as those arising in cell metabolism and prebiotic chemistry. Chemical heuristics offer an expedient way of traversing high-dimensional reactive potential energy surfaces and are combined here with quantum chemical structure optimizations, which yield the structures and energies of the reaction intermediates and products. Application of heuristics-aided quantum chemical methodology to the formose reaction reproduces the experimentally observed reaction products, major reaction pathways, and autocatalytic cycles.
Heuristic Portfolio Trading Rules with Capital Gain Taxes
DEFF Research Database (Denmark)
Fischer, Marcel; Gallmeyer, Michael
outperform a 1/N trading strategy augmented with a tax heuristic, not even the most tax- and transaction-cost efficient buy-and-hold strategy. Overall, the best strategy is 1/N augmented with a heuristic that allows for a fixed deviation in absolute portfolio weights. Our results show that the best trading strategy is not dominated out-of-sample by a variety of optimizing trading strategies, except the parametric portfolios of Brandt, Santa-Clara, and Valkanov (2009). With dividend and realization-based capital gain taxes, the welfare costs of the taxes are large, with the cost being as large as 30% of wealth in some cases. Overlaying simple tax trading heuristics on these trading strategies improves out-of-sample performance. In particular, the 1/N trading strategy's welfare gains improve when a variety of tax trading heuristics are also imposed. For medium to large transaction costs, no trading strategy can...
Directory of Open Access Journals (Sweden)
Elisabeth Rangosch-Schneck
2007-01-01
Full Text Available Studying subjective attitudes has to answer the question about the possibilities of expressing personal systems of meanings and about the possibilities of reconstructing the verbalized meanings by the researchers. Investigating teachers' perceptions of parents leads to two additional problems: teachers' rule of neutrality—which demands not making emotional and degrading statements—, and the normative orientation of "partnership" with parents—which is currently being increasingly discussed in the context of school-development and which makes teachers justify their working together with parents. The Repertory Grid Technique as a method of interview design supports the teachers who have been questioned in verbalizing individual perceptions of parents without using common phrases in their statements. But explicitly integrating the method in a qualitative design raises new questions: According to the currently prevailing orientation towards the typical quantitative Grid-data, qualitative designs are of little interest, so one cannot rely on proven procedures. The procedure described in this article is characterized by using the complete transcription of the interviews with teachers and the analysis of them as texts. The quantitative grid data are only collected for heuristic purposes. This article is a contribution to the discussion of the possibilities of realizing the qualitative potentials of Repertory Grid Technique. URN: urn:nbn:de:0114-fqs070197
Reexamining Our Bias against Heuristics
McLaughlin, Kevin; Eva, Kevin W.; Norman, Geoff R.
2014-01-01
Using heuristics offers several cognitive advantages, such as increased speed and reduced effort when making decisions, in addition to allowing us to make decisions in situations where missing data do not allow for formal reasoning. But the traditional view of heuristics is that they trade accuracy for efficiency. Here the authors discuss sources…
Directory of Open Access Journals (Sweden)
Jianhua Wang
2014-10-01
Full Text Available Purpose: The stable relationship of one-supplier-one-customer is gradually being replaced by a dynamic relationship of multi-supplier-multi-customer in the current market, and efficient scheduling techniques are important tools in establishing this dynamic supply chain relationship. This paper studies the optimization of the integrated planning and scheduling problem of a two-stage supply chain with multiple manufacturers and multiple retailers to obtain a minimum supply chain operating cost, where the manufacturers have different production capacities, holding and production cost rates, and transportation costs to retailers. Design/methodology/approach: As a complex task allocation and scheduling problem, this paper sets up an INLP model for it and designs a Unit Cost Adjusting (UCA) heuristic algorithm that adjusts the suppliers' supplying quantities step by step according to their unit costs to solve the model. Findings: Relying on the contrasting analysis between the UCA and the Lingo solvers over many numerical experiments, results show that the INLP model and the UCA algorithm can obtain a near-optimal solution of the two-stage supply chain's planning and scheduling problem within very short CPU time. Research limitations/implications: The proposed UCA heuristic can easily help managers to optimize two-stage supply chain scheduling problems that do not include the delivery time and batch of orders. Since two-stage supply chains are the most common form of actual commercial relationships, some modification and further study of the UCA heuristic should be able to optimize the integrated planning and scheduling problems of a supply chain with more realistic constraints. Originality/value: This research proposes an innovative UCA heuristic for optimizing the integrated planning and scheduling problem of two-stage supply chains with the constraints of suppliers' production capacity and the orders' delivery time, and has a great
Machine learning techniques for optical communication system optimization
DEFF Research Database (Denmark)
Zibar, Darko; Wass, Jesper; Thrane, Jakob
In this paper, machine learning techniques relevant to optical communication are presented and discussed. The focus is on applying machine learning tools to optical performance monitoring and performance prediction.
Optimal Technique for Abdominal Fascial Closure in Liver Transplant Patients
Directory of Open Access Journals (Sweden)
Unal Aydin
2010-01-01
Conclusion: Our results indicate that the novel technique used in this study contributed to overcoming early and late postoperative complications associated with closure of the abdominal fascia in liver transplant patients. In addition, this new technique has proven to be easily applicable, faster, safer and efficient in these patients; it is also potentially useful for conventional surgery.
LITERATURE REVIEW OF OPTIMIZATION TECHNIQUES FOR CHATTER SUPPRESSION IN MACHINING
Directory of Open Access Journals (Sweden)
Ahmad RazlanYusoff
2011-12-01
Full Text Available Chatter produces a poor surface finish, high tool wear, and can even damage machine tools because of the regenerative effect, the loss of contact effect, and the mode coupling effect. Various research works have investigated the suppression of chatter by either passive or active methods, such as by applying absorbers, damping, varied speeds and other alternatives. In this paper, it can be observed that for chatter suppression, optimization focuses on spindle design, tool path, cutting process, and variable pitch. Various algorithms can be applied in the optimization of machining problems; however, Differential Evolution is the most appropriate for use in chatter suppression, being less time consuming, locally optimal, and more robust than both Genetic Algorithms, despite their wide applications, and Sequential Quadratic Programming, which is a famous conventional algorithm.
Traditional and contemporary techniques for optimizing root canal irrigation.
Holliday, Richard; Alani, Aws
2014-01-01
Canal irrigation during root canal treatment is an important component of chemo-mechanical debridement of the root canal system. Traditional syringe irrigation can be enhanced by activating the irrigant to provide superior cleaning properties. This activation can be achieved by simple modifications in current technique or by contemporary automated devices. Novel techniques are also being developed, such as the Self-adjusting File (Re-Dent-Nova, Ra'anana, Israel), Ozone (Healozone, Dental Ozone, London, UK), Photoactivated Disinfection and Ultraviolet Light Disinfection. This paper reviews the techniques available to enhance traditional syringe irrigation, contemporary irrigation devices and novel techniques, citing their evidence base, advantages and disadvantages. Recent advances in irrigation techniques and canal disinfection and debridement are relevant to practitioners carrying out root canal treatment.
Optimal Component Lumping: problem formulation and solution techniques
DEFF Research Database (Denmark)
Lin, Bao; Leibovici, Claude F.; Jørgensen, Sten Bay
2008-01-01
This paper presents a systematic method for optimal lumping of a large number of components in order to minimize the loss of information. In principle, a rigorous composition-based model is preferable to describe a system accurately. However, computational intensity and numerical issues restrict... to determine the lumping scheme. Given an objective function defined with a linear weighting rule, an optimal lumping problem is formulated as a mixed integer nonlinear programming (MINLP) problem both in discrete and in continuous settings. A reformulation of the original problem is also presented, which...
Adjoint Techniques for Topology Optimization of Structures Under Damage Conditions
Akgun, Mehmet A.; Haftka, Raphael T.
2000-01-01
The objective of this cooperative agreement was to seek computationally efficient ways to optimize aerospace structures subject to damage tolerance criteria. Optimization was to involve sizing as well as topology optimization. The work was done in collaboration with Steve Scotti, Chauncey Wu and Joanne Walsh at the NASA Langley Research Center. Computation of constraint sensitivity is normally the most time-consuming step of an optimization procedure. The cooperative work first focused on this issue and implemented the adjoint method of sensitivity computation (Haftka and Gurdal, 1992) in an optimization code (runstream) written in Engineering Analysis Language (EAL). The method was implemented both for bar and plate elements including buckling sensitivity for the latter. Lumping of constraints was investigated as a means to reduce the computational cost. Adjoint sensitivity computation was developed and implemented for lumped stress and buckling constraints. Cost of the direct method and the adjoint method was compared for various structures with and without lumping. The results were reported in two papers (Akgun et al., 1998a and 1999). It is desirable to optimize topology of an aerospace structure subject to a large number of damage scenarios so that a damage tolerant structure is obtained. Including damage scenarios in the design procedure is critical in order to avoid large mass penalties at later stages (Haftka et al., 1983). A common method for topology optimization is that of compliance minimization (Bendsoe, 1995) which has not been used for damage tolerant design. In the present work, topology optimization is treated as a conventional problem aiming to minimize the weight subject to stress constraints. Multiple damage configurations (scenarios) are considered. Each configuration has its own structural stiffness matrix and, normally, requires factoring of the matrix and solution of the system of equations. Damage that is expected to be tolerated is local
Topology optimization with geometric uncertainties by perturbation techniques
DEFF Research Database (Denmark)
Lazarov, Boyan Stefanov; Schevenels, M.; Sigmund, Ole
2012-01-01
the above assumptions, the proposed algorithm provides a computationally cheap alternative to previously introduced stochastic optimization methods based on Monte Carlo sampling. The method is demonstrated on the design of a minimum compliance cantilever beam and a compliant mechanism. Copyright © 2012 John... The aim of this paper was to present a topology optimization methodology for obtaining robust designs insensitive to small uncertainties in the geometry. The variations are modeled using a stochastic field. The model can represent spatially varying geometry imperfections in devices produced...
GYutsis: heuristic based calculation of general recoupling coefficients
Van Dyck, D.; Fack, V.
2003-08-01
physical problem: A general recoupling coefficient for an arbitrary number of (integer or half-integer) angular momenta can be expressed as a formula consisting of products of 6-j coefficients summed over a certain number of variables. Such a formula can be generated using the program GYutsis (with a graphical user front end) or CycleCostAlgorithm (with a text-mode user front end). Method of solution: Using the graphical techniques of Yutsis, Levinson and Vanagas (1962), a summation formula for a general recoupling coefficient is obtained by representing the coefficient as a Yutsis graph and by performing a selection of reduction rules valid for such graphs. Each reduction rule contributes to the final summation formula by a numerical factor or by an additional summation variable. Whereas an optimal summation formula (i.e. with a minimum number of summation variables) is hard to obtain, we present here some new heuristic approaches for selecting an edge from a k-cycle in order to transform it into a (k-1)-cycle (k>3) in such a way that a 'good' summation formula is obtained. Typical running time: From instantaneous for the typical problems to 30 s for the heaviest problems on a Pentium II-350 Linux system with 256 MB RAM.
Optimization of an embedded rail structure using a numerical technique
Markine, V.L.; De Man, A.P.; Esveld, C.
2000-01-01
This paper presents several steps of a procedure for design of a railway track aiming at the development of optimal track structures under various predefined service and environmental conditions. The structural behavior of the track is analyzed using a finite element model in which the track and a
Solving semi-infinite optimization problems with interior point techniques
Stein, Oliver; Still, Georg J.
2003-01-01
We introduce a new numerical solution method for semi-infinite optimization problems with convex lower level problems. The method is based on a reformulation of the semi-infinite problem as a Stackelberg game and the use of regularized nonlinear complementarity problem functions. This approach leads
An improved technique for the prediction of optimal image resolution ...
African Journals Online (AJOL)
Past studies to predict optimal image resolution required for generating spatial information for savannah ecosystems have yielded different outcomes, hence providing a knowledge gap that was investigated in the present study. The postulation, for the present study, was that by graphically solving two simultaneous ...
Machine learning techniques for optical communication system optimization
Zibar, Darko; Wass, Jesper; Thrane, Jakob; Piels, Molly
2017-01-01
In this paper, machine learning techniques relevant to optical communication are presented and discussed. The focus is on applying machine learning tools to optical performance monitoring and performance prediction.
Optimal PID control of a brushless DC motor using PSO and BF techniques
Directory of Open Access Journals (Sweden)
H.E.A. Ibrahim
2014-06-01
Full Text Available This paper presents a Particle Swarm Optimization (PSO) technique and a bacterial foraging (BF) technique for determining the optimal parameters of a PID controller for speed control of a brushless DC (BLDC) motor, where the BLDC motor is modeled in Simulink in Matlab. The proposed technique was more efficient in improving the step response characteristics as well as reducing the steady-state error, rise time, settling time and maximum overshoot.
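A bare-bones PSO loop of the kind used for such parameter tuning might look as follows (a sketch under assumptions: the sphere function stands in for the actual PID performance index, and the inertia and acceleration coefficients 0.7/1.5/1.5 are common textbook values, not the paper's settings):

```python
import random

def pso(f, dim, n_particles=20, iters=100, bounds=(-5.0, 5.0), seed=1):
    rng = random.Random(seed)
    lo, hi = bounds
    # Random initial positions; velocities start at zero
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]              # each particle's best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]  # swarm-wide best
    w, c1, c2 = 0.7, 1.5, 1.5                # inertia, cognitive, social weights
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = f(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy objective standing in for the PID tuning cost (e.g. an error integral
# of the step response); three dimensions stand in for Kp, Ki, Kd.
sphere = lambda x: sum(v * v for v in x)
best, val = pso(sphere, dim=3)
print(val)
```

In a real tuning setup, `f` would run the closed-loop simulation and return a step-response cost instead of this toy function.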
Impact of heuristics in clustering large biological networks.
Shafin, Md Kishwar; Kabir, Kazi Lutful; Ridwan, Iffatur; Anannya, Tasmiah Tamzid; Karim, Rashid Saadman; Hoque, Mohammad Mozammel; Rahman, M Sohel
2015-12-01
Traditional clustering algorithms often exhibit poor performance for large networks. On the contrary, greedy algorithms are found to be relatively efficient while uncovering functional modules from large biological networks. The quality of the clusters produced by these greedy techniques largely depends on the underlying heuristics employed. Different heuristics based on different attributes and properties perform differently in terms of the quality of the clusters produced. This motivates us to design new heuristics for clustering large networks. In this paper, we have proposed two new heuristics and analyzed the performance thereof after incorporating those with three different combinations in a recently celebrated greedy clustering algorithm named SPICi. We have extensively analyzed the effectiveness of these new variants. The results are found to be promising. Copyright © 2015 Elsevier Ltd. All rights reserved.
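The seed-expansion style of greedy clustering described above can be sketched roughly as follows (illustrative only: an unweighted graph, a plain density cutoff, and a "most edges into the cluster" pick stand in for SPICi's actual weighted heuristics; the graph data are invented):

```python
def density(cluster, adj):
    # Fraction of possible edges present inside the cluster
    n = len(cluster)
    if n < 2:
        return 1.0
    edges = sum(len(adj[v] & cluster) for v in cluster) / 2
    return edges / (n * (n - 1) / 2)

def greedy_cluster(adj, seed, min_density=0.5):
    # Grow a cluster from the seed, adding one neighbour at a time,
    # until no addition keeps the cluster dense enough.
    cluster = {seed}
    while True:
        frontier = set().union(*(adj[v] for v in cluster)) - cluster
        if not frontier:
            break
        # Heuristic choice: the neighbour with most edges into the cluster
        best = max(frontier, key=lambda v: len(adj[v] & cluster))
        if density(cluster | {best}, adj) < min_density:
            break
        cluster.add(best)
    return cluster

# Invented toy network: a dense triangle 1-2-3 with a sparse tail 4-5-6
adj = {
    1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4},
    4: {3, 5}, 5: {4, 6}, 6: {5},
}
print(sorted(greedy_cluster(adj, 1, min_density=0.6)))
```

Swapping the selection heuristic (the `max` key) is exactly the kind of variation the paper evaluates: different attributes yield clusters of different quality on the same greedy skeleton.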
Design of an operational transconductance amplifier applying multiobjective optimization techniques
Directory of Open Access Journals (Sweden)
Roberto Pereira-Arroyo
2014-02-01
Full Text Available In this paper, the problem at hand consists in the sizing of an Operational Transconductance Amplifier (OTA). The Pareto front is introduced as a useful analysis concept in order to explore the design space of such an analog circuit. A genetic algorithm (GA) is employed to automatically detect this front in a process that efficiently finds optimal parameterizations and their corresponding values in an aggregate fitness space. Since the problem is treated as a multi-objective optimization task, different measures of the amplifier, such as the transconductance, the slew rate, the linear range and the input capacitance, are used as fitness functions. Finally, simulation results are presented, using a standard 0.5 μm CMOS technology.
Grey Wolf Optimizer Based on Powell Local Optimization Method for Clustering Analysis
Sen Zhang; Yongquan Zhou
2015-01-01
One heuristic evolutionary algorithm recently proposed is the grey wolf optimizer (GWO), inspired by the leadership hierarchy and hunting mechanism of grey wolves in nature. This paper presents an extended GWO algorithm based on Powell local optimization method, and we call it PGWO. PGWO algorithm significantly improves the original GWO in solving complex optimization problems. Clustering is a popular data analysis and data mining technique. Hence, the PGWO could be applied in solving cluster...
Optimization techniques for OpenCL-based linear algebra routines
Kozacik, Stephen; Fox, Paul; Humphrey, John; Kuller, Aryeh; Kelmelis, Eric; Prather, Dennis W.
2014-06-01
The OpenCL standard for general-purpose parallel programming allows a developer to target highly parallel computations towards graphics processing units (GPUs), CPUs, co-processing devices, and field programmable gate arrays (FPGAs). The computationally intense domains of linear algebra and image processing have shown significant speedups when implemented in the OpenCL environment. A major benefit of OpenCL is that a routine written for one device can be run across many different devices and architectures; however, a kernel optimized for one device may not exhibit high performance when executed on a different device. For this reason kernels must typically be hand-optimized for every target device family. Due to the large number of parameters that can affect performance, hand tuning for every possible device is impractical and often produces suboptimal results. For this work, we focused on optimizing the general matrix multiplication routine. General matrix multiplication is used as a building block for many linear algebra routines and often comprises a large portion of the run-time. Prior work has shown this routine to be a good candidate for high-performance implementation in OpenCL. We selected several candidate algorithms from the literature that are suitable for parameterization. We then developed parameterized kernels implementing these algorithms using only portable OpenCL features. Our implementation queries device information supplied by the OpenCL runtime and utilizes this as well as user input to generate a search space that satisfies device and algorithmic constraints. Preliminary results from our work confirm that optimizations are not portable from one device to the next, and show the benefits of automatic tuning. Using a standard set of tuning parameters seen in the literature for the NVIDIA Fermi architecture achieves a performance of 1.6 TFLOPS on an AMD 7970 device, while automatically tuning achieves a peak of 2.7 TFLOPS
Optimal Bangla Keyboard Layout using Data Mining Technique
Kamruzzaman, S. M.; Alam, Md. Hijbul; Masum, Abdul Kadar Muhammad; Hassan, Md Mahadi
2010-01-01
This paper presents an optimal Bangla Keyboard Layout, which distributes the load equally on both hands so as to maximize ease and minimize effort. The Bangla alphabet has a large number of letters, which makes it difficult to type fast using a Bangla keyboard. Our proposed keyboard will maximize the speed of the operator as they can type with both hands in parallel. Here we use the association rules of data mining to distribute the Bangla characters on the keyboard. First, we analyze the freq...
THE METHOD OF FORMING THE PIGGYBACK TECHNOLOGIES USING THE AUTOMATED HEURISTIC SYSTEM
Directory of Open Access Journals (Sweden)
Ye. Nahornyi
2015-07-01
Full Text Available In order to choose a rational piggyback technology, a method is offered that envisages improving the automated system by giving it a heuristic nature. The automated system is based on a set of methods, techniques and strategies aimed at creating optimal resource-saving technologies, which makes it possible to take into account with maximum efficiency the interests of all the participants in the delivery process. When organizing piggyback traffic, coordination of operations between the piggyback traffic participants is presupposed in order to minimize the cargo travel time.
A HIGH PERFORMANCE OPTIMIZATION TECHNIQUE FOR POLE BALANCING PROBLEM
Directory of Open Access Journals (Sweden)
Bahadır KARASULU
2008-02-01
Full Text Available High performance computing techniques can be used effectively for the solution of complex scientific problems. The pole balancing problem is a basic benchmark tool of the robotics field, which is an important area of Artificial Intelligence research. In this study, a solution is developed for the pole balancing problem using an Artificial Neural Network (ANN) and a high performance computation technique. The algorithm, which is based on the Reinforcement Learning method used to find the force that balances the pole, is transferred to a parallel environment. In the implementation, C is preferred as the programming language and the Message Passing Interface (MPI) is used as the parallel computation technique. The Self-Organizing Map (SOM) ANN model's neurons (artificial neural nodes) and their weights are distributed across six processors of a server computer, each a quad-core processor (24 cores in total). In this way, performance values are obtained for different numbers of artificial neural nodes. The success of the method is discussed based on the results.
Social heuristics shape intuitive cooperation.
Rand, David G; Peysakhovich, Alexander; Kraft-Todd, Gordon T; Newman, George E; Wurzbacher, Owen; Nowak, Martin A; Greene, Joshua D
2014-04-22
Cooperation is central to human societies. Yet relatively little is known about the cognitive underpinnings of cooperative decision making. Does cooperation require deliberate self-restraint? Or is spontaneous prosociality reined in by calculating self-interest? Here we present a theory of why (and for whom) intuition favors cooperation: cooperation is typically advantageous in everyday life, leading to the formation of generalized cooperative intuitions. Deliberation, by contrast, adjusts behaviour towards the optimum for a given situation. Thus, in one-shot anonymous interactions where selfishness is optimal, intuitive responses tend to be more cooperative than deliberative responses. We test this 'social heuristics hypothesis' by aggregating across every cooperation experiment using time pressure that we conducted over a 2-year period (15 studies and 6,910 decisions), as well as performing a novel time pressure experiment. Doing so demonstrates a positive average effect of time pressure on cooperation. We also find substantial variation in this effect, and show that this variation is partly explained by previous experience with one-shot lab experiments.
New insights into diversification of hyper-heuristics.
Ren, Zhilei; Jiang, He; Xuan, Jifeng; Hu, Yan; Luo, Zhongxuan
2014-10-01
There has been a growing research trend of applying hyper-heuristics for problem solving, due to their ability to balance intensification and diversification with low-level heuristics. Traditionally, the diversification mechanism is mostly realized by perturbing the incumbent solutions to escape from local optima. In this paper, we report our attempt toward providing a new diversification mechanism based on the concept of instance perturbation. In contrast to existing approaches, the proposed mechanism achieves diversification by perturbing the instance being solved, rather than the solutions. To tackle the challenge of incorporating instance perturbation into hyper-heuristics, we also design a new hyper-heuristic framework HIP-HOP (recursive acronym: HIP-HOP is an instance perturbation-based hyper-heuristic optimization procedure), which employs a grammar-guided high-level strategy to manipulate the low-level heuristics. With the expressive power of the grammar, constraints such as the feasibility of the output solution can easily be satisfied. Numerical results and statistical tests over both Ising spin glass and p-median problem instances show that HIP-HOP achieves promising performance. Furthermore, runtime distribution analysis reveals that, although relatively slow at the beginning, HIP-HOP is able to reach competitive solutions once given sufficient time.
Is there one optimal repair technique for all composites?
Loomans, B.A.C.; Cardoso, M.V.; Roeters, F.J.M.; Opdam, N.J.M.; Munck, J. De; Huysmans, M.C.D.N.J.M.; Meerbeek, B. Van
2011-01-01
OBJECTIVES: The aim of this study was to investigate the effectiveness of a variety of techniques to bond new composite to artificially aged composite of different compositions. METHODS: Composite resin blocks were made of five different commercially available composites (n=30) (Clearfil AP-X,
Optimizing Nuclear Reactor Operation Using Soft Computing Techniques
Entzinger, J.O.; Ruan, D.; Kahraman, Cengiz
2006-01-01
The strict safety regulations for nuclear reactor control make it difficult to implement new control techniques such as fuzzy logic control (FLC). FLC, however, can provide very desirable advantages over classical control, like robustness, adaptation and the capability to include human experience into
DEFF Research Database (Denmark)
Kong, Fanrong; Jiang, Jianhui; Ding, Zhigang
2017-01-01
To alleviate the emission of greenhouse gas and the dependence on fossil fuel, Plug-in Hybrid Electrical Vehicles (PHEVs) have gained increasing popularity in recent decades. Due to the fluctuating electricity prices in the power market, a charging schedule is very influential to driving cost. Although the next-day electricity prices can be obtained in a day-ahead power market, a driving plan is not easily made in advance. Although PHEV owners can input a next-day plan into a charging system, e.g., an aggregator, a day ahead, this is a tedious task to do every day. Moreover, the driving plan may not be very accurate. To address this problem, in this paper, we analyze energy demands from a PHEV owner's historical driving records and build a personalized statistical driving model. Based on the model and the electricity spot prices, a rolling optimization strategy is proposed to help make...
Electric power systems advanced forecasting techniques and optimal generation scheduling
Catalão, João P S
2012-01-01
Overview of Electric Power Generation SystemsCláudio MonteiroUncertainty and Risk in Generation SchedulingRabih A. JabrShort-Term Load ForecastingAlexandre P. Alves da Silva and Vitor H. FerreiraShort-Term Electricity Price ForecastingNima AmjadyShort-Term Wind Power ForecastingGregor Giebel and Michael DenhardPrice-Based Scheduling for GencosGovinda B. Shrestha and Songbo QiaoOptimal Self-Schedule of a Hydro Producer under UncertaintyF. Javier Díaz and Javie
OPTIMIZATION OF GRID RESOURCE SCHEDULING USING PARTICLE SWARM OPTIMIZATION ALGORITHM
Directory of Open Access Journals (Sweden)
S. Selvakrishnan
2010-10-01
Full Text Available Job allocation is one of the big issues in the grid environment and an active research area in Grid Computing. Hence, a new line of research has developed to design optimal methods, focusing on heuristic techniques that provide an optimal or near-optimal solution for large grids. Combining grid resource scheduling with the Particle Swarm Optimization (PSO) algorithm, the proposed scheduler allocates an application to a host from a pool of available hosts and applications by selecting the best match. The PSO-based algorithm is effective in grid resource scheduling, reducing both execution time and completion time.
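The PSO-based matching described above can be illustrated with a minimal sketch. All names, the makespan objective, and the parameter values below are our own illustrative assumptions, not the paper's implementation: each particle encodes a continuous position per job, rounded to a host index, and the swarm minimizes the load of the busiest host.

```python
import random

def pso_schedule(job_costs, n_hosts, n_particles=20, iters=100, seed=1):
    """Minimal PSO sketch for mapping jobs to hosts (hypothetical setup:
    job_costs[j] is job j's runtime; the objective is the makespan of
    the busiest host). Continuous positions are rounded to host indices."""
    rng = random.Random(seed)
    n_jobs = len(job_costs)

    def makespan(pos):
        load = [0.0] * n_hosts
        for j, x in enumerate(pos):
            load[int(x) % n_hosts] += job_costs[j]
        return max(load)

    # Initialise particles, velocities, and personal/global bests.
    xs = [[rng.uniform(0, n_hosts) for _ in range(n_jobs)] for _ in range(n_particles)]
    vs = [[0.0] * n_jobs for _ in range(n_particles)]
    pbest = [list(x) for x in xs]
    pcost = [makespan(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pcost[i])
    gbest, gcost = list(pbest[g]), pcost[g]

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_jobs):
                r1, r2 = rng.random(), rng.random()
                # Standard velocity update: inertia + cognitive + social terms.
                vs[i][d] = (0.7 * vs[i][d]
                            + 1.4 * r1 * (pbest[i][d] - xs[i][d])
                            + 1.4 * r2 * (gbest[d] - xs[i][d]))
                xs[i][d] = min(max(xs[i][d] + vs[i][d], 0.0), n_hosts - 1e-9)
            c = makespan(xs[i])
            if c < pcost[i]:
                pbest[i], pcost[i] = list(xs[i]), c
                if c < gcost:
                    gbest, gcost = list(xs[i]), c
    return [int(x) % n_hosts for x in gbest], gcost
```

For six jobs of sizes 3, 3, 2, 2, 1, 1 on two hosts, the best reachable makespan is 6 (loads 6/6), which the swarm typically finds within a few iterations.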
Usable guidelines for usable websites? an analysis of five e-government heuristics
Welle Donker-Kuijer, M.C.J.; de Jong, Menno D.T.; Lentz, Leo
2010-01-01
Many government organizations use web heuristics for the quality assurance of their websites. Heuristics may be used by web designers to guide the decisions about a website in development, or by web evaluators to optimize or assess the quality of an existing website. Despite their popularity, very
Nicola, V.F.; Zaburnenko, T.S.
2006-01-01
In this paper we propose a state-dependent importance sampling heuristic to estimate the probability of population overflow in feed-forward networks. This heuristic attempts to approximate the "optimal" state-dependent change of measure without the need for difficult analysis or costly
Mixed Integer Programming and Heuristic Scheduling for Space Communication
Lee, Charles H.; Cheung, Kar-Ming
2013-01-01
An optimal planning and scheduling approach for a communication network was created in which the nodes within the network communicate at the highest possible rates while meeting mission requirements and operational constraints. The planning and scheduling problem was formulated in the framework of Mixed Integer Programming (MIP); a special penalty function was introduced to convert the MIP problem into a continuous optimization problem, and the constrained optimization problem was solved using heuristic optimization. The communication network consists of space and ground assets, with the link dynamics between any two assets varying with respect to time, distance, and telecom configurations. One asset could be communicating with another at very high data rates at one time, while at other times communication is impossible, as the asset could be inaccessible from the network due to planetary occultation. Based on the network's geometric dynamics and link capabilities, the start time, end time, and link configuration of each view period are selected to maximize the communication efficiency within the network. Mathematical formulations for the constrained mixed integer optimization problem were derived, and efficient analytical and numerical techniques were developed to find the optimal solution. By setting up the problem using MIP, the search space for the optimization problem is reduced significantly, thereby speeding up the solution process. The ratio of the dimension of the traditional method over the proposed formulation is approximately of order N (single) to 2*N (arraying), where N is the number of receiving antennas of a node. By introducing the special penalty function, the MIP problem with a non-differentiable cost function and nonlinear constraints can be converted into a continuous-variable problem that can be solved.
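The penalty-function idea mentioned above, folding constraints into the objective so that an unconstrained heuristic can search a continuous space, can be sketched generically. The quadratic penalty form and the weight `mu` below are textbook choices, not the special penalty function derived in the paper:

```python
def penalized(cost, constraints, mu=1e3):
    """Generic penalty-method sketch: given a cost function and a list of
    constraint functions g_i with the convention g_i(x) <= 0 when feasible,
    return a single continuous objective f(x) = cost(x) + mu * sum of
    squared violations. An unconstrained optimizer can then minimize f."""
    def f(x):
        violation = sum(max(0.0, g(x)) ** 2 for g in constraints)
        return cost(x) + mu * violation
    return f
```

For example, minimizing x^2 subject to x >= 1 (written as g(x) = 1 - x <= 0) gives a penalized objective whose minimum sits at the constraint boundary x = 1 for large enough `mu`.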
LDPC code optimization techniques to improve the error correction threshold
Directory of Open Access Journals (Sweden)
Роман Сергійович Новиков
2015-11-01
Full Text Available Non-empty stopping sets, which are the main reason the error threshold is reached in data transmission channels, are studied. A new algorithm for finding the smallest stopping sets and the stop distance of any LDPC code is proposed, together with a more functional and flexible splitting-and-filling technique. The time within which the smallest stopping sets are found and the stop distance of any LDPC code is determined is also calculated.
Power Optimization Techniques for Next Generation Wireless Networks
Ratheesh R; Vetrivelan P
2016-01-01
The massive data traffic and the need for high-speed wireless communication are increasing day by day, corresponding to an exponential increase in the power consumed by the Information and Communication Technology (ICT) sector. Reducing power consumption in wireless networks is a challenging topic and has attracted the attention of researchers around the globe. Many techniques like multiple-input multiple-output (MIMO), cognitive radio, cooperative heterogeneous communications and new netwo...
Formative Research on the Heuristic Task Analysis.
Reigeluth, Charles M.; Lee, Ji-Yeon; Peterson, Bruce; Chavez, Michael
Corporate and educational settings increasingly require decision-making, problem-solving and other complex cognitive skills to handle ill-structured, or heuristic, tasks, but the growing need for heuristic task expertise has outpaced the refinement of task analysis methods for heuristic expertise. The Heuristic Task Analysis (HTA) Method was…
A Comparison of Genetic Programming Variants for Hyper-Heuristics
Energy Technology Data Exchange (ETDEWEB)
Harris, Sean [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2015-03-01
Modern society is faced with ever more complex problems, many of which can be formulated as generate-and-test optimization problems. General-purpose optimization algorithms are not well suited for real-world scenarios where many instances of the same problem class need to be repeatedly and efficiently solved, such as routing vehicles over highways with constantly changing traffic flows, because they are not targeted to a particular scenario. Hyper-heuristics automate the design of algorithms to create a custom algorithm for a particular scenario. Hyper-heuristics typically employ Genetic Programming (GP) and this project has investigated the relationship between the choice of GP and performance in Hyper-heuristics. Results are presented demonstrating the existence of problems for which there is a statistically significant performance differential between the use of different types of GP.
Energy Technology Data Exchange (ETDEWEB)
Toelle, F.J.
1997-12-01
Process simulation is routine in chemical engineering and process analysis. This article traces the early developments of process simulation, or flowsheeting. Dramatically new expectations and visions are emerging for software tools used in chemical process modeling and simulation. Many companies anticipate a rapid migration of process modeling software to an open architecture. The software components exploit object-oriented pragmatics, including abstraction, encapsulation, inheritance and polymorphism. We discuss the software architecture of tools supporting process synthesis and operations optimization. (orig.) [German abstract, translated: By simulation we mean an experimental procedure in which certain properties of an actual or imagined technical, economic or biological system are investigated not on the original itself but on a suitable model of the original, the so-called simulator. In process and plant engineering these are primarily functional and system-dynamic properties, for example a power plant unit, which we reproduce as faithfully as possible from the original. Only those aspects of the real behaviour that the modeller deems necessary are reproduced. A good simulator generally provides a convenient and comprehensive, time- and cost-effective, occasionally even the only practicable way to study all operating states and properties of the original. (orig.)]
Interliminal Design: Understanding cognitive heuristics to mitigate design distortion
Directory of Open Access Journals (Sweden)
Andrew McCollough
2014-12-01
Full Text Available Cognitive heuristics are mental shortcuts adapted over time to enable rapid interpretation of our complex environment. They are intrinsic to human cognition and resist modification. Heuristics applied outside the context to which they are best suited are termed cognitive bias, and are the cause of systematic errors in judgment and reasoning. As both a cognitive and intuitive discipline, design by individuals is vulnerable to context-inappropriate heuristic usage. Designing in groups can act positively to counterbalance these tendencies, but is subject to heuristic misuse and biases particular to social environments. Mismatch between desired and actual outcomes– termed here, design distortion – occurs when such usage goes unnoticed and unaddressed, and can affect multiple dimensions of a system. We propose a methodology, interliminal design, emerging from the Program in Collaborative Design at Pacific Northwest College of Art, to specifically address the influence of cognitive heuristics in design. This adaptive approach involves reflective, dialogic, inquiry-driven practices intended to increase awareness of heuristic usage, and identify aspects of the design process vulnerable to misuse on both individual and group levels. By facilitating the detection and mitigation of potentially costly errors in judgment and decision-making that create distortion, such metacognitive techniques can meaningfully improve design.
New completion technique optimizes multiple zone frac treatment
Energy Technology Data Exchange (ETDEWEB)
Livingston, D.; Kastrop, J.E.
1974-04-01
A big step toward optimum treatment of multiple zones was made recently in a new technique developed by Lone Star Producing Co.'s E. Texas District, Halliburton Services, and Otis Engineering Corp. The novel approach involves isolating and breaking down each zone prior to limited entry fracturing in one continuous pumping operation down the casing. Complete pressure control is maintained at all times eliminating any need to kill the well. Although the new method evolved from problems peculiar to some formations found in E. Texas, its application opens the way to improving productivity in other formations with similar characteristics. Wells completed in the Travis Peak Sandstone involved up to 14 potentially gas-producing zones spanning an interval as much as 1,500 ft. In the case history described, it was not economically feasible to isolate and treat each zone by retrievable packers on the tubing. Equally important, response to exposure to kill fluids used in this procedure usually resulted in disappointing results. The completion technique is described in detail.
Sreekanta Murthy, T.
1992-01-01
Results of the investigation of formal nonlinear programming-based numerical optimization techniques of helicopter airframe vibration reduction are summarized. The objective and constraint function and the sensitivity expressions used in the formulation of airframe vibration optimization problems are presented and discussed. Implementation of a new computational procedure based on MSC/NASTRAN and CONMIN in a computer program system called DYNOPT for optimizing airframes subject to strength, frequency, dynamic response, and dynamic stress constraints is described. An optimization methodology is proposed which is thought to provide a new way of applying formal optimization techniques during the various phases of the airframe design process. Numerical results obtained from the application of the DYNOPT optimization code to a helicopter airframe are discussed.
Optimization of hydrostatic transmissions by means of virtual instrumentation technique
Ion Guta, Dragos Daniel; Popescu, Teodor Costinel; Dumitrescu, Catalin
2010-11-01
Obtaining mathematical models that are as close as possible to the physical phenomena to be replicated or improved helps us decide how to optimize them. The introduction of computers for monitoring and controlling processes changed technological systems. With support from process identification methods and the power of numerical computing equipment, researchers and designers can shorten the development period of applications in various fields by generating a solution as close to reality as possible from the design stage onwards [1]. The paper presents a hybrid modeling/simulation solution for a hydrostatic transmission with mixed adjustment. For simulation and control of the examined process we used two distinct environments, AMESim and LabVIEW. The proposed solution allows coupling of the system's model to software control modules developed using virtual instrumentation. The simulation network of the analyzed system was "tuned" and validated against an actual model of the process. This paper highlights some energy and functional advantages of hydraulic transmissions based on adjustable volumetric machines in their primary and secondary sectors [2].
Pre-synthesis Optimization for Asynchronous Circuits Using Compiler Techniques
Zamanzadeh, Sharareh; Najibi, Mehrdad; Pedram, Hossein
The effectiveness of traditional compiler techniques, employed in high-level synthesis of synchronous circuits to produce generic code, is studied for asynchronous synthesis by considering the special features of these circuits. Compiler methods can be used innovatively to improve synthesis results in both power consumption and area. Compiler methods like speculation, loop-invariant code motion and condition expansion are applicable in decreasing the mass of handshaking circuits and intermediate modules. Moreover, they eliminate conditional access to variables and ports and reduce the amount of completion-detection circuitry. The approach is superimposed onto the Persia synthesis toolset as a pre-synthesis source-to-source transformation phase, and results show on average 22% improvement in area and 24% in power consumption for asynchronous benchmarks.
Development of a 3D log sawing optimization system for small sawmills in central Appalachia, US
Wenshu Lin; Jingxin Wang; Edward. Thomas
2011-01-01
A 3D log sawing optimization system was developed to perform log generation, opening face determination, sawing simulation, and lumber grading using 3D modeling techniques. Heuristic and dynamic programming algorithms were used to determine opening face and grade sawing optimization. Positions and shapes of internal log defects were predicted using a model developed by...
How to Speed up Optimization? Opposite-Center Learning and Its Application to Differential Evolution
Xu, H.; Erdbrink, C.D.; Krzhizhanovskaya, V.V.
2015-01-01
This paper introduces a new sampling technique called Opposite-Center Learning (OCL) intended for convergence speed-up of meta-heuristic optimization algorithms. It comprises an extension of Opposition-Based Learning (OBL), a simple scheme that manages to boost numerous optimization methods by
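As background for the abstract above, plain Opposition-Based Learning, the scheme OCL extends, admits a short sketch. The function name and its use for population initialisation are illustrative assumptions; OCL itself differs from this baseline:

```python
import random

def obl_init(fitness, bounds, n, seed=0):
    """Opposition-Based Learning initialisation (sketch): for each random
    candidate x, also evaluate its opposite a + b - x in every dimension
    [a, b], and keep whichever of the pair is fitter (lower fitness).
    This tends to start the search closer to good regions at no extra
    sampling cost beyond one additional evaluation per candidate."""
    rng = random.Random(seed)
    pop = []
    for _ in range(n):
        x = [rng.uniform(a, b) for a, b in bounds]
        opp = [a + b - xi for (a, b), xi in zip(bounds, x)]
        pop.append(min((x, opp), key=fitness))
    return pop
```

By construction, every retained point is at least as fit as its opposite, which is the property OBL-style schemes exploit to speed up meta-heuristics.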
An Ant Colony based Hyper-Heuristic Approach for the Set Covering Problem
Directory of Open Access Journals (Sweden)
Alexandre Silvestre FERREIRA
2015-12-01
Full Text Available The Set Covering Problem (SCP) is an NP-hard combinatorial optimization problem that is challenging for meta-heuristic algorithms. In the optimization literature, several meta-heuristic approaches have been developed to tackle the SCP, and the quality of their results depends highly on customized operators that demand great effort from researchers and practitioners. To alleviate the complexity of designing meta-heuristics, a methodology called hyper-heuristic has emerged as a possible solution. A hyper-heuristic dynamically selects simple low-level heuristics according to their performance, alleviating the design complexity of the problem solver while obtaining satisfactory results. In a previous study, we proposed a hyper-heuristic approach based on Ant Colony Optimization (ACO-HH) for solving the SCP. This paper extends our previous efforts, presenting better results and a deeper analysis of ACO-HH parameters and behavior, especially regarding the selection of low-level heuristics. The paper also presents a comparison with an ACO meta-heuristic customized for the SCP.
Heuristic space diversity control for improved meta-hyper-heuristic performance
CSIR Research Space (South Africa)
Grobler, J
2015-04-01
Full Text Available This paper expands on the concept of heuristic space diversity and investigates various strategies for the management of heuristic space diversity within the context of a meta-hyper-heuristic algorithm in search of greater performance benefits...
Intensified crystallization in complex media: heuristics for crystallization of platform chemicals
Urbanus, J.; Roelands, C.P.M.; Verdoes, D.; Horst, J.H. ter
2012-01-01
This paper presents heuristics for the integration of fermentation with the appropriate crystallization based in-situ product recovery (ISPR) technique. Here techniques, such as co-crystallization (CC), evaporative crystallization (EC), template induced crystallization (TIC), cooling crystallization
Economic Load Dispatch Using Grey Wolf Optimization
Dr.Sudhir Sharma; Shivani Mehta
2015-01-01
This paper presents grey wolf optimization (GWO) to solve the convex economic load dispatch (ELD) problem. GWO is a new meta-heuristic inspired by grey wolves; it mimics the leadership hierarchy and hunting mechanism of a wolf pack. The objective of the ELD problem is to minimize the total generation cost while fulfilling the various constraints as the required load of the power system is supplied. The proposed technique is implemented on two d...
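The hierarchy-and-hunting update the abstract refers to can be sketched with the standard GWO position rule (a generic textbook implementation of Mirjalili-style GWO on an unconstrained test function, not the paper's ELD-specific code; all parameter values are illustrative):

```python
import random

def gwo_minimize(f, bounds, n_wolves=12, iters=200, seed=3):
    """Minimal Grey Wolf Optimizer sketch: the three best wolves (alpha,
    beta, delta) guide the rest; each follower moves to the average of the
    positions suggested by the three leaders, with the coefficient a
    decreasing linearly from 2 to 0 (exploration -> exploitation)."""
    rng = random.Random(seed)
    dim = len(bounds)
    wolves = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_wolves)]
    for t in range(iters):
        wolves.sort(key=f)                       # best wolves first
        alpha, beta, delta = wolves[0], wolves[1], wolves[2]
        a = 2.0 * (1 - t / iters)                # linearly decreasing coefficient
        for w in wolves[3:]:
            for d in range(dim):
                x = 0.0
                for leader in (alpha, beta, delta):
                    A = a * (2 * rng.random() - 1)
                    C = 2 * rng.random()
                    # X_i = leader - A * |C * leader - w|, per the GWO update.
                    x += leader[d] - A * abs(C * leader[d] - w[d])
                lo, hi = bounds[d]
                w[d] = min(max(x / 3.0, lo), hi)  # average of the three pulls
    return min(wolves, key=f)
```

On a 2-D sphere function the pack contracts quickly around the origin, which is the unimodal behaviour GWO is known for.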
The implementation frameworks of meta-heuristics hybridization with ...
African Journals Online (AJOL)
The hybridization of meta-heuristics algorithms has achieved a remarkable improvement from the adaptation of dynamic parameterization. This paper proposes a variety of implementation frameworks for the hybridization of Particle Swarm Optimization (PSO) and Genetic Algorithm (GA) and the dynamic parameterization.
The afforestation problem: a heuristic method based on simulated annealing
DEFF Research Database (Denmark)
Vidal, Rene Victor Valqui
1992-01-01
This paper presents the afforestation problem, that is the location and design of new forest compartments to be planted in a given area. This optimization problem is solved by a two-step heuristic method based on simulated annealing. Tests and experiences with this method are also presented....
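The simulated-annealing core that such a two-step heuristic builds on can be sketched generically (this is the textbook acceptance rule, not the paper's afforestation-specific code; the interface and parameter values are illustrative):

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, cooling=0.995, iters=2000, seed=0):
    """Generic simulated-annealing loop: always accept improving moves,
    and accept a worsening move with probability exp(-delta / T), where
    the temperature T decays geometrically. Returns the best state seen."""
    rng = random.Random(seed)
    x, cx = x0, cost(x0)
    best, cbest = x, cx
    t = t0
    for _ in range(iters):
        y = neighbor(x, rng)
        cy = cost(y)
        if cy < cx or rng.random() < math.exp(-(cy - cx) / max(t, 1e-12)):
            x, cx = y, cy                 # move accepted
            if cx < cbest:
                best, cbest = x, cx       # track the incumbent best
        t *= cooling
    return best, cbest
```

For a location/design problem like the one above, `neighbor` would relocate or reshape a compartment; here a 1-D quadratic with Gaussian moves suffices to show the loop converging.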
Intelligent perturbation algorithms for space scheduling optimization
Kurtzman, Clifford R.
1991-01-01
Intelligent perturbation algorithms for space scheduling optimization are presented in the form of viewgraphs. The following subject areas are covered: optimization of planning, scheduling, and manifesting; searching a discrete configuration space; heuristic algorithms used for optimization; use of heuristic methods on a sample scheduling problem; intelligent perturbation algorithms as iterative refinement techniques; properties of a good iterative search operator; dispatching examples of intelligent perturbation algorithm and perturbation operator attributes; scheduling implementations using intelligent perturbation algorithms; major advances in scheduling capabilities; the prototype ISF (Industrial Space Facility) experiment scheduler; optimized schedule (max revenue); multi-variable optimization; Space Station design reference mission scheduling; ISF-TDRSS command scheduling demonstration; and an example task: communications check.
DEFF Research Database (Denmark)
Thummala, Prasanth; Schneider, Henrik; Zhang, Zhe
2015-01-01
The energy efficiency is optimized using a proposed new automatic winding layout (AWL) technique and a comprehensive loss model. The AWL technique generates a large number of transformer winding layouts. The transformer parasitics such as dc resistance, leakage inductance and self-capacitance are calculated for each winding layout. An optimization technique is formulated to minimize the sum of energy losses during charge and discharge operations. The efficiency and energy loss distribution results from the optimization routine provide a deep insight into the high voltage transformer design and its impact on the total converter efficiency. The proposed efficiency optimization approach is experimentally verified on a 25 W (average charging power), 100 W (peak power) flyback dc-dc prototype.
Directory of Open Access Journals (Sweden)
Joao CARDOSO NETO
2012-01-01
Full Text Available Chile is a country with great attractions for tourists, in South America and worldwide. Among the many Chilean tourist attractions, the city of Vina del Mar is one of the highlights, recognized nationally and internationally as one of the most beautiful summer destinations. In Vina del Mar tourists have many leisure options: besides pretty beaches, e.g. Playa Renaca, the city has beautiful squares and castles, e.g. Castillo Wulff, built more than 100 (one hundred) years ago. Notably, five (5) tourist itineraries already exist there, so this work was developed to determine the best routes for these existing itineraries and to create a single route that includes all the tourist points in Vina del Mar. In this way, tourists visiting the city can minimize the time spent traveling and optimize their leisure, taking the opportunity to see all the city's attractions. To determine shorter routes and then propose suggestions for improving the quality of the tourist service offered, we used both an exact method, solving the mathematical model of the TSP (Traveling Salesman Problem), and a heuristic method, the most economic (cheapest) insertion algorithm.
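The most economic (cheapest) insertion heuristic named above admits a compact sketch. The function name and the symmetric distance-matrix interface are our assumptions; the abstract does not give the authors' implementation:

```python
def cheapest_insertion(dist, start=0):
    """Cheapest-insertion tour construction: begin with the start city and
    its nearest neighbour, then repeatedly insert the city whose best
    insertion position increases the tour length the least.
    dist is a symmetric n x n distance matrix."""
    n = len(dist)
    tour = [start, min((j for j in range(n) if j != start),
                       key=lambda j: dist[start][j])]
    remaining = set(range(n)) - set(tour)
    while remaining:
        best = None  # (extra_cost, city, insertion_position)
        for c in sorted(remaining):          # sorted for determinism
            for i in range(len(tour)):
                a, b = tour[i], tour[(i + 1) % len(tour)]
                extra = dist[a][c] + dist[c][b] - dist[a][b]
                if best is None or extra < best[0]:
                    best = (extra, c, i + 1)
        _, c, pos = best
        tour.insert(pos, c)
        remaining.remove(c)
    return tour
```

On four cities at the corners of a unit square, the heuristic recovers the optimal tour of length 4.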
Techniques for the optimal design of photovoltaic inverters interconnected with the electric grid
DEFF Research Database (Denmark)
Koutroulis, Eftichios; Blaabjerg, Frede
2011-01-01
The DC/AC inverters are the key elements of grid-connected PV energy production systems. In this paper, a new technique for the optimal design of the power section and output filter of a full-bridge, grid-connected PV inverter is proposed. The objective function minimized during the Genetic Algorithm-based optimization procedure is the PV inverter Levelized Cost Of the Electricity generated (LCOE). The proposed method has been applied to the optimal design of PV inverters installed at various sites in Europe. The simulation results indicate that the optimal values of the PV inverter...
The use of meta-heuristics for airport gate assignment
DEFF Research Database (Denmark)
Cheng, Chun-Hung; Ho, Sin C.; Kwan, Cheuk-Lam
2012-01-01
Improper assignment of gates may result in flight delays, inefficient use of resources, and customer dissatisfaction. A typical metropolitan airport handles hundreds of flights a day, and solving the gate assignment problem (GAP) to optimality is often impractical. Meta-heuristics have recently been proposed to generate good solutions within a reasonable timeframe. In this work, we attempt to assess the performance of three meta-heuristics, namely genetic algorithm (GA), tabu search (TS) and simulated annealing (SA), as well as a hybrid approach based on SA and TS. Flight data from Incheon International Airport...
[Optimization of preparative technique for banxia-houpu effervescent tablets by orthogonal design].
Zheng, Ping; Wang, Wen-zhong; Zhang, Peng; Han, Jian; Wu, Li-zhi
2006-10-01
To optimize the preparative technique for banxia-houpu effervescent tablets. Based on the pH, disintegration time limit, taste, and rigidity of the effervescent tablets, the proper ratio of citric acid to sodium bicarbonate, as well as the proper quantities of polyethylene glycol 6000 and sodium cyclamate in the tablets, were determined using an orthogonal design. The contents of magnolol and honokiol in the effervescent tablets were measured by HPLC. The optimal preparative technique was: citric acid : sodium bicarbonate = 0.65:1, with 85% polyethylene glycol 6000 and 1.0% sodium cyclamate. The preparative technique is stable, reliable and suitable for practical use.
Towards an Understanding of Instructional Design Heuristics: An Exploratory Delphi Study
York, Cindy S.; Ertmer, Peggy A.
2011-01-01
Evidence suggests that experienced instructional designers often use heuristics and adapted models when engaged in the instructional design problem-solving process. This study used the Delphi technique to identify a core set of heuristics designers reported as being important to the success of the design process. The overarching purpose of the…
Using Heuristic Task Analysis to Create Web-Based Instructional Design Theory
Fiester, Herbert R.
2010-01-01
The first purpose of this study was to identify procedural and heuristic knowledge used when creating web-based instruction. The second purpose of this study was to develop suggestions for improving the Heuristic Task Analysis process, a technique for eliciting, analyzing, and representing expertise in cognitively complex tasks. Three expert…
Heuristic errors in clinical reasoning.
Rylander, Melanie; Guerrasio, Jeannette
2016-08-01
Errors in clinical reasoning contribute to patient morbidity and mortality. The purpose of this study was to determine the types of heuristic errors made by third-year medical students and first-year residents. The study surveyed approximately 150 clinical educators, inquiring about the types of heuristic errors they observed in third-year medical students and first-year residents. Anchoring and premature closure were the two most common errors observed in both groups, with no difference in the types of errors between them. Clinical educators perceived that third-year medical students and first-year residents committed similar heuristic errors, implying that additional medical knowledge and clinical experience do not affect the types of heuristic errors made. Further work is needed to identify methods that can reduce heuristic errors early in a clinician's education. © 2015 John Wiley & Sons Ltd.
An adaptive dual-optimal path-planning technique for unmanned air vehicles
Directory of Open Access Journals (Sweden)
Whitfield Clifford A.
2016-01-01
Full Text Available A multi-objective technique for unmanned air vehicle path-planning generation through task allocation has been developed. The dual-optimal path-planning technique generates real-time adaptive flight paths based on available flight windows and environmentally influenced objectives. The environmentally influenced flight condition determines the aircraft's optimal orientation within a downstream virtual window of possible vehicle destinations that is based on the vehicle's kinematics. The intermediate results are then pursued by a dynamic optimization technique to determine the flight path. This path-planning technique is a multi-objective optimization procedure consisting of two goals that does not require additional information to combine the conflicting objectives into a single objective. The technique was applied to solar-regenerative high-altitude long-endurance flight, which can benefit significantly from an adaptive real-time path-planning technique. The objectives were to determine the minimum-power flight paths while maintaining maximum solar power for continual surveillance over an area of interest (AOI). The simulated path generation technique prolonged the flight duration over a sustained-turn loiter flight path by approximately 2 months for a year of flight. The potential for prolonged solar-powered flight was consistent for all latitude locations, including 2 months of available flight at 60° latitude, where sustained-turn flight was no longer possible.
A greedy double swap heuristic for nurse scheduling
Directory of Open Access Journals (Sweden)
Murphy Choy
2012-10-01
Full Text Available One of the key challenges of the nurse scheduling problem (NSP) is the number of constraints placed on preparing the timetable, arising both from regulatory requirements and from the patients' demand for appropriate nursing care specialists. In addition, the preferences of the nursing staff regarding their work schedules add another dimension of complexity. Most solutions proposed for nurse scheduling involve mathematical programming and generally consider only the hard constraints. The psychological needs of the nurses are ignored, which leads to subsequent interventions by the nursing staff to remedy any deficiency and often results in last-minute changes to the schedule. In this paper, we present a staff preference optimization framework solved with a greedy double swap heuristic. The heuristic solves the problem quickly, is simple, and we demonstrate its performance by implementing it on open-source spreadsheet software.
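As an illustration of the improving-swap idea above, the sketch below applies a greedy swap pass to a toy one-nurse-per-day roster. The preference values, roster shape and stopping rule are invented for illustration and are not taken from the paper.

```python
# Hypothetical sketch of a greedy swap pass for a toy nurse roster.
# pref[n][d] is nurse n's preference for working day d (higher is better);
# the roster maps each day to one nurse. Both are illustrative.

def roster_score(roster, pref):
    """Total preference of the current day -> nurse assignment."""
    return sum(pref[n][d] for d, n in roster.items())

def greedy_double_swap(roster, pref):
    """Repeatedly swap the nurses assigned to two days when the swap
    raises total preference; stop when no improving swap exists."""
    improved = True
    while improved:
        improved = False
        days = list(roster)
        for i in range(len(days)):
            for j in range(i + 1, len(days)):
                a, b = days[i], days[j]
                gain = (pref[roster[b]][a] + pref[roster[a]][b]
                        - pref[roster[a]][a] - pref[roster[b]][b])
                if gain > 0:
                    roster[a], roster[b] = roster[b], roster[a]
                    improved = True
    return roster

pref = {"ann": {0: 5, 1: 1}, "bob": {0: 1, 1: 5}}
roster = {0: "bob", 1: "ann"}              # score 1 + 1 = 2
roster = greedy_double_swap(roster, pref)  # swapping raises the score to 10
```

Because each accepted move only exchanges the nurses assigned to two days, hard constraints encoded in the roster shape (one nurse per day) are preserved while total preference never decreases.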
Mixed Integer Programming and Heuristic Scheduling for Space Communication Networks
Lee, Charles H.; Cheung, Kar-Ming
2012-01-01
In this paper, we propose to solve the constrained optimization problem in two phases. The first phase uses heuristic methods such as the ant colony method, particle swarm optimization, and the genetic algorithm to seek a near-optimal solution among a list of feasible initial populations. The final optimal solution can be found by using the solution of the first phase as the initial condition for the SQP algorithm. We demonstrate the above problem formulation and optimization schemes with a large-scale network that includes the DSN ground stations and a number of spacecraft of deep space missions.
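The two-phase scheme described above can be sketched on a toy unconstrained problem. Here a random population stands in for the heuristic phase (GA/ant colony/PSO), and a simple coordinate descent stands in for the SQP refinement; the objective function and all parameters are illustrative, not from the paper.

```python
import random

# Phase 1 picks the best member of a random population (standing in for a
# population-based heuristic); phase 2 refines it with coordinate descent
# (standing in for the SQP step). The quadratic objective is illustrative.

def objective(x):
    return (x[0] - 3.0) ** 2 + (x[1] + 1.0) ** 2

def phase1_best(pop_size=50, seed=0):
    rng = random.Random(seed)
    pop = [[rng.uniform(-10, 10), rng.uniform(-10, 10)] for _ in range(pop_size)]
    return min(pop, key=objective)

def phase2_refine(x, step=1.0, tol=1e-6):
    x = list(x)
    while step > tol:
        moved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = list(x)
                trial[i] += delta
                if objective(trial) < objective(x):
                    x = trial
                    moved = True
        if not moved:
            step *= 0.5       # shrink the step once no move improves
    return x

x0 = phase1_best()
x_opt = phase2_refine(x0)     # converges near the minimizer (3, -1)
```

The point of the two phases is the division of labour: the population phase supplies a good feasible starting point, so the local refinement phase is unlikely to stall in a poor basin.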
Implementation of genetic algorithm technique for solving ROP detector layout optimization problem
Energy Technology Data Exchange (ETDEWEB)
Kastanya, D.; Fodor, B. [CANDU Energy Inc., Mississauga, Ontario (Canada)
2012-07-01
The regional overpower protection (ROP) systems protect CANDU® reactors against overpower in the fuel that could reduce the safety margin-to-dryout. The overpower could originate from localized power peaking within the core or a general increase in the core power level. The design of the detector layout for the ROP systems is a challenging discrete optimization problem. In recent years, two algorithms have been developed to find a quasi-optimal solution to this detector layout optimization problem, both of which utilize the simulated annealing (SA) algorithm as their optimization engine. In the present paper, an alternative optimization algorithm, namely the genetic algorithm (GA), has been implemented as the optimization engine. The implementation is done within the ADORE algorithm. Based on these preliminary studies, performed on four different sizes of ROP system, it has been demonstrated that the GA technique is able to produce good results. (author)
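A minimal sketch of the GA idea for a discrete layout problem is given below. The bit-string chromosome marks which candidate detector positions are used; the position values, budget penalty and GA parameters are invented for illustration and are not taken from the ADORE implementation.

```python
import random

# Toy GA for a discrete layout problem: evolve bit-strings toward
# higher (invented) coverage under a fixed detector budget.

VALUE = [3, 1, 4, 1, 5, 9, 2, 6]   # hypothetical worth of each position
BUDGET = 3                          # detectors available

def fitness(bits):
    used = sum(bits)
    score = sum(v for v, b in zip(VALUE, bits) if b)
    return score - 100 * max(0, used - BUDGET)   # penalise over-budget layouts

def ga(pop_size=30, n_gen=60, seed=2):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in VALUE] for _ in range(pop_size)]
    for _ in range(n_gen):
        pop.sort(key=fitness, reverse=True)
        next_pop = pop[:2]                      # elitism: keep the two best
        while len(next_pop) < pop_size:
            p1, p2 = rng.sample(pop[:10], 2)    # truncation selection
            cut = rng.randrange(1, len(VALUE))
            child = p1[:cut] + p2[cut:]         # one-point crossover
            if rng.random() < 0.2:              # bit-flip mutation
                i = rng.randrange(len(VALUE))
                child[i] ^= 1
            next_pop.append(child)
        pop = next_pop
    return max(pop, key=fitness)

best = ga()   # expected to select high-value positions within the budget
```

The budget is handled as a soft penalty rather than a hard constraint, which keeps every chromosome evaluable; this is one common GA design choice for discrete layout problems, not necessarily the one used in ADORE.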
Yang, Y.; Özgen, S.
2017-06-01
During the last few decades, CFD (Computational Fluid Dynamics) has developed greatly and has become a more reliable tool for the conceptual phase of aircraft design. This tool is generally combined with an optimization algorithm. In the optimization phase, the need to regenerate the computational mesh might become cumbersome, especially when the number of design parameters is high. For this reason, several mesh generation and deformation techniques have been developed in the past decades. One of the most widely used techniques is the spring analogy. There are numerous spring analogy related techniques reported in the literature: linear spring analogy, torsional spring analogy, semitorsional spring analogy, and ball vertex spring analogy. This paper explains the linear spring analogy method and the inclusion of angles in the spring analogy method. In the latter case, two different solution methods are proposed. The best feasible method is later used for two-dimensional (2D) airfoil design optimization, with the objective being to minimize sectional drag for a required lift coefficient at different speeds. Design variables used in the optimization include the camber and thickness distribution of the airfoil. SU2 CFD is chosen as the flow solver during the optimization procedure. The optimization is done using the Phoenix ModelCenter Optimization Tool.
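A hedged, one-dimensional sketch of the linear spring analogy follows: interior mesh nodes are joined by springs of equal stiffness (a simplification; stiffness is often taken inversely proportional to edge length), and when a boundary node is displaced, the interior relaxes to spring equilibrium. Jacobi iteration is used purely for illustration.

```python
# 1D linear spring analogy sketch: endpoints are fixed boundary nodes;
# each interior node moves to the average of its two neighbours, the
# equilibrium position of two equal-stiffness springs.

def relax_interior(x, n_iter=2000):
    x = list(x)
    for _ in range(n_iter):
        # Jacobi sweep: build the new positions from the old ones
        x = ([x[0]]
             + [(x[i - 1] + x[i + 1]) / 2 for i in range(1, len(x) - 1)]
             + [x[-1]])
    return x

# Uniform 5-node mesh on [0, 4]; push the right boundary from 4.0 to 6.0.
mesh = [0.0, 1.0, 2.0, 3.0, 6.0]
deformed = relax_interior(mesh)   # interior spreads evenly: [0, 1.5, 3, 4.5, 6]
```

At equilibrium the boundary displacement is absorbed smoothly across the interior, which is exactly the property that lets an optimizer perturb the airfoil surface without regenerating the whole mesh.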
Study of heuristics in ant system for nuclear reload optimisation
Energy Technology Data Exchange (ETDEWEB)
Lima, Alan M.M. de; Schirru, Roberto; Silva, Fernando C. da; Machado, Marcelo D.; Medeiros, Jose A.C.C. [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia (COPPE). Programa de Engenharia Nuclear]. E-mail: alan@lmp.ufrj.br; schirru@lmp.ufrj.br; fernando@con.ufrj.br; marcelo@lmp.ufrj.br; canedo@lmp.ufrj.br
2007-07-01
A Pressurized Water Reactor core must be reloaded every time the fuel burnup reaches a level at which it is no longer possible to sustain nominal power operation. The nuclear core fuel reload optimization consists in finding a burned-up and fresh-fuel-assembly loading pattern that maximizes the number of effective full power days, minimizing the cost/benefit ratio. This problem is NP-hard, meaning that its complexity grows exponentially with the number of fuel assemblies in the core. Besides that, the problem is non-linear and its search space is highly discontinuous and multimodal. In this work, a parallel computational system based on the Ant Colony System (ACS), called Artificial-Ant-Colony Networks, is used to solve the nuclear reactor core fuel reload optimization problem with compatible heuristics. ACS is a system based on artificial agents that uses the reinforcement learning technique and was originally developed to solve the Traveling Salesman Problem, which is conceptually similar to the nuclear fuel reload problem. (author)
Energy Technology Data Exchange (ETDEWEB)
Brandt, Christopher; Fieg, Georg [Hamburg University of Technology, Institute of Process and Plant Engineering, Hamburg (Germany); Luo, Xing [Helmut Schmidt University, Institute of Thermodynamics, Hamburg (Germany); University of Shanghai for Science and Technology, Institute of Thermal Engineering, Shanghai (China)
2011-08-15
In this work an innovative method for heat exchanger network (HEN) synthesis is introduced and examined. It combines a genetic algorithm (GA) with a heuristic-based optimization procedure. Throughout the evolution, the novel algorithm removes heat load loops that appear in the HEN structures whenever this is profitable. Two examples were examined with the new HEN synthesis method, and better results were obtained for both. Thus, a positive effect of heuristic-based optimization methods on HEN synthesis with GA could be identified. (orig.)
APPLICATION OF A HEURISTIC METHOD FOR THE ESTIMATION OF S-WAVE VELOCITY STRUCTURE
Directory of Open Access Journals (Sweden)
Alfaro Castillo Andrés José
2006-08-01
Full Text Available The assessment of local site effects is one of the most important subjects in engineering seismology. In order to perform such an assessment, it is necessary to determine the S-wave velocity structure of the site. Additionally, in some basins it is very important to know the deep sedimentary structure, due to the amplification of low-frequency waves. There are several techniques to achieve this purpose; probably the most inexpensive is to use the vertical component of microtremors measured with an array of seismographs. The phase velocity of Rayleigh waves is inverted to an S-wave velocity (Vs) profile using optimization techniques. Most of the time, least-squares methods have been applied in the inversion. Recently, heuristic methods have also been used to estimate the S-wave velocity structure from microtremors. In this study, seven microtremor arrays were deployed in the city of Tsukuba, located at the NE edge of the Kanto Basin, in order to estimate the deep S-wave velocity structure. The spatial autocorrelation (SPAC) method was used to determine phase velocity dispersion curves in the frequency range 0.3-2.5 Hz. The determined Vs profiles reached a depth of 750 m. Two methods were used to estimate the S-wave velocity structure: an inversion method and a heuristic method combining the Downhill Simplex Algorithm with Very Fast Simulated Annealing. Comparisons with Vs from existing PS-logging tests at the center of the array showed the reliability of the heuristic method.
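A hedged toy version of the heuristic inversion step is sketched below: simulated annealing searches a two-layer S-wave velocity model (vs1, vs2) that minimises a synthetic misfit. The "observed" values and the misfit function are invented for illustration; a real inversion would compare predicted and measured Rayleigh-wave dispersion curves.

```python
import math, random

OBSERVED = (400.0, 1200.0)   # hypothetical layer velocities, m/s

def misfit(model):
    # Illustrative stand-in for a dispersion-curve misfit
    return sum((m - o) ** 2 for m, o in zip(model, OBSERVED))

def anneal(seed=1, t0=1e6, cooling=0.995, n_steps=4000):
    rng = random.Random(seed)
    model = [rng.uniform(100, 2000), rng.uniform(100, 2000)]
    best = list(model)
    t = t0
    for _ in range(n_steps):
        trial = [v + rng.gauss(0, 50) for v in model]      # random perturbation
        d = misfit(trial) - misfit(model)
        if d < 0 or rng.random() < math.exp(-d / t):       # Metropolis rule
            model = trial
            if misfit(model) < misfit(best):
                best = list(model)
        t *= cooling                                       # geometric cooling
    return best

best = anneal()   # ends close to the 'observed' two-layer model
```

Early on, the high temperature lets the walk cross barriers in the multimodal misfit surface; as the temperature drops the search becomes effectively greedy, which is the same exploration-to-exploitation transition the paper exploits.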
Studies Regarding Design and Optimization of Mechanisms Using Modern Techniques of CAD and CAE
Directory of Open Access Journals (Sweden)
Marius Tufoi
2010-01-01
Full Text Available The paper presents applications of modern CAD (Computer Aided Design) and CAE (Computer Aided Engineering) techniques to the design and optimization of mechanisms used in mechanical engineering. The use of these techniques is exemplified by designing and optimizing parts of a drawing installation for horizontal continuous casting of metals. Applying these design methods and using the finite element method in simulations of the designed mechanisms yields a number of advantages over traditional drawing and design methods: speed in drawing, design and optimization of parts and mechanisms; kinematic, kinetostatic and dynamic analysis through simulation, without requiring physical realization of the part or mechanism; determination by the finite element method of stresses, elongations, displacements and safety factors; and the possibility of optimizing these quantities to ensure the mechanical strength of each piece separately. These studies were carried out using the SolidWorks 2009 software suite.
A novel technique for active vibration control, based on optimal tracking control
Kheiri Sarabi, Behrouz; Sharma, Manu; Kaur, Damanjeet
2017-08-01
In the last few decades, researchers have proposed many control techniques to suppress unwanted vibrations in a structure. In this work, a novel and simple technique is proposed for active vibration control. In this technique, an optimal tracking control is employed to suppress vibrations in a structure by simultaneously tracking zero references for the modes of vibration. To illustrate the technique, a two-degree-of-freedom spring-mass-damper system is considered as a test system. The mathematical model of the system is derived and then converted into a state-space model. A linear quadratic tracking control law is then used to make the disturbed system track zero references.
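A hedged sketch of the idea follows: when the references are all zero, the linear quadratic tracking problem reduces to a discrete-time LQR. The two-mass spring-damper parameters, weights and discretisation below are illustrative, not taken from the paper.

```python
import numpy as np

m1 = m2 = 1.0; k = 4.0; c = 0.4; dt = 0.01

# State x = [p1, p2, v1, v2]; one actuator force acting on mass 1.
A_c = np.array([[0, 0, 1, 0],
                [0, 0, 0, 1],
                [-2*k/m1,  k/m1, -c/m1,  0],
                [ k/m2, -2*k/m2,  0, -c/m2]])
B_c = np.array([[0.0], [0.0], [1/m1], [0.0]])
A = np.eye(4) + dt * A_c          # forward-Euler discretisation
B = dt * B_c

Q = np.eye(4); R = np.array([[0.01]])

# Infinite-horizon LQR gain via backward Riccati iteration.
P = Q.copy()
for _ in range(5000):
    K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
    P = Q + A.T @ P @ (A - B @ K)

x = np.array([[1.0], [-0.5], [0.0], [0.0]])   # initial disturbance
x0_norm = float(np.linalg.norm(x))
for _ in range(3000):
    u = -K @ x                     # track the zero reference
    x = A @ x + B @ u
final_norm = float(np.linalg.norm(x))   # vibration largely suppressed
```

Driving every modal reference to zero turns "tracking" into regulation, which is why the familiar LQR machinery applies directly.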
Conceptual and Action Heuristics: Tools for the Evaluator.
McClintock, Charles
1987-01-01
Program theory can be used to improve programs and policies. This article describes a set of techniques for complicating and simplifying program theory, referred to as conceptual and action heuristics. Methods such as analyzing metaphors, clarifying concepts, mapping, component assessment, causal modeling, and decision analysis are discussed. (JAZ)
Concentrated Hitting Times of Randomized Search Heuristics with Variable Drift
DEFF Research Database (Denmark)
Lehre, Per Kristian; Witt, Carsten
2014-01-01
Drift analysis is one of the state-of-the-art techniques for the runtime analysis of randomized search heuristics (RSHs) such as evolutionary algorithms (EAs), simulated annealing etc. The vast majority of existing drift theorems yield bounds on the expected value of the hitting time for a target...
Optimization Techniques for Design Problems in Selected Areas in WSNs: A Tutorial.
Ibrahim, Ahmed; Alfa, Attahiru
2017-08-01
This paper is intended to serve as an overview of, and mostly a tutorial to illustrate, the optimization techniques used in several different key design aspects that have been considered in the literature on wireless sensor networks (WSNs). It targets researchers who are new to the mathematical optimization tool and wish to apply it to WSN design problems. We hence divide the paper into two main parts. One part is dedicated to introducing optimization theory and giving an overview of some of its techniques that could be helpful for design problems in WSNs. In the second part, we present a number of design aspects that we came across in the WSN literature in which mathematical optimization methods have been used in the design. For each design aspect, a key paper is selected, and for each we explain the formulation techniques and the solution methods implemented. We also provide in-depth analyses and assessments of the problem formulations, the corresponding solution techniques and experimental procedures in some of these papers. The analyses and assessments, which are provided in the form of comments, are meant to reflect the points that we believe should be taken into account when using optimization as a tool for design purposes.
Optimization and optimal control in automotive systems
Kolmanovsky, Ilya; Steinbuch, Maarten; Re, Luigi
2014-01-01
This book demonstrates the use of the optimization techniques that are becoming essential to meet the increasing stringency and variety of requirements for automotive systems. It shows the reader how to move away from earlier approaches, based on some degree of heuristics, to the use of more and more common systematic methods. Even systematic methods can be developed and applied in a large number of forms so the text collects contributions from across the theory, methods and real-world automotive applications of optimization. Greater fuel economy, significant reductions in permissible emissions, new drivability requirements and the generally increasing complexity of automotive systems are among the criteria that the contributing authors set themselves to meet. In many cases multiple and often conflicting requirements give rise to multi-objective constrained optimization problems which are also considered. Some of these problems fall into the domain of the traditional multi-disciplinary optimization applie...
Plan-graph Based Heuristics for Conformant Probabilistic Planning
Ramakrishnan, Salesh; Pollack, Martha E.; Smith, David E.
2004-01-01
In this paper, we introduce plan-graph based heuristics to solve a variation of the conformant probabilistic planning (CPP) problem. In many real-world problems, the sensors are unreliable or take too many resources to provide knowledge about the environment. These domains are better modeled as conformant planning problems. POMDP-based techniques are currently the most successful approach for solving CPP but have the limitation of state-space explosion. Recent advances in deterministic and conformant planning have shown that plan-graphs can be used to enhance performance significantly. We show that this enhancement can also be translated to CPP. We describe our process for developing the plan-graph heuristics and estimating the probability of a partial plan. We compare the performance of our planner PVHPOP when used with different heuristics. We also perform a comparison with a POMDP solver to show over an order of magnitude improvement in performance.
Yang, Pengyi; Yoo, Paul D; Fernando, Juanita; Zhou, Bing B; Zhang, Zili; Zomaya, Albert Y
2014-03-01
Data sampling is a widely used technique in a broad range of machine learning problems. Traditional sampling approaches generally rely on random resampling from a given dataset. However, these approaches do not take into consideration additional information, such as sample quality and usefulness. We recently proposed a data sampling technique, called sample subset optimization (SSO). The SSO technique relies on a cross-validation procedure for identifying and selecting the most useful samples as subsets. In this paper, we describe the application of SSO techniques to imbalanced and ensemble learning problems, respectively. For imbalanced learning, the SSO technique is employed as an under-sampling technique for identifying a subset of highly discriminative samples in the majority class. In ensemble learning, the SSO technique is utilized as a generic ensemble technique where multiple optimized subsets of samples from each class are selected for building an ensemble classifier. We demonstrate the utilities and advantages of the proposed techniques on a variety of bioinformatics applications where class imbalance, small sample size, and noisy data are prevalent.
Conflict and Bias in Heuristic Judgment
Bhatia, Sudeep
2017-01-01
Conflict has been hypothesized to play a key role in recruiting deliberative processing in reasoning and judgment tasks. This claim suggests that changing the task so as to add incorrect heuristic responses that conflict with existing heuristic responses can make individuals less likely to respond heuristically and can increase response accuracy.…
An Effective Exercise for Teaching Cognitive Heuristics
Swinkels, Alan
2003-01-01
This article describes a brief heuristics demonstration and offers suggestions for personalizing examples of heuristics by making them relevant to students. Students complete a handout asking for 4 judgments illustrative of such heuristics. The decisions are cast in the context of students' daily lives at their particular university. After the…
Combined Heuristic Attack Strategy on Complex Networks
Directory of Open Access Journals (Sweden)
Marek Šimon
2017-01-01
Full Text Available Usually, the existence of a complex network is considered an advantageous feature, and efforts are made to increase its robustness against an attack. However, there also exist harmful and/or malicious networks, from social ones spreading hoaxes, corruption, phishing, extremist ideology, and terrorist support, up to computer networks spreading computer viruses or DDoS attack software, or even biological networks of carriers or transport centers spreading disease among the population. A new attack strategy can therefore be used against malicious networks, as well as in a worst-case scenario test for the robustness of a useful network. A common measure of the robustness of networks is their disintegration level after removal of a fraction of nodes. This robustness can be calculated as the ratio of the number of nodes of the greatest remaining network component to the number of nodes in the original network. Our paper presents a combination of heuristics optimized for an attack on a complex network to achieve its greatest disintegration. Nodes are deleted sequentially based on a heuristic criterion. The efficiency of classical attack approaches is compared to the proposed approach on Barabási-Albert, scale-free with tunable power-law exponent, and Erdős-Rényi models of complex networks, and on real-world networks. Our attack strategy results in faster disintegration, which is counterbalanced by slightly increased computational demands.
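The robustness measure described above can be illustrated on a toy graph: nodes are removed one by one by a heuristic criterion (here, highest remaining degree, one classical baseline rather than the paper's combined heuristic), and the fraction of nodes left in the largest connected component is tracked after each removal. The graph is a small hand-made example.

```python
from collections import deque

def largest_component_fraction(adj, removed, n_total):
    """Size of the largest surviving component, as a fraction of n_total."""
    seen, best = set(removed), 0
    for s in adj:
        if s in seen:
            continue
        size, queue = 0, deque([s])   # BFS over surviving nodes
        seen.add(s)
        while queue:
            u = queue.popleft()
            size += 1
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        best = max(best, size)
    return best / n_total

def degree_attack(adj, n_remove):
    removed, trace = set(), []
    for _ in range(n_remove):
        # highest remaining degree, counting only surviving neighbours
        target = max((u for u in adj if u not in removed),
                     key=lambda u: sum(1 for v in adj[u] if v not in removed))
        removed.add(target)
        trace.append(largest_component_fraction(adj, removed, len(adj)))
    return trace

# A star (hub 0) with a short tail: removing the hub disintegrates it at once.
adj = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0, 5], 5: [4]}
trace = degree_attack(adj, 2)   # hub is removed first
```

The trace of component fractions is exactly the disintegration curve compared across attack strategies in studies like the one above.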
A Geographical Heuristic Routing Protocol for VANETs
Directory of Open Access Journals (Sweden)
Luis Urquiza-Aguiar
2016-09-01
Full Text Available Vehicular ad hoc networks (VANETs) leverage the communication system of Intelligent Transportation Systems (ITS). Recently, Delay-Tolerant Network (DTN) routing protocols have increased their popularity among the research community for use in non-safety VANET applications and services like traffic reporting. Vehicular DTN protocols use geographical and local information to make forwarding decisions. However, current proposals only consider the selection of the best candidate based on a local search. In this paper, we propose a generic Geographical Heuristic Routing (GHR) protocol that can be applied to any DTN geographical routing protocol that makes forwarding decisions hop by hop. GHR includes in its operation adaptations of the simulated annealing and Tabu-search meta-heuristics, which have largely been used to improve local-search results in discrete optimization. We include a complete performance evaluation of GHR in a multi-hop VANET simulation scenario for a reporting service. Our study analyzes all of the meaningful configurations of GHR and offers a statistical analysis of our findings by means of MANOVA tests. Our results indicate that the use of a Tabu list contributes to improving the packet delivery ratio by around 5% to 10%. Moreover, if Tabu is used, then the simulated annealing routing strategy achieves better performance than the selection of the best node used with carry and forwarding (the default operation).
Heuristic Biases in Mathematical Reasoning
Inglis, Matthew; Simpson, Adrian
2005-01-01
In this paper we briefly describe the dual process account of reasoning, and explain the role of heuristic biases in human thought. Concentrating on the so-called matching bias effect, we describe a piece of research that indicates a correlation between success at advanced level mathematics and an ability to override innate and misleading…
Thelen, Mark; Koppenhaver, Shane
2015-01-01
The Army Physical Fitness Test (APFT) is a biannual training requirement for all soldiers. The Army has made significant overall fitness gains by developing functional and comprehensive Physical Readiness Training (PRT) programs, but more emphasis on individualized physical fitness test taking technique is warranted in order to optimize performance. The purpose of this clinical commentary is to provide clinicians with several examples of APFT performance enhancement techniques that can potent...
Ranking of Storm Water Harvesting Sites Using Heuristic and Non-Heuristic Weighing Approaches
Directory of Open Access Journals (Sweden)
Shray Pathak
2017-09-01
Full Text Available Conservation of water is essential as climate change, coupled with land use changes, influences the distribution of water availability. Stormwater harvesting (SWH) is a widely used conservation measure which reduces pressure on fresh water resources. However, determining the availability of stormwater and identifying suitable sites for SWH require consideration of various socio-economic and technical factors. Earlier studies use demand, the ratio of runoff to demand, and weighted demand distance as the screening criteria. In this study, a Geographic Information System (GIS)-based screening methodology is adopted for identifying potentially suitable SWH sites in urban areas as a first pass, and then a detailed study is carried out by applying suitability criteria. Initially, potential hotspots are identified by a concept of accumulated catchments, and later the sites are screened and ranked using various screening parameters, namely demand, the ratio of runoff to demand, and weighted demand distance. During this process, the opinion of experts for finalizing the suitable SWH sites introduces subjectivity into the methodology. To obviate this, heuristic (Saaty's Analytic Hierarchy Process (AHP)) and non-heuristic (entropy weight and Principal Component Analysis (PCA)) weighting techniques are adopted for allotting weights to the parameters and applied in the ranking of SWH sites in Melbourne, Australia and Dehradun, India. It is observed that the heuristic approach is not effective for the study area, as it is affected by the subjectivity of expert opinion. Results obtained by the non-heuristic approaches are in good agreement with the sites finalized for SWH by the water planners of the study area. Hence, the proposed ranking methodology has potential for application in decision making on suitable stormwater harvesting sites.
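The heuristic (AHP) weighting step can be sketched as follows: a pairwise-comparison matrix of criteria is reduced to weights via its principal eigenvector, approximated here by power iteration. The Saaty-scale comparison values below are invented for illustration, not taken from the study.

```python
def ahp_weights(matrix, n_iter=100):
    """Principal eigenvector of a positive pairwise-comparison matrix,
    normalised to sum to 1, via power iteration."""
    n = len(matrix)
    w = [1.0 / n] * n
    for _ in range(n_iter):
        w = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        total = sum(w)
        w = [v / total for v in w]
    return w

# Criteria: demand, runoff/demand ratio, weighted demand distance.
# Illustrative judgements: demand is 3x as important as the ratio and
# 5x as important as distance; the ratio is 2x as important as distance.
pairwise = [[1.0, 3.0, 5.0],
            [1/3, 1.0, 2.0],
            [1/5, 1/2, 1.0]]
weights = ahp_weights(pairwise)   # roughly [0.65, 0.23, 0.12]
```

The subjectivity the study reports enters exactly here: the pairwise judgements come from experts, so different panels yield different weight vectors, unlike the entropy or PCA weights computed from the data alone.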
Hermawati, Setia; Lawson, Glyn
2016-01-01
Heuristics evaluation is frequently employed to evaluate usability. While general heuristics are suitable to evaluate most user interfaces, there is still a need to establish heuristics for specific domains to ensure that their specific usability issues are identified. This paper presents a comprehensive review of 70 studies related to usability heuristics for specific domains. The aim of this paper is to review the processes that were applied to establish heuristics in specific domains and i...
Underwater Robot Task Planning Using Multi-Objective Meta-Heuristics
Landa-Torres, Itziar; Manjarres, Diana; Bilbao, Sonia; Del Ser, Javier
2017-01-01
Robots deployed in the underwater medium are subject to stringent operational conditions that impose a high degree of criticality on the allocation of resources and the schedule of operations in mission planning. In this context, the so-called cost of a mission must be considered as an additional criterion when designing optimal task schedules within the mission at hand. Such a cost can be conceived as the impact of the mission on the robotic resources themselves, which ranges from the consumption of battery to other negative effects such as mechanical erosion. This manuscript focuses on this issue by devising three heuristic solvers aimed at efficiently scheduling tasks in robotic swarms that collaborate to accomplish a mission, and by presenting experimental results obtained over realistic scenarios in the underwater environment. The heuristic techniques resort to a Random-Keys encoding strategy to represent the allocation of robots to tasks and the relative execution order of such tasks within the schedule of certain robots. The obtained results reveal interesting differences in terms of Pareto optimality and spread between the algorithms considered in the benchmark, which are insightful for the selection of a proper task scheduler in real underwater campaigns. PMID:28375160
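The Random-Keys encoding mentioned above can be illustrated minimally: each task gets one real-valued key whose integer part selects a robot and whose fractional part orders the tasks within that robot's schedule. This is one common variant of the encoding; the key values are invented for illustration.

```python
def decode(keys, n_robots):
    """keys: one random key per task index; returns robot -> ordered tasks."""
    schedules = {r: [] for r in range(n_robots)}
    for task, key in enumerate(keys):
        # integer part picks the robot, fractional part will sort the tasks
        schedules[int(key) % n_robots].append((key % 1.0, task))
    return {r: [t for _, t in sorted(pairs)] for r, pairs in schedules.items()}

keys = [0.42, 1.17, 0.08, 1.93]       # 4 tasks, 2 robots
plan = decode(keys, 2)                 # {0: [2, 0], 1: [1, 3]}
```

The appeal of the encoding is that any vector of reals decodes to a valid schedule, so standard continuous meta-heuristics can search the space without repair operators.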
Alanis Pena, Antonio Alejandro
Major commercial electricity generation is done by burning fossil fuels, out of which coal-fired power plants produce a substantial quantity of electricity worldwide. The United States has large reserves of coal, and it is cheaply available, making it a good choice for the generation of electricity on a large scale. However, one major problem associated with burning coal is that it produces a group of pollutants known as nitrogen oxides (NOx). NOx are strong oxidizers and contribute to ozone formation and respiratory illness. The Environmental Protection Agency (EPA) regulates the quantity of NOx emitted to the atmosphere in the United States. One technique coal-fired power plants use to reduce NOx emissions is Selective Catalytic Reduction (SCR). SCR uses layers of catalyst that need to be added or changed to maintain the required performance. Power plants add or change catalyst layers during temporary shutdowns, but this is expensive. Moreover, many companies do not have only one power plant; instead, they can have a fleet of coal-fired power plants. A fleet of power plants can use EPA cap-and-trade programs to keep outlet NOx emissions below the allowances for the fleet. For that reason, the main aim of this research is to develop SCR management mathematical optimization methods that, for a given set of scheduled outages for a fleet of power plants, minimize the total cost of the entire fleet and also maintain outlet NOx below the desired target for the entire fleet. We use a multi-commodity network flow problem (MCFP) that creates edges representing all the SCR catalyst layers for each plant. This MCFP is relaxed because it does not consider the average daily NOx constraint, and it is solved by a binary integer program. After that, we add the average daily NOx constraint to the model with a schedule elimination constraint (MCFPwSEC). The MCFPwSEC eliminates, one by one, the solutions that do not satisfy the average daily
Optimization Techniques for Verification of Out-of-Order Execution Machines
Directory of Open Access Journals (Sweden)
Sudarshan K. Srinivasan
2010-01-01
Full Text Available We develop two optimization techniques, flush-machine and collapsed flushing, to improve the efficiency of automatic refinement-based verification of out-of-order (ooo) processor models. Refinement is a notion of equivalence that can be used to check that an ooo processor correctly implements all behaviors of its instruction set architecture (ISA), including deadlock detection. The optimization techniques work by reducing the computational complexity of the refinement map, a function central to refinement proofs that maps ooo processor model states to ISA states. This has a direct impact on the efficiency of verification, which is studied using 23 ooo processor models. Flush-machine is a novel optimization technique. Collapsed flushing has been employed previously in the context of in-order processors; we show how to apply collapsed flushing to ooo processor models. Using both optimizations together, we can handle 9 ooo models that could not be verified using standard flushing. The optimizations also provided a speed-up of 23.29 over standard flushing.
CT-imaging in Acute Ischemic Stroke: Thrombus Characterization and Technique Optimization
Niesten, J.M.
2014-01-01
In this thesis two main subjects were discussed. First, histopathologic and CT characteristics of cerebral thrombi were examined. Second, techniques to increase the accuracy and to optimize CT-perfusion (CTP)- and CT-angiography (CTA)-imaging were explored. In part 1 we investigated the relation
Remondo Bueno, D.; Srinivasan, R.; Nicola, V.F.; van Etten, Wim; Tattje, H.E.P.
1998-01-01
In this paper, new adaptive importance sampling techniques are applied to the performance evaluation and parameter optimization of a wavelength division multiplexing (WDM) network impaired by crosstalk in an optical cross-connect. Worst-case analysis is carried out including all the beat noise terms
Directory of Open Access Journals (Sweden)
Jude Hemanth Duraisamy
2016-01-01
Full Text Available Image steganography is an ever-growing computational approach that has found application in many fields. Frequency domain techniques are highly preferred for image steganography applications. However, there are significant drawbacks associated with these techniques. In transform-based approaches, the secret data is embedded in a random manner in the transform coefficients of the cover image. These transform coefficients may not be optimal in terms of stego image quality and embedding capacity. In this work, the applications of the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO) have been explored in the context of determining the optimal coefficients in these transforms. Frequency domain transforms such as the Bandelet Transform (BT) and the Finite Ridgelet Transform (FRIT) are used in combination with GA and PSO to improve the efficiency of the image steganography system.
A Genetic Algorithm Optimization Technique for Multiwavelet-Based Digital Audio Watermarking
Directory of Open Access Journals (Sweden)
Kumsawat Prayoth
2010-01-01
Full Text Available We propose a new approach for optimization in digital audio watermarking using a genetic algorithm. The watermarks are embedded into the low-frequency coefficients in the discrete multiwavelet transform domain. The embedding technique is based on a quantization process which does not require the original audio signal for watermark extraction. We have developed an optimization technique using the genetic algorithm to search for four optimal quantization steps in order to improve both the quality of the watermarked audio and the robustness of the watermark. In addition, we analyze the performance of the proposed algorithm in terms of signal-to-noise ratio, normalized correlation, and bit error rate. The experimental results show that the proposed scheme can achieve good robustness against most of the attacks included in this study.
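A quantization-based embedding of the kind described (blind, i.e. no original signal needed at extraction) can be sketched with even/odd quantization bins. This is a generic quantization-index-modulation illustration under an assumed step size delta, not the paper's multiwavelet scheme, and the GA search for the four optimal steps is omitted.

```python
def embed_bit(coeff, bit, delta):
    """Quantize a coefficient so that the parity of its bin encodes one bit."""
    q = round(coeff / delta)
    if q % 2 != bit:
        # move to the nearest quantization bin with the right parity
        q += 1 if coeff / delta >= q else -1
    return q * delta

def extract_bit(coeff, delta):
    """Blind extraction: read the parity of the nearest bin."""
    return round(coeff / delta) % 2

delta = 0.5
coeffs = [3.12, -1.47, 0.26, 5.91]     # stand-in transform coefficients
bits = [1, 0, 1, 1]                    # watermark payload
marked = [embed_bit(c, b, delta) for c, b in zip(coeffs, bits)]
recovered = [extract_bit(c, delta) for c in marked]  # no originals needed
```

Any perturbation smaller than delta/2 leaves the decoded bits intact, which is why the genetic search over step sizes trades audio quality (small delta) against robustness (large delta).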
Familiarity and recollection in heuristic decision making.
Schwikert, Shane R; Curran, Tim
2014-12-01
Heuristics involve the ability to utilize memory to make quick judgments by exploiting fundamental cognitive abilities. In the current study we investigated the memory processes that contribute to the recognition heuristic and the fluency heuristic, which are both presumed to capitalize on the byproducts of memory to make quick decisions. In Experiment 1, we used a city-size comparison task while recording event-related potentials (ERPs) to investigate the potential contributions of familiarity and recollection to the 2 heuristics. ERPs were markedly different for recognition heuristic-based decisions and fluency heuristic-based decisions, suggesting a role for familiarity in the recognition heuristic and recollection in the fluency heuristic. In Experiment 2, we coupled the same city-size comparison task with measures of subjective preexperimental memory for each stimulus in the task. Although previous literature suggests the fluency heuristic relies on recognition speed alone, our results suggest differential contributions of recognition speed and recollected knowledge to these decisions, whereas the recognition heuristic relies on familiarity. Based on these results, we created a new theoretical framework that explains decisions attributed to both heuristics based on the underlying memory associated with the choice options. PsycINFO Database Record (c) 2014 APA, all rights reserved.
Directory of Open Access Journals (Sweden)
Rajesh Kumar
2016-06-01
Full Text Available The present study describes isolation of laccase producing fungal strain and optimization of the process parameters by design of experiment (DOE technique to achieve the maximum production of extracellular laccases by Aspergillus flavus obtained from natural habitat. Bromophenol blue dye and ABTS (2,2′-azinobis 3-ethyl-benzothiazoline-6-sulfonate were used as substrates for the screening of laccase activity. Design expert 8.0.7.1 software was used to optimize culture conditions such as carbon source, nitrogen source, temperature and pH. Subsequently, optimization for inoculums size was also carried out. The optimization studies revealed that the laccase yield was highest when operated at the following conditions: carbon source – cellulose (8%, nitrogen source – peptone (2%, temperature – 35 °C, pH – 7 and inoculum of size 1.5 cm.
A multi-agent technique for contingency constrained optimal power flows
Energy Technology Data Exchange (ETDEWEB)
Talukdar, S.; Ramesh, V.C. (Carnegie Mellon Univ., Pittsburgh, PA (United States). Engineering Design Research Center)
1994-05-01
This paper does three things. First, it proposes that each critical contingency in a power system be represented by a "correction time" (the time required to eliminate the violations produced by the contingency), rather than by a set of hard constraints. Second, it adds these correction times to an optimal power flow and decomposes the resulting problem into a number of smaller optimization problems. Third, it proposes a multiagent technique for solving the smaller problems in parallel. The agents encapsulate traditional optimization algorithms as well as a new algorithm, called the voyager, that generates starting points for the traditional algorithms. All the agents communicate asynchronously, meaning that they can work in parallel without ever interrupting or delaying one another. The resulting scheme has potential for handling power system contingencies and other difficult global optimization problems.
The Use of Lean Manufacturing Techniques – SMED Analysis to Optimization of the Production Process
Directory of Open Access Journals (Sweden)
Dusan Sabadka
2017-09-01
Full Text Available Lean is a culture of real and continuous optimization. As a concept of continuous optimization amid limited resources, it must be practiced continuously as a long-term organizational norm. This paper reveals why changeover time reduction is important in manufacturing industries and, from the various tools and techniques available within Lean manufacturing, describes mainly SMED (Single Minute Exchange of Dies) for changeover time reduction and its application in the shaft manufacturing industry. The paper also describes the principles, benefits, procedure, and practical application of SMED. The theoretical bases are verified in a practical part that describes the analysis and design optimization of non-productive changeover time on a honing machine in a selected shaft manufacturing company. The output is the structural design of a universal palette and an evaluation of the productivity gains due to the optimization of honing operation times for gear shafts. The results achieved showed a considerable reduction in delays arising from machine setting time, batch setting time, and demonstration delay.
Feedback control for fuel-optimal descents using singular perturbation techniques
Price, D. B.
1984-01-01
In response to rising fuel costs and reduced profit margins for the airline companies, the optimization of the paths flown by transport aircraft has been considered. It was found that application of optimal control theory to the considered problem can result in savings in fuel, time, and direct operating costs. The best solution to the aircraft trajectory problem is an onboard real-time feedback control law. The present paper presents a technique which shows promise of becoming a part of a complete solution. The application of singular perturbation techniques to the problem is discussed, taking into account the benefits and some problems associated with them. A different technique for handling the descent part of a trajectory is also discussed.
Gutin, Gregory; Goldengorin, Boris; Huang, Jing
Optimization heuristics are often compared with each other to determine which one performs best by means of the worst-case performance ratio, which reflects the quality of the returned solution in the worst case. The domination number is a complementary parameter indicating the quality of the heuristic at hand by
Tuning Parameters in Heuristics by Using Design of Experiments Methods
Arin, Arif; Rabadi, Ghaith; Unal, Resit
2010-01-01
With the growing complexity of today's large-scale problems, it has become more difficult to find optimal solutions by using exact mathematical methods. The need to find near-optimal solutions in an acceptable time frame requires heuristic approaches. In many cases, however, most heuristics have several parameters that need to be "tuned" before they can reach good results. The problem then turns into "finding the best parameter setting" for the heuristics to solve the problems efficiently and in a timely manner. The One-Factor-At-a-Time (OFAT) approach for parameter tuning neglects the interactions between parameters. Design of Experiments (DOE) tools can instead be employed to tune the parameters more effectively. In this paper, we seek the best parameter setting for a Genetic Algorithm (GA) to solve the single machine total weighted tardiness problem, in which n jobs must be scheduled on a single machine without preemption and the objective is to minimize the total weighted tardiness. Benchmark instances for the problem are available in the literature. To fine-tune the GA parameters in the most efficient way, we compare multiple DOE models including 2-level (2^k) full factorial design, orthogonal array design, central composite design, D-optimal design, and signal-to-noise (S/N) ratios. In each DOE method, a mathematical model is created using regression analysis and solved to obtain the best parameter setting. After verification runs using the tuned parameter setting, the preliminary results for optimal solutions of multiple instances were found efficiently.
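The mechanics of a 2^k full factorial design can be sketched as follows. The GA run is replaced by a hypothetical response function so the example stays self-contained; the factor names and levels are assumptions, and the regression step is reduced to picking the best design point directly.

```python
from itertools import product

# Two levels per factor: a 2^3 full factorial design gives 8 runs.
levels = {
    "pop_size":       [50, 200],
    "crossover_prob": [0.6, 0.9],
    "mutation_prob":  [0.01, 0.1],
}

def run_ga(pop_size, crossover_prob, mutation_prob):
    """Stand-in response surface: lower is better (e.g. total weighted tardiness)."""
    return (1000.0 / pop_size
            + 40.0 * (0.9 - crossover_prob) ** 2
            + 300.0 * abs(mutation_prob - 0.05))

names = list(levels)
design = list(product(*(levels[n] for n in names)))          # all 8 factor combinations
results = [(run_ga(**dict(zip(names, point))), point) for point in design]
best_resp, best_point = min(results)
best_setting = dict(zip(names, best_point))
```

A real study would replicate each design point over several GA runs and fit a regression model to the responses before choosing the setting, which is where the OFAT approach loses the interaction terms that the factorial design captures.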
Liang, Bin; Li, Yongbao; Ran, Wei; Guo, Bin; Xu, Xuang; Liu, Bo; Li, Jiafeng; Wu, Qiuwen; Zhou, Fugen
2017-11-17
With robot-controlled linac positioning, a robotic radiotherapy system such as CyberKnife significantly increases the freedom in radiation beam placement, but also imposes more challenges on treatment plan optimization. The resampling mechanism in the vendor-supplied treatment planning system (MultiPlan) cannot fully explore the enlarged beam direction search space. Besides, a sparse treatment plan (using fewer beams) is desired to improve treatment efficiency. This study proposes a singular value decomposition linear programming (SVDLP) optimization technique for circular-collimator-based robotic radiotherapy. The SVDLP approach initializes the input beams by simulating the process of covering the entire target volume with the equivalent beam taper. The requirements on the dose distribution are modeled as hard and soft constraints, and the sparsity of the treatment plan is achieved by compressive sensing. The proposed linear programming (LP) model optimizes the beam weights by minimizing the deviation from the soft constraints while subjecting to the hard constraints, with a constraint on the l1-norm of the beam weight. A singular value decomposition (SVD) based acceleration technique was developed for the LP model. Based on the degeneracy of the influence matrix, the model is first compressed into a lower dimension for optimization, and then back-projected to reconstruct the beam weight. After beam weight optimization, the number of beams is reduced by removing the beams with low weight and re-optimizing the weight of the remaining beams using the same model. This beam reduction technique is further validated by a mixed integer programming (MIP) model. The SVDLP approach was tested on a lung case. The results demonstrate that the SVD acceleration technique speeds up the optimization by a factor of 4.8, and the beam reduction achieves similar plan quality to the globally optimal plan obtained by the MIP model while being 1-2 orders of magnitude faster. Furthermore, the SVDLP approach
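The compress-then-back-project step can be illustrated on a synthetic influence matrix. The sketch below replaces the LP with a plain least-squares fit so that only the SVD-based dimension reduction is shown; the matrix sizes, rank, and tolerance are assumed values, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(0)
# Influence matrix D (voxels x beams) built to be rank-deficient, mimicking
# the degeneracy that the SVD acceleration exploits.
n_vox, n_beams, true_rank = 200, 80, 12
D = rng.normal(size=(n_vox, true_rank)) @ rng.normal(size=(true_rank, n_beams))
d_target = rng.uniform(60.0, 62.0, size=n_vox)      # prescribed dose per voxel

# Truncated SVD: keep only singular directions above a tolerance.
U, s, Vt = np.linalg.svd(D, full_matrices=False)
k = int(np.sum(s > 1e-8 * s[0]))                    # numerical rank of D
U_k, s_k, Vt_k = U[:, :k], s[:k], Vt[:k, :]

# Solve the fit in the k-dimensional space, then back-project to beam weights.
y = (U_k.T @ d_target) / s_k
w = Vt_k.T @ y                                      # minimum-norm least-squares weights
achieved = D @ w                                    # dose delivered by these weights
```

Here an 80-variable problem collapses to a 12-variable one; in the paper the same compression is applied inside the LP before reconstructing the full beam-weight vector.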
Comparing the performance of different meta-heuristics for unweighted parallel machine scheduling
Directory of Open Access Journals (Sweden)
Adamu, Mumuni Osumah
2015-08-01
Full Text Available This article considers the due window scheduling problem to minimise the number of early and tardy jobs on identical parallel machines. This problem is known to be NP-complete, and thus finding an optimal solution is unlikely. Three meta-heuristics and their hybrids are proposed, and extensive computational experiments are conducted. The purpose of this paper is to compare the performance of these meta-heuristics and their hybrids and to determine the best among them. Detailed comparative tests have also been conducted to analyse the different heuristics, with the simulated annealing hybrid giving the best result.
Triple Modular Redundancy verification via heuristic netlist analysis
Directory of Open Access Journals (Sweden)
Giovanni Beltrame
2015-08-01
Full Text Available Triple Modular Redundancy (TMR) is a common technique to protect memory elements in digital processing systems subject to radiation effects (such as in space, at high altitude, or near nuclear sources). This paper presents an approach to verify the correct implementation of TMR for the memory elements of a given netlist (i.e., a digital circuit specification) using heuristic analysis. The purpose is to detect any issues that might arise during the use of automatic tools for TMR insertion, optimization, place and route, etc. Our analysis does not require a testbench and can perform full, exhaustive coverage within less than an hour even for large designs. This is achieved by applying a divide et impera approach, splitting the circuit into smaller submodules without loss of generality, instead of applying formal verification to the whole netlist at once. The methodology has been applied to a production netlist of the LEON2-FT processor that had reported errors during radiation testing, successfully showing a number of unprotected memory elements, namely 351 flip-flops.
Special relativity a heuristic approach
Hassani, Sadri
2017-01-01
Special Relativity: A Heuristic Approach provides a qualitative exposition of relativity theory on the basis of the constancy of the speed of light. Using Einstein's signal velocity as the defining idea for the notion of simultaneity, and the fact that the speed of light is independent of the motion of its source, chapters delve into a qualitative exposition of the relativity of time and length, discuss the time dilation formula using the standard light clock, explore the Minkowski four-dimensional space-time distance based on how the time dilation formula is derived, and define the components of the two-dimensional space-time velocity, amongst other topics. The book provides a heuristic derivation of the Minkowski distance formula, uses relativistic photography to see Lorentz transformation and vector algebra manipulation in action, includes worked examples to elucidate and complement the topics discussed, and is written in a very accessible style.
Directory of Open Access Journals (Sweden)
GHOLAMIAN, A. S.
2009-06-01
Full Text Available In this paper, a magnet shape optimization method for the reduction of cogging torque and torque ripple in Permanent Magnet (PM) brushless DC motors is presented, using the reduced basis technique coupled with finite element and design-of-experiments methods. The primary objective of the method is to reduce the enormous number of design variables required to define the magnet shape. The reduced basis technique is a weighted combination of several basis shapes. The aim of the method is to find the best combination using the weights for each shape as the design variables. A multi-level design process is developed to find suitable basis shapes, or trial shapes, at each level that can be used in the reduced basis technique. Each level is treated as a separate optimization problem until the required objective is achieved. The experimental design of the Taguchi method is used to build the approximation model and to perform the optimization. This method is demonstrated on the magnet shape optimization of a 6-pole/18-slot PM BLDC motor.
Expected Fitness Gains of Randomized Search Heuristics for the Traveling Salesperson Problem.
Nallaperuma, Samadhi; Neumann, Frank; Sudholt, Dirk
2017-01-01
Randomized search heuristics are frequently applied to NP-hard combinatorial optimization problems. The runtime analysis of randomized search heuristics has contributed tremendously to our theoretical understanding. Recently, randomized search heuristics have been examined regarding their achievable progress within a fixed-time budget. We follow this approach and present a fixed-budget analysis for an NP-hard combinatorial optimization problem. We consider the well-known Traveling Salesperson Problem (TSP) and analyze the fitness increase that randomized search heuristics are able to achieve within a given fixed-time budget. In particular, we analyze Manhattan and Euclidean TSP instances and Randomized Local Search (RLS), (1+1) EA and (1+[Formula: see text]) EA algorithms for the TSP in a smoothed complexity setting, and derive the lower bounds of the expected fitness gain for a specified number of generations.
Optimal Turbine Allocation for Offshore and Onshore Wind Farms
DEFF Research Database (Denmark)
Fischetti, Martina; Fischetti, Matteo; Monaci, Michele
2016-01-01
. In particular, lots of money and energy are spent on the optimal design of wind farms, as an efficient use of the available resources is instrumental for their economical success. In the present paper we address the optimization of turbine positions, which is one of the most relevant problems in the design...... of a wind farm, and propose a heuristic approach based on Mixed-Integer Linear Programming techniques. Computational results on very large scale instances prove the practical viability of the approach....
A novel heuristic algorithm for capacitated vehicle routing problem
Kır, Sena; Yazgan, Harun Reşit; Tüncel, Emre
2017-02-01
The vehicle routing problem with capacity constraints is considered in this paper. It is quite difficult to achieve an optimal solution with traditional optimization methods because of the high computational complexity for large-scale problems. Consequently, new heuristic and metaheuristic approaches have been developed to solve this problem. In this paper, we construct a new heuristic algorithm based on tabu search and adaptive large neighborhood search (ALNS), with several specifically designed operators and features, to solve the capacitated vehicle routing problem (CVRP). The effectiveness of the proposed algorithm is illustrated on benchmark problems. The algorithm provides better performance on large-scale instances and gains an advantage in terms of CPU time. In addition, we solve a real-life CVRP using the proposed algorithm and find encouraging results in comparison with the company's current practice.
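One destroy-and-repair iteration of the ALNS flavor described can be sketched as follows. This is a minimal illustration with made-up instance data, a random-removal destroy operator, and a cheapest-feasible-insertion repair operator; the tabu list and the paper's specially designed operators are omitted.

```python
import copy
import math
import random

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def route_cost(route, pts, depot):
    stops = [depot] + [pts[i] for i in route] + [depot]
    return sum(dist(stops[i], stops[i + 1]) for i in range(len(stops) - 1))

def total_cost(routes, pts, depot):
    return sum(route_cost(r, pts, depot) for r in routes)

def greedy_insert(routes, load, customer, pts, depot, demand, cap):
    """Repair: cheapest feasible insertion over all routes and positions."""
    best = None
    for ri, route in enumerate(routes):
        if load[ri] + demand[customer] > cap:
            continue
        for pos in range(len(route) + 1):
            trial = route[:pos] + [customer] + route[pos:]
            delta = route_cost(trial, pts, depot) - route_cost(route, pts, depot)
            if best is None or delta < best[0]:
                best = (delta, ri, pos)
    if best is None:                          # no feasible slot: open a new route
        routes.append([customer])
        load.append(demand[customer])
    else:
        _, ri, pos = best
        routes[ri].insert(pos, customer)
        load[ri] += demand[customer]

random.seed(7)
depot = (0.0, 0.0)
pts = [(random.uniform(-10, 10), random.uniform(-10, 10)) for _ in range(12)]
demand = [random.randint(1, 4) for _ in pts]
cap = 10

routes = [[i] for i in range(len(pts))]       # naive start: one route per customer
load = demand[:]
start_cost = total_cost(routes, pts, depot)

for _ in range(200):                          # destroy-and-repair loop
    cand, cand_load = copy.deepcopy(routes), load[:]
    removed = random.sample(range(len(pts)), 3)
    for c in removed:                         # destroy: drop 3 random customers
        for ri, route in enumerate(cand):
            if c in route:
                route.remove(c)
                cand_load[ri] -= demand[c]
                break
    for c in removed:                         # repair: reinsert them greedily
        greedy_insert(cand, cand_load, c, pts, depot, demand, cap)
    if total_cost(cand, pts, depot) < total_cost(routes, pts, depot) - 1e-9:
        routes, load = cand, cand_load        # accept only improvements

final_cost = total_cost(routes, pts, depot)
```

A full ALNS would weight several destroy/repair operators by their past success and use an acceptance criterion (e.g. simulated annealing) rather than pure improvement.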
Optimization technique for improved microwave transmission from multi-solar power satellites
Energy Technology Data Exchange (ETDEWEB)
Arndt, G.D.; Kerwin, E.M.
1982-08-01
An optimization technique for generating antenna illumination tapers allows improved microwave transmission efficiencies from proposed solar power satellite (SPS) systems and minimizes sidelobe levels to meet preset environmental standards. The cumulative microwave power density levels from 50 optimized SPS systems are calculated at the centroids of each of the 3073 counties in the continental United States. These cumulative levels are compared with Environmental Protection Agency (EPA) measured levels of electromagnetic radiation in seven eastern cities. Effects of rectenna relocations upon the power levels/population exposure rates are also studied.
Analysis on the Metrics used in Optimizing Electronic Business based on Learning Techniques
Directory of Open Access Journals (Sweden)
Irina-Steliana STAN
2014-09-01
Full Text Available The present paper proposes a methodology for analyzing the metrics related to electronic business. The draft optimization models include KPIs that can highlight business specifics, provided they are integrated using learning-based techniques. Once the most important, high-impact elements of the business are set, the models should ultimately link them by automating business flows. Human resources will increasingly collaborate with these optimization models, which will translate into higher-quality decisions and increased profitability.
A study of optimization techniques in HDR brachytherapy for the prostate
Pokharel, Ghana Shyam
. Based on our study, the DVH-based objective function performed better than the traditional variance-based objective function in creating a clinically acceptable plan when executed under identical conditions. Thirdly, we studied a multiobjective optimization strategy using both DVH- and variance-based objective functions. The strategy was to create several Pareto-optimal solutions by scanning the clinically relevant part of the Pareto front. This strategy was adopted to decouple optimization from decision making, so that the user could select the final solution from the pool of alternative solutions based on his or her clinical goals. The overall quality of the treatment plan improved using this approach compared to the traditional class-solution approach. In fact, the final optimized plan selected using the decision engine with the DVH-based objective was comparable to a typical clinical plan created by an experienced physicist. Next, we studied a hybrid technique comprising both stochastic and deterministic algorithms to optimize both dwell positions and dwell times. The simulated annealing algorithm was used to find the optimal catheter distribution, and the DVH-based algorithm was used to optimize the 3D dose distribution for a given catheter distribution. This unique treatment planning and optimization tool was capable of producing clinically acceptable, highly reproducible treatment plans in clinically reasonable time. As this algorithm was able to create clinically acceptable plans within clinically reasonable time automatically, it is appealing for real-time procedures. Next, we studied the feasibility of multiobjective optimization using an evolutionary algorithm for real-time HDR brachytherapy for the prostate. The algorithm, with properly tuned algorithm-specific parameters, was able to create clinically acceptable plans within clinically reasonable time. However, the algorithm was allowed to run for only a limited number of generations, which is generally not considered optimal for such algorithms.
This was
Meyer, Burghard Christian; Lescot, Jean-Marie; Laplana, Ramon
2009-02-01
Two spatial optimization approaches, developed from the opposing perspectives of ecological economics and landscape planning and aimed at the definition of new distributions of farming systems and of land use elements, are compared and integrated into a general framework. The first approach, applied to a small river catchment in southwestern France, uses SWAT (Soil and Water Assessment Tool) and a weighted goal programming model in combination with a geographical information system (GIS) for the determination of optimal farming system patterns, based on selected objective functions to minimize deviations from the goals of reducing nitrogen and maintaining income. The second approach, demonstrated in a suburban landscape near Leipzig, Germany, defines a GIS-based predictive habitat model for the search of unfragmented regions suitable for hare populations (Lepus europaeus), followed by compromise optimization with the aim of planning a new habitat structure distribution for the hare. The multifunctional problem is solved by the integration of the three landscape functions ("production of cereals," "resistance to soil erosion by water," and "landscape water retention"). Through the comparison, we propose a framework for the definition of optimal land use patterns based on optimization techniques. The framework includes the main aspects to solve land use distribution problems with the aim of finding the optimal or best land use decisions. It integrates indicators, goals of spatial developments and stakeholders, including weighting, and model tools for the prediction of objective functions and risk assessments. Methodological limits of the uncertainty of data and model outcomes are stressed. The framework clarifies the use of optimization techniques in spatial planning.
Heuristics for Routing Heterogeneous Unmanned Vehicles with Fuel Constraints
Directory of Open Access Journals (Sweden)
David Levy
2014-01-01
Full Text Available This paper addresses a multiple depot, multiple unmanned vehicle routing problem with fuel constraints. The objective of the problem is to find a tour for each vehicle such that all the specified targets are visited at least once by some vehicle, the tours satisfy the fuel constraints, and the total travel cost of the vehicles is a minimum. We consider a scenario where the vehicles are allowed to refuel by visiting any of the depots or fuel stations. This is a difficult optimization problem that involves partitioning the targets among the vehicles and finding a feasible tour for each vehicle. The focus of this paper is on developing fast variable neighborhood descent (VND and variable neighborhood search (VNS heuristics for finding good feasible solutions for large instances of the vehicle routing problem. Simulation results are presented to corroborate the performance of the proposed heuristics on a set of 23 large instances obtained from a standard library. These results show that the proposed VND heuristic, on an average, performed better than the proposed VNS heuristic for the tested instances.
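The variable neighborhood descent idea, cycling through neighborhoods and restarting from the first whenever an improvement is found, can be sketched on a plain single-vehicle tour. Fuel constraints and depots are omitted for brevity, and the two neighborhoods chosen here (city swap and segment reversal) are illustrative assumptions.

```python
import math

def tour_len(tour, pts):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def first_improving_swap(tour, pts):
    """Neighborhood 1: exchange two cities; return first improving neighbor."""
    base = tour_len(tour, pts)
    for i in range(len(tour) - 1):
        for j in range(i + 1, len(tour)):
            t = tour[:]
            t[i], t[j] = t[j], t[i]
            if tour_len(t, pts) < base - 1e-9:
                return t
    return None

def first_improving_two_opt(tour, pts):
    """Neighborhood 2: reverse a segment (2-opt); return first improving neighbor."""
    base = tour_len(tour, pts)
    for i in range(len(tour) - 1):
        for j in range(i + 2, len(tour)):
            t = tour[:i + 1] + tour[i + 1:j + 1][::-1] + tour[j + 1:]
            if tour_len(t, pts) < base - 1e-9:
                return t
    return None

def vnd(tour, pts):
    """Variable neighborhood descent: on improvement, restart at neighborhood 0."""
    hoods = [first_improving_swap, first_improving_two_opt]
    k = 0
    while k < len(hoods):
        better = hoods[k](tour, pts)
        if better is not None:
            tour, k = better, 0     # improvement found: back to first neighborhood
        else:
            k += 1                  # neighborhood exhausted: try the next one
    return tour

pts = [(0, 0), (4, 0), (4, 3), (0, 3), (2, 5), (1, 1), (3, 2), (5, 4)]
start = list(range(len(pts)))
improved = vnd(start, pts)
```

The returned tour is, by construction, a local optimum with respect to both neighborhoods; VNS adds a randomized "shaking" step on top of this descent to escape such optima.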
Heuristic Scheduling Algorithm Oriented Dynamic Tasks for Imaging Satellites
Directory of Open Access Journals (Sweden)
Maocai Wang
2014-01-01
Full Text Available Imaging satellite scheduling is an NP-hard problem with many complex constraints. This paper studies the scheduling problem for dynamic tasks oriented to emergency cases. After the dynamic properties of satellite scheduling are analyzed, an optimization model is proposed. Based on the model, two heuristic algorithms are proposed to solve the problem. The first heuristic, named the IDI algorithm, arranges new tasks by inserting or deleting them, then inserting them repeatedly according to priority from low to high. The second, called ISDR, adopts four steps: insert directly, insert by shifting, insert by deleting, and reinsert the deleted tasks. Moreover, two heuristic factors, the congestion degree of a time window and the overlapping degree of a task, are employed to improve the algorithm's performance. Finally, a case is given to test the algorithms. The results show that the IDI algorithm is better than ISDR from the running time point of view, while the ISDR algorithm with heuristic factors is more effective with regard to solution quality. Moreover, the results also show that our method performs well for larger sets of dynamic tasks in comparison with the other two methods.
Greedy heuristics for minimization of number of terminal nodes in decision trees
Hussain, Shahid
2014-10-01
This paper describes, in detail, several greedy heuristics for construction of decision trees. We study the number of terminal nodes of decision trees, which is closely related with the cardinality of the set of rules corresponding to the tree. We compare these heuristics empirically for two different types of datasets (datasets acquired from UCI ML Repository and randomly generated data) as well as compare with the optimal results obtained using dynamic programming method.
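A greedy construction of this kind, together with the terminal-node count the paper studies, can be sketched for binary features. The split criterion below (fewest misclassified rows after the split) is one simple choice among the several greedy heuristics such a comparison would cover; the XOR data set is a made-up example.

```python
def build_tree(rows, labels, features):
    """Greedily split on the feature whose split misclassifies the fewest rows."""
    if len(set(labels)) <= 1 or not features:
        return {"leaf": True, "label": max(set(labels), key=labels.count)}

    def misclassified(f):
        err = 0
        for v in (0, 1):
            sub = [l for r, l in zip(rows, labels) if r[f] == v]
            if sub:
                err += len(sub) - max(sub.count(c) for c in set(sub))
        return err

    f = min(features, key=misclassified)
    node = {"leaf": False, "feature": f, "children": {}}
    for v in (0, 1):
        sub_rows = [r for r in rows if r[f] == v]
        sub_labels = [l for r, l in zip(rows, labels) if r[f] == v]
        if not sub_rows:   # empty branch: fall back to the majority label
            node["children"][v] = {"leaf": True,
                                   "label": max(set(labels), key=labels.count)}
        else:
            node["children"][v] = build_tree(sub_rows, sub_labels,
                                             [g for g in features if g != f])
    return node

def count_leaves(tree):
    """Number of terminal nodes, i.e. the quantity the heuristics are compared on."""
    if tree["leaf"]:
        return 1
    return sum(count_leaves(c) for c in tree["children"].values())

# XOR-like data: no single feature separates it, so the greedy tree needs depth 2.
rows = [(0, 0), (0, 1), (1, 0), (1, 1)]
labels = [0, 1, 1, 0]
tree = build_tree(rows, labels, [0, 1])
n_leaves = count_leaves(tree)
```

The terminal-node count is what ties the tree to the size of its corresponding rule set, since each leaf yields one decision rule.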
New heuristics for traveling salesman and vehicle routing problems with time windows
Energy Technology Data Exchange (ETDEWEB)
Gendreau, M.; Hertz, A.; Laporte, G.; Mihnea, S.
1994-12-31
We consider variants of the Traveling Salesman (TSP) and Vehicle Routing (VRP) Problems in which each customer can only be visited within a pre-specified (hard) time window. We first present a two-phase (construction and post-optimization) generalized insertion heuristic for the TSPTW. This insertion heuristic is then embedded in a tabu search metaheuristic in order to solve the VRPTW. Computational results on standard benchmark problems will be reported.
Energy Technology Data Exchange (ETDEWEB)
Kumari, M. Sailaja; Maheswarapu, Sydulu [Department of Electrical Engineering, National Institute of Technology, Warangal (India)
2010-07-15
Optimal Power Flow (OPF) is used for developing corrective strategies and performing least-cost dispatches. To guide the decision making of power system operators, a more robust and faster OPF algorithm is needed. OPF can be solved for the minimum generation cost that satisfies the power balance equations and system constraints. However, cost-based OPF solutions usually result in unattractive system losses and voltage profiles. In the present paper the OPF problem is formulated as a multi-objective optimization problem, where optimal control settings are obtained for the simultaneous minimization of fuel cost and loss; loss and voltage stability index; fuel cost and voltage stability index; and finally fuel cost, loss, and voltage stability index. The paper combines a new Decoupled Quadratic Load Flow (DQLF) solution with an Enhanced Genetic Algorithm (EGA) to solve the OPF problem. A Strength Pareto Evolutionary Algorithm (SPEA) based approach with a strongly dominated set of solutions is used to form the Pareto-optimal set. A hierarchical clustering technique is employed to limit the set of trade-off solutions, and a fuzzy-based approach is then used to obtain the optimal solution from the trade-off curve. The proposed multi-objective evolutionary algorithm with the EGA-DQLF model determines a diverse Pareto-optimal front in just 50 generations. The IEEE 30-bus system is used to demonstrate the behavior of the proposed approach. The final optimal solution is compared with that obtained using Particle Swarm Optimization (PSO) with a fuzzy satisfaction maximization approach; the results show the superiority of the EGA-DQLF with SPEA approach over the PSO-fuzzy approach. (author)
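The Pareto-optimal set at the core of this approach can be illustrated with a simple non-dominance filter over candidate operating points. The objective triples below are made-up values for illustration, not results from the paper.

```python
def dominates(a, b):
    """a dominates b if a is no worse in every objective and strictly
    better in at least one (all objectives minimized)."""
    return (all(x <= y for x, y in zip(a, b))
            and any(x < y for x, y in zip(a, b)))

def pareto_front(points):
    """Keep only the non-dominated points."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (fuel cost, loss, voltage stability index) for candidate settings.
candidates = [
    (802.0, 5.2, 0.120),
    (810.5, 4.8, 0.118),   # trades higher cost for lower loss: non-dominated
    (805.0, 5.5, 0.130),   # worse than the first point in every objective
    (830.0, 4.1, 0.110),
]
front = pareto_front(candidates)
```

SPEA builds such a front generation by generation; clustering then thins it, and the fuzzy membership step picks one compromise point from the surviving trade-off solutions.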
Thelen, Mark; Koppenhaver, Shane
2015-06-01
The Army Physical Fitness Test (APFT) is a biannual training requirement for all soldiers. The Army has made significant overall fitness gains by developing functional and comprehensive Physical Readiness Training (PRT) programs, but more emphasis on individualized physical fitness test-taking technique is warranted in order to optimize performance. The purpose of this clinical commentary is to provide clinicians with several examples of APFT performance enhancement techniques that can potentially be applied not only in the Army, but throughout the military and in the sports community, where general fitness assessments are routinely administered.
A Preconditioning Technique for First-Order Primal-Dual Splitting Method in Convex Optimization
Directory of Open Access Journals (Sweden)
Meng Wen
2017-01-01
Full Text Available We introduce a preconditioning technique for the first-order primal-dual splitting method. The primal-dual splitting method offers a very general framework for solving a large class of optimization problems arising in image processing. The key idea of the preconditioning technique is that the constant iterative parameters are updated self-adaptively in the iteration process. We also give a simple and easy way to choose the diagonal preconditioners while the convergence of the iterative algorithm is maintained. The efficiency of the proposed method is demonstrated on an image denoising problem. Numerical results show that the preconditioned iterative algorithm performs better than the original one.
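The diagonally preconditioned primal-dual iteration can be sketched on a 1D total-variation denoising problem. The sketch below follows the common first-order primal-dual form with diagonal step sizes derived from the row and column absolute sums of the difference operator; the signal, noise level, and regularization weight are assumptions, not the paper's experiment.

```python
import numpy as np

def tv_objective(x, b, lam):
    """0.5*||x - b||^2 + lam*TV(x): the denoising objective being minimized."""
    return 0.5 * np.sum((x - b) ** 2) + lam * np.sum(np.abs(np.diff(x)))

rng = np.random.default_rng(0)
clean = np.repeat([0.0, 1.0, 0.3], 40)           # piecewise-constant signal
b = clean + 0.1 * rng.normal(size=clean.size)    # noisy observation
lam, n = 0.5, clean.size

# K = forward-difference operator, applied matrix-free; K^T written out by hand.
Kx = lambda v: np.diff(v)
KTy = lambda y: np.concatenate(([-y[0]], y[:-1] - y[1:], [y[-1]]))

# Diagonal preconditioners from the absolute row/column sums of K:
sigma = np.full(n - 1, 0.5)                      # each row of K has two entries
tau = np.full(n, 0.5)
tau[0] = tau[-1] = 1.0                           # boundary columns have one entry

x = b.copy()
x_bar = x.copy()
y = np.zeros(n - 1)
for _ in range(300):
    # dual step + prox of (lam*||.||_1)^*: projection onto [-lam, lam]
    y = np.clip(y + sigma * Kx(x_bar), -lam, lam)
    # primal step + prox of 0.5*||. - b||^2 (elementwise, diagonal tau)
    x_new = (x - tau * KTy(y) + tau * b) / (1.0 + tau)
    x_bar = 2.0 * x_new - x                      # over-relaxation
    x = x_new

obj_noisy = tv_objective(b, b, lam)
obj_denoised = tv_objective(x, b, lam)
```

With these diagonal step sizes no operator-norm estimate is needed, which is the practical appeal of the preconditioned variant over fixed scalar steps.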
Mühlig, Christian; Kufert, Siegfried; Bublitz, Simon; Speck, Uwe
2011-03-20
Using experimental results and numerical simulations, two measuring concepts of the laser induced deflection (LID) technique are introduced and optimized for absolute thin film absorption measurements from deep ultraviolet to IR wavelengths. For transparent optical coatings, a particular probe beam deflection direction allows the absorption measurement with virtually no influence of the substrate absorption, yielding improved accuracy compared to the common techniques of separating bulk and coating absorption. For high-reflection coatings, where substrate absorption contributions are negligible, a different probe beam deflection is chosen to achieve a better signal-to-noise ratio. Various experimental results for the two different measurement concepts are presented.
Superfast multifrequency phase-shifting technique with optimal pulse width modulation.
Wang, Yajun; Zhang, Song
2011-03-14
The technique of generating sinusoidal fringe patterns by defocusing squared binary structured ones has numerous merits for high-speed three-dimensional (3D) shape measurement. However, it is challenging for this method to realize a multifrequency phase-shifting (MFPS) algorithm because it is difficult to simultaneously generate high-quality sinusoidal fringe patterns with different periods. This paper proposes to realize an MFPS algorithm utilizing an optimal pulse width modulation (OPWM) technique that can selectively eliminate high-order harmonics of squared binary patterns. We successfully develop a 556 Hz system utilizing a three-frequency algorithm for simultaneously measuring multiple objects.
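The harmonic structure that motivates such pattern optimization can be checked numerically: a squared binary pattern carries a third harmonic at roughly one third of the fundamental, and modeling defocus as a Gaussian blur shows how strongly it is attenuated. This illustrates only the baseline binary-defocusing setting with an assumed blur width; it does not reproduce the OPWM pattern design itself.

```python
import numpy as np

# Ten periods of a 50%-duty squared binary fringe, 60 samples per period.
period, n_periods = 60, 10
n = period * n_periods
pattern = (np.arange(n) % period < period // 2).astype(float)

# Lens defocus modeled (to first order) as a Gaussian blur; in the frequency
# domain that is multiplication by a Gaussian transfer function.
sigma = 8.0                                   # assumed blur width in samples
freqs = np.fft.rfftfreq(n)                    # cycles per sample
H = np.exp(-2.0 * (np.pi * sigma * freqs) ** 2)

spec = np.fft.rfft(pattern - pattern.mean())
fund = n_periods                              # fundamental sits at bin 10
ratio_before = abs(spec[3 * fund]) / abs(spec[fund])
ratio_after = abs((spec * H)[3 * fund]) / abs((spec * H)[fund])
```

The residual higher-order harmonics are exactly what corrupt the phase when several fringe periods must coexist; OPWM removes selected harmonics in the binary pattern itself, so less defocus is needed and multiple frequencies can be generated simultaneously.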
Approximating Optimal Release in a Deterministic Model for the Sterile Insect Technique
Directory of Open Access Journals (Sweden)
Sergio Ramirez

2016-01-01
Full Text Available Cost/benefit analyses are essential to support management planning and decisions before launching any pest control program. In particular, applications of the sterile insect technique (SIT) are often prevented by the projected economic burden associated with rearing processes. This has had a deep impact on the technique's development and its use on insects with long larval periods, as often seen in beetles. Under the assumptions of long adult timespan and multiple mating, we show how to find approximate optimal sterile release policies that minimize costs. The theoretical framework proposed considers the release of insects by pulses and finds approximate optimal release sizes through stochastic searching. The scheme is then used to compare simulated release strategies obtained for different pulse schedules and release bounds, providing a platform for evaluating the convenience of increasing sterile male release intensity or extending the period of control.
Robust Optimization of Thermal Aspects of Friction Stir Welding Using Manifold Mapping Techniques
DEFF Research Database (Denmark)
Larsen, Anders Astrup; Lahaye, Domenico; Schmidt, Henrik Nikolaj Blicher
2008-01-01
The aim of this paper is to optimize a friction stir welding process taking robustness into account. The optimization problems are formulated with the goal of obtaining desired mean responses while reducing the variance of the response. We restrict ourselves to a thermal model of the process and use the manifold mapping technique to solve the optimization problems using a fast analytical coarse and an expensive accurate fine model. The statistics of the response are calculated using Taylor expansions and are compared to Monte Carlo simulations. The results show that the use of manifold mapping reduces the number of fine model evaluations required and that the Taylor expansion approach gives good results when compared to Monte Carlo simulations.
Optimization Techniques for Improving the Performance of Silicone-Based Dielectric Elastomers
DEFF Research Database (Denmark)
Skov, Anne Ladegaard; Yu, Liyun
2017-01-01
Dielectric elastomers are possible candidates for realizing products that are in high demand by society, such as soft robotics and prosthetics, tactile displays, and smart wearables. Diverse and advanced products based on dielectric elastomers are available; however, no elastomer has proven ideal … the electro-mechanical performance of dielectric elastomers are highlighted. Various optimization methods for improved energy transduction are investigated and discussed, with special emphasis placed on the promise each method holds. The compositing and blending of elastomers are shown to be simple, versatile … that there is not a single optimization technique that will lead to the universal optimization of dielectric elastomer films, though each method may lead to elastomers with certain features, and thus certain potentials.
Directory of Open Access Journals (Sweden)
Chein-Shan Liu
2014-01-01
Full Text Available To solve an unconstrained nonlinear minimization problem, we propose an optimal algorithm (OA) as well as a globally optimal algorithm (GOA), by deflecting the gradient direction to the best descent direction at each iteration step, and with an optimal parameter being derived explicitly. An invariant manifold defined for the model problem in terms of a locally quadratic function is used to derive a purely iterative algorithm, and the convergence is proven. Then, the rank-two updating techniques of BFGS are employed, which result in several novel algorithms that are faster than the steepest descent method (SDM) and the variable metric method (DFP). Six numerical examples are examined and compared with exact solutions, revealing that the new algorithms OA, GOA, and the updated ones have superior computational efficiency and accuracy.
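The rank-two (secant) updating the abstract refers to can be sketched in a few lines. The routine below is a generic BFGS with Armijo backtracking, offered as an assumed baseline rather than the OA/GOA algorithms themselves; the tolerance and line-search constants are conventional choices.

```python
import numpy as np

def bfgs(f, grad, x0, iters=200, tol=1e-8):
    """Minimal BFGS with Armijo backtracking: a sketch of the rank-two
    updating idea, not the paper's OA/GOA algorithms."""
    x = np.asarray(x0, float)
    H = np.eye(len(x))                     # inverse-Hessian approximation
    g = grad(x)
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                         # quasi-Newton search direction
        t = 1.0
        while f(x + t * p) > f(x) + 1e-4 * t * (g @ p):   # Armijo condition
            t *= 0.5
        s = t * p
        x_new = x + s
        y = grad(x_new) - g
        sy = s @ y
        if sy > 1e-12:                     # curvature safeguard keeps H positive definite
            rho = 1.0 / sy
            I = np.eye(len(x))
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)
        x, g = x_new, grad(x_new)
    return x
```

On the Rosenbrock function this reaches the minimizer (1, 1) far faster than plain steepest descent, which is the comparison the abstract draws.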
Optimization of brushless direct current motor design using an intelligent technique.
Shabanian, Alireza; Tousiwas, Armin Amini Poustchi; Pourmandi, Massoud; Khormali, Aminollah; Ataei, Abdolhay
2015-07-01
This paper presents a method for the optimal design of a slotless permanent magnet brushless DC (BLDC) motor with surface-mounted magnets using an improved bee algorithm (IBA). The characteristics of the motor are expressed as functions of motor geometries. The objective function is a combination of losses, volume and cost to be minimized simultaneously. This method is based on the capability of swarm-based algorithms in finding the optimal solution. One sample case is used to illustrate the performance of the design approach and optimization technique. The IBA has better performance and speed of convergence compared with the standard bee algorithm (BA). Simulation results show that the proposed method performs efficiently. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
Improved simple optimization (SOPT) algorithm for unconstrained non-linear optimization problems
Directory of Open Access Journals (Sweden)
J. Thomas
2016-09-01
Full Text Available In recent years, population-based meta-heuristics have been developed to solve non-linear optimization problems, which are difficult to solve using traditional methods. The simple optimization (SOPT) algorithm is one of the simple and efficient meta-heuristic techniques for solving non-linear optimization problems. In this paper, SOPT is compared with some of the well-known meta-heuristic techniques, viz. the Artificial Bee Colony algorithm (ABC), Particle Swarm Optimization (PSO), the Genetic Algorithm (GA) and Differential Evolution (DE). For comparison, the SOPT algorithm is coded in MATLAB and 25 standard test functions for unconstrained optimization having different characteristics are run 30 times each. The results of the experiments are compared with previously reported results of other algorithms. Promising and comparable results are obtained for most of the test problems. To improve the performance of SOPT, an improvement to the algorithm is proposed which helps it escape local optima when the algorithm gets trapped in them. In almost all the test problems, improved SOPT is able to reach the actual solution at least once in 30 runs.
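The proposed improvement, escaping a local optimum once the search stagnates, is a generic mechanism that can be sketched independently of SOPT's update rule. The toy search below restarts from a random point after a run of non-improving moves; the test function, step size, and patience threshold are illustrative assumptions, not the paper's settings.

```python
import numpy as np

def restart_hill_climb(f, lo, hi, evals=2000, patience=50, seed=0):
    """Greedy 1D local search that, after `patience` evaluations without
    improvement, restarts from a random point so it can leave a local
    optimum. A generic sketch of the stagnation-escape idea, not the
    authors' SOPT update rule."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(lo, hi); fx = f(x)
    best_x, best_f = x, fx
    stall = 0
    for _ in range(evals):
        cand = np.clip(x + rng.normal(0.0, 0.1 * (hi - lo)), lo, hi)
        fc = f(cand)
        if fc < fx:                        # greedy acceptance
            x, fx, stall = cand, fc, 0
            if fx < best_f:
                best_x, best_f = x, fx
        else:
            stall += 1
            if stall >= patience:          # trapped: jump somewhere new
                x = rng.uniform(lo, hi); fx = f(x); stall = 0
    return best_x, best_f
```

On a double-well function such as f(x) = (x^2 - 1)^2 + 0.3x, a purely greedy search started in the shallow right-hand basin stays there, while the restart mechanism lets the search reach the deeper basin near x = -1.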
Minisci, E.A.; Avanzini, G.
2008-01-01
Orbit transfer maneuvers are here considered as benchmark cases for comparing the performance of different optimization techniques in the framework of direct methods. Two different classes of evolutionary algorithms, a conventional genetic algorithm and an estimation of distribution method, are compared in terms of performance indices statistically evaluated over a prescribed number of runs. At the same time, two different types of problem representations are considered, a first one b...
Energy Technology Data Exchange (ETDEWEB)
AlRashidi, M.R. [Electrical Engineering Department, College of Technological Studies, Shuwaikh (Kuwait); El-Hawary, M.E. [Department of Electrical and Computer Engineering, Dalhousie University, Halifax, NS B3J 2X4 (Canada)
2009-04-15
Computational intelligence tools are attracting added attention in different research areas, and power systems research is no exception. This paper provides an overview of major computational issues with regard to the optimal power flow (OPF). It then offers a brief summary of major computational intelligence tools. A detailed coverage of most OPF-related research work that makes use of modern computational intelligence techniques is presented next. (author)
Bhattacharjee, Deblina; Paul, Anand; Kim, Jeong Hong; Kim, Mucheol
2016-01-01
The analysis of leukocyte images has drawn interest from the fields of both medicine and computer vision for quite some time, and different techniques have been applied to automate the process of manual analysis and classification of such images. Manual analysis of blood samples to identify leukocytes is time-consuming and susceptible to error due to the different morphological features of the cells. In this article, the nature-inspired plant growth simulation algorithm has been applied to optim...
Directory of Open Access Journals (Sweden)
Mehiddin Al-Baali
2015-12-01
Full Text Available We deal with the design of parallel algorithms by using variable partitioning techniques to solve nonlinear optimization problems. We propose an iterative solution method that is very efficient for separable functions, our scope being to discuss its performance for general functions. Experimental results on an illustrative example have suggested some useful modifications that, even though they improve the efficiency of our parallel method, leave some questions open for further investigation.
A NOVEL CASCADED H-BRIDGE MULTILEVEL INVERTER BASED ON OPTIMAL PWM TECHNIQUE
MAHESWARI, A.; GNANAMBAL, I
2013-01-01
In this paper, a novel cascaded H-bridge multilevel inverter using fewer switches has been proposed. A standard cascaded multilevel inverter requires 4h switches for (2h + 1) levels, where h is the number of dc sources; the proposed scheme requires fewer switches for the same number of levels. A novel cascaded H-bridge multilevel inverter fed induction motor shows better performance due to the fundamental-frequency switching scheme using the optimal PWM technique (OPWM). High qual...
Wroblewski, David [Mentor, OH; Katrompas, Alexander M [Concord, OH; Parikh, Neel J [Richmond Heights, OH
2009-09-01
A method and apparatus for optimizing the operation of a power generating plant using artificial intelligence techniques. One or more decisions D are determined for at least one consecutive time increment, where at least one of the decisions D is associated with a discrete variable for the operation of a power plant device in the power generating plant. In an illustrated embodiment, the power plant device is a soot cleaning device associated with a boiler.
Methods of modeling and optimization of work effects for chosen mineral processing systems
Directory of Open Access Journals (Sweden)
Tomasz Niedoba
2005-11-01
Full Text Available The methods used in mineral processing modeling are reviewed in this paper. In particular, the heuristic approach is presented. New, modern techniques of modeling and optimization are proposed, including the least median squares method and genetic algorithms. The rules of the latter are described in detail.
Multiple sensitive estimation and optimal sample size allocation in the item sum technique.
Perri, Pier Francesco; Rueda García, María Del Mar; Cobo Rodríguez, Beatriz
2017-09-27
For surveys of sensitive issues in life sciences, statistical procedures can be used to reduce nonresponse and social desirability response bias. Both of these phenomena provoke nonsampling errors that are difficult to deal with and can seriously compromise the validity of the analyses. The item sum technique (IST) is a very recent indirect questioning method derived from the item count technique that seeks to procure more reliable responses on quantitative items than direct questioning while preserving respondents' anonymity. This article addresses two important questions concerning the IST: (i) its implementation when two or more sensitive variables are investigated and efficient estimates of their unknown population means are required; (ii) the determination of the optimal sample size to achieve minimum variance estimates. These aspects are of great relevance for survey practitioners engaged in sensitive research and, to the best of our knowledge, have not been studied so far. In this article, theoretical results for multiple estimation and optimal allocation are obtained under a generic sampling design and then particularized to simple random sampling and stratified sampling designs. Theoretical considerations are integrated with a number of simulation studies based on data from two real surveys and conducted to ascertain the efficiency gain derived from optimal allocation in different situations. One of the surveys concerns cannabis consumption among university students. Our findings highlight some methodological advances that can be obtained in life sciences IST surveys when optimal allocation is achieved. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
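For the stratified design mentioned above, the classical benchmark is Neyman allocation, which minimizes the variance of the stratified mean for a fixed total sample size. The helper below implements that textbook rule as a point of reference only; the IST-specific allocation formulas derived in the article are not reproduced here.

```python
import numpy as np

def neyman_allocation(n, N, S):
    """Classical Neyman (minimum-variance) allocation for stratified
    sampling: stratum h receives n_h proportional to N_h * S_h, where
    N_h is the stratum size and S_h the stratum standard deviation."""
    N, S = np.asarray(N, float), np.asarray(S, float)
    w = N * S
    return n * w / w.sum()
```

Strata that are larger or more variable receive proportionally more of the sample, which is the intuition behind the efficiency gains the article reports.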
Ye, Zhiwei; Wang, Mingwei; Hu, Zhengbing; Liu, Wei
2015-01-01
Image enhancement is an important procedure of image processing and analysis. This paper presents a new technique that uses a modified measure and a blend of cuckoo search and particle swarm optimization (CS-PSO) to adaptively enhance low-contrast images. Contrast enhancement is obtained by global transformation of the input intensities; it employs the incomplete Beta function as the transformation function and a novel criterion for measuring image quality considering three factors: threshold, entropy value, and gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail in an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared with other existing techniques such as linear contrast stretching, histogram equalization, and evolutionary-computing-based image enhancement methods like the backtracking search algorithm, differential search algorithm, genetic algorithm, and particle swarm optimization in terms of processing time and image quality. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered in the paper. PMID:25784928
Directory of Open Access Journals (Sweden)
Zhiwei Ye
2015-01-01
Full Text Available Image enhancement is an important procedure of image processing and analysis. This paper presents a new technique that uses a modified measure and a blend of cuckoo search and particle swarm optimization (CS-PSO) to adaptively enhance low-contrast images. Contrast enhancement is obtained by global transformation of the input intensities; it employs the incomplete Beta function as the transformation function and a novel criterion for measuring image quality considering three factors: threshold, entropy value, and gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail in an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared with other existing techniques such as linear contrast stretching, histogram equalization, and evolutionary-computing-based image enhancement methods like the backtracking search algorithm, differential search algorithm, genetic algorithm, and particle swarm optimization in terms of processing time and image quality. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered in the paper.
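The incomplete Beta transformation at the heart of this method can be sketched with plain NumPy. The function below maps normalized intensities through the regularized incomplete Beta function I_x(a, b), computed by trapezoidal integration; in the paper the exponents (a, b) are tuned by CS-PSO, whereas here they are fixed by hand as an assumption.

```python
import numpy as np

def incomplete_beta_transform(img, a, b, grid=2048):
    """Global contrast transform: map normalized intensities in [0, 1]
    through the regularized incomplete Beta function I_x(a, b). The
    numerical quadrature is valid for a, b >= 1; in the CS-PSO paper
    the exponents are the parameters the hybrid swarm optimizes."""
    t = np.linspace(0.0, 1.0, grid)
    pdf = t ** (a - 1) * (1 - t) ** (b - 1)
    cdf = np.concatenate(([0.0],
                          np.cumsum((pdf[1:] + pdf[:-1]) * 0.5 * np.diff(t))))
    cdf /= cdf[-1]                       # normalize so I_1(a, b) = 1
    return np.interp(img, t, cdf)        # look up I_x for each pixel
```

For a = b = 2 the transform reduces to the smoothstep 3x^2 - 2x^3, which stretches intensities clustered around mid-gray and so raises the contrast of a low-contrast image.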
A Hybrid Ant Colony Optimization Algorithm for the Extended Capacitated Arc Routing Problem.
Li-Ning Xing; Rohlfshagen, P; Ying-Wu Chen; Xin Yao
2011-08-01
The capacitated arc routing problem (CARP) is representative of numerous practical applications, and in order to widen its scope, we consider an extended version of this problem that entails both total service time and fixed investment costs. We subsequently propose a hybrid ant colony optimization (ACO) algorithm (HACOA) to solve instances of the extended CARP. This approach is characterized by the exploitation of heuristic information, adaptive parameters, and local optimization techniques: Two kinds of heuristic information, arc cluster information and arc priority information, are obtained continuously from the solutions sampled to guide the subsequent optimization process. The adaptive parameters ease the burden of choosing initial values and facilitate improved and more robust results. Finally, local optimization, based on the two-opt heuristic, is employed to improve the overall performance of the proposed algorithm. The resulting HACOA is tested on four sets of benchmark problems containing a total of 87 instances with up to 140 nodes and 380 arcs. In order to evaluate the effectiveness of the proposed method, some existing capacitated arc routing heuristics are extended to cope with the extended version of this problem; the experimental results indicate that the proposed ACO method outperforms these heuristics.
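The basic ACO machinery HACOA builds on (probabilistic tour construction from pheromone and heuristic visibility, followed by evaporation and reinforcement) can be sketched on the ordinary TSP. The code below is a minimal Ant System, not the paper's HACOA; the parameter values and reinforcement rule are standard textbook choices.

```python
import numpy as np

def aco_tsp(dist, ants=20, iters=50, alpha=1.0, beta=2.0, rho=0.5, seed=0):
    """Minimal Ant System for the TSP: ants build tours guided by
    pheromone (tau^alpha) and heuristic visibility (1/d)^beta, then
    pheromone evaporates and is reinforced along good tours."""
    rng = np.random.default_rng(seed)
    n = len(dist)
    eta = 1.0 / (dist + np.eye(n))           # visibility; diagonal padded to avoid /0
    tau = np.ones((n, n))
    best_tour, best_len = None, np.inf
    for _ in range(iters):
        tours = []
        for _ in range(ants):
            tour = [rng.integers(n)]
            unvisited = set(range(n)) - {tour[0]}
            while unvisited:
                i = tour[-1]
                cand = np.array(sorted(unvisited))
                w = tau[i, cand] ** alpha * eta[i, cand] ** beta
                tour.append(rng.choice(cand, p=w / w.sum()))
                unvisited.discard(tour[-1])
            length = sum(dist[tour[k], tour[(k + 1) % n]] for k in range(n))
            tours.append((length, tour))
            if length < best_len:
                best_len, best_tour = length, tour
        tau *= 1.0 - rho                      # evaporation
        for length, tour in tours:            # reinforcement proportional to quality
            for k in range(n):
                i, j = tour[k], tour[(k + 1) % n]
                tau[i, j] += 1.0 / length
                tau[j, i] += 1.0 / length
    return best_tour, best_len
```

HACOA layers adaptive parameters and two-opt local optimization on top of this skeleton and replaces node tours with arc-routing solutions.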
Optimal Plant Layout Design for Process-focused Systems
Khoshnevisan, M.; Bhattacharya, S.; Smarandache, F.
2003-01-01
In this paper we have proposed a semi-heuristic optimization algorithm for designing optimal plant layouts in process-focused manufacturing/service facilities. Being a semi-heuristic search, our algorithm is likely to be more efficient in terms of CPU time, as it tends to converge on the global optimum faster than the traditional CRAFT algorithm - a pure heuristic.
Heuristic Evaluation on Mobile Interfaces: A New Checklist
Directory of Open Access Journals (Sweden)
Rosa Yáñez Gómez
2014-01-01
Full Text Available The rapid evolution and adoption of mobile devices raise new usability challenges, given their limitations (in screen size, battery life, etc.) as well as the specific requirements of this new interaction. Traditional evaluation techniques need to be adapted in order for these requirements to be met. Heuristic evaluation (HE), an Inspection Method based on evaluation conducted by experts over a real system or prototype, is based on checklists which are desktop-centred and do not adequately detect mobile-specific usability issues. In this paper, we propose a compilation of heuristic evaluation checklists taken from the existing bibliography but readapted to new mobile interfaces. Selecting and rearranging these heuristic guidelines offers a tool which works well not just for evaluation but also as a best-practices checklist. The result is a comprehensive checklist which is experimentally evaluated as a design tool. This experimental evaluation involved two software engineers without any specific knowledge about usability, a group of ten users who compared the usability of a first prototype designed without our heuristics, and a second one after applying the proposed checklist. The results of this experiment show the usefulness of the proposed checklist for avoiding usability gaps even with nontrained developers.
Heuristic Evaluation on Mobile Interfaces: A New Checklist
Yáñez Gómez, Rosa; Cascado Caballero, Daniel; Sevillano, José-Luis
2014-01-01
The rapid evolution and adoption of mobile devices raise new usability challenges, given their limitations (in screen size, battery life, etc.) as well as the specific requirements of this new interaction. Traditional evaluation techniques need to be adapted in order for these requirements to be met. Heuristic evaluation (HE), an Inspection Method based on evaluation conducted by experts over a real system or prototype, is based on checklists which are desktop-centred and do not adequately detect mobile-specific usability issues. In this paper, we propose a compilation of heuristic evaluation checklists taken from the existing bibliography but readapted to new mobile interfaces. Selecting and rearranging these heuristic guidelines offer a tool which works well not just for evaluation but also as a best-practices checklist. The result is a comprehensive checklist which is experimentally evaluated as a design tool. This experimental evaluation involved two software engineers without any specific knowledge about usability, a group of ten users who compared the usability of a first prototype designed without our heuristics, and a second one after applying the proposed checklist. The results of this experiment show the usefulness of the proposed checklist for avoiding usability gaps even with nontrained developers. PMID:25295300
Luĉić, Felipe; Sánchez-Nieto, Beatriz; Caprile, Paola; Zelada, Gabriel; Goset, Karen
2013-09-06
Total skin electron irradiation (TSEI) has been used as a treatment for mycosis fungoides. Our center has implemented a modified Stanford technique with six pairs of 6 MeV adjacent electron beams, incident perpendicularly on the patient who remains lying on a translational platform, at 200 cm from the source. The purpose of this study is to perform a dosimetric characterization of this technique and to investigate its optimization in terms of energy characteristics, extension, and uniformity of the treatment field. In order to improve the homogeneity of the distribution, a custom-made polyester filter of variable thickness and a uniform PMMA degrader plate were used. It was found that the characteristics of a 9 MeV beam with an 8 mm thick degrader were similar to those of the 6 MeV beam without filter, but with an increased surface dose. The combination of the degrader and the polyester filter improved the uniformity of the distribution along the dual field (180 cm long), increasing the dose at the borders of the field by 43%. The optimum angles for the pair of beams were ± 27°. This configuration avoided displacement of the patient, and reduced the treatment time and the positioning problems related to the abutting superior and inferior fields. Dose distributions in the transversal plane were measured for the six incidences of the Stanford technique with film dosimetry in an anthropomorphic pelvic phantom. This was performed for the optimized treatment and compared with the previously implemented technique. The comparison showed an increased superficial dose and improved uniformity of the 85% isodose curve coverage for the optimized technique.
Précis of Simple heuristics that make us smart.
Todd, P M; Gigerenzer, G
2000-10-01
How can anyone be rational in a world where knowledge is limited, time is pressing, and deep thought is often an unattainable luxury? Traditional models of unbounded rationality and optimization in cognitive science, economics, and animal behavior have tended to view decision-makers as possessing supernatural powers of reason, limitless knowledge, and endless time. But understanding decisions in the real world requires a more psychologically plausible notion of bounded rationality. In Simple heuristics that make us smart (Gigerenzer et al. 1999), we explore fast and frugal heuristics--simple rules in the mind's adaptive toolbox for making decisions with realistic mental resources. These heuristics can enable both living organisms and artificial systems to make smart choices quickly and with a minimum of information by exploiting the way that information is structured in particular environments. In this précis, we show how simple building blocks that control information search, stop search, and make decisions can be put together to form classes of heuristics, including: ignorance-based and one-reason decision making for choice, elimination models for categorization, and satisficing heuristics for sequential search. These simple heuristics perform comparably to more complex algorithms, particularly when generalizing to new data--that is, simplicity leads to robustness. We present evidence regarding when people use simple heuristics and describe the challenges to be addressed by this research program.
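One-reason decision making, as described above, is easy to state in code. The sketch below implements a take-the-best style comparison: cues are checked in order of validity and the first discriminating cue decides. The binary cue encoding and example data are illustrative assumptions, not material from the book.

```python
def take_the_best(a, b, cues, validities):
    """One-reason decision making: examine cues in descending order of
    validity and decide on the first cue that discriminates between
    options a and b. Returns the option predicted to score higher, or
    None when no cue discriminates (the heuristic would then guess).
    `cues` maps each option to a dict of binary cue values."""
    for cue in sorted(validities, key=validities.get, reverse=True):
        va, vb = cues[a].get(cue, 0), cues[b].get(cue, 0)
        if va != vb:                      # first discriminating cue decides
            return a if va > vb else b
    return None
```

The search order, stopping rule, and decision rule are exactly the three building blocks the précis describes; no cue weighting or integration is ever performed.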
Social biases determine spatiotemporal sparseness of ciliate mating heuristics.
Clark, Kevin B
2012-01-01
Ciliates become highly social, even displaying animal-like qualities, in the joint presence of aroused conspecifics and nonself mating pheromones. Pheromone detection putatively helps trigger instinctual and learned courtship and dominance displays from which social judgments are made about the availability, compatibility, and fitness representativeness or likelihood of prospective mates and rivals. In earlier studies, I demonstrated the heterotrich Spirostomum ambiguum improves mating competence by effecting preconjugal strategies and inferences in mock social trials via behavioral heuristics built from Hebbian-like associative learning. Heuristics embody serial patterns of socially relevant action that evolve into ordered, topologically invariant computational networks supporting intra- and intermate selection. S. ambiguum employs heuristics to acquire, store, plan, compare, modify, select, and execute sets of mating propaganda. One major adaptive constraint over formation and use of heuristics involves a ciliate's initial subjective bias, responsiveness, or preparedness, as defined by Stevens' Law of subjective stimulus intensity, for perceiving the meaningfulness of mechanical pressures accompanying cell-cell contacts and additional perimating events. This bias controls durations and valences of nonassociative learning, search rates for appropriate mating strategies, potential net reproductive payoffs, levels of social honesty and deception, successful error diagnosis and correction of mating signals, use of insight or analysis to solve mating dilemmas, bioenergetics expenditures, and governance of mating decisions by classical or quantum statistical mechanics. I now report this same social bias also differentially affects the spatiotemporal sparseness, as measured with metric entropy, of ciliate heuristics. Sparseness plays an important role in neural systems through optimizing the specificity, efficiency, and capacity of memory representations. The present
A single cognitive heuristic process meets the complexity of domain-specific moral heuristics.
Dubljević, Veljko; Racine, Eric
2014-10-01
The inherence heuristic (a) offers modest insights into the complex nature of both the is-ought tension in moral reasoning and moral reasoning per se, and (b) does not reflect the complexity of domain-specific moral heuristics. Formal and general in nature, we contextualize the process described as "inherence heuristic" in a web of domain-specific heuristics (e.g., agent specific; action specific; consequences specific).
Heuristic Synthesis of Reversible Logic – A Comparative Study
Directory of Open Access Journals (Sweden)
Chua Shin Cheng
2014-01-01
Full Text Available Reversible logic circuits have historically been motivated by theoretical research in low-power design, and have recently attracted interest as components of quantum algorithms, optical computing and nanotechnology. However, due to the intrinsic properties of reversible logic, traditional irreversible logic design and synthesis methods cannot be applied, so new algorithms have been developed to correctly synthesize reversible logic circuits. This paper presents a comprehensive literature review with a comparative study of heuristic-based reversible logic synthesis. It reviews a range of heuristic-based reversible logic synthesis techniques reported by researchers (BDD-based, cycle-based, search-based, non-search-based, rule-based, transformation-based, and ESOP-based). All techniques are described in detail and summarized in a table based on their features, limitations, libraries used and the metrics they consider. Benchmark comparisons of gate count and quantum cost are analysed for each synthesis technique. Comparing the synthesis algorithms' outputs over the years, it can be observed that different approaches have been used for the synthesis of reversible circuits. However, the improvements are not significant. Quantum cost and gate count have improved over the years, but debate continues on certain issues, such as garbage outputs, that remain unresolved. This paper provides information on all heuristic-based reversible logic synthesis methods proposed over the years. All techniques are explained in detail, making the paper informative for new reversible logic researchers and bridging the knowledge gap in this area.
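The defining property that makes reversible synthesis different is that every gate computes a bijection on its inputs. A minimal check on the Toffoli (CCNOT) gate, the workhorse of many of the libraries the surveyed techniques target, makes this concrete:

```python
from itertools import product

def toffoli(a, b, c):
    """Toffoli (CCNOT) gate: flips the target bit c only when both
    control bits a and b are 1. Like every reversible gate, it is a
    bijection on its input space and is its own inverse."""
    return a, b, c ^ (a & b)

# every 3-bit input maps to a distinct 3-bit output, i.e. a bijection
outputs = {toffoli(*bits) for bits in product((0, 1), repeat=3)}
```

Because no input information is destroyed, circuits built from such gates avoid the garbage-output and fan-out freedoms that irreversible synthesis relies on, which is why dedicated algorithms are needed.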
DEFF Research Database (Denmark)
Wang, Yong; Cai, Zixing; Zhou, Yuren
2009-01-01
A novel approach to deal with numerical and engineering constrained optimization problems, which incorporates a hybrid evolutionary algorithm and an adaptive constraint-handling technique, is presented in this paper. The hybrid evolutionary algorithm simultaneously uses simplex crossover and two mutation operators to generate the offspring population. Additionally, the adaptive constraint-handling technique consists of three main situations. In detail, at each situation, one constraint-handling mechanism is designed based on the current population state. Experiments on 13 benchmark test functions and four well-known constrained design problems verify the effectiveness and efficiency of the proposed method. The experimental results show that integrating the hybrid evolutionary algorithm with the adaptive constraint-handling technique is beneficial, and the proposed method achieves competitive…
Heuristics and bias in homeopathy.
Souter, K
2006-10-01
The practice of Homeopathy ought to be strictly logical. In the Organon Samuel Hahnemann gives the impression that the unprejudiced observer should be able to follow an algorithmic route to the simillimum in every case. Judgement and Decision Research, however, indicates that when people grapple with complex systems like homeopathy they are more likely to use heuristics or empirical rules to help them reach a solution. Thus Hahnemann's concept of the unprejudiced observer is virtually impossible to attain. There is inevitable bias in both case-taking and remedy selection. Understanding the types of bias may enable the practitioner to reduce his/her own bias.
Energy Technology Data Exchange (ETDEWEB)
Delahaye, P., E-mail: delahaye@ganil.fr; Jardin, P.; Maunoury, L. [GANIL, CEA/DSM-CNRS/IN2P3, Blvd. Becquerel, BP 55027, 14076 Caen Cedex 05 (France); Galatà, A.; Patti, G. [INFN–Laboratori Nazionali di Legnaro, Viale dell’Università 2, 35020 Legnaro (Padova) (Italy); Angot, J.; Lamy, T.; Thuillier, T. [LPSC–Université Grenoble Alpes–CNRS/IN2P3, 53 rue des Martyrs, 38026 Grenoble Cedex (France); Cam, J. F.; Traykov, E.; Ban, G. [LPC Caen, 6 Blvd. Maréchal Juin, 14050 Caen Cedex (France); Celona, L. [INFN–Laboratori Nazionali del Sud, via S. Sofia 62, 95125 Catania (Italy); Choinski, J.; Gmaj, P. [Heavy Ion Laboratory, University of Warsaw, ul. Pasteura 5a, 02 093 Warsaw (Poland); Koivisto, H.; Kolhinen, V.; Tarvainen, O. [Department of Physics, University of Jyväskylä, PB 35 (YFL), 40351 Jyväskylä (Finland); Vondrasek, R. [Argonne National Laboratory, 9700 S. Cass Avenue, Argonne, Illinois 60439 (United States); Wenander, F. [ISOLDE, CERN, 1211 Geneva 23 (Switzerland)
2016-02-15
The present paper summarizes the results obtained over the past few years in the framework of the Enhanced Multi-Ionization of short-Lived Isotopes for Eurisol (EMILIE) project. The EMILIE project aims at improving charge breeding techniques with both Electron Cyclotron Resonance Ion Sources (ECRIS) and Electron Beam Ion Sources (EBISs) for European Radioactive Ion Beam (RIB) facilities. Within EMILIE, an original technique for debunching the beam from EBIS charge breeders is being developed, for making optimal use of the capabilities of the CW post-accelerators of the future facilities. Such a debunching technique should eventually resolve the duty cycle and time structure issues which presently complicate the data acquisition of experiments. The results of the first tests of this technique are reported here. In comparison with charge breeding with an EBIS, the ECRIS technique had lower performance in efficiency and attainable charge state for metallic ion beams and also suffered from issues related to beam contamination. In recent years, improvements have been made which significantly reduce the differences between the two techniques, making ECRIS charge breeding more attractive especially for CW machines producing intense beams. Upgraded versions of the Phoenix charge breeder, originally developed by LPSC, will be used at SPES and GANIL/SPIRAL. These two charge breeders have benefited from studies undertaken within EMILIE, which are also briefly summarized here.
Visualization for Hyper-Heuristics: Back-End Processing
Energy Technology Data Exchange (ETDEWEB)
Simon, Luke [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2015-03-01
Modern society is faced with increasingly complex problems, many of which can be formulated as generate-and-test optimization problems. Yet general-purpose optimization algorithms may sometimes require too much computational time. In these instances, hyper-heuristics may be used. Hyper-heuristics automate the design of algorithms, creating a custom algorithm for a particular scenario that finds solutions significantly faster than its general-purpose predecessor. However, it may be difficult to understand exactly how a design was derived and why it should be trusted. This project aims to address these issues by creating an easy-to-use graphical user interface (GUI) for hyper-heuristics and an easy-to-understand scientific visualization for the produced solutions. To support the development of this GUI, my portion of the research involved developing algorithms to parse the data produced by the hyper-heuristics. This data would then be sent to the front-end, where it would be displayed to the end user.
Directory of Open Access Journals (Sweden)
Ali Gerami Matin
2017-10-01
Full Text Available Optimized road maintenance planning seeks solutions that minimize the life-cycle cost of a road network while maximizing pavement condition. Aiming at proposing an optimal set of road maintenance solutions, robust meta-heuristic algorithms are used in this research. Two main optimization approaches are applied: single-objective and multi-objective optimization. Genetic algorithms (GA), particle swarm optimization (PSO), and a combination of genetic algorithm and particle swarm optimization (GAPSO) are used as single-objective techniques, while the non-dominated sorting genetic algorithm II (NSGAII) and multi-objective particle swarm optimization (MOPSO), which are suitable for solving computationally complex, large-size optimization problems, are applied and compared as multi-objective techniques. A real case study from the rural transportation network of Iran is employed to illustrate the efficiency of the optimum algorithm. The optimization model is formulated so that a cost-effective maintenance strategy is reached while preserving the performance level of the road network at a desirable level; the objective functions are therefore pavement performance maximization and maintenance cost minimization. It is concluded that the multi-objective algorithms (NSGAII and MOPSO) performed better than the single-objective algorithms owing to their capability to balance both objectives, and that among the multi-objective algorithms NSGAII provides the optimum solution for road maintenance planning.
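The non-dominated sorting at the heart of NSGAII can be sketched compactly (fast non-dominated sort over objective vectors; minimization is assumed, and the function names are illustrative):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def fast_nondominated_sort(points):
    """Group objective vectors into Pareto fronts (front 0 = best), NSGA-II style."""
    fronts = [[]]
    dominated_by = {i: [] for i in range(len(points))}  # solutions i dominates
    n_dominating = [0] * len(points)                    # how many dominate i
    for i, p in enumerate(points):
        for j, q in enumerate(points):
            if dominates(p, q):
                dominated_by[i].append(j)
            elif dominates(q, p):
                n_dominating[i] += 1
        if n_dominating[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in dominated_by[i]:
                n_dominating[j] -= 1
                if n_dominating[j] == 0:
                    nxt.append(j)
        k += 1
        fronts.append(nxt)
    return fronts[:-1]
```

For a maintenance plan, each point would be a (cost, pavement-condition-deficit) pair, and front 0 would contain the trade-off solutions reported to the decision maker.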
Andriani, Dian; Wresta, Arini; Atmaja, Tinton Dwi; Saepudin, Aep
2014-02-01
Biogas from anaerobic digestion of organic materials is a renewable energy resource consisting mainly of CH4 and CO2. Trace components often present in biogas are water vapor, hydrogen sulfide, siloxanes, hydrocarbons, ammonia, oxygen, carbon monoxide, and nitrogen. Considering that biogas is a clean and renewable form of energy that could well substitute for conventional energy sources (fossil fuels), the optimization of this type of energy becomes essential. Various optimization techniques for the biogas production process have been developed, including pretreatment, biotechnological approaches, co-digestion, and the use of serial digesters. For some applications, a certain degree of biogas purity is needed. The presence of CO2 and other trace components in biogas can affect engine performance adversely. Reducing the CO2 content significantly upgrades the quality of biogas and enhances its calorific value. Upgrading is generally performed in order to meet the standards for use as vehicle fuel or for injection into the natural gas grid. Different methods for biogas upgrading are used; they differ in their functioning, the required quality of the incoming gas, and their efficiency. Biogas can be purified of CO2 using pressure swing adsorption, membrane separation, or physical or chemical CO2 absorption. This paper reviews the various techniques that can be used to optimize biogas production as well as to upgrade biogas quality.
Artificial intelligent techniques for optimizing water allocation in a reservoir watershed
Chang, Fi-John; Chang, Li-Chiu; Wang, Yu-Chung
2014-05-01
This study proposes a systematic water allocation scheme that integrates system analysis with artificial intelligence techniques for reservoir operation, in consideration of the great hydrometeorological uncertainty, to mitigate drought impacts on the public and irrigation sectors. The AI techniques mainly include a genetic algorithm (GA) and an adaptive network-based fuzzy inference system (ANFIS). We first derive evaluation diagrams through systematic interactive evaluations of long-term hydrological data, to provide a clear simulation perspective of all possible drought conditions tagged with their corresponding water shortages; we then search for the optimal reservoir operating histogram using the GA, based on given demands and hydrological conditions, which serves as the optimal basis of input-output training patterns for modelling; and we finally build a suitable water allocation scheme by constructing an ANFIS model that learns the mechanism between designed inputs (water discount rates and hydrological conditions) and outputs (two scenarios: simulated and optimized water deficiency levels). The effectiveness of the proposed approach is tested on the operation of the Shihmen Reservoir in northern Taiwan for the first paddy crop in the study area, to assess the water allocation mechanism during drought periods. We demonstrate that the proposed water allocation scheme reliably helps water managers determine a suitable discount rate on water supply for both the irrigation and public sectors, and thus can reduce the drought risk and the compensation amount induced by restrictions on agricultural water use.
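The GA search step can be illustrated with a minimal real-coded GA. The sphere objective below is a toy stand-in for the study's water-deficit objective, and all operator choices and parameter values are illustrative, not those of the paper:

```python
import random

def genetic_search(objective, bounds, pop_size=30, generations=60, seed=1):
    """Minimal real-coded GA (tournament selection, arithmetic crossover,
    gaussian mutation) minimizing `objective` over box bounds [(lo, hi), ...]."""
    rng = random.Random(seed)
    dim = len(bounds)

    def clip(v, i):
        lo, hi = bounds[i]
        return min(max(v, lo), hi)

    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    best = min(pop, key=objective)
    for _ in range(generations):
        nxt = [best[:]]                                  # elitism
        while len(nxt) < pop_size:
            p1 = min(rng.sample(pop, 3), key=objective)  # tournament selection
            p2 = min(rng.sample(pop, 3), key=objective)
            a = rng.random()                             # arithmetic crossover
            child = [clip(a * x + (1 - a) * y, i)
                     for i, (x, y) in enumerate(zip(p1, p2))]
            if rng.random() < 0.2:                       # mutate one gene
                i = rng.randrange(dim)
                child[i] = clip(child[i] + rng.gauss(0.0, 0.1 * (bounds[i][1] - bounds[i][0])), i)
            nxt.append(child)
        pop = nxt
        best = min(pop, key=objective)
    return best, objective(best)

# toy stand-in for the study's water-deficit objective
x_best, f_best = genetic_search(lambda v: sum(t * t for t in v), [(-5.0, 5.0)] * 3)
```

In the study's setting, the decision vector would instead encode the reservoir release schedule, and the optimized input-output pairs would feed the ANFIS training stage.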
Integration of ab-initio nuclear calculation with derivative free optimization technique
Energy Technology Data Exchange (ETDEWEB)
Sharda, Anurag [Iowa State Univ., Ames, IA (United States)
2008-01-01
Optimization techniques are finding their way into nuclear physics calculations, where the objective functions are very complex and computationally intensive. A vast parameter space must be searched to obtain a good match between theoretical (computed) and experimental observables, such as energy levels and spectra. Manual calculation is beyond the scope of such complex work and is prone to error. This work formulates and implements a design that integrates the ab initio nuclear physics code MFDn with the VTDIRECT95 code, a Fortran 95 suite of parallel code implementing the derivative-free optimization algorithm DIRECT. The proposed design is implemented for both serial and parallel versions of the optimization technique. Experiments with the initial implementation of the design show good matches for several single-nucleus cases. Determination and assignment of an appropriate number of processors for the parallel integration code is implemented to increase efficiency and resource utilization in the case of multi-nucleus parameter searches.
Directory of Open Access Journals (Sweden)
Syed Hamid Hussain Madni
Full Text Available Cloud computing infrastructure is suitable for meeting the computational needs of large task sizes. Optimal scheduling of tasks in a cloud computing environment has been proved to be an NP-complete problem, hence the need for heuristic methods. Several heuristic algorithms have been developed and used to address this problem, but choosing the appropriate algorithm for a task assignment problem of a particular nature is difficult, since the methods are developed under different assumptions. Therefore, six rule-based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments, with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing.
Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Abdulhamid, Shafi'i Muhammad; Usman, Mohammed Joda
2017-01-01
Cloud computing infrastructure is suitable for meeting the computational needs of large task sizes. Optimal scheduling of tasks in a cloud computing environment has been proved to be an NP-complete problem, hence the need for heuristic methods. Several heuristic algorithms have been developed and used to address this problem, but choosing the appropriate algorithm for a task assignment problem of a particular nature is difficult, since the methods are developed under different assumptions. Therefore, six rule-based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments, with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing.
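Of the six heuristics, Min-min is representative: it repeatedly assigns the task whose earliest completion time is smallest. A sketch over an expected-time-to-compute (ETC) matrix, the usual formulation for these comparisons (the ETC data below are illustrative):

```python
def min_min_schedule(etc):
    """Min-min heuristic. etc[t][m] is the expected execution time of task t
    on machine m. Repeatedly pick the (task, machine) pair with the smallest
    completion time and assign it. Returns (assignment, makespan)."""
    n_tasks, n_machines = len(etc), len(etc[0])
    ready = [0.0] * n_machines          # machine ready times
    unassigned = set(range(n_tasks))
    assignment = {}
    while unassigned:
        # smallest completion time over all unassigned tasks and machines
        t, m, ct = min(
            ((t, m, ready[m] + etc[t][m])
             for t in unassigned for m in range(n_machines)),
            key=lambda x: x[2],
        )
        assignment[t] = m
        ready[m] = ct
        unassigned.remove(t)
    return assignment, max(ready)
```

Max-min differs only in selecting, among each task's best completion times, the largest one first; MCT and MET are the single-pass analogues.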
Hernandez, Wilmar
2007-01-01
In this paper, a survey of recent applications of optimal signal processing techniques to improve the performance of mechanical sensors is presented. A comparison between classical filters and optimal filters for automotive sensors is made, and the current state of the art in applying robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today's cars need is presented through several experimental results, which show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way forward. However, the switch between the traditional methods of designing automotive sensors and the new ones cannot be made overnight, because some open research issues remain to be solved. This paper draws attention to one of these open research issues and tries to arouse researchers' interest in the fusion of intelligent sensors and optimal signal processing techniques.
Optimized Scheduling Technique of Null Subcarriers for Peak Power Control in 3GPP LTE Downlink
Park, Sang Kyu
2014-01-01
Orthogonal frequency division multiple access (OFDMA) is a key multiple access technique for the long term evolution (LTE) downlink. However, a high peak-to-average power ratio (PAPR) can degrade power efficiency. The well-known PAPR reduction technique, dummy sequence insertion (DSI), can be a realistic solution because of its structural simplicity. However, the large number of subcarriers used for the dummy sequences may decrease the transmitted data rate in the DSI scheme. In this paper, a novel DSI scheme is applied to the LTE system. Firstly, we obtain the null subcarriers in single-input single-output (SISO) and multiple-input multiple-output (MIMO) systems, respectively; then, optimized dummy sequences are inserted into the obtained null subcarriers. Simulation results show that the Walsh-Hadamard transform (WHT) sequence is the best choice for the dummy sequence and that a ratio of 16 to 20 for the WHT and randomly generated sequences gives the maximum PAPR reduction performance. The near-optimal number of iterations is derived to prevent excessive iterations. It is also shown that there is no bit error rate (BER) degradation with the proposed technique in the LTE downlink system. PMID:24883376
Optimized scheduling technique of null subcarriers for peak power control in 3GPP LTE downlink.
Cho, Soobum; Park, Sang Kyu
2014-01-01
Orthogonal frequency division multiple access (OFDMA) is a key multiple access technique for the long term evolution (LTE) downlink. However, a high peak-to-average power ratio (PAPR) can degrade power efficiency. The well-known PAPR reduction technique, dummy sequence insertion (DSI), can be a realistic solution because of its structural simplicity. However, the large number of subcarriers used for the dummy sequences may decrease the transmitted data rate in the DSI scheme. In this paper, a novel DSI scheme is applied to the LTE system. Firstly, we obtain the null subcarriers in single-input single-output (SISO) and multiple-input multiple-output (MIMO) systems, respectively; then, optimized dummy sequences are inserted into the obtained null subcarriers. Simulation results show that the Walsh-Hadamard transform (WHT) sequence is the best choice for the dummy sequence and that a ratio of 16 to 20 for the WHT and randomly generated sequences gives the maximum PAPR reduction performance. The near-optimal number of iterations is derived to prevent excessive iterations. It is also shown that there is no bit error rate (BER) degradation with the proposed technique in the LTE downlink system.
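The quantities involved are easy to state in code: PAPR is the peak-to-mean power ratio of the time-domain OFDM symbol, and DSI searches over dummy values on unused subcarriers to lower it. A simplified sketch, using random BPSK dummies rather than the paper's WHT sequences and ignoring LTE numerology:

```python
import cmath
import math
import random

def papr_db(symbols, oversample=4):
    """PAPR (dB) of one OFDM symbol: oversampled IDFT of the subcarrier
    vector, then peak power over mean power."""
    n = len(symbols) * oversample
    padded = list(symbols) + [0j] * (n - len(symbols))   # zero-padded IDFT
    time = [sum(padded[k] * cmath.exp(2j * math.pi * k * t / n)
                for k in range(n)) / n for t in range(n)]
    powers = [abs(x) ** 2 for x in time]
    return 10 * math.log10(max(powers) / (sum(powers) / n))

def dsi_reduce(data, null_idx, trials=50, seed=0):
    """DSI sketch: try random BPSK dummies on the null subcarriers and keep
    the assignment with the lowest PAPR."""
    rng = random.Random(seed)
    best = None
    for _ in range(trials):
        sym = list(data)
        for i in null_idx:
            sym[i] = complex(rng.choice((-1, 1)), 0)
        p = papr_db(sym)
        if best is None or p < best[0]:
            best = (p, sym)
    return best
```

An all-ones block of N subcarriers is the worst case (all carriers add coherently, PAPR = 10 log10 N), which is why inserting de-correlating dummies on the nulls helps.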
Human motion planning based on recursive dynamics and optimal control techniques
Lo, Janzen; Huang, Gang; Metaxas, Dimitris
2002-01-01
This paper presents an efficient optimal control and recursive dynamics-based computer animation system for simulating and controlling the motion of articulated figures. A quasi-Newton nonlinear programming technique (with super-linear convergence) is implemented to solve minimum-torque-based human motion-planning problems. The explicit analytical gradients needed in the dynamics are derived using a matrix exponential formulation and Lie algebra. Cubic spline functions are used to make the search space for an optimal solution finite. Based on our formulations, our method is well conditioned and robust, in addition to being computationally efficient. To better illustrate the efficiency of our method, we present results of natural-looking and physically correct human motions for a variety of human motion tasks involving open and closed-loop kinematic chains.
Mirapeix, J; Cobo, A; González, D A; López-Higuera, J M
2007-02-19
A new plasma spectroscopy analysis technique based on the generation of synthetic spectra by means of optimization processes is presented in this paper. The technique has been developed for its application in arc-welding quality assurance. The new approach has been checked through several experimental tests, yielding results in reasonably good agreement with the ones offered by the traditional spectroscopic analysis technique.
Cooperative heuristic multi-agent planning
De Weerdt, M.M.; Tonino, J.F.M.; Witteveen, C.
2001-01-01
In this paper we will use the framework to study cooperative heuristic multi-agent planning. During the construction of their plans, the agents use a heuristic function inspired by the FF planner [13]. At any time in the process of planning the agents may exchange available resources, or they may
Effective Heuristics for New Venture Formation
Kraaijenbrink, Jeroen
2010-01-01
Entrepreneurs are often under time pressure and may only have a short window of opportunity to launch their new venture. This means they often have no time for rational analytical decisions and rather rely on heuristics. Past research on entrepreneurial heuristics has primarily focused on predictive
A Heuristic Approach to Scheduling University Timetables.
Loo, E. H.; And Others
1986-01-01
Categories of facilities utilization and scheduling requirements to be considered when using a heuristic approach to timetabling are described, together with a nine-step algorithm and the computerized timetabling system, Timetable Schedules System (TTS), which utilizes the heuristic approach. An example demonstrating the use of TTS and a program flowchart…
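The nine-step TTS algorithm is not reproduced in the abstract, but the core of heuristic timetabling can be illustrated by greedy slot assignment under pairwise conflicts (an illustrative sketch, not TTS itself):

```python
def greedy_timetable(events, conflicts):
    """Assign each event the earliest timeslot not used by any conflicting
    event: greedy graph coloring, a classic timetabling heuristic.

    events: iterable of event names, in priority order
    conflicts: dict event -> iterable of events it must not share a slot with
    """
    slot = {}
    for e in events:
        used = {slot[o] for o in conflicts.get(e, ()) if o in slot}
        s = 0
        while s in used:                # smallest free slot
            s += 1
        slot[e] = s
    return slot
```

Real systems like TTS layer further requirements (room capacities, staff availability) on top of this conflict-avoidance core, typically by ordering the events so the most constrained are placed first.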
Jha, Madan K.; Kumar, S.; Chowdhury, A.
2008-09-01
Growing water scarcity in West Midnapore district of West Bengal, India, is threatening sustainable agricultural production as well as sanitation of the inhabitants. Because of its several inherent qualities, groundwater can play an important role in ensuring sustainable water supply in the district. This study was carried out to assess groundwater condition in the Salboni Block of West Midnapore district using surface resistivity method. Vertical electrical sounding (VES) surveys were carried out at 38 sites using the Schlumberger array. The apparent resistivity-depth datasets (henceforth called 'VES data') thus obtained were interpreted by the genetic algorithm (GA) optimization technique. A GA-based stand-alone computer program was developed for optimizing subsurface layer parameters (true resistivity and thickness) from the VES data. The optimal layer parameters were then correlated with the available well logs to identify aquifer and confining layers. Moreover, a groundwater potential map was created by integrating the thematic layers of aquifer resistivity and thickness in a GIS environment. In order to explore the spatial variation of layer resistivity at a particular depth, resistivity contour maps of the study area for different depths were prepared using ArcView software. The GA technique yielded layer parameters with reasonably low values of root mean square error (0.36-9.75 Ω m) for most VES datasets. It was found that shallow aquifers exist at depths ranging from 4 to 19 m and relatively deep aquifers from 24 to 60 m below the ground surface. The study area is classified into 'very good', 'good', 'moderate' and 'poor' groundwater potential zones, with a majority of the area having good to moderate groundwater prospect. The resistivity contour maps for different depths revealed that deeper aquifers are prevalent in the study area. It is concluded that the GA technique is efficient and reliable for determining subsurface layer parameters from the
Johnson, Perry B; Monterroso, Maria I; Yang, Fei; Mellon, Eric
2017-11-25
This work explores how the choice of prescription isodose line (IDL) affects the dose gradient, target coverage, and treatment time for Gamma Knife radiosurgery when a smaller shot is encompassed within a larger shot at the same stereotactic coordinates (shot within shot technique). Beam profiles for the 4, 8, and 16 mm collimator settings were extracted from the treatment planning system and characterized using Gaussian fits. The characterized data were used to create over 10,000 shot within shot configurations by systematically changing collimator weighting and choice of prescription IDL. Each configuration was quantified in terms of the dose gradient, target coverage, and beam-on time. By analyzing these configurations, it was found that there are regions of overlap in target size where a higher prescription IDL provides equivalent dose fall-off to a plan prescribed at the 50% IDL. Furthermore, the data indicate that treatment times within these regions can be reduced by up to 40%. An optimization strategy was devised to realize these gains. The strategy was tested for seven patients treated for 1-4 brain metastases (20 lesions total). For a single collimator setting, the gradient in the axial plane was steepest when prescribed to the 56-63% (4 mm), 62-70% (8 mm), and 77-84% (16 mm) IDL, respectively. Through utilization of the optimization technique, beam-on time was reduced by more than 15% in 16/20 lesions. The volume of normal brain receiving 12 Gy or above also decreased in many cases, and in only one instance increased by more than 0.5 cm3. This work demonstrates that IDL optimization using the shot within shot technique can reduce treatment times without degrading treatment plan quality.
Directory of Open Access Journals (Sweden)
Parham Azimi
2012-01-01
Full Text Available A new efficient heuristic algorithm has been developed for the dynamic facility layout problem with budget constraint (DFLPB) using an optimization-via-simulation technique. The heuristic integrates integer programming and discrete-event simulation to address the DFLPB. In the proposed algorithm, the nonlinear model of the DFLP is changed to a pure integer programming (PIP) model. The optimal solution of the PIP model is then used in a simulation model, designed in a manner similar to the DFLP, for determining the probability of assigning a facility to a location. After a sufficient number of runs, the simulation model obtains near-optimum solutions. Finally, to test the performance of the algorithm, several test problems have been taken from the literature and solved. The results show that the proposed algorithm is more efficient, in terms of speed and accuracy, than other heuristic algorithms presented in previous works.
Sintering process optimization for multi-layer CGO membranes by in situ techniques
DEFF Research Database (Denmark)
Kaiser, Andreas; Prasad, A.S.; Foghmoes, Søren Preben Vagn
2013-01-01
The sintering of asymmetric CGO bi-layers (thin dense membrane on a porous support; Ce0.9Gd0.1O1.95-delta = CGO) with Co3O4 as a sintering additive has been optimized by a combination of two in situ techniques. Optical dilatometry revealed that bi-layer shape and microstructure change dramatically in a narrow temperature range of less than 100 degrees C. Below 1030 degrees C, a higher densification rate in the dense membrane layer than in the porous support leads to a concave shape, whereas the densification rate of the support is dominant above 1030 degrees C, leading to a convex shape. A flat…
Graph theory and combinatorial optimization
Marcotte, Odile; Avis, David
2006-01-01
A current treatment of cutting-edge topics in Graph Theory and Combinatorial Optimization by leading researchers. Includes heuristic advances and novel approaches to solving combinatorial optimization problems.
Design and optimization of stepped austempered ductile iron using characterization techniques
Energy Technology Data Exchange (ETDEWEB)
Hernández-Rivera, J.L., E-mail: jose.hernandez@cimav.edu.mx [Centro de Investigación en Materiales Avanzados-Laboratorio Nacional de Nanotecnología, Miguel de Cervantes 120, Z.C. 31109, Chihuahua (Mexico); Garay-Reyes, C.G.; Campos-Cambranis, R.E.; Cruz-Rivera, J.J. [Facultad de Ingeniería, Universidad Autónoma de San Luis Potosí, Sierra Leona 550, Lomas 2a. sección, Z.C. 78210, San Luis Potosí (Mexico)
2013-09-15
Conventional characterization techniques such as dilatometry, X-ray diffraction and metallography were used to select and optimize temperatures and times for conventional and stepped austempering. The austenitization and conventional austempering times were selected when the dilatometry graphs showed a constant expansion value. A special heat color-etching technique was applied to distinguish between the untransformed austenite and the high-carbon stabilized austenite which had formed during the treatments. Finally, it was found that carbide precipitation was absent during stepped austempering, in contrast to conventional austempering, in which carbide evidence was found. - Highlights: • Dilatometry helped to establish austenitization and austempering parameters. • Untransformed austenite was present even for longer processing times. • Ausferrite formed during stepped austempering caused an important reinforcement effect. • Carbide precipitation was absent during the stepped treatment.
Directory of Open Access Journals (Sweden)
Mansoor Ahmed Siddiqui
2017-06-01
Full Text Available This research work is aimed at optimizing the availability of a framework comprising two units linked in a series configuration, utilizing a Markov model and Monte Carlo (MC) simulation techniques. In this article, effort has been made to develop a maintenance model that incorporates three distinct states for each unit, while taking into account their different levels of deterioration. Calculations are carried out using the proposed model for two distinct cases of corrective repair, namely perfect and imperfect repairs, both with and without opportunistic maintenance. Initially, results are obtained using an analytical technique, i.e., the Markov model. Validation of the results is later carried out with the help of MC simulation. In addition, MC simulation-based codes also work well for frameworks that follow non-exponential failure and repair rates, and thus overcome the limitations of the Markov model.
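A minimal MC sketch of the two-unit series case with exponential failure and repair rates (perfect repair only, no opportunistic maintenance; the rates are illustrative, not the paper's):

```python
import random

def series_availability_mc(lam, mu, horizon=100_000.0, seed=0):
    """Monte Carlo availability of two repairable units in series: the
    system is up until either unit fails (failure rates lam[i]) and down
    while the failed unit is repaired (repair rates mu[i])."""
    rng = random.Random(seed)
    t = up = 0.0
    while t < horizon:
        # system uptime = time to first failure of either unit
        ttf = rng.expovariate(lam[0] + lam[1])
        up += ttf
        t += ttf
        # which unit failed, proportional to its failure rate
        i = 0 if rng.random() < lam[0] / (lam[0] + lam[1]) else 1
        t += rng.expovariate(mu[i])          # downtime = repair of unit i
    return up / t
```

For rates lam = [0.01, 0.01] and mu = [1.0, 1.0], the renewal-theory value MTTF/(MTTF + MDT) = 50/51 ≈ 0.980 provides the analytical cross-check that the paper performs with its Markov model.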
New efficient optimizing techniques for Kalman filters and numerical weather prediction models
Famelis, Ioannis; Galanis, George; Liakatas, Aristotelis
2016-06-01
The need for accurate local environmental predictions and simulations, beyond the classical meteorological forecasts, has been increasing in recent years due to the great number of applications that are directly or indirectly affected: renewable energy resource assessment, natural-hazard early warning systems, and questions on global warming and climate change can be listed among them. Within this framework, the use of numerical weather and wave prediction systems in conjunction with advanced statistical techniques that support the elimination of model bias and the reduction of error variability may successfully address the above issues. In the present work, new optimization methods are studied and tested in selected areas of Greece where the use of renewable energy sources is of critical importance. The added value of the proposed work lies in the solid mathematical background adopted, making use of information geometry and statistical techniques, new versions of Kalman filters, and state-of-the-art numerical analysis tools.
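Kalman-filter post-processing of forecasts is, at its simplest, a scalar filter tracking the running bias of a forecast stream. A minimal sketch; the random-walk bias model and the noise variances q, r are illustrative assumptions, not the authors' configuration:

```python
def kalman_bias_filter(forecasts, observations, q=0.01, r=1.0):
    """Scalar Kalman filter tracking the systematic bias of a forecast
    stream, a common post-processing step for model output.

    q: process noise variance (how fast the bias may drift)
    r: observation noise variance
    Returns the bias-corrected forecasts.
    """
    bias, p = 0.0, 1.0               # state estimate and its variance
    corrected = []
    for f, y in zip(forecasts, observations):
        p += q                       # predict: bias follows a random walk
        k = p / (p + r)              # Kalman gain
        bias += k * ((y - f) - bias)  # update with this step's observed error
        p *= (1 - k)
        corrected.append(f + bias)
    return corrected
```

On a stream whose forecasts are persistently 2 units low, the tracked bias converges to 2 and the corrected forecasts converge to the observations, which is exactly the systematic-error elimination the abstract refers to.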
DATA MINING WORKSPACE AS AN OPTIMIZATION PREDICTION TECHNIQUE FOR SOLVING TRANSPORT PROBLEMS
Directory of Open Access Journals (Sweden)
Anastasiia KUPTCOVA
2016-09-01
Full Text Available This article addresses a study related to forecasting with actual high-speed decision making under careful modelling of time-series data. The study uses data-mining modelling for the algorithmic optimization of transport goals. Our findings point toward adequate future techniques for fitting a prediction model. This model is to be used for analysing future transaction costs at the frontiers of the Czech Republic. The time-series prediction methods used to assess the performance of the prediction models in the statistics package are the exponential smoothing, ARIMA and neural network approaches. The primary target of a predictive scenario in the data mining workspace is to provide modelling data faster and with more versatility than other management techniques.
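Of the three method families named, exponential smoothing is the simplest to sketch: a one-step-ahead simple-exponential-smoothing forecast (illustrative only, not the study's fitted model):

```python
def exp_smooth_forecast(series, alpha=0.3):
    """Simple exponential smoothing: run the level update through the
    series and return the one-step-ahead forecast. alpha in (0, 1] weights
    recent observations more heavily."""
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level
```

ARIMA and neural-network approaches generalize this by modelling autocorrelation structure and nonlinear dependencies respectively, at the cost of more parameters to fit.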
Gravity inversion of a fault by Particle swarm optimization (PSO).
Toushmalani, Reza
2013-01-01
Particle swarm optimization (PSO) is a heuristic global optimization method based on swarm intelligence, originating from research on the movement behavior of bird flocks and fish schools. In this paper we introduce and use this method for the gravity inverse problem: determining the shape of a fault whose gravity anomaly is known. Application of the proposed algorithm to this problem has proven its capability to deal with difficult optimization problems. The technique proved to work efficiently when tested on a number of models.
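A minimal PSO loop of the kind described (global-best topology; the inertia and acceleration constants are conventional textbook values, not those of the paper, and the sphere objective is a stand-in for the gravity misfit function):

```python
import random

def pso_minimize(f, bounds, n_particles=20, iters=100, seed=0):
    """Minimal global-best particle swarm optimizer minimizing f over box
    bounds [(lo, hi), ...]."""
    rng = random.Random(seed)
    dim = len(bounds)
    w, c1, c2 = 0.7, 1.5, 1.5        # inertia, cognitive, social constants
    xs = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vs = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in xs]
    pbest_f = [f(x) for x in xs]
    g = min(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vs[i][d] = (w * vs[i][d]
                            + c1 * r1 * (pbest[i][d] - xs[i][d])
                            + c2 * r2 * (gbest[d] - xs[i][d]))
                lo, hi = bounds[d]
                xs[i][d] = min(max(xs[i][d] + vs[i][d], lo), hi)
            fi = f(xs[i])
            if fi < pbest_f[i]:          # update personal and global bests
                pbest[i], pbest_f[i] = xs[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = xs[i][:], fi
    return gbest, gbest_f
```

For the fault-inversion application, f would measure the misfit between the observed gravity anomaly and the anomaly computed from a candidate fault geometry.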
A NONLINEAR FEASIBILITY PROBLEM HEURISTIC
Directory of Open Access Journals (Sweden)
Sergio Drumond Ventura
2015-04-01
Full Text Available In this work we consider a region S ⊂ ℝⁿ given by a finite number of nonlinear smooth convex inequalities and having nonempty interior. We assume a point x₀ is given, which is close in a certain norm to the analytic center of S, and that a new nonlinear smooth convex inequality is added to those defining S (the perturbed region). It is constructively shown how to obtain a shift of the right-hand side of this inequality such that the point x₀ is still close (in the same norm) to the analytic center of this shifted region. Starting from this point and using the theoretical results shown, we develop a heuristic that allows us to obtain the approximate analytic center of the perturbed region. Then, we present a procedure to solve the nonlinear feasibility problem. The procedure was implemented and we performed some numerical tests for the quadratic (random) case.
Simply criminal: predicting burglars' occupancy decisions with a simple heuristic.
Snook, Brent; Dhami, Mandeep K; Kavanagh, Jennifer M
2011-08-01
Rational choice theories of criminal decision making assume that offenders weight and integrate multiple cues when making decisions (i.e., are compensatory). We tested this assumption by comparing how well a compensatory strategy called Franklin's Rule captured burglars' decision policies regarding residence occupancy compared to a non-compensatory strategy (i.e., Matching Heuristic). Forty burglars each decided on the occupancy of 20 randomly selected photographs of residences (for which actual occupancy was known when the photo was taken). Participants also provided open-ended reports on the cues that influenced their decisions in each case, and then rated the importance of eight cues (e.g., deadbolt visible) over all decisions. Burglars predicted occupancy beyond chance levels. The Matching Heuristic was a significantly better predictor of burglars' decisions than Franklin's Rule, and cue use in the Matching Heuristic better corresponded to the cue ecological validities in the environment than cue use in Franklin's Rule. The most important cue in burglars' models was also the most ecologically valid or predictive of actual occupancy (i.e., vehicle present). The majority of burglars correctly identified the most important cue in their models, and the open-ended technique showed greater correspondence between self-reported and captured cue use than the rating over decision technique. Our findings support a limited rationality perspective to understanding criminal decision making, and have implications for crime prevention.
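The two strategies compared can be sketched directly. The cue names, validities and threshold below are illustrative (with "vehicle present" made the most valid cue, as the paper reports), not the study's fitted parameters:

```python
def matching_heuristic(cues, validities, default=False):
    """Non-compensatory Matching Heuristic: inspect cues in descending
    validity order and decide 'occupied' on the first cue present.

    cues: dict cue_name -> bool (is the cue present at this residence?)
    validities: dict cue_name -> ecological validity of the cue
    """
    for name in sorted(validities, key=validities.get, reverse=True):
        if cues.get(name):
            return True              # first matching cue decides; rest ignored
    return default

def franklins_rule(cues, weights, threshold=0.5):
    """Compensatory strategy: weight and sum all present cues, then compare
    the normalized score to a threshold."""
    score = sum(w for name, w in weights.items() if cues.get(name))
    return score / sum(weights.values()) >= threshold
```

The behavioral contrast is that the Matching Heuristic ignores every cue after the first match, whereas Franklin's Rule lets weaker cues compensate for the absence of stronger ones, which is the assumption the study finds unsupported.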
Directory of Open Access Journals (Sweden)
Homsup Nuttaka
2016-09-01
Full Text Available This research presents a new technique which combines the principle of a Bezier curve and Particle Swarm Optimization (PSO) in order to design planar dipole antennas for two different targets. This technique can improve the characteristics of the antennas by modifying the copper textures on the antennas with a Bezier curve. However, the time to run the algorithm increases due to the expansion of the solution space in the optimization process. To address this problem, suitable initial parameters need to be set. Therefore this research initialized the parameters with reference antenna parameters (a reference antenna operating at 2.4 GHz for the IEEE 802.11 b/g/n WLAN standards), so that the proposed designs converged rapidly to the goals. The goal of the first design is to reduce the size of the antenna. As a result, the first antenna is reduced in substrate area from 5850 mm2 to 2987 mm2 (approximately 48.93%) and still operates at 2.4 GHz (2.37 GHz to 2.51 GHz). The second design presents an antenna for dual-band application; it operates at 2.4 GHz (2.40 GHz to 2.49 GHz) and 5 GHz (5.10 GHz to 5.45 GHz) for the IEEE 802.11 a/b/g/n WLAN standards.
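The copper texture in these designs is shaped by a Bezier curve. A standard way to evaluate such a curve from its control points is de Casteljau's algorithm, sketched below with illustrative control points (not the antenna geometry from the paper):

```python
def bezier_point(control_points, t):
    """Evaluate a Bezier curve at parameter t in [0, 1] by de Casteljau's
    algorithm: repeated linear interpolation between successive control points."""
    pts = [tuple(p) for p in control_points]
    while len(pts) > 1:
        pts = [
            ((1 - t) * p[0] + t * q[0], (1 - t) * p[1] + t * q[1])
            for p, q in zip(pts, pts[1:])
        ]
    return pts[0]

# Quadratic curve: starts at (0, 0), ends at (2, 0), pulled toward (1, 2)
mid = bezier_point([(0.0, 0.0), (1.0, 2.0), (2.0, 0.0)], 0.5)
```

In an optimizer such as PSO, the coordinates of the control points would be the decision variables that shape the texture outline.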
Energy Technology Data Exchange (ETDEWEB)
Ito, Fuminori, E-mail: fuminoito@spice.ocn.ne.jp [Tokyo Metropolitan University, Department of Applied Chemistry, Graduate School of Urban Environmental Sciences (Japan)
2016-09-15
In this study, we report the optimization of a solvent evaporation technique for preparing monodisperse poly(lactide-co-glycolide) (PLGA) nanospheres from a mixture of solvents composed of ethanol and PVA solution. Various experimental conditions were investigated in order to control the particle size and size distribution of the nanospheres. In addition, nanospheres containing rifampicin (RFP, an antituberculosis drug) were prepared using PLGA of various molecular weights, to study the effects of RFP as a model hydrophobic drug. The results showed that a higher micro-homogenizer stirring rate facilitated the preparation of monodisperse PLGA nanospheres with a low coefficient of variation (~20 %), with sizes below 200 nm. Increasing the PLGA concentration from 0.1 to 0.5 g resulted in an increase in the size of the obtained nanospheres from 130 to 174 nm. The molecular weight of PLGA had little effect on the particle sizes and particle size distributions of the nanospheres. However, the drug loading efficiencies of the obtained RFP/PLGA nanospheres decreased when the molecular weight of PLGA was increased. Based on these experiments, an optimized technique was established for the preparation of monodisperse PLGA nanospheres, using the method developed by the authors.
THE HEURISTIC FUNCTION OF SPORT
Directory of Open Access Journals (Sweden)
Adam Petrović
2012-09-01
Full Text Available Being a significant area of human activity, sport has multiple functions. One of the more important functions of sport, especially top-level sport, is the inventive-heuristic function. Creative work, being a process of creating new values, represents a significant possibility for the advancement of sport. This paper aims at pointing out the various dimensions of human creative work, at the creative work which can be seen in sport (in a narrow sense), and at the scientific and practical areas which border on sport. The method of theoretical analysis of different approaches to the phenomenon of creative work, both in general and in sport, was applied in this paper. This area can be systematized according to various criteria: the level of creative work, the different fields where it appears, the subjects of creative work (the creators), etc. Case analysis shows that the field of creative work in sport is widening and deepening constantly. There are different levels of creativity not only in the system of training and competition, but in the wider social context of sport as well. As a process of the human spirit and mind, creative work belongs not just to athletes and coaches, but also to all the people and social groups whose creative power manifests itself in sport. The classification of creative work in sport according to various criteria allows the heuristic function of sport to be explained comprehensively and creates an image of how the sparks of the human spirit improve the microcosmos of sport. A thorough classification of creative work in sport allows for a detailed analysis of all the elements of creative work and each of its areas in sport. In this way the progress in sport, as a consequence of innovations in both competitions and athletes' training and of everything that goes with those activities, can be guided into the needed direction more easily, as well as studied and applied.
A heuristic approach to incremental and reactive scheduling
Odubiyi, Jide B.; Zoch, David R.
1989-01-01
A heuristic approach to incremental and reactive scheduling is described. Incremental scheduling is the process of modifying an existing schedule if the initial schedule does not meet its stated initial goals. Reactive scheduling occurs in near real-time in response to changes in available resources or the occurrence of targets of opportunity. Only minor changes are made during both incremental and reactive scheduling because a goal of re-scheduling procedures is to minimally impact the schedule. The described heuristic search techniques, which are employed by the Request Oriented Scheduling Engine (ROSE), a prototype generic scheduler, efficiently approximate the cost of reaching a goal from a given state and provide effective mechanisms for controlling search.
A convex programming framework for optimal and bounded suboptimal well field management
DEFF Research Database (Denmark)
Dorini, Gianluca Fabio; Thordarson, Fannar Ørn; Bauer-Gottwein, Peter
2012-01-01
are often convex, hence global optimality can be attained by a wealth of algorithms. Among these, Interior Point methods are extensively employed for practical applications, as they are capable of efficiently solving large-scale problems. Despite this, management models explicitly embedding both systems...... without simplifications are rare, and they usually involve heuristic techniques. The main limitation with heuristics is that neither optimality nor suboptimality bounds can be guaranteed. This paper extends the proof of convexity to mixed management models, enabling the use of Interior Point techniques...... to compute globally optimal management solutions. If convexity is not achieved, it is shown how suboptimal solutions can be computed, and how to bound their deviation from optimality. Experimental results obtained by testing the methodology in a well field located near Copenhagen (DK), show......
Energy Technology Data Exchange (ETDEWEB)
Hernandez, Victor, E-mail: vhernandezmasgrau@gmail.com [Department of Medical Physics, Hospital Sant Joan de Reus, IISPV, Tarragona (Spain); Arenas, Meritxell [Department of Radiation therapy, Hospital Sant Joan de Reus, IISPV, Tarragona (Spain); Müller, Katrin [Department of Medical Physics, Hospital Sant Joan de Reus, IISPV, Tarragona (Spain); Gomez, David; Bonet, Marta [Department of Radiation therapy, Hospital Sant Joan de Reus, IISPV, Tarragona (Spain)
2013-01-01
To assess the advantages of an optimized posterior axillary (AX) boost technique for the irradiation of supraclavicular (SC) and AX lymph nodes. Five techniques for the treatment of SC and levels I, II, and III AX lymph nodes were evaluated for 10 patients selected at random: a direct anterior field (AP); an anterior-to-posterior parallel pair (AP-PA); an anterior field with a posterior axillary boost (PAB); an anterior field with an anterior axillary boost (AAB); and an optimized PAB technique (OptPAB). The target coverage, hot spots, irradiated volume, and dose to organs at risk were evaluated, and a statistical comparison was performed. The AP technique delivered an insufficient dose to the deeper AX nodes. The AP-PA technique produced larger irradiated volumes and higher mean lung doses than the other techniques. The PAB and AAB techniques produced excessive hot spots in most of the cases. The OptPAB technique produced moderate hot spots while maintaining a similar planning target volume (PTV) coverage, irradiated volume, and dose to organs at risk. This optimized technique combines the advantages of the PAB and AP-PA techniques, with moderate hot spots, sufficient target coverage, and adequate sparing of normal tissues. The presented technique is simple, fast, and easy to implement in routine clinical practice, and is superior to the techniques historically used for the treatment of SC and AX lymph nodes.
Hammed, Khurram; Ghauri, Sajjad Ahmed; Qamar, M. Salman
2016-01-01
This paper presents a stochastic global optimization technique known as Particle Swarm Optimization (PSO) for joint estimation of the amplitude and direction of arrival of the targets in a RADAR communication system. The proposed scheme is an excellent optimization methodology and a promising approach for solving the DOA problems in communication systems. Moreover, PSO is quite suitable for real-time scenarios and easy to implement in hardware. In this study, a uniform linear array is used and targets...
Numerical and Evolutionary Optimization Workshop
Trujillo, Leonardo; Legrand, Pierrick; Maldonado, Yazmin
2017-01-01
This volume comprises a selection of works presented at the Numerical and Evolutionary Optimization (NEO) workshop held in September 2015 in Tijuana, Mexico. The development of powerful search and optimization techniques is of great importance in today's world, which requires researchers and practitioners to tackle a growing number of challenging real-world problems. In particular, there are two well-established and widely known fields that are commonly applied in this area: (i) traditional numerical optimization techniques and (ii) comparatively recent bio-inspired heuristics. Both paradigms have their unique strengths and weaknesses, allowing them to solve some challenging problems while still failing in others. The goal of the NEO workshop series is to bring together people from these and related fields to discuss, compare and merge their complementary perspectives in order to develop fast and reliable hybrid methods that maximize the strengths and minimize the weaknesses of the underlying paradigms. Throu...
Chang, Chiou-Shiung; Hwang, Jing-Min; Tai, Po-An; Chang, You-Kang; Wang, Yu-Nong; Shih, Rompin; Chuang, Keh-Shih
2016-01-01
the multiple arc or beam planning designs of IMRS and VMAT, both of these techniques required higher MU delivery than DCA, with the averages being twice as high (p < 0.05). If a linear accelerator is the only modality available for SRS treatment, then, based on retrospective statistical evidence, we recommend VMAT as the optimal technique for delivering treatment to tumors adjacent to the brainstem. Copyright © 2016 American Association of Medical Dosimetrists. All rights reserved.
J. A. Baskar*, Dr. R. Hariprakash (IITM), Dr. M. Vijayakumar
2017-01-01
Integration of Distributed Generation (DG) in an electrical distribution system has increased recently due to voltage improvement, line loss reduction, environmental advantages, postponement of system upgrades, and increased reliability. Improper location and capacity of DG may affect the voltage stability of the Distribution System (DS). Optimization techniques are tools used to predict the size and location of the DG units in the system, so as to utilize these units optimally within certain li...
MacKay, Rebecca A.; Locci, Ivan E.; Garg, anita; Ritzert, Frank J.
2002-01-01
is a three-phase constituent composed of TCP and stringers of gamma phase in a matrix of gamma prime. An incoherent grain boundary separates the SRZ from the gamma/gamma prime microstructure of the superalloy. The SRZ is believed to form as a result of local chemistry changes in the superalloy due to the application of the diffusion aluminide bondcoat. Locally high surface stresses also appear to promote the formation of the SRZ. Thus, techniques that change the local alloy chemistry or reduce surface stresses have been examined for their effectiveness in reducing SRZ. These SRZ-reduction steps are performed on the test specimen or the turbine blade before the bondcoat is applied. Stress-relief heat treatments developed at NASA Glenn have been demonstrated to significantly reduce the amount of SRZ that develops during subsequent high-temperature exposures. Stress-relief heat treatments reduce surface stresses by recrystallizing a thin surface layer of the superalloy. However, in alloys with very high propensities to form SRZ, stress-relief heat treatments alone do not eliminate SRZ entirely. Thus, techniques that modify the local chemistry under the bondcoat have been emphasized and optimized successfully at Glenn. One such technique is carburization, which changes the local chemistry by forming submicron carbides near the surface of the superalloy. Detailed characterizations have demonstrated that the depth and uniform distribution of these carbides are enhanced when a stress-relief treatment and an appropriate surface preparation are employed in advance of the carburization treatment. Even in alloys that have the propensity to develop a continuous SRZ layer beneath the diffusion zone, the SRZ has been completely eliminated or reduced to low, manageable levels when this combination of techniques is utilized. Now that the techniques to mitigate SRZ have been established at Glenn, TCP phase formation is being emphasized in ongoing work under the UEET Program. The
Directory of Open Access Journals (Sweden)
Pradeep Jangir
2017-04-01
Full Text Available A recent trend of research is to hybridize two or more algorithms to obtain superior solutions in the field of optimization problems. In this context, a new hybrid technique, Particle Swarm Optimization (PSO)-Multi-Verse Optimizer (MVO), is exercised on some unconstrained benchmark test functions, and the most common problem of the modern power system, named Optimal Reactive Power Dispatch (ORPD), is optimized using the novel hybrid meta-heuristic optimization algorithm Particle Swarm Optimization-Multi-Verse Optimizer (HPSO-MVO). Hybrid PSO-MVO is a combination of PSO, used for the exploitation phase, and MVO, used for the exploration phase in an uncertain environment. The position and speed of each particle are updated according to the locations of the universes in each iteration. The hybrid PSO-MVO method has a fast convergence rate due to the use of the roulette-wheel selection method. For the ORPD solution, the standard IEEE 30-bus test system is used. The hybrid PSO-MVO method is implemented to solve the proposed problem. The objectives considered in the ORPD are fuel cost reduction, voltage profile improvement, voltage stability enhancement, active power loss minimization, and reactive power loss minimization. The results obtained with the hybrid PSO-MVO method are compared with other techniques such as Particle Swarm Optimization (PSO) and Multi-Verse Optimizer (MVO). Analysis of the competitive results obtained from HPSO-MVO validates its effectiveness compared to the standard PSO and MVO algorithms.
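The PSO half of such a hybrid uses the standard velocity and position update. Below is a minimal generic PSO, not the authors' HPSO-MVO, minimizing the sphere function with commonly used (but here arbitrary) parameter values:

```python
import random

def pso_minimize(f, dim=2, particles=20, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Standard PSO: each particle's velocity is pulled toward its personal
    best and the swarm's global best at every iteration."""
    rnd = random.Random(0)  # fixed seed for reproducibility
    xs = [[rnd.uniform(-5, 5) for _ in range(dim)] for _ in range(particles)]
    vs = [[0.0] * dim for _ in range(particles)]
    pbest = [x[:] for x in xs]
    pval = [f(x) for x in xs]
    g = min(range(particles), key=lambda i: pval[i])
    gbest, gval = pbest[g][:], pval[g]
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                vs[i][d] = (w * vs[i][d]
                            + c1 * rnd.random() * (pbest[i][d] - xs[i][d])
                            + c2 * rnd.random() * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            val = f(xs[i])
            if val < pval[i]:          # update personal best
                pbest[i], pval[i] = xs[i][:], val
                if val < gval:         # update global best
                    gbest, gval = xs[i][:], val
    return gbest, gval

best, best_val = pso_minimize(lambda x: sum(c * c for c in x))
```

In the hybrid described above, this exploitation loop would be interleaved with MVO's universe-based exploration moves.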
Directory of Open Access Journals (Sweden)
Maryam Ashouri
2017-07-01
Full Text Available The vehicle routing problem (VRP) is a Nondeterministic Polynomial-hard combinatorial optimization problem in which consumers are served from central depots by given vehicles that return to the depots they originated from. Two of the most important extensions of the VRP are the open vehicle routing problem (OVRP) and the VRP with simultaneous pickup and delivery (VRPSPD). In the OVRP, the vehicles do not return to the depot after the last visit, and in the VRPSPD, customers require simultaneous delivery and pick-up service. The aim of this paper is to present a combined effective ant colony optimization (CEACO), which includes sweep and several local search algorithms and differs from common ant colony optimization (ACO). An extensive numerical experiment is performed on benchmark problem instances addressed in the literature. The computational results show that the suggested CEACO approach not only presented very satisfying scalability, but also was competitive with other meta-heuristic algorithms in the literature for solving VRP, OVRP and VRPSPD problems. Keywords: Meta-heuristic algorithms, Vehicle Routing Problem, Open Vehicle Routing Problem, Simultaneous Pickup and Delivery, Ant Colony Optimization.
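The characteristic ACO step, choosing the next customer with probability proportional to pheromone^alpha x (1/distance)^beta, can be sketched as follows for a single ant building a single tour (the distance matrix and parameters are illustrative, and the sweep and local search components of CEACO are omitted):

```python
import random

def ant_tour(dist, pher, alpha=1.0, beta=2.0, seed=0):
    """Build one tour from depot 0 by roulette-wheel selection of the next
    customer, weighted by pheromone**alpha * (1/distance)**beta."""
    rnd = random.Random(seed)
    n = len(dist)
    tour, unvisited = [0], set(range(1, n))
    while unvisited:
        i = tour[-1]
        cand = list(unvisited)
        weights = [pher[i][j] ** alpha * (1.0 / dist[i][j]) ** beta for j in cand]
        total = sum(weights)
        r, acc = rnd.random() * total, 0.0
        for j, wgt in zip(cand, weights):   # roulette wheel
            acc += wgt
            if acc >= r:
                tour.append(j)
                unvisited.remove(j)
                break
    return tour + [0]  # close the tour at the depot

dist = [[0, 2, 9, 10], [2, 0, 6, 4], [9, 6, 0, 3], [10, 4, 3, 0]]
pher = [[1.0] * 4 for _ in range(4)]  # uniform initial pheromone
tour = ant_tour(dist, pher)
```

A full ACO would run many ants per iteration and reinforce pheromone along the best tours found.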
Formative Research on the Heuristic Task Analysis Process.
Reigeluth, Charles M.; Lee, Ji-Yeon; Peterson, Bruce; Chavez, Michael
Corporate and educational settings increasingly require decision making, problem solving and other complex cognitive skills to handle ill-structured, or heuristic, tasks, but the growing need for heuristic task expertise has outpaced the refinement of task analysis methods for heuristic expertise. The Heuristic Task Analysis (HTA) Method was…
A new heuristic for the quadratic assignment problem
Zvi Drezner
2002-01-01
We propose a new heuristic for the solution of the quadratic assignment problem. The heuristic combines ideas from tabu search and genetic algorithms. Run times are very short compared with other heuristic procedures. The heuristic performed very well on a set of test problems.
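For orientation, the QAP objective and a bare-bones pairwise-swap descent, a much simpler relative of the tabu/genetic hybrid proposed here, look like this (the 3x3 flow and distance matrices are invented):

```python
from itertools import combinations

def qap_cost(perm, flow, dist):
    """Quadratic assignment cost: sum of flow[i][j] * dist[perm[i]][perm[j]],
    where perm[i] is the location assigned to facility i."""
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]]
               for i in range(n) for j in range(n))

def swap_descent(perm, flow, dist):
    """Repeatedly accept improving pairwise swaps until none remains."""
    perm = list(perm)
    best = qap_cost(perm, flow, dist)
    improved = True
    while improved:
        improved = False
        for i, j in combinations(range(len(perm)), 2):
            perm[i], perm[j] = perm[j], perm[i]
            cost = qap_cost(perm, flow, dist)
            if cost < best:
                best, improved = cost, True
            else:
                perm[i], perm[j] = perm[j], perm[i]  # revert the swap
    return perm, best

flow = [[0, 3, 1], [3, 0, 2], [1, 2, 0]]
dist = [[0, 1, 4], [1, 0, 2], [4, 2, 0]]
perm, cost = swap_descent([2, 1, 0], flow, dist)
```

Tabu search extends exactly this neighborhood by allowing non-improving swaps while forbidding recently reversed moves.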
Mathematical models and heuristic solutions for container positioning problems in port terminals
DEFF Research Database (Denmark)
Kallehauge, Louise Sibbesen
2008-01-01
This PhD thesis is concerned with the container positioning problem (CPP) which consists in determining optimal sequences of positions and moves for containers in a single storage block of a terminal yard. The purpose of the thesis is to apply Operations Research (OR) methods for optimizing the CPP...... by constructing mathematical programming formulations of the problem and developing an efficient heuristic algorithm for its solution. The thesis consists of an introduction, two main chapters concerning new mathematical formulations and a new heuristic for the CPP, technical issues, computational results...... concerning the subject is reviewed. The research presented in this thesis is divided into two main parts: Construction and investigation of new mathematical programming formulations of the CPP and development and implementation of a new event-based heuristic for the problem. The first part presents three...
Optimization of growth conditions of ZnO nano thin films by chemical double dip technique
Energy Technology Data Exchange (ETDEWEB)
Vijayan, Thirukonda Anandamoorthy; Chandramohan, Rathinam; Thirumalai, Jagannathan [Department of Physics, Sree Sevugan Annamalai College, Devakottai-630 303 (India); Valanarasu, Santiyagu [Department of Physics, Ananda College, Devakottai-630 303 (India); Venkateswaran, Sivasuriyan [Department of Chemistry, Sree Sevugan Annamalai College, Devakottai-630 303 (India); Mahalingam, Thaiyan [Department of Physics, Alagappa University, Karaikudi-630 003 (India); Srikumar, Subbiah Ramachandran [Department of Physics, Kalasalingam University, Krishnankoil-626 190 (India)], E-mail: chandru17@yahoo.com
2008-07-01
Zinc oxide (ZnO) nano thin films have been deposited by the chemical double-dip technique using aqueous ZnSO{sub 4} and NaOH solutions. The structure and surface morphology of the ZnO films were characterized by x-ray diffraction, energy-dispersive x-ray analysis (EDX), scanning electron microscopy (SEM) and atomic force microscopy (AFM). The films exhibited a smooth morphology. The chemical states of oxygen and zinc in the ZnO nano thin films were also investigated by x-ray photoelectron spectroscopy (XPS). In the present investigation, highly textured ZnO thin films with a preferential (002)-orientation were prepared on glass substrates. The deposition conditions were optimized to obtain device-quality films for practical applications.
An Evolutionary Video Assignment Optimization Technique for VOD System in Heterogeneous Environment
Directory of Open Access Journals (Sweden)
King-Man Ho
2010-01-01
Full Text Available We investigate the video assignment problem of a hierarchical Video-on-Demand (VOD system in heterogeneous environments where different quality levels of videos can be encoded using either replication or layering. In such systems, videos are delivered to clients either through a proxy server or video broadcast/unicast channels. The objective of our work is to determine the appropriate coding strategy as well as the suitable delivery mechanism for a specific quality level of a video such that the overall system blocking probability is minimized. In order to find a near-optimal solution for such a complex video assignment problem, an evolutionary approach based on genetic algorithm (GA is proposed. From the results, it is shown that the system performance can be significantly enhanced by efficiently coupling the various techniques.
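A genetic algorithm of the kind applied here evolves a population of candidate assignments through selection, crossover, and mutation. The generic sketch below shows the loop structure on a toy bit-counting objective rather than the paper's blocking-probability model (all parameter values are illustrative):

```python
import random

def ga_maximize(fitness, length=20, pop_size=30, gens=100, mut=0.02, seed=0):
    """Elitist GA: tournament selection, one-point crossover, bit-flip mutation."""
    rnd = random.Random(seed)
    pop = [[rnd.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=fitness, reverse=True)
        new_pop = [scored[0][:]]  # elitism: always keep the best individual
        while len(new_pop) < pop_size:
            p1 = max(rnd.sample(pop, 3), key=fitness)   # tournament selection
            p2 = max(rnd.sample(pop, 3), key=fitness)
            cut = rnd.randrange(1, length)              # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [b ^ 1 if rnd.random() < mut else b for b in child]  # mutation
            new_pop.append(child)
        pop = new_pop
    return max(pop, key=fitness)

best = ga_maximize(sum)  # OneMax: fitness = number of 1-bits
```

For the video assignment problem, each gene would instead encode the coding strategy and delivery mechanism for one video quality level, and fitness would be the negated blocking probability.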
Multi Objective Optimization of Flux Cored Arc Weld Parameters Using Hybrid Grey - Fuzzy Technique
Directory of Open Access Journals (Sweden)
M Satheesh
2014-06-01
Full Text Available In the present work, an attempt has been made to use the grey-based fuzzy logic method to solve correlated multiple response optimization problems in the field of flux cored arc welding. This approach converts the complex multiple objectives into a single grey-fuzzy reasoning grade. Based on the grey-fuzzy reasoning grade, optimum parameters are identified. The significant contributions of parameters are estimated using analysis of variance (ANOVA. This evaluation procedure can be used in intelligent decision making for a welding operator. The proposed and developed method has good accuracy and competency. The proposed technique provides manufacturers who develop intelligent manufacturing systems a method to facilitate the achievement of the highest level of automation.
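The grey relational coefficient underlying such grey-based methods is a standard formula, gamma = (Dmin + z*Dmax) / (D + z*Dmax), where D is the absolute deviation from the reference sequence and z is a distinguishing coefficient (commonly 0.5). Below is a generic sketch with invented, normalized response values; the fuzzy-reasoning stage of the paper's method is omitted:

```python
def grey_relational_grades(reference, alternatives, zeta=0.5):
    """Grey relational grade of each alternative against a normalized reference
    sequence: the mean of its per-response grey relational coefficients."""
    deltas = [[abs(r - a) for r, a in zip(reference, alt)] for alt in alternatives]
    d_min = min(min(row) for row in deltas)   # global minimum deviation
    d_max = max(max(row) for row in deltas)   # global maximum deviation
    return [
        sum((d_min + zeta * d_max) / (d + zeta * d_max) for d in row) / len(row)
        for row in deltas
    ]

reference = [1.0, 1.0, 1.0]           # ideal (normalized) responses
alternatives = [[1.0, 1.0, 1.0],      # identical to the reference
                [0.8, 0.6, 0.9],
                [0.2, 0.4, 0.3]]
grades = grey_relational_grades(reference, alternatives)
```

The parameter setting with the highest grade is taken as the multi-response optimum; an alternative identical to the reference always scores 1.0.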
Directory of Open Access Journals (Sweden)
Natalya Balamutova
2015-04-01
Full Text Available Purpose: to optimize the process of teaching sport swimming technique to students of higher educational institutions, on the basis of experimentally detected features of changes in the leading factors in teaching swimming. Material and Methods: the study involved 102 students of higher educational institutions of Kharkov. All subjects were divided into an experimental and a control group. Methods: theoretical analysis and synthesis of data from the scientific and methodological literature, educational tests, methods of functional diagnostics, a pedagogical experiment, and methods of mathematical statistics. Results: according to the results of expert reviews of sport swimming technique, the best results were achieved by the experimental group students. Performance on functional tests assessing the cardiovascular and respiratory systems was higher in the experimental group than in the control group. Conclusions: the developed innovative system for accelerated learning of sport swimming technique creates favorable conditions for the improvement of physical development and physical fitness, providing a faster increase in athletic performance.
Optical Biosensor Based on Microbendings Technique: An Optimized Mean to Measure the Bone Strength
Directory of Open Access Journals (Sweden)
Preeti Singh
2014-01-01
Full Text Available Osteoporosis, a disease in humans, reduces bone mineral density. The microarchitecture of the bone deteriorates with changes in a variety of proteins in the bone. Therefore, a quantitative assessment of the strength of human bone, considering its structural properties and degradation due to aging, disease, and therapeutic treatment, becomes an integral part of bioengineering studies. This paper presents a model of fiber optic biosensors (FOBs) which utilizes the microbending technique to measure the strength of the bone. In parallel, an artificial neural network (ANN)-based test bench has been developed in MATLAB for the optimization of FOB strain measurement in orthopedic applications. The performance accuracy of the given model appears sufficient to ensure detection of the onset of osteoporosis.
FORMULATING AN OPTIMAL DRAINAGE MODEL FOR THE CALABAR AREA USING CARTOGRAPHIC TECHNIQUES
Directory of Open Access Journals (Sweden)
Innocent A. Ugbong
2016-01-01
Full Text Available In order to achieve the task of formulating an optimal drainage model for the Calabar area, the Calabar drainage system was studied using some cartographic techniques to analyze its surface run-off and channel characteristics so as to determine how floods are generated. A morphological analysis was done, using detailed contour maps prepared for the study area. The “Blue line” and “contour crenulations” methods were used to recreate the expected run-off channels or drainage networks under natural non-urbanized conditions. A drainage structure with 6 major basins and 73 sub-basins was discovered. Existing storm drains were constructed without regards to this natural structure and so floods were generated.
Hermawati, Setia; Lawson, Glyn
2016-09-01
Heuristics evaluation is frequently employed to evaluate usability. While general heuristics are suitable for evaluating most user interfaces, there is still a need to establish heuristics for specific domains to ensure that their specific usability issues are identified. This paper presents a comprehensive review of 70 studies related to usability heuristics for specific domains. The aim of this paper is to review the processes that were applied to establish heuristics in specific domains and to identify gaps in order to provide recommendations for future research and areas of improvement. The most urgent issue found is the deficiency of validation effort following the proposition of heuristics, and the lack of robustness and rigour of the validation methods adopted. Whether domain-specific heuristics perform better or worse than general ones is inconclusive, due to the lack of validation quality and of clarity on how to assess the effectiveness of heuristics for specific domains. The lack of validation quality also hampers efforts to improve existing heuristics for specific domains, as their weaknesses are not addressed. Copyright © 2016 Elsevier Ltd. All rights reserved.
Judgements with errors lead to behavioral heuristics
Ungureanu, S.
2016-01-01
A decision process robust to errors in the estimation of values, probabilities and times will employ heuristics that generate consistent apparent biases like loss aversion, nonlinear probability weighting with discontinuities and present bias.
Application of Optimization Techniques to Design of Unconventional Rocket Nozzle Configurations
Follett, W.; Ketchum, A.; Darian, A.; Hsu, Y.
1996-01-01
Several current rocket engine concepts, such as the bell-annular tri-propellant engine and the linear aerospike being proposed for the X-33, require unconventional three-dimensional rocket nozzles which must conform to rectangular or sector-shaped envelopes to meet integration constraints. These types of nozzles exist outside the current experience database; therefore, the application of efficient design methods for these propulsion concepts is critical to the success of launch vehicle programs. The objective of this work is to optimize several different nozzle configurations, including two- and three-dimensional geometries. The methodology includes coupling computational fluid dynamics (CFD) analysis to genetic algorithms and Taguchi methods, as well as implementation of a streamline tracing technique. Results of applications are shown for several geometries, including three-dimensional thruster nozzles with round or super-elliptic throats and rectangular exits, two- and three-dimensional thrusters installed within a bell nozzle, and three-dimensional thrusters with round throats and sector-shaped exits. Due to the novel designs considered for this study, there is little experience which can be used to guide the effort and limit the design space. With a nearly infinite parameter space to explore, simple parametric design studies cannot possibly search the entire design space within the time frame required to impact the design cycle. For this reason, robust and efficient optimization methods are required to explore and exploit the design space to achieve high-performance engine designs. Five case studies which examine the application of various techniques in the engineering environment are presented in this paper.
Optimization of GPS water vapor tomography technique with radiosonde and COSMIC historical data
Directory of Open Access Journals (Sweden)
S. Ye
2016-09-01
Full Text Available Near-real-time, high-spatial-resolution knowledge of the atmospheric water vapor distribution is vital in numerical weather prediction. The GPS tomography technique has proved effective for three-dimensional water vapor reconstruction. In this study, the tomography processing is optimized in a few aspects with the aid of radiosonde and COSMIC historical data. Firstly, regional tropospheric zenith hydrostatic delay (ZHD) models are improved and thus the zenith wet delay (ZWD) can be obtained at a higher accuracy. Secondly, the regional conversion factor for converting the ZWD to the precipitable water vapor (PWV) is refined. Next, we develop a new method for dividing the tomography grid with an uneven voxel height and a varied water vapor layer top. Finally, we propose a Gaussian exponential vertical interpolation method which can better reflect the vertical variation characteristics of water vapor. GPS datasets collected in Hong Kong in February 2014 are employed to evaluate the optimized tomographic method against the conventional method. The radiosonde-derived and COSMIC-derived water vapor densities are utilized as references to evaluate the tomographic results. Using radiosonde products as references, the test results obtained from our optimized method indicate that the water vapor density accuracy is improved by 15 and 12 % compared to the conventional method below and above the height of 3.75 km, respectively. Using the COSMIC products as references, the results indicate that the water vapor density accuracy is improved by 15 and 19 % below and above 3.75 km, respectively.
Tsai, Wen-Ping; Chang, Fi-John; Chang, Li-Chiu; Herricks, Edwin E.
2015-11-01
Flow regime is the key driver of the riverine ecology. This study proposes a novel hybrid methodology based on artificial intelligence (AI) techniques for quantifying riverine ecosystems requirements and delivering suitable flow regimes that sustain river and floodplain ecology through optimizing reservoir operation. This approach addresses issues to better fit riverine ecosystem requirements with existing human demands. We first explored and characterized the relationship between flow regimes and fish communities through a hybrid artificial neural network (ANN). Then the non-dominated sorting genetic algorithm II (NSGA-II) was established for river flow management over the Shihmen Reservoir in northern Taiwan. The ecosystem requirement took the form of maximizing fish diversity, which could be estimated by the hybrid ANN. The human requirement was to provide a higher satisfaction degree of water supply. The results demonstrated that the proposed methodology could offer a number of diversified alternative strategies for reservoir operation and improve reservoir operational strategies producing downstream flows that could meet both human and ecosystem needs. Applications that make this methodology attractive to water resources managers benefit from the wide spread of Pareto-front (optimal) solutions allowing decision makers to easily determine the best compromise through the trade-off between reservoir operational strategies for human and ecosystem needs.
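The core operation of NSGA-II is sorting candidates into non-dominated fronts. The dominance test and the extraction of the first (rank-1) front, assuming both objectives are minimized and using illustrative points, can be sketched as:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly
    better in at least one (minimization convention)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def first_front(points):
    """Solutions not dominated by any other point: NSGA-II's rank-1 front."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical trade-off points, e.g. (water-supply deficit, loss of fish diversity)
points = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0), (2.5, 2.5)]
front = first_front(points)
```

The full algorithm repeats this ranking on the remaining points to build successive fronts, then breaks ties within a front by crowding distance.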
Comparison of metaheuristic techniques to determine optimal placement of biomass power plants
Energy Technology Data Exchange (ETDEWEB)
Reche-Lopez, P.; Ruiz-Reyes, N.; Garcia Galan, S. [Telecommunication Engineering Department, University of Jaen Polytechnic School, C/ Alfonso X el Sabio 28, 23700 Linares, Jaen (Spain); Jurado, F. [Electrical Engineering Department, University of Jaen Polytechnic School, C/ Alfonso X el Sabio 28, 23700 Linares, Jaen (Spain)
2009-08-15
This paper deals with the application and comparison of several metaheuristic techniques to optimize the placement and supply area of biomass-fueled power plants. Both trajectory-based and population-based methods are applied. In particular, two well-known trajectory methods, Simulated Annealing (SA) and Tabu Search (TS), and two commonly used population-based methods, Genetic Algorithms (GA) and Particle Swarm Optimization (PSO), are considered here. In addition, a new binary PSO algorithm is proposed which incorporates an inertia weight factor, as in the classical continuous approach. The fitness function for the metaheuristics is the profitability index, defined as the ratio between the net present value and the initial investment. In this work, forest residues are considered as the biomass source, and the problem constraints are that the generation system must be located inside the supply area and that its maximum electric power is 5 MW. The comparative results obtained by all considered metaheuristics are discussed. Random walk has also been assessed for the problem at hand. (author)
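A minimal sketch of a binary PSO with an inertia weight, in the spirit of the variant the paper proposes. The sigmoid transfer function and the parameter values are common textbook choices rather than the paper's, and the fitness lambda is a toy stand-in for the profitability index (NPV / initial investment):

```python
import math, random

def binary_pso(fitness, n_bits, n_particles=20, iters=50, w=0.7, c1=1.5, c2=1.5):
    """Minimal binary PSO with inertia weight w; velocity is mapped to a
    bit-flip probability through a sigmoid transfer function."""
    random.seed(1)
    X = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(n_particles)]
    V = [[0.0] * n_bits for _ in range(n_particles)]
    P = [x[:] for x in X]                      # personal bests
    g = max(P, key=fitness)[:]                 # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(n_bits):
                r1, r2 = random.random(), random.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (P[i][d] - X[i][d])
                           + c2 * r2 * (g[d] - X[i][d]))
                s = 1.0 / (1.0 + math.exp(-V[i][d]))   # sigmoid transfer
                X[i][d] = 1 if random.random() < s else 0
            if fitness(X[i]) > fitness(P[i]):
                P[i] = X[i][:]
        g = max(P, key=fitness)[:]
    return g

# toy fitness standing in for the profitability index:
# prefer plants located at odd-indexed candidate sites
best = binary_pso(lambda bits: sum(b * (i % 2) for i, b in enumerate(bits)), 8)
```

In a realistic setting each bit would encode whether a candidate site or supply-area cell is selected, and the fitness would evaluate the resulting plant economics subject to the 5 MW and supply-area constraints.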
Design-for-test and test optimization techniques for TSV-based 3D stacked ICs
Noia, Brandon
2014-01-01
This book describes innovative techniques to address the testing needs of 3D stacked integrated circuits (ICs) that utilize through-silicon vias (TSVs) as vertical interconnects. The authors identify the key challenges facing 3D IC testing and present results that have emerged from cutting-edge research in this domain. Coverage includes topics ranging from die-level wrappers, self-test circuits, and TSV probing to test-architecture design, test scheduling, and optimization. Readers will benefit from an in-depth look at the test-technology solutions that are needed to make 3D ICs a reality and commercially viable. • Provides a comprehensive guide to the challenges and solutions for the testing of TSV-based 3D stacked ICs; • Includes in-depth explanation of key test and design-for-test technologies, emerging standards, and test-architecture and test-schedule optimizations; • Encompasses all aspects of test as related to 3D ICs, including pre-bond and post-bond test as well as test optimization.
Aquino, P L M; Fonseca, F S; Mozzer, O D; Giordano, R C; Sousa, R
2016-07-01
Clostridium novyi causes necrotic hepatitis in sheep and cattle, as well as gas gangrene. The microorganism is strictly anaerobic, fastidious, and difficult to cultivate on an industrial scale. C. novyi type B produces alpha and beta toxins, with the alpha toxin being linked to the presence of specific bacteriophages. The main strategy to combat diseases caused by C. novyi is vaccination, employing vaccines produced with toxoids or with toxoids and bacterins. In order to identify culture medium components and concentrations that maximize cell density and alpha toxin production, a neuro-fuzzy algorithm was applied to predict the yields of the fermentation process for production of C. novyi type B, within a global search procedure using the simulated annealing technique. Maximizing cell density and toxin production is a multi-objective optimization problem and could be treated with a Pareto approach; nevertheless, the approach chosen here was a step-by-step one. The optimum values obtained with this approach were validated at laboratory scale, and the results were used to reload the data matrix for re-parameterization of the neuro-fuzzy model, which was then used in a final optimization step with regard to alpha toxin productivity. With this methodology, a threefold increase in alpha toxin production was achieved.
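The global search wrapper can be sketched as a generic simulated-annealing maximizer. The objective below is a toy stand-in for the neuro-fuzzy yield model, and the geometric cooling schedule and step size are our assumptions, not the paper's settings:

```python
import math, random

def simulated_annealing(predict, x0, step=0.1, t0=1.0, cooling=0.95, iters=500):
    """Generic simulated-annealing maximizer: worse candidates are accepted
    with probability exp(delta / t), which shrinks as temperature t cools."""
    random.seed(0)
    x, best = list(x0), list(x0)
    t = t0
    for _ in range(iters):
        cand = [xi + random.uniform(-step, step) for xi in x]
        delta = predict(cand) - predict(x)
        if delta > 0 or random.random() < math.exp(delta / t):
            x = cand
            if predict(x) > predict(best):
                best = x[:]
        t *= cooling
    return best

# toy yield surface with a maximum at medium concentrations (0.5, 0.3)
obj = lambda c: -((c[0] - 0.5) ** 2 + (c[1] - 0.3) ** 2)
best = simulated_annealing(obj, [0.0, 0.0])
```

In the paper's setup `predict` would be the trained neuro-fuzzy model evaluated at a vector of medium-component concentrations, so each annealing step queries the surrogate rather than running a fermentation.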
A multi-criteria optimization technique for SSSC based power oscillation damping controller design
Directory of Open Access Journals (Sweden)
Sarat Chandra Swain
2016-06-01
Full Text Available In this paper, the Non-dominated Sorting Genetic Algorithm-II (NSGA-II) technique is applied to obtain a Pareto-optimal set of solutions for the tuning of a lead-lag structured SSSC-based stabilizer. The design objective is to get maximum damping (performance) with minimum control effort (cost). Further, a fuzzy-based membership function value assignment method is employed to choose the best compromise solution. Simulation results are presented under various loading conditions and disturbances for various control signals to show the effectiveness and robustness of the proposed approach. The effectiveness and superiority of the proposed design approach are illustrated for both single-machine infinite-bus and multi-machine power systems by comparing the proposed approach with some recently published single-objective and evolutionary multi-objective approaches such as Differential Evolution (DE), Particle Swarm Optimization (PSO) and the Multi-objective Genetic Algorithm. It is observed that the proposed approach yields superior damping performance compared to some recently published approaches.
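The fuzzy best-compromise selection is commonly implemented with a linear membership function per objective, normalized and summed. A sketch under the assumption that both objectives are to be minimized (e.g. a damping index and control effort); the Pareto points are illustrative, not results from the paper:

```python
def best_compromise(front):
    """Pick the best-compromise solution from a Pareto front using the
    standard linear fuzzy membership function (all objectives minimized)."""
    n_obj = len(front[0])
    mins = [min(s[k] for s in front) for k in range(n_obj)]
    maxs = [max(s[k] for s in front) for k in range(n_obj)]

    def mu(s):  # normalized mean of per-objective memberships
        m = [(maxs[k] - s[k]) / (maxs[k] - mins[k]) if maxs[k] > mins[k] else 1.0
             for k in range(n_obj)]
        return sum(m) / n_obj

    return max(front, key=mu)

# (damping index, control effort) -- illustrative Pareto points
front = [(0.10, 0.9), (0.20, 0.5), (0.40, 0.2)]
choice = best_compromise(front)
```

Each objective's membership is 1 at its best value on the front and 0 at its worst, so the solution with the highest mean membership is the one that is "good enough" on all objectives simultaneously.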
Rakheja, S; Gurram, R; Gouw, G J
1993-10-01
Hand-arm vibration (HAV) models serve as an effective tool to assess the vibration characteristics of the hand-tool system and to evaluate the attenuation performance of vibration isolation mechanisms. This paper describes a methodology to identify the parameters of HAV models, whether linear or nonlinear, using mechanical impedance data and a nonlinear-programming-based optimization technique. Three- and four-degrees-of-freedom (DOF) linear, piecewise-linear and nonlinear HAV models are formulated and analyzed to yield impedance characteristics in the 5-1000 Hz frequency range. A local equivalent linearization algorithm, based upon the principle of energy similarity, is implemented to simulate the nonlinear HAV models. Optimization methods are employed to identify the model parameters such that the magnitude and phase errors between the computed and measured impedance characteristics are minimized over the entire frequency range. The effectiveness of the proposed method is demonstrated through the derivation of models that correlate with the measured X-axis impedance characteristics of the hand-arm system proposed by ISO. The results of the study show that a linear model cannot predict the impedance characteristics over the entire frequency range, while a piecewise-linear model yields an accurate estimation.
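The identification step, minimizing a combined magnitude and phase error between model and measured impedance, can be sketched with a 1-DOF stand-in model and a crude coordinate-descent optimizer. The paper's models have 3 and 4 DOFs and use nonlinear programming; the data here are synthetic and the step sizes are arbitrary:

```python
import cmath, math

def impedance(m, c, k, w):
    """Driving-point impedance of a 1-DOF mass-spring-damper, a stand-in
    for the multi-DOF hand-arm models of the paper."""
    return c + 1j * (w * m - k / w)

def fit_error(params, freqs, measured):
    """Combined magnitude and phase error, the quantity being minimized."""
    err = 0.0
    for w, z in zip(freqs, measured):
        zm = impedance(*params, w)
        err += (abs(zm) - abs(z)) ** 2 + (cmath.phase(zm) - cmath.phase(z)) ** 2
    return err

# synthetic "measured" data generated from known parameters m=1, c=20, k=1e4
freqs = [2 * math.pi * f for f in (5, 20, 80, 300, 1000)]
measured = [impedance(1.0, 20.0, 1e4, w) for w in freqs]

# crude coordinate descent from a deliberately wrong starting guess
params = [0.5, 10.0, 5e3]
for _ in range(200):
    for i, step in enumerate((0.05, 1.0, 500.0)):
        for cand in (params[i] - step, params[i] + step):
            trial = params[:]
            trial[i] = cand
            if fit_error(trial, freqs, measured) < fit_error(params, freqs, measured):
                params = trial
```

Any standard nonlinear-programming routine could replace the coordinate descent; the point is the shape of the objective, which weights magnitude and phase mismatch jointly across the whole frequency range.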
Optimization of electrospinning techniques for the realization of nanofiber plastic lasers
Persano, L.; Moffa, M.; Fasano, V.; Montinaro, M.; Morello, G.; Resta, V.; Spadaro, D.; Gucciardi, P. G.; Maragò, O. M.; Camposeo, A.; Pisignano, D.
2016-02-01
Electrospinning technologies for the realization of active polymeric nanomaterials can be easily up-scaled, opening perspectives for industrial exploitation, and due to their versatility they can be employed to finely tailor the size, morphology and macroscopic assembly of fibers as well as their functional properties. Light-emitting or other active polymer nanofibers, made of conjugated polymers or of blends embedding chromophores or other functional dopants, are suitable for various applications in advanced photonics and sensing technologies. In particular, their almost one-dimensional geometry and finely tunable composition make them interesting materials for developing novel lasing devices. However, electrospinning techniques rely on a large variety of parameters and possible experimental geometries, and they need to be carefully optimized in order to obtain suitable topographical and photonic properties in the resulting nanostructures. Targeted features include a smooth and uniform fiber surface, dimensional control, as well as filament alignment, enhanced light emission, and stimulated emission. We here present various optimization strategies for electrospinning methods which we have implemented and developed for the realization of lasing architectures based on polymer nanofibers. The geometry of the resulting nanowires leads to peculiar light scattering from spun filaments, and to controllable lasing characteristics.
A NEW HEURISTIC ALGORITHM FOR MULTIPLE TRAVELING SALESMAN PROBLEM
Directory of Open Access Journals (Sweden)
F. NURIYEVA
2017-06-01
Full Text Available The Multiple Traveling Salesman Problem (mTSP) is a combinatorial optimization problem in the NP-hard class. The mTSP aims to find a minimum-cost way of visiting a given set of cities by assigning each of them to one of m salesmen, thereby creating m tours. This paper presents a new heuristic algorithm based on the shortest path algorithm to find a solution to the mTSP. The proposed method has been programmed in the C language and its performance analysis has been carried out on library instances. The computational results show the efficiency of this method.
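The paper's shortest-path-based heuristic is not reproduced here, but a toy greedy baseline shows the shape of an mTSP construction: the m salesmen take turns extending their tours to the nearest unvisited city and finally return to the depot. Coordinates and the round-robin rule are illustrative assumptions:

```python
import math

def greedy_mtsp(depot, cities, m):
    """Toy greedy construction for the mTSP: each of m salesmen repeatedly
    visits the nearest unvisited city (a simple baseline heuristic)."""
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    tours = [[depot] for _ in range(m)]
    unvisited = set(cities)
    i = 0
    while unvisited:
        pos = tours[i][-1]
        nxt = min(unvisited, key=lambda c: dist(pos, c))
        tours[i].append(nxt)
        unvisited.remove(nxt)
        i = (i + 1) % m          # round-robin keeps tours balanced in length
    for t in tours:
        t.append(depot)          # every salesman returns to the depot
    cost = sum(dist(a, b) for t in tours for a, b in zip(t, t[1:]))
    return tours, cost

tours, cost = greedy_mtsp((0, 0), [(1, 0), (2, 0), (0, 1), (0, 2)], 2)
```

On this symmetric instance the two salesmen naturally split the cities into the two arms, giving a total tour length of 8.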
A New Approach to Tuning Heuristic Parameters of Genetic Algorithms
Czech Academy of Sciences Publication Activity Database
Holeňa, Martin
2006-01-01
Roč. 3, č. 3 (2006), s. 562-569 ISSN 1790-0832. [AIKED'06. WSEAS International Conference on Artificial Intelligence, Knowledge Engineering and Data Bases. Madrid, 15.02.2006-17.02.2006] R&D Projects: GA ČR(CZ) GA201/05/0325; GA ČR(CZ) GA201/05/0557 Institutional research plan: CEZ:AV0Z10300504 Keywords: evolutionary optimization * genetic algorithms * heuristic parameters * parameter tuning * artificial neural networks * convergence speed * population diversity Subject RIV: IN - Informatics, Computer Science
Energy Technology Data Exchange (ETDEWEB)
Castillo M, J.A. [ININ, 52045 Ocoyoacac, Estado de Mexico (Mexico)
2003-07-01
The basic elements of the Tabu search technique are presented, with emphasis on its advantages over traditional descent-based optimization methods. Some modifications that have been implemented in the technique over time to make it more robust are then sketched. Finally, some areas where this technique has been applied with successful results are described. (Author)
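A minimal tabu search over bit strings illustrates the mechanism: the best non-tabu move is always taken, even when it worsens the solution, which is what lets the method escape the local optima that plain descent gets trapped in. The deceptive toy objective and the tenure value are our own:

```python
from collections import deque

def tabu_search(objective, x0, iters=100, tenure=5):
    """Minimal tabu search maximizing `objective` over bit strings by
    single-bit flips; recently flipped positions are tabu for `tenure` steps."""
    x = list(x0)
    best, best_val = x[:], objective(x)
    tabu = deque(maxlen=tenure)          # recently flipped bit positions
    for _ in range(iters):
        moves = [i for i in range(len(x)) if i not in tabu]
        i = max(moves, key=lambda i: objective(x[:i] + [1 - x[i]] + x[i + 1:]))
        x[i] = 1 - x[i]                  # take the move even if it is worse
        tabu.append(i)
        if objective(x) > best_val:
            best, best_val = x[:], objective(x)
    return best, best_val

# deceptive objective: all-ones is optimal, but all-zeros is a local optimum
f = lambda b: (10 if sum(b) == len(b) else 0) + (3 if sum(b) == 0 else 0) + sum(b)
best, val = tabu_search(f, [0] * 6)
```

Plain descent started at all-zeros stops immediately (every single flip looks worse), whereas the tabu list forces the search through the valley to the true optimum.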
Finding Solutions to Sudoku Puzzles Using Human Intuitive Heuristics
Directory of Open Access Journals (Sweden)
Nelishia Pillay
2012-09-01
Full Text Available Sudoku is a logic puzzle that has achieved international popularity. Given this, a number of computer solvers have been developed for the puzzle. Various methods including genetic algorithms, simulated annealing, particle swarm optimization and harmony search have been evaluated for this purpose. The approach described in this paper combines human intuition and optimization to solve Sudoku problems. The main contribution of this paper is a set of heuristic moves, incorporating human expertise, for solving Sudoku puzzles. The paper investigates the use of genetic programming to search a space of programs composed of these heuristic moves, with the aim of evolving a program that produces a solution to a given Sudoku problem instance. Each program is a combination of randomly selected moves. The approach was tested on 1800 Sudoku puzzles of differing difficulty. It was able to solve all 1800 problems, with the majority of these problems solved in under a second. For the majority of the puzzles, evolution was not needed: random combinations of the moves created in the initial population produced solutions. For the more difficult problems, at least one generation of evolution was needed to find a solution. Further analysis revealed that solution programs for the more difficult problems could also be found by enumerating random combinations of the move operators, though at the cost of higher runtimes. The performance of the approach was found to be comparable to other methods used to solve Sudoku problems and in a number of cases produced better results.
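One of the simplest human-intuitive moves of the kind such an approach composes is the "naked single": fill any cell that has exactly one legal candidate. A sketch of that single move (the representation and move set here are our assumptions, not the paper's):

```python
def candidates(grid, r, c):
    """Digits that can legally go in empty cell (r, c) of a 9x9 grid
    (0 denotes an empty cell)."""
    row = set(grid[r])
    col = {grid[i][c] for i in range(9)}
    br, bc = 3 * (r // 3), 3 * (c // 3)
    box = {grid[i][j] for i in range(br, br + 3) for j in range(bc, bc + 3)}
    return set(range(1, 10)) - row - col - box

def apply_naked_singles(grid):
    """Repeatedly fill every cell that has exactly one candidate."""
    progress = True
    while progress:
        progress = False
        for r in range(9):
            for c in range(9):
                if grid[r][c] == 0 and len(candidates(grid, r, c)) == 1:
                    grid[r][c] = candidates(grid, r, c).pop()
                    progress = True
    return grid

solved = [[5,3,4,6,7,8,9,1,2], [6,7,2,1,9,5,3,4,8], [1,9,8,3,4,2,5,6,7],
          [8,5,9,7,6,1,4,2,3], [4,2,6,8,5,3,7,9,1], [7,1,3,9,2,4,8,5,6],
          [9,6,1,5,3,7,2,8,4], [2,8,7,4,1,9,6,3,5], [3,4,5,2,8,6,1,7,9]]
puzzle = [row[:] for row in solved]
puzzle[0][0] = puzzle[4][4] = 0      # blank two cells, then recover them
filled = apply_naked_singles(puzzle)
```

Harder puzzles need further moves (hidden singles, pointing pairs, and so on); composing such moves into programs is exactly the search space the genetic programming operates over.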
Heuristic thinking makes a chemist smart.
Graulich, Nicole; Hopf, Henning; Schreiner, Peter R
2010-05-01
We focus on the virtually neglected use of heuristic principles in the understanding and teaching of organic chemistry. As human thinking is not comparable to computer systems employing factual knowledge and algorithms--people rarely make decisions through careful consideration of every possible event and its probability, risk or usefulness--research in science and teaching must include the psychological aspects of human decision-making processes. The intuitive analogical and associative reasoning and the ability to categorize unexpected findings typically demonstrated by experienced chemists should be made accessible to young learners through heuristic concepts. The psychology of cognition defines heuristics as strategies that guide human problem-solving and decision procedures, for example with patterns, analogies, or prototypes. Since research in the field of artificial intelligence and current studies in the psychology of cognition have provided evidence for the usefulness of heuristics in discovery, the status of heuristics has grown into something useful and teachable. In this tutorial review, we present a heuristic analysis of a familiar fundamental process in organic chemistry--the cyclic six-electron case--and we show that this approach leads to more conceptual insight in understanding, as well as in teaching and learning.
Optimization of process parameters during vibratory welding technique using Taguchi's analysis
Directory of Open Access Journals (Sweden)
Pravin Kumar Singh
2016-09-01
Full Text Available With the aim of improving the mechanical properties of a weld joint, a new vibratory setup has been designed that is capable of stirring the molten weld pool before it solidifies during shielded metal arc welding (SMAW). Mechanical vibration with a resonance frequency of 300 Hz and an amplitude of 0.5 mm was transferred to the molten weld pool of 6 mm thick mild steel butt-welded joints during the welding operation. The experimental work was conducted over various ranges of frequency, welding current and welding speed. Taguchi's analysis technique has been applied to optimize the process parameters; the response values for the analysis are yield strength and micro-hardness. The test results showed that with the application of the vibratory treatment the hardness and tensile properties increased. The auxiliary vibrations induced into the weld pool increased the micro-hardness of the weld metal, which indicates that crystal reorientation and grain refinement took place. This study shows that vibration applied to the weld pool can successfully improve the mechanical properties of welded joints, providing an alternative welding technique for grain refinement of weldments.
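Taguchi analysis ranks parameter levels by a signal-to-noise ratio; for responses that should be maximized, such as yield strength or micro-hardness, the larger-the-better form applies. A sketch with illustrative replicate values (not data from the paper):

```python
import math

def sn_larger_better(values):
    """Taguchi larger-the-better signal-to-noise ratio:
    S/N = -10 * log10(mean(1 / y^2)). Higher is better."""
    return -10 * math.log10(sum(1 / y ** 2 for y in values) / len(values))

# illustrative yield-strength replicates (MPa) at two vibration frequencies
sn_100hz = sn_larger_better([410, 415, 408])
sn_300hz = sn_larger_better([432, 436, 430])
better = "300 Hz" if sn_300hz > sn_100hz else "100 Hz"
```

Computing this ratio for each level of each factor in the orthogonal array, and picking the level with the highest mean S/N, is how the optimal parameter combination is read off a Taguchi experiment.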
Bhattacharjee, Deblina; Paul, Anand; Kim, Jeong Hong; Kim, Mucheol
2016-01-01
The analysis of leukocyte images has drawn interest from the fields of both medicine and computer vision for quite some time, and different techniques have been applied to automate the process of manual analysis and classification of such images. Manual analysis of blood samples to identify leukocytes is time-consuming and susceptible to error due to the differing morphological features of the cells. In this article, the nature-inspired plant growth simulation algorithm is applied to optimize the image-processing task of object localization in medical images of leukocytes. The paper presents a random bionic algorithm for the automated detection of white blood cells embedded in cluttered smear and stained images of blood samples, using a fitness function that measures the resemblance of a generated candidate solution to an actual leukocyte. The set of candidate solutions evolves via successive iterations as the proposed algorithm proceeds, guaranteeing their fit with the actual leukocytes outlined in the edge map of the image. The higher precision and sensitivity of the proposed scheme compared with existing methods is validated with experimental results on blood cell images. The proposed method reduces the feasible set of growth points in each iteration, thereby reducing the required run time of load flow and objective function evaluation, thus reaching the goal state in minimum time and within the desired constraints.
Energy Technology Data Exchange (ETDEWEB)
Sohn, Bong Young [Kumho Technical R and D Center (Korea, Republic of); Park, Chan Young [Chonnam National Univ., Kwangju (Korea, Republic of)
1996-02-01
The unique Cross Point technique, which offers the optimum loading of carbon black, the reinforcing filler required in rubber compounds for the tire industry, has been proposed. The strength of the technique lies in optimizing the feasible loading by both Mooney viscosity and DSR (Dynamic Stress Relaxometer) measurements simultaneously. In this study, the Mooney viscosity (ML1+4, 125 °C) η_x as a function of the loading X and a characteristic value γ was expressed as η_x = η_0·γ·(1 + a_1·X + a_2·X²). Likewise Sr′, the relative integral of the DSR torque over two seconds, was expressed as a function of X: Sr′ = b_0 + b_1·X. After thermal cure, the natural rubber compounds loaded with the quantity designated by the cross point of the above two equations showed properties feasible enough, as confirmed by tensile tester, Pico abrasion tester and Flexometer, that the experimental coefficients of the equations applicable in the field are listed in Table 2. 12 refs., 4 tabs., 11 figs.
Flow and Mixture Optimization for a Fuel Stratification Engine Using PIV and PLIF Techniques
Li, Y.; Zhao, H.; Ma, T.
2006-07-01
This paper describes an application of PIV (particle image velocimetry) and two-tracer PLIF (planar laser-induced fluorescence) techniques to optimize the in-cylinder flow and to visualize the distribution of two fuels simultaneously for the development of a fuel stratification engine. The research was carried out on a twin-spark four-valve SI engine. The PIV measurement results show that a strong tumbling flow was produced in the cylinder when the intake valves were shrouded. The flow exhibited a symmetrical distribution in the plane perpendicular to the cylinder axis from the early stage of intake until the late stage of compression. This flow pattern helps to stratify the two fuels, introduced from separate ports, into two lateral regions. The stratification of the fuels was observed visually with the two-tracer PLIF technique. During the PLIF measurement, two tracers, 3-pentanone and N,N-dimethylaniline (DMA), were doped into the two fuels, hexane and iso-octane, respectively. Their fluorescence emissions were separated by two optical band-pass filters and recorded simultaneously by a single ICCD camera via an image-doubling system. The PLIF measurement results show that the two fuels were well stratified.
Directory of Open Access Journals (Sweden)
Juan Antonio Castro Flores
Full Text Available ABSTRACT Mesial temporal sclerosis creates a focal epileptic syndrome that usually requires surgical resection of the mesial temporal structures. Objective: To describe a novel operative technique for the treatment of temporal lobe epilepsy and its clinical results. Methods: Prospective case series at a single institution, performed by a single surgeon, from 2006 to 2012. A total of 120 patients underwent minimally invasive keyhole transtemporal amygdalohippocampectomy. Results: Of the patients, 55% were male, and 85% had right-sided disease. The first 70 surgeries had a mean surgical time of 2.51 hours, and the last 50 surgeries had a mean surgical time of 1.62 hours. Morbidity was 3.3%, and 5% of patients had mild temporal muscle atrophy. There was no visual field impairment. On the Engel Outcome Scale at the two-year follow-up, 71% of the patients were Class I, 21% were Class II, and 6% were Class III. Conclusion: This novel technique is feasible and reproducible, with optimal clinical results.
Directory of Open Access Journals (Sweden)
Cuong D. Tran
2015-05-01
Full Text Available It is well recognised that zinc deficiency is a major global public health issue, particularly for young children in low-income countries with diarrhoea and environmental enteropathy. Zinc supplementation is regarded as a powerful tool to correct zinc deficiency as well as to treat a variety of physiologic and pathologic conditions. However, the dose and frequency of its use, as well as the choice of zinc salt, are not clearly defined, regardless of whether it is used to treat a disease or correct a nutritional deficiency. We discuss the application of zinc stable isotope tracer techniques to assess zinc physiology, metabolism and homeostasis, and how these can address knowledge gaps in zinc supplementation pharmacokinetics. This may help to resolve the optimal dose, frequency, length of administration, timing of delivery relative to food intake, and choice of zinc compound. It appears that long-term preventive supplementation can be administered much less frequently than daily, but more research needs to be undertaken to better understand how best to intervene with zinc in children at risk of zinc deficiency. Stable isotope techniques, linked with saturation response and compartmental modelling, also have the potential to assist in the continued search for simple markers of zinc status in health, malnutrition and disease.
A Modularity Degree Based Heuristic Community Detection Algorithm
Directory of Open Access Journals (Sweden)
Dongming Chen
2014-01-01
Full Text Available A community in a complex network can be seen as a subgroup of nodes that are densely connected. The discovery of community structure is a basic research problem with applications in various areas, such as biology, computer science, and sociology. Existing community detection methods usually try to expand or collapse node partitions in order to optimize a given quality function; these optimization-function-based methods share the drawback of inefficiency. Here we propose a heuristic algorithm (the MDBH algorithm) based on network structure which employs modularity degree as its measure function. Experiments on both synthetic benchmarks and real-world networks show that our algorithm achieves accuracy competitive with previous modularity optimization methods while having lower computational complexity. Furthermore, due to the use of modularity degree, our algorithm naturally improves on the resolution limit in community detection.
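Modularity degree builds on Newman's modularity Q, which can be computed directly from an edge list: the fraction of edges inside communities minus the fraction expected under a random degree-preserving rewiring. A sketch on a toy two-community graph (the MDBH algorithm itself is not reproduced here):

```python
def modularity(edges, communities):
    """Newman's modularity Q for an undirected edge list; `communities`
    maps each node to its community label."""
    m = len(edges)
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    # observed fraction of intra-community edges ...
    inside = sum(1 for u, v in edges if communities[u] == communities[v]) / m
    # ... minus the fraction expected from the degree sequence alone
    expected = sum(
        (sum(deg[n] for n in deg if communities[n] == c) / (2 * m)) ** 2
        for c in set(communities.values()))
    return inside - expected

# two triangles joined by one bridge edge: a clear two-community structure
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
q = modularity(edges, {0: "a", 1: "a", 2: "a", 3: "b", 4: "b", 5: "b"})
```

Here Q = 6/7 - 1/2 ≈ 0.357, well above zero, confirming that splitting at the bridge edge captures real structure rather than a random fluctuation.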
Anatomy-based transmission factors for technique optimization in portable chest x-ray
Liptak, Christopher L.; Tovey, Deborah; Segars, William P.; Dong, Frank D.; Li, Xiang
2015-03-01
Portable x-ray examinations often account for a large percentage of all radiographic examinations. Currently, portable examinations do not employ automatic exposure control (AEC). To aid in the design of a size-specific technique chart, acrylic slabs of various thicknesses are often used to estimate x-ray transmission for patients of various body thicknesses. This approach, while simple, does not account for patient anatomy, tissue heterogeneity, and the attenuation properties of the human body. To better account for these factors, in this work, we determined x-ray transmission factors using computational patient models that are anatomically realistic. A Monte Carlo program was developed to model a portable x-ray system. Detailed modeling was done of the x-ray spectrum, detector positioning, collimation, and source-to-detector distance. Simulations were performed using 18 computational patient models from the extended cardiac-torso (XCAT) family (9 males, 9 females; age range: 2-58 years; weight range: 12-117 kg). The ratio of air kerma at the detector with and without a patient model was calculated as the transmission factor. Our study showed that the transmission factor decreased exponentially with increasing patient thickness. For the range of patient thicknesses examined (12-28 cm), the transmission factor ranged from approximately 21% to 1.9% when the air kerma used in the calculation represented an average over the entire imaging field of view. The transmission factor ranged from approximately 21% to 3.6% when the air kerma used in the calculation represented the average signals from two discrete AEC cells behind the lung fields. These exponential relationships may be used to optimize imaging techniques for patients of various body thicknesses to aid in the design of clinical technique charts.
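The reported exponential relationship can be turned into a technique-chart helper by fitting T(t) = a·exp(-b·t) to the two endpoint values quoted in the abstract (about 21% at 12 cm and 1.9% at 28 cm for the full field of view). Intermediate values are our interpolation, and the output-scaling helper is one possible use, not the paper's procedure:

```python
import math

# two (thickness, transmission) points fix the model T(t) = a * exp(-b * t)
t1, T1 = 12.0, 0.21
t2, T2 = 28.0, 0.019
b = math.log(T1 / T2) / (t2 - t1)
a = T1 * math.exp(b * t1)

def transmission(thickness_cm):
    """Fraction of incident air kerma reaching the detector."""
    return a * math.exp(-b * thickness_cm)

def relative_output(thickness_cm):
    """Tube-output scaling needed for constant detector air kerma,
    relative to a 12 cm patient -- one way to seed a technique chart."""
    return transmission(t1) / transmission(thickness_cm)

mid = transmission(20.0)
```

Because the model is exponential, the 20 cm value comes out as the geometric mean of the two endpoints, and the required output grows by a constant factor per added centimeter of tissue.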
MacGregor, James N.; Chronicle, Edward P.; Ormerod, Thomas C.
2006-01-01
We compared the performance of three heuristics with that of subjects on variants of a well-known combinatorial optimization task, the Traveling Salesperson Problem (TSP). The present task consisted of finding the shortest path through an array of points from one side of the array to the other. Like the standard TSP, the task is computationally…
Heuristic space diversity management in a meta-hyper-heuristic framework
CSIR Research Space (South Africa)
Grobler, J
2014-07-01
Full Text Available IEEE Congress on Evolutionary Computation (CEC), Beijing, China, 6-11 July 2014. Heuristic Space Diversity Management in a Meta-Hyper-Heuristic Framework. Jacomine Grobler and Andries P. Engelbrecht, Department of Industrial and Systems...
Energy Technology Data Exchange (ETDEWEB)
Shah, Chirag [Department of Radiation Oncology, William Beaumont Hospital, Royal Oak, MI (United States); Vicini, Frank A., E-mail: fvicini@beaumont.edu [Department of Radiation Oncology, William Beaumont Hospital, Royal Oak, MI (United States)
2011-11-15
As more women survive breast cancer, long-term toxicities affecting their quality of life, such as lymphedema (LE) of the arm, gain importance. Although numerous studies have attempted to determine incidence rates, identify optimal diagnostic tests, enumerate efficacious treatment strategies and outline risk reduction guidelines for breast cancer-related lymphedema (BCRL), few groups have consistently agreed on any of these issues. As a result, standardized recommendations are still lacking. This review will summarize the latest data addressing all of these concerns in order to provide patients and health care providers with optimal, contemporary recommendations. Published incidence rates for BCRL vary substantially with a range of 2-65% based on surgical technique, axillary sampling method, radiation therapy fields treated, and the use of chemotherapy. Newer clinical assessment tools can potentially identify BCRL in patients with subclinical disease with prospective data suggesting that early diagnosis and management with noninvasive therapy can lead to excellent outcomes. Multiple therapies exist with treatments defined by the severity of BCRL present. Currently, the standard of care for BCRL in patients with significant LE is complex decongestive physiotherapy (CDP). Contemporary data also suggest that a multidisciplinary approach to the management of BCRL should begin prior to definitive treatment for breast cancer employing patient-specific surgical, radiation therapy, and chemotherapy paradigms that limit risks. Further, prospective clinical assessments before and after treatment should be employed to diagnose subclinical disease. In those patients who require aggressive locoregional management, prophylactic therapies and the use of CDP can help reduce the long-term sequelae of BCRL.
Comet assay on thawed embryos: An optimized technique to evaluate DNA damage in mouse embryos.
Rolland, L; Courbiere, B; Tassistro, V; Sansoni, A; Orsière, T; Liu, W; Di Giorgio, C; Perrin, J
2017-10-01
Our objective was to optimize the comet assay (CA) technique on mammalian embryos. 1000 frozen 2-cell embryos from B6CBA mice were used. Based on a literature review, and after checking post-thaw embryo viability, the main outcome measures included: 1) comparison of the embryo recovery rate between two CA protocols (two agarose layers and three agarose layers); 2) comparison of DNA damage by the CA on embryos with (ZP+) and without (ZP-) zona pellucida; and 3) comparison of DNA damage in embryos exposed to two genotoxic agents (H2O2 and simulated sunlight irradiation (SSI)). DNA damage was quantified by the % tail DNA. 1) The recovery rate was 3.3% (n=5/150) with the two-agarose-layer protocol and 71.3% (n=266/371) with the three-agarose-layer protocol. 2) DNA damage did not differ statistically significantly between ZP- and ZP+ embryos (12.60±2.53% tail DNA vs 11.04±1.50 (p=0.583) for the control group and 49.23±4.16 vs 41.13±4.31 (p=0.182) for the H2O2 group). 3) H2O2 and SSI induced a statistically significant increase in DNA damage compared with the control group (41.13±4.31% and 36.33±3.02% vs 11.04±1.50% tail DNA (p<0.0001)). The CA on mammalian embryos was optimized by using thawed embryos, by avoiding ZP removal and by the addition of a third agarose layer. Copyright © 2017 Elsevier Ltd. All rights reserved.
Becerra, Sandra C; Roy, Daniel C; Sanchez, Carlos J; Christy, Robert J; Burmeister, David M
2016-04-12
Bacterial infections are a common clinical problem in both acute and chronic wounds. With growing concerns over antibiotic resistance, treatment of bacterial infections should only occur after positive diagnosis. Currently, diagnosis is delayed by lengthy culturing methods, which may also fail to identify the presence of bacteria. While newer, costly bacterial identification methods are being explored, a simple and inexpensive diagnostic tool would aid in immediate and accurate treatment of bacterial infections. Histologically, hematoxylin and eosin (H&E) and Gram stains have been employed, but they are far from optimal for analyzing tissue samples due to non-specific staining. The goal of the current study was to develop a modification of the Gram stain that enhances the contrast between bacteria and host tissue. A modified Gram stain was developed and tested that improves the contrast between Gram-positive bacteria, Gram-negative bacteria and host tissue. Initially, clinically relevant strains of Pseudomonas aeruginosa and Staphylococcus aureus were visualized in vitro and in biopsies of infected porcine burns using the routine Gram stain, with immunohistochemistry involving bacterial strain-specific fluorescent antibodies as a validation tool. H&E and Gram staining of serial biopsy sections were then compared to a modification of the Gram stain incorporating a counterstain that highlights the collagen found in tissue. The modified Gram stain clearly identified both Gram-positive and Gram-negative bacteria, and compared to H&E or Gram stain alone provided excellent contrast between bacteria and non-viable burn eschar. Moreover, when applied to surgical biopsies from patients who underwent burn debridement, this technique was able to clearly detect bacterial morphology within host tissue.
Cilla, Myriam; Borgiani, Edoardo; Martínez, Javier; Duda, Georg N.; Checa, Sara
2017-01-01
Today, different implant designs exist on the market; however, there is no clear understanding of which implant design parameters are best for achieving optimal mechanical conditions. Therefore, the aim of this project was to investigate whether the geometry of a commercial short-stem hip prosthesis can be further optimized to reduce stress-shielding effects and achieve better short-stemmed implant performance. To reach this aim, the potential of machine learning techniques combined with param...
Ramalingom, Delphine; Cocquet, Pierre-Henri; Bastide, Alain
2017-01-01
Submitted to Computers And Fluids; This paper proposes a new interpolation technique based on a density approach to solve topology optimization problems for heat transfer. Problems are modeled under the assumptions of steady-state laminar flow using the incompressible Navier-Stokes equations coupled to the convection-diffusion equation through the Boussinesq approximation. The governing equations are discretized using finite volume elements and topology optimization is performed using adjoint s...
Stieler, Florian; Yan, Hui; Lohr, Frank; Wenz, Frederik; Yin, Fang-Fang
2009-01-01
Background Parameter optimization in the process of inverse treatment planning for intensity modulated radiation therapy (IMRT) is mainly conducted by human planners in order to create a plan with the desired dose distribution. To automate this tedious process, an artificial intelligence (AI) guided system was developed and examined. Methods The AI system can automatically accomplish the optimization process based on prior knowledge operated by several fuzzy inference systems (FIS). Prior knowledge, which was collected from human planners during their routine trial-and-error process of inverse planning, has first to be "translated" to a set of "if-then rules" for driving the FISs. To minimize subjective error which could be costly during this knowledge acquisition process, it is necessary to find a quantitative method to automatically accomplish this task. A well-developed machine learning technique, based on an adaptive neuro fuzzy inference system (ANFIS), was introduced in this study. Based on this approach, prior knowledge of a fuzzy inference system can be quickly collected from observation data (clinically used constraints). The learning capability and the accuracy of such a system were analyzed by generating multiple FIS from data collected from an AI system with known settings and rules. Results Multiple analyses showed good agreements of FIS and ANFIS according to rules (error of the output values of ANFIS based on the training data from FIS of 7.77 ± 0.02%) and membership functions (3.9%), thus suggesting that the "behavior" of an FIS can be propagated to another, based on this process. The initial experimental results on a clinical case showed that ANFIS is an effective way to build FIS from practical data, and analysis of ANFIS and FIS with clinical cases showed good planning results provided by ANFIS. OAR volumes encompassed by characteristic percentages of isodoses were reduced by a mean of between 0 and 28%. Conclusion The study demonstrated a
Directory of Open Access Journals (Sweden)
Baseem Khan
2017-01-01
Full Text Available Exhaustive knowledge of optimal power flow (OPF) methods is critical for proper system operation and planning, since OPF methods are utilized for finding the optimal state of any system under constraint conditions, such as loss minimization, reactive power limits, thermal limits of transmission lines, and reactive power optimization. Incorporating renewable energy sources adds further constraints to the power flow optimization. This work presents a comprehensive study of optimal power flow methods with conventional and renewable energy constraints, and traces the progress of optimal power flow solutions from their beginnings to their present form. The authors classify the optimal power flow methods under different constraint conditions of conventional and renewable energy sources. The current and future applications of optimal power flow programs in smart system planning, operations, sensitivity calculation, and control are presented. This study will help engineers and researchers to optimize power flow with conventional and renewable energy sources.
Prediction-based dynamic load-sharing heuristics
Goswami, Kumar K.; Devarakonda, Murthy; Iyer, Ravishankar K.
1993-01-01
The authors present dynamic load-sharing heuristics that use predicted resource requirements of processes to manage workloads in a distributed system. A previously developed statistical pattern-recognition method is employed for resource prediction. While nonprediction-based heuristics depend on a rapidly changing system status, the new heuristics depend on slowly changing program resource usage patterns. Furthermore, prediction-based heuristics can be more effective since they use future requirements rather than just the current system state. Four prediction-based heuristics, two centralized and two distributed, are presented. Using trace-driven simulations, they are compared against random scheduling and two effective nonprediction-based heuristics. Results show that the prediction-based centralized heuristics achieve up to 30 percent better response times than the nonprediction centralized heuristic, and that the prediction-based distributed heuristics achieve up to 50 percent improvements relative to their nonprediction counterpart.
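The centralized prediction-based policy reduces to a greedy rule: charge each incoming process's predicted demand to the host with the least predicted outstanding work. The sketch below is illustrative only; the function name and data layout are assumptions, not the authors' implementation.

```python
def assign_host(predicted_cost, host_loads):
    """Greedy prediction-based placement: send the incoming process to
    the host whose sum of *predicted* resource demands is lowest, then
    charge the prediction to that host immediately. A nonprediction
    heuristic would instead consult the rapidly changing current load."""
    best = min(host_loads, key=host_loads.get)
    host_loads[best] += predicted_cost
    return best
```

For example, with predicted outstanding work `{"h1": 4.0, "h2": 1.5, "h3": 3.0}`, a job predicted to need 2.0 units is placed on `h2`, whose tracked load then becomes 3.5.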
DEFF Research Database (Denmark)
Rodrigues, Vinicius Picanco; Morabito, Reinaldo; Yamashita, Denise
2016-01-01
application of two tailor-made MIP heuristics, based on relax-and-fix and time-decomposition procedures. The model minimizes fuel costs of a heterogeneous fleet of oil tankers and costs related to freighting contracts. The model also considers company-specific constraints for offshore oil transportation. … Computational experiments based on the mathematical models and the related MIP heuristics are presented for a set of real data provided by the company, which confirm the potential of optimization-based methods to find good solutions for problems of moderate sizes.
Exact and heuristic solutions to the Double TSP with Multiple Stacks
DEFF Research Database (Denmark)
Petersen, Hanne Løhmann; Archetti, Claudia; Madsen, Oli B.G.
-pallet, which can be loaded in 3 stacks in a standard 40-foot container. Different exact and heuristic solution approaches to the DTSPMS have been implemented and tested. The exact approaches are based on different mathematical formulations of the problem which are solved using branch-and-cut. One formulation … instances. The implemented heuristics include tabu search, simulated annealing and large neighbourhood search. Particularly the LNS approach shows promising results. It finds the known optimal solution of smaller instances (15 orders) within 10 seconds in most cases, and in 3 minutes it finds solutions…
An Innovative Heuristic in Multi-Item Replenishment Problem for One Warehouse and N Retailers
Directory of Open Access Journals (Sweden)
Yugowati Praharsi
2014-01-01
Full Text Available The joint replenishment problem (JRP) is a type of inventory model which aims to minimize the total inventory cost, consisting of the major ordering cost, minor ordering costs and inventory holding costs. Different from previous papers, this study considers one warehouse, multiple items and N retailers. An innovative heuristic approach is developed to solve the problem. In this paper, we consider a multi-echelon inventory system and seek a balance between the ordering costs and the inventory holding costs at each installation. The computational results show that the innovative heuristic provides solutions close to the exact optimum, but is more efficient in terms of computational time and the number of iterations.
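The paper's innovative heuristic itself is not reproduced here, but the classical JRP cost structure it trades off can be sketched. In the standard formulation with a basic cycle T and integer multipliers k[i] (the notation below is assumed, not taken from the paper):

```python
def jrp_cost(T, k, S, s, d, h):
    """Classical joint-replenishment cost rate with basic cycle T, where
    item i is ordered every k[i]*T periods:
    TC = S/T + sum_i( s[i]/(k[i]*T) + h[i]*d[i]*k[i]*T/2 )."""
    total = S / T
    for ki, si, di, hi in zip(k, s, d, h):
        total += si / (ki * T) + hi * di * ki * T / 2.0
    return total

def best_multipliers(T, s, d, h, kmax=20):
    """Once T is fixed the items decouple, so each integer multiplier
    can be chosen independently by direct search over 1..kmax."""
    return [min(range(1, kmax + 1),
                key=lambda m: si / (m * T) + hi * di * m * T / 2.0)
            for si, di, hi in zip(s, d, h)]
```

For instance, with T = 1, minor cost 8, demand 1 and holding cost 1 for a single item, the best multiplier is 4 (cost 8/4 + 4/2 = 4 per period for that item).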
Mathematical models and a constructive heuristic for finding minimum fundamental cycle bases
Directory of Open Access Journals (Sweden)
Liberti Leo
2005-01-01
Full Text Available The problem of finding a fundamental cycle basis with minimum total cost in a graph arises in many application fields. In this paper we present some integer linear programming formulations and we compare their performances, in terms of instance size, CPU time required for the solution, and quality of the associated lower bound derived by solving the corresponding continuous relaxations. Since only very small instances can be solved to optimality with these formulations and very large instances occur in a number of applications, we present a new constructive heuristic and compare it with alternative heuristics.
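As background to the object being optimized: given any spanning tree, each non-tree edge closes exactly one fundamental cycle, and the basis cost is the total length of those cycles. The sketch below shows only the basis induced by one BFS tree of a connected graph on nodes 0..n-1, not the paper's heuristic search over trees:

```python
from collections import deque

def fundamental_cycles(n, edges):
    """Build a BFS spanning tree of a connected undirected graph; every
    non-tree edge (u, v) closes one fundamental cycle consisting of the
    edge plus the tree paths from u and v to their common ancestor."""
    adj = [[] for _ in range(n)]
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    parent, depth, seen = [None] * n, [0] * n, [False] * n
    seen[0], queue, tree = True, deque([0]), set()
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if not seen[v]:
                seen[v] = True
                parent[v], depth[v] = u, depth[u] + 1
                tree.add(frozenset((u, v)))
                queue.append(v)
    cycles = []
    for u, v in edges:
        if frozenset((u, v)) in tree:
            continue
        path_u, path_v, a, b = [u], [v], u, v
        while a != b:                       # climb to the common ancestor
            if depth[a] >= depth[b]:
                a = parent[a]
                path_u.append(a)
            else:
                b = parent[b]
                path_v.append(b)
        cycles.append(path_u + path_v[-2::-1])
    return cycles
```

On a 4-cycle (square), the basis contains the single cycle through all four vertices.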
Directory of Open Access Journals (Sweden)
Wilmar Hernandez
2007-01-01
Full Text Available In this paper a survey of recent applications of optimal signal processing techniques to improve the performance of mechanical sensors is presented. A comparison between classical filters and optimal filters for automotive sensors is made, and the current state of the art of the application of robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today's cars need is presented through several experimental results, which show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way forward. However, the switch between the traditional methods of designing automotive sensors and the new ones cannot be made overnight because there are some open research issues that have to be solved. This paper draws attention to one of these open research issues and tries to arouse researchers' interest in the fusion of intelligent sensors and optimal signal processing techniques.
A set-covering based heuristic algorithm for the periodic vehicle routing problem.
Cacchiani, V; Hemmelmayr, V C; Tricoire, F
2014-01-30
We present a hybrid optimization algorithm for mixed-integer linear programming, embedding both heuristic and exact components. In order to validate it we use the periodic vehicle routing problem (PVRP) as a case study. This problem consists of determining a set of minimum cost routes for each day of a given planning horizon, with the constraints that each customer must be visited a required number of times (chosen among a set of valid day combinations), must receive every time the required quantity of product, and that the number of routes per day (each respecting the capacity of the vehicle) does not exceed the total number of available vehicles. This is a generalization of the well-known vehicle routing problem (VRP). Our algorithm is based on the linear programming (LP) relaxation of a set-covering-like integer linear programming formulation of the problem, with additional constraints. The LP-relaxation is solved by column generation, where columns are generated heuristically by an iterated local search algorithm. The whole solution method takes advantage of the LP-solution and applies techniques of fixing and releasing of the columns as a local search, making use of a tabu list to avoid cycling. We show the results of the proposed algorithm on benchmark instances from the literature and compare them to the state-of-the-art algorithms, showing the effectiveness of our approach in producing good quality solutions. In addition, we report the results on realistic instances of the PVRP introduced in Pacheco et al. (2011) [24] and on benchmark instances of the periodic traveling salesman problem (PTSP), showing the efficacy of the proposed algorithm on these as well. Finally, we report the new best known solutions found for all the tested problems.
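The column-generation and LP machinery is too involved for a short sketch, but the underlying set-covering view of the formulation can be illustrated with the textbook greedy heuristic over a pool of candidate routes (the route names and costs below are hypothetical):

```python
def greedy_route_cover(customers, routes):
    """Textbook greedy set-covering heuristic. `routes` maps a route
    name to (cost, set_of_customers); repeatedly pick the route with the
    lowest cost per newly covered customer. This only illustrates the
    set-covering view, not the paper's column generation or tabu list."""
    uncovered = set(customers)
    chosen = []
    while uncovered:
        name = min((n for n, (c, cs) in routes.items() if cs & uncovered),
                   key=lambda n: routes[n][0] / len(routes[n][1] & uncovered))
        chosen.append(name)
        uncovered -= routes[name][1]
    return chosen
```

With customers {1, 2, 3} and routes a = (1, {1}), b = (1, {2, 3}), c = (3, {1, 2, 3}), the greedy picks b first (cost 0.5 per new customer), then a.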
Directory of Open Access Journals (Sweden)
Ahmed F. Mohamed
2014-05-01
Full Text Available One of the most recent optimization techniques applied to the optimal design of a photovoltaic system to supply an isolated load demand is the Artificial Bee Colony Algorithm (ABC). The proposed methodology is applied to optimize the cost of the PV system, including the photovoltaic modules, a battery bank, a battery charger controller, and an inverter. Two objective functions are proposed: the first one is the PV module output power, which is to be maximized, and the second one is the life cycle cost (LCC), which is to be minimized. The analysis is performed based on solar radiation and ambient temperature measured at Helwan city, Egypt. A comparison between the ABC algorithm and Genetic Algorithm (GA) optimal results is made. Another location, Zagazig city, is selected to check the validity of the ABC algorithm elsewhere. The ABC results are closer to the optimum than those of the GA. The results encourage the use of PV systems to electrify the rural sites of Egypt.
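A minimal ABC loop for a generic minimization problem can be sketched as follows; all parameter values and the code structure are illustrative assumptions, not the paper's configuration:

```python
import random

def abc_minimize(f, dim, lo, hi, n_food=10, limit=20, iters=300, seed=3):
    """Minimal Artificial Bee Colony: employed bees perturb food sources,
    onlookers resample sources proportionally to fitness, and scouts
    reinitialize sources that have stopped improving."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_food)]
    F = [f(x) for x in X]
    trials = [0] * n_food
    b = min(range(n_food), key=F.__getitem__)
    best, fbest = X[b][:], F[b]

    def probe(i):
        nonlocal best, fbest
        k = rng.randrange(n_food - 1)
        k += k >= i                        # random partner source, k != i
        j = rng.randrange(dim)
        v = X[i][:]
        v[j] += rng.uniform(-1.0, 1.0) * (X[i][j] - X[k][j])
        v[j] = min(hi, max(lo, v[j]))
        fv = f(v)
        if fv < F[i]:
            X[i], F[i], trials[i] = v, fv, 0
            if fv < fbest:
                best, fbest = v[:], fv
        else:
            trials[i] += 1

    for _ in range(iters):
        for i in range(n_food):            # employed-bee phase
            probe(i)
        fit = [1.0 / (1.0 + fi) if fi >= 0 else 1.0 + abs(fi) for fi in F]
        for i in rng.choices(range(n_food), weights=fit, k=n_food):
            probe(i)                       # onlooker phase
        for i in range(n_food):            # scout phase
            if trials[i] > limit:
                X[i] = [rng.uniform(lo, hi) for _ in range(dim)]
                F[i], trials[i] = f(X[i]), 0
    return best, fbest
```

On a 2-D sphere function over [-5, 5], this sketch converges close to the origin within a few hundred iterations.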
Quantifying Heuristic Bias: Anchoring, Availability, and Representativeness.
Richie, Megan; Josephson, S Andrew
2018-01-01
Construct: Authors examined whether a new vignette-based instrument could isolate and quantify heuristic bias. Heuristics are cognitive shortcuts that may introduce bias and contribute to error. There is no standardized instrument available to quantify heuristic bias in clinical decision making, limiting future study of educational interventions designed to improve calibration of medical decisions. This study presents validity data to support a vignette-based instrument quantifying bias due to the anchoring, availability, and representativeness heuristics. Participants completed questionnaires requiring assignment of probabilities to potential outcomes of medical and nonmedical scenarios. The instrument randomly presented scenarios in one of two versions: Version A, encouraging heuristic bias, and Version B, worded neutrally. The primary outcome was the difference in probability judgments for Version A versus Version B scenario options. Of 167 participants recruited, 139 enrolled. Participants assigned significantly higher mean probability values to Version A scenario options (M = 9.56, SD = 3.75) than Version B (M = 8.98, SD = 3.76), t(1801) = 3.27, p = .001. This result remained significant analyzing medical scenarios alone (Version A, M = 9.41, SD = 3.92; Version B, M = 8.86, SD = 4.09), t(1204) = 2.36, p = .02. Analyzing medical scenarios by heuristic revealed a significant difference between Version A and B for availability (Version A, M = 6.52, SD = 3.32; Version B, M = 5.52, SD = 3.05), t(404) = 3.04, p = .003, and representativeness (Version A, M = 11.45, SD = 3.12; Version B, M = 10.67, SD = 3.71), t(396) = 2.28, p = .02, but not anchoring. Stratifying by training level, students maintained a significant difference between Version A and B medical scenarios (Version A, M = 9.83, SD = 3.75; Version B, M = 9.00, SD = 3.98), t(465) = 2.29, p = .02, but not residents or attendings. Stratifying by heuristic and training level, availability maintained
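The reported comparisons are unpaired t tests on mean probability judgments; the statistic in its pooled-variance form (consistent with the df = n1 + n2 - 2 values quoted) can be computed directly from the reported means, standard deviations and group sizes:

```python
import math

def two_sample_t(m1, s1, n1, m2, s2, n2):
    """Unpaired two-sample t statistic with pooled variance; the
    associated degrees of freedom are n1 + n2 - 2."""
    sp2 = ((n1 - 1) * s1 ** 2 + (n2 - 1) * s2 ** 2) / (n1 + n2 - 2)
    return (m1 - m2) / math.sqrt(sp2 * (1.0 / n1 + 1.0 / n2))
```

For example, two groups of 50 with means 10 and 9 and a common SD of 2 give t = 2.5 on 98 degrees of freedom.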
Technology optimization techniques for multicomponent optical band-pass filter manufacturing
Baranov, Yuri P.; Gryaznov, Georgiy M.; Rodionov, Andrey Y.; Obrezkov, Andrey V.; Medvedev, Roman V.; Chivanov, Alexey N.
2016-04-01
Narrowband optical devices (like IR-sensing devices, celestial navigation systems, solar-blind UV-systems and many others) are one of the fastest-growing areas in optical manufacturing. However, signal strength in this type of application is quite low, and the performance of devices depends on the attenuation level of wavelengths outside the operating range. Modern detectors (photodiodes, matrix detectors, photomultiplier tubes and others) usually do not have the required selectivity, or at worst have higher sensitivity to the background spectrum. Manufacturing a single-component band-pass filter with a high attenuation level is a resource-intensive task, and sometimes it is not possible to find a solution using existing technologies. Different types of filters have technology variations of transmittance profile shape due to various production factors. At the same time there are multiple tasks with strict requirements for background spectrum attenuation in narrowband optical devices. For example, in a solar-blind UV-system wavelengths above 290-300 nm must be attenuated by 180 dB. In this paper, techniques for assembling multi-component optical band-pass filters from multiple single elements with technology variations of transmittance profile shape for an optimal signal-to-noise ratio (SNR) are proposed. Relationships between the signal-to-noise ratio and different characteristics of the transmittance profile shape are shown. The obtained practical results were in rather good agreement with our calculations.
Ćujić, Nada; Šavikin, Katarina; Janković, Teodora; Pljevljakušić, Dejan; Zdunić, Gordana; Ibrić, Svetlana
2016-03-01
The traditional maceration method was used for the extraction of polyphenols from chokeberry (Aronia melanocarpa) dried fruit, and the effects of several extraction parameters on the total phenolics and anthocyanins contents were studied. Various solvents, particle sizes, solid-solvent ratios and extraction times were investigated as independent variables in a two-level factorial design. Among the examined variables, time was not a statistically significant factor for the extraction of polyphenols. The optimal extraction conditions were maceration of 0.75 mm size berries in 50% ethanol, with a solid-solvent ratio of 1:20; predicted values were 27.7 mg GAE/g for total phenolics and 0.27% for total anthocyanins. Under the selected conditions, the experimental total phenolics were 27.8 mg GAE/g and total anthocyanins were 0.27%, in agreement with the predicted values. In addition, a complementary quantitative analysis of individual phenolic compounds was performed using an HPLC method. The study indicated that maceration is an effective and simple technique for the extraction of bioactive compounds from chokeberry fruit. Copyright © 2015 Elsevier Ltd. All rights reserved.
A study on the stress cycle determination using an optimization technique
Energy Technology Data Exchange (ETDEWEB)
Ko, Han Ok; Jhung, Myung Jo [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of); Choi, Jae Boong [Sungkyunkwan University, Suwon (Korea, Republic of)
2012-05-15
Monitoring of various transients is needed because they can cause fatigue damage to nuclear components when transients occur due to events such as plant heat-up/cool-down or reactor trip in operating NPPs. For this purpose, the Korea Institute of Nuclear Safety (KINS) developed a monitoring system, called the Computerized technical Advisory system for the Radiological Emergency (CARE). However, it is difficult to evaluate the fatigue damage using the actual transient data, because the amount of data is very large and the signals usually contain noise. So various methods to determine the stress cycles, such as peak counting, level-crossing counting or simple range counting, were developed. In this paper, an engineering methodology which extracts stress cycles from real-time transient data has been developed. To determine stress cycles, the racetrack counting and rainflow counting methods were used, and an optimization technique was applied to implement the racetrack counting algorithm. Depending on the application of the racetrack method, the stress cycles counted by the rainflow method were compared. As a result, it was found that stress cycles smaller than a threshold value were discarded
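The racetrack filter proper tracks direction reversals inside a moving band; as a much-simplified illustration of how small stress cycles get discarded before counting, a plain amplitude deadband can be sketched (an assumption-level stand-in, not the implemented algorithm):

```python
def deadband_filter(samples, width):
    """Keep a sample only if it departs from the last kept sample by at
    least `width`, discarding small stress fluctuations before cycle
    counting. A simplified stand-in for the racetrack filter, which
    additionally tracks turning points inside the band."""
    kept = [samples[0]]
    for x in samples[1:]:
        if abs(x - kept[-1]) >= width:
            kept.append(x)
    return kept
```

For example, with a width of 5, the sequence 0, 1, 10, 9, 20 reduces to 0, 10, 20: the small excursions to 1 and 9 are discarded.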
Directory of Open Access Journals (Sweden)
Washington Alves de Oliveira
Full Text Available ABSTRACT In this work we propose a heuristic algorithm for the layout optimization of disks installed in a rotating circular container. This is an unequal circle packing problem with additional balance constraints. It has been proved NP-hard, which justifies heuristic methods for its resolution on larger instances. The main feature of our heuristic is the selection of the next circle to be placed inside the container according to the position of the system's center of mass. Our approach has been tested on a series of instances with up to 55 circles and compared with the literature. Computational results show good performance of the proposed algorithm in terms of solution quality and computational time.
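The balance criterion driving the selection rule can be sketched as follows, assuming uniform-density disks so that mass scales with r²; candidate placements are supplied externally, which is a simplification of the actual heuristic:

```python
import math

def center_of_mass(disks):
    """disks: list of (x, y, r); uniform density, mass proportional to r**2."""
    m = sum(r * r for _, _, r in disks)
    cx = sum(x * r * r for x, _, r in disks) / m
    cy = sum(y * r * r for _, y, r in disks) / m
    return cx, cy

def best_placement(disks, options):
    """Among candidate (x, y, r) placements for the next disk, pick the
    one that leaves the system's center of mass closest to the container
    center at the origin (the balance criterion)."""
    def imbalance(opt):
        cx, cy = center_of_mass(disks + [opt])
        return math.hypot(cx, cy)
    return min(options, key=imbalance)
```

With one disk already at (1, 0), a counterbalancing disk at (-1, 0) is preferred over one at (0, 1), since it restores the center of mass to the origin.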
Heuristics for multiobjective multiple sequence alignment.
Abbasi, Maryam; Paquete, Luís; Pereira, Francisco B
2016-07-15
Aligning multiple sequences arises in many tasks in Bioinformatics. However, the alignments produced by the current software packages are highly dependent on the parameters setting, such as the relative importance of opening gaps with respect to the increase of similarity. Choosing only one parameter setting may provide an undesirable bias in further steps of the analysis and give too simplistic interpretations. In this work, we reformulate multiple sequence alignment from a multiobjective point of view. The goal is to generate several sequence alignments that represent a trade-off between maximizing the substitution score and minimizing the number of indels/gaps in the sum-of-pairs score function. This trade-off gives to the practitioner further information about the similarity of the sequences, from which she could analyse and choose the most plausible alignment. We introduce several heuristic approaches, based on local search procedures, that compute a set of sequence alignments, which are representative of the trade-off between the two objectives (substitution score and indels). Several algorithm design options are discussed and analysed, with particular emphasis on the influence of the starting alignment and neighborhood search definitions on the overall performance. A perturbation technique is proposed to improve the local search, which provides a wide range of high-quality alignments. The proposed approach is tested experimentally on a wide range of instances. We performed several experiments with sequences obtained from the benchmark database BAliBASE 3.0. To evaluate the quality of the results, we calculate the hypervolume indicator of the set of score vectors returned by the algorithms. The results obtained allow us to identify reasonably good choices of parameters for our approach. Further, we compared our method in terms of correctly aligned pairs ratio and columns correctly aligned ratio with respect to reference alignments. Experimental results show
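The hypervolume indicator used for evaluation can be computed exactly in two dimensions by a sweep; the sketch below assumes both objectives are maximized and the reference point is dominated by every solution:

```python
def hypervolume_2d(points, ref):
    """Area dominated by a 2-D maximization front relative to `ref`.
    Sweep the points in decreasing order of the first objective,
    accumulating rectangular slabs; dominated points contribute nothing."""
    hv, prev_y = 0.0, ref[1]
    for x, y in sorted(set(points), reverse=True):
        if y > prev_y:
            hv += (x - ref[0]) * (y - prev_y)
            prev_y = y
    return hv
```

For the front {(3, 1), (2, 2), (1, 3)} with reference point (0, 0), the dominated area is 6; adding a dominated point such as (1, 1) leaves the value unchanged.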
A heuristic for the inventory management of smart vending machine systems
Directory of Open Access Journals (Sweden)
Yang-Byung Park
2012-12-01
Full Text Available Purpose: The purpose of this paper is to propose a heuristic for the inventory management of smart vending machine systems with product substitution under the (replenishment point, order-up-to level) policy and to evaluate its performance. Design/methodology/approach: The heuristic is developed on the basis of a decoupled approach. An integer linear mathematical model is built to determine the number of product storage compartments and the replenishment threshold for each smart vending machine in the system, and Clarke and Wright's savings algorithm is applied to route vehicles for inventory replenishment of smart vending machines that share the same delivery days. Computational experiments are conducted on several small-size test problems to compare the proposed heuristic with the integrated optimization mathematical model with respect to system profit. Furthermore, a sensitivity analysis is carried out on a medium-size test problem to evaluate the effect of the customer service level on system profit using a computer simulation. Findings: The results show that the proposed heuristic yields good solutions, with a 5.7% average error rate compared to the optimal solutions. The proposed heuristic took about 3 CPU minutes on average on test problems consisting of 10 five-product smart vending machines. It was confirmed that the system profit is significantly affected by the customer service level. Originality/value: The inventory management of smart vending machine systems is newly treated. Product substitutions are explicitly considered in the model. The proposed heuristic is effective as well as efficient, and can easily be modified for application to various retail vending settings under a vendor-managed inventory scheme with a POS system.
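Clarke and Wright's savings algorithm, used here for the routing step, can be sketched for the single-depot capacitated case; Euclidean distances and the data layout below are illustrative assumptions:

```python
import math

def clarke_wright(coords, demand, capacity, depot=0):
    """Clarke and Wright's savings heuristic: compute the savings
    s(i, j) = d(0, i) + d(0, j) - d(i, j) for each customer pair, then
    merge routes at their endpoints in order of decreasing savings while
    vehicle capacity permits."""
    def d(a, b):
        return math.hypot(coords[a][0] - coords[b][0],
                          coords[a][1] - coords[b][1])

    customers = [i for i in coords if i != depot]
    routes = {i: [i] for i in customers}      # route id -> customer sequence
    where = {i: i for i in customers}         # customer -> its route id
    load = {i: demand[i] for i in customers}  # route id -> total demand

    savings = sorted(((d(depot, i) + d(depot, j) - d(i, j), i, j)
                      for i in customers for j in customers if i < j),
                     reverse=True)
    for _, i, j in savings:
        ri, rj = where[i], where[j]
        if ri == rj or load[ri] + load[rj] > capacity:
            continue
        a, b = routes[ri], routes[rj]
        if a[-1] == i and b[0] == j:          # join ...-i with j-...
            merged = a + b
        elif b[-1] == j and a[0] == i:        # join ...-j with i-...
            merged = b + a
        else:
            continue                          # i or j is interior: skip
        routes[ri], load[ri] = merged, load[ri] + load[rj]
        for c in b:
            where[c] = ri
        del routes[rj], load[rj]
    return list(routes.values())
```

With a depot at the origin, customers at (0, 2), (0, 4), (3, 0), (6, 0), unit demands and vehicle capacity 2, the heuristic merges the two customers on each axis into one route apiece.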
Directory of Open Access Journals (Sweden)
Dorin Sendrescu
2013-01-01
Full Text Available This paper deals with offline parameter identification for a class of wastewater treatment bioprocesses using particle swarm optimization (PSO) techniques. Particle swarm optimization is a relatively new heuristic method that has produced promising results for solving complex optimization problems. In this paper, several variants of the PSO algorithm are used for parameter estimation of an anaerobic wastewater treatment process, a complex biotechnological system. The identification scheme is based on a high-dimensional multimodal numerical optimization problem. The performance of the method is analyzed by numerical simulations.
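A minimal global-best PSO for a generic minimization problem can be sketched as follows; the inertia and acceleration coefficients are common textbook values, not the paper's settings:

```python
import random

def pso(f, dim, bounds, n_particles=20, iters=200,
        w=0.7, c1=1.5, c2=1.5, seed=1):
    """Global-best PSO: each particle is pulled toward its personal best
    P[i] and the swarm's global best G, with inertia weight w."""
    rng = random.Random(seed)
    lo, hi = bounds
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]                  # personal best positions
    pbest = [f(x) for x in X]
    g = min(range(n_particles), key=pbest.__getitem__)
    G, gbest = P[g][:], pbest[g]           # global best
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (P[i][d] - X[i][d])
                           + c2 * r2 * (G[d] - X[i][d]))
                X[i][d] += V[i][d]
            fx = f(X[i])
            if fx < pbest[i]:
                P[i], pbest[i] = X[i][:], fx
                if fx < gbest:
                    G, gbest = X[i][:], fx
    return G, gbest
```

On a 2-D sphere function over [-5, 5], 20 particles and 200 iterations suffice to reach the optimum to high accuracy.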
Heuristics for container loading of furniture
DEFF Research Database (Denmark)
Egeblad, Jens; Garavelli, Claudio; Lisi, Stefano
2010-01-01
In the studied company, the problem arises hundreds of times daily during transport planning. Instances may contain more than one hundred different items with irregular shapes. To solve this complex problem we apply a set of heuristics successively, each of which solves one part of the problem. Large items … are combined in specific structures to ensure proper protection of the items during transportation and to simplify the problem. The solutions generated by the heuristic have an average loading utilization of 91.3% for the most general instances, with average running times around 100 seconds.
Heuristic Drift Elimination for Personnel Tracking Systems
Borenstein, Johann; Ojeda, Lauro
This paper pertains to the reduction of the effects of measurement errors in rate gyros used for tracking, recording, or monitoring the position of persons walking indoors. In such applications, bias drift and other gyro errors can degrade accuracy within minutes. To overcome this problem we developed the Heuristic Drift Elimination (HDE) method, which effectively corrects bias drift and other slow-changing errors. HDE works by making assumptions about walking in structured, indoor environments. The paper explains the heuristic assumptions and the HDE method, and shows experimental results. In typical applications, HDE maintains near-zero heading errors in walks of unlimited duration.
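The core HDE idea — nudging the heading toward the nearest dominant building direction when it is already close — can be sketched as a single update step; the gain and threshold values below are illustrative, not the paper's:

```python
def hde_correct(heading_deg, gain=0.02, threshold_deg=15.0):
    """One HDE-style update: if the estimated heading is close to one of
    the dominant building directions (multiples of 90 degrees), nudge it
    a small step toward that direction, slowly absorbing gyro bias drift
    while leaving genuine turns (large deviations) untouched."""
    nearest = round(heading_deg / 90.0) * 90.0
    error = heading_deg - nearest
    if abs(error) < threshold_deg:
        heading_deg -= gain * error
    return heading_deg % 360.0
```

A heading of 92 degrees is pulled slightly toward 90 (to 91.96), while a heading of 140 degrees, far from any cardinal direction, is left unchanged.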
All-in-one model for designing optimal water distribution pipe networks
Aklog, Dagnachew; Hosoi, Yoshihiko
2017-05-01
This paper discusses the development of an easy-to-use, all-in-one model for designing optimal water distribution networks. The model combines different optimization techniques into a single package in which a user can easily choose what optimizer to use and compare the results of different optimizers to gain confidence in the performances of the models. At present, three optimization techniques are included in the model: linear programming (LP), genetic algorithm (GA) and a heuristic one-by-one reduction method (OBORM) that was previously developed by the authors. The optimizers were tested on a number of benchmark problems and performed very well in terms of finding optimal or near-optimal solutions with a reasonable computation effort. The results indicate that the model effectively addresses the issues of complexity and limited performance trust associated with previous models and can thus be used for practical purposes.
An Optimized and Feasible Preparation Technique for the Industrial Production of Hydrogel Patches.
Li, Wei-Ze; Han, Wen-Xia; Hao, Xu-Liang; Zhao, Ning; Zhai, Xi-Feng; Yang, Li-Bin; He, Shu-Miao; Cheng, Yu-Chuan; Zhang, Han; Fu, Li-Na; Zhang, Yan; Liang, Ze
2017-11-16
For hydrogel patches, laboratory tests cannot fully reveal the problems that arise in full-scale industrial production, and there are few studies of preparation techniques for the industrial manufacturing of hydrogel patches. The purpose of this work was therefore to elucidate the effects of the main technological operations and their parameters on the performance of hydrogel patches in industrial-scale production. The results revealed the following: (1) the aqueous phase was obtained by dissolving polyvinylpyrrolidone (PVP) together with tartaric acid in purified water and then feeding it into a vacuum mixer in one batch, which extended the crosslinking reaction time of the hydrogel paste (matrix), allowed the coating/cutting-off operation to be carried out easily, and avoided permeation of the backing layer; (2) the gel strength of the hydrogel patches increased with increasing working temperature; however, once the temperature exceeded 35 ± 2 °C, the hydrogel paste lost water severely and the resulting physical crosslinking structure, with its lower gel/cohesive strength, easily produced gelatinization/residues during application; (3) the relative humidity (RH) of the standing-workshop was dynamically controlled (namely, at 35 ± 2 °C, keeping the RH at 55 ± 5% for 4 days, then 65 ± 5% for 2 days), which yielded patches with satisfactory characteristics such as better flexibility, higher adhesive force, and a smooth, flat matrix surface, without gelatinization/residues or warped edges during use; (4) the aging of the packaged hydrogel patches was very sensitive to storage temperature: the higher the temperature, the higher the gel strength and the lower the adhesiveness. A storage temperature of 10 ± 2 °C effectively prevented matrix aging and loss of adhesion, which would also allow the expiration date of the patches to be notably extended. In conclusion, this work provides an optimized and feasible preparation
A Heuristic Framework to Solve a General Delivery Problem
Lian, Lian; Castelain, Emmanuel
2010-06-01
This paper presents a new distribution and route planning problem, the General Delivery Problem (GDP), which is more general than the well-known Vehicle Routing Problem. To solve a GDP, a three-phase framework heuristic based on decomposition techniques is introduced. The decomposition techniques are employed to divide the original problem into a set of sub-problems, which reduces the problem size. One such decomposition technique, the Capacity Clustering Algorithm (CCA), is embedded into the framework together with Simulated Annealing (SA) to solve a special GDP. The proposed three-phase framework with these two algorithms is compared with five other decomposition methods on a distribution instance of the Regional Fire and Emergency Center in the north of France.
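The simulated annealing component can be sketched generically; the cooling schedule and all parameters below are illustrative assumptions:

```python
import math, random

def simulated_annealing(cost, neighbor, x0, t0=1.0, alpha=0.995,
                        iters=2000, seed=0):
    """Generic simulated annealing: always accept improving moves,
    accept worsening moves with probability exp(-delta/T), and cool T
    geometrically by factor alpha each iteration."""
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(iters):
        y = neighbor(x, rng)
        fy = cost(y)
        if fy < fx or rng.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= alpha
    return best, fbest
```

For a 1-D toy problem, minimizing (x - 3)² with Gaussian moves from a start at 0 ends near x = 3.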
Abdelkarim, Noha; Mohamed, Amr E; El-Garhy, Ahmed M; Dorrah, Hassen T
2016-01-01
The two-coupled distillation column process is a physically complicated system in many aspects. Specifically, the nested interrelationship between system inputs and outputs constitutes one of the significant challenges in system control design. Mostly, such a process is to be decoupled into several input/output pairings (loops), so that a single controller can be assigned for each loop. In the frame of this research, the Brain Emotional Learning Based Intelligent Controller (BELBIC) forms the control structure for each decoupled loop. The paper's main objective is to develop a parameterization technique for decoupling and control schemes, which ensures robust control behavior. In this regard, the novel optimization technique Bacterial Swarm Optimization (BSO) is utilized for the minimization of summation of the integral time-weighted squared errors (ITSEs) for all control loops. This optimization technique constitutes a hybrid between two techniques, which are the Particle Swarm and Bacterial Foraging algorithms. According to the simulation results, this hybridized technique ensures low mathematical burdens and high decoupling and control accuracy. Moreover, the behavior analysis of the proposed BELBIC shows a remarkable improvement in the time domain behavior and robustness over the conventional PID controller.
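The objective minimized by BSO is the sum of the loops' ITSE values; discretized with a uniform sampling step, one loop's ITSE is a time-weighted Riemann sum (the step size and error sequence below are assumptions):

```python
def itse(errors, dt):
    """Integral of time-weighted squared error, ITSE = integral of
    t * e(t)**2 dt, approximated by a left Riemann sum over uniformly
    sampled error values."""
    return sum((k * dt) * e * e * dt for k, e in enumerate(errors))
```

The total cost over all decoupled loops is then simply the sum of `itse(...)` over each loop's error trace.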
Mohamed, Amr E.; Dorrah, Hassen T.
2016-01-01
PMID:27807444
Fitness levels with tail bounds for the analysis of randomized search heuristics
DEFF Research Database (Denmark)
Witt, Carsten
2014-01-01
The fitness-level method, also called the method of f-based partitions, is an intuitive and widely used technique for the running time analysis of randomized search heuristics. It was originally defined to prove upper and lower bounds on the expected running time. Recently, upper tail bounds were...
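The fitness-level method can be illustrated numerically on the standard textbook application, the (1+1) EA on OneMax (this example is not taken from the chapter itself): partition the search space into levels by the number of ones, lower-bound the probability s_i of leaving level i, and sum the expected waiting times 1/s_i to get an upper bound on the expected running time.

```python
# Minimal sketch of the fitness-level upper bound for the (1+1) EA on
# OneMax (standard textbook application, not taken from the chapter).
# Level i = search points with exactly i ones; the probability of
# leaving level i upward is at least
#   s_i = (n - i) * (1/n) * (1 - 1/n)^(n - 1)
# (flip one of the n-i zero bits, keep the rest), so
#   E[T] <= sum_{i=0}^{n-1} 1/s_i  =  O(n log n).
import math

def fitness_level_bound(n):
    total = 0.0
    for i in range(n):  # levels 0 .. n-1 (level n is the optimum)
        s_i = (n - i) * (1.0 / n) * (1.0 - 1.0 / n) ** (n - 1)
        total += 1.0 / s_i
    return total

for n in (10, 100, 1000):
    print(n, fitness_level_bound(n), math.e * n * math.log(n))
```

Summing the series gives (1 - 1/n)^(-(n-1)) · n · H_n ≤ e · n · H_n, the familiar e·n·ln n + O(n) bound; the tail-bound extensions mentioned above sharpen such expected-time statements into concentration results.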