WorldWideScience

Sample records for search algorithm applied

  1. An Efficient VQ Codebook Search Algorithm Applied to AMR-WB Speech Coding

    Directory of Open Access Journals (Sweden)

    Cheng-Yu Yeh

    2017-04-01

    Full Text Available The adaptive multi-rate wideband (AMR-WB) speech codec is widely used in modern mobile communication systems for high speech quality in handheld devices. Nonetheless, a major disadvantage is that vector quantization (VQ) of immittance spectral frequency (ISF) coefficients takes a considerable computational load in the AMR-WB coding. Accordingly, a binary search space-structured VQ (BSS-VQ) algorithm is adopted to efficiently reduce the complexity of ISF quantization in AMR-WB. This search algorithm is done through a fast locating technique combined with lookup tables, such that an input vector is efficiently assigned to a subspace where relatively few codeword searches are required to be executed. In terms of overall search performance, this work is experimentally validated as a superior search algorithm relative to a multiple triangular inequality elimination (MTIE), a TIE with dynamic and intersection mechanisms (DI-TIE), and an equal-average equal-variance equal-norm nearest neighbor search (EEENNS) approach. With a full search algorithm as a benchmark for overall search load comparison, this work provides an 87% search load reduction at a threshold of quantization accuracy of 0.96, a figure far beyond 55% in the MTIE, 76% in the EEENNS approach, and 83% in the DI-TIE approach.
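
    To make the idea of subspace-restricted codeword search concrete, the sketch below bins codewords offline by a coarse quantization of one component and then searches only the input vector's bin and its neighbours. It is a minimal illustration of the general principle, not the BSS-VQ tables of the paper; the codebook size, dimension and binning rule are assumptions.

        # Minimal sketch of subspace-restricted codebook search (illustrative only;
        # not the exact BSS-VQ tables of the paper). Codewords are bucketed offline
        # by a coarse scalar quantization of their first component; at encode time
        # only the input vector's bucket (plus its neighbours) is searched.
        import numpy as np

        rng = np.random.default_rng(0)
        codebook = rng.standard_normal((256, 16))      # 256 codewords, 16-dim (assumed sizes)
        n_bins = 8
        edges = np.quantile(codebook[:, 0], np.linspace(0, 1, n_bins + 1)[1:-1])

        # offline lookup table: bin index -> codeword indices
        bins = np.digitize(codebook[:, 0], edges)
        table = {b: np.where(bins == b)[0] for b in range(n_bins)}

        def quantize(x):
            b = int(np.digitize(x[0], edges))
            # search the assigned bin and its immediate neighbours only
            cand = np.concatenate([table.get(i, np.empty(0, int))
                                   for i in (b - 1, b, b + 1)])
            d = np.sum((codebook[cand] - x) ** 2, axis=1)
            return cand[np.argmin(d)]

        x = rng.standard_normal(16)
        print("selected codeword index:", quantize(x))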

  2. Improved understanding of the searching behavior of ant colony optimization algorithms applied to the water distribution design problem

    Science.gov (United States)

    Zecchin, A. C.; Simpson, A. R.; Maier, H. R.; Marchi, A.; Nixon, J. B.

    2012-09-01

    Evolutionary algorithms (EAs) have been applied successfully to many water resource problems, such as system design, management decision formulation, and model calibration. The performance of an EA with respect to a particular problem type is dependent on how effectively its internal operators balance the exploitation/exploration trade-off to iteratively find solutions of an increasing quality. For a given problem, different algorithms are observed to produce a variety of different final performances, but there have been surprisingly few investigations into characterizing how the different internal mechanisms alter the algorithm's searching behavior, in both the objective and decision space, to arrive at this final performance. This paper presents metrics for analyzing the searching behavior of ant colony optimization algorithms, a particular type of EA, for the optimal water distribution system design problem, which is a classical NP-hard problem in civil engineering. Using the proposed metrics, behavior is characterized in terms of three different attributes: (1) the effectiveness of the search in improving its solution quality and entering into optimal or near-optimal regions of the search space, (2) the extent to which the algorithm explores as it converges to solutions, and (3) the searching behavior with respect to the feasible and infeasible regions. A range of case studies is considered, where a number of ant colony optimization variants are applied to a selection of water distribution system optimization problems. The results demonstrate the utility of the proposed metrics to give greater insight into how the internal operators affect each algorithm's searching behavior.

  3. A hybrid adaptive large neighborhood search algorithm applied to a lot-sizing problem

    DEFF Research Database (Denmark)

    Muller, Laurent Flindt; Spoorendonk, Simon

    This paper presents a hybrid of a general heuristic framework that has been successfully applied to vehicle routing problems and a general purpose MIP solver. The framework uses local search and an adaptive procedure which chooses between a set of large neighborhoods to be searched. A mixed integer programming solver and its built-in feasibility heuristics is used to search a neighborhood for improving solutions. The general reoptimization approach used for repairing solutions is specifically suited for combinatorial problems where it may be hard to otherwise design operations to define a neighborhood of a solution and to investigate the feasibility of elements in such a neighborhood. The hybrid heuristic framework is applied to the multi-item capacitated lot sizing problem with dynamic lot sizes, where experiments have been conducted on a series of instances from the literature. On average the heuristic...

  4. A tabu search algorithm applied to the staffing roster problem of Leicestershire police force

    OpenAIRE

    Edleston, O. S. S. T.; Bartlett, L. M.

    2012-01-01

    This paper presents an application of the tabu search algorithm to a staff rostering problem relevant to Leicestershire Police. The aim is to address the issue of structuring staff rosters to enable effective use of staff to meet the demand on the Police to reduce and deal with crime-related incidents. This problem is defined through the compilation of a time-varying level of required staff and an associated staff roster. The objective is an optimised work set-up, maximising staff resources a...

  5. Quantum Search Algorithms

    Science.gov (United States)

    Korepin, Vladimir E.; Xu, Ying

    This article reviews recent progress in quantum database search algorithms. The subject is presented in a self-contained and pedagogical way. The problem of searching a large database (a Hilbert space) for a target item is performed by the famous Grover algorithm which locates the target item with high probability and a quadratic speed-up compared with the corresponding classical algorithm. If the database is partitioned into blocks and one is searching for the block containing the target item instead of the target item itself, then the problem is referred to as partial search. Partial search trades accuracy for speed and the most efficient version is the Grover-Radhakrishnan-Korepin (GRK) algorithm. The target block can be further partitioned into sub-blocks so that GRKs can be performed in a sequence called a hierarchy. We study the Grover search and GRK partial search in detail and prove that a GRK hierarchy is less efficient than a direct GRK partial search. Both the Grover search and the GRK partial search can be generalized to the case with several target items (or target blocks for a GRK). The GRK partial search algorithm can also be represented in terms of group theory.
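
    The core of Grover's algorithm (phase inversion of the target followed by inversion about the mean, repeated about (pi/4)*sqrt(N) times) can be simulated classically at the state-vector level. The sketch below is such a simulation for an assumed toy database size; it illustrates the standard Grover search, not the GRK partial search.

        # Classical simulation of Grover's search on an N-item database (state-vector
        # level, no quantum hardware). Toy size assumed; shows the ~(pi/4)*sqrt(N)
        # iteration count behind the quadratic speed-up.
        import numpy as np

        N = 256                                  # database size (2^8 items)
        target = 137                             # marked item (arbitrary choice)
        state = np.full(N, 1 / np.sqrt(N))       # uniform superposition

        iterations = int(round(np.pi / 4 * np.sqrt(N)))
        for _ in range(iterations):
            state[target] *= -1                  # oracle: flip the target's phase
            state = 2 * state.mean() - state     # diffusion: inversion about the mean

        print("iterations:", iterations)
        print("probability of measuring the target:", state[target] ** 2)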

  6. Evolutionary pattern search algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Hart, W.E.

    1995-09-19

    This paper defines a class of evolutionary algorithms called evolutionary pattern search algorithms (EPSAs) and analyzes their convergence properties. This class of algorithms is closely related to evolutionary programming, evolution strategies and real-coded genetic algorithms. EPSAs are self-adapting systems that modify the step size of the mutation operator in response to the success of previous optimization steps. The rule used to adapt the step size can be used to provide a stationary point convergence theory for EPSAs on any continuous function. This convergence theory is based on an extension of the convergence theory for generalized pattern search methods. An experimental analysis of the performance of EPSAs demonstrates that these algorithms can perform a level of global search that is comparable to that of canonical EAs. We also describe a stopping rule for EPSAs, which reliably terminated near stationary points in our experiments. This is the first stopping rule for any class of EAs that can terminate at a given distance from stationary points.
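
    Hart's EPSA itself is not reproduced here, but the underlying idea of adapting the mutation step size to the success of previous steps can be illustrated with a simple (1+1) evolution strategy using a success-based step-size rule; the objective, adaptation rates and budget below are assumptions.

        # Minimal (1+1) evolution strategy with success-based step-size adaptation,
        # shown only to illustrate adapting the mutation step size to the success of
        # previous steps (this is not Hart's EPSA itself).
        import numpy as np

        def sphere(x):
            return float(np.sum(x ** 2))

        rng = np.random.default_rng(1)
        x = rng.uniform(-5, 5, size=10)
        sigma, fx = 1.0, sphere(x)

        for t in range(2000):
            y = x + sigma * rng.standard_normal(x.size)   # mutate
            fy = sphere(y)
            if fy <= fx:                                  # success: accept and enlarge step
                x, fx = y, fy
                sigma *= 1.1
            else:                                         # failure: shrink step
                sigma *= 0.98

        print("best value:", fx, "final step size:", sigma)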

  7. Harmony search algorithms for structural design optimization

    National Research Council Canada - National Science Library

    Geem, Zong Woo

    2009-01-01

    ... of these techniques is harmony search, an algorithm developed from musical improvisation that has been applied to various structural design problems and has demonstrated cost-savings. This book gathers all ...

  8. Performance Simulations of Moving Target Search Algorithms

    Directory of Open Access Journals (Sweden)

    Peter K. K. Loh

    2009-01-01

    Full Text Available The design of appropriate moving target search (MTS) algorithms for computer-generated bots poses serious challenges as they have to satisfy stringent requirements that include computation and execution efficiency. In this paper, we investigate the performance and behaviour of existing moving target search algorithms when applied to search-and-capture gaming scenarios. As part of the investigation, we also introduce a novel algorithm known as abstraction MTS. We conduct performance simulations with a game bot and moving target within randomly generated mazes of increasing sizes and reveal that abstraction MTS exhibits competitive performance even with large problem spaces.

  9. Quantum walks and search algorithms

    CERN Document Server

    Portugal, Renato

    2013-01-01

    This book addresses an interesting area of quantum computation called quantum walks, which play an important role in building quantum algorithms, in particular search algorithms. Quantum walks are the quantum analogue of classical random walks. It is known that quantum computers have great power for searching unsorted databases. This power extends to many kinds of searches, particularly to the problem of finding a specific location in a spatial layout, which can be modeled by a graph. The goal is to find a specific node knowing that the particle uses the edges to jump from one node to the next. This book is self-contained with main topics that include: Grover's algorithm, describing its geometrical interpretation and evolution by means of the spectral decomposition of the evolution operator; analytical solutions of quantum walks on important graphs like lines, cycles, two-dimensional lattices, and hypercubes using Fourier transforms; quantum walks on generic graphs, describing methods to calculate the limiting d...

  10. Efficient protein alignment algorithm for protein search.

    Science.gov (United States)

    Lu, Zaixin; Zhao, Zhiyu; Fu, Bin

    2010-01-18

    Proteins show a great variety of 3D conformations, which can be used to infer their evolutionary relationship and to classify them into more general groups; therefore protein structure alignment algorithms are very helpful for protein biologists. However, an accurate alignment algorithm itself may be insufficient for effective discovery of structural relationships among tens of thousands of proteins. Due to the exponentially increasing amount of protein structural data, a fast and accurate structure alignment tool is necessary for protein classification and protein similarity search; however, the complexity of current alignment algorithms is usually too high to make a fully alignment-based classification and search practical. We have developed an efficient protein pairwise alignment algorithm and applied it to our protein search tool, which aligns a query protein structure in a pairwise manner with all protein structures in the Protein Data Bank (PDB) to output similar protein structures. The algorithm can align hundreds of pairs of protein structures in one second. Given a protein structure, the tool efficiently discovers similar structures from tens of thousands of structures stored in the PDB, always within 2 minutes on a single machine and 20 seconds on our cluster of 6 machines. The algorithm has been fully implemented and is accessible online at our webserver, which is supported by a cluster of computers. Our algorithm can work out hundreds of pairs of protein alignments in one second; therefore, it is very suitable for protein search. Our experimental results show that it is more accurate than other well-known protein search systems in finding proteins which are structurally similar at the SCOP family and superfamily levels, and its speed is also competitive with those systems. In terms of pairwise alignment performance, it is as good as some well-known alignment algorithms.

  11. A review on quantum search algorithms

    Science.gov (United States)

    Giri, Pulak Ranjan; Korepin, Vladimir E.

    2017-12-01

    The use of superposition of states in quantum computation, known as quantum parallelism, has significant advantage in terms of speed over the classical computation. It is evident from the early invented quantum algorithms such as Deutsch's algorithm, the Deutsch-Jozsa algorithm and its variation as the Bernstein-Vazirani algorithm, Simon's algorithm, Shor's algorithms, etc. Quantum parallelism also significantly speeds up the database search algorithm, which is important in computer science because it comes as a subroutine in many important algorithms. Quantum database search of Grover achieves the task of finding the target element in an unsorted database in a time quadratically faster than the classical computer. We review Grover's quantum search algorithms for single and multiple target elements in a database. The partial search algorithm of Grover and Radhakrishnan and its optimization by Korepin, called the GRK algorithm, are also discussed.

  12. Modified Parameters of Harmony Search Algorithm for Better Searching

    Science.gov (United States)

    Farraliza Mansor, Nur; Abal Abas, Zuraida; Samad Shibghatullah, Abdul; Rahman, Ahmad Fadzli Nizam Abdul

    2017-08-01

    The scheduling and rostering problems are treated as integrated because they depend on each other: the input to the rostering problem is a scheduling problem. In this research, the integrated scheduling and rostering bus driver problem is defined as maximising the balance of the assignment of tasks in terms of the distribution of shifts and routes. It is essential to achieve fairness among drivers because this increases driver satisfaction. The latest approaches are still unable to address the fairness problem that has emerged, thus this research proposes an amended harmony search algorithm to address the fairness issue and raise the level of fairness. The harmony search algorithm is classified as a meta-heuristic algorithm that is capable of solving hard and combinatorial or discrete optimisation problems. In this respect, the three main operators in HS, namely the Harmony Memory Consideration Rate (HMCR), Pitch Adjustment Rate (PAR) and Bandwidth (BW), play a vital role in balancing local exploitation and global exploration. These parameters influence the overall performance of the HS algorithm, and therefore it is crucial to fine-tune them. The contributions of this research are an HMCR parameter based on a step function and a BW parameter based on the fret spacing concept of guitars, which is associated with mathematical formulae. A constant step function model is introduced in the alteration of the HMCR parameter. The experimental results revealed that our proposed approach is superior to the parameter adaptive harmony search algorithm. In conclusion, this proposed approach managed to generate a fairer roster and was thus capable of maximising the balanced distribution of shifts and routes among drivers, which contributed to the lowering of illness, incidents, absenteeism and accidents.
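
    For readers unfamiliar with the three operators, the sketch below is a generic continuous-domain harmony search showing where HMCR, PAR and BW act. The step-function HMCR and fret-spacing BW proposed in the paper are not reproduced; all parameter values and the toy objective are assumptions.

        # Minimal continuous harmony search showing where HMCR, PAR and BW act
        # (generic textbook form; the paper's step-function HMCR and fret-spacing
        # BW schedules are not reproduced here).
        import numpy as np

        def objective(x):                      # assumed toy objective
            return float(np.sum(x ** 2))

        rng = np.random.default_rng(2)
        dim, hms, iters = 5, 20, 2000
        hmcr, par, bw = 0.9, 0.3, 0.05
        lo, hi = -5.0, 5.0

        memory = rng.uniform(lo, hi, size=(hms, dim))        # harmony memory
        fitness = np.array([objective(h) for h in memory])

        for _ in range(iters):
            new = np.empty(dim)
            for j in range(dim):
                if rng.random() < hmcr:                       # memory consideration
                    new[j] = memory[rng.integers(hms), j]
                    if rng.random() < par:                    # pitch adjustment
                        new[j] += bw * rng.uniform(-1, 1)
                else:                                         # random selection
                    new[j] = rng.uniform(lo, hi)
            new = np.clip(new, lo, hi)
            f = objective(new)
            worst = int(np.argmax(fitness))
            if f < fitness[worst]:                            # replace worst harmony
                memory[worst], fitness[worst] = new, f

        print("best harmony value:", fitness.min())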

  13. Hybridizing Evolutionary Algorithms with Opportunistic Local Search

    DEFF Research Database (Denmark)

    Gießen, Christian

    2013-01-01

    There is empirical evidence that memetic algorithms (MAs) can outperform plain evolutionary algorithms (EAs). Recently the first runtime analyses have been presented proving the aforementioned conjecture rigorously by investigating Variable-Depth Search, VDS for short (Sudholt, 2008). Sudholt...

  14. An Improved Harmony Search Algorithm for Power Distribution Network Planning

    Directory of Open Access Journals (Sweden)

    Wei Sun

    2015-01-01

    Full Text Available Because it involves many variables and constraints, distribution network planning is a multiobjective, discrete, nonlinear, and large-scale optimization problem. The harmony search (HS) algorithm is a metaheuristic algorithm inspired by the improvisation process of music players. The HS algorithm has several impressive advantages, such as easy implementation, few adjustable parameters, and quick convergence. But the HS algorithm still has some defects, such as premature convergence and slow convergence speed. According to the defects of the standard algorithm and the characteristics of distribution network planning, an improved harmony search (IHS) algorithm is proposed in this paper. We set up a mathematical model of distribution network structure planning, whose objective function is to minimize the annual cost and whose constraint conditions are overload and a radial network. The IHS algorithm is applied to solve the complex optimization mathematical model. The empirical results strongly indicate that the IHS algorithm can effectively provide better results for solving the distribution network planning problem compared to other optimization algorithms.

  15. Utilizing Centralized Diamond Architecture for Searching Algorithms

    Science.gov (United States)

    Damrudi, Masumeh; Jadidy Aval, Kamal

    2017-09-01

    Searching for data elements and information in a fraction of the time is still an important task in many areas of work. The importance of finding the best answer leads to the use of different algorithms, which consume time. To improve the speed of searching for information within mass information, employing parallel processing is inevitable. A constant-time search algorithm (CS) is proposed on the centralized diamond architecture. We have optimized this algorithm to find the location and the number of occurrences of the data, if any exist.

  16. Search algorithms, hidden labour and information control

    Directory of Open Access Journals (Sweden)

    Paško Bilić

    2016-06-01

    Full Text Available The paper examines some of the processes of the closely knit relationship between Google’s ideologies of neutrality and objectivity and global market dominance. Neutrality construction comprises an important element sustaining the company’s economic position and is reflected in constant updates, estimates and changes to utility and relevance of search results. Providing a purely technical solution to these issues proves to be increasingly difficult without a human hand in steering algorithmic solutions. Search relevance fluctuates and shifts through continuous tinkering and tweaking of the search algorithm. The company also uses third parties to hire human raters for performing quality assessments of algorithmic updates and adaptations in linguistically and culturally diverse global markets. The adaptation process contradicts the technical foundations of the company and calculations based on the initial PageRank algorithm. Annual market reports, Google’s Search Quality Rating Guidelines, and reports from media specialising in the search engine optimisation business are analysed. The Search Quality Rating Guidelines document provides a rare glimpse into the internal architecture of search algorithms and the notions of utility and relevance, which are presented and structured as neutral and objective. Intertwined layers of ideology, hidden labour of human raters, advertising revenues, market dominance and control are discussed throughout the paper.

  17. Search Parameter Optimization for Discrete, Bayesian, and Continuous Search Algorithms

    Science.gov (United States)

    2017-09-01

    Subject terms: Search Theory, Undersea Warfare, South China Sea, Anti-Submarine Warfare. Report dated 09-22-2017, 253 pages.

  18. 6. Algorithms for Sorting and Searching

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 2, Issue 3. Series article: Algorithms – Algorithms for Sorting and Searching, by R K Shyamasundar, Computer Science Group, Tata Institute of Fundamental Research, Homi Bhabha Road, Mumbai 400 005, India.

  19. Economic load dispatch using improved gravitational search algorithm

    Science.gov (United States)

    Huang, Yu; Wang, Jia-rong; Guo, Feng

    2016-03-01

    This paper presents an improved gravitational search algorithm (IGSA) to solve the economic load dispatch (ELD) problem. In order to avoid the local optimum phenomenon, mutation processing is applied to the GSA. The IGSA is applied to solve an economic load dispatch problem with valve-point effects, which has 13 generators and a load demand of 2520 MW. Calculation results show that the algorithm in this paper can deal with ELD problems with high stability.

  20. Learning Search Algorithms: An Educational View

    Directory of Open Access Journals (Sweden)

    Ales Janota

    2014-12-01

    Full Text Available Artificial intelligence methods find their practical usage in many applications, including the maritime industry. The paper concentrates on the methods of uninformed and informed search, potentially usable in solving complex problems based on the state space representation. The problem of introducing search algorithms to newcomers has both technical and psychological dimensions. The authors show how it is possible to cope with both of them through the design and use of specialized authoring systems. A typical example of searching a path through a maze is used to demonstrate how to test, observe and compare properties of various search strategies. Performance of the search methods is evaluated based on common criteria.
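
    As an illustration of the kind of uninformed search such teaching tools animate, the sketch below runs a breadth-first search through a small maze and reconstructs the path; the maze layout is an assumption.

        # Breadth-first (uninformed) search for a path through a small maze, the kind
        # of state-space example used for teaching; the maze layout is assumed.
        from collections import deque

        maze = ["S.#.",
                ".##.",
                "...#",
                "#.G."]
        rows, cols = len(maze), len(maze[0])
        start = next((r, c) for r in range(rows) for c in range(cols) if maze[r][c] == "S")
        goal = next((r, c) for r in range(rows) for c in range(cols) if maze[r][c] == "G")

        frontier = deque([start])
        parent = {start: None}
        while frontier:
            r, c = frontier.popleft()
            if (r, c) == goal:
                break
            for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
                if 0 <= nr < rows and 0 <= nc < cols and maze[nr][nc] != "#" \
                        and (nr, nc) not in parent:
                    parent[(nr, nc)] = (r, c)      # remember where we came from
                    frontier.append((nr, nc))

        path, node = [], goal
        while node is not None:                    # reconstruct path goal -> start
            path.append(node)
            node = parent[node]
        print("path:", list(reversed(path)))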

  1. Gradient gravitational search: An efficient metaheuristic algorithm for global optimization.

    Science.gov (United States)

    Dash, Tirtharaj; Sahu, Prabhat K

    2015-05-30

    The adaptation of novel techniques developed in the field of computational chemistry to solve the concerned problems for large and flexible molecules is taking center stage with regard to efficient algorithms, computational cost and accuracy. In this article, the gradient-based gravitational search (GGS) algorithm, using analytical gradients for a fast minimization to the next local minimum, has been reported. Its efficiency as a metaheuristic approach has also been compared with Gradient Tabu Search and others such as Gravitational Search, Cuckoo Search, and Back Tracking Search algorithms for global optimization. Moreover, the GGS approach has also been applied to computational chemistry problems for finding the minimal-value potential energy of two-dimensional and three-dimensional off-lattice protein models. The simulation results reveal the relative stability and physical accuracy of the protein models with efficient computational cost. © 2015 Wiley Periodicals, Inc.

  2. Graph Extremities Defined by Search Algorithms

    Directory of Open Access Journals (Sweden)

    Jean-Paul Bordat

    2010-03-01

    Full Text Available Graph search algorithms have exploited graph extremities, such as the leaves of a tree and the simplicial vertices of a chordal graph. Recently, several well-known graph search algorithms have been collectively expressed as two generic algorithms called MLS and MLSM. In this paper, we investigate the properties of the vertex that is numbered 1 by MLS on a chordal graph and by MLSM on an arbitrary graph. We explain how this vertex is an extremity of the graph. Moreover, we show the remarkable property that the minimal separators included in the neighborhood of this vertex are totally ordered by inclusion.

  3. Genetic algorithms as global random search methods

    Science.gov (United States)

    Peck, Charles C.; Dhawan, Atam P.

    1995-01-01

    Genetic algorithm behavior is described in terms of the construction and evolution of the sampling distributions over the space of candidate solutions. This novel perspective is motivated by analysis indicating that the schema theory is inadequate for completely and properly explaining genetic algorithm behavior. Based on the proposed theory, it is argued that the similarities of candidate solutions should be exploited directly, rather than encoding candidate solutions and then exploiting their similarities. Proportional selection is characterized as a global search operator, and recombination is characterized as the search process that exploits similarities. Sequential algorithms and many deletion methods are also analyzed. It is shown that by properly constraining the search breadth of recombination operators, convergence of genetic algorithms to a global optimum can be ensured.

  4. Optimization of machining processes using pattern search algorithm

    Directory of Open Access Journals (Sweden)

    Miloš Madić

    2014-04-01

    Full Text Available Optimization of machining processes not only increases machining efficiency and economics, but also the end product quality. In recent years, among the traditional optimization methods, stochastic direct search optimization methods such as meta-heuristic algorithms are being increasingly applied to solving machining optimization problems. Their ability to deal with complex, multi-dimensional and ill-behaved optimization problems has made them the preferred optimization tool of most researchers and practitioners. This paper introduces the use of the pattern search (PS) algorithm, as a deterministic direct search optimization method, for solving machining optimization problems. To analyze the applicability and performance of the PS algorithm, six case studies of machining optimization problems, both single and multi-objective, were considered. The PS algorithm was employed to determine optimal combinations of machining parameters for different machining processes such as abrasive waterjet machining, turning, turn-milling, drilling, electrical discharge machining and wire electrical discharge machining. In each case study the optimization solutions obtained by the PS algorithm were compared with the optimization solutions that had been determined by past researchers using meta-heuristic algorithms. Analysis of the obtained optimization results indicates that the PS algorithm is very applicable for solving machining optimization problems, showing good competitive potential against stochastic direct search methods such as meta-heuristic algorithms. Specific features and merits of the PS algorithm are also discussed.
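
    The deterministic poll-and-shrink logic of a compass-type pattern search can be written in a few lines; the sketch below uses an assumed two-variable stand-in objective rather than any of the paper's machining cost models.

        # Minimal compass-type pattern search on an assumed toy two-variable cost
        # function; illustrates the deterministic poll-and-shrink logic, not the
        # paper's machining cost models.
        def cost(v):                                     # assumed stand-in objective
            x, y = v
            return (x - 1.5) ** 2 + 3 * (y + 0.5) ** 2

        x = [0.0, 0.0]                                   # start point
        step = 1.0
        fx = cost(x)
        while step > 1e-6:
            improved = False
            for i in range(len(x)):                      # poll the coordinate directions
                for delta in (step, -step):
                    trial = list(x)
                    trial[i] += delta
                    ft = cost(trial)
                    if ft < fx:                          # accept first improving move
                        x, fx, improved = trial, ft, True
                        break
                if improved:
                    break
            if not improved:                             # no direction improved: shrink step
                step *= 0.5

        print("optimum found near:", x, "cost:", fx)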

  5. A Novel Self-Adaptive Harmony Search Algorithm

    Directory of Open Access Journals (Sweden)

    Kaiping Luo

    2013-01-01

    Full Text Available The harmony search algorithm is a music-inspired optimization technology and has been successfully applied to diverse scientific and engineering problems. However, like other metaheuristic algorithms, it still faces two difficulties: parameter setting and finding the optimal balance between diversity and intensity in searching. This paper proposes a novel, self-adaptive search mechanism for optimization problems with continuous variables. This new variant can automatically configure the evolutionary parameters in accordance with problem characteristics, such as the scale and the boundaries, and dynamically select evolutionary strategies in accordance with its search performance. The new variant simplifies the parameter setting and efficiently solves all types of optimization problems with continuous variables. Statistical test results show that this variant is considerably robust and outperforms the original harmony search (HS), improved harmony search (IHS), and other self-adaptive variants for large-scale optimization problems and constrained problems.

  6. Smoothed Analysis of Local Search Algorithms

    NARCIS (Netherlands)

    Manthey, Bodo; Dehne, Frank; Sack, Jörg-Rüdiger; Stege, Ulrike

    2015-01-01

    Smoothed analysis is a method for analyzing the performance of algorithms for which classical worst-case analysis fails to explain the performance observed in practice. Smoothed analysis has been applied to explain the performance of a variety of algorithms in recent years. One particular class of

  7. A Cooperative Harmony Search Algorithm for Function Optimization

    Directory of Open Access Journals (Sweden)

    Gang Li

    2014-01-01

    Full Text Available Harmony search algorithm (HS) is a new metaheuristic algorithm which is inspired by a process involving musical improvisation. HS is a stochastic optimization technique that is similar to genetic algorithms (GAs) and particle swarm optimizers (PSOs). It has been widely applied in order to solve many complex optimization problems, including continuous and discrete problems, such as structure design and function optimization. A cooperative harmony search algorithm (CHS) is developed in this paper, with cooperative behavior being employed as a significant improvement to the performance of the original algorithm. Standard HS just uses one harmony memory and all the variables of the object function are improvised within the harmony memory, while the proposed algorithm CHS uses multiple harmony memories, so that each harmony memory can optimize different components of the solution vector. The CHS was then applied to function optimization problems. The results of the experiment show that CHS is capable of finding better solutions when compared to HS and a number of other algorithms, especially in high-dimensional problems.

  8. Transitionless driving on adiabatic search algorithm

    Science.gov (United States)

    Oh, Sangchul; Kais, Sabre

    2014-12-01

    We study quantum dynamics of the adiabatic search algorithm with the equivalent two-level system. Its adiabatic and non-adiabatic evolution is studied and visualized as trajectories of Bloch vectors on a Bloch sphere. We find the change in the non-adiabatic transition probability from exponential decay for the short running time to inverse-square decay in asymptotic running time. The scaling of the critical running time is expressed in terms of the Lambert W function. We derive the transitionless driving Hamiltonian for the adiabatic search algorithm, which makes a quantum state follow the adiabatic path. We demonstrate that a uniform transitionless driving Hamiltonian, approximate to the exact time-dependent driving Hamiltonian, can alter the non-adiabatic transition probability from the inverse square decay to the inverse fourth power decay with the running time. This may open up a new but simple way of speeding up adiabatic quantum dynamics.

  9. Implementing Quantum Search Algorithm with Metamaterials.

    Science.gov (United States)

    Zhang, Weixuan; Cheng, Kaiyang; Wu, Chao; Wang, Yi; Li, Hongqiang; Zhang, Xiangdong

    2018-01-01

    Metamaterials, artificially structured electromagnetic (EM) materials, have enabled the realization of many unconventional EM properties not found in nature, such as negative refractive index, magnetic response, invisibility cloaking, and so on. Based on these man-made materials with novel EM properties, various devices have been designed and realized. However, quantum analog devices based on metamaterials have not been achieved so far. Here, metamaterials are designed and printed to perform the quantum search algorithm. The structures, comprising an array of 2D subwavelength air holes with different radii perforated in the dielectric layer, are fabricated using a 3D-printing technique. When an incident wave enters the designed metamaterials, the profile of the beam wavefront is processed iteratively as it propagates through the metamaterial periodically. After ≈√N roundtrips, precisely the same as the efficiency of the quantum search algorithm, the searched items will be found, with the incident wave all focusing on the marked positions. Such a metamaterial-based quantum searching simulator may lead to remarkable achievements in wave-based signal processors. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Consultant-Guided Search Algorithms for the Quadratic Assignment Problem

    Science.gov (United States)

    Iordache, Serban

    Consultant-Guided Search (CGS) is a recent swarm intelligence metaheuristic for combinatorial optimization problems, inspired by the way real people make decisions based on advice received from consultants. Until now, CGS has been successfully applied to the Traveling Salesman Problem. Because a good metaheuristic should be able to tackle efficiently a large variety of problems, it is important to see how CGS behaves when applied to other classes of problems. In this paper, we propose an algorithm for the Quadratic Assignment Problem (QAP), which hybridizes CGS with a local search procedure. Our experimental results show that CGS is able to compete in terms of solution quality with one of the best Ant Colony Optimization algorithms, the MAX-MIN Ant System.

  11. A hybrid approach using chaotic dynamics and global search algorithms for combinatorial optimization problems

    Science.gov (United States)

    Igeta, Hideki; Hasegawa, Mikio

    Chaotic dynamics have been effectively applied to improve various heuristic algorithms for combinatorial optimization problems in many studies. Currently, the most used chaotic optimization scheme is to drive heuristic solution search algorithms applicable to large-scale problems by chaotic neurodynamics including the tabu effect of the tabu search. Alternatively, meta-heuristic algorithms are used for combinatorial optimization by combining a neighboring solution search algorithm, such as tabu, gradient, or other search method, with a global search algorithm, such as genetic algorithms (GA), ant colony optimization (ACO), or others. In these hybrid approaches, the ACO has effectively optimized the solution of many benchmark problems in the quadratic assignment problem library. In this paper, we propose a novel hybrid method that combines the effective chaotic search algorithm that has better performance than the tabu search and global search algorithms such as ACO and GA. Our results show that the proposed chaotic hybrid algorithm has better performance than the conventional chaotic search and conventional hybrid algorithms. In addition, we show that chaotic search algorithm combined with ACO has better performance than when combined with GA.

  12. Harmony Search Algorithm for the Container Storage Problem

    OpenAIRE

    Ayachi, Imen; Kammarti, Ryan; Borne, Pierre; Ksouri, Mekki

    2010-01-01

    Recently a new metaheuristic called harmony search was developed. It mimics the behavior of musicians improvising to find a better state of harmony. In this paper, this algorithm is described and applied to solve the container storage problem in the harbor. The objective of this problem is to determine a valid container arrangement which meets customers' delivery deadlines, reduces the number of container rehandlings and minimizes the ship idle time. In this paper, an adaptation of the harm...

  13. A parallel algorithm for random searches

    Science.gov (United States)

    Wosniack, M. E.; Raposo, E. P.; Viswanathan, G. M.; da Luz, M. G. E.

    2015-11-01

    We discuss a parallelization procedure for a two-dimensional random search by a single individual, a typical sequential process. To assure the same features as the sequential random search in the parallel version, we analyze the former's spatial patterns of the encountered targets for different search strategies and densities of homogeneously distributed targets. We identify a lognormal tendency for the distribution of distances between consecutively detected targets. Then, by assigning the distinct mean and standard deviation of this distribution for each corresponding configuration in the parallel simulations (constituted by parallel random walkers), we are able to recover important statistical properties, e.g., the target detection efficiency, of the original problem. The proposed parallel approach presents a speedup of nearly one order of magnitude compared with the sequential implementation. This algorithm can be easily adapted to different instances, such as searches in three dimensions. Its possible range of applicability covers problems in areas as diverse as automated computer searches in high-capacity databases and animal foraging.

  14. An ensemble symbiosis organisms search algorithm and its application to real world problems

    Directory of Open Access Journals (Sweden)

    Sukanta Nama

    2017-07-01

    Full Text Available In this study, an ensemble algorithm called the Quasi-Oppositional Symbiosis Organisms Search (QOSOS) algorithm has been proposed, by incorporating the quasi-oppositional based learning (QOBL) strategy into the newly proposed Symbiosis Organisms Search (SOS) algorithm for solving unconstrained global optimization problems. The QOBL is incorporated into the basic SOS algorithm to balance the exploration capability of QOBL and the exploitation potential of the SOS algorithm. To validate the efficiency and robustness of the proposed QOSOS algorithm, it is applied to solve unconstrained global optimization problems. Also, the proposed QOSOS algorithm is applied to solve two real-world global optimization problems: a gas transmission compressor design optimization problem and an optimal capacity of gas production facilities optimization problem. The performance of the QOSOS algorithm is extensively evaluated and compares favorably with many progressive algorithms.
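
    The quasi-oppositional step itself is simple to state: for a candidate x in [a, b], the opposite point is a + b - x and the quasi-opposite point is drawn uniformly between the interval centre and that opposite point. The sketch below applies this to an assumed random population; the bounds and sizes are illustrative only.

        # Sketch of the quasi-oppositional based learning (QOBL) step: for each
        # candidate x in [a, b], the opposite point is a + b - x and the quasi-opposite
        # point is drawn uniformly between the interval centre and that opposite point.
        # Bounds and population size here are assumptions for illustration.
        import numpy as np

        rng = np.random.default_rng(3)
        a, b = -10.0, 10.0
        pop = rng.uniform(a, b, size=(6, 4))             # assumed population of candidates

        centre = (a + b) / 2.0
        opposite = a + b - pop
        low = np.minimum(centre, opposite)
        high = np.maximum(centre, opposite)
        quasi_opposite = rng.uniform(low, high)          # elementwise uniform draw

        print(quasi_opposite.round(3))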

  15. Development of Navigation Control Algorithm for AGV Using D* search Algorithm

    Directory of Open Access Journals (Sweden)

    Jeong Geun Kim

    2013-06-01

    Full Text Available In this paper, we present a navigation control algorithm for Automatic Guided Vehicles (AGV) that move in industrial environments including static and moving obstacles, using the D* algorithm. This algorithm is able to plan paths efficiently in unknown, partially known and changing environments. To apply the D* search algorithm, a grid map representing the known environment is generated. Using the laser scanner LMS-151 and the laser navigation sensor NAV-200, the grid map is updated according to changes in the environment and obstacles. When the AGV finds new map information, such as new unknown obstacles, it adds the information to its map and re-plans a new shortest path from its current coordinates to the given goal coordinates. It repeats the process until it reaches the goal coordinates. This algorithm is verified through simulation and experiment. The simulation and experimental results show that the algorithm can be used to move the AGV successfully to the goal position while avoiding unknown moving and static obstacles. [Keywords — navigation control algorithm; Automatic Guided Vehicles (AGV); D* search algorithm]

  16. An improved genetic algorithm for searching for pollution sources

    Directory of Open Access Journals (Sweden)

    Quan-min BU

    2013-10-01

    Full Text Available As an optimization method that has experienced rapid development over the past 20 years, the genetic algorithm has been successfully applied in many fields, but it requires repeated searches based on the characteristics of high-speed computer calculation and conditions of the known relationship between the objective function and independent variables. There are several hundred generations of evolvement, but the functional relationship is unknown in pollution source searches. Therefore, the genetic algorithm cannot be used directly. Certain improvements need to be made based on the actual situation, so that the genetic algorithm can adapt to the actual conditions of environmental problems, and can be used in environmental monitoring and environmental quality assessment. Therefore, a series of methods are proposed for the improvement of the genetic algorithm: (1) the initial generation of individual groups should be artificially set and move from lightly polluted areas to heavily polluted areas; (2) intervention measures should be introduced in the competition between individuals; (3) guide individuals should be added; and (4) specific improvement programs should be put forward. Finally, the scientific rigor and rationality of the improved genetic algorithm are proven through an example.

  17. Wolf Search Algorithm for Solving Optimal Reactive Power Dispatch Problem

    Directory of Open Access Journals (Sweden)

    Kanagasabai Lenin

    2015-03-01

    Full Text Available This paper presents a new bio-inspired heuristic optimization algorithm called the Wolf Search Algorithm (WSA) for solving the multi-objective reactive power dispatch problem. The Wolf Search Algorithm is a new bio-inspired heuristic algorithm based on wolf preying behaviour. The way wolves search for food and survive by avoiding their enemies has been imitated to formulate the algorithm for solving the reactive power dispatch problem. The speciality of the wolf is that it possesses both individual local searching ability and autonomous flocking movement, and this special property has been utilized to formulate the search algorithm. The proposed WSA has been tested on the standard IEEE 30-bus test system, and the simulation results clearly show the good performance of the proposed algorithm.

  18. Improved Gravitational Search Algorithm (GSA Using Fuzzy Logic

    Directory of Open Access Journals (Sweden)

    Omid Mokhlesi

    2013-04-01

    Full Text Available Researchers' tendency to use different collective intelligence methods to optimize complex engineering problems has increased because of the high performance of these algorithms. The gravitational search algorithm (GSA) is among these algorithms. This algorithm is inspired by Newton's laws of physics and gravitational attraction. Random masses are agents that search the space. This paper presents a new fuzzy population GSA model called FPGSA. The proposed method is a combination of a parametric fuzzy controller and the gravitational search algorithm. The space is searched using this combined, reasonable and accurate method. In collective intelligence algorithms, population size influences the final answer: with a large population a better response is obtained, but the algorithm execution time is longer. To overcome this problem, a new parameter called the dispersion coefficient is added to the algorithm. Implementation results show that by controlling this factor, system performance can be improved.

  19. Ant colony search algorithm for optimal reactive power optimization

    Directory of Open Access Journals (Sweden)

    Lenin K.

    2006-01-01

    Full Text Available The paper presents an Ant Colony Search Algorithm (ACSA) for optimal reactive power optimization and voltage control of power systems. ACSA is a new co-operative agents' approach, which is inspired by the observation of the behavior of real ant colonies on the topic of ant trail formation and foraging methods. Hence, in the ACSA a set of co-operative agents called "Ants" co-operates to find a good solution for the reactive power optimization problem. The ACSA applied to optimal reactive power optimization is evaluated on standard IEEE 30-, 57-, and 191-bus (practical) test systems. The proposed approach is tested and compared to the genetic algorithm (GA) and the Adaptive Genetic Algorithm (AGA).

  20. Direct search algorithms for optimization calculations

    Science.gov (United States)

    Powell, M. J. D.

    Many different procedures have been proposed for optimization calculations when first derivatives are not available. Further, several researchers have contributed to the subject, including some who wish to prove convergence theorems, and some who wish to make any reduction in the least calculated value of the objective function. There is not even a key idea that can be used as a foundation of a review, except for the problem itself, which is the adjustment of variables so that a function becomes least, where each value of the function is returned by a subroutine for each trial vector of variables. Therefore the paper is a collection of essays on particular strategies and algorithms, in order to consider the advantages, limitations and theory of several techniques. The subjects addressed are line search methods, the restriction of vectors of variables to discrete grids, the use of geometric simplices, conjugate direction procedures, trust region algorithms that form linear or quadratic approximations to the objective function, and simulated annealing. We study the main features of the methods themselves, instead of providing a catalogue of references to published work, because an understanding of these features may be very helpful to future research.

  1. A hybrid search algorithm for swarm robots searching in an unknown environment.

    Science.gov (United States)

    Li, Shoutao; Li, Lina; Lee, Gordon; Zhang, Hao

    2014-01-01

    This paper proposes a novel method to improve the efficiency of a swarm of robots searching in an unknown environment. The approach focuses on the process of feeding and individual coordination characteristics inspired by the foraging behavior in nature. A predatory strategy was used for searching; hence, this hybrid approach integrated a random search technique with a dynamic particle swarm optimization (DPSO) search algorithm. If a search robot could not find any target information, it used a random search algorithm for a global search. If the robot found any target information in a region, the DPSO search algorithm was used for a local search. This particle swarm optimization search algorithm is dynamic as all the parameters in the algorithm are refreshed synchronously through a communication mechanism until the robots find the target position, after which, the robots fall back to a random searching mode. Thus, in this searching strategy, the robots alternated between two searching algorithms until the whole area was covered. During the searching process, the robots used a local communication mechanism to share map information and DPSO parameters to reduce the communication burden and overcome hardware limitations. If the search area is very large, search efficiency may be greatly reduced if only one robot searches an entire region given the limited resources available and time constraints. In this research we divided the entire search area into several subregions, selected a target utility function to determine which subregion should be initially searched and thereby reduced the residence time of the target to improve search efficiency.
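
    The two-mode logic described above can be sketched as follows: each robot moves randomly until it senses target information, then switches to a PSO-style update toward the best sensed positions. The sensing radius, the coefficients, and the use of distance to the target as a stand-in for sensed signal strength are all assumptions, not the authors' implementation.

        # Sketch of the two-mode searching logic: random walk until target
        # information is sensed, then a PSO-style update toward the best sensed
        # positions. All parameters are assumptions for illustration.
        import numpy as np

        rng = np.random.default_rng(4)
        target = np.array([8.0, 6.0])                  # unknown to the robots in reality
        pos = rng.uniform(0, 10, size=(5, 2))          # 5 robots in a 10 x 10 area
        vel = np.zeros_like(pos)
        best = pos.copy()                              # personal best (closest sensed) positions

        def senses_target(p, radius=3.0):              # assumed sensing radius
            return np.linalg.norm(p - target) < radius

        for step in range(200):
            for i in range(len(pos)):
                if senses_target(pos[i]):              # DPSO mode: local search
                    g = best[np.argmin(np.linalg.norm(best - target, axis=1))]
                    vel[i] = (0.6 * vel[i]
                              + 1.5 * rng.random() * (best[i] - pos[i])
                              + 1.5 * rng.random() * (g - pos[i]))
                else:                                  # random-walk mode: global search
                    vel[i] = rng.uniform(-0.5, 0.5, size=2)
                pos[i] = np.clip(pos[i] + vel[i], 0, 10)
                # distance to the target stands in for sensed signal strength (fitness)
                if np.linalg.norm(pos[i] - target) < np.linalg.norm(best[i] - target):
                    best[i] = pos[i].copy()

        print("closest robot distance to target:",
              float(np.min(np.linalg.norm(pos - target, axis=1))))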

  2. Could Grover's quantum algorithm help in searching an actual database?

    OpenAIRE

    Zalka, Christof

    1999-01-01

    I investigate whether it would technologically and economically make sense to build database search engines based on Grover's quantum search algorithm. The answer is not fully conclusive but in my judgement rather negative.

  3. Mass optimization of engineering structures applying genetic algorithms

    Directory of Open Access Journals (Sweden)

    Valentina Gerfolveden

    2015-07-01

    Full Text Available The paper proposes a technology for the mass optimization of a two-dimensional body applying genetic algorithms. Main attention is focused on the geometry of the 2D body, i.e. the search for optimal coordinates of body points. Direct analysis of the 2D body – von Mises stress determination – is performed using an original program based on the finite element method. The set of design parameters contains the coordinates of body points in 2D space. The results of numerical experiments proved the proposed technology to be an efficient tool for the solution of the 2D body mass optimization problem. DOI: 10.15181/csat.v2i2.892

  4. GMDH algorithms applied to turbidity forecasting

    Science.gov (United States)

    Tsai, Tsung-Min; Yen, Pei-Hwa

    2017-06-01

    By applying the group method of data handling algorithm to self-organization networks, we design a turbidity prediction model based on simple input/output observations of daily hydrological data (rainfall, discharge, and turbidity). The data are from a field test site at the Chiahsien Weir and its upper stream in Taiwan, and were recorded from May 2000 to December 2008. The model has a regressive mode that can assess the estimated error, i.e., whether a threshold has been exceeded, and can be adjusted by updating the field input data. Consequently, the model can achieve accurate estimations over long-term periods. Test results demonstrate that the 2006 turbidity prediction model was selected as the best predictive model (RMSE = 5.787 and CC = 0.975) because of its ability to predict turbidity within the acceptable error range and 90 % required confidence interval (50NTU). 70(3,1,1) is the optimum modeling data length and variable combinations.

  5. Improved Degree Search Algorithms in Unstructured P2P Networks

    Directory of Open Access Journals (Sweden)

    Guole Liu

    2012-01-01

    Full Text Available Searching for and retrieving the demanded correct information is one important problem in networks; in particular, designing an efficient search algorithm is a key challenge in unstructured peer-to-peer (P2P) networks. Breadth-first search (BFS) and depth-first search (DFS) are the two current typical search methods. BFS-based algorithms show perfect performance in terms of the search success rate for network resources, while generating a huge number of search messages. On the contrary, DFS-based algorithms reduce the search message quantity but also lower the search success ratio. To address the problem that only one of the performances is excellent, we propose two memory function degree search algorithms: the memory function maximum degree algorithm (MD) and the memory function preference degree algorithm (PD). We study their performance, including the search success rate and the search message quantity, in different networks: scale-free networks, random graph networks, and small-world networks. Simulations show that the two performances are both excellent at the same time, and the performances are improved by at least 10 times.

  6. Structural design optimization of vehicle components using Cuckoo Search Algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Yildiz, Ali Riza [Bursa Technical Univ., Bursa (Turkey). Dept. of Mechanical Engineering; Durgun, Ismail

    2012-07-01

    In order to meet today's vehicle design requirements and to improve cost and fuel efficiency, there is an increasing interest in designing light-weight and cost-effective vehicle components. In this research, a new optimization algorithm, called the Cuckoo Search (CS) algorithm, is introduced for solving structural design optimization problems. This research is the first application of the CS to shape design optimization problems in the literature. The CS algorithm is applied to the structural design optimization of a vehicle component to illustrate how the present approach can be applied for solving structural design problems. Results show the ability of the CS to find a better optimal structural design.

  7. Automated Spectroscopic Analysis Using the Particle Swarm Optimization Algorithm: Implementing a Guided Search Algorithm to Autofit

    Science.gov (United States)

    Ervin, Katherine; Shipman, Steven

    2017-06-01

    While rotational spectra can be rapidly collected, their analysis (especially for complex systems) is seldom straightforward, leading to a bottleneck. The AUTOFIT program was designed to serve that need by quickly matching rotational constants to spectra with little user input and supervision. This program can potentially be improved by incorporating an optimization algorithm in the search for a solution. The Particle Swarm Optimization Algorithm (PSO) was chosen for implementation. PSO is part of a family of optimization algorithms called heuristic algorithms, which seek approximate best answers. This is ideal for rotational spectra, where an exact match will not be found without incorporating distortion constants, etc., which would otherwise greatly increase the size of the search space. PSO was tested for robustness against five standard fitness functions and then applied to a custom fitness function created for rotational spectra. This talk will explain the Particle Swarm Optimization algorithm and how it works, describe how Autofit was modified to use PSO, discuss the fitness function developed to work with spectroscopic data, and show our current results. Seifert, N.A., Finneran, I.A., Perez, C., Zaleski, D.P., Neill, J.L., Steber, A.L., Suenram, R.D., Lesarri, A., Shipman, S.T., Pate, B.H., J. Mol. Spec. 312, 13-21 (2015)
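
    As a hypothetical illustration of PSO-guided fitting (not the modified AUTOFIT code), the sketch below fits the rotational constants B and D of a linear rotor, whose J+1 <- J transitions lie near 2B(J+1) - 4D(J+1)^3, by minimizing the squared error against synthetic line positions; the constants, noise level and swarm settings are assumptions.

        # Hypothetical PSO fit of linear-rotor constants B and D (not the modified
        # AUTOFIT code): the swarm minimizes the squared error between predicted and
        # synthetic "observed" transition frequencies.
        import numpy as np

        rng = np.random.default_rng(5)
        J = np.arange(1, 11)
        B_true, D_true = 5000.0, 0.002                 # MHz, assumed values
        obs = 2 * B_true * (J + 1) - 4 * D_true * (J + 1) ** 3 + rng.normal(0, 0.05, J.size)

        def fitness(p):
            B, D = p
            pred = 2 * B * (J + 1) - 4 * D * (J + 1) ** 3
            return float(np.sum((pred - obs) ** 2))

        n, iters = 30, 300
        pos = np.column_stack([rng.uniform(4000, 6000, n), rng.uniform(0, 0.01, n)])
        vel = np.zeros_like(pos)
        pbest = pos.copy()
        pbest_f = np.array([fitness(p) for p in pos])
        gbest = pbest[np.argmin(pbest_f)]

        for _ in range(iters):
            r1, r2 = rng.random((n, 1)), rng.random((n, 1))
            vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
            pos = pos + vel
            f = np.array([fitness(p) for p in pos])
            improved = f < pbest_f                     # update personal bests
            pbest[improved], pbest_f[improved] = pos[improved], f[improved]
            gbest = pbest[np.argmin(pbest_f)]          # update global best

        print("fitted B, D:", gbest, "residual:", fitness(gbest))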

  8. Memetic algorithms for continuous optimisation based on local search chains.

    Science.gov (United States)

    Molina, Daniel; Lozano, Manuel; García-Martínez, Carlos; Herrera, Francisco

    2010-01-01

    Memetic algorithms with continuous local search methods have arisen as effective tools to address the difficulty of obtaining reliable solutions of high precision for complex continuous optimisation problems. There exists a group of continuous local search algorithms that stand out as exceptional local search optimisers. However, on some occasions, they may become very expensive, because of the way they exploit local information to guide the search process. In this paper, they are called intensive continuous local search methods. Given the potential of this type of local optimisation methods, it is interesting to build prospective memetic algorithm models with them. This paper presents the concept of local search chain as a springboard to design memetic algorithm approaches that can effectively use intense continuous local search methods as local search operators. Local search chain concerns the idea that, at one stage, the local search operator may continue the operation of a previous invocation, starting from the final configuration (initial solution, strategy parameter values, internal variables, etc.) reached by this one. The proposed memetic algorithm favours the formation of local search chains during the memetic algorithm run with the aim of concentrating local tuning in search regions showing promise. In order to study the performance of the new memetic algorithm model, an instance is implemented with CMA-ES as an intense local search method. The benefits of the proposal in comparison to other kinds of memetic algorithms and evolutionary algorithms proposed in the literature to deal with continuous optimisation problems are experimentally shown. Concretely, the empirical study reveals a clear superiority when tackling high-dimensional problems.

  9. Design and economic investigation of shell and tube heat exchangers using Improved Intelligent Tuned Harmony Search algorithm

    OpenAIRE

    Turgut, Oguz Emrah; Turgut, Mert Sinan; Coban, Mustafa Turhan

    2014-01-01

    This study explores the thermal design of shell and tube heat exchangers by using the Improved Intelligent Tuned Harmony Search (I-ITHS) algorithm. Intelligent Tuned Harmony Search (ITHS) is an upgraded version of the harmony search algorithm which has the advantage of deciding between intensification and diversification processes by applying a proper pitch adjusting strategy. In this study, we aim to improve the search capacity of the ITHS algorithm by utilizing chaotic sequences instead of uniformly distributed r...

  10. Nature-inspired novel Cuckoo Search Algorithm for genome ...

    Indian Academy of Sciences (India)

    Keywords: Bioinformatics; Cuckoo search; genome sequence assembly; metaheuristics. Abstract: This study aims to produce a novel optimization algorithm, called the Cuckoo Search Algorithm (CS), for solving the genome sequence assembly problem. Assembly of genome sequence is a technique that attempts to rebuild ...

  11. 2nd International Conference on Harmony Search Algorithm

    CERN Document Server

    Geem, Zong

    2016-01-01

    The Harmony Search Algorithm (HSA) is one of the most well-known techniques in the field of soft computing, an important paradigm in the science and engineering community.  This volume, the proceedings of the 2nd International Conference on Harmony Search Algorithm 2015 (ICHSA 2015), brings together contributions describing the latest developments in the field of soft computing with a special focus on HSA techniques. It includes coverage of new methods that have potentially immense application in various fields. Contributed articles cover aspects of the following topics related to the Harmony Search Algorithm: analytical studies; improved, hybrid and multi-objective variants; parameter tuning; and large-scale applications.  The book also contains papers discussing recent advances on the following topics: genetic algorithms; evolutionary strategies; the firefly algorithm and cuckoo search; particle swarm optimization and ant colony optimization; simulated annealing; and local search techniques.   This book ...

  12. Image similarity search using a negative selection algorithm

    NARCIS (Netherlands)

    Keijzers, S.; Maandag, P.; Marchiori, E.; Sprinkhuizen-Kuyper, I.G.

    2013-01-01

    The Negative Selection Algorithm is an immune inspired algorithm that can be used for different purposes such as fault detection, data integrity protection and virus detection. In this paper we show how the Negative Selection Algorithm can be adapted to tackle the similar image search problem: given

  13. A review: search visualization with Knuth Morris Pratt algorithm

    Science.gov (United States)

    Rahim, Robbi; Zulkarnain, Iskandar; Jaya, Hendra

    2017-09-01

    In this research, the search process of the Knuth-Morris-Pratt algorithm is modeled as an easy-to-understand visualization. The Knuth-Morris-Pratt algorithm was selected because it is easy to learn and easy to implement in many programming languages.
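    For reference, a minimal sketch of the Knuth-Morris-Pratt search itself (the failure-function table plus a single linear scan over the text) is given below; the function name and the example strings are illustrative and are not taken from the reviewed paper.

```python
def kmp_search(text, pattern):
    """Return the start indices of all occurrences of pattern in text (Knuth-Morris-Pratt)."""
    if not pattern:
        return []
    # Failure function: length of the longest proper prefix that is also a suffix.
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k > 0 and pattern[i] != pattern[k]:
            k = fail[k - 1]
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    # Scan the text without ever moving backwards in it.
    hits, k = [], 0
    for i, ch in enumerate(text):
        while k > 0 and ch != pattern[k]:
            k = fail[k - 1]
        if ch == pattern[k]:
            k += 1
        if k == len(pattern):
            hits.append(i - k + 1)
            k = fail[k - 1]
    return hits

print(kmp_search("ababcabcabababd", "ababd"))  # [10]
```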

  14. Cuckoo search and firefly algorithm theory and applications

    CERN Document Server

    2014-01-01

    Nature-inspired algorithms such as cuckoo search and firefly algorithm have become popular and widely used in recent years in many applications. These algorithms are flexible, efficient and easy to implement. New progress has been made in the last few years, and it is timely to summarize the latest developments of cuckoo search and firefly algorithm and their diverse applications. This book will review both theoretical studies and applications with detailed algorithm analysis, implementation and case studies so that readers can benefit most from this book.  Application topics are contributed by many leading experts in the field. Topics include cuckoo search, firefly algorithm, algorithm analysis, feature selection, image processing, travelling salesman problem, neural network, GPU optimization, scheduling, queuing, multi-objective manufacturing optimization, semantic web service, shape optimization, and others.   This book can serve as an ideal reference for both graduates and researchers in computer scienc...

  15. A GENETIC ALGORITHM USING THE LOCAL SEARCH HEURISTIC IN FACILITIES LAYOUT PROBLEM: A MEMETIC ALGORITHM APPROACH

    Directory of Open Access Journals (Sweden)

    Orhan TÜRKBEY

    2002-02-01

    Full Text Available Memetic algorithms, which use local search techniques, are hybrid structured algorithms, like genetic algorithms among evolutionary algorithms. In this study, a memetic structured algorithm using a local search heuristic such as 2-opt is developed for the Quadratic Assignment Problem (QAP). In the algorithm, a crossover operator that has not been used before for QAP is applied, whereas the Eshelman procedure is used in order to increase the solution variability. The developed memetic algorithm is applied to test problems taken from QAPLIB, and the results are compared with the present techniques in the literature.
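    As a sketch of the kind of local search operator described above, the following implements a best-improvement pairwise-swap (2-opt style) local search on a QAP permutation; the tiny distance and flow matrices are invented for illustration and are not QAPLIB instances.

```python
import random

def qap_cost(perm, dist, flow):
    """Cost of assigning facility perm[i] to location i (quadratic assignment)."""
    n = len(perm)
    return sum(flow[perm[i]][perm[j]] * dist[i][j] for i in range(n) for j in range(n))

def two_opt_local_search(perm, dist, flow):
    """Repeatedly apply the best improving pairwise swap until no swap helps."""
    best = list(perm)
    best_cost = qap_cost(best, dist, flow)
    improved = True
    while improved:
        improved = False
        for i in range(len(best) - 1):
            for j in range(i + 1, len(best)):
                cand = list(best)
                cand[i], cand[j] = cand[j], cand[i]
                cost = qap_cost(cand, dist, flow)
                if cost < best_cost:
                    best, best_cost, improved = cand, cost, True
    return best, best_cost

# Tiny illustrative instance (not from QAPLIB).
dist = [[0, 2, 3], [2, 0, 1], [3, 1, 0]]
flow = [[0, 5, 2], [5, 0, 3], [2, 3, 0]]
print(two_opt_local_search(random.sample(range(3), 3), dist, flow))
```

    In a memetic algorithm, this routine would be applied to offspring produced by the crossover operator before they re-enter the population.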

  16. State-of-the-Art Review on Relevance of Genetic Algorithm to Internet Web Search

    Directory of Open Access Journals (Sweden)

    Kehinde Agbele

    2012-01-01

    Full Text Available People use search engines to find information they desire with the aim that their information needs will be met. Information retrieval (IR) is a field that is concerned primarily with the searching and retrieving of information in documents, and also with searching the search engine, online databases, and the Internet. Genetic algorithms (GAs) are robust, efficient, and optimized search methods in a wide area of search problems, motivated by Darwin's principles of natural selection and survival of the fittest. This paper describes information retrieval system (IRS) components. This paper looks at how GAs can be applied in the field of IR and specifically the relevance of genetic algorithms to internet web search. Finally, from the proposals surveyed it turns out that GA is applied to diverse problem fields of internet web search.

  17. HEURISTIC OPTIMIZATION AND ALGORITHM TUNING APPLIED TO SORPTIVE BARRIER DESIGN

    Science.gov (United States)

    While heuristic optimization is applied in environmental applications, ad-hoc algorithm configuration is typical. We use a multi-layer sorptive barrier design problem as a benchmark for an algorithm-tuning procedure, as applied to three heuristics (genetic algorithms, simulated ...

  18. Induction Motor Parameter Identification Using a Gravitational Search Algorithm

    Directory of Open Access Journals (Sweden)

    Omar Avalos

    2016-04-01

    Full Text Available The efficient use of electrical energy is a topic that has attracted attention for its environmental consequences. On the other hand, induction motors represent the main component in most industries. They consume the highest energy percentages in industrial facilities. This energy consumption depends on the operation conditions of the induction motor imposed by its internal parameters. Since the internal parameters of an induction motor are not directly measurable, an identification process must be conducted to obtain them. In the identification process, the parameter estimation is transformed into a multidimensional optimization problem where the internal parameters of the induction motor are considered as decision variables. Under this approach, the complexity of the optimization problem tends to produce multimodal error surfaces for which their cost functions are significantly difficult to minimize. Several algorithms based on evolutionary computation principles have been successfully applied to identify the optimal parameters of induction motors. However, most of them maintain an important limitation: They frequently obtain sub-optimal solutions as a result of an improper equilibrium between exploitation and exploration in their search strategies. This paper presents an algorithm for the optimal parameter identification of induction motors. To determine the parameters, the proposed method uses a recent evolutionary method called the gravitational search algorithm (GSA). Different from most of the existent evolutionary algorithms, the GSA presents a better performance in multimodal problems, avoiding critical flaws such as the premature convergence to sub-optimal solutions. Numerical simulations have been conducted on several models to show the effectiveness of the proposed scheme.

  19. Computing a Clique Tree with the Algorithm Maximal Label Search

    Directory of Open Access Journals (Sweden)

    Anne Berry

    2017-01-01

    Full Text Available The algorithm MLS (Maximal Label Search) is a graph search algorithm that generalizes the algorithms Maximum Cardinality Search (MCS), Lexicographic Breadth-First Search (LexBFS), Lexicographic Depth-First Search (LexDFS) and Maximal Neighborhood Search (MNS). On a chordal graph, MLS computes a PEO (perfect elimination ordering) of the graph. We show how the algorithm MLS can be modified to compute a PMO (perfect moplex ordering), as well as a clique tree and the minimal separators of a chordal graph. We give a necessary and sufficient condition on the labeling structure of MLS for the beginning of a new clique in the clique tree to be detected by a condition on labels. MLS is also used to compute a clique tree of the complement graph, and new cliques in the complement graph can be detected by a condition on labels for any labeling structure. We provide a linear time algorithm computing a PMO and the corresponding generators of the maximal cliques and minimal separators of the complement graph. On a non-chordal graph, the algorithm MLSM, a graph search algorithm computing an MEO and a minimal triangulation of the graph, is used to compute an atom tree of the clique minimal separator decomposition of any graph.

  20. Combinatorial search from algorithms to systems

    CERN Document Server

    Hamadi, Youssef

    2013-01-01

    This book details key techniques in constraint networks, dealing in particular with constraint satisfaction, search, satisfiability, and applications in machine learning and constraint programming. Includes case studies.

  1. Conditionally-uniform Feasible Grid Search Algorithm

    DEFF Research Database (Denmark)

    Dziubinski, Matt P.

    We present and evaluate a numerical optimization method (together with an algorithm for choosing the starting values) pertinent to the constrained optimization problem arising in the estimation of GARCH models with inequality constraints, in particular the Simplified Component GARCH Model (SCGARCH), together with algorithms for the objective function and analytical gradient computation for SCGARCH.

  2. Kernel Clustering with a Differential Harmony Search Algorithm for Scheme Classification

    Directory of Open Access Journals (Sweden)

    Yu Feng

    2017-01-01

    Full Text Available This paper presents a kernel fuzzy clustering with a novel differential harmony search algorithm to coordinate with the diversion scheduling scheme classification. First, we employed a self-adaptive solution generation strategy and differential evolution-based population update strategy to improve the classical harmony search. Second, we applied the differential harmony search algorithm to the kernel fuzzy clustering to help the clustering method obtain better solutions. Finally, the combination of the kernel fuzzy clustering and the differential harmony search is applied for water diversion scheduling in East Lake. A comparison of the proposed method with other methods has been carried out. The results show that the kernel clustering with the differential harmony search algorithm has good performance to cooperate with the water diversion scheduling problems.

  3. Merged Search Algorithms for Radio Frequency Identification Anticollision

    Directory of Open Access Journals (Sweden)

    Bih-Yaw Shih

    2012-01-01

    The arbitration algorithm for an RFID system is used to arbitrate among all the tags to avoid the collision problem when multiple tags are present in the interrogation field of a transponder. A splitting algorithm called the Binary Search Tree (BST) is well known for multi-tag arbitration. In the current study, a splitting-based scheme called the Merged Search Tree is proposed to capture identification codes correctly for anticollision. The performance of the proposed algorithm is compared with the original BST according to the time and power consumed during the arbitration process. The results show that the proposed model can reduce the searching time and power consumed to achieve better arbitration performance.

  4. Journal of Applied Biosciences: Advanced Search

    African Journals Online (AJOL)


  5. Adaptive switching gravitational search algorithm: an attempt to ...

    Indian Academy of Sciences (India)

    Nor Azlina Ab Aziz

    The statistical analysis results show that ASw-GSA performs significantly better than GA and BA, and as well as PSO, the original GSA and GWO. Keywords: asynchronous; diversity; gravitational search algorithm; iteration strategy; ...

  6. Fast Multidimensional Nearest Neighbor Search Algorithm Using Priority Queue

    Science.gov (United States)

    Ajioka, Shiro; Tsuge, Satoru; Shishibori, Masami; Kita, Kenji

    Nearest neighbor search in high dimensional spaces is an interesting and important problem which is relevant for a wide variety of applications, including multimedia information retrieval, data mining, and pattern recognition. For such applications, the curse of high dimensionality tends to be a major obstacle in the development of efficient search methods. This paper addresses the problem of designing an efficient algorithm for high dimensional nearest neighbor search using a priority queue. The proposed algorithm is based on a simple linear search algorithm and eliminates unnecessary arithmetic operations from distance computations between multidimensional vectors. Moreover, we propose two techniques, a dimensional sorting method and a PCA-based method, to accelerate multidimensional search. Experimental results indicate that our scheme scales well even for a very large number of dimensions.
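    A minimal sketch of the two ideas described above, keeping the current k best candidates in a priority queue and abandoning a distance computation as soon as its partial sum exceeds the current worst retained distance, might look as follows; the dimensional-sorting and PCA refinements are omitted, and all names and data are illustrative.

```python
import heapq

def knn_partial_distance(query, data, k=3):
    """Linear k-NN search: a max-heap of size k holds the best candidates, and each
    distance computation is abandoned once it already exceeds the k-th best distance."""
    heap = []  # stores (-squared_distance, index) so the worst retained candidate is on top
    for idx, vec in enumerate(data):
        bound = -heap[0][0] if len(heap) == k else float("inf")
        dist = 0.0
        for q, v in zip(query, vec):
            dist += (q - v) ** 2
            if dist > bound:          # partial distance elimination
                break
        else:                         # full distance computed and still competitive
            if len(heap) < k:
                heapq.heappush(heap, (-dist, idx))
            else:
                heapq.heappushpop(heap, (-dist, idx))
    return sorted((-d, i) for d, i in heap)   # (squared distance, index) pairs, ascending

data = [[0.0, 0.0], [1.0, 1.0], [2.0, 2.0], [0.5, 0.4]]
print(knn_partial_distance([0.2, 0.2], data, k=2))
```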

  7. Improved bat algorithm applied to multilevel image thresholding.

    Science.gov (United States)

    Alihodzic, Adis; Tuba, Milan

    2014-01-01

    Multilevel image thresholding is a very important image processing technique that is used as a basis for image segmentation and further higher level processing. However, the required computational time for exhaustive search grows exponentially with the number of desired thresholds. Swarm intelligence metaheuristics are well known as successful and efficient optimization methods for intractable problems. In this paper, we adjusted one of the latest swarm intelligence algorithms, the bat algorithm, for the multilevel image thresholding problem. The results of testing on standard benchmark images show that the bat algorithm is comparable with other state-of-the-art algorithms. We improved standard bat algorithm, where our modifications add some elements from the differential evolution and from the artificial bee colony algorithm. Our new proposed improved bat algorithm proved to be better than five other state-of-the-art algorithms, improving quality of results in all cases and significantly improving convergence speed.

  8. Improved Bat Algorithm Applied to Multilevel Image Thresholding

    Directory of Open Access Journals (Sweden)

    Adis Alihodzic

    2014-01-01

    Full Text Available Multilevel image thresholding is a very important image processing technique that is used as a basis for image segmentation and further higher level processing. However, the required computational time for exhaustive search grows exponentially with the number of desired thresholds. Swarm intelligence metaheuristics are well known as successful and efficient optimization methods for intractable problems. In this paper, we adjusted one of the latest swarm intelligence algorithms, the bat algorithm, for the multilevel image thresholding problem. The results of testing on standard benchmark images show that the bat algorithm is comparable with other state-of-the-art algorithms. We improved standard bat algorithm, where our modifications add some elements from the differential evolution and from the artificial bee colony algorithm. Our new proposed improved bat algorithm proved to be better than five other state-of-the-art algorithms, improving quality of results in all cases and significantly improving convergence speed.

  9. International Timetabling Competition 2011: An Adaptive Large Neighborhood Search algorithm

    DEFF Research Database (Denmark)

    Sørensen, Matias; Kristiansen, Simon; Stidsen, Thomas Riis

    2012-01-01

    An algorithm based on Adaptive Large Neighborhood Search (ALNS) for solving the generalized High School Timetabling problem in XHSTT-format (Post et al (2012a)) is presented. This algorithm was among the finalists of round 2 of the International Timetabling Competition 2011 (ITC2011). For problem...

  10. International Timetabling Competition 2011: An Adaptive Large Neighborhood Search algorithm

    OpenAIRE

    Sørensen, Matias; Kristiansen, Simon; Stidsen, Thomas Riis

    2012-01-01

    An algorithm based on Adaptive Large Neighborhood Search (ALNS) for solving the generalized High School Timetabling problem in XHSTT-format (Post et al (2012a)) is presented. This algorithm was among the finalists of round 2 of the International Timetabling Competition 2011 (ITC2011). For problem description and results we refer to Post et al (2012b).

  11. A Functional Programming Approach to AI Search Algorithms

    Science.gov (United States)

    Panovics, Janos

    2012-01-01

    The theory and practice of search algorithms related to state-space represented problems form the major part of the introductory course of Artificial Intelligence at most of the universities and colleges offering a degree in the area of computer science. Students usually meet these algorithms only in some imperative or object-oriented language…

  12. Algorithms for Academic Search and Recommendation Systems

    DEFF Research Database (Denmark)

    Amolochitis, Emmanouil

    2014-01-01

    implementation of the term frequency heuristic, a time-depreciated citation score and a graph-theoretic computed score that relates the paper’s index terms with each other. On the second part we describe the design of hybrid recommender ensemble (user, item and content based). The newly introduced algorithms...

  13. Fault-tolerant search algorithms reliable computation with unreliable information

    CERN Document Server

    Cicalese, Ferdinando

    2013-01-01

    Why a book on fault-tolerant search algorithms? Searching is one of the fundamental problems in computer science. Time and again algorithmic and combinatorial issues originally studied in the context of search find application in the most diverse areas of computer science and discrete mathematics. On the other hand, fault-tolerance is a necessary ingredient of computing. Due to their inherent complexity, information systems are naturally prone to errors, which may appear at any level - as imprecisions in the data, bugs in the software, or transient or permanent hardware failures. This book pr

  14. Fast search algorithms for computational protein design.

    Science.gov (United States)

    Traoré, Seydou; Roberts, Kyle E; Allouche, David; Donald, Bruce R; André, Isabelle; Schiex, Thomas; Barbe, Sophie

    2016-05-05

    One of the main challenges in computational protein design (CPD) is the huge size of the protein sequence and conformational space that has to be computationally explored. Recently, we showed that state-of-the-art combinatorial optimization technologies based on Cost Function Network (CFN) processing allow speeding up provable rigid backbone protein design methods by several orders of magnitude. Building on this, we improved and injected CFN technology into the well-established CPD package Osprey to allow all Osprey CPD algorithms to benefit from the associated speedups. Because Osprey fundamentally relies on the ability of A* to produce conformations in increasing order of energy, we defined new A* strategies combining CFN lower bounds with a new side-chain positioning-based branching scheme. Beyond the speedups obtained in the new A*-CFN combination, this novel branching scheme enables a much faster enumeration of suboptimal sequences, far beyond what is reachable without it. Together with the immediate and important speedups provided by CFN technology, these developments directly benefit all the algorithms that previously relied on the DEE/A* combination inside Osprey and make it possible to solve larger CPD problems with provable algorithms. © 2016 Wiley Periodicals, Inc.

  15. Computer Algorithms in the Search for Unrelated Stem Cell Donors

    Directory of Open Access Journals (Sweden)

    David Steiner

    2012-01-01

    Full Text Available Hematopoietic stem cell transplantation (HSCT) is a medical procedure in the field of hematology and oncology, most often performed for patients with certain cancers of the blood or bone marrow. Many patients have no suitable HLA-matched donor within their family, so physicians must activate a “donor search process” by interacting with national and international donor registries that will search their databases for adult unrelated donors or cord blood units (CBU). Information and communication technologies play a key role in the donor search process in donor registries both nationally and internationally. One of the major challenges for donor registry computer systems is the development of a reliable search algorithm. This work discusses the top-down design of such algorithms and current practice. Based on our experience with systems used by several stem cell donor registries, we highlight typical pitfalls in the implementation of an algorithm and the underlying data structure.

  16. Synergy of Genetic Algorithm with Extensive Neighborhood Search for the Permutation Flowshop Scheduling Problem

    Directory of Open Access Journals (Sweden)

    Rong-Chang Chen

    2017-01-01

    Full Text Available The permutation flowshop scheduling problem (PFSP) is an important issue in the manufacturing industry. The objective of this study is to minimize the total completion time of the schedule, i.e. the makespan. Although hybrid genetic algorithms are popular for solving the PFSP, their local search methods tend to get trapped in local optima with poorer solutions. This study proposes a new hybrid genetic algorithm for the PFSP which makes use of an extensive neighborhood search method. To evaluate the performance, the results of this study were compared against other state-of-the-art hybrid genetic algorithms. The comparisons showed that the proposed algorithm outperformed the other algorithms, with a significant 50% of the test instances reaching the known optimal solutions. The proposed algorithm is simple and easy to implement, and it can be extended easily to similar combinatorial optimization problems.
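    For illustration, a hedged sketch of the two core ingredients, the makespan recurrence for a permutation flowshop and one step of an exhaustive pairwise-swap neighborhood search, is given below; the processing-time matrix is invented and the code is not the authors' implementation.

```python
import itertools

def makespan(seq, proc):
    """Completion time of the last job on the last machine for permutation `seq`.
    proc[j][m] is the processing time of job j on machine m."""
    n_machines = len(proc[0])
    finish = [0.0] * n_machines          # completion time of the previous job on each machine
    for job in seq:
        for m in range(n_machines):
            ready = finish[m - 1] if m > 0 else 0.0   # job done on the previous machine
            finish[m] = max(finish[m], ready) + proc[job][m]
    return finish[-1]

def best_swap_neighbour(seq, proc):
    """One step of an extensive pairwise-swap neighbourhood search."""
    best, best_val = list(seq), makespan(seq, proc)
    for i, j in itertools.combinations(range(len(seq)), 2):
        cand = list(seq)
        cand[i], cand[j] = cand[j], cand[i]
        val = makespan(cand, proc)
        if val < best_val:
            best, best_val = cand, val
    return best, best_val

proc = [[3, 2, 4], [1, 4, 2], [5, 1, 3]]   # 3 jobs x 3 machines, illustrative only
print(best_swap_neighbour([0, 1, 2], proc))
```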

  17. Adaptive symbiotic organisms search (SOS) algorithm for structural design optimization

    Directory of Open Access Journals (Sweden)

    Ghanshyam G. Tejani

    2016-07-01

    Full Text Available The symbiotic organisms search (SOS) algorithm is an effective metaheuristic developed in 2014, which mimics the symbiotic relationships among living beings, such as mutualism, commensalism, and parasitism, that allow them to survive in the ecosystem. In this study, three modified versions of the SOS algorithm are proposed by introducing adaptive benefit factors into the basic SOS algorithm to improve its efficiency. The basic SOS algorithm only considers benefit factors, whereas the proposed variants of the SOS algorithm consider effective combinations of adaptive benefit factors and benefit factors to study their ability to strike a good balance between exploration and exploitation of the search space. The proposed algorithms are tested for their suitability for engineering structures subjected to dynamic excitation, which may lead to undesirable vibrations. Structure optimization problems become more challenging if the shape and size variables are taken into account along with the frequency. To check the feasibility and effectiveness of the proposed algorithms, six different planar and space trusses are subjected to experimental analysis. The results obtained using the proposed methods are compared with those obtained using other optimization methods well established in the literature. The results reveal that the adaptive SOS algorithm is more reliable and efficient than the basic SOS algorithm and other state-of-the-art algorithms.

  18. Interior search algorithm (ISA): a novel approach for global optimization.

    Science.gov (United States)

    Gandomi, Amir H

    2014-07-01

    This paper presents the interior search algorithm (ISA) as a novel method for solving optimization tasks. The proposed ISA is inspired by interior design and decoration. The algorithm is different from other metaheuristic algorithms and provides new insight for global optimization. The proposed method is verified using some benchmark mathematical and engineering problems commonly used in the area of optimization. ISA results are further compared with well-known optimization algorithms. The results show that the ISA is efficiently capable of solving optimization problems. The proposed algorithm can outperform the other well-known algorithms. Further, the proposed algorithm is very simple and it only has one parameter to tune. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  19. Adiabatic quantum algorithm for search engine ranking.

    Science.gov (United States)

    Garnerone, Silvano; Zanardi, Paolo; Lidar, Daniel A

    2012-06-08

    We propose an adiabatic quantum algorithm for generating a quantum pure state encoding of the PageRank vector, the most widely used tool in ranking the relative importance of internet pages. We present extensive numerical simulations which provide evidence that this algorithm can prepare the quantum PageRank state in a time which, on average, scales polylogarithmically in the number of web pages. We argue that the main topological feature of the underlying web graph allowing for such a scaling is the out-degree distribution. The top-ranked log(n) entries of the quantum PageRank state can then be estimated with a polynomial quantum speed-up. Moreover, the quantum PageRank state can be used in "q-sampling" protocols for testing properties of distributions, which require exponentially fewer measurements than all classical schemes designed for the same task. This can be used to decide whether to run a classical update of the PageRank.

  20. Adiabatic Quantum Algorithm for Search Engine Ranking

    Science.gov (United States)

    Garnerone, Silvano; Zanardi, Paolo; Lidar, Daniel A.

    2012-06-01

    We propose an adiabatic quantum algorithm for generating a quantum pure state encoding of the PageRank vector, the most widely used tool in ranking the relative importance of internet pages. We present extensive numerical simulations which provide evidence that this algorithm can prepare the quantum PageRank state in a time which, on average, scales polylogarithmically in the number of web pages. We argue that the main topological feature of the underlying web graph allowing for such a scaling is the out-degree distribution. The top-ranked log⁡(n) entries of the quantum PageRank state can then be estimated with a polynomial quantum speed-up. Moreover, the quantum PageRank state can be used in “q-sampling” protocols for testing properties of distributions, which require exponentially fewer measurements than all classical schemes designed for the same task. This can be used to decide whether to run a classical update of the PageRank.

  1. Modified cuckoo search algorithm in microscopic image segmentation of hippocampus.

    Science.gov (United States)

    Chakraborty, Shouvik; Chatterjee, Sankhadeep; Dey, Nilanjan; Ashour, Amira S; Ashour, Ahmed S; Shi, Fuqian; Mali, Kalyani

    2017-10-01

    Microscopic image analysis is a challenging task due to the presence of weak correlation and different segments of interest that may lead to ambiguity. It is also valuable in leading fields of technology and medicine. Identification and counting of cells play a vital role in feature extraction to diagnose particular diseases precisely. Different segments should be identified accurately in order to identify and count cells in a microscope image. Consequently, in the current work, a novel method for cell segmentation and identification has been proposed that incorporates marking of cells. Thus, a novel method based on cuckoo search, applied after a pre-processing step, is employed. The method is developed and evaluated on light microscope images of rats' hippocampus, which were used as a sample of brain cells. The proposed method can be applied to color images directly. The proposed approach incorporates McCulloch's method for Lévy flight production in the cuckoo search (CS) algorithm. Several objective functions, namely Otsu's method, Kapur entropy and Tsallis entropy, are used for segmentation. In the cuckoo search process, Otsu's between-class variance, Kapur's entropy and Tsallis entropy are employed as the objective functions to be optimized. Experimental results are validated by different metrics, namely the peak signal to noise ratio (PSNR), mean square error, feature similarity index and CPU running time for all the test cases. The experimental results established that the Kapur's entropy segmentation method based on the modified CS required the least computational time compared to Otsu's between-class variance segmentation method and the Tsallis entropy segmentation method. Nevertheless, the Tsallis entropy method with optimized multi-threshold levels achieved superior performance compared to the other two segmentation methods in terms of the PSNR. © 2017 Wiley Periodicals, Inc.
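    As an illustration of how such segmentation objectives are exposed to a metaheuristic, the following sketch computes Otsu's between-class variance for a candidate set of thresholds over a grey-level histogram; a cuckoo-search style optimizer would maximize this value over the threshold positions. The histogram and threshold values are invented, and the Kapur and Tsallis entropies would be analogous drop-in objectives.

```python
def between_class_variance(hist, thresholds):
    """Otsu-style objective for multilevel thresholding: the thresholds split the
    grey-level histogram into classes, and the fitness to maximise is the weighted
    variance of the class means around the global mean."""
    total = float(sum(hist))
    probs = [h / total for h in hist]
    global_mean = sum(i * p for i, p in enumerate(probs))
    cuts = [0] + sorted(thresholds) + [len(hist)]
    variance = 0.0
    for lo, hi in zip(cuts[:-1], cuts[1:]):
        w = sum(probs[lo:hi])
        if w == 0:
            continue
        mean = sum(i * probs[i] for i in range(lo, hi)) / w
        variance += w * (mean - global_mean) ** 2
    return variance

# Toy 8-level histogram with two candidate thresholds.
hist = [10, 40, 35, 5, 2, 8, 30, 20]
print(between_class_variance(hist, [3, 5]))
```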

  2. Can search algorithms save large-scale automatic performance tuning?

    Energy Technology Data Exchange (ETDEWEB)

    Balaprakash, P.; Wild, S. M.; Hovland, P. D. (Mathematics and Computer Science)

    2011-01-01

    Empirical performance optimization of computer codes using autotuners has received significant attention in recent years. Given the increased complexity of computer architectures and scientific codes, evaluating all possible code variants is prohibitively expensive for all but the simplest kernels. One way for autotuners to overcome this hurdle is through use of a search algorithm that finds high-performing code variants while examining relatively few variants. In this paper we examine the search problem in autotuning from a mathematical optimization perspective. As an illustration of the power and limitations of this optimization, we conduct an experimental study of several optimization algorithms on a number of linear algebra kernel codes. We find that the algorithms considered obtain performance gains similar to the optimal ones found by complete enumeration or by large random searches but in a tiny fraction of the computation time.

  3. Global Algorithm Applied on Single Photon Detection

    Science.gov (United States)

    LIU, Hua; DING, Quanxin; Wang, Helong; Chen, Hongliang; GUO, Chunjie; ZHOU, Liwei

    2017-06-01

    There are three major contributions. First, an applied study of the theory and experiment of single photon detection, including the design and experiment of quantum key distribution. Second, methods for detector selection and the main photoelectronic system configuration, design and construction, along with the relationship between these and the system characteristics, have been studied. Third, based on considerations from research on image sensor systems for single photon detection, the total system characteristics are evaluated and discussed quantitatively. The results of simulation experiments and theoretical analysis demonstrate that the proposed method can advance the system validity effectively, and theoretical analysis and experiment show that the method is reasonable and efficient.

  4. New Predicted Spiral Search Block Matching Algorithm - PSSBMA

    Directory of Open Access Journals (Sweden)

    J. Pika

    2002-04-01

    Full Text Available This article describes a modification of the full search algorithm ESBMA (Exhaustive Search Block Matching Algorithm), which leads to up to a 40% speed increase. The modification is based on the results of an ESBMA motion field analysis. The major modifications to the ESBMA are: introduction of sub-optimality by thresholding the matching criterion (MAEthr); respecting constraints on motion vectors resulting from "head and shoulder" scenes by changing the position of the search start; and respecting the dependence of motion vectors (MV) by introducing prediction.
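    A hedged sketch of the thresholded matching idea is shown below: an exhaustive block search using the mean absolute error (MAE) criterion that stops as soon as a candidate falls at or below a user-set threshold and abandons candidates whose partial error already exceeds the current best. The spiral ordering and the predicted search start are omitted, and the frames, block position and threshold are illustrative assumptions rather than the authors' implementation.

```python
def mae_block_match(ref, cur, block, search_range, mae_thr=0.0):
    """Exhaustive block matching with the mean absolute error (MAE) criterion.
    block = (top, left, size) locates the block in the current frame `cur`.
    The candidate loop stops early once a motion vector with MAE <= mae_thr is
    found (sub-optimality by thresholding), and each candidate is abandoned as
    soon as its running error sum exceeds the best sum seen so far."""
    top, left, size = block
    best_vec, best_sum = (0, 0), float("inf")
    for dy in range(-search_range, search_range + 1):
        for dx in range(-search_range, search_range + 1):
            partial = 0.0
            for y in range(size):
                for x in range(size):
                    partial += abs(cur[top + y][left + x] - ref[top + dy + y][left + dx + x])
                if partial >= best_sum:      # cannot beat the best candidate any more
                    break
            if partial < best_sum:
                best_vec, best_sum = (dy, dx), partial
                if best_sum <= mae_thr * size * size:   # good enough: stop the search
                    return best_vec, best_sum / (size * size)
    return best_vec, best_sum / (size * size)

# Identical synthetic frames, so the zero motion vector (0, 0) should win.
ref = [[(7 * i + 3 * j) % 16 for j in range(8)] for i in range(8)]
cur = [row[:] for row in ref]
print(mae_block_match(ref, cur, block=(2, 2, 2), search_range=2))
```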

  5. Development of a scatter search optimization algorithm for BWR fuel lattice design

    Energy Technology Data Exchange (ETDEWEB)

    Francois, J.L.; Martin-del-Campo, C. [Mexico Univ. Nacional Autonoma, Facultad de Ingenieria (Mexico); Morales, L.B.; Palomera, M.A. [Mexico Univ. Nacional Autonoma, Instituto de Investigaciones en Matematicas Aplicadas y Sistemas, D.F. (Mexico)

    2005-07-01

    A basic Scatter Search (SS) method, applied to the optimization of radial enrichment and gadolinia distributions for BWR fuel lattices, is presented in this paper. Scatter search is considered as an evolutionary algorithm that constructs solutions by combining others. The goal of this methodology is to enable the implementation of solution procedures that can derive new solutions from combined elements. The main mechanism for combining solutions is such that a new solution is created from the strategic combination of two other solutions to explore the solutions' space. Results show that the Scatter Search method is an efficient optimization algorithm applied to the BWR design and optimization problem. Its main features are based on the use of heuristic rules since the beginning of the process, which allows directing the optimization process to the solution, and to use the diversity mechanism in the combination operator, which allows covering the search space in an efficient way. (authors)

  6. Genetic Algorithms Applied to Multi-Objective Aerodynamic Shape Optimization

    Science.gov (United States)

    Holst, Terry L.

    2005-01-01

    A genetic algorithm approach suitable for solving multi-objective problems is described and evaluated using a series of aerodynamic shape optimization problems. Several new features including two variations of a binning selection algorithm and a gene-space transformation procedure are included. The genetic algorithm is suitable for finding Pareto optimal solutions in search spaces that are defined by any number of genes and that contain any number of local extrema. A new masking array capability is included allowing any gene or gene subset to be eliminated as decision variables from the design space. This allows determination of the effect of a single gene or gene subset on the Pareto optimal solution. Results indicate that the genetic algorithm optimization approach is flexible in application and reliable. The binning selection algorithms generally provide Pareto front quality enhancements and moderate convergence efficiency improvements for most of the problems solved.

  7. Training Feedforward Neural Networks Using Symbiotic Organisms Search Algorithm.

    Science.gov (United States)

    Wu, Haizhou; Zhou, Yongquan; Luo, Qifang; Basset, Mohamed Abdel

    2016-01-01

    Symbiotic organisms search (SOS) is a new robust and powerful metaheuristic algorithm which simulates the symbiotic interaction strategies adopted by organisms to survive and propagate in the ecosystem. In the supervised learning area, it is a challenging task to present a satisfactory and efficient training algorithm for feedforward neural networks (FNNs). In this paper, SOS is employed as a new method for training FNNs. To investigate the performance of the aforementioned method, eight different datasets selected from the UCI machine learning repository are employed for the experiments and the results are compared among seven metaheuristic algorithms. The results show that SOS performs better than the other algorithms for training FNNs in terms of convergence speed. It is also proven that an FNN trained by the method of SOS has better accuracy than most of the algorithms compared.

  8. An enhanced dynamic hash TRIE algorithm for lexicon search

    Science.gov (United States)

    Yang, Lai; Xu, Lida; Shi, Zhongzhi

    2012-11-01

    Information retrieval (IR) is essential to enterprise systems along with growing orders, customers and materials. In this article, an enhanced dynamic hash TRIE (eDH-TRIE) algorithm is proposed that can be used in a lexicon search in Chinese, Japanese and Korean (CJK) segmentation and in URL identification. In particular, the eDH-TRIE algorithm is suitable for Unicode retrieval. The Auto-Array algorithm and Hash-Array algorithm are proposed to handle the auxiliary memory allocation; the former changes its size on demand without redundant restructuring, and the latter replaces linked lists with arrays, saving the overhead of memory. Comparative experiments show that the Auto-Array algorithm and Hash-Array algorithm have better spatial performance; they can be used in a multitude of situations. The eDH-TRIE is evaluated for both speed and storage and compared with the naïve DH-TRIE algorithms. The experiments show that the eDH-TRIE algorithm performs better. These algorithms reduce memory overheads and speed up IR.

  9. An Experience Oriented-Convergence Improved Gravitational Search Algorithm for Minimum Variance Distortionless Response Beamforming Optimum.

    Directory of Open Access Journals (Sweden)

    Soodabeh Darzi

    Full Text Available An experience oriented-convergence improved gravitational search algorithm (ECGSA), based on two new modifications, searching through the best experiences and the use of a dynamic gravitational damping coefficient (α), is introduced in this paper. ECGSA saves its best fitness function evaluations and uses those as the agents' positions in the searching process. In this way, the optimal trajectories found are retained and the search starts from these trajectories, which allows the algorithm to avoid local optima. Also, the agents can move faster in the search space to obtain better exploration during the first stage of the searching process, and they can converge rapidly to the optimal solution at the final stage of the search process by means of the proposed dynamic gravitational damping coefficient. The performance of ECGSA has been evaluated by applying it to eight standard benchmark functions along with six complicated composite test functions. It is also applied to the adaptive beamforming problem as a practical issue to improve the weight vectors computed by the minimum variance distortionless response (MVDR) beamforming technique. The results of the implementation of the proposed algorithm are compared with some well-known heuristic methods and verify the proposed method in terms of both reaching optimal solutions and robustness.
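    For readers unfamiliar with the underlying method, a bare-bones sketch of the standard gravitational search update (fitness-derived masses, pairwise attraction, decaying gravitational constant) is given below; it does not include the experience archive or the dynamic damping coefficient α proposed in this paper, and all parameter values are illustrative assumptions.

```python
import math
import random

def gsa(fitness, dim=2, n_agents=15, iters=100, g0=100.0, alpha=20.0, bounds=(-5.0, 5.0)):
    """Bare-bones gravitational search (minimisation): agents get masses from their
    fitness, attract each other, and the gravitational constant decays over time."""
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_agents)]
    vel = [[0.0] * dim for _ in range(n_agents)]
    best, best_val = None, float("inf")
    for t in range(iters):
        fit = [fitness(p) for p in pos]
        b, w = min(fit), max(fit)
        if b < best_val:
            best, best_val = pos[fit.index(b)][:], b
        # Heavier mass for better (lower) fitness, normalised to sum to one.
        raw = [(w - f) / (w - b) if w > b else 1.0 for f in fit]
        mass = [r / sum(raw) for r in raw]
        g = g0 * math.exp(-alpha * t / iters)            # decaying gravitational constant
        acc = [[0.0] * dim for _ in range(n_agents)]
        for i in range(n_agents):
            for j in range(n_agents):
                if i == j:
                    continue
                dist = math.dist(pos[i], pos[j]) + 1e-12
                for d in range(dim):
                    acc[i][d] += random.random() * g * mass[j] * (pos[j][d] - pos[i][d]) / dist
        for i in range(n_agents):
            for d in range(dim):
                vel[i][d] = random.random() * vel[i][d] + acc[i][d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
    return best, best_val

print(gsa(lambda x: sum(v * v for v in x)))              # sphere benchmark
```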

  10. Optimal fractional order PID design via Tabu Search based algorithm.

    Science.gov (United States)

    Ateş, Abdullah; Yeroglu, Celaleddin

    2016-01-01

    This paper presents an optimization method based on the Tabu Search Algorithm (TSA) to design a Fractional-Order Proportional-Integral-Derivative (FOPID) controller. All parameter computations of the FOPID employ random initial conditions, using the proposed optimization method. Illustrative examples demonstrate the performance of the proposed FOPID controller design method. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.

  11. Novel genetic algorithm search procedure for LEED surface structure determination.

    Science.gov (United States)

    Viana, M L; dos Reis, D D; Soares, E A; Van Hove, M A; Moritz, W; de Carvalho, V E

    2014-06-04

    Low Energy Electron Diffraction (LEED) is one of the most powerful experimental techniques for surface structure analysis but until now only a trial-and-error approach has been successful. So far, fitting procedures developed to optimize structural and nonstructural parameters-by minimization of the R-factor-have had a fairly small convergence radius, suitable only for local optimization. However, the identification of the global minimum among the several local minima is essential for complex surface structures. Global optimization methods have been applied to LEED structure determination, but they still require starting from structures that are relatively close to the correct one, in order to find the final structure. For complex systems, the number of trial structures and the resulting computation time increase so rapidly that the task of finding the correct model becomes impractical using the present methodologies. In this work we propose a new search method, based on Genetic Algorithms, which is able to determine the correct structural model starting from completely random structures. This method-called here NGA-LEED for Novel Genetic Algorithm for LEED-utilizes bond lengths and symmetry criteria to select reasonable trial structures before performing LEED calculations. This allows a reduction of the parameter space and, consequently of the calculation time, by several orders of magnitude. A refinement of the parameters by least squares fit of simulated annealing is performed only at some intermediate stages and in the final step. The method was successfully tested for two systems, Ag(1 1 1)(4 × 4)-O and Au(1 1 0)-(1 × 2), both in theory versus theory and in theory versus experiment comparisons. Details of the implementation as well as the results for these two systems are presented.

  12. Computing gap free Pareto front approximations with stochastic search algorithms.

    Science.gov (United States)

    Schütze, Oliver; Laumanns, Marco; Tantar, Emilia; Coello, Carlos A Coello; Talbi, El-Ghazali

    2010-01-01

    Recently, a convergence proof of stochastic search algorithms toward finite size Pareto set approximations of continuous multi-objective optimization problems has been given. The focus was on obtaining a finite approximation that captures the entire solution set in some suitable sense, which was defined by the concept of epsilon-dominance. Though bounds on the quality of the limit approximation-which are entirely determined by the archiving strategy and the value of epsilon-have been obtained, the strategies do not guarantee to obtain a gap free approximation of the Pareto front. That is, such approximations A can reveal gaps in the sense that points f in the Pareto front can exist such that the distance of f to any image point F(a), a ∈ A, is "large." Since such gap free approximations are desirable in certain applications, and the related archiving strategies can be advantageous when memetic strategies are included in the search process, we are aiming in this work for such methods. We present two novel strategies that accomplish this task in the probabilistic sense and under mild assumptions on the stochastic search algorithm. In addition to the convergence proofs, we give some numerical results to visualize the behavior of the different archiving strategies. Finally, we demonstrate the potential for a possible hybridization of a given stochastic search algorithm with a particular local search strategy-multi-objective continuation methods-by showing that the concept of epsilon-dominance can be integrated into this approach in a suitable way.
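    The epsilon-dominance relation and the archive update it induces can be stated compactly; the following is a small illustrative sketch using an additive epsilon for a minimization problem, not the archiving strategies analyzed in the paper.

```python
def eps_dominates(a, b, eps):
    """True if objective vector a epsilon-dominates b (minimisation):
    a_i - eps <= b_i for every objective i."""
    return all(ai - eps <= bi for ai, bi in zip(a, b))

def update_archive(archive, candidate, eps):
    """Keep a set of mutually non-epsilon-dominated points: a candidate enters only if
    no archived point epsilon-dominates it, and it evicts the points it epsilon-dominates."""
    if any(eps_dominates(a, candidate, eps) for a in archive):
        return archive
    return [a for a in archive if not eps_dominates(candidate, a, eps)] + [candidate]

archive = []
for point in [(1.0, 4.0), (2.0, 3.1), (0.9, 4.05), (3.0, 1.0), (1.05, 3.95)]:
    archive = update_archive(archive, point, eps=0.1)
print(archive)   # a coarse, epsilon-spaced representation of the non-dominated points
```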

  13. Neural network design with combined backpropagation and creeping random search learning algorithms applied to the determination of retained austenite in TRIP steels; Diseno de redes neuronales con aprendizaje combinado de retropropagacion y busqueda aleatoria progresiva aplicado a la determinacion de austenita retenida en aceros TRIP

    Energy Technology Data Exchange (ETDEWEB)

    Toda-Caraballo, I.; Garcia-Mateo, C.; Capdevila, C.

    2010-07-01

    At the beginning of the decade of the nineties, the industrial interest for TRIP steels leads to a significant increase of the investigation and application in this field. In this work, the flexibility of neural networks for the modelling of complex properties is used to tackle the problem of determining the retained austenite content in TRIP-steel. Applying a combination of two learning algorithms (backpropagation and creeping-random-search) for the neural network, a model has been created that enables the prediction of retained austenite in low-Si / low-Al multiphase steels as a function of processing parameters. (Author). 34 refs.

  14. Design and economic investigation of shell and tube heat exchangers using Improved Intelligent Tuned Harmony Search algorithm

    Directory of Open Access Journals (Sweden)

    Oguz Emrah Turgut

    2014-12-01

    Full Text Available This study explores the thermal design of shell and tube heat exchangers by using the Improved Intelligent Tuned Harmony Search (I-ITHS) algorithm. Intelligent Tuned Harmony Search (ITHS) is an upgraded version of the harmony search algorithm which has the advantage of deciding between intensification and diversification processes by applying a proper pitch adjusting strategy. In this study, we aim to improve the search capacity of the ITHS algorithm by utilizing chaotic sequences instead of uniformly distributed random numbers and applying alternative search strategies inspired by the Artificial Bee Colony algorithm and Opposition Based Learning on promising areas (best solutions). Design variables including baffle spacing, shell diameter, tube outer diameter and number of tube passes are used to minimize the total cost of the heat exchanger, which incorporates capital investment and the sum of discounted annual energy expenditures related to pumping and heat exchanger area. Results show that I-ITHS can be utilized in optimizing shell and tube heat exchangers.

  15. A Cooperative Coevolutionary Cuckoo Search Algorithm for Optimization Problem

    Directory of Open Access Journals (Sweden)

    Hongqing Zheng

    2013-01-01

    Full Text Available Taking inspiration from an organizational evolutionary algorithm for numerical optimization, this paper designs a kind of dynamic population and combines evolutionary operators to form a novel algorithm, the cooperative coevolutionary cuckoo search algorithm (CCCS), for solving unconstrained and constrained optimization problems as well as engineering problems. A population of this algorithm consists of organizations, and an organization consists of dynamic individuals. In the experiments, fifteen unconstrained functions, eleven constrained functions, and two engineering design problems are used to validate the performance of the CCCS, and thorough comparisons are made between the CCCS and the existing approaches. The results show that the CCCS obtains good performance in solution quality. Moreover, for the constrained problems, good performance is obtained by incorporating only a simple constraint handling technique into the CCCS. The results show that the CCCS is quite robust and easy to use.
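    A compact sketch of the plain cuckoo search loop that the proposed CCCS builds on, with Lévy-flight steps drawn via Mantegna's method, is given below; the organizational and coevolutionary population structure of CCCS is not reproduced here, and parameter values such as the abandonment fraction pa are illustrative assumptions.

```python
import math
import random

def levy_step(beta=1.5):
    """One Lévy-flight step length via Mantegna's algorithm."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0, sigma)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def cuckoo_search(fitness, dim=2, n_nests=15, iters=300, pa=0.25, bounds=(-5.0, 5.0)):
    lo, hi = bounds
    clip = lambda x: min(max(x, lo), hi)
    nests = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_nests)]
    fits = [fitness(n) for n in nests]
    for _ in range(iters):
        best = nests[fits.index(min(fits))]
        for i in range(n_nests):
            # New solution by a Lévy flight biased toward the current best nest.
            cand = [clip(x + 0.01 * levy_step() * (x - b)) for x, b in zip(nests[i], best)]
            f = fitness(cand)
            j = random.randrange(n_nests)
            if f < fits[j]:                      # replace a randomly chosen worse nest
                nests[j], fits[j] = cand, f
        # Abandon a fraction pa of the worst nests and rebuild them at random.
        order = sorted(range(n_nests), key=lambda k: fits[k], reverse=True)
        for k in order[: int(pa * n_nests)]:
            nests[k] = [random.uniform(lo, hi) for _ in range(dim)]
            fits[k] = fitness(nests[k])
    b = fits.index(min(fits))
    return nests[b], fits[b]

print(cuckoo_search(lambda x: sum(v * v for v in x)))
```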

  16. Pipelining Memetic Algorithms, Constraint Satisfaction, and Local Search for Course Timetabling

    Science.gov (United States)

    Conant-Pablos, Santiago E.; Magaña-Lozano, Dulce J.; Terashima-Marín, Hugo

    This paper introduces a hybrid algorithm that combines local search and constraint satisfaction techniques with memetic algorithms for solving Course Timetabling hard problems. These problems require assigning a set of courses to a predetermined finite number of classrooms and periods of time, complying with a complete set of hard constraints while maximizing the consistency with a set of preferences (soft constraints). The algorithm works in a three-stage sequence: first, it creates an initial population of approximations to the solution by partitioning the variables that represent the courses and solving each partition as a constraint-satisfaction problem; second, it reduces the number of remaining hard and soft constraint violations applying a memetic algorithm; and finally, it obtains a complete and fully consistent solution by locally searching around the best memetic solution. The approach produces competitive results, always getting feasible solutions with a reduced number of soft constraints inconsistencies, when compared against the methods running independently.

  17. Moon Search Algorithms for NASA's Dawn Mission to Asteroid Vesta

    Science.gov (United States)

    Memarsadeghi, Nargess; Mcfadden, Lucy A.; Skillman, David R.; McLean, Brian; Mutchler, Max; Carsenty, Uri; Palmer, Eric E.

    2012-01-01

    A moon or natural satellite is a celestial body that orbits a planetary body such as a planet, dwarf planet, or an asteroid. Scientists seek understanding the origin and evolution of our solar system by studying moons of these bodies. Additionally, searches for satellites of planetary bodies can be important to protect the safety of a spacecraft as it approaches or orbits a planetary body. If a satellite of a celestial body is found, the mass of that body can also be calculated once its orbit is determined. Ensuring the Dawn spacecraft's safety on its mission to the asteroid Vesta primarily motivated the work of Dawn's Satellite Working Group (SWG) in summer of 2011. Dawn mission scientists and engineers utilized various computational tools and techniques for Vesta's satellite search. The objectives of this paper are to 1) introduce the natural satellite search problem, 2) present the computational challenges, approaches, and tools used when addressing this problem, and 3) describe applications of various image processing and computational algorithms for performing satellite searches to the electronic imaging and computer science community. Furthermore, we hope that this communication would enable Dawn mission scientists to improve their satellite search algorithms and tools and be better prepared for performing the same investigation in 2015, when the spacecraft is scheduled to approach and orbit the dwarf planet Ceres.

  18. Micro-seismic waveform matching inversion based on gravitational search algorithm and parallel computation

    Science.gov (United States)

    Jiang, Y.; Xing, H. L.

    2016-12-01

    Micro-seismic events induced by water injection, mining activity or oil/gas extraction are quite informative, and their interpretation can be applied to the reconstruction of the underground stress field and the monitoring of hydraulic fracturing progress in oil/gas reservoirs. The source characteristics and locations are crucial parameters required for these purposes, and they can be obtained through the waveform matching inversion (WMI) method. Therefore it is imperative to develop a WMI algorithm with high accuracy and convergence speed. Heuristic algorithms, as a category of nonlinear methods, possess a very high convergence speed and a good capacity to overcome local minima, and have been widely applied in many areas (e.g. image processing, artificial intelligence). However, their effectiveness for micro-seismic WMI is still poorly investigated; very few publications exist that address this subject. In this research an advanced heuristic algorithm, the gravitational search algorithm (GSA), is proposed to estimate the focal mechanism (angles of strike, dip and rake) and the source locations in three dimensions. Unlike traditional inversion methods, the heuristic algorithm inversion does not require the approximation of the Green's function. The method directly interacts with a CPU-parallelized finite difference forward modelling engine, updating the model parameters under GSA criteria. The effectiveness of this method is tested with synthetic data from a multi-layered elastic model; the results indicate that GSA can be successfully applied to WMI and has its unique advantages. Keywords: Micro-seismicity, Waveform matching inversion, gravitational search algorithm, parallel computation

  19. Design of Digital IIR Filter with Conflicting Objectives Using Hybrid Gravitational Search Algorithm

    Directory of Open Access Journals (Sweden)

    D. S. Sidhu

    2015-01-01

    Full Text Available In recent years, digital IIR filter design as a single objective optimization problem using evolutionary algorithms has gained much attention. In this paper, the digital IIR filter design is treated as a multiobjective problem by minimizing the magnitude response error, linear phase response error and optimal order simultaneously, along with meeting the stability criterion. A hybrid gravitational search algorithm (HGSA) has been applied to design the digital IIR filter. The GSA technique is hybridized with a binary successive approximation (BSA) based evolutionary search method for exploring the search space locally. The relative performance of GSA and hybrid GSA has been evaluated by applying these techniques to standard mathematical test functions. The proposed hybrid search techniques have been applied effectively to solve the multiparameter and multiobjective optimization problem of low-pass (LP), high-pass (HP), band-pass (BP), and band-stop (BS) digital IIR filter design. The obtained results reveal that the proposed technique performs better than other algorithms applied by other researchers for the design of digital IIR filters with conflicting objectives.

  20. Performance of genetic algorithms in search for water splitting perovskites

    DEFF Research Database (Denmark)

    Jain, A.; Castelli, Ivano Eligio; Hautier, G.

    2013-01-01

    We examine the performance of genetic algorithms (GAs) in uncovering solar water light splitters over a space of almost 19,000 perovskite materials. The entire search space was previously calculated using density functional theory to determine solutions that fulfill constraints on stability, band ... the performance of the GA can be further improved to approximately 12–17 times better than random search. We discuss the effect of population size, selection function, crossover function, mutation rate, fitness function, and elitism on the final result, finding that selection function and elitism are especially ...

  1. Random search optimization based on genetic algorithm and discriminant function

    Science.gov (United States)

    Kiciman, M. O.; Akgul, M.; Erarslanoglu, G.

    1990-01-01

    The general problem of optimization with arbitrary merit and constraint functions, which could be convex, concave, monotonic, or non-monotonic, is treated using stochastic methods. To improve the efficiency of the random search methods, a genetic algorithm for the search phase and a discriminant function for the constraint-control phase were utilized. The validity of the technique is demonstrated by comparing the results to published test problem results. Numerical experimentation indicated that for cases where a quick near optimum solution is desired, a general, user-friendly optimization code can be developed without serious penalties in both total computer time and accuracy.

  2. A Local-Search Algorithm for Steiner Forest

    OpenAIRE

    Groß, Martin; Gupta, Anupam; Kumar, Amit; Matuschke, Jannik; Schmidt, Daniel R.; Schmidt, Melanie; Verschae, José

    2018-01-01

    In the Steiner Forest problem, we are given a graph and a collection of source-sink pairs, and the goal is to find a subgraph of minimum total length such that all pairs are connected. The problem is APX-Hard and can be 2-approximated by, e.g., the elegant primal-dual algorithm of Agrawal, Klein, and Ravi from 1995. We give a local-search-based constant-factor approximation for the problem. Local search brings in new techniques to an area that has for long not seen any improvements and mi...

  3. Tabu Search Algorithm to Solve the Intermodal Terminal Location Problem

    Directory of Open Access Journals (Sweden)

    E. Karimi∗

    2015-03-01

    Full Text Available The establishment of appropriate terminals, as the main gates to the international, national and local transportation network, is important for economic performance, traffic safety and the reduction of environmental pollution. This paper focuses on the intermodal terminal location problem. The main objective of this problem is to determine which terminals from a set of candidate terminals should be opened such that the total cost is minimized. In this problem, customer demands are shipped either directly (without the use of terminals) between the origin and destination of the customers, or intermodally (by using two terminals), or even by a combination of both methods. Since this problem is NP-hard, metaheuristic algorithms such as tabu search (TS) are used to solve it. The algorithm is compared with the greedy randomized adaptive search procedure (GRASP) on instances of this problem. Results show the efficiency of TS in comparison with GRASP.
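    To illustrate the tabu search machinery on a location problem of this flavor, the following sketch flips one candidate terminal open or closed per move, keeps recently flipped terminals tabu for a fixed tenure, and uses a standard aspiration criterion; the cost model (fixed opening costs plus the cheaper of direct or two-terminal routing per customer) and all data are simplified assumptions rather than the authors' formulation.

```python
import itertools
import random

def total_cost(open_set, fixed, direct, via):
    """Fixed cost of the open terminals plus, for each customer, the cheaper of
    direct shipping and the best route through an ordered pair of open terminals."""
    cost = sum(fixed[t] for t in open_set)
    for c in range(len(direct)):
        best = direct[c]
        for k, l in itertools.permutations(open_set, 2):
            best = min(best, via[c][k][l])
        cost += best
    return cost

def tabu_search(n_terminals, fixed, direct, via, iters=100, tenure=3):
    current = set(random.sample(range(n_terminals), n_terminals // 2))
    best, best_cost = set(current), total_cost(current, fixed, direct, via)
    tabu = {}                                   # terminal -> iteration until which it is tabu
    for it in range(iters):
        candidates = []
        for t in range(n_terminals):            # neighbourhood: flip one terminal
            neigh = current ^ {t}
            c = total_cost(neigh, fixed, direct, via)
            # Allowed if not tabu, or if it beats the best cost found so far (aspiration).
            if tabu.get(t, -1) < it or c < best_cost:
                candidates.append((c, t, neigh))
        if not candidates:
            continue
        c, t, current = min(candidates)
        tabu[t] = it + tenure                    # forbid flipping t again for a while
        if c < best_cost:
            best, best_cost = set(current), c
    return best, best_cost

random.seed(1)
N, C = 4, 3                                      # 4 candidate terminals, 3 customers
fixed = [5.0, 4.0, 6.0, 3.0]
direct = [12.0, 15.0, 9.0]
via = [[[random.uniform(2, 10) for _ in range(N)] for _ in range(N)] for _ in range(C)]
print(tabu_search(N, fixed, direct, via))
```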

  4. A novel inverse kinematics algorithm based on unidirectional search

    Science.gov (United States)

    Lee, Young Dae; Kim, Kee Hwan

    2005-12-01

    In this paper, we propose a novel algorithm that can exactly solve the inverse kinematics of a robot manipulator which does not have a closed-form solution. The conventional numerical methods suggested by others have the weakness that they do not consider the feasible joint region of the solution. Moreover, they are likely to settle on local points rather than the real optimal solutions. In this work, we consider the inverse kinematics problem as a kind of fixed-point problem and propose a solution approach based on a unidirectional search and a boundary reflection algorithm, which can guarantee the global optimal solution and the feasible range of joint limits. Furthermore, we present an acceleration idea based upon Steffensen iteration, which can speed up the search. Simulation results show the validity and efficiency of our approach.
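
    The Steffensen acceleration mentioned above is the classical Aitken delta-squared update for a fixed-point iteration x = g(x). A minimal scalar version is sketched below; the paper's joint-space formulation and boundary-reflection step are not reproduced.

```python
import math

# Steffensen (Aitken delta-squared) acceleration of a scalar fixed-point iteration x = g(x).
def steffensen(g, x0, tol=1e-10, max_iter=100):
    x = x0
    for _ in range(max_iter):
        gx = g(x)
        ggx = g(gx)
        denom = ggx - 2.0 * gx + x
        if abs(denom) < 1e-15:               # numerically converged
            return gx
        x_new = x - (gx - x) ** 2 / denom    # accelerated update
        if abs(x_new - x) < tol:
            return x_new
        x = x_new
    return x

# Example: x = cos(x) converges to ~0.739085 in a handful of iterations.
print(steffensen(math.cos, 1.0))
```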

  5. Providing an imputation algorithm for missing values of longitudinal data using Cuckoo search algorithm: A case study on cervical dystonia.

    Science.gov (United States)

    Golabpour, Amin; Etminani, Kobra; Doosti, Hassan; Miri, Hamid Heidarian; Ghanbari, Reza

    2017-06-01

    Missing values in data are found in a large number of studies in the field of medical sciences, especially longitudinal ones, in which repeated measurements are taken from each person during the study. In this regard, several statistical endeavors have addressed the concepts, issues, and theoretical methods during the past few decades. Herein, we focused on the missing data related to patients excluded from longitudinal studies. To this end, two statistical parameters, similarity and the correlation coefficient, were employed. In addition, metaheuristic algorithms were applied to achieve an optimal solution. The selected metaheuristic algorithm, which has great search functionality, was the Cuckoo search algorithm. Profiles of subjects with cervical dystonia (CD) were used to evaluate the proposed model after applying missingness. It was concluded that the algorithm used in this study had a higher accuracy (98.48%) compared with similar approaches. Concomitant use of the similarity parameter and the correlation coefficient led to a significant increase in the accuracy of missing data imputation.

  6. Performance Analysis of Binary Search Algorithm in RFID

    Directory of Open Access Journals (Sweden)

    Xiangmei SONG

    2014-12-01

    Full Text Available The binary search algorithm (BS) is an important anti-collision algorithm in Radio Frequency Identification (RFID) and is one of the key technologies that determine whether the information in a tag is identified by the reader-writer quickly and reliably. The performance of BS directly affects the quality of service in the Internet of Things. This paper adopts an automated formal technique, probabilistic model checking, to analyze the performance of the BS algorithm formally. Firstly, according to the working principle of the BS algorithm, its dynamic behavior is abstracted into a Discrete Time Markov Chain, which can describe deterministic, discrete-time behavior with probabilistic choice. On this model we then calculate the probability that the data are sent successfully and the expected time for tags to complete the data transmission. Compared to S-ALOHA, another typical anti-collision protocol in RFID, experimental results show that with an increase in the number of tags the BS algorithm has lower space and time consumption, its average number of conflicts increases more slowly than under the S-ALOHA protocol, it needs less expected time to complete the data transmission, and its average data transmission speed is about 1.6 times that of the S-ALOHA protocol.
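
    A simplified way to see the splitting idea behind BS-style anti-collision is the recursive prefix query below; it is a toy simulation with invented tag IDs, not the actual air-interface protocol analysed in the paper.

```python
# Toy simulation of prefix splitting: a colliding set of tags is split by extending
# the queried ID prefix with 0 and 1 until every tag is singled out.
def identify(tags, prefix=""):
    """Return (identified tag IDs, number of reader queries used)."""
    responding = [t for t in tags if t.startswith(prefix)]
    if not responding:
        return [], 1                          # idle slot
    if len(responding) == 1:
        return responding, 1                  # successful read
    ids, queries = [], 1                      # collision: split on the next bit
    for bit in "01":
        sub_ids, sub_queries = identify(tags, prefix + bit)
        ids += sub_ids
        queries += sub_queries
    return ids, queries

print(identify(["0110", "0111", "1010", "1100"]))   # all four IDs plus the query count
```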

  7. A Study on the Optimization Performance of Fireworks and Cuckoo Search Algorithms in Laser Machining Processes

    Science.gov (United States)

    Goswami, D.; Chakraborty, S.

    2014-11-01

    Laser machining is a promising non-contact process for effective machining of difficult-to-process advanced engineering materials. Increasing interest in the use of lasers for various machining operations can be attributed to its several unique advantages, like high productivity, non-contact processing, elimination of finishing operations, adaptability to automation, reduced processing cost, improved product quality, greater material utilization, minimum heat-affected zone and green manufacturing. To achieve the best desired machining performance and high quality characteristics of the machined components, it is extremely important to determine the optimal values of the laser machining process parameters. In this paper, fireworks algorithm and cuckoo search (CS) algorithm are applied for single as well as multi-response optimization of two laser machining processes. It is observed that although almost similar solutions are obtained for both these algorithms, CS algorithm outperforms fireworks algorithm with respect to average computation time, convergence rate and performance consistency.

  8. 3rd International Conference on Harmony Search Algorithm

    CERN Document Server

    2017-01-01

    This book presents state-of-the-art technical contributions based around one of the most successful evolutionary optimization algorithms published to date: Harmony Search. Contributions span from novel technical derivations of this algorithm to applications in the broad fields of civil engineering, energy, transportation & mobility and health, among many others and focus not only on its cross-domain applicability, but also on its core evolutionary operators, including elements inspired from other meta-heuristics. The global scientific community is witnessing an upsurge in groundbreaking, new advances in all areas of computational intelligence, with a particular flurry of research focusing on evolutionary computation and bio-inspired optimization. Observed processes in nature and sociology have provided the basis for innovative algorithmic developments aimed at leveraging the inherent capability to adapt characterized by various animals, including ants, fireflies, wolves and humans. However, it is the beha...

  9. Penguins Search Optimisation Algorithm for Association Rules Mining

    Directory of Open Access Journals (Sweden)

    Youcef Gheraibia

    2016-06-01

    Full Text Available Association Rules Mining (ARM) is one of the most popular and well-known approaches for the decision-making process. All existing ARM algorithms are time-consuming and generate a very large number of association rules with high overlap. To deal with this issue, we propose a new ARM approach based on the penguins search optimization algorithm (Pe-ARM for short). Moreover, an efficient measure is incorporated into the main process to evaluate the amount of overlap among the generated rules. The proposed approach also ensures a good diversification over the whole solution space. To demonstrate the effectiveness of the proposed approach, several experiments have been carried out on different datasets, specifically on biological ones. The results reveal that the proposed approach outperforms the well-known ARM algorithms in both execution time and solution quality.

  10. Quantum discord and entanglement in grover search algorithm

    Directory of Open Access Journals (Sweden)

    Ye Bin

    2016-01-01

    Full Text Available Imperfections and noise in realistic quantum computers may seriously affect the accuracy of quantum algorithms. In this article we explore the impact of static imperfections on quantum entanglement, as well as on non-entangled quantum correlations, in Grover’s search algorithm. Using the metrics of concurrence and geometric quantum discord, we show that both the evolution of entanglement and that of quantum discord in Grover’s algorithm can be restrained with increasing strength of static imperfections. For very weak imperfections, the quantum entanglement and discord exhibit periodic behavior, while this periodicity will most certainly be destroyed by stronger imperfections. Moreover, entanglement sudden death may occur when the strength of static imperfections is greater than a certain threshold.

  11. Swarm, genetic and evolutionary programming algorithms applied to multiuser detection

    Directory of Open Access Journals (Sweden)

    Paul Jean Etienne Jeszensky

    2005-02-01

    Full Text Available In this paper, the particle swarm optimization technique, recently published in the literature, is analyzed, evaluated and compared when applied to Direct Sequence/Code Division Multiple Access (DS/CDMA) systems with multiuser detection (MuD). The efficiency of the swarm algorithm applied to DS-CDMA multiuser detection (Swarm-MuD) is compared through the trade-off between performance and computational complexity, with complexity expressed in terms of the number of operations necessary to reach the performance obtained through the optimum detector, i.e., the Maximum Likelihood (ML) detector. The comparison is carried out among the genetic algorithm, evolutionary programming with cloning and the swarm algorithm under the same simulation basis. Additionally, a heuristics-MuD complexity analysis based on the number of computational operations is proposed. Finally, an analysis is carried out of the input parameters of the swarm algorithm in an attempt to find the optimum (or near-optimum) parameters for the algorithm applied to the MuD problem.
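
    For reference, the generic particle swarm update evaluated above looks roughly as follows; the fitness function standing in for the multiuser-detection likelihood is a placeholder, and no CDMA signal model or bit mapping is included.

```python
import random

# Generic PSO kernel (maximization) with inertia, cognitive and social terms.
def pso(fitness, dim, n_particles=30, iters=100, w=0.7, c1=1.5, c2=1.5):
    X = [[random.uniform(-1, 1) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    pbest = [x[:] for x in X]
    pbest_f = [fitness(x) for x in X]
    g = max(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (pbest[i][d] - X[i][d])    # cognitive pull
                           + c2 * r2 * (gbest[d] - X[i][d]))      # social pull
                X[i][d] += V[i][d]
            f = fitness(X[i])
            if f > pbest_f[i]:
                pbest[i], pbest_f[i] = X[i][:], f
                if f > gbest_f:
                    gbest, gbest_f = X[i][:], f
    return gbest, gbest_f

print(pso(lambda x: -sum(v * v for v in x), dim=5))   # toy objective, optimum near the origin
```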

  12. Arc-Search Infeasible Interior-Point Algorithm for Linear Programming

    OpenAIRE

    Yang, Yaguang

    2014-01-01

    Mehrotra's algorithm has been the most successful infeasible interior-point algorithm for linear programming since 1990. Most popular interior-point software packages for linear programming are based on Mehrotra's algorithm. This paper proposes an alternative algorithm, arc-search infeasible interior-point algorithm. We will demonstrate, by testing Netlib problems and comparing the test results obtained by arc-search infeasible interior-point algorithm and Mehrotra's algorithm, that the propo...

  13. A Sustainable City Planning Algorithm Based on TLBO and Local Search

    Science.gov (United States)

    Zhang, Ke; Lin, Li; Huang, Xuanxuan; Liu, Yiming; Zhang, Yonggang

    2017-09-01

    Nowadays, how to design a city with more sustainable features has become a central problem in the field of social development, and it has provided a broad stage for the application of artificial intelligence theories and methods. Because the design of a sustainable city is essentially a constrained optimization problem, extensively researched swarm intelligence algorithms are natural candidates for solving it. TLBO (Teaching-Learning-Based Optimization) is a new swarm intelligence algorithm. Its inspiration comes from the “teaching” and “learning” behavior of a class: the evolution of the population is realized by simulating the “teaching” of the teacher and the students “learning” from each other. It has few parameters, is efficient, conceptually simple and easy to implement, and it has been successfully applied to scheduling, planning, configuration and other fields, where it achieved good results and has received increasing attention from artificial intelligence researchers. Based on the classical TLBO algorithm, we propose a TLBO_LS algorithm combined with local search. We design and implement the random generation algorithm and evaluation model for the urban planning problem. Experiments on small and medium-sized randomly generated problems show that the proposed algorithm has obvious advantages over the DE algorithm and the classical TLBO algorithm in terms of convergence speed and solution quality.
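
    A bare-bones sketch of the TLBO teacher and learner phases (without the paper's added local search or the city-planning objective) might look like this; the objective f, the bounds and the population settings are placeholders.

```python
import random

# Minimal TLBO for unconstrained minimization: teacher phase + learner phase.
def tlbo(f, dim, pop_size=20, iters=100, lo=-5.0, hi=5.0):
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(iters):
        # Teacher phase: pull learners toward the best solution, away from the class mean.
        teacher = pop[min(range(pop_size), key=lambda i: fit[i])]
        mean = [sum(x[d] for x in pop) / pop_size for d in range(dim)]
        for i in range(pop_size):
            tf = random.choice((1, 2))                                   # teaching factor
            cand = [pop[i][d] + random.random() * (teacher[d] - tf * mean[d]) for d in range(dim)]
            fc = f(cand)
            if fc < fit[i]:
                pop[i], fit[i] = cand, fc
        # Learner phase: learn pairwise from a randomly chosen classmate.
        for i in range(pop_size):
            j = random.randrange(pop_size)
            if j == i:
                continue
            sign = 1.0 if fit[j] < fit[i] else -1.0
            cand = [pop[i][d] + sign * random.random() * (pop[j][d] - pop[i][d]) for d in range(dim)]
            fc = f(cand)
            if fc < fit[i]:
                pop[i], fit[i] = cand, fc
    b = min(range(pop_size), key=lambda i: fit[i])
    return pop[b], fit[b]

print(tlbo(lambda x: sum(v * v for v in x), dim=4))   # toy objective, optimum at the origin
```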

  14. Nature-inspired novel Cuckoo Search Algorithm for genome ...

    Indian Academy of Sciences (India)

    Bioinformatics; Cuckoo search; genome sequence assembly; meta- heuristics. 1. Introduction. The innovation in the cutting edge of soft computing technology proposes solutions to solve many challenging bioinformatics problems. A heuristic optimization technique has been applied for most exciting research areas in ...

  15. Algorithm for Rapid Searching Among Star-Catalog Entries

    Science.gov (United States)

    Liebe, Carl Christian

    2006-01-01

    An algorithm searches a star catalog to identify guide stars within the field of view of a telescope or camera. The algorithm is fast: the number of computations needed to perform the search is approximately proportional to the logarithm of the number of stars in the catalog. The algorithm requires the prior organization of the star catalog into a hierarchy utilizing independent spherical coverings (see figure), such that each successively higher level contains fewer elements. In the lowest and most numerous level of the hierarchy, the elements are individual stars in the star catalog. The next higher level contains a spherical covering (a constellation of n points on a sphere that minimizes the maximum distance of any point on the sphere from the closest one of the n points), the next higher level contains a smaller spherical covering, and so forth, ending at the highest level, which contains one element representing the point of entry into the search structure. With necessary exceptions at the lowest and highest levels, each element at each level is labeled in terms of the element to which it is linked in the next higher level and the first element to which it is linked in the next lower level. Each element is also labeled in terms of (1) its coordinates on the celestial sphere and (2) the largest angular distance to any element in any lower level in the hierarchy. The elements at all levels of the hierarchy are numbered on a single list, such that the elements of each constellation at each level are numbered consecutively. The algorithm is recursive. The input required to start the algorithm comprises the coordinates of a point on the celestial sphere. Attention is then focused on individual elements of the hierarchy, starting from the topmost one, as follows: The angle between the input point and the element under consideration is calculated. If the calculated angle is larger than the sum of (1) the predetermined angle to the most distant element plus (2) the

  16. A Diverse Stochastic Search Algorithm for Combination Therapeutics

    Directory of Open Access Journals (Sweden)

    Mehmet Umut Caglar

    2014-01-01

    Full Text Available Background. Design of drug combination cocktails to maximize sensitivity for individual patients presents a challenge in terms of minimizing the number of experiments to attain the desired objective. The enormous number of possible drug combinations constrains exhaustive experimentation approaches, and personal variations in genetic diseases restrict the use of prior knowledge in optimization. Results. We present a stochastic search algorithm that consisted of a parallel experimentation phase followed by a combination of focused and diversified sequential search. We evaluated our approach on seven synthetic examples; four of them were evaluated twice with different parameters, and two biological examples of bacterial and lung cancer cell inhibition response to combination drugs. The performance of our approach as compared to recently proposed adaptive reference update approach was superior for all the examples considered, achieving an average of 45% reduction in the number of experimental iterations. Conclusions. As the results illustrate, the proposed diverse stochastic search algorithm can produce optimized combinations in relatively smaller number of iterative steps. This approach can be combined with available knowledge on the genetic makeup of the patient to design optimal selection of drug cocktails.

  17. Applying Cuckoo Search for analysis of LFSR based cryptosystem

    Directory of Open Access Journals (Sweden)

    Maiya Din

    2016-09-01

    Full Text Available Cryptographic techniques are employed for minimizing security hazards to sensitive information. To make systems more robust, the cyphers or crypts being used need to be analysed, for which cryptanalysts require ways to automate the process so that cryptographic systems can be tested more efficiently. Evolutionary algorithms provide one such resort, as they are capable of searching for the global optimal solution very quickly. The Cuckoo Search (CS) algorithm has been used effectively in cryptanalysis of conventional systems like the Vigenère and transposition cyphers. The Linear Feedback Shift Register (LFSR) is a crypto primitive used extensively in the design of cryptosystems. In this paper, we analyse an LFSR-based cryptosystem using Cuckoo Search to find the correct initial states of the LFSRs used. Primitive polynomials of degree 11, 13, 17 and 19 are considered to analyse text crypts of length 200, 300 and 400 characters. Optimal solutions were obtained for the following CS parameters: Lévy distribution parameter (β = 1.5) and alien-egg discovery probability (pa = 0.25).
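
    The Lévy-flight step with β = 1.5 quoted above can be generated with the standard Mantegna recipe; the sketch below shows only this step kernel and a generic cuckoo move, not the LFSR-state fitness function used in the paper.

```python
import math, random

# Mantegna-style generation of a Levy-flight step (beta = 1.5) plus a generic cuckoo move.
def levy_step(beta=1.5):
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma = (num / den) ** (1 / beta)
    u = random.gauss(0, sigma)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def cuckoo_move(nest, best, alpha=0.01):
    # New candidate = current nest + alpha * Levy step, biased toward the best nest.
    return [x + alpha * levy_step() * (x - b) for x, b in zip(nest, best)]

print(cuckoo_move([0.2, 0.8, 0.5], [0.1, 0.9, 0.4]))
```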

  18. A Hybrid Harmony Search Algorithm Approach for Optimal Power Flow

    Directory of Open Access Journals (Sweden)

    Mimoun YOUNES

    2012-08-01

    Full Text Available Optimal Power Flow (OPF) is one of the main functions of power system operation. It determines the optimal settings of generating units, bus voltages, transformer taps and shunt elements in a power system, with the objective of minimizing total production costs or losses while the system operates within its security limits. The aim of this paper is to propose a novel methodology (BCGAs-HSA) that solves OPF including both active and reactive power dispatch. It is based on combining the binary-coded genetic algorithm (BCGAs) and the harmony search algorithm (HSA) to determine the optimal global solution. This method was tested on the modified IEEE 30-bus test system. The results obtained by this method are compared with those obtained with BCGAs or HSA separately. The results show that the BCGAs-HSA approach can converge to the optimum solution with good accuracy compared to those reported recently in the literature.

  19. Tractable Algorithms for Proximity Search on Large Graphs

    Science.gov (United States)

    2010-07-01

    In this thesis, our main goal is to design fast algorithms for proximity search in large graphs. Our main focus is on investigating some useful random-walk-based proximity measures.

  20. Quantum Error Correction Protects Quantum Search Algorithms Against Decoherence

    Science.gov (United States)

    Botsinis, Panagiotis; Babar, Zunaira; Alanis, Dimitrios; Chandra, Daryus; Nguyen, Hung; Ng, Soon Xin; Hanzo, Lajos

    2016-01-01

    When quantum computing becomes a widespread commercial reality, Quantum Search Algorithms (QSA), and especially Grover’s QSA, will inevitably be one of their main applications, constituting their cornerstone. Most of the literature assumes that the quantum circuits are free from decoherence. Practically, decoherence will remain unavoidable, as is the Gaussian noise of classic circuits imposed by the Brownian motion of electrons, hence it may have to be mitigated. In this contribution, we investigate the effect of quantum noise on the performance of QSAs, in terms of their success probability as a function of the database size to be searched, when decoherence is modelled by the deleterious effects of depolarizing channels imposed on the quantum gates. Moreover, we employ quantum error correction codes for limiting the effects of quantum noise and for correcting quantum flips. More specifically, we demonstrate that, when we search for a single solution in a database having 4096 entries using Grover’s QSA at an aggressive depolarizing probability of 10^-3, the success probability of the search is 0.22 when no quantum coding is used, which is improved to 0.96 when Steane’s quantum error correction code is employed. Finally, apart from Steane’s code, the employment of Quantum Bose-Chaudhuri-Hocquenghem (QBCH) codes is also considered. PMID:27924865
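
    For context, the noiseless baseline implied above is the textbook Grover success probability sin²((2k+1)θ) with sinθ = 1/√N; the snippet below evaluates it for the 4096-entry database mentioned in the abstract.

```python
import math

# Ideal (decoherence-free) Grover success probability for a single marked item.
def grover_success(N, k):
    theta = math.asin(1.0 / math.sqrt(N))
    return math.sin((2 * k + 1) * theta) ** 2

N = 4096
k_opt = math.floor(math.pi / 4 * math.sqrt(N))      # about 50 Grover iterations
print(k_opt, grover_success(N, k_opt))              # success probability close to 1 without noise
```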

  1. Applied economic model development algorithm for electronics company

    Directory of Open Access Journals (Sweden)

    Mikhailov I.

    2017-01-01

    Full Text Available The purpose of this paper is to report experience gained in creating methods and algorithms that simplify the development of applied decision support systems. It describes an algorithm that is the result of two years of research and has had more than one year of practical verification. In the electronic-component testing business, the moment of contract conclusion is the point at which the most serious managerial mistakes can be made: at this stage, it is difficult to achieve a realistic assessment of the time limit and the wage fund for future work. Creating an estimation model is one way to solve this problem, and the article presents an algorithm for building such models. The algorithm is based on the example of developing an analytical model that serves to estimate the amount of work. The paper lists the algorithm’s stages and explains their meaning together with the participants’ goals. Implementation of the algorithm has made possible a twofold acceleration of the development of these models and fulfilment of management’s requirements. The resulting models have had a significant economic effect. A new set of tasks was identified for further theoretical study.

  2. Use of a genetic algorithm in the search for a near-optimal shielding design

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Byeong Soo [Korea Institute of Nuclear Safety, 34 Gwahak-ro, Yuseong-gu, Daejeon 305-338 (Korea, Republic of); Moon, Joo Hyun, E-mail: jhmoon86@dongguk.ac.k [Dongguk University, 707 Seokjang-dong, Gyeongju, Gyeongbuk 780-714 (Korea, Republic of)

    2010-02-15

    An optimization method based on a genetic algorithm (GA), referred to as MACroscopic Near-Optimal Shielding design (MACNOS), is proposed for the search for an optimal radiation shield configuration subject to a given set of constraints. In MACNOS, a GA is used to search for the optimal shielding design and a penalty strategy is employed to deal with the constraints. In order to confirm its capability to find the optimal shielding design, MACNOS is applied to a simple radiation shielding optimization problem for a hypothetical spaceship reactor. The application shows that, while keeping the constraints satisfied, MACNOS is able to find the shielding design that minimizes the total weight by changing the thickness and the material of the shield. It is therefore expected that MACNOS is potentially useful for finding the optimal design configuration in the conceptual design phase, where the selection of the shielding material and the estimation of its thickness are necessary.

  3. Gravitation search algorithm: Application to the optimal IIR filter design

    Directory of Open Access Journals (Sweden)

    Suman Kumar Saha

    2014-01-01

    Full Text Available This paper presents a global heuristic search optimization technique known as the Gravitational Search Algorithm (GSA) for the design of 8th-order Infinite Impulse Response (IIR) low-pass (LP), high-pass (HP), band-pass (BP) and band-stop (BS) filters, considering various non-linear characteristics of the filter design problem. This paper also adopts a novel fitness function in order to improve the stop-band attenuation to a great extent. In GSA, the law of gravity and mass interactions among different particles are adopted for handling the non-linear IIR filter design optimization problem. In this optimization technique, the searcher agents are a collection of masses and the interactions among them are governed by Newtonian gravity and the laws of motion. The performance of the GSA-based IIR filter designs has proven to be superior to that obtained by the real-coded genetic algorithm (RGA) and standard Particle Swarm Optimization (PSO). Extensive simulation results affirm that the proposed approach using GSA outperforms its counterparts not only in terms of quality of output, i.e., sharpness at cut-off, smaller pass-band ripple and higher stop-band attenuation, but also in the fastest convergence speed with assured stability.
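
    The core GSA update referred to above (fitness-derived masses, pairwise attraction, velocity and position updates) can be sketched as follows for a generic minimization problem; the IIR-filter fitness is a placeholder and the parameter values are illustrative, not those of the paper.

```python
import math, random

# One GSA iteration: mass assignment from fitness, Newtonian-style attraction, then
# velocity and position updates. G would normally decay over iterations.
def gsa_step(X, V, fitness, G, eps=1e-12):
    n, dim = len(X), len(X[0])
    fits = [fitness(x) for x in X]
    best, worst = min(fits), max(fits)
    m = [(f - worst) / (best - worst - eps) for f in fits]    # better agents get larger mass
    total = sum(m) + eps
    M = [mi / total for mi in m]
    for i in range(n):
        for d in range(dim):
            a = 0.0
            for j in range(n):
                if j == i:
                    continue
                R = math.dist(X[i], X[j])
                a += random.random() * G * M[j] * (X[j][d] - X[i][d]) / (R + eps)
            V[i][d] = random.random() * V[i][d] + a           # stochastic velocity update
            X[i][d] += V[i][d]
    return X, V

X = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(5)]
V = [[0.0, 0.0] for _ in range(5)]
print(gsa_step(X, V, lambda x: sum(v * v for v in x), G=1.0))   # toy objective
```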

  4. Optimization design of diffractive optical elements by genetic local search algorithms

    Science.gov (United States)

    Zhou, Guangya; Zhao, Xiaolin; Wang, Zongguang; Chen, Yi-Xin; Zhang, Mingsheng

    1999-05-01

    In this paper, a novel optimization algorithm, termed the genetic local search algorithm (GLSA), that combines a genetic algorithm (GA) with a local search technique is proposed to design DOEs. This hybrid algorithm performs an improved, more goal-oriented search compared to a pure GA. A 1:17 cross-pattern fan-out grating and a uniform focal-plane intensity profile generator are designed to demonstrate the proposed algorithm. Numerical results prove that the proposed algorithm is highly robust and efficient. High-quality DOEs are achieved using the proposed algorithm.

  5. Categorization and Searching of Color Images Using Mean Shift Algorithm

    Directory of Open Access Journals (Sweden)

    Prakash PANDEY

    2009-07-01

    Full Text Available Nowadays, image searching is still a challenging problem in content-based image retrieval (CBIR) systems. Most CBIR systems operate on all images without pre-sorting them, so the image search results contain many unrelated images. The aim of this research is to propose a new object-based indexing system based on extracting salient region representatives from the image, categorizing the image into different types, and searching for images that are similar to a given query image. In our approach, the color features are extracted using the mean shift algorithm, a robust clustering technique, and dominant objects are obtained by performing region grouping of segmented thumbnails. The category for an image is generated automatically by analyzing the image for the presence of a dominant object. The images in the database are clustered based on region feature similarity using Euclidean distance. Placing an image into a category can help the user navigate retrieval results more effectively. Extensive experimental results illustrate excellent performance.
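
    The mean shift step used for the color-feature clustering above is, in its simplest flat-kernel form, the iteration below; feature extraction, region grouping and the retrieval stage are not shown, and the sample points are invented.

```python
import math

# Flat-kernel mean-shift mode seeking: repeatedly move a point to the mean of its
# neighbors within bandwidth h until it settles on a density mode.
def mean_shift_point(x, points, h=1.0, tol=1e-4, max_iter=100):
    for _ in range(max_iter):
        neighbors = [p for p in points if math.dist(p, x) <= h]
        if not neighbors:
            return x
        mean = [sum(p[d] for p in neighbors) / len(neighbors) for d in range(len(x))]
        if math.dist(mean, x) < tol:
            return mean
        x = mean
    return x

pts = [(0.0, 0.0), (0.2, 0.1), (0.1, 0.3), (5.0, 5.0), (5.1, 4.9)]
print(mean_shift_point([0.1, 0.0], pts, h=1.0))   # converges to the mode of the first cluster
```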

  6. Applying Statistical Models and Parametric Distance Measures for Music Similarity Search

    Science.gov (United States)

    Lukashevich, Hanna; Dittmar, Christian; Bastuck, Christoph

    Automatic derivation of similarity relations between music pieces is an inherent field of music information retrieval research. Due to the nearly unrestricted amount of musical data, real-world similarity search algorithms have to be highly efficient and scalable. A possible solution is to represent each music excerpt with a statistical model (e.g., a Gaussian mixture model) and thus to reduce the computational costs by applying parametric distance measures between the models. In this paper we discuss combinations of different parametric modelling techniques and distance measures and weigh the benefits of each one against the others.

  7. A spectral clustering search algorithm for predicting shallow landslide size and location

    Science.gov (United States)

    Bellugi, Dino; Milledge, David G.; Dietrich, William E.; McKean, Jim A.; Perron, J. Taylor; Sudderth, Erik B.; Kazian, Brian

    2015-02-01

    The potential hazard and geomorphic significance of shallow landslides depend on their location and size. Commonly applied one-dimensional stability models do not include lateral resistances and cannot predict landslide size. Multidimensional models must be applied to specific geometries, which are not known a priori, and testing all possible geometries is computationally prohibitive. We present an efficient deterministic search algorithm based on spectral graph theory and couple it with a multidimensional stability model to predict discrete landslides in applications at scales broader than a single hillslope using gridded spatial data. The algorithm is general, assuming only that instability results when driving forces acting on a cluster of cells exceed the resisting forces on its margins and that clusters behave as rigid blocks with a failure plane at the soil-bedrock interface. This algorithm recovers predefined clusters of unstable cells of varying shape and size on a synthetic landscape, predicts the size, location, and shape of an observed shallow landslide using field-measured physical parameters, and is robust to modest changes in input parameters. The search algorithm identifies patches of potential instability within large areas of stable landscape. Within these patches will be many different combinations of cells with a Factor of Safety less than one, suggesting that subtle variations in local conditions (e.g., pore pressure and root strength) may determine the ultimate form and exact location at a specific site. Nonetheless, the tests presented here suggest that the search algorithm enables the prediction of shallow landslide size as well as location across landscapes.

  8. Analysis of Multivariate Experimental Data Using A Simplified Regression Model Search Algorithm

    Science.gov (United States)

    Ulbrich, Norbert Manfred

    2013-01-01

    A new regression model search algorithm was developed in 2011 that may be used to analyze both general multivariate experimental data sets and wind tunnel strain-gage balance calibration data. The new algorithm is a simplified version of a more complex search algorithm that was originally developed at the NASA Ames Balance Calibration Laboratory. The new algorithm has the advantage that it needs only about one tenth of the original algorithm's CPU time for the completion of a search. In addition, extensive testing showed that the prediction accuracy of math models obtained from the simplified algorithm is similar to the prediction accuracy of math models obtained from the original algorithm. The simplified algorithm, however, cannot guarantee that search constraints related to a set of statistical quality requirements are always satisfied in the optimized regression models. Therefore, the simplified search algorithm is not intended to replace the original search algorithm. Instead, it may be used to generate an alternate optimized regression model of experimental data whenever the application of the original search algorithm either fails or requires too much CPU time. Data from a machine calibration of NASA's MK40 force balance is used to illustrate the application of the new regression model search algorithm.

  9. SANA: simulated annealing far outperforms many other search algorithms for biological network alignment.

    Science.gov (United States)

    Mamano, Nil; Hayes, Wayne B

    2017-07-15

    Every alignment algorithm consists of two orthogonal components: an objective function M measuring the quality of an alignment, and a search algorithm that explores the space of alignments looking for ones scoring well according to M. We introduce a new search algorithm called SANA (Simulated Annealing Network Aligner) and apply it to protein-protein interaction networks using S3 as the topological measure. Compared against 12 recent algorithms, SANA produces 5-10 times as many correct node pairings as the others when the correct answer is known. We expose an anti-correlation in many existing aligners between their ability to produce good topological vs. functional similarity scores, whereas SANA usually outscores other methods in both measures. If given the perfect objective function encoding the identity mapping, SANA quickly converges to the perfect solution while many other algorithms falter. We observe that when aligning networks with a known mapping and optimizing only S3, SANA creates alignments that are not perfect and yet whose S3 scores match that of the perfect alignment. We call this phenomenon saturation of the topological score. Saturation implies that a measure's correlation with alignment correctness falters before the perfect alignment is reached. This, combined with SANA's ability to produce the perfect alignment if given the perfect objective function, suggests that better objective functions may lead to dramatically better alignments. We conclude that future work should focus on finding better objective functions, and offer SANA as the search algorithm of choice. Software available at http://sana.ics.uci.edu. Contact: whayes@uci.edu. Supplementary data are available at Bioinformatics online.
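
    SANA's search component is simulated annealing; a generic annealing skeleton of that kind (not SANA's actual code) is sketched below, with the alignment scoring function such as S3 and the neighbor move left as placeholders.

```python
import math, random

# Generic simulated annealing for maximization: always accept improving moves, accept
# worsening moves with probability exp(delta/T), and cool geometrically.
def anneal(state, score, neighbor, T0=1.0, alpha=0.9995, iters=20000):
    s, f = state, score(state)
    best, best_f = s, f
    T = T0
    for _ in range(iters):
        cand = neighbor(s)
        fc = score(cand)
        if fc >= f or random.random() < math.exp((fc - f) / max(T, 1e-12)):
            s, f = cand, fc
            if f > best_f:
                best, best_f = s, f
        T *= alpha                                  # geometric cooling schedule
    return best, best_f

# Toy usage: maximize -(x - 3)^2 by nudging a real number around.
print(anneal(0.0, lambda s: -(s - 3.0) ** 2, lambda s: s + random.uniform(-0.5, 0.5)))
```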

  10. Search of latent periodicity in amino acid sequences by means of genetic algorithm and dynamic programming.

    Science.gov (United States)

    Pugacheva, Valentina; Korotkov, Alexander; Korotkov, Eugene

    2016-10-01

    The aim of this study was to show that amino acid sequences have a latent periodicity with insertions and deletions of amino acids in unknown positions of the analyzed sequence. Genetic algorithm, dynamic programming and random weight matrices were used to develop a new mathematical algorithm for latent periodicity search. A multiple alignment of periods was calculated with help of the direct optimization of the position-weight matrix without using pairwise alignments. The developed algorithm was applied to analyze amino acid sequences of a small number of proteins. This study showed the presence of latent periodicity with insertions and deletions in the amino acid sequences of such proteins, for which the presence of latent periodicity was not previously known. The origin of latent periodicity with insertions and deletions is discussed.

  11. An Adaptive Large Neighborhood Search Algorithm for the Multi-mode RCPSP

    DEFF Research Database (Denmark)

    Muller, Laurent Flindt

    We present an Adaptive Large Neighborhood Search algorithm for the Multi-mode Resource-Constrained Project Scheduling Problem (MRCPSP). We incorporate techniques for deriving additional precedence relations and propose a new method, so-called mode-diminution, for removing modes during execution. ... These techniques make use of bound arguments, and we propose and experiment with three new bounds for the MRCPSP, in addition to bounds found in the literature. We propose a simple technique, so-called opportunistic mode-flipping, which can be applied whenever a schedule is generated, and which significantly ... so, some of the elements of the algorithm perform well, that is the bound arguments, the mode-removal procedure and, in particular, opportunistic mode-flipping, and these elements may perhaps be used to improve the results of other algorithms for this problem.

  12. A multi-queue branch-and-bound algorithm for anytime optimal search with biological applications.

    Science.gov (United States)

    Lathrop, R H; Sazhin, A; Sun, Y; Steffin, N; Irani, S S

    2001-01-01

    Many practical biological problems involve an intractable (NP-hard) search through a large space of possibilities. This paper describes preliminary results from a multi-queue variant of branch-and-bound search that combines anytime and optimal search behavior. The algorithm applies to problems whose solutions may be described by an N-dimensional vector. It produces an approximate solution quickly, then iteratively improves the result over time until a global optimum is produced. A global optimum may be produced before producing its proof of global optimality. Local minima are never revisited. We describe preliminary applications to ab initio protein backbone prediction, small drug-like molecule conformations, and protein-DNA binding motif discovery. The results are encouraging, although still quite preliminary.

  13. Global multipartite entanglement dynamics in Grover's search algorithm

    Science.gov (United States)

    Pan, Minghua; Qiu, Daowen; Zheng, Shenggen

    2017-09-01

    Entanglement is considered to be one of the primary reasons why quantum algorithms are more efficient than their classical counterparts for certain computational tasks. The global multipartite entanglement of the multiqubit states in Grover's search algorithm can be quantified using the geometric measure of entanglement (GME). Rossi et al. (Phys Rev A 87:022331, 2013) found that the entanglement dynamics is scale invariant for large n; namely, the GME does not depend on the number n of qubits but only on the ratio of the iteration k to the total number of iterations. In this paper, we discuss the optimization of the GME for large n. We prove that "the GME is scale invariant" does not always hold. We show that there is generally a turning point in the GME curve that can be computed in terms of the number of marked states and their Hamming weights. The GME is scale invariant prior to the turning point. However, the GME is not scale invariant after the turning point, since it also depends on n and the marked states.

  14. Binary Classification of Multigranulation Searching Algorithm Based on Probabilistic Decision

    Directory of Open Access Journals (Sweden)

    Qinghua Zhang

    2016-01-01

    Full Text Available Multigranulation computing, which adequately embodies the model of human intelligence in the process of solving complex problems, aims at decomposing a complex problem into many subproblems in different granularity spaces; the subproblems are then solved and synthesized to obtain the solution of the original problem. In this paper, an efficient binary-classification multigranulation searching algorithm is established which has the optimal mathematical expectation of the number of classifications needed to classify the objects of the whole domain. It can solve binary classification problems based on both the multigranulation computing mechanism and probability-statistic principles, such as the blood analysis case. Given the binary classifier, the negative sample ratio, and the total number of objects in the domain, this model can find the minimum mathematical expectation of the number of classifications and the optimal classification granularity spaces for mining all the negative samples. The experimental results demonstrate that, with the granules divided into many subgranules, the efficiency of the proposed method gradually increases and tends to be stable. In addition, the complexity of solving the problem is greatly reduced.

  15. Archiving, ordering and searching: search engines, algorithms, databases and deep mediatization

    DEFF Research Database (Denmark)

    Andersen, Jack

    2018-01-01

    This article argues that search engines, algorithms, and databases can be considered as a way of understanding deep mediatization (Couldry & Hepp, 2016). They are embedded in a variety of social and cultural practices, and as such they change our communicative actions to be shaped by their logic. ... Having reviewed recent trends in mediatization research, the argument is discussed and unfolded in between the material and the social constructivist-phenomenological interpretations of mediatization. In conclusion, it is discussed how deep this form of mediatization can be taken to be.

  16. Derivation and validation of a search algorithm to retrospectively identify mechanical ventilation initiation in the intensive care unit.

    Science.gov (United States)

    Smischney, Nathan J; Velagapudi, Venu M; Onigkeit, James A; Pickering, Brian W; Herasevich, Vitaly; Kashyap, Rahul

    2014-06-25

    The development and validation of automated electronic medical record (EMR) search strategies are important for establishing the timing of mechanical ventilation initiation in the intensive care unit (ICU). Thus, we sought to develop and validate an automated EMR search algorithm (strategy) for time zero, the moment of mechanical ventilation initiation in the critically ill patient. The EMR search algorithm was developed on the basis of several mechanical ventilation parameters, with the final parameter being positive end-expiratory pressure (PEEP), and was applied to a comprehensive institutional EMR database. The search algorithm was derived from a secondary retrospective analysis of a subset of 450 patients from a cohort of 2,684 patients admitted to a medical ICU and a surgical ICU from January 1, 2010, through December 31, 2011. It was then validated in an independent subset of 450 patients from the same period. The overall percent of agreement between our search algorithm and a comprehensive manual medical record review in the derivation and validation subsets, using peak inspiratory pressure (PIP) as the reference standard, was compared to assess timing of mechanical ventilation initiation. In the derivation subset, the automated electronic search strategy achieved an 87% (κ = 0.87) perfect agreement, with 94% agreement to within one minute. In validating this search algorithm, perfect agreement was found in 92% (κ = 0.92) of patients, with 99% agreement occurring within one minute. The use of an electronic search strategy resulted in highly accurate extraction of mechanical ventilation initiation in the ICU. The search algorithm of mechanical ventilation initiation is highly efficient and reliable and can facilitate both clinical research and patient care management in a timely manner.

  17. An Adaptation of an Algorithm of Search and Rescue Operations to Ship Manoeuvrability

    Directory of Open Access Journals (Sweden)

    Lech Kasyk

    2015-06-01

    Full Text Available This article presents an overview of an algorithm to facilitate action when planning search and rescue operations, taking into account actual hydro-meteorological conditions and the maneuverability of ships involved in the search.

  18. Optimal IIR filter design using Gravitational Search Algorithm with Wavelet Mutation

    Directory of Open Access Journals (Sweden)

    S.K. Saha

    2015-01-01

    Full Text Available This paper presents a global heuristic search optimization technique which is a hybridized version of the Gravitational Search Algorithm (GSA) and a Wavelet Mutation (WM) strategy. The resulting Gravitational Search Algorithm with Wavelet Mutation (GSAWM) is adopted for the design of an 8th-order infinite impulse response (IIR) filter. GSA is based on the interaction of masses situated in a small isolated world guided by an approximation of Newton's laws of gravity and motion. Each mass is represented by four parameters, namely position, active mass, passive mass and inertial mass. The position of the heaviest mass gives the near-optimal solution. For better exploitation in multidimensional search spaces, the WM strategy is applied to randomly selected particles, which enhances the capability of GSA for finding better near-optimal solutions. An extensive simulation study of low-pass (LP), high-pass (HP), band-pass (BP) and band-stop (BS) IIR filters unleashes the potential of GSAWM in achieving better cut-off frequency sharpness, smaller pass-band and stop-band ripples, smaller transition width and higher stop-band attenuation with assured stability.

  19. A bio-inspired swarm robot coordination algorithm for multiple target searching

    Science.gov (United States)

    Meng, Yan; Gan, Jing; Desai, Sachi

    2008-04-01

    The coordination of a multi-robot system searching for multiple targets is challenging in a dynamic environment, since the multi-robot system demands group coherence (agents need to have the incentive to work together faithfully) and group competence (agents need to know how to work together well). In our previously proposed bio-inspired coordination method, Local Interaction through Virtual Stigmergy (LIVS), one problem is the considerable randomness of the robot movement during coordination, which may lead to more power consumption and longer searching time. To address these issues, an adaptive LIVS (ALIVS) method is proposed in this paper, which not only considers the travel cost and target weight, but also predicts the target/robot ratio and potential robot redundancy with respect to the detected targets. Furthermore, a dynamic weight adjustment is also applied to improve the searching performance. This new method is a truly distributed method where each robot makes its own decision based on its local sensing information and the information from its neighbors. Basically, each robot only communicates with its neighbors through a virtual stigmergy mechanism and makes its local movement decision based on a particle swarm optimization (PSO) algorithm. The proposed ALIVS algorithm has been implemented on the embodied robot simulator Player/Stage in a target-searching scenario. The simulation results demonstrate its efficiency and robustness, in a power-efficient manner, under real-world constraints.

  20. Adaptive infinite impulse response system identification using modified-interior search algorithm with Lévy flight.

    Science.gov (United States)

    Kumar, Manjeet; Rawat, Tarun Kumar; Aggarwal, Apoorva

    2017-03-01

    In this paper, a new meta-heuristic optimization technique, called the interior search algorithm (ISA) with Lévy flight, is proposed and applied to determine the optimal parameters of an unknown infinite impulse response (IIR) system for the system identification problem. ISA is based on aesthetics, which is commonly used in interior design and decoration processes. In ISA, a composition phase and a mirror phase are applied for addressing nonlinear and multimodal system identification problems. System identification using the modified-ISA (M-ISA) based method involves faster convergence and single-parameter tuning and does not require derivative information, because it uses a stochastic random search based on the concepts of Lévy flight. A proper tuning of the control parameter has been performed in order to achieve a balance between the intensification and diversification phases. In order to evaluate the performance of the proposed method, mean square error (MSE), computation time and percentage improvement are considered as the performance measures. To validate the performance of the M-ISA based method, simulations have been carried out for three benchmark IIR systems using same-order and reduced-order models. Genetic algorithm (GA), particle swarm optimization (PSO), cat swarm optimization (CSO), cuckoo search algorithm (CSA), differential evolution using wavelet mutation (DEWM), firefly algorithm (FFA), craziness based particle swarm optimization (CRPSO), harmony search (HS) algorithm, opposition based harmony search (OHS) algorithm, hybrid particle swarm optimization-gravitational search algorithm (HPSO-GSA) and ISA are also used to model the same examples, and the simulation results are compared. The obtained results confirm the efficiency of the proposed method. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.

  1. Standard Sine Fitting Algorithms Applied To Blade Tip Timing Data

    Directory of Open Access Journals (Sweden)

    Kaźmierczak Krzysztof

    2014-12-01

    Full Text Available Blade Tip Timing (BTT) is a non-intrusive method to measure blade vibration in turbomachinery. The Time of Arrival (TOA) is recorded when a blade passes a stationary sensor. The measurement data, in the form of an undersampled (aliased) tip-deflection signal, are difficult to analyze with standard signal processing methods like digital filters or the Fourier transform. Several indirect methods are applied to process TOA sequences, such as reconstruction of the aliased spectrum and least-squares fitting to a harmonic oscillator model. We used the standard sine fitting algorithms provided by IEEE-STD-1057 to estimate blade vibration parameters. Blade-tip displacement was simulated in the time domain using an SDOF model, sampled by stationary sensors and then processed by the sinefit.m toolkit. We evaluated several configurations of different sensor placements, noise levels and numbers of data points. Results of the linear sine fitting, performed with the frequency known a priori, were compared with the non-linear ones. Some of the non-linear iterations were not convergent. The algorithms and testing results are intended to be used in the analysis of asynchronous blade vibration.
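
    The three-parameter case of IEEE-STD-1057 sine fitting (frequency known a priori) reduces to linear least squares; the sketch below shows that reduction on invented, sparse and nonuniform samples meant to mimic TOA-style data, and it is not the sinefit.m toolkit itself.

```python
import numpy as np

# Three-parameter sine fit with known frequency: y ~ A*cos(w*t) + B*sin(w*t) + C.
def sine_fit_3p(t, y, omega):
    D = np.column_stack([np.cos(omega * t), np.sin(omega * t), np.ones_like(t)])
    (A, B, C), *_ = np.linalg.lstsq(D, y, rcond=None)
    amplitude = np.hypot(A, B)
    phase = np.arctan2(-B, A)            # so that y = amplitude*cos(w*t + phase) + C
    return amplitude, phase, C

t = np.array([0.000, 0.013, 0.031, 0.042, 0.057, 0.069])
y = 2.0 * np.cos(2 * np.pi * 50 * t + 0.3) + 0.1
print(sine_fit_3p(t, y, 2 * np.pi * 50))  # recovers amplitude 2.0, phase 0.3, offset 0.1
```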

  2. An Analysis of the Quality of Repeated Plate Load Tests Using the Harmony Search Algorithm

    Directory of Open Access Journals (Sweden)

    Kook-Hwan Cho

    2014-01-01

    Full Text Available We report a characteristic function to determine whether repetitive static plate load test (RSPLT) moduli are within an acceptable quality range, based on a comparison between the initial loading and reloading moduli. The results of RSPLTs depend on the experience and expertise of the engineer carrying out the test, as well as on the loading device, hydraulic jack assembly and bearing plates. To identify outlier data points, well-tested data were used to develop a characteristic function model using a harmony search algorithm error-minimization technique. This measure was applied to determine the reliability of RSPLT data.

  3. Parametric optimization of ultrasonic machining process using gravitational search and fireworks algorithms

    Directory of Open Access Journals (Sweden)

    Debkalpa Goswami

    2015-03-01

    Full Text Available Ultrasonic machining (USM) is a mechanical material removal process used to erode holes and cavities in hard or brittle workpieces by using shaped tools, high-frequency mechanical motion and an abrasive slurry. Unlike other non-traditional machining processes, such as laser beam and electrical discharge machining, the USM process does not thermally damage the workpiece or introduce significant levels of residual stress, which is important for the survival of materials in service. To achieve enhanced machining performance and better machined-job characteristics, it is often required to determine the optimal control parameter settings of a USM process. Earlier mathematical approaches for parametric optimization of USM processes have mostly yielded near-optimal or sub-optimal solutions. In this paper, two almost unexplored non-conventional optimization techniques, i.e. the gravitational search algorithm (GSA) and the fireworks algorithm (FWA), are applied for parametric optimization of USM processes. The optimization performance of these two algorithms is compared with that of other popular population-based algorithms, and the effects of their algorithm parameters on the derived optimal solutions and computational speed are also investigated. It is observed that FWA provides the best optimal results for the considered USM processes.

  4. An Improved Cuckoo Search Optimization Algorithm for the Problem of Chaotic Systems Parameter Estimation.

    Science.gov (United States)

    Wang, Jun; Zhou, Bihua; Zhou, Shudao

    2016-01-01

    This paper proposes an improved cuckoo search (ICS) algorithm to estimate the parameters of chaotic systems. In order to improve the optimization capability of the basic cuckoo search (CS) algorithm, orthogonal design and a simulated annealing operation are incorporated into the CS algorithm to enhance its exploitation ability. The proposed algorithm is then used to estimate the parameters of the Lorenz chaotic system and the Chen chaotic system under noiseless and noisy conditions, respectively. The numerical results demonstrate that the algorithm can estimate parameters with high accuracy and reliability. Finally, the results are compared with the CS algorithm, a genetic algorithm, and a particle swarm optimization algorithm, and the comparison demonstrates that the method is energy-efficient and superior.

  5. Finding people, papers, and posts: Vertical search algorithms and evaluation

    NARCIS (Netherlands)

    Berendsen, R.W.

    2015-01-01

    There is a growing diversity of information access applications. While general web search has been dominant in the past few decades, a wide variety of so-called vertical search tasks and applications have come to the fore. Vertical search is an often used term for search that targets specific

  6. Neural networks and differential evolution algorithm applied for modelling the depollution process of some gaseous streams.

    Science.gov (United States)

    Curteanu, Silvia; Suditu, Gabriel Dan; Buburuzan, Adela Marina; Dragoi, Elena Niculina

    2014-11-01

    The depollution of some gaseous streams containing n-hexane is studied by adsorption in a fixed-bed column, under dynamic conditions, using granular activated carbon and two types of non-functionalized hypercross-linked polymeric resins. In order to model the process, a new neuro-evolutionary approach is proposed. It is a combination of a modified differential evolution (DE) algorithm with neural networks (NNs) and two local search algorithms, with global and local optimizers working together to determine the optimal NN model. The main elements that characterize the applied DE variant are an opposition-based learning initialization, a simple self-adaptive procedure for the control parameters, and a modified mutation principle based on the fitness function as a criterion for reorganization. The results obtained prove that the proposed algorithm is able to determine a good model of the considered process, its performance being better than that of an available phenomenological model.
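
    Two of the ingredients named above, opposition-based initialization and DE mutation/crossover, are sketched generically below; the self-adaptive control parameters, the fitness-based reorganization and the neural-network training objective from the paper are omitted, and f is a placeholder to be minimized.

```python
import random

# Generic DE/rand/1/bin with opposition-based initialization.
def de(f, dim, lo, hi, pop_size=30, F=0.5, CR=0.9, gens=200):
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    opp = [[lo + hi - v for v in ind] for ind in pop]            # opposition-based candidates
    pop = sorted(pop + opp, key=f)[:pop_size]                    # keep the better half
    fit = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = random.sample([j for j in range(pop_size) if j != i], 3)
            jrand = random.randrange(dim)
            trial = [pop[a][d] + F * (pop[b][d] - pop[c][d])                 # mutation
                     if (random.random() < CR or d == jrand) else pop[i][d]  # binomial crossover
                     for d in range(dim)]
            ft = f(trial)
            if ft <= fit[i]:                                     # greedy selection
                pop[i], fit[i] = trial, ft
    best_i = min(range(pop_size), key=lambda i: fit[i])
    return pop[best_i], fit[best_i]

print(de(lambda x: sum(v * v for v in x), dim=5, lo=-5.0, hi=5.0))   # toy objective
```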

  7. An Iterated Local Search Algorithm for Multi-Period Water Distribution Network Design Optimization

    Directory of Open Access Journals (Sweden)

    Annelies De Corte

    2016-08-01

    Full Text Available Water distribution networks consist of different components, such as reservoirs and pipes, and exist to provide users (households, agriculture, industry) with high-quality water at adequate pressure and flow. Water distribution network design optimization aims to find optimal diameters for every pipe, chosen from a limited set of commercially available diameters. This combinatorial optimization problem has received a lot of attention over the past forty years. In this paper, the well-studied single-period problem is extended to a multi-period setting in which time-varying demand patterns occur. Moreover, an additional constraint, which sets a maximum water velocity, is imposed. A metaheuristic technique called iterated local search is applied to tackle this challenging optimization problem. A full-factorial experiment is conducted to validate the added value of the algorithm components and to configure optimal parameter settings. The algorithm is tested on a broad range of 150 different (freely available) test networks.
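
    The iterated local search template applied above alternates local search, perturbation and acceptance; a generic skeleton is shown below, with the WDN-specific cost, local search and perturbation operators (e.g., pipe-diameter moves checked against pressure and velocity limits) left as user-supplied placeholders rather than the paper's implementation.

```python
# Generic iterated local search skeleton: descend to a local optimum, perturb, descend
# again, and keep the new solution if it is at least as good.
def iterated_local_search(x0, cost, local_search, perturb, iters=100):
    x = local_search(x0)
    best, best_cost = x, cost(x)
    for _ in range(iters):
        candidate = local_search(perturb(x))
        if cost(candidate) <= cost(x):        # acceptance criterion
            x = candidate
            if cost(x) < best_cost:
                best, best_cost = x, cost(x)
    return best, best_cost
```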

  8. Searching for full power control rod patterns in a boiling water reactor using genetic algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Montes, Jose Luis [Departamento Sistemas Nucleares, ININ, Carr. Mexico-Toluca Km. 36.5, Ocoyoacac, Edo. de Mexico (Mexico)]. E-mail: jlmt@nuclear.inin.mx; Ortiz, Juan Jose [Departamento Sistemas Nucleares, ININ, Carr. Mexico-Toluca Km. 36.5, Ocoyoacac, Edo. de Mexico (Mexico)]. E-mail: jjortiz@nuclear.inin.mx; Requena, Ignacio [Departamento Ciencias Computacion e I.A. ETSII, Informatica, Universidad de Granada, C. Daniel Saucedo Aranda s/n. 18071 Granada (Spain)]. E-mail: requena@decsai.ugr.es; Perusquia, Raul [Departamento Sistemas Nucleares, ININ, Carr. Mexico-Toluca Km. 36.5, Ocoyoacac, Edo. de Mexico (Mexico)]. E-mail: rpc@nuclear.inin.mx

    2004-11-01

    One of the most important questions related to both the safety and the economic aspects of nuclear power reactor operation is, without any doubt, reactivity control. During normal operation of a boiling water reactor, the reactivity control of its core is strongly determined by the efficiency of the control rod patterns. In this paper, the GACRP system, based on the concepts of genetic algorithms, is proposed for the full-power control rod pattern search. The system was developed using the characteristics of an LVNPP transition cycle and was also applied to an equilibrium cycle. Several operation scenarios, including core water flow variation throughout the cycle and different target axial power distributions, are considered. The genetic algorithm fitness function includes reactor safety parameters, such as MLHGR, MCPR, reactor k-eff and axial power density.

  9. Electric Load Forecasting Based on a Least Squares Support Vector Machine with Fuzzy Time Series and Global Harmony Search Algorithm

    Directory of Open Access Journals (Sweden)

    Yan Hong Chen

    2016-01-01

    This paper proposes a new electric load forecasting model by hybridizing the fuzzy time series (FTS) and global harmony search algorithm (GHSA) with least squares support vector machines (LSSVM), namely the GHSA-FTS-LSSVM model. Firstly, the fuzzy c-means (FCM) clustering algorithm is used to calculate the clustering center of each cluster. Secondly, the LSSVM is applied to model the resultant series, which is optimized by GHSA. Finally, a real-world example is adopted to test the performance of the proposed model. In this investigation, the proposed model is verified using experimental datasets from the Guangdong Province Industrial Development Database, and results are compared against the autoregressive integrated moving average (ARIMA) model and other algorithms hybridized with LSSVM, including the genetic algorithm (GA), particle swarm optimization (PSO), harmony search, and so on. The forecasting results indicate that the proposed GHSA-FTS-LSSVM model effectively generates more accurate predictive results.

  10. An Adaptive Large Neighborhood Search Algorithm for the Resource-constrained Project Scheduling Problem

    DEFF Research Database (Denmark)

    Muller, Laurent Flindt

    2009-01-01

    We present an application of an Adaptive Large Neighborhood Search (ALNS) algorithm to the Resource-constrained Project Scheduling Problem (RCPSP). The ALNS framework was first proposed by Pisinger and Røpke [19] and can be described as a large neighborhood search algorithm with an adaptive layer...

  11. A New Iterated Local Search Algorithm for Solving Broadcast Scheduling Problems in Packet Radio Networks

    Directory of Open Access Journals (Sweden)

    Chih-Chiang Lin

    2010-01-01

    The broadcast scheduling problem (BSP) in packet radio networks is a well-known NP-complete combinatorial optimization problem. Broadcast scheduling avoids packet collisions by allowing only one node to transmit in each collision domain of a time division multiple access (TDMA) network. It also improves transmission utilization by assigning one transmission time slot to one or more nodes, so that each node transmits at least once in each time frame. An optimum transmission schedule minimizes the length of a time frame while minimizing the number of idle nodes. In this paper, we propose a new iterated local search (ILS) algorithm that consists of two special perturbation and local search operators to solve BSPs. Computational experiments are applied to benchmark data sets and randomly generated problem instances. The experimental results show that our ILS approach is effective in solving the problems within only a small number of runs, even for very large networks with 2,500 nodes.

  12. SEARCH ALGORITHM FOR THE PARAMETRIC IDENTIFICATION OF THE ELECTRIC DRIVE OF THE MONITORING SYSTEM

    Directory of Open Access Journals (Sweden)

    A. S. Abufanas

    2017-01-01

    The paper considers the problem of parametric identification of a mathematical model of a technical system or device, in this case the electric drive of a monitoring system installed on an unmanned aerial vehicle. Identification of the parameters of the elements of a complex technical system is a relevant scientific task, since the synthesis and study of a new technical system require mathematical models of the elements of the system. It is proposed to solve the problem by applying a search-based gradient identification algorithm for a given objective residual function, defined as the difference between the output signal of the identified element of the system and that of its model. The random character of the processes occurring in the system and at the output of the output-signal meter is taken into account. The identification algorithm is developed on the basis of representing the model of parameters as an ordinary vector-matrix equation whose right-hand side contains a model of the driving influence in the form of a given deterministic function of time. A general structural diagram of the parametric identification search system with a gradient algorithm is presented. As an example for evaluating the operability of the proposed algorithm, the simplest model of an electric drive is considered, given by a transfer function in the form of an inertial (first-order) link. Qualitative illustrations of the operability of the proposed algorithm and quantitative characteristics of the signal and of the parameter changes of the identified object are presented.
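    To make the gradient-based identification idea concrete, here is a minimal Python sketch, not the authors' system: it fits the gain K and time constant T of an inertial (first-order) link to noisy step-response data by gradient descent on the squared output residual; all numerical values are illustrative.

      import numpy as np

      def model(t, K, T):
          """Step response of an inertial (first-order) link: y = K * (1 - exp(-t/T))."""
          return K * (1.0 - np.exp(-t / T))

      def identify(t, y_meas, K=1.0, T=1.0, lr=1e-3, steps=5000):
          """Gradient descent on the squared output residual with respect to K and T."""
          for _ in range(steps):
              r = model(t, K, T) - y_meas                              # output residual
              dK = 2.0 * np.sum(r * (1.0 - np.exp(-t / T)))            # dJ/dK
              dT = 2.0 * np.sum(r * (-K * t / T ** 2) * np.exp(-t / T))  # dJ/dT
              K -= lr * dK
              T -= lr * dT
          return K, T

      rng = np.random.default_rng(0)
      t = np.linspace(0.0, 5.0, 200)
      y = model(t, K=2.0, T=0.8) + 0.02 * rng.standard_normal(t.size)  # noisy "measurement"
      print(identify(t, y))  # should be close to (2.0, 0.8)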

  13. Improved Multiobjective Harmony Search Algorithm with Application to Placement and Sizing of Distributed Generation

    Directory of Open Access Journals (Sweden)

    Wanxing Sheng

    2014-01-01

    To solve a comprehensive multiobjective optimization problem, this study proposes an improved metaheuristic search algorithm that combines harmony search with the fast nondominated sorting approach, a novel intelligent optimization algorithm for multiobjective harmony search (MOHS). The algorithm is described and formulated in detail. Taking the optimal placement and sizing of distributed generation (DG) in a distribution power system as an example, the solution procedure of the proposed method is given. Simulation results on the modified IEEE 33-bus test system and a comparison with the NSGA-II algorithm show that the proposed MOHS can obtain promising results for engineering applications.

  14. Dynamic Inertia Weight Binary Bat Algorithm with Neighborhood Search

    Directory of Open Access Journals (Sweden)

    Xingwang Huang

    2017-01-01

    The binary bat algorithm (BBA) is a binary version of the bat algorithm (BA). It has been shown that BBA is competitive with other binary heuristic algorithms. Since the velocity update process of the algorithm is the same as in BA, in some cases this algorithm also faces the premature convergence problem. This paper proposes an improved binary bat algorithm (IBBA) to solve this problem. To evaluate the performance of IBBA, standard benchmark functions and zero-one knapsack problems have been employed. The numerical results obtained in the benchmark function experiments show that the proposed approach greatly outperforms the original BBA and binary particle swarm optimization (BPSO). Comparison with several other heuristic algorithms on zero-one knapsack problems also verifies that the proposed algorithm is better able to avoid local minima.

  15. A self-adaptive step Cuckoo search algorithm based on dimension by dimension improvement

    Directory of Open Access Journals (Sweden)

    Lu REN

    2015-10-01

    The choice of step length plays an important role in the convergence speed and precision of the cuckoo search algorithm. In this paper, a self-adaptive step cuckoo search algorithm based on dimension-by-dimension improvement is proposed. First, since the step in the original self-adaptive step cuckoo search algorithm is not updated when the current nest position is already the optimal position, a simple modification is made so that the step is still updated in this case. Second, an evaluation strategy based on dimension-by-dimension update is introduced into the modified self-adaptive step cuckoo search algorithm. The experimental results show that the algorithm can balance global convergence ability against optimization precision. Moreover, the proposed algorithm has better convergence speed.
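    The dimension-by-dimension evaluation strategy can be sketched generically in Python as follows; the Levy-flight step is the standard Mantegna construction commonly used in cuckoo search, and the greedy per-dimension acceptance is only an illustration of the idea, not the paper's exact update rule.

      import math
      import numpy as np

      def levy_step(dim, beta=1.5, rng=None):
          """Levy-distributed step lengths via Mantegna's algorithm (standard in cuckoo search)."""
          rng = np.random.default_rng(rng)
          sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
                   (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
          u = rng.normal(0.0, sigma, dim)
          v = rng.normal(0.0, 1.0, dim)
          return u / np.abs(v) ** (1 / beta)

      def dimensionwise_update(x, best, fitness, alpha=0.1, rng=None):
          """Apply a Levy move one coordinate at a time, keeping only improving changes."""
          rng = np.random.default_rng(rng)
          # Note: when x equals the best nest, this step vanishes, which is the degenerate
          # case the paper's modified step update is designed to handle.
          step = alpha * levy_step(x.size, rng=rng) * (x - best)
          current = x.copy()
          for d in range(x.size):
              trial = current.copy()
              trial[d] += step[d]
              if fitness(trial) < fitness(current):   # greedy per-dimension acceptance
                  current = trial
          return current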

  16. A Novel Quad Harmony Search Algorithm for Grid-Based Path Finding

    Directory of Open Access Journals (Sweden)

    Saso Koceski

    2014-09-01

    A novel approach to the problem of grid-based path finding is introduced. The method is a block-based search algorithm built on two algorithms: the quad-tree algorithm, which offers a great opportunity to decrease the time needed to compute the solution, and the harmony search (HS) algorithm, a metaheuristic used to obtain the optimal solution. This quad HS algorithm uses the quad-tree decomposition of free space in the grid to mark free areas and treat each of them as a single node, which greatly speeds up execution. The results of the quad HS algorithm have been compared to those of other metaheuristic algorithms, i.e., ant colony, genetic algorithm, particle swarm optimization and simulated annealing, and it obtained the best results in terms of time while giving the optimal path.

  17. An implementation of modified scatter search algorithm to transmission expansion planning

    OpenAIRE

    MEYMAND, Majid ZEINADDINI; RASHIDINEJAD, Masoud; KHORASANI, Hamid

    2011-01-01

    Transmission network expansion planning (TNEP) is one of the most important tasks in the field of power systems, especially in deregulated power system environments. TNEP is a nonlinear mixed integer programming problem that can be solved via hybrid heuristic algorithms. This paper presents a modified scatter search algorithm (MSSA) to reinforce the ordinary scatter search algorithm (SSA) to be equipped for handling large scale transmission expansion planning (TEP) problems. The prop...

  18. Multiobjective Variable Neighborhood Search algorithm for scheduling independent jobs on computational grid

    Directory of Open Access Journals (Sweden)

    S. Selvi

    2015-07-01

    Grid computing solves high-performance and high-throughput computing problems by sharing resources, ranging from personal computers to supercomputers, distributed around the world. As grid environments facilitate distributed computation, the scheduling of grid jobs has become an important issue. In this paper, an investigation of implementing a Multiobjective Variable Neighborhood Search (MVNS) algorithm for scheduling independent jobs on a computational grid is carried out. The performance of the proposed algorithm has been evaluated against the Min–Min algorithm, Simulated Annealing (SA) and the Greedy Randomized Adaptive Search Procedure (GRASP) algorithm. Simulation results show that the MVNS algorithm generally performs better than the other metaheuristic methods.

  19. Algorithm of search and track of static and moving large-scale objects

    Directory of Open Access Journals (Sweden)

    Kalyaev Anatoly

    2017-01-01

    We suggest an algorithm for processing an image sequence in order to search for and track static and moving large-scale objects. A possible software implementation of the algorithm, based on multithreaded CUDA processing, is suggested. An experimental analysis of the suggested algorithm implementation is performed.

  20. A Distributed Election and Spanning Tree Algorithm Based on Depth First Search Traversals

    DEFF Research Database (Denmark)

    Skyum, Sven

    The existence of an effective distributed traversal algorithm for a class of graphs has proven useful in connection with election problems for those classes. In this paper we show how a general traversal algorithm, such as depth first search, can be turned into an effective election algorithm using...... modular techniques. The presented method also constructs a spanning tree for the graph....

  1. An Educational System for Learning Search Algorithms and Automatically Assessing Student Performance

    Science.gov (United States)

    Grivokostopoulou, Foteini; Perikos, Isidoros; Hatzilygeroudis, Ioannis

    2017-01-01

    In this paper, first we present an educational system that assists students in learning and tutors in teaching search algorithms, an artificial intelligence topic. Learning is achieved through a wide range of learning activities. Algorithm visualizations demonstrate the operational functionality of algorithms according to the principles of active…

  2. A Hybrid Tabu Search and Genetic Algorithm for Solving the Economic Dispatch Problem

    Directory of Open Access Journals (Sweden)

    Bakhta NAAMA

    2013-06-01

    The application of optimization techniques to power system planning and operation problems has been an area of active research in the recent past. Genetic algorithms (GA) and tabu search (TS) have been widely used for combinatorial optimization in recent years. Combining the advantages of the individual algorithms, a hybrid TS/GA algorithm for solving the economic dispatch (ED) problem is proposed in this paper, using a penalty method to transform the constrained ED problem into a simple unconstrained one. An IEEE 57-bus power system has been used to test the proposed algorithm. Compared with GA and TS alone, the proposed TS/GA hybrid method has the strongest capability of finding the global optimal solution within reasonable computing time. We then give a comparison between two hybrid algorithms: tabu search/genetic algorithm (TS/GA) and tabu search/quasi-Newton method (TS/QN).
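    The penalty transformation described above can be illustrated with a short, generic Python sketch; the three-unit quadratic cost data, limits and demand are invented placeholders, not values from the paper.

      import numpy as np

      # Hypothetical 3-unit quadratic cost coefficients (a + b*P + c*P^2), limits and demand.
      A = np.array([100.0, 120.0, 90.0])
      B = np.array([2.0, 1.8, 2.2])
      C = np.array([0.010, 0.012, 0.009])
      P_MIN = np.array([50.0, 40.0, 30.0])
      P_MAX = np.array([200.0, 180.0, 150.0])
      DEMAND = 400.0

      def penalized_cost(p, rho=1.0e4):
          """Fuel cost plus quadratic penalties for power-balance and limit violations."""
          fuel = np.sum(A + B * p + C * p ** 2)
          balance = (np.sum(p) - DEMAND) ** 2
          limits = np.sum(np.maximum(P_MIN - p, 0.0) ** 2 + np.maximum(p - P_MAX, 0.0) ** 2)
          return float(fuel + rho * (balance + limits))

      # Any unconstrained optimizer (GA, TS, or a hybrid) can now minimize penalized_cost.
      print(penalized_cost(np.array([150.0, 140.0, 110.0])))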

  3. Dynamic Harmony Search with Polynomial Mutation Algorithm for Valve-Point Economic Load Dispatch.

    Science.gov (United States)

    Karthikeyan, M; Raja, T Sree Ranga

    2015-01-01

    The economic load dispatch (ELD) problem is an important issue in the operation and control of modern power systems. The ELD problem is complex and nonlinear, with equality and inequality constraints that make it hard to solve efficiently. This paper presents a new modification of the harmony search (HS) algorithm, named the dynamic harmony search with polynomial mutation (DHSPM) algorithm, to solve the ELD problem. In the DHSPM algorithm the key parameters of the HS algorithm, the harmony memory considering rate (HMCR) and the pitch adjusting rate (PAR), are changed dynamically, so there is no need to predefine these parameters. Additionally, polynomial mutation is inserted into the updating step of the HS algorithm to favor exploration and exploitation of the search space. The DHSPM algorithm is tested on three power system cases consisting of 3, 13, and 40 thermal units. The computational results show that the DHSPM algorithm is more effective in finding better solutions than other computational-intelligence-based methods.
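    A single harmony-search improvisation step with dynamically varied HMCR and PAR can be sketched as below; the linear schedules and bandwidth value are illustrative assumptions, and the polynomial mutation operator of DHSPM is not reproduced here. The lower and upper bounds are assumed to be NumPy arrays of the same dimension as the harmonies.

      import numpy as np

      def improvise(memory, lower, upper, it, max_it, bw=0.05, rng=None):
          """One harmony improvisation with HMCR growing and PAR shrinking over the run."""
          rng = np.random.default_rng(rng)
          frac = it / max_it
          hmcr = 0.70 + (0.99 - 0.70) * frac        # dynamic harmony memory considering rate
          par = 0.45 - (0.45 - 0.10) * frac         # dynamic pitch adjusting rate
          dim = memory.shape[1]
          new = np.empty(dim)
          for d in range(dim):
              if rng.random() < hmcr:               # take the value from harmony memory
                  new[d] = memory[rng.integers(memory.shape[0]), d]
                  if rng.random() < par:            # pitch adjustment around the chosen value
                      new[d] += bw * rng.uniform(-1.0, 1.0) * (upper[d] - lower[d])
              else:                                 # random selection within the bounds
                  new[d] = rng.uniform(lower[d], upper[d])
          return np.clip(new, lower, upper)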

  4. Applying Biomimetic Algorithms for Extra-Terrestrial Habitat Generation

    Science.gov (United States)

    Birge, Brian

    2012-01-01

    The objective is to simulate and optimize distributed cooperation among a network of robots tasked with cooperative excavation on an extra-terrestrial surface, and additionally to examine the concept of directed Emergence among a group of limited artificially intelligent agents. Emergence is the concept of achieving complex results from very simple rules or interactions. For example, in a termite mound each individual termite does not carry a blueprint of how to make the colony's home in a global sense, but their interactions, based strictly on local desires, create a complex superstructure. Applying this Emergence concept to a simulation of cooperative agents (robots) allows an examination of how well a non-directed group strategy achieves specific results. Specifically, the simulation will be a testbed to evaluate population-based robotic exploration and cooperative strategies while leveraging the evolutionary teamwork approach in the face of uncertainty about the environment and partial loss of sensors. Checking against a cost function and 'social' constraints will optimize cooperation when excavating a simulated tunnel. Agents will act locally with non-local results. The rules by which the simulated robots interact will be optimized to the simplest possible for the desired result, leveraging Emergence. Sensor malfunction and line-of-sight issues will be incorporated into the simulation. This approach falls under swarm robotics, a subset of robot control concerned with finding ways to control large groups of robots. Swarm robotics often contains biologically inspired approaches; research comes from observation of social insects but also from data on herding, schooling, and flocking animals. Biomimetic algorithms applied to manned space exploration is the method under consideration for further study.

  5. Improving GPU-accelerated adaptive IDW interpolation algorithm using fast kNN search.

    Science.gov (United States)

    Mei, Gang; Xu, Nengxiong; Xu, Liangliang

    2016-01-01

    This paper presents an efficient parallel Adaptive Inverse Distance Weighting (AIDW) interpolation algorithm on a modern Graphics Processing Unit (GPU). The presented algorithm is an improvement of our previous GPU-accelerated AIDW algorithm, obtained by adopting fast k-nearest neighbors (kNN) search. In AIDW, several nearest neighboring data points need to be found for each interpolated point to adaptively determine the power parameter; the desired prediction value of the interpolated point is then obtained by weighted interpolation using the power parameter. In this work, we develop a fast kNN search approach based on the space-partitioning data structure, an even grid, to improve the previous GPU-accelerated AIDW algorithm. The improved algorithm is composed of the stages of kNN search and weighted interpolation. To evaluate the performance of the improved algorithm, we perform five groups of experimental tests. The experimental results indicate that: (1) the improved algorithm can achieve a speedup of up to 1017 over the corresponding serial algorithm; (2) the improved algorithm is at least two times faster than our previous GPU-accelerated AIDW algorithm; and (3) the use of fast kNN search can significantly improve the computational efficiency of the entire GPU-accelerated AIDW algorithm.
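    The two stages of the interpolation (kNN search, then weighted interpolation) can be shown with a compact Python sketch; a brute-force kNN stands in for the grid-based GPU search, and the adaptive choice of the power parameter is not reproduced.

      import numpy as np

      def idw_interpolate(points, values, query, k=8, power=2.0):
          """IDW prediction at one query point from its k nearest neighbours (brute-force kNN)."""
          d = np.linalg.norm(points - query, axis=1)
          nearest = np.argsort(d)[:k]          # kNN search stage
          dn = d[nearest]
          if dn[0] == 0.0:                     # query coincides with a data point
              return float(values[nearest[0]])
          w = 1.0 / dn ** power                # inverse-distance weights
          return float(np.sum(w * values[nearest]) / np.sum(w))

      # Toy usage on random 2-D scattered data.
      rng = np.random.default_rng(0)
      pts = rng.random((1000, 2))
      vals = np.sin(pts[:, 0]) + np.cos(pts[:, 1])
      print(idw_interpolate(pts, vals, np.array([0.5, 0.5])))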

  6. Benchmarking Heuristic Search and Optimisation Algorithms in Matlab

    OpenAIRE

    Luo, Wuqiao; Li, Yun

    2016-01-01

    With the proliferating development of heuristic methods, it has become challenging to choose the most suitable one for an application at hand. This paper evaluates the performance of the heuristic algorithms available in Matlab, as their performance is problem dependent and parameter sensitive. Further, the paper attempts to address the challenge that no satisfactory benchmark exists to evaluate all the algorithms against the same standard. The paper tests five heuristic algorithms in Matlab, the Nelder-Mead simp...

  7. Search and optimization by metaheuristics techniques and algorithms inspired by nature

    CERN Document Server

    Du, Ke-Lin

    2016-01-01

    This textbook provides a comprehensive introduction to nature-inspired metaheuristic methods for search and optimization, including the latest trends in evolutionary algorithms and other forms of natural computing. Over 100 different types of these methods are discussed in detail. The authors emphasize non-standard optimization problems and utilize a natural approach to the topic, moving from basic notions to more complex ones. An introductory chapter covers the necessary biological and mathematical backgrounds for understanding the main material. Subsequent chapters then explore almost all of the major metaheuristics for search and optimization created based on natural phenomena, including simulated annealing, recurrent neural networks, genetic algorithms and genetic programming, differential evolution, memetic algorithms, particle swarm optimization, artificial immune systems, ant colony optimization, tabu search and scatter search, bee and bacteria foraging algorithms, harmony search, biomolecular computin...

  8. Genetic Local Search Algorithm for Optimization Design of Diffractive Optical Elements

    Science.gov (United States)

    Zhou, Guangya; Chen, Yixin; Wang, Zongguang; Song, Hongwei

    1999-07-01

    We propose a genetic local search algorithm (GLSA) for the optimization design of diffractive optical elements (DOEs). This hybrid algorithm incorporates the advantages of both the genetic algorithm (GA) and local search techniques, and appears better able to locate the global minimum than a canonical GA. Sample cases investigated here include the optimization design of binary-phase Dammann gratings, continuous surface-relief grating array generators, and a uniform top-hat focal plane intensity profile generator. Two GLSAs are investigated, incorporating the hill-climbing method and the simulated annealing algorithm, respectively, as the local search technique. Numerical experimental results demonstrate that the proposed algorithm is highly efficient and robust. DOEs that have high diffraction efficiency and excellent uniformity can be achieved by use of the proposed algorithm.

  9. In Search Of The Consensus Among Musical Pattern Discovery Algorithms

    NARCIS (Netherlands)

    Ren, Iris Yuping; Koops, Vincent; Volk, Anja; Swierstra, Wouter

    2017-01-01

    Patterns are an essential part of music and there are many different algorithms that aim to discover them. Based on the improvements brought by using data fusion methods to find the consensus of algorithms on other MIR tasks, we hypothesize that fusing the output from musical pattern discovery

  10. Pattern Nulling of Linear Antenna Arrays Using Backtracking Search Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Kerim Guney

    2015-01-01

    An evolutionary method based on the backtracking search optimization algorithm (BSA) is proposed for linear antenna array pattern synthesis with prescribed nulls at interference directions. Pattern nulling is obtained by controlling only the amplitude, position, and phase of the antenna array elements. BSA is an innovative metaheuristic technique based on an iterative process. Various numerical examples of linear array patterns with prescribed single, multiple, and wide nulls are given to illustrate the performance and flexibility of BSA. The results obtained by BSA are compared with the results of the following seventeen algorithms: particle swarm optimization (PSO), genetic algorithm (GA), modified touring ant colony algorithm (MTACO), quadratic programming method (QPM), bacterial foraging algorithm (BFA), bees algorithm (BA), clonal selection algorithm (CLONALG), plant growth simulation algorithm (PGSA), tabu search algorithm (TSA), memetic algorithm (MA), nondominated sorting GA-2 (NSGA-2), multiobjective differential evolution (MODE), decomposition with differential evolution (MOEA/D-DE), comprehensive learning PSO (CLPSO), harmony search algorithm (HSA), seeker optimization algorithm (SOA), and mean variance mapping optimization (MVMO). The simulation results show that linear antenna array synthesis using BSA provides low side-lobe levels and deep null levels.

  11. A Fast PDE Algorithm Using Adaptive Scan and Search for Video Coding

    Science.gov (United States)

    Kim, Jong-Nam

    In this paper, we propose an algorithm that reduces unnecessary computations while keeping the same prediction quality as the full search algorithm. In the proposed algorithm, unnecessary computations are reduced efficiently by calculating an initial matching error point from the first 1/N partial errors, which increases the probability of hitting the minimum error point as early as possible. Our algorithm decreases the computational load by about 20% compared with the conventional PDE algorithm, without any degradation of prediction quality. The algorithm would be useful in real-time video coding applications using the MPEG-2/4 AVC standards.
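    The core partial distortion elimination test (abort a candidate block as soon as its accumulated partial error exceeds the best error found so far) can be sketched as follows; the 1/N initial-error heuristic and the adaptive scan order of the paper are not reproduced.

      import numpy as np

      def sad_with_pde(block, candidate, best_so_far):
          """Row-wise SAD with partial distortion elimination (returns None if pruned)."""
          partial = 0.0
          for row in range(block.shape[0]):
              partial += float(np.abs(block[row].astype(np.int32) -
                                      candidate[row].astype(np.int32)).sum())
              if partial >= best_so_far:   # cannot beat the current best match, stop early
                  return None
          return partial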

  12. Numerical Algorithms for Personalized Search in Self-organizing Information Networks

    CERN Document Server

    Kamvar, Sep

    2010-01-01

    This book lays out the theoretical groundwork for personalized search and reputation management, both on the Web and in peer-to-peer and social networks. Representing much of the foundational research in this field, the book develops scalable algorithms that exploit the graphlike properties underlying personalized search and reputation management, and delves into realistic scenarios regarding Web-scale data. Sep Kamvar focuses on eigenvector-based techniques in Web search, introducing a personalized variant of Google's PageRank algorithm, and he outlines algorithms--such as the now-famous quad

  13. Elephant swarm water search algorithm for global optimization

    Indian Academy of Sciences (India)

    S Mandal

    2018-02-07

    Feb 7, 2018 ... been tested against a benchmark problem of computational biology, i.e., inference of a Gene Regulatory Network based on Recurrent ... Local search: perform a local water search or update the elephant velocity using Eq. (2); update the position using Eq. (3); evaluate the fitness values ...

  14. Detecting outlying studies in meta-regression models using a forward search algorithm.

    Science.gov (United States)

    Mavridis, Dimitris; Moustaki, Irini; Wall, Melanie; Salanti, Georgia

    2017-06-01

    When considering data from many trials, it is likely that some of them present a markedly different intervention effect or exert an undue influence on the summary results. We develop a forward search algorithm for identifying outlying and influential studies in meta-analysis models. The forward search algorithm starts by fitting the hypothesized model to a small subset of likely outlier-free studies and proceeds by adding studies into the set one-by-one, choosing at each step the study closest to the fitted model of the existing set. As each study is added to the set, plots of estimated parameters and measures of fit are monitored to identify outliers by sharp changes in the forward plots. We apply the proposed outlier detection method to two real data sets: a meta-analysis of 26 studies that examines the effect of writing-to-learn interventions on academic achievement adjusting for three possible effect modifiers, and a meta-analysis of 70 studies that compares a fluoride toothpaste treatment to placebo for preventing dental caries in children. A simple simulated example is used to illustrate the steps of the proposed methodology, and a small-scale simulation study is conducted to evaluate the performance of the proposed method. Copyright © 2016 John Wiley & Sons, Ltd.

  15. Hybrid fuzzy charged system search algorithm based state estimation in distribution networks

    Directory of Open Access Journals (Sweden)

    Sachidananda Prasad

    2017-06-01

    This paper proposes a new hybrid charged system search (CSS) algorithm for state estimation in radial distribution networks in a fuzzy framework. The objective of the optimization problem is to minimize the weighted square of the difference between the measured and the estimated quantities. The proposed method of state estimation considers the bus voltage magnitude and phase angle as state variables, along with some equality and inequality constraints for state estimation in distribution networks. A rule-based fuzzy inference system has been designed to control the parameters of the CSS algorithm to achieve a better balance between the exploration and exploitation capabilities of the algorithm. The efficiency of the proposed fuzzy adaptive charged system search (FACSS) algorithm has been tested on the standard IEEE 33-bus system and an Indian 85-bus practical radial distribution system. The obtained results have been compared with those of the conventional CSS algorithm, the weighted least squares (WLS) algorithm and particle swarm optimization (PSO) to show the feasibility of the algorithm.
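    The weighted-squared-residual objective minimized by the estimator can be written generically as follows; the measurement function h and the data in the toy usage are placeholders, not the paper's network model.

      import numpy as np

      def wls_objective(x, z, h, weights):
          """J(x) = sum_i w_i * (z_i - h_i(x))^2, the quantity the search minimizes."""
          residual = z - h(x)
          return float(np.sum(weights * residual ** 2))

      # Toy usage with a linear measurement model z = H x + noise.
      H = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 2.0]])
      x_true = np.array([1.0, 0.5])
      z = H @ x_true + np.array([0.01, -0.02, 0.005])
      print(wls_objective(np.array([1.0, 0.5]), z, lambda x: H @ x, np.ones(3)))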

  16. Journal of Fundamental and Applied Sciences: Advanced Search

    African Journals Online (AJOL)


  17. Bayero Journal of Pure and Applied Sciences: Advanced Search

    African Journals Online (AJOL)


  18. Global Journal of Pure and Applied Sciences: Advanced Search

    African Journals Online (AJOL)


  19. Nigerian Journal of Basic and Applied Sciences: Advanced Search

    African Journals Online (AJOL)


  20. Journal of Applied Science and Technology: Advanced Search

    African Journals Online (AJOL)


  1. Nigeria Journal of Pure and Applied Physics: Advanced Search

    African Journals Online (AJOL)


  2. West African Journal of Applied Ecology: Advanced Search

    African Journals Online (AJOL)


  3. Hybrid Artificial Bee Colony Algorithm and Particle Swarm Search for Global Optimization

    Directory of Open Access Journals (Sweden)

    Wang Chun-Feng

    2014-01-01

    The artificial bee colony (ABC) algorithm is one of the most recent swarm intelligence based algorithms and has been shown to be competitive with other population-based algorithms. However, there is still an insufficiency in ABC regarding its solution search equation, which is good at exploration but poor at exploitation. To overcome this problem, we propose a novel artificial bee colony algorithm based on a particle swarm search mechanism. In this algorithm, to improve the convergence speed, the initial population is first generated by using good point set theory rather than random selection. Secondly, in order to enhance the exploitation ability, the employed bees, onlookers, and scouts utilize the mechanism of PSO to search for new candidate solutions. Finally, to further improve the searching ability, a chaotic search operator is applied to the best solution of the current iteration. Our algorithm is tested on some well-known benchmark functions and compared with other algorithms. The results show that our algorithm has good performance.

  4. Search algorithms as a framework for the optimization of drug combinations.

    Directory of Open Access Journals (Sweden)

    Diego Calzolari

    2008-12-01

    Combination therapies are often needed for effective clinical outcomes in the management of complex diseases, but presently they are generally based on empirical clinical experience. Here we suggest a novel application of search algorithms, originally developed for digital communication, modified to optimize combinations of therapeutic interventions. In biological experiments measuring the restoration of the decline with age in heart function and exercise capacity in Drosophila melanogaster, we found that search algorithms correctly identified optimal combinations of four drugs using only one-third of the tests performed in a fully factorial search. In experiments identifying combinations of three doses of up to six drugs for selective killing of human cancer cells, search algorithms resulted in a highly significant enrichment of selective combinations compared with random searches. In simulations using a network model of cell death, we found that the search algorithms identified the optimal combinations of 6-9 interventions in 80-90% of tests, compared with 15-30% for an equivalent random search. These findings suggest that modified search algorithms from information theory have the potential to enhance the discovery of novel therapeutic drug combinations. This report also helps to frame a biomedical problem that will benefit from an interdisciplinary effort and suggests a general strategy for its solution.

  5. Iterated local search and record-to-record travel applied to the fixed charge transportation problem

    DEFF Research Database (Denmark)

    Andersen, Jeanne; Klose, Andreas

    , transportation costs do, however, include a fixed charge. Iterated local search and record-to-record travel are both simple local search based metaheuristics that, to our knowledge, have not yet been applied to the FCTP. In this paper, we apply both types of search strategies and combine them into a single...
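    Record-to-record travel is a threshold-acceptance scheme: a move is accepted whenever its cost stays within a fixed deviation of the best (record) cost found so far. A minimal generic Python sketch follows; the neighbourhood and cost functions are placeholders, not the paper's FCTP operators.

      import random

      def record_to_record_travel(initial, cost, neighbor, deviation, iterations=10000, rng=None):
          """Accept a neighbouring solution if its cost is below RECORD + deviation."""
          rng = rng or random.Random()
          current = initial
          record_sol, record = initial, cost(initial)
          for _ in range(iterations):
              candidate = neighbor(current, rng)
              c = cost(candidate)
              if c < record + deviation:       # acceptance test against the record
                  current = candidate
                  if c < record:               # new record found
                      record_sol, record = candidate, c
          return record_sol, record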

  6. Efficient algorithm for binary search enhancement | Bennett | Journal ...

    African Journals Online (AJOL)

    Journal of Computer Science and Its Application.

  7. Search Algorithms for Software-Only Real-Time Recognition with Very Large Vocabularies

    National Research Council Canada - National Science Library

    Nguyen, Long; Schwartz, Richard; Kubala, Francis; Placeway, Paul

    1993-01-01

    .... We present a history of several advances in search algorithms which, together, have made it possible to implement real-time recognition of large vocabularies on a single workstation without the need...

  8. Memetic Algorithms with Local Search Chains in R: The Rmalschains Package

    Directory of Open Access Journals (Sweden)

    Christoph Bergmeir

    2016-12-01

    Global optimization is an important field of research both in mathematics and computer science. It has applications in nearly all fields of modern science and engineering. Memetic algorithms are powerful problem solvers in the domain of continuous optimization, as they offer a trade-off between exploration of the search space using an evolutionary algorithm scheme and focused exploitation of promising regions with a local search algorithm. In particular, we describe the memetic algorithms with local search chains (MA-LS-Chains) paradigm, and the R package Rmalschains, which implements it. MA-LS-Chains has proven to be effective compared to other algorithms, especially in high-dimensional problem solving. In an experimental study, we demonstrate the advantages of using Rmalschains for high-dimensional optimization problems in comparison to other optimization methods already available in R.

  9. Online Order Priority Evaluation Based on Hybrid Harmony Search Algorithm of Optimized Support Vector Machines

    OpenAIRE

    Yuanyuan Zhao; Qian Chen

    2014-01-01

    Online order priority evaluation is currently the weak point of online order evaluation models used for production planning. This thesis proposes an online order priority evaluation model based on a hybrid harmony search algorithm with an optimized support vector machine (HHS-SVM). Firstly, an online order priority evaluation index system is built, and a support vector machine is adopted to build the online order priority evaluation model; secondly, the harmony search algorithm is used to optimize the...

  10. Solving the wind farm layout optimization problem using random search algorithm

    DEFF Research Database (Denmark)

    Feng, Ju; Shen, Wen Zhong

    2015-01-01

    Wind farm (WF) layout optimization is to find the optimal positions of wind turbines (WTs) inside a WF, so as to maximize and/or minimize a single objective or multiple objectives, while satisfying certain constraints. In this work, a random search (RS) algorithm based on continuous formulation...... by expert guesses or other optimization methods, and as an optimization tool to find the optimal layout of a WF with a certain number of WTs. A new strategy to evaluate layouts is also used, which can greatly reduce the computation cost. This method is first applied to a widely studied ideal test problem.... In this application, it is also found that in order to get consistent and reliable optimization results, up to 360 or more sectors for wind direction have to be used. Finally, considering the inevitable inter-annual variations in the wind conditions, the robustness of the optimized layouts against wind condition...

  11. Using a multi-objective genetic algorithm for developing aerial sensor team search strategies

    Science.gov (United States)

    Ridder, Jeffrey P.; Herweg, Jared A.; Sciortino, John C., Jr.

    2008-04-01

    Finding certain associated signals in the modern electromagnetic environment can prove a difficult task due to signal characteristics and associated platform tactics as well as the systems used to find these signals. One approach to finding such signal sets is to employ multiple small unmanned aerial systems (UASs) equipped with RF sensors in a team to search an area. The search environment may be partially known, but with a significant level of uncertainty as to the locations and emissions behavior of the individual signals and their associated platforms. The team is likely to benefit from a combination of using uncertain a priori information for planning and online search algorithms for dynamic tasking of the team. Two search algorithms are examined for effectiveness: Archimedean spirals, in which the UASs comprising the team do not respond to the environment, and artificial potential fields, in which they use environmental perception and interactions to dynamically guide the search. A multi-objective genetic algorithm (MOGA) is used to explore the desirable characteristics of search algorithms for this problem using two performance objectives. The results indicate that the MOGA can successfully use uncertain a priori information to set the parameters of the search algorithms. Also, we find that artificial potential fields may result in good performance, but that each of the fields has a different contribution that may be appropriate only in certain states.

  12. Hybrid Genetic Algorithm - Local Search Method for Ground-Water Management

    Science.gov (United States)

    Chiu, Y.; Nishikawa, T.; Martin, P.

    2008-12-01

    Ground-water management problems commonly are formulated as a mixed-integer, non-linear programming problem (MINLP). Relying only on conventional gradient-search methods to solve the management problem is computationally fast; however, the methods may become trapped in a local optimum. Global-optimization schemes can identify the global optimum, but the convergence is very slow when the optimal solution approaches the global optimum. In this study, we developed a hybrid optimization scheme, which includes a genetic algorithm and a gradient-search method, to solve the MINLP. The genetic algorithm identifies a near-optimal solution, and the gradient search uses the near optimum to identify the global optimum. Our methodology is applied to a conjunctive-use project in the Warren ground-water basin, California. Hi-Desert Water District (HDWD), the primary water manager in the basin, plans to construct a wastewater treatment plant to reduce future septic-tank effluent from reaching the ground-water system. The treated wastewater instead will recharge the ground-water basin via percolation ponds as part of a larger conjunctive-use strategy, subject to State regulations (e.g. minimum distances and travel times). HDWD wishes to identify the least-cost conjunctive-use strategies that control ground-water levels, meet regulations, and identify new production-well locations. As formulated, the MINLP objective is to minimize water-delivery costs subject to constraints including pump capacities, available recharge water, water-supply demand, water-level constraints, and potential new-well locations. The methodology was demonstrated by an enumerative search of the entire feasible solution space and by comparing the optimum solution with results from the branch-and-bound algorithm. The results also indicate that the hybrid method identifies the global optimum within an affordable computation time. Sensitivity analyses, which include testing different recharge-rate scenarios, pond
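    The two-stage idea (a population-based global stage followed by gradient-based refinement) can be shown in miniature; here a crude random-sampling stage stands in for the genetic algorithm, SciPy's L-BFGS-B minimizer stands in for the conventional gradient search, and the multimodal objective is only a toy placeholder for the management model.

      import numpy as np
      from scipy.optimize import minimize

      def objective(x):
          """Toy multimodal stand-in for the water-management cost function."""
          return float(np.sum(x ** 2) + 10.0 * np.sum(1.0 - np.cos(2.0 * np.pi * x)))

      rng = np.random.default_rng(1)
      # Stage 1: global search (a random sample stands in for the genetic algorithm).
      population = rng.uniform(-5.0, 5.0, size=(2000, 2))
      near_opt = population[np.argmin([objective(p) for p in population])]
      # Stage 2: gradient-based local refinement started from the near-optimal point.
      result = minimize(objective, near_opt, method="L-BFGS-B")
      print(near_opt, "->", result.x, result.fun)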

  13. Applying Genetic Algorithms To Query Optimization in Document Retrieval.

    Science.gov (United States)

    Horng, Jorng-Tzong; Yeh, Ching-Chang

    2000-01-01

    Proposes a novel approach to automatically retrieve keywords and then uses genetic algorithms to adapt the keyword weights. Discusses Chinese text retrieval, term frequency rating formulas, vector space models, bigrams, the PAT-tree structure for information retrieval, query vectors, and relevance feedback. (Author/LRW)

  14. Data classification using metaheuristic Cuckoo Search technique for Levenberg Marquardt back propagation (CSLM) algorithm

    Science.gov (United States)

    Nawi, Nazri Mohd.; Khan, Abdullah; Rehman, M. Z.

    2015-05-01

    Nature-inspired metaheuristic techniques provide derivative-free solutions to complex problems. One of the latest additions to this group of nature-inspired optimization procedures is the cuckoo search (CS) algorithm. Artificial neural network (ANN) training is an optimization task, since the aim is to find the optimal weight set of a neural network during the training process. Traditional training algorithms have some limitations, such as getting trapped in local minima and a slow convergence rate. This study proposes a new technique, CSLM, which combines the best features of two known algorithms, back-propagation (BP) and the Levenberg-Marquardt (LM) algorithm, to improve the convergence speed of ANN training and to avoid the local minima problem. Some selected benchmark classification datasets are used for simulation. The experimental results show that the proposed cuckoo search with Levenberg-Marquardt algorithm performs better than the other algorithms used in this study.

  15. Inversion for Refractivity Parameters Using a Dynamic Adaptive Cuckoo Search with Crossover Operator Algorithm

    Directory of Open Access Journals (Sweden)

    Zhihua Zhang

    2016-01-01

    Using the RFC technique to estimate refractivity parameters is a complex nonlinear optimization problem. In this paper, an improved cuckoo search (CS) algorithm is proposed to deal with this problem. To enhance the performance of the CS algorithm, a dynamic adaptive parameter operation and a crossover operation were integrated into the standard CS (DACS-CO). Rechenberg's 1/5 criterion combined with a learning factor was used to control the dynamic adaptive parameter adjustment process. The crossover operation of the genetic algorithm was utilized to guarantee population diversity. The new hybrid algorithm has better local search ability and contributes to superior performance. To verify the ability of the DACS-CO algorithm to estimate atmospheric refractivity parameters, both simulation data and real radar clutter data are used. The numerical experiments demonstrate that the DACS-CO algorithm can provide an effective method for near-real-time estimation of the atmospheric refractivity profile from radar clutter.

  16. A Teaching Approach from the Exhaustive Search Method to the Needleman-Wunsch Algorithm

    Science.gov (United States)

    Xu, Zhongneng; Yang, Yayun; Huang, Beibei

    2017-01-01

    The Needleman-Wunsch algorithm has become one of the core algorithms in bioinformatics; however, this programming requires more suitable explanations for students with different major backgrounds. By supposing sample sequences and using a simple storage system, the connection between the exhaustive search method and the Needleman-Wunsch algorithm…
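    For reference, the dynamic-programming core of the Needleman-Wunsch algorithm fits in a few lines of Python; the scoring scheme (+1 match, -1 mismatch, -1 gap) and the sample sequences are the commonly used textbook values, not necessarily those of the article.

      def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-1):
          """Global alignment score of sequences a and b by dynamic programming."""
          n, m = len(a), len(b)
          # score[i][j] = best score aligning a[:i] with b[:j]
          score = [[0] * (m + 1) for _ in range(n + 1)]
          for i in range(1, n + 1):
              score[i][0] = i * gap
          for j in range(1, m + 1):
              score[0][j] = j * gap
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                  score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
          return score[n][m]

      print(needleman_wunsch("GCATGCU", "GATTACA"))  # 0 with this scoring scheme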

  17. On the use of harmony search algorithm in the training of wavelet neural networks

    Science.gov (United States)

    Lai, Kee Huong; Zainuddin, Zarita; Ong, Pauline

    2015-10-01

    Wavelet neural networks (WNNs) are a class of feedforward neural networks that have been used in a wide range of industrial and engineering applications to model the complex relationships between given inputs and outputs. The training of WNNs involves the configuration of the weight values between neurons. The backpropagation training algorithm, which is a gradient-descent method, can be used for this training purpose. Nonetheless, the solutions found by this algorithm often get trapped at local minima. In this paper, a harmony search-based algorithm is proposed for the training of WNNs. The training of WNNs can thus be formulated as a continuous optimization problem, where the objective is to maximize the overall classification accuracy. Each candidate solution proposed by the harmony search algorithm represents a specific WNN architecture. In order to speed up the training process, the solution space is divided into disjoint partitions during the random initialization step of the harmony search algorithm. The proposed training algorithm is tested on three benchmark problems from the UCI machine learning repository, as well as one real-life application, namely, the classification of electroencephalography signals in the task of epileptic seizure detection. The results obtained show that the proposed algorithm outperforms the traditional harmony search algorithm in terms of overall classification accuracy.

  18. From Schrödinger's equation to the quantum search algorithm

    Indian Academy of Sciences (India)

    The paper also provides a self-contained introduction to quantum computing algorithms from a new perspective. ... quantum mechanics. This subsection briefly mentions the concepts needed to understand the quantum search algorithm; it is by no means a comprehensive review of quantum ..... If we break up the evolution.

  19. New Search Space Reduction Algorithm for Vertical Reference Trajectory Optimization

    Directory of Open Access Journals (Sweden)

    Alejandro MURRIETA-MENDOZA

    2016-06-01

    Burning the fuel required to sustain a given flight releases pollution such as carbon dioxide and nitrogen oxides, and the amount of fuel consumed is also a significant expense for airlines. It is desirable to reduce fuel consumption to reduce both pollution and flight costs. To increase fuel savings in a given flight, one option is to compute the most economical vertical reference trajectory (or flight plan). A deterministic algorithm was developed using a numerical aircraft performance model to determine the most economical vertical flight profile considering take-off weight, flight distance, step climb and weather conditions. This algorithm is based on linear interpolations of the performance model using the Lagrange interpolation method. The algorithm downloads the latest available forecast from Environment Canada according to the departure date and flight coordinates, and calculates the optimal trajectory taking into account the effects of wind and temperature. Techniques to avoid unnecessary calculations are implemented to reduce the computation time. The costs of the reference trajectories proposed by the algorithm are compared with the costs of the reference trajectories proposed by a commercial flight management system, using the fuel consumption estimated by the FlightSim® simulator made by Presagis®.

  20. A tabu search evalutionary algorithm for multiobjective optimization: Application to a bi-criterion aircraft structural reliability problem

    Science.gov (United States)

    Long, Kim Chenming

    Real-world engineering optimization problems often require the consideration of multiple conflicting and noncommensurate objectives, subject to nonconvex constraint regions in a high-dimensional decision space. Further challenges occur for combinatorial multiobjective problems in which the decision variables are not continuous. Traditional multiobjective optimization methods of operations research, such as weighting and epsilon constraint methods, are ill-suited to solving these complex, multiobjective problems. This has given rise to the application of a wide range of metaheuristic optimization algorithms, such as evolutionary, particle swarm, simulated annealing, and ant colony methods, to multiobjective optimization. Several multiobjective evolutionary algorithms have been developed, including the strength Pareto evolutionary algorithm (SPEA) and the non-dominated sorting genetic algorithm (NSGA), for determining the Pareto-optimal set of non-dominated solutions. Although numerous researchers have developed a wide range of multiobjective optimization algorithms, there is a continuing need to construct computationally efficient algorithms with an improved ability to converge to globally non-dominated solutions along the Pareto-optimal front for complex, large-scale, multiobjective engineering optimization problems. This is particularly important when the multiple objective functions and constraints of the real-world system cannot be expressed in explicit mathematical representations. This research presents a novel metaheuristic evolutionary algorithm for complex multiobjective optimization problems, which combines the metaheuristic tabu search algorithm with the evolutionary algorithm (TSEA), as embodied in genetic algorithms. TSEA is successfully applied to bicriteria (i.e., structural reliability and retrofit cost) optimization of the aircraft tail structure fatigue life, which increases its reliability by prolonging fatigue life. A comparison for this

  1. Efficient Grid-based Clustering Algorithm with Leaping Search and Merge Neighbors Method

    Science.gov (United States)

    Liu, Feng; Wen, Peng; Zhu, Erzhou

    2017-09-01

    The increasing size of data sets keeps clustering algorithms an important research topic in data mining. As one of the fastest algorithm families, grid clustering still suffers from a low-precision problem, and its efficiency also needs improvement. In order to cope with these problems, this paper proposes an efficient grid-based clustering algorithm using leaping search and merge-neighbors methods (LSMN). In the algorithm, LSMN first divides the data space into a finite number of grid cells and determines the validity of each cell according to a threshold. Then, a leaping search mechanism is used to find valid cells by retrieving only the odd columns and odd rows of the grid. Finally, if the number of valid cells is greater than that of invalid cells, the invalid cells are merged together. In the algorithm, the time cost is reduced by the leaping search and the accuracy is improved by re-judging the invalid cells. The experimental results show that the proposed algorithm performs better than some popularly used algorithms.

  2. Electron-beam lithographic computer-generated holograms designed by direct search coding algorithm

    Science.gov (United States)

    Tamura, Hitoshi; Torii, Yasuhiro

    2009-08-01

    An optimized encoding algorithm is required to produce high-quality computer-generated holograms (CGHs). For this purpose, we previously proposed the direct search algorithm (DSA) as an effective method for encoding Lohmann-type binary amplitude and phase CGHs. However, a DSA takes a long computation time to reach an optimal solution. To solve this problem, we have newly found that a simultaneously selective direct search algorithm (SDSA) is very effective in shortening the computing time for encoding a Lohmann-type CGH.

  3. Applied algorithm in the liner inspection of solid rocket motors

    Science.gov (United States)

    Hoffmann, Luiz Felipe Simões; Bizarria, Francisco Carlos Parquet; Bizarria, José Walter Parquet

    2018-03-01

    In rocket motors, the bonding between the solid propellant and the thermal insulation is accomplished by a thin adhesive layer, known as the liner. The liner application method involves a complex sequence of tasks, which includes, in its final stage, a surface integrity inspection. Nowadays in Brazil, an expert carries out a thorough visual inspection to detect defects on the liner surface that may compromise the propellant interface bonding. Therefore, this paper proposes an algorithm that uses the photometric stereo technique and the k-nearest neighbor (KNN) classifier to assist the expert in the surface inspection. Photometric stereo allows the recovery of surface information from the test images, while the KNN method classifies image pixels into two classes: non-defect and defect. Tests performed on a computer-vision-based prototype validate the algorithm. The positive results suggest that the algorithm is feasible and, when implemented in a real scenario, will be able to help the expert in detecting defective areas on the liner surface.
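    The KNN classification stage can be illustrated with scikit-learn on synthetic per-pixel feature vectors; the features below are random placeholders for the photometric-stereo surface information, and the labelling rule is invented purely for the demonstration.

      import numpy as np
      from sklearn.neighbors import KNeighborsClassifier

      rng = np.random.default_rng(0)
      # Placeholder per-pixel features (e.g., recovered surface normal components) and labels.
      features = rng.random((500, 3))
      labels = (features[:, 0] + 0.1 * rng.standard_normal(500) > 0.7).astype(int)  # 1 = defect

      clf = KNeighborsClassifier(n_neighbors=5).fit(features[:400], labels[:400])
      print("held-out accuracy:", clf.score(features[400:], labels[400:]))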

  4. SKA pulsar search: technological challenges and best algorithms development

    Science.gov (United States)

    Baffa, C.

    2014-08-01

    One of the key scientific projects of the SKA radio telescope is a large survey for pulsars, both isolated and in binary systems. The data rate of the pulsar search engine is expected to reach 0.6 terasamples per second. To extract hidden pulses from these streams, we need a complex search strategy that explores a three-dimensional parameter space and requires approximately 10 petaflops. This problem is well suited to a parallel computing engine, but the dimensions of SKA bring it to a new level of complexity. An up-to-date study shows that this operation would require more than 2000 GPUs. In this report we present possible mitigation strategies.

  5. Mechanical Decoupling Algorithm Applied to Electric Drive Test Bed

    Directory of Open Access Journals (Sweden)

    Song Qiang

    2014-01-01

    A new approach and analysis are proposed in this paper to enhance the stability and rapidity of an electric drive test bed. Based on a basic drive motor dynamometer system (DMDS) test bed, a detailed mathematical model and process control scheme are established and analyzed. The relative gain array (RGA) method and the diagonal matrix method are used to analyze the mechanical coupling caused by the mechanical connection on the DMDS test bed, and the structure and algorithm of dynamic decoupling are proposed. Simulation and experiment both indicate that the designed decoupling method can efficiently improve the control accuracy and response speed.

  6. A Hybrid Evolutionary Algorithm to Quadratic Three-Dimensional Assignment Problem with Local Search for Many-Core Graphics Processors

    Science.gov (United States)

    Lipinski, Piotr

    This paper concerns the quadratic three-dimensional assignment problem (Q3AP), an extension of the quadratic assignment problem (QAP), and proposes an efficient hybrid evolutionary algorithm combining stochastic optimization and local search with a number of crossover operators, a number of mutation operators and an auto-adaptation mechanism. Auto-adaptation manages the pool of evolutionary operators applying different operators in different computation phases to better explore the search space and to avoid premature convergence. Local search additionally optimizes populations of candidate solutions and accelerates evolutionary search. It uses a many-core graphics processor to optimize a number of solutions in parallel, which enables its incorporation into the evolutionary algorithm without excessive increases in the computation time. Experiments performed on benchmark Q3AP instances derived from the classic QAP instances proposed by Nugent et al. confirmed that the proposed algorithm is able to find optimal solutions to Q3AP in a reasonable time and outperforms best known results found in the literature.

  7. Basic Searching, Interpolating, and Curve-Fitting Algorithms in C++

    Science.gov (United States)

    2015-01-01

    The yRandom namespace is used to populate an array with 2^14 pseudorandom numbers generated with the Mersenne Twister 19937 algorithm. The array is then sorted, the Mersenne Twister state is initialized, a set of uniformly distributed query points in [0, 1] is generated, and the time taken by repeated nearest-neighbour interpolation calls (yInterp::NNInterp) over the sorted data is measured with clock().
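
    A rough Python re-creation of the benchmarking idea in this excerpt is sketched below: populate an array with Mersenne Twister samples, sort it, then time repeated nearest-neighbour lookups done with binary search. The report's own yRandom/yInterp C++ routines are replaced by standard-library stand-ins, which is an assumption made only for illustration.

        # Sketch of the benchmark described above, using standard-library stand-ins
        # for the report's yRandom / yInterp routines.
        import bisect, random, time

        random.seed(1)                        # Python's random module is a Mersenne Twister
        N = 2 ** 14
        X = sorted(random.random() for _ in range(N))
        Y = [x * x for x in X]                # arbitrary tabulated values
        queries = [random.random() for _ in range(N)]

        def nn_interp(xs, ys, q):
            """Return the y value of the tabulated point nearest to q."""
            i = bisect.bisect_left(xs, q)
            if i == 0:
                return ys[0]
            if i == len(xs):
                return ys[-1]
            return ys[i] if xs[i] - q < q - xs[i - 1] else ys[i - 1]

        t0 = time.perf_counter()
        s = sum(nn_interp(X, Y, q) for q in queries)
        print(s, time.perf_counter() - t0)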

  8. Genetic Algorithm Applied to the Eigenvalue Equalization Filtered-x LMS Algorithm (EE-FXLMS)

    Directory of Open Access Journals (Sweden)

    Stephan P. Lovstedt

    2008-01-01

    Full Text Available The FXLMS algorithm, used extensively in active noise control (ANC), exhibits frequency-dependent convergence behavior. This leads to degraded performance for time-varying tonal noise and noise with multiple stationary tones. Previous work by the authors proposed the eigenvalue equalization filtered-x least mean squares (EE-FXLMS) algorithm. For that algorithm, magnitude coefficients of the secondary path transfer function are modified to decrease variation in the eigenvalues of the filtered-x autocorrelation matrix, while preserving the phase, giving faster convergence and increasing overall attenuation. This paper revisits the EE-FXLMS algorithm, using a genetic algorithm to find magnitude coefficients that give the least variation in eigenvalues. This method overcomes some of the problems with implementing the EE-FXLMS algorithm arising from finite resolution of sampled systems. Experimental control results using the original secondary path model, and a modified secondary path model for both the previous implementation of EE-FXLMS and the genetic algorithm implementation are compared.

  9. SIMMER extension for multigroup energy structure search using genetic algorithm with different fitness functions

    Directory of Open Access Journals (Sweden)

    Mattia Massone

    2017-09-01

    Full Text Available The multigroup transport theory is the basis for many neutronics modules. A significant point of the cross-section (XS) generation procedure is the choice of the energy groups' boundaries in the XS libraries, which must be carefully selected as an unsuitable energy meshing can easily lead to inaccurate results. This decision can require considerable effort and is particularly difficult for the common user, especially if not well-versed in reactor physics. This work investigates a genetic algorithm-based tool which selects an appropriate XS energy structure (ES) specific for the considered problem, to be used for the condensation of a fine multigroup library. The procedure is accelerated by results storage and fitness calculation speed-up and can be easily parallelized. The extension is applied to the coupled code SIMMER and tested on the European Sustainable Nuclear Industrial Initiative (ESNII+) Advanced Sodium Technological Reactor for Industrial Demonstration (ASTRID)-like reactor system with different fitness functions. The results show that, when the libraries are condensed based on the ESs suggested by the algorithm, the code actually returns the correct multiplication factor, in both reference and voided conditions. The computational effort reduction obtained by using the condensed library rather than the fine one is assessed and is much higher than the time required for the ES search.

  10. Differential search algorithm-based parametric optimization of electrochemical micromachining processes

    Directory of Open Access Journals (Sweden)

    Debkalpa Goswami

    2014-01-01

    Full Text Available Electrochemical micromachining (EMM) appears to be a very promising micromachining process owing to its higher machining rate, better precision and control, reliability, flexibility, environmental acceptability, and capability of machining a wide range of materials. It permits the machining of chemically resistant materials, like titanium, copper alloys, super alloys and stainless steel, to be used in biomedical, electronic, micro-electromechanical system and nano-electromechanical system applications. Therefore, the optimal use of an EMM process for achieving an enhanced machining rate and improved profile accuracy demands the selection of its various machining parameters. Various optimization tools, primarily Derringer's desirability function approach, have been employed by past researchers for deriving the best parametric settings of EMM processes, which inherently lead to sub-optimal or near-optimal solutions. In this paper, an attempt is made to apply an almost new optimization tool, i.e. the differential search algorithm (DSA), to the parametric optimization of three EMM processes. A comparative study of the optimization performance of DSA, the genetic algorithm and the desirability function approach proves the wide acceptability of DSA as a global optimization tool.

  11. Index Fund Optimization Using a Genetic Algorithm and a Heuristic Local Search

    Science.gov (United States)

    Orito, Yukiko; Inoguchi, Manabu; Yamamoto, Hisashi

    It is well known that index funds are popular passively managed portfolios and have been used extensively for hedge trading. Index funds consist of a certain number of stocks of companies listed on a stock market, chosen so that the fund's return rates follow a path similar to the changing rates of the market indices. However, it is hard to make a perfect index fund consisting of all companies included in a given market index. Thus, index fund optimization can be viewed as a combinatorial optimization problem for portfolio management. In this paper, we propose an optimization method that combines a genetic algorithm with a heuristic local search algorithm to produce a strong linear association between the fund's return rates and the changing rates of the market index. We apply the method to the Tokyo Stock Exchange and construct index funds whose return rates follow a path similar to the changing rates of the Tokyo Stock Price Index (TOPIX). The results show that the proposed method produces index funds with a strong linear association to the market index in a small computing time.
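
    The objective underlying such an approach is the strength of the linear association between the fund's return series and the index's. The sketch below shows one plausible correlation-based fitness of that kind; the weighting scheme and the synthetic return data are assumptions, not the paper's formulation.

        # Possible fitness idea behind the GA + local search above: choose stocks and
        # weights so that the portfolio's return series correlates strongly with the
        # index. Data here are synthetic placeholders.
        import numpy as np

        def tracking_fitness(weights, stock_returns, index_returns):
            """weights: (n_stocks,); stock_returns: (T, n_stocks); index_returns: (T,)."""
            fund_returns = stock_returns @ weights
            return np.corrcoef(fund_returns, index_returns)[0, 1]

        rng = np.random.default_rng(42)
        T, n = 250, 10
        index_r = rng.normal(0, 0.01, T)
        stock_r = index_r[:, None] + rng.normal(0, 0.02, (T, n))   # index-driven stocks
        w = np.full(n, 1.0 / n)
        print(tracking_fitness(w, stock_r, index_r))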

  12. Optimization of Particle Search Algorithm for CFD-DEM Simulations

    Directory of Open Access Journals (Sweden)

    G. Baryshev

    2013-09-01

    Full Text Available The discrete element method has numerous applications in particle physics. However, simulating particles as discrete entities can become costly for large systems. In time-driven DEM simulation, most of the computation time is taken by the contact search stage. We propose an efficient collision detection method that is based on sorting particles by their coordinates. Using multiple sorting criteria allows the number of potential neighbours to be minimized and makes this approach suitable for simulating massive systems in 3D. The method is compared to a common approach that consists of placing particles onto a grid of cells. An advantage of the new approach is its independence from simulation parameters such as the particle radius and the domain size.
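
    A minimal sketch of the sort-based contact search idea follows: sort particles along one coordinate and only test pairs whose intervals along that axis overlap. Two dimensions and equal radii are assumed for brevity; this is a generic illustration, not the paper's implementation.

        # Sort-based contact search sketch: sort particles along x and only test
        # pairs whose x-intervals (centre +/- radius) overlap.
        import random, math

        def potential_pairs(centres, radius):
            order = sorted(range(len(centres)), key=lambda i: centres[i][0])
            pairs = []
            for a in range(len(order)):
                i = order[a]
                for b in range(a + 1, len(order)):
                    j = order[b]
                    if centres[j][0] - centres[i][0] > 2 * radius:
                        break                      # no later particle can overlap in x
                    pairs.append((i, j))
            return pairs

        def contacts(centres, radius):
            return [(i, j) for i, j in potential_pairs(centres, radius)
                    if math.dist(centres[i], centres[j]) < 2 * radius]

        random.seed(0)
        pts = [(random.random(), random.random()) for _ in range(100)]
        print(len(contacts(pts, 0.02)))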

  13. The intelligent web search, smart algorithms, and big data

    CERN Document Server

    Shroff, Gautam

    2013-01-01

    As we use the Web for social networking, shopping, and news, we leave a personal trail. These days, linger over a Web page selling lamps, and they will turn up at the advertising margins as you move around the Internet, reminding you, tempting you to make that purchase. Search engines such as Google can now look deep into the data on the Web to pull out instances of the words you are looking for. And there are pages that collect and assess information to give you a snapshot of changing political opinion. These are just basic examples of the growth of "Web intelligence", as increasingly sophisticated...

  14. How doctors apply semantic components to specify search in work-related information retrieval

    DEFF Research Database (Denmark)

    Lykke, Marianne; Price, Susan L.; Delcambre, Lois L. M.

    2012-01-01

    The aim of our study was to gain insight into how family practice physicians at sundhed.dk, a national healthcare portal in Denmark, applied the SC model to formulate queries to solve work-related search tasks. The results showed that doctors used the model purposively when choosing search facets and search...

  15. Accelerating Smith-Waterman Algorithm for Biological Database Search on CUDA-Compatible GPUs

    Science.gov (United States)

    Munekawa, Yuma; Ino, Fumihiko; Hagihara, Kenichi

    This paper presents a fast method capable of accelerating the Smith-Waterman algorithm for biological database search on a cluster of graphics processing units (GPUs). Our method is implemented using compute unified device architecture (CUDA), which is available on the nVIDIA GPU. As compared with previous methods, our method has four major contributions. (1) The method efficiently uses on-chip shared memory to reduce the data amount being transferred between off-chip video memory and processing elements in the GPU. (2) It also reduces the number of data fetches by applying a data reuse technique to query and database sequences. (3) A pipelined method is also implemented to overlap GPU execution with database access. (4) Finally, a master/worker paradigm is employed to accelerate hundreds of database searches on a cluster system. In experiments, the peak performance on a GeForce GTX 280 card reaches 8.32 giga cell updates per second (GCUPS). We also find that our method reduces the amount of data fetches to 1/140, achieving approximately three times higher performance than a previous CUDA-based method. Our 32-node cluster version is approximately 28 times faster than a single GPU version. Furthermore, the effective performance reaches 75.6 giga instructions per second (GIPS) using 32 GeForce 8800 GTX cards.
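
    For reference, the recurrence that the GPU kernels above evaluate is the standard Smith-Waterman local-alignment score. A plain single-threaded Python version is sketched below; a linear gap penalty and simple match/mismatch scores are assumed for brevity, and this is a reference sketch rather than the paper's CUDA implementation.

        # Reference implementation of the Smith-Waterman scoring recurrence:
        # H[i][j] = max(0, diagonal + s, up - gap, left - gap).
        def smith_waterman_score(a, b, match=2, mismatch=-1, gap=1):
            rows, cols = len(a) + 1, len(b) + 1
            H = [[0] * cols for _ in range(rows)]
            best = 0
            for i in range(1, rows):
                for j in range(1, cols):
                    s = match if a[i - 1] == b[j - 1] else mismatch
                    H[i][j] = max(0,
                                  H[i - 1][j - 1] + s,
                                  H[i - 1][j] - gap,
                                  H[i][j - 1] - gap)
                    best = max(best, H[i][j])
            return best

        print(smith_waterman_score("GATTACA", "GCATGCU"))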

  16. Cuckoo search via Levy flights applied to uncapacitated facility location problem

    Science.gov (United States)

    Mesa, Armacheska; Castromayor, Kris; Garillos-Manliguez, Cinmayii; Calag, Vicente

    2017-11-01

    Facility location problem (FLP) is a mathematical way to optimally locate facilities within a set of candidates to satisfy the requirements of a given set of clients. This study addressed the uncapacitated FLP as it assures that the capacity of every selected facility is finite. Thus, even if the demand is not known, which often is the case, in reality, organizations may still be able to take strategic decisions such as locating the facilities. There are different approaches relevant to the uncapacitated FLP. Here, the cuckoo search via Lévy flight (CS-LF) was used to solve the problem. Though hybrid methods produce better results, this study employed CS-LF to determine first its potential in finding solutions for the problem, particularly when applied to a real-world problem. The method was applied to the data set obtained from a department store in Davao City, Philippines. Results showed that applying CS-LF yielded better facility locations compared to particle swarm optimization and other existing algorithms. Although these results showed that CS-LF is a promising method to solve this particular problem, further studies on other FLP are recommended to establish a strong foundation of the capability of CS-LF in solving FLP.

  17. Applied Graph-Mining Algorithms to Study Biomolecular Interaction Networks

    Science.gov (United States)

    2014-01-01

    Protein-protein interaction (PPI) networks carry vital information on the organization of molecular interactions in cellular systems. The identification of functionally relevant modules in PPI networks is one of the most important applications of biological network analysis. Computational analysis is becoming an indispensable tool to understand large-scale biomolecular interaction networks. Several types of computational methods have been developed and employed for the analysis of PPI networks. Of these computational methods, graph comparison and module detection are the two most commonly used strategies. This review summarizes current literature on graph kernel and graph alignment methods for graph comparison strategies, as well as module detection approaches including seed-and-extend, hierarchical clustering, optimization-based, probabilistic, and frequent subgraph methods. Herein, we provide a comprehensive review of the major algorithms employed under each theme, including our recently published frequent subgraph method, for detecting functional modules commonly shared across multiple cancer PPI networks. PMID:24800226

  18. Insights into performance of pattern search algorithms for high-frequency surface wave analysis

    Science.gov (United States)

    Song, Xianhai; Li, Duanyou; Gu, Hanming; Liao, Yonglong; Ren, Dachun

    2009-08-01

    Inversion of high-frequency surface wave dispersion curves is challenging for most local-search methods due to its high nonlinearity and multimodality. In this paper, we carried out an investigation to fully exploit the potential of pattern search algorithms and to further enhance their performance for surface wave analysis. We first investigate the effects of different inversion strategies, initial and final mesh sizes, expansion factor, and contraction factor, as well as the inclusion of noise in surface wave data, on the performance of the approaches, using three synthetic earth models. Then, a comparative analysis with genetic algorithms is made to further highlight the advantages of the proposed inverse procedure. Finally, the insights issued from this analysis are verified by a real-world example from Henan, China. Results from both synthetic and field data demonstrate: (a) the generalized pattern search (GPS) algorithm with the maximal positive basis set of 2N vectors works better than the GPS algorithm with the minimal positive basis set of N+1 vectors; (b) if one obtains a suitable initial mesh size through some experimentation, then setting the expansion factor Λ=1 (i.e., not allowing expansions) and the contraction factor θ=1/2 can greatly enhance the performance of pattern search algorithms; this is particularly true as the algorithm converges, and the final mesh size should go to zero; and (c) pattern search algorithms possess stronger immunity with respect to noise and should be considered good not only in terms of accuracy but also in terms of computational effort, especially when compared to the application of genetic algorithms to Rayleigh wave inversion.
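
    A minimal sketch of the poll step of a generalized pattern search with the maximal positive basis (the 2N directions +/- e_i) is given below, with expansion and contraction factors matching the notation above (lam for Λ, theta for θ). The quadratic objective is a stand-in for the dispersion-curve misfit, so this is an illustration of the mechanism, not the authors' inversion code.

        # Minimal generalized pattern search (poll step only) with the maximal
        # positive basis {+e_i, -e_i}; mesh expands by lam on success and contracts
        # by theta on failure. lam=1.0 echoes the "no expansions" recommendation.
        import numpy as np

        def pattern_search(f, x0, mesh=0.5, lam=1.0, theta=0.5, tol=1e-6, max_iter=500):
            x = np.asarray(x0, float)
            fx = f(x)
            n = len(x)
            for _ in range(max_iter):
                improved = False
                for d in np.vstack([np.eye(n), -np.eye(n)]):      # 2N poll directions
                    trial = x + mesh * d
                    ft = f(trial)
                    if ft < fx:
                        x, fx, improved = trial, ft, True
                        break
                mesh = mesh * lam if improved else mesh * theta
                if mesh < tol:
                    break
            return x, fx

        print(pattern_search(lambda v: np.sum((v - 1.0) ** 2), np.zeros(3)))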

  19. On the performance of an artificial bee colony optimization algorithm applied to the accident diagnosis in a PWR nuclear power plant

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Iona Maghali S. de; Schirru, Roberto; Medeiros, Jose A.C.C., E-mail: maghali@lmp.ufrj.b, E-mail: schirru@lmp.ufrj.b, E-mail: canedo@lmp.ufrj.b [Universidade Federal do Rio de Janeiro (UFRJ), RJ (Brazil). Coordenacao dos Programas de Pos-Graduacao de Engenharia. Programa de Engenharia Nuclear

    2009-07-01

    The swarm-based algorithm described in this paper is a new search algorithm capable of locating good solutions efficiently and within a reasonable running time. The work presents a population-based search algorithm that mimics the food foraging behavior of honey bee swarms and can be regarded as belonging to the category of intelligent optimization tools. In its basic version, the algorithm performs a kind of random search combined with neighborhood search and can be used for solving multi-dimensional numeric problems. Following a description of the algorithm, this paper presents a new event classification system based exclusively on the ability of the algorithm to find the best centroid positions that correctly identify an accident in a PWR nuclear power plant, thus maximizing the number of correctly classified transients. The simulation results show that the performance of the proposed algorithm is comparable to that of other population-based algorithms when applied to the same problem, with the advantage of employing fewer control parameters. (author)

  20. Wavelet neural networks initialization using hybridized clustering and harmony search algorithm: Application in epileptic seizure detection

    Science.gov (United States)

    Zainuddin, Zarita; Lai, Kee Huong; Ong, Pauline

    2013-04-01

    Artificial neural networks (ANNs) are powerful mathematical models that are used to solve complex real world problems. Wavelet neural networks (WNNs), which were developed based on the wavelet theory, are a variant of ANNs. During the training phase of WNNs, several parameters need to be initialized; including the type of wavelet activation functions, translation vectors, and dilation parameter. The conventional k-means and fuzzy c-means clustering algorithms have been used to select the translation vectors. However, the solution vectors might get trapped at local minima. In this regard, the evolutionary harmony search algorithm, which is capable of searching for near-optimum solution vectors, both locally and globally, is introduced to circumvent this problem. In this paper, the conventional k-means and fuzzy c-means clustering algorithms were hybridized with the metaheuristic harmony search algorithm. In addition to obtaining the estimation of the global minima accurately, these hybridized algorithms also offer more than one solution to a particular problem, since many possible solution vectors can be generated and stored in the harmony memory. To validate the robustness of the proposed WNNs, the real world problem of epileptic seizure detection was presented. The overall classification accuracy from the simulation showed that the hybridized metaheuristic algorithms outperformed the standard k-means and fuzzy c-means clustering algorithms.

  1. Nature-inspired Cuckoo Search Algorithm for Side Lobe Suppression in a Symmetric Linear Antenna Array

    Directory of Open Access Journals (Sweden)

    K. N. Abdul Rani

    2012-09-01

    Full Text Available In this paper, we propose a newly modified cuckoo search (MCS) algorithm, integrated with the roulette wheel selection operator and an inertia weight controlling the search ability, for synthesizing symmetric linear array geometries with minimum side lobe level (SLL) and/or null control. The basic cuckoo search (CS) algorithm is primarily based on the natural obligate brood-parasitic behavior of some cuckoo species in combination with the Lévy flight behavior of some birds and fruit flies. The CS metaheuristic approach is straightforward and capable of effectively solving general N-dimensional, linear and nonlinear optimization problems. The array geometry synthesis is first formulated as an optimization problem with the goal of SLL suppression and/or prescribed null placement in certain directions, and then solved by the new MCS algorithm for the optimum element (isotropic radiator) locations in the azimuth plane (xy-plane). The study also focuses on the four internal parameters of the MCS algorithm, specifically on their implicit effects on the array synthesis. The optimal inter-element spacing solutions obtained by the MCS optimizer are validated through comparisons with the standard CS optimizer and the conventional array within the uniform and Dolph-Chebyshev envelope patterns using MATLAB. Finally, we also compare the fine-tuned MCS algorithm with two popular evolutionary algorithm (EA) techniques, particle swarm optimization (PSO) and genetic algorithms (GA).
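
    The quantity being minimized is the peak side lobe of the array factor. The sketch below evaluates a plausible form of that objective for a 2N-element symmetric linear array of isotropic elements at positions +/- x_n along the x-axis, with uniform excitation assumed and a crude angular mask standing in for main-lobe exclusion; it is an illustrative objective, not the paper's cost function.

        # Array factor and peak side-lobe level of a 2N-element symmetric linear
        # array of isotropic elements at +/- x_n (positions in wavelengths),
        # uniform excitation. Main-lobe exclusion is done by a simple angle mask.
        import numpy as np

        def side_lobe_level_db(x_positions, mainlobe_halfwidth_deg=15.0):
            theta = np.radians(np.linspace(-90, 90, 3601))        # angle from broadside
            k = 2 * np.pi                                         # x_n given in wavelengths
            af = 2 * np.sum(np.cos(k * np.outer(np.sin(theta), x_positions)), axis=1)
            pattern_db = 20 * np.log10(np.abs(af) / np.max(np.abs(af)) + 1e-12)
            sidelobes = pattern_db[np.abs(np.degrees(theta)) > mainlobe_halfwidth_deg]
            return sidelobes.max()

        # Half of a 10-element array with roughly half-wavelength spacing (illustrative).
        print(side_lobe_level_db(np.array([0.25, 0.75, 1.25, 1.75, 2.25])))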

  2. Keyword-based Ciphertext Search Algorithm under Cloud Storage

    Directory of Open Access Journals (Sweden)

    Ren Xunyi

    2016-01-01

    Full Text Available With the development of network storage services, cloud storage offers high scalability, low cost, access without location limits, and easy management. These advantages lead more and more small and medium enterprises to outsource large quantities of data to a third party, freeing them from the costs of construction and maintenance, so the market prospects are broad. However, many cloud storage service providers cannot guarantee data security, which results in the leakage of user data, and many users therefore have to fall back on traditional storage methods. This has become one of the important factors hindering the development of cloud storage. In this article, a keyword index is built by extracting keywords from the ciphertext data; the encrypted data and the encrypted index are then uploaded to the cloud server together. Users retrieve the relevant ciphertext by searching the encrypted index, which addresses the data leakage problem.

  3. A Metropolis algorithm combined with Nelder-Mead Simplex applied to nuclear reactor core design

    Energy Technology Data Exchange (ETDEWEB)

    Sacco, Wagner F. [Depto. de Modelagem Computacional, Instituto Politecnico, Universidade do Estado do Rio de Janeiro, R. Alberto Rangel, s/n, P.O. Box 972285, Nova Friburgo, RJ 28601-970 (Brazil)], E-mail: wfsacco@iprj.uerj.br; Filho, Hermes Alves; Henderson, Nelio [Depto. de Modelagem Computacional, Instituto Politecnico, Universidade do Estado do Rio de Janeiro, R. Alberto Rangel, s/n, P.O. Box 972285, Nova Friburgo, RJ 28601-970 (Brazil); Oliveira, Cassiano R.E. de [Nuclear and Radiological Engineering Program, George W. Woodruff School of Mechanical Engineering, Georgia Institute of Technology, Atlanta, GA 30332-0405 (United States)

    2008-05-15

    A hybridization of the recently introduced Particle Collision Algorithm (PCA) and the Nelder-Mead Simplex algorithm is introduced and applied to a core design optimization problem which was previously attacked by other metaheuristics. The optimization problem consists in adjusting several reactor cell parameters, such as dimensions, enrichment and materials, in order to minimize the average peak-factor in a three-enrichment-zone reactor, considering restrictions on the average thermal flux, criticality and sub-moderation. The new metaheuristic performs better than the genetic algorithm, particle swarm optimization, and the Metropolis algorithms PCA and the Great Deluge Algorithm, thus demonstrating its potential for other applications.

  4. A full-Newton step feasible interior-point algorithm for P∗(κ)-LCP based on a new search direction

    Directory of Open Access Journals (Sweden)

    Behrouz Kheirfam

    2016-12-01

    Full Text Available In this paper, we present a full-Newton step feasible interior-point algorithm for a P∗(κ) linear complementarity problem based on a new search direction. We apply a vector-valued function, generated by a univariate function, to the nonlinear equations of the system that defines the central path. Furthermore, we derive the iteration bound for the algorithm, which coincides with the best-known iteration bound for these types of algorithms. Numerical results show that the proposed algorithm is competitive and reliable.

  5. Joint Kalman–Haar Algorithm Applied to Signal Processing

    Directory of Open Access Journals (Sweden)

    Alejandro Viegener

    2012-03-01

    Full Text Available For the analysis of signals disturbed by noise, in this paper we propose a working methodology that seeks the best estimate by combining Kalman filtering with the characterization achieved by applying a multiresolution analysis (MRA) using wavelets. From the standpoint of Kalman filtering, this combined procedure is quasi-optimal, but the modification allows the simultaneous implementation of a wavelet denoising scheme; this decreases the computational cost compared with applying both procedures separately. Our proposal is to process the signal in successive non-overlapping intervals, combining the computation of the optimal filter with an MRA using the Haar wavelet. The method takes advantage of the combined use of both tools (Kalman-Haar) and is free from the edge problems related to signal segmentation.

  6. SAGA: a hybrid search algorithm for Bayesian Network structure learning of transcriptional regulatory networks.

    Science.gov (United States)

    Adabor, Emmanuel S; Acquaah-Mensah, George K; Oduro, Francis T

    2015-02-01

    Bayesian Networks have been used for the inference of transcriptional regulatory relationships among genes, and are valuable for obtaining biological insights. However, finding optimal Bayesian Network (BN) is NP-hard. Thus, heuristic approaches have sought to effectively solve this problem. In this work, we develop a hybrid search method combining Simulated Annealing with a Greedy Algorithm (SAGA). SAGA explores most of the search space by undergoing a two-phase search: first with a Simulated Annealing search and then with a Greedy search. Three sets of background-corrected and normalized microarray datasets were used to test the algorithm. BN structure learning was also conducted using the datasets, and other established search methods as implemented in BANJO (Bayesian Network Inference with Java Objects). The Bayesian Dirichlet Equivalence (BDe) metric was used to score the networks produced with SAGA. SAGA predicted transcriptional regulatory relationships among genes in networks that evaluated to higher BDe scores with high sensitivities and specificities. Thus, the proposed method competes well with existing search algorithms for Bayesian Network structure learning of transcriptional regulatory networks. Copyright © 2014 Elsevier Inc. All rights reserved.

  7. Apply lightweight recognition algorithms in optical music recognition

    Science.gov (United States)

    Pham, Viet-Khoi; Nguyen, Hai-Dang; Nguyen-Khac, Tung-Anh; Tran, Minh-Triet

    2015-02-01

    The problems of digitizing musical scores and transforming them into a machine-readable format need to be solved, since doing so helps people to enjoy music, learn music, and conserve music sheets, and can even assist music composers. However, the results of existing methods still require improvement for higher accuracy. Therefore, the authors propose lightweight algorithms for Optical Music Recognition to help people recognize and automatically play musical scores. In our proposal, after removing staff lines and extracting symbols, each music symbol is represented as a grid of identical M × N cells, and the features are extracted and classified with multiple lightweight SVM classifiers. Through experiments, the authors find that a size of 10 × 12 cells yields the highest precision value. Experimental results on a dataset consisting of 4929 music symbols taken from 18 modern music sheets in the Synthetic Score Database show that the proposed method is able to classify printed musical scores with an accuracy of up to 99.56%.

  8. Feature selection method based on multi-fractal dimension and harmony search algorithm and its application

    Science.gov (United States)

    Zhang, Chen; Ni, Zhiwei; Ni, Liping; Tang, Na

    2016-10-01

    Feature selection is an important method of data preprocessing in data mining. In this paper, a novel feature selection method based on multi-fractal dimension and harmony search algorithm is proposed. Multi-fractal dimension is adopted as the evaluation criterion of feature subset, which can determine the number of selected features. An improved harmony search algorithm is used as the search strategy to improve the efficiency of feature selection. The performance of the proposed method is compared with that of other feature selection algorithms on UCI data-sets. Besides, the proposed method is also used to predict the daily average concentration of PM2.5 in China. Experimental results show that the proposed method can obtain competitive results in terms of both prediction accuracy and the number of selected features.

  9. Hybrid Harmony Search Algorithm and Interior Point Method for Economic Dispatch with Valve-Point Effect

    Science.gov (United States)

    Sivasubramani, S.; Ahmad, Md. Samar

    2014-06-01

    This paper proposes a new hybrid algorithm combining the harmony search (HS) algorithm and the interior point method (IPM) for the economic dispatch (ED) problem with valve-point effect. The ED problem with valve-point effect is modeled as a non-linear, constrained and non-convex optimization problem having several local minima. IPM is an effective non-linear optimization method for convex optimization problems. Since the ED problem with valve-point effect has multiple local minima, IPM results in a locally optimal solution. In order to avoid IPM getting trapped in a local optimum, a new evolutionary algorithm, HS, which is good at global exploration, has been combined with it. In the hybrid method, HS is used for global search and IPM for local search. The hybrid method has been tested on three different test systems to prove its effectiveness. Finally, the simulation results are also compared with other methods reported in the literature.
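
    The non-convexity that defeats a plain interior point method comes from the valve-point term in the fuel-cost function, commonly written F(P) = a P^2 + b P + c + |e sin(f (Pmin - P))|. The sketch below evaluates that standard cost; the coefficients are illustrative and do not correspond to a specific test system from the paper.

        # Fuel cost with valve-point effect, the source of the many local minima:
        # F(P) = a*P**2 + b*P + c + |e * sin(f * (Pmin - P))|. Coefficients illustrative.
        import math

        def valve_point_cost(P, a, b, c, e, f, Pmin):
            return a * P ** 2 + b * P + c + abs(e * math.sin(f * (Pmin - P)))

        def total_cost(P_list, units):
            return sum(valve_point_cost(P, *u) for P, u in zip(P_list, units))

        units = [(0.0016, 7.92, 561.0, 300.0, 0.0315, 100.0),   # (a, b, c, e, f, Pmin)
                 (0.0048, 7.97, 78.0, 150.0, 0.063, 50.0)]
        print(total_cost([300.0, 150.0], units))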

  10. Optimal vaccination schedule search using genetic algorithm over MPI technology

    Directory of Open Access Journals (Sweden)

    Calonaci Cristiano

    2012-11-01

    Full Text Available Background: Immunological strategies that achieve the prevention of tumor growth are based on the presumption that the immune system, if triggered before tumor onset, could be able to defend from specific cancers. In supporting this assertion, in the last decade active immunization approaches prevented some virus-related cancers in humans. An immunopreventive cell vaccine for the non-virus-related human breast cancer has been recently developed. This vaccine, called Triplex, targets the HER-2-neu oncogene in HER-2/neu transgenic mice and has shown to almost completely prevent HER-2/neu-driven mammary carcinogenesis when administered with an intensive and life-long schedule. Methods: To better understand the preventive efficacy of the Triplex vaccine in reduced schedules we employed a computational approach. The computer model developed allowed us to test in silico specific vaccination schedules in the quest for optimality. Specifically here we present a parallel genetic algorithm able to suggest optimal vaccination schedule. Results & Conclusions: The enormous complexity of combinatorial space to be explored makes this approach the only possible one. The suggested schedule was then tested in vivo, giving good results.

  11. An Effective Hybrid Cuckoo Search Algorithm with Improved Shuffled Frog Leaping Algorithm for 0-1 Knapsack Problems

    Directory of Open Access Journals (Sweden)

    Yanhong Feng

    2014-01-01

    Full Text Available An effective hybrid cuckoo search (CS) algorithm with an improved shuffled frog-leaping algorithm (ISFLA) is put forward for solving the 0-1 knapsack problem. First, within the framework of SFLA, an improved frog-leap operator is designed that combines the effect of the global optimal information on the frog leaping and the information exchange between frog individuals with genetic mutation applied with a small probability. Subsequently, in order to improve the convergence speed and enhance the exploitation ability, a novel CS model is proposed that considers the specific advantages of Lévy flights and the frog-leap operator. Furthermore, the greedy transform method is used to repair infeasible solutions and to optimize feasible solutions. Finally, numerical simulations are carried out on six different types of 0-1 knapsack instances, and the comparative results show the effectiveness of the proposed algorithm and its ability to achieve good-quality solutions, outperforming the binary cuckoo search, the binary differential evolution, and the genetic algorithm.
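
    A greedy transform of the kind mentioned above typically works on the value-to-weight ratio: drop the worst-ratio selected items until the weight constraint holds, then greedily add any unselected item that still fits. The sketch below shows that repair/optimization step on a toy instance; it is a common form of the idea, not necessarily the paper's exact procedure.

        # Greedy repair and optimisation of a 0-1 knapsack solution vector x:
        # remove worst value/weight items until feasible, then add the best
        # remaining items that still fit.
        def greedy_repair(x, values, weights, capacity):
            n = len(x)
            order = sorted(range(n), key=lambda i: values[i] / weights[i])  # worst ratio first
            load = sum(w for xi, w in zip(x, weights) if xi)
            x = list(x)
            for i in order:                       # repair: drop low-ratio items
                if load <= capacity:
                    break
                if x[i]:
                    x[i], load = 0, load - weights[i]
            for i in reversed(order):             # optimise: add high-ratio items
                if not x[i] and load + weights[i] <= capacity:
                    x[i], load = 1, load + weights[i]
            return x

        print(greedy_repair([1, 1, 1, 1], values=[10, 40, 30, 50],
                            weights=[5, 4, 6, 3], capacity=10))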

  12. A Fast LSF Search Algorithm Based on Interframe Correlation in G.723.1

    Directory of Open Access Journals (Sweden)

    Kulkarni Jaydeep P

    2004-01-01

    Full Text Available We explain a time-complexity reduction algorithm that improves the line spectral frequencies (LSF) search procedure on the unit circle for low bit rate speech codecs. The algorithm is based on the strong interframe correlation exhibited by LSFs. The fixed-point C code of ITU-T Recommendation G.723.1, which uses the “real root algorithm”, was modified and the results were verified on an ARM-7TDMI general purpose RISC processor. The algorithm works for all test vectors provided by the International Telecommunications Union-Telecommunication (ITU-T) as well as for real speech. The average time reduction in the search computation was found to be approximately 20%.

  13. Addressing Data Analysis Challenges in Gravitational Wave Searches Using the Particle Swarm Optimization Algorithm

    Science.gov (United States)

    Weerathunga, Thilina Shihan

    2017-08-01

    Gravitational waves are a fundamental prediction of Einstein's General Theory of Relativity. The first experimental proof of their existence was provided by the Nobel Prize winning discovery by Taylor and Hulse of orbital decay in a binary pulsar system. The first detection of gravitational waves incident on earth from an astrophysical source was announced in 2016 by the LIGO Scientific Collaboration, launching the new era of gravitational wave (GW) astronomy. The signal detected was from the merger of two black holes, which is an example of sources called Compact Binary Coalescences (CBCs). Data analysis strategies used in the search for CBC signals are derivatives of the Maximum-Likelihood (ML) method. The ML method applied to data from a network of geographically distributed GW detectors--called fully coherent network analysis--is currently the best approach for estimating source location and GW polarization waveforms. However, in the case of CBCs, especially for lower mass systems (O(1M solar masses)) such as double neutron star binaries, fully coherent network analysis is computationally expensive. The ML method requires locating the global maximum of the likelihood function over a nine dimensional parameter space, where the computation of the likelihood at each point requires correlations involving O(10^4) to O(10^6) samples between the data and the corresponding candidate signal waveform template. Approximations, such as semi-coherent coincidence searches, are currently used to circumvent the computational barrier but incur a concomitant loss in sensitivity. We explored the effectiveness of Particle Swarm Optimization (PSO), a well-known algorithm in the field of swarm intelligence, in addressing the fully coherent network analysis problem. As an example, we used a four-detector network consisting of the two LIGO detectors at Hanford and Livingston, Virgo and Kagra, all having initial LIGO noise power spectral densities, and show that PSO can locate the global
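
    The PSO update explored here is the standard velocity/position rule. The sketch below shows that generic mechanism on a placeholder objective over a nine-dimensional box; the inertia and acceleration constants are common defaults, and the objective is not the coherent network statistic itself.

        # Standard particle swarm optimisation:
        # v <- w*v + c1*r1*(pbest - x) + c2*r2*(gbest - x);  x <- x + v.
        import numpy as np

        def pso(f, dim, lo, hi, n_particles=40, iters=300, w=0.7, c1=1.5, c2=1.5):
            rng = np.random.default_rng(0)
            x = rng.uniform(lo, hi, (n_particles, dim))
            v = np.zeros_like(x)
            pbest, pbest_val = x.copy(), np.array([f(p) for p in x])
            g = pbest[np.argmin(pbest_val)].copy()
            for _ in range(iters):
                r1, r2 = rng.random(x.shape), rng.random(x.shape)
                v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
                x = np.clip(x + v, lo, hi)
                vals = np.array([f(p) for p in x])
                better = vals < pbest_val
                pbest[better], pbest_val[better] = x[better], vals[better]
                g = pbest[np.argmin(pbest_val)].copy()
            return g, pbest_val.min()

        # Placeholder 9-dimensional objective (minimisation form).
        print(pso(lambda p: np.sum((p - 2.0) ** 2), dim=9, lo=-10, hi=10))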

  14. Comparison of stochastic search optimization algorithms for the laminated composites under mechanical and hygrothermal loadings

    OpenAIRE

    Aydın, Levent; Artem, Hatice Seçil

    2011-01-01

    The aim of the present study is to design the stacking sequence of laminated composites that have a low coefficient of thermal expansion and high elastic moduli. In the design process, multi-objective genetic algorithm optimization of the carbon fiber laminated composite plates is verified by a single-objective optimization approach using three different stochastic optimization methods: genetic algorithm, generalized pattern search, and simulated annealing. However, both the multi- and single-obj...

  15. An Efficient Tabu Search DSA Algorithm for Heterogeneous Traffic in Cellular Networks

    OpenAIRE

    Kamal, Hany; Coupechoux, Marceau; Godlewski, Philippe

    2010-01-01

    In this paper, we propose and analyze a TS (Tabu Search) algorithm for DSA (Dynamic Spectrum Access) in cellular networks. We consider a scenario where cellular operators share a common access band, and we focus on the strategy of one operator providing packet services to the end-users. We consider a soft interference requirement for the algorithm's design that suits the packet traffic context. The operator's objective is to maximize its reward while taking into accoun...

  16. ParAlign: a parallel sequence alignment algorithm for rapid and sensitive database searches

    OpenAIRE

    Rognes, Torbjørn

    2001-01-01

    There is a need for faster and more sensitive algorithms for sequence similarity searching in view of the rapidly increasing amounts of genomic sequence data available. Parallel processing capabilities in the form of the single instruction, multiple data (SIMD) technology are now available in common microprocessors and enable a single microprocessor to perform many operations in parallel. The ParAlign algorithm has been specifically designed to take advantage of th...

  17. An improved Harmony Search algorithm for optimal scheduling of the diesel generators in oil rig platforms

    Energy Technology Data Exchange (ETDEWEB)

    Yadav, Parikshit; Kumar, Rajesh; Panda, S.K.; Chang, C.S. [Department of Electrical and Computer Engineering, National University of Singapore, Singapore 117576 (Singapore)

    2011-02-15

    The Harmony Search (HS) algorithm is a music-based meta-heuristic optimization method analogous to the music improvisation process, in which musicians continue to polish the pitches in order to obtain a better harmony. The paper focuses on the optimal scheduling of the generators to reduce the fuel consumption in an oil rig platform. Accurate modeling of the specific fuel consumption is significant in this optimization. The specific fuel consumption has been modeled using cubic spline interpolation. The SFC curve is non-linear and discrete in nature, hence conventional methods fail to give an optimal solution. The HS algorithm has been used for the optimal scheduling of generators of both equal and unequal ratings. Furthermore, an Improved Harmony Search (IHS) method for generating new solution vectors, which enhances the accuracy and convergence rate of HS, has been employed. The paper also focuses on the impacts of the constant parameters on the Harmony Search algorithm. Numerical results show that the IHS method has good convergence properties. Moreover, the fuel consumption obtained with the IHS algorithm is lower than with HS and other heuristic or deterministic methods, making it a powerful search algorithm for various engineering optimization problems. (author)

  18. Memetic Algorithm with Local Search as Modified Swine Influenza Model-Based Optimization and Its Use in ECG Filtering

    Directory of Open Access Journals (Sweden)

    Devidas G. Jadhav

    2014-01-01

    Full Text Available The Swine Influenza Model Based Optimization (SIMBO) family is a recently introduced fast optimization technique with adaptive features in its mechanism. In this paper, the authors modify SIMBO to make the algorithm even quicker. Because the SIMBO family is fast, it is a good option for searching the basin, and it is therefore utilized for the local searches in developing the proposed memetic algorithms (MAs). The MA is faster than SIMBO while keeping a balance between exploration and exploitation, so the MAs trade a small amount of convergence velocity for comprehensively optimizing a standard numerical benchmark test bed containing functions with different properties. The use of SIMBO in the local search is inherently an exploitation of the better characteristics of the algorithms employed for the hybridization. The developed MA is applied to eliminate the power line interference (PLI) from the biomedical ECG signal using an adaptive filter whose weights are optimized by the MA. The reference signal required for the adaptive filter is obtained by selective reconstruction of the ECG from the intrinsic mode functions (IMFs) of empirical mode decomposition (EMD).

  19. Personalization algorithms applied to cardiovascular disease risk assessment.

    Science.gov (United States)

    Paredes, S; Marques, T; Rocha, T; de Carvalho, P; Henriques, J; Morals, J

    2014-01-01

    Cardiovascular disease (CVD) is the major cause of death in the world. Clinical guidelines recommend the use of risk assessment tools (scores) to identify the CVD risk of each patient, as the correct stratification of patients may significantly contribute to the optimization of health care strategies. This work further explores the personalization of CVD risk assessment, supported by the evidence that a specific CVD risk assessment tool may perform well within a given group of patients and poorly within other groups. Two main personalization methods based on the proper creation of groups of patients are presented: i) a patient clustering approach; ii) a similarity measures approach. These two methodologies were validated on a Portuguese population (460 patients with non-ST-segment elevation Acute Coronary Syndrome (ACS-NSTEMI)). The similarity measures approach had the best performance, achieving maximum values of sensitivity, specificity and geometric mean of, respectively, 77.7%, 63.2% and 69.7%. These values represent an enhancement in relation to the best performance obtained with current CVD risk assessment tools applied in clinical practice (78.5%, 53.2%, 64.4%).

  20. A Hybrid Water Distribution Networks Design Optimization Method Based on a Search Space Reduction Approach and a Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Juan Reca

    2017-11-01

    Full Text Available This work presents a new approach to increase the efficiency of the heuristic methods applied to the optimal design of water distribution systems. The approach is based on reducing the search space by bounding the diameters that can be used for every network pipe. To reduce the search space, two opposite extreme flow distribution scenarios are analyzed and velocity restrictions to the pipe flow are then applied. The first scenario produces the most uniform flow distribution in the network. The opposite scenario is represented by the network with the maximum flow accumulation. Both extreme flow distributions are calculated by solving a quadratic programming problem, which is a very robust and efficient procedure. This approach has been coupled to a Genetic Algorithm (GA). The GA has an integer coding scheme and a variable number of alleles depending on the number of diameters comprised within the velocity restrictions. The methodology has been applied to several benchmark networks and its performance has been compared to a classic GA formulation with a non-bounded search space. It considerably reduced the search space and provided a much faster and more accurate convergence than the GA formulation. This approach can also be coupled to other metaheuristics.

  1. In-depth analysis of protein inference algorithms using multiple search engines and well-defined metrics.

    Science.gov (United States)

    Audain, Enrique; Uszkoreit, Julian; Sachsenberg, Timo; Pfeuffer, Julianus; Liang, Xiao; Hermjakob, Henning; Sanchez, Aniel; Eisenacher, Martin; Reinert, Knut; Tabb, David L; Kohlbacher, Oliver; Perez-Riverol, Yasset

    2017-01-06

    In mass spectrometry-based shotgun proteomics, protein identifications are usually the desired result. However, most of the analytical methods are based on the identification of reliable peptides and not the direct identification of intact proteins. Thus, assembling peptides identified from tandem mass spectra into a list of proteins, referred to as protein inference, is a critical step in proteomics research. Currently, different protein inference algorithms and tools are available for the proteomics community. Here, we evaluated five software tools for protein inference (PIA, ProteinProphet, Fido, ProteinLP, MSBayesPro) using three popular database search engines: Mascot, X!Tandem, and MS-GF+. All the algorithms were evaluated using a highly customizable KNIME workflow using four different public datasets with varying complexities (different sample preparation, species and analytical instruments). We defined a set of quality control metrics to evaluate the performance of each combination of search engines, protein inference algorithm, and parameters on each dataset. We show that the results for complex samples vary not only regarding the actual numbers of reported protein groups but also concerning the actual composition of groups. Furthermore, the robustness of reported proteins when using databases of differing complexities is strongly dependent on the applied inference algorithm. Finally, merging the identifications of multiple search engines does not necessarily increase the number of reported proteins, but does increase the number of peptides per protein and thus can generally be recommended. Protein inference is one of the major challenges in MS-based proteomics nowadays. Currently, there are a vast number of protein inference algorithms and implementations available for the proteomics community. Protein assembly impacts the final results of the research, the quantitation values, and the final claims in the research manuscript. Even though protein

  2. On the Runtime of Randomized Local Search and Simple Evolutionary Algorithms for Dynamic Makespan Scheduling

    DEFF Research Database (Denmark)

    Neumann, Frank; Witt, Carsten

    2015-01-01

    Evolutionary algorithms have been frequently used for dynamic optimization problems. With this paper, we contribute to the theoretical understanding of this research area. We present the first computational complexity analysis of evolutionary algorithms for a dynamic variant of a classical combinatorial optimization problem, namely makespan scheduling. We study the model of a strong adversary which is allowed to change one job at regular intervals. Furthermore, we investigate the setting of random changes. Our results show that randomized local search and a simple evolutionary algorithm are very effective in dynamically tracking changes made to the problem instance.
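
    A minimal sketch of randomized local search for two-machine makespan scheduling is given below: flip the machine assignment of one uniformly random job and keep the move if the makespan does not get worse. In the dynamic setting studied above the same loop would simply continue after each adversarial job change; the instance here is synthetic and illustrative.

        # Randomised local search (RLS) for two-machine makespan scheduling:
        # flip one random job's machine assignment, accept non-worsening moves.
        import random

        def makespan(assign, sizes):
            load0 = sum(s for a, s in zip(assign, sizes) if a == 0)
            return max(load0, sum(sizes) - load0)

        def rls(sizes, steps=10000, seed=1):
            random.seed(seed)
            assign = [random.randint(0, 1) for _ in sizes]
            current = makespan(assign, sizes)
            for _ in range(steps):
                j = random.randrange(len(sizes))
                assign[j] ^= 1                      # move job j to the other machine
                new = makespan(assign, sizes)
                if new <= current:
                    current = new                   # accept non-worsening moves
                else:
                    assign[j] ^= 1                  # undo
            return assign, current

        random.seed(0)
        jobs = [random.randint(1, 100) for _ in range(50)]
        print(rls(jobs)[1], sum(jobs) / 2)          # makespan vs. perfect-balance bound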

  3. A Line Search Multilevel Truncated Newton Algorithm for Computing the Optical Flow

    Directory of Open Access Journals (Sweden)

    Lluís Garrido

    2015-06-01

    Full Text Available We describe the implementation details and give the experimental results of three optimization algorithms for dense optical flow computation. In particular, using a line search strategy, we evaluate the performance of the unilevel truncated Newton method (LSTN), a multiresolution truncated Newton (MR/LSTN) and a full multigrid truncated Newton (FMG/LSTN). We use three image sequences and four models of optical flow for performance evaluation. The FMG/LSTN algorithm is shown to lead to better optical flow estimation with less computational work than both the LSTN and MR/LSTN algorithms.

  4. Cache-Oblivious Data Structures and Algorithms for Undirected Breadth-First Search and Shortest Paths

    DEFF Research Database (Denmark)

    Brodal, G.S.; Fagerberg, R.; Meyer, U.

    2004-01-01

    We present improved cache-oblivious data structures and algorithms for breadth-first search and the single-source shortest path problem on undirected graphs with non-negative edge weights. Our results remove the performance gap between the currently best cache-aware algorithms for these problems and their cache-oblivious counterparts. Our shortest-path algorithm relies on a new data structure, called the bucket heap, which is the first cache-oblivious priority queue to efficiently support a weak DecreaseKey operation.

  5. Investigation of candidate data structures and search algorithms to support a knowledge based fault diagnosis system

    Science.gov (United States)

    Bosworth, Edward L., Jr.

    1987-01-01

    The focus of this research is the investigation of data structures and associated search algorithms for automated fault diagnosis of complex systems such as the Hubble Space Telescope. Such data structures and algorithms will form the basis of a more sophisticated Knowledge Based Fault Diagnosis System. As a part of the research, several prototypes were written in VAXLISP and implemented on one of the VAX-11/780's at the Marshall Space Flight Center. This report describes and gives the rationale for both the data structures and algorithms selected. A brief discussion of a user interface is also included.

  6. A multiobjective scatter search algorithm for fault-tolerant NoC mapping optimisation

    Science.gov (United States)

    Le, Qianqi; Yang, Guowu; Hung, William N. N.; Zhang, Xinpeng; Fan, Fuyou

    2014-08-01

    Mapping IP cores to an on-chip network is an important step in Network-on-Chip (NoC) design and affects the performance of NoC systems. A mapping optimisation algorithm and a fault-tolerant mechanism are proposed in this article. The fault-tolerant mechanism and the corresponding routing algorithm can recover NoC communication from switch failures, while preserving high performance. The mapping optimisation algorithm is based on scatter search (SS), which is an intelligent algorithm with a powerful combinatorial search ability. To meet the requests of the NoC mapping application, the standard SS is improved for multiple objective optimisation. This method helps to obtain high-performance mapping layouts. The proposed algorithm was implemented on the Embedded Systems Synthesis Benchmarks Suite (E3S). Experimental results show that this optimisation algorithm achieves low-power consumption, little communication time, balanced link load and high reliability, compared to particle swarm optimisation and genetic algorithm.

  7. Large-Scale Recurrent Neural Network Based Modelling of Gene Regulatory Network Using Cuckoo Search-Flower Pollination Algorithm.

    Science.gov (United States)

    Mandal, Sudip; Khan, Abhinandan; Saha, Goutam; Pal, Rajat K

    2016-01-01

    The accurate prediction of genetic networks using computational tools is one of the greatest challenges in the postgenomic era. The Recurrent Neural Network is one of the most popular but simple approaches to model the network dynamics from time-series microarray data. To date, it has been successfully applied to computationally derive small-scale artificial and real-world genetic networks with high accuracy. However, it underperforms for large-scale genetic networks. Here, a new methodology has been proposed where a hybrid Cuckoo Search-Flower Pollination Algorithm has been implemented with the Recurrent Neural Network. Cuckoo Search is used to search for the best combination of regulators. Moreover, the Flower Pollination Algorithm is applied to optimize the model parameters of the Recurrent Neural Network formalism. Initially, the proposed method is tested on a benchmark large-scale artificial network for both noiseless and noisy data. The results obtained show that the proposed methodology is capable of increasing the inference of correct regulations and decreasing false regulations to a high degree. Secondly, the proposed methodology has been validated against the real-world dataset of the DNA SOS repair network of Escherichia coli. However, the proposed method sacrifices computational time complexity in both cases due to the hybrid optimization process.

  8. Large-Scale Recurrent Neural Network Based Modelling of Gene Regulatory Network Using Cuckoo Search-Flower Pollination Algorithm

    Directory of Open Access Journals (Sweden)

    Sudip Mandal

    2016-01-01

    Full Text Available The accurate prediction of genetic networks using computational tools is one of the greatest challenges in the postgenomic era. The Recurrent Neural Network is one of the most popular but simple approaches to model the network dynamics from time-series microarray data. To date, it has been successfully applied to computationally derive small-scale artificial and real-world genetic networks with high accuracy. However, it underperforms for large-scale genetic networks. Here, a new methodology has been proposed where a hybrid Cuckoo Search-Flower Pollination Algorithm has been implemented with the Recurrent Neural Network. Cuckoo Search is used to search for the best combination of regulators. Moreover, the Flower Pollination Algorithm is applied to optimize the model parameters of the Recurrent Neural Network formalism. Initially, the proposed method is tested on a benchmark large-scale artificial network for both noiseless and noisy data. The results obtained show that the proposed methodology is capable of increasing the inference of correct regulations and decreasing false regulations to a high degree. Secondly, the proposed methodology has been validated against the real-world dataset of the DNA SOS repair network of Escherichia coli. However, the proposed method sacrifices computational time complexity in both cases due to the hybrid optimization process.

  9. A comparative study of the A* heuristic search algorithm used to solve efficiently a puzzle game

    Science.gov (United States)

    Iordan, A. E.

    2018-01-01

    The puzzle game presented in this paper consists of polyhedra (prisms, pyramids or pyramidal frustums) which can be moved using the available free spaces. The problem requires finding the minimum number of movements needed for the game to reach a goal configuration starting from an initial configuration. Because the problem is quite complex, the principal difficulty in solving it is the dimension of the search space, which leads to the necessity of a heuristic search. Improving the search method consists in determining a strong estimate through the heuristic function, which guides the search process toward the most promising side of the search tree. The comparative study is carried out between the Manhattan heuristic and the Hamming heuristic using the A* search algorithm implemented in Java. This paper also presents the necessary stages in the object-oriented development of software used to solve this puzzle game efficiently. The modelling of the software is achieved through specific UML diagrams representing the phases of analysis, design and implementation, the system thus being described in a clear and practical manner. In order to confirm the theoretical results demonstrating that the Manhattan heuristic is more efficient, the space complexity criterion was used. The space complexity was measured by the number of nodes generated in the search tree, the number of expanded nodes, and the effective branching factor. The experimental results obtained with the Manhattan heuristic show improvements in the space complexity of the A* algorithm compared with the Hamming heuristic.
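
    For illustration, the two heuristics compared above and a compact A* loop are sketched below for a generic sliding-piece puzzle on a square grid; representing the polyhedra puzzle as a tile grid is a simplifying assumption, and the paper's own implementation is in Java rather than Python.

        # A* with the Hamming heuristic (misplaced pieces) and the Manhattan
        # heuristic (sum of grid distances), on a generic sliding-piece puzzle.
        import heapq

        def hamming(state, goal):
            return sum(1 for s, g in zip(state, goal) if s != g and s != 0)

        def manhattan(state, goal, width):
            dist = 0
            for idx, piece in enumerate(state):
                if piece != 0:
                    goal_idx = goal.index(piece)
                    dist += abs(idx % width - goal_idx % width) + \
                            abs(idx // width - goal_idx // width)
            return dist

        def a_star(start, goal, neighbours, h):
            open_heap = [(h(start), 0, start)]
            best_g = {start: 0}
            while open_heap:
                f, g, state = heapq.heappop(open_heap)
                if state == goal:
                    return g                          # minimum number of moves
                for nxt in neighbours(state):
                    ng = g + 1
                    if ng < best_g.get(nxt, float("inf")):
                        best_g[nxt] = ng
                        heapq.heappush(open_heap, (ng + h(nxt), ng, nxt))
            return None

        def neighbours_8puzzle(state, width=3):
            z = state.index(0)
            r, c = divmod(z, width)
            moves = []
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < width and 0 <= nc < width:
                    n = nr * width + nc
                    s = list(state)
                    s[z], s[n] = s[n], s[z]
                    moves.append(tuple(s))
            return moves

        goal = (1, 2, 3, 4, 5, 6, 7, 8, 0)
        start = (1, 2, 3, 4, 5, 6, 0, 7, 8)
        print(a_star(start, goal, neighbours_8puzzle, lambda s: manhattan(s, goal, 3)))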

  10. Metaheuristic Algorithms Applied to Bioenergy Supply Chain Problems: Theory, Review, Challenges, and Future

    Directory of Open Access Journals (Sweden)

    Krystel K. Castillo-Villar

    2014-11-01

    Full Text Available Bioenergy is a new source of energy that accounts for a substantial portion of the renewable energy production in many countries. The production of bioenergy is expected to increase due to its unique advantages, such as no harmful emissions and abundance. Supply-related problems are the main obstacles precluding the increase of use of biomass (which is bulky and has low energy density) to produce bioenergy. To overcome this challenge, large-scale optimization models are needed to be solved to enable decision makers to plan, design, and manage bioenergy supply chains. Therefore, the use of effective optimization approaches is of great importance. The traditional mathematical methods (such as linear, integer, and mixed-integer programming) frequently fail to find optimal solutions for non-convex and/or large-scale models whereas metaheuristics are efficient approaches for finding near-optimal solutions that use less computational resources. This paper presents a comprehensive review by studying and analyzing the application of metaheuristics to solve bioenergy supply chain models as well as the exclusive challenges of the mathematical problems applied in the bioenergy supply chain field. The reviewed metaheuristics include: (1) population approaches, such as ant colony optimization (ACO), the genetic algorithm (GA), particle swarm optimization (PSO), and the bee colony algorithm (BCA); and (2) trajectory approaches, such as tabu search (TS) and simulated annealing (SA). Based on the outcomes of this literature review, the integrated design and planning of bioenergy supply chains problem has been solved primarily by implementing the GA. The production process optimization was addressed primarily by using both the GA and PSO. The supply chain network design problem was treated by utilizing the GA and ACO. The truck and task scheduling problem was solved using the SA and the TS, where the trajectory-based methods proved to outperform the population

  11. Hybridisations of Variable Neighbourhood Search and Modified Simplex Elements to Harmony Search and Shuffled Frog Leaping Algorithms for Process Optimisations

    Science.gov (United States)

    Aungkulanon, P.; Luangpaiboon, P.

    2010-10-01

    Nowadays, engineering problem systems are large and complicated. An effective finite sequence of instructions for solving these problems can be categorised into optimisation and meta-heuristic algorithms. Although the best decision-variable levels cannot always be selected from the sets of available alternatives, meta-heuristics are an alternative to experience-based techniques that rapidly help in problem solving, learning and discovery, in the hope of obtaining a more efficient or more robust procedure. All meta-heuristics provide auxiliary procedures in terms of their own toolbox functions. It has been shown that the effectiveness of all meta-heuristics depends almost exclusively on these auxiliary functions. In fact, the auxiliary procedure from one meta-heuristic can be implemented in other meta-heuristics. The well-known meta-heuristics of harmony search (HSA) and shuffled frog-leaping algorithms (SFLA) are compared with their hybridisations. The HSA produces a near-optimal solution by modelling the state of harmony reached in the improvisation process of musicians. The SFLA, a population-based meta-heuristic, is a cooperative search metaphor inspired by natural memetics; it includes elements of local search and global information exchange. This study presents solution procedures for constrained and unconstrained problems with single- and multi-peak surfaces, including a curved ridge surface. Both meta-heuristics are modified via the variable neighbourhood search method (VNSM) philosophy, including a modified simplex method (MSM). The basic idea is to change neighbourhoods while searching for a better solution. The hybridisations proceed by a descent method to a local minimum and then explore, systematically or at random, increasingly distant neighbourhoods of this local solution. The results show that the variant of the HSA with the VNSM and MSM appears to be better in terms of the mean and variance of design points and yields.
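    For readers unfamiliar with the harmony search mechanics referred to above, the following is a minimal sketch of the basic HSA improvisation loop (memory consideration, pitch adjustment and random selection); the parameter values and the sphere objective are illustrative assumptions, and the VNSM/MSM hybridisation steps are not shown.

```python
# Minimal harmony search sketch: improvise a new harmony per iteration and
# replace the worst member of the harmony memory when it improves.
import random

def harmony_search(obj, dim, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.05, iters=2000):
    lo, hi = bounds
    memory = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    scores = [obj(h) for h in memory]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if random.random() < hmcr:                 # memory consideration
                value = random.choice(memory)[d]
                if random.random() < par:              # pitch adjustment
                    value += random.uniform(-bw, bw) * (hi - lo)
            else:                                      # random selection
                value = random.uniform(lo, hi)
            new.append(min(hi, max(lo, value)))
        worst = max(range(hms), key=lambda i: scores[i])
        s = obj(new)
        if s < scores[worst]:                          # replace the worst harmony
            memory[worst], scores[worst] = new, s
    best = min(range(hms), key=lambda i: scores[i])
    return memory[best], scores[best]

print(harmony_search(lambda x: sum(v * v for v in x), dim=5, bounds=(-5, 5)))
```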

  12. RDEL: Restart Differential Evolution algorithm with Local Search Mutation for global numerical optimization

    Directory of Open Access Journals (Sweden)

    Ali Wagdy Mohamed

    2014-11-01

    Full Text Available In this paper, a novel version of the Differential Evolution (DE) algorithm based on a combination of a local search mutation and a restart mechanism for solving global numerical optimization problems over continuous space is presented. The proposed algorithm is named the Restart Differential Evolution algorithm with Local Search Mutation (RDEL). In RDEL, inspired by Particle Swarm Optimization (PSO), a novel local mutation rule based on the positions of the best and the worst individuals among the entire population of a particular generation is introduced. The novel local mutation scheme is joined with the basic mutation rule through a linearly decreasing function. The proposed local mutation scheme is shown to enhance the local search tendency of the basic DE and to speed up convergence. Furthermore, a restart mechanism based on a random mutation scheme and a modified Breeder Genetic Algorithm (BGA) mutation scheme is combined to avoid stagnation and/or premature convergence. Additionally, an exponentially increasing crossover probability rule and uniform scaling factors of DE are introduced to promote the diversity of the population and to improve the search process, respectively. The performance of RDEL is investigated and compared with basic differential evolution and state-of-the-art parameter-adaptive differential evolution variants. It is found that the proposed modifications significantly improve the performance of DE in terms of solution quality, efficiency and robustness.
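    A hedged sketch of the kind of blended mutation described above is given below: a classic DE/rand/1 mutant is mixed with a local mutant steered by the current best and worst individuals through a linearly decreasing weight. The exact RDEL formulation may differ; the weighting and coefficients here are assumptions for illustration only.

```python
# Sketch: blend a global rand/1 DE mutant with a best/worst-guided local
# mutant using a linearly decreasing weight over generations.
import numpy as np

rng = np.random.default_rng(0)

def blended_mutation(pop, fitness, gen, max_gen, F=0.5):
    n, dim = pop.shape
    best = pop[np.argmin(fitness)]
    worst = pop[np.argmax(fitness)]
    w = 1.0 - gen / max_gen                            # linearly decreasing weight
    mutants = np.empty_like(pop)
    for i in range(n):
        r1, r2, r3 = rng.choice([j for j in range(n) if j != i], 3, replace=False)
        global_v = pop[r1] + F * (pop[r2] - pop[r3])               # basic rand/1
        local_v = pop[i] + F * (best - pop[i]) + F * (pop[i] - worst)  # local pull
        mutants[i] = w * global_v + (1.0 - w) * local_v
    return mutants

pop = rng.uniform(-5, 5, size=(20, 10))
fit = (pop ** 2).sum(axis=1)
print(blended_mutation(pop, fit, gen=10, max_gen=100).shape)
```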

  13. Robust Mokken Scale Analysis by Means of the Forward Search Algorithm for Outlier Detection

    Science.gov (United States)

    Zijlstra, Wobbe P.; van der Ark, L. Andries; Sijtsma, Klaas

    2011-01-01

    Exploratory Mokken scale analysis (MSA) is a popular method for identifying scales from larger sets of items. As with any statistical method, in MSA the presence of outliers in the data may result in biased results and wrong conclusions. The forward search algorithm is a robust diagnostic method for outlier detection, which we adapt here to…

  14. Neural network based adaptive control of nonlinear plants using random search optimization algorithms

    Science.gov (United States)

    Boussalis, Dhemetrios; Wang, Shyh J.

    1992-01-01

    This paper presents a method for utilizing artificial neural networks for direct adaptive control of dynamic systems with poorly known dynamics. The neural network weights (controller gains) are adapted in real time using state measurements and a random search optimization algorithm. The results are demonstrated via simulation using two highly nonlinear systems.

  15. Remote sensing imagery classification using multi-objective gravitational search algorithm

    Science.gov (United States)

    Zhang, Aizhu; Sun, Genyun; Wang, Zhenjie

    2016-10-01

    Simultaneous optimization of different validity measures can capture different data characteristics of remote sensing imagery (RSI) and thereby achieve high-quality classification results. In this paper, two conflicting cluster validity indices, the Xie-Beni (XB) index and the fuzzy C-means (FCM) (Jm) measure, are integrated with a diversity-enhanced and memory-based multi-objective gravitational search algorithm (DMMOGSA) to present a novel multi-objective optimization based RSI classification method. In this method, the Gabor filter method is first used to extract texture features of the RSI. The texture features are then combined with the spectral features to construct the spatial-spectral feature set of the RSI. Afterwards, clustering of the spectral-spatial feature set is carried out on the basis of the proposed method: cluster centers are randomly generated initially and are then updated and optimized adaptively by employing the DMMOGSA. Accordingly, a set of non-dominated cluster centers is obtained, so that a number of classification results of the RSI are produced and users can pick the most promising one according to their problem requirements. To validate the effectiveness of the proposed method quantitatively and qualitatively, it was applied to classify two aerial high-resolution remote sensing images. The obtained classification results are compared with those produced by two single-cluster-validity-index-based methods and by two classification methods based on state-of-the-art multi-objective optimization algorithms. The comparison shows that the proposed method can achieve more accurate RSI classification.

  16. Examining applying high performance genetic data feature selection and classification algorithms for colon cancer diagnosis.

    Science.gov (United States)

    Al-Rajab, Murad; Lu, Joan; Xu, Qiang

    2017-07-01

    This paper examines the accuracy and efficiency (time complexity) of high performance genetic data feature selection and classification algorithms for colon cancer diagnosis. The need for this research derives from the urgent and increasing need for accurate and efficient algorithms. Colon cancer is a leading cause of death worldwide, hence it is vitally important for the cancer tissues to be expertly identified and classified in a rapid and timely manner, to assure both fast detection of the disease and to expedite the drug discovery process. In this research, a three-phase approach was proposed and implemented: Phases One and Two examined the feature selection algorithms and classification algorithms employed separately, and Phase Three examined the performance of their combination. It was found in Phase One that the Particle Swarm Optimization (PSO) algorithm performed best on the colon dataset as a feature selection method (29 genes selected), and in Phase Two that the Support Vector Machine (SVM) algorithm outperformed the other classifiers, with an accuracy of almost 86%. It was also found in Phase Three that the combined use of PSO and SVM surpassed the other algorithm combinations in accuracy and performance, and was faster in terms of time analysis (94%). It is concluded that applying feature selection algorithms prior to classification algorithms results in better accuracy than when the latter are applied alone. This conclusion is important and significant to industry and society. Copyright © 2017 Elsevier B.V. All rights reserved.
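    To make the pipeline concrete, the sketch below couples a binary-PSO style search over feature masks with SVM cross-validation accuracy as the fitness, in the spirit of the combined PSO+SVM phase; the synthetic dataset, scikit-learn calls and parameter values are placeholders rather than the study's actual setup.

```python
# Binary-PSO style feature selection scored by SVM cross-validation (sketch).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=60, n_features=200, n_informative=10, random_state=0)

def score(mask):
    # fitness of a boolean gene mask = mean 3-fold CV accuracy of a linear SVM
    if not mask.any():
        return 0.0
    return cross_val_score(SVC(kernel="linear"), X[:, mask], y, cv=3).mean()

n_particles, n_iter = 15, 20
pos = rng.random((n_particles, X.shape[1])) < 0.1          # sparse initial masks
vel = rng.normal(0, 1, pos.shape)
pbest, pbest_s = pos.copy(), np.array([score(p) for p in pos])
gbest = pbest[pbest_s.argmax()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = (0.7 * vel
           + 1.5 * r1 * (pbest.astype(float) - pos.astype(float))
           + 1.5 * r2 * (gbest.astype(float) - pos.astype(float)))
    pos = rng.random(pos.shape) < 1.0 / (1.0 + np.exp(-vel))  # sigmoid -> binary
    s = np.array([score(p) for p in pos])
    improved = s > pbest_s
    pbest[improved], pbest_s[improved] = pos[improved], s[improved]
    gbest = pbest[pbest_s.argmax()].copy()

print(int(gbest.sum()), "features selected, CV accuracy", round(float(pbest_s.max()), 3))
```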

  17. A Reliable Order-Statistics-Based Approximate Nearest Neighbor Search Algorithm.

    Science.gov (United States)

    Verdoliva, Luisa; Cozzolino, Davide; Poggi, Giovanni

    2017-01-01

    We propose a new algorithm for fast approximate nearest neighbor search based on the properties of ordered vectors. Data vectors are classified based on the index and sign of their largest components, thereby partitioning the space into a number of cones centered at the origin. The query is itself classified, and the search starts from the selected cone and proceeds to neighboring ones. Overall, the proposed algorithm corresponds to locality sensitive hashing in the space of directions, with hashing based on the order of components. Thanks to the statistical features emerging through ordering, it deals very well with the challenging case of unstructured data, and is a valuable building block for more complex techniques dealing with structured data. Experiments on both simulated and real-world data prove the proposed algorithm to provide state-of-the-art performance.
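    A rough sketch of the cone-partitioning idea follows: each vector is indexed by the index and sign of its largest-magnitude component, and the query is compared only against its own cone (the extension to neighbouring cones and the full ordering-based hashing are omitted); the random data and dimensions are illustrative.

```python
# Cone-based approximate nearest neighbour sketch.
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(1)
data = rng.normal(size=(10_000, 32))

def cone_key(v):
    # index and sign of the largest-magnitude component define the cone
    i = int(np.argmax(np.abs(v)))
    return i, int(np.sign(v[i]))

buckets = defaultdict(list)
for idx, v in enumerate(data):
    buckets[cone_key(v)].append(idx)

def approx_nn(query):
    candidates = buckets.get(cone_key(query), range(len(data)))
    cand = np.asarray(list(candidates))
    d = np.linalg.norm(data[cand] - query, axis=1)
    return int(cand[np.argmin(d)])

q = rng.normal(size=32)
exact = int(np.argmin(np.linalg.norm(data - q, axis=1)))
print("approximate NN:", approx_nn(q), "exact NN:", exact)
```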

  18. Direct search coding algorithm with reduction in computing time by simultaneous selection rule

    Science.gov (United States)

    Tamura, Hitoshi

    2014-05-01

    An optimized encoding algorithm is required to produce high-quality computer-generated holograms (CGHs). For this purpose, I have proposed that the direct search algorithm (DSA) is effective for encoding the amplitude and phase in the Lohmann-type CGH. However, obtaining an optimum solution with the DSA takes a long computation time. To solve this problem, I have found that a simultaneous direct search algorithm (SDSA) greatly shortens the computation time for encoding the Lohmann-type CGH. As a result, the evaluation value of the reconstructed image for the SDSA is the same as the value of 0.992 obtained for the DSA, while the computation time is drastically shortened from 3575 s for the DSA to 55 s for the SDSA.

  19. A Local and Global Search Combined Particle Swarm Optimization Algorithm and Its Convergence Analysis

    Directory of Open Access Journals (Sweden)

    Weitian Lin

    2014-01-01

    Full Text Available The particle swarm optimization algorithm (PSOA) is an effective optimization tool. However, it has a tendency to get stuck in near-optimal solutions, especially for middle- and large-size problems, and it is difficult to improve solution accuracy by fine-tuning parameters. To address this shortcoming, this paper studies the local and global search combined particle swarm optimization algorithm (LGSCPSOA), analyses its convergence and obtains its convergence qualification. The algorithm is tested on a set of 8 benchmark continuous functions and its optimization results are compared with those of the original particle swarm optimization algorithm (OPSOA). Experimental results indicate that the LGSCPSOA significantly improves search performance, especially on the middle- and large-size benchmark functions.

  20. Application of Hybrid HS and Tabu Search Algorithm for Optimal Location of FACTS Devices to Reduce Power Losses in Power Systems

    Directory of Open Access Journals (Sweden)

    Z. Masomi Zohrabad

    2016-12-01

    Full Text Available Power networks continue to grow following the annual growth of energy demand. As constructing new energy generation facilities bears a high cost, minimizing power grid losses becomes essential to permit low-cost energy transmission over larger distances and to additional areas. This study models an optimization problem for an IEEE 30-bus power grid using a Tabu search algorithm based on an improved hybrid Harmony Search (HS) method to reduce overall grid losses. The proposed algorithm is applied to find the best location for the installation of a Unified Power Flow Controller (UPFC). The results obtained from installation of the UPFC in the grid are presented by displaying the outputs.

  1. Parameter Identification of Steam Turbine Speed Governing System Using an Improved Gravitational Search Algorithm

    Science.gov (United States)

    Zhong, Jing-liang; Deng, Tong-tian; Wang, Jia-sheng

    2017-05-01

    Most of the traditional parameter identification methods used for the steam turbine speed governing system (STSGS) suffer from a heavy manual workload, poor fitness and long identification periods. To address these shortcomings, a novel improved gravitational search algorithm (VGSA) is proposed in this paper, building on an improved gravitational search algorithm (IGSA): its gravitational parameter is adjusted dynamically according to the current fitness, and the search space keeps narrowing during the iteration process. The performance of the new method was verified by comparing the STSGS identification results with those of the IGSA, using measured data from a 600 MW and a 300 MW thermal power unit. The results show that the new VGSA method offers higher precision and higher speed during the identification process, and it provides a new scheme for steam turbine speed governing system identification.

  2. Development of a Genetic Algorithm for the search of Optical Model parameters

    Energy Technology Data Exchange (ETDEWEB)

    Abriola, D., E-mail: ad.abriola@iaea.org [International Atomic Energy Agency, Nuclear Data Section, P.O. Box 100, 1400 Vienna (Austria)

    2011-12-15

    The analysis of elastic scattering cross sections in terms of the Optical Model is subject to a series of well known ambiguities. Diverse assumptions about the initial values or shape of the potentials frequently produce different parameters, leading to different physical interpretations of the observed data. It would be important to have a starting set of 'user independent' optical potentials that fit the experimental data to allow the evaluator to consider a large array of possibilities before committing to a particular optical potential. This work presents a Genetic Algorithm (GA) code that simulates natural selection and evolution, allowing a 'blind search' of the multiparametric χ² surface. In this GA, the genes subject to evolution are the parameters of the optical potential. The GA variables, operators and procedures are described, and the GA is applied to two cases in which the elastic scattering cross section is adjusted: one for the ⁷Li + ²⁷Al system at energies close to the Coulomb barrier where the interaction occurs near the nuclear surface, and another for the ¹⁶O + ¹⁶O system where the two nuclei deeply interpenetrate each other. Further developments are described.
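    The sketch below shows the general shape of such a GA-driven χ² fit, with the genes being model parameters and the fitness being the χ² misfit to data; the functional form of the "model" is a stand-in, not an actual optical-model cross-section calculation.

```python
# Toy real-coded GA: genes = model parameters, fitness = chi-squared misfit.
import numpy as np

rng = np.random.default_rng(2)
angles = np.linspace(10, 170, 30)

def model(theta, params):
    depth, radius, diffuseness = params            # placeholder functional form
    return depth * np.exp(-theta / (radius * 40)) + diffuseness

true = np.array([50.0, 1.2, 0.6])
data = model(angles, true) * (1 + 0.03 * rng.normal(size=angles.size))
sigma = 0.03 * data

def chi2(params):
    return np.sum(((model(angles, params) - data) / sigma) ** 2)

bounds = np.array([[10, 100], [0.8, 1.6], [0.3, 1.0]])
pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(40, 3))
for _ in range(200):
    fit = np.array([chi2(p) for p in pop])
    parents = pop[np.argsort(fit)[:20]]            # truncation selection (elitist)
    kids = []
    for _ in range(20):
        a, b = parents[rng.integers(20, size=2)]
        child = np.where(rng.random(3) < 0.5, a, b)                 # uniform crossover
        child = child + rng.normal(0, 0.02, 3) * (bounds[:, 1] - bounds[:, 0])  # mutation
        kids.append(np.clip(child, bounds[:, 0], bounds[:, 1]))
    pop = np.vstack([parents, np.array(kids)])

best = pop[np.argmin([chi2(p) for p in pop])]
print("best parameters:", best.round(3), "chi2:", round(float(chi2(best)), 2))
```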

  3. PARALLEL QUICK SEARCH ALGORITHM TO SPEED PACKET PAYLOAD FILTERING IN NIDS

    Directory of Open Access Journals (Sweden)

    ADNAN A. HNAIF

    2009-06-01

    Full Text Available An Intrusion Detection System (IDS) detects intruders who try to hack into the network and steal information, and reports them to the network administrator. Among the many tools used in this field, Snort is one of the most widely used in Network Intrusion Detection Systems (NIDS). Although string matching consumes 31% of its total processing, and 80% of total processing in the case of web-intensive traffic, Snort uses its rule sets to determine which packets are allowed to pass and which are rejected. In this paper, we parallelized the quick search algorithm using OpenMP and Pthreads (POSIX) in the C language and compared the two implementations; the required number of threads is determined according to several factors. By doing this, we managed to speed up the filtering process by more than 40%. Finally, we applied the proposed method in a NIDS to enhance the speed of the matching process between incoming packet contents and Snort rule sets.
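    For reference, the following is a minimal Quick Search (Sunday) matcher together with a naive payload-level parallelisation using a process pool; the C/OpenMP/Pthread implementations in the paper parallelise the work differently, so this is only an illustration of the underlying matching rule.

```python
# Quick Search (Sunday) string matching plus a simple per-payload process pool.
from concurrent.futures import ProcessPoolExecutor

def quick_search(text: bytes, pattern: bytes):
    m, n = len(pattern), len(text)
    # shift is determined by the character just after the current window
    shift = {c: m - i for i, c in enumerate(pattern)}
    hits, pos = [], 0
    while pos <= n - m:
        if text[pos:pos + m] == pattern:
            hits.append(pos)
        if pos + m >= n:
            break
        pos += shift.get(text[pos + m], m + 1)
    return hits

def scan(payload, pattern=b"attack"):
    return quick_search(payload, pattern)

if __name__ == "__main__":
    payloads = [b"GET /index?cmd=attack HTTP/1.1", b"benign traffic", b"attack attack"]
    with ProcessPoolExecutor() as pool:
        for payload, hits in zip(payloads, pool.map(scan, payloads)):
            print(payload[:20], hits)
```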

  4. Hybridization of Strength Pareto Multiobjective Optimization with Modified Cuckoo Search Algorithm for Rectangular Array.

    Science.gov (United States)

    Abdul Rani, Khairul Najmy; Abdulmalek, Mohamedfareq; A Rahim, Hasliza; Siew Chin, Neoh; Abd Wahab, Alawiyah

    2017-04-20

    This research proposes various versions of the modified cuckoo search (MCS) metaheuristic algorithm deploying the strength Pareto evolutionary algorithm (SPEA) multiobjective (MO) optimization technique in rectangular array geometry synthesis. Precisely, the MCS algorithm is proposed by incorporating the Roulette wheel selection operator to choose the initial host nests (individuals) that give better results, adaptive inertia weight to control the positions exploration of the potential best host nests (solutions), and dynamic discovery rate to manage the fraction probability of finding the best host nests in 3-dimensional search space. In addition, the MCS algorithm is hybridized with the particle swarm optimization (PSO) and hill climbing (HC) stochastic techniques along with the standard strength Pareto evolutionary algorithm (SPEA) forming the MCSPSOSPEA and MCSHCSPEA, respectively. All the proposed MCS-based algorithms are examined to perform MO optimization on Zitzler-Deb-Thiele's (ZDT's) test functions. Pareto optimum trade-offs are performed to generate a set of three non-dominated solutions, which are locations, excitation amplitudes, and excitation phases of array elements, respectively. Overall, simulations demonstrate that the proposed MCSPSOSPEA outperforms other compatible competitors in gaining a high antenna directivity, small half-power beamwidth (HPBW), low average side lobe level (SLL) suppression, and/or significant predefined nulls mitigation, simultaneously.

  5. Hybridization of Strength Pareto Multiobjective Optimization with Modified Cuckoo Search Algorithm for Rectangular Array

    Science.gov (United States)

    Abdul Rani, Khairul Najmy; Abdulmalek, Mohamedfareq; A. Rahim, Hasliza; Siew Chin, Neoh; Abd Wahab, Alawiyah

    2017-04-01

    This research proposes various versions of the modified cuckoo search (MCS) metaheuristic algorithm deploying the strength Pareto evolutionary algorithm (SPEA) multiobjective (MO) optimization technique in rectangular array geometry synthesis. Precisely, the MCS algorithm is proposed by incorporating the Roulette wheel selection operator to choose the initial host nests (individuals) that give better results, adaptive inertia weight to control the positions exploration of the potential best host nests (solutions), and dynamic discovery rate to manage the fraction probability of finding the best host nests in 3-dimensional search space. In addition, the MCS algorithm is hybridized with the particle swarm optimization (PSO) and hill climbing (HC) stochastic techniques along with the standard strength Pareto evolutionary algorithm (SPEA) forming the MCSPSOSPEA and MCSHCSPEA, respectively. All the proposed MCS-based algorithms are examined to perform MO optimization on Zitzler-Deb-Thiele’s (ZDT’s) test functions. Pareto optimum trade-offs are performed to generate a set of three non-dominated solutions, which are locations, excitation amplitudes, and excitation phases of array elements, respectively. Overall, simulations demonstrate that the proposed MCSPSOSPEA outperforms other compatible competitors in gaining a high antenna directivity, small half-power beamwidth (HPBW), low average side lobe level (SLL) suppression, and/or significant predefined nulls mitigation, simultaneously.

  6. Derivation and Validation of a Search Algorithm to Retrospectively Identify CRRT Initiation in the ECMO Patients.

    Science.gov (United States)

    Guru, Pramod K; Singh, Tarun D; Passe, Melissa; Kashani, Kianoush B; Schears, Gregory J; Kashyap, Rahul

    2016-01-01

    The role of extracorporeal membrane oxygenation (ECMO) in refractory cardiorespiratory failure is gaining momentum with recent advancements in technology. However, the need for dialysis modes such as continuous renal replacement therapy (CRRT) has also increased in the management of acute kidney injury. Establishing the exact timing of CRRT initiation in these patients from the electronic medical record is vital for automated data extraction for research and quality improvement efforts. We aimed to derive and validate an automated Electronic Health Records (EHR) search strategy for CRRT initiation in patients receiving ECMO. We screened 488 patients who received ECMO, of whom 213 underwent CRRT. We evaluated a random sample of 120 patients: 60 in the derivation cohort and 60 in the validation cohort. Following implementation of the eligibility criteria, the algorithm was derived in 55 out of 120 ECMO/CRRT patients. The search algorithm was developed using the first-time chart entry of 'access pressure drop' at CRRT initiation. The algorithm was then validated in an independent subset of 52 patients from the same time period. The overall agreement between the electronic search algorithm and a comprehensive manual medical record review in the derivation and validation subsets, using 'access pressure drop' as the reference standard, was compared to assess CRRT initiation time. In the derivation subset (N=55), the automated electronic search strategy achieved excellent agreement with the manual search (κ=0.99; 54 patients were identified electronically and 55 upon manual review). There was no time difference observed in 49/54 (89%) patients, while in the remaining 5 (9%) patients the time difference was within 15 minutes. In the validation cohort (N=52), agreement was 100% (κ=1.0, both methods identified 52 patients). Out of 52 patients, 47 (90%) had no time difference between the methods; for the remaining 5 (10%) patients, differences were within 15 minutes. The use of an electronic search

  7. An Adaptive Single-Well Stochastic Resonance Algorithm Applied to Trace Analysis of Clenbuterol in Human Urine

    Directory of Open Access Journals (Sweden)

    Shaofei Xie

    2012-02-01

    Full Text Available Based on the theory of stochastic resonance, an adaptive single-well stochastic resonance (ASSR) coupled with a genetic algorithm was developed to enhance the signal-to-noise ratio of weak chromatographic signals. In conventional stochastic resonance algorithms, two or more parameters need to be optimized and the proper parameter values are obtained by a universal search within a given range. In the developed ASSR, the optimization of the system parameter was simplified and implemented automatically. The ASSR was applied to the trace analysis of clenbuterol in human urine and it helped to significantly improve the limit of detection and limit of quantification of clenbuterol. The good linearity, precision and accuracy of the proposed method ensure that it could be an effective tool for trace analysis and for improving the detection sensitivity of current detectors.

  8. Power to the People! Meta-algorithmic modelling in applied data science

    NARCIS (Netherlands)

    Spruit, M.; Jagesar, R.

    2016-01-01

    This position paper first defines the research field of applied data science at the intersection of domain expertise, data mining, and engineering capabilities, with particular attention to analytical applications. We then propose a meta-algorithmic approach for applied data science with societal

  9. Voltage stability index based optimal placement of static VAR compensator and sizing using Cuckoo search algorithm

    Science.gov (United States)

    Venkateswara Rao, B.; Kumar, G. V. Nagesh; Chowdary, D. Deepak; Bharathi, M. Aruna; Patra, Stutee

    2017-07-01

    This paper presents a new metaheuristic algorithm, the Cuckoo Search Algorithm (CSA), for solving the optimal power flow (OPF) problem with minimization of real power generation cost. The CSA is found to be a most efficient algorithm for solving single-objective optimal power flow problems. Its performance is tested on the IEEE 57-bus test system with real power generation cost minimization as the objective function. The Static VAR Compensator (SVC) is one of the best shunt-connected devices in the Flexible Alternating Current Transmission System (FACTS) family; it is capable of controlling the voltage magnitudes of buses by injecting reactive power into the system. In this paper the SVC is integrated into the CSA-based Optimal Power Flow to optimize the real power generation cost, and it is used to improve the voltage profile of the system. The CSA gives better results than the genetic algorithm (GA) in both the without-SVC and with-SVC conditions.

  10. CSLM: Levenberg Marquardt based Back Propagation Algorithm Optimized with Cuckoo Search

    Directory of Open Access Journals (Sweden)

    Nazri Mohd. Nawi

    2014-11-01

    Full Text Available Training an artificial neural network is an optimization task, since it is desired to find optimal weight sets for a neural network during the training process. Traditional training algorithms such as back propagation have drawbacks such as getting stuck in local minima and slow convergence speed. This study combines the best features of two algorithms, Levenberg Marquardt back propagation (LMBP) and Cuckoo Search (CS), to improve the convergence speed of artificial neural network (ANN) training. The proposed CSLM algorithm is trained on XOR and OR datasets. The experimental results show that the proposed CSLM algorithm performs better than other similar hybrid variants used in this study.

  11. The use of a multiobjective evolutionary algorithm to increase flexibility in the search for better IMRT plans.

    Science.gov (United States)

    Holdsworth, Clay; Kim, Minsun; Liao, Jay; Phillips, Mark

    2012-04-01

    The purpose of this work was to evaluate how a more flexible and thorough multiobjective search of feasible IMRT plans affects performance in IMRT optimization. A multiobjective evolutionary algorithm (MOEA) was used as a tool to investigate how expanding the search space to include a wider range of penalty functions affects the quality of the set of IMRT plans produced. The MOEA uses a population of IMRT plans to generate new IMRT plans through deterministic minimization of recombined penalty functions that are weighted sums of multiple, tissue-specific objective functions. The quality of the generated plans is judged by an independent set of nonconvex, clinically relevant decision criteria, and all dominated plans are eliminated. As this process repeats itself, better plans are produced so that the population of IMRT plans will approach the Pareto front. Three different approaches were used to explore the effects of expanding the search space. First, the evolutionary algorithm used genetic optimization principles to search by simultaneously optimizing both the weights and tissue-specific dose parameters in penalty functions. Second, penalty function parameters were individually optimized for each voxel in all organs at risk (OARs) in the MOEA. Finally, a heuristic voxel-specific improvement (VSI) algorithm that can be used on any IMRT plan was developed that incrementally improves voxel-specific penalty function parameters for all structures (OARs and targets). Different approaches were compared using the concept of domination comparison applied to the sets of plans obtained by multiobjective optimization. MOEA optimizations that simultaneously searched both importance weights and dose parameters generated sets of IMRT plans that were superior to sets of plans produced when either type of parameter was fixed, for four example prostate plans. The amount of improvement increased with greater overlap between OARs and targets. Allowing the MOEA to search for voxel-specific penalty functions

  12. Parameter Identification of the 2-Chlorophenol Oxidation Model Using Improved Differential Search Algorithm

    Directory of Open Access Journals (Sweden)

    Guang-zhou Chen

    2015-01-01

    Full Text Available Parameter identification plays a crucial role in simulating and using a model. This paper first carries out a sensitivity analysis of the 2-chlorophenol oxidation model in supercritical water using the Monte Carlo method. Then, to address the nonlinearity of the model, two improved differential search (DS) algorithms are proposed to carry out the parameter identification of the model. One strategy is to adopt the Latin hypercube sampling method to replace the uniform distribution of the initial population; the other is to combine DS with the simplex method. The results of the sensitivity analysis reveal the sensitivity and the degree of identification difficulty of every model parameter. Furthermore, the posterior probability distribution of the parameters and the collaborative relationship between any two parameters can be obtained. To verify the effectiveness of the improved algorithms, the optimization performance of the improved DS in kinetic parameter estimation is studied and compared with that of the basic DS algorithm, differential evolution, artificial bee colony optimization, and quantum-behaved particle swarm optimization. The experimental results demonstrate that the DS with the Latin hypercube sampling method alone does not present better performance, while the hybrid method combines strong global and local search abilities and is more effective than the other algorithms.
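    The Latin hypercube initialisation mentioned above can be sketched as follows: each dimension is divided into as many strata as there are individuals, one sample is drawn per stratum, and the strata are permuted independently per dimension; the bounds and population size are illustrative.

```python
# Latin hypercube population initialisation (sketch).
import numpy as np

def latin_hypercube_population(n_individuals, lower, upper, rng):
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    dim = lower.size
    # one stratum per individual in every dimension, sampled then permuted
    u = (rng.random((n_individuals, dim)) + np.arange(n_individuals)[:, None]) / n_individuals
    for d in range(dim):
        u[:, d] = rng.permutation(u[:, d])
    return lower + u * (upper - lower)

rng = np.random.default_rng(0)
pop = latin_hypercube_population(10, lower=[0, -1, 5], upper=[1, 1, 15], rng=rng)
print(pop.round(2))
```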

  13. Combined heat and power economic dispatch by a fish school search algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Leonardo Trigueiro dos; Costa e Silva, Marsil de Athayde [Undergraduate in Mechatronics Engineering, Pontifical Catholic University of Parana, Curitiba, PR (Brazil); Coelho, Leandro dos Santos [Industrial and Systems Engineering Graduate Program, PPGEPS, Pontifical Catholic University of Parana, Curitiba, PR (Brazil)], e-mail: leandro.coelho@pucpr.br

    2010-07-01

    The conversion of primary fossil fuels, such as coal and gas, to electricity is a relatively inefficient process. Even the most modern combined cycle plants can only achieve efficiencies of between 50% and 60%. A great portion of the energy wasted in this conversion process is released to the environment as waste heat. The principle of combined heat and power, also known as cogeneration, is to recover and make beneficial use of this heat, significantly raising the overall efficiency of the conversion process. However, the optimal utilization of multiple combined heat and power systems is a complicated problem that needs powerful methods to solve. This paper presents a fish school search (FSS) algorithm to solve the combined heat and power economic dispatch problem. FSS is a novel approach recently proposed to perform search in complex optimization problems. Simulations presented in the literature indicate that FSS can outperform many bio-inspired algorithms, mainly on multimodal functions. The search process in FSS is carried out by a population of limited-memory individuals, the fishes, each of which represents a possible solution to the problem. Similarly to particle swarm optimization or the genetic algorithm, search guidance in FSS is driven by the success of some individual members of the population. A recently proposed four-unit system, which is a benchmark case in the power systems field, is used as a case study in this paper. (author)

  14. A novel algorithm for validating peptide identification from a shotgun proteomics search engine.

    Science.gov (United States)

    Jian, Ling; Niu, Xinnan; Xia, Zhonghang; Samir, Parimal; Sumanasekera, Chiranthani; Mu, Zheng; Jennings, Jennifer L; Hoek, Kristen L; Allos, Tara; Howard, Leigh M; Edwards, Kathryn M; Weil, P Anthony; Link, Andrew J

    2013-03-01

    Liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS) has revolutionized the proteomics analysis of complexes, cells, and tissues. In a typical proteomic analysis, the tandem mass spectra from an LC-MS/MS experiment are assigned to a peptide by a search engine that compares the experimental MS/MS peptide data to theoretical peptide sequences in a protein database. The peptide-spectrum matches are then used to infer a list of identified proteins in the original sample. However, the search engines often fail to distinguish between correct and incorrect peptide assignments. In this study, we designed and implemented a novel algorithm called De-Noise to reduce the number of incorrect peptide matches and maximize the number of correct peptides at a fixed false discovery rate using a minimal number of scoring outputs from the SEQUEST search engine. The novel algorithm uses a three-step process: data cleaning, data refining through an SVM-based decision function, and a final data refining step based on proteolytic peptide patterns. Using proteomics data generated on different types of mass spectrometers, we optimized the De-Noise algorithm on the basis of the resolution and mass accuracy of the mass spectrometer employed in the LC-MS/MS experiment. Our results demonstrate that De-Noise improves peptide identification compared to other methods used to process the peptide sequence matches assigned by SEQUEST. Because De-Noise uses a limited number of scoring attributes, it can be easily implemented with other search engines.

  15. A novel symbiotic organisms search algorithm for optimal power flow of power system with FACTS devices

    Directory of Open Access Journals (Sweden)

    Dharmbir Prasad

    2016-03-01

    Full Text Available In this paper, the symbiotic organisms search (SOS) algorithm is proposed for the solution of the optimal power flow (OPF) problem of a power system equipped with flexible AC transmission systems (FACTS) devices. Inspired by the interactions between organisms in an ecosystem, SOS is a recent population-based algorithm which, unlike other algorithms, does not require any algorithm-specific control parameters. The performance of the proposed SOS algorithm is tested on the modified IEEE-30 bus and IEEE-57 bus test systems incorporating two types of FACTS devices, namely a thyristor controlled series capacitor and a thyristor controlled phase shifter, at fixed locations. The OPF problem of the present work is formulated with four different objective functions: (a) fuel cost minimization, (b) transmission active power loss minimization, (c) emission reduction and (d) minimization of combined economic and environmental cost. The simulation results exhibit the potential of the proposed SOS algorithm and demonstrate its effectiveness for solving the OPF problem of a power system incorporating FACTS devices, relative to the other evolutionary optimization techniques that have surfaced in the recent state-of-the-art literature.

  16. FAILURE CORRECTION OF LINEAR ARRAY ANTENNA WITH MULTIPLE NULL PLACEMENT USING CUCKOO SEARCH ALGORITHM

    Directory of Open Access Journals (Sweden)

    R. Muralidaran

    2014-03-01

    Full Text Available The influence of evolutionary algorithms has extended to almost every complex optimization problem. In this paper, the cuckoo search algorithm, based on brood-parasite behaviour along with Lévy weights, is proposed for correcting the radiation pattern of a uniformly spaced linear array of isotropic antennas in which more than one antenna element has failed. Although the deterioration produced by the failure of antenna elements results in various undesirable effects, this paper considers the correction of the side lobe level and null placement at two positions. Various previous articles have shown that the radiation pattern can be corrected by altering the amplitude weights of the remaining unfailed elements instead of replacing the faulty elements. This approach is used here: the current excitations of the unfailed elements are modified using the cuckoo search algorithm so that the resulting radiation pattern is similar to the original unfailed pattern in terms of side lobe level and null placement at two positions. The examples shown in this paper demonstrate the effectiveness of this algorithm in achieving the desired objectives.

  17. New reference trajectory optimization algorithm for a flight management system inspired in beam search

    Directory of Open Access Journals (Sweden)

    Alejandro MURRIETA-MENDOZA

    2017-08-01

    Full Text Available With the objective of reducing the flight cost and the amount of polluting emissions released into the atmosphere, a new optimization algorithm considering the climb, cruise and descent phases is presented for the reference vertical flight trajectory. The selection of the reference vertical navigation speeds and altitudes is solved as a discrete combinatorial problem by passing through the nodes of a graph-tree using the beam search optimization technique. To achieve a compromise between the execution time and the algorithm's ability to find the global optimal solution, a heuristic methodology introducing a parameter called the "optimism coefficient" is used to estimate the trajectory's flight cost at every node. The optimal trajectory cost obtained with the developed algorithm is compared with the cost of the optimal trajectory provided by a commercial flight management system (FMS). The global optimal solution is validated against an exhaustive search algorithm (ESA) separate from the proposed algorithm. The developed algorithm takes into account weather effects, step climbs during cruise and air traffic management constraints such as constant altitude segments, constant cruise Mach, and a pre-defined reference lateral navigation route. The aircraft fuel burn is computed using a numerical performance model which was created and validated using flight test experimental data.
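    A hedged sketch of the beam-search idea follows: discrete speed/altitude choices are expanded segment by segment, only the cheapest partial trajectories are kept, and candidates are ranked by their cost so far plus an optimistic estimate of the remaining cost scaled by an "optimism coefficient"; the cost model, discretisation and coefficient value are stand-ins, not the paper's aircraft performance model.

```python
# Beam search over discrete (altitude, Mach) choices with an optimistic
# estimate of the remaining-segment cost.
import itertools

ALTITUDES = [32000, 34000, 36000, 38000]       # ft
MACHS = [0.76, 0.78, 0.80]
N_SEGMENTS = 5
OPTIMISM = 0.9                                 # < 1 under-estimates remaining cost

def segment_cost(alt, mach):
    # placeholder cost: decreases with altitude, increases with speed
    return 1000 - 0.01 * alt + 4000 * (mach - 0.76)

BEST_SEGMENT = min(segment_cost(a, m) for a, m in itertools.product(ALTITUDES, MACHS))

def beam_search(beam_width=4):
    beams = [([], 0.0)]                        # (choices so far, cost so far)
    for seg in range(N_SEGMENTS):
        expanded = []
        for choices, cost in beams:
            for alt, mach in itertools.product(ALTITUDES, MACHS):
                c = cost + segment_cost(alt, mach)
                remaining = OPTIMISM * BEST_SEGMENT * (N_SEGMENTS - seg - 1)
                expanded.append((c + remaining, choices + [(alt, mach)], c))
        expanded.sort(key=lambda t: t[0])      # keep the most promising candidates
        beams = [(choices, cost) for _, choices, cost in expanded[:beam_width]]
    return beams[0]

print(beam_search())
```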

  18. An Effective Cuckoo Search Algorithm for Node Localization in Wireless Sensor Network.

    Science.gov (United States)

    Cheng, Jing; Xia, Linyuan

    2016-08-31

    Localization is an essential requirement in the increasing prevalence of wireless sensor network (WSN) applications. Reducing the computational complexity and communication overhead in WSN localization is of paramount importance in order to prolong the lifetime of the energy-limited sensor nodes and improve localization performance. This paper proposes an effective Cuckoo Search (CS) algorithm for node localization. Based on a modification of the step size, this approach enables the population to approach the global optimal solution rapidly, and the fitness of each solution is employed to build a mutation probability for avoiding local convergence. Further, the approach restricts the population to a certain range so that the energy consumption caused by insignificant searches is prevented. Extensive experiments were conducted to study the effects of parameters like anchor density, node density and communication range on the proposed algorithm with respect to average localization error and localization success ratio. In addition, a comparative study was conducted to realize the same localization task using the same network deployment. Experimental results prove that the proposed CS algorithm can not only increase the convergence rate but also reduce the average localization error compared with the standard CS algorithm and the Particle Swarm Optimization (PSO) algorithm.

  19. Hybrid local search algorithm via evolutionary avalanches for spin glass based portfolio selection

    Directory of Open Access Journals (Sweden)

    Majid Vafaei Jahan

    2012-07-01

    As shown in this paper, this strategy can lead to a faster rate of convergence and improved performance compared with conventional SA and EO algorithms. The results are then used to solve the portfolio selection multi-objective problem, which is a non-deterministic polynomial complete (NPC) problem. This is confirmed by test results from five of the world’s major stock markets, a reliability test and a phase transition diagram; finally, the convergence speed is compared to that of other heuristic methods such as the Neural Network (NN), Tabu Search (TS), and Genetic Algorithm (GA).

  20. A Formal Verification Model for Performance Analysis of Reinforcement Learning Algorithms Applied to Dynamic Networks

    OpenAIRE

    Shrirang Ambaji KULKARNI; Raghavendra G. RAO

    2017-01-01

    Routing data packets in a dynamic network is a difficult and important problem in computer networks. As the network is dynamic, it is subject to frequent topology changes and to variable link costs due to congestion and bandwidth. Existing shortest path algorithms fail to converge to better solutions under dynamic network conditions. Reinforcement learning algorithms possess better adaptation techniques in dynamic environments. In this paper we apply a model-based Q-Routing technique ...
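    For context, the core Q-routing learning rule (in the Boyan and Littman style) that model-based variants build on can be sketched as follows; the node names, delays and exploration rate are illustrative assumptions.

```python
# Q-routing sketch: each node learns the estimated delivery time to a
# destination via each neighbour and routes greedily on those estimates.
import random
from collections import defaultdict

Q = defaultdict(lambda: defaultdict(lambda: defaultdict(float)))  # Q[node][dest][neighbour]
ETA = 0.5

def q_routing_update(node, dest, neighbour, queue_delay, tx_delay, neighbour_table):
    """Update after forwarding a packet for `dest` from `node` to `neighbour`."""
    # neighbour's own best estimate of the remaining delivery time
    remaining = min(neighbour_table.values()) if neighbour_table else 0.0
    target = queue_delay + tx_delay + remaining
    Q[node][dest][neighbour] += ETA * (target - Q[node][dest][neighbour])

def choose_next_hop(node, dest, neighbours):
    # greedy choice with a small exploration probability
    if random.random() < 0.05 or not Q[node][dest]:
        return random.choice(neighbours)
    return min(neighbours, key=lambda y: Q[node][dest][y])

# toy usage: node 'A' forwards a packet for destination 'D' via 'B'
q_routing_update('A', 'D', 'B', queue_delay=0.4, tx_delay=1.0,
                 neighbour_table={'C': 2.0, 'D': 1.5})
print(dict(Q['A']['D']))
```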

  1. Algorithms for searching Fast radio bursts and pulsars in tight binary systems.

    Science.gov (United States)

    Zackay, Barak

    2017-01-01

    Fast radio bursts (FRBs) are exciting, recently discovered astrophysical transients whose origins are unknown. Currently, these bursts are believed to be coming from cosmological distances, allowing us to probe the electron content on cosmological length scales. Even though their precise localization is crucial for the determination of their origin, radio interferometers have not been extensively employed in searching for them due to computational limitations. I will briefly present the Fast Dispersion Measure Transform (FDMT) algorithm, which reduces the operation count in blind incoherent dedispersion by 2-3 orders of magnitude. In addition, FDMT enables us to probe the unexplored domain of sub-microsecond astrophysical pulses. Pulsars in tight binary systems are among the most important astrophysical objects, as they provide our best tests of general relativity in the strong-field regime. I will provide a preview of a novel algorithm that enables the detection of pulsars in short binary systems using observation times longer than an orbital period. Current pulsar search programs limit their searches to integration times shorter than a few percent of the orbital period. Until now, searching for pulsars in binary systems using observation times longer than an orbital period was considered impossible, as one has to blindly enumerate all options for the Keplerian parameters, the pulsar rotation period, and the unknown DM. Using the current state-of-the-art pulsar search techniques and all computers on Earth, such an enumeration would take longer than a Hubble time. I will demonstrate that, using the new algorithm, it is possible to conduct such an enumeration on a laptop using real data of the double pulsar PSR J0737-3039. Other applications of this algorithm include: 1) searching for all pulsars at all sky positions in gamma-ray observations of the Fermi LAT satellite; 2) blind searching for continuous gravitational wave sources emitted by pulsars with

  2. Hooke–Jeeves Method-used Local Search in a Hybrid Global Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    V. D. Sulimov

    2014-01-01

    Full Text Available Modern methods for the optimization investigation of complex systems are based on developing and updating mathematical models of the systems by solving the appropriate inverse problems. The input data required for a solution are obtained from the analysis of experimentally determined characteristics of a system or a process. The sought causal characteristics include the equation coefficients of the object's mathematical model, the boundary conditions, and so on. The optimization approach is one of the main ways to solve inverse problems. In the general case it is necessary to find a global extremum of a criterion function that is not everywhere differentiable. Global optimization methods are widely used in problems of identification and computational diagnosis, as well as in optimal control, computed tomography, image restoration, training of neural networks, and other intelligent technologies. The increasingly complicated systems to be optimized observed over the last decades lead to more complicated mathematical models, thereby making the solution of the corresponding extreme problems significantly more difficult. Many practical applications have problem conditions that restrict modelling. As a consequence, in inverse problems the criterion functions can be noisy and not everywhere differentiable. The presence of noise means that calculating the derivatives is difficult and unreliable, which motivates the use of optimization methods that do not calculate derivatives. The efficiency of deterministic global optimization algorithms is significantly restricted by their dependence on the dimension of the extreme problem. When the number of variables is large, stochastic global optimization algorithms are used. As stochastic algorithms yield too expensive solutions, this drawback restricts their applications. Hybrid algorithms are therefore developed that combine a stochastic algorithm for scanning the variable space with deterministic local search
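    As an example of the deterministic local-search component such hybrids rely on, here is a compact Hooke–Jeeves pattern search (exploratory coordinate moves followed by pattern moves, with step-size reduction on failure); the objective and parameter values are illustrative.

```python
# Derivative-free Hooke-Jeeves pattern search (sketch).
import numpy as np

def explore(f, point, fpoint, step):
    # coordinate-wise exploratory moves around `point`
    point = point.copy()
    for i in range(point.size):
        for delta in (step, -step):
            trial = point.copy()
            trial[i] += delta
            ftrial = f(trial)
            if ftrial < fpoint:
                point, fpoint = trial, ftrial
                break
    return point, fpoint

def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-6):
    x = np.asarray(x0, float)
    fx = f(x)
    while step > tol:
        y, fy = explore(f, x, fx, step)
        if fy < fx:
            while fy < fx:
                # pattern move: jump along the improving direction, explore there
                pattern = y + (y - x)
                x, fx = y, fy
                y, fy = explore(f, pattern, f(pattern), step)
        else:
            step *= shrink                      # no improvement: refine step size
    return x, fx

print(hooke_jeeves(lambda v: (v[0] - 3) ** 2 + (v[1] + 1) ** 2, [0.0, 0.0]))
```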

  3. Harmony search optimization algorithm for a novel transportation problem in a consolidation network

    Science.gov (United States)

    Davod Hosseini, Seyed; Akbarpour Shirazi, Mohsen; Taghi Fatemi Ghomi, Seyed Mohammad

    2014-11-01

    This article presents a new harmony search optimization algorithm to solve a novel integer programming model developed for a consolidation network. In this network, a set of vehicles is used to transport goods from suppliers to their corresponding customers via two transportation systems: direct shipment and milk run logistics. The objective of this problem is to minimize the total shipping cost in the network, so it tries to reduce the number of required vehicles using an efficient vehicle routing strategy in the solution approach. Solving several numerical examples confirms that the proposed solution approach based on the harmony search algorithm performs much better than CPLEX in reducing both the shipping cost in the network and computational time requirement, especially for realistic size problem instances.

  4. Transmission network expansion planning based on hybridization model of neural networks and harmony search algorithm

    Directory of Open Access Journals (Sweden)

    Mohammad Taghi Ameli

    2012-01-01

    Full Text Available Transmission Network Expansion Planning (TNEP) is a basic part of power network planning that determines where, when and how many new transmission lines should be added to the network. The TNEP is thus an optimization problem in which the expansion objectives are optimized. Artificial Intelligence (AI) tools such as the Genetic Algorithm (GA), Simulated Annealing (SA), Tabu Search (TS) and Artificial Neural Networks (ANNs) are methods used for solving the TNEP problem. Today, by using hybridization models of AI tools, we can solve the TNEP problem for large-scale systems, which shows the effectiveness of such models. In this paper, a new approach based on a hybridization model of Probabilistic Neural Networks (PNNs) and the Harmony Search Algorithm (HSA) is used to solve the TNEP problem. Finally, considering the uncertain role of the load through a scenario technique, the proposed model is tested on Garver's 6-bus network.

  5. New approaches of the potential field for QPSO algorithm applied to nuclear reactor reload problem

    Energy Technology Data Exchange (ETDEWEB)

    Nicolau, Andressa dos Santos; Schirru, Roberto, E-mail: andressa@lmp.ufrj.br [Coordenacao dos Programas de Pos-Graduacao em Engenharia (COPPE/UFRJ), Rio de Janeiro, RJ (Brazil). Programa de Engenharia Nuclear

    2015-07-01

    Recently, a quantum-inspired version of the Particle Swarm Optimization (PSO) algorithm, Quantum Particle Swarm Optimization (QPSO), was proposed. The QPSO algorithm permits all particles to have a quantum behavior, where some sort of 'quantum motion' is imposed in the search process. When the QPSO is tested against a set of benchmark functions, it shows superior performance compared to the classical PSO: it outperforms the classical algorithm most of the time in convergence speed and achieves better values of the fitness functions. The great advantage of the QPSO algorithm is that it uses only one control parameter. The critical step of the QPSO algorithm is the choice of a suitable attractive potential field that can guarantee bound states for the particles moving in the quantum environment. In this article, one version of the QPSO algorithm was tested with two types of potential well: the delta-potential well and the harmonic oscillator. The main goal of this study is to show which of the potential fields is the most suitable for use in QPSO for solving the Nuclear Reactor Reload Optimization Problem, in particular for cycle 7 of a Brazilian Nuclear Power Plant. All results were compared with the performance of the classical counterpart from the literature and show that the QPSO algorithm is well situated among the best alternatives for dealing with hard optimization problems, such as the NRROP. (author)
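    The delta-well variant mentioned above corresponds to the commonly cited QPSO position update (in the style of Sun et al.), sketched below on a toy objective; the harmonic-oscillator well changes the sampling distribution and is not shown, and the reload-problem encoding is omitted, so this is only a generic illustration.

```python
# Delta-well QPSO position update on a toy sphere objective (sketch).
import numpy as np

rng = np.random.default_rng(3)

def sphere(x):
    return np.sum(x ** 2, axis=-1)

n, dim, iters, alpha = 30, 10, 500, 0.75       # alpha: the single control parameter
x = rng.uniform(-5, 5, (n, dim))
pbest = x.copy()
pbest_f = sphere(pbest)
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(iters):
    mbest = pbest.mean(axis=0)                          # mean best position
    phi = rng.random((n, dim))
    p = phi * pbest + (1 - phi) * gbest                 # local attractor per particle
    u = rng.random((n, dim))
    sign = np.where(rng.random((n, dim)) < 0.5, 1.0, -1.0)
    x = p + sign * alpha * np.abs(mbest - x) * np.log(1.0 / u)
    f = sphere(x)
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = x[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print("best fitness:", float(pbest_f.min()))
```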

  6. Design of binary patterns for speckle reduction in holographic display with compressive sensing and direct-binary search algorithm

    Science.gov (United States)

    Leportier, Thibault; Hwang, Do Kyung; Park, Min-Chul

    2017-08-01

    One problem common to imaging techniques based on coherent light is speckle noise. This phenomenon is caused mostly by random interference of light scattered by rough surfaces. Speckle noise can be avoided by using advanced holographic imaging techniques such as optical scanning holography. A more widely known method is to capture several holograms of the same object and to perform an averaging operation so that the signal-to-noise ratio can be improved. Several digital filters have also been proposed to reduce noise in the numerical reconstruction plane of holograms, even though they usually require a compromise between noise reduction and edge preservation. In this study, we used a digital filter based on a compressive sensing algorithm. This approach makes it possible to obtain results equivalent to the average of multiple holograms, but only a single hologram is needed. Filters for speckle reduction are applied to numerical reconstructions of the hologram, not to the hologram itself; in that case, optical reconstruction cannot be performed. We propose a method based on the direct-binary search (DBS) algorithm to generate binary holograms that can be reconstructed optically after application of a speckle reduction filter. Since the optimization procedure of the DBS algorithm is performed in the image plane, speckle reduction techniques can be applied to the complex hologram and used as a reference to obtain a binary pattern in which the speckle noise generated during the recording of the hologram has been filtered.

  7. A Non-Symmetrical Solution Applying a Genetic Algorithm with Natural Crossover for the Structural Optimization of Truss Structures

    Directory of Open Access Journals (Sweden)

    Alvarado-Cárdenas R.

    2012-07-01

    Full Text Available In this research, a genetic algorithm with “natural crossover” applied to a continuous-discrete representation is proposed in order to optimize truss structures. The objective is to reduce the weight while restraining node displacements and limiting the cross sections that may be used. Solutions are combined by applying two types of crossover to the same representation, which allows the search space to be explored effectively. The results are validated by comparing those found here against those reported in the current literature for the design of a 70 m span bridge truss structure; the solutions obtained are lighter and have a different topology. Additionally, a case study, a greenhouse roof truss structure, is proposed in order to generate a real application that is built at a practical scale and then loaded to verify its strength.

  8. A Novel Quantum-Behaved Lightning Search Algorithm Approach to Improve the Fuzzy Logic Speed Controller for an Induction Motor Drive

    Directory of Open Access Journals (Sweden)

    Jamal Abd Ali

    2015-11-01

    Full Text Available This paper presents a novel lightning search algorithm (LSA) using quantum mechanics theories to generate a quantum-inspired LSA (QLSA). The QLSA improves the search of each step leader to obtain the best position for a projectile. To evaluate the reliability and efficiency of the proposed algorithm, the QLSA is tested on eighteen benchmark functions with various characteristics. The QLSA is then applied to improve the design of the fuzzy logic controller (FLC) for controlling the speed response of an induction motor drive. The proposed algorithm avoids the exhaustive conventional trial-and-error procedure for obtaining membership functions (MFs). The generated adaptive input and output MFs are implemented in the fuzzy speed controller design to formulate the objective functions. The mean absolute error (MAE) of the rotor speed is the objective function of the optimization controller. An optimal QLSA-based FLC (QLSAF) optimization controller is employed to tune and minimize the MAE, thereby improving the performance of the induction motor under changes in speed and mechanical load. To validate the performance of the developed controller, the results obtained with the QLSAF are compared to the results obtained with the LSA, the backtracking search algorithm (BSA), the gravitational search algorithm (GSA), particle swarm optimization (PSO) and proportional integral derivative (PID) controllers, respectively. The results show that the QLSAF outperforms the other control methods in all of the tested cases in terms of damping capability and transient response under different mechanical loads and speeds.

  9. An Encoding Technique for Multiobjective Evolutionary Algorithms Applied to Power Distribution System Reconfiguration

    Directory of Open Access Journals (Sweden)

    J. L. Guardado

    2014-01-01

    Full Text Available Network reconfiguration is an alternative to reduce power losses and optimize the operation of power distribution systems. In this paper, an encoding scheme for evolutionary algorithms is proposed in order to search efficiently for the Pareto-optimal solutions during the reconfiguration of power distribution systems considering multiobjective optimization. The encoding scheme is based on the edge window decoder (EWD) technique, which was embedded in the Strength Pareto Evolutionary Algorithm 2 (SPEA2) and the Nondominated Sorting Genetic Algorithm II (NSGA-II). The effectiveness of the encoding scheme was proved by solving a test problem for which the true Pareto-optimal solutions are known in advance. In order to prove the practicality of the encoding scheme, a real distribution system was used to find the near Pareto-optimal solutions for different objective functions to be optimized.

  10. An Adaptive Image Enhancement Technique by Combining Cuckoo Search and Particle Swarm Optimization Algorithm

    Science.gov (United States)

    Ye, Zhiwei; Wang, Mingwei; Hu, Zhengbing; Liu, Wei

    2015-01-01

    Image enhancement is an important procedure of image processing and analysis. This paper presents a new technique using a modified measure and a blending of cuckoo search and particle swarm optimization (CS-PSO) to adaptively enhance low-contrast images. In this way, contrast enhancement is obtained by global transformation of the input intensities; it employs the incomplete Beta function as the transformation function and a novel criterion for measuring image quality considering three factors, which are the threshold, the entropy value, and the gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail in an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared with other existing techniques such as linear contrast stretching, histogram equalization, and evolutionary computing based image enhancement methods like the backtracking search algorithm, differential search algorithm, genetic algorithm, and particle swarm optimization in terms of processing time and image quality. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered in the paper. PMID:25784928
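    The incomplete Beta transformation mentioned above can be sketched as follows: intensities are normalised to [0, 1] and mapped through the regularised incomplete beta function, whose two parameters are what a CS-PSO style optimiser would tune against the image-quality criterion; the fixed (a, b) values and the random test image here are purely illustrative.

```python
# Incomplete-Beta grey-level transform for contrast enhancement (sketch).
import numpy as np
from scipy.special import betainc

def beta_transform(image, a, b):
    img = image.astype(float)
    lo, hi = img.min(), img.max()
    u = (img - lo) / (hi - lo + 1e-12)        # normalise intensities to [0, 1]
    v = betainc(a, b, u)                      # regularised incomplete beta mapping
    return (v * 255).astype(np.uint8)

rng = np.random.default_rng(0)
low_contrast = rng.integers(90, 150, size=(64, 64)).astype(np.uint8)
enhanced = beta_transform(low_contrast, a=2.0, b=2.0)
print(low_contrast.min(), low_contrast.max(), "->", enhanced.min(), enhanced.max())
```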

  11. A Prefiltered Cuckoo Search Algorithm with Geometric Operators for Solving Sudoku Problems

    Directory of Open Access Journals (Sweden)

    Ricardo Soto

    2014-01-01

    Full Text Available Sudoku is a famous logic-placement game, originally popularized in Japan and today widely employed as a pastime and as a testbed for search algorithms. The classic Sudoku consists of filling a 9×9 grid, divided into nine 3×3 regions, so that each column, row, and region contains the digits from 1 to 9 without repetition. This game is known to be NP-complete, and various complete and incomplete search algorithms exist that are able to solve different instances of it. In this paper, we present a new cuckoo search algorithm for solving Sudoku puzzles that combines prefiltering phases and geometric operations. The geometric operators allow the search to move correctly toward promising regions of the combinatorial space, while the prefiltering phases delete from the domains, in advance, the values that do not lead to any feasible solution. This integration leads to more efficient domain filtering and, as a consequence, to a faster solving process. We illustrate encouraging experimental results in which our approach competes noticeably with the best approximate methods reported in the literature.
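
    A minimal sketch of the kind of domain prefiltering described above: before any search starts, every digit already fixed in a cell's row, column, or 3×3 region is removed from that cell's candidate domain. The grid representation and function name are illustrative assumptions, not the authors' code.

```python
def prefilter_domains(grid):
    """grid: 9x9 list of lists, 0 = empty cell.

    Returns a dict mapping each empty cell (r, c) to the set of digits that
    do not clash with the values already placed in its row, column, or box.
    """
    domains = {}
    for r in range(9):
        for c in range(9):
            if grid[r][c] != 0:
                continue
            used = set(grid[r]) | {grid[i][c] for i in range(9)}
            br, bc = 3 * (r // 3), 3 * (c // 3)
            used |= {grid[i][j] for i in range(br, br + 3) for j in range(bc, bc + 3)}
            domains[(r, c)] = set(range(1, 10)) - used
    return domains
```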

  12. A Prefiltered Cuckoo Search Algorithm with Geometric Operators for Solving Sudoku Problems

    Science.gov (United States)

    Crawford, Broderick; Galleguillos, Cristian; Paredes, Fernando

    2014-01-01

    Sudoku is a famous logic-placement game, originally popularized in Japan and today widely employed as a pastime and as a testbed for search algorithms. The classic Sudoku consists of filling a 9×9 grid, divided into nine 3×3 regions, so that each column, row, and region contains the digits from 1 to 9 without repetition. This game is known to be NP-complete, and various complete and incomplete search algorithms exist that are able to solve different instances of it. In this paper, we present a new cuckoo search algorithm for solving Sudoku puzzles that combines prefiltering phases and geometric operations. The geometric operators allow the search to move correctly toward promising regions of the combinatorial space, while the prefiltering phases delete from the domains, in advance, the values that do not lead to any feasible solution. This integration leads to more efficient domain filtering and, as a consequence, to a faster solving process. We illustrate encouraging experimental results in which our approach competes noticeably with the best approximate methods reported in the literature. PMID:24707205

  13. An Adaptive Image Enhancement Technique by Combining Cuckoo Search and Particle Swarm Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Zhiwei Ye

    2015-01-01

    Full Text Available Image enhancement is an important procedure in image processing and analysis. This paper presents a new technique that uses a modified measure and a blend of cuckoo search and particle swarm optimization (CS-PSO) to adaptively enhance low-contrast images. Contrast enhancement is obtained by a global transformation of the input intensities; the incomplete Beta function is employed as the transformation function, together with a novel criterion for measuring image quality that considers three factors: threshold, entropy value, and gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail in an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared with other existing techniques such as linear contrast stretching, histogram equalization, and evolutionary-computing-based image enhancement methods like the backtracking search algorithm, the differential search algorithm, the genetic algorithm, and particle swarm optimization in terms of processing time and image quality. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits better performance than the other methods involved in the paper.

  14. An adaptive image enhancement technique by combining cuckoo search and particle swarm optimization algorithm.

    Science.gov (United States)

    Ye, Zhiwei; Wang, Mingwei; Hu, Zhengbing; Liu, Wei

    2015-01-01

    Image enhancement is an important procedure in image processing and analysis. This paper presents a new technique that uses a modified measure and a blend of cuckoo search and particle swarm optimization (CS-PSO) to adaptively enhance low-contrast images. Contrast enhancement is obtained by a global transformation of the input intensities; the incomplete Beta function is employed as the transformation function, together with a novel criterion for measuring image quality that considers three factors: threshold, entropy value, and gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail in an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared with other existing techniques such as linear contrast stretching, histogram equalization, and evolutionary-computing-based image enhancement methods like the backtracking search algorithm, the differential search algorithm, the genetic algorithm, and particle swarm optimization in terms of processing time and image quality. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits better performance than the other methods involved in the paper.

  15. ParAlign: a parallel sequence alignment algorithm for rapid and sensitive database searches.

    Science.gov (United States)

    Rognes, T

    2001-04-01

    There is a need for faster and more sensitive algorithms for sequence similarity searching in view of the rapidly increasing amounts of genomic sequence data available. Parallel processing capabilities in the form of the single instruction, multiple data (SIMD) technology are now available in common microprocessors and enable a single microprocessor to perform many operations in parallel. The ParAlign algorithm has been specifically designed to take advantage of this technology. The new algorithm initially exploits parallelism to perform a very rapid computation of the exact optimal ungapped alignment score for all diagonals in the alignment matrix. Then, a novel heuristic is employed to compute an approximate score of a gapped alignment by combining the scores of several diagonals. This approximate score is used to select the most interesting database sequences for a subsequent Smith-Waterman alignment, which is also parallelised. The resulting method represents a substantial improvement compared to existing heuristics. The sensitivity and specificity of ParAlign was found to be as good as Smith-Waterman implementations when the same method for computing the statistical significance of the matches was used. In terms of speed, only the significantly less sensitive NCBI BLAST 2 program was found to outperform the new approach. Online searches are available at http://dna.uio.no/search/
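
    To make the first ParAlign stage concrete, the sketch below computes the optimal ungapped alignment score on every diagonal of the alignment matrix with a plain (non-SIMD) scan; the simple match/mismatch scoring is an assumption for illustration, whereas the real method uses a substitution matrix and vectorized instructions.

```python
def best_ungapped_scores(query, subject, match=2, mismatch=-1):
    """Optimal ungapped alignment score on every diagonal of the DP matrix.

    For each diagonal, the best score is the maximum-sum contiguous run of
    per-position scores (a one-dimensional Kadane scan along the diagonal).
    """
    m, n = len(query), len(subject)
    scores = {}
    for d in range(-(m - 1), n):          # diagonal index d = j - i
        best = run = 0
        i = max(0, -d)
        j = i + d
        while i < m and j < n:
            s = match if query[i] == subject[j] else mismatch
            run = max(0, run + s)
            best = max(best, run)
            i += 1
            j += 1
        scores[d] = best
    return scores

# Example: the highest-scoring diagonal hints where a gapped alignment is worthwhile.
print(max(best_ungapped_scores("GATTACA", "GCATGACA").values()))
```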

  16. A fast approximate nearest neighbor search algorithm in the Hamming space.

    Science.gov (United States)

    Esmaeili, Mani Malek; Ward, Rabab Kreidieh; Fatourechi, Mehrdad

    2012-12-01

    A fast approximate nearest neighbor search algorithm for the (binary) Hamming space is proposed. The proposed Error Weighted Hashing (EWH) algorithm is up to 20 times faster than the popular locality sensitive hashing (LSH) algorithm and works well even for large nearest neighbor distances where LSH fails. EWH significantly reduces the number of candidate nearest neighbors by weighing them based on the difference between their hash vectors. EWH can be used for multimedia retrieval and copy detection systems that are based on binary fingerprinting. On a fingerprint database with more than 1,000 videos, for a specific detection accuracy, we demonstrate that EWH is more than 10 times faster than LSH. For the same retrieval time, we show that EWH has a significantly better detection accuracy with a 15 times lower error rate.
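
    The following sketch is one schematic reading of the idea: candidates are shortlisted and weighted by the Hamming distance between their hash vectors, so the expensive exact comparison is limited to a small, ranked set. The weighting function and radius are assumptions for illustration, not the EWH implementation.

```python
def hamming(a: int, b: int) -> int:
    """Hamming distance between two binary fingerprints packed into integers."""
    return bin(a ^ b).count("1")

def shortlist_candidates(query_hash, database, radius=3):
    """Return (id, weight) pairs for fingerprints whose hash differs from the
    query in at most `radius` bits; closer hashes get larger weights, so the
    exact comparison can be limited to a small, ranked candidate set.
    """
    candidates = []
    for item_id, item_hash in database.items():
        d = hamming(query_hash, item_hash)
        if d <= radius:
            candidates.append((item_id, 1.0 / (1 + d)))   # illustrative weight
    return sorted(candidates, key=lambda t: -t[1])

db = {"clip_a": 0b10110010, "clip_b": 0b10110011, "clip_c": 0b01001100}
print(shortlist_candidates(0b10110110, db))
```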

  17. Optimal gravitational search algorithm for automatic generation control of interconnected power systems

    Directory of Open Access Journals (Sweden)

    Rabindra Kumar Sahu

    2014-09-01

    Full Text Available An attempt is made at the effective application of the Gravitational Search Algorithm (GSA) to optimize PI/PIDF controller parameters in Automatic Generation Control (AGC) of interconnected power systems. Initially, a comparison of several conventional objective functions reveals that ITAE yields better system performance. Then, the parameters of the GSA technique are properly tuned and suitable GSA control parameters are proposed. The superiority of the proposed approach is demonstrated by comparing the results with those of some recently published techniques such as Differential Evolution (DE), the Bacteria Foraging Optimization Algorithm (BFOA) and the Genetic Algorithm (GA). Additionally, a sensitivity analysis is carried out that demonstrates the robustness of the optimized controller parameters to wide variations in the operating loading condition and in the time constants of the speed governor, turbine, and tie-line power. Finally, the proposed approach is extended to a more realistic power system model by considering physical constraints such as a reheat turbine, Generation Rate Constraint (GRC) and Governor Dead Band nonlinearity.
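
    Since the ITAE criterion is reported above as the best-performing objective, the sketch below shows how ITAE would be evaluated from a simulated deviation trace for one candidate controller; the trapezoidal integration and the demonstration signal are illustrative assumptions.

```python
import numpy as np

def itae(t, error):
    """Integral of Time-weighted Absolute Error: ITAE = integral of t * |e(t)| dt.

    `t` and `error` are the simulation time vector and the deviation signal
    (e.g. frequency or tie-line power error) produced by one AGC simulation;
    the optimizer requests this value for every candidate PI/PIDF gain set.
    """
    return np.trapz(t * np.abs(error), t)

# Example with a decaying oscillatory error trace.
t = np.linspace(0.0, 10.0, 1001)
e = 0.02 * np.exp(-0.5 * t) * np.cos(3 * t)
print(itae(t, e))
```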

  18. Recurrent neural network-based modeling of gene regulatory network using elephant swarm water search algorithm.

    Science.gov (United States)

    Mandal, Sudip; Saha, Goutam; Pal, Rajat Kumar

    2017-08-01

    Correct inference of the genetic regulations inside a cell from biological databases such as time-series microarray data is one of the greatest challenges of the post-genomic era for biologists and researchers. The Recurrent Neural Network (RNN) is one of the most popular and simplest approaches to model the dynamics as well as to infer correct dependencies among genes. Inspired by the behavior of social elephants, we propose a new metaheuristic, namely the Elephant Swarm Water Search Algorithm (ESWSA), to infer Gene Regulatory Networks (GRNs). This algorithm is mainly based on the water search strategy of intelligent and social elephants during drought, utilizing different types of communication techniques. Initially, the algorithm is tested against benchmark small- and medium-scale artificial genetic networks, without and with different noise levels, and its efficiency is observed in terms of parametric error, minimum fitness value, execution time, accuracy of prediction of true regulations, etc. Next, the proposed algorithm is tested against real gene expression data of the Escherichia coli SOS network, and the results are compared with other state-of-the-art optimization methods. The experimental results suggest that ESWSA is very efficient for the GRN inference problem and performs better than other methods in many ways.

  19. A heuristic algorithm based on tabu search for vehicle routing problems with backhauls

    Directory of Open Access Journals (Sweden)

    Jhon Jairo Santa Chávez

    2017-07-01

    Full Text Available In this paper, a heuristic algorithm based on a tabu search approach for solving the Vehicle Routing Problem with Backhauls (VRPB) is proposed. The problem considers a set of customers divided into two subsets: linehaul and backhaul customers. Each linehaul customer requires the delivery of a given quantity of product from the depot, whereas a given quantity of product must be picked up from each backhaul customer and transported to the depot. In the proposed algorithm, each route consists of one sub-route in which only the delivery task is done and one sub-route in which only the collection process is performed. The search process obtains a correct order in which to visit all the customers on each sub-route. In addition, the proposed algorithm determines the best connections among the sub-routes in order to obtain a global solution with the minimum traveling cost. The efficiency of the algorithm is evaluated on a set of benchmark instances taken from the literature. The results show that the computing times are greatly reduced while a high quality of solutions is maintained. Finally, conclusions and suggestions for future work are presented.
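
    As a generic illustration of the tabu search mechanics, the sketch below improves a single visiting order with swap moves while a short-term tabu list forbids recently applied moves unless they beat the best solution found so far (aspiration). The swap neighborhood and the cost callback are assumptions for illustration, not the paper's VRPB operators.

```python
import random
from collections import deque

def tabu_search(route, cost, iters=200, tenure=10):
    """Generic tabu search over a single visiting order using swap moves.

    `cost` is a callable returning the routing cost of a candidate sequence.
    Recently applied swap moves are kept in a fixed-length tabu list (deque)
    and skipped unless they improve on the best solution found so far.
    """
    best = current = list(route)
    tabu = deque(maxlen=tenure)
    for _ in range(iters):
        i, j = sorted(random.sample(range(len(current)), 2))
        move = (current[i], current[j])
        candidate = current[:]
        candidate[i], candidate[j] = candidate[j], candidate[i]
        if move in tabu and cost(candidate) >= cost(best):
            continue                      # tabu and no aspiration: skip move
        current = candidate
        tabu.append(move)
        if cost(current) < cost(best):
            best = current[:]
    return best
```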

  20. Automatic boiling water reactor control rod pattern design using particle swarm optimization algorithm and local search

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Cheng-Der, E-mail: jdwang@iner.gov.tw [Nuclear Engineering Division, Institute of Nuclear Energy Research, No. 1000, Wenhua Rd., Jiaan Village, Longtan Township, Taoyuan County 32546, Taiwan, ROC (China); Lin, Chaung [National Tsing Hua University, Department of Engineering and System Science, 101, Section 2, Kuang Fu Road, Hsinchu 30013, Taiwan (China)

    2013-02-15

    Highlights: ► The PSO algorithm was adopted to automatically design a BWR CRP. ► A local search procedure was added to improve the result of the PSO algorithm. ► The results show that the obtained CRP is as good as that in the previous work. -- Abstract: This study developed a method for the automatic design of a boiling water reactor (BWR) control rod pattern (CRP) using the particle swarm optimization (PSO) algorithm. The PSO algorithm is more random than the rank-based ant system (RAS) that was used to solve the same BWR CRP design problem in previous work. In addition, a local search procedure was used to make improvements after PSO by adding the single control rod (CR) effect. The design goal was to obtain a CRP such that the thermal limits and shutdown margin satisfy the design requirements and the cycle length, which is implicitly controlled by the axial power distribution, is acceptable. The results showed that the same acceptable CRP found in the previous work could be obtained.

  1. An opposition-based harmony search algorithm for engineering optimization problems

    Directory of Open Access Journals (Sweden)

    Abhik Banerjee

    2014-03-01

    Full Text Available Harmony search (HS) is a derivative-free real-parameter optimization algorithm. It draws inspiration from the musical improvisation process of searching for a perfect state of harmony. The proposed opposition-based HS (OHS) of the present work employs opposition-based learning for harmony memory initialization and also for generation jumping. The concept of the opposite number is utilized in OHS to improve the convergence rate of the HS algorithm. The potential of the proposed algorithm is assessed by means of an extensive comparative study of the numerical results on sixteen benchmark test functions. Additionally, the effectiveness of the proposed algorithm is tested for reactive power compensation of an autonomous power system. For real-time reactive power compensation of the studied model, Takagi-Sugeno fuzzy logic (TSFL) is employed. Time-domain simulation reveals that the proposed OHS-TSFL yields on-line, off-nominal model parameters, resulting in a real-time incremental change in the terminal voltage response profile.
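
    The opposition-based step is simple to state in code: for a candidate x in [a, b], its opposite is a + b - x, and the fitter of the pair is kept. The sketch below applies this to harmony memory initialization; the array layout and the fitness callback are assumptions for illustration.

```python
import numpy as np

def opposition_init(pop_size, lower, upper, fitness):
    """Opposition-based initialization: evaluate each random harmony together
    with its opposite (a + b - x) and keep the fitter of the two.
    """
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    pop = lower + np.random.rand(pop_size, lower.size) * (upper - lower)
    opp = lower + upper - pop                      # opposite points
    both = np.vstack([pop, opp])
    scores = np.array([fitness(x) for x in both])
    keep = np.argsort(scores)[:pop_size]           # assume minimization
    return both[keep]

# Example on the sphere function in [-5, 5]^3.
hm = opposition_init(10, [-5] * 3, [5] * 3, lambda x: float(np.sum(x ** 2)))
```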

  2. FIERCE: FInding volcanic ERuptive CEnters by a grid-searching algorithm in R

    Science.gov (United States)

    Carniel, Roberto; Guzmán, Silvina; Neri, Marco

    2017-02-01

    Most eruptions are fed by dikes whose spatial distribution can provide important insights into the positions of possible old eruptive centers that are no longer clearly identifiable in the field. Locating these centers can in turn have further applications, e.g., in hazard assessment. We propose a purely geometrical algorithm—implemented as an R open-source script—named FIERCE (FInding volcanic ERuptive CEnters) based on the number of intersections of dikes identified within a grid of rectangular cells overlain onto a given search region. The algorithm recognizes radial distributions, tangential distributions, or combinations of both. We applied FIERCE to both well-known and less-studied volcanic edifices, in different tectonic settings and having different evolution histories, ages, and compositions. At Summer Coon volcano, FIERCE demonstrated that a radial dike distribution clearly indicates the position of the central vent. On Etna, it confirmed the position of the most important ancient eruptive centers and allowed us to study effects of the structural alignments and topography. On Stromboli, FIERCE not only enabled confirmation of some published locations of older vents but also identified possible vent areas not previously suggested. It also highlighted the influence of the regional structural trend and the collapse scars. FIERCE demonstrated that the dikes at the Somma-Vesuvius were emplaced before formation of Mt. Somma's caldera and indicated a plausible location for the old volcanic crater of Mt. Somma which is compatible with previous studies. At the Vicuña Pampa Volcanic Complex, FIERCE highlights the position of two different vents of a highly degraded volcano.

  3. A Comparison of a Standard Genetic Algorithm with a Hybrid Genetic Algorithm Applied to Cell Formation Problem

    Directory of Open Access Journals (Sweden)

    Waqas Javaid

    2014-09-01

    Full Text Available Though there are a number of benefits associated with cellular manufacturing systems, their implementation (identification of part families and corresponding machine groups) for real-life problems is still a challenging task. To handle the complexity of optimizing multiple objectives and larger problem sizes, most researchers in the past two decades or so have focused on developing genetic algorithm (GA) based techniques. Recently this trend has shifted from standard GAs to hybrid GA (HGA) based approaches in the quest for greater effectiveness as far as convergence onto the optimum solution is concerned. In order to prove the point that HGAs possess better convergence abilities than standard GAs, a methodology, initially based on a standard GA and later hybridized with a local search heuristic (LSH), has been developed during this research. Computational experience shows that the HGA maintains its accuracy level with increasing problem size, whereas the standard GA loses its effectiveness as the problem size grows.

  4. Iterated Local Search Algorithm with Strategic Oscillation for School Bus Routing Problem with Bus Stop Selection

    Directory of Open Access Journals (Sweden)

    Mohammad Saied Fallah Niasar

    2017-02-01

    Full Text Available The school bus routing problem (SBRP) represents a variant of the well-known vehicle routing problem. The main goal of this study is to pick up students allocated to bus stops and generate routes, including the selected stops, in order to carry students to school. In this paper, we have proposed a simple but effective metaheuristic approach that employs two features: first, it utilizes large neighborhood structures for a deeper exploration of the search space; second, the proposed heuristic executes an efficient transition between the feasible and infeasible portions of the search space. Exploration of the infeasible area is controlled by a dynamic penalty function that converts an infeasible solution into a feasible one. Two metaheuristics, called N-ILS (a variant of Nearest Neighbourhood with Iterated Local Search) and I-ILS (a variant of Insertion with Iterated Local Search), are proposed to solve the SBRP. Our experimental procedure is based on two data sets. The results show that N-ILS is able to obtain better solutions in shorter computing times. Additionally, N-ILS appears to be very competitive in comparison with the best existing metaheuristics suggested for the SBRP.

  5. A New Improved Quantum Evolution Algorithm with Local Search Procedure for Capacitated Vehicle Routing Problem

    Directory of Open Access Journals (Sweden)

    Ligang Cui

    2013-01-01

    Full Text Available The capacitated vehicle routing problem (CVRP) is the most classical vehicle routing problem (VRP), and many solution techniques have been proposed to find better answers to it. In this paper, a new improved quantum evolution algorithm (IQEA) with a mixed local search procedure is proposed for solving CVRPs. First, an IQEA with a double-chain quantum chromosome, new quantum rotation schemes, and a self-adaptive quantum NOT gate is constructed to initialize and generate feasible solutions. Then, to further strengthen IQEA's searching ability, three local search procedures, 1-1 exchange, 1-0 exchange, and 2-opt, are adopted. Experiments on a small case have been conducted to analyze the sensitivity of the main parameters and to compare the performance of the IQEA with different local search strategies. Together with results from testing on CVRP benchmarks, the superiority of the proposed algorithm over PSO, SR-1, and SR-2 has been demonstrated. Finally, a profound analysis of the experimental results is presented and some suggestions for future research are given.
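
    Of the three local search procedures named above, 2-opt is the most standard; the sketch below gives a first-improvement 2-opt pass over a single route under a distance-matrix cost. It is an illustration only: the 1-1 and 1-0 exchanges and the CVRP capacity checks are omitted.

```python
def two_opt(route, dist):
    """First-improvement 2-opt: reverse a route segment whenever doing so
    shortens the tour, repeating until no improving reversal remains.
    `dist[a][b]` is the travel cost between customers a and b; the first and
    last entries of `route` (the depot) are kept fixed.
    """
    improved = True
    while improved:
        improved = False
        for i in range(1, len(route) - 2):
            for j in range(i + 1, len(route) - 1):
                before = dist[route[i - 1]][route[i]] + dist[route[j]][route[j + 1]]
                after = dist[route[i - 1]][route[j]] + dist[route[i]][route[j + 1]]
                if after < before:
                    route[i:j + 1] = reversed(route[i:j + 1])
                    improved = True
    return route
```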

  6. A hybrid firefly algorithm and pattern search technique for SSSC based power oscillation damping controller design

    Directory of Open Access Journals (Sweden)

    Srikanta Mahapatra

    2014-12-01

    Full Text Available In this paper, a novel hybrid Firefly Algorithm and Pattern Search (h-FAPS) technique is proposed for a Static Synchronous Series Compensator (SSSC)-based power oscillation damping controller design. The proposed h-FAPS technique takes advantage of the global search capability of FA and the local search facility of PS. In order to tackle the drawback of using a remote signal that may impact the reliability of the controller, a modified signal equivalent to the remote speed deviation signal is constructed from local measurements. The performance of the proposed controllers is evaluated in SMIB and multi-machine power systems subjected to various transient disturbances. To show the effectiveness and robustness of the proposed design approach, simulation results are presented and compared with some recently published approaches such as Differential Evolution (DE) and Particle Swarm Optimization (PSO). It is observed that the proposed approach yields superior damping performance compared to some recently reported approaches.

  7. Modified Cuckoo Search Algorithm for Solving Nonconvex Economic Load Dispatch Problems

    Directory of Open Access Journals (Sweden)

    Thang Trung Nguyen

    2016-01-01

    Full Text Available This paper presents the application of a modified cuckoo search algorithm (MCSA) for solving economic load dispatch (ELD) problems. The MCSA method is developed to improve the search ability and solution quality of the conventional CSA method. In the MCSA, the evaluation of eggs divides the initial eggs into two groups: a top group with good quality and an abandoned group with worse quality. Moreover, the step size used when generating new solutions for the abandoned group and the top group via Lévy flights is adapted so that a large zone is searched at the beginning and a local zone is foraged as the maximum number of iterations is approached. The MCSA method has been tested on different systems with different characteristics of thermal units and constraints. A comparison of the results with those of other methods in the literature indicates that the MCSA can be a powerful method for solving ELD problems.
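
    The Lévy-flight step driving both groups can be generated with Mantegna's algorithm, and the adaptive step size can be read as a scale that shrinks with the iteration count; the sketch below shows both pieces. The linear decay schedule is an assumption for illustration, since the abstract does not specify the exact adaptation rule.

```python
import math
import numpy as np

def levy_step(dim, beta=1.5):
    """Draw one Lévy-distributed step using Mantegna's algorithm."""
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
               / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0.0, sigma_u, dim)
    v = np.random.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def new_solution(current, best, it, max_it, scale0=1.0):
    """Move `current` toward `best` with a Lévy step whose scale decays
    linearly, so early iterations explore widely and late ones search locally.
    """
    scale = scale0 * (1.0 - it / max_it)
    return current + scale * levy_step(current.size) * (best - current)
```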

  8. [Cluster ensemble algorithm based on dual neural gas applied to cancer gene expression profiles].

    Science.gov (United States)

    Zhang, Xiaodong; Chen, Hantao

    2015-02-01

    The microarray technology used in biological and medical research provides a new idea for the diagnosis and treatment of cancer. To find different types of cancer and to classify cancer samples accurately, we propose a new cluster ensemble framework, the Dual Neural Gas Cluster Ensemble (DNGCE), based on the neural gas algorithm, to discover the underlying structure of noisy cancer gene expression profiles. The DNGCE framework applies the neural gas algorithm to perform clustering not only on the sample dimension but also on the attribute dimension. It also adopts the normalized cut algorithm to partition the consensus matrix constructed from multiple clustering solutions and thus obtain the final, accurate results. Experiments on cancer gene expression profiles illustrate that the proposed approach achieves good performance, as it outperforms single clustering algorithms and most of the existing approaches in clustering gene expression profiles.

  9. Neural Network Blind Equalization Algorithm Applied in Medical CT Image Restoration

    Directory of Open Access Journals (Sweden)

    Yunshan Sun

    2013-01-01

    Full Text Available A new algorithm for iterative blind image restoration is presented in this paper. The method extends blind equalization from the signal case to images. A neural network blind equalization algorithm is derived and used in conjunction with zigzag coding to restore the original image. As a result, the effect of the PSF can be removed by using the proposed algorithm, which helps to eliminate intersymbol interference (ISI). In order to estimate the original image, the method optimizes a constant modulus blind equalization cost function applied to the grayscale CT image using the conjugate gradient method. An analysis of the convergence performance of the algorithm verifies the feasibility of the method theoretically; meanwhile, simulation results and performance evaluations using recent image quality metrics are provided to assess the effectiveness of the proposed method.

  10. Solving Flexible Job-Shop Scheduling Problem Using Gravitational Search Algorithm and Colored Petri Net

    Directory of Open Access Journals (Sweden)

    Behnam Barzegar

    2012-01-01

    Full Text Available A scheduled production system helps to avoid stock accumulation, reduce losses, decrease or even eliminate idle machines, and make better use of machines so that customer orders are answered on time and requested materials are supplied at a suitable time. In flexible job-shop scheduling production systems, time and costs can be reduced by transferring and delivering operations on existing machines; this problem is among the NP-hard problems. The scheduling objective is to minimize the maximal completion time of all the operations, which is denoted by the makespan. Different methods and algorithms have been presented for solving this problem. Having a reasonably scheduled production system has a significant influence on improving effectiveness and attaining organizational goals. In this paper, a new algorithm based on the gravitational search algorithm (GSA) is proposed for flexible job-shop scheduling problem systems (FJSSP-GSPN). In the proposed method, the flexible job-shop scheduling problem is modeled by a colored Petri net using the CPN tool, and the schedule is then computed by the GSA. The experimental results show that the proposed method has reasonable performance in comparison with other algorithms.

  11. A Local and Global Search Combine Particle Swarm Optimization Algorithm for Job-Shop Scheduling to Minimize Makespan

    Directory of Open Access Journals (Sweden)

    Zhigang Lian

    2010-01-01

    Full Text Available The job-shop scheduling problem (JSSP) is a branch of production scheduling and is among the hardest combinatorial optimization problems. Many different approaches have been applied to optimize the JSSP, but even moderate-size instances cannot always be solved to guaranteed optimality. The original particle swarm optimization algorithm (OPSOA) is generally used to solve continuous problems and rarely to optimize discrete problems such as the JSSP. Research shows that the OPSOA has a tendency to get stuck in a near-optimal solution, especially for middle- and large-size problems. The local and global search combined particle swarm optimization algorithm (LGSCPSOA) is used to solve the JSSP, where the particle-updating mechanism benefits from the searching experience of the particle itself, the best of all particles in the swarm, and the best of the particles in the neighborhood population. A new coding method is used in the LGSCPSOA to optimize the JSSP, which guarantees that all sequences are feasible solutions. Computational experiments are carried out on three representative instances, and the simulation shows that the LGSCPSOA is efficacious for the JSSP with the makespan criterion.
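
    A sketch of the three-attractor particle update suggested above, in continuous form: the velocity is pulled toward the particle's own best, the swarm best, and the neighborhood best. The coefficient names are assumptions, and the decoding of a position vector into a feasible job sequence (handled by the paper's coding method) is omitted.

```python
import numpy as np

def update_particle(x, v, p_best, g_best, n_best,
                    w=0.7, c1=1.4, c2=1.4, c3=1.4):
    """One LGSCPSOA-style update: the velocity is attracted by the particle's
    own best (p_best), the swarm best (g_best) and the neighborhood best
    (n_best); the position is then advanced by the new velocity.
    """
    r1, r2, r3 = np.random.rand(3)
    v_new = (w * v
             + c1 * r1 * (p_best - x)
             + c2 * r2 * (g_best - x)
             + c3 * r3 * (n_best - x))
    return x + v_new, v_new
```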

  12. Energy loss optimization of run-off-road wheels applying imperialist competitive algorithm

    Directory of Open Access Journals (Sweden)

    Hamid Taghavifar

    2014-08-01

    Full Text Available The novel imperialist competitive algorithm (ICA) has shown outstanding fitness on various optimization problems. The application of meta-heuristics has been an active research interest in reliability optimization to determine idleness and reliability constituents. The application of a meta-heuristic evolutionary optimization method, the imperialist competitive algorithm (ICA), to the minimization of energy loss due to wheel rolling resistance in a soil bin facility equipped with a single-wheel tester is discussed. The required data were collected through various designed experiments in the controlled soil bin environment. Local and global searching of the search space suggested that the energy loss could be reduced to a minimum of 15.46 J at the optimized input configuration of a wheel load of 1.2 kN, a tire inflation pressure of 296 kPa and a velocity of 2 m/s. Meanwhile, genetic algorithm (GA), particle swarm optimization (PSO) and hybridized GA–PSO approaches were benchmarked among the broad spectrum of meta-heuristics to find the best-performing approach. It was deduced that, on account of the obtained results, the ICA can achieve the optimum configuration with superior accuracy in less computational time.

  13. An Intuitive Dominant Test Algorithm of CP-nets Applied on Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    Liu Zhaowei

    2014-07-01

    Full Text Available A wireless sensor network consists of spatially distributed autonomous sensors, much like a multi-agent system with a single agent. Conditional preference networks (CP-nets) are a qualitative tool for representing ceteris paribus (all other things being equal) preference statements and have recently been a research hotspot in artificial intelligence. However, the algorithm and complexity of the strong dominance test with respect to binary-valued CP-nets have not been solved, and few researchers have addressed applications to other domains. In this paper, the strong dominance test and the application of CP-nets are studied in detail. Firstly, by constructing the induced graph of a CP-net and studying its properties, we conclude that the problem of the strong dominance test on binary-valued CP-nets is essentially a single-source shortest path problem, so the strong dominance test can be solved by an improved Dijkstra's algorithm. Secondly, we apply the above algorithm to the completeness of a wireless sensor network and design a completeness-judging algorithm based on the strong dominance test. Thirdly, we apply the algorithm on a wireless sensor network to solve a routing problem. In the end, we point out some interesting directions for future work.

  14. Continuous grasp algorithm applied to economic dispatch problem of thermal units

    Energy Technology Data Exchange (ETDEWEB)

    Vianna Neto, Julio Xavier [Pontifical Catholic University of Parana - PUCPR, Curitiba, PR (Brazil). Undergraduate Program at Mechatronics Engineering; Bernert, Diego Luis de Andrade; Coelho, Leandro dos Santos [Pontifical Catholic University of Parana - PUCPR, Curitiba, PR (Brazil). Industrial and Systems Engineering Graduate Program, LAS/PPGEPS], e-mail: leandro.coelho@pucpr.br

    2010-07-01

    The economic dispatch problem (EDP) is one of the fundamental issues in power systems for obtaining benefits in stability, reliability and security. Its objective is to allocate the power demand among committed generators in the most economical manner while all physical and operational constraints are satisfied. The cost of power generation, particularly in fossil fuel plants, is very high, and economic dispatch helps in saving a significant amount of revenue. Recently, as an alternative to the conventional mathematical approaches, modern heuristic optimization techniques such as simulated annealing, evolutionary algorithms, neural networks, ant colony optimization, and tabu search have been given much attention by many researchers due to their ability to find an almost global optimal solution in EDPs. On the other hand, continuous GRASP (C-GRASP) is a stochastic local search meta-heuristic for finding cost-efficient solutions to continuous global optimization problems subject to box constraints. Like a greedy randomized adaptive search procedure (GRASP), a C-GRASP is a multi-start procedure where a starting solution for local improvement is constructed in a greedy randomized fashion. The C-GRASP algorithm is validated on a test system consisting of fifteen units that takes into account spinning reserve and prohibited operating zone constraints. (author)
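
    For context, the sketch below evaluates the classical quadratic fuel-cost objective and the prohibited-operating-zone check that such a fifteen-unit test system imposes; the coefficient layout and the tiny example numbers are assumptions, not the paper's data.

```python
def dispatch_cost(p, coeffs):
    """Total fuel cost for generation levels p[i] with quadratic cost curves
    cost_i = a_i + b_i * p_i + c_i * p_i**2.
    `coeffs` is a list of (a, b, c) tuples, one per unit.
    """
    return sum(a + b * pi + c * pi ** 2 for pi, (a, b, c) in zip(p, coeffs))

def violates_prohibited_zones(p, zones):
    """True if any unit's output falls inside one of its prohibited operating
    zones; `zones[i]` is a list of (low, high) intervals for unit i.
    """
    return any(lo < pi < hi for pi, unit_zones in zip(p, zones)
               for lo, hi in unit_zones)

# Tiny two-unit example with made-up numbers.
coeffs = [(100, 2.0, 0.01), (120, 1.8, 0.012)]
zones = [[(40, 50)], []]
p = [45.0, 60.0]
print(dispatch_cost(p, coeffs), violates_prohibited_zones(p, zones))
```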

  15. System network planning expansion using mathematical programming, genetic algorithms and tabu search

    Energy Technology Data Exchange (ETDEWEB)

    Sadegheih, A. [Department of Industrial Engineering, University of Yazd, P.O. Box 89195-741, Yazd (Iran); Drake, P.R. [E-Business and Operations Management Division, University of Liverpool Management School, University of Liverpool, Liverpool (United Kingdom)

    2008-06-15

    In this paper, system network planning expansion is formulated for mixed integer programming, a genetic algorithm (GA) and tabu search (TS). Compared with other optimization methods, GAs are suitable for traversing large search spaces, since they can do this relatively rapidly and because the use of mutation diverts the method away from local minima, which tend to become more common as the search space increases in size. GAs give an excellent trade-off between solution quality and computing time, and flexibility for taking into account specific constraints in real situations. TS has emerged as a new, highly efficient search paradigm for finding quality solutions to combinatorial problems. It is characterized by gathering knowledge during the search and subsequently profiting from this knowledge. The attractiveness of the technique comes from its ability to escape local optimality. The cost function of this problem consists of the capital investment cost in discrete form, the cost of transmission losses and the power generation costs. The DC load flow equations for the network are embedded in the constraints of the mathematical model to avoid sub-optimal solutions that can arise if such constraints are enforced in an indirect way. The solution of the model gives the best line additions and also provides information regarding the optimal generation at each generation point. This method of solution is demonstrated on the expansion of a 10 bus bar system to 18 bus bars. Finally, a steady-state genetic algorithm is employed rather than generational replacement, and uniform crossover is used. (author)

  16. Hybrid nested sampling algorithm for Bayesian model selection applied to inverse subsurface flow problems

    KAUST Repository

    Elsheikh, Ahmed H.

    2014-02-01

    A Hybrid Nested Sampling (HNS) algorithm is proposed for efficient Bayesian model calibration and prior model selection. The proposed algorithm combines the Nested Sampling (NS) algorithm, Hybrid Monte Carlo (HMC) sampling and gradient estimation using the Stochastic Ensemble Method (SEM). NS is an efficient sampling algorithm that can be used for Bayesian calibration and for estimating the Bayesian evidence for prior model selection, and it has the advantage of computational feasibility. Within the nested sampling algorithm, a constrained sampling step is performed. For this step, we utilize HMC to reduce the correlation between successive sampled states. HMC relies on the gradient of the logarithm of the posterior distribution, which we estimate using a stochastic ensemble method based on an ensemble of directional derivatives. SEM requires only forward model runs; the simulator is used as a black box and no adjoint code is needed. The developed HNS algorithm is successfully applied to Bayesian calibration and prior model selection for several nonlinear subsurface flow problems. © 2013 Elsevier Inc.

  17. Multi-objective simultaneous placement of DG and DSTATCOM using novel lightning search algorithm

    Directory of Open Access Journals (Sweden)

    Yuvaraj Thangaraj

    2017-10-01

    Full Text Available In this study, a new long-term scheduling approach is proposed for the simultaneous placement of Distributed Generation (DG) and a Distribution STATic COMpensator (DSTATCOM) in radial distribution networks. The proposed work has a unique multi-objective function which consists of minimizing the power loss and the total voltage deviation (TVD), as well as maximizing the voltage stability index (VSI), subject to equality and inequality system constraints. The multi-objective problem is solved by a novel metaheuristic optimization algorithm called the lightning search algorithm (LSA). In the proposed approach, the feeder loads are varied linearly from light load (0.5) to peak load (1.6) with a step size of 1%. At each load step, the optimal sizing for the DG and DSTATCOM is calculated by the LSA. Through a curve fitting technique (CFT), the optimal sizing for both the DG and DSTATCOM per load level is formulated in the form of a generalized equation. The proposed generalized equation will help distribution network operators (DNOs) to select the DG and DSTATCOM sizes according to load changes. The proposed method is tested on two test systems of 33 buses and 69 buses in different cases. Keywords: Distributed Generation (DG), Distribution STATic COMpensator (DSTATCOM), Lightning search algorithm (LSA), Voltage stability index (VSI), Curve fitting technique (CFT), Distribution network operators (DNOs)

  18. Optimal Capacitor Placement in Wind Farms by Considering Harmonics Using Discrete Lightning Search Algorithm

    Directory of Open Access Journals (Sweden)

    Reza Sirjani

    2017-09-01

    Full Text Available Currently, many wind farms exist throughout the world and, in some cases, supply a significant portion of energy to networks. However, numerous uncertainties remain with respect to the amount of energy generated by wind turbines and other sophisticated operational aspects, such as voltage and reactive power management, which require further development and consideration. To fix the problem of poor reactive power compensation in wind farms, optimal capacitor placement has been proposed in existing wind farms as a simple and relatively inexpensive method. However, the use of induction generators, transformers, and additional capacitors represents a potential problem for the harmonics of a system and therefore must be taken into account at wind farms. The optimal location and size of capacitors at the buses of an 80-MW wind farm were determined according to modelled wind speed, system equivalent circuits, and harmonics in order to minimize energy losses, optimize reactive power and reduce the management costs. The discrete version of the lightning search algorithm (DLSA) is a powerful and flexible nature-inspired optimization technique that was developed and implemented herein for optimal capacitor placement in wind farms. The obtained results are compared with the results of the genetic algorithm (GA) and the discrete harmony search algorithm (DHSA).

  19. An adaptive immune optimization algorithm with dynamic lattice searching operation for fast optimization of atomic clusters

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Xia, E-mail: xiawu@mail.nankai.edu.cn; Wu, Genhua

    2014-08-31

    Highlights: • A highly efficient method for the optimization of atomic clusters is developed. • Its performance is studied by optimizing Lennard-Jones clusters and Ag clusters. • The method is proved to be quite efficient. • A new Ag61 cluster with a stacking-fault face-centered cubic motif is found. - Abstract: Geometrical optimization of atomic clusters is performed by a development of the adaptive immune optimization algorithm (AIOA) with a dynamic lattice searching (DLS) operation (the AIOA-DLS method). By a cycle of construction and searching of the dynamic lattice (DL), the DLS algorithm rapidly makes the clusters more regular and greatly reduces the potential energy. DLS can thus be used as an operation acting on the new individuals after the mutation operation in AIOA to improve the performance of the AIOA. The AIOA-DLS method combines the merit of the evolutionary algorithm and the idea of the dynamic lattice. The performance of the proposed method is investigated in the optimization of Lennard-Jones clusters of up to 250 atoms and silver clusters, described by the many-body Gupta potential, of up to 150 atoms. Results reported in the literature are reproduced, and the motif of the Ag61 cluster is found to be stacking-fault face-centered cubic, whose energy is lower than that of the previously obtained icosahedron.

  20. Hybrid water flow-like algorithm with Tabu search for traveling salesman problem

    Science.gov (United States)

    Bostamam, Jasmin M.; Othman, Zulaiha

    2016-08-01

    This paper presents a hybrid Water Flow-like Algorithm with Tabu Search for solving the travelling salesman problem (WFA-TS-TSP). WFA has proven its outstanding performance in solving the TSP, while TS is a conventional algorithm that has been used for decades to solve various combinatorial optimization problems, including the TSP. Hybridizing WFA with TS provides a better balance between exploration and exploitation, which are the key elements determining the performance of a metaheuristic. TS uses two different local searches, namely 2-opt and 3-opt, separately. The proposed WFA-TS-TSP is tested on 23 of the well-known benchmark symmetric TSP instances. The results show that the proposed WFA-TS-TSP obtains significantly better quality solutions than WFA, and that the WFA-TS-TSP with 3-opt obtains the best quality solutions. From the results obtained, it can be concluded that WFA has the potential to be further improved by using hybrid techniques or better local search techniques.

  1. An Improved Hybrid Genetic Algorithm with a New Local Search Procedure

    Directory of Open Access Journals (Sweden)

    Wen Wan

    2013-01-01

    Full Text Available One important challenge of a hybrid genetic algorithm (HGA) (also called a memetic algorithm) is the trade-off between global and local searching (LS), as the cost of an LS can be rather high. This paper proposes a novel, simplified, and efficient HGA with a new individual learning procedure that performs an LS only when the best offspring (solution) in the offspring population is also the best in the current parent population. Additionally, a new LS method is developed based on a three-directional search (TD), which is derivative-free and self-adaptive. The new HGA with two different LS methods (the TD and the Nelder-Mead simplex) is compared with a traditional HGA. Four benchmark functions are employed to illustrate the improvement of the proposed method with the new learning procedure. The results show that the new HGA greatly reduces the number of function evaluations and converges much faster to the global optimum than a traditional HGA. The TD local search method is a good choice in helping to locate a global “mountain” (or “valley”) but may not outperform the Nelder-Mead method in the final fine-tuning toward the optimal solution.

  2. An Efficient Exact Algorithm for the Motif Stem Search Problem over Large Alphabets.

    Science.gov (United States)

    Yu, Qiang; Huo, Hongwei; Vitter, Jeffrey Scott; Huan, Jun; Nekrich, Yakov

    2015-01-01

    In recent years, there has been an increasing interest in planted (l, d) motif search (PMS) with applications to discovering significant segments in biological sequences. However, there has been little discussion about PMS over large alphabets. This paper focuses on motif stem search (MSS), which was recently introduced to search motifs on large-alphabet inputs. A motif stem is an l-length string with some wildcards. The goal of the MSS problem is to find a set of stems that represents a superset of all (l, d) motifs present in the input sequences, and the superset is expected to be as small as possible. The three main contributions of this paper are as follows: (1) We build the motif stem representation more precisely by using regular expressions. (2) We give a method for generating all possible motif stems without redundant wildcards. (3) We propose an efficient exact algorithm, called StemFinder, for solving the MSS problem. Compared with the previous MSS algorithms, StemFinder runs much faster and reports fewer stems, which represent a smaller superset of all (l, d) motifs. StemFinder is freely available at http://sites.google.com/site/feqond/stemfinder.
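
    Since the stems are represented with regular expressions, a small sketch of that representation is given below: an l-length stem with wildcard positions is compiled into a regex over the input alphabet and matched against the sequences. The protein alphabet, the '*' wildcard symbol and the example stem are illustrative assumptions.

```python
import re

def stem_to_regex(stem, alphabet="ACDEFGHIKLMNPQRSTVWY"):
    """Turn an l-length stem with '*' wildcards into a compiled regex.

    Each wildcard may match any single symbol of the (large) alphabet, so one
    stem stands for a whole set of candidate (l, d) motifs.
    """
    pattern = "".join(f"[{alphabet}]" if ch == "*" else re.escape(ch) for ch in stem)
    return re.compile(pattern)

def occurrences(stem, sequences):
    """Count how many input sequences contain at least one match of the stem."""
    rx = stem_to_regex(stem)
    return sum(1 for s in sequences if rx.search(s))

print(occurrences("AC*DE", ["MMACQDEKK", "ACWDE", "MMMMM"]))  # -> 2
```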

  3. ACTION OF UNIFORM SEARCH ALGORITHM WHEN SELECTING LANGUAGE UNITS IN THE PROCESS OF SPEECH

    Directory of Open Access Journals (Sweden)

    Ирина Михайловна Некипелова

    2013-05-01

    Full Text Available The article is devoted to the study of how a uniform search algorithm operates when a human selects language units in the process of speech production. The process is connected with the phenomenon of speech optimization, which makes it possible to shorten the time needed to think over what one wants to say and to achieve maximum precision in expressing thoughts. The uniform search algorithm works at the levels of consciousness and subconsciousness and favours the formation of automatism in the production and perception of speech. The realization of a human's cognitive potential in the process of communication starts up a complicated mechanism of self-organization and self-regulation of language. In turn, this results in the optimization of the language system, serving not only the human's need for self-actualization but also the realization of communication in society. The method of problem-oriented search is used for researching the optimization mechanisms that are distinctive to speech production and the stabilization of language. DOI: http://dx.doi.org/10.12731/2218-7405-2013-4-50

  4. The gravitational attraction algorithm: a new metaheuristic applied to a nuclear reactor core design optimization problem

    Energy Technology Data Exchange (ETDEWEB)

    Sacco, Wagner F.; Oliveira, Cassiano R.E. de [Georgia Institute of Technology, Atlanta, GA (United States). George W. Woodruff School of Mechanical Engineering. Nuclear and Radiological Engineering Program]. E-mail: wagner.sacco@me.gatech.edu; cassiano.oliveira@nre.gatech.edu; Pereira, Claudio M.N.A. [Instituto de Engenharia Nuclear (IEN), Rio de Janeiro, RJ (Brazil)]. E-mail: cmnap@ien.gov.br

    2005-07-01

    A new metaheuristic called the 'Gravitational Attraction Algorithm' (GAA) is introduced in this article. It is an analogy with the gravitational force field, where a body attracts another proportionally to both masses and inversely to their distance. The GAA is a population-based algorithm where, first of all, the solutions are clustered using the Fuzzy Clustering Means (FCM) algorithm. Following that, the gravitational force of each individual in relation to each cluster is evaluated, and the individual (solution) is displaced to the cluster with the greatest attractive force. Once it is inside this cluster, the solution receives small stochastic variations, performing a local exploration. Then the solutions are crossed over and the process starts all over again. The parameters required by the GAA are the 'diversity factor', which is used to create random diversity in a fashion similar to a genetic algorithm's mutation, and the number of clusters for the FCM. The GAA is applied to the reactor core design optimization problem, which consists in adjusting several reactor cell parameters in order to minimize the average peak factor in a 3-enrichment-zone reactor, considering operational restrictions. This problem was previously attacked using the canonical genetic algorithm (GA) and a Niching Genetic Algorithm (NGA). The new metaheuristic is then compared to those two algorithms. The three algorithms are submitted to the same computational effort and the GAA reaches the best results, showing its potential for other applications in the nuclear engineering field such as, for instance, the nuclear core reload optimization problem. (author)
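
    A minimal sketch of the attraction step described above: each solution's pull toward every cluster is computed as the product of the two 'masses' divided by their distance, and the solution is displaced to the most attractive cluster. The mass definitions (a fitness-derived mass for the solution, a mass per cluster) are assumptions for illustration, since the abstract does not spell them out.

```python
import numpy as np

def most_attractive_cluster(x, x_mass, centres, centre_masses, eps=1e-12):
    """Index of the cluster that pulls solution `x` most strongly.

    The pull of cluster k is modelled as F_k = m_x * M_k / d(x, c_k),
    by analogy with the gravitational force used in the GAA.
    """
    forces = []
    for c, M in zip(centres, centre_masses):
        d = np.linalg.norm(x - c) + eps          # avoid division by zero
        forces.append(x_mass * M / d)
    return int(np.argmax(forces))

# Example: two cluster centres in a 2-parameter design space.
centres = [np.array([0.2, 0.8]), np.array([0.7, 0.3])]
print(most_attractive_cluster(np.array([0.6, 0.4]), 1.0, centres, [0.5, 0.9]))
```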

  5. Liverpool's Discovery: A University Library Applies a New Search Tool to Improve the User Experience

    Science.gov (United States)

    Kenney, Brian

    2011-01-01

    This article features the University of Liverpool's arts and humanities library, which applies a new search tool to improve the user experience. In nearly every way imaginable, the Sydney Jones Library and the Harold Cohen Library--the university's two libraries that serve science, engineering, and medical students--support the lives of their…

  6. Investigation of super-resolution processing algorithm by target light-intensity search in digital holography

    Science.gov (United States)

    Neo, Atsushi; Kakue, Takashi; Shimobaba, Tomoyoshi; Masuda, Nobuyuki; Ito, Tomoyoshi

    2017-04-01

    Digital holography is expected to be useful in the measurement of moving three-dimensional (3D) images. In this technique, a two-dimensional interference fringe produced by a 3D image is recorded with an image sensor, and the 3D image is reproduced on a computer. To obtain reproduced 3D images with high spatial resolution, a high-performance image sensor is required, which increases the system cost. We propose an algorithm for super-resolution processing in digital holography that does not require a high-performance image sensor. The proposed algorithm, in which 3D images are considered as aggregations of object points, improves spatial resolution by performing a light-intensity search over the reproduced image and the object points.

  7. Parameter Estimation for Traffic Noise Models Using a Harmony Search Algorithm

    Directory of Open Access Journals (Sweden)

    Deok-Soon An

    2013-01-01

    Full Text Available A technique has been developed for predicting road traffic noise for environmental assessment, taking into account traffic volume as well as road surface conditions. The ASJ model (ASJ Prediction Model for Road Traffic Noise, 1999), which is based on the sound power level of the noise emitted by the interaction between the road surface and tires, employs regression models for two road surface types: dense-graded asphalt (DGA) and permeable asphalt (PA). However, these models are not applicable to other types of road surfaces. Accordingly, this paper introduces a parameter estimation procedure for ASJ-based noise prediction models, utilizing a harmony search (HS) algorithm. Traffic noise measurement data for four different vehicle types were used in the algorithm to determine the regression parameters for several road surface types. The parameters of the traffic noise prediction models were evaluated using another measurement set, and good agreement was observed between the predicted and measured sound power levels.
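
    To make the harmony search step concrete, the sketch below improvises one new parameter vector from a harmony memory using the usual memory-consideration, pitch-adjustment and random-selection rules; the HMCR, PAR and bandwidth values are typical defaults and are assumptions here, not the paper's settings.

```python
import random

def improvise(harmony_memory, lower, upper, hmcr=0.9, par=0.3, bw=0.05):
    """Create one new candidate parameter vector for the noise model.

    With probability `hmcr` a value is drawn from the harmony memory (and
    possibly pitch-adjusted within bandwidth `bw`); otherwise it is sampled
    uniformly from the allowed range.
    """
    new = []
    for j, (lo, hi) in enumerate(zip(lower, upper)):
        if random.random() < hmcr:
            value = random.choice(harmony_memory)[j]          # memory consideration
            if random.random() < par:
                value += random.uniform(-bw, bw) * (hi - lo)  # pitch adjustment
        else:
            value = random.uniform(lo, hi)                    # random selection
        new.append(min(max(value, lo), hi))                   # keep within bounds
    return new

memory = [[1.0, 30.0], [1.2, 28.5], [0.9, 31.0]]   # e.g. two regression parameters
print(improvise(memory, lower=[0.5, 20.0], upper=[2.0, 40.0]))
```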

  8. Optimization of Filter by using Support Vector Regression Machine with Cuckoo Search Algorithm

    Directory of Open Access Journals (Sweden)

    M. İlarslan

    2014-09-01

    Full Text Available Herein, a new methodology using 3D electromagnetic (EM) simulator-based Support Vector Regression Machine (SVRM) models of base elements is presented for band-pass filter (BPF) design. SVRM models of elements, which are as fast as analytical equations and as accurate as a 3D EM simulator, are employed in a simple and efficient Cuckoo Search Algorithm (CSA) to optimize an ultra-wideband (UWB) microstrip BPF. CSA performance is verified by comparing it with other meta-heuristics such as the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO). As an example of the proposed design methodology, a UWB BPF that operates between the frequencies of 3.1 GHz and 10.6 GHz is designed, fabricated and measured. The simulation and measurement results confirm the superior performance of this optimization methodology in terms of improved filter response characteristics such as return loss, insertion loss, harmonic suppression and group delay.

  9. MUSIC algorithm for location searching of dielectric anomalies from S-parameters using microwave imaging

    Science.gov (United States)

    Park, Won-Kwang; Kim, Hwa Pyung; Lee, Kwang-Jae; Son, Seong-Ho

    2017-11-01

    Motivated by biomedical engineering applications in early-stage breast cancer detection, we investigated the use of the MUltiple SIgnal Classification (MUSIC) algorithm for the location search of small anomalies using S-parameters. We considered the application of MUSIC to functional imaging where a small number of dipole antennas are used. Our approach is based on the application of the Born approximation or physical factorization. We analyzed cases in which the anomaly is respectively small and large in relation to the wavelength, and how the structure of the left-singular vectors is linked to the nonzero singular values of a Multi-Static Response (MSR) matrix whose elements are the S-parameters. Using simulations, we demonstrated the strengths and weaknesses of the MUSIC algorithm in detecting both small and extended anomalies.

  10. Parameter estimation by Differential Search Algorithm from horizontal loop electromagnetic (HLEM) data

    Science.gov (United States)

    Alkan, Hilal; Balkaya, Çağlayan

    2018-02-01

    We present an efficient inversion tool for parameter estimation from horizontal loop electromagnetic (HLEM) data using the Differential Search Algorithm (DSA), a swarm-intelligence-based metaheuristic proposed recently. The depth, dip, and origin of a thin subsurface conductor causing the anomaly are the parameters estimated by the HLEM method, commonly known as Slingram. The applicability of the developed scheme was first tested on two synthetically generated anomalies, with and without noise content. Two control parameters affecting the convergence characteristics of the algorithm were tuned for these anomalies, which include one and two conductive bodies, respectively. The tuned control parameters yielded more successful statistical results than the parameter pairs widely used in DSA applications. Two field anomalies measured over a dipping graphitic shale from Northern Australia were then considered, and the algorithm provided depth estimates in good agreement with those of previous studies and drilling information. Furthermore, the efficiency and reliability of the results obtained were investigated via the probability density function. Considering the results obtained, we can conclude that the DSA, characterized by its simple algorithmic structure, is an efficient and promising metaheuristic for other relatively low-dimensional geophysical inverse problems. Finally, researchers familiar with the content of the developed scheme, which is easy to use and flexible, can readily modify and expand it for their own optimization problems.

  11. A “Tuned” Mask Learnt Approach Based on Gravitational Search Algorithm

    Directory of Open Access Journals (Sweden)

    Youchuan Wan

    2016-01-01

    Full Text Available Texture image classification is an important topic in many applications of machine vision and image analysis. Extracting texture features from the original texture image using a “Tuned” mask is one of the simplest and most effective methods. However, hill-climbing-based training methods cannot acquire a satisfying mask in a single pass; on the other hand, some commonly used evolutionary algorithms such as the genetic algorithm (GA) and particle swarm optimization (PSO) easily fall into local optima. A novel approach for texture image classification, exemplified by the recognition of residential areas, is detailed in the paper. In the proposed approach, finding the “Tuned” mask is viewed as a constrained optimization problem, and the optimal “Tuned” mask is acquired by maximizing the texture energy via a newly proposed gravitational search algorithm (GSA). The optimal “Tuned” mask is achieved through the convergence of the GSA. The proposed approach has been tested on several public texture and remote sensing images. The results are then compared with those of the GA, PSO, honey-bee mating optimization (HBMO), and the artificial immune algorithm (AIA). Moreover, features extracted by the Gabor wavelet are also utilized to make a further comparison. Experimental results show that the proposed method is robust and adaptive and exhibits better performance than the other methods involved in the paper in terms of fitness value and classification accuracy.

  12. A stochastic local search algorithm for distance-based phylogeny reconstruction.

    Science.gov (United States)

    Tria, Francesca; Caglioti, Emanuele; Loreto, Vittorio; Pagnani, Andrea

    2010-11-01

    In many interesting cases, the reconstruction of a correct phylogeny is blurred by high mutation rates and/or horizontal transfer events. As a consequence, a divergence arises between the true evolutionary distances and the differences between pairs of taxa as inferred from available data, making phylogenetic reconstruction a challenging problem. Mathematically, this divergence translates into a loss of additivity of the actual distances between taxa. In distance-based reconstruction methods, two properties of additive distances have been extensively exploited as antagonist criteria to drive phylogeny reconstruction: on the one hand, a local property of quartets, that is, sets of four taxa in a tree, the four-points condition; on the other hand, a recently proposed formula that allows the tree length to be written as a function of the distances between taxa, Pauplin's formula. Here, we introduce a new reconstruction scheme that exploits both the four-points condition and Pauplin's formula in a unified framework. We propose, in particular, a new general class of distance-based Stochastic Local Search algorithms, which reduces in a limit case to the minimization of Pauplin's length. When tested on artificially generated phylogenies, our Stochastic Big-Quartet Swapping algorithmic scheme significantly outperforms state-of-the-art distance-based algorithms in cases of deviation from additivity due to a high rate of back mutations. A significant improvement over state-of-the-art algorithms is also observed in the case of a high rate of horizontal transfer.

  13. Understanding Air Transportation Market Dynamics Using a Search Algorithm for Calibrating Travel Demand and Price

    Science.gov (United States)

    Kumar, Vivek; Horio, Brant M.; DeCicco, Anthony H.; Hasan, Shahab; Stouffer, Virginia L.; Smith, Jeremy C.; Guerreiro, Nelson M.

    2015-01-01

    This paper presents a search-algorithm-based framework to calibrate origin-destination (O-D) market-specific airline ticket demands and prices for the Air Transportation System (ATS). This framework is used for calibrating an agent-based model of the air ticket buy-sell process - the Airline Evolutionary Simulation (Airline EVOS) - that has fidelity of detail accounting for airline and consumer behaviors and the interdependencies they share between themselves and the NAS. More specifically, the algorithm simultaneously calibrates demand and airfares for each O-D market to within a specified threshold of pre-specified target values. The proposed algorithm is illustrated with market data targets provided by the Transportation System Analysis Model (TSAM) and the Airline Origin and Destination Survey (DB1B). Although we specify these models and data sources for this calibration exercise, the methods described in this paper are applicable to calibrating any low-level model of the ATS to other demand-forecast-model-based data. We argue that using a calibration algorithm such as the one presented here to synchronize ATS models with specialized demand forecast models is a powerful tool for establishing credible baseline conditions in experiments analyzing the effects of proposed policy changes to the ATS.

  14. A Formal Verification Model for Performance Analysis of Reinforcement Learning Algorithms Applied to Dynamic Networks

    Directory of Open Access Journals (Sweden)

    Shrirang Ambaji KULKARNI

    2017-04-01

    Full Text Available Routing data packets in a dynamic network is a difficult and important problem in computer networks. As the network is dynamic, it is subject to frequent topology changes and to variable link costs due to congestion and bandwidth limits. Existing shortest-path algorithms fail to converge to better solutions under dynamic network conditions. Reinforcement learning algorithms possess better adaptation techniques in dynamic environments. In this paper we apply a model-based Q-Routing technique for routing in dynamic networks. To analyze the correctness of Q-Routing algorithms mathematically, we provide a proof and also implement a SPIN-based verification model. We also perform a simulation-based analysis of Q-Routing for the given metrics.
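    For reference, a hedged sketch of the classic Q-Routing update rule (delivery-time estimates learned per destination and next hop); the network representation and names below are illustrative and do not reproduce the paper's SPIN verification model:

      from collections import defaultdict

      class QRouter:
          def __init__(self, neighbors, eta=0.5):
              self.neighbors = neighbors           # dict: node -> list of neighbour nodes
              self.eta = eta                       # learning rate
              # Q[x][(d, y)]: estimated delivery time to destination d when x forwards via y
              self.Q = defaultdict(lambda: defaultdict(float))

          def best_neighbor(self, x, d):
              return min(self.neighbors[x], key=lambda y: self.Q[x][(d, y)])

          def update(self, x, y, d, queue_delay, trans_delay):
              # neighbour y reports back its own best remaining-time estimate for destination d
              remaining = min((self.Q[y][(d, z)] for z in self.neighbors[y]), default=0.0)
              sample = queue_delay + trans_delay + remaining
              self.Q[x][(d, y)] += self.eta * (sample - self.Q[x][(d, y)])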

  15. Optimum wavelet based masking for the contrast enhancement of medical images using enhanced cuckoo search algorithm.

    Science.gov (United States)

    Daniel, Ebenezer; Anitha, J

    2016-04-01

    Unsharp masking techniques are a prominent approach in contrast enhancement. The generalized masking formulation has a static scale-value selection, which limits the gain in contrast. In this paper, we propose an Optimum Wavelet Based Masking (OWBM) using an Enhanced Cuckoo Search Algorithm (ECSA) for the contrast improvement of medical images. The ECSA can automatically adjust the ratio of nest rebuilding using genetic operators such as adaptive crossover and mutation. First, the proposed contrast enhancement approach is validated quantitatively using Brain Web and MIAS database images. Later, the conventional nest rebuilding of cuckoo search optimization is modified using Adaptive Rebuilding of Worst Nests (ARWN). Experimental results are analyzed using various performance metrics, and our OWBM shows improved results compared with other reported literature. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. An Iterated Local Search Algorithm for Estimating the Parameters of the Gamma/Gompertz Distribution

    Directory of Open Access Journals (Sweden)

    Behrouz Afshar-Nadjafi

    2014-01-01

    Full Text Available Extensive research has been devoted to the estimation of the parameters of frequently used distributions. However, little attention has been paid to the estimation of the parameters of the Gamma/Gompertz distribution, which is often encountered in the customer lifetime and mortality risk literature. This distribution has three parameters. In this paper, we propose an algorithm for estimating the parameters of the Gamma/Gompertz distribution based on the maximum likelihood estimation method. Iterated local search (ILS) is used to maximize the likelihood function. Finally, the proposed approach is computationally tested using some numerical examples and the results are analyzed.
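    A minimal, hedged sketch of an iterated local search loop for maximum-likelihood estimation; neg_log_likelihood is a placeholder callback standing in for the Gamma/Gompertz likelihood (whose exact form is not reproduced here), and the perturbation and embedded local-search steps are generic choices, not the authors':

      import numpy as np

      def iterated_local_search(neg_log_likelihood, x0, n_restarts=20, step=0.1, seed=0):
          rng = np.random.default_rng(seed)

          def local_search(x):
              # simple coordinate-wise hill climbing as the embedded local search
              x = x.copy()
              improved = True
              while improved:
                  improved = False
                  for i in range(len(x)):
                      for delta in (step, -step):
                          cand = x.copy()
                          cand[i] = max(cand[i] + delta, 1e-9)   # keep parameters positive
                          if neg_log_likelihood(cand) < neg_log_likelihood(x):
                              x, improved = cand, True
              return x

          best = local_search(np.asarray(x0, dtype=float))
          for _ in range(n_restarts):
              perturbed = best * np.exp(rng.normal(0.0, 0.3, size=best.shape))  # perturbation step
              cand = local_search(perturbed)
              if neg_log_likelihood(cand) < neg_log_likelihood(best):
                  best = cand
          return best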

  17. An Effective Hybrid Firefly Algorithm with Harmony Search for Global Numerical Optimization

    Directory of Open Access Journals (Sweden)

    Lihong Guo

    2013-01-01

    Full Text Available A hybrid metaheuristic approach obtained by hybridizing harmony search (HS) and the firefly algorithm (FA), namely HS/FA, is proposed to solve function optimization problems. In HS/FA, the exploration of HS and the exploitation of FA are fully exerted, so HS/FA has a faster convergence speed than HS and FA. Also, a top-fireflies scheme is introduced to reduce the running time, and HS is utilized to mutate between fireflies when updating fireflies. The HS/FA method is verified on various benchmarks. The experiments show that the implementation of HS/FA is better than the standard FA and eight other optimization methods.

  18. Curved-line search algorithm for ab initio atomic structure relaxation

    Science.gov (United States)

    Chen, Zhanghui; Li, Jingbo; Li, Shushen; Wang, Lin-Wang

    2017-09-01

    Ab initio atomic relaxations often take large numbers of steps and long times to converge, especially when the initial atomic configurations are far from the local minimum or there are curved and narrow valleys in the multidimensional potentials. An atomic relaxation method based on on-the-fly force learning and a corresponding curved-line search algorithm is presented to accelerate this process. Results demonstrate the superior performance of this method for metal and magnetic clusters when compared with the conventional conjugate-gradient method.

  19. A Nonmonotone Line Search Filter Algorithm for the System of Nonlinear Equations

    Directory of Open Access Journals (Sweden)

    Zhong Jin

    2012-01-01

    Full Text Available We present a new iterative method based on the line search filter method with a nonmonotone strategy to solve systems of nonlinear equations. The equations are divided into two groups; some equations are treated as constraints and the others act as the objective function, and the two groups are only updated at the iterations where this is actually needed. We apply the nonmonotone idea to the sufficient reduction conditions and the filter technique, which leads to flexibility and an acceptance behavior comparable to monotone methods. The new algorithm is shown to be globally convergent, and numerical experiments demonstrate its effectiveness.

  20. PMSVM: An Optimized Support Vector Machine Classification Algorithm Based on PCA and Multilevel Grid Search Methods

    Directory of Open Access Journals (Sweden)

    Yukai Yao

    2015-01-01

    Full Text Available We propose an optimized Support Vector Machine classifier, named PMSVM, in which System Normalization, PCA, and Multilevel Grid Search methods are comprehensively considered for data preprocessing and parameter optimization, respectively. The main goals of this study are to improve the classification efficiency and accuracy of SVM. Sensitivity, specificity, precision, ROC curves, and so forth are adopted to appraise the performance of PMSVM. Experimental results show that PMSVM has relatively better accuracy and remarkably higher efficiency compared with traditional SVM algorithms.
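    A hedged illustration (not the authors' code) of the normalization + PCA + grid-searched SVM pipeline described above, using scikit-learn; the paper's multilevel refinement is approximated here by a single coarse grid, and X_train/y_train stand for the user's data:

      from sklearn.pipeline import Pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA
      from sklearn.svm import SVC
      from sklearn.model_selection import GridSearchCV

      pipe = Pipeline([
          ("scale", StandardScaler()),       # system normalization
          ("pca", PCA(n_components=0.95)),   # keep 95% of the variance
          ("svm", SVC(kernel="rbf")),
      ])
      param_grid = {
          "svm__C": [2.0 ** k for k in range(-5, 16, 2)],
          "svm__gamma": [2.0 ** k for k in range(-15, 4, 2)],
      }
      search = GridSearchCV(pipe, param_grid, cv=5, n_jobs=-1)
      # search.fit(X_train, y_train); search.best_params_ would then seed a finer, second-level grid.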

  1. A Rule-Based Local Search Algorithm for General Shift Design Problems in Airport Ground Handling

    DEFF Research Database (Denmark)

    Clausen, Tommy

    We consider a generalized version of the shift design problem where shifts are created to cover a multiskilled demand and fit the parameters of the workforce. We present a collection of constraints and objectives for the generalized shift design problem. A local search solution framework with multiple neighborhoods and a loosely coupled rule engine based on simulated annealing is presented. Computational experiments on real-life data from various airport ground handling organizations show the performance and flexibility of the proposed algorithm.

  2. NSGA-II Algorithm with a Local Search Strategy for Multiobjective Optimal Design of Dry-Type Air-Core Reactor

    Directory of Open Access Journals (Sweden)

    Chengfen Zhang

    2015-01-01

    Full Text Available The dry-type air-core reactor is now widely applied in electrical power distribution systems, for which optimal design is a crucial issue. In the design of a dry-type air-core reactor, the objectives of minimizing the production cost and minimizing the operation cost are both important. In this paper, a multiobjective optimization model is established that simultaneously considers these two objectives. To solve the multiobjective optimization problem, a memetic evolutionary algorithm is proposed, which combines the elitist nondominated sorting genetic algorithm version II (NSGA-II) with a local search strategy based on the covariance matrix adaptation evolution strategy (CMA-ES). NSGA-II can provide the decision maker with flexible choices among the different trade-off solutions, while the local search strategy, which is applied to a given number of nondominated individuals randomly selected from the current population in each generation, can accelerate the convergence speed. Furthermore, an external archive is maintained in the proposed algorithm to increase the evolutionary efficiency. The proposed algorithm is tested on a dry-type air-core reactor made of rectangular cross-section litz-wire. Simulation results show that the proposed algorithm has high efficiency and converges to a better Pareto front.

  3. Improved approach for electric vehicle rapid charging station placement and sizing using Google maps and binary lightning search algorithm.

    Directory of Open Access Journals (Sweden)

    Md Mainul Islam

    Full Text Available The electric vehicle (EV) is considered a premium solution to global warming and various types of pollution. Nonetheless, a key concern is the recharging of EV batteries. Therefore, this study proposes a novel approach that considers the costs of transportation loss, buildup, and substation energy loss and that incorporates harmonic power loss into optimal rapid charging station (RCS) planning. A novel optimization technique, called binary lightning search algorithm (BLSA), is proposed to solve the optimization problem. BLSA is also applied to a conventional RCS planning method. A comprehensive analysis is conducted to assess the performance of the two RCS planning methods by using the IEEE 34-bus test system as the power grid. The comparative studies show that the proposed BLSA is better than other optimization techniques. The daily total cost in RCS planning of the proposed method, including harmonic power loss, decreases by 10% compared with that of the conventional method.

  4. Improved approach for electric vehicle rapid charging station placement and sizing using Google maps and binary lightning search algorithm.

    Science.gov (United States)

    Islam, Md Mainul; Shareef, Hussain; Mohamed, Azah

    2017-01-01

    The electric vehicle (EV) is considered a premium solution to global warming and various types of pollution. Nonetheless, a key concern is the recharging of EV batteries. Therefore, this study proposes a novel approach that considers the costs of transportation loss, buildup, and substation energy loss and that incorporates harmonic power loss into optimal rapid charging station (RCS) planning. A novel optimization technique, called binary lightning search algorithm (BLSA), is proposed to solve the optimization problem. BLSA is also applied to a conventional RCS planning method. A comprehensive analysis is conducted to assess the performance of the two RCS planning methods by using the IEEE 34-bus test system as the power grid. The comparative studies show that the proposed BLSA is better than other optimization techniques. The daily total cost in RCS planning of the proposed method, including harmonic power loss, decreases by 10% compared with that of the conventional method.

  5. Improved approach for electric vehicle rapid charging station placement and sizing using Google maps and binary lightning search algorithm

    Science.gov (United States)

    Shareef, Hussain; Mohamed, Azah

    2017-01-01

    The electric vehicle (EV) is considered a premium solution to global warming and various types of pollution. Nonetheless, a key concern is the recharging of EV batteries. Therefore, this study proposes a novel approach that considers the costs of transportation loss, buildup, and substation energy loss and that incorporates harmonic power loss into optimal rapid charging station (RCS) planning. A novel optimization technique, called binary lightning search algorithm (BLSA), is proposed to solve the optimization problem. BLSA is also applied to a conventional RCS planning method. A comprehensive analysis is conducted to assess the performance of the two RCS planning methods by using the IEEE 34-bus test system as the power grid. The comparative studies show that the proposed BLSA is better than other optimization techniques. The daily total cost in RCS planning of the proposed method, including harmonic power loss, decreases by 10% compared with that of the conventional method. PMID:29220396

  6. A comparative study of machine learning algorithms applied to predictive toxicology data mining.

    Science.gov (United States)

    Neagu, Daniel C; Guo, Gongde; Trundle, Paul R; Cronin, Mark T D

    2007-03-01

    This paper reports results of a comparative study of widely used machine learning algorithms applied to predictive toxicology data mining. The machine learning algorithms involved were chosen in terms of their representability and diversity, and were extensively evaluated with seven toxicity data sets which were taken from real-world applications. Some results based on visual analysis of the correlations of different descriptors to the class values of chemical compounds, and on the relationships of the range of chosen descriptors to the performance of machine learning algorithms, are emphasised from our experiments. Some interesting findings relating to the data and the quality of the models are presented--for example, that no specific algorithm appears best for all seven toxicity data sets, and that up to five descriptors are sufficient for creating classification models for each toxicity data set with good accuracy. We suggest that, for a specific data set, model accuracy is affected by the feature selection method and model development technique. Models built with too many or too few descriptors are undesirable, and finding the optimal feature subset appears at least as important as selecting appropriate algorithms with which to build a final model.

  7. A New Operational Snow Retrieval Algorithm Applied to Historical AMSR-E Brightness Temperatures

    Directory of Open Access Journals (Sweden)

    Marco Tedesco

    2016-12-01

    Full Text Available Snow is a key element of the water and energy cycles and the knowledge of spatio-temporal distribution of snow depth and snow water equivalent (SWE) is fundamental for hydrological and climatological applications. SWE and snow depth estimates can be obtained from spaceborne microwave brightness temperatures at global scale and high temporal resolution (daily). In this regard, the data recorded by the Advanced Microwave Scanning Radiometer—Earth Orbiting System (EOS) (AMSR-E) onboard the National Aeronautics and Space Administration’s (NASA) AQUA spacecraft have been used to generate operational estimates of SWE and snow depth, complementing estimates generated with other microwave sensors flying on other platforms. In this study, we report the results concerning the development and assessment of a new operational algorithm applied to historical AMSR-E data. The new algorithm here proposed makes use of climatological data, electromagnetic modeling and artificial neural networks for estimating snow depth as well as a spatio-temporal dynamic density scheme to convert snow depth to SWE. The outputs of the new algorithm are compared with those of the current AMSR-E operational algorithm as well as in-situ measurements and other operational snow products, specifically the Canadian Meteorological Center (CMC) and GlobSnow datasets. Our results show that the AMSR-E algorithm here proposed generally performs better than the operational one and addresses some major issues identified in the spatial distribution of snow depth fields associated with the evolution of effective grain size.

  8. A reverse engineering algorithm for neural networks, applied to the subthalamopallidal network of basal ganglia.

    Science.gov (United States)

    Floares, Alexandru George

    2008-01-01

    Modeling neural networks with ordinary differential equations systems is a sensible approach, but also very difficult. This paper describes a new algorithm based on linear genetic programming which can be used to reverse engineer neural networks. The RODES algorithm automatically discovers the structure of the network, including neural connections, their signs and strengths, estimates its parameters, and can even be used to identify the biophysical mechanisms involved. The algorithm is tested on simulated time series data, generated using a realistic model of the subthalamopallidal network of basal ganglia. The resulting ODE system is highly accurate, and results are obtained in a matter of minutes. This is because the problem of reverse engineering a system of coupled differential equations is reduced to one of reverse engineering individual algebraic equations. The algorithm allows the incorporation of common domain knowledge to restrict the solution space. To our knowledge, this is the first time a realistic reverse engineering algorithm based on linear genetic programming has been applied to neural networks.

  9. A New Operational Snow Retrieval Algorithm Applied to Historical AMSR-E Brightness Temperatures

    Science.gov (United States)

    Tedesco, Marco; Jeyaratnam, Jeyavinoth

    2016-01-01

    Snow is a key element of the water and energy cycles and the knowledge of spatio-temporal distribution of snow depth and snow water equivalent (SWE) is fundamental for hydrological and climatological applications. SWE and snow depth estimates can be obtained from spaceborne microwave brightness temperatures at global scale and high temporal resolution (daily). In this regard, the data recorded by the Advanced Microwave Scanning Radiometer-Earth Orbiting System (EOS) (AMSR-E) onboard the National Aeronautics and Space Administration's (NASA) AQUA spacecraft have been used to generate operational estimates of SWE and snow depth, complementing estimates generated with other microwave sensors flying on other platforms. In this study, we report the results concerning the development and assessment of a new operational algorithm applied to historical AMSR-E data. The new algorithm here proposed makes use of climatological data, electromagnetic modeling and artificial neural networks for estimating snow depth as well as a spatio-temporal dynamic density scheme to convert snow depth to SWE. The outputs of the new algorithm are compared with those of the current AMSR-E operational algorithm as well as in-situ measurements and other operational snow products, specifically the Canadian Meteorological Center (CMC) and GlobSnow datasets. Our results show that the AMSR-E algorithm here proposed generally performs better than the operational one and addresses some major issues identified in the spatial distribution of snow depth fields associated with the evolution of effective grain size.

  10. Steganography: Applying and Evaluating Two Algorithms for Embedding Audio Data in an Image

    OpenAIRE

    Khaled Nasser ElSayed

    2015-01-01

    Information transmission is increasing with the growth of Web usage, so information security has become very important. Securing data and information is a major task for scientists as well as political and military bodies. One of the most secure methods is embedding data (steganography) in different media such as text, audio, and digital images. This paper presents two experiments in steganography of digital audio data files. It empirically applies two algorithms for steganography in images through random in...
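    As a generic, hedged illustration of the kind of embedding evaluated above, a least-significant-bit (LSB) scheme that hides audio bytes in image pixels; the paper's two specific algorithms, including the random-placement variant, are not reproduced here:

      import numpy as np

      def embed_lsb(image, payload_bytes):
          flat = image.flatten().astype(np.uint8)
          bits = np.unpackbits(np.frombuffer(payload_bytes, dtype=np.uint8))
          if bits.size > flat.size:
              raise ValueError("payload too large for cover image")
          flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits   # overwrite the lowest bit of each pixel
          return flat.reshape(image.shape)

      def extract_lsb(stego_image, n_bytes):
          bits = (stego_image.flatten()[:n_bytes * 8] & 1).astype(np.uint8)
          return np.packbits(bits).tobytes()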

  11. Turn-Based War Chess Model and Its Search Algorithm per Turn

    Directory of Open Access Journals (Sweden)

    Hai Nan

    2016-01-01

    Full Text Available War chess gaming has so far received insufficient attention but is a significant component of turn-based strategy games (TBS) and is studied in this paper. First, a common game model is proposed through various existing war chess types. Based on the model, we propose a theoretical frame involving combinatorial optimization on the one hand and game tree search on the other hand. We also discuss a key problem, namely, that the number of branching factors in each turn of the game tree is huge. Then, we propose two algorithms for searching within one turn to solve the problem: (1) enumeration by order; (2) enumeration by recursion. The main difference between these two is the permutation method used: the former uses the dictionary-sequence method, while the latter uses the recursive permutation method. Finally, we prove that both of these algorithms are optimal, and we analyze the difference between their efficiencies. An important factor is the total cost of expanding each unit over its reachable positions; this factor, the total number of expansions each unit makes within its reachable positions, is fixed. In terms of this factor, the conclusion is that enumeration by recursion is better than enumeration by order in all situations.
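    A small, hedged sketch contrasting the two enumeration styles named above for the order in which n units act within one turn: lexicographic ("by order") versus recursive permutation generation; the unit-expansion bookkeeping of the paper is omitted:

      from itertools import permutations

      def enumerate_by_order(units):
          # dictionary-sequence (lexicographic) enumeration of acting orders
          return list(permutations(sorted(units)))

      def enumerate_by_recursion(units):
          if len(units) <= 1:
              return [tuple(units)]
          result = []
          for i, u in enumerate(units):
              for rest in enumerate_by_recursion(units[:i] + units[i + 1:]):
                  result.append((u,) + rest)
          return result

      # Both generate the same n! acting orders; the paper compares their efficiency when each
      # unit's reachable positions must also be expanded.
      assert set(enumerate_by_order([1, 2, 3])) == set(enumerate_by_recursion([1, 2, 3]))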

  12. A Hybrid alldifferent-Tabu Search Algorithm for Solving Sudoku Puzzles

    Directory of Open Access Journals (Sweden)

    Ricardo Soto

    2015-01-01

    Full Text Available The Sudoku problem is a well-known logic-based puzzle of combinatorial number-placement. It consists in filling an n² × n² grid, composed of n columns, n rows, and n subgrids, each one containing distinct integers from 1 to n². Such a puzzle belongs to the NP-complete collection of problems, to which there exist diverse exact and approximate methods able to solve it. In this paper, we propose a new hybrid algorithm that smartly combines a classic tabu search procedure with the alldifferent global constraint from the constraint programming world. The alldifferent constraint is known to be efficient for domain filtering in the presence of constraints that must be pairwise different, which are exactly the kind of constraints that Sudokus own. This ability clearly alleviates the work of the tabu search, resulting in a faster and more robust approach for solving Sudokus. We illustrate interesting experimental results where our proposed algorithm outperforms the best results previously reported by hybrids and approximate methods.
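    A hedged fragment (not the authors' hybrid) showing a plain tabu-search step for a 9 × 9 Sudoku in which every 3 × 3 subgrid always holds the digits 1-9, moves swap two non-clue cells inside one subgrid, and the cost counts repeated digits in rows and columns; the alldifferent filtering layer that distinguishes the paper's method is omitted:

      import random

      def cost(grid):                      # grid: 9x9 list of lists
          c = 0
          for i in range(9):
              row = [grid[i][j] for j in range(9)]
              col = [grid[j][i] for j in range(9)]
              c += (9 - len(set(row))) + (9 - len(set(col)))
          return c

      def tabu_step(grid, clues, tabu, tabu_len=50):
          best_move, best_cost = None, None
          for _ in range(100):                             # sample candidate swaps
              b = random.randrange(9)                      # pick a subgrid
              cells = [(3 * (b // 3) + r, 3 * (b % 3) + c) for r in range(3) for c in range(3)
                       if (3 * (b // 3) + r, 3 * (b % 3) + c) not in clues]
              if len(cells) < 2:
                  continue
              (r1, c1), (r2, c2) = random.sample(cells, 2)
              if ((r1, c1), (r2, c2)) in tabu:
                  continue
              grid[r1][c1], grid[r2][c2] = grid[r2][c2], grid[r1][c1]   # trial swap
              c = cost(grid)
              if best_cost is None or c < best_cost:
                  best_move, best_cost = ((r1, c1), (r2, c2)), c
              grid[r1][c1], grid[r2][c2] = grid[r2][c2], grid[r1][c1]   # undo trial swap
          if best_move:
              (r1, c1), (r2, c2) = best_move
              grid[r1][c1], grid[r2][c2] = grid[r2][c2], grid[r1][c1]   # commit best non-tabu move
              tabu.append(best_move)
              del tabu[:-tabu_len]                                      # keep only recent moves
          return best_cost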

  13. A Hybrid alldifferent-Tabu Search Algorithm for Solving Sudoku Puzzles

    Science.gov (United States)

    Crawford, Broderick; Paredes, Fernando; Norero, Enrique

    2015-01-01

    The Sudoku problem is a well-known logic-based puzzle of combinatorial number-placement. It consists in filling an n² × n² grid, composed of n columns, n rows, and n subgrids, each one containing distinct integers from 1 to n². Such a puzzle belongs to the NP-complete collection of problems, to which there exist diverse exact and approximate methods able to solve it. In this paper, we propose a new hybrid algorithm that smartly combines a classic tabu search procedure with the alldifferent global constraint from the constraint programming world. The alldifferent constraint is known to be efficient for domain filtering in the presence of constraints that must be pairwise different, which are exactly the kind of constraints that Sudokus own. This ability clearly alleviates the work of the tabu search, resulting in a faster and more robust approach for solving Sudokus. We illustrate interesting experimental results where our proposed algorithm outperforms the best results previously reported by hybrids and approximate methods. PMID:26078751

  14. Application of Harmony Search algorithm to the solution of groundwater management models

    Science.gov (United States)

    Tamer Ayvaz, M.

    2009-06-01

    This study proposes a groundwater resources management model in which the solution is performed through a combined simulation-optimization model. A modular three-dimensional finite difference groundwater flow model, MODFLOW, is used as the simulation model. This model is then combined with a Harmony Search (HS) optimization algorithm, which is based on the musical process of searching for a perfect state of harmony. The performance of the proposed HS-based management model is tested on three separate groundwater management problems: (i) maximization of total pumping from an aquifer (steady-state); (ii) minimization of the total pumping cost to satisfy a given demand (steady-state); and (iii) minimization of the pumping cost to satisfy a given demand over multiple management periods (transient). The sensitivity of the HS algorithm is evaluated by performing a sensitivity analysis which aims to determine the impact of the related solution parameters on convergence behavior. The results show that HS yields nearly the same or better solutions than previous solution methods and may be used to solve management problems in groundwater modeling.
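    A hedged sketch of the basic Harmony Search loop (memory consideration, pitch adjustment, random selection); the MODFLOW-based management objective of the paper is represented here only by an abstract objective(x) callback to be minimized:

      import numpy as np

      def harmony_search(objective, bounds, hms=20, hmcr=0.9, par=0.3, bw=0.05, iters=2000, seed=0):
          rng = np.random.default_rng(seed)
          lo = np.array([b[0] for b in bounds])
          hi = np.array([b[1] for b in bounds])
          hm = rng.uniform(lo, hi, size=(hms, len(bounds)))        # harmony memory
          fit = np.array([objective(x) for x in hm])
          for _ in range(iters):
              new = np.empty(len(bounds))
              for d in range(len(bounds)):
                  if rng.random() < hmcr:                          # memory consideration
                      new[d] = hm[rng.integers(hms), d]
                      if rng.random() < par:                       # pitch adjustment
                          new[d] += bw * (hi[d] - lo[d]) * rng.uniform(-1, 1)
                  else:                                            # random selection
                      new[d] = rng.uniform(lo[d], hi[d])
              new = np.clip(new, lo, hi)
              f = objective(new)
              worst = int(np.argmax(fit))
              if f < fit[worst]:                                   # replace the worst harmony
                  hm[worst], fit[worst] = new, f
          best = int(np.argmin(fit))
          return hm[best], fit[best]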

  15. Optimization process planning using hybrid genetic algorithm and intelligent search for job shop machining.

    Science.gov (United States)

    Salehi, Mojtaba; Bahreininejad, Ardeshir

    2011-08-01

    Optimization of process planning is considered the key technology for computer-aided process planning, which is a rather complex and difficult procedure. A good process plan for a part is built up from two elements: (1) the optimized sequence of the operations of the part; and (2) the optimized selection of the machine, cutting tool and Tool Access Direction (TAD) for each operation. In the present work, process planning is divided into preliminary planning and secondary/detailed planning. In the preliminary stage, the feasible sequences are generated based on an analysis of order and clustering constraints, as a compulsive constraint aggregation in operation sequencing, using an intelligent searching strategy. Then, in the detailed planning stage, the genetic algorithm, which prunes the initial feasible sequences, is used to obtain the optimized operation sequence and the optimized selection of the machine, cutting tool and TAD for each operation, based on optimization constraints as an additive constraint aggregation. The main contribution of this work is the simultaneous optimization of the sequence of operations of the part and of the machine, cutting tool and TAD selection for each operation, using the intelligent search and the genetic algorithm.

  16. Obstacle Avoidance for Redundant Manipulators Utilizing a Backward Quadratic Search Algorithm

    Directory of Open Access Journals (Sweden)

    Tianjian Hu

    2016-06-01

    Full Text Available Obstacle avoidance can be achieved as a secondary task by appropriate inverse kinematics (IK) resolution of redundant manipulators. Most of the prior literature requires the time-consuming determination of the closest point to the obstacle at every calculation step. To relieve this computational burden, this paper develops what is termed a backward quadratic search algorithm (BQSA) as another option for solving IK problems in obstacle avoidance. The BQSA detects possible collisions based on the root property of a category of quadratic functions, which are derived from ellipse-enveloped obstacles and the positions of each link's end-points. The algorithm executes a backward search for possible obstacle collisions, from the end-effector to the base, and avoids obstacles by utilizing a hybrid IK scheme, incorporating the damped least-squares method, the weighted least-norm method and the gradient projection method. Some details of the hybrid IK scheme, such as the values of the damping factor, weights and the clamping velocity, are discussed, along with a comparison of the computational load between previous methods and BQSA. Simulations of a planar seven-link manipulator and a PUMA 560 robot verify the effectiveness of BQSA.

  17. A local search with smoothing approximation in hybrid algorithms of diagnostics of hydromechanical systems

    Directory of Open Access Journals (Sweden)

    V. D. Sulimov

    2014-01-01

    Full Text Available Modern methods for solving practical problems relating to the trouble-free, efficient and prolonged operation of complex systems presume the application of computational diagnostics. Input data for diagnosis usually contain the results of experimental measurements of certain characteristics of the system under investigation; among them may be registered parameters of oscillatory motion or an impact process. The diagnostic procedure is founded on the solution of the corresponding inverse spectral problem; in many cases the problem may be reduced to the minimization of an appropriate error criterion. Eigenvalues from the direct problem for the mathematical model and the measured data for the system are used to construct the corresponding criterion. When solving these inverse problems, the following special feature must be considered: the error criterion may be represented by a nondifferentiable and multiextremal function. Consideration is given to problems of identifying anomalies in the phase constitution of the coolant circulating through the reactor primary circuit. The main dynamical characteristics of the object under diagnosis are considered as continuous functions of a bounded set of control variables. Possible occurrence of anomalies in the phase constitution of the coolant can be detected owing to changes in the dynamical characteristics of the two-phase flow. It is assumed that the criterion functions are continuous, Lipschitzian, multiextremal and not everywhere differentiable. Two novel hybrid algorithms are proposed that scan the search space using the modern stochastic Multi-Particle Collision Algorithm, by analogy with absorption and scattering processes for nuclear particles. The local search is implemented using the hyperbolic smoothing function method for the first algorithm, and the linearization method with two-parametric smoothing approximations of the criteria for the second one. Some results on solving

  18. Optimum Design of Gravity Retaining Walls Using Charged System Search Algorithm

    Directory of Open Access Journals (Sweden)

    S. Talatahari

    2012-01-01

    Full Text Available This study focuses on the optimum design of gravity retaining walls, one of the familiar types of retaining walls, which may be constructed of stone masonry, unreinforced concrete, or reinforced concrete. The material cost is one of the major factors in the construction of gravity retaining walls; therefore, minimizing the weight or volume of these systems can reduce the cost. To obtain an optimal seismic design of such structures, this paper proposes a method based on a novel meta-heuristic algorithm. The algorithm is inspired by Coulomb's and Gauss's laws of electrostatics in physics, and it is called charged system search (CSS). In order to evaluate the efficiency of this algorithm, an example is utilized. Comparing the results with retaining wall designs obtained by other methods illustrates the good performance of the CSS. In this paper, we use the Mononobe-Okabe method, one of the pseudostatic approaches, to determine the dynamic earth pressure.

  19. A depth-first search algorithm to compute elementary flux modes by linear programming.

    Science.gov (United States)

    Quek, Lake-Ee; Nielsen, Lars K

    2014-07-30

    The decomposition of complex metabolic networks into elementary flux modes (EFMs) provides a useful framework for exploring reaction interactions systematically. Generating a complete set of EFMs for large-scale models, however, is near impossible, and remains challenging even for moderately sized models. The depth-first search algorithm presented here uses linear programming (LP) to enumerate EFMs in an exhaustive fashion. Constraints can be introduced to directly generate a subset of EFMs satisfying the set of constraints. The depth-first search algorithm has a constant memory overhead. Using flux constraints, a large LP problem can be massively divided and parallelized into independent sub-jobs for deployment into computing clusters. Since the sub-jobs do not overlap, the approach scales to utilize all available computing nodes with minimal coordination overhead or memory limitations. The speed of the algorithm was comparable to efmtool, a mainstream Double Description method, when enumerating all EFMs; the attrition power gained from performing flux feasibility tests offsets the increased computational demand of running an LP solver. Unlike the Double Description method, the algorithm enables accelerated enumeration of all EFMs satisfying a set of constraints.
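    A generic, hedged illustration of the flux-feasibility test that such an LP-based enumeration relies on (not the authors' implementation): given a stoichiometric matrix S and a set of reactions forced inactive on the current search branch, check whether a steady-state flux distribution carrying the target reaction still exists.

      import numpy as np
      from scipy.optimize import linprog

      def flux_feasible(S, inactive, target, v_max=1000.0):
          n = S.shape[1]
          bounds = [(0.0, v_max)] * n                 # assume all reactions irreversible here
          for j in inactive:
              bounds[j] = (0.0, 0.0)                  # suppress reactions excluded on this branch
          bounds[target] = (1.0, v_max)               # force the target reaction to carry flux
          c = np.zeros(n)                             # pure feasibility problem, no objective
          res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
          return res.status == 0                      # status 0 => a feasible flux vector exists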

  20. A Biogeography-Based Optimization Algorithm Hybridized with Tabu Search for the Quadratic Assignment Problem.

    Science.gov (United States)

    Lim, Wee Loon; Wibowo, Antoni; Desa, Mohammad Ishak; Haron, Habibollah

    2016-01-01

    The quadratic assignment problem (QAP) is an NP-hard combinatorial optimization problem with a wide variety of applications. Biogeography-based optimization (BBO), a relatively new optimization technique based on the biogeography concept, uses the idea of the migration strategy of species to derive an algorithm for solving optimization problems. It has been shown that BBO provides performance on a par with other optimization methods. A classical BBO algorithm employs the mutation operator as its diversification strategy. However, this process will often ruin the quality of solutions in QAP. In this paper, we propose a hybrid technique to overcome the weakness of the classical BBO algorithm in solving QAP, by replacing the mutation operator with a tabu search procedure. Our experiments using the benchmark instances from QAPLIB show that the proposed hybrid method is able to find good solutions for them within reasonable computational times. Out of 61 benchmark instances tested, the proposed method is able to obtain the best known solutions for 57 of them.

  1. A Biogeography-Based Optimization Algorithm Hybridized with Tabu Search for the Quadratic Assignment Problem

    Directory of Open Access Journals (Sweden)

    Wee Loon Lim

    2016-01-01

    Full Text Available The quadratic assignment problem (QAP) is an NP-hard combinatorial optimization problem with a wide variety of applications. Biogeography-based optimization (BBO), a relatively new optimization technique based on the biogeography concept, uses the idea of the migration strategy of species to derive an algorithm for solving optimization problems. It has been shown that BBO provides performance on a par with other optimization methods. A classical BBO algorithm employs the mutation operator as its diversification strategy. However, this process will often ruin the quality of solutions in QAP. In this paper, we propose a hybrid technique to overcome the weakness of the classical BBO algorithm in solving QAP, by replacing the mutation operator with a tabu search procedure. Our experiments using the benchmark instances from QAPLIB show that the proposed hybrid method is able to find good solutions for them within reasonable computational times. Out of 61 benchmark instances tested, the proposed method is able to obtain the best known solutions for 57 of them.

  2. Dual Search Maximum Power Point (DSMPP) Algorithm Based on Mathematical Analysis under Shaded Conditions

    Directory of Open Access Journals (Sweden)

    Shahrooz Hajighorbani

    2015-10-01

    Full Text Available Photovoltaic (PV) systems represent a clean, renewable source of energy that has non-linear current-voltage (I-V) and power-voltage (P-V) characteristics. To increase the efficiency, a PV system must operate at the maximum power point (MPP) to produce the maximum available power. Under uniform conditions, there is only a single MPP in the P-V curve of a PV system; however, determining the MPP is more complicated under partially shaded conditions (PSCs) because multiple peak power points exist. In recent years, various studies have been performed to obtain the highest peak power point under PSCs, which is referred to as the global maximum power point (GMPP). In this paper, a novel method based on mathematical analysis that reduces the search zone and simultaneously identifies the possible MPPs in the specified zone is proposed; this proposed method is called the dual search maximum power point (DSMPP) algorithm. To evaluate the effectiveness of the proposed method, simulation and hardware implementations are carried out. The results show that the search time of GMPP is significantly reduced and the GMPP is detected in the minimum amount of time with high accuracy and minimum oscillation in the power produced.

  3. The Surface Extraction from TIN based Search-space Minimization (SETSM) algorithm

    Science.gov (United States)

    Noh, Myoung-Jong; Howat, Ian M.

    2017-07-01

    Digital Elevation Models (DEMs) provide critical information for a wide range of scientific, navigational and engineering activities. Submeter resolution, stereoscopic satellite imagery with high geometric and radiometric quality, and wide spatial coverage are becoming increasingly accessible for generating stereo-photogrammetric DEMs. However, low contrast and repeatedly-textured surfaces, such as snow and glacial ice at high latitudes, and mountainous terrains challenge existing stereo-photogrammetric DEM generation techniques, particularly without a-priori information such as existing seed DEMs or the manual setting of terrain-specific parameters. To utilize these data for fully-automatic DEM extraction at a large scale, we developed the Surface Extraction from TIN-based Search-space Minimization (SETSM) algorithm. SETSM is fully automatic (i.e. no search parameter settings are needed) and uses only the sensor model Rational Polynomial Coefficients (RPCs). SETSM adopts a hierarchical, combined image- and object-space matching strategy utilizing weighted normalized cross-correlation with both original distorted and geometrically corrected images for overcoming ambiguities caused by foreshortening and occlusions. In addition, SETSM optimally minimizes search-spaces to extract optimal matches over problematic terrains by iteratively updating object surfaces within a Triangulated Irregular Network, and utilizes a geometric-constrained blunder and outlier detection in object space. We prove the ability of SETSM to mitigate typical stereo-photogrammetric matching problems over a range of challenging terrains. SETSM is the primary DEM generation software for the US National Science Foundation's ArcticDEM project.

  4. Hybrid geometric-random template-placement algorithm for gravitational wave searches from compact binary coalescences

    Science.gov (United States)

    Roy, Soumen; Sengupta, Anand S.; Thakor, Nilay

    2017-05-01

    Astrophysical compact binary systems consisting of neutron stars and black holes are an important class of gravitational wave (GW) sources for advanced LIGO detectors. Accurate theoretical waveform models from the inspiral, merger, and ringdown phases of such systems are used to filter detector data under the template-based matched-filtering paradigm. An efficient grid over the parameter space at a fixed minimal match has a direct impact on the overall time taken by these searches. We present a new hybrid geometric-random template placement algorithm for signals described by parameters of two masses and one spin magnitude. Such template banks could potentially be used in GW searches from binary neutron stars and neutron star-black hole systems. The template placement is robust and is able to automatically accommodate curvature and boundary effects with no fine-tuning. We also compare these banks against vanilla stochastic template banks and show that while both are equally efficient in the fitting-factor sense, the bank sizes are ˜25 % larger in the stochastic method. Further, we show that the generation of the proposed hybrid banks can be sped up by nearly an order of magnitude over the stochastic bank. Generic issues related to optimal implementation are discussed in detail. These improvements are expected to directly reduce the computational cost of gravitational wave searches.

  5. Volume reconstruction optimization for tomo-PIV algorithms applied to experimental data

    Science.gov (United States)

    Martins, Fabio J. W. A.; Foucaut, Jean-Marc; Thomas, Lionel; Azevedo, Luis F. A.; Stanislas, Michel

    2015-08-01

    Tomographic PIV is a three-component volumetric velocity measurement technique based on the tomographic reconstruction of a particle distribution imaged by multiple camera views. In essence, the performance and accuracy of this technique are highly dependent on the parametric adjustment and the reconstruction algorithm used. Although synthetic data have been widely employed to optimize experiments, the resulting reconstructed volumes might not have optimal quality. The purpose of the present study is to offer quality indicators that can be applied to data samples in order to improve the quality of velocity results obtained by the tomo-PIV technique. The proposed methodology can potentially lead to a significant reduction in the time required to optimize a tomo-PIV reconstruction, while also leading to better-quality velocity results. Tomo-PIV data provided by a six-camera turbulent boundary-layer experiment were used to optimize the reconstruction algorithms according to this methodology. Velocity statistics obtained by the optimized BIMART, SMART and MART algorithms were compared with hot-wire anemometer data, and velocity measurement uncertainties were computed. Results indicated that the BIMART and SMART algorithms produced reconstructed volumes of equivalent quality to the standard MART, with the benefit of reduced computational time.

  6. High resolution direction of arrival (DOA) estimation based on improved orthogonal matching pursuit (OMP) algorithm by iterative local searching.

    Science.gov (United States)

    Wang, Wenyi; Wu, Renbiao

    2013-08-22

    DOA (Direction of Arrival) estimation is a major problem in array signal processing applications. Recently, compressive sensing algorithms, including convex relaxation algorithms and greedy algorithms, have been recognized as a kind of novel DOA estimation algorithm. However, the success of these algorithms is limited by the RIP (Restricted Isometry Property) condition or the mutual coherence of measurement matrix. In the DOA estimation problem, the columns of measurement matrix are steering vectors corresponding to different DOAs. Thus, it violates the mutual coherence condition. The situation gets worse when there are two sources from two adjacent DOAs. In this paper, an algorithm based on OMP (Orthogonal Matching Pursuit), called ILS-OMP (Iterative Local Searching-Orthogonal Matching Pursuit), is proposed to improve DOA resolution by Iterative Local Searching. Firstly, the conventional OMP algorithm is used to obtain initial estimated DOAs. Then, in each iteration, a local searching process for every estimated DOA is utilized to find a new DOA in a given DOA set to further decrease the residual. Additionally, the estimated DOAs are updated by substituting the initial DOA with the new one. The simulation results demonstrate the advantages of the proposed algorithm.
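    A hedged numpy sketch of OMP followed by an iterative local-searching refinement of the kind described above; A is assumed to be the steering-vector dictionary over a DOA grid (columns indexed by candidate angles), and the neighborhood and sweep parameters are illustrative choices:

      import numpy as np

      def omp_ils(A, y, k, neighborhood=2, n_sweeps=5):
          n = A.shape[1]
          support = []
          residual = y.copy()
          for _ in range(k):                                   # standard OMP selection
              j = int(np.argmax(np.abs(A.conj().T @ residual)))
              support.append(j)
              x, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
              residual = y - A[:, support] @ x

          def res_norm(sup):
              x, *_ = np.linalg.lstsq(A[:, sup], y, rcond=None)
              return np.linalg.norm(y - A[:, sup] @ x)

          for _ in range(n_sweeps):                            # local search around each estimate
              improved = False
              for i in range(k):
                  center = support[i]
                  for cand in range(max(0, center - neighborhood), min(n, center + neighborhood + 1)):
                      trial = support[:i] + [cand] + support[i + 1:]
                      if len(set(trial)) == k and res_norm(trial) < res_norm(support):
                          support = trial
                          improved = True
              if not improved:
                  break
          return sorted(support)                               # indices of the refined DOA estimates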

  7. High Resolution Direction of Arrival (DOA) Estimation Based on Improved Orthogonal Matching Pursuit (OMP) Algorithm by Iterative Local Searching

    Directory of Open Access Journals (Sweden)

    Renbiao Wu

    2013-08-01

    Full Text Available DOA (Direction of Arrival) estimation is a major problem in array signal processing applications. Recently, compressive sensing algorithms, including convex relaxation algorithms and greedy algorithms, have been recognized as a kind of novel DOA estimation algorithm. However, the success of these algorithms is limited by the RIP (Restricted Isometry Property) condition or the mutual coherence of measurement matrix. In the DOA estimation problem, the columns of measurement matrix are steering vectors corresponding to different DOAs. Thus, it violates the mutual coherence condition. The situation gets worse when there are two sources from two adjacent DOAs. In this paper, an algorithm based on OMP (Orthogonal Matching Pursuit), called ILS-OMP (Iterative Local Searching-Orthogonal Matching Pursuit), is proposed to improve DOA resolution by Iterative Local Searching. Firstly, the conventional OMP algorithm is used to obtain initial estimated DOAs. Then, in each iteration, a local searching process for every estimated DOA is utilized to find a new DOA in a given DOA set to further decrease the residual. Additionally, the estimated DOAs are updated by substituting the initial DOA with the new one. The simulation results demonstrate the advantages of the proposed algorithm.

  8. Improving Limit Surface Search Algorithms in RAVEN Using Acceleration Schemes: Level II Milestone

    Energy Technology Data Exchange (ETDEWEB)

    Alfonsi, Andrea [Idaho National Laboratory (INL), Idaho Falls, ID (United States); Rabiti, Cristian [Idaho National Laboratory (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Laboratory (INL), Idaho Falls, ID (United States); Cogliati, Joshua Joseph [Idaho National Laboratory (INL), Idaho Falls, ID (United States); Sen, Ramazan Sonat [Idaho National Laboratory (INL), Idaho Falls, ID (United States); Smith, Curtis Lee [Idaho National Laboratory (INL), Idaho Falls, ID (United States)

    2015-07-01

    The RAVEN code is becoming a comprehensive tool to perform Probabilistic Risk Assessment (PRA); Uncertainty Quantification (UQ) and Propagation; and Verification and Validation (V&V). The RAVEN code is being developed to support the Risk-Informed Safety Margin Characterization (RISMC) pathway by developing an advanced set of methodologies and algorithms for use in advanced risk analysis. The RISMC approach uses system simulator codes coupled to stochastic analysis tools. The fundamental idea behind this coupling approach is to perturb (by employing sampling strategies) the timing and sequencing of events, the internal parameters of the system codes (i.e., uncertain parameters of the physics model) and the initial conditions, in order to estimate value ranges and associated probabilities of figures of merit of interest for engineering and safety (e.g. core damage probability, etc.). This approach, applied to complex systems such as nuclear power plants, requires performing a series of computationally expensive simulation runs. The large computational burden is caused by the large set of (uncertain) parameters characterizing those systems. Consequently, exploring the uncertain/parametric domain with a good level of confidence is generally not affordable, considering the limited computational resources that are currently available. In addition, the recent tendency to develop newer tools, characterized by higher accuracy and larger computational requirements (if compared with the presently used legacy codes, which were developed decades ago), has made this issue even more compelling. In order to overcome these limitations, the strategy for the exploration of the uncertain/parametric space needs to make the best use of the computational resources, focusing the computational effort on those regions of the uncertain/parametric space that are “interesting” (e.g., risk-significant regions of the input space) with respect to the targeted Figures Of Merit (FOM): for example, the failure of the system

  9. A fast method for searching for repeating earthquakes, applied to the northern San Francisco Bay area

    Science.gov (United States)

    Shakibay Senobari, N.; Funning, G.

    2016-12-01

    Repeating earthquakes (REs) are the regular or semi-regular failures of the same patch on a fault, producing near-identical waveforms at a given station. Sequences of REs are commonly interpreted as slip on small locked patches surrounded by large areas of fault that are creeping (Nadeau and McEvilly, 1999). Detecting them, therefore, places important constraints on the extent of fault creep at depth. In addition, the magnitude and recurrence interval of these RE sequences can be related to the creep rate and used as constraints on slip models. In this study we search for REs in northern California fault systems upon which creep is suspected, but not well constrained, including the Rodgers Creek, Maacama, Bartlett Springs, Concord-Green Valley, West Napa and Greenville faults, targeting events recorded at stations where the instrument was not changed for 10 years or more. A pair of events can be identified as REs based on a high cross-correlation coefficient (CCC) between their waveforms. Thus a fundamental step in RE searches is calculating the CCC for all event waveform pairs recorded at common stations. This becomes computationally expensive for large data sets. To expedite our search, we use a fast and accurate similarity search algorithm developed by the computer science community (Mueen et al., 2015; Zhu et al., 2016). Our initial tests on a data set including 1500 waveforms suggest it is around 40 times faster than the algorithm that we used previously (Shakibay Senobari and Funning, AGU Fall Meeting 2014). We search for event pairs with CCC>0.85 and cluster them based on their similarity. A second, location based filter, based on the differential S-P times for each event pair at 5 or more stations, is used as an independent check. We consider a cluster of events a RE sequence if the source location separation distance for each pair is less than the estimated circular size of the source (e.g. Chen et al., 2008); these are gathered into an RE catalogue. In
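    A hedged sketch of the basic repeating-earthquake test underlying this search: two event waveforms recorded at the same station are flagged as a candidate pair when their peak normalized cross-correlation coefficient exceeds 0.85 (the fast similarity-search machinery adopted in the study is not shown):

      import numpy as np

      def max_normalized_cc(w1, w2):
          w1 = (w1 - w1.mean()) / (w1.std() * len(w1))
          w2 = (w2 - w2.mean()) / w2.std()
          return np.max(np.correlate(w1, w2, mode="full"))   # peak CCC over all lags

      def is_repeating_pair(w1, w2, threshold=0.85):
          return max_normalized_cc(w1, w2) >= threshold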

  10. Heterogeneous Ensemble Combination Search Using Genetic Algorithm for Class Imbalanced Data Classification.

    Directory of Open Access Journals (Sweden)

    Mohammad Nazmul Haque

    Full Text Available Classification of datasets with imbalanced sample distributions has always been a challenge. In general, a popular approach for enhancing classification performance is the construction of an ensemble of classifiers. However, the performance of an ensemble is dependent on the choice of constituent base classifiers. Therefore, we propose a genetic algorithm-based search method for finding the optimum combination from a pool of base classifiers to form a heterogeneous ensemble. The algorithm, called GA-EoC, utilises 10-fold cross-validation on training data for evaluating the quality of each candidate ensemble. In order to combine the base classifiers' decisions into the ensemble's output, we used the simple and widely used majority voting approach. The proposed algorithm, along with a random sub-sampling approach to balance the class distribution, has been used for classifying class-imbalanced datasets. Additionally, if a feature set was not available, we used the (α, β)-k Feature Set method to select a better subset of features for classification. We have tested GA-EoC with three benchmarking datasets from the UCI Machine Learning repository, one Alzheimer's disease dataset and a subset of the PubFig database of Columbia University. In general, the performance of the proposed method on the chosen datasets is robust and better than that of the constituent base classifiers and many other well-known ensembles. Based on our empirical study we claim that a genetic algorithm is a superior and reliable approach to heterogeneous ensemble construction and we expect that the proposed GA-EoC would perform consistently in other cases.

  11. Search for gamma-ray emitting AGN among unidentified Fermi-LAT sources using machine learning algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Doert, Marlene [Technische Universitaet Dortmund (Germany); Ruhr-Universitaet Bochum (Germany); Einecke, Sabrina [Technische Universitaet Dortmund (Germany); Errando, Manel [Barnard College, Columbia University, New York City (United States)

    2015-07-01

    The second Fermi-LAT source catalog (2FGL) is the deepest all-sky survey of the gamma-ray sky currently available to the community. Out of the 1873 catalog sources, 576 remain unassociated. We present a search for active galactic nuclei (AGN) among these unassociated objects, which aims at a reduction of the number of unassociated gamma-ray sources and a more complete characterization of the population of gamma-ray emitting AGN. Our study uses two complementary machine learning algorithms which are individually trained on the gamma-ray properties of associated 2FGL sources and thereafter applied to the unassociated sample. The intersection of the two methods yields a high-confidence sample of 231 AGN candidate sources. We estimate the performance of the classification by taking inherent differences between the samples of associated and unassociated 2FGL sources into account. A search for infra-red counterparts and first results from follow-up studies in the X-ray band using Swift satellite data for a subset of our AGN candidates are also presented.

  12. GPU Based N-Gram String Matching Algorithm with Score Table Approach for String Searching in Many Documents

    Science.gov (United States)

    Srinivasa, K. G.; Shree Devi, B. N.

    2017-10-01

    String searching in documents has become a tedious task with the evolution of Big Data. The generation of large data sets demands a high-performance search algorithm in areas such as text mining, information retrieval and many others. The popularity of GPUs for general-purpose computing has been increasing for various applications. Therefore it is of great interest to exploit the thread parallelism of a GPU to provide a high-performance search algorithm. This paper proposes an optimized new approach to the N-gram model for string search in a number of lengthy documents and its GPU implementation. The algorithm exploits GPGPUs for searching strings in many documents, employing character-level N-gram matching with a parallel Score Table approach and search using the CUDA API. The new approach of the Score Table, used for frequency storage of N-grams in a document, makes the search independent of the document's length and allows faster access to the frequency values, thus decreasing the search complexity. The extensive thread parallelism of a GPU has been exploited to enable parallel pre-processing of trigrams in a document for Score Table creation and parallel search in a huge number of documents, thus speeding up the whole search process even for a large pattern size. Experiments were carried out for many documents of varied length and search strings from the standard Lorem Ipsum text on NVIDIA's GeForce GT 540M GPU with 96 cores. Results prove that the parallel approach for Score Table creation and searching gives a good speed-up compared with the same approach executed serially.
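    A hedged CPU-side illustration of the score-table idea (the paper runs these steps in parallel on a GPU via CUDA): each document is reduced once to a table of trigram frequencies, and a query is then scored against every table without rescanning the documents themselves.

      from collections import Counter

      def trigram_table(text):
          return Counter(text[i:i + 3] for i in range(len(text) - 2))

      def score(query, table):
          # number of query trigrams that the document's table can account for
          q = trigram_table(query)
          return sum(min(c, table[g]) for g, c in q.items())

      def search(query, doc_tables, top_k=5):
          ranked = sorted(doc_tables.items(), key=lambda kv: score(query, kv[1]), reverse=True)
          return ranked[:top_k]

      # usage: tables = {name: trigram_table(text) for name, text in documents.items()}
      #        best = search("lorem ipsum dolor", tables)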

  14. A New Missing Data Imputation Algorithm Applied to Electrical Data Loggers

    Directory of Open Access Journals (Sweden)

    Concepción Crespo Turrado

    2015-12-01

    Full Text Available Nowadays, data collection is a key process in the study of electrical power networks when searching for harmonics and a lack of balance among phases. In this context, the lack of data for any of the main electrical variables (phase-to-neutral voltage, phase-to-phase voltage, current in each phase, and power factor) adversely affects any time series study performed. When this occurs, a data imputation process must be carried out in order to substitute estimated values for the data that are missing. This paper presents a novel missing data imputation method based on multivariate adaptive regression splines (MARS) and compares it with the well-known technique called multivariate imputation by chained equations (MICE). The results obtained demonstrate how the proposed method outperforms the MICE algorithm.
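
    A minimal illustration of the chained-equations (MICE-style) baseline is sketched below using scikit-learn's IterativeImputer on synthetic logger-like columns; an imputer in the spirit of the paper would plug a MARS regressor into the same loop. All values and column choices are hypothetical.

      # Minimal MICE-style imputation on synthetic electrical-logger-like data using
      # scikit-learn's IterativeImputer as a stand-in for the chained-equations baseline.
      import numpy as np
      from sklearn.experimental import enable_iterative_imputer  # noqa: F401
      from sklearn.impute import IterativeImputer

      rng = np.random.default_rng(1)
      # Columns: phase-to-neutral voltage, phase current, power factor (synthetic).
      data = np.column_stack([rng.normal(230, 5, 200),
                              rng.normal(10, 2, 200),
                              rng.uniform(0.8, 1.0, 200)])
      mask = rng.random(data.shape) < 0.1            # knock out ~10% of the readings
      data_missing = np.where(mask, np.nan, data)

      imputer = IterativeImputer(max_iter=20, random_state=0)  # chained equations
      data_filled = imputer.fit_transform(data_missing)
      print("mean absolute error:", np.abs(data_filled - data)[mask].mean())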

  15. A search algorithm to meta-optimize the parameters for an extended Kalman filter to improve classification on hyper-temporal images

    CSIR Research Space (South Africa)

    Salmon, BP

    2012-07-01

    Full Text Available In this paper the Bias Variance Search Algorithm is proposed as an algorithm to optimize a candidate set of initial parameters for an Extended Kalman filter (EKF). The search algorithm operates on a Bias Variance Equilibrium Point criterion...

  16. Optimized Aircraft Electric Control System Based on Adaptive Tabu Search Algorithm and Fuzzy Logic Control

    Directory of Open Access Journals (Sweden)

    Saifullah Khalid

    2016-09-01

    Full Text Available Three conventional techniques for extracting reference currents for shunt active power filters (constant instantaneous power control, sinusoidal current control, and the synchronous reference frame) have been optimized using fuzzy logic control and the Adaptive Tabu Search algorithm, and their performances have been compared. A critical comparison of the compensation ability of the different control strategies, based on THD and speed, is carried out, and suggestions are given for selecting the technique to be used. Simulated results using a MATLAB model are presented, and they clearly demonstrate the value of the proposed control method for the aircraft shunt APF. The waveforms observed after application of the filter keep the harmonics within limits, and the power quality is improved.

  17. Intelligent energy allocation strategy for PHEV charging station using gravitational search algorithm

    Science.gov (United States)

    Rahman, Imran; Vasant, Pandian M.; Singh, Balbir Singh Mahinder; Abdullah-Al-Wadud, M.

    2014-10-01

    Recent research into the use of green technologies to reduce pollution and increase the penetration of renewable energy sources in the transportation sector is gaining popularity. The development of a smart grid environment focusing on PHEVs may also heal some of the prevailing grid problems by enabling the implementation of the Vehicle-to-Grid (V2G) concept. Intelligent energy management is an important issue which has already drawn much attention from researchers. Most of these works require the formulation of mathematical models which extensively use computational intelligence-based optimization techniques to solve many technical problems. Higher penetration of PHEVs requires adequate charging infrastructure as well as smart charging strategies. We used the Gravitational Search Algorithm (GSA) to intelligently allocate energy to the PHEVs, considering constraints such as energy price, remaining battery capacity, and remaining charging time.

  18. An Optimization Model and Modified Harmony Search Algorithm for Microgrid Planning with ESS

    Directory of Open Access Journals (Sweden)

    Yang Jiao

    2017-01-01

    Full Text Available To solve problems such as the high cost of microgrids (MGs), the balance between supply and demand, the stability of system operation, and the optimization of the MG planning model, an energy storage system (ESS) and the harmony search algorithm (HSA) are proposed. First, the conventional MG planning optimization model is constructed and the constraint conditions are defined: the supply and demand balance and reserve requirements. Second, an ESS is integrated into the optimal MG planning model. The model with an ESS can solve for and identify parameters such as the optimal power, optimal capacity, and optimal installation year. Third, the convergence speed and robustness of the HSA are optimized and improved. A case study comprising three different cases concludes the paper. The results show that the modified HSA (MHSA) can effectively improve the stability and economy of MG operation with an ESS.

  19. APPLYING ARTIFICIAL NEURAL NETWORK OPTIMIZED BY FIREWORKS ALGORITHM FOR STOCK PRICE ESTIMATION

    Directory of Open Access Journals (Sweden)

    Khuat Thanh Tung

    2016-04-01

    Full Text Available Stock prediction is the task of determining the future value of a company stock traded on an exchange. It plays a crucial role in raising the profit gained by firms and investors. Over the past few years, many methods have been developed, with much of the effort focused on machine learning frameworks that achieve promising results. In this paper, an approach based on an Artificial Neural Network (ANN) optimized by the Fireworks algorithm, with data preprocessing by the Haar wavelet, is applied to estimate stock prices. The system was trained and tested with real data of various companies collected from Yahoo Finance. The obtained results are encouraging.

  20. Bee-Inspired Algorithms Applied to Vehicle Routing Problems: A Survey and a Proposal

    Directory of Open Access Journals (Sweden)

    Thiago A. S. Masutti

    2017-01-01

    Full Text Available Vehicle routing problems constitute a class of combinatorial optimization tasks that search for optimal routes (e.g., minimal cost routes for one or more vehicles to attend a set of nodes (e.g., cities or customers. Finding the optimal solution to vehicle routing tasks is an NP-hard problem, meaning that the size of problems that can be solved by exhaustive search is limited. From a practical perspective, this class of problems has a wide and important set of applications, from the distribution of goods to the integrated chip design. Rooted on the use of collective intelligence, swarm-inspired algorithms, more specifically bee-inspired approaches, have been used with good performance to solve such problems. In this context, the present paper provides a broad review on the use of bee-inspired methods for solving vehicle routing problems, introduces a new approach to solve one of the main tasks in this area (the travelling salesman problem, and describes open problems in the field.

  1. Kombinasi Firefly Algorithm-Tabu Search untuk Penyelesaian Traveling Salesman Problem

    Directory of Open Access Journals (Sweden)

    Riyan Naufal Hay's

    2017-07-01

    Full Text Available The Traveling Salesman Problem (TSP) is a classical combinatorial optimization problem and plays a role in planning, scheduling, and search in engineering and science (Dong, 2012). TSP is also a good object for testing the performance of optimization methods; several methods such as the Cooperative Genetic Ant System (CGAS) (Dong, 2012), the Parallelized Genetic Ant Colony System (PGAS), combined Particle Swarm Optimization and Ant Colony Optimization Algorithms (PSO-ACO) (Elloumi, 2014), and Ant Colony Hyper-Heuristics (ACO HH) (Aziz, 2015) have been developed to solve TSP. Therefore, in this study a new combination of methods is implemented to improve the accuracy of TSP solutions. The Firefly Algorithm (FA) is one of the algorithms that can be used to solve combinatorial optimization problems (Layeb, 2014). FA has strong potential for solving optimization cases compared to existing algorithms, including Particle Swarm Optimization (Yang, 2010). However, FA has shortcomings in solving large-scale optimization problems (Baykasoğlu and Ozsoy, 2014). Tabu Search (TS) is an optimization method that has proven effective for solving large-scale optimization problems (Pedro, 2013). In this study, TS is applied to FA (FATS) to solve TSP instances. The FATS results are compared against the previous study, ACOHH. The comparison shows accuracy improvements of 0.89% on the Oliver30 dataset, 0.14% on Eil51, 3.81% on Eil76, and 1.27% on KroA100.
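
    To illustrate the tabu search component that FATS grafts onto the firefly algorithm, the sketch below runs a bare-bones tabu search over sampled 2-opt moves on a random TSP instance; the firefly part, the tabu list design and all parameters are simplified assumptions.

      # Bare-bones tabu search over 2-opt moves for a small random TSP instance;
      # this only illustrates the TS component, not the full FATS hybrid.
      import math, random

      def tour_length(tour, pts):
          return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
                     for i in range(len(tour)))

      def tabu_search_tsp(pts, iters=500, tenure=20):
          tour = list(range(len(pts)))
          random.shuffle(tour)
          best, best_len = tour[:], tour_length(tour, pts)
          tabu = {}                                  # move -> iteration until which it is tabu
          for it in range(iters):
              candidates = []
              for _ in range(50):                    # sample a neighbourhood of 2-opt moves
                  i, j = sorted(random.sample(range(len(pts)), 2))
                  cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
                  candidates.append(((i, j), cand, tour_length(cand, pts)))
              candidates.sort(key=lambda c: c[2])
              for move, cand, length in candidates:
                  if tabu.get(move, -1) < it or length < best_len:  # aspiration criterion
                      tour, tabu[move] = cand, it + tenure
                      if length < best_len:
                          best, best_len = cand[:], length
                      break
          return best, best_len

      points = [(random.random(), random.random()) for _ in range(30)]
      print("tour length:", round(tabu_search_tsp(points)[1], 3))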

  2. A Framing Link Based Tabu Search Algorithm for Large-Scale Multidepot Vehicle Routing Problems

    Directory of Open Access Journals (Sweden)

    Xuhao Zhang

    2014-01-01

    Full Text Available A framing link (FL based tabu search algorithm is proposed in this paper for a large-scale multidepot vehicle routing problem (LSMDVRP. Framing links are generated during continuous great optimization of current solutions and then taken as skeletons so as to improve optimal seeking ability, speed up the process of optimization, and obtain better results. Based on the comparison between pre- and postmutation routes in the current solution, different parts are extracted. In the current optimization period, links involved in the optimal solution are regarded as candidates to the FL base. Multiple optimization periods exist in the whole algorithm, and there are several potential FLs in each period. If the update condition is satisfied, the FL base is updated, new FLs are added into the current route, and the next period starts. Through adjusting the borderline of multidepot sharing area with dynamic parameters, the authors define candidate selection principles for three kinds of customer connections, respectively. Link split and the roulette approach are employed to choose FLs. 18 LSMDVRP instances in three groups are studied and new optimal solution values for nine of them are obtained, with higher computation speed and reliability.

  3. Optimal coordinated voltage control in active distribution networks using backtracking search algorithm.

    Science.gov (United States)

    Tengku Hashim, Tengku Juhana; Mohamed, Azah

    2017-01-01

    The growing interest in distributed generation (DG) in recent years has led to a number of generators being connected to distribution systems. The integration of DGs in a distribution system results in a network known as an active distribution network, due to the existence of bidirectional power flow in the system. The voltage rise issue is one of the most important technical issues to be addressed when DGs exist in an active distribution network. This paper presents the application of the backtracking search algorithm (BSA), a relatively new optimisation technique, to determine the optimal settings of coordinated voltage control in a distribution system. The coordinated voltage control considers power factor, on-load tap-changer and generation curtailment control to manage the voltage rise issue. A multi-objective function is formulated to minimise total losses and voltage deviation in a distribution system. The proposed BSA is compared with particle swarm optimisation (PSO) so as to evaluate its effectiveness in determining the optimal settings of power factor, tap-changer and the percentage of active power generation to be curtailed. The load flow algorithm from MATPOWER is integrated into the MATLAB environment to solve the multi-objective optimisation problem. Both the BSA and PSO optimisation techniques have been tested on a radial 13-bus distribution system, and the results show that the BSA performs better than PSO by providing a better fitness value and convergence rate.

  4. Optimization of Nano-Process Deposition Parameters Based on Gravitational Search Algorithm

    Directory of Open Access Journals (Sweden)

    Norlina Mohd Sabri

    2016-06-01

    Full Text Available This research focuses on the radio frequency (RF) magnetron sputtering process, a physical vapor deposition technique which is widely used in thin film production. This process requires an optimized combination of deposition parameters in order to obtain the desired thin film. The conventional method for optimizing the deposition parameters has been reported to be costly and time consuming due to its trial-and-error nature. Thus, the gravitational search algorithm (GSA) technique is proposed to solve this nano-process parameter optimization problem. In this research, the optimized parameter combination is expected to produce the desired electrical and optical properties of the thin film. The performance of GSA in this research is compared with that of Particle Swarm Optimization (PSO), Genetic Algorithm (GA), Artificial Immune System (AIS) and Ant Colony Optimization (ACO). Based on the overall results, the GSA-optimized parameter combination generated the best electrical and acceptable optical properties of the thin film compared to the others. This computational experiment is expected to overcome the problem of having to conduct repetitive laboratory experiments to obtain the most optimized parameter combination. Based on this initial experiment, the adaptation of GSA to this problem could offer a more efficient and productive way of depositing a quality thin film in the fabrication process.
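
    A minimal, generic GSA loop is sketched below on a stand-in objective; in the sputtering problem the objective would map a candidate parameter combination (power, pressure, temperature, and so on) to a measured or simulated thin-film quality score. The constants and update rules follow the common textbook form and are not the exact settings used in this work.

      # Minimal gravitational search algorithm (GSA) on a placeholder objective.
      import numpy as np

      def gsa(objective, bounds, agents=20, iters=100, g0=100.0, alpha=20.0):
          dim = len(bounds)
          lo, hi = np.array(bounds).T
          x = lo + np.random.rand(agents, dim) * (hi - lo)
          v = np.zeros_like(x)
          for t in range(iters):
              fit = np.apply_along_axis(objective, 1, x)
              best, worst = fit.min(), fit.max()
              m = (fit - worst) / (best - worst + 1e-12)       # masses from fitness
              m = m / m.sum()
              g = g0 * np.exp(-alpha * t / iters)              # decaying gravitational constant
              acc = np.zeros_like(x)
              for i in range(agents):
                  for j in range(agents):
                      if i != j:
                          diff = x[j] - x[i]
                          dist = np.linalg.norm(diff) + 1e-12
                          acc[i] += np.random.rand() * g * m[j] * diff / dist
              v = np.random.rand(agents, dim) * v + acc
              x = np.clip(x + v, lo, hi)
          fit = np.apply_along_axis(objective, 1, x)
          return x[fit.argmin()], fit.min()

      sphere = lambda p: float(np.sum(p ** 2))                 # stand-in objective
      print(gsa(sphere, bounds=[(-5, 5)] * 3))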

  5. Process planning optimization on turning machine tool using a hybrid genetic algorithm with local search approach

    Directory of Open Access Journals (Sweden)

    Yuliang Su

    2015-04-01

    Full Text Available A turning machine tool is a new type of machine tool equipped with more than one spindle and turret. The distinctive simultaneous and parallel processing abilities of a turning machine tool increase the complexity of process planning. The operations must not only be sequenced to satisfy precedence constraints, but also be scheduled with multiple objectives such as minimizing machining cost and maximizing utilization of the turning machine tool. To solve this problem, a hybrid genetic algorithm is proposed to generate optimal process plans based on a mixed 0-1 integer programming model. An operation precedence graph is used to represent precedence constraints and to help generate a feasible initial population for the hybrid genetic algorithm. An encoding strategy based on a data structure was developed to represent process plans digitally and form the solution space. In addition, a local search approach for optimizing the assignment of available turrets is added to incorporate scheduling into process planning. A real-world case is used to show that the proposed approach avoids infeasible solutions and effectively generates a globally optimal process plan.

  6. Efficiency Criteria as a Solution to the Uncertainty in the Choice of Population Size in Population-Based Algorithms Applied to Water Network Optimization

    Directory of Open Access Journals (Sweden)

    Daniel Mora-Melià

    2016-12-01

    Full Text Available Different Population-based Algorithms (PbAs have been used in recent years to solve all types of optimization problems related to water resource issues. However, the performances of these techniques depend heavily on correctly setting some specific parameters that guide the search for solutions. The initial random population size P is the only parameter common to all PbAs, but this parameter has received little attention from researchers. This paper explores P behaviour in a pipe-sizing problem considering both quality and speed criteria. To relate both concepts, this study applies a method based on an efficiency ratio E. First, specific parameters in each algorithm are calibrated with a fixed P. Second, specific parameters remain fixed, and the initial population size P is modified. After more than 600,000 simulations, the influence of P on obtaining successful solutions is statistically analysed. The proposed methodology is applied to four well-known benchmark networks and four different algorithms. The main conclusion of this study is that using a small population size is more efficient above a certain minimum size. Moreover, the results ensure optimal parameter calibration in each algorithm, and they can be used to select the most appropriate algorithm depending on the complexity of the problem and the goal of optimization.

  7. 1/f Noise in the Simple Genetic Algorithm Applied to a Traveling Salesman Problem

    Science.gov (United States)

    Yamada, Mitsuhiro

    Complex dynamical systems are observed in physics, biology, and even economics. Such systems in balance are considered to be in a critical state, and 1/f noise is considered to be a footprint. Complex dynamical systems have also been investigated in the field of evolutionary algorithms inspired by biological evolution. The genetic algorithm (GA) is a well-known evolutionary algorithm in which many individuals interact, and the simplest GA is referred to as the simple GA (SGA). However, the GA has not been examined from the viewpoint of the emergence of 1/f noise. In the present paper, the SGA is applied to a traveling salesman problem in order to investigate the SGA from such a viewpoint. The timecourses of the fitness of the candidate solution were examined. As a result, when the mutation and crossover probabilities were optimal, the system evolved toward a critical state in which the average maximum fitness over all trial runs was maximum. In this situation, the fluctuation of the fitness of the candidate solution resulted in the 1/f power spectrum, and the dynamics of the system had no intrinsic time or length scale.

  8. Quality assurance in health sciences literature searching: applying the ISO 9000 quality standard.

    Science.gov (United States)

    Cullen, R; Mason, D

    1995-09-01

    Medicine is a literature-based discipline. Ensuring that the literature review which precedes a significant piece of medical research has met predetermined standards is essential. A list of items reviewed carries no guarantees that all appropriate items have been included in the survey of the literature, or that appropriate sources have been efficiently searched. This would be a matter for concern in any discipline. In medicine it is a matter of life and death. Quality assurance procedures that offer guarantees of the standards built into the process, rather than quality control which measures only outputs, can provide the necessary reassurance. The ISO 9000 quality standard offers a much needed quality assurance process. A methodology for applying the ISO 9000 standard to the task of searching the medical literature is outlined in this paper. A new role for medical librarians in promoting a rigorous methodology in the literature review equal to that of the research it supports is defined.

  9. PSimScan: algorithm and utility for fast protein similarity search.

    Directory of Open Access Journals (Sweden)

    Anna Kaznadzey

    Full Text Available In the era of metagenomics and diagnostic sequencing, the importance of protein comparison methods of boosted performance cannot be overstated. Here we present PSimScan (Protein Similarity Scanner), a flexible open source protein similarity search tool which provides a significant gain in speed compared to BLASTP at the price of controlled sensitivity loss. The PSimScan algorithm introduces a number of novel performance optimization methods that can be further used by the community to improve the speed and lower the hardware requirements of bioinformatics software. The optimization starts at the lookup table construction, then the initial lookup table-based hits are passed through a pipeline of filtering and aggregation routines of increasing computational complexity. The first step in this pipeline is a novel algorithm that builds and selects 'similarity zones' aggregated from neighboring matches on small arrays of adjacent diagonals. PSimScan performs 5 to 100 times faster than the standard NCBI BLASTP, depending on the chosen parameters, and runs on commodity hardware. Its sensitivity and selectivity at the slowest settings are comparable to those of NCBI BLASTP and decrease with the increase of speed, yet stay at levels reasonable for many tasks. PSimScan is most advantageous when used on large collections of query sequences. Comparing the entire proteome of Streptococcus pneumoniae (2,042 proteins) to the NCBI non-redundant protein database of 16,971,855 records takes 6.5 hours on a moderately powerful PC, while the same task with NCBI BLASTP takes over 66 hours. We describe innovations in the PSimScan algorithm in considerable detail to encourage bioinformaticians to improve on the tool and to use the innovations in their own software development.
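
    The lookup-table seeding stage that such tools build on can be sketched as below: k-mers of the database are indexed once, and query k-mer hits falling on the same diagonal are counted as seeds. This is only the first stage in spirit; PSimScan's similarity-zone aggregation and scoring pipeline are not reproduced.

      # Toy lookup-table seeding for protein similarity search: index database k-mers
      # once, then collect diagonal-clustered seed hits for a query.
      from collections import defaultdict

      def build_lookup(db, k=3):
          table = defaultdict(list)                # k-mer -> [(seq_id, position), ...]
          for seq_id, seq in db.items():
              for pos in range(len(seq) - k + 1):
                  table[seq[pos:pos + k]].append((seq_id, pos))
          return table

      def seed_hits(query, table, k=3):
          hits = defaultdict(int)                  # (seq_id, diagonal) -> seed count
          for qpos in range(len(query) - k + 1):
              for seq_id, dpos in table.get(query[qpos:qpos + k], ()):
                  hits[(seq_id, dpos - qpos)] += 1  # same-diagonal hits cluster together
          return sorted(hits.items(), key=lambda kv: -kv[1])

      database = {"p1": "MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ", "p2": "MSDNGPQNQRNAPRITFGGP"}
      lookup = build_lookup(database)
      print(seed_hits("AYIAKQRQIS", lookup)[:3])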

  10. A Method for Estimating View Transformations from Image Correspondences Based on the Harmony Search Algorithm

    Directory of Open Access Journals (Sweden)

    Erik Cuevas

    2015-01-01

    Full Text Available In this paper, a new method for robustly estimating multiple view relations from point correspondences is presented. The approach combines the popular random sampling consensus (RANSAC algorithm and the evolutionary method harmony search (HS. With this combination, the proposed method adopts a different sampling strategy than RANSAC to generate putative solutions. Under the new mechanism, at each iteration, new candidate solutions are built taking into account the quality of the models generated by previous candidate solutions, rather than purely random as it is the case of RANSAC. The rules for the generation of candidate solutions (samples are motivated by the improvisation process that occurs when a musician searches for a better state of harmony. As a result, the proposed approach can substantially reduce the number of iterations still preserving the robust capabilities of RANSAC. The method is generic and its use is illustrated by the estimation of homographies, considering synthetic and real images. Additionally, in order to demonstrate the performance of the proposed approach within a real engineering application, it is employed to solve the problem of position estimation in a humanoid robot. Experimental results validate the efficiency of the proposed method in terms of accuracy, speed, and robustness.

  11. qPMS9: An Efficient Algorithm for Quorum Planted Motif Search

    Science.gov (United States)

    Nicolae, Marius; Rajasekaran, Sanguthevar

    2015-01-01

    Discovering patterns in biological sequences is a crucial problem. For example, the identification of patterns in DNA sequences has resulted in the determination of open reading frames, identification of gene promoter elements, intron/exon splicing sites, and SH RNAs, location of RNA degradation signals, identification of alternative splicing sites, etc. In protein sequences, patterns have led to domain identification, location of protease cleavage sites, identification of signal peptides, protein interactions, determination of protein degradation elements, identification of protein trafficking elements, discovery of short functional motifs, etc. In this paper we focus on the identification of an important class of patterns, namely, motifs. We study the (l, d) motif search problem or Planted Motif Search (PMS). PMS receives as input n strings and two integers l and d. It returns all sequences M of length l that occur in each input string, where each occurrence differs from M in at most d positions. Another formulation is quorum PMS (qPMS), where the motif appears in at least q% of the strings. We introduce qPMS9, a parallel exact qPMS algorithm that offers significant runtime improvements on DNA and protein datasets. qPMS9 solves the challenging DNA (l, d)-instances (28, 12) and (30, 13). The source code is available at https://code.google.com/p/qpms9/.
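
    For very small instances, the (l, d) problem can be solved by the brute-force reference below, which simply tests every possible l-mer; it is useful only as a correctness baseline, since qPMS9's contribution is precisely the pruning and parallelism that make realistic instances tractable.

      # Exhaustive reference solver for tiny (l, d) planted motif instances: emit every
      # l-mer M that occurs in each input string with at most d mismatches.
      from itertools import product

      def hamming(a, b):
          return sum(x != y for x, y in zip(a, b))

      def occurs(motif, s, d):
          l = len(motif)
          return any(hamming(motif, s[i:i + l]) <= d for i in range(len(s) - l + 1))

      def pms(strings, l, d, alphabet="ACGT"):
          return [''.join(m) for m in product(alphabet, repeat=l)
                  if all(occurs(''.join(m), s, d) for s in strings)]

      seqs = ["ACGTTGCA", "CCGTAGCA", "ACGAAGCT"]
      print(pms(seqs, l=4, d=1))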

  12. An Effective Framework For Economic Dispatch Using Modified Harmony Search Algorithm

    Directory of Open Access Journals (Sweden)

    Advik Kumar

    2017-09-01

    Full Text Available The ever-increasing share of wind power generation has led to high penetration of renewable energy sources in modern power systems and directly affects the economic dispatch (ED) problem. A continuing search for better ways of utilizing wind turbines together with thermal sources to find the optimal allocation of output power is necessary in order to provide more reliability and efficiency. The dynamic nature of wind energy imposes uncertainty on the power system, so an effective probabilistic method that accounts for this unpredictability allows a more realistic analysis. This paper presents a heuristic optimization method based on the harmony search (HS) algorithm to solve non-convex ED problems while the uncertainty introduced by wind turbines is considered. To keep the investigation practical, the proposed probabilistic ED (PED) approach accounts for prohibited operating zones (POZ), system spinning reserve, ramp rate limits, and a variety of fuels. The Point Estimate Method (PEM) is used within the PED model to represent the uncertainty of wind speed at the wind turbines and give a better realization of the problem. Optimal solutions are presented for various test systems, and these solutions demonstrate the cost benefits of the approach over existing ED techniques.

  13. EVALUATION OF WEB SEARCHING METHOD USING A NOVEL WPRR ALGORITHM FOR TWO DIFFERENT CASE STUDIES

    Directory of Open Access Journals (Sweden)

    V. Lakshmi Praba

    2012-04-01

    Full Text Available The World-Wide Web provides every internet citizen with access to an abundance of information, but it becomes increasingly difficult to identify the relevant pieces of information. Research in web mining tries to address this problem by applying techniques from data mining and machine learning to web data and documents. Web content mining and web structure mining have important roles in identifying the relevant web page. The relevancy of a web page denotes how well a retrieved web page or set of web pages meets the information need of the user. Page Rank, Weighted Page Rank and Hypertext Induced Topic Selection (HITS) are existing algorithms which consider only web structure mining. The Vector Space Model (VSM), Cover Density Ranking (CDR), Okapi similarity measurement (Okapi) and the Three-Level Scoring method (TLS) are some of the existing relevancy scoring methods which consider only web content mining. In this paper, we propose a new algorithm, Weighted Page with Relevant Rank (WPRR), which blends web content mining and web structure mining and demonstrates the relevancy of the page with respect to a given query for two different case scenarios. It is shown that WPRR's performance is better than that of the existing algorithms.

  14. Land use mapping from CBERS-2 images with open source tools by applying different classification algorithms

    Science.gov (United States)

    Sanhouse-García, Antonio J.; Rangel-Peraza, Jesús Gabriel; Bustos-Terrones, Yaneth; García-Ferrer, Alfonso; Mesas-Carrascosa, Francisco J.

    2016-02-01

    Land cover classification is often based on different characteristics between classes, but with great homogeneity within each one of them. This cover is obtained through field work or by means of processing satellite images. Field work involves high costs; therefore, digital image processing techniques have become an important alternative to perform this task. However, in some developing countries, and particularly in the Casacoima municipality in Venezuela, there is a lack of geographic information systems due to the lack of updated information and the high cost of software license acquisition. This research proposes a low-cost methodology to develop thematic mapping of local land use and types of coverage in areas with scarce resources. Thematic mapping was developed from CBERS-2 images and spatial information available on the network using open source tools. Supervised classification, both per pixel and per region, was applied using different classification algorithms and comparing them among themselves. The per-pixel classification was based on the Maxver algorithm (maximum likelihood) and Euclidean distance (minimum distance), while the per-region classification was based on the Bhattacharya algorithm. Satisfactory results were obtained from the per-region classification, where an overall reliability of 83.93% and a kappa index of 0.81 were observed. The Maxver algorithm showed a reliability value of 73.36% and a kappa index of 0.69, while Euclidean distance obtained values of 67.17% and 0.61 for reliability and kappa index, respectively. It was demonstrated that the proposed methodology is very useful in cartographic processing and updating, which in turn supports the development of management and land-use plans. Hence, open source tools showed themselves to be an economically viable alternative not only for forestry organizations, but for the general public, allowing them to develop projects in economically depressed and/or environmentally threatened areas.

  15. Derivation and validation of the automated search algorithms to identify cognitive impairment and dementia in electronic health records.

    Science.gov (United States)

    Amra, Sakusic; O'Horo, John C; Singh, Tarun D; Wilson, Gregory A; Kashyap, Rahul; Petersen, Ronald; Roberts, Rosebud O; Fryer, John D; Rabinstein, Alejandro A; Gajic, Ognjen

    2017-02-01

    Long-term cognitive impairment is a common and important problem in survivors of critical illness. We developed electronic search algorithms to identify cognitive impairment and dementia from the electronic medical records (EMRs) that provide opportunity for big data analysis. Eligible patients met 2 criteria. First, they had a formal cognitive evaluation by The Mayo Clinic Study of Aging. Second, they were hospitalized in intensive care unit at our institution between 2006 and 2014. The "criterion standard" for diagnosis was formal cognitive evaluation supplemented by input from an expert neurologist. Using all available EMR data, we developed and improved our algorithms in the derivation cohort and validated them in the independent validation cohort. Of 993 participants who underwent formal cognitive testing and were hospitalized in intensive care unit, we selected 151 participants at random to form the derivation and validation cohorts. The automated electronic search algorithm for cognitive impairment was 94.3% sensitive and 93.0% specific. The search algorithms for dementia achieved respective sensitivity and specificity of 97% and 99%. EMR search algorithms significantly outperformed International Classification of Diseases codes. Automated EMR data extractions for cognitive impairment and dementia are reliable and accurate and can serve as acceptable and efficient alternatives to time-consuming manual data review. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Security Analysis of Image Encryption Based on Gyrator Transform by Searching the Rotation Angle with Improved PSO Algorithm

    Directory of Open Access Journals (Sweden)

    Jun Sang

    2015-08-01

    Full Text Available The gyrator transform has been widely used for image encryption recently. For gyrator transform-based image encryption, the rotation angle used in the gyrator transform is one of the secret keys. In this paper, by analyzing the properties of the gyrator transform, an improved particle swarm optimization (PSO) algorithm was proposed to search for the rotation angle in a single gyrator transform. Since the gyrator transform is continuous, it is time-consuming to exhaustively search the rotation angle, even considering the data precision in a computer. Therefore, a computational intelligence-based search may be an alternative choice. Considering the properties of severe local convergence and obvious global fluctuations of the gyrator transform, an improved PSO algorithm was proposed to suit such situations. The experimental results demonstrated that the proposed improved PSO algorithm can significantly improve the efficiency of searching for the rotation angle in a single gyrator transform. Since the gyrator transform is the foundation of image encryption in gyrator transform domains, research on methods of searching for the rotation angle in a single gyrator transform is useful for further study of the security of such image encryption algorithms.

  17. Breadth-first search-based single-phase algorithms for bridge detection in wireless sensor networks.

    Science.gov (United States)

    Akram, Vahid Khalilpour; Dagdeviren, Orhan

    2013-07-10

    Wireless sensor networks (WSNs) are promising technologies for exploring harsh environments, such as oceans, wild forests, volcanic regions and outer space. Since sensor nodes may have limited transmission range, application packets may be transmitted by multi-hop communication. Thus, connectivity is a very important issue. A bridge is a critical edge whose removal breaks the connectivity of the network. Hence, it is crucial to detect bridges and take preventive measures. Since sensor nodes are battery-powered, services running on nodes should consume little energy. In this paper, we propose energy-efficient and distributed bridge detection algorithms for WSNs. Our algorithms run in a single phase and are integrated with the Breadth-First Search (BFS) algorithm, which is a popular routing algorithm. Our first algorithm is an extended version of Milic's algorithm, designed to reduce the message length. Our second algorithm is novel and uses ancestral knowledge to detect bridges. We explain the operation of the algorithms and analyze their correctness as well as their message, time, space and computational complexities. To evaluate practical importance, we provide testbed experiments and extensive simulations. We show that our proposed algorithms consume fewer resources, with energy savings of up to 5.5 times.
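
    For reference, the classical centralised way to find bridges is a depth-first search with lowpoint values (Tarjan's rule), sketched below; the paper's algorithms instead compute this distributedly in a single phase on top of BFS, which the sketch does not attempt to reproduce.

      # Centralised reference for bridge detection (DFS lowpoint rule). An edge (u, v)
      # is a bridge when no back edge from v's subtree reaches u or an ancestor of u.
      def find_bridges(adj):
          disc, low, bridges, time = {}, {}, [], [0]

          def dfs(u, parent):
              disc[u] = low[u] = time[0]; time[0] += 1
              for v in adj[u]:
                  if v not in disc:
                      dfs(v, u)
                      low[u] = min(low[u], low[v])
                      if low[v] > disc[u]:          # subtree of v cannot reach above u
                          bridges.append((u, v))
                  elif v != parent:
                      low[u] = min(low[u], disc[v])

          for node in adj:
              if node not in disc:
                  dfs(node, None)
          return bridges

      graph = {0: [1], 1: [0, 2, 3], 2: [1, 3], 3: [1, 2, 4], 4: [3]}
      print(find_bridges(graph))   # edges whose removal disconnects the network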

  18. Hybrid Nelder-Mead search based optimal Least Mean Square algorithms for heart and lung sound separation

    Directory of Open Access Journals (Sweden)

    Ruban Nersisson

    2017-06-01

    Full Text Available Algorithms for the separation of heart sounds from background lung sound noise are vital for accurate diagnosis of heart diseases. In this paper, an improved adaptive noise cancellation technique based on the Least Mean Square (LMS) algorithm is used to separate heart sounds from lung sounds. The step size parameter in the LMS algorithm is optimally chosen using a hybrid Nelder-Mead (H-NM) optimization algorithm. The NM algorithm is initialized with a good initial solution by using a computationally cheap random search to compute a rough estimate of the global minimum. Initializing the NM algorithm with a good initial guess avoids convergence to shallow local minima and improves the quality of the final solution. The effects of using two state-of-the-art biologically inspired heuristic optimization algorithms instead of the H-NM algorithm, and three variants of the standard LMS algorithm, are investigated. The correlation coefficient between the ideal and filtered heart sound signals and the running time complexity of the different algorithms are taken as the metrics for comparing the heart sound separation approaches. Simulation results indicate that the approach presented in this paper performs significantly better than a variety of alternative approaches on heart sound separation problems.
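
    The underlying adaptive noise cancellation scheme can be sketched as a plain LMS filter driven by a reference input correlated with the lung-sound interference; the hybrid Nelder-Mead search of the paper would tune the step size mu, which is simply fixed by hand in this toy example with synthetic signals.

      # Plain LMS adaptive noise canceller: the reference channel (correlated with the
      # lung-sound interference) is filtered to estimate and subtract the interference
      # from the primary (heart + lung) recording. Signals below are synthetic.
      import numpy as np

      def lms_cancel(primary, reference, mu=0.01, taps=16):
          w = np.zeros(taps)
          out = np.zeros_like(primary)
          for n in range(taps, len(primary)):
              x = reference[n - taps:n][::-1]     # most recent reference samples
              y = w @ x                           # interference estimate
              e = primary[n] - y                  # error = cleaned sample
              w += 2 * mu * e * x                 # LMS weight update (mu = step size)
              out[n] = e
          return out

      t = np.arange(0, 5, 1 / 2000)                                    # 5 s at 2 kHz
      heart = np.sin(2 * np.pi * 1.2 * t)                              # toy heart component
      lung = 0.8 * np.random.default_rng(0).standard_normal(t.size)    # toy lung noise
      primary = heart + np.convolve(lung, [0.6, 0.3, 0.1], mode="same")
      cleaned = lms_cancel(primary, lung)
      print("residual correlation with lung noise:",
            round(float(np.corrcoef(cleaned, lung)[0, 1]), 3))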

  19. Assessment of diverse algorithms applied on MODIS Aqua and Terra data over land surfaces in Europe

    Science.gov (United States)

    Glantz, P.; Tesche, M.

    2012-04-01

    MODIS c005 algorithm. The discrepancy between SAER and AERONET AOT is, however, substantially larger for the wavelength 488 nm, which means that several of the AOT values fall outside the expected MODIS uncertainty range. Both algorithms are unable to estimate the Ångström exponent accurately, although the MODIS c005 algorithm does a better job. Based on the inter-comparison of the SAER and MODIS c005 algorithms, it was found here that the former's estimation of AOT is, for values up to 1, on the whole within the expected uncertainties for one standard deviation of the MODIS retrievals, considering both Aqua and Terra and periods 1 and 3. The same holds for Aqua and period 2, although then only for AOT values lower than 0.5. The present algorithms were, besides aerosols emitted from clean sources and continental sources in Europe, also successfully applied to aerosol particles transported from agricultural fires in Russia and Ukraine. The latter events were associated with high aerosol loadings, although probably with similar single scattering albedo as the days classified as clean. We also present observations performed with spaceborne and ground-based lidars in the area investigated. From the latter platforms the vertical distribution of aerosol extinction in the atmosphere can be measured. This study suggests that the present satellite retrievals of AOT, particularly those obtained with the MODIS c005 algorithm, will, in combination with the lidar measurements, be very useful in the validation of regional and climate models over Europe.

  20. ∊-constraint heat transfer search (∊-HTS algorithm for solving multi-objective engineering design problems

    Directory of Open Access Journals (Sweden)

    Mohamed A. Tawhid

    2018-01-01

    Full Text Available In this paper, an effective ∊-constraint heat transfer search (∊-HTS algorithm for the multi-objective engineering design problems is presented. This algorithm is developed to solve multi-objective optimization problems by evaluating a set of single objective sub-problems. The effectiveness of the proposed algorithm is checked by implementing it on multi-objective benchmark problems that have various characteristics of Pareto front such as discrete, convex, and non-convex. This algorithm is also tested for several distinctive multi-objective engineering design problems, such as four bar truss problem, gear train problem, multi-plate disc brake design, speed reducer problem, welded beam design, and spring design problem. Moreover, the numerical experimentation shows that the proposed algorithm generates the solution to represent true Pareto front.
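
    The ∊-constraint scalarisation itself is easy to sketch: keep one objective, constrain the other to f2(x) ≤ ∊, and sweep ∊ to trace the Pareto front. In the snippet below a SciPy local minimizer stands in for the heat transfer search engine, and the two quadratic objectives are placeholders rather than the paper's engineering problems.

      # Epsilon-constraint scalarisation: minimise f1 subject to f2(x) <= eps and sweep eps.
      import numpy as np
      from scipy.optimize import minimize

      f1 = lambda x: x[0] ** 2 + x[1] ** 2              # toy objectives, not the paper's
      f2 = lambda x: (x[0] - 2) ** 2 + (x[1] - 2) ** 2  # engineering design problems

      front = []
      for eps in np.linspace(0.5, 7.5, 8):
          res = minimize(f1, x0=[1.0, 1.0],
                         constraints=[{"type": "ineq", "fun": lambda x, e=eps: e - f2(x)}])
          if res.success:
              front.append((round(f1(res.x), 3), round(f2(res.x), 3)))
      print(front)   # sampled points approximating the Pareto front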

  1. Algorithms

    Indian Academy of Sciences (India)

    positive numbers. The word 'algorithm' was most often associated with this algorithm till 1950. It may however be pointed out that several non-trivial algorithms such as synthetic (polynomial) division have been found in Vedic Mathematics which are dated much before Euclid's algorithm. A programming language is used.

  2. Modified harmony search

    Science.gov (United States)

    Mohamed, Najihah; Lutfi Amri Ramli, Ahmad; Majid, Ahmad Abd; Piah, Abd Rahni Mt

    2017-09-01

    A metaheuristic algorithm called Harmony Search (HS) is widely applied for optimizing parameters in many areas. HS is a derivative-free real-parameter optimization algorithm that draws its inspiration from the musical improvisation process of searching for a perfect state of harmony. We propose in this paper a Modified Harmony Search (MHS) for solving optimization problems, which employs concepts from the genetic algorithm and particle swarm optimization for generating new solution vectors, enhancing the performance of the HS algorithm. The performances of MHS and HS are investigated on ten benchmark optimization problems in order to make a comparison that reflects the efficiency of MHS in terms of final accuracy, convergence speed and robustness.
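
    The basic HS improvisation loop that MHS builds on is sketched below; the harmony memory size, HMCR, PAR and bandwidth values are generic defaults, and the GA- and PSO-inspired modifications of MHS are not included.

      # Bare harmony search: improvise new vectors from the harmony memory with
      # probabilities HMCR (memory consideration) and PAR (pitch adjustment).
      import random

      def harmony_search(objective, bounds, hms=20, hmcr=0.9, par=0.3, bw=0.05, iters=2000):
          memory = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
          scores = [objective(h) for h in memory]
          for _ in range(iters):
              new = []
              for d, (lo, hi) in enumerate(bounds):
                  if random.random() < hmcr:
                      value = random.choice(memory)[d]                   # take from memory
                      if random.random() < par:
                          value += random.uniform(-bw, bw) * (hi - lo)   # pitch adjustment
                  else:
                      value = random.uniform(lo, hi)                     # fresh random note
                  new.append(min(max(value, lo), hi))
              worst = max(range(hms), key=lambda i: scores[i])
              if objective(new) < scores[worst]:                         # replace worst harmony
                  memory[worst], scores[worst] = new, objective(new)
          best = min(range(hms), key=lambda i: scores[i])
          return memory[best], scores[best]

      rosenbrock = lambda v: (1 - v[0]) ** 2 + 100 * (v[1] - v[0] ** 2) ** 2
      print(harmony_search(rosenbrock, [(-2, 2), (-2, 2)]))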

  3. Use of an algorithm applied to urine drug screening to assess adherence to an oxycontin regimen.

    Science.gov (United States)

    Couto, Joseph E; Webster, Lynn; Romney, Martha C; Leider, Harry L; Linden, Ariel

    2009-01-01

    This study examined the ability of an algorithm applied to urine drug levels of oxycodone in healthy adult volunteers to differentiate among low, medium, and high doses of OxyContin. Thirty-six healthy volunteers were randomized to receive 80, 160, or 240 mg of daily OxyContin to steady state while under a naltrexone blockade. During days 3 and 4 of the study, urine samples of all participants were collected, and oxycodone levels detected in the urine were obtained using a liquid chromatography-mass spectrometry (LC-MS-MS) assay. The concordance was calculated for raw and adjusted LC-MS-MS urine oxycodone values within each study participant between their third and fourth day values. Also, an analysis of medians was calculated for each of the dosage groupings using Bonett-Price confidence intervals for both raw and adjusted LC-MS-MS values. The concordance correlation coefficient for the raw LC-MS-MS values between days 3 and 4 was 0.689 (95% confidence intervals = 0.515, 0.864), whereas the concordance correlation coefficient for the LC-MS-MS values using the algorithm (ie, normalized values) was 0.882 (95% confidence intervals = 0.808, 0.956). Because of greater variability in the raw values, some overlap was observed in the confidence intervals of the various OxyContin doses, whereas no overlap was observed in the normalized confidence intervals regardless of the application of a Bonferroni adjustment. In contrast to raw LC-MS-MS values, an algorithm that normalizes oxycodone urine drug levels for pH, specific gravity, and lean body mass discriminates well among all three of the daily doses of OxyContin tested (80, 160, and 240 mg), even with correcting for multiple analyses.

  4. Hybrid Symbiotic Organisms Search Optimization Algorithm for Scheduling of Tasks on Cloud Computing Environment.

    Science.gov (United States)

    Abdullahi, Mohammed; Ngadi, Md Asri

    2016-01-01

    Cloud computing has attracted significant attention from the research community because of the rapid migration rate of Information Technology services to its domain. Advances in virtualization technology have made cloud computing very popular as a result of the easier deployment of application services. Tasks are submitted to cloud datacenters to be processed in a pay-as-you-go fashion. Task scheduling is one of the significant research challenges in the cloud computing environment. The current formulation of task scheduling problems has been shown to be NP-complete, hence finding the exact solution, especially for large problem sizes, is intractable. The heterogeneous and dynamic nature of cloud resources makes optimum task scheduling non-trivial. Therefore, efficient task scheduling algorithms are required for optimum resource utilization. Symbiotic Organisms Search (SOS) has been shown to perform competitively with Particle Swarm Optimization (PSO). The aim of this study is to optimize task scheduling in the cloud computing environment based on a proposed Simulated Annealing (SA) based SOS (SASOS) in order to improve the convergence rate and solution quality of SOS. The SOS algorithm has a strong global exploration capability and uses fewer parameters. The systematic reasoning ability of SA is employed to find better solutions in local solution regions, hence adding exploitation ability to SOS. Also, a fitness function is proposed which takes into account the utilization level of virtual machines (VMs), which reduces the makespan and the degree of imbalance among VMs. The CloudSim toolkit was used to evaluate the efficiency of the proposed method using both synthetic and standard workloads. Simulation results showed that the hybrid SOS performs better than SOS in terms of convergence speed, response time, degree of imbalance, and makespan.
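
    The simulated annealing refinement that SASOS adds can be sketched on a toy task-to-VM assignment as below; the fitness combining makespan and degree of imbalance, the task lengths and VM speeds are all invented for illustration, and the SOS organisms and CloudSim model are not shown.

      # Sketch of an SA refinement step over a task-to-VM assignment with a fitness
      # that combines makespan and the degree of imbalance across VMs.
      import math, random

      task_len = [random.randint(100, 1000) for _ in range(50)]   # task lengths (MI)
      vm_mips = [500, 1000, 1500]                                  # VM speeds

      def fitness(assign):
          loads = [0.0] * len(vm_mips)
          for t, vm in enumerate(assign):
              loads[vm] += task_len[t] / vm_mips[vm]
          makespan = max(loads)
          imbalance = (max(loads) - min(loads)) / (sum(loads) / len(loads))
          return makespan + imbalance

      def sa_refine(assign, temp=10.0, cooling=0.95, steps=2000):
          best, best_f = assign[:], fitness(assign)
          cur, cur_f = assign[:], best_f
          for _ in range(steps):
              cand = cur[:]
              cand[random.randrange(len(cand))] = random.randrange(len(vm_mips))  # move one task
              cand_f = fitness(cand)
              if cand_f < cur_f or random.random() < math.exp((cur_f - cand_f) / temp):
                  cur, cur_f = cand, cand_f
                  if cur_f < best_f:
                      best, best_f = cur[:], cur_f
              temp *= cooling
          return best, best_f

      start = [random.randrange(len(vm_mips)) for _ in task_len]
      print("makespan + imbalance:", round(sa_refine(start)[1], 2))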

  6. Applying a Locally Linear Embedding Algorithm for Feature Extraction and Visualization of MI-EEG

    Directory of Open Access Journals (Sweden)

    Mingai Li

    2016-01-01

    Full Text Available Robotic-assisted rehabilitation system based on Brain-Computer Interface (BCI is an applicable solution for stroke survivors with a poorly functioning hemiparetic arm. The key technique for rehabilitation system is the feature extraction of Motor Imagery Electroencephalography (MI-EEG, which is a nonlinear time-varying and nonstationary signal with remarkable time-frequency characteristic. Though a few people have made efforts to explore the nonlinear nature from the perspective of manifold learning, they hardly take into full account both time-frequency feature and nonlinear nature. In this paper, a novel feature extraction method is proposed based on the Locally Linear Embedding (LLE algorithm and DWT. The multiscale multiresolution analysis is implemented for MI-EEG by DWT. LLE is applied to the approximation components to extract the nonlinear features, and the statistics of the detail components are calculated to obtain the time-frequency features. Then, the two features are combined serially. A backpropagation neural network is optimized by genetic algorithm and employed as a classifier to evaluate the effectiveness of the proposed method. The experiment results of 10-fold cross validation on a public BCI Competition dataset show that the nonlinear features visually display obvious clustering distribution and the fused features improve the classification accuracy and stability. This paper successfully achieves application of manifold learning in BCI.

  7. Exact colouring algorithm for weighted graphs applied to timetabling problems with lectures of different lengths

    NARCIS (Netherlands)

    Cangalovic, Mirjana; Schreuder, J.A.M.

    1991-01-01

    An exact algorithm is presented for determining the interval chromatic number of a weighted graph. The algorithm is based on enumeration and the Branch-and-Bound principle. Computational experiments with the application of the algorithm to random weighted graphs are given. The algorithm and its

  8. Applied Swarm-based medicine: collecting decision trees for patterns of algorithms analysis.

    Science.gov (United States)

    Panje, Cédric M; Glatzer, Markus; von Rappard, Joscha; Rothermundt, Christian; Hundsberger, Thomas; Zumstein, Valentin; Plasswilm, Ludwig; Putora, Paul Martin

    2017-08-16

    The objective consensus methodology has recently been applied in consensus finding in several studies on medical decision-making among clinical experts or guidelines. The main advantages of this method are an automated analysis and comparison of treatment algorithms of the participating centers which can be performed anonymously. Based on the experience from completed consensus analyses, the main steps for the successful implementation of the objective consensus methodology were identified and discussed among the main investigators. The following steps for the successful collection and conversion of decision trees were identified and defined in detail: problem definition, population selection, draft input collection, tree conversion, criteria adaptation, problem re-evaluation, results distribution and refinement, tree finalisation, and analysis. This manuscript provides information on the main steps for successful collection of decision trees and summarizes important aspects at each point of the analysis.

  9. Aida-CMK multi-algorithm optimization kernel applied to analog IC sizing

    CERN Document Server

    Lourenço, Ricardo; Horta, Nuno

    2015-01-01

    This work addresses the research and development of an innovative optimization kernel applied to analog integrated circuit (IC) design. In particular, this work describes the modifications inside the AIDA Framework, an electronic design automation framework fully developed at the Integrated Circuits Group-LX of the Instituto de Telecomunicações, Lisbon. It focuses on AIDA-CMK, enhancing AIDA-C, which is the circuit optimizer component of AIDA, with a new multi-objective multi-constraint optimization module that provides a base for multiple algorithm implementations. The proposed solution implements three approaches to multi-objective multi-constraint optimization, namely an evolutionary approach with NSGAII, a swarm intelligence approach with MOPSO, and a stochastic hill climbing approach with MOSA. Moreover, the implemented structure allows easy hybridization between kernels, transforming the previous simple NSGAII optimization module into a more evolved and versatile module supporting multiple s...

  10. An Improved Global Harmony Search Algorithm for the Identification of Nonlinear Discrete-Time Systems Based on Volterra Filter Modeling

    Directory of Open Access Journals (Sweden)

    Zongyan Li

    2016-01-01

    Full Text Available This paper describes an improved global harmony search (IGHS algorithm for identifying the nonlinear discrete-time systems based on second-order Volterra model. The IGHS is an improved version of the novel global harmony search (NGHS algorithm, and it makes two significant improvements on the NGHS. First, the genetic mutation operation is modified by combining normal distribution and Cauchy distribution, which enables the IGHS to fully explore and exploit the solution space. Second, an opposition-based learning (OBL is introduced and modified to improve the quality of harmony vectors. The IGHS algorithm is implemented on two numerical examples, and they are nonlinear discrete-time rational system and the real heat exchanger, respectively. The results of the IGHS are compared with those of the other three methods, and it has been verified to be more effective than the other three methods on solving the above two problems with different input signals and system memory sizes.

  11. Software Trigger Algorithms to Search for Magnetic Monopoles with the NOvA Far Detector

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Z. [Virginia U.; Dukes, E. [Virginia U.; Ehrlich, R. [Virginia U.; Frank, M. [Virginia U.; Group, C. [Fermilab; Norman, A. [Fermilab

    2014-01-01

    The NOvA far detector, due to its surface proximity, large size, good timing resolution, large energy dynamic range, and continuous readout, is sensitive to the detection of magnetic monopoles over a large range of velocities and masses. In order to record candidate magnetic monopole events with high efficiency we have designed a software-based trigger to make decisions based on the data recorded by the detector. The decisions must be fast, have high efficiency, and a large rejection factor for the over 100,000 cosmic rays that course through the detector every second. In this paper we briefly describe the simulation of magnetic monopoles, including the detector response, and then discuss the algorithms applied to identify magnetic monopole candidates. We also present the results of trigger efficiency and purity tests using simulated samples of magnetic monopoles with overlaid cosmic backgrounds and electronic noise.

  12. The Scatter Search Based Algorithm to Revenue Management Problem in Broadcasting Companies

    Science.gov (United States)

    Pishdad, Arezoo; Sharifyazdi, Mehdi; Karimpour, Reza

    2009-09-01

    The problem addressed in this paper, which is faced by broadcasting companies, is how to benefit from a limited advertising space. The problem arises from the stochastic behavior of customers (advertisers) in different fare classes. To address this issue we propose a constrained nonlinear multi-period mathematical model which incorporates cancellation and overbooking. The objective function is to maximize the total expected revenue, and our numerical method does so by determining the sales limits for each class of customer, which constitute the revenue management control policy. Scheduling the advertising spots in breaks is another area of concern, and we consider it as a constraint in our model. In this paper an algorithm based on scatter search is developed to obtain a good feasible solution. The method uses simulation of customer arrivals over a continuous finite time horizon [0, T]. Several sensitivity analyses are conducted in the computational results to show the effectiveness of the proposed method. They also provide insight into the better results obtained by considering revenue management (the control policy) compared to a "no sales limit" policy in which earlier demand is served first.

  13. Detection of Spam Email by Combining Harmony Search Algorithm and Decision Tree

    Directory of Open Access Journals (Sweden)

    M. Z. Gashti

    2017-06-01

    Full Text Available Spam email is probably the main problem faced by most e-mail users. There are many features available for spam email detection, and some of these features have little effect on detection and can skew the detection and classification of spam email. Thus, Feature Selection (FS) is one of the key topics in spam email detection systems. By choosing the important and effective features for classification, performance can be optimized. The feature selector has the task of finding a subset of features that improves the accuracy of the predictions. In this paper, a hybrid of the Harmony Search Algorithm (HSA) and a decision tree is used for selecting the best features and for classification. The results obtained on the Spambase dataset show that the recognition accuracy of the proposed model is 95.25%, which is high in comparison with models such as SVM, NB, J48 and MLP. Also, the accuracy of the proposed model on the Ling-Spam and PU1 datasets is high in comparison with models such as NB, SVM and LR.

  14. Improving Accuracy of River Flow Forecasting Using LSSVR with Gravitational Search Algorithm

    Directory of Open Access Journals (Sweden)

    Rana Muhammad Adnan

    2017-01-01

    Full Text Available River flow prediction is essential in many applications of water resources planning and management. In this paper, the accuracy of multivariate adaptive regression splines (MARS), the M5 regression tree (M5RT), and conventional multiple linear regression (CMLR) is compared with a hybrid least-squares support vector regression-gravitational search algorithm (HLGSA) in predicting monthly river flows. In the first part of the study, all regression methods were compared with each other in predicting the river flows of each basin, and the HLGSA method performed better than MARS, M5RT, and CMLR. The effect of log transformation on the prediction accuracy of the regression methods was examined in the second part of the study. Log transformation of the river flow data significantly increased the prediction accuracy of all regression methods, and log-transformed HLGSA (LHLGSA) performed better than the other regression methods. In the third part of the study, the accuracy of the LHLGSA and HLGSA methods was examined in river flow estimation using nearby river flow data. On the basis of all applications, it was found that LHLGSA and HLGSA can be successfully used in the prediction and estimation of river flow.

  15. Gravitational search algorithm based tuning of a PI speed controller for an induction motor drive

    Science.gov (United States)

    Abd Ali, Jamal; Hannan, M. A.; Mohamed, Azah

    2016-03-01

    Proportional-integral (PI) controllers are very useful for controlling speed and mechanical load variables in three-phase induction motor (TIM) operation. However, the conventional PI controller requires an exhaustive trial-and-error procedure to obtain its parameters. In this paper, the design of the PI speed controller is improved to suit the TIM by utilizing a gravitational search algorithm (GSA) optimization technique. The mean absolute error (MAE) of the speed response is used as the objective function. The GSA-based PI speed controller (GSA-PI) is tuned to minimize the MAE and thereby improve the performance of the TIM under changes in speed and mechanical load. The experiment uses the space vector pulse width modulation (SVPWM) technique to generate the pulse width modulation for the switching devices of the three-phase bridge inverter. Results obtained with the GSA-PI speed controller are compared with those obtained through particle swarm optimization (PSO) to validate the developed controller. The robustness of the GSA-PI speed controller proved far better than that of the PSO-tuned controller in all tested cases in terms of damping capability and transient response under different mechanical loads and speeds.

  16. Tmax Determined Using a Bayesian Estimation Deconvolution Algorithm Applied to Bolus Tracking Perfusion Imaging: A Digital Phantom Validation Study.

    Science.gov (United States)

    Uwano, Ikuko; Sasaki, Makoto; Kudo, Kohsuke; Boutelier, Timothé; Kameda, Hiroyuki; Mori, Futoshi; Yamashita, Fumio

    2017-01-10

    The Bayesian estimation algorithm improves the precision of bolus tracking perfusion imaging. However, this algorithm cannot directly calculate Tmax, the time scale widely used to identify ischemic penumbra, because Tmax is a non-physiological, artificial index that reflects the tracer arrival delay (TD) and other parameters. We calculated Tmax from the TD and mean transit time (MTT) obtained by the Bayesian algorithm and determined its accuracy in comparison with Tmax obtained by singular value decomposition (SVD) algorithms. The TD and MTT maps were generated by the Bayesian algorithm applied to digital phantoms with time-concentration curves that reflected a range of values for various perfusion metrics using a global arterial input function. Tmax was calculated from the TD and MTT using constants obtained by a linear least-squares fit to Tmax obtained from the two SVD algorithms that showed the best benchmarks in a previous study. Correlations between the Tmax values obtained by the Bayesian and SVD methods were examined. The Bayesian algorithm yielded accurate TD and MTT values relative to the true values of the digital phantom. Tmax calculated from the TD and MTT values with the least-squares fit constants showed excellent correlation (Pearson's correlation coefficient = 0.99) and agreement (intraclass correlation coefficient = 0.99) with Tmax obtained from SVD algorithms. Quantitative analyses of Tmax values calculated from Bayesian-estimation algorithm-derived TD and MTT from a digital phantom correlated and agreed well with Tmax values determined using SVD algorithms.
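
    The fitting step described above can be illustrated with a short least-squares sketch: Tmax is modelled as a linear combination of the Bayesian-estimated tracer delay (TD) and mean transit time (MTT), with coefficients fitted against SVD-derived Tmax values. The arrays and resulting coefficients below are placeholder data, not values from the study.

    # Hypothetical illustration: fit Tmax ~ a*TD + b*MTT + c by least squares.
    import numpy as np

    td = np.array([0.5, 1.0, 1.5, 2.0, 3.0])          # Bayesian-estimated delay (s)
    mtt = np.array([4.0, 5.0, 6.0, 7.0, 8.0])         # Bayesian-estimated MTT (s)
    tmax_svd = np.array([2.4, 3.4, 4.3, 5.3, 6.9])    # reference Tmax from SVD (s)

    A = np.column_stack([td, mtt, np.ones_like(td)])  # design matrix [TD, MTT, 1]
    (a, b, c), *_ = np.linalg.lstsq(A, tmax_svd, rcond=None)

    tmax_bayes = a * td + b * mtt + c
    corr = np.corrcoef(tmax_bayes, tmax_svd)[0, 1]    # Pearson correlation check
    print(f"Tmax ~ {a:.2f}*TD + {b:.2f}*MTT + {c:.2f}, r = {corr:.3f}")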

  17. MT's algorithm: A new algorithm to search for the optimum set of modulation indices for simultaneous range, command, and telemetry

    Science.gov (United States)

    Nguyen, Tien Manh

    1989-01-01

    MT's algorithm was developed as an aid in the design of space telecommunications systems that utilize simultaneous range/command/telemetry operations. This algorithm provides selection of modulation indices for: (1) suppression of undesired signals to achieve desired link performance margins and/or to allow for a specified performance degradation in the data channel (command/telemetry) due to the presence of undesired signals (interferers); and (2) optimum power division between the carrier, the range, and the data channel. A software program using this algorithm was developed for use with MathCAD software. This software program, called the MT program, provides the computation of optimum modulation indices for all possible cases recommended by the Consultative Committee for Space Data Systems (CCSDS), with emphasis on the squarewave NASA/JPL ranging system.

  18. A UWB/Improved PDR Integration Algorithm Applied to Dynamic Indoor Positioning for Pedestrians.

    Science.gov (United States)

    Chen, Pengzhan; Kuang, Ye; Chen, Xiaoyue

    2017-09-08

    Inertial sensors are widely used in various applications, such as human motion monitoring and pedestrian positioning. However, inertial sensors cannot accurately characterize the process of human movement, a limitation that causes data drift during human body positioning and thus seriously affects positioning accuracy and stability. The traditional pedestrian dead-reckoning algorithm, which is based on a single inertial measurement unit, can suppress the data drift, but fails to accurately calculate the number of walking steps and the heading value, so it cannot meet the application requirements. This study proposes an indoor dynamic positioning method with an error self-correcting function, based on the symmetrical characteristics of human motion, to quickly characterize the human motion process and solve the above problems. On the basis of this method, an ultra-wide band (UWB) method is introduced: an unscented Kalman filter is applied to fuse the inertial sensor and UWB data, inertial positioning compensates for the susceptibility of UWB to signal obstruction, and UWB positioning overcomes the error accumulation of inertial positioning. This approach improves both the positioning accuracy and the responsiveness of the positioning results. Finally, this study designs an indoor positioning test system to test the static and dynamic performance of the proposed indoor positioning method. Results show that the positioning system has both high accuracy and good real-time performance.

  19. Algorithms

    Indian Academy of Sciences (India)

    In the description of algorithms and programming languages, what is the role of control abstraction? • What are the inherent limitations of the algorithmic processes? In future articles in this series, we will show that these constructs are powerful and can be used to encode any algorithm. In the next article, we will discuss ...

  20. The Theory of Planned Behaviour Applied to Search Engines as a Learning Tool

    Science.gov (United States)

    Liaw, Shu-Sheng

    2004-01-01

    Search engines have been developed for helping learners to seek online information. Based on theory of planned behaviour approach, this research intends to investigate the behaviour of using search engines as a learning tool. After factor analysis, the results suggest that perceived satisfaction of search engine, search engines as an information…

  1. FHSA-SED: Two-Locus Model Detection for Genome-Wide Association Study with Harmony Search Algorithm.

    Directory of Open Access Journals (Sweden)

    Shouheng Tuo

    Full Text Available Two-locus model is a typical significant disease model to be identified in genome-wide association study (GWAS). Due to intensive computational burden and diversity of disease models, existing methods have drawbacks on low detection power, high computation cost, and preference for some types of disease models. In this study, two scoring functions (Bayesian network based K2-score and Gini-score) are used for characterizing two SNP locus as a candidate model, the two criteria are adopted simultaneously for improving identification power and tackling the preference problem to disease models. Harmony search algorithm (HSA) is improved for quickly finding the most likely candidate models among all two-locus models, in which a local search algorithm with two-dimensional tabu table is presented to avoid repeatedly evaluating some disease models that have strong marginal effect. Finally G-test statistic is used to further test the candidate models. We investigate our method named FHSA-SED on 82 simulated datasets and a real AMD dataset, and compare it with two typical methods (MACOED and CSE) which have been developed recently based on swarm intelligent search algorithm. The results of simulation experiments indicate that our method outperforms the two compared algorithms in terms of detection power, computation time, evaluation times, sensitivity (TPR), specificity (SPC), positive predictive value (PPV) and accuracy (ACC). Our method has identified two SNPs (rs3775652 and rs10511467) that may be also associated with disease in the AMD dataset.

  2. FHSA-SED: Two-Locus Model Detection for Genome-Wide Association Study with Harmony Search Algorithm.

    Science.gov (United States)

    Tuo, Shouheng; Zhang, Junying; Yuan, Xiguo; Zhang, Yuanyuan; Liu, Zhaowen

    2016-01-01

    Two-locus model is a typical significant disease model to be identified in genome-wide association study (GWAS). Due to intensive computational burden and diversity of disease models, existing methods have drawbacks on low detection power, high computation cost, and preference for some types of disease models. In this study, two scoring functions (Bayesian network based K2-score and Gini-score) are used for characterizing two SNP locus as a candidate model, the two criteria are adopted simultaneously for improving identification power and tackling the preference problem to disease models. Harmony search algorithm (HSA) is improved for quickly finding the most likely candidate models among all two-locus models, in which a local search algorithm with two-dimensional tabu table is presented to avoid repeatedly evaluating some disease models that have strong marginal effect. Finally G-test statistic is used to further test the candidate models. We investigate our method named FHSA-SED on 82 simulated datasets and a real AMD dataset, and compare it with two typical methods (MACOED and CSE) which have been developed recently based on swarm intelligent search algorithm. The results of simulation experiments indicate that our method outperforms the two compared algorithms in terms of detection power, computation time, evaluation times, sensitivity (TPR), specificity (SPC), positive predictive value (PPV) and accuracy (ACC). Our method has identified two SNPs (rs3775652 and rs10511467) that may be also associated with disease in AMD dataset.

  3. COMPARING SEARCHING AND SORTING ALGORITHMS EFFICIENCY IN IMPLEMENTING COMPUTATIONAL EXPERIMENT IN PROGRAMMING ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    R. Sagan

    2011-11-01

    Full Text Available This article considers different aspects that determine the correct choice of a sorting algorithm. Several searching and sorting algorithms needed for computational experiments on a certain class of programs are also compared.

  4. EEG/ERP adaptive noise canceller design with controlled search space (CSS) approach in cuckoo and other optimization algorithms.

    Science.gov (United States)

    Ahirwal, M K; Kumar, Anil; Singh, G K

    2013-01-01

    This paper explores the application of adaptive filtering with swarm intelligence/evolutionary techniques in the field of electroencephalogram/event-related potential (EEG/ERP) noise cancellation and extraction. A new approach is proposed in the form of a controlled search space to stabilize the randomness of swarm intelligence techniques, especially for the EEG signal. Swarm-based algorithms such as Particle Swarm Optimization, Artificial Bee Colony, and the Cuckoo Optimization Algorithm, with their variants, are implemented to design an optimized adaptive noise canceller. The proposed controlled search space technique is tested on each of the swarm intelligence techniques and is found to be more accurate and powerful. Adaptive noise cancellers based on traditional algorithms such as the least-mean-square, normalized least-mean-square, and recursive least-squares algorithms are also implemented for comparison. ERP signals such as the simulated visual evoked potential, real visual evoked potential, and real sensorimotor evoked potential are used because of their physiological importance in various EEG studies. The average computational time and shape measure of the evolutionary techniques are observed to be 8.21E-01 s and 1.73E-01, respectively. Although the traditional algorithms require negligible computation time, they are unable to preserve the ERP shape as well, with an average computational time and shape measure of 1.41E-02 s and 2.60E+00, respectively.
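
    For reference, the baseline least-mean-square noise canceller that the evolutionary variants are compared against can be sketched as follows. The filter length, step size and the synthetic signals are assumptions made for illustration only, not the settings used in the paper.

    # Sketch of a classical LMS adaptive noise canceller (baseline method).
    import numpy as np

    def lms_anc(primary, reference, taps=8, mu=0.01):
        """Subtract the filtered reference (noise estimate) from the primary input."""
        w = np.zeros(taps)
        out = np.zeros_like(primary)
        for n in range(taps, len(primary)):
            x = reference[n - taps:n][::-1]       # most recent reference samples
            noise_est = w @ x
            e = primary[n] - noise_est            # error = cleaned signal sample
            w += 2 * mu * e * x                   # LMS weight update
            out[n] = e
        return out

    rng = np.random.default_rng(0)
    t = np.arange(2000) / 500.0                   # 4 s at 500 Hz (toy values)
    erp = np.exp(-((t - 2.0) ** 2) / 0.01)        # toy evoked-potential bump
    noise = rng.normal(size=t.size)
    primary = erp + 0.8 * np.convolve(noise, [0.5, 0.3, 0.2], mode="same")
    cleaned = lms_anc(primary, noise)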

  5. Search for Active-State Conformation of Drug Target GPCR Using Real-Coded Genetic Algorithm

    Science.gov (United States)

    Ishino, Yoko; Harada, Takanori; Aida, Misako

    G-Protein coupled receptors (GPCRs) comprise a large superfamily of proteins and are a target for nearly 50% of drugs in clinical use today. GPCRs have a unique structural motif, seven transmembrane helices, and it is known that agonists and antagonists dock with a GPCR in its "active" and "inactive" states, respectively. Knowledge of both conformations is eagerly anticipated for elucidating drug action mechanisms. Since GPCRs are difficult to crystallize, the 3D structures of these receptors had not been determined by X-ray crystallography, except for the inactive-state conformations of two proteins; these enabled the inactive form of other GPCRs to be modeled by computer-aided homology modeling. However, to date, the active form of GPCRs has not been solved. This paper describes a novel method to predict the 3D structure of an active-state GPCR, aimed at molecular docking-based virtual screening, using a real-coded genetic algorithm (real-coded GA), receptor-ligand docking simulations, and molecular dynamics (MD) simulations. The basic idea of the method is that MD is first used to calculate the average 3D coordinates of all atoms of a GPCR protein against heat fluctuation on the pico- or nanosecond time scale, and then the real-coded GA, involving receptor-ligand docking simulations, determines the rotation angle of each helix as a movement on a wider time scale. The method was validated using the human leukotriene B4 receptor BLT1 as a sample GPCR. Our study demonstrated that the established evolutionary search for the active state of the leukotriene receptor provided an appropriate 3D structure of the receptor to dock with its agonists.

  6. Design of Content Based Image Retrieval Scheme for Diabetic Retinopathy Images using Harmony Search Algorithm.

    Science.gov (United States)

    Sivakamasundari, J; Natarajan, V

    2015-01-01

    Diabetic Retinopathy (DR) is a disorder that affects the structure of retinal blood vessels due to long-standing diabetes mellitus. Automated segmentation of blood vessels is vital for periodic screening and timely diagnosis. An attempt has been made to generate continuous retinal vasculature for the design of a Content Based Image Retrieval (CBIR) application. Typical normal and abnormal retinal images are preprocessed to improve the vessel contrast. The blood vessels are segmented using the evolutionary Harmony Search Algorithm (HSA) combined with the Otsu Multilevel Thresholding (MLT) method with the best objective functions. The segmentation results are validated against the corresponding ground truth images using binary similarity measures. Statistical, textural and structural features are obtained from the segmented images of normal and DR-affected retinas and are analyzed. In medical applications, CBIR systems are used to assist physicians in clinical decision support and research. A CBIR system is developed using the HSA-based Otsu MLT segmentation technique and the features obtained from the segmented images. Similarity matching is carried out between the features of the query and database images using the Euclidean distance measure; similar images are ranked and retrieved. The retrieval performance of the CBIR system is evaluated in terms of precision and recall. The CBIR systems developed using HSA-based Otsu MLT and conventional Otsu MLT methods are compared. The precision and recall are found to be 96% and 58%, respectively, for the CBIR system using HSA-based Otsu MLT segmentation. This automated CBIR system could be recommended for use in computer-assisted diagnosis for diabetic retinopathy screening.
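
    The retrieval stage described above reduces to ranking database feature vectors by their Euclidean distance to the query and scoring the result with precision and recall. The sketch below uses synthetic feature vectors and labels as placeholders; the actual system extracts statistical, textural and structural features from the segmented vessel images.

    # Illustrative sketch of Euclidean-distance CBIR ranking with precision/recall.
    import numpy as np

    def retrieve(query_feat, db_feats, k=10):
        dists = np.linalg.norm(db_feats - query_feat, axis=1)   # Euclidean distance
        return np.argsort(dists)[:k]                            # indices of top-k matches

    def precision_recall(retrieved, relevant):
        hits = len(set(retrieved) & set(relevant))
        return hits / len(retrieved), hits / len(relevant)

    rng = np.random.default_rng(1)
    db_feats = rng.random((100, 32))              # placeholder feature vectors
    labels = rng.integers(0, 2, size=100)         # 0 = normal, 1 = DR-affected
    query = db_feats[0]
    top = retrieve(query, db_feats, k=10)
    p, r = precision_recall(top, np.flatnonzero(labels == labels[0]))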

  7. Operation management of daily economic dispatch using novel hybrid particle swarm optimization and gravitational search algorithm with hybrid mutation strategy

    Science.gov (United States)

    Wang, Yan; Huang, Song; Ji, Zhicheng

    2017-07-01

    This paper presents a hybrid particle swarm optimization and gravitational search algorithm based on a hybrid mutation strategy (HGSAPSO-M) to optimize economic dispatch (ED) including distributed generations (DGs), considering market-based energy pricing. A daily ED model was formulated, and a hybrid mutation strategy combining two mutation operators, chaotic mutation and Gaussian mutation, was adopted in HGSAPSO-M. The proposed algorithm was tested on an IEEE 33-bus system, and the results show that the approach is effective for this problem.

  8. Fast online and index-based algorithms for approximate search of RNA sequence-structure patterns

    National Research Council Canada - National Science Library

    Meyer, Fernando; Kurtz, Stefan; Beckstette, Michael

    2013-01-01

    .... However, current tools for searching with RNA sequence-structure patterns cannot fully handle mutations occurring on both these levels or are simply not fast enough for searching large sequence data...

  9. An efficient iterated local search algorithm for the total tardiness blocking flow shop problem

    OpenAIRE

    Ribas Vila, Immaculada; Companys Pascual, Ramón; Tort-Martorell Llabrés, Xavier

    2013-01-01

    This paper deals with the blocking flow shop problem and proposes an Iterated Local Search (ILS) procedure combined with a variable neighbourhood search (VNS) for the total tardiness minimisation. The proposed ILS makes use of a NEH-based procedure to generate the initial solution, and uses a local search to intensify the exploration that combines the insertion and swap neighbourhood and uses a perturbation mechanism consisting of three neighbourhood operators to diversify the search. The com...

  10. Early Seizure Detection by Applying Frequency-Based Algorithm Derived from the Principal Component Analysis

    Directory of Open Access Journals (Sweden)

    Jiseon Lee

    2017-08-01

    Full Text Available The use of automatic electrical stimulation in response to early seizure detection has been introduced as a new treatment for intractable epilepsy. For the effective application of this method as a successful treatment, improving the accuracy of early seizure detection is crucial. In this paper, we propose the application of a frequency-based algorithm derived from principal component analysis (PCA), and demonstrate improved efficacy for early seizure detection in a pilocarpine-induced epilepsy rat model. A total of 100 ictal electroencephalograms (EEGs) recorded during spontaneous recurrent seizures from 11 epileptic rats were included in the analysis. PCA was applied to the covariance matrix of a conventional EEG frequency band signal. Two PCA results were compared: one from the initial segment of the seizures (the first 5 s after seizure onset) and the other from the whole seizure segment. In order to compare accuracy, we obtained the specific threshold satisfying the target performance from the training set, and compared the False Positive (FP) rate, False Negative (FN) rate, and Latency (Lat) of the PCA-based feature derived from the initial segment of the seizures to the other six features in the testing set. The PCA-based feature derived from the initial segment of the seizures performed significantly better than the other features, with a 1.40% FP rate, zero FN, and 0.14 s latency. These results demonstrate that the proposed frequency-based feature from PCA, which captures the characteristics of the initial phase of a seizure, is effective for early seizure detection. Experiments with rat ictal EEGs showed an improved early seizure detection rate with PCA applied to the covariance of the initial 5 s segment from visual seizure onset instead of the whole seizure segment or other conventional frequency bands.
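
    The core of the feature can be sketched as PCA applied to the channel covariance of a band-limited EEG segment, with the leading component summarizing the initial seizure activity. The filtering details, window length and threshold rule are assumptions made for illustration; only the covariance-plus-eigendecomposition step follows the abstract.

    # Hedged sketch of a PCA-based feature computed from an EEG segment.
    import numpy as np

    def pca_feature(segment):
        """segment: (channels, samples) band-filtered EEG from the initial window."""
        centered = segment - segment.mean(axis=1, keepdims=True)
        cov = np.cov(centered)                     # channel covariance matrix
        eigvals, eigvecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
        w = eigvecs[:, -1]                         # leading principal direction
        projected = w @ centered                   # 1-D projection of the segment
        energy = float(np.sum(projected ** 2))     # feature: energy along the PC
        explained = float(eigvals[-1] / eigvals.sum())
        return energy, explained

    rng = np.random.default_rng(0)
    segment = rng.normal(size=(8, 1000))           # e.g. 8 channels, 5 s at 200 Hz
    energy, explained = pca_feature(segment)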

  11. Adaptive Guidance and Control Algorithms applied to the X-38 Reentry Mission

    Science.gov (United States)

    Graesslin, M.; Wallner, E.; Burkhardt, J.; Schoettle, U.; Well, K. H.

    International Space Station's Crew Return/Rescue Vehicle (CRV) is planned to autonomously return the complete crew of 7 astronauts back to earth in case of an emergency. As a prototype of such a vehicle, the X-38 is being developed and built by NASA with European participation. The X-38 is a lifting body with a hypersonic lift to drag ratio of about 0.9. In comparison to the Space Shuttle Orbiter, the X-38 has less aerodynamic manoeuvring capability and fewer actuators. Within the German technology programme TETRA (TEchnologies for future space TRAnsportation systems) contributing to the X-38 program, guidance and control algorithms have been developed and applied to the X-38 reentry mission. The adaptive guidance concept conceived combines an on-board closed-loop predictive guidance algorithm with flight load control that temporarily overrides the attitude commands of the predictive component if the corresponding load constraints are violated. The predictive guidance scheme combines an optimization step and a sequence of constraint restoration cycles. In order to satisfy on-board computation limitations the complete scheme is performed only during the exo-atmospheric flight coast phase. During the controlled atmospheric flight segment the task is reduced to a repeatedly solved targeting problem based on the initial optimal solution, thus omitting in-flight constraints. To keep the flight loads - especially the heat flux, which is in fact a major concern of the X-38 reentry flight - below their maximum admissible values, a flight path controller based on quadratic minimization techniques may override the predictive guidance command for a flight along the constraint boundary. The attitude control algorithms developed are based on dynamic inversion. This methodology enables the designer to straightforwardly devise a controller structure from the system dynamics. The main advantage of this approach with regard to reentry control design lies in the fact that

  12. Automated EEG artifact elimination by applying machine learning algorithms to ICA-based features

    Science.gov (United States)

    Radüntz, Thea; Scouten, Jon; Hochmuth, Olaf; Meffert, Beate

    2017-08-01

    Objective. Biological and non-biological artifacts cause severe problems when dealing with electroencephalogram (EEG) recordings. Independent component analysis (ICA) is a widely used method for eliminating various artifacts from recordings. However, evaluating and classifying the calculated independent components (ICs) as artifact or EEG is not fully automated at present. Approach. In this study, we propose a new approach for automated artifact elimination, which applies machine learning algorithms to ICA-based features. Main results. We compared the performance of our classifiers with the visual classification results given by experts. The best result, with an accuracy rate of 95%, was achieved using features obtained by range filtering of the topoplots and IC power spectra combined with an artificial neural network. Significance. Compared with existing automated solutions, our proposed method is not limited to specific types of artifacts, electrode configurations, or numbers of EEG channels. The main advantage of the proposed method is that it provides an automatic, reliable, real-time capable, and practical tool, which avoids the need for the time-consuming manual selection of ICs during artifact removal.

  13. Genetic algorithm applied to the optimization of quantum cascade lasers with second harmonic generation

    Energy Technology Data Exchange (ETDEWEB)

    Gajić, A. [School of Electrical Engineering, University of Belgrade, Bulevar kralja Aleksandra 73, 11120 Belgrade (Serbia); Telekom Srbija, a.d., Takovska 2, 11000 Belgrade (Serbia); Radovanović, J., E-mail: radovanovic@etf.bg.ac.rs; Milanović, V. [School of Electrical Engineering, University of Belgrade, Bulevar kralja Aleksandra 73, 11120 Belgrade (Serbia); Indjin, D.; Ikonić, Z. [School of Electronic and Electrical Engineering, University of Leeds, Leeds LS2 9JT (United Kingdom)

    2014-02-07

    A computational model for the optimization of the second order optical nonlinearities in GaInAs/AlInAs quantum cascade laser structures is presented. The set of structure parameters that lead to improved device performance was obtained through the implementation of the Genetic Algorithm. In the following step, the linear and second harmonic generation power were calculated by self-consistently solving the system of rate equations for carriers and photons. This rate equation system included both stimulated and simultaneous double photon absorption processes that occur between the levels relevant for second harmonic generation, and material-dependent effective mass, as well as band nonparabolicity, were taken into account. The developed method is general, in the sense that it can be applied to any higher order effect, which requires the photon density equation to be included. Specifically, we have addressed the optimization of the active region of a double quantum well In{sub 0.53}Ga{sub 0.47}As/Al{sub 0.48}In{sub 0.52}As structure and presented its output characteristics.

  14. Performance Comparison of Particle Swarm Optimization and Gravitational Search Algorithm to the Designed of Controller for Nonlinear System

    Directory of Open Access Journals (Sweden)

    Sahazati Md Rozali

    2014-01-01

    Full Text Available This paper presents a backstepping controller design for the tracking control of a nonlinear system. Since the performance of the designed controller depends on the values of the control parameters, gravitational search algorithm (GSA) and particle swarm optimization (PSO) techniques are used to optimize these parameters in order to achieve a predefined system performance. The performance is evaluated based on the tracking error between the reference input given to the system and the system output. The efficacy of the backstepping controller is then verified in a simulation environment under various system setups, both with and without external disturbance. The simulation results show that backstepping with the particle swarm optimization technique performs better than the same controller with the gravitational search algorithm technique in terms of output response and tracking error.

  15. An Efficient Two-Objective Hybrid Local Search Algorithm for Solving the Fuel Consumption Vehicle Routing Problem

    Directory of Open Access Journals (Sweden)

    Weizhen Rao

    2016-01-01

    Full Text Available The classical model of the vehicle routing problem (VRP) generally minimizes either the total vehicle travelling distance or the total number of dispatched vehicles. Due to the increased importance of environmental sustainability, one variant of VRPs that minimizes the total vehicle fuel consumption has gained much attention. The resulting fuel consumption VRP (FCVRP) becomes increasingly important yet difficult. We present a mixed integer programming model for the FCVRP, in which fuel consumption is measured through the degree of road gradient. A complexity analysis of the FCVRP is presented through analogy with the capacitated VRP. To tackle the FCVRP's computational intractability, we propose an efficient two-objective hybrid local search algorithm (TOHLS). TOHLS is based on a hybrid local search algorithm (HLS) that is also used to solve the FCVRP. Based on the Golden CVRP benchmarks, 60 FCVRP instances are generated and tested. Finally, the computational results show that the proposed TOHLS significantly outperforms the HLS.

  16. A granular tabu search algorithm for a real case study of a vehicle routing problem with a heterogeneous fleet and time windows

    Directory of Open Access Journals (Sweden)

    Jose Bernal

    2017-10-01

    Full Text Available Purpose: We consider a real case study of a vehicle routing problem with a heterogeneous fleet and time windows (HFVRPTW) for a franchise company bottling Coca-Cola products in Colombia. This study aims to determine the routes to be performed to fulfill the demand of the customers by using a heterogeneous fleet and considering soft time windows. The objective is to minimize the distance traveled by the performed routes. Design/methodology/approach: We propose a two-phase heuristic algorithm. In the proposed approach, after an initial phase (first phase), a granular tabu search is applied during the improvement phase (second phase). Two additional procedures are considered to help the algorithm escape from local optima when no improvement has been found for a given number of iterations. Findings: Computational experiments on real instances show that the proposed algorithm is able to obtain high-quality solutions within a short computing time compared to the results found by the software that the company currently uses to plan the daily routes. Originality/value: We propose a novel metaheuristic algorithm for solving a real routing problem considering a heterogeneous fleet and time windows. The efficiency of the proposed approach has been tested on real instances, and the computational experiments show its applicability and performance for solving NP-hard routing problems with similar characteristics. The proposed algorithm was able to improve some of the current solutions used by the company by reducing the route length and the number of vehicles.

  17. Calculation of the velocity components for continuous GNSS station through applying the algorithm for least squares adjustment.

    Directory of Open Access Journals (Sweden)

    Jorge Moya Zamora

    2014-06-01

    Full Text Available The velocity of a continuous GNSS observation station is a key input in modern surveying. Daily determination of the GNSS station positions yields coordinate time series for each station, and this information can be influenced by phenomena affecting station performance. This article describes the least squares adjustment algorithm adapted and applied to the determination of the velocity components of continuous observation stations. Furthermore, the algorithm is applied to calculate the velocity of the ETCG station, which belongs to the Geocentric Reference System for the Americas (SIRGAS).
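
    The estimation step amounts to fitting a straight line to each daily coordinate time series and taking its slope as the velocity component. The following sketch uses a synthetic north-component series, not ETCG data, and a standard least-squares solution.

    # Minimal sketch: velocity component as the slope of a linear fit n(t) = n0 + v*t.
    import numpy as np

    days = np.arange(0, 1500)                              # daily epochs
    t = days / 365.25                                      # time in years
    rng = np.random.default_rng(0)
    north = 12.3 * t + rng.normal(scale=2.0, size=t.size)  # toy series in mm

    A = np.column_stack([np.ones_like(t), t])              # design matrix [1, t]
    (n0, v), *_ = np.linalg.lstsq(A, north, rcond=None)

    residuals = north - (n0 + v * t)
    sigma_v = np.sqrt(residuals.var(ddof=2) / ((t - t.mean()) ** 2).sum())
    print(f"north velocity = {v:.2f} +/- {sigma_v:.2f} mm/yr")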

  18. Algorithms

    Indian Academy of Sciences (India)

    , i is referred to as the loop-index, 'stat-body' is any sequence of ... while i ≤ N do stat-body; i := i + 1; endwhile. The algorithm for sorting the numbers is described in Table 1 and the algorithmic steps on a list of 4 numbers are shown in Figure 1.

  19. Multi-objective optimization in the presence of practical constraints using non-dominated sorting hybrid cuckoo search algorithm

    Directory of Open Access Journals (Sweden)

    M. Balasubbareddy

    2015-12-01

    Full Text Available A novel optimization algorithm is proposed to solve single and multi-objective optimization problems with generation fuel cost, emission, and total power losses as objectives. The proposed method is a hybridization of the conventional cuckoo search algorithm and arithmetic crossover operations. Thus, the non-linear, non-convex objective function can be solved under practical constraints. The effectiveness of the proposed algorithm is analyzed for various cases to illustrate the effect of practical constraints on the objectives' optimization. Two and three objective multi-objective optimization problems are formulated and solved using the proposed non-dominated sorting-based hybrid cuckoo search algorithm. The effectiveness of the proposed method in confining the Pareto front solutions in the solution region is analyzed. The results for single and multi-objective optimization problems are physically interpreted on standard test functions as well as the IEEE-30 bus test system with supporting numerical and graphical results and also validated against existing methods.
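
    The non-dominated sorting step that the hybrid relies on can be illustrated with a small sketch: solutions are grouped into successive Pareto fronts by checking dominance over their objective vectors (fuel cost, emission and losses would be the objectives here; the values below are placeholders).

    # Illustrative non-dominated sorting over a set of objective vectors (minimized).
    import numpy as np

    def dominates(a, b):
        """True if solution a dominates b (no objective worse, at least one better)."""
        return bool(np.all(a <= b) and np.any(a < b))

    def non_dominated_fronts(objs):
        """Return a list of fronts, each a list of indices into objs."""
        remaining = set(range(len(objs)))
        fronts = []
        while remaining:
            front = [i for i in remaining
                     if not any(dominates(objs[j], objs[i]) for j in remaining if j != i)]
            fronts.append(front)
            remaining -= set(front)
        return fronts

    objs = np.array([[0.9, 0.2, 0.4], [0.5, 0.5, 0.5], [0.4, 0.6, 0.6], [1.0, 1.0, 1.0]])
    print(non_dominated_fronts(objs))   # the first front holds the Pareto-optimal points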

  20. INGV Oblique Ionograms Automatic Scaling Algorithm applied to the ionograms recorded by Ebro Observatory ionosonde in disturbed conditions

    Science.gov (United States)

    Ippolito, Alessandro; Scotto, Carlo; Altadill, David; Blanch, Estefania

    2017-04-01

    The OIASA algorithm (Oblique Ionograms Automatic Scaling Algorithm) for the identification of traces in oblique ionograms has been applied to the oblique ionograms produced at Ebro Observatory (Spain) for the radio link between the ionospheric stations of Dourbes (50.1 N, 4.6 E) and Roquetes (40.8 N, 0.5 E). Four different periods of 2015 have been analysed, each of them characterised by the occurrence of geomagnetic storms. The algorithm allows the determination of the Maximum Usable Frequency (MUF) for communication between the transmitter and receiver, and shows a very good capacity for automatically rejecting poor quality ionograms. The behaviour and performance of the autoscaling programs under geomagnetically disturbed conditions have been evaluated. The results show a good agreement between the MUF values provided by the automatic scaling algorithm and the MUF values manually scaled by an expert operator. Furthermore, the results show the good capability of OIASA to discard ionograms that lack sufficient information.

  1. Imperialist Competitive Algorithm with Dynamic Parameter Adaptation Using Fuzzy Logic Applied to the Optimization of Mathematical Functions

    Directory of Open Access Journals (Sweden)

    Emer Bernal

    2017-01-01

    Full Text Available In this paper we present a method that uses fuzzy logic for dynamic parameter adaptation in the imperialist competitive algorithm, which is usually known by its acronym ICA. The ICA algorithm was initially studied in its original form to find out how it works and which parameters have the greatest effect upon its results. Based on this study, several designs of fuzzy systems for dynamic adjustment of the ICA parameters are proposed. The experiments were performed by solving complex optimization problems, particularly benchmark mathematical functions. A comparison of the original imperialist competitive algorithm and our proposed fuzzy imperialist competitive algorithm was performed. In addition, the fuzzy ICA was compared with another metaheuristic using a statistical test to measure the advantage of the proposed fuzzy approach for dynamic parameter adaptation.

  2. Google Searches for "Cheap Cigarettes" Spike at Tax Increases: Evidence from an Algorithm to Detect Spikes in Time Series Data.

    Science.gov (United States)

    Caputi, Theodore L

    2017-06-22

    Online cigarette dealers have lower prices than brick-and-mortar retailers and advertise tax-free status 1-8. Previous studies show smokers search out these online alternatives at the time of a cigarette tax increase 9-10. However, these studies rely upon the researchers' decision to consider a specific date and preclude the possibility that researchers focus on the wrong date. The purpose of this study is to introduce an unbiased methodology to the field of observing search patterns and to use this methodology to determine whether smokers search Google for "cheap cigarettes" at cigarette tax increases and, if so, whether the increased level of searches persists. Publicly available data from Google Trends is used to observe standardized search volumes for the term "cheap cigarettes." Seasonal Hybrid Extreme Studentized Deviate (SHESD) and E-Divisive with Means (EDM) tests were performed to observe spikes and mean level shifts in search volume. Of the twelve cigarette tax increases studied, ten showed spikes in searches for "cheap cigarettes" within two weeks of the tax increase. However, mean level shifts did not occur for any cigarette tax increase. Searches for "cheap cigarettes" thus spike around the time of a cigarette tax increase, but the mean level of searches does not shift in response to a tax increase. The SHESD and EDM tests are unbiased methodologies that can be used to identify spikes and mean level shifts in time series data without an a priori date to be studied, and they affirm that spikes in interest are related to tax increases. The study applies improved statistical techniques (SHESD and EDM) to Google search data related to cigarettes, reducing bias and increasing power, and it contributes to the body of evidence that state and federal tax increases are associated with spikes in searches for cheap cigarettes and may be good dates for increased online health messaging related to tobacco.
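
    As a simplified stand-in for the spike test used above, the sketch below flags points whose deviation from a rolling median exceeds a robust threshold. It is not the actual Seasonal Hybrid ESD or E-Divisive with Means implementation, and the weekly series is synthetic; it only illustrates the idea of detecting spikes without an a priori date.

    # Simplified spike detector: rolling median + median absolute deviation (MAD).
    import numpy as np

    def detect_spikes(series, window=8, threshold=4.0):
        series = np.asarray(series, dtype=float)
        spikes = []
        for i in range(window, len(series)):
            ref = series[i - window:i]
            med = np.median(ref)
            mad = np.median(np.abs(ref - med)) or 1.0    # guard against zero MAD
            if (series[i] - med) / (1.4826 * mad) > threshold:
                spikes.append(i)
        return spikes

    volume = np.r_[np.full(20, 50.0), 95.0, np.full(20, 52.0)]   # spike at index 20
    print(detect_spikes(volume))                                  # -> [20]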

  3. From Schrödinger's equation to the quantum search algorithm

    Indian Academy of Sciences (India)

    Although the algorithm itself is widely known, not so well known is the series of steps that first led to it, these are quite different from any of the generally known forms of the algorithm. This paper describes these steps, which start by discretizing Schrödinger's equation. This paper also provides a self contained introduction to ...

  4. Top-k Keyword Search Over Graphs Based On Backward Search

    Directory of Open Access Journals (Sweden)

    Zeng Jia-Hui

    2017-01-01

    Full Text Available Keyword search is one of the most friendly and intuitive information retrieval methods. Using keyword search to obtain connected subgraphs has many applications in graph-based cognitive computation and is a basic technology. This paper focuses on top-k keyword search over graphs. We implemented a keyword search algorithm which applies the backward search idea. The algorithm first locates the keyword vertices and then applies backward search to find rooted trees that contain the query keywords. The experiments show that the query time is affected by the number of iterations of the algorithm.
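
    The backward-search idea can be sketched as growing a BFS frontier from each set of keyword-matching vertices and reporting the vertices reached from every keyword as candidate roots, ranked here by the summed distances. The toy graph, the scoring rule and the function names are simplifying assumptions, not the paper's implementation.

    # Minimal backward keyword search sketch over an adjacency-list graph.
    from collections import deque
    import heapq

    def backward_search(adj, keyword_vertices, k=3):
        """adj: vertex -> iterable of neighbours; keyword_vertices: keyword -> hit vertices."""
        dist = {}                                    # vertex -> {keyword: distance}
        for kw, starts in keyword_vertices.items():
            seen, frontier = {v: 0 for v in starts}, deque(starts)
            while frontier:                          # BFS backwards from the keyword hits
                u = frontier.popleft()
                for v in adj.get(u, ()):
                    if v not in seen:
                        seen[v] = seen[u] + 1
                        frontier.append(v)
            for v, d in seen.items():
                dist.setdefault(v, {})[kw] = d
        roots = [(sum(d.values()), v) for v, d in dist.items()
                 if len(d) == len(keyword_vertices)]
        return heapq.nsmallest(k, roots)             # k best roots by total distance

    adj = {"a": ["b"], "b": ["a", "c", "d"], "c": ["b"], "d": ["b"]}
    print(backward_search(adj, {"k1": ["a"], "k2": ["c"]}, k=2))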

  5. A Parallel Biased Random-Key Genetic Algorithm with Multiple Populations Applied to Irregular Strip Packing Problems

    Directory of Open Access Journals (Sweden)

    Bonfim Amaro Júnior

    2017-01-01

    Full Text Available The irregular strip packing problem (ISPP) is a class of cutting and packing problem (C&P) in which a set of items with arbitrary formats must be placed in a container with a variable length. The aim of this work is to minimize the area needed to accommodate the given demand. The ISPP is present in various types of industries, from manufacturers to exporters (e.g., shipbuilding, clothing, and glass). In this paper, we propose a parallel Biased Random-Key Genetic Algorithm (µ-BRKGA) with multiple populations for the ISPP, applying a collision-free region (CFR) concept as the positioning method in order to obtain an efficient and fast layout solution. For the proposed algorithm, the layout is represented by the placement order into the container and the corresponding orientation. In order to evaluate the proposed µ-BRKGA algorithm, computational tests using benchmark problems were applied, analyzed, and compared with different approaches.

  6. Comparison and evaluation of network clustering algorithms applied to genetic interaction networks.

    Science.gov (United States)

    Hou, Lin; Wang, Lin; Berg, Arthur; Qian, Minping; Zhu, Yunping; Li, Fangting; Deng, Minghua

    2012-01-01

    The goal of network clustering algorithms is to detect dense clusters in a network, providing a first step towards the understanding of large scale biological networks. With numerous recent advances in biotechnologies, large-scale genetic interactions are widely available, but there is a limited understanding of which clustering algorithms may be most effective. In order to address this problem, we conducted a systematic study to compare and evaluate six clustering algorithms in analyzing genetic interaction networks, and investigated the factors that influence the choice of algorithm. The algorithms considered in this comparison include hierarchical clustering, topological overlap matrix, bi-clustering, Markov clustering, Bayesian discriminant analysis based community detection, and the variational Bayes approach to modularity. Both experimentally identified and synthetically constructed networks were used in this comparison. The accuracy of the algorithms is measured by the Jaccard index in comparing predicted gene modules with benchmark gene sets. The results suggest that the choice differs according to the network topology and evaluation criteria. Hierarchical clustering proved best at predicting protein complexes; Bayesian discriminant analysis based community detection proved best on epistatic miniarray profile (EMAP) datasets; and the variational Bayes approach to modularity was noticeably better than the other algorithms on the genome-scale networks.

  7. The Patch-Levy-Based Bees Algorithm Applied to Dynamic Optimization Problems

    Directory of Open Access Journals (Sweden)

    Wasim A. Hussein

    2017-01-01

    Full Text Available Many real-world optimization problems are actually of a dynamic nature. These problems change over time in terms of the objective function, decision variables, constraints, and so forth. Therefore, it is very important to study the performance of a metaheuristic algorithm in dynamic environments to assess the robustness of the algorithm in dealing with real-world problems. In addition, it is important to adapt the existing metaheuristic algorithms to perform well in dynamic environments. This paper investigates a recently proposed version of the Bees Algorithm, called the Patch-Levy-based Bees Algorithm (PLBA), on solving dynamic problems, and adapts it to deal with such problems. The performance of the PLBA is compared with other BA versions and other state-of-the-art algorithms on a set of dynamic multimodal benchmark problems of different degrees of difficulty. The results of the experiments show that PLBA achieves better results than the other BA variants. The obtained results also indicate that PLBA significantly outperforms some of the other state-of-the-art algorithms and is competitive with others.

  8. Study of data fusion algorithms applied to unattended ground sensor network

    Science.gov (United States)

    Pannetier, B.; Moras, J.; Dezert, Jean; Sella, G.

    2014-06-01

    In this paper, data obtained from a wireless unattended ground sensor network are used for tracking multiple ground targets (vehicles, pedestrians and animals) moving on and off the road network. The goal of the study is to evaluate several data fusion algorithms to select the best approach to establish tactical situational awareness. The ground sensor network is composed of heterogeneous sensors (optronic, radar, seismic, acoustic, magnetic sensors) and data fusion nodes. The fusion nodes are small hardware platforms placed in the surveillance area that communicate with each other. In order to satisfy operational needs and the limited communication bandwidth between the nodes, we study several data fusion algorithms to track and classify targets in real time. A multiple target tracking (MTT) algorithm is integrated in each data fusion node, taking into account embedded constraints. The choice of the MTT algorithm is motivated by the limits of the chosen technology. In the fusion nodes, the distributed MTT algorithm exploits the road network information in order to constrain the multiple dynamic models. Then, a variable structure interacting multiple model (VS-IMM) is adapted to the road network topology. This algorithm is well known in centralized architectures, but it requires a modification of the other data fusion algorithms to preserve the tracking performance under constraints. Based on such a VS-IMM MTT algorithm, we adapt classical data fusion techniques to make them work in three architectures: centralized, distributed and hierarchical. The sensor measurements are considered asynchronous, but the fusion steps are synchronized across all sensors. The performance of the data fusion algorithms is evaluated using simulated data and also validated on real data. The scenarios under analysis contain multiple targets with close and crossing trajectories involving data association uncertainties.

  9. Optimal Power Flow Using Gbest-Guided Cuckoo Search Algorithm with Feedback Control Strategy and Constraint Domination Rule

    Directory of Open Access Journals (Sweden)

    Gonggui Chen

    2017-01-01

    Full Text Available The optimal power flow (OPF) is well known as a significant optimization tool for the secure and economic operation of power systems, and the OPF problem is a complex nonlinear, nondifferentiable programming problem. This paper therefore proposes a Gbest-guided cuckoo search algorithm with a feedback control strategy and a constraint domination rule, named the FCGCS algorithm, for solving the OPF problem and obtaining the optimal solution. The FCGCS algorithm is guided by the global best solution to strengthen its exploitation ability. The feedback control strategy is devised to dynamically regulate the control parameters according to the actual feedback value obtained during the simulation process, and the constraint domination rule can efficiently handle inequality constraints on state variables, which is superior to the traditional penalty function method. The performance of the FCGCS algorithm is tested and validated on the IEEE 30-bus and IEEE 57-bus example systems, and simulation results are compared with those of different methods reported in the recent literature. The comparison results indicate that the FCGCS algorithm can provide high-quality feasible solutions for different OPF problems.
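
    A constraint-domination comparison of the kind contrasted above with penalty functions can be sketched as follows: feasible solutions always beat infeasible ones, two infeasible solutions are compared by total constraint violation, and two feasible solutions by objective value. The exact rule used in the paper and the sample numbers below are assumptions for illustration.

    # Sketch of a constraint-domination ("feasibility first") comparison rule.
    def total_violation(g_values):
        """Sum of positive parts of inequality constraints written as g_i(x) <= 0."""
        return sum(max(0.0, g) for g in g_values)

    def better(sol_a, sol_b):
        """Each solution is a dict with 'cost' and 'g' (list of constraint values)."""
        va, vb = total_violation(sol_a["g"]), total_violation(sol_b["g"])
        if va == 0.0 and vb == 0.0:
            return sol_a["cost"] < sol_b["cost"]     # both feasible: compare cost
        if va == 0.0 or vb == 0.0:
            return va == 0.0                         # feasible beats infeasible
        return va < vb                               # both infeasible: less violation wins

    a = {"cost": 802.1, "g": [0.0, -0.02]}           # feasible solution
    b = {"cost": 799.8, "g": [0.05, -0.01]}          # cheaper but violates a limit
    print(better(a, b))                              # -> True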

  10. Comparing three stochastic search algorithms for computational protein design: Monte Carlo, replica exchange Monte Carlo, and a multistart, steepest-descent heuristic.

    Science.gov (United States)

    Mignon, David; Simonson, Thomas

    2016-07-15

    Computational protein design depends on an energy function and an algorithm to search the sequence/conformation space. We compare three stochastic search algorithms: a heuristic, Monte Carlo (MC), and a Replica Exchange Monte Carlo method (REMC). The heuristic performs a steepest-descent minimization starting from thousands of random starting points. The methods are applied to nine test proteins from three structural families, with a fixed backbone structure, a molecular mechanics energy function, and with 1, 5, 10, 20, 30, or all amino acids allowed to mutate. Results are compared to an exact, "Cost Function Network" method that identifies the global minimum energy conformation (GMEC) in favorable cases. The designed sequences accurately reproduce experimental sequences in the hydrophobic core. The heuristic and REMC agree closely and reproduce the GMEC when it is known, with a few exceptions. Plain MC performs well for most cases, occasionally departing from the GMEC by 3-4 kcal/mol. With REMC, the diversity of the sequences sampled agrees with exact enumeration where the latter is possible: up to 2 kcal/mol above the GMEC. Beyond, room temperature replicas sample sequences up to 10 kcal/mol above the GMEC, providing thermal averages and a solution to the inverse protein folding problem. © 2016 Wiley Periodicals, Inc.
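
    The plain Monte Carlo search compared above can be illustrated with a toy Metropolis sampler over sequences. The "energy" below is a placeholder mismatch count rather than a molecular-mechanics energy, and the temperature and step count are arbitrary; only the propose/accept structure reflects the method.

    # Toy Metropolis Monte Carlo over amino-acid sequences (placeholder energy).
    import math, random

    AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

    def energy(seq, target="MKVLAT"):
        """Placeholder energy: number of mismatches to an arbitrary target motif."""
        return sum(a != b for a, b in zip(seq, target))

    def metropolis_design(length=6, steps=5000, kT=0.5, seed=0):
        rng = random.Random(seed)
        seq = [rng.choice(AMINO_ACIDS) for _ in range(length)]
        e = energy(seq)
        best, best_e = list(seq), e
        for _ in range(steps):
            pos = rng.randrange(length)
            old = seq[pos]
            seq[pos] = rng.choice(AMINO_ACIDS)                   # propose a point mutation
            e_new = energy(seq)
            if e_new <= e or rng.random() < math.exp(-(e_new - e) / kT):
                e = e_new                                        # accept (Metropolis criterion)
                if e < best_e:
                    best, best_e = list(seq), e
            else:
                seq[pos] = old                                   # reject: revert the mutation
        return "".join(best), best_e

    print(metropolis_design())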

  11. MODIS 250M burnt area detection algorithm: A case study applied, optimized and evaluated over continental Portugal.

    Science.gov (United States)

    Mota, Bernardo; Benali, Akli; Pereira, Jose Miguel

    2014-05-01

    The dependence on satellites to derive burnt area (BA) maps is unquestionable. High resolution inventories normally result from change detection algorithms applied to pre- and post-fire-season high resolution imagery, but these provide no temporal discrimination within the fire season. Although limited to the larger fire scars, coarser resolution imagery based on reflectance or thermal information can help to map individual fire progression. The Moderate Resolution Imaging Spectroradiometer (MODIS) 250 m imagery bands, freely available, can be used to provide quick areal estimates and the needed temporal discrimination at four times the spatial resolution of the standard BA products. The scope of this study is to assess the spatial and temporal accuracy of burnt area maps derived by the MODIS 250 m resolution Burnt Area algorithm (M250BA) presented by Mota et al. (2013) over a Mediterranean landscape. The algorithm is an improved adaptation of one of the burnt area algorithms developed within the scope of the Fire_CCI project and was applied to an area covering continental Portugal for the period 2001-2013. The algorithm comprises a temporal analysis based on change point detection and a spatial analysis based on Markov random fields. We explored the benefits of applying standard optimization techniques to the algorithm and achieved significant performance improvements. Temporal and spatial accuracy assessments were performed by comparing the results with the spatial and temporal distribution of active fire maps and with high resolution burnt area maps, derived from the MCD14ML thermal anomalies dataset and from Landsat BA classifications, respectively. The accuracy results highlight the potential applications of this BA algorithm and the advantages of using 250 m spatial resolution images for BA detection. The study also extends the current national burnt area atlas beyond 2010. Due to the open-access data policy, the algorithm can be easily parameterised and applied to any

  12. Enhancing State-of-the-art Multi-objective Optimization Algorithms by Applying Domain Specific Operators

    DEFF Research Database (Denmark)

    Ghoreishi, Newsha; Sørensen, Jan Corfixen; Jørgensen, Bo Nørregaard

    2015-01-01

    To solve dynamic multi-objective optimization problems, optimization algorithms are required to converge quickly in response to changes in the environment without reducing the diversity of the found solutions. Most Multi-Objective Evolutionary Algorithms (MOEAs) are designed to solve static multiobjective...... problems. Problems emerge when the algorithms cannot converge fast enough, due to scalability issues introduced by using too generic operators. This paper presents an evolutionary algorithm CONTROLEUM-GA that uses domain specific variables and operators to solve a real dynamic greenhouse climate control...... optimization problems where the environment does not change dynamically. For that reason, the requirement for convergence in static optimization problems is not as time-critical as for dynamic optimization problems. Most MOEAs use generic variables and operators that scale to static multi-objective optimization...

  13. Evolutionary Algorithms Applied to Antennas and Propagation: A Review of State of the Art

    Directory of Open Access Journals (Sweden)

    Sotirios K. Goudos

    2016-01-01

    Full Text Available A review of evolutionary algorithms (EAs) with applications to antenna and propagation problems is presented. EAs have emerged as viable candidates for global optimization problems and have been attracting the attention of the research community interested in solving real-world engineering problems, as evidenced by the fact that a very large number of antenna design problems have been addressed in the literature in recent years by using EAs. In this paper, our primary focus is on Genetic Algorithms (GAs), Particle Swarm Optimization (PSO), and Differential Evolution (DE), though we also briefly review other recently introduced nature-inspired algorithms. An overview of case examples optimized by each family of algorithms is included in the paper.

  14. Differential Reduction Algorithms for Hypergeometric Functions Applied to Feynman Diagram Calculation

    OpenAIRE

    Bytev, V. V.; Kalmykov, M; Kniehl, B. A.; Ward, B F L; Yost, S A

    2009-01-01

    We describe the application of differential reduction algorithms for Feynman Diagram calculation. We illustrate the procedure in the context of generalized hypergeometric functions, and give an example for a type of q-loop bubble diagram.

  15. Hybrid Model Based on Genetic Algorithms and SVM Applied to Variable Selection within Fruit Juice Classification

    Directory of Open Access Journals (Sweden)

    C. Fernandez-Lozano

    2013-01-01

    Full Text Available Given the background of the use of Neural Networks in problems of apple juice classification, this paper aims at implementing a newly developed method in the field of machine learning: Support Vector Machines (SVM). Therefore, a hybrid model that combines genetic algorithms and support vector machines is suggested in such a way that, when using the SVM as the fitness function of the Genetic Algorithm (GA), the most representative variables for a specific classification problem can be selected.

  16. Hybrid Model Based on Genetic Algorithms and SVM Applied to Variable Selection within Fruit Juice Classification

    Science.gov (United States)

    Fernandez-Lozano, C.; Canto, C.; Gestal, M.; Andrade-Garda, J. M.; Rabuñal, J. R.; Dorado, J.; Pazos, A.

    2013-01-01

    Given the background of the use of Neural Networks in problems of apple juice classification, this paper aims at implementing a newly developed method in the field of machine learning: Support Vector Machines (SVM). Therefore, a hybrid model that combines genetic algorithms and support vector machines is suggested in such a way that, when using the SVM as the fitness function of the Genetic Algorithm (GA), the most representative variables for a specific classification problem can be selected. PMID:24453933

  17. Pentagram star pattern identification algorithm applied in three-head star sensors

    Science.gov (United States)

    Wu, Feng; Zhu, Xifang; Jiang, Xiaoyan

    2017-07-01

    A pentagram star pattern identification algorithm for three-head star sensors is proposed, and its complete realization scheme is presented. Simulated star maps were produced by letting the three-head star sensor travel randomly around the celestial sphere and image the observed stars. Monte Carlo experiments were carried out and the performance of the pentagram algorithm was evaluated. The results show that its identification success rate reaches 98%.

  18. A Subspace Preconditioned LSQR Gauss-Newton Method with a Constrained Line Search Path Applied to 3D Biomedical Microwave Imaging

    Directory of Open Access Journals (Sweden)

    Jürgen De Zaeytijd

    2015-01-01

    Full Text Available Three contributions that can improve the performance of a Newton-type iterative quantitative microwave imaging algorithm in a biomedical context are proposed. (i) To speed up the iterative forward problem solution, we extrapolate the initial guess of the field from a few field solutions corresponding to previous source positions for the same complex permittivity (i.e., “marching on in source position”) as well as from a Born-type approximation that is computed from a field solution corresponding to one previous complex permittivity profile for the same source position. (ii) The regularized Gauss-Newton update system can be ill-conditioned; hence we propose to employ a two-level preconditioned iterative solution method. We apply the subspace preconditioned LSQR algorithm from Jacobsen et al. (2003) and we employ a 3D cosine basis. (iii) We propose a new constrained line search path in the Gauss-Newton optimization, which incorporates in a smooth manner lower and upper bounds on the object permittivity, such that these bounds can never be violated along the search path. Single-frequency reconstructions from bipolarized synthetic data are shown for various three-dimensional numerical biological phantoms, including a realistic breast phantom from the University of Wisconsin-Madison (UWCEM) online repository.

  19. Steady state load shedding to mitigate blackout in power systems using an improved harmony search algorithm

    Directory of Open Access Journals (Sweden)

    R. Mageshvaran

    2015-09-01

    The proposed algorithm is tested on IEEE 14, 30 and 118 bus test systems. The viability of the proposed method in terms of solution quality and convergence properties is compared with the other conventional methods reported earlier.

  20. Analysis of an iterated local search algorithm for vertex cover in sparse random graphs

    DEFF Research Database (Denmark)

    Witt, Carsten

    2012-01-01

    Recently, various randomized search heuristics have been studied for the solution of the minimum vertex cover problem, in particular for sparse random instances according to the G(n,c/n) model, where c>0 is a constant. Methods from statistical physics suggest that the problem is easy if c… For c… the search heuristic… finds an optimal cover in polynomial time with a probability arbitrarily close to 1. This behavior relies on the absence of a giant component. As an additional insight into the randomized search, it is shown that the heuristic fails badly also on graphs consisting of a single tree component of maximum…

  1. Algorithms for Regular Tree Grammar Network Search and Their Application to Mining Human-viral Infection Patterns.

    Science.gov (United States)

    Smoly, Ilan; Carmel, Amir; Shemer-Avni, Yonat; Yeger-Lotem, Esti; Ziv-Ukelson, Michal

    2016-03-01

    Network querying is a powerful approach to mine molecular interaction networks. Most state-of-the-art network querying tools either confine the search to a prespecified topology in the form of some template subnetwork, or do not specify any topological constraints at all. Another approach is grammar-based queries, which are more flexible and expressive as they allow for expressing the topology of the sought pattern according to some grammar-based logic. Previous grammar-based network querying tools were confined to the identification of paths. In this article, we extend the patterns identified by grammar-based query approaches from paths to trees. For this, we adopt a higher order query descriptor in the form of a regular tree grammar (RTG). We introduce a novel problem and propose an algorithm to search a given graph for the k highest scoring subgraphs matching a tree accepted by an RTG. Our algorithm is based on the combination of dynamic programming with color coding, and includes an extension of previous k-best parsing optimization approaches to avoid isomorphic trees in the output. We implement the new algorithm and exemplify its application to mining viral infection patterns within molecular interaction networks. Our code is available online.

  2. Maximize Minimum Utility Function of Fractional Cloud Computing System Based on Search Algorithm Utilizing the Mittag-Leffler Sum

    Directory of Open Access Journals (Sweden)

    Rabha W. Ibrahim

    2018-01-01

    Full Text Available The maximum min utility function (MMUF) problem is an important representative of a large class of cloud computing systems (CCS), with numerous applications in practice, especially in economics and industry. This paper introduces an effective solution-based search (SBS) algorithm for solving the MMUF problem. First, we suggest a new formula of the utility function in terms of the capacity of the cloud. We formulate the capacity in CCS by using a fractional diffeo-integral equation, which usually describes the flow of CCS. The new formula of the utility function modifies recently proposed utility functions. The suggested technique first creates a high-quality initial solution by eliminating the less promising components, and then improves the quality of the achieved solution by the summation search solution (SSS). This method employs the Mittag-Leffler sum as a hash function to determine the position of the agent. Experimental results on instances commonly utilized in the literature demonstrate that the proposed algorithm competes favorably with the state-of-the-art algorithms both in terms of solution quality and computational efficiency.

  3. Solution Approach to Automatic Generation Control Problem Using Hybridized Gravitational Search Algorithm Optimized PID and FOPID Controllers

    Directory of Open Access Journals (Sweden)

    DAHIYA, P.

    2015-05-01

    Full Text Available This paper presents the application of hybrid opposition based disruption operator in gravitational search algorithm (DOGSA) to solve the automatic generation control (AGC) problem of a four area hydro-thermal-gas interconnected power system. The proposed DOGSA approach combines the advantages of opposition based learning, which enhances the speed of convergence, and the disruption operator, which has the ability to further explore and exploit the search space of the standard gravitational search algorithm (GSA). The addition of these two concepts to GSA increases its flexibility for solving complex optimization problems. This paper addresses the design and performance analysis of DOGSA based proportional integral derivative (PID) and fractional order proportional integral derivative (FOPID) controllers for the automatic generation control problem. The proposed approaches are demonstrated by comparing the results with the standard GSA, opposition learning based GSA (OGSA) and disruption based GSA (DGSA). The sensitivity analysis is also carried out to study the robustness of DOGSA tuned controllers in order to accommodate variations in operating load conditions, tie-line synchronizing coefficient, time constants of governor and turbine. Further, the approaches are extended to a more realistic power system model by considering the physical constraints such as thermal turbine generation rate constraint, speed governor dead band and time delay.

  4. A computationally efficient depression-filling algorithm for digital elevation models, applied to proglacial lake drainage

    Science.gov (United States)

    Berends, Constantijn J.; van de Wal, Roderik S. W.

    2016-12-01

    Many processes govern the deglaciation of ice sheets. One of the processes that is usually ignored is the calving of ice in lakes that temporarily surround the ice sheet. In order to capture this process a "flood-fill algorithm" is needed. Here we present and evaluate several optimizations to a standard flood-fill algorithm in terms of computational efficiency. As an example, we determine the land-ocean mask for a 1 km resolution digital elevation model (DEM) of North America and Greenland, a geographical area of roughly 7000 by 5000 km (roughly 35 million elements), about half of which is covered by ocean. Determining the land-ocean mask with our improved flood-fill algorithm reduces computation time by 90 % relative to using a standard stack-based flood-fill algorithm. This implies that it is now feasible to include the calving of ice in lakes as a dynamical process inside an ice-sheet model. We demonstrate this by using bedrock elevation, ice thickness and geoid perturbation fields from the output of a coupled ice-sheet-sea-level equation model at 30 000 years before present and determine the extent of Lake Agassiz, using both the standard and improved versions of the flood-fill algorithm. We show that several optimizations to the flood-fill algorithm used for filling a depression up to a water level, which is not defined beforehand, decrease the computation time by up to 99 %. The resulting reduction in computation time allows determination of the extent and volume of depressions in a DEM over large geographical grids or repeatedly over long periods of time, where computation time might otherwise be a limiting factor. The algorithm can be used for all glaciological and hydrological models, which need to trace the evolution over time of lakes or drainage basins in general.
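
    The record above builds its optimizations on a standard stack-based flood fill over a DEM. As a rough, hedged sketch of that basic idea only (not the authors' optimized algorithm), the snippet below marks all grid cells connected to a seed cell whose elevation lies below a given water level; the array, seed, and threshold are invented for illustration.

```python
import numpy as np

def flood_fill_mask(dem, seed, water_level):
    """Stack-based flood fill: mark cells connected to `seed` whose
    elevation is below `water_level` (illustrative sketch)."""
    mask = np.zeros(dem.shape, dtype=bool)
    stack = [seed]
    while stack:
        i, j = stack.pop()
        if mask[i, j] or dem[i, j] >= water_level:
            continue
        mask[i, j] = True
        # Push the four direct neighbours that lie inside the grid.
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < dem.shape[0] and 0 <= nj < dem.shape[1]:
                stack.append((ni, nj))
    return mask

# Tiny example: a 4x4 DEM with water level 0, seeded at a corner "ocean" cell.
dem = np.array([[-1.0, -1.0, 2.0, 3.0],
                [-1.0,  1.0, 2.0, 3.0],
                [ 0.5,  1.0, 2.0, 3.0],
                [ 1.0,  1.0, 2.0, 3.0]])
print(flood_fill_mask(dem, (0, 0), 0.0))
```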

  5. An Improved String-Searching Algorithm and Its Application in Component Security Testing

    National Research Council Canada - National Science Library

    Jinfu Chen; Saihua Cai; Lili Zhu; Yuchi Guo; Rubing Huang; Xiaolei Zhao; Yunqi Sheng

    2016-01-01

    Mass monitor logs are produced during the process of component security testing. In order to mine the explicit and implicit security exception information of the tested component, the log should be searched for keyword strings...

  6. The effect of neighborhood structures on tabu search algorithm in solving university course timetabling problem

    Science.gov (United States)

    Shakir, Ali; AL-Khateeb, Belal; Shaker, Khalid; Jalab, Hamid A.

    2014-12-01

    The design of course timetables for academic institutions is a very difficult job due to the huge number of possible feasible timetables with respect to the problem size. The process involves many constraints that must be taken into account and a large search space to be explored, even if the size of the problem input is not significantly large. Different heuristic approaches have been proposed in the literature to solve this kind of problem, and one of the efficient solution methods is tabu search. Different neighborhood structures, based on different types of move, have been defined in studies using tabu search. In this paper, the effect of different neighborhood structures on the operation of tabu search is examined. The performance of the different neighborhood structures is tested over eleven benchmark datasets, and the results obtained for each neighborhood structure are compared with one another. The results show the disparity between the neighborhood structures in terms of penalty cost.
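
    Since the record above is about how the choice of neighborhood structure (move type) affects tabu search, a minimal generic tabu search loop with a pluggable neighborhood function may help fix ideas. This is only an illustrative sketch under assumed names (cost function, move encoding, tenure); it does not reproduce the timetabling moves studied in the paper.

```python
import random
from collections import deque

def tabu_search(initial, neighborhood, cost, tenure=10, iterations=200):
    """Generic tabu search: `neighborhood(sol)` yields (move, new_solution)
    pairs, so different move types (swap, shift, ...) can be plugged in."""
    current = best = initial
    best_cost = cost(initial)
    tabu = deque(maxlen=tenure)          # recently applied moves
    for _ in range(iterations):
        candidates = [(m, s) for m, s in neighborhood(current)
                      if m not in tabu or cost(s) < best_cost]  # aspiration
        if not candidates:
            break
        move, current = min(candidates, key=lambda ms: cost(ms[1]))
        tabu.append(move)
        if cost(current) < best_cost:
            best, best_cost = current, cost(current)
    return best, best_cost

# Toy usage: minimise the number of adjacent duplicates in a list via swaps.
def swap_neighborhood(sol):
    for i in range(len(sol) - 1):
        s = sol[:]
        s[i], s[i + 1] = s[i + 1], s[i]
        yield ("swap", i), s

def clashes(sol):
    return sum(a == b for a, b in zip(sol, sol[1:]))

random.seed(0)
start = [random.randint(0, 2) for _ in range(12)]
print(tabu_search(start, swap_neighborhood, clashes))
```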

  7. Comparison of vessel enhancement algorithms applied to time-of-flight MRA images for cerebrovascular segmentation.

    Science.gov (United States)

    Phellan, Renzo; Forkert, Nils D

    2017-11-01

    Vessel enhancement algorithms are often used as a preprocessing step for vessel segmentation in medical images to improve the overall segmentation accuracy. Each algorithm uses different characteristics to enhance vessels, such that the most suitable algorithm may vary for different applications. This paper presents a comparative analysis of the accuracy gains in vessel segmentation generated by the use of nine vessel enhancement algorithms: Multiscale vesselness using the formulas described by Erdt (MSE), Frangi (MSF), and Sato (MSS), optimally oriented flux (OOF), ranking orientations responses path operator (RORPO), the regularized Perona-Malik approach (RPM), vessel enhanced diffusion (VED), hybrid diffusion with continuous switch (HDCS), and the white top hat algorithm (WTH). The filters were evaluated and compared based on time-of-flight MRA datasets and corresponding manual segmentations from 5 healthy subjects and 10 patients with an arteriovenous malformation. Additionally, five synthetic angiographic datasets with corresponding ground truth segmentation were generated with three different noise levels (low, medium, and high) and also used for comparison. The parameters for each algorithm and subsequent segmentation were optimized using leave-one-out cross evaluation. The Dice coefficient, Matthews correlation coefficient, area under the ROC curve, number of connected components, and true positives were used for comparison. The results of this study suggest that vessel enhancement algorithms do not always lead to more accurate segmentation results compared to segmenting nonenhanced images directly. Multiscale vesselness algorithms, such as MSE, MSF, and MSS, proved to be robust to noise, while diffusion-based filters, such as RPM, VED, and HDCS, ranked at the top of the list in scenarios with medium or no noise. Filters that assume tubular shapes, such as MSE, MSF, MSS, OOF, RORPO, and VED, show a decrease in accuracy when considering patients with an AVM.

  8. Searching protein 3-D structures for optimal structure alignment using intelligent algorithms and data structures.

    Science.gov (United States)

    Novosád, Tomáš; Snášel, Václav; Abraham, Ajith; Yang, Jack Y

    2010-11-01

    In this paper, we present a novel algorithm for measuring protein similarity based on their 3-D structure (protein tertiary structure). The algorithm uses a suffix tree for discovering common parts of the main chains of all proteins appearing in the current Research Collaboratory for Structural Bioinformatics Protein Data Bank (PDB). By identifying these common parts, we build a vector model and use classical information retrieval (IR) algorithms based on the vector model to measure the similarity between proteins--all-to-all protein similarity. For the calculation of protein similarity, we use the term frequency × inverse document frequency (tf × idf) term weighting scheme and the cosine similarity measure. The goal of this paper is to introduce a new protein similarity metric based on suffix trees and IR methods. The whole current PDB database was used to demonstrate the very good time complexity of the algorithm as well as its high precision. We have chosen the structural classification of proteins (SCOP) database for verification of the precision of our algorithm because it is maintained primarily by humans. A further contribution of this paper is the ability to determine SCOP categories of proteins not included in the latest version of the SCOP database (v. 1.75) with nearly 100% precision.
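
    The vector-model step described above (tf × idf weighting plus cosine similarity) can be sketched as follows. The suffix-tree mining of common main-chain fragments is not reproduced; the toy term lists below are placeholders standing in for those fragments.

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """docs: dict name -> list of terms. Returns tf-idf weighted vectors."""
    n = len(docs)
    df = Counter(term for terms in docs.values() for term in set(terms))
    vectors = {}
    for name, terms in docs.items():
        tf = Counter(terms)
        vectors[name] = {t: tf[t] * math.log(n / df[t]) for t in tf}
    return vectors

def cosine(u, v):
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    norm = math.sqrt(sum(x * x for x in u.values())) * \
           math.sqrt(sum(x * x for x in v.values()))
    return dot / norm if norm else 0.0

# Toy stand-ins for common main-chain fragments found by the suffix tree.
proteins = {"1abc": ["f1", "f2", "f2", "f3"],
            "2xyz": ["f2", "f3", "f4"],
            "3pqr": ["f5", "f6"]}
vecs = tfidf_vectors(proteins)
print(cosine(vecs["1abc"], vecs["2xyz"]), cosine(vecs["1abc"], vecs["3pqr"]))
```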

  9. DESIGN OF A WEB SEMI-INTELLIGENT METADATA SEARCH MODEL APPLIED IN DATA WAREHOUSING SYSTEMS

    OpenAIRE

    Luna Ramírez, Enrique; Ambriz Delgadillo, Humberto; Nungaray Ornelas, J. Antonio; Álvarez Rodríguez, Francisco Javier; Jorge N Mondragón Reyes

    2008-01-01

    In this paper, the design of a Web metadata search model with semi-intelligent features is proposed. The search model is oriented to retrieve the metadata associated to a data warehouse in a fast, flexible and reliable way. Our proposal includes a set of distinctive functionalities, which consist of the temporary storage of the frequently used metadata in an exclusive store, different to the global data warehouse metadata store, and of the use of control processes to retrieve information from...

  10. Algorithms

    Indian Academy of Sciences (India)

    Algorithms. 3. Procedures and Recursion. R K Shyamasundar. In this article we introduce procedural abstraction and illustrate its uses. Further, we illustrate the notion of recursion which is one of the most useful features of procedural abstraction. Procedures. Let us consider a variation of the problem of summing the first M…
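
    As a minimal illustration of the recursive procedural abstraction the column introduces, a recursive procedure for summing the first M numbers (presumably the example the truncated snippet refers to) might look like the sketch below; the function name is ours, not the article's notation.

```python
def sum_first(m):
    """Recursively sum the first m natural numbers: 1 + 2 + ... + m."""
    if m <= 0:                        # base case
        return 0
    return m + sum_first(m - 1)       # recursive step

print(sum_first(10))  # 55
```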

  11. Algorithms

    Indian Academy of Sciences (India)

    number of elements. We shall illustrate the widely used matrix multiplication algorithm using two-dimensional arrays in the following. Consider two matrices A and B of integer type with dimensions m x n and n x p respectively. Then, multiplication of A by B, denoted A x B, is defined by matrix C of dimension m x p, where…
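
    The multiplication of an m x n matrix A by an n x p matrix B described above can be sketched with two-dimensional arrays as follows; this is a plain illustrative version, not the article's own code.

```python
def matmul(a, b):
    """Multiply an m x n matrix `a` by an n x p matrix `b` (lists of lists)."""
    m, n, p = len(a), len(b), len(b[0])
    c = [[0] * p for _ in range(m)]
    for i in range(m):
        for j in range(p):
            for k in range(n):            # c[i][j] = sum_k a[i][k] * b[k][j]
                c[i][j] += a[i][k] * b[k][j]
    return c

print(matmul([[1, 2, 3], [4, 5, 6]], [[7, 8], [9, 10], [11, 12]]))
# [[58, 64], [139, 154]]
```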

  12. An Applied Research of Decision Tree Algorithm in Track and Field Equipment Training

    Directory of Open Access Journals (Sweden)

    Liu Shaoqing

    2015-01-01

    Full Text Available This paper conducts a study on the application of the ID3 decision tree algorithm to track and field equipment training. For the selection of the elements used by the decision tree, the equipment is divided into track training equipment, field events training equipment and auxiliary training equipment according to its properties. A decision tree that takes track training equipment as the root node is obtained while lowering the computation cost through the selection of data as well as the application and optimization of the ID3 algorithm model.

  13. A genetic algorithm based global search strategy for population pharmacokinetic/pharmacodynamic model selection.

    Science.gov (United States)

    Sale, Mark; Sherer, Eric A

    2015-01-01

    The current algorithm for selecting a population pharmacokinetic/pharmacodynamic model is based on the well-established forward addition/backward elimination method. A central strength of this approach is the opportunity for a modeller to continuously examine the data and postulate new hypotheses to explain observed biases. This algorithm has served the modelling community well, but the model selection process has essentially remained unchanged for the last 30 years. During this time, more robust approaches to model selection have been made feasible by new technology and dramatic increases in computation speed. We review these methods, with emphasis on genetic algorithm approaches and discuss the role these methods may play in population pharmacokinetic/pharmacodynamic model selection. © 2013 The British Pharmacological Society.

  14. Converting optical scanning holograms of real objects to binary Fourier holograms using an iterative direct binary search algorithm.

    Science.gov (United States)

    Leportier, Thibault; Park, Min Chul; Kim, You Seok; Kim, Taegeun

    2015-02-09

    In this paper, we present a three-dimensional holographic imaging system. The proposed approach records a complex hologram of a real object using optical scanning holography, converts the complex form to binary data, and then reconstructs the recorded hologram using a spatial light modulator (SLM). The conversion from the recorded hologram to a binary hologram is achieved using a direct binary search algorithm. We present experimental results that verify the efficacy of our approach. To the best of our knowledge, this is the first time that a hologram of a real object has been reconstructed using a binary SLM.
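
    A direct binary search of the kind mentioned above iteratively flips hologram pixels and keeps a flip only when it lowers the reconstruction error. The sketch below illustrates that loop with a stand-in FFT-based reconstruction and squared-error metric; it is not the authors' optical-scanning-holography pipeline.

```python
import numpy as np

def direct_binary_search(target, binary, reconstruct, sweeps=3):
    """Flip one pixel at a time; keep the flip only if the reconstruction
    error against `target` decreases (illustrative DBS sketch)."""
    error = np.sum(np.abs(reconstruct(binary) - target) ** 2)
    for _ in range(sweeps):
        for idx in np.ndindex(binary.shape):
            binary[idx] ^= 1                      # trial flip
            trial = np.sum(np.abs(reconstruct(binary) - target) ** 2)
            if trial < error:
                error = trial                     # accept the flip
            else:
                binary[idx] ^= 1                  # revert
    return binary, error

# Toy usage: approximate a target far-field intensity with a binary mask.
rng = np.random.default_rng(0)
target = np.abs(np.fft.fft2(rng.random((8, 8))))
recon = lambda h: np.abs(np.fft.fft2(h))
hologram, err = direct_binary_search(target, rng.integers(0, 2, (8, 8)), recon)
print(err)
```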

  15. Advanced Modeling System for Optimization of Wind Farm Layout and Wind Turbine Sizing Using a Multi-Level Extended Pattern Search Algorithm

    Energy Technology Data Exchange (ETDEWEB)

    DuPont, Bryony; Cagan, Jonathan; Moriarty, Patrick

    2016-07-01

    This paper presents a system of modeling advances that can be applied in the computational optimization of wind plants. These modeling advances include accurate cost and power modeling, partial wake interaction, and the effects of varying atmospheric stability. To validate the use of this advanced modeling system, it is employed within an Extended Pattern Search (EPS)-Multi-Agent System (MAS) optimization approach for multiple wind scenarios. The wind farm layout optimization problem involves optimizing the position and size of wind turbines such that the aerodynamic effects of upstream turbines are reduced, which increases the effective wind speed and resultant power at each turbine. The EPS-MAS optimization algorithm employs a profit objective, and an overarching search determines individual turbine positions, with a concurrent EPS-MAS determining the optimal hub height and rotor diameter for each turbine. Two wind cases are considered: (1) constant, unidirectional wind, and (2) three discrete wind speeds and varying wind directions, each of which have a probability of occurrence. Results show the advantages of applying the series of advanced models compared to previous application of an EPS with less advanced models to wind farm layout optimization, and imply best practices for computational optimization of wind farms with improved accuracy.

  16. Searching of fuel recharges by means of genetic algorithms and neural networks in BWRs; Busqueda de recargas de combustible mediante algoritmos geneticos y redes neuronales en BWRs

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz S, J.J.; Montes T, J.L.; Castillo M, J.A.; Perusquia del C, R. [ININ, Carretera Mexico-Toluca Km. 36.5, 52045 Estado de Mexico (Mexico)

    2004-07-01

    In this work, improvements to the RENOR and RECOPIA systems, which were developed to optimize fuel reloads in boiling water reactors, are presented. The RENOR system is based on a multi-state recurrent neural network, while RECOPIA is based on a genetic algorithm. The new versions of these systems incorporate the evaluation of the cold shutdown margin and the hot excess reactivity. The new systems were applied to an operation cycle of Unit 1 of the Laguna Verde nuclear power station. The fuel reloads obtained by both methods are compared, and it is observed that RENOR performs better than RECOPIA due to the nature of its search process. RECOPIA requires approximately 1.4 times more time than RENOR to find a satisfactory fuel reload. (Author)

  17. Effective and extensible feature extraction method using genetic algorithm-based frequency-domain feature search for epileptic EEG multiclassification.

    Science.gov (United States)

    Wen, Tingxi; Zhang, Zhongnan

    2017-05-01

    In this paper, genetic algorithm-based frequency-domain feature search (GAFDS) method is proposed for the electroencephalogram (EEG) analysis of epilepsy. In this method, frequency-domain features are first searched and then combined with nonlinear features. Subsequently, these features are selected and optimized to classify EEG signals. The extracted features are analyzed experimentally. The features extracted by GAFDS show remarkable independence, and they are superior to the nonlinear features in terms of the ratio of interclass distance and intraclass distance. Moreover, the proposed feature search method can search for features of instantaneous frequency in a signal after Hilbert transformation. The classification results achieved using these features are reasonable; thus, GAFDS exhibits good extensibility. Multiple classical classifiers (i.e., k-nearest neighbor, linear discriminant analysis, decision tree, AdaBoost, multilayer perceptron, and Naïve Bayes) achieve satisfactory classification accuracies by using the features generated by the GAFDS method and the optimized feature selection. The accuracies for 2-classification and 3-classification problems may reach up to 99% and 97%, respectively. Results of several cross-validation experiments illustrate that GAFDS is effective in the extraction of effective features for EEG classification. Therefore, the proposed feature selection and optimization model can improve classification accuracy.

  18. Effective and extensible feature extraction method using genetic algorithm-based frequency-domain feature search for epileptic EEG multiclassification

    Science.gov (United States)

    Wen, Tingxi; Zhang, Zhongnan

    2017-01-01

    Abstract In this paper, genetic algorithm-based frequency-domain feature search (GAFDS) method is proposed for the electroencephalogram (EEG) analysis of epilepsy. In this method, frequency-domain features are first searched and then combined with nonlinear features. Subsequently, these features are selected and optimized to classify EEG signals. The extracted features are analyzed experimentally. The features extracted by GAFDS show remarkable independence, and they are superior to the nonlinear features in terms of the ratio of interclass distance and intraclass distance. Moreover, the proposed feature search method can search for features of instantaneous frequency in a signal after Hilbert transformation. The classification results achieved using these features are reasonable; thus, GAFDS exhibits good extensibility. Multiple classical classifiers (i.e., k-nearest neighbor, linear discriminant analysis, decision tree, AdaBoost, multilayer perceptron, and Naïve Bayes) achieve satisfactory classification accuracies by using the features generated by the GAFDS method and the optimized feature selection. The accuracies for 2-classification and 3-classification problems may reach up to 99% and 97%, respectively. Results of several cross-validation experiments illustrate that GAFDS is effective in the extraction of effective features for EEG classification. Therefore, the proposed feature selection and optimization model can improve classification accuracy. PMID:28489789

  19. A semi-supervised segmentation algorithm as applied to k-means ...

    African Journals Online (AJOL)

    ... study the newly proposed semi-supervised segmentation algorithm outperforms both an unsupervised and a supervised segmentation technique, when compared by using the Gini coefficient as performance measure of the resulting predictive models. Key words: Banking, clustering, multivariate statistics, data mining ...

  20. A computationally efficient depression-filling algorithm for digital elevation models, applied to proglacial lake drainage

    NARCIS (Netherlands)

    Berends, Constantijn J.; Van De Wal, Roderik S W

    2016-01-01

    Many processes govern the deglaciation of ice sheets. One of the processes that is usually ignored is the calving of ice in lakes that temporarily surround the ice sheet. In order to capture this process a "flood-fill algorithm" is needed. Here we present and evaluate several optimizations to a

  1. Natural search algorithms as a bridge between organisms, evolution, and ecology.

    Science.gov (United States)

    Hein, Andrew M; Carrara, Francesco; Brumley, Douglas R; Stocker, Roman; Levin, Simon A

    2016-08-23

    The ability to navigate is a hallmark of living systems, from single cells to higher animals. Searching for targets, such as food or mates in particular, is one of the fundamental navigational tasks many organisms must execute to survive and reproduce. Here, we argue that a recent surge of studies of the proximate mechanisms that underlie search behavior offers a new opportunity to integrate the biophysics and neuroscience of sensory systems with ecological and evolutionary processes, closing a feedback loop that promises exciting new avenues of scientific exploration at the frontier of systems biology.

  2. Shifting Inductive Bias with Success-Story Algorithm, Adaptive Levin Search, and Incremental Self-Improvement

    NARCIS (Netherlands)

    Schmidhuber, J.; Zhao, J.; Wiering, M.A.

    1997-01-01

    We study task sequences that allow for speeding up the learner's average reward intake through appropriate shifts of inductive bias (changes of the learner's policy). To evaluate long-term effects of bias shifts setting the stage for later bias shifts, we use the "success-story algorithm" (SSA). SSA…

  3. VisualRank: applying PageRank to large-scale image search.

    Science.gov (United States)

    Jing, Yushi; Baluja, Shumeet

    2008-11-01

    Because of the relative ease in understanding and processing text, commercial image-search systems often rely on techniques that are largely indistinguishable from text-search. Recently, academic studies have demonstrated the effectiveness of employing image-based features to provide alternative or additional signals. However, it remains uncertain whether such techniques will generalize to a large number of popular web queries, and whether the potential improvement to search quality warrants the additional computational cost. In this work, we cast the image-ranking problem into the task of identifying "authority" nodes on an inferred visual similarity graph and propose VisualRank to analyze the visual link structures among images. The images found to be "authorities" are chosen as those that answer the image-queries well. To understand the performance of such an approach in a real system, we conducted a series of large-scale experiments based on the task of retrieving images for 2000 of the most popular products queries. Our experimental results show significant improvement, in terms of user satisfaction and relevancy, in comparison to the most recent Google Image Search results. Maintaining modest computational cost is vital to ensuring that this procedure can be used in practice; we describe the techniques required to make this system practical for large scale deployment in commercial search engines.
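
    The core computation behind such an approach is a PageRank-style power iteration over an image-similarity graph. The sketch below runs a generic damped power iteration on a small, made-up similarity matrix; the visual-feature extraction that VisualRank uses to build the graph is not reproduced.

```python
import numpy as np

def visual_rank(similarity, damping=0.85, iterations=100):
    """PageRank-style authority scores from a symmetric image-similarity matrix."""
    n = similarity.shape[0]
    # Column-normalise so each column sums to 1 (a stochastic transition matrix).
    transition = similarity / similarity.sum(axis=0, keepdims=True)
    scores = np.full(n, 1.0 / n)
    for _ in range(iterations):
        scores = damping * transition @ scores + (1 - damping) / n
    return scores

# Toy similarity matrix for four images; image 0 resembles most others.
S = np.array([[0.0, 0.9, 0.8, 0.7],
              [0.9, 0.0, 0.3, 0.1],
              [0.8, 0.3, 0.0, 0.2],
              [0.7, 0.1, 0.2, 0.0]])
print(visual_rank(S))  # image 0 receives the highest "authority" score
```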

  4. Hybrid self organizing migrating algorithm - Scatter search for the task of capacitated vehicle routing problem

    Science.gov (United States)

    Davendra, Donald; Zelinka, Ivan; Senkerik, Roman; Jasek, Roman; Bialic-Davendra, Magdalena

    2012-11-01

    One of the newly emerging strategies for optimization is the hybridization of existing metaheuristics. This research combines the unique solution-space-sampling paradigm of SOMA with the memory-retention capabilities of Scatter Search for the capacitated vehicle routing problem. The new hybrid heuristic is tested on the Taillard sets and obtains good results.

  5. Attitude algorithm and initial alignment method for SINS applied in short-range aircraft

    Science.gov (United States)

    Zhang, Rong-Hui; He, Zhao-Cheng; You, Feng; Chen, Bo

    2017-07-01

    This paper presents an attitude solution algorithm based on the Micro-Electro-Mechanical System and the quaternion method. The numerical calculation and engineering implementation were carried out by adopting a fourth-order Runge-Kutta algorithm in the digital signal processor. The state-space mathematical model of initial alignment on a static base was established, and an initial alignment method based on the Kalman filter was proposed. Based on a hardware-in-the-loop simulation platform, a short-range flight simulation test and an actual flight test were carried out. The results show that the errors of the pitch, yaw and roll angles converge quickly, and the fitting rate between the flight simulation and the flight test is more than 85%.
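
    One common way to realize a quaternion attitude update with a fourth-order Runge-Kutta step, as mentioned above, is to integrate the quaternion kinematic equation dq/dt = 0.5 * q ⊗ [0, ω]. The sketch below shows a generic version of that step with an assumed constant gyro rate; it is not the authors' DSP implementation.

```python
import numpy as np

def quat_mult(q, r):
    """Hamilton product of quaternions q = [w, x, y, z]."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def qdot(q, omega):
    """Quaternion kinematics: dq/dt = 0.5 * q (x) [0, omega]."""
    return 0.5 * quat_mult(q, np.concatenate(([0.0], omega)))

def rk4_step(q, omega, dt):
    """One fourth-order Runge-Kutta attitude update (omega assumed constant over dt)."""
    k1 = qdot(q, omega)
    k2 = qdot(q + 0.5 * dt * k1, omega)
    k3 = qdot(q + 0.5 * dt * k2, omega)
    k4 = qdot(q + dt * k3, omega)
    q_new = q + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
    return q_new / np.linalg.norm(q_new)   # renormalise to unit length

q = np.array([1.0, 0.0, 0.0, 0.0])          # initial attitude
omega = np.array([0.0, 0.0, np.pi / 2])     # assumed 90 deg/s yaw rate from the gyro
for _ in range(100):
    q = rk4_step(q, omega, 0.01)            # 1 s of integration
print(q)  # ~ [cos(45 deg), 0, 0, sin(45 deg)]
```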

  6. Distributed parallel processing applied to an implicit multigrid Euler/Navier-Stokes algorithm

    Science.gov (United States)

    Tysinger, T. L.; Caughey, D. A.

    1993-01-01

    An implicit multigrid algorithm for the solution of the Euler and Navier-Stokes equations has been implemented within the framework of multiple block-structured grids in which the physical domain is spatially decomposed into several blocks and the solution is advanced in parallel on each block. Utilities have been developed to implement such a scheme in a distributed computing environment. The multi-block algorithm is designed so that the explicit residual calculation is identical to that of single-block scheme, and therefore converged solutions for both schemes must be the same. To accelerate convergence, synchronous and asynchronous multigrid strategies are implemented. Significant speedups have been achieved in a multiple processor environment, while convergence rates similar to those of the single-block scheme are observed.

  7. Algorithm applied in dialogue with Stakeholders: a case study in a business tourism sector

    Directory of Open Access Journals (Sweden)

    Ana María Gil Lafuente

    2010-12-01

    Full Text Available According to numerous scientific studies, one of the most important points in the area of sustainability in business is dialogue with stakeholders. Based on Stakeholder Theory, we analyze corporate sustainability and the process by which a company in the tourism sector prepares a report in accordance with the guidelines of the G3 guide - Global Reporting Initiative. An empirical study seeks to understand the expectations of stakeholders regarding the implementation of the contents of the sustainability report. To achieve the proposed aim we use the «Expertons Method», an algorithm that allows the aggregation of the opinions of various experts on the subject and represents an important extension of fuzzy subsets for aggregation processes. At the end of our study, we present the results of using this algorithm, the contributions and future research.

  8. Congestion management of deregulated power systems by optimal setting of Interline Power Flow Controller using Gravitational Search algorithm

    Directory of Open Access Journals (Sweden)

    Akanksha Mishra

    2017-05-01

    Full Text Available In a deregulated electricity market it may at times become difficult to dispatch all the required power that is scheduled to flow due to congestion in transmission lines. An Interline Power Flow Controller (IPFC) can be used to reduce the system loss and power flow in the heavily loaded line, improve stability and loadability of the system. This paper proposes a Disparity Line Utilization Factor (DLUF) for the optimal placement and Gravitational Search algorithm based optimal tuning of the IPFC to control the congestion in transmission lines. DLUF ranks the transmission lines in terms of relative line congestion. The IPFC is accordingly placed in the most congested and the least congested line connected to the same bus. Optimal sizing of the IPFC is carried out using the Gravitational Search algorithm. A multi-objective function has been chosen for tuning the parameters of the IPFC. The proposed method is implemented on an IEEE-30 bus test system. Graphical representations have been included in the paper showing the reduction in LUF of the transmission lines after the placement of the IPFC. A reduction in active power and reactive power loss of the system by about 6% is observed after an optimally tuned IPFC has been included in the power system. The effectiveness of the proposed tuning method has also been shown in the paper through the reduction in the values of the objective functions.

  9. Multiple Harmonics Fitting Algorithms Applied to Periodic Signals Based on Hilbert-Huang Transform

    Directory of Open Access Journals (Sweden)

    Hui Wang

    2013-01-01

    Full Text Available A new generation of multipurpose measurement equipment is transforming the role of computers in instrumentation. The new features involve mixed devices, such as various kinds of sensors, analog-to-digital and digital-to-analog converters, and digital signal processing techniques, that are able to substitute typical discrete instruments like multimeters and analyzers. Signal-processing applications frequently use least-squares (LS) sine-fitting algorithms. Periodic signals may be interpreted as a sum of sine waves with multiple frequencies: the Fourier series. This paper describes a new sine fitting algorithm that is able to fit a multiharmonic acquired periodic signal. By means of a “sinusoidal wave” whose amplitude and phase are both transient, the “triangular wave” can be reconstructed on the basis of the Hilbert-Huang transform (HHT). This method can be used to test the effective number of bits (ENOB) of an analog-to-digital converter (ADC), avoiding the trouble of selecting initial values of the parameters and solving nonlinear equations. The simulation results show that the algorithm is precise and efficient. In the case of enough sampling points, even under the circumstances of a low-resolution signal with harmonic distortion present, the root mean square (RMS) error between the sampling data of the original “triangular wave” and the corresponding points of the fitted “sinusoidal wave” is remarkably small. This suggests that, for any periodic signal, the ENOB of a high-resolution ADC can be tested accurately.
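
    A least-squares multiharmonic fit of this general kind reduces to a linear regression on sine/cosine basis functions once the fundamental frequency is known. The sketch below assumes a known frequency and synthetic triangular-like data, and does not reproduce the paper's Hilbert-Huang step.

```python
import numpy as np

def multiharmonic_fit(t, y, f0, n_harmonics):
    """Least-squares fit of y(t) ~ offset + sum_k a_k cos(2*pi*k*f0*t) + b_k sin(2*pi*k*f0*t)."""
    columns = [np.ones_like(t)]
    for k in range(1, n_harmonics + 1):
        columns.append(np.cos(2 * np.pi * k * f0 * t))
        columns.append(np.sin(2 * np.pi * k * f0 * t))
    basis = np.column_stack(columns)
    coeffs, *_ = np.linalg.lstsq(basis, y, rcond=None)
    return coeffs, basis @ coeffs              # coefficients and fitted signal

# Synthetic periodic signal: fundamental plus a weak third harmonic, with noise.
t = np.linspace(0.0, 1.0, 500)
y = np.sin(2 * np.pi * 5 * t) - (1 / 9) * np.sin(2 * np.pi * 15 * t)
y += 0.01 * np.random.default_rng(1).standard_normal(t.size)
coeffs, fitted = multiharmonic_fit(t, y, f0=5.0, n_harmonics=3)
rms_error = np.sqrt(np.mean((y - fitted) ** 2))
print(coeffs.round(3), rms_error)
```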

  10. User Activity Recognition in Smart Homes Using Pattern Clustering Applied to Temporal ANN Algorithm

    Directory of Open Access Journals (Sweden)

    Serge Thomas Mickala Bourobou

    2015-05-01

    Full Text Available This paper discusses the possibility of recognizing and predicting user activities in the IoT (Internet of Things) based smart environment. The activity recognition is usually done through two steps: activity pattern clustering and activity type decision. Although many related works have been suggested, they had some limited performance because they focused only on one part between the two steps. This paper tries to find the best combination of a pattern clustering method and an activity decision algorithm among various existing works. For the first step, in order to classify so varied and complex user activities, we use a relevant and efficient unsupervised learning method called the K-pattern clustering algorithm. In the second step, the training of the smart environment for recognizing and predicting user activities inside his/her personal space is done by utilizing the artificial neural network based on Allen's temporal relations. The experimental results show that our combined method provides the higher recognition accuracy for various activities, as compared with other data mining classification algorithms. Furthermore, it is more appropriate for a dynamic environment like an IoT based smart home.

  11. User Activity Recognition in Smart Homes Using Pattern Clustering Applied to Temporal ANN Algorithm.

    Science.gov (United States)

    Bourobou, Serge Thomas Mickala; Yoo, Younghwan

    2015-05-21

    This paper discusses the possibility of recognizing and predicting user activities in the IoT (Internet of Things) based smart environment. The activity recognition is usually done through two steps: activity pattern clustering and activity type decision. Although many related works have been suggested, they had some limited performance because they focused only on one part between the two steps. This paper tries to find the best combination of a pattern clustering method and an activity decision algorithm among various existing works. For the first step, in order to classify so varied and complex user activities, we use a relevant and efficient unsupervised learning method called the K-pattern clustering algorithm. In the second step, the training of the smart environment for recognizing and predicting user activities inside his/her personal space is done by utilizing the artificial neural network based on Allen's temporal relations. The experimental results show that our combined method provides the higher recognition accuracy for various activities, as compared with other data mining classification algorithms. Furthermore, it is more appropriate for a dynamic environment like an IoT based smart home.

  12. Optimization of spatial light distribution through genetic algorithms for vision systems applied to quality control

    Science.gov (United States)

    Castellini, P.; Cecchini, S.; Stroppa, L.; Paone, N.

    2015-02-01

    The paper presents an adaptive illumination system for image quality enhancement in vision-based quality control systems. In particular, a spatial modulation of illumination intensity is proposed in order to improve image quality, thus compensating for different target scattering properties, local reflections and fluctuations of ambient light. The desired spatial modulation of illumination is obtained by a digital light projector, used to illuminate the scene with an arbitrary spatial distribution of light intensity, designed to improve feature extraction in the region of interest. The spatial distribution of illumination is optimized by running a genetic algorithm. An image quality estimator is used to close the feedback loop and to stop iterations once the desired image quality is reached. The technique proves particularly valuable for optimizing the spatial illumination distribution in the region of interest, with the remarkable capability of the genetic algorithm to adapt the light distribution to very different target reflectivity and ambient conditions. The final objective of the proposed technique is the improvement of the matching score in the recognition of parts through matching algorithms, hence of the diagnosis of machine vision-based quality inspections. The procedure has been validated both by a numerical model and by an experimental test, referring to a significant problem of quality control for the washing machine manufacturing industry: the recognition of a metallic clamp. Its applicability to other domains is also presented, specifically for the visual inspection of shoes with retro-reflective tape and T-shirts with paillettes.

  13. Finite-Time Performance of Local Search Algorithms: Theory and Application

    Science.gov (United States)

    2010-06-10

  14. The xyz algorithm for fast interaction search in high-dimensional data

    OpenAIRE

    Thanei, Gian-Andrea; Meinshausen, Nicolai; Shah, Rajen D.

    2016-01-01

    When performing regression on a dataset with $p$ variables, it is often of interest to go beyond using main linear effects and include interactions as products between individual variables. For small-scale problems, these interactions can be computed explicitly but this leads to a computational complexity of at least $\mathcal{O}(p^2)$ if done naively. This cost can be prohibitive if $p$ is very large. We introduce a new randomised algorithm that is able to discover interactions with high pro...

  15. Pathway Detection from Protein Interaction Networks and Gene Expression Data Using Color-Coding Methods and A* Search Algorithms

    Directory of Open Access Journals (Sweden)

    Cheng-Yu Yeh

    2012-01-01

    Full Text Available With the wide availability of protein interaction networks and microarray data, identifying the linear paths that have biological significance in search of a potential pathway is a challenging issue. We propose a color-coding method based on the characteristics of biological network topology and apply heuristic search to speed up the color-coding method. In the experiments, we tested our methods by applying them to two datasets: yeast and human prostate cancer networks with gene expression data. Comparisons of our method with other existing methods on known yeast MAPK pathways in terms of precision and recall show that we can find the maximum number of proteins and perform comparably well. Moreover, our method is more efficient than previous ones and detects paths of length 10 within 40 seconds using an Intel 1.73 GHz CPU and 1 GB of main memory running under the Windows operating system.

  16. Forecasting Hoabinh Reservoir’s Incoming Flow: An Application of Neural Networks with the Cuckoo Search Algorithm

    Directory of Open Access Journals (Sweden)

    Jeng-Fung Chen

    2014-11-01

    Full Text Available The accuracy of reservoir flow forecasting has the most significant influence on the assurance of stability and annual operations of hydro-constructions. For instance, accurate forecasting on the ebb and flow of Vietnam’s Hoabinh Reservoir can aid in the preparation and prevention of lowland flooding and drought, as well as regulating electric energy. This raises the need to propose a model that accurately forecasts the incoming flow of the Hoabinh Reservoir. In this study, a solution to this problem based on a neural network with the Cuckoo Search (CS) algorithm is presented. In particular, we used hydrographic data and predicted total incoming flows of the Hoabinh Reservoir over a period of 10 days. The Cuckoo Search algorithm was utilized to train the feedforward neural network (FNN) for prediction. The algorithm optimized the weights between layers and the biases of the neural network. Different forecasting models for the three scenarios were developed. The constructed models have shown high forecasting performance based on the performance indices calculated. These results were also compared with those obtained from neural networks trained by particle swarm optimization (PSO) and back-propagation (BP), indicating that the proposed approach performed more effectively. Based on the experimental results, the scenario using the rainfall and the flow as input yielded the highest forecasting accuracy when compared with other scenarios. The performance criteria RMSE, MAPE, and R obtained by the CS-FNN in this scenario were calculated as 48.7161, 0.067268 and 0.8965, respectively. These results were highly correlated to actual values. It is expected that this work may be useful for hydrographic forecasting.
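
    For reference, the three performance indices quoted above (RMSE, MAPE, and the correlation coefficient R) can be computed as in the sketch below; the sample inflow series is invented, and the cuckoo-search-trained network itself is not reproduced.

```python
import numpy as np

def forecast_metrics(actual, predicted):
    """RMSE, MAPE (as a fraction) and Pearson correlation R between two series."""
    actual, predicted = np.asarray(actual, float), np.asarray(predicted, float)
    rmse = np.sqrt(np.mean((actual - predicted) ** 2))
    mape = np.mean(np.abs((actual - predicted) / actual))
    r = np.corrcoef(actual, predicted)[0, 1]
    return rmse, mape, r

# Invented 10-day inflow figures, just to exercise the function.
observed = [320, 340, 365, 390, 410, 400, 380, 360, 355, 350]
forecast = [310, 345, 370, 380, 420, 395, 375, 365, 350, 348]
print(forecast_metrics(observed, forecast))
```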

  17. A Novel Hierarchical Model to Locate Health Care Facilities with Fuzzy Demand Solved by Harmony Search Algorithm

    Directory of Open Access Journals (Sweden)

    Mehdi Alinaghian

    2014-08-01

    Full Text Available In the health care field, losses resulting from failing to establish facilities in suitable locations and in the required number go beyond cost and quality of service, leading to increased mortality and the spread of diseases. Facility location models therefore have special importance in this area. In this paper, a successively inclusive hierarchical model for the location of health centers, in terms of the transfer of patients from a lower level to a higher level of health centers, has been developed. Since determining the exact future demand for health care is difficult, and in order to make the model close to the real conditions of demand uncertainty, a fuzzy programming model based on credibility theory is considered. To evaluate the proposed model, several small-size numerical examples are solved. In order to solve large-scale problems, a meta-heuristic algorithm based on harmony search was developed in conjunction with the GAMS software, which indicates the performance of the proposed algorithm.
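
    For readers unfamiliar with the metaheuristic used above, the sketch below shows a bare-bones harmony search loop for a generic continuous minimization problem. The parameter names (memory size, HMCR, PAR, bandwidth) follow common usage, and the sphere objective is only a placeholder, not the paper's fuzzy facility-location model.

```python
import random

def harmony_search(objective, dim, bounds, memory_size=20, hmcr=0.9,
                   par=0.3, bandwidth=0.05, iterations=2000):
    """Minimal harmony search for minimising `objective` over a box domain."""
    lo, hi = bounds
    memory = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(memory_size)]
    memory.sort(key=objective)
    for _ in range(iterations):
        new = []
        for d in range(dim):
            if random.random() < hmcr:                  # pick from harmony memory
                value = random.choice(memory)[d]
                if random.random() < par:               # pitch adjustment
                    value += random.uniform(-bandwidth, bandwidth)
            else:                                       # random consideration
                value = random.uniform(lo, hi)
            new.append(min(max(value, lo), hi))
        if objective(new) < objective(memory[-1]):      # replace the worst harmony
            memory[-1] = new
            memory.sort(key=objective)
    return memory[0], objective(memory[0])

# Placeholder objective: a simple sphere function.
random.seed(0)
print(harmony_search(lambda x: sum(v * v for v in x), dim=5, bounds=(-10, 10)))
```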

  18. A spectral clustering search algorithm for predicting shallow landslide size and location

    Science.gov (United States)

    Dino Bellugi; David G. Milledge; William E. Dietrich; Jim A. McKean; J. Taylor Perron; Erik B. Sudderth; Brian Kazian

    2015-01-01

    The potential hazard and geomorphic significance of shallow landslides depend on their location and size. Commonly applied one-dimensional stability models do not include lateral resistances and cannot predict landslide size. Multi-dimensional models must be applied to specific geometries, which are not known a priori, and testing all possible geometries is...

  19. Solving the Vehicle Routing Problem with Stochastic Demands via Hybrid Genetic Algorithm-Tabu Search

    OpenAIRE

    Ismail, Z.; Irhamah

    2008-01-01

    This study considers a version of the stochastic vehicle routing problem where customer demands are random variables with known probability distribution. A new scheme based on a hybrid GA and Tabu Search heuristic is proposed for this problem under a priori approach with preventive restocking. The relative performance of the proposed HGATS is compared to each GA and TS alone, on a set of randomly generated problems following some discrete probability distributions. The problem data are inspir...

  20. Applying active learning to high-throughput phenotyping algorithms for electronic health records data.

    Science.gov (United States)

    Chen, Yukun; Carroll, Robert J; Hinz, Eugenia R McPeek; Shah, Anushi; Eyler, Anne E; Denny, Joshua C; Xu, Hua

    2013-12-01

    Generalizable, high-throughput phenotyping methods based on supervised machine learning (ML) algorithms could significantly accelerate the use of electronic health records data for clinical and translational research. However, they often require large numbers of annotated samples, which are costly and time-consuming to review. We investigated the use of active learning (AL) in ML-based phenotyping algorithms. We integrated an uncertainty sampling AL approach with support vector machines-based phenotyping algorithms and evaluated its performance using three annotated disease cohorts including rheumatoid arthritis (RA), colorectal cancer (CRC), and venous thromboembolism (VTE). We investigated performance using two types of feature sets: unrefined features, which contained at least all clinical concepts extracted from notes and billing codes; and a smaller set of refined features selected by domain experts. The performance of the AL was compared with a passive learning (PL) approach based on random sampling. Our evaluation showed that AL outperformed PL on three phenotyping tasks. When unrefined features were used in the RA and CRC tasks, AL reduced the number of annotated samples required to achieve an area under the curve (AUC) score of 0.95 by 68% and 23%, respectively. AL also achieved a reduction of 68% for VTE with an optimal AUC of 0.70 using refined features. As expected, refined features improved the performance of phenotyping classifiers and required fewer annotated samples. This study demonstrated that AL can be useful in ML-based phenotyping methods. Moreover, AL and feature engineering based on domain knowledge could be combined to develop efficient and generalizable phenotyping methods.
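
    An uncertainty-sampling loop of the kind described above can be sketched with a linear SVM as follows; the synthetic data, initial seed set, and query budget are placeholders, and the clinical feature engineering is of course not reproduced.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC

# Synthetic stand-in for an annotated phenotyping cohort.
X, y = make_classification(n_samples=2000, n_features=30, random_state=0)
# Seed the labeled set with a few "expert-reviewed" records from each class.
labeled = list(np.where(y == 0)[0][:10]) + list(np.where(y == 1)[0][:10])
pool = [i for i in range(len(y)) if i not in labeled]

for round_ in range(10):                     # annotation budget: 10 rounds of 20
    clf = SVC(kernel="linear").fit(X[labeled], y[labeled])
    margins = np.abs(clf.decision_function(X[pool]))
    # Uncertainty sampling: query the pool samples closest to the decision boundary.
    query = [pool[i] for i in np.argsort(margins)[:20]]
    labeled.extend(query)                    # simulate expert review of these records
    pool = [i for i in pool if i not in query]
    print(f"round {round_}: {len(labeled)} labeled, "
          f"accuracy on remaining pool = {clf.score(X[pool], y[pool]):.3f}")
```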