WorldWideScience

Sample records for greedy search algorithm

  1. Synthesis of Greedy Algorithms Using Dominance Relations

    Science.gov (United States)

    Nedunuri, Srinivas; Smith, Douglas R.; Cook, William R.

    2010-01-01

Greedy algorithms exploit problem structure and constraints to achieve linear-time performance. Yet there is still no completely satisfactory way of constructing greedy algorithms. For example, the Greedy Algorithm of Edmonds depends upon translating a problem into an algebraic structure called a matroid, but the existence of such a translation can be as hard to determine as the existence of a greedy algorithm itself. An alternative characterization of greedy algorithms is in terms of dominance relations, a well-known algorithmic technique used to prune search spaces. We demonstrate a process by which dominance relations can be methodically derived for a number of greedy algorithms, including activity selection and prefix-free codes. By incorporating our approach into an existing framework for algorithm synthesis, we demonstrate that it could be the basis for an effective engineering method for greedy algorithms. We also compare our approach with other characterizations of greedy algorithms.
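
    The activity-selection example mentioned above admits a compact illustration of a greedy choice justified by a dominance argument: an activity that finishes earliest dominates any overlapping alternative. The sketch below is illustrative only and is not taken from the paper; the function name and data layout are invented for the example.

```python
# Illustrative sketch (not from the paper): the classic greedy choice for
# activity selection, where "finishes earlier" acts as a dominance relation --
# an activity that finishes no later than another can always replace it.

def select_activities(activities):
    """activities: list of (start, finish) tuples; returns a maximum-size
    subset of pairwise non-overlapping activities."""
    selected = []
    last_finish = float("-inf")
    # Sorting by finish time lets the dominant (earliest-finishing) candidate
    # be chosen first at every step.
    for start, finish in sorted(activities, key=lambda a: a[1]):
        if start >= last_finish:
            selected.append((start, finish))
            last_finish = finish
    return selected

print(select_activities([(1, 4), (3, 5), (0, 6), (5, 7), (3, 9), (5, 9), (6, 10), (8, 11)]))
```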

  2. Optimization by GRASP greedy randomized adaptive search procedures

    CERN Document Server

    Resende, Mauricio G C

    2016-01-01

This is the first book to cover GRASP (Greedy Randomized Adaptive Search Procedures), a metaheuristic that has enjoyed wide success in practice with a broad range of applications to real-world combinatorial optimization problems. The state-of-the-art coverage and carefully crafted pedagogical style make this book highly accessible as an introductory text, not only to GRASP but also to combinatorial optimization, greedy algorithms, local search, and path-relinking, as well as to heuristics and metaheuristics in general. The focus is on algorithmic and computational aspects of applied optimization with GRASP, with emphasis given to the end-user, providing sufficient information on the broad spectrum of advances in applied optimization with GRASP. For the more advanced reader, chapters on hybridization with path-relinking and on parallel and continuous GRASP present these topics in a clear and concise fashion. Additionally, the book offers a very complete annotated bibliography of GRASP and combinatorial optimizat...
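
    As a rough illustration of the two GRASP phases the book covers (greedy randomized construction with a restricted candidate list, followed by local search), here is a minimal generic skeleton. It is a sketch under assumed interfaces: `cost` scores partial solutions, `local_search` is any improvement procedure supplied by the caller, and `alpha` controls the greediness of the restricted candidate list; none of these names come from the book.

```python
import random

def grasp(candidates, cost, local_search, iterations=100, alpha=0.3, seed=0):
    """Generic GRASP skeleton: repeat (randomized greedy construction,
    local search) and keep the best solution found."""
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(iterations):
        # --- construction phase: build a solution element by element ---
        solution, remaining = [], list(candidates)
        while remaining:
            scores = {c: cost(solution + [c]) for c in remaining}
            lo, hi = min(scores.values()), max(scores.values())
            # restricted candidate list: elements within alpha of the best score
            rcl = [c for c, s in scores.items() if s <= lo + alpha * (hi - lo)]
            chosen = rng.choice(rcl)
            solution.append(chosen)
            remaining.remove(chosen)
        # --- local search phase ---
        solution = local_search(solution)
        c = cost(solution)
        if c < best_cost:
            best, best_cost = solution, c
    return best, best_cost
```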

  3. Greedy Algorithms for Nonnegativity-Constrained Simultaneous Sparse Recovery

    Science.gov (United States)

    Kim, Daeun; Haldar, Justin P.

    2016-01-01

    This work proposes a family of greedy algorithms to jointly reconstruct a set of vectors that are (i) nonnegative and (ii) simultaneously sparse with a shared support set. The proposed algorithms generalize previous approaches that were designed to impose these constraints individually. Similar to previous greedy algorithms for sparse recovery, the proposed algorithms iteratively identify promising support indices. In contrast to previous approaches, the support index selection procedure has been adapted to prioritize indices that are consistent with both the nonnegativity and shared support constraints. Empirical results demonstrate for the first time that the combined use of simultaneous sparsity and nonnegativity constraints can substantially improve recovery performance relative to existing greedy algorithms that impose less signal structure. PMID:26973368

  4. Efficient and accurate Greedy Search Methods for mining functional modules in protein interaction networks.

    Science.gov (United States)

    He, Jieyue; Li, Chaojun; Ye, Baoliu; Zhong, Wei

    2012-06-25

Most computational algorithms focus on detecting highly connected subgraphs in PPI networks as protein complexes but ignore their inherent organization, and many of these algorithms are computationally expensive. Recent analysis, however, indicates that experimentally detected protein complexes generally contain core/attachment structures. In this paper, a Greedy Search Method based on Core-Attachment structure (GSM-CA) is proposed. The GSM-CA method detects densely connected regions in large protein-protein interaction networks based on the edge weight and two criteria for determining core nodes and attachment nodes. GSM-CA improves prediction accuracy compared to similar module detection approaches, but it is computationally expensive. Many module detection approaches are based on traditional hierarchical methods, which are also computationally inefficient because the hierarchical tree structure they produce cannot provide adequate information to identify whether a network has a module structure or not. To speed up the computation, the Greedy Search Method based on Fast Clustering (GSM-FC) is proposed in this work. The edge-weight-based GSM-FC method uses a greedy procedure to traverse all edges just once in order to separate the network into a suitable set of modules. The proposed methods are applied to the protein interaction network of S. cerevisiae. Experimental results indicate that many significant functional modules are detected, most of which match known complexes, and that the GSM-FC algorithm is faster and more accurate than competing algorithms. Based on the new edge weight definition, the proposed algorithm takes advantage of the greedy search procedure to separate the network into a suitable set of modules, and the identified modules are statistically significant. The algorithm can reduce the ...

  5. An efficient community detection algorithm using greedy surprise maximization

    International Nuclear Information System (INIS)

    Jiang, Yawen; Jia, Caiyan; Yu, Jian

    2014-01-01

    Community detection is an important and crucial problem in complex network analysis. Although classical modularity function optimization approaches are widely used for identifying communities, the modularity function (Q) suffers from its resolution limit. Recently, the surprise function (S) was experimentally proved to be better than the Q function. However, up until now, there has been no algorithm available to perform searches to directly determine the maximal surprise values. In this paper, considering the superiority of the S function over the Q function, we propose an efficient community detection algorithm called AGSO (algorithm based on greedy surprise optimization) and its improved version FAGSO (fast-AGSO), which are based on greedy surprise optimization and do not suffer from the resolution limit. In addition, (F)AGSO does not need the number of communities K to be specified in advance. Tests on experimental networks show that (F)AGSO is able to detect optimal partitions in both simple and even more complex networks. Moreover, algorithms based on surprise maximization perform better than those algorithms based on modularity maximization, including Blondel–Guillaume–Lambiotte–Lefebvre (BGLL), Clauset–Newman–Moore (CNM) and the other state-of-the-art algorithms such as Infomap, order statistics local optimization method (OSLOM) and label propagation algorithm (LPA). (paper)
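
    The greedy optimization pattern described here, starting from singleton communities and repeatedly applying the merge that most improves a quality function, can be sketched as follows. This is an illustrative sketch only: the `quality` callback stands in for the surprise function S, whose exact formula is not reproduced from the paper, and the brute-force merge scan ignores the efficiency refinements of (F)AGSO.

```python
def greedy_agglomerative(nodes, edges, quality):
    """Greedy agglomeration: start from singletons and repeatedly perform the
    single merge of two communities that most increases quality(partition, edges).
    `quality` stands in for the surprise function S used by (F)AGSO."""
    partition = [{v} for v in nodes]
    best_q = quality(partition, edges)
    improved = True
    while improved and len(partition) > 1:
        improved = False
        best_merge, best_gain = None, 0.0
        for i in range(len(partition)):
            for j in range(i + 1, len(partition)):
                merged = (partition[:i] + partition[i + 1:j] + partition[j + 1:]
                          + [partition[i] | partition[j]])
                gain = quality(merged, edges) - best_q
                if gain > best_gain:
                    best_merge, best_gain = (i, j), gain
        if best_merge:
            i, j = best_merge
            partition = (partition[:i] + partition[i + 1:j] + partition[j + 1:]
                         + [partition[i] | partition[j]])
            best_q += best_gain
            improved = True
    return partition
```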

  6. Greedy algorithms with weights for construction of partial association rules

    KAUST Repository

    Moshkov, Mikhail; Piliszczu, Marcin; Zielosko, Beata Marta

    2009-01-01

This paper is devoted to the study of approximate algorithms for minimization of the total weight of attributes occurring in partial association rules. We consider mainly greedy algorithms with weights for construction of rules. The paper contains bounds on the precision of these algorithms and bounds on the minimal weight of partial association rules based on information obtained during the greedy algorithm run.

  7. Greedy algorithms with weights for construction of partial association rules

    KAUST Repository

    Moshkov, Mikhail

    2009-09-10

This paper is devoted to the study of approximate algorithms for minimization of the total weight of attributes occurring in partial association rules. We consider mainly greedy algorithms with weights for construction of rules. The paper contains bounds on the precision of these algorithms and bounds on the minimal weight of partial association rules based on information obtained during the greedy algorithm run.

  8. A new greedy search method for the design of digital IIR filter

    Directory of Open Access Journals (Sweden)

    Ranjit Kaur

    2015-07-01

A new greedy search method is applied in this paper to design optimal digital infinite impulse response (IIR) filters. The greedy search method is based on binary successive approximation (BSA) and evolutionary search (ES). The suggested greedy search method optimizes the magnitude response and the phase response simultaneously and also finds the lowest order of the filter. The order of the filter is controlled by a control gene whose value is optimized along with the filter coefficients to obtain the optimum order of the designed IIR filter. The stability constraints of the IIR filter are taken care of during the design procedure. To determine the trade-off relationship between conflicting objectives in the non-inferior domain, the weighting method is exploited. The proposed approach is effectively applied to solve the multiobjective optimization problems of designing digital low-pass (LP), high-pass (HP), bandpass (BP), and bandstop (BS) filters. It has been demonstrated that this technique not only fulfills all types of filter performance requirements, but also finds the lowest order of the filter. The computational experiments show that the proposed approach gives better digital IIR filters than existing evolutionary algorithm (EA) based methods.

  9. Greedy search for radial fuel optimization

    International Nuclear Information System (INIS)

    Ortiz, J. J.; Castillo, J. A.; Pelta, D. A.

    2008-01-01

In this work a greedy search algorithm is presented for the optimization of fuel cells in BWR reactors. As a first phase, a sensitivity study of the local power peaking factor (FPPL) of the cell was carried out as a function of exchanging the contents of two fuel rods. In this way it was established that when the rods to be exchanged do not contain gadolinium, only small changes occur in the value of the FPPL of the cell. This knowledge was later applied in the greedy search to optimize the fuel cell: exchanges of rods with gadolinium are used as a global search mechanism, and exchanges of rods without gadolinium as a local search method. The work used a cell of 10x10 rods with 2 circular water channels in its center. Starting from a given inventory of uranium enrichments and gadolinium concentrations and a known enrichment distribution, the technique finds good solutions that minimize the FPPL while keeping the neutron multiplication factor within an appropriate range of values. The cells were placed in the lower part of the assemblies of a reload batch for an 18-month cycle. The FPPL values of the resulting cells are similar to or smaller than those of the original cell, with core behaviors also comparable to those obtained with the original cell. The cells were evaluated with the CASMO-IV transport code and the core was evaluated by means of the SIMULATE-3 core simulator. (Author)

  10. A Guiding Evolutionary Algorithm with Greedy Strategy for Global Optimization Problems

    Directory of Open Access Journals (Sweden)

    Leilei Cao

    2016-01-01

A Guiding Evolutionary Algorithm (GEA) with a greedy strategy for global optimization problems is proposed. Inspired by Particle Swarm Optimization, the Genetic Algorithm, and the Bat Algorithm, the GEA was designed to retain some advantages of each method while avoiding some disadvantages. In contrast to the usual Genetic Algorithm, each individual in GEA is crossed with the current global best individual instead of a randomly selected one. The current best individual serves as a guide to attract offspring to its region of genotype space. Mutation is added to offspring according to a dynamic mutation probability. To increase the capability of exploitation, a local search mechanism is applied to new individuals according to a dynamic probability of local search. Experimental results show that GEA outperforms the three typical global optimization algorithms with which it was compared.

  11. Biclustering of gene expression data using reactive greedy randomized adaptive search procedure.

    Science.gov (United States)

    Dharan, Smitha; Nair, Achuthsankar S

    2009-01-30

Biclustering algorithms belong to a distinct class of clustering algorithms that perform simultaneous clustering of both rows and columns of the gene expression matrix and can be a very useful analysis tool when some genes have multiple functions and experimental conditions are diverse. Cheng and Church introduced a measure called the mean squared residue score to evaluate the quality of a bicluster, which has become one of the most popular measures used to search for biclusters. In this paper, we review basic concepts of the metaheuristic Greedy Randomized Adaptive Search Procedure (GRASP), namely its construction and local search phases, and propose a new method, a variant of GRASP called Reactive Greedy Randomized Adaptive Search Procedure (Reactive GRASP), to detect significant biclusters from large microarray datasets. The method has two major steps. First, high-quality bicluster seeds are generated by means of k-means clustering. In the second step, these seeds are grown using Reactive GRASP, in which the basic parameter that defines the restrictiveness of the candidate list is self-adjusted, depending on the quality of the solutions found previously. We performed statistical and biological validations of the biclusters obtained and evaluated the method against the results of basic GRASP as well as the classic work of Cheng and Church. The experimental results indicate that the Reactive GRASP approach outperforms the basic GRASP algorithm and the Cheng and Church approach. The Reactive GRASP approach for the detection of significant biclusters is robust and does not require calibration efforts.

  12. Greedy Algorithms for Reduced Bases in Banach Spaces

    KAUST Repository

    DeVore, Ronald

    2013-02-26

Given a Banach space X and one of its compact sets F, we consider the problem of finding a good n-dimensional space Xn ⊂ X which can be used to approximate the elements of F. The best possible error we can achieve for such an approximation is given by the Kolmogorov width dn(F)X. However, finding the space which gives this performance is typically numerically intractable. Recently, a new greedy strategy for obtaining good spaces was given in the context of the reduced basis method for solving a parametric family of PDEs. The performance of this greedy algorithm was initially analyzed in Buffa et al. (Modél. Math. Anal. Numér. 46:595-603, 2012) in the case X=H is a Hilbert space. The results of Buffa et al. (Modél. Math. Anal. Numér. 46:595-603, 2012) were significantly improved upon in Binev et al. (SIAM J. Math. Anal. 43:1457-1472, 2011). The purpose of the present paper is to give a new analysis of the performance of such greedy algorithms. Our analysis not only gives improved results for the Hilbert space case but can also be applied to the same greedy procedure in general Banach spaces. © 2013 Springer Science+Business Media New York.
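
    A minimal sketch of the greedy selection idea analyzed in this line of work, specialized to finite-dimensional vectors with the Euclidean norm standing in for the norm of X: at each step the element of F worst approximated by the span of the current selection is added. Function and variable names are illustrative assumptions, not from the paper.

```python
import numpy as np

def greedy_reduced_basis(F, n):
    """F: array of shape (num_elements, dim), rows are the elements of the
    compact set; returns indices of the n greedily selected basis elements.
    At each step the element with the largest distance to the span of the
    current selection is added (Euclidean norm stands in for the X-norm)."""
    selected = []
    Q = np.zeros((0, F.shape[1]))             # orthonormal basis of the current space
    for _ in range(n):
        residuals = F - F @ Q.T @ Q           # projection errors onto span(Q)
        errors = np.linalg.norm(residuals, axis=1)
        k = int(np.argmax(errors))
        if errors[k] < 1e-12:
            break                             # F is already captured exactly
        selected.append(k)
        q = residuals[k] / errors[k]          # Gram-Schmidt step
        Q = np.vstack([Q, q])
    return selected
```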

  13. Comparison of greedy algorithms for α-decision tree construction

    KAUST Repository

    Alkhalid, Abdulaziz; Chikalov, Igor; Moshkov, Mikhail

    2011-01-01

A comparison among different heuristics used by greedy algorithms that construct approximate decision trees (α-decision trees) is presented. The comparison is conducted using decision tables based on 24 data sets from the UCI Machine Learning Repository [2]. The complexity of decision trees is estimated relative to several cost functions: depth, average depth, number of nodes, number of nonterminal nodes, and number of terminal nodes. The costs of trees built by the greedy algorithms are compared with minimum costs calculated by an algorithm based on dynamic programming. The results of the experiments assign to each cost function a set of potentially good heuristics that minimize it. © 2011 Springer-Verlag.

  14. Heuristic rules analysis on the fuel cells design using greedy search

    International Nuclear Information System (INIS)

    Ortiz, J. J.; Castillo, J. A.; Montes, J. L.; Hernandez, J. L.

    2009-10-01

This work addresses one of the heuristic rules used in fuel cell design for boiling water nuclear reactors: the rule requires that the lowest uranium enrichment be placed in the corners of the fuel cell. A greedy search is also applied to fuel cell design without explicitly taking this rule into account, allowing any uranium enrichment to be placed in the corners provided it does not contain gadolinium. Results are shown for the quality of the cells obtained by the greedy search with and without the rule. Cell quality is measured by the resulting power peaking factor as well as by the neutron multiplication factor in an infinite medium. Cells with 1 and 2 gadolinium concentrations were analyzed under operating conditions at 120% of the nominal power of the reactors of the Laguna Verde nuclear power plant. The results show that, for cells with a single gadolinium concentration, not considering the rule causes the greedy search to perform worse; on the other hand, for cells with two gadolinium concentrations, the performance of the greedy search was better. (Author)

  15. Greedy Local Search and Vertex Cover in Sparse Random Graphs

    DEFF Research Database (Denmark)

    Witt, Carsten

    2009-01-01

Recently, various randomized search heuristics have been studied for the solution of the minimum vertex cover problem, in particular for sparse random instances according to the G(n, c/n) model, where c > 0 is a constant. Methods from statistical physics suggest that the problem is easy if c ... This work starts with a rigorous explanation for this claim based on the refined analysis of the Karp-Sipser algorithm by Aronson et al. Subsequently, theoretical supplements are given to experimental studies of search heuristics on random graphs. For c < 1, a greedy and randomized local-search heuristic finds an optimal cover in polynomial time with a probability arbitrarily close to 1. This behavior relies on the absence of a giant component. As an additional insight into the randomized search, it is shown that the heuristic fails badly also on graphs consisting of a single tree component of maximum ...
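
    A hedged sketch of a Karp-Sipser-style greedy heuristic for vertex cover of the kind analyzed here: pendant (degree-one) vertices force their neighbor into the cover; otherwise a fallback choice is made. The fallback rule and all names below are illustrative assumptions rather than the exact heuristic from the paper.

```python
import random

def greedy_vertex_cover(adj, seed=0):
    """adj: dict mapping vertex -> set of neighbors (undirected graph).
    While edges remain: if some vertex has degree 1, put its unique neighbor
    into the cover; otherwise cover a randomly chosen vertex's incident edges
    via its highest-degree neighbor."""
    rng = random.Random(seed)
    adj = {v: set(nb) for v, nb in adj.items()}   # work on a copy
    cover = set()

    def remove(v):
        for u in adj[v]:
            adj[u].discard(v)
        adj[v] = set()

    while any(adj[v] for v in adj):
        deg1 = [v for v in adj if len(adj[v]) == 1]
        if deg1:
            v = deg1[0]
            u = next(iter(adj[v]))       # the pendant edge forces u into the cover
        else:
            v = rng.choice([w for w in adj if adj[w]])
            u = max(adj[v], key=lambda w: len(adj[w]))
        cover.add(u)
        remove(u)
    return cover
```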

  16. Comparison of Greedy Algorithms for Decision Tree Optimization

    KAUST Repository

    Alkhalid, Abdulaziz; Chikalov, Igor; Moshkov, Mikhail

    2013-01-01

This chapter is devoted to the study of 16 types of greedy algorithms for decision tree construction. The dynamic programming approach is used for construction of optimal decision trees. Optimization is performed relative to minimal values ...

  17. Hotel Search System Based on the Shortest Travel Route Considering Tourist Attractions Using the Greedy Algorithm

    Directory of Open Access Journals (Sweden)

    Audrey Maximillian Herli

    2015-04-01

Hotel search is an important task for travelers planning their journeys. Travelers consider criteria such as the class, price, and reviews of a hotel; besides these, the distance between the hotel and tourist attractions is also an important factor. In this research, a system was constructed to perform hotel search based on the shortest travel route using the greedy algorithm. The research was conducted in four stages: first, collecting data and information on tourist attractions and hotels; second, analyzing the data with the greedy algorithm in order to classify it and applying the greedy algorithm with manual calculations to the research problem; third, developing the system; and last, evaluating the system with experts experienced in the field of tourism and with prospective users of the application. The resulting system can provide recommendations and sequence the shortest journey between the hotel and tourist attractions based on the greedy algorithm.

  18. Heuristic rules analysis on the fuel cells design using greedy search;Analisis de reglas heuristicas en el diseno de celdas de combustible usando busqueda greedy

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz, J. J.; Castillo, J. A.; Montes, J. L.; Hernandez, J. L., E-mail: juanjose.ortiz@inin.gob.m [ININ, Carretera Mexico-Toluca s/n, 52750 Ocoyoacac, Estado de Mexico (Mexico)

    2009-10-15

This work addresses one of the heuristic rules used in fuel cell design for boiling water nuclear reactors: the rule requires that the lowest uranium enrichment be placed in the corners of the fuel cell. A greedy search is also applied to fuel cell design without explicitly taking this rule into account, allowing any uranium enrichment to be placed in the corners provided it does not contain gadolinium. Results are shown for the quality of the cells obtained by the greedy search with and without the rule. Cell quality is measured by the resulting power peaking factor as well as by the neutron multiplication factor in an infinite medium. Cells with 1 and 2 gadolinium concentrations were analyzed under operating conditions at 120% of the nominal power of the reactors of the Laguna Verde nuclear power plant. The results show that, for cells with a single gadolinium concentration, not considering the rule causes the greedy search to perform worse; on the other hand, for cells with two gadolinium concentrations, the performance of the greedy search was better. (Author)

  19. Greedy Algorithms for Reduced Bases in Banach Spaces

    KAUST Repository

    DeVore, Ronald; Petrova, Guergana; Wojtaszczyk, Przemyslaw

    2013-01-01

... family of PDEs. The performance of this greedy algorithm was initially analyzed in Buffa et al. (Modél. Math. Anal. Numér. 46:595-603, 2012) in the case X=H is a Hilbert space. The results of Buffa et al. (Modél. Math. Anal. Numér. 46:595-603, 2012) were ...

  20. Application of a greedy algorithm to military aircraft fleet retirements

    NARCIS (Netherlands)

    Newcamp, J.M.; Verhagen, W.J.C.; Udluft, H.; Curran, Ricky

    2017-01-01

    This article presents a retirement analysis model for aircraft fleets. By employing a greedy algorithm, the presented solution is capable of identifying individually weak assets in a fleet of aircraft with inhomogeneous historical utilization. The model forecasts future retirement scenarios

  1. Reducing a congestion with introduce the greedy algorithm on traffic light control

    Science.gov (United States)

    Catur Siswipraptini, Puji; Hendro Martono, Wisnu; Hartanti, Dian

    2018-03-01

The density of vehicles causes congestion at every junction in the city of Jakarta because the traffic light timing system is static or manual, so the queue length at each junction is uncertain. This research aims at designing a sensor-based traffic system that detects the queue length of vehicles in order to optimize the duration of the green light. Infrared sensors placed in each intersection approach detect the length of the vehicle queue, and a greedy algorithm is then applied to extend the green light duration for the approach that requires it; the traffic light control program based on the greedy algorithm is stored on an Arduino Mega 2560 microcontroller. The developed system, implementing the greedy algorithm with the help of the infrared sensors, extends the duration of the green light for long vehicle queues and accelerates the green light at intersections whose queues are not too dense. A scale model (simple simulator) of the intersection was then built and tested. The infrared sensors on the scale model are placed 10 cm apart in each approach and serve as queue detectors. Test results on the scale model show that longer queues obtain longer green light times, which addresses the problem of long vehicle queues. Using the greedy algorithm, green lights are extended by 2 seconds on approaches whose queues reach at least the third sensor level, while time is accelerated at other intersections whose queue sensor levels are below level three.
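
    A toy sketch of the rule described in the abstract: an approach whose queue reaches at least the third infrared sensor receives 2 extra seconds of green. The base green duration and the data layout are assumptions made for illustration.

```python
BASE_GREEN = 10      # base green duration in seconds (illustrative value)
EXTENSION = 2        # extra seconds granted to long queues, as in the abstract
QUEUE_THRESHOLD = 3  # queue must reach at least the third infrared sensor

def green_durations(sensor_levels):
    """sensor_levels: dict approach -> number of infrared sensors currently
    triggered (0..3).  The approach(es) with the longest queue at or above the
    threshold get a longer green; the others keep the base duration."""
    longest = max(sensor_levels.values(), default=0)
    durations = {}
    for approach, level in sensor_levels.items():
        if level >= QUEUE_THRESHOLD and level == longest:
            durations[approach] = BASE_GREEN + EXTENSION
        else:
            durations[approach] = BASE_GREEN
    return durations

print(green_durations({"north": 3, "south": 1, "east": 2, "west": 0}))
```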

  2. Greedy Oriented Flows

    NARCIS (Netherlands)

    Faigle, Ulrich; Kern, Walter; Peis, Britta

    We investigate the following greedy approach to attack linear programs of type (Formula presented.) where A has entries in (Formula presented.): The greedy algorithm starts with a feasible solution x and, iteratively, chooses an improving variable and raises it until some constraint becomes tight.

  3. Greedy search for radial fuel optimization; Busqueda Greedy para optimizacion radial de combustible

    Energy Technology Data Exchange (ETDEWEB)

    Ortiz, J. J.; Castillo, J. A. [ININ, 52750 La Marquesa, Estado de Mexico (Mexico); Pelta, D. A. [Universidad de Granada, ETS Ingenieria Informatica y Telecomunicaciones, C/Daniel Saucedo Aranda s/n, 18071 Granada (Spain)]. e-mail: jjortiz@nuclear.inin.mx

    2008-07-01

In this work a greedy search algorithm is presented for the optimization of fuel cells in BWR reactors. As a first phase, a sensitivity study of the local power peaking factor (FPPL) of the cell was carried out as a function of exchanging the contents of two fuel rods. In this way it was established that when the rods to be exchanged do not contain gadolinium, only small changes occur in the value of the FPPL of the cell. This knowledge was later applied in the greedy search to optimize the fuel cell: exchanges of rods with gadolinium are used as a global search mechanism, and exchanges of rods without gadolinium as a local search method. The work used a cell of 10x10 rods with 2 circular water channels in its center. Starting from a given inventory of uranium enrichments and gadolinium concentrations and a known enrichment distribution, the technique finds good solutions that minimize the FPPL while keeping the neutron multiplication factor within an appropriate range of values. The cells were placed in the lower part of the assemblies of a reload batch for an 18-month cycle. The FPPL values of the resulting cells are similar to or smaller than those of the original cell, with core behaviors also comparable to those obtained with the original cell. The cells were evaluated with the CASMO-IV transport code and the core was evaluated by means of the SIMULATE-3 core simulator. (Author)

  4. Analysis of some greedy algorithms for the single-sink fixed-charge transportation problem

    DEFF Research Database (Denmark)

    Görtz, Simon; Klose, Andreas

    2009-01-01

The single-sink fixed-charge transportation problem (SSFCTP) is an important special case of the fixed-charge transportation problem. Nevertheless, just a few methods for solving this problem have been proposed in the literature. In this paper, some greedy heuristic solution methods for the SSFCTP are investigated. It is shown that two greedy approaches for the SSFCTP known from the literature can be arbitrarily bad, whereas an approximation algorithm proposed in the literature for the binary min-knapsack problem has a guaranteed worst-case bound if adapted accordingly to the case of the SSFCTP.

  5. Effective Iterated Greedy Algorithm for Flow-Shop Scheduling Problems with Time lags

    Science.gov (United States)

    ZHAO, Ning; YE, Song; LI, Kaidian; CHEN, Siyu

    2017-05-01

The flow shop scheduling problem with time lags is a practical scheduling problem and has attracted many studies. The permutation problem (PFSP with time lags) has received attention, but the non-permutation problem (non-PFSP with time lags) seems to have been neglected. With the aim of minimizing the makespan while satisfying time lag constraints, efficient algorithms corresponding to the PFSP and non-PFSP problems are proposed: an iterated greedy algorithm for the permutation case (IGTLP) and an iterated greedy algorithm for the non-permutation case (IGTLNP). The proposed algorithms are verified using well-known simple and complex instances of permutation and non-permutation problems with various time lag ranges. The permutation results indicate that the proposed IGTLP can reach a near-optimal solution within roughly 11% of the computational time of a traditional GA approach. The non-permutation results indicate that the proposed IG can reach nearly the same solution within less than 1% of the computational time of a traditional GA approach. The proposed research combines the PFSP and non-PFSP with minimal and maximal time lag considerations, which provides an interesting viewpoint for industrial implementation.
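
    For context, the standard iterated greedy template for the permutation flow shop (random destruction of a few jobs followed by greedy best-position reinsertion) can be sketched as below. This is a generic illustration assuming plain makespan minimization; the time-lag constraints handled by IGTLP/IGTLNP are not modeled.

```python
import random

def makespan(perm, proc):
    """proc[j][m]: processing time of job j on machine m (permutation flow shop)."""
    machines = len(proc[0])
    finish = [0.0] * machines
    for j in perm:
        for m in range(machines):
            earliest = finish[m - 1] if m > 0 else 0.0   # same job, previous machine
            finish[m] = max(finish[m], earliest) + proc[j][m]
    return finish[-1]

def iterated_greedy(proc, d=2, iterations=200, seed=0):
    """Iterated greedy: destroy d random jobs, greedily reinsert each at its
    best position, accept the new permutation if it is no worse."""
    rng = random.Random(seed)
    perm = list(range(len(proc)))
    best = perm[:]
    for _ in range(iterations):
        removed = rng.sample(perm, d)
        partial = [j for j in perm if j not in removed]
        for j in removed:
            positions = [partial[:k] + [j] + partial[k:] for k in range(len(partial) + 1)]
            partial = min(positions, key=lambda p: makespan(p, proc))
        if makespan(partial, proc) <= makespan(perm, proc):
            perm = partial
        if makespan(perm, proc) < makespan(best, proc):
            best = perm[:]
    return best, makespan(best, proc)

# tiny example: 4 jobs, 3 machines
jobs = [[3, 2, 4], [2, 4, 1], [4, 1, 3], [1, 3, 2]]
print(iterated_greedy(jobs, d=2, iterations=50))
```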

  6. Greedy algorithms for construction of approximate tests for decision tables with many-valued decisions

    KAUST Repository

    Azad, Mohammad; Chikalov, Igor; Moshkov, Mikhail; Zielosko, Beata

    2012-01-01

The paper is devoted to the study of a greedy algorithm for construction of approximate tests (super-reducts). This algorithm is applicable to decision tables with many-valued decisions, where each row is labeled with a set of decisions. For a given ...

  7. Multiobjective Variable Neighborhood Search algorithm for scheduling independent jobs on computational grid

    Directory of Open Access Journals (Sweden)

    S. Selvi

    2015-07-01

Grid computing solves high-performance and high-throughput computing problems through sharing resources ranging from personal computers to supercomputers distributed around the world. As grid environments facilitate distributed computation, the scheduling of grid jobs has become an important issue. In this paper, an investigation into implementing a Multiobjective Variable Neighborhood Search (MVNS) algorithm for scheduling independent jobs on a computational grid is carried out. The performance of the proposed algorithm has been evaluated against the Min-Min algorithm, Simulated Annealing (SA), and the Greedy Randomized Adaptive Search Procedure (GRASP) algorithm. Simulation results show that the MVNS algorithm generally performs better than the other metaheuristic methods.

  8. Greedy algorithms for diffuse optical tomography reconstruction

    Science.gov (United States)

    Dileep, B. P. V.; Das, Tapan; Dutta, Pranab K.

    2018-03-01

Diffuse optical tomography (DOT) is a noninvasive imaging modality that reconstructs the optical parameters of a highly scattering medium. However, the inverse problem of DOT is ill-posed and highly nonlinear due to the zig-zag propagation of photons that diffuse through the cross section of tissue. Conventional DOT imaging methods iteratively compute the solution of a forward diffusion equation solver, which makes the problem computationally expensive, and these methods fail when the geometry is complex. Recently, the theory of compressive sensing (CS) has received considerable attention because of its efficient use in biomedical imaging applications. The objective of this paper is to solve a given DOT inverse problem using the compressive sensing framework; various greedy algorithms, such as orthogonal matching pursuit (OMP), compressive sampling matching pursuit (CoSaMP), stagewise orthogonal matching pursuit (StOMP), regularized orthogonal matching pursuit (ROMP), and simultaneous orthogonal matching pursuit (S-OMP), have been studied to reconstruct the change in the absorption parameter, i.e., Δα, from the boundary data. The greedy algorithms have also been validated experimentally on a paraffin wax rectangular phantom through a well-designed experimental setup. We also study conventional DOT methods, namely the least squares method and truncated singular value decomposition (TSVD), for comparison. One of the main features of this work is the use of a smaller number of source-detector pairs, which can facilitate the use of DOT in routine screening applications. Performance metrics such as mean square error (MSE), normalized mean square error (NMSE), structural similarity index (SSIM), and peak signal-to-noise ratio (PSNR) have been used to evaluate the performance of the algorithms mentioned in this paper. Extensive simulation results confirm that CS-based DOT reconstruction outperforms the conventional DOT imaging methods in terms of ...
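
    Of the greedy recovery algorithms listed, orthogonal matching pursuit is the simplest; a minimal sketch follows, with a generic sensing matrix A in place of the DOT forward model (the DOT-specific physics is not reproduced here).

```python
import numpy as np

def omp(A, y, sparsity):
    """Orthogonal matching pursuit: greedily pick the column of A most
    correlated with the residual, then re-fit the coefficients on the
    selected support by least squares."""
    m, n = A.shape
    residual = y.copy()
    support = []
    x = np.zeros(n)
    for _ in range(sparsity):
        correlations = np.abs(A.T @ residual)
        correlations[support] = 0.0            # do not pick an index twice
        k = int(np.argmax(correlations))
        support.append(k)
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        x = np.zeros(n)
        x[support] = coeffs
        residual = y - A @ x
    return x, support
```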

  9. Greedy Gossip With Eavesdropping

    Science.gov (United States)

    Ustebay, Deniz; Oreshkin, Boris N.; Coates, Mark J.; Rabbat, Michael G.

    2010-07-01

    This paper presents greedy gossip with eavesdropping (GGE), a novel randomized gossip algorithm for distributed computation of the average consensus problem. In gossip algorithms, nodes in the network randomly communicate with their neighbors and exchange information iteratively. The algorithms are simple and decentralized, making them attractive for wireless network applications. In general, gossip algorithms are robust to unreliable wireless conditions and time varying network topologies. In this paper we introduce GGE and demonstrate that greedy updates lead to rapid convergence. We do not require nodes to have any location information. Instead, greedy updates are made possible by exploiting the broadcast nature of wireless communications. During the operation of GGE, when a node decides to gossip, instead of choosing one of its neighbors at random, it makes a greedy selection, choosing the node which has the value most different from its own. In order to make this selection, nodes need to know their neighbors' values. Therefore, we assume that all transmissions are wireless broadcasts and nodes keep track of their neighbors' values by eavesdropping on their communications. We show that the convergence of GGE is guaranteed for connected network topologies. We also study the rates of convergence and illustrate, through theoretical bounds and numerical simulations, that GGE consistently outperforms randomized gossip and performs comparably to geographic gossip on moderate-sized random geometric graph topologies.
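
    A small sketch of the greedy update described: the gossiping node picks the neighbor whose current value differs most from its own (assumed known through eavesdropping) and both replace their values with the pairwise average. Data structures and parameter names are illustrative.

```python
import random

def greedy_gossip(values, adj, rounds=1000, seed=0):
    """values: dict node -> current value; adj: dict node -> list of neighbors.
    Each round, a random node gossips with the neighbor whose value differs
    most from its own, and both replace their values by the pairwise average."""
    rng = random.Random(seed)
    nodes = list(values)
    for _ in range(rounds):
        i = rng.choice(nodes)
        if not adj[i]:
            continue
        j = max(adj[i], key=lambda k: abs(values[k] - values[i]))  # greedy choice
        avg = 0.5 * (values[i] + values[j])
        values[i] = values[j] = avg
    return values
```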

  10. TH-CD-209-01: A Greedy Reassignment Algorithm for the PBS Minimum Monitor Unit Constraint

    Energy Technology Data Exchange (ETDEWEB)

    Lin, Y; Kooy, H; Craft, D; Depauw, N; Flanz, J; Clasie, B [Massachusetts General Hospital and Harvard Medical School, Boston, MA (United States)

    2016-06-15

Purpose: To investigate a Greedy Reassignment algorithm to mitigate the effects of low-weight spots in proton pencil beam scanning (PBS) treatment plans. Methods: To convert a plan from the treatment planning system (TPS) into a deliverable plan, post-processing methods can be used to adjust the spot maps so that they meet the minimum MU constraint. Existing methods include deleting low-weight spots (Cut method) or rounding spots with weight above/below half the limit up/down to the limit/zero (Round method). An alternative method called Greedy Reassignment was developed in this work, in which the lowest-weight spot in the field is removed and its weight reassigned equally among its nearest neighbors. The process is repeated with the next lowest-weight spot until all spots in the field are above the MU constraint. The algorithm performance was evaluated using plans collected from 190 patients (496 fields) treated at our facility. The evaluation criterion was the γ-index pass rate comparing the pre-processed and post-processed dose distributions. A planning metric was further developed to predict the impact of post-processing on treatment plans for various treatment planning, machine, and dose tolerance parameters. Results: For fields with a gamma pass rate of 90±1%, the metric has a standard deviation equal to 18% of the centroid value, showing that the metric and γ-index pass rate are correlated for the Greedy Reassignment algorithm. Using a 3rd order polynomial fit to the data, the Greedy Reassignment method had a 1.8 times better metric at a 90% pass rate compared to the other post-processing methods. Conclusion: We showed that the Greedy Reassignment method yields deliverable plans that are closest to the optimized-without-MU-constraint plan from the TPS. The metric developed in this work could help design the minimum MU threshold with the goal of keeping the γ-index pass rate above an acceptable value.
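
    A hedged sketch of the Greedy Reassignment step as described in the abstract: repeatedly remove the lowest-weight spot violating the minimum-MU limit and split its weight equally among its nearest remaining neighbors. The number of neighbors and the 2D spot layout are assumptions for illustration, not values from the paper.

```python
import numpy as np

def greedy_reassignment(positions, weights, mu_min, n_neighbors=4):
    """positions: (N, 2) spot coordinates; weights: (N,) spot MUs.
    Repeatedly remove the lowest-weight spot that violates the minimum-MU
    constraint and reassign its weight equally to its nearest remaining
    neighbors, until every remaining spot satisfies the constraint."""
    positions = np.asarray(positions, dtype=float)
    weights = np.asarray(weights, dtype=float)
    alive = np.ones(len(weights), dtype=bool)
    while True:
        low = [i for i in np.where(alive)[0] if weights[i] < mu_min]
        if not low:
            break
        i = min(low, key=lambda k: weights[k])
        others = [k for k in np.where(alive)[0] if k != i]
        if not others:
            break                                 # nothing left to absorb the weight
        dists = np.linalg.norm(positions[others] - positions[i], axis=1)
        nearest = [others[k] for k in np.argsort(dists)[:n_neighbors]]
        weights[nearest] += weights[i] / len(nearest)   # total MU is preserved
        weights[i] = 0.0
        alive[i] = False
    return positions[alive], weights[alive]
```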

  11. Evaluation of a Didactic Method for the Active Learning of Greedy Algorithms

    Science.gov (United States)

    Esteban-Sánchez, Natalia; Pizarro, Celeste; Velázquez-Iturbide, J. Ángel

    2014-01-01

    An evaluation of the educational effectiveness of a didactic method for the active learning of greedy algorithms is presented. The didactic method sets students structured-inquiry challenges to be addressed with a specific experimental method, supported by the interactive system GreedEx. This didactic method has been refined over several years of…

  12. Greedy algorithms for construction of approximate tests for decision tables with many-valued decisions

    KAUST Repository

    Azad, Mohammad

    2012-12-14

The paper is devoted to the study of a greedy algorithm for construction of approximate tests (super-reducts). This algorithm is applicable to decision tables with many-valued decisions, where each row is labeled with a set of decisions. For a given row, we should find a decision from the set attached to this row. We consider bounds on the precision of this algorithm relative to the cardinality of tests.

  13. A simple greedy algorithm for dynamic graph orientation

    DEFF Research Database (Denmark)

    Berglin, Edvin; Brodal, Gerth Stølting

    2017-01-01

Graph orientations with low out-degree are one of several ways to efficiently store sparse graphs. If the graphs allow for insertion and deletion of edges, one may have to flip the orientation of some edges to prevent blowing up the maximum out-degree. We use arboricity as our sparsity measure. With an immensely simple greedy algorithm, we get parametrized trade-off bounds between out-degree and worst-case number of flips, which previously only existed for amortized number of flips. We match the previous best worst-case algorithm (in O(log n) flips) for general arboricity and beat it for either constant or super-logarithmic arboricity. We also match a previous best amortized result for at least logarithmic arboricity, and give the first results with worst-case O(1) and O(sqrt(log n)) flips nearly matching degree bounds to their respective amortized solutions.

  14. A greedy heuristic using adjoint functions for the optimization of seed and needle configurations in prostate seed implant

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Sua [Department of Radiation Oncology, Duke University Medical Center, Box 3295, Durham, NC 27710 (United States); Kowalok, Michael E [Department of Radiation Oncology, Virginia Commonwealth University Health System, 401 College St., PO Box 980058, Richmond, VA 23298-0058 (United States); Thomadsen, Bruce R [Department of Medical Physics, University of Wisconsin-Madison, 1530 MSC, 1300 University Ave., Madison, WI 53706 (United States); Henderson, Douglass L [Department of Engineering Physics, University of Wisconsin-Madison, 153 Engineering Research Bldg., 1500 Engineering Dr., Madison, WI 53706 (United States)

    2007-02-07

    We continue our work on the development of an efficient treatment-planning algorithm for prostate seed implants by incorporation of an automated seed and needle configuration routine. The treatment-planning algorithm is based on region of interest (ROI) adjoint functions and a greedy heuristic. As defined in this work, the adjoint function of an ROI is the sensitivity of the average dose in the ROI to a unit-strength brachytherapy source at any seed position. The greedy heuristic uses a ratio of target and critical structure adjoint functions to rank seed positions according to their ability to irradiate the target ROI while sparing critical structure ROIs. Because seed positions are ranked in advance and because the greedy heuristic does not modify previously selected seed positions, the greedy heuristic constructs a complete seed configuration quickly. Isodose surface constraints determine the search space and the needle constraint limits the number of needles. This study additionally includes a methodology that scans possible combinations of these constraint values automatically. This automated selection scheme saves the user the effort of manually searching constraint values. With this method, clinically acceptable treatment plans are obtained in less than 2 min. For comparison, the branch-and-bound method used to solve a mixed integer-programming model took close to 2.5 h to arrive at a feasible solution. Both methods achieved good treatment plans, but the speedup provided by the greedy heuristic was a factor of approximately 100. This attribute makes this algorithm suitable for intra-operative real-time treatment planning.

  15. A greedy heuristic using adjoint functions for the optimization of seed and needle configurations in prostate seed implant

    International Nuclear Information System (INIS)

    Yoo, Sua; Kowalok, Michael E; Thomadsen, Bruce R; Henderson, Douglass L

    2007-01-01

    We continue our work on the development of an efficient treatment-planning algorithm for prostate seed implants by incorporation of an automated seed and needle configuration routine. The treatment-planning algorithm is based on region of interest (ROI) adjoint functions and a greedy heuristic. As defined in this work, the adjoint function of an ROI is the sensitivity of the average dose in the ROI to a unit-strength brachytherapy source at any seed position. The greedy heuristic uses a ratio of target and critical structure adjoint functions to rank seed positions according to their ability to irradiate the target ROI while sparing critical structure ROIs. Because seed positions are ranked in advance and because the greedy heuristic does not modify previously selected seed positions, the greedy heuristic constructs a complete seed configuration quickly. Isodose surface constraints determine the search space and the needle constraint limits the number of needles. This study additionally includes a methodology that scans possible combinations of these constraint values automatically. This automated selection scheme saves the user the effort of manually searching constraint values. With this method, clinically acceptable treatment plans are obtained in less than 2 min. For comparison, the branch-and-bound method used to solve a mixed integer-programming model took close to 2.5 h to arrive at a feasible solution. Both methods achieved good treatment plans, but the speedup provided by the greedy heuristic was a factor of approximately 100. This attribute makes this algorithm suitable for intra-operative real-time treatment planning

  16. A Greedy Algorithm for Neighborhood Overlap-Based Community Detection

    Directory of Open Access Journals (Sweden)

    Natarajan Meghanathan

    2016-01-01

The neighborhood overlap (NOVER) of an edge u-v is defined as the ratio of the number of nodes that are neighbors of both u and v to the number of nodes that are neighbors of at least u or v. In this paper, we hypothesize that an edge u-v with a lower NOVER score bridges two or more sets of vertices, with very few edges (other than u-v) connecting vertices from one set to another. Accordingly, we propose a greedy algorithm that iteratively removes the edges of a network in increasing order of their neighborhood overlap and calculates the modularity score of the resulting network component(s) after the removal of each edge. The network component(s) with the largest cumulative modularity score are identified as the different communities of the network. We evaluate the performance of the proposed NOVER-based community detection algorithm on nine real-world network graphs and compare it against the multi-level aggregation-based Louvain algorithm, as well as the original and time-efficient versions of the edge betweenness-based Girvan-Newman (GN) community detection algorithm.
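
    A compact sketch of the procedure described, assuming the networkx library is available: compute NOVER for every edge, remove edges in increasing NOVER order, and keep the partition into connected components with the highest modularity. Recomputing NOVER after each removal, which a stricter reading of the abstract might suggest, is omitted for brevity.

```python
import networkx as nx
from networkx.algorithms.community import modularity

def nover(G, u, v):
    """Neighborhood overlap of edge u-v: shared neighbors / union of neighbors
    (excluding u and v themselves)."""
    nu, nv = set(G[u]) - {v}, set(G[v]) - {u}
    union = nu | nv
    return len(nu & nv) / len(union) if union else 0.0

def nover_communities(G):
    """Remove edges in increasing NOVER order; after each removal, score the
    connected components with modularity and return the best partition seen."""
    H = G.copy()
    order = sorted(G.edges(), key=lambda e: nover(G, *e))
    best_part = [set(c) for c in nx.connected_components(H)]
    best_q = modularity(G, best_part)
    for u, v in order:
        H.remove_edge(u, v)
        part = [set(c) for c in nx.connected_components(H)]
        q = modularity(G, part)
        if q > best_q:
            best_q, best_part = q, part
    return best_part, best_q

# Example on Zachary's karate club graph
G = nx.karate_club_graph()
parts, q = nover_communities(G)
print(len(parts), round(q, 3))
```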

  17. Comparison of Greedy Algorithms for Decision Tree Optimization

    KAUST Repository

    Alkhalid, Abdulaziz

    2013-01-01

    This chapter is devoted to the study of 16 types of greedy algorithms for decision tree construction. The dynamic programming approach is used for construction of optimal decision trees. Optimization is performed relative to minimal values of average depth, depth, number of nodes, number of terminal nodes, and number of nonterminal nodes of decision trees. We compare average depth, depth, number of nodes, number of terminal nodes and number of nonterminal nodes of constructed trees with minimum values of the considered parameters obtained based on a dynamic programming approach. We report experiments performed on data sets from UCI ML Repository and randomly generated binary decision tables. As a result, for depth, average depth, and number of nodes we propose a number of good heuristics. © Springer-Verlag Berlin Heidelberg 2013.

  18. Two-pass greedy regular expression parsing

    DEFF Research Database (Denmark)

    Grathwohl, Niels Bjørn Bugge; Henglein, Fritz; Nielsen, Lasse

    2013-01-01

We present new algorithms for producing greedy parses for regular expressions (REs) in a semi-streaming fashion. Our lean-log algorithm executes in time O(mn) for REs of size m and input strings of size n and outputs a compact bit-coded parse tree representation. It improves on previous algorithms by: operating in only 2 passes; using only O(m) words of random-access memory (independent of n); requiring only kn bits of sequentially written and read log storage, where k ... and not requiring the input to be stored at all. Previous RE parsing algorithms do not scale linearly with input size, or require substantially more log storage and employ 3 passes where the first consists of reversing the input, or do not or are not known to produce a greedy parse. The performance of our unoptimized C...

  19. Ship Block Transportation Scheduling Problem Based on Greedy Algorithm

    Directory of Open Access Journals (Sweden)

    Chong Wang

    2016-05-01

Ship block transportation problems are crucial issues to address in reducing construction costs and improving the productivity of shipyards. Shipyards aim to maximize the workload balance of transporters under the time constraint that all blocks must be transported during the planning horizon. This process leads to three types of penalty time: empty transporter travel time, delay time, and tardy time. This study aims to minimize the sum of these penalty times. First, the ship block transportation problem is presented with a generalization of the block transportation restriction to multi-type transporters. Second, the problem is transformed into the classical traveling salesman problem and assignment problem through a reasonable model simplification and by adding a virtual node to the proposed directed graph. Then, a heuristic based on the greedy algorithm is proposed to assign blocks to available transporters and to sequence the blocks for each transporter simultaneously. Finally, numerical experiments are used to validate the model, and the results show that the proposed algorithm is effective in realizing the efficient use of transporters in shipyards. Numerical simulation results demonstrate the promising application of the proposed method to efficiently improve the utilization of transporters and to reduce the cost of ship block logistics for shipyards.

  20. Optimization of wind farm micro-siting for complex terrain using greedy algorithm

    International Nuclear Information System (INIS)

    Song, M.X.; Chen, K.; He, Z.Y.; Zhang, X.

    2014-01-01

An optimization approach based on a greedy algorithm for wind farm micro-siting is presented. The key to optimizing wind farm micro-siting is the fast and accurate evaluation of the wake flow interactions of wind turbines. The virtual particle model is employed for wake flow simulation of wind turbines, which makes the present method applicable to non-uniform flow fields on complex terrains. In previous bionic optimization methods, within each step of the optimization process, only the power output of the turbine that is being located or relocated is considered. To account for the overall power output of the wind farm, a dependent region technique is introduced to improve the estimation of power output during the optimization procedure. With this technique, the wake flow influences can be reduced more efficiently during the optimization: the turbine being added avoids being affected by other turbines and avoids affecting other turbines at the same time. The results from the numerical calculations demonstrate that the present method is effective for wind farm micro-siting on complex terrain, and it produces better solutions in less time than the previous bionic method. - Highlights: • Greedy algorithm is applied to the wind farm micro-siting problem. • The present method is effective for optimization on complex terrains. • Dependent region is suggested to improve the evaluation of wake influences. • The present method has better performance than the bionic method

  1. Performance improvement of multi-class detection using greedy algorithm for Viola-Jones cascade selection

    Science.gov (United States)

    Tereshin, Alexander A.; Usilin, Sergey A.; Arlazarov, Vladimir V.

    2018-04-01

This paper studies the problem of multi-class object detection in a video stream with Viola-Jones cascades. An adaptive algorithm for selecting a Viola-Jones cascade, based on a greedy choice strategy in the solution of the N-armed bandit problem, is proposed. The efficiency of the algorithm is shown on the problem of detection and recognition of bank card logos in a video stream. The proposed algorithm can be effectively used in document localization and identification, recognition of road scene elements, localization and tracking of lengthy objects, and for solving other problems of rigid object detection in heterogeneous data flows. The computational efficiency of the algorithm makes it possible to use it both on personal computers and on mobile devices based on processors with low power consumption.
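
    The N-armed bandit framing suggests a selection policy along the following lines. This is a generic epsilon-greedy sketch, not the authors' adaptive algorithm, and the reward definition (1 if the chosen cascade fires on the frame) is an assumption made for illustration.

```python
import random

class GreedyCascadeSelector:
    """Epsilon-greedy selection among N detector cascades treated as bandit
    arms; the reward is 1 if the chosen cascade fires on the frame, else 0."""
    def __init__(self, n_cascades, epsilon=0.1, seed=0):
        self.rng = random.Random(seed)
        self.epsilon = epsilon
        self.counts = [0] * n_cascades
        self.values = [0.0] * n_cascades    # running mean reward per cascade

    def select(self):
        if self.rng.random() < self.epsilon:
            return self.rng.randrange(len(self.counts))   # occasional exploration
        return max(range(len(self.counts)), key=lambda i: self.values[i])

    def update(self, arm, reward):
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]
```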

  2. Greedy Algorithm for the Construction of Approximate Decision Rules for Decision Tables with Many-Valued Decisions

    KAUST Repository

    Azad, Mohammad

    2016-10-20

The paper is devoted to the study of a greedy algorithm for construction of approximate decision rules. This algorithm is applicable to decision tables with many-valued decisions, where each row is labeled with a set of decisions. For a given row, we should find a decision from the set attached to this row. We consider bounds on the precision of this algorithm relative to the length of rules. To illustrate the proposed approach we study a problem of recognition of labels of points in the plane. This paper also contains results of experiments with modified decision tables from the UCI Machine Learning Repository.

  3. Greedy Algorithm for the Construction of Approximate Decision Rules for Decision Tables with Many-Valued Decisions

    KAUST Repository

    Azad, Mohammad; Moshkov, Mikhail; Zielosko, Beata

    2016-01-01

The paper is devoted to the study of a greedy algorithm for construction of approximate decision rules. This algorithm is applicable to decision tables with many-valued decisions, where each row is labeled with a set of decisions. For a given row, we should find a decision from the set attached to this row. We consider bounds on the precision of this algorithm relative to the length of rules. To illustrate the proposed approach we study a problem of recognition of labels of points in the plane. This paper also contains results of experiments with modified decision tables from the UCI Machine Learning Repository.

  4. Deterministic Greedy Routing with Guaranteed Delivery in 3D Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Su Xia

    2014-05-01

With both computational complexity and storage space bounded by a small constant, greedy routing is recognized as an appealing approach to support scalable routing in wireless sensor networks. However, significant challenges have been encountered in extending greedy routing from 2D to 3D space. In this research, we develop decentralized solutions to achieve greedy routing in 3D sensor networks. Our proposed approach is based on a unit tetrahedron cell (UTC) mesh structure. We propose a distributed algorithm to realize volumetric harmonic mapping (VHM) of the UTC mesh under a spherical boundary condition. It is a one-to-one map that yields virtual coordinates for each node in the network, without or with one internal hole. Since the boundary has been mapped to a sphere, node-based greedy routing always succeeds there. At the same time, we exploit the UTC mesh to develop a face-based greedy routing algorithm and prove its success at internal nodes. To deliver a data packet to its destination, face-based and node-based greedy routing algorithms are employed alternately at internal and boundary UTCs, respectively. For networks with multiple internal holes, a segmentation and tunnel-based routing strategy is proposed on top of VHM to support global end-to-end routing. As far as we know, this is the first work that realizes truly deterministic routing with constant-bounded storage and computation in general 3D wireless sensor networks.
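
    Node-based greedy forwarding in the virtual coordinates can be sketched as follows: each hop forwards to the neighbor closest to the destination and fails at a local minimum (which the VHM coordinates are designed to avoid on the boundary). Coordinate and adjacency structures are illustrative assumptions.

```python
import math

def greedy_route(coords, adj, src, dst):
    """coords: dict node -> (x, y, z) virtual coordinates; adj: adjacency lists.
    Forward greedily to the neighbor strictly closer to the destination;
    return the path, or None if stuck at a local minimum."""
    def dist(a, b):
        return math.dist(coords[a], coords[b])

    path, current = [src], src
    while current != dst:
        closer = [n for n in adj[current] if dist(n, dst) < dist(current, dst)]
        if not closer:
            return None                       # local minimum: greedy forwarding fails
        current = min(closer, key=lambda n: dist(n, dst))
        path.append(current)
    return path
```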

  5. Optimization of the traveling salesman problem using the ant colony system and Greedy search

    International Nuclear Information System (INIS)

    Esquivel E, J.; Ordonez A, A.; Ortiz S, J. J.

    2008-01-01

In this paper we present some results obtained during the development of optimization systems that can be used to design the refueling and control rod patterns in a BWR. These systems use ant colonies and greedy search. The first phase of this project is to become familiar with these optimization techniques by applying them to the traveling salesman problem (TSP). The usefulness of studying the TSP is that, like the refueling design and control rod pattern design problems, it is a combinatorial optimization problem; indeed, its similarity with the refueling design problem is remarkable. Some results are presented for the TSP over the 32 state capitals of Mexico. (Author)
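
    The greedy-search side of such a TSP study is often the nearest-neighbor construction sketched below; this is a generic illustration and not necessarily the exact greedy procedure used by the authors.

```python
import math

def nearest_neighbor_tour(cities, start=0):
    """cities: list of (x, y) coordinates.  Greedy TSP construction: always
    travel to the closest unvisited city, then return to the start."""
    unvisited = set(range(len(cities))) - {start}
    tour = [start]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda c: math.dist(cities[last], cities[c]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour + [start]

def tour_length(tour, cities):
    return sum(math.dist(cities[a], cities[b]) for a, b in zip(tour, tour[1:]))
```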

  6. A greedy algorithm for construction of decision trees for tables with many-valued decisions - A comparative study

    KAUST Repository

    Azad, Mohammad

    2013-11-25

In the paper, we study a greedy algorithm for construction of decision trees. This algorithm is applicable to decision tables with many-valued decisions, where each row is labeled with a set of decisions. For a given row, we should find a decision from the set attached to this row. Experimental results for data sets from the UCI Machine Learning Repository and randomly generated tables are presented. We make a comparative study of the depth and average depth of the decision trees constructed by the proposed approach and by an approach based on the generalized decision. The obtained results show that the proposed approach can be useful from the point of view of knowledge representation and algorithm construction.

  7. A greedy algorithm for construction of decision trees for tables with many-valued decisions - A comparative study

    KAUST Repository

    Azad, Mohammad; Chikalov, Igor; Moshkov, Mikhail; Zielosko, Beata

    2013-01-01

In the paper, we study a greedy algorithm for construction of decision trees. This algorithm is applicable to decision tables with many-valued decisions, where each row is labeled with a set of decisions. For a given row, we should find a decision from the set attached to this row. Experimental results for data sets from the UCI Machine Learning Repository and randomly generated tables are presented. We make a comparative study of the depth and average depth of the decision trees constructed by the proposed approach and by an approach based on the generalized decision. The obtained results show that the proposed approach can be useful from the point of view of knowledge representation and algorithm construction.

  8. Greedy algorithms for high-dimensional non-symmetric linear problems***

    Directory of Open Access Journals (Sweden)

    Cancès E.

    2013-12-01

    Full Text Available In this article, we present a family of numerical approaches to solve high-dimensional linear non-symmetric problems. The principle of these methods is to approximate a function which depends on a large number of variates by a sum of tensor product functions, each term of which is iteratively computed via a greedy algorithm. There exists a good theoretical framework for these methods in the case of (linear and nonlinear) symmetric elliptic problems. However, the convergence results are no longer valid as soon as the problems under consideration are not symmetric. We present here a review of the main algorithms proposed in the literature to circumvent this difficulty, together with some new approaches. The theoretical convergence results and the practical implementation of these algorithms are discussed, and their behavior is illustrated through some numerical examples.
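    To make the greedy principle concrete in the simplest possible setting (two variates, i.e. a matrix on a grid, and a plain least-squares criterion), the sketch below computes a sum of tensor-product terms one at a time by alternating updates on the residual. This only illustrates the pure greedy idea; it is not one of the non-symmetric algorithms reviewed in the article.

```python
import numpy as np

def greedy_tensor_sum(A, n_terms=5, n_inner=30):
    """Approximate matrix A by a sum of rank-1 tensor products u_k v_k^T,
    computing one term at a time (pure greedy) via alternating least squares."""
    rng = np.random.default_rng(0)
    R = A.astype(float).copy()          # residual after the terms found so far
    terms = []
    for _ in range(n_terms):
        u = rng.standard_normal(A.shape[0])
        v = rng.standard_normal(A.shape[1])
        for _ in range(n_inner):        # best rank-1 term for the current residual
            u = R @ v / (v @ v)
            v = R.T @ u / (u @ u)
        terms.append((u, v))
        R = R - np.outer(u, v)          # greedy update of the residual
    return terms, np.linalg.norm(R)

A = np.fromfunction(lambda i, j: np.exp(-(i - j) ** 2 / 50.0), (40, 40))
_, err = greedy_tensor_sum(A)
print("residual norm:", err)
```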

  9. Improving multivariate Horner schemes with Monte Carlo tree search

    Science.gov (United States)

    Kuipers, J.; Plaat, A.; Vermaseren, J. A. M.; van den Herik, H. J.

    2013-11-01

    Optimizing the cost of evaluating a polynomial is a classic problem in computer science. For polynomials in one variable, Horner's method provides a scheme for producing a computationally efficient form. For multivariate polynomials it is possible to generalize Horner's method, but this leaves freedom in the order of the variables. Traditionally, greedy schemes like most-occurring variable first are used. This simple textbook algorithm has given remarkably efficient results. Finding better algorithms has proved difficult. In trying to improve upon the greedy scheme we have implemented Monte Carlo tree search, a recent search method from the field of artificial intelligence. This results in better Horner schemes and reduces the cost of evaluating polynomials, sometimes by factors up to two.
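    A small sketch of the greedy "most-occurring variable first" rule mentioned above (an illustration only, not the Monte Carlo tree search code of the paper): the polynomial is a dict mapping exponent tuples to coefficients, and at each step the variable appearing in the most remaining terms is factored out, yielding a Horner-style nested evaluation.

```python
from collections import Counter

def horner_eval(poly, values):
    """Evaluate a multivariate polynomial with a greedy Horner-style nesting.
    poly: {exponent_tuple: coefficient}; values: tuple of variable values."""
    if not poly:
        return 0
    counts = Counter()
    for exps in poly:
        for var, e in enumerate(exps):
            if e > 0:
                counts[var] += 1
    if not counts:                       # only a constant term is left
        return sum(poly.values())
    v, _ = counts.most_common(1)[0]      # greedy rule: most-occurring variable first
    with_v, without_v = {}, {}
    for exps, c in poly.items():
        if exps[v] > 0:                  # factor one power of variable v out of this term
            reduced = list(exps)
            reduced[v] -= 1
            key = tuple(reduced)
            with_v[key] = with_v.get(key, 0) + c
        else:
            without_v[exps] = without_v.get(exps, 0) + c
    return values[v] * horner_eval(with_v, values) + horner_eval(without_v, values)

# 2*x^2*y + 3*x + 1 evaluated at x = 2, y = 5  ->  47
print(horner_eval({(2, 1): 2, (1, 0): 3, (0, 0): 1}, (2.0, 5.0)))
```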

  10. 'Misclassification error' greedy heuristic to construct decision trees for inconsistent decision tables

    KAUST Repository

    Azad, Mohammad; Moshkov, Mikhail

    2014-01-01

    A greedy algorithm is presented in this paper to construct decision trees for three different approaches (many-valued decision, most common decision, and generalized decision) in order to handle the inconsistency of multiple decisions in a decision table. The algorithm uses the greedy heuristic ‘misclassification error’, which is faster and, for some cost functions, gives better results than the ‘number of boundary subtables’ heuristic from the literature. It can therefore be used on larger data sets and does not require a huge amount of memory. Experimental results on the depth, average depth, and number of nodes of the decision trees constructed by this algorithm are compared within each of the three approaches.
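    For illustration only (assuming a subtable is given as a list of rows, each labelled with its set of admissible decisions; the surrounding tree-construction loop is not reproduced), the ‘misclassification error’ heuristic can be computed as the number of rows whose decision set does not contain the most common decision:

```python
from collections import Counter

def misclassification_error(rows):
    """rows: list of sets, each the set of decisions attached to a row.
    Returns (best_decision, error) where error counts rows not covered by it."""
    counts = Counter(d for decisions in rows for d in decisions)
    best, _ = counts.most_common(1)[0]
    error = sum(1 for decisions in rows if best not in decisions)
    return best, error

# Example: four rows of a many-valued decision table
print(misclassification_error([{1, 2}, {2}, {3}, {2, 3}]))  # decision 2 leaves one row uncovered
```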

  11. Detecting highly overlapping community structure by greedy clique expansion

    OpenAIRE

    Lee, Conrad; Reid, Fergal; McDaid, Aaron; Hurley, Neil

    2010-01-01

    In complex networks it is common for each node to belong to several communities, implying a highly overlapping community structure. Recent advances in benchmarking indicate that existing community assignment algorithms that are capable of detecting overlapping communities perform well only when the extent of community overlap is kept to modest levels. To overcome this limitation, we introduce a new community assignment algorithm called Greedy Clique Expansion (GCE). The algorithm identifies d...

  12. Initialization and Restart in Stochastic Local Search: Computing a Most Probable Explanation in Bayesian Networks

    Science.gov (United States)

    Mengshoel, Ole J.; Wilkins, David C.; Roth, Dan

    2010-01-01

    For hard computational problems, stochastic local search has proven to be a competitive approach to finding optimal or approximately optimal problem solutions. Two key research questions for stochastic local search algorithms are: Which algorithms are effective for initialization? When should the search process be restarted? In the present work we investigate these research questions in the context of approximate computation of most probable explanations (MPEs) in Bayesian networks (BNs). We introduce a novel approach, based on the Viterbi algorithm, to explanation initialization in BNs. While the Viterbi algorithm works on sequences and trees, our approach works on BNs with arbitrary topologies. We also give a novel formalization of stochastic local search, with focus on initialization and restart, using probability theory and mixture models. Experimentally, we apply our methods to the problem of MPE computation, using a stochastic local search algorithm known as Stochastic Greedy Search. By carefully optimizing both initialization and restart, we reduce the MPE search time for application BNs by several orders of magnitude compared to using uniform at random initialization without restart. On several BNs from applications, the performance of Stochastic Greedy Search is competitive with clique tree clustering, a state-of-the-art exact algorithm used for MPE computation in BNs.

  13. Improving performances of suboptimal greedy iterative biclustering heuristics via localization.

    Science.gov (United States)

    Erten, Cesim; Sözdinler, Melih

    2010-10-15

    Biclustering gene expression data is the problem of extracting submatrices of genes and conditions exhibiting significant correlation across both the rows and the columns of a data matrix of expression values. Even the simplest versions of the problem are computationally hard. Most of the proposed solutions therefore employ greedy iterative heuristics that locally optimize a suitably assigned scoring function. We provide a fast and simple pre-processing algorithm called localization that reorders the rows and columns of the input data matrix in such a way as to group correlated entries in small local neighborhoods within the matrix. The proposed localization algorithm takes its roots from effective use of graph-theoretical methods applied to problems exhibiting a similar structure to that of biclustering. In order to evaluate the effectiveness of the localization pre-processing algorithm, we focus on three representative greedy iterative heuristic methods. We show how the localization pre-processing can be incorporated into each representative algorithm to improve biclustering performance. Furthermore, we propose a simple biclustering algorithm, Random Extraction After Localization (REAL), which randomly extracts submatrices from the localization pre-processed data matrix, eliminates those with low similarity scores, and provides the rest as correlated structures representing biclusters. We compare the proposed localization pre-processing with another pre-processing alternative, non-negative matrix factorization. We show that our fast and simple localization procedure provides similar or even better results than the computationally heavy matrix factorization pre-processing with regard to H-value tests. We next demonstrate that the performances of the three representative greedy iterative heuristic methods improve with localization pre-processing when biological correlations in the form of functional enrichment and PPI verification constitute the main performance

  14. A Greedy Approach for Placement of Subsurface Aquifer Wells in an Ensemble Filtering Framework

    KAUST Repository

    El Gharamti, Mohamad; Marzouk, Youssef M.; Huan, Xun; Hoteit, Ibrahim

    2015-01-01

    Optimizing well placement may help in better understanding subsurface solute transport and detecting contaminant plumes. In this work, we use the ensemble Kalman filter (EnKF) as a data assimilation tool and propose a greedy observational design algorithm to optimally select aquifer well locations for updating the prior contaminant ensemble. The algorithm is greedy in the sense that it operates sequentially, without taking into account expected future gains. The selection criterion is based on maximizing the information gain that the EnKF carries during the update of the prior uncertainties. We test the efficiency of this algorithm in a synthetic aquifer system where a contaminant plume is set to migrate over a 30-year period across a heterogeneous domain.

  15. A Greedy Approach for Placement of Subsurface Aquifer Wells in an Ensemble Filtering Framework

    KAUST Repository

    El Gharamti, Mohamad

    2015-11-26

    Optimizing well placement may help in better understanding subsurface solute transport and detecting contaminant plumes. In this work, we use the ensemble Kalman filter (EnKF) as a data assimilation tool and propose a greedy observational design algorithm to optimally select aquifer well locations for updating the prior contaminant ensemble. The algorithm is greedy in the sense that it operates sequentially, without taking into account expected future gains. The selection criterion is based on maximizing the information gain that the EnKF carries during the update of the prior uncertainties. We test the efficiency of this algorithm in a synthetic aquifer system where a contaminant plume is set to migrate over a 30-year period across a heterogeneous domain.

  16. Iterated greedy algorithms to minimize the total family flow time for job-shop scheduling with job families and sequence-dependent set-ups

    Science.gov (United States)

    Kim, Ji-Su; Park, Jung-Hyeon; Lee, Dong-Ho

    2017-10-01

    This study addresses a variant of job-shop scheduling in which jobs are grouped into job families, but they are processed individually. The problem can be found in various industrial systems, especially in reprocessing shops of remanufacturing systems. If the reprocessing shop is a job-shop type and has the component-matching requirements, it can be regarded as a job shop with job families since the components of a product constitute a job family. In particular, sequence-dependent set-ups in which set-up time depends on the job just completed and the next job to be processed are also considered. The objective is to minimize the total family flow time, i.e. the maximum among the completion times of the jobs within a job family. A mixed-integer programming model is developed and two iterated greedy algorithms with different local search methods are proposed. Computational experiments were conducted on modified benchmark instances and the results are reported.
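    As a hedged sketch of the iterated greedy framework referred to above (a generic destruction–construction loop on a job permutation; the job families, sequence-dependent set-ups, and total family flow time objective of the paper are not modelled, and the toy cost function below is an assumption for illustration):

```python
import random

def iterated_greedy(jobs, cost, n_iter=200, d=2, seed=1):
    """Generic iterated greedy: repeatedly remove d jobs from the incumbent
    sequence (destruction) and greedily re-insert each at its best position."""
    rng = random.Random(seed)
    best = list(jobs)
    for _ in range(n_iter):
        seq = best[:]
        removed = [seq.pop(rng.randrange(len(seq))) for _ in range(d)]   # destruction
        for job in removed:                                              # greedy construction
            positions = range(len(seq) + 1)
            k = min(positions, key=lambda p: cost(seq[:p] + [job] + seq[p:]))
            seq.insert(k, job)
        if cost(seq) < cost(best):
            best = seq
    return best

# Toy cost: total weighted completion time on one machine, jobs as (proc_time, weight)
jobs = [(3, 1), (1, 4), (2, 2), (4, 3)]
def cost(seq):
    t, total = 0, 0
    for p, w in seq:
        t += p
        total += w * t
    return total

print(iterated_greedy(jobs, cost))
```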

  17. Approximation algorithms for a genetic diagnostics problem.

    Science.gov (United States)

    Kosaraju, S R; Schäffer, A A; Biesecker, L G

    1998-01-01

    We define and study a combinatorial problem called WEIGHTED DIAGNOSTIC COVER (WDC) that models the use of a laboratory technique called genotyping in the diagnosis of an important class of chromosomal aberrations. An optimal solution to WDC would enable us to define a genetic assay that maximizes the diagnostic power for a specified cost of laboratory work. We develop approximation algorithms for WDC by making use of the well-known problem SET COVER for which the greedy heuristic has been extensively studied. We prove worst-case performance bounds on the greedy heuristic for WDC and for another heuristic we call directional greedy. We implemented both heuristics. We also implemented a local search heuristic that takes the solutions obtained by greedy and dir-greedy and applies swaps until they are locally optimal. We report their performance on a real data set that is representative of the options that a clinical geneticist faces for the real diagnostic problem. Many open problems related to WDC remain, both of theoretical interest and practical importance.
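    Since WDC is approached through the well-studied SET COVER greedy heuristic, a minimal sketch of that heuristic (cost-effectiveness greedy choice) is shown below; the sets and weights are invented placeholders, and the code assumes the sets jointly cover the universe.

```python
def greedy_weighted_set_cover(universe, sets, weights):
    """Classic greedy set cover: repeatedly pick the set with the lowest
    weight per newly covered element. Assumes the sets cover the universe."""
    uncovered = set(universe)
    chosen = []
    while uncovered:
        best = min((i for i in range(len(sets)) if sets[i] & uncovered),
                   key=lambda i: weights[i] / len(sets[i] & uncovered))
        chosen.append(best)
        uncovered -= sets[best]
    return chosen

universe = range(1, 7)
sets = [{1, 2, 3}, {3, 4}, {4, 5, 6}, {1, 6}]
weights = [3.0, 1.0, 3.5, 1.5]
print(greedy_weighted_set_cover(universe, sets, weights))
```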

  18. Treatment planning for prostate brachytherapy using region of interest adjoint functions and a greedy heuristic

    International Nuclear Information System (INIS)

    Yoo, Sua; Kowalok, Michael E; Thomadsen, Bruce R; Henderson, Douglass L

    2003-01-01

    We have developed an efficient treatment-planning algorithm for prostate implants that is based on region of interest (ROI) adjoint functions and a greedy heuristic. For this work, we define the adjoint function for an ROI as the sensitivity of the average dose in the ROI to a unit-strength brachytherapy source at any seed position. The greedy heuristic uses a ratio of target and critical structure adjoint functions to rank seed positions according to their ability to irradiate the target ROI while sparing critical structure ROIs. This ratio is computed once for each seed position prior to the optimization process. Optimization is performed by a greedy heuristic that selects seed positions according to their ratio values. With this method, clinically acceptable treatment plans are obtained in less than 2 s. For comparison, a branch-and-bound method to solve a mixed integer-programming model took more than 50 min to arrive at a feasible solution. Both methods achieved good treatment plans, but the speedup provided by the greedy heuristic was a factor of approximately 1500. This attribute makes this algorithm suitable for intra-operative real-time treatment planning
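    A schematic of the ratio-based greedy selection described above, purely for illustration: the adjoint values and dose model here are random stand-ins (assumptions), not the precomputed target and critical-structure adjoint functions or the dosimetry of the paper.

```python
import numpy as np

def greedy_seed_selection(target_adjoint, critical_adjoint, dose_per_seed, dose_goal):
    """Rank candidate seed positions by the target/critical adjoint ratio
    (computed once) and add seeds in that order until the dose goal is met."""
    ratio = target_adjoint / critical_adjoint          # precomputed ranking criterion
    order = np.argsort(-ratio)                         # best ratio first
    chosen, dose = [], 0.0
    for idx in order:
        if dose >= dose_goal:
            break
        chosen.append(int(idx))
        dose += dose_per_seed[idx]
    return chosen

rng = np.random.default_rng(42)
n = 50                                                  # candidate seed positions (made up)
sel = greedy_seed_selection(rng.random(n) + 0.1, rng.random(n) + 0.1,
                            dose_per_seed=np.full(n, 5.0), dose_goal=40.0)
print(sel)
```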

  19. A comparison of performance measures for online algorithms

    DEFF Research Database (Denmark)

    Boyar, Joan; Irani, Sandy; Larsen, Kim Skak

    2009-01-01

    is to balance greediness and adaptability. We examine how these measures evaluate the Greedy Algorithm and Lazy Double Coverage, commonly studied algorithms in the context of server problems. We examine Competitive Analysis, the Max/Max Ratio, the Random Order Ratio, Bijective Analysis and Relative Worst Order Analysis, and determine how they compare the two algorithms. We find that by the Max/Max Ratio and Bijective Analysis, Greedy is the better algorithm. Under the other measures Lazy Double Coverage is better, though Relative Worst Order Analysis indicates that Greedy is sometimes better. Our results also provide the first proof of optimality of an algorithm under Relative Worst Order Analysis.

  20. A trust-based sensor allocation algorithm in cooperative space search problems

    Science.gov (United States)

    Shen, Dan; Chen, Genshe; Pham, Khanh; Blasch, Erik

    2011-06-01

    Sensor allocation is an important and challenging problem within the field of multi-agent systems. The sensor allocation problem involves deciding how to assign a number of targets or cells to a set of agents according to some allocation protocol. Generally, in order to make efficient allocations, we need to design mechanisms that consider both the task performers' costs for the service and the associated probability of success (POS). In our problem, the costs are the used sensor resource, and the POS is the target tracking performance. Usually, POS may be perceived differently by different agents because they typically have different standards or means of evaluating the performance of their counterparts (other sensors in the search and tracking problem). Given this, we turn to the notion of trust to capture such subjective perceptions. In our approach, we develop a trust model to construct a novel mechanism that motivates sensor agents to limit their greediness or selfishness. Then we model the sensor allocation optimization problem with trust-in-loop negotiation game and solve it using a sub-game perfect equilibrium. Numerical simulations are performed to demonstrate the trust-based sensor allocation algorithm in cooperative space situation awareness (SSA) search problems.

  1. Composite Differential Search Algorithm

    Directory of Open Access Journals (Sweden)

    Bo Liu

    2014-01-01

    Full Text Available Differential search algorithm (DS) is a relatively new evolutionary algorithm inspired by the Brownian-like random-walk movement which is used by an organism to migrate. It has been verified to be more effective than ABC, JDE, JADE, SADE, EPSDE, GSA, PSO2011, and CMA-ES. In this paper, we propose four improved solution search algorithms, namely “DS/rand/1,” “DS/rand/2,” “DS/current to rand/1,” and “DS/current to rand/2” to search the new space and enhance the convergence rate for the global optimization problem. In order to verify the performance of different solution search methods, 23 benchmark functions are employed. Experimental results indicate that the proposed algorithm performs better than, or at least comparably to, the original algorithm when considering the quality of the solution obtained. However, these schemes still cannot achieve the best solution for all functions. In order to further enhance the convergence rate and the diversity of the algorithm, a composite differential search algorithm (CDS) is proposed in this paper. This new algorithm combines three new proposed search schemes including “DS/rand/1,” “DS/rand/2,” and “DS/current to rand/1” with three control parameters using a random method to generate the offspring. Experimental results show that CDS has a faster convergence rate and better search ability based on the 23 benchmark functions.

  2. Optimal Fungal Space Searching Algorithms.

    Science.gov (United States)

    Asenova, Elitsa; Lin, Hsin-Yu; Fu, Eileen; Nicolau, Dan V; Nicolau, Dan V

    2016-10-01

    Previous experiments have shown that fungi use an efficient natural algorithm for searching the space available for their growth in micro-confined networks, e.g., mazes. This natural "master" algorithm, which comprises two "slave" sub-algorithms, i.e., collision-induced branching and directional memory, has been shown to be more efficient than alternatives, with one, or the other, or both sub-algorithms turned off. In contrast, the present contribution compares the performance of the fungal natural algorithm against several standard artificial homologues. It was found that the space-searching fungal algorithm consistently outperforms uninformed algorithms, such as Depth-First-Search (DFS). Furthermore, while the natural algorithm is inferior to informed ones, such as A*, this under-performance does not importantly increase with the increase of the size of the maze. These findings suggest that a systematic effort of harvesting the natural space searching algorithms used by microorganisms is warranted and possibly overdue. These natural algorithms, if efficient, can be reverse-engineered for graph and tree search strategies.

  3. A Particle Swarm Optimization-Based Approach with Local Search for Predicting Protein Folding.

    Science.gov (United States)

    Yang, Cheng-Hong; Lin, Yu-Shiun; Chuang, Li-Yeh; Chang, Hsueh-Wei

    2017-10-01

    The hydrophobic-polar (HP) model is commonly used for predicting protein folding structures and hydrophobic interactions. This study developed a particle swarm optimization (PSO)-based algorithm combined with local search algorithms; specifically, the high exploration PSO (HEPSO) algorithm (which can execute global search processes) was combined with three local search algorithms (hill-climbing algorithm, greedy algorithm, and Tabu table), yielding the proposed HE-L-PSO algorithm. By using 20 known protein structures, we evaluated the performance of the HE-L-PSO algorithm in predicting protein folding in the HP model. The proposed HE-L-PSO algorithm exhibited favorable performance in predicting both short and long amino acid sequences with high reproducibility and stability, compared with seven reported algorithms. The HE-L-PSO algorithm yielded optimal solutions for all predicted protein folding structures. All HE-L-PSO-predicted protein folding structures possessed a hydrophobic core that is similar to normal protein folding.

  4. Metaheuristic algorithms for building Covering Arrays: A review

    Directory of Open Access Journals (Sweden)

    Jimena Adriana Timaná-Peña

    2016-09-01

    Full Text Available Covering Arrays (CA) are mathematical objects used in the functional testing of software components. They enable the testing of all interactions of a given size of input parameters in a procedure, function, or logical unit in general, using the minimum number of test cases. Building CAs is a complex task (an NP-complete problem) that involves lengthy execution times and high computational loads. The most effective methods for building CAs are algebraic, Greedy, and metaheuristic-based. The latter have reported the best results to date. This paper presents a description of the major contributions made by a selection of different metaheuristics, including simulated annealing, tabu search, genetic algorithms, ant colony algorithms, particle swarm algorithms, and harmony search algorithms. It is worth noting that simulated annealing-based algorithms have evolved as the most competitive, and currently form the state of the art.

  5. Quasi-greedy systems of integer translates

    DEFF Research Database (Denmark)

    Nielsen, Morten; Sikic, Hrvoje

    We consider quasi-greedy systems of integer translates in a finitely generated shift invariant subspace of L2(Rd), that is systems for which the thresholding approximation procedure is well behaved. We prove that every quasi-greedy system of integer translates is also a Riesz basis for its closed...

  6. Quasi-greedy systems of integer translates

    DEFF Research Database (Denmark)

    Nielsen, Morten; Sikic, Hrvoje

    2008-01-01

    We consider quasi-greedy systems of integer translates in a finitely generated shift-invariant subspace of L2(Rd), that is systems for which the thresholding approximation procedure is well behaved. We prove that every quasi-greedy system of integer translates is also a Riesz basis for its closed...

  7. A review on quantum search algorithms

    Science.gov (United States)

    Giri, Pulak Ranjan; Korepin, Vladimir E.

    2017-12-01

    The use of superposition of states in quantum computation, known as quantum parallelism, has a significant speed advantage over classical computation. This is evident from early quantum algorithms such as Deutsch's algorithm, the Deutsch-Jozsa algorithm and its variation the Bernstein-Vazirani algorithm, Simon's algorithm, Shor's algorithms, etc. Quantum parallelism also significantly speeds up the database search algorithm, which is important in computer science because it comes as a subroutine in many important algorithms. Grover's quantum database search achieves the task of finding the target element in an unsorted database in a time quadratically faster than a classical computer. We review Grover's quantum search algorithms for single and multiple target elements in a database. The partial search algorithm of Grover and Radhakrishnan and its optimization by Korepin, called the GRK algorithm, are also discussed.
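    For readers who want to experiment, here is a tiny state-vector simulation of Grover's search for a single marked element (the standard textbook construction with roughly (π/4)√N oracle calls; it is not related to the GRK partial-search variant discussed in the record):

```python
import math
import numpy as np

def grover(n_items=16, marked=3):
    """Simulate Grover's algorithm on an unsorted 'database' of n_items entries."""
    state = np.full(n_items, 1 / math.sqrt(n_items))          # uniform superposition
    n_iter = int(round(math.pi / 4 * math.sqrt(n_items)))     # ~ (pi/4) sqrt(N) iterations
    for _ in range(n_iter):
        state[marked] *= -1                                    # oracle: phase-flip the target
        state = 2 * state.mean() - state                       # diffusion: inversion about the mean
    probs = state ** 2
    return int(np.argmax(probs)), probs[marked]

idx, prob = grover()
print(f"measured index {idx} with probability {prob:.3f}")
```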

  8. On local optima in learning bayesian networks

    DEFF Research Database (Denmark)

    Dalgaard, Jens; Kocka, Tomas; Pena, Jose

    2003-01-01

    This paper proposes and evaluates the k-greedy equivalence search algorithm (KES) for learning Bayesian networks (BNs) from complete data. The main characteristic of KES is that it allows a trade-off between greediness and randomness, thus exploring different good local optima. When greediness is set at maximum, KES corresponds to the greedy equivalence search algorithm (GES). When greediness is kept at minimum, we prove that under mild assumptions KES asymptotically returns any inclusion optimal BN with nonzero probability. Experimental results for both synthetic and real data are reported.

  9. Quantum random-walk search algorithm

    International Nuclear Information System (INIS)

    Shenvi, Neil; Whaley, K. Birgitta; Kempe, Julia

    2003-01-01

    Quantum random walks on graphs have been shown to display many interesting properties, including exponentially fast hitting times when compared with their classical counterparts. However, it is still unclear how to use these novel properties to gain an algorithmic speedup over classical algorithms. In this paper, we present a quantum search algorithm based on the quantum random-walk architecture that provides such a speedup. It will be shown that this algorithm performs an oracle search on a database of N items with O(√(N)) calls to the oracle, yielding a speedup similar to other quantum search algorithms. It appears that the quantum random-walk formulation has considerable flexibility, presenting interesting opportunities for development of other, possibly novel quantum algorithms

  10. BILGO: Bilateral greedy optimization for large scale semidefinite programming

    KAUST Repository

    Hao, Zhifeng; Yuan, Ganzhao; Ghanem, Bernard

    2013-01-01

    Many machine learning tasks (e.g. metric and manifold learning problems) can be formulated as convex semidefinite programs. To enable the application of these tasks on a large-scale, scalability and computational efficiency are considered as desirable properties for a practical semidefinite programming algorithm. In this paper, we theoretically analyze a new bilateral greedy optimization (denoted BILGO) strategy in solving general semidefinite programs on large-scale datasets. As compared to existing methods, BILGO employs a bilateral search strategy during each optimization iteration. In such an iteration, the current semidefinite matrix solution is updated as a bilateral linear combination of the previous solution and a suitable rank-1 matrix, which can be efficiently computed from the leading eigenvector of the descent direction at this iteration. By optimizing for the coefficients of the bilateral combination, BILGO reduces the cost function in every iteration until the KKT conditions are fully satisfied, thus, it tends to converge to a global optimum. In fact, we prove that BILGO converges to the global optimal solution at a rate of O(1/k), where k is the iteration counter. The algorithm thus successfully combines the efficiency of conventional rank-1 update algorithms and the effectiveness of gradient descent. Moreover, BILGO can be easily extended to handle low rank constraints. To validate the effectiveness and efficiency of BILGO, we apply it to two important machine learning tasks, namely Mahalanobis metric learning and maximum variance unfolding. Extensive experimental results clearly demonstrate that BILGO can solve large-scale semidefinite programs efficiently.

  11. BILGO: Bilateral greedy optimization for large scale semidefinite programming

    KAUST Repository

    Hao, Zhifeng

    2013-10-03

    Many machine learning tasks (e.g. metric and manifold learning problems) can be formulated as convex semidefinite programs. To enable the application of these tasks on a large-scale, scalability and computational efficiency are considered as desirable properties for a practical semidefinite programming algorithm. In this paper, we theoretically analyze a new bilateral greedy optimization (denoted BILGO) strategy in solving general semidefinite programs on large-scale datasets. As compared to existing methods, BILGO employs a bilateral search strategy during each optimization iteration. In such an iteration, the current semidefinite matrix solution is updated as a bilateral linear combination of the previous solution and a suitable rank-1 matrix, which can be efficiently computed from the leading eigenvector of the descent direction at this iteration. By optimizing for the coefficients of the bilateral combination, BILGO reduces the cost function in every iteration until the KKT conditions are fully satisfied, thus, it tends to converge to a global optimum. In fact, we prove that BILGO converges to the global optimal solution at a rate of O(1/k), where k is the iteration counter. The algorithm thus successfully combines the efficiency of conventional rank-1 update algorithms and the effectiveness of gradient descent. Moreover, BILGO can be easily extended to handle low rank constraints. To validate the effectiveness and efficiency of BILGO, we apply it to two important machine learning tasks, namely Mahalanobis metric learning and maximum variance unfolding. Extensive experimental results clearly demonstrate that BILGO can solve large-scale semidefinite programs efficiently.

  12. Algorithms

    Indian Academy of Sciences (India)

    to as 'divide-and-conquer'. Although there has been a large effort in realizing efficient algorithms, there are not many universally accepted algorithm design paradigms. In this article, we illustrate algorithm design techniques such as balancing, greedy strategy, dynamic programming strategy, and backtracking or traversal of ...

  13. From greedy to lazy expansions and their driving dynamics

    NARCIS (Netherlands)

    Dajani, K.; Kraaikamp, C.

    2001-01-01

    In this paper we study the ergodic properties of non-greedy series expansions to non-integer bases β > 1. It is shown that the so-called 'lazy' expansion is isomorphic to the 'greedy' expansion. Furthermore, a class of expansions to base β > 1, β ∉ Z, 'in between' the lazy and the greedy

  14. Continuous grasp algorithm applied to economic dispatch problem of thermal units

    Energy Technology Data Exchange (ETDEWEB)

    Vianna Neto, Julio Xavier [Pontifical Catholic University of Parana - PUCPR, Curitiba, PR (Brazil). Undergraduate Program at Mechatronics Engineering; Bernert, Diego Luis de Andrade; Coelho, Leandro dos Santos [Pontifical Catholic University of Parana - PUCPR, Curitiba, PR (Brazil). Industrial and Systems Engineering Graduate Program, LAS/PPGEPS], e-mail: leandro.coelho@pucpr.br

    2010-07-01

    The economic dispatch problem (EDP) is one of the fundamental issues in power systems for obtaining benefits in stability, reliability and security. Its objective is to allocate the power demand among committed generators in the most economical manner, while all physical and operational constraints are satisfied. The cost of power generation, particularly in fossil fuel plants, is very high and economic dispatch helps in saving a significant amount of revenue. Recently, as an alternative to the conventional mathematical approaches, modern heuristic optimization techniques such as simulated annealing, evolutionary algorithms, neural networks, ant colony, and tabu search have been given much attention by many researchers due to their ability to find an almost global optimal solution in EDPs. On the other hand, continuous GRASP (C-GRASP) is a stochastic local search meta-heuristic for finding cost-efficient solutions to continuous global optimization problems subject to box constraints. Like a greedy randomized adaptive search procedure (GRASP), a C-GRASP is a multi-start procedure where a starting solution for local improvement is constructed in a greedy randomized fashion. The C-GRASP algorithm is validated on a test system consisting of fifteen units that takes into account spinning reserve and prohibited operating zone constraints. (author)
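    A bare-bones GRASP skeleton is sketched below to make the two phases named above visible: a greedy randomized construction driven by a restricted candidate list (RCL), followed by a simple local search, all repeated from multiple starts. The problem it solves is a deliberately tiny toy (pick k items of maximum value), an assumption for illustration, not the continuous C-GRASP or the dispatch model of the paper.

```python
import random

def grasp_max_k(values, k, alpha=0.3, n_starts=50, seed=0):
    """Toy GRASP: select k indices maximizing total value.
    Construction: pick randomly from an RCL of the best remaining items.
    Local search: improving 1-swaps between the solution and the rest."""
    rng = random.Random(seed)
    best_sol, best_val = None, float("-inf")
    for _ in range(n_starts):
        remaining = list(range(len(values)))
        sol = []
        while len(sol) < k:                                  # greedy randomized construction
            remaining.sort(key=lambda i: -values[i])
            rcl = remaining[:max(1, int(alpha * len(remaining)))]
            pick = rng.choice(rcl)
            sol.append(pick)
            remaining.remove(pick)
        improved = True
        while improved:                                      # local search phase
            improved = False
            for i in list(sol):
                for j in remaining:
                    if values[j] > values[i]:
                        sol.remove(i); sol.append(j)
                        remaining.remove(j); remaining.append(i)
                        improved = True
                        break
                if improved:
                    break
        val = sum(values[i] for i in sol)
        if val > best_val:
            best_sol, best_val = sol[:], val
    return best_sol, best_val

print(grasp_max_k([4, 9, 1, 7, 3, 8, 2], k=3))
```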

  15. Search Parameter Optimization for Discrete, Bayesian, and Continuous Search Algorithms

    Science.gov (United States)

    2017-09-01

    NAVAL POSTGRADUATE SCHOOL, Monterey, California. Thesis: Search Parameter Optimization for Discrete, Bayesian, and Continuous Search Algorithms (September 2017). Applications range from simple search and rescue acts to prosecuting aerial/surface/submersible targets on mission. This research looks at varying the known discrete and ...

  16. Analysis of Greedy Decision Making for Geographic Routing for Networks of Randomly Moving Objects

    Directory of Open Access Journals (Sweden)

    Amber Israr

    2016-04-01

    Full Text Available Autonomous and self-organizing wireless ad-hoc communication networks for moving objects consist of nodes, which use no centralized network infrastructure. Examples of moving object networks are networks of flying objects, networks of vehicles, and networks of moving people or robots. Moving object networks have to face many critical challenges in terms of routing because of dynamic topological changes and asymmetric network links. A suitable and effective routing mechanism helps to extend the deployment of moving nodes. In this paper an attempt has been made to analyze the performance of the Greedy Decision method (a position-aware, distance-based algorithm) for geographic routing for network nodes moving according to the random waypoint mobility model. The widely used GPSR (Greedy Perimeter Stateless Routing) protocol utilizes geographic distance and position-based data of nodes to transmit packets towards destination nodes. In this paper different scenarios have been tested to develop a concrete set of recommendations for optimum deployment of distance-based Greedy Decision geographic routing in networks of randomly moving objects.
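    For reference, a toy sketch of the greedy, position-aware forwarding decision described above (not the network simulation used in the paper): each node forwards the packet to the neighbour geographically closest to the destination, and forwarding stops at a local minimum when no neighbour is closer than the current node. The positions and neighbour lists are made-up placeholders.

```python
import math

def greedy_forward(positions, neighbors, src, dst):
    """Greedy geographic routing: at each hop pick the neighbour closest to dst.
    positions: {node: (x, y)}, neighbors: {node: [node, ...]}.
    Returns (path, delivered)."""
    d = lambda a, b: math.dist(positions[a], positions[b])
    path, current = [src], src
    while current != dst:
        if not neighbors[current]:
            return path, False
        nxt = min(neighbors[current], key=lambda n: d(n, dst))   # greedy decision
        if d(nxt, dst) >= d(current, dst):                       # local minimum: greedy fails
            return path, False
        path.append(nxt)
        current = nxt
    return path, True

positions = {0: (0, 0), 1: (1, 1), 2: (2, 1), 3: (3, 0)}
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(greedy_forward(positions, neighbors, 0, 3))
```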

  17. A hybrid search algorithm for swarm robots searching in an unknown environment.

    Science.gov (United States)

    Li, Shoutao; Li, Lina; Lee, Gordon; Zhang, Hao

    2014-01-01

    This paper proposes a novel method to improve the efficiency of a swarm of robots searching in an unknown environment. The approach focuses on the process of feeding and individual coordination characteristics inspired by the foraging behavior in nature. A predatory strategy was used for searching; hence, this hybrid approach integrated a random search technique with a dynamic particle swarm optimization (DPSO) search algorithm. If a search robot could not find any target information, it used a random search algorithm for a global search. If the robot found any target information in a region, the DPSO search algorithm was used for a local search. This particle swarm optimization search algorithm is dynamic as all the parameters in the algorithm are refreshed synchronously through a communication mechanism until the robots find the target position, after which, the robots fall back to a random searching mode. Thus, in this searching strategy, the robots alternated between two searching algorithms until the whole area was covered. During the searching process, the robots used a local communication mechanism to share map information and DPSO parameters to reduce the communication burden and overcome hardware limitations. If the search area is very large, search efficiency may be greatly reduced if only one robot searches an entire region given the limited resources available and time constraints. In this research we divided the entire search area into several subregions, selected a target utility function to determine which subregion should be initially searched and thereby reduced the residence time of the target to improve search efficiency.

  18. Adiabatic quantum search algorithm for structured problems

    International Nuclear Information System (INIS)

    Roland, Jeremie; Cerf, Nicolas J.

    2003-01-01

    The study of quantum computation has been motivated by the hope of finding efficient quantum algorithms for solving classically hard problems. In this context, quantum algorithms by local adiabatic evolution have been shown to solve an unstructured search problem with a quadratic speedup over a classical search, just as Grover's algorithm. In this paper, we study how the structure of the search problem may be exploited to further improve the efficiency of these quantum adiabatic algorithms. We show that by nesting a partial search over a reduced set of variables into a global search, it is possible to devise quantum adiabatic algorithms with a complexity that, although still exponential, grows with a reduced order in the problem size

  19. A HYBRID HEURISTIC ALGORITHM FOR THE CLUSTERED TRAVELING SALESMAN PROBLEM

    Directory of Open Access Journals (Sweden)

    Mário Mestria

    2016-04-01

    Full Text Available This paper proposes a hybrid heuristic algorithm, based on the metaheuristics Greedy Randomized Adaptive Search Procedure, Iterated Local Search and Variable Neighborhood Descent, to solve the Clustered Traveling Salesman Problem (CTSP). The hybrid heuristic algorithm uses several variable neighborhood structures, combining intensification (using local search operators) and diversification (a constructive heuristic and a perturbation routine). In the CTSP, the vertices are partitioned into clusters and all vertices of each cluster have to be visited contiguously. The CTSP is NP-hard since it includes the well-known Traveling Salesman Problem (TSP) as a special case. Our hybrid heuristic is compared with three heuristics from the literature and an exact method. Computational experiments are reported for different classes of instances. Experimental results show that the proposed hybrid heuristic obtains competitive results within reasonable computational time.

  20. Quantum-circuit model of Hamiltonian search algorithms

    International Nuclear Information System (INIS)

    Roland, Jeremie; Cerf, Nicolas J.

    2003-01-01

    We analyze three different quantum search algorithms, namely, the traditional circuit-based Grover's algorithm, its continuous-time analog by Hamiltonian evolution, and the quantum search by local adiabatic evolution. We show that these algorithms are closely related in the sense that they all perform a rotation, at a constant angular velocity, from a uniform superposition of all states to the solution state. This makes it possible to implement the two Hamiltonian-evolution algorithms on a conventional quantum circuit, while keeping the quadratic speedup of Grover's original algorithm. It also clarifies the link between the adiabatic search algorithm and Grover's algorithm

  1. Fast algorithm of adaptive Fourier series

    Science.gov (United States)

    Gao, You; Ku, Min; Qian, Tao

    2018-05-01

    Adaptive Fourier decomposition (AFD, precisely 1-D AFD or Core-AFD) was originated for the goal of positive frequency representations of signals. It achieved the goal and at the same time offered fast decompositions of signals. There then arose several types of AFDs. AFD merged with the greedy algorithm idea, and in particular, motivated the so-called pre-orthogonal greedy algorithm (Pre-OGA) that was proven to be the most efficient greedy algorithm. The cost of the advantages of the AFD type decompositions is, however, the high computational complexity due to the involvement of maximal selections of the dictionary parameters. The present paper offers one formulation of the 1-D AFD algorithm by building the FFT algorithm into it. Accordingly, the algorithm complexity is reduced, from the original $\\mathcal{O}(M N^2)$ to $\\mathcal{O}(M N\\log_2 N)$, where $N$ denotes the number of the discretization points on the unit circle and $M$ denotes the number of points in $[0,1)$. This greatly enhances the applicability of AFD. Experiments are carried out to show the high efficiency of the proposed algorithm.

  2. A DIFFERENTIAL EVOLUTION ALGORITHM DEVELOPED FOR A NURSE SCHEDULING PROBLEM

    Directory of Open Access Journals (Sweden)

    Shahnazari-Shahrezaei, P.

    2012-11-01

    Full Text Available Nurse scheduling is a type of manpower allocation problem that tries to satisfy hospital managers' objectives and nurses' preferences as much as possible by generating fair shift schedules. This paper presents a nurse scheduling problem based on a real case study, and proposes two meta-heuristics, a differential evolution algorithm (DE) and a greedy randomised adaptive search procedure (GRASP), to solve it. To investigate the efficiency of the proposed algorithms, two problems are solved. Furthermore, some comparison metrics are applied to examine the reliability of the proposed algorithms. The computational results in this paper show that the proposed DE outperforms the GRASP.

  3. A new efficient RLF-like algorithm for the vertex coloring problem

    Directory of Open Access Journals (Sweden)

    Adegbindin Mourchid

    2016-01-01

    Full Text Available The Recursive Largest First (RLF) algorithm is one of the most popular greedy heuristics for the vertex coloring problem. It sequentially builds color classes on the basis of greedy choices. In particular, the first vertex placed in a color class C is one with a maximum number of uncolored neighbors, and the next vertices placed in C are chosen so that they have as many uncolored neighbors as possible among those that cannot be placed in C. These greedy choices can have a significant impact on the performance of the algorithm, which explains why we propose alternative selection rules. Computational experiments on 63 difficult DIMACS instances show that the resulting new RLF-like algorithm, when compared with the standard RLF, yields a reduction of more than 50% of the gap between the number of colors used and the best known upper bound on the chromatic number. The new greedy algorithm even competes with basic metaheuristics for the vertex coloring problem.
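    A compact sketch of the standard RLF greedy choices described above, for a graph given as an adjacency dict (the alternative selection rules proposed in the paper are not reproduced here):

```python
def rlf_coloring(adj):
    """Recursive Largest First: build one color class (independent set) at a time.
    adj: {vertex: set(neighbours)}. Returns {vertex: color_index}."""
    uncolored = set(adj)
    color, c = {}, 0
    while uncolored:
        # U: vertices still eligible for this class; W: uncolored vertices excluded from it
        U, W, cls = set(uncolored), set(), set()
        # first vertex of the class: maximum number of uncolored neighbours
        v = max(U, key=lambda x: len(adj[x] & uncolored))
        while True:
            cls.add(v)
            U.discard(v)
            newly_blocked = adj[v] & U       # neighbours of v can no longer join this class
            U -= newly_blocked
            W |= newly_blocked
            if not U:
                break
            # next vertex: as many neighbours as possible among the excluded set W
            v = max(U, key=lambda x: len(adj[x] & W))
        for u in cls:
            color[u] = c
        uncolored -= cls
        c += 1
    return color

# A 5-cycle needs 3 colors
adj = {0: {1, 4}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3, 0}}
print(rlf_coloring(adj))
```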

  4. Generalized Jaynes-Cummings model as a quantum search algorithm

    International Nuclear Information System (INIS)

    Romanelli, A.

    2009-01-01

    We propose a continuous time quantum search algorithm using a generalization of the Jaynes-Cummings model. In this model the states of the atom are the elements among which the algorithm realizes the search, exciting resonances between the initial and the searched states. This algorithm behaves like Grover's algorithm; the optimal search time is proportional to the square root of the size of the search set and the probability to find the searched state oscillates periodically in time. In this frame, it is possible to reinterpret the usual Jaynes-Cummings model as a trivial case of the quantum search algorithm.

  5. An ILP based memetic algorithm for finding minimum positive influence dominating sets in social networks

    Science.gov (United States)

    Lin, Geng; Guan, Jian; Feng, Huibin

    2018-06-01

    The positive influence dominating set problem is a variant of the minimum dominating set problem, and has lots of applications in social networks. It is NP-hard, and receives more and more attention. Various methods have been proposed to solve the positive influence dominating set problem. However, most of the existing work focused on greedy algorithms, and the solution quality needs to be improved. In this paper, we formulate the minimum positive influence dominating set problem as an integer linear programming (ILP), and propose an ILP based memetic algorithm (ILPMA) for solving the problem. The ILPMA integrates a greedy randomized adaptive construction procedure, a crossover operator, a repair operator, and a tabu search procedure. The performance of ILPMA is validated on nine real-world social networks with nodes up to 36,692. The results show that ILPMA significantly improves the solution quality, and is robust.

  6. Detection of Cheating by Decimation Algorithm

    Science.gov (United States)

    Yamanaka, Shogo; Ohzeki, Masayuki; Decelle, Aurélien

    2015-02-01

    We expand the item response theory to study the case of "cheating students" for a set of exams, trying to detect them by applying a greedy inference algorithm. This extended model is closely related to Boltzmann machine learning. In this paper we aim to infer the correct biases and interactions of our model by considering a relatively small number of sets of training data. Nevertheless, the greedy algorithm that we employed in the present study exhibits good performance with a small number of training data. The key point is the sparseness of the interactions in our problem in the context of Boltzmann machine learning: the existence of cheating students is expected to be very rare (possibly even in the real world). We compare a standard approach for inferring the sparse interactions in Boltzmann machine learning with our greedy algorithm and find the latter to be superior in several aspects.

  7. Search algorithms, hidden labour and information control

    Directory of Open Access Journals (Sweden)

    Paško Bilić

    2016-06-01

    Full Text Available The paper examines some of the processes of the closely knit relationship between Google’s ideologies of neutrality and objectivity and global market dominance. Neutrality construction comprises an important element sustaining the company’s economic position and is reflected in constant updates, estimates and changes to utility and relevance of search results. Providing a purely technical solution to these issues proves to be increasingly difficult without a human hand in steering algorithmic solutions. Search relevance fluctuates and shifts through continuous tinkering and tweaking of the search algorithm. The company also uses third parties to hire human raters for performing quality assessments of algorithmic updates and adaptations in linguistically and culturally diverse global markets. The adaptation process contradicts the technical foundations of the company and calculations based on the initial Page Rank algorithm. Annual market reports, Google’s Search Quality Rating Guidelines, and reports from media specialising in search engine optimisation business are analysed. The Search Quality Rating Guidelines document provides a rare glimpse into the internal architecture of search algorithms and the notions of utility and relevance which are presented and structured as neutral and objective. Intertwined layers of ideology, hidden labour of human raters, advertising revenues, market dominance and control are discussed throughout the paper.

  8. Hybrid Feature Selection Approach Based on GRASP for Cancer Microarray Data

    Directory of Open Access Journals (Sweden)

    Arpita Nagpal

    2017-01-01

    Full Text Available Microarray data usually contain a large number of genes, but a small number of samples. Feature subset selection for microarray data aims at reducing the number of genes so that useful information can be extracted from the samples. Reducing the dimension of data sets further helps in improving the computational efficiency of the learning model. In this paper, we propose a modified algorithm based on tabu search as the local search procedure within a Greedy Randomized Adaptive Search Procedure (GRASP) for high-dimensional microarray data sets. The proposed Tabu-based Greedy Randomized Adaptive Search Procedure algorithm is named TGRASP. In TGRASP, a new parameter named Tabu Tenure has been introduced, and the existing parameters NumIter and size have been modified. We observed that different parameter settings affect the quality of the optimum. The second proposed algorithm, known as FFGRASP (Firefly Greedy Randomized Adaptive Search Procedure), uses a firefly optimization algorithm in the local search phase of the greedy randomized adaptive search procedure (GRASP). The firefly algorithm is a powerful algorithm for multimodal optimization problems. Experimental results show that the proposed TGRASP and FFGRASP algorithms are much better than the existing algorithm with respect to three performance parameters, viz. accuracy, run time, and size of the selected feature subset. We have also compared both approaches with a unified metric (Extended Adjusted Ratio of Ratios), which shows that the TGRASP approach outperforms the existing approach for six out of nine cancer microarray datasets and FFGRASP performs better on seven out of nine datasets.

  9. A hybrid heuristic algorithm for the open-pit-mining operational planning problem.

    OpenAIRE

    Souza, Marcone Jamilson Freitas; Coelho, Igor Machado; Ribas, Sabir; Santos, Haroldo Gambini; Merschmann, Luiz Henrique de Campos

    2010-01-01

    This paper deals with the Open-Pit-Mining Operational Planning problem with dynamic truck allocation. The objective is to optimize mineral extraction in the mines by minimizing the number of mining trucks used to meet production goals and quality requirements. According to the literature, this problem is NP-hard, so a heuristic strategy is justified. We present a hybrid algorithm that combines characteristics of two metaheuristics: Greedy Randomized Adaptive Search Procedures and General Varia...

  10. Quantum walks and search algorithms

    CERN Document Server

    Portugal, Renato

    2013-01-01

    This book addresses an interesting area of quantum computation called quantum walks, which play an important role in building quantum algorithms, in particular search algorithms. Quantum walks are the quantum analogue of classical random walks. It is known that quantum computers have great power for searching unsorted databases. This power extends to many kinds of searches, particularly to the problem of finding a specific location in a spatial layout, which can be modeled by a graph. The goal is to find a specific node knowing that the particle uses the edges to jump from one node to the next. This book is self-contained with main topics that include: Grover's algorithm, describing its geometrical interpretation and evolution by means of the spectral decomposition of the evolution operator; analytical solutions of quantum walks on important graphs like lines, cycles, two-dimensional lattices, and hypercubes using Fourier transforms; quantum walks on generic graphs, describing methods to calculate the limiting d...

  11. 2nd International Conference on Harmony Search Algorithm

    CERN Document Server

    Geem, Zong

    2016-01-01

    The Harmony Search Algorithm (HSA) is one of the most well-known techniques in the field of soft computing, an important paradigm in the science and engineering community.  This volume, the proceedings of the 2nd International Conference on Harmony Search Algorithm 2015 (ICHSA 2015), brings together contributions describing the latest developments in the field of soft computing with a special focus on HSA techniques. It includes coverage of new methods that have potentially immense application in various fields. Contributed articles cover aspects of the following topics related to the Harmony Search Algorithm: analytical studies; improved, hybrid and multi-objective variants; parameter tuning; and large-scale applications.  The book also contains papers discussing recent advances on the following topics: genetic algorithms; evolutionary strategies; the firefly algorithm and cuckoo search; particle swarm optimization and ant colony optimization; simulated annealing; and local search techniques.   This book ...

  12. Searching Process with Raita Algorithm and its Application

    Science.gov (United States)

    Rahim, Robbi; Saleh Ahmar, Ansari; Abdullah, Dahlan; Hartama, Dedy; Napitupulu, Darmawan; Putera Utama Siahaan, Andysah; Hasan Siregar, Muhammad Noor; Nasution, Nurliana; Sundari, Siti; Sriadhi, S.

    2018-04-01

    Searching is a common task for many computer users, and the Raita algorithm is one algorithm that can be used to match and find information according to the patterns entered. The Raita algorithm was applied to a file search application written in the Java programming language; testing showed that the file search is fast, returns accurate results, and supports many data types.
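    Assuming the record refers to the standard Raita variant of Boyer–Moore–Horspool (compare the pattern's last, first, and middle characters before the rest, then shift by the Horspool bad-character rule), a Python sketch is given below for illustration; the original application was written in Java.

```python
def raita_search(text, pattern):
    """Return the index of the first occurrence of pattern in text, or -1."""
    m, n = len(pattern), len(text)
    if m == 0 or m > n:
        return -1 if m else 0
    # Horspool-style bad-character shift table (distance from the last position)
    shift = {c: m - i - 1 for i, c in enumerate(pattern[:-1])}
    last, first, middle = pattern[-1], pattern[0], pattern[m // 2]
    i = 0
    while i <= n - m:
        window = text[i:i + m]
        # Raita's ordering of comparisons: last, first, middle, then the remainder
        if (window[-1] == last and window[0] == first
                and window[m // 2] == middle and window == pattern):
            return i
        i += shift.get(window[-1], m)        # Horspool shift on the window's last char
    return -1

print(raita_search("greedy search algorithms use greedy choices", "greedy ch"))
```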

  13. Formal Modeling of Greedy Nodes in 802.15.4 WSN

    Directory of Open Access Journals (Sweden)

    Youcef Hammal

    2015-06-01

    Full Text Available This paper deals with formal specification of the non-slotted CSMA/CA protocol in wireless sensor networks (WSNs) in which some nodes exhibit greedy behavior. This protocol requires sensor nodes to wait some time before initiating a transmission, whereas greedy nodes may try to reduce their waiting duration, which may penalize other nodes. To analyze their impact on WSN operation, we use timed automata of the model-checker UPPAAL to capture the abstract behavior of the communication medium, sane nodes, and greedy nodes in a WSN. This enables the use of analysis tools to check whether these models satisfy the intended properties.

  14. Improved Degree Search Algorithms in Unstructured P2P Networks

    Directory of Open Access Journals (Sweden)

    Guole Liu

    2012-01-01

    Full Text Available Searching for and retrieving the desired information is an important problem in networks; in particular, designing an efficient search algorithm is a key challenge in unstructured peer-to-peer (P2P) networks. Breadth-first search (BFS) and depth-first search (DFS) are the two typical search methods currently used. BFS-based algorithms achieve excellent search success rates for network resources, but generate a huge number of search messages. On the contrary, DFS-based algorithms reduce the number of search messages but also lower the search success ratio. To address the problem that only one of these performance measures is good at a time, we propose two memory function degree search algorithms: the memory function maximum degree algorithm (MD) and the memory function preference degree algorithm (PD). We study their performance, including search success rate and search message quantity, in different networks: scale-free networks, random graph networks, and small-world networks. Simulations show that both performance measures are excellent at the same time, and that they are improved at least tenfold.

  15. Decoherence in optimized quantum random-walk search algorithm

    International Nuclear Information System (INIS)

    Zhang Yu-Chao; Bao Wan-Su; Wang Xiang; Fu Xiang-Qun

    2015-01-01

    This paper investigates the effects of decoherence generated by broken-link-type noise in the hypercube on an optimized quantum random-walk search algorithm. When the hypercube occurs with random broken links, the optimized quantum random-walk search algorithm with decoherence is depicted through defining the shift operator which includes the possibility of broken links. For a given database size, we obtain the maximum success rate of the algorithm and the required number of iterations through numerical simulations and analysis when the algorithm is in the presence of decoherence. Then the computational complexity of the algorithm with decoherence is obtained. The results show that the ultimate effect of broken-link-type decoherence on the optimized quantum random-walk search algorithm is negative. (paper)

  16. Wolf Search Algorithm for Solving Optimal Reactive Power Dispatch Problem

    Directory of Open Access Journals (Sweden)

    Kanagasabai Lenin

    2015-03-01

    Full Text Available This paper presents a new bio-inspired heuristic optimization algorithm called the Wolf Search Algorithm (WSA) for solving the multi-objective reactive power dispatch problem. The Wolf Search Algorithm is based on wolf preying behaviour: the way wolves search for food and survive by avoiding their enemies has been imitated to formulate the algorithm for solving the reactive power dispatch problem. A wolf possesses both individual local searching ability and autonomous flocking movement, and this special property has been utilized to formulate the search algorithm. The proposed WSA has been tested on the standard IEEE 30-bus test system, and simulation results clearly show the good performance of the proposed algorithm.

  17. A HYBRID HEURISTIC ALGORITHM FOR SOLVING THE RESOURCE CONSTRAINED PROJECT SCHEDULING PROBLEM (RCPSP

    Directory of Open Access Journals (Sweden)

    Juan Carlos Rivera

    Full Text Available The Resource Constrained Project Scheduling Problem (RCPSP) is a problem of great interest for the scientific community because it belongs to the class of NP-Hard problems and no methods are known that can solve it accurately in polynomial processing times. For this reason, heuristic methods are used to solve it efficiently, though there is no guarantee that an optimal solution can be obtained. This research presents a hybrid heuristic search algorithm to solve the RCPSP efficiently, combining elements of the heuristics Greedy Randomized Adaptive Search Procedure (GRASP), Scatter Search, and Justification. The efficiency obtained is measured taking into account the presence of the new elements added to the GRASP algorithm taken as base: Justification and Scatter Search. The algorithms are evaluated using three databases of instances of the problem: 480 instances of 30 activities, 480 of 60, and 600 of 120 activities respectively, taken from the library PSPLIB available online. The solutions obtained by the developed algorithm for the instances of 30, 60 and 120 activities are compared with results obtained by other researchers at international level, where a prominent place is obtained, according to Chen (2011).

  18. THE QUASIPERIODIC AUTOMATED TRANSIT SEARCH ALGORITHM

    International Nuclear Information System (INIS)

    Carter, Joshua A.; Agol, Eric

    2013-01-01

    We present a new algorithm for detecting transiting extrasolar planets in time-series photometry. The Quasiperiodic Automated Transit Search (QATS) algorithm relaxes the usual assumption of strictly periodic transits by permitting a variable, but bounded, interval between successive transits. We show that this method is capable of detecting transiting planets with significant transit timing variations without any loss of significance (smearing), as would be incurred with traditional algorithms; however, this is at the cost of a slightly increased stochastic background. The approximate times of transit are standard products of the QATS search. Despite the increased flexibility, we show that QATS has a run-time complexity that is comparable to traditional search codes and is comparably easy to implement. QATS is applicable to data having a nearly uninterrupted, uniform cadence and is therefore well suited to the modern class of space-based transit searches (e.g., Kepler, CoRoT). Applications of QATS include transiting planets in dynamically active multi-planet systems and transiting planets in stellar binary systems.

  19. Differential harmony search algorithm to optimize PWRs loading pattern

    Energy Technology Data Exchange (ETDEWEB)

    Poursalehi, N., E-mail: npsalehi@yahoo.com [Engineering Department, Shahid Beheshti University, G.C, P.O.Box: 1983963113, Tehran (Iran, Islamic Republic of); Zolfaghari, A.; Minuchehr, A. [Engineering Department, Shahid Beheshti University, G.C, P.O.Box: 1983963113, Tehran (Iran, Islamic Republic of)

    2013-04-15

    Highlights: ► Exploiting the DHS algorithm in LP optimization reveals its flexibility, robustness and reliability. ► Our experiments with DHS show that the search approaches the optimal LP quickly. ► On average, the final band width of DHS fitness values is narrow relative to HS and GHS. -- Abstract: The objective of this work is to develop a core loading optimization technique using the differential harmony search algorithm in the context of obtaining an optimal configuration of fuel assemblies in pressurized water reactors. To implement and evaluate the proposed technique, a differential harmony search nodal expansion package for 2-D geometry, DHSNEP-2D, is developed. The package includes two modules: in the first module differential harmony search (DHS) is implemented, and the second module contains a nodal expansion code that solves two-dimensional multi-group neutron diffusion equations using a fourth-degree flux expansion with one node per fuel assembly. For evaluation of the DHS algorithm, the classical harmony search (HS) and global-best harmony search (GHS) algorithms are also included in DHSNEP-2D in order to compare the outcomes of the techniques. For this purpose, two PWR test cases have been investigated to demonstrate the DHS algorithm's capability of obtaining a near-optimal loading pattern. Results show that the convergence rate and execution times of DHS are quite promising and that the method is reliable for the fuel management operation. Moreover, numerical results show the good performance of DHS relative to other competitive algorithms such as the genetic algorithm (GA), classical harmony search (HS) and global-best harmony search (GHS) algorithms.

  20. Differential harmony search algorithm to optimize PWRs loading pattern

    International Nuclear Information System (INIS)

    Poursalehi, N.; Zolfaghari, A.; Minuchehr, A.

    2013-01-01

    Highlights: ► Exploiting the DHS algorithm in LP optimization reveals its flexibility, robustness and reliability. ► Our experiments with DHS show that the search approaches the optimal LP quickly. ► On average, the final band width of DHS fitness values is narrow relative to HS and GHS. -- Abstract: The objective of this work is to develop a core loading optimization technique using the differential harmony search algorithm in the context of obtaining an optimal configuration of fuel assemblies in pressurized water reactors. To implement and evaluate the proposed technique, a differential harmony search nodal expansion package for 2-D geometry, DHSNEP-2D, is developed. The package includes two modules: in the first module differential harmony search (DHS) is implemented, and the second module contains a nodal expansion code that solves two-dimensional multi-group neutron diffusion equations using a fourth-degree flux expansion with one node per fuel assembly. For evaluation of the DHS algorithm, the classical harmony search (HS) and global-best harmony search (GHS) algorithms are also included in DHSNEP-2D in order to compare the outcomes of the techniques. For this purpose, two PWR test cases have been investigated to demonstrate the DHS algorithm's capability of obtaining a near-optimal loading pattern. Results show that the convergence rate and execution times of DHS are quite promising and that the method is reliable for the fuel management operation. Moreover, numerical results show the good performance of DHS relative to other competitive algorithms such as the genetic algorithm (GA), classical harmony search (HS) and global-best harmony search (GHS) algorithms.

  1. Arc-Search Infeasible Interior-Point Algorithm for Linear Programming

    OpenAIRE

    Yang, Yaguang

    2014-01-01

    Mehrotra's algorithm has been the most successful infeasible interior-point algorithm for linear programming since 1990. Most popular interior-point software packages for linear programming are based on Mehrotra's algorithm. This paper proposes an alternative algorithm, arc-search infeasible interior-point algorithm. We will demonstrate, by testing Netlib problems and comparing the test results obtained by arc-search infeasible interior-point algorithm and Mehrotra's algorithm, that the propo...

  2. Cuckoo search and firefly algorithm theory and applications

    CERN Document Server

    2014-01-01

    Nature-inspired algorithms such as cuckoo search and firefly algorithm have become popular and widely used in recent years in many applications. These algorithms are flexible, efficient and easy to implement. New progress has been made in the last few years, and it is timely to summarize the latest developments of cuckoo search and firefly algorithm and their diverse applications. This book will review both theoretical studies and applications with detailed algorithm analysis, implementation and case studies so that readers can benefit most from this book.  Application topics are contributed by many leading experts in the field. Topics include cuckoo search, firefly algorithm, algorithm analysis, feature selection, image processing, travelling salesman problem, neural network, GPU optimization, scheduling, queuing, multi-objective manufacturing optimization, semantic web service, shape optimization, and others.   This book can serve as an ideal reference for both graduates and researchers in computer scienc...

  3. Searching for the majority: algorithms of voluntary control.

    Directory of Open Access Journals (Sweden)

    Jin Fan

    Full Text Available Voluntary control of information processing is crucial to allocate resources and prioritize the processes that are most important under a given situation; the algorithms underlying such control, however, are often not clear. We investigated possible algorithms of control for the performance of the majority function, in which participants searched for and identified one of two alternative categories (left- or right-pointing arrows) as composing the majority in each stimulus set. We manipulated the amount (set size of 1, 3, and 5) and content (ratio of left- and right-pointing arrows within a set) of the inputs to test competing hypotheses regarding mental operations for information processing. Using a novel measure based on computational load, we found that reaction time was best predicted by a grouping search algorithm as compared to alternative algorithms (i.e., exhaustive or self-terminating search). The grouping search algorithm involves sampling and resampling of the inputs before a decision is reached. These findings highlight the importance of investigating the implications of voluntary control via algorithms of mental operations.

  4. Greedy algorithm with weights for decision tree construction

    KAUST Repository

    Moshkov, Mikhail

    2010-01-01

    An approximate algorithm for minimization of the weighted depth of decision trees is considered. A bound on the accuracy of this algorithm is obtained which is unimprovable in the general case. Under some natural assumptions on the class NP, the considered algorithm is close (from the point of view of accuracy) to the best polynomial approximate algorithms for minimization of the weighted depth of decision trees.

  5. Greedy algorithm with weights for decision tree construction

    KAUST Repository

    Moshkov, Mikhail

    2010-12-01

    An approximate algorithm for minimization of the weighted depth of decision trees is considered. A bound on the accuracy of this algorithm is obtained which is unimprovable in the general case. Under some natural assumptions on the class NP, the considered algorithm is close (from the point of view of accuracy) to the best polynomial approximate algorithms for minimization of the weighted depth of decision trees.

  6. Searching Algorithms Implemented on Probabilistic Systolic Arrays

    Czech Academy of Sciences Publication Activity Database

    Kramosil, Ivan

    1996-01-01

    Roč. 25, č. 1 (1996), s. 7-45 ISSN 0308-1079 R&D Projects: GA ČR GA201/93/0781 Keywords : searching algorithms * probabilistic algorithms * systolic arrays * parallel algorithms Impact factor: 0.214, year: 1996

  7. Optimal Route Searching with Multiple Dynamical Constraints—A Geometric Algebra Approach

    Directory of Open Access Journals (Sweden)

    Dongshuang Li

    2018-05-01

    Full Text Available The process of searching for a dynamic constrained optimal path has received increasing attention in traffic planning, evacuation, and personalized or collaborative traffic services. As most existing multiple constrained optimal path (MCOP) methods cannot search for a path given various types of constraints that dynamically change during the search, few approaches for the dynamic multiple constrained optimal path (DMCOP) problem with type II dynamics are available for practical use. In this study, we develop a method to solve the DMCOP problem with type II dynamics based on the unification of various types of constraints under a geometric algebra (GA) framework. In our method, the network topology and three different types of constraints are represented by using algebraic base coding. With a parameterized optimization of the MCOP algorithm based on a greedy search strategy under the generation-refinement paradigm, this algorithm is found to accurately support the discovery of optimal paths as constraints on numerical values, nodes, and route structure types are dynamically added to the network. The algorithm was tested with simulated cases of optimal tourism route searches in China's road networks with various combinations of constraints. The case study indicates that our algorithm can not only solve the DMCOP with different types of constraints but also use the constraints to speed up route filtering.

  8. Effects of a random noisy oracle on search algorithm complexity

    International Nuclear Information System (INIS)

    Shenvi, Neil; Brown, Kenneth R.; Whaley, K. Birgitta

    2003-01-01

    Grover's algorithm provides a quadratic speed-up over classical algorithms for unstructured database or library searches. This paper examines the robustness of Grover's search algorithm to a random phase error in the oracle and analyzes the complexity of the search process as a function of the scaling of the oracle error with database or library size. Both the discrete- and continuous-time implementations of the search algorithm are investigated. It is shown that unless the oracle phase error scales as O(N^(-1/4)), neither the discrete- nor the continuous-time implementation of Grover's algorithm is scalably robust to this error in the absence of error correction
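
    A small state-vector sketch, assuming NumPy, of Grover iterations with a Gaussian phase error injected at the oracle. It is a toy numerical illustration of the robustness question studied here, not the paper's analytical treatment; the noise model and parameters are assumptions.

        # Toy simulation: success probability of Grover's search when the oracle
        # phase (ideally pi) is perturbed by Gaussian noise of a given width.
        import numpy as np

        def grover_success_probability(n_qubits, phase_error_std=0.0, seed=0):
            rng = np.random.default_rng(seed)
            N = 2 ** n_qubits
            marked = 0                                            # index of the marked item
            state = np.full(N, 1.0 / np.sqrt(N), dtype=complex)   # uniform superposition
            iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))
            for _ in range(iterations):
                # noisy oracle: phase flip of the marked item, perturbed by noise
                noisy_phase = np.pi + rng.normal(0.0, phase_error_std)
                state[marked] *= np.exp(1j * noisy_phase)
                # diffusion operator: inversion about the mean amplitude
                mean = state.mean()
                state = 2 * mean - state
            return abs(state[marked]) ** 2

        if __name__ == "__main__":
            for sigma in (0.0, 0.3, 1.0):
                print(sigma, round(grover_success_probability(8, sigma), 3))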

  9. An Enhanced Artificial Bee Colony Algorithm with Solution Acceptance Rule and Probabilistic Multisearch.

    Science.gov (United States)

    Yurtkuran, Alkın; Emel, Erdal

    2016-01-01

    The artificial bee colony (ABC) algorithm is a popular swarm-based technique inspired by the intelligent foraging behavior of honeybee swarms. This paper proposes a new variant of the ABC algorithm, namely, enhanced ABC with solution acceptance rule and probabilistic multisearch (ABC-SA), to address global optimization problems. A new solution acceptance rule is proposed where, instead of greedy selection between the old solution and the new candidate solution, worse candidate solutions have a probability of being accepted. Additionally, the acceptance probability of worse candidates is nonlinearly decreased throughout the search process in an adaptive manner. Moreover, in order to improve the performance of the ABC and balance intensification and diversification, a probabilistic multisearch strategy is presented. Three different search equations with distinctive characters are employed using predetermined search probabilities. By implementing a new solution acceptance rule and a probabilistic multisearch approach, the intensification and diversification performance of the ABC algorithm is improved. The proposed algorithm has been tested on well-known benchmark functions of varying dimensions by comparing against novel ABC variants, as well as several recent state-of-the-art algorithms. Computational results show that the proposed ABC-SA outperforms other ABC variants and is superior to state-of-the-art algorithms proposed in the literature.

  10. An Enhanced Artificial Bee Colony Algorithm with Solution Acceptance Rule and Probabilistic Multisearch

    Directory of Open Access Journals (Sweden)

    Alkın Yurtkuran

    2016-01-01

    Full Text Available The artificial bee colony (ABC) algorithm is a popular swarm-based technique inspired by the intelligent foraging behavior of honeybee swarms. This paper proposes a new variant of the ABC algorithm, namely, enhanced ABC with solution acceptance rule and probabilistic multisearch (ABC-SA), to address global optimization problems. A new solution acceptance rule is proposed where, instead of greedy selection between the old solution and the new candidate solution, worse candidate solutions have a probability of being accepted. Additionally, the acceptance probability of worse candidates is nonlinearly decreased throughout the search process in an adaptive manner. Moreover, in order to improve the performance of the ABC and balance intensification and diversification, a probabilistic multisearch strategy is presented. Three different search equations with distinctive characters are employed using predetermined search probabilities. By implementing a new solution acceptance rule and a probabilistic multisearch approach, the intensification and diversification performance of the ABC algorithm is improved. The proposed algorithm has been tested on well-known benchmark functions of varying dimensions by comparing against novel ABC variants, as well as several recent state-of-the-art algorithms. Computational results show that the proposed ABC-SA outperforms other ABC variants and is superior to state-of-the-art algorithms proposed in the literature.
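
    An illustrative sketch of the kind of acceptance rule described above: better candidates are always kept, while worse candidates are accepted with a probability that decays nonlinearly over the search. The decay schedule and parameter names are assumptions, not the exact ABC-SA rule.

        # Probabilistic acceptance of worse candidates, with a nonlinearly
        # decreasing acceptance probability over the course of the search.
        import math
        import random

        def accept(old_cost, new_cost, iteration, max_iterations, p0=0.3):
            """Return True if the candidate solution should replace the old one."""
            if new_cost <= old_cost:      # better (or equal) candidates are always accepted
                return True
            # acceptance probability for worse candidates decays nonlinearly toward 0
            p_worse = p0 * math.exp(-5.0 * iteration / max_iterations)
            return random.random() < p_worse

        if __name__ == "__main__":
            print([accept(1.0, 1.2, t, 100) for t in (0, 50, 99)])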

  11. 6. Algorithms for Sorting and Searching

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education, Volume 2, Issue 3. Algorithms - Algorithms for Sorting and Searching. R K Shyamasundar. Series Article. Author Affiliation: Computer Science Group, Tata Institute of Fundamental Research, Homi Bhabha Road, Mumbai 400 005, India.

  12. A Novel Self-Adaptive Harmony Search Algorithm

    Directory of Open Access Journals (Sweden)

    Kaiping Luo

    2013-01-01

    Full Text Available The harmony search algorithm is a music-inspired optimization technique that has been successfully applied to diverse scientific and engineering problems. However, like other metaheuristic algorithms, it still faces two difficulties: parameter setting and finding the optimal balance between diversity and intensity in searching. This paper proposes a novel, self-adaptive search mechanism for optimization problems with continuous variables. This new variant can automatically configure the evolutionary parameters in accordance with problem characteristics, such as the scale and the boundaries, and dynamically select evolutionary strategies in accordance with its search performance. The new variant simplifies the parameter setting and efficiently solves all types of optimization problems with continuous variables. Statistical test results show that this variant is considerably robust and outperforms the original harmony search (HS), improved harmony search (IHS), and other self-adaptive variants on large-scale optimization problems and constrained problems.

  13. Merged Search Algorithms for Radio Frequency Identification Anticollision

    Directory of Open Access Journals (Sweden)

    Bih-Yaw Shih

    2012-01-01

    The arbitration algorithm for an RFID system is used to arbitrate among all the tags to avoid the collision problem when multiple tags exist in the interrogation field of a transponder. A splitting algorithm called the Binary Search Tree (BST) is well known for multi-tag arbitration. In the current study, a splitting-based scheme called the Merged Search Tree is proposed to capture identification codes correctly for anticollision. The performance of the proposed algorithm is compared with the original BST in terms of the time and power consumed during the arbitration process. The results show that the proposed model can reduce searching time and power consumption, achieving better arbitration performance.

  14. Modified Parameters of Harmony Search Algorithm for Better Searching

    Science.gov (United States)

    Farraliza Mansor, Nur; Abal Abas, Zuraida; Samad Shibghatullah, Abdul; Rahman, Ahmad Fadzli Nizam Abdul

    2017-08-01

    The scheduling and rostering problems are treated as integrated because they depend on each other: the input to the rostering problem is a scheduling problem. In this research, the integrated scheduling and rostering bus driver problem is defined as maximising the balance of the assignment of tasks in terms of the distribution of shifts and routes. Achieving greater fairness among drivers is essential because it can increase drivers' levels of satisfaction. The latest approaches are still unable to address the fairness problem that has emerged, so this research proposes a strategy that adopts an amendment of the harmony search algorithm in order to address the fairness issue and thus raise the level of fairness. The harmony search algorithm is classified as a meta-heuristic algorithm that is capable of solving hard, combinatorial or discrete optimisation problems. In this respect, the three main operators in HS, namely the Harmony Memory Consideration Rate (HMCR), Pitch Adjustment Rate (PAR) and Bandwidth (BW), play a vital role in balancing local exploitation and global exploration. These parameters influence the overall performance of the HS algorithm, and therefore it is crucial to fine-tune them. The contributions of this research are an HMCR parameter based on a step function, together with the fret-spacing concept on guitars, associated with mathematical formulae, applied to the BW parameter. A constant step function model is introduced in the alteration of the HMCR parameter. The experimental results revealed that our proposed approach is superior to the parameter-adaptive harmony search algorithm. In conclusion, the proposed approach managed to generate a fairer roster and was thus capable of maximising the balanced distribution of shifts and routes among drivers, which contributed to lowering illness, incidents, absenteeism and accidents.
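
    A minimal sketch of the classical harmony search loop, showing where the three operators named above (HMCR, PAR and BW) enter. The record's step-function HMCR and fret-spacing BW modifications are not reproduced; all parameter values and names below are illustrative.

        # Classical harmony search: memory consideration (HMCR), pitch adjustment
        # (PAR, BW), random selection, and replacement of the worst harmony.
        import random

        def harmony_search(f, dim, lo, hi, hms=10, hmcr=0.9, par=0.3, bw=0.05, iters=2000):
            memory = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
            for _ in range(iters):
                new = []
                for d in range(dim):
                    if random.random() < hmcr:                  # memory consideration
                        value = random.choice(memory)[d]
                        if random.random() < par:               # pitch adjustment
                            value += random.uniform(-bw, bw)
                    else:                                       # random selection
                        value = random.uniform(lo, hi)
                    new.append(min(hi, max(lo, value)))
                worst = max(memory, key=f)
                if f(new) < f(worst):                           # replace the worst harmony
                    memory[memory.index(worst)] = new
            return min(memory, key=f)

        if __name__ == "__main__":
            sphere = lambda v: sum(x * x for x in v)
            best = harmony_search(sphere, dim=5, lo=-10, hi=10)
            print(best, sphere(best))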

  15. Q-learning-based adjustable fixed-phase quantum Grover search algorithm

    International Nuclear Information System (INIS)

    Guo Ying; Shi Wensha; Wang Yijun; Hu, Jiankun

    2017-01-01

    We demonstrate that the rotation phase can be suitably chosen to increase the efficiency of the phase-based quantum search algorithm, leading to a dynamic balance between iterations and success probabilities of the fixed-phase quantum Grover search algorithm with Q-learning for a given number of solutions. In this search algorithm, the proposed Q-learning algorithm, which is in essence a model-free reinforcement learning strategy, is used to perform a matching algorithm based on the fraction of marked items λ and the rotation phase α. After establishing the policy function α = π(λ), we complete the fixed-phase Grover algorithm, where the phase parameter is selected via the learned policy. Simulation results show that the Q-learning-based Grover search algorithm (QLGA) requires fewer iterations and yields higher success probabilities. Compared with the conventional Grover algorithms, it avoids locally optimal situations, thereby enabling success probabilities to approach one. (author)

  16. Minimizing the Total Service Time of Discrete Dynamic Berth Allocation Problem by an Iterated Greedy Heuristic

    Science.gov (United States)

    2014-01-01

    Berth allocation is the forefront operation performed when ships arrive at a port and is a critical task in container port optimization. Minimizing the time ships spend at berths constitutes an important objective of berth allocation problems. This study focuses on the discrete dynamic berth allocation problem (discrete DBAP), which aims to minimize total service time, and proposes an iterated greedy (IG) algorithm to solve it. The proposed IG algorithm is tested on three benchmark problem sets. Experimental results show that the proposed IG algorithm can obtain optimal solutions for all test instances of the first and second problem sets and outperforms the best-known solutions for 35 out of 90 test instances of the third problem set. PMID:25295295

  17. Minimizing the Total Service Time of Discrete Dynamic Berth Allocation Problem by an Iterated Greedy Heuristic

    Directory of Open Access Journals (Sweden)

    Shih-Wei Lin

    2014-01-01

    Full Text Available Berth allocation is the forefront operation performed when ships arrive at a port and is a critical task in container port optimization. Minimizing the time ships spend at berths constitutes an important objective of berth allocation problems. This study focuses on the discrete dynamic berth allocation problem (discrete DBAP), which aims to minimize total service time, and proposes an iterated greedy (IG) algorithm to solve it. The proposed IG algorithm is tested on three benchmark problem sets. Experimental results show that the proposed IG algorithm can obtain optimal solutions for all test instances of the first and second problem sets and outperforms the best-known solutions for 35 out of 90 test instances of the third problem set.
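
    A generic iterated greedy skeleton (destruction, greedy reconstruction, acceptance), demonstrated on a toy permutation objective. It shows the structure of an IG algorithm only; the berth-allocation-specific operators of the record are not reproduced, and all names and parameters are assumptions.

        # Iterated greedy: destroy part of the incumbent, rebuild it greedily, and
        # accept the candidate if it does not worsen the objective.
        import random

        def cost(perm):
            # toy objective: number of adjacent inversions (0 for a sorted permutation)
            return sum(1 for a, b in zip(perm, perm[1:]) if a > b)

        def destroy(perm, k):
            perm = list(perm)
            removed = [perm.pop(random.randrange(len(perm))) for _ in range(k)]
            return perm, removed

        def greedy_reconstruct(partial, removed):
            for item in removed:
                # greedy insertion: place the item at the position of lowest cost
                best_pos = min(range(len(partial) + 1),
                               key=lambda i: cost(partial[:i] + [item] + partial[i:]))
                partial = partial[:best_pos] + [item] + partial[best_pos:]
            return partial

        def iterated_greedy(n=20, k=4, iters=500, seed=1):
            random.seed(seed)
            incumbent = random.sample(range(n), n)
            best = list(incumbent)
            for _ in range(iters):
                partial, removed = destroy(incumbent, k)
                candidate = greedy_reconstruct(partial, removed)
                if cost(candidate) <= cost(incumbent):      # simple acceptance criterion
                    incumbent = candidate
                if cost(incumbent) < cost(best):
                    best = list(incumbent)
            return best, cost(best)

        if __name__ == "__main__":
            print(iterated_greedy())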

  18. Phase matching in quantum searching and the improved Grover algorithm

    International Nuclear Information System (INIS)

    Long Guilu; Li Yansong; Xiao Li; Tu Changcun; Sun Yang

    2004-01-01

    The authors briefly introduce some of their recent work related to the phase matching condition in quantum searching algorithms and the improved Grover algorithm. When one replaces the two phase inversions in the Grover algorithm with arbitrary phase rotations, the modified algorithm usually fails to find the marked state unless a phase matching condition is satisfied between the two phases. Because the Grover algorithm does not have a 100% success rate, an improved Grover algorithm with zero failure rate is given by replacing the phase inversions with angles that depend on the size of the database. Other aspects of the Grover algorithm, such as the SO(3) picture of quantum searching and the dominant gate imperfections in the Grover algorithm, are also mentioned. (author)

  19. Adaptive switching gravitational search algorithm: an attempt to ...

    Indian Academy of Sciences (India)

    Nor Azlina Ab Aziz

    An adaptive gravitational search algorithm (GSA) that switches between synchronous and ... genetic algorithm (GA), bat-inspired algorithm (BA) and grey wolf optimizer (GWO). ...... heuristic with applications in applied electromagnetics. Prog.

  20. Analysis of Multivariate Experimental Data Using A Simplified Regression Model Search Algorithm

    Science.gov (United States)

    Ulbrich, Norbert Manfred

    2013-01-01

    A new regression model search algorithm was developed in 2011 that may be used to analyze both general multivariate experimental data sets and wind tunnel strain-gage balance calibration data. The new algorithm is a simplified version of a more complex search algorithm that was originally developed at the NASA Ames Balance Calibration Laboratory. The new algorithm has the advantage that it needs only about one tenth of the original algorithm's CPU time for the completion of a search. In addition, extensive testing showed that the prediction accuracy of math models obtained from the simplified algorithm is similar to the prediction accuracy of math models obtained from the original algorithm. The simplified algorithm, however, cannot guarantee that search constraints related to a set of statistical quality requirements are always satisfied in the optimized regression models. Therefore, the simplified search algorithm is not intended to replace the original search algorithm. Instead, it may be used to generate an alternate optimized regression model of experimental data whenever the application of the original search algorithm either fails or requires too much CPU time. Data from a machine calibration of NASA's MK40 force balance is used to illustrate the application of the new regression model search algorithm.

  1. NEWordS A News Search Engine for English Vocabulary Learning

    Directory of Open Access Journals (Sweden)

    Xuejing Huang

    2015-08-01

    Full Text Available Vocabulary is the first hurdle for English learners to overcome. Instead of simply showing a word again and again, we developed an English news article search engine based on users' word-reciting records on Shanbay.com. It is designed for advanced English learners to find suitable reading materials. The search engine consists of a Crawling Module, Document Normalizing Module, Indexing Module, Querying Module and Interface Module. We propose three sorting and ranking algorithms for the Querying Module. For the basic algorithm, five crucial principles are taken into consideration; term frequency, inverse document frequency, familiarity degree and article freshness degree are factors in this algorithm. We then devise an improved algorithm for the scenario in which a user reads multiple articles in the search result list. Here we adopt an iterative, greedy method: the essential idea is to select English news articles one by one according to the query, while dynamically updating the unfamiliarity of the words during each iterative step. Moreover, we develop an advanced algorithm that takes article difficulty into account. The Interface Module is designed as a website, and some data visualization technologies (e.g., word clouds) are applied. Furthermore, we conduct both an applicability check and a performance evaluation. Metrics such as searching time, word-covering ratio and the minimum number of articles that completely cover all the queried vocabulary are randomly sampled and thoroughly analyzed. The results show that our search engine works very well, with satisfying performance.
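
    An illustrative sketch of the iterative greedy selection idea described above: repeatedly pick the article covering the most still-unfamiliar query words, then update the set of uncovered words. The tf-idf, freshness and difficulty scoring used by the actual engine is omitted, and all names are hypothetical.

        # Greedy, set-cover-style article selection: each step picks the article
        # that covers the most query words not yet covered by earlier picks.
        def greedy_article_selection(articles, query_words, k=3):
            """articles: dict mapping article id -> set of words it contains."""
            uncovered = set(query_words)
            chosen = []
            while uncovered and len(chosen) < k:
                # greedy step: the article covering the most uncovered words wins
                best = max(articles, key=lambda a: len(articles[a] & uncovered))
                if not articles[best] & uncovered:
                    break                      # no remaining article helps
                chosen.append(best)
                uncovered -= articles[best]    # these words are now covered in context
            return chosen, uncovered

        if __name__ == "__main__":
            articles = {"a1": {"ubiquitous", "frugal", "candid"},
                        "a2": {"frugal", "meticulous"},
                        "a3": {"candid", "meticulous", "tenacious"}}
            print(greedy_article_selection(articles, ["frugal", "candid", "tenacious"]))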

  2. An Aircraft Service Staff Rostering using a Hybrid GRASP Algorithm

    Directory of Open Access Journals (Sweden)

    W.H. Ip

    2009-10-01

    Full Text Available The aircraft ground service company is responsible for carrying out the regular aircraft maintenance tasks between an aircraft's arrival at and departure from the airport. This paper presents the application of a hybrid approach based upon the greedy randomized adaptive search procedure (GRASP) for rostering technical staff such that they are assigned predefined shift patterns. The rostering of staff is posed as an optimization problem with the aim of minimizing violations of hard and soft constraints. The proposed algorithm iteratively constructs a set of solutions by GRASP. Furthermore, with multi-agent techniques, we efficiently identify an optimal roster that has minimal constraint violations and is fair to employees. Experimental results are included to demonstrate the effectiveness of the proposed algorithm.

  3. ESHOPPS: A COMPUTATIONAL TOOL TO AID THE TEACHING OF SHORTEST PATH ALGORITHMS

    Directory of Open Access Journals (Sweden)

    S. J. de A. LIMA

    2015-07-01

    Full Text Available The development of a computational tool called EShoPPS – Environment for Shortest Path Problem Solving, which is used to assist students in understanding the working of the Dijkstra, Greedy search and A* (star) algorithms, is presented in this paper. Such algorithms are commonly taught in graduate and undergraduate courses in Engineering and Informatics and are used for solving many optimization problems that can be characterized as shortest path problems. EShoPPS is an interactive tool that allows students to create a graph representing the problem and also helps in developing their knowledge of each specific algorithm. Experiments performed with 155 students of undergraduate and graduate courses such as Industrial Engineering, Computer Science and Information Systems have shown that by using the EShoPPS tool students were able to improve their interpretation of the investigated algorithms.
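
    A compact A* sketch on a toy 4-connected grid, included because the record covers Dijkstra, Greedy search and A*. Setting the heuristic to zero turns it into Dijkstra, and ignoring the accumulated cost g gives greedy best-first search. The grid and names are illustrative assumptions, not part of the EShoPPS tool.

        # A* on a grid with a Manhattan-distance heuristic; 0 = free cell, 1 = wall.
        import heapq

        def a_star(grid, start, goal):
            rows, cols = len(grid), len(grid[0])
            h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])   # Manhattan heuristic
            frontier = [(h(start), 0, start, [start])]                # (f, g, node, path)
            seen = set()
            while frontier:
                f, g, node, path = heapq.heappop(frontier)
                if node == goal:
                    return path
                if node in seen:
                    continue
                seen.add(node)
                r, c = node
                for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    nr, nc = r + dr, c + dc
                    if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                        nxt = (nr, nc)
                        heapq.heappush(frontier, (g + 1 + h(nxt), g + 1, nxt, path + [nxt]))
            return None

        if __name__ == "__main__":
            grid = [[0, 0, 0, 0],
                    [1, 1, 0, 1],
                    [0, 0, 0, 0]]
            print(a_star(grid, (0, 0), (2, 0)))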

  4. Efficient algorithm for binary search enhancement | Bennett | Journal ...

    African Journals Online (AJOL)

    This paper presents an Enhanced Binary Search algorithm that ensures that search is performed if ... search region of the list, therefore enabling search to be performed in reduced time.

  5. Online learning algorithm for ensemble of decision rules

    KAUST Repository

    Chikalov, Igor; Moshkov, Mikhail; Zielosko, Beata

    2011-01-01

    We describe an online learning algorithm that builds a system of decision rules for a classification problem. Rules are constructed according to the minimum description length principle by a greedy algorithm or using the dynamic programming approach

  6. A Comparison of Local Search Methods for the Multicriteria Police Districting Problem on Graph

    Directory of Open Access Journals (Sweden)

    F. Liberatore

    2016-01-01

    Full Text Available In the current economic climate, law enforcement agencies are facing resource shortages. The effective and efficient use of scarce resources is therefore of the utmost importance to provide a high standard of public safety service. Optimization models specifically tailored to the needs of police agencies can help to improve their use. The Multicriteria Police Districting Problem (MC-PDP) on a graph concerns the definition of sound patrolling sectors in a police district. The objective of this problem is to partition a graph into convex and continuous subsets, while ensuring efficiency and workload balance among the subsets. The model was originally formulated in collaboration with the Spanish National Police Corps. We propose three local search algorithms for its solution: a Simple Hill Climbing, a Steepest Descent Hill Climbing, and a Tabu Search. To improve their diversification capabilities, all the algorithms implement a multistart procedure, initialized by randomized greedy solutions. The algorithms are empirically tested in a case study on the Central District of Madrid. Our experiments show that the solutions identified by the novel Tabu Search outperform those of the other algorithms. Finally, research guidelines for future developments of the MC-PDP are given.
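
    A generic multistart tabu search skeleton of the kind compared in the record: each restart begins from a randomized greedy starting solution, and a short tabu list forbids recently visited solutions. The objective and move structure are toy placeholders, not the MC-PDP model; all names are assumptions.

        # Multistart tabu search: best admissible neighbour moves, a fixed-tenure
        # tabu list, and several restarts from randomized starting solutions.
        import random

        def tabu_search(objective, random_greedy_start, neighbours,
                        restarts=5, iters=100, tenure=7):
            best, best_val = None, float("inf")
            for _ in range(restarts):
                current = random_greedy_start()
                tabu = []                                   # recently visited solutions
                for _ in range(iters):
                    moves = [n for n in neighbours(current) if n not in tabu]
                    if not moves:
                        break
                    current = min(moves, key=objective)     # best admissible neighbour
                    tabu.append(current)
                    if len(tabu) > tenure:
                        tabu.pop(0)
                    if objective(current) < best_val:
                        best, best_val = current, objective(current)
            return best, best_val

        if __name__ == "__main__":
            # toy problem: minimise (x - 3)^2 over integers, moves are +/- 1
            obj = lambda x: (x - 3) ** 2
            print(tabu_search(obj,
                              random_greedy_start=lambda: random.randint(-20, 20),
                              neighbours=lambda x: [x - 1, x + 1]))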

  7. PWR loading pattern optimization using Harmony Search algorithm

    International Nuclear Information System (INIS)

    Poursalehi, N.; Zolfaghari, A.; Minuchehr, A.

    2013-01-01

    Highlights: ► Numerical results reveal that the HS method is reliable. ► The great advantage of HS is a significant gain in computational cost. ► On average, the final band width of search fitness values is narrow. ► Our experiments show that the search approaches the optimal value quickly. - Abstract: In this paper a core reloading technique using Harmony Search, HS, is presented in the context of finding an optimal configuration of fuel assemblies, FA, in pressurized water reactors. To implement and evaluate the proposed technique, a Harmony Search along Nodal Expansion Code for 2-D geometry, HSNEC2D, is developed to obtain a nearly optimal arrangement of fuel assemblies in PWR cores. This code consists of two sections, the Harmony Search algorithm and a Nodal Expansion module using a fourth-degree flux expansion, which solves two-dimensional multi-group diffusion equations with one node per fuel assembly. Two optimization test problems are investigated to demonstrate the HS algorithm's capability of converging to a near-optimal loading pattern in the fuel management field and other subjects. The results, convergence rate and reliability of the method are quite promising and show that the HS algorithm performs very well and is comparable to other competitive algorithms such as the Genetic Algorithm and Particle Swarm Intelligence. Furthermore, implementation of the nodal expansion technique along with HS causes a considerable reduction of the computational time needed to process and analyze optimization in core fuel management problems

  8. An improved harmony search algorithm for power economic load dispatch

    Energy Technology Data Exchange (ETDEWEB)

    Santos Coelho, Leandro dos [Pontifical Catholic University of Parana, PUCPR, Industrial and Systems Engineering Graduate Program, PPGEPS, Imaculada Conceicao, 1155, 80215-901 Curitiba, PR (Brazil)], E-mail: leandro.coelho@pucpr.br; Mariani, Viviana Cocco [Pontifical Catholic University of Parana, PUCPR, Department of Mechanical Engineering, PPGEM, Imaculada Conceicao, 1155, 80215-901 Curitiba, PR (Brazil)], E-mail: viviana.mariani@pucpr.br

    2009-10-15

    A meta-heuristic algorithm called harmony search (HS), mimicking the improvisation process of music players, has been recently developed. The HS algorithm has been successful in several optimization problems. The HS algorithm does not require derivative information and uses stochastic random search instead of a gradient search. In addition, the HS algorithm is simple in concept, few in parameters, and easy in implementation. This paper presents an improved harmony search (IHS) algorithm based on exponential distribution for solving economic dispatch problems. A 13-unit test system with incremental fuel cost function taking into account the valve-point loading effects is used to illustrate the effectiveness of the proposed IHS method. Numerical results show that the IHS method has good convergence property. Furthermore, the generation costs of the IHS method are lower than those of the classical HS and other optimization algorithms reported in recent literature.

  9. An improved harmony search algorithm for power economic load dispatch

    Energy Technology Data Exchange (ETDEWEB)

    Coelho, Leandro dos Santos [Pontifical Catholic Univ. of Parana, PUCPR, Industrial and Systems Engineering Graduate Program, PPGEPS, Imaculada Conceicao, 1155, 80215-901 Curitiba, PR (Brazil); Mariani, Viviana Cocco [Pontifical Catholic Univ. of Parana, PUCPR, Dept. of Mechanical Engineering, PPGEM, Imaculada Conceicao, 1155, 80215-901 Curitiba, PR (Brazil)

    2009-10-15

    A meta-heuristic algorithm called harmony search (HS), mimicking the improvisation process of music players, has been recently developed. The HS algorithm has been successful in several optimization problems. The HS algorithm does not require derivative information and uses stochastic random search instead of a gradient search. In addition, the HS algorithm is simple in concept, few in parameters, and easy in implementation. This paper presents an improved harmony search (IHS) algorithm based on exponential distribution for solving economic dispatch problems. A 13-unit test system with incremental fuel cost function taking into account the valve-point loading effects is used to illustrate the effectiveness of the proposed IHS method. Numerical results show that the IHS method has good convergence property. Furthermore, the generation costs of the IHS method are lower than those of the classical HS and other optimization algorithms reported in recent literature. (author)

  10. An improved harmony search algorithm for power economic load dispatch

    International Nuclear Information System (INIS)

    Santos Coelho, Leandro dos; Mariani, Viviana Cocco

    2009-01-01

    A meta-heuristic algorithm called harmony search (HS), mimicking the improvisation process of music players, has been recently developed. The HS algorithm has been successful in several optimization problems. The HS algorithm does not require derivative information and uses stochastic random search instead of a gradient search. In addition, the HS algorithm is simple in concept, few in parameters, and easy in implementation. This paper presents an improved harmony search (IHS) algorithm based on exponential distribution for solving economic dispatch problems. A 13-unit test system with incremental fuel cost function taking into account the valve-point loading effects is used to illustrate the effectiveness of the proposed IHS method. Numerical results show that the IHS method has good convergence property. Furthermore, the generation costs of the IHS method are lower than those of the classical HS and other optimization algorithms reported in recent literature.

  11. A New Approximate Chimera Donor Cell Search Algorithm

    Science.gov (United States)

    Holst, Terry L.; Nixon, David (Technical Monitor)

    1998-01-01

    The objectives of this study were to develop chimera-based full potential methodology which is compatible with overflow (Euler/Navier-Stokes) chimera flow solver and to develop a fast donor cell search algorithm that is compatible with the chimera full potential approach. Results of this work included presenting a new donor cell search algorithm suitable for use with a chimera-based full potential solver. This algorithm was found to be extremely fast and simple producing donor cells as fast as 60,000 per second.

  12. Hybridizing Evolutionary Algorithms with Opportunistic Local Search

    DEFF Research Database (Denmark)

    Gießen, Christian

    2013-01-01

    There is empirical evidence that memetic algorithms (MAs) can outperform plain evolutionary algorithms (EAs). Recently the first runtime analyses have been presented proving the aforementioned conjecture rigorously by investigating Variable-Depth Search, VDS for short (Sudholt, 2008). Sudholt...

  13. A heuristic algorithm for a multi-product four-layer capacitated location-routing problem

    Directory of Open Access Journals (Sweden)

    Mohsen Hamidi

    2014-01-01

    Full Text Available The purpose of this study is to solve a complex multi-product four-layer capacitated location-routing problem (LRP) in which two specific constraints are taken into account: (1) plants have limited production capacity, and (2) central depots have limited capacity for storing and transshipping products. The LRP represents a multi-product four-layer distribution network that consists of plants, central depots, regional depots, and customers. A heuristic algorithm is developed to solve the four-layer LRP. The heuristic uses GRASP (Greedy Randomized Adaptive Search Procedure) and two probabilistic tabu search strategies of intensification and diversification to tackle the problem. Results show that the heuristic solves the problem effectively.

  14. Online learning algorithm for ensemble of decision rules

    KAUST Repository

    Chikalov, Igor

    2011-01-01

    We describe an online learning algorithm that builds a system of decision rules for a classification problem. Rules are constructed according to the minimum description length principle by a greedy algorithm or using the dynamic programming approach. © 2011 Springer-Verlag.

  15. Greedy Deep Dictionary Learning

    OpenAIRE

    Tariyal, Snigdha; Majumdar, Angshul; Singh, Richa; Vatsa, Mayank

    2016-01-01

    In this work we propose a new deep learning tool called deep dictionary learning. Multi-level dictionaries are learnt in a greedy fashion, one layer at a time. This requires solving a simple (shallow) dictionary learning problem, the solution to this is well known. We apply the proposed technique on some benchmark deep learning datasets. We compare our results with other deep learning tools like stacked autoencoder and deep belief network; and state of the art supervised dictionary learning t...

  16. Quantum algorithms for the ordered search problem via semidefinite programming

    International Nuclear Information System (INIS)

    Childs, Andrew M.; Landahl, Andrew J.; Parrilo, Pablo A.

    2007-01-01

    One of the most basic computational problems is the task of finding a desired item in an ordered list of N items. While the best classical algorithm for this problem uses log_2 N queries to the list, a quantum computer can solve the problem using a constant factor fewer queries. However, the precise value of this constant is unknown. By characterizing a class of quantum query algorithms for the ordered search problem in terms of a semidefinite program, we find quantum algorithms for small instances of the ordered search problem. Extending these algorithms to arbitrarily large instances using recursion, we show that there is an exact quantum ordered search algorithm using 4 log_605 N ≈ 0.433 log_2 N queries, which improves upon the previously best known exact algorithm

  17. Nuclear expert web search and crawler algorithm

    International Nuclear Information System (INIS)

    Reis, Thiago; Barroso, Antonio C.O.; Baptista, Benedito Filho D.

    2013-01-01

    In this paper we present preliminary research on a web search and crawling algorithm applied specifically to nuclear-related web information. We designed a web-based nuclear-oriented expert system guided by a web crawler algorithm and a neural network able to search and retrieve nuclear-related hypertextual web information in an autonomous and massive fashion. Preliminary experimental results show a retrieval precision of 80% for web pages related to any nuclear theme and a retrieval precision of 72% for web pages related only to the nuclear power theme. (author)

  18. Nuclear expert web search and crawler algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Reis, Thiago; Barroso, Antonio C.O.; Baptista, Benedito Filho D., E-mail: thiagoreis@usp.br, E-mail: barroso@ipen.br, E-mail: bdbfilho@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2013-07-01

    In this paper we present preliminary research on a web search and crawling algorithm applied specifically to nuclear-related web information. We designed a web-based nuclear-oriented expert system guided by a web crawler algorithm and a neural network able to search and retrieve nuclear-related hypertextual web information in an autonomous and massive fashion. Preliminary experimental results show a retrieval precision of 80% for web pages related to any nuclear theme and a retrieval precision of 72% for web pages related only to the nuclear power theme. (author)

  19. A Hybrid Genetic Algorithm for the Multiple Crossdocks Problem

    Directory of Open Access Journals (Sweden)

    Zhaowei Miao

    2012-01-01

    Full Text Available We study a multiple crossdocks problem with supplier and customer time windows, where any violation of time windows incurs a penalty cost and the flows through the crossdock are constrained by fixed transportation schedules and crossdock capacities. We prove this problem to be NP-hard in the strong sense and therefore focus on developing efficient heuristics. Based on the problem structure, we propose a hybrid genetic algorithm (HGA) integrating a greedy technique and a variable neighborhood search method to solve the problem. Extensive experiments under different scenarios were conducted, and the results show that HGA outperforms the CPLEX solver, providing solutions in realistic timescales.

  20. A hardware-oriented concurrent TZ search algorithm for High-Efficiency Video Coding

    Science.gov (United States)

    Doan, Nghia; Kim, Tae Sung; Rhee, Chae Eun; Lee, Hyuk-Jae

    2017-12-01

    High-Efficiency Video Coding (HEVC) is the latest video coding standard, in which the compression performance is double that of its predecessor, the H.264/AVC standard, while the video quality remains unchanged. In HEVC, the test zone (TZ) search algorithm is widely used for integer motion estimation because it effectively searches the good-quality motion vector with a relatively small amount of computation. However, the complex computation structure of the TZ search algorithm makes it difficult to implement it in the hardware. This paper proposes a new integer motion estimation algorithm which is designed for hardware execution by modifying the conventional TZ search to allow parallel motion estimations of all prediction unit (PU) partitions. The algorithm consists of the three phases of zonal, raster, and refinement searches. At the beginning of each phase, the algorithm obtains the search points required by the original TZ search for all PU partitions in a coding unit (CU). Then, all redundant search points are removed prior to the estimation of the motion costs, and the best search points are then selected for all PUs. Compared to the conventional TZ search algorithm, experimental results show that the proposed algorithm significantly decreases the Bjøntegaard Delta bitrate (BD-BR) by 0.84%, and it also reduces the computational complexity by 54.54%.

  1. Learning Search Algorithms: An Educational View

    Directory of Open Access Journals (Sweden)

    Ales Janota

    2014-12-01

    Full Text Available Artificial intelligence methods find practical usage in many applications, including the maritime industry. The paper concentrates on the methods of uninformed and informed search, potentially usable for solving complex problems based on the state space representation. The problem of introducing search algorithms to newcomers has both technical and psychological dimensions. The authors show how it is possible to cope with both of them through the design and use of specialized authoring systems. A typical example of searching for a path through a maze is used to demonstrate how to test, observe and compare properties of various search strategies. The performance of the search methods is evaluated based on common criteria.

  2. Sustainable Scheduling of Cloth Production Processes by Multi-Objective Genetic Algorithm with Tabu-Enhanced Local Search

    Directory of Open Access Journals (Sweden)

    Rui Zhang

    2017-09-01

    Full Text Available The dyeing of textile materials is the most critical process in cloth production because of the strict technological requirements. In addition to the technical aspect, there have been increasing concerns over how to minimize the negative environmental impact of the dyeing industry. The emissions of pollutants are mainly caused by frequent cleaning operations, which are necessary for initializing the dyeing equipment, as well as by idled production capacity, which leads to the discharge of unconsumed chemicals. Motivated by these facts, we propose a methodology to reduce pollutant emissions by means of systematic production scheduling. Firstly, we build a three-objective scheduling model that incorporates both the traditional tardiness objective and the environmentally related objectives. A mixed-integer programming formulation is also provided to accurately define the problem. Then, we present a novel solution method for the sustainable scheduling problem, namely, a multi-objective genetic algorithm with a tabu-enhanced iterated greedy local search strategy (MOGA-TIG). Finally, we conduct extensive computational experiments to investigate the actual performance of the MOGA-TIG. Based on a fair comparison with two state-of-the-art multi-objective optimizers, it is concluded that the MOGA-TIG is able to achieve satisfactory solution quality within a tight computational time budget for the studied scheduling problem.

  3. Hybrid Artificial Bee Colony Algorithm and Particle Swarm Search for Global Optimization

    Directory of Open Access Journals (Sweden)

    Wang Chun-Feng

    2014-01-01

    Full Text Available The artificial bee colony (ABC) algorithm is one of the most recent swarm intelligence based algorithms and has been shown to be competitive with other population-based algorithms. However, there is still an insufficiency in ABC regarding its solution search equation, which is good at exploration but poor at exploitation. To overcome this problem, we propose a novel artificial bee colony algorithm based on a particle swarm search mechanism. In this algorithm, to improve the convergence speed, the initial population is first generated by using good point set theory rather than random selection. Secondly, in order to enhance the exploitation ability, the employed bees, onlookers, and scouts utilize the mechanism of PSO to search for new candidate solutions. Finally, to further improve the searching ability, the chaotic search operator is applied to the best solution of the current iteration. Our algorithm is tested on some well-known benchmark functions and compared with other algorithms. Results show that our algorithm has good performance.

  4. Efficient sequential and parallel algorithms for planted motif search.

    Science.gov (United States)

    Nicolae, Marius; Rajasekaran, Sanguthevar

    2014-01-31

    Motif searching is an important step in the detection of rare events occurring in a set of DNA or protein sequences. One formulation of the problem is known as (l,d)-motif search or Planted Motif Search (PMS). In PMS we are given two integers l and d and n biological sequences. We want to find all sequences of length l that appear in each of the input sequences with at most d mismatches. The PMS problem is NP-complete. PMS algorithms are typically evaluated on certain instances considered challenging. Despite ample research in the area, a considerable performance gap exists because many state of the art algorithms have large runtimes even for moderately challenging instances. This paper presents a fast exact parallel PMS algorithm called PMS8. PMS8 is the first algorithm to solve the challenging (l,d) instances (25,10) and (26,11). PMS8 is also efficient on instances with larger l and d such as (50,21). We include a comparison of PMS8 with several state of the art algorithms on multiple problem instances. This paper also presents necessary and sufficient conditions for 3 l-mers to have a common d-neighbor. The program is freely available at http://engr.uconn.edu/~man09004/PMS8/. We present PMS8, an efficient exact algorithm for Planted Motif Search. PMS8 introduces novel ideas for generating common neighborhoods. We have also implemented a parallel version for this algorithm. PMS8 can solve instances not solved by any previous algorithms.
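
    A brute-force reference implementation of the (l,d) planted motif search definition given above, practical only for tiny instances. It illustrates the problem PMS8 solves, not PMS8's neighborhood-generation technique; names and the example sequences are assumptions.

        # Enumerate every l-mer over the alphabet and keep those that occur in every
        # input sequence with at most d mismatches.
        from itertools import product

        def hamming(a, b):
            return sum(x != y for x, y in zip(a, b))

        def occurs_with_mismatches(motif, sequence, d):
            l = len(motif)
            return any(hamming(motif, sequence[i:i + l]) <= d
                       for i in range(len(sequence) - l + 1))

        def planted_motif_search(sequences, l, d, alphabet="ACGT"):
            """Return all l-mers appearing in every sequence with at most d mismatches."""
            return ["".join(m) for m in product(alphabet, repeat=l)
                    if all(occurs_with_mismatches("".join(m), s, d) for s in sequences)]

        if __name__ == "__main__":
            seqs = ["ACGTTGCA", "CCACGTTG", "TTACGATG"]
            print(planted_motif_search(seqs, l=4, d=1))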

  5. Uncertain multiobjective redundancy allocation problem of repairable systems based on artificial bee colony algorithm

    Institute of Scientific and Technical Information of China (English)

    Guo Jiansheng; Wang Zutong; Zheng Mingfa; Wang Ying

    2014-01-01

    Based on uncertainty theory, this paper is devoted to the redundancy allocation problem in repairable parallel-series systems with uncertain factors, where the failure rate, repair rate and other relevant coefficients involved are considered as uncertain variables. The availability of the system and the corresponding design cost are considered as two optimization objectives. A crisp multiobjective optimization formulation is presented on the basis of uncertainty theory to solve the resulting problem. To solve this problem efficiently, a new multiobjective artificial bee colony algorithm is proposed to search the Pareto efficient set; it introduces rank value and crowding distance in the greedy selection strategy, applies a fast non-dominated sorting procedure in the exploitation search, and inserts tournament selection in the onlooker bee phase. It is shown that the proposed algorithm greatly outperforms NSGA-II and can solve the multiobjective redundancy allocation problem efficiently. Finally, a numerical example is provided to illustrate this approach.

  6. Algorithm for shortest path search in Geographic Information Systems by using reduced graphs.

    Science.gov (United States)

    Rodríguez-Puente, Rafael; Lazo-Cortés, Manuel S

    2013-01-01

    The use of Geographic Information Systems has increased considerably since the eighties and nineties, and one of their most demanding applications is shortest path search. Several studies about shortest path search show the feasibility of using graphs for this purpose. Dijkstra's algorithm is one of the classic shortest path search algorithms, but it is not well suited for shortest path search in large graphs. This is the reason why various modifications of Dijkstra's algorithm have been proposed by several authors, using heuristics to reduce the run time of the search. One of the most used heuristic algorithms is the A* algorithm, whose main goal is to reduce the run time by reducing the search space. This article proposes a modification of Dijkstra's shortest path search algorithm for reduced graphs. It shows that the cost of the path found in this work is equal to the cost of the path found using Dijkstra's algorithm in the original graph. The results of finding the shortest path by applying the proposed algorithm, Dijkstra's algorithm and the A* algorithm are compared. This comparison shows that, by applying the proposed approach, it is possible to obtain the optimal path in a similar or even shorter time than when using heuristic algorithms.
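
    A textbook Dijkstra implementation with a binary heap, included to make the baseline algorithm discussed in the record concrete. The reduced-graph modification proposed by the authors is not reproduced; the example graph is an illustrative assumption.

        # Dijkstra's shortest path algorithm over a weighted adjacency-list graph.
        import heapq

        def dijkstra(adj, source):
            """adj: dict node -> list of (neighbour, edge_weight); returns distances."""
            dist = {source: 0}
            heap = [(0, source)]
            while heap:
                d, u = heapq.heappop(heap)
                if d > dist.get(u, float("inf")):
                    continue                      # stale heap entry, skip it
                for v, w in adj.get(u, []):
                    nd = d + w
                    if nd < dist.get(v, float("inf")):
                        dist[v] = nd
                        heapq.heappush(heap, (nd, v))
            return dist

        if __name__ == "__main__":
            road_net = {"A": [("B", 4), ("C", 1)],
                        "C": [("B", 2), ("D", 5)],
                        "B": [("D", 1)],
                        "D": []}
            print(dijkstra(road_net, "A"))   # {'A': 0, 'C': 1, 'B': 3, 'D': 4}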

  7. Novel search algorithms for a mid-infrared spectral library of cotton contaminants.

    Science.gov (United States)

    Loudermilk, J Brian; Himmelsbach, David S; Barton, Franklin E; de Haseth, James A

    2008-06-01

    During harvest, a variety of plant based contaminants are collected along with cotton lint. The USDA previously created a mid-infrared, attenuated total reflection (ATR), Fourier transform infrared (FT-IR) spectral library of cotton contaminants for contaminant identification as the contaminants have negative impacts on yarn quality. This library has shown impressive identification rates for extremely similar cellulose based contaminants in cases where the library was representative of the samples searched. When spectra of contaminant samples from crops grown in different geographic locations, seasons, and conditions and measured with a different spectrometer and accessories were searched, identification rates for standard search algorithms decreased significantly. Six standard algorithms were examined: dot product, correlation, sum of absolute values of differences, sum of the square root of the absolute values of differences, sum of absolute values of differences of derivatives, and sum of squared differences of derivatives. Four categories of contaminants derived from cotton plants were considered: leaf, stem, seed coat, and hull. Experiments revealed that the performance of the standard search algorithms depended upon the category of sample being searched and that different algorithms provided complementary information about sample identity. These results indicated that choosing a single standard algorithm to search the library was not possible. Three voting scheme algorithms based on result frequency, result rank, category frequency, or a combination of these factors for the results returned by the standard algorithms were developed and tested for their capability to overcome the unpredictability of the standard algorithms' performances. The group voting scheme search was based on the number of spectra from each category of samples represented in the library returned in the top ten results of the standard algorithms. This group algorithm was able to identify
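
    A small sketch, assuming NumPy, of ranking library entries against an unknown spectrum with two of the standard metrics named in the record (normalized dot product and sum of absolute differences). The spectra below are random placeholders rather than real ATR FT-IR data, and all names are hypothetical.

        # Library search with two similarity metrics; each metric yields its own ranking.
        import numpy as np

        def dot_product_score(query, reference):
            q = query / np.linalg.norm(query)
            r = reference / np.linalg.norm(reference)
            return float(np.dot(q, r))                         # higher is more similar

        def abs_difference_score(query, reference):
            return float(np.sum(np.abs(query - reference)))    # lower is more similar

        def search_library(query, library):
            by_dot = sorted(library, key=lambda name: -dot_product_score(query, library[name]))
            by_abs = sorted(library, key=lambda name: abs_difference_score(query, library[name]))
            return by_dot, by_abs                              # two (possibly different) rankings

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            library = {f"contaminant_{i}": rng.random(100) for i in range(5)}
            query = library["contaminant_2"] + 0.05 * rng.random(100)   # noisy match
            print(search_library(query, library))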

  8. Comparison of genetic algorithm and harmony search for generator maintenance scheduling

    International Nuclear Information System (INIS)

    Khan, L.; Mumtaz, S.; Khattak, A.

    2012-01-01

    GMS (Generator Maintenance Scheduling) ranks very high in decision making of power generation management. The generator maintenance schedule decides the time period of maintenance tasks, and a reliable reserve margin is also maintained during this time period. In this paper, a comparison of GA (Genetic Algorithm) and HS (Harmony Search) algorithms is presented to solve the generator maintenance scheduling problem for WAPDA (Water And Power Development Authority) Pakistan. GA is a search procedure, which is used in search problems to compute exact and optimized solutions, and is considered a global search heuristic technique. The HS algorithm is quite efficient, because its convergence rate is very fast. The HS algorithm is based on the concept of the music improvisation process of searching for a perfect state of harmony. The two algorithms generate feasible and optimal solutions and overcome the limitations of the conventional methods, including extensive computational effort, which increases exponentially as the size of the problem increases. The proposed methods are tested, validated and compared on the WAPDA electric system. (author)

  9. A Cooperative Harmony Search Algorithm for Function Optimization

    Directory of Open Access Journals (Sweden)

    Gang Li

    2014-01-01

    Full Text Available Harmony search algorithm (HS is a new metaheuristic algorithm which is inspired by a process involving musical improvisation. HS is a stochastic optimization technique that is similar to genetic algorithms (GAs and particle swarm optimizers (PSOs. It has been widely applied in order to solve many complex optimization problems, including continuous and discrete problems, such as structure design, and function optimization. A cooperative harmony search algorithm (CHS is developed in this paper, with cooperative behavior being employed as a significant improvement to the performance of the original algorithm. Standard HS just uses one harmony memory and all the variables of the object function are improvised within the harmony memory, while the proposed algorithm CHS uses multiple harmony memories, so that each harmony memory can optimize different components of the solution vector. The CHS was then applied to function optimization problems. The results of the experiment show that CHS is capable of finding better solutions when compared to HS and a number of other algorithms, especially in high-dimensional problems.
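
    For readers unfamiliar with the improvisation step that the cooperative variant above builds on, here is a hedged sketch of standard harmony search with a single harmony memory; the parameter names (hms, hmcr, par, bw) follow common usage, and the cooperative multi-memory decomposition itself is not shown.

    import random

    def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.05, iters=2000):
        """Minimise f over box constraints with a standard single-memory HS."""
        memory = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
        scores = [f(h) for h in memory]
        for _ in range(iters):
            new = []
            for j, (lo, hi) in enumerate(bounds):
                if random.random() < hmcr:                  # memory consideration
                    x = memory[random.randrange(hms)][j]
                    if random.random() < par:               # pitch adjustment
                        x += random.uniform(-bw, bw) * (hi - lo)
                else:                                       # random selection
                    x = random.uniform(lo, hi)
                new.append(min(max(x, lo), hi))
            new_score = f(new)
            worst = max(range(hms), key=lambda i: scores[i])
            if new_score < scores[worst]:                   # replace the worst harmony
                memory[worst], scores[worst] = new, new_score
        best = min(range(hms), key=lambda i: scores[i])
        return memory[best], scores[best]

    # e.g. harmony_search(lambda x: sum(v * v for v in x), [(-5.0, 5.0)] * 10)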

  10. Search algorithms as a framework for the optimization of drug combinations.

    Directory of Open Access Journals (Sweden)

    Diego Calzolari

    2008-12-01

    Full Text Available Combination therapies are often needed for effective clinical outcomes in the management of complex diseases, but presently they are generally based on empirical clinical experience. Here we suggest a novel application of search algorithms -- originally developed for digital communication -- modified to optimize combinations of therapeutic interventions. In biological experiments measuring the restoration of the decline with age in heart function and exercise capacity in Drosophila melanogaster, we found that search algorithms correctly identified optimal combinations of four drugs using only one-third of the tests performed in a fully factorial search. In experiments identifying combinations of three doses of up to six drugs for selective killing of human cancer cells, search algorithms resulted in a highly significant enrichment of selective combinations compared with random searches. In simulations using a network model of cell death, we found that the search algorithms identified the optimal combinations of 6-9 interventions in 80-90% of tests, compared with 15-30% for an equivalent random search. These findings suggest that modified search algorithms from information theory have the potential to enhance the discovery of novel therapeutic drug combinations. This report also helps to frame a biomedical problem that will benefit from an interdisciplinary effort and suggests a general strategy for its solution.

  11. Numerical Algorithms for Personalized Search in Self-organizing Information Networks

    CERN Document Server

    Kamvar, Sep

    2010-01-01

    This book lays out the theoretical groundwork for personalized search and reputation management, both on the Web and in peer-to-peer and social networks. Representing much of the foundational research in this field, the book develops scalable algorithms that exploit the graphlike properties underlying personalized search and reputation management, and delves into realistic scenarios regarding Web-scale data. Sep Kamvar focuses on eigenvector-based techniques in Web search, introducing a personalized variant of Google's PageRank algorithm, and he outlines algorithms--such as the now-famous quad

  12. Evaluation of dynamically dimensioned search algorithm for optimizing SWAT by altering sampling distributions and searching range

    Science.gov (United States)

    The primary advantage of Dynamically Dimensioned Search algorithm (DDS) is that it outperforms many other optimization techniques in both convergence speed and the ability in searching for parameter sets that satisfy statistical guidelines while requiring only one algorithm parameter (perturbation f...

  13. Search and optimization by metaheuristics techniques and algorithms inspired by nature

    CERN Document Server

    Du, Ke-Lin

    2016-01-01

    This textbook provides a comprehensive introduction to nature-inspired metaheuristic methods for search and optimization, including the latest trends in evolutionary algorithms and other forms of natural computing. Over 100 different types of these methods are discussed in detail. The authors emphasize non-standard optimization problems and utilize a natural approach to the topic, moving from basic notions to more complex ones. An introductory chapter covers the necessary biological and mathematical backgrounds for understanding the main material. Subsequent chapters then explore almost all of the major metaheuristics for search and optimization created based on natural phenomena, including simulated annealing, recurrent neural networks, genetic algorithms and genetic programming, differential evolution, memetic algorithms, particle swarm optimization, artificial immune systems, ant colony optimization, tabu search and scatter search, bee and bacteria foraging algorithms, harmony search, biomolecular computin...

  14. Algorithms for selecting informative marker panels for population assignment.

    Science.gov (United States)

    Rosenberg, Noah A

    2005-11-01

    Given a set of potential source populations, genotypes of an individual of unknown origin at a collection of markers can be used to predict the correct source population of the individual. For improved efficiency, informative markers can be chosen from a larger set of markers to maximize the accuracy of this prediction. However, selecting the loci that are individually most informative does not necessarily produce the optimal panel. Here, using genotypes from eight species--carp, cat, chicken, dog, fly, grayling, human, and maize--this univariate accumulation procedure is compared to new multivariate "greedy" and "maximin" algorithms for choosing marker panels. The procedures generally suggest similar panels, although the greedy method often recommends inclusion of loci that are not chosen by the other algorithms. In seven of the eight species, when applied to five or more markers, all methods achieve at least 94% assignment accuracy on simulated individuals, with one species--dog--producing this level of accuracy with only three markers, and the eighth species--human--requiring approximately 13-16 markers. The new algorithms produce substantial improvements over use of randomly selected markers; where differences among the methods are noticeable, the greedy algorithm leads to slightly higher probabilities of correct assignment. Although none of the approaches necessarily chooses the panel with optimal performance, the algorithms all likely select panels with performance near enough to the maximum that they all are suitable for practical use.
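
    The multivariate greedy strategy compared above can be pictured as a simple forward selection over markers, assuming a black-box estimator of panel-level assignment accuracy (for example, accuracy on simulated individuals). The estimator and names below are illustrative assumptions, not the paper's implementation.

    def greedy_marker_panel(candidates, panel_accuracy, panel_size):
        """Greedily grow a marker panel, at each step adding the marker that
        maximises the accuracy of the panel as a whole (not individually)."""
        panel, remaining = [], list(candidates)
        while remaining and len(panel) < panel_size:
            best = max(remaining, key=lambda m: panel_accuracy(panel + [m]))
            panel.append(best)
            remaining.remove(best)
        return panel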

  15. An ensemble based nonlinear orthogonal matching pursuit algorithm for sparse history matching of reservoir models

    KAUST Repository

    Fsheikh, Ahmed H.

    2013-01-01

    A nonlinear orthogonal matching pursuit (NOMP) for sparse calibration of reservoir models is presented. Sparse calibration is a challenging problem as the unknowns are both the non-zero components of the solution and their associated weights. NOMP is a greedy algorithm that discovers at each iteration the most correlated components of the basis functions with the residual. The discovered basis (aka support) is augmented across the nonlinear iterations. Once the basis functions are selected from the dictionary, the solution is obtained by applying Tikhonov regularization. The proposed algorithm relies on approximate gradient estimation using an iterative stochastic ensemble method (ISEM). ISEM utilizes an ensemble of directional derivatives to efficiently approximate gradients. In the current study, the search space is parameterized using an overcomplete dictionary of basis functions built using the K-SVD algorithm.
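
    As background for the record above, a minimal linear orthogonal matching pursuit sketch is given below; the paper's NOMP additionally applies Tikhonov regularization on the selected support and uses ISEM-based gradient approximations, neither of which is reproduced here, and the dense NumPy dictionary layout is an assumption.

    import numpy as np

    def omp(D, y, n_nonzero):
        """Greedy sparse coding of y over dictionary D (columns are basis functions):
        pick the atom most correlated with the residual, then re-fit on the support."""
        residual = y.copy()
        support = []
        coef = np.zeros(D.shape[1])
        sol = np.zeros(0)
        for _ in range(n_nonzero):
            k = int(np.argmax(np.abs(D.T @ residual)))   # most correlated atom
            if k not in support:
                support.append(k)
            sol, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
            residual = y - D[:, support] @ sol           # orthogonalised residual
        coef[support] = sol
        return coef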

  16. Simulated annealing algorithm for solving chambering student-case assignment problem

    Science.gov (United States)

    Ghazali, Saadiah; Abdul-Rahman, Syariza

    2015-12-01

    The project assignment problem is a popular practical problem that arises frequently nowadays. The challenge of solving it grows as the complexity of preferences, the existence of real-world constraints, and the problem size increase. This study focuses on solving a chambering student-case assignment problem, which is classified as a project assignment problem, by using a simulated annealing algorithm. The project assignment problem is considered a hard combinatorial optimization problem, and solving it with a metaheuristic approach is advantageous because it can return a good solution in a reasonable time. The problem of assigning chambering students to cases has never been addressed in the literature before. In this setting, law graduates must complete their chambering before they are qualified to become legal counsel, so assigning chambering students to cases is critically needed, especially when many preferences are involved. Hence, this study presents a preliminary study of the proposed project assignment problem. The objective is to minimize the total completion time for all students in solving the given cases. A minimum cost greedy heuristic is employed to construct a feasible initial solution, and the search then proceeds with a simulated annealing algorithm to further improve solution quality. Analysis of the obtained results shows that the proposed simulated annealing algorithm greatly improves the solution constructed by the minimum cost greedy heuristic, demonstrating the advantages of solving the project assignment problem with metaheuristic techniques.
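
    A generic simulated annealing loop of the kind used in the study can be sketched as below; the chambering-specific cost function, the neighbourhood move, and the minimum-cost greedy construction of the initial solution are treated as black boxes (illustrative assumptions), with the greedy solution simply passed in as `initial`.

    import math
    import random

    def simulated_annealing(initial, cost, neighbour, t0=100.0, cooling=0.995, iters=20000):
        """Generic simulated annealing: occasionally accept worse neighbours,
        with a probability that shrinks as the temperature cools."""
        current, current_cost = initial, cost(initial)
        best, best_cost = current, current_cost
        t = t0
        for _ in range(iters):
            candidate = neighbour(current)
            delta = cost(candidate) - current_cost
            if delta < 0 or random.random() < math.exp(-delta / t):
                current, current_cost = candidate, current_cost + delta
                if current_cost < best_cost:
                    best, best_cost = current, current_cost
            t *= cooling
        return best, best_cost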

  17. Genetic local search algorithm for optimization design of diffractive optical elements.

    Science.gov (United States)

    Zhou, G; Chen, Y; Wang, Z; Song, H

    1999-07-10

    We propose a genetic local search algorithm (GLSA) for the optimization design of diffractive optical elements (DOE's). This hybrid algorithm incorporates advantages of both genetic algorithm (GA) and local search techniques. It appears better able to locate the global minimum compared with a canonical GA. Sample cases investigated here include the optimization design of binary-phase Dammann gratings, continuous surface-relief grating array generators, and a uniform top-hat focal plane intensity profile generator. Two GLSA's whose incorporated local search techniques are the hill-climbing method and the simulated annealing algorithm are investigated. Numerical experimental results demonstrate that the proposed algorithm is highly efficient and robust. DOE's that have high diffraction efficiency and excellent uniformity can be achieved by use of the algorithm we propose.

  18. A Fuzzy Gravitational Search Algorithm to Design Optimal IIR Filters

    Directory of Open Access Journals (Sweden)

    Danilo Pelusi

    2018-03-01

    Full Text Available The goodness of Infinite Impulse Response (IIR digital filters design depends on pass band ripple, stop band ripple and transition band values. The main problem is defining a suitable error fitness function that depends on these parameters. This fitness function can be optimized by search algorithms such as evolutionary algorithms. This paper proposes an intelligent algorithm for the design of optimal 8th order IIR filters. The main contribution is the design of Fuzzy Inference Systems able to tune key parameters of a revisited version of the Gravitational Search Algorithm (GSA. In this way, a Fuzzy Gravitational Search Algorithm (FGSA is designed. The optimization performances of FGSA are compared with those of Differential Evolution (DE and GSA. The results show that FGSA is the algorithm that gives the best compromise between goodness, robustness and convergence rate for the design of 8th order IIR filters. Moreover, FGSA assures a good stability of the designed filters.

  19. The quadratic speedup in Grover's search algorithm from the entanglement perspective

    International Nuclear Information System (INIS)

    Rungta, Pranaw

    2009-01-01

    We show that Grover's algorithm can be described as an iterative change of the bipartite entanglement, which leads to a necessary and sufficient condition for quadratic speedup. This allows us to reestablish, from the entanglement perspective, that Grover's search algorithm is the only optimal pure state search algorithm.

  20. Progressive-Search Algorithms for Large-Vocabulary Speech Recognition

    National Research Council Canada - National Science Library

    Murveit, Hy; Butzberger, John; Digalakis, Vassilios; Weintraub, Mitch

    1993-01-01

    .... An algorithm, the "Forward-Backward Word-Life Algorithm," is described. It can generate a word lattice in a progressive search that would be used as a language model embedded in a succeeding recognition pass to reduce computation requirements...

  1. Fault-tolerant search algorithms reliable computation with unreliable information

    CERN Document Server

    Cicalese, Ferdinando

    2013-01-01

    Why a book on fault-tolerant search algorithms? Searching is one of the fundamental problems in computer science. Time and again algorithmic and combinatorial issues originally studied in the context of search find application in the most diverse areas of computer science and discrete mathematics. On the other hand, fault-tolerance is a necessary ingredient of computing. Due to their inherent complexity, information systems are naturally prone to errors, which may appear at any level - as imprecisions in the data, bugs in the software, or transient or permanent hardware failures. This book pr

  2. Hardware Implementation of Diamond Search Algorithm for Motion Estimation and Object Tracking

    International Nuclear Information System (INIS)

    Hashimaa, S.M.; Mahmoud, I.I.; Elazm, A.A.

    2009-01-01

    Object tracking is a very important task in computer vision. Fast search algorithms have emerged as an important search technique to achieve real-time tracking results. To enhance the performance of these algorithms, we advocate their hardware implementation. Diamond search block matching motion estimation has been proposed recently to reduce the complexity of motion estimation. In this paper we selected the diamond search (DS) algorithm for implementation on an FPGA, due to its fundamental role in all fast search patterns. The proposed architecture is simulated and synthesized using the Xilinx and ModelSim software tools. The results agree with the algorithm implementation in the Matlab environment.

  3. An Enhanced Discrete Artificial Bee Colony Algorithm to Minimize the Total Flow Time in Permutation Flow Shop Scheduling with Limited Buffers

    Directory of Open Access Journals (Sweden)

    Guanlong Deng

    2016-01-01

    Full Text Available This paper presents an enhanced discrete artificial bee colony algorithm for minimizing the total flow time in the flow shop scheduling problem with buffer capacity. First, the solution in the algorithm is represented as discrete job permutation to directly convert to active schedule. Then, we present a simple and effective scheme called best insertion for the employed bee and onlooker bee and introduce a combined local search exploring both insertion and swap neighborhood. To validate the performance of the presented algorithm, a computational campaign is carried out on the Taillard benchmark instances, and computations and comparisons show that the proposed algorithm is not only capable of solving the benchmark set better than the existing discrete differential evolution algorithm and iterated greedy algorithm, but also capable of performing better than two recently proposed discrete artificial bee colony algorithms.

  4. Quantum signature scheme based on a quantum search algorithm

    International Nuclear Information System (INIS)

    Yoon, Chun Seok; Kang, Min Sung; Lim, Jong In; Yang, Hyung Jin

    2015-01-01

    We present a quantum signature scheme based on a two-qubit quantum search algorithm. For secure transmission of signatures, we use a quantum search algorithm that has not been used in previous quantum signature schemes. A two-step protocol secures the quantum channel, and a trusted center guarantees non-repudiation that is similar to other quantum signature schemes. We discuss the security of our protocol. (paper)

  5. Nature-inspired novel Cuckoo Search Algorithm for genome

    Indian Academy of Sciences (India)

    This study aims to produce a novel optimization algorithm, called the Cuckoo Search Algorithm (CS), for solving the genome sequence assembly problem. ... Department of Electronics and Communication Engineering, Coimbatore Institute of Technology, Coimbatore 641 014, India; Department of Information Technology, ...

  6. Parameter estimation for chaotic systems using a hybrid adaptive cuckoo search with simulated annealing algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Sheng, Zheng, E-mail: 19994035@sina.com [College of Meteorology and Oceanography, PLA University of Science and Technology, Nanjing 211101 (China); Wang, Jun; Zhou, Bihua [National Defense Key Laboratory on Lightning Protection and Electromagnetic Camouflage, PLA University of Science and Technology, Nanjing 210007 (China); Zhou, Shudao [College of Meteorology and Oceanography, PLA University of Science and Technology, Nanjing 211101 (China); Collaborative Innovation Center on Forecast and Evaluation of Meteorological Disasters, Nanjing University of Information Science and Technology, Nanjing 210044 (China)

    2014-03-15

    This paper introduces a novel hybrid optimization algorithm to establish the parameters of chaotic systems. In order to deal with the weaknesses of the traditional cuckoo search algorithm, the proposed adaptive cuckoo search with simulated annealing algorithm is presented, which incorporates an adaptive parameter-adjusting operation and a simulated annealing operation into the cuckoo search algorithm. Normally, the parameters of the cuckoo search algorithm are kept constant, which may decrease the efficiency of the algorithm. For the purpose of balancing and enhancing the accuracy and convergence rate of the cuckoo search algorithm, the adaptive operation is presented to tune the parameters properly. Besides, the local search capability of the cuckoo search algorithm is relatively weak, which may decrease the quality of the optimization. So the simulated annealing operation is merged into the cuckoo search algorithm to enhance the local search ability and improve the accuracy and reliability of the results. The functionality of the proposed hybrid algorithm is investigated through the Lorenz chaotic system under noiseless and noisy conditions, respectively. The numerical results demonstrate that the method can estimate parameters efficiently and accurately under both conditions. Finally, the results are compared with those of the traditional cuckoo search algorithm, the genetic algorithm, and particle swarm optimization. Simulation results demonstrate the effectiveness and superior performance of the proposed algorithm.

  7. Parameter estimation for chaotic systems using a hybrid adaptive cuckoo search with simulated annealing algorithm

    International Nuclear Information System (INIS)

    Sheng, Zheng; Wang, Jun; Zhou, Bihua; Zhou, Shudao

    2014-01-01

    This paper introduces a novel hybrid optimization algorithm to establish the parameters of chaotic systems. In order to deal with the weaknesses of the traditional cuckoo search algorithm, the proposed adaptive cuckoo search with simulated annealing algorithm is presented, which incorporates an adaptive parameter-adjusting operation and a simulated annealing operation into the cuckoo search algorithm. Normally, the parameters of the cuckoo search algorithm are kept constant, which may decrease the efficiency of the algorithm. For the purpose of balancing and enhancing the accuracy and convergence rate of the cuckoo search algorithm, the adaptive operation is presented to tune the parameters properly. Besides, the local search capability of the cuckoo search algorithm is relatively weak, which may decrease the quality of the optimization. So the simulated annealing operation is merged into the cuckoo search algorithm to enhance the local search ability and improve the accuracy and reliability of the results. The functionality of the proposed hybrid algorithm is investigated through the Lorenz chaotic system under noiseless and noisy conditions, respectively. The numerical results demonstrate that the method can estimate parameters efficiently and accurately under both conditions. Finally, the results are compared with those of the traditional cuckoo search algorithm, the genetic algorithm, and particle swarm optimization. Simulation results demonstrate the effectiveness and superior performance of the proposed algorithm.

  8. Improving GPU-accelerated adaptive IDW interpolation algorithm using fast kNN search.

    Science.gov (United States)

    Mei, Gang; Xu, Nengxiong; Xu, Liangliang

    2016-01-01

    This paper presents an efficient parallel Adaptive Inverse Distance Weighting (AIDW) interpolation algorithm on modern Graphics Processing Unit (GPU). The presented algorithm is an improvement of our previous GPU-accelerated AIDW algorithm by adopting fast k-nearest neighbors (kNN) search. In AIDW, it needs to find several nearest neighboring data points for each interpolated point to adaptively determine the power parameter; and then the desired prediction value of the interpolated point is obtained by weighted interpolating using the power parameter. In this work, we develop a fast kNN search approach based on the space-partitioning data structure, even grid, to improve the previous GPU-accelerated AIDW algorithm. The improved algorithm is composed of the stages of kNN search and weighted interpolating. To evaluate the performance of the improved algorithm, we perform five groups of experimental tests. The experimental results indicate: (1) the improved algorithm can achieve a speedup of up to 1017 over the corresponding serial algorithm; (2) the improved algorithm is at least two times faster than our previous GPU-accelerated AIDW algorithm; and (3) the utilization of fast kNN search can significantly improve the computational efficiency of the entire GPU-accelerated AIDW algorithm.
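
    A serial sketch of the two stages described above (even-grid kNN search followed by distance-weighted interpolation) is given below. The GPU parallelization and the adaptive selection of the power parameter are not reproduced; a fixed power of 2, the array layout, and the function name are illustrative assumptions.

    import numpy as np

    def grid_knn_idw(known_xy, known_z, query_xy, k=8, cell=1.0):
        """known_xy: (n, 2) array of data coordinates, known_z: (n,) values,
        query_xy: (m, 2) array of interpolation points.  Returns (m,) predictions."""
        buckets = {}
        for idx, (x, y) in enumerate(known_xy):           # bucket points on an even grid
            buckets.setdefault((int(x // cell), int(y // cell)), []).append(idx)
        out = []
        for qx, qy in query_xy:
            ci, cj = int(qx // cell), int(qy // cell)
            cand = [i for di in (-1, 0, 1) for dj in (-1, 0, 1)
                    for i in buckets.get((ci + di, cj + dj), [])]
            if len(cand) < k:                             # widen to all points if the
                cand = list(range(len(known_xy)))         # neighbouring cells are sparse
            cand = np.asarray(cand)
            d = np.hypot(known_xy[cand, 0] - qx, known_xy[cand, 1] - qy)
            nearest = cand[np.argsort(d)[:k]]
            dk = np.sort(d)[:k]
            if dk[0] < 1e-12:                             # query coincides with a data point
                out.append(float(known_z[nearest[0]]))
            else:
                w = 1.0 / dk ** 2                         # fixed power parameter of 2
                out.append(float(np.sum(w * known_z[nearest]) / np.sum(w)))
        return np.array(out)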

  9. Pattern Nulling of Linear Antenna Arrays Using Backtracking Search Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Kerim Guney

    2015-01-01

    Full Text Available An evolutionary method based on backtracking search optimization algorithm (BSA is proposed for linear antenna array pattern synthesis with prescribed nulls at interference directions. Pattern nulling is obtained by controlling only the amplitude, position, and phase of the antenna array elements. BSA is an innovative metaheuristic technique based on an iterative process. Various numerical examples of linear array patterns with the prescribed single, multiple, and wide nulls are given to illustrate the performance and flexibility of BSA. The results obtained by BSA are compared with the results of the following seventeen algorithms: particle swarm optimization (PSO, genetic algorithm (GA, modified touring ant colony algorithm (MTACO, quadratic programming method (QPM, bacterial foraging algorithm (BFA, bees algorithm (BA, clonal selection algorithm (CLONALG, plant growth simulation algorithm (PGSA, tabu search algorithm (TSA, memetic algorithm (MA, nondominated sorting GA-2 (NSGA-2, multiobjective differential evolution (MODE, decomposition with differential evolution (MOEA/D-DE, comprehensive learning PSO (CLPSO, harmony search algorithm (HSA, seeker optimization algorithm (SOA, and mean variance mapping optimization (MVMO. The simulation results show that the linear antenna array synthesis using BSA provides low side-lobe levels and deep null levels.

  10. Combined heat and power economic dispatch by harmony search algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Vasebi, A.; Bathaee, S.M.T. [Power System Research Laboratory, Department of Electrical and Electronic Engineering, K.N.Toosi University of Technology, 322-Mirdamad Avenue West, 19697 Tehran (Iran); Fesanghary, M. [Department of Mechanical Engineering, Amirkabir University of Technology, 424-Hafez Avenue, Tehran (Iran)

    2007-12-15

    The optimal utilization of multiple combined heat and power (CHP) systems is a complicated problem that needs powerful methods to solve. This paper presents a harmony search (HS) algorithm to solve the combined heat and power economic dispatch (CHPED) problem. The HS algorithm is a recently developed meta-heuristic algorithm, and has been very successful in a wide variety of optimization problems. The method is illustrated using a test case taken from the literature as well as a new one proposed by authors. Numerical results reveal that the proposed algorithm can find better solutions when compared to conventional methods and is an efficient search algorithm for CHPED problem. (author)

  11. Tackling Error Propagation through Reinforcement Learning: A Case of Greedy Dependency Parsing

    OpenAIRE

    Le, Minh; Fokkens, Antske

    2017-01-01

    Error propagation is a common problem in NLP. Reinforcement learning explores erroneous states during training and can therefore be more robust when mistakes are made early in a process. In this paper, we apply reinforcement learning to greedy dependency parsing which is known to suffer from error propagation. Reinforcement learning improves accuracy of both labeled and unlabeled dependencies of the Stanford Neural Dependency Parser, a high performance greedy parser, while maintaining its eff...

  12. Kernel Clustering with a Differential Harmony Search Algorithm for Scheme Classification

    Directory of Open Access Journals (Sweden)

    Yu Feng

    2017-01-01

    Full Text Available This paper presents a kernel fuzzy clustering with a novel differential harmony search algorithm to coordinate with the diversion scheduling scheme classification. First, we employed a self-adaptive solution generation strategy and differential evolution-based population update strategy to improve the classical harmony search. Second, we applied the differential harmony search algorithm to the kernel fuzzy clustering to help the clustering method obtain better solutions. Finally, the combination of the kernel fuzzy clustering and the differential harmony search is applied for water diversion scheduling in East Lake. A comparison of the proposed method with other methods has been carried out. The results show that the kernel clustering with the differential harmony search algorithm has good performance to cooperate with the water diversion scheduling problems.

  13. A Direct Search Algorithm for Global Optimization

    Directory of Open Access Journals (Sweden)

    Enrique Baeyens

    2016-06-01

    Full Text Available A direct search algorithm is proposed for minimizing an arbitrary real valued function. The algorithm uses a new function transformation and three simplex-based operations. The function transformation provides global exploration features, while the simplex-based operations guarantees the termination of the algorithm and provides global convergence to a stationary point if the cost function is differentiable and its gradient is Lipschitz continuous. The algorithm’s performance has been extensively tested using benchmark functions and compared to some well-known global optimization algorithms. The results of the computational study show that the algorithm combines both simplicity and efficiency and is competitive with the heuristics-based strategies presently used for global optimization.

  14. Teaching AI Search Algorithms in a Web-Based Educational System

    Science.gov (United States)

    Grivokostopoulou, Foteini; Hatzilygeroudis, Ioannis

    2013-01-01

    In this paper, we present a way of teaching AI search algorithms in a web-based adaptive educational system. Teaching is based on interactive examples and exercises. Interactive examples, which use visualized animations to present AI search algorithms in a step-by-step way with explanations, are used to make learning more attractive. Practice…

  15. Archiving, ordering and searching: search engines, algorithms, databases and deep mediatization

    DEFF Research Database (Denmark)

    Andersen, Jack

    2018-01-01

    This article argues that search engines, algorithms, and databases can be considered as a way of understanding deep mediatization (Couldry & Hepp, 2016). They are embedded in a variety of social and cultural practices and as such they change our communicative actions to be shaped by their logic. Having reviewed recent trends in mediatization research, the argument is discussed and unfolded in-between the material and social constructivist-phenomenological interpretations of mediatization. In conclusion, it is discussed how deep this form of mediatization can be taken to be.

  16. Transitionless driving on adiabatic search algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Sangchul, E-mail: soh@qf.org.qa [Qatar Environment and Energy Research Institute, Qatar Foundation, Doha (Qatar); Kais, Sabre, E-mail: kais@purdue.edu [Qatar Environment and Energy Research Institute, Qatar Foundation, Doha (Qatar); Department of Chemistry, Department of Physics and Birck Nanotechnology Center, Purdue University, West Lafayette, Indiana 47907 (United States)

    2014-12-14

    We study quantum dynamics of the adiabatic search algorithm with the equivalent two-level system. Its adiabatic and non-adiabatic evolution is studied and visualized as trajectories of Bloch vectors on a Bloch sphere. We find the change in the non-adiabatic transition probability from exponential decay for the short running time to inverse-square decay in asymptotic running time. The scaling of the critical running time is expressed in terms of the Lambert W function. We derive the transitionless driving Hamiltonian for the adiabatic search algorithm, which makes a quantum state follow the adiabatic path. We demonstrate that a uniform transitionless driving Hamiltonian, approximate to the exact time-dependent driving Hamiltonian, can alter the non-adiabatic transition probability from the inverse square decay to the inverse fourth power decay with the running time. This may open up a new but simple way of speeding up adiabatic quantum dynamics.

  17. An Improved Harmony Search Algorithm for Power Distribution Network Planning

    Directory of Open Access Journals (Sweden)

    Wei Sun

    2015-01-01

    Full Text Available Distribution network planning, because it involves many variables and constraints, is a multiobjective, discrete, nonlinear, and large-scale optimization problem. The harmony search (HS) algorithm is a metaheuristic algorithm inspired by the improvisation process of music players. The HS algorithm has several impressive advantages, such as easy implementation, few adjustable parameters, and quick convergence, but it still has some defects, such as premature convergence and slow convergence speed. According to the defects of the standard algorithm and the characteristics of distribution network planning, an improved harmony search (IHS) algorithm is proposed in this paper. We set up a mathematical model of distribution network structure planning whose objective is to minimize the annual cost, subject to overload and radial network constraints. The IHS algorithm is applied to solve this complex optimization model. The empirical results strongly indicate that the IHS algorithm can effectively provide better results for the distribution network planning problem compared to other optimization algorithms.

  18. Hybrid Projected Gradient-Evolutionary Search Algorithm for Mixed Integer Nonlinear Optimization Problems

    National Research Council Canada - National Science Library

    Homaifar, Abdollah; Esterline, Albert; Kimiaghalam, Bahram

    2005-01-01

    The Hybrid Projected Gradient-Evolutionary Search Algorithm (HPGES) algorithm uses a specially designed evolutionary-based global search strategy to efficiently create candidate solutions in the solution space...

  19. Modification of Brueschweiler quantum searching algorithm and realization by NMR experiment

    International Nuclear Information System (INIS)

    Yang Xiaodong; Wei Daxiu; Luo Jun; Miao Xijia

    2002-01-01

    In recent years, quantum computing research has made great progress. Quantum computing exploits quantum mechanical laws, such as interference, superposition and parallelism, to perform computing tasks, and its most appealing feature is the large speedup it can provide in quantum algorithms. Quantum computing can solve some problems that are impossible or difficult for classical computing. The problem of searching for a specific item in an unsorted database can be solved with certain quantum algorithms, for example Grover's quantum algorithm and Brueschweiler's quantum algorithm. The former gives a quadratic speedup, and the latter gives an exponential speedup compared with the corresponding classical algorithm. In Brueschweiler's quantum searching algorithm, the data qubit and the read-out qubit (the ancilla qubit) are different qubits. The authors have studied Brueschweiler's algorithm and propose a modified version, in which no ancilla qubit is needed to reach an exponential speedup in the search; the data and read-out qubits are the same qubits. The modified Brueschweiler algorithm is easier to design and realize. The authors also demonstrate the modified Brueschweiler algorithm in a 3-qubit molecular system by a Nuclear Magnetic Resonance (NMR) experiment

  20. Modified cuckoo search: A new gradient free optimisation algorithm

    International Nuclear Information System (INIS)

    Walton, S.; Hassan, O.; Morgan, K.; Brown, M.R.

    2011-01-01

    Highlights: → Modified cuckoo search (MCS) is a new gradient free optimisation algorithm. → MCS shows a high convergence rate, able to outperform other optimisers. → MCS is particularly strong at high dimension objective functions. → MCS performs well when applied to engineering problems. - Abstract: A new robust optimisation algorithm, which can be regarded as a modification of the recently developed cuckoo search, is presented. The modification involves the addition of information exchange between the top eggs, or the best solutions. Standard optimisation benchmarking functions are used to test the effects of these modifications and it is demonstrated that, in most cases, the modified cuckoo search performs as well as, or better than, the standard cuckoo search, a particle swarm optimiser, and a differential evolution strategy. In particular the modified cuckoo search shows a high convergence rate to the true global minimum even at high numbers of dimensions.

  1. Object Detection and Tracking using Modified Diamond Search Block Matching Motion Estimation Algorithm

    Directory of Open Access Journals (Sweden)

    Apurva Samdurkar

    2018-06-01

    Full Text Available Object tracking is one of the main fields within computer vision. Amongst various methods/ approaches for object detection and tracking, the background subtraction approach makes the detection of object easier. To the detected object, apply the proposed block matching algorithm for generating the motion vectors. The existing diamond search (DS and cross diamond search algorithms (CDS are studied and experiments are carried out on various standard video data sets and user defined data sets. Based on the study and analysis of these two existing algorithms a modified diamond search pattern (MDS algorithm is proposed using small diamond shape search pattern in initial step and large diamond shape (LDS in further steps for motion estimation. The initial search pattern consists of five points in small diamond shape pattern and gradually grows into a large diamond shape pattern, based on the point with minimum cost function. The algorithm ends with the small shape pattern at last. The proposed MDS algorithm finds the smaller motion vectors and fewer searching points than the existing DS and CDS algorithms. Further, object detection is carried out by using background subtraction approach and finally, MDS motion estimation algorithm is used for tracking the object in color video sequences. The experiments are carried out by using different video data sets containing a single object. The results are evaluated and compared by using the evaluation parameters like average searching points per frame and average computational time per frame. The experimental results show that the MDS performs better than DS and CDS on average search point and average computation time.
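
    For context, a minimal sketch of the classical diamond search that both DS and the proposed MDS build on is shown below; the MDS modification (beginning with the small diamond pattern), the cross-diamond variant, and the background-subtraction stage are not reproduced, and the 16x16 block size, SAD cost, and function names are illustrative assumptions.

    import numpy as np

    LDSP = [(0, 0), (0, -2), (0, 2), (-2, 0), (2, 0), (-1, -1), (-1, 1), (1, -1), (1, 1)]
    SDSP = [(0, 0), (0, -1), (0, 1), (-1, 0), (1, 0)]

    def sad(cur, ref, bx, by, dx, dy, n=16):
        """Sum of absolute differences between the n x n block of the current frame
        at (bx, by) and the reference-frame block displaced by (dx, dy)."""
        h, w = ref.shape
        x, y = bx + dx, by + dy
        if x < 0 or y < 0 or x + n > w or y + n > h:
            return float("inf")
        return float(np.abs(cur[by:by + n, bx:bx + n].astype(int)
                            - ref[y:y + n, x:x + n].astype(int)).sum())

    def diamond_search(cur, ref, bx, by, n=16, max_steps=32):
        """Classical DS: walk the large diamond pattern until its centre is best,
        then refine once with the small diamond pattern."""
        cx, cy = 0, 0                                  # current motion-vector estimate
        for _ in range(max_steps):
            costs = {p: sad(cur, ref, bx, by, cx + p[0], cy + p[1], n) for p in LDSP}
            best = min(costs, key=costs.get)
            if best == (0, 0):
                break                                  # minimum at centre: switch to SDSP
            cx, cy = cx + best[0], cy + best[1]
        costs = {p: sad(cur, ref, bx, by, cx + p[0], cy + p[1], n) for p in SDSP}
        best = min(costs, key=costs.get)
        return cx + best[0], cy + best[1]              # final motion vector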

  2. Cooperative mobile agents search using beehive partitioned structure and Tabu Random search algorithm

    Science.gov (United States)

    Ramazani, Saba; Jackson, Delvin L.; Selmic, Rastko R.

    2013-05-01

    In search and surveillance operations, deploying a team of mobile agents provides a robust solution that has multiple advantages over using a single agent in efficiency and minimizing exploration time. This paper addresses the challenge of identifying a target in a given environment when using a team of mobile agents by proposing a novel method of mapping and movement of agent teams in a cooperative manner. The approach consists of two parts. First, the region is partitioned into a hexagonal beehive structure in order to provide equidistant movements in every direction and to allow for more natural and flexible environment mapping. Additionally, in search environments that are partitioned into hexagons, mobile agents have an efficient travel path while performing searches due to this partitioning approach. Second, we use a team of mobile agents that move in a cooperative manner and utilize the Tabu Random algorithm to search for the target. Due to the ever-increasing use of robotics and Unmanned Aerial Vehicle (UAV) platforms, the field of cooperative multi-agent search has developed many applications recently that would benefit from the use of the approach presented in this work, including: search and rescue operations, surveillance, data collection, and border patrol. In this paper, the increased efficiency of the Tabu Random Search algorithm method in combination with hexagonal partitioning is simulated, analyzed, and advantages of this approach are presented and discussed.

  3. Contemporary Greedy Institutions: An Essay on Lewis Coser’s Concept in Times of the ‘Hive Mind’

    OpenAIRE

    de Campo, Marianne Egger

    2013-01-01

    Lewis Coser perennially discussed various forms and facets of ‘greedy institutions’ with their total grasp on the individual. Coser’s ‘greedy institutions’ demand undivided time and loyalty from the individual who will voluntarily devote him/herself for exclusive benefits only granted to loyal followers. Although the ancient authorities have vanished—princes with their court Jews, masters with their servants, or religious and political missionaries— one can argue that the idea of the greedy i...

  4. An Efficient VQ Codebook Search Algorithm Applied to AMR-WB Speech Coding

    Directory of Open Access Journals (Sweden)

    Cheng-Yu Yeh

    2017-04-01

    Full Text Available The adaptive multi-rate wideband (AMR-WB speech codec is widely used in modern mobile communication systems for high speech quality in handheld devices. Nonetheless, a major disadvantage is that vector quantization (VQ of immittance spectral frequency (ISF coefficients takes a considerable computational load in the AMR-WB coding. Accordingly, a binary search space-structured VQ (BSS-VQ algorithm is adopted to efficiently reduce the complexity of ISF quantization in AMR-WB. This search algorithm is done through a fast locating technique combined with lookup tables, such that an input vector is efficiently assigned to a subspace where relatively few codeword searches are required to be executed. In terms of overall search performance, this work is experimentally validated as a superior search algorithm relative to a multiple triangular inequality elimination (MTIE, a TIE with dynamic and intersection mechanisms (DI-TIE, and an equal-average equal-variance equal-norm nearest neighbor search (EEENNS approach. With a full search algorithm as a benchmark for overall search load comparison, this work provides an 87% search load reduction at a threshold of quantization accuracy of 0.96, a figure far beyond 55% in the MTIE, 76% in the EEENNS approach, and 83% in the DI-TIE approach.

  5. A Functional Programming Approach to AI Search Algorithms

    Science.gov (United States)

    Panovics, Janos

    2012-01-01

    The theory and practice of search algorithms related to state-space represented problems form the major part of the introductory course of Artificial Intelligence at most of the universities and colleges offering a degree in the area of computer science. Students usually meet these algorithms only in some imperative or object-oriented language…

  6. Computer Algorithms in the Search for Unrelated Stem Cell Donors

    Directory of Open Access Journals (Sweden)

    David Steiner

    2012-01-01

    Full Text Available Hematopoietic stem cell transplantation (HSCT) is a medical procedure in the field of hematology and oncology, most often performed for patients with certain cancers of the blood or bone marrow. A lot of patients have no suitable HLA-matched donor within their family, so physicians must activate a “donor search process” by interacting with national and international donor registries who will search their databases for adult unrelated donors or cord blood units (CBU). Information and communication technologies play a key role in the donor search process in donor registries both nationally and internationally. One of the major challenges for donor registry computer systems is the development of a reliable search algorithm. This work discusses the top-down design of such algorithms and current practice. Based on our experience with systems used by several stem cell donor registries, we highlight typical pitfalls in the implementation of an algorithm and underlying data structure.

  7. Construction Example for Algebra System Using Harmony Search Algorithm

    Directory of Open Access Journals (Sweden)

    FangAn Deng

    2015-01-01

    Full Text Available The construction example problem for an algebra system is to verify the existence of a complex algebra system, and it is an NP-hard problem. In this paper, to solve this kind of problem, firstly a mathematical optimization model for the construction example of an algebra system is established. Secondly, an improved harmony search algorithm based on the NGHS algorithm (INGHS) is proposed to find as many solutions as possible for the optimization model; in the proposed INGHS algorithm, to achieve a balance between exploration power and exploitation power in the search process, a global best strategy and a dynamic parameter adjustment method are presented. Finally, nine construction examples of algebra systems are used to evaluate the optimization model and the performance of INGHS. The experimental results show that the proposed algorithm has strong performance in solving complex construction example problems of algebra systems.

  8. Parallel algorithms for unconstrained optimization by multisplitting with inexact subspace search - the abstract

    Energy Technology Data Exchange (ETDEWEB)

    Renaut, R.; He, Q. [Arizona State Univ., Tempe, AZ (United States)

    1994-12-31

    A new parallel iterative algorithm for unconstrained optimization by multisplitting is proposed. In this algorithm the original problem is split into a set of small optimization subproblems which are solved using well known sequential algorithms. These algorithms are iterative in nature, e.g. the DFP variable metric method. Here the authors use sequential algorithms based on an inexact subspace search, which is an extension of the usual idea of an inexact line search. Essentially, the idea of the inexact line search for nonlinear minimization is that at each iteration the authors only find an approximate minimum in the line search direction. Hence, by inexact subspace search they mean that, instead of finding the minimum of the subproblem at each iteration, they do an incomplete downhill search to give an approximate minimum. Some convergence and numerical results for this algorithm will be presented. Further, the original theory will be generalized to the situation with a singular Hessian. Applications for nonlinear least squares problems will be presented. Experimental results will be presented for implementations on an Intel iPSC/860 Hypercube with 64 nodes as well as on the Intel Paragon.

  9. Motion Vector Estimation Using Line-Square Search Block Matching Algorithm for Video Sequences

    Directory of Open Access Journals (Sweden)

    Guo Bao-long

    2004-09-01

    Full Text Available Motion estimation and compensation techniques are widely used for video coding applications but the real-time motion estimation is not easily achieved due to its enormous computations. In this paper, a new fast motion estimation algorithm based on line search is presented, in which computation complexity is greatly reduced by using the line search strategy and a parallel search pattern. Moreover, the accurate search is achieved because the small square search pattern is used. It has a best-case scenario of only 9 search points, which is 4 search points less than the diamond search algorithm. Simulation results show that, compared with the previous techniques, the LSPS algorithm significantly reduces the computational requirements for finding motion vectors, and also produces close performance in terms of motion compensation errors.

  10. APPECT: An Approximate Backbone-Based Clustering Algorithm for Tags

    DEFF Research Database (Denmark)

    Zong, Yu; Xu, Guandong; Jin, Pin

    2011-01-01

    Traditional clustering approaches on tagging data, such as agglomerative clustering, possess inherent drawbacks, such as sensitivity to initialization. In this paper, we instead make use of the approximate backbone of tag clustering results to find out better tag clusters. In particular, we propose an APProximate backbonE-based Clustering algorithm for Tags (APPECT). The main steps of APPECT are: (1) we execute the K-means algorithm on a tag similarity matrix M times and collect a set of tag clustering results Z={C1,C2,…,Cm}; (2) we form the approximate backbone of Z by executing a greedy search; (3) we fix the approximate backbone as the initial tag clustering result and then assign the remaining tags into the corresponding clusters based on similarity. Experimental results on three real-world datasets, namely MedWorm, MovieLens and Dmoz, demonstrate the effectiveness and superiority of the proposed method.
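
    A rough sketch of the backbone construction in steps (1)-(2) above is given below, assuming scikit-learn's KMeans as the base clusterer (any k-means implementation would do); the greedy search of the original method is simplified here to keeping the groups of tags that fall in the same cluster in every run, so the code illustrates the idea rather than the paper's exact procedure.

    import numpy as np
    from sklearn.cluster import KMeans   # assumption: scikit-learn is available

    def approximate_backbone(similarity, n_clusters, m_runs=10, seed=0):
        """Run k-means m_runs times on the tag similarity matrix and return the
        groups of tags that are clustered together in every single run."""
        n = similarity.shape[0]
        together = np.ones((n, n), dtype=bool)
        for r in range(m_runs):
            labels = KMeans(n_clusters=n_clusters, n_init=1,
                            random_state=seed + r).fit_predict(similarity)
            together &= labels[:, None] == labels[None, :]
        backbone, seen = [], set()
        for i in range(n):
            if i in seen:
                continue
            group = [j for j in range(n) if together[i, j]]
            seen.update(group)
            if len(group) > 1:                 # singletons are left for the assignment step
                backbone.append(group)
        return backbone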

  11. Filter Pattern Search Algorithms for Mixed Variable Constrained Optimization Problems

    National Research Council Canada - National Science Library

    Abramson, Mark A; Audet, Charles; Dennis, Jr, J. E

    2004-01-01

    .... This class combines and extends the Audet-Dennis Generalized Pattern Search (GPS) algorithms for bound constrained mixed variable optimization, and their GPS-filter algorithms for general nonlinear constraints...

  12. A Novel Quad Harmony Search Algorithm for Grid-Based Path Finding

    Directory of Open Access Journals (Sweden)

    Saso Koceski

    2014-09-01

    Full Text Available A novel approach to the problem of grid-based path finding has been introduced. The method is a block-based search algorithm, founded on the bases of two algorithms, namely the quad-tree algorithm, which offered a great opportunity for decreasing the time needed to compute the solution, and the harmony search (HS algorithm, a meta-heuristic algorithm used to obtain the optimal solution. This quad HS algorithm uses the quad-tree decomposition of free space in the grid to mark the free areas and treat them as a single node, which greatly improves the execution. The results of the quad HS algorithm have been compared to other meta-heuristic algorithms, i.e., ant colony, genetic algorithm, particle swarm optimization and simulated annealing, and it was proved to obtain the best results in terms of time and giving the optimal path.

  13. A Hybrid Symbiotic Organisms Search Algorithm with Variable Neighbourhood Search for Solving Symmetric and Asymmetric Traveling Salesman Problem

    Science.gov (United States)

    Umam, M. I. H.; Santosa, B.

    2018-04-01

    Combinatorial optimization has frequently been used to solve problems in science, engineering, and commercial applications. One combinatorial problem in the field of transportation is to find the shortest travel route from an initial point of departure to a point of destination, while also minimizing travel cost and travel time. When the distance from the initial node to the destination node is the same as the distance to travel back from the destination to the initial node, this problem is known as the Traveling Salesman Problem (TSP); otherwise it is called an Asymmetric Traveling Salesman Problem (ATSP). One of the most recent optimization techniques is Symbiotic Organisms Search (SOS). This paper discusses how to hybridize the SOS algorithm with variable neighbourhood search (SOS-VNS) so that it can be applied to the ATSP. The proposed mechanism adds variable neighbourhood search as a local search to generate a better initial solution, and the parasitism phase is then modified with an adaptive mutation mechanism. After modification, the performance of the SOS-VNS algorithm is evaluated on several data sets and the results are compared with the best known solutions and with algorithms such as PSO and the original SOS algorithm. The SOS-VNS algorithm shows better results in terms of convergence, divergence and computing time.

  14. Scaling Up Coordinate Descent Algorithms for Large ℓ1 Regularization Problems

    Energy Technology Data Exchange (ETDEWEB)

    Scherrer, Chad; Halappanavar, Mahantesh; Tewari, Ambuj; Haglin, David J.

    2012-07-03

    We present a generic framework for parallel coordinate descent (CD) algorithms that has as special cases the original sequential algorithms of Cyclic CD and Stochastic CD, as well as the recent parallel Shotgun algorithm of Bradley et al. We introduce two novel parallel algorithms that are also special cases---Thread-Greedy CD and Coloring-Based CD---and give performance measurements for an OpenMP implementation of these.
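
    The per-coordinate update that the parallel variants above (Shotgun, Thread-Greedy, Coloring-Based) distribute across threads can be sketched sequentially as cyclic coordinate descent with a soft-threshold step for the lasso; the dense NumPy formulation and function name are illustrative assumptions, and the parallel scheduling itself is not shown.

    import numpy as np

    def lasso_cd(X, y, lam, iters=100):
        """Cyclic coordinate descent for min_w 0.5*||y - Xw||^2 + lam*||w||_1,
        updating one coordinate at a time with a soft-threshold step."""
        n, p = X.shape
        w = np.zeros(p)
        col_sq = (X ** 2).sum(axis=0)              # per-feature curvature
        for _ in range(iters):
            for j in range(p):
                r = y - X @ w + X[:, j] * w[j]     # partial residual excluding feature j
                rho = float(X[:, j] @ r)
                w[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / max(col_sq[j], 1e-12)
        return w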

  15. Hard Real-Time Task Scheduling in Cloud Computing Using an Adaptive Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Amjad Mahmood

    2017-04-01

    Full Text Available In the Infrastructure-as-a-Service cloud computing model, virtualized computing resources in the form of virtual machines are provided over the Internet. A user can rent an arbitrary number of computing resources to meet their requirements, making cloud computing an attractive choice for executing real-time tasks. Economical task allocation and scheduling on a set of leased virtual machines is an important problem in the cloud computing environment. This paper proposes a greedy algorithm and a genetic algorithm with an adaptive selection of suitable crossover and mutation operations (named AGA) to allocate and schedule real-time tasks with precedence constraints on heterogeneous virtual machines. A comprehensive simulation study has been done to evaluate the performance of the proposed algorithms in terms of their solution quality and efficiency. The simulation results show that AGA outperforms the greedy algorithm and the non-adaptive genetic algorithm in terms of solution quality.
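
    A minimal sketch of the greedy baseline described above is given below, under the simplifying assumptions of homogeneous virtual machines and no precedence constraints; task and VM identifiers are illustrative, and the adaptive genetic operators of AGA are not reproduced.

    def greedy_schedule(tasks, vms):
        """tasks: list of (task_id, runtime) pairs, vms: list of VM identifiers.
        Assign each task to the VM that currently finishes earliest and return
        the assignment plus the resulting makespan."""
        finish = {vm: 0.0 for vm in vms}
        plan = {vm: [] for vm in vms}
        for task_id, runtime in tasks:
            vm = min(vms, key=lambda v: finish[v])     # earliest-available machine
            plan[vm].append(task_id)
            finish[vm] += runtime
        return plan, max(finish.values())

    # e.g. greedy_schedule([("t1", 4.0), ("t2", 2.0), ("t3", 3.0)], ["vm1", "vm2"])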

  16. Car painting process scheduling with harmony search algorithm

    Science.gov (United States)

    Syahputra, M. F.; Maiyasya, A.; Purnamawati, S.; Abdullah, D.; Albra, W.; Heikal, M.; Abdurrahman, A.; Khaddafi, M.

    2018-02-01

    In an automotive painting program, the car body is painted by robots, which makes the production system more efficient. The production system becomes more efficient still when attention is paid to the scheduling of the car orders, taking into account the body shape of each car to be painted. Flow shop scheduling is a scheduling model in which all jobs to be processed flow along the same product direction/path. Scheduling problems arise when there are n jobs to be processed on machines and it must be specified which job is done first and how the jobs are allocated to the machines in order to obtain a scheduled production process. The Harmony Search Algorithm is a music-based metaheuristic optimization algorithm, inspired by the observation that musicians search for a perfect state of harmony; this musical harmony is analogous to finding the optimum in an optimization process. Based on the tests that have been performed, the optimal car sequence with the minimum makespan value was obtained.

  17. Reasoning about Grover's Quantum Search Algorithm using Probabilistic wp

    NARCIS (Netherlands)

    Butler, M.J.; Hartel, Pieter H.

    Grover's search algorithm is designed to be executed on a quantum mechanical computer. In this paper, the probabilistic wp-calculus is used to model and reason about Grover's algorithm. It is demonstrated that the calculus provides a rigorous programming notation for modelling this and other quantum

  18. Effects of systematic phase errors on optimized quantum random-walk search algorithm

    International Nuclear Information System (INIS)

    Zhang Yu-Chao; Bao Wan-Su; Wang Xiang; Fu Xiang-Qun

    2015-01-01

    This study investigates the effects of systematic errors in phase inversions on the success rate and number of iterations in the optimized quantum random-walk search algorithm. Using the geometric description of this algorithm, a model of the algorithm with phase errors is established, and the relationship between the success rate of the algorithm, the database size, the number of iterations, and the phase error is determined. For a given database size, we obtain both the maximum success rate of the algorithm and the required number of iterations when phase errors are present in the algorithm. Analyses and numerical simulations show that the optimized quantum random-walk search algorithm is more robust against phase errors than Grover’s algorithm. (paper)

  19. A Hybrid Backtracking Search Optimization Algorithm with Differential Evolution

    Directory of Open Access Journals (Sweden)

    Lijin Wang

    2015-01-01

    Full Text Available The backtracking search optimization algorithm (BSA) is a new nature-inspired method which possesses a memory to take advantage of experiences gained from previous generations to guide the population to the global optimum. BSA is capable of solving multimodal problems, but it converges slowly and exploits solutions poorly. The differential evolution (DE) algorithm is a robust evolutionary algorithm and has a fast convergence speed in the case of exploitive mutation strategies that utilize the information of the best solution found so far. In this paper, we propose a hybrid backtracking search optimization algorithm with differential evolution, called HBD. In HBD, DE with an exploitive strategy is used to accelerate convergence by optimizing one worse individual, selected according to its probability, at each iteration. A suite of 28 benchmark functions is employed to verify the performance of HBD, and the results show that the hybridization of BSA and DE improves both effectiveness and efficiency.

  20. Pareto Optimization of a Half Car Passive Suspension Model Using a Novel Multiobjective Heat Transfer Search Algorithm

    OpenAIRE

    Savsani, Vimal; Patel, Vivek; Gadhvi, Bhargav; Tawhid, Mohamed

    2017-01-01

    Most modern multiobjective optimization algorithms are based on the search technique of genetic algorithms; however, the search techniques of other recently developed metaheuristics are emerging topics among researchers. This paper proposes a novel multiobjective optimization algorithm named the multiobjective heat transfer search (MOHTS) algorithm, which is based on the search technique of the heat transfer search (HTS) algorithm. MOHTS employs the elitist nondominated sorting and crowding distance...

  1. Concise quantum associative memories with nonlinear search algorithm

    International Nuclear Information System (INIS)

    Tchapet Njafa, J.P.; Nana Engo, S.G.

    2016-01-01

    The model of Quantum Associative Memories (QAM) we propose here simplifies and generalizes that of Rigui Zhou et al. [1], which uses the quantum matrix with the binary decision diagram put forth by David Rosenbaum [2] and the Abrams and Lloyd's nonlinear search algorithm [3]. Our model makes it possible to retrieve one of the sought states in a multi-value retrieval scheme when a measurement is done on the first register, in O(c-r) time complexity. This is better than Grover's algorithm and its modified form, which need O(√(2^n/m)) steps when they are used as the retrieval algorithm, where n is the number of qubits of the first register and m the number of x values for which f(x) = 1. As the nonlinearity makes the system highly susceptible to noise, an analysis of the influence of the single-qubit noise channels on the nonlinear search algorithm of our model of QAM shows a fidelity of about 0.7 whatever the number of qubits in the first register, thus demonstrating the robustness of our model. (copyright 2016 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  2. Error tolerance in an NMR implementation of Grover's fixed-point quantum search algorithm

    International Nuclear Information System (INIS)

    Xiao Li; Jones, Jonathan A.

    2005-01-01

    We describe an implementation of Grover's fixed-point quantum search algorithm on a nuclear magnetic resonance quantum computer, searching for either one or two matching items in an unsorted database of four items. In this algorithm the target state (an equally weighted superposition of the matching states) is a fixed point of the recursive search operator, so that the algorithm always moves towards the desired state. The effects of systematic errors in the implementation are briefly explored

  3. Two-agent cooperative search using game models with endurance-time constraints

    Science.gov (United States)

    Sujit, P. B.; Ghose, Debasish

    2010-07-01

    In this article, the problem of two Unmanned Aerial Vehicles (UAVs) cooperatively searching an unknown region is addressed. The search region is discretized into hexagonal cells and each cell is assumed to possess an uncertainty value. The UAVs have to cooperatively search these cells taking limited endurance, sensor and communication range constraints into account. Due to limited endurance, the UAVs need to return to the base station for refuelling and also need to select a base station when multiple base stations are present. This article proposes a route planning algorithm that takes endurance time constraints into account and uses game theoretical strategies to reduce the uncertainty. The route planning algorithm selects only those cells that ensure the agent will return to any one of the available bases. A set of paths are formed using these cells which the game theoretical strategies use to select a path that yields maximum uncertainty reduction. We explore non-cooperative Nash, cooperative and security strategies from game theory to enhance the search effectiveness. Monte-Carlo simulations are carried out which show the superiority of the game theoretical strategies over greedy strategy for different look ahead step length paths. Within the game theoretical strategies, non-cooperative Nash and cooperative strategy perform similarly in an ideal case, but Nash strategy performs better than the cooperative strategy when the perceived information is different. We also propose a heuristic based on partitioning of the search space into sectors to reduce computational overhead without performance degradation.

  4. Parameters identification of hydraulic turbine governing system using improved gravitational search algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Chaoshun Li; Jianzhong Zhou [College of Hydroelectric Digitization Engineering, Huazhong University of Science and Technology, Wuhan 430074 (China)

    2011-01-15

    Parameter identification of hydraulic turbine governing system (HTGS) is crucial in precise modeling of hydropower plant and provides support for the analysis of stability of power system. In this paper, a newly developed optimization algorithm, called gravitational search algorithm (GSA), is introduced and applied in parameter identification of HTGS, and the GSA is improved by combination of the search strategy of particle swarm optimization. Furthermore, a new weighted objective function is proposed in the identification frame. The improved gravitational search algorithm (IGSA), together with genetic algorithm, particle swarm optimization and GSA, is employed in parameter identification experiments and the procedure is validated by comparing experimental and simulated results. Consequently, IGSA is shown to locate more precise parameter values than the compared methods with higher efficiency. (author)

  5. Parameters identification of hydraulic turbine governing system using improved gravitational search algorithm

    International Nuclear Information System (INIS)

    Li Chaoshun; Zhou Jianzhong

    2011-01-01

    Parameter identification of hydraulic turbine governing system (HTGS) is crucial in precise modeling of hydropower plant and provides support for the analysis of stability of power system. In this paper, a newly developed optimization algorithm, called gravitational search algorithm (GSA), is introduced and applied in parameter identification of HTGS, and the GSA is improved by combination of the search strategy of particle swarm optimization. Furthermore, a new weighted objective function is proposed in the identification frame. The improved gravitational search algorithm (IGSA), together with genetic algorithm, particle swarm optimization and GSA, is employed in parameter identification experiments and the procedure is validated by comparing experimental and simulated results. Consequently, IGSA is shown to locate more precise parameter values than the compared methods with higher efficiency.

  6. A novel directional asymmetric sampling search algorithm for fast block-matching motion estimation

    Science.gov (United States)

    Li, Yue-e.; Wang, Qiang

    2011-11-01

    This paper proposes a novel directional asymmetric sampling search (DASS) algorithm for video compression. Making full use of the error information (block distortions) of the search patterns, eight different directional search patterns are designed for various situations. A local sampling search strategy is employed for the search of large motion vectors. In order to further speed up the search, an early termination strategy is adopted in the DASS procedure. Compared to conventional fast algorithms, the proposed method achieves the most satisfactory PSNR values for all test sequences.

  7. Noise propagation in iterative reconstruction algorithms with line searches

    International Nuclear Information System (INIS)

    Qi, Jinyi

    2003-01-01

    In this paper we analyze the propagation of noise in iterative image reconstruction algorithms. We derive theoretical expressions for the general form of preconditioned gradient algorithms with line searches. The results are applicable to a wide range of iterative reconstruction problems, such as emission tomography, transmission tomography, and image restoration. A unique contribution of this paper compared with our previous work [1] is that the line search is explicitly modeled and we do not use the approximation that the gradient of the objective function is zero. As a result, the error in the estimate of noise at early iterations is significantly reduced.

  8. Tag-Based Social Image Search: Toward Relevant and Diverse Results

    Science.gov (United States)

    Yang, Kuiyuan; Wang, Meng; Hua, Xian-Sheng; Zhang, Hong-Jiang

    Recent years have witnessed the great success of social media websites. Tag-based image search is an important approach to access the image content of interest on these websites. However, the existing ranking methods for tag-based image search frequently return results that are irrelevant or lack diversity. This chapter presents a diverse relevance ranking scheme which simultaneously takes relevance and diversity into account by exploring the content of images and their associated tags. First, it estimates the relevance scores of images with respect to the query term based on both the visual information of images and the semantic information of associated tags. Then semantic similarities of social images are estimated based on their tags. Based on the relevance scores and the similarities, the ranking list is generated by a greedy ordering algorithm which optimizes Average Diverse Precision (ADP), a novel measure that is extended from the conventional Average Precision (AP). Comprehensive experiments and user studies demonstrate the effectiveness of the approach.
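
    A minimal sketch of greedy diverse re-ranking in the same spirit, written here as an MMR-style trade-off rather than the chapter's exact ADP-optimizing procedure; the relevance scores, tag sets and the trade-off weight lam are made-up inputs.

def greedy_diverse_ranking(relevance, similarity, lam=0.5):
    """Greedily order items: each step picks the item with the best trade-off between
    its relevance and its maximum similarity to the items already ranked.
    relevance: dict item -> score; similarity(a, b) -> value in [0, 1]."""
    remaining = set(relevance)
    ranking = []
    while remaining:
        def score(item):
            redundancy = max((similarity(item, r) for r in ranking), default=0.0)
            return lam * relevance[item] - (1.0 - lam) * redundancy
        best = max(remaining, key=score)
        ranking.append(best)
        remaining.remove(best)
    return ranking

# Example with an invented tag-overlap similarity.
rel = {"img1": 0.9, "img2": 0.85, "img3": 0.6}
tags = {"img1": {"beach", "sunset"}, "img2": {"beach", "sunset"}, "img3": {"city"}}
sim = lambda a, b: len(tags[a] & tags[b]) / len(tags[a] | tags[b])
print(greedy_diverse_ranking(rel, sim))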

  9. An improved exploratory search technique for pure integer linear programming problems

    Science.gov (United States)

    Fogle, F. R.

    1990-01-01

    The development is documented of a heuristic method for the solution of pure integer linear programming problems. The procedure draws its methodology from the ideas of Hooke and Jeeves type 1 and 2 exploratory searches, greedy procedures, and neighborhood searches. It uses an efficient rounding method to obtain its first feasible integer point from the optimal continuous solution obtained via the simplex method. Since this method is based entirely on simple addition or subtraction of one to each variable of a point in n-space and the subsequent comparison of candidate solutions to a given set of constraints, it facilitates significant complexity improvements over existing techniques. It also obtains the same optimal solution found by the branch-and-bound technique in 44 of 45 small to moderate size test problems. Two example problems are worked in detail to show the inner workings of the method. Furthermore, using an established weighted scheme for comparing computational effort involved in an algorithm, a comparison of this algorithm is made to the more established and rigorous branch-and-bound method. A computer implementation of the procedure, in PC compatible Pascal, is also presented and discussed.
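
    The core move described above, starting from a rounded point and probing +/-1 changes of single variables against the constraints, might look roughly like this; the toy objective and constraint matrix are assumptions for illustration, not the documented procedure.

import itertools

def feasible(x, A, b):
    """Check Ax <= b componentwise for integer point x."""
    return all(sum(a_i * x_i for a_i, x_i in zip(row, x)) <= rhs
               for row, rhs in zip(A, b))

def exploratory_search(x0, c, A, b):
    """Start from a rounded feasible point and repeatedly accept a +/-1 change
    of a single coordinate that stays feasible and improves c.x (maximization)."""
    x = list(x0)
    improved = True
    while improved:
        improved = False
        best_val = sum(ci * xi for ci, xi in zip(c, x))
        for i, delta in itertools.product(range(len(x)), (1, -1)):
            cand = list(x)
            cand[i] += delta
            if cand[i] < 0 or not feasible(cand, A, b):
                continue
            val = sum(ci * xi for ci, xi in zip(c, cand))
            if val > best_val:
                x, best_val, improved = cand, val, True
    return x

# Toy maximization: max 3x + 2y  s.t.  x + y <= 4, x <= 3, variables >= 0, from the rounded point (3, 0).
print(exploratory_search([3, 0], c=[3, 2], A=[[1, 1], [1, 0]], b=[4, 3]))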

  10. Partial Transmit Sequence Optimization Using Improved Harmony Search Algorithm for PAPR Reduction in OFDM

    Directory of Open Access Journals (Sweden)

    Mangal Singh

    2017-12-01

    Full Text Available This paper considers the use of the Partial Transmit Sequence (PTS) technique to reduce the Peak-to-Average Power Ratio (PAPR) of an Orthogonal Frequency Division Multiplexing signal in wireless communication systems. Search complexity is very high in the traditional PTS scheme because it involves an extensive random search over all combinations of allowed phase vectors, and it increases exponentially with the number of phase vectors. In this paper, a suboptimal metaheuristic algorithm for phase optimization based on an improved harmony search (IHS) is applied to explore the optimal combination of phase vectors that provides improved performance compared with existing evolutionary algorithms such as the harmony search algorithm and firefly algorithm. IHS enhances the accuracy and convergence rate of the conventional algorithms with very few parameters to adjust. Simulation results show that an improved harmony search-based PTS algorithm can achieve a significant reduction in PAPR using a simple network structure compared with conventional algorithms.
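
    To make the objective concrete, the sketch below computes the PAPR of an OFDM symbol and performs a brute-force PTS search over +/-1 phase factors for a small number of sub-blocks; a harmony-search-style optimizer would replace the exhaustive loop as the number of phase combinations grows. The block count, symbol length and QPSK constellation are invented for illustration.

import itertools
import numpy as np

def papr_db(freq_symbol):
    """Peak-to-average power ratio (dB) of the time-domain OFDM symbol."""
    x = np.fft.ifft(freq_symbol)
    power = np.abs(x) ** 2
    return 10 * np.log10(power.max() / power.mean())

def pts_exhaustive(freq_symbol, n_blocks=4):
    """Split subcarriers into contiguous sub-blocks and try every +/-1 phase vector."""
    blocks = np.array_split(freq_symbol, n_blocks)
    best = (np.inf, None)
    for phases in itertools.product([1, -1], repeat=n_blocks):
        candidate = np.concatenate([p * b for p, b in zip(phases, blocks)])
        best = min(best, (papr_db(candidate), phases))
    return best

rng = np.random.default_rng(0)
qpsk = rng.choice([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j], size=64)   # toy QPSK symbol
print("original PAPR   :", round(papr_db(qpsk), 2), "dB")
papr, phases = pts_exhaustive(qpsk)
print("PTS-reduced PAPR:", round(papr, 2), "dB with phases", phases)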

  11. Application of multiple tabu search algorithm to solve dynamic economic dispatch considering generator constraints

    International Nuclear Information System (INIS)

    Pothiya, Saravuth; Ngamroo, Issarachai; Kongprawechnon, Waree

    2008-01-01

    This paper presents a new optimization technique based on a multiple tabu search algorithm (MTS) to solve the dynamic economic dispatch (ED) problem with generator constraints. In the constrained dynamic ED problem, the load demand and spinning reserve capacity as well as some practical operation constraints of generators, such as ramp rate limits and prohibited operating zones, are taken into consideration. The MTS algorithm introduces additional mechanisms such as initialization, adaptive searches, multiple searches, crossover and a restarting process. To show its efficiency, the MTS algorithm is applied to solve constrained dynamic ED problems of power systems with 6 and 15 units. The results obtained from the MTS algorithm are compared with those achieved by conventional approaches, such as simulated annealing (SA), genetic algorithm (GA), tabu search (TS) algorithm and particle swarm optimization (PSO). The experimental results show that the proposed MTS algorithm is able to obtain higher quality solutions efficiently and with less computational time than the conventional approaches.

  12. Algorithms for Academic Search and Recommendation Systems

    DEFF Research Database (Denmark)

    Amolochitis, Emmanouil

    2014-01-01

    In this work we present novel algorithms for academic search, recommendation and association rules mining. In the first part of the work we introduce a novel hierarchical heuristic scheme for re-ranking academic publications. The scheme is based on the hierarchical combination of a custom implementation of the term frequency heuristic, a time-depreciated citation score and a graph-theoretic computed score that relates the paper's index terms with each other. In the second part we describe the design of a hybrid recommender ensemble (user, item and content based). The newly introduced algorithms are part of a developed Movie Recommendation system, the first such system to be commercially deployed in Greece by a major Triple Play services provider. In the third part of the work we present the design of a quantitative association rule mining algorithm.

  13. A novel line segment detection algorithm based on graph search

    Science.gov (United States)

    Zhao, Hong-dan; Liu, Guo-ying; Song, Xu

    2018-02-01

    To address the problem of extracting line segments from an image, a line segment detection method based on a graph search algorithm is proposed. After obtaining the edge detection result of the image, candidate straight line segments are obtained in four directions. For the candidate straight line segments, their adjacency relationships are depicted by a graph model, based on which a depth-first search algorithm is employed to determine how many adjacent line segments need to be merged. Finally, the least squares method is used to fit the detected straight lines. The comparative experimental results verify that the proposed algorithm achieves better results than the line segment detector (LSD).

  14. Adaptive symbiotic organisms search (SOS) algorithm for structural design optimization

    Directory of Open Access Journals (Sweden)

    Ghanshyam G. Tejani

    2016-07-01

    Full Text Available The symbiotic organisms search (SOS) algorithm is an effective metaheuristic developed in 2014, which mimics the symbiotic relationships among living beings, such as mutualism, commensalism, and parasitism, that allow them to survive in the ecosystem. In this study, three modified versions of the SOS algorithm are proposed by introducing adaptive benefit factors into the basic SOS algorithm to improve its efficiency. The basic SOS algorithm only considers benefit factors, whereas the proposed variants of the SOS algorithm consider effective combinations of adaptive benefit factors and benefit factors to study their competence in laying down a good balance between exploration and exploitation of the search space. The proposed algorithms are tested on engineering structures subjected to dynamic excitation, which may lead to undesirable vibrations. Structure optimization problems become more challenging if the shape and size variables are taken into account along with the frequency. To check the feasibility and effectiveness of the proposed algorithms, six different planar and space trusses are subjected to experimental analysis. The results obtained using the proposed methods are compared with those obtained using other optimization methods well established in the literature. The results reveal that the adaptive SOS algorithm is more reliable and efficient than the basic SOS algorithm and other state-of-the-art algorithms.

  15. An Accelerated Greedy Missing Point Estimation Procedure

    DEFF Research Database (Denmark)

    Zimmermann, Ralf; Willcox, Karen

    2016-01-01

    Based on theoretical insights into symmetric rank-one eigenvalue modifications, we derive a variation of the greedy missing point estimation method that is faster than the standard approach, does not require solution of the modified eigenvalue problem, and yields better results for the cases studied. The proposed approach is illustrated by numerical experiments, where we observe a speed-up of two orders of magnitude. (doi: 10.1137/15M1042899)

  16. An enhanced search algorithm for the charged fuel enrichment in equilibrium cycle analysis of REBUS-3

    International Nuclear Information System (INIS)

    Park, Tongkyu; Yang, Won Sik; Kim, Sang-Ji

    2017-01-01

    Highlights: • An enhanced search algorithm for charged fuel enrichment was developed for equilibrium cycle analysis with REBUS-3. • The new search algorithm is not sensitive to the user-specified initial guesses. • The new algorithm reduces the computational time by a factor of 2–3. - Abstract: This paper presents an enhanced search algorithm for the charged fuel enrichment in equilibrium cycle analysis of REBUS-3. The current enrichment search algorithm of REBUS-3 takes a large number of iterations to yield a converged solution or even terminates without a converged solution when the user-specified initial guesses are far from the solution. To resolve the convergence problem and to reduce the computational time, an enhanced search algorithm was developed. The enhanced algorithm is based on the idea of minimizing the number of enrichment estimates by allowing drastic enrichment changes and by optimizing the current search algorithm of REBUS-3. Three equilibrium cycle problems with recycling, without recycling and of high discharge burnup were defined and a series of sensitivity analyses were performed with a wide range of user-specified initial guesses. Test results showed that the enhanced search algorithm is able to produce a converged solution regardless of the initial guesses. In addition, it was able to reduce the number of flux calculations by a factor of 2.9, 1.8, and 1.7 for equilibrium cycle problems with recycling, without recycling, and of high discharge burnup, respectively, compared to the current search algorithm.

  17. State-set branching

    DEFF Research Database (Denmark)

    Jensen, Rune Møller; Veloso, Manuela M.; Bryant, Randal E.

    2008-01-01

    In this article, we present a framework called state-set branching that combines symbolic search based on reduced ordered Binary Decision Diagrams (BDDs) with best-first search, such as A* and greedy best-first search. The framework relies on an extension of these algorithms from expanding a single state...

  18. Sensitivity analysis of a greedy heuristic for knapsack problems

    NARCIS (Netherlands)

    Ghosh, D; Chakravarti, N; Sierksma, G

    2006-01-01

    In this paper, we carry out a parametric analysis as well as a tolerance limit based sensitivity analysis of a greedy heuristic for two knapsack problems: the 0-1 knapsack problem and the subset sum problem. We carry out the parametric analysis based on all problem parameters. In the tolerance limit...
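
    For reference, the greedy heuristic being analysed (for the 0-1 knapsack case) can be sketched as follows; the instance data are made up, and the parametric/tolerance analysis itself is not reproduced.

def greedy_knapsack(values, weights, capacity):
    """0-1 knapsack greedy heuristic: take items in decreasing value/weight ratio
    while they still fit. Returns the chosen index set and its total value."""
    order = sorted(range(len(values)), key=lambda i: values[i] / weights[i], reverse=True)
    chosen, total_value, load = [], 0, 0
    for i in order:
        if load + weights[i] <= capacity:
            chosen.append(i)
            load += weights[i]
            total_value += values[i]
    return chosen, total_value

# Small example: the heuristic is fast but not always optimal (here it misses the optimum 220).
print(greedy_knapsack(values=[60, 100, 120], weights=[10, 20, 30], capacity=50))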

  19. Adaptive Greedy Dictionary Selection for Web Media Summarization.

    Science.gov (United States)

    Cong, Yang; Liu, Ji; Sun, Gan; You, Quanzeng; Li, Yuncheng; Luo, Jiebo

    2017-01-01

    Initializing an effective dictionary is an indispensable step for sparse representation. In this paper, we focus on the dictionary selection problem with the objective to select a compact subset of basis from the original training data instead of learning a new dictionary matrix as dictionary learning models do. We first design a new dictionary selection model via the ℓ2,0 norm. For model optimization, we propose two methods: one is the standard forward-backward greedy algorithm, which is not suitable for large-scale problems; the other is based on the gradient cues at each forward iteration and speeds up the process dramatically. In comparison with the state-of-the-art dictionary selection models, our model is not only more effective and efficient, but also can control the sparsity. To evaluate the performance of our new model, we select two practical web media summarization problems: 1) we build a new data set consisting of around 500 users, 3000 albums, and 1 million images, and achieve effective assisted albuming based on our model; and 2) by formulating the video summarization problem as a dictionary selection issue, we employ our model to extract keyframes from a video sequence in a more flexible way. Generally, our model outperforms the state-of-the-art methods in both these two tasks.

  20. Self-adaptive global best harmony search algorithm applied to reactor core fuel management optimization

    International Nuclear Information System (INIS)

    Poursalehi, N.; Zolfaghari, A.; Minuchehr, A.; Valavi, K.

    2013-01-01

    Highlights: • SGHS enhanced the convergence rate of LPO using some improvements in comparison to basic HS and GHS. • The SGHS optimization algorithm obtained better fitness on average than the basic HS and GHS algorithms. • The upshot of the SGHS implementation in LPO reveals its flexibility, efficiency and reliability. - Abstract: The aim of this work is to apply the newly developed optimization algorithm, Self-adaptive Global best Harmony Search (SGHS), to PWR fuel management optimization. The SGHS algorithm has some modifications in comparison with the basic Harmony Search (HS) and Global-best Harmony Search (GHS) algorithms, such as dynamic adjustment of parameters. To demonstrate the ability of SGHS to find an optimal configuration of fuel assemblies, the basic Harmony Search (HS) and Global-best Harmony Search (GHS) algorithms have also been developed and investigated. For this purpose, the Self-adaptive Global best Harmony Search Nodal Expansion package (SGHSNE) has been developed, implementing the HS, GHS and SGHS optimization algorithms for the fuel management operation of nuclear reactor cores. This package uses a developed average current nodal expansion code which solves the multigroup diffusion equation by employing the first and second orders of the Nodal Expansion Method (NEM) for two-dimensional hexagonal and rectangular geometries, respectively, with one node per FA. Loading pattern optimization was performed using the SGHSNE package for some test cases to present the capability of the SGHS algorithm to converge to a near optimal loading pattern. Results indicate that the convergence rate and reliability of the SGHS method are quite promising and, practically, SGHS improves the quality of loading pattern optimization results relative to the HS and GHS algorithms. As a result, it has the potential to be used in other nuclear engineering optimization problems.

  1. An improved harmony search algorithm for synchronization of discrete-time chaotic systems

    International Nuclear Information System (INIS)

    Santos Coelho, Leandro dos; Andrade Bernert, Diego Luis de

    2009-01-01

    The harmony search (HS) algorithm is a recently developed meta-heuristic algorithm, and has been very successful in a wide variety of optimization problems. HS was conceptualized using an analogy with the music improvisation process, where music players improvise the pitches of their instruments to obtain better harmony. The HS algorithm does not require initial values and uses a random search instead of a gradient search, so derivative information is unnecessary. Furthermore, the HS algorithm is simple in concept, has few parameters, is easy to implement, imposes fewer mathematical requirements, and does not require initial value settings of the decision variables. In recent years, the investigation of synchronization and control problems for discrete chaotic systems has attracted much attention and has many possible applications. The tuning of a proportional-integral-derivative (PID) controller based on an improved HS (IHS) algorithm for synchronization of two identical discrete chaotic systems subject to different initial conditions is investigated in this paper. Simulation results of using the IHS to determine the PID parameters for synchronization of two Henon chaotic systems are compared with other HS approaches including classical HS and global-best HS. Numerical results reveal that the proposed IHS method is a powerful search and controller design optimization tool for synchronization of chaotic systems.
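
    The HS mechanics described here (harmony memory consideration, pitch adjustment, random selection, no gradients) fit in a few lines; in this sketch the sphere function, bounds and parameter values are placeholders rather than the paper's PID-tuning cost.

import random

def harmony_search(objective, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.05, iters=5000, seed=0):
    """Minimize `objective` over the box `bounds` with the basic harmony search moves."""
    rng = random.Random(seed)
    memory = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:                       # memory consideration
                val = memory[rng.randrange(hms)][d]
                if rng.random() < par:                    # pitch adjustment
                    val += rng.uniform(-bw, bw) * (hi - lo)
            else:                                         # random selection
                val = rng.uniform(lo, hi)
            new.append(min(hi, max(lo, val)))
        worst = max(range(hms), key=lambda i: objective(memory[i]))
        if objective(new) < objective(memory[worst]):     # replace the worst harmony
            memory[worst] = new
    return min(memory, key=objective)

sphere = lambda x: sum(v * v for v in x)                  # stand-in for the PID-tuning cost
print(harmony_search(sphere, bounds=[(-5, 5)] * 3))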

  2. A New Fuzzy Harmony Search Algorithm Using Fuzzy Logic for Dynamic Parameter Adaptation

    Directory of Open Access Journals (Sweden)

    Cinthia Peraza

    2016-10-01

    Full Text Available In this paper, a new fuzzy harmony search algorithm (FHS) for solving optimization problems is presented. FHS is based on a recent method using fuzzy logic for dynamic adaptation of the harmony memory accepting (HMR) and pitch adjustment (PArate) parameters, which improves the convergence rate of the traditional harmony search algorithm (HS). The objective of the method is to dynamically adjust the parameters in the range from 0.7 to 1. The impact of using fixed parameters in the harmony search algorithm is discussed and a strategy for efficiently tuning these parameters using fuzzy logic is presented. The FHS algorithm was successfully applied to different benchmark optimization problems. The results of simulation and comparison studies demonstrate the effectiveness and efficiency of the proposed approach.

  3. The quantum walk search algorithm: Factors affecting efficiency

    OpenAIRE

    Lovett, Neil B.; Everitt, Matthew; Heath, Robert M.; Kendon, Viv

    2011-01-01

    We numerically study the quantum walk search algorithm of Shenvi, Kempe and Whaley [PRA 67, 052307] and the factors which affect its efficiency in finding an individual state from an unsorted set. Previous work has focused purely on the effects of the dimensionality of the dataset to be searched. Here, we consider the effects of interpolating between dimensions, connectivity of the dataset, and the possibility of disorder in the underlying substrate: all these factors affect the efficiency...

  4. Quantum Partial Searching Algorithm of a Database with Several Target Items

    International Nuclear Information System (INIS)

    Pu-Cha, Zhong; Wan-Su, Bao; Yun, Wei

    2009-01-01

    Choi and Korepin [Quantum Information Processing 6 (2007) 243] presented a quantum partial search algorithm for a database with several target items, which can find a target block quickly when each target block contains the same number of target items. In practice, the number of target items in each target block is arbitrary. Aiming at this case, we give a condition that guarantees the partial search algorithm can be carried out and that minimizes the number of queries to the oracle. In addition, by further numerical computation we come to the conclusion that the more uniform the distribution of target items, the smaller the number of queries.

  5. International Timetabling Competition 2011: An Adaptive Large Neighborhood Search algorithm

    DEFF Research Database (Denmark)

    Sørensen, Matias; Kristiansen, Simon; Stidsen, Thomas Riis

    2012-01-01

    An algorithm based on Adaptive Large Neighborhood Search (ALNS) for solving the generalized High School Timetabling problem in XHSTT-format (Post et al (2012a)) is presented. This algorithm was among the finalists of round 2 of the International Timetabling Competition 2011 (ITC2011). For problem...

  6. Unsupervised quantification of abdominal fat from CT images using Greedy Snakes

    Science.gov (United States)

    Agarwal, Chirag; Dallal, Ahmed H.; Arbabshirani, Mohammad R.; Patel, Aalpen; Moore, Gregory

    2017-02-01

    Adipose tissue has been associated with adverse consequences of obesity. Total adipose tissue (TAT) is divided into subcutaneous adipose tissue (SAT) and visceral adipose tissue (VAT). Intra-abdominal fat (VAT), located inside the abdominal cavity, is a major factor for the classic obesity related pathologies. Since direct measurement of visceral and subcutaneous fat is not trivial, substitute metrics like waist circumference (WC) and body mass index (BMI) are used in clinical settings to quantify obesity. Abdominal fat can be assessed effectively using CT or MRI, but manual fat segmentation is rather subjective and time-consuming. Hence, an automatic and accurate quantification tool for abdominal fat is needed. The goal of this study is to extract TAT, VAT and SAT fat from abdominal CT in a fully automated unsupervised fashion using energy minimization techniques. We applied a four step framework consisting of 1) initial body contour estimation, 2) approximation of the body contour, 3) estimation of inner abdominal contour using Greedy Snakes algorithm, and 4) voting, to segment the subcutaneous and visceral fat. We validated our algorithm on 952 clinical abdominal CT images (from 476 patients with a very wide BMI range) collected from various radiology departments of Geisinger Health System. To our knowledge, this is the first study of its kind on such a large and diverse clinical dataset. Our algorithm obtained a 3.4% error for VAT segmentation compared to manual segmentation. These personalized and accurate measurements of fat can complement traditional population health driven obesity metrics such as BMI and WC.
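
    A bare-bones greedy snake iteration in the spirit of step 3, where each contour point moves to the 3x3 neighbour that minimizes a weighted sum of continuity, curvature and image energy; the energy weights and the synthetic ring image below are assumptions, and the full four-step CT pipeline is not reproduced.

import numpy as np

def greedy_snake_step(contour, edge_map, alpha=1.0, beta=1.0, gamma=1.0):
    """One pass of the greedy snake: move each point to the 3x3 neighbour with lowest energy.
    contour: (N, 2) array of (row, col) points; edge_map: image whose high values attract the snake."""
    n = len(contour)
    mean_dist = np.mean(np.linalg.norm(np.diff(contour, axis=0, append=contour[:1]), axis=1))
    new_contour = contour.copy()
    for i in range(n):
        prev_pt, next_pt = new_contour[(i - 1) % n], contour[(i + 1) % n]
        best, best_e = contour[i], np.inf
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                cand = contour[i] + np.array([dr, dc], dtype=float)
                r, c = int(cand[0]), int(cand[1])
                if not (0 <= r < edge_map.shape[0] and 0 <= c < edge_map.shape[1]):
                    continue
                e_cont = (np.linalg.norm(cand - prev_pt) - mean_dist) ** 2   # continuity
                e_curv = np.linalg.norm(prev_pt - 2 * cand + next_pt) ** 2   # curvature
                e_img = -edge_map[r, c]                                      # attraction to strong edges
                energy = alpha * e_cont + beta * e_curv + gamma * e_img
                if energy < best_e:
                    best, best_e = cand, energy
        new_contour[i] = best
    return new_contour

# Toy example: a bright ring in a synthetic image pulls an initial circular contour towards it.
yy, xx = np.mgrid[0:64, 0:64]
edge_map = np.exp(-((np.hypot(yy - 32, xx - 32) - 20) ** 2) / 8.0)
theta = np.linspace(0, 2 * np.pi, 16, endpoint=False)
contour = np.round(np.c_[32 + 28 * np.sin(theta), 32 + 28 * np.cos(theta)])
for _ in range(50):
    contour = greedy_snake_step(contour, edge_map, alpha=1.0, beta=1.0, gamma=20.0)  # weights are guesses
print(np.round(contour, 1))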

  7. An opposition-based harmony search algorithm for engineering optimization problems

    Directory of Open Access Journals (Sweden)

    Abhik Banerjee

    2014-03-01

    Full Text Available Harmony search (HS is a derivative-free real parameter optimization algorithm. It draws inspiration from the musical improvisation process of searching for a perfect state of harmony. The proposed opposition-based HS (OHS of the present work employs opposition-based learning for harmony memory initialization and also for generation jumping. The concept of opposite number is utilized in OHS to improve the convergence rate of the HS algorithm. The potential of the proposed algorithm is assessed by means of an extensive comparative study of the numerical results on sixteen benchmark test functions. Additionally, the effectiveness of the proposed algorithm is tested for reactive power compensation of an autonomous power system. For real-time reactive power compensation of the studied model, Takagi Sugeno fuzzy logic (TSFL is employed. Time-domain simulation reveals that the proposed OHS-TSFL yields on-line, off-nominal model parameters, resulting in real-time incremental change in terminal voltage response profile.

  8. A note on the greedy β-transformation with arbitrary digits

    NARCIS (Netherlands)

    Dajani, K.; Kalle, C.C.C.J.

    2011-01-01

    We consider a generalization of the greedy and lazy β-expansions with digit set A = {a0, ...

  9. Improved Multiobjective Harmony Search Algorithm with Application to Placement and Sizing of Distributed Generation

    Directory of Open Access Journals (Sweden)

    Wanxing Sheng

    2014-01-01

    Full Text Available To solve comprehensive multiobjective optimization problems, this study proposes an improved metaheuristic searching algorithm combining harmony search and the fast nondominated sorting approach. This is a novel intelligent optimization algorithm, the multiobjective harmony search (MOHS). The detailed description and formulation of the algorithm are discussed. Taking the optimal placement and sizing of distributed generation (DG) in a distribution power system as an example, the solving procedure of the proposed method is given. Simulation results on the modified IEEE 33-bus test system and a comparison with the NSGA-II algorithm have proved that the proposed MOHS can obtain promising results for engineering application.

  10. On the use of harmony search algorithm in the training of wavelet neural networks

    Science.gov (United States)

    Lai, Kee Huong; Zainuddin, Zarita; Ong, Pauline

    2015-10-01

    Wavelet neural networks (WNNs) are a class of feedforward neural networks that have been used in a wide range of industrial and engineering applications to model the complex relationships between the given inputs and outputs. The training of WNNs involves the configuration of the weight values between neurons. The backpropagation training algorithm, which is a gradient-descent method, can be used for this training purpose. Nonetheless, the solutions found by this algorithm often get trapped at local minima. In this paper, a harmony search-based algorithm is proposed for the training of WNNs. The training of WNNs can thus be formulated as a continuous optimization problem, where the objective is to maximize the overall classification accuracy. Each candidate solution proposed by the harmony search algorithm represents a specific WNN architecture. In order to speed up the training process, the solution space is divided into disjoint partitions during the random initialization step of the harmony search algorithm. The proposed training algorithm is tested on three benchmark problems from the UCI machine learning repository, as well as one real life application, namely, the classification of electroencephalography signals in the task of epileptic seizure detection. The results obtained show that the proposed algorithm outperforms the traditional harmony search algorithm in terms of overall classification accuracy.

  11. Improved Harmony Search Algorithm with Chaos for Absolute Value Equation

    Directory of Open Access Journals (Sweden)

    Shouheng Tuo

    2013-11-01

    Full Text Available In this paper, an improved harmony search with chaos (HSCH) is presented for solving the NP-hard absolute value equation (AVE) Ax - |x| = b, where A is an arbitrary square matrix whose singular values exceed one. The simulation results in solving some given AVE problems demonstrate that the HSCH algorithm is valid and outperforms the classical HS algorithm (CHS) and the HS algorithm with differential mutation operator (HSDE).
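
    For context, the AVE residual that such a metaheuristic would minimize is straightforward to state; the sketch below builds a random instance with a planted solution and evaluates the residual norm, leaving the harmony-search machinery itself out. The diagonal shift used to make A well conditioned is a rough assumption and does not verify the singular-value condition.

import numpy as np

def ave_residual(x, A, b):
    """Norm of Ax - |x| - b; a candidate x solves the AVE when this is (numerically) zero."""
    return np.linalg.norm(A @ x - np.abs(x) - b)

rng = np.random.default_rng(0)
n = 5
A = rng.normal(size=(n, n)) + 3 * np.eye(n)   # rough shift towards singular values > 1 (assumed, not verified)
x_true = rng.normal(size=n)
b = A @ x_true - np.abs(x_true)               # construct a consistent right-hand side
print(ave_residual(x_true, A, b))             # ~0: the planted solution
print(ave_residual(np.zeros(n), A, b))        # generally nonzero for other candidates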

  12. Hybrid fuzzy charged system search algorithm based state estimation in distribution networks

    Directory of Open Access Journals (Sweden)

    Sachidananda Prasad

    2017-06-01

    Full Text Available This paper proposes a new hybrid charged system search (CSS) algorithm based state estimation for radial distribution networks in a fuzzy framework. The objective of the optimization problem is to minimize the weighted square of the difference between the measured and the estimated quantities. The proposed method of state estimation considers bus voltage magnitude and phase angle as state variables, along with some equality and inequality constraints for state estimation in distribution networks. A rule based fuzzy inference system has been designed to control the parameters of the CSS algorithm to achieve a better balance between the exploration and exploitation capabilities of the algorithm. The efficiency of the proposed fuzzy adaptive charged system search (FACSS) algorithm has been tested on the standard IEEE 33-bus system and an Indian 85-bus practical radial distribution system. The obtained results have been compared with those of the conventional CSS algorithm, the weighted least squares (WLS) algorithm and particle swarm optimization (PSO) to check the feasibility of the algorithm.

  13. A Two-Phase Heuristic Algorithm for the Common Frequency Routing Problem with Vehicle Type Choice in the Milk Run

    Directory of Open Access Journals (Sweden)

    Yu Lin

    2015-01-01

    Full Text Available High frequency and small lot size are characteristics of milk runs and are often used to implement the just-in-time (JIT) strategy in logistical systems. The common frequency problem, which simultaneously involves planning of the route and frequency, has been extensively researched in milk run systems. In addition, vehicle type choice in the milk run system also has a significant influence on the operating cost. Therefore, in this paper, we simultaneously consider vehicle routing planning, frequency planning, and vehicle type choice in order to optimize the sum of the costs of transportation, inventory, and dispatch. To this end, we develop a mathematical model to describe the common frequency problem with vehicle type choice. Since the problem is NP hard, we develop a two-phase heuristic algorithm to solve the model. More specifically, an initial satisfactory solution is first generated through a greedy heuristic algorithm that maximizes the ratio of the superior arc frequency to the inferior arc frequency. Following this, a tabu search (TS) with limited search scope is used to improve the initial satisfactory solution. Numerical examples with different sizes establish the efficacy of our model and our proposed algorithm.

  14. Experimental implementation of a quantum random-walk search algorithm using strongly dipolar coupled spins

    International Nuclear Information System (INIS)

    Lu Dawei; Peng Xinhua; Du Jiangfeng; Zhu Jing; Zou Ping; Yu Yihua; Zhang Shanmin; Chen Qun

    2010-01-01

    An important quantum search algorithm based on the quantum random walk performs an oracle search on a database of N items with O(√N) calls, yielding a speedup similar to the Grover quantum search algorithm. The algorithm was implemented on a quantum information processor of three-qubit liquid-crystal nuclear magnetic resonance (NMR) in the case of finding 1 out of 4, and the diagonal-element tomography of all the final density matrices was completed with comprehensible one-dimensional NMR spectra. The experimental results agree well with the theoretical predictions.

  15. Solving k-Barrier Coverage Problem Using Modified Gravitational Search Algorithm

    Directory of Open Access Journals (Sweden)

    Yanhua Zhang

    2017-01-01

    Full Text Available The coverage problem is a critical issue in wireless sensor networks for security applications. The k-barrier coverage is an effective measure to ensure robustness. In this paper, we formulate the k-barrier coverage problem as a constrained optimization problem and introduce the energy constraint of sensor nodes to prolong the lifetime of the k-barrier coverage. A novel hybrid particle swarm optimization and gravitational search algorithm (PGSA) is proposed to solve this problem. The proposed PGSA adopts a k-barrier coverage generation strategy based on probability, integrates the exploitation ability of particle swarm optimization to update the velocity and enhance the global search capability, and introduces a boundary mutation strategy of an agent to increase the population diversity and search accuracy. Extensive simulations are conducted to demonstrate the effectiveness of our proposed algorithm.

  16. Computing gap free Pareto front approximations with stochastic search algorithms.

    Science.gov (United States)

    Schütze, Oliver; Laumanns, Marco; Tantar, Emilia; Coello, Carlos A Coello; Talbi, El-Ghazali

    2010-01-01

    Recently, a convergence proof of stochastic search algorithms toward finite size Pareto set approximations of continuous multi-objective optimization problems has been given. The focus was on obtaining a finite approximation that captures the entire solution set in some suitable sense, which was defined by the concept of epsilon-dominance. Though bounds on the quality of the limit approximation (which are entirely determined by the archiving strategy and the value of epsilon) have been obtained, the strategies do not guarantee to obtain a gap free approximation of the Pareto front. That is, such approximations A can reveal gaps in the sense that points f in the Pareto front can exist such that the distance of f to any image point F(a), a ∈ A, is "large." Since such gap free approximations are desirable in certain applications, and the related archiving strategies can be advantageous when memetic strategies are included in the search process, we are aiming in this work for such methods. We present two novel strategies that accomplish this task in the probabilistic sense and under mild assumptions on the stochastic search algorithm. In addition to the convergence proofs, we give some numerical results to visualize the behavior of the different archiving strategies. Finally, we demonstrate the potential for a possible hybridization of a given stochastic search algorithm with a particular local search strategy, multi-objective continuation methods, by showing that the concept of epsilon-dominance can be integrated into this approach in a suitable way.
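
    A compact sketch of one common additive epsilon-dominance archive update for minimization (without the gap-free refinements this paper adds); the epsilon value and the stream of objective vectors are illustrative.

def eps_dominates(p, q, eps):
    """p epsilon-dominates q (minimization) if p_i - eps <= q_i in every objective
    and the shifted p is strictly better in at least one."""
    shifted = [pi - eps for pi in p]
    return all(s <= qi for s, qi in zip(shifted, q)) and any(s < qi for s, qi in zip(shifted, q))

def update_archive(archive, candidate, eps):
    """Add `candidate` unless some archived point epsilon-dominates it;
    drop archived points that the candidate epsilon-dominates."""
    if any(eps_dominates(a, candidate, eps) for a in archive):
        return archive
    return [a for a in archive if not eps_dominates(candidate, a, eps)] + [candidate]

archive = []
for point in [(2.0, 5.0), (1.95, 5.0), (4.0, 1.0), (0.5, 6.0)]:   # stream of objective vectors
    archive = update_archive(archive, point, eps=0.1)
print(archive)   # (1.95, 5.0) is absorbed by (2.0, 5.0) under the epsilon tolerance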

  17. An Adaptive Large Neighborhood Search Algorithm for the Resource-constrained Project Scheduling Problem

    DEFF Research Database (Denmark)

    Muller, Laurent Flindt

    2009-01-01

    We present an application of an Adaptive Large Neighborhood Search (ALNS) algorithm to the Resource-constrained Project Scheduling Problem (RCPSP). The ALNS framework was first proposed by Pisinger and Røpke [19] and can be described as a large neighborhood search algorithm with an adaptive layer......, where a set of destroy/repair neighborhoods compete to modify the current solution in each iteration of the algorithm. Experiments are performed on the wellknown J30, J60 and J120 benchmark instances, which show that the proposed algorithm is competitive and confirms the strength of the ALNS framework...

  18. A solution to energy and environmental problems of electric power system using hybrid harmony search-random search optimization algorithm

    Directory of Open Access Journals (Sweden)

    Vikram Kumar Kamboj

    2016-04-01

    Full Text Available In recent years, global warming and carbon dioxide (CO2) emission reduction have become important issues in India, as CO2 emission levels continue to rise with the increasing volume of national energy consumption; under the pressure of global warming, it is crucial for the Indian government to impose effective policies to promote CO2 emission reduction. The challenge of supplying the nation with high quality and reliable electrical energy at a reasonable cost has moved government policy towards a deregulated and restructured environment. This research paper aims to present an effective solution for the energy and environmental problems of electric power using an efficient and powerful hybrid optimization algorithm: the hybrid harmony search-random search algorithm. The proposed algorithm is tested on the standard IEEE 14-bus, 30-bus and 56-bus systems. The effectiveness of the proposed hybrid algorithm is compared with other well-known evolutionary, heuristic and meta-heuristic search algorithms. For multi-objective unit commitment, it is found that, as there is a conflicting relationship between cost and emission, if the performance in the cost criterion is improved, performance in emission is seen to deteriorate.

  19. Cost reduction improvement for power generation system integrating WECS using harmony search algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Ngonkham, S. [Khonkaen Univ., Amphur Muang (Thailand). Dept. of Electrical Engineering; Buasri, P. [Khonkaen Univ., Amphur Muang (Thailand). Embed System Research Group

    2009-03-11

    A harmony search (HS) algorithm was used to optimize economic dispatch (ED) in a wind energy conversion system (WECS) for power system integration. The HS algorithm was based on a stochastic random search method. System costs for the WECS system were estimated in relation to average wind speeds. The HS algorithm was implemented to optimize the ED with a simple programming procedure. The study showed that the initial parameters must be carefully selected to ensure the accuracy of the HS algorithm. The algorithm demonstrated that total costs of the WECS system were higher than costs associated with energy efficiency procedures that reduced the same amount of greenhouse gas (GHG) emissions. 7 refs., 10 tabs., 16 figs.

  20. Improved gravitational search algorithm for parameter identification of water turbine regulation system

    International Nuclear Information System (INIS)

    Chen, Zhihuan; Yuan, Xiaohui; Tian, Hao; Ji, Bin

    2014-01-01

    Highlights: • We propose an improved gravitational search algorithm (IGSA). • IGSA is applied to parameter identification of the water turbine regulation system (WTRS). • WTRS is modeled by considering the impact of turbine speed on torque and water flow. • A weighted objective function strategy is applied to parameter identification of WTRS. - Abstract: Parameter identification of the water turbine regulation system (WTRS) is crucial in precise modeling of the hydropower generating unit (HGU) and provides support for the adaptive control and stability analysis of the power system. In this paper, an improved gravitational search algorithm (IGSA) is proposed and applied to solve the identification problem for the WTRS system under load and no-load running conditions. This new algorithm, which is based on the standard gravitational search algorithm (GSA), accelerates convergence by combining the search strategy of particle swarm optimization with the elastic-ball method. Chaotic mutation, which is devised to step out of local optima with a certain probability, is also added to the algorithm to avoid premature convergence. Furthermore, a new kind of model associated with engineering practice is built and analyzed in the simulation tests. An illustrative example of parameter identification of the WTRS is used to verify the feasibility and effectiveness of the proposed IGSA, as compared with the standard GSA and particle swarm optimization in terms of parameter identification accuracy and convergence speed. The simulation results show that IGSA performs best for all identification indicators.

  1. The Research and Test of Fast Radio Burst Real-time Search Algorithm Based on GPU Acceleration

    Science.gov (United States)

    Wang, J.; Chen, M. Z.; Pei, X.; Wang, Z. Q.

    2017-03-01

    In order to satisfy the research needs of the Nanshan 25 m radio telescope of Xinjiang Astronomical Observatory (XAO) and study the key technology of the planned QiTai radio Telescope (QTT), the receiver group of XAO studied a GPU (Graphics Processing Unit) based real-time FRB searching algorithm, which was developed from the original CPU (Central Processing Unit) based FRB searching algorithm, and built an FRB real-time searching system. The comparison of the GPU system and the CPU system shows that, while ensuring the accuracy of the search, the speed of the GPU-accelerated algorithm is improved by 35-45 times compared with the CPU algorithm.

  2. An Elite Decision Making Harmony Search Algorithm for Optimization Problem

    Directory of Open Access Journals (Sweden)

    Lipu Zhang

    2012-01-01

    Full Text Available This paper describes a new variant of the harmony search algorithm which is inspired by the well-known concept of "elite decision making." In the new algorithm, the good information captured in the current global best and second best solutions can be well utilized to generate new solutions, following some probability rule. The generated new solution vector replaces the worst solution in the solution set only if its fitness is better than that of the worst solution. The generating and updating steps are repeated until the near-optimal solution vector is obtained. Extensive computational comparisons are carried out by employing various standard benchmark optimization problems, including continuous design variable and integer variable minimization problems from the literature. The computational results show that the proposed new algorithm is competitive in finding solutions with the state-of-the-art harmony search variants.

  3. Nature-inspired novel Cuckoo Search Algorithm for genome ...

    Indian Academy of Sciences (India)

    ... compared their results with other methods such as the genetic algorithm. ... It is a population-based search procedure used as an optimization tool ... the problem formulation, fitness evaluation, flowchart and implementation of the algorithm ...

  4. Algorithm of search and track of static and moving large-scale objects

    Directory of Open Access Journals (Sweden)

    Kalyaev Anatoly

    2017-01-01

    Full Text Available We suggest an algorithm for processing a sequence of images for the search and tracking of static and moving large-scale objects. A possible software implementation of the algorithm, based on multithreaded CUDA processing, is suggested. An experimental analysis of the suggested algorithm implementation is performed.

  5. An Educational System for Learning Search Algorithms and Automatically Assessing Student Performance

    Science.gov (United States)

    Grivokostopoulou, Foteini; Perikos, Isidoros; Hatzilygeroudis, Ioannis

    2017-01-01

    In this paper, first we present an educational system that assists students in learning and tutors in teaching search algorithms, an artificial intelligence topic. Learning is achieved through a wide range of learning activities. Algorithm visualizations demonstrate the operational functionality of algorithms according to the principles of active learning…

  6. An improved algorithm for searching all minimal cuts in modified networks

    International Nuclear Information System (INIS)

    Yeh, W.-C.

    2008-01-01

    A modified network is an updated network after inserting a branch string (a special path) between two nodes in the original network. Modifications are common for network expansion or reinforcement evaluation and planning. The problem of searching all minimal cuts (MCs) in a modified network is discussed and solved in this study. The existing best-known methods for solving this problem either needed extensive comparison and verification or failed to solve some special but important cases. Therefore, a more efficient, intuitive and generalized method for searching all MCs without an extensive research procedure is proposed. In this study, we first develop an intuitive algorithm based upon the reformation of all MCs in the original network to search for all MCs in a modified network. Next, the correctness of the proposed algorithm will be analyzed and proven. The computational complexity of the proposed algorithm is analyzed and compared with the existing best-known methods. Finally, two examples illustrate how all MCs are generated in a modified network using the information of all of the MCs in the corresponding original network

  7. Algorithms for searching Fast radio bursts and pulsars in tight binary systems.

    Science.gov (United States)

    Zackay, Barak

    2017-01-01

    Fast radio bursts (FRBs) are exciting, recently discovered astrophysical transients whose origins are unknown. Currently, these bursts are believed to be coming from cosmological distances, allowing us to probe the electron content on cosmological length scales. Even though their precise localization is crucial for the determination of their origin, radio interferometers have not been extensively employed in searching for them due to computational limitations. I will briefly present the Fast Dispersion Measure Transform (FDMT) algorithm, which allows the operation count in blind incoherent dedispersion to be reduced by 2-3 orders of magnitude. In addition, FDMT enables us to probe the unexplored domain of sub-microsecond astrophysical pulses. Pulsars in tight binary systems are among the most important astrophysical objects, as they provide our best tests of general relativity in the strong field regime. I will provide a preview of a novel algorithm that enables the detection of pulsars in short binary systems using observation times longer than an orbital period. Current pulsar search programs limit their searches to integration times shorter than a few percent of the orbital period. Until now, searching for pulsars in binary systems using observation times longer than an orbital period was considered impossible, as one has to blindly enumerate all options for the Keplerian parameters, the pulsar rotation period, and the unknown DM. Using the current state-of-the-art pulsar search techniques and all computers on Earth, such an enumeration would take longer than a Hubble time. I will demonstrate that, using the new algorithm, it is possible to conduct such an enumeration on a laptop using real data of the double pulsar PSR J0737-3039. Among the other applications of this algorithm are: 1) searching for all pulsars at all sky positions in gamma ray observations of the Fermi LAT satellite; 2) blind searching for continuous gravitational wave sources emitted by pulsars with...
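
    For orientation, below is the naive incoherent dedispersion sum that FDMT accelerates: shift each frequency channel by the cold-plasma dispersion delay for a trial DM and sum over frequency. The array shapes, sampling time and frequency band are made-up values.

import numpy as np

K_DM = 4.148808e3  # dispersion constant, s MHz^2 pc^-1 cm^3

def dedisperse(dynamic_spectrum, freqs_mhz, dt_s, dm):
    """Brute-force incoherent dedispersion: roll each channel by its dispersion delay and sum.
    dynamic_spectrum: (n_chan, n_time); freqs_mhz: centre frequency of each channel."""
    f_ref = freqs_mhz.max()
    delays = K_DM * dm * (freqs_mhz ** -2 - f_ref ** -2)    # seconds, relative to the highest channel
    shifts = np.round(delays / dt_s).astype(int)
    summed = np.zeros(dynamic_spectrum.shape[1])
    for chan, shift in zip(dynamic_spectrum, shifts):
        summed += np.roll(chan, -shift)                     # align the swept pulse
    return summed

# Synthetic dispersed pulse in noise.
n_chan, n_time, dt = 64, 512, 1e-3
freqs = np.linspace(1200.0, 1500.0, n_chan)
true_dm = 300.0
data = np.random.default_rng(1).normal(0, 1, (n_chan, n_time))
arrival = np.round(K_DM * true_dm * (freqs ** -2 - freqs.max() ** -2) / dt).astype(int) + 100
data[np.arange(n_chan), arrival] += 10.0                    # inject the pulse
peaks = {dm: dedisperse(data, freqs, dt, dm).max() for dm in (0.0, 150.0, 300.0, 450.0)}
print(peaks)                                                # the true DM gives the strongest peak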

  8. Comparison of multiobjective harmony search, cuckoo search and bat-inspired algorithms for renewable distributed generation placement

    Directory of Open Access Journals (Sweden)

    John E. Candelo-Becerra

    2015-07-01

    Full Text Available Electric power losses have a significant impact on the total costs of distribution networks. The use of renewable energy sources is a major alternative to improve power losses and costs, although other important issues are also enhanced, such as voltage magnitudes and network congestion. However, determining the best location and size of renewable energy generators can sometimes be a challenging task due to the large number of possible combinations in the search space. Furthermore, multiobjective functions increase the complexity of the problem, and metaheuristics are preferred to find solutions in a relatively short time. This paper evaluates the performance of the cuckoo search (CS), harmony search (HS), and bat-inspired (BA) algorithms for the location and sizing of renewable distributed generation (RDG) in radial distribution networks, using a multiobjective function defined as minimizing the energy losses and the RDG costs. The metaheuristic algorithms were programmed in Matlab and tested using the 33-node radial distribution network. The three algorithms obtained similar results for the two objectives evaluated, finding points close to the best solutions in the Pareto front. Comparisons showed that the CS obtained the minimum results for most points evaluated, but the BA and the HS were close to the best solution.

  9. Training Feedforward Neural Networks Using Symbiotic Organisms Search Algorithm

    Directory of Open Access Journals (Sweden)

    Haizhou Wu

    2016-01-01

    Full Text Available Symbiotic organisms search (SOS) is a new, robust, and powerful metaheuristic algorithm, which simulates the symbiotic interaction strategies adopted by organisms to survive and propagate in the ecosystem. In the supervised learning area, it is a challenging task to present a satisfactory and efficient training algorithm for feedforward neural networks (FNNs). In this paper, SOS is employed as a new method for training FNNs. To investigate the performance of the aforementioned method, eight different datasets selected from the UCI machine learning repository are used for experiments, and the results are compared among seven metaheuristic algorithms. The results show that SOS performs better than the other algorithms for training FNNs in terms of convergence speed. It is also shown that an FNN trained by SOS has better accuracy than most of the compared algorithms.

  10. A Dynamic Neighborhood Learning-Based Gravitational Search Algorithm.

    Science.gov (United States)

    Zhang, Aizhu; Sun, Genyun; Ren, Jinchang; Li, Xiaodong; Wang, Zhenjie; Jia, Xiuping

    2018-01-01

    Balancing exploration and exploitation according to evolutionary states is crucial to meta-heuristic search (M-HS) algorithms. Owing to its simplicity in theory and effectiveness in global optimization, the gravitational search algorithm (GSA) has attracted increasing attention in recent years. However, the tradeoff between exploration and exploitation in GSA is achieved mainly by adjusting the size of an archive that stores the superior agents after fitness sorting in each iteration. Since the global property of this archive remains unchanged throughout the evolutionary process, GSA emphasizes exploitation over exploration and suffers from rapid loss of diversity and premature convergence. To address these problems, in this paper, we propose a dynamic neighborhood learning (DNL) strategy to replace the archive model and thereby present a DNL-based GSA (DNLGSA). The method incorporates local and global neighborhood topologies to enhance exploration and obtain an adaptive balance between exploration and exploitation. The local neighborhoods are dynamically formed based on evolutionary states. To delineate the evolutionary states, two convergence criteria, named limit value and population diversity, are introduced. Moreover, a mutation operator is designed for escaping from local optima on the basis of evolutionary states. The proposed algorithm was evaluated on 27 benchmark problems with different characteristics and various difficulties. The results reveal that DNLGSA exhibits competitive performance when compared with a variety of state-of-the-art M-HS algorithms. Moreover, the incorporation of the local neighborhood topology reduces the number of calculations of gravitational force and thus alleviates the high computational cost of GSA.

  11. State-of-the-Art Review on Relevance of Genetic Algorithm to Internet Web Search

    Directory of Open Access Journals (Sweden)

    Kehinde Agbele

    2012-01-01

    Full Text Available People use search engines to find information they desire with the aim that their information needs will be met. Information retrieval (IR) is a field that is concerned primarily with searching and retrieving information in documents, as well as searching search engines, online databases, and the Internet. Genetic algorithms (GAs) are robust, efficient, and optimized methods for a wide range of search problems, motivated by Darwin's principles of natural selection and survival of the fittest. This paper describes the components of information retrieval systems (IRS). It then looks at how GAs can be applied in the field of IR, and specifically at the relevance of genetic algorithms to Internet web search. Finally, from the proposals surveyed, it turns out that GAs are applied to diverse problem fields of Internet web search.

  12. Grover's quantum search algorithm for an arbitrary initial mixed state

    International Nuclear Information System (INIS)

    Biham, Eli; Kenigsberg, Dan

    2002-01-01

    The Grover quantum search algorithm is generalized to deal with an arbitrary mixed initial state. The probability to measure a marked state as a function of time is calculated and found to depend strongly on the specific initial state. The form of the function, though, remains as it is in the case of an initial pure state. We study the role of the von Neumann entropy of the initial state and show that the entropy cannot be a measure of the usefulness of the algorithm. We give a few examples and show that for some extremely mixed initial states (carrying high entropy), the generalized Grover algorithm is considerably faster than any classical algorithm.

  13. Connectivity algorithm with depth first search (DFS) on simple graphs

    Science.gov (United States)

    Riansanti, O.; Ihsan, M.; Suhaimi, D.

    2018-01-01

    This paper discusses an algorithm to detect the connectivity of a simple graph using Depth First Search (DFS). The DFS implementation in this paper differs from other research in how it counts the number of visited vertices. The algorithm obtains s from the number of vertices, visiting the source vertex followed by its adjacent vertices until the last vertex adjacent to the previously visited vertices. Any simple graph is connected if s equals 0 and disconnected if s is greater than 0. The complexity of the algorithm is O(n²).
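
    As a minimal illustration of the idea described above (counting the vertices never reached by a DFS to decide connectivity), the following Python sketch runs a DFS from an arbitrary source and reports s, the number of unvisited vertices. The function name and the adjacency-list representation are illustrative choices, not taken from the paper.

```python
def connectivity_gap(adj):
    """Return s = number of vertices not reachable from an arbitrary source via DFS.

    adj: dict mapping each vertex to a list of adjacent vertices (an undirected
    simple graph). The graph is connected iff s == 0.
    """
    if not adj:
        return 0
    source = next(iter(adj))
    visited = set()
    stack = [source]
    while stack:
        v = stack.pop()
        if v in visited:
            continue
        visited.add(v)
        stack.extend(u for u in adj[v] if u not in visited)
    return len(adj) - len(visited)

# Example: a path 0-1-2 plus an isolated vertex 3 -> s = 1, so the graph is disconnected
print(connectivity_gap({0: [1], 1: [0, 2], 2: [1], 3: []}))
```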

  14. Success rate and entanglement measure in Grover's search algorithm for certain kinds of four qubit states

    International Nuclear Information System (INIS)

    Chamoli, Arti; Bhandari, C.M.

    2005-01-01

    Entanglement plays a crucial role in the efficacy of quantum algorithms. Whereas the role of entanglement is quite obvious and conspicuous in teleportation and superdense coding, it is not so distinct in other situations, such as in search algorithms. The starting state in Grover's search algorithm is supposedly a uniform superposition state (not entangled), with a success probability around unity. An operational entanglement measure has been defined and investigated analytically for two-qubit states [O. Biham, M.A. Nielsen, T. Osborne, Phys. Rev. A 65 (2002) 062312; Y. Shimoni, D. Shapira, O. Biham, Phys. Rev. A 69 (2004) 062303], seeking a relationship with the success rate of the search algorithm. This Letter examines the success rate of the search algorithm for various four-qubit states. Analytic expressions have been worked out which provide the success rate and entanglement measure for certain kinds of four-qubit input states.

  15. A Heuristic Task Scheduling Algorithm for Heterogeneous Virtual Clusters

    Directory of Open Access Journals (Sweden)

    Weiwei Lin

    2016-01-01

    Full Text Available Cloud computing provides on-demand computing and storage services with high performance and high scalability. However, the rising energy consumption of cloud data centers has become a prominent problem. In this paper, we first introduce an energy-aware framework for task scheduling in virtual clusters. The framework consists of a task resource requirements prediction module, an energy estimation module, and a scheduler with a task buffer. Secondly, based on this framework, we propose a virtual machine power efficiency-aware greedy scheduling algorithm (VPEGS). As a heuristic algorithm, VPEGS estimates task energy by considering factors including task resource demands, VM power efficiency, and server workload before scheduling tasks in a greedy manner. We simulated a heterogeneous VM cluster and conducted experiments to evaluate the effectiveness of VPEGS. Simulation results show that VPEGS effectively reduced total energy consumption by more than 20% without producing large scheduling overheads. With a similar heuristic ideology, it outperformed Min-Min and RASA with respect to energy saving by about 29% and 28%, respectively.
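
    The abstract describes the greedy rule only at a high level, so the following Python sketch is a hedged illustration of one plausible reading: for each task, estimate its energy on every candidate VM from the task's resource demand, the VM's power efficiency, and the VM's current load, then assign it to the VM with the lowest estimate. The energy model and field names are assumptions for illustration, not the published VPEGS formulation.

```python
from dataclasses import dataclass

@dataclass
class VM:
    name: str
    power_efficiency: float   # assumed metric: useful work per watt (higher is better)
    load: float = 0.0         # total demand already assigned to this VM

def estimate_energy(demand, vm):
    # Assumed toy model: energy grows with demand and current load,
    # and shrinks with the VM's power efficiency.
    return demand * (1.0 + vm.load) / vm.power_efficiency

def greedy_schedule(task_demands, vms):
    """Assign each task (a resource demand) to the VM with the lowest estimated energy."""
    plan = []
    for demand in task_demands:
        best = min(vms, key=lambda vm: estimate_energy(demand, vm))
        plan.append((demand, best.name))
        best.load += demand
    return plan

vms = [VM("vm-a", power_efficiency=2.0), VM("vm-b", power_efficiency=1.2)]
print(greedy_schedule([4.0, 1.0, 2.5], vms))
```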

  16. GSNFS: Gene subnetwork biomarker identification of lung cancer expression data.

    Science.gov (United States)

    Doungpan, Narumol; Engchuan, Worrawat; Chan, Jonathan H; Meechai, Asawin

    2016-12-05

    Gene expression has been used to identify disease gene biomarkers, but there are ongoing challenges. Single-gene or gene-set biomarkers are inadequate to provide sufficient understanding of complex disease mechanisms and the relationships among those genes. Network-based methods have thus been considered for inferring the interactions within a group of genes to further study the disease mechanism. Recently, the Gene-Network-based Feature Set (GNFS), which is capable of handling case-control and multiclass expression data for gene biomarker identification, has been proposed, partly taking network topology into account. However, its performance relies on a greedy search for building subnetworks and thus requires further improvement. In this work, we establish a new approach named Gene Sub-Network-based Feature Selection (GSNFS) by implementing the GNFS framework with two proposed searching and scoring algorithms, namely gene-set-based (GS) search and parent-node-based (PN) search, to identify subnetworks. An additional dataset is used to validate the results. The two proposed searching algorithms of the GSNFS method for subnetwork expansion are concerned with the degree of connectivity and the scoring scheme for building subnetworks and their topology. In each iteration of expansion, the neighbour genes of the current subnetwork whose expression data improve the overall subnetwork score are recruited. While the GS search calculates the subnetwork score using the activity score of the current subnetwork and the gene expression values of its neighbours, the PN search uses the expression value of the corresponding parent of each neighbour gene. Four lung cancer expression datasets were used for subnetwork identification. In addition, the use of pathway data and protein-protein interactions as network data, in order to consider the interactions among significant genes, is discussed. Classification was performed to compare the performance of the identified gene subnetworks with three

  17. Moon Search Algorithms for NASA's Dawn Mission to Asteroid Vesta

    Science.gov (United States)

    Memarsadeghi, Nargess; Mcfadden, Lucy A.; Skillman, David R.; McLean, Brian; Mutchler, Max; Carsenty, Uri; Palmer, Eric E.

    2012-01-01

    A moon or natural satellite is a celestial body that orbits a planetary body such as a planet, dwarf planet, or asteroid. Scientists seek to understand the origin and evolution of our solar system by studying the moons of these bodies. Additionally, searches for satellites of planetary bodies can be important to protect the safety of a spacecraft as it approaches or orbits a planetary body. If a satellite of a celestial body is found, the mass of that body can also be calculated once its orbit is determined. Ensuring the Dawn spacecraft's safety on its mission to the asteroid Vesta primarily motivated the work of Dawn's Satellite Working Group (SWG) in the summer of 2011. Dawn mission scientists and engineers utilized various computational tools and techniques for Vesta's satellite search. The objectives of this paper are to 1) introduce the natural satellite search problem, 2) present the computational challenges, approaches, and tools used when addressing this problem, and 3) describe applications of various image processing and computational algorithms for performing satellite searches to the electronic imaging and computer science community. Furthermore, we hope that this communication will enable Dawn mission scientists to improve their satellite search algorithms and tools and be better prepared for performing the same investigation in 2015, when the spacecraft is scheduled to approach and orbit the dwarf planet Ceres.

  18. Ant colony search algorithm for optimal reactive power optimization

    Directory of Open Access Journals (Sweden)

    Lenin K.

    2006-01-01

    Full Text Available The paper presents an Ant Colony Search Algorithm (ACSA) for optimal reactive power optimization and voltage control of power systems. ACSA is a new co-operative agents' approach, inspired by the observation of the behavior of real ant colonies with respect to ant trail formation and foraging methods. Hence, in the ACSA a set of co-operative agents called "ants" co-operates to find a good solution for the reactive power optimization problem. The ACSA applied to optimal reactive power optimization is evaluated on the standard IEEE 30-, 57-, and 191-bus (practical) test systems. The proposed approach is tested and compared to a genetic algorithm (GA) and an Adaptive Genetic Algorithm (AGA).

  19. 3rd International Conference on Harmony Search Algorithm

    CERN Document Server

    2017-01-01

    This book presents state-of-the-art technical contributions based around one of the most successful evolutionary optimization algorithms published to date: Harmony Search. Contributions span from novel technical derivations of this algorithm to applications in the broad fields of civil engineering, energy, transportation & mobility and health, among many others and focus not only on its cross-domain applicability, but also on its core evolutionary operators, including elements inspired from other meta-heuristics. The global scientific community is witnessing an upsurge in groundbreaking, new advances in all areas of computational intelligence, with a particular flurry of research focusing on evolutionary computation and bio-inspired optimization. Observed processes in nature and sociology have provided the basis for innovative algorithmic developments aimed at leveraging the inherent capability to adapt characterized by various animals, including ants, fireflies, wolves and humans. However, it is the beha...

  20. Performance Analysis of Binary Search Algorithm in RFID

    Directory of Open Access Journals (Sweden)

    Xiangmei SONG

    2014-12-01

    Full Text Available The binary search algorithm (BS) is an important anti-collision algorithm in Radio Frequency Identification (RFID), and is also one of the key technologies that determine whether the information in a tag can be identified by the reader-writer quickly and reliably. The performance of BS directly affects the quality of service in the Internet of Things. This paper adopts an automated formal technique, probabilistic model checking, to analyze the performance of the BS algorithm formally. Firstly, according to the working principle of the BS algorithm, its dynamic behavior is abstracted into a Discrete Time Markov Chain, which can describe deterministic, discrete-time, and probabilistic selection behavior. Then, on this model, we calculate the probability of data being sent successfully and the expected time for tags to complete the data transmission. Compared to S-ALOHA, another typical anti-collision protocol in RFID, the experimental results show that, as the number of tags increases, the BS algorithm has lower space and time consumption, its average number of conflicts increases more slowly than under the S-ALOHA protocol, the BS algorithm needs less expected time to complete the data transmission, and the average speed of data transmission in BS is about 1.6 times that of the S-ALOHA protocol.

  1. Investigation on the improvement of genetic algorithm for PWR loading pattern search and its benchmark verification

    International Nuclear Information System (INIS)

    Li Qianqian; Jiang Xiaofeng; Zhang Shaohong

    2009-01-01

    In this study, the age technique and the concepts of relativeness degree and worth function are exploited to improve the performance of a genetic algorithm (GA) for PWR loading pattern search. Among them, the age technique makes the algorithm capable of learning from previous search 'experience' and guides it to do a better search in the vicinity of a local optimum; the introduction of the relativeness degree checks the relativeness of two loading patterns before performing crossover between them, which can significantly reduce the possibility of premature convergence of the algorithm; and the application of the worth function makes the algorithm capable of generating new loading patterns based on the statistics of common features of evaluated good loading patterns. Numerical verification against a loading pattern search benchmark problem of a two-loop reactor demonstrates that the adoption of these techniques significantly enhances the efficiency of the genetic algorithm while improving the quality of the final solution as well. (authors)

  2. Optimal IIR filter design using Gravitational Search Algorithm with Wavelet Mutation

    Directory of Open Access Journals (Sweden)

    S.K. Saha

    2015-01-01

    Full Text Available This paper presents a global heuristic search optimization technique, which is a hybridized version of the Gravitational Search Algorithm (GSA) and the Wavelet Mutation (WM) strategy. The Gravitational Search Algorithm with Wavelet Mutation (GSAWM) was adopted for the design of an 8th-order infinite impulse response (IIR) filter. GSA is based on the interaction of masses situated in a small isolated world, guided by an approximation of Newton's laws of gravity and motion. Each mass is represented by four parameters, namely position, active mass, passive mass, and inertial mass. The position of the heaviest mass gives the near-optimal solution. For better exploitation in multidimensional search spaces, the WM strategy is applied to randomly selected particles, which enhances the capability of GSA to find better near-optimal solutions. An extensive simulation study of low-pass (LP), high-pass (HP), band-pass (BP), and band-stop (BS) IIR filters demonstrates the potential of GSAWM in achieving sharper cut-off frequencies, smaller pass-band and stop-band ripples, smaller transition width, and higher stop-band attenuation with assured stability.

  3. An Improved Crow Search Algorithm Applied to Energy Problems

    Directory of Open Access Journals (Sweden)

    Primitivo Díaz

    2018-03-01

    Full Text Available The efficient use of energy in electrical systems has become a relevant topic due to its environmental impact. Parameter identification in induction motors and capacitor allocation in distribution networks are two representative problems that have strong implications for the massive use of energy. From an optimization perspective, both problems are considered extremely complex due to their non-linearity, discontinuity, and high multi-modality. These characteristics make it difficult to solve them by using standard optimization techniques. On the other hand, metaheuristic methods have been widely used as alternative optimization algorithms to solve complex engineering problems. The Crow Search Algorithm (CSA) is a recent metaheuristic method based on the intelligent group behavior of crows. Although CSA presents interesting characteristics, its search strategy has great difficulties when facing highly multi-modal formulations. In this paper, an improved version of the CSA method is presented to solve complex optimization problems in energy. In the new algorithm, two features of the original CSA are modified: (I) the awareness probability (AP) and (II) the random perturbation. With such adaptations, the new approach preserves solution diversity and improves convergence to difficult, highly multi-modal optima. In order to evaluate its performance, the proposed algorithm has been tested on a set of four optimization problems which involve induction motors and distribution networks. The results demonstrate the high performance of the proposed method when compared with other popular approaches.

  4. Application of Tabu Search Algorithm in Job Shop Scheduling

    Directory of Open Access Journals (Sweden)

    Betrianis Betrianis

    2010-10-01

    Full Text Available Tabu Search is one of the local search methods used to solve combinatorial optimization problems. This method aims to make the search for the best solution in a complex combinatorial optimization problem (NP-hard, e.g., the job shop scheduling problem) more effective, with less computational time but with no guarantee of an optimum solution. In this paper, tabu search is used to solve a job shop scheduling problem consisting of 3 (three) cases, namely the ordering packages of September, October, and November, with the objective of minimizing makespan (Cmax). For each ordering package, a combination of initial solution and tabu list length is used. These results are then compared with 4 (four) other methods using basic dispatching rules, namely Shortest Processing Time (SPT), Earliest Due Date (EDD), Most Work Remaining (MWKR), and First Come First Served (FCFS). Scheduling using the Tabu Search algorithm is sensitive to variable changes and gives a shorter makespan than scheduling using the other four methods. A generic sketch of the tabu search loop is given below.
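
    The following Python sketch shows the generic tabu search loop the abstract refers to: move to the best non-tabu neighbour, record the move in a bounded tabu list, and keep the best solution seen. The neighbourhood (adjacent swaps of a permutation), the tabu list length, and the toy objective are placeholders, not the paper's job shop model or makespan evaluation.

```python
import random

def tabu_search(initial, evaluate, neighbors, tabu_len=10, iterations=200):
    """Generic tabu search over permutations.

    evaluate(sol)  -> cost to minimize (e.g., a makespan)
    neighbors(sol) -> iterable of (move, candidate) pairs
    """
    current = best = initial
    best_cost = evaluate(best)
    tabu = []
    for _ in range(iterations):
        candidates = [(m, c) for m, c in neighbors(current) if m not in tabu]
        if not candidates:
            break
        move, current = min(candidates, key=lambda mc: evaluate(mc[1]))
        tabu.append(move)
        if len(tabu) > tabu_len:
            tabu.pop(0)              # forget the oldest tabu move
        cost = evaluate(current)
        if cost < best_cost:
            best, best_cost = current, cost
    return best, best_cost

# Toy usage: minimize the number of inversions of a permutation via adjacent swaps.
def swap_neighbors(perm):
    for i in range(len(perm) - 1):
        cand = list(perm)
        cand[i], cand[i + 1] = cand[i + 1], cand[i]
        yield (i, i + 1), tuple(cand)

def inversions(perm):
    return sum(1 for i in range(len(perm)) for j in range(i + 1, len(perm)) if perm[i] > perm[j])

start = tuple(random.sample(range(8), 8))
print(tabu_search(start, inversions, swap_neighbors))
```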

  5. Redundancy allocation of series-parallel systems using a variable neighborhood search algorithm

    International Nuclear Information System (INIS)

    Liang, Y.-C.; Chen, Y.-C.

    2007-01-01

    This paper presents a meta-heuristic algorithm, variable neighborhood search (VNS), for the redundancy allocation problem (RAP). The RAP, an NP-hard problem, has attracted the attention of much prior research, generally in a restricted form where each subsystem must consist of identical components. The newer meta-heuristic methods overcome this limitation and offer a practical way to solve large instances of the relaxed RAP where different components can be used in parallel. The authors' previously published work has shown promise for the variable neighborhood descent (VND) method, the simplest version among VNS variations, on the RAP. The variable neighborhood search method itself has not been used in reliability design, yet it is a method that fits those combinatorial problems with potential neighborhood structures, as in the case of the RAP. Therefore, the authors further extended their work to develop a VNS algorithm for the RAP and tested it on a set of well-known benchmark problems from the literature. Results on 33 test instances ranging from less to severely constrained conditions show that the variable neighborhood search method improves the performance of VND and provides competitive solution quality at economical computational expense in comparison with the best-known heuristics, including ant colony optimization, genetic algorithm, and tabu search

  6. Redundancy allocation of series-parallel systems using a variable neighborhood search algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Liang, Y.-C. [Department of Industrial Engineering and Management, Yuan Ze University, No 135 Yuan-Tung Road, Chung-Li, Taoyuan County, Taiwan 320 (China)]. E-mail: ycliang@saturn.yzu.edu.tw; Chen, Y.-C. [Department of Industrial Engineering and Management, Yuan Ze University, No 135 Yuan-Tung Road, Chung-Li, Taoyuan County, Taiwan 320 (China)]. E-mail: s927523@mail.yzu.edu.tw

    2007-03-15

    This paper presents a meta-heuristic algorithm, variable neighborhood search (VNS), for the redundancy allocation problem (RAP). The RAP, an NP-hard problem, has attracted the attention of much prior research, generally in a restricted form where each subsystem must consist of identical components. The newer meta-heuristic methods overcome this limitation and offer a practical way to solve large instances of the relaxed RAP where different components can be used in parallel. The authors' previously published work has shown promise for the variable neighborhood descent (VND) method, the simplest version among VNS variations, on the RAP. The variable neighborhood search method itself has not been used in reliability design, yet it is a method that fits those combinatorial problems with potential neighborhood structures, as in the case of the RAP. Therefore, the authors further extended their work to develop a VNS algorithm for the RAP and tested it on a set of well-known benchmark problems from the literature. Results on 33 test instances ranging from less to severely constrained conditions show that the variable neighborhood search method improves the performance of VND and provides competitive solution quality at economical computational expense in comparison with the best-known heuristics, including ant colony optimization, genetic algorithm, and tabu search.

  7. A Prefiltered Cuckoo Search Algorithm with Geometric Operators for Solving Sudoku Problems

    Directory of Open Access Journals (Sweden)

    Ricardo Soto

    2014-01-01

    Full Text Available Sudoku is a famous logic-placement game, originally popularized in Japan and today widely employed as a pastime and as a testbed for search algorithms. The classic Sudoku consists in filling a 9×9 grid, divided into nine 3×3 regions, so that each column, row, and region contains each digit from 1 to 9 exactly once. This game is known to be NP-complete, and various complete and incomplete search algorithms exist that are able to solve different instances of it. In this paper, we present a new cuckoo search algorithm for solving Sudoku puzzles, combining prefiltering phases and geometric operations. The geometric operators allow one to move correctly toward promising regions of the combinatorial space, while the prefiltering phases are able to delete in advance from the domains those values that do not lead to any feasible solution. This integration leads to more efficient domain filtering and, as a consequence, to a faster solving process. We illustrate encouraging experimental results where our approach noticeably competes with the best approximate methods reported in the literature.

  8. Improved Seam-Line Searching Algorithm for UAV Image Mosaic with Optical Flow.

    Science.gov (United States)

    Zhang, Weilong; Guo, Bingxuan; Li, Ming; Liao, Xuan; Li, Wenzhuo

    2018-04-16

    Ghosting and seams are two major challenges in creating unmanned aerial vehicle (UAV) image mosaics. In response to these problems, this paper proposes an improved method for UAV image seam-line searching. First, an image matching algorithm is used to extract and match the features of adjacent images, so that they can be transformed into the same coordinate system. Then, the gray-scale difference, the gradient minimum, and the optical flow value of pixels in the overlapped area of adjacent images are calculated over a neighborhood, and these are used to create an energy function for seam-line searching. Based on that, an improved dynamic programming algorithm is proposed to search for the optimal seam-lines to complete the UAV image mosaic. This algorithm adopts a more adaptive energy aggregation and traversal strategy, which can find a more ideal splicing path for adjacent UAV images and better avoid ground objects. The experimental results show that the proposed method can effectively solve the problems of ghosting and seams in panoramic UAV images.

  9. Stochastic search in structural optimization - Genetic algorithms and simulated annealing

    Science.gov (United States)

    Hajela, Prabhat

    1993-01-01

    An account is given of illustrative applications of genetic algorithms and simulated annealing methods in structural optimization. The advantages of such stochastic search methods over traditional mathematical programming strategies are emphasized; it is noted that these methods offer a significantly higher probability of locating the global optimum in a multimodal design space. Both genetic search and simulated annealing can be effectively used in problems with a mix of continuous, discrete, and integer design variables.

  10. RDEL: Restart Differential Evolution algorithm with Local Search Mutation for global numerical optimization

    Directory of Open Access Journals (Sweden)

    Ali Wagdy Mohamed

    2014-11-01

    Full Text Available In this paper, a novel version of the Differential Evolution (DE) algorithm based on a local search mutation and a restart mechanism for solving global numerical optimization problems over continuous space is presented. The proposed algorithm is named Restart Differential Evolution algorithm with Local Search Mutation (RDEL). In RDEL, inspired by Particle Swarm Optimization (PSO), a novel local mutation rule based on the positions of the best and the worst individuals among the entire population of a particular generation is introduced. The novel local mutation scheme is joined with the basic mutation rule through a linearly decreasing function. The proposed local mutation scheme is shown to enhance the local search tendency of basic DE and speed up convergence. Furthermore, a restart mechanism based on a random mutation scheme and a modified Breeder Genetic Algorithm (BGA) mutation scheme is combined to avoid stagnation and/or premature convergence. Additionally, an exponentially increasing crossover probability rule and uniform scaling factors of DE are introduced to promote the diversity of the population and to improve the search process, respectively. The performance of RDEL is investigated and compared with basic differential evolution and state-of-the-art parameter-adaptive differential evolution variants. It is found that the proposed modifications significantly improve the performance of DE in terms of solution quality, efficiency, and robustness.

  11. A Nodes Deployment Algorithm in Wireless Sensor Network Based on Distribution

    Directory of Open Access Journals (Sweden)

    Song Yuli

    2014-07-01

    Full Text Available Wireless sensor network coverage is a basic problem of wireless sensor networks. In this paper, we propose a wireless sensor network node deployment algorithm based on distribution in order to form an efficient wireless sensor network. An iterative greedy algorithm is used to select priority nodes to activate until the entire network is covered by wireless sensor nodes and the whole network is multiply connected. The simulation results show that the distributed wireless sensor network node deployment algorithm can form a multiply connected wireless sensor network.
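
    A minimal sketch of the iterative greedy idea described above is given below: repeatedly activate the node that covers the most not-yet-covered target points until coverage is complete (connectivity handling is omitted). The disk-sensing model with a fixed radius and the point coordinates are assumptions for illustration, not details from the paper.

```python
import math

def greedy_activate(nodes, targets, radius):
    """Iteratively activate the node covering the most uncovered targets.

    nodes, targets: lists of (x, y) points; radius: sensing range.
    Returns the indices of activated nodes (stops when no node adds coverage).
    """
    covered = set()
    active = []

    def newly_covered(node):
        return {i for i, t in enumerate(targets)
                if math.dist(node, t) <= radius and i not in covered}

    while len(covered) < len(targets):
        gains = [(len(newly_covered(n)), i) for i, n in enumerate(nodes) if i not in active]
        gain, idx = max(gains, default=(0, None))
        if not gain:
            break                       # remaining targets cannot be covered
        active.append(idx)
        covered |= newly_covered(nodes[idx])
    return active

nodes = [(0, 0), (2, 0), (4, 0)]
targets = [(0, 1), (2, 1), (4, 1), (2, -1)]
print(greedy_activate(nodes, targets, radius=1.5))
```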

  12. Intermediate view reconstruction using adaptive disparity search algorithm for real-time 3D processing

    Science.gov (United States)

    Bae, Kyung-hoon; Park, Changhan; Kim, Eun-soo

    2008-03-01

    In this paper, intermediate view reconstruction (IVR) using an adaptive disparity search algorithm (ADSA) for real-time 3-dimensional (3D) processing is proposed. The proposed algorithm can reduce the processing time of disparity estimation by selecting an adaptive disparity search range. Also, the proposed algorithm can increase the quality of the 3D imaging. That is, by adaptively predicting the mutual correlation between the stereo image pair using the proposed algorithm, the bandwidth of the stereo input image pair can be compressed to the level of a conventional 2D image, and a predicted image can also be effectively reconstructed using a reference image and disparity vectors. Experiments on the stereo sequences 'Pot Plant' and 'IVO' show that the proposed algorithm improves the PSNR of a reconstructed image by about 4.8 dB compared with that of conventional algorithms, and reduces the synthesis time of a reconstructed image to about 7.02 s compared with that of conventional algorithms.

  13. Crane Double Cycling in Container Ports: Algorithms, Evaluation, and Planning

    OpenAIRE

    Goodchild, Anne Victoria

    2005-01-01

    Loading ships as they are unloaded (double-cycling) can improve the efficiency of a quay crane and container port. This dissertation describes the double-cycling problem, and presents solution algorithms and simple formulae to estimate benefits. In Chapter 2 we focus on reducing the number of operations necessary to turn around a ship. First an intuitive lower bound is developed. We then present a greedy algorithm that was developed based on the physical properties of the problem and yields a...

  14. Oscillating feature subset search algorithm for text categorization

    Czech Academy of Sciences Publication Activity Database

    Novovičová, Jana; Somol, Petr; Pudil, Pavel

    2006-01-01

    Roč. 44, č. 4225 (2006), s. 578-587 ISSN 0302-9743 R&D Projects: GA AV ČR IAA2075302; GA MŠk 2C06019 EU Projects: European Commission(XE) 507752 - MUSCLE Institutional research plan: CEZ:AV0Z10750506 Keywords : text classification * feature selection * oscillating search algorithm * Bhattacharyya distance Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 0.402, year: 2005

  15. Data classification using metaheuristic Cuckoo Search technique for Levenberg Marquardt back propagation (CSLM) algorithm

    Science.gov (United States)

    Nawi, Nazri Mohd.; Khan, Abdullah; Rehman, M. Z.

    2015-05-01

    Nature-inspired metaheuristic techniques provide derivative-free solutions for solving complex problems. One of the latest additions to this group of nature-inspired optimization procedures is the Cuckoo Search (CS) algorithm. Artificial Neural Network (ANN) training is an optimization task, since it is desired to find the optimal weight set of a neural network in the training process. Traditional training algorithms have limitations such as getting trapped in local minima and slow convergence rates. This study proposes a new technique, CSLM, which combines the best features of two known algorithms, back-propagation (BP) and the Levenberg Marquardt (LM) algorithm, to improve the convergence speed of ANN training and to avoid the local minima problem. Some selected benchmark classification datasets are used for simulation. The experimental results show that the proposed Cuckoo Search with Levenberg Marquardt algorithm has better performance than the other algorithms used in this study.

  16. Inversion for Refractivity Parameters Using a Dynamic Adaptive Cuckoo Search with Crossover Operator Algorithm.

    Science.gov (United States)

    Zhang, Zhihua; Sheng, Zheng; Shi, Hanqing; Fan, Zhiqiang

    2016-01-01

    Using the RFC technique to estimate refractivity parameters is a complex nonlinear optimization problem. In this paper, an improved cuckoo search (CS) algorithm is proposed to deal with this problem. To enhance the performance of the CS algorithm, a parameter dynamic adaptive operation and a crossover operation were integrated into the standard CS (DACS-CO). Rechenberg's 1/5 criterion combined with a learning factor was used to control the parameter dynamic adaptive adjusting process. The crossover operation of the genetic algorithm was utilized to guarantee population diversity. The new hybrid algorithm has better local search ability and contributes to superior performance. To verify the ability of the DACS-CO algorithm to estimate atmospheric refractivity parameters, both simulation data and real radar clutter data are used. The numerical experiments demonstrate that the DACS-CO algorithm can provide an effective method for near-real-time estimation of the atmospheric refractivity profile from radar clutter.

  17. Tackling Error Propagation through Reinforcement Learning: A Case of Greedy Dependency Parsing

    NARCIS (Netherlands)

    Le, M.N.; Fokkens, A.S.

    Error propagation is a common problem in NLP. Reinforcement learning explores erroneous states during training and can therefore be more robust when mistakes are made early in a process. In this paper, we apply reinforcement learning to greedy dependency parsing, which is known to suffer from error

  18. A Statistical Model for Uplink Intercell Interference with Power Adaptation and Greedy Scheduling

    KAUST Repository

    Tabassum, Hina

    2012-10-03

    This paper deals with the statistical modeling of uplink inter-cell interference (ICI) considering greedy scheduling with power adaptation based on channel conditions. The derived model is implicitly generalized for any kind of shadowing and fading environments. More precisely, we develop a generic model for the distribution of ICI based on the locations of the allocated users and their transmit powers. The derived model is utilized to evaluate important network performance metrics such as ergodic capacity, average fairness and average power preservation numerically. Monte-Carlo simulation details are included to support the analysis and show the accuracy of the derived expressions. In parallel to the literature, we show that greedy scheduling with power adaptation reduces the ICI, average power consumption of users, and enhances the average fairness among users, compared to the case without power adaptation. © 2012 IEEE.

  19. A Statistical Model for Uplink Intercell Interference with Power Adaptation and Greedy Scheduling

    KAUST Repository

    Tabassum, Hina; Yilmaz, Ferkan; Dawy, Zaher; Alouini, Mohamed-Slim

    2012-01-01

    This paper deals with the statistical modeling of uplink inter-cell interference (ICI) considering greedy scheduling with power adaptation based on channel conditions. The derived model is implicitly generalized for any kind of shadowing and fading environments. More precisely, we develop a generic model for the distribution of ICI based on the locations of the allocated users and their transmit powers. The derived model is utilized to evaluate important network performance metrics such as ergodic capacity, average fairness and average power preservation numerically. Monte-Carlo simulation details are included to support the analysis and show the accuracy of the derived expressions. In parallel to the literature, we show that greedy scheduling with power adaptation reduces the ICI, average power consumption of users, and enhances the average fairness among users, compared to the case without power adaptation. © 2012 IEEE.

  20. Induction Motor Parameter Identification Using a Gravitational Search Algorithm

    Directory of Open Access Journals (Sweden)

    Omar Avalos

    2016-04-01

    Full Text Available The efficient use of electrical energy is a topic that has attracted attention for its environmental consequences. On the other hand, induction motors represent the main component in most industries. They consume the highest energy percentages in industrial facilities. This energy consumption depends on the operation conditions of the induction motor imposed by its internal parameters. Since the internal parameters of an induction motor are not directly measurable, an identification process must be conducted to obtain them. In the identification process, the parameter estimation is transformed into a multidimensional optimization problem where the internal parameters of the induction motor are considered as decision variables. Under this approach, the complexity of the optimization problem tends to produce multimodal error surfaces whose cost functions are significantly difficult to minimize. Several algorithms based on evolutionary computation principles have been successfully applied to identify the optimal parameters of induction motors. However, most of them retain an important limitation: they frequently obtain sub-optimal solutions as a result of an improper equilibrium between exploitation and exploration in their search strategies. This paper presents an algorithm for the optimal parameter identification of induction motors. To determine the parameters, the proposed method uses a recent evolutionary method called the gravitational search algorithm (GSA). Different from most of the existing evolutionary algorithms, the GSA presents better performance in multimodal problems, avoiding critical flaws such as premature convergence to sub-optimal solutions. Numerical simulations have been conducted on several models to show the effectiveness of the proposed scheme.

  1. Dynamic Vehicle Routing Using an Improved Variable Neighborhood Search Algorithm

    Directory of Open Access Journals (Sweden)

    Yingcheng Xu

    2013-01-01

    Full Text Available In order to effectively solve the dynamic vehicle routing problem with time windows, a mathematical model is established and an improved variable neighborhood search algorithm is proposed. In the algorithm, customer allocation and route planning for the initial solution are completed by a clustering method. Hybrid insert and exchange operators are used to perform the shaking process, a subsequent optimization process is presented to improve the solution space, and the best-improvement strategy is adopted, which allows the algorithm to achieve a better balance between solution quality and running time. The idea of simulated annealing is introduced to control the acceptance of new solutions, and the influences of arrival time, geographical distribution, and time window range on route selection are analyzed. In the experiments, the proposed algorithm is applied to solve DVRP instances of different sizes. Comparison of the results with other algorithms shows that the algorithm is effective and feasible.

  2. Nearest greedy for solving the waste collection vehicle routing problem: A case study

    Science.gov (United States)

    Mat, Nur Azriati; Benjamin, Aida Mauziah; Abdul-Rahman, Syariza; Wibowo, Antoni

    2017-11-01

    This paper presents a real case study pertaining to waste collection in the northern part of Malaysia, using a constructive heuristic algorithm known as the Nearest Greedy (NG) technique. This technique has been widely used to devise initial solutions for vehicle routing problems. Basically, the waste collection cycle involves the following steps: i) each vehicle starts from a depot, ii) visits a number of customers to collect waste, iii) unloads the waste at the disposal site, and lastly, iv) returns to the depot. The sample data set used in this paper consists of six areas, where each area involves up to 103 customers. In this paper, the NG technique was employed to construct an initial route for each area. The solution produced by the technique was compared with the present vehicle routes implemented by the waste collection company within the city. The comparison shows that NG offers better vehicle routes, with an 11.07% reduction in the total distance traveled compared to the present vehicle routes.
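
    The sketch below illustrates the Nearest Greedy construction described above: start at the depot, repeatedly visit the nearest unvisited customer, then go to the disposal site and return to the depot. The coordinates and the single-vehicle, unbounded-capacity simplification are assumptions for illustration, not the case-study data.

```python
import math

def nearest_greedy_route(depot, customers, disposal):
    """Build one collection route: depot -> nearest-unvisited customers -> disposal -> depot."""
    route = [depot]
    remaining = list(customers)
    current = depot
    while remaining:
        nxt = min(remaining, key=lambda c: math.dist(current, c))  # greedy choice
        remaining.remove(nxt)
        route.append(nxt)
        current = nxt
    route += [disposal, depot]
    return route

def route_length(route):
    return sum(math.dist(a, b) for a, b in zip(route, route[1:]))

depot, disposal = (0.0, 0.0), (5.0, 5.0)
customers = [(1.0, 2.0), (3.0, 1.0), (2.0, 4.0)]
route = nearest_greedy_route(depot, customers, disposal)
print(route, round(route_length(route), 2))
```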

  3. Combinatorial Algorithms for Computing Column Space Bases ThatHave Sparse Inverses

    Energy Technology Data Exchange (ETDEWEB)

    Pinar, Ali; Chow, Edmond; Pothen, Alex

    2005-03-18

    This paper presents a combinatorial study on the problem of constructing a sparse basis for the null-space of a sparse, underdetermined, full-rank matrix, A. Such a null-space is suitable for solving many saddle-point problems. Our approach is to form a column space basis of A that has a sparse inverse, by selecting suitable columns of A. This basis is then used to form a sparse null-space basis in fundamental form. We investigate three different algorithms for computing the column space basis: two greedy approaches that rely on matching, and a third employing a divide-and-conquer strategy implemented with hypergraph partitioning followed by the greedy approach. We also discuss the complexity of selecting a column basis when it is known that a block diagonal basis exists with a small given block size.

  4. Pareto Optimization of a Half Car Passive Suspension Model Using a Novel Multiobjective Heat Transfer Search Algorithm

    Directory of Open Access Journals (Sweden)

    Vimal Savsani

    2017-01-01

    Full Text Available Most modern multiobjective optimization algorithms are based on the search technique of genetic algorithms; however, the search techniques of other recently developed metaheuristics are emerging topics among researchers. This paper proposes a novel multiobjective optimization algorithm named multiobjective heat transfer search (MOHTS), which is based on the search technique of the heat transfer search (HTS) algorithm. MOHTS employs the elitist non-dominated sorting and crowding distance approaches of the elitist-based non-dominated sorting genetic algorithm-II (NSGA-II) for obtaining different non-domination levels and for preserving diversity among the optimal set of solutions, respectively. Its capability of yielding a Pareto front as close as possible to the true Pareto front has been tested on the multiobjective optimization problem of vehicle suspension design, which is governed by a set of five second-order linear ordinary differential equations. A half-car passive ride model with two different sets of five objectives is employed for optimizing the suspension parameters using MOHTS and NSGA-II. The optimization studies demonstrate that MOHTS achieves a better non-dominated Pareto front with a widespread (diverse) set of optimal solutions compared to NSGA-II, and, further, the comparison of the extreme points of the obtained Pareto fronts reveals the dominance of MOHTS over NSGA-II, the multiobjective uniform diversity genetic algorithm (MUGA), and a combined PSO-GA based MOEA.

  5. An Effective Cuckoo Search Algorithm for Node Localization in Wireless Sensor Network.

    Science.gov (United States)

    Cheng, Jing; Xia, Linyuan

    2016-08-31

    Localization is an essential requirement in the increasingly prevalent applications of wireless sensor networks (WSNs). Reducing the computational complexity and communication overhead of WSN localization is of paramount importance in order to prolong the lifetime of the energy-limited sensor nodes and improve localization performance. This paper proposes an effective Cuckoo Search (CS) algorithm for node localization. Based on a modification of the step size, this approach enables the population to approach the global optimal solution rapidly, and the fitness of each solution is employed to build a mutation probability for avoiding local convergence. Further, the approach restricts the population to a certain range so that it can prevent the energy consumption caused by insignificant searches. Extensive experiments were conducted to study the effects of parameters such as anchor density, node density, and communication range on the proposed algorithm with respect to average localization error and localization success ratio. In addition, a comparative study was conducted to realize the same localization task using the same network deployment. Experimental results prove that the proposed CS algorithm can not only increase the convergence rate but also reduce the average localization error compared with the standard CS algorithm and the Particle Swarm Optimization (PSO) algorithm.
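
    The abstract does not state the objective explicitly; a common choice in range-based WSN localization, sketched below as an assumption, is to minimize the mean squared difference between a candidate position's distances to the anchors and the measured ranges. This is the kind of fitness a CS (or PSO) individual would be scored with; the anchor layout and ranges are toy values.

```python
import math

def localization_fitness(candidate, anchors, measured_ranges):
    """Mean squared range error of a candidate (x, y) position.

    anchors: known (x, y) anchor positions
    measured_ranges: distances measured from the unknown node to each anchor
    """
    errors = [(math.dist(candidate, a) - r) ** 2
              for a, r in zip(anchors, measured_ranges)]
    return sum(errors) / len(errors)

anchors = [(0, 0), (10, 0), (0, 10)]
true_node = (4.0, 3.0)
ranges = [math.dist(true_node, a) for a in anchors]   # noiseless ranges for the example
print(localization_fitness((4.0, 3.0), anchors, ranges))  # ~0.0 at the true position
print(localization_fitness((6.0, 6.0), anchors, ranges))  # larger away from it
```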

  6. Design and economic investigation of shell and tube heat exchangers using Improved Intelligent Tuned Harmony Search algorithm

    Directory of Open Access Journals (Sweden)

    Oguz Emrah Turgut

    2014-12-01

    Full Text Available This study explores the thermal design of shell and tube heat exchangers by using the Improved Intelligent Tuned Harmony Search (I-ITHS) algorithm. Intelligent Tuned Harmony Search (ITHS) is an upgraded version of the harmony search algorithm, which has the advantage of deciding between intensification and diversification processes by applying a proper pitch adjusting strategy. In this study, we aim to improve the search capacity of the ITHS algorithm by utilizing chaotic sequences instead of uniformly distributed random numbers and by applying alternative search strategies, inspired by the Artificial Bee Colony algorithm and Opposition Based Learning, on promising areas (best solutions). Design variables including baffle spacing, shell diameter, tube outer diameter, and number of tube passes are used to minimize the total cost of the heat exchanger, which incorporates the capital investment and the sum of discounted annual energy expenditures related to pumping and heat exchanger area. Results show that I-ITHS can be utilized in optimizing shell and tube heat exchangers.

  7. Short-term economic environmental hydrothermal scheduling using improved multi-objective gravitational search algorithm

    International Nuclear Information System (INIS)

    Li, Chunlong; Zhou, Jianzhong; Lu, Peng; Wang, Chao

    2015-01-01

    Highlights: • Improved multi-objective gravitational search algorithm. • An elite archive set is proposed to guide the evolutionary process. • Neighborhood searching mechanism to improve local search ability. • Chaotic mutation is adopted to avoid premature convergence. • A feasible space method is proposed to handle hydro plant constraints. - Abstract: With growing concerns about energy and the environment, short-term economic environmental hydrothermal scheduling (SEEHS) plays a more and more important role in power systems. Because of its two objectives and various constraints, SEEHS is a complex multi-objective optimization problem (MOOP). In order to solve the problem, we propose an improved multi-objective gravitational search algorithm (IMOGSA) in this paper. In IMOGSA, the mass of an agent is redefined by multiple objectives to make it suitable for MOOP. An elite archive set is proposed to keep Pareto optimal solutions and guide the evolutionary process. For balancing exploration and exploitation, a neighborhood searching mechanism is presented to cooperate with chaotic mutation. Moreover, a novel method based on feasible space is proposed to handle hydro plant constraints during SEEHS, and a violation adjustment method is adopted to handle the power balance constraint. For verifying its effectiveness, the proposed IMOGSA is applied to a hydrothermal system in two different case studies. The simulation results show that IMOGSA has a competitive performance in SEEHS when compared with other established algorithms.

  8. Inversion for Refractivity Parameters Using a Dynamic Adaptive Cuckoo Search with Crossover Operator Algorithm

    Directory of Open Access Journals (Sweden)

    Zhihua Zhang

    2016-01-01

    Full Text Available Using the RFC technique to estimate refractivity parameters is a complex nonlinear optimization problem. In this paper, an improved cuckoo search (CS) algorithm is proposed to deal with this problem. To enhance the performance of the CS algorithm, a parameter dynamic adaptive operation and a crossover operation were integrated into the standard CS (DACS-CO). Rechenberg's 1/5 criterion combined with a learning factor was used to control the parameter dynamic adaptive adjusting process. The crossover operation of the genetic algorithm was utilized to guarantee population diversity. The new hybrid algorithm has better local search ability and contributes to superior performance. To verify the ability of the DACS-CO algorithm to estimate atmospheric refractivity parameters, both simulation data and real radar clutter data are used. The numerical experiments demonstrate that the DACS-CO algorithm can provide an effective method for near-real-time estimation of the atmospheric refractivity profile from radar clutter.

  9. Performance of genetic algorithms in search for water splitting perovskites

    DEFF Research Database (Denmark)

    Jain, A.; Castelli, Ivano Eligio; Hautier, G.

    2013-01-01

    We examine the performance of genetic algorithms (GAs) in uncovering solar water light splitters over a space of almost 19,000 perovskite materials. The entire search space was previously calculated using density functional theory to determine solutions that fulfill constraints on stability, band...

  10. A GENETIC ALGORITHM USING THE LOCAL SEARCH HEURISTIC IN FACILITIES LAYOUT PROBLEM: A MEMETİC ALGORİTHM APPROACH

    Directory of Open Access Journals (Sweden)

    Orhan TÜRKBEY

    2002-02-01

    Full Text Available Memetic algorithms, which use local search techniques, are hybrid-structured evolutionary algorithms similar to genetic algorithms. In this study, a memetic-structured algorithm using a local search heuristic such as 2-opt is developed for the Quadratic Assignment Problem (QAP). In the algorithm, a crossover operator that has not been used before for the QAP is applied, whereas the Eshelman procedure is used in order to increase the solution variability. The developed memetic algorithm is applied to test problems taken from QAPLIB, and the results are compared with existing techniques in the literature.
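
    The sketch below illustrates the kind of 2-opt style pairwise-exchange local search a memetic algorithm would use as its refinement step on QAP solutions: swap two facility assignments whenever the swap reduces the cost, and repeat until no improving swap remains. The flow-distance objective is the standard QAP formulation; the instance data are toy values, not taken from QAPLIB.

```python
def qap_cost(perm, flow, dist):
    """Standard QAP objective: sum over i, j of flow[i][j] * dist[perm[i]][perm[j]]."""
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]] for i in range(n) for j in range(n))

def two_opt(perm, flow, dist):
    """Pairwise-exchange local search: swap two assignments while the cost improves."""
    perm = list(perm)
    best = qap_cost(perm, flow, dist)
    improved = True
    while improved:
        improved = False
        for i in range(len(perm) - 1):
            for j in range(i + 1, len(perm)):
                perm[i], perm[j] = perm[j], perm[i]
                cost = qap_cost(perm, flow, dist)
                if cost < best:
                    best, improved = cost, True
                else:
                    perm[i], perm[j] = perm[j], perm[i]  # undo a non-improving swap
    return perm, best

flow = [[0, 3, 1], [3, 0, 2], [1, 2, 0]]
dist = [[0, 1, 4], [1, 0, 2], [4, 2, 0]]
print(two_opt([0, 1, 2], flow, dist))
```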

  11. An improved Harmony Search algorithm for optimal scheduling of the diesel generators in oil rig platforms

    Energy Technology Data Exchange (ETDEWEB)

    Yadav, Parikshit; Kumar, Rajesh; Panda, S.K.; Chang, C.S. [Department of Electrical and Computer Engineering, National University of Singapore, Singapore 117576 (Singapore)

    2011-02-15

    Harmony Search (HS) is a music-based meta-heuristic optimization method, analogous to the music improvisation process in which musicians continue to polish the pitches in order to obtain a better harmony. The paper focuses on the optimal scheduling of the generators to reduce the fuel consumption in an oil rig platform. Accurate modeling of the specific fuel consumption is significant in this optimization. The specific fuel consumption has been modeled using cubic spline interpolation. The SFC curve is non-linear and discrete in nature; hence conventional methods fail to give an optimal solution. The HS algorithm has been used for optimal scheduling of generators of both equal and unequal rating. Furthermore, an Improved Harmony Search (IHS) method for generating new solution vectors, which enhances the accuracy and convergence rate of HS, has been employed. The paper also focuses on the impacts of constant parameters on the Harmony Search algorithm. Numerical results show that the IHS method has good convergence properties. Moreover, the fuel consumption for the IHS algorithm is lower when compared to HS and other heuristic or deterministic methods, and it is a powerful search algorithm for various engineering optimization problems. (author)
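
    The sketch below shows the core HS improvisation step referred to above (memory consideration, pitch adjustment, random selection) in its generic form. Parameter names follow common HS terminology (HMCR, PAR, bandwidth), and the toy objective is a placeholder, not the paper's generator fuel-consumption model.

```python
import random

def improvise(memory, hmcr=0.9, par=0.3, bandwidth=0.05, bounds=(0.0, 1.0)):
    """Create one new harmony (candidate solution) from the harmony memory."""
    dim = len(memory[0])
    new = []
    for d in range(dim):
        if random.random() < hmcr:                      # memory consideration
            value = random.choice(memory)[d]
            if random.random() < par:                   # pitch adjustment
                value += random.uniform(-bandwidth, bandwidth)
        else:                                           # random selection
            value = random.uniform(*bounds)
        new.append(min(max(value, bounds[0]), bounds[1]))
    return new

def harmony_search(objective, dim, memory_size=10, iterations=500):
    memory = [[random.random() for _ in range(dim)] for _ in range(memory_size)]
    for _ in range(iterations):
        candidate = improvise(memory)
        worst = max(memory, key=objective)
        if objective(candidate) < objective(worst):     # replace the worst harmony
            memory[memory.index(worst)] = candidate
    return min(memory, key=objective)

# Toy usage: minimize the squared distance of a 3-dimensional vector from 0.5 per coordinate.
print(harmony_search(lambda x: sum((v - 0.5) ** 2 for v in x), dim=3))
```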

  12. An Improved Harmony Search algorithm for optimal scheduling of the diesel generators in oil rig platforms

    International Nuclear Information System (INIS)

    Yadav, Parikshit; Kumar, Rajesh; Panda, S.K.; Chang, C.S.

    2011-01-01

    Harmony Search (HS) is a music-based meta-heuristic optimization method, analogous to the music improvisation process in which musicians continue to polish the pitches in order to obtain a better harmony. The paper focuses on the optimal scheduling of the generators to reduce the fuel consumption in an oil rig platform. Accurate modeling of the specific fuel consumption is significant in this optimization. The specific fuel consumption has been modeled using cubic spline interpolation. The SFC curve is non-linear and discrete in nature; hence conventional methods fail to give an optimal solution. The HS algorithm has been used for optimal scheduling of generators of both equal and unequal rating. Furthermore, an Improved Harmony Search (IHS) method for generating new solution vectors, which enhances the accuracy and convergence rate of HS, has been employed. The paper also focuses on the impacts of constant parameters on the Harmony Search algorithm. Numerical results show that the IHS method has good convergence properties. Moreover, the fuel consumption for the IHS algorithm is lower when compared to HS and other heuristic or deterministic methods, and it is a powerful search algorithm for various engineering optimization problems.

  13. Reactive power planning with FACTS devices using gravitational search algorithm

    Directory of Open Access Journals (Sweden)

    Biplab Bhattacharyya

    2015-09-01

    Full Text Available In this paper, the Gravitational Search Algorithm (GSA) is used as the optimization method for reactive power planning using FACTS (Flexible AC Transmission System) devices. The planning problem is formulated as a single-objective optimization problem in which the real power loss and bus voltage deviations are minimized under different loading conditions. The GSA-based optimization algorithm and the particle swarm optimization (PSO) technique are applied to the IEEE 30-bus system. Results show that GSA can also be a very effective tool for reactive power planning.

  14. Implementation of the Grover search algorithm with Josephson charge qubits

    International Nuclear Information System (INIS)

    Zheng Xiaohu; Dong Ping; Xue Zhengyuan; Cao Zhuoliang

    2007-01-01

    A scheme for implementing the Grover search algorithm based on Josephson charge qubits has been proposed, which would be a key step toward scaling to more complex quantum algorithms and is very important for constructing a real quantum computer via Josephson charge qubits. The present scheme is simple but fairly efficient, and easily manipulated, because any two charge qubits can be selectively and effectively coupled by a common inductance. More manipulations can be carried out before decoherence sets in. Our scheme can be realized with current technology.

  15. MIDAS: a database-searching algorithm for metabolite identification in metabolomics.

    Science.gov (United States)

    Wang, Yingfeng; Kora, Guruprasad; Bowen, Benjamin P; Pan, Chongle

    2014-10-07

    A database searching approach can be used for metabolite identification in metabolomics by matching measured tandem mass spectra (MS/MS) against the predicted fragments of metabolites in a database. Here, we present the open-source MIDAS algorithm (Metabolite Identification via Database Searching). To evaluate a metabolite-spectrum match (MSM), MIDAS first enumerates possible fragments from a metabolite by systematic bond dissociation, then calculates the plausibility of the fragments based on their fragmentation pathways, and finally scores the MSM to assess how well the experimental MS/MS spectrum from collision-induced dissociation (CID) is explained by the metabolite's predicted CID MS/MS spectrum. MIDAS was designed to search high-resolution tandem mass spectra acquired on time-of-flight or Orbitrap mass spectrometers against a metabolite database in an automated and high-throughput manner. The accuracy of metabolite identification by MIDAS was benchmarked using four sets of standard tandem mass spectra from MassBank. On average, for 77% of original spectra and 84% of composite spectra, MIDAS correctly ranked the true compounds as the first MSMs out of all MetaCyc metabolites as decoys. MIDAS correctly identified 46% more original spectra and 59% more composite spectra at the first MSMs than an existing database-searching algorithm, MetFrag. MIDAS was showcased by searching a published real-world measurement of a metabolome from Synechococcus sp. PCC 7002 against the MetaCyc metabolite database. MIDAS identified many metabolites missed in the previous study. MIDAS identifications should be considered only as candidate metabolites, which need to be confirmed using standard compounds. To facilitate manual validation, MIDAS provides annotated spectra for MSMs and labels observed mass spectral peaks with predicted fragments. The database searching and manual validation can be performed online at http://midas.omicsbio.org.

  16. Parameter Identification of the 2-Chlorophenol Oxidation Model Using Improved Differential Search Algorithm

    Directory of Open Access Journals (Sweden)

    Guang-zhou Chen

    2015-01-01

    Parameter identification plays a crucial role in simulating and using a model. This paper first carries out a sensitivity analysis of the 2-chlorophenol oxidation model in supercritical water using the Monte Carlo method. Then, to address the nonlinearity of the model, two improved differential search (DS) algorithms are proposed to carry out the parameter identification of the model. One strategy is to adopt the Latin hypercube sampling method to replace the uniform distribution of the initial population; the other is to combine DS with the simplex method. The results of the sensitivity analysis reveal the sensitivity and the degree of identification difficulty for every model parameter. Furthermore, the posterior probability distribution of the parameters and the collaborative relationship between any two parameters can be obtained. To verify the effectiveness of the improved algorithms, the optimization performance of the improved DS in kinetic parameter estimation is studied and compared with that of the basic DS algorithm, differential evolution, artificial bee colony optimization, and quantum-behaved particle swarm optimization. The experimental results demonstrate that the DS with the Latin hypercube sampling method alone does not present better performance, while the hybrid method has the advantages of strong global and local search ability and is more effective than the other algorithms.
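
    As an illustration of the first improvement strategy above, here is a small, generic sketch of Latin hypercube initialisation of a search population. The bounds, population size and objective are hypothetical placeholders; the actual kinetic model and DS implementation of the paper are not reproduced.

```python
import random

def latin_hypercube_population(pop_size, bounds):
    """Latin hypercube initialisation: each variable's range is split into
    pop_size equal strata and every stratum is sampled exactly once."""
    dim = len(bounds)
    population = [[0.0] * dim for _ in range(pop_size)]
    for d, (lo, hi) in enumerate(bounds):
        strata = list(range(pop_size))
        random.shuffle(strata)               # random pairing across dimensions
        width = (hi - lo) / pop_size
        for i, s in enumerate(strata):
            population[i][d] = lo + (s + random.random()) * width
    return population

# Toy usage: 6 individuals for a hypothetical 2-parameter kinetic model.
for individual in latin_hypercube_population(6, [(0.0, 1.0), (10.0, 50.0)]):
    print(individual)
```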

  17. Dynamic Harmony Search with Polynomial Mutation Algorithm for Valve-Point Economic Load Dispatch

    Directory of Open Access Journals (Sweden)

    M. Karthikeyan

    2015-01-01

    This paper presents a dynamic harmony search with polynomial mutation (DHSPM) algorithm to solve the ORPD problem. In the DHSPM algorithm the key parameters of the HS algorithm, such as the harmony memory considering rate (HMCR) and the pitch adjusting rate (PAR), are changed dynamically, so there is no need to predefine these parameters. Additionally, polynomial mutation is inserted in the updating step of the HS algorithm to favor exploration and exploitation of the search space. The DHSPM algorithm is tested on three power system cases consisting of 3, 13, and 40 thermal units. The computational results show that the DHSPM algorithm is more effective in finding better solutions than other computational-intelligence-based methods.
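
    A minimal sketch of the two ingredients named in the abstract, under generic assumptions: a simplified (unbounded-form, then clipped) polynomial mutation operator in the sense of Deb, and a linear schedule standing in for the dynamic HMCR/PAR update. The distribution index, parameter ranges and schedule shape are illustrative, not taken from the paper.

```python
import random

def polynomial_mutation(x, lo, hi, eta_m=20.0):
    """Simplified polynomial mutation (after Deb): perturb x with a polynomial
    probability distribution, then clip the result back into [lo, hi]."""
    u = random.random()
    if u < 0.5:
        delta = (2.0 * u) ** (1.0 / (eta_m + 1.0)) - 1.0
    else:
        delta = 1.0 - (2.0 * (1.0 - u)) ** (1.0 / (eta_m + 1.0))
    return min(max(x + delta * (hi - lo), lo), hi)

def dynamic_parameter(start, end, iteration, max_iterations):
    """Linear schedule standing in for the dynamic HMCR/PAR update."""
    return start + (end - start) * iteration / max_iterations

# Toy usage: mutate one decision variable and print a dynamically scheduled PAR.
print(polynomial_mutation(0.4, 0.0, 1.0))
print(dynamic_parameter(0.35, 0.99, iteration=50, max_iterations=100))
```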

  18. Defining Algorithmic Ideology: Using Ideology Critique to Scrutinize Corporate Search Engines

    Directory of Open Access Journals (Sweden)

    Astrid Mager

    2014-02-01

    This article conceptualizes “algorithmic ideology” as a valuable tool to understand and critique corporate search engines in the context of wider socio-political developments. Drawing on critical theory, it shows how capitalist value-systems manifest in search technology, how they spread through algorithmic logics and how they are stabilized in society. Following philosophers like Althusser, Marx and Gramsci, it elaborates how content providers and users contribute to Google’s capital accumulation cycle and the exploitation schemes that come along with it. In line with contemporary mass media and neoliberal politics, they appear to be fostering capitalism and its “commodity fetishism” (Marx). It further reveals that the capitalist hegemony has to be constantly negotiated and renewed. This dynamic notion of ideology opens up the view for moments of struggle and counter-actions. “Organic intellectuals” (Gramsci) can play a central role in challenging powerful actors like Google and their algorithmic ideology. Paving the way towards more democratic information technology, however, requires more than single organic intellectuals. Additional obstacles need to be conquered, as I finally discuss.

  19. PowerPlay: Training an Increasingly General Problem Solver by Continually Searching for the Simplest Still Unsolvable Problem.

    Science.gov (United States)

    Schmidhuber, Jürgen

    2013-01-01

    Most of computer science focuses on automatically solving given computational problems. I focus on automatically inventing or discovering problems in a way inspired by the playful behavior of animals and humans, to train a more and more general problem solver from scratch in an unsupervised fashion. Consider the infinite set of all computable descriptions of tasks with possibly computable solutions. Given a general problem-solving architecture, at any given time, the novel algorithmic framework PowerPlay (Schmidhuber, 2011) searches the space of possible pairs of new tasks and modifications of the current problem solver, until it finds a more powerful problem solver that provably solves all previously learned tasks plus the new one, while the unmodified predecessor does not. Newly invented tasks may require achieving a wow-effect by making previously learned skills more efficient such that they require less time and space. New skills may (partially) re-use previously learned skills. The greedy search of typical PowerPlay variants uses time-optimal program search to order candidate pairs of tasks and solver modifications by their conditional computational (time and space) complexity, given the stored experience so far. The new task and its corresponding task-solving skill are those first found and validated. This biases the search toward pairs that can be described compactly and validated quickly. The computational costs of validating new tasks need not grow with task repertoire size. Standard problem solver architectures of personal computers or neural networks tend to generalize by solving numerous tasks outside the self-invented training set; PowerPlay's ongoing search for novelty keeps breaking the generalization abilities of its present solver. This is related to Gödel's sequence of increasingly powerful formal theories based on adding formerly unprovable statements to the axioms without affecting previously provable theorems. The continually increasing

  20. An improved version of Inverse Distance Weighting metamodel assisted Harmony Search algorithm for truss design optimization

    Directory of Open Access Journals (Sweden)

    Y. Gholipour

    This paper focuses on a metamodel-based design optimization algorithm. The intention is to improve its computational cost and convergence rate. The metamodel-based optimization method introduced here provides the necessary means to reduce the computational cost and improve the convergence rate of the optimization through a surrogate. This algorithm is a combination of a high-quality approximation technique called Inverse Distance Weighting and a meta-heuristic algorithm called Harmony Search. The outcome is then polished by a semi-tabu search algorithm. The algorithm adopts a filtering system and determines the solution vectors for which exact simulation should be applied. The performance of the algorithm is evaluated on standard truss design problems, showing a significant decrease in the computational effort and an improvement in the convergence rate.
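
    The surrogate idea can be illustrated with a few lines of generic Inverse Distance Weighting; this is only a sketch of the interpolation step, not the paper's metamodel-assisted Harmony Search or its filtering system. The sample points and the power parameter are hypothetical.

```python
import math

def idw_predict(x, samples, power=2.0):
    """Inverse Distance Weighting: estimate f(x) as a distance-weighted average
    of exactly evaluated samples given as [(point, value), ...]."""
    num, den = 0.0, 0.0
    for point, value in samples:
        dist = math.dist(x, point)
        if dist == 0.0:              # exact hit: return the stored value
            return value
        weight = 1.0 / dist ** power
        num += weight * value
        den += weight
    return num / den

# Toy usage: surrogate prediction between two exactly analysed truss designs.
samples = [((1.0, 2.0), 10.0), ((3.0, 1.0), 14.0)]
print(idw_predict((2.0, 1.5), samples))
```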

  1. Combined heat and power economic dispatch by a fish school search algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Leonardo Trigueiro dos; Costa e Silva, Marsil de Athayde [Undergraduate in Mechatronics Engineering, Pontifical Catholic University of Parana, Curitiba, PR (Brazil); Coelho, Leandro dos Santos [Industrial and Systems Engineering Graduate Program, PPGEPS, Pontifical Catholic University of Parana, Curitiba, PR (Brazil)], e-mail: leandro.coelho@pucpr.br

    2010-07-01

    The conversion of primary fossil fuels, such as coal and gas, to electricity is a relatively inefficient process. Even the most modern combined cycle plants can only achieve efficiencies between 50% and 60%. A great portion of the energy wasted in this conversion process is released to the environment as waste heat. The principle of combined heat and power, also known as cogeneration, is to recover and make beneficial use of this heat, significantly raising the overall efficiency of the conversion process. However, the optimal utilization of multiple combined heat and power systems is a complicated problem which needs powerful methods to solve. This paper presents a fish school search (FSS) algorithm to solve the combined heat and power economic dispatch problem. FSS is a novel approach recently proposed to perform search in complex optimization problems. Some simulations presented in the literature indicated that FSS can outperform many bio-inspired algorithms, mainly on multimodal functions. The search process in FSS is carried out by a population of limited-memory individuals - the fishes. Each fish represents a possible solution to the problem. Similarly to particle swarm optimization or genetic algorithms, search guidance in FSS is driven by the success of some individual members of the population. A four-unit system proposed recently, which is a benchmark case in the power systems field, has been used as a case study in this paper. (author)

  2. Optimization of Transformation Coefficients Using Direct Search and Swarm Intelligence

    Directory of Open Access Journals (Sweden)

    Manusov V.Z.

    2017-04-01

    This research considers the optimization of the tap positions of transformers in power systems to reduce power losses. At present, methods based on heuristic rules and fuzzy logic, or methods that optimize parts of the whole system separately, are applied to this problem. The first approach requires expert knowledge about processes in the network. The second group of methods is not able to consider all the interrelations of the system's parts, even though changes in one segment affect the entire system. Both approaches are difficult to implement and require adjustment to the tasks being solved. It is therefore advisable to use algorithms that can take into account complex interrelations of the optimized variables and adapt to the optimization task on their own. Swarm Intelligence algorithms are such algorithms. Their main features are self-organization, which allows them to automatically adapt to the conditions of a task, and the ability to efficiently escape from local extrema. Thus, they do not require specialized knowledge of the system, in contrast to fuzzy logic. In addition, they can efficiently find quasi-optimal solutions converging to the global optimum. This research applies the Particle Swarm Optimization (PSO) algorithm. A model of the Tajik power system was used in the experiments. It was found that PSO is much more efficient than greedy heuristics and more flexible and easier to use than fuzzy logic. PSO allows reducing active power losses from 48.01 to 45.83 MW (4.5%). At the same time, the effect of using greedy heuristics or fuzzy logic is two times smaller (2.3%).
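
    For reference, a minimal particle swarm optimization loop of the kind applied here, written against a generic continuous objective. The inertia and acceleration coefficients, the bounds and the toy objective are assumptions for illustration; the Tajik power-system model and the tap-position encoding of the study are not reproduced.

```python
import random

def pso(objective, bounds, swarm=20, iterations=200, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimiser: inertia plus personal and global pulls."""
    dim = len(bounds)
    pos = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(swarm), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iterations):
        for i in range(swarm):
            for d, (lo, hi) in enumerate(bounds):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy usage: two continuous stand-ins for tap settings, target values 1.05 and 0.98.
print(pso(lambda x: (x[0] - 1.05) ** 2 + (x[1] - 0.98) ** 2, [(0.9, 1.1)] * 2))
```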

  3. Hybrid Optimization Algorithm of Particle Swarm Optimization and Cuckoo Search for Preventive Maintenance Period Optimization

    Directory of Open Access Journals (Sweden)

    Jianwen Guo

    2016-01-01

    All equipment must be maintained during its lifetime to ensure normal operation. Maintenance plays a critical role in the success of manufacturing enterprises. This paper proposes a preventive maintenance period optimization model (PMPOM) to find an optimal preventive maintenance period. By making use of the advantages of particle swarm optimization (PSO) and the cuckoo search (CS) algorithm, a hybrid optimization algorithm of PSO and CS is proposed to solve the PMPOM problem. Test functions show that the proposed algorithm exhibits more outstanding performance than particle swarm optimization and cuckoo search alone. Experiment results show that the proposed algorithm has the advantages of strong optimization ability and fast convergence speed for solving the PMPOM problem.

  4. Nature-inspired Cuckoo Search Algorithm for Side Lobe Suppression in a Symmetric Linear Antenna Array

    Directory of Open Access Journals (Sweden)

    K. N. Abdul Rani

    2012-09-01

    In this paper, we propose a modified cuckoo search (MCS) algorithm, integrated with a Roulette-wheel selection operator and an inertia weight controlling the search ability, for synthesizing a symmetric linear array geometry with minimum side lobe level (SLL) and/or null control. The basic cuckoo search (CS) algorithm is primarily based on the natural obligate brood-parasitic behavior of some cuckoo species in combination with the Levy flight behavior of some birds and fruit flies. The CS metaheuristic approach is straightforward and capable of effectively solving general N-dimensional, linear and nonlinear optimization problems. The array geometry synthesis is first formulated as an optimization problem with the goal of SLL suppression and/or prescribed null placement in certain directions, and then solved by the new MCS algorithm for the optimum element or isotropic radiator locations in the azimuth plane (xy-plane). The study also focuses on the four internal parameters of the MCS algorithm, specifically on their implicit effects in the array synthesis. The optimal inter-element spacing solutions obtained by the MCS optimizer are validated through comparisons with the standard CS optimizer and the conventional array within the uniform and the Dolph-Chebyshev envelope patterns using MATLAB. Finally, we also compare the fine-tuned MCS algorithm with two popular evolutionary algorithm (EA) techniques, particle swarm optimization (PSO) and genetic algorithms (GA).
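
    The Levy flight ingredient of cuckoo search mentioned above can be sketched as follows, using Mantegna's algorithm for the step length; the step-size scaling, the beta value and the move rule are generic textbook choices, not the MCS parameters tuned in the paper.

```python
import math
import random

def levy_step(beta=1.5):
    """One Levy-flight step length via Mantegna's algorithm."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0.0, sigma)
    v = random.gauss(0.0, 1.0)
    return u / abs(v) ** (1 / beta)

def cuckoo_move(nest, best, alpha=0.01):
    """Propose a new nest by a Levy flight biased toward the current best nest."""
    return [x + alpha * levy_step() * (x - b) for x, b in zip(nest, best)]

# Toy usage: propose new inter-element spacings from a current and a best design.
print(cuckoo_move([0.52, 0.47, 0.55], [0.50, 0.50, 0.50]))
```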

  5. Heat Transfer Search Algorithm for Non-convex Economic Dispatch Problems

    Science.gov (United States)

    Hazra, Abhik; Das, Saborni; Basu, Mousumi

    2018-03-01

    This paper presents the Heat Transfer Search (HTS) algorithm for the non-linear economic dispatch problem. The HTS algorithm is based on the laws of thermodynamics and heat transfer. The proficiency of the suggested technique has been demonstrated on three different complex economic dispatch problems: with valve-point effects; with prohibited operating zones; and with multiple fuels together with valve-point effects. Test results obtained with the suggested technique for the economic dispatch problem have been compared with those obtained by other reported evolutionary techniques. It has been observed that the suggested HTS yields superior solutions.

  6. δ-Similar Elimination to Enhance Search Performance of Multiobjective Evolutionary Algorithms

    Science.gov (United States)

    Aguirre, Hernán; Sato, Masahiko; Tanaka, Kiyoshi

    In this paper, we propose δ-similar elimination to improve the search performance of multiobjective evolutionary algorithms in combinatorial optimization problems. This method eliminates similar individuals in objective space to fairly distribute selection among the different regions of the instantaneous Pareto front. We investigate four eliminating methods analyzing their effects using NSGA-II. In addition, we compare the search performance of NSGA-II enhanced by our method and NSGA-II enhanced by controlled elitism.
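
    One plausible reading of the elimination step, as a sketch: an individual is kept only if its objective vector is not within a tolerance δ (per objective) of an already kept individual. The tolerance, the comparison rule and the toy data are assumptions, not the four variants studied in the paper.

```python
def delta_similar_elimination(population, objectives, delta):
    """Keep an individual only if its objective vector is not within delta
    (per objective) of one already kept, spreading selection along the front."""
    kept, kept_objs = [], []
    for ind, obj in zip(population, objectives):
        too_close = any(all(abs(a - b) <= delta for a, b in zip(obj, ref))
                        for ref in kept_objs)
        if not too_close:
            kept.append(ind)
            kept_objs.append(obj)
    return kept

# Toy usage: "B" is eliminated because it is nearly identical to "A" in objective space.
print(delta_similar_elimination(["A", "B", "C"],
                                [(1.0, 5.0), (1.02, 5.01), (3.0, 2.0)],
                                delta=0.05))    # ['A', 'C']
```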

  7. A Teaching Approach from the Exhaustive Search Method to the Needleman-Wunsch Algorithm

    Science.gov (United States)

    Xu, Zhongneng; Yang, Yayun; Huang, Beibei

    2017-01-01

    The Needleman-Wunsch algorithm has become one of the core algorithms in bioinformatics; however, this programming requires more suitable explanations for students with different major backgrounds. In supposing sample sequences and using a simple store system, the connection between the exhaustive search method and the Needleman-Wunsch algorithm…
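
    A compact dynamic-programming version of the Needleman-Wunsch score computation, for readers following the teaching approach described here; the scoring values are the usual illustrative match/mismatch/gap choices, and the simple store system of the article is not reproduced.

```python
def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-2):
    """Global alignment score by dynamic programming (Needleman-Wunsch)."""
    rows, cols = len(a) + 1, len(b) + 1
    score = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):                 # aligning a prefix against gaps
        score[i][0] = i * gap
    for j in range(1, cols):
        score[0][j] = j * gap
    for i in range(1, rows):
        for j in range(1, cols):
            diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
    return score[-1][-1]

# Toy usage: score of a global alignment of two short sample sequences.
print(needleman_wunsch("GATTACA", "GCATGCU"))
```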

  8. A Local and Global Search Combined Particle Swarm Optimization Algorithm and Its Convergence Analysis

    Directory of Open Access Journals (Sweden)

    Weitian Lin

    2014-01-01

    Particle swarm optimization algorithm (PSOA) is an advantageous optimization tool. However, it has a tendency to get stuck in a near-optimal solution, especially for middle- and large-size problems, and it is difficult to improve solution accuracy by fine-tuning parameters. To address this insufficiency, this paper studies the local and global search combined particle swarm optimization algorithm (LGSCPSOA), analyzes its convergence, and obtains its convergence qualification. It is tested on a set of 8 benchmark continuous functions, and the optimization results are compared with those of the original particle swarm optimization algorithm (OPSOA). Experimental results indicate that the LGSCPSOA improves the search performance significantly, especially on the middle- and large-size benchmark functions.

  9. Fast Solution in Sparse LDA for Binary Classification

    Science.gov (United States)

    Moghaddam, Baback

    2010-01-01

    An algorithm that performs sparse linear discriminant analysis (Sparse-LDA) finds near-optimal solutions in far less time than the prior art when specialized to binary classification (of 2 classes). Sparse-LDA is a type of feature- or variable-selection problem with numerous applications in statistics, machine learning, computer vision, computational finance, operations research, and bio-informatics. Because of their combinatorial nature, feature- or variable-selection problems are NP-hard or computationally intractable in cases involving more than 30 variables or features. Therefore, one typically seeks approximate solutions by means of greedy search algorithms. The prior Sparse-LDA algorithm was a greedy algorithm that considered the best variable or feature to add/delete to/from its subsets in order to maximally discriminate between multiple classes of data. The present algorithm is designed for the special but prevalent case of 2-class or binary classification (e.g. 1 vs. 0, functioning vs. malfunctioning, or change versus no change). The present algorithm provides near-optimal solutions on large real-world datasets having hundreds or even thousands of variables or features (e.g. selecting the fewest wavelength bands in a hyperspectral sensor to do terrain classification) and does so in typical computation times of minutes as compared to days or weeks as taken by the prior art. Sparse LDA requires solving generalized eigenvalue problems for a large number of variable subsets (represented by the submatrices of the input within-class and between-class covariance matrices). In the general (full-rank) case, the amount of computation scales at least cubically with the number of variables and thus the size of the problems that can be solved is limited accordingly. However, in binary classification, the principal eigenvalues can be found using a special analytic formula, without resorting to costly iterative techniques. The present algorithm exploits this analytic
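
    The greedy search referred to above can be sketched generically as forward feature selection: repeatedly add the single feature that most improves a user-supplied discrimination score. The score function below is a toy stand-in, not the analytic two-class eigenvalue formula that the algorithm actually exploits.

```python
def greedy_forward_selection(num_features, score, k):
    """Greedily grow a feature subset, at each step adding the single feature
    that most improves a user-supplied discrimination score."""
    selected = []
    remaining = set(range(num_features))
    while remaining and len(selected) < k:
        best_f = max(remaining, key=lambda f: score(selected + [f]))
        if score(selected + [best_f]) <= score(selected):
            break                                # no remaining feature helps
        selected.append(best_f)
        remaining.remove(best_f)
    return selected

# Toy usage: a made-up score that rewards features 2 and 5 for a 2-class task.
def toy_score(subset):
    return sum(1.0 for f in subset if f in (2, 5)) - 0.1 * len(subset)

print(greedy_forward_selection(8, toy_score, k=3))    # [2, 5]
```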

  10. Identification of Fuzzy Inference Systems by Means of a Multiobjective Opposition-Based Space Search Algorithm

    Directory of Open Access Journals (Sweden)

    Wei Huang

    2013-01-01

    We introduce a new category of fuzzy inference systems with the aid of a multiobjective opposition-based space search algorithm (MOSSA). The proposed MOSSA is essentially a multiobjective space search algorithm improved by using opposition-based learning, which employs a so-called opposite-numbers mechanism to speed up the convergence of the optimization algorithm. In the identification of the fuzzy inference system, the MOSSA is exploited to carry out the parametric identification of the fuzzy model as well as to realize its structural identification. Experimental results demonstrate the effectiveness of the proposed fuzzy models.
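
    The opposite-numbers mechanism is simple enough to show directly: for a variable x in [lo, hi], its opposite is lo + hi - x, and a population can be seeded from the fitter half of random points plus their opposites. This is a generic sketch of opposition-based learning, not the MOSSA identification procedure itself; the bounds and fitness function are placeholders.

```python
import random

def opposite(solution, bounds):
    """Opposite-numbers mechanism: each x_d is mirrored to lo_d + hi_d - x_d."""
    return [lo + hi - x for x, (lo, hi) in zip(solution, bounds)]

def opposition_based_init(pop_size, bounds, fitness):
    """Generate a random population plus its opposites and keep the fitter half."""
    pop = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    candidates = pop + [opposite(p, bounds) for p in pop]
    candidates.sort(key=fitness)                 # smaller fitness is better here
    return candidates[:pop_size]

# Toy usage: the opposites help cover the search space from the first iteration.
print(opposition_based_init(4, [(0.0, 10.0), (0.0, 10.0)], lambda x: sum(x)))
```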

  11. Hybrid Genetic Algorithm - Local Search Method for Ground-Water Management

    Science.gov (United States)

    Chiu, Y.; Nishikawa, T.; Martin, P.

    2008-12-01

    Ground-water management problems commonly are formulated as a mixed-integer, non-linear programming problem (MINLP). Relying only on conventional gradient-search methods to solve the management problem is computationally fast; however, the methods may become trapped in a local optimum. Global-optimization schemes can identify the global optimum, but the convergence is very slow when the optimal solution approaches the global optimum. In this study, we developed a hybrid optimization scheme, which includes a genetic algorithm and a gradient-search method, to solve the MINLP. The genetic algorithm identifies a near-optimal solution, and the gradient search uses the near optimum to identify the global optimum. Our methodology is applied to a conjunctive-use project in the Warren ground-water basin, California. Hi-Desert Water District (HDWD), the primary water manager in the basin, plans to construct a wastewater treatment plant to reduce future septic-tank effluent from reaching the ground-water system. The treated wastewater instead will recharge the ground-water basin via percolation ponds as part of a larger conjunctive-use strategy, subject to State regulations (e.g. minimum distances and travel times). HDWD wishes to identify the least-cost conjunctive-use strategies that control ground-water levels, meet regulations, and identify new production-well locations. As formulated, the MINLP objective is to minimize water-delivery costs subject to constraints including pump capacities, available recharge water, water-supply demand, water-level constraints, and potential new-well locations. The methodology was demonstrated by an enumerative search of the entire feasible solution space and by comparing the optimum solution with results from the branch-and-bound algorithm. The results also indicate that the hybrid method identifies the global optimum within an affordable computation time. Sensitivity analyses, which include testing different recharge-rate scenarios, pond

  12. Infrastructure system restoration planning using evolutionary algorithms

    Science.gov (United States)

    Corns, Steven; Long, Suzanna K.; Shoberg, Thomas G.

    2016-01-01

    This paper presents an evolutionary algorithm to address restoration issues for supply-chain-interdependent critical infrastructure. Rapid restoration of infrastructure after a large-scale disaster is necessary to sustain a nation's economy and security, but such long-term restoration has not been investigated as thoroughly as initial rescue and recovery efforts. A model of the Greater Saint Louis, Missouri, area was created and a disaster scenario simulated. An evolutionary algorithm is used to determine the order in which the bridges should be repaired based on indirect costs. Solutions were evaluated based on the reduction of indirect costs and the restoration of transportation capacity. When compared to a greedy algorithm, the evolutionary algorithm solution reduced indirect costs by approximately 12.4% by restoring automotive travel routes for workers and re-establishing the flow of commodities across the three rivers in the Saint Louis area.

  13. Two General Extension Algorithms of Latin Hypercube Sampling

    Directory of Open Access Journals (Sweden)

    Zhi-zhao Liu

    2015-01-01

    To reserve original sampling points and thereby reduce the number of simulation runs, two general extension algorithms for Latin Hypercube Sampling (LHS) are proposed. The extension algorithms start with an original LHS of size m and construct a new LHS of size m+n that contains as many of the original points as possible. In order to get a strict LHS of larger size, some original points might have to be deleted. The relationship of the original sampling points in the new LHS structure is shown by a simple undirected acyclic graph. The basic general extension algorithm is proposed to reserve the most original points, but it costs too much time. Therefore, a general extension algorithm based on a greedy algorithm is proposed to reduce the extension time, although it cannot guarantee that the most original points are retained. These algorithms are illustrated by an example and applied to evaluating sample means to demonstrate their effectiveness.
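
    A sketch of one greedy way to extend a Latin hypercube design, in the spirit of (but not identical to) the paper's greedy extension algorithm: re-stratify [0, 1) into m+n cells per dimension, keep an original point only if all of its new strata are still free, and fill the remaining strata with fresh points. The example points are hypothetical.

```python
import random

def extend_lhs(points, n_new):
    """Greedy extension of a Latin hypercube design in [0, 1)^d from m points
    to m + n_new: an original point is kept only if all of its new, finer
    strata are still free; the remaining strata are then filled with new points."""
    total = len(points) + n_new
    dim = len(points[0])
    used = [set() for _ in range(dim)]           # occupied strata per dimension
    kept = []
    for p in points:                             # greedy pass over original points
        strata = [min(int(x * total), total - 1) for x in p]
        if all(s not in used[d] for d, s in enumerate(strata)):
            kept.append(list(p))
            for d, s in enumerate(strata):
                used[d].add(s)
    free = []
    for d in range(dim):                         # shuffled free strata, per dimension
        cells = [s for s in range(total) if s not in used[d]]
        random.shuffle(cells)
        free.append(cells)
    for i in range(total - len(kept)):           # one new point per remaining stratum
        kept.append([(free[d][i] + random.random()) / total for d in range(dim)])
    return kept

# Toy usage: extend a 3-point, 2-dimensional LHS to 5 points.
original = [[0.1, 0.5], [0.4, 0.9], [0.8, 0.2]]
print(extend_lhs(original, 2))
```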

  14. A novel optimization method, Gravitational Search Algorithm (GSA), for PWR core optimization

    International Nuclear Information System (INIS)

    Mahmoudi, S.M.; Aghaie, M.; Bahonar, M.; Poursalehi, N.

    2016-01-01

    Highlights: • The Gravitational Search Algorithm (GSA) is introduced. • The advantage of GSA is verified on Shekel’s Foxholes. • Reload optimization for WWER-1000 and WWER-440 cases is performed. • Maximizing K_eff, minimizing PPFs and flattening the power density are considered. - Abstract: In-core fuel management optimization (ICFMO) is one of the most challenging concepts of nuclear engineering. In recent decades several meta-heuristic algorithms or computational intelligence methods have been developed to optimize the reactor core loading pattern. This paper presents a new method of using the Gravitational Search Algorithm (GSA) for in-core fuel management optimization. The GSA is constructed based on the law of gravity and the notion of mass interactions. It uses the theory of Newtonian physics, and its searcher agents are a collection of masses. In this work, in the first step, the GSA method is compared with other meta-heuristic algorithms on Shekel’s Foxholes problem. In the second step, for finding the best core, the GSA algorithm has been applied to three PWR test cases including WWER-1000 and WWER-440 reactors. In these cases, multi-objective optimizations with the following goals are considered: increasing the multiplication factor (K_eff), decreasing the power peaking factor (PPF) and flattening the power density. It is notable that for the neutronic calculation, the PARCS (Purdue Advanced Reactor Core Simulator) code is used. The results demonstrate that the GSA algorithm has promising performance and could be proposed for other optimization problems in the nuclear engineering field.
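
    A minimal continuous GSA loop, to make the gravity analogy concrete: fitter agents receive larger masses, the gravitational constant decays over time, and agents accelerate toward heavier agents. The constants, mass mapping and toy objective are generic illustrative choices; the loading-pattern encoding and the PARCS-based evaluation of the paper are not reproduced.

```python
import math
import random

def gsa(objective, bounds, agents=15, iterations=200, g0=100.0, alpha=20.0):
    """Minimal Gravitational Search Algorithm for continuous minimisation:
    fitter agents get larger masses and pull the others toward them."""
    dim = len(bounds)
    x = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(agents)]
    v = [[0.0] * dim for _ in range(agents)]
    best, best_val = None, float("inf")
    for t in range(iterations):
        fit = [objective(p) for p in x]
        b, w = min(fit), max(fit)
        if b < best_val:
            best, best_val = x[fit.index(b)][:], b
        # Map fitness to normalised masses (the best agent gets the largest mass).
        raw = [(w - f) / (w - b) if w > b else 1.0 for f in fit]
        total = sum(raw)
        mass = [r / total for r in raw]
        g = g0 * math.exp(-alpha * t / iterations)   # decaying gravitational constant
        for i in range(agents):
            acc = [0.0] * dim
            for j in range(agents):
                if i == j:
                    continue
                dist = math.dist(x[i], x[j]) + 1e-12
                for d in range(dim):
                    acc[d] += random.random() * g * mass[j] * (x[j][d] - x[i][d]) / dist
            for d, (lo, hi) in enumerate(bounds):
                v[i][d] = random.random() * v[i][d] + acc[d]
                x[i][d] = min(max(x[i][d] + v[i][d], lo), hi)
    return best, best_val

# Toy usage: a sphere function stands in for the reactor-physics objective.
print(gsa(lambda p: sum(c * c for c in p), [(-5.0, 5.0)] * 2))
```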

  15. Improved gravitational search algorithm for unit commitment considering uncertainty of wind power

    International Nuclear Information System (INIS)

    Ji, Bin; Yuan, Xiaohui; Chen, Zhihuan; Tian, Hao

    2014-01-01

    With increasing wind farm integration, unit commitment (UC) is more difficult to solve because of the intermittent and fluctuating nature of wind power. In this paper, a scenario generation and reduction technique is applied to simulate the impacts of its uncertainty on system operation. A model of the thermal UC problem with wind power integration (UCW) is then established. A combination of a quantum-inspired binary gravitational search algorithm (GSA) and the scenario analysis method is proposed to solve the UCW problem. Meanwhile, heuristic search strategies are used to handle the constraints of the thermal units for each scenario. In addition, a priority list of thermal units based on the weight between average full-load cost and maximal power output is utilized during the optimization process. Moreover, two UC test systems with and without wind power integration are used to verify the feasibility and effectiveness of the proposed method as well as the performance of the algorithm. The results are analyzed in detail, and they demonstrate that the model and the proposed method are practicable. The comparison with other methods clearly shows that the proposed method has higher efficiency for solving UC problems with and even without wind farm integration. - Highlights: • Impact of wind fluctuation on the unit commitment problem (UCW) is investigated. • Quantum-inspired binary gravitational search algorithm (QBGSA) is used to optimize UC. • A new method combining QBGSA with scenario analysis is proposed to solve UCW. • Heuristic search strategies are applied to handle the constraints of the UCW. • The results verify that the proposed method is feasible and efficient for handling UCW

  16. An Adaptive Large Neighborhood Search Algorithm for the Multi-mode RCPSP

    DEFF Research Database (Denmark)

    Muller, Laurent Flindt

    We present an Adaptive Large Neighborhood Search algorithm for the Multi-mode Resource-Constrained Project Scheduling Problem (MRCPSP). We incorporate techniques for deriving additional precedence relations and propose a new method, so-called mode-diminution, for removing modes during execution...

  17. Harmony search algorithm for solving combined heat and power economic dispatch problems

    Energy Technology Data Exchange (ETDEWEB)

    Khorram, Esmaile, E-mail: eskhor@aut.ac.i [Department of Applied Mathematics, Faculty of Mathematics and Computer Science, Amirkabir University of Technology, No. 424, Hafez Ave., 15914 Tehran (Iran, Islamic Republic of); Jaberipour, Majid, E-mail: Majid.Jaberipour@gmail.co [Department of Applied Mathematics, Faculty of Mathematics and Computer Science, Amirkabir University of Technology, No. 424, Hafez Ave., 15914 Tehran (Iran, Islamic Republic of)

    2011-02-15

    Economic dispatch (ED) is one of the key optimization problems in electric power system operation. The problem grows complex if one or more units produce both power and heat. Combined heat and power economic dispatch (CHPED) problem is a complicated problem that needs powerful methods to solve. This paper presents a harmony search (EDHS) algorithm to solve CHPED. Some standard examples are presented to demonstrate the effectiveness of this algorithm in obtaining the optimal solution. In all cases, the solutions obtained using EDHS algorithm are better than those obtained by other methods.

  18. A Line Search Multilevel Truncated Newton Algorithm for Computing the Optical Flow

    Directory of Open Access Journals (Sweden)

    Lluís Garrido

    2015-06-01

    We describe the implementation details and give the experimental results of three optimization algorithms for dense optical flow computation. In particular, using a line search strategy, we evaluate the performance of the unilevel truncated Newton method (LSTN), a multiresolution truncated Newton method (MR/LSTN) and a full multigrid truncated Newton method (FMG/LSTN). We use three image sequences and four models of optical flow for performance evaluation. The FMG/LSTN algorithm is shown to lead to better optical flow estimation with less computational work than both the LSTN and MR/LSTN algorithms.

  19. A bio-inspired swarm robot coordination algorithm for multiple target searching

    Science.gov (United States)

    Meng, Yan; Gan, Jing; Desai, Sachi

    2008-04-01

    The coordination of a multi-robot system searching for multiple targets is challenging in a dynamic environment, since the multi-robot system demands group coherence (agents need to have the incentive to work together faithfully) and group competence (agents need to know how to work together well). In our previously proposed bio-inspired coordination method, Local Interaction through Virtual Stigmergy (LIVS), one problem is the considerable randomness of the robot movement during coordination, which may lead to more power consumption and longer searching time. To address these issues, an adaptive LIVS (ALIVS) method is proposed in this paper, which not only considers the travel cost and target weight, but also predicts the target/robot ratio and potential robot redundancy with respect to the detected targets. Furthermore, a dynamic weight adjustment is also applied to improve the searching performance. The new method is a truly distributed method where each robot makes its own decision based on its local sensing information and the information from its neighbors. Basically, each robot only communicates with its neighbors through a virtual stigmergy mechanism and makes its local movement decision based on a Particle Swarm Optimization (PSO) algorithm. The proposed ALIVS algorithm has been implemented on the embodied robot simulator, Player/Stage, in a target-searching task. The simulation results demonstrate the efficiency and robustness of the approach, in a power-efficient manner, under real-world constraints.

  20. Path searching in switching networks using cellular algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Koczy, L T; Langer, J; Legendi, T

    1981-01-01

    After a survey of the important statements in the paper A Mathematical Model of Path Searching in General Type Switching Networks (see IBID., vol.25, no.1, p.31-43, 1981), the authors consider a possible implementation on cellular automata of the algorithm introduced there. The cellular field used consists of 5-neighbour, 8-state cells. Running times required by a traditional serial processor and by the cellular field, respectively, are compared. This running time can be reduced by parallel processing. 5 references.

  1. Tractable Algorithms for Proximity Search on Large Graphs

    Science.gov (United States)

    2010-07-01

    In this thesis, the main goal is to design fast algorithms for proximity search in large graphs, with a focus on investigating useful random-walk-based proximity measures.

  2. Location, Allocation and Routing of Temporary Health Centers in Rural Areas in Crisis, Solved by Improved Harmony Search Algorithm

    Directory of Open Access Journals (Sweden)

    Mahdi Alinaghian

    2017-01-01

    In this paper, an uncertain integrated model is considered for simultaneously locating temporary health centers in affected areas, allocating affected areas to these centers, and routing the transport of their required goods. Health centers can be set up in one of the affected areas or in a place outside them; therefore, the proposed model offers the best relief-operation policy when the goods of the affected areas (which are the customers of the goods) can be supplied directly or under coverage. Because the problem is NP-hard, a meta-heuristic based on the harmony search algorithm is presented to solve it at large scale, and its performance has been compared with the basic harmony search algorithm and a neighborhood search algorithm on small- and large-scale test problems. The results show that the proposed harmony search algorithm has a suitable efficiency.

  3. A novel symbiotic organisms search algorithm for congestion management in deregulated environment

    Science.gov (United States)

    Verma, Sumit; Saha, Subhodip; Mukherjee, V.

    2017-01-01

    In today's competitive electricity market, managing transmission congestion in deregulated power systems has created challenges for independent system operators to operate the transmission lines reliably within their limits. This paper proposes a new meta-heuristic algorithm, called the symbiotic organisms search (SOS) algorithm, for the congestion management (CM) problem in a pool-based electricity market through real power rescheduling of generators. Inspired by the interactions among organisms in an ecosystem, the SOS algorithm is a recent population-based algorithm which, unlike many other algorithms, does not require any algorithm-specific control parameters. Various security constraints such as load bus voltages and line loadings are taken into account while dealing with the CM problem. In this paper, the proposed SOS algorithm is applied to the modified IEEE 30- and 57-bus test power systems for the solution of the CM problem. The results thus obtained are compared to those reported in the recent state-of-the-art literature. The efficacy of the proposed SOS algorithm for obtaining higher-quality solutions is also established.

  4. LETTER TO THE EDITOR: Constant-time solution to the global optimization problem using Brüschweiler's ensemble search algorithm

    Science.gov (United States)

    Protopopescu, V.; D'Helon, C.; Barhen, J.

    2003-06-01

    A constant-time solution of the continuous global optimization problem (GOP) is obtained by using an ensemble algorithm. We show that under certain assumptions, the solution can be guaranteed by mapping the GOP onto a discrete unsorted search problem, whereupon Brüschweiler's ensemble search algorithm is applied. For adequate sensitivities of the measurement technique, the query complexity of the ensemble search algorithm depends linearly on the size of the function's domain. Advantages and limitations of an eventual NMR implementation are discussed.

  5. Reduction rules-based search algorithm for opportunistic replacement strategy of multiple life-limited parts

    Directory of Open Access Journals (Sweden)

    Xuyun FU

    2018-01-01

    The opportunistic replacement of multiple Life-Limited Parts (LLPs) is a problem widely existing in industry. The replacement strategy for LLPs has a great impact on the total maintenance cost of a lot of equipment. This article focuses on finding a quick and effective algorithm for this problem. To improve the algorithm efficiency, six reduction rules are suggested from the perspectives of solution feasibility, determination of the replacement of LLPs, determination of the maintenance occasion and solution optimality. Based on these six reduction rules, a search algorithm is proposed. This search algorithm can identify one or several optimal solutions. A numerical experiment shows that these six reduction rules are effective, and the time consumed by the algorithm is less than 38 s if the total life of the equipment is shorter than 55000 and the number of LLPs is less than 11. A specific case shows that the algorithm can obtain, within 10 s, optimal solutions which are much better than the result of the traditional method, and it can provide support for determining the LLPs to be replaced when determining the maintenance workscope of an aircraft engine. Therefore, the algorithm is applicable to engineering applications concerning the opportunistic replacement of multiple LLPs in aircraft engines.

  6. A Hybrid alldifferent-Tabu Search Algorithm for Solving Sudoku Puzzles

    Directory of Open Access Journals (Sweden)

    Ricardo Soto

    2015-01-01

    The Sudoku problem is a well-known logic-based puzzle of combinatorial number placement. It consists in filling an n² × n² grid, composed of n² columns, n² rows, and n² subgrids, each one containing the distinct integers from 1 to n². Such a puzzle belongs to the NP-complete collection of problems, for which diverse exact and approximate methods able to solve it exist. In this paper, we propose a new hybrid algorithm that smartly combines a classic tabu search procedure with the alldifferent global constraint from the constraint programming world. The alldifferent constraint is known to be efficient for domain filtering in the presence of constraints that must be pairwise different, which are exactly the kind of constraints that Sudokus have. This ability clearly alleviates the work of the tabu search, resulting in a faster and more robust approach for solving Sudokus. We illustrate interesting experimental results where our proposed algorithm outperforms the best results previously reported by hybrid and approximate methods.
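
    A small sketch of the kind of cost function a tabu search can minimise on Sudoku, counting alldifferent violations over rows, columns and subgrids (zero means solved). This is only the evaluation step under generic assumptions; the constraint-programming filtering and the tabu machinery of the hybrid are not shown.

```python
def alldifferent_violations(grid):
    """Cost for a local/tabu search on Sudoku: how many duplicate values break
    an alldifferent constraint over rows, columns and subgrids (0 = solved)."""
    n2 = len(grid)                  # the grid is n^2 x n^2, e.g. 9 x 9
    n = int(n2 ** 0.5)
    def dupes(values):
        vals = [v for v in values if v != 0]
        return len(vals) - len(set(vals))
    cost = sum(dupes(row) for row in grid)
    cost += sum(dupes([grid[r][c] for r in range(n2)]) for c in range(n2))
    for br in range(0, n2, n):
        for bc in range(0, n2, n):
            cost += dupes([grid[br + r][bc + c] for r in range(n) for c in range(n)])
    return cost

# Toy usage: a 4 x 4 Sudoku with a duplicated 3 in the first row.
grid = [[1, 2, 3, 3],
        [3, 4, 1, 2],
        [2, 1, 4, 3],
        [4, 3, 2, 1]]
print(alldifferent_violations(grid))    # 3 (row, column and subgrid violations)
```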

  7. Comparing imagery in The Greedy Hippo and Crouching Tiger, Hidden Dragon

    Directory of Open Access Journals (Sweden)

    John R. Botha

    2012-11-01

    This article takes as its point of departure the Ngano song-story entitled The Greedy Hippo as developed by Christiaan van der Westhuizen and based on the Tambani embroidery project. This animated interpretation of The Greedy Hippo is used to further the aims of interdisciplinary research as stated by the original project, by comparing its fantasy imagery with that of the Chinese film Crouching Tiger, Hidden Dragon by director Ang Lee. The analysis of the images in both works of art is based on Jungian archetypes, with specific reference to the need for an esoteric and imaginative reading of fantasy in the chosen works of art. Reference is made to the dominant role of realism as a style in Western art as opposed to the more esoteric, symbolic, and fantasy imagery of the art of other cultures, with emphasis on China. Chinese landscape painting and even poetry are briefly contextualised with regard to the chosen works of art. Finally, the different characters of both works of art are analysed within the contexts of their symbolic meanings as based on Jungian archetypes, with particular reference to the scenes where the characters are depicted as flying through the air, which are explained within the context of Wu Xia martial-arts-style films.

  8. Fast quantum search algorithm for databases of arbitrary size and its implementation in a cavity QED system

    International Nuclear Information System (INIS)

    Li, H.Y.; Wu, C.W.; Liu, W.T.; Chen, P.X.; Li, C.Z.

    2011-01-01

    We propose a method for implementing the Grover search algorithm directly in a database containing any number of items, based on multi-level systems. Compared with the searching procedure in a database with qubit encoding, our modified algorithm needs fewer iteration steps to find the marked item and uses the carriers of the information more economically. Furthermore, we illustrate how to realize our idea in cavity QED using the Zeeman level structure of atoms. The numerical simulation under the influence of cavity and atom decays shows that the scheme could be achieved efficiently within current state-of-the-art technology. -- Highlights: ► A modified Grover algorithm is proposed for searching in an arbitrary-dimensional Hilbert space. ► Our modified algorithm requires fewer iteration steps to find the marked item. ► The proposed method uses the carriers of the information more economically. ► A scheme for a six-item Grover search in cavity QED is proposed. ► Numerical simulation under decays shows that the scheme can be achieved with enough fidelity.
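
    The iteration-count behaviour can be checked classically: in the two-dimensional amplitude picture, after k Grover iterations the success probability is sin²((2k+1)θ) with θ = arcsin(√(M/N)), and the optimal k is roughly (π/4)√(N/M). The sketch below only evaluates this standard formula; it is not a simulation of the cavity-QED scheme itself.

```python
import math

def grover_success(n_items, marked=1):
    """Optimal Grover iteration count and the resulting success probability,
    tracked classically in the two-dimensional amplitude picture."""
    theta = math.asin(math.sqrt(marked / n_items))
    steps = max(1, round(math.pi / (4 * theta) - 0.5))
    # After k iterations the success probability is sin^2((2k + 1) * theta).
    return steps, math.sin((2 * steps + 1) * theta) ** 2

# Toy usage: a six-item database, the size discussed in the scheme above.
print(grover_success(6))
```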

  9. Unrelated Hematopoietic Stem Cell Donor Matching Probability and Search Algorithm

    Directory of Open Access Journals (Sweden)

    J.-M. Tiercy

    2012-01-01

    In transplantation of hematopoietic stem cells (HSCs) from unrelated donors, a high HLA compatibility level decreases the risk of acute graft-versus-host disease and mortality. The diversity of the HLA system at the allelic and haplotypic level and the heterogeneity of the HLA typing data of the registered donors render the search process a complex task. This paper summarizes our experience with a search algorithm that includes, at the start of the search, a probability estimate (high/intermediate/low) of identifying an HLA-A, B, C, DRB1, DQB1-compatible donor (a 10/10 match). Based on 2002–2011 searches, about 30% of patients have a high, 30% an intermediate, and 40% a low probability search. Search success rate and duration are presented and discussed in light of the experience of other centers. Overall, a 9-10/10 matched HSC donor can now be identified for 60–80% of patients of European descent. For high-probability searches, donors can be selected on the basis of DPB1 matching, with an estimated success rate of >40%. For low-probability searches there is no consensus on which HLA incompatibilities are more permissive, although HLA-DQB1 mismatches are generally considered acceptable. Models for the discrimination of more detrimental mismatches based on specific amino acid residues rather than specific HLA alleles are presented.

  10. A novel algorithm for validating peptide identification from a shotgun proteomics search engine.

    Science.gov (United States)

    Jian, Ling; Niu, Xinnan; Xia, Zhonghang; Samir, Parimal; Sumanasekera, Chiranthani; Mu, Zheng; Jennings, Jennifer L; Hoek, Kristen L; Allos, Tara; Howard, Leigh M; Edwards, Kathryn M; Weil, P Anthony; Link, Andrew J

    2013-03-01

    Liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS) has revolutionized the proteomics analysis of complexes, cells, and tissues. In a typical proteomic analysis, the tandem mass spectra from a LC-MS/MS experiment are assigned to a peptide by a search engine that compares the experimental MS/MS peptide data to theoretical peptide sequences in a protein database. The peptide spectra matches are then used to infer a list of identified proteins in the original sample. However, the search engines often fail to distinguish between correct and incorrect peptides assignments. In this study, we designed and implemented a novel algorithm called De-Noise to reduce the number of incorrect peptide matches and maximize the number of correct peptides at a fixed false discovery rate using a minimal number of scoring outputs from the SEQUEST search engine. The novel algorithm uses a three-step process: data cleaning, data refining through a SVM-based decision function, and a final data refining step based on proteolytic peptide patterns. Using proteomics data generated on different types of mass spectrometers, we optimized the De-Noise algorithm on the basis of the resolution and mass accuracy of the mass spectrometer employed in the LC-MS/MS experiment. Our results demonstrate De-Noise improves peptide identification compared to other methods used to process the peptide sequence matches assigned by SEQUEST. Because De-Noise uses a limited number of scoring attributes, it can be easily implemented with other search engines.

  11. Optimal Refueling Pattern Search for a CANDU Reactor Using a Genetic Algorithm

    International Nuclear Information System (INIS)

    Quang Binh, DO; Gyuhong, ROH; Hangbok, CHOI

    2006-01-01

    This paper presents the results from the application of genetic algorithms to a refueling optimization of a Canada deuterium uranium (CANDU) reactor. This work aims at making a mathematical model of the refueling optimization problem including the objective function and constraints and developing a method based on genetic algorithms to solve the problem. The model of the optimization problem and the proposed method comply with the key features of the refueling strategy of the CANDU reactor which adopts an on-power refueling operation. In this study, a genetic algorithm combined with an elitism strategy was used to automatically search for the refueling patterns. The objective of the optimization was to maximize the discharge burn-up of the refueling bundles, minimize the maximum channel power, or minimize the maximum change in the zone controller unit (ZCU) water levels. A combination of these objectives was also investigated. The constraints include the discharge burn-up, maximum channel power, maximum bundle power, channel power peaking factor and the ZCU water level. A refueling pattern that represents the refueling rate and channels was coded by a one-dimensional binary chromosome, which is a string of binary numbers 0 and 1. A computer program was developed in FORTRAN 90 running on an HP 9000 workstation to conduct the search for the optimal refueling patterns for a CANDU reactor at the equilibrium state. The results showed that it was possible to apply genetic algorithms to automatically search for the refueling channels of the CANDU reactor. The optimal refueling patterns were compared with the solutions obtained from the AUTOREFUEL program and the results were consistent with each other. (authors)

  12. Greedy and metaheuristics for the offline scheduling problem in grid computing

    DEFF Research Database (Denmark)

    Gamst, Mette

    In grid computing, a number of geographically distributed resources connected through a wide area network are utilized as one computational unit. The NP-hard offline scheduling problem in grid computing consists of assigning jobs to resources in advance. In this paper, five greedy heuristics and two metaheuristics are presented. All heuristics solve instances with up to 2000 jobs and 1000 resources, and thus the results are useful both with respect to running times and to solution values.
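
    As a flavour of the greedy side of the comparison, here is a generic list-scheduling heuristic: assign each job, longest first, to the resource that currently becomes free earliest. It is not necessarily one of the five heuristics of the paper; the job runtimes and resources below are toy data.

```python
def greedy_schedule(jobs, resources):
    """Greedy offline assignment: take the jobs longest first and place each on
    the resource that currently finishes earliest (list scheduling)."""
    finish = {r: 0.0 for r in resources}
    assignment = {}
    for job, runtime in sorted(jobs.items(), key=lambda kv: -kv[1]):
        r = min(finish, key=finish.get)      # earliest-available resource
        assignment[job] = r
        finish[r] += runtime
    return assignment, max(finish.values())

# Toy usage: five jobs with runtimes, two identical resources.
print(greedy_schedule({"j1": 4.0, "j2": 3.0, "j3": 3.0, "j4": 2.0, "j5": 2.0},
                      ["r1", "r2"]))
```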

  13. Identification of alternative splice variants in Aspergillus flavus through comparison of multiple tandem MS search algorithms

    Directory of Open Access Journals (Sweden)

    Chang Kung-Yen

    2011-07-01

    Background: Database searching is the most frequently used approach for automated peptide assignment and protein inference from tandem mass spectra. The results, however, depend on the sequences in target databases and on the search algorithms. Recently, by using an alternative splicing database, we identified more proteins than with the annotated proteins in Aspergillus flavus. In this study, we aimed at finding a greater number of eligible splice variants based on newly available transcript sequences and the latest genome annotation. The improved database was then used to compare four search algorithms: Mascot, OMSSA, X! Tandem, and InsPecT. Results: The updated alternative splicing database predicted 15833 putative protein variants, 61% more than the previous results. There was transcript evidence for 50% of the updated genes compared to the previous 35% coverage. Database searches were conducted using the same set of spectral data, search parameters, and protein database but with different algorithms. The false discovery rates of the peptide-spectrum matches were estimated. Conclusions: We were able to detect dozens of new peptides using the improved alternative splicing database with the recently updated annotation of the A. flavus genome. Unlike the identifications of the peptides and the RefSeq proteins, large variations existed between the putative splice variants identified by different algorithms. 12 candidates of putative isoforms were reported based on the consensus peptide-spectrum matches. This suggests that the application of multiple search engines effectively reduced the possible false positive results and validated the protein identifications from tandem mass spectra using an alternative splicing database.

  14. A Local Search Algorithm for the Flow Shop Scheduling Problem with Release Dates

    Directory of Open Access Journals (Sweden)

    Tao Ren

    2015-01-01

    This paper discusses the flow shop scheduling problem to minimize the makespan with release dates. By resequencing the jobs, a modified heuristic algorithm is obtained for handling large-sized problems. Moreover, based on some properties, a local search scheme is provided to improve the heuristic to gain high-quality solutions for moderate-sized problems. A sequence-independent lower bound is presented to evaluate the performance of the algorithms. A series of simulation results demonstrate the effectiveness of the proposed algorithms.

  15. Symbiotic organisms search algorithm for dynamic economic dispatch with valve-point effects

    Science.gov (United States)

    Sonmez, Yusuf; Kahraman, H. Tolga; Dosoglu, M. Kenan; Guvenc, Ugur; Duman, Serhat

    2017-05-01

    In this study, the symbiotic organisms search (SOS) algorithm is proposed to solve the dynamic economic dispatch problem with valve-point effects, which is one of the most important problems of the modern power system. Practical constraints such as valve-point effects, ramp rate limits and prohibited operating zones have been considered. The proposed algorithm was tested on five different test cases in 5-unit, 10-unit and 13-unit systems. The obtained results have been compared with those of other well-known metaheuristic methods reported before. The results show that the proposed algorithm has good convergence and produces better results than the other methods.
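
    The valve-point effect enters the dispatch objective as a rectified sinusoidal ripple added to the usual quadratic fuel cost, which is what makes the objective non-smooth and multimodal. A sketch of the per-unit cost follows, with hypothetical coefficients that are not taken from the paper's test cases.

```python
import math

def valve_point_cost(p, a, b, c, e, f, p_min):
    """Fuel cost with valve-point effects: a quadratic cost plus a rectified
    sinusoidal ripple, which makes the dispatch objective non-smooth."""
    return a + b * p + c * p * p + abs(e * math.sin(f * (p_min - p)))

# Toy usage: hypothetical coefficients for a single unit generating 300 MW.
print(valve_point_cost(300.0, a=550.0, b=8.1, c=0.00028, e=300.0, f=0.035, p_min=100.0))
```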

  16. GPU Based N-Gram String Matching Algorithm with Score Table Approach for String Searching in Many Documents

    Science.gov (United States)

    Srinivasa, K. G.; Shree Devi, B. N.

    2017-10-01

    String searching in documents has become a tedious task with the evolution of Big Data. The generation of large data sets demands high-performance search algorithms in areas such as text mining, information retrieval and many others. The popularity of GPUs for general-purpose computing has been increasing for various applications. Therefore it is of great interest to exploit the thread parallelism of a GPU to provide a high-performance search algorithm. This paper proposes an optimized new approach to the N-gram model for string search in a number of lengthy documents and its GPU implementation. The algorithm exploits GPGPUs for searching strings in many documents, employing character-level N-gram matching with a parallel Score Table approach and search using the CUDA API. The new approach of the Score Table, used for frequency storage of N-grams in a document, makes the search independent of the document's length and allows faster access to the frequency values, thus decreasing the search complexity. The extensive thread parallelism of the GPU has been exploited to enable parallel pre-processing of trigrams in a document for Score Table creation and parallel search in a huge number of documents, thus speeding up the whole search process even for a large pattern size. Experiments were carried out on many documents of varied length and search strings from the standard Lorem Ipsum text on NVIDIA's GeForce GT 540M GPU with 96 cores. The results show that the parallel approach to Score Table creation and searching gives a good speedup compared with the same approach executed serially.
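
    A CPU-side sketch of the Score Table idea under simple assumptions: build an n-gram frequency table per document once, then score a pattern by looking up its own n-grams, so the per-query cost depends on the pattern length rather than the document length. The CUDA kernels, thread layout and exact scoring of the paper are not reproduced.

```python
from collections import Counter

def score_table(text, n=3):
    """Score Table: frequency of every character-level n-gram in a document."""
    return Counter(text[i:i + n] for i in range(len(text) - n + 1))

def match_score(pattern, table, n=3):
    """Fraction of the pattern's n-grams found in a document's Score Table, so
    the per-query cost depends on the pattern length, not the document length."""
    grams = [pattern[i:i + n] for i in range(len(pattern) - n + 1)]
    if not grams:
        return 0.0
    return sum(1 for g in grams if table[g] > 0) / len(grams)

# Toy usage: build the tables once, then score a query against every document.
docs = {"d1": "lorem ipsum dolor sit amet", "d2": "consectetur adipiscing elit"}
tables = {name: score_table(text) for name, text in docs.items()}
print({name: match_score("ipsum", t) for name, t in tables.items()})
```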

  17. Hybridization of Strength Pareto Multiobjective Optimization with Modified Cuckoo Search Algorithm for Rectangular Array.

    Science.gov (United States)

    Abdul Rani, Khairul Najmy; Abdulmalek, Mohamedfareq; A Rahim, Hasliza; Siew Chin, Neoh; Abd Wahab, Alawiyah

    2017-04-20

    This research proposes several versions of the modified cuckoo search (MCS) metaheuristic algorithm deploying the strength Pareto evolutionary algorithm (SPEA) multiobjective (MO) optimization technique in rectangular array geometry synthesis. Specifically, the MCS algorithm is proposed by incorporating the Roulette-wheel selection operator to choose the initial host nests (individuals) that give better results, an adaptive inertia weight to control the position exploration of the potential best host nests (solutions), and a dynamic discovery rate to manage the fraction probability of finding the best host nests in the 3-dimensional search space. In addition, the MCS algorithm is hybridized with the particle swarm optimization (PSO) and hill climbing (HC) stochastic techniques along with the standard strength Pareto evolutionary algorithm (SPEA), forming the MCSPSOSPEA and MCSHCSPEA, respectively. All the proposed MCS-based algorithms are examined to perform MO optimization on Zitzler-Deb-Thiele's (ZDT's) test functions. Pareto-optimum trade-offs are made to generate a set of three non-dominated solutions, which are the locations, excitation amplitudes, and excitation phases of the array elements, respectively. Overall, simulations demonstrate that the proposed MCSPSOSPEA outperforms other compatible competitors in gaining high antenna directivity, small half-power beamwidth (HPBW), low average side lobe level (SLL) suppression, and/or significant predefined null mitigation, simultaneously.

  18. Using a Card Trick to Teach Discrete Mathematics

    Science.gov (United States)

    Simonson, Shai; Holm, Tara S.

    2003-01-01

    We present a card trick that can be used to review or teach a variety of topics in discrete mathematics. We address many subjects, including permutations, combinations, functions, graphs, depth first search, the pigeonhole principle, greedy algorithms, and concepts from number theory. Moreover, the trick motivates the use of computers in…

  19. Micro-seismic waveform matching inversion based on gravitational search algorithm and parallel computation

    Science.gov (United States)

    Jiang, Y.; Xing, H. L.

    2016-12-01

    Micro-seismic events induced by water injection, mining activity or oil/gas extraction are quite informative; their interpretation can be applied to the reconstruction of underground stress and the monitoring of hydraulic fracturing progress in oil/gas reservoirs. The source characteristics and locations are crucial parameters required for these purposes, and they can be obtained through the waveform matching inversion (WMI) method. It is therefore imperative to develop a WMI algorithm with high accuracy and convergence speed. Heuristic algorithms, as a category of nonlinear methods, possess a very high convergence speed and a good capacity to overcome local minima, and have been widely applied in many areas (e.g. image processing, artificial intelligence). However, their effectiveness for micro-seismic WMI is still poorly investigated; very few publications address this subject. In this research an advanced heuristic algorithm, the gravitational search algorithm (GSA), is proposed to estimate the focal mechanism (strike, dip and rake angles) and the source locations in three dimensions. Unlike traditional inversion methods, the heuristic algorithm inversion does not require the approximation of the Green's function. The method directly interacts with a CPU-parallelized finite difference forward modelling engine and updates the model parameters according to the GSA criteria. The effectiveness of this method is tested with synthetic data from a multi-layered elastic model; the results indicate that GSA can be well applied to WMI and has its unique advantages. Keywords: Micro-seismicity, Waveform matching inversion, gravitational search algorithm, parallel computation

  20. An analytical study of composite laminate lay-up using search algorithms for maximization of flexural stiffness and minimization of springback angle

    Science.gov (United States)

    Singh, Ranjan Kumar; Rinawa, Moti Lal

    2018-04-01

    The residual stresses arising in fiber-reinforced laminates during their curing in closed molds lead to changes in the composites after their removal from the molds and cooling. One of these dimensional changes of angle sections is called springback. Parameters such as lay-up, stacking sequence, material system, cure temperature and thickness play an important role in it. In the present work, we attempt to optimize the lay-up and stacking sequence for maximization of flexural stiffness and minimization of the springback angle. Search algorithms are employed to obtain the best sequence through a repair strategy such as swap. A new search algorithm, termed the lay-up search algorithm (LSA), is also proposed, which is an extension of the permutation search algorithm (PSA). The efficacy of the PSA and LSA is tested on laminates with a range of lay-ups. A computer code implementing the above schemes is developed in MATLAB. Strategies for multi-objective optimization using search algorithms are also suggested and tested.

  1. Algorithm of axial fuel optimization based in progressive steps of turned search

    International Nuclear Information System (INIS)

    Martin del Campo, C.; Francois, J.L.

    2003-01-01

    The development of an algorithm for the axial optimization of fuel for boiling water reactors (BWR) is presented. The algorithm is based on a serial optimization process in which the best solution of each stage is the starting point of the following stage. The objective function of each stage is adapted to orient the search toward better values of one or two parameters, leaving the rest as constraints. As the optimization advances through these stages, the fineness of the evaluation of the investigated designs is increased. The algorithm is based on three stages: Genetic Algorithms are used in the first one and Tabu Search in the two following ones. The objective function of the first stage seeks to minimize the average enrichment of the assembly and to fulfill the specified energy generation for the operation cycle, besides not violating any of the design basis limits. In the following stages the objective function seeks to minimize the power peaking factor (PPF) and to maximize the shutdown margin (SDM), having as constraints the average enrichment obtained for the best design of the first stage and the other restrictions. The third stage, very similar to the previous one, begins with the design of the previous stage but carries out a search of the shutdown margin at different exposure steps with three-dimensional (3D) calculations. An application to the design of the fresh assembly for the fourth fuel reload of the Unit 1 reactor of the Laguna Verde power plant (U1-CLV) is presented. The obtained results show an advance in the handling of optimization methods and in the construction of the objective functions that should be used for the different design stages of the fuel assemblies. (Author)

  2. Theoretical and Empirical Analyses of an Improved Harmony Search Algorithm Based on Differential Mutation Operator

    Directory of Open Access Journals (Sweden)

    Longquan Yong

    2012-01-01

    Full Text Available The harmony search (HS) method is an emerging metaheuristic optimization algorithm. In this paper, an improved harmony search method based on a differential mutation operator (IHSDE) is proposed to deal with optimization problems. Since population diversity plays an important role in the behavior of evolutionary algorithms, the aim of this paper is to calculate the expected population mean and variance of IHSDE from a theoretical viewpoint. Numerical results, compared with HSDE and NGHS, show that the IHSDE method has good convergence properties over a test suite of well-known benchmark functions.
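
    As a point of reference, a plain harmony search loop is short enough to sketch in Python; the version below is a generic HS for box-constrained minimisation with illustrative parameter values, and it does not include the differential mutation operator that distinguishes the IHSDE variant described above.

        import random

        def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.05, iters=5000):
            """Minimise f over box constraints with a basic Harmony Search."""
            dim = len(bounds)
            rand_x = lambda: [random.uniform(lo, hi) for lo, hi in bounds]
            memory = [rand_x() for _ in range(hms)]
            costs = [f(x) for x in memory]
            for _ in range(iters):
                new = []
                for j, (lo, hi) in enumerate(bounds):
                    if random.random() < hmcr:                 # memory consideration
                        value = random.choice(memory)[j]
                        if random.random() < par:              # pitch adjustment
                            value += random.uniform(-bw, bw) * (hi - lo)
                    else:                                      # random selection
                        value = random.uniform(lo, hi)
                    new.append(min(max(value, lo), hi))
                cost = f(new)
                worst = max(range(hms), key=lambda i: costs[i])
                if cost < costs[worst]:                        # replace the worst harmony
                    memory[worst], costs[worst] = new, cost
            best = min(range(hms), key=lambda i: costs[i])
            return memory[best], costs[best]

        # Example: the sphere function, a common benchmark.
        x_best, f_best = harmony_search(lambda x: sum(v * v for v in x), [(-5, 5)] * 4)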

  3. A Sustainable City Planning Algorithm Based on TLBO and Local Search

    Science.gov (United States)

    Zhang, Ke; Lin, Li; Huang, Xuanxuan; Liu, Yiming; Zhang, Yonggang

    2017-09-01

    Nowadays, how to design a city with more sustainable features has become a central problem in the field of social development, and it has provided a broad stage for the application of artificial intelligence theories and methods. Because the design of a sustainable city is essentially a constrained optimization problem, extensively researched swarm intelligence algorithms have become natural candidates for solving it. The TLBO (Teaching-Learning-Based Optimization) algorithm is a new swarm intelligence algorithm. Its inspiration comes from the "teaching" and "learning" behavior of a class in real life. The evolution of the population is realized by simulating the teacher's "teaching" and the students "learning" from each other; the algorithm has few parameters, is efficient, conceptually simple and easy to implement. It has been successfully applied to scheduling, planning, configuration and other fields, achieving good results and attracting growing attention from artificial intelligence researchers. Based on the classical TLBO algorithm, we propose a TLBO_LS algorithm combined with local search. We design and implement the random generation algorithm and the evaluation model of the urban planning problem. Experiments on small and medium-sized randomly generated problems show that our proposed algorithm has clear advantages over the DE algorithm and the classical TLBO algorithm in terms of convergence speed and solution quality.

  4. LiveWire interactive boundary extraction algorithm based on Haar wavelet transform and control point set direction search

    Science.gov (United States)

    Cheng, Jun; Zhang, Jun; Tian, Jinwen

    2015-12-01

    Based on a deep analysis of the LiveWire interactive boundary extraction algorithm, a new algorithm focusing on improving the speed of the LiveWire algorithm is proposed in this paper. Firstly, the Haar wavelet transform is carried out on the input image, and the boundary is extracted on the low-resolution image obtained by the wavelet transform of the input image. Secondly, the LiveWire shortest path is calculated based on the control point set direction search, utilizing the spatial relationship between the two control points users provide in real time. Thirdly, the search order of the points adjacent to the starting node is set in advance. An ordinary queue instead of a priority queue is taken as the storage pool of the points when optimizing their shortest path values, thus reducing the complexity of the algorithm from O(n²) to O(n). Finally, a region iterative backward projection method based on neighborhood pixel polling is used to convert the dual-pixel boundary of the reconstructed image to a single-pixel boundary after the inverse Haar wavelet transform. The algorithm proposed in this paper combines the advantage of the Haar wavelet transform with the advantage of the optimal path searching method based on control point set direction search: the former decomposes and reconstructs the image quickly and is more consistent with the texture features of the image, while the latter reduces the time complexity of the original algorithm. The algorithm can therefore improve the speed of interactive boundary extraction and reflect the boundary information of the image more comprehensively. All of the methods mentioned above contribute to improving the execution efficiency and robustness of the algorithm.

  5. User Adapted Motor-Imaginary Brain-Computer Interface by means of EEG Channel Selection Based on Estimation of Distribution Algorithms

    Directory of Open Access Journals (Sweden)

    Aitzol Astigarraga

    2016-01-01

    Full Text Available Brain-Computer Interfaces (BCIs) have become a research field with interesting applications, and it can be inferred from published papers that different persons activate different parts of the brain to perform the same action. This paper presents a personalized interface design method, based on channel selection, for electroencephalogram (EEG) based BCIs. We describe a novel two-step method in which, firstly, a computationally inexpensive greedy algorithm finds an adequate search range; then, an Estimation of Distribution Algorithm (EDA) is applied in the reduced range to obtain the optimal channel subset. The use of the EDA allows us to select the subset of channels with the strongest interactions, removing the irrelevant and noisy ones, and thus to obtain the most discriminative subset of channels for each user, improving accuracy. The method is tested on the IIIa dataset from BCI competition III. Experimental results show that the resulting channel subset is consistent with motor-imaginary-related neurophysiological principles and, at the same time, optimizes performance while reducing the number of channels.
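
    The greedy first step can be illustrated with a short Python sketch of forward channel selection; `evaluate` is a hypothetical placeholder for a user-supplied score (e.g. cross-validated classification accuracy), and the subsequent EDA refinement stage is not shown.

        def greedy_channel_selection(channels, evaluate, max_channels):
            """Greedy forward selection of an EEG channel subset.

            At each step the channel whose addition improves the score the most
            is appended; the loop stops when no channel helps or the budget is
            reached.  `evaluate(subset)` is a problem-specific callback."""
            selected, best_score = [], float("-inf")
            remaining = list(channels)
            while remaining and len(selected) < max_channels:
                scored = [(evaluate(selected + [c]), c) for c in remaining]
                score, channel = max(scored, key=lambda t: t[0])
                if score <= best_score:          # stop when no channel improves the score
                    break
                selected.append(channel)
                remaining.remove(channel)
                best_score = score
            return selected, best_score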

  6. Cost Forecasting of Substation Projects Based on Cuckoo Search Algorithm and Support Vector Machines

    Directory of Open Access Journals (Sweden)

    Dongxiao Niu

    2018-01-01

    Full Text Available Accurate prediction of substation project cost is helpful for improving investment management and sustainability, and it is directly related to the economics of the substation project. Ensemble Empirical Mode Decomposition (EEMD) can decompose non-stationary signal sequences into components with significant regularity and periodicity, which helps improve the accuracy of the prediction model. Adding a Gaussian perturbation to the traditional Cuckoo Search (CS) algorithm can improve its search vigor and precision. Thus, the parameters and kernel functions of the Support Vector Machines (SVM) model are optimized. Comparison of the prediction results with those of other models shows that this model has higher prediction accuracy.

  7. Gravitation search algorithm: Application to the optimal IIR filter design

    Directory of Open Access Journals (Sweden)

    Suman Kumar Saha

    2014-01-01

    Full Text Available This paper presents a global heuristic search optimization technique known as the Gravitation Search Algorithm (GSA) for the design of 8th order Infinite Impulse Response (IIR) low pass (LP), high pass (HP), band pass (BP) and band stop (BS) filters, considering various non-linear characteristics of the filter design problems. This paper also adopts a novel fitness function in order to improve the stop band attenuation to a great extent. In GSA, the law of gravity and mass interactions among different particles are adopted for handling the non-linear IIR filter design optimization problem. In this optimization technique, the searcher agents are a collection of masses, and the interactions among them are governed by Newtonian gravity and the laws of motion. The performance of the GSA-based IIR filter designs has proven to be superior to that obtained by the real coded genetic algorithm (RGA) and standard Particle Swarm Optimization (PSO). Extensive simulation results affirm that the proposed approach using GSA outperforms its counterparts not only in terms of output quality, i.e., sharpness at cut-off, smaller pass band ripple and higher stop band attenuation, but also in convergence speed, with assured stability.
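
    The mechanics the record describes (fitness-dependent masses, a decaying gravitational constant, and Newtonian force/motion updates) can be sketched as a generic minimiser in Python; this is a textbook-style GSA on a toy quadratic objective, not the paper's IIR-specific fitness function, and every parameter value is illustrative.

        import numpy as np

        def gsa(f, bounds, agents=30, iters=200, g0=100.0, alpha=20.0, eps=1e-12):
            """Minimise f with a basic Gravitational Search Algorithm sketch."""
            lo = np.array([b[0] for b in bounds], dtype=float)
            hi = np.array([b[1] for b in bounds], dtype=float)
            dim = len(bounds)
            x = lo + np.random.rand(agents, dim) * (hi - lo)
            v = np.zeros((agents, dim))
            for t in range(iters):
                fit = np.array([f(xi) for xi in x])
                best, worst = fit.min(), fit.max()
                m = (fit - worst) / (best - worst + eps)       # better fitness -> heavier mass
                mass = m / (m.sum() + eps)
                g = g0 * np.exp(-alpha * t / iters)            # gravitational "constant" decays
                acc = np.zeros((agents, dim))
                for i in range(agents):
                    for j in range(agents):
                        if i == j:
                            continue
                        diff = x[j] - x[i]
                        dist = np.linalg.norm(diff)
                        # acceleration of i due to j: rand * G * M_j * (x_j - x_i) / (R + eps)
                        acc[i] += np.random.rand() * g * mass[j] * diff / (dist + eps)
                v = np.random.rand(agents, dim) * v + acc      # law of motion
                x = np.clip(x + v, lo, hi)
            fit = np.array([f(xi) for xi in x])
            return x[fit.argmin()], fit.min()

        # Example run on a simple quadratic objective.
        best_x, best_f = gsa(lambda z: float(np.sum(z ** 2)), [(-5, 5)] * 3)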

  8. A Cooperative Search and Coverage Algorithm with Controllable Revisit and Connectivity Maintenance for Multiple Unmanned Aerial Vehicles

    Directory of Open Access Journals (Sweden)

    Zhong Liu

    2018-05-01

    Full Text Available In this paper, we mainly study a cooperative search and coverage algorithm for a given bounded rectangular region, which contains several unknown stationary targets, by a team of unmanned aerial vehicles (UAVs) with non-ideal sensors and limited communication ranges. Our goal is to minimize the search time while gathering more information about the environment and finding more targets. For this purpose, a novel cooperative search and coverage algorithm with a controllable revisit mechanism is presented. Firstly, as the representation of the environment, cognitive maps that include the target probability map (TPM), the uncertainty map (UM), and the digital pheromone map (DPM) are constructed. We also design a distributed update and fusion scheme for the cognitive maps. This update and fusion scheme guarantees that each of the cognitive maps converges to the same map, which reflects the targets' true existence or absence in each cell of the search region. Secondly, we develop a controllable revisit mechanism based on the DPM. This mechanism can concentrate the UAVs to revisit sub-areas that have a large target probability or high uncertainty. Thirdly, in the frame of distributed receding horizon optimization, a path planning algorithm for multi-UAV cooperative search and coverage is designed. In the path planning algorithm, the movement of the UAVs is restricted by potential fields to meet the requirements of collision avoidance and connectivity maintenance constraints. Moreover, using the minimum spanning tree (MST) topology optimization strategy, we can obtain a tradeoff between search coverage enhancement and connectivity maintenance. The feasibility of the proposed algorithm is demonstrated by comparative simulations analyzing the effects of the controllable revisit mechanism and the connectivity maintenance scheme. The Monte Carlo method is employed to validate the influence of the number of UAVs, the sensing radius

  9. AHP 47: THE GREEDY KING AND TRICKY MAN

    Directory of Open Access Journals (Sweden)

    Lcags so lhun 'grub ལྕགས་སོ་ལྷུན་འགྲུབ། (Klu sgrub ཀླུ་སྒྲུབ།)

    2017-04-01

    Full Text Available Rgya mo skyid (b. 1992) of Mdo ba Town, Reb gong (Thun rin, Tongren) County, Rma lho (Huangnan) Tibetan Autonomous Prefecture, Mtsho sngon (Qinghai) Province, China told me this story in an apartment in Xi'an City, Shaanxi Province on 21 August, 2016. She said, "When I was about five years old, my grandfather (Kun bzang, b. 1939) told me many stories such as this before we went to bed every night. I forgot many stories, but this story is still very clear." There was once a greedy local king who collected taxes monthly. There was also a very poor man known as Tricky Tsag thul. The local king came to Tricky's home to punish him for not paying his taxes for several months. ...

  10. Online evolution for multi-action adversarial games

    DEFF Research Database (Denmark)

    Justesen, Niels Orsleff; Mahlmann, Tobias; Togelius, Julian

    2016-01-01

    the combination of atomic actions that make up a single move, with a state evaluation function used for fitness. We implement Online Evolution for the turn-based multi-action game Hero Academy and compare it with a standard Monte Carlo Tree Search implementation as well as two types of greedy algorithms. Online...

  11. System network planning expansion using mathematical programming, genetic algorithms and tabu search

    International Nuclear Information System (INIS)

    Sadegheih, A.; Drake, P.R.

    2008-01-01

    In this paper, system network planning expansion is formulated for mixed integer programming, a genetic algorithm (GA) and tabu search (TS). Compared with other optimization methods, GAs are suitable for traversing large search spaces, since they can do this relatively rapidly and because the use of mutation diverts the method away from local minima, which tend to become more common as the search space increases in size. GAs give an excellent trade-off between solution quality and computing time, and flexibility for taking into account specific constraints in real situations. TS has emerged as a new, highly efficient search paradigm for finding quality solutions to combinatorial problems. It is characterized by gathering knowledge during the search and subsequently profiting from this knowledge. The attractiveness of the technique comes from its ability to escape local optimality. The cost function of this problem consists of the capital investment cost in discrete form, the cost of transmission losses and the power generation costs. The DC load flow equations for the network are embedded in the constraints of the mathematical model to avoid the sub-optimal solutions that can arise if such constraints are enforced in an indirect way. The solution of the model gives the best line additions and also provides information regarding the optimal generation at each generation point. This method of solution is demonstrated on the expansion of a 10 bus bar system to 18 bus bars. Finally, a steady-state genetic algorithm is employed rather than generational replacement, and uniform crossover is used.

  12. Heuristic algorithms for solving of the tool routing problem for CNC cutting machines

    Science.gov (United States)

    Chentsov, P. A.; Petunin, A. A.; Sesekin, A. N.; Shipacheva, E. N.; Sholohov, A. E.

    2015-11-01

    The article is devoted to the problem of minimizing the path of the cutting tool for shape cutting machines. This problem can be interpreted as a generalized traveling salesman problem. An earlier version of a dynamic programming method was developed to solve this problem. Unfortunately, that method can only handle up to thirty circuits. In this regard, the task of constructing a quasi-optimal route becomes relevant. In this paper we propose several quasi-optimal greedy algorithms. A comparison of the results of the exact and approximate algorithms is given.
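
    A typical greedy construction of this kind is the nearest-neighbour rule, sketched below in Python for plain Euclidean pierce points; this is a generic illustration of a quasi-optimal greedy route, not the authors' specific algorithm for the generalized traveling salesman formulation.

        import math

        def greedy_route(points, start=0):
            """Greedy (nearest-neighbour) construction of a cutting-tool route.

            `points` are (x, y) pierce points; the tool always moves next to the
            closest unvisited point."""
            unvisited = set(range(len(points))) - {start}
            route, current, length = [start], start, 0.0
            while unvisited:
                nxt = min(unvisited, key=lambda j: math.dist(points[current], points[j]))
                length += math.dist(points[current], points[nxt])
                route.append(nxt)
                unvisited.remove(nxt)
                current = nxt
            return route, length

        # Toy instance with five pierce points.
        route, travel = greedy_route([(0, 0), (4, 1), (1, 3), (5, 5), (2, 2)])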

  13. Derivation and validation of the automated search algorithms to identify cognitive impairment and dementia in electronic health records.

    Science.gov (United States)

    Amra, Sakusic; O'Horo, John C; Singh, Tarun D; Wilson, Gregory A; Kashyap, Rahul; Petersen, Ronald; Roberts, Rosebud O; Fryer, John D; Rabinstein, Alejandro A; Gajic, Ognjen

    2017-02-01

    Long-term cognitive impairment is a common and important problem in survivors of critical illness. We developed electronic search algorithms to identify cognitive impairment and dementia from electronic medical records (EMRs), which provide an opportunity for big data analysis. Eligible patients met 2 criteria. First, they had a formal cognitive evaluation by The Mayo Clinic Study of Aging. Second, they were hospitalized in the intensive care unit at our institution between 2006 and 2014. The "criterion standard" for diagnosis was formal cognitive evaluation supplemented by input from an expert neurologist. Using all available EMR data, we developed and improved our algorithms in the derivation cohort and validated them in the independent validation cohort. Of 993 participants who underwent formal cognitive testing and were hospitalized in the intensive care unit, we selected 151 participants at random to form the derivation and validation cohorts. The automated electronic search algorithm for cognitive impairment was 94.3% sensitive and 93.0% specific. The search algorithms for dementia achieved a respective sensitivity and specificity of 97% and 99%. The EMR search algorithms significantly outperformed International Classification of Diseases codes. Automated EMR data extractions for cognitive impairment and dementia are reliable and accurate and can serve as acceptable and efficient alternatives to time-consuming manual data review. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Optimized hyperspectral band selection using hybrid genetic algorithm and gravitational search algorithm

    Science.gov (United States)

    Zhang, Aizhu; Sun, Genyun; Wang, Zhenjie

    2015-12-01

    The serious information redundancy in hyperspectral images (HIs) does not contribute to data analysis accuracy; instead it requires expensive computational resources. Consequently, to identify the most useful and valuable information in the HIs, and thereby improve the accuracy of data analysis, this paper proposes a novel hyperspectral band selection method using a hybrid genetic algorithm and gravitational search algorithm (GA-GSA). In the proposed method, the GA-GSA is first mapped to the binary space. Then, the accuracy of the support vector machine (SVM) classifier and the number of selected spectral bands are utilized to measure the discriminative capability of the band subset. Finally, the band subset with the smallest number of spectral bands that covers the most useful and valuable information is obtained. To verify the effectiveness of the proposed method, studies conducted on an AVIRIS image against two recently proposed state-of-the-art GSA variants are presented. The experimental results revealed the superiority of the proposed method and indicated that the method can indeed considerably reduce data storage costs and efficiently identify the band subset with stable and high classification precision.

  15. A Harmony Search Algorithm approach for optimizing traffic signal timings

    Directory of Open Access Journals (Sweden)

    Mauro Dell'Orco

    2013-07-01

    Full Text Available In this study, a bi-level formulation is presented for solving the Equilibrium Network Design Problem (ENDP). The optimisation of the signal timings has been carried out at the upper level using the Harmony Search Algorithm (HSA), whilst the traffic assignment has been carried out through the Path Flow Estimator (PFE) at the lower level. The results of the HSA have first been compared with those obtained using the Genetic Algorithm and Hill Climbing on a two-junction network for a fixed set of link flows. Secondly, the HSA with the PFE has been applied to a medium-sized network to show the applicability of the proposed algorithm in solving the ENDP. Additionally, in order to test the sensitivity to perceived travel time error, we have used the HSA with the PFE at various levels of perceived travel time. The results show that the proposed method is quite simple and efficient in solving the ENDP.

  16. THE ALGORITHM AND PROGRAM OF M-MATRICES SEARCH AND STUDY

    Directory of Open Access Journals (Sweden)

    Y. N. Balonin

    2013-05-01

    Full Text Available The algorithm and software for the search and study of orthogonal basis matrices – minimax matrices (M-matrices) – are considered. The algorithm scheme is shown, comments on the calculation blocks are given, and the interface of the MMatrix software system, developed with the participation of the authors, is explained. The results of the universal algorithm's work are presented as Hadamard matrices, Belevitch matrices (C-matrices, conference matrices) and matrices of even and odd orders complementary to and closely related to them by their properties, in particular the matrix of order 22, for which there is no C-matrix. Examples of portraits of alternative matrices of orders 255 and 257 are given, corresponding to the sequences of Mersenne and Fermat numbers. A new way to obtain Hadamard matrices is explained, different from the previously known procedures based on iterative processes and calculations of Legendre symbols, with theoretical and practical significance.

  17. New reference trajectory optimization algorithm for a flight management system inspired in beam search

    Directory of Open Access Journals (Sweden)

    Alejandro MURRIETA-MENDOZA

    2017-08-01

    Full Text Available With the objective of reducing the flight cost and the amount of polluting emissions released into the atmosphere, a new optimization algorithm considering the climb, cruise and descent phases is presented for the reference vertical flight trajectory. The selection of the reference vertical navigation speeds and altitudes was solved as a discrete combinatorial problem by means of a graph-tree traversed through its nodes using the beam search optimization technique. To achieve a compromise between the execution time and the algorithm's ability to find the global optimal solution, a heuristic methodology introducing a parameter called the "optimism coefficient" was used in order to estimate the trajectory's flight cost at every node. The optimal trajectory cost obtained with the developed algorithm was compared with the cost of the optimal trajectory provided by a commercial flight management system (FMS). The global optimal solution was validated against an exhaustive search algorithm (ESA), other than the proposed algorithm. The developed algorithm takes into account weather effects, step climbs during cruise and air traffic management constraints such as constant altitude segments, constant cruise Mach, and a pre-defined reference lateral navigation route. The aircraft fuel burn was computed using a numerical performance model which was created and validated using flight test experimental data.
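
    Generic beam search of the kind the record builds on keeps only the most promising partial solutions at each level of the decision tree; the following Python sketch shows the idea on a toy three-step altitude choice, with all names and the cost model purely illustrative (the record's "optimism coefficient" heuristic is not modelled).

        def beam_search(initial, expand, cost, beam_width, is_goal):
            """Generic beam search: expand every state in the beam, keep the
            `beam_width` cheapest successors, and remember the best goal seen."""
            beam = [initial]
            best = None
            while beam:
                candidates = [s for state in beam for s in expand(state)]
                if not candidates:
                    break
                for s in candidates:
                    if is_goal(s) and (best is None or cost(s) < cost(best)):
                        best = s
                candidates.sort(key=cost)
                beam = candidates[:beam_width]        # prune to the most promising nodes
            return best

        # Toy usage: pick 3 altitude steps minimising a synthetic deviation cost.
        levels = [(30, 32, 34), (32, 34, 36), (34, 36, 38)]
        expand = lambda s: [s + (a,) for a in levels[len(s)]] if len(s) < 3 else []
        cost = lambda s: sum(abs(a - 35) for a in s)
        print(beam_search((), expand, cost, beam_width=2, is_goal=lambda s: len(s) == 3))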

  18. Greedy heuristics for minimization of number of terminal nodes in decision trees

    KAUST Repository

    Hussain, Shahid

    2014-10-01

    This paper describes, in detail, several greedy heuristics for the construction of decision trees. We study the number of terminal nodes of decision trees, which is closely related to the cardinality of the set of rules corresponding to the tree. We compare these heuristics empirically for two different types of datasets (datasets acquired from the UCI ML Repository and randomly generated data) and also compare them with the optimal results obtained using a dynamic programming method.

  20. Categorization and Searching of Color Images Using Mean Shift Algorithm

    Directory of Open Access Journals (Sweden)

    Prakash PANDEY

    2009-07-01

    Full Text Available Now a day’s Image Searching is still a challenging problem in content based image retrieval (CBIR system. Most CBIR system operates on all images without pre-sorting the images. The image search result contains many unrelated image. The aim of this research is to propose a new object based indexing system Based on extracting salient region representative from the image, categorizing the image into different types and search images that are similar to given query images.In our approach, the color features are extracted using the mean shift algorithm, a robust clustering technique, Dominant objects are obtained by performing region grouping of segmented thumbnails. The category for an image is generated automatically by analyzing the image for the presence of a dominant object. The images in the database are clustered based on region feature similarity using Euclidian distance. Placing an image into a category can help the user to navigate retrieval results more effectively. Extensive experimental results illustrate excellent performance.

  1. Greedy solution of ill-posed problems: error bounds and exact inversion

    International Nuclear Information System (INIS)

    Denis, L; Lorenz, D A; Trede, D

    2009-01-01

    The orthogonal matching pursuit (OMP) is a greedy algorithm to solve sparse approximation problems. Sufficient conditions for exact recovery are known with and without noise. In this paper we investigate the applicability of the OMP for the solution of ill-posed inverse problems in general, and in particular for two deconvolution examples from mass spectrometry and digital holography, respectively. In sparse approximation problems one often has to deal with the redundancy of a dictionary, i.e. the atoms are not linearly independent. However, one expects them to be approximately orthogonal, and this is quantified by the so-called incoherence. This idea cannot be transferred to ill-posed inverse problems since here the atoms are typically far from orthogonal. The ill-posedness of the operator may cause the correlation of two distinct atoms to become huge, i.e. two atoms can look very much alike. Therefore, one needs conditions which take the structure of the problem into account and work without the concept of coherence. In this paper we develop results for the exact recovery of the support of noisy signals. In the two examples, mass spectrometry and digital holography, we show that our results lead to practically relevant estimates such that one may check a priori if the experimental setup guarantees exact deconvolution with OMP. Especially in the example from digital holography, our analysis may be regarded as a first step towards calculating the resolution power of droplet holography
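
    For readers unfamiliar with the algorithm, a standard OMP iteration is easy to state in NumPy: pick the dictionary column most correlated with the residual, refit on the enlarged support by least squares, and repeat. The sketch below is the textbook version of this greedy scheme on a well-conditioned random dictionary (unlike the ill-posed operators analysed in the record), with illustrative names throughout.

        import numpy as np

        def omp(A, y, n_nonzero, tol=1e-10):
            """Orthogonal matching pursuit: greedily build a sparse solution of Ax = y."""
            residual = y.astype(float).copy()
            support = []
            x = np.zeros(A.shape[1])
            for _ in range(n_nonzero):
                correlations = np.abs(A.T @ residual)
                correlations[support] = 0.0                # do not pick a column twice
                support.append(int(np.argmax(correlations)))
                coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
                residual = y - A[:, support] @ coef        # refit on the support, update residual
                if np.linalg.norm(residual) < tol:
                    break
            x[support] = coef
            return x

        # Small demo: recover a 2-sparse vector from noiseless measurements.
        rng = np.random.default_rng(0)
        A = rng.standard_normal((20, 40))
        x_true = np.zeros(40); x_true[[5, 17]] = [1.5, -2.0]
        x_hat = omp(A, A @ x_true, n_nonzero=2)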

  2. An adaptive image enhancement technique by combining cuckoo search and particle swarm optimization algorithm.

    Science.gov (United States)

    Ye, Zhiwei; Wang, Mingwei; Hu, Zhengbing; Liu, Wei

    2015-01-01

    Image enhancement is an important procedure of image processing and analysis. This paper presents a new technique using a modified measure and a blending of cuckoo search and particle swarm optimization (CS-PSO) to enhance low contrast images adaptively. In this way, contrast enhancement is obtained by global transformation of the input intensities; it employs the incomplete Beta function as the transformation function and a novel criterion for measuring image quality considering three factors, which are the threshold, the entropy value, and the gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail in an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared with other existing techniques such as linear contrast stretching, histogram equalization, and evolutionary-computing-based image enhancement methods like the backtracking search algorithm, differential search algorithm, genetic algorithm, and particle swarm optimization in terms of processing time and image quality. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered in the paper.

  3. An Adaptive Image Enhancement Technique by Combining Cuckoo Search and Particle Swarm Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Zhiwei Ye

    2015-01-01

    Full Text Available Image enhancement is an important procedure of image processing and analysis. This paper presents a new technique using a modified measure and a blending of cuckoo search and particle swarm optimization (CS-PSO) to enhance low contrast images adaptively. In this way, contrast enhancement is obtained by global transformation of the input intensities; it employs the incomplete Beta function as the transformation function and a novel criterion for measuring image quality considering three factors, which are the threshold, the entropy value, and the gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail in an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared with other existing techniques such as linear contrast stretching, histogram equalization, and evolutionary-computing-based image enhancement methods like the backtracking search algorithm, differential search algorithm, genetic algorithm, and particle swarm optimization in terms of processing time and image quality. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered in the paper.

  4. Upper-Lower Bounds Candidate Sets Searching Algorithm for Bayesian Network Structure Learning

    Directory of Open Access Journals (Sweden)

    Guangyi Liu

    2014-01-01

    Full Text Available The Bayesian network is an important theoretical model in the artificial intelligence field and a powerful tool for handling uncertainty. Considering the slow convergence speed of current Bayesian network structure learning algorithms, a fast hybrid learning method is proposed in this paper. We start with a further analysis of the information provided by low-order conditional independence testing, and then two methods are given for constructing graph models of the network, which are theoretically proved to be upper and lower bounds of the structure space of the target network, so that candidate sets are obtained as a result; after that, a search-and-score algorithm is applied to the candidate sets to find the final structure of the network. Simulation results show that the algorithm proposed in this paper is more efficient than similar algorithms with the same learning precision.

  5. Hooke–Jeeves Method-used Local Search in a Hybrid Global Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    V. D. Sulimov

    2014-01-01

    Full Text Available Modern methods for the optimization investigation of complex systems are based on developing and updating mathematical models of the systems by solving the appropriate inverse problems. The input data required for the solution are obtained from the analysis of experimentally determined characteristics of a system or a process. The sought causal characteristics include the equation coefficients of the mathematical models of the object, boundary conditions, etc. The optimization approach is one of the main ways to solve inverse problems. In the general case it is necessary to find a global extremum of a criterion function that is not everywhere differentiable. Global optimization methods are widely used in problems of identification and computational diagnostics as well as in optimal control, computed tomography, image restoration, neural network training and other intelligent technologies. The increasingly complicated optimization systems observed during the last decades lead to more complicated mathematical models, thereby making the solution of the corresponding extremal problems significantly more difficult. Many practical applications have problem conditions that restrict modeling. As a consequence, in inverse problems the criterion functions can be not everywhere differentiable and noisy. The presence of noise means that calculating the derivatives is difficult and unreliable, which motivates the use of optimization methods that do not require derivatives. The efficiency of deterministic global optimization algorithms is significantly restricted by their dependence on the dimension of the extremal problem. When the number of variables is large, stochastic global optimization algorithms are used. As stochastic algorithms yield expensive solutions, this drawback restricts their applications. This motivates developing hybrid algorithms that combine a stochastic algorithm for scanning the variable space with deterministic local search
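
    The deterministic local phase named in the title, Hooke-Jeeves pattern search, is straightforward to sketch in Python; the version below is a plain derivative-free implementation for illustration (parameter values are arbitrary), and the stochastic global scan that the hybrid algorithm wraps around it is not included.

        def hooke_jeeves(f, x0, step=0.5, shrink=0.5, tol=1e-6, max_iter=10000):
            """Derivative-free Hooke-Jeeves pattern search (local phase only)."""
            def explore(base, fbase, h):
                # Probe each coordinate in both directions, keeping improvements.
                x, fx = list(base), fbase
                for i in range(len(x)):
                    for delta in (h, -h):
                        trial = list(x)
                        trial[i] += delta
                        ft = f(trial)
                        if ft < fx:
                            x, fx = trial, ft
                            break
                return x, fx

            best, fbest = list(x0), f(x0)
            for _ in range(max_iter):
                if step < tol:
                    break
                cand, fcand = explore(best, fbest, step)
                if fcand < fbest:
                    # Pattern move: extrapolate along the successful direction.
                    pattern = [2 * c - b for c, b in zip(cand, best)]
                    pat, fpat = explore(pattern, f(pattern), step)
                    best, fbest = (pat, fpat) if fpat < fcand else (cand, fcand)
                else:
                    step *= shrink            # no improvement: shrink the step size
            return best, fbest

        # Example on the 2-D Rosenbrock function.
        x_min, f_min = hooke_jeeves(lambda x: (1 - x[0])**2 + 100*(x[1] - x[0]**2)**2, [-1.0, 1.0])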

  6. In-depth analysis of protein inference algorithms using multiple search engines and well-defined metrics.

    Science.gov (United States)

    Audain, Enrique; Uszkoreit, Julian; Sachsenberg, Timo; Pfeuffer, Julianus; Liang, Xiao; Hermjakob, Henning; Sanchez, Aniel; Eisenacher, Martin; Reinert, Knut; Tabb, David L; Kohlbacher, Oliver; Perez-Riverol, Yasset

    2017-01-06

    In mass spectrometry-based shotgun proteomics, protein identifications are usually the desired result. However, most of the analytical methods are based on the identification of reliable peptides and not the direct identification of intact proteins. Thus, assembling peptides identified from tandem mass spectra into a list of proteins, referred to as protein inference, is a critical step in proteomics research. Currently, different protein inference algorithms and tools are available for the proteomics community. Here, we evaluated five software tools for protein inference (PIA, ProteinProphet, Fido, ProteinLP, MSBayesPro) using three popular database search engines: Mascot, X!Tandem, and MS-GF+. All the algorithms were evaluated using a highly customizable KNIME workflow on four different public datasets with varying complexities (different sample preparation, species and analytical instruments). We defined a set of quality control metrics to evaluate the performance of each combination of search engine, protein inference algorithm, and parameters on each dataset. We show that the results for complex samples vary not only regarding the actual numbers of reported protein groups but also concerning the actual composition of the groups. Furthermore, the robustness of reported proteins when using databases of differing complexities is strongly dependent on the applied inference algorithm. Finally, merging the identifications of multiple search engines does not necessarily increase the number of reported proteins, but it does increase the number of peptides per protein and can thus generally be recommended. Protein inference is one of the major challenges in MS-based proteomics nowadays. Currently, there are a vast number of protein inference algorithms and implementations available for the proteomics community. Protein assembly impacts the final results of the research, the quantitation values and the final claims in the research manuscript. Even though protein

  7. ACTION OF UNIFORM SEARCH ALGORITHM WHEN SELECTING LANGUAGE UNITS IN THE PROCESS OF SPEECH

    Directory of Open Access Journals (Sweden)

    Ирина Михайловна Некипелова

    2013-05-01

    Full Text Available The article is devoted to research on the action of a uniform search algorithm in a human's selection of language units for speech production. The process is connected with the phenomenon of speech optimization, which makes it possible to shorten the time spent thinking about what one wants to say and to achieve maximum precision in expressing thoughts. The uniform search algorithm works at the conscious and subconscious levels and favours the formation of automatism in the production and perception of speech. Realization of a human's cognitive potential in the process of communication starts up a complicated mechanism of self-organization and self-regulation of language. In turn, this results in the optimization of the language system, serving not only the human's self-actualization but also the realization of communication in society. The method of problem-oriented search is used for researching the optimization mechanisms that are distinctive to speech production and the stabilization of language. DOI: http://dx.doi.org/10.12731/2218-7405-2013-4-50

  8. Verification of Single-Peptide Protein Identifications by the Application of Complementary Database Search Algorithms

    National Research Council Canada - National Science Library

    Rohrbough, James G; Breci, Linda; Merchant, Nirav; Miller, Susan; Haynes, Paul A

    2005-01-01

    .... One such technique, known as the Multi-Dimensional Protein Identification Technique, or MudPIT, involves the use of computer search algorithms that automate the process of identifying proteins...

  9. Comparison of a constraint directed search to a genetic algorithm in a scheduling application

    International Nuclear Information System (INIS)

    Abbott, L.

    1993-01-01

    Scheduling plutonium containers for blending is a time-intensive operation. Several constraints must be taken into account, including the number of containers in a dissolver run, the size of each dissolver run, and the size and target purity of the blended mixture formed from these runs. Two types of algorithms have been used to solve this problem: a constraint directed search and a genetic algorithm. This paper discusses the implementation of these two different approaches to the problem and the strengths and weaknesses of each algorithm

  10. On construction of partial association rules

    KAUST Repository

    Moshkov, Mikhail

    2009-01-01

    This paper is devoted to the study of approximate algorithms for minimization of partial association rule length. It is shown that, under some natural assumptions on the class NP, a greedy algorithm is close to the best polynomial approximate algorithms for solving this NP-hard problem. The paper contains various bounds on the precision of the greedy algorithm, bounds on the minimal length of rules based on information obtained during the greedy algorithm's work, and results of the study of association rules for most binary information systems. © 2009 Springer Berlin Heidelberg.
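
    The greedy principle analysed here is the same one used for partial covers: repeatedly take the item that covers the most still-uncovered elements until only an allowed fraction remains uncovered. The Python sketch below illustrates that principle on plain sets; it is not the paper's exact rule-construction procedure, and all names are illustrative.

        def greedy_partial_cover(universe, subsets, alpha=0.1):
            """Greedy construction of a partial cover.

            Picks, at each step, the subset covering the most still-uncovered
            elements, and stops once at most a fraction `alpha` of the universe
            remains uncovered."""
            uncovered = set(universe)
            allowed_uncovered = int(alpha * len(universe))
            chosen = []
            while len(uncovered) > allowed_uncovered:
                best = max(range(len(subsets)), key=lambda i: len(subsets[i] & uncovered))
                if not subsets[best] & uncovered:
                    break                        # nothing more can be covered
                chosen.append(best)
                uncovered -= subsets[best]
            return chosen

        # Toy example: cover at least 90% of the elements 0..9.
        cover = greedy_partial_cover(range(10), [{0, 1, 2, 3}, {3, 4, 5}, {6, 7}, {7, 8, 9}], alpha=0.1)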

  11. A Novel adaptative Discrete Cuckoo Search Algorithm for parameter optimization in computer vision

    Directory of Open Access Journals (Sweden)

    loubna benchikhi

    2017-10-01

    Full Text Available Computer vision applications require choosing operators and their parameters in order to provide the best outcomes. Often, users rely on expert knowledge and must try many combinations to find the best one manually. As performance, time and accuracy are important, it is necessary to automate parameter optimization, at least for crucial operators. In this paper, a novel approach based on an adaptive discrete cuckoo search algorithm (ADCS) is proposed. It automates the process of setting algorithm parameters and provides optimal parameters for vision applications. This work reconsiders a discretization problem to adapt the cuckoo search algorithm and presents the procedure of parameter optimization. Experiments on real examples and comparisons to other metaheuristic-based approaches, particle swarm optimization (PSO), reinforcement learning (RL) and ant colony optimization (ACO), show the efficiency of this novel method.

  12. Simulating quantum search algorithm using vibronic states of I2 manipulated by optimally designed gate pulses

    International Nuclear Information System (INIS)

    Ohtsuki, Yukiyoshi

    2010-01-01

    In this paper, molecular quantum computation is numerically studied with the quantum search algorithm (Grover's algorithm) by means of optimal control simulation. Qubits are implemented in the vibronic states of I2, while gate operations are realized by optimally designed laser pulses. The methodological aspects of the simulation are discussed in detail. We show that the algorithm for solving a gate pulse-design problem has the same mathematical form as a state-to-state control problem in the density matrix formalism, which provides monotonically convergent algorithms as an alternative to the Krotov method. The sequential irradiation of separately designed gate pulses leads to the population distribution predicted by Grover's algorithm. The computational accuracy is reduced by the imperfect quality of the pulse design and by the electronic decoherence processes that are modeled by the non-Markovian master equation. However, as long as we focus on the population distribution of the vibronic qubits, we can search a target state with high probability without introducing error-correction processes during the computation. A generalized gate pulse-design scheme to explicitly include decoherence effects is outlined, in which we propose a new objective functional together with its solution algorithm that guarantees monotonic convergence.
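
    On an ideal register, the amplitude amplification that the record simulates can be reproduced with a few lines of NumPy; the sketch below models only the abstract algorithm (no vibronic qubits, laser pulses or decoherence), and the chosen register size and marked index are arbitrary.

        import numpy as np

        def grover(n_qubits, marked, iterations=None):
            """State-vector simulation of Grover's search on an ideal register."""
            n = 2 ** n_qubits
            if iterations is None:
                iterations = int(np.floor(np.pi / 4 * np.sqrt(n)))
            state = np.full(n, 1 / np.sqrt(n))            # uniform superposition
            for _ in range(iterations):
                state[marked] *= -1                       # oracle: phase flip on the target
                state = 2 * state.mean() - state          # diffusion: inversion about the mean
            return np.abs(state) ** 2                     # measurement probabilities

        probs = grover(n_qubits=4, marked=5)
        # probs[5] is close to 1 after ~pi/4*sqrt(16) = 3 iterations.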

  13. A modified Symbiotic Organisms Search algorithm for large scale economic dispatch problem with valve-point effects

    International Nuclear Information System (INIS)

    Secui, Dinu Calin

    2016-01-01

    This paper proposes a new metaheuristic algorithm, called the Modified Symbiotic Organisms Search (MSOS) algorithm, to solve the economic dispatch problem considering the valve-point effects, the prohibited operating zones (POZ), the transmission line losses, multi-fuel sources, as well as other operating constraints of the generating units and power system. The MSOS algorithm introduces, in all of its phases, new relations to update the solutions in order to improve its capacity to identify stable, high-quality solutions in a reasonable time. Furthermore, to increase the exploration capacity of the MSOS algorithm in finding the most promising zones, it is endowed with a chaotic component generated by the logistic map. The performance of the modified algorithm and of the original Symbiotic Organisms Search (SOS) algorithm is tested on five systems of different characteristics, constraints and dimensions (13-unit, 40-unit, 80-unit, 160-unit and 320-unit). The results obtained by applying the proposed algorithm (MSOS) show that it performs better than other optimization techniques recently used in solving the economic dispatch problem with valve-point effects. - Highlights: • A new modified SOS algorithm (MSOS) is proposed to solve the EcD problem. • Valve-point effects, ramp-rate limits, POZ, multi-fuel sources, transmission losses were considered. • The algorithm is tested on five systems having 13, 40, 80, 160 and 320 thermal units. • MSOS algorithm outperforms many other optimization techniques.

  14. From Schrödinger's equation to the quantum search algorithm

    Indian Academy of Sciences (India)

    Also the framework was simple and general and could be extended to ... It is unusual to write a paper listing the steps that led to a result after the result itself ... the quantum search algorithm – it is by no means a comprehensive review of quantum ..... D, as defined in the previous section, is no longer unitary for large ε.

  15. MRS algorithm: a new method for searching myocardial region in SPECT myocardial perfusion images.

    Science.gov (United States)

    He, Yuan-Lie; Tian, Lian-Fang; Chen, Ping; Li, Bin; Mao, Zhong-Yuan

    2005-10-01

    First, the necessity of automatically segmenting the myocardium from the myocardial SPECT image is discussed in Section 1. To eliminate the influence of the background, the optimal threshold segmentation method modified for the MRS algorithm is explained in Section 2. Then, the image erosion structure is applied to identify the myocardium region and the liver region. The contour tracing method is introduced to extract the myocardial contour. To locate the centroid of the myocardium, the myocardial centroid searching method is developed. The protocol of the MRS algorithm is summarized in Section 6. The performance of the MRS algorithm is investigated and the conclusion is drawn in Section 7. Finally, the importance of the MRS algorithm and the improvement of the MRS algorithm are discussed.

  16. An Aircraft Service Staff Rostering using a Hybrid GRASP Algorithm

    OpenAIRE

    Cho, Vincent; Wu, Gene Pak Kit; Ip, W.H.

    2009-01-01

    The aircraft ground service company is responsible for carrying out regular aircraft maintenance tasks between arrival at and departure from the airport. This paper presents the application of a hybrid approach based upon the greedy randomized adaptive search procedure (GRASP) for rostering technical staff such that they are assigned predefined shift patterns. The rostering of staff is posed as an optimization problem with the aim of minimizing the violations of hard and soft constraints...
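
    A generic GRASP loop, greedy randomised construction over a restricted candidate list followed by local search, can be sketched in Python as below; the callbacks `construct_candidates`, `local_search` and `cost` are hypothetical problem-specific hooks, and this is the general scheme rather than the staff-rostering model of the record.

        import random

        def grasp(construct_candidates, local_search, cost, iterations=100, alpha=0.3):
            """Generic GRASP: repeated greedy randomised construction plus local search.

            `construct_candidates(partial)` must return (candidate, greedy_cost)
            pairs, or an empty list when the solution is complete."""
            best, best_cost = None, float("inf")
            for _ in range(iterations):
                partial = []
                while True:
                    candidates = construct_candidates(partial)
                    if not candidates:
                        break
                    costs = [c for _, c in candidates]
                    lo, hi = min(costs), max(costs)
                    threshold = lo + alpha * (hi - lo)
                    rcl = [cand for cand, c in candidates if c <= threshold]
                    partial.append(random.choice(rcl))     # pick from the restricted candidate list
                solution = local_search(partial)
                if cost(solution) < best_cost:
                    best, best_cost = solution, cost(solution)
            return best, best_cost

    In a rostering setting the candidates would plausibly be feasible shift-pattern assignments and the local search would repair constraint violations, but those details are outside this sketch.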

  17. A New Improved Quantum Evolution Algorithm with Local Search Procedure for Capacitated Vehicle Routing Problem

    Directory of Open Access Journals (Sweden)

    Ligang Cui

    2013-01-01

    Full Text Available The capacitated vehicle routing problem (CVRP) is the most classical vehicle routing problem (VRP); many solution techniques have been proposed to find better solutions to it. In this paper, a new improved quantum evolution algorithm (IQEA) with a mixed local search procedure is proposed for solving CVRPs. First, an IQEA with a double chain quantum chromosome, new quantum rotation schemes, and a self-adaptive quantum NOT gate is constructed to initialize and generate feasible solutions. Then, to further strengthen IQEA's searching ability, three local search procedures, 1-1 exchange, 1-0 exchange, and 2-OPT, are adopted. Experiments on a small case have been conducted to analyze the sensitivity of the main parameters and compare the performance of the IQEA with different local search strategies. Together with results from testing on CVRP benchmarks, the superiority of the proposed algorithm over PSO, SR-1, and SR-2 has been demonstrated. Finally, a thorough analysis of the experimental results is presented and some suggestions for future research are given.
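
    Of the three local moves, 2-OPT is the easiest to illustrate: keep reversing a segment of the tour whenever doing so shortens it. The Python sketch below applies it to a tiny symmetric distance matrix; the instance is made up for illustration and the capacity constraints of the full CVRP are not modelled.

        def two_opt(route, dist):
            """2-OPT local search for a single vehicle tour (first and last node fixed)."""
            improved = True
            while improved:
                improved = False
                for i in range(1, len(route) - 2):
                    for j in range(i + 1, len(route) - 1):
                        a, b = route[i - 1], route[i]
                        c, d = route[j], route[j + 1]
                        # Gain of replacing edges (a,b) and (c,d) by (a,c) and (b,d).
                        if dist[a][c] + dist[b][d] < dist[a][b] + dist[c][d]:
                            route[i:j + 1] = reversed(route[i:j + 1])
                            improved = True
            return route

        # Tiny symmetric instance: depot 0, customers 1-3, route returns to the depot.
        d = [[0, 2, 9, 4], [2, 0, 6, 3], [9, 6, 0, 5], [4, 3, 5, 0]]
        print(two_opt([0, 2, 1, 3, 0], d))   # -> [0, 1, 2, 3, 0], a shorter tour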

  18. Hybrid feature selection algorithm using symmetrical uncertainty and a harmony search algorithm

    Science.gov (United States)

    Salameh Shreem, Salam; Abdullah, Salwani; Nazri, Mohd Zakree Ahmad

    2016-04-01

    Microarray technology can be used as an efficient diagnostic system to recognise diseases such as tumours or to discriminate between different types of cancers in normal tissues. This technology has received increasing attention from the bioinformatics community because of its potential in designing powerful decision-making tools for cancer diagnosis. However, the presence of thousands or tens of thousands of genes affects the predictive accuracy of this technology from the perspective of classification. Thus, a key issue in microarray data is identifying or selecting the smallest possible set of genes from the input data that can achieve good predictive accuracy for classification. In this work, we propose a two-stage selection algorithm for gene selection problems in microarray data-sets called the symmetrical uncertainty filter and harmony search algorithm wrapper (SU-HSA). Experimental results show that the SU-HSA is better than HSA in isolation for all data-sets in terms of the accuracy and achieves a lower number of genes on 6 out of 10 instances. Furthermore, the comparison with state-of-the-art methods shows that our proposed approach is able to obtain 5 (out of 10) new best results in terms of the number of selected genes and competitive results in terms of the classification accuracy.

  19. Solving the wind farm layout optimization problem using random search algorithm

    DEFF Research Database (Denmark)

    Feng, Ju; Shen, Wen Zhong

    2015-01-01

    Wind farm (WF) layout optimization is to find the optimal positions of wind turbines (WTs) inside a WF, so as to maximize and/or minimize a single objective or multiple objectives, while satisfying certain constraints. In this work, a random search (RS) algorithm based on continuous formulation is proposed. First it is tested..., in which better results than the genetic algorithm (GA) and the old version of the RS algorithm are obtained. Second it is applied to the Horns Rev 1 WF, and the optimized layouts obtain a higher power production than its original layout, both for the real scenario and for two constructed scenarios... In this application, it is also found that in order to get consistent and reliable optimization results, up to 360 or more sectors for wind direction have to be used. Finally, considering the inevitable inter-annual variations in the wind conditions, the robustness of the optimized layouts against wind condition...

  20. Obstacle Avoidance for Redundant Manipulators Utilizing a Backward Quadratic Search Algorithm

    Directory of Open Access Journals (Sweden)

    Tianjian Hu

    2016-06-01

    Full Text Available Obstacle avoidance can be achieved as a secondary task by appropriate inverse kinematics (IK resolution of redundant manipulators. Most prior literature requires the time-consuming determination of the closest point to the obstacle for every calculation step. Aiming at the relief of computational burden, this paper develops what is termed a backward quadratic search algorithm (BQSA as another option for solving IK problems in obstacle avoidance. The BQSA detects possible collisions based on the root property of a category of quadratic functions, which are derived from ellipse-enveloped obstacles and the positions of each link's end-points. The algorithm executes a backward search for possible obstacle collisions, from the end-effector to the base, and avoids obstacles by utilizing a hybrid IK scheme, incorporating the damped least-squares method, the weighted least-norm method and the gradient projection method. Some details of the hybrid IK scheme, such as values of the damped factor, weights and the clamping velocity, are discussed, along with a comparison of computational load between previous methods and BQSA. Simulations of a planar seven-link manipulator and a PUMA 560 robot verify the effectiveness of BQSA.

  1. FHSA-SED: Two-Locus Model Detection for Genome-Wide Association Study with Harmony Search Algorithm.

    Directory of Open Access Journals (Sweden)

    Shouheng Tuo

    Full Text Available The two-locus model is a typical significant disease model to be identified in genome-wide association studies (GWAS). Due to the intensive computational burden and the diversity of disease models, existing methods have drawbacks such as low detection power, high computation cost, and a preference for some types of disease models. In this study, two scoring functions (the Bayesian-network-based K2-score and the Gini-score) are used for characterizing two SNP loci as a candidate model; the two criteria are adopted simultaneously to improve identification power and to tackle the preference problem among disease models. The harmony search algorithm (HSA) is improved for quickly finding the most likely candidate models among all two-locus models, in which a local search algorithm with a two-dimensional tabu table is presented to avoid repeatedly evaluating disease models that have strong marginal effects. Finally the G-test statistic is used to further test the candidate models. We investigate our method, named FHSA-SED, on 82 simulated datasets and a real AMD dataset, and compare it with two typical methods (MACOED and CSE) which have been developed recently based on swarm intelligence search algorithms. The results of the simulation experiments indicate that our method outperforms the two compared algorithms in terms of detection power, computation time, evaluation times, sensitivity (TPR), specificity (SPC), positive predictive value (PPV) and accuracy (ACC). Our method has identified two SNPs (rs3775652 and rs10511467) that may also be associated with disease in the AMD dataset.

  2. Certain integrable system on a space associated with a quantum search algorithm

    International Nuclear Information System (INIS)

    Uwano, Y.; Hino, H.; Ishiwatari, Y.

    2007-01-01

    On thinking up a Grover-type quantum search algorithm for an ordered tuple of multiqubit states, a gradient system associated with the negative von Neumann entropy is studied on the space of regular relative configurations of multiqubit states (SR2CMQ). The SR2CMQ emerges, through a geometric procedure, from the space of ordered tuples of multiqubit states for the quantum search. The aim of this paper is to give a brief report on the integrability of the gradient dynamical system together with quantum information geometry of the underlying space, SR2CMQ, of that system.

  3. Gravity Search Algorithm hybridized Recursive Least Square method for power system harmonic estimation

    Directory of Open Access Journals (Sweden)

    Santosh Kumar Singh

    2017-06-01

    Full Text Available This paper presents a new hybrid method based on the Gravity Search Algorithm (GSA) and Recursive Least Squares (RLS), known as GSA-RLS, to solve harmonic estimation problems in the case of time-varying power signals in the presence of different noises. GSA is based on Newton's law of gravity and mass interactions. In the proposed method, the searcher agents are a collection of masses that interact with each other using Newton's laws of gravity and motion. The basic GSA algorithm strategy is combined with the RLS algorithm sequentially in an adaptive way to update the unknown parameters (weights) of the harmonic signal. Simulation and practical validation are made with the experimentation of the proposed algorithm on real-time data obtained from a heavy paper industry. A comparative performance of the proposed algorithm is evaluated against other recently reported algorithms such as Differential Evolution (DE), Particle Swarm Optimization (PSO), Bacteria Foraging Optimization (BFO), Fuzzy-BFO (F-BFO) hybridized with Least Squares (LS) and BFO hybridized with RLS, which reveals that the proposed GSA-RLS algorithm is the best in terms of accuracy, convergence and computational time.
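
    The RLS half of the hybrid can be sketched independently of GSA: a standard recursive-least-squares update refines the sine/cosine weights of each harmonic as new samples arrive. The snippet below is a generic RLS sketch on a synthetic 50 Hz signal; the signal parameters, forgetting factor and harmonic orders are illustrative assumptions, and the GSA step that would tune them is omitted.

```python
import numpy as np

def rls_update(theta, P, phi, y, lam=0.99):
    """One recursive-least-squares step: refine the parameter estimate theta so
    that phi @ theta tracks the measurement y (lam is the forgetting factor)."""
    phi = phi.reshape(-1, 1)
    k = P @ phi / (lam + phi.T @ P @ phi)       # gain vector
    err = y - float(phi.T @ theta)              # a-priori estimation error
    theta = theta + k * err                     # parameter update
    P = (P - k @ phi.T @ P) / lam               # covariance update
    return theta, P

# Example: estimate amplitude/phase of the fundamental and 3rd harmonic of a
# 50 Hz signal; the regressor stacks sin/cos terms for each harmonic.
f0, fs = 50.0, 2000.0
t = np.arange(400) / fs
true = 1.0 * np.sin(2 * np.pi * f0 * t + 0.3) + 0.2 * np.sin(2 * np.pi * 3 * f0 * t + 1.1)
y = true + 0.01 * np.random.randn(t.size)

theta = np.zeros((4, 1))
P = 1e3 * np.eye(4)
for ti, yi in zip(t, y):
    phi = np.array([np.sin(2 * np.pi * f0 * ti), np.cos(2 * np.pi * f0 * ti),
                    np.sin(2 * np.pi * 3 * f0 * ti), np.cos(2 * np.pi * 3 * f0 * ti)])
    theta, P = rls_update(theta, P, phi, yi)

a1 = np.hypot(theta[0, 0], theta[1, 0])          # estimated fundamental amplitude
```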

  4. Optimum Design of Gravity Retaining Walls Using Charged System Search Algorithm

    Directory of Open Access Journals (Sweden)

    S. Talatahari

    2012-01-01

    Full Text Available This study focuses on the optimum design of gravity retaining walls, one of the familiar types of retaining walls, which may be constructed of stone masonry, unreinforced concrete, or reinforced concrete. The material cost is one of the major factors in the construction of gravity retaining walls; therefore, minimizing the weight or volume of these systems can reduce the cost. To obtain an optimal seismic design of such structures, this paper proposes a method based on a novel meta-heuristic algorithm. The algorithm is inspired by Coulomb's and Gauss's laws of electrostatics in physics, and it is called charged system search (CSS). In order to evaluate the efficiency of this algorithm, an example is utilized. Comparing the results with the retaining wall designs obtained by the other methods illustrates the good performance of the CSS. In this paper, we used the Mononobe-Okabe method, which is one of the pseudostatic approaches to determine the dynamic earth pressure.

  5. BiCluE - Exact and heuristic algorithms for weighted bi-cluster editing of biomedical data

    DEFF Research Database (Denmark)

    Sun, Peng; Guo, Jiong; Baumbach, Jan

    2013-01-01

    to solve the weighted bi-cluster editing problem. It implements (1) an exact algorithm based on fixed-parameter tractability and (2) a polynomial-time greedy heuristic based on solving the hardest part, edge deletions, first. We evaluated its performance on artificial graphs. Afterwards we exemplarily...... problem. BiCluE as well as the supplementary results are available online at http://biclue.mpi-inf.mpg.de....

  6. Memoryless cooperative graph search based on the simulated annealing algorithm

    International Nuclear Information System (INIS)

    Hou Jian; Yan Gang-Feng; Fan Zhen

    2011-01-01

    We have studied the problem of reaching a globally optimal segment for a graph-like environment with a single or a group of autonomous mobile agents. Firstly, two efficient simulated-annealing-like algorithms are given for a single agent to solve the problem in a partially known environment and an unknown environment, respectively. It shows that under both proposed control strategies, the agent will eventually converge to a globally optimal segment with probability 1. Secondly, we use multi-agent searching to simultaneously reduce the computation complexity and accelerate convergence based on the algorithms we have given for a single agent. By exploiting graph partition, a gossip-consensus method based scheme is presented to update the key parameter—radius of the graph, ensuring that the agents spend much less time finding a globally optimal segment. (interdisciplinary physics and related areas of science and technology)

  7. Parameter Search Algorithms for Microwave Radar-Based Breast Imaging: Focal Quality Metrics as Fitness Functions.

    Science.gov (United States)

    O'Loughlin, Declan; Oliveira, Bárbara L; Elahi, Muhammad Adnan; Glavin, Martin; Jones, Edward; Popović, Milica; O'Halloran, Martin

    2017-12-06

    Inaccurate estimation of average dielectric properties can have a tangible impact on microwave radar-based breast images. Despite this, recent patient imaging studies have used a fixed estimate although this is known to vary from patient to patient. Parameter search algorithms are a promising technique for estimating the average dielectric properties from the reconstructed microwave images themselves without additional hardware. In this work, qualities of accurately reconstructed images are identified from point spread functions. As the qualities of accurately reconstructed microwave images are similar to the qualities of focused microscopic and photographic images, this work proposes the use of focal quality metrics for average dielectric property estimation. The robustness of the parameter search is evaluated using experimental dielectrically heterogeneous phantoms on the three-dimensional volumetric image. Based on a very broad initial estimate of the average dielectric properties, this paper shows how these metrics can be used as suitable fitness functions in parameter search algorithms to reconstruct clear and focused microwave radar images.

  8. Modified Cuckoo Search Algorithm for Solving Nonconvex Economic Load Dispatch Problems

    Directory of Open Access Journals (Sweden)

    Thang Trung Nguyen

    2016-01-01

    Full Text Available This paper presents the application of a modified cuckoo search algorithm (MCSA) for solving economic load dispatch (ELD) problems. The MCSA method is developed to improve the search ability and solution quality of the conventional CSA method. In the MCSA, the evaluation of eggs divides the initial eggs into two groups, the top egg group with good quality and the abandoned group with worse quality. Moreover, the value of the updated step size in MCSA is adapted when generating a new solution for the abandoned group and the top group via Levy flights, so that a large zone is searched at the beginning and a local zone is foraged as the maximum number of iterations is nearly reached. The MCSA method has been tested on different systems with different characteristics of thermal units and constraints. The result comparison with other methods in the literature has indicated that the MCSA method can be a powerful method for solving the ELD.
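
    The Lévy-flight step used by cuckoo-type searches is commonly generated with Mantegna's algorithm; a minimal sketch is given below, together with a step size that decays over the iterations in the spirit of the adaptive update described above. The decay schedule and parameter values are assumptions for illustration, not the exact MCSA rule.

```python
import numpy as np
from math import gamma

def levy_step(dim, beta=1.5):
    """Draw one Levy-flight step (Mantegna's algorithm)."""
    sigma_u = (gamma(1 + beta) * np.sin(np.pi * beta / 2) /
               (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0.0, sigma_u, dim)
    v = np.random.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def new_solution(x, best, t, t_max, alpha0=0.1):
    """Generate a candidate around x, biased towards the best solution.

    The step size shrinks with the iteration counter t so that a large zone
    is explored early and only a local zone is searched near t_max, loosely
    mimicking the adaptive step size described in the abstract.
    """
    alpha = alpha0 * (1.0 - t / t_max)              # decaying step-size control
    return x + alpha * levy_step(len(x)) * (x - best)
```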

  9. A variable-depth search algorithm for recursive bi-partitioning of signal flow graphs

    NARCIS (Netherlands)

    de Kock, E.A.; Aarts, E.H.L.; Essink, G.; Jansen, R.E.J.; Korst, J.H.M.

    1995-01-01

    We discuss the use of local search techniques for mapping video algorithms onto programmable high-performance video signal processors. The mapping problem is very complex due to many constraints that need to be satisfied in order to obtain a feasible solution. The complexity is reduced by

  10. Design of ACM system based on non-greedy punctured LDPC codes

    Science.gov (United States)

    Lu, Zijun; Jiang, Zihong; Zhou, Lin; He, Yucheng

    2017-08-01

    In this paper, an adaptive coded modulation (ACM) scheme based on rate-compatible LDPC (RC-LDPC) codes is designed. The RC-LDPC codes are constructed by a non-greedy puncturing method which shows good performance in the high code rate region. Moreover, an incremental redundancy scheme for the LDPC-based ACM system over the AWGN channel is proposed. With this scheme, code rates vary from 2/3 to 5/6 and the complexity of the ACM system is lowered. Simulations show that increasingly significant coding gains can be obtained by the proposed ACM system with higher throughput.

  11. A meta-heuristic method for solving scheduling problem: crow search algorithm

    Science.gov (United States)

    Adhi, Antono; Santosa, Budi; Siswanto, Nurhadi

    2018-04-01

    Scheduling is one of the most important processes in industry, both in manufacturing and in services. The scheduling process is the process of selecting resources to perform operations on tasks. Resources can be machines, people, tasks, jobs or operations. Selecting the optimum sequence of jobs from a permutation is an essential issue in every piece of research on scheduling problems, since the optimum sequence becomes the optimum solution of the scheduling problem. The scheduling problem becomes NP-hard when the number of jobs in the sequence grows beyond what an exact algorithm can process. In order to obtain optimum results, a method is needed that is capable of solving complex scheduling problems in an acceptable time. Meta-heuristics are the methods usually used to solve scheduling problems. The recently published method called Crow Search Algorithm (CSA) is adopted in this research to solve the scheduling problem. CSA is an evolutionary meta-heuristic method based on the behavior of flocks of crows. The calculation results of CSA for solving the scheduling problem are compared with other algorithms. From the comparison, it is found that CSA has better performance in terms of optimum solution and calculation time than the other algorithms.
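
    For reference, a minimal continuous version of the crow search algorithm is sketched below: each crow follows the remembered hiding place of a random crow unless that crow becomes aware, in which case it flies to a random position. Applying CSA to scheduling requires a discrete encoding of job sequences, which is not shown; all parameter values here are illustrative assumptions.

```python
import numpy as np

def crow_search(fitness, dim, n_crows=20, iters=200, fl=2.0, ap=0.1,
                lower=-10.0, upper=10.0, rng=np.random.default_rng(0)):
    """Minimal crow search algorithm for minimising `fitness` on a box."""
    x = rng.uniform(lower, upper, (n_crows, dim))       # positions
    mem = x.copy()                                       # each crow's hiding place
    mem_fit = np.apply_along_axis(fitness, 1, mem)
    for _ in range(iters):
        followed = rng.integers(n_crows, size=n_crows)   # crow j followed by crow i
        aware = rng.random(n_crows) < ap                 # does crow j notice it is followed?
        step = rng.random((n_crows, 1)) * fl * (mem[followed] - x)
        x_new = np.where(aware[:, None],
                         rng.uniform(lower, upper, (n_crows, dim)),   # fooled: random move
                         x + step)                                    # follow j's memory
        x_new = np.clip(x_new, lower, upper)
        fit_new = np.apply_along_axis(fitness, 1, x_new)
        improved = fit_new < mem_fit                     # update memories where improved
        mem[improved], mem_fit[improved] = x_new[improved], fit_new[improved]
        x = x_new
    best = np.argmin(mem_fit)
    return mem[best], mem_fit[best]

# Example: minimise the sphere function in 5 dimensions.
best_x, best_f = crow_search(lambda v: float(np.sum(v * v)), dim=5)
```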

  12. Research on multirobot pursuit task allocation algorithm based on emotional cooperation factor.

    Science.gov (United States)

    Fang, Baofu; Chen, Lu; Wang, Hao; Dai, Shuanglu; Zhong, Qiubo

    2014-01-01

    Multirobot task allocation is a hot issue in the field of robot research. A new emotional model is used with the self-interested robot, which gives a new way to measure self-interested robots' individual cooperative willingness in the problem of multirobot task allocation. An emotional cooperation factor is introduced into the self-interested robot; it is updated based on emotional attenuation and external stimuli. Then a multirobot pursuit task allocation algorithm is proposed, which is based on the emotional cooperation factor. Combined with a two-step auction algorithm, team leaders and team collaborators are recruited, pursuit teams are set up, and finally certain strategies are used to complete the pursuit task. In order to verify the effectiveness of this algorithm, some comparison experiments have been done with the instantaneous greedy optimal auction algorithm; the results of the experiments show that the total pursuit time and total team revenue can be optimized by using this algorithm.

  13. Automated real-time search and analysis algorithms for a non-contact 3D profiling system

    Science.gov (United States)

    Haynes, Mark; Wu, Chih-Hang John; Beck, B. Terry; Peterman, Robert J.

    2013-04-01

    The purpose of this research is to develop a new means of identifying and extracting geometrical feature statistics from a non-contact precision-measurement 3D profilometer. Autonomous algorithms have been developed to search through large-scale Cartesian point clouds to identify and extract geometrical features. These algorithms are developed with the intent of providing real-time production quality control of cold-rolled steel wires. The steel wires in question are prestressing steel reinforcement wires for concrete members. The geometry of the wire is critical in the performance of the overall concrete structure. For this research a custom 3D non-contact profilometry system has been developed that utilizes laser displacement sensors for submicron resolution surface profiling. Optimizations in the control and sensory system allow for data points to be collected at up to an approximate 400,000 points per second. In order to achieve geometrical feature extraction and tolerancing with this large volume of data, the algorithms employed are optimized for parsing large data quantities. The methods used provide a unique means of maintaining high resolution data of the surface profiles while keeping algorithm running times within practical bounds for industrial application. By a combination of regional sampling, iterative search, spatial filtering, frequency filtering, spatial clustering, and template matching a robust feature identification method has been developed. These algorithms provide an autonomous means of verifying tolerances in geometrical features. The key method of identifying the features is through a combination of downhill simplex and geometrical feature templates. By performing downhill simplex through several procedural programming layers of different search and filtering techniques, very specific geometrical features can be identified within the point cloud and analyzed for proper tolerancing. Being able to perform this quality control in real time

  14. Blind spectrum reconstruction algorithm with L0-sparse representation

    International Nuclear Information System (INIS)

    Liu, Hai; Zhang, Zhaoli; Liu, Sanyan; Shu, Jiangbo; Liu, Tingting; Zhang, Tianxu

    2015-01-01

    Raman spectrum often suffers from band overlap and Poisson noise. This paper presents a new blind Poissonian Raman spectrum reconstruction method, which incorporates the L0-sparse prior together with the total variation constraint into the maximum a posteriori framework. Furthermore, the greedy analysis pursuit algorithm is adopted to solve the L0-based minimization problem. Simulated and real spectrum experimental results show that the proposed method can effectively preserve spectral structure and suppress noise. The reconstructed Raman spectra are easily used for interpreting unknown chemical mixtures. (paper)

  15. Computational Comparison of Several Greedy Algorithms for the Minimum Cost Perfect Matching Problem on Large Graphs

    DEFF Research Database (Denmark)

    Wøhlk, Sanne; Laporte, Gilbert

    2017-01-01

    The aim of this paper is to computationally compare several algorithms for the Minimum Cost Perfect Matching Problem on an undirected complete graph. Our work is motivated by the need to solve large instances of the Capacitated Arc Routing Problem (CARP) arising in the optimization of garbage...... collection in Denmark. Common heuristics for the CARP involve the optimal matching of the odd-degree nodes of a graph. The algorithms used in the comparison include the CPLEX solution of an exact formulation, the LEDA matching algorithm, a recent implementation of the Blossom algorithm, as well as six...

  16. Chaos optimization algorithms based on chaotic maps with different probability distribution and search speed for global optimization

    Science.gov (United States)

    Yang, Dixiong; Liu, Zhenjun; Zhou, Jilei

    2014-04-01

    Chaos optimization algorithms (COAs) usually utilize the chaotic map like Logistic map to generate the pseudo-random numbers mapped as the design variables for global optimization. Many existing researches indicated that COA can more easily escape from the local minima than classical stochastic optimization algorithms. This paper reveals the inherent mechanism of high efficiency and superior performance of COA, from a new perspective of both the probability distribution property and search speed of chaotic sequences generated by different chaotic maps. The statistical property and search speed of chaotic sequences are represented by the probability density function (PDF) and the Lyapunov exponent, respectively. Meanwhile, the computational performances of hybrid chaos-BFGS algorithms based on eight one-dimensional chaotic maps with different PDF and Lyapunov exponents are compared, in which BFGS is a quasi-Newton method for local optimization. Moreover, several multimodal benchmark examples illustrate that, the probability distribution property and search speed of chaotic sequences from different chaotic maps significantly affect the global searching capability and optimization efficiency of COA. To achieve the high efficiency of COA, it is recommended to adopt the appropriate chaotic map generating the desired chaotic sequences with uniform or nearly uniform probability distribution and large Lyapunov exponent.
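
    A tiny sketch of the basic ingredient of a COA follows: a Logistic-map chaotic sequence mapped onto candidate design variables, whose best point could then seed a local optimizer such as BFGS in a hybrid scheme. The parameter values (mu = 4, the seed x0 and the bounds) are illustrative assumptions.

```python
import numpy as np

def logistic_sequence(n, x0=0.303, mu=4.0):
    """Generate n chaotic numbers in (0, 1) with the Logistic map x <- mu*x*(1-x)."""
    seq = np.empty(n)
    x = x0
    for i in range(n):
        x = mu * x * (1.0 - x)
        seq[i] = x
    return seq

def chaotic_candidates(n, lower, upper, x0=0.303):
    """Map a chaotic sequence onto candidate design variables in [lower, upper]."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    dim = lower.size
    seq = logistic_sequence(n * dim, x0).reshape(n, dim)
    return lower + seq * (upper - lower)

# Example: 100 chaotic candidates for a 2-variable problem on [-5, 5]^2,
# evaluated on the sphere function; the best one could then seed a local
# optimiser such as BFGS (scipy.optimize.minimize) as in a hybrid COA.
cands = chaotic_candidates(100, [-5.0, -5.0], [5.0, 5.0])
best = cands[np.argmin((cands ** 2).sum(axis=1))]
```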

  17. Automatic boiling water reactor control rod pattern design using particle swarm optimization algorithm and local search

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Cheng-Der, E-mail: jdwang@iner.gov.tw [Nuclear Engineering Division, Institute of Nuclear Energy Research, No. 1000, Wenhua Rd., Jiaan Village, Longtan Township, Taoyuan County 32546, Taiwan, ROC (China); Lin, Chaung [National Tsing Hua University, Department of Engineering and System Science, 101, Section 2, Kuang Fu Road, Hsinchu 30013, Taiwan (China)

    2013-02-15

    Highlights: ► The PSO algorithm was adopted to automatically design a BWR CRP. ► The local search procedure was added to improve the result of PSO algorithm. ► The results show that the obtained CRP is the same good as that in the previous work. -- Abstract: This study developed a method for the automatic design of a boiling water reactor (BWR) control rod pattern (CRP) using the particle swarm optimization (PSO) algorithm. The PSO algorithm is more random compared to the rank-based ant system (RAS) that was used to solve the same BWR CRP design problem in the previous work. In addition, the local search procedure was used to make improvements after PSO, by adding the single control rod (CR) effect. The design goal was to obtain the CRP so that the thermal limits and shutdown margin would satisfy the design requirement and the cycle length, which is implicitly controlled by the axial power distribution, would be acceptable. The results showed that the same acceptable CRP found in the previous work could be obtained.

  18. A Fast Exact k-Nearest Neighbors Algorithm for High Dimensional Search Using k-Means Clustering and Triangle Inequality.

    Science.gov (United States)

    Wang, Xueyi

    2012-02-08

    The k-nearest neighbors (k-NN) algorithm is a widely used machine learning method that finds nearest neighbors of a test object in a feature space. We present a new exact k-NN algorithm called kMkNN (k-Means for k-Nearest Neighbors) that uses k-means clustering and the triangle inequality to accelerate the search for nearest neighbors in a high dimensional space. The kMkNN algorithm has two stages. In the buildup stage, instead of using complex tree structures such as metric trees, kd-trees, or ball-trees, kMkNN uses a simple k-means clustering method to preprocess the training dataset. In the searching stage, given a query object, kMkNN finds nearest training objects starting from the nearest cluster to the query object and uses the triangle inequality to reduce the distance calculations. Experiments show that the performance of kMkNN is surprisingly good compared to the traditional k-NN algorithm and tree-based k-NN algorithms such as kd-trees and ball-trees. On a collection of 20 datasets with up to 10^6 records and 10^4 dimensions, kMkNN shows a 2- to 80-fold reduction of distance calculations and a 2- to 60-fold speedup over the traditional k-NN algorithm for 16 datasets. Furthermore, kMkNN performs significantly better than a kd-tree based k-NN algorithm for all datasets and performs better than a ball-tree based k-NN algorithm for most datasets. The results show that kMkNN is effective for searching nearest neighbors in high dimensional spaces.
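
    The two stages of kMkNN can be sketched compactly: cluster the training set with k-means, then answer a query by scanning clusters in order of centre distance and skipping any point whose triangle-inequality lower bound cannot beat the current k-th best distance. The sketch below uses scikit-learn's KMeans for the buildup stage and is a simplification, not the reference implementation.

```python
import numpy as np
from heapq import heappush, heappushpop
from sklearn.cluster import KMeans

class KMkNN:
    """Exact k-NN search accelerated by k-means clustering and the triangle
    inequality (a simplified sketch of the kMkNN idea)."""

    def __init__(self, data, n_clusters=32):
        self.data = np.asarray(data, float)
        km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(self.data)
        self.centers = km.cluster_centers_
        self.labels = km.labels_
        # Pre-compute each point's distance to its own cluster centre.
        self.d_to_center = np.linalg.norm(self.data - self.centers[self.labels], axis=1)

    def query(self, q, k=1):
        q = np.asarray(q, float)
        d_q_centers = np.linalg.norm(self.centers - q, axis=1)
        heap = []                                    # max-heap of (-dist, index)
        for c in np.argsort(d_q_centers):            # nearest clusters first
            for i in np.where(self.labels == c)[0]:
                worst = -heap[0][0] if len(heap) == k else np.inf
                # Triangle inequality: d(q, x) >= |d(q, c) - d(x, c)|
                if abs(d_q_centers[c] - self.d_to_center[i]) >= worst:
                    continue                          # cannot beat the current k-th best
                d = np.linalg.norm(self.data[i] - q)
                if len(heap) < k:
                    heappush(heap, (-d, i))
                elif d < -heap[0][0]:
                    heappushpop(heap, (-d, i))
        return sorted((-d, i) for d, i in heap)       # (distance, index) pairs
```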

  19. Improved Harmony Search Algorithm for Truck Scheduling Problem in Multiple-Door Cross-Docking Systems

    Directory of Open Access Journals (Sweden)

    Zhanzhong Wang

    2018-01-01

    Full Text Available The key to realizing cross docking is to coordinate inbound trucks and outbound trucks, so a proper sequence of trucks will make the cross-docking system much more efficient and require less makespan. A cross-docking system is proposed with multiple receiving and shipping dock doors. The objective is to find the best door assignments and the sequences of trucks, following the principle of product distribution, to minimize the total makespan of cross docking. To solve the problem, which is formulated as a mixed integer linear programming (MILP) model, three metaheuristics, namely harmony search (HS), improved harmony search (IHS), and genetic algorithm (GA), are proposed. Furthermore, the fixed parameters are optimized by Taguchi experiments to improve the accuracy of solutions further. Finally, several numerical examples are put forward to evaluate the performances of the proposed algorithms.
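
    The improvisation loop shared by HS and IHS is easy to sketch in the continuous case: each new harmony is assembled component-wise from memory (with occasional pitch adjustment) or at random, and replaces the worst member of the memory if it is better. The discrete truck-sequencing encoding used in the paper is not shown; the parameter values below are illustrative.

```python
import numpy as np

def harmony_search(fitness, dim, lower, upper, hms=20, hmcr=0.9, par=0.3,
                   bw=0.05, iters=2000, rng=np.random.default_rng(1)):
    """Minimal continuous harmony search (minimisation); the truck-scheduling
    variant in the paper works on discrete door/sequence encodings instead."""
    hm = rng.uniform(lower, upper, (hms, dim))           # harmony memory
    fit = np.array([fitness(h) for h in hm])
    for _ in range(iters):
        new = np.empty(dim)
        for j in range(dim):
            if rng.random() < hmcr:                      # memory consideration
                new[j] = hm[rng.integers(hms), j]
                if rng.random() < par:                   # pitch adjustment
                    new[j] += bw * (upper - lower) * (2 * rng.random() - 1)
            else:                                        # random improvisation
                new[j] = rng.uniform(lower, upper)
        new = np.clip(new, lower, upper)
        f_new = fitness(new)
        worst = np.argmax(fit)
        if f_new < fit[worst]:                           # replace the worst harmony
            hm[worst], fit[worst] = new, f_new
    best = np.argmin(fit)
    return hm[best], fit[best]

# Example: minimise the Rosenbrock function in 2-D.
rosen = lambda x: (1 - x[0]) ** 2 + 100 * (x[1] - x[0] ** 2) ** 2
best_x, best_f = harmony_search(rosen, dim=2, lower=-2.0, upper=2.0)
```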

  20. Parametric optimization of ultrasonic machining process using gravitational search and fireworks algorithms

    Directory of Open Access Journals (Sweden)

    Debkalpa Goswami

    2015-03-01

    Full Text Available Ultrasonic machining (USM) is a mechanical material removal process used to erode holes and cavities in hard or brittle workpieces by using shaped tools, high-frequency mechanical motion and an abrasive slurry. Unlike other non-traditional machining processes, such as laser beam and electrical discharge machining, the USM process does not thermally damage the workpiece or introduce significant levels of residual stress, which is important for survival of materials in service. For enhanced machining performance and better machined job characteristics, it is often required to determine the optimal control parameter settings of a USM process. The earlier mathematical approaches for parametric optimization of USM processes have mostly yielded near-optimal or sub-optimal solutions. In this paper, two almost unexplored non-conventional optimization techniques, i.e. the gravitational search algorithm (GSA) and the fireworks algorithm (FWA), are applied for parametric optimization of USM processes. The optimization performance of these two algorithms is compared with that of other popular population-based algorithms, and the effects of their algorithm parameters on the derived optimal solutions and computational speed are also investigated. It is observed that FWA provides the best optimal results for the considered USM processes.

  1. Turn-Based War Chess Model and Its Search Algorithm per Turn

    Directory of Open Access Journals (Sweden)

    Hai Nan

    2016-01-01

    Full Text Available War chess gaming has so far received insufficient attention but is a significant component of turn-based strategy games (TBS) and is studied in this paper. First, a common game model is proposed through various existing war chess types. Based on the model, we propose a theory frame involving combinational optimization on the one hand and game tree search on the other hand. We also discuss a key problem, namely, that the number of branching factors of each turn in the game tree is huge. Then, we propose two algorithms for searching in one turn to solve the problem: (1) enumeration by order; (2) enumeration by recursion. The main difference between these two is the permutation method used: the former uses the dictionary sequence method, while the latter uses the recursive permutation method. Finally, we prove that both of these algorithms are optimal, and we analyze the difference between their efficiencies. An important factor is the total time taken for a unit to expand until it reaches its reachable positions; this factor, the total number of expansions that each unit makes over its reachable positions, is measured. The conclusion proposed is in terms of this factor: enumeration by recursion is better than enumeration by order in all situations.

  2. Nonmyopic Sensor Scheduling and its Efficient Implementation for Target Tracking Applications

    Directory of Open Access Journals (Sweden)

    Morrell Darryl

    2006-01-01

    Full Text Available We propose two nonmyopic sensor scheduling algorithms for target tracking applications. We consider a scenario where a bearing-only sensor is constrained to move in a finite number of directions to track a target in a two-dimensional plane. Both algorithms provide the best sensor sequence by minimizing a predicted expected scheduler cost over a finite time-horizon. The first algorithm approximately computes the scheduler costs based on the predicted covariance matrix of the tracker error. The second algorithm uses the unscented transform in conjunction with a particle filter to approximate covariance-based costs or information-theoretic costs. We also propose the use of two branch-and-bound-based optimal pruning algorithms for efficient implementation of the scheduling algorithms. We design the first pruning algorithm by combining branch-and-bound with a breadth-first search and a greedy-search; the second pruning algorithm combines branch-and-bound with a uniform-cost search. Simulation results demonstrate the advantage of nonmyopic scheduling over myopic scheduling and the significant savings in computational and memory resources when using the pruning algorithms.

  3. Multi-objective optimization in the presence of practical constraints using non-dominated sorting hybrid cuckoo search algorithm

    Directory of Open Access Journals (Sweden)

    M. Balasubbareddy

    2015-12-01

    Full Text Available A novel optimization algorithm is proposed to solve single and multi-objective optimization problems with generation fuel cost, emission, and total power losses as objectives. The proposed method is a hybridization of the conventional cuckoo search algorithm and arithmetic crossover operations. Thus, the non-linear, non-convex objective function can be solved under practical constraints. The effectiveness of the proposed algorithm is analyzed for various cases to illustrate the effect of practical constraints on the objectives' optimization. Two and three objective multi-objective optimization problems are formulated and solved using the proposed non-dominated sorting-based hybrid cuckoo search algorithm. The effectiveness of the proposed method in confining the Pareto front solutions in the solution region is analyzed. The results for single and multi-objective optimization problems are physically interpreted on standard test functions as well as the IEEE-30 bus test system with supporting numerical and graphical results and also validated against existing methods.

  4. Hybridisations of Variable Neighbourhood Search and Modified Simplex Elements to Harmony Search and Shuffled Frog Leaping Algorithms for Process Optimisations

    Science.gov (United States)

    Aungkulanon, P.; Luangpaiboon, P.

    2010-10-01

    Nowadays, the engineering problem systems are large and complicated. An effective finite sequence of instructions for solving these problems can be categorised into optimisation and meta-heuristic algorithms. Though the best decision variable levels from some sets of available alternatives cannot be done, meta-heuristics is an alternative for experience-based techniques that rapidly help in problem solving, learning and discovery in the hope of obtaining a more efficient or more robust procedure. All meta-heuristics provide auxiliary procedures in terms of their own tooled box functions. It has been shown that the effectiveness of all meta-heuristics depends almost exclusively on these auxiliary functions. In fact, the auxiliary procedure from one can be implemented into other meta-heuristics. Well-known meta-heuristics of harmony search (HSA) and shuffled frog-leaping algorithms (SFLA) are compared with their hybridisations. HSA is used to produce a near optimal solution under a consideration of the perfect state of harmony of the improvisation process of musicians. A meta-heuristic of the SFLA, based on a population, is a cooperative search metaphor inspired by natural memetics. It includes elements of local search and global information exchange. This study presents solution procedures via constrained and unconstrained problems with different natures of single and multi peak surfaces including a curved ridge surface. Both meta-heuristics are modified via the variable neighbourhood search method (VNSM) philosophy including a modified simplex method (MSM). The basic idea is the change of neighbourhoods during searching for a better solution. The hybridisations proceed by a descent method to a local minimum exploring then, systematically or at random, increasingly distant neighbourhoods of this local solution. The results show that the variant of HSA with VNSM and MSM seems to be better in terms of the mean and variance of design points and yields.

  5. Top-k Keyword Search Over Graphs Based On Backward Search

    Directory of Open Access Journals (Sweden)

    Zeng Jia-Hui

    2017-01-01

    Full Text Available Keyword search is one of the most friendly and intuitive information retrieval methods. Using keyword search to obtain a connected subgraph has many applications in graph-based cognitive computation, and it is a basic technology. This paper focuses on top-k keyword searching over graphs. We implemented a keyword search algorithm which applies the backward search idea. The algorithm locates the keyword vertices first, and then applies backward search to find rooted trees that contain the query keywords. The experiment shows that query time is affected by the iteration number of the algorithm.
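
    The backward-search idea can be sketched as follows: reverse the edges, run a BFS from every keyword group, and score each node that reaches all groups as a candidate root of an answer tree. The snippet below is a simplified sketch with a hop-count score; the paper's actual scoring and iteration strategy may differ.

```python
from collections import defaultdict, deque
import heapq

def backward_topk_search(graph, keyword_nodes, k=3):
    """Top-k keyword search by backward expansion (simplified sketch).

    graph:         dict node -> iterable of successor nodes (directed edges)
    keyword_nodes: list of node sets, one set per query keyword
    Returns up to k (score, root) pairs, where a root can reach every keyword
    group and the score is the summed shortest hop distance to the groups.
    """
    # Reverse the edges once so we can walk from keyword vertices backwards.
    rev = defaultdict(list)
    for u, succs in graph.items():
        for v in succs:
            rev[v].append(u)

    # BFS backwards from every keyword group, recording hop distances.
    dist = []                                     # dist[g][node] = hops to group g
    for group in keyword_nodes:
        d = {n: 0 for n in group}
        dq = deque(group)
        while dq:
            v = dq.popleft()
            for u in rev[v]:
                if u not in d:
                    d[u] = d[v] + 1
                    dq.append(u)
        dist.append(d)

    # A node reached by every backward BFS is a candidate root of an answer tree.
    candidates = set(dist[0])
    for d in dist[1:]:
        candidates &= set(d)
    scored = [(sum(d[r] for d in dist), r) for r in candidates]
    return heapq.nsmallest(k, scored)

# Toy example: two keywords matched at nodes {'c'} and {'e'}.
g = {'a': ['b', 'd'], 'b': ['c'], 'd': ['e'], 'c': [], 'e': []}
print(backward_topk_search(g, [{'c'}, {'e'}], k=2))   # 'a' reaches both keywords
```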

  6. A Robust Parallel Algorithm for Combinatorial Compressed Sensing

    Science.gov (United States)

    Mendoza-Smith, Rodrigo; Tanner, Jared W.; Wechsung, Florian

    2018-04-01

    In previous work two of the authors have shown that a vector $x \in \mathbb{R}^n$ with at most $k$ nonzero entries can be recovered from its sketch $Ax$ by the Parallel-$\ell_0$ decoding algorithm, where $\mathrm{nnz}(A)$ denotes the number of nonzero entries in $A \in \mathbb{R}^{m \times n}$. In this paper we present the Robust-$\ell_0$ decoding algorithm, which robustifies Parallel-$\ell_0$ when the sketch $Ax$ is corrupted by additive noise. This robustness is achieved by approximating the asymptotic posterior distribution of values in the sketch given its corrupted measurements. We provide analytic expressions that approximate these posteriors under the assumptions that the nonzero entries in the signal and the noise are drawn from continuous distributions. Numerical experiments presented show that Robust-$\ell_0$ is superior to existing greedy and combinatorial compressed sensing algorithms in the presence of small to moderate signal-to-noise ratios in the setting of Gaussian signals and Gaussian additive noise.

  7. Robust MST-Based Clustering Algorithm.

    Science.gov (United States)

    Liu, Qidong; Zhang, Ruisheng; Zhao, Zhili; Wang, Zhenghai; Jiao, Mengyao; Wang, Guangjing

    2018-06-01

    Minimax similarity stresses the connectedness of points via mediating elements rather than favoring high mutual similarity. The grouping principle yields superior clustering results when mining arbitrarily-shaped clusters in data. However, it is not robust against noises and outliers in the data. There are two main problems with the grouping principle: first, a single object that is far away from all other objects defines a separate cluster, and second, two connected clusters would be regarded as two parts of one cluster. In order to solve such problems, we propose robust minimum spanning tree (MST)-based clustering algorithm in this letter. First, we separate the connected objects by applying a density-based coarsening phase, resulting in a low-rank matrix in which the element denotes the supernode by combining a set of nodes. Then a greedy method is presented to partition those supernodes through working on the low-rank matrix. Instead of removing the longest edges from MST, our algorithm groups the data set based on the minimax similarity. Finally, the assignment of all data points can be achieved through their corresponding supernodes. Experimental results on many synthetic and real-world data sets show that our algorithm consistently outperforms compared clustering algorithms.

  8. Day-ahead distributed energy resource scheduling using differential search algorithm

    DEFF Research Database (Denmark)

    Soares, J.; Lobo, C.; Silva, M.

    2015-01-01

    The number of dispersed energy resources is growing every day, such as the use of more distributed generators. This paper deals with energy resource scheduling model in future smart grids. The methodology can be used by virtual power players (VPPs) considering day-ahead time horizon. This method...... considers that energy resources are managed by a VPP which establishes contracts with their owners. The full AC power flow calculation included in the model takes into account network constraints. This paper presents an application of differential search algorithm (DSA) for solving the day-ahead scheduling...

  9. Searching for continuous gravitational wave signals. The hierarchical Hough transform algorithm

    International Nuclear Information System (INIS)

    Papa, M.; Schutz, B.F.; Sintes, A.M.

    2001-01-01

    It is well known that matched filtering techniques cannot be applied for searching extensive parameter space volumes for continuous gravitational wave signals. This is the reason why alternative strategies are being pursued. Hierarchical strategies are best at investigating a large parameter space when there exist computational power constraints. Algorithms of this kind are being implemented by all the groups that are developing software for analyzing the data of the gravitational wave detectors that will come online in the next years. In this talk I will report about the hierarchical Hough transform method that the GEO 600 data analysis team at the Albert Einstein Institute is developing. The three step hierarchical algorithm has been described elsewhere [8]. In this talk I will focus on some of the implementational aspects we are currently concerned with. (author)

  10. Breadth-First Search-Based Single-Phase Algorithms for Bridge Detection in Wireless Sensor Networks

    Science.gov (United States)

    Akram, Vahid Khalilpour; Dagdeviren, Orhan

    2013-01-01

    Wireless sensor networks (WSNs) are promising technologies for exploring harsh environments, such as oceans, wild forests, volcanic regions and outer space. Since sensor nodes may have limited transmission range, application packets may be transmitted by multi-hop communication. Thus, connectivity is a very important issue. A bridge is a critical edge whose removal breaks the connectivity of the network. Hence, it is crucial to detect bridges and take preventions. Since sensor nodes are battery-powered, services running on nodes should consume low energy. In this paper, we propose energy-efficient and distributed bridge detection algorithms for WSNs. Our algorithms run single phase and they are integrated with the Breadth-First Search (BFS) algorithm, which is a popular routing algorithm. Our first algorithm is an extended version of Milic's algorithm, which is designed to reduce the message length. Our second algorithm is novel and uses ancestral knowledge to detect bridges. We explain the operation of the algorithms, analyze their proof of correctness, message, time, space and computational complexities. To evaluate practical importance, we provide testbed experiments and extensive simulations. We show that our proposed algorithms provide less resource consumption, and the energy savings of our algorithms are up by 5.5-times. PMID:23845930

  11. Improved Algorithms OF CELF and CELF++ for Influence Maximization

    Directory of Open Access Journals (Sweden)

    Jiaguo Lv

    2014-06-01

    Full Text Available Motivated by wide applications in fields such as viral marketing and sales promotion, influence maximization has become one of the most important and extensively studied problems in social networks. However, the classical KK-Greedy algorithm for influence maximization is inefficient. Two major sources of the algorithm's inefficiency are analyzed in this paper. Following the analysis of the CELF and CELF++ algorithms, nodes in the influenced set of a newly selected seed u never bring any further marginal gain. Through this optimization strategy, many redundant nodes can be removed from the candidate set. Based on this strategy, two improved algorithms, Lv_CELF and Lv_CELF++, are proposed in this study. To evaluate the two algorithms, they and their benchmark algorithms CELF and CELF++ were run on several real-world datasets. Influence degree and running time were employed to measure performance and efficiency, respectively. Experimental results showed that, compared with the benchmark algorithms CELF and CELF++, the new algorithms Lv_CELF and Lv_CELF++ achieved matching effectiveness with higher efficiency. Solutions with the proposed optimization strategy can be useful for decision-making problems in scenarios related to influence maximization.
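
    For context, the lazy-forward evaluation that CELF (and its variants) build on can be sketched in a few lines: marginal gains are kept in a max-heap and only recomputed when an entry surfaces, which is valid because influence spread is submodular. The spread function is left abstract here; in practice it is typically a Monte Carlo estimate of a cascade model.

```python
import heapq

def celf(nodes, spread, k):
    """Lazy-forward greedy seed selection (the CELF idea).

    `spread(seeds)` is an expensive influence-spread estimator; marginal gains
    are kept in a max-heap and only re-evaluated when a node reaches the top,
    because gains can only shrink as the seed set grows (submodularity).
    """
    seeds, total = [], 0.0
    # Initial pass: marginal gain of every node w.r.t. the empty seed set.
    heap = [(-spread([v]), v, 0) for v in nodes]        # (neg gain, node, round evaluated)
    heapq.heapify(heap)
    while len(seeds) < k and heap:
        neg_gain, v, last = heapq.heappop(heap)
        if last == len(seeds):                          # gain is up to date -> pick it
            seeds.append(v)
            total += -neg_gain
        else:                                           # stale: recompute and push back
            gain = spread(seeds + [v]) - total
            heapq.heappush(heap, (-gain, v, len(seeds)))
    return seeds, total
```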

  12. Recurrent neural network-based modeling of gene regulatory network using elephant swarm water search algorithm.

    Science.gov (United States)

    Mandal, Sudip; Saha, Goutam; Pal, Rajat Kumar

    2017-08-01

    Correct inference of genetic regulations inside a cell from biological databases like time-series microarray data is one of the greatest challenges in the post-genomic era for biologists and researchers. The Recurrent Neural Network (RNN) is one of the most popular and simple approaches to model the dynamics as well as to infer correct dependencies among genes. Inspired by the behavior of social elephants, we propose a new metaheuristic, namely the Elephant Swarm Water Search Algorithm (ESWSA), to infer Gene Regulatory Networks (GRN). This algorithm is mainly based on the water search strategy of intelligent and social elephants during drought, utilizing different types of communication techniques. Initially, the algorithm is tested against benchmark small and medium scale artificial genetic networks without and with the presence of different noise levels, and the efficiency is observed in terms of parametric error, minimum fitness value, execution time, accuracy of prediction of true regulation, etc. Next, the proposed algorithm is tested against the real time-series gene expression data of the Escherichia coli SOS network, and the results are compared with other state-of-the-art optimization methods. The experimental results suggest that ESWSA is very efficient for the GRN inference problem and performs better than other methods in many ways.

  13. Research on Multirobot Pursuit Task Allocation Algorithm Based on Emotional Cooperation Factor

    Directory of Open Access Journals (Sweden)

    Baofu Fang

    2014-01-01

    Full Text Available Multirobot task allocation is a hot issue in the field of robot research. A new emotional model is used with the self-interested robot, which gives a new way to measure self-interested robots' individual cooperative willingness in the problem of multirobot task allocation. An emotional cooperation factor is introduced into the self-interested robot; it is updated based on emotional attenuation and external stimuli. Then a multirobot pursuit task allocation algorithm is proposed, which is based on the emotional cooperation factor. Combined with a two-step auction algorithm, team leaders and team collaborators are recruited, pursuit teams are set up, and finally certain strategies are used to complete the pursuit task. In order to verify the effectiveness of this algorithm, some comparison experiments have been done with the instantaneous greedy optimal auction algorithm; the results of the experiments show that the total pursuit time and total team revenue can be optimized by using this algorithm.

  14. A greedy double swap heuristic for nurse scheduling

    Directory of Open Access Journals (Sweden)

    Murphy Choy

    2012-10-01

    Full Text Available One of the key challenges of the nurse scheduling problem (NSP) is the number of constraints placed on preparing the timetable, both from the regulatory requirements as well as from the patients' demand for the appropriate nursing care specialists. In addition, the preferences of the nursing staff related to their work schedules add another dimension of complexity. Most solutions proposed for solving nurse scheduling involve the use of mathematical programming and generally consider only the hard constraints. However, the psychological needs of the nurses are ignored, and this results in subsequent interventions by the nursing staff to remedy any deficiency, often leading to last-minute changes to the schedule. In this paper, we present a staff preference optimization framework solved with a greedy double swap heuristic. The heuristic yields good performance in terms of speed at solving the problem. The heuristic is simple and we demonstrate its performance by implementing it on open source spreadsheet software.

  15. Starvation dynamics of a greedy forager

    Science.gov (United States)

    Bhat, U.; Redner, S.; Bénichou, O.

    2017-07-01

    We investigate the dynamics of a greedy forager that moves by random walking in an environment where each site initially contains one unit of food. Upon encountering a food-containing site, the forager eats all the food there and can subsequently hop an additional S steps without food before starving to death. Upon encountering an empty site, the forager goes hungry and comes one time unit closer to starvation. We investigate the new feature of forager greed; if the forager has a choice between hopping to an empty site or to a food-containing site in its nearest neighborhood, it hops preferentially towards food. If the neighboring sites all contain food or are all empty, the forager hops equiprobably to one of these neighbors. Paradoxically, the lifetime of the forager can depend non-monotonically on greed, and the sense of the non-monotonicity is opposite in one and two dimensions. Even more unexpectedly, the forager lifetime in one dimension is substantially enhanced when the greed is negative; here the forager tends to avoid food in its local neighborhood. We also determine the average amount of food consumed at the instant when the forager starves. We present analytic, heuristic, and numerical results to elucidate these intriguing phenomena.
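
    A direct simulation of the one-dimensional model makes the greed parameter concrete: when exactly one neighbour holds food, the forager hops towards it with probability (1 + greed)/2, so negative greed means avoiding food. The sketch below is a simplified finite-lattice simulation with illustrative parameters, not the analytic treatment of the paper.

```python
import random

def forager_lifetime(S, greed, n_sites=10001, rng=random.Random(0)):
    """Simulate one greedy forager on a 1-D lattice and return its lifetime.

    Each site initially holds one unit of food; after eating, the forager can
    take at most S steps without food.  `greed` in [-1, 1] sets the bias
    towards (positive) or away from (negative) food among the two neighbours.
    """
    food = [True] * n_sites
    pos = n_sites // 2
    food[pos] = False                      # the forager eats the starting site
    hunger, time = 0, 0
    while hunger < S:
        left, right = pos - 1, pos + 1
        if not (0 <= left and right < n_sites):
            break                           # fell off the finite lattice
        fl, fr = food[left], food[right]
        if fl != fr:                        # exactly one neighbour has food
            p_food = (1 + greed) / 2        # probability of hopping towards it
            go_left = (fl and rng.random() < p_food) or (fr and rng.random() >= p_food)
        else:                               # both or neither: unbiased hop
            go_left = rng.random() < 0.5
        pos = left if go_left else right
        time += 1
        if food[pos]:
            food[pos] = False
            hunger = 0                      # a meal resets the starvation clock
        else:
            hunger += 1                     # one step closer to starvation
    return time

# Average lifetime over a few runs for a moderately greedy forager with S = 20.
runs = [forager_lifetime(S=20, greed=0.8) for _ in range(50)]
print(sum(runs) / len(runs))
```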

  16. A Hybrid Harmony Search Algorithm Approach for Optimal Power Flow

    Directory of Open Access Journals (Sweden)

    Mimoun YOUNES

    2012-08-01

    Full Text Available Optimal Power Flow (OPF) is one of the main functions of power system operation. It determines the optimal settings of generating units, bus voltages, transformer taps and shunt elements in the power system with the objective of minimizing total production costs or losses while the system is operating within its security limits. The aim of this paper is to propose a novel methodology (BCGAs-HSA) that solves OPF including both active and reactive power dispatch. It is based on combining the binary-coded genetic algorithm (BCGAs) and the harmony search algorithm (HSA) to determine the optimal global solution. This method was tested on the modified IEEE 30 bus test system. The results obtained by this method are compared with those obtained with BCGAs or HSA separately. The results show that the BCGAs-HSA approach can converge to the optimum solution with accuracy compared to those reported recently in the literature.

  17. Optimization of partial search

    International Nuclear Information System (INIS)

    Korepin, Vladimir E

    2005-01-01

    A quantum Grover search algorithm can find a target item in a database faster than any classical algorithm. One can trade accuracy for speed and find a part of the database (a block) containing the target item even faster; this is partial search. A partial search algorithm was recently suggested by Grover and Radhakrishnan. Here we optimize it. Efficiency of the search algorithm is measured by the number of queries to the oracle. The author suggests a new version of the Grover-Radhakrishnan algorithm which uses a minimal number of such queries. The algorithm can run on the same hardware that is used for the usual Grover algorithm. (letter to the editor)

  18. Modified Backtracking Search Optimization Algorithm Inspired by Simulated Annealing for Constrained Engineering Optimization Problems

    Directory of Open Access Journals (Sweden)

    Hailong Wang

    2018-01-01

    Full Text Available The backtracking search optimization algorithm (BSA) is a population-based evolutionary algorithm for numerical optimization problems. BSA has a powerful global exploration capacity while its local exploitation capability is relatively poor. This affects the convergence speed of the algorithm. In this paper, we propose a modified BSA inspired by simulated annealing (BSAISA) to overcome the deficiency of BSA. In the BSAISA, the amplitude control factor (F) is modified based on the Metropolis criterion in simulated annealing. The redesigned F can be adaptively decreased as the number of iterations increases, and it does not introduce extra parameters. A self-adaptive ε-constrained method is used to handle the strict constraints. We compared the performance of the proposed BSAISA with BSA and other well-known algorithms when solving thirteen constrained benchmarks and five engineering design problems. The simulation results demonstrate that BSAISA is more effective than BSA and more competitive with other well-known algorithms in terms of convergence speed.

  19. Iterated Local Search Algorithm with Strategic Oscillation for School Bus Routing Problem with Bus Stop Selection

    Directory of Open Access Journals (Sweden)

    Mohammad Saied Fallah Niasar

    2017-02-01

    Full Text Available The school bus routing problem (SBRP) represents a variant of the well-known vehicle routing problem. The main goal of this study is to pick up students allocated to some bus stops and generate routes, including the selected stops, in order to carry students to school. In this paper, we have proposed a simple but effective metaheuristic approach that employs two features: first, it utilizes large neighborhood structures for a deeper exploration of the search space; second, the proposed heuristic executes an efficient transition between the feasible and infeasible portions of the search space. Exploration of the infeasible area is controlled by a dynamic penalty function to convert an infeasible solution into a feasible one. Two metaheuristics, called N-ILS (a variant of the Nearest Neighbourhood with Iterated Local Search algorithm) and I-ILS (a variant of Insertion with Iterated Local Search algorithm), are proposed to solve SBRP. Our experimental procedure is based on two data sets. The results show that N-ILS is able to obtain better solutions in shorter computing times. Additionally, N-ILS appears to be very competitive in comparison with the best existing metaheuristics suggested for SBRP.

  20. Improvement of the Gravitational Search Algorithm by means of Low-Discrepancy Sobol Quasi Random-Number Sequence Based Initialization

    Directory of Open Access Journals (Sweden)

    ALTINOZ, O. T.

    2014-08-01

    Full Text Available Nature-inspired optimization algorithms can obtain the optima by updating the position of each member in the population. At the beginning of the algorithm, the particles of the population are spread into the search space. The initial distribution of particles corresponds to the starting points of the search process. Hence, the aim is to alter the position of each particle, beginning with this initial position, until the optimum solution is found with respect to pre-determined conditions like the maximum number of iterations or a specific error value for the fitness function. Therefore, the initial positions of the population have a direct effect on both the accuracy of the optima and the computational cost. If any member of the population is close enough to the optimum, this eases the achievement of the exact solution. On the contrary, individuals grouped far away from the optima might yield pointless efforts. In this study, a low-discrepancy quasi-random number sequence is preferred for the localization of the population at the initialization phase. In this way, the population is distributed into the search space in a more uniform manner at the initialization phase. The technique is applied to the Gravitational Search Algorithm and compared via the performance on benchmark function solutions.
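
    The initialization step itself is short; the sketch below draws a scrambled Sobol low-discrepancy sample and rescales it to the search box, producing the initial agent positions that would replace uniform pseudo-random initialization in GSA. It assumes SciPy's qmc module (SciPy >= 1.7); the population size and bounds are illustrative.

```python
import numpy as np
from scipy.stats import qmc   # requires SciPy >= 1.7

def sobol_population(n_agents, lower, upper, seed=0):
    """Initial population spread over the box [lower, upper] with a
    low-discrepancy Sobol sequence instead of pseudo-random numbers."""
    lower = np.atleast_1d(lower).astype(float)
    upper = np.atleast_1d(upper).astype(float)
    sampler = qmc.Sobol(d=lower.size, scramble=True, seed=seed)
    unit = sampler.random(n_agents)               # points in [0, 1)^d
    return qmc.scale(unit, lower, upper)          # rescale to the search box

# Example: 32 agents for a 10-dimensional problem on [-100, 100]^10; this
# array would replace the uniform-random initialisation of GSA's masses.
pop = sobol_population(32, [-100.0] * 10, [100.0] * 10)
```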

  1. Dynamic route guidance algorithm based on artificial immune system

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    To improve the performance of the K-shortest paths search in intelligent traffic guidance systems, this paper proposes an optimal search algorithm based on intelligent optimization search theory and the memory mechanism of vertebrate immune systems. This algorithm, applied to the urban traffic network model established by the node-expanding method, can conveniently realize K-shortest paths search in urban traffic guidance systems. Because of the immune memory and global parallel search ability of artificial immune systems, the K shortest paths can be found without repetition, which clearly indicates the superiority of the algorithm over conventional ones. Not only does the algorithm achieve better parallelism, it also prevents the premature convergence that often occurs in genetic algorithms. Thus, it is especially suitable for the real-time requirements of traffic guidance systems and other engineering optimization applications. A case study verifies the efficiency and practicability of the aforementioned algorithm.

  2. Preconditioned dynamic mode decomposition and mode selection algorithms for large datasets using incremental proper orthogonal decomposition

    Science.gov (United States)

    Ohmichi, Yuya

    2017-07-01

    In this letter, we propose a simple and efficient framework of dynamic mode decomposition (DMD) and mode selection for large datasets. The proposed framework explicitly introduces a preconditioning step using an incremental proper orthogonal decomposition (POD) to DMD and mode selection algorithms. By performing the preconditioning step, the DMD and mode selection can be performed with low memory consumption and therefore can be applied to large datasets. Additionally, we propose a simple mode selection algorithm based on a greedy method. The proposed framework is applied to the analysis of three-dimensional flow around a circular cylinder.

  3. Optimal gravitational search algorithm for automatic generation control of interconnected power systems

    Directory of Open Access Journals (Sweden)

    Rabindra Kumar Sahu

    2014-09-01

    Full Text Available An attempt is made for the effective application of the Gravitational Search Algorithm (GSA) to optimize PI/PIDF controller parameters in Automatic Generation Control (AGC) of interconnected power systems. Initially, comparison of several conventional objective functions reveals that ITAE yields better system performance. Then, the parameters of the GSA technique are properly tuned and the GSA control parameters are proposed. The superiority of the proposed approach is demonstrated by comparing the results with those of some recently published techniques such as Differential Evolution (DE), Bacteria Foraging Optimization Algorithm (BFOA) and Genetic Algorithm (GA). Additionally, a sensitivity analysis is carried out that demonstrates the robustness of the optimized controller parameters to wide variations in operating loading conditions and in the time constants of the speed governor, turbine, and tie-line power. Finally, the proposed approach is extended to a more realistic power system model by considering physical constraints such as reheat turbine, Generation Rate Constraint (GRC) and Governor Dead Band nonlinearity.

  4. A comparative study of the A* heuristic search algorithm used to solve efficiently a puzzle game

    Science.gov (United States)

    Iordan, A. E.

    2018-01-01

    The puzzle game presented in this paper consists of polyhedra (prisms, pyramids or pyramidal frustums) which can be moved using the free available spaces. The problem requires finding the minimum number of movements needed for the game to reach a goal configuration starting from an initial configuration. Because the problem is quite complex, the principal difficulty in solving it is the size of the search space, which leads to the necessity of a heuristic search. Improving the search method consists in determining a strong estimate through the heuristic function, which guides the search process towards the most promising side of the search tree. The comparative study is carried out between the Manhattan heuristic and the Hamming heuristic using the A* search algorithm implemented in Java. This paper also presents the necessary stages in the object-oriented development of software used to solve this puzzle game efficiently. The modelling of the software is achieved through specific UML diagrams representing the phases of analysis, design and implementation, the system thus being described in a clear and practical manner. To confirm the theoretical results demonstrating that the Manhattan heuristic is more efficient, the space complexity criterion was used. The space complexity was measured by the number of nodes generated in the search tree, by the number of expanded nodes and by the effective branching factor. From the experimental results obtained by using the Manhattan heuristic, improvements were observed regarding the space complexity of the A* algorithm versus the Hamming heuristic.
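
    As a self-contained illustration of the comparison, the sketch below runs A* on the classic sliding-tile puzzle (a simpler stand-in for the polyhedra puzzle of the paper) with both the Manhattan and the Hamming heuristics and reports the number of expanded nodes, the space-complexity measure discussed above. The 3x3 instance is an illustrative assumption.

```python
import heapq

def manhattan(state, goal, width):
    """Sum over tiles of |row - goal_row| + |col - goal_col| (blank excluded)."""
    pos = {v: i for i, v in enumerate(goal)}
    return sum(abs(i // width - pos[v] // width) + abs(i % width - pos[v] % width)
               for i, v in enumerate(state) if v != 0)

def hamming(state, goal, width):
    """Number of misplaced tiles (blank excluded)."""
    return sum(1 for s, g in zip(state, goal) if s != g and s != 0)

def astar(start, goal, width, h):
    """A* over sliding-tile states; returns (number of moves, nodes expanded)."""
    start, goal = tuple(start), tuple(goal)
    open_heap = [(h(start, goal, width), 0, start)]
    g_cost, expanded = {start: 0}, 0
    while open_heap:
        f, g, s = heapq.heappop(open_heap)
        if s == goal:
            return g, expanded
        if g > g_cost[s]:
            continue                        # stale heap entry
        expanded += 1
        blank = s.index(0)
        r, c = divmod(blank, width)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < width and 0 <= nc < width:
                j = nr * width + nc
                t = list(s)
                t[blank], t[j] = t[j], t[blank]
                t = tuple(t)
                if g + 1 < g_cost.get(t, float("inf")):
                    g_cost[t] = g + 1
                    heapq.heappush(open_heap, (g + 1 + h(t, goal, width), g + 1, t))
    return None, expanded

goal = (1, 2, 3, 4, 5, 6, 7, 8, 0)
start = (1, 2, 3, 4, 0, 6, 7, 5, 8)
print(astar(start, goal, 3, manhattan))     # fewer expansions ...
print(astar(start, goal, 3, hamming))       # ... than with the Hamming heuristic
```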

  5. A Novel Quantum-Behaved Lightning Search Algorithm Approach to Improve the Fuzzy Logic Speed Controller for an Induction Motor Drive

    Directory of Open Access Journals (Sweden)

    Jamal Abd Ali

    2015-11-01

    Full Text Available This paper presents a novel lightning search algorithm (LSA) using quantum mechanics theories to generate a quantum-inspired LSA (QLSA). The QLSA improves the searching of each step leader to obtain the best position for a projectile. To evaluate the reliability and efficiency of the proposed algorithm, the QLSA is tested using eighteen benchmark functions with various characteristics. The QLSA is applied to improve the design of the fuzzy logic controller (FLC) for controlling the speed response of the induction motor drive. The proposed algorithm avoids the exhaustive conventional trial-and-error procedure for obtaining membership functions (MFs). The generated adaptive input and output MFs are implemented in the fuzzy speed controller design to formulate the objective functions. The mean absolute error (MAE) of the rotor speed is the objective function of the optimization controller. An optimal QLSA-based FLC (QLSAF) optimization controller is employed to tune and minimize the MAE, thereby improving the performance of the induction motor with changes in speed and mechanical load. To validate the performance of the developed controller, the results obtained with the QLSAF are compared to the results obtained with LSA, the backtracking search algorithm (BSA), the gravitational search algorithm (GSA), particle swarm optimization (PSO) and proportional integral derivative (PID) controllers, respectively. Results show that the QLSAF outperforms the other control methods in all of the tested cases in terms of damping capability and transient response under different mechanical loads and speeds.

  6. Optimal Capacitor Placement in Wind Farms by Considering Harmonics Using Discrete Lightning Search Algorithm

    Directory of Open Access Journals (Sweden)

    Reza Sirjani

    2017-09-01

    Full Text Available Currently, many wind farms exist throughout the world and, in some cases, supply a significant portion of energy to networks. However, numerous uncertainties remain with respect to the amount of energy generated by wind turbines and other sophisticated operational aspects, such as voltage and reactive power management, which require further development and consideration. To fix the problem of poor reactive power compensation in wind farms, optimal capacitor placement has been proposed in existing wind farms as a simple and relatively inexpensive method. However, the use of induction generators, transformers, and additional capacitors represents potential problems for the harmonics of a system and therefore must be taken into account at wind farms. The optimal location and size of capacitors at the buses of an 80-MW wind farm were determined according to modelled wind speed, system equivalent circuits, and harmonics in order to minimize energy losses, optimize reactive power and reduce management costs. The discrete version of the lightning search algorithm (DLSA) is a powerful and flexible nature-inspired optimization technique that was developed and implemented herein for optimal capacitor placement in wind farms. The obtained results are compared with the results of the genetic algorithm (GA) and the discrete harmony search algorithm (DHSA).

  7. Comparative Study on Feature Selection and Fusion Schemes for Emotion Recognition from Speech

    Directory of Open Access Journals (Sweden)

    Santiago Planet

    2012-09-01

    Full Text Available The automatic analysis of speech to detect affective states may improve the way users interact with electronic devices. However, analysis at the acoustic level alone may not be enough to determine the emotion of a user in a realistic scenario. In this paper we analyzed the spontaneous speech recordings of the FAU Aibo Corpus at the acoustic and linguistic levels to extract two sets of features. The acoustic set was reduced by a greedy procedure that selects the most relevant features to optimize the learning stage. We compared two versions of this greedy selection algorithm, performing the search for relevant features forwards and backwards. We experimented with three classification approaches: Naïve-Bayes, a support vector machine and a logistic model tree, and two fusion schemes: decision-level fusion, merging the hard decisions of the acoustic and linguistic classifiers by means of a decision tree; and feature-level fusion, concatenating both sets of features before the learning stage. Despite the low performance achieved by the linguistic data on its own, combining it with the acoustic information produced a dramatic improvement over the results achieved by the acoustic modality alone. The classifiers using the parameters merged at feature level outperformed the decision-level fusion scheme, despite the simplicity of the latter. Moreover, the extremely reduced set of acoustic features obtained by the greedy forward search selection algorithm improved on the results provided by the full set.
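
    The greedy forward variant of the selection procedure mentioned above can be sketched generically as follows, assuming an evaluate(subset) function that returns a score to maximize (for example, the cross-validated accuracy of one of the classifiers); this illustrates the selection loop rather than the exact procedure of the paper.

      def greedy_forward_selection(all_features, evaluate):
          # Add, at each step, the feature that most improves the score;
          # stop when no single addition improves the current best score.
          selected, remaining = [], list(all_features)
          best_score = float("-inf")
          while remaining:
              candidate, candidate_score = None, best_score
              for f in remaining:
                  score = evaluate(selected + [f])
                  if score > candidate_score:
                      candidate, candidate_score = f, score
              if candidate is None:
                  break
              selected.append(candidate)
              remaining.remove(candidate)
              best_score = candidate_score
          return selected, best_score

    The backward variant works the same way in reverse, starting from the full set and removing the feature whose removal hurts the score least.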

  8. Greedy algorithms and Zipf laws

    Science.gov (United States)

    Moran, José; Bouchaud, Jean-Philippe

    2018-04-01

    We consider a simple model of firm/city/etc growth based on a multi-item criterion: whenever entity B fares better than entity A on a subset of M items out of K, the agent originally in A moves to B. We solve the model analytically in the cases K = 1 and K → ∞. The resulting stationary distribution of sizes is generically a Zipf-law provided M > K/2. When M ≤ K/2, no selection occurs and the size distribution remains thin-tailed. In the special case M = K, one needs to regularize the problem by introducing a small ‘default’ probability ϕ. We find that the stationary distribution has a power-law tail that becomes a Zipf-law when ϕ → 0. The approach to the stationary state can also be characterized, with strong similarities to a simple ‘aging’ model considered by Barrat and Mézard.
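
    The move rule itself is straightforward to simulate. The sketch below is one simple interpretation (each entity carries K fixed random item scores, and a randomly picked agent moves to a randomly picked entity that beats its current one on at least M of the K items); it is only a numerical illustration of how the resulting size distribution can be probed, not the authors' analytical treatment.

      import random
      from collections import Counter

      def simulate(n_entities=200, n_agents=20000, K=5, M=4, steps=200000, seed=0):
          rng = random.Random(seed)
          scores = [[rng.random() for _ in range(K)] for _ in range(n_entities)]
          location = [rng.randrange(n_entities) for _ in range(n_agents)]
          for _ in range(steps):
              agent = rng.randrange(n_agents)
              a, b = location[agent], rng.randrange(n_entities)
              wins = sum(sb > sa for sa, sb in zip(scores[a], scores[b]))
              if wins >= M:                     # B fares better on >= M of K items
                  location[agent] = b
          sizes = Counter(location)             # number of agents per entity
          return sorted(sizes.values(), reverse=True)

    Plotting the ranked sizes on a log-log scale is the usual way to check for Zipf-like behaviour.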

  9. Real Time Search Algorithm for Observation Outliers During Monitoring Engineering Constructions

    Science.gov (United States)

    Latos, Dorota; Kolanowski, Bogdan; Pachelski, Wojciech; Sołoducha, Ryszard

    2017-12-01

    Real time monitoring of engineering structures in case of an emergency or disaster requires the collection of a large amount of data to be processed by specific analytical techniques. A quick and accurate assessment of the state of the object is crucial for a possible rescue action. One of the more significant evaluation methods for large sets of data, collected either during a specified interval of time or permanently, is time series analysis. This paper presents a search algorithm for those time series elements which deviate from the values expected during monitoring. Quick and proper detection of observations indicating anomalous behavior of the structure allows a variety of preventive actions to be taken. In the algorithm, the mathematical formulae used provide maximal sensitivity to detect even minimal changes in the object's behavior. Sensitivity analyses were conducted for the moving-average algorithm as well as for the Douglas-Peucker algorithm used for the generalization of linear objects in GIS. In addition to determining the size of deviations from the average, the so-called Hausdorff distance was used. The simulations carried out and the verification on laboratory survey data showed that the approach provides sufficient sensitivity for automatic real time analysis of the large amounts of data obtained from various sensors (total stations, leveling, camera, radar).

  10. Real Time Search Algorithm for Observation Outliers During Monitoring Engineering Constructions

    Directory of Open Access Journals (Sweden)

    Latos Dorota

    2017-12-01

    Full Text Available Real time monitoring of engineering structures in case of an emergency or disaster requires the collection of a large amount of data to be processed by specific analytical techniques. A quick and accurate assessment of the state of the object is crucial for a possible rescue action. One of the more significant evaluation methods for large sets of data, collected either during a specified interval of time or permanently, is time series analysis. This paper presents a search algorithm for those time series elements which deviate from the values expected during monitoring. Quick and proper detection of observations indicating anomalous behavior of the structure allows a variety of preventive actions to be taken. In the algorithm, the mathematical formulae used provide maximal sensitivity to detect even minimal changes in the object’s behavior. Sensitivity analyses were conducted for the moving-average algorithm as well as for the Douglas-Peucker algorithm used for the generalization of linear objects in GIS. In addition to determining the size of deviations from the average, the so-called Hausdorff distance was used. The simulations carried out and the verification on laboratory survey data showed that the approach provides sufficient sensitivity for automatic real time analysis of the large amounts of data obtained from various sensors (total stations, leveling, camera, radar).
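
    A minimal version of the moving-average screening described above can be sketched as follows; the window length and the threshold, expressed as a multiple of the running standard deviation, are illustrative assumptions rather than values taken from the paper.

      from statistics import mean, stdev

      def flag_outliers(series, window=20, k=3.0):
          # Flag observations deviating more than k standard deviations
          # from the moving average of the preceding window.
          flags = []
          for i, value in enumerate(series):
              history = series[max(0, i - window):i]
              if len(history) < 2:
                  flags.append(False)
                  continue
              mu, sigma = mean(history), stdev(history)
              flags.append(sigma > 0 and abs(value - mu) > k * sigma)
          return flags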

  11. A “Tuned” Mask Learnt Approach Based on Gravitational Search Algorithm

    Directory of Open Access Journals (Sweden)

    Youchuan Wan

    2016-01-01

    Full Text Available Texture image classification is an important topic in many applications in machine vision and image analysis. Texture feature extraction from the original texture image using a “Tuned” mask is one of the simplest and most effective methods. However, hill-climbing-based training methods cannot reliably acquire a satisfactory mask in a single run; on the other hand, some commonly used evolutionary algorithms such as the genetic algorithm (GA) and particle swarm optimization (PSO) easily fall into local optima. A novel approach to texture image classification, exemplified by the recognition of residential areas, is detailed in this paper. In the proposed approach, the design of the “Tuned” mask is viewed as a constrained optimization problem and the optimal “Tuned” mask is acquired by maximizing the texture energy via a newly proposed gravitational search algorithm (GSA). The optimal “Tuned” mask is achieved through the convergence of the GSA. The proposed approach has been tested on public texture and remote sensing images. The results are then compared with those of GA, PSO, honey-bee mating optimization (HBMO) and the artificial immune algorithm (AIA). Moreover, features extracted by Gabor wavelets are also utilized to make a further comparison. Experimental results show that the proposed method is robust and adaptive and exhibits better performance than the other methods considered in the paper in terms of fitness value and classification accuracy.

  12. Nature-inspired optimization algorithms

    CERN Document Server

    Yang, Xin-She

    2014-01-01

    Nature-Inspired Optimization Algorithms provides a systematic introduction to all major nature-inspired algorithms for optimization. The book's unified approach, balancing algorithm introduction, theoretical background and practical implementation, complements extensive literature with well-chosen case studies to illustrate how these algorithms work. Topics include particle swarm optimization, ant and bee algorithms, simulated annealing, cuckoo search, firefly algorithm, bat algorithm, flower algorithm, harmony search, algorithm analysis, constraint handling, hybrid methods, parameter tuning

  13. Discrete Particle Swarm Optimization Routing Protocol for Wireless Sensor Networks with Multiple Mobile Sinks.

    Science.gov (United States)

    Yang, Jin; Liu, Fagui; Cao, Jianneng; Wang, Liangming

    2016-07-14

    Mobile sinks can achieve load-balancing and energy-consumption balancing across wireless sensor networks (WSNs). However, the frequent change of the paths between source nodes and the sinks caused by sink mobility introduces significant overhead in terms of energy and packet delays. To enhance the network performance of WSNs with mobile sinks (MWSNs), we present an efficient routing strategy, which is formulated as an optimization problem and employs the particle swarm optimization algorithm (PSO) to build the optimal routing paths. However, the conventional PSO is insufficient to solve discrete routing optimization problems. Therefore, a novel greedy discrete particle swarm optimization with memory (GMDPSO) is put forward to address this problem. In the GMDPSO, the particle position and velocity of traditional PSO are redefined for the discrete MWSN scenario. The particle updating rule is also reconsidered based on the subnetwork topology of MWSNs. In addition, by improving greedy forwarding routing, a greedy search strategy is designed to drive particles to better positions quickly. Furthermore, the search history is memorized to accelerate convergence. Simulation results demonstrate that our new protocol significantly improves the robustness and adapts to rapid topological changes with multiple mobile sinks, while efficiently reducing the communication overhead and the energy consumption.

  14. Efficient Algorithms for Searching the Minimum Information Partition in Integrated Information Theory

    Science.gov (United States)

    Kitazono, Jun; Kanai, Ryota; Oizumi, Masafumi

    2018-03-01

    The ability to integrate information in the brain is considered to be an essential property for cognition and consciousness. Integrated Information Theory (IIT) hypothesizes that the amount of integrated information ($\Phi$) in the brain is related to the level of consciousness. IIT proposes that, to quantify information integration in a system as a whole, integrated information should be measured across the partition of the system at which the information loss caused by partitioning is minimized, called the Minimum Information Partition (MIP). The computational cost of exhaustively searching for the MIP grows exponentially with system size, making it difficult to apply IIT to real neural data. It has been previously shown that if a measure of $\Phi$ satisfies a mathematical property, submodularity, the MIP can be found in polynomial time by an optimization algorithm. However, although the first version of $\Phi$ is submodular, the later versions are not. In this study, we empirically explore to what extent the algorithm can be applied to the non-submodular measures of $\Phi$ by evaluating the accuracy of the algorithm on simulated data and real neural data. We find that the algorithm identifies the MIP in a nearly perfect manner even for the non-submodular measures. Our results show that the algorithm allows us to measure $\Phi$ in large systems within a practical amount of time.
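
    For small systems the MIP can be found by brute force, which makes the exponential cost mentioned above concrete. The sketch below enumerates all bipartitions of a set of elements and keeps the one minimizing a user-supplied phi_loss(part_a, part_b) function (the integrated information measured across the cut); it is a naive exponential-time baseline for illustration, not the polynomial-time submodularity-based algorithm evaluated in the paper.

      from itertools import combinations

      def minimum_information_partition(elements, phi_loss):
          elements = list(elements)
          n = len(elements)
          best_cut, best_value = None, float("inf")
          rest = elements[1:]
          # Fixing element 0 on one side avoids counting each bipartition twice.
          for size in range(0, n - 1):
              for combo in combinations(rest, size):
                  part_a = [elements[0], *combo]
                  part_b = [e for e in rest if e not in combo]
                  value = phi_loss(part_a, part_b)
                  if value < best_value:
                      best_cut, best_value = (part_a, part_b), value
          return best_cut, best_value

    With 2**(n-1) - 1 bipartitions of n elements, this search becomes infeasible long before realistic neural-data sizes, which is why the polynomial-time algorithm matters.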

  15. Robust total energy demand estimation with a hybrid Variable Neighborhood Search – Extreme Learning Machine algorithm

    International Nuclear Information System (INIS)

    Sánchez-Oro, J.; Duarte, A.; Salcedo-Sanz, S.

    2016-01-01

    Highlights: • The total energy demand in Spain is estimated with a Variable Neighborhood algorithm. • Socio-economic variables are used, and one year ahead prediction horizon is considered. • Improvement of the prediction with an Extreme Learning Machine network is considered. • Experiments are carried out in real data for the case of Spain. - Abstract: Energy demand prediction is an important problem whose solution is evaluated by policy makers in order to take key decisions affecting the economy of a country. A number of previous approaches to improve the quality of this estimation have been proposed in the last decade, the majority of them applying different machine learning techniques. In this paper, the performance of a robust hybrid approach, composed of a Variable Neighborhood Search algorithm and a new class of neural network called Extreme Learning Machine, is discussed. The Variable Neighborhood Search algorithm is focused on obtaining the most relevant features among the set of initial ones, by including an exponential prediction model. While previous approaches consider that the number of macroeconomic variables used for prediction is a parameter of the algorithm (i.e., it is fixed a priori), the proposed Variable Neighborhood Search method optimizes both the number of variables and which variables to use. After this first step of feature selection, an Extreme Learning Machine network is applied to obtain the final energy demand prediction. Experiments in a real case of energy demand estimation in Spain show the excellent performance of the proposed approach. In particular, the whole method obtains an estimation of the energy demand with an error lower than 2%, even when considering the crisis years, which are a real challenge.

  16. A Hybrid Seasonal Mechanism with a Chaotic Cuckoo Search Algorithm with a Support Vector Regression Model for Electric Load Forecasting

    Directory of Open Access Journals (Sweden)

    Yongquan Dong

    2018-04-01

    Full Text Available Providing accurate electric load forecasting results plays a crucial role in the daily energy management of the power supply system. Due to its superior forecasting performance, hybridizing the support vector regression (SVR) model with evolutionary algorithms has received attention and deserves to continue being explored widely. The cuckoo search (CS) algorithm has the potential to contribute more satisfactory electric load forecasting results. However, the original CS algorithm suffers from inherent drawbacks, such as parameters that require accurate setting, loss of population diversity, and easy trapping in local optima (i.e., premature convergence). Therefore, proposing some critical improvement mechanisms and employing an improved CS algorithm to determine suitable parameter combinations for an SVR model is essential. This paper proposes the SVR with chaotic cuckoo search (SVRCCS) model, based on using a tent chaotic mapping function to enrich the cuckoo search space and diversify the population to avoid trapping in local optima. In addition, to deal with the cyclic nature of electric loads, a seasonal mechanism is combined with the SVRCCS model, giving a seasonal SVR with chaotic cuckoo search (SSVRCCS) model, to produce more accurate forecasting performances. The numerical results, tested by using the datasets from the National Electricity Market (NEM), Queensland, Australia and the New York Independent System Operator (NYISO), NY, USA, show that the proposed SSVRCCS model outperforms other alternative models.
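
    The tent chaotic mapping mentioned above is simple to state. The sketch below uses one commonly used skew tent map on [0, 1]; a chaotic cuckoo search can substitute such a sequence for uniform random draws when perturbing candidate solutions. The breakpoint of 0.7 is a typical choice in the chaotic-optimization literature and is an assumption here, not necessarily the exact map used in the paper.

      def tent_map_sequence(x0=0.37, n=100):
          # Skew tent map: x -> x / 0.7 if x < 0.7, else (1 - x) / 0.3.
          # Both branches have |slope| > 1, so the orbit is chaotic;
          # avoid seeds that land exactly on 0 or 1.
          xs, x = [], x0
          for _ in range(n):
              x = x / 0.7 if x < 0.7 else (1.0 - x) / 0.3
              xs.append(x)
          return xs

      # A chaotic value can replace a uniform draw when building a new candidate:
      # candidate = low + value * (high - low)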

  17. Waste Load Allocation Based on Total Maximum Daily Load Approach Using the Charged System Search (CSS) Algorithm

    Directory of Open Access Journals (Sweden)

    Elham Faraji

    2016-03-01

    Full Text Available In this research, the capability of a charged system search (CSS) algorithm in handling water management optimization problems is investigated. First, two complex mathematical problems are solved by CSS and the results are compared with those obtained from other metaheuristic algorithms. In the last step, the optimization model developed by the CSS algorithm is applied to the waste load allocation in rivers based on the total maximum daily load (TMDL) concept. The results are presented in tables and figures for easy comparison. The study indicates the superiority of the CSS algorithm in terms of its speed and performance over the other metaheuristic algorithms while its precision in water management optimization problems is verified.

  18. Solving Flexible Job-Shop Scheduling Problem Using Gravitational Search Algorithm and Colored Petri Net

    Directory of Open Access Journals (Sweden)

    Behnam Barzegar

    2012-01-01

    Full Text Available A scheduled production system avoids stock accumulation, reduces losses, decreases or even eliminates idle machines, and helps machines to be used more effectively for responding to customer orders on time and supplying requested materials at a suitable time. In flexible job-shop scheduling production systems, time and costs can be reduced by transferring and delivering operations on existing machines; this is among the NP-hard problems. The scheduling objective is to minimize the maximal completion time of all the operations, which is denoted the makespan. Different methods and algorithms have been presented for solving this problem. Having a reasonably scheduled production system has a significant influence on improving effectiveness and attaining organizational goals. In this paper, a new algorithm is proposed for flexible job-shop scheduling problem systems (FJSSP-GSPN) that is based on the gravitational search algorithm (GSA). In the proposed method, the flexible job-shop scheduling problem is modelled by a colored Petri net and the CPN tool, and the schedule is then computed by the GSA algorithm. The experimental results showed that the proposed method has reasonable performance in comparison with other algorithms.

  19. Forecasting solar radiation using an optimized hybrid model by Cuckoo Search algorithm

    International Nuclear Information System (INIS)

    Wang, Jianzhou; Jiang, He; Wu, Yujie; Dong, Yao

    2015-01-01

    Due to the energy crisis and environmental problems, it is very urgent to find alternative energy sources nowadays. Solar energy, as one of the clean energies with great potential, has widely attracted the attention of researchers. In this paper, an optimized hybrid method by CS (Cuckoo Search) on the basis of the OP-ELM (Optimally Pruned Extreme Learning Machine), called CS-OP-ELM, is developed to forecast clear sky and real sky global horizontal radiation. First, MRSR (Multiresponse Sparse Regression) and LOO-CV (leave-one-out cross-validation) are applied to rank neurons and prune the possibly meaningless neurons of the FFNN (Feed Forward Neural Network), respectively. Then, a Direct strategy and a Direct-Recursive strategy based on OP-ELM are introduced to build a hybrid model. Furthermore, the CS (Cuckoo Search) optimization algorithm is employed to determine the proper weight coefficients. In order to verify the effectiveness of the developed method, hourly solar radiation data from six sites in the United States have been collected, and methods such as ARMA (Autoregression moving average), a BP (Back Propagation) neural network and OP-ELM are compared with CS-OP-ELM. Experimental results show the optimized hybrid method CS-OP-ELM has the best forecasting performance. - Highlights: • An optimized hybrid method called CS-OP-ELM is proposed to forecast solar radiation. • CS-OP-ELM adopts multiple variables dataset as input variables. • Direct and Direct-Recursive strategy are introduced to build a hybrid model. • CS (Cuckoo Search) algorithm is used to determine the optimal weight coefficients. • The proposed method has the best performance compared with other methods

  20. Simultaneous determination of aquifer parameters and zone structures with fuzzy c-means clustering and meta-heuristic harmony search algorithm

    Science.gov (United States)

    Ayvaz, M. Tamer

    2007-11-01

    This study proposes an inverse solution algorithm through which both the aquifer parameters and the zone structure of these parameters can be determined based on a given set of observations of piezometric heads. In the zone structure identification problem the fuzzy c-means (FCM) clustering method is used. The association of the zone structure with the transmissivity distribution is accomplished through an optimization model. The meta-heuristic harmony search (HS) algorithm, which is conceptualized using the musical process of searching for a perfect state of harmony, is used as the optimization technique. The optimum parameter zone structure is identified based on three criteria: the residual error, parameter uncertainty, and structure discrimination. A numerical example given in the literature is solved to demonstrate the performance of the proposed algorithm. Also, a sensitivity analysis is performed to test the performance of the HS algorithm for different sets of solution parameters. Results indicate that the proposed solution algorithm is effective for the simultaneous identification of aquifer parameters and their corresponding zone structures.

  1. Maximum relevance, minimum redundancy band selection based on neighborhood rough set for hyperspectral data classification

    International Nuclear Information System (INIS)

    Liu, Yao; Chen, Yuehua; Tan, Kezhu; Xie, Hong; Wang, Liguo; Xie, Wu; Yan, Xiaozhen; Xu, Zhen

    2016-01-01

    Band selection is considered to be an important processing step in handling hyperspectral data. In this work, we selected informative bands according to the maximal relevance minimal redundancy (MRMR) criterion based on neighborhood mutual information. Two measures, MRMR difference and MRMR quotient, were defined, and a forward greedy search for band selection was constructed. The performance of the proposed algorithm, along with a comparison with other methods (a neighborhood dependency measure based algorithm, a genetic algorithm and an uninformative variable elimination algorithm), was studied using the classification accuracy of extreme learning machine (ELM) and random forests (RF) classifiers on soybean hyperspectral datasets. The results show that the proposed MRMR algorithm leads to promising improvements in band selection and classification accuracy. (paper)
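
    The forward greedy search driven by the MRMR-difference measure can be sketched generically as below, assuming a relevance(band) score against the class labels and a redundancy(band_i, band_j) score between bands (in the paper both are neighborhood mutual information terms); the loop structure, not the exact scoring, is the point of the illustration.

      def mrmr_difference_selection(bands, relevance, redundancy, k):
          # Greedily add the band maximizing relevance minus mean redundancy
          # with respect to the bands already selected.
          selected, remaining = [], list(bands)
          while remaining and len(selected) < k:
              def score(b):
                  if not selected:
                      return relevance(b)
                  mean_red = sum(redundancy(b, s) for s in selected) / len(selected)
                  return relevance(b) - mean_red
              best = max(remaining, key=score)
              selected.append(best)
              remaining.remove(best)
          return selected

    The MRMR-quotient variant simply replaces the difference with the ratio of relevance to mean redundancy.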

  2. Ringed Seal Search for Global Optimization via a Sensitive Search Model.

    Directory of Open Access Journals (Sweden)

    Younes Saadi

    Full Text Available The efficiency of a metaheuristic algorithm for global optimization is based on its ability to search for and find the global optimum. However, a good search often requires a balance between exploration and exploitation of the search space. In this paper, a new metaheuristic algorithm called Ringed Seal Search (RSS) is introduced. It is inspired by the natural behavior of the seal pup. This algorithm mimics the seal pup's movement behavior and its ability to search for and choose the best lair to escape predators. The scenario starts once the seal mother gives birth to a new pup in a birthing lair that is constructed for this purpose. The seal pup strategy consists of searching for and selecting the best lair by performing a random walk to find a new lair. Affected by the sensitive nature of seals to external noise emitted by predators, the random walk of the seal pup takes two different search states, a normal state and an urgent state. In the normal state, the pup performs an intensive search between closely adjacent lairs; this movement is modeled via a Brownian walk. In the urgent state, the pup leaves the proximity area and performs an extensive search to find a new lair among sparse targets; this movement is modeled via a Lévy walk. The switch between these two states is triggered by the random noise emitted by predators. The algorithm keeps switching between the normal and urgent states until the global optimum is reached. Tests and validations were performed using fifteen benchmark test functions to compare the performance of RSS with other baseline algorithms. The results show that RSS is more efficient than the Genetic Algorithm, Particle Swarm Optimization and Cuckoo Search in terms of convergence rate to the global optimum. The RSS shows an improvement in terms of the balance between exploration (extensive) and exploitation (intensive) of the search space. The RSS can efficiently mimic seal pups' behavior to find the best lair and provide a new algorithm to be
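
    The two movement states described above correspond to two step-length distributions. The sketch below contrasts a Brownian (Gaussian) step with a heavy-tailed Lévy-type step generated by Mantegna's algorithm; the parameter values are illustrative assumptions, not those of the RSS paper.

      import math
      import random

      def brownian_step(scale=0.1):
          # Normal state: short Gaussian steps for intensive local search.
          return random.gauss(0.0, scale)

      def levy_step(beta=1.5, scale=0.1):
          # Urgent state: heavy-tailed step via Mantegna's algorithm.
          sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
                     / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
          u = random.gauss(0.0, sigma_u)
          v = random.gauss(0.0, 1.0)
          return scale * u / abs(v) ** (1 / beta)

      # External "predator" noise would switch the walker between
      # brownian_step (normal state) and levy_step (urgent state).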

  3. Multiobjective pressurized water reactor reload core design by nondominated genetic algorithm search

    International Nuclear Information System (INIS)

    Parks, G.T.

    1996-01-01

    The design of pressurized water reactor reload cores is not only a formidable optimization problem but also, in many instances, a multiobjective problem. A genetic algorithm (GA) designed to perform true multiobjective optimization on such problems is described. Genetic algorithms simulate natural evolution. They differ from most optimization techniques by searching from one group of solutions to another, rather than from one solution to another. New solutions are generated by breeding from existing solutions. By selecting better (in a multiobjective sense) solutions as parents more often, the population can be evolved to reveal the trade-off surface between the competing objectives. An example illustrating the effectiveness of this novel method is presented and analyzed. It is found that in solving a reload design problem the algorithm evaluates a similar number of loading patterns to other state-of-the-art methods, but in the process reveals much more information about the nature of the problem being solved. The actual computational cost incurred depends on the core simulator used; the GA itself is code independent

  4. An inertia-free filter line-search algorithm for large-scale nonlinear programming

    Energy Technology Data Exchange (ETDEWEB)

    Chiang, Nai-Yuan; Zavala, Victor M.

    2016-02-15

    We present a filter line-search algorithm that does not require inertia information of the linear system. This feature enables the use of a wide range of linear algebra strategies and libraries, which is essential to tackle large-scale problems on modern computing architectures. The proposed approach performs curvature tests along the search step to detect negative curvature and to trigger convexification. We prove that the approach is globally convergent and we implement the approach within a parallel interior-point framework to solve large-scale and highly nonlinear problems. Our numerical tests demonstrate that the inertia-free approach is as efficient as inertia detection via symmetric indefinite factorizations. We also demonstrate that the inertia-free approach can lead to reductions in solution time because it reduces the amount of convexification needed.

  5. Self-learning search engines

    NARCIS (Netherlands)

    Schuth, A.

    2015-01-01

    How does a search engine such as Google know which search results to display? There are many competing algorithms that generate search results, but which one works best? We developed a new probabilistic method for quickly comparing large numbers of search algorithms by examining the results users

  6. A search algorithm to meta-optimize the parameters for an extended Kalman filter to improve classification on hyper-temporal images

    CSIR Research Space (South Africa)

    Salmon, BP

    2012-07-01

    Full Text Available ...the spectral bands separately and introduced a meta-optimization method for the EKF that will be called the Bias Variance Equilibrium Point (BVEP) in this paper. The objective of this paper is to introduce an unsupervised search algorithm called the Bias...

  7. Causal gene identification using combinatorial V-structure search.

    Science.gov (United States)

    Cai, Ruichu; Zhang, Zhenjie; Hao, Zhifeng

    2013-07-01

    With the advances of biomedical techniques in the last decade, the costs of human genomic sequencing and genomic activity monitoring are coming down rapidly. To support the huge genome-based business in the near future, researchers are eager to find killer applications based on human genome information. Causal gene identification is one of the most promising applications, which may help the potential patients to estimate the risk of certain genetic diseases and locate the target gene for further genetic therapy. Unfortunately, existing pattern recognition techniques, such as Bayesian networks, cannot be directly applied to find the accurate causal relationship between genes and diseases. This is mainly due to the insufficient number of samples and the extremely high dimensionality of the gene space. In this paper, we present the first practical solution to causal gene identification, utilizing a new combinatorial formulation over V-Structures commonly used in conventional Bayesian networks, by exploring the combinations of significant V-Structures. We prove the NP-hardness of the combinatorial search problem under general settings of the significance measure on the V-Structures, and present a greedy algorithm to find sub-optimal results. Extensive experiments show that our proposal is both scalable and effective, particularly with interesting findings on the causal genes over real human genome data. Copyright © 2013 Elsevier Ltd. All rights reserved.

  8. On the Runtime of Randomized Local Search and Simple Evolutionary Algorithms for Dynamic Makespan Scheduling

    DEFF Research Database (Denmark)

    Neumann, Frank; Witt, Carsten

    2015-01-01

    combinatorial optimization problem, namely makespan scheduling. We study the model of a strong adversary which is allowed to change one job at regular intervals. Furthermore, we investigate the setting of random changes. Our results show that randomized local search and a simple evolutionary algorithm are very...
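
    For context, randomized local search on a static two-machine makespan instance is only a few lines: a solution assigns each job to one of the two machines, and each step flips one randomly chosen job, keeping the flip if the makespan does not get worse. This generic sketch only illustrates the kind of algorithm analysed above, not the dynamic, adversarial setting of the paper.

      import random

      def makespan(assignment, jobs):
          load0 = sum(p for p, a in zip(jobs, assignment) if a == 0)
          load1 = sum(p for p, a in zip(jobs, assignment) if a == 1)
          return max(load0, load1)

      def rls_two_machines(jobs, steps=10000, seed=0):
          rng = random.Random(seed)
          assignment = [rng.randint(0, 1) for _ in jobs]
          best = makespan(assignment, jobs)
          for _ in range(steps):
              i = rng.randrange(len(jobs))
              assignment[i] ^= 1                 # move one job to the other machine
              value = makespan(assignment, jobs)
              if value <= best:
                  best = value                   # accept improvements and ties
              else:
                  assignment[i] ^= 1             # undo the flip
          return assignment, best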

  9. Parameter Estimation for Traffic Noise Models Using a Harmony Search Algorithm

    Directory of Open Access Journals (Sweden)

    Deok-Soon An

    2013-01-01

    Full Text Available A technique has been developed for predicting road traffic noise for environmental assessment, taking into account traffic volume as well as road surface conditions. The ASJ model (ASJ Prediction Model for Road Traffic Noise, 1999), which is based on the sound power level of the noise emitted by the interaction between the road surface and tires, employs regression models for two road surface types: dense-graded asphalt (DGA) and permeable asphalt (PA). However, these models are not applicable to other types of road surfaces. Accordingly, this paper introduces a parameter estimation procedure for ASJ-based noise prediction models, utilizing a harmony search (HS) algorithm. Traffic noise measurement data for four different vehicle types were used in the algorithm to determine the regression parameters for several road surface types. The parameters of the traffic noise prediction models were evaluated using another measurement set, and good agreement was observed between the predicted and measured sound power levels.

  10. Transmission network expansion planning based on hybridization model of neural networks and harmony search algorithm

    Directory of Open Access Journals (Sweden)

    Mohammad Taghi Ameli

    2012-01-01

    Full Text Available Transmission Network Expansion Planning (TNEP) is a basic part of power network planning that determines where, when and how many new transmission lines should be added to the network. So, the TNEP is an optimization problem in which the expansion purposes are optimized. Artificial Intelligence (AI) tools such as Genetic Algorithm (GA), Simulated Annealing (SA), Tabu Search (TS) and Artificial Neural Networks (ANNs) are methods used for solving the TNEP problem. Today, by using the hybridization models of AI tools, we can solve the TNEP problem for large-scale systems, which shows the effectiveness of utilizing such models. In this paper, a new approach to the hybridization model of Probabilistic Neural Networks (PNNs) and Harmony Search Algorithm (HSA) was used to solve the TNEP problem. Finally, by considering the uncertain role of the load based on a scenario technique, this proposed model was tested on the Garver’s 6-bus network.

  11. Heuristic algorithms for the minmax regret flow-shop problem with interval processing times.

    Science.gov (United States)

    Ćwik, Michał; Józefczyk, Jerzy

    2018-01-01

    An uncertain version of the permutation flow-shop with unlimited buffers and the makespan as a criterion is considered. The investigated parametric uncertainty is represented by given interval-valued processing times. The maximum regret is used for the evaluation of uncertainty. Consequently, the minmax regret discrete optimization problem is solved. Due to its high complexity, two relaxations are applied to simplify the optimization procedure. First of all, a greedy procedure is used for calculating the criterion's value, as such a calculation is an NP-hard problem itself. Moreover, the lower bound is used instead of solving the internal deterministic flow-shop. A constructive heuristic algorithm is applied to the relaxed optimization problem. The algorithm is compared with other previously developed heuristic algorithms based on the evolutionary and middle-interval approaches. The conducted computational experiments showed the advantage of the constructive heuristic algorithm with regard to both the criterion value and the computation time. The Wilcoxon paired-rank statistical test confirmed this conclusion.

  12. NSGA-II Algorithm with a Local Search Strategy for Multiobjective Optimal Design of Dry-Type Air-Core Reactor

    Directory of Open Access Journals (Sweden)

    Chengfen Zhang

    2015-01-01

    Full Text Available Dry-type air-core reactors are now widely applied in electrical power distribution systems, for which optimization design is a crucial issue. In the optimization design problem of dry-type air-core reactors, the objectives of minimizing the production cost and minimizing the operation cost are both important. In this paper, a multiobjective optimal model is established considering simultaneously the two objectives of minimizing the production cost and minimizing the operation cost. To solve the multiobjective optimization problem, a memetic evolutionary algorithm is proposed, which combines the elitist nondominated sorting genetic algorithm version II (NSGA-II) with a local search strategy based on the covariance matrix adaptation evolution strategy (CMA-ES). NSGA-II can provide the decision maker with flexible choices among the different trade-off solutions, while the local search strategy, which is applied to nondominated individuals randomly selected from the current population in a given generation and quantity, can accelerate the convergence speed. Furthermore, another modification is that an external archive is set in the proposed algorithm for increasing the evolutionary efficiency. The proposed algorithm is tested on a dry-type air-core reactor made of rectangular cross-section litz-wire. Simulation results show that the proposed algorithm has high efficiency and it converges to a better Pareto front.
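
    The nondominated sorting at the heart of NSGA-II rests on a simple dominance test between objective vectors (here, for example, production cost and operation cost, both to be minimized). A minimal sketch, independent of the reactor model used in the paper:

      def dominates(a, b):
          # a Pareto-dominates b (minimization): no worse in every objective,
          # strictly better in at least one.
          return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

      def nondominated_front(points):
          # Points not dominated by any other point.
          return [p for p in points
                  if not any(dominates(q, p) for q in points if q is not p)]

      # Example: nondominated_front([(3, 5), (2, 6), (4, 4), (3, 4)]) -> [(2, 6), (3, 4)]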

  13. Electric Load Forecasting Based on a Least Squares Support Vector Machine with Fuzzy Time Series and Global Harmony Search Algorithm

    Directory of Open Access Journals (Sweden)

    Yan Hong Chen

    2016-01-01

    Full Text Available This paper proposes a new electric load forecasting model by hybridizing the fuzzy time series (FTS) and global harmony search algorithm (GHSA) with least squares support vector machines (LSSVM), namely the GHSA-FTS-LSSVM model. Firstly, the fuzzy c-means clustering (FCS) algorithm is used to calculate the clustering center of each cluster. Secondly, the LSSVM is applied to model the resultant series, which is optimized by GHSA. Finally, a real-world example is adopted to test the performance of the proposed model. In this investigation, the proposed model is verified using experimental datasets from the Guangdong Province Industrial Development Database, and results are compared against the autoregressive integrated moving average (ARIMA) model and other algorithms hybridized with LSSVM including the genetic algorithm (GA), particle swarm optimization (PSO), harmony search, and so on. The forecasting results indicate that the proposed GHSA-FTS-LSSVM model effectively generates more accurate predictive results.

  14. Optimization of search algorithms for a mass spectra library

    International Nuclear Information System (INIS)

    Domokos, L.; Henneberg, D.; Weimann, B.

    1983-01-01

    The SISCOM mass spectra library search is mainly an interpretative system producing a ''hit list'' of similar spectra based on six comparison factors. This paper deals with extension of the system; the aim is exact identification (retrieval) of those reference spectra in the SISCOM hit list that correspond to the unknown compounds or components of the mixture. Thus, instead of a similarity measure, a decision (retrieval) function is needed to establish the identity of reference and unknown compounds by comparison of their spectra. To facilitate estimation of the weightings of the different variables in the retrieval function, pattern recognition algorithms were applied. Numerous statistical evaluations of three different library collections were made to check the quality of data bases and to derive appropriate variables for the retrieval function. (Auth.)

  15. A novel symbiotic organisms search algorithm for optimal power flow of power system with FACTS devices

    Directory of Open Access Journals (Sweden)

    Dharmbir Prasad

    2016-03-01

    Full Text Available In this paper, the symbiotic organisms search (SOS) algorithm is proposed for the solution of the optimal power flow (OPF) problem of a power system equipped with flexible ac transmission systems (FACTS) devices. Inspired by interaction between organisms in ecosystem, SOS algorithm is a recent population based algorithm which does not require any algorithm specific control parameters unlike other algorithms. The performance of the proposed SOS algorithm is tested on the modified IEEE-30 bus and IEEE-57 bus test systems incorporating two types of FACTS devices, namely, thyristor controlled series capacitor and thyristor controlled phase shifter at fixed locations. The OPF problem of the present work is formulated with four different objective functions viz. (a) fuel cost minimization, (b) transmission active power loss minimization, (c) emission reduction and (d) minimization of combined economic and environmental cost. The simulation results exhibit the potential of the proposed SOS algorithm and demonstrate its effectiveness for solving the OPF problem of power system incorporating FACTS devices over the other evolutionary optimization techniques that surfaced in the recent state-of-the-art literature.

  16. Spatial search by quantum walk

    International Nuclear Information System (INIS)

    Childs, Andrew M.; Goldstone, Jeffrey

    2004-01-01

    Grover's quantum search algorithm provides a way to speed up combinatorial search, but is not directly applicable to searching a physical database. Nevertheless, Aaronson and Ambainis showed that a database of N items laid out in d spatial dimensions can be searched in time of order √(N) for d>2, and in time of order √(N) poly(log N) for d=2. We consider an alternative search algorithm based on a continuous-time quantum walk on a graph. The case of the complete graph gives the continuous-time search algorithm of Farhi and Gutmann, and other previously known results can be used to show that √(N) speedup can also be achieved on the hypercube. We show that full √(N) speedup can be achieved on a d-dimensional periodic lattice for d>4. In d=4, the quantum walk search algorithm takes time of order √(N) poly(log N), and in d<4, the algorithm does not provide substantial speedup

  17. Discrete Teaching-learning-based optimization Algorithm for Traveling Salesman Problems

    Directory of Open Access Journals (Sweden)

    Wu Lehui

    2017-01-01

    Full Text Available In this paper, a discrete variant of TLBO (DTLBO) is proposed for solving the traveling salesman problem (TSP). In the proposed method, an effective learner representation scheme is redefined based on the characteristics of the TSP problem. Moreover, all learners are randomly divided into several sub-swarms with equal amounts of learners so as to increase the diversity of population and reduce the probability of being trapped in local optimum. In each sub-swarm, the new positions of learners in the teaching phase and the learning phase are generated by the crossover operation, the legality detection and mutation operation, and then the offspring learners are determined based on greedy selection. Finally, to verify the performance of the proposed algorithm, benchmark TSP problems are examined and the results indicate that DTLBO is effective compared with other algorithms used for TSP problems.

  18. Archimedean copula estimation of distribution algorithm based on artificial bee colony algorithm

    Institute of Scientific and Technical Information of China (English)

    Haidong Xu; Mingyan Jiang; Kun Xu

    2015-01-01

    The artificial bee colony (ABC) algorithm is a competitive stochastic population-based optimization algorithm. However, the ABC algorithm does not use the social information and lacks the knowledge of the problem structure, which leads to insufficiency in both convergence speed and searching precision. The Archimedean copula estimation of distribution algorithm (ACEDA) is a relatively simple, time-economic and multivariate correlated EDA. This paper proposes a novel hybrid algorithm based on the ABC algorithm and ACEDA called the Archimedean copula estimation of distribution based on the artificial bee colony (ACABC) algorithm. The hybrid algorithm utilizes ACEDA to estimate the distribution model and then uses the information to help artificial bees to search more efficiently in the search space. Six benchmark functions are introduced to assess the performance of the ACABC algorithm on numerical function optimization. Experimental results show that the ACABC algorithm converges much faster with greater precision compared with the ABC algorithm, ACEDA and the global best (gbest)-guided ABC (GABC) algorithm in most of the experiments.

  19. ASIGNACIÓN DE SUPERVISORES FORESTALES: RESOLUCIÓN MEDIANTE UN ALGORITMO TABU SEARCH ASSIGNMENT OF FOREST SUPERVISORS: RESOLUTION BY MEANS OF A TABU SEARCH ALGORITHM

    Directory of Open Access Journals (Sweden)

    Lorena Pradenas Rojas

    2008-12-01

    Full Text Available This study presents a mathematical model for a generic problem of staff allocation. A solution procedure is implemented and evaluated by means of the Tabu Search metaheuristic. The proposed algorithm is used to solve a real case of forestry supervisors' allocation. The results show that the developed algorithm is efficient in solving this kind of problem and that it has a wide range of application to other real situations.

  20. Cooperative Search and Rescue with Artificial Fishes Based on Fish-Swarm Algorithm for Underwater Wireless Sensor Networks

    Science.gov (United States)

    Zhao, Wei; Tang, Zhenmin; Yang, Yuwang; Wang, Lei; Lan, Shaohua

    2014-01-01

    This paper presents a searching control approach for cooperating mobile sensor networks. We use a density function to represent the frequency of distress signals issued by victims. The movement of the mobile nodes in the mission space is similar to the behavior of a fish swarm in water. So, we treat each mobile node as an artificial fish node and define its operations by a probabilistic model over a limited range. A fish-swarm based algorithm is designed that requires only local information at each fish node and maximizes the joint detection probability of distress signals. Optimization of the formation is also considered for the searching control approach and is performed by the fish-swarm algorithm. Simulation results include two schemes, a preset route and random walks, and show that the control scheme has adaptive and effective properties. PMID:24741341

  1. Cooperative Search and Rescue with Artificial Fishes Based on Fish-Swarm Algorithm for Underwater Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Wei Zhao

    2014-01-01

    Full Text Available This paper presents a searching control approach for cooperating mobile sensor networks. We use a density function to represent the frequency of distress signals issued by victims. The movement of the mobile nodes in the mission space is similar to the behavior of a fish swarm in water. So, we treat each mobile node as an artificial fish node and define its operations by a probabilistic model over a limited range. A fish-swarm based algorithm is designed that requires only local information at each fish node and maximizes the joint detection probability of distress signals. Optimization of the formation is also considered for the searching control approach and is performed by the fish-swarm algorithm. Simulation results include two schemes, a preset route and random walks, and show that the control scheme has adaptive and effective properties.

  2. Searching Algorithm Using Bayesian Updates

    Science.gov (United States)

    Caudle, Kyle

    2010-01-01

    In late October 1967, the USS Scorpion was lost at sea, somewhere between the Azores and Norfolk Virginia. Dr. Craven of the U.S. Navy's Special Projects Division is credited with using Bayesian Search Theory to locate the submarine. Bayesian Search Theory is a straightforward and interesting application of Bayes' theorem which involves searching…
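
    The central calculation in Bayesian Search Theory is the posterior update over a grid of cells after a search of one cell turns up nothing. A minimal sketch, with an assumed per-cell detection probability q (the chance of finding the object when searching the cell that actually contains it):

      def update_after_failed_search(prior, searched_cell, q):
          # prior: probability that the object lies in each cell (sums to 1).
          # A failed search in one cell scales that cell's mass by (1 - q)
          # and renormalizes; the other cells gain probability implicitly.
          posterior = list(prior)
          posterior[searched_cell] *= (1 - q)
          total = sum(posterior)
          return [p / total for p in posterior]

    Repeatedly searching the cell with the highest posterior and updating the map in this way is, in essence, how such a search is prioritized.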

  3. Parameter estimation by Differential Search Algorithm from horizontal loop electromagnetic (HLEM) data

    Science.gov (United States)

    Alkan, Hilal; Balkaya, Çağlayan

    2018-02-01

    We present an efficient inversion tool for parameter estimation from horizontal loop electromagnetic (HLEM) data using the Differential Search Algorithm (DSA), a recently proposed swarm-intelligence-based metaheuristic. The depth, dip, and origin of a thin subsurface conductor causing the anomaly are the parameters estimated by the HLEM method, commonly known as Slingram. The applicability of the developed scheme was first tested on two synthetically generated anomalies with and without noise content. Two control parameters affecting the convergence of the algorithm to the solution were tuned for these anomalies, including one and two conductive bodies, respectively. The tuned control parameters yielded more successful statistical results than the parameter couples widely used in DSA applications. Two field anomalies measured over a dipping graphitic shale from Northern Australia were then considered, and the algorithm provided depth estimations in good agreement with those of previous studies and drilling information. Furthermore, the efficiency and reliability of the results obtained were investigated via the probability density function. Considering the results obtained, we can conclude that the DSA, characterized by a simple algorithmic structure, is an efficient and promising metaheuristic for other relatively low-dimensional geophysical inverse problems. Finally, researchers who become familiar with the content of the developed scheme, which is easy to use and flexible, can easily modify and expand it for their own scientific optimization problems.

  4. Puzzles, paradoxes, and problem solving an introduction to mathematical thinking

    CERN Document Server

    Reba, Marilyn A

    2014-01-01

    Graphs: Puzzles and Optimization; Graphical Representation and Search; Greedy Algorithms and Dynamic Programming; Shortest Paths, DNA Sequences, and GPS Systems; Routing Problems and Optimal Circuits; Traveling Salesmen and Optimal Orderings; Vertex Colorings and Edge Matchings. Logic: Rational Inference and Computer Circuits; Inductive and Deductive Arguments; Deductive Arguments and Truth-Tables; Deductive Arguments and Derivations; Deductive Logic and Equivalence; Modeling Using Deductive Logic. Probability: Predictions and Expectations; Probability and Counting; Counting and Unordered Outcomes; Independen...

  5. Golden Sine Algorithm: A Novel Math-Inspired Algorithm

    Directory of Open Access Journals (Sweden)

    TANYILDIZI, E.

    2017-05-01

    Full Text Available In this study, the Golden Sine Algorithm (Gold-SA) is presented as a new metaheuristic method for solving optimization problems. Gold-SA has been developed as a new population-based search algorithm. This math-based algorithm is inspired by the sine, a trigonometric function. In the algorithm, random individuals are created, as many as the number of search agents, with a uniform distribution for each dimension. The Gold-SA operator searches for a better solution in each iteration by trying to bring the current situation closer to the target value. The solution space is narrowed by the golden section so that only the areas that are expected to give good results are scanned instead of the whole solution space. In the tests performed, it is seen that Gold-SA obtains better results than other population-based methods. In addition, Gold-SA has fewer algorithm-dependent parameters and operators than other metaheuristic methods, and it provides faster convergence, which increases the importance of this new method.

  6. Optimization process planning using hybrid genetic algorithm and intelligent search for job shop machining.

    Science.gov (United States)

    Salehi, Mojtaba; Bahreininejad, Ardeshir

    2011-08-01

    Optimization of process planning is considered as the key technology for computer-aided process planning which is a rather complex and difficult procedure. A good process plan of a part is built up based on two elements: (1) the optimized sequence of the operations of the part; and (2) the optimized selection of the machine, cutting tool and Tool Access Direction (TAD) for each operation. In the present work, the process planning is divided into preliminary planning, and secondary/detailed planning. In the preliminary stage, based on the analysis of order and clustering constraints as a compulsive constraint aggregation in operation sequencing and using an intelligent searching strategy, the feasible sequences are generated. Then, in the detailed planning stage, using the genetic algorithm which prunes the initial feasible sequences, the optimized operation sequence and the optimized selection of the machine, cutting tool and TAD for each operation based on optimization constraints as an additive constraint aggregation are obtained. The main contribution of this work is the optimization of sequence of the operations of the part, and optimization of machine selection, cutting tool and TAD for each operation using the intelligent search and genetic algorithm simultaneously.

  7. Modified harmony search

    Science.gov (United States)

    Mohamed, Najihah; Lutfi Amri Ramli, Ahmad; Majid, Ahmad Abd; Piah, Abd Rahni Mt

    2017-09-01

    A metaheuristic algorithm called Harmony Search (HS) is widely applied to parameter optimization in many areas. HS is a derivative-free real-parameter optimization algorithm that draws its inspiration from the musical improvisation process of searching for a perfect state of harmony. This paper proposes a Modified Harmony Search (MHS) for solving optimization problems, which employs concepts from the genetic algorithm method and particle swarm optimization for generating new solution vectors, enhancing the performance of the HS algorithm. The performances of MHS and HS are investigated on ten benchmark optimization problems in order to make a comparison that reflects the efficiency of the MHS in terms of final accuracy, convergence speed and robustness.
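
    For context, a basic harmony search loop looks like the sketch below: each new harmony is improvised component by component either from harmony memory (with rate HMCR), possibly pitch-adjusted (with rate PAR), or drawn at random, and it replaces the worst stored harmony if it is better. The parameter values are typical defaults, not those used in the study above.

      import random

      def harmony_search(objective, bounds, hms=10, hmcr=0.9, par=0.3,
                         bandwidth=0.05, iterations=2000, seed=0):
          # Minimize objective(x) over box constraints given as (low, high) pairs.
          rng = random.Random(seed)
          memory = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
          scores = [objective(h) for h in memory]
          for _ in range(iterations):
              new = []
              for d, (lo, hi) in enumerate(bounds):
                  if rng.random() < hmcr:                    # memory consideration
                      value = rng.choice(memory)[d]
                      if rng.random() < par:                 # pitch adjustment
                          value += rng.uniform(-1, 1) * bandwidth * (hi - lo)
                  else:                                      # random selection
                      value = rng.uniform(lo, hi)
                  new.append(min(max(value, lo), hi))
              new_score = objective(new)
              worst = max(range(hms), key=scores.__getitem__)
              if new_score < scores[worst]:
                  memory[worst], scores[worst] = new, new_score
          best = min(range(hms), key=scores.__getitem__)
          return memory[best], scores[best]

      # Example: harmony_search(lambda x: sum(v * v for v in x), [(-5, 5)] * 3)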

  8. Algebraic Algorithm Design and Local Search

    National Research Council Canada - National Science Library

    Graham, Robert

    1996-01-01

    .... Algebraic techniques have been applied successfully to algorithm synthesis by the use of algorithm theories and design tactics, an approach pioneered in the Kestrel Interactive Development System (KIDS...

  9. Forecasting Energy CO2 Emissions Using a Quantum Harmony Search Algorithm-Based DMSFE Combination Model

    Directory of Open Access Journals (Sweden)

    Xingsheng Gu

    2013-03-01

    Full Text Available The accurate forecasting of carbon dioxide (CO2) emissions from fossil fuel energy consumption is a key requirement for making energy policy and environmental strategy. In this paper, a novel quantum harmony search (QHS) algorithm-based discounted mean square forecast error (DMSFE) combination model is proposed. In the DMSFE combination forecasting model, almost all investigations assign the discounting factor (β) arbitrarily, since β varies between 0 and 1, and adopt one value for all individual models and forecasting periods. The original method does not consider the influences of the individual model and the forecasting period. This work contributes by changing β from one value to a matrix, taking the different models and forecasting periods into consideration, and by presenting a way of searching for the optimal β values using the QHS algorithm through optimizing the mean absolute percent error (MAPE) objective function. The QHS algorithm-based optimized DMSFE combination forecasting model is established and tested by forecasting the CO2 emissions of the world's top-5 CO2 emitters. Evaluation indexes such as MAPE, root mean squared error (RMSE) and mean absolute error (MAE) are employed to test the performance of the presented approach. The empirical analyses confirm the validity of the presented method, and the forecasting accuracy can be increased to a certain degree.
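
    For reference, the classical DMSFE combination gives each individual model a weight inversely proportional to its discounted sum of squared forecast errors, with a single scalar β; the paper's contribution is to replace that scalar by a matrix of β values tuned by the QHS algorithm. A sketch of the classical form is given below; the exponent convention β**(T-1-t), which weights recent errors most, is an assumption, as conventions differ.

      def dmsfe_weights(errors, beta=0.9):
          # errors[i][t]: forecast error of model i at period t (t = 0 .. T-1).
          # Each model weight is proportional to the inverse of its discounted SSE.
          T = len(errors[0])
          inv_dmsfe = []
          for model_errors in errors:
              dmsfe = sum(beta ** (T - 1 - t) * e * e
                          for t, e in enumerate(model_errors))
              inv_dmsfe.append(1.0 / dmsfe)
          total = sum(inv_dmsfe)
          return [w / total for w in inv_dmsfe]

      # combined = sum(w * f for w, f in zip(weights, individual_forecasts))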

  10. Is searching full text more effective than searching abstracts?

    Science.gov (United States)

    Lin, Jimmy

    2009-02-03

    With the growing availability of full-text articles online, scientists and other consumers of the life sciences literature now have the ability to go beyond searching bibliographic records (title, abstract, metadata) to directly access full-text content. Motivated by this emerging trend, I posed the following question: is searching full text more effective than searching abstracts? This question is answered by comparing text retrieval algorithms on MEDLINE abstracts, full-text articles, and spans (paragraphs) within full-text articles using data from the TREC 2007 genomics track evaluation. Two retrieval models are examined: bm25 and the ranking algorithm implemented in the open-source Lucene search engine. Experiments show that treating an entire article as an indexing unit does not consistently yield higher effectiveness compared to abstract-only search. However, retrieval based on spans, or paragraphs-sized segments of full-text articles, consistently outperforms abstract-only search. Results suggest that highest overall effectiveness may be achieved by combining evidence from spans and full articles. Users searching full text are more likely to find relevant articles than searching only abstracts. This finding affirms the value of full text collections for text retrieval and provides a starting point for future work in exploring algorithms that take advantage of rapidly-growing digital archives. Experimental results also highlight the need to develop distributed text retrieval algorithms, since full-text articles are significantly longer than abstracts and may require the computational resources of multiple machines in a cluster. The MapReduce programming model provides a convenient framework for organizing such computations.

  11. Is searching full text more effective than searching abstracts?

    Directory of Open Access Journals (Sweden)

    Lin Jimmy

    2009-02-01

    Full Text Available Abstract Background With the growing availability of full-text articles online, scientists and other consumers of the life sciences literature now have the ability to go beyond searching bibliographic records (title, abstract, metadata) to directly access full-text content. Motivated by this emerging trend, I posed the following question: is searching full text more effective than searching abstracts? This question is answered by comparing text retrieval algorithms on MEDLINE® abstracts, full-text articles, and spans (paragraphs) within full-text articles using data from the TREC 2007 genomics track evaluation. Two retrieval models are examined: bm25 and the ranking algorithm implemented in the open-source Lucene search engine. Results Experiments show that treating an entire article as an indexing unit does not consistently yield higher effectiveness compared to abstract-only search. However, retrieval based on spans, or paragraph-sized segments of full-text articles, consistently outperforms abstract-only search. Results suggest that highest overall effectiveness may be achieved by combining evidence from spans and full articles. Conclusion Users searching full text are more likely to find relevant articles than searching only abstracts. This finding affirms the value of full text collections for text retrieval and provides a starting point for future work in exploring algorithms that take advantage of rapidly-growing digital archives. Experimental results also highlight the need to develop distributed text retrieval algorithms, since full-text articles are significantly longer than abstracts and may require the computational resources of multiple machines in a cluster. The MapReduce programming model provides a convenient framework for organizing such computations.
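
    Of the two retrieval models compared above, bm25 has a particularly compact form: a document's score for a query is a sum of per-term contributions combining an inverse document frequency with a length-normalized term frequency. The sketch below is the widely used Okapi BM25 formulation with typical parameter values (k1 = 1.2, b = 0.75); Lucene's ranking differs in its details.

      import math

      def bm25_score(query_terms, doc_terms, doc_freq, n_docs, avg_doc_len,
                     k1=1.2, b=0.75):
          # doc_freq[t]: number of documents in the collection containing term t.
          doc_len = len(doc_terms)
          tf = {}
          for t in doc_terms:
              tf[t] = tf.get(t, 0) + 1
          score = 0.0
          for t in query_terms:
              if t not in tf:
                  continue
              df = doc_freq.get(t, 0)
              idf = math.log((n_docs - df + 0.5) / (df + 0.5) + 1.0)
              norm = tf[t] * (k1 + 1) / (tf[t] + k1 * (1 - b + b * doc_len / avg_doc_len))
              score += idf * norm
          return score

    Scoring spans simply means applying the same formula with paragraph-sized units treated as the "documents".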

  12. Optimization of distribution piping network in district cooling system using genetic algorithm with local search

    International Nuclear Information System (INIS)

    Chan, Apple L.S.; Hanby, Vic I.; Chow, T.T.

    2007-01-01

    A district cooling system is a sustainable means of distributing cooling energy through mass production. A cooling medium such as chilled water is generated at a central refrigeration plant and supplied to serve a group of consumer buildings through a piping network. Because of the substantial capital investment involved, an optimal design of the distribution piping configuration is one of the crucial factors for successful implementation of the district cooling scheme. In the present study, a genetic algorithm (GA) incorporating local search techniques was developed to find the optimal/near-optimal configuration of the piping network in a hypothetical site. The effects of local search, mutation rate and frequency of local search on the performance of the GA, in terms of both solution quality and computation time, were investigated and are presented in this paper.
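
    The piping-network encoding used in the study is not reproduced here; purely as a hedged illustration of the general scheme the abstract describes (a genetic algorithm with a periodic local-search step, where the local-search frequency is an explicit parameter), the sketch below runs a memetic GA on a generic bit-string objective. All names and parameter values are illustrative assumptions.

```python
import random

def hybrid_ga(fitness, n_bits, pop_size=30, generations=100,
              mutation_rate=0.02, local_search_every=10):
    """Generic GA minimiser with a bit-flip local search applied
    periodically to the best individual (memetic scheme)."""
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]

    def local_search(ind):
        # First-improvement bit-flip hill climbing.
        best = fitness(ind)
        for i in range(len(ind)):
            ind[i] ^= 1
            val = fitness(ind)
            if val < best:
                best = val
            else:
                ind[i] ^= 1  # undo the flip if it did not help
        return ind

    for gen in range(generations):
        pop.sort(key=fitness)
        next_pop = pop[:2]                                   # elitism
        while len(next_pop) < pop_size:
            p1, p2 = random.sample(pop[:pop_size // 2], 2)   # truncation selection
            cut = random.randrange(1, n_bits)
            child = p1[:cut] + p2[cut:]                      # one-point crossover
            child = [b ^ 1 if random.random() < mutation_rate else b for b in child]
            next_pop.append(child)
        pop = next_pop
        if gen % local_search_every == 0:
            pop[0] = local_search(pop[0][:])                 # periodic local search
    return min(pop, key=fitness)

# Example: minimise the number of ones in a 20-bit string.
print(hybrid_ga(lambda ind: sum(ind), n_bits=20))
```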

  13. A depth-first search algorithm to compute elementary flux modes by linear programming.

    Science.gov (United States)

    Quek, Lake-Ee; Nielsen, Lars K

    2014-07-30

    The decomposition of complex metabolic networks into elementary flux modes (EFMs) provides a useful framework for exploring reaction interactions systematically. Generating a complete set of EFMs for large-scale models, however, is near impossible, and even moderately sized models quickly become intractable. The approach presented here uses a depth-first search combined with linear programming (LP) to enumerate EFMs in an exhaustive fashion. Constraints can be introduced to directly generate a subset of EFMs satisfying the set of constraints. The depth-first search algorithm has a constant memory overhead. Using flux constraints, a large LP problem can be massively divided and parallelized into independent sub-jobs for deployment on computing clusters. Since the sub-jobs do not overlap, the approach scales to utilize all available computing nodes with minimal coordination overhead or memory limitations. The speed of the algorithm was comparable to efmtool, a mainstream Double Description method, when enumerating all EFMs; the attrition power gained from performing flux feasibility tests offsets the increased computational demand of running an LP solver. Unlike the Double Description method, the algorithm enables accelerated enumeration of all EFMs satisfying a set of constraints.

  14. GRASP (Greedy Randomized Adaptive Search Procedures) applied to optimization of petroleum products distribution in pipeline networks

    Energy Technology Data Exchange (ETDEWEB)

    Conte, Viviane Cristhyne Bini; Arruda, Lucia Valeria Ramos de; Yamamoto, Lia [Universidade Tecnologica Federal do Parana (UTFPR), Curitiba, PR (Brazil)

    2008-07-01

    Planning and scheduling of pipeline network operations aim at the most efficient use of resources, resulting in better performance of the network. A petroleum distribution pipeline network is composed of refineries, sources and/or storage parks, connected by a set of pipelines, which transport petroleum and derivatives among adjacent areas. In real scenarios, this is a difficult combinatorial problem, which makes resolution methodologies with low computational time necessary. This work aims to obtain solutions that meet the demands and minimize the number of batch fragmentations in the product-sending operations through the pipelines, in a simplified model of a real network, by applying the local search metaheuristic GRASP. GRASP does not depend on solutions from previous iterations and works in a randomized way, so it allows the search to cover a broader and more diversified solution space. Its use does not demand complex calculations, even in the construction stage, which requires the most computational effort; this provides relatively rapid attainment of good solutions. The application of GRASP to the scheduling of the operations of this network produced feasible solutions in low computational time. (author)
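
    The pipeline scheduling model itself is not public in this record, so the sketch below only illustrates the GRASP template the abstract relies on: a greedy randomized construction phase driven by a restricted candidate list (RCL), followed by local search, repeated for a number of iterations. The grasp function, the toy subset-selection objective and the alpha value are assumptions for illustration.

```python
import random

def grasp(candidates, cost, is_complete, local_search, iterations=50, alpha=0.3):
    """Generic GRASP loop: greedy randomized construction + local search.

    candidates : elements a solution is built from
    cost       : callable scoring a (partial) solution, lower is better
    is_complete: callable telling when construction may stop
    """
    best, best_cost = None, float("inf")
    for _ in range(iterations):
        # --- Greedy randomized construction with a restricted candidate list ---
        solution, remaining = [], list(candidates)
        while remaining and not is_complete(solution):
            scored = sorted(remaining, key=lambda c: cost(solution + [c]))
            c_min = cost(solution + [scored[0]])
            c_max = cost(solution + [scored[-1]])
            threshold = c_min + alpha * (c_max - c_min)
            rcl = [c for c in scored if cost(solution + [c]) <= threshold]
            choice = random.choice(rcl)
            solution.append(choice)
            remaining.remove(choice)
        # --- Local search around the constructed solution ---
        solution = local_search(solution)
        value = cost(solution)
        if value < best_cost:
            best, best_cost = solution, value
    return best, best_cost

# Toy usage: choose 4 numbers whose sum is as close to 50 as possible.
items = [3, 7, 12, 18, 21, 26, 30]
target, k = 50, 4
cost = lambda s: abs(target - sum(s))
done = lambda s: len(s) >= k
identity_ls = lambda s: s          # no improvement step in this toy example
print(grasp(items, cost, done, identity_ls))
```

    The alpha parameter controls how greedy versus how random the construction is: alpha = 0 keeps only the single best candidate, while alpha = 1 admits every candidate into the RCL.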

  15. An Efficient Two-Objective Hybrid Local Search Algorithm for Solving the Fuel Consumption Vehicle Routing Problem

    Directory of Open Access Journals (Sweden)

    Weizhen Rao

    2016-01-01

    Full Text Available The classical model of the vehicle routing problem (VRP) generally minimizes either the total vehicle travelling distance or the total number of dispatched vehicles. Due to the increased importance of environmental sustainability, one variant of VRPs that minimizes the total vehicle fuel consumption has gained much attention. The resulting fuel consumption VRP (FCVRP) becomes increasingly important yet difficult. We present a mixed integer programming model for the FCVRP, in which fuel consumption is measured through the degree of road gradient. A complexity analysis of the FCVRP is presented through analogy with the capacitated VRP. To tackle the FCVRP's computational intractability, we propose an efficient two-objective hybrid local search algorithm (TOHLS). TOHLS is based on a hybrid local search algorithm (HLS) that is also used to solve the FCVRP. Based on the Golden CVRP benchmarks, 60 FCVRP instances are generated and tested. Finally, the computational results show that the proposed TOHLS significantly outperforms the HLS.

  16. Comparison of heuristic optimization techniques for the enrichment and gadolinia distribution in BWR fuel lattices and decision analysis

    International Nuclear Information System (INIS)

    Castillo, Alejandro; Martín-del-Campo, Cecilia; Montes-Tadeo, José-Luis; François, Juan-Luis; Ortiz-Servin, Juan-José; Perusquía-del-Cueto, Raúl

    2014-01-01

    Highlights: • Different metaheuristic optimization techniques were compared. • The optimal enrichment and gadolinia distribution in a BWR fuel lattice was studied. • A decision making tool based on the Position Vector of Minimum Regret was applied. • Similar results were found for the different optimization techniques. - Abstract: In the present study a comparison of the performance of five heuristic techniques for the optimization of combinatorial problems is shown. The techniques are: Ant Colony System, Artificial Neural Networks, Genetic Algorithms, Greedy Search and a hybrid of Path Relinking and Scatter Search. They were applied to obtain an “optimal” enrichment and gadolinia distribution in a fuel lattice of a boiling water reactor. All techniques used the same objective function for qualifying the different distributions created during the optimization process, as well as the same initial conditions and restrictions. The parameters included in the objective function are the k-infinite multiplication factor, the maximum local power peaking factor, the average enrichment and the average gadolinia concentration of the lattice. The CASMO-4 code was used to obtain the neutronic parameters. The criteria for qualifying the optimization techniques also include the evaluation of the best lattice with burnup and the number of evaluations of the objective function needed to obtain the best solution. In conclusion, all techniques obtain similar results, but some methods find better solutions faster than others. A decision analysis tool based on the Position Vector of Minimum Regret was applied to aggregate the criteria in order to rank the solutions according to three functions: neutronic grade at 0 burnup, neutronic grade with burnup and global cost, which aggregates the computing time into the decision. According to the results, Greedy Search found the best lattice in terms of the neutronic grade at 0 burnup and also with burnup. However, Greedy Search is
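
    CASMO-4 and the lattice model are obviously not available here; as a hedged illustration of the Greedy Search strategy included in the comparison, the sketch below greedily improves a discrete assignment one position at a time, always committing to the single best local change. The objective loosely mimicking "average enrichment versus peaking" and the value set are entirely hypothetical.

```python
def greedy_search(initial, values, objective):
    """Steepest-ascent greedy improvement of a discrete assignment.

    initial   : starting assignment (list of values, e.g. one per lattice position)
    values    : allowed values for every position
    objective : callable to maximise
    """
    current = list(initial)
    improved = True
    while improved:
        improved = False
        best_move, best_gain = None, 0.0
        base = objective(current)
        for i in range(len(current)):
            for v in values:
                if v == current[i]:
                    continue
                trial = current[:i] + [v] + current[i + 1:]
                gain = objective(trial) - base
                if gain > best_gain:
                    best_move, best_gain = (i, v), gain
        if best_move is not None:
            i, v = best_move
            current[i] = v
            improved = True
    return current

# Hypothetical toy objective: prefer assignments whose average is near 4.0
# while penalising the peak value.
values = [2.0, 3.0, 4.0, 5.0]
obj = lambda a: -abs(sum(a) / len(a) - 4.0) - 0.1 * max(a)
print(greedy_search([2.0] * 6, values, obj))
```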

  17. Algorithms for Regular Tree Grammar Network Search and Their Application to Mining Human-viral Infection Patterns.

    Science.gov (United States)

    Smoly, Ilan; Carmel, Amir; Shemer-Avni, Yonat; Yeger-Lotem, Esti; Ziv-Ukelson, Michal

    2016-03-01

    Network querying is a powerful approach to mine molecular interaction networks. Most state-of-the-art network querying tools either confine the search to a prespecified topology in the form of some template subnetwork, or do not specify any topological constraints at all. Another approach is grammar-based queries, which are more flexible and expressive as they allow for expressing the topology of the sought pattern according to some grammar-based logic. Previous grammar-based network querying tools were confined to the identification of paths. In this article, we extend the patterns identified by grammar-based query approaches from paths to trees. For this, we adopt a higher order query descriptor in the form of a regular tree grammar (RTG). We introduce a novel problem and propose an algorithm to search a given graph for the k highest scoring subgraphs matching a tree accepted by an RTG. Our algorithm is based on the combination of dynamic programming with color coding, and includes an extension of previous k-best parsing optimization approaches to avoid isomorphic trees in the output. We implement the new algorithm and exemplify its application to mining viral infection patterns within molecular interaction networks. Our code is available online.

  18. An adaptive immune optimization algorithm with dynamic lattice searching operation for fast optimization of atomic clusters

    International Nuclear Information System (INIS)

    Wu, Xia; Wu, Genhua

    2014-01-01

    Highlights: • A highly efficient method for the optimization of atomic clusters is developed. • Its performance is studied by optimizing Lennard-Jones clusters and Ag clusters. • The method is proved to be quite efficient. • A new Ag61 cluster with a stacking-fault face-centered cubic motif is found. - Abstract: Geometrical optimization of atomic clusters is performed by a development of the adaptive immune optimization algorithm (AIOA) with a dynamic lattice searching (DLS) operation (the AIOA-DLS method). By a cycle of construction and searching of the dynamic lattice (DL), the DLS algorithm rapidly makes the clusters more regular and greatly reduces the potential energy. DLS can thus be used as an operation acting on the new individuals after the mutation operation in AIOA to improve the performance of the AIOA. The AIOA-DLS method combines the merit of evolutionary algorithms with the idea of the dynamic lattice. The performance of the proposed method is investigated in the optimization of Lennard-Jones clusters of up to 250 atoms and silver clusters described by the many-body Gupta potential of up to 150 atoms. Results reported in the literature are reproduced, and the motif of the Ag61 cluster is found to be stacking-fault face-centered cubic, whose energy is lower than that of the previously obtained icosahedron.

  19. A Biogeography-Based Optimization Algorithm Hybridized with Tabu Search for the Quadratic Assignment Problem.

    Science.gov (United States)

    Lim, Wee Loon; Wibowo, Antoni; Desa, Mohammad Ishak; Haron, Habibollah

    2016-01-01

    The quadratic assignment problem (QAP) is an NP-hard combinatorial optimization problem with a wide variety of applications. Biogeography-based optimization (BBO), a relatively new optimization technique based on the biogeography concept, uses the idea of the migration strategy of species to derive an algorithm for solving optimization problems. It has been shown that BBO provides performance on a par with other optimization methods. A classical BBO algorithm employs the mutation operator as its diversification strategy. However, this process will often ruin the quality of solutions in QAP. In this paper, we propose a hybrid technique to overcome the weakness of the classical BBO algorithm for solving QAP, by replacing the mutation operator with a tabu search procedure. Our experiments using the benchmark instances from QAPLIB show that the proposed hybrid method is able to find good solutions for them within reasonable computational times. Out of 61 benchmark instances tested, the proposed method is able to obtain the best known solutions for 57 of them.
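
    QAPLIB instances and the BBO machinery are outside the scope of a short example; the sketch below shows only the tabu-search component that replaces the mutation operator, applied to a tiny QAP instance with pairwise swap moves, a fixed tabu tenure and a standard aspiration criterion. The matrices and parameter values are made up for illustration.

```python
import itertools
import random

def qap_cost(perm, flow, dist):
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]]
               for i in range(n) for j in range(n))

def tabu_search_qap(flow, dist, iterations=200, tenure=7):
    """Basic tabu search for the QAP using pairwise swap moves."""
    n = len(flow)
    current = list(range(n))
    random.shuffle(current)
    best, best_cost = current[:], qap_cost(current, flow, dist)
    tabu = {}                                   # move -> iteration until which it is tabu
    for it in range(iterations):
        candidates = []
        for i, j in itertools.combinations(range(n), 2):
            neighbour = current[:]
            neighbour[i], neighbour[j] = neighbour[j], neighbour[i]
            c = qap_cost(neighbour, flow, dist)
            is_tabu = tabu.get((i, j), -1) > it
            # Aspiration: accept a tabu move if it improves the best known solution.
            if not is_tabu or c < best_cost:
                candidates.append((c, neighbour, (i, j)))
        if not candidates:
            continue
        c, neighbour, move = min(candidates, key=lambda t: t[0])
        current = neighbour
        tabu[move] = it + tenure
        if c < best_cost:
            best, best_cost = neighbour[:], c
    return best, best_cost

# Tiny 4-facility example with hypothetical flow and distance matrices.
flow = [[0, 3, 0, 2], [3, 0, 0, 1], [0, 0, 0, 4], [2, 1, 4, 0]]
dist = [[0, 22, 53, 53], [22, 0, 40, 62], [53, 40, 0, 55], [53, 62, 55, 0]]
print(tabu_search_qap(flow, dist))
```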

  20. A Biogeography-Based Optimization Algorithm Hybridized with Tabu Search for the Quadratic Assignment Problem

    Science.gov (United States)

    Lim, Wee Loon; Wibowo, Antoni; Desa, Mohammad Ishak; Haron, Habibollah

    2016-01-01

    The quadratic assignment problem (QAP) is an NP-hard combinatorial optimization problem with a wide variety of applications. Biogeography-based optimization (BBO), a relatively new optimization technique based on the biogeography concept, uses the idea of the migration strategy of species to derive an algorithm for solving optimization problems. It has been shown that BBO provides performance on a par with other optimization methods. A classical BBO algorithm employs the mutation operator as its diversification strategy. However, this process will often ruin the quality of solutions in QAP. In this paper, we propose a hybrid technique to overcome the weakness of the classical BBO algorithm for solving QAP, by replacing the mutation operator with a tabu search procedure. Our experiments using the benchmark instances from QAPLIB show that the proposed hybrid method is able to find good solutions for them within reasonable computational times. Out of 61 benchmark instances tested, the proposed method is able to obtain the best known solutions for 57 of them. PMID:26819585

  1. An image segmentation method based on fuzzy C-means clustering and Cuckoo search algorithm

    Science.gov (United States)

    Wang, Mingwei; Wan, Youchuan; Gao, Xianjun; Ye, Zhiwei; Chen, Maolin

    2018-04-01

    Image segmentation is a significant step in image analysis and machine vision. Many approaches have been presented on this topic; among them, fuzzy C-means (FCM) clustering is one of the most widely used methods because of its high efficiency and its ability to handle the ambiguity of images. However, the success of FCM cannot be guaranteed because it easily becomes trapped in a local optimal solution. Cuckoo search (CS) is a novel evolutionary algorithm which has been tested on several optimization problems and proved to be highly efficient. Therefore, a new segmentation technique blending FCM with the CS algorithm is put forward in this paper. Further, the proposed method has been evaluated on several images and compared with other existing FCM techniques, such as genetic algorithm (GA) based FCM and particle swarm optimization (PSO) based FCM, in terms of fitness value. Experimental results indicate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered in the paper.
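
    The cuckoo-search coupling is not reproduced here; to make the FCM half of the method concrete, the sketch below shows the standard fuzzy C-means membership and centroid updates on 1-D grey-level data, with the usual fuzzifier m = 2. The fcm_1d helper and the toy pixel values are assumptions.

```python
import random

def fcm_1d(data, n_clusters=2, m=2.0, iterations=50, eps=1e-9):
    """Plain fuzzy C-means on a list of scalars."""
    centers = random.sample(data, n_clusters)
    for _ in range(iterations):
        # Membership update: u[k][i] depends on relative distances to all centers.
        u = []
        for x in data:
            dists = [abs(x - c) + eps for c in centers]
            memb = [1.0 / sum((dists[i] / dists[j]) ** (2.0 / (m - 1.0))
                              for j in range(n_clusters))
                    for i in range(n_clusters)]
            u.append(memb)
        # Centroid update: weighted mean with weights u^m.
        centers = [sum((u[k][i] ** m) * data[k] for k in range(len(data))) /
                   sum(u[k][i] ** m for k in range(len(data)))
                   for i in range(n_clusters)]
    return centers, u

# Toy 1-D "image": two clearly separated grey-level groups.
pixels = [10, 12, 11, 13, 200, 198, 205, 202]
centers, memberships = fcm_1d(pixels, n_clusters=2)
print(sorted(centers))
```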

  2. An Improved Global Harmony Search Algorithm for the Identification of Nonlinear Discrete-Time Systems Based on Volterra Filter Modeling

    Directory of Open Access Journals (Sweden)

    Zongyan Li

    2016-01-01

    Full Text Available This paper describes an improved global harmony search (IGHS) algorithm for identifying nonlinear discrete-time systems based on a second-order Volterra model. The IGHS is an improved version of the novel global harmony search (NGHS) algorithm, and it makes two significant improvements on the NGHS. First, the genetic mutation operation is modified by combining normal and Cauchy distributions, which enables the IGHS to fully explore and exploit the solution space. Second, opposition-based learning (OBL) is introduced and modified to improve the quality of harmony vectors. The IGHS algorithm is applied to two numerical examples: a nonlinear discrete-time rational system and a real heat exchanger. The results of the IGHS are compared with those of three other methods, and it is verified to be more effective than the other three methods in solving the above two problems with different input signals and system memory sizes.

  3. Maximize Minimum Utility Function of Fractional Cloud Computing System Based on Search Algorithm Utilizing the Mittag-Leffler Sum

    Directory of Open Access Journals (Sweden)

    Rabha W. Ibrahim

    2018-01-01

    Full Text Available The maximum min utility function (MMUF) problem is an important representative of a large class of cloud computing system (CCS) problems, with numerous applications in practice, especially in economics and industry. This paper introduces an effective solution-based search (SBS) algorithm for solving the MMUF problem. First, we suggest a new formula for the utility function in terms of the capacity of the cloud. We formulate the capacity in the CCS by using a fractional diffeo-integral equation, which usually describes the flow of the CCS. The new formula of the utility function modifies recently proposed utility functions. The suggested technique first creates a high-quality initial solution by eliminating the less promising components, and then improves the quality of the achieved solution by the summation search solution (SSS). This method uses the Mittag-Leffler sum as a hash function to determine the position of the agent. Experimental results on instances commonly utilized in the literature demonstrate that the proposed algorithm competes favorably with state-of-the-art algorithms both in terms of solution quality and computational efficiency.

  4. Voltage stability index based optimal placement of static VAR compensator and sizing using Cuckoo search algorithm

    Science.gov (United States)

    Venkateswara Rao, B.; Kumar, G. V. Nagesh; Chowdary, D. Deepak; Bharathi, M. Aruna; Patra, Stutee

    2017-07-01

    This paper presents a new metaheuristic algorithm, the Cuckoo Search Algorithm (CSA), for solving the optimal power flow (OPF) problem with minimization of real power generation cost. The CSA is found to be a highly efficient algorithm for solving single-objective optimal power flow problems. The CSA performance is tested on the IEEE 57-bus test system with real power generation cost minimization as the objective function. The Static VAR Compensator (SVC) is one of the best shunt-connected devices in the Flexible Alternating Current Transmission System (FACTS) family. It is capable of controlling the voltage magnitudes of buses by injecting reactive power into the system. In this paper, the SVC is integrated into CSA-based optimal power flow to optimize the real power generation cost and to improve the voltage profile of the system. CSA gives better results than the genetic algorithm (GA) both with and without the SVC.
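
    The IEEE 57-bus OPF model and the SVC device model cannot be condensed into a few lines; the sketch below only illustrates the cuckoo search mechanics the record refers to, namely Lévy-flight steps around existing nests plus abandonment of a fraction of the worst nests, applied to a generic continuous objective. The step scaling, pa and the other parameters are illustrative defaults, not values from the paper.

```python
import math
import random

def levy_step(beta=1.5):
    """Mantegna's algorithm for a Lévy-distributed step length."""
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
             (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = random.gauss(0, sigma)
    v = random.gauss(0, 1)
    return u / abs(v) ** (1 / beta)

def cuckoo_search(objective, dim, bounds, n_nests=15, iterations=200, pa=0.25):
    """Minimise `objective` over a box; pa is the fraction of nests abandoned."""
    lo, hi = bounds
    nests = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_nests)]
    fitness = [objective(n) for n in nests]
    for _ in range(iterations):
        # Generate a new solution by a Lévy flight around a random nest.
        i = random.randrange(n_nests)
        new = [min(hi, max(lo, x + 0.01 * levy_step() * (hi - lo))) for x in nests[i]]
        j = random.randrange(n_nests)
        if objective(new) < fitness[j]:
            nests[j], fitness[j] = new, objective(new)
        # Abandon a fraction pa of the worst nests and rebuild them randomly.
        order = sorted(range(n_nests), key=lambda k: fitness[k], reverse=True)
        for k in order[:int(pa * n_nests)]:
            nests[k] = [random.uniform(lo, hi) for _ in range(dim)]
            fitness[k] = objective(nests[k])
    best = min(range(n_nests), key=lambda k: fitness[k])
    return nests[best], fitness[best]

# Toy objective: a shifted sphere function in 5 dimensions.
sol, val = cuckoo_search(lambda x: sum((xi - 1.0) ** 2 for xi in x), dim=5, bounds=(-5, 5))
print(round(val, 4))
```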

  5. MUSIC algorithm for location searching of dielectric anomalies from S-parameters using microwave imaging

    Science.gov (United States)

    Park, Won-Kwang; Kim, Hwa Pyung; Lee, Kwang-Jae; Son, Seong-Ho

    2017-11-01

    Motivated by biomedical engineering applications in early-stage breast cancer detection, we investigated the use of the MUltiple SIgnal Classification (MUSIC) algorithm for location searching of small anomalies using S-parameters. We considered the application of MUSIC to functional imaging where a small number of dipole antennas are used. Our approach is based on the application of the Born approximation or physical factorization. We analyzed cases in which the anomaly is small or large in relation to the wavelength, and how the structure of the left-singular vectors is linked to the nonzero singular values of a Multi-Static Response (MSR) matrix whose elements are the S-parameters. Using simulations, we demonstrated the strengths and weaknesses of the MUSIC algorithm in detecting both small and extended anomalies.
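
    The S-parameter measurement setup is specific to the paper; as a loosely related, hedged reference point, the sketch below applies the generic MUSIC recipe in its classic direction-of-arrival form: build a sample covariance matrix, split signal and noise subspaces by eigendecomposition, and locate sources at peaks of the pseudospectrum. NumPy is assumed, and the uniform-linear-array geometry is an assumption that differs from the dipole-antenna imaging setting of the paper.

```python
import numpy as np

def music_spectrum(X, n_sources, steering):
    """MUSIC pseudospectrum from a snapshot matrix X (sensors x snapshots).

    steering(theta) must return the array steering vector for angle theta (degrees).
    """
    R = X @ X.conj().T / X.shape[1]                 # sample covariance
    eigval, eigvec = np.linalg.eigh(R)              # eigenvalues in ascending order
    En = eigvec[:, : X.shape[0] - n_sources]        # noise subspace
    thetas = np.linspace(-90, 90, 721)
    spectrum = []
    for th in thetas:
        a = steering(th)
        spectrum.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
    return thetas, np.array(spectrum)

# Uniform linear array, half-wavelength spacing, two narrowband sources.
M, d = 8, 0.5
steer = lambda th: np.exp(-2j * np.pi * d * np.arange(M) * np.sin(np.deg2rad(th)))
rng = np.random.default_rng(0)
angles = [-20.0, 35.0]
S = rng.standard_normal((2, 200)) + 1j * rng.standard_normal((2, 200))
A = np.stack([steer(a) for a in angles], axis=1)     # M x 2 steering matrix
X = A @ S + 0.1 * (rng.standard_normal((M, 200)) + 1j * rng.standard_normal((M, 200)))
thetas, P = music_spectrum(X, n_sources=2, steering=steer)
print(thetas[np.argmax(P)])   # strongest pseudospectrum peak, near one true angle
```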

  6. Fault-ignorant quantum search

    International Nuclear Information System (INIS)

    Vrana, Péter; Reeb, David; Reitzner, Daniel; Wolf, Michael M

    2014-01-01

    We investigate the problem of quantum searching on a noisy quantum computer. Taking a fault-ignorant approach, we analyze quantum algorithms that solve the task for various different noise strengths, which are possibly unknown beforehand. We prove lower bounds on the runtime of such algorithms and thereby find that the quadratic speedup is necessarily lost (in our noise models). However, for low but constant noise levels the algorithms we provide (based on Grover's algorithm) still outperform the best noiseless classical search algorithm. (paper)

  7. An improved Pattern Search based algorithm to solve the Dynamic Economic Dispatch problem with valve-point effect

    International Nuclear Information System (INIS)

    Alsumait, J.S.; Qasem, M.; Sykulski, J.K.; Al-Othman, A.K.

    2010-01-01

    In this paper, an improved algorithm based on the Pattern Search (PS) method to solve the Dynamic Economic Dispatch problem is proposed. The algorithm maintains the essential unit ramp rate constraint, along with all other necessary constraints, not only within the time horizon of operation (24 h), but also through the transition period to the next time horizon (next day), in order to avoid discontinuity of the power system operation. The Dynamic Economic and Emission Dispatch (DEED) problem is also considered. The load balance constraints, operating limits, valve-point loading and network losses are included in the models of both the DED and the DEED. The numerical results clarify the significance of the improved algorithm and verify its performance.
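
    The full DED constraint set (ramp rates, valve-point loading, losses) is beyond a short example; the sketch below shows only the basic Pattern Search (compass search) move that the improved algorithm builds on: poll coordinate directions around the current point and shrink the step when no direction improves. The two-variable cost with a sinusoidal ripple is a hypothetical stand-in for a valve-point-type objective.

```python
import math

def pattern_search(objective, x0, step=1.0, shrink=0.5, tol=1e-6, max_iter=1000):
    """Basic compass/pattern search minimiser for an unconstrained objective."""
    x = list(x0)
    fx = objective(x)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for direction in (+1, -1):
                trial = x[:]
                trial[i] += direction * step
                ft = objective(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= shrink          # no polling direction improved: refine the mesh
            if step < tol:
                break
    return x, fx

# Hypothetical quadratic cost with a small valve-point-like ripple.
cost = lambda p: (p[0] - 3.0) ** 2 + (p[1] + 1.0) ** 2 + 0.1 * abs(math.sin(5 * p[0]))
print(pattern_search(cost, [0.0, 0.0]))
```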

  8. Smoothed Analysis of Local Search Algorithms

    NARCIS (Netherlands)

    Manthey, Bodo; Dehne, Frank; Sack, Jörg-Rüdiger; Stege, Ulrike

    2015-01-01

    Smoothed analysis is a method for analyzing the performance of algorithms for which classical worst-case analysis fails to explain the performance observed in practice. Smoothed analysis has been applied to explain the performance of a variety of algorithms in the last years. One particular class of

  9. An Effective Hybrid Firefly Algorithm with Harmony Search for Global Numerical Optimization

    Directory of Open Access Journals (Sweden)

    Lihong Guo

    2013-01-01

    Full Text Available A hybrid metaheuristic approach obtained by hybridizing harmony search (HS) and the firefly algorithm (FA), namely HS/FA, is proposed for function optimization. In HS/FA, the exploration of HS and the exploitation of FA are fully exerted, so HS/FA has a faster convergence speed than HS and FA. Also, a top-fireflies scheme is introduced to reduce running time, and HS is utilized to mutate between fireflies when updating fireflies. The HS/FA method is verified on various benchmarks. The experiments show that the implementation of HS/FA is better than the standard FA and eight other optimization methods.

  10. A Nonmonotone Line Search Filter Algorithm for the System of Nonlinear Equations

    Directory of Open Access Journals (Sweden)

    Zhong Jin

    2012-01-01

    Full Text Available We present a new iterative method based on the line search filter method with a nonmonotone strategy to solve systems of nonlinear equations. The equations are divided into two groups; some equations are treated as constraints and the others act as the objective function, and the two groups are only updated at the iterations where this is actually needed. We apply the nonmonotone idea to the sufficient reduction conditions and the filter technique, which leads to flexibility and an acceptance behavior comparable to monotone methods. The new algorithm is shown to be globally convergent, and numerical experiments demonstrate its effectiveness.
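
    The filter mechanism and the constraint/objective partitioning of the paper are not reproduced here; the sketch below illustrates only the nonmonotone acceptance idea mentioned in the abstract: a backtracking (Armijo-type) line search that compares the trial value against the maximum of the last few objective values rather than only the most recent one. Function names and parameters are assumptions.

```python
def nonmonotone_armijo(f, grad, x, history, memory=5, c=1e-4, shrink=0.5, max_backtracks=30):
    """One nonmonotone backtracking line search step along the steepest-descent direction.

    history is the list of recent objective values; the Armijo test compares
    against their maximum instead of only f(x).
    """
    d = [-g for g in grad(x)]                      # steepest-descent direction
    slope = sum(gi * di for gi, di in zip(grad(x), d))
    reference = max(history[-memory:]) if history else f(x)
    t = 1.0
    for _ in range(max_backtracks):
        trial = [xi + t * di for xi, di in zip(x, d)]
        if f(trial) <= reference + c * t * slope:  # nonmonotone sufficient decrease
            return trial
        t *= shrink
    return x                                       # no acceptable step found

# Toy usage on a 2-D quadratic.
f = lambda v: (v[0] - 1) ** 2 + 2 * (v[1] + 2) ** 2
g = lambda v: [2 * (v[0] - 1), 4 * (v[1] + 2)]
x, hist = [5.0, 5.0], []
for _ in range(20):
    hist.append(f(x))
    x = nonmonotone_armijo(f, g, x, hist)
print([round(v, 3) for v in x])   # approaches (1, -2)
```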

  11. POWERPLAY: Training an Increasingly General Problem Solver by Continually Searching for the Simplest Still Unsolvable Problem

    Directory of Open Access Journals (Sweden)

    Jürgen eSchmidhuber

    2013-06-01

    Full Text Available Most of computer science focuses on automatically solving given computational problems. I focus on automatically inventing or discovering problems in a way inspired by the playful behavior of animals and humans, to train a more and more general problem solver from scratch in an unsupervised fashion. Consider the infinite set of all computable descriptions of tasks with possibly computable solutions. The novel algorithmic framework POWERPLAY (2011) continually searches the space of possible pairs of new tasks and modifications of the current problem solver, until it finds a more powerful problem solver that provably solves all previously learned tasks plus the new one, while the unmodified predecessor does not. Wow-effects are achieved by continually making previously learned skills more efficient such that they require less time and space. New skills may (partially) re-use previously learned skills. POWERPLAY's search orders candidate pairs of tasks and solver modifications by their conditional computational (time & space) complexity, given the stored experience so far. The new task and its corresponding task-solving skill are those first found and validated. The computational costs of validating new tasks need not grow with task repertoire size. POWERPLAY's ongoing search for novelty keeps breaking the generalization abilities of its present solver. This is related to Goedel's sequence of increasingly powerful formal theories based on adding formerly unprovable statements to the axioms without affecting previously provable theorems. The continually increasing repertoire of problem solving procedures can be exploited by a parallel search for solutions to additional externally posed tasks. POWERPLAY may be viewed as a greedy but practical implementation of basic principles of creativity. A first experimental analysis can be found in separate papers [58, 56, 57].

  12. How Do Severe Constraints Affect the Search Ability of Multiobjective Evolutionary Algorithms in Water Resources?

    Science.gov (United States)

    Clarkin, T. J.; Kasprzyk, J. R.; Raseman, W. J.; Herman, J. D.

    2015-12-01

    This study contributes a diagnostic assessment of multiobjective evolutionary algorithm (MOEA) search on a set of water resources problem formulations with different configurations of constraints. Unlike constraints in classical optimization modeling, constraints within MOEA simulation-optimization represent limits on acceptable performance that delineate whether solutions within the search problem are feasible. Constraints are relevant because of the emergent pressures on water resources systems: increasing public awareness of their sustainability, coupled with regulatory pressures on water management agencies. In this study, we test several state-of-the-art MOEAs that utilize restricted tournament selection for constraint handling on varying configurations of water resources planning problems. For example, a problem that has no constraints on performance levels will be compared with a problem with several severe constraints, and a problem with constraints that have less severe values on the constraint thresholds. One such problem, Lower Rio Grande Valley (LRGV) portfolio planning, has been solved with a suite of constraints that ensure high reliability, low cost variability, and acceptable performance in a single year severe drought. But to date, it is unclear whether or not the constraints are negatively affecting MOEAs' ability to solve the problem effectively. Two categories of results are explored. The first category uses control maps of algorithm performance to determine if the algorithm's performance is sensitive to user-defined parameters. The second category uses run-time performance metrics to determine the time required for the algorithm to reach sufficient levels of convergence and diversity on the solution sets. Our work exploring the effect of constraints will better enable practitioners to define MOEA problem formulations for real-world systems, especially when stakeholders are concerned with achieving fixed levels of performance according to one or

  13. Enhancing Artificial Bee Colony Algorithm with Self-Adaptive Searching Strategy and Artificial Immune Network Operators for Global Optimization

    Directory of Open Access Journals (Sweden)

    Tinggui Chen

    2014-01-01

    Full Text Available The artificial bee colony (ABC) algorithm, inspired by the intelligent foraging behavior of honey bees, was proposed by Karaboga. It has been shown to be superior to some conventional intelligent algorithms such as the genetic algorithm (GA), ant colony optimization (ACO), and particle swarm optimization (PSO). However, the ABC still has some limitations. For example, ABC can easily get trapped in a local optimum when handling functions that have a narrow curving valley or a high eccentric ellipse, or complex multimodal functions. As a result, we propose an enhanced ABC algorithm called EABC, which introduces a self-adaptive searching strategy and artificial immune network operators to improve exploitation and exploration. The simulation results, obtained on a suite of unimodal and multimodal benchmark functions, illustrate that the EABC algorithm outperforms ACO, PSO, and the basic ABC in most of the experiments.

  14. The Surface Extraction from TIN based Search-space Minimization (SETSM) algorithm

    Science.gov (United States)

    Noh, Myoung-Jong; Howat, Ian M.

    2017-07-01

    Digital Elevation Models (DEMs) provide critical information for a wide range of scientific, navigational and engineering activities. Submeter resolution, stereoscopic satellite imagery with high geometric and radiometric quality, and wide spatial coverage are becoming increasingly accessible for generating stereo-photogrammetric DEMs. However, low contrast and repeatedly-textured surfaces, such as snow and glacial ice at high latitudes, and mountainous terrains challenge existing stereo-photogrammetric DEM generation techniques, particularly without a-priori information such as existing seed DEMs or the manual setting of terrain-specific parameters. To utilize these data for fully-automatic DEM extraction at a large scale, we developed the Surface Extraction from TIN-based Search-space Minimization (SETSM) algorithm. SETSM is fully automatic (i.e. no search parameter settings are needed) and uses only the sensor model Rational Polynomial Coefficients (RPCs). SETSM adopts a hierarchical, combined image- and object-space matching strategy utilizing weighted normalized cross-correlation with both original distorted and geometrically corrected images for overcoming ambiguities caused by foreshortening and occlusions. In addition, SETSM optimally minimizes search-spaces to extract optimal matches over problematic terrains by iteratively updating object surfaces within a Triangulated Irregular Network, and utilizes a geometric-constrained blunder and outlier detection in object space. We prove the ability of SETSM to mitigate typical stereo-photogrammetric matching problems over a range of challenging terrains. SETSM is the primary DEM generation software for the US National Science Foundation's ArcticDEM project.

  15. A Simulation of Readiness-Based Sparing Policies

    Science.gov (United States)

    2017-06-01

    variant of a greedy heuristic algorithm to set stock levels and estimate overall WS availability. Our discrete event simulation is then used to test the...

  16. Contact-impact algorithms on parallel computers

    International Nuclear Information System (INIS)

    Zhong Zhihua; Nilsson, Larsgunnar

    1994-01-01

    Contact-impact algorithms on parallel computers are discussed within the context of explicit finite element analysis. The algorithms concerned include a contact searching algorithm and an algorithm for contact force calculations. The contact searching algorithm is based on the territory concept of the general HITA algorithm. However, no distinction is made between different contact bodies, or between different contact surfaces. All contact segments from contact boundaries are taken as a single set. Hierarchy territories and contact territories are expanded. A three-dimensional bucket sort algorithm is used to sort contact nodes. The defence node algorithm is used in the calculation of contact forces. Both the contact searching algorithm and the defence node algorithm are implemented on the connection machine CM-200. The performance of the algorithms is examined under different circumstances, and numerical results are presented. ((orig.))

  17. Non-tables look-up search algorithm for efficient H.264/AVC context-based adaptive variable length coding decoding

    Science.gov (United States)

    Han, Yishi; Luo, Zhixiao; Wang, Jianhua; Min, Zhixuan; Qin, Xinyu; Sun, Yunlong

    2014-09-01

    In general, context-based adaptive variable length coding (CAVLC) decoding in the H.264/AVC standard requires frequent access to unstructured variable length coding tables (VLCTs), and a significant number of memory accesses is incurred. Heavy memory access causes high power consumption and time delays, which are serious problems for applications in portable multimedia devices. We propose a method for high-efficiency CAVLC decoding that uses a program instead of all the VLCTs. The decoded codeword from the VLCTs can be obtained without any table look-up or memory access. The experimental results show that the proposed algorithm achieves 100% memory access saving and 40% decoding time saving without degrading video quality. Additionally, the proposed algorithm shows better performance than conventional CAVLC decoding methods, such as table look-up by sequential search, table look-up by binary search, Moon's method, and Kim's method.

  18. Hybrid Multiple Soft-Sensor Models of Grinding Granularity Based on Cuckoo Searching Algorithm and Hysteresis Switching Strategy

    Directory of Open Access Journals (Sweden)

    Jie-Sheng Wang

    2015-01-01

    Full Text Available According to the characteristics of the grinding process and the accuracy requirements of technical indicators, a hybrid multiple soft-sensor modeling method for grinding granularity is proposed based on the cuckoo search (CS) algorithm and a hysteresis switching (HS) strategy. Firstly, a mechanism soft-sensor model of grinding granularity is deduced based on the process characteristics and a large amount of experimental data from the grinding process. Meanwhile, a BP neural network soft-sensor model and a wavelet neural network (WNN) soft-sensor model are set up. Then, the hybrid multiple soft-sensor model based on the hysteresis switching strategy is realized; that is to say, the optimum model is selected as the current predictive model according to the switching performance index at each sampling instant. Finally, the cuckoo search algorithm is adopted to optimize the performance parameters of the hysteresis switching strategy. Simulation results show that the proposed model has better generalization results and prediction precision, which can satisfy the real-time control requirements of the grinding classification process.

  19. There Are (super)Giants in the Sky: Searching for Misidentified Massive Stars in Algorithmically-Selected Quasar Catalogs

    Science.gov (United States)

    Dorn-Wallenstein, Trevor Z.; Levesque, Emily

    2017-11-01

    Thanks to incredible advances in instrumentation, surveys like the Sloan Digital Sky Survey have been able to find and catalog billions of objects, ranging from local M dwarfs to distant quasars. Machine learning algorithms have greatly aided in the effort to classify these objects; however, there are regimes where these algorithms fail, where interesting oddities may be found. We present here an X-ray bright quasar misidentified as a red supergiant/X-ray binary, and a subsequent search of the SDSS quasar catalog for X-ray bright stars misidentified as quasars.

  20. Generalised Adaptive Harmony Search: A Comparative Analysis of Modern Harmony Search

    Directory of Open Access Journals (Sweden)

    Jaco Fourie

    2013-01-01

    Full Text Available Harmony search (HS) was introduced in 2001 as a heuristic population-based optimisation algorithm. Since then HS has become a popular alternative to other heuristic algorithms like simulated annealing and particle swarm optimisation. However, some flaws, like the need for parameter tuning, were identified and have been a topic of much research over the last 10 years. Many variants of HS were developed to address some of these flaws, and most of them have made substantial improvements. In this paper we compare the performance of three recent HS variants: exploratory harmony search, self-adaptive harmony search, and dynamic local-best harmony search. We compare the accuracy of these algorithms using a set of well-known optimisation benchmark functions that include both unimodal and multimodal problems. Observations from this comparison led us to design a novel hybrid that combines the best attributes of these modern variants into a single optimiser called generalised adaptive harmony search.
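
    As a minimal point of reference for the HS variants compared in this record, the sketch below implements the basic harmony-search improvisation step (memory consideration, pitch adjustment, random selection) on a continuous benchmark. The HMCR, PAR and bandwidth values are common textbook defaults, not the adaptive settings studied in the paper.

```python
import math
import random

def harmony_search(objective, dim, bounds, hms=20, hmcr=0.9, par=0.3,
                   bandwidth=0.05, iterations=2000):
    """Basic harmony search minimiser with a fixed-size harmony memory."""
    lo, hi = bounds
    memory = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(hms)]
    scores = [objective(h) for h in memory]
    for _ in range(iterations):
        new = []
        for d in range(dim):
            if random.random() < hmcr:
                # Memory consideration: reuse a value stored for this dimension...
                value = random.choice(memory)[d]
                if random.random() < par:
                    # ...optionally pitch-adjusted within a small bandwidth.
                    value += random.uniform(-bandwidth, bandwidth) * (hi - lo)
            else:
                # Random selection from the allowed range.
                value = random.uniform(lo, hi)
            new.append(min(hi, max(lo, value)))
        worst = max(range(hms), key=lambda i: scores[i])
        f_new = objective(new)
        if f_new < scores[worst]:
            memory[worst], scores[worst] = new, f_new
    best = min(range(hms), key=lambda i: scores[i])
    return memory[best], scores[best]

# Rastrigin multimodal test function in 3 dimensions.
rastrigin = lambda x: 10 * len(x) + sum(xi * xi - 10 * math.cos(2 * math.pi * xi) for xi in x)
sol, val = harmony_search(rastrigin, dim=3, bounds=(-5.12, 5.12))
print(round(val, 3))
```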