WorldWideScience

Sample records for proposed aco algorithms

  1. Processing time tolerance-based ACO algorithm for solving job-shop scheduling problem

    Science.gov (United States)

    Luo, Yabo; Waden, Yongo P.

    2017-06-01

    The Job Shop Scheduling Problem (JSSP) is an NP-hard problem whose uncertainty and complexity cannot be handled by linear methods. Current research on the JSSP therefore concentrates mainly on improving heuristics for its optimization. However, efficient optimization of the JSSP still suffers from low efficiency and poor reliability, which can easily trap the optimization process in local optima. To address this problem, this paper studies an Ant Colony Optimization (ACO) algorithm combined with constraint handling tactics. The work is subdivided into three parts: (1) analysis of the processing time tolerance-based constraint features of the JSSP, performed with a constraint satisfaction model; (2) satisfying the constraints using consistency technology and the constraint spreading algorithm in order to improve the performance of the ACO algorithm, from which the JSSP model based on the improved ACO algorithm is constructed; (3) demonstration of the reliability and efficiency of the proposed method through comparative experiments on benchmark problems. The results obtained by the proposed method are better than those of the compared approaches, and the technique can be applied to optimizing the JSSP.

  2. Solving optimum operation of single pump unit problem with ant colony optimization (ACO) algorithm

    International Nuclear Information System (INIS)

    Yuan, Y; Liu, C

    2012-01-01

    For pumping stations, effective scheduling of daily pump operations derived from solutions of the optimum operation problem is one of the greatest potential sources of energy cost savings, but there are difficulties in solving this problem with traditional optimization methods due to the multimodality of the solution region. In this paper, an ACO model for the optimum operation of a pumping unit is proposed, and a solution method based on ant search is presented by suitably setting the objective function and the constraints. A weighted directed graph is constructed, feasible solutions are found by the iterative search of artificial ants, and the optimal solution is then obtained by applying the state transition rule and pheromone updating. An example calculation was conducted and the minimum cost found was 4.9979. The result of the ant colony algorithm was compared with the results of dynamic programming and of an evolutionary solver in commercial software under the same discretization. The ACO result is better and the computing time is shorter, which indicates that the ACO algorithm has high application value for the optimal operation of pumping stations and related fields.
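
    The record describes building a weighted directed graph and letting artificial ants search it with a state-transition rule and pheromone updating. The sketch below is a minimal, generic version of that loop, not the authors' pump model: the stage/option structure, costs, and parameter values are illustrative assumptions.

```python
import random

# Illustrative problem: at each stage (e.g., an hour of the day) an ant picks one
# discrete option (e.g., a pump setting); each choice has a cost. A real model
# would derive these costs from hydraulic and energy constraints.
random.seed(1)
N_STAGES, N_OPTIONS = 24, 5
cost = [[random.uniform(1.0, 10.0) for _ in range(N_OPTIONS)] for _ in range(N_STAGES)]

ALPHA, BETA, RHO, Q = 1.0, 2.0, 0.1, 1.0             # standard ACO parameters (assumed)
tau = [[1.0] * N_OPTIONS for _ in range(N_STAGES)]   # pheromone per (stage, option)

def construct_solution():
    """One ant walks the directed graph stage by stage (state-transition rule)."""
    path, total = [], 0.0
    for s in range(N_STAGES):
        weights = [(tau[s][o] ** ALPHA) * ((1.0 / cost[s][o]) ** BETA)
                   for o in range(N_OPTIONS)]
        o = random.choices(range(N_OPTIONS), weights=weights)[0]
        path.append(o)
        total += cost[s][o]
    return path, total

best_path, best_cost = None, float("inf")
for it in range(200):
    solutions = [construct_solution() for _ in range(20)]
    it_best = min(solutions, key=lambda x: x[1])
    if it_best[1] < best_cost:
        best_path, best_cost = it_best
    # pheromone updating: evaporation, then deposit along each ant's solution
    for s in range(N_STAGES):
        for o in range(N_OPTIONS):
            tau[s][o] *= (1.0 - RHO)
    for path, total in solutions:
        for s, o in enumerate(path):
            tau[s][o] += Q / total

print("best cost found:", round(best_cost, 3))
```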

  3. ENHANCED HYBRID PSO – ACO ALGORITHM FOR GRID SCHEDULING

    Directory of Open Access Journals (Sweden)

    P. Mathiyalagan

    2010-07-01

    Full Text Available Grid computing is a high performance computing environment for solving large-scale computational demands. Grid computing involves resource management, task scheduling, security problems, information management and so on. Task scheduling is a fundamental issue in achieving high performance in grid computing systems. A computational grid is typically heterogeneous in the sense that it combines clusters of varying sizes, and different clusters typically contain processing elements with different levels of performance. In this work, heuristic approaches based on particle swarm optimization and ant colony optimization algorithms are adopted for solving task scheduling problems in the grid environment. Particle Swarm Optimization (PSO) is one of the more recent nature-inspired evolutionary optimization techniques. It has good global search ability and has been successfully applied to many areas such as neural network training. Due to the linear decrease of the inertia weight in PSO the convergence rate becomes faster, which leads to a smaller makespan when used for scheduling. To make the convergence rate faster, the PSO algorithm is improved by modifying the inertia parameter, so that it produces better performance and gives an optimized result. The ACO algorithm is improved by modifying the pheromone updating rule, and it is hybridized with the PSO algorithm for more efficient results and better convergence.
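
    The record attributes faster convergence to a linearly decreasing inertia weight in PSO. The following is a minimal sketch of that velocity update on a toy objective; the weight bounds and other constants are assumptions, not the paper's values, and the real objective would be the schedule makespan rather than the sphere function.

```python
import random

def sphere(x):                      # toy objective; the paper minimizes makespan instead
    return sum(v * v for v in x)

DIM, SWARM, ITERS = 5, 20, 100
W_MAX, W_MIN = 0.9, 0.4             # assumed bounds for the linearly decreasing inertia
C1 = C2 = 2.0

random.seed(0)
pos = [[random.uniform(-5, 5) for _ in range(DIM)] for _ in range(SWARM)]
vel = [[0.0] * DIM for _ in range(SWARM)]
pbest = [p[:] for p in pos]
gbest = min(pbest, key=sphere)[:]

for t in range(ITERS):
    w = W_MAX - (W_MAX - W_MIN) * t / (ITERS - 1)   # inertia decreases linearly with t
    for i in range(SWARM):
        for d in range(DIM):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (w * vel[i][d]
                         + C1 * r1 * (pbest[i][d] - pos[i][d])
                         + C2 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] += vel[i][d]
        if sphere(pos[i]) < sphere(pbest[i]):       # update personal and global bests
            pbest[i] = pos[i][:]
            if sphere(pbest[i]) < sphere(gbest):
                gbest = pbest[i][:]

print("best value:", round(sphere(gbest), 6))
```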

  4. Controlling the Balance of Exploration and Exploitation in ACO Algorithm

    Directory of Open Access Journals (Sweden)

    Ayad Mohammed Jabbar

    2018-02-01

    Full Text Available Ant colony optimization is a meta-heuristic algorithm inspired by the foraging behavior of real ant colonies. The algorithm is a population-based method employed in different optimization problems such as classification, image processing and clustering. This paper focuses on improving the results the algorithm produces for the traveling salesman problem. The key to producing valuable results lies in the two important components of exploration and exploitation, and balancing them is the foundation of controlling search within ACO. This paper proposes to modify the main probabilistic decision rule to overcome the drawbacks of the exploration problem and to produce globally optimal results in high-dimensional spaces. Experiments on six variants of ant colony optimization indicate that the proposed work produces high-quality results in terms of shortest routes.
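
    The "main probabilistic method" the record refers to is the ant's next-city selection rule. In the Ant Colony System variant, a parameter q0 makes the exploitation/exploration trade-off explicit, which is one common way this balance is controlled; the sketch below shows that rule for a single TSP step, with q0 and the other constants as illustrative assumptions.

```python
import random

ALPHA, BETA, Q0 = 1.0, 2.0, 0.9   # Q0 near 1 favours exploitation, near 0 exploration

def next_city(current, unvisited, tau, dist):
    """Pseudo-random proportional rule used by Ant Colony System."""
    scores = {j: (tau[current][j] ** ALPHA) * ((1.0 / dist[current][j]) ** BETA)
              for j in unvisited}
    if random.random() < Q0:                      # exploitation: greedy choice
        return max(scores, key=scores.get)
    cities = list(scores)                         # exploration: biased random choice
    return random.choices(cities, weights=[scores[c] for c in cities])[0]

# tiny usage example on a symmetric 4-city instance
dist = [[0, 2, 9, 10], [2, 0, 6, 4], [9, 6, 0, 3], [10, 4, 3, 0]]
tau = [[1.0] * 4 for _ in range(4)]
print(next_city(0, {1, 2, 3}, tau, dist))
```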

  5. ACO-Initialized Wavelet Neural Network for Vibration Fault Diagnosis of Hydroturbine Generating Unit

    OpenAIRE

    Xiao, Zhihuai; He, Xinying; Fu, Xiangqian; Malik, O. P.

    2015-01-01

    Considering the drawbacks of traditional wavelet neural network, such as low convergence speed and high sensitivity to initial parameters, an ant colony optimization- (ACO-) initialized wavelet neural network is proposed in this paper for vibration fault diagnosis of a hydroturbine generating unit. In this method, parameters of the wavelet neural network are initialized by the ACO algorithm, and then the wavelet neural network is trained by the gradient descent algorithm. Amplitudes of the fr...

  6. ACO-Initialized Wavelet Neural Network for Vibration Fault Diagnosis of Hydroturbine Generating Unit

    Directory of Open Access Journals (Sweden)

    Zhihuai Xiao

    2015-01-01

    Full Text Available Considering the drawbacks of traditional wavelet neural networks, such as low convergence speed and high sensitivity to initial parameters, an ant colony optimization- (ACO-) initialized wavelet neural network is proposed in this paper for vibration fault diagnosis of a hydroturbine generating unit. In this method, parameters of the wavelet neural network are initialized by the ACO algorithm, and then the wavelet neural network is trained by the gradient descent algorithm. Amplitudes of the frequency components of the hydroturbine generating unit vibration signals are used as feature vectors for wavelet neural network training to realize the mapping relationship from vibration features to fault types. Results from a real vibration fault diagnosis case of a hydroturbine generating unit show that the proposed method has faster convergence speed and stronger generalization ability than the traditional wavelet neural network and the ACO wavelet neural network. Thus it can provide an effective solution for online vibration fault diagnosis of a hydroturbine generating unit.

  7. Dynamic Fuzzy Logic Parameter Tuning for ACO and Its Application in the Fuzzy Logic Control of an Autonomous Mobile Robot

    Directory of Open Access Journals (Sweden)

    Oscar Castillo

    2013-01-01

    Full Text Available Ant Colony Optimization (ACO) is a population-based constructive meta-heuristic that exploits a form of past performance memory inspired by the foraging behaviour of real ants. The behaviour of the ACO algorithm is highly dependent on the values defined for its parameters. Adaptation and parameter control are recurring themes in the field of bio-inspired algorithms. The present paper explores a new approach to diversity control in ACO. The central idea is to avoid or slow down full convergence through the dynamic variation of certain parameters. The performance of different variants of the ACO algorithm was observed to choose one as the basis for the proposed approach. A convergence fuzzy logic controller was created with the objective of maintaining diversity at some level to avoid premature convergence. Encouraging results have been obtained on its application to the design of fuzzy controllers. In particular, the optimization of membership functions for a unicycle mobile robot trajectory control is presented with the proposed method.
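
    The paper's controller is a fuzzy system; as a crude, crisp stand-in for the idea of slowing convergence by varying parameters dynamically, one could monitor a simple diversity measure and nudge the pheromone-influence exponent alpha. The measure, thresholds, and step sizes below are assumptions for illustration, not the authors' fuzzy rules.

```python
def update_alpha(alpha, diversity, low=0.2, high=0.6, step=0.05,
                 a_min=0.5, a_max=2.0):
    """Crisp stand-in for the convergence fuzzy controller: when colony diversity
    drops too low, reduce alpha (the pheromone influence) so ants rely less on
    accumulated trails and explore more; raise it again when diversity is high.
    All thresholds and step sizes are illustrative."""
    if diversity < low:
        alpha = max(a_min, alpha - step)
    elif diversity > high:
        alpha = min(a_max, alpha + step)
    return alpha

def tour_diversity(tours):
    """Share of distinct tours in the current iteration, a simple diversity proxy."""
    return len({tuple(t) for t in tours}) / len(tours)

# usage inside an ACO loop: ALPHA = update_alpha(ALPHA, tour_diversity(iteration_tours))
```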

  8. Multi-criteria ACO-based Algorithm for Ship’s Trajectory Planning

    Directory of Open Access Journals (Sweden)

    Agnieszka Lazarowska

    2017-03-01

    Full Text Available The paper presents a new approach to solving the path planning problem for ships in an environment with static and dynamic obstacles. The algorithm utilizes a heuristic method belonging to the group of Swarm Intelligence approaches, called Ant Colony Optimization, which is inspired by the collective behaviour of ant colonies. A group of agents (artificial ants) searches through the solution space in order to find a safe, optimal trajectory for a ship. The problem is considered as a multi-criteria optimization task. The criteria taken into account during problem solving are: path safety, path length, compliance with the International Regulations for Preventing Collisions at Sea (COLREGs) and path smoothness. The paper includes a description of the new multi-criteria ACO-based algorithm along with the presentation and discussion of simulation test results.

  9. Multi-objective ACO algorithms to minimise the makespan and the total rejection cost on BPMs with arbitrary job weights

    Science.gov (United States)

    Jia, Zhao-hong; Pei, Ming-li; Leung, Joseph Y.-T.

    2017-12-01

    In this paper, we investigate the batch-scheduling problem with rejection on parallel machines with non-identical job sizes and arbitrary job rejection weights. If a job is rejected, the corresponding penalty has to be paid. Our objective is to minimise the makespan of the processed jobs and the total rejection cost of the rejected jobs. Based on the selected multi-objective optimisation approaches, two problems, P1 and P2, are considered. In P1, the two objectives are linearly combined into one single objective. In P2, the two objectives are simultaneously minimised and the Pareto non-dominated solution set is to be found. Based on ant colony optimisation (ACO), two algorithms, called LACO and PACO, are proposed to address the two problems, respectively. Two different objective-oriented pheromone matrices and heuristic information are designed. Additionally, a local optimisation algorithm is adopted to improve the solution quality. Finally, simulated experiments are conducted, and the comparative results verify the effectiveness and efficiency of the proposed algorithms, especially on large-scale instances.
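
    For problem P1 the record linearly combines makespan and total rejection cost into a single objective. The evaluation sketch below illustrates that combination for a candidate solution; the batch/machine encoding, the weight, and the toy data are assumptions, with each batch on a parallel batch machine taking as long as its longest job.

```python
def p1_objective(machine_batches, rejected, proc_time, reject_cost, w=0.5):
    """Weighted-sum objective for P1: w * makespan + (1 - w) * total rejection cost.

    machine_batches: list (one entry per machine) of batches, each batch a list of job ids
    rejected:        iterable of rejected job ids
    proc_time[j]:    processing time of job j (a batch runs as long as its longest job)
    reject_cost[j]:  rejection penalty (weight) of job j
    """
    makespan = max((sum(max(proc_time[j] for j in batch) for batch in batches)
                    for batches in machine_batches if batches), default=0)
    total_rejection = sum(reject_cost[j] for j in rejected)
    return w * makespan + (1 - w) * total_rejection

# toy usage: two machines, jobs 0..4, job 4 rejected
pt = {0: 3, 1: 5, 2: 2, 3: 4, 4: 6}
rc = {0: 1, 1: 2, 2: 1, 3: 3, 4: 9}
print(p1_objective([[[0, 2], [1]], [[3]]], [4], pt, rc))
```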

  10. Optimizing VM allocation and data placement for data-intensive applications in cloud using ACO metaheuristic algorithm

    Directory of Open Access Journals (Sweden)

    T.P. Shabeera

    2017-04-01

    Full Text Available Nowadays data-intensive applications for processing big data are being hosted in the cloud. Since the cloud environment provides virtualized resources for computation, and data-intensive applications require communication between the computing nodes, the placement of Virtual Machines (VMs) and the location of data affect the overall computation time. The majority of the research work reported in the current literature considers the selection of physical nodes for placing data and VMs as independent problems. This paper proposes an approach which considers VM placement and data placement hand in hand. The primary objective is to reduce cross-network traffic and bandwidth usage by placing the required number of VMs and the data in Physical Machines (PMs) which are physically closer. The VM and data placement problem (referred to as the MinDistVMDataPlacement problem) is defined in this paper and has been proved to be NP-hard. This paper presents and evaluates a metaheuristic algorithm based on Ant Colony Optimization (ACO), which selects a set of adjacent PMs for placing data and VMs. Data is distributed in the physical storage devices of the selected PMs. According to the processing capacity of each PM, a set of VMs is placed on these PMs to process the data stored in them. We use simulation to evaluate our algorithm. The results show that the proposed algorithm selects PMs in close proximity, and jobs executed in the VMs allocated by the proposed scheme outperform those of other allocation schemes.

  11. MGA trajectory planning with an ACO-inspired algorithm

    Science.gov (United States)

    Ceriotti, Matteo; Vasile, Massimiliano

    2010-11-01

    Given a set of celestial bodies, the problem of finding an optimal sequence of swing-bys, deep space manoeuvres (DSM) and transfer arcs connecting the elements of the set is combinatorial in nature. The number of possible paths grows exponentially with the number of celestial bodies. Therefore, the design of an optimal multiple gravity assist (MGA) trajectory is an NP-hard mixed combinatorial-continuous problem. Its automated solution would greatly improve the design of future space missions, allowing the assessment of a large number of alternative mission options in a short time. This work proposes to formulate the complete automated design of a multiple gravity assist trajectory as an autonomous planning and scheduling problem. The resulting scheduled plan will provide the optimal planetary sequence and a good estimation of the set of associated optimal trajectories. The trajectory model consists of a sequence of celestial bodies connected by two-dimensional transfer arcs containing one DSM. For each transfer arc, the position of the planet and the spacecraft, at the time of arrival, are matched by varying the pericentre of the preceding swing-by, or the magnitude of the launch excess velocity, for the first arc. For each departure date, this model generates a full tree of possible transfers from the departure to the destination planet. Each leaf of the tree represents a planetary encounter and a possible way to reach that planet. An algorithm inspired by ant colony optimization (ACO) is devised to explore the space of possible plans. The ants explore the tree from departure to destination adding one node at a time: every time an ant is at a node, a probability function is used to select a feasible direction. This approach to automatic trajectory planning is applied to the design of optimal transfers to Saturn and among the Galilean moons of Jupiter. Solutions are compared to those found through more traditional genetic-algorithm techniques.
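
    The key mechanism described is ants descending the tree of possible transfers one node at a time, choosing each branch with a probability function. The sketch below shows that descent over an abstract tree; the tree contents, heuristic values, and pheromone handling are placeholders, not the paper's trajectory model or feasibility test.

```python
import random

# Abstract tree: each node maps to a list of (child, heuristic_desirability) pairs.
# In the paper the children would be feasible planetary encounters; here they are symbols.
tree = {
    "E":  [("V1", 2.0), ("M1", 1.0)],
    "V1": [("E2", 1.5), ("J1", 0.5)],
    "M1": [("J1", 1.0)],
    "E2": [("S", 1.0)],
    "J1": [("S", 2.0)],
    "S":  [],                       # destination (leaf)
}
tau = {(a, b): 1.0 for a, kids in tree.items() for b, _ in kids}  # pheromone per branch

def descend(root="E", alpha=1.0, beta=2.0):
    """One ant walks from the departure node to a leaf, adding one node at a time."""
    path = [root]
    while tree[path[-1]]:
        options = tree[path[-1]]
        weights = [(tau[(path[-1], c)] ** alpha) * (h ** beta) for c, h in options]
        child = random.choices([c for c, _ in options], weights=weights)[0]
        path.append(child)
    return path

print(descend())
```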

  12. Improving the Interpretability of Classification Rules Discovered by an Ant Colony Algorithm: Extended Results

    OpenAIRE

    Otero, Fernando E.B.; Freitas, Alex A.

    2016-01-01

    The vast majority of Ant Colony Optimization (ACO) algorithms for inducing classification rules use an ACO-based procedure to create a rule in a one-at-a-time fashion. An improved search strategy has been proposed in the cAnt-MinerPB algorithm, where an ACO-based procedure is used to create a complete list of rules (ordered rules), i.e., the ACO search is guided by the quality of a list of rules, instead of an individual rule. In this paper we propose an extension of the cAnt-MinerPB algorith...

  13. Parameter estimation of Lorenz chaotic system using a hybrid swarm intelligence algorithm

    International Nuclear Information System (INIS)

    Lazzús, Juan A.; Rivera, Marco; López-Caraballo, Carlos H.

    2016-01-01

    A novel hybrid swarm intelligence algorithm for chaotic system parameter estimation is presented. For this purpose, the parameter estimation of the Lorenz system is formulated as a multidimensional problem, and a hybrid approach based on particle swarm optimization with ant colony optimization (PSO–ACO) is implemented to solve this problem. Firstly, the performance of the proposed PSO–ACO algorithm is tested on a set of three representative benchmark functions, and the impact of the parameter settings on PSO–ACO efficiency is studied. Secondly, the parameter estimation is converted into an optimization problem on a three-dimensional Lorenz system. Numerical simulations on the Lorenz model and comparisons with results obtained by other algorithms show that PSO–ACO is a very powerful tool for parameter estimation with high accuracy and low deviations. - Highlights: • PSO–ACO combines particle swarm optimization with ant colony optimization. • This study is the first research using PSO–ACO to estimate parameters of chaotic systems. • The PSO–ACO algorithm can identify the parameters of the three-dimensional Lorenz system with low deviations. • PSO–ACO is a very powerful tool for parameter estimation on other chaotic systems.

  14. Parameter estimation of Lorenz chaotic system using a hybrid swarm intelligence algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Lazzús, Juan A., E-mail: jlazzus@dfuls.cl; Rivera, Marco; López-Caraballo, Carlos H.

    2016-03-11

    A novel hybrid swarm intelligence algorithm for chaotic system parameter estimation is presented. For this purpose, the parameter estimation of the Lorenz system is formulated as a multidimensional problem, and a hybrid approach based on particle swarm optimization with ant colony optimization (PSO–ACO) is implemented to solve this problem. Firstly, the performance of the proposed PSO–ACO algorithm is tested on a set of three representative benchmark functions, and the impact of the parameter settings on PSO–ACO efficiency is studied. Secondly, the parameter estimation is converted into an optimization problem on a three-dimensional Lorenz system. Numerical simulations on the Lorenz model and comparisons with results obtained by other algorithms show that PSO–ACO is a very powerful tool for parameter estimation with high accuracy and low deviations. - Highlights: • PSO–ACO combines particle swarm optimization with ant colony optimization. • This study is the first research using PSO–ACO to estimate parameters of chaotic systems. • The PSO–ACO algorithm can identify the parameters of the three-dimensional Lorenz system with low deviations. • PSO–ACO is a very powerful tool for parameter estimation on other chaotic systems.
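
    The estimation problem this hybrid PSO–ACO solves reduces to minimizing the deviation between a reference Lorenz trajectory and one simulated with candidate parameters. Below is a sketch of that objective only; the Euler integration, step size, horizon, and "true" parameters (sigma=10, rho=28, beta=8/3) are assumptions, and the optimizer itself is not shown.

```python
def lorenz_step(state, sigma, rho, beta, dt=0.01):
    """One explicit Euler step of the Lorenz equations."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

def simulate(params, x0=(1.0, 1.0, 1.0), steps=500):
    sigma, rho, beta = params
    traj, state = [], x0
    for _ in range(steps):
        state = lorenz_step(state, sigma, rho, beta)
        traj.append(state)
    return traj

# Reference data generated with the canonical parameters (assumed "true" values).
reference = simulate((10.0, 28.0, 8.0 / 3.0))

def objective(params):
    """Sum of squared deviations between candidate and reference trajectories;
    this is the fitness a PSO-ACO style optimizer would minimize."""
    cand = simulate(params)
    return sum((a - b) ** 2 for p, q in zip(cand, reference) for a, b in zip(p, q))

print(objective((10.0, 28.0, 8.0 / 3.0)))   # 0.0 at the true parameters
print(objective((9.0, 27.0, 2.5)) > 0)      # larger everywhere else
```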

  15. Layered ACO-OFDM for intensity-modulated direct-detection optical wireless transmission.

    Science.gov (United States)

    Wang, Qi; Qian, Chen; Guo, Xuhan; Wang, Zhaocheng; Cunningham, David G; White, Ian H

    2015-05-04

    Layered asymmetrically clipped optical orthogonal frequency division multiplexing (ACO-OFDM) with high spectral efficiency is proposed in this paper for optical wireless transmission employing intensity modulation with direct detection. In contrast to the conventional ACO-OFDM, which only utilizes odd subcarriers for modulation, leading to an obvious spectral efficiency loss, in layered ACO-OFDM, the subcarriers are divided into different layers and modulated by different kinds of ACO-OFDM, which are combined for simultaneous transmission. In this way, more subcarriers are used for data transmission and the spectral efficiency is improved. An iterative receiver is also proposed for layered ACO-OFDM, where the negative clipping distortion of each layer is subtracted once it is detected so that the signals from different layers can be recovered. Theoretical analysis shows that the proposed scheme can improve the spectral efficiency by up to 2 times compared with conventional ACO-OFDM approaches with the same modulation order. Meanwhile, simulation results confirm a considerable signal-to-noise ratio gain over ACO-OFDM at the same spectral efficiency.
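
    For reference, conventional ACO-OFDM (the baseline the layered scheme improves on) maps complex symbols onto the odd subcarriers with Hermitian symmetry, takes an IFFT, and clips the negative half of the real time-domain signal; because only odd subcarriers carry data, the clipping distortion falls entirely on the even subcarriers. The numpy sketch below illustrates this baseline only (frame length and QPSK mapping are assumptions), not the layered transmitter or the iterative receiver.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 16                                   # IFFT size (kept small for illustration)
qpsk = (rng.choice([1, -1], N // 4) + 1j * rng.choice([1, -1], N // 4)) / np.sqrt(2)

# Map data onto odd subcarriers only and enforce Hermitian symmetry so the
# time-domain signal is real (intensity modulation needs a real, positive signal).
X = np.zeros(N, dtype=complex)
odd = np.arange(1, N // 2, 2)            # subcarriers 1, 3, 5, 7
X[odd] = qpsk
X[N - odd] = np.conj(qpsk)

x = np.fft.ifft(X).real                  # real bipolar time-domain signal
x_aco = np.clip(x, 0, None)              # asymmetric clipping: negatives set to zero

# The clipping distortion of odd-only ACO-OFDM lands on the even subcarriers,
# so the odd (data) subcarriers survive undistorted up to a factor of 1/2.
Y = np.fft.fft(x_aco)
print(np.allclose(Y[odd], X[odd] / 2))   # True
```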

  16. Improving the Interpretability of Classification Rules Discovered by an Ant Colony Algorithm: Extended Results.

    Science.gov (United States)

    Otero, Fernando E B; Freitas, Alex A

    2016-01-01

    Most ant colony optimization (ACO) algorithms for inducing classification rules use an ACO-based procedure to create a rule in a one-at-a-time fashion. An improved search strategy has been proposed in the cAnt-MinerPB algorithm, where an ACO-based procedure is used to create a complete list of rules (ordered rules), i.e., the ACO search is guided by the quality of a list of rules instead of an individual rule. In this paper we propose an extension of the cAnt-MinerPB algorithm to discover a set of rules (unordered rules). The main motivations for this work are to improve the interpretation of individual rules by discovering a set of rules and to evaluate the impact on the predictive accuracy of the algorithm. We also propose a new measure to evaluate the interpretability of the discovered rules to mitigate the fact that the commonly used model size measure ignores how the rules are used to make a class prediction. Comparisons with state-of-the-art rule induction algorithms, support vector machines, and the cAnt-MinerPB producing ordered rules are also presented.

  17. An experimental analysis of design choices of multi-objective ant colony optimization algorithms

    OpenAIRE

    Lopez-Ibanez, Manuel; Stutzle, Thomas

    2012-01-01

    There have been several proposals on how to apply the ant colony optimization (ACO) metaheuristic to multi-objective combinatorial optimization problems (MOCOPs). This paper proposes a new formulation of these multi-objective ant colony optimization (MOACO) algorithms. This formulation is based on adding specific algorithm components for tackling multiple objectives to the basic ACO metaheuristic. Examples of these components are how to represent multiple objectives using pheromone and heuris...

  18. Hybrid Swarm Intelligence Energy Efficient Clustered Routing Algorithm for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Rajeev Kumar

    2016-01-01

    Full Text Available Currently, wireless sensor networks (WSNs) are used in many applications, namely, environment monitoring, disaster management, industrial automation, and medical electronics. Sensor nodes carry many limitations like low battery life, small memory space, and limited computing capability. To make a wireless sensor network more energy efficient, swarm intelligence techniques have been applied to resolve many optimization issues in WSNs. In many existing clustering techniques an artificial bee colony (ABC) algorithm is utilized to collect information from the field periodically. Nevertheless, in event-based applications, ant colony optimization (ACO) is a good solution to enhance the network lifespan. In this paper, we combine both algorithms (i.e., ABC and ACO) and propose a new hybrid ABCACO algorithm to solve a Nondeterministic Polynomial (NP)-hard and finite problem of WSNs. The ABCACO algorithm is divided into three main parts: (i) selection of the optimal number of subregions and further subregion parts, (ii) cluster head selection using the ABC algorithm, and (iii) efficient data transmission using the ACO algorithm. We use a hierarchical clustering technique for data transmission; the data is transmitted from member nodes to the subcluster heads and then from subcluster heads to the elected cluster heads based on some threshold value. Cluster heads use an ACO algorithm to discover the best route for data transmission to the base station (BS). The proposed approach is very useful in designing the framework for forest fire detection and monitoring. The simulation results show that the ABCACO algorithm enhances the stability period by 60% and improves the goodput by 31% against LEACH and WSNCABC, respectively.

  19. Financial Performance of Rural Medicare ACOs.

    Science.gov (United States)

    Nattinger, Matthew C; Mueller, Keith; Ullrich, Fred; Zhu, Xi

    2018-12-01

    The Centers for Medicare & Medicaid Services (CMS) has facilitated the development of Medicare accountable care organizations (ACOs), mostly through the Medicare Shared Savings Program (MSSP). To inform the operation of the Center for Medicare & Medicaid Innovation's (CMMI) ACO programs, we assess the financial performance of rural ACOs based on different levels of rural presence. We used the 2014 performance data for Medicare ACOs to examine the financial performance of rural ACOs with different levels of rural presence: exclusively rural, mostly rural, and mixed rural/metropolitan. Of the ACOs reporting performance data, we identified 97 ACOs with a measurable rural presence. We found that successful rural ACO financial performance is associated with the ACO's organizational type (eg, physician-based) and that 8 of the 11 rural ACOs participating in the Advanced Payment Program (APP) garnered savings for Medicare. Unlike previous work, we did not find an association between ACO size or experience and rural ACO financial performance. Our findings suggest that rural ACO financial success is likely associated with factors unique to rural environments. Given the emphasis CMS has placed on rural ACO development, further research to identify these factors is warranted. © 2016 National Rural Health Association.

  20. Transfer function fitting using a continuous Ant Colony Optimization (ACO) algorithm

    Directory of Open Access Journals (Sweden)

    A. Reineix

    2015-03-01

    Full Text Available An original approach is proposed in order to achieve the fitting of ultra-wideband complex frequency functions, such as complex impedances, by using so-called ACO (Ant Colony Optimization) methods. First, we present the optimization principle of ACO, which was originally dedicated to combinatorial problems. The extension to continuous and mixed problems is then explained in more detail. The interest of this approach is demonstrated by its ability to handle practical constraints and objectives, such as minimizing the number of filters used in the model for a fixed relative error. Finally, the establishment of the model for first- and second-order filter types illustrates the power of the method and its interest for time-domain electromagnetic computation.

  1. Pioneer ACO PUF

    Data.gov (United States)

    U.S. Department of Health & Human Services — Pioneer ACO PUF - To address the increasing number of requests for Pioneer ACO data, the Centers for Medicare and Medicaid Services (CMS) has created a standard...

  2. MODELO ACO PARA LA RECOLECCIÓN DE RESIDUOS POR CONTENEDORES / ACO MODEL APPLIED TO THE WASTE COLLECTION BY CONTAINERS

    Directory of Open Access Journals (Sweden)

    Eduardo Salazar Hornig

    2009-08-01

    Full Text Available ACO is a metaheuristic inspired by the behavior of natural ant colonies for solving combinatorial optimization problems, based on simple computational agents that work cooperatively and communicate through artificial pheromone trails. This paper presents a model to solve the municipal waste collection by containers problem, which applies a concept of partial collection sequences that must be joined so as to minimize the total collection distance. The problem of joining the partial collection sequences is represented as a TSP, which is solved by an ACO algorithm. Based on recommendations in the literature, the algorithm parameters are calibrated experimentally and ranges of values that give good average performance are recommended. The model is applied to a waste collection sector of the San Pedro de la Paz commune in Chile, obtaining collection routes that reduce the total distance traveled with respect to the route currently in use and to the solution obtained with a previously developed model.

  3. A universal optimization strategy for ant colony optimization algorithms based on the Physarum-inspired mathematical model

    International Nuclear Information System (INIS)

    Zhang, Zili; Gao, Chao; Liu, Yuxin; Qian, Tao

    2014-01-01

    Ant colony optimization (ACO) algorithms often fall into local optimal solutions and have low search efficiency when solving the travelling salesman problem (TSP). To address these shortcomings, this paper proposes a universal optimization strategy for updating the pheromone matrix in ACO algorithms. The new optimization strategy takes advantage of the unique feature that critical paths are reserved in the process of evolving adaptive networks in the Physarum-inspired mathematical model (PMM). The optimized algorithms, denoted as PMACO algorithms, enhance the amount of pheromone on the critical paths and promote the exploitation of the optimal solution. Experimental results on synthetic and real networks show that the PMACO algorithms are more efficient and robust than the traditional ACO algorithms and are suitable for solving the TSP with single or multiple objectives. We further analyse the influence of parameters on the performance of the PMACO algorithms and, based on these analyses, work out the best values of these parameters for the TSP. (paper)
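
    The optimization strategy amounts to depositing extra pheromone on the edges that the Physarum-inspired model identifies as critical. A schematic sketch of such an update rule follows; how the critical edge set is derived from the PMM is outside this snippet, and the boost factor and symmetric-TSP assumption are illustrative.

```python
def update_pheromone(tau, ant_tours, tour_lengths, critical_edges,
                     rho=0.1, q=1.0, boost=1.5):
    """Standard AS-style update plus extra reinforcement of edges that the
    Physarum-inspired model marks as critical. `critical_edges` is assumed to be
    a set of (i, j) pairs produced by the PMM; `boost` > 1 is an assumed factor."""
    for edge in tau:                                   # evaporation on every edge
        tau[edge] *= (1.0 - rho)
    for tour, length in zip(ant_tours, tour_lengths):  # deposit along each ant's tour
        for i, j in zip(tour, tour[1:] + tour[:1]):
            delta = q / length
            if (i, j) in critical_edges or (j, i) in critical_edges:
                delta *= boost                         # enhanced amount on critical paths
            tau[(i, j)] += delta
            tau[(j, i)] += delta                       # symmetric TSP instance assumed
    return tau
```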

  4. Function-Oriented Networking and On-Demand Routing System in Network Using Ant Colony Optimization Algorithm

    Directory of Open Access Journals (Sweden)

    Young-Bo Sim

    2017-11-01

    Full Text Available In this paper, we propose and develop Function-Oriented Networking (FON), a platform for network users. Its philosophy differs from that of technologies aimed at network managers, such as the Software-Defined Networking technology OpenFlow. FON can immediately reflect the demands of network users in the network, unlike the existing OpenFlow and Network Functions Virtualization (NFV), which do not directly reflect the needs of the network users. It allows the network user to determine the network policy directly, so the policy can be applied more precisely than a policy applied by the network manager. This is expected to increase the satisfaction of service users when network users try to provide new services. We developed a FON function that performs on-demand routing for low-delay-required services. We analyzed the characteristics of the Ant Colony Optimization (ACO) algorithm and found that the algorithm is suitable for low-delay-required services. It was also the first in the world to implement routing software using the ACO algorithm in a real Ethernet network. In order to improve the routing performance, several variants of the ACO algorithm were developed to enable faster path search routing and path recovery. The relationship between the network performance indices and the ACO routing parameters is derived, and the results are compared and analyzed. Through this, it was possible to develop the ACO algorithm further.

  5. An Effective Hybrid Routing Algorithm in WSN: Ant Colony Optimization in combination with Hop Count Minimization

    Directory of Open Access Journals (Sweden)

    Ailian Jiang

    2018-03-01

    Full Text Available Low cost, high reliability and easy maintenance are key criteria in the design of routing protocols for wireless sensor networks (WSNs). This paper investigates the existing ant colony optimization (ACO)-based WSN routing algorithms and the minimum hop count WSN routing algorithms by reviewing their strengths and weaknesses. We also consider the critical factors of WSNs, such as the energy constraint of sensor nodes, network load balancing and dynamic network topology. We then propose a hybrid routing algorithm that integrates ACO and a minimum hop count scheme. The proposed algorithm is able to find the optimal routing path with minimal total energy consumption and balanced energy consumption on each node. The algorithm is particularly strong at searching for the optimal path, balancing the network load and maintaining the network topology. The WSN model and the proposed algorithm have been implemented using C++. Extensive simulation results show that our algorithm outperforms several other WSN routing algorithms in terms of the rate of convergence, the success rate in searching for the global optimal solution, and the network lifetime.

  6. Aplicación de un algoritmo ACO al problema de taller de flujo de permutación con tiempos de preparación dependientes de la secuencia y minimización de makespan / An ant colony algorithm for the permutation flowshop with sequence dependent setup times and makespan minimization

    Directory of Open Access Journals (Sweden)

    Eduardo Salazar Hornig

    2011-08-01

    Full Text Available This paper studies the permutation flowshop scheduling problem with sequence-dependent setup times and makespan minimization. An ant colony optimization (ACO) algorithm which turns the original problem into an asymmetric TSP (Traveling Salesman Problem) structure is presented, applied to problems proposed in the literature, and compared with an adaptation of the NEH (Nawaz-Enscore-Ham) heuristic. Subsequently, a neighborhood search is applied to the solutions obtained by both the ACO algorithm and the NEH heuristic.
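
    Whatever the search method (the ACO here, or the NEH adaptation), candidate sequences are scored by their makespan under sequence-dependent setup times. The evaluation sketch below illustrates one such computation; the job data and the simplifying convention that a setup starts only once the machine is idle and the job has arrived from the previous machine are assumptions, not necessarily the paper's exact setup model.

```python
def makespan_sdst(sequence, proc, setup):
    """Makespan of a permutation flowshop with sequence-dependent setup times.

    proc[m][j]     : processing time of job j on machine m
    setup[m][i][j] : setup time on machine m when job j follows job i
                     (setup[m][j][j] is used as the initial setup of job j)
    """
    n_machines = len(proc)
    finish = [0.0] * n_machines            # completion time of the previous job per machine
    prev = None
    for j in sequence:
        for m in range(n_machines):
            ready = finish[m - 1] if m > 0 else 0.0          # arrival from upstream machine
            s = setup[m][prev][j] if prev is not None else setup[m][j][j]
            finish[m] = max(finish[m], ready) + s + proc[m][j]
        prev = j
    return finish[-1]

# toy instance: 2 machines, 3 jobs
proc = [[3, 2, 4], [2, 5, 1]]
setup = [[[1, 2, 1], [2, 1, 2], [1, 1, 1]],
         [[1, 1, 2], [2, 1, 1], [1, 2, 1]]]
print(makespan_sdst([0, 1, 2], proc, setup))
```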

  7. A novel hybrid approach based on Particle Swarm Optimization and Ant Colony Algorithm to forecast energy demand of Turkey

    International Nuclear Information System (INIS)

    Kıran, Mustafa Servet; Özceylan, Eren; Gündüz, Mesut; Paksoy, Turan

    2012-01-01

    Highlights: ► PSO and ACO algorithms are hybridized for forecasting the energy demand of Turkey. ► Linear and quadratic forms are developed to capture the fluctuations of the indicators. ► GDP, population, export and import have significant impacts on energy demand. ► The quadratic form provides a better fit than the linear form. ► The proposed approach gives a lower estimation error than ACO and PSO separately. - Abstract: This paper proposes a new hybrid method (HAP) for estimating the energy demand of Turkey using Particle Swarm Optimization (PSO) and Ant Colony Optimization (ACO). The proposed energy demand model (HAPE) is the first model which integrates these two meta-heuristic techniques. PSO, developed for solving continuous optimization problems, is a population-based stochastic technique, while ACO, which simulates the behavior of real ants between nest and food source, is generally used for discrete optimization. The hybrid method based on PSO and ACO is developed to estimate energy demand using gross domestic product (GDP), population, import and export. HAPE is developed in two forms, linear (HAPEL) and quadratic (HAPEQ). The future energy demand is estimated under different scenarios. In order to show the accuracy of the algorithm, a comparison is made with ACO and PSO developed for the same problem. According to the results obtained, the relative estimation errors of the HAPE model are the lowest, and the quadratic form (HAPEQ) provides better-fit solutions due to the fluctuations of the socio-economic indicators.
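
    The two model forms mentioned (linear HAPEL and quadratic HAPEQ) express demand as functions of GDP, population, import and export, with coefficients tuned by the hybrid PSO–ACO against historical data. The sketch below shows the usual layout of such forms and the fitting error an optimizer would minimize; the exact terms and any published coefficient values are not reproduced here and should be treated as assumptions.

```python
def hapel(w, gdp, pop, imp, exp_):
    """Linear form: demand as a weighted sum of the four indicators (5 weights)."""
    w0, w1, w2, w3, w4 = w
    return w0 + w1 * gdp + w2 * pop + w3 * imp + w4 * exp_

def hapeq(w, gdp, pop, imp, exp_):
    """Quadratic form: adds squared and pairwise interaction terms (15 weights; layout assumed)."""
    x = [gdp, pop, imp, exp_]
    terms = ([1.0] + x
             + [xi * xi for xi in x]
             + [x[i] * x[j] for i in range(4) for j in range(i + 1, 4)])
    return sum(wi * ti for wi, ti in zip(w, terms))

def relative_error(w, model, history):
    """Mean relative estimation error to be minimized by the optimizer.
    history: list of (gdp, population, import, export, observed_demand) tuples."""
    errs = [abs(model(w, g, p, i, e) - d) / d for g, p, i, e, d in history]
    return sum(errs) / len(errs)
```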

  8. Urban Growth Modeling Using Cellular Automata with Multi-Temporal Remote Sensing Images Calibrated by the Artificial Bee Colony Optimization Algorithm.

    Science.gov (United States)

    Naghibi, Fereydoun; Delavar, Mahmoud Reza; Pijanowski, Bryan

    2016-12-14

    Cellular Automata (CA) is one of the most common techniques used to simulate the urbanization process. CA-based urban models use transition rules to deliver spatial patterns of urban growth and urban dynamics over time. Determining the optimum transition rules of the CA is a critical step because of the heterogeneity and nonlinearities existing among urban growth driving forces. Recently, new CA models integrated with optimization methods based on swarm intelligence algorithms were proposed to overcome this drawback. The Artificial Bee Colony (ABC) algorithm is an advanced meta-heuristic swarm intelligence-based algorithm. Here, we propose a novel CA-based urban change model that uses the ABC algorithm to extract optimum transition rules. We applied the proposed ABC-CA model to simulate future urban growth in Urmia (Iran) with multi-temporal Landsat images from 1997, 2006 and 2015. Validation of the simulation results was made through statistical methods such as overall accuracy, the figure of merit and total operating characteristics (TOC). Additionally, we calibrated the CA model by ant colony optimization (ACO) to assess the performance of our proposed model versus similar swarm intelligence algorithm methods. We showed that the overall accuracy and the figure of merit of the ABC-CA model are 90.1% and 51.7%, which are 2.9% and 8.8% higher than those of the ACO-CA model, respectively. Moreover, the allocation disagreement of the simulation results for the ABC-CA model is 9.9%, which is 2.9% less than that of the ACO-CA model. Finally, the ABC-CA model also outperforms the ACO-CA model with fewer quantity and allocation errors and slightly more hits.

  9. Enhancing Artificial Bee Colony Algorithm with Self-Adaptive Searching Strategy and Artificial Immune Network Operators for Global Optimization

    Directory of Open Access Journals (Sweden)

    Tinggui Chen

    2014-01-01

    Full Text Available The artificial bee colony (ABC) algorithm, inspired by the intelligent foraging behavior of honey bees, was proposed by Karaboga. It has been shown to be superior to some conventional intelligent algorithms such as the genetic algorithm (GA), ant colony optimization (ACO), and particle swarm optimization (PSO). However, the ABC still has some limitations. For example, ABC can easily get trapped in a local optimum when handling functions that have a narrow curving valley, a high eccentric ellipse, or complex multimodal landscapes. As a result, we propose an enhanced ABC algorithm called EABC, which introduces a self-adaptive searching strategy and artificial immune network operators to improve exploitation and exploration. The simulation results, tested on a suite of unimodal and multimodal benchmark functions, illustrate that the EABC algorithm outperforms ACO, PSO, and the basic ABC in most of the experiments.

  10. Inverse estimation of the spheroidal particle size distribution using Ant Colony Optimization algorithms in multispectral extinction technique

    Science.gov (United States)

    He, Zhenzong; Qi, Hong; Wang, Yuqing; Ruan, Liming

    2014-10-01

    Four improved Ant Colony Optimization (ACO) algorithms, i.e. the probability density function based ACO (PDF-ACO) algorithm, the region ACO (RACO) algorithm, the stochastic ACO (SACO) algorithm and the homogeneous ACO (HACO) algorithm, are employed to estimate the particle size distribution (PSD) of spheroidal particles. The direct problems are solved by the extended Anomalous Diffraction Approximation (ADA) and the Lambert-Beer law. Three commonly used monomodal distribution functions, i.e. the Rosin-Rammler (R-R) distribution function, the normal (N-N) distribution function, and the logarithmic normal (L-N) distribution function, are estimated under the dependent model. The influence of random measurement errors on the inverse results is also investigated. All the results reveal that the PDF-ACO algorithm is more accurate than the other three ACO algorithms and can be used as an effective technique to investigate the PSD of spheroidal particles. Furthermore, the Johnson's SB (J-SB) function and the modified beta (M-β) function are employed as general distribution functions to retrieve the PSD of spheroidal particles using the PDF-ACO algorithm. The investigation shows a reasonable agreement between the original distribution function and the general distribution function when only the variation of the length of the rotational semi-axis is considered.

  11. Organizational Attributes Associated With Medicare ACO Quality Performance.

    Science.gov (United States)

    Zhu, Xi; Mueller, Keith; Huang, Huang; Ullrich, Fred; Vaughn, Thomas; MacKinney, A Clinton

    2018-05-08

    To evaluate associations between geographic, structural, and service-provision attributes of Accountable Care Organizations (ACOs) participating in the Medicare Shared Savings Program (MSSP) and the ACOs' quality performance. We conducted cross-sectional and longitudinal analyses of ACO quality performance using data from the Centers for Medicare and Medicaid Services and additional sources. The sample included 322 and 385 MSSP ACOs that had successfully reported quality measures in 2014 and 2015, respectively. Results show that after adjusting for other organizational factors, rural ACOs' average quality score was comparable to that of ACOs serving other geographic categories. ACOs with hospital-system sponsorship, larger beneficiary panels, and higher posthospitalization follow-up rates achieved better quality performance. There is no significant difference in average quality performance between rural ACOs and other ACOs after adjusting for structural and service-provision factors. MSSP ACO quality performance is positively associated with hospital-system sponsorship, beneficiary panel size, and posthospitalization follow-up rate. © 2018 National Rural Health Association.

  12. A hybrid of ant colony optimization and artificial bee colony algorithm for probabilistic optimal placement and sizing of distributed energy resources

    International Nuclear Information System (INIS)

    Kefayat, M.; Lashkar Ara, A.; Nabavi Niaki, S.A.

    2015-01-01

    Highlights: • A probabilistic optimization framework incorporated with uncertainty is proposed. • A hybrid optimization approach combining ACO and ABC algorithms is proposed. • The problem is to deal with technical, environmental and economical aspects. • A fuzzy interactive approach is incorporated to solve the multi-objective problem. • Several strategies are implemented to compare with literature methods. - Abstract: In this paper, a hybrid configuration of ant colony optimization (ACO) with artificial bee colony (ABC) algorithm called hybrid ACO–ABC algorithm is presented for optimal location and sizing of distributed energy resources (DERs) (i.e., gas turbine, fuel cell, and wind energy) on distribution systems. The proposed algorithm is a combined strategy based on the discrete (location optimization) and continuous (size optimization) structures to achieve advantages of the global and local search ability of ABC and ACO algorithms, respectively. Also, in the proposed algorithm, a multi-objective ABC is used to produce a set of non-dominated solutions which store in the external archive. The objectives consist of minimizing power losses, total emissions produced by substation and resources, total electrical energy cost, and improving the voltage stability. In order to investigate the impact of the uncertainty in the output of the wind energy and load demands, a probabilistic load flow is necessary. In this study, an efficient point estimate method (PEM) is employed to solve the optimization problem in a stochastic environment. The proposed algorithm is tested on the IEEE 33- and 69-bus distribution systems. The results demonstrate the potential and effectiveness of the proposed algorithm in comparison with those of other evolutionary optimization methods

  13. Loading pattern optimization using ant colony algorithm

    International Nuclear Information System (INIS)

    Hoareau, Fabrice

    2008-01-01

    Electricite de France (EDF) operates 58 nuclear power plants (NPPs) of the Pressurized Water Reactor type. The loading pattern optimization of these NPPs is currently done by EDF expert engineers. Within this framework, EDF R and D has developed automatic optimization tools that assist the experts. LOOP is an industrial tool, developed by EDF R and D and based on a simulated annealing algorithm. In order to improve the results of such automatic tools, new optimization methods have to be tested. Ant Colony Optimization (ACO) algorithms are recent methods that have given very good results on combinatorial optimization problems. In order to evaluate the performance of such methods on loading pattern optimization, direct comparisons between LOOP and a mock-up based on the Max-Min Ant System algorithm (a particular variant of ACO algorithms) were made on realistic test-cases. It is shown that the results obtained by the ACO mock-up are very similar to those of LOOP. Future research will consist in improving these encouraging results by using parallelization and by hybridizing the ACO algorithm with local search procedures. (author)
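
    The mock-up is based on the Max-Min Ant System, whose distinguishing feature is that only the best solution deposits pheromone and all trails are clamped to a [tau_min, tau_max] range. A generic sketch of that update follows; the bounds and evaporation rate are illustrative, and the loading-pattern encoding itself is not shown.

```python
def mmas_update(tau, best_solution, best_cost, rho=0.05,
                tau_min=0.01, tau_max=5.0):
    """Max-Min Ant System pheromone update: evaporate everywhere, deposit only
    along the best solution found, then clamp every trail to [tau_min, tau_max].
    `tau` maps a decision (e.g. an assembly placed in a core position) to a trail
    value; `best_solution` is the iteration-best or global-best set of decisions."""
    for key in tau:
        tau[key] = max(tau_min, (1.0 - rho) * tau[key])
    for key in best_solution:
        tau[key] = min(tau_max, tau[key] + 1.0 / best_cost)
    return tau
```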

  14. Analysis of parameter estimation and optimization application of ant colony algorithm in vehicle routing problem

    Science.gov (United States)

    Xu, Quan-Li; Cao, Yu-Wei; Yang, Kun

    2018-03-01

    Ant Colony Optimization (ACO) is one of the most widely used artificial intelligence algorithms at present. This study introduces the principle and mathematical model of the ACO algorithm for solving the Vehicle Routing Problem (VRP) and designs a vehicle routing optimization model based on ACO. A vehicle routing optimization simulation system was then developed in the C++ programming language, and sensitivity analyses, estimations and improvements of the three key parameters of ACO were carried out. The results indicate that the ACO algorithm designed in this paper can efficiently solve the rational planning and optimization of the VRP, that different values of the key parameters have a significant influence on the performance and optimization effect of the algorithm, and that the improved algorithm is not prone to premature local convergence and has good robustness.

  15. 2D Tsallis Entropy for Image Segmentation Based on Modified Chaotic Bat Algorithm

    Directory of Open Access Journals (Sweden)

    Zhiwei Ye

    2018-03-01

    Full Text Available Image segmentation is a significant step in image analysis and computer vision. Many entropy-based approaches have been presented on this topic; among them, Tsallis entropy is one of the best performing methods. However, 1D Tsallis entropy does not make use of the spatial correlation information within the neighborhood, so the results can be degraded by noise. Therefore, 2D Tsallis entropy is proposed to solve the problem, and the results are compared with 1D Fisher, 1D maximum entropy, 1D cross entropy, 1D Tsallis entropy, fuzzy entropy, 2D Fisher, 2D maximum entropy and 2D cross entropy. On the other hand, due to the huge computational cost, meta-heuristic algorithms like the genetic algorithm (GA), particle swarm optimization (PSO), the ant colony optimization algorithm (ACO) and the differential evolution algorithm (DE) are used to accelerate the 2D Tsallis entropy thresholding method. In this paper, considering 2D Tsallis entropy as a constrained optimization problem, the optimal thresholds are acquired by maximizing the objective function using a modified chaotic bat algorithm (MCBA). The proposed algorithm has been tested on some real and infrared images. The results are compared with those of PSO, GA, ACO and DE and demonstrate that the proposed method outperforms the other approaches involved in the paper, making it a feasible and effective option for image segmentation.
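
    As a point of reference for the 1D baseline the paper extends, the Tsallis-entropy criterion scores a threshold t by the sum of the object and background entropies plus a coupling term; the meta-heuristic (here the modified chaotic bat algorithm, or PSO/GA/ACO/DE in the comparison) searches for the t that maximizes it. The 1D sketch below uses an assumed q value and a toy histogram, and replaces the meta-heuristic with exhaustive search.

```python
def tsallis_objective(hist, t, q=0.8):
    """1D Tsallis entropy criterion for threshold t:
    S_A + S_B + (1 - q) * S_A * S_B, where class A is bins <= t and B is bins > t."""
    total = float(sum(hist))
    p = [h / total for h in hist]
    pa = sum(p[: t + 1]) or 1e-12
    pb = sum(p[t + 1 :]) or 1e-12
    sa = (1.0 - sum((pi / pa) ** q for pi in p[: t + 1] if pi > 0)) / (q - 1.0)
    sb = (1.0 - sum((pi / pb) ** q for pi in p[t + 1 :] if pi > 0)) / (q - 1.0)
    return sa + sb + (1.0 - q) * sa * sb

# exhaustive search on a toy 16-bin histogram (a meta-heuristic would replace this loop)
hist = [40, 35, 30, 20, 5, 3, 2, 1, 1, 2, 4, 10, 25, 30, 35, 20]
best_t = max(range(len(hist) - 1), key=lambda t: tsallis_objective(hist, t))
print("best threshold bin:", best_t)
```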

  16. Ant-Based Phylogenetic Reconstruction (ABPR): A new distance algorithm for phylogenetic estimation based on ant colony optimization

    Directory of Open Access Journals (Sweden)

    Karla Vittori

    2008-12-01

    Full Text Available We propose a new distance algorithm for phylogenetic estimation based on Ant Colony Optimization (ACO), named Ant-Based Phylogenetic Reconstruction (ABPR). ABPR joins two taxa iteratively based on evolutionary distance among sequences, while also accounting for the quality of the phylogenetic tree built according to the total length of the tree. Similar to optimization algorithms for phylogenetic estimation, the algorithm allows exploration of a larger set of nearly optimal solutions. We applied the algorithm to four empirical data sets of mitochondrial DNA ranging from 12 to 186 sequences, and from 898 to 16,608 base pairs, and covering taxonomic levels from populations to orders. We show that ABPR performs better than the commonly used Neighbor-Joining algorithm, except when sequences are too closely related (e.g., population-level sequences). The phylogenetic relationships recovered at and above species level by ABPR agree with conventional views. However, like other algorithms of phylogenetic estimation, the proposed algorithm failed to recover expected relationships when distances are too similar or when rates of evolution are very variable, leading to the problem of long-branch attraction. ABPR, as well as other ACO-based algorithms, is emerging as a fast and accurate alternative method of phylogenetic estimation for large data sets.

  17. Primary cardiac tumors (Tumores cardíacos primarios)

    Directory of Open Access Journals (Sweden)

    Rosa Eugenia Díaz Garriga

    2013-10-01

    Full Text Available Introduction: primary cardiac tumors are those that originate in the myocardium or pericardium. Ninety percent are benign and non-invasive, but because of their location they can cause severe hemodynamic disturbances and arrhythmias. Case presentation: two cases of cardiac tumors diagnosed prenatally are presented: a 32-year-old pregnant woman with neurofibromatosis, in whose fetus two types of cardiac tumors, an atrial myxoma and a fibroma, were identified by fetal echocardiography; and a child diagnosed prenatally with a rhabdomyoma, which was confirmed at birth and regressed spontaneously. Conclusions: fetal echocardiography increasingly allows the intrauterine diagnosis of cardiac tumors. Rhabdomyomas regress in more than 50% of cases but may be a marker of tuberous sclerosis. Cardiac tumors are associated with other congenital conditions and require surgical treatment. All of these aspects should be taken into consideration when providing genetic counseling to the family.

  18. A Hybrid Ant Colony Optimization Algorithm for the Extended Capacitated Arc Routing Problem.

    Science.gov (United States)

    Li-Ning Xing; Rohlfshagen, P; Ying-Wu Chen; Xin Yao

    2011-08-01

    The capacitated arc routing problem (CARP) is representative of numerous practical applications, and in order to widen its scope, we consider an extended version of this problem that entails both total service time and fixed investment costs. We subsequently propose a hybrid ant colony optimization (ACO) algorithm (HACOA) to solve instances of the extended CARP. This approach is characterized by the exploitation of heuristic information, adaptive parameters, and local optimization techniques: Two kinds of heuristic information, arc cluster information and arc priority information, are obtained continuously from the solutions sampled to guide the subsequent optimization process. The adaptive parameters ease the burden of choosing initial values and facilitate improved and more robust results. Finally, local optimization, based on the two-opt heuristic, is employed to improve the overall performance of the proposed algorithm. The resulting HACOA is tested on four sets of benchmark problems containing a total of 87 instances with up to 140 nodes and 380 arcs. In order to evaluate the effectiveness of the proposed method, some existing capacitated arc routing heuristics are extended to cope with the extended version of this problem; the experimental results indicate that the proposed ACO method outperforms these heuristics.
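
    The local optimization step mentioned is based on the classic two-opt move, which reverses a segment of the route whenever doing so shortens it. The generic sketch below applies it to a plain tour on a small distance matrix rather than to the paper's arc-routing encoding.

```python
def two_opt(tour, dist):
    """Repeatedly reverse tour segments while doing so shortens the tour."""
    improved = True
    while improved:
        improved = False
        for i in range(1, len(tour) - 1):
            for k in range(i + 1, len(tour)):
                a, b = tour[i - 1], tour[i]
                c, d = tour[k], tour[(k + 1) % len(tour)]
                # gain from replacing edges (a,b) and (c,d) with (a,c) and (b,d)
                if dist[a][c] + dist[b][d] < dist[a][b] + dist[c][d]:
                    tour[i : k + 1] = reversed(tour[i : k + 1])
                    improved = True
    return tour

# toy usage on a symmetric 4-city instance
dist = [[0, 2, 9, 10], [2, 0, 6, 4], [9, 6, 0, 3], [10, 4, 3, 0]]
print(two_opt([0, 2, 1, 3], dist))
```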

  19. Pioneer ACO Model

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Pioneer ACO Model is designed for health care organizations and providers that are already experienced in coordinating care for patients across care settings. It...

  20. Unveiling the unicorn: a leader's guide to ACO preparation.

    Science.gov (United States)

    Aslin, Paul

    2011-01-01

    The great uncertainty surrounding healthcare reform provides little incentive for action. However, as healthcare leaders wait for final rules and clarity about accountable care organizations (ACOs), inaction is the inappropriate response. Several central themes emerge from research about beginning the ACO process. Leaders should be able to understand and articulate ACO concepts. They should champion embracing cultural change while partnering with physicians. An inventory of skills and capabilities should be taken to identify any deficiencies that must be addressed before implementing an ACO. Finally, a plan should be formed by asking strategic questions about each platform needed, to ensure performance and strategic goals are at the forefront of decisions regarding the structure and function of an ACO. It takes a visionary leader to accept these challenges.

  1. Cloud Service Scheduling Algorithm Research and Optimization

    Directory of Open Access Journals (Sweden)

    Hongyan Cui

    2017-01-01

    Full Text Available We propose a cloud service scheduling model that is referred to as the Task Scheduling System (TSS). In the user module, the processing time of each task follows a general distribution. In the task scheduling module, we take a weighted sum of makespan and flowtime as the objective function and use Ant Colony Optimization (ACO) and a Genetic Algorithm (GA) to solve the problem of cloud task scheduling. Simulation results show that the convergence speed and output performance of our Genetic Algorithm-Chaos Ant Colony Optimization (GA-CACO) are optimal.
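
    The objective the scheduler minimizes is a weighted sum of makespan and flowtime computed from the tasks' completion times. A small evaluation sketch for a candidate assignment of tasks to machines follows; the weight, the data layout, and the in-order execution assumption are illustrative, not the paper's exact model.

```python
def schedule_cost(assignment, proc_time, n_machines, w=0.5):
    """Weighted sum of makespan and flowtime for a candidate schedule.
    assignment[i] is the machine running task i; tasks on one machine are
    assumed to run in index order. w trades makespan against flowtime."""
    finish_machine = [0.0] * n_machines
    flowtime = 0.0
    for task, m in enumerate(assignment):
        finish_machine[m] += proc_time[task]     # completion time of this task
        flowtime += finish_machine[m]            # flowtime is the sum of completion times
    makespan = max(finish_machine)
    return w * makespan + (1 - w) * flowtime

# toy usage: four tasks on two machines
print(schedule_cost([0, 1, 0, 1], [4, 2, 3, 5], n_machines=2))
```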

  2. A performance improvement and cost-efficient ACO-OFDM scheme for visible light communications

    Science.gov (United States)

    Zhang, Tiantian; Zhou, Ji; Zhang, Zhenshan; Qiao, Yaojun; Su, Fei; Yang, Aiying

    2017-11-01

    In this paper, we propose a performance-improving and cost-efficient discrete Hartley transform (DHT)-based asymmetrically clipped optical orthogonal frequency division multiplexing (ACO-OFDM) scheme for visible light communications (VLC). The simple one-dimensional modulation constellation and simplified encoding structure considerably reduce the complexity of the system. The DHT-spreading technique is employed to reduce the peak-to-average power ratio (PAPR) of ACO-OFDM signals. Moreover, the intra-symbol frequency-domain averaging (ISFA) technique is used to increase the accuracy of channel estimation by effectively removing the effect of ambient noise in the VLC channel. To verify the feasibility of the proposed scheme, we study its performance via simulation. The scheme relaxes the requirement on DAC resolution and increases the tolerance to the nonlinear characteristics of the LED, both of which are cost-efficient. At the forward error correction (FEC) limit (BER = 1 × 10^-3), simulation results show that, compared with DHT-based ACO-OFDM without the ISFA technique, our scheme achieves 3.2 dB and 2.7 dB improvements in the required Eb/N0 when BPSK and 4-PAM are modulated, respectively.

  3. Optimization on Paddy Crops in Central Java (with Solver, SVD on Least Square and ACO (Ant Colony Algorithm))

    Science.gov (United States)

    Parhusip, H. A.; Trihandaru, S.; Susanto, B.; Prasetyo, S. Y. J.; Agus, Y. H.; Simanjuntak, B. H.

    2017-03-01

    Several algorithms and objective functions for paddy crops have been studied to obtain optimal paddy crops in Central Java, based on data from Surakarta and Boyolali. The algorithms are a linear solver, least squares, and the Ant Colony Optimization algorithm (ACO), used to develop optimization procedures for paddy crops modelled with Modified GSTAR (Generalized Space-Time Autoregressive) and nonlinear models, where the nonlinear models are quadratic and power functions. The studied data contain paddy crops from Surakarta and Boyolali, determining the best planting period in the years 1992-2012 for Surakarta, where 3 planting periods are known, and the optimal amount of paddy crops in Boyolali in the years 2008-2013. These analyses may guide the local agriculture government in making decisions on rice sustainability in its region. The best planting period for Surakarta is found to be September-December, based on the 1992-2012 data, with the planting area, the cropping area, and the paddy crops taken into account as the most important factors. As a result, the paddy crops in this best period (about 60.4 thousand tons per year) can be regarded as the optimal result for 1992-2012, where the objective function used is quadratic. According to the research, the optimal paddy crops in Boyolali are about 280 thousand tons per year, where the studied factors are the amount of rainfall, the harvested area, and the paddy crops in 2008-2013. In this case, linear and power functions are studied as the objective functions. Compared to all studied algorithms, the linear solver is still recommended as an optimization tool for the local agriculture government to predict future paddy crops.

  4. Ant Colony Optimization Approaches to Clustering of Lung Nodules from CT Images

    Directory of Open Access Journals (Sweden)

    Ravichandran C. Gopalakrishnan

    2014-01-01

    Lung cancer is becoming a threat to mankind. Applying machine learning algorithms for the detection and segmentation of irregularly shaped lung nodules remains a major challenge in CT scan image analysis research. In this paper, we apply the ACO algorithm for lung nodule detection. We have compared the performance against three other algorithms, namely, the Otsu algorithm, the watershed algorithm, and global region-based segmentation. In addition, we suggest a novel approach which involves variations of ACO, namely, refined ACO, logical ACO, and variant ACO. Variant ACO shows a better reduction in false positives. We also propose a black circular neighborhood approach to detect nodule centers from the edge-detected image. Genetic algorithm based clustering is performed to cluster the nodules based on intensity, shape, and size. The performance of the overall approach is compared with hierarchical clustering to establish the improvement achieved by the proposed approach.

  5. ACO model should encourage efficient care delivery.

    Science.gov (United States)

    Toussaint, John; Krueger, David; Shortell, Stephen M; Milstein, Arnold; Cutler, David M

    2015-09-01

    The independent Office of the Actuary for CMS certified that the Pioneer ACO model has met the stringent criteria for expansion to a larger population. Significant savings have accrued and quality targets have been met, so the program as a whole appears to be working. Ironically, 13 of the initial 32 enrollees have left. We attribute this to the design of the ACO models which inadequately support efficient care delivery. Using Bellin-ThedaCare Healthcare Partners as an example, we will focus on correctible flaws in four core elements of the ACO payment model: finance spending and targets, attribution, and quality performance. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. PMU Placement Methods in Power Systems based on Evolutionary Algorithms and GPS Receiver

    Directory of Open Access Journals (Sweden)

    M. R. Mosavi

    2013-06-01

    In this paper, optimal placement of Phasor Measurement Units (PMUs) using the Global Positioning System (GPS) is discussed. Ant Colony Optimization (ACO), Simulated Annealing (SA), Particle Swarm Optimization (PSO), and Genetic Algorithm (GA) are used for this problem. A pheromone evaporation coefficient and the probability of an ant moving from state x to state y are introduced into the ACO. The modified algorithm outperforms the basic ACO in obtaining the global optimal solution and in convergence speed when applied to the PMU placement problem. We also compare this approach with SA, PSO, and GA to assess the capability of ACO in the search for the optimal solution. The fitness function includes observability, redundancy, and the number of PMUs. The Logarithmic Least Square Method (LLSM) is used to calculate the weights of the fitness function. The suggested optimization method is applied to the IEEE 30-bus system, and the simulation results show that the modified ACO finds better results than PSO and SA, and the same result as GA.

  7. The asthma-chronic obstructive pulmonary disease overlap syndrome (ACOS): opportunities and challenges.

    Science.gov (United States)

    Barrecheguren, Miriam; Esquinas, Cristina; Miravitlles, Marc

    2015-01-01

    Some individuals share characteristics of asthma and chronic obstructive pulmonary disease (COPD). The asthma-COPD overlap syndrome (ACOS) has been defined as symptoms of increased variability of airflow in association with an incompletely reversible airflow obstruction. In this review, we present the latest findings in the diagnosis, characterization and management of ACOS. Around 15-20% of COPD patients may have an ACOS. Patients with ACOS are characterized by increased reversibility of airflow obstruction, eosinophilic bronchial and systemic inflammation, and increased response to inhaled corticosteroids, compared with the remaining patients with COPD. Patients with ACOS have more frequent exacerbations, more wheezing and dyspnoea, but similar cough and sputum production compared with COPD. The relevance of the ACOS is to identify patients with COPD who may have underlying eosinophilic inflammation that responds to inhaled corticosteroids. So far, the previous diagnosis of asthma in a patient with COPD is the more reliable criterion for ACOS. Ongoing studies will clarify if concentrations of blood eosinophils may be useful to identify this subgroup of patients with COPD. If this is the case, the interest of ACOS may shift to that of eosinophilic COPD, which is easier to diagnose and has clear therapeutic implications.

  8. Substantial Physician Turnover And Beneficiary 'Churn' In A Large Medicare Pioneer ACO.

    Science.gov (United States)

    Hsu, John; Vogeli, Christine; Price, Mary; Brand, Richard; Chernew, Michael E; Mohta, Namita; Chaguturu, Sreekanth K; Weil, Eric; Ferris, Timothy G

    2017-04-01

    Alternative payment models, such as accountable care organizations (ACOs), attempt to stimulate improvements in care delivery by better alignment of payer and provider incentives. However, limited attention has been paid to the physicians who actually deliver the care. In a large Medicare Pioneer ACO, we found that the number of beneficiaries per physician was low (median of seventy beneficiaries per physician, or less than 5 percent of a typical panel). We also found substantial physician turnover: More than half of physicians either joined (41 percent) or left (18 percent) the ACO during the 2012-14 contract period studied. When physicians left the ACO, most of their attributed beneficiaries also left the ACO. Conversely, about half of the growth in the beneficiary population was because of new physicians affiliating with the ACO; the remainder joined after switching physicians. These findings may help explain the muted financial impact ACOs have had overall, and they raise the possibility of future gaming on the part of ACOs to artificially control spending. Policy refinements include coordinated and standardized risk-sharing parameters across payers to prevent any dilution of the payment incentives or confusion from a cacophony of incentives across payers. Project HOPE—The People-to-People Health Foundation, Inc.

  9. Determinants of success in Shared Savings Programs: An analysis of ACO and market characteristics.

    Science.gov (United States)

    Ouayogodé, Mariétou H; Colla, Carrie H; Lewis, Valerie A

    2017-03-01

    Medicare's Accountable Care Organization (ACO) programs introduced shared savings to traditional Medicare, which allow providers who reduce health care costs for their patients to retain a percentage of the savings they generate. To examine ACO and market factors associated with superior financial performance in Medicare ACO programs. We obtained financial performance data from the Centers for Medicare and Medicaid Services (CMS); we derived market-level characteristics from Medicare claims; and we collected ACO characteristics from the National Survey of ACOs for 215 ACOs. We examined the association between ACO financial performance and ACO provider composition, leadership structure, beneficiary characteristics, risk bearing experience, quality and process improvement capabilities, physician performance management, market competition, CMS-assigned financial benchmark, and ACO contract start date. We examined two outcomes from Medicare ACOs' first performance year: savings per Medicare beneficiary and earning shared savings payments (a dichotomous variable). When modeling the ACO ability to save and earn shared savings payments, we estimated positive regression coefficients for a greater proportion of primary care providers in the ACO, more practicing physicians on the governing board, physician leadership, active engagement in reducing hospital re-admissions, a greater proportion of disabled Medicare beneficiaries assigned to the ACO, financial incentives offered to physicians, a larger financial benchmark, and greater ACO market penetration. No characteristic of organizational structure was significantly associated with both outcomes of savings per beneficiary and likelihood of achieving shared savings. ACO prior experience with risk-bearing contracts was positively correlated with savings and significantly increased the likelihood of receiving shared savings payments. In the first year, performance is quite heterogeneous, yet organizational structure does not

  10. Cardiac pheochromocytoma

    Directory of Open Access Journals (Sweden)

    Gustavo L. Knop

    2006-01-01

    Primary cardiac pheochromocytomas (FCP) are extremely rare; fewer than 50 cases have been reported worldwide to date. We present the case of an intrapericardial tumor, which proved to be a primary pheochromocytoma, in a middle-aged woman whose main sign was severe arterial hypertension. Diagnostic imaging studies confirmed the presence of an intrapericardial tumor as the only finding, and biochemical studies of catecholamines and their metabolites excreted in urine confirmed the etiological diagnosis. The tumor was resected surgically without complications by conventional cardiac surgery with extracorporeal circulation and cardioplegic arrest. Seven months after the operation, the patient is asymptomatic and normotensive.

  11. PROPOSAL OF ALGORITHM FOR ROUTE OPTIMIZATION

    OpenAIRE

    Robert Ramon de Carvalho Sousa; Abimael de Jesus Barros Costa; Eliezé Bulhões de Carvalho; Adriano de Carvalho Paranaíba; Daylyne Maerla Gomes Lima Sandoval

    2016-01-01

    This article uses the "Six Sigma" methodology to develop an algorithm for routing problems that is able to obtain more efficient results than Clarke and Wright's (CW) algorithm (1964) in situations of random increases in product delivery demand, given the inability to increase the service level. In some situations, the proposed algorithm obtained more efficient results than the CW algorithm. The key factor was a reduction in the number of mistakes (on...

  12. CO2 Measurements from Space: Lessons Learned from the Collaboration between the ACOS/OCO-2 and GOSAT Teams

    Science.gov (United States)

    Crisp, D.; Eldering, A.; Gunson, M. R.

    2012-12-01

    The NASA Orbiting Carbon Observatory (OCO) and the Japanese Greenhouse gases Observing SATellite (GOSAT) were the first two missions designed to collect space-based observations of the column-averaged CO2 dry air mole fraction, XCO2, with the sensitivity, coverage, and resolution needed to quantify CO2 fluxes on regional scales over the globe. The OCO and GOSAT teams formed a close collaboration during the development phases of these missions. After the loss of OCO, the GOSAT project team invited the OCO team to contribute to the analysis of measurements collected by the GOSAT Thermal And Near infrared Sensor for carbon Observations-Fourier Transform Spectrometer (TANSO-FTS). NASA responded by reformulating the OCO science team under the Atmospheric CO2 Observations from Space (ACOS) task to exploit this opportunity. This collaboration is providing an independent GOSAT XCO2 product and valuable insights into the retrieval algorithms, calibration methods, and validation techniques being developed to analyze data anticipated from the NASA Orbiting Carbon Observatory-2 (OCO-2). The ACOS/OCO-2 and GOSAT teams have conducted four joint vicarious calibration campaigns at Railroad Valley, Nevada, to track the long-term radiometric performance of the TANSO-FTS instrument. The methods used in these campaigns evolved from those used to characterize the radiometric performance of high-spatial-resolution imaging spectroradiometers. For TANSO-FTS, the conventional surface-based radiometric measurements have been augmented with surface and aircraft measurements of atmospheric temperature and trace gas profiles, as well as surface observations from MODIS and ASTER, to characterize spatial variations of the surface reflectance within the (relatively large) sounding footprints. Similar methods will be needed for OCO-2. The ACOS/OCO-2 retrieval algorithm and associated data screening methods have been modified to estimate XCO2 from TANSO-FTS observations. Comparisons of TANSO

  13. BER Performance of Stratified ACO-OFDM for Optical Wireless Communications over Multipath Channel

    OpenAIRE

    Gebeyehu, Zelalem Hailu; Langat, Philip Kibet; Maina, Ciira Wa

    2018-01-01

    In intensity modulation/direct detection (IM/DD) based optical OFDM systems, the requirement that the input signal be real and unipolar (positive) reduces system performance. Among previously proposed unipolar optical OFDM schemes for optical wireless communications (OWC), asymmetrically clipped optical OFDM (ACO-OFDM) and direct current biased optical OFDM (DCO-OFDM) are the most accepted ones. But those proposed schemes experience either spectral efficiency loss or energy e...

  14. Cardiac Arrest in Pregnancy

    OpenAIRE

    Manuel Eduardo Sáenz Madrigal; Carlos Adrián Vindas Morera

    2013-01-01

    Cardiac arrest in pregnancy presents a unique scenario involving two patients: the mother and the fetus. Managing this scenario requires a multidisciplinary team including specialists in anesthesia, obstetrics, neonatology, cardiology and, at times, cardiac surgery. Basic life support and advanced cardiac life support protocols must be implemented; however, given the anatomical and physiological changes that occur in pregnancy, some modifications...

  15. An improved ant colony optimization algorithm with fault tolerance for job scheduling in grid computing systems.

    Directory of Open Access Journals (Sweden)

    Hajara Idris

    The Grid scheduler schedules user jobs on the best available resource in terms of resource characteristics by optimizing job execution time. Resource failure in the Grid is no longer an exception but a regularly occurring event, as resources are increasingly being used by the scientific community to solve computationally intensive problems which typically run for days or even months. It is therefore absolutely essential that these long-running applications are able to tolerate failures and avoid re-computation from scratch after a resource failure has occurred, in order to satisfy the user's Quality of Service (QoS) requirement. Job scheduling with fault tolerance in Grid computing using Ant Colony Optimization is proposed to ensure that jobs are executed successfully even when resource failure has occurred. The technique employed in this paper is the use of the resource failure rate, as well as a checkpoint-based rollback recovery strategy. Checkpointing aims at reducing the amount of work that is lost upon failure of the system by immediately saving the state of the system. A comparison of the proposed approach with an existing Ant Colony Optimization (ACO) algorithm is discussed. The experimental results of the implemented fault-tolerant scheduling algorithm show that there is an improvement in the user's QoS requirement over the existing ACO algorithm, which has no fault tolerance integrated in it. The performance evaluation of the two algorithms was measured in terms of the three main scheduling performance metrics: makespan, throughput, and average turnaround time.
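
    The abstract says the scheduler combines ACO with the resource failure rate, but does not give the exact rule. The following is a minimal sketch under the assumption that the heuristic desirability of a resource is scaled down by its failure rate before the usual pheromone-weighted roulette selection; all parameter values and data are illustrative.

```python
import random

# Hypothetical sketch: pick a Grid resource for a job using an ACO-style rule
# in which the heuristic term penalizes resources with a high failure rate.
# alpha, beta and the failure-rate weighting are illustrative assumptions.

def select_resource(pheromone, exec_time, failure_rate, alpha=1.0, beta=2.0):
    """pheromone[r], exec_time[r], failure_rate[r] are per-resource values."""
    scores = []
    for tau, t, f in zip(pheromone, exec_time, failure_rate):
        eta = (1.0 / t) * (1.0 - f)          # faster and more reliable => more desirable
        scores.append((tau ** alpha) * (eta ** beta))
    total = sum(scores)
    r = random.uniform(0.0, total)
    acc = 0.0
    for idx, s in enumerate(scores):
        acc += s
        if acc >= r:
            return idx
    return len(scores) - 1

if __name__ == "__main__":
    print(select_resource([1.0, 1.0, 1.0], [5.0, 3.0, 4.0], [0.1, 0.4, 0.05]))
```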

  16. PROPOSAL OF ALGORITHM FOR ROUTE OPTIMIZATION

    Directory of Open Access Journals (Sweden)

    Robert Ramon de Carvalho Sousa

    2016-06-01

    This article uses the "Six Sigma" methodology to develop an algorithm for routing problems that is able to obtain more efficient results than Clarke and Wright's (CW) algorithm (1964) in situations of random increases in product delivery demand, given the inability to increase the service level. In some situations, the proposed algorithm obtained more efficient results than the CW algorithm. The key factor was a reduction in the number of mistakes (one-way routes) and in the level of result variation.

  17. Dynamic routing and spectrum assignment based on multilayer virtual topology and ant colony optimization in elastic software-defined optical networks

    Science.gov (United States)

    Wang, Fu; Liu, Bo; Zhang, Lijia; Zhang, Qi; Tian, Qinghua; Tian, Feng; Rao, Lan; Xin, Xiangjun

    2017-07-01

    Elastic software-defined optical networks greatly improve the flexibility of the optical switching network, but they bring challenges to routing and spectrum assignment (RSA). A multilayer virtual topology model is proposed to solve RSA problems. Two RSA algorithms based on the virtual topology are proposed: an ant colony optimization (ACO) algorithm of minimum consecutiveness loss and an ACO algorithm of maximum spectrum consecutiveness. Owing to the computing power of the control layer in the software-defined network, the routing algorithm avoids frequent exchange of link-state information between routers. Based on the effect of the spectrum consecutiveness loss on the pheromone in the ACO, the path and spectrum with the least impact on the network are selected for the service request. The proposed algorithms have been compared with other algorithms. The results show that the proposed algorithms can reduce the blocking rate by at least 5% and perform better in spectrum efficiency. Moreover, the proposed algorithms can effectively decrease spectrum fragmentation and enhance available spectrum consecutiveness.

  18. Optimization of Straight Cylindrical Turning Using Artificial Bee Colony (ABC) Algorithm

    Science.gov (United States)

    Prasanth, Rajanampalli Seshasai Srinivasa; Hans Raj, Kandikonda

    2017-04-01

    The artificial bee colony (ABC) algorithm, which mimics the intelligent foraging behavior of honey bees, is increasingly gaining acceptance in the field of process optimization, as it is capable of handling nonlinearity, complexity, and uncertainty. Straight cylindrical turning is a complex and nonlinear machining process which involves the selection of appropriate cutting parameters that affect the quality of the workpiece. This paper presents the estimation of optimal cutting parameters of the straight cylindrical turning process using the ABC algorithm. The ABC algorithm is first tested on four benchmark problems of numerical optimization and its performance is compared with the genetic algorithm (GA) and the ant colony optimization (ACO) algorithm. Results indicate that the rate of convergence of the ABC algorithm is better than that of GA and ACO. Then, the ABC algorithm is used to predict optimal cutting parameters such as cutting speed, feed rate, depth of cut, and tool nose radius to achieve a good surface finish. Results indicate that the ABC algorithm estimated a comparable surface finish when compared with the real-coded genetic algorithm and the differential evolution algorithm.

  19. ADAPTIVE ANT COLONY OPTIMIZATION BASED GRADIENT FOR EDGE DETECTION

    Directory of Open Access Journals (Sweden)

    Febri Liantoni

    2014-08-01

    Ant Colony Optimization (ACO) is a nature-inspired optimization algorithm motivated by the foraging behavior of ants. Due to its favorable advantages, ACO has been widely used to solve several NP-hard problems, including edge detection. Since ACO initially distributes ants at random, it may cause an imbalanced ant distribution which later affects the path discovery process. In this paper an adaptive ACO is proposed to optimize edge detection by adaptively distributing ants according to gradient analysis. Ants are distributed adaptively according to the gradient ratio of each image region: a region with a larger gradient ratio receives a larger number of ants. Experiments are conducted using images from various datasets. Precision and recall are used to quantitatively evaluate the performance of the proposed algorithm. The precision and recall of the adaptive ACO reach 76.98% and 96.8%, whereas the highest precision and recall for the standard ACO are 69.74% and 74.85%. Experimental results show that the adaptive ACO outperforms the standard ACO, which distributes ants randomly.
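
    The paper's exact region partitioning is not reproduced here. The sketch below assumes a simple square tiling of the image and allocates ants to tiles in proportion to each tile's share of the total gradient magnitude, which is the general idea described above; tiling size and ant count are illustrative.

```python
import numpy as np

# Hypothetical sketch of gradient-proportional ant placement on a tiled image.
# The 4x4 tiling and the total number of ants are illustrative assumptions.

def distribute_ants(image, n_ants=512, tiles=4):
    gy, gx = np.gradient(image.astype(float))
    grad = np.hypot(gx, gy)
    h, w = grad.shape
    th, tw = h // tiles, w // tiles
    ratios = np.array([[grad[i*th:(i+1)*th, j*tw:(j+1)*tw].sum()
                        for j in range(tiles)] for i in range(tiles)])
    ratios /= ratios.sum()                      # gradient ratio per region
    counts = np.round(ratios * n_ants).astype(int)
    positions = []
    rng = np.random.default_rng(0)
    for i in range(tiles):
        for j in range(tiles):
            ys = rng.integers(i*th, (i+1)*th, counts[i, j])
            xs = rng.integers(j*tw, (j+1)*tw, counts[i, j])
            positions.extend(zip(ys, xs))
    return positions                             # initial ant coordinates

if __name__ == "__main__":
    img = np.zeros((64, 64)); img[:, 32:] = 255  # synthetic vertical edge
    print(len(distribute_ants(img)))
```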

  20. Optimization of reload of nuclear power plants using ACO together with the GENES reactor physics code

    Energy Technology Data Exchange (ETDEWEB)

    Lima, Alan M.M. de; Freire, Fernando S.; Nicolau, Andressa S.; Schirru, Roberto, E-mail: alan@lmp.ufrj.br, E-mail: andressa@lmp.ufrj.br, E-mail: schirru@lmp.ufrj.br, E-mail: ffreire@eletronuclear.gov.br [Coordenacao de Pos-Graduacao e Pesquisa de Engenharia (PEN/COPPE/UFRJ), Rio de Janeiro, RJ (Brazil); Eletrobras Termonuclear S.A. (ELETRONUCLEAR), Rio de Janeiro, RJ (Brazil)

    2017-11-01

    The nuclear reload of a Pressurized Water Reactor (PWR) occurs whenever the burnup of the fuel elements can no longer maintain the criticality of the reactor, that is, it can no longer keep the nuclear power plant operating at its nominal power. The nuclear reactor reload optimization problem consists of finding a loading pattern of fuel assemblies in the reactor core that minimizes the cost/benefit ratio, trying to obtain maximum power generation at minimum cost, since in every reload an average of one third of the fuel elements are newly purchased. This loading pattern must also satisfy constraints of symmetry and security. In practice, it consists of placing 121 fuel elements in 121 core positions, in the case of the Angra 1 Brazilian Nuclear Power Plant (NPP), so that the new arrangement provides the best cost/benefit ratio. It is an extremely complex problem, since around 1% of the candidate arrangements are local optima: a core of 121 fuel elements has approximately 10^13 combinations and 10^11 local optima. With this number of possible combinations it is impossible to test them all in order to choose the best. In this work a system called ACO-GENES is proposed for optimizing the nuclear reactor reload problem. ACO is successfully used in combinatorial problems, and it is expected that ACO-GENES will provide a robust optimization system, since in addition to the ACO optimization it allows important prior knowledge such as k-infinity, burnup, etc. After optimization by ACO-GENES, the best results will be validated by a licensed reactor physics code and will be compared with the actual results of the cycle. (author)

  1. Optimization of reload of nuclear power plants using ACO together with the GENES reactor physics code

    International Nuclear Information System (INIS)

    Lima, Alan M.M. de; Freire, Fernando S.; Nicolau, Andressa S.; Schirru, Roberto

    2017-01-01

    The nuclear reload of a Pressurized Water Reactor (PWR) occurs whenever the burnup of the fuel elements can no longer maintain the criticality of the reactor, that is, it can no longer keep the nuclear power plant operating at its nominal power. The nuclear reactor reload optimization problem consists of finding a loading pattern of fuel assemblies in the reactor core that minimizes the cost/benefit ratio, trying to obtain maximum power generation at minimum cost, since in every reload an average of one third of the fuel elements are newly purchased. This loading pattern must also satisfy constraints of symmetry and security. In practice, it consists of placing 121 fuel elements in 121 core positions, in the case of the Angra 1 Brazilian Nuclear Power Plant (NPP), so that the new arrangement provides the best cost/benefit ratio. It is an extremely complex problem, since around 1% of the candidate arrangements are local optima: a core of 121 fuel elements has approximately 10^13 combinations and 10^11 local optima. With this number of possible combinations it is impossible to test them all in order to choose the best. In this work a system called ACO-GENES is proposed for optimizing the nuclear reactor reload problem. ACO is successfully used in combinatorial problems, and it is expected that ACO-GENES will provide a robust optimization system, since in addition to the ACO optimization it allows important prior knowledge such as k-infinity, burnup, etc. After optimization by ACO-GENES, the best results will be validated by a licensed reactor physics code and will be compared with the actual results of the cycle. (author)

  2. Super ACO FEL oscillation at 300 nm

    CERN Document Server

    Nutarelli, D; Renault, E; Nahon, L; Couprie, Marie Emmanuelle

    2000-01-01

    Some recent improvements, involving both the optical cavity mirrors and the positron beam dynamics in the storage ring, have allowed us to achieve laser oscillation at 300 nm on the Super ACO storage ring FEL. The Super ACO storage ring is operated at 800 MeV, which is the nominal energy for the usual synchrotron radiation users and the highest energy for a storage ring FEL. The lasing at 300 nm could be maintained for 2 h per injection, with a stored current ranging between 30 and 60 mA. The FEL characteristics are presented here. The longitudinal stability and the FEL optics behaviour are also discussed.

  3. Susceptibility to false memories in patients with ACoA aneurysm.

    Science.gov (United States)

    Borsutzky, Sabine; Fujiwara, Esther; Brand, Matthias; Markowitsch, Hans J

    2010-08-01

    We examined ACoA patients regarding their susceptibility to a range of false memory phenomena. We targeted provoked confabulation, false recall and false recognition in the Deese-Roediger-McDermott paradigm (DRM-paradigm), as well as false recognition in a mirror reading task. ACoA patients produced more provoked confabulations and more false recognition in mirror reading than comparison subjects. Conversely, false recall and false recognition in the DRM-paradigm were similar in patients and controls. Whereas the former two indices of false memories were correlated, no relationship was revealed with the DRM-paradigm. Our results suggest that rupture of an ACoA aneurysm leads to an increased susceptibility to a subset of false memory types. Copyright (c) 2010 Elsevier Ltd. All rights reserved.

  4. Heart transplantation: organization and indications

    Directory of Open Access Journals (Sweden)

    José M. Revuelta

    2008-01-01

    Clinical experience shows that the indications and contraindications for heart transplantation are constantly changing, mainly owing to advances in pre-, intra-, and postoperative care, to new drugs, and to other novel therapeutic alternatives. The patient's functional class (NYHA) is not precise enough on its own to indicate heart transplantation, so other measurements are required, such as maximal oxygen consumption: a VO2 max < 10 ml/kg/min entails high in-hospital mortality. Likewise, a reduced left ventricular ejection fraction (LVEF < 20%) should not be considered the main indicator for establishing the indication for transplantation, and other preoperative risk factors need to be assessed. Over the last decade, experience has shown that some of the formerly accepted absolute contraindications should not always rule out heart transplantation.

  5. Ant Colony Optimization Algorithm for Centralized Dynamic Channel Allocation in Multi-Cell OFDMA Systems

    Science.gov (United States)

    Kim, Hyo-Su; Kim, Dong-Hoi

    The dynamic channel allocation (DCA) scheme in multi-cell systems causes a serious inter-cell interference (ICI) problem for some existing calls when channels for new calls are allocated. Such a problem can be addressed by an advanced centralized DCA design that is able to minimize ICI. Thus, in this paper, a centralized DCA is developed for the downlink of multi-cell orthogonal frequency division multiple access (OFDMA) systems with full spectral reuse. However, in practice, as the search space of channel assignment for a centralized DCA scheme in multi-cell systems grows exponentially with the number of required calls, channels, and cells, it becomes an NP-hard problem, and finding an optimum channel allocation is currently too complicated. In this paper, we propose an ant colony optimization (ACO) based DCA scheme using a low-complexity ACO algorithm, a kind of heuristic algorithm, in order to solve the aforementioned problem. Simulation results demonstrate significant performance improvements compared to the existing schemes in terms of the grade of service (GoS) performance and the forced termination probability of existing calls, without degrading the system performance of the average throughput.

  6. Domainwise Web Page Optimization Based On Clustered Query Sessions Using Hybrid Of Trust And ACO For Effective Information Retrieval

    Directory of Open Access Journals (Sweden)

    Dr. Suruchi Chawla

    2015-08-01

    In this paper a hybrid of Ant Colony Optimization (ACO) and trust has been used for domainwise web page optimization in clustered query sessions for effective information retrieval. The trust of a web page identifies its degree of relevance in satisfying the specific information need of the user. The trusted web pages, when optimized using pheromone updates in ACO, identify trusted colonies of web pages which are relevant to the user's information need in a given domain. Hence in this paper the hybrid of trust and ACO has been used on clustered query sessions to identify a larger number of relevant documents in a given domain in order to better satisfy the information need of the user. An experiment was conducted on a data set of web query sessions to test the effectiveness of the proposed approach in three selected domains (Academics, Entertainment, and Sports), and the results confirm the improvement in the precision of search results.

  7. Optimum Layout for Water Quality Monitoring Stations through Ant Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Amin Afshar

    2006-09-01

    Due to the high cost of monitoring systems, budget limitations, and the high priority given to water quality control in municipal networks, especially for unexpected events, the optimum location of monitoring stations has received considerable attention during the last decade. An optimization model needs to be developed for the desirable location of monitoring stations. This research attempts to develop such a model using the Ant Colony Optimization (ACO) algorithm and tries to verify it on a classical benchmark example used in previous research. The selection of ACO as the optimizer is justified by the discrete decision space and the large number of binary variables in the model. The diversity of the policies derived from ACO may facilitate the decision-making process, considering the social, physical, and economical conditions.

  8. Solving multi-objective job shop problem using nature-based algorithms: new Pareto approximation features

    Directory of Open Access Journals (Sweden)

    Jarosław Rudy

    2015-01-01

    In this paper the job shop scheduling problem (JSP) with two criteria minimized simultaneously is considered. The JSP is a frequently used model in real-world applications of combinatorial optimization. Multi-objective job shop problems (MOJSP) have rarely been studied. We implement and compare two multi-agent nature-based methods, namely ant colony optimization (ACO) and a genetic algorithm (GA), for MOJSP. Both of these methods employ a technique taken from multi-criteria decision analysis in order to establish a ranking of solutions. ACO and GA differ in the method of keeping information about previously found solutions and their quality, which affects the course of the search. As a result, new features of the Pareto approximations provided by the two algorithms are observed: aside from the slight superiority of the ACO method, the Pareto frontier approximations provided by the two methods are disjoint sets.  Thus, both methods can be used to search mutually exclusive areas of the Pareto frontier.
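
    Since both methods return approximations of the Pareto frontier, the basic building block is a non-dominated filter over objective vectors. The paper does not specify its second criterion beyond "two criteria", so the sketch below simply assumes a bi-objective minimization and illustrative candidate values.

```python
# Minimal sketch of a Pareto (non-dominated) filter for bi-objective minimization,
# as used when approximating the Pareto frontier of a multi-objective JSP.
# The candidate objective vectors below are illustrative, not taken from the paper.

def dominates(a, b):
    """True if solution a is at least as good as b in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

if __name__ == "__main__":
    candidates = [(95, 40), (90, 55), (100, 35), (96, 45), (90, 60)]
    print(pareto_front(candidates))   # [(95, 40), (90, 55), (100, 35)]
```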

  9. Simplifying Multiproject Scheduling Problem Based on Design Structure Matrix and Its Solution by an Improved aiNet Algorithm

    Directory of Open Access Journals (Sweden)

    Chunhua Ju

    2012-01-01

    Managing multiple projects is a complex task involving the unrelenting pressures of time and cost. Many studies have proposed various tools and techniques for single-project scheduling; however, the literature that further considers multi-mode or multi-project issues occurring in the real world is rather scarce. In this paper, the design structure matrix (DSM) and an improved artificial immune network algorithm (aiNet) are developed to solve a multi-mode resource-constrained scheduling problem. Firstly, the DSM is used to simplify the mathematical model of the multi-project scheduling problem. Subsequently, the aiNet algorithm, comprising clonal selection, negative selection, and network suppression, is adopted to perform local and global searching, which ensures a powerful search ability while avoiding a possible combinatorial explosion. Finally, the approach is tested on a set of random cases generated by ProGen. The computational results validate the effectiveness of the proposed algorithm in comparison with other well-known metaheuristic algorithms such as the genetic algorithm (GA), simulated annealing (SA), and ant colony optimization (ACO).

  10. A practical algorithm for distribution state estimation including renewable energy sources

    Energy Technology Data Exchange (ETDEWEB)

    Niknam, Taher [Electronic and Electrical Department, Shiraz University of Technology, Modares Blvd., P.O. 71555-313, Shiraz (Iran); Firouzi, Bahman Bahmani [Islamic Azad University Marvdasht Branch, Marvdasht (Iran)

    2009-11-15

    Renewable energy is energy that is in continuous supply over time. These kinds of energy sources are divided into five principal renewable sources of energy: the sun, the wind, flowing water, biomass, and heat from within the earth. According to some studies carried out by research institutes, about 25% of new generation will be produced by Renewable Energy Sources (RESs) in the near future. Therefore, it is necessary to study the impact of RESs on power systems, especially on distribution networks. This paper presents a practical Distribution State Estimation (DSE) including RESs and some practical considerations. The proposed algorithm is based on the combination of the Nelder-Mead simplex search and Particle Swarm Optimization (PSO) algorithms, called PSO-NM. The proposed algorithm can estimate load and RES output values by the Weighted Least-Squares (WLS) approach. Some practical considerations are var compensators, Voltage Regulators (VRs), and Under Load Tap Changer (ULTC) transformer modeling, which usually have nonlinear and discrete characteristics, as well as unbalanced three-phase power flow equations. The comparison results with other evolutionary optimization algorithms such as the original PSO, Honey Bee Mating Optimization (HBMO), Neural Networks (NNs), Ant Colony Optimization (ACO), and Genetic Algorithm (GA) for a test system demonstrate that PSO-NM is extremely effective and efficient for DSE problems. (author)

  11. Hardware-efficient signal generation of layered/enhanced ACO-OFDM for short-haul fiber-optic links.

    Science.gov (United States)

    Wang, Qibing; Song, Binhuang; Corcoran, Bill; Boland, David; Zhu, Chen; Zhuang, Leimeng; Lowery, Arthur J

    2017-06-12

    Layered/enhanced ACO-OFDM is a promising candidate for intensity-modulation and direct-detection based short-haul fiber-optic links due to both its power and spectral efficiency. In this paper, we first demonstrate a hardware-efficient real-time 9.375 Gb/s QPSK-encoded layered/enhanced asymmetrically clipped optical OFDM (L/E-ACO-OFDM) transmitter using a Virtex-6 FPGA. This L/E-ACO-OFDM signal is successfully transmitted over 20 km of uncompensated standard single-mode fiber (S-SMF) using a directly modulated laser. Several methods are explored to reduce the FPGA's logic resource utilization by taking advantage of the L/E-ACO-OFDM signal characteristics. We show that the logic resource occupation of the L/E-ACO-OFDM transmitter is almost the same as that of a DC-biased OFDM transmitter when they achieve the same spectral efficiency, proving its great potential for use in a real-time short-haul optical transmission link.

  12. An ant colony optimization algorithm for phylogenetic estimation under the minimum evolution principle

    Directory of Open Access Journals (Sweden)

    Milinkovitch Michel C

    2007-11-01

    Background Distance matrix methods constitute a major family of phylogenetic estimation methods, and the minimum evolution (ME) principle (aiming at recovering the phylogeny with shortest length) is one of the most commonly used optimality criteria for estimating phylogenetic trees. The major difficulty for its application is that the number of possible phylogenies grows exponentially with the number of taxa analyzed, and the minimum evolution principle is known to belong to the NP-hard class of problems. Results In this paper, we introduce an Ant Colony Optimization (ACO) algorithm to estimate phylogenies under the minimum evolution principle. ACO is an optimization technique inspired by the foraging behavior of real ant colonies. This behavior is exploited in artificial ant colonies for the search of approximate solutions to discrete optimization problems. Conclusion We show that the ACO algorithm is potentially competitive in comparison with state-of-the-art algorithms for the minimum evolution principle. This is the first application of an ACO algorithm to the phylogenetic estimation problem.

  13. Hospitals Participating In ACOs Tend To Be Large And Urban, Allowing Access To Capital And Data.

    Science.gov (United States)

    Colla, Carrie H; Lewis, Valerie A; Tierney, Emily; Muhlestein, David B

    2016-03-01

    Relationships between physicians and hospitals have changed considerably over the past decade, as hospitals and physician groups have integrated and new public and private payment policies have created financial interdependence. The extent to which accountable care organizations (ACOs) involve hospitals in their operations may prove to be vitally important, because managing hospital care is a key part of improving health care quality and lowering cost growth. Using primary data on ACO composition and capabilities paired with hospital characteristics, we found that 20 percent of US hospitals were part of an ACO in 2014. Hospitals that were in urban areas, were nonprofit, or had a smaller share of Medicare patients were more likely to participate in ACOs, compared to hospitals that were in more rural areas, were for-profit or government owned, or had a larger share of Medicare patients, respectively. Qualitative data identified the following advantages of including a hospital in an ACO: the availability of start-up capital, advanced data sharing, and engagement of providers across the care continuum. Although the 63 percent of ACOs that included hospitals offered more comprehensive services compared to ACOs without hospitals, we found no differences between the two groups in their ability to manage hospital-related aspects of patient care. Project HOPE—The People-to-People Health Foundation, Inc.

  14. RNA Interference of 1-Aminocyclopropane-1-carboxylic Acid Oxidase (ACO1 and ACO2) Genes Expression Prolongs the Shelf Life of Eksotika (Carica papaya L.) Papaya Fruit

    Directory of Open Access Journals (Sweden)

    Rogayah Sekeli

    2014-06-01

    The purpose of this study was to evaluate the effectiveness of using RNA interference to down-regulate the expression of the 1-aminocyclopropane-1-carboxylic acid oxidase gene in Eksotika papaya. One-month-old embryogenic calli were separately transformed with Agrobacterium strain LBA 4404 harbouring the three different RNAi pOpOff2 constructs bearing the 1-aminocyclopropane-1-carboxylic acid oxidase gene. A total of 176 putative transformed lines were produced from 15,000 transformed calli, which were selected and then regenerated on medium supplemented with kanamycin. Integration and expression of the targeted gene in putatively transformed lines were verified by PCR and real-time RT-PCR. Confined field evaluation of a total of 31 putative transgenic lines planted showed knockdown expression of the targeted ACO1 and ACO2 genes in 13 lines, which required more than 8 days to reach the full yellow colour (Index 6). Fruits harvested from lines pRNAiACO2 L2-9 and pRNAiACO1 L2 exhibited about 20 and 14 days of extended post-harvest shelf life to reach Index 6, respectively. The total soluble solids content of the fruits ranged from 11 to 14° Brix, a range similar to fruits from non-transformed, wild-type seed-derived plants.

  15. A bat algorithm with mutation for UCAV path planning.

    Science.gov (United States)

    Wang, Gaige; Guo, Lihong; Duan, Hong; Liu, Luo; Wang, Heqi

    2012-01-01

    Path planning for an uninhabited combat air vehicle (UCAV) is a complicated high-dimensional optimization problem, which mainly centers on optimizing the flight route while considering different kinds of constraints in complicated battlefield environments. The original bat algorithm (BA) is used to solve the UCAV path planning problem. Furthermore, a new bat algorithm with mutation (BAM) is proposed to solve the UCAV path planning problem, in which a modification is applied to mutate between bats during the process of updating new solutions. The UCAV can then find a safe path by connecting the chosen coordinate nodes while avoiding the threat areas and minimizing fuel cost. This new approach can accelerate the global convergence speed while preserving the strong robustness of the basic BA. The realization procedure for the original BA and this improved metaheuristic approach, BAM, is also presented. To prove the performance of the proposed metaheuristic method, BAM is compared with BA and other population-based optimization methods, such as ACO, BBO, DE, ES, GA, PBIL, PSO, and SGA. The experiments show that the proposed approach is more effective and feasible for UCAV path planning than the other models.
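
    The abstract does not spell out the mutation operator. The sketch below implements a standard bat-algorithm position update on a generic continuous objective and adds an illustrative mutation step that mixes components between randomly paired bats; the parameter values and the mutation rule are assumptions, not the authors' exact BAM.

```python
import numpy as np

# Hedged sketch of a bat algorithm with a mutation step on a generic objective.
# Parameter values and the component-mixing mutation are illustrative assumptions.

def bam_minimize(obj, dim=10, n_bats=20, iters=200, fmin=0.0, fmax=2.0,
                 loudness=0.5, pulse_rate=0.5, pmut=0.1, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(-5.0, 5.0, (n_bats, dim))
    v = np.zeros((n_bats, dim))
    fit = np.array([obj(xi) for xi in x])
    best = x[fit.argmin()].copy()
    for _ in range(iters):
        for i in range(n_bats):
            f = fmin + (fmax - fmin) * rng.random()        # frequency
            v[i] += (x[i] - best) * f
            cand = x[i] + v[i]
            if rng.random() > pulse_rate:                   # local walk around the best bat
                cand = best + 0.01 * rng.normal(size=dim)
            if rng.random() < pmut:                         # mutation: mix with a random bat
                j = rng.integers(n_bats)
                mask = rng.random(dim) < 0.5
                cand = np.where(mask, cand, x[j])
            fc = obj(cand)
            if fc <= fit[i] and rng.random() < loudness:    # accept the new solution
                x[i], fit[i] = cand, fc
            if fc <= fit.min():                             # track the global best
                best = cand.copy()
    return best, obj(best)

if __name__ == "__main__":
    sphere = lambda z: float(np.sum(z * z))
    print(bam_minimize(sphere))
```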

  16. Hybrid ANFIS with ant colony optimization algorithm for prediction of shear wave velocity from a carbonate reservoir in Iran

    Directory of Open Access Journals (Sweden)

    Hadi Fattahi

    2016-12-01

    Shear wave velocity (Vs) data are key information for petrophysical, geophysical and geomechanical studies. Although compressional wave velocity (Vp) measurements exist in almost all wells, shear wave velocity was not recorded for most older wells due to a lack of technological tools. Furthermore, measurement of shear wave velocity is somewhat costly. This study proposes a novel methodology to overcome the aforementioned problems by using a hybrid adaptive neuro-fuzzy inference system (ANFIS) with the ant colony optimization algorithm (ACO), based on fuzzy c-means clustering (FCM) and subtractive clustering (SCM). The ACO is combined with two ANFIS models to determine the optimal values of their user-defined parameters. The optimization carried out by the ACO significantly improves the generalization ability of the ANFIS models. These models are used in this study to transform conventional well log data into Vs in a quick, cheap, and accurate manner. A total of 3030 data points was used for model construction and 833 data points were employed for assessment of the ANFIS models. Finally, a comparison among the ANFIS models and six well-known empirical correlations demonstrated that the ANFIS models outperformed the other methods. This strategy was successfully applied in the Marun reservoir, Iran.

  17. Do accountable care organizations (ACOs) help or hinder primary care physicians' ability to deliver high-quality care?

    Science.gov (United States)

    Berenson, Robert A; Burton, Rachel A; McGrath, Megan

    2016-09-01

    Many view advanced primary care models such as the patient-centered medical home as foundational for accountable care organizations (ACOs), but it remains unclear how these two delivery reforms are complementary and how they may produce conflict. The objective of this study was to identify how joining an ACO could help or hinder a primary care practice's efforts to deliver high-quality care. This qualitative study involved interviews with a purposive sample of 32 early adopters of advanced primary care and/or ACO models, drawn from across the U.S. and conducted in mid-2014. Interview notes were coded using qualitative data analysis software, permitting topic-specific queries which were then summarized. Respondents perceived many potential benefits of joining an ACO, including care coordination staff, data analytics, and improved communication with other providers. However, respondents were also concerned about added "bureaucratic" requirements, referral restrictions, and a potential inability to recoup investments in practice improvements. Interviewees generally thought joining an ACO could complement a practice's efforts to deliver high-quality care, yet noted some concerns that could undermine these synergies. Both the advantages and disadvantages of joining an ACO seemed exacerbated for small practices, since they are most likely to benefit from additional resources yet are most likely to chafe under added bureaucratic requirements. Our identification of the potential pros and cons of joining an ACO may help providers identify areas to examine when weighing whether to enter into such an arrangement, and may help ACOs identify potential areas for improvement. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Tailoring Systems Engineering Processes in a Conceptual Design Environment: A Case Study at NASA Marshall Spaceflight Center's ACO

    Science.gov (United States)

    Mulqueen, John; Maples, C. Dauphne; Fabisinski, Leo, III

    2012-01-01

    This paper provides an overview of Systems Engineering as it is applied in a conceptual design space systems department at the National Aeronautics and Space Administration (NASA) Marshall Spaceflight Center (MSFC) Advanced Concepts Office (ACO). Engineering work performed in NASA MSFC's ACO is targeted toward the Exploratory Research and Concepts Development life cycle stages, as defined in the International Council on Systems Engineering (INCOSE) Systems Engineering Handbook. This paper addresses three ACO Systems Engineering tools that correspond to three INCOSE Technical Processes: Stakeholder Requirements Definition, Requirements Analysis, and Integration, as well as one Project Process, Risk Management. These processes are used to facilitate, streamline, and manage systems engineering processes tailored for the earliest two life cycle stages, which is the environment in which ACO engineers work. The role of systems engineers and systems engineering as performed in ACO is explored in this paper. The need for tailoring Systems Engineering processes, tools, and products in the ever-changing engineering services ACO provides to its customers is addressed.

  19. Housing, Transportation, And Food: How ACOs Seek To Improve Population Health By Addressing Nonmedical Needs Of Patients.

    Science.gov (United States)

    Fraze, Taressa; Lewis, Valerie A; Rodriguez, Hector P; Fisher, Elliott S

    2016-11-01

    Addressing nonmedical needs, such as the need for housing, is critical to advancing population health, improving the quality of care, and lowering the costs of care. Accountable care organizations (ACOs) are well positioned to address these needs. We used qualitative interviews with ACO leaders and site visits to examine how these organizations addressed the nonmedical needs of their patients, and the extent to which they did so. We developed a typology of medical and social services integration among ACOs that disentangles service and organizational integration. We found that the nonmedical needs most commonly addressed by ACOs were transportation, housing, and food insecurity. ACOs identified nonmedical needs through processes that were part of the primary care visit or care transformation programs. Approaches to meeting patients' nonmedical needs were either individualized solutions (developed patient by patient) or targeted approaches (programs developed to address specific needs). As policy makers continue to provide incentives for health care organizations to meet a broader spectrum of patients' needs, these findings offer insights into how health care organizations such as ACOs integrate themselves with nonmedical organizations. Project HOPE—The People-to-People Health Foundation, Inc.

  20. [ACOS: Clinical and functional features].

    Science.gov (United States)

    Sobko, E A; Chubarova, S V; Demko, I V; Loktionova, M M; Ishchenko, O P; Solovyeva, I A; Kraposhina, A Yu; Gordeeva, N V

    To investigate the clinical and functional parameters in patients with asthma-chronic obstructive pulmonary disease overlap syndrome (ACOS) versus those with chronic obstructive pulmonary disease (COPD) and asthma. A total of 129 people were examined. 51 patients with ACOS were followed up in Group 1; Group 2 included 38 patients with severe asthma; Group 3 consisted of 40 patients with severe COPD. All the patients underwent clinical examination: history data collection, physical examination, evaluation of disease symptoms, and study of respiratory function (spirometry, body plethysmography). ACOS is clinically characterized by considerable demands for emergency drugs and by more frequent asthmatic fits and exacerbations, which require hospitalization. The parameters of bronchial resistance in ACOS were established to be increased throughout the follow-up period and to be comparable with those in patients with COPD. In the patients with ACOS, the severity of pulmonary hyperinflation was associated with increased demands for emergency drugs (r=0.59; p=0.015). Fixed bronchial obstruction in ACOS can be caused by smoking intensity and duration associated with increased bronchial resistance in expiration (r=0.51; p=0.003) and intrathoracic volume (r=0.71; p=0.0001); as well as increased body mass index (p<0.001) and disease duration, which were interrelated with a reduction in the forced expiratory volume in one second/forced vital capacity ratio (r=-0.63; p=0.001 and r=-0.71; p=0.0034, respectively). Patients with ACOS show more severe clinical manifestations and a substantial increase in functional residual capacity and intrathoracic volume throughout the follow-up period, suggesting that the distal bronchi are impaired and pulmonary hyperinflation develops.

  1. Ant Colony Optimization Algorithm to Dynamic Energy Management in Cloud Data Center

    Directory of Open Access Journals (Sweden)

    Shanchen Pang

    2017-01-01

    With the wide deployment of cloud computing data centers, the problem of power consumption has become increasingly prominent. The dynamic energy management problem in pursuit of energy efficiency in cloud data centers is investigated. Specifically, a dynamic energy management system model for cloud data centers is built; this system is composed of a DVS Management Module, a Load Balancing Module, and a Task Scheduling Module. For the Task Scheduling Module, the scheduling process is analyzed with a Stochastic Petri Net, and a task-oriented resource allocation method (LET-ACO) is proposed, which optimizes the running time of the system and the energy consumption by scheduling tasks. Simulation studies confirm the effectiveness of the proposed system model. The simulation results also show that, compared to the ACO, Min-Min, and RR scheduling strategies, the proposed LET-ACO method can save up to 28%, 31%, and 40% of energy consumption, respectively, while meeting performance constraints.

  2. A Novel Algorithm of Quantum Random Walk in Server Traffic Control and Task Scheduling

    Directory of Open Access Journals (Sweden)

    Dong Yumin

    2014-01-01

    A quantum random walk optimization model and algorithm for network cluster server traffic control and task scheduling is proposed. In order to solve the problem of server load balancing, we study the distribution theory of the energy field in quantum mechanics and apply it to data clustering. We introduce the random walk method and explain what a quantum random walk is. Here, we mainly study the standard model of the one-dimensional quantum random walk. For the data clustering problem in a high-dimensional space, we can decompose one m-dimensional quantum random walk into m one-dimensional quantum random walks. At the end of the paper, we compare the quantum random walk optimization method with GA (genetic algorithm), ACO (ant colony optimization), and SAA (simulated annealing algorithm). At the same time, we demonstrate its validity and rationality through analog and simulation experiments.
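
    The abstract refers to the standard one-dimensional quantum random walk. For readers unfamiliar with that model, the following is a minimal sketch of a discrete-time walk on a line with a Hadamard coin; the step count, lattice size, and initial coin state are illustrative and do not reproduce the authors' scheduling algorithm.

```python
import numpy as np

# Minimal sketch of the standard one-dimensional discrete-time quantum walk
# with a Hadamard coin; the number of steps and the initial state are illustrative.

def hadamard_walk(steps=100):
    n = 2 * steps + 1                      # positions -steps..steps
    # amp[c, x]: c = 0 (coin up, moves right), c = 1 (coin down, moves left)
    amp = np.zeros((2, n), dtype=complex)
    amp[0, steps] = 1.0 / np.sqrt(2)       # symmetric initial coin state at the origin
    amp[1, steps] = 1j / np.sqrt(2)
    H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
    for _ in range(steps):
        amp = H @ amp                      # coin toss at every position
        shifted = np.zeros_like(amp)
        shifted[0, 1:] = amp[0, :-1]       # coin-up component shifts right
        shifted[1, :-1] = amp[1, 1:]       # coin-down component shifts left
        amp = shifted
    return np.abs(amp[0]) ** 2 + np.abs(amp[1]) ** 2   # position distribution

if __name__ == "__main__":
    p = hadamard_walk(100)
    print(p.sum(), p.argmax() - 100)       # total probability ~1 and peak offset
```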

  3. Ant Colony Optimization (ACO) For The Traveling Salesman Problem (TSP) Using Partitioning

    Directory of Open Access Journals (Sweden)

    Alok Bajpai

    2015-08-01

    Ant colony optimization is a technique that was introduced in the 1990s and that can be applied to a variety of discrete combinatorial optimization problems and to continuous optimization. The ACO algorithm simulates the foraging behavior of real ants: solutions are constructed incrementally and a pheromone laying-and-following mechanism is realized, the pheromone serving as indirect communication among the ants. In this paper we introduce a partitioning technique, based on the divide-and-conquer strategy, for the traveling salesman problem, which is one of the most important combinatorial problems: the original problem is partitioned into a group of subproblems. We then apply the ant colony algorithm with a candidate list strategy to each smaller subproblem. After that, local optimization is applied and the subproblems are combined to find a good solution for the original problem, improving the exploration efficiency of the ants. At the end of this paper we also present a comparison of the results with the normal ant colony system for finding the optimal solution to the traveling salesman problem.
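
    As an illustration of the ingredients named above (probabilistic tour construction, a nearest-neighbour candidate list, and pheromone laying and following), here is a compact ACO sketch for a small TSP instance. The coordinates, parameter values, colony size, and the best-tour-only pheromone update are assumptions for illustration, not the paper's exact system.

```python
import math, random

# Hedged sketch of ACO for the TSP with a nearest-neighbour candidate list.
# Instance, parameters, and colony size are illustrative assumptions.

def aco_tsp(coords, n_ants=10, iters=100, alpha=1.0, beta=3.0, rho=0.5,
            cand_size=5, seed=0):
    rng = random.Random(seed)
    n = len(coords)
    d = [[math.dist(coords[i], coords[j]) or 1e-9 for j in range(n)] for i in range(n)]
    cand = [sorted(range(n), key=lambda j: d[i][j])[1:cand_size + 1] for i in range(n)]
    tau = [[1.0] * n for _ in range(n)]
    best_tour, best_len = None, float("inf")

    def tour_length(t):
        return sum(d[t[k]][t[(k + 1) % n]] for k in range(n))

    for _ in range(iters):
        for _ in range(n_ants):
            start = rng.randrange(n)
            tour, visited = [start], {start}
            while len(tour) < n:
                i = tour[-1]
                # prefer unvisited cities from the candidate list, else any unvisited city
                choices = [j for j in cand[i] if j not in visited] or \
                          [j for j in range(n) if j not in visited]
                weights = [(tau[i][j] ** alpha) * ((1.0 / d[i][j]) ** beta) for j in choices]
                j = rng.choices(choices, weights=weights)[0]
                tour.append(j); visited.add(j)
            L = tour_length(tour)
            if L < best_len:
                best_tour, best_len = tour, L
        # evaporate, then let the best-so-far tour deposit pheromone
        for i in range(n):
            for j in range(n):
                tau[i][j] *= (1.0 - rho)
        for k in range(n):
            a, b = best_tour[k], best_tour[(k + 1) % n]
            tau[a][b] += 1.0 / best_len
            tau[b][a] += 1.0 / best_len
    return best_tour, best_len

if __name__ == "__main__":
    pts = [(random.random() * 100, random.random() * 100) for _ in range(20)]
    print(aco_tsp(pts)[1])
```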

  4. Citrus CitNAC62 cooperates with CitWRKY1 to participate in citric acid degradation via up-regulation of CitAco3.

    Science.gov (United States)

    Li, Shao-Jia; Yin, Xue-Ren; Wang, Wen-Li; Liu, Xiao-Fen; Zhang, Bo; Chen, Kun-Song

    2017-06-15

    Citric acid is the predominant organic acid of citrus fruit. Degradation of citric acid occurs during fruit development, influencing fruit acidity. Associations of CitAco3 transcripts and citric acid degradation have been reported for citrus fruit. Here, transient overexpression of CitAco3 significantly reduced the citric acid content of citrus leaves and fruits. Using dual luciferase assays, it was shown that CitNAC62 and CitWRKY1 could transactivate the promoter of CitAco3. Subcellular localization results showed that CitWRKY1 was located in the nucleus and CitNAC62 was not. Yeast two-hybrid analysis and bimolecular fluorescence complementation (BiFC) assays indicated that the two differently located transcription factors could interact with each other. Furthermore, BiFC showed that the protein-protein interaction occurred only in the nucleus, indicating the potential mobility of CitNAC62 in plant cells. A synergistic effect on citrate content was observed between CitNAC62 and CitWRKY1. Transient overexpression of CitNAC62 or CitWRKY1 led to significantly lower citrate content in citrus fruit. The combined expression of CitNAC62 and CitWRKY1 resulted in lower citrate content compared with the expression of CitNAC62 or CitWRKY1 alone. The transcript abundance of CitAco3 was consistent with the citrate content. Thus, we propose that a complex of CitWRKY1 and CitNAC62 contributes to citric acid degradation in citrus fruit, potentially via modulation of CitAco3. © The Author 2017. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  5. Ant colony optimisation for economic dispatch problem with non-smooth cost functions

    Energy Technology Data Exchange (ETDEWEB)

    Pothiya, Saravuth; Kongprawechnon, Waree [School of Communication, Instrumentation and Control, Sirindhorn International Institute of Technology, Thammasat University, P.O. Box 22, Pathumthani (Thailand); Ngamroo, Issarachai [Center of Excellence for Innovative Energy Systems, Faculty of Engineering, King Mongkut' s Institute of Technology Ladkrabang, Bangkok 10520 (Thailand)

    2010-06-15

    This paper presents a novel and efficient optimisation approach based on ant colony optimisation (ACO) for solving the economic dispatch (ED) problem with non-smooth cost functions. In order to improve the performance of the ACO algorithm, three additional techniques, i.e. a priority list, variable reduction, and a zoom feature, are presented. To show its efficiency and effectiveness, the proposed ACO is applied to two types of ED problems with non-smooth cost functions. The first is the ED problem with valve-point loading effects, with test systems of 13 and 40 generating units; the second is the ED problem considering multiple fuels, with a test system of 10 units. Additionally, the results of the proposed ACO are compared with those of conventional heuristic approaches. The experimental results show that the proposed ACO approach is capable of obtaining higher-quality solutions in less computational time. (author)

  6. A Novel adaptative Discrete Cuckoo Search Algorithm for parameter optimization in computer vision

    Directory of Open Access Journals (Sweden)

    loubna benchikhi

    2017-10-01

    Full Text Available Computer vision applications require choosing operators and their parameters in order to provide the best outcomes. Often, users rely on expert knowledge and must try many combinations manually to find the best one. As performance, time, and accuracy are important, it is necessary to automate parameter optimization, at least for crucial operators. In this paper, a novel approach based on an adaptive discrete cuckoo search algorithm (ADCS) is proposed. It automates the process of setting the algorithms and provides optimal parameters for vision applications. This work reconsiders a discretization problem to adapt the cuckoo search algorithm and presents the procedure of parameter optimization. Experiments on real examples and comparisons with other metaheuristic-based approaches, particle swarm optimization (PSO), reinforcement learning (RL), and ant colony optimization (ACO), show the efficiency of this novel method.

  7. Cloud computing task scheduling strategy based on differential evolution and ant colony optimization

    Science.gov (United States)

    Ge, Junwei; Cai, Yu; Fang, Yiqiu

    2018-05-01

    This paper proposes a task scheduling strategy, DEACO, based on the combination of Differential Evolution (DE) and Ant Colony Optimization (ACO). Aiming at the problem that cloud computing task scheduling usually optimizes a single objective, it combines the shortest task completion time, cost, and load balancing. DEACO uses the solution of the DE to initialize the initial pheromone of ACO, which reduces the time spent collecting pheromone in the early stage of ACO, and it improves the pheromone updating rule through a load factor. The proposed algorithm is simulated on CloudSim and compared with Min-Min and ACO. The experimental results show that DEACO is superior in terms of time, cost, and load.
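
    The seeding step described above can be illustrated with a short sketch: differential evolution searches over continuous task-priority vectors, its best solution is decoded into a task-to-VM assignment, and that assignment biases the initial pheromone matrix handed to ACO. The problem sizes, task lengths, VM speeds, and the makespan-only objective below are illustrative assumptions, not the paper's model.

```python
import numpy as np
from scipy.optimize import differential_evolution

n_tasks, n_vms = 20, 4
rng = np.random.default_rng(1)
task_len = rng.uniform(10, 100, n_tasks)       # task lengths (assumed units)
vm_speed = rng.uniform(1, 4, n_vms)            # VM processing speeds (assumed units)

def decode(x):
    """Map a continuous genome to one VM index per task."""
    return np.clip(x.astype(int), 0, n_vms - 1)

def makespan(x):
    """Completion time of the slowest VM under the decoded assignment."""
    loads = np.zeros(n_vms)
    for t, v in enumerate(decode(x)):
        loads[v] += task_len[t] / vm_speed[v]
    return loads.max()

# Stage 1: DE finds a good assignment quickly.
result = differential_evolution(makespan, bounds=[(0, n_vms)] * n_tasks, seed=1, maxiter=50)
seed_assign = decode(result.x)

# Stage 2: that assignment seeds ACO's pheromone so early iterations are not spent
# accumulating pheromone from scratch; the paper additionally folds a load factor
# into the pheromone update rule.
tau0 = np.full((n_tasks, n_vms), 0.1)
tau0[np.arange(n_tasks), seed_assign] += 1.0
print(seed_assign, makespan(result.x))
```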

  8. Impact of Provider Participation in ACO Programs on Preventive Care Services, Patient Experiences, and Health Care Expenditures in US Adults Aged 18-64.

    Science.gov (United States)

    Hong, Young-Rock; Sonawane, Kalyani; Larson, Samantha; Mainous, Arch G; Marlow, Nicole M

    2018-05-15

    Little is known about the impact of accountable care organizations (ACOs) on US adults aged 18-64. To examine whether having a usual source of care (USC) provider participating in an ACO affects receipt of preventive care services, patient experiences, and health care expenditures among nonelderly Americans. A cross-sectional analysis of the 2015 Medical Organizations Survey linked with the Medical Expenditure Panel Survey. Survey respondents aged 18-64 with an identified USC and continuous health insurance coverage during 2015. Preventive care services (routine checkup, flu vaccination, and cancer screening), patient experiences with health care (access to care, interaction quality with providers, and global satisfaction), and health care expenditures (total and out-of-pocket expenditures) for respondents with a USC, by ACO and non-ACO provider groups. Among 1563 nonelderly Americans having a USC, we found that nearly 62.7% [95% confidence interval (CI), 58.6%-66.7%; representing 15,722,208 Americans] were cared for by ACO providers. Our analysis showed no significant differences in preventive care services or patient experiences between ACO and non-ACO groups. Adjusted mean total health expenditures were slightly higher for the ACO than the non-ACO group [$7016 (95% CI, $4949-$9914) vs. $6796 (95% CI, $4724-$9892)]; however, this difference was not statistically significant (P=0.250). Our findings suggest that having a USC provider participating in an ACO is not associated with preventive care services use, patient experiences, or health care expenditures among a nonelderly population.

  9. Quantifying Spatiotemporal Dynamics of Solar Radiation over the Northeast China Based on ACO-BPNN Model and Intensity Analysis

    Directory of Open Access Journals (Sweden)

    Xiangqian Li

    2017-01-01

    Full Text Available Reliable information on the spatiotemporal dynamics of solar radiation plays a crucial role in studies relating to global climate change. In this study, a new backpropagation neural network (BPNN) model optimized with an Ant Colony Optimization (ACO) algorithm was developed to generate the ACO-BPNN model, which had demonstrated superior performance for simulating solar radiation compared to traditional BPNN modelling, for Northeast China. On this basis, we applied an intensity analysis to investigate the spatiotemporal variation of solar radiation from 1982 to 2010 over the study region at three levels: interval, category, and conversion. Research findings revealed that (1) the solar radiation resource in the study region increased from the 1980s to the 2000s, and the average annual rate of variation from the 1980s to the 1990s was lower than that from the 1990s to the 2000s, and (2) the gains and losses of solar radiation at each level were in different conditions. The poor, normal, and comparatively abundant levels were transferred to higher levels, whereas the abundant level was transferred to lower levels. We believe our findings contribute to implementing ad hoc energy management strategies to optimize the use of solar radiation resources and provide scientific suggestions for policy planning.

  10. Construction of the Orsay ACO storage ring, 1: progress report. 2: detecting systems; La construction de l'anneau de stockage ACO d'Orsay, 1: etat des travaux. 2: systemes de detection

    Energy Technology Data Exchange (ETDEWEB)

    Blanc-Lapierre, A; Arzelier, G; Beck, R A; Belbeoch, R; Boutouyrie, B; Bruck, H; Burnod, L; Gendreau, G; Gratreau, P; Haissinski, J; Jolivot, R; Laborde, J; Le Duff, J; Leleux, G; Marin, P C; Milman, B; Sommer, E; Zyngier, H [Laboratoire de l'Accelerateur Lineaire, 91 - Orsay (France); Commissariat a l'Energie Atomique, Saclay (France). Centre d'Etudes Nucleaires]; Augustin, J E; Rumpf, F; Silva, E

    1966-07-01

    1. - A brief progress report is given, with mention of the difficulties solved since the 1963 Dubna Conference and the extra facilities which will be available on the ring (beam observation and measurements); 2. - Typical large solid angle detecting systems and equipment for accurate monitoring under construction at Orsay are also presented. Emphasis is put on some peculiar features of ACO as a tool for physics: a long experimental straight section and the short length of the bunches. (authors)

  11. Advanced Harmony Search with Ant Colony Optimization for Solving the Traveling Salesman Problem

    Directory of Open Access Journals (Sweden)

    Ho-Yoeng Yun

    2013-01-01

    Full Text Available We propose a novel heuristic algorithm based on the methods of advanced Harmony Search and Ant Colony Optimization (AHS-ACO) to effectively solve the Traveling Salesman Problem (TSP). The TSP, in general, is well known as an NP-complete problem whose computational complexity increases exponentially with the number of cities. In our algorithm, Ant Colony Optimization (ACO) is used to search for the local optimum in the solution space, followed by the use of Harmony Search to escape the local optimum determined by the ACO and to move towards a global optimum. Experiments were performed to validate the efficiency of our algorithm through a comparison with other algorithms and with the optimum solutions presented in the TSPLIB. The results indicate that our algorithm is capable of generating the optimum solution for most instances in the TSPLIB; moreover, our algorithm found better solutions in two cases (kroB100 and pr144) when compared with the optimum solution presented in the TSPLIB.
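
    The escape mechanism described above, using Harmony Search to perturb tours held in a harmony memory once ACO has settled into a local optimum, can be sketched as follows; the memory-consideration rate, pitch-adjustment rate, and the duplicate-repair step are illustrative assumptions rather than the authors' exact operator.

```python
import random

def harmony_escape(memory, hmcr=0.9, par=0.3, rng=None):
    """One Harmony Search improvisation over a memory of good ACO tours.
    memory: list of tours (each a permutation of range(n)); returns a new candidate tour."""
    rng = rng or random.Random(0)
    n = len(memory[0])
    new = []
    for pos in range(n):
        if rng.random() < hmcr:                  # memory consideration:
            new.append(rng.choice(memory)[pos])  # reuse the city some stored tour visits here
        else:
            new.append(rng.randrange(n))         # random improvisation
    missing = [c for c in range(n) if c not in new]
    seen = set()
    for i, c in enumerate(new):                  # repair duplicates into a valid tour
        if c in seen:
            new[i] = missing.pop()
        seen.add(new[i])
    if rng.random() < par:                       # pitch adjustment: a small 2-swap
        i, j = rng.sample(range(n), 2)
        new[i], new[j] = new[j], new[i]
    return new

# Example: improvise from a memory of two tours found by ACO on a 6-city instance.
print(harmony_escape([[0, 1, 2, 3, 4, 5], [0, 2, 1, 3, 5, 4]]))
```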

  12. ALOHA Cabled Observatory (ACO): Acoustic Doppler Current Profiler (ADCP): Velocity

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The University of Hawaii's ALOHA ("A Long-term Oligotrophic Habitat Assessment") Cabled Observatory (ACO) is located 100 km north of the island of Oahu, Hawaii (22...

  13. An Efficient Technique for Hardware/Software Partitioning Process in Codesign

    Directory of Open Access Journals (Sweden)

    Imene Mhadhbi

    2016-01-01

    Full Text Available Codesign methodology deals with the problem of designing complex embedded systems, where automatic hardware/software partitioning is one key issue. Research efforts on this issue are focused on exploring new automatic partitioning methods which consider only binary or extended partitioning problems. The main contribution of this paper is to propose a hybrid FCMPSO partitioning technique, based on the Fuzzy C-Means (FCM) and Particle Swarm Optimization (PSO) algorithms, suitable for mapping embedded applications onto both binary and multicore target architectures. Our FCMPSO optimization technique has been compared using different graphical models with a large number of instances. Performance analysis reveals that FCMPSO outperforms the PSO algorithm as well as the Genetic Algorithm (GA), Simulated Annealing (SA), Ant Colony Optimization (ACO), and FCM standard metaheuristic-based techniques, and also hybrid solutions including PSO then GA, GA then SA, GA then ACO, ACO then SA, FCM then GA, FCM then SA, and finally ACO followed by FCM.

  14. 178. Cardiac lymphoma: an infrequent neoplasm

    Directory of Open Access Journals (Sweden)

    E. Sandoval

    2010-01-01

    Conclusions: Cardiac lymphomas are highly aggressive. Their main location is the right atrium, with nonspecific and variable clinical manifestations. There is no clear consensus on the treatment of choice, but survival appears to be longer with combined treatment using anthracycline-based chemotherapy and radiotherapy.

  15. Simulation optimization based ant colony algorithm for the uncertain quay crane scheduling problem

    Directory of Open Access Journals (Sweden)

    Naoufal Rouky

    2019-01-01

    Full Text Available This work is devoted to the study of the Uncertain Quay Crane Scheduling Problem (QCSP), where the loading/unloading times of containers and the travel times of quay cranes are considered uncertain. The problem is solved with a Simulation Optimization approach, which takes advantage of the great possibilities offered by simulation to model the real details of the problem and of the capacity of optimization to find good-quality solutions. An Ant Colony Optimization (ACO) meta-heuristic hybridized with a Variable Neighborhood Descent (VND) local search is proposed to determine the assignments of tasks to quay cranes and the sequences of execution of tasks on each crane. Simulation is used inside the optimization algorithm to generate scenarios in agreement with the probability distributions of the uncertain parameters; thus, stochastic evaluations of the solutions found by each ant are carried out. The proposed optimization algorithm is first tested for the deterministic case on several well-known benchmark instances. Then, in the stochastic case, since no other work has studied exactly the same problem with the same assumptions, the Simulation Optimization approach is compared with the deterministic version. The experimental results show that the optimization algorithm is competitive compared to existing methods and that the solutions found by the Simulation Optimization approach are more robust than those found by the deterministic optimization algorithm.

  16. Ant colony optimisation-direct cover: a hybrid ant colony direct cover technique for multi-level synthesis of multiple-valued logic functions

    Science.gov (United States)

    Abd-El-Barr, Mostafa

    2010-12-01

    The use of non-binary (multiple-valued) logic in the synthesis of digital systems can lead to savings in chip area. Advances in very large scale integration (VLSI) technology have enabled the successful implementation of multiple-valued logic (MVL) circuits. A number of heuristic algorithms for the synthesis of (near) minimal sum-of-products (two-level) realisations of MVL functions have been reported in the literature. The direct cover (DC) technique is one such algorithm. The ant colony optimisation (ACO) algorithm is a meta-heuristic that uses constructive greediness to explore a large solution space in finding (near) optimal solutions. The ACO algorithm mimics the behaviour of ants in the real world in using the shortest path to reach food sources. We have previously introduced an ACO-based heuristic for the synthesis of two-level MVL functions. In this article, we introduce the ACO-DC hybrid technique for the synthesis of multi-level MVL functions. The basic idea is to use an ant to decompose a given MVL function into a number of levels and then synthesise each sub-function using a DC-based technique. The results obtained using the proposed approach are compared to those obtained using existing techniques reported in the literature. A benchmark set consisting of 50,000 randomly generated 2-variable 4-valued functions is used in the comparison. The results obtained using the proposed ACO-DC technique are shown to produce efficient realisations in terms of the average number of gates (as a measure of chip area) needed for the synthesis of a given MVL function.

  17. Proposed genetic algorithms for construction site lay out

    NARCIS (Netherlands)

    Mawdesley, Michael J.; Al-Jibouri, Saad H.S.

    2003-01-01

    The positioning of temporary facilities on a construction site is an area of research which has been recognised as important but which has received relatively little attention. In this paper, a genetic algorithm is proposed to solve the problem in which m facilities are to be positioned to n candidate locations.

  18. Ant colony algorithm for clustering in portfolio optimization

    Science.gov (United States)

    Subekti, R.; Sari, E. R.; Kusumawati, R.

    2018-03-01

    This research aims to describe portfolio optimization using clustering methods with an ant colony approach. Two stock portfolios of the Indonesian LQ45 index are proposed based on the cluster results obtained from ant colony optimization (ACO). The first portfolio consists of assets with ant colony displacement opportunities beyond the probability limits defined by the researcher, where the weight of each asset is determined by the mean-variance method. The second portfolio consists of two assets, with the assumption that each asset is a cluster formed by the ACO. The first portfolio has better performance than the second portfolio as measured by the Sharpe index.

  19. Proposed hybrid-classifier ensemble algorithm to map snow cover area

    Science.gov (United States)

    Nijhawan, Rahul; Raman, Balasubramanian; Das, Josodhir

    2018-01-01

    The metaclassification ensemble approach is known to improve the prediction performance for snow-covered area. The methodology adopted in this case is based on a neural network along with four state-of-the-art machine learning algorithms (support vector machine, artificial neural networks, spectral angle mapper, and K-means clustering) and a snow index, the normalized difference snow index. An AdaBoost ensemble algorithm based on decision trees for snow-cover mapping is also proposed. According to the available literature, these methods have rarely been used for snow-cover mapping. Employing the above techniques, a study was conducted for the Raktavarn and Chaturangi Bamak glaciers, Uttarakhand, Himalaya, using a multispectral Landsat 7 ETM+ (enhanced thematic mapper) image. The study also compares the results with those obtained from statistical combination methods (majority rule and belief functions) and with the accuracies of the individual classifiers. Accuracy assessment is performed by computing the quantity and allocation disagreement, analyzing statistical measures (accuracy, precision, specificity, AUC, and sensitivity), and receiver operating characteristic curves. A total of 225 combinations of parameters for the individual classifiers were trained and tested on the dataset, and the results were compared with the proposed approach. It was observed that the proposed methodology produced the highest classification accuracy (95.21%), close to the 94.01% produced by the proposed AdaBoost ensemble algorithm. From the sets of observations, it was concluded that the ensemble of classifiers produced better results compared to the individual classifiers.
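
    As a small illustration of the AdaBoost-on-decision-trees idea mentioned above, the sketch below trains scikit-learn's AdaBoostClassifier (whose default base learner is a depth-1 decision tree) on toy pixel spectra whose snow labels come from a simple NDSI threshold; the band layout, the 0.4 threshold, and the synthetic data are assumptions, not the study's Landsat imagery.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(2000, 6))                 # 6 reflectance bands per pixel
ndsi = (X[:, 1] - X[:, 4]) / (X[:, 1] + X[:, 4] + 1e-9)   # NDSI = (green - SWIR) / (green + SWIR)
y = (ndsi > 0.4).astype(int)                              # toy snow / no-snow labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = AdaBoostClassifier(n_estimators=100, random_state=0)  # boosted decision stumps
clf.fit(X_tr, y_tr)
print("held-out accuracy:", clf.score(X_te, y_te))
```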

  20. Bending The Spending Curve By Altering Care Delivery Patterns: The Role Of Care Management Within A Pioneer ACO.

    Science.gov (United States)

    Hsu, John; Price, Mary; Vogeli, Christine; Brand, Richard; Chernew, Michael E; Chaguturu, Sreekanth K; Weil, Eric; Ferris, Timothy G

    2017-05-01

    Accountable care organizations (ACOs) appear to lower medical spending, but there is little information on how they do so. We examined the impact of patient participation in a Pioneer ACO and its care management program on rates of emergency department (ED) visits and hospitalizations and on Medicare spending. We used data for the period 2009-14, exploiting naturally staggered program entry to create concurrent controls to help isolate the program effects. The care management program (the ACO's primary intervention) targeted beneficiaries with elevated but modifiable risks for future spending. ACO participation had a modest effect on spending, in line with previous estimates. Participation in the care management program was associated with substantial reductions in rates for hospitalizations and both all and nonemergency ED visits, as well as Medicare spending, when compared to preparticipation levels and to rates and spending for a concurrent sample of beneficiaries who were eligible for but had not yet started the program. Rates of ED visits and hospitalizations were reduced by 6 percent and 8 percent, respectively, and Medicare spending was reduced by 6 percent. Targeting beneficiaries with modifiable high risks and shifting care away from the ED represent viable mechanisms for altering spending within ACOs. Project HOPE—The People-to-People Health Foundation, Inc.

  1. 288. Cardiac pheochromocytoma at the level of the interatrial groove

    Directory of Open Access Journals (Sweden)

    C. Iglesias Gil

    2012-04-01

    The extra-adrenal location of pheochromocytomas is infrequent: the thoracic location accounts for fewer than 2% of cases, and an intrapericardial location is exceptional. Given that reports of catecholamine-secreting tumours at the cardiac level are scarce, we present this case.

  2. The development and psychometric validation of an instrument to evaluate nurses' attitudes towards communication with the patient (ACO).

    Science.gov (United States)

    Giménez-Espert, María Del Carmen; Prado-Gascó, Vicente Javier

    2018-05-01

    Patient communication is a key skill for nurses involved in clinical care. Its measurement is a complex phenomenon that can be addressed through attitude evaluation. To develop and psychometrically test a measure of nurses' attitudes towards communication with patients (ACO), to study the relationship between these dimensions, and to analyse nursing attitudes. To develop and psychometrically test the ACO questionnaire. All hospitals in the province of Valencia were invited by e-mail to distribute the ACO instrument. Ten hospitals took part in the study. The study population was composed of a convenience sample of 400 hospital nurses on general or special services. The inclusion criteria were nurses at the selected centres who had previously provided an informed consent to participate. A literature review and expert consultation (N = 10) was used to develop the content of the questionnaire. The 62-item version of the instrument was applied to a convenience sample of 400 nurses between May 2015 and March 2016. Factor structure was evaluated with exploratory and confirmatory factor analysis (EFA, CFA), and reliability was evaluated with Cronbach's alpha, composite reliability (CR), and average variance extracted (AVE). The final instrument (ACO), composed of 25 items grouped into three attitude dimensions (cognitive, affective and behavioural), had good psychometric properties. In the study sample, nurses had a favourable attitude towards communication. The cognitive and affective dimensions of the ACO should be able to predict the behaviour dimension. The ACO is useful for evaluating current clinical practices, identifying educational needs and assessing the effectiveness of communication training or other interventions intended to improve communication. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. Topology in two dimensions. II - The Abell and ACO cluster catalogues

    Science.gov (United States)

    Plionis, Manolis; Valdarnini, Riccardo; Coles, Peter

    1992-09-01

    We apply a method for quantifying the topology of projected galaxy clustering to the Abell and ACO catalogues of rich clusters. We use numerical simulations to quantify the statistical bias involved in using high peaks to define the large-scale structure, and we use the results obtained to correct our observational determinations for this known selection effect and also for possible errors introduced by boundary effects. We find that the Abell cluster sample is consistent with clusters being identified with high peaks of a Gaussian random field, but that the ACO shows a slight meatball shift away from the Gaussian behavior over and above that expected purely from the high-peak selection. The most conservative explanation of this effect is that it is caused by some artefact of the procedure used to select the clusters in the two samples.

  4. Prevalence of nursing diagnosis of decreased cardiac output and the predictive value of defining characteristics in patients under evaluation for heart transplant

    Directory of Open Access Journals (Sweden)

    Lígia Neres Matos

    2012-04-01

    Full Text Available The purposes of the study were to identify the prevalence of the defining characteristics (DC) of decreased cardiac output (DCO) in patients with cardiac insufficiency under evaluation for heart transplantation, and to ascertain the likelihood of the defining characteristics being predictive factors for the existence of a reduction in cardiac output. Data were obtained by retrospective documental analysis of the clinical records of right-sided heart catheterizations in 38 patients between 2004 and 2009. The results showed that 71.1% of the patients had decreased cardiac output (measured by cardiac index). The majority of the NANDA-International defining characteristics for DCO were more frequent in individuals with reduced cardiac index levels. The study emphasizes the odds ratios (OR) for increased systemic vascular resistance (OR=4.533), the third heart sound (OR=3.429), and reduced ejection fraction (OR=2.850). By obtaining the predictive values for the defining characteristics, the study identifies them as diagnostic indicators of decreased cardiac output.

  5. Betting on change: Tenet deal with Vanguard shows it's primed to try ACO effort, new payment model.

    Science.gov (United States)

    Kutscher, Beth

    2013-07-01

    Tenet Healthcare Corp.'s acquisition of Vanguard Health Systems is a sign the investor-owned chain is willing to take a chance on alternative payment models such as accountable care organizations. There's no certainty that ACOs will deliver the improvements on quality or cost savings, but Vanguard Vice Chairman Keith Pitts, left, says his system's Pioneer ACO in Detroit has already achieved some cost savings.

  6. Multi-criteria ACO-based Algorithm for Ship’s Trajectory Planning

    OpenAIRE

    Agnieszka Lazarowska

    2017-01-01

    The paper presents a new approach for solving a path planning problem for ships in an environment with static and dynamic obstacles. The algorithm utilizes a heuristic method, classified in the group of Swarm Intelligence approaches, called Ant Colony Optimization. The method is inspired by the collective behaviour of ant colonies. A group of agents (artificial ants) searches through the solution space in order to find a safe, optimal trajectory for a ship. The problem is considered as a ...

  7. Protocol for the management and selection of the cardiac donor

    Directory of Open Access Journals (Sweden)

    José Elizalde

    2008-01-01

    Full Text Available Heart transplantation is an accepted therapy for groups of patients with end-stage heart failure. The progressive ageing of donors and the reduction in the number of fatal head injuries make an exhaustive evaluation of each case mandatory. Echocardiography is nowadays an essential examination in donor assessment. The Organización Nacional de Trasplantes has developed a protocol for the management of the thoracic donor in order to optimize donor utilization. The selection criteria for the cardiac donor are reviewed, with emphasis on contraindications, screening for coronary disease, and maintenance during the selection process.

  8. Optic disc detection using ant colony optimization

    Science.gov (United States)

    Dias, Marcy A.; Monteiro, Fernando C.

    2012-09-01

    The retinal fundus images are used in the treatment and diagnosis of several eye diseases, such as diabetic retinopathy and glaucoma. This paper proposes a new method to detect the optic disc (OD) automatically, because knowledge of the OD location is essential for the automatic analysis of retinal images. Ant Colony Optimization (ACO) is an optimization algorithm inspired by the foraging behaviour of some ant species that has been applied in image processing for edge detection. Recently, ACO was used in fundus images to detect edges and, therefore, to segment the OD and other anatomical retinal structures. We present an algorithm for the detection of the OD in the retina which takes advantage of the Gabor wavelet transform, entropy, and the ACO algorithm. Forty images of the retina from the DRIVE database were used to evaluate the performance of our method.

  9. Paediatric heart transplantation: past, present and future

    Directory of Open Access Journals (Sweden)

    María-Teresa González-López

    2017-01-01

    Conclusion: Paediatric heart transplantation achieves excellent results in our setting, comparable to those of worldwide series. Although the risk profile is increasing, current results reflect the advances made in the management of these patients.

  10. Silencing of ACO decreases reproduction and energy metabolism in triazophos-treated female brown plant hoppers, Nilaparvata lugens Stål (Hemiptera: Delphacidae).

    Science.gov (United States)

    Liu, Zong-Yu; Jiang, Yi-Ping; Li, Lei; You, Lin-Lin; Wu, You; Xu, Bin; Ge, Lin-Quan; Wu, Jin-Cai

    2016-03-01

    The brown plant hopper (BPH), Nilaparvata lugens Stål (Hemiptera: Delphacidae), is a major pest affecting rice in Asia, and outbreaks of this pest are closely linked to pesticide-induced stimulation of reproduction. Therefore, the BPH is a classic example of a resurgent pest. However, the effects of different genes on the regulation of pesticide-induced reproductive stimulation in the BPH are unclear. In this study, the regulatory effects of acyl-coenzyme A oxidase (ACO) on the reproduction and biochemistry of the BPH were investigated with gene silencing. The number of eggs laid per female by triazophos (TZP)+dsACO BPH females was significantly lower than those of TZP-treated (without ACO silencing) or TZP+GFP females (negative control), with the number of eggs decreasing by 30.8% (from 529.5 to 366.3) and 32.0% (from 540.5 to 366.3), respectively. The preoviposition period, oviposition period, and longevity of the TZP-treated females were also influenced by dsACO treatment. Additionally, the amounts of crude fat, protein, and some fatty acids (oleic acid, palmitic acid, linoleic acid, stearic acid, and myristoleic acid) in TZP+dsACO females were significantly lower than in TZP-treated females. Thus, ACO is one of the key genes regulating the TZP-induced stimulation of reproduction in BPH females. Copyright © 2015 Elsevier Inc. All rights reserved.

  11. An Effective Mechanism for Virtual Machine Placement using Aco in IAAS Cloud

    Science.gov (United States)

    Shenbaga Moorthy, Rajalakshmi; Fareentaj, U.; Divya, T. K.

    2017-08-01

    Cloud computing provides an effective way to dynamically provide numerous resources to meet customer demands. A major challenge for cloud providers is designing efficient mechanisms for optimal virtual machine placement (OVMP). Such mechanisms enable cloud providers to effectively utilize their available resources and obtain higher profits. In order to provide appropriate resources to the clients, an optimal virtual machine placement algorithm is proposed. Virtual machine placement is an NP-hard problem, and such NP-hard problems can be solved using heuristic algorithms. In this paper, Ant Colony Optimization based virtual machine placement is proposed. Our proposed system focuses on minimizing the cost spent on each plan for hosting virtual machines in a multiple-cloud-provider environment, and the response time of each cloud provider is monitored periodically so as to minimize the delay in providing the resources to the users. The performance of the proposed algorithm is compared with a greedy mechanism. The proposed algorithm is simulated in the Eclipse IDE. The results clearly show that the proposed algorithm minimizes the cost, the response time, and also the number of migrations.
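
    A compact sketch of the ACO placement loop described above is given below; the provider prices, response times, the single pheromone matrix over (VM, provider) pairs, and the cost-plus-response-time objective are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(2)
n_vms, n_prov = 12, 3
price = rng.uniform(0.02, 0.10, n_prov)          # assumed price per VM-hour per provider
resp = rng.uniform(50, 300, n_prov)              # assumed monitored response time (ms)
eta = 1.0 / (price * resp)                       # heuristic: cheap and responsive is desirable
tau = np.ones((n_vms, n_prov))                   # pheromone on (VM, provider) pairs

def cost(plan, w=0.001):
    """Money spent plus a penalty on the worst provider response time."""
    return price[plan].sum() + w * resp[plan].max()

best, best_cost = None, np.inf
for _ in range(100):                             # iterations
    for _ in range(10):                          # ants
        p = tau * eta                            # alpha = beta = 1 for brevity
        plan = np.array([rng.choice(n_prov, p=p[v] / p[v].sum()) for v in range(n_vms)])
        c = cost(plan)
        if c < best_cost:
            best, best_cost = plan, c
    tau *= 0.9                                   # evaporation
    tau[np.arange(n_vms), best] += 1.0 / best_cost   # reinforce the best placement
print(best, round(best_cost, 4))
```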

  12. Applications of UV-storage ring free electron lasers: the case of super-ACO

    CERN Document Server

    Nahon, L; Couprie, Marie Emmanuelle; Merola, F; Dumas, P; Marsi, M; Taleb-Ibrahimi, A; Nutarelli, D; Roux, R; Billardon, M

    1999-01-01

    The potential of UV storage ring free electron lasers (SRFELs) for performing original application experiments is shown, with special emphasis on their combination with the naturally synchronized synchrotron radiation (SR). The first two-color FEL+SR experiment, performed in surface science at Super-ACO, is reported. The experimental parameters found to be the most important, as gathered from the acquired experience, are underlined and discussed. Finally, future prospects for the scientific program of the Super-ACO FEL are presented, with two-color experiments combining the FEL with SR undulator-based XUV and VUV beamlines as well as with an SR white-light bending-magnet beamline emitting in the IR-UV range (20 µm-0.25 µm).

  13. Low-peak-to-average power ratio and low-complexity asymmetrically clipped optical orthogonal frequency-division multiplexing uplink transmission scheme for long-reach passive optical network.

    Science.gov (United States)

    Zhou, Ji; Qiao, Yaojun

    2015-09-01

    In this Letter, we propose a discrete Hartley transform (DHT)-spread asymmetrically clipped optical orthogonal frequency-division multiplexing (DHT-S-ACO-OFDM) uplink transmission scheme in which the multiplexing/demultiplexing process also uses the DHT algorithm. By designing a simple encoding structure, the computational complexity of the transmitter can be reduced from O(N log2(N)) to O(N). At a probability of 10^-3, the peak-to-average power ratio (PAPR) of 2-ary pulse amplitude modulation (2-PAM)-modulated DHT-S-ACO-OFDM is approximately 9.7 dB lower than that of 2-PAM-modulated conventional ACO-OFDM. To verify the feasibility of the proposed scheme, a 4-Gbit/s DHT-S-ACO-OFDM uplink transmission scheme with a 1:64 way split has been experimentally implemented using 100-km standard single-mode fiber (SSMF) for a long-reach passive optical network (LR-PON).
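
    To make the signal structure concrete, here is a toy transmitter sketch assuming an FFT size of 256 and using the identity DHT(x) = Re(FFT(x)) - Im(FFT(x)): 2-PAM symbols are DHT-precoded ("spread"), mapped onto the odd subcarriers with Hermitian symmetry (the ACO-OFDM rule), the negative samples are clipped, and the block's PAPR is reported. It is an illustration of the general scheme, not the paper's encoder.

```python
import numpy as np

def dht(x):
    """Discrete Hartley transform via DHT(x) = Re(FFT(x)) - Im(FFT(x))."""
    X = np.fft.fft(x)
    return X.real - X.imag

def papr_db(s):
    """Peak-to-average power ratio of one time-domain block, in dB."""
    p = np.abs(s) ** 2
    return 10 * np.log10(p.max() / p.mean())

N = 256                                          # IFFT size (assumed)
rng = np.random.default_rng(0)
pam = rng.choice([-1.0, 1.0], N // 4)            # 2-PAM symbols for the odd subcarriers
spread = dht(pam) / np.sqrt(len(pam))            # DHT precoding ("spreading")

X = np.zeros(N, dtype=complex)
X[1:N // 2:2] = spread                           # data on odd subcarriers only
X[N // 2 + 1:] = np.conj(X[1:N // 2][::-1])      # Hermitian symmetry -> real time signal
x = np.fft.ifft(X).real
x_aco = np.clip(x, 0, None)                      # asymmetric clipping of negative samples
print("PAPR of this block: %.2f dB" % papr_db(x_aco))
```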

  14. Free electron laser on the ACO storage ring

    International Nuclear Information System (INIS)

    Elleaume, P.

    1984-06-01

    This dissertation presents the design and characteristics of a Free Electron Laser built on the electron storage ring ACO at Orsay. The weak optical gain available (approximately 0.1% per pass) necessitated the use of an optical klystron instead of an undulator and the use of mirrors with extremely high reflectivity. The laser characteristics, spectra, micro- and macro-temporal structures, transverse structure, and power, are presented. They are in very good agreement with a classical theory based on the Lorentz force and Maxwell's equations [fr]

  15. Hypersensitivity to local anaesthetics--update and proposal of evaluation algorithm

    DEFF Research Database (Denmark)

    Thyssen, Jacob Pontoppidan; Menné, Torkil; Elberling, Jesper

    2008-01-01

    of patients suspected with immediate- and delayed-type immune reactions. Literature was examined using PubMed-Medline, EMBASE, Biosis and Science Citation Index. Based on the literature, the proposed algorithm may safely and rapidly distinguish between immediate-type and delayed-type allergic immune reactions....

  16. Secure Scientific Applications Scheduling Technique for Cloud Computing Environment Using Global League Championship Algorithm.

    Science.gov (United States)

    Abdulhamid, Shafi'i Muhammad; Abd Latiff, Muhammad Shafie; Abdul-Salaam, Gaddafi; Hussain Madni, Syed Hamid

    2016-01-01

    A cloud computing system is a huge cluster of interconnected servers residing in a datacenter and dynamically provisioned to clients on demand via a front-end interface. Scientific applications scheduling in the cloud computing environment is identified as an NP-hard problem due to the dynamic nature of heterogeneous resources. Recently, a number of metaheuristic optimization schemes have been applied to address the challenges of applications scheduling in the cloud system, without much emphasis on the issue of secure global scheduling. In this paper, a scientific applications scheduling technique using the Global League Championship Algorithm (GBLCA) optimization technique is first presented for global task scheduling in the cloud environment. The experiment is carried out using the CloudSim simulator. The experimental results show that the proposed GBLCA technique produced a remarkable performance improvement rate on the makespan, ranging from 14.44% to 46.41%. It also shows a significant reduction in the time taken to securely schedule applications, as parametrically measured in terms of the response time. In view of the experimental results, the proposed technique provides a better-quality scheduling solution that is suitable for scientific application task execution in the Cloud Computing environment than the MinMin, MaxMin, Genetic Algorithm (GA) and Ant Colony Optimization (ACO) scheduling techniques.

  18. Interdisciplinarity: between the desire and the practice of heart transplant professionals at the Instituto Dante Pazzanese de Cardiologia

    OpenAIRE

    Santos, Nadja Maria Codá dos

    2005-01-01

    This dissertation has the general objective of analysing interdisciplinary practice over the 13 years of heart transplantation at the Instituto Dante Pazzanese de Cardiologia (IDPC). Its specific objectives were to recover the history of heart transplantation at the IDPC; to analyse the professionals' perspectives on the contribution of their own profession, and of the other professions, to interdisciplinary work; and to study the contribution of Social Work to interdisciplinary practice. In the practice of heart transplantation, ...

  19. Towards the Fourier limit on the super-ACO Storage Ring FEL

    International Nuclear Information System (INIS)

    Couprie, M.E.; De Ninno, G.; Moneron, G.; Nutarelli, D.; Hirsch, M.; Garzella, D.; Renault, E.; Roux, R.; Thomas, C.

    2001-01-01

    Systematic studies on the Free Electron Laser (FEL) line and micropulse have been performed on the Super-ACO storage ring FEL with a monochromator and a double-sweep streak camera under various conditions of operation (detuning, 'CW' and Q-switched mode). From these data, it appears that the FEL is usually operated very close to the Fourier limit.

  1. Intelligent Search Method Based ACO Techniques for a Multistage Decision Problem EDP/LFP

    Directory of Open Access Journals (Sweden)

    Mostefa RAHLI

    2006-07-01

    the algorithm, assigning it a preference rating that is more or less justifiable. In operational research, this subject is known under the name of combinatorial problem optimization (CPO) [14]. The choice of a numerical method to use for a merged case study and calculation of the LFP/Fitting/EDP [7, 8, 9, 10, 18, 19, 20] (which is the theoretical form of the problem) conditions the final decision to adopt a strategy of optimal production (which is the practical form of the problem and the final, most wanted task). Each method is characterized by:
    · the algorithm's complexity;
    · in an application gathering all calculations, the number of uses of the method compared to the total number of later calls;
    · the maximum number of iterations for a given use;
    · the maximum iteration count allowed for this kind of algorithm;
    · the limitations of the algorithm, such as its applicability (whether the algorithm is adapted to the problem or not), whether the problem is constrained or not, the problem dimension or order N (N ≤ Nmax), and the algorithm's stability.
    It is well known that, for an approximate calculation method, the propagation of errors strongly conditions the need to make an adequate choice of method and to decide whether it can be adopted over others for the same area. The larger the number of elementary operations, the more the final result loses precision, especially if the aim of the study is a responsible decision to make while satisfying multiple constraints and conditions. Our study proposes an inference-based solution (AI) with the use of the ACO (Ant Colony Optimization) technique.

  2. Proposal of an Algorithm to Synthesize Music Suitable for Dance

    Science.gov (United States)

    Morioka, Hirofumi; Nakatani, Mie; Nishida, Shogo

    This paper proposes an algorithm for synthesizing music suited to the emotions in moving pictures. Our goal is to support multimedia content creation: web page design, animation films, and so on. Here we adopt a human dance as the moving picture to examine the applicability of our method, because we think a dance image has a high affinity with music. The algorithm is composed of three modules: the first computes emotions from an input dance image, the second computes emotions from music in the database, and the last selects music suitable for the input dance via an interface of emotion.

  3. An effective inversion algorithm for retrieving bimodal aerosol particle size distribution from spectral extinction data

    International Nuclear Information System (INIS)

    He, Zhenzong; Qi, Hong; Yao, Yuchen; Ruan, Liming

    2014-01-01

    The Ant Colony Optimization algorithm based on the probability density function (PDF-ACO) is applied to estimate the bimodal aerosol particle size distribution (PSD). The direct problem is solved by the modified Anomalous Diffraction Approximation (ADA, an approximation for optically large and soft spheres, i.e., χ≫1 and |m−1|≪1) and the Beer–Lambert law. First, a popular bimodal aerosol PSD and three other bimodal PSDs are retrieved under the dependent model by the multi-wavelength extinction technique. All the results reveal that the PDF-ACO algorithm can be used as an effective technique to investigate the bimodal PSD. Then, the Johnson's S_B (J-S_B) function and the modified beta (M-β) function are employed as general distribution functions to retrieve the bimodal PSDs under the independent model. Finally, the J-S_B and M-β functions are applied to recover actual measured aerosol PSDs over Beijing and Shanghai obtained from the aerosol robotic network (AERONET). The numerical simulation and experimental results demonstrate that these two general functions, especially the J-S_B function, can be used as versatile distribution functions to retrieve the bimodal aerosol PSD when no a priori information about the PSD is available. - Highlights: • Bimodal PSDs are retrieved accurately by ACO based on the probability density function. • The J-S_B and M-β functions can be used as versatile functions to recover bimodal PSDs. • Bimodal aerosol PSDs can be estimated more reasonably by the J-S_B function

  4. Clinical case from the Department of Psychiatry: bipolar I affective disorder, manic episode

    Directory of Open Access Journals (Sweden)

    José Manuel Calvo Gómez

    1995-04-01

    Full Text Available This is the case of a 21-year-old patient with a clinical picture of one week's evolution whose characteristics meet the diagnostic criteria of the Fourth Edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV) for bipolar I affective disorder, severe manic episode with mood-congruent psychotic features. Diagnostic criteria for bipolar I affective disorder: A. Currently (or recently) in a manic episode. B. There has previously been at least one major depressive episode, manic episode, or mixed episode. C. The affective episodes of criteria A and B are not produced by a schizoaffective disorder and are not superimposed on schizophrenia, schizophreniform disorder, delusional disorder, or psychotic disorder not otherwise specified.

  5. Combination of the Firefly Algorithm and Tabu Search for Solving the Traveling Salesman Problem

    Directory of Open Access Journals (Sweden)

    Riyan Naufal Hay's

    2017-07-01

    Full Text Available The Traveling Salesman Problem (TSP) is a classic combinatorial optimization problem and plays a role in planning, scheduling, and search in engineering and science (Dong, 2012). The TSP is also a good object for testing the performance of optimization methods; several methods, such as the Cooperative Genetic Ant System (CGAS) (Dong, 2012), the Parallelized Genetic Ant Colony System (PGAS), Particle Swarm Optimization and Ant Colony Optimization Algorithms (PSO–ACO) (Elloumi, 2014), and Ant Colony Hyper-Heuristics (ACO HH) (Aziz, 2015), have been developed to solve the TSP. Therefore, in this study a new combination of methods is implemented to improve the accuracy of TSP solutions. The Firefly Algorithm (FA) is one algorithm that can be used to solve combinatorial optimization problems (Layeb, 2014). FA is an algorithm with strong potential for solving optimization cases compared with existing algorithms, including Particle Swarm Optimization (Yang, 2010). However, FA has shortcomings in solving large-scale optimization problems (Baykasoğlu and Ozsoy, 2014). Tabu Search (TS) is an optimization method proven effective for solving large-scale optimization problems (Pedro, 2013). In this study, TS is applied to FA (FATS) to solve TSP instances. The FATS results are compared with a previous study, namely ACOHH. The comparison shows accuracy improvements of 0.89% on the Oliver30 dataset, 0.14% on Eil51, 3.81% on Eil76, and 1.27% on KroA100.

  6. Near infrared system coupled chemometric algorithms for enumeration of total fungi count in cocoa beans neat solution.

    Science.gov (United States)

    Kutsanedzie, Felix Y H; Chen, Quansheng; Hassan, Md Mehedi; Yang, Mingxiu; Sun, Hao; Rahman, Md Hafizur

    2018-02-01

    Total fungi count (TFC) is a quality indicator of cocoa beans which, when unmonitored, leads to quality and safety problems. Fourier transform near-infrared spectroscopy (FT-NIRS) combined with chemometric algorithms such as partial least squares (PLS), synergy interval-PLS (Si-PLS), synergy interval-genetic algorithm-PLS (Si-GAPLS), ant colony optimization-PLS (ACO-PLS), and competitive-adaptive reweighted sampling-PLS (CARS-PLS) was employed to predict TFC in cocoa bean neat solution. Model results were evaluated using the correlation coefficients of the prediction (Rp) and calibration (Rc), the root mean square error of prediction (RMSEP), and the ratio of the sample standard deviation to the RMSEP (RPD). The developed models' performance yielded 0.951≤Rp≤0.975 and 3.15≤RPD≤4.32. The models' prediction stability improved in the order of PLSACO-PLS

  7. Diagnostic certainty regarding mortality in a population of heart transplant patients

    Directory of Open Access Journals (Sweden)

    Marcos Amuchástegui

    2008-01-01

    Full Text Available ABSTRACT Introduction: Although morbidity and mortality in heart transplantation have been analysed extensively, most studies and mortality registries in transplanted patients are based on clinical data. Only isolated reports of autopsies in heart transplant patients exist in the literature. Objective: To determine the importance of performing anatomopathological studies for diagnosing the cause of death in a heart transplant programme. Material and methods: All heart transplant patients who died between January 1990 and January 2005 were included. The definitive diagnosis of the cause of death was corroborated by autopsy or solid-organ biopsy. The causes of death evaluated were early graft failure, cellular rejection, infection, graft vascular disease, neoplasia, and others. Results: During the study period, 73 patients underwent heart transplantation; of these, 31 died. Twelve autopsies and 7 solid-organ biopsies certifying the cause of death were obtained (61%). The most frequent cause of death was cellular rejection greater than grade III. In 12.9% of cases, the pathological findings differed from the clinically suspected cause of death. Conclusion: The clinicopathological information derived from post-mortem studies is an indicator of our standard of care and constitutes a fundamental pillar for the knowledge and future management of transplanted patients, so we consider the performance of autopsies in these patients to be of vital importance. REV ARGENT CARDIOL 2008;76:292-294.

  8. Ant colony optimization as a descriptor selection in QSPR modeling: Estimation of the λmax of anthraquinones-based dyes

    OpenAIRE

    Morteza Atabati; Kobra Zarei; Azam Borhani

    2016-01-01

    Quantitative structure–property relationship (QSPR) studies based on ant colony optimization (ACO) were carried out for the prediction of the λmax of 9,10-anthraquinone derivatives. ACO is a meta-heuristic algorithm which is derived from the observation of real ants and has been proposed for feature selection. After optimization of the 3D geometry of the structures by semi-empirical quantum-chemical calculation at the AM1 level, different descriptors were calculated with the HyperChem and Dragon software packages (1514 des...

  9. Integrating Medication Therapy Management (MTM) Services Provided by Community Pharmacists into a Community-Based Accountable Care Organization (ACO)

    Directory of Open Access Journals (Sweden)

    Brian Isetts

    2017-10-01

    Full Text Available (1) Background: As the U.S. healthcare system evolves from fee-for-service financing to global population-based payments designed to be accountable for both quality and total cost of care, the effective and safe use of medications is gaining increased importance. The purpose of this project was to determine the feasibility of integrating medication therapy management (MTM) services provided by community pharmacists into the clinical care teams and the health information technology (HIT) infrastructure for Minnesota Medicaid recipients of a 12-county community-based accountable care organization (ACO). (2) Methods: The continuous quality improvement evaluation methodology employed in this project was the context + mechanism = outcome (CMO) model, to account for the fact that programs only work insofar as they introduce promising ideas, solutions and opportunities in the appropriate social and cultural contexts. Collaborations between a 12-county ACO and 15 community pharmacies in Southwest Minnesota served as the social context for this feasibility study of MTM referrals to community pharmacists. (3) Results: All 15 community pharmacy sites were integrated into the HIT infrastructure through Direct Secure Messaging, and there were 32 recipients who received MTM services subsequent to referrals from the ACO at 5 of the 15 community pharmacies over a 1-year implementation phase. (4) Conclusion: At the conclusion of this project, an effective electronic communication and MTM referral system was activated, and consideration was given to community pharmacists providing MTM in future ACO shared savings agreements.

  10. Energy Efficiency Performance Improvements for Ant-Based Routing Algorithm in Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Adamu Murtala Zungeru

    2013-01-01

    Full Text Available The main problem for event gathering in wireless sensor networks (WSNs) is the restricted communication range for each node. Due to the restricted communication range and high network density, event forwarding in WSNs is very challenging and requires multihop data forwarding. Currently, the energy-efficient ant based routing (EEABR) algorithm, based on the ant colony optimization (ACO) metaheuristic, is one of the state-of-the-art energy-aware routing protocols. In this paper, we propose three improvements to the EEABR algorithm to further improve its energy efficiency. The improvements to the original EEABR are based on the following: (1) a new scheme to intelligently initialize the routing tables giving priority to neighboring nodes that simultaneously could be the destination, (2) intelligent update of routing tables in case of a node or link failure, and (3) reducing the flooding ability of ants for congestion control. The energy efficiency improvements are significant particularly for dynamic routing environments. Experimental results using the RMASE simulation environment show that the proposed method increases the energy efficiency by up to 9% and 64% in converge-cast and target-tracking scenarios, respectively, over the original EEABR without incurring a significant increase in complexity. The method is also compared and found to also outperform other swarm-based routing protocols such as sensor-driven and cost-aware ant routing (SC) and Beesensor.

  11. Edge detection in digital images using Ant Colony Optimization

    Directory of Open Access Journals (Sweden)

    Marjan Kuchaki Rafsanjani

    2015-11-01

    Full Text Available Ant Colony Optimization (ACO) is an optimization algorithm inspired by the behavior of real ant colonies, used to approximate the solutions of difficult optimization problems. In this paper, ACO is introduced to tackle the image edge detection problem. The proposed approach is based on the distribution of ants on an image; ants try to find possible edges by using a state transition function. Experimental results show that the proposed method is less sensitive to Gaussian noise than standard edge detectors and gives finer details and thinner edges when compared to earlier ant-based approaches.
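
    A rough sketch of the ant-based edge detection idea described above: ants move between pixels with a transition probability driven by pheromone and a local intensity-variation heuristic. The heuristic, the parameter values and the tiny test image are assumptions, not the paper's exact formulation.

    ```python
    import random

    ALPHA, BETA = 1.0, 2.0   # assumed pheromone / heuristic weights

    def local_variation(img, x, y):
        """Heuristic: intensity variation around pixel (x, y); larger near edges."""
        h, w = len(img), len(img[0])
        centre = img[y][x]
        diffs = [abs(img[ny][nx] - centre)
                 for ny in range(max(0, y - 1), min(h, y + 2))
                 for nx in range(max(0, x - 1), min(w, x + 2))]
        return sum(diffs) / max(len(diffs) - 1, 1)

    def transition(img, pher, pos):
        """Choose the next pixel with probability ~ pheromone^ALPHA * heuristic^BETA."""
        h, w = len(img), len(img[0])
        x, y = pos
        moves = [(nx, ny) for nx in (x - 1, x, x + 1) for ny in (y - 1, y, y + 1)
                 if 0 <= nx < w and 0 <= ny < h and (nx, ny) != pos]
        weights = [(pher[ny][nx] ** ALPHA) * ((local_variation(img, nx, ny) + 1e-6) ** BETA)
                   for nx, ny in moves]
        return random.choices(moves, weights=weights, k=1)[0]

    if __name__ == "__main__":
        img = [[10, 10, 200], [10, 10, 200], [10, 10, 200]]   # vertical edge
        pher = [[1.0] * 3 for _ in range(3)]
        pos = (0, 0)
        for _ in range(5):
            pos = transition(img, pher, pos)
            x, y = pos
            pher[y][x] += local_variation(img, x, y)          # deposit on informative pixels
        print(pos, pher)
    ```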

  12. Expression Study of LeGAPDH, LeACO1, LeACS1A, and LeACS2 in Tomato Fruit (Solanum lycopersicum)

    Directory of Open Access Journals (Sweden)

    Pijar Riza Anugerah

    2015-10-01

    Full Text Available Tomato is a climacteric fruit, which is characterized by a ripening-related increase of respiration and elevated ethylene synthesis. Ethylene is the key hormone in the ripening process of climacteric fruits. The objective of this research is to study the expression of three ethylene synthesis genes, LeACO1, LeACS1A and LeACS2, and a housekeeping gene, LeGAPDH, in ripening tomato fruit. Specific primers have been designed to amplify complementary DNA fragments of LeGAPDH (143 bp), LeACO1 (240 bp), LeACS1A (169 bp), and LeACS2 (148 bp) using polymerase chain reaction. Nucleotide BLAST results of the complementary DNA fragments show high similarity with LeGAPDH (NM_001247874.1), LeACO1 (NM_001247095.1), LeACS1A (NM_001246993.1), and LeACS2 (NM_001247249.1), respectively. Expression analysis showed that the LeACO1, LeACS1A, LeACS2, and LeGAPDH genes were expressed in ripening tomato fruit. The isolation methods, reference sequences, and primers used in this study can be used in future experiments to study the expression of genes responsible for ethylene synthesis using quantitative polymerase chain reaction and to design better strategies for controlling fruit ripening in agroindustry.

  13. The modification of hybrid method of ant colony optimization, particle swarm optimization and 3-OPT algorithm in traveling salesman problem

    Science.gov (United States)

    Hertono, G. F.; Ubadah; Handari, B. D.

    2018-03-01

    The traveling salesman problem (TSP) is a famous problem of finding the shortest tour that visits every vertex of a given set exactly once, except the first vertex. This paper discusses three modification methods to solve the TSP by combining Ant Colony Optimization (ACO), Particle Swarm Optimization (PSO) and the 3-Opt algorithm. The ACO is used to find the solution of the TSP, in which PSO is implemented to find the best values of the parameters α and β used in ACO. The 3-Opt is then used to reduce the total tour length of the feasible solution obtained by ACO. In the first modification, the 3-Opt is used to reduce the total tour length of the feasible solutions obtained at each iteration; in the second modification, 3-Opt is used to reduce the total tour length of the entire solution obtained at every iteration; and in the third modification, 3-Opt is used to reduce the total tour length of different solutions obtained at each iteration. Results are tested using 6 benchmark problems taken from TSPLIB by calculating the relative error to the best known solution as well as the running time. Among those modifications, only the second and third modifications give satisfactory results, although the second one needs more execution time compared to the third.
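
    The 3-Opt refinement step mentioned above can be sketched as a local search that, for every triple of cut points, tries the reconnections reachable by reversing the inner segments and keeps any improvement. This simplified variant and the random instance below are illustrative, not the authors' implementation.

    ```python
    import itertools, math, random

    def tour_length(tour, dist):
        return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

    def three_opt_pass(tour, dist):
        """One improvement pass of a simplified 3-opt: for each cut triple, try the
        reconnections reachable by reversing one or both inner segments."""
        best = tour[:]
        best_len = tour_length(best, dist)
        n = len(tour)
        for i, j, k in itertools.combinations(range(1, n), 3):
            a, b, c, d = best[:i], best[i:j], best[j:k], best[k:]
            for nb in (b, b[::-1]):
                for nc in (c, c[::-1]):
                    cand = a + nb + nc + d
                    cand_len = tour_length(cand, dist)
                    if cand_len + 1e-9 < best_len:
                        best, best_len = cand, cand_len
        return best, best_len

    if __name__ == "__main__":
        random.seed(0)
        pts = [(random.random(), random.random()) for _ in range(8)]
        dist = [[math.dist(p, q) for q in pts] for p in pts]
        tour = list(range(8))
        improved, length = three_opt_pass(tour, dist)
        print(round(tour_length(tour, dist), 3), "->", round(length, 3))
    ```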

  14. Estimation of cardiac output: usefulness in clinical practice. Available invasive and non-invasive monitoring

    OpenAIRE

    García, X.; Mateu, L.; Maynar, J.; Mercadal, J.; Ochagavía, A.; Ferrandiz, A.

    2011-01-01

    This review aims to deepen the understanding of cardiac output, its variables and its determinants, as well as to exhaustively review the different techniques available for its monitoring and to establish the situations in which knowledge of the cardiac output provides essential information for the management of the critically ill patient. The Fick technique, initially used to calculate patients' cardiac output, has nowadays been replaced in clinical practice by...

  15. Implementation of a Pilot ACO Payment Model and the Use of Discretionary and Non-Discretionary Cardiovascular Care

    Science.gov (United States)

    Colla, Carrie. H.; Goodney, Philip P.; Lewis, Valerie A.; Nallamothu, Brahmajee K.; Gottlieb, Daniel J.; Meara, Ellen R.

    2014-01-01

    Background Accountable care organizations (ACOs) seek to reduce growth in healthcare spending while ensuring high-quality care. We hypothesized that ACO implementation would selectively limit utilization of discretionary cardiovascular care (defined as care occurring in the absence of indications such as myocardial infarction or stroke), while maintaining high-quality care such as non-discretionary cardiovascular imaging and procedures. Methods and Results The intervention group was composed of fee-for-service Medicare patients (n=819,779) from 10 groups participating in a Medicare pilot ACO, the Physician Group Practice Demonstration (PGPD). Matched controls were patients (n=934,621) from non-participating groups in the same regions. We compared utilization of cardiovascular care before (2002-2004) and after (2005-2009) PGPD implementation, studying both discretionary and non-discretionary carotid and coronary imaging and procedures. Our main outcome measure was the difference in the proportion of patients treated with imaging and procedures, among patients of PGPD practices compared to patients in control practices, before and after PGPD implementation (difference-in-difference). For discretionary imaging, the difference-in-difference between PGPD practices and controls was not statistically significant for discretionary carotid imaging (0.17%; 95% CI -0.51% to 0.85%, p=0.595) or discretionary coronary imaging (-0.19%; 95% CI -0.73% to 0.35%, p=0.468). Similarly, the difference-in-difference was also minimal for discretionary carotid revascularization (0.003%; 95% CI -0.008% to 0.002%, p=0.705) and coronary revascularization (-0.02%, 95% CI -0.11% to 0.07%, p=0.06). The difference-in-difference associated with PGPD implementation was also essentially zero for non-discretionary cardiovascular imaging or procedures. Conclusions Implementation of a pilot ACO did not limit the utilization of discretionary or non-discretionary cardiovascular care in ten large health

  16. Transthoracic bioimpedance compared with magnetic resonance imaging in the assessment of cardiac output

    Directory of Open Access Journals (Sweden)

    Humberto Villacorta Junior

    Full Text Available BACKGROUND: Cardiac magnetic resonance imaging is considered the gold-standard method for calculating cardiac volumes. Transthoracic cardiac bioimpedance assesses cardiac output, but no studies have validated this measurement against magnetic resonance. OBJECTIVE: To evaluate the performance of transthoracic cardiac bioimpedance in calculating cardiac output, cardiac index and stroke volume, using magnetic resonance as the gold standard. METHODS: Thirty-one patients were evaluated, with a mean age of 56.7 ± 18 years, 18 (58%) of whom were male. Patients whose indication for cardiac magnetic resonance included assessment under pharmacological stress were excluded. The correlation between the methods was assessed by Pearson's coefficient, the dispersion of the absolute differences around the mean was shown by the Bland-Altman method, and the agreement between the methods was assessed by the intraclass correlation coefficient. RESULTS: Mean cardiac output by transthoracic bioimpedance and by magnetic resonance was 5.16 ± 0.9 and 5.13 ± 0.9 L/min, respectively. Good correlation was observed between the methods for cardiac output (r = 0.79; p = 0.0001), cardiac index (r = 0.74; p = 0.0001) and stroke volume (r = 0.88; p = 0.0001). The Bland-Altman plot showed small dispersion of the differences around the mean, with narrow limits of agreement. There was good agreement between the two methods by the intraclass correlation coefficient, with coefficients of 0.78, 0.73 and 0.88 for cardiac output, cardiac index and stroke volume, respectively (p < 0.0001 for all comparisons). CONCLUSION: Transthoracic cardiac bioimpedance proved accurate for calculating cardiac output when compared with cardiac magnetic resonance.
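
    For readers unfamiliar with the agreement statistics used here, a minimal sketch of the Pearson correlation and Bland-Altman limits of agreement follows; the cardiac output pairs are made-up values, not the study's data.

    ```python
    import math

    def pearson(x, y):
        """Pearson correlation coefficient between two equal-length samples."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    def bland_altman(x, y):
        """Mean bias and 95% limits of agreement between two measurement methods."""
        diffs = [a - b for a, b in zip(x, y)]
        n = len(diffs)
        bias = sum(diffs) / n
        sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
        return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

    if __name__ == "__main__":
        # Hypothetical cardiac output pairs (bioimpedance vs MRI), L/min
        bio = [5.1, 4.8, 6.0, 5.5, 4.9, 5.3]
        mri = [5.0, 4.9, 5.8, 5.6, 4.7, 5.2]
        print(round(pearson(bio, mri), 3), bland_altman(bio, mri))
    ```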

  17. Proposed Fuzzy-NN Algorithm with LoRa Communication Protocol for Clustered Irrigation Systems

    Directory of Open Access Journals (Sweden)

    Sotirios Kontogiannis

    2017-11-01

    Full Text Available Modern irrigation systems utilize sensors and actuators, interconnected together as a single entity. In such entities, A.I. algorithms are implemented, which are responsible for the irrigation process. In this paper, the authors present an irrigation Open Watering System (OWS) architecture that spatially clusters the irrigation process into autonomous irrigation sections. The authors' OWS implementation includes a Neuro-Fuzzy decision algorithm called FITRA, which originates from the Greek word for seed. In this paper, the FITRA algorithm is described in detail, as are experimentation results that indicate significant water conservation from the use of the FITRA algorithm. Furthermore, the authors propose a new communication protocol over LoRa radio as an alternative low-energy and long-range OWS cluster communication mechanism. The experimental scenarios confirm that the FITRA algorithm provides more efficient irrigation on clustered areas than existing non-clustered, time-scheduled or threshold-adaptive algorithms. This is due to the FITRA algorithm's frequent monitoring of environmental conditions, fuzzy and neural network adaptation, as well as adherence to past irrigation preferences.

  18. Hybrid Optimization of Object-Based Classification in High-Resolution Images Using Continuous Ant Colony Algorithm with Emphasis on Building Detection

    Science.gov (United States)

    Tamimi, E.; Ebadi, H.; Kiani, A.

    2017-09-01

    Automatic building detection from High Spatial Resolution (HSR) images is one of the most important issues in Remote Sensing (RS). Due to the limited number of spectral bands in HSR images, using other features will lead to improved accuracy. By adding these features, the presence probability of dependent features is increased, which leads to accuracy reduction. In addition, some parameters should be determined in Support Vector Machine (SVM) classification. Therefore, it is necessary to simultaneously determine the classification parameters and select independent features according to the image type. An optimization algorithm is an efficient method to solve this problem. On the other hand, pixel-based classification faces several challenges, such as producing salt-and-pepper results and high computational time in high dimensional data. Hence, in this paper, a novel method is proposed to optimize object-based SVM classification by applying a continuous Ant Colony Optimization (ACO) algorithm. The advantages of the proposed method are a relatively high automation level, independence of image scene and type, reduced post-processing for building edge reconstruction, and accuracy improvement. The proposed method was evaluated against pixel-based SVM and Random Forest (RF) classification in terms of accuracy. In comparison with optimized pixel-based SVM classification, the results showed that the proposed method improved the quality factor and overall accuracy by 17% and 10%, respectively. Also, in the proposed method, the Kappa coefficient was improved by 6% relative to RF classification. The processing time of the proposed method was relatively low because of the unit of image analysis (the image object). These results show the superiority of the proposed method in terms of time and accuracy.
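
    A minimal sketch of a continuous-domain ACO (in the spirit of ACO_R) tuning two SVM hyperparameters is shown below. The surrogate accuracy function is only a stand-in for the cross-validated object-based SVM evaluation the paper performs, and all parameter names and values are assumptions.

    ```python
    import math, random

    def surrogate_accuracy(log_C, log_gamma):
        """Stand-in for cross-validated SVM accuracy (a real run would train an SVM);
        this made-up landscape peaks near log_C = 2, log_gamma = -3."""
        return math.exp(-((log_C - 2.0) ** 2 + (log_gamma + 3.0) ** 2) / 4.0)

    def continuous_aco(obj, bounds, archive_size=10, ants=8, iters=40, q=0.3, xi=0.85):
        """Minimal ACO_R-style search: ants sample new points from Gaussians centred
        on archive solutions, weighted by solution rank."""
        archive = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(archive_size)]
        archive.sort(key=lambda s: -obj(*s))
        weights = [math.exp(-(r ** 2) / (2 * (q * archive_size) ** 2)) for r in range(archive_size)]
        for _ in range(iters):
            new = []
            for _ in range(ants):
                guide = random.choices(archive, weights=weights, k=1)[0]
                point = []
                for d, (lo, hi) in enumerate(bounds):
                    sigma = xi * sum(abs(s[d] - guide[d]) for s in archive) / (archive_size - 1)
                    point.append(min(hi, max(lo, random.gauss(guide[d], sigma + 1e-9))))
                new.append(point)
            archive = sorted(archive + new, key=lambda s: -obj(*s))[:archive_size]
        return archive[0]

    if __name__ == "__main__":
        random.seed(1)
        best = continuous_aco(surrogate_accuracy, bounds=[(-2, 6), (-8, 2)])
        print("best log10(C), log10(gamma):", [round(v, 2) for v in best])
    ```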

  19. Proposed algorithm for determining the delta intercept of a thermocouple psychrometer curve

    International Nuclear Information System (INIS)

    Kurzmack, M.A.

    1993-01-01

    The USGS Hydrologic Investigations Program is currently developing instrumentation to study the unsaturated zone at Yucca Mountain in Nevada. Surface-based boreholes up to 2,500 feet in depth will be drilled and then instrumented in order to define the water potential field within the unsaturated zone. Thermocouple psychrometers will be used to monitor the in-situ water potential. An algorithm is proposed for simply and efficiently reducing a six-wire thermocouple psychrometer voltage output curve to a single value, the delta intercept. The algorithm identifies a plateau region in the psychrometer curve and extrapolates a linear regression back to the initial start of relaxation. When properly conditioned for the measurements being made, the algorithm yields reasonable results even with incomplete or noisy psychrometer curves over a 1 to 60 bar range.
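
    The reduction described above (locate a plateau in the output curve, fit a line through it, extrapolate back to the start of relaxation) might look roughly like this sketch; the plateau criterion, window size and synthetic curve are illustrative assumptions, not the conditioning used by the USGS instrumentation.

    ```python
    def delta_intercept(times, voltages, t_relax_start, slope_tol=0.05, window=5):
        """Find a plateau (near-constant slope region), fit a least-squares line to it,
        and extrapolate back to the start of relaxation."""
        # 1. locate the first window whose point-to-point slopes are all small
        start = None
        for i in range(len(times) - window):
            slopes = [(voltages[j + 1] - voltages[j]) / (times[j + 1] - times[j])
                      for j in range(i, i + window - 1)]
            if max(abs(s) for s in slopes) <= slope_tol:
                start = i
                break
        if start is None:
            raise ValueError("no plateau found")
        xs, ys = times[start:start + window], voltages[start:start + window]
        # 2. least-squares line through the plateau
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
        a = my - b * mx
        # 3. extrapolate the fitted line back to the start of relaxation
        return a + b * t_relax_start

    if __name__ == "__main__":
        times = list(range(20))
        voltages = [30 - 2 * t for t in range(5)] + [20 - 0.02 * t for t in range(15)]
        print(round(delta_intercept(times, voltages, t_relax_start=4), 2))
    ```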

  20. Omalizumab (anti-IgE) therapy in the asthma-COPD overlap syndrome (ACOS) and its effects on circulating cytokine levels.

    Science.gov (United States)

    Yalcin, Arzu Didem; Celik, Betul; Yalcin, Ata Nevzat

    2016-06-01

    The term "asthma-chronic obstructive pulmonary disease (COPD) overlap syndrome" (ACOS) has been applied to the condition, in which a person has clinical features of both asthma and COPD. The patients (N = 10) were presented to our clinic with low lung function, limited reversibility of airway obstruction, hyperinflation, abnormal body composition, dyspnea and episodic wheezing. Based on the clinical and laboratory findings, the patients were diagnosed with ACOS. Patients' serum IL-2 (sIL-2), sIL-4 sIL-6, sIL-10, sIL-17, sTNF-α and sIFN-γ levels were investigated as an apoptotic marker and a marker for inflammation. Having undergone omalizumab treatment and a long-term (12 months) later, patients had a decreased IgE, fractional exhaled nitric oxide concentrations (FENO), eosinophil, neutrophils, macrophages, eosinophil cationic peptide (ECP) and sIL-4 levels. To our knowledge, this is the first documentation of omalizumab use in ACOS. We demonstrated decreased IL-4, allergic pulmonary symptoms (dyspnea, wheezing, bronchial hyper responsiveness) and migraine attacks in the patients.

  1. Current situation of asthma-COPD overlap syndrome (ACOS) in Chinese patients older than 40 years with airflow limitation: rationale and design for a multicenter, cross-sectional trial (study protocol).

    Science.gov (United States)

    Kang, Jian; Yao, Wanzhen; Cai, Baiqiang; Chen, Ping; Ling, Xia; Shang, Hongyan

    2016-12-01

    Asthma and chronic obstructive pulmonary disease (COPD) are frequently occurring chronic airway diseases, and the overlapping syndrome observed in the majority of patients has been recently defined as asthma-COPD overlap syndrome (ACOS) by the Global Initiative for Chronic Obstructive Lung Disease (GOLD, 2014) and the Global Initiative for Asthma (GINA, 2015). The proportion, features, and clinical practice of ACOS still remain elusive in China. We are conducting this multicenter, cross-sectional, observational study (NCT02600221) to investigate the distribution of chronic obstructive diseases in patients >40 years of age with chronic airflow limitation in China, along with determination of the main clinical practice and features of these diseases. The study will also explore the factors that may influence the exacerbations and severity of ACOS in Chinese patients (>40 years of age). A total of 2,000 patients (age ≥40 years; either sex) who are clinically diagnosed as having asthma, COPD/chronic bronchitis/emphysema, or ACOS for at least 12 months with airflow limitation [post-bronchodilator forced expiratory volume in 1 second/forced vital capacity (FEV1/FVC)] will be enrolled to describe the current situation, main clinical practice, and features of ACOS, asthma, and COPD conditions in Chinese patients. The insights will be helpful in designing optimal management strategies for ACOS and redefining healthcare development programs.

  2. A practical algorithm for optimal operation management of distribution network including fuel cell power plants

    Energy Technology Data Exchange (ETDEWEB)

    Niknam, Taher; Meymand, Hamed Zeinoddini; Nayeripour, Majid [Electrical and Electronic Engineering Department, Shiraz University of Technology, Shiraz (Iran)

    2010-08-15

    Fuel cell power plants (FCPPs) have been taken into a great deal of consideration in recent years. The continuing growth of power demand, together with environmental constraints, is increasing the interest in using FCPPs in power systems. Since FCPPs are usually connected to the distribution network, their effect on the distribution network is greater than on other sections of the power system. One of the most important issues in distribution networks is optimal operation management (OOM), which can be affected by FCPPs. This paper proposes a new approach for optimal operation management of distribution networks including FCPPs. In the article, we consider the total electrical energy losses, the total electrical energy cost and the total emission as the objective functions which should be minimized. Since the optimal operation of distribution networks is a nonlinear mixed-integer optimization problem, the optimal solution can be obtained through an evolutionary method. We use a new evolutionary algorithm based on Fuzzy Adaptive Particle Swarm Optimization (FAPSO) to solve the optimal operation problem and compare this method with Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Differential Evolution (DE), Ant Colony Optimization (ACO) and Tabu Search (TS) over two distribution test feeders. (author)
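
    As a simplified illustration of the optimization setting (not the paper's FAPSO), the sketch below runs a plain PSO with a linearly decreasing inertia weight over a made-up weighted sum of the three objectives (losses, cost, emission); the objective function and all constants are assumptions.

    ```python
    import random

    def objective(x):
        """Made-up weighted sum of losses, cost and emission for a 2-variable setting."""
        losses = (x[0] - 0.3) ** 2
        cost = (x[1] - 0.7) ** 2
        emission = (x[0] + x[1] - 1.0) ** 2
        return 0.4 * losses + 0.4 * cost + 0.2 * emission

    def pso(obj, dim=2, particles=15, iters=60, lo=0.0, hi=1.0):
        X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(particles)]
        V = [[0.0] * dim for _ in range(particles)]
        P = [x[:] for x in X]                       # personal bests
        g = min(P, key=obj)[:]                      # global best
        for t in range(iters):
            w = 0.9 - 0.5 * t / iters               # linearly decreasing inertia weight
            for i in range(particles):
                for d in range(dim):
                    V[i][d] = (w * V[i][d]
                               + 2.0 * random.random() * (P[i][d] - X[i][d])
                               + 2.0 * random.random() * (g[d] - X[i][d]))
                    X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
                if obj(X[i]) < obj(P[i]):
                    P[i] = X[i][:]
                    if obj(P[i]) < obj(g):
                        g = P[i][:]
        return g, obj(g)

    if __name__ == "__main__":
        random.seed(0)
        print(pso(objective))
    ```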

  3. Impact of cardiac risk on aneurysmectomy

    Directory of Open Access Journals (Sweden)

    Orestes Díaz Hernández

    2000-04-01

    Full Text Available A cardiological analysis was performed on 100 patients who underwent elective surgery for infrarenal abdominal aortic aneurysm. Five methods were used to this end: the original and modified Goldman cardiac risk index, Eagle's coronary markers, the indications by levels recommended by Hollier, and Cooperman's equation. Ischaemic heart disease was detected in 59% of the patients. Of these, 43% had a history of myocardial infarction, 33% had stable angina under drug treatment, 6% had congestive heart failure under digitalis treatment and 11% had a non-sinus rhythm detected on the electrocardiogram. Among those suffering from ischaemic heart disease, long-term survival was 80% at the 1st and 2nd year, 72% at the 5th year, and 35% at the 10th and 13th year. A simple, modified algorithm of cardiological assessment is proposed for patients with abdominal aortic aneurysm.

  4. An effective inversion algorithm for retrieving bimodal aerosol particle size distribution from spectral extinction data

    Science.gov (United States)

    He, Zhenzong; Qi, Hong; Yao, Yuchen; Ruan, Liming

    2014-12-01

    The Ant Colony Optimization algorithm based on the probability density function (PDF-ACO) is applied to estimate the bimodal aerosol particle size distribution (PSD). The direct problem is solved by the modified Anomalous Diffraction Approximation (ADA, an approximation for optically large and soft spheres, i.e., χ⪢1 and |m-1|⪡1) and the Beer-Lambert law. First, a popular bimodal aerosol PSD and three other bimodal PSDs are retrieved under the dependent model by the multi-wavelength extinction technique. All the results reveal that the PDF-ACO algorithm can be used as an effective technique to investigate the bimodal PSD. Then, the Johnson's SB (J-SB) function and the modified beta (M-β) function are employed as general distribution functions to retrieve the bimodal PSDs under the independent model. Finally, the J-SB and M-β functions are applied to recover actual measured aerosol PSDs over Beijing and Shanghai obtained from the Aerosol Robotic Network (AERONET). The numerical simulation and experimental results demonstrate that these two general functions, especially the J-SB function, can be used as versatile distribution functions to retrieve the bimodal aerosol PSD when no a priori information about the PSD is available.

  5. Construction Method of Display Proposal for Commodities in Sales Promotion by Genetic Algorithm

    Science.gov (United States)

    Yumoto, Masaki

    In a sales promotion task, a wholesaler prepares and presents a display proposal for commodities in order to negotiate with retailers' buyers about which commodities they should sell. To automate sales promotion tasks, the proposal has to be constructed according to the target retailer's buyer. However, it is difficult to construct a proposal suitable for the target retail store because of the huge number of commodity combinations. This paper proposes a construction method based on a Genetic Algorithm (GA). The proposed method represents initial display proposals for commodities as genes, improves them by GA according to an evaluation value, and rearranges the one with the highest evaluation value according to the classification of commodities. Through a practical experiment, we confirm that the display proposal produced by the proposed method is similar to one constructed by a wholesaler.
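
    A hedged sketch of the GA idea for assembling a display proposal: commodities are encoded as a binary gene string, fitness rewards total margin and penalizes exceeding shelf capacity, and standard one-point crossover and bit-flip mutation evolve the proposals. The margins, space requirements and capacity are hypothetical, not data from the paper.

    ```python
    import random

    random.seed(3)
    MARGIN = [random.uniform(1, 10) for _ in range(20)]   # hypothetical margin per commodity
    SPACE = [random.randint(1, 4) for _ in range(20)]     # hypothetical shelf space used
    CAPACITY = 25

    def fitness(genes):
        """Total margin of selected commodities, heavily penalized over capacity."""
        margin = sum(m for m, g in zip(MARGIN, genes) if g)
        space = sum(s for s, g in zip(SPACE, genes) if g)
        return margin - 5.0 * max(0, space - CAPACITY)

    def evolve(pop_size=30, gens=80, p_mut=0.05):
        pop = [[random.randint(0, 1) for _ in MARGIN] for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=fitness, reverse=True)
            parents = pop[:pop_size // 2]                       # truncation selection
            children = []
            while len(children) < pop_size - len(parents):
                a, b = random.sample(parents, 2)
                cut = random.randrange(1, len(a))               # one-point crossover
                child = a[:cut] + b[cut:]
                child = [1 - g if random.random() < p_mut else g for g in child]
                children.append(child)
            pop = parents + children
        return max(pop, key=fitness)

    if __name__ == "__main__":
        best = evolve()
        print(round(fitness(best), 2), [i for i, g in enumerate(best) if g])
    ```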

  6. [New methodological advances: algorithm proposal for management of Clostridium difficile infection].

    Science.gov (United States)

    González-Abad, María José; Alonso-Sanz, Mercedes

    2015-06-01

    Clostridium difficile infection (CDI) is considered the most common cause of healthcare-associated diarrhea and is also an etiologic agent of community diarrhea. The aim of this study was to assess the potential benefit of a test that simultaneously detects glutamate dehydrogenase (GDH) antigen and C. difficile toxin A/B, followed by detection of the C. difficile toxin B (tcdB) gene by PCR as a confirmatory assay on discrepant samples, and to propose a more efficient algorithm. From June 2012 to January 2013, at Hospital Infantil Universitario Niño Jesús, Madrid, stool samples were studied for the simultaneous detection of GDH and toxin A/B, and also for detection of toxin A/B alone. When results between GDH and toxin A/B were discordant, a single sample per patient was selected for detection of the C. difficile toxin B (tcdB) gene. A total of 116 samples (52 patients) were tested. Four were positive and 75 negative for toxigenic C. difficile (toxin A/B, alone or combined with GDH). C. difficile was detected in the remaining 37 samples but not toxin A/B, regardless of the method used, except in one. Twenty of the 37 specimens were further tested for the C. difficile toxin B (tcdB) gene and 7 were positive. The simultaneous detection of GDH and toxin A/B combined with PCR recovered undiagnosed cases of CDI. In accordance with our data, we propose a two-step algorithm: detection of GDH, followed by PCR on GDH-positive samples. This algorithm could provide a superior cost-benefit ratio in our population.

  7. A Family of ACO Routing Protocols for Mobile Ad Hoc Networks

    Science.gov (United States)

    Rupérez Cañas, Delfín; Sandoval Orozco, Ana Lucila; García Villalba, Luis Javier; Kim, Tai-hoon

    2017-01-01

    In this work, an ACO routing protocol for mobile ad hoc networks based on AntHocNet is specified. As its predecessor, this new protocol, called AntOR, is hybrid in the sense that it contains elements from both reactive and proactive routing. Specifically, it combines a reactive route setup process with a proactive route maintenance and improvement process. Key aspects of the AntOR protocol are the disjoint-link and disjoint-node routes, separation between the regular pheromone and the virtual pheromone in the diffusion process and the exploration of routes, taking into consideration the number of hops in the best routes. In this work, a family of ACO routing protocols based on AntOR is also specified. These protocols are based on protocol successive refinements. In this work, we also present a parallelized version of AntOR that we call PAntOR. Using programming multiprocessor architectures based on the shared memory protocol, PAntOR allows running tasks in parallel using threads. This parallelization is applicable in the route setup phase, route local repair process and link failure notification. In addition, a variant of PAntOR that consists of having more than one interface, which we call PAntOR-MI (PAntOR-Multiple Interface), is specified. This approach parallelizes the sending of broadcast messages by interface through threads. PMID:28531159

  8. A Family of ACO Routing Protocols for Mobile Ad Hoc Networks.

    Science.gov (United States)

    Rupérez Cañas, Delfín; Sandoval Orozco, Ana Lucila; García Villalba, Luis Javier; Kim, Tai-Hoon

    2017-05-22

    In this work, an ACO routing protocol for mobile ad hoc networks based on AntHocNet is specified. As its predecessor, this new protocol, called AntOR, is hybrid in the sense that it contains elements from both reactive and proactive routing. Specifically, it combines a reactive route setup process with a proactive route maintenance and improvement process. Key aspects of the AntOR protocol are the disjoint-link and disjoint-node routes, separation between the regular pheromone and the virtual pheromone in the diffusion process and the exploration of routes, taking into consideration the number of hops in the best routes. In this work, a family of ACO routing protocols based on AntOR is also specified. These protocols are based on protocol successive refinements. In this work, we also present a parallelized version of AntOR that we call PAntOR. Using programming multiprocessor architectures based on the shared memory protocol, PAntOR allows running tasks in parallel using threads. This parallelization is applicable in the route setup phase, route local repair process and link failure notification. In addition, a variant of PAntOR that consists of having more than one interface, which we call PAntOR-MI (PAntOR-Multiple Interface), is specified. This approach parallelizes the sending of broadcast messages by interface through threads.

  9. Hopfield-K-Means clustering algorithm: A proposal for the segmentation of electricity customers

    Energy Technology Data Exchange (ETDEWEB)

    Lopez, Jose J.; Aguado, Jose A.; Martin, F.; Munoz, F.; Rodriguez, A.; Ruiz, Jose E. [Department of Electrical Engineering, University of Malaga, C/ Dr. Ortiz Ramos, sn., Escuela de Ingenierias, 29071 Malaga (Spain)

    2011-02-15

    Customer classification aims at providing electric utilities with a volume of information to enable them to establish different types of tariffs. Several methods have been used to segment electricity customers, including, among others, the hierarchical clustering, Modified Follow the Leader and K-Means methods. These, however, entail problems with the pre-allocation of the number of clusters (Follow the Leader), randomness of the solution (K-Means) and improvement of the solution obtained (hierarchical algorithm). Another segmentation method used is Hopfield's autonomous recurrent neural network, although the solution obtained is only guaranteed to be a local minimum. In this paper, we present the Hopfield-K-Means algorithm in order to overcome these limitations. This approach eliminates the randomness of the initial solution provided by K-Means based algorithms and moves closer to the global optimum. The proposed algorithm is also compared against other customer segmentation and characterization techniques, on the basis of relative validation indexes. Finally, the results obtained by this algorithm with a set of 230 electricity customers (residential, industrial and administrative) are presented. (author)
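
    The sketch below is not the Hopfield-K-Means algorithm itself; it only illustrates the motivation by running plain K-Means with a deterministic farthest-point seeding, so that repeated runs on the same load profiles give the same segmentation. The toy customer profiles are invented.

    ```python
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    def farthest_point_seeds(data, k):
        """Deterministic seeding: start from the profile farthest from the mean,
        then repeatedly add the point farthest from the chosen seeds."""
        mean = [sum(col) / len(data) for col in zip(*data)]
        seeds = [max(data, key=lambda p: dist2(p, mean))]
        while len(seeds) < k:
            seeds.append(max(data, key=lambda p: min(dist2(p, s) for s in seeds)))
        return seeds

    def kmeans(data, k, iters=50):
        centres = farthest_point_seeds(data, k)
        for _ in range(iters):
            clusters = [[] for _ in range(k)]
            for p in data:
                clusters[min(range(k), key=lambda i: dist2(p, centres[i]))].append(p)
            centres = [[sum(col) / len(c) for col in zip(*c)] if c else centres[i]
                       for i, c in enumerate(clusters)]
        return centres, clusters

    if __name__ == "__main__":
        # Tiny hypothetical daily load profiles (4 time blocks per customer)
        profiles = [[1, 1, 5, 5], [1, 2, 5, 4], [4, 4, 1, 1], [5, 4, 1, 2], [3, 3, 3, 3]]
        centres, clusters = kmeans(profiles, k=2)
        print(centres)
    ```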

  10. Hopfield-K-Means clustering algorithm: A proposal for the segmentation of electricity customers

    International Nuclear Information System (INIS)

    Lopez, Jose J.; Aguado, Jose A.; Martin, F.; Munoz, F.; Rodriguez, A.; Ruiz, Jose E.

    2011-01-01

    Customer classification aims at providing electric utilities with a volume of information to enable them to establish different types of tariffs. Several methods have been used to segment electricity customers, including, among others, the hierarchical clustering, Modified Follow the Leader and K-Means methods. These, however, entail problems with the pre-allocation of the number of clusters (Follow the Leader), randomness of the solution (K-Means) and improvement of the solution obtained (hierarchical algorithm). Another segmentation method used is Hopfield's autonomous recurrent neural network, although the solution obtained is only guaranteed to be a local minimum. In this paper, we present the Hopfield-K-Means algorithm in order to overcome these limitations. This approach eliminates the randomness of the initial solution provided by K-Means based algorithms and moves closer to the global optimum. The proposed algorithm is also compared against other customer segmentation and characterization techniques, on the basis of relative validation indexes. Finally, the results obtained by this algorithm with a set of 230 electricity customers (residential, industrial and administrative) are presented. (author)

  11. A proposed simulated annealing algorithm for proportional parallel flow shops with separated setup times

    Directory of Open Access Journals (Sweden)

    Helio Yochihiro Fuchigami

    2014-08-01

    Full Text Available This article addresses the problem of minimizing makespan on two parallel flow shops with proportional processing and setup times. The setup times are separated and sequence-independent. The parallel flow shop scheduling problem is a specific case of the well-known hybrid flow shop, characterized by a multistage production system with more than one machine working in parallel at each stage. This situation is very common in various kinds of companies, such as chemical, electronics, automotive, pharmaceutical and food industries. This work proposes six Simulated Annealing algorithms, their perturbation schemes and an algorithm for initial sequence generation. In terms of methodology, this study can be classified as applied research by nature, exploratory in its objectives and experimental in its procedures, with a quantitative approach. The proposed algorithms were effective with regard to solution quality and computationally efficient. Results of the Analysis of Variance (ANOVA) revealed no significant difference between the perturbation schemes in terms of makespan. The use of the PS4 scheme, which moves a subsequence of jobs, is suggested, since it provided the best success rate. It was also found that there is a significant difference between the results of the algorithms for each value of the proportionality factor of the processing and setup times of the flow shops.
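
    A small sketch of simulated annealing on a deliberately simplified version of the problem (jobs with separated setup times assigned to two parallel lines, minimizing makespan), including a PS4-style perturbation that moves a short subsequence of jobs to the other line; the processing times, setup times and cooling schedule are assumptions, not the paper's model or parameters.

    ```python
    import math, random

    random.seed(7)
    PROC = [random.randint(5, 20) for _ in range(12)]   # hypothetical processing times
    SETUP = [random.randint(1, 4) for _ in range(12)]   # hypothetical separated setup times

    def makespan(assign):
        """assign[i] in {0, 1}: which of the two parallel lines runs job i."""
        loads = [0, 0]
        for job, line in enumerate(assign):
            loads[line] += SETUP[job] + PROC[job]
        return max(loads)

    def perturb(assign):
        """PS4-style move (illustrative): shift a random subsequence of jobs to the other line."""
        new = assign[:]
        i = random.randrange(len(new))
        j = min(len(new), i + random.randint(1, 3))
        for k in range(i, j):
            new[k] = 1 - new[k]
        return new

    def simulated_annealing(t0=30.0, cooling=0.97, iters=2000):
        current = [random.randint(0, 1) for _ in PROC]
        best, t = current[:], t0
        for _ in range(iters):
            cand = perturb(current)
            delta = makespan(cand) - makespan(current)
            if delta <= 0 or random.random() < math.exp(-delta / t):
                current = cand
                if makespan(current) < makespan(best):
                    best = current[:]
            t *= cooling
        return best, makespan(best)

    if __name__ == "__main__":
        print(simulated_annealing())
    ```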

  12. Quality of life of patients undergoing heart transplantation: application of the Whoqol-Bref scale

    Directory of Open Access Journals (Sweden)

    Maria Isis Freire de Aguiar

    2011-01-01

    Full Text Available BACKGROUND: A successful heart transplant means ensuring the survival of patients with heart disease and allowing them to carry out their daily activities. Heart transplantation is the first treatment option in heart failure, representing an increase in survival and in the quality of life of transplant recipients. OBJECTIVE: To evaluate the quality of life of patients undergoing heart transplantation through the application of a standardized scale (Whoqol-Bref). METHODS: A descriptive exploratory study with a quantitative approach, carried out with 55 heart transplant patients, between the third and the 103rd month after transplantation, followed up at the Transplant and Heart Failure Unit of a referral cardiology hospital in the city of Fortaleza, CE, Brazil. Data were collected from February to April 2009 through a questionnaire standardized by the World Health Organization and through data available in medical records. RESULTS: Regarding the physical domain, 62.8% of male and 58.3% of female patients were satisfied. In the psychological domain, 65.1% of male patients and 58.3% of female patients were satisfied with their quality of life. In the social relations domain, 53.5% of male patients were very satisfied, and the satisfaction level among female patients was 100%. In the environment domain, 65.1% of male and 83.3% of female patients were satisfied. CONCLUSION: Heart transplantation had a considerable influence on the quality of life of transplant patients, since the results were statistically significant after transplantation.

  13. Optimization of Nano-Process Deposition Parameters Based on Gravitational Search Algorithm

    Directory of Open Access Journals (Sweden)

    Norlina Mohd Sabri

    2016-06-01

    Full Text Available This research focuses on the radio frequency (RF) magnetron sputtering process, a physical vapor deposition technique which is widely used in thin film production. This process requires an optimized combination of deposition parameters in order to obtain the desired thin film. The conventional method for optimizing the deposition parameters has been reported to be costly and time consuming due to its trial and error nature. Thus, the gravitational search algorithm (GSA) technique is proposed to solve this nano-process parameter optimization problem. In this research, the optimized parameter combination was expected to produce the desired electrical and optical properties of the thin film. The performance of GSA in this research was compared with that of Particle Swarm Optimization (PSO), Genetic Algorithm (GA), Artificial Immune System (AIS) and Ant Colony Optimization (ACO). Based on the overall results, the GSA-optimized parameter combination generated the best electrical properties and acceptable optical properties of the thin film compared to the others. This computational experiment is expected to overcome the problem of having to conduct repetitive laboratory experiments to obtain the most optimized parameter combination. Based on this initial experiment, the adaptation of GSA to this problem could offer a more efficient and productive way of depositing quality thin film in the fabrication process.
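
    A minimal gravitational search algorithm sketch is given below; the quality function is a made-up stand-in for the measured thin-film response to two deposition parameters, and the GSA constants are illustrative, not the values used in the paper.

    ```python
    import math, random

    def quality(x):
        """Stand-in for measured thin-film quality as a function of two deposition
        parameters (e.g., RF power and deposition time); a real run would use experiments."""
        return -((x[0] - 150.0) ** 2 / 900.0 + (x[1] - 30.0) ** 2 / 25.0)

    def gsa(obj, bounds, agents=12, iters=100, g0=100.0, alpha=20.0):
        dim = len(bounds)
        X = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(agents)]
        V = [[0.0] * dim for _ in range(agents)]
        for t in range(iters):
            fit = [obj(x) for x in X]
            best, worst = max(fit), min(fit)
            m = [(f - worst) / (best - worst + 1e-12) for f in fit]
            M = [mi / (sum(m) + 1e-12) for mi in m]
            G = g0 * math.exp(-alpha * t / iters)         # gravitational constant decays
            for i in range(agents):
                acc = [0.0] * dim
                for j in range(agents):
                    if i == j:
                        continue
                    R = math.dist(X[i], X[j])
                    for d in range(dim):
                        # acceleration = force / M_i, so M_i cancels and only M_j remains
                        acc[d] += random.random() * G * M[j] * (X[j][d] - X[i][d]) / (R + 1e-12)
                for d in range(dim):
                    V[i][d] = random.random() * V[i][d] + acc[d]
                    lo, hi = bounds[d]
                    X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
        return max(X, key=obj)

    if __name__ == "__main__":
        random.seed(2)
        best = gsa(quality, bounds=[(50.0, 300.0), (5.0, 60.0)])
        print([round(v, 1) for v in best])
    ```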

  14. A proposed adaptive step size perturbation and observation maximum power point tracking algorithm based on photovoltaic system modeling

    Science.gov (United States)

    Huang, Yu

    Solar energy has become one of the major alternative renewable energy options owing to its abundance and accessibility. Due to its intermittent nature, there is a high demand for Maximum Power Point Tracking (MPPT) techniques when a Photovoltaic (PV) system is used to extract energy from sunlight. This thesis proposes an advanced Perturbation and Observation (P&O) algorithm aimed at relatively practical circumstances. First, a practical PV system model is studied, determining the series and shunt resistances which are neglected in some research. Moreover, in the proposed algorithm, the duty ratio of a boost DC-DC converter is the object of the perturbation, using input impedance conversion to adjust the operating voltage. Based on this control strategy, an adaptive duty-ratio step size P&O algorithm is proposed, with major modifications made for sharp insolation changes as well as low-insolation scenarios. Matlab/Simulink simulations of the PV model, the boost converter control strategy and the various MPPT processes are conducted step by step. The proposed adaptive P&O algorithm is validated by the simulation results and a detailed analysis of sharp insolation changes, low-insolation conditions and continuous insolation variation.
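
    A rough sketch of an adaptive-step perturb-and-observe loop acting on the converter duty ratio: the perturbation direction reverses when power drops, and the step size is scaled by the observed power change. The scaling rule and the toy PV model are assumptions, not the thesis' exact algorithm.

    ```python
    def adaptive_p_and_o(measure, duty0=0.5, base_step=0.02, steps=50):
        """Perturb-and-observe on the converter duty ratio with a power-scaled step:
        larger power changes produce larger duty perturbations (illustrative rule)."""
        duty, step = duty0, base_step
        v, p = measure(duty)
        direction = 1
        for _ in range(steps):
            duty = min(0.95, max(0.05, duty + direction * step))
            v_new, p_new = measure(duty)
            dp = p_new - p
            if dp < 0:
                direction = -direction          # overshot the MPP: reverse perturbation
            step = min(base_step, abs(dp) * 0.01) + 1e-4   # adaptive step size (assumed scaling)
            v, p = v_new, p_new
        return duty, p

    def fake_pv(duty):
        """Toy PV + boost model: power peaks at duty = 0.62 (placeholder for a real plant)."""
        p = 100.0 - 400.0 * (duty - 0.62) ** 2
        return 30.0 * duty, p

    if __name__ == "__main__":
        print(adaptive_p_and_o(fake_pv))
    ```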

  15. Analysis of Ant Colony Optimization and Population-Based Evolutionary Algorithms on Dynamic Problems

    DEFF Research Database (Denmark)

    Lissovoi, Andrei

    This thesis presents new running time analyses of nature-inspired algorithms on various dynamic problems. It aims to identify and analyse the features of algorithms and problem classes which allow efficient optimization to occur in the presence of dynamic behaviour. We consider the following settings: λ-MMAS on Dynamic Shortest Path Problems. We investigate how increasing the number of ants simulated per iteration may help an ACO algorithm to track the optimum in a dynamic problem. It is shown that while a constant number of ants per vertex is sufficient to track some oscillations, there also ... the dynamic optimum for finite alphabets up to size μ, while MMAS is able to do so for any finite alphabet size. Parallel Evolutionary Algorithms on Maze. We prove that while a (1 + λ) EA is unable to track the optimum of the dynamic fitness function Maze for offspring population size up to λ = O(n^(1-ε)) ...

  16. HYBRID OPTIMIZATION OF OBJECT-BASED CLASSIFICATION IN HIGH-RESOLUTION IMAGES USING CONTINUOUS ANT COLONY ALGORITHM WITH EMPHASIS ON BUILDING DETECTION

    Directory of Open Access Journals (Sweden)

    E. Tamimi

    2017-09-01

    Full Text Available Automatic building detection from High Spatial Resolution (HSR) images is one of the most important issues in Remote Sensing (RS). Due to the limited number of spectral bands in HSR images, using other features will lead to improved accuracy. By adding these features, the presence probability of dependent features is increased, which leads to accuracy reduction. In addition, some parameters should be determined in Support Vector Machine (SVM) classification. Therefore, it is necessary to simultaneously determine the classification parameters and select independent features according to the image type. An optimization algorithm is an efficient method to solve this problem. On the other hand, pixel-based classification faces several challenges, such as producing salt-and-pepper results and high computational time in high dimensional data. Hence, in this paper, a novel method is proposed to optimize object-based SVM classification by applying a continuous Ant Colony Optimization (ACO) algorithm. The advantages of the proposed method are a relatively high automation level, independence of image scene and type, reduced post-processing for building edge reconstruction, and accuracy improvement. The proposed method was evaluated against pixel-based SVM and Random Forest (RF) classification in terms of accuracy. In comparison with optimized pixel-based SVM classification, the results showed that the proposed method improved the quality factor and overall accuracy by 17% and 10%, respectively. Also, in the proposed method, the Kappa coefficient was improved by 6% relative to RF classification. The processing time of the proposed method was relatively low because of the unit of image analysis (the image object). These results show the superiority of the proposed method in terms of time and accuracy.

  17. Designing algorithm visualization on mobile platform: The proposed guidelines

    Science.gov (United States)

    Supli, A. A.; Shiratuddin, N.

    2017-09-01

    This paper describes an ongoing study on design guidelines for algorithm visualization (AV) on the mobile platform, helping students learn the data structures and algorithms (DSA) subject effectively. Our previous review indicated that design guidelines for AV on the mobile platform are still few; most previous AV guidelines were developed for desktop and website platforms. In fact, mobile learning has been shown to enhance engagement in learning circumstances and thus affect students' performance. In addition, researchers highly recommend including UI design and interactivity in designing effective AV systems. However, the discussion of these two aspects in previous AV design guidelines is not comprehensive. UI design in this paper describes the arrangement of AV features in the mobile environment, whereas interactivity is about active learning strategy features based on learning experiences (how to engage learners). Thus, the main objective of this study is to propose design guidelines for AV on the mobile platform (AVOMP) that comprehensively cover UI design and interactivity aspects. These guidelines are developed through content analysis and comparative analysis of various related studies, and are useful for AV designers to help them construct AVOMP for various topics in DSA.

  18. Sedation with sufentanil and clonidine in patients undergoing cardiac catheterization

    Directory of Open Access Journals (Sweden)

    Anita Perpetua Carvalho Rocha

    2011-03-01

    Full Text Available BACKGROUND: Sedation for cardiac catheterization has been a subject of concern. Benzodiazepines, alpha-2 adrenergic agonists and opioids are used for this purpose; however, each of these drugs has advantages and disadvantages. OBJECTIVE: To evaluate the efficacy of sufentanil and clonidine as sedatives in patients undergoing cardiac catheterization, observing their impact on hemodynamic and respiratory parameters, the presence of side effects, and the satisfaction of the patient and the interventional cardiologist with the examination. METHODS: This was a prospective, double-blind, randomized, controlled clinical trial involving 60 patients who received 0.1 µg/kg of sufentanil or 0.5 µg/kg of clonidine before cardiac catheterization. The Ramsay sedation score, the need for midazolam, side effects, and hemodynamic and respiratory parameters were recorded, and the data were analyzed at six different time points. RESULTS: Blood pressure, heart rate and respiratory rate behaved similarly in both groups; however, at time point 2, patients in the sufentanil group (Group S) had a lower Ramsay sedation score, and peripheral oxyhemoglobin saturation was lower than in the clonidine group (Group C) at time point 6. Group S patients had a higher incidence of postoperative nausea and vomiting than Group C patients. Patient satisfaction was higher in the clonidine group, and the interventional cardiologists were satisfied in both groups. CONCLUSION: Sufentanil and clonidine were effective as sedatives in patients undergoing cardiac catheterization.

  19. Ant Foraging Behavior for Job Shop Problem

    Directory of Open Access Journals (Sweden)

    Mahad Diyana Abdul

    2016-01-01

    Full Text Available Ant Colony Optimization (ACO) is an algorithmic approach inspired by the foraging behavior of real ants. It has frequently been applied to many optimization problems, and one such problem is the job shop problem (JSP). The JSP consists of a finite set of jobs processed on a finite set of machines, where once a job starts processing on a given machine it must complete that processing uninterrupted. In job shop scheduling, a schedule is measured by the amount of time required to complete all jobs, known as the makespan, and minimizing the makespan is the main objective of this study. In this paper, we developed an ACO algorithm to minimize the makespan. A real set of problems from a metal company in Johor Bahru was used, involving 20 parts whose jobs consist of clinching, tapping and power press operations, respectively. The results of this study show that the proposed ACO heuristic managed to produce a good result in a short time.
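
    An ACO for the JSP repeatedly has to score each ant's operation sequence by its makespan; the sketch below shows such a makespan evaluation for an operation-based encoding. The three jobs, machine names and processing times are invented, not the company's data.

    ```python
    def makespan(jobs, op_sequence):
        """Decode an operation-based sequence (job indices, one occurrence per operation)
        into a semi-active schedule and return its makespan."""
        job_ready = [0] * len(jobs)                    # when each job's next op may start
        mach_ready = {}                                # when each machine becomes free
        next_op = [0] * len(jobs)
        for j in op_sequence:
            machine, duration = jobs[j][next_op[j]]
            start = max(job_ready[j], mach_ready.get(machine, 0))
            job_ready[j] = mach_ready[machine] = start + duration
            next_op[j] += 1
        return max(job_ready)

    if __name__ == "__main__":
        # Three hypothetical jobs; each operation is (machine, processing time)
        jobs = [[("clinch", 3), ("tap", 2), ("press", 2)],
                [("tap", 2), ("clinch", 1), ("press", 4)],
                [("press", 4), ("tap", 3)]]
        print(makespan(jobs, [0, 1, 2, 0, 1, 2, 0, 1]))   # one possible sequence
    ```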

  20. Eight reasons payer interoperability and data sharing are essential in ACOs. Interoperability standards could be a prerequisite to measuring care.

    Science.gov (United States)

    Mookencherry, Shefali

    2012-01-01

    It makes strategic and business sense for payers and providers to collaborate on how to take substantial cost out of the healthcare delivery system. Acting independently, neither medical groups, hospitals nor health plans have the optimal mix of resources and incentives to significantly reduce costs. Payers have core assets such as marketing, claims data, claims processing, reimbursement systems and capital. It would be cost prohibitive for all but the largest providers to develop these capabilities in order to compete directly with insurers. Likewise, medical groups and hospitals are positioned to foster financial interdependence among providers and coordinate the continuum of patient illnesses and care settings. Payers and providers should commit to reasonable clinical and cost goals, and share resources to minimize expenses and financial risks. It is in the interest of payers to work closely with providers on risk-management strategies because insurers need synergy with ACOs to remain cost competitive. It is in the interest of ACOs to work collaboratively with payers early on to develop reasonable and effective performance benchmarks. Hence, it is essential to have payer interoperability and data sharing integrated in an ACO model.

  1. Validation of the concept Risk for Decreased Cardiac Output

    Directory of Open Access Journals (Sweden)

    Eduarda Ribeiro dos Santos

    2013-02-01

    Full Text Available OBJECTIVES: To validate the concept "risk for decreased cardiac output". METHOD: Six of the eight steps suggested in the technique developed by Walker & Avant were adopted to analyze the concept of the phenomenon under study, and the proposal made by Hoskins was used for content validation, taking into account the agreement achieved among five experts. RESULTS: The concept "decreased cardiac output" was found in the nursing and medical fields and refers to the heart's pumping capacity, while the concept "risk" is found in a large number of disciplines. In regard to the defining attributes, "impaired pumping capacity" was the main attribute of decreased cardiac output and "probability" was the main attribute of risk. The uses and defining attributes of the concepts "decreased cardiac output" and "risk" were analyzed, as well as their antecedent and consequent events, in order to establish the definition of "risk for decreased cardiac output", which was validated by 100% of the experts. CONCLUSION: The data obtained indicate that the risk for decreased cardiac output phenomenon can be a nursing diagnosis, and refining it can contribute to the advancement of nursing classifications in this context.

  2. Hybrid Metaheuristic Approach for Nonlocal Optimization of Molecular Systems.

    Science.gov (United States)

    Dresselhaus, Thomas; Yang, Jack; Kumbhar, Sadhana; Waller, Mark P

    2013-04-09

    Accurate modeling of molecular systems requires a good knowledge of the structure; therefore, conformation searching/optimization is a routine necessity in computational chemistry. Here we present a hybrid metaheuristic optimization (HMO) algorithm, which combines ant colony optimization (ACO) and particle swarm optimization (PSO) for the optimization of molecular systems. The HMO implementation meta-optimizes the parameters of the ACO algorithm on-the-fly by the coupled PSO algorithm. The ACO parameters were optimized on a set of small difluorinated polyenes, where the parameters exhibited small variance as the size of the molecule increased. The HMO algorithm was validated by searching for the closed form of around 100 molecular balances. Compared to the gradient-based optimized molecular balance structures, the HMO algorithm was able to find low-energy conformations with an 87% success rate. Finally, the computational effort for generating low-energy conformation(s) for the phenylalanyl-glycyl-glycine tripeptide was approximately 60 CPU hours with the ACO algorithm, in comparison to 4 CPU years required for an exhaustive brute-force calculation.

  3. Ant Colony Optimization for Markowitz Mean-Variance Portfolio Model

    Science.gov (United States)

    Deng, Guang-Feng; Lin, Woo-Tsong

    This work presents Ant Colony Optimization (ACO), which was initially developed as a meta-heuristic for combinatorial optimization, for solving the cardinality-constrained Markowitz mean-variance portfolio model (a nonlinear mixed quadratic programming problem). To our knowledge, an efficient algorithmic solution for this problem has not been proposed until now, so using heuristic algorithms in this case is imperative. Numerical solutions are obtained for five analyses of weekly price data for the following indices for the period March 1992 to September 1997: Hang Seng 31 in Hong Kong, DAX 100 in Germany, FTSE 100 in the UK, S&P 100 in the USA and Nikkei 225 in Japan. The test results indicate that ACO is much more robust and effective than Particle Swarm Optimization (PSO), especially for low-risk investment portfolios.
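
    The fitness with which an ant's candidate portfolio would be scored can be sketched as a mean-variance objective that rejects portfolios violating the cardinality constraint; the returns, covariance matrix and risk-aversion value below are hypothetical, not the paper's index data.

    ```python
    def portfolio_objective(weights, mu, cov, risk_aversion=3.0, max_assets=3):
        """Mean-variance objective with a cardinality constraint: return -inf when more
        than `max_assets` weights are non-zero (a real ACO would construct feasible picks)."""
        if sum(1 for w in weights if w > 1e-9) > max_assets:
            return float("-inf")
        ret = sum(w * m for w, m in zip(weights, mu))
        var = sum(weights[i] * weights[j] * cov[i][j]
                  for i in range(len(weights)) for j in range(len(weights)))
        return ret - risk_aversion * var

    if __name__ == "__main__":
        # Hypothetical weekly mean returns and covariance for four assets
        mu = [0.002, 0.0015, 0.003, 0.001]
        cov = [[0.0004, 0.0001, 0.0002, 0.0000],
               [0.0001, 0.0003, 0.0001, 0.0001],
               [0.0002, 0.0001, 0.0006, 0.0001],
               [0.0000, 0.0001, 0.0001, 0.0002]]
        w = [0.4, 0.0, 0.6, 0.0]                     # two assets selected, weights sum to 1
        print(portfolio_objective(w, mu, cov))
    ```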

  4. Meanings of manic episodes for patients with bipolar disorder in remission: a qualitative study

    Directory of Open Access Journals (Sweden)

    Luiz Carlos Pereira Bin

    2014-07-01

    Full Text Available Objective: To discuss the meanings of experiencing manic episodes for patients with bipolar disorder (BD). Methods: This is a qualitative study carried out through in-depth semi-structured interviews, with a sample closed by the saturation criterion, comprising eight patients with BD in remission. Data were processed through content analysis of the fully transcribed interviews and categorization. The results were submitted to external validation at the Clinical-Qualitative Research Laboratory of the Department of Medical Psychology and Psychiatry at Unicamp, composed of 37 researchers in the method, including master's and doctoral students, post-doctoral fellows and senior researchers. Results: Three categories were identified: Ambivalence and shame: to think or not to think about the manic episodes; Organizing personal feelings: remission as a moment of self-awareness; and Manic episodes structuring interpersonal relationships versus projections of anguish. Conclusion: The findings of this study contribute to a better understanding of manic states in BD, which may support reflection on the professional-patient relationship and the development of strategies for adherence and for therapeutic and preventive measures against episode recurrence. They may also assist the healthcare team involved in following up these cases, as well as researchers investigating how the meanings discussed here contribute to treatment adherence and a better prognosis.

  5. Ant Colony Optimization and the Minimum Cut Problem

    DEFF Research Database (Denmark)

    Kötzing, Timo; Lehre, Per Kristian; Neumann, Frank

    2010-01-01

    Ant Colony Optimization (ACO) is a powerful metaheuristic for solving combinatorial optimization problems. With this paper we contribute to the theoretical understanding of this kind of algorithm by investigating the classical minimum cut problem. An ACO algorithm similar to the one that was prov...

  6. A short-term operating room surgery scheduling problem integrating multiple nurses roster constraints.

    Science.gov (United States)

    Xiang, Wei; Yin, Jiao; Lim, Gino

    2015-02-01

    Operating room (OR) surgery scheduling determines each surgery's operation start time and assigns the required resources to each surgery over a schedule period, considering several constraints related to a complete surgery flow and the multiple resources involved. This task plays a decisive role in providing timely treatment for patients while balancing hospital resource utilization. The originality of the present study is to integrate the surgery scheduling problem with real-life nurse roster constraints such as their role, specialty, qualification and availability. This article proposes a mathematical model and an ant colony optimization (ACO) approach to efficiently solve such surgery scheduling problems. A modified ACO algorithm with a two-level ant graph model is developed to solve such combinatorial optimization problems because of their computational complexity. The outer ant graph represents surgeries, while the inner graph is a dynamic resource graph. Three types of pheromones, i.e. sequence-related, surgery-related, and resource-related pheromone, fitting the two-level model are defined. The iteration-best and feasible update strategy and local pheromone update rules are adopted to emphasize information related to good solutions in terms of makespan, as well as balanced utilization of resources. The performance of the proposed ACO algorithm is then evaluated using test cases from (1) published literature data with complete nurse roster constraints, and (2) real data collected from a hospital in China. The scheduling results using the proposed ACO approach are compared with the test cases from both the literature and real-life hospital scheduling. Comparison with the literature shows that the proposed ACO approach has (1) a 1.5-h reduction in end time; (2) a reduction in the variation of resources' working time, i.e. 25% for ORs, 50% for nurses in shift 1 and 86% for nurses in shift 2; (3) a 0.25-h reduction in

  7. NHETS - A Study of Necropsies of Patients Undergoing Heart Transplantation

    Directory of Open Access Journals (Sweden)

    Thiago Ninck Valette

    2014-06-01

    Full Text Available Background: Discrepancies between pre- and post-mortem diagnoses are reported in the literature and may range from 4.1 to 49.8% of the cases referred for necropsy, with important repercussions for patient treatment. Objective: To analyze patients who died after heart transplantation and compare the pre- and post-mortem diagnoses. Methods: Through a review of medical records, clinical data, the presence of comorbidities, the immunosuppression regimen, laboratory tests, the clinical cause of death and the cause of death at necropsy were analyzed. The clinical and necropsy causes of death of each patient were then compared. Results: Forty-eight deaths submitted to necropsy between 2000 and 2010 were analyzed; 29 (60.4%) had concordant clinical and necropsy diagnoses, 16 (33.3%) had discordant diagnoses and three (6.3%) had an unexplained diagnosis. Among the discordant cases, 15 (31.3%) had a possible impact on survival and one (2.1%) had no impact on survival. The main erroneous clinical diagnosis was infection, with five cases (26.7% of the discordant cases), followed by hyperacute rejection, with four cases (20% of the discordant cases), and pulmonary thromboembolism, with three cases (13.3% of the discordant cases). Conclusion: Discrepancies between clinical diagnosis and necropsy findings are commonly found in heart transplantation. New strategies to improve clinical diagnosis should be introduced, taking necropsy results into account, in order to improve the treatment of heart failure through heart transplantation.

  8. Comparison of Different MPPT Algorithms with a Proposed One Using a Power Estimator for Grid Connected PV Systems

    Directory of Open Access Journals (Sweden)

    Manel Hlaili

    2016-01-01

    Full Text Available Photovoltaic (PV) energy is one of the most important energy sources since it is clean and inexhaustible. It is important to operate PV energy conversion systems at the maximum power point (MPP) to maximize the output energy of PV arrays. An MPPT control is necessary to extract maximum power from the PV arrays. In recent years, a large number of techniques have been proposed for tracking the maximum power point. This paper presents a comparison of different MPPT methods, proposes one that uses a power estimator, and analyses their suitability for systems that experience a wide range of operating conditions. The classic methods analysed, the incremental conductance (IncCond), perturbation and observation (P&O), and ripple correlation (RC) algorithms, are suitable and practical. Simulation results of a single-phase NPC grid-connected PV system operating with the aforementioned methods are presented to confirm the effectiveness of the scheme and algorithms. The simulation results verify the correct operation of the different MPPT methods and of the proposed algorithm.
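
    Of the classic methods listed, perturb and observe (P&O) is the simplest to state in code: perturb the operating voltage, keep the perturbation direction while the measured power rises, and reverse it when the power falls. The Python sketch below illustrates only that textbook rule; the toy power curve, the 0.5 V step and the starting point are assumptions for illustration, not the paper's power-estimator method.

        def perturb_and_observe(v, p, v_prev, p_prev, step=0.5):
            """One P&O iteration: keep the perturbation direction while power rises,
            reverse it when power falls. Returns the next voltage reference."""
            dp, dv = p - p_prev, v - v_prev
            if dp == 0:
                return v                      # already hovering around the MPP
            if (dp > 0 and dv > 0) or (dp < 0 and dv < 0):
                return v + step               # moving toward the MPP: keep direction
            return v - step                   # moved away from the MPP: reverse

        def pv_power(v):
            # toy PV curve (illustrative, not a real panel model): P(V) peaks near V = 30
            return max(0.0, -0.2 * (v - 30.0) ** 2 + 180.0)

        v_prev, v = 20.0, 21.0
        p_prev = pv_power(v_prev)
        for _ in range(100):
            p = pv_power(v)
            v_prev, p_prev, v = v, p, perturb_and_observe(v, p, v_prev, p_prev)
        print(round(v, 1))   # settles close to the 30 V maximum power point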

  9. Approaches to drug therapy for COPD in Russia: a proposed therapeutic algorithm

    Directory of Open Access Journals (Sweden)

    Zykov KA

    2017-04-01

    Full Text Available Kirill A Zykov,1 Svetlana I Ovcharenko2 1Laboratory of Pulmonology, Moscow State University of Medicine and Dentistry named after A.I. Evdokimov, 2I.M. Sechenov First Moscow State Medical University, Moscow, Russia Abstract: Until recently, there have been few clinical algorithms for the management of patients with COPD. Current evidence-based clinical management guidelines can appear to be complex, and they lack clear step-by-step instructions. For these reasons, we chose to create a simple and practical clinical algorithm for the management of patients with COPD, which would be applicable to real-world clinical practice, and which was based on clinical symptoms and spirometric parameters that would take into account the pathophysiological heterogeneity of COPD. This optimized algorithm has two main fields, one for nonspecialist treatment by primary care and general physicians and the other for treatment by specialized pulmonologists. Patients with COPD are treated with long-acting bronchodilators and short-acting drugs on a demand basis. If the forced expiratory volume in one second (FEV1) is ≥50% of predicted and symptoms are mild, treatment with a single long-acting muscarinic antagonist or long-acting beta-agonist is proposed. When FEV1 is <50% of predicted and/or the COPD assessment test score is ≥10, the use of combined bronchodilators is advised. If there is no response to treatment after three months, referral to a pulmonary specialist is recommended for pathophysiological endotyping: (1) eosinophilic endotype with peripheral blood or sputum eosinophilia >3%; (2) neutrophilic endotype with peripheral blood neutrophilia >60% or green sputum; or (3) pauci-granulocytic endotype. It is hoped that this simple, optimized, step-by-step algorithm will help to individualize the treatment of COPD in real-world clinical practice. This algorithm has yet to be evaluated prospectively or by comparison with other COPD management algorithms, including
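
    For illustration only (not clinical guidance), the first branch of the decision logic described above can be written as a small function; the thresholds are the FEV1 50% of predicted and CAT ≥ 10 cut-offs quoted in the abstract, the return strings are simplified paraphrases of the proposed steps, and the three-month reassessment and endotyping stage is not encoded.

        def copd_first_line(fev1_pct_pred, cat_score, mild_symptoms):
            """Schematic sketch of the algorithm's first branch (illustration only)."""
            if fev1_pct_pred >= 50 and mild_symptoms:
                return "single long-acting bronchodilator (LAMA or LABA)"
            if fev1_pct_pred < 50 or cat_score >= 10:
                return "combined long-acting bronchodilators"
            return "short-acting drugs on demand; reassess"

        print(copd_first_line(62, 6, mild_symptoms=True))
        print(copd_first_line(45, 12, mild_symptoms=False))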

  10. Tests were inconclusive: California's experiment with ACOs raises questions about whether they'll save money.

    Science.gov (United States)

    Vesely, Rebecca

    2010-11-01

    In California, many providers have been working in accountable care organizations for decades, and they say they can work well. "It's not the type of organization that defines the success of an ACO, it's the leadership," said Tom Williams, executive director of the Oakland, Calif.-based Integrated Healthcare Association. "For success, there needs to be alignment between hospitals and physicians."

  11. Survey on Recent Research and Implementation of Ant Colony Optimization in Various Engineering Applications

    Directory of Open Access Journals (Sweden)

    Mohan B. Chandra

    2011-08-01

    Full Text Available Ant colony optimization (ACO) takes inspiration from the foraging behaviour of real ant species. ACO exploits a similar mechanism for solving optimization problems in various engineering fields of study. Many successful implementations using ACO are now available in many applications. This paper reviews various systematic approaches in recent research and implementations of ACO. Finally, it presents experimental results of ACO applied to a routing problem and compared with existing algorithms.

  12. Transient absorption spectroscopy in biology using the Super-ACO storage ring FEL and the synchrotron radiation combination

    CERN Document Server

    Renault, E; De Ninno, G; Garzella, D; Hirsch, M; Nahon, L; Nutarelli, D

    2001-01-01

    The Super-ACO storage ring FEL, covering the UV range down to 300 nm with a high average power (300 mW at 350 nm) together with high stability and a long lifetime, is a unique tool for the performance of users' applications. We present here the first pump-probe two-color experiments on biological species using a storage ring FEL coupled to synchrotron radiation. The intense UV pulse of the Super-ACO FEL is used to prepare a high initial concentration of chromophores in their first singlet electronic excited state. The nearby bending-magnet synchrotron radiation provides, on the other hand, a pulsed white-light continuum (UV-IR), naturally synchronized with the FEL pulses and used to probe the subsequent photochemical events and the associated transient species. We have demonstrated the feasibility with a dye molecule (POPOP), observing a two-color effect, a signature of excited-state absorption, and a temporal signature with Acridine. Applications on various chromophores of biological interest are carried out,...

  13. OPTIMIZED PARTICLE SWARM OPTIMIZATION BASED DEADLINE CONSTRAINED TASK SCHEDULING IN HYBRID CLOUD

    Directory of Open Access Journals (Sweden)

    Dhananjay Kumar

    2016-01-01

    Full Text Available Cloud computing is a dominant way of sharing computing resources that can be configured and provisioned easily. Task scheduling in a hybrid cloud is a challenge, as it suffers from producing the best QoS (Quality of Service) when there is high demand. In this paper, a new resource allocation algorithm is proposed to find the best external cloud provider when the intermediate provider's resources aren't enough to satisfy the customer's demand. The proposed algorithm, called Optimized Particle Swarm Optimization (OPSO), combines two metaheuristic algorithms, namely Particle Swarm Optimization and Ant Colony Optimization (ACO). These metaheuristic algorithms are used for optimization in the search space of the required solution, to find the best resource from the pool of resources and to obtain maximum profit even when the number of tasks submitted for execution is very high. This optimization is performed to allocate job requests to internal and external cloud providers to obtain maximum profit. It helps to improve system performance by improving CPU utilization and handling multiple requests at the same time. The simulation results show that OPSO yields 0.1% - 5% profit to the intermediate cloud provider compared with the standard PSO and ACO algorithms, and it also increases CPU utilization by 0.1%.
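
    The PSO half of such a hybrid reduces to the standard velocity and position update driven by each particle's personal best and the global best. The Python sketch below shows only that generic update on a placeholder objective; the inertia and acceleration coefficients, bounds and cost function are assumptions, and none of the OPSO-specific ACO coupling or cloud cost model is included.

        import random

        def cost(x):
            return sum(xi ** 2 for xi in x)    # placeholder for the profit/makespan objective

        dim, swarm_size, iters = 5, 20, 100
        w, c1, c2 = 0.7, 1.5, 1.5              # inertia and acceleration coefficients

        pos = [[random.uniform(-10, 10) for _ in range(dim)] for _ in range(swarm_size)]
        vel = [[0.0] * dim for _ in range(swarm_size)]
        pbest = [p[:] for p in pos]
        gbest = min(pbest, key=cost)

        for _ in range(iters):
            for i in range(swarm_size):
                for d in range(dim):
                    r1, r2 = random.random(), random.random()
                    vel[i][d] = (w * vel[i][d]
                                 + c1 * r1 * (pbest[i][d] - pos[i][d])
                                 + c2 * r2 * (gbest[d] - pos[i][d]))
                    pos[i][d] += vel[i][d]
                if cost(pos[i]) < cost(pbest[i]):
                    pbest[i] = pos[i][:]
            gbest = min(pbest, key=cost)

        print("best cost:", round(cost(gbest), 6))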

  14. Acute Care of At-Risk Newborns (ACoRN: quantitative and qualitative educational evaluation of the program in a region of China

    Directory of Open Access Journals (Sweden)

    Singhal Nalini

    2012-06-01

    Full Text Available Abstract Background The Acute Care of at-Risk Newborns (ACoRN) program was developed in Canada for trained health care providers for the identification and management of newborns who are at risk and/or become unwell in the first few hours or days after birth. The ACoRN process follows an 8-step framework that enables the evaluation and management of babies irrespective of the experience or expertise of the caregiving individual or team. This study assesses the applicability of the program to Chinese pediatric practitioners. Methods Course content and educational materials were translated from English into Chinese by bilingual neonatal practitioners. Confidence and knowledge questionnaires were developed and reviewed for face and content validity by a team of ACoRN instructors. Bilingual Chinese instructors were trained at the tertiary perinatal centre in Hangzhou, Zhejiang to deliver the course at 15 level II county hospitals. Participants completed pre- and post-course confidence and knowledge questionnaires and provided feedback through post-course focus groups. Results A total of 216 physicians and nurses were trained. Confidence and knowledge relating to neonatal stabilization improved significantly following the courses. Participants rated course utility and function between 4.2 and 4.6/5 on all items. Pre/post measures of confidence were significantly correlated with post-course knowledge. Focus group data supported the perceived value of the program and recommended course adjustments to include pre-course reading and increased content related to simulation, communication skills, and management of respiratory illness and jaundice. Conclusions ACoRN, a Canadian educational program, appears to be well received by Chinese health care providers and results in improved knowledge and confidence. International program adaptation for use by health care professionals requires structured and systematic evaluation to ensure that the program meets the needs of

  15. Replication and Comparison of the Newly Proposed ADOS-2, Module 4 Algorithm in ASD without ID: A Multi-Site Study

    Science.gov (United States)

    Pugliese, Cara E.; Kenworthy, Lauren; Bal, Vanessa Hus; Wallace, Gregory L.; Yerys, Benjamin E.; Maddox, Brenna B.; White, Susan W.; Popal, Haroon; Armour, Anna Chelsea; Miller, Judith; Herrington, John D.; Schultz, Robert T.; Martin, Alex; Anthony, Laura Gutermuth

    2015-01-01

    Recent updates have been proposed to the Autism Diagnostic Observation Schedule-2 Module 4 diagnostic algorithm. This new algorithm, however, has not yet been validated in an independent sample without intellectual disability (ID). This multi-site study compared the original and revised algorithms in individuals with ASD without ID. The revised…

  16. Vertigo in childhood: proposal for a diagnostic algorithm based upon clinical experience.

    Science.gov (United States)

    Casani, A P; Dallan, I; Navari, E; Sellari Franceschini, S; Cerchiai, N

    2015-06-01

    The aim of this paper is to analyse, after clinical experience with a series of patients with established diagnoses and review of the literature, all relevant anamnestic features in order to build a simple diagnostic algorithm for vertigo in childhood. This study is a retrospective chart review. A series of 37 children underwent complete clinical and instrumental vestibular examination. Only neurological disorders or genetic diseases represented exclusion criteria. All diagnoses were reviewed after applying the most recent diagnostic guidelines. In our experience, the most common aetiology for dizziness is vestibular migraine (38%), followed by acute labyrinthitis/neuritis (16%) and somatoform vertigo (16%). Benign paroxysmal vertigo was diagnosed in 4 patients (11%) and paroxysmal torticollis was diagnosed in a 1-year-old child. In 8% (3 patients) of cases, the dizziness had a post-traumatic origin: 1 canalolithiasis of the posterior semicircular canal and 2 labyrinthine concussions, respectively. Menière's disease was diagnosed in 2 cases. A bilateral vestibular failure of unknown origin caused chronic dizziness in 1 patient. In conclusion, this algorithm could represent a good tool for guiding clinical suspicion to correct diagnostic assessment in dizzy children where no neurological findings are detectable. The algorithm has just a few simple steps, based mainly on two aspects to be investigated early: temporal features of vertigo and presence of hearing impairment. A different algorithm has been proposed for cases in which a traumatic origin is suspected.

  17. Replication and Comparison of the Newly Proposed ADOS-2, Module 4 Algorithm in ASD without ID: A Multi-site Study

    OpenAIRE

    Pugliese, Cara E.; Kenworthy, Lauren; Bal, Vanessa Hus; Wallace, Gregory L; Yerys, Benjamin E; Maddox, Brenna B.; White, Susan W.; Popal, Haroon; Armour, Anna Chelsea; Miller, Judith; Herrington, John D.; Schultz, Robert T.; Martin, Alex; Anthony, Laura Gutermuth

    2015-01-01

    Recent updates have been proposed to the Autism Diagnostic Observation Schedule-2 Module 4 diagnostic algorithm. This new algorithm, however, has not yet been validated in an independent sample without intellectual disability (ID). This multi-site study compared the original and revised algorithms in individuals with ASD without ID. The revised algorithm demonstrated increased sensitivity, but lower specificity in the overall sample. Estimates were highest for females, individuals with a verb...

  18. Autonomous Star Tracker Algorithms

    DEFF Research Database (Denmark)

    Betto, Maurizio; Jørgensen, John Leif; Kilsgaard, Søren

    1998-01-01

    Proposal, in response to an ESA R.f.P., to design algorithms for autonomous star tracker operations. The proposal also included the development of a star tracker breadboard to test the algorithms' performance.

  19. Ant colony optimization as a descriptor selection in QSPR modeling: Estimation of the λmax of anthraquinones-based dyes

    Directory of Open Access Journals (Sweden)

    Morteza Atabati

    2016-09-01

    Full Text Available Quantitative structure–property relationship (QSPR) studies based on ant colony optimization (ACO) were carried out for the prediction of λmax of 9,10-anthraquinone derivatives. ACO is a meta-heuristic algorithm derived from the observation of real ants and applied here to feature selection. After optimization of the 3D geometry of the structures by semi-empirical quantum-chemical calculation at the AM1 level, different descriptors were calculated with the HyperChem and Dragon software packages (1514 descriptors). A major problem of QSPR is the high dimensionality of the descriptor space; therefore, descriptor selection is the most important step. In this paper, an ACO algorithm was used to select the best descriptors. The selected descriptors were then applied to model development using multiple linear regression. The average absolute relative deviation and correlation coefficient for the calibration set were 3.3% and 0.9591, respectively, while the average absolute relative deviation and correlation coefficient for the prediction set were 5.0% and 0.9526, respectively. The results showed that the applied procedure is suitable for the prediction of λmax of 9,10-anthraquinone derivatives.
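
    The two statistics reported for the calibration and prediction sets are easy to reproduce once a multiple linear regression has produced calculated λmax values. The Python sketch below computes the average absolute relative deviation (as a percentage) and the Pearson correlation coefficient; the observed/calculated values are hypothetical numbers, not the paper's data.

        from math import sqrt

        def aard_percent(y_true, y_pred):
            """Average absolute relative deviation, in percent."""
            return 100.0 * sum(abs((p - t) / t) for t, p in zip(y_true, y_pred)) / len(y_true)

        def pearson_r(y_true, y_pred):
            """Pearson correlation coefficient between observed and calculated values."""
            n = len(y_true)
            mt, mp = sum(y_true) / n, sum(y_pred) / n
            cov = sum((t - mt) * (p - mp) for t, p in zip(y_true, y_pred))
            st = sqrt(sum((t - mt) ** 2 for t in y_true))
            sp = sqrt(sum((p - mp) ** 2 for p in y_pred))
            return cov / (st * sp)

        obs = [455.0, 470.0, 492.0, 508.0, 520.0]   # hypothetical observed lambda_max (nm)
        calc = [450.0, 476.0, 489.0, 515.0, 518.0]  # hypothetical MLR-calculated values
        print(round(aard_percent(obs, calc), 2), round(pearson_r(obs, calc), 4))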

  20. An ant colony optimization based feature selection for web page classification.

    Science.gov (United States)

    Saraç, Esra; Özel, Selma Ayşe

    2014-01-01

    The increased popularity of the web has caused the inclusion of a huge amount of information on the web, and as a result of this explosive information growth, automated web page classification systems are needed to improve search engines' performance. Web pages have a large number of features, such as HTML/XML tags, URLs, hyperlinks, and text contents, that should be considered during an automated classification process. The aim of this study is to reduce the number of features used, in order to improve the runtime and accuracy of web page classification. In this study, we used an ant colony optimization (ACO) algorithm to select the best features, and then we applied the well-known C4.5, naive Bayes, and k-nearest-neighbor classifiers to assign class labels to web pages. We used the WebKB and Conference datasets in our experiments, and we showed that using ACO for feature selection improves both the accuracy and the runtime performance of classification. We also showed that the proposed ACO-based algorithm can select better features than the well-known information gain and chi-square feature selection methods.
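
    Stripped of the classifiers and datasets, ACO-style feature selection amounts to keeping one pheromone value per feature, letting each ant sample a subset with probability tied to pheromone, scoring the subset, and reinforcing the features of the iteration-best subset after evaporation. The sketch below illustrates only that loop under heavy simplifications: the quality function is a toy proxy (it rewards a hypothetical set of "relevant" features and penalizes subset size) rather than the C4.5/naive Bayes/kNN accuracy used in the study.

        import random

        random.seed(1)
        n_features, n_ants, n_iter, rho = 12, 8, 30, 0.2
        relevant = {0, 3, 7}                      # hypothetical truly useful features

        def quality(subset):
            # proxy metric: reward covering the relevant features, penalize subset size
            return len(relevant & subset) / len(relevant) - 0.02 * len(subset)

        tau = [1.0] * n_features                  # one pheromone value per feature
        best, best_q = set(), float("-inf")
        for _ in range(n_iter):
            iter_best, iter_q = None, float("-inf")
            for _ in range(n_ants):
                # each ant includes feature f with probability rising with its pheromone
                subset = {f for f in range(n_features)
                          if random.random() < tau[f] / (1.0 + tau[f])}
                q = quality(subset)
                if q > iter_q:
                    iter_best, iter_q = subset, q
            tau = [(1 - rho) * t for t in tau]    # evaporation
            for f in iter_best:                   # reinforce the iteration-best subset
                tau[f] += max(iter_q, 0.0)
            if iter_q > best_q:
                best, best_q = iter_best, iter_q

        print(sorted(best))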

  1. Solving Multi-Resource Constrained Project Scheduling Problem using Ant Colony Optimization

    Directory of Open Access Journals (Sweden)

    Hsiang-Hsi Huang

    2015-01-01

    Full Text Available This paper applies Ant Colony Optimization (ACO) to develop a resource-constrained scheduling model to achieve optimal resource allocation and the shortest completion time of a project under resource constraints and activity precedence requirements. Resource leveling is also discussed and has to be achieved together with the resource allocation optimization in this research. Test cases and examples adopted from an international test bank were studied to verify the effectiveness of the proposed model. The results showed that the solutions to the different cases all perform well and can be obtained through the ACO algorithm within a reasonable time under the same constrained conditions. A program was written for the proposed model that automatically produces the project resource requirement figure once the project duration is solved.
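
    Underneath the ACO layer, evaluating one candidate solution is a serial schedule-generation pass: take activities in some priority order, respect precedence, and delay each activity until the renewable resource can accommodate it. The Python sketch below shows that pass for a tiny, invented four-activity project with a single resource and a fixed priority order; in the paper the ordering is what the ants learn, and multiple resources plus leveling are also handled.

        # id: (duration, resource demand, predecessors) -- illustrative data only
        acts = {
            1: (3, 2, []),
            2: (4, 3, [1]),
            3: (2, 2, [1]),
            4: (5, 1, [2, 3]),
        }
        capacity = 4
        order = [1, 2, 3, 4]           # a fixed precedence-feasible priority order

        start, usage = {}, {}          # usage[t] = resource units in use during period t
        for a in order:
            dur, dem, preds = acts[a]
            t = max((start[p] + acts[p][0] for p in preds), default=0)
            # shift the activity right until capacity holds over its whole duration
            while any(usage.get(t + k, 0) + dem > capacity for k in range(dur)):
                t += 1
            start[a] = t
            for k in range(dur):
                usage[t + k] = usage.get(t + k, 0) + dem

        makespan = max(start[a] + acts[a][0] for a in acts)
        print(start, "makespan:", makespan)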

  2. The Algorithm for Algorithms: An Evolutionary Algorithm Based on Automatic Designing of Genetic Operators

    Directory of Open Access Journals (Sweden)

    Dazhi Jiang

    2015-01-01

    Full Text Available At present there is a wide range of evolutionary algorithms available to researchers and practitioners. Despite the great diversity of these algorithms, virtually all of them share one feature: they have been manually designed. A fundamental question is "are there any algorithms that can design evolutionary algorithms automatically?" A more complete formulation of the question is "can a computer construct an algorithm which will generate algorithms according to the requirements of a problem?" In this paper, a novel evolutionary algorithm based on automatic design of genetic operators is presented to address these questions. The resulting algorithm not only explores solutions in the problem space, as most traditional evolutionary algorithms do, but also automatically generates genetic operators in the operator space. In order to verify the performance of the proposed algorithm, comprehensive experiments on 23 well-known benchmark optimization problems were conducted. The results show that the proposed algorithm can outperform the standard differential evolution algorithm in terms of convergence speed and solution accuracy, which indicates that algorithms designed automatically by computers can compete with algorithms designed by human beings.

  3. Comparative study of heuristics algorithms in solving flexible job shop scheduling problem with condition based maintenance

    Directory of Open Access Journals (Sweden)

    Yahong Zheng

    2014-05-01

    Full Text Available Purpose: This paper focuses on a classic optimization problem in operations research, the flexible job shop scheduling problem (FJSP), to discuss methods for dealing with uncertainty in a manufacturing system. Design/methodology/approach: In this paper, condition based maintenance (CBM), a kind of preventive maintenance, is suggested to reduce the unavailability of machines. Different from the simultaneous scheduling algorithm (SSA) used in the previous article (Neale & Cameron, 1979), an inserting algorithm (IA) is applied, in which a pre-schedule is first obtained through a heuristic algorithm and maintenance tasks are then inserted into the pre-schedule scheme. Findings: It is encouraging that a new, better solution for an instance in the FJSP benchmark was obtained in this research. Moreover, the SSA used in the literature for solving the normal FJSPPM (FJSP with PM) is in fact not suitable for the dynamic FJSPPM. Through application to the benchmark of the normal FJSPPM, it is found that although IA obtains inferior results compared to the SSA used in the literature, it performs much better in execution speed. Originality/value: Different from traditional FJSP scheduling, the uncertainty of machines is taken into account, which increases the complexity of the problem. An inserting algorithm (IA) is proposed to solve the dynamic scheduling problem. It is noted that the quality of the final result depends largely on the quality of the pre-schedule obtained while solving the normal FJSP. In order to find the best solution of the FJSP, a comparative study of three heuristics is carried out: the integrated GA, ACO and ABC. In this comparative study, we find that GA performs best among the three heuristic algorithms. Meanwhile, a new, better solution for an instance in the FJSP benchmark was obtained in this research.

  4. Doctors on deck. ACOs led by doctors seek to manage costs, quality and hospital relationships.

    Science.gov (United States)

    Evans, Melanie

    2012-04-16

    Most of the first crop of ACOs in the Medicare Shared Savings Program are owned and operated by physicians without formal participation of a hospital in the efforts to improve quality and curb costs. "There were some people who feared that the only entities that would participate would be hospital-dominated systems," says Jonathan Blum, director of the Center for Medicare Management at the CMS, left. "That has not happened".

  5. The Lobe Fissure Tracking by the Modified Ant Colony Optimization Framework in CT Images

    Directory of Open Access Journals (Sweden)

    Chii-Jen Chen

    2014-11-01

    Full Text Available Chest computed tomography (CT) is the most commonly used technique for the inspection of lung lesions. However, the lobe fissures in lung CT are still difficult to observe owing to the imaging structure. Therefore, in this paper, we aimed to develop an efficient tracking framework to extract the lobe fissures using the proposed modified ant colony optimization (ACO) algorithm. We used the method of increasing the consistency of pheromone on the lobe fissure to improve the accuracy of path tracking. In order to validate the proposed system, we tested our method on a database of 15 lung patients. In the experiment, the quantitative assessment shows that the proposed ACO method achieved average F-measures of 80.9% and 82.84% in the left and right lungs, respectively. The experiments indicate that our method yields satisfactory performance and can help investigators detect lung lesions for further examination.

  6. Transient absorption spectroscopy in biology using the Super-ACO storage ring FEL and the synchrotron radiation combination

    International Nuclear Information System (INIS)

    Renault, Eric; Nahon, Laurent; Garzella, David; Nutarelli, Daniele; De Ninno, Giovanni; Hirsch, Matthias; Couprie, Marie Emmanuelle

    2001-01-01

    The Super-ACO storage ring FEL, covering the UV range down to 300 nm with a high average power (300 mW at 350 nm) together with high stability and a long lifetime, is a unique tool for the performance of users' applications. We present here the first pump-probe two-color experiments on biological species using a storage ring FEL coupled to synchrotron radiation. The intense UV pulse of the Super-ACO FEL is used to prepare a high initial concentration of chromophores in their first singlet electronic excited state. The nearby bending-magnet synchrotron radiation provides, on the other hand, a pulsed white-light continuum (UV-IR), naturally synchronized with the FEL pulses and used to probe the subsequent photochemical events and the associated transient species. We have demonstrated the feasibility with a dye molecule (POPOP), observing a two-color effect, a signature of excited-state absorption, and a temporal signature with Acridine. Applications on various chromophores of biological interest are carried out, such as the time-resolved absorption study of the first excited state of Acridine

  7. Registro Español de Trasplante Cardíaco. XX Informe oficial de la sección de insuficiencia Cardíaca y Trasplante Cardíaco de la sociedad Española de Cardiología (1984-2008

    Directory of Open Access Journals (Sweden)

    Luis Almenar Bonet

    2009-07-01

    Conclusions: The survival achieved in Spain with heart transplantation, especially in recent years, establishes heart transplantation as the treatment of choice for irreversible heart disease in an advanced functional state with no other established medical or surgical options.

  8. Interacciones entre dispositivos cardíacos implantables y modalidades fisioterapéuticas: ¿Mito o realidad?

    Directory of Open Access Journals (Sweden)

    Genevieve C. Digby

    2011-04-01

    Full Text Available Physiotherapy has become a specialty that clearly affects the quality of life of our patients. In elderly populations, the use of physiotherapy includes multiple modalities for a large number of different diseases. Several reports on possible negative interactions between different physiotherapy modalities and cardiac implantable devices (pacemakers and cardioverter-defibrillators) have been published in recent years. Despite this, there is very little evidence and there are few precise guidelines for identifying which physiotherapy modalities are safe to use in this patient population. In the following review, we set out to summarize the documented interactions between physiotherapy and cardiac implantable devices (CIDs), discuss the current standard of these practices, and identify the main considerations that exist from the perspective of a cardiac electrophysiology service for the appropriate treatment of these patients. Finally, we advocate strengthening the collaboration between physiotherapists and electrophysiologists in order to ensure optimal and safe care for this group of patients.

  9. Manejo de las dislipidemias en pacientes cardíacos trasplantados. Hallazgos sobre nuevos factores de riesgo

    Directory of Open Access Journals (Sweden)

    Walter Masson

    2008-01-01

    Full Text Available In heart transplant patients, the development of coronary vascular disease is a frequent complication, and dyslipidemia is one of its most important predictors. Immunosuppressants predispose patients to dyslipidemia and make the use of lipid-lowering drugs difficult. In this particular group of patients, reaching the therapeutic targets of secondary prevention is recommended. Statins are the lipid-lowering drugs of choice. There are no clear recommendations on newer risk factors, such as homocysteine and lipoprotein(a) [Lp(a)]. With the aim of determining the lipid profile, the prevalence of elevated homocysteine and Lp(a), the achievement of therapeutic targets and tolerance to medication, 23 heart transplant patients were included in the study. The results showed that achievement of the lipid targets was acceptable and that 65% were receiving lipid-lowering treatment. The use of statins was safe. A high prevalence of elevated homocysteine and Lp(a) was found. Their implications for modifying treatment remain unknown.

  10. Mensuração do tamanho cardíaco em radiografias de gatos com hipertireoidismo experimental

    Directory of Open Access Journals (Sweden)

    Mauro José Lahm Cardoso

    2007-04-01

    Full Text Available Feline hyperthyroidism, or thyrotoxicosis, is the most frequent endocrine disease in domestic cats. Feline hyperthyroidism is a multisystemic disorder associated with increased concentrations of circulating thyroid hormones, triiodothyronine (T3) and thyroxine (T4). Cardiovascular abnormalities in cats with spontaneous hyperthyroidism have been well described. The objective of this work was to measure heart size in 19 cats subjected to experimental thyrotoxicosis, using the "vertebral heart size" (VHS). Measurement using the VHS was performed on lateral, dorsoventral and ventrodorsal thoracic radiographs. The lateral thoracic radiographs were more effective than the ventrodorsal and dorsoventral views in illustrating the progressive enlargement of the heart. The VHS is an easy method to use, allowing the assessment of heart size in hyperthyroid cats, and it facilitates the identification of cardiomegaly and of progression of heart size.

  11. Proposals for Updating Tai Algorithm

    Science.gov (United States)

    1997-12-01

    1997 meeting, the Comité International des Poids et Mesures (CIPM) decided to change the name of the Comité Consultatif pour la Définition de la ... Report of the BIPM Time Section, 1988, 1, D1-D22. [2] P. Tavella, C. Thomas, Comparative study of time scale algorithms, Metrologia, 1991, 28, 57... alternative choice for implementing an upper limit of clock weights, Metrologia, 1996, 33, 227-240. [5] C. Thomas, Impact of New Clock Technologies

  12. Environmental Assessment for the proposed modification and continued operation of the DIII-D facility

    International Nuclear Information System (INIS)

    1995-07-01

    The EA evaluates the proposed action of modifying the DIII-D fusion facility and conducting related research activities at the GA San Diego site over 1995-1999 under DOE contract number DE-ACO3-89ER51114. The proposed action is needed to advance magnetic fusion research for future-generation fusion devices such as ITER and TPX. It was determined that the proposed action is not a major action significantly affecting the quality of the human environment according to NEPA; therefore, a finding of no significant impact is made and an environmental impact statement is not required

  13. Environmental Assessment for the proposed modification and continued operation of the DIII-D facility

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-01

    The EA evaluates the proposed action of modifying the DIII-D fusion facility and conducting related research activities at the GA San Diego site over 1995-1999 under DOE contract number DE-ACO3-89ER51114. The proposed action is needed to advance magnetic fusion research for future-generation fusion devices such as ITER and TPX. It was determined that the proposed action is not a major action significantly affecting the quality of the human environment according to NEPA; therefore, a finding of no significant impact is made and an environmental impact statement is not required.

  14. Sedação com sufentanil e clonidina em pacientes submetidos a cateterismo cardíaco Sedación con sufentanil y clonidina en pacientes sometidos a cateterismo cardíaco Sedation with sufentanil and clonidine in patients undergoing heart catheterization

    Directory of Open Access Journals (Sweden)

    Anita Perpetua Carvalho Rocha

    2011-03-01

    Full Text Available BACKGROUND: Sedation for heart catheterization has been a subject of concern. Benzodiazepines, alpha-2 adrenergic agonists and opioids are used for this purpose; however, each of these drugs has advantages and disadvantages. OBJECTIVE: To evaluate the efficacy of sufentanil and clonidine as sedatives in patients undergoing heart catheterization, observing their impact on hemodynamic and respiratory parameters, the presence of side effects, and patient and interventional cardiologist satisfaction with the examination. METHODS: This was a prospective, double-blind, randomized and controlled clinical trial involving 60 patients who received 0.1 µg/kg of sufentanil or 0.5 µg/kg of clonidine before heart catheterization. The Ramsay sedation score, the need for midazolam, side effects, and hemodynamic and respiratory parameters were recorded, and the data were analyzed at six different time points. RESULTS: The behavior of blood pressure, heart rate and respiratory rate was similar in both groups; however, at time point 2, patients in the sufentanil group (Group S) had a lower Ramsay sedation score, and peripheral oxyhemoglobin saturation was lower than in the clonidine group (Group C) at time point 6. Group S patients had a higher incidence of postoperative nausea and vomiting than Group C patients. Patient satisfaction was higher in the clonidine group. The interventional cardiologists were satisfied in both groups. CONCLUSION: Sufentanil and clonidine were effective as sedatives in patients undergoing heart catheterization.

  15. A New Modified Firefly Algorithm

    Directory of Open Access Journals (Sweden)

    Medha Gupta

    2016-07-01

    Full Text Available Nature-inspired meta-heuristic algorithms study the emergent collective intelligence of groups of simple agents. The Firefly Algorithm is one such recent swarm-based metaheuristic algorithm, inspired by the flashing behavior of fireflies. The algorithm was first proposed in 2008 and has since been successfully used for solving various optimization problems. In this work, we propose a new modified version of the Firefly Algorithm (MoFA) and then compare its performance with the standard Firefly Algorithm along with various other meta-heuristic algorithms. Numerical studies and results demonstrate that the proposed algorithm is superior to existing algorithms.
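
    For reference, the standard firefly update that such modified versions start from moves each firefly toward every brighter one with an attractiveness that decays with distance, plus a small random step. The Python sketch below is that baseline on a toy sphere function; the coefficients, bounds and objective are assumptions, and the MoFA modifications themselves are not represented.

        import math
        import random

        def objective(x):
            return sum(xi ** 2 for xi in x)          # minimize a simple sphere function

        dim, n_fireflies, iters = 3, 15, 100
        alpha, beta0, gamma = 0.2, 1.0, 1.0          # randomness, base attractiveness, absorption

        X = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_fireflies)]
        for _ in range(iters):
            intensity = [objective(x) for x in X]    # lower cost = brighter firefly
            for i in range(n_fireflies):
                for j in range(n_fireflies):
                    if intensity[j] < intensity[i]:  # firefly i is attracted to brighter j
                        r2 = sum((a - b) ** 2 for a, b in zip(X[i], X[j]))
                        beta = beta0 * math.exp(-gamma * r2)
                        X[i] = [a + beta * (b - a) + alpha * (random.random() - 0.5)
                                for a, b in zip(X[i], X[j])]
                        intensity[i] = objective(X[i])

        best = min(X, key=objective)
        print(round(objective(best), 6))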

  16. Proposal for a fully decentralized blockchain and proof-of-work algorithm for solving NP-complete problems

    OpenAIRE

    Oliver, Carlos G.; Ricottone, Alessandro; Philippopoulos, Pericles

    2017-01-01

    We propose a proof-of-work algorithm that rewards blockchain miners for using computational resources to solve NP-complete puzzles. The resulting blockchain will publicly store and improve solutions to problems with real world applications while maintaining a secure and fully functional transaction ledger.

  17. A New Metaheuristic Algorithm for Long-Term Open-Pit Production Planning / Nowy meta-heurystyczny algorytm wspomagający długoterminowe planowanie produkcji w kopalni odkrywkowej

    Science.gov (United States)

    Sattarvand, Javad; Niemann-Delius, Christian

    2013-03-01

    This paper describes a new metaheuristic algorithm which has been developed based on Ant Colony Optimisation (ACO), and its efficiency is discussed. To apply the ACO process to the mine planning problem, a series of variables is considered for each block as the pheromone trails that represent the desirability of the block being the deepest point of the mine in that column for the given mining period. During implementation, several mine schedules are constructed in each iteration. The pheromone values of all blocks are then reduced by a certain percentage, and additionally the pheromone values of the blocks used in defining the constructed schedules are increased according to the quality of the generated solutions. Through repeated iterations, the pheromone values of the blocks that define the shape of the optimum solution are increased, whereas those of the others are significantly evaporated.

  18. Topology optimum design of compliant mechanisms using modified ant colony optimization

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Kwang Seon; Han, Seog Young [Hanyang University, Seoul (Korea, Republic of)

    2015-08-15

    A Modified ant colony optimization (MACO) algorithm was suggested for the topology optimal design of compliant mechanisms, since standard ACO cannot provide an appropriate optimal topology. In order to improve the computational efficiency and suitability of the standard ACO algorithm in topology optimization for compliant mechanisms, a continuous variable called the 'Element contribution significance (ECS)' is employed, which replaces the positions of ants in the standard ACO algorithm and assesses the importance of each element in the optimization process. The MACO algorithm was applied to topology optimizations of both linear and geometrically nonlinear compliant mechanisms using three kinds of objective functions, and the optimized topologies were compared with each other. From the comparisons, it was concluded that the MACO algorithm can effectively be applied to topology optimizations of linear and geometrically nonlinear compliant mechanisms, and that the ratio of Mutual potential energy (MPE) to Strain energy (SE) is the best type of objective function for the topology optimal design of compliant mechanisms.

  19. A cena constituinte da psicose maníaco-depressiva no Brasil The emergence of manic depressive psychosis as a diagnosis in Brazil

    Directory of Open Access Journals (Sweden)

    Joel Birman

    2010-12-01

    Full Text Available This essay examines the early twentieth-century interpretation of manic depressive psychosis in Brazil, during a moment when Brazilian psychiatry witnessed a theoretical shift from the French to German traditions. It calls special attention to how the problem of hysteria was replaced by manic depressive psychosis within this historical context.

  20. Meta-Heuristics in Short Scale Construction: Ant Colony Optimization and Genetic Algorithm.

    Science.gov (United States)

    Schroeders, Ulrich; Wilhelm, Oliver; Olaru, Gabriel

    2016-01-01

    The advent of large-scale assessment, but also the more frequent use of longitudinal and multivariate approaches to measurement in psychological, educational, and sociological research, caused an increased demand for psychometrically sound short scales. Shortening scales economizes on valuable administration time, but might result in inadequate measures because reducing an item set could: a) change the internal structure of the measure, b) result in poorer reliability and measurement precision, c) deliver measures that cannot effectively discriminate between persons on the intended ability spectrum, and d) reduce test-criterion relations. Different approaches to abbreviate measures fare differently with respect to the above-mentioned problems. Therefore, we compare the quality and efficiency of three item selection strategies to derive short scales from an existing long version: a Stepwise COnfirmatory Factor Analytical approach (SCOFA) that maximizes factor loadings and two metaheuristics, specifically an Ant Colony Optimization (ACO) with a tailored user-defined optimization function and a Genetic Algorithm (GA) with an unspecific cost-reduction function. SCOFA compiled short versions were highly reliable, but had poor validity. In contrast, both metaheuristics outperformed SCOFA and produced efficient and psychometrically sound short versions (unidimensional, reliable, sensitive, and valid). We discuss under which circumstances ACO and GA produce equivalent results and provide recommendations for conditions in which it is advisable to use a metaheuristic with an unspecific out-of-the-box optimization function.

  1. Autotransplante cardíaco: um novo método no tratamento de problemas cardíacos complexos Heart autotransplantation: a new technique to complex intracardiac reppairs

    Directory of Open Access Journals (Sweden)

    Randas J. V Batista

    1995-06-01

    Full Text Available From January 1990 to May 1995, 92 patients with complex heart disease and supraventricular arrhythmias, mainly atrial fibrillation (n=89), reentry (n=2) and long QT syndrome (n=1), were operated on using the heart autotransplantation technique. Females predominated (n=63). Age ranged from 18 to 76 years (mean=43). The concomitant defects were: giant left atrium (>6 cm measured by echocardiography) (n=65); giant right atrium (n=9); enlarged left atrium (4 cm) (n=23); mitral stenosis (n=46); mitral regurgitation (n=28); double mitral lesion (n=16); aortic stenosis (n=12); aortic regurgitation (n=5); tricuspid regurgitation (n=78); atrial thrombosis (n=23); atrial calcification (n=12); pulmonary hypertension (n=86); biventricular fibroelastosis (n=3); atrioventricular rupture (after mitral valve replacement) (n=1); aortic root aneurysm (n=1); partial ventriculectomy (n=8). Eighty-eight patients left the operating room in sinus rhythm and remained so; 6 required inotropic drugs and 3 required antiarrhythmic drugs. All patients with a giant left or right atrium and atrial fibrillation had their atria reduced to normal size. There was no operative mortality, and 6 died in hospital. At the six-month postoperative reassessment, the survivors were well and in sinus rhythm. The heart autotransplantation technique facilitates intracardiac repair, provides atrial reduction with the consequent return of the patient to sinus rhythm, and opens new perspectives.

  2. Modified Clipped LMS Algorithm

    Directory of Open Access Journals (Sweden)

    Lotfizad Mojtaba

    2005-01-01

    Full Text Available A new algorithm is proposed for updating the weights of an adaptive filter. The proposed algorithm is a modification of an existing method, namely the clipped LMS, and uses a three-level quantization scheme that involves threshold clipping of the input signals in the filter weight update formula. Mathematical analysis shows the convergence of the filter weights to the optimum Wiener filter weights. Also, it can be proved that the proposed modified clipped LMS (MCLMS) algorithm has better tracking than the LMS algorithm. In addition, this algorithm has reduced computational complexity relative to the unmodified one. By using a suitable threshold, it is possible to increase the tracking capability of the MCLMS algorithm compared to the LMS algorithm, but this causes slower convergence. Computer simulations confirm the mathematical analysis presented.
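
    The idea of clipping the input in the weight update is easy to sketch: replace the raw input vector in the LMS update with a three-level quantized version (+1, 0, -1 around a threshold). The Python toy below identifies a short FIR system this way; the step size, threshold and system are invented, and the update shown is the generic clipped-LMS idea rather than the authors' exact MCLMS rule.

        import random

        def clip3(x, threshold):
            """Three-level quantizer: +1, 0 or -1 depending on the threshold."""
            if x > threshold:
                return 1.0
            if x < -threshold:
                return -1.0
            return 0.0

        def identify_fir(h_true, mu=0.01, threshold=0.5, n_samples=5000):
            """Identify an unknown FIR system h_true from white Gaussian input (toy setup)."""
            taps = len(h_true)
            w = [0.0] * taps
            x_buf = [0.0] * taps
            for _ in range(n_samples):
                x_buf = [random.gauss(0, 1)] + x_buf[:-1]
                d = sum(h * x for h, x in zip(h_true, x_buf))      # desired signal
                y = sum(wi * x for wi, x in zip(w, x_buf))         # adaptive filter output
                e = d - y
                # the update uses the clipped input instead of the raw input
                w = [wi + mu * e * clip3(x, threshold) for wi, x in zip(w, x_buf)]
            return w

        print([round(wi, 2) for wi in identify_fir([0.8, -0.4, 0.2])])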

  3. The Orthogonally Partitioned EM Algorithm: Extending the EM Algorithm for Algorithmic Stability and Bias Correction Due to Imperfect Data.

    Science.gov (United States)

    Regier, Michael D; Moodie, Erica E M

    2016-05-01

    We propose an extension of the EM algorithm that exploits the common assumption of unique parameterization, corrects for biases due to missing data and measurement error, converges for the specified model when the standard implementation of the EM algorithm has a low probability of convergence, and reduces a potentially complex algorithm into a sequence of smaller, simpler, self-contained EM algorithms. We use the theory surrounding the EM algorithm to derive the theoretical results of our proposal, showing that an optimal solution over the parameter space is obtained. A simulation study is used to explore the finite sample properties of the proposed extension when there is missing data and measurement error. We observe that partitioning the EM algorithm into simpler steps may provide better bias reduction in the estimation of model parameters. The ability to break down a complicated problem into a series of simpler, more accessible problems will permit a broader implementation of the EM algorithm, permit the use of software packages that now implement and/or automate the EM algorithm, and make the EM algorithm more accessible to a wider and more general audience.

  4. Quantificação de tecido conjuntivo do músculo cardíaco de cães

    Directory of Open Access Journals (Sweden)

    Hildebrando Gomes Benedicto

    2003-01-01

    Full Text Available The aim of this work was to study the proportion of connective tissue in the right and left ventricular fractions of the canine heart muscle, seeking, through morphometry, data on the interrelation between connective tissue and cardiac muscle tissue, in order to understand the anatomical-functional relations of the cardiac structure, which are characteristic of certain processes linked to a decrease in the organ's work. Six hearts from mixed-breed dogs, male and female, aged between 48 and 150 months and weighing between 18 and 30 kg, with no cardiac alterations as confirmed by electrocardiographic and echocardiographic examinations, were used. Material from three ventricular regions relative to the base (proximal, middle and distal), from both the right and left sides, was prepared according to conventional histological techniques and stained with Picrosirius red, paraldehyde fuchsin and Gomori's trichrome to highlight the connective fibers. The slides were analyzed with a Zeiss Axioscope coupled to the Zeiss KS-400 image analysis program. The amount of connective tissue in the left ventricle ranged from 0.44 to 26.26%; in the right ventricle, from 0.97 to 21.18%; at the apex, from 1.32 to 29.24%; and in the interventricular septum, from 5.41 to 11.24%. The results show that there is a complex network of connective fibers surrounding the cardiac muscle fibers and that their quantity and arrangement vary widely depending on the region studied.

  5. Study on ant colony optimization for fuel loading pattern problem

    International Nuclear Information System (INIS)

    Kishi, Hironori; Kitada, Takanori

    2013-01-01

    Modified ant colony optimization (ACO) was applied to the in-core fuel loading pattern (LP) optimization problem to minimize the power peaking factor (PPF) in a modeled 1/4-symmetry PWR core. The loading order was found to be important in ACO. Three different loading orders, with and without the adjacency effect between fuel assemblies (FAs), were compared, and it was found that loading from the center of the core is preferable because many selections of FAs to be inserted are available in the core center region. LPs are determined from the pheromone trail and heuristic information, which is a priori knowledge based on the features of the problem. Three types of heuristic information were compared to obtain the desirable performance of searching for LPs with a low PPF. Moreover, a mutation operation, as in the genetic algorithm (GA), was introduced into the ACO algorithm to avoid searching similar LPs, because the heuristic information used in ACO tends to localize the search space in the LP problem. The performance of ACO with these improvements was compared with those of simulated annealing and the GA. In conclusion, good performance can be achieved by setting proper heuristic information and a proper mutation operation parameter in ACO. (author)
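
    Two of the ingredients mentioned above are simple to show in isolation: a transition rule that picks the next fuel assembly with probability proportional to pheromone raised to α times heuristic desirability raised to β, and a GA-style mutation that occasionally swaps two positions in a constructed loading pattern. The Python sketch below illustrates just those pieces; the pheromone values, heuristic values, exponents and mutation rate are invented, and no core physics or PPF evaluation is included.

        import random

        alpha, beta, p_mut = 1.0, 2.0, 0.1

        def choose_assembly(candidates, tau, eta):
            """p(a) proportional to tau[a]**alpha * eta[a]**beta over the remaining FAs."""
            weights = [tau[a] ** alpha * eta[a] ** beta for a in candidates]
            return random.choices(candidates, weights=weights)[0]

        def mutate(loading_pattern):
            """With small probability, swap two positions to avoid revisiting similar LPs."""
            lp = list(loading_pattern)
            if random.random() < p_mut:
                i, j = random.sample(range(len(lp)), 2)
                lp[i], lp[j] = lp[j], lp[i]
            return lp

        tau = {a: 1.0 for a in range(6)}                 # pheromone per fuel assembly
        eta = {a: 1.0 / (1 + a) for a in range(6)}       # heuristic desirability (made up)
        remaining, pattern = list(tau), []
        while remaining:
            a = choose_assembly(remaining, tau, eta)
            remaining.remove(a)
            pattern.append(a)
        print(mutate(pattern))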

  6. Algorithms to analyze the quality test parameter values of seafood in the proposed ontology based seafood quality analyzer and miner (ONTO SQAM model

    Directory of Open Access Journals (Sweden)

    Vinu Sherimon

    2017-07-01

    Full Text Available Ensuring the quality of food, particularly seafood, has increasingly become an important issue nowadays. Quality management systems empower any organization to identify, measure, control and improve the quality of the products manufactured, which will eventually lead to improved business performance. With the advent of new technologies, intelligent systems are now being developed. To ensure the quality of seafood, an ontology based seafood quality analyzer and miner (ONTO SQAM) model is proposed. The knowledge is represented using ontology, and the domain concepts are defined using ontology. This paper presents the initial part of the proposed model – the analysis of quality test parameter values. Two algorithms are proposed to perform the analysis – the Comparison Algorithm and the Data Store Updater algorithm. The algorithms ensure that the values of the various quality tests are in the acceptable range. Real data sets taken from different seafood companies in Kerala, India, and validated by the Marine Product Export Development Authority of India (MPEDA) are used for the experiments. The performance of the algorithms is evaluated using standard performance metrics such as precision, recall, and accuracy. The results obtained show that all three measures achieve good results.
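
    The two pieces described above are straightforward to outline: a comparison step that checks each measured quality-test value against its acceptable range, and the precision/recall/accuracy metrics used for evaluation. The sketch below is only an illustration; the parameter names and limits are invented stand-ins, not the MPEDA-validated ranges.

        # illustrative limits only: parameter -> (low, high)
        limits = {"histamine_ppm": (0, 50), "moisture_pct": (0, 80), "ph": (5.5, 7.0)}

        def within_limits(sample):
            """Return True if every measured parameter falls inside its acceptable range."""
            return all(lo <= sample[p] <= hi for p, (lo, hi) in limits.items())

        def precision_recall_accuracy(tp, fp, fn, tn):
            precision = tp / (tp + fp)
            recall = tp / (tp + fn)
            accuracy = (tp + tn) / (tp + fp + fn + tn)
            return precision, recall, accuracy

        print(within_limits({"histamine_ppm": 12, "moisture_pct": 76, "ph": 6.4}))
        print(precision_recall_accuracy(tp=42, fp=3, fn=5, tn=50))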

  7. Metaheuristic Algorithms Applied to Bioenergy Supply Chain Problems: Theory, Review, Challenges, and Future

    Directory of Open Access Journals (Sweden)

    Krystel K. Castillo-Villar

    2014-11-01

    Full Text Available Bioenergy is a new source of energy that accounts for a substantial portion of the renewable energy production in many countries. The production of bioenergy is expected to increase due to its unique advantages, such as no harmful emissions and abundance. Supply-related problems are the main obstacles precluding the increased use of biomass (which is bulky and has low energy density) to produce bioenergy. To overcome this challenge, large-scale optimization models need to be solved to enable decision makers to plan, design, and manage bioenergy supply chains. Therefore, the use of effective optimization approaches is of great importance. Traditional mathematical methods (such as linear, integer, and mixed-integer programming) frequently fail to find optimal solutions for non-convex and/or large-scale models, whereas metaheuristics are efficient approaches for finding near-optimal solutions that use fewer computational resources. This paper presents a comprehensive review by studying and analyzing the application of metaheuristics to solve bioenergy supply chain models as well as the exclusive challenges of the mathematical problems applied in the bioenergy supply chain field. The reviewed metaheuristics include: (1) population approaches, such as ant colony optimization (ACO), the genetic algorithm (GA), particle swarm optimization (PSO), and the bee colony algorithm (BCA); and (2) trajectory approaches, such as tabu search (TS) and simulated annealing (SA). Based on the outcomes of this literature review, the integrated design and planning of bioenergy supply chains problem has been solved primarily by implementing the GA. The production process optimization was addressed primarily by using both the GA and PSO. The supply chain network design problem was treated by utilizing the GA and ACO. The truck and task scheduling problem was solved using the SA and the TS, where the trajectory-based methods proved to outperform the population

  8. Update Strength in EDAs and ACO: How to Avoid Genetic Drift

    DEFF Research Database (Denmark)

    Sudholt, Dirk; Witt, Carsten

    2016-01-01

    , showing that the update strength should be limited to 1/K, ρ = O(1/(√n log n)). In fact, choosing 1/K, ρ ∼ 1/(√n log n), both algorithms efficiently optimize OneMax in expected time O(n log n). Our analyses provide new insights into the stochastic behavior of probabilistic model-building GAs and propose...

  9. Changes in health care spending and quality for Medicare beneficiaries associated with a commercial ACO contract.

    Science.gov (United States)

    McWilliams, J Michael; Landon, Bruce E; Chernew, Michael E

    2013-08-28

    In a multipayer system, new payment incentives implemented by one insurer for an accountable care organization (ACO) may also affect spending and quality of care for another insurer's enrollees served by the ACO. Such spillover effects reflect the extent of organizational efforts to reform care delivery and can contribute to the net impact of ACOs. We examined whether the Blue Cross Blue Shield (BCBS) of Massachusetts' Alternative Quality Contract (AQC), an early commercial ACO initiative associated with reduced spending and improved quality for BCBS enrollees, was also associated with changes in spending and quality for Medicare beneficiaries, who were not covered by the AQC. Quasi-experimental comparisons from 2007-2010 of elderly fee-for-service Medicare beneficiaries in Massachusetts (1,761,325 person-years) served by 11 provider organizations entering the AQC in 2009 or 2010 (intervention group) vs beneficiaries served by other providers (control group). Using a difference-in-differences approach, we estimated changes in spending and quality for the intervention group in the first and second years of exposure to the AQC relative to concurrent changes for the control group. Regression and propensity score methods were used to adjust for differences in sociodemographic and clinical characteristics. The primary outcome was total quarterly medical spending per beneficiary. Secondary outcomes included spending by setting and type of service, 5 process measures of quality, potentially avoidable hospitalizations, and 30-day readmissions. Before entering the AQC, total quarterly spending per beneficiary for the intervention group was $150 (95% CI, $25-$274) higher than for the control group and increased at a similar rate. In year 2 of the intervention group's exposure to the AQC, this difference was reduced to $51 (95% CI, -$109 to $210; P = .53), constituting a significant differential change of -$99 (95% CI, -$183 to -$16; P = .02) or a 3.4% savings

  10. A Stochastic Inversion Method for Potential Field Data: Ant Colony Optimization

    Science.gov (United States)

    Liu, Shuang; Hu, Xiangyun; Liu, Tianyou

    2014-07-01

    Simulating the foraging behavior of natural ants, the ant colony optimization (ACO) algorithm performs excellently in combinatorial optimization problems, for example the traveling salesman problem and the quadratic assignment problem. However, ACO is seldom used to invert gravitational and magnetic data. On the basis of a continuous and multi-dimensional objective function for potential field data inversion, we present the node partition strategy ACO (NP-ACO) algorithm for the inversion of model variables of fixed shape and the recovery of physical property distributions of complicated shape models. We divide the continuous variables into discrete nodes, and ants directionally tour the nodes by use of transition probabilities. We update the pheromone trails by use of a Gaussian mapping between the objective function value and the quantity of pheromone. This allows the search results to be analyzed in real time and promotes the rate of convergence and the precision of inversion. Traditional mappings, including the ant-cycle system, weaken the differences between ant individuals and lead to premature convergence. We tested our method on synthetic data and real data from scenarios involving gravity and magnetic anomalies. The inverted model variables and recovered physical property distributions were in good agreement with the true values. The ACO algorithm for binary representation imaging and full imaging can recover sharper physical property distributions than traditional linear inversion methods. The ACO has good optimization capability and some excellent characteristics, for example robustness, parallel implementation, and portability, compared with other stochastic metaheuristics.
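
    The Gaussian mapping mentioned above can be pictured as a kernel that converts a candidate model's data misfit into a pheromone deposit: the closer the misfit is to the current best, the larger the deposit. The Python sketch below shows only that mapping; the misfit values, width sigma and maximum deposit are assumptions, not the paper's settings.

        import math

        def pheromone_deposit(misfit, best_misfit, sigma=0.5, q_max=1.0):
            """Map an objective (misfit) value to a deposit in (0, q_max] via a Gaussian kernel."""
            return q_max * math.exp(-((misfit - best_misfit) ** 2) / (2.0 * sigma ** 2))

        best = 0.10
        for m in (0.10, 0.35, 1.20):
            print(m, round(pheromone_deposit(m, best), 4))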

  11. Avaliação da doença vascular do enxerto no transplante cardíaco: experiência de um centro brasileiro

    Directory of Open Access Journals (Sweden)

    Elide Sbardellotto Mariano da Costa

    2012-10-01

    Full Text Available BACKGROUND: Heart transplantation remains the treatment of choice for heart failure refractory to optimized treatment. Two diagnostic methods show high sensitivity in diagnosing episodes of graft rejection and graft vascular disease (GVD), important causes of post-transplant mortality. OBJECTIVE: To evaluate the relationship between intracoronary ultrasound (IVUS) results and endomyocardial biopsy (BX) reports in the follow-up of patients undergoing heart transplantation at a Brazilian referral center. METHODS: A retrospective observational epidemiological study was carried out with patients who underwent orthotopic heart transplantation between 2000 and 2009. The medical records of these patients, the results of the IVUS and BX performed routinely during post-transplant clinical follow-up, and the therapy in use were analyzed. RESULTS: Of the 77 patients analyzed, 63.63% were male, aged 22 to 69 years. Regarding the IVUS results, 33.96% were classified as Stanford class I and 32.08% as Stanford IV. Of the 143 biopsy reports, 51.08% had a 1R result, 0.69% a 3R result, and 14.48% described a Quilty effect. All patients used antiproliferative agents, 80.51% used calcineurin inhibitors, and 19.48% used proliferation signal inhibitors (PSIs). CONCLUSION: The evaluation of heart transplant patients by means of IVUS provides detailed information for the early and sensitive diagnosis of GVD, which is complemented by the histological information provided by the BX, establishing a possible causal relationship between GVD and episodes of humoral rejection.

  12. Immunosuppression induced by thalidomide and cyclosporine in heterotopic heart transplantation in rabbits

    Directory of Open Access Journals (Sweden)

    João Batista Vieira de Carvalho

    Full Text Available OBJECTIVE: Thalidomide, owing to its anti-inflammatory and immunosuppressive effects, has been used to treat dermatological diseases and graft-versus-host disease after bone marrow transplantation. The aim of this study was to evaluate its action as an immunosuppressant in organ transplantation, studying its effect alone or combined with cyclosporine in preventing rejection of heterotopic cardiac allografts in rabbits. METHODS: Fifty rabbits were used, 25 donors and 25 recipients. The recipient animals were divided into five groups (n = 5): Group I (control, non-immunosuppressed animals); Group II (immunosuppressed with cyclosporine at 10 mg/kg/day); Group III (immunosuppressed with thalidomide at 100 mg/kg/day); Group IV (immunosuppressed with cyclosporine at 5.0 mg/kg/day); and Group V (immunosuppressed with cyclosporine at 5.0 mg/kg/day combined with thalidomide at 50 mg/kg/day). The drugs were administered through an orogastric catheter starting on the day before transplantation. RESULTS: The donor heart was implanted in the recipients' abdomen. The combination of thalidomide and cyclosporine produced the lowest histopathological rejection score (p < 0.05). Thalidomide used alone or combined with cyclosporine was effective against rejection, increasing the survival (p < 0.01) of animals undergoing heterotopic heart transplantation in the abdominal position. CONCLUSIONS: Thalidomide, used alone or combined with cyclosporine, may represent an immunosuppressive option in experimental heterotopic heart transplantation in rabbits.

  13. A sequential fuzzy diagnosis method for rotating machinery using ant colony optimization and possibility theory

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Hao; Ping, Xueliang; Cao, Yi; Lie, Ke [Jiangnan University, Wuxi (China); Chen, Peng [Mie University, Mie (Japan); Wang, Huaqing [Beijing University, Beijing (China)

    2014-04-15

    This study proposes a novel intelligent fault diagnosis method for rotating machinery using ant colony optimization (ACO) and possibility theory. Non-dimensional symptom parameters (NSPs) in the frequency domain are defined to reflect the features of the vibration signals measured in each state. A sensitive evaluation method for selecting good symptom parameters using principal component analysis (PCA) is proposed for detecting and distinguishing faults in rotating machinery. Using an ACO clustering algorithm, the synthesizing symptom parameters (SSPs) for condition diagnosis are obtained. A fuzzy diagnosis method using sequential inference and possibility theory is also proposed, by which the conditions of the machinery can be identified sequentially. Lastly, the proposed method is compared with a conventional neural network (NN) method. Practical examples of diagnosis for V-belt driving equipment used in a centrifugal fan are provided to verify the effectiveness of the proposed method. The results verify that faults that often occur in V-belt driving equipment, such as a pulley defect state, a belt defect state and a belt looseness state, are effectively identified by the proposed method, while these faults are difficult to detect using the conventional NN.

  14. Management of dyslipidemia in heart transplant patients. Findings on new risk factors

    OpenAIRE

    Walter Masson; Norberto Vulcano; Sandra Fernández; Alberto Domenech; Ricardo Marenchino; Daniel Bracco; César Belziti

    2008-01-01

    In heart transplant patients, the development of coronary vascular disease is a frequent complication, and dyslipidemia is one of its most important predictors. Immunosuppressants predispose patients to dyslipidemia and make the use of lipid-lowering drugs difficult. In this particular group of patients, reaching the therapeutic targets of secondary prevention is recommended. Statins are the lipid-lowering drugs of choice. There are no clear recommendations on new ...

  15. Management of dyslipidemia in heart transplant patients: Findings on new risk factors

    OpenAIRE

    Masson, Walter; Vulcano, Norberto; Fernández, Sandra; Domenech, Alberto; Marenchino, Ricardo; Bracco, Daniel; Belziti, César

    2008-01-01

    In heart transplant patients, the development of coronary vascular disease is a frequent complication, and dyslipidemia is one of its most important predictors. Immunosuppressants predispose patients to dyslipidemia and make the use of lipid-lowering drugs difficult. In this particular group of patients, reaching the therapeutic targets of secondary prevention is recommended. Statins are the lipid-lowering drugs of choice. There are no clear recommendations on new ...

  16. Decreased cardiac output: a systematic review of the defining characteristics

    Directory of Open Access Journals (Sweden)

    Vanessa de Souza

    2011-01-01

    Full Text Available OBJECTIVES: To characterize the scientific articles related to the nursing diagnosis of decreased cardiac output, and to examine the articles that describe the behavior of the defining characteristics of this diagnosis, identifying those that occur most frequently. METHODS: A systematic review conducted in the Lilacs, SciELO, Embase, Medline, Pubmed and Cochrane databases, covering the period from 1985 to 2008. RESULTS: Thirteen articles were selected, identifying 50 defining characteristics, ten of which were most frequent: altered heart rate/rhythm, dyspnea, blood pressure lability, crackles, oliguria/anuria, edema, cold skin, fatigue/weakness, decreased peripheral pulses, and decreased peripheral perfusion. CONCLUSION: The topic has been little explored. The findings highlight the importance of physical examination, the use of less invasive techniques, and the need to revisit the proposed defining characteristics in order to provide clarity and objectivity in identifying this nursing diagnosis.

  17. Analysis and Improvement of Fireworks Algorithm

    Directory of Open Access Journals (Sweden)

    Xi-Guang Li

    2017-02-01

    Full Text Available The Fireworks Algorithm is a recently developed swarm intelligence algorithm that simulates the explosion process of fireworks. Based on an analysis of each operator of the Fireworks Algorithm (FWA), this paper improves the FWA and proves that the improved algorithm converges to the global optimal solution with probability 1. To further boost performance and achieve global optimization, the proposed algorithm mainly includes the following strategies. First, the population is initialized using opposition-based learning. Second, a new explosion amplitude mechanism for the optimal firework is proposed. In addition, adaptive t-distribution mutation is applied to non-optimal individuals and elite opposition-based learning to the optimal individual. Finally, a new selection strategy, namely Disruptive Selection, is proposed to reduce the running time of the algorithm compared with FWA. In our simulation, we apply the CEC2013 standard functions and compare the proposed algorithm (IFWA) with SPSO2011, FWA, EFWA and dynFWA. The results show that the proposed algorithm has better overall performance on the test functions.
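
    As an illustration of the first strategy only, a minimal sketch of opposition-based learning initialization is given below: a random population and its opposite population are both evaluated, and the best individuals are kept. The sphere objective, bounds and population size are arbitrary choices, not taken from the paper.

      import numpy as np

      def obl_init(objective, bounds, pop_size=20, seed=0):
          """Opposition-based learning initialization: sample a random population,
          build its opposite population, and keep the best pop_size individuals."""
          rng = np.random.default_rng(seed)
          lo = np.array([b[0] for b in bounds], dtype=float)
          hi = np.array([b[1] for b in bounds], dtype=float)
          pop = lo + (hi - lo) * rng.random((pop_size, len(bounds)))
          opposite = lo + hi - pop                      # opposite points
          both = np.vstack([pop, opposite])
          fitness = np.array([objective(x) for x in both])
          keep = np.argsort(fitness)[:pop_size]         # minimization
          return both[keep], fitness[keep]

      # Example on a sphere function.
      fireworks, fit = obl_init(lambda x: float(np.sum(x ** 2)),
                                bounds=[(-10, 10)] * 5)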

  18. Traumatic subarachnoid pleural fistula in children: case report, algorithm and classification proposal

    Directory of Open Access Journals (Sweden)

    Moscote-Salazar Luis Rafael

    2016-06-01

    Full Text Available Subarachnoid pleural fistulas are rare. They have been described as complications of thoracic surgery, penetrating injuries and spinal surgery, among others. We present the case of a 3-year-old girl who suffered spinal cord trauma in a car accident and subsequently developed a subarachnoid pleural fistula. To our knowledge this is the first reported case of a pediatric patient with a subarachnoid pleural fistula resulting from closed trauma and requiring intensive multimodal management. We also present a management algorithm and a proposed classification. The diagnosis of this pathology is difficult when it is not associated with neurological deficit. A high degree of suspicion, multidisciplinary management and timely surgical intervention allow optimal management.

  19. Automatic boiling water reactor loading pattern design using ant colony optimization algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Wang, C.-D. [Department of Engineering and System Science, National Tsing Hua University, 101, Section 2 Kuang Fu Road, Hsinchu 30013, Taiwan (China); Nuclear Engineering Division, Institute of Nuclear Energy Research, No. 1000, Wenhua Rd., Jiaan Village, Longtan Township, Taoyuan County 32546, Taiwan (China)], E-mail: jdwang@iner.gov.tw; Lin Chaung [Department of Engineering and System Science, National Tsing Hua University, 101, Section 2 Kuang Fu Road, Hsinchu 30013, Taiwan (China)

    2009-08-15

    An automatic boiling water reactor (BWR) loading pattern (LP) design methodology was developed using the rank-based ant system (RAS), a variant of the ant colony optimization (ACO) algorithm. To reduce design complexity, only the fuel assemblies (FAs) of one-eighth of the core positions were determined using the RAS algorithm, and the corresponding FAs were then loaded into the other parts of the core. Heuristic information was adopted to exclude the selection of inappropriate FAs, which reduces the search space and thus the computation time. Once an LP was determined, the Haling cycle length, the beginning-of-cycle (BOC) shutdown margin (SDM), and the Haling end-of-cycle (EOC) maximum fraction of limit for critical power ratio (MFLCPR) were calculated with the SIMULATE-3 code and used to evaluate the LP when updating the RAS pheromone. The developed design methodology was demonstrated using FAs of a reference cycle of the BWR6 nuclear power plant. The results show that the designed LP can be obtained within a reasonable computation time and has a longer cycle length than the original design.
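
    The rank-based ant system mentioned above differs from the basic ant system mainly in its pheromone update, in which only the best-ranked ants of the iteration and the best-so-far solution deposit pheromone, weighted by rank. A generic sketch of that update rule is shown below; it is not coupled to SIMULATE-3 or to any loading-pattern encoding, and the quality measure and constants are illustrative.

      import numpy as np

      def ras_pheromone_update(tau, solutions, qualities, best_solution, best_quality,
                               rho=0.1, w=6):
          """Rank-based ant system update sketch: only the (w-1) best ants of the
          iteration and the best-so-far solution deposit pheromone, weighted by rank.
          `solutions[a]` is a list of (position, item) choices made by ant a;
          `qualities` are scores where higher is better (e.g. cycle length)."""
          tau *= (1.0 - rho)                                  # evaporation
          order = np.argsort(qualities)[::-1][: w - 1]        # best (w-1) ants
          for rank, a in enumerate(order, start=1):
              for pos, item in solutions[a]:
                  tau[pos, item] += (w - rank) * qualities[a]
          for pos, item in best_solution:                     # best-so-far reinforcement
              tau[pos, item] += w * best_quality
          return tau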

  20. Dynamic Vehicle Routing Problems with Enhanced Ant Colony Optimization

    Directory of Open Access Journals (Sweden)

    Haitao Xu

    2018-01-01

    Full Text Available As we all know, there are a great number of optimization problems in the world. One of the relatively complicated, high-level problems is the vehicle routing problem (VRP). The dynamic vehicle routing problem (DVRP) is a major variant of the VRP, and it is closer to real logistic scenes. In DVRP, customers' demands appear over time, and the unserved customer points must be updated and rearranged while the planned paths are being carried out. Owing to the complexity and significance of the problem, DVRP applications have attracted the attention of researchers in the past two decades. In this paper, we make two main contributions to solving DVRP. Firstly, DVRP is solved with enhanced Ant Colony Optimization (E-ACO), which is traditional Ant Colony Optimization (ACO) fused with an improved K-means and a crossover operation. K-means divides the region with the most reasonable distances, while the crossover in ACO is applied to extend the search space and avoid falling into local optima prematurely. Secondly, several new evaluation benchmarks are proposed, which can objectively and comprehensively assess the proposed method. In the experiments, the results for problems of different scales are compared with those of previously published papers. The experimental results show that the algorithm is feasible and efficient.
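
    A minimal sketch of the front end of this idea is given below: K-means groups the customers by region, and each group is then routed. A nearest-neighbour tour stands in for the ACO tour construction and crossover operation of the paper, and all coordinates are synthetic.

      import numpy as np
      from sklearn.cluster import KMeans

      def group_and_route(customers, depot, n_vehicles, seed=0):
          """K-means groups customers by region; each group is then given a simple
          nearest-neighbour tour (a stand-in for the ACO tour construction)."""
          km = KMeans(n_clusters=n_vehicles, n_init=10, random_state=seed).fit(customers)
          routes = []
          for k in range(n_vehicles):
              pts = customers[km.labels_ == k]
              route, current = [], depot
              remaining = list(range(len(pts)))
              while remaining:
                  nxt = min(remaining, key=lambda i: np.linalg.norm(pts[i] - current))
                  route.append(pts[nxt])
                  current = pts[nxt]
                  remaining.remove(nxt)
              routes.append(np.array(route))
          return routes

      customers = np.random.default_rng(1).random((30, 2)) * 100
      routes = group_and_route(customers, depot=np.array([50.0, 50.0]), n_vehicles=3)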

  1. Improved autonomous star identification algorithm

    International Nuclear Information System (INIS)

    Luo Li-Yan; Xu Lu-Ping; Zhang Hua; Sun Jing-Rong

    2015-01-01

    The log–polar transform (LPT) is introduced into star identification because of its rotation invariance. An improved autonomous star identification algorithm is proposed in this paper to avoid the circular shift of the feature vector and to reduce the time consumed by star identification algorithms that use the LPT. In the proposed algorithm, the star pattern of the same navigation star remains unchanged when the stellar image is rotated, which reduces the star identification time. The logarithmic values of the in-plane distances between the navigation star and its neighbor stars are adopted to structure the feature vector of the navigation star, which enhances the robustness of star identification. In addition, the algorithm is designed to find the identification result with fewer comparisons, instead of searching the whole feature database. The simulation results demonstrate that the proposed algorithm can effectively accelerate star identification. Moreover, the recognition rate and robustness of the proposed algorithm are better than those of the LPT algorithm and the modified grid algorithm. (paper)

  2. Cardiac tamponade secondary to diffuse sclerosing papillary thyroid carcinoma

    Directory of Open Access Journals (Sweden)

    Verónica Riva

    2011-12-01

    Full Text Available The diffuse sclerosing variant of papillary carcinoma accounts for 2% of all papillary thyroid carcinomas. It is characterized by diffuse, bilateral involvement of the thyroid gland. Clinically it presents with lymph node and pulmonary metastases, predominantly affecting young women. We describe a case of cardiac tamponade as the initial presentation of a diffuse sclerosing variant of papillary thyroid carcinoma. A 32-year-old woman presented to the emergency department reporting epigastric pain and dry cough. Physical examination revealed arterial hypotension, tachycardia and diminished heart sounds. An echocardiogram showed pericardial effusion. Pericardiocentesis yielded pericardial fluid whose analysis revealed neoplastic cells. During follow-up the patient presented recurrence of the pericardial effusion, so a pleuropericardial window was created; during surgery a subpleural nodular lesion was detected, which was biopsied and later reported as a metastasis of papillary carcinoma consistent with a thyroid origin. Total thyroidectomy with bilateral cervical lymphadenectomy was performed. The final diagnosis was papillary carcinoma, diffuse sclerosing variant. This variant infiltrates the connective tissue of the interfollicular spaces, mimicking thyroiditis, and is characterized by early vascular permeation. In contrast to the classic variant, the diffuse sclerosing variant is more aggressive and has a higher recurrence rate. Papillary thyroid carcinoma should be kept in mind in our setting as a differential diagnosis for all metastatic papillary neoplastic lesions, especially in young women.

  3. Verification-Based Interval-Passing Algorithm for Compressed Sensing

    OpenAIRE

    Wu, Xiaofu; Yang, Zhen

    2013-01-01

    We propose a verification-based Interval-Passing (IP) algorithm for the iterative reconstruction of nonnegative sparse signals using parity-check matrices of low-density parity-check (LDPC) codes as measurement matrices. The proposed algorithm can be considered an improved IP algorithm that further incorporates the mechanism of the verification algorithm. It is proved that the proposed algorithm always performs better than either the IP algorithm or the verification algorithm. Simulation resul...

  4. Quality of life of patients that had a heart transplant: application of Whoqol-Bref scale

    Directory of Open Access Journals (Sweden)

    Maria Isis Freire de Aguiar

    2011-01-01

    Full Text Available BACKGROUND: The success of heart transplantation means ensuring the survival of patients with heart disease and allowing them to carry out their daily activities. Heart transplantation is the first-choice treatment for heart failure, providing increased survival and quality of life for transplant recipients. OBJECTIVE: To evaluate the quality of life of patients who underwent heart transplantation through the application of a standardized scale (Whoqol-Bref). METHODS: A descriptive exploratory study with a quantitative approach, conducted with 55 patients who underwent heart transplantation between the third and the 103rd post-transplant month and were followed at the Transplant and Heart Failure Unit of a referral cardiology hospital in the city of Fortaleza, CE, Brazil. Data were collected from February to April 2009 using a questionnaire standardized by the World Health Organization and data from the medical records. RESULTS: Regarding the physical domain, 62.8% of male and 58.3% of female patients were satisfied. In the psychological domain, 65.1% of male patients were satisfied with their quality of life, as were 58.3% of female patients. In the social relations domain, 53.5% of male patients were very satisfied, and the satisfaction level reached 100% among female patients. In the environmental domain, 65.1% of male patients and 83.3% of female patients were satisfied. CONCLUSION: Heart transplantation had a considerable influence on the quality of life of transplanted patients, as the post-transplant results were statistically significant.

  5. The global Minmax k-means algorithm.

    Science.gov (United States)

    Wang, Xiaoyan; Bai, Yanping

    2016-01-01

    The global k-means algorithm is an incremental approach to clustering that dynamically adds one cluster center at a time through a deterministic global search procedure from suitable initial positions, and employs k-means to minimize the sum of the intra-cluster variances. However, the global k-means algorithm sometimes produces singleton clusters, and its initial positions are sometimes poor; after a bad initialization, the k-means algorithm easily converges to a poor local optimum. In this paper, we first modify the global k-means algorithm to eliminate singleton clusters, and then apply the MinMax k-means clustering error criterion to the global k-means algorithm to overcome the effect of bad initialization, yielding the proposed global MinMax k-means algorithm. The proposed clustering method is tested on several popular data sets and compared with the k-means algorithm, the global k-means algorithm and the MinMax k-means algorithm. The experimental results show that our proposed algorithm outperforms the other algorithms considered in the paper.
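
    For reference, a small sketch of the underlying incremental (global) k-means procedure is shown below: one centre is added at a time, every data point is tried as the new centre, and the best configuration is kept. The MinMax error criterion and the singleton-cluster handling proposed in the paper are deliberately omitted.

      import numpy as np
      from sklearn.cluster import KMeans

      def global_kmeans(X, k_max):
          """Incremental (global) k-means sketch: add one centre at a time and, for
          each candidate initial position, run k-means; keep the best configuration."""
          centers = X.mean(axis=0, keepdims=True)          # 1-means solution
          for k in range(2, k_max + 1):
              best_inertia, best_centers = np.inf, None
              for x in X:                                  # try every point as the new centre
                  init = np.vstack([centers, x])
                  km = KMeans(n_clusters=k, init=init, n_init=1).fit(X)
                  if km.inertia_ < best_inertia:
                      best_inertia, best_centers = km.inertia_, km.cluster_centers_
              centers = best_centers
          return centers

      X = np.random.default_rng(0).random((200, 2))
      centers = global_kmeans(X, k_max=4)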

  6. Gradient Evolution-based Support Vector Machine Algorithm for Classification

    Science.gov (United States)

    Zulvia, Ferani E.; Kuo, R. J.

    2018-03-01

    This paper proposes a classification algorithm based on the support vector machine (SVM) and the gradient evolution (GE) algorithm. The SVM algorithm has been widely used in classification; however, its results are significantly influenced by its parameters. Therefore, this paper proposes an improvement of the SVM algorithm that can find the best SVM parameters automatically. The proposed algorithm employs a GE algorithm to automatically determine the SVM parameters: the GE algorithm acts as a global optimizer that finds the best parameters, which are then used by the SVM algorithm. The proposed GE-SVM algorithm is verified on several benchmark datasets and compared with other metaheuristic-based SVM algorithms. The experimental results show that the proposed GE-SVM algorithm obtains better results than the other algorithms tested in this paper.
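
    The general tuning loop can be sketched as follows: a small population of (log C, log gamma) pairs is evolved, with cross-validated accuracy as the fitness. A plain mutate-and-select rule stands in for the gradient-evolution operators of the paper, and the Iris data set is used only as a convenient example.

      import numpy as np
      from sklearn.datasets import load_iris
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import SVC

      def evolve_svm_params(X, y, pop_size=10, n_gen=15, seed=0):
          """Metaheuristic SVM tuning sketch: evolve log10(C) and log10(gamma),
          using 5-fold cross-validated accuracy as the fitness."""
          rng = np.random.default_rng(seed)
          pop = rng.uniform([-2, -4], [3, 1], size=(pop_size, 2))   # log10(C), log10(gamma)
          def fitness(p):
              return cross_val_score(SVC(C=10 ** p[0], gamma=10 ** p[1]), X, y, cv=5).mean()
          scores = np.array([fitness(p) for p in pop])
          for _ in range(n_gen):
              children = pop + rng.normal(scale=0.3, size=pop.shape)  # mutation step
              child_scores = np.array([fitness(c) for c in children])
              improved = child_scores > scores                        # greedy selection
              pop[improved], scores[improved] = children[improved], child_scores[improved]
          best = pop[scores.argmax()]
          return 10 ** best[0], 10 ** best[1], scores.max()

      X, y = load_iris(return_X_y=True)
      C, gamma, acc = evolve_svm_params(X, y)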

  7. Composite Differential Search Algorithm

    Directory of Open Access Journals (Sweden)

    Bo Liu

    2014-01-01

    Full Text Available The differential search algorithm (DS) is a relatively new evolutionary algorithm inspired by the Brownian-like random-walk movement used by organisms to migrate. It has been verified to be more effective than ABC, JDE, JADE, SADE, EPSDE, GSA, PSO2011, and CMA-ES. In this paper, we propose four improved solution search schemes, namely “DS/rand/1,” “DS/rand/2,” “DS/current to rand/1,” and “DS/current to rand/2,” to explore new regions of the search space and enhance the convergence rate for global optimization problems. To verify the performance of the different solution search methods, 23 benchmark functions are employed. Experimental results indicate that the proposed schemes perform better than, or at least comparably to, the original algorithm in terms of the quality of the solutions obtained. However, these schemes still cannot achieve the best solution for all functions. To further enhance the convergence rate and the diversity of the algorithm, a composite differential search algorithm (CDS) is proposed in this paper. This new algorithm combines three of the proposed search schemes, “DS/rand/1,” “DS/rand/2,” and “DS/current to rand/1,” with three control parameters, using a random method to generate the offspring. Experimental results show that CDS has a faster convergence rate and better search ability on the 23 benchmark functions.

  8. Cardiac beta-receptors in experimental Chagas' disease

    Directory of Open Access Journals (Sweden)

    Julio E. Enders

    1995-02-01

    Full Text Available Experimental Chagas' disease (45 to 90 days post-infection showed serious cardiac alterations in the contractility and in the pharmacological response to beta adrenergic receptors in normal and T. cruzi infected mice (post-acute phase. Chagasic infection did not change the beta receptors density (78.591 ± 3.125 fmol/mg protein and 73.647 ± 2.194 fmol/mg protein for controls but their affinity was significantly diminished (Kd = 7.299 ± 0.426 nM and Kd = 3.759 ± 0.212 nM for the control p Estudaram-se os receptores beta cardíacos de camundongos infectados pelo Trypanosoma cruzi na fase pós-aguda da doença de Chagas para estabelecer em que medida os mesmos contribuem a gerar respostas anômalas às catecolaminas observadas nestes miocardios. Utilizara-se 3-H/DHA para a marcação dos receptores beta cardíacos dos camundongos normais e dos infectados na fase pós-aguda (45 a 90 dias pós-infecção. O número dos sítios de fixação foi similar nos dois grupos, 78.591 ± 3.125 fmol/mg. Proteína nos chagásicos e 73.647 ± 2.194 fmol/mg. Proteína no grupo controle. Em vez disso, a afinidade verificou-se significativamente diminuida no grupo chagásico (Kd = 7.299 ± 0.426 nM respeito do controle (Kd = 3.759 ± 0.212 nM p < 0.001. Os resultados obtidos demonstram que as modificações observadas na estimulação adrenérgica do miocárdio chagásico se correlacionam com a menor afinidade dos receptores beta cardíacos e que estas alterações exerceriam uma parte determinante para as consequências funcionais que são detectadas na fase crônica.

  9. Assessment of heart murmurs in childhood

    Directory of Open Access Journals (Sweden)

    Maria Elisabeth B.A. Kobinger

    2003-06-01

    Full Text Available OBJECTIVE: to discuss the clinical and laboratory evaluation of heart murmurs in children, an important and frequent problem faced by general pediatricians in outpatient practice. SOURCES OF DATA: this review was based on a critical analysis of the current literature, as well as pediatrics and pediatric cardiology textbooks containing basic information on the subject. SUMMARY OF THE FINDINGS: it is important for pediatricians to know how to obtain a precise history and to perform a thorough physical examination of the cardiovascular system of a child with a heart murmur. The diagnosis of an innocent heart murmur is essentially clinical, and this evaluation can help the pediatrician identify situations associated with congenital or acquired heart disease and the need for referral to a specialist. CONCLUSIONS: the general pediatrician is usually the first physician to detect a heart murmur and should be able to recognize an innocent murmur, as well as to raise an early suspicion of cardiovascular disease.

  10. Multiple-algorithm parallel fusion of infrared polarization and intensity images based on algorithmic complementarity and synergy

    Science.gov (United States)

    Zhang, Lei; Yang, Fengbao; Ji, Linna; Lv, Sheng

    2018-01-01

    Diverse image fusion methods perform differently; each method has advantages and disadvantages compared with the others. One notion is that the advantages of the different fusion methods can be effectively combined. A multiple-algorithm parallel fusion method based on algorithmic complementarity and synergy is proposed. First, in view of the characteristics of the different algorithms and the difference-features among images, an index-vector-based feature similarity is proposed to define the degree of complementarity and synergy; this index vector is a reliable evidence indicator for algorithm selection. Second, the algorithms with a high degree of complementarity and synergy are selected. Then, the degrees of the various features and the infrared intensity images are used as the initial weights for nonnegative matrix factorization (NMF), which avoids the randomness of the NMF initialization parameters. Finally, the fused images of the different algorithms are integrated using NMF because of its excellent performance at fusing independent features. Experimental results demonstrate that the visual effect and objective evaluation indices of the fused images obtained using the proposed method are better than those obtained using traditional methods. The proposed method retains the advantages of the individual fusion algorithms.
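
    The step of seeding NMF with non-random initial factors can be sketched with scikit-learn's custom initialization, as below. The matrices W0 and H0 are random placeholders here; in the paper they would carry the feature-degree weights derived from the index vector.

      import numpy as np
      from sklearn.decomposition import NMF

      # Stacked (flattened) fused images from the individual algorithms, one per row.
      rng = np.random.default_rng(0)
      V = np.abs(rng.random((4, 64 * 64)))          # placeholder for 4 fusion results

      r = 2
      W0 = np.abs(rng.random((V.shape[0], r)))      # would hold the feature-degree weights
      H0 = np.abs(rng.random((r, V.shape[1])))

      model = NMF(n_components=r, init='custom', max_iter=500)
      W = model.fit_transform(V, W=W0, H=H0)        # custom init avoids random restarts
      H = model.components_
      fused = (W @ H).mean(axis=0)                  # naive recombination as one fused image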

  11. Heart transplantation in patients with Chagas disease: a single-center experience

    Directory of Open Access Journals (Sweden)

    Alfredo Inácio Fiorelli

    2011-07-01

    Conclusions: Heart transplantation is currently the only effective treatment for end-stage Chagas disease. Reactivation of the disease is a real problem that is easily reversed by introducing specific drug therapy, which restores the histological pattern of the myocardium without sequelae. Immunosuppression, especially with corticosteroids, predisposes patients to the development of neoplasms and to reactivation of the disease, so early discontinuation or dose reduction requires special attention.

  12. Lymphatic malformations: a proposed management algorithm.

    LENUS (Irish Health Repository)

    Oosthuizen, J C

    2012-02-01

    OBJECTIVE: The aim of this study was to develop a management algorithm for cervicofacial lymphatic malformations, based on the authors' experience in managing these lesions as well as the current literature on the subject. STUDY DESIGN AND METHODS: A retrospective medical record review of all the patients treated for lymphatic malformations at our institution during a 10-year period (1998-2008) was performed. DATA COLLECTED: age at diagnosis, location and type of lesion, radiologic investigation performed, presenting symptoms, treatment modality used, complications and results achieved. RESULTS: 14 patients were identified, eight (57%) male and six (43%) female, with an equal distribution between the left and right sides. The majority (71%) of cases were diagnosed within the first year of life. The majority of lesions were located in the suprahyoid region. The predominant reason for referral was an asymptomatic mass in 7 cases (50%), followed by airway compromise (36%) and dysphagia (14%). Management options employed included observation, OK-432 injection, surgical excision and laser therapy; in 5 cases (36%) a combination of these was used. CONCLUSION: Historically, surgical excision has been the management option of choice for lymphatic malformations. However, due to the associated morbidity and high complication rate, this is increasingly being questioned, and recent advances in sclerotherapy, e.g. OK-432 injection, have shown significant promise. Based on their experience in managing these lesions as well as the current literature, the authors of this paper have developed an algorithm for the management of cervicofacial lymphatic malformations.

  13. Proposed algorithm to improve job shop production scheduling using ant colony optimization method

    Science.gov (United States)

    Pakpahan, Eka KA; Kristina, Sonna; Setiawan, Ari

    2017-12-01

    This paper deals with the determination of job shop production schedules in an automated environment. In this environment, machines and the material handling system are integrated and controlled by a computer center where schedules are created and then used to dictate the movement of parts and the operations at each machine. This setting is usually designed to run an unmanned production process for a specified interval of time. We consider parts with various operation requirements, each operation requiring specific cutting tools. These parts are to be scheduled on machines of identical capability, meaning that each machine is equipped with a similar set of cutting tools and is therefore capable of processing any operation. The availability of a particular machine to process a particular operation is determined by the remaining lifetime of its cutting tools. We propose an algorithm based on the ant colony optimization method, implemented in Matlab, to generate production schedules which minimize the total processing time of the parts (makespan). We tested the algorithm on data provided by a real industry, and the process shows a very short computation time. This contributes greatly to the flexibility and timeliness targeted in an automated environment.

  14. Distributed k-Means Algorithm and Fuzzy c-Means Algorithm for Sensor Networks Based on Multiagent Consensus Theory.

    Science.gov (United States)

    Qin, Jiahu; Fu, Weiming; Gao, Huijun; Zheng, Wei Xing

    2016-03-03

    This paper is concerned with developing a distributed k-means algorithm and a distributed fuzzy c-means algorithm for wireless sensor networks (WSNs) where each node is equipped with sensors. The underlying topology of the WSN is supposed to be strongly connected. The consensus algorithm in multiagent consensus theory is utilized to exchange the measurement information of the sensors in WSN. To obtain a faster convergence speed as well as a higher possibility of having the global optimum, a distributed k-means++ algorithm is first proposed to find the initial centroids before executing the distributed k-means algorithm and the distributed fuzzy c-means algorithm. The proposed distributed k-means algorithm is capable of partitioning the data observed by the nodes into measure-dependent groups which have small in-group and large out-group distances, while the proposed distributed fuzzy c-means algorithm is capable of partitioning the data observed by the nodes into different measure-dependent groups with degrees of membership values ranging from 0 to 1. Simulation results show that the proposed distributed algorithms can achieve almost the same results as that given by the centralized clustering algorithms.
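
    The communication primitive behind such distributed clustering is consensus averaging, sketched below: each node repeatedly moves toward its neighbours' values and, on a strongly connected topology, all nodes converge to the network-wide average, which is how the nodes can agree on shared centroid statistics. The ring topology, step size and measurements are illustrative only.

      import numpy as np

      def consensus_average(values, neighbors, n_rounds=50, eps=0.1):
          """Distributed averaging sketch: each node nudges its value toward the
          values of its neighbours; on a connected graph all nodes converge to
          the network-wide average of the initial values."""
          x = np.array(values, dtype=float)
          for _ in range(n_rounds):
              new_x = x.copy()
              for i, nbrs in neighbors.items():
                  new_x[i] = x[i] + eps * sum(x[j] - x[i] for j in nbrs)
              x = new_x
          return x

      # 4 sensor nodes on a ring, each starting from its local measurement.
      neighbors = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [2, 0]}
      print(consensus_average([1.0, 5.0, 3.0, 7.0], neighbors))  # -> all close to 4.0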

  15. Proposed parameters for a circular particle accelerator for proton beam therapy obtained by genetic algorithm

    International Nuclear Information System (INIS)

    Campos, Gustavo L.; Campos, Tarcísio P.R.

    2017-01-01

    This paper presents an optimized proposal for a circular particle accelerator for proton beam therapy (named ACPT). The methodology applies computational metaheuristics based on genetic algorithms (GA) to obtain optimized parameters for the equipment. Some fundamental concepts of the metaheuristics, developed in Matlab® software, are presented. Four parameters were considered in the proposed model of the equipment: potential difference, magnetic field, and the length and radius of the resonant cavity. As a result, this article presents optimized parameters, obtained through GA-based metaheuristics, for two ACPTs, one intended for ocular radiation therapy and one that will allow teletherapy, named ACPT - 65 and ACPT - 250, respectively. (author)

  16. Proposed parameters for a circular particle accelerator for proton beam therapy obtained by genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Campos, Gustavo L.; Campos, Tarcísio P.R., E-mail: gustavo.lobato@ifmg.edu.br, E-mail: tprcampos@pq.cnpq.br [Universidade Federal de Minas Gerais (UFMG), Belo Horizonte, MG (Brazil). Departamento de Engenharia Nuclear

    2017-07-01

    This paper presents an optimized proposal for a circular particle accelerator for proton beam therapy (named ACPT). The methodology applies computational metaheuristics based on genetic algorithms (GA) to obtain optimized parameters for the equipment. Some fundamental concepts of the metaheuristics, developed in Matlab® software, are presented. Four parameters were considered in the proposed model of the equipment: potential difference, magnetic field, and the length and radius of the resonant cavity. As a result, this article presents optimized parameters, obtained through GA-based metaheuristics, for two ACPTs, one intended for ocular radiation therapy and one that will allow teletherapy, named ACPT - 65 and ACPT - 250, respectively. (author)

  17. Normalization based K means Clustering Algorithm

    OpenAIRE

    Virmani, Deepali; Taneja, Shweta; Malhotra, Geetika

    2015-01-01

    K-means is an effective clustering technique used to separate similar data into groups based on the initial centroids of the clusters. In this paper, a Normalization-based K-means clustering algorithm (N-K means) is proposed. The proposed N-K means clustering algorithm applies normalization to the available data prior to clustering, and calculates the initial centroids based on weights. Experimental results demonstrate the improvement of the proposed N-K means clustering algorithm over existing...
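
    A minimal sketch of the normalization step is shown below: min-max scaling keeps a large-range feature from dominating the Euclidean distance before k-means is applied. The weight-based initial-centroid rule of the paper is not reproduced; scikit-learn's default initialization is used instead, and the synthetic data are illustrative.

      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.preprocessing import MinMaxScaler

      rng = np.random.default_rng(0)
      # Two features on very different scales.
      X = np.hstack([rng.normal(0, 1, (100, 1)), rng.normal(0, 1000, (100, 1))])

      # Min-max normalization before clustering.
      X_norm = MinMaxScaler().fit_transform(X)
      labels = KMeans(n_clusters=3, n_init=10).fit_predict(X_norm)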

  18. Unsupervised Classification Using Immune Algorithm

    OpenAIRE

    Al-Muallim, M. T.; El-Kouatly, R.

    2012-01-01

    An unsupervised classification algorithm based on the clonal selection principle, named Unsupervised Clonal Selection Classification (UCSC), is proposed in this paper. The new proposed algorithm is data driven and self-adaptive; it adjusts its parameters to the data to make the classification operation as fast as possible. The performance of UCSC is evaluated by comparing it with the well-known K-means algorithm using several artificial and real-life data sets. The experiments show that the proposed U...

  19. A Hybrid Chaotic Quantum Evolutionary Algorithm

    DEFF Research Database (Denmark)

    Cai, Y.; Zhang, M.; Cai, H.

    2010-01-01

    A hybrid chaotic quantum evolutionary algorithm is proposed to reduce the amount of computation, speed up convergence and restrain premature convergence of the quantum evolutionary algorithm. The proposed algorithm adopts a chaotic initialization method to generate the initial population which will form a pe...... tests. The presented algorithm is applied to urban traffic signal timing optimization and the effect is satisfactory.

  20. Multimodal optimization by using hybrid of artificial bee colony algorithm and BFGS algorithm

    Science.gov (United States)

    Anam, S.

    2017-10-01

    Optimization has become one of the important fields in mathematics. Many problems in engineering and science can be formulated as optimization problems, and they may have many local optima. For such multimodal optimization problems, the challenge is to find the global solution. Several metaheuristic methods have been proposed to solve multimodal optimization problems, such as Particle Swarm Optimization (PSO), the Genetic Algorithm (GA) and the Artificial Bee Colony (ABC) algorithm. The performance of the ABC algorithm is better than or similar to that of other population-based algorithms, with the advantage of employing fewer control parameters. The ABC algorithm also has the advantages of strong robustness, fast convergence and high flexibility. However, it suffers from premature convergence in the later search period, and the accuracy of the optimal value sometimes cannot meet the requirements. The Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm is a good iterative method for finding a local optimum and compares favorably with other local optimization methods. Based on the advantages of the ABC algorithm and the BFGS algorithm, this paper proposes a hybrid of the artificial bee colony algorithm and the BFGS algorithm to solve multimodal optimization problems. In the first step, the ABC algorithm is run to find a point; in the second step, this point is used as the initial point of the BFGS algorithm. The results show that the hybrid method overcomes the problems of the basic ABC algorithm for almost all test functions. However, if the shape of the function is flat, the proposed method does not work well.
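
    The two-stage structure can be sketched as follows: a population-based global search provides a coarse solution, which then seeds a BFGS local refinement. SciPy's differential evolution stands in here for the ABC stage, and the Rastrigin function is an arbitrary multimodal test case; only the hybrid structure, not the authors' ABC implementation, is illustrated.

      import numpy as np
      from scipy.optimize import differential_evolution, minimize

      def rastrigin(x):
          x = np.asarray(x)
          return 10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

      bounds = [(-5.12, 5.12)] * 2

      # Stage 1: population-based global search (a stand-in for the ABC algorithm).
      coarse = differential_evolution(rastrigin, bounds, maxiter=50, seed=0)

      # Stage 2: the coarse solution seeds a BFGS local refinement.
      refined = minimize(rastrigin, coarse.x, method='BFGS')
      print(coarse.x, refined.x, refined.fun)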

  1. Artifact removal algorithms for stroke detection using a multistatic MIST beamforming algorithm.

    Science.gov (United States)

    Ricci, E; Di Domenico, S; Cianca, E; Rossi, T

    2015-01-01

    Microwave imaging (MWI) has recently been shown to be a promising imaging modality for low-complexity, low-cost and fast brain imaging tools, which could play a fundamental role in efficiently managing emergencies related to stroke and hemorrhage. This paper focuses on the UWB radar imaging approach and in particular on the processing algorithms for the backscattered signals. Assuming the use of the multistatic version of the MIST (Microwave Imaging Space-Time) beamforming algorithm, developed by Hagness et al. for the early detection of breast cancer, the paper proposes and compares two artifact removal algorithms. Artifact removal is an essential step in any UWB radar imaging system, and the artifact removal algorithms considered so far have been shown not to be effective in the specific scenario of brain imaging. First, the paper proposes modifications of a known artifact removal algorithm; these modifications are shown to be effective in achieving good localization accuracy and fewer false positives. The main contribution, however, is the proposal of an artifact removal algorithm based on statistical methods, which achieves even better performance with much lower computational complexity.

  2. Named Entity Linking Algorithm

    Directory of Open Access Journals (Sweden)

    M. F. Panteleev

    2017-01-01

    Full Text Available In natural language processing, Named Entity Linking (NEL) is the task of identifying an entity mentioned in a text and linking it to an entity in a knowledge base (for example, DBpedia). There is currently a diversity of approaches to this problem, but two main classes can be identified: graph-based approaches and machine-learning-based ones. An algorithm combining graph and machine learning approaches is proposed here, based on stated assumptions about the interrelations of named entities within a sentence and in the text as a whole. In graph-based approaches, it is necessary to identify an optimal set of related entities according to some metric that characterizes the distance between entities in a graph built on a knowledge base; because of limited processing power, solving this task directly is impossible, so a modification is proposed. A solution based solely on machine learning cannot be built because of the small volume of training data relevant to the NEL task, but machine learning can still contribute to improving the quality of the algorithm. An adaptation of the Latent Dirichlet Allocation model is proposed in order to obtain a measure of the compatibility of the attributes of the various entities encountered in a single context. The efficiency of the proposed algorithm was tested experimentally on an independently generated test dataset, on which the performance of a mock-up based on the proposed algorithm was compared with the open-source product DBpedia Spotlight, which solves the NEL problem. The mock-up was slower than DBpedia Spotlight but showed higher accuracy, which indicates that this direction of work is promising. The main directions of further development are proposed in order to increase the accuracy and performance of the system.

  3. A comparative analysis of three metaheuristic methods applied to fuzzy cognitive maps learning

    Directory of Open Access Journals (Sweden)

    Bruno A. Angélico

    2013-12-01

    Full Text Available This work analyses the performance of three different population-based metaheuristic approaches applied to fuzzy cognitive map (FCM) learning in the qualitative control of processes. Fuzzy cognitive maps make it possible to include prior specialist knowledge in the control rule. In particular, Particle Swarm Optimization (PSO), a Genetic Algorithm (GA) and an Ant Colony Optimization (ACO) are considered for obtaining appropriate weight matrices for learning the FCM. A statistical convergence analysis based on 10,000 simulations of each algorithm is presented. In order to validate the proposed approach, two industrial control process problems previously described in the literature are considered in this work.

  4. A Combination of Meta-heuristic and Heuristic Algorithms for the VRP, OVRP and VRP with Simultaneous Pickup and Delivery

    Directory of Open Access Journals (Sweden)

    Maryam Ashouri

    2017-07-01

    Full Text Available The vehicle routing problem (VRP) is an NP-hard combinatorial optimization problem in which customers are served from central depots by a given fleet of vehicles that return to the depots they started from. Two of the most important extensions of the VRP are the open vehicle routing problem (OVRP) and the VRP with simultaneous pickup and delivery (VRPSPD). In the OVRP the vehicles do not return to the depot after the last visit, and in the VRPSPD customers require simultaneous delivery and pick-up service. The aim of this paper is to present a combined effective ant colony optimization (CEACO), which includes the sweep algorithm and several local search algorithms and differs from common ant colony optimization (ACO). An extensive numerical experiment is performed on benchmark problem instances addressed in the literature. The computational results show that the suggested CEACO approach not only presents very satisfying scalability but is also competitive with other metaheuristic algorithms in the literature for solving VRP, OVRP and VRPSPD problems. Keywords: Meta-heuristic algorithms, Vehicle Routing Problem, Open Vehicle Routing Problem, Simultaneously Pickup and Delivery, Ant Colony Optimization.

  5. Real-Coded Quantum-Inspired Genetic Algorithm-Based BP Neural Network Algorithm

    Directory of Open Access Journals (Sweden)

    Jianyong Liu

    2015-01-01

    Full Text Available A method in which the real-coded quantum-inspired genetic algorithm (RQGA) is used to optimize the weights and thresholds of a BP neural network is proposed to overcome the defect that the gradient descent method easily makes the algorithm fall into a local optimum during learning. The quantum genetic algorithm (QGA) has good directional global optimization ability, but the conventional QGA is based on binary coding, and the coding and decoding processes reduce the speed of calculation. Therefore, RQGA is introduced to explore the search space, and an improved variable learning rate is adopted to train the BP neural network. Simulation tests show that the proposed algorithm converges rapidly to a solution that satisfies the constraint conditions.

  6. Efficient distribution of toy products using ant colony optimization algorithm

    Science.gov (United States)

    Hidayat, S.; Nurpraja, C. A.

    2017-12-01

    CV Atham Toys (CVAT) produces wooden toys and furniture and comprises 13 small and medium industries. CVAT always attempts to deliver customer orders on time, but delivery costs are high. This is because of inadequate infrastructure: delivery routes are long, vehicle maintenance costs are high, and the government fuel subsidy is still temporary. This study seeks to minimize the cost of product distribution based on the shortest route, using one of five Ant Colony Optimization (ACO) algorithms to solve the Vehicle Routing Problem (VRP), and concludes that the best of the five is the Ant Colony System (ACS) algorithm. The best route in the 1st week gave a total distance of 124.11 km at a cost of Rp 66,703.75. The 2nd week route gave a total distance of 132.27 km at a cost of Rp 71,095.13. The 3rd week best route gave a total distance of 122.70 km at a cost of Rp 65,951.25, while the 4th week gave a total distance of 132.27 km at a cost of Rp 74,083.63. Prior to this study there had been no effort to calculate these figures.

  7. Simultaneous repair of congenital heart defect and pectus excavatum

    Directory of Open Access Journals (Sweden)

    João Roberto Breda

    2007-09-01

    Full Text Available The authors describe the simultaneous treatment of pectus excavatum and a congenital intracardiac defect, an ostium secundum atrial septal defect. An 8-year-old boy with a clinical and echocardiographic diagnosis of atrial septal defect associated with a pectus excavatum chest wall deformity was referred for surgery with simultaneous correction of the intracardiac defect and repair of the pectus. Simultaneous surgical treatment of pectus excavatum and congenital intracardiac defects makes access to the heart difficult. In this case, the simultaneous correction of both abnormalities was performed with a satisfactory result, especially cosmetic, for the patient.

  8. A new hybrid evolutionary algorithm based on new fuzzy adaptive PSO and NM algorithms for Distribution Feeder Reconfiguration

    International Nuclear Information System (INIS)

    Niknam, Taher; Azadfarsani, Ehsan; Jabbari, Masoud

    2012-01-01

    Highlights: ► Network reconfiguration is a very important way to save electrical energy. ► This paper proposes a new algorithm to solve the DFR. ► The algorithm combines NFAPSO with NM. ► The proposed algorithm is tested on two distribution test feeders. - Abstract: Network reconfiguration for loss reduction in distribution systems is a very important way to save electrical energy. This paper proposes a new hybrid evolutionary algorithm to solve the Distribution Feeder Reconfiguration (DFR) problem. The algorithm is based on the combination of a New Fuzzy Adaptive Particle Swarm Optimization (NFAPSO) and the Nelder–Mead simplex search method (NM), called NFAPSO–NM. In the proposed algorithm, the new fuzzy adaptive particle swarm optimization consists of two parts: the first is Fuzzy Adaptive Binary Particle Swarm Optimization (FABPSO), which determines the status of the tie switches (open or closed), and the second is Fuzzy Adaptive Discrete Particle Swarm Optimization (FADPSO), which determines the sectionalizing switch numbers. On the other hand, because the results of the binary PSO (BPSO) and discrete PSO (DPSO) algorithms depend strongly on the values of their parameters, such as the inertia weight and learning factors, a fuzzy system is employed to adaptively adjust the parameters during the search process. Moreover, the Nelder–Mead simplex search method is combined with the NFAPSO algorithm to improve its performance. Finally, the proposed algorithm is tested on two distribution test feeders. The simulation results show that the proposed method is very powerful and reliably obtains the global optimum.

  9. Optimal PID Controller Design Using Adaptive VURPSO Algorithm

    Science.gov (United States)

    Zirkohi, Majid Moradi

    2015-04-01

    The purpose of this paper is to improve the Velocity Update Relaxation Particle Swarm Optimization (VURPSO) algorithm. The improved algorithm is called the Adaptive VURPSO (AVURPSO) algorithm. An optimal design of a Proportional-Integral-Derivative (PID) controller is then obtained using the AVURPSO algorithm. An adaptive momentum factor is used to regulate the trade-off between the global and local exploration abilities in the proposed algorithm; this helps the system reach the optimal solution quickly and saves computation time. Comparisons on the optimal PID controller design confirm the superiority of the AVURPSO algorithm over the optimization algorithms mentioned in this paper, namely the VURPSO algorithm, the Ant Colony algorithm, and the conventional approach. Comparisons on the speed of convergence confirm that the proposed algorithm converges faster, and in less computation time, to a global optimum. The proposed AVURPSO can be used in diverse areas of optimization such as industrial planning, resource allocation, scheduling, decision making, pattern recognition and machine learning. The proposed AVURPSO algorithm is efficiently used to design an optimal PID controller.
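
    A plain-PSO sketch of the PID-tuning setup is given below: each particle is a (Kp, Ki, Kd) triple and the fitness is the integral squared error of the unit-step response of a simple first-order plant. The adaptive momentum factor of AVURPSO is not reproduced (a fixed inertia weight is used), and the plant, bounds and constants are illustrative.

      import numpy as np

      def step_response_cost(gains, t_end=10.0, dt=0.01, tau=1.0):
          """ISE cost of a unit-step response for a first-order plant
          dy/dt = (u - y)/tau under PID control (Euler integration)."""
          kp, ki, kd = gains
          y, integ, prev_err, cost = 0.0, 0.0, 1.0, 0.0
          for _ in range(int(t_end / dt)):
              err = 1.0 - y
              if abs(err) > 1e6:                 # unstable gains: return a large penalty
                  return 1e12
              integ += err * dt
              deriv = (err - prev_err) / dt
              u = kp * err + ki * integ + kd * deriv
              y += dt * (u - y) / tau
              cost += err ** 2 * dt
              prev_err = err
          return cost

      def pso_tune_pid(n_particles=20, n_iter=60, seed=0,
                       w=0.7, c1=1.5, c2=1.5, lo=0.0, hi=10.0):
          """Plain PSO over (Kp, Ki, Kd) with a fixed inertia weight w."""
          rng = np.random.default_rng(seed)
          x = rng.uniform(lo, hi, (n_particles, 3))
          v = np.zeros_like(x)
          pbest, pbest_f = x.copy(), np.array([step_response_cost(p) for p in x])
          gbest = pbest[pbest_f.argmin()].copy()
          for _ in range(n_iter):
              r1, r2 = rng.random(x.shape), rng.random(x.shape)
              v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
              x = np.clip(x + v, lo, hi)
              f = np.array([step_response_cost(p) for p in x])
              better = f < pbest_f
              pbest[better], pbest_f[better] = x[better], f[better]
              gbest = pbest[pbest_f.argmin()].copy()
          return gbest, pbest_f.min()

      gains, cost = pso_tune_pid()
      print("Kp, Ki, Kd =", gains, "ISE =", cost)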

  10. An Adaptive Filtering Algorithm Based on Genetic Algorithm-Backpropagation Network

    Directory of Open Access Journals (Sweden)

    Kai Hu

    2013-01-01

    Full Text Available A new image filtering algorithm is proposed. The GA-BPN algorithm uses a genetic algorithm (GA) to decide the weights in a back-propagation neural network (BPN), and has better global optimization characteristics than traditional optimization algorithms. In this paper, we apply GA-BPN to image noise filtering. First, training samples are used to train the GA-BPN as a noise detector. Then, the well-trained GA-BPN is used to recognize noise pixels in the target image. Finally, an adaptive weighted average algorithm is used to restore the noise pixels recognized by the GA-BPN. Experimental data show that this algorithm performs better than other filters.

  11. Recursive automatic classification algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Bauman, E V; Dorofeyuk, A A

    1982-03-01

    A variational statement of the automatic classification problem is given. The dependence of the form of the optimal partition surface on the form of the classification objective functional is investigated. A recursive algorithm is proposed for maximising a functional of reasonably general form. The convergence problem is analysed in connection with the proposed algorithm. 8 references.

  12. Modified Decoding Algorithm of LLR-SPA

    Directory of Open Access Journals (Sweden)

    Zhongxun Wang

    2014-09-01

    Full Text Available In wireless sensor networks, energy consumption mainly occurs in the information transmission stage. Low-density parity-check (LDPC) codes can make full use of the channel information to save energy. Starting from the widely used decoding algorithm for LDPC codes, this paper proposes a new decoding algorithm based on the LLR-SPA (Sum-Product Algorithm in the log-likelihood domain) to improve decoding accuracy. In the modified algorithm, a piecewise linear function is used to approximate the complicated Jacobi correction term of the LLR-SPA decoding algorithm: the tangent lines at chosen tangency points of the Jacobi correction term, obtained from its first-order Taylor series, form the pieces of the approximation. In this way, the proposed piecewise linear approximation offers an almost perfect match to the Jacobi correction term. Meanwhile, the approximation avoids logarithmic operations, which makes it more suitable for practical applications. The simulation results show that the proposed algorithm improves the decoding accuracy considerably without a noticeable change in computational complexity.
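
    The quantity being approximated is the Jacobi correction term ln(1 + e^{-|x|}). A sketch of a tangent-line (first-order Taylor) piecewise approximation is shown below; the tangency points are an illustrative choice, not the ones optimized in the paper.

      import numpy as np

      def jacobi_exact(x):
          """Jacobi correction term used in LLR-SPA: ln(1 + exp(-|x|))."""
          return np.log1p(np.exp(-np.abs(x)))

      # Tangency points for the linear pieces (illustrative, not the paper's choice).
      TANGENT_POINTS = np.array([0.25, 1.0, 2.5])

      def jacobi_piecewise(x):
          """Max of the tangent lines f(t) + f'(t)(x - t), clipped at 0.
          Since f is convex for x >= 0, the tangents lie below f, so this
          approximates the correction term from below without any logarithms
          at run time (the coefficients can be precomputed)."""
          x = np.abs(np.asarray(x, dtype=float))
          f = np.log1p(np.exp(-TANGENT_POINTS))
          df = -np.exp(-TANGENT_POINTS) / (1.0 + np.exp(-TANGENT_POINTS))
          lines = f[:, None] + df[:, None] * (x[None, :] - TANGENT_POINTS[:, None])
          return np.clip(lines.max(axis=0), 0.0, None)

      xs = np.linspace(0.0, 6.0, 61)
      print(np.max(np.abs(jacobi_exact(xs) - jacobi_piecewise(xs))))  # approximation error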

  13. Efficient scheduling request algorithm for opportunistic wireless access

    KAUST Repository

    Nam, Haewoon

    2011-08-01

    An efficient scheduling request algorithm for opportunistic wireless access based on user grouping is proposed in this paper. Similar to the well-known opportunistic splitting algorithm, the proposed algorithm initially adjusts (or lowers) the threshold during a guard period if no user sends a scheduling request. However, if multiple users make requests simultaneously and a collision therefore occurs, the proposed algorithm no longer updates the threshold but narrows down the user search space by splitting the users into multiple groups iteratively, whereas the opportunistic splitting algorithm keeps adjusting the threshold until a single user is found. Since the threshold is only updated when no user sends a request, it is shown that the proposed algorithm significantly alleviates the signaling burden of distributing the threshold to the users by the scheduler. More importantly, the proposed algorithm requires fewer mini-slots to make a user selection for a given scheduling outage probability. © 2011 IEEE.

  14. Iterative group splitting algorithm for opportunistic scheduling systems

    KAUST Repository

    Nam, Haewoon

    2014-05-01

    An efficient feedback algorithm for opportunistic scheduling systems based on iterative group splitting is proposed in this paper. Similar to the opportunistic splitting algorithm, the proposed algorithm adjusts (or lowers) the feedback threshold during a guard period if no user sends a feedback. However, when a feedback collision occurs at any point in time, the proposed algorithm no longer updates the threshold but narrows down the user search space by dividing the users into multiple groups iteratively, whereas the opportunistic splitting algorithm keeps adjusting the threshold until a single user is found. Since the threshold is only updated when no user sends a feedback, it is shown that the proposed algorithm significantly alleviates the signaling overhead for distributing the threshold to the users by the scheduler. More importantly, the proposed algorithm requires fewer mini-slots than the opportunistic splitting algorithm to make a user selection at a given level of scheduling outage probability, or provides a higher ergodic capacity for a given number of mini-slots. © 2013 IEEE.

  15. Cardiac tumors: 10 years of experience

    Directory of Open Access Journals (Sweden)

    Antônio Augusto MIANA

    1997-01-01

    Full Text Available Purpose: To review the surgical experience of our group in the treatment of cardiac tumors during the 10-year period from January 1985 to December 1994. Material and Methods: Out of a total of 2268 cardiac operations with extracorporeal circulation performed in this 10-year period, 6 were resections of intracavitary tumors, an incidence of 0.26%. Of these, 3 were myxomas of the left atrium, 1 a papillary fibroelastoma of the mitral valve, 1 a rhabdomyoma of the left ventricle and 1 a fibroma of the left ventricle. The clinical presentation was systemic embolism (3 cases) or heart failure (3 cases), and the diagnosis was made by echocardiography (5 cases) and angiography (6 cases). Results: All patients had a favorable immediate outcome except one, with a left atrial myxoma, who developed mediastinitis and septicemia and died, the only hospital death. Late postoperative follow-up, obtained in 4 patients (15 to 111 months, mean 49 ± 36.8 months), detected no recurrence, with all patients in NYHA functional class I. Conclusions: The authors conclude that cardiac tumors are quite rare, easy to diagnose once considered, benign in the great majority of cases, and have a favorable course after surgical resection.

  16. Relative Pose Estimation Algorithm with Gyroscope Sensor

    Directory of Open Access Journals (Sweden)

    Shanshan Wei

    2016-01-01

    Full Text Available This paper proposes a novel vision and inertial fusion algorithm, S2fM (Simplified Structure from Motion), for camera relative pose estimation. Unlike existing algorithms, it estimates the rotation and translation parameters separately: S2fM employs gyroscopes to estimate the camera rotation parameter, which is then fused with the image data to estimate the camera translation parameter. Our contributions are twofold. (1) Since no inertial sensor can estimate the translation parameter accurately enough, we propose a translation estimation algorithm that fuses gyroscope measurements with image data. (2) The S2fM algorithm is efficient and suitable for smart devices. Experimental results validate the efficiency of the proposed S2fM algorithm.

  17. Iterative group splitting algorithm for opportunistic scheduling systems

    KAUST Repository

    Nam, Haewoon; Alouini, Mohamed-Slim

    2014-01-01

    An efficient feedback algorithm for opportunistic scheduling systems based on iterative group splitting is proposed in this paper. Similar to the opportunistic splitting algorithm, the proposed algorithm adjusts (or lowers) the feedback threshold during a guard period if no user sends a feedback.

  18. Efficient scheduling request algorithm for opportunistic wireless access

    KAUST Repository

    Nam, Haewoon; Alouini, Mohamed-Slim

    2011-01-01

    An efficient scheduling request algorithm for opportunistic wireless access based on user grouping is proposed in this paper. Similar to the well-known opportunistic splitting algorithm, the proposed algorithm initially adjusts (or lowers) the threshold during a guard period if no user sends a scheduling request.

  19. The application of ant colony optimization in the solution of 3D traveling salesman problem on a sphere

    Directory of Open Access Journals (Sweden)

    Hüseyin Eldem

    2017-08-01

    Full Text Available The Traveling Salesman Problem (TSP) is a combinatorial optimization problem in which a salesperson must visit all cities at minimum cost (shortest route) and return to the starting city (node). Today, many optimization algorithms, chiefly metaheuristics, are used to find minimum-cost solutions to this problem. In this study, one of these metaheuristic methods, Ant Colony Optimization (ACO) in its Max-Min Ant System (MMAS) form, was used to solve the non-Euclidean TSP on sets of points randomly located on the surface of a sphere. Seven point sets with different numbers of points were used. The performance of the MMAS method on the non-Euclidean TSP was demonstrated through different experiments, and the results produced by ACO were compared with the Discrete Cuckoo Search algorithm (DCS) and the Genetic Algorithm (GA) from the literature. The experiments for TSP on a sphere show that ACO's average results were better than GA's average results, and the best results of ACO also outperformed DCS.
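
    Since the problem is the non-Euclidean TSP on a sphere, the cost matrix is built from great-circle rather than straight-line distances; the sketch below prepares such a matrix, which any ACO variant (e.g. MMAS) could then minimise over. The point generation and the unit radius are illustrative assumptions.

    import numpy as np

    def great_circle_matrix(lat, lon, radius=1.0):
        """Pairwise great-circle distances used as the TSP-on-a-sphere cost matrix.

        lat, lon -- arrays of latitudes/longitudes in radians.
        """
        lat = np.asarray(lat)[:, None]
        lon = np.asarray(lon)[:, None]
        # central angle via the spherical law of cosines
        cos_angle = (np.sin(lat) * np.sin(lat.T)
                     + np.cos(lat) * np.cos(lat.T) * np.cos(lon - lon.T))
        return radius * np.arccos(np.clip(cos_angle, -1.0, 1.0))

    # Example: 50 points distributed uniformly on a unit sphere
    rng = np.random.default_rng(0)
    lat = np.arcsin(rng.uniform(-1, 1, 50))
    lon = rng.uniform(0, 2 * np.pi, 50)
    D = great_circle_matrix(lat, lon)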

  20. Bit Loading Algorithms for Cooperative OFDM Systems

    Directory of Open Access Journals (Sweden)

    Gui Bo

    2008-01-01

    Full Text Available Abstract We investigate the resource allocation problem for an OFDM cooperative network with a single source-destination pair and multiple relays. Assuming knowledge of the instantaneous channel gains for all links in the entire network, we propose several bit and power allocation schemes aiming at minimizing the total transmission power under a target rate constraint. First, an optimal and efficient bit loading algorithm is proposed when the relay node uses the same subchannel to relay the information transmitted by the source node. To further improve the performance gain, subchannel permutation, in which the subchannels are reallocated at relay nodes, is considered. An optimal subchannel permutation algorithm is first proposed and then an efficient suboptimal algorithm is considered to achieve a better complexity-performance tradeoff. A distributed bit loading algorithm is also proposed for ad hoc networks. Simulation results show that significant performance gains can be achieved by the proposed bit loading algorithms, especially when subchannel permutation is employed.
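
    As a minimal sketch of the greedy bit-loading idea that this line of work builds on (not the authors' cooperative, relay-aware algorithm), the code below repeatedly adds one bit to the subchannel whose incremental transmit power is smallest until the target rate is met; the power model, gains and bit cap are assumptions.

    import heapq

    def greedy_bit_loading(channel_gains, target_bits, noise=1.0, max_bits_per_sub=10):
        """Greedy bit loading: always give the next bit to the cheapest subchannel."""
        def delta_power(gain, b):
            # extra power needed to go from b to b+1 bits (assumed (2**b - 1)/gain power model)
            return noise * (2 ** b) / gain

        bits = [0] * len(channel_gains)
        heap = [(delta_power(g, 0), i) for i, g in enumerate(channel_gains)]
        heapq.heapify(heap)
        total_power = 0.0
        for _ in range(target_bits):
            if not heap:                      # every subchannel already at its bit cap
                break
            dp, i = heapq.heappop(heap)
            total_power += dp
            bits[i] += 1
            if bits[i] < max_bits_per_sub:
                heapq.heappush(heap, (delta_power(channel_gains[i], bits[i]), i))
        return bits, total_power

    print(greedy_bit_loading([0.3, 1.2, 0.8, 2.0, 0.5, 1.5, 0.9, 1.1], target_bits=32))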

  1. Bit Loading Algorithms for Cooperative OFDM Systems

    Directory of Open Access Journals (Sweden)

    Bo Gui

    2007-12-01

    Full Text Available We investigate the resource allocation problem for an OFDM cooperative network with a single source-destination pair and multiple relays. Assuming knowledge of the instantaneous channel gains for all links in the entire network, we propose several bit and power allocation schemes aiming at minimizing the total transmission power under a target rate constraint. First, an optimal and efficient bit loading algorithm is proposed when the relay node uses the same subchannel to relay the information transmitted by the source node. To further improve the performance gain, subchannel permutation, in which the subchannels are reallocated at relay nodes, is considered. An optimal subchannel permutation algorithm is first proposed and then an efficient suboptimal algorithm is considered to achieve a better complexity-performance tradeoff. A distributed bit loading algorithm is also proposed for ad hoc networks. Simulation results show that significant performance gains can be achieved by the proposed bit loading algorithms, especially when subchannel permutation is employed.

  2. Inclusive Flavour Tagging Algorithm

    International Nuclear Information System (INIS)

    Likhomanenko, Tatiana; Derkach, Denis; Rogozhnikov, Alex

    2016-01-01

    Identifying the flavour of neutral B mesons at production is one of the most important components needed in the study of time-dependent CP violation. The harsh environment of the Large Hadron Collider makes it particularly hard to succeed in this task. We present an inclusive flavour-tagging algorithm as an upgrade of the algorithms currently used by the LHCb experiment. Specifically, a probabilistic model which efficiently combines information from reconstructed vertices and tracks using machine learning is proposed. The algorithm does not use information about the underlying physics process; it reduces the dependence on the performance of lower-level identification capabilities and thus increases the overall performance. The proposed inclusive flavour-tagging algorithm is applicable to tagging the flavour of B mesons in any proton-proton experiment. (paper)

  3. Flexible Job Shop Scheduling Problem Using an Improved Ant Colony Optimization

    Directory of Open Access Journals (Sweden)

    Lei Wang

    2017-01-01

    Full Text Available As an extension of the classical job shop scheduling problem, the flexible job shop scheduling problem (FJSP) plays an important role in real production systems. In FJSP, an operation may be processed on more than one alternative machine, and the problem has been proven to be strongly NP-hard. Ant colony optimization (ACO) has been shown to be an efficient approach for dealing with FJSP; however, the basic ACO has two main disadvantages, namely low computational efficiency and a tendency to get trapped in local optima. To overcome these two disadvantages, an improved ant colony optimization (IACO) is proposed to minimize the makespan for FJSP. The improvements concern the machine-selection rule, a uniformly distributed initialization mechanism for the ants, the pheromone guiding mechanism, the node-selection method, and the pheromone-update mechanism. An actual production instance and two sets of well-known benchmark instances are tested, and comparisons with other approaches verify the effectiveness of the proposed IACO. The results reveal that the proposed IACO provides better solutions in a reasonable computational time.
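
    The record lists a modified pheromone-update mechanism among the improvements but does not give its exact form; the sketch below shows a generic evaporation-plus-deposit update with MMAS-style bounds as a stand-in, with all constants assumed rather than taken from the paper.

    import numpy as np

    def update_pheromone(tau, best_solution, best_makespan, rho=0.1, q=1.0,
                         tau_min=0.01, tau_max=10.0):
        """Generic ACO pheromone update of the kind the record refers to.

        tau           -- pheromone matrix over (operation, machine/position) choices
        best_solution -- list of index pairs chosen by the best ant of the iteration
        """
        tau *= (1.0 - rho)                         # evaporation on every entry
        deposit = q / best_makespan                # shorter schedules deposit more
        for i, j in best_solution:
            tau[i, j] += deposit
        np.clip(tau, tau_min, tau_max, out=tau)    # keep pheromone inside [tau_min, tau_max]
        return tau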

  4. Inter-dependence of the electron beam excitations with the free electron laser stability on the super-ACO storage ring

    CERN Document Server

    Couprie, Marie Emmanuelle; Nutarelli, D; Renault, E; Billardon, M

    1999-01-01

    Storage ring free electron lasers have a complex dynamics compared to LINAC-driven FEL sources, since both the laser and the recirculating electron beam behaviours are involved. Electron beam perturbations can strongly affect FEL operation (start-up, stability), whereas the FEL can stabilize beam instabilities. Experimental analysis together with simulations is reported here. Improvements of the Super-ACO FEL for users are discussed, and consequences are given in terms of the electron beam tolerances required to develop the source for users.

  5. Relating two proposed methods for speedup of algorithms for fitting two- and three-way principal component and related multilinear models

    NARCIS (Netherlands)

    Kiers, Henk A.L.; Harshman, Richard A.

    Multilinear analysis methods such as component (and three-way component) analysis of very large data sets can become very computationally demanding and even infeasible unless some method is used to compress the data and/or speed up the algorithms. We discuss two previously proposed speedup methods.

  6. An integrated ant colony optimization approach to compare strategies of clearing market in electricity markets. Agent-based simulation

    International Nuclear Information System (INIS)

    Azadeh, A.; Maleki-Shoja, B.; Skandari, M.R.

    2010-01-01

    In this paper, an innovative agent-based simulation model based on the Ant Colony Optimization (ACO) algorithm is proposed in order to compare three available strategies for clearing wholesale electricity markets, i.e. the uniform, pay-as-bid, and generalized Vickrey rules. The supply-side actors of the power market are modeled as adaptive agents who learn how to bid strategically to optimize their profit through indirect interaction with the other actors of the market. The proposed model is suitable for bidding functions with a high number of dimensions and enables modelers to avoid the curse of dimensionality as the dimension grows. Test systems are then used to study the behavior of each pricing rule under different degrees of competition and heterogeneity. Finally, the pricing rules are comprehensively compared using different economic criteria such as average cleared price, efficiency of allocation, and price volatility, and principal component analysis (PCA) is used to rank and select the best pricing rule. To the knowledge of the authors, this is the first study that uses ACO for assessing strategies of the wholesale electricity market. (author)

  7. Bouc–Wen hysteresis model identification using Modified Firefly Algorithm

    International Nuclear Information System (INIS)

    Zaman, Mohammad Asif; Sikder, Urmita

    2015-01-01

    The parameters of the Bouc–Wen hysteresis model are identified using a Modified Firefly Algorithm. The proposed algorithm uses dynamic process control parameters to improve its performance. The algorithm is used to find the model parameter values that result in the least error between a set of given data points and points obtained from the Bouc–Wen model. The performance of the algorithm is compared with the performance of the conventional Firefly Algorithm, the Genetic Algorithm and the Differential Evolution algorithm in terms of convergence rate and accuracy. Compared to these three optimization algorithms, the proposed algorithm is found to have a good convergence rate with a high degree of accuracy in identifying Bouc–Wen model parameters. Finally, the proposed method is used to find the Bouc–Wen model parameters from experimental data, and the obtained model is found to be in good agreement with the measured data. - Highlights: • We describe a new method to find the Bouc–Wen hysteresis model parameters. • We propose a Modified Firefly Algorithm. • We compare our method with existing methods to find that the proposed method performs better. • We use our model to fit experimental results. Good agreement is found.
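
    A firefly-type identification loop of this kind needs a simulator for the Bouc–Wen model and an error objective to minimise; the sketch below provides both under assumed conventions (a standard Bouc–Wen form and Euler integration), since the record does not state the exact model variant or the experimental data.

    import numpy as np

    def bouc_wen_response(params, x, dt):
        """Simulate the Bouc–Wen hysteretic variable z(t) for a displacement history x(t).

        params = (A, beta, gamma, n) in the standard form
            dz/dt = A*dx/dt - beta*|dx/dt|*|z|**(n-1)*z - gamma*(dx/dt)*|z|**n
        """
        A, beta, gamma, n = params
        x = np.asarray(x, dtype=float)
        z = np.zeros_like(x)
        dx = np.gradient(x, dt)
        for k in range(1, len(x)):
            dz = (A * dx[k - 1]
                  - beta * abs(dx[k - 1]) * abs(z[k - 1]) ** (n - 1) * z[k - 1]
                  - gamma * dx[k - 1] * abs(z[k - 1]) ** n)
            z[k] = z[k - 1] + dt * dz            # explicit Euler step (assumed)
        return z

    def identification_error(params, x, z_measured, dt):
        """Objective a firefly-type optimizer would minimise: squared error between
        measured hysteresis data and the model response for candidate parameters."""
        z_model = bouc_wen_response(params, x, dt)
        return float(np.sum((z_model - np.asarray(z_measured)) ** 2))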

  8. Bouc–Wen hysteresis model identification using Modified Firefly Algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Zaman, Mohammad Asif, E-mail: zaman@stanford.edu [Department of Electrical Engineering, Stanford University (United States); Sikder, Urmita [Department of Electrical Engineering and Computer Sciences, University of California, Berkeley (United States)

    2015-12-01

    The parameters of the Bouc–Wen hysteresis model are identified using a Modified Firefly Algorithm. The proposed algorithm uses dynamic process control parameters to improve its performance. The algorithm is used to find the model parameter values that result in the least error between a set of given data points and points obtained from the Bouc–Wen model. The performance of the algorithm is compared with the performance of the conventional Firefly Algorithm, the Genetic Algorithm and the Differential Evolution algorithm in terms of convergence rate and accuracy. Compared to these three optimization algorithms, the proposed algorithm is found to have a good convergence rate with a high degree of accuracy in identifying Bouc–Wen model parameters. Finally, the proposed method is used to find the Bouc–Wen model parameters from experimental data, and the obtained model is found to be in good agreement with the measured data. - Highlights: • We describe a new method to find the Bouc–Wen hysteresis model parameters. • We propose a Modified Firefly Algorithm. • We compare our method with existing methods to find that the proposed method performs better. • We use our model to fit experimental results. Good agreement is found.

  9. A Newton-type neural network learning algorithm

    International Nuclear Information System (INIS)

    Ivanov, V.V.; Puzynin, I.V.; Purehvdorzh, B.

    1993-01-01

    First- and second-order learning methods for feed-forward multilayer networks are considered. A Newton-type algorithm is proposed and compared with the common back-propagation algorithm. It is shown that the proposed algorithm provides better learning quality. Some recommendations for their usage are given. 11 refs.; 1 fig.; 1 tab

  10. An improved VSS NLMS algorithm for active noise cancellation

    Science.gov (United States)

    Sun, Yunzhuo; Wang, Mingjiang; Han, Yufei; Zhang, Congyan

    2017-08-01

    In this paper, an improved variable step size NLMS algorithm is proposed. NLMS has a fast convergence rate and low steady-state error compared to other traditional adaptive filtering algorithms, but there is a contradiction between convergence speed and steady-state error that limits the performance of the NLMS algorithm. We therefore propose a new variable step size NLMS algorithm that dynamically changes the step size according to the current error and the iteration count. The proposed algorithm has a simple formulation and easily set parameters, and it effectively resolves the contradiction in NLMS. The simulation results show that the proposed algorithm achieves good tracking ability, a fast convergence rate and low steady-state error simultaneously.
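
    The record says the step size is varied with the current error and iteration but does not give the rule; the sketch below is a plain NLMS filter with one plausible error-driven step-size schedule standing in for the paper's, so the schedule and its constants are assumptions.

    import numpy as np

    def vss_nlms(x, d, taps=16, mu_max=1.0, mu_min=0.05, alpha=0.5, eps=1e-6):
        """NLMS adaptive filter with an assumed error-driven variable step size."""
        x = np.asarray(x, dtype=float)
        d = np.asarray(d, dtype=float)
        w = np.zeros(taps)
        e = np.zeros(len(x))
        for k in range(taps, len(x)):
            u = x[k - taps:k][::-1]                 # most recent input vector
            e[k] = d[k] - w @ u                     # a-priori error
            mu = mu_min + (mu_max - mu_min) * e[k] ** 2 / (e[k] ** 2 + alpha)
            w += mu * e[k] * u / (u @ u + eps)      # normalised LMS update
        return w, e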

  11. Efficient RNA structure comparison algorithms.

    Science.gov (United States)

    Arslan, Abdullah N; Anandan, Jithendar; Fry, Eric; Monschke, Keith; Ganneboina, Nitin; Bowerman, Jason

    2017-12-01

    The recently proposed relative addressing-based ([Formula: see text]) RNA secondary structure representation has important features by which an RNA structure database can be stored in a suffix array, and a fast substructure search algorithm based on binary search over this suffix array has been proposed. Using this substructure search algorithm, we present a fast algorithm that finds the largest common substructure of multiple given RNA structures in [Formula: see text] format. The multiple RNA structure comparison problem is NP-hard in its general formulation. We introduce a new problem for comparing multiple RNA structures with a stricter similarity definition and objective, and we propose an algorithm that solves this problem efficiently. We also develop another comparison algorithm that iteratively calls this algorithm to locate nonoverlapping large common substructures in the compared RNAs. With the new resulting tools, we improved the RNASSAC website (linked from http://faculty.tamuc.edu/aarslan ). This website now also includes two drawing tools: one specialized for preparing RNA substructures that can be used as input by the search tool, and another for automatically drawing the entire RNA structure from a given structure sequence.

  12. An Adaptive Unified Differential Evolution Algorithm for Global Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Qiang, Ji; Mitchell, Chad

    2014-11-03

    In this paper, we propose a new adaptive unified differential evolution algorithm for single-objective global optimization. Instead of the multiple mutation strategies proposed in conventional differential evolution algorithms, this algorithm employs a single equation unifying multiple strategies into one expression. It has the virtue of mathematical simplicity and also provides users the flexibility for broader exploration of the space of mutation operators. By making all control parameters in the proposed algorithm self-adaptively evolve during the process of optimization, it frees the application users from the burden of choosing appropriate control parameters and also improves the performance of the algorithm. In numerical tests using thirteen basic unimodal and multimodal functions, the proposed adaptive unified algorithm shows promising performance in comparison to several conventional differential evolution algorithms.
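
    The record states that the unified algorithm folds several mutation strategies into a single expression but does not reproduce it; the sketch below shows one possible blended mutation (rand/1 plus a best-directed term) purely as an illustration, with the blend and its weights assumed rather than taken from the paper.

    import numpy as np

    def unified_mutation(pop, fitness, i, F1=0.5, F2=0.3, rng=None):
        """One possible 'unified' DE mutation for target index i:
        v_i = x_r1 + F1*(x_best - x_r1) + F2*(x_r2 - x_r3)."""
        rng = rng or np.random.default_rng()
        n = len(pop)
        r1, r2, r3 = rng.choice([k for k in range(n) if k != i], size=3, replace=False)
        best = pop[int(np.argmin(fitness))]
        return pop[r1] + F1 * (best - pop[r1]) + F2 * (pop[r2] - pop[r3])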

  13. An approach using quantum ant colony optimization applied to the problem of identification of nuclear power plant transients

    International Nuclear Information System (INIS)

    Silva, Marcio H.; Schirru, Roberto; Medeiros, Jose A.C.C.

    2009-01-01

    Using concepts and principles of quantum computation, such as the quantum bit and the superposition of states, coupled with the biological metaphor of an ant colony used in the Ant Colony Optimization (ACO) algorithm, Wang et al. developed the Quantum Ant Colony Optimization (QACO). In this paper we present a modification of the algorithm proposed by Wang et al. While the original QACO was applied only to simple benchmark functions of at most two dimensions, QACO-Alpha was developed for applications where the original QACO, due to its tendency to converge prematurely, does not obtain good results, such as complex multidimensional functions. Furthermore, to evaluate their behavior, both algorithms are applied to the real problem of identifying accidents in PWR nuclear power plants. (author)

  14. Fast algorithm for Morphological Filters

    International Nuclear Information System (INIS)

    Lou Shan; Jiang Xiangqian; Scott, Paul J

    2011-01-01

    In surface metrology, morphological filters, which evolved from the envelope filtering system (E-system), work well for functional prediction of surface finish in the analysis of surfaces in contact. The naive algorithms are time-consuming, especially for areal data, and are not generally adopted in practice. A fast algorithm is proposed based on the alpha shape: the hull obtained by rolling the alpha ball is theoretically equivalent to the morphological opening/closing. The algorithm depends on Delaunay triangulation, with time complexity O(nlogn). In comparison to the naive algorithms, it generates the opening and closing envelopes without combining dilation and erosion. Edge distortion is corrected by reflective padding for open profiles/surfaces, and spikes in the sample data are detected and points interpolated to prevent singularities. The proposed algorithm works well for both morphological profile and areal filters. Examples are presented to demonstrate the validity and the superior efficiency of this algorithm over the naive algorithm.

  15. WDM Multicast Tree Construction Algorithms and Their Comparative Evaluations

    Science.gov (United States)

    Makabe, Tsutomu; Mikoshi, Taiju; Takenaka, Toyofumi

    We propose novel tree construction algorithms for multicast communication in photonic networks. Since multicast communications consume many more link resources than unicast communications, effective algorithms for route selection and wavelength assignment are required. We propose a novel tree construction algorithm, called the Weighted Steiner Tree (WST) algorithm and a variation of the WST algorithm, called the Composite Weighted Steiner Tree (CWST) algorithm. Because these algorithms are based on the Steiner Tree algorithm, link resources among source and destination pairs tend to be commonly used and link utilization ratios are improved. Because of this, these algorithms can accept many more multicast requests than other multicast tree construction algorithms based on the Dijkstra algorithm. However, under certain delay constraints, the blocking characteristics of the proposed Weighted Steiner Tree algorithm deteriorate since some light paths between source and destinations use many hops and cannot satisfy the delay constraint. In order to adapt the approach to the delay-sensitive environments, we have devised the Composite Weighted Steiner Tree algorithm comprising the Weighted Steiner Tree algorithm and the Dijkstra algorithm for use in a delay constrained environment such as an IPTV application. In this paper, we also give the results of simulation experiments which demonstrate the superiority of the proposed Composite Weighted Steiner Tree algorithm compared with the Distributed Minimum Hop Tree (DMHT) algorithm, from the viewpoint of the light-tree request blocking.

  16. Firefly Mating Algorithm for Continuous Optimization Problems

    Directory of Open Access Journals (Sweden)

    Amarita Ritthipakdee

    2017-01-01

    Full Text Available This paper proposes a swarm intelligence algorithm, called the firefly mating algorithm (FMA), for solving continuous optimization problems. FMA uses a genetic algorithm as the core of the algorithm. The main feature of the algorithm is a novel mating pair selection method inspired by the following two mating behaviors of fireflies in nature: (i) the mutual attraction between males and females causes them to mate, and (ii) fireflies of both sexes are of the multiple-mating type, mating with multiple opposite-sex partners. A female continues mating until her spermatheca becomes full and, in the same vein, a male can provide sperm for several females until his sperm reservoir is depleted. This new feature enhances the global convergence capability of the algorithm. The performance of FMA was tested on 20 benchmark functions (sixteen 30-dimensional functions and four 2-dimensional ones) against the FA, ALC-PSO, COA, MCPSO, LWGSODE, MPSODDS, DFOA, SHPSOS, LSA, MPDPGA, DE, and GABC algorithms. The experimental results showed that the success rates of our proposed algorithm on these functions were higher than those of the other algorithms, and that the proposed algorithm also required fewer iterations to reach the global optima.

  17. “Manifestos do Coração” (Manifestos of the Heart): Meanings Attributed to Their Illness by Pre-surgical Cardiac Patients

    Directory of Open Access Journals (Sweden)

    Shana Hastenpflug Wottrich

    Full Text Available ABSTRACT This study aimed to explore the meanings attributed to heart disease by pre-surgical cardiac patients under outpatient treatment in southern Brazil. It is a clinical-qualitative, exploratory and descriptive study whose data were collected through semi-structured interviews and auto-photography, proposed to 15 individuals. A thematic content analysis was performed, from which the following categories emerged: Confronting the disease: the knowledge at stake; Denial of the disease; Disease and work; and Disrupted sexuality. The results highlighted the participants' difficulties in coming to terms with their disease and accepting this condition. The findings underline the urgency of health actions that can support the restructuring of life possibilities for pre-surgical cardiac patients.

  18. Profile of patients on the Unified Waiting List for heart transplantation in the state of Ceará

    Directory of Open Access Journals (Sweden)

    Francisca Elisângela Teixeira Lima

    2010-07-01

    Full Text Available BACKGROUND: Organ transplantation has increased considerably in recent years as a result of technological progress and of society's growing awareness of organ donation. OBJECTIVE: To describe the characteristics of the patients on the Unified Waiting List for heart transplantation, to identify the main heart diseases, and to determine the mean time patients remained on the list until surgery. METHODS: This is a descriptive, documental and retrospective study with a quantitative approach, carried out at the Transplant Center of the State of Ceará with 156 patients included in the Unified Waiting List from 1999 to 2006. The data were organized into figures. RESULTS: 81% of the patients were male; 22.4% were young adults (20 to 40 years) and 56.4% middle-aged adults (40 to 64 years), with a mean age of 36 years; 79% came from Fortaleza-CE; and 91% had cardiomyopathy as the indication for heart transplantation. Of these patients, 102 (69%) were transplanted, 37 (25%) died before transplantation, and 8 (6%) were excluded because their clinical condition improved or worsened. CONCLUSION: The patients on the Unified Waiting List for heart transplantation in the State of Ceará from 1999 to 2006 were male (80%), aged 1 to 71 years, with a predominance of dilated cardiomyopathy (53.4%), and the mean waiting time until heart transplantation was 136 days.

  19. Proposed Network Intrusion Detection System ‎Based on Fuzzy c Mean Algorithm in Cloud ‎Computing Environment

    Directory of Open Access Journals (Sweden)

    Shawq Malik Mehibs

    2017-12-01

    Full Text Available Nowadays cloud computing has become an integral part of the IT industry; it provides a working environment that allows users to share data and resources over the internet. Because cloud computing is a virtual pooling of resources offered over the internet, it raises several issues related to security and privacy. Intrusion detection is therefore very important for detecting outside and inside intruders of cloud computing with a high detection rate and a low false-positive alarm rate in the cloud environment. This work proposes a network intrusion detection module using the fuzzy c-means algorithm. The KDD99 dataset was used for the experiments. The proposed system is characterized by a high detection rate with a low false-positive alarm rate.
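
    A plain fuzzy c-means routine of the kind such a module relies on is sketched below; mapping the resulting clusters to normal versus attack traffic, and the preprocessing of KDD99 records into numeric features, are left out, and the parameter choices are assumptions.

    import numpy as np

    def fuzzy_c_means(X, c=2, m=2.0, iters=100, tol=1e-5, rng=None):
        """Fuzzy c-means clustering of numeric connection records X (n_samples, n_features)."""
        rng = rng or np.random.default_rng(0)
        n = X.shape[0]
        U = rng.random((c, n))
        U /= U.sum(axis=0)                                   # memberships sum to 1 per sample
        for _ in range(iters):
            Um = U ** m
            centers = (Um @ X) / Um.sum(axis=1, keepdims=True)
            d = np.linalg.norm(X[None, :, :] - centers[:, None, :], axis=2) + 1e-10
            U_new = 1.0 / np.sum((d[:, None, :] / d[None, :, :]) ** (2.0 / (m - 1)), axis=1)
            if np.max(np.abs(U_new - U)) < tol:
                U = U_new
                break
            U = U_new
        return centers, U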

  20. Opposite Degree Algorithm and Its Applications

    Directory of Open Access Journals (Sweden)

    Xiao-Guang Yue

    2015-12-01

    Full Text Available The Opposite Degree (OD) algorithm is an intelligent algorithm proposed by Yue Xiaoguang et al. It is mainly based on the concept of opposite degree, combined with ideas from neural network design, genetic algorithms and clustering analysis. The OD algorithm is divided into two sub-algorithms, namely the opposite degree - numerical computation (OD-NC) algorithm and the opposite degree - classification computation (OD-CC) algorithm.

  1. Clarifying Cutting and Sewing Processes with Due Windows Using an Effective Ant Colony Optimization

    Directory of Open Access Journals (Sweden)

    Rong-Hwa Huang

    2013-01-01

    Full Text Available The cutting and sewing process is a classic real-world flow shop scheduling problem. This two-stage flexible flow shop is commonly associated with manufacturing in the fashion and textile industry. Many investigations have demonstrated that the ant colony optimization (ACO) algorithm is effective and efficient for solving scheduling problems. This work applies a novel effective ant colony optimization (EACO) algorithm to solve two-stage flexible flow shop scheduling problems and thereby minimize earliness, tardiness, and makespan. Computational results reveal that for both small and large problems, EACO is more effective and robust than both the particle swarm optimization (PSO) algorithm and the ACO algorithm. Importantly, this work demonstrates that EACO can solve complex scheduling problems in an acceptable period of time.

  2. Applying Kitaev's algorithm in an ion trap quantum computer

    International Nuclear Information System (INIS)

    Travaglione, B.; Milburn, G.J.

    2000-01-01

    Full text: Kitaev's algorithm is a method of estimating eigenvalues associated with an operator. Shor's factoring algorithm, which enables a quantum computer to crack RSA encryption codes, is a specific example of Kitaev's algorithm. It has been proposed that the algorithm can also be used to generate eigenstates. We extend this proposal for small quantum systems, identifying the conditions under which the algorithm can successfully generate eigenstates. We then propose an implementation scheme based on an ion trap quantum computer. This scheme allows us to illustrate a simple example, in which the algorithm effectively generates eigenstates

  3. Simulation and Proposed Handover Alert Algorithm for Mobile Communication System

    Directory of Open Access Journals (Sweden)

    Muzhir Shaban Al-Ani

    2009-10-01

    Full Text Available This paper deals with the simulation and presentation of a novel approach to the design and implementation of an algorithm that realizes the handover process for a mobile communication system moving through a mobile network. The algorithm exploits the system's ability to extract important features of the received signal: when the strength of the received signal drops below a certain threshold value, an alert process is activated so that transmission continuity is maintained by means of a scan that is already prepared in time.
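
    The sketch below shows a toy threshold-plus-hysteresis handover-alert decision in the spirit of the description above; the dBm threshold and the hysteresis margin are illustrative assumptions, not values from the paper.

    def handover_alert(serving_rssi, neighbour_rssi, threshold=-95.0, hysteresis=3.0):
        """Toy handover-alert decision.

        Returns 'ALERT' when the serving-cell signal drops below the threshold so that
        neighbour scanning can start early, and 'HANDOVER' once a neighbour is stronger
        than the serving cell by the hysteresis margin.
        """
        if neighbour_rssi is not None and neighbour_rssi > serving_rssi + hysteresis:
            return "HANDOVER"
        if serving_rssi < threshold:
            return "ALERT"          # trigger the early scan to keep the call alive
        return "STAY"

    # Example walk through a weakening serving cell
    for rssi in (-80, -90, -96, -100):
        print(rssi, handover_alert(rssi, neighbour_rssi=-92))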

  4. DNABIT Compress - Genome compression algorithm.

    Science.gov (United States)

    Rajarajeswari, Pothuraju; Apparao, Allam

    2011-01-22

    Data compression is concerned with how information is organized in data; efficient storage means removing redundancy from the data being stored in the DNA molecule. Data compression algorithms remove redundancy and are used to understand biologically important molecules. We present a compression algorithm, "DNABIT Compress", for DNA sequences based on a novel scheme of assigning binary bits to smaller segments of DNA bases so as to compress both repetitive and non-repetitive DNA sequences. Our proposed algorithm achieves the best compression ratio for DNA sequences of larger genomes, and the significantly better compression results show that "DNABIT Compress" performs best among existing compression algorithms. While achieving the best compression ratios for DNA sequences (genomes), the new DNABIT Compress algorithm also significantly improves on the running time of all previous DNA compression programs. Assigning binary bits (a unique bit code) to fragments of a DNA sequence (exact repeats, reverse repeats) is also a concept introduced for the first time in DNA compression by this algorithm. The proposed algorithm achieves a compression ratio as low as 1.58 bits/base, whereas the best existing methods could not achieve a ratio below 1.72 bits/base.
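
    The basic idea of assigning short bit codes to DNA bases can be illustrated with plain 2-bit packing, as below; DNABIT Compress goes further by giving unique bit codes to exact and reverse repeats, which is not reproduced here, so this sketch only shows the fixed-width packing idea.

    # 2 bits per base: A=00, C=01, G=10, T=11
    BASE_TO_BITS = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}
    BITS_TO_BASE = {v: k for k, v in BASE_TO_BITS.items()}

    def pack(seq: str) -> bytes:
        """Pack an ACGT string into bytes, 4 bases per byte (length stored separately)."""
        out = bytearray()
        for i in range(0, len(seq), 4):
            group = seq[i:i + 4]
            byte = 0
            for base in group:
                byte = (byte << 2) | BASE_TO_BITS[base]
            byte <<= 2 * (4 - len(group))            # left-align a short final group
            out.append(byte)
        return bytes(out)

    def unpack(data: bytes, length: int) -> str:
        """Inverse of pack(); 'length' is the original number of bases."""
        bases = []
        for byte in data:
            for shift in (6, 4, 2, 0):
                bases.append(BITS_TO_BASE[(byte >> shift) & 0b11])
        return "".join(bases[:length])

    seq = "ACGTACGTTGCA"
    assert unpack(pack(seq), len(seq)) == seq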

  5. Modified artificial bee colony algorithm for reactive power optimization

    Science.gov (United States)

    Sulaiman, Noorazliza; Mohamad-Saleh, Junita; Abro, Abdul Ghani

    2015-05-01

    Bio-inspired algorithms (BIAs) implemented to solve various optimization problems have shown promising results, which is very important in this severely complex real world. The Artificial Bee Colony (ABC) algorithm, a kind of BIA, has demonstrated outstanding results compared to other optimization algorithms. This paper presents a new modified ABC algorithm, referred to as JA-ABC3, with the aim of enhancing convergence speed and avoiding premature convergence. The proposed algorithm has been simulated on ten commonly used benchmark functions, and its performance has been compared with other existing ABC variants. To justify its robust applicability, the proposed algorithm has also been tested on the Reactive Power Optimization problem. The results show that the proposed algorithm outperforms other existing ABC variants, e.g. GABC, BABC1, BABC2, BsfABC and IABC, in terms of convergence speed. Furthermore, the proposed algorithm has also demonstrated excellent performance in solving the Reactive Power Optimization problem.

  6. A Hybrid Maximum Power Point Tracking Approach for Photovoltaic Systems under Partial Shading Conditions Using a Modified Genetic Algorithm and the Firefly Algorithm

    Directory of Open Access Journals (Sweden)

    Yu-Pei Huang

    2018-01-01

    Full Text Available This paper proposes a modified maximum power point tracking (MPPT algorithm for photovoltaic systems under rapidly changing partial shading conditions (PSCs. The proposed algorithm integrates a genetic algorithm (GA and the firefly algorithm (FA and further improves its calculation process via a differential evolution (DE algorithm. The conventional GA is not advisable for MPPT because of its complicated calculations and low accuracy under PSCs. In this study, we simplified the GA calculations with the integration of the DE mutation process and FA attractive process. Results from both the simulation and evaluation verify that the proposed algorithm provides rapid response time and high accuracy due to the simplified processing. For instance, evaluation results demonstrate that when compared to the conventional GA, the execution time and tracking accuracy of the proposed algorithm can be, respectively, improved around 69.4% and 4.16%. In addition, in comparison to FA, the tracking speed and tracking accuracy of the proposed algorithm can be improved around 42.9% and 1.85%, respectively. Consequently, the major improvement of the proposed method when evaluated against the conventional GA and FA is tracking speed. Moreover, this research provides a framework to integrate multiple nature-inspired algorithms for MPPT. Furthermore, the proposed method is adaptable to different types of solar panels and different system formats with specifically designed equations, the advantages of which are rapid tracking speed with high accuracy under PSCs.

  7. Efficient On-the-fly Algorithms for the Analysis of Timed Games

    DEFF Research Database (Denmark)

    Cassez, Franck; David, Alexandre; Fleury, Emmanuel

    2005-01-01

    In this paper, we propose the first efficient on-the-fly algorithm for solving games based on timed game automata with respect to reachability and safety properties. The algorithm we propose is a symbolic extension of the on-the-fly algorithm suggested by Liu & Smolka [15] for linear-time model-checking. Optimizations of the basic symbolic algorithm are proposed, as well as methods for obtaining time-optimal winning strategies (for reachability games). Extensive evaluation of an experimental implementation of the algorithm yields very encouraging performance results.

  8. Modified Projection Algorithms for Solving the Split Equality Problems

    Directory of Open Access Journals (Sweden)

    Qiao-Li Dong

    2014-01-01

    proposed a CQ algorithm for solving it. In this paper, we propose a modification for the CQ algorithm, which computes the stepsize adaptively and performs an additional projection step onto two half-spaces in each iteration. We further propose a relaxation scheme for the self-adaptive projection algorithm by using projections onto half-spaces instead of those onto the original convex sets, which is much more practical. Weak convergence results for both algorithms are analyzed.

  9. The use of homograft and heterograft valves in extracardiac conduits

    Directory of Open Access Journals (Sweden)

    Rui Siqueira de Almeida

    1988-08-01

    Full Text Available The concept of using an extracardiac conduit to establish an outflow tract connecting the right ventricle to the pulmonary trunk or its branches was developed in the 1960s. Between 1971 and 1986, 335 patients received extracardiac conduits for the right side of the heart at The Hospital for Sick Children, London; 176 were aortic homografts preserved in an antibiotic-nutrient solution, 140 were heterografts (Hancock, Ross, Carpentier-Edwards, Ionescu-Shiley) and 19 were non-valved tubes. These conduits were used in the repair of complex congenital heart defects. The mean age was 6.34 ± 4.6 years and the mean weight 17.8 ± 10.8 kg. The internal diameter of the conduits ranged from 8 to 30 mm. Hospital mortality was 29.2%, and long-term follow-up of the survivors lasted up to 14.3 years. Sixty patients (17.9%) were submitted to 60 reoperations, of which only 40% were conduit related. The actuarial curve of freedom from obstruction differed significantly when the homografts were compared with each group of heterografts.

  10. Algorithmic approach to diagram techniques

    International Nuclear Information System (INIS)

    Ponticopoulos, L.

    1980-10-01

    An algorithmic approach to diagram techniques of elementary particles is proposed. The definition and axiomatics of the theory of algorithms are presented, followed by the list of instructions of an algorithm formalizing the construction of graphs and the assignment of mathematical objects to them. (T.A.)

  11. Dynamic Inertia Weight Binary Bat Algorithm with Neighborhood Search

    Directory of Open Access Journals (Sweden)

    Xingwang Huang

    2017-01-01

    Full Text Available The binary bat algorithm (BBA) is a binary version of the bat algorithm (BA). It has been proven that BBA is competitive compared to other binary heuristic algorithms. Since the velocity update process of the algorithm is the same as in BA, in some cases the algorithm also faces the premature convergence problem. This paper proposes an improved binary bat algorithm (IBBA) to solve this problem. To evaluate the performance of IBBA, standard benchmark functions and zero-one knapsack problems have been employed. The numerical results obtained on the benchmark functions show that the proposed approach greatly outperforms the original BBA and binary particle swarm optimization (BPSO). Comparison with several other heuristic algorithms on zero-one knapsack problems also verifies that the proposed algorithm is better able to avoid local minima.

  12. A Novel Hybrid Firefly Algorithm for Global Optimization.

    Directory of Open Access Journals (Sweden)

    Lina Zhang

    Full Text Available Global optimization is challenging to solve due to its nonlinearity and multimodality. Traditional algorithms such as gradient-based methods often struggle to deal with such problems, and one of the current trends is to use metaheuristic algorithms. In this paper, a novel hybrid population-based global optimization algorithm, called the hybrid firefly algorithm (HFA), is proposed by combining the advantages of both the firefly algorithm (FA) and differential evolution (DE). FA and DE are executed in parallel to promote information sharing among the population and thus enhance searching efficiency. In order to evaluate the performance and efficiency of the proposed algorithm, a diverse set of selected benchmark functions is employed; these functions fall into two groups, unimodal and multimodal. The experimental results show better performance of the proposed algorithm compared to the original firefly algorithm (FA), differential evolution (DE) and particle swarm optimization (PSO) in the sense of avoiding local minima and increasing the convergence rate.

  13. Online co-regularized algorithms

    NARCIS (Netherlands)

    Ruijter, T. de; Tsivtsivadze, E.; Heskes, T.

    2012-01-01

    We propose an online co-regularized learning algorithm for classification and regression tasks. We demonstrate that by sequentially co-regularizing prediction functions on unlabeled data points, our algorithm provides improved performance in comparison to supervised methods on several UCI benchmarks

  14. The role of cardiovascular magnetic resonance in candidates for Fontan operation: Proposal of a new Algorithm

    Directory of Open Access Journals (Sweden)

    Ait-Ali Lamia

    2011-11-01

    Full Text Available Abstract Background To propose a new diagnostic algorithm for candidates for Fontan and identify those who can skip cardiac catheterization (CC). Methods Forty-four candidates for Fontan (median age 4.8 years, range 2-29 years) were prospectively evaluated by trans-thoracic echocardiography (TTE), cardiovascular magnetic resonance (CMR) and CC. Before CC, according to clinical, echo and CMR findings, patients were divided into two groups: Group I comprised 18 patients deemed suitable for Fontan without requiring CC; Group II comprised 26 patients indicated for CC either in order to detect more details or for interventional procedures. Results In Group I ("CC not required") no unexpected new information affecting surgical planning was provided by CC. Conversely, in Group II new information was provided by CC in three patients (0 vs 11.5%, p = 0.35), and in six an interventional procedure was performed. During CC, minor complications occurred in one patient from Group I and in three from Group II (6 vs 14%, p = 0.7). The radiation dose-area product was similar in the two groups (median 20 Gy·cm², range 5-40, vs 26.5 Gy·cm², range 9-270, p = 0.37). All 18 Group I patients and 19 Group II patients underwent a total cavo-pulmonary anastomosis; of the remaining seven Group II patients, four were excluded from Fontan, two are awaiting Fontan, and one refused the intervention. Conclusion In this paper we propose a new diagnostic algorithm in a pre-Fontan setting. An accurate non-invasive evaluation comprising TTE and CMR could select patients who can skip CC.

  15. ADAPTIVE SELECTION OF AUXILIARY OBJECTIVES IN MULTIOBJECTIVE EVOLUTIONARY ALGORITHMS

    Directory of Open Access Journals (Sweden)

    I. A. Petrova

    2016-05-01

    Full Text Available Subject of Research. We propose to modify the EA+RL method, which increases the efficiency of evolutionary algorithms by means of auxiliary objectives. The proposed modification is compared to the existing objective selection methods on the example of the travelling salesman problem. Method. In the EA+RL method, a reinforcement learning algorithm is used to select an objective – the target objective or one of the auxiliary objectives – at each iteration of a single-objective evolutionary algorithm. The proposed modification of the EA+RL method adapts this approach for use with a multiobjective evolutionary algorithm: as opposed to the EA+RL method, one of the auxiliary objectives is selected by reinforcement learning and optimized together with the target objective at each step of the multiobjective evolutionary algorithm. Main Results. The proposed modification of the EA+RL method was compared to the existing objective selection methods on the example of the travelling salesman problem. In the EA+RL method and its proposed modification, reinforcement learning algorithms for stationary and non-stationary environments were used. The proposed modification applied with reinforcement learning for a non-stationary environment outperformed the considered objective selection algorithms on most problem instances. Practical Significance. The proposed approach increases the efficiency of evolutionary algorithms, which may be used for solving discrete NP-hard optimization problems, in particular combinatorial path search problems and scheduling problems.

  16. Greedy Algorithms for Nonnegativity-Constrained Simultaneous Sparse Recovery

    Science.gov (United States)

    Kim, Daeun; Haldar, Justin P.

    2016-01-01

    This work proposes a family of greedy algorithms to jointly reconstruct a set of vectors that are (i) nonnegative and (ii) simultaneously sparse with a shared support set. The proposed algorithms generalize previous approaches that were designed to impose these constraints individually. Similar to previous greedy algorithms for sparse recovery, the proposed algorithms iteratively identify promising support indices. In contrast to previous approaches, the support index selection procedure has been adapted to prioritize indices that are consistent with both the nonnegativity and shared support constraints. Empirical results demonstrate for the first time that the combined use of simultaneous sparsity and nonnegativity constraints can substantially improve recovery performance relative to existing greedy algorithms that impose less signal structure. PMID:26973368

  17. Theory of monochromators based on holographic toroidal arrays for the X-UV spectrum band. Tests of the 'TGM 10 metres, 4 degrees' on the ACO storage ring

    International Nuclear Information System (INIS)

    Lizon a Lugrin, Eric

    1988-01-01

    As the use of synchrotron radiation is strongly increasing, the need for monochromators in the X-UV range is very important. This research thesis aimed at the development of a prototype monochromator based on toroidal lamellar gratings used at grazing incidence. In the first part, the author recalls the theoretical aspects of the diffraction laws adapted to a lamellar grating and of the rules of wave-matter interaction. In the second part, he reports the calculation of the monochromator, its mechanical description, and its installation on the beam line of the ACO storage ring. In the third part, the author reports tests performed without an entrance slit and in reversed optical configuration on the ACO storage ring. The energy range, the linearity with respect to wavelength, the rejection of higher orders of diffracted light, the flux and the resolution comply with the expected values [fr

  18. A voting-based star identification algorithm utilizing local and global distribution

    Science.gov (United States)

    Fan, Qiaoyun; Zhong, Xuyang; Sun, Junhua

    2018-03-01

    A novel star identification algorithm based on a voting scheme is presented in this paper. In the proposed algorithm, the global distribution and local distribution of sensor stars are fully utilized, and a stratified voting scheme is adopted to obtain the candidates for sensor stars. Database optimization is employed to reduce the memory requirement and improve the robustness of the proposed algorithm. The simulation shows that the proposed algorithm exhibits a 99.81% identification rate with 2-pixel standard deviation of positional noise and 0.322-Mv magnitude noise. Compared with two similar algorithms, the proposed algorithm is more robust towards noise, and its average identification time and required memory are lower. Furthermore, a real sky test shows that the proposed algorithm performs well on real star images.

  19. Multi-User Identification-Based Eye-Tracking Algorithm Using Position Estimation

    Directory of Open Access Journals (Sweden)

    Suk-Ju Kang

    2016-12-01

    Full Text Available This paper proposes a new multi-user eye-tracking algorithm using position estimation. Conventional eye-tracking algorithms are typically suitable only for a single user and thereby cannot be used in a multi-user system. Even when they can track the eyes of multiple users, their detection accuracy is low and they cannot identify individual users. The proposed algorithm solves these problems and enhances the detection accuracy. Specifically, the proposed algorithm adopts a classifier to detect faces in the red, green, and blue (RGB) and depth images. It then calculates features based on the histogram of oriented gradients for the detected facial region to identify multiple users, and selects the template that best matches each user from a pre-determined face database. Finally, the proposed algorithm extracts the final eye positions based on anatomical proportions. Simulation results show that the proposed algorithm improved the average F1 score by up to 0.490 compared with benchmark algorithms.

  20. An empirical study on SAJQ (Sorting Algorithm for Join Queries

    Directory of Open Access Journals (Sweden)

    Hassan I. Mathkour

    2010-06-01

    Full Text Available Most queries applied to database management systems (DBMS) depend heavily on the performance of the sorting algorithm used. In addition to efficiency, stability is a major feature required of sorting algorithms used in performing DBMS queries. In this paper, we study a new Sorting Algorithm for Join Queries (SAJQ) that has both advantages of being efficient and stable. The proposed algorithm takes advantage of the m-way merge algorithm to enhance its time complexity: SAJQ performs the sorting operation in O(nlogm) time, where n is the length of the input array and m is the number of sub-arrays used in sorting. An unsorted input array of length n is arranged into m sorted sub-arrays, and the m-way merge algorithm merges the sorted m sub-arrays into the final sorted output array while keeping the stability of the keys intact. An analytical proof shows that, in the worst case, the proposed algorithm has a complexity of O(nlogm). A set of experiments investigating the performance of the proposed algorithm shows that it outperforms other stable sorting algorithms designed for join-based queries.
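
    The core merge step behind such an approach can be sketched with a stable m-way merge over m already-sorted runs, as below; how the m runs are produced from the unsorted input, and the join processing itself, are not shown.

    import heapq

    def m_way_merge_on_join_key(sorted_runs, key=lambda rec: rec[0]):
        """Stable m-way merge of m runs that are already sorted on the join key.

        heapq.merge performs the k-way merge in O(n log m) comparisons and, like a
        stable sort, keeps records with equal keys in their original run order.
        """
        return list(heapq.merge(*sorted_runs, key=key))

    # Example: three sorted runs of (join_key, payload) records
    runs = [[(1, "a"), (4, "d")], [(2, "b"), (4, "e")], [(3, "c"), (4, "f")]]
    print(m_way_merge_on_join_key(runs))   # records with key 4 keep their run order: d, e, f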

  1. An Efficient Algorithm for Unconstrained Optimization

    Directory of Open Access Journals (Sweden)

    Sergio Gerardo de-los-Cobos-Silva

    2015-01-01

    Full Text Available This paper presents an original and efficient PSO algorithm, which is divided into three phases: (1) stabilization, (2) breadth-first search, and (3) depth-first search. The proposed algorithm, called PSO-3P, was tested on 47 benchmark continuous unconstrained optimization problems, for a total of 82 instances. The numerical results show that the proposed algorithm is able to reach the global optimum. This work mainly focuses on unconstrained optimization problems with 2 to 1,000 variables.
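
    Only the common global-best PSO core is sketched below, since the record does not detail how the three phases modify the update equations; the inertia and acceleration constants and the bounds are generic assumptions.

    import numpy as np

    def pso(f, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, bounds=(-5, 5)):
        """Plain global-best particle swarm optimization (assumed constants)."""
        rng = rng_init = np.random.default_rng()
        lo, hi = bounds
        x = rng.uniform(lo, hi, (n_particles, dim))
        v = np.zeros((n_particles, dim))
        pbest, pbest_val = x.copy(), np.apply_along_axis(f, 1, x)
        gbest = pbest[np.argmin(pbest_val)].copy()
        for _ in range(iters):
            r1, r2 = rng.random((2, n_particles, dim))
            v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
            x = np.clip(x + v, lo, hi)
            vals = np.apply_along_axis(f, 1, x)
            improved = vals < pbest_val
            pbest[improved], pbest_val[improved] = x[improved], vals[improved]
            gbest = pbest[np.argmin(pbest_val)].copy()
        return gbest, f(gbest)

    # Example: minimise the sphere function in 10 dimensions
    print(pso(lambda z: float(np.sum(z * z)), dim=10))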

  2. A Unified Differential Evolution Algorithm for Global Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Qiang, Ji; Mitchell, Chad

    2014-06-24

    Abstract: In this paper, we propose a new unified differential evolution (uDE) algorithm for single-objective global optimization. Instead of selecting among multiple mutation strategies as in the conventional differential evolution algorithm, this algorithm employs a single equation as the mutation strategy. It has the virtue of mathematical simplicity and also provides users the flexibility for broader exploration of different mutation strategies. Numerical tests using twelve basic unimodal and multimodal functions show promising performance of the proposed algorithm in comparison to conventional differential evolution algorithms.

  3. Síndrome de bajo gasto cardíaco en la tetralogía de Fallot

    Directory of Open Access Journals (Sweden)

    Lincoln de la Parte Pérez

    2002-06-01

    Full Text Available A retrospective study was conducted of 200 children operated on for tetralogy of Fallot at the Cardiology Center of the "William Soler" University Pediatric Hospital between 1990 and 1993. A directly proportional relationship was observed between aortic cross-clamping time (anoxic arrest), cardiopulmonary bypass (CPB) time, and the development of low cardiac output syndrome. The incidence of low cardiac output in children with an aortic cross-clamping time of less than 30 minutes was zero; in patients with a time of less than 60 minutes the incidence was 23.2% (40 patients). In the 24 patients who underwent prolonged surgical procedures with aortic cross-clamping times longer than 1 hour, the incidence was 95.8%. In patients with CPB times of less than 90 minutes the incidence of low cardiac output was only 9.75%; in children with CPB times between 91 and 105 minutes it was 15.53%; in those requiring between 106 and 120 minutes of CPB it was 62.5%; and in prolonged surgical procedures it was 95.8%. Most patients responded well to treatment, although in a small number of them ventricular dysfunction persisted despite treatment. Seven children died in the postoperative period, for a mortality of 3.5%.

  4. The serial message-passing schedule for LDPC decoding algorithms

    Science.gov (United States)

    Liu, Mingshan; Liu, Shanshan; Zhou, Yuan; Jiang, Xue

    2015-12-01

    The conventional message-passing schedule for LDPC decoding algorithms is the so-called flooding schedule. It has the disadvantage that the updated messages cannot be used until the next iteration, which reduces the convergence speed. The layered decoding algorithm (LBP), based on a serial message-passing schedule, addresses this. In this paper the decoding principle of the LBP algorithm is briefly introduced, and two improved algorithms are then proposed: the grouped serial decoding algorithm (Grouped LBP) and the semi-serial decoding algorithm. They improve the LBP algorithm's decoding speed while maintaining good decoding performance.

  5. Phase retrieval via incremental truncated amplitude flow algorithm

    Science.gov (United States)

    Zhang, Quanbing; Wang, Zhifa; Wang, Linjie; Cheng, Shichao

    2017-10-01

    This paper considers the phase retrieval problem of recovering an unknown signal from given quadratic measurements. A phase retrieval algorithm based on Incremental Truncated Amplitude Flow (ITAF), which combines the ITWF algorithm and the TAF algorithm, is proposed. The proposed ITAF algorithm enhances the initialization by performing both of the truncation methods used in ITWF and TAF, and improves the gradient stage by applying the incremental method of ITWF to the loop stage of TAF. Moreover, the original sampling vectors and measurements are preprocessed before initialization according to the variance of the sensing matrix. Simulation experiments verified the feasibility and validity of the proposed ITAF algorithm. The experimental results show that it obtains a higher success rate and faster convergence than other algorithms. In particular, for noiseless random Gaussian signals, ITAF can accurately recover any real-valued signal from magnitude measurements whose number is about 2.5 times the signal length, close to the theoretical limit (about 2 times the signal length), and it usually converges to the optimal solution within 20 iterations, far fewer than state-of-the-art algorithms require.

  6. Network-Oblivious Algorithms

    DEFF Research Database (Denmark)

    Bilardi, Gianfranco; Pietracaprina, Andrea; Pucci, Geppino

    2016-01-01

    A framework is proposed for the design and analysis of network-oblivious algorithms, namely algorithms that can run unchanged, yet efficiently, on a variety of machines characterized by different degrees of parallelism and communication capabilities. The framework prescribes that a network......-oblivious algorithm be specified on a parallel model of computation where the only parameter is the problem’s input size, and then evaluated on a model with two parameters, capturing parallelism granularity and communication latency. It is shown that for a wide class of network-oblivious algorithms, optimality...... of cache hierarchies, to the realm of parallel computation. Its effectiveness is illustrated by providing optimal network-oblivious algorithms for a number of key problems. Some limitations of the oblivious approach are also discussed....

  7. Archimedean copula estimation of distribution algorithm based on artificial bee colony algorithm

    Institute of Scientific and Technical Information of China (English)

    Haidong Xu; Mingyan Jiang; Kun Xu

    2015-01-01

    The artificial bee colony (ABC) algorithm is a competitive stochastic population-based optimization algorithm. However, the ABC algorithm does not use social information and lacks knowledge of the problem structure, which leads to insufficiency in both convergence speed and search precision. The Archimedean copula estimation of distribution algorithm (ACEDA) is a relatively simple, time-economic and multivariate correlated EDA. This paper proposes a novel hybrid algorithm based on the ABC algorithm and ACEDA called the Archimedean copula estimation of distribution based on the artificial bee colony (ACABC) algorithm. The hybrid algorithm utilizes ACEDA to estimate the distribution model and then uses this information to help the artificial bees search more efficiently in the search space. Six benchmark functions are introduced to assess the performance of the ACABC algorithm on numerical function optimization. Experimental results show that the ACABC algorithm converges much faster with greater precision compared with the ABC algorithm, ACEDA and the global best (gbest)-guided ABC (GABC) algorithm in most of the experiments.

  8. Dynamics and performance of the free electron laser at Super-Aco with a harmonic RF cavity set on 500 MHz

    International Nuclear Information System (INIS)

    Nutarelli, D.

    2000-01-01

    This work is dedicated to developing the potentialities of the free electron laser installed on the storage ring Super-ACO at Orsay. We have studied the dynamics of the electron beam with a harmonic RF cavity set at 500 MHz. The impact of the geometric characteristics of the optical cavity on the transverse overlap between the laser radiation and the electron beam has been studied in detail. An important part of the work has been the optical characterization of the dielectric multilayer mirrors of the cavity; for that purpose a complete system has been designed to assess the changes in the optical properties of the mirrors during laser operation. Another important part of this work was the study of the interaction process between the laser radiation and the electron bunch leading to saturation. This interaction process has been simulated with a new model, and some predictions given by this model have been successfully compared with experimental data. The installation of the harmonic RF cavity has led to a significant increase of the laser gain, and the mean power of the laser radiation has reached 300 mW. An interesting application of this technique is the generation of high energy gamma photons through Compton backscattering. A collimated 35 MeV photon beam has been produced at Super-ACO at a rate of 5×10^6 photons per second. (A.C.)

  9. Automatic Circuit Design and Optimization Using Modified PSO Algorithm

    Directory of Open Access Journals (Sweden)

    Subhash Patel

    2016-04-01

    Full Text Available In this work, we propose a modified PSO algorithm based optimizer for automatic circuit design. The performance of the modified PSO algorithm is compared with two other evolutionary algorithms, namely the ABC algorithm and the standard PSO algorithm, by designing a two-stage CMOS operational amplifier and a bulk-driven OTA in 130 nm technology. The results show the robustness of the proposed algorithm. With the modified PSO algorithm, the average design error for the two-stage op-amp is only 0.054%, in contrast to 3.04% for the standard PSO algorithm and 5.45% for the ABC algorithm. For the bulk-driven OTA, the average design error is 1.32% with MPSO, compared to 4.70% with the ABC algorithm and 5.63% with the standard PSO algorithm.

  10. Cascade Error Projection: A New Learning Algorithm

    Science.gov (United States)

    Duong, T. A.; Stubberud, A. R.; Daud, T.; Thakoor, A. P.

    1995-01-01

    A new neural network architecture and a hardware-implementable learning algorithm are proposed. The algorithm, called cascade error projection (CEP), handles lack of precision and circuit noise better than existing algorithms.

  11. Adaptive Step Size Gradient Ascent ICA Algorithm for Wireless MIMO Systems

    Directory of Open Access Journals (Sweden)

    Zahoor Uddin

    2018-01-01

    Full Text Available Independent component analysis (ICA) is a technique of blind source separation (BSS) used for separation of mixed received signals. ICA algorithms are classified into adaptive and batch algorithms. Adaptive algorithms perform well in time-varying scenarios with high computational complexity, while batch algorithms have better separation performance in quasistatic channels with low computational complexity. Amongst batch algorithms, the gradient-based ICA algorithms perform well, but step size selection is critical in these algorithms. In this paper, an adaptive step size gradient ascent ICA (ASS-GAICA) algorithm is presented. The proposed algorithm is free from selection of the step size parameter, with improved convergence and separation performance. Different performance evaluation criteria are used to verify the effectiveness of the proposed algorithm. Performance of the proposed algorithm is compared with the FastICA and optimum block adaptive ICA (OBAICA) algorithms for quasistatic and time-varying wireless channels. Simulation is performed over quadrature amplitude modulation (QAM) and binary phase shift keying (BPSK) signals. Results show that the proposed algorithm outperforms the FastICA and OBAICA algorithms for a wide range of signal-to-noise ratios (SNR) and input data block lengths.

  12. A Hybrid ACO Approach to the Matrix Bandwidth Minimization Problem

    Science.gov (United States)

    Pintea, Camelia-M.; Crişan, Gloria-Cerasela; Chira, Camelia

    The evolution of human society raises ever more difficult endeavors, and for some real-life problems computing-time restrictions add to their complexity. The Matrix Bandwidth Minimization Problem (MBMP) seeks a simultaneous permutation of the rows and columns of a square matrix in order to keep its nonzero entries close to the main diagonal. The MBMP is a highly investigated NP-complete problem, as it has broad applications in industry, logistics, artificial intelligence and information recovery. This paper describes a new attempt to use the Ant Colony Optimization framework in tackling the MBMP. The introduced model is based on the hybridization of the Ant Colony System technique with new local search mechanisms. Computational experiments confirm a good performance of the proposed algorithm on the considered set of MBMP instances.

  13. Two General Extension Algorithms of Latin Hypercube Sampling

    Directory of Open Access Journals (Sweden)

    Zhi-zhao Liu

    2015-01-01

    Full Text Available To reuse original sampling points and thereby reduce the number of simulation runs, two general extension algorithms of Latin Hypercube Sampling (LHS) are proposed. The extension algorithms start with an original LHS of size m and construct a new LHS of size m+n that contains as many of the original points as possible. In order to obtain a strict LHS of larger size, some original points might have to be deleted. The relationship of the original sampling points in the new LHS structure is shown by a simple undirected acyclic graph. The basic general extension algorithm is proposed to retain the most original points, but it costs too much time. Therefore, a general extension algorithm based on a greedy algorithm is proposed to reduce the extension time, although it cannot guarantee that the maximum number of original points is retained. These algorithms are illustrated by an example and applied to evaluating sample means to demonstrate their effectiveness.
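
    For context, the sketch below generates a plain Latin Hypercube Sample from scratch; the paper's extension algorithms (growing an existing m-point LHS to m+n points while keeping as many original points as possible) are not reproduced here, and the function name is an assumption.

    ```python
    import numpy as np

    def latin_hypercube(m, dims, rng=np.random.default_rng(0)):
        """Return an m x dims sample with exactly one point in each of the m
        equal-probability strata of [0, 1) per dimension."""
        samples = np.empty((m, dims))
        for d in range(dims):
            # one uniform draw inside each stratum, strata visited in shuffled order
            samples[:, d] = (rng.permutation(m) + rng.random(m)) / m
        return samples

    print(latin_hypercube(5, 2))
    ```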

  14. 314. Extracción de dispositivos intracardíacos en pacientes sometidos previamente a procedimientos de cardiología intervencionista

    OpenAIRE

    E. Villagrán; L. Montes; Z. Garcés; A. Ayaon; M. Carnero; J. Silva; A. Alswies; J.E. Rodríguez

    2012-01-01

    Description of a cohort of patients who underwent implantation of a percutaneous intracardiac device (DCIP) and were ultimately operated on because of dysfunction of those devices. Material and methods: The entire cohort of patients undergoing DCIP extraction at our center was analyzed retrospectively. The patient sample is described and the surgical results are summarized. Results: From July 2007 to July 2011, 25 patients...

  15. Investigating fractional exhaled nitric oxide (FeNO) in chronic obstructive pulmonary disease (COPD) and asthma-COPD overlap (ACO): a scoping review protocol.

    Science.gov (United States)

    Mostafavi-Pour-Manshadi, Seyed-Mohammad-Yousof; Naderi, Nafiseh; Barrecheguren, Miriam; Dehghan, Abolfazl; Bourbeau, Jean

    2017-12-21

    During the last decade, many articles have been published, including reviews, on fractional exhaled nitric oxide (FeNO) use and utility in clinical practice, for monitoring and identifying eosinophilic airway inflammation, especially in asthma, and for evaluating corticosteroid responsiveness. However, the exact role of FeNO in patients with chronic obstructive pulmonary disease (COPD), and its ability to distinguish patients with COPD from those with concomitant asthma, that is, asthma-COPD overlap (ACO), is still unclear and needs to be defined. Because of the broad scope of FeNO in chronic airway disease, we undertook a scoping review. The present article describes the protocol of a scoping review of peer-reviewed published literature specific to FeNO in COPD/ACO over the last decade. We used the Joanna Briggs Institute Reviewers' Manual scoping review methodology as well as the frameworks of Levac et al and Arksey et al as guides. We searched a variety of databases, including Medline, Embase, Cumulative Index to Nursing and Allied Health Literature (CINAHL), Cochrane Library, Web of Science, and BioSciences Information Service (BIOSIS) on 29 June 2016. Additional studies will be identified by exploring the reference lists of eligible studies. Screening of eligible studies will be performed independently by two reviewers, and any disagreement will be resolved by a third reviewer. We will analyse the data gathered from article bibliographies and abstracts. A scoping review, which does not require ethics approval, is an appropriate model for investigating the body of published studies on the role of FeNO in patients with COPD and its usefulness in the clinical setting. This review is expected to yield new insights for future research specific to FeNO in the COPD/ACO population. The results of this study will be reported in scientific meetings and conferences, which aim to provide information to clinicians, primary care providers and basic

  16. DNABIT Compress – Genome compression algorithm

    Science.gov (United States)

    Rajarajeswari, Pothuraju; Apparao, Allam

    2011-01-01

    Data compression is concerned with how information is organized in data. Efficient storage means removal of redundancy from the data being stored in the DNA molecule. Data compression algorithms remove redundancy and are used to understand biologically important molecules. We present a compression algorithm, "DNABIT Compress", for DNA sequences based on a novel scheme of assigning binary bits to smaller segments of DNA bases to compress both repetitive and non-repetitive DNA sequences. Our proposed algorithm achieves the best compression ratio for DNA sequences for larger genomes. Significantly better compression results show that the "DNABIT Compress" algorithm is the best among the compared compression algorithms. While achieving the best compression ratios for DNA sequences (genomes), our new DNABIT Compress algorithm significantly improves on the running time of previous DNA compression programs. Assigning binary bits (unique bit codes) to fragments of a DNA sequence (exact repeats, reverse repeats) is also a unique concept introduced in this algorithm for the first time in DNA compression. The proposed algorithm achieves a compression ratio as low as 1.58 bits/base, where the existing best methods could not achieve a ratio below 1.72 bits/base. PMID:21383923
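
    As a hedged illustration of the simplest ingredient only, the sketch below packs each DNA base into two bits (2 bits/base). The repeat detection and variable-length bit codes that let DNABIT Compress reach about 1.58 bits/base are not reproduced, and all names are assumptions.

    ```python
    # naive 2-bit packing of an ACGT string (baseline, not the DNABIT coding scheme)
    BASE_TO_BITS = {'A': 0b00, 'C': 0b01, 'G': 0b10, 'T': 0b11}
    BITS_TO_BASE = {v: k for k, v in BASE_TO_BITS.items()}

    def pack(seq):
        """Pack an ACGT string into bytes, 4 bases per byte (length kept separately)."""
        out = bytearray()
        for i in range(0, len(seq), 4):
            byte = 0
            for base in seq[i:i + 4]:
                byte = (byte << 2) | BASE_TO_BITS[base]
            byte <<= 2 * (4 - len(seq[i:i + 4]))      # pad the final partial byte
            out.append(byte)
        return bytes(out), len(seq)

    def unpack(data, n):
        bases = []
        for byte in data:
            for shift in (6, 4, 2, 0):
                bases.append(BITS_TO_BASE[(byte >> shift) & 0b11])
        return ''.join(bases[:n])

    packed, n = pack("ACGTACGTTG")
    assert unpack(packed, n) == "ACGTACGTTG"
    ```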

  17. Block Least Mean Squares Algorithm over Distributed Wireless Sensor Network

    Directory of Open Access Journals (Sweden)

    T. Panigrahi

    2012-01-01

    Full Text Available In a distributed parameter estimation problem, during each sampling instant, a typical sensor node communicates its estimate either by the diffusion algorithm or by the incremental algorithm. Both these conventional distributed algorithms involve significant communication overheads and, consequently, defeat the basic purpose of wireless sensor networks. In the present paper, we therefore propose two new distributed algorithms, namely, block diffusion least mean square (BDLMS and block incremental least mean square (BILMS by extending the concept of block adaptive filtering techniques to the distributed adaptation scenario. The performance analysis of the proposed BDLMS and BILMS algorithms has been carried out and found to have similar performances to those offered by conventional diffusion LMS and incremental LMS algorithms, respectively. The convergence analyses of the proposed algorithms obtained from the simulation study are also found to be in agreement with the theoretical analysis. The remarkable and interesting aspect of the proposed block-based algorithms is that their communication overheads per node and latencies are less than those of the conventional algorithms by a factor as high as the block size used in the algorithms.
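
    The sketch below shows a single-node block LMS update, which is the building block the paper distributes; the diffusion and incremental variants add neighbor combination or estimate hand-off on top of it. Filter length, block size and step size are illustrative assumptions.

    ```python
    import numpy as np

    def block_lms(x, d, num_taps=8, block=4, mu=0.05):
        """Adapt an FIR filter, applying one averaged-gradient update per block."""
        w = np.zeros(num_taps)
        for start in range(num_taps, len(x) - block, block):
            grad = np.zeros(num_taps)
            for k in range(start, start + block):
                u = x[k - num_taps + 1:k + 1][::-1]      # most recent sample first
                grad += (d[k] - w @ u) * u               # accumulate e(k) * u(k)
            w += mu * grad / block                       # single update for the block
        return w

    # Example: identify an unknown 8-tap channel from noisy observations.
    rng = np.random.default_rng(0)
    h = rng.standard_normal(8)
    x = rng.standard_normal(5000)
    d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
    print(np.round(block_lms(x, d) - h, 2))              # should be close to zero
    ```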

  18. Opposition-Based Adaptive Fireworks Algorithm

    OpenAIRE

    Chibing Gong

    2016-01-01

    A fireworks algorithm (FWA) is a recent swarm intelligence algorithm inspired by observing fireworks explosions. The adaptive fireworks algorithm (AFWA) adds adaptive amplitudes to improve the performance of the enhanced fireworks algorithm (EFWA). The purpose of this paper is to add opposition-based learning (OBL) to AFWA with the goal of further boosting performance and achieving global optimization. Twelve benchmark functions are tested in use of an opposition-based a...

  19. Frequency-Domain Adaptive Algorithm for Network Echo Cancellation in VoIP

    Directory of Open Access Journals (Sweden)

    Patrick A. Naylor

    2008-05-01

    Full Text Available We propose a new low complexity, low delay, and fast converging frequency-domain adaptive algorithm for network echo cancellation in VoIP exploiting MMax and sparse partial (SP) tap-selection criteria in the frequency domain. We incorporate these tap-selection techniques into the multidelay filtering (MDF) algorithm in order to mitigate the delay inherent in frequency-domain algorithms. We illustrate two such approaches and discuss their tradeoff between convergence performance and computational complexity. Simulation results show an improvement in convergence rate for the proposed algorithm over MDF and significantly reduced complexity. The proposed algorithm achieves a convergence performance close to that of the recently proposed, but substantially more complex, improved proportionate MDF (IPMDF) algorithm.

  20. Ant colony optimization techniques for the hamiltonian p-median problem

    Directory of Open Access Journals (Sweden)

    M. Zohrehbandian

    2010-12-01

    Full Text Available Location-routing problems involve locating a number of facilities among candidate sites and establishing delivery routes to a set of users in such a way that the total system cost is minimized. A special case of these problems is the Hamiltonian p-Median problem (HpMP). This research applies the metaheuristic method of ant colony optimization (ACO) to solve the HpMP. Modifications are made to the ACO algorithm used to solve the traditional vehicle routing problem (VRP) in order to allow the search for the optimal solution of the HpMP. A computational experiment with this metaheuristic algorithm is also reported.
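
    The two ACO ingredients reused in such VRP-style adaptations are the probabilistic next-city choice and the pheromone update. The sketch below shows generic versions of both on a toy instance; the HpMP-specific construction (splitting the tour into p cycles) is not reproduced, and the parameter values are assumptions.

    ```python
    import numpy as np

    def choose_next(current, unvisited, tau, dist, alpha=1.0, beta=2.0,
                    rng=np.random.default_rng(0)):
        """Pick the next node with probability proportional to tau^alpha * (1/d)^beta."""
        cand = np.array(sorted(unvisited))
        w = tau[current, cand] ** alpha * (1.0 / dist[current, cand]) ** beta
        return int(rng.choice(cand, p=w / w.sum()))

    def update_pheromone(tau, tours, lengths, rho=0.5, Q=1.0):
        """Evaporate, then deposit Q / tour_length on every edge each ant used."""
        tau *= (1.0 - rho)
        for tour, L in zip(tours, lengths):
            for a, b in zip(tour, tour[1:] + tour[:1]):
                tau[a, b] += Q / L
                tau[b, a] += Q / L
        return tau

    # Tiny demo: one ant builds a tour on a random 6-node instance.
    rng = np.random.default_rng(1)
    pts = rng.random((6, 2))
    dist = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    tau = np.ones((6, 6))
    tour, unvisited = [0], set(range(1, 6))
    while unvisited:
        nxt = choose_next(tour[-1], unvisited, tau, dist)
        tour.append(nxt)
        unvisited.remove(nxt)
    length = sum(dist[a, b] for a, b in zip(tour, tour[1:] + tour[:1]))
    tau = update_pheromone(tau, [tour], [length])
    print(tour, round(length, 3))
    ```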

  1. Adaptive symbiotic organisms search (SOS algorithm for structural design optimization

    Directory of Open Access Journals (Sweden)

    Ghanshyam G. Tejani

    2016-07-01

    Full Text Available The symbiotic organisms search (SOS) algorithm is an effective metaheuristic developed in 2014, which mimics the symbiotic relationships among living beings, such as mutualism, commensalism, and parasitism, that allow them to survive in an ecosystem. In this study, three modified versions of the SOS algorithm are proposed by introducing adaptive benefit factors into the basic SOS algorithm to improve its efficiency. The basic SOS algorithm only considers benefit factors, whereas the proposed variants of the SOS algorithm consider effective combinations of adaptive benefit factors and benefit factors to study their ability to strike a good balance between exploration and exploitation of the search space. The proposed algorithms are tested on engineering structures subjected to dynamic excitation, which may lead to undesirable vibrations. Structure optimization problems become more challenging if the shape and size variables are taken into account along with the frequency. To check the feasibility and effectiveness of the proposed algorithms, six different planar and space trusses are subjected to experimental analysis. The results obtained using the proposed methods are compared with those obtained using other optimization methods well established in the literature. The results reveal that the adaptive SOS algorithm is more reliable and efficient than the basic SOS algorithm and other state-of-the-art algorithms.

  2. Discrete Riccati equation solutions: Distributed algorithms

    Directory of Open Access Journals (Sweden)

    D. G. Lainiotis

    1996-01-01

    Full Text Available In this paper new distributed algorithms for the solution of the discrete Riccati equation are introduced. The algorithms are used to provide robust and computational efficient solutions to the discrete Riccati equation. The proposed distributed algorithms are theoretically interesting and computationally attractive.

  3. Rate-control algorithms testing by using video source model

    DEFF Research Database (Denmark)

    Belyaev, Evgeny; Turlikov, Andrey; Ukhanova, Anna

    2008-01-01

    In this paper a method for testing rate control algorithms by means of a video source model is suggested. The proposed method allows algorithm testing over a large test set to be significantly improved.

  4. Scalable Nearest Neighbor Algorithms for High Dimensional Data.

    Science.gov (United States)

    Muja, Marius; Lowe, David G

    2014-11-01

    For many computer vision and machine learning problems, large training sets are key for good performance. However, the most computationally expensive part of many computer vision and machine learning algorithms consists of finding nearest neighbor matches to high dimensional vectors that represent the training data. We propose new algorithms for approximate nearest neighbor matching and evaluate and compare them with previous algorithms. For matching high dimensional features, we find two algorithms to be the most efficient: the randomized k-d forest and a new algorithm proposed in this paper, the priority search k-means tree. We also propose a new algorithm for matching binary features by searching multiple hierarchical clustering trees and show it outperforms methods typically used in the literature. We show that the optimal nearest neighbor algorithm and its parameters depend on the data set characteristics and describe an automated configuration procedure for finding the best algorithm to search a particular data set. In order to scale to very large data sets that would otherwise not fit in the memory of a single machine, we propose a distributed nearest neighbor matching framework that can be used with any of the algorithms described in the paper. All this research has been released as an open source library called fast library for approximate nearest neighbors (FLANN), which has been incorporated into OpenCV and is now one of the most popular libraries for nearest neighbor matching.
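
    A hedged usage sketch of FLANN through its OpenCV binding is shown below; the randomized k-d forest index and the ratio test follow common tutorial defaults rather than settings from the paper, and the random descriptors stand in for real SIFT features. It assumes the opencv-python package is installed.

    ```python
    import numpy as np
    import cv2

    rng = np.random.default_rng(0)
    descriptors_a = rng.random((500, 128)).astype(np.float32)   # stand-ins for SIFT
    descriptors_b = rng.random((500, 128)).astype(np.float32)

    FLANN_INDEX_KDTREE = 1
    index_params = dict(algorithm=FLANN_INDEX_KDTREE, trees=5)   # randomized k-d forest
    search_params = dict(checks=50)                              # leaves visited per query

    matcher = cv2.FlannBasedMatcher(index_params, search_params)
    matches = matcher.knnMatch(descriptors_a, descriptors_b, k=2)

    # Lowe's ratio test to keep only distinctive matches
    good = [m for m, n in matches if m.distance < 0.7 * n.distance]
    print(len(good))
    ```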

  5. New Dandelion Algorithm Optimizes Extreme Learning Machine for Biomedical Classification Problems

    Directory of Open Access Journals (Sweden)

    Xiguang Li

    2017-01-01

    Full Text Available Inspired by the behavior of dandelion sowing, a novel swarm intelligence algorithm, the dandelion algorithm (DA), is proposed for the global optimization of complex functions. In DA, the dandelion population is divided into two subpopulations, and the different subpopulations undergo different sowing behaviors. Moreover, another sowing method is designed to jump out of local optima. In order to demonstrate the validity of DA, we compare the proposed algorithm with other existing algorithms, including the bat algorithm, particle swarm optimization, and the enhanced fireworks algorithm. Simulations show that the proposed algorithm is much superior to the other algorithms. At the same time, the proposed algorithm can be applied to optimize an extreme learning machine (ELM) for biomedical classification problems, with considerable effect. Finally, we use different fusion methods to form different fusion classifiers, and the fusion classifiers can achieve higher accuracy and better stability to some extent.

  6. Multilevel Image Segmentation Based on an Improved Firefly Algorithm

    Directory of Open Access Journals (Sweden)

    Kai Chen

    2016-01-01

    Full Text Available Multilevel image segmentation is time-consuming and involves large computation. The firefly algorithm has been applied to enhance the efficiency of multilevel image segmentation. However, in some cases the firefly algorithm is easily trapped in local optima. In this paper, an improved firefly algorithm (IFA) is proposed to search for multilevel thresholds. In IFA, in order to help fireflies escape from local optima and accelerate convergence, two strategies (a diversity-enhancing strategy with Cauchy mutation and a neighborhood strategy) are proposed and adaptively chosen according to different stagnation situations. The proposed IFA is compared with three benchmark optimization algorithms, namely Darwinian particle swarm optimization, hybrid differential evolution optimization, and the firefly algorithm. The experimental results show that the proposed method can efficiently segment multilevel images and obtains better performance than the other three methods.

  7. Pharmacological Management of Elderly Patients with Asthma-Chronic Obstructive Pulmonary Disease Overlap Syndrome: Room for Speculation?

    Science.gov (United States)

    Castiglia, Daniela; Battaglia, Salvatore; Benfante, Alida; Sorino, Claudio; Scichilone, Nicola

    2016-06-01

    Asthma and chronic obstructive pulmonary disease (COPD) are two distinct diseases that share a condition of chronic inflammation of the airways and bronchial obstruction. In clinical settings, it is not rare to come across patients who present with clinical and functional features of both diseases, posing a diagnostic dilemma. The overlap condition has been termed asthma-COPD overlap syndrome (ACOS), and mainly occurs in individuals with long-standing asthma, especially if they are also current or former smokers. Patients with ACOS have poorer health-related quality of life and a higher exacerbation rate than subjects with asthma or COPD alone. Whether ACOS is a distinct nosological entity with genetic variants or rather a condition of concomitant diseases that overlap is still a matter of debate. However, there is no doubt that extended life expectancy has increased the prevalence of asthma and COPD in older ages, and thus the probability that overlap conditions occur in clinical settings. In addition, age-associated changes of the lung create the basis for the two entities to converge on the same subject. ACOS patients may benefit from a stepwise treatment similar to that of asthma and COPD; however, the proposed therapeutic algorithms are only speculative and extrapolated from studies that are not representative of the ACOS population. Inhaled corticosteroids are the mainstay of therapy, and always in conjunction with long-acting bronchodilators. The potential heterogeneity of the overlap syndrome in terms of inflammatory features (T helper-1 vs. T helper-2 pathways) may be responsible for the different responses to treatments. The interaction between respiratory drugs and concomitant diseases should be carefully evaluated. Similarly, the effect of non-respiratory drugs, such as aspirin, statins, and β-blockers, on lung function needs to be properly assessed.

  8. A Self Adaptive Differential Evolution Algorithm for Global Optimization

    Science.gov (United States)

    Kumar, Pravesh; Pant, Millie

    This paper presents a new differential evolution algorithm based on the hybridization of adaptive control parameters and trigonometric mutation. First we propose a self-adaptive DE named ADE, in which the control parameters F and Cr are not fixed at constant values but are adapted iteratively. The proposed algorithm is further modified by applying trigonometric mutation, and the corresponding algorithm is named ATDE. The performance of ATDE is evaluated on a set of 8 benchmark functions and the results are compared with the classical DE algorithm in terms of average fitness value, number of function evaluations, convergence time and success rate. The numerical results show the competence of the proposed algorithm.

  9. Improved Collaborative Filtering Algorithm using Topic Model

    Directory of Open Access Journals (Sweden)

    Liu Na

    2016-01-01

    Full Text Available Collaborative filtering algorithms make use of interaction ratings between users and items to generate recommendations. Similarity among users or items is mostly calculated based on ratings, without considering explicit properties of the users or items involved. In this paper, we propose a collaborative filtering algorithm using a topic model. We describe the user-item matrix as a document-word matrix: users are represented as random mixtures over items, and each item is characterized by a distribution over users. The experiments show that the proposed algorithm achieves better performance compared with other state-of-the-art algorithms on the MovieLens data sets.
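
    One plausible reading of this description is to treat each user's rated items as a document and fit a topic model, then score unseen items by topic affinity. The sketch below follows that reading with scikit-learn's LDA on a toy matrix; the paper's exact formulation may differ, and all data and parameters are illustrative.

    ```python
    import numpy as np
    from sklearn.decomposition import LatentDirichletAllocation

    # toy user-item rating-count matrix (rows: users, columns: items)
    rng = np.random.default_rng(0)
    user_item = rng.integers(0, 3, size=(6, 10))

    lda = LatentDirichletAllocation(n_components=3, random_state=0)
    user_topics = lda.fit_transform(user_item)             # users as topic mixtures
    item_topics = lda.components_ / lda.components_.sum(axis=1, keepdims=True)

    # score unseen items for user 0 by topic affinity, masking items already rated
    scores = user_topics[0] @ item_topics
    scores[user_item[0] > 0] = -np.inf
    print("recommend item", int(np.argmax(scores)))
    ```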

  10. Key Distribution and Changing Key Cryptosystem Based on Phase Retrieval Algorithm and RSA Public-Key Algorithm

    Directory of Open Access Journals (Sweden)

    Tieyu Zhao

    2015-01-01

    Full Text Available Optical image encryption has attracted more and more researchers' attention, and various encryption schemes have been proposed. In existing optical cryptosystems, phase functions or images are usually used as the encryption keys, and it is difficult to use a traditional public-key algorithm (such as RSA or ECC) to transfer large numerical keys. In this paper, we propose a key distribution scheme based on the phase retrieval algorithm and the RSA public-key algorithm, which solves the key distribution problem in optical image encryption systems. Furthermore, we also propose a novel image encryption system based on this key distribution principle. In the system, different keys can be used in every encryption process, which greatly improves the security of the system.

  11. An Improved Perturb and Observe Algorithm for Photovoltaic Motion Carriers

    Science.gov (United States)

    Peng, Lele; Xu, Wei; Li, Liming; Zheng, Shubin

    2018-03-01

    An improved perturb and observe algorithm for photovoltaic motion carriers is proposed in this paper. The model of the proposed algorithm is given by using the Lambert W function and the tangent error method. Moreover, the tracking performance of the proposed algorithm is tested using MATLAB simulations and experiments on a photovoltaic system. The results demonstrate that the improved algorithm has fast tracking speed and high efficiency, and the energy conversion efficiency of the improved method has increased by nearly 8.2%.
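
    The sketch below shows the classic perturb-and-observe loop that such improvements build on; the Lambert W and tangent-error refinements described in the abstract are not reproduced, and the toy panel curve, step size and reference voltage are assumptions.

    ```python
    def perturb_and_observe(measure_pv, v_ref=17.0, step=0.1, iterations=200):
        """measure_pv(v_ref) -> (voltage, power) at the operating point it sets."""
        v_prev, p_prev = measure_pv(v_ref)
        for _ in range(iterations):
            v_ref += step
            v, p = measure_pv(v_ref)
            # keep perturbing the same way if power rose with the move, else reverse
            if (p - p_prev) * (v - v_prev) < 0:
                step = -step
            v_prev, p_prev = v, p
        return v_ref

    # toy PV curve with its maximum power point near 18 V (illustrative only)
    def fake_panel(v):
        return v, max(0.0, 60.0 - 0.5 * (v - 18.0) ** 2)

    print(round(perturb_and_observe(fake_panel), 2))       # settles near 18 V
    ```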

  12. A novel hybrid algorithm of GSA with Kepler algorithm for numerical optimization

    Directory of Open Access Journals (Sweden)

    Soroor Sarafrazi

    2015-07-01

    Full Text Available It is now well recognized that pure algorithms can be promisingly improved by hybridization with other techniques. One of the relatively new metaheuristic algorithms is the Gravitational Search Algorithm (GSA), which is based on Newton's laws. In this paper, to enhance the performance of GSA, a novel algorithm called "Kepler", inspired by astrophysics, is introduced. The Kepler algorithm is based on the principle of Kepler's first law. The hybridization of GSA and the Kepler algorithm is an efficient approach to provide much stronger specialization in intensification and/or diversification. The performance of GSA-Kepler is evaluated by applying it to 14 benchmark functions with 20-1000 dimensions and to the optimal approximation of a linear system as a practical optimization problem. The results obtained reveal that the proposed hybrid algorithm is robust enough to optimize the benchmark functions and practical optimization problems.

  13. Low-Complexity Compression Algorithm for Hyperspectral Images Based on Distributed Source Coding

    Directory of Open Access Journals (Sweden)

    Yongjian Nian

    2013-01-01

    Full Text Available A low-complexity compression algorithm for hyperspectral images based on distributed source coding (DSC) is proposed in this paper. The proposed distributed compression algorithm can realize both lossless and lossy compression, implemented by performing a scalar quantization strategy on the original hyperspectral images followed by distributed lossless compression. A multilinear regression model is introduced for the distributed lossless compression in order to improve the quality of the side information. The optimal quantization step is determined subject to the constraint of correct DSC decoding, which makes the proposed algorithm achieve near-lossless compression. Moreover, an effective rate-distortion algorithm is introduced so that the proposed algorithm achieves low bit rates. Experimental results show that the compression performance of the proposed algorithm is competitive with that of state-of-the-art compression algorithms for hyperspectral images.

  14. Android Malware Classification Using K-Means Clustering Algorithm

    Science.gov (United States)

    Hamid, Isredza Rahmi A.; Syafiqah Khalid, Nur; Azma Abdullah, Nurul; Rahman, Nurul Hidayah Ab; Chai Wen, Chuah

    2017-08-01

    Malware is designed to gain access to or damage a computer system without the user's knowledge, and attackers also exploit malware to commit crime or fraud. This paper proposes an Android malware classification approach based on the K-Means clustering algorithm. We evaluate the proposed model in terms of accuracy using machine learning algorithms. Two datasets, VirusTotal and Malgenome, were selected to demonstrate the K-Means clustering approach. We classify the Android malware into three clusters: ransomware, scareware and goodware. Nine features were considered for each dataset: Lock Detected, Text Detected, Text Score, Encryption Detected, Threat, Porn, Law, Copyright and Moneypak. We used IBM SPSS Statistics software for data classification and the WEKA tool to evaluate the built clusters. The proposed K-Means clustering algorithm shows promising results with high accuracy when tested using the Random Forest algorithm.
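
    The clustering step itself is straightforward to illustrate. The sketch below runs K-Means on made-up feature vectors named after the nine features listed above; the values are random placeholders, not the VirusTotal or Malgenome data.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    features = ["lock_detected", "text_detected", "text_score", "encryption_detected",
                "threat", "porn", "law", "copyright", "moneypak"]
    rng = np.random.default_rng(0)
    X = rng.random((30, len(features)))          # placeholder for real app samples

    kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
    print(kmeans.labels_)                        # one of three cluster indices per sample
    # A supervised model (e.g. a random forest) can then be trained on these labels
    # to check how well the clusters separate, mirroring the paper's evaluation step.
    ```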

  15. Watermarking Algorithms for 3D NURBS Graphic Data

    Directory of Open Access Journals (Sweden)

    Jae Jun Lee

    2004-10-01

    Full Text Available Two watermarking algorithms for 3D nonuniform rational B-spline (NURBS) graphic data are proposed: one is appropriate for steganography, and the other for watermarking. Instead of directly embedding data into the parameters of the NURBS, the proposed algorithms embed data into 2D virtual images extracted by parameter sampling of the 3D model. As a result, the proposed steganography algorithm can embed information into more places on the surface than the conventional algorithm, while preserving the data size of the model. Also, any existing 2D watermarking technique can be used for the watermarking of 3D NURBS surfaces. From the experiment, it is found that the algorithm for watermarking is robust to attacks on weights, control points, and knots. It is also found to be robust to the remodeling of NURBS models.

  16. [A new peak detection algorithm of Raman spectra].

    Science.gov (United States)

    Jiang, Cheng-Zhi; Sun, Qiang; Liu, Ying; Liang, Jing-Qiu; An, Yan; Liu, Bing

    2014-01-01

    The authors propose a new Raman peak recognition method named the bi-scale correlation algorithm. The algorithm uses a combination of the correlation coefficient and the local signal-to-noise ratio under two scales to achieve Raman peak identification. We compared the performance of the proposed algorithm with that of the traditional continuous wavelet transform method in MATLAB, and then tested the algorithm with real Raman spectra. The results show that the average time for identifying a Raman spectrum is 0.51 s with the algorithm, while it is 0.71 s with the continuous wavelet transform. When the signal-to-noise ratio of a Raman peak is greater than or equal to 6 (modern Raman spectrometers feature an excellent signal-to-noise ratio), the recognition accuracy of the algorithm is higher than 99%, while it is less than 84% with the continuous wavelet transform method. The mean and standard deviation of the peak position identification error of the algorithm are both less than those of the continuous wavelet transform method. Simulation analysis and experimental verification show that the new algorithm possesses the following advantages: no need for human intervention, no need for de-noising or background removal, higher recognition speed and higher recognition accuracy. The proposed algorithm is practical for Raman peak identification.

  17. Some nonlinear space decomposition algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Tai, Xue-Cheng; Espedal, M. [Univ. of Bergen (Norway)

    1996-12-31

    Convergence of a space decomposition method is proved for a general convex programming problem. The space decomposition refers to methods that decompose a space into sums of subspaces, which could be a domain decomposition or a multigrid method for partial differential equations. Two algorithms are proposed. Both can be used for linear as well as nonlinear elliptic problems, and they reduce to the standard additive and multiplicative Schwarz methods for linear elliptic problems. Two "hybrid" algorithms are also presented. They converge faster than the additive one and have better parallelism than the multiplicative method. Numerical tests with a two-level domain decomposition for linear, nonlinear and interface elliptic problems are presented for the proposed algorithms.

  18. A robust firearm identification algorithm of forensic ballistics specimens

    Science.gov (United States)

    Chuan, Z. L.; Jemain, A. A.; Liong, C.-Y.; Ghani, N. A. M.; Tan, L. K.

    2017-09-01

    Existing firearm identification algorithms have several inherent difficulties, including the need for physical interpretation and their time-consuming nature. Therefore, the aim of this study is to propose a robust algorithm for firearm identification based on extracting a set of informative features from the segmented region of interest (ROI) of simulated noisy center-firing pin impression images. The proposed algorithm comprises a Laplacian sharpening filter, clustering-based threshold selection, an unweighted least squares estimator, and segmentation of a square ROI from the noisy images. A total of 250 simulated noisy images collected from five different pistols of the same make, model and caliber are used to evaluate the robustness of the proposed algorithm. This study found that the proposed algorithm is able to perform the identification task on noisy images with noise levels as high as 70%, while maintaining a firearm identification accuracy rate of over 90%.

  19. Incoherent beam combining based on the momentum SPGD algorithm

    Science.gov (United States)

    Yang, Guoqing; Liu, Lisheng; Jiang, Zhenhua; Guo, Jin; Wang, Tingfeng

    2018-05-01

    Incoherent beam combining (ICBC) technology is one of the most promising ways to achieve high-energy, near-diffraction-limited laser output. In this paper, the momentum method is proposed as a modification of the stochastic parallel gradient descent (SPGD) algorithm. The momentum method can efficiently improve the convergence speed of the combining system. An analytical method is employed to interpret the principle of the momentum method. Furthermore, the proposed algorithm is validated through simulations as well as experiments. The results of the simulations and the experiments show that the proposed algorithm not only accelerates the iteration, but also keeps the combining process stable. Therefore the feasibility of the proposed algorithm in the beam combining system is verified.
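
    A hedged sketch of an SPGD loop with a momentum term is given below, driven by a toy quadratic metric; in the real combining system the metric would be a measured far-field power signal, and the gain, perturbation amplitude and momentum factor used here are assumptions.

    ```python
    import numpy as np

    def spgd_momentum(metric, u, gain=0.5, delta=0.05, beta=0.7, iterations=300,
                      rng=np.random.default_rng(0)):
        """Maximize metric(u) with two-sided SPGD plus a momentum term."""
        v = np.zeros_like(u)
        for _ in range(iterations):
            du = delta * rng.choice([-1.0, 1.0], size=u.shape)   # Bernoulli perturbation
            dJ = metric(u + du) - metric(u - du)                 # two-sided estimate
            v = beta * v + gain * dJ * du                        # momentum accumulation
            u = u + v
        return u

    # toy metric: maximized when all control phases are zero
    metric = lambda u: -float(np.sum(u ** 2))
    print(np.round(spgd_momentum(metric, np.array([1.0, -0.8, 0.5])), 3))
    ```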

  20. [Adequacy of clinical interventions in patients with advanced and complex disease. Proposal of a decision making algorithm].

    Science.gov (United States)

    Ameneiros-Lago, E; Carballada-Rico, C; Garrido-Sanjuán, J A; García Martínez, A

    2015-01-01

    Decision making in the patient with chronic advanced disease is especially complex. Health professionals are obliged to prevent avoidable suffering and not to add any more damage to that of the disease itself. The adequacy of the clinical interventions consists of only offering those diagnostic and therapeutic procedures appropriate to the clinical situation of the patient and to perform only those allowed by the patient or representative. In this article, the use of an algorithm is proposed that should serve to help health professionals in this decision making process. Copyright © 2014 SECA. Published by Elsevier Espana. All rights reserved.

  1. Hacia una descripción del cuento policíaco español contemporáneo

    Directory of Open Access Journals (Sweden)

    Alicia Valverde Velasco

    2014-06-01

    Full Text Available Since its reception in Spain in the nineteenth century, the crime short story had never experienced a boom in cultivation and acceptance like the one that began in the 1970s. Since then the literary short story, as a narrative form capable of developing plots of very different kinds, including that of the criminal act, has gone through a period of gradual recognition by authors, publishers, the public and critics, so it is not surprising that writers of the so-called "crime genre" ventured to explore the possibilities of the short story with greater fortune than in the preceding decades.

  2. An Enhanced Genetic Algorithm for the Generalized Traveling Salesman Problem

    Directory of Open Access Journals (Sweden)

    H. Jafarzadeh

    2017-12-01

    Full Text Available The generalized traveling salesman problem (GTSP) deals with finding the minimum-cost tour in a clustered set of cities. In this problem, the traveler is interested in finding the best path that goes through all clusters. As this problem is NP-hard, implementing a metaheuristic algorithm to solve large-scale instances is inevitable. The performance of these algorithms can be greatly improved by other heuristic algorithms. In this study, a search method is developed that considerably improves the quality of the solutions and the computation time in comparison with the genetic algorithm. In the proposed algorithm, the genetic algorithm is combined with the Nearest Neighbor Search (NNS) and a heuristic mutation operator is applied. According to the experimental results on a set of standard test problems with symmetric distances, the proposed algorithm finds the best solutions in most cases with the least computational time. The proposed algorithm is highly competitive with previously published algorithms in both solution quality and running time.

  3. Comparação entre a medida contínua do débito cardíaco e por termodiluição em bolus durante a revascularização miocárdica sem circulação extracorpórea

    Directory of Open Access Journals (Sweden)

    Kim Sílvia M.

    2004-01-01

    Full Text Available BACKGROUND AND OBJECTIVES: Off-pump coronary artery bypass grafting is associated with significant and abrupt hemodynamic changes that may not be promptly detected by continuous cardiac output measurement. This study compares cardiac index measurements obtained with a pulmonary artery catheter with a thermal filament (Baxter Edwards Critical Care, Irvine, CA) against the standard bolus thermodilution method during the distal coronary anastomosis. METHODS: Ten patients undergoing off-pump coronary artery bypass grafting were monitored with the thermal-filament pulmonary artery catheter. Cardiac index measurements were obtained at four moments: at the beginning of anesthesia with the chest still closed (M1), after sternotomy (M2), after stabilization of the heart with the Octopus device (M3), and at the end of the distal coronary anastomosis (M4). RESULTS: There was a significant decrease (p < 0.05) in the cardiac index during the coronary anastomosis, detected by bolus thermodilution: the cardiac index went from 2.8 ± 0.7 to 2.3 ± 0.8 l.min-1.m-2 at the beginning of the anastomosis and 2.5 ± 0.8 l.min-1.m-2 at its end. This variation was not detected by the continuous measurement (from 3 ± 0.6 to 3.2 ± 0.5 and 3.1 ± 0.6 l.min-1.m-2 during the coronary anastomosis). CONCLUSIONS: Continuous cardiac output measurement using the thermal-filament pulmonary artery catheter showed a delay in detecting the acute hemodynamic changes related to repositioning of the heart during off-pump coronary artery bypass surgery.

  4. Memetic Algorithm and its Application to the Arrangement of Exam Timetable

    Directory of Open Access Journals (Sweden)

    Wenhua Huang

    2016-06-01

    Full Text Available This paper looks at a memetic algorithm for solving timetabling problems. We present a new memetic algorithm which consists of a global search algorithm and a local search algorithm. In the proposed method, a genetic algorithm is chosen as the global search algorithm while a simulated annealing algorithm is used as the local search algorithm. In particular, an optimal solution was obtained with a .NET implementation using real data from JiangXi Normal University. Experimental results show that the proposed algorithm can solve the university exam timetabling problem efficiently.

  5. PARAMETER ESTIMATION OF VALVE STICTION USING ANT COLONY OPTIMIZATION

    Directory of Open Access Journals (Sweden)

    S. Kalaivani

    2012-07-01

    Full Text Available In this paper, a procedure for quantifying valve stiction in control loops based on ant colony optimization is proposed. Pneumatic control valves are widely used in the process industry. A control valve contains nonlinearities such as stiction, backlash, and deadband that in turn cause oscillations in the process output. Stiction is a long-standing problem and the most severe one in control valves. Thus the measurement data from an oscillating control loop can be used as a diagnostic signal to provide an estimate of the stiction magnitude. Quantification of control valve stiction is still a challenging issue. Prior to stiction detection and quantification, it is necessary to choose a suitable model structure to describe control-valve stiction; to capture the stiction phenomenon, the Stenman model is used. Ant Colony Optimization (ACO), an intelligent swarm algorithm, has proved effective in various fields. The ACO algorithm is inspired by the natural trail-following behaviour of ants. The parameters of the Stenman model are estimated using ant colony optimization from the input-output data by minimizing the error between the actual stiction model output and the simulated stiction model output. Using ant colony optimization, the Stenman model, with known nonlinear structure and unknown parameters, can be estimated.
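
    The one-parameter Stenman model is simple enough to sketch: the valve position only moves when the controller output escapes a stiction band d. Below, the band is fitted to synthetic data by a plain grid scan purely for illustration; the ant colony search actually used in the paper is not reproduced, and the signals and parameter grid are assumptions.

    ```python
    import numpy as np

    def stenman_valve(u, d):
        """Valve position x moves only when the controller output u escapes the band d."""
        x = np.empty_like(u)
        x[0] = u[0]
        for t in range(1, len(u)):
            x[t] = u[t] if abs(u[t] - x[t - 1]) > d else x[t - 1]
        return x

    def fit_band(u, y_measured, candidates):
        """Pick the band value whose simulated valve output best matches the data."""
        errors = [np.mean((stenman_valve(u, d) - y_measured) ** 2) for d in candidates]
        return candidates[int(np.argmin(errors))]

    # synthetic loop data generated with a "true" band of 0.3
    u = np.sin(np.linspace(0, 20, 400))
    y = stenman_valve(u, 0.3)
    print(round(fit_band(u, y, np.linspace(0.0, 1.0, 101)), 2))   # recovers ~0.3
    ```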

  6. A robust embedded vision system feasible white balance algorithm

    Science.gov (United States)

    Wang, Yuan; Yu, Feihong

    2018-01-01

    White balance is a very important part of the color image processing pipeline. In order to meet the needs of efficiency and accuracy in embedded machine vision processing systems, an efficient and robust white balance algorithm combining several classical ones is proposed. The proposed algorithm has three main parts. Firstly, to guarantee higher efficiency, an initial parameter calculated from the statistics of the R, G and B components of the raw data is used to initialize the following iterative method. After that, a bilinear interpolation algorithm is used to implement the demosaicing procedure. Finally, an adaptive step adjustment scheme is introduced to ensure the controllability and robustness of the algorithm. In order to verify the proposed algorithm's performance on an embedded vision system, a smart camera based on the IMX6 DualLite, IMX291 and XC6130 was designed. Extensive experiments on a large number of images under different color temperatures and exposure conditions illustrate that the proposed white balance algorithm effectively avoids color deviation, achieves a good balance between efficiency and quality, and is suitable for embedded machine vision processing systems.
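
    As a hedged illustration of the kind of channel statistics such a pipeline starts from, the sketch below applies the classical gray-world correction; the paper's iterative, adaptive-step scheme and demosaicing stage are not reproduced, and the toy image is an assumption.

    ```python
    import numpy as np

    def gray_world(image):
        """image: float RGB array of shape (H, W, 3) with values in [0, 1]."""
        means = image.reshape(-1, 3).mean(axis=0)
        gains = means.mean() / means          # scale each channel to the common mean
        return np.clip(image * gains, 0.0, 1.0)

    # toy example: a blue-tinted flat image is pulled back toward neutral gray
    img = np.dstack([np.full((4, 4), 0.4), np.full((4, 4), 0.4), np.full((4, 4), 0.7)])
    print(gray_world(img)[0, 0])              # -> approximately [0.5, 0.5, 0.5]
    ```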

  7. A Location-Aware Vertical Handoff Algorithm for Hybrid Networks

    KAUST Repository

    Mehbodniya, Abolfazl

    2010-07-01

    One of the main objectives of wireless networking is to provide mobile users with a robust connection to different networks so that they can move freely between heterogeneous networks while running their computing applications with no interruption. Horizontal handoff, or generally speaking handoff, is a process which maintains a mobile user's active connection as it moves within a wireless network, whereas vertical handoff (VHO) refers to handover between different types of networks or different network layers. Optimizing the VHO process is an important issue, required to reduce network signalling and mobile device power consumption as well as to improve network quality of service (QoS) and grade of service (GoS). In this paper, a VHO algorithm for multitier (overlay) networks is proposed. This algorithm uses pattern recognition to estimate the user's position, and decides on the handoff based on this information. For the pattern recognition algorithm structure, the probabilistic neural network (PNN), which has considerable simplicity and efficiency over existing pattern classifiers, is used. Further optimization is proposed to improve the performance of the PNN algorithm. Performance analysis and comparisons with an existing VHO algorithm are provided and demonstrate a significant improvement with the proposed algorithm. Furthermore, incorporating the proposed algorithm, a structure is proposed for VHO from the medium access control (MAC) layer point of view. © 2010 ACADEMY PUBLISHER.

  8. Simultaneous and semi-alternating projection algorithms for solving split equality problems.

    Science.gov (United States)

    Dong, Qiao-Li; Jiang, Dan

    2018-01-01

    In this article, we first introduce two simultaneous projection algorithms for solving the split equality problem by using a new choice of the stepsize, and then propose two semi-alternating projection algorithms. The weak convergence of the proposed algorithms is analyzed under standard conditions. As applications, we extend the results to solve the split feasibility problem. Finally, a numerical example is presented to illustrate the efficiency and advantage of the proposed algorithms.

  9. Dynamic route guidance algorithm based on artificial immune system

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    To improve the performance of K-shortest-path search in intelligent traffic guidance systems, this paper proposes an optimal search algorithm based on intelligent optimization search theory and the memory mechanism of vertebrate immune systems. This algorithm, applied to an urban traffic network model established by the node-expanding method, can conveniently realize K-shortest-path search in urban traffic guidance systems. Because of the immune memory and global parallel search ability of artificial immune systems, the K shortest paths can be found without repetition, which clearly indicates the superiority of the algorithm over conventional ones. Not only does it offer better parallelism, the algorithm also prevents the premature convergence that often occurs in genetic algorithms. Thus, it is especially suitable for the real-time requirements of traffic guidance systems and other engineering optimization applications. A case study verifies the efficiency and practicability of the aforementioned algorithm.

  10. Summer Research Program (1992). Summer Faculty Research Program (SFRP) Reports. Volume 5B. Wright Laboratory

    Science.gov (United States)

    1992-12-01

    2] "Proposal for Space Integrated Control Exýeriment (SPICE)", Prepared by Lockheed Missiles & Space Company, Inc. (in col- laboration with...components are given as Xt(k) = AcosO + na(k) (3a) Xo(k) = AsinO + nQ(k). (3b) Since the amplitude, frequency and the phase of the received signal are...N( AcosO ,a 2 ) (12a) XQ(k) = N( AcosO ,a 2). (12b) In this case, the joint probability density function (PDF) is given by 12X,2e -J.w[((r - Acos0)2

  11. Controller Parameter Optimization for Nonlinear Systems Using Enhanced Bacteria Foraging Algorithm

    Directory of Open Access Journals (Sweden)

    V. Rajinikanth

    2012-01-01

    Full Text Available An enhanced bacteria foraging optimization (EBFO) algorithm-based proportional + integral + derivative (PID) controller tuning is proposed for a class of nonlinear process models. The EBFO algorithm is a modified form of the standard BFO algorithm. A multiobjective performance index is considered to guide the EBFO algorithm in discovering the best possible values of the controller parameters. The efficiency of the proposed scheme has been validated through a comparative study with classical BFO, adaptive BFO, PSO, and GA based controller tuning methods proposed in the literature. The proposed algorithm is tested in real time on a nonlinear spherical tank system. The real-time results show that the EBFO-tuned PID controller gives a smooth response for setpoint tracking performance.

  12. An effective one-dimensional anisotropic fingerprint enhancement algorithm

    Science.gov (United States)

    Ye, Zhendong; Xie, Mei

    2012-01-01

    Fingerprint identification is one of the most important biometric technologies. The performance of minutiae extraction and the speed of a fingerprint verification system rely heavily on the quality of the input fingerprint images, so the enhancement of low-quality fingerprints is a critical and difficult step in a fingerprint verification system. In this paper we propose an effective algorithm for fingerprint enhancement. Firstly, we use a normalization algorithm to reduce the variations in gray-level values along ridges and valleys. Then we utilize the structure tensor approach to estimate the orientation at each pixel of the fingerprint. Finally, we propose a novel algorithm which combines the advantages of the one-dimensional Gabor filtering method and the anisotropic method to enhance the fingerprint in the recoverable region. The proposed algorithm has been evaluated on the database of the Fingerprint Verification Competition 2004, and the results show that our algorithm performs well while requiring less time.

  13. Motion tolerant iterative reconstruction algorithm for cone-beam helical CT imaging

    Energy Technology Data Exchange (ETDEWEB)

    Takahashi, Hisashi; Goto, Taiga; Hirokawa, Koichi; Miyazaki, Osamu [Hitachi Medical Corporation, Chiba-ken (Japan). CT System Div.

    2011-07-01

    We have developed a new advanced iterative reconstruction algorithm for cone-beam helical CT. The features of this algorithm are: (a) it uses the separable paraboloidal surrogate (SPS) technique as a foundation for reconstruction to reduce noise and cone-beam artifact, (b) it uses a view weight in the back-projection process to reduce motion artifact. To confirm the improvement of our proposed algorithm over other existing algorithms, such as Feldkamp-Davis-Kress (FDK) or the SPS algorithm, we compared the motion artifact reduction, image noise reduction (standard deviation of CT number), and cone-beam artifact reduction on simulated and clinical data sets. Our results demonstrate that the proposed algorithm dramatically reduces motion artifacts compared with the SPS algorithm, and decreases image noise compared with the FDK algorithm. In addition, the proposed algorithm potentially improves the time resolution of iterative reconstruction. (orig.)

  14. 123. Urgent surgical intervention in a young male patient with primary cardiac angiosarcoma diagnosed as acute pulmonary thromboembolism

    Directory of Open Access Journals (Sweden)

    N. Miranda

    2012-04-01

    Conclusions: Cases of primary cardiac sarcoma published in the literature are infrequent, and very few have presented as acute pulmonary thromboembolism. This case illustrates the enormous difficulty these tumours pose for an early diagnosis and treatment that would offer even minimal medium-term life expectancy.

  15. Model-Free Adaptive Control Algorithm with Data Dropout Compensation

    Directory of Open Access Journals (Sweden)

    Xuhui Bu

    2012-01-01

    Full Text Available The convergence of the model-free adaptive control (MFAC) algorithm can be guaranteed when the system is subject to measurement data dropout, but the convergence of the system output becomes slower as the dropout rate increases. This paper proposes an MFAC algorithm with data-dropout compensation. The missing data are first estimated using the dynamical linearization method, and then the estimated values are introduced to update the control input. The convergence analysis of the proposed MFAC algorithm is given, and its effectiveness is also validated by simulations. It is shown that the proposed algorithm can compensate for the effect of data dropout, and better output performance can be obtained.

  16. An elitist teaching-learning-based optimization algorithm for solving complex constrained optimization problems

    Directory of Open Access Journals (Sweden)

    Vivek Patel

    2012-08-01

    Full Text Available Nature-inspired population-based algorithms form a research field which simulates different natural phenomena to solve a wide range of problems. Researchers have proposed several algorithms considering different natural phenomena. Teaching-learning-based optimization (TLBO) is one of the recently proposed population-based algorithms; it simulates the teaching-learning process of the classroom. This algorithm does not require any algorithm-specific control parameters. In this paper, the elitism concept is introduced into the TLBO algorithm and its effect on the performance of the algorithm is investigated. The effects of common controlling parameters such as the population size and the number of generations on the performance of the algorithm are also investigated. The proposed algorithm is tested on 35 constrained benchmark functions with different characteristics and its performance is compared with that of other well-known optimization algorithms. The proposed algorithm can be applied to various optimization problems of the industrial environment.
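
    The following is a minimal, hedged sketch of the two TLBO phases (teacher and learner) with a simple elitism step, applied to a toy sphere function. It is not the authors' implementation; the function and parameter names, the parameter values, and the elitism rule used here (re-injecting the stored best solutions over the current worst) are illustrative assumptions.

```python
import numpy as np

def elitist_tlbo(objective, lower, upper, pop_size=20, generations=100, n_elites=2, seed=0):
    """Minimal teaching-learning-based optimization with simple elitism (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    dim = lower.size
    pop = rng.uniform(lower, upper, size=(pop_size, dim))
    fit = np.apply_along_axis(objective, 1, pop)

    for _ in range(generations):
        elites = pop[np.argsort(fit)[:n_elites]].copy()        # remember the current best solutions

        # Teacher phase: move every learner toward the best solution
        teacher = pop[np.argmin(fit)]
        mean = pop.mean(axis=0)
        tf = rng.integers(1, 3, size=(pop_size, 1))            # teaching factor in {1, 2}
        trial = np.clip(pop + rng.random((pop_size, dim)) * (teacher - tf * mean), lower, upper)
        trial_fit = np.apply_along_axis(objective, 1, trial)
        better = trial_fit < fit
        pop[better], fit[better] = trial[better], trial_fit[better]

        # Learner phase: learn from a randomly chosen peer
        peers = rng.permutation(pop_size)
        step = np.where((fit < fit[peers])[:, None], pop - pop[peers], pop[peers] - pop)
        trial = np.clip(pop + rng.random((pop_size, dim)) * step, lower, upper)
        trial_fit = np.apply_along_axis(objective, 1, trial)
        better = trial_fit < fit
        pop[better], fit[better] = trial[better], trial_fit[better]

        # Elitism: the stored elites replace the current worst solutions
        worst = np.argsort(fit)[-n_elites:]
        pop[worst] = elites
        fit[worst] = np.apply_along_axis(objective, 1, elites)

    best = np.argmin(fit)
    return pop[best], fit[best]

if __name__ == "__main__":
    sphere = lambda x: float(np.sum(x ** 2))
    x_best, f_best = elitist_tlbo(sphere, np.full(5, -10.0), np.full(5, 10.0))
    print(x_best, f_best)
```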

  17. A new simple iterative reconstruction algorithm for SPECT transmission measurement

    International Nuclear Information System (INIS)

    Hwang, D.S.; Zeng, G.L.

    2005-01-01

    This paper proposes a new iterative reconstruction algorithm for transmission tomography and compares this algorithm with several other methods. The new algorithm is simple and resembles the emission ML-EM algorithm in form. Due to its simplicity, it is easy to implement and fast to compute a new update at each iteration. The algorithm also always guarantees non-negative solutions. Evaluations are performed using simulation studies and real phantom data. Comparisons with other algorithms, such as the convex, gradient, and logMLEM algorithms, show that the proposed algorithm is as good as others and performs better in some cases.

  18. Seismic noise attenuation using an online subspace tracking algorithm

    Science.gov (United States)

    Zhou, Yatong; Li, Shuhua; Zhang, Dong; Chen, Yangkang

    2018-02-01

    We propose a new low-rank based noise attenuation method using an efficient algorithm for tracking subspaces from highly corrupted seismic observations. The subspace tracking algorithm requires only basic linear algebraic manipulations and is derived by analysing incremental gradient descent on the Grassmannian manifold of subspaces. When the multidimensional seismic data are mapped to a low-rank space, the subspace tracking algorithm can be directly applied to the input low-rank matrix to estimate the useful signals. Since the subspace tracking algorithm is an online algorithm, it is more robust to random noise than the traditional truncated singular value decomposition (TSVD) based subspace tracking algorithm. Compared with state-of-the-art algorithms, the proposed denoising method obtains better performance. More specifically, the proposed method outperforms the TSVD-based singular spectrum analysis method by leaving less residual noise while saving half of the computational cost. Several synthetic and field data examples with different levels of complexity demonstrate the effectiveness and robustness of the presented algorithm in rejecting different types of noise, including random noise, spiky noise, blending noise, and coherent noise.
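
    As context for the comparison above, the sketch below shows the generic truncated-SVD low-rank approximation that such rank-based denoisers build on. It is a simplified, assumed illustration on a toy matrix, not the authors' online subspace tracking algorithm; the rank-1 "signal" and noise level are invented.

```python
import numpy as np

def tsvd_denoise(data, rank):
    """Truncated-SVD low-rank approximation of a noisy data matrix (baseline sketch)."""
    u, s, vt = np.linalg.svd(data, full_matrices=False)
    return (u[:, :rank] * s[:rank]) @ vt[:rank, :]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    t = np.linspace(0, 1, 200)
    clean = np.outer(np.sin(2 * np.pi * 5 * t), np.hanning(64))   # rank-1 "signal"
    noisy = clean + 0.2 * rng.standard_normal(clean.shape)
    denoised = tsvd_denoise(noisy, rank=1)
    print("error before:", float(np.linalg.norm(noisy - clean)))
    print("error after: ", float(np.linalg.norm(denoised - clean)))
```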

  19. Algorithm for Public Electric Transport Schedule Control for Intelligent Embedded Devices

    Science.gov (United States)

    Alps, Ivars; Potapov, Andrey; Gorobetz, Mikhail; Levchenkov, Anatoly

    2010-01-01

    In this paper the authors present a heuristic algorithm for precise schedule fulfilment in city traffic conditions, taking into account traffic lights. The algorithm is proposed for a programmable controller. The PLC is proposed to be installed in an electric vehicle to control its motion speed and the signals of traffic lights. The algorithm is tested using a real controller connected to virtual devices and functional models of real tram devices. Results of the experiments show high precision of public transport schedule fulfilment using the proposed algorithm.

  20. Effect of the incorporation of dietary fibres on the quality of breads for celiac patients

    OpenAIRE

    Díaz Malmierca, Álvaro

    2013-01-01

    Research on the production of gluten-free breads is well advanced, and the micronutrient deficit of these products is one of the problems on which research is now focusing. One of the most deficient nutrients, and the one that causes the most health problems for celiac patients, is dietary fibre. This study investigated the effect of adding 10% of different fibres on the characteristics of the breads, the influence ...

  1. High-order hydrodynamic algorithms for exascale computing

    Energy Technology Data Exchange (ETDEWEB)

    Morgan, Nathaniel Ray [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-02-05

    Hydrodynamic algorithms are at the core of many laboratory missions ranging from simulating ICF implosions to climate modeling. The hydrodynamic algorithms commonly employed at the laboratory and in industry (1) typically lack the requisite accuracy for complex multi-material vortical flows and (2) are not well suited for exascale computing due to poor data locality and poor FLOP/memory ratios. Exascale computing requires advances in both computer science and numerical algorithms. We propose to research the second requirement and create a new high-order hydrodynamic algorithm that has superior accuracy, excellent data locality, and excellent FLOP/memory ratios. This proposal will impact a broad range of research areas including numerical theory, discrete mathematics, vorticity evolution, gas dynamics, interface instability evolution, turbulent flows, fluid dynamics and shock driven flows. If successful, the proposed research has the potential to radically transform simulation capabilities and help position the laboratory for computing at the exascale.

  2. Approximated affine projection algorithm for feedback cancellation in hearing aids.

    Science.gov (United States)

    Lee, Sangmin; Kim, In-Young; Park, Young-Cheol

    2007-09-01

    We propose an approximated affine projection (AP) algorithm for feedback cancellation in hearing aids. It is based on the conventional approach using the Gauss-Seidel (GS) iteration, but provides more stable convergence behaviour even with small step sizes. In the proposed algorithm, a residue of the weighted error vector, instead of the current error sample, is used to provide stable convergence. A new learning rate control scheme is also applied to the proposed algorithm to prevent signal cancellation and system instability. The new scheme determines step size in proportion to the prediction factor of the input, so that adaptation is inhibited whenever tone-like signals are present in the input. Simulation results verified the efficiency of the proposed algorithm.

  3. GSM Channel Equalization Algorithm - Modern DSP Coprocessor Approach

    Directory of Open Access Journals (Sweden)

    M. Drutarovsky

    1999-12-01

    Full Text Available The paper presents the basic equations of an efficient GSM Viterbi equalizer algorithm based on an approximation of GMSK modulation by a linear superposition of amplitude-modulated pulses. This approximation allows the use of the Ungerboeck form of channel equalizer with significantly reduced arithmetic complexity. The proposed algorithm can be effectively implemented on the Viterbi and Filter coprocessors of the new Motorola DSP56305 digital signal processor. A short overview of the coprocessor features related to the proposed algorithm is included.

  4. Computationally Efficient DOA Tracking Algorithm in Monostatic MIMO Radar with Automatic Association

    Directory of Open Access Journals (Sweden)

    Huaxin Yu

    2014-01-01

    Full Text Available We consider the problem of tracking the directions of arrival (DOA) of multiple moving targets in monostatic multiple-input multiple-output (MIMO) radar, and a low-complexity DOA tracking algorithm in monostatic MIMO radar is proposed. The proposed algorithm obtains the DOA estimate via the difference between the previous and current covariance matrices of the reduced-dimension transformation signal; it reduces the computational complexity and realizes automatic association in DOA tracking. An error analysis and the Cramér-Rao lower bound (CRLB) of DOA tracking are derived in the paper. The proposed algorithm can be regarded as an extension of the array-signal-processing DOA tracking algorithm in Zhang et al. (2008), and it is also an improved version of that algorithm with better DOA tracking performance. The simulation results demonstrate the effectiveness of the proposed algorithm. Our work provides technical support for the practical application of MIMO radar.

  5. PAPR Reduction in OFDM-based Visible Light Communication Systems Using a Combination of Novel Peak-value Feedback Algorithm and Genetic Algorithm

    Science.gov (United States)

    Deng, Honggui; Liu, Yan; Ren, Shuang; He, Hailang; Tang, Chengying

    2017-10-01

    We propose an enhanced partial transmit sequence technique based on a novel peak-value feedback algorithm and a genetic algorithm (GAPFA-PTS) to reduce the peak-to-average power ratio (PAPR) of orthogonal frequency division multiplexing (OFDM) signals in visible light communication (VLC) systems (VLC-OFDM). To demonstrate the advantages of our proposed algorithm, we analyze the flow of the proposed technique and compare its performance with other techniques through MATLAB simulation. The results show that the GAPFA-PTS technique achieves a significant improvement in PAPR reduction while maintaining a low bit error rate (BER) and low complexity in VLC-OFDM systems.

  6. Improved ant Colony Optimization for Virtual Teams Building in Collaborative Process Planning

    Directory of Open Access Journals (Sweden)

    Yingying Su

    2014-02-01

    Full Text Available Virtual teams have been adopted by organizations to gain competitive advantages in this global economy; they are a ubiquitous part of getting work done in almost every organization. For the purpose of building virtual teams in collaborative process planning, a method based on an improved ant colony algorithm (IMACO) was proposed. The concept of the virtual team was illustrated and the necessity of building virtual teams in collaborative process planning was analyzed. The subtasks with certain timing relationships were described and the model of building virtual teams in collaborative process planning was established, which was solved by the improved ant colony algorithm. In this paper applications of IMACO and ACO are compared, and the comparison demonstrates that the IMACO algorithm performs better. An example was studied to illustrate the effectiveness of the strategy.

  7. Two-wavelength Lidar inversion algorithm for determining planetary boundary layer height

    Science.gov (United States)

    Liu, Boming; Ma, Yingying; Gong, Wei; Jian, Yang; Ming, Zhang

    2018-02-01

    This study proposes a two-wavelength Lidar inversion algorithm to determine the boundary layer height (BLH) based on particle clustering. The color ratio and depolarization ratio are used to analyze the particle distribution, based on which the proposed algorithm can overcome the effects of complex aerosol layers when calculating the BLH. The algorithm is used to determine the top of the boundary layer under different mixing states. Experimental results demonstrate that the proposed algorithm can determine the top of the boundary layer even in complex cases, and it deals better with weak convection conditions. Finally, experimental data from June 2015 to December 2015 were used to verify the reliability of the proposed algorithm. The correlation between the results of the proposed algorithm and the manual method is R2 = 0.89 with an RMSE of 131 m and a mean bias of 49 m; the correlation between the results of the ideal profile fitting method and the manual method is R2 = 0.64 with an RMSE of 270 m and a mean bias of 165 m; and the correlation between the results of the wavelet covariance transform method and the manual method is R2 = 0.76, with an RMSE of 196 m and a mean bias of 23 m. These findings indicate that the proposed algorithm has better reliability and stability than traditional algorithms.

  8. 77 FR 5021 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Science.gov (United States)

    2012-02-01

    ... defined as a medical group, Accountable Care Organization (ACO), state organization or some other grouping... information upon the respondents, including the use of automated collection techniques or other forms of...

  9. 76 FR 72931 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Science.gov (United States)

    2011-11-28

    ..., Accountable Care Organization (ACO), state organization or some other grouping of practices. A practice is an..., including the use of automated collection techniques or other forms of information technology. Comments...

  10. Aircraft technology portfolio optimization using ant colony optimization

    Science.gov (United States)

    Villeneuve, Frederic J.; Mavris, Dimitri N.

    2012-11-01

    Technology portfolio selection is a combinatorial optimization problem often faced with a large number of combinations and technology incompatibilities. The main research question addressed in this article is to determine if Ant Colony Optimization (ACO) is better suited than Genetic Algorithms (GAs) and Simulated Annealing (SA) for technology portfolio optimization when incompatibility constraints between technologies are present. Convergence rate, capability to find optima, and efficiency in handling of incompatibilities are the three criteria of comparison. The application problem consists of finding the best technology portfolio from 29 aircraft technologies. The results show that ACO and GAs converge faster and find optima more easily than SA, and that ACO can optimize portfolios with technology incompatibilities without using penalty functions. This latter finding paves the way for more use of ACO when the number of constraints increases, such as in the technology and concept selection for complex engineering systems.
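
    To make the mechanism concrete, the following is a small, assumed sketch of how an ACO-style constructive search can handle pairwise incompatibilities by filtering the candidate list during solution construction instead of using a penalty function. The technology values, costs, budget, and incompatibility pairs are invented for illustration and do not come from the study above.

```python
import numpy as np

def aco_portfolio(values, costs, budget, incompatible, n_ants=20,
                  iterations=100, rho=0.1, alpha=1.0, beta=2.0, seed=0):
    """Toy ACO for selecting a technology portfolio under a budget and
    pairwise incompatibility constraints (handled by filtering candidates)."""
    rng = np.random.default_rng(seed)
    n = len(values)
    pheromone = np.ones(n)
    heuristic = values / costs                        # greedy desirability of each technology
    best_set, best_value = set(), -np.inf

    for _ in range(iterations):
        iter_best_set, iter_best_value = set(), -np.inf
        for _ in range(n_ants):
            chosen, spent = set(), 0.0
            candidates = set(range(n))
            while candidates:
                idx = np.array(sorted(candidates))
                w = (pheromone[idx] ** alpha) * (heuristic[idx] ** beta)
                pick = rng.choice(idx, p=w / w.sum())
                candidates.discard(pick)
                if spent + costs[pick] > budget:
                    continue
                chosen.add(pick)
                spent += costs[pick]
                # drop technologies incompatible with the one just chosen
                candidates -= incompatible.get(int(pick), set())
            value = sum(values[i] for i in chosen)
            if value > iter_best_value:
                iter_best_set, iter_best_value = chosen, value
        if iter_best_value > best_value:
            best_set, best_value = iter_best_set, iter_best_value
        # evaporation plus reinforcement of the iteration-best portfolio
        pheromone *= (1.0 - rho)
        for i in iter_best_set:
            pheromone[i] += rho * iter_best_value

    return sorted(int(i) for i in best_set), best_value

if __name__ == "__main__":
    values = np.array([8.0, 6.0, 5.0, 4.0, 3.0])
    costs = np.array([5.0, 4.0, 3.0, 2.0, 1.0])
    incompatible = {0: {1}, 1: {0}}                   # technologies 0 and 1 cannot coexist
    print(aco_portfolio(values, costs, budget=9.0, incompatible=incompatible))
```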

  11. Two New PRP Conjugate Gradient Algorithms for Minimization Optimization Models.

    Science.gov (United States)

    Yuan, Gonglin; Duan, Xiabin; Liu, Wenjie; Wang, Xiaoliang; Cui, Zengru; Sheng, Zhou

    2015-01-01

    Two new PRP conjugate gradient algorithms are proposed in this paper based on two modified PRP conjugate gradient methods: the first algorithm is proposed for solving unconstrained optimization problems, and the second algorithm is proposed for solving nonlinear equations. The first method contains two aspects of information: the function value and the gradient value. The two methods both possess some good properties, as follows: (1) βk ≥ 0; (2) the search direction has the trust region property without the use of any line search method; (3) the search direction has the sufficient descent property without the use of any line search method. Under some suitable conditions, we establish the global convergence of the two algorithms. We conduct numerical experiments to evaluate our algorithms. The numerical results indicate that the first algorithm is effective and competitive for solving unconstrained optimization problems and that the second algorithm is effective for solving large-scale nonlinear equations.
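
    For orientation, the sketch below implements the classical PRP+ update (βk = max(βk, 0), the non-negativity property mentioned above) with a plain Armijo backtracking line search on the Rosenbrock function. It is an assumed textbook illustration of the PRP rule, not the two modified methods proposed in the paper, which avoid line search for their descent property; all names and tolerances are illustrative.

```python
import numpy as np

def prp_plus_cg(f, grad, x0, max_iter=2000, tol=1e-6):
    """Textbook PRP+ nonlinear conjugate gradient with Armijo backtracking (sketch)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g @ d >= 0:                       # safeguard: restart with steepest descent
            d = -g
        # Armijo backtracking line search along the descent direction d
        t, c, shrink = 1.0, 1e-4, 0.5
        fx, slope = f(x), float(g @ d)
        while f(x + t * d) > fx + c * t * slope and t > 1e-12:
            t *= shrink
        x_new = x + t * d
        g_new = grad(x_new)
        beta = max(float(g_new @ (g_new - g)) / float(g @ g), 0.0)   # PRP+ rule: beta_k >= 0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    rosenbrock = lambda x: 100 * (x[1] - x[0] ** 2) ** 2 + (1 - x[0]) ** 2
    rosen_grad = lambda x: np.array([
        -400 * x[0] * (x[1] - x[0] ** 2) - 2 * (1 - x[0]),
        200 * (x[1] - x[0] ** 2),
    ])
    print(prp_plus_cg(rosenbrock, rosen_grad, [-1.2, 1.0]))   # should approach (1, 1)
```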

  12. A high accuracy algorithm of displacement measurement for a micro-positioning stage

    Directory of Open Access Journals (Sweden)

    Xiang Zhang

    2017-05-01

    Full Text Available A high-accuracy displacement measurement algorithm for a two-degrees-of-freedom compliant precision micro-positioning stage is proposed based on the computer micro-vision technique. The algorithm consists of an integer-pixel and a subpixel matching procedure. A series of simulations is conducted to verify the proposed method. The results show that the proposed algorithm possesses the advantages of high precision and stability, and its resolution can theoretically attain 0.01 pixel. In addition, the computation time is reduced by a factor of about 6.7 compared with the classical normalized cross-correlation algorithm. To validate the practical performance of the proposed algorithm, a laser interferometer measurement system (LIMS) is built. The experimental results demonstrate that the algorithm has better adaptability than the LIMS.
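
    The sketch below illustrates the general idea of such two-stage matching: a brute-force integer-pixel search with normalized cross-correlation followed by a simple 1-D parabolic sub-pixel refinement in each axis. It is an assumed, generic illustration (essentially the classical NCC baseline mentioned above, not the authors' faster algorithm), and the function names and toy test image are invented.

```python
import numpy as np

def ncc_match(image, template):
    """Integer-pixel template matching by normalised cross-correlation,
    followed by parabolic sub-pixel refinement of the peak location (sketch)."""
    th, tw = template.shape
    ih, iw = image.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    scores = np.full((ih - th + 1, iw - tw + 1), -1.0)
    for r in range(scores.shape[0]):
        for c in range(scores.shape[1]):
            patch = image[r:r + th, c:c + tw]
            patch = patch - patch.mean()
            denom = np.sqrt((patch ** 2).sum()) * t_norm
            if denom > 0:
                scores[r, c] = (patch * t).sum() / denom
    r0, c0 = np.unravel_index(np.argmax(scores), scores.shape)

    def parabolic(f_m, f_0, f_p):
        denom = f_m - 2 * f_0 + f_p
        return 0.0 if denom == 0 else 0.5 * (f_m - f_p) / denom

    dr = dc = 0.0
    if 0 < r0 < scores.shape[0] - 1:
        dr = parabolic(scores[r0 - 1, c0], scores[r0, c0], scores[r0 + 1, c0])
    if 0 < c0 < scores.shape[1] - 1:
        dc = parabolic(scores[r0, c0 - 1], scores[r0, c0], scores[r0, c0 + 1])
    return r0 + dr, c0 + dc

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    img = rng.random((60, 60))
    tmpl = img[20:30, 35:45].copy()
    print(ncc_match(img, tmpl))   # expected to be close to (20, 35)
```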

  13. Quasi-human seniority-order algorithm for unequal circles packing

    International Nuclear Information System (INIS)

    Zhu, Dingju

    2016-01-01

    In existing methods for solving unequal circle packing problems, the initial configuration is given arbitrarily or randomly, yet the choice of initial configuration has a very large impact on the speed with which an existing packing algorithm solves the problem. The quasi-human seniority-order algorithm proposed in this paper can generate a better initial configuration for existing packing algorithms and thereby accelerate them. In the experiments, the quasi-human seniority-order algorithm is applied to generate better initial configurations for quasi-physical elasticity methods to solve unequal circle packing problems, and the experimental results show that the proposed quasi-human seniority-order algorithm can greatly improve the speed of solving the problem.

  14. Variation In Accountable Care Organization Spending And Sensitivity To Risk Adjustment: Implications For Benchmarking.

    Science.gov (United States)

    Rose, Sherri; Zaslavsky, Alan M; McWilliams, J Michael

    2016-03-01

    Spending targets (or benchmarks) for accountable care organizations (ACOs) participating in the Medicare Shared Savings Program must be set carefully to encourage program participation while achieving fiscal goals and minimizing unintended consequences, such as penalizing ACOs for serving sicker patients. Recently proposed regulatory changes include measures to make benchmarks more similar for ACOs in the same area with different historical spending levels. We found that ACOs vary widely in how their spending levels compare with those of other local providers after standard case-mix adjustments. Additionally adjusting for survey measures of patient health meaningfully reduced the variation in differences between ACO spending and local average fee-for-service spending, but substantial variation remained, which suggests that differences in care efficiency between ACOs and local non-ACO providers vary widely. Accordingly, measures to equilibrate benchmarks between high- and low-spending ACOs--such as setting benchmarks to risk-adjusted average fee-for-service spending in an area--should be implemented gradually to maintain participation by ACOs with high spending. Use of survey information also could help mitigate perverse incentives for risk selection and upcoding and limit unintended consequences of new benchmarking methodologies for ACOs serving sicker patients. Project HOPE—The People-to-People Health Foundation, Inc.

  15. A continuation multilevel Monte Carlo algorithm

    KAUST Repository

    Collier, Nathan; Haji Ali, Abdul Lateef; Nobile, Fabio; von Schwerin, Erik; Tempone, Raul

    2014-01-01

    We propose a novel Continuation Multi Level Monte Carlo (CMLMC) algorithm for weak approximation of stochastic models. The CMLMC algorithm solves the given approximation problem for a sequence of decreasing tolerances, ending when the required error

  16. Unsupervised learning algorithms

    CERN Document Server

    Aydin, Kemal

    2016-01-01

    This book summarizes the state-of-the-art in unsupervised learning. The contributors discuss how, with the proliferation of massive amounts of unlabeled data, unsupervised learning algorithms, which can automatically discover interesting and useful patterns in such data, have gained popularity among researchers and practitioners. The authors outline how these algorithms have found numerous applications including pattern recognition, market basket analysis, web mining, social network analysis, information retrieval, recommender systems, market research, intrusion detection, and fraud detection. They present how the difficulty of developing theoretically sound approaches that are amenable to objective evaluation has resulted in the proposal of numerous unsupervised learning algorithms over the past half-century. The intended audience includes researchers and practitioners who are increasingly using unsupervised learning algorithms to analyze their data. Topics of interest include anomaly detection, clustering,...

  17. A Dynamic Fuzzy Cluster Algorithm for Time Series

    Directory of Open Access Journals (Sweden)

    Min Ji

    2013-01-01

    This paper proposes a dynamic fuzzy cluster algorithm for clustering time series by introducing the definition of the key point and improving the FCM algorithm. The proposed algorithm works by determining those time series whose class labels are vague and further partitioning them into different clusters over time. The main advantage of this approach compared with other existing algorithms is that the property of some time series belonging to different clusters over time can be partially revealed. Results from simulation-based experiments on geographical data demonstrate the excellent performance of the method, and the desired results have been obtained. The proposed algorithm can be applied to solve other clustering problems in data mining.
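
    Since the method above builds on fuzzy c-means, the following is a standard FCM sketch for reference. It is an assumed textbook implementation, not the authors' dynamic variant; the toy two-blob data and parameter values are illustrative.

```python
import numpy as np

def fuzzy_c_means(data, n_clusters, m=2.0, max_iter=100, tol=1e-5, seed=0):
    """Standard fuzzy c-means; returns cluster centres and the membership matrix."""
    rng = np.random.default_rng(seed)
    n = data.shape[0]
    u = rng.random((n, n_clusters))
    u /= u.sum(axis=1, keepdims=True)              # memberships sum to 1 for each point
    for _ in range(max_iter):
        um = u ** m
        centres = (um.T @ data) / um.sum(axis=0)[:, None]
        dist = np.linalg.norm(data[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        new_u = 1.0 / (dist ** (2.0 / (m - 1.0)))  # standard FCM membership update
        new_u /= new_u.sum(axis=1, keepdims=True)
        if np.abs(new_u - u).max() < tol:
            u = new_u
            break
        u = new_u
    return centres, u

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    pts = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
    centres, u = fuzzy_c_means(pts, n_clusters=2)
    print(centres)
```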

  18. Analog Circuit Design Optimization Based on Evolutionary Algorithms

    Directory of Open Access Journals (Sweden)

    Mansour Barari

    2014-01-01

    Full Text Available This paper investigates an evolutionary-based design system for automated sizing of analog integrated circuits (ICs). Two evolutionary algorithms, a genetic algorithm and a particle swarm optimization (PSO) algorithm, are proposed to design analog ICs with practical user-defined specifications. On the basis of the combination of HSPICE and MATLAB, the system links circuit performances, evaluated through electrical simulation, to the optimization system in the MATLAB environment for the selected topology. The system has been tested on typical and hard-to-design cases, such as complex analog blocks with stringent design requirements. The results show that the design specifications are closely met. Comparisons with available methods such as genetic algorithms show that the proposed algorithm offers important advantages in terms of optimization quality and robustness. Moreover, the algorithm is shown to be efficient.

  19. Research and implementation of finger-vein recognition algorithm

    Science.gov (United States)

    Pang, Zengyao; Yang, Jie; Chen, Yilei; Liu, Yin

    2017-06-01

    In finger vein image preprocessing, finger angle correction and ROI extraction are important parts of the system. In this paper, we propose an angle correction algorithm based on the centroid of the vein image, and extract the ROI region according to the bidirectional gray projection method. Inspired by the fact that features in vein areas have an appearance similar to valleys, a novel method is proposed to extract the center and width of the palm vein based on multi-directional gradients, which is easy to compute, quick and stable. On this basis, an encoding method is designed to determine the gray value distribution of the texture image. This algorithm can effectively overcome texture extraction errors at the edges. Finally, the system achieves higher robustness and recognition accuracy by utilizing fuzzy threshold determination and a global gray value matching algorithm. Experimental results on pairs of matched palm images show that the proposed method has an EER of 3.21% and extracts features at a speed of 27 ms per image. It can be concluded that the proposed algorithm has obvious advantages in texture extraction efficiency, matching accuracy and algorithm efficiency.

  20. An Innovative Thinking-Based Intelligent Information Fusion Algorithm

    Directory of Open Access Journals (Sweden)

    Huimin Lu

    2013-01-01

    Full Text Available This study proposes an intelligent algorithm that can realize information fusion with reference to the relevant research achievements in brain cognitive theory and innovative computation. This algorithm treats knowledge as its core and information fusion as a knowledge-based innovative thinking process. Furthermore, the five key parts of this algorithm, including information sense and perception, memory storage, divergent thinking, convergent thinking, and the evaluation system, are simulated and modeled. This algorithm fully develops the innovative thinking skills of knowledge in information fusion and is an attempt to convert the abstract conceptions of brain cognitive science into specific and operable research routes and strategies. Furthermore, the influence of each parameter of this algorithm on algorithm performance is analyzed and compared with those of classical intelligent algorithms through tests. The test results suggest that the algorithm proposed in this study can obtain the optimum problem solution with fewer target evaluations, improve optimization effectiveness, and achieve effective fusion of information.

  1. Multimodal Estimation of Distribution Algorithms.

    Science.gov (United States)

    Yang, Qiang; Chen, Wei-Neng; Li, Yun; Chen, C L Philip; Xu, Xiang-Min; Zhang, Jun

    2016-02-15

    Taking advantage of the ability of estimation of distribution algorithms (EDAs) to preserve high diversity, this paper proposes a multimodal EDA. Integrated with clustering strategies for crowding and speciation, two versions of this algorithm are developed, which operate at the niche level. These two algorithms are then equipped with three distinctive techniques: 1) a dynamic cluster sizing strategy; 2) an alternative utilization of Gaussian and Cauchy distributions to generate offspring; and 3) an adaptive local search. The dynamic cluster sizing affords a potential balance between exploration and exploitation and reduces the sensitivity to the cluster size in the niching methods. Taking advantage of the Gaussian and Cauchy distributions, we generate the offspring at the niche level by alternately using these two distributions, which can also potentially offer a balance between exploration and exploitation. Further, solution accuracy is enhanced through a new local search scheme probabilistically conducted around seeds of niches, with probabilities determined self-adaptively according to the fitness values of these seeds. Extensive experiments conducted on 20 benchmark multimodal problems confirm that both algorithms can achieve competitive performance compared with several state-of-the-art multimodal algorithms, which is supported by nonparametric tests. In particular, the proposed algorithms are very promising for complex problems with many local optima.

  2. A hybrid Jaya algorithm for reliability-redundancy allocation problems

    Science.gov (United States)

    Ghavidel, Sahand; Azizivahed, Ali; Li, Li

    2018-04-01

    This article proposes an efficient improved hybrid Jaya algorithm based on time-varying acceleration coefficients (TVACs) and the learning phase introduced in teaching-learning-based optimization (TLBO), named the LJaya-TVAC algorithm, for solving various types of nonlinear mixed-integer reliability-redundancy allocation problems (RRAPs) and standard real-parameter test functions. RRAPs include series, series-parallel, complex (bridge) and overspeed protection systems. The search power of the proposed LJaya-TVAC algorithm for finding the optimal solutions is first tested on the standard real-parameter unimodal and multi-modal functions with dimensions of 30-100, and then tested on various types of nonlinear mixed-integer RRAPs. The results are compared with the original Jaya algorithm and the best results reported in the recent literature. The optimal results obtained with the proposed LJaya-TVAC algorithm provide evidence for its better and acceptable optimization performance compared to the original Jaya algorithm and other reported optimal results.

  3. A High-Order CFS Algorithm for Clustering Big Data

    Directory of Open Access Journals (Sweden)

    Fanyu Bu

    2016-01-01

    Full Text Available With the development of the Internet of Everything, such as the Internet of Things, the Internet of People, and the Industrial Internet, big data is being generated. Clustering is a widely used technique for big data analytics and mining. However, most current algorithms are not effective at clustering the heterogeneous data that is prevalent in big data. In this paper, we propose a high-order CFS algorithm (HOCFS) to cluster heterogeneous data by combining the CFS clustering algorithm and the dropout deep learning model, whose functionality rests on three pillars: (i) an adaptive dropout deep learning model to learn features from each type of data, (ii) a feature tensor model to capture the correlations of heterogeneous data, and (iii) a tensor distance-based high-order CFS algorithm to cluster heterogeneous data. Furthermore, we verify our proposed algorithm on different datasets by comparison with two other clustering schemes, HOPCM and CFS. The results confirm the effectiveness of the proposed algorithm in clustering heterogeneous data.

  4. Opposition-Based Adaptive Fireworks Algorithm

    Directory of Open Access Journals (Sweden)

    Chibing Gong

    2016-07-01

    Full Text Available A fireworks algorithm (FWA) is a recent swarm intelligence algorithm that is inspired by observing fireworks explosions. The adaptive fireworks algorithm (AFWA) adds adaptive amplitudes to improve the performance of the enhanced fireworks algorithm (EFWA). The purpose of this paper is to add opposition-based learning (OBL) to AFWA with the goal of further boosting performance and achieving global optimization. Twelve benchmark functions are tested using the opposition-based adaptive fireworks algorithm (OAFWA). The final results conclude that OAFWA significantly outperformed EFWA and AFWA in terms of solution accuracy. Additionally, OAFWA was compared with a bat algorithm (BA), differential evolution (DE), self-adapting control parameters in differential evolution (jDE), a firefly algorithm (FA), and a standard particle swarm optimization 2011 (SPSO2011) algorithm. The research results indicate that OAFWA ranks the highest of the six algorithms in both solution accuracy and runtime cost.
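
    Opposition-based learning evaluates, for each candidate x in [a, b], the opposite point a + b − x and keeps the fitter of the two. The snippet below is a small, assumed illustration of OBL used for population initialisation on a toy sphere function; it is not the OAFWA algorithm itself, and all names and parameter values are invented.

```python
import numpy as np

def opposition_init(objective, lower, upper, pop_size=10, seed=0):
    """Opposition-based initialisation: keep the better of each point and its opposite."""
    rng = np.random.default_rng(seed)
    dim = lower.size
    pop = rng.uniform(lower, upper, size=(pop_size, dim))
    opposite = lower + upper - pop                    # opposite point: a + b - x
    both = np.vstack([pop, opposite])
    fitness = np.apply_along_axis(objective, 1, both)
    best = np.argsort(fitness)[:pop_size]             # keep the pop_size best of 2N points
    return both[best], fitness[best]

if __name__ == "__main__":
    sphere = lambda x: float(np.sum(x ** 2))
    pop, fit = opposition_init(sphere, np.full(3, -5.0), np.full(3, 5.0))
    print(fit)
```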

  5. Parallel data encryption with RSA algorithm

    OpenAIRE

    Неретин, А. А.

    2016-01-01

    In this paper a parallel RSA algorithm with preliminary shuffling of the source text is presented. The dependence of the encryption speed on the number of encryption nodes has been analysed. The proposed algorithm was implemented in C#.

  6. An algorithm for determination of peak regions and baseline elimination in spectroscopic data

    International Nuclear Information System (INIS)

    Morhac, Miroslav

    2009-01-01

    In this paper we propose a new algorithm for determining the regions that contain peaks and separating them from peak-free regions. Further, based on this algorithm, we propose a new background elimination algorithm which allows a more accurate estimate of the background beneath the peaks than the algorithms known so far. The algorithm is based on a clipping operation with a window that is adjusted automatically to the widths of the identified peak regions. The illustrative examples presented in the paper speak in favor of the proposed algorithms.
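
    For reference, the sketch below shows the classic fixed-window clipping (SNIP-style) background estimate that such methods build on; the automatic, peak-width-adaptive window described above is the paper's contribution and is not reproduced here. The synthetic spectrum and window size are invented for illustration.

```python
import numpy as np

def clipping_background(spectrum, max_window):
    """Classic fixed-window clipping background estimate (SNIP-style sketch)."""
    bg = spectrum.astype(float)
    n = bg.size
    for w in range(1, max_window + 1):               # growing clipping window
        left = np.concatenate([bg[:w], bg[:-w]])     # bg[i - w], edges clamped
        right = np.concatenate([bg[w:], bg[-w:]])    # bg[i + w], edges clamped
        bg = np.minimum(bg, 0.5 * (left + right))    # clip anything above the local average
    return bg

if __name__ == "__main__":
    x = np.linspace(0, 100, 500)
    baseline = 50 * np.exp(-x / 80.0)
    peaks = 40 * np.exp(-0.5 * ((x - 30) / 1.5) ** 2) + 25 * np.exp(-0.5 * ((x - 70) / 2.0) ** 2)
    spectrum = baseline + peaks
    est = clipping_background(spectrum, max_window=40)
    print("largest peak height above the estimated background:", float(np.max(spectrum - est)))
```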

  7. Traumatic late cardiac tamponade: analysis of five cases

    Directory of Open Access Journals (Sweden)

    FERNANDO LUIZ WESTPHAL

    2000-09-01

    Full Text Available Five cases of traumatic late cardiac tamponade were analyzed. All patients were male, with a mean age of 26.2 years, victims of stab wounds to the precordial region (Ziedler area), and were admitted to a trauma reference center. They were classified by the Ivatury physiological index for cardiac trauma and initially treated by intercostal pleural drainage and volume replacement, with stabilization of the hemodynamic and respiratory condition. The patients were readmitted, with signs of cardiac tamponade, after an interval of eight to 24 days (median of 20 days), this time to the thoracic surgery service of a tertiary reference hospital. Diagnostic exams confirmed pericardial effusion with pericardial thickening associated with entrapment of the left pulmonary base in four cases, which were approached by posterolateral thoracotomy with partial pericardiectomy and pulmonary decortication. One patient developed purulent pericarditis, confirmed by complementary exams, and underwent subxiphoid pericardial drainage. Postoperative arrhythmia occurred in one patient; the others evolved without postoperative complications or recurrence of the tamponade.

  8. Optimization-Based Image Segmentation by Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Rosenberger C

    2008-01-01

    Full Text Available Many works in the literature focus on the definition of evaluation metrics and criteria that make it possible to quantify the performance of an image processing algorithm. These evaluation criteria can be used to define new image processing algorithms by optimizing them. In this paper, we propose a general scheme for segmenting images with a genetic algorithm. The developed method uses an evaluation criterion which quantifies the quality of an image segmentation result. The proposed segmentation method can integrate a local ground truth when it is available in order to set the desired level of precision of the final result. A genetic algorithm is then used to determine the best combination of the information extracted by the selected criterion. We then show that this approach can be applied to gray-level or multicomponent images in either a supervised or an unsupervised context. Finally, we show the efficiency of the proposed method through experimental results on several gray-level and multicomponent images.

  9. Optimization-Based Image Segmentation by Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    H. Laurent

    2008-05-01

    Full Text Available Many works in the literature focus on the definition of evaluation metrics and criteria that make it possible to quantify the performance of an image processing algorithm. These evaluation criteria can be used to define new image processing algorithms by optimizing them. In this paper, we propose a general scheme for segmenting images with a genetic algorithm. The developed method uses an evaluation criterion which quantifies the quality of an image segmentation result. The proposed segmentation method can integrate a local ground truth when it is available in order to set the desired level of precision of the final result. A genetic algorithm is then used to determine the best combination of the information extracted by the selected criterion. We then show that this approach can be applied to gray-level or multicomponent images in either a supervised or an unsupervised context. Finally, we show the efficiency of the proposed method through experimental results on several gray-level and multicomponent images.

  10. An improved algorithm for connectivity analysis of distribution networks

    International Nuclear Information System (INIS)

    Kansal, M.L.; Devi, Sunita

    2007-01-01

    In the present paper, an efficient algorithm for connectivity analysis of moderately sized distribution networks is suggested. The algorithm is based on the generation of all possible minimal system cutsets. The algorithm is efficient as it identifies only the necessary and sufficient system failure conditions in n-out-of-n type distribution networks. The proposed algorithm is demonstrated with the help of saturated and unsaturated distribution networks. The computational efficiency of the algorithm is justified by comparing the computational effort with that of the previously suggested appended spanning tree (AST) algorithm. The proposed technique has the added advantage that it can be utilized to generate system inequalities, which is useful in the reliability estimation of capacitated networks.

  11. Phase-unwrapping algorithm by a rounding-least-squares approach

    Science.gov (United States)

    Juarez-Salazar, Rigoberto; Robledo-Sanchez, Carlos; Guerrero-Sanchez, Fermin

    2014-02-01

    A simple and efficient phase-unwrapping algorithm based on a rounding procedure and a global least-squares minimization is proposed. Instead of processing the gradient of the wrapped phase, this algorithm operates on the gradient of the phase jumps using a robust and noniterative scheme. Thus, the residue-spreading and over-smoothing effects are reduced. The algorithm's performance is compared with four well-known phase-unwrapping methods: minimum cost network flow (MCNF), fast Fourier transform (FFT), quality-guided, and branch-cut. A computer simulation and experimental results show that the proposed algorithm reaches a higher accuracy level than the MCNF method, with a computing time as low as that of the FFT phase-unwrapping method. Moreover, since the proposed algorithm is simple, fast, and user-free, it could be used in metrological interferometric and fringe-projection automatic real-time applications.
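
    As a one-dimensional illustration of the rounding idea (estimate the integer number of 2π wraps in each gradient step and integrate the corrected gradient), the sketch below unwraps a toy linear phase ramp. It is an assumed simplification and not the paper's two-dimensional least-squares formulation.

```python
import numpy as np

def unwrap_1d_by_rounding(wrapped):
    """Unwrap a 1-D phase signal by rounding the 2*pi jump count of each gradient step."""
    diffs = np.diff(wrapped)
    jumps = np.round(diffs / (2 * np.pi))            # integer number of wraps per step
    corrected = diffs - 2 * np.pi * jumps            # gradient of the unwrapped phase
    return np.concatenate([[wrapped[0]], wrapped[0] + np.cumsum(corrected)])

if __name__ == "__main__":
    true_phase = np.linspace(0, 12 * np.pi, 300)
    wrapped = np.angle(np.exp(1j * true_phase))
    unwrapped = unwrap_1d_by_rounding(wrapped)
    print("max error vs. true phase:", float(np.abs(unwrapped - true_phase).max()))
```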

  12. SVC control enhancement applying self-learning fuzzy algorithm for islanded microgrid

    Directory of Open Access Journals (Sweden)

    Hossam Gabbar

    2016-03-01

    Full Text Available Maintaining voltage stability within acceptable levels for islanded microgrids (MGs) is a challenge due to the limited exchange of power between generation and loads. This paper proposes an algorithm to enhance the dynamic performance of islanded MGs in the presence of load disturbance using a Static VAR Compensator (SVC) with a Fuzzy Model Reference Learning Controller (FMRLC). The proposed algorithm compensates for MG nonlinearity via fuzzy membership functions and an inference mechanism embedded in both the controller and the inverse model. Hence, the MG keeps the desired performance as required at any operating condition. Furthermore, the self-learning capability of the proposed control algorithm compensates for grid parameter variation even with inadequate information about load dynamics. A reference model was designed to reject bus voltage disturbance with a performance achievable by the proposed fuzzy controller. Three simulation scenarios are presented to investigate the effectiveness of the proposed control algorithm in improving the steady-state and transient performance of islanded MGs: the first scenario is conducted without SVC, the second with SVC using a PID controller, and the third using the FMRLC algorithm. A comparison of the results shows the ability of the proposed control algorithm to enhance disturbance rejection due to the learning process.

  13. One improved LSB steganography algorithm

    Science.gov (United States)

    Song, Bing; Zhang, Zhi-hong

    2013-03-01

    Information hidden in digital images using the LSB algorithm is easily detected with high accuracy by X2 and RS steganalysis. Starting from the selection of the embedding locations and a modification of the embedding method, and combining a sub-affine transformation with the matrix coding method, we improve the LSB algorithm and propose a new LSB algorithm. Experimental results show that the improved algorithm can resist X2 and RS steganalysis effectively.
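
    For context, the sketch below shows the plain LSB substitution baseline that such improvements start from (not the improved algorithm with sub-affine transformation and matrix coding). It hides a bit sequence in the least significant bits of a toy grayscale image, and every name and value is illustrative.

```python
import numpy as np

def lsb_embed(image, bits):
    """Embed a bit sequence into the least significant bits of a uint8 image."""
    flat = image.flatten()                                     # flatten() returns a copy
    if len(bits) > flat.size:
        raise ValueError("message too long for this cover image")
    flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | np.asarray(bits, dtype=np.uint8)
    return flat.reshape(image.shape)

def lsb_extract(image, n_bits):
    """Read back the first n_bits least significant bits."""
    return image.flatten()[:n_bits] & 1

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    cover = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
    message = rng.integers(0, 2, size=128, dtype=np.uint8)
    stego = lsb_embed(cover, message)
    assert np.array_equal(lsb_extract(stego, 128), message)
    print("bits recovered correctly; pixels changed:", int(np.sum(stego != cover)))
```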

  14. Hill climbing algorithms and trivium

    DEFF Research Database (Denmark)

    Borghoff, Julia; Knudsen, Lars Ramkilde; Matusiewicz, Krystian

    2011-01-01

    This paper proposes a new method to solve certain classes of systems of multivariate equations over the binary field, together with its cryptanalytical applications. We show how heuristic optimization methods such as hill climbing algorithms can be relevant to solving systems of multivariate equations. A characteristic of equation systems that may be efficiently solvable by means of such algorithms is provided. As an example, we investigate equation systems induced by the problem of recovering the internal state of the stream cipher Trivium. We propose an improved variant of the simulated annealing method...

  15. Scalable unit commitment by memory-bounded ant colony optimization with A* local search

    Energy Technology Data Exchange (ETDEWEB)

    Saber, Ahmed Yousuf; Alshareef, Abdulaziz Mohammed [Department of Electrical and Computer Engineering, King Abdulaziz University, P.O. Box 80204, Jeddah 21589 (Saudi Arabia)

    2008-07-15

    Ant colony optimization (ACO) is successfully applied in optimization problems. The performance of the basic ACO for small problems with moderate dimension and searching space is satisfactory. As the searching space grows exponentially in the large-scale unit commitment problem, the basic ACO is not applicable because the pheromone matrix of ACO becomes too vast for practical computation time and physical computer-memory limits. However, memory-bounded methods prune the least-promising nodes to fit the system in computer memory. Therefore, the authors propose memory-bounded ant colony optimization (MACO) in this paper for the scalable (no restriction on system size) unit commitment problem. This MACO intelligently overcomes the limitation of computer memory and does not permit the system to grow beyond a bound on memory. In the memory-bounded ACO implementation, an A* heuristic is introduced to increase local searching ability and a probabilistic nearest neighbor method is applied to estimate the pheromone intensity for forgotten values. Finally, benchmark data sets and existing methods are used to show the effectiveness of the proposed method. (author)

  16. Merged Search Algorithms for Radio Frequency Identification Anticollision

    Directory of Open Access Journals (Sweden)

    Bih-Yaw Shih

    2012-01-01

    The arbitration algorithm for an RFID system is used to arbitrate all the tags to avoid the collision problem when multiple tags exist in the interrogation field of a transponder. A splitting algorithm called the Binary Search Tree (BST) is well known for multi-tag arbitration. In the current study, a splitting-based scheme called the Merged Search Tree is proposed to capture identification codes correctly for anticollision. The performance of the proposed algorithm is compared with the original BST in terms of time and power consumed during the arbitration process. The results show that the proposed model can reduce both searching time and power consumption, achieving better arbitration performance.

  17. Exact and Heuristic Algorithms for Runway Scheduling

    Science.gov (United States)

    Malik, Waqar A.; Jung, Yoon C.

    2016-01-01

    This paper explores the Single Runway Scheduling (SRS) problem with arrivals, departures, and crossing aircraft on the airport surface. Constraints for wake vortex separations, departure area navigation separations and departure time window restrictions are explicitly considered. The main objective of this research is to develop exact and heuristic based algorithms that can be used in real-time decision support tools for Air Traffic Control Tower (ATCT) controllers. The paper provides a multi-objective dynamic programming (DP) based algorithm that finds the exact solution to the SRS problem, but may prove unusable for application in real-time environment due to large computation times for moderate sized problems. We next propose a second algorithm that uses heuristics to restrict the search space for the DP based algorithm. A third algorithm based on a combination of insertion and local search (ILS) heuristics is then presented. Simulation conducted for the east side of Dallas/Fort Worth International Airport allows comparison of the three proposed algorithms and indicates that the ILS algorithm performs favorably in its ability to find efficient solutions and its computation times.

  18. Asymmetric intimacy and algorithm for detecting communities in bipartite networks

    Science.gov (United States)

    Wang, Xingyuan; Qin, Xiaomeng

    2016-11-01

    In this paper, an algorithm for choosing a good partition in bipartite networks is proposed. Bipartite networks have considerable theoretical significance and broad prospects for application. In view of the distinctive structure of bipartite networks, in our method two parameters are defined to show the relationships between nodes of the same type and between heterogeneous nodes, respectively. Moreover, our algorithm employs a new method of finding and expanding the core communities in bipartite networks. The two kinds of nodes are handled separately and merged, and then the sub-communities are obtained. After that, the target communities are found according to the merging rule. The proposed algorithm has been simulated on real-world and artificial networks, and the results verify the accuracy and reliability of the intimacy parameters used in our algorithm. Finally, comparisons with similar algorithms show that the proposed algorithm has better performance.

  19. Species co-evolutionary algorithm: a novel evolutionary algorithm based on the ecology and environments for optimization

    DEFF Research Database (Denmark)

    Li, Wuzhao; Wang, Lei; Cai, Xingjuan

    2015-01-01

    and affect each other in many ways. The relationships include competition, predation, parasitism, mutualism and pythogenesis. In this paper, we consider the five relationships between solutions to propose a co-evolutionary algorithm termed species co-evolutionary algorithm (SCEA). In SCEA, five operators...

  20. Variable forgetting factor mechanisms for diffusion recursive least squares algorithm in sensor networks

    Science.gov (United States)

    Zhang, Ling; Cai, Yunlong; Li, Chunguang; de Lamare, Rodrigo C.

    2017-12-01

    In this work, we present low-complexity variable forgetting factor (VFF) techniques for diffusion recursive least squares (DRLS) algorithms. In particular, we propose low-complexity VFF-DRLS algorithms for distributed parameter and spectrum estimation in sensor networks. The proposed algorithms can adjust the forgetting factor automatically according to the a posteriori error signal. We develop detailed analyses of the mean and mean-square performance of the proposed algorithms and derive mathematical expressions for the mean square deviation (MSD) and the excess mean square error (EMSE). The simulation results show that the proposed low-complexity VFF-DRLS algorithms achieve superior performance to the existing DRLS algorithm with fixed forgetting factor when applied to scenarios of distributed parameter and spectrum estimation. In addition, the simulation results also demonstrate a good match with our analytical expressions.
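
    The sketch below is a single-node illustration of RLS with a simple error-driven variable forgetting factor (small errors push the forgetting factor toward its upper bound, large errors toward its lower bound). The diffusion/network combination step and the exact VFF rule of the paper are not reproduced, and all parameter values and the toy identification task are assumptions.

```python
import numpy as np

def vff_rls(x, d, order=4, lam_min=0.95, lam_max=0.9995, gamma=10.0, delta=1e2):
    """Single-node RLS with a simple error-driven variable forgetting factor (sketch)."""
    w = np.zeros(order)
    P = delta * np.eye(order)
    errors = np.zeros(len(d))
    for n in range(order, len(d)):
        u = x[n - order + 1:n + 1][::-1]             # regressor, most recent sample first
        e = d[n] - w @ u
        # small error -> forget slowly (lam near lam_max); large error -> adapt quickly
        lam = lam_min + (lam_max - lam_min) * np.exp(-gamma * e * e)
        k = P @ u / (lam + u @ P @ u)                # RLS gain vector
        w = w + k * e
        P = (P - np.outer(k, u @ P)) / lam
        errors[n] = e
    return w, errors

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([0.6, -0.3, 0.2, 0.1])
    x = rng.standard_normal(2000)
    d = np.convolve(x, true_w, mode="full")[:len(x)] + 0.01 * rng.standard_normal(len(x))
    w, _ = vff_rls(x, d)
    print("estimated filter:", np.round(w, 3))       # should be close to true_w
```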

  1. Genetic Algorithm Applied to the Eigenvalue Equalization Filtered-x LMS Algorithm (EE-FXLMS

    Directory of Open Access Journals (Sweden)

    Stephan P. Lovstedt

    2008-01-01

    Full Text Available The FXLMS algorithm, used extensively in active noise control (ANC), exhibits frequency-dependent convergence behavior. This leads to degraded performance for time-varying tonal noise and for noise with multiple stationary tones. Previous work by the authors proposed the eigenvalue equalization filtered-x least mean squares (EE-FXLMS) algorithm. In that algorithm, the magnitude coefficients of the secondary path transfer function are modified to decrease the variation in the eigenvalues of the filtered-x autocorrelation matrix while preserving the phase, giving faster convergence and increasing overall attenuation. This paper revisits the EE-FXLMS algorithm, using a genetic algorithm to find the magnitude coefficients that give the least variation in eigenvalues. This method overcomes some of the problems in implementing the EE-FXLMS algorithm that arise from the finite resolution of sampled systems. Experimental control results using the original secondary path model, and a modified secondary path model, for both the previous implementation of EE-FXLMS and the genetic algorithm implementation are compared.
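
    For readers unfamiliar with the baseline, the sketch below is a minimal single-channel FXLMS loop (the reference is filtered through an estimate of the secondary path before the LMS update). It is an assumed textbook illustration with invented toy primary/secondary paths and step size, not the EE-FXLMS or genetic-algorithm variant discussed above.

```python
import numpy as np

def fxlms(reference, disturbance, sec_path, sec_path_est, n_taps=16, mu=0.01):
    """Minimal single-channel FXLMS sketch: adapt control filter W so that the
    secondary-path output cancels the disturbance at the error sensor."""
    w = np.zeros(n_taps)
    x_buf = np.zeros(n_taps)                 # reference history feeding W
    sec_buf = np.zeros(len(sec_path))        # control-signal history feeding the true S(z)
    fx_buf = np.zeros(len(sec_path_est))     # reference history feeding the estimate S_hat(z)
    fx_hist = np.zeros(n_taps)               # filtered-x history used in the update
    errors = np.zeros(len(reference))
    for n in range(len(reference)):
        x_buf = np.roll(x_buf, 1)
        x_buf[0] = reference[n]
        y = w @ x_buf                                    # control signal
        sec_buf = np.roll(sec_buf, 1)
        sec_buf[0] = y
        anti_noise = sec_path @ sec_buf                  # control signal through S(z)
        e = disturbance[n] + anti_noise                  # error microphone signal
        fx_buf = np.roll(fx_buf, 1)
        fx_buf[0] = reference[n]
        fx = sec_path_est @ fx_buf                       # filtered-x sample
        fx_hist = np.roll(fx_hist, 1)
        fx_hist[0] = fx
        w -= mu * e * fx_hist                            # LMS update on the filtered reference
        errors[n] = e
    return errors

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 4000
    x = np.sin(2 * np.pi * 0.05 * np.arange(n)) + 0.05 * rng.standard_normal(n)
    primary = np.array([0.0, 0.9, 0.5, 0.2])             # toy primary path P(z)
    secondary = np.array([0.0, 0.8, 0.3])                # toy secondary path S(z)
    d = np.convolve(x, primary, mode="full")[:n]
    e = fxlms(x, d, secondary, secondary)                # assume a perfect estimate of S(z)
    print("error power, first vs. last 500 samples:",
          float(np.mean(e[:500] ** 2)), float(np.mean(e[-500:] ** 2)))
```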

  2. Application of Hybrid Optimization Algorithm in the Synthesis of Linear Antenna Array

    Directory of Open Access Journals (Sweden)

    Ezgi Deniz Ülker

    2014-01-01

    Full Text Available The use of hybrid algorithms for solving real-world optimization problems has become popular, since their solution quality can be made better than that of the algorithms that form them by combining their desirable features. The newly proposed hybrid method, called the Hybrid Differential, Particle, and Harmony (HDPH) algorithm, is different from other hybrid forms since it uses all features of the merged algorithms in order to perform efficiently for a wide variety of problems. In the proposed algorithm the control parameters are randomized, which makes its implementation easy and provides a fast response. This paper describes the application of the HDPH algorithm to linear antenna array synthesis. The results obtained with the HDPH algorithm are compared with the three merged optimization techniques that are used in HDPH. The comparison shows that the performance of the proposed algorithm is comparatively better in both solution quality and robustness. The proposed hybrid algorithm HDPH can be an efficient candidate for real-time optimization problems since it yields reliable performance at all times when executed.

  3. Two New PRP Conjugate Gradient Algorithms for Minimization Optimization Models.

    Directory of Open Access Journals (Sweden)

    Gonglin Yuan

    Full Text Available Two new PRP conjugate gradient algorithms are proposed in this paper based on two modified PRP conjugate gradient methods: the first algorithm is proposed for solving unconstrained optimization problems, and the second algorithm is proposed for solving nonlinear equations. The first method contains two aspects of information: the function value and the gradient value. The two methods both possess some good properties, as follows: (1) βk ≥ 0; (2) the search direction has the trust region property without the use of any line search method; (3) the search direction has the sufficient descent property without the use of any line search method. Under some suitable conditions, we establish the global convergence of the two algorithms. We conduct numerical experiments to evaluate our algorithms. The numerical results indicate that the first algorithm is effective and competitive for solving unconstrained optimization problems and that the second algorithm is effective for solving large-scale nonlinear equations.

  4. A genetic algorithm for solving supply chain network design model

    Science.gov (United States)

    Firoozi, Z.; Ismail, N.; Ariafar, S. H.; Tang, S. H.; Ariffin, M. K. M. A.

    2013-09-01

    Network design is by nature costly, and optimization models play a significant role in reducing the unnecessary cost components of a distribution network. This study proposes a genetic algorithm to solve a distribution network design model. The structure of the chromosome in the proposed algorithm is defined in a novel way that, in addition to producing feasible solutions, also reduces the computational complexity of the algorithm. Computational results are presented to show the algorithm's performance.

  5. Parallel algorithms for testing finite state machines:Generating UIO sequences

    OpenAIRE

    Hierons, RM; Turker, UC

    2016-01-01

    This paper describes an efficient parallel algorithm that uses many-core GPUs for automatically deriving Unique Input Output sequences (UIOs) from Finite State Machines. The proposed algorithm uses the global scope of the GPU's global memory through coalesced memory access and minimises the transfer between CPU and GPU memory. The results of experiments indicate that the proposed method yields considerably better results compared to a single core UIO construction algorithm. Our algorithm is s...

  6. Learning Intelligent Genetic Algorithms Using Japanese Nonograms

    Science.gov (United States)

    Tsai, Jinn-Tsong; Chou, Ping-Yi; Fang, Jia-Cen

    2012-01-01

    An intelligent genetic algorithm (IGA) is proposed to solve Japanese nonograms and is used as a method in a university course to learn evolutionary algorithms. The IGA combines the global exploration capabilities of a canonical genetic algorithm (CGA) with effective condensed encoding, improved fitness function, and modified crossover and…

  7. Active Semisupervised Clustering Algorithm with Label Propagation for Imbalanced and Multidensity Datasets

    Directory of Open Access Journals (Sweden)

    Mingwei Leng

    2013-01-01

    Full Text Available The accuracy of most existing semisupervised clustering algorithms that rely on a small labeled dataset is low when dealing with multidensity and imbalanced datasets, and labeling data is quite expensive and time consuming in many real-world applications. This paper focuses on active data selection and semisupervised clustering in multidensity and imbalanced datasets and proposes an active semisupervised clustering algorithm. The proposed algorithm uses an active mechanism for data selection to minimize the amount of labeled data, and it utilizes multiple thresholds to expand the labeled datasets on multidensity and imbalanced datasets. Three standard datasets and one synthetic dataset are used to demonstrate the proposed algorithm, and the experimental results show that the proposed semisupervised clustering algorithm has a higher accuracy and a more stable performance in comparison to other clustering and semisupervised clustering algorithms, especially when the datasets are multidensity and imbalanced.

  8. Genetic local search algorithm for optimization design of diffractive optical elements.

    Science.gov (United States)

    Zhou, G; Chen, Y; Wang, Z; Song, H

    1999-07-10

    We propose a genetic local search algorithm (GLSA) for the optimization design of diffractive optical elements (DOE's). This hybrid algorithm incorporates advantages of both genetic algorithm (GA) and local search techniques. It appears better able to locate the global minimum compared with a canonical GA. Sample cases investigated here include the optimization design of binary-phase Dammann gratings, continuous surface-relief grating array generators, and a uniform top-hat focal plane intensity profile generator. Two GLSA's whose incorporated local search techniques are the hill-climbing method and the simulated annealing algorithm are investigated. Numerical experimental results demonstrate that the proposed algorithm is highly efficient and robust. DOE's that have high diffraction efficiency and excellent uniformity can be achieved by use of the algorithm we propose.

  9. A Spherical Model Based Keypoint Descriptor and Matching Algorithm for Omnidirectional Images

    Directory of Open Access Journals (Sweden)

    Guofeng Tong

    2014-04-01

    Full Text Available Omnidirectional images generally have nonlinear distortion in the radial direction. Unfortunately, traditional algorithms such as the scale-invariant feature transform (SIFT) and Descriptor-Nets (D-Nets) do not work well in matching omnidirectional images precisely because they are incapable of dealing with this distortion. In order to solve this problem, a new voting algorithm is proposed based on the spherical model and the D-Nets algorithm. Because the spherical-based keypoint descriptor contains the distortion information of omnidirectional images, the proposed matching algorithm is invariant to distortion. Keypoint matching experiments are performed on three pairs of omnidirectional images, and a comparison is made among the proposed algorithm, SIFT, and D-Nets. The results show that the proposed algorithm is more robust and more precise than SIFT and D-Nets in matching omnidirectional images. Compared with SIFT and D-Nets, the proposed algorithm has two main advantages: (a) there are more real matching keypoints; (b) the coverage range of the matching keypoints is wider, including the seriously distorted areas.

  10. Applying Organization Theory to Understanding the Adoption and Implementation of Accountable Care Organizations: Commentary.

    Science.gov (United States)

    Shortell, Stephen M

    2016-12-01

    This commentary highlights the key arguments and contributions of institutional theory, transaction cost economics (TCE) theory, high reliability theory, and organizational learning theory to understanding the development and evolution of Accountable Care Organizations (ACOs). Institutional theory and TCE theory primarily emphasize the external influences shaping ACOs, while high reliability theory and organizational learning theory underscore the internal factors influencing ACO performance. A framework based on Implementation Science is proposed to consider the multiple perspectives on ACOs and, in particular, their ability to innovate to achieve desired cost, quality, and population health goals. © The Author(s) 2016.

  11. Fuzzy Rules for Ant Based Clustering Algorithm

    Directory of Open Access Journals (Sweden)

    Amira Hamdi

    2016-01-01

    Full Text Available This paper provides a new intelligent technique for the semisupervised data clustering problem that combines the Ant System (AS) algorithm with the fuzzy c-means (FCM) clustering algorithm. Our proposed approach, called the F-ASClass algorithm, is a distributed algorithm inspired by the foraging behavior observed in ant colonies. The ability of ants to find the shortest path forms the basis of our proposed approach. In the first step, several colonies of cooperating entities, called artificial ants, are used to find shortest paths in a complete graph that we call the graph-data. The number of colonies used in F-ASClass is equal to the number of clusters in the dataset. The partition matrix of the dataset found by the artificial ants is then given, in the second step, to the fuzzy c-means technique in order to assign the unclassified objects generated in the first step. The proposed approach is tested on artificial and real datasets, and its performance is compared with those of the K-means, K-medoid, and FCM algorithms. The experimental section shows that F-ASClass performs better according to the error rate of classification, accuracy, and separation index.

  12. An efficient non-dominated sorting method for evolutionary algorithms.

    Science.gov (United States)

    Fang, Hongbing; Wang, Qian; Tu, Yi-Cheng; Horstemeyer, Mark F

    2008-01-01

    We present a new non-dominated sorting algorithm to generate the non-dominated fronts in multi-objective optimization with evolutionary algorithms, particularly the NSGA-II. The non-dominated sorting algorithm used by NSGA-II has a time complexity of O(MN(2)) in generating non-dominated fronts in one generation (iteration) for a population size N and M objective functions. Since generating non-dominated fronts takes the majority of total computational time (excluding the cost of fitness evaluations) of NSGA-II, making this algorithm faster will significantly improve the overall efficiency of NSGA-II and other genetic algorithms using non-dominated sorting. The new non-dominated sorting algorithm proposed in this study reduces the number of redundant comparisons existing in the algorithm of NSGA-II by recording the dominance information among solutions from their first comparisons. By utilizing a new data structure called the dominance tree and the divide-and-conquer mechanism, the new algorithm is faster than NSGA-II for different numbers of objective functions. Although the number of solution comparisons by the proposed algorithm is close to that of NSGA-II when the number of objectives becomes large, the total computational time shows that the proposed algorithm still has better efficiency because of the adoption of the dominance tree structure and the divide-and-conquer mechanism.
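
    For background, a minimal sketch of Pareto dominance and a naive O(MN²) extraction of the first non-dominated front (the baseline the paper improves on; the dominance tree and divide-and-conquer machinery are not reproduced here):

        def dominates(a, b):
            # a dominates b (minimization): no worse in every objective, strictly better in one.
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def first_front(population):
            # Keep the solutions that no other solution dominates.
            return [p for p in population
                    if not any(dominates(q, p) for q in population if q is not p)]

        pop = [(1.0, 5.0), (2.0, 3.0), (4.0, 4.0), (3.0, 1.0)]
        print(first_front(pop))  # (4.0, 4.0) is dominated by (2.0, 3.0)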

  13. Rules Extraction with an Immune Algorithm

    Directory of Open Access Journals (Sweden)

    Deqin Yan

    2007-12-01

    Full Text Available In this paper, a method of extracting rules from information systems with immune algorithms is proposed. The immune algorithm is designed around a sharing mechanism for extracting rules. The principle of sharing and competing for resources in the sharing mechanism is consistent with the relationship of sharing and rivalry among rules. In order to extract rules efficiently, a new concept of flexible confidence and rule measurement is introduced. Experiments demonstrate that the proposed method is effective.

  14. Algorithm for Spatial Clustering with Obstacles

    OpenAIRE

    El-Sharkawi, Mohamed E.; El-Zawawy, Mohamed A.

    2009-01-01

    In this paper, we propose an efficient clustering technique to solve the problem of clustering in the presence of obstacles. The proposed algorithm divides the spatial area into rectangular cells. Each cell is associated with statistical information that enables us to label the cell as dense or non-dense. We also label each cell as obstructed (i.e. intersects any obstacle) or non-obstructed. Then the algorithm finds the regions (clusters) of connected, dense, non-obstructed cells. Finally, th...

  15. Optimized hyperspectral band selection using hybrid genetic algorithm and gravitational search algorithm

    Science.gov (United States)

    Zhang, Aizhu; Sun, Genyun; Wang, Zhenjie

    2015-12-01

    The serious information redundancy in hyperspectral images (HIs) does not contribute to the accuracy of data analysis; instead, it requires expensive computational resources. Consequently, to identify the most useful and valuable information in the HIs and thereby improve the accuracy of data analysis, this paper proposes a novel hyperspectral band selection method using a hybrid genetic algorithm and gravitational search algorithm (GA-GSA). In the proposed method, the GA-GSA is first mapped to the binary space. Then, the accuracy of a support vector machine (SVM) classifier and the number of selected spectral bands are utilized to measure the discriminative capability of the band subset. Finally, the band subset with the smallest number of spectral bands that covers the most useful and valuable information is obtained. To verify the effectiveness of the proposed method, studies conducted on an AVIRIS image against two recently proposed state-of-the-art GSA variants are presented. The experimental results revealed the superiority of the proposed method and indicated that it can indeed considerably reduce data storage costs and efficiently identify the band subset with stable and high classification precision.

  16. Seismic active control by a heuristic-based algorithm

    International Nuclear Information System (INIS)

    Tang, Yu.

    1996-01-01

    A heuristic-based algorithm for seismic active control is generalized to permit consideration of the effects of control-structure interaction and actuator dynamics. The control force is computed one time step ahead before being applied to the structure. Therefore, the proposed control algorithm is free from the problem of time delay. A numerical example is presented to show the effectiveness of the proposed control algorithm. Also, two indices are introduced in the paper to assess the effectiveness and efficiency of control laws

  17. A Novel Modified Algorithm with Reduced Complexity LDPC Code Decoder

    Directory of Open Access Journals (Sweden)

    Song Yang

    2014-10-01

    Full Text Available A novel efficient decoding algorithm that reduces the complexity of the sum-product algorithm (SPA) for LDPC codes is proposed. Based on the hyperbolic tangent rule, the check node update is modified into two horizontal processes with similar calculations. Motivated by the finding that the min-sum (MS) algorithm reduces complexity by reducing the approximation error in the horizontal process, the part of the information with small weight is simplified. Compared with the existing approximations, the proposed method has lower computational complexity than the SPA. Simulation results show that the proposed algorithm can achieve performance very close to that of the SPA.

  18. A scalable and practical one-pass clustering algorithm for recommender system

    Science.gov (United States)

    Khalid, Asra; Ghazanfar, Mustansar Ali; Azam, Awais; Alahmari, Saad Ali

    2015-12-01

    KMeans clustering-based recommendation algorithms have been proposed claiming to increase the scalability of recommender systems. One potential drawback of these algorithms is that they perform training offline and hence cannot accommodate incremental updates with the arrival of new data, making them unsuitable for dynamic environments. From this line of research, a new clustering algorithm called One-Pass is proposed, which is simple, fast, and accurate. We show empirically that the proposed algorithm outperforms K-Means in terms of recommendation and training time while maintaining a good level of accuracy.

  19. A similarity based agglomerative clustering algorithm in networks

    Science.gov (United States)

    Liu, Zhiyuan; Wang, Xiujuan; Ma, Yinghong

    2018-04-01

    The detection of clusters is beneficial for understanding the organization and functions of networks. Clusters, or communities, are usually groups of nodes densely interconnected but sparsely linked with any other clusters. To identify communities, an efficient and effective agglomerative community detection algorithm based on node similarity is proposed. The proposed method initially calculates the similarity between each pair of nodes and forms pre-partitions according to the principle that each node is in the same community as its most similar neighbor. After that, each partition is checked to see whether it satisfies the community criterion. Pre-partitions that do not satisfy it are merged with the partitions to which they have the biggest attraction, until no further changes occur. To measure the attraction ability of a partition, we propose an attraction index based on the importance of the linked nodes in the network. Therefore, our proposed method can better exploit the nodes' properties and the network's structure. To test the performance of our algorithm, both synthetic and empirical networks of different scales are tested. Simulation results show that the proposed algorithm obtains superior clustering results compared with six other widely used community detection algorithms.
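
    A minimal sketch of the pre-partition step described above, where every node joins the community of its most similar neighbor; the Jaccard similarity used here is an illustrative stand-in for the paper's similarity and attraction measures:

        def most_similar_neighbor_partition(adj, similarity):
            # adj maps each node to the set of its neighbors; union-find merges each
            # node with the neighbor it is most similar to.
            parent = {v: v for v in adj}

            def find(v):
                while parent[v] != v:
                    parent[v] = parent[parent[v]]
                    v = parent[v]
                return v

            for v, neighbors in adj.items():
                if neighbors:
                    best = max(neighbors, key=lambda u: similarity(adj, v, u))
                    parent[find(v)] = find(best)

            groups = {}
            for v in adj:
                groups.setdefault(find(v), set()).add(v)
            return list(groups.values())

        def jaccard(adj, u, v):
            # Placeholder similarity: Jaccard index of closed neighborhoods.
            nu, nv = adj[u] | {u}, adj[v] | {v}
            return len(nu & nv) / len(nu | nv)

        adj = {1: {2, 3}, 2: {1, 3}, 3: {1, 2, 4}, 4: {3, 5}, 5: {4, 6}, 6: {5}}
        print(most_similar_neighbor_partition(adj, jaccard))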

  20. Distribution agnostic structured sparsity recovery algorithms

    KAUST Repository

    Al-Naffouri, Tareq Y.; Masood, Mudassir

    2013-01-01

    We present an algorithm and its variants for sparse signal recovery from a small number of its measurements in a distribution agnostic manner. The proposed algorithm finds Bayesian estimate of a sparse signal to be recovered and at the same time

  1. VLSI PARTITIONING ALGORITHM WITH ADAPTIVE CONTROL PARAMETER

    Directory of Open Access Journals (Sweden)

    P. N. Filippenko

    2013-03-01

    Full Text Available The article deals with the problem of very large-scale integration circuit partitioning. A graph is selected as the mathematical model describing the integrated circuit. A modification of the ant colony optimization algorithm is presented, which is used to solve the graph partitioning problem. Ant colony optimization is an optimization method based on the principles of self-organization and other useful features of the ants’ behavior. The proposed search system is based on the ant colony optimization algorithm with an improved method of initial distribution and dynamic adjustment of the control search parameters. The experimental results and performance comparison show that the proposed method of very large-scale integration circuit partitioning provides better search performance than other well-known algorithms.

  2. A Fast DOA Estimation Algorithm Based on Polarization MUSIC

    Directory of Open Access Journals (Sweden)

    R. Guo

    2015-04-01

    Full Text Available A fast DOA estimation algorithm developed from MUSIC, which also benefits from processing the signals' polarization information, is presented. Besides performance enhancement in precision and resolution, the proposed algorithm can be applied to various forms of polarization-sensitive arrays, without specific requirements on the array's pattern. By exploiting the continuity property of the spatial spectrum, a huge amount of computation incurred in the calculation of the 4-D spatial spectrum is averted. A performance and computational complexity analysis of the proposed algorithm is discussed and the simulation results are presented. Compared with conventional MUSIC, it is shown that the proposed algorithm has a considerable advantage in terms of precision and resolution, with a low computational complexity proportional to that of a conventional 2-D MUSIC.

  3. Successive approximation algorithm for cancellation of artifacts in DSA images

    International Nuclear Information System (INIS)

    Funakami, Raiko; Hiroshima, Kyoichi; Nishino, Junji

    2000-01-01

    In this paper, we propose an algorithm for cancellation of artifacts in DSA images. We have already proposed an automatic registration method based on the detection of local movements. When motion of the object is large, it is difficult to estimate the exact movement, and the cancellation of artifacts may therefore fail. The algorithm we propose here is based on a simple rigid model. We present the results of applying the proposed method to a series of experimental X-ray images, as well as the results of applying the algorithm as preprocessing for a registration method based on local movement. (author)

  4. A Double Evolutionary Pool Memetic Algorithm for Examination Timetabling Problems

    Directory of Open Access Journals (Sweden)

    Yu Lei

    2014-01-01

    Full Text Available A double evolutionary pool memetic algorithm is proposed to solve the examination timetabling problem. To improve the performance of the proposed algorithm, two evolutionary pools, that is, the main evolutionary pool and the secondary evolutionary pool, are employed. The genetic operators have been specially designed to fit the examination timetabling problem. A simplified version of the simulated annealing strategy is designed to speed the convergence of the algorithm. A clonal mechanism is introduced to preserve population diversity. Extensive experiments carried out on 12 benchmark examination timetabling instances show that the proposed algorithm is able to produce promising results for the uncapacitated examination timetabling problem.

  5. Hybridizing Differential Evolution with a Genetic Algorithm for Color Image Segmentation

    Directory of Open Access Journals (Sweden)

    R. V. V. Krishna

    2016-10-01

    Full Text Available This paper proposes a hybrid of differential evolution and genetic algorithms to solve the color image segmentation problem. Clustering based color image segmentation algorithms segment an image by clustering the features of color and texture, thereby obtaining accurate prototype cluster centers. In the proposed algorithm, the color features are obtained using the homogeneity model. A new texture feature named Power Law Descriptor (PLD which is a modification of Weber Local Descriptor (WLD is proposed and further used as a texture feature for clustering. Genetic algorithms are competent in handling binary variables, while differential evolution on the other hand is more efficient in handling real parameters. The obtained texture feature is binary in nature and the color feature is a real value, which suits very well the hybrid cluster center optimization problem in image segmentation. Thus in the proposed algorithm, the optimum texture feature centers are evolved using genetic algorithms, whereas the optimum color feature centers are evolved using differential evolution.

  6. Wavefront-ray grid FDTD algorithm

    OpenAIRE

    ÇİYDEM, MEHMET

    2016-01-01

    A finite difference time domain algorithm on a wavefront-ray grid (WRG-FDTD) is proposed in this study to reduce numerical dispersion of conventional FDTD methods. A FDTD algorithm conforming to a wavefront-ray grid can be useful to take into account anisotropy effects of numerical grids since it features directional energy flow along the rays. An explicit and second-order accurate WRG-FDTD algorithm is provided in generalized curvilinear coordinates for an inhomogeneous isotropic medium. Num...

  7. Quick fuzzy backpropagation algorithm.

    Science.gov (United States)

    Nikov, A; Stoeva, S

    2001-03-01

    A modification of the fuzzy backpropagation (FBP) algorithm called QuickFBP algorithm is proposed, where the computation of the net function is significantly quicker. It is proved that the FBP algorithm is of exponential time complexity, while the QuickFBP algorithm is of polynomial time complexity. Convergence conditions of the QuickFBP, resp. the FBP algorithm are defined and proved for: (1) single output neural networks in case of training patterns with different targets; and (2) multiple output neural networks in case of training patterns with equivalued target vector. They support the automation of the weights training process (quasi-unsupervised learning) establishing the target value(s) depending on the network's input values. In these cases the simulation results confirm the convergence of both algorithms. An example with a large-sized neural network illustrates the significantly greater training speed of the QuickFBP rather than the FBP algorithm. The adaptation of an interactive web system to users on the basis of the QuickFBP algorithm is presented. Since the QuickFBP algorithm ensures quasi-unsupervised learning, this implies its broad applicability in areas of adaptive and adaptable interactive systems, data mining, etc. applications.

  8. Experience with heterotopic heart transplantation in patients with elevated pulmonary vascular resistance: late follow-up

    Directory of Open Access Journals (Sweden)

    Jose Henrique Andrade Vila

    2010-02-01

    Full Text Available BACKGROUND: In recent years, the number of articles on heterotopic heart transplantation has been scarce in the literature, including the international literature, particularly regarding the long-term follow-up of these patients, which led to the present report. OBJECTIVE: To report the initial clinical experience and late outcome of four patients submitted to heterotopic heart transplantation, its indication and main complications. METHODS: The surgeries took place between 1992 and 2001, and in all cases the indication for heterotopic transplantation was the PVR, ranging from 4.8 WU to 6.5 WU, with a transpulmonary gradient above 15 mmHg. In the third patient, a direct anastomosis between the pulmonary arteries was performed without the use of a prosthetic tube and, in the native heart, a mitral valvuloplasty and a left ventricular (LV) aneurysmectomy were performed. The immediate immunosuppressive scheme was a double regimen with cyclosporine and azathioprine in the first three patients, and cyclosporine and mycophenolate mofetil in the fourth patient. RESULTS: There was one immediate death due to graft failure, one death after two and a half years due to endocarditis on an intraventricular thrombus in the native heart, and a third death six years after transplantation, due to postoperative complications of surgery on the aortic valve of the native heart. The remaining patient, 15 years after transplantation, is doing well, in functional class II (NYHA), six years after surgical occlusion of the aortic valve of the native heart. CONCLUSION: Heterotopic heart transplantation is a procedure with results inferior to those of orthotopic heart transplantation, because these patients have higher PVR. Intraventricular thrombi in the native heart, which require prolonged anticoagulation, as well as aortic valve complications, also in the native heart, may require surgical treatment. However, in one patient, the 15-year survival showed the long-term efficacy of this type of alternative for patients

  9. A Formally Verified Conflict Detection Algorithm for Polynomial Trajectories

    Science.gov (United States)

    Narkawicz, Anthony; Munoz, Cesar

    2015-01-01

    In air traffic management, conflict detection algorithms are used to determine whether or not aircraft are predicted to lose horizontal and vertical separation minima within a time interval assuming a trajectory model. In the case of linear trajectories, conflict detection algorithms have been proposed that are both sound, i.e., they detect all conflicts, and complete, i.e., they do not present false alarms. In general, for arbitrary nonlinear trajectory models, it is possible to define detection algorithms that are either sound or complete, but not both. This paper considers the case of nonlinear aircraft trajectory models based on polynomial functions. In particular, it proposes a conflict detection algorithm that precisely determines whether, given a lookahead time, two aircraft flying polynomial trajectories are in conflict. That is, it has been formally verified that, assuming that the aircraft trajectories are modeled as polynomial functions, the proposed algorithm is both sound and complete.

  10. Efficient Record Linkage Algorithms Using Complete Linkage Clustering.

    Science.gov (United States)

    Mamun, Abdullah-Al; Aseltine, Robert; Rajasekaran, Sanguthevar

    2016-01-01

    Datasets from different agencies often share data on the same individuals. Linking these datasets to identify all the records belonging to the same individuals is a crucial and challenging problem, especially given the large volumes of data. A large number of the available algorithms for record linkage are prone to either time inefficiency or low accuracy in finding matches and non-matches among the records. In this paper we propose efficient as well as reliable sequential and parallel algorithms for the record linkage problem employing hierarchical clustering methods. We employ complete linkage hierarchical clustering algorithms to address this problem. In addition to hierarchical clustering, we also use two other techniques: elimination of duplicate records and blocking. Our algorithms use sorting as a sub-routine to identify identical copies of records. We have tested our algorithms on datasets with millions of synthetic records. Experimental results show that our algorithms achieve nearly 100% accuracy. Parallel implementations achieve almost linear speedups. Time complexities of these algorithms do not exceed those of previous best-known algorithms. Our proposed algorithms outperform previous best-known algorithms in terms of accuracy while consuming reasonable run times.
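
    A minimal sketch of the blocking-plus-sorting idea mentioned above: records are grouped by a blocking key, sorted within each block, and only neighboring records are compared. The field names, the SequenceMatcher similarity and the 0.85 threshold are illustrative assumptions, not the paper's complete-linkage clustering pipeline:

        from collections import defaultdict
        from difflib import SequenceMatcher

        def link_records(records, block_key, threshold=0.85):
            # Group record indices into blocks so only plausible pairs get compared.
            blocks = defaultdict(list)
            for i, r in enumerate(records):
                blocks[block_key(r)].append(i)

            pairs = []
            for ids in blocks.values():
                # Sorting inside a block puts near-identical names next to each other.
                ids.sort(key=lambda i: records[i]["name"])
                for a, b in zip(ids, ids[1:]):
                    sim = SequenceMatcher(None, records[a]["name"], records[b]["name"]).ratio()
                    if sim >= threshold:
                        pairs.append((a, b))
            return pairs

        records = [
            {"name": "john smith", "zip": "06103"},
            {"name": "jon smith", "zip": "06103"},
            {"name": "mary jones", "zip": "06103"},
        ]
        print(link_records(records, block_key=lambda r: r["zip"]))  # [(0, 1)]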

  11. Theoretical analysis of two ACO approaches for the traveling salesman problem

    DEFF Research Database (Denmark)

    Kötzing, Timo; Neumann, Frank; Röglin, Heiko

    2012-01-01

    Bioinspired algorithms, such as evolutionary algorithms and ant colony optimization, are widely used for different combinatorial optimization problems. These algorithms rely heavily on the use of randomness and are hard to understand from a theoretical point of view. This paper contributes … to the theoretical analysis of ant colony optimization and studies this type of algorithm on one of the most prominent combinatorial optimization problems, namely the traveling salesperson problem (TSP). We present a new construction graph and show that it has a stronger local property than one commonly used … for constructing solutions of the TSP. The rigorous runtime analysis for two ant colony optimization algorithms, based on these two construction procedures, shows that they lead to good approximation in expected polynomial time on random instances. Furthermore, we point out in which situations our algorithms get…

  12. Chemical optimization algorithm for fuzzy controller design

    CERN Document Server

    Astudillo, Leslie; Castillo, Oscar

    2014-01-01

    In this book, a novel optimization method inspired by a paradigm from nature is introduced. The chemical reactions are used as a paradigm to propose an optimization method that simulates these natural processes. The proposed algorithm is described in detail and then a set of typical complex benchmark functions is used to evaluate the performance of the algorithm. Simulation results show that the proposed optimization algorithm can outperform other methods in a set of benchmark functions. This chemical reaction optimization paradigm is also applied to solve the tracking problem for the dynamic model of a unicycle mobile robot by integrating a kinematic and a torque controller based on fuzzy logic theory. Computer simulations are presented confirming that this optimization paradigm is able to outperform other optimization techniques applied to this particular robot application

  13. Anatomical study of the celiac, celiacomesenteric and cranial mesenteric ganglia and their connections in the domestic cat (Felix domestica, Linnaeus, 1758)

    Directory of Open Access Journals (Sweden)

    Antonio Augusto Coppi Maciel Ribeiro

    2000-01-01

    Full Text Available The celiac ganglion is one of the main structures responsible for the innervation of the stomach, intestines, liver and pancreas, and it also contributes to the innervation of the spleen, thus being essential to the control of gastrointestinal motility. Knowledge of the nerve supply to these organs is fundamental in medical and surgical practice with regard to digestive atony, hemorrhagic gastroenteritis, gastric torsion and intestinal intussusception. In this work, the macroscopic and microscopic anatomy of the celiac, celiacomesenteric and cranial mesenteric ganglia was studied. Thirty adult domestic cats were used, 10 males and 20 females. The thoracic aorta of these cats was injected with stained Neoprene latex 650 solution, and the animals were frozen for at least 48 hours. Fixation was subsequently performed with a 10% aqueous formaldehyde solution. For the microscopic studies, the following stains were used: hematoxylin-eosin, Masson's trichrome, reticulin and phosphotungstic acid hematoxylin. The celiac ganglion occurred 7 times, 4 on the right and 3 on the left, with the elliptical shape (23.3%) and the periarterial position predominating. The celiacomesenteric ganglia numbered 24, of which 11 were independent, 2 presented right and left portions, and 11 were right celiacomesenteric ganglia with a left cranial mesenteric portion, having an asymmetric semilunar shape. These findings suggest a predominance of fusion of the celiac ganglion with the cranial mesenteric ganglion, thus constituting the celiacomesenteric ganglion. This ganglion is formed by neurons immersed in an abundant fibrous connective matrix, surrounded by a capsule containing elastic, collagen and reticular fibers, showing continuity at the points of ganglionic fusion.

  14. Improved algorithm for solving nonlinear parabolized stability equations

    International Nuclear Information System (INIS)

    Zhao Lei; Zhang Cun-bo; Liu Jian-xin; Luo Ji-sheng

    2016-01-01

    Due to its high computational efficiency and ability to consider nonparallel and nonlinear effects, nonlinear parabolized stability equations (NPSE) approach has been widely used to study the stability and transition mechanisms. However, it often diverges in hypersonic boundary layers when the amplitude of disturbance reaches a certain level. In this study, an improved algorithm for solving NPSE is developed. In this algorithm, the mean flow distortion is included into the linear operator instead of into the nonlinear forcing terms in NPSE. An under-relaxation factor for computing the nonlinear terms is introduced during the iteration process to guarantee the robustness of the algorithm. Two case studies, the nonlinear development of stationary crossflow vortices and the fundamental resonance of the second mode disturbance in hypersonic boundary layers, are presented to validate the proposed algorithm for NPSE. Results from direct numerical simulation (DNS) are regarded as the baseline for comparison. Good agreement can be found between the proposed algorithm and DNS, which indicates the great potential of the proposed method on studying the crossflow and streamwise instability in hypersonic boundary layers. (paper)

  15. Novel Adaptive Bacteria Foraging Algorithms for Global Optimization

    Directory of Open Access Journals (Sweden)

    Ahmad N. K. Nasir

    2014-01-01

    Full Text Available This paper presents improved versions of bacterial foraging algorithm (BFA. The chemotaxis feature of bacteria through random motion is an effective strategy for exploring the optimum point in a search area. The selection of small step size value in the bacteria motion leads to high accuracy in the solution but it offers slow convergence. On the contrary, defining a large step size in the motion provides faster convergence but the bacteria will be unable to locate the optimum point hence reducing the fitness accuracy. In order to overcome such problems, novel linear and nonlinear mathematical relationships based on the index of iteration, index of bacteria, and fitness cost are adopted which can dynamically vary the step size of bacteria movement. The proposed algorithms are tested with several unimodal and multimodal benchmark functions in comparison with the original BFA. Moreover, the application of the proposed algorithms in modelling of a twin rotor system is presented. The results show that the proposed algorithms outperform the predecessor algorithm in all test functions and acquire better model for the twin rotor system.

  16. An Improved Direction Finding Algorithm Based on Toeplitz Approximation

    Directory of Open Access Journals (Sweden)

    Qing Wang

    2013-01-01

    Full Text Available In this paper, a novel direction of arrival (DOA) estimation algorithm called the Toeplitz fourth-order cumulants multiple signal classification (TFOC-MUSIC) algorithm is proposed by combining a fast MUSIC-like algorithm, termed the modified fourth-order cumulants MUSIC (MFOC-MUSIC) algorithm, with Toeplitz approximation. In the proposed algorithm, the redundant information in the cumulants is removed. Besides, the computational complexity is reduced due to the decreased dimension of the fourth-order cumulants matrix, which is equal to the number of virtual array elements. That is, the effective array aperture of a physical array remains unchanged. However, due to finite sampling snapshots, there exists an estimation error in the reduced-rank FOC matrix and thus the DOA estimation capability degrades. In order to improve the estimation performance, Toeplitz approximation is introduced to recover the Toeplitz structure of the reduced-dimension FOC matrix, just like the ideal matrix, whose Toeplitz structure yields optimal estimation results. The theoretical formulas of the proposed algorithm are derived, and the simulation results are presented. From the simulations, in comparison with the MFOC-MUSIC algorithm, it is concluded that the TFOC-MUSIC algorithm yields an excellent performance in both spatially white noise and spatially colored noise environments.
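
    A minimal sketch of the Toeplitz approximation step, implemented here as the standard projection that averages a matrix along its diagonals; this generic projection is an assumption and not necessarily the exact estimator used in the paper:

        import numpy as np

        def toeplitz_project(M):
            # Replace every entry by the mean of its diagonal; the result is the
            # nearest Toeplitz matrix in the Frobenius norm.
            n = M.shape[0]
            T = np.empty_like(M)
            for k in range(-(n - 1), n):
                d = np.diagonal(M, offset=k).mean()
                idx = np.arange(n - abs(k))
                if k >= 0:
                    T[idx, idx + k] = d
                else:
                    T[idx - k, idx] = d
            return T

        M = np.arange(16, dtype=float).reshape(4, 4) + 1j * np.eye(4)
        print(np.round(toeplitz_project(M), 2))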

  17. Applying Data Clustering Feature to Speed Up Ant Colony Optimization

    Directory of Open Access Journals (Sweden)

    Chao-Yang Pang

    2014-01-01

    Full Text Available Ant colony optimization (ACO) is often used to solve optimization problems, such as the traveling salesman problem (TSP). When it is applied to the TSP, its runtime is proportional to the square of the problem size N, which makes it look less efficient. The following statistical feature was observed during the authors’ long-term gene data analysis using ACO: when the data size N becomes big, local clustering appears frequently. That is, some data cluster tightly in a small area and form a class, and the correlation between different classes is weak. This feature makes the idea of divide and rule feasible for estimating the solution of the TSP. In this paper an improved ACO algorithm is presented, which first divides all data into local clusters and calculates small TSP routes, and then assembles a big TSP route from them. Simulation shows that the presented method improves the running speed of ACO by a factor of 200 under the condition that the data set exhibits the local clustering feature.
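
    A minimal sketch of the divide-and-rule idea described above: points are grouped into local clusters, a small tour is built inside each cluster, and the cluster tours are concatenated into one route. The quadrant-based clustering and the nearest-neighbour sub-solver are illustrative stand-ins for the clustering step and the per-cluster ACO used in the paper:

        import numpy as np

        def nearest_neighbor_tour(points):
            # Stand-in for the per-cluster ACO: greedy nearest-neighbour tour.
            unvisited = list(range(1, len(points)))
            tour = [0]
            while unvisited:
                last = points[tour[-1]]
                nxt = min(unvisited, key=lambda i: np.linalg.norm(points[i] - last))
                unvisited.remove(nxt)
                tour.append(nxt)
            return tour

        def cluster_then_route(points, labels):
            # Solve each cluster separately, then concatenate the cluster tours in
            # order of the angle of their centroids around the global centre.
            points = np.asarray(points, dtype=float)
            clusters = {c: np.where(labels == c)[0] for c in np.unique(labels)}
            centre = points.mean(axis=0)

            def angle(c):
                d = points[clusters[c]].mean(axis=0) - centre
                return np.arctan2(d[1], d[0])

            route = []
            for c in sorted(clusters, key=angle):
                idx = clusters[c]
                route.extend(int(idx[i]) for i in nearest_neighbor_tour(points[idx]))
            return route

        pts = np.random.rand(60, 2)
        labels = (pts[:, 0] > 0.5).astype(int) * 2 + (pts[:, 1] > 0.5).astype(int)  # 4 "clusters"
        print(cluster_then_route(pts, labels))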

  18. The optimal algorithm for Multi-source RS image fusion.

    Science.gov (United States)

    Fu, Wei; Huang, Shui-Guang; Li, Zeng-Shun; Shen, Hao; Li, Jun-Shuai; Wang, Peng-Yuan

    2016-01-01

    In order to solve the issue that the fusion rules cannot be self-adaptively adjusted by the available fusion methods according to the subsequent processing requirements of a Remote Sensing (RS) image, this paper puts forward GSDA (genetic-iterative self-organizing data analysis algorithm), which integrates the merits of the genetic algorithm with the advantages of the iterative self-organizing data analysis algorithm for multi-source RS image fusion. The proposed algorithm takes the translation-invariant wavelet transform as the model operator and regards the contrast pyramid transform as the observation operator. The algorithm then designs the objective function by making use of a weighted sum of evaluation indices, and optimizes the objective function by employing GSDA so as to obtain a higher-resolution RS image. The main points of the text are summarized as follows: (i) the contribution proposes the iterative self-organizing data analysis algorithm for multi-source RS image fusion; (ii) this article presents the GSDA algorithm for the self-adaptive adjustment of the fusion rules; (iii) this text puts forward the model operator and the observation operator as the fusion scheme for RS images based on GSDA. The proposed algorithm opens up a novel algorithmic pathway for multi-source RS image fusion by means of GSDA.

  19. Green cloud environment by using robust planning algorithm

    Directory of Open Access Journals (Sweden)

    Jyoti Thaman

    2017-11-01

    Full Text Available Cloud computing provides a framework for seamless access to resources through a network. Access to resources is quantified through SLAs between service providers and users. Service providers try to make the best use of their resources and reduce their idle time. Growing energy concerns make the task of service providers even harder. Users' requests are served by allocating their tasks to resources in cloud and grid environments through scheduling and planning algorithms. With only a few planning algorithms in existence, planning and scheduling algorithms are rarely differentiated. This paper proposes a robust hybrid planning algorithm, Robust Heterogeneous-Earliest-Finish-Time (RHEFT), for binding tasks to VMs. The allocation of tasks to VMs is based on a novel task matching algorithm called Interior Scheduling. The consistent performance of the proposed RHEFT algorithm is compared with Heterogeneous-Earliest-Finish-Time (HEFT) and Distributed HEFT (DHEFT) for various parameters such as utilization ratio, makespan, speed-up and energy consumption. RHEFT’s consistent performance against HEFT and DHEFT has established the robustness of the hybrid planning algorithm through rigorous simulations.

  20. Runtime analysis of the 1-ANT ant colony optimizer

    DEFF Research Database (Denmark)

    Doerr, Benjamin; Neumann, Frank; Sudholt, Dirk

    2011-01-01

    The runtime analysis of randomized search heuristics is a growing field where, in the last two decades, many rigorous results have been obtained. First runtime analyses of ant colony optimization (ACO) have been conducted only recently. In these studies simple ACO algorithms such as the 1-ANT … that give us a more detailed impression of the 1-ANT’s performance. Furthermore, the experiments also deal with the question whether using many ant solutions in one iteration can decrease the total runtime.

  1. Improved Parallel Three-List Algorithm for the Knapsack Problem without Memory Conflicts

    Institute of Scientific and Technical Information of China (English)

    Pan Jun; Li Kenli; Li Qinghua

    2006-01-01

    Based on the two-list algorithm and the parallel three-list algorithm, an improved parallel three-list algorithm for the knapsack problem is proposed, in which the method of divide and conquer, and parallel merging without memory conflicts are adopted. To find a solution for the n-element knapsack problem, the proposed algorithm needs O(2^(3n/8)) time when O(2^(3n/8)) shared memory units and O(2^(n/4)) processors are available. The comparisons between the proposed algorithm and 10 existing algorithms show that the improved parallel three-list algorithm is the first exclusive-read exclusive-write (EREW) parallel algorithm that can solve the knapsack instances in less than O(2^(n/2)) time when the available hardware resource is smaller than O(2^(n/2)), and hence is an improved result over past research.
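
    For background, a minimal sketch of the classical sequential two-list (meet-in-the-middle) idea on which the parallel two- and three-list algorithms build, shown here for the subset-sum form of the knapsack problem; the parallel merging and EREW memory layout of the paper are not reproduced:

        from bisect import bisect_right
        from itertools import combinations

        def subset_sums(weights):
            # All achievable sums of one half of the items (2^(n/2) of them).
            return sorted({sum(c) for r in range(len(weights) + 1)
                           for c in combinations(weights, r)})

        def best_fill(weights, capacity):
            # Two-list idea: enumerate each half, sort both lists, and for every sum
            # in one list binary-search the best partner in the other that still fits.
            half = len(weights) // 2
            left, right = subset_sums(weights[:half]), subset_sums(weights[half:])
            best = 0
            for s in left:
                if s > capacity:
                    break
                j = bisect_right(right, capacity - s) - 1
                best = max(best, s + right[j])
            return best

        print(best_fill([8, 6, 5, 4, 3, 2], capacity=13))  # 13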

  2. Artificial root foraging optimizer algorithm with hybrid strategies

    Directory of Open Access Journals (Sweden)

    Yang Liu

    2017-02-01

    Full Text Available In this work, a new plant-inspired optimization algorithm, namely the hybrid artificial root foraging optimization (HARFO), is proposed, which mimics iterative root foraging behaviors for complex optimization. In the HARFO model, two innovative strategies were developed: one is the root-to-root communication strategy, which enables individuals to exchange information with each other through different efficient topologies and can essentially improve the exploration ability; the other is the co-evolution strategy, which structures a hierarchical spatial population driven by the evolutionary pressure of multiple sub-populations, ensuring that the diversity of the root population is well maintained. The proposed algorithm is benchmarked against four classical evolutionary algorithms on well-designed test function suites including both classical and composition test functions. The rigorous performance analysis of all these tests highlights the significant performance improvement, and the comparative results show the superiority of the proposed algorithm.

  3. Energy demand projection of China using a path-coefficient analysis and PSO–GA approach

    International Nuclear Information System (INIS)

    Yu Shiwei; Zhu Kejun; Zhang Xian

    2012-01-01

    Highlights: ► The effect mechanism of China’s energy demand is investigated in detail. ► A hybrid PSO–GA algorithm for an optimal energy demand estimating model for China. ► China’s energy demand will reach 4.48 billion tce in 2015. ► The proposed method’s forecast shows its superiority compared with others. - Abstract: Energy demand projection is fundamental to rational energy planning formulation. The present study investigates the direct and indirect effects of five factors, namely GDP, population, proportion of industry, proportion of urban population and coal share of total energy consumption, on China’s energy demand, implementing a path-coefficient analysis. On this basis, a hybrid algorithm, the Particle Swarm Optimization and Genetic Algorithm optimal Energy Demand Estimating (PSO–GA EDE) model, is proposed for China. The coefficients of the three forms of the model (linear, exponential and quadratic) are optimized by the proposed PSO–GA. To obtain a combinational prediction from the three forms, a departure coefficient method is applied to get the combinational weights. The results show that China’s energy demand will be 4.48 billion tce in 2015. Furthermore, the proposed method’s forecast shows its superiority compared with other single optimization methods such as GA, PSO or ACO, and with multiple linear regression.
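
    A minimal sketch of combining the linear, exponential and quadratic model forms into a single forecast; the coefficient values and the inverse-error weighting below are illustrative placeholders for the PSO–GA-optimized coefficients and the departure-coefficient weights of the paper:

        import numpy as np

        def linear_model(x, a):      return a[0] + x @ a[1:]
        def exponential_model(x, a): return a[0] * np.exp(x @ a[1:])
        def quadratic_model(x, a):   return a[0] + x @ a[1:6] + (x ** 2) @ a[6:]

        def combined_forecast(x, params, errors):
            # Weight each form by the inverse of its historical error (a stand-in
            # for the departure coefficient method).
            forms = (linear_model, exponential_model, quadratic_model)
            w = 1.0 / np.asarray(errors, dtype=float)
            w /= w.sum()
            return sum(wi * f(x, p) for wi, f, p in zip(w, forms, params))

        # Five illustrative factor values: GDP, population, industry share, urban share, coal share.
        x = np.array([9.0, 1.35, 0.46, 0.52, 0.68])
        params = (np.full(6, 0.1), np.r_[0.5, np.full(5, 0.05)], np.full(11, 0.02))
        print(combined_forecast(x, params, errors=[0.03, 0.05, 0.04]))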

  4. Wolf Search Algorithm for Solving Optimal Reactive Power Dispatch Problem

    Directory of Open Access Journals (Sweden)

    Kanagasabai Lenin

    2015-03-01

    Full Text Available This paper presents a new bio-inspired heuristic optimization algorithm called the Wolf Search Algorithm (WSA) for solving the multi-objective reactive power dispatch problem. The Wolf Search Algorithm is a new bio-inspired heuristic algorithm based on wolf preying behaviour. The way wolves search for food and survive by avoiding their enemies has been imitated to formulate an algorithm for solving the reactive power dispatch problem. A speciality of the wolf is that it possesses both an individual local searching ability and an autonomous flocking movement, and this special property has been utilized in formulating the search algorithm. The proposed WSA has been tested on the standard IEEE 30-bus test system, and the simulation results clearly show the good performance of the proposed algorithm.

  5. Extending lifetime of wireless sensor networks using multi-sensor ...

    Indian Academy of Sciences (India)

    SOUMITRA DAS

    In this paper a multi-sensor data fusion approach for wireless sensor network based on bayesian methods and ant colony ... niques for efficiently routing the data from source to the BS ... Literature review ... efficient scheduling and lot more to increase the lifetime of ... Nature-inspired algorithms such as ACO algorithms have.

  6. A modified genetic algorithm with fuzzy roulette wheel selection for job-shop scheduling problems

    Science.gov (United States)

    Thammano, Arit; Teekeng, Wannaporn

    2015-05-01

    The job-shop scheduling problem is one of the most difficult production planning problems. Since it is in the NP-hard class, a recent trend in solving the job-shop scheduling problem is shifting towards the use of heuristic and metaheuristic algorithms. This paper proposes a novel metaheuristic algorithm, which is a modification of the genetic algorithm. This proposed algorithm introduces two new concepts to the standard genetic algorithm: (1) fuzzy roulette wheel selection and (2) the mutation operation with tabu list. The proposed algorithm has been evaluated and compared with several state-of-the-art algorithms in the literature. The experimental results on 53 JSSPs show that the proposed algorithm is very effective in solving the combinatorial optimization problems. It outperforms all state-of-the-art algorithms on all benchmark problems in terms of the ability to achieve the optimal solution and the computational time.
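
    For context, a minimal sketch of the standard (crisp) roulette-wheel selection operator that the paper replaces with a fuzzy variant; the fitness-to-weight conversion for a minimization objective such as makespan is an illustrative choice, and the fuzzy membership design itself is not reproduced:

        import random

        def roulette_wheel_select(population, fitness_values, minimize=True):
            # Convert fitness so that better (smaller) makespans get larger slices.
            worst = max(fitness_values)
            weights = ([worst - f + 1e-9 for f in fitness_values]
                       if minimize else list(fitness_values))
            r = random.uniform(0.0, sum(weights))
            acc = 0.0
            for individual, w in zip(population, weights):
                acc += w
                if acc >= r:
                    return individual
            return population[-1]

        pop = [[3, 1, 2], [1, 2, 3], [2, 3, 1]]   # toy job permutations
        makespans = [17.0, 12.0, 15.0]            # pretend objective values
        print(roulette_wheel_select(pop, makespans))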

  7. Two-Step Proximal Gradient Algorithm for Low-Rank Matrix Completion

    Directory of Open Access Journals (Sweden)

    Qiuyu Wang

    2016-06-01

    Full Text Available In this paper, we propose a two-step proximal gradient algorithm to solve nuclear norm regularized least squares for the purpose of recovering a low-rank data matrix from a sampling of its entries. Each iterate generated by the proposed algorithm is a combination of the latest three points, namely, the previous point, the current iterate, and its proximal gradient point. This algorithm preserves the computational simplicity of the classical proximal gradient algorithm, where a singular value decomposition in the proximal operator is involved. Global convergence follows directly from results in the literature. Numerical results are reported to show the efficiency of the algorithm.
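
    A minimal sketch of the proximal operator at the heart of this method, singular value soft-thresholding, used inside a plain (one-step) proximal gradient iteration for matrix completion; the step size, the threshold and the omission of the three-point combination are simplifying assumptions:

        import numpy as np

        def svt(M, tau):
            # Proximal operator of the nuclear norm: soft-threshold the singular values.
            U, s, Vt = np.linalg.svd(M, full_matrices=False)
            return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

        def prox_grad_step(X, M_obs, mask, step, tau):
            # One step for 0.5 * ||mask * (X - M_obs)||_F^2 + tau * ||X||_*
            grad = mask * (X - M_obs)
            return svt(X - step * grad, step * tau)

        rng = np.random.default_rng(0)
        M = rng.standard_normal((20, 3)) @ rng.standard_normal((3, 20))  # rank-3 target
        mask = rng.random((20, 20)) < 0.5                                # observed entries
        X = np.zeros_like(M)
        for _ in range(200):
            X = prox_grad_step(X, M * mask, mask, step=1.0, tau=0.1)
        print(np.linalg.norm((X - M) * mask) / np.linalg.norm(M * mask))  # observed-entry fit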

  8. Arc-Search Infeasible Interior-Point Algorithm for Linear Programming

    OpenAIRE

    Yang, Yaguang

    2014-01-01

    Mehrotra's algorithm has been the most successful infeasible interior-point algorithm for linear programming since 1990. Most popular interior-point software packages for linear programming are based on Mehrotra's algorithm. This paper proposes an alternative algorithm, arc-search infeasible interior-point algorithm. We will demonstrate, by testing Netlib problems and comparing the test results obtained by arc-search infeasible interior-point algorithm and Mehrotra's algorithm, that the propo...

  9. Endovascular repair of an aorto-iliac aneurysm succeeded by kidney transplantation

    Directory of Open Access Journals (Sweden)

    Marcelo Bellini Dalio

    2010-09-01

    Full Text Available We present the case of an aorto-iliac aneurysm in a patient with chronic renal failure requiring dialysis who was treated with an endovascular stent graft and, later on, submitted to kidney transplantation. A 53-year-old male with renal failure requiring dialysis presented with an asymptomatic abdominal aorto-iliac aneurysm measuring 5.0 cm in diameter. He was treated with the endovascular repair technique, using an Excluder® endoprosthesis. After four months, he was successfully submitted to kidney transplantation (deceased donor), with anastomosis of the graft renal artery to the external iliac artery distal to the endoprosthesis. Magnetic resonance imaging, carried out 30 days after the procedure, showed good positioning of the endoprosthesis and adequate perfusion of the renal graft. At follow-up, the patient presented improvement of nitrogenous waste levels, good positioning of the endoprosthesis, and no migration or endoleak. The endovascular repair of an aorto-iliac aneurysm in a patient with end-stage renal failure under hemodialysis treatment proved to be feasible, safe and efficient, as it did not prevent the success of the subsequent kidney transplantation.

  10. Quantum algorithm for support matrix machines

    Science.gov (United States)

    Duan, Bojia; Yuan, Jiabin; Liu, Ying; Li, Dan

    2017-09-01

    We propose a quantum algorithm for support matrix machines (SMMs) that efficiently addresses an image classification problem by introducing a least-squares reformulation. This algorithm consists of two core subroutines: a quantum matrix inversion (Harrow-Hassidim-Lloyd, HHL) algorithm and a quantum singular value thresholding (QSVT) algorithm. The two algorithms can be implemented on a universal quantum computer with complexity O[log(npq)] and O[log(pq)], respectively, where n is the number of training data points and p × q is the size of the feature space. By iterating the algorithms, we can find the parameters for the SMM classification model. Our analysis shows that both the HHL and QSVT algorithms achieve an exponential speedup over their classical counterparts.

  11. A Modified Artificial Bee Colony Algorithm for p-Center Problems

    Directory of Open Access Journals (Sweden)

    Alkın Yurtkuran

    2014-01-01

    Full Text Available The objective of the p-center problem is to locate p-centers on a network such that the maximum of the distances from each node to its nearest center is minimized. The artificial bee colony algorithm is a swarm-based meta-heuristic algorithm that mimics the foraging behavior of honey bee colonies. This study proposes a modified ABC algorithm that benefits from a variety of search strategies to balance exploration and exploitation. Moreover, random key-based coding schemes are used to solve the p-center problem effectively. The proposed algorithm is compared to state-of-the-art techniques using different benchmark problems, and computational results reveal that the proposed approach is very efficient.
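
    A minimal sketch of the p-center objective together with the random-key decoding mentioned above (the p largest keys mark the chosen centers); the distance matrix and key values are illustrative, and the ABC search loop itself is not reproduced:

        import numpy as np

        def decode_random_keys(keys, p):
            # Random-key coding: the p nodes with the largest keys become centers.
            return np.argsort(keys)[-p:]

        def p_center_cost(dist, centers):
            # Maximum over all nodes of the distance to the nearest chosen center.
            return dist[:, centers].min(axis=1).max()

        rng = np.random.default_rng(1)
        pts = rng.random((30, 2))
        dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
        keys = rng.random(30)                      # one key per node (a candidate solution)
        centers = decode_random_keys(keys, p=3)
        print(centers, p_center_cost(dist, centers))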

  12. A novel highly parallel algorithm for linearly unmixing hyperspectral images

    Science.gov (United States)

    Guerra, Raúl; López, Sebastián.; Callico, Gustavo M.; López, Jose F.; Sarmiento, Roberto

    2014-10-01

    Endmember extraction and abundance calculation represent critical steps within the process of linearly unmixing a given hyperspectral image, for two main reasons. The first one is the need to compute a set of accurate endmembers in order to further obtain confident abundance maps. The second one refers to the huge amount of operations involved in these time-consuming processes. This work proposes an algorithm to estimate the endmembers of a hyperspectral image under analysis and its abundances at the same time. The main advantages of this algorithm are its high degree of parallelization and the mathematical simplicity of the operations implemented. This algorithm estimates the endmembers as virtual pixels. In particular, the proposed algorithm performs gradient descent to iteratively refine the endmembers and the abundances, reducing the mean square error, in accordance with the linear unmixing model. Some mathematical restrictions must be added so that the method converges to a unique and realistic solution. Given the nature of the algorithm, these restrictions can be easily implemented. The results obtained with synthetic images demonstrate the good behavior of the proposed algorithm. Moreover, the results obtained with the well-known Cuprite dataset also corroborate the benefits of our proposal.
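
    A minimal sketch of the abundance update under the linear mixing model Y ≈ EA: a gradient step on the squared reconstruction error followed by a crude projection enforcing non-negativity and the sum-to-one constraint. The joint endmember refinement of the proposed algorithm is not reproduced, and the step size is an illustrative choice:

        import numpy as np

        def update_abundances(Y, E, A, step=1e-3):
            # Gradient of 0.5 * ||Y - E A||_F^2 with respect to A, then a simple
            # projection onto non-negative columns that sum to one.
            A = np.maximum(A - step * (E.T @ (E @ A - Y)), 0.0)
            return A / np.maximum(A.sum(axis=0, keepdims=True), 1e-12)

        rng = np.random.default_rng(2)
        E_true = rng.random((50, 4))                     # 50 bands, 4 endmembers
        A_true = rng.dirichlet(np.ones(4), size=200).T   # abundances sum to one per pixel
        Y = E_true @ A_true
        A = np.full_like(A_true, 0.25)
        for _ in range(500):
            A = update_abundances(Y, E_true, A)
        print(np.abs(A - A_true).mean())                 # decreases as the update iterates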

  13. Electricity Load Forecasting Using Support Vector Regression with Memetic Algorithms

    Directory of Open Access Journals (Sweden)

    Zhongyi Hu

    2013-01-01

    Full Text Available Electricity load forecasting is an important issue that is widely explored and examined in the power systems operation literature and in the literature on commercial transactions in electricity markets as well. Among the existing forecasting models, support vector regression (SVR) has gained much attention. Considering that the performance of SVR highly depends on its parameters, this study proposes a firefly algorithm (FA) based memetic algorithm (FA-MA) to appropriately determine the parameters of the SVR forecasting model. In the proposed FA-MA algorithm, the FA is applied to explore the solution space, and pattern search is used to conduct individual learning and thus enhance the exploitation of the FA. Experimental results confirm that the proposed FA-MA based SVR model can not only yield more accurate forecasting results than the other four evolutionary algorithm based SVR models and three well-known forecasting models, but also outperform the hybrid algorithms in the related existing literature.

  14. Ensemble of hybrid genetic algorithm for two-dimensional phase unwrapping

    Science.gov (United States)

    Balakrishnan, D.; Quan, C.; Tay, C. J.

    2013-06-01

    Phase unwrapping is the final and trickiest step in any phase retrieval technique. Phase unwrapping by artificial intelligence methods (optimization algorithms) such as the hybrid genetic algorithm, reverse simulated annealing, particle swarm optimization, and minimum cost matching has shown better results than conventional phase unwrapping methods. In this paper, an ensemble of hybrid genetic algorithms with parallel populations is proposed to solve the branch-cut phase unwrapping problem. In a single-population hybrid genetic algorithm, the selection, cross-over and mutation operators are applied to obtain a new population in every generation. The parameters and choice of operators will affect the performance of the hybrid genetic algorithm. The ensemble of hybrid genetic algorithms makes it possible to have different parameter sets and different choices of operators simultaneously. Each population uses a different set of parameters, and the offspring of each population compete against the offspring of all other populations, which use different sets of parameters. The effectiveness of the proposed algorithm is demonstrated by phase unwrapping examples, and the advantages of the proposed method are discussed.

  15. A DIFFERENTIAL EVOLUTION ALGORITHM DEVELOPED FOR A NURSE SCHEDULING PROBLEM

    Directory of Open Access Journals (Sweden)

    Shahnazari-Shahrezaei, P.

    2012-11-01

    Full Text Available Nurse scheduling is a type of manpower allocation problem that tries to satisfy hospital managers' objectives and nurses' preferences as much as possible by generating fair shift schedules. This paper presents a nurse scheduling problem based on a real case study, and proposes two meta-heuristics, a differential evolution algorithm (DE) and a greedy randomised adaptive search procedure (GRASP), to solve it. To investigate the efficiency of the proposed algorithms, two problems are solved. Furthermore, some comparison metrics are applied to examine the reliability of the proposed algorithms. The computational results in this paper show that the proposed DE outperforms the GRASP.

  16. A simple algorithm for calculating the area of an arbitrary polygon

    Directory of Open Access Journals (Sweden)

    K.R. Wijeweera

    2017-06-01

    Full Text Available Computing the area of an arbitrary polygon is a popular problem in pure mathematics. The two methods used are the Shoelace Method (SM) and the Orthogonal Trapezoids Method (OTM). In OTM, the polygon is partitioned into trapezoids by drawing either horizontal or vertical lines through its vertices; the area of each trapezoid is computed and the resulting areas are added up. In SM, a formula which is a generalization of Green's Theorem for the discrete case is used. Most of the available systems are based on SM. Since an algorithm for OTM is not available in the literature, this paper proposes an algorithm for OTM along with an efficient implementation. Conversion of a pure mathematical method into an efficient computer program is not straightforward. In order to reduce the run time, minimal computation needs to be achieved; handling indeterminate forms and special cases separately can support this. On the other hand, precision error should also be avoided. A salient feature of the proposed algorithm is that it successfully handles these situations while achieving minimum run time. Experimental results of the proposed method are compared against those of the existing algorithm. The proposed algorithm also suggests a way to partition a polygon into orthogonal trapezoids, which is not an easy task. Additionally, the proposed algorithm uses only basic mathematical concepts, while Green's theorem uses more complicated mathematical concepts. The proposed algorithm can be used when simplicity is more important than speed.
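
    For reference, a minimal implementation of the Shoelace Method mentioned above is sketched below for a simple (non-self-intersecting) polygon; it does not reproduce the OTM algorithm or the special-case handling discussed in the paper.

```python
def polygon_area(vertices):
    """Area of a simple polygon via the shoelace formula.
    `vertices` is a list of (x, y) tuples in order (either orientation)."""
    n = len(vertices)
    acc = 0.0
    for i in range(n):
        x1, y1 = vertices[i]
        x2, y2 = vertices[(i + 1) % n]   # wrap around to the first vertex
        acc += x1 * y2 - x2 * y1
    return abs(acc) / 2.0

# unit square -> 1.0
print(polygon_area([(0, 0), (1, 0), (1, 1), (0, 1)]))
```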

  17. Proposed algorithm for the management of athletes with athletic pubalgia (sports hernia): a case series.

    Science.gov (United States)

    Kachingwe, Aimie F; Grech, Steven

    2008-12-01

    A case series of 6 athletes with a suspected sports hernia. Groin pain in athletes is common, and 1 source of groin pain is athletic pubalgia, or a sports hernia. Description of this condition and its management is scarce in the physical therapy literature. The purpose of this case series is to describe a conservative approach to treating athletes with a likely sports hernia and to provide physical therapists with an algorithm for managing athletes with this dysfunction. Six collegiate athletes (age range, 19-22 years; 4 males, 2 females) with a physician diagnosis of groin pain secondary to possible/probable sports hernia were referred to physical therapy. A method of evaluation was constructed and a cluster of 5 key findings indicative of a sports hernia is presented. The athletes were managed according to a proposed algorithm and received physical therapy consisting of soft tissue and joint mobilization/manipulation, neuromuscular re-education, manual stretching, and therapeutic exercise. Three of the athletes received conservative intervention and were able to fully return to sport after a mean of 7.7 sessions of physical therapy. The other 3 athletes reached this outcome after surgical repair and a mean of 6.7 sessions of physical therapy. Conservative management including manual therapy appears to be a viable option in the management of athletes with a sports hernia. Follow-up randomized clinical trials should be performed to further investigate the effectiveness of conservative rehabilitation compared to a homogeneous group of patients undergoing surgical repair for this condition. Therapy, level 4.

  18. Advanced defect detection algorithm using clustering in ultrasonic NDE

    Science.gov (United States)

    Gongzhang, Rui; Gachagan, Anthony

    2016-02-01

    A range of materials used in industry exhibit scattering properties which limit ultrasonic NDE. Many algorithms have been proposed to enhance defect detection ability, such as the well-known Split Spectrum Processing (SSP) technique. Scattering noise usually cannot be fully removed, and the remaining noise can easily be confused with real feature signals, hence becoming artefacts during the image interpretation stage. This paper presents an advanced algorithm to further reduce the influence of artefacts remaining in A-scan data after processing with a conventional defect detection algorithm. The raw A-scan data can be acquired from either traditional single-transducer or phased array configurations. The proposed algorithm uses the concept of unsupervised machine learning to cluster segmental defect signals from pre-processed A-scans into different classes. The distinction and similarity between each class and an ensemble of randomly selected noise segments can be observed by applying a classification algorithm. Each class is then labelled as 'legitimate reflector' or 'artefact' based on this observation, and the expected probability of detection (PoD) and probability of false alarm (PFA) are determined. To facilitate data collection and validate the proposed algorithm, a 5 MHz linear array transducer is used to collect A-scans from both austenitic steel and Inconel samples. Each pulse-echo A-scan is pre-processed using SSP, and the subsequent application of the proposed clustering algorithm provides an additional reduction in PFA while maintaining PoD for both samples, compared with SSP results alone.
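
    The clustering step can be illustrated with a generic k-means sketch on simple per-segment features; this is a stand-in for the unsupervised clustering used in the paper, and the feature choice, window sizes and synthetic A-scan below are illustrative assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

def segment_features(ascan, win=64, step=32):
    """Slide a window over a pre-processed A-scan and compute simple
    per-segment features (peak amplitude, energy); purely illustrative."""
    feats = []
    for start in range(0, len(ascan) - win + 1, step):
        seg = ascan[start:start + win]
        feats.append([np.max(np.abs(seg)), np.sum(seg ** 2)])
    return np.array(feats)

# synthetic example: noise with two embedded reflector-like pulses
rng = np.random.default_rng(1)
ascan = 0.1 * rng.standard_normal(2048)
ascan[500:520] += np.hanning(20)          # echo-like segment
ascan[1500:1520] += 0.8 * np.hanning(20)  # second echo-like segment

X = segment_features(ascan)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)   # segments containing the pulses fall into their own cluster
```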

  19. Maximum-entropy clustering algorithm and its global convergence analysis

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    Constructing a batch of differentiable entropy functions to uniformly approximate an objective function by means of the maximum-entropy principle, a new clustering algorithm, called the maximum-entropy clustering algorithm, is proposed based on optimization theory. This algorithm is a soft generalization of the hard C-means algorithm and possesses global convergence. Its relations with other clustering algorithms are discussed.
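
    The flavour of such a method can be sketched with an entropy-regularized soft clustering update, in which memberships follow a Boltzmann form and the assignments approach hard C-means as the temperature goes to zero; this is a generic sketch of the idea, not the exact algorithm analysed in the paper, and the temperature and iteration count are illustrative.

```python
import numpy as np

def max_entropy_clustering(X, k, temperature=1.0, n_iter=100, seed=0):
    """Soft clustering with entropy-regularized memberships:
    u_ij is proportional to exp(-||x_i - c_j||^2 / T)."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(n_iter):
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)  # (n, k) distances
        logits = -d2 / temperature
        logits -= logits.max(axis=1, keepdims=True)        # numerical stability
        u = np.exp(logits)
        u /= u.sum(axis=1, keepdims=True)                  # memberships sum to 1
        centers = (u.T @ X) / u.sum(axis=0)[:, None]       # membership-weighted means
    return centers, u

X = np.vstack([np.random.randn(50, 2), np.random.randn(50, 2) + 5.0])
centers, u = max_entropy_clustering(X, k=2, temperature=0.5)
print(centers)   # two centers, one near each synthetic cluster
```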

  20. An Agent-Based Co-Evolutionary Multi-Objective Algorithm for Portfolio Optimization

    Directory of Open Access Journals (Sweden)

    Rafał Dreżewski

    2017-08-01

    Full Text Available Algorithms based on the process of natural evolution are widely used to solve multi-objective optimization problems. In this paper we propose the agent-based co-evolutionary algorithm for multi-objective portfolio optimization. The proposed technique is compared experimentally to the genetic algorithm, co-evolutionary algorithm and a more classical approach—the trend-following algorithm. During the experiments historical data from the Warsaw Stock Exchange is used in order to assess the performance of the compared algorithms. Finally, we draw some conclusions from these experiments, showing the strong and weak points of all the techniques.

  1. A Learning Algorithm for Multimodal Grammar Inference.

    Science.gov (United States)

    D'Ulizia, A; Ferri, F; Grifoni, P

    2011-12-01

    The high cost of developing and maintaining multimodal grammars for integrating and understanding input in multimodal interfaces leads to the investigation of novel algorithmic solutions for automating grammar generation and updating. Many algorithms for context-free grammar inference have been developed in the natural language processing literature. An extension of these algorithms toward the inference of multimodal grammars is necessary for multimodal input processing. In this paper, we propose a novel grammar inference mechanism that allows us to learn a multimodal grammar from positive samples of multimodal sentences. The algorithm first generates the multimodal grammar that is able to parse the positive sample sentences and afterward makes use of two learning operators and the minimum description length metric to improve the grammar description and to avoid the over-generalization problem. The experimental results highlight the acceptable performance of the proposed algorithm, which has a very high probability of parsing valid sentences.

  2. Parallel image encryption algorithm based on discretized chaotic map

    International Nuclear Information System (INIS)

    Zhou Qing; Wong Kwokwo; Liao Xiaofeng; Xiang Tao; Hu Yue

    2008-01-01

    Recently, a variety of chaos-based algorithms have been proposed for image encryption. Nevertheless, none of them works efficiently in a parallel computing environment. In this paper, we propose a framework for parallel image encryption. Based on this framework, a new algorithm is designed using the discretized Kolmogorov flow map. It fulfills all the requirements for a parallel image encryption algorithm. Moreover, it is secure and fast. These properties make it a good choice for image encryption on parallel computing platforms.

  3. The bilinear complexity and practical algorithms for matrix multiplication

    Science.gov (United States)

    Smirnov, A. V.

    2013-12-01

    A method for deriving bilinear algorithms for matrix multiplication is proposed. New estimates for the bilinear complexity of a number of problems of the exact and approximate multiplication of rectangular matrices are obtained. In particular, the estimate for the boundary rank of multiplying 3 × 3 matrices is improved and a practical algorithm for the exact multiplication of square n × n matrices is proposed. The asymptotic arithmetic complexity of this algorithm is O(n^2.7743).
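
    As a concrete example of a bilinear matrix-multiplication algorithm, the sketch below implements Strassen's classic 7-multiplication scheme for 2 × 2 blocks (complexity about O(n^2.807)); it is included for illustration only and is not the improved 3 × 3 scheme proposed in the paper.

```python
import numpy as np

def strassen(A, B, cutoff=64):
    """Strassen's bilinear algorithm for square matrices whose size is a
    power of two; falls back to ordinary multiplication below `cutoff`."""
    n = A.shape[0]
    if n <= cutoff:
        return A @ B
    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
    M1 = strassen(A11 + A22, B11 + B22, cutoff)
    M2 = strassen(A21 + A22, B11, cutoff)
    M3 = strassen(A11, B12 - B22, cutoff)
    M4 = strassen(A22, B21 - B11, cutoff)
    M5 = strassen(A11 + A12, B22, cutoff)
    M6 = strassen(A21 - A11, B11 + B12, cutoff)
    M7 = strassen(A12 - A22, B21 + B22, cutoff)
    C = np.empty_like(A)
    C[:h, :h] = M1 + M4 - M5 + M7
    C[:h, h:] = M3 + M5
    C[h:, :h] = M2 + M4
    C[h:, h:] = M1 - M2 + M3 + M6
    return C

A = np.random.rand(128, 128)
B = np.random.rand(128, 128)
print(np.allclose(strassen(A, B), A @ B))   # True
```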

  4. Statistical Fractal Models Based on GND-PCA and Its Application on Classification of Liver Diseases

    Directory of Open Access Journals (Sweden)

    Huiyan Jiang

    2013-01-01

    Full Text Available A new method is proposed to establish a statistical fractal model for liver disease classification. Firstly, fractal theory is used to construct the high-order tensor, and then Generalized N-dimensional Principal Component Analysis (GND-PCA) is used to establish the statistical fractal model and select features from the liver region, with different features receiving different weights. Finally, a Support Vector Machine optimized by Ant Colony Optimization (ACO-SVM) is used to build the classifier for the recognition of liver disease. In order to verify the effectiveness of the proposed method, the PCA eigenface method and a standard SVM are chosen as the contrast methods. The experimental results show that the proposed method can reconstruct liver volume better and improve the classification accuracy of liver diseases.

  5. Algorithm for shortest path search in Geographic Information Systems by using reduced graphs.

    Science.gov (United States)

    Rodríguez-Puente, Rafael; Lazo-Cortés, Manuel S

    2013-01-01

    The use of Geographic Information Systems has increased considerably since the eighties and nineties. Shortest path search is one of their most demanding applications. Several studies about shortest path search show the feasibility of using graphs for this purpose. Dijkstra's algorithm is one of the classic shortest path search algorithms, but it is not well suited for shortest path search in large graphs. This is the reason why various modifications to Dijkstra's algorithm have been proposed by several authors, using heuristics to reduce the run time of shortest path search. One of the most used heuristic algorithms is the A* algorithm, whose main goal is to reduce the run time by reducing the search space. This article proposes a modification of Dijkstra's shortest path search algorithm in reduced graphs. It shows that the cost of the path found in this work is equal to the cost of the path found using Dijkstra's algorithm in the original graph. The results of finding the shortest path with the proposed algorithm, Dijkstra's algorithm and the A* algorithm are compared. This comparison shows that, by applying the proposed approach, it is possible to obtain the optimal path in a similar or even shorter time than when using heuristic algorithms.
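
    For reference, the baseline Dijkstra search discussed above can be written in a few lines with a binary heap; the reduced-graph construction proposed in the article is not reproduced here, and the toy road network is illustrative.

```python
import heapq

def dijkstra(graph, source):
    """Classic Dijkstra shortest-path search with a binary heap.
    `graph` maps node -> list of (neighbour, edge_weight) pairs."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

road_net = {"A": [("B", 2), ("C", 5)], "B": [("C", 1), ("D", 4)], "C": [("D", 1)], "D": []}
print(dijkstra(road_net, "A"))   # {'A': 0.0, 'B': 2.0, 'C': 3.0, 'D': 4.0}
```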

  6. A street rubbish detection algorithm based on Sift and RCNN

    Science.gov (United States)

    Yu, XiPeng; Chen, Zhong; Zhang, Shuo; Zhang, Ting

    2018-02-01

    This paper presents a street rubbish detection algorithm based on image registration with SIFT features and an RCNN-style region classifier. First, a CNN (convolutional neural network) is trained on a sample set consisting of rubbish and non-rubbish images. Second, for every clean street image, SIFT features are extracted and image registration with the real-time street image is performed to obtain a differential image; the differential image filters out a lot of background information, and the region proposals where rubbish may appear are obtained on the differential image by the selective search algorithm. Then, the CNN model is used to classify the image pixel data in each region proposal on the real-time street image. According to the output vector of the CNN, it is judged whether rubbish is present in the region proposal or not; if it is rubbish, the region proposal is marked on the real-time street image. This algorithm avoids a large number of false detections caused by detection on the whole image, because the CNN is applied only to the region proposals on the real-time street image where rubbish may appear. Unlike traditional region-proposal-based object detection algorithms, the region proposals are obtained on the differential image rather than on the whole real-time street image, so the number of invalid region proposals is greatly reduced. The algorithm achieves a high mean average precision (mAP).
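
    The registration-and-difference step can be sketched with OpenCV as below; this is a hedged illustration under the assumption of an opencv-python build that provides cv2.SIFT_create (version 4.4 or later), and the selective search and CNN classification stages are omitted. The file paths and the Lowe ratio threshold are illustrative.

```python
import cv2
import numpy as np

def difference_after_registration(clean_path, live_path):
    """Register the clean street image onto the live image using SIFT
    keypoints and a homography, then return the absolute difference image."""
    clean = cv2.imread(clean_path, cv2.IMREAD_GRAYSCALE)
    live = cv2.imread(live_path, cv2.IMREAD_GRAYSCALE)
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(clean, None)
    kp2, des2 = sift.detectAndCompute(live, None)
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des1, des2, k=2)
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]  # Lowe ratio test
    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    warped = cv2.warpPerspective(clean, H, (live.shape[1], live.shape[0]))
    return cv2.absdiff(live, warped)     # new objects show up as bright regions
```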

  7. Performance indices and evaluation of algorithms in building energy efficient design optimization

    International Nuclear Information System (INIS)

    Si, Binghui; Tian, Zhichao; Jin, Xing; Zhou, Xin; Tang, Peng; Shi, Xing

    2016-01-01

    Building energy efficient design optimization is an emerging technique that is increasingly being used to design buildings with better overall performance and a particular emphasis on energy efficiency. To achieve building energy efficient design optimization, algorithms are vital to generate new designs and thus drive the design optimization process. Therefore, the performance of algorithms is crucial to achieving effective energy efficient design techniques. This study evaluates algorithms used for building energy efficient design optimization. A set of performance indices, namely, stability, robustness, validity, speed, coverage, and locality, is proposed to evaluate the overall performance of algorithms. A benchmark building and a design optimization problem are also developed. Hooke–Jeeves algorithm, Multi-Objective Genetic Algorithm II, and Multi-Objective Particle Swarm Optimization algorithm are evaluated by using the proposed performance indices and benchmark design problem. Results indicate that no algorithm performs best in all six areas. Therefore, when facing an energy efficient design problem, the algorithm must be carefully selected based on the nature of the problem and the performance indices that matter the most. - Highlights: • Six indices of algorithm performance in building energy optimization are developed. • For each index, its concept is defined and the calculation formulas are proposed. • A benchmark building and benchmark energy efficient design problem are proposed. • The performance of three selected algorithms are evaluated.

  8. An Improved User Selection Algorithm in Multiuser MIMO Broadcast with Channel Prediction

    Science.gov (United States)

    Min, Zhi; Ohtsuki, Tomoaki

    In multiuser MIMO-BC (Multiple-Input Multiple-Output Broadcasting) systems, user selection is important to achieve multiuser diversity. The optimal user selection algorithm is to try all combinations of users to find the user group that can achieve the multiuser diversity. Unfortunately, the high calculation cost of the optimal algorithm prevents its implementation. Thus, instead of the optimal algorithm, some suboptimal user selection algorithms were proposed based on the semiorthogonality of user channel vectors. The purpose of this paper is to achieve multiuser diversity with a small amount of calculation. For this purpose, we propose a user selection algorithm that can improve the orthogonality of the selected user group. We also apply a channel prediction technique to the MIMO-BC system to obtain more accurate channel information at the transmitter. Simulation results show that the channel prediction can improve the accuracy of channel information for user selection, and that the proposed user selection algorithm achieves higher sum rate capacity than the SUS (Semiorthogonal User Selection) algorithm. We also discuss the setting of the algorithm threshold. In a discussion of calculation complexity, measured by the number of complex multiplications, the proposed algorithm is shown to have a calculation complexity almost equal to that of the SUS algorithm, and both are much lower than that of the optimal user selection algorithm.

  9. Automatic bounding estimation in modified NLMS algorithm

    International Nuclear Information System (INIS)

    Shahtalebi, K.; Doost-Hoseini, A.M.

    2002-01-01

    The Modified Normalized Least Mean Square algorithm, which is a sign form of NLMS based on set-membership (SM) theory in the class of optimal bounding ellipsoid (OBE) algorithms, requires a priori knowledge of the error bounds, which is unknown in most applications. For a special but common case of measurement noise, a simple bound-estimation algorithm is proposed. Through several simulation examples, the performance of the algorithm is compared with that of the Modified Normalized Least Mean Square algorithm.
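
    As background, the standard NLMS update that these set-membership variants build on is sketched below for a simple system identification task; the bound-estimation step described in the abstract is not reproduced, and the filter length, step size and toy system are illustrative.

```python
import numpy as np

def nlms(x, d, taps=8, mu=0.5, eps=1e-6):
    """Standard NLMS adaptive filter: identify a system from input x and
    desired output d. Returns the weight vector and the error signal."""
    w = np.zeros(taps)
    e = np.zeros(len(x))
    for n in range(taps - 1, len(x)):
        u = x[n - taps + 1:n + 1][::-1]      # [x[n], x[n-1], ..., x[n-taps+1]]
        y = w @ u
        e[n] = d[n] - y
        w += mu * e[n] * u / (eps + u @ u)   # step size normalized by input power
    return w, e

# identify a short FIR system from noisy observations
rng = np.random.default_rng(0)
h = np.array([0.5, -0.3, 0.2])
x = rng.standard_normal(5000)
d = np.convolve(x, h, mode="full")[:len(x)] + 0.01 * rng.standard_normal(len(x))
w, e = nlms(x, d, taps=3)
print(w)   # approaches [0.5, -0.3, 0.2]
```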

  10. Flood susceptibility mapping using novel ensembles of adaptive neuro fuzzy inference system and metaheuristic algorithms.

    Science.gov (United States)

    Razavi Termeh, Seyed Vahid; Kornejady, Aiding; Pourghasemi, Hamid Reza; Keesstra, Saskia

    2018-02-15

    Flood is one of the most destructive natural disasters, causing great financial and human losses every year. Therefore, producing susceptibility maps for flood management is necessary in order to reduce its harmful effects. The aim of the present study is to map flood hazard over the Jahrom Township in Fars Province using a combination of adaptive neuro-fuzzy inference systems (ANFIS) with different metaheuristic algorithms, such as ant colony optimization (ACO), genetic algorithm (GA), and particle swarm optimization (PSO), and to compare their accuracy. A total of 53 flood locations were identified, 35 of which were randomly selected to model flood susceptibility, and the remaining 16 locations were used to validate the models. Learning vector quantization (LVQ), one of the supervised neural network methods, was employed to estimate factor importance. Nine flood conditioning factors, namely slope degree, plan curvature, altitude, topographic wetness index (TWI), stream power index (SPI), distance from river, land use/land cover, rainfall, and lithology, were selected and the corresponding maps were prepared in ArcGIS. The frequency ratio (FR) model was used to assign weights to each class within each controlling factor; the weights were then transferred into MATLAB software for further analyses and combined with the metaheuristic models. The ANFIS-PSO was found to be the most practical model in terms of producing a highly focused flood susceptibility map with less spatial spread of the highly susceptible classes. The chi-square result attests the same, where the ANFIS-PSO had the highest spatial differentiation within flood susceptibility classes over the study area. The area under the curve (AUC) obtained from the ROC curve indicated an accuracy of 91.4%, 91.8%, 92.6% and 94.5% for the respective models of FR, ANFIS-ACO, ANFIS-GA, and ANFIS-PSO ensembles. So, the ensemble of ANFIS-PSO was introduced as the

  11. An Autonomous Star Identification Algorithm Based on One-Dimensional Vector Pattern for Star Sensors.

    Science.gov (United States)

    Luo, Liyan; Xu, Luping; Zhang, Hua

    2015-07-07

    In order to enhance the robustness and accelerate the recognition speed of star identification, an autonomous star identification algorithm for star sensors is proposed based on the one-dimensional vector pattern (one_DVP). In the proposed algorithm, the space geometry information of the observed stars is used to form the one-dimensional vector pattern of the observed star. The one-dimensional vector pattern of the same observed star remains unchanged when the stellar image rotates, so the problem of star identification is simplified as the comparison of the two feature vectors. The one-dimensional vector pattern is adopted to build the feature vector of the star pattern, which makes it possible to identify the observed stars robustly. The characteristics of the feature vector and the proposed search strategy for the matching pattern make it possible to achieve the recognition result as quickly as possible. The simulation results demonstrate that the proposed algorithm can effectively accelerate the star identification. Moreover, the recognition accuracy and robustness by the proposed algorithm are better than those by the pyramid algorithm, the modified grid algorithm, and the LPT algorithm. The theoretical analysis and experimental results show that the proposed algorithm outperforms the other three star identification algorithms.

  12. Blind Source Separation Based on Covariance Ratio and Artificial Bee Colony Algorithm

    Directory of Open Access Journals (Sweden)

    Lei Chen

    2014-01-01

    Full Text Available The computation amount of blind source separation based on bioinspired intelligence optimization is high. In order to solve this problem, we propose an effective blind source separation algorithm based on the artificial bee colony algorithm. In the proposed algorithm, the covariance ratio of the signals is utilized as the objective function, and the artificial bee colony algorithm is used to optimize it. The source signal component that has been separated out is then removed from the mixtures using the deflation method. All the source signals can be recovered successfully by repeating the separation process. Simulation experiments demonstrate that the proposed algorithm achieves a significant improvement in computation amount and in the quality of signal separation when compared to previous algorithms.

  13. Channel Parameter Estimation for Scatter Cluster Model Using Modified MUSIC Algorithm

    Directory of Open Access Journals (Sweden)

    Jinsheng Yang

    2012-01-01

    Full Text Available Recently, scatter cluster models which precisely evaluate the performance of wireless communication systems have been proposed in the literature. However, the conventional SAGE algorithm does not work for these scatter cluster-based models because it performs poorly when the transmitted signals are highly correlated. In this paper, we estimate the time of arrival (TOA), the direction of arrival (DOA), and the Doppler frequency for the scatter cluster model by a modified multiple signal classification (MUSIC) algorithm. Using the space-time characteristics of the multiray channel, the proposed algorithm combines temporal filtering techniques and spatial smoothing techniques to isolate and estimate the incoming rays. The simulation results indicate that the proposed algorithm has lower complexity and is less time-consuming in dense multipath environments than the SAGE algorithm. Furthermore, the estimation performance improves with the number of receive array elements and the sample length. Thus, the problem of channel parameter estimation for the scatter cluster model can be effectively addressed with the proposed modified MUSIC algorithm.
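
    For orientation, a standard (unmodified) MUSIC pseudospectrum for DOA estimation with a uniform linear array is sketched below; the temporal filtering and spatial smoothing extensions described in the paper are not included, and the array geometry, noise level and angles are illustrative.

```python
import numpy as np

def music_doa(snapshots, n_sources, d=0.5, grid=np.linspace(-90, 90, 361)):
    """MUSIC pseudospectrum for DOA estimation with a uniform linear array
    (element spacing d in wavelengths). `snapshots` is (n_elements, n_snapshots)."""
    m = snapshots.shape[0]
    R = snapshots @ snapshots.conj().T / snapshots.shape[1]   # sample covariance
    eigval, eigvec = np.linalg.eigh(R)                        # ascending eigenvalues
    En = eigvec[:, :m - n_sources]                            # noise subspace
    spectrum = []
    for theta in grid:
        a = np.exp(-2j * np.pi * d * np.arange(m) * np.sin(np.deg2rad(theta)))
        spectrum.append(1.0 / np.linalg.norm(En.conj().T @ a) ** 2)
    return grid, np.array(spectrum)

# two plane waves at -20 and 30 degrees impinging on an 8-element ULA
rng = np.random.default_rng(0)
m, snaps, angles = 8, 200, np.deg2rad([-20, 30])
A = np.exp(-2j * np.pi * 0.5 * np.outer(np.arange(m), np.sin(angles)))
S = rng.standard_normal((2, snaps)) + 1j * rng.standard_normal((2, snaps))
X = A @ S + 0.1 * (rng.standard_normal((m, snaps)) + 1j * rng.standard_normal((m, snaps)))
grid, P = music_doa(X, n_sources=2)
loc_max = [i for i in range(1, len(P) - 1) if P[i] > P[i - 1] and P[i] > P[i + 1]]
top2 = sorted(loc_max, key=lambda i: P[i])[-2:]
print(sorted(grid[i] for i in top2))   # approximately [-20.0, 30.0]
```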

  14. Efficient algorithms of multidimensional γ-ray spectra compression

    International Nuclear Information System (INIS)

    Morhac, M.; Matousek, V.

    2006-01-01

    Efficient algorithms for compressing multidimensional γ-ray events are presented. Two alternative kinds of compression algorithms, based on the adaptive orthogonal and randomizing transforms, are proposed. Both algorithms exploit the reduction of data volume due to the symmetry of the γ-ray spectra.

  15. Extraction of a fractured catheter by interventional cardiac catheterization in a premature patient weighing 1 600 g

    Directory of Open Access Journals (Sweden)

    Francisco Javier Ozores Suárez

    2011-06-01

    Full Text Available The case of a one-month-old premature patient weighing 1 600 g is reported, whose epicutaneous catheter fractured and migrated until it lodged distally in the left branch of the pulmonary artery. The catheter was removed via the femoral route by an interventional cardiac catheterization procedure, demonstrating the effectiveness of this procedure for this type of complication.

  16. A Multipath Routing Protocol Based on Clustering and Ant Colony Optimization for Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Jing Yang

    2010-05-01

    Full Text Available For monitoring burst events in a kind of reactive wireless sensor network (WSN), a multipath routing protocol (MRP) based on dynamic clustering and ant colony optimization (ACO) is proposed. Such an approach can maximize the network lifetime and reduce energy consumption. An important attribute of WSNs is their limited power supply, and therefore some metrics (such as the energy consumption of communication among nodes, residual energy, and path length) were considered as very important criteria while designing routing in the MRP. Firstly, a cluster head (CH) is selected among the nodes located in the event area according to parameters such as residual energy. Secondly, an improved ACO algorithm is applied in the search for multiple paths between the CH and the sink node. Finally, the CH dynamically chooses a route to transmit data with a probability that depends on several path metrics, such as energy consumption. The simulation results show that MRP can prolong the network lifetime, balance the energy consumption among nodes and reduce the average energy consumption effectively.
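
    The probabilistic path choice at the heart of such ACO-based routing can be sketched as the standard pheromone/heuristic roulette-wheel rule, as below; the pheromone update, clustering and energy model of MRP are not reproduced, and the weights alpha and beta, the node names and the table values are illustrative.

```python
import random

def choose_next_hop(current, candidates, pheromone, heuristic, alpha=1.0, beta=2.0):
    """Standard ACO transition rule: pick the next hop with probability
    proportional to pheromone^alpha * heuristic^beta. `pheromone` and
    `heuristic` map (current, neighbour) edges to positive values; the
    heuristic could encode residual energy or inverse distance."""
    weights = [(pheromone[(current, c)] ** alpha) * (heuristic[(current, c)] ** beta)
               for c in candidates]
    total = sum(weights)
    r, acc = random.uniform(0.0, total), 0.0
    for c, w in zip(candidates, weights):     # roulette-wheel selection
        acc += w
        if r <= acc:
            return c
    return candidates[-1]

# toy example: a cluster head 'CH' choosing among three neighbours
tau = {('CH', 'n1'): 1.0, ('CH', 'n2'): 0.4, ('CH', 'n3'): 0.8}
eta = {('CH', 'n1'): 0.9, ('CH', 'n2'): 0.5, ('CH', 'n3'): 0.7}
print(choose_next_hop('CH', ['n1', 'n2', 'n3'], tau, eta))
```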

  17. No-Wait Flexible Flow Shop Scheduling with Due Windows

    Directory of Open Access Journals (Sweden)

    Rong-Hwa Huang

    2015-01-01

    Full Text Available To improve capacity and reduce processing time, the flow shop with multiprocessors (FSMP) system is commonly used in glass, steel, and semiconductor production. No-wait FSMP is a production mode in which no waiting is allowed between consecutive operations of a job: the production process must be continuous and uninterrupted. Setup time must also be considered. Just-in-time (JIT) production is very popular in industry, and timely delivery is important to customer satisfaction. Therefore, it is essential to consider the time window constraint, which is also very complex. This study focuses on a no-wait FSMP problem with a time window constraint. An improved ant colony optimization (ACO), known as ant colony optimization with flexible update (ACOFU), is developed to solve the problem. The results demonstrate that ACOFU is more effective and robust than ACO when applied to small-scale problems, and has superior solution capability and robustness when applied to large-scale problems. Therefore, this study concludes that the proposed ACOFU algorithm performs excellently on the scheduling problem discussed in this study.

  18. Regularization iteration imaging algorithm for electrical capacitance tomography

    Science.gov (United States)

    Tong, Guowei; Liu, Shi; Chen, Hongyan; Wang, Xueyao

    2018-03-01

    The image reconstruction method plays a crucial role in real-world applications of the electrical capacitance tomography technique. In this study, a new cost function that simultaneously considers the sparsity and low-rank properties of the imaging targets is proposed to improve the quality of the reconstructed images, in which the image reconstruction task is converted into an optimization problem. Within the framework of the split Bregman algorithm, an iterative scheme that splits a complicated optimization problem into several simpler sub-tasks is developed to solve the proposed cost function efficiently, in which the fast iterative shrinkage-thresholding algorithm is introduced to accelerate the convergence. Numerical experiment results verify the effectiveness of the proposed algorithm in improving the reconstruction precision and robustness.

  19. Visual Perception Based Rate Control Algorithm for HEVC

    Science.gov (United States)

    Feng, Zeqi; Liu, PengYu; Jia, Kebin

    2018-01-01

    For HEVC, rate control is an indispensable video coding technology for alleviating the contradiction between video quality and limited encoding resources during video communication. However, the benchmark rate control algorithm of HEVC ignores subjective visual perception: for key focus regions, the bit allocation of LCUs is not ideal and the subjective quality is unsatisfactory. In this paper, a visual perception based rate control algorithm for HEVC is proposed. First, the LCU-level bit allocation weight is optimized based on the visual perception of luminance and motion to improve subjective video quality. Then λ and QP are adjusted in combination with the bit allocation weight to improve rate-distortion performance. Experimental results show that the proposed algorithm reduces BD-BR by 0.5% on average and by up to 1.09%, at no cost in bitrate accuracy, compared with HEVC (HM15.0). The proposed algorithm is devoted to improving subjective video quality under various video applications.

  20. A Novel Clustering Algorithm Inspired by Membrane Computing

    Directory of Open Access Journals (Sweden)

    Hong Peng

    2015-01-01

    Full Text Available P systems are a class of distributed parallel computing models. This paper presents a novel clustering algorithm, inspired by the mechanism of a tissue-like P system with a loop structure of cells, called the membrane clustering algorithm. The objects of the cells express the candidate cluster centers and are evolved by the evolution rules. Based on the loop membrane structure, the communication rules realize a local neighborhood topology, which helps the co-evolution of the objects and improves the diversity of objects in the system. The tissue-like P system can effectively search for the optimal partitioning with the help of its parallel computing advantage. The proposed clustering algorithm is evaluated on four artificial data sets and six real-life data sets. Experimental results show that the proposed clustering algorithm is superior or comparable to the k-means algorithm and several evolutionary clustering algorithms recently reported in the literature.

  1. A Motion Estimation Algorithm Using DTCWT and ARPS

    Directory of Open Access Journals (Sweden)

    Unan Y. Oktiawati

    2013-09-01

    Full Text Available In this paper, a hybrid motion estimation algorithm utilizing the Dual Tree Complex Wavelet Transform (DTCWT) and the Adaptive Rood Pattern Search (ARPS) block search is presented. The proposed algorithm first transforms each video sequence with the DTCWT. Frame n of the video sequence is used as a reference input, and frame n+2 is used to find the motion vector. Next, the ARPS block search algorithm is carried out, followed by an inverse DTCWT. Motion compensation is then carried out on each inverse-transformed frame n and motion vector. The results show that the PSNR can be improved for mobile devices without degrading visual quality. The proposed algorithm also uses less memory than the DCT-based algorithm. The main contribution of this work is a hybrid wavelet-based motion estimation algorithm for mobile devices. Another contribution is the visual quality scoring system as used in section 6.

  2. SeqCompress: an algorithm for biological sequence compression.

    Science.gov (United States)

    Sardaraz, Muhammad; Tahir, Muhammad; Ikram, Ataul Aziz; Bajwa, Hassan

    2014-10-01

    The growth of Next Generation Sequencing technologies presents significant research challenges, specifically to design bioinformatics tools that handle massive amounts of data efficiently. The cost of storing biological sequence data has become a noticeable proportion of the total cost of data generation and analysis. In particular, the increase in the DNA sequencing rate is significantly outstripping the rate of increase in disk storage capacity, and may go beyond the limits of storage capacity. It is essential to develop algorithms that handle large data sets via better memory management. This article presents a DNA sequence compression algorithm, SeqCompress, that copes with the space complexity of biological sequences. The algorithm is based on lossless data compression and uses a statistical model as well as arithmetic coding to compress DNA sequences. The proposed algorithm is compared with recent specialized compression tools for biological sequences. Experimental results show that the proposed algorithm has better compression gain than other existing algorithms. Copyright © 2014 Elsevier Inc. All rights reserved.

  3. Intermediate view reconstruction using adaptive disparity search algorithm for real-time 3D processing

    Science.gov (United States)

    Bae, Kyung-hoon; Park, Changhan; Kim, Eun-soo

    2008-03-01

    In this paper, intermediate view reconstruction (IVR) using an adaptive disparity search algorithm (ADSA) is proposed for real-time 3-dimensional (3D) processing. The proposed algorithm can reduce the processing time of disparity estimation by selecting an adaptive disparity search range, and it can also increase the quality of 3D imaging. That is, by adaptively predicting the mutual correlation between the stereo image pair using the proposed algorithm, the bandwidth of the stereo input image pair can be compressed to the level of a conventional 2D image, and a predicted image can also be effectively reconstructed using a reference image and disparity vectors. Experiments on the stereo sequences 'Pot Plant' and 'IVO' show that the proposed algorithm improves the PSNR of the reconstructed image by about 4.8 dB compared with conventional algorithms, and reduces the synthesis time of the reconstructed image to about 7.02 s compared with conventional algorithms.

  4. An algorithm of improving speech emotional perception for hearing aid

    Science.gov (United States)

    Xi, Ji; Liang, Ruiyu; Fei, Xianju

    2017-07-01

    In this paper, a speech emotion recognition (SER) algorithm is proposed to improve the emotional perception of hearing-impaired people. The algorithm utilizes multiple kernel technology to overcome a drawback of the SVM: slow training speed. Firstly, in order to improve the adaptive performance of the Gaussian Radial Basis Function (RBF), the parameter determining the nonlinear mapping is optimized on the basis of kernel target alignment. The obtained kernel function is then used as the basis kernel of Multiple Kernel Learning (MKL) with a slack variable that can solve the over-fitting problem. However, the slack variable also introduces error into the result; therefore, a soft-margin MKL is proposed to balance the margin against the error. Moreover, an iterative algorithm is used to solve for the combination coefficients and hyperplane equations. Experimental results show that the proposed algorithm can achieve an accuracy of 90% for five kinds of emotions, including happiness, sadness, anger, fear and neutral. Compared with KPCA+CCA and PIM-FSVM, the proposed algorithm has the highest accuracy.

  5. HC-IPSAG and GC-IPSAG algorithm proposals

    DEFF Research Database (Denmark)

    Bǎdoi, C.-I.; Croitoru, V.; Prasad, N.

    2010-01-01

    The cognitive radio (CR) technology was proposed as a solution for the lack of wireless resources in an environment with a rapidly growing number of users. The CR advances the dynamic utilization of the licensed users' unused spectrum channels by secondary users. The paper looks into large CR...

  6. Efficient Geo-Computational Algorithms for Constructing Space-Time Prisms in Road Networks

    Directory of Open Access Journals (Sweden)

    Hui-Ping Chen

    2016-11-01

    Full Text Available The space-time prism (STP) is a key concept in time geography for analyzing human activity-travel behavior under various space-time constraints. Most existing time-geographic studies use a straightforward algorithm to construct STPs in road networks by using two one-to-all shortest path searches. However, this straightforward algorithm can introduce considerable computational overhead, given that the accessible links in an STP are generally a small portion of the whole network. To address this issue, an efficient geo-computational algorithm, called NTP-A*, is proposed. The proposed NTP-A* algorithm employs the A* and branch-and-bound techniques to discard inaccessible links during the two shortest path searches, and thereby improves the STP construction performance. Comprehensive computational experiments are carried out to demonstrate the computational advantage of the proposed algorithm. Several implementation techniques, including the label-correcting technique and the hybrid link-node labeling technique, are discussed and analyzed. Experimental results show that the proposed NTP-A* algorithm can significantly improve STP construction performance in large-scale road networks, by a factor of 100, compared with existing algorithms.

  7. Parallel Directionally Split Solver Based on Reformulation of Pipelined Thomas Algorithm

    Science.gov (United States)

    Povitsky, A.

    1998-01-01

    In this research an efficient parallel algorithm for 3-D directionally split problems is developed. The proposed algorithm is based on a reformulated version of the pipelined Thomas algorithm that starts the backward step computations immediately after the completion of the forward step computations for the first portion of lines. This algorithm has data available for other computational tasks while processors are idle from the Thomas algorithm. The proposed 3-D directionally split solver is based on static scheduling of processors, where local and non-local, data-dependent and data-independent computations are scheduled while processors are idle. A theoretical model of parallelization efficiency is used to define optimal parameters of the algorithm, to show an asymptotic parallelization penalty and to obtain an optimal cover of a global domain with subdomains. It is shown by computational experiments and by the theoretical model that the proposed algorithm reduces the parallelization penalty by about a factor of two over the basic algorithm for the range of the number of processors (subdomains) considered and the number of grid nodes per subdomain.
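
    For background, the serial Thomas algorithm that the pipelined reformulation builds on is a forward elimination followed by backward substitution on a tridiagonal system; a minimal sketch is given below, with the pipelined scheduling itself omitted.

```python
import numpy as np

def thomas(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a, diagonal b,
    super-diagonal c and right-hand side d (all length n; a[0], c[-1] unused)."""
    n = len(d)
    cp, dp = np.empty(n), np.empty(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                       # forward elimination
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):              # backward substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# verify against a dense solve on a small 1-D Laplacian-like system
n = 6
a = np.r_[0.0, -np.ones(n - 1)]
b = 2.0 * np.ones(n)
c = np.r_[-np.ones(n - 1), 0.0]
d = np.arange(1.0, n + 1)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
print(np.allclose(thomas(a, b, c, d), np.linalg.solve(A, d)))   # True
```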

  8. An Enhanced Jaya Algorithm with a Two Group Adaption

    Directory of Open Access Journals (Sweden)

    Chibing Gong

    2017-01-01

    Full Text Available This paper proposes a novel performance-enhanced Jaya algorithm with a two-group adaption (E-Jaya). Two improvements are presented in E-Jaya. First, instead of using the best and the worst values as in the Jaya algorithm, E-Jaya separates all candidates into two groups, the better and the worse groups, based on their fitness values, and then uses the mean of the better group and the mean of the worse group. Second, in order not to add algorithm-specific parameters to E-Jaya, a novel adaptive method of dividing the two groups has been developed. Finally, twelve benchmark functions with different dimensionalities, such as 40, 60, and 100, were evaluated using the proposed E-Jaya algorithm. The results show that E-Jaya significantly outperforms the Jaya algorithm in terms of solution accuracy. Additionally, E-Jaya was also compared with differential evolution (DE), differential evolution with self-adapting control parameters (jDE), a firefly algorithm (FA), and the standard particle swarm optimization 2011 (SPSO2011) algorithm. The E-Jaya algorithm outperforms all of these algorithms.
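
    The basic Jaya update that E-Jaya builds on moves every candidate toward the current best solution and away from the current worst one, with no algorithm-specific control parameters; a minimal sketch on a sphere test function is given below (the two-group adaption of E-Jaya is not reproduced, and the population size and iteration count are illustrative).

```python
import numpy as np

def jaya(objective, bounds, pop_size=20, iterations=200, seed=0):
    """Basic Jaya: x' = x + r1*(best - |x|) - r2*(worst - |x|),
    with greedy replacement of each candidate."""
    rng = np.random.default_rng(seed)
    low, high = np.array(bounds).T
    pop = rng.uniform(low, high, size=(pop_size, len(low)))
    fitness = np.array([objective(x) for x in pop])
    for _ in range(iterations):
        best, worst = pop[np.argmin(fitness)], pop[np.argmax(fitness)]
        r1, r2 = rng.random(pop.shape), rng.random(pop.shape)
        trial = pop + r1 * (best - np.abs(pop)) - r2 * (worst - np.abs(pop))
        trial = np.clip(trial, low, high)
        trial_fit = np.array([objective(x) for x in trial])
        improved = trial_fit < fitness                 # greedy selection
        pop[improved], fitness[improved] = trial[improved], trial_fit[improved]
    return pop[np.argmin(fitness)], fitness.min()

sphere = lambda x: float(np.sum(x ** 2))
best_x, best_f = jaya(sphere, bounds=[(-5, 5)] * 10)
print(best_f)   # close to 0 for the sphere function
```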

  9. A novel heuristic method for optimization of straight blade vertical axis wind turbine

    International Nuclear Information System (INIS)

    Tahani, Mojtaba; Babayan, Narek; Mehrnia, Seyedmajid; Shadmehri, Mehran

    2016-01-01

    Highlights: • A novel heuristic method has been proposed for optimization of VAWTs. • The proposed method is the combination of the DMST model with heuristic algorithms. • A continuous/discrete optimization problem has been solved. • A novel continuous optimization algorithm has been developed. • The CFD simulation of the optimized geometry has been carried out. - Abstract: In this research study it is aimed to propose a novel heuristic method for optimizing the VAWT design. The method is the combination of continuous/discrete optimization algorithms with double multiple stream tube (DMST) theory. For this purpose a DMST code has been developed and validated using available experimental data in the literature. A novel continuous optimization algorithm is proposed, which can be considered as the combination of three heuristic optimization algorithms, namely elephant herding optimization, the flower pollination algorithm and the grey wolf optimizer. The continuous algorithm is combined with the popular discrete ant colony optimization algorithm (ACO). The proposed method can be utilized for several engineering problems which deal with continuous and discrete variables. In this research study, the chord and diameter of the turbine are selected as continuous decision variables, and the airfoil type and number of blades are selected as discrete decision variables. The average power coefficient over tip speed ratios from 1.5 to 9.5 is considered as the objective function. The optimization results indicated that the optimized geometry can produce a maximum power coefficient 44% higher than that of the original turbine. A CFD simulation of the optimized geometry was also carried out. The CFD results indicated that the average vorticity magnitude around the optimized blade is larger than that around the original blade, which results in greater momentum and a higher power coefficient.

  10. Real parameter optimization by an effective differential evolution algorithm

    Directory of Open Access Journals (Sweden)

    Ali Wagdy Mohamed

    2013-03-01

    Full Text Available This paper introduces an Effective Differential Evolution (EDE) algorithm for solving real-parameter optimization problems over a continuous domain. The algorithm introduces a new mutation rule based on the best and the worst individuals in the entire population of a particular generation. The mutation rule is combined with the basic mutation strategy through a linearly decreasing probability rule. The proposed mutation rule is shown to promote the local search capability of basic DE and to make it faster. Furthermore, a random mutation scheme and a modified Breeder Genetic Algorithm (BGA) mutation scheme are merged to avoid stagnation and/or premature convergence. Additionally, the scaling factor and crossover rate of DE are introduced as uniform random numbers to enrich the search behavior and to enhance the diversity of the population. The effectiveness and benefits of the proposed modifications used in EDE have been experimentally investigated. Numerical experiments on a set of bound-constrained problems have shown that the new approach is efficient, effective and robust. The comparison results between EDE and several classical differential evolution methods and state-of-the-art parameter-adaptive differential evolution variants indicate that the proposed EDE algorithm is competitive with, and in some cases superior to, other algorithms in terms of final solution quality, efficiency, convergence rate, and robustness.
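
    As a point of reference, the classic DE/rand/1/bin scheme that EDE modifies is sketched below; the EDE-specific best/worst mutation rule, random mutation and BGA scheme are not reproduced, and the control parameters and the Rosenbrock test function are illustrative.

```python
import numpy as np

def de_rand_1_bin(objective, bounds, pop_size=30, F=0.8, CR=0.9, gens=300, seed=0):
    """Classic DE/rand/1/bin: mutation v = a + F*(b - c), binomial crossover,
    greedy selection. This is the baseline strategy that EDE builds on."""
    rng = np.random.default_rng(seed)
    low, high = np.array(bounds).T
    dim = len(low)
    pop = rng.uniform(low, high, size=(pop_size, dim))
    fit = np.array([objective(x) for x in pop])
    for _ in range(gens):
        for i in range(pop_size):
            idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            a, b, c = pop[idx]
            mutant = np.clip(a + F * (b - c), low, high)
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True            # keep at least one mutant gene
            trial = np.where(cross, mutant, pop[i])
            f = objective(trial)
            if f < fit[i]:                             # greedy selection
                pop[i], fit[i] = trial, f
    return pop[np.argmin(fit)], fit.min()

rosenbrock = lambda x: float(np.sum(100 * (x[1:] - x[:-1] ** 2) ** 2 + (1 - x[:-1]) ** 2))
best_x, best_f = de_rand_1_bin(rosenbrock, bounds=[(-2, 2)] * 5)
print(best_f)
```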

  11. EV Charging Algorithm Implementation with User Price Preference

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Bin; Hu, Boyang; Qiu, Charlie; Chu, Peter; Gadh, Rajit

    2015-02-17

    In this paper, we propose and implement a smart Electric Vehicle (EV) charging algorithm to control EV charging infrastructures according to users' price preferences. EVSE (Electric Vehicle Supply Equipment), equipped with bidirectional communication devices and smart meters, can be remotely monitored by the proposed charging algorithm applied to the EV control center and a mobile app. On the server side, an ARIMA model is utilized to fit historical charging load data and perform day-ahead prediction. A pricing strategy with an energy bidding policy is proposed and implemented to generate a charging price list that is broadcast to EV users through the mobile app. On the user side, EV drivers can submit their price preferences and daily travel schedules to negotiate with the control center so as to consume the expected energy and minimize charging cost simultaneously. The proposed algorithm is tested and validated through experimental implementations in UCLA parking lots.

  12. Integrated Association Rules Complete Hiding Algorithms

    Directory of Open Access Journals (Sweden)

    Mohamed Refaat Abdellah

    2017-01-01

    Full Text Available This paper presents a database security approach for complete hiding of sensitive association rules by using six novel algorithms. These algorithms utilize three new weights to reduce the needed database modifications and support complete hiding, as well as reduce knowledge distortion and data distortion. The complete weighted hiding algorithms improve the hiding failure by 100%; they have the advantage of performing only a single scan of the database to gather the information required for the hiding process. The proposed algorithms are built within the database structure, which enables the sanitized database to be generated at run time as needed.

  13. Construction Example for Algebra System Using Harmony Search Algorithm

    Directory of Open Access Journals (Sweden)

    FangAn Deng

    2015-01-01

    Full Text Available Constructing an example of an algebra system is a way to verify the existence of a complex algebra system, and it is an NP-hard problem. In this paper, to solve this kind of problem, firstly, a mathematical optimization model for construction examples of algebra systems is established. Secondly, an improved harmony search algorithm based on the NGHS algorithm (INGHS) is proposed to find as many solutions as possible for the optimization model; in the proposed INGHS algorithm, to achieve a balance between exploration power and exploitation power in the search process, a global-best strategy and a dynamic parameter adjustment method are presented. Finally, nine construction examples of algebra systems are used to evaluate the optimization model and the performance of INGHS. The experimental results show that the proposed algorithm has strong performance for solving complex construction example problems of algebra systems.

  14. Semioptimal practicable algorithmic cooling

    International Nuclear Information System (INIS)

    Elias, Yuval; Mor, Tal; Weinstein, Yossi

    2011-01-01

    Algorithmic cooling (AC) of spins applies entropy manipulation algorithms in open spin systems in order to cool spins far beyond Shannon's entropy bound. Algorithmic cooling of nuclear spins was demonstrated experimentally and may contribute to nuclear magnetic resonance spectroscopy. Several cooling algorithms were suggested in recent years, including practicable algorithmic cooling (PAC) and exhaustive AC. Practicable algorithms have simple implementations, yet their level of cooling is far from optimal; exhaustive algorithms, on the other hand, cool much better, and some even reach (asymptotically) an optimal level of cooling, but they are not practicable. We introduce here semioptimal practicable AC (SOPAC), wherein a few cycles (typically two to six) are performed at each recursive level. Two classes of SOPAC algorithms are proposed and analyzed. Both attain cooling levels significantly better than PAC and are much more efficient than the exhaustive algorithms. These algorithms are shown to bridge the gap between PAC and exhaustive AC. In addition, we calculated the number of spins required by SOPAC in order to purify qubits for quantum computation. As few as 12 and 7 spins are required (in an ideal scenario) to yield a mildly pure spin (60% polarized) from initial polarizations of 1% and 10%, respectively. In the latter case, about five more spins are sufficient to produce a highly pure spin (99.99% polarized), which could be relevant for fault-tolerant quantum computing.

  15. Combined heat and power economic dispatch by harmony search algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Vasebi, A.; Bathaee, S.M.T. [Power System Research Laboratory, Department of Electrical and Electronic Engineering, K.N.Toosi University of Technology, 322-Mirdamad Avenue West, 19697 Tehran (Iran); Fesanghary, M. [Department of Mechanical Engineering, Amirkabir University of Technology, 424-Hafez Avenue, Tehran (Iran)

    2007-12-15

    The optimal utilization of multiple combined heat and power (CHP) systems is a complicated problem that needs powerful methods to solve. This paper presents a harmony search (HS) algorithm to solve the combined heat and power economic dispatch (CHPED) problem. The HS algorithm is a recently developed meta-heuristic algorithm and has been very successful in a wide variety of optimization problems. The method is illustrated using a test case taken from the literature as well as a new one proposed by the authors. Numerical results reveal that the proposed algorithm can find better solutions when compared to conventional methods and is an efficient search algorithm for the CHPED problem. (author)
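
    The basic harmony search loop referred to above can be sketched as follows; this is a generic HS with fixed HMCR, PAR and bandwidth applied to a toy two-unit quadratic cost with no heat or demand constraints, so it only illustrates the mechanism and is not the CHPED formulation of the paper.

```python
import numpy as np

def harmony_search(objective, bounds, hms=20, hmcr=0.9, par=0.3, bw=0.05,
                   iterations=2000, seed=0):
    """Plain harmony search: each new harmony is built variable-by-variable from
    memory consideration (prob. HMCR), optional pitch adjustment (prob. PAR,
    bandwidth bw) or random selection, and replaces the worst member if better."""
    rng = np.random.default_rng(seed)
    low, high = np.array(bounds).T
    dim = len(low)
    memory = rng.uniform(low, high, size=(hms, dim))
    fitness = np.array([objective(x) for x in memory])
    for _ in range(iterations):
        new = np.empty(dim)
        for j in range(dim):
            if rng.random() < hmcr:
                new[j] = memory[rng.integers(hms), j]          # memory consideration
                if rng.random() < par:                         # pitch adjustment
                    new[j] += bw * (high[j] - low[j]) * rng.uniform(-1, 1)
            else:
                new[j] = rng.uniform(low[j], high[j])          # random selection
        new = np.clip(new, low, high)
        f = objective(new)
        worst = np.argmax(fitness)
        if f < fitness[worst]:
            memory[worst], fitness[worst] = new, f
    return memory[np.argmin(fitness)], fitness.min()

# toy stand-in for a dispatch-style cost: quadratic fuel costs of two units
cost = lambda p: float(0.02 * p[0] ** 2 + 2 * p[0] + 0.04 * p[1] ** 2 + 1.5 * p[1])
best_p, best_c = harmony_search(cost, bounds=[(10, 100), (10, 100)])
print(best_p, best_c)
```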

  16. A Pilot-Pattern Based Algorithm for MIMO-OFDM Channel Estimation

    Directory of Open Access Journals (Sweden)

    Guomin Li

    2016-12-01

    Full Text Available An improved pilot-pattern algorithm for facilitating channel estimation in multiple-input multiple-output orthogonal frequency division multiplexing (MIMO-OFDM) systems is proposed in this paper. The presented algorithm reconfigures the parameters of the least squares (LS) algorithm, which belongs to the space-time block-coded (STBC) category, for channel estimation in pilot-based MIMO-OFDM systems. Simulation results show that the algorithm performs better than the classical single-symbol scheme. Compared with the double-symbol scheme, the proposed algorithm can achieve nearly the same performance with only half of the complexity of the double-symbol scheme.
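
    For context, a plain LS pilot-based channel estimate for a single transmit-receive antenna pair can be sketched as below: divide the received pilot subcarriers by the known pilot symbols and interpolate across the band. The STBC-specific reconfiguration proposed in the paper is not reproduced, and the subcarrier count, pilot spacing and channel length are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sub, pilot_spacing = 64, 8
pilot_idx = np.arange(0, n_sub, pilot_spacing)

# frequency-selective channel and known pilot symbols (one Tx-Rx antenna pair)
h_time = (rng.standard_normal(4) + 1j * rng.standard_normal(4)) / np.sqrt(8)
H_true = np.fft.fft(h_time, n_sub)
X_pilot = np.exp(1j * np.pi / 2 * rng.integers(0, 4, len(pilot_idx)))   # QPSK pilots
noise = 0.05 * (rng.standard_normal(len(pilot_idx)) + 1j * rng.standard_normal(len(pilot_idx)))
Y_pilot = H_true[pilot_idx] * X_pilot + noise

# LS estimate at the pilot positions, then linear interpolation over all subcarriers
H_ls = Y_pilot / X_pilot
H_hat = np.interp(np.arange(n_sub), pilot_idx, H_ls.real) \
        + 1j * np.interp(np.arange(n_sub), pilot_idx, H_ls.imag)
print(np.mean(np.abs(H_hat - H_true) ** 2))   # small interpolation-plus-noise error
```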

  17. Normalized Minimum Error Entropy Algorithm with Recursive Power Estimation

    Directory of Open Access Journals (Sweden)

    Namyong Kim

    2016-06-01

    Full Text Available The minimum error entropy (MEE) algorithm is known to be superior in signal processing applications under impulsive noise. In this paper, based on an analysis of the behavior of the optimum weight and the properties of robustness against impulsive noise, a normalized version of the MEE algorithm is proposed. The step size of the MEE algorithm is normalized with the power of the input entropy, which is estimated recursively to reduce its computational complexity. The proposed algorithm yields a lower minimum MSE (mean squared error) and a faster convergence speed than the original MEE algorithm in the equalization simulation. At the same convergence speed, its steady-state MSE improvement is above 3 dB.

  18. Genetic Algorithm Based Economic Dispatch with Valve Point Effect

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jong Nam; Park, Kyung Won; Kim, Ji Hong; Kim, Jin O [Hanyang University (Korea, Republic of)

    1999-03-01

    This paper presents a new genetic algorithm approach to the economic dispatch problem with valve point discontinuities. The approach proposed in this paper improves the performance of genetic algorithms in solving the economic dispatch problem with valve point discontinuities through an improved death penalty method, generation-apart elitism, atavism, and sexual selection with sexual distinction. Numerical results on a test system consisting of 13 thermal units show that the proposed approach is faster, more robust and more powerful than conventional genetic algorithms. (author). 8 refs., 10 figs.
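
    The valve-point effect enters the dispatch problem through a rectified-sine term added to the quadratic fuel cost of each unit; a small sketch of this standard cost form is given below, with coefficient values that are purely illustrative and not taken from the 13-unit test system.

```python
import numpy as np

def valve_point_cost(P, a, b, c, e, f, P_min):
    """Fuel cost of one thermal unit with valve-point loading:
    F(P) = a + b*P + c*P^2 + |e * sin(f * (P_min - P))|.
    The rectified sine term models the ripples added by valve openings,
    which makes the cost non-smooth and multimodal."""
    return a + b * P + c * P ** 2 + np.abs(e * np.sin(f * (P_min - P)))

# illustrative coefficients for a single unit
P = np.linspace(100, 400, 5)
print(valve_point_cost(P, a=550.0, b=8.1, c=0.00028, e=300.0, f=0.035, P_min=100.0))
```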

  19. A new genetic algorithm for flexible job-shop scheduling problems

    International Nuclear Information System (INIS)

    Driss, Imen; Mouss, Kinza Nadia; Laggoun, Assia

    2015-01-01

    Flexible job-shop scheduling problem (FJSP), which is proved to be NP-hard, is an extension of the classical job-shop scheduling problem. In this paper, we propose a new genetic algorithm (NGA) to solve FJSP to minimize makespan. This new algorithm uses a new chromosome representation and adopts different strategies for crossover and mutation. The proposed algorithm is validated on a series of benchmark data sets and tested on data from a drug manufacturing company. Experimental results prove that the NGA is more efficient and competitive than some other existing algorithms.

  20. A new genetic algorithm for flexible job-shop scheduling problems

    Energy Technology Data Exchange (ETDEWEB)

    Driss, Imen; Mouss, Kinza Nadia; Laggoun, Assia [University of Batna, Batna (Algeria)

    2015-03-15

    Flexible job-shop scheduling problem (FJSP), which is proved to be NP-hard, is an extension of the classical job-shop scheduling problem. In this paper, we propose a new genetic algorithm (NGA) to solve FJSP to minimize makespan. This new algorithm uses a new chromosome representation and adopts different strategies for crossover and mutation. The proposed algorithm is validated on a series of benchmark data sets and tested on data from a drug manufacturing company. Experimental results prove that the NGA is more efficient and competitive than some other existing algorithms.