WorldWideScience

Sample records for adaptive simulated annealing

  1. Adaptive simulated annealing (ASA): Lessons learned

    OpenAIRE

    Ingber, L.

    2000-01-01

    Adaptive simulated annealing (ASA) is a global optimization algorithm based on an associated proof that the parameter space can be sampled much more efficiently than by using other previous simulated annealing algorithms. The author's ASA code has been publicly available for over two years. During this time the author has volunteered to help people via e-mail, and the feedback obtained has been used to further develop the code. Some lessons learned, in particular some which are relevant to ot...

  2. Adaptive Simulated Annealing Based Protein Loop Modeling of Neurotoxins

    Institute of Scientific and Technical Information of China (English)

    陈杰; 黄丽娜; 彭志红

    2003-01-01

    A loop modeling method based on adaptive simulated annealing is proposed for the ab initio prediction of protein loop structures, treated as an optimization problem of searching for the global minimum of a given energy function. An interface-friendly toolbox, LoopModeller, developed in VC++ and OpenGL environments for Windows and Linux systems, is provided for analysis and visualization. Simulation results for three short-chain neurotoxins modeled by LoopModeller show that the proposed method is fast and efficient.

  3. Parameter estimation for chaotic systems using a hybrid adaptive cuckoo search with simulated annealing algorithm

    Science.gov (United States)

    Sheng, Zheng; Wang, Jun; Zhou, Shudao; Zhou, Bihua

    2014-03-01

    This paper introduces a novel hybrid optimization algorithm for estimating the parameters of chaotic systems. To address the weaknesses of the traditional cuckoo search algorithm, an adaptive cuckoo search with simulated annealing is proposed, which incorporates an adaptive parameter-adjusting operation and a simulated annealing operation into the cuckoo search algorithm. Normally, the parameters of the cuckoo search algorithm are kept constant, which may reduce its efficiency. To balance and enhance the accuracy and convergence rate of the cuckoo search algorithm, the adaptive operation tunes these parameters during the search. In addition, the local search capability of the cuckoo search algorithm is relatively weak, which may degrade the quality of the optimization, so the simulated annealing operation is merged into the cuckoo search algorithm to strengthen the local search and improve the accuracy and reliability of the results. The proposed hybrid algorithm is evaluated on the Lorenz chaotic system under both noiseless and noisy conditions. The numerical results demonstrate that the method estimates parameters efficiently and accurately in both cases. Finally, the results are compared with those of the traditional cuckoo search algorithm, a genetic algorithm, and particle swarm optimization. Simulation results demonstrate the effectiveness and superior performance of the proposed algorithm.
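
    For readers who want a concrete picture of the hybridization described above, the sketch below shows one way a simulated annealing operation can act as a local refinement step on a candidate parameter vector. It is an illustrative Python sketch, not the authors' implementation; the cost function, step size, and cooling factor are assumptions.

        import numpy as np

        def sa_refine(x, cost, n_steps=200, t0=1.0, cooling=0.95, step=0.1, rng=None):
            """Simulated annealing pass used as a local-search operator.

            x    : candidate parameter vector (e.g. estimated Lorenz parameters)
            cost : callable returning the estimation error for a parameter vector
            """
            rng = rng or np.random.default_rng()
            best, best_e = x.copy(), cost(x)
            cur, cur_e, t = best.copy(), best_e, t0
            for _ in range(n_steps):
                cand = cur + step * rng.standard_normal(cur.size)  # local perturbation
                cand_e = cost(cand)
                # Metropolis rule: always accept improvements, sometimes accept worse moves
                if cand_e < cur_e or rng.random() < np.exp(-(cand_e - cur_e) / t):
                    cur, cur_e = cand, cand_e
                    if cur_e < best_e:
                        best, best_e = cur.copy(), cur_e
                t *= cooling  # geometric cooling
            return best, best_e

    In a hybrid scheme of this kind, such a pass could be applied to the candidate solutions kept by the cuckoo search at each generation; the exact point of insertion used by the authors is not specified in the abstract.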

  4. Parameter estimation for chaotic systems using a hybrid adaptive cuckoo search with simulated annealing algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Sheng, Zheng, E-mail: 19994035@sina.com [College of Meteorology and Oceanography, PLA University of Science and Technology, Nanjing 211101 (China); Wang, Jun; Zhou, Bihua [National Defense Key Laboratory on Lightning Protection and Electromagnetic Camouflage, PLA University of Science and Technology, Nanjing 210007 (China); Zhou, Shudao [College of Meteorology and Oceanography, PLA University of Science and Technology, Nanjing 211101 (China); Collaborative Innovation Center on Forecast and Evaluation of Meteorological Disasters, Nanjing University of Information Science and Technology, Nanjing 210044 (China)

    2014-03-15

    This paper introduces a novel hybrid optimization algorithm for estimating the parameters of chaotic systems. To address the weaknesses of the traditional cuckoo search algorithm, an adaptive cuckoo search with simulated annealing is proposed, which incorporates an adaptive parameter-adjusting operation and a simulated annealing operation into the cuckoo search algorithm. Normally, the parameters of the cuckoo search algorithm are kept constant, which may reduce its efficiency. To balance and enhance the accuracy and convergence rate of the cuckoo search algorithm, the adaptive operation tunes these parameters during the search. In addition, the local search capability of the cuckoo search algorithm is relatively weak, which may degrade the quality of the optimization, so the simulated annealing operation is merged into the cuckoo search algorithm to strengthen the local search and improve the accuracy and reliability of the results. The proposed hybrid algorithm is evaluated on the Lorenz chaotic system under both noiseless and noisy conditions. The numerical results demonstrate that the method estimates parameters efficiently and accurately in both cases. Finally, the results are compared with those of the traditional cuckoo search algorithm, a genetic algorithm, and particle swarm optimization. Simulation results demonstrate the effectiveness and superior performance of the proposed algorithm.

  5. Improve earthquake hypocenter using adaptive simulated annealing inversion in regional tectonic, volcano tectonic, and geothermal observation

    International Nuclear Information System (INIS)

    Earthquake observation is used routinely in monitoring tectonic activity, and also at local scales such as volcano-tectonic and geothermal activity monitoring. It requires determining a precise hypocenter, a process that involves finding the hypocenter location with minimum error between the observed and the calculated travel times. When solving this nonlinear inverse problem, simulated annealing inversion can be applied as a global optimization method whose convergence is independent of the initial model. In this study, we developed our own program code applying adaptive simulated annealing inversion in the Matlab environment. We applied this method to determine earthquake hypocenters for several data cases: regional tectonic, volcano tectonic, and a geothermal field. The travel times were calculated using a ray-tracing shooting method. We then compared the results with those obtained using Geiger's method to assess reliability. Our results show hypocenter locations with smaller RMS error compared to Geiger's results, which can be statistically associated with a better solution. The earthquake hypocenters also correlate well with geological structures in the study area. We recommend using adaptive simulated annealing inversion to relocate hypocenters in order to obtain precise and accurate earthquake locations.

  6. Improve earthquake hypocenter using adaptive simulated annealing inversion in regional tectonic, volcano tectonic, and geothermal observation

    Energy Technology Data Exchange (ETDEWEB)

    Ry, Rexha Verdhora, E-mail: rexha.vry@gmail.com [Master Program of Geophysical Engineering, Faculty of Mining and Petroleum Engineering, Institut Teknologi Bandung, Jalan Ganesha No.10, Bandung 40132 (Indonesia); Nugraha, Andri Dian, E-mail: nugraha@gf.itb.ac.id [Global Geophysical Research Group, Faculty of Mining and Petroleum Engineering, Institut Teknologi Bandung, Jalan Ganesha No.10, Bandung 40132 (Indonesia)

    2015-04-24

    Earthquake observation is used routinely in monitoring tectonic activity, and also at local scales such as volcano-tectonic and geothermal activity monitoring. It requires determining a precise hypocenter, a process that involves finding the hypocenter location with minimum error between the observed and the calculated travel times. When solving this nonlinear inverse problem, simulated annealing inversion can be applied as a global optimization method whose convergence is independent of the initial model. In this study, we developed our own program code applying adaptive simulated annealing inversion in the Matlab environment. We applied this method to determine earthquake hypocenters for several data cases: regional tectonic, volcano tectonic, and a geothermal field. The travel times were calculated using a ray-tracing shooting method. We then compared the results with those obtained using Geiger's method to assess reliability. Our results show hypocenter locations with smaller RMS error compared to Geiger's results, which can be statistically associated with a better solution. The earthquake hypocenters also correlate well with geological structures in the study area. We recommend using adaptive simulated annealing inversion to relocate hypocenters in order to obtain precise and accurate earthquake locations.

  7. Simulated annealing method for electronic circuits design: adaptation and comparison with other optimization methods

    International Nuclear Information System (INIS)

    The circuit design problem consists in determining acceptable parameter values (resistors, capacitors, transistor geometries ...) which allow the circuit to meet various user-given operational criteria (DC consumption, AC bandwidth, transient times ...). This task is equivalent to a multidimensional and/or multi-objective optimization problem: n-variable functions have to be minimized in a hyper-rectangular domain; equality constraints can optionally be specified. A similar problem consists in fitting component models; in this case the optimization variables are the model parameters, and one aims at minimizing a cost function built on the error between the model response and the data measured on the component. The optimization method chosen for this kind of problem is simulated annealing. This method, which originated in the combinatorial optimization domain, has been adapted to continuous-variable problems and compared with other global optimization methods. An efficient strategy of variable discretization and a set of complementary stopping criteria have been proposed. The different parameters of the method have been adjusted with analytical functions whose minima are known and which are classically used in the literature. Our simulated annealing algorithm has been coupled with the open electrical simulator SPICE-PAC, whose modular structure allows chaining the simulations required by the circuit optimization process. For high-dimensional problems, we proposed a partitioning technique which ensures proportionality between CPU time and the number of variables. To compare our method with others, we adapted three other methods from the combinatorial optimization domain (the threshold method, a genetic algorithm, and the tabu search method). The tests were performed on the same set of test functions, and the results allow a first comparison between these methods applied to continuous optimization variables. Finally, our simulated annealing program

  8. Adaptive MANET Multipath Routing Algorithm Based on the Simulated Annealing Approach

    Directory of Open Access Journals (Sweden)

    Sungwook Kim

    2014-01-01

    A mobile ad hoc network is a system of wireless mobile nodes that can freely and dynamically self-organize network topologies without any preexisting communication infrastructure. Due to characteristics such as a temporary topology and the absence of centralized authority, routing is one of the major issues in ad hoc networks. In this paper, a new multipath routing scheme is proposed by employing a simulated annealing approach. The proposed metaheuristic approach can achieve greater and reciprocal advantages in a hostile, dynamic, real-world network situation. Therefore, the proposed routing scheme is a powerful method for finding an effective solution to the conflicting mobile ad hoc network routing problem. Simulation results indicate that the proposed paradigm adapts best to the variation of dynamic network situations. The average remaining energy, network throughput, packet loss probability, and traffic load distribution are improved by about 10%, 10%, 5%, and 10%, respectively, compared with the existing schemes.

  9. Adaptive MANET multipath routing algorithm based on the simulated annealing approach.

    Science.gov (United States)

    Kim, Sungwook

    2014-01-01

    A mobile ad hoc network is a system of wireless mobile nodes that can freely and dynamically self-organize network topologies without any preexisting communication infrastructure. Due to characteristics such as a temporary topology and the absence of centralized authority, routing is one of the major issues in ad hoc networks. In this paper, a new multipath routing scheme is proposed by employing a simulated annealing approach. The proposed metaheuristic approach can achieve greater and reciprocal advantages in a hostile, dynamic, real-world network situation. Therefore, the proposed routing scheme is a powerful method for finding an effective solution to the conflicting mobile ad hoc network routing problem. Simulation results indicate that the proposed paradigm adapts best to the variation of dynamic network situations. The average remaining energy, network throughput, packet loss probability, and traffic load distribution are improved by about 10%, 10%, 5%, and 10%, respectively, compared with the existing schemes. PMID:25032241

  10. Stochastic Global Optimization and Its Applications with Fuzzy Adaptive Simulated Annealing

    CERN Document Server

    Aguiar e Oliveira Junior, Hime; Petraglia, Antonio; Rembold Petraglia, Mariane; Augusta Soares Machado, Maria

    2012-01-01

    Stochastic global optimization is a very important subject, that has applications in virtually all areas of science and technology. Therefore there is nothing more opportune than writing a book about a successful and mature algorithm that turned out to be a good tool in solving difficult problems. Here we present some techniques for solving  several problems by means of Fuzzy Adaptive Simulated Annealing (Fuzzy ASA), a fuzzy-controlled version of ASA, and by ASA itself. ASA is a sophisticated global optimization algorithm that is based upon ideas of the simulated annealing paradigm, coded in the C programming language and developed to statistically find the best global fit of a nonlinear constrained, non-convex cost function over a multi-dimensional space. By presenting detailed examples of its application we want to stimulate the reader’s intuition and make the use of Fuzzy ASA (or regular ASA) easier for everyone wishing to use these tools to solve problems. We kept formal mathematical requirements to a...

  11. The rvfit Code: A Detailed Adaptive Simulated Annealing Code for Fitting Binaries and Exoplanets Radial Velocities

    CERN Document Server

    Iglesias-Marzoa, Ramón; Morales, María Jesús Arévalo

    2015-01-01

    The fitting of radial velocity curves is a frequent procedure in binary star and exoplanet research. In the majority of cases the fitting routines need to be fed with a set of initial parameter values and priors from which to begin the computations, and their results can be affected by local minima. We present a new code, the rvfit code, for fitting radial velocities of stellar binaries and exoplanets using an Adaptive Simulated Annealing (ASA) global minimization method, which converges quickly to a global minimum without the need to provide preliminary parameter values. We show the performance of the code using both synthetic and real data sets: double-lined binaries, single-lined binaries, and exoplanet systems. In all examples the Keplerian orbital parameters fitted by the rvfit code and their computed uncertainties are compared with literature solutions. Finally, we provide the source code with a working example and a detailed description on how to use it.

  12. Generalized Simulated Annealing

    OpenAIRE

    Tsallis, Constantino; Stariolo, Daniel A.

    1995-01-01

    We propose a new stochastic algorithm (generalized simulated annealing) for computationally finding the global minimum of a given (not necessarily convex) energy/cost function defined in a continuous D-dimensional space. This algorithm recovers, as particular cases, the so called classical ("Boltzmann machine") and fast ("Cauchy machine") simulated annealings, and can be quicker than both. Key-words: simulated annealing; nonconvex optimization; gradient descent; generalized statistical mechan...

  13. An adaptive evolutionary multi-objective approach based on simulated annealing.

    Science.gov (United States)

    Li, H; Landa-Silva, D

    2011-01-01

    A multi-objective optimization problem can be solved by decomposing it into one or more single objective subproblems in some multi-objective metaheuristic algorithms. Each subproblem corresponds to one weighted aggregation function. For example, MOEA/D is an evolutionary multi-objective optimization (EMO) algorithm that attempts to optimize multiple subproblems simultaneously by evolving a population of solutions. However, the performance of MOEA/D highly depends on the initial setting and diversity of the weight vectors. In this paper, we present an improved version of MOEA/D, called EMOSA, which incorporates an advanced local search technique (simulated annealing) and adapts the search directions (weight vectors) corresponding to various subproblems. In EMOSA, the weight vector of each subproblem is adaptively modified at the lowest temperature in order to diversify the search toward the unexplored parts of the Pareto-optimal front. Our computational results show that EMOSA outperforms six other well established multi-objective metaheuristic algorithms on both the (constrained) multi-objective knapsack problem and the (unconstrained) multi-objective traveling salesman problem. Moreover, the effects of the main algorithmic components and parameter sensitivities on the search performance of EMOSA are experimentally investigated.
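
    As a minimal sketch of the decomposition idea mentioned above, assuming a simple weighted-sum aggregation (the paper's actual aggregation function, neighbourhood moves, and weight-adaptation rule are not reproduced here), each subproblem scalarizes the objective vector with its weight vector, and a simulated annealing rule decides whether a neighbouring solution replaces the current one:

        import math, random

        def weighted_sum(objs, weights):
            """Scalarize an objective vector for one subproblem (weighted-sum aggregation)."""
            return sum(w * f for w, f in zip(weights, objs))

        def sa_accept(curr_objs, cand_objs, weights, temperature):
            """Metropolis acceptance on the scalarized subproblem (minimization)."""
            delta = weighted_sum(cand_objs, weights) - weighted_sum(curr_objs, weights)
            return delta <= 0 or random.random() < math.exp(-delta / temperature)

        # A candidate that trades one objective off against another may still be accepted:
        print(sa_accept((3.0, 5.0), (3.5, 4.8), weights=(0.5, 0.5), temperature=0.5))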

  14. A memory structure adapted simulated annealing algorithm for a green vehicle routing problem.

    Science.gov (United States)

    Küçükoğlu, İlker; Ene, Seval; Aksoy, Aslı; Öztürk, Nursel

    2015-03-01

    Currently, reduction of carbon dioxide (CO2) emissions and fuel consumption has become a critical environmental problem and has attracted the attention of both academia and the industrial sector. Government regulations and customer demands are making environmental responsibility an increasingly important factor in overall supply chain operations. Within these operations, transportation has the most hazardous effects on the environment, i.e., CO2 emissions, fuel consumption, noise and toxic effects on the ecosystem. This study aims to construct vehicle routes with time windows that minimize the total fuel consumption and CO2 emissions. The green vehicle routing problem with time windows (G-VRPTW) is formulated using a mixed integer linear programming model. A memory structure adapted simulated annealing (MSA-SA) meta-heuristic algorithm is constructed due to the high complexity of the proposed problem and long solution times for practical applications. The proposed models are integrated with a fuel consumption and CO2 emissions calculation algorithm that considers the vehicle technical specifications, vehicle load, and transportation distance in a green supply chain environment. The proposed models are validated using well-known instances with different numbers of customers. The computational results indicate that the MSA-SA heuristic is capable of obtaining good G-VRPTW solutions within a reasonable amount of time by providing reductions in fuel consumption and CO2 emissions. PMID:25056743

  15. General Simulated Annealing

    Institute of Scientific and Technical Information of China (English)

    姚新; 李国杰

    1991-01-01

    Simulated annealing is a new kind of random search method developed in recent years. It can also be considered an extension of the classical hill-climbing method in AI: probabilistic hill-climbing. One of its most important features is its global convergence. The convergence of a simulated annealing algorithm is determined by the state generating probability, the state accepting probability, and the temperature decreasing rate. This paper gives a generalized simulated annealing algorithm with dynamic generating and accepting probabilities. The paper also shows that the generating and accepting probabilities can adopt many different kinds of distributions while global convergence is guaranteed.
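
    The three ingredients named in this abstract (state generating probability, state accepting probability, and temperature decreasing rate) can be made explicit in a small template; the Gaussian generator, Metropolis acceptor, and geometric cooling below are illustrative placeholders, not the specific distributions analysed in the paper.

        import math, random

        def general_sa(x0, energy, generate, accept, cool, t0=1.0, n_iter=1000):
            """Generic simulated annealing driven by user-supplied distributions."""
            x, e, t = x0, energy(x0), t0
            for _ in range(n_iter):
                y = generate(x, t)      # state generating probability
                ey = energy(y)
                if accept(e, ey, t):    # state accepting probability
                    x, e = y, ey
                t = cool(t)             # temperature decreasing rate
            return x, e

        # Placeholder choices (one of many admissible distributions):
        generate = lambda x, t: x + random.gauss(0.0, math.sqrt(t))
        accept = lambda e, ey, t: ey < e or random.random() < math.exp((e - ey) / t)
        cool = lambda t: 0.99 * t

        x_min, e_min = general_sa(5.0, lambda x: (x - 2.0) ** 2, generate, accept, cool)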

  16. Keystream Generator Based On Simulated Annealing

    Directory of Open Access Journals (Sweden)

    Ayad A. Abdulsalam

    2011-01-01

    Advances in the design of keystream generators using heuristic techniques are reported. A simulated annealing algorithm for generating random keystreams with large complexity is presented, and the simulated annealing technique is adapted to meet these requirements. The definitions of some cryptographic properties are generalized, providing a measure suitable for use as an objective function in a simulated annealing algorithm that seeks randomness satisfying both correlation immunity and large linear complexity. Results are presented demonstrating the effectiveness of the method.

  17. On lumped models for thermodynamic properties of simulated annealing problems

    OpenAIRE

    Andresen, Bjarne; Hoffmann, Karl Heinz; Mosegaard, Klaus; Nulton, Jim; Pedersen, Jacob Mørch; Salamon, Peter

    1988-01-01

    The paper describes a new method for the estimation of thermodynamic properties for simulated annealing problems using data obtained during a simulated annealing run. The method works by estimating energy-to-energy transition probabilities and is well adapted to simulations such as simulated annealing, in which the system is never in equilibrium.

  18. multicast using Simulated Annealing

    Directory of Open Access Journals (Sweden)

    Yezid Donoso

    2005-01-01

    This article presents a multi-objective optimization method for solving the load-balancing problem in multicast transmission networks, based on the application of the Simulated Annealing metaheuristic. The method minimizes four basic parameters in order to guarantee quality of service in multicast transmissions: source-destination delay, maximum link utilization, consumed bandwidth, and number of hops. The results returned by the heuristic are compared with the results produced by the mathematical model proposed in previous research.

  19. Recursive simulation of quantum annealing

    CERN Document Server

    Sowa, A P; Samson, J H; Savel'ev, S E; Zagoskin, A M; Heidel, S; Zúñiga-Anaya, J C

    2015-01-01

    The evaluation of the performance of adiabatic annealers is hindered by the lack of efficient algorithms for simulating their behaviour. We exploit the analyticity of the standard model for the adiabatic quantum process to develop an efficient recursive method for its numerical simulation in the case of both unitary and non-unitary evolution. Numerical simulations show distinctly different distributions for the most important figure of merit of adiabatic quantum computing, the success probability, in these two cases.

  20. Simulation of Storm Occurrences Using Simulated Annealing.

    Science.gov (United States)

    Lokupitiya, Ravindra S.; Borgman, Leon E.; Anderson-Sprecher, Richard

    2005-11-01

    Modeling storm occurrences has become a vital part of hurricane prediction. In this paper, a method for simulating event occurrences using a simulated annealing algorithm is described. The method is illustrated using annual counts of hurricanes and of tropical storms in the Atlantic Ocean and Gulf of Mexico. Simulations closely match distributional properties, including possible correlations, in the historical data. For hurricanes, traditionally used Poisson and negative binomial processes also predict univariate properties well, but for tropical storms parametric methods are less successful. The authors determined that simulated annealing replicates properties of both series. Simulated annealing can be designed so that simulations mimic historical distributional properties to whatever degree is desired, including occurrence of extreme events and temporal patterning.

  1. Using Simulated Annealing to Factor Numbers

    OpenAIRE

    Altschuler, Eric Lewin; Williams, Timothy J.

    2014-01-01

    Almost all public secure communication relies on the inability to factor large numbers. There is no known analytic or classical numeric method to rapidly factor large numbers. Shor [1] has shown that a quantum computer can factor numbers in polynomial time, but there is no practical quantum computer that can yet do such computations. We show that a simulated annealing [2] approach can be adapted to find factors of large numbers.

  2. Feasibility of Simulated Annealing Tomography

    CERN Document Server

    Vo, Nghia T; Moser, Herbert O

    2014-01-01

    Simulated annealing tomography (SAT) is a simple iterative image reconstruction technique which can yield a superior reconstruction compared with filtered back-projection (FBP). However, the very high computational cost of iteratively calculating the discrete Radon transform (DRT) has limited the feasibility of this technique. In this paper, we propose an approach based on a pre-calculated intersection lengths array (PILA), which removes the step of computing the DRT in the simulated annealing procedure and speeds up SAT by over 300 times. The enhancement of the convergence speed of the reconstruction process using the best of multiple-estimate (BoME) strategy is introduced. The performance of SAT under different conditions and in comparison with other methods is demonstrated by numerical experiments.

  3. Simulated annealing method for electronic circuits design: adaptation and comparison with other optimization methods; La méthode du recuit simulé pour la conception des circuits électroniques: adaptation et comparaison avec d'autres méthodes d'optimisation

    Energy Technology Data Exchange (ETDEWEB)

    Berthiau, G.

    1995-10-01

    The circuit design problem consists in determining acceptable parameter values (resistors, capacitors, transistor geometries ...) which allow the circuit to meet various user-given operational criteria (DC consumption, AC bandwidth, transient times ...). This task is equivalent to a multidimensional and/or multi-objective optimization problem: n-variable functions have to be minimized in a hyper-rectangular domain; equality constraints can optionally be specified. A similar problem consists in fitting component models; in this case the optimization variables are the model parameters, and one aims at minimizing a cost function built on the error between the model response and the data measured on the component. The optimization method chosen for this kind of problem is simulated annealing. This method, which originated in the combinatorial optimization domain, has been adapted to continuous-variable problems and compared with other global optimization methods. An efficient strategy of variable discretization and a set of complementary stopping criteria have been proposed. The different parameters of the method have been adjusted with analytical functions whose minima are known and which are classically used in the literature. Our simulated annealing algorithm has been coupled with the open electrical simulator SPICE-PAC, whose modular structure allows chaining the simulations required by the circuit optimization process. For high-dimensional problems, we proposed a partitioning technique which ensures proportionality between CPU time and the number of variables. To compare our method with others, we adapted three other methods from the combinatorial optimization domain (the threshold method, a genetic algorithm, and the tabu search method). The tests were performed on the same set of test functions, and the results allow a first comparison between these methods applied to continuous optimization variables. (Abstract Truncated)

  4. Recursive Branching Simulated Annealing Algorithm

    Science.gov (United States)

    Bolcar, Matthew; Smith, J. Scott; Aronstein, David

    2012-01-01

    This innovation is a variation of a simulated-annealing optimization algorithm that uses a recursive-branching structure to parallelize the search of a parameter space for the globally optimal solution to an objective. The algorithm has been demonstrated to be more effective at searching a parameter space than traditional simulated-annealing methods for a particular problem of interest, and it can readily be applied to a wide variety of optimization problems, including those with a parameter space having both discrete-value parameters (combinatorial) and continuous-variable parameters. It can take the place of a conventional simulated-annealing, Monte-Carlo, or random-walk algorithm. In a conventional simulated-annealing (SA) algorithm, a starting configuration is randomly selected within the parameter space. The algorithm randomly selects another configuration from the parameter space and evaluates the objective function for that configuration. If the objective function value is better than the previous value, the new configuration is adopted as the new point of interest in the parameter space. If the objective function value is worse than the previous value, the new configuration may be adopted, with a probability determined by a temperature parameter, used in analogy to annealing in metals. As the optimization continues, the region of the parameter space from which new configurations can be selected shrinks, and in conjunction with lowering the annealing temperature (and thus lowering the probability for adopting configurations in parameter space with worse objective functions), the algorithm can converge on the globally optimal configuration. The Recursive Branching Simulated Annealing (RBSA) algorithm shares some features with the SA algorithm, notably including the basic principles that a starting configuration is randomly selected from within the parameter space, the algorithm tests other configurations with the goal of finding the globally optimal
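
    A compact sketch of the conventional SA baseline described in this abstract (not the RBSA algorithm itself): the sampling region shrinks and the temperature is lowered as the search proceeds, so worse configurations become increasingly unlikely to be adopted. All numeric settings are illustrative.

        import math, random

        def conventional_sa(objective, lo, hi, n_iter=5000, t0=1.0, seed=0):
            """Minimize objective(x) on [lo, hi] with a shrinking search region."""
            rng = random.Random(seed)
            x = rng.uniform(lo, hi)                  # random starting configuration
            fx = objective(x)
            for k in range(1, n_iter + 1):
                t = t0 * (1.0 - k / n_iter)          # annealing temperature, lowered over time
                radius = 0.5 * (hi - lo) * (1.0 - k / n_iter) + 1e-9  # shrinking region
                y = min(hi, max(lo, x + rng.uniform(-radius, radius)))
                fy = objective(y)
                # better moves are always adopted; worse moves only with a
                # temperature-dependent probability
                if fy <= fx or rng.random() < math.exp(-(fy - fx) / max(t, 1e-12)):
                    x, fx = y, fy
            return x, fx

        best_x, best_f = conventional_sa(lambda x: math.cos(3.0 * x) + 0.1 * x * x, -10.0, 10.0)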

  5. Very Fast Simulated Re-Annealing

    OpenAIRE

    Ingber, Lester

    1989-01-01

    An algorithm is developed to statistically find the best global fit of a nonlinear non-convex cost-function over a D-dimensional space. It is argued that this algorithm permits an annealing schedule for "temperature" T decreasing exponentially in annealing-time k, T = T0 exp(−c k^(1/D)). The introduction of re-annealing also permits adaptation to changing sensitivities in the multidimensional parameter-space. This annealing schedule is faster than fast Cauchy annealing, ...
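
    For reference, the schedule in this abstract can be written out explicitly, alongside the Boltzmann and Cauchy schedules it is usually contrasted with (k is the annealing-time index and D the dimension of the parameter space; the per-dimension constants c_i are a common ASA convention and are not spelled out in this abstract). A standard LaTeX rendering:

        % very fast re-annealing schedule, one temperature per parameter dimension i
        T_i(k) = T_{0,i}\,\exp\!\left(-c_i\, k^{1/D}\right)

        % schedules it is contrasted with
        T_{\mathrm{Boltzmann}}(k) = \frac{T_0}{\ln k}, \qquad
        T_{\mathrm{Cauchy}}(k) = \frac{T_0}{k}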

  6. NEW SIMULATED ANNEALING ALGORITHMS FOR CONSTRAINED OPTIMIZATION

    OpenAIRE

    LINET ÖZDAMAR; CHANDRA SEKHAR PEDAMALLU

    2010-01-01

    We propose a Population based dual-sequence Non-Penalty Annealing algorithm (PNPA) for solving the general nonlinear constrained optimization problem. The PNPA maintains a population of solutions that are intermixed by crossover to supply a new starting solution for simulated annealing throughout the search. Every time the search gets stuck at a local optimum, this crossover procedure is triggered and simulated annealing search re-starts from a new subspace. In both the crossover and simulate...

  7. Quantum annealing speedup over simulated annealing on random Ising chains

    CERN Document Server

    Zanca, Tommaso

    2015-01-01

    We show clear evidence of a speedup of a quantum annealing (QA) Schrödinger dynamics over a Glauber master-equation simulated annealing (SA) for a random Ising model in one dimension. Annealings are tackled on equal footing, by a deterministic dynamics of the resulting Jordan-Wigner fermionic problems. We find that disorder, without frustration, makes both SA and real-time QA logarithmically slow in the annealing time τ, but QA shows a quadratic speedup with respect to SA. We also find that an imaginary-time Schrödinger QA dynamics provides a further exponential speedup, with an asymptotic residual error compatible with a power-law τ^(−μ).

  8. Simulated Annealing using Hybrid Monte Carlo

    OpenAIRE

    Salazar, Rafael; Toral, Raúl

    1997-01-01

    We propose a variant of the simulated annealing method for optimization in the multivariate analysis of differentiable functions. The method uses global updates via the hybrid Monte Carlo algorithm, in its generalized version, for the proposal of new configurations. We show how this choice can improve upon the performance of simulated annealing methods (mainly when the number of variables is large) by allowing a more effective searching scheme and a faster annealing schedule.

  9. Cylinder packing by simulated annealing

    Directory of Open Access Journals (Sweden)

    M. Helena Correia

    2000-12-01

    This paper is motivated by the problem of loading identical items of circular base (tubes, rolls, ...) onto a rectangular base (the pallet). For practical reasons, all the loaded items are considered to have the same height. Solving this problem consists in determining the positioning pattern of the circular bases of the items on the rectangular pallet while maximizing the number of items. This pattern is repeated for each layer stacked on the pallet. Two algorithms based on the Simulated Annealing meta-heuristic have been developed and implemented. Tuning the parameters of these algorithms required intensive tests in order to improve their efficiency. The algorithms developed were easily extended to the case of non-identical circles.

  10. Quantum annealing speedup over simulated annealing on random Ising chains

    Science.gov (United States)

    Zanca, Tommaso; Santoro, Giuseppe E.

    2016-06-01

    We show clear evidence of a quadratic speedup of a quantum annealing (QA) Schrödinger dynamics over a Glauber master equation simulated annealing (SA) for a random Ising model in one dimension, via an equal-footing exact deterministic dynamics of the Jordan-Wigner fermionized problems. This is remarkable, in view of the arguments of H. G. Katzgraber et al. [Phys. Rev. X 4, 021008 (2014), 10.1103/PhysRevX.4.021008], since SA does not encounter any phase transition, while QA does. We also find a second remarkable result: that a "quantum-inspired" imaginary-time Schrödinger QA provides a further exponential speedup, i.e., an asymptotic residual error decreasing as a power law τ^(−μ) of the annealing time τ.

  11. Classical Simulated Annealing Using Quantum Analogues

    Science.gov (United States)

    La Cour, Brian R.; Troupe, James E.; Mark, Hans M.

    2016-06-01

    In this paper we consider the use of certain classical analogues to quantum tunneling behavior to improve the performance of simulated annealing on a discrete spin system of the general Ising form. Specifically, we consider the use of multiple simultaneous spin flips at each annealing step as an analogue to quantum spin coherence as well as modifications of the Boltzmann acceptance probability to mimic quantum tunneling. We find that the use of multiple spin flips can indeed be advantageous under certain annealing schedules, but only for long anneal times.
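
    A small sketch of the multi-spin-flip move discussed here, applied to a 1-D Ising chain with the standard Boltzmann acceptance rule (the paper's modified acceptance probabilities that mimic tunneling are not reproduced); the chain length, coupling, schedule, and number of simultaneous flips are illustrative assumptions.

        import math, random

        def ising_energy(spins, J=1.0):
            """Nearest-neighbour Ising energy of a 1-D chain with periodic boundaries."""
            n = len(spins)
            return -J * sum(spins[i] * spins[(i + 1) % n] for i in range(n))

        def anneal_multiflip(n=64, flips_per_move=3, n_steps=20000, t0=2.0, t_end=0.01):
            rng = random.Random(1)
            spins = [rng.choice((-1, 1)) for _ in range(n)]
            e = ising_energy(spins)
            for step in range(n_steps):
                t = t0 + (t_end - t0) * step / n_steps         # linear annealing schedule
                sites = rng.sample(range(n), flips_per_move)   # several simultaneous spin flips
                for i in sites:
                    spins[i] = -spins[i]
                e_new = ising_energy(spins)
                if e_new <= e or rng.random() < math.exp(-(e_new - e) / t):
                    e = e_new                                   # accept the multi-flip move
                else:
                    for i in sites:                             # reject: undo the flips
                        spins[i] = -spins[i]
            return spins, e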

  12. Classical Simulated Annealing Using Quantum Analogues

    Science.gov (United States)

    La Cour, Brian R.; Troupe, James E.; Mark, Hans M.

    2016-08-01

    In this paper we consider the use of certain classical analogues to quantum tunneling behavior to improve the performance of simulated annealing on a discrete spin system of the general Ising form. Specifically, we consider the use of multiple simultaneous spin flips at each annealing step as an analogue to quantum spin coherence as well as modifications of the Boltzmann acceptance probability to mimic quantum tunneling. We find that the use of multiple spin flips can indeed be advantageous under certain annealing schedules, but only for long anneal times.

  13. Simulated Quantum Annealing Can Be Exponentially Faster than Classical Simulated Annealing

    OpenAIRE

    Crosson, Elizabeth; Harrow, Aram W.

    2016-01-01

    Simulated Quantum Annealing (SQA) is a Markov Chain Monte-Carlo algorithm that samples the equilibrium thermal state of a Quantum Annealing (QA) Hamiltonian. In addition to simulating quantum systems, SQA has also been proposed as another physics-inspired classical algorithm for combinatorial optimization, alongside classical simulated annealing. However, in many cases it remains an open challenge to determine the performance of both QA and SQA. One piece of evidence for the strength of Q...

  14. An Introduction to Simulated Annealing

    Science.gov (United States)

    Albright, Brian

    2007-01-01

    An attempt to model the physical process of annealing led to the development of a type of combinatorial optimization algorithm that takes on the problem of getting trapped in a local minimum. The author presents a Microsoft Excel spreadsheet that illustrates how this works.

  15. Stochastic annealing simulation of cascades in metals

    Energy Technology Data Exchange (ETDEWEB)

    Heinisch, H.L.

    1996-04-01

    The stochastic annealing simulation code ALSOME is used to investigate quantitatively the differential production of mobile vacancy and SIA defects as a function of temperature for isolated 25 keV cascades in copper generated by MD simulations. The ALSOME code and cascade annealing simulations are described. The annealing simulations indicate that above Stage V, where the cascade vacancy clusters are unstable, nearly 80% of the post-quench vacancies escape the cascade volume, while about half of the post-quench SIAs remain in clusters. The results are sensitive to the relative fractions of SIAs that occur in small, highly mobile clusters and large stable clusters, respectively, which may be dependent on the cascade energy.

  16. Constrained multi-global optimization using a penalty stretched simulated annealing framework

    OpenAIRE

    Pereira, Ana I.; Edite M.G.P. Fernandes

    2009-01-01

    This paper presents a new simulated annealing algorithm to solve constrained multi-global optimization problems. To compute all global solutions in a sequential manner, we combine the function stretching technique with the adaptive simulated annealing variant. Constraint-handling is carried out through a nondifferentiable penalty function. To benchmark our penalty stretched simulated annealing algorithm we solve a set of well-known problems. Our preliminary numerical results show that the alg...

  17. Simulated Annealing with Tsallis Weights - A Numerical Comparison

    OpenAIRE

    Hansmann, Ulrich H.E.

    1997-01-01

    We discuss the use of Tsallis generalized mechanics in simulated annealing algorithms. For a small peptide it is shown that older implementations are not more effective than regular simulated annealing in finding ground state configurations. We propose a new implementation which leads to an improvement over regular simulated annealing.

  18. Simulated annealing in orbital flight planning

    Science.gov (United States)

    Soller, Jeffrey

    1990-01-01

    Simulated annealing is used to solve a minimum fuel trajectory problem in the space station environment. The environment is unique because the space station will define the first true multivehicle environment in space. The optimization yields surfaces which are potentially complex, with multiple local minima. Because of the likelihood of these local minima, descent techniques are unable to offer robust solutions. Other deterministic optimization techniques were explored without success. The simulated annealing optimization is capable of identifying a minimum-fuel, two-burn trajectory subject to four constraints. Furthermore, the computational effort involved in the optimization is such that missions could be planned on board the space station. Potential applications could include the on-site planning of rendezvous with a target craft or the emergency rescue of an astronaut. Future research will include multiwaypoint maneuvers, using a knowledge base to guide the optimization.

  19. Code Generator for Quantum Simulated Annealing

    CERN Document Server

    Tucci, Robert R

    2009-01-01

    This paper introduces QuSAnn v1.2 and Multiplexor Expander v1.2, two Java applications available for free. (Source code included in the distribution.) QuSAnn is a "code generator" for quantum simulated annealing: after the user inputs some parameters, it outputs a quantum circuit for performing simulated annealing on a quantum computer. The quantum circuit implements the algorithm of Wocjan et al. (arXiv:0804.4259), which improves on the original algorithm of Somma et al. (arXiv:0712.1008). The quantum circuit generated by QuSAnn includes some quantum multiplexors. The application Multiplexor Expander allows the user to replace each of those multiplexors by a sequence of more elementary gates such as multiply controlled NOTs and qubit rotations.

  20. Simulated annealing algorithm for optimal capital growth

    Science.gov (United States)

    Luo, Yong; Zhu, Bo; Tang, Yong

    2014-08-01

    We investigate the problem of dynamic optimal capital growth of a portfolio. A general framework is developed in which one strives to maximize the expected logarithmic utility of the long-term growth rate. Exact optimization algorithms run into difficulties in this framework, and this motivates the investigation of a simulated annealing algorithm to optimize the capital growth of a given portfolio. Empirical results with real financial data indicate that the approach is promising for capital growth portfolios.
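
    To make the setup concrete, the sketch below maximizes the average logarithm of portfolio growth over a matrix of historical gross returns with a simulated annealing search on the weight simplex; it is an assumed simplification of the framework described above, and the synthetic return data and tuning constants are placeholders.

        import numpy as np

        def log_growth(weights, gross_returns):
            """Average log of portfolio growth per period (the quantity to maximize)."""
            return float(np.mean(np.log(gross_returns @ weights)))

        def anneal_weights(gross_returns, n_iter=5000, t0=0.1, cooling=0.999, seed=0):
            rng = np.random.default_rng(seed)
            n_assets = gross_returns.shape[1]
            w = np.full(n_assets, 1.0 / n_assets)        # start from equal weights
            g = log_growth(w, gross_returns)
            t = t0
            for _ in range(n_iter):
                cand = np.clip(w + 0.05 * rng.standard_normal(n_assets), 1e-6, None)
                cand /= cand.sum()                        # stay on the weight simplex
                g_cand = log_growth(cand, gross_returns)
                # maximization form of the Metropolis rule
                if g_cand > g or rng.random() < np.exp((g_cand - g) / t):
                    w, g = cand, g_cand
                t *= cooling
            return w, g

        # hypothetical gross returns for 3 assets over 250 periods
        rets = 1.0 + 0.01 * np.random.default_rng(1).standard_normal((250, 3))
        best_w, best_g = anneal_weights(rets)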

  1. MEDICAL STAFF SCHEDULING USING SIMULATED ANNEALING

    OpenAIRE

    Ladislav Rosocha; Silvia Vernerova; Robert Verner

    2015-01-01

    Purpose: The efficiency of medical staff is a fundamental feature of healthcare facility quality. Therefore, better implementation of their preferences into the scheduling problem might not only raise the work-life balance of doctors and nurses, but may also result in better patient care. This paper focuses on the optimization of medical staff preferences in the scheduling problem. Methodology/Approach: We propose a medical staff scheduling algorithm based on simulated annealing, a...

  2. MEDICAL STAFF SCHEDULING USING SIMULATED ANNEALING

    OpenAIRE

    Ladislav Rosocha; Silvia Vernerova; Robert Verner

    2015-01-01

    Purpose: The efficiency of medical staff is a fundamental feature of healthcare facility quality. Therefore, better implementation of their preferences into the scheduling problem might not only raise the work-life balance of doctors and nurses, but may also result in better patient care. This paper focuses on the optimization of medical staff preferences in the scheduling problem. Methodology/Approach: We propose a medical staff scheduling algorithm based on simulated annealing, a ...

  3. Parallel simulated annealing algorithms for cell placement on hypercube multiprocessors

    Science.gov (United States)

    Banerjee, Prithviraj; Jones, Mark Howard; Sargent, Jeff S.

    1990-01-01

    Two parallel algorithms for standard cell placement using simulated annealing are developed to run on distributed-memory message-passing hypercube multiprocessors. Cells in a two-dimensional chip area can be mapped onto processors in an n-dimensional hypercube in two ways, such that both small and large cell exchange and displacement moves can be applied. The computation of the cost function in parallel among all the processors in the hypercube is described, along with a distributed data structure that needs to be stored in the hypercube to support the parallel cost evaluation. A novel tree broadcasting strategy is used extensively for updating cell locations in the parallel environment. A dynamic parallel annealing schedule estimates the errors due to interacting parallel moves and adapts the rate of synchronization automatically. Two novel approaches to controlling error in parallel algorithms are described: heuristic cell coloring and adaptive sequence control.

  4. Binary Sparse Phase Retrieval via Simulated Annealing

    Directory of Open Access Journals (Sweden)

    Wei Peng

    2016-01-01

    This paper presents the Simulated Annealing Sparse PhAse Recovery (SASPAR) algorithm for reconstructing sparse binary signals from the phaseless magnitudes of their Fourier transform. A greedy-strategy version, which is a parameter-free algorithm, is also proposed for comparison. Numerical simulations indicate that our method is quite effective and suggest that the binary model is robust. The SASPAR algorithm seems competitive with existing methods in its efficiency and high recovery rate, even with fewer Fourier measurements.

  5. Hypocoercivity in metastable settings and kinetic simulated annealing

    OpenAIRE

    Monmarché, Pierre

    2015-01-01

    Classical analysis of the simulated annealing algorithm is combined with the more recent hypocoercive method of distorted entropy to prove the convergence for large time of the kinetic Langevin annealing with logarithmic cooling schedule.

  6. Reactor controller design using genetic algorithms with simulated annealing

    International Nuclear Information System (INIS)

    This chapter presents a digital control system for the ITU TRIGA Mark-II reactor using genetic algorithms with simulated annealing. The basic principles of genetic algorithms for problem solving are inspired by the mechanism of natural selection. Natural selection is a biological process in which stronger individuals are likely to be the winners in a competing environment. Genetic algorithms use a direct analogy of natural evolution. Genetic algorithms are global search techniques for optimisation, but they are poor at hill-climbing. Simulated annealing has the ability of probabilistic hill-climbing. Thus, the two techniques are combined here to get a fine-tuned algorithm that yields a faster convergence and a more accurate search, by introducing a new mutation operator like simulated annealing or an adaptive cooling schedule. In control system design, there are currently no systematic approaches to choosing the controller parameters to obtain the desired performance. The controller parameters are usually determined by trial and error with simulation and experimental analysis. A genetic algorithm is used to automatically and efficiently search for a set of controller parameters for better performance. (orig.)

  7. Simulated annealing algorithm for detecting graph isomorphism

    Institute of Scientific and Technical Information of China (English)

    Geng Xiutang; Zhang Kai

    2008-01-01

    Evolutionary computation techniques have mostly been used to solve various optimization problems, and it is well known that the graph isomorphism problem (GIP) is a nondeterministic polynomial problem. A simulated annealing (SA) algorithm for detecting graph isomorphism is proposed, and the proposed SA algorithm is well suited to dealing with random graphs of large size. To verify the validity of the proposed SA algorithm, simulations are performed on three pairs of small graphs and four pairs of large random graphs with edge densities 0.5, 0.1, and 0.01, respectively. The simulation results show that the proposed SA algorithm can detect graph isomorphism with a high probability.

  8. Simulated Annealing of Two Electron Density Solution Systems

    OpenAIRE

    Neto, Mario de Oliveira; Alonso, Ronaldo Luiz; Leite, Fabio Lima; Jr, Osvaldo N. Oliveira; Polikarpov, Igor; Mascarenhas, Yvonne Primerano

    2008-01-01

    Many structural studies have been performed with a combination of SAXS and simulated annealing to reconstruct three dimensional models. Simulated annealing is suitable for the study of monodisperse, diluted and two-electron densities systems. In this chapter we showed how the simulated annealing procedure can be used to minimize the discrepancy between two functions: the simulated intensity and the experimental one-dimensional SAXS curve. The goal was to find the most probable form for a prot...

  9. MEDICAL STAFF SCHEDULING USING SIMULATED ANNEALING

    Directory of Open Access Journals (Sweden)

    Ladislav Rosocha

    2015-07-01

    Purpose: The efficiency of medical staff is a fundamental feature of healthcare facility quality. Therefore, better implementation of their preferences into the scheduling problem might not only raise the work-life balance of doctors and nurses, but may also result in better patient care. This paper focuses on the optimization of medical staff preferences in the scheduling problem. Methodology/Approach: We propose a medical staff scheduling algorithm based on simulated annealing, a well-known method from statistical thermodynamics. We define hard constraints, which are linked to legal and working regulations, and minimize the violations of soft constraints, which are related to the quality of work, psychological well-being, and work-life balance of the staff. Findings: On a sample of 60 physicians and nurses from a gynecology department, we generated monthly schedules and optimized their preferences in terms of soft constraints. Our results indicate that the final value of the objective function optimized by the proposed algorithm has more than 18 times fewer soft-constraint violations than the initially generated random schedule that satisfied the hard constraints. Research Limitation/Implication: Even though the global optimality of the final outcome is not guaranteed, a desirable solution was obtained in reasonable time. Originality/Value of paper: We show that the designed algorithm is able to successfully generate schedules with regard to hard and soft constraints. Moreover, the presented method is significantly faster than standard schedule generation and is able to reschedule effectively due to the local neighborhood search characteristics of simulated annealing.
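
    The hard/soft split described above can be illustrated with a toy sketch (the department's real constraints, move set, and weights are not public here, so everything below is an assumed simplification): hard feasibility is taken to be encoded in the set of allowed shifts, and the annealing objective counts violations of staff preferences.

        import math, random

        def soft_violations(schedule, preferences):
            """Count soft-constraint violations: assignments a staff member dislikes.

            schedule    : dict mapping (staff, day) -> shift id
            preferences : dict mapping (staff, day) -> set of unwanted shift ids
            """
            return sum(1 for key, shift in schedule.items()
                       if shift in preferences.get(key, set()))

        def anneal_schedule(schedule, preferences, allowed_shifts,
                            n_iter=10000, t0=5.0, cooling=0.999, seed=42):
            rng = random.Random(seed)
            cost = soft_violations(schedule, preferences)
            t = t0
            cells = list(schedule)
            for _ in range(n_iter):
                key = rng.choice(cells)
                old = schedule[key]
                schedule[key] = rng.choice(allowed_shifts)   # move within the hard-feasible shifts
                new_cost = soft_violations(schedule, preferences)
                if new_cost <= cost or rng.random() < math.exp(-(new_cost - cost) / t):
                    cost = new_cost                          # accept the reassignment
                else:
                    schedule[key] = old                      # reject: restore the previous shift
                t *= cooling
            return schedule, cost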

  10. Measures of Fault Tolerance in Distributed Simulated Annealing

    OpenAIRE

    Prakash, Aaditya

    2012-01-01

    In this paper, we examine different measures of fault tolerance in a distributed simulated annealing process. Optimization by simulated annealing on a distributed system is prone to various sources of failure. We analyse the simulated annealing algorithm, its architecture on a distributed platform, and potential sources of failure. We examine the behaviour of a fault-tolerant distributed system for the optimization task. We present possible methods to overcome the failures and achieve fault tolerance for t...

  11. A Parallel Genetic Simulated Annealing Hybrid Algorithm for Task Scheduling

    Institute of Scientific and Technical Information of China (English)

    SHU Wanneng; ZHENG Shijue

    2006-01-01

    This paper combines the advantages of the genetic algorithm and simulated annealing to bring forward a parallel genetic simulated annealing hybrid algorithm (PGSAHA), which is applied to solve the task scheduling problem in grid computing. It first generates a new group of individuals through genetic operations such as reproduction, crossover, and mutation, and then applies simulated annealing independently to all the generated individuals. When the temperature no longer falls during the cooling process, the result is taken as the overall optimal solution. From the analysis and experimental results, it is concluded that this algorithm is superior to both the genetic algorithm and simulated annealing.

  12. Hierarchical Network Design Using Simulated Annealing

    DEFF Research Database (Denmark)

    Thomadsen, Tommy; Clausen, Jens

    2002-01-01

    The hierarchical network problem is the problem of finding the least-cost network, with nodes divided into groups, edges connecting nodes in each group, and groups ordered in a hierarchy. The idea of hierarchical networks comes from telecommunication networks where hierarchies exist. Hierarchical networks are described and a mathematical model is proposed for a two-level version of the hierarchical network problem. The problem is to determine which edges should connect nodes, and how demand is routed in the network. The problem is solved heuristically using simulated annealing, which as a sub-algorithm uses a construction algorithm to determine edges and route the demand. Performance for different versions of the algorithm is reported in terms of runtime and quality of the solutions. The algorithm is able to find solutions of reasonable quality in approximately 1 hour for networks with 100 nodes.

  13. Simulation of annealed polyelectrolytes in poor solvents

    International Nuclear Information System (INIS)

    We present (semi-)grand canonical Monte Carlo simulations on annealed polyelectrolytes in poor solvent. Increasing the chemical potential of the charges, which is equal to the pH of the solution except for a trivial additive constant, in rather poor solvents, we find the first-order phase transition between a weakly charged globule and a highly charged extended chain predicted by theory. In the close-to-Θ-point regime, we investigate under which conditions pearl-necklace structures are stable. Most of the pearl-necklace parameters are found to obey the scaling relations predicted for quenched polyelectrolytes. However, similarly to the behavior known for this class of polyelectrolytes, we obtain large fluctuations in pearl number and size. In agreement with theoretical predictions we find a non-uniform charge distribution between pearls and strings.

  14. Tunneling through high energy barriers in simulated quantum annealing

    OpenAIRE

    Crosson, Elizabeth; Deng, Mingkai

    2014-01-01

    We analyze the performance of simulated quantum annealing (SQA) on an optimization problem for which simulated classical annealing (SA) is provably inefficient because of a high energy barrier. We present evidence that SQA can pass through this barrier to find the global minimum efficiently. This demonstrates the potential for SQA to inherit some of the advantages of quantum annealing (QA), since this problem has been previously shown to be efficiently solvable by quantum adiabatic optimization.

  15. Simulated Annealing for Location Area Planning in Cellular networks

    OpenAIRE

    Prajapati, N. B.; R. R. Agravat; Hasan, M I

    2010-01-01

    LA planning in cellular networks is useful for minimizing the location management cost in GSM networks. In fact, the size of an LA can be optimized to create a balance between the LA update rate and the expected paging rate within the LA. To get optimal results for LA planning in cellular networks, a simulated annealing algorithm is used. Simulated annealing gives optimal results in acceptable run time.

  16. Simulated Annealing for Location Area Planning in Cellular networks

    Directory of Open Access Journals (Sweden)

    N. B. Prajapati

    2010-03-01

    Full Text Available LA planning in a cellular network is useful for minimizing the location management cost in a GSM network. In fact, the size of an LA can be optimized to create a balance between the LA update rate and the expected paging rate within the LA. To get an optimal result for LA planning in a cellular network, a simulated annealing algorithm is used. Simulated annealing gives optimal results in acceptable run-time.

  17. Remote sensing of atmospheric duct parameters using simulated annealing

    Institute of Scientific and Technical Information of China (English)

    Zhao Xiao-Feng; Huang Si-Xun; Xiang Jie; Shi Wei-Lai

    2011-01-01

    Simulated annealing is one of the robust optimization schemes. Simulated annealing mimics the annealing process of the slow cooling of a heated metal to reach a stable minimum energy state. In this paper, we adopt simulated annealing to study the problem of the remote sensing of atmospheric duct parameters for two different geometries of propagation measurement. One is from a single emitter to an array of radio receivers (vertical measurements), and the other is from the radar clutter returns (horizontal measurements). Basic principles of simulated annealing and its applications to refractivity estimation are introduced. The performance of this method is validated using numerical experiments and field measurements collected at the East China Sea. The retrieved results demonstrate the feasibility of simulated annealing for near real-time atmospheric refractivity estimation. For comparison, the retrievals of the genetic algorithm are also presented. The comparisons indicate that the convergence speed of simulated annealing is faster than that of the genetic algorithm, while the anti-noise ability of the genetic algorithm is better than that of simulated annealing.

  18. Surface Structure of Hydroxyapatite from Simulated Annealing Molecular Dynamics Simulations.

    Science.gov (United States)

    Wu, Hong; Xu, Dingguo; Yang, Mingli; Zhang, Xingdong

    2016-05-10

    The surface structure of hydroxyapatite (HAP) is crucial for its bioactivity. Using a molecular dynamics simulated annealing method, we studied the structure and its variation with annealing temperature of the HAP (100) surface. In contrast to the commonly used HAP surface model, which is sliced from HAP crystal and then relaxed at 0 K with first-principles or force-field calculations, a new surface structure with gradual changes from ordered inside to disordered on the surface was revealed. The disordering is dependent on the annealing temperature, Tmax. When Tmax increases up to the melting point, which was usually adopted in experiments, the disordering increases, as reflected by its radial distribution functions, structural factors, and atomic coordination numbers. The disordering of annealed structures does not show significant changes when Tmax is above the melting point. The thickness of disordered layers is about 10 Å. The surface energy of the annealed structures at high temperature is significantly less than that of the crystal structure relaxed at room temperature. A three-layer model of interior, middle, and surface was then proposed to describe the surface structure of HAP. The interior layer retains the atomic configurations in crystal. The middle layer has its atoms moved and its groups rotated about their original locations. In the surface layer, the atomic arrangements are totally different from those in crystal. In particular for the hydroxyl groups, they move outward and cover the Ca(2+) ions, leaving holes occupied by the phosphate groups. Our study suggested a new model with disordered surface structures for studying the interaction of HAP-based biomaterials with other molecules. PMID:27096760

  19. List-Based Simulated Annealing Algorithm for Traveling Salesman Problem.

    Science.gov (United States)

    Zhan, Shi-hua; Lin, Juan; Zhang, Ze-jun; Zhong, Yi-wen

    2016-01-01

    The simulated annealing (SA) algorithm is a popular intelligent optimization algorithm which has been successfully applied in many fields. Parameter setting is a key factor for its performance, but it is also tedious work. To simplify parameter setting, we present a list-based simulated annealing (LBSA) algorithm to solve the traveling salesman problem (TSP). The LBSA algorithm uses a novel list-based cooling schedule to control the decrease of temperature. Specifically, a list of temperatures is created first, and then the maximum temperature in the list is used by the Metropolis acceptance criterion to decide whether to accept a candidate solution. The temperature list is adapted iteratively according to the topology of the solution space of the problem. The effectiveness and the parameter sensitivity of the list-based cooling schedule are illustrated through benchmark TSP problems. The LBSA algorithm, whose performance is robust over a wide range of parameter values, shows competitive performance compared with some other state-of-the-art algorithms. PMID:27034650
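
    The cooling mechanism described above can be sketched as follows in Python. This is an illustrative reading of the list-based scheme (an initial temperature list built from random perturbations, the list maximum used in the Metropolis test, and that maximum replaced by the average temperature implied by accepted worse moves); the exact adaptation rule and all parameter values are assumptions, not the authors' reference implementation.

```python
import math
import random

def tour_length(tour, pts):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt_neighbor(tour):
    i, j = sorted(random.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

def list_based_sa(pts, list_len=40, outer_iters=200, inner_iters=100):
    tour = list(range(len(pts)))
    random.shuffle(tour)
    cost = tour_length(tour, pts)
    # Initial temperature list from the cost changes of random perturbations.
    temps = [abs(tour_length(two_opt_neighbor(tour), pts) - cost) or 1.0
             for _ in range(list_len)]
    best, best_cost = tour, cost
    for _ in range(outer_iters):
        t_max = max(temps)
        accepted_temps = []
        for _ in range(inner_iters):
            cand = two_opt_neighbor(tour)
            cand_cost = tour_length(cand, pts)
            delta = cand_cost - cost
            if delta <= 0:
                tour, cost = cand, cand_cost
            else:
                r = random.random()
                if 0.0 < r < math.exp(-delta / t_max):
                    # Record the temperature that would make this move borderline.
                    accepted_temps.append(-delta / math.log(r))
                    tour, cost = cand, cand_cost
            if cost < best_cost:
                best, best_cost = tour, cost
        if accepted_temps:
            # Adapt the list: replace the current maximum with the average
            # temperature recorded for the accepted worse moves.
            temps.remove(t_max)
            temps.append(sum(accepted_temps) / len(accepted_temps))
    return best, best_cost

if __name__ == "__main__":
    random.seed(0)
    cities = [(random.random(), random.random()) for _ in range(30)]
    tour, length = list_based_sa(cities)
    print(round(length, 3))
```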

  20. A NEW GENETIC SIMULATED ANNEALING ALGORITHM FOR FLOOD ROUTING MODEL

    Institute of Scientific and Technical Information of China (English)

    KANG Ling; WANG Cheng; JIANG Tie-bing

    2004-01-01

    In this paper, a new approach, the Genetic Simulated Annealing (GSA), was proposed for optimizing the parameters in the Muskingum routing model. By integrating the simulated annealing method into the genetic algorithm, the hybrid method could avoid some troubles of traditional methods, such as the arduous trial-and-error procedure, premature convergence in the genetic algorithm and search blindness in simulated annealing. The principle and implementing procedure of this algorithm were described. Numerical experiments show that the GSA can adjust the optimization population, prevent premature convergence and seek the global optimal result. Applications to the Nanyunhe River and Qingjiang River show that the proposed approach offers higher forecast accuracy and practicability.

  1. Stochastic search in structural optimization - Genetic algorithms and simulated annealing

    Science.gov (United States)

    Hajela, Prabhat

    1993-01-01

    An account is given of illustrative applications of genetic algorithms and simulated annealing methods in structural optimization. The advantages of such stochastic search methods over traditional mathematical programming strategies are emphasized; it is noted that these methods offer a significantly higher probability of locating the global optimum in a multimodal design space. Both genetic-search and simulated annealing can be effectively used in problems with a mix of continuous, discrete, and integer design variables.

  2. An improved simulated annealing algorithm for standard cell placement

    Science.gov (United States)

    Jones, Mark; Banerjee, Prithviraj

    1988-01-01

    Simulated annealing is a general purpose Monte Carlo optimization technique that was applied to the problem of placing standard logic cells in a VLSI chip so that the total interconnection wire length is minimized. An improved standard cell placement algorithm that takes advantage of the performance enhancements that appear to come from parallelizing the uniprocessor simulated annealing algorithm is presented. An outline of this algorithm is given.

  3. Nonsmooth trajectory optimization - An approach using continuous simulated annealing

    Science.gov (United States)

    Lu, Ping; Khan, M. A.

    1993-01-01

    An account is given of the properties of a continuous simulated annealing algorithm that can function as a global optimization tool for nonsmooth dynamic systems, as shown in the case of a trajectory-optimization program implementation. The approach is shown to successfully solve the problem of nonsmooth trajectory optimization for a high performance rigid-body aircraft. The results obtained demonstrate the superiority of the simulated annealing algorithm over widely used algorithms.

  4. Model based matching using simulated annealing and a minimum representation size criterion

    Science.gov (United States)

    Ravichandran, B.; Sanderson, A. C.

    1992-01-01

    We define the model based matching problem in terms of the correspondence and transformation that relate the model and scene, and the search and evaluation measures needed to find the best correspondence and transformation. Simulated annealing is proposed as a method for search and optimization, and the minimum representation size criterion is used as the evaluation measure in an algorithm that finds the best correspondence. An algorithm based on simulated annealing is presented and evaluated. This algorithm is viewed as a part of an adaptive, hierarchical approach which provides robust results for a variety of model based matching problems.

  5. Music playlist generation by adapted simulated annealing

    NARCIS (Netherlands)

    Pauws, S.C.; Verhaegh, W.F.J.; Vossen, M.P.H.

    2008-01-01

    We present the design of an algorithm for use in an interactive music system that automatically generates music playlists that fit the music preferences of a user. To this end, we introduce a formal model, define the problem of automatic playlist generation (APG), and prove its NP-hardness. We use a

  6. Quantum versus simulated annealing in wireless interference network optimization.

    Science.gov (United States)

    Wang, Chi; Chen, Huo; Jonckheere, Edmond

    2016-01-01

    Quantum annealing (QA) serves as a specialized optimizer that is able to solve many NP-hard problems and that is believed to have a theoretical advantage over simulated annealing (SA) via quantum tunneling. With the introduction of the D-Wave programmable quantum annealer, a considerable amount of effort has been devoted to detect and quantify quantum speedup. While the debate over speedup remains inconclusive as of now, instead of attempting to show general quantum advantage, here, we focus on a novel real-world application of D-Wave in wireless networking-more specifically, the scheduling of the activation of the air-links for maximum throughput subject to interference avoidance near network nodes. In addition, D-Wave implementation is made error insensitive by a novel Hamiltonian extra penalty weight adjustment that enlarges the gap and substantially reduces the occurrence of interference violations resulting from inevitable spin bias and coupling errors. The major result of this paper is that quantum annealing benefits more than simulated annealing from this gap expansion process, both in terms of ST99 speedup and network queue occupancy. It is the hope that this could become a real-world application niche where potential benefits of quantum annealing could be objectively assessed. PMID:27181056

  7. Quantum versus simulated annealing in wireless interference network optimization

    Science.gov (United States)

    Wang, Chi; Chen, Huo; Jonckheere, Edmond

    2016-05-01

    Quantum annealing (QA) serves as a specialized optimizer that is able to solve many NP-hard problems and that is believed to have a theoretical advantage over simulated annealing (SA) via quantum tunneling. With the introduction of the D-Wave programmable quantum annealer, a considerable amount of effort has been devoted to detect and quantify quantum speedup. While the debate over speedup remains inconclusive as of now, instead of attempting to show general quantum advantage, here, we focus on a novel real-world application of D-Wave in wireless networking—more specifically, the scheduling of the activation of the air-links for maximum throughput subject to interference avoidance near network nodes. In addition, D-Wave implementation is made error insensitive by a novel Hamiltonian extra penalty weight adjustment that enlarges the gap and substantially reduces the occurrence of interference violations resulting from inevitable spin bias and coupling errors. The major result of this paper is that quantum annealing benefits more than simulated annealing from this gap expansion process, both in terms of ST99 speedup and network queue occupancy. It is the hope that this could become a real-world application niche where potential benefits of quantum annealing could be objectively assessed.

  8. Quantum versus simulated annealing in wireless interference network optimization.

    Science.gov (United States)

    Wang, Chi; Chen, Huo; Jonckheere, Edmond

    2016-01-01

    Quantum annealing (QA) serves as a specialized optimizer that is able to solve many NP-hard problems and that is believed to have a theoretical advantage over simulated annealing (SA) via quantum tunneling. With the introduction of the D-Wave programmable quantum annealer, a considerable amount of effort has been devoted to detect and quantify quantum speedup. While the debate over speedup remains inconclusive as of now, instead of attempting to show general quantum advantage, here, we focus on a novel real-world application of D-Wave in wireless networking-more specifically, the scheduling of the activation of the air-links for maximum throughput subject to interference avoidance near network nodes. In addition, D-Wave implementation is made error insensitive by a novel Hamiltonian extra penalty weight adjustment that enlarges the gap and substantially reduces the occurrence of interference violations resulting from inevitable spin bias and coupling errors. The major result of this paper is that quantum annealing benefits more than simulated annealing from this gap expansion process, both in terms of ST99 speedup and network queue occupancy. It is the hope that this could become a real-world application niche where potential benefits of quantum annealing could be objectively assessed.

  9. On simulated annealing phase transitions in phylogeny reconstruction.

    Science.gov (United States)

    Strobl, Maximilian A R; Barker, Daniel

    2016-08-01

    Phylogeny reconstruction with global criteria is NP-complete or NP-hard, hence in general requires a heuristic search. We investigate the powerful, physically inspired, general-purpose heuristic simulated annealing, applied to phylogeny reconstruction. Simulated annealing mimics the physical process of annealing, where a liquid is gently cooled to form a crystal. During the search, periods of elevated specific heat occur, analogous to physical phase transitions. These simulated annealing phase transitions play a crucial role in the outcome of the search. Nevertheless, they have received comparatively little attention, for phylogeny or other optimisation problems. We analyse simulated annealing phase transitions during searches for the optimal phylogenetic tree for 34 real-world multiple alignments. In the same way in which melting temperatures differ between materials, we observe distinct specific heat profiles for each input file. We propose that this reflects differences in the search landscape and can serve as a measure of problem difficulty and of the suitability of the algorithm's parameters. We discuss application in algorithmic optimisation and as a diagnostic to assess parameterisation before computationally costly, large phylogeny reconstructions are launched. Whilst the focus here lies on phylogeny reconstruction under maximum parsimony, it is plausible that our results are more widely applicable to optimisation procedures in science and industry. PMID:27150349

  10. On simulated annealing phase transitions in phylogeny reconstruction.

    Science.gov (United States)

    Strobl, Maximilian A R; Barker, Daniel

    2016-08-01

    Phylogeny reconstruction with global criteria is NP-complete or NP-hard, hence in general requires a heuristic search. We investigate the powerful, physically inspired, general-purpose heuristic simulated annealing, applied to phylogeny reconstruction. Simulated annealing mimics the physical process of annealing, where a liquid is gently cooled to form a crystal. During the search, periods of elevated specific heat occur, analogous to physical phase transitions. These simulated annealing phase transitions play a crucial role in the outcome of the search. Nevertheless, they have received comparatively little attention, for phylogeny or other optimisation problems. We analyse simulated annealing phase transitions during searches for the optimal phylogenetic tree for 34 real-world multiple alignments. In the same way in which melting temperatures differ between materials, we observe distinct specific heat profiles for each input file. We propose that this reflects differences in the search landscape and can serve as a measure of problem difficulty and of the suitability of the algorithm's parameters. We discuss application in algorithmic optimisation and as a diagnostic to assess parameterisation before computationally costly, large phylogeny reconstructions are launched. Whilst the focus here lies on phylogeny reconstruction under maximum parsimony, it is plausible that our results are more widely applicable to optimisation procedures in science and industry.

  11. SIMULATED ANNEALING BASED POLYNOMIAL TIME QOS ROUTING ALGORITHM FOR MANETS

    Institute of Scientific and Technical Information of China (English)

    Liu Lianggui; Feng Guangzeng

    2006-01-01

    Multi-constrained Quality-of-Service (QoS) routing is a big challenge for Mobile Ad hoc Networks (MANETs), where the topology may change constantly. In this paper a novel QoS Routing Algorithm based on Simulated Annealing (SA_RA) is proposed. This algorithm first uses an energy function to translate multiple QoS weights into a single mixed metric and then seeks to find a feasible path by simulated annealing. The paper outlines the simulated annealing algorithm and analyzes the problems met when applying it to QoS routing (QoSR) in MANETs. Theoretical analysis and experimental results demonstrate that the proposed method is an effective approximation algorithm, showing better performance than other pertinent algorithms in seeking the (approximate) optimal configuration within polynomial time.

  12. Coordination Hydrothermal Interconnection Java-Bali Using Simulated Annealing

    Science.gov (United States)

    Wicaksono, B.; Abdullah, A. G.; Saputra, W. S.

    2016-04-01

    Hydrothermal power plant coordination aims to minimize the total cost of operating the system, which is represented by fuel cost, subject to constraints during optimization. To perform the optimization, there are several methods that can be used. Simulated Annealing (SA) is a method that can be used to solve optimization problems. This method was inspired by the annealing or cooling process in the manufacture of materials composed of crystals. The basic principle of hydrothermal power plant coordination is to use hydro power plants to support the base load while thermal power plants supply the remaining load. This study used two hydro power plant units and six thermal power plant units with 25 buses, calculating transmission losses and considering power limits in each power plant unit, aided by MATLAB software during the process. Hydrothermal power plant coordination using simulated annealing showed that the total cost of generation for 24 hours is 13,288,508.01.

  13. Thermal, quantum and simulated quantum annealing: analytical comparisons for simple models

    OpenAIRE

    Bapst, Victor; Semerjian, Guilhem

    2015-01-01

    We study various annealing dynamics, both classical and quantum, for simple mean-field models and explain how to describe their behavior in the thermodynamic limit in terms of differential equations. In particular we emphasize the differences between quantum annealing (i.e. evolution with Schrödinger equation) and simulated quantum annealing (i.e. annealing of a Quantum Monte Carlo simulation).

  14. Analysis of Trivium by a Simulated Annealing variant

    DEFF Research Database (Denmark)

    Borghoff, Julia; Knudsen, Lars Ramkilde; Matusiewicz, Krystian

    2010-01-01

    A characteristic of equation systems that may be efficiently solvable by means of such algorithms is provided. As an example, we investigate equation systems induced by the problem of recovering the internal state of the stream cipher Trivium. We propose an improved variant of the simulated annealing method...

  15. Function minimization with partially correct data via simulated annealing

    Science.gov (United States)

    Lorre, Jean J.

    1988-01-01

    The simulated annealing technique has been applied successfully to the problem of estimating the coefficients of a function in cases where only a portion of the data being fitted to the function is truly representative of the function, the rest being erroneous. Two examples are given, one in photometric function fitting and the other in pattern recognition. A schematic of the algorithm is provided.
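
    As a hedged illustration of fitting coefficients when part of the data is erroneous, the sketch below anneals a straight-line fit under a trimmed least-squares objective, so that outlying points cannot dominate the fit. The trimmed objective, the linear model and all parameters are assumptions made for this example; the paper's actual cost function and applications are not reproduced here.

```python
import math
import random

def trimmed_sse(params, data, keep_frac=0.6):
    """Sum of the smallest keep_frac fraction of squared residuals, so that
    erroneous points cannot dominate the fit (illustrative robust objective)."""
    a, b = params
    residuals = sorted((y - (a * x + b)) ** 2 for x, y in data)
    keep = max(1, int(keep_frac * len(residuals)))
    return sum(residuals[:keep])

def anneal_fit(data, iters=20000, t0=5.0, alpha=0.9995):
    params = [0.0, 0.0]
    cost, t = trimmed_sse(params, data), t0
    best, best_cost = params[:], cost
    for _ in range(iters):
        cand = [p + random.gauss(0.0, 0.2) for p in params]
        delta = trimmed_sse(cand, data) - cost
        if delta < 0 or random.random() < math.exp(-delta / t):
            params, cost = cand, cost + delta
            if cost < best_cost:
                best, best_cost = params[:], cost
        t *= alpha
    return best

if __name__ == "__main__":
    random.seed(0)
    # 70% of the points follow y = 2x + 1; the rest are gross outliers.
    data = [(x, 2 * x + 1 + random.gauss(0, 0.1)) for x in range(20)]
    data += [(x, random.uniform(-50, 50)) for x in range(6)]
    print(anneal_fit(data))
```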

  16. Physical Mapping Using Simulated Annealing and Evolutionary Algorithms

    DEFF Research Database (Denmark)

    Vesterstrøm, Jacob Svaneborg

    2003-01-01

    Physical mapping (PM) is a method of bioinformatics that assists in DNA sequencing. The goal is to determine the order of a collection of fragments taken from a DNA strand, given knowledge of certain unique DNA markers contained in the fragments. Simulated annealing (SA) is the most widely used...

  17. A Simulated Annealing Methodology for Clusterwise Linear Regression.

    Science.gov (United States)

    DeSarbo, Wayne S.; And Others

    1989-01-01

    A method is presented that simultaneously estimates cluster membership and corresponding regression functions for a sample of observations or subjects. This methodology is presented with the simulated annealing-based algorithm. A set of Monte Carlo analyses is included to demonstrate the performance of the algorithm. (SLD)

  18. Application of Simulated Annealing to Clustering Tuples in Databases.

    Science.gov (United States)

    Bell, D. A.; And Others

    1990-01-01

    Investigates the value of applying principles derived from simulated annealing to clustering tuples in database design, and compares this technique with a graph-collapsing clustering method. It is concluded that, while the new method does give superior results, the expense involved in algorithm run time is prohibitive. (24 references) (CLB)

  19. The afforestation problem: a heuristic method based on simulated annealing

    DEFF Research Database (Denmark)

    Vidal, Rene Victor Valqui

    1992-01-01

    This paper presents the afforestation problem, that is, the location and design of new forest compartments to be planted in a given area. This optimization problem is solved by a two-step heuristic method based on simulated annealing. Tests and experiences with this method are also presented.

  20. Meta-Modeling by Symbolic Regression and Pareto Simulated Annealing

    NARCIS (Netherlands)

    Stinstra, E.; Rennen, G.; Teeuwen, G.J.A.

    2006-01-01

    The subject of this paper is a new approach to Symbolic Regression. Other publications on Symbolic Regression use Genetic Programming. This paper describes an alternative method based on Pareto Simulated Annealing. Our method is based on linear regression for the estimation of constants. Interval arithmetic...

  1. Particle Based Image Segmentation with Simulated Annealing

    NARCIS (Netherlands)

    Everts, M.H.; Bekker, H.; Jalba, A.C.; Roerdink, J.B.T.M.

    2007-01-01

    The Charged Particle Model (CPM) is a physically motivated deformable model for shape recovery and segmentation. It simulates a system of charged particles moving in an electric field generated from the input image, whose positions in the equilibrium state are used for curve or surface reconstruction.

  2. Molecular dynamics simulation of annealed ZnO surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Min, Tjun Kit; Yoon, Tiem Leong [School of Physics, Universiti Sains Malaysia, 11800 USM, Penang (Malaysia); Lim, Thong Leng [Faculty of Engineering and Technology, Multimedia University, Melaka Campus, 75450 Melaka (Malaysia)

    2015-04-24

    The effect of thermally annealing a slab of wurtzite ZnO, terminated by two surfaces, (0001) (which is oxygen-terminated) and (0001̄) (which is Zn-terminated), is investigated via molecular dynamics simulation by using reactive force field (ReaxFF). We found that upon heating beyond a threshold temperature of ∼700 K, surface oxygen atoms begin to sublimate from the (0001) surface. The ratio of oxygen leaving the surface at a given temperature increases as the heating temperature increases. A range of phenomena occurring at the atomic level on the (0001) surface has also been explored, such as formation of oxygen dimers on the surface and evolution of partial charge distribution in the slab during the annealing process. It was found that the partial charge distribution as a function of the depth from the surface undergoes a qualitative change when the annealing temperature is above the threshold temperature.

  3. Application of simulated annealing algorithm to optimizing sequencing of operation steps

    Institute of Scientific and Technical Information of China (English)

    2000-01-01

    This paper discusses the optimization of machining operation sequencing by simulated annealing and builds a simulated annealing optimization model, from which a new way to optimize operation sequencing can be developed.

  4. Solving geometric constraints with genetic simulated annealing algorithm

    Institute of Scientific and Technical Information of China (English)

    刘生礼; 唐敏; 董金祥

    2003-01-01

    This paper applies a genetic simulated annealing algorithm (SAGA) to solving geometric constraint problems. This method makes full use of the advantages of SAGA and can handle under-/over-constrained problems naturally. It has an advantage over the Newton-Raphson method in that it is not sensitive to initial values, and its yielding of multiple solutions is an advantage over other optimization methods for multi-solution constraint systems. Our experiments have proved the robustness and efficiency of this method.

  5. Optimization of pipe networks including pumps by simulated annealing

    OpenAIRE

    Costa A.L.H.; Medeiros J.L.; Pessoa F.L.P.

    2000-01-01

    The objective of this work is to present an application of the simulated annealing method for the optimal design of pipe networks including pumps. Despite its importance, the optimization of pumped networks has not received great attention in the literature. The proposed search scheme explores the discrete space of the decision variables: pipe diameters and pump sizes. The behavior of the pumps is described through the characteristic curve, generating more realistic solutions. In order to demonstrate...

  6. Simulated Annealing for the 0/1 Multidimensional Knapsack Problem

    Institute of Scientific and Technical Information of China (English)

    Fubin Qian; Rui Ding

    2007-01-01

    In this paper a simulated annealing (SA) algorithm is presented for the 0/1 multidimensional knapsack problem. Problem-specific knowledge is incorporated in the algorithm description and evaluation of parameters in order to look into the performance of finite-time implementations of SA. Computational results show that SA performs much better than a genetic algorithm in terms of solution time, whilst having a modest loss of solution quality.
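
    A compact Python sketch of SA for the 0/1 multidimensional knapsack problem, offered as a hedged illustration of the record above: single-bit flip moves, infeasible candidates rejected outright, and a geometric cooling schedule. The move strategy, the feasibility handling and the parameter values are assumptions, not the paper's tuned implementation.

```python
import math
import random

def sa_mkp(values, weights, capacities, iters=20000, t0=50.0, alpha=0.9995):
    """Simulated annealing sketch for the 0/1 multidimensional knapsack problem."""
    n, m = len(values), len(capacities)

    def feasible(x):
        return all(sum(weights[j][i] * x[i] for i in range(n)) <= capacities[j]
                   for j in range(m))

    def value(x):
        return sum(values[i] * x[i] for i in range(n))

    x = [0] * n                      # the empty knapsack is always feasible
    best, best_val, t = x[:], 0, t0
    for _ in range(iters):
        cand = x[:]
        cand[random.randrange(n)] ^= 1   # flip one item in or out
        if feasible(cand):
            delta = value(cand) - value(x)
            # Maximisation: accept improvements, worse moves with Boltzmann probability.
            if delta >= 0 or random.random() < math.exp(delta / t):
                x = cand
                if value(x) > best_val:
                    best, best_val = x[:], value(x)
        t *= alpha
    return best, best_val

if __name__ == "__main__":
    random.seed(1)
    vals = [random.randint(10, 100) for _ in range(20)]
    wts = [[random.randint(1, 30) for _ in range(20)] for _ in range(3)]
    caps = [120, 130, 140]
    print(sa_mkp(vals, wts, caps)[1])
```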

  7. Convergence of simulated annealing by the generalized transition probability

    OpenAIRE

    Nishimori, Hidetoshi; Inoue, Jun-Ichi

    1998-01-01

    We prove weak ergodicity of the inhomogeneous Markov process generated by the generalized transition probability of Tsallis and Stariolo under power-law decay of the temperature. We thus have a mathematical foundation to conjecture convergence of simulated annealing processes with the generalized transition probability to the minimum of the cost function. An explicitly solvable example in one dimension is analyzed in which the generalized transition probability leads to a fast convergence of ...

  8. Simulated annealing spectral clustering algorithm for image segmentation

    Institute of Scientific and Technical Information of China (English)

    Yifang Yang; and Yuping Wang

    2014-01-01

    The similarity measure is crucial to the performance of spectral clustering. The Gaussian kernel function based on the Euclidean distance is usually adopted as the similarity measure. However, the Euclidean distance measure cannot fully reveal the complex distribution data, and the result of spectral clustering is very sensitive to the scaling parameter. To solve these problems, a new manifold distance measure and a novel simulated annealing spectral clustering (SASC) algorithm based on the manifold distance measure are proposed. The simulated annealing based on genetic algorithm (SAGA), characterized by its rapid convergence to the global optimum, is used to cluster the sample points in the spectral mapping space. The proposed algorithm can not only reflect local and global consistency better, but also reduce the sensitivity of spectral clustering to the kernel parameter, which improves the algorithm's clustering performance. To efficiently apply the algorithm to image segmentation, the Nyström method is used to reduce the computation complexity. Experimental results show that compared with traditional clustering algorithms and those popular spectral clustering algorithms, the proposed algorithm can achieve better clustering performances on several synthetic datasets, texture images and real images.

  9. Reticle Floorplanning and Simulated Wafer Dicing for Multiple-Project Wafers by Simulated Annealing

    OpenAIRE

    Lin, Rung-Bin; Wu, Meng-Chiou; Tsai, Shih-Cheng

    2008-01-01

    In this chapter we have demonstrated how simulated annealing is used to solve two NP-hard problems: the simulated wafer dicing and reticle floorplanning problems for MPW. For simulated wafer dicing, we suggest that HVMIS-SA-Z be employed to find the wafer dicing plans, especially for low-volume production. As for reticle floorplanning, BT-VOCO and ...

  10. Simulated annealing approach to the max cut problem

    Science.gov (United States)

    Sen, Sandip

    1993-03-01

    In this paper we address the problem of partitioning the nodes of a random graph into two sets, so as to maximize the sum of the weights on the edges connecting nodes belonging to different sets. This problem has important real-life counterparts, but has been proven to be NP-complete. As such, a number of heuristic solution techniques have been proposed in literature to address this problem. We propose a stochastic optimization technique, simulated annealing, to find solutions for the max cut problem. Our experiments verify that good solutions to the problem can be found using this algorithm in a reasonable amount of time.
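
    The max cut formulation above lends itself to a very small SA sketch: flip one node between the two sets, compute the resulting change in cut weight in O(n), and accept by the Metropolis rule. The following Python code is an illustrative sketch with assumed parameters and random test data, not the paper's implementation.

```python
import math
import random

def sa_max_cut(w, iters=20000, t0=2.0, alpha=0.9995):
    """Simulated annealing sketch for max cut on a symmetric weight matrix w."""
    n = len(w)
    side = [random.randint(0, 1) for _ in range(n)]   # random initial partition

    def cut_value(s):
        return sum(w[i][j] for i in range(n) for j in range(i + 1, n) if s[i] != s[j])

    cur = cut_value(side)
    best, best_val, t = side[:], cur, t0
    for _ in range(iters):
        i = random.randrange(n)
        # Moving node i to the other side changes the cut weight by delta:
        # same-side edges enter the cut (+), other-side edges leave it (-).
        delta = sum(w[i][j] * (1 if side[i] == side[j] else -1)
                    for j in range(n) if j != i)
        if delta >= 0 or random.random() < math.exp(delta / t):
            side[i] ^= 1
            cur += delta
            if cur > best_val:
                best, best_val = side[:], cur
        t *= alpha
    return best, best_val

if __name__ == "__main__":
    random.seed(0)
    n = 30
    w = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            w[i][j] = w[j][i] = random.random() if random.random() < 0.3 else 0.0
    print(round(sa_max_cut(w)[1], 3))
```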

  11. Stochastic annealing simulations of defect interactions among subcascades

    Energy Technology Data Exchange (ETDEWEB)

    Heinisch, H.L. [Pacific Northwest National Lab., Richland, WA (United States); Singh, B.N.

    1997-04-01

    The effects of the subcascade structure of high energy cascades on the temperature dependencies of annihilation, clustering and free defect production are investigated. The subcascade structure is simulated by closely spaced groups of lower energy MD cascades. The simulation results illustrate the strong influence of the defect configuration existing in the primary damage state on subsequent intracascade evolution. Other significant factors affecting the evolution of the defect distribution are the large differences in mobility and stability of vacancy and interstitial defects and the rapid one-dimensional diffusion of small, glissile interstitial loops produced directly in cascades. Annealing simulations are also performed on high-energy, subcascade-producing cascades generated with the binary collision approximation and calibrated to MD results.

  12. Simulated Annealing-Based Krill Herd Algorithm for Global Optimization

    Directory of Open Access Journals (Sweden)

    Gai-Ge Wang

    2013-01-01

    Full Text Available Recently, Gandomi and Alavi proposed a novel swarm intelligent method, called krill herd (KH), for global optimization. To enhance the performance of the KH method, in this paper, a new improved meta-heuristic simulated annealing-based krill herd (SKH) method is proposed for optimization tasks. A new krill selecting (KS) operator is used to refine krill behavior when updating the krill's position so as to enhance its reliability and robustness in dealing with optimization problems. The introduced KS operator involves a greedy strategy and accepting a few not-so-good solutions with a low probability, as originally used in simulated annealing (SA). In addition, a kind of elitism scheme is used to save the best individuals in the population in the process of krill updating. The merits of these improvements are verified by fourteen standard benchmarking functions, and experimental results show that, in most cases, the performance of this improved meta-heuristic SKH method is superior to, or at least highly competitive with, the standard KH and other optimization methods.

  13. Simulated Annealing Approach to the Temperature-Emissivity Separation Problem in Thermal Remote Sensing Part One: Mathematical Background

    CERN Document Server

    Morgan, John A

    2016-01-01

    The method of simulated annealing is adapted to the temperature-emissivity separation (TES) problem. A patch of surface at the bottom of the atmosphere is assumed to be a greybody emitter with spectral emissivity $\epsilon(k)$ describable by a mixture of spectral endmembers. We prove that a simulated annealing search conducted according to a suitable schedule converges to a solution maximizing the $\textit{A-Posteriori}$ probability that spectral radiance detected at the top of the atmosphere originates from a patch with stipulated $T$ and $\epsilon(k)$. Any such solution will be nonunique. The average of a large number of simulated annealing solutions, however, converges almost surely to a unique Maximum A-Posteriori solution for $T$ and $\epsilon(k)$. The limitation to a stipulated set of endmember emissivities may be relaxed by allowing the number of endmembers to grow without bound, and to be generic continuous functions of wavenumber with bounded first derivatives with respect to wavenumber.

  14. Time series forecasting using a TSK fuzzy system tuned with simulated annealing

    OpenAIRE

    Almaraashi, Majid; John, Robert; Coupland, Simon; Hopgood, Adrian

    2010-01-01

    In this paper, a combination of a Takagi-Sugeno fuzzy system (TSK) and simulated annealing is used to predict well-known time series by searching for the best configuration of the fuzzy system. Simulated annealing is used to optimise the parameters of the antecedent and the consequent parts of the fuzzy system rules. The results of the proposed method are encouraging, indicating that simulated annealing and fuzzy logic are able to combine well in time series prediction.

  15. UUV Bow Profile Optimization Design Based on Adaptive Simulated Annealing Algorithm

    Institute of Scientific and Technical Information of China (English)

    裴譞; 张宇文; 王亚东; 袁绪龙

    2011-01-01

    Based on the theory of global searching in the multi-objective domain, this paper presents a method of optimizing an unmanned underwater vehicle (UUV) bow profile by adopting the simulated annealing algorithm. This method takes UUV bow fullness, hydrodynamics and flow noise as the goal function. An integrated UUV bow shape optimal design model is established by utilizing classical viscous flow theory and a computational fluid dynamics (CFD) algorithm. The principle of choosing the complex shape parameters and design variables is described according to the requirements of UUV shape optimization. The optimization design of a UUV bow profile is simulated, and the result shows that with the present method we can effectively achieve the desired goals of UUV optimization design and obtain the optimal solution from a domain composed of mutually constraining objective functions. The method greatly improves the acoustic performance and the hydrodynamic characteristics of the UUV bow.

  16. Optimal placement of excitations and sensors by simulated annealing

    Science.gov (United States)

    Salama, Moktar; Bruno, R.; Chen, G.-S.; Garba, J.

    1989-01-01

    The optimal placement of discrete actuators and sensors is posed as a combinatorial optimization problem. Two examples for truss structures were used for illustration; the first dealt with the optimal placement of passive dampers along existing truss members, and the second dealt with the optimal placement of a combination of a set of actuators and a set of sensors. Except for the simplest problems, an exact solution by enumeration involves a very large number of function evaluations, and is therefore computationally intractable. By contrast, the simulated annealing heuristic involves far fewer evaluations and is best suited for the class of problems considered. As an optimization tool, the effectiveness of the algorithm is enhanced by introducing a number of rules that incorporate knowledge about the physical behavior of the problem. Some of the suggested rules are necessarily problem dependent.
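
    A hedged sketch of the combinatorial placement search described above: k of n candidate locations are selected, and SA explores swap moves that exchange one selected location for an unselected one. The merit function below is a toy stand-in for the structural performance criterion used in the paper; all names, data and parameters are assumptions.

```python
import math
import random

def sa_placement(n, k, merit, iters=5000, t0=1.0, alpha=0.999):
    """Choose k of n candidate locations by simulated annealing with swap moves.
    `merit(subset)` is the (to-be-maximised) performance measure supplied by the caller."""
    chosen = set(random.sample(range(n), k))
    cur = merit(chosen)
    best, best_val, t = set(chosen), cur, t0
    for _ in range(iters):
        out_loc = random.choice(tuple(chosen))
        in_loc = random.choice([i for i in range(n) if i not in chosen])
        cand = (chosen - {out_loc}) | {in_loc}          # swap move keeps |subset| = k
        delta = merit(cand) - cur
        if delta >= 0 or random.random() < math.exp(delta / t):
            chosen, cur = cand, cur + delta
            if cur > best_val:
                best, best_val = set(chosen), cur
        t *= alpha
    return best, best_val

if __name__ == "__main__":
    random.seed(0)
    n, k = 40, 6
    gain = [[random.random() for _ in range(8)] for _ in range(n)]   # illustrative data
    # Toy merit: total modal "observability" of the selected locations.
    merit = lambda subset: sum(max(gain[i][m] for i in subset) for m in range(8))
    print(sa_placement(n, k, merit))
```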

  17. Simulated Annealing Algorithm and Its Application in Irregular Polygons Packing

    Institute of Scientific and Technical Information of China (English)

    段国林; 王彩红; 张健楠

    2003-01-01

    The two-dimensional irregular polygon packing problem is very difficult to solve by traditional optimization methods. The simulated annealing (SA) algorithm is a stochastic optimization technique that can be used to solve packing problems. The whole process of SA is introduced first in this paper, and an extended neighborhood searching method in SA is analyzed in detail. A general module of the SA algorithm is given and used to lay out the irregular polygons. The judgment of intersection and other constraints of irregular polygons are analyzed. Then an example that was used in the paper of Stefan Jakobs is presented. Results show that this SA algorithm shortens the computation time and improves the solution.

  18. Simulated annealing and joint manufacturing batch-sizing

    Directory of Open Access Journals (Sweden)

    Sarker Ruhul

    2003-01-01

    Full Text Available We address an important problem of a manufacturing system. The system procures raw materials from outside suppliers in a lot and processes them to produce finished goods. It proposes an ordering policy for raw materials to meet the requirements of a production facility. In return, this facility has to deliver finished products demanded by external buyers at fixed time intervals. First, a general cost model is developed considering both raw materials and finished products. Then this model is used to develop a simulated annealing approach to determining an optimal ordering policy for procurement of raw materials and also for the manufacturing batch size to minimize the total cost for meeting customer demands in time. The solutions obtained were compared with those of traditional approaches. Numerical examples are presented.

  19. Solving the dynamic berth allocation problem by simulated annealing

    Science.gov (United States)

    Lin, Shih-Wei; Ting, Ching-Jung

    2014-03-01

    Berth allocation, the first operation when vessels arrive at a port, is one of the major container port optimization problems. From both the port operator's and the ocean carriers' perspective, minimization of the time a ship spends at berth is a key goal of port operations. This article focuses on two versions of the dynamic berth allocation problem (DBAP): discrete and continuous cases. The first case assigns ships to a given set of berth positions; the second one permits them to be moored anywhere along the berth. Simulated annealing (SA) approaches are proposed to solve the DBAP. The proposed SAs are tested with numerical instances for different sizes from the literature. Experimental results show that the proposed SA can obtain the optimal solutions in all instances of discrete cases and update 27 best known solutions in the continuous case.

  20. Simulated Annealing Clustering for Optimum GPS Satellite Selection

    Directory of Open Access Journals (Sweden)

    M. Ranjbar

    2012-05-01

    Full Text Available This paper utilizes a clustering approach based on the Simulated Annealing (SA) method to select optimum satellite subsets from the visible satellites. Geometric Dilution of Precision (GDOP) is used as the criterion of optimality. The lower the value of the GDOP number, the better the geometric strength, and vice versa. Not needing to calculate the inverse matrix, which is a time-consuming process, is a dramatically important advantage of using this method, so a great reduction in computational cost is achieved. SA is a powerful technique for obtaining a close approximation to the global optimum for a given problem. The evaluation of the performance of the proposed method is done by validation measures. The external validation measures, entropy and purity, are used to measure the extent to which cluster labels agree with the externally given class labels. The overall purity and entropy are 0.9015 and 0.3993, respectively, which is an excellent result.

  1. Restoration of polarimetric SAR images using simulated annealing

    DEFF Research Database (Denmark)

    Schou, Jesper; Skriver, Henning

    2001-01-01

    Filtering synthetic aperture radar (SAR) images ideally results in better estimates of the parameters characterizing the distributed targets in the images while preserving the structures of the nondistributed targets. However, these objectives are normally conflicting, often leading to a filtering approach favoring one of the objectives. An algorithm for estimating the radar cross-section (RCS) for intensity SAR images has previously been proposed in the literature based on Markov random fields and the stochastic optimization method simulated annealing. A new version of the algorithm is presented applicable to multilook polarimetric SAR images, resulting in an estimate of the mean covariance matrix rather than the RCS. Small windows are applied in the filtering, and due to the iterative nature of the approach, reasonable estimates of the polarimetric quantities characterizing the distributed targets...

  2. Memoryless cooperative graph search based on the simulated annealing algorithm

    Institute of Scientific and Technical Information of China (English)

    Hou Jian; Yan Gang-Feng; Fan Zhen

    2011-01-01

    We have studied the problem of reaching a globally optimal segment for a graph-like environment with a single or a group of autonomous mobile agents. Firstly, two efficient simulated-annealing-like algorithms are given for a single agent to solve the problem in a partially known environment and an unknown environment, respectively. It is shown that under both proposed control strategies, the agent will eventually converge to a globally optimal segment with probability 1. Secondly, we use multi-agent searching to simultaneously reduce the computation complexity and accelerate convergence, based on the algorithms given for a single agent. By exploiting graph partition, a gossip-consensus-based scheme is presented to update the key parameter, the radius of the graph, ensuring that the agents spend much less time finding a globally optimal segment.

  3. Adaptive Sampling in Hierarchical Simulation

    Energy Technology Data Exchange (ETDEWEB)

    Knap, J; Barton, N R; Hornung, R D; Arsenlis, A; Becker, R; Jefferson, D R

    2007-07-09

    We propose an adaptive sampling methodology for hierarchical multi-scale simulation. The method utilizes a moving kriging interpolation to significantly reduce the number of evaluations of finer-scale response functions to provide essential constitutive information to a coarser-scale simulation model. The underlying interpolation scheme is unstructured and adaptive to handle the transient nature of a simulation. To handle the dynamic construction and searching of a potentially large set of finer-scale response data, we employ a dynamic metric tree database. We study the performance of our adaptive sampling methodology for a two-level multi-scale model involving a coarse-scale finite element simulation and a finer-scale crystal plasticity based constitutive law.

  4. A primer on implementing compressed simulated annealing for the optimisation of a constrained simulation model in Microsoft Excel

    OpenAIRE

    Graeme J. Doole

    2007-01-01

    This short paper provides a simple introduction on how a simulation model implemented in Microsoft Excel® can be optimised using Visual Basic for Applications (VBA) programming and the compressed simulated annealing algorithm (Ohlmann et al., 2004; Ohlmann and Thomas, 2007). The standard simulated annealing procedure enters as a special case. Practical advice for determining the parameters that guide the stochastic search process in an annealing algorithm is also given.

  5. Simulation of dopant diffusion and activation during flash lamp annealing

    Energy Technology Data Exchange (ETDEWEB)

    Zechner, Christoph [Synopsys Switzerland LLC, Affolternstrasse 52, CH-8050 Zurich (Switzerland); Matveev, Dmitri [Synopsys Switzerland LLC, Affolternstrasse 52, CH-8050 Zurich (Switzerland)], E-mail: matveev@synopsys.com; Zographos, Nikolas [Synopsys Switzerland LLC, Affolternstrasse 52, CH-8050 Zurich (Switzerland); Lerch, Wilfried; Paul, Silke [Mattson Thermal Products GmbH, Daimlerstrasse 10, D-89160 Dornstadt (Germany)

    2008-12-05

    A set of advanced models implemented into the simulator Sentaurus Process was applied to simulate ultra shallow junction formation by flash lamp annealing (FLA). The full path transient enhanced diffusion model includes equations for small interstitial clusters (I2, I3, I4), {311} defects and dislocation loops. A dopant-point defect clustering model is used for dopant activation simulation. Several cluster types are considered: B2, B2I, B2I2, B3I, B3I2, B3I3 for boron and As2, As2V, As3, As3V, As4, As4V for arsenic. Different point defect and dopant-point defect pair charge states are taken into account to obtain accurate results in the high doping level region. The flux expressions in the three-phase segregation model include a dependence on the doping level and point defect supersaturation. The FLA process was performed at various peak temperatures in a Mattson Millios™ fRTP™ system. The measured wafer temperature as a function of time allowed us to simulate the transient processes with a high accuracy. A good agreement between secondary ion mass spectroscopy (SIMS) and simulated profiles was achieved. The sheet resistance dependence on the FLA peak temperature was reproduced successfully.

  6. A Simulated Annealing Algorithm for Training Empirical Potential Functions of Protein Folding

    Institute of Scientific and Technical Information of China (English)

    WANG Yu-hong; LI Wei

    2005-01-01

    This paper reports the local-minimum problem encountered by the current greedy algorithm when training the empirical potential function of protein folding on 8623 non-native structures of 31 globular proteins, and presents a solution based on the simulated annealing algorithm. This simulated annealing algorithm is indispensable for developing and testing highly refined empirical potential functions.

  7. A Knowledge-Based Simulated Annealing Algorithm to Multiple Satellites Mission Planning Problems

    OpenAIRE

    Da-Wei Jin; Li-Ning Xing

    2013-01-01

    The multiple satellites mission planning is a complex combinatorial optimization problem. A knowledge-based simulated annealing algorithm is proposed for the multiple satellites mission planning problems. The experimental results suggest that the proposed algorithm is effective for the given problem. The knowledge-based simulated annealing method will provide a useful reference for the improvement of existing optimization approaches.

  8. A SIMULATED ANNEALING ALGORITHM FOR SOLVING A SCHEDULING PROBLEM

    OpenAIRE

    Crescenzio Gallo

    2004-01-01

    An algorithm of "Simulated Annealing" for solving scheduling problems is presented. The issues related to scheduling are discussed, together with the Simulated Annealing approximation method and its main parameters (freezing, temperature, cooling, number of neighbourhoods to explore), the choices made in defining them for the generation of a good algorithm that efficiently resolves the scheduling problem.

  9. Weak convergence rates for stochastic approximation with application to multiple targets and simulated annealing

    OpenAIRE

    Pelletier, Mariane

    1998-01-01

    We study convergence rates of $\mathbb{R}$-valued algorithms, especially in the case of multiple targets and simulated annealing. We make precise, for example, the convergence rate of simulated annealing algorithms, whose weak convergence to a distribution concentrated on the potential's minima had been established by Gelfand and Mitter or by Hwang and Sheu.

  10. Generalized simulated annealing algorithms using Tsallis statistics : Application to the discrete-time optimal growth problem

    OpenAIRE

    稻垣, 陽介; イナガキ, ヨウスケ; Yousuke, Inagaki

    2007-01-01

    The efficiency of Monte Carlo simulated annealing algorithm based on the generalized statistics of Tsallis (GSA) is compared with conventional simulated annealing (CSA) based on Boltzmann-Gibbs statistics. Application to the discrete-time optimal growth problem demonstrates that the replacement of CSA by GSA has the potential to speed up optimizations with no loss of accuracy in finding optimal policy function.

  11. Application of Simulated Annealing and Related Algorithms to TWTA Design

    Science.gov (United States)

    Radke, Eric M.

    2004-01-01

    Simulated Annealing (SA) is a stochastic optimization algorithm used to search for global minima in complex design surfaces where exhaustive searches are not computationally feasible. The algorithm is derived by simulating the annealing process, whereby a solid is heated to a liquid state and then cooled slowly to reach thermodynamic equilibrium at each temperature. The idea is that atoms in the solid continually bond and re-bond at various quantum energy levels, and with sufficient cooling time they will rearrange at the minimum energy state to form a perfect crystal. The distribution of energy levels is given by the Boltzmann distribution: as temperature drops, the probability of the presence of high-energy bonds decreases. In searching for an optimal design, local minima and discontinuities are often present in a design surface. SA presents a distinct advantage over other optimization algorithms in its ability to escape from these local minima. Just as high-energy atomic configurations are visited in the actual annealing process in order to eventually reach the minimum energy state, in SA highly non-optimal configurations are visited in order to find otherwise inaccessible global minima. The SA algorithm produces a Markov chain of points in the design space at each temperature, with a monotonically decreasing temperature. A random point is started upon, and the objective function is evaluated at that point. A stochastic perturbation is then made to the parameters of the point to arrive at a proposed new point in the design space, at which the objective function is evaluated as well. If the change in objective function values ΔE is negative, the proposed new point is accepted. If ΔE is positive, the proposed new point is accepted according to the Metropolis criterion: ρ(ΔE) = exp(−ΔE/T), where T is the temperature for the current Markov chain. The process then repeats for the remainder of the Markov chain, after which the temperature is
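
    The procedure spelled out above (a Markov chain of points at each temperature, Metropolis acceptance with probability exp(−ΔE/T), and a monotonically decreasing temperature) corresponds to the generic skeleton below. The Python code is an illustrative sketch only; the Rastrigin test function stands in for the actual TWTA design surface, and the schedule parameters are assumptions.

```python
import math
import random

def simulated_annealing(objective, perturb, x0, t0=10.0, t_min=1e-3,
                        alpha=0.95, chain_length=100):
    """Generic SA skeleton: a Markov chain of candidate points at each
    temperature, accepted by the Metropolis criterion."""
    x, e = x0, objective(x0)
    best, best_e, t = x, e, t0
    while t > t_min:
        for _ in range(chain_length):            # Markov chain at this temperature
            cand = perturb(x, t)
            delta = objective(cand) - e
            # Metropolis criterion: always accept improvements, accept worse
            # points with probability exp(-delta / t).
            if delta < 0 or random.random() < math.exp(-delta / t):
                x, e = cand, e + delta
                if e < best_e:
                    best, best_e = x, e
        t *= alpha                                # monotonically decreasing temperature
    return best, best_e

if __name__ == "__main__":
    random.seed(0)
    # Illustrative multimodal test function (Rastrigin), not a TWTA design surface.
    f = lambda v: sum(x * x - 10 * math.cos(2 * math.pi * x) + 10 for x in v)
    step = lambda v, t: [x + random.gauss(0, 0.5) for x in v]
    print(simulated_annealing(f, step, [random.uniform(-5, 5) for _ in range(4)]))
```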

  12. PERFORMANCE COMPARISON OF GENETIC ALGORITHM AND SIMULATED ANNEALING FOR MULTIPLE-OBJECTIVE FLOWSHOP SCHEDULING PROBLEMS

    Directory of Open Access Journals (Sweden)

    I Gede Agus Widyadana

    2002-01-01

    Full Text Available The research focuses on comparing the genetic algorithm and simulated annealing in terms of performance and processing time. The main purpose is to evaluate the ability of both algorithms to solve flowshop scheduling problems with the criteria of minimizing makespan and total flowtime. The performance of the algorithms is assessed by simulating problems with varying combinations of jobs and machines. The results show that simulated annealing outperforms the genetic algorithm by up to 90%; the genetic algorithm is ahead only in processing time, but the observed trend suggests that for problems with many jobs and many machines, simulated annealing will run faster than the genetic algorithm. Keywords: genetic algorithm, simulated annealing, flow shop, makespan, total flowtime.

  13. Static Security Enhancement and Loss Minimization Using Simulated Annealing

    Directory of Open Access Journals (Sweden)

    A.Y. Abdelaziz

    2013-03-01

    Full Text Available This paper presents a developed algorithm for optimal placement of thyristor controlled series capacitors (TCSC’s) for enhancing the power system static security and minimizing the system overall power loss. Placing TCSC’s at selected branches requires analysis of the system behavior under all possible contingencies. A selective procedure to determine the locations and settings of the thyristor controlled series capacitors is presented. The locations are determined by evaluating a contingency sensitivity index (CSI) for a given power system branch for a given number of contingencies. This criterion is then used to develop a branch prioritizing index in order to rank the system branches suitable for placement of the thyristor controlled series capacitors. Optimal settings of TCSC’s are determined by the optimization technique of simulated annealing (SA), where settings are chosen to minimize the overall power system losses. The goal of the developed methodology is to enhance power system static security by alleviating/eliminating overloads on the transmission lines and maintaining the voltages at all load buses within their specified limits through the optimal placement and setting of TCSC’s under single and double line outage network contingencies. The proposed algorithm is examined using different IEEE standard test systems to show its superiority in enhancing the system static security and minimizing the system losses.

  14. Simulated Annealing Technique for Routing in a Rectangular Mesh Network

    Directory of Open Access Journals (Sweden)

    Noraziah Adzhar

    2014-01-01

    Full Text Available In the process of automatic design for printed circuit boards (PCBs), the phase following cell placement is routing. On the other hand, routing is a notoriously difficult problem, and even the simplest routing problem, which consists of a set of two-pin nets, is known to be NP-complete. In this research, our routing region is first tessellated into a uniform Nx×Ny array of square cells. The ultimate goal for a routing problem is to achieve complete automatic routing with minimal need for any manual intervention. Therefore, the shortest path for all connections needs to be established. While the classical Dijkstra's algorithm guarantees finding the shortest path for a single net, each routed net will form obstacles for later paths. This adds complexity to routing later nets and makes their routing longer than the optimal path or sometimes impossible to complete. Today's sequential routing often applies a heuristic method to further refine the solution. Through this process, all nets are rerouted in different orders to improve the quality of the routing. Because of this, we are motivated to apply simulated annealing, one of the metaheuristic methods, to our routing model to produce better candidate sequences.

  15. Traveling Salesman Approach for Solving Petrol Distribution Using Simulated Annealing

    Directory of Open Access Journals (Sweden)

    Zuhaimy Ismail

    2008-01-01

    Full Text Available This research presents an attempt to solve a logistic company's problem of delivering petrol to petrol stations in the state of Johor. The delivery system is formulated as a travelling salesman problem (TSP): finding an optimal route that visits each station and returns to the point of origin, where the inter-station distances are symmetric and known. Although deceptively simple to state, this real-world combinatorial problem is hard to solve; our approach is to develop solutions based on local search and metaheuristics. The objective is defined simply as the time spent or distance travelled by the salesman visiting the n stations (or nodes) cyclically: in one tour the vehicle visits each station just once and finishes where it started. This research presents the development of a solution engine based on a local search method known as the Greedy Method; with the result it generates used as the initial solution, Simulated Annealing (SA) and Tabu Search (TS) are then applied to improve the search and provide the best solution. A user-friendly optimization program was developed in Microsoft C++ to solve the TSP and to provide solutions to future TSP instances, which may be classified into daily or advanced management and engineering problems.
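    The pipeline described above (a greedy construction used as the initial solution, followed by metaheuristic refinement) can be sketched as follows. The station coordinates are random stand-ins for the real Johor data, the 2-opt neighbourhood and cooling parameters are assumptions, and the Tabu Search stage of the original C++ tool is omitted.

```python
import math
import random

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def tour_length(tour, pts):
    return sum(dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]]) for i in range(len(tour)))

def greedy_tour(pts):
    """Nearest-neighbour construction: the 'Greedy Method' initial solution."""
    unvisited = set(range(1, len(pts)))
    tour = [0]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: dist(pts[last], pts[j]))
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour

def sa_tsp(pts, t0=100.0, alpha=0.999, iters=50000, seed=0):
    rng = random.Random(seed)
    tour = greedy_tour(pts)
    cur = tour_length(tour, pts)
    best, best_tour, t = cur, tour[:], t0
    for _ in range(iters):
        i, j = sorted(rng.sample(range(len(tour)), 2))
        cand_tour = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # 2-opt segment reversal
        cand = tour_length(cand_tour, pts)
        if cand <= cur or rng.random() < math.exp((cur - cand) / t):
            tour, cur = cand_tour, cand
            if cand < best:
                best, best_tour = cand, tour[:]
        t *= alpha
    return best, best_tour

if __name__ == "__main__":
    rng = random.Random(2)
    stations = [(rng.random() * 100, rng.random() * 100) for _ in range(40)]
    print(round(tour_length(greedy_tour(stations), stations), 1), round(sa_tsp(stations)[0], 1))
```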

  16. Optical Design of Multilayer Achromatic Waveplate by Simulated Annealing Algorithm

    Institute of Scientific and Technical Information of China (English)

    Jun Ma; Jing-Shan Wang; Carsten Denker; Hai-Min Wang

    2008-01-01

    We applied a Monte Carlo method, the simulated annealing algorithm, to carry out the design of multilayer achromatic waveplates. We present solutions for three-, six- and ten-layer achromatic waveplates. The optimized retardance settings are found to be 89°51'39"±0°33'37" and 89°54'46"±0°22'4" for the six- and ten-layer waveplates, respectively, for a wavelength range from 1000 nm to 1800 nm. The polarimetric properties of multilayer waveplates are investigated based on several numerical experiments. In contrast to the previously proposed three-layer achromatic waveplate, the fast axes of the new six- and ten-layer achromatic waveplates remain at fixed angles, independent of the wavelength. Two applications of multilayer achromatic waveplates are discussed: the general-purpose phase shifter and the birefringent filter in the Infrared Imaging Magnetograph (IRIM) system of the Big Bear Solar Observatory (BBSO). We also examined an experimental method to measure the retardance of waveplates.

  17. Efficient Hand off using Fuzzy and Simulated Annealing

    Directory of Open Access Journals (Sweden)

    Vikas.M.N

    2012-02-01

    Full Text Available This paper presents an efficient method for the handoff mechanism in cellular networks using optimization algorithms. The proposed approach integrates a fuzzy logic approach with a simulated annealing algorithm to automate the tuning process. The fuzzy controller carries out inference operations at high speed, whereas the tuning procedure works at a much lower rate. For the implementation described in this paper, a two-input-one-output fuzzy controller is considered. Both the inputs and the output have 8-bit resolution, and up to seven membership functions for each input or output can be defined over the universe of discourse. The fuzzy controller has two levels of pipeline, which allows overlapping of the arithmetic as well as the inference operations. The SA tuning mechanism adjusts the triangular or singleton membership functions to minimize a cost function. The complete self-tuned fuzzy inference engine is implemented in a Xilinx SPARTAN3 XC3S200 series FPGA device. This paper describes various aspects of the implementation of the self-tuned handoff system.

  18. Annealing effect on current perpendicular to plane systems modeled by giant magnetoresistance simulation

    NARCIS (Netherlands)

    Jonkers, PAE

    2001-01-01

    A single-electron simulation model is presented to describe the effect of annealing on current perpendicular to plane giant magnetoresistance (CPP-GMR) systems. Progressive annealing is represented by a progressively increasing number of impurities occurring at the interfaces of adjacent layers constit

  19. Morphological neural networks for automatic target detection by simulated annealing learning algorithm

    Institute of Scientific and Technical Information of China (English)

    余农; 吴昊; 吴常泳; 李范鸣; 吴立德

    2003-01-01

    A practical neural network model for morphological filtering and a simulated annealing optimization algorithm for training the network parameters are proposed in this paper. It is pointed out that the optimal design process of the morphological filtering network is in fact the optimal learning process of adjusting the network parameters (the structuring element, or SE for short) to accommodate the image environment. The network structure may then possess the characteristics of the image targets, and so give specific information to the SE. Morphological filters formed in this way become intelligent to a certain degree and can provide good filtering results and robust adaptability to complex, changing images. For application to moving image target detection, a dynamic training algorithm is applied to the design process, using an asymptotically shrinking error and appropriate adjustment of the network weights. Experimental results show that the algorithm is invariant with respect to shift, scale and rotation of the moving target in continuous detection of moving targets.

  20. Differential evolution and simulated annealing algorithms for mechanical systems design

    Directory of Open Access Journals (Sweden)

    H. Saruhan

    2014-09-01

    Full Text Available In this study, nature inspired algorithms – the Differential Evolution (DE) and the Simulated Annealing (SA) – are utilized to seek a global optimum solution for the weight of a ball bearings link system assembly with constraints and mixed design variables. The Genetic Algorithm (GA) and the Evolution Strategy (ES) serve as references for the examination and validation of the DE and the SA. The main purpose is to minimize the weight of an assembly system composed of a shaft and two ball bearings. The ball bearings link system is used extensively in many machinery applications, and designers pay great attention to it because of its significant industrial importance. The problem is complex and time consuming due to the mixed design variables and inequality constraints imposed on the objective function. The results showed that the DE and the SA performed well and converged reliably to the global optimum solution, so their application to mechanical system design can be very useful in many real-world mechanical design problems. Besides, the comparison confirms the effectiveness and the superiority of the DE over the other algorithms – the SA, the GA, and the ES – in terms of solution quality. A ball bearings link system assembly weight of 634,099 gr was obtained using the DE, while 671,616 gr, 728213.8 gr, and 729445.5 gr were obtained using the SA, the ES, and the GA, respectively.

  1. Sensitivity study on hydraulic well testing inversion using simulated annealing

    International Nuclear Information System (INIS)

    For environmental remediation, management of nuclear waste disposal, or geothermal reservoir engineering, it is very important to evaluate the permeabilities, spacing, and sizes of the subsurface fractures which control ground water flow. Cluster variable aperture (CVA) simulated annealing has been used as an inversion technique to construct fluid flow models of fractured formations based on transient pressure data from hydraulic tests. A two-dimensional fracture network system is represented as a filled regular lattice of fracture elements. The algorithm iteratively changes the aperture of a cluster of fracture elements, chosen randomly from a list of discrete apertures, to improve the match to observed pressure transients. The size of the clusters is held constant throughout the iterations. Sensitivity studies using simple fracture models with eight wells show that, in general, it is necessary to conduct interference tests using at least three different wells as the pumping well in order to reconstruct the fracture network with a transmissivity contrast of one order of magnitude, particularly when the cluster size is not known a priori. Because hydraulic inversion is inherently non-unique, it is important to utilize additional information. The authors investigated the relationship between the scale of heterogeneity and the optimum cluster size (and its shape) to enhance the reliability and convergence of the inversion. It appears that a cluster size corresponding to about 20-40% of the practical range of the spatial correlation is optimal. Inversion results for the Raymond test site data are also presented, and the practical range of spatial correlation is evaluated to be about 5-10 m from the optimal cluster size in the inversion.

  2. ACTIVITY-BASED COSTING DAN SIMULATED ANNEALING UNTUK PENCARIAN RUTE PADA FLEXIBLE MANUFACTURING SYSTEMS

    Directory of Open Access Journals (Sweden)

    Gregorius Satia Budhi

    2003-01-01

    Full Text Available A Flexible Manufacturing System (FMS) is a manufacturing system formed from several numerically controlled machines combined with a material handling system, so that different jobs can be processed by different machine sequences. FMS combines the high productivity and flexibility of the Transfer Line and Job Shop manufacturing systems. In this research, an Activity-Based Costing (ABC) approach was used as the weight in searching for the operation route on the proper machine, so that the total production cost can be optimized. The search method used in this experiment is Simulated Annealing, a variant of the Hill Climbing search method. An ideal operation time to process a part was used as the annealing schedule. The empirical tests show that the use of the ABC approach and Simulated Annealing for the routing process can optimize the Total Production Cost, while the use of the ideal operation time to process a part as the annealing schedule keeps the processing time well under control.

  3. Time Simulation of Bone Adaptation

    DEFF Research Database (Denmark)

    Bagge, Mette

    1998-01-01

    The structural adaptation of a three-dimensional finite element model of the proximal femur is considered. Presuming the bone possesses the optimal structure under the given loads, the bone material distribution is found by minimizing the strain energy averaged over ten load cases with a volume constraint. The optimized design is used as a start configuration for the remodeling simulation. The parameter characterizing the equilibrium level where no remodeling occurs is estimated from the optimization parameters. The loads vary in magnitude, location and direction, simulating time-dependent loading.

  4. Computer simulation of laser annealing of a nanostructured surface

    NARCIS (Netherlands)

    D. Ivanov; I. Marinov; Y. Gorbachev; A. Smirnov; V. Krzhizhanovskaya

    2009-01-01

    Laser annealing technology is used in mass production of new-generation semiconductor materials and nano-electronic devices like the MOS-based (metal-oxide-semiconductor) integrated circuits. Manufacturing sub-100 nm MOS devices demands application of ultra-shallow doping (junctions), which requires

  5. Picosecond and nanosecond laser annealing and simulation of amorphous silicon thin films for solar cell applications

    Science.gov (United States)

    Theodorakos, I.; Zergioti, I.; Vamvakas, V.; Tsoukalas, D.; Raptis, Y. S.

    2014-01-01

    In this work, a picosecond diode pumped solid state laser and a nanosecond Nd:YAG laser have been used for the annealing and the partial nano-crystallization of an amorphous silicon layer. These experiments were conducted as an alternative/complementary approach to the plasma-enhanced chemical vapor deposition method for the fabrication of a micromorph tandem solar cell. The laser experimental work was combined with simulations of the annealing process, in terms of the evolution of the temperature distribution, in order to predetermine the optimum annealing conditions. The annealed material was studied with respect to its structural properties, as a function of several annealing parameters (wavelength, pulse duration, fluence), by X-ray diffraction, SEM, and micro-Raman techniques.

  6. Picosecond and nanosecond laser annealing and simulation of amorphous silicon thin films for solar cell applications

    Energy Technology Data Exchange (ETDEWEB)

    Theodorakos, I.; Zergioti, I.; Tsoukalas, D.; Raptis, Y. S., E-mail: yraptis@central.ntua.gr [Physics Department, National Technical University of Athens, Heroon Polytechniou 9, 15780 Zographou, Athens (Greece); Vamvakas, V. [Heliosphera SA, Industrial Area of Tripolis, 8th Building Block, 5th Road, GR-221 00 Tripolis (Greece)

    2014-01-28

    In this work, a picosecond diode pumped solid state laser and a nanosecond Nd:YAG laser have been used for the annealing and the partial nano-crystallization of an amorphous silicon layer. These experiments were conducted as an alternative/complementary approach to the plasma-enhanced chemical vapor deposition method for the fabrication of a micromorph tandem solar cell. The laser experimental work was combined with simulations of the annealing process, in terms of the evolution of the temperature distribution, in order to predetermine the optimum annealing conditions. The annealed material was studied with respect to its structural properties, as a function of several annealing parameters (wavelength, pulse duration, fluence), by X-ray diffraction, SEM, and micro-Raman techniques.

  7. Exploring Photometric Redshifts as an Optimization Problem: An Ensemble MCMC and Simulated Annealing-Driven Template-Fitting Approach

    CERN Document Server

    Speagle, Joshua S; Eisenstein, Daniel J; Masters, Daniel C; Steinhardt, Charles L

    2015-01-01

    Using a grid of $\\sim 2$ million elements ($\\Delta z = 0.005$) adapted from COSMOS photometric redshift (photo-z) searches, we investigate the general properties of template-based photo-z likelihood surfaces. We find these surfaces are filled with numerous local minima and large degeneracies that generally confound rapid but "greedy" optimization schemes, even with additional stochastic sampling methods. In order to robustly and efficiently explore these surfaces, we develop BAD-Z [Brisk Annealing-Driven Redshifts (Z)], which combines ensemble Markov Chain Monte Carlo (MCMC) sampling with simulated annealing to sample arbitrarily large, pre-generated grids in approximately constant time. Using a mock catalog of 384,662 objects, we show BAD-Z samples $\\sim 40$ times more efficiently compared to a brute-force counterpart while maintaining similar levels of accuracy. Our results represent first steps toward designing template-fitting photo-z approaches limited mainly by memory constraints rather than computation...

  8. A computed tomography reconstruction algorithm based on multipurpose optimal criterion and simulated annealing theory

    Institute of Scientific and Technical Information of China (English)

    Hui Li; Xiong Wan; Taoli Liu; Zhongshou Liu; Yanhua Zhu

    2007-01-01

    Although emission spectral tomography (EST) combines emission spectral measurement with optical computed tomography (OCT), it is difficult to obtain transient emission data from a large number of views; therefore, high-precision OCT algorithms using few views ought to be studied for EST applications. To improve the reconstruction precision in the case of few views, a new computed tomography reconstruction algorithm based on a multipurpose optimal criterion and simulated annealing theory (multi-criterion simulated annealing reconstruction technique, MCSART) is proposed. This algorithm can simultaneously satisfy the least-squares criterion, the maximum-uniformity criterion, and the maximum-smoothness criterion. A globally optimal solution can be obtained by the MCSART algorithm with simulated annealing theory. Simulation experiments show that this algorithm is superior to the traditional algorithms under various noise conditions.

  9. Robot Path Planning Based on Simulated Annealing and Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Xianmin Wei

    2013-05-01

    Full Text Available To address the limitations of current algorithms for global path planning of mobile robots, this study applies an improved simulated annealing algorithm combined with artificial neural networks to mobile robot path planning, in order to remedy the weaknesses of large-scale iterative computation and slow convergence. Since the best-reserved simulated annealing algorithm is introduced and effectively combined with other algorithms, the improved algorithm accelerates convergence and shortens the computing time of path planning, and the global optimal solution can be obtained quickly. Because the simulated annealing algorithm is updated and the obstacle collision penalty function represented by neural networks, together with the path length, is treated as the energy function, the planned path not only meets the shortest-path criterion but also avoids collisions with obstacles. Simulation results show that the improved algorithm can effectively improve the calculation speed of path planning and ensure the quality of the planned path.

  10. Robot Path Planning Based on Simulated Annealing and Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Xianmin Wei

    2013-06-01

    Full Text Available To address the limitations of current algorithms for global path planning of mobile robots, this study applies an improved simulated annealing algorithm combined with artificial neural networks to mobile robot path planning, in order to remedy the weaknesses of large-scale iterative computation and slow convergence. Since the best-reserved simulated annealing algorithm is introduced and effectively combined with other algorithms, the improved algorithm accelerates convergence and shortens the computing time of path planning, and the global optimal solution can be obtained quickly. Because the simulated annealing algorithm is updated and the obstacle collision penalty function represented by neural networks, together with the path length, is treated as the energy function, the planned path not only meets the shortest-path criterion but also avoids collisions with obstacles. Simulation results show that the improved algorithm can effectively improve the calculation speed of path planning and ensure the quality of the planned path.
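    The energy function sketched in these two records combines the path length with an obstacle-collision penalty produced by a neural network. The sketch below substitutes a simple analytic penalty for the trained network and anneals the positions of a fixed number of intermediate waypoints; the obstacle layout, penalty weight and move size are assumptions made only to keep the example self-contained.

```python
import math
import random

OBSTACLES = [((4.0, 4.0), 1.5), ((7.0, 2.0), 1.0)]      # (centre, radius); assumed layout

def penalty(p):
    """Stand-in for the neural-network collision penalty: grows quadratically inside obstacles."""
    cost = 0.0
    for (cx, cy), r in OBSTACLES:
        d = math.hypot(p[0] - cx, p[1] - cy)
        if d < r:
            cost += (r - d) ** 2
    return cost

def energy(path, w=50.0):
    """Path length plus weighted collision penalty, mirroring the records' energy function."""
    length = sum(math.hypot(path[i + 1][0] - path[i][0], path[i + 1][1] - path[i][1])
                 for i in range(len(path) - 1))
    return length + w * sum(penalty(p) for p in path)

def sa_path(start, goal, n_way=8, t0=1.0, alpha=0.995, iters=20000, seed=0):
    rng = random.Random(seed)
    path = [start] + [(start[0] + (goal[0] - start[0]) * k / (n_way + 1),
                       start[1] + (goal[1] - start[1]) * k / (n_way + 1))
                      for k in range(1, n_way + 1)] + [goal]
    cur = energy(path)
    best, best_path, t = cur, list(path), t0
    for _ in range(iters):
        k = rng.randrange(1, n_way + 1)                  # perturb one interior waypoint
        old = path[k]
        path[k] = (old[0] + rng.gauss(0, 0.3), old[1] + rng.gauss(0, 0.3))
        cand = energy(path)
        if cand <= cur or rng.random() < math.exp((cur - cand) / t):
            cur = cand
            if cand < best:
                best, best_path = cand, list(path)
        else:
            path[k] = old
        t *= alpha
    return best, best_path

if __name__ == "__main__":
    print(round(sa_path((0.0, 0.0), (10.0, 6.0))[0], 2))
```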

  11. Simulation of annealing process effect on texture evolution of deep-drawing sheet St15

    Institute of Scientific and Technical Information of China (English)

    Jinghong Sun; Yazheng Liu; Leyu Zhou

    2005-01-01

    A two-dimensional cellular automaton method was used to simulate grain growth during the recrystallization annealing of deep-drawing sheet St15, taking the simulated result of recrystallization and the experimental result of the annealing texture of deep-drawing sheet St15 as the initial condition and reference. By means of computer simulation, the microstructures and textures of different periods of grain growth were predicted. The grain size, shape and texture become stable after grain growth at a constant temperature of 700℃ for 10 h, and the favorable {111} texture components are dominant.

  12. Simulated Stochastic Approximation Annealing for Global Optimization With a Square-Root Cooling Schedule

    KAUST Repository

    Liang, Faming

    2014-04-03

    Simulated annealing has been widely used in the solution of optimization problems. As known by many researchers, the global optima cannot be guaranteed to be located by simulated annealing unless a logarithmic cooling schedule is used. However, the logarithmic cooling schedule is so slow that no one can afford to use this much CPU time. This article proposes a new stochastic optimization algorithm, the so-called simulated stochastic approximation annealing algorithm, which is a combination of simulated annealing and the stochastic approximation Monte Carlo algorithm. Under the framework of stochastic approximation, it is shown that the new algorithm can work with a cooling schedule in which the temperature can decrease much faster than in the logarithmic cooling schedule, for example, a square-root cooling schedule, while guaranteeing the global optima to be reached when the temperature tends to zero. The new algorithm has been tested on a few benchmark optimization problems, including feed-forward neural network training and protein-folding. The numerical results indicate that the new algorithm can significantly outperform simulated annealing and other competitors. Supplementary materials for this article are available online.
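    To make the contrast concrete, the short script below compares a logarithmic schedule with a square-root-style schedule over increasing iteration counts. The constants are illustrative assumptions, and the stochastic approximation component that justifies the faster schedule in the article is not reproduced here.

```python
import math

def log_schedule(k, t0=10.0):
    """Classical logarithmic schedule: convergence guarantees, but impractically slow cooling."""
    return t0 / math.log(k + math.e)

def sqrt_schedule(k, t0=10.0, t_min=0.01):
    """Square-root-style schedule of the kind the SAA framework can justify."""
    return max(t_min, t0 / math.sqrt(k + 1))

for k in (1, 10, 100, 10**3, 10**4, 10**5, 10**6):
    print(f"iteration {k:>8}: T_log = {log_schedule(k):6.3f}   T_sqrt = {sqrt_schedule(k):6.3f}")
```

After a million iterations the logarithmic schedule is still above 0.7 (in these units), while the square-root schedule has long since reached its floor, which is the practical gap the article addresses.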

  13. Quantum annealing

    OpenAIRE

    Ruiz, Alfonso de la Fuente

    2014-01-01

    Brief description of the state of the art of some local optimization methods: quantum annealing. Quantum annealing (also known as tempering) is analogous to simulated annealing, but with thermal activation replaced by quantum tunneling. The class of algorithmic methods for quantum annealing (dubbed 'QA'), sometimes referred to by the Italian school as Quantum Stochastic Optimization ('QSO'), is a promising metaheuristic tool for solving local search problems in mult...

  14. Laser annealing and simulation of amorphous silicon thin films for solar cell applications

    Science.gov (United States)

    Theodorakos, I.; Raptis, Y. S.; Vamvakas, V.; Tsoukalas, D.; Zergioti, I.

    2014-03-01

    In this work, a picosecond DPSS laser and a nanosecond Nd:YAG laser have been used for the annealing and the partial nanocrystallization of an amorphous silicon layer. These experiments were conducted in order to improve the characteristics of a micromorph tandem solar cell. The laser annealing was attempted at 1064 nm in order to obtain the desired crystallization depth and ratio. Preliminary annealing processes with different annealing parameters, such as fluence, repetition rate and number of pulses, have been tested. Irradiations were applied in the sub-melt regime in order to prevent significant diffusion of the p- and n-dopants within the structure. The laser experimental work was combined with simulations of the laser annealing process, in terms of the evolution of the temperature distribution, using the Synopsys Sentaurus Process TCAD software, and the optimum annealing conditions for the two different pulse durations were determined. Experimentally determined optical properties of our samples, such as the absorption coefficient and reflectivity, were used for a more realistic simulation. From the simulation results, a temperature profile appropriate to yield the desired recrystallization was obtained for the case of ps pulses, which was verified by the experimental results described below. The annealed material was studied with respect to its structural properties by XRD, SEM and micro-Raman techniques, providing consistent information on the characteristics of the nanocrystalline material produced by the laser annealing experiments. It was found that, with the use of ps pulses, the resulting polycrystalline region shows crystallization ratios similar to a PECVD-grown poly-silicon layer, with slightly larger nanocrystallite size.

  15. Investigation of simulated annealing method and its application to optimal design of die mold for orientation of magnetic powder

    OpenAIRE

    Takahashi, Norio; Ebihara, Kenji; Yoshida, Koji; Nakata, Takayoshi; Ohashi, Ken; Miyata, Koji

    1996-01-01

    Factors affecting the convergence characteristics and results obtained by the optimal design method using the finite element method and simulated annealing are investigated systematically, and the optimal parameters for the simulated annealing method are obtained. The optimal shape of the die mold for orientation of the magnetic powder (a nonlinear magnetostatic problem) is obtained using finite elements and simulated annealing. The experimental verification is also carried out.

  16. Simulated annealing to handle energy and ancillary services joint management considering electric vehicles

    DEFF Research Database (Denmark)

    Sousa, Tiago M; Soares, Tiago; Morais, Hugo;

    2016-01-01

    The massive use of distributed generation and electric vehicles will lead to a more complex management of the power system, requiring new approaches to be used in the optimal resource scheduling field. Electric vehicles with vehicle-to-grid capability can be useful for the aggregator players in the mitigation of renewable sources intermittency and in the ancillary services procurement. In this paper, an energy and ancillary services joint management model is proposed. A simulated annealing approach is used to solve the joint management for the following day, considering the minimization of the aggregator total operation costs. The case study considers a distribution network with 33-bus, 66 distributed generation and 2000 electric vehicles. The proposed simulated annealing is matched with a deterministic approach allowing an effective and efficient comparison. The simulated annealing presents

  17. Optimal actuator placement on an active reflector using a modified simulated annealing technique

    Science.gov (United States)

    Kuo, Chin-Po; Bruno, Robin

    1991-01-01

    The development of a lightweight actuation system for maintaining the surface accuracy of a composite honeycomb panel using piezoelectric actuators is discussed. A modified simulated annealing technique is used to optimize the problem with both combinatorial and continuous criteria and with inequality constraints. Near optimal solutions for the location of the actuators, using combinatorial optimization, and for the required actuator forces, employing continuous optimization, are sought by means of the modified simulated annealing technique. The actuator locations are determined by first seeking a near optimum solution using the modified simulated annealing technique. The final actuator configuration consists of an arrangement wherein the piezoelectric actuators are placed along six radial lines. Numerical results showing the achievable surface correction by means of this configuration are presented.

  18. Simulated annealing algorithm for TSP

    Institute of Scientific and Technical Information of China (English)

    朱静丽

    2011-01-01

    The travelling salesman problem (TSP) is a combinatorial optimization problem with NP-complete computational complexity. This paper analyzes the simulated annealing algorithm model, studies the feasibility of using simulated annealing to solve the TSP, and gives a concrete implementation of the simulated annealing algorithm for the TSP.

  19. Satisfiability Test with Synchronous Simulated Annealing on the Fujitsu AP1000 Massively-Parallel Multiprocessor

    Science.gov (United States)

    Sohn, Andrew; Biswas, Rupak

    1996-01-01

    Solving the hard Satisfiability Problem is time consuming even for modest-sized problem instances. Solving the Random L-SAT Problem is especially difficult due to the ratio of clauses to variables. This report presents a parallel synchronous simulated annealing method for solving the Random L-SAT Problem on a large-scale distributed-memory multiprocessor. In particular, we use a parallel synchronous simulated annealing procedure, called Generalized Speculative Computation, which guarantees the same decision sequence as sequential simulated annealing. To demonstrate the performance of the parallel method, we have selected problem instances varying in size from 100-variables/425-clauses to 5000-variables/21,250-clauses. Experimental results on the AP1000 multiprocessor indicate that our approach can satisfy 99.9 percent of the clauses while giving almost a 70-fold speedup on 500 processors.

  20. Optimization of pressurized water reactor shuffling by simulated annealing with heuristics

    International Nuclear Information System (INIS)

    Simulated-annealing optimization of reactor core loading patterns is implemented with support for design heuristics during candidate pattern generation. The SIMAN optimization module uses the advanced nodal method of SIMULATE-3 and the full cross-section detail of CASMO-3 to evaluate accurately the neutronic performance of each candidate, resulting in high-quality patterns. The use of heuristics within simulated annealing is explored. Heuristics improve the consistency of optimization results for both fast- and slow-annealing runs with no penalty from the exclusion of unusual candidates. Thus, the heuristic application of designer judgment during automated pattern generation is shown to be effective. The capability of the SIMAN module to find and evaluate families of loading patterns that satisfy design constraints and have good objective performance within practical run times is demonstrated. The use of automated evaluations of successive cycles to explore multicycle effects of design decisions is discussed

  1. A hybrid Genetic and Simulated Annealing Algorithm for Chordal Ring implementation in large-scale networks

    DEFF Research Database (Denmark)

    Riaz, M. Tahir; Gutierrez Lopez, Jose Manuel; Pedersen, Jens Myrup;

    2011-01-01

    The paper presents a hybrid Genetic and Simulated Annealing algorithm for implementing the Chordal Ring structure in optical backbone networks. In recent years, topologies based on regular graph structures have gained a lot of interest due to their good communication properties for the physical topology of the networks. Evolutionary algorithms have often been used to solve problems of a combinatorial nature that are extremely hard to solve by exact approaches. Both the Genetic and Simulated Annealing algorithms are similar in using a controlled stochastic method to search the solution...

  2. Instantons in Quantum Annealing: Thermally Assisted Tunneling Vs Quantum Monte Carlo Simulations

    Science.gov (United States)

    Jiang, Zhang; Smelyanskiy, Vadim N.; Boixo, Sergio; Isakov, Sergei V.; Neven, Hartmut; Mazzola, Guglielmo; Troyer, Matthias

    2015-01-01

    A recent numerical result (arXiv:1512.02206) from Google suggested that the D-Wave quantum annealer may have an asymptotic speed-up over simulated annealing; however, the asymptotic advantage disappears when it is compared to quantum Monte Carlo (a classical algorithm despite its name). We show analytically that the asymptotic scaling of quantum tunneling is exactly the same as the escape rate in quantum Monte Carlo for a class of problems. Thus, the Google result might be explained in our framework. We also found that the transition state in quantum Monte Carlo corresponds to the instanton solution in quantum tunneling problems, which is observed in numerical simulations.

  3. A Clustering Algorithm Using the Tabu Search Approach with Simulated Annealing for Vector Quantization

    Institute of Scientific and Technical Information of China (English)

    CHUShuchuan; JohnF.Roddick

    2003-01-01

    In this paper, a cluster generation algorithm for vector quantization using a tabu search approach with simulated annealing is proposed. The main idea of this algorithm is to use the tabu search approach to generate non-local moves for the clusters and to apply the simulated annealing technique to select the current best solution, thus improving the cluster generation and reducing the mean squared error. Preliminary experimental results demonstrate that the proposed approach is superior to the tabu search approach with the Generalised Lloyd algorithm.
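    A loose sketch of the combination on synthetic two-dimensional data: a tabu list discourages undoing recent replacements (the non-local moves), while a Metropolis-style annealing test decides whether each candidate codebook is kept. The data, move rule and parameters are assumptions, and the Generalised Lloyd algorithm that the paper compares against is not reproduced here.

```python
import math
import random

def mse(codebook, data):
    """Mean squared quantization error of the data under the codebook."""
    return sum(min((p[0] - c[0]) ** 2 + (p[1] - c[1]) ** 2 for c in codebook)
               for p in data) / len(data)

def tabu_sa_vq(data, k=8, t0=1.0, alpha=0.99, iters=2000, tabu_len=20, seed=0):
    rng = random.Random(seed)
    codebook = rng.sample(data, k)
    cur = mse(codebook, data)
    best, best_cb = cur, codebook[:]
    tabu, t = [], t0
    for _ in range(iters):
        idx = rng.randrange(k)
        old_vec = codebook[idx]
        # non-local move: replace one codevector by a data point not on the tabu list
        codebook[idx] = rng.choice([p for p in data if p not in tabu])
        cand = mse(codebook, data)
        if cand <= cur or rng.random() < math.exp((cur - cand) / t):
            cur = cand
            tabu.append(old_vec)                 # discourage moving straight back for a while
            if len(tabu) > tabu_len:
                tabu.pop(0)
            if cand < best:
                best, best_cb = cand, codebook[:]
        else:
            codebook[idx] = old_vec
        t *= alpha
    return best, best_cb

if __name__ == "__main__":
    rng = random.Random(3)
    centres = [(2, 2), (8, 3), (5, 9), (9, 8)]
    data = [(cx + rng.gauss(0, 0.5), cy + rng.gauss(0, 0.5)) for cx, cy in centres for _ in range(50)]
    print(round(tabu_sa_vq(data)[0], 4))
```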

  4. Classification of hyperspectral remote sensing images based on simulated annealing genetic algorithm and multiple instance learning

    Institute of Scientific and Technical Information of China (English)

    高红民; 周惠; 徐立中; 石爱业

    2014-01-01

    A hybrid feature selection and classification strategy was proposed based on the simulated annealing genetic algorithm and multiple instance learning (MIL). A band selection method based on subspace decomposition was proposed, which combines the simulated annealing algorithm with the genetic algorithm by choosing different crossover and mutation probabilities, as well as mutation individuals. MIL was then combined with image segmentation, clustering and support vector machine algorithms to classify the hyperspectral image. The experimental results show that the proposed method achieves a high classification accuracy of 93.13% with small training samples and overcomes the weaknesses of the conventional methods.

  5. Multiphase Simulated Annealing Based on Boltzmann and Bose-Einstein Distribution Applied to Protein Folding Problem

    Science.gov (United States)

    Liñán-García, Ernesto; Sánchez-Hernández, Juan Paulo; González-Barbosa, J. Javier; González-Flores, Carlos

    2016-01-01

    A new hybrid Multiphase Simulated Annealing Algorithm using Boltzmann and Bose-Einstein distributions (MPSABBE) is proposed. MPSABBE was designed for solving the Protein Folding Problem (PFP) instances. This new approach has four phases: (i) Multiquenching Phase (MQP), (ii) Boltzmann Annealing Phase (BAP), (iii) Bose-Einstein Annealing Phase (BEAP), and (iv) Dynamical Equilibrium Phase (DEP). BAP and BEAP are simulated annealing searching procedures based on Boltzmann and Bose-Einstein distributions, respectively. DEP is also a simulated annealing search procedure, which is applied at the final temperature of the fourth phase, which can be seen as a second Bose-Einstein phase. MQP is a search process that ranges from extremely high to high temperatures, applying a very fast cooling process, and is not very restrictive to accept new solutions. However, BAP and BEAP range from high to low and from low to very low temperatures, respectively. They are more restrictive for accepting new solutions. DEP uses a particular heuristic to detect the stochastic equilibrium by applying a least squares method during its execution. MPSABBE parameters are tuned with an analytical method, which considers the maximal and minimal deterioration of problem instances. MPSABBE was tested with several instances of PFP, showing that the use of both distributions is better than using only the Boltzmann distribution on the classical SA. PMID:27413369

  6. Multiphase Simulated Annealing Based on Boltzmann and Bose-Einstein Distribution Applied to Protein Folding Problem.

    Science.gov (United States)

    Frausto-Solis, Juan; Liñán-García, Ernesto; Sánchez-Hernández, Juan Paulo; González-Barbosa, J Javier; González-Flores, Carlos; Castilla-Valdez, Guadalupe

    2016-01-01

    A new hybrid Multiphase Simulated Annealing Algorithm using Boltzmann and Bose-Einstein distributions (MPSABBE) is proposed. MPSABBE was designed for solving the Protein Folding Problem (PFP) instances. This new approach has four phases: (i) Multiquenching Phase (MQP), (ii) Boltzmann Annealing Phase (BAP), (iii) Bose-Einstein Annealing Phase (BEAP), and (iv) Dynamical Equilibrium Phase (DEP). BAP and BEAP are simulated annealing searching procedures based on Boltzmann and Bose-Einstein distributions, respectively. DEP is also a simulated annealing search procedure, which is applied at the final temperature of the fourth phase, which can be seen as a second Bose-Einstein phase. MQP is a search process that ranges from extremely high to high temperatures, applying a very fast cooling process, and is not very restrictive to accept new solutions. However, BAP and BEAP range from high to low and from low to very low temperatures, respectively. They are more restrictive for accepting new solutions. DEP uses a particular heuristic to detect the stochastic equilibrium by applying a least squares method during its execution. MPSABBE parameters are tuned with an analytical method, which considers the maximal and minimal deterioration of problem instances. MPSABBE was tested with several instances of PFP, showing that the use of both distributions is better than using only the Boltzmann distribution on the classical SA. PMID:27413369

  7. Multiphase Simulated Annealing Based on Boltzmann and Bose-Einstein Distribution Applied to Protein Folding Problem.

    Science.gov (United States)

    Frausto-Solis, Juan; Liñán-García, Ernesto; Sánchez-Hernández, Juan Paulo; González-Barbosa, J Javier; González-Flores, Carlos; Castilla-Valdez, Guadalupe

    2016-01-01

    A new hybrid Multiphase Simulated Annealing Algorithm using Boltzmann and Bose-Einstein distributions (MPSABBE) is proposed. MPSABBE was designed for solving the Protein Folding Problem (PFP) instances. This new approach has four phases: (i) Multiquenching Phase (MQP), (ii) Boltzmann Annealing Phase (BAP), (iii) Bose-Einstein Annealing Phase (BEAP), and (iv) Dynamical Equilibrium Phase (DEP). BAP and BEAP are simulated annealing searching procedures based on Boltzmann and Bose-Einstein distributions, respectively. DEP is also a simulated annealing search procedure, which is applied at the final temperature of the fourth phase, which can be seen as a second Bose-Einstein phase. MQP is a search process that ranges from extremely high to high temperatures, applying a very fast cooling process, and is not very restrictive to accept new solutions. However, BAP and BEAP range from high to low and from low to very low temperatures, respectively. They are more restrictive for accepting new solutions. DEP uses a particular heuristic to detect the stochastic equilibrium by applying a least squares method during its execution. MPSABBE parameters are tuned with an analytical method, which considers the maximal and minimal deterioration of problem instances. MPSABBE was tested with several instances of PFP, showing that the use of both distributions is better than using only the Boltzmann distribution on the classical SA.
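    The sketch below illustrates only the core structural idea shared by these records: successive annealing phases that use different acceptance distributions over different temperature ranges. The Bose-Einstein-style acceptance expression, the toy one-dimensional energy landscape and all parameters are assumptions for illustration; MPSABBE's actual phase thresholds, tuning procedure and protein energy model are described in the articles and are not reproduced here.

```python
import math
import random

def boltzmann_accept(delta, t):
    """Metropolis/Boltzmann acceptance probability for a worsening move (delta > 0)."""
    return math.exp(-delta / t)

def bose_einstein_accept(delta, t):
    """Acceptance probability built from a Bose-Einstein-like occupancy factor
    (illustrative form only, clipped to 1 for very small deteriorations)."""
    return min(1.0, 1.0 / (math.exp(delta / t) - 1.0))

def anneal_phase(x, cost, neighbour, accept, t_start, t_end, alpha, rng):
    """Run one annealing phase with a given acceptance rule between two temperatures."""
    t, cur = t_start, cost(x)
    while t > t_end:
        y = neighbour(x, rng)
        cand = cost(y)
        if cand <= cur or rng.random() < accept(cand - cur, t):
            x, cur = y, cand
        t *= alpha
    return x, cur

if __name__ == "__main__":
    # toy 1-D landscape with many local minima standing in for a folding energy
    cost = lambda x: math.sin(5 * x) + 0.1 * (x - 2.0) ** 2
    neighbour = lambda x, rng: x + rng.gauss(0, 0.3)
    rng = random.Random(4)
    x = rng.uniform(-5.0, 5.0)
    x, e = anneal_phase(x, cost, neighbour, boltzmann_accept, 5.0, 0.5, 0.99, rng)       # Boltzmann phase
    x, e = anneal_phase(x, cost, neighbour, bose_einstein_accept, 0.5, 0.01, 0.99, rng)  # Bose-Einstein phase
    print(round(x, 3), round(e, 3))
```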

  8. Optimal design of hydraulic manifold blocks based on niching genetic simulated annealing algorithm

    Institute of Scientific and Technical Information of China (English)

    Jia Chunqiang; Yu Ling; Tian Shujun; Gao Yanming

    2007-01-01

    To solve the combinatorial optimization problem of integrated outer layout and inner connection schemes in the design of hydraulic manifold blocks (HMB), a hybrid genetic simulated annealing algorithm based on niche technology is presented. This hybrid algorithm, which combines the genetic algorithm, the simulated annealing algorithm and niche technology, has a strong capability in global and local search, and all extrema can be found in a short time without strict requirements on preference settings. For the complex constrained solid spatial layout problems in HMB, the optimization mathematical model is presented. The key technologies in the integrated layout and connection design of HMB, including the realization of coding, the annealing operation and the genetic operation, are discussed. The framework of an HMB optimal design system based on the hybrid optimization strategy is proposed. An example is given to verify the effectiveness and feasibility of the algorithm.

  9. An Improved Simulated Annealing Technique for Enhanced Mobility in Smart Cities.

    Science.gov (United States)

    Amer, Hayder; Salman, Naveed; Hawes, Matthew; Chaqfeh, Moumena; Mihaylova, Lyudmila; Mayfield, Martin

    2016-06-30

    Vehicular traffic congestion is a significant problem that arises in many cities. This is due to the increasing number of vehicles that are driving on city roads of limited capacity. The vehicular congestion significantly impacts travel distance, travel time, fuel consumption and air pollution. Avoidance of traffic congestion and providing drivers with optimal paths are not trivial tasks. The key contribution of this work consists of the developed approach for dynamic calculation of optimal traffic routes. Two attributes (the average travel speed of the traffic and the roads' length) are utilized by the proposed method to find the optimal paths. The average travel speed values can be obtained from the sensors deployed in smart cities and communicated to vehicles via the Internet of Vehicles and roadside communication units. The performance of the proposed algorithm is compared to three other algorithms: the simulated annealing weighted sum, the simulated annealing technique for order preference by similarity to the ideal solution and the Dijkstra algorithm. The weighted sum and technique for order preference by similarity to the ideal solution methods are used to formulate different attributes in the simulated annealing cost function. According to the Sheffield scenario, simulation results show that the improved simulated annealing technique for order preference by similarity to the ideal solution method improves the traffic performance in the presence of congestion by an overall average of 19.22% in terms of travel time, fuel consumption and CO₂ emissions as compared to other algorithms; also, similar performance patterns were achieved for the Birmingham test scenario.

  10. Stochastic annealing simulation of copper under neutron irradiation

    Energy Technology Data Exchange (ETDEWEB)

    Heinisch, H.L. [Pacific Northwest National Lab., Richland, WA (United States); Singh, B.N. [Risoe National Lab., Roskilde (Denmark)

    1998-03-01

    This report is a summary of a presentation made at ICFRM-8 on computer simulations of defect accumulation during irradiation of copper to low doses at room temperature. The simulation results are in good agreement with experimental data on defect cluster densities in copper irradiated in RTNS-II.

  11. The Adaptive Multi-scale Simulation Infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Tobin, William R. [Rensselaer Polytechnic Inst., Troy, NY (United States)

    2015-09-01

    The Adaptive Multi-scale Simulation Infrastructure (AMSI) is a set of libraries and tools developed to support the development, implementation, and execution of general multimodel simulations. Using a minimal set of simulation meta-data, AMSI allows existing single-scale simulations to be adapted for use in multi-scale simulations with minimally intrusive changes. Support for dynamic runtime operations such as single- and multi-scale adaptive properties is a key focus of AMSI. Particular attention has been given to the development of scale-sensitive load balancing operations that allow single-scale simulations incorporated into a multi-scale simulation using AMSI to use standard load-balancing operations without affecting the integrity of the overall multi-scale simulation.

  12. Improving Simulated Annealing by Replacing Its Variables with Game-Theoretic Utility Maximizers

    Science.gov (United States)

    Wolpert, David H.; Bandari, Esfandiar; Tumer, Kagan

    2001-01-01

    The game-theory field of Collective INtelligence (COIN) concerns the design of computer-based players engaged in a non-cooperative game so that as those players pursue their self-interests, a pre-specified global goal for the collective computational system is achieved as a side-effect. Previous implementations of COIN algorithms have outperformed conventional techniques by up to several orders of magnitude, on domains ranging from telecommunications control to optimization in congestion problems. Recent mathematical developments have revealed that these previously developed algorithms were based on only two of the three factors determining performance. Consideration of only the third factor would instead lead to conventional optimization techniques like simulated annealing that have little to do with non-cooperative games. In this paper we present an algorithm based on all three terms at once. This algorithm can be viewed as a way to modify simulated annealing by recasting it as a non-cooperative game, with each variable replaced by a player. This recasting allows us to leverage the intelligent behavior of the individual players to substantially improve the exploration step of the simulated annealing. Experiments are presented demonstrating that this recasting significantly improves simulated annealing for a model of an economic process run over an underlying small-worlds topology. Furthermore, these experiments reveal novel small-worlds phenomena, and highlight the shortcomings of conventional mechanism design in bounded rationality domains.

  13. Improving Simulated Annealing by Recasting it as a Non-Cooperative Game

    Science.gov (United States)

    Wolpert, David; Bandari, Esfandiar; Tumer, Kagan

    2001-01-01

    The game-theoretic field of COllective INtelligence (COIN) concerns the design of computer-based players engaged in a non-cooperative game so that as those players pursue their self-interests, a pre-specified global goal for the collective computational system is achieved "as a side-effect". Previous implementations of COIN algorithms have outperformed conventional techniques by up to several orders of magnitude, on domains ranging from telecommunications control to optimization in congestion problems. Recent mathematical developments have revealed that these previously developed game-theory-motivated algorithms were based on only two of the three factors determining performance. Consideration of only the third factor would instead lead to conventional optimization techniques like simulated annealing that have little to do with non-cooperative games. In this paper we present an algorithm based on all three terms at once. This algorithm can be viewed as a way to modify simulated annealing by recasting it as a non-cooperative game, with each variable replaced by a player. This recasting allows us to leverage the intelligent behavior of the individual players to substantially improve the exploration step of the simulated annealing. Experiments are presented demonstrating that this recasting improves simulated annealing by several orders of magnitude for spin glass relaxation and bin-packing.

  14. Computer-Assisted Scheduling of Army Unit Training: An Application of Simulated Annealing.

    Science.gov (United States)

    Hart, Roland J.; Goehring, Dwight J.

    This report of an ongoing research project intended to provide computer assistance to Army units for the scheduling of training focuses on the feasibility of simulated annealing, a heuristic approach for solving scheduling problems. Following an executive summary and brief introduction, the document is divided into three sections. First, the Army…

  15. Using genetic/simulated annealing algorithm to solve disassembly sequence planning

    Institute of Scientific and Technical Information of China (English)

    Wu Hao; Zuo Hongfu

    2009-01-01

    disassembly sequence. The solution methodology, based on the genetic/simulated annealing algorithm with a binary-tree algorithm, is given. Finally, an example is analyzed in detail, and the result shows that the model is correct and efficient.

  16. MASTR: multiple alignment and structure prediction of non-coding RNAs using simulated annealing

    DEFF Research Database (Denmark)

    Lindgreen, Stinus; Gardner, Paul P; Krogh, Anders

    2007-01-01

    multiple alignment of RNA sequences. Using Markov chain Monte Carlo in a simulated annealing framework, the algorithm MASTR (Multiple Alignment of STructural RNAs) iteratively improves both sequence alignment and structure prediction for a set of RNA sequences. This is done by minimizing a combined cost...

  17. Crystallization on a sphere using the simulated annealing algorithm implemented on H.P.C. systems

    NARCIS (Netherlands)

    J.M. Voogd; P.M.A. Sloot

    1993-01-01

    The research presented here is a comparison of the scalability of the simulated annealing algorithm on a vector supercomputer (CRAY Y-MP) with the scalability of a parallel implementation on a massively parallel transputer surface (Parsytec GCel with 512 nodes of type T805). Some results of the anne

  18. Multiobjective optimization with a modified simulated annealing algorithm for external beam radiotherapy treatment planning

    International Nuclear Information System (INIS)

    Inverse planning in external beam radiotherapy often requires a scalar objective function that incorporates importance factors to mimic the planner's preferences between conflicting objectives. Defining those importance factors is not straightforward, and frequently leads to an iterative process in which the importance factors become variables of the optimization problem. In order to avoid this drawback of inverse planning, optimization using algorithms more suited to multiobjective optimization, such as evolutionary algorithms, has been suggested. However, much inverse planning software, including one based on simulated annealing developed at our institution, does not include multiobjective-oriented algorithms. This work investigates the performance of a modified simulated annealing algorithm used to drive aperture-based intensity-modulated radiotherapy inverse planning software in a multiobjective optimization framework. For a few test cases involving gastric cancer patients, the use of this new algorithm leads to an increase in optimization speed of a little more than a factor of 2 over a conventional simulated annealing algorithm, while giving a close approximation of the solutions produced by a standard simulated annealing. A simple graphical user interface designed to facilitate the decision-making process that follows an optimization is also presented

  19. X-ray refinement of protein structures by simulated annealing: Test of the method on myohemerythrin

    International Nuclear Information System (INIS)

    The recently developed method of structure factor refinement by molecular dynamics with simulated annealing is tested on the 118 residue protein myohemerythrin. A highly refined structure for this protein at 1.3/1.7 A resolution has recently been published. This is compared with the results of simulated annealing refinement (with no manual intervention) starting from an earlier model for the protein from a stage in the refinement when conventional least-squares methods could not improve the structure. Simulated annealing reduces the R factor at 2.5 A from 39 to 31%, with uniform temperature factors and no solvent molecules and with similar stereochemistry; the comparable value for the manually refined structure is 27.9%. Errors in backbone and sidechain positions up to about 3 A are corrected by the method. The error in backbone positions for roughly 85% of the initial structure is within this range, and in these regions the r.m.s. backbone error is reduced from 1.1 to 0.4 A. For the rest of the structure, including a region which was incorrectly built due to a sequence error, the procedure does not yield any improvement and manual intervention appears to be required. Nevertheless, the overall improvement in the structure results in electron density maps that are easier to interpret and permit identification of the errors in the structure. The general utility of the simulated annealing methodology in X-ray refinement is discussed. (orig.)

  20. Hybridization of Genetic Algorithm with Parallel Implementation of Simulated Annealing for Job Shop Scheduling

    Directory of Open Access Journals (Sweden)

    Thamilselvan Rakkiannan

    2012-01-01

    Full Text Available Problem statement: The Job Shop Scheduling Problem (JSSP) is regarded as one of the most difficult NP-hard combinatorial problems. The problem consists of determining the most efficient schedule for jobs that are processed on several machines. Approach: In this study, the Genetic Algorithm (GA) is integrated with a parallel version of the Simulated Annealing algorithm (SA) and applied to the job shop scheduling problem. The proposed algorithm is implemented in a distributed environment using the Remote Method Invocation concept. A new genetic operator and a parallel simulated annealing algorithm are developed for solving job shop scheduling. Results: The implementation is carried out successfully to examine the convergence and effectiveness of the proposed hybrid algorithm. The approach is tested on well-known JSSP benchmark problems, which are used to measure the quality of the proposed system. Conclusion/Recommendations: The empirical results show that the proposed genetic algorithm with simulated annealing is quite successful in achieving better solutions than the individual genetic or simulated annealing algorithms.
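    A single-process sketch of this hybridization pattern: a genetic algorithm handles global recombination while a short simulated annealing run refines each offspring. The toy single-machine weighted-completion-time objective stands in for a full job-shop decoder, and all operators and parameters are assumptions; the distributed RMI implementation described above is not reproduced.

```python
import math
import random

def schedule_cost(perm, jobs):
    """Toy stand-in for a job-shop decoder: total weighted completion time
    of jobs processed in the given order on a single machine."""
    t = total = 0.0
    for j in perm:
        t += jobs[j][0]                  # processing time
        total += jobs[j][1] * t          # weight * completion time
    return total

def sa_refine(perm, jobs, rng, t0=10.0, alpha=0.9, iters=200):
    """Short simulated annealing run used as a local improvement step."""
    cur, t = schedule_cost(perm, jobs), t0
    for _ in range(iters):
        i, j = rng.sample(range(len(perm)), 2)
        perm[i], perm[j] = perm[j], perm[i]
        cand = schedule_cost(perm, jobs)
        if cand <= cur or rng.random() < math.exp((cur - cand) / t):
            cur = cand
        else:
            perm[i], perm[j] = perm[j], perm[i]
        t *= alpha
    return perm

def hybrid_ga_sa(jobs, pop_size=20, generations=50, seed=0):
    rng = random.Random(seed)
    n = len(jobs)
    pop = [rng.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda p: schedule_cost(p, jobs))
        parents = pop[: pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + [g for g in b if g not in a[:cut]]   # order crossover
            children.append(sa_refine(child, jobs, rng))           # SA as local refinement
        pop = parents + children
    best = min(pop, key=lambda p: schedule_cost(p, jobs))
    return best, schedule_cost(best, jobs)

if __name__ == "__main__":
    rng = random.Random(6)
    jobs = [(rng.randint(1, 10), rng.randint(1, 5)) for _ in range(15)]
    print(round(hybrid_ga_sa(jobs)[1], 1))
```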

  1. Simulated-quantum-annealing comparison between all-to-all connectivity schemes

    Science.gov (United States)

    Albash, Tameem; Vinci, Walter; Lidar, Daniel A.

    2016-08-01

    Quantum annealing aims to exploit quantum mechanics to speed up the search for the solution to optimization problems. Most problems exhibit complete connectivity between the logical spin variables after they are mapped to the Ising spin Hamiltonian of quantum annealing. To account for hardware constraints of current and future physical quantum annealers, methods enabling the embedding of fully connected graphs of logical spins into a constant-degree graph of physical spins are therefore essential. Here, we compare the recently proposed embedding scheme for quantum annealing with all-to-all connectivity by Lechner, Hauke, and Zoller (LHZ) [Sci. Adv. 1, e1500838 (2015), 10.1126/sciadv.1500838] to the commonly used minor embedding (ME) scheme. Using both simulated quantum annealing and parallel tempering simulations, we find that for a set of instances randomly chosen from a class of fully connected, random Ising problems, the ME scheme outperforms the LHZ scheme when using identical simulation parameters, despite the fault tolerance of the latter to weakly correlated spin-flip noise. This result persists even after we introduce several decoding strategies for the LHZ scheme, including a minimum-weight decoding algorithm that results in substantially improved performance over the original LHZ scheme. We explain the better performance of the ME scheme in terms of more efficient spin updates, which allows it to better tolerate the correlated spin-flip errors that arise in our model of quantum annealing. Our results leave open the question of whether the performance of the two embedding schemes can be improved using scheme-specific parameters and new error correction approaches.

  2. Synthesis of optimal digital shapers with arbitrary noise using simulated annealing

    Energy Technology Data Exchange (ETDEWEB)

    Regadío, Alberto, E-mail: aregadio@srg.aut.uah.es [Department of Computer Engineering, Space Research Group, Universidad de Alcalá, 28805 Alcalá de Henares (Spain); Electronic Technology Area, Instituto Nacional de Técnica Aeroespacial, 28850 Torrejón de Ardoz (Spain); Sánchez-Prieto, Sebastián, E-mail: sebastian.sanchez@uah.es [Department of Computer Engineering, Space Research Group, Universidad de Alcalá, 28805 Alcalá de Henares (Spain); Tabero, Jesús, E-mail: taberogj@inta.es [Electronic Technology Area, Instituto Nacional de Técnica Aeroespacial, 28850 Torrejón de Ardoz (Spain)

    2014-02-21

    This paper presents the structure, design and implementation of a new way of determining the optimal shaping in time-domain for spectrometers by means of simulated annealing. The proposed algorithm is able to adjust automatically and in real-time the coefficients for shaping an input signal. A practical prototype was designed, implemented and tested on a PowerPC 405 embedded in a Field Programmable Gate Array (FPGA). Lastly, its performance and capabilities were measured using simulations and a neutron monitor.

  3. Estimation of Mutual Coupling Coefficient of the Array by Simulated Annealing Algorithm

    Institute of Scientific and Technical Information of China (English)

    GAO Huo-tao; ZHENG Xia; LI Yong-xu

    2005-01-01

    In this paper we propose a method for estimating the mutual coupling coefficients among array antennas, based on the signal-subspace principle and the simulated annealing (SA) algorithm. Computer simulations have been conducted to illustrate the excellent performance of this method and to demonstrate that it is statistically efficient. The benefit of this new method is that calibration signals and unknown signals can be received simultaneously during the course of calibration.

  4. Simulated annealing algorithm for the optimal translation sequence of the jth agent in rough communication

    Institute of Scientific and Technical Information of China (English)

    Wang Hongkai; Guan Yanyong; Xue Peijun

    2008-01-01

    In rough communication, because each agent has a different language and cannot communicate precisely with the others, a concept translated among multiple agents will lose some information, resulting in a coarser (rougher) concept. The amount of information lost varies with the translation sequence. To find the translation sequence in which the jth agent taking part in rough communication receives maximum information, a simulated annealing algorithm is used. Analysis and simulation of this algorithm demonstrate its effectiveness.

  5. Experiences with serial and parallel algorithms for channel routing using simulated annealing

    Science.gov (United States)

    Brouwer, Randall Jay

    1988-01-01

    Two algorithms for channel routing using simulated annealing are presented. Simulated annealing is an optimization methodology which allows the solution process to back up out of local minima that may be encountered by inappropriate selections. By properly controlling the annealing process, it is very likely that the optimal solution to an NP-complete problem such as channel routing may be found. The algorithm presented proposes very relaxed restrictions on the types of allowable transformations, including overlapping nets. By freeing that restriction and controlling overlap situations with an appropriate cost function, the algorithm becomes very flexible and can be applied to many extensions of channel routing. The selection of the transformation utilizes a number of heuristics, still retaining the pseudorandom nature of simulated annealing. The algorithm was implemented as a serial program for a workstation, and a parallel program designed for a hypercube computer. The details of the serial implementation are presented, including many of the heuristics used and some of the resulting solutions.
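    As background for this and several of the following records, the accept/reject mechanism that lets simulated annealing "back up out of local minima" can be made concrete with a minimal generic loop. This is a sketch, not the implementation from the record above: the neighbour and cost functions (which, for channel routing, would encode the allowed transformations and an overlap penalty), the starting temperature and the geometric cooling factor are all illustrative assumptions.

```python
import math
import random

def simulated_annealing(initial, neighbor, cost, t0=100.0, alpha=0.95,
                        steps_per_t=200, t_min=1e-3):
    """Generic SA loop: uphill moves are accepted with probability
    exp(-delta/T), which lets the search escape local minima."""
    current, current_cost = initial, cost(initial)
    best, best_cost = current, current_cost
    t = t0
    while t > t_min:
        for _ in range(steps_per_t):
            candidate = neighbor(current)
            delta = cost(candidate) - current_cost
            if delta <= 0 or random.random() < math.exp(-delta / t):
                current, current_cost = candidate, current_cost + delta
                if current_cost < best_cost:
                    best, best_cost = current, current_cost
        t *= alpha  # geometric cooling schedule
    return best, best_cost
```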

  6. Folding simulations of gramicidin A into the β-helix conformations: Simulated annealing molecular dynamics study

    Science.gov (United States)

    Mori, Takaharu; Okamoto, Yuko

    2009-10-01

    Gramicidin A is a linear hydrophobic 15-residue peptide which consists of alternating D- and L-amino acids and forms a unique tertiary structure, called the β6.3-helix, to act as a cation-selective ion channel under natural conditions. In order to investigate the intrinsic ability of the gramicidin A monomer to form secondary structures, we performed folding simulations of gramicidin A using a simulated annealing molecular dynamics (MD) method in vacuum, mimicking the low-dielectric, homogeneous membrane environment. The initial conformation was a fully extended one. From the 200 different MD runs, we obtained a right-handed β4.4-helix as the lowest-potential-energy structure, and a left-handed β4.4-helix and right-handed and left-handed β6.3-helices as local-minimum-energy states. These results are in accord with those of experiments on gramicidin A in homogeneous organic solvent. Our simulations showed a slight right-hand sense in the lower-energy conformations and a strong β-sheet-forming tendency throughout almost the entire sequence. In order to examine the stability of the obtained right-handed β6.3-helix and β4.4-helix structures in a more realistic membrane environment, we also performed all-atom MD simulations in explicit water, ion, and lipid molecules, starting from these β-helix structures. The results suggested that the β6.3-helix is more stable than the β4.4-helix in the inhomogeneous, explicit membrane environment, where the pore water and the hydrogen bonds between Trp side-chains and lipid-head groups play a role in further stabilizing the β6.3-helix conformation.

  7. Adaptive Multilevel Monte Carlo Simulation

    KAUST Repository

    Hoel, H

    2011-08-23

    This work generalizes a multilevel forward Euler Monte Carlo method introduced by Michael B. Giles (Oper. Res. 56(3):607–617, 2008) for the approximation of expected values depending on the solution to an Itô stochastic differential equation. That work proposed and analyzed a forward Euler multilevel Monte Carlo method based on a hierarchy of uniform time discretizations and control variates to reduce the computational effort required by a standard, single level, forward Euler Monte Carlo method. This work introduces an adaptive hierarchy of non-uniform time discretizations, generated by an adaptive algorithm introduced in (Anna Dzougoutov et al. Adaptive Monte Carlo algorithms for stopped diffusion. In Multiscale methods in science and engineering, volume 44 of Lect. Notes Comput. Sci. Eng., pages 59–88. Springer, Berlin, 2005; Kyoung-Sook Moon et al. Stoch. Anal. Appl. 23(3):511–558, 2005; Kyoung-Sook Moon et al. An adaptive algorithm for ordinary, stochastic and partial differential equations. In Recent advances in adaptive computation, volume 383 of Contemp. Math., pages 325–343. Amer. Math. Soc., Providence, RI, 2005). This form of the adaptive algorithm generates stochastic, path-dependent time steps and is based on a posteriori error expansions first developed in (Anders Szepessy et al. Comm. Pure Appl. Math. 54(10):1169–1214, 2001). Our numerical results for a stopped diffusion problem exhibit savings in the computational cost to achieve an accuracy of O(TOL): from O(TOL^-3) for a single-level version of the adaptive algorithm to O((TOL^-1 log(TOL))^2).

  8. The Parameters Optimization of MCR-WPT System Based on the Improved Genetic Simulated Annealing Algorithm

    Directory of Open Access Journals (Sweden)

    Sheng Lu

    2015-01-01

    Full Text Available To solve the problem of parameter selection during the design of a magnetically coupled resonant wireless power transmission (MCR-WPT) system, this paper proposes an improved genetic simulated annealing algorithm. Firstly, the equivalent circuit of the system is analysed and a nonlinear programming mathematical model is built. Secondly, in place of the penalty function method in the genetic algorithm, a selection strategy based on the distance between individuals is adopted to select individuals; this reduces the number of empirical parameters. Meanwhile, the convergence rate and searching ability are improved by calculating the crossover and mutation probabilities according to the variance of the population's fitness. Finally, a simulated annealing operator is added to increase the local search ability of the method. Simulations show that the improved method can escape local optima and reach the global optimum faster. The optimized system meets the practical requirements.
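    The adaptive-probability idea described in this record can be sketched as follows. The exact mapping from fitness variance to crossover/mutation rates, the parameter ranges, and the Metropolis-style acceptance rule below are illustrative assumptions, not the published formulas.

```python
import math
import random
import statistics

def adaptive_rates(fitnesses, pc_range=(0.6, 0.9), pm_range=(0.01, 0.1)):
    """Illustrative rule: map the population fitness variance to crossover (pc)
    and mutation (pm) probabilities; low variance (a converging population)
    pushes both rates up to preserve diversity."""
    scale = 1.0 / (1.0 + statistics.pvariance(fitnesses))   # in (0, 1]
    pc = pc_range[0] + (pc_range[1] - pc_range[0]) * scale
    pm = pm_range[0] + (pm_range[1] - pm_range[0]) * scale
    return pc, pm

def sa_accept(child_fitness, parent_fitness, temperature):
    """Simulated annealing operator appended to the GA: always keep a better
    child, and keep a worse one with Metropolis probability (maximisation)."""
    if child_fitness >= parent_fitness:
        return True
    return random.random() < math.exp(-(parent_fitness - child_fitness) / temperature)
```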

  9. Simulated annealing: an application in fine particle magnetism

    Energy Technology Data Exchange (ETDEWEB)

    Legeratos, A.; Chantrell, R.W.; Wohlfarth, E.P.

    1985-07-01

    Using a model of a system of interacting fine ferromagnetic particles, a computer simulation of the dynamical approach to local or global minima of the system is developed for two different schedules of the application of ac and dc magnetic fields. The process of optimization, i.e., the achievement of a global minimum, depends on the rate of reduction of the ac field and on the symmetry of the ac field cycles. The calculations carried out to illustrate these effects include remanence curves and the zero field remanence for both schedules under different conditions. The growth of the magnetization during these processes was studied, and the interaction energy was calculated to best illustrate the optimization.

  10. An Improved Simulated Annealing Technique for Enhanced Mobility in Smart Cities

    Directory of Open Access Journals (Sweden)

    Hayder Amer

    2016-06-01

    Full Text Available Vehicular traffic congestion is a significant problem that arises in many cities. This is due to the increasing number of vehicles that are driving on city roads of limited capacity. The vehicular congestion significantly impacts travel distance, travel time, fuel consumption and air pollution. Avoidance of traffic congestion and providing drivers with optimal paths are not trivial tasks. The key contribution of this work consists of the developed approach for dynamic calculation of optimal traffic routes. Two attributes (the average travel speed of the traffic and the roads’ length) are utilized by the proposed method to find the optimal paths. The average travel speed values can be obtained from the sensors deployed in smart cities and communicated to vehicles via the Internet of Vehicles and roadside communication units. The performance of the proposed algorithm is compared to three other algorithms: the simulated annealing weighted sum, the simulated annealing technique for order preference by similarity to the ideal solution and the Dijkstra algorithm. The weighted sum and technique for order preference by similarity to the ideal solution methods are used to formulate different attributes in the simulated annealing cost function. According to the Sheffield scenario, simulation results show that the improved simulated annealing technique for order preference by similarity to the ideal solution method improves the traffic performance in the presence of congestion by an overall average of 19.22% in terms of travel time, fuel consumption and CO2 emissions as compared to other algorithms; also, similar performance patterns were achieved for the Birmingham test scenario.
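    The weighted-sum formulation mentioned in this record combines the two route attributes into a single simulated annealing cost. The following sketch assumes per-segment speed and length arrays and illustrative weights; it is not the cost function published by the authors, and the TOPSIS-style variant they favour combines the same attributes differently.

```python
def route_cost(route, speeds, lengths, w_time=0.7, w_length=0.3):
    """Weighted-sum cost of a candidate route (list of road-segment ids).
    speeds: average travel speed per segment (km/h); lengths: segment length (km).
    The weights are illustrative assumptions."""
    travel_time = sum(lengths[r] / max(speeds[r], 1e-6) for r in route)  # hours
    total_length = sum(lengths[r] for r in route)                        # km
    return w_time * travel_time + w_length * total_length
```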

  11. An Improved Simulated Annealing Technique for Enhanced Mobility in Smart Cities.

    Science.gov (United States)

    Amer, Hayder; Salman, Naveed; Hawes, Matthew; Chaqfeh, Moumena; Mihaylova, Lyudmila; Mayfield, Martin

    2016-01-01

    Vehicular traffic congestion is a significant problem that arises in many cities. This is due to the increasing number of vehicles that are driving on city roads of limited capacity. The vehicular congestion significantly impacts travel distance, travel time, fuel consumption and air pollution. Avoidance of traffic congestion and providing drivers with optimal paths are not trivial tasks. The key contribution of this work consists of the developed approach for dynamic calculation of optimal traffic routes. Two attributes (the average travel speed of the traffic and the roads' length) are utilized by the proposed method to find the optimal paths. The average travel speed values can be obtained from the sensors deployed in smart cities and communicated to vehicles via the Internet of Vehicles and roadside communication units. The performance of the proposed algorithm is compared to three other algorithms: the simulated annealing weighted sum, the simulated annealing technique for order preference by similarity to the ideal solution and the Dijkstra algorithm. The weighted sum and technique for order preference by similarity to the ideal solution methods are used to formulate different attributes in the simulated annealing cost function. According to the Sheffield scenario, simulation results show that the improved simulated annealing technique for order preference by similarity to the ideal solution method improves the traffic performance in the presence of congestion by an overall average of 19.22% in terms of travel time, fuel consumption and CO₂ emissions as compared to other algorithms; also, similar performance patterns were achieved for the Birmingham test scenario. PMID:27376289

  12. An Improved Simulated Annealing Technique for Enhanced Mobility in Smart Cities

    Science.gov (United States)

    Amer, Hayder; Salman, Naveed; Hawes, Matthew; Chaqfeh, Moumena; Mihaylova, Lyudmila; Mayfield, Martin

    2016-01-01

    Vehicular traffic congestion is a significant problem that arises in many cities. This is due to the increasing number of vehicles that are driving on city roads of limited capacity. The vehicular congestion significantly impacts travel distance, travel time, fuel consumption and air pollution. Avoidance of traffic congestion and providing drivers with optimal paths are not trivial tasks. The key contribution of this work consists of the developed approach for dynamic calculation of optimal traffic routes. Two attributes (the average travel speed of the traffic and the roads’ length) are utilized by the proposed method to find the optimal paths. The average travel speed values can be obtained from the sensors deployed in smart cities and communicated to vehicles via the Internet of Vehicles and roadside communication units. The performance of the proposed algorithm is compared to three other algorithms: the simulated annealing weighted sum, the simulated annealing technique for order preference by similarity to the ideal solution and the Dijkstra algorithm. The weighted sum and technique for order preference by similarity to the ideal solution methods are used to formulate different attributes in the simulated annealing cost function. According to the Sheffield scenario, simulation results show that the improved simulated annealing technique for order preference by similarity to the ideal solution method improves the traffic performance in the presence of congestion by an overall average of 19.22% in terms of travel time, fuel consumption and CO2 emissions as compared to other algorithms; also, similar performance patterns were achieved for the Birmingham test scenario. PMID:27376289

  13. Experimental and Numerical Simulations of Phase Transformations Occurring During Continuous Annealing of DP Steel Strips

    Science.gov (United States)

    Wrożyna, Andrzej; Pernach, Monika; Kuziak, Roman; Pietrzyk, Maciej

    2016-04-01

    Due to their exceptional strength properties combined with good workability, Advanced High-Strength Steels (AHSS) are commonly used in the automotive industry. Manufacturing of these steels is a complex process which requires precise control of technological parameters during thermo-mechanical treatment. Design of these processes can be significantly improved by numerical models of phase transformations. Evaluation of the predictive capabilities of such models, as far as their applicability in the simulation of thermal cycles for AHSS is considered, was the objective of the paper. Two models were considered. The former was an upgrade of the JMAK equation while the latter was an upgrade of the Leblond model. The models can be applied to any AHSS, though the examples quoted in the paper refer to Dual Phase (DP) steel. Three series of experimental simulations were performed. The first included various thermal cycles going beyond the limitations of continuous annealing lines; the objective was to validate the models' behavior in more complex cooling conditions. The second set of tests included experimental simulations of the thermal cycle characteristic of continuous annealing lines, and the capability of the models to properly describe phase transformations in this process was evaluated. The third set included data from an industrial continuous annealing line. Validation and verification of the models confirmed their good predictive capabilities. Since it does not require application of the additivity rule, the upgrade of the Leblond model was selected as the better one for simulation of industrial processes in AHSS production.

  14. Humanoid robot gait optimization: Stretched simulated annealing and genetic algorithm a comparative study

    Science.gov (United States)

    Pereira, Ana I.; Lima, José; Costa, Paulo

    2013-10-01

    There are several approaches to humanoid robot gait planning. This problem involves a large number of unknown parameters that must be found to make the humanoid robot walk. Optimization in simulation models can be used to find the gait based on several criteria such as energy minimization, acceleration, and step length, among others. The present paper addresses a comparison between two optimization methods, Stretched Simulated Annealing and the Genetic Algorithm, which run in an accurate and stable simulation model. Final results present the comparative study and demonstrate that optimization is a valid gait planning technique.

  15. Using genetic algorithm based simulated annealing penalty function to solve groundwater management model

    Institute of Scientific and Technical Information of China (English)

    吴剑锋; 朱学愚; 刘建立

    1999-01-01

    The genetic algorithm (GA) is a global, random search procedure based on the mechanics of natural selection and natural genetics. A new optimization method, the genetic algorithm-based simulated annealing penalty function (GASAPF), is presented to solve a groundwater management model. Compared with traditional gradient-based algorithms, the GA is straightforward and there is no need to calculate derivatives of the objective function. The GA is able to generate both convex and nonconvex points within the feasible region. By handling the constraints with a simulated annealing technique, the GA can be expected to converge to the global, or at least a near-global, optimal solution. Maximum-pumping example results show that the GASAPF approach to solving the optimization model is very efficient and robust.

  16. Optimización Global “Simulated Annealing”

    Directory of Open Access Journals (Sweden)

    Francisco Sánchez Mares

    2006-01-01

    Full Text Available This work presents the application of the global optimization method Simulated Annealing (SA). This technique has been applied in various areas of engineering as a robust and versatile strategy for successfully computing the global minimum of a function or a system of functions. To test the efficiency of the method, the global minima of an arbitrary function were found, and the numerical behaviour of Simulated Annealing was evaluated during convergence to the two solutions presented by the case study.

  17. Hand Motion Tracking Using Simulated Annealing Method in a Discrete Space

    Institute of Scientific and Technical Information of China (English)

    LIANG Wei; JIA Yun-de; LIU Tang-li; HAN Lei; WU Xin-xiao

    2007-01-01

    Hand tracking is a challenging problem due to the complexity of searching a space with more than 20 degrees of freedom (DOF) for an optimal estimate of the hand configuration. The feasible hand configurations are represented as a discrete space, which avoids the parameter learning required by general configuration-space representations. An extended simulated annealing method with particle filtering is then used to search for the optimal hand configuration in the proposed discrete space: a simplex search running on multiple processors is designed to predict the hand motion instead of initializing the simulated annealing randomly, and particle filtering is employed to represent the state of the tracker at each layer for searching the high-dimensional configuration space. Experimental results show that the proposed method makes hand tracking more efficient and robust.

  18. The generalized simulated annealing algorithm in the low energy electron diffraction search problem

    International Nuclear Information System (INIS)

    In this work we present results concerning the application of the generalized simulated annealing (GSA) algorithm to the LEED search problem. The influence of the visiting distribution function (defined by the so-called qV parameter) on the effectiveness of the method was investigated by applying the algorithm to structural searches optimizing two to ten parameters in a theory-theory comparison for the CdTe(110) system. The results, obtained with the scaling relation and the probability of convergence as a function of the number of parameters to be varied, indicate the fast simulated annealing (FSA) approach (qV = 2.0) as the best search strategy.
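    For readers unfamiliar with the qV parameter, the visiting-temperature schedules commonly quoted for classical, fast and generalized simulated annealing can be sketched as below. Constants and indexing conventions vary between papers, so this is an illustrative approximation rather than the authors' code; qv must be greater than 1 and the step index k at least 1.

```python
import math

def boltzmann_temperature(t0, k):
    """Classical (Boltzmann) annealing schedule; k is the annealing step, k >= 1."""
    return t0 / math.log(1.0 + k)

def fast_annealing_temperature(t0, k):
    """Fast simulated annealing (Cauchy visiting distribution), i.e. qV = 2."""
    return t0 / (1.0 + k)

def gsa_temperature(t0, k, qv):
    """Tsallis-type generalized schedule for qV > 1; up to constant factors and
    indexing conventions it interpolates between the two cases above."""
    return t0 * (2.0 ** (qv - 1.0) - 1.0) / ((1.0 + k) ** (qv - 1.0) - 1.0)
```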

  19. Access control for MPEG video applications using neural network and simulated annealing

    Directory of Open Access Journals (Sweden)

    Ahmed N. U.

    2004-01-01

    Full Text Available We present a dynamic model for an access control mechanism used in computer communication networks, applied to MPEG video transmission over the Internet. This model is different from those developed in previous works related to this topic. In our model, token buckets supported by data buffers are used to shape incoming traffic, and one multiplexor, serving all the token pools, multiplexes all the conforming traffic. The model is governed by a system of discrete nonlinear difference equations. We use a neural network as the feedback controller, which receives at its input the measurable available information and provides at its output the optimal control. The simulated annealing algorithm is used to optimize the system performance by adjusting the weights. For illustration, we present numerical results which show that the performance of an MPEG video server can be improved by using the neural network and simulated annealing approach.

  20. Fast and accurate protein substructure searching with simulated annealing and GPUs

    Directory of Open Access Journals (Sweden)

    Stivala Alex D

    2010-09-01

    Full Text Available Abstract Background Searching a database of protein structures for matches to a query structure, or occurrences of a structural motif, is an important task in structural biology and bioinformatics. While there are many existing methods for structural similarity searching, faster and more accurate approaches are still required, and few current methods are capable of substructure (motif) searching. Results We developed an improved heuristic for tableau-based protein structure and substructure searching using simulated annealing that is as fast as or faster than, and comparable in accuracy with, some widely used existing methods. Furthermore, we created a parallel implementation on a modern graphics processing unit (GPU). Conclusions The GPU implementation achieves up to 34 times speedup over the CPU implementation of tableau-based structure search with simulated annealing, making it one of the fastest available methods. To the best of our knowledge, this is the first application of a GPU to the protein structural search problem.

  1. Optimizing the natural connectivity of scale-free networks using simulated annealing

    Science.gov (United States)

    Duan, Boping; Liu, Jing; Tang, Xianglong

    2016-09-01

    In real-world networks, the path between two nodes always plays a significant role in the fields of communication or transportation. In some cases, when one path fails, the two nodes cannot communicate any more. Thus, it is necessary to increase alternative paths between nodes. In the recent work (Wu et al., 2011), Wu et al. proposed the natural connectivity as a novel robustness measure of complex networks. The natural connectivity considers the redundancy of alternative paths in a network by computing the number of closed paths of all lengths. To enhance the robustness of networks in terms of the natural connectivity, in this paper, we propose a simulated annealing method to optimize the natural connectivity of scale-free networks without changing the degree distribution. The experimental results show that the simulated annealing method clearly outperforms other local search methods.
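    The two ingredients named in this record, a degree-preserving rewiring move and the natural connectivity objective, can be sketched as follows. This is an illustrative reconstruction from the standard definitions (Wu et al.), not the authors' code; a simulated annealing driver would propose swaps, evaluate the natural connectivity, and accept moves with the usual Metropolis rule. Note that exponentiating the eigenvalues may overflow for very large graphs.

```python
import numpy as np

def natural_connectivity(adj):
    """Natural connectivity: log of the average of exp(eigenvalues) of the
    (symmetric 0/1) adjacency matrix, a measure of redundancy of alternative paths."""
    eig = np.linalg.eigvalsh(adj)
    return np.log(np.mean(np.exp(eig)))

def degree_preserving_swap(adj, rng):
    """Propose a double-edge swap (a-b, c-d) -> (a-d, c-b), which keeps every node
    degree unchanged; returns None if the swap would create a self-loop or a
    duplicate edge. 'rng' is a numpy.random.Generator."""
    edges = np.argwhere(np.triu(adj) > 0)
    (a, b), (c, d) = edges[rng.choice(len(edges), 2, replace=False)]
    if len({a, b, c, d}) < 4 or adj[a, d] or adj[c, b]:
        return None
    new = adj.copy()
    new[a, b] = new[b, a] = new[c, d] = new[d, c] = 0
    new[a, d] = new[d, a] = new[c, b] = new[b, c] = 1
    return new
```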

  2. Automatic Clustering Using Multi-objective Particle Swarm and Simulated Annealing.

    Directory of Open Access Journals (Sweden)

    Ahmad Abubaker

    Full Text Available This paper puts forward a new automatic clustering algorithm based on Multi-Objective Particle Swarm Optimization and Simulated Annealing, "MOPSOSA". The proposed algorithm is capable of automatic clustering which is appropriate for partitioning datasets to a suitable number of clusters. MOPSOSA combines the features of the multi-objective based particle swarm optimization (PSO) and the Multi-Objective Simulated Annealing (MOSA). Three cluster validity indices were optimized simultaneously to establish the suitable number of clusters and the appropriate clustering for a dataset. The first cluster validity index is centred on Euclidean distance, the second on the point symmetry distance, and the last cluster validity index is based on short distance. A number of algorithms have been compared with the MOPSOSA algorithm in resolving clustering problems by determining the actual number of clusters and optimal clustering. Computational experiments were carried out to study fourteen artificial and five real life datasets.

  3. Automatic Clustering Using Multi-objective Particle Swarm and Simulated Annealing.

    Science.gov (United States)

    Abubaker, Ahmad; Baharum, Adam; Alrefaei, Mahmoud

    2015-01-01

    This paper puts forward a new automatic clustering algorithm based on Multi-Objective Particle Swarm Optimization and Simulated Annealing, "MOPSOSA". The proposed algorithm is capable of automatic clustering which is appropriate for partitioning datasets to a suitable number of clusters. MOPSOSA combines the features of the multi-objective based particle swarm optimization (PSO) and the Multi-Objective Simulated Annealing (MOSA). Three cluster validity indices were optimized simultaneously to establish the suitable number of clusters and the appropriate clustering for a dataset. The first cluster validity index is centred on Euclidean distance, the second on the point symmetry distance, and the last cluster validity index is based on short distance. A number of algorithms have been compared with the MOPSOSA algorithm in resolving clustering problems by determining the actual number of clusters and optimal clustering. Computational experiments were carried out to study fourteen artificial and five real life datasets.

  4. Use of simulated annealing in standardization and optimization of the acerola wine production

    Directory of Open Access Journals (Sweden)

    Sheyla dos Santos Almeida

    2014-06-01

    Full Text Available In this study, seven wine samples were prepared varying the amount of pulp of acerola fruits and the sugar content using the simulated annealing technique to obtain the optimal sensory qualities and cost for the wine produced. S. cerevisiae yeast was used in the fermentation process and the sensory attributes were evaluated using a hedonic scale. Acerola wines were classified as sweet, with 11°GL of alcohol concentration and with aroma, taste, and color characteristics of the acerola fruit. The simulated annealing experiments showed that the best conditions were found at mass ratio between 1/7.5-1/6 and total soluble solids between 28.6-29.0 °Brix, from which the sensory acceptance scores of 6.9, 6.8, and 8.8 were obtained for color, aroma, and flavor, respectively, with a production cost 43-45% lower than the cost of traditional wines commercialized in Brazil.

  5. Nanocrystallization behaviour of a ternary amorphous alloy during isothermal annealing: a Monte Carlo simulation

    Institute of Scientific and Technical Information of China (English)

    Jin Shi-Feng; Wang Wei-Min; Zhou Jian-Kun; Guo Hong-Xuan; J.F. Webb; Bian Xiu-Fang

    2005-01-01

    The nanocrystallization behaviour of Zr70Cu20Ni10 metallic glass during isothermal annealing is studied by employing a Monte Carlo simulation incorporating a modified Ising model and a Q-state Potts model. Based on the simulated microstructure and differential scanning calorimetry curves, we find that the low crystal-amorphous interface energy of Ni plays an important role in the nanocrystallization of primary Zr2Ni. It is found that when T < TImax (where TImax is the temperature with maximum nucleation rate), an increase in temperature results in a larger growth rate and a much finer microstructure for the primary Zr2Ni, which accords with the microstructure evolution in "flash annealing". Finally, the Zr2Ni/Zr2Cu interface energy σG contributes to the pinning effect of the primary nano-sized Zr2Ni grains within the later-formed normal Zr2Cu grains.

  6. Minimizing distortion and internal forces in truss structures by simulated annealing

    Science.gov (United States)

    Kincaid, Rex K.; Padula, Sharon L.

    1990-01-01

    Inaccuracies in the length of members and the diameters of joints of large space structures may produce unacceptable levels of surface distortion and internal forces. Here, two discrete optimization problems are formulated, one to minimize surface distortion (DSQRMS) and the other to minimize internal forces (FSQRMS). Both of these problems are based on the influence matrices generated by a small-deformation linear analysis. Good solutions are obtained for DSQRMS and FSQRMS through the use of a simulated annealing heuristic.

  7. A Simulated Annealing Algorithm for the Optimization of Multistage Depressed Collector Efficiency

    Science.gov (United States)

    Vaden, Karl R.; Wilson, Jeffrey D.; Bulson, Brian A.

    2002-01-01

    The microwave traveling wave tube amplifier (TWTA) is widely used as a high-power transmitting source for space and airborne communications. One critical factor in designing a TWTA is the overall efficiency. However, overall efficiency is highly dependent upon collector efficiency; so collector design is critical to the performance of a TWTA. Therefore, NASA Glenn Research Center has developed an optimization algorithm based on Simulated Annealing to quickly design highly efficient multi-stage depressed collectors (MDC).

  8. EXTRUSION DIE PROFILE DESIGN USING SIMULATED ANNEALING ALGORITHM AND PARTICLE SWARM OPTIMIZATION

    OpenAIRE

    R.VENKETESAN

    2010-01-01

    In this paper a new method has been proposed for the optimum shape design of an extrusion die. The design problem is formulated as an unconstrained optimization problem. Here non-traditional optimization techniques, namely the Simulated Annealing Algorithm and Particle Swarm Optimization, are used to minimize the extrusion force by optimizing the extrusion ratio and die cone angle. The internal power of deformation is also calculated and the results are compared.

  9. Application of simulated annealing to the biclustering of gene expression data

    OpenAIRE

    Bolshakova, Nadia; Cunningham, Padraig

    2006-01-01

    In a gene expression data matrix, a bicluster is a submatrix of genes and conditions that exhibits a high correlation of expression activity across both rows and columns. The problem of locating the most significant bicluster has been shown to be NP-complete. Heuristic approaches such as Cheng and Church's greedy node deletion algorithm have been previously employed. It is to be expected that stochastic search techniques such as evolutionary algorithms or simulated annealing...

  10. EXTRUSION DIE PROFILE DESIGN USING SIMULATED ANNEALING ALGORITHM AND PARTICLE SWARM OPTIMIZATION

    Directory of Open Access Journals (Sweden)

    R.VENKETESAN

    2010-08-01

    Full Text Available In this paper a new method has been proposed for the optimum shape design of an extrusion die. The design problem is formulated as an unconstrained optimization problem. Here non-traditional optimization techniques, namely the Simulated Annealing Algorithm and Particle Swarm Optimization, are used to minimize the extrusion force by optimizing the extrusion ratio and die cone angle. The internal power of deformation is also calculated and the results are compared.

  11. SIMULATED ANNEALING ALGORITHM FOR SCHEDULING DIVISIBLE LOAD IN LARGE SCALE DATA GRIDS

    OpenAIRE

    Monir Abdullah; Mohamed, Othman; Hamidah Ibrahim; Shamala Subramaniam

    2010-01-01

    In many data grid applications, data can be decomposed into multiple independent sub-datasets and distributed for parallel execution and analysis. This property has been successfully exploited using Divisible Load Theory (DLT). Many scheduling approaches have been studied, but there is no optimal solution. This paper proposes a novel Simulated Annealing (SA) algorithm for scheduling divisible load in large-scale data grids. The SA algorithm is integrated with the DLT model and compared with th...

  12. Optimal design of a DC MHD pump by simulated annealing method

    Directory of Open Access Journals (Sweden)

    Bouali Khadidja

    2014-01-01

    Full Text Available In this paper a design methodology for a magnetohydrodynamic pump is proposed. The methodology is based on direct interpretation of the design problem as an optimization problem. The simulated annealing method is used for the optimal design of a DC MHD pump. The optimization procedure uses an objective function, which can be the minimization of the mass. The constraints are both geometric and electromagnetic in type. The obtained results are reported.

  13. A parallel simulated annealing algorithm for standard cell placement on a hypercube computer

    Science.gov (United States)

    Jones, Mark Howard

    1987-01-01

    A parallel version of a simulated annealing algorithm is presented which is targeted to run on a hypercube computer. A strategy for mapping the cells in a two dimensional area of a chip onto processors in an n-dimensional hypercube is proposed such that both small and large distance moves can be applied. Two types of moves are allowed: cell exchanges and cell displacements. The computation of the cost function in parallel among all the processors in the hypercube is described along with a distributed data structure that needs to be stored in the hypercube to support parallel cost evaluation. A novel tree broadcasting strategy is used extensively in the algorithm for updating cell locations in the parallel environment. Studies on the performance of the algorithm on example industrial circuits show that it is faster and gives better final placement results than the uniprocessor simulated annealing algorithms. An improved uniprocessor algorithm is proposed which is based on the improved results obtained from parallelization of the simulated annealing algorithm.
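    The two move types described in this record (cell exchange and cell displacement) can be sketched as a serial move generator. The 50/50 mix of move types, the dictionary-based grid representation and the reject-on-occupied policy are illustrative assumptions, not the data structures of the cited thesis.

```python
import random

def propose_move(placement, grid_w, grid_h, rng=random):
    """Propose one placement move: either exchange the positions of two cells,
    or displace a single cell to an empty grid location.
    'placement' maps cell id -> (x, y) grid coordinates."""
    new = dict(placement)
    cells = list(new)
    if rng.random() < 0.5 and len(cells) >= 2:          # cell exchange
        a, b = rng.sample(cells, 2)
        new[a], new[b] = new[b], new[a]
    else:                                               # cell displacement
        c = rng.choice(cells)
        target = (rng.randrange(grid_w), rng.randrange(grid_h))
        if target in set(new.values()):
            return None                                 # occupied; caller retries
        new[c] = target
    return new
```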

  14. Validation of Sensor-Directed Spatial Simulated Annealing Soil Sampling Strategy.

    Science.gov (United States)

    Scudiero, Elia; Lesch, Scott M; Corwin, Dennis L

    2016-07-01

    Soil spatial variability has a profound influence on most agronomic and environmental processes at field and landscape scales, including site-specific management, vadose zone hydrology and transport, and soil quality. Mobile sensors are a practical means of mapping spatial variability because their measurements serve as a proxy for many soil properties, provided a sensor-soil calibration is conducted. A viable means of calibrating sensor measurements over soil properties is through linear regression modeling of sensor and target property data. In the present study, two sensor-directed, model-based, sampling scheme delineation methods were compared to validate recent applications of soil apparent electrical conductivity (EC)-directed spatial simulated annealing against the more established EC-directed response surface sampling design (RSSD) approach. A 6.8-ha study area near San Jacinto, CA, was surveyed for EC, and 30 soil sampling locations per sampling strategy were selected. Spatial simulated annealing and RSSD were compared for sensor calibration to a target soil property (i.e., salinity) and for evenness of spatial coverage of the study area, which is beneficial for mapping nontarget soil properties (i.e., those not correlated with EC). The results indicate that the linear modeling EC-salinity calibrations obtained from the two sampling schemes provided salinity maps characterized by similar errors. The maps of nontarget soil properties show similar errors across sampling strategies. The Spatial Simulated Annealing methodology is, therefore, validated, and its use in agronomic and environmental soil science applications is justified. PMID:27380070

  15. Broadband diffusion metasurface based on a single anisotropic element and optimized by the Simulated Annealing algorithm

    Science.gov (United States)

    Zhao, Yi; Cao, Xiangyu; Gao, Jun; Sun, Yu; Yang, Huanhuan; Liu, Xiao; Zhou, Yulong; Han, Tong; Chen, Wei

    2016-04-01

    We propose a new strategy to design broadband and wide-angle diffusion metasurfaces. An anisotropic structure which has opposite phases under x- and y-polarized incidence is employed as the “0” and “1” elements, based on the concept of coding metamaterials. To obtain uniform backward scattering under normal incidence, the Simulated Annealing algorithm is utilized in this paper to calculate the optimal layout. The proposed method provides an efficient way to design diffusion metasurfaces with a simple structure, which has been proved by both simulations and measurements.

  16. Broadband diffusion metasurface based on a single anisotropic element and optimized by the Simulated Annealing algorithm.

    Science.gov (United States)

    Zhao, Yi; Cao, Xiangyu; Gao, Jun; Sun, Yu; Yang, Huanhuan; Liu, Xiao; Zhou, Yulong; Han, Tong; Chen, Wei

    2016-01-01

    We propose a new strategy to design broadband and wide-angle diffusion metasurfaces. An anisotropic structure which has opposite phases under x- and y-polarized incidence is employed as the "0" and "1" elements, based on the concept of coding metamaterials. To obtain uniform backward scattering under normal incidence, the Simulated Annealing algorithm is utilized in this paper to calculate the optimal layout. The proposed method provides an efficient way to design diffusion metasurfaces with a simple structure, which has been proved by both simulations and measurements. PMID:27034110

  17. Molecular dynamics simulation of the deposition and annealing of NiAl film on Ni substrate

    Science.gov (United States)

    Wu, Bo; Zhou, Jianqiu; Xue, Chen; Liu, Hongxi

    2015-11-01

    A study is conducted to simulate the simultaneous deposition of Ni and Al atoms on a Ni(0 0 1) substrate by molecular dynamics. The effects of deposition conditions and annealing on the microstructure and morphology of the NiAl films are investigated. The empirical embedded-atom method is applied to calculate the atomic interactions between Ni and Al atoms. The simulation results indicate that increasing the incident energy (up to 15 eV) can roughen the surface of un-annealed films, enhance intermixing between the film and the substrate, increase the damage done by incident atoms to the perfect crystal structure of the substrate, and increase the content of vacancies and voids (up to 10 eV). Incident angles (up to 30°) have little influence on surface roughness, intermixing, and damage to the substrate; only the density changes with incident angle, in direct proportion. The substrate temperature has a similar but weaker effect than the incident energy on these properties, except for the density. After annealing, the films become smoother and more homogeneous, ordered and compact.

  18. Experimental investigations and simulation of the deactivation of arsenic during thermal processes after activation by SPER and spike annealing

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Limia, A. [Fraunhofer Institute of Integrated Systems and Device Technology, Schottkystrasse 10, 91058 Erlangen (Germany); Pichler, P. [Fraunhofer Institute of Integrated Systems and Device Technology, Schottkystrasse 10, 91058 Erlangen (Germany); Chair of Electron Devices, University of Erlangen-Nuremberg, Cauerstrasse 6, 91058 Erlangen (Germany)], E-mail: peter.pichler@iisb.fraunhofer.de; Lerch, W.; Paul, S. [Mattson Thermal Products GmbH, Daimlerstrasse 10, 89160 Dornstadt (Germany); Kheyrandish, H. [CSMA - MATS, Queens Road, Stoke on Trent ST4 7LQ (United Kingdom); Pakfar, A.; Tavernier, C. [STMicroelectronics SA, 850 rue Jean Monnet, 38926 Crolles (France)

    2008-12-05

    The possibility of using solid phase epitaxial regrowth (SPER) for activation of arsenic after amorphizing implantation in silicon is explored in this contribution and compared to spike annealing and published flash-annealing experiments. SPER takes advantage of the high activation level of the dopants after SPER combined with practically no dopant diffusion. We performed implantation and annealing experiments for three combinations of implantation energy and dose, and compared the results of SPER and spike annealing. The thermal stability of the dopant distribution was studied by subsequent post-annealing treatment for temperatures between 750 deg. C and 900 deg. C. The results of these experiments were included in the calibration of a diffusion and activation model for arsenic with high predictive capabilities. Additional simulations over a wide range of implantation energies were done to compare the efficiency of SPER, spike and flash annealing. The specific contributions to deactivation via different processes like clustering, precipitation, and segregation are discussed and annealing strategies to minimize the deactivation are proposed. Spike annealing seems to be the best solution for junctions of 25 nm or deeper, while for shallower junctions other processes combining preamorphization, multiple implantation steps, SPER, and/or flash annealing are needed.

  19. Simulated Annealing Study on Structures and Energetics of CO2 in Argon Clusters

    Institute of Scientific and Technical Information of China (English)

    Le-cheng Wang; Dai-qian Xie

    2011-01-01

    The minimum-energy configurations and energetic properties of the ArN-CO2 (N=1-19) van der Waals clusters were investigated by a simulated annealing algorithm. A newly developed Ar-CO2 potential energy surface, together with the Aziz Ar-Ar interaction potential, was employed to construct the high-dimensional potential functions through a pairwise additive approximation. The global minimum conformations were obtained by sampling the glassy phase space with a carefully formulated annealing schedule. Unlike the lighter RgN-CO2 clusters, the size-dependent structural and energetic characteristics of ArN-CO2 exhibit a different behavior. Dramatic variations with the number of solvent atoms were found for small clusters. After the completion of the first solvation shell at N=17, the clusters evolved more smoothly.

  20. Image-Based Airborne Sensors: A Combined Approach for Spectral Signatures Classification through Deterministic Simulated Annealing

    Science.gov (United States)

    Guijarro, María; Pajares, Gonzalo; Herrera, P. Javier

    2009-01-01

    The increasing capability of high-resolution airborne imaging sensors, including those on board Unmanned Aerial Vehicles, demands automatic solutions for processing, either on-line or off-line, the huge amounts of image data sensed during flights. The classification of natural spectral signatures in images is one potential application. The current tendency in classification is oriented towards the combination of simple classifiers. In this paper we propose a combined strategy based on the Deterministic Simulated Annealing (DSA) framework. The simple classifiers used are the well-tested supervised parametric Bayesian estimator and Fuzzy Clustering. DSA is an optimization approach which minimizes an energy function. The main contribution of DSA is its ability to avoid local minima during the optimization process thanks to the annealing scheme. It outperforms the simple classifiers used in the combination and some combined strategies, including a scheme based on fuzzy cognitive maps and an optimization approach based on the Hopfield neural network paradigm. PMID:22399989

  1. Computer simulations of randomly branching polymers: annealed versus quenched branching structures

    Science.gov (United States)

    Rosa, Angelo; Everaers, Ralf

    2016-08-01

    We present computer simulations of three systems of randomly branching polymers in d = 3 dimensions: ideal trees and self-avoiding trees with annealed and quenched connectivities. In all cases, we performed a detailed analysis of tree connectivities, spatial conformations and statistical properties of linear paths on trees, and compared the results to the corresponding predictions of Flory theory. We confirm that, overall, the theory predicts correctly that trees with quenched ideal connectivity exhibit less overall swelling in good solvent than corresponding trees with annealed connectivity, even though they are more strongly stretched on the path level. At the same time, we emphasize the inadequacy of the Flory theory in predicting the behaviour of other, and equally relevant, observables like contact probabilities between tree nodes. We show, then, that contact probabilities can be aptly characterized by introducing a novel critical exponent, θ_path, which accounts for how they decay as a function of the node-to-node path distance on the tree.

  2. Displacement cascades and defect annealing in tungsten, Part II: Object kinetic Monte Carlo simulation of tungsten cascade aging

    Energy Technology Data Exchange (ETDEWEB)

    Nandipati, Giridhar, E-mail: giridhar.nandipati@pnnl.gov [Pacific Northwest National Laboratory, Richland, WA (United States); Setyawan, Wahyu; Heinisch, Howard L. [Pacific Northwest National Laboratory, Richland, WA (United States); Roche, Kenneth J. [Pacific Northwest National Laboratory, Richland, WA (United States); Department of Physics, University of Washington, Seattle, WA 98195 (United States); Kurtz, Richard J. [Pacific Northwest National Laboratory, Richland, WA (United States); Wirth, Brian D. [University of Tennessee, Knoxville, TN (United States)

    2015-07-15

    The results of object kinetic Monte Carlo (OKMC) simulations of the annealing of primary cascade damage in bulk tungsten using a comprehensive database of cascades obtained from molecular dynamics (Setyawan et al.) are described as a function of primary knock-on atom (PKA) energy at temperatures of 300, 1025 and 2050 K. An increase in SIA clustering coupled with a decrease in vacancy clustering with increasing temperature, in addition to the disparate mobilities of SIAs versus vacancies, causes an interesting effect of temperature on cascade annealing. The annealing efficiency (the ratio of the number of defects after and before annealing) exhibits an inverse U-shape curve as a function of temperature. The capabilities of the newly developed OKMC code KSOME (kinetic simulations of microstructure evolution) used to carry out these simulations are described.

  3. Displacement cascades and defect annealing in tungsten, Part II: Object kinetic Monte Carlo Simulation of Tungsten Cascade Aging

    Energy Technology Data Exchange (ETDEWEB)

    Nandipati, Giridhar; Setyawan, Wahyu; Heinisch, Howard L.; Roche, Kenneth J.; Kurtz, Richard J.; Wirth, Brian D.

    2015-07-01

    The results of object kinetic Monte Carlo (OKMC) simulations of the annealing of primary cascade damage in bulk tungsten using a comprehensive database of cascades obtained from molecular dynamics (Setyawan et al.) are described as a function of primary knock-on atom (PKA) energy at temperatures of 300, 1025 and 2050 K. An increase in SIA clustering coupled with a decrease in vacancy clustering with increasing temperature, in addition to the disparate mobilities of SIAs versus vacancies, causes an interesting effect of temperature on cascade annealing. The annealing efficiency (the ratio of the number of defects after and before annealing) exhibits an inverse U-shape curve as a function of temperature. The capabilities of the newly developed OKMC code KSOME (kinetic simulations of microstructure evolution) used to carry out these simulations are described.

  4. Annealing effect on thermodynamic and physical properties of mesoporous silicon: A simulation and nitrogen sorption study

    Science.gov (United States)

    Kumar, Pushpendra; Huber, Patrick

    2016-04-01

    The discovery of porous silicon formation in silicon substrates in 1956, while electro-polishing crystalline Si in hydrofluoric acid (HF), has triggered large-scale investigations of porous silicon formation and of the changes in its physical and chemical properties with thermal and chemical treatment. A nitrogen sorption study is used to investigate the effect of thermal annealing on electrochemically etched mesoporous silicon (PS). The PS was thermally annealed from 200 °C to 800 °C for 1 h in the presence of air. It was shown that the pore diameter and porosity of PS vary with annealing temperature. The experimentally obtained adsorption/desorption isotherms show hysteresis typical of capillary condensation in porous materials. A simulation study based on the Saam and Cole model was performed and compared with the experimentally observed sorption isotherms to study the physics behind hysteresis formation. We discuss the shape of the hysteresis loops in the framework of the morphology of the layers. The different adsorption and desorption behavior of nitrogen in PS as a function of pore diameter is discussed in terms of the formation of concave menisci inside the pore space, which was shown to be related to the pressure induced when varying the pore diameter from 7.2 nm to 3.4 nm.

  5. Simulated annealing algorithm for solving chambering student-case assignment problem

    Science.gov (United States)

    Ghazali, Saadiah; Abdul-Rahman, Syariza

    2015-12-01

    The project assignment problem is a popular practical problem that arises frequently nowadays. The challenge of solving it grows as the complexity of preferences, the existence of real-world constraints, and the problem size increase. This study focuses on solving a chambering student-case assignment problem, which is classified as a project assignment problem, by using a simulated annealing algorithm. The project assignment problem is considered a hard combinatorial optimization problem, and solving it with a metaheuristic approach is advantageous because it can return a good solution in a reasonable time. The problem of assigning chambering students to cases has never been addressed in the literature before. In the proposed problem, it is essential for law graduates to complete a period in chambers before they are qualified to become legal counsel. Thus, assigning chambering students to cases is critically needed, especially when many preferences are involved. Hence, this study presents a preliminary study of the proposed project assignment problem. The objective of the study is to minimize the total completion time for all students in solving the given cases. This study employs a minimum-cost greedy heuristic to construct a feasible initial solution. The search then proceeds with a simulated annealing algorithm for further improvement of solution quality. Analysis of the obtained results shows that the proposed simulated annealing algorithm greatly improves the solution constructed by the minimum-cost greedy heuristic. Hence, this research demonstrates the advantages of solving project assignment problems using metaheuristic techniques.
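    The two-stage construct-then-improve idea in this record can be sketched as follows. The handling-time table time[s][c], the single-case reassignment move, and the cooling parameters are illustrative assumptions; the paper's actual objective, constraints and preference handling are richer than this minimal version.

```python
import math
import random

def greedy_initial(cases, students, time):
    """Minimum-cost greedy construction: give each case to the student who
    would finish it earliest given their current load (time[s][c] = handling time)."""
    load = {s: 0.0 for s in students}
    assign = {}
    for c in cases:
        best = min(students, key=lambda s: load[s] + time[s][c])
        assign[c] = best
        load[best] += time[s][c] if False else time[best][c]  # time for the chosen student
    return assign

def anneal(assign, students, time, t0=10.0, alpha=0.98, iters=5000):
    """Improve the greedy assignment by reassigning single cases, accepting worse
    totals with the Metropolis rule; parameters are illustrative."""
    def total(a):
        return sum(time[s][c] for c, s in a.items())
    cur, cur_cost, t = dict(assign), total(assign), t0
    for _ in range(iters):
        cand = dict(cur)
        cand[random.choice(list(cand))] = random.choice(students)
        delta = total(cand) - cur_cost
        if delta <= 0 or random.random() < math.exp(-delta / t):
            cur, cur_cost = cand, cur_cost + delta
        t *= alpha
    return cur, cur_cost
```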

  6. Design of phase plates for shaping partially coherent beams by simulated annealing

    Institute of Scientific and Technical Information of China (English)

    Li Jian-Long; Lü Bai-Da

    2008-01-01

    Taking the Gaussian Schell-model beam as a typical example of partially coherent beams, this paper applies the simulated annealing (SA) algorithm to the design of phase plates for shaping partially coherent beams. A flow diagram is presented to illustrate the procedure of phase optimization by the SA algorithm. Numerical examples demonstrate the advantages of the SA algorithm in shaping partially coherent beams. A uniform flat-topped beam profile with maximum reconstruction error RE < 1.74% is achieved. A further extension of the approach is discussed.

  7. Comparing of the Deterministic Simulated Annealing Methods for Quadratic Assignment Problem

    Directory of Open Access Journals (Sweden)

    Mehmet Güray ÜNSAL

    2013-08-01

    Full Text Available In this study, the Threshold Accepting and Record-to-Record Travel methods, deterministic variants of the Simulated Annealing meta-heuristic, are applied to the Quadratic Assignment Problem and analyzed statistically to determine whether they differ significantly in their objective function values and CPU times. No significant differences are found between the two algorithms in terms of CPU time or objective function values. Consequently, on the basis of the Quadratic Assignment Problem, the two algorithms compared in this study show the same performance with respect to CPU time and objective function values.
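    The two acceptance rules compared in this record differ from classical simulated annealing only in how a worse candidate is accepted; neither uses a random acceptance probability. A minimal sketch for a minimization problem (the schedule for lowering the threshold or choosing the deviation is left to the caller):

```python
def threshold_accepting_accept(delta, threshold):
    """Threshold Accepting: accept the candidate if it is at most 'threshold'
    worse than the current solution (the threshold is lowered over time)."""
    return delta <= threshold

def record_to_record_accept(candidate_cost, record_cost, deviation):
    """Record-to-Record Travel: accept the candidate if its cost stays within a
    fixed deviation of the best ('record') cost found so far."""
    return candidate_cost <= record_cost + deviation
```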

  8. Reconstruction of bremsstrahlung spectra from attenuation data using generalized simulated annealing.

    Science.gov (United States)

    Menin, O H; Martinez, A S; Costa, A M

    2016-05-01

    A generalized simulated annealing algorithm, combined with a suitable smoothing regularization function, is used to solve the inverse problem of X-ray spectrum reconstruction from attenuation data. The approach is to set the initial acceptance and visiting temperatures and to standardize the terms of the objective function so as to automate the algorithm and accommodate different spectral ranges. Experiments with both numerical and measured attenuation data are presented. The results show that the algorithm reconstructs spectral shapes accurately. It should be noted that the regularization function was formulated to guarantee a smooth spectrum; thus, the presented technique does not apply to X-ray spectra where characteristic radiation is present. PMID:26943902
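    The objective minimized in such a reconstruction typically combines a data misfit with the smoothing regularizer mentioned above. The sketch below assumes a linear attenuation model A·x ≈ m and a second-difference penalty with weight lam; both are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def objective(spectrum, attenuation_matrix, measured, lam=1e-2):
    """Least-squares misfit between modelled and measured attenuation plus a
    second-difference smoothing penalty on the reconstructed spectrum."""
    misfit = np.sum((attenuation_matrix @ spectrum - measured) ** 2)
    smoothness = np.sum(np.diff(spectrum, n=2) ** 2)
    return misfit + lam * smoothness
```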

  9. Optimization of blade arrangement in a randomly mistuned cascade using simulated annealing

    Science.gov (United States)

    Thompson, Edward A.; Becus, Georges A.

    1993-01-01

    This paper presents preliminary results of an investigation on mistuning of bladed-disk assemblies aimed at capturing the benefits of mistuning on stability, while at the same time, minimizing the adverse effects on response by solving the following problem: given a set of N turbine blades, each being a small random perturbation of the same nominal blade, determine the best arrangement of the N blades in a mistuned cascade with regard to aeroelastic response. In the studies reported here, mistuning of the blades is restricted to small differences in torsional stiffness. The large combinatorial optimization problem of seeking the best arrangement by blade exchanges is solved using a simulated annealing algorithm.

  10. Discrete channel modelling based on genetic algorithm and simulated annealing for training hidden Markov model

    Institute of Scientific and Technical Information of China (English)

    Zhao Zhi-Jin; Zheng Shi-Lian; Xu Chun-Yun; Kong Xian-Zheng

    2007-01-01

    Hidden Markov models (HMMs) have been used to model burst error sources of wireless channels. This paper proposes a hybrid method using a genetic algorithm (GA) and simulated annealing (SA) to train HMMs for discrete channel modelling. The proposed method is compared with pure GA, and experimental results show that the HMMs trained by the hybrid method can better describe the error sequences, due to SA's ability to facilitate hill-climbing at the later stages of the search. The burst error statistics of the HMMs trained by the proposed method and the corresponding error sequences are also presented to validate the proposed method.

  11. Multi-objective optimization in multicast network transmissions using Simulated Annealing

    OpenAIRE

    Yezid Donoso; Kadel Lacatt; Alfonso Jiménez

    2005-01-01

    This article presents a multi-objective optimization method for solving the load-balancing problem in multicast transmission networks, based on the application of the Simulated Annealing meta-heuristic. The method minimizes four basic parameters in order to guarantee quality of service in multicast transmissions: origin-destination delay, maximum link utilization, consumed bandwidth, and number of hops. The returned results...

  12. Extraction of Web Usage Profiles using Simulated Annealing Based Biclustering Approach

    OpenAIRE

    Rathipriya, R.; Thangavel, K.

    2014-01-01

    In this paper, a Simulated Annealing (SA) based biclustering approach is proposed in which SA is used as an optimization tool for biclustering of web usage data to identify the optimal user profile from the given web usage data. Extracted biclusters consist of correlated users whose usage behaviors are similar across a subset of the web pages of a site, whereas these users are uncorrelated for the remaining pages. These results are very useful in web personalization so that...

  13. Extraction of cluster parameters from Sunyaev-Zeldovich effect observations with simulated annealing optimization

    CERN Document Server

    Hansen, S H

    2004-01-01

    We present a user-friendly tool for the analysis of data from Sunyaev-Zeldovich effect observations. The tool is based on the stochastic method of simulated annealing, and allows the extraction of the central values and error bars of the three SZ parameters: the Comptonization parameter y, the peculiar velocity v_p, and the electron temperature T_e. The f77 code SASZ will allow any number of observing frequencies and spectral band shapes. As an example we consider the SZ parameters for the Coma cluster.

  14. Neighbourhood generation mechanism applied in simulated annealing to job shop scheduling problems

    Science.gov (United States)

    Cruz-Chávez, Marco Antonio

    2015-11-01

    This paper presents a neighbourhood generation mechanism for the job shop scheduling problems (JSSPs). In order to obtain a feasible neighbour with the generation mechanism, it is only necessary to generate a permutation of an adjacent pair of operations in a schedule of the JSSP. If there is no slack time between the adjacent pair of operations that is permuted, then it is proven, through theory and experimentation, that the new neighbour (schedule) generated is feasible. It is demonstrated that the neighbourhood generation mechanism is very efficient and effective within a simulated annealing algorithm.
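
    As a rough illustration of the move described above (not the paper's exact schedule representation or feasibility test), the following Python sketch permutes one adjacent pair of operations on a randomly chosen machine sequence; the list-of-lists encoding of the schedule is an assumption made only for this example.

        import random

        def adjacent_swap_neighbor(machine_sequences):
            """Return a copy of the schedule with one adjacent pair of operations
            permuted on a randomly chosen machine (the SA neighbour move)."""
            new = [seq[:] for seq in machine_sequences]                   # copy each machine's operation list
            candidates = [m for m, seq in enumerate(new) if len(seq) > 1]
            m = random.choice(candidates)                                 # machine with at least two operations
            i = random.randrange(len(new[m]) - 1)                         # adjacent pair (i, i + 1)
            new[m][i], new[m][i + 1] = new[m][i + 1], new[m][i]
            return new

        # Example: three machines, operations identified by (job, step) tuples
        schedule = [[(0, 0), (1, 0)], [(1, 1), (0, 1), (2, 0)], [(2, 1)]]
        print(adjacent_swap_neighbor(schedule))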

  15. Simulated annealing applied to two-dimensional low-beta reduced magnetohydrodynamics

    International Nuclear Information System (INIS)

    The simulated annealing (SA) method is applied to two-dimensional (2D) low-beta reduced magnetohydrodynamics (R-MHD). We have successfully obtained stationary states of the system numerically by the SA method with Casimir invariants preserved. Since the 2D low-beta R-MHD has two fields, the relaxation process becomes complex compared to a single field system such as 2D Euler flow. The obtained stationary state can have fine structure. We have found that the fine structure appears because the relaxation processes are different between kinetic energy and magnetic energy

  16. Molecular Dynamics Simulated Annealing Study of Gramicidin A in Water and the Hydrophobic Environment

    Science.gov (United States)

    Mori, Takaharu; Okamoto, Yuko

    2008-03-01

    Gramicidin A is a hydrophobic 15-residue peptide with alternating D- and L-amino acids, and it forms various conformations depending on its environment. For example, gramicidin A adopts a random coil or helical conformations, such as the 4.4-helix, 6.3-helix, and double-stranded helix, in organic solvents. To investigate the structural and dynamical properties of gramicidin A in water and the hydrophobic environment, we performed molecular dynamics simulated annealing simulations with implicit solvent based on a generalized Born model. From the simulations, it was found that gramicidin A has a strong tendency to form a random-coil structure in water, while in the hydrophobic environment it becomes compact and can fold into right- and left-handed conformations of β-helix structures. We discuss the folding mechanism of the β-helix conformation of gramicidin A.

  17. A Simulated Annealing Algorithm for D-Optimal Design for 2-Way and 3-Way Polynomial Regression with Correlated Observations

    OpenAIRE

    Chang Li; Coster, Daniel C.

    2014-01-01

    Much of the previous work in D-optimal design for regression models with correlated errors focused on polynomial models with a single predictor variable, in large part because of the intractability of an analytic solution. In this paper, we present a modified, improved simulated annealing algorithm, providing practical approaches to specifications of the annealing cooling parameters, thresholds, and search neighborhoods for the perturbation scheme, which finds approximate D-optimal designs fo...

  18. Study of network survivability based on immune clonal simulated annealing algorithm

    Institute of Scientific and Technical Information of China (English)

    段谟意

    2012-01-01

    To address congestion in communication networks caused by node failures, a new network survivability evaluation method (Survivability Algorithm based on Immune Clonal Simulated Annealing, SAICSA) is proposed on the basis of an immune clonal simulated annealing algorithm. The method establishes clonal variation and clonal crossover operation rules and combines them with the simulated annealing acceptance criterion to obtain the optimal solution as the annealing temperature tends to zero. Simulation experiments with actual data are then carried out to study in depth the relationship between network survivability and influencing factors such as the number of failed edges and the initial temperature. The experimental results show that, compared with the immune programming simulated annealing algorithm (SAIP) and the genetic simulated annealing algorithm (GSA), the SAICSA algorithm exhibits better adaptability.

  19. Reconstruction of the vertical electron density profile based on vertical TEC using the simulated annealing algorithm

    Science.gov (United States)

    Jiang, Chunhua; Yang, Guobin; Zhu, Peng; Nishioka, Michi; Yokoyama, Tatsuhiro; Zhou, Chen; Song, Huan; Lan, Ting; Zhao, Zhengyu; Zhang, Yuannong

    2016-05-01

    This paper presents a new method to reconstruct the vertical electron density profile based on vertical Total Electron Content (TEC) using the simulated annealing algorithm. The present technique used Quasi-parabolic segments (QPS) to model the bottomside ionosphere. The initial parameters of the ionosphere model were determined from both the International Reference Ionosphere (IRI) (Bilitza et al., 2014) and the vertical TEC (vTEC). Then, the simulated annealing algorithm was used to search for the best-fit parameters of the ionosphere model by comparison with the GPS-TEC. The performance and robustness of this technique were verified using ionosonde data. The critical frequency (foF2) and peak height (hmF2) of the F2 layer obtained from ionograms recorded at different locations and on different days were compared with those calculated by the proposed method. The analysis of the results shows that the present method is promising for obtaining foF2 from vTEC. However, the accuracy of hmF2 needs to be improved in future work.

  20. Temporary Workforce Planning with Firm Contracts: A Model and a Simulated Annealing Heuristic

    Directory of Open Access Journals (Sweden)

    Muhammad Al-Salamah

    2011-01-01

    Full Text Available The aim of this paper is to introduce a model for temporary staffing when temporary employment is managed by firm contracts and to propose a simulated annealing-based method to solve the model. Temporary employment is a policy frequently used to adjust the working hour capacity to fluctuating demand. Temporary workforce planning models have been unnecessarily simplified to account for only periodic hiring and laying off; a company can review its workforce requirement every period and make hire-fire decisions accordingly, usually with a layoff cost. We present a more realistic temporary workforce planning model that assumes a firm contract between the worker and the company, which can extend to several periods. The model assumes the traditional constraints, such as inventory balance constraints, worker availability, and labor hour mix. The costs are the inventory holding cost, training cost of the temporary workers, and the backorder cost. The mixed integer model developed for this case has been found to be difficult to solve even for small problem sizes; therefore, a simulated annealing algorithm is proposed to solve the mixed integer model. The performance of the SA algorithm is compared with the CPLEX solution.

  1. Redesigning rain gauges network in Johor using geostatistics and simulated annealing

    Science.gov (United States)

    Aziz, Mohd Khairul Bazli Mohd; Yusof, Fadhilah; Daud, Zalina Mohd; Yusop, Zulkifli; Kasno, Mohammad Afif

    2015-02-01

    Recently, many rainfall network design techniques have been developed, discussed and compared by many researchers. Present-day hydrological studies require higher levels of accuracy from collected data. In numerous basins, the rain gauge stations are located without clear scientific understanding. In this study, an attempt is made to redesign the rain gauge network for Johor, Malaysia, in order to meet the required level of accuracy preset by rainfall data users. The existing network of 84 rain gauges in Johor is optimized and redesigned into new locations using rainfall, humidity, solar radiation, temperature and wind speed data collected during the monsoon seasons (November - February) from 1975 to 2008. This study used a combination of a geostatistical method (the variance-reduction method) and simulated annealing as the optimization algorithm during the redesign process. The results show that the new rain gauge locations provide the minimum value of the estimated variance. This shows that the combination of the geostatistical variance-reduction method and simulated annealing is successful in developing the new optimum rain gauge system.

  2. Redesigning rain gauges network in Johor using geostatistics and simulated annealing

    Energy Technology Data Exchange (ETDEWEB)

    Aziz, Mohd Khairul Bazli Mohd, E-mail: mkbazli@yahoo.com [Centre of Preparatory and General Studies, TATI University College, 24000 Kemaman, Terengganu, Malaysia and Department of Mathematical Sciences, Faculty of Science, Universiti Teknologi Malaysia, 81310 UTM Johor Bahru, Johor (Malaysia); Yusof, Fadhilah, E-mail: fadhilahy@utm.my [Department of Mathematical Sciences, Faculty of Science, Universiti Teknologi Malaysia, 81310 UTM Johor Bahru, Johor (Malaysia); Daud, Zalina Mohd, E-mail: zalina@ic.utm.my [UTM Razak School of Engineering and Advanced Technology, Universiti Teknologi Malaysia, UTM KL, 54100 Kuala Lumpur (Malaysia); Yusop, Zulkifli, E-mail: zulyusop@utm.my [Institute of Environmental and Water Resource Management (IPASA), Faculty of Civil Engineering, Universiti Teknologi Malaysia, 81310 UTM Johor Bahru, Johor (Malaysia); Kasno, Mohammad Afif, E-mail: mafifkasno@gmail.com [Malaysia - Japan International Institute of Technology (MJIIT), Universiti Teknologi Malaysia, UTM KL, 54100 Kuala Lumpur (Malaysia)

    2015-02-03

    Recently, many rainfall network design techniques have been developed, discussed and compared by many researchers. Present-day hydrological studies require higher levels of accuracy from collected data. In numerous basins, the rain gauge stations are located without clear scientific understanding. In this study, an attempt is made to redesign the rain gauge network for Johor, Malaysia, in order to meet the required level of accuracy preset by rainfall data users. The existing network of 84 rain gauges in Johor is optimized and redesigned into new locations using rainfall, humidity, solar radiation, temperature and wind speed data collected during the monsoon seasons (November - February) from 1975 to 2008. This study used a combination of a geostatistical method (the variance-reduction method) and simulated annealing as the optimization algorithm during the redesign process. The results show that the new rain gauge locations provide the minimum value of the estimated variance. This shows that the combination of the geostatistical variance-reduction method and simulated annealing is successful in developing the new optimum rain gauge system.

  3. Design and optimization of solid rocket motor Finocyl grain using simulated annealing

    Institute of Scientific and Technical Information of China (English)

    Ali Kamran; LIANG Guo-zhu

    2011-01-01

    The research effort outlined the application of a computer aided design (CAD)-centric technique to the design and optimization of the solid rocket motor Finocyl (fin in cylinder) grain using simulated annealing. The method for constructing the grain configuration model, evaluating ballistic performance and integrating the optimizer for analysis was presented. Finocyl is a complex grain configuration, requiring thirteen variables to define the geometry. The large number of variables not only complicates the geometrical construction but also the optimization process. The CAD representation encapsulates all of the geometric entities pertinent to the grain design in a parametric way, allowing manipulation of the grain entity (web), performing regression and automating geometrical data calculations. Robustness in avoiding local minima and an efficient capacity to explore the design space make simulated annealing an attractive choice as the optimizer. The approach is demonstrated with a constrained optimization of the Finocyl grain geometry, for a homogeneous, isotropic propellant, uniform regression, and a quasi-steady, bulk-mode internal ballistics model, that maximizes average thrust for required deviations from neutrality.

  4. Comparison of particle swarm optimization and simulated annealing for locating additional boreholes considering combined variance minimization

    Science.gov (United States)

    Soltani-Mohammadi, Saeed; Safa, Mohammad; Mokhtari, Hadi

    2016-10-01

    One of the most important stages in complementary exploration is optimally designing the additional drilling pattern, that is, defining the optimum number and locations of additional boreholes. Quite a lot of research has been carried out in this regard, in which, for most of the proposed algorithms, kriging variance minimization is defined as the objective function serving as a criterion for uncertainty assessment, and the problem can be solved through optimization methods. Although the kriging variance is known to have many advantages in defining the objective function, it is not sensitive to local variability. As a result, the only factors evaluated for locating the additional boreholes are the initial data configuration and the variogram model parameters, and the effects of local variability are omitted. In this paper, with the goal of considering local variability in the uncertainty assessment of boundaries, the application of combined variance is investigated to define the objective function. Thus, in order to verify the applicability of the proposed objective function, it is used to locate additional boreholes in the Esfordi phosphate mine through the implementation of metaheuristic optimization methods such as simulated annealing and particle swarm optimization. Comparison of results from the proposed objective function and conventional methods indicates that the changes imposed on the objective function have caused the algorithm output to be sensitive to variations in grade, domain boundaries and the thickness of the mineralization domain. The comparison between the results of the different optimization algorithms shows that, for the presented case, the application of particle swarm optimization is more appropriate than simulated annealing.

  5. An Evaluation of the Use of Simulated Annealing to Optimize Thinning Rates for Single Even-Aged Stands

    Directory of Open Access Journals (Sweden)

    Kai Moriguchi

    2015-01-01

    Full Text Available We evaluated the potential of simulated annealing as a reliable method for optimizing thinning rates for single even-aged stands. Four types of yield models were used as benchmark models to examine the algorithm’s versatility. Thinning rate, which was constrained to 0–50% every 5 years at stand ages of 10–45 years, was optimized to maximize the net present value for one fixed rotation term (50 years. The best parameters for the simulated annealing were chosen from 113 patterns, using the mean of the net present value from 39 runs to ensure the best performance. We compared the solutions with those from coarse full enumeration to evaluate the method’s reliability and with 39 runs of random search to evaluate its efficiency. In contrast to random search, the best run of simulated annealing for each of the four yield models resulted in a better solution than coarse full enumeration. However, variations in the objective function for two yield models obtained with simulated annealing were significantly larger than those of random search. In conclusion, simulated annealing with optimized parameters is more efficient for optimizing thinning rates than random search. However, it is necessary to execute multiple runs to obtain reliable solutions.

  6. Intelligent simulated annealing algorithm applied to the optimization of the main magnet for magnetic resonance imaging machine; Algoritmo simulated annealing inteligente aplicado a la optimizacion del iman principal de una maquina de resonancia magnetica de imagenes

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez Lopez, Hector [Universidad de Oriente, Santiago de Cuba (Cuba). Centro de Biofisica Medica]. E-mail: hsanchez@cbm.uo.edu.cu

    2001-08-01

    This work describes an alternative Simulated Annealing algorithm applied to the design of the main magnet for a Magnetic Resonance Imaging machine. The algorithm uses a probabilistic radial basis neural network to classify the possible solutions before the objective function evaluation. This procedure reduces by up to 50% the number of iterations required by simulated annealing to achieve the global maximum, when compared with the standard SA algorithm. The algorithm was applied to design a 0.1050 Tesla four-coil resistive magnet, which produces a magnetic field 2.13 times more uniform than the solution given by SA. (author)
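
    The key idea, screening candidate solutions with a cheap classifier before running the expensive objective evaluation, can be sketched generically as below (Python). This is an illustration only: looks_promising stands in for the probabilistic radial-basis network described above, the objective stands in for the magnetic field computation, and the cooling parameters are arbitrary.

        import math
        import random

        def surrogate_screened_sa(x0, neighbor, objective, looks_promising,
                                  t0=1.0, alpha=0.98, iters=5000):
            """Simulated annealing in which a cheap classifier vetoes unpromising
            candidates before the expensive objective function is evaluated."""
            x, fx = x0, objective(x0)
            best, fbest = x, fx
            t = t0
            for _ in range(iters):
                y = neighbor(x)
                if not looks_promising(y):     # surrogate veto: skip the expensive evaluation
                    t *= alpha
                    continue
                fy = objective(y)
                if fy < fx or random.random() < math.exp(-(fy - fx) / t):
                    x, fx = y, fy
                    if fx < fbest:
                        best, fbest = x, fx
                t *= alpha
            return best, fbest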

  7. LONSA as a tool for loading pattern optimization for VVER-1000 using synergy of a neural network and simulated annealing

    International Nuclear Information System (INIS)

    This paper presents a new method for loading pattern optimization in a VVER-1000 reactor core. Because of the immensity of the search space in fuel management optimization problems, finding the optimum solution requires a huge amount of calculation in classical methods, while neural network models, with their massively parallel structures, combined with the simulated annealing method, are powerful enough to find the best solution in a reasonable time. The Hopfield neural network operates as a local-minimum search algorithm, and simulated annealing is used to improve the result obtained from the neural network. Because of its stochastic nature, simulated annealing allows the result of the Hopfield neural network to escape from a local minimum and guides it toward the global minimum. In this study, minimization of the radial power peaking factor inside the reactor core of the Bushehr NPP is considered as the objective. The result is the optimum configuration, which is in agreement with the pattern proposed by the designer.

  8. A simulated annealing approach to schedule optimization for the SES facility

    Science.gov (United States)

    Mcmahon, Mary Beth; Dean, Jack

    1992-01-01

    The Shuttle Engineering Simulator (SES) is a facility which houses the software and hardware for a variety of simulation systems. The simulators include the Autonomous Remote Manipulator, the Manned Maneuvering Unit, Orbiter/Space Station docking, and shuttle entry and landing. The SES simulators are used by various groups throughout NASA. For example, astronauts use the SES to practice maneuvers with the shuttle equipment; programmers use the SES to test flight software; and engineers use the SES for design and analysis studies. Due to its high demand, the SES is busy twenty-four hours a day and seven days a week. Scheduling the facility is a problem that is constantly growing and changing with the addition of new equipment. Currently a number of small independent programs have been developed to help solve the problem, but the long-term answer lies in finding a flexible, integrated system that provides the user with the ability to create, optimize, and edit the schedule. COMPASS is an interactive and highly flexible scheduling system. However, until recently COMPASS did not provide any optimization features. This paper describes the simulated annealing extension to COMPASS. It now allows the user to interweave schedule creation, revision, and optimization. This practical approach was necessary in order to satisfy the operational requirements of the SES.

  9. Pharmacokinetic modeling of dynamic MR images using a simulated annealing-based optimization

    Science.gov (United States)

    Sawant, Amit R.; Reece, John H.; Reddick, Wilburn E.

    2000-04-01

    The aim of this work was to use dynamic contrast enhanced MR image (DEMRI) data to generate 'parameter images' which provide functional information about contrast agent access, in bone sarcoma. A simulated annealing based technique was applied to optimize the parameters of a pharmacokinetic model used to describe the kinetics of the tissue response during and after intravenous infusion of a paramagnetic contrast medium, Gd-DTPA. Optimization was performed on a pixel by pixel basis so as to minimize the sum of square deviations of the calculated values from the values obtained experimentally during dynamic contrast enhanced MR imaging. A cost function based on a priori information was introduced during the annealing procedure to ensure that the values obtained were within the expected ranges. The optimized parameters were used in the model to generate parameter images, which reveal functional information that is normally not visible in conventional Gd-DTPA enhanced MR images. This functional information, during and upon completion of pre-operative chemotherapy, is useful in predicting the probability of disease free survival.

  10. Retrieval of Surface and Subsurface Moisture of Bare Soil Using Simulated Annealing

    Science.gov (United States)

    Tabatabaeenejad, A.; Moghaddam, M.

    2009-12-01

    Soil moisture is of fundamental importance to many hydrological and biological processes. Soil moisture information is vital to understanding the cycling of water, energy, and carbon in the Earth system. Knowledge of soil moisture is critical to agencies concerned with weather and climate, runoff potential and flood control, soil erosion, reservoir management, water quality, agricultural productivity, drought monitoring, and human health. The need to monitor the soil moisture on a global scale has motivated missions such as Soil Moisture Active and Passive (SMAP) [1]. Rough surface scattering models and remote sensing retrieval algorithms are essential in study of the soil moisture, because soil can be represented as a rough surface structure. Effects of soil moisture on the backscattered field have been studied since the 1960s, but soil moisture estimation remains a challenging problem and there is still a need for more accurate and more efficient inversion algorithms. It has been shown that the simulated annealing method is a powerful tool for inversion of the model parameters of rough surface structures [2]. The sensitivity of this method to measurement noise has also been investigated assuming a two-layer structure characterized by the layers dielectric constants, layer thickness, and statistical properties of the rough interfaces [2]. However, since the moisture profile varies with depth, it is sometimes necessary to model the rough surface as a layered structure with a rough interface on top and a stratified structure below where each layer is assumed to have a constant volumetric moisture content. In this work, we discretize the soil structure into several layers of constant moisture content to examine the effect of subsurface profile on the backscattering coefficient. We will show that while the moisture profile could vary in deeper layers, these layers do not affect the scattered electromagnetic field significantly. Therefore, we can use just a few layers

  11. Adaptive wavelet simulation of global ocean dynamics

    Directory of Open Access Journals (Sweden)

    N. K.-R. Kevlahan

    2015-07-01

    Full Text Available In order to easily enforce solid-wall boundary conditions in the presence of complex coastlines, we propose a new mass and energy conserving Brinkman penalization for the rotating shallow water equations. This penalization does not lead to higher wave speeds in the solid region. The error estimates for the penalization are derived analytically and verified numerically for linearized one dimensional equations. The penalization is implemented in a conservative dynamically adaptive wavelet method for the rotating shallow water equations on the sphere with bathymetry and coastline data from NOAA's ETOPO1 database. This code could form the dynamical core for a future global ocean model. The potential of the dynamically adaptive ocean model is illustrated by using it to simulate the 2004 Indonesian tsunami and wind-driven gyres.

  12. FPGA PLACEMENT OPTIMIZATION BY TWO-STEP UNIFIED GENETIC ALGORITHM AND SIMULATED ANNEALING ALGORITHM

    Institute of Scientific and Technical Information of China (English)

    Yang Meng; A.E.A.Almaini; Wang Pengjun

    2006-01-01

    Genetic Algorithm (GA) is a biologically inspired technique widely used to solve numerous combinatorial optimization problems. It works on a population of individuals, not just one single solution, and as a result it avoids converging to a local optimum. However, it consumes too much CPU time in the later stages of the search. Simulated Annealing (SA), on the other hand, converges faster than GA in the later stages but is easily trapped in a local optimum. In this letter, a method that unifies GA and SA is introduced, which utilizes the global search ability of GA and the fast convergence of SA. The experimental results show that the proposed algorithm outperforms GA in terms of CPU time without degradation of performance. It also achieves placement cost highly comparable to the state-of-the-art results obtained by the Versatile Place and Route (VPR) tool.
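
    A minimal sketch of the two-step idea, running a GA for global exploration and handing its best individual to SA for late-stage refinement, is shown below on a toy continuous function; it is not the FPGA placement cost model or the VPR flow, and all parameter values are illustrative.

        import math
        import random

        def two_step_ga_sa(cost, dim, pop=30, ga_gens=50, sa_iters=2000):
            """Stage 1: GA with truncation selection, uniform crossover and Gaussian
            mutation.  Stage 2: SA refinement of the best GA individual."""
            popn = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop)]
            for _ in range(ga_gens):
                parents = sorted(popn, key=cost)[: pop // 2]
                children = []
                while len(children) < pop - len(parents):
                    a, b = random.sample(parents, 2)
                    child = [random.choice(pair) for pair in zip(a, b)]          # uniform crossover
                    children.append([g + random.gauss(0, 0.1) for g in child])   # mutation
                popn = parents + children
            x = min(popn, key=cost)
            fx, t = cost(x), 1.0
            for _ in range(sa_iters):                                             # SA refinement stage
                y = [g + random.gauss(0, 0.05) for g in x]
                fy = cost(y)
                if fy < fx or random.random() < math.exp(-(fy - fx) / t):
                    x, fx = y, fy
                t *= 0.995
            return x, fx

        # Example: minimise a simple sphere function in five dimensions
        print(two_step_ga_sa(lambda v: sum(g * g for g in v), dim=5))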

  13. A Simulated Annealing method to solve a generalized maximal covering location problem

    Directory of Open Access Journals (Sweden)

    M. Saeed Jabalameli

    2011-04-01

    Full Text Available The maximal covering location problem (MCLP) seeks to locate a predefined number of facilities in order to maximize the number of covered demand points. In a classical sense, MCLP has three main implicit assumptions: all-or-nothing coverage, individual coverage, and a fixed coverage radius. By relaxing these assumptions, three classes of modelling formulations have been derived: the gradual cover models, the cooperative cover models, and the variable radius models. In this paper, we develop a special form of MCLP which combines the characteristics of gradual cover models, cooperative cover models, and variable radius models. The proposed problem has many applications, such as locating cell phone towers. The model is formulated as a mixed integer non-linear program (MINLP). In addition, a simulated annealing algorithm is used to solve the resulting problem, and the performance of the proposed method is evaluated on a set of randomly generated problems.

  14. Fabrication of simulated plate fuel elements: Defining role of stress relief annealing

    Science.gov (United States)

    Kohli, D.; Rakesh, R.; Sinha, V. P.; Prasad, G. J.; Samajdar, I.

    2014-04-01

    This study involved fabrication of simulated plate fuel elements. Uranium silicide of actual fuel elements was replaced with yttria. The fabrication stages were otherwise identical. The final cold rolled and/or straightened plates, without stress relief, showed an inverse relationship between bond strength and out of plane residual shear stress (τ13). Stress relief of τ13 was conducted over a range of temperatures/times (200-500 °C and 15-240 min) and led to corresponding improvements in bond strength. Fastest τ13 relief was obtained through 300 °C annealing. Elimination of microscopic shear bands, through recovery and partial recrystallization, was clearly the most effective mechanism of relieving τ13.

  15. Use of a simulated annealing algorithm to fit compartmental models with an application to fractal pharmacokinetics.

    Science.gov (United States)

    Marsh, Rebeccah E; Riauka, Terence A; McQuarrie, Steve A

    2007-01-01

    Increasingly, fractals are being incorporated into pharmacokinetic models to describe transport and chemical kinetic processes occurring in confined and heterogeneous spaces. However, fractal compartmental models lead to differential equations with power-law time-dependent kinetic rate coefficients that currently are not accommodated by common commercial software programs. This paper describes a parameter optimization method for fitting individual pharmacokinetic curves based on a simulated annealing (SA) algorithm, which always converged towards the global minimum and was independent of the initial parameter values and parameter bounds. In a comparison using a classical compartmental model, similar fits by the Gauss-Newton and Nelder-Mead simplex algorithms required stringent initial estimates and ranges for the model parameters. The SA algorithm is ideal for fitting a wide variety of pharmacokinetic models to clinical data, especially those for which there is weak prior knowledge of the parameter values, such as the fractal models. PMID:17706176
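
    As an illustration of the kind of fit described above, and assuming for the example only a one-compartment model with a power-law (fractal) elimination rate k(t) = k0*t**(-h), which integrates to C(t) = C0*exp(-k0*t**(1-h)/(1-h)) for h < 1, the Python sketch below recovers the parameters from synthetic data by SA on the sum of squared residuals; it is not the authors' fitting code and all numbers are placeholders.

        import math
        import random

        def conc(t, c0, k0, h):
            """Concentration for a power-law rate coefficient k(t) = k0 * t**-h (h < 1)."""
            return c0 * math.exp(-k0 * t ** (1.0 - h) / (1.0 - h))

        def fit_sa(times, data, bounds, t0=1.0, alpha=0.995, iters=20000):
            """Fit (c0, k0, h) by simulated annealing on the sum of squared residuals."""
            def sse(p):
                return sum((conc(t, *p) - d) ** 2 for t, d in zip(times, data))
            x = [random.uniform(lo, hi) for lo, hi in bounds]
            fx, temp = sse(x), t0
            best, fbest = x, fx
            for _ in range(iters):
                y = [min(hi, max(lo, g + random.gauss(0, 0.05 * (hi - lo))))
                     for g, (lo, hi) in zip(x, bounds)]
                fy = sse(y)
                if fy < fx or random.random() < math.exp(-(fy - fx) / temp):
                    x, fx = y, fy
                    if fx < fbest:
                        best, fbest = x, fx
                temp *= alpha
            return best

        # Synthetic data generated from known parameters, then recovered by SA
        true = (10.0, 0.8, 0.3)
        ts = [0.5 * i for i in range(1, 25)]
        obs = [conc(t, *true) for t in ts]
        print(fit_sa(ts, obs, bounds=[(1.0, 20.0), (0.1, 2.0), (0.0, 0.9)]))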

  16. Shape optimization of road tunnel cross-section by simulated annealing

    Directory of Open Access Journals (Sweden)

    Sobótka Maciej

    2016-06-01

    Full Text Available The paper concerns shape optimization of a tunnel excavation cross-section. The study incorporates the simulated annealing (SA) optimization procedure. The form of the cost function derives from the energetic optimality condition formulated in the authors' previous papers. The utilized algorithm takes advantage of the optimization procedure already published by the authors. Unlike other approaches presented in the literature, the one introduced in this paper takes into consideration the practical requirement of preserving a fixed clearance gauge. Itasca Flac software is utilized in the numerical examples. The optimal excavation shapes are determined for five different in situ stress ratios. This factor significantly affects the optimal topology of the excavation. The resulting shapes are elongated in the direction of the greater principal stress. Moreover, the obtained optimal shapes have smooth contours circumscribing the gauge.

  17. Microwave imaging for conducting scatterers by hybrid particle swarm optimization with simulated annealing

    International Nuclear Information System (INIS)

    In this paper, a microwave imaging technique for reconstructing the shape of two-dimensional perfectly conducting scatterers by means of a stochastic optimization approach is investigated. Based on the boundary condition and the measured scattered field derived from transverse magnetic illuminations, a set of nonlinear integral equations is obtained and the imaging problem is reformulated into an optimization problem. A hybrid approximation algorithm, called PSO-SA, is developed in this work to solve the inverse scattering problem. In the hybrid algorithm, particle swarm optimization (PSO) combines global and local search to find optimal solutions in reasonable time, while simulated annealing (SA) uses a probabilistic acceptance criterion to avoid being trapped in a local optimum. The hybrid approach elegantly combines the exploration ability of PSO with the exploitation ability of SA. Reconstruction results are compared with the exact shapes of several conducting cylinders, and good agreement with the original shapes is observed.

  18. Solving the flow fields in conduits and networks using energy minimization principle with simulated annealing

    CERN Document Server

    Sochi, Taha

    2014-01-01

    In this paper, we propose and test an intuitive assumption that the pressure field in single conduits and networks of interconnected conduits adjusts itself to minimize the total energy consumption required for transporting a specific quantity of fluid. We test this assumption by using linear flow models of Newtonian fluids transported through rigid tubes and networks in conjunction with a simulated annealing (SA) protocol to minimize the total energy cost. All the results confirm our hypothesis, as the SA algorithm produces results very close to those obtained from the traditional deterministic methods of identifying the flow fields by solving a set of simultaneous equations based on the conservation principles. The same results apply to ohmic electric conductors and networks of interconnected ohmic conductors. Computational experiments conducted in this regard confirm this extension. Further studies are required to test the energy minimization hypothesis for non-linear flow systems.
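
    For linear flow, minimizing the total dissipation over the free node pressures with the boundary pressures held fixed reproduces the conservation-based solution (the Dirichlet/Thomson principle), which is the essence of the hypothesis. The toy Python sketch below, with an arbitrary four-node network and fixed inlet and outlet pressures rather than a fixed transported quantity, illustrates the idea; it is not the paper's code.

        import math
        import random

        # Toy network: nodes 0..3, edges (i, j, conductance G); flow on an edge is q = G * dP.
        edges = [(0, 1, 2.0), (1, 2, 1.0), (1, 3, 1.0), (2, 3, 2.0)]
        fixed = {0: 1.0, 3: 0.0}          # prescribed boundary pressures (inlet and outlet)
        free_nodes = [1, 2]

        def dissipation(p):
            """Total dissipation, summed over edges, of G * (dP)**2 for pressure field p."""
            return sum(g * (p[i] - p[j]) ** 2 for i, j, g in edges)

        def solve_by_sa(t0=0.1, alpha=0.999, iters=50000):
            p = dict(fixed)
            for n in free_nodes:
                p[n] = random.uniform(0.0, 1.0)
            e, temp = dissipation(p), t0
            for _ in range(iters):
                n = random.choice(free_nodes)
                old = p[n]
                p[n] = old + random.gauss(0, 0.02)        # perturb one internal pressure
                e_new = dissipation(p)
                if e_new < e or random.random() < math.exp(-(e_new - e) / temp):
                    e = e_new                              # accept the move
                else:
                    p[n] = old                             # revert
                temp *= alpha
            return p, e

        # Internal pressures approach the values given by flow conservation at nodes 1 and 2
        print(solve_by_sa())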

  19. Simulated annealing in networks for computing possible arrangements for red and green cones

    Science.gov (United States)

    Ahumada, Albert J., Jr.

    1987-01-01

    Attention is given to network models in which each of the cones of the retina is given a provisional color at random, and then the cones are allowed to determine the colors of their neighbors through an iterative process. A symmetric-structure spin-glass model has allowed arrays to be generated ranging from completely random arrangements of red and green to arrays with approximately as much disorder as the parafoveal cones. Simulated annealing has also been added to the process in an attempt to generate color arrangements with greater regularity, and hence more revealing moiré patterns, than the arrangements yielded by quenched spin-glass processes. Attention is given to the perceptual implications of these results.

  20. High-dose-rate prostate brachytherapy inverse planning on dose-volume criteria by simulated annealing.

    Science.gov (United States)

    Deist, T M; Gorissen, B L

    2016-02-01

    High-dose-rate brachytherapy is a tumor treatment method where a highly radioactive source is brought in close proximity to the tumor. In this paper we develop a simulated annealing algorithm to optimize the dwell times at preselected dwell positions to maximize tumor coverage under dose-volume constraints on the organs at risk. Compared to existing algorithms, our algorithm has advantages in terms of speed and objective value and does not require an expensive general purpose solver. Its success mainly depends on exploiting the efficiency of matrix multiplication and a careful selection of the neighboring states. In this paper we outline its details and make an in-depth comparison with existing methods using real patient data. PMID:26760757
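
    A rough sketch of the general setup, not the authors' algorithm, is given below in Python: precomputed dose-rate matrices, a dwell-time vector perturbed one entry at a time, and an objective that rewards target coverage while penalizing a dose-volume violation in an organ at risk. All matrices, limits, and cooling parameters are random placeholders chosen only to make the example run.

        import numpy as np

        rng = np.random.default_rng(0)
        D_tumor = rng.uniform(0.0, 1.0, size=(200, 16))   # dose rate: tumour voxels x dwell positions
        D_oar = rng.uniform(0.0, 0.5, size=(80, 16))      # dose rate: organ-at-risk voxels x positions
        presc, oar_limit, oar_frac = 10.0, 8.0, 0.10      # prescription, OAR dose limit, allowed fraction

        def objective(t):
            """Tumour coverage minus a penalty when the OAR dose-volume constraint is violated."""
            coverage = np.mean(D_tumor @ t >= presc)                    # exploits fast matrix multiplication
            violation = max(0.0, np.mean(D_oar @ t > oar_limit) - oar_frac)
            return coverage - 100.0 * violation

        t = rng.uniform(0.0, 5.0, size=16)                # dwell times at preselected positions
        score, temp = objective(t), 1.0
        for _ in range(20000):
            cand = t.copy()
            i = rng.integers(16)
            cand[i] = max(0.0, cand[i] + rng.normal(0.0, 0.5))          # perturb one dwell time
            s = objective(cand)
            if s > score or rng.random() < np.exp((s - score) / temp):  # maximisation form of Metropolis
                t, score = cand, s
            temp *= 0.9995
        print(score, t.round(2))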

  1. Simulated annealing approach to vascular structure with application to the coronary arteries.

    Science.gov (United States)

    Keelan, Jonathan; Chung, Emma M L; Hague, James P

    2016-02-01

    Do the complex processes of angiogenesis during organism development ultimately lead to a near optimal coronary vasculature in the organs of adult mammals? We examine this hypothesis using a powerful and universal method, built on physical and physiological principles, for the determination of globally energetically optimal arterial trees. The method is based on simulated annealing, and can be used to examine arteries in hollow organs with arbitrary tissue geometries. We demonstrate that the approach can generate in silico vasculatures which closely match porcine anatomical data for the coronary arteries on all length scales, and that the optimized arterial trees improve systematically as computational time increases. The method presented here is general, and could in principle be used to examine the arteries of other organs. Potential applications include improvement of medical imaging analysis and the design of vascular trees for artificial organs. PMID:26998317

  2. Reconstruction of bremsstrahlung spectra from attenuation data using generalized simulated annealing

    CERN Document Server

    Menin, Olavo Henrique; da Costa, Alessandro Martins

    2014-01-01

    Thorough knowledge of an X-ray beam spectrum is mandatory to assess the quality of its source device. Since the techniques to directly measure such spectra are expensive and laborious, X-ray spectrum reconstruction from attenuation data has been a promising alternative. However, such reconstruction corresponds mathematically to an inverse, nonlinear and ill-posed problem. Therefore, solving it requires powerful optimization algorithms and good regularization functions. Here, we present a generalized simulated annealing algorithm combined with a suitable smoothing regularization function to solve the X-ray spectrum reconstruction inverse problem. We also propose an approach to set the initial acceptance and visitation temperatures and a standardization of the objective function terms to automate the algorithm so that it can address different spectral ranges. Numerical tests considering three different reference spectra with their attenuation curves are presented. Results show that the algori...

  3. Cascade annealing of tungsten implanted with 5 keV noble gas atoms. A computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Kolk, G.J. van der; Veen, A. van; Caspers, L.M. (Interuniversitair Reactor Inst., Delft (Netherlands); Technische Hogeschool Delft (Netherlands)); Hosson, J.T.M. de (Rijksuniversiteit Groningen (Netherlands). Materials Science Centre)

    1984-03-01

    The trapping of vacancies by implanted atoms is calculated. After low energy implantation (5 keV) of tungsten with heavy noble gas atoms most of the implanted atoms are in substitutional position with one or two vacancies closer than two lattice units. Under the influence of the lattice distortion around the implanted atoms the vacancies follow a preferential migration path towards the implant during annealing. With lattice relaxation simulations migration energies close to the implanted atom are calculated. Monte Carlo theory is applied to obtain trapping probabilities as a function of implant-vacancy separation and temperature. An estimate of the initial implant-vacancy separation follows from collision cascade calculations. The results show that nearby vacancies are trapped by the implanted atoms.

  4. Application of simulated annealing algorithm to improve work roll wear model in plate mills

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    Employing the Simulated Annealing Algorithm (SAA) and a large set of measured data, a calculation model of work roll wear was built for the 2 800 mm 4-high mill of Wuhan Iron and Steel (Group) Co. (WISCO). The model is a semi-theoretical practical formula whose form and coefficients could hardly be determined with classical optimization methods, but the problem could be resolved by the SAA. The model predicts the wear profiles of the work roll in a rolling unit with high precision. After one year of application, the results show that the model is feasible in engineering and can be applied to predict the wear profiles of work rolls in other mills.

  5. Extended Information Ratio for Portfolio Optimization Using Simulated Annealing with Constrained Neighborhood

    Science.gov (United States)

    Orito, Yukiko; Yamamoto, Hisashi; Tsujimura, Yasuhiro; Kambayashi, Yasushi

    Portfolio optimization determines the proportion-weighted combination of assets in the portfolio in order to achieve investment targets. This is a multi-dimensional combinatorial optimization problem, and it is difficult for a portfolio constructed in a past period to keep its performance in the future period. In order to maintain good portfolio performance, in this paper we propose the extended information ratio as an objective function, using the information ratio, beta, prime beta, or correlation coefficient. We apply simulated annealing (SA) to optimize the portfolio employing the proposed ratio. For the SA, neighbours are generated by an operation that changes the structure of the weights in the portfolio. In the numerical experiments, we show that our portfolios keep their good performance when the market trend of the future period differs from that of the past period.
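
    The neighbour operation mentioned above can be illustrated, under the assumption of long-only weights that sum to one, by moving a small amount of weight from one asset to another; the Python sketch below is generic and is not the authors' exact move or objective.

        import random

        def weight_neighbor(weights, step=0.05):
            """Move a small amount of weight from one asset to another, keeping all
            weights non-negative and their sum equal to one (the SA neighbour move)."""
            w = weights[:]
            i, j = random.sample(range(len(w)), 2)
            delta = min(step * random.random(), w[i])   # cannot move more than asset i holds
            w[i] -= delta
            w[j] += delta
            return w

        # Example: perturb an equally weighted five-asset portfolio
        print(weight_neighbor([0.2, 0.2, 0.2, 0.2, 0.2]))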

  6. An Archived Multi Objective Simulated Annealing Method to Discover Biclusters in Microarray Data

    Directory of Open Access Journals (Sweden)

    Mohsen Lashkargir

    2011-01-01

    Full Text Available With the advent of microarray technology it has become possible to measure thousands of gene expression values in a single experiment. Analysis of large-scale genomics data, notably gene expression, has initially focused on clustering methods. Recently, biclustering techniques were proposed for revealing submatrices showing unique patterns. Biclustering, or simultaneous clustering of both genes and conditions, is challenging, particularly for the analysis of high-dimensional gene expression data in information retrieval, knowledge discovery, and data mining. In biclustering of microarray data, several objectives have to be optimized simultaneously, and often these objectives conflict with each other. A multi-objective model is therefore very suitable for solving this problem. We propose an algorithm based on multi-objective Simulated Annealing for discovering biclusters in gene expression data. Experimental results on benchmark datasets show a significant improvement in the overlap among biclusters, the coverage of elements of the gene expression matrix, and the quality of the biclusters.

  7. Evolutionary Algorithm based on simulated annealing for the multi-objective optimization of combinatorial problems

    Directory of Open Access Journals (Sweden)

    Elias David Nino Ruiz

    2013-05-01

    Full Text Available This paper presents a novel hybrid metaheuristic based on the Theory of Deterministic Swapping, the Theory of Evolution, and the Simulated Annealing metaheuristic for the multi-objective optimization of combinatorial problems. The proposed algorithm is named EMSA. It is an improvement of the MODS algorithm. Unlike MODS, EMSA works with a search direction given by assigning weights to each objective function of the combinatorial problem being optimized. Also, in order to avoid local optima, EMSA uses the crossover strategy of the Genetic Algorithm. Lastly, EMSA is tested using well-known instances of the Bi-Objective Traveling Salesman Problem (TSP) from TSPLIB. Its results were compared with the MODS metaheuristic (its predecessor). The comparison was made using metrics from the specialized literature such as Spacing, Generational Distance, Inverse Generational Distance and Non-Dominated Generation Vectors. In every case, the EMSA results on the metrics were better, and in some of those cases the superiority was 100%.

  8. MULTIOBJECTIVE OPTIMAL DESIGN OF THREE-PHASE INDUCTION GENERATOR USING SIMULATED ANNEALING TECHNIQUE

    Directory of Open Access Journals (Sweden)

    R.Kannan

    2010-05-01

    Full Text Available Self-excited induction generators are growing in popularity due to their advantages over conventional synchronous generators. In this paper, the task of finding the optimal design of a three-phase self-excited induction generator has been formulated as a multi-criterion optimization problem. The criterion functions in the example are the active material cost and the capacitance required for excitation under full-load conditions to maintain rated voltage. The Simulated Annealing technique is used as a tool to solve the problem. The obtained results prove the effectiveness of the multi-objective approach, since it allows a good compromise to be found among the proposed goals and, above all, it represents an efficacious tool for the designer.

  9. Classification of event-related potentials using multivariate autoregressive modeling combined with simulated annealing

    Directory of Open Access Journals (Sweden)

    Vasios C.E.

    2003-01-01

    Full Text Available In the present work, a new method for the classification of Event Related Potentials (ERPs) is proposed. The proposed method consists of two modules: the feature extraction module and the classification module. The feature extraction module comprises the implementation of the Multivariate Autoregressive model in conjunction with the Simulated Annealing technique, for the selection of optimum features from ERPs. The classification module is implemented with a single three-layer neural network, trained with the back-propagation algorithm, and classifies the data into two classes: patients and control subjects. The method, in the form of a Decision Support System (DSS), has been thoroughly tested on a number of patient data sets (OCD, FES, depressives and drug users), resulting in successful classification rates of up to 100%.

  10. Application of simulated annealing to solve multi-objectives for aggregate production planning

    Science.gov (United States)

    Atiya, Bayda; Bakheet, Abdul Jabbar Khudhur; Abbas, Iraq Tereq; Bakar, Mohd. Rizam Abu; Soon, Lee Lai; Monsi, Mansor Bin

    2016-06-01

    Aggregate production planning (APP) is one of the most significant and complicated problems in production planning. It aims to set overall production levels for each product category to meet fluctuating or uncertain future demand, and to make decisions concerning hiring, firing, overtime, subcontracting, and inventory levels. In this paper, we present a simulated annealing (SA) approach for multi-objective linear programming to solve APP. SA is considered to be a good tool for imprecise optimization problems. The proposed model minimizes total production and workforce costs. In this study, the proposed SA is compared with particle swarm optimization (PSO). The results show that the proposed SA is effective in reducing total production costs and requires minimal time.

  11. Simulated Annealing Based Hybrid Forecast for Improving Daily Municipal Solid Waste Generation Prediction

    Directory of Open Access Journals (Sweden)

    Jingwei Song

    2014-01-01

    Full Text Available A simulated annealing (SA) based variable weighted forecast model is proposed to combine and weight a local chaotic model, an artificial neural network (ANN), and a partial least squares support vector machine (PLS-SVM) to build a more accurate forecast model. The hybrid model was built and its multistep-ahead prediction ability was tested based on daily MSW generation data from Seattle, Washington, the United States. The hybrid forecast model was shown to produce more accurate and reliable results and to degrade less in longer predictions than the three individual models. The average one-week-ahead prediction error has been reduced from 11.21% (chaotic model), 12.93% (ANN), and 12.94% (PLS-SVM) to 9.38%. The five-week average has been reduced from 13.02% (chaotic model), 15.69% (ANN), and 15.92% (PLS-SVM) to 11.27%.
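
    A minimal sketch of the weighting idea is given below, assuming for the example that the hybrid is a convex combination of the three component forecasts with weights chosen by SA to minimize the mean absolute percentage error on a validation window; the error measure, move size, and cooling parameters are illustrative rather than those of the paper.

        import math
        import random

        def mape(actual, predicted):
            return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

        def combine(forecasts, w):
            return [sum(wi * f[k] for wi, f in zip(w, forecasts)) for k in range(len(forecasts[0]))]

        def sa_weights(forecasts, actual, t0=1.0, alpha=0.995, iters=10000):
            """Choose convex combination weights for the component models by simulated annealing."""
            n = len(forecasts)
            w = [1.0 / n] * n
            err, temp = mape(actual, combine(forecasts, w)), t0
            for _ in range(iters):
                cand = [max(0.0, wi + random.gauss(0, 0.05)) for wi in w]
                s = sum(cand) or 1.0
                cand = [wi / s for wi in cand]                          # renormalise to sum to one
                e = mape(actual, combine(forecasts, cand))
                if e < err or random.random() < math.exp(-(e - err) / temp):
                    w, err = cand, e
                temp *= alpha
            return w, err

        # Here the three forecast series would be the chaotic-model, ANN and PLS-SVM predictions.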

  12. Simulated Annealing for Ground State Energy of Ionized Donor Bound Excitons in Semiconductors

    Institute of Scientific and Technical Information of China (English)

    YAN Hai-Qing; TANG Chen; LIU Ming; ZHANG Hao; ZHANG Gui-Min

    2004-01-01

    We present a global optimization method, called simulated annealing, for the ground state energies of excitons. The proposed method does not require the partial derivatives with respect to each variational parameter or the solution of an eigenequation, so the present method is simpler in software programming than the variational method and overcomes the major difficulties. The ground state energies of ionized-donor-bound excitons (D+, X) have been calculated variationally for all values of the effective electron-to-hole mass ratio σ. They are compared with those obtained by the variational method. The results obtained demonstrate that the proposed method is simple, accurate, and has more advantages than the traditional methods in calculation.

  13. Simulated Annealing for Ground State Energy of Ionized Donor Bound Excitons in Semiconductors

    Institute of Scientific and Technical Information of China (English)

    YAN Hai-Qing; TANG Chen; LIU Ming; ZHANG Hao; ZHANG Gui-Min

    2004-01-01

    We present a global optimization method, called simulated annealing, for the ground state energies of excitons. The proposed method does not require the partial derivatives with respect to each variational parameter or the solution of an eigenequation, so the present method is simpler in software programming than the variational method and overcomes the major difficulties. The ground state energies of ionized-donor-bound excitons (D+, X) have been calculated variationally for all values of the effective electron-to-hole mass ratio σ. They are compared with those obtained by the variational method. The results obtained demonstrate that the proposed method is simple, accurate, and has more advantages than the traditional methods in calculation.
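
    The derivative-free character of the approach can be illustrated on a much simpler variational problem than the ionized-donor-bound exciton: the hydrogen-atom trial energy E(alpha) = alpha**2/2 - alpha (atomic units), whose exact minimum is E = -0.5 at alpha = 1. The Python sketch below locates it by simulated annealing without computing dE/d(alpha); it is a toy stand-in, not the excitonic calculation.

        import math
        import random

        def trial_energy(alpha):
            """Variational energy of a 1s trial wavefunction exp(-alpha*r) for hydrogen
            (atomic units): E(alpha) = alpha**2/2 - alpha, minimised at alpha = 1, E = -0.5."""
            return 0.5 * alpha ** 2 - alpha

        alpha = random.uniform(0.1, 3.0)
        e, temp = trial_energy(alpha), 0.5
        for _ in range(20000):
            cand = abs(alpha + random.gauss(0, 0.05))      # keep the variational parameter positive
            ec = trial_energy(cand)
            if ec < e or random.random() < math.exp(-(ec - e) / temp):
                alpha, e = cand, ec                         # accepted without any derivative information
            temp *= 0.9995
        print(alpha, e)                                     # expect roughly alpha = 1.0, E = -0.5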

  14. Simulated annealing for three-dimensional low-beta reduced MHD equilibria in cylindrical geometry

    CERN Document Server

    Furukawa, M

    2016-01-01

    Simulated annealing (SA) is applied to the three-dimensional (3D) equilibrium calculation of ideal, low-beta reduced MHD in cylindrical geometry. The SA is based on the theory of Hamiltonian mechanics. The dynamical equation of the original system, low-beta reduced MHD in this study, is modified so that the energy changes monotonically while the Casimir invariants are preserved in the artificial dynamics. An equilibrium of the system is given by an extremum of the energy; therefore, SA can be used as a method for calculating ideal MHD equilibria. Previous studies demonstrated that SA succeeds in reaching various MHD equilibria in a two-dimensional rectangular domain. In this paper, the theory is applied to the 3D equilibrium of ideal, low-beta reduced MHD. An example of an equilibrium with magnetic islands, obtained as a lower energy state, is shown. Several versions of the artificial dynamics that can effect smoothing are developed.

  15. Simulated annealing algorithm for multi-objective optimization : application to electric motor design

    Energy Technology Data Exchange (ETDEWEB)

    Idoumghar, L. [Haute Alcace Univ., Mulhouse (France); Fodorean, D.; Mirraoui, A. [Univ. of Technology of Belfort-Montbeliard, Belfort (France). Dept. of Electrical Engineering and Control Systems

    2010-03-09

    Metaheuristic algorithms can solve complex optimization problems. A unique simulated annealing (SA) algorithm for multi-objective optimization was presented in this paper. The proposed SA algorithm was validated on five standard benchmark mathematical functions and improved the design of an inset permanent magnet motor with concentrated flux (IPMM-CF). The paper provided a description of the SA algorithm and discussed the results. The five benchmarks that were studied were Rastrigin's function, Rosenbrock's function, Michalewicz's function, Schwefel's function, and the Noisy function. The findings were also compared with results obtained by using the Ant Colony paradigm as well as with a particle swarm algorithm. Conclusions and further research options were also offered. It was concluded that the proposed approach has better performance in terms of accuracy, convergence rate, stability and robustness. 15 refs., 4 tabs., 9 figs.

  16. Optimal Facility Location Model Based on Genetic Simulated Annealing Algorithm for Siting Urban Refueling Stations

    Directory of Open Access Journals (Sweden)

    Dawei Chen

    2015-01-01

    Full Text Available This paper analyzes the impact factors and principles of siting urban refueling stations and proposes a three-stage method. The main objective of the method is to minimize refueling vehicles’ detour time. The first stage aims at identifying the most frequently traveled road segments for siting refueling stations. The second stage focuses on adding additional refueling stations to serve vehicles whose demands are not directly satisfied by the refueling stations identified in the first stage. The last stage further adjusts and optimizes the refueling station plan generated by the first two stages. A genetic simulated annealing algorithm is proposed to solve the optimization problem in the second stage and the results are compared to those from the genetic algorithm. A case study is also conducted to demonstrate the effectiveness of the proposed method and algorithm. The results indicate the proposed method can provide practical and effective solutions that help planners and government agencies make informed refueling station location decisions.

  17. Discrete-State Simulated Annealing For Traveling-Wave Tube Slow-Wave Circuit Optimization

    Science.gov (United States)

    Wilson, Jeffrey D.; Bulson, Brian A.; Kory, Carol L.; Williams, W. Dan (Technical Monitor)

    2001-01-01

    Algorithms based on the global optimization technique of simulated annealing (SA) have proven useful in designing traveling-wave tube (TWT) slow-wave circuits for high RF power efficiency. The characteristic of SA that enables it to find a globally optimized solution is its ability to accept non-improving moves in a controlled manner. In the initial stages of the optimization, the algorithm moves freely through configuration space, accepting most of the proposed designs. This freedom of movement allows non-intuitive designs to be explored rather than restricting the optimization to local improvement upon the initial configuration. As the optimization proceeds, the rate of acceptance of non-improving moves is gradually reduced until the algorithm converges to the optimized solution. The rate at which the freedom of movement is decreased is known as the annealing or cooling schedule of the SA algorithm. The main disadvantage of SA is that there is no rigorous theoretical foundation for determining the parameters of the cooling schedule. The choice of these parameters is highly problem dependent, and the designer needs to experiment in order to determine values that will provide a good optimization in a reasonable amount of computational time. This experimentation can absorb a large amount of time, especially when the algorithm is being applied to a new type of design. In order to eliminate this disadvantage, a variation of SA known as discrete-state simulated annealing (DSSA) was recently developed. DSSA provides the theoretical foundation for a generic cooling schedule which is problem independent. Results of similar quality to SA can be obtained, but without the extra computational time required to tune the cooling parameters. Two algorithm variations based on DSSA were developed and programmed into a Microsoft Excel spreadsheet graphical user interface (GUI) to the two-dimensional nonlinear multisignal helix traveling-wave amplifier analysis program TWA3
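
    For contrast with DSSA, the conventional geometric cooling schedule whose parameters must be tuned by experiment looks roughly as follows (Python); this sketch shows the generic SA loop only, not the DSSA schedule itself, and every parameter value is the kind of arbitrary, problem-dependent choice the abstract says DSSA removes.

        import math
        import random

        def anneal(x0, neighbor, cost, t0=10.0, alpha=0.95, steps_per_temp=100, t_min=1e-3):
            """Conventional SA with geometric cooling: t0, alpha, steps_per_temp and t_min
            all have to be tuned by the designer for each new problem."""
            x, fx, t = x0, cost(x0), t0
            while t > t_min:
                for _ in range(steps_per_temp):
                    y = neighbor(x)
                    fy = cost(y)
                    if fy < fx or random.random() < math.exp(-(fy - fx) / t):
                        x, fx = y, fy
                t *= alpha                      # geometric cooling schedule
            return x, fx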

  18. Multivariable Optimization: Quantum Annealing & Computation

    OpenAIRE

    Mukherjee, Sudip; Chakrabarti, Bikas K.

    2014-01-01

    Recent developments in quantum annealing techniques have been indicating potential advantage of quantum annealing for solving NP-hard optimization problems. In this article we briefly indicate and discuss the beneficial features of quantum annealing techniques and compare them with those of simulated annealing techniques. We then briefly discuss the quantum annealing studies of some model spin glass and kinetically constrained systems.

  19. Joint Optimization of Vertical Component Gravity and Seismic P-wave First Arrivals by Simulated Annealing

    Science.gov (United States)

    Louie, J. N.; Basler-Reeder, K.; Kent, G. M.; Pullammanappallil, S. K.

    2015-12-01

    Simultaneous joint seismic-gravity optimization improves P-wave velocity models in areas with sharp lateral velocity contrasts. Optimization is achieved using simulated annealing, a metaheuristic global optimization algorithm that does not require an accurate initial model. Balancing the seismic-gravity objective function is accomplished by a novel approach based on analysis of Pareto charts. Gravity modeling uses a newly developed convolution algorithm, while seismic modeling utilizes the highly efficient Vidale eikonal equation traveltime generation technique. Synthetic tests show that joint optimization improves velocity model accuracy and provides velocity control below the deepest headwave raypath. Detailed first arrival picking followed by trial velocity modeling remediates inconsistent data. We use a set of highly refined first arrival picks to compare results of a convergent joint seismic-gravity optimization to the Plotrefa™ and SeisOpt® Pro™ velocity modeling packages. Plotrefa™ uses a nonlinear least squares approach that is initial model dependent and produces shallow velocity artifacts. SeisOpt® Pro™ utilizes the simulated annealing algorithm and is limited to depths above the deepest raypath. Joint optimization increases the depth of constrained velocities, improving reflector coherency at depth. Kirchhoff prestack depth migrations reveal that joint optimization ameliorates shallow velocity artifacts caused by limitations in refraction ray coverage. Seismic and gravity data from the San Emidio Geothermal field of the northwest Basin and Range province demonstrate that joint optimization changes interpretation outcomes. The prior shallow-valley interpretation gives way to a deep valley model, while shallow antiformal reflectors that could have been interpreted as antiformal folds are flattened. Furthermore, joint optimization provides a clearer image of the rangefront fault. This technique can readily be applied to existing datasets and could

  20. Optoelectronic analogs of self-programming neural nets - Architecture and methodologies for implementing fast stochastic learning by simulated annealing

    Science.gov (United States)

    Farhat, Nabil H.

    1987-01-01

    Self-organization and learning are distinctive features of neural nets and processors that set them apart from conventional approaches to signal processing. This leads to self-programmability, which alleviates the problem of programming complexity in artificial neural nets. In this paper, architectures for partitioning an optoelectronic analog of a neural net into distinct layers with a prescribed interconnectivity pattern to enable stochastic learning by simulated annealing in the context of a Boltzmann machine are presented. Stochastic learning is of interest because of its relevance to the role of noise in biological neural nets. Practical considerations and methodologies for appreciably accelerating stochastic learning in such a multilayered net are described. These include the use of parallel optical computing of the global energy of the net, the use of fast nonvolatile programmable spatial light modulators to realize fast plasticity, optical generation of random number arrays, and an adaptive noisy thresholding scheme that also makes stochastic learning more biologically plausible. The findings reported predict optoelectronic chips that can be used in the realization of optical learning machines.

  1. Deformation and fault parameters of the 2005 Qeshm earthquake in Iran revisited: A Bayesian simulated annealing approach applied to the inversion of space geodetic data

    OpenAIRE

    M. Amighpey; B. Voosoghi; Mahdi Motagh

    2013-01-01

    The estimation of earthquake source parameters using an earth surface displacement field in an elastic half-space leads to a complex nonlinear inverse problem that classic inverse methods are unable to solve. Global optimization methods such as simulated annealing are a good replacement for such problems. Simulated annealing is analogous to thermodynamic annealing where, under certain conditions, the chaotic motions of atoms in a melt can settle to form a crystal with minimal energy. Followin...

  2. Elemental thin film depth profiles by ion beam analysis using simulated annealing - a new tool

    Energy Technology Data Exchange (ETDEWEB)

    Jeynes, C [University of Surrey Ion Beam Centre, Guildford, GU2 7XH (United Kingdom); Barradas, N P [Instituto Tecnologico e Nuclear, E.N. 10, Sacavem (Portugal); Marriott, P K [Department of Statistics, National University of Singapore, Singapore (Singapore); Boudreault, G [University of Surrey Ion Beam Centre, Guildford, GU2 7XH (United Kingdom); Jenkin, M [School of Electronics Computing and Mathematics, University of Surrey, Guildford (United Kingdom); Wendler, E [Friedrich-Schiller-Universitaet Jena, Institut fuer Festkoerperphysik, Jena (Germany); Webb, R P [University of Surrey Ion Beam Centre, Guildford, GU2 7XH (United Kingdom)

    2003-04-07

    Rutherford backscattering spectrometry (RBS) and related techniques have long been used to determine the elemental depth profiles in films a few nanometres to a few microns thick. However, although obtaining spectra is very easy, solving the inverse problem of extracting the depth profiles from the spectra is not possible analytically except for special cases. It is because these special cases include important classes of samples, and because skilled analysts are adept at extracting useful qualitative information from the data, that ion beam analysis is still an important technique. We have recently solved this inverse problem using the simulated annealing algorithm. We have implemented the solution in the 'IBA DataFurnace' code, which has been developed into a very versatile and general new software tool that analysts can now use to rapidly extract quantitative accurate depth profiles from real samples on an industrial scale. We review the features, applicability and validation of this new code together with other approaches to handling IBA (ion beam analysis) data, with particular attention being given to determining both the absolute accuracy of the depth profiles and statistically accurate error estimates. We include examples of analyses using RBS, non-Rutherford elastic scattering, elastic recoil detection and non-resonant nuclear reactions. High depth resolution and the use of multiple techniques simultaneously are both discussed. There is usually systematic ambiguity in IBA data and Butler's example of ambiguity (1990 Nucl. Instrum. Methods B 45 160-5) is reanalysed. Analyses are shown: of evaporated, sputtered, oxidized, ion implanted, ion beam mixed and annealed materials; of semiconductors, optical and magnetic multilayers, superconductors, tribological films and metals; and of oxides on Si, mixed metal silicides, boron nitride, GaN, SiC, mixed metal oxides, YBCO and polymers. (topical review)

  3. SAIL: A CUDA-based implementation of the simulated annealing for the inverse Laplace transform problem

    CERN Document Server

    Lutsyshyn, Yaroslav

    2016-01-01

    We developed a CUDA-based parallelization of the annealing method for the inverse Laplace transform problem. The algorithm is based on simulated annealing and minimizes the residue of the reconstruction of the spectral function. We introduce local updates which preserve the first two sum rules and allow an efficient parallel CUDA implementation. Annealing is performed with the Monte Carlo method on a population of Markov walkers. We propose an imprinted branching method to further improve the convergence of the anneal. The algorithm is tested on a truncated double-peak Lorentzian spectrum, with examples of how the error in the input data affects the reconstruction.

  4. Core loading pattern optimization of a typical two-loop 300 MWe PWR using Simulated Annealing (SA), novel crossover Genetic Algorithms (GA) and hybrid GA(SA) schemes

    International Nuclear Information System (INIS)

    Highlights: • SA and GA based optimization for loading pattern has been carried out. • The LEOPARD and MCRAC codes for a typical PWR have been used. • At high annealing rates, the SA shows premature convergence. • Then novel crossover and mutation operators are proposed in this work. • Genetic Algorithms exhibit stagnation for small population sizes. - Abstract: A comparative study of Simulated Annealing and Genetic Algorithms based optimization of the loading pattern, with power profile flattening as the goal, has been carried out using the LEOPARD and MCRAC neutronic codes for a typical 300 MWe PWR. At high annealing rates, Simulated Annealing exhibited a tendency towards premature convergence, while at low annealing rates it failed to converge to the global minimum. The new ‘batch composition preserving’ Genetic Algorithms with novel crossover and mutation operators proposed in this work, consistent with earlier findings (Yamamoto, 1997), require, for small population sizes, computational effort comparable to Simulated Annealing with medium annealing rates. However, Genetic Algorithms exhibit stagnation for small population sizes. A hybrid Genetic Algorithms (Simulated Annealing) scheme is proposed that utilizes an inner Simulated Annealing layer for further evolution of the population at the stagnation point. The hybrid scheme has been found to escape stagnation in bcp Genetic Algorithms and converge to the global minimum with about 51% more computational effort for small population sizes.

  5. Identifying fracture-zone geometry using simulated annealing and hydraulic-connection data

    Science.gov (United States)

    Day-Lewis, F. D.; Hsieh, P.A.; Gorelick, S.M.

    2000-01-01

    A new approach is presented to condition geostatistical simulation of high-permeability zones in fractured rock to hydraulic-connection data. A simulated-annealing algorithm generates three-dimensional (3-D) realizations conditioned to borehole data, inferred hydraulic connections between packer-isolated borehole intervals, and an indicator (fracture zone or background-K bedrock) variogram model of spatial variability. We apply the method to data from the U.S. Geological Survey Mirror Lake Site in New Hampshire, where connected high-permeability fracture zones exert a strong control on fluid flow at the hundred-meter scale. Single-well hydraulic-packer tests indicate where permeable fracture zones intersect boreholes, and multiple-well pumping tests indicate the degree of hydraulic connection between boreholes. Borehole intervals connected by a fracture zone exhibit similar hydraulic responses, whereas intervals not connected by a fracture zone exhibit different responses. Our approach yields valuable insights into the 3-D geometry of fracture zones at Mirror Lake. Statistical analysis of the realizations yields maps of the probabilities of intersecting specific fracture zones with additional wells. Inverse flow modeling based on the assumption of equivalent porous media is used to estimate hydraulic conductivity and specific storage and to identify those fracture-zone geometries that are consistent with hydraulic test data.

  6. Structural optimization and segregation behavior of quaternary alloy nanoparticles based on simulated annealing algorithm

    Science.gov (United States)

    Xin-Ze, Lu; Gui-Fang, Shao; Liang-You, Xu; Tun-Dong, Liu; Yu-Hua, Wen

    2016-05-01

    Alloy nanoparticles exhibit higher catalytic activity than monometallic nanoparticles, and their stable structures are of importance to their applications. We employ the simulated annealing algorithm to systematically explore the stable structure and segregation behavior of tetrahexahedral Pt–Pd–Cu–Au quaternary alloy nanoparticles. Three alloy nanoparticles consisting of 443 atoms, 1417 atoms, and 3285 atoms are considered and compared. The preferred positions of atoms in the nanoparticles are analyzed. The simulation results reveal that Cu and Au atoms tend to occupy the surface, Pt atoms preferentially occupy the middle layers, and Pd atoms tend to segregate to the inner layers. Furthermore, Au atoms present stronger surface segregation than Cu atoms. This study provides a fundamental understanding of the structural features and segregation phenomena of multi-metallic nanoparticles. Project supported by the National Natural Science Foundation of China (Grant Nos. 51271156, 11474234, and 61403318) and the Natural Science Foundation of Fujian Province of China (Grant Nos. 2013J01255 and 2013J06002).

  7. Hybrid Simulated Annealing and Nelder-Mead Algorithm for Solving Large-Scale Global Optimization Problems

    Directory of Open Access Journals (Sweden)

    Ahmed Fouad Ali

    2014-05-01

    Full Text Available This paper presents a new algorithm for solving large-scale global optimization problems based on hybridization of the simulated annealing and Nelder-Mead algorithms. The new algorithm is called the simulated Nelder-Mead algorithm with random variable updating (SNMRVU). SNMRVU starts with an initial solution, which is generated randomly, and the solution is then divided into partitions. A neighborhood zone is generated, a random number of partitions is selected, and the variable-updating process starts in order to generate trial neighbor solutions. This process helps the SNMRVU algorithm explore the region around the current iterate solution. The Nelder-Mead algorithm is used in the final stage to improve the best solution found so far and to accelerate convergence. The performance of the SNMRVU algorithm is evaluated using 27 scalable benchmark functions and compared with four algorithms. The results show that the SNMRVU algorithm is promising and produces high-quality solutions with low computational cost.
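
    The two-phase structure described above, global annealing-style exploration followed by a Nelder-Mead polish of the best point found, can be sketched roughly as follows. This hedged Python example substitutes SciPy's dual_annealing for the paper's partition-based SA and uses a Rastrigin-type function as a stand-in benchmark (whether that function is among the paper's 27 is an assumption); it is not the authors' SNMRVU code.

```python
import numpy as np
from scipy.optimize import dual_annealing, minimize

def rastrigin(x):
    """Scalable multimodal benchmark used here as an illustrative objective."""
    x = np.asarray(x)
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

bounds = [(-5.12, 5.12)] * 10

# Stage 1: annealing-style global exploration (SciPy's dual_annealing stands in
# for the paper's SA with partitioned variable updates).
coarse = dual_annealing(rastrigin, bounds, seed=0, maxiter=500)

# Stage 2: Nelder-Mead simplex polish of the best point, mirroring SNMRVU's final stage.
polished = minimize(rastrigin, coarse.x, method="Nelder-Mead",
                    options={"xatol": 1e-8, "fatol": 1e-8})

print(coarse.fun, polished.fun)
```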

  8. Application of Statistical Design Methods and Simulated Annealing Algorithm in Milling Process Optimization

    Directory of Open Access Journals (Sweden)

    H. Gohari

    2016-08-01

    Full Text Available Investigating cutting forces and vibrations is of critical significance in analyzing and understanding machining processes, as it can provide more detail about cutting tool life, surface quality, and integrity. The purpose of this work is to find the optimal milling process parameters in order to reduce the effect of the forced vibrations induced by the cutting process. Minimizing the fluctuation of the cutting forces can result in a constant deflection during the milling process and thus help eliminate chatter and resonance phenomena during machining. In order to determine the optimal process parameters, cutter diameter, helix angle, and depth of cut have been considered as input design factors, and the average surface roughness as a machining characteristic that can be used to evaluate the induced cutting vibrations. Experimental tests have been performed based on the Taguchi experimental design method. A mathematical regression model has been developed and used as an objective function in the simulated annealing algorithm for the process optimization. In addition, analysis of variance (ANOVA) has been implemented to find the most significant parameters and the optimal parameter levels. A further analysis, based on the mechanistic cutting force model, has been performed to simulate the cutting forces, which indicate the force fluctuations at the optimal parameter levels. The results from the three techniques show the same optimal milling parameters, which can be used in designing new tools in order to eliminate the effect of chatter and forced vibrations.
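
    The workflow described above, fitting a regression model to designed-experiment data and then minimizing it with simulated annealing, can be sketched as follows. The polynomial surrogate, variable bounds, and data points below are placeholders chosen purely for illustration; they are not the paper's measured values or fitted model, and SciPy's dual_annealing stands in for the SA routine actually used.

```python
import numpy as np
from scipy.optimize import dual_annealing
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures

# Placeholder designed-experiment runs: [cutter diameter, helix angle, depth of cut]
# with average surface roughness as the response (illustrative numbers only).
X = np.array([[8, 30, 0.5], [8, 45, 1.0], [10, 30, 1.0],
              [10, 45, 0.5], [12, 30, 0.5], [12, 45, 1.0]])
y = np.array([1.8, 1.2, 1.5, 1.1, 1.4, 0.9])

poly = PolynomialFeatures(degree=2, include_bias=False)
model = LinearRegression().fit(poly.fit_transform(X), y)

def roughness(params):
    """Regression surrogate used as the SA objective (lower is better)."""
    return float(model.predict(poly.transform(np.atleast_2d(params)))[0])

bounds = [(8, 12), (30, 45), (0.5, 1.0)]
result = dual_annealing(roughness, bounds, seed=1)
print("optimal [diameter, helix, depth]:", result.x, "predicted roughness:", result.fun)
```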

  9. A Multi-Operator Based Simulated Annealing Approach For Robot Navigation in Uncertain Environments

    Directory of Open Access Journals (Sweden)

    Hui Miao

    2010-04-01

    Full Text Available Optimization methods such as simulated annealing (SA) and the genetic algorithm (GA) are used for solving optimization problems. However, the computational processing time is crucial for real-time applications such as mobile robots. A multi-operator based SA approach, incorporating four additional mathematical operators, that can find the optimal path for robots in dynamic environments is proposed in this paper. It requires less computation time while giving better trade-offs among simplicity, far-field accuracy, and computational cost. The contributions of the work include the implementation of the simulated annealing algorithm for robot path planning in dynamic environments, and an enhanced new path planner for improving the efficiency of the path planning algorithm. The simulation results are compared with the previously published classic SA approach and the GA approach. The multi-operator based SA (MSA) approach is demonstrated through case studies not only to be effective in obtaining the optimal solution but also to be more efficient in both off-line and on-line processing for robot dynamic path planning.

  10. Adaptive Resolution Simulation in Equilibrium and Beyond

    CERN Document Server

    Wang, Han

    2014-01-01

    In this paper, we investigate the equilibrium statistical properties of both the force and potential interpolations of adaptive resolution simulation (AdResS) under the theoretical framework of grand-canonical like AdResS (GC-AdResS). The thermodynamic relations between the higher and lower resolutions are derived by considering the absence of fundamental conservation laws in mechanics for both branches of AdResS. In order to investigate the applicability of the AdResS method in studying properties beyond equilibrium, we demonstrate the accuracy of AdResS in computing dynamical properties in two numerical examples: the velocity auto-correlation of pure water and the conformational relaxation of alanine dipeptide dissolved in water. Theoretical and technical open questions of the AdResS method are discussed at the end of the paper.

  11. Quantum Annealing for Clustering

    OpenAIRE

    Kurihara, Kenichi; Tanaka, Shu; Miyashita, Seiji

    2014-01-01

    This paper studies quantum annealing (QA) for clustering, which can be seen as an extension of simulated annealing (SA). We derive a QA algorithm for clustering and propose an annealing schedule, which is crucial in practice. Experiments show the proposed QA algorithm finds better clustering assignments than SA. Furthermore, QA is as easy as SA to implement.

  12. Kinetic Monte Carlo simulation of nanostructural evolution under post-irradiation annealing in dilute FeMnNi

    Energy Technology Data Exchange (ETDEWEB)

    Chiapetto, M. [SCK-CEN, Nuclear Materials Science Institute, Mol (Belgium); Unite Materiaux et Transformations (UMET), UMR 8207, Universite de Lille 1, ENSCL, Villeneuve d' Ascq (France); Becquart, C.S. [Unite Materiaux et Transformations (UMET), UMR 8207, Universite de Lille 1, ENSCL, Villeneuve d' Ascq (France); Laboratoire commun EDF-CNRS, Etude et Modelisation des Microstructures pour le Vieillissement des Materiaux (EM2VM) (France); Domain, C. [EDF R and D, Departement Materiaux et Mecanique des Composants, Les Renardieres, Moret sur Loing (France); Laboratoire commun EDF-CNRS, Etude et Modelisation des Microstructures pour le Vieillissement des Materiaux (EM2VM) (France); Malerba, L. [SCK-CEN, Nuclear Materials Science Institute, Mol (Belgium)

    2015-01-01

    Post-irradiation annealing experiments are often used to obtain clearer information on the nature of defects produced by irradiation. However, their interpretation is not always straightforward without the support of physical models. We apply here a physically-based set of parameters for object kinetic Monte Carlo (OKMC) simulations of the nanostructural evolution of FeMnNi alloys under irradiation to the simulation of their post-irradiation isochronal annealing, from 290 to 600 C. The model adopts a ''grey alloy'' scheme, i.e. the solute atoms are not introduced explicitly, only their effect on the properties of point-defect clusters is. Namely, it is assumed that both vacancy and SIA clusters are significantly slowed down by the solutes. The slowing down increases with size until the clusters become immobile. Specifically, the slowing down of SIA clusters by Mn and Ni can be justified in terms of the interaction between these atoms and crowdions in Fe. The results of the model compare quantitatively well with post-irradiation isochronal annealing experimental data, providing clear insight into the mechanisms that determine the disappearance or re-arrangement of defects as functions of annealing time and temperature. (copyright 2015 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  13. Kinetic Monte Carlo simulation of nanostructural evolution under post-irradiation annealing in dilute FeMnNi

    International Nuclear Information System (INIS)

    Post-irradiation annealing experiments are often used to obtain clearer information on the nature of defects produced by irradiation. However, their interpretation is not always straightforward without the support of physical models. We apply here a physically-based set of parameters for object kinetic Monte Carlo (OKMC) simulations of the nanostructural evolution of FeMnNi alloys under irradiation to the simulation of their post-irradiation isochronal annealing, from 290 to 600 C. The model adopts a ''grey alloy'' scheme, i.e. the solute atoms are not introduced explicitly, only their effect on the properties of point-defect clusters is. Namely, it is assumed that both vacancy and SIA clusters are significantly slowed down by the solutes. The slowing down increases with size until the clusters become immobile. Specifically, the slowing down of SIA clusters by Mn and Ni can be justified in terms of the interaction between these atoms and crowdions in Fe. The results of the model compare quantitatively well with post-irradiation isochronal annealing experimental data, providing clear insight into the mechanisms that determine the disappearance or re-arrangement of defects as functions of annealing time and temperature. (copyright 2015 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  14. Finding a Hadamard Matrix by Simulated Annealing of Spin-Vectors

    CERN Document Server

    Suksmono, Andriyan Bayu

    2016-01-01

    Reformulating a combinatorial problem as the optimization of a statistical-mechanics system enables finding a better solution using heuristics derived from a physical process, such as SA (Simulated Annealing). In this paper, we present a Hadamard matrix (H-matrix) searching method based on SA on an Ising model. By equivalence, an H-matrix can be converted into an SH (Semi-normalized Hadamard) matrix, whose first column is the unity vector and whose remaining columns are vectors with an equal number of -1 and +1 entries, called SH-vectors. We define SH spin-vectors to represent the SH vectors, which play the role of the spins on the Ising model. The topology of the lattice is generalized into a graph whose edges represent the orthogonality relationship among the SH spin-vectors. Starting from a randomly generated quasi H-matrix Q, a matrix similar to the SH-matrix but without imposing orthogonality, we perform the SA. The transitions of Q are conducted by random exchange of a {+,-} spin-pair within the SH spin-vectors whi...
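
    The move described above, a random exchange of a {+,-} pair inside one SH spin-vector, preserves the balance of +1 and -1 entries by construction. A hedged Python sketch of such a move and of an orthogonality-based energy follows; the matrix order and the exact energy definition are illustrative assumptions rather than the authors' formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_sh_vector(n):
    """Column with equal numbers of +1 and -1 entries (n must be even)."""
    v = np.array([1] * (n // 2) + [-1] * (n // 2))
    rng.shuffle(v)
    return v

def energy(Q):
    """Sum of squared off-diagonal inner products: zero iff all columns are orthogonal."""
    g = Q.T @ Q
    return float(np.sum(np.triu(g, 1) ** 2))

def pair_exchange(Q):
    """Swap one +1 entry with one -1 entry inside a randomly chosen SH column."""
    Qn = Q.copy()
    j = rng.integers(1, Qn.shape[1])          # skip column 0, the unity vector
    col = Qn[:, j]
    plus = rng.choice(np.where(col == 1)[0])
    minus = rng.choice(np.where(col == -1)[0])
    col[plus], col[minus] = -1, 1
    return Qn

n = 12  # illustrative order of the quasi H-matrix being searched for
Q = np.column_stack([np.ones(n, dtype=int)] + [random_sh_vector(n) for _ in range(n - 1)])
print(energy(Q), energy(pair_exchange(Q)))
```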

  15. A permutation based simulated annealing algorithm to predict pseudoknotted RNA secondary structures.

    Science.gov (United States)

    Tsang, Herbert H; Wiese, Kay C

    2015-01-01

    Pseudoknots are RNA tertiary structures which perform essential biological functions. This paper discusses SARNA-Predict-pk, an RNA pseudoknotted secondary structure prediction algorithm based on Simulated Annealing (SA). The research presented here extends previous work on SARNA-Predict and further examines the effect of extending the algorithm to the prediction of RNA secondary structures with pseudoknots. An evaluation of the performance of SARNA-Predict-pk in terms of prediction accuracy is made via comparison with several state-of-the-art prediction algorithms using 20 individual known structures from seven RNA classes. We measured the sensitivity and specificity of nine prediction algorithms. Three of these are dynamic programming algorithms: Pseudoknot (pknotsRE), NUPACK, and pknotsRG-mfe. One uses a statistical clustering approach (Sfold), and the other five are heuristic algorithms: SARNA-Predict-pk, ILM, STAR, IPknot, and HotKnots. The results presented in this paper demonstrate that SARNA-Predict-pk can outperform other state-of-the-art algorithms in terms of prediction accuracy. This supports the use of the proposed method for pseudoknotted RNA secondary structure prediction of other known structures. PMID:26558299

  16. Multiobjective Simulated Annealing for Collision Avoidance in ATM Accounting for Three Admissible Maneuvers

    Directory of Open Access Journals (Sweden)

    A. Mateos

    2016-01-01

    Full Text Available Technological advances are required to accommodate air traffic control systems for the future growth of air traffic. In particular, detection and resolution of conflicts between aircraft is a problem that has attracted much attention in the last decade, becoming vital for improving safety standards in free-flight unstructured environments. We propose using the archived simulated annealing-based multiobjective optimization algorithm to deal with such a problem, accounting for three admissible maneuvers (velocity, turn, and altitude changes) in a multiobjective context. The minimization of the maneuver number and magnitude, time delays, or deviations in the leaving points is considered for analysis. The optimal values for the algorithm parameter set are identified in the more complex instance, in which all aircraft have conflicts with each other, for 5, 10, and 20 aircraft. Moreover, the performance of the proposed approach is analyzed by means of a comparison with the Pareto front, computed using brute force for 5 aircraft, and the algorithm is also illustrated with a random instance with 20 aircraft.

  17. Fast simulated annealing inversion of surface waves on pavement using phase-velocity spectra

    Science.gov (United States)

    Ryden, N.; Park, C.B.

    2006-01-01

    The conventional inversion of surface waves depends on modal identification of measured dispersion curves, which can be ambiguous. It is possible to avoid mode-number identification and extraction by inverting the complete phase-velocity spectrum obtained from a multichannel record. We use the fast simulated annealing (FSA) global search algorithm to minimize the difference between the measured phase-velocity spectrum and that calculated from a theoretical layer model, including the field setup geometry. Results show that this algorithm can help one avoid getting trapped in local minima while searching for the best-matching layer model. The entire procedure is demonstrated on synthetic and field data for asphalt pavement. The viscoelastic properties of the top asphalt layer are taken into account, and the inverted asphalt stiffness as a function of frequency compares well with laboratory tests on core samples. The thickness and shear-wave velocity of the deeper embedded layers are resolved within 10% deviation from those values measured separately during pavement construction. The proposed method may be equally applicable to normal soil site investigation and in the field of ultrasonic testing of materials. ?? 2006 Society of Exploration Geophysicists.
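
    Fast simulated annealing is commonly distinguished from classical SA by a hyperbolic temperature decay and heavy-tailed (Cauchy-distributed) trial steps, which allow occasional long jumps out of local minima. The hedged sketch below shows that generic schedule with a toy one-dimensional misfit standing in for the comparison between measured and modelled phase-velocity spectra; it is an illustration under those assumptions, not the authors' inversion code.

```python
import math
import random

def fast_simulated_annealing(misfit, x0, t0=1.0, n_iter=5000, step=1.0):
    """FSA sketch: temperature decays as T0/(1+k) and moves are Cauchy-distributed."""
    x, e = x0, misfit(x0)
    best, best_e = x, e
    for k in range(1, n_iter + 1):
        t = t0 / (1.0 + k)                                         # hyperbolic cooling
        cand = x + step * t * math.tan(math.pi * (random.random() - 0.5))  # Cauchy jump
        e_cand = misfit(cand)
        if e_cand < e or random.random() < math.exp(-(e_cand - e) / t):
            x, e = cand, e_cand
            if e < best_e:
                best, best_e = x, e
    return best, best_e

# Toy misfit with competing minima; a real objective would compare measured and
# modelled phase-velocity spectra over a layered pavement model.
best, err = fast_simulated_annealing(lambda v: (v**2 - 1.0)**2 + 0.1 * v, x0=3.0)
print(best, err)
```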

  18. Equilibrium properties of transition-metal ion-argon clusters via simulated annealing

    Science.gov (United States)

    Asher, Robert L.; Micha, David A.; Brucat, Philip J.

    1992-01-01

    The geometrical structures of M(+) (Ar)n ions, with n = 1-14, have been studied by the minimization of a many-body potential surface with a simulated annealing procedure. The minimization method is justified for finite systems through the use of an information theory approach. It is carried out for eight potential-energy surfaces constructed with two- and three-body terms parametrized from experimental data and ab initio results. The potentials should be representative of clusters of argon atoms with first-row transition-metal monocations of varying size. The calculated geometries for M(+) = Co(+) and V(+) possess radial shells with small (ca. 4-8) first-shell coordination number. The inclusion of an ion-induced-dipole-ion-induced-dipole interaction between argon atoms raises the energy and generally lowers the symmetry of the cluster by promoting incomplete shell closure. Rotational constants as well as electric dipole and quadrupole moments are quoted for the Co(+) (Ar)n and V(+) (Ar)n predicted structures.

  19. Multiobjective Simulated Annealing-Based Clustering of Tissue Samples for Cancer Diagnosis.

    Science.gov (United States)

    Acharya, Sudipta; Saha, Sriparna; Thadisina, Yamini

    2016-03-01

    In the field of pattern recognition, the study of the gene expression profiles of different tissue samples over different experimental conditions has become feasible with the arrival of microarray-based technology. In cancer research, classification of tissue samples is necessary for cancer diagnosis, which can be done with the help of microarray technology. In this paper, we have presented a multiobjective optimization (MOO)-based clustering technique utilizing archived multiobjective simulated annealing (AMOSA) as the underlying optimization strategy for classification of tissue samples from cancer datasets. The presented clustering technique is evaluated for three open source benchmark cancer datasets [Brain tumor dataset, Adult Malignancy, and Small Round Blood Cell Tumors (SRBCT)]. In order to evaluate the quality or goodness of produced clusters, two cluster quality measures, viz., adjusted Rand index and classification accuracy (%CoA), are calculated. Comparative results of the presented clustering algorithm with ten state-of-the-art existing clustering techniques are shown for the three benchmark datasets. Also, we have conducted a statistical significance test, the t-test, to prove the superiority of our presented MOO-based clustering technique over other clustering techniques. Moreover, significant gene markers have been identified and demonstrated visually from the clustering solutions obtained. In the field of cancer subtype prediction, this study can have an important impact. PMID:25706936

  20. Optimal pumping from Palmela water supply wells (Portugal) using simulated annealing

    Science.gov (United States)

    Fragoso, Teresa; Cunha, Maria Da Conceição; Lobo-Ferreira, João P.

    2009-12-01

    Aquifer systems are an important part of an integrated water resources management plan as foreseen in the European Union’s Water Framework Directive (2000). The sustainable development of these systems demands the use of all available techniques capable of handling the multidisciplinary features of the problems involved. The formulation and resolution of an optimization model is described for a planning and management problem based on the Palmela aquifer (Portugal), developed to supply a given number of demand centres. This problem is solved using one of the latest optimization techniques, the simulated annealing heuristic method, designed to find the optimal solutions while avoiding falling into local optima. The solution obtained, providing the well locations and the corresponding pumped flows to supply each centre, is analysed taking into account the objective function components and the constraints. It was found that the operation cost is the biggest share of the final cost, and the choice of wells is greatly affected by this fact. Another conclusion is that the solution takes advantage of economies of scale, that is, it points toward drilling a large-capacity well even if this increases the investment cost, rather than drilling several wells, which together will increase the operation costs.

  1. A Simulated Annealing Methodology to Multiproduct Capacitated Facility Location with Stochastic Demand

    Directory of Open Access Journals (Sweden)

    Jin Qin

    2015-01-01

    Full Text Available A stochastic multiproduct capacitated facility location problem involving a single supplier and multiple customers is investigated. Due to the stochastic demands, a reasonable amount of safety stock must be kept in the facilities to achieve suitable service levels, which results in increased inventory cost. Based on the assumption that all the stochastic demands are normally distributed, a nonlinear mixed-integer programming model is proposed, whose objective is to minimize the total cost, including transportation cost, inventory cost, operation cost, and setup cost. A combined simulated annealing (CSA) algorithm is presented to solve the model, in which the outer-layer subalgorithm optimizes the facility location decision and the inner-layer subalgorithm optimizes the demand allocation based on the determined facility location decision. The results obtained with this approach show that the CSA is a robust and practical approach for solving a multiproduct problem, generating suboptimal facility location decisions and inventory policies. Meanwhile, we also found that the transportation cost and the demand deviation have the strongest influence on the optimal decision compared to the other factors.
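
    The two-layer structure described above can be mirrored as an outer annealing loop over facility open/close decisions, with an inner routine allocating demand for each candidate configuration. In the hedged Python sketch below, the inner layer is a simple cheapest-open-facility allocation and all costs are made-up placeholders, so it only illustrates the nesting, not the paper's actual CSA model.

```python
import math
import random

random.seed(0)

# Illustrative data: 4 candidate facilities, 6 customers, random unit transport costs.
FIXED_COST = [100, 120, 90, 110]
TRANSPORT = [[random.uniform(1, 10) for _ in range(4)] for _ in range(6)]

def inner_allocation_cost(open_facilities):
    """Inner layer (placeholder): each customer is served by the cheapest open facility."""
    if not open_facilities:
        return float("inf")
    transport = sum(min(TRANSPORT[c][f] for f in open_facilities) for c in range(6))
    return transport + sum(FIXED_COST[f] for f in open_facilities)

def outer_annealing(t0=50.0, alpha=0.95, iters=2000):
    """Outer layer: simulated annealing over the binary open/close decision."""
    state = {0, 2}
    cost = inner_allocation_cost(state)
    best, best_cost, t = set(state), cost, t0
    for _ in range(iters):
        cand = set(state)
        cand.symmetric_difference_update({random.randrange(4)})   # flip one facility
        c = inner_allocation_cost(cand)
        if c < cost or random.random() < math.exp(-(c - cost) / t):
            state, cost = cand, c
            if cost < best_cost:
                best, best_cost = set(state), cost
        t = max(t * alpha, 1e-3)
    return best, best_cost

print(outer_annealing())
```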

  2. QSAR modeling for quinoxaline derivatives using genetic algorithm and simulated annealing based feature selection.

    Science.gov (United States)

    Ghosh, P; Bagchi, M C

    2009-01-01

    With a view to the rational design of selective quinoxaline derivatives, 2D and 3D-QSAR models have been developed for the prediction of anti-tubercular activities. Successful implementation of a predictive QSAR model largely depends on the selection of a preferred set of molecular descriptors that can signify the chemico-biological interaction. Genetic algorithm (GA) and simulated annealing (SA) are applied as variable selection methods for model development. 2D-QSAR modeling using GA or SA based partial least squares (GA-PLS and SA-PLS) methods identified some important topological and electrostatic descriptors as important factors for anti-tubercular activity. Kohonen network and counter-propagation artificial neural network (CP-ANN) models with GA and SA based feature selection have been applied for such QSAR modeling of quinoxaline compounds. Out of a variable pool of 380 molecular descriptors, predictive QSAR models are developed for the training set and validated on the test set compounds, and a comparative study of the relative effectiveness of linear and non-linear approaches has been carried out. Further analysis using the 3D-QSAR technique identifies two models obtained by GA-PLS and SA-PLS methods leading to anti-tubercular activity prediction. The influences of steric and electrostatic field effects generated by the contribution plots are discussed. The results indicate that SA is a very effective variable selection approach for such 3D-QSAR modeling.

  3. An archived multi-objective simulated annealing for a dynamic cellular manufacturing system

    Science.gov (United States)

    Shirazi, Hossein; Kia, Reza; Javadian, Nikbakhsh; Tavakkoli-Moghaddam, Reza

    2014-05-01

    To design a group layout of a cellular manufacturing system (CMS) in a dynamic environment, a multi-objective mixed-integer non-linear programming model is developed. The model integrates cell formation, group layout and production planning (PP) as three interrelated decisions involved in the design of a CMS. This paper provides an extensive coverage of important manufacturing features used in the design of CMSs and enhances the flexibility of an existing model in handling the fluctuations of part demands more economically by adding machine depot and PP decisions. Two conflicting objectives to be minimized are the total costs and the imbalance of workload among cells. As the considered objectives in this model are in conflict with each other, an archived multi-objective simulated annealing (AMOSA) algorithm is designed to find Pareto-optimal solutions. Matrix-based solution representation, a heuristic procedure generating an initial and feasible solution and efficient mutation operators are the advantages of the designed AMOSA. To demonstrate the efficiency of the proposed algorithm, the performance of AMOSA is compared with an exact algorithm (i.e., ε-constraint method) solved by the GAMS software and a well-known evolutionary algorithm, namely NSGA-II for some randomly generated problems based on some comparison metrics. The obtained results show that the designed AMOSA can obtain satisfactory solutions for the multi-objective model.

  4. Selecting Magnet Laminations Recipes Using the Meth-od of Sim-u-la-ted Annealing

    Science.gov (United States)

    Russell, A. D.; Baiod, R.; Brown, B. C.; Harding, D. J.; Martin, P. S.

    1997-05-01

    The Fermilab Main Injector project is building 344 dipoles using more than 7000 tons of steel. Budget and logistical constraints required that steel production, lamination stamping and magnet fabrication proceed in parallel. There were significant run-to-run variations in the magnetic properties of the steel (Martin, P.S., et al., Variations in the Steel Properties and the Excitation Characteristics of FMI Dipoles, this conference). The large lamination size (>0.5 m coil opening) resulted in variations of gap height due to differences in stress relief in the steel after stamping. To minimize magnet-to-magnet strength and field shape variations the laminations were shuffled based on the available magnetic and mechanical data and assigned to magnets using a computer program based on the method of simulated annealing. The lamination sets selected by the program have produced magnets which easily satisfy the design requirements. Variations of the average magnet gap are an order of magnitude smaller than the variations in lamination gaps. This paper discusses observed gap variations, the program structure and the strength uniformity results.

  5. Simulated Annealing-Based Ant Colony Algorithm for Tugboat Scheduling Optimization

    Directory of Open Access Journals (Sweden)

    Qi Xu

    2012-01-01

    Full Text Available As the “first service station” for ships in the whole port logistics system, the tugboat operation system is one of the most important systems in port logistics. This paper formulates the tugboat scheduling problem as a multiprocessor task scheduling problem (MTSP) after analyzing the characteristics of tugboat operation. The model considers factors of multiple anchorage bases, different operation modes, and three stages of operations (berthing/shifting-berth/unberthing). The objective is to minimize the total operation time for all tugboats in a port. A hybrid simulated annealing-based ant colony algorithm is proposed to solve the addressed problem. Numerical experiments without the shifting-berth operation verified the effectiveness of the approach and showed that more efficient sailing may be possible if tugboats return to the anchorage base in a timely manner; experiments with the shifting-berth operation show that the objective is most sensitive to the proportion of shifting-berth operations, slightly influenced by the tugboat deployment scheme, and not sensitive to the handling operation times.

  6. Inversion of sonobuoy data from shallow-water sites with simulated annealing

    Science.gov (United States)

    Lindwall, Dennis; Brozena, John

    2005-02-01

    An enhanced simulated annealing algorithm is used to invert sparsely sampled seismic data collected with sonobuoys to obtain seafloor geoacoustic properties at two littoral marine environments as well as for a synthetic data set. Inversion of field data from a 750-m water-depth site using a water-gun sound source found a good solution which included a pronounced subbottom reflector after 6483 iterations over seven variables. Field data from a 250-m water-depth site using an air-gun source required 35 421 iterations for a good inversion solution because 30 variables had to be solved for, including the shot-to-receiver offsets. The sonobuoy derived compressional wave velocity-depth (Vp-Z) models compare favorably with Vp-Z models derived from nearby, high-quality, multichannel seismic data. There are, however, substantial differences between seafloor reflection coefficients calculated from field models and seafloor reflection coefficients based on commonly used Vp regression curves (gradients). Reflection loss is higher at one field site and lower at the other than predicted from commonly used Vp gradients for terrigenous sediments. In addition, there are strong effects on reflection loss due to the subseafloor interfaces that are also not predicted by Vp gradients.

  7. An interactive system for creating object models from range data based on simulated annealing

    International Nuclear Information System (INIS)

    In hazardous applications such as remediation of buried waste and dismantlement of radioactive facilities, robots are an attractive solution. Sensing to recognize and locate objects is a critical need for robotic operations in unstructured environments. An accurate 3-D model of objects in the scene is necessary for efficient high level control of robots. Drawing upon concepts from supervisory control, the authors have developed an interactive system for creating object models from range data, based on simulated annealing. Site modeling is a task that is typically performed using purely manual or autonomous techniques, each of which has inherent strengths and weaknesses. However, an interactive modeling system combines the advantages of both manual and autonomous methods, to create a system that has high operator productivity as well as high flexibility and robustness. The system is unique in that it can work with very sparse range data, tolerate occlusions, and tolerate cluttered scenes. The authors have performed an informal evaluation with four operators on 16 different scenes, and have shown that the interactive system is superior to either manual or automatic methods in terms of task time and accuracy

  8. Optimization Of Thermo-Electric Coolers Using Hybrid Genetic Algorithm And Simulated Annealing

    Directory of Open Access Journals (Sweden)

    Khanh Doan V.K.

    2014-06-01

    Full Text Available Thermo-electric Coolers (TECs) are nowadays applied in a wide range of thermal energy systems. This is due to their superior features: no refrigerant or dynamic parts are needed, they generate no electrical or acoustical noise, and they are environmentally friendly. Over the past decades, many studies have sought to improve the efficiency of TECs by enhancing the material parameters and design parameters. The material parameters are restricted by currently available materials and module fabrication technologies. Therefore, the main objective of TEC design is to determine a set of design parameters such as leg area, leg length, and the number of legs. Two elements that play an important role when considering the suitability of TECs in applications are the rate of refrigeration (ROR) and the coefficient of performance (COP). In this paper, a review of some previous research is conducted to show the diversity of optimization approaches in the design of TECs for enhancing performance and efficiency. After that, single-objective optimization problems (SOP) are tested first by using the Genetic Algorithm (GA) and Simulated Annealing (SA) to optimize geometry properties so that TECs will operate at near-optimal conditions. Equality and inequality constraints were taken into consideration.

  9. Study on Multi-stream Heat Exchanger Network Synthesis with Parallel Genetic/Simulated Annealing Algorithm

    Institute of Scientific and Technical Information of China (English)

    魏关锋; 姚平经; LUO Xing; ROETZEL Wilfried

    2004-01-01

    The multi-stream heat exchanger network synthesis (HENS) problem can be formulated as a mixed-integer nonlinear programming model, following Yee et al. Its nonconvex nature leads to the existence of more than one optimum and to computational difficulty for traditional algorithms in finding the global optimum. Compared with deterministic algorithms, evolutionary computation provides a promising approach to tackle this problem. In this paper, a mathematical model of the multi-stream heat exchanger network synthesis problem is set up. Unlike the assumption of isothermal mixing of stream splits, and thus the linear constraints, of Yee et al., non-isothermal mixing is supported. As a consequence, nonlinear constraints result and nonconvexity is added to the objective function. To solve the mathematical model, an algorithm named GA/SA (parallel genetic/simulated annealing algorithm) is detailed for application to the multi-stream heat exchanger network synthesis problem. The performance of the proposed approach is demonstrated with three examples, and the obtained solutions indicate that the presented approach is effective for multi-stream HENS.

  10. Crosshole Tomography, Waveform Inversion, and Anisotropy: A Combined Approach Using Simulated Annealing

    Science.gov (United States)

    Afanasiev, M.; Pratt, R. G.; Kamei, R.; McDowell, G.

    2012-12-01

    Crosshole seismic tomography has been used by Vale to provide geophysical images of mineralized massive sulfides in the Eastern Deeps deposit at Voisey's Bay, Labrador, Canada. To date, these data have been processed using traveltime tomography, and we seek to improve the resolution of these images by applying acoustic Waveform Tomography. Due to the computational cost of acoustic waveform modelling, local descent algorithms are employed in Waveform Tomography; due to non-linearity an initial model is required which predicts first-arrival traveltimes to within a half-cycle of the lowest frequency used. Because seismic velocity anisotropy can be significant in hardrock settings, the initial model must quantify the anisotropy in order to meet the half-cycle criterion. In our case study, significant velocity contrasts between the target massive sulfides and the surrounding country rock led to difficulties in generating an accurate anisotropy model through traveltime tomography, and our starting model for Waveform Tomography failed the half-cycle criterion at large offsets. We formulate a new, semi-global approach for finding the best-fit 1-D elliptical anisotropy model using simulated annealing. Through random perturbations to Thomsen's ε parameter, we explore the L2 norm of the frequency-domain phase residuals in the space of potential anisotropy models: If a perturbation decreases the residuals, it is always accepted, but if a perturbation increases the residuals, it is accepted with the probability P = exp(-(Ei-E)/T). This is the Metropolis criterion, where Ei is the value of the residuals at the current iteration, E is the value of the residuals for the previously accepted model, and T is a probability control parameter, which is decreased over the course of the simulation via a preselected cooling schedule. Convergence to the global minimum of the residuals is guaranteed only for infinitely slow cooling, but in practice good results are obtained from a variety
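
    The acceptance rule quoted above is the standard Metropolis criterion. A minimal hedged sketch of such a one-parameter anisotropy search follows, with a simple convex placeholder standing in for the L2 norm of the frequency-domain phase residuals; the real objective requires full waveform modelling and is far more expensive, and all parameter values here are illustrative.

```python
import math
import random

def metropolis_anneal(residual, epsilon0, t0=1.0, cooling=0.99, n_iter=3000, step=0.01):
    """Randomly perturb the anisotropy parameter; worse models are accepted with
    probability exp(-(E_i - E)/T), with T lowered by a preselected cooling schedule."""
    eps, e = epsilon0, residual(epsilon0)
    best, best_e, t = eps, e, t0
    for _ in range(n_iter):
        cand = eps + random.gauss(0.0, step)
        e_cand = residual(cand)
        if e_cand <= e or random.random() < math.exp(-(e_cand - e) / t):
            eps, e = cand, e_cand
            if e < best_e:
                best, best_e = eps, e
        t *= cooling
    return best, best_e

# Stand-in residual: a convex placeholder with its minimum near 0.15; the actual
# residual would be the misfit of modelled versus observed phase data.
print(metropolis_anneal(lambda eps: (eps - 0.15) ** 2, epsilon0=0.0))
```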

  11. Forecasting nonlinear chaotic time series with function expression method based on an improved genetic-simulated annealing algorithm.

    Science.gov (United States)

    Wang, Jun; Zhou, Bi-hua; Zhou, Shu-dao; Sheng, Zheng

    2015-01-01

    The paper proposes a novel function expression method to forecast chaotic time series, using an improved genetic-simulated annealing (IGSA) algorithm to establish the optimum function expression that describes the behavior of time series. In order to deal with the weakness associated with the genetic algorithm, the proposed algorithm incorporates the simulated annealing operation which has the strong local search ability into the genetic algorithm to enhance the performance of optimization; besides, the fitness function and genetic operators are also improved. Finally, the method is applied to the chaotic time series of Quadratic and Rossler maps for validation. The effect of noise in the chaotic time series is also studied numerically. The numerical results verify that the method can forecast chaotic time series with high precision and effectiveness, and the forecasting precision with certain noise is also satisfactory. It can be concluded that the IGSA algorithm is energy-efficient and superior.

  12. Forecasting Nonlinear Chaotic Time Series with Function Expression Method Based on an Improved Genetic-Simulated Annealing Algorithm

    Directory of Open Access Journals (Sweden)

    Jun Wang

    2015-01-01

    Full Text Available The paper proposes a novel function expression method to forecast chaotic time series, using an improved genetic-simulated annealing (IGSA algorithm to establish the optimum function expression that describes the behavior of time series. In order to deal with the weakness associated with the genetic algorithm, the proposed algorithm incorporates the simulated annealing operation which has the strong local search ability into the genetic algorithm to enhance the performance of optimization; besides, the fitness function and genetic operators are also improved. Finally, the method is applied to the chaotic time series of Quadratic and Rossler maps for validation. The effect of noise in the chaotic time series is also studied numerically. The numerical results verify that the method can forecast chaotic time series with high precision and effectiveness, and the forecasting precision with certain noise is also satisfactory. It can be concluded that the IGSA algorithm is energy-efficient and superior.

  13. Using Site Testing Data for Adaptive Optics Simulations

    OpenAIRE

    Herriot, Glen; Andersen, David; Conan, Rod; Ellerbroek, Brent; Gilles, Luc; Hickson, Paul; Jackson, Kate; Lardière, Olivier; Pfrommer, Thomas; Véran, Jean-Pierre; Wang, Lianqi

    2011-01-01

    Astronomical site testing data plays a vital role in the simulation, design, evaluation, and operation of adaptive optics systems for large telescopes. We present the example of TMT and its first-light facility adaptive optics system NFIRAOS, and illustrate the many simulations done based on site testing data.

  14. Exploring the Use of Adaptively Restrained Particles for Graphics Simulations

    OpenAIRE

    Pierre-Luc Manteaux; François Faure; Stephane Redon; Marie-Paule Cani

    2013-01-01

    In this paper, we explore the use of Adaptively Restrained (AR) particles for graphics simulations. Contrary to previous methods, Adaptively Restrained Particle Simulations (ARPS) do not adapt time or space sampling, but rather switch the positional degrees of freedom of particles on and off, while letting their momenta evolve. Therefore, inter-particle forces do not have to be updated at each time step, in contrast with traditional methods that spend a lot of time ...

  15. Automated integration of genomic physical mapping data via parallel simulated annealing

    Energy Technology Data Exchange (ETDEWEB)

    Slezak, T.

    1994-06-01

    The Human Genome Center at the Lawrence Livermore National Laboratory (LLNL) is nearing closure on a high-resolution physical map of human chromosome 19. We have built automated tools to assemble 15,000 fingerprinted cosmid clones into 800 contigs with minimal spanning paths identified. These islands are being ordered, oriented, and spanned by a variety of other techniques including: Fluorescence In Situ Hybridization (FISH) at 3 levels of resolution, ECO restriction fragment mapping across all contigs, and a multitude of different hybridization and PCR techniques to link cosmid, YAC, AC, PAC, and Pl clones. The FISH data provide us with partial order and distance data as well as orientation. We made the observation that map builders need a much rougher presentation of data than do map readers; the former wish to see raw data since these can expose errors or interesting biology. We further noted that by ignoring our length and distance data we could simplify our problem into one that could be readily attacked with optimization techniques. The data integration problem could then be seen as an M x N ordering of our N cosmid clones which ``intersect`` M larger objects by defining ``intersection`` to mean either contig/map membership or hybridization results. Clearly, the goal of making an integrated map is now to rearrange the N cosmid clone ``columns`` such that the number of gaps on the object ``rows`` is minimized. Our FISH partially-ordered cosmid clones provide us with a set of constraints that cannot be violated by the rearrangement process. We solved the optimization problem via simulated annealing performed on a network of 40+ Unix machines in parallel, using a server/client model built on explicit socket calls. For current maps we can create a map in about 4 hours on the parallel net versus 4+ days on a single workstation. Our biologists are now using this software on a daily basis to guide their efforts toward final closure.
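
    The objective described above, rearranging the N cosmid-clone columns so that the member clones of each row (a contig or a hybridization result) sit in as few separated runs as possible, can be scored with a simple gap count. The Python sketch below shows one hedged way to evaluate an ordering; the toy data and the exact definition of a gap are illustrative assumptions, not the LLNL implementation.

```python
import random

def gap_count(order, memberships):
    """Count, for every row, the breaks in the positions of its member clones
    under the current column ordering; lower is a better-integrated map."""
    pos = {clone: i for i, clone in enumerate(order)}
    gaps = 0
    for members in memberships:
        cols = sorted(pos[c] for c in members)
        # a 'gap' is any break in an otherwise consecutive run of member columns
        gaps += sum(1 for a, b in zip(cols, cols[1:]) if b - a > 1)
    return gaps

# Toy data: 6 clones and 3 membership rows (illustrative only).
clones = list("ABCDEF")
rows = [{"A", "B", "C"}, {"C", "D"}, {"E", "F"}]
order = clones[:]
random.shuffle(order)
print(order, gap_count(order, rows))
```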

  16. A restraint molecular dynamics and simulated annealing approach for protein homology modeling utilizing mean angles

    Directory of Open Access Journals (Sweden)

    Maurer Till

    2005-04-01

    Full Text Available Abstract Background We have developed the program PERMOL for semi-automated homology modeling of proteins. It is based on restrained molecular dynamics using a simulated annealing protocol in torsion angle space. As the main restraints defining the optimal local geometry of the structure, weighted mean dihedral angles and their standard deviations are used, which are calculated with an algorithm described earlier by Döker et al. (1999, BBRC, 257, 348–350). The overall long-range contacts are established via a small number of distance restraints between atoms involved in hydrogen bonds and backbone atoms of conserved residues. Employing the restraints generated by PERMOL, three-dimensional structures are obtained using standard molecular dynamics programs such as DYANA or CNS. Results To test this modeling approach, it has been used for predicting the structure of the histidine-containing phosphocarrier protein HPr from E. coli and the structure of the human peroxisome proliferator activated receptor γ (Ppar γ). The divergence between the modeled HPr and the previously determined X-ray structure was comparable to the divergence between the X-ray structure and the published NMR structure. The modeled structure of Ppar γ was also very close to the previously solved X-ray structure, with an RMSD of 0.262 nm for the backbone atoms. Conclusion In summary, we present a new method for homology modeling capable of producing high-quality structure models. An advantage of the method is that it can be used in combination with incomplete NMR data to obtain reasonable structure models in accordance with the experimental data.

  17. Electrode Materials, Thermal Annealing Sequences, and Lateral/Vertical Phase Separation of Polymer Solar Cells from Multiscale Molecular Simulations

    KAUST Repository

    Lee, Cheng-Kuang

    2014-12-10

    © 2014 American Chemical Society. The nanomorphologies of the bulk heterojunction (BHJ) layer of polymer solar cells are extremely sensitive to the electrode materials and thermal annealing conditions. In this work, the correlations of electrode materials, thermal annealing sequences, and resultant BHJ nanomorphological details of P3HT:PCBM BHJ polymer solar cell are studied by a series of large-scale, coarse-grained (CG) molecular simulations of system comprised of PEDOT:PSS/P3HT:PCBM/Al layers. Simulations are performed for various configurations of electrode materials as well as processing temperature. The complex CG molecular data are characterized using a novel extension of our graph-based framework to quantify morphology and establish a link between morphology and processing conditions. Our analysis indicates that vertical phase segregation of P3HT:PCBM blend strongly depends on the electrode material and thermal annealing schedule. A thin P3HT-rich film is formed on the top, regardless of bottom electrode material, when the BHJ layer is exposed to the free surface during thermal annealing. In addition, preferential segregation of P3HT chains and PCBM molecules toward PEDOT:PSS and Al electrodes, respectively, is observed. Detailed morphology analysis indicated that, surprisingly, vertical phase segregation does not affect the connectivity of donor/acceptor domains with respective electrodes. However, the formation of P3HT/PCBM depletion zones next to the P3HT/PCBM-rich zones can be a potential bottleneck for electron/hole transport due to increase in transport pathway length. Analysis in terms of fraction of intra- and interchain charge transports revealed that processing schedule affects the average vertical orientation of polymer chains, which may be crucial for enhanced charge transport, nongeminate recombination, and charge collection. The present study establishes a more detailed link between processing and morphology by combining multiscale molecular

  18. Analyzing and tailoring spectra of arbitrary microring resonator arrays based on six transfer cells and simulated annealing algorithm

    Institute of Scientific and Technical Information of China (English)

    Xiaobei Zhang; Yunhong Ding; Wei Hong; Xinliang Zhang; Dexiu Huang

    2009-01-01

    A simple approach based on six transfer cells and simulated annealing algorithm for analyzing and tailoring the spectra of arbitrary microring resonator arrays is presented. Coupling coefficients, ring sizes, and waveguide lengths of microring resonator arrays can be arbitrary in this approach. After developing this approach, several examples are demonstrated and optimized for various configurations of microring resonator arrays. Simulation results show that this approach is intuitive, efficient, and intelligent for applications based on microring resonator arrays.

  19. Self-adaptive genetic algorithms with simulated binary crossover.

    Science.gov (United States)

    Deb, K; Beyer, H G

    2001-01-01

    Self-adaptation is an essential feature of natural evolution. However, in the context of function optimization, self-adaptation features of evolutionary search algorithms have been explored mainly with evolution strategy (ES) and evolutionary programming (EP). In this paper, we demonstrate the self-adaptive feature of real-parameter genetic algorithms (GAs) using a simulated binary crossover (SBX) operator and without any mutation operator. The connection between the working of self-adaptive ESs and real-parameter GAs with the SBX operator is also discussed. Thereafter, the self-adaptive behavior of real-parameter GAs is demonstrated on a number of test problems commonly used in the ES literature. The remarkable similarity in the working principle of real-parameter GAs and self-adaptive ESs shown in this study suggests the need for emphasizing further studies on self-adaptive GAs. PMID:11382356
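
    For illustration, the SBX operator referenced above can be written in a few lines. The sketch below follows the standard description of simulated binary crossover from the evolutionary-computation literature; the function names and the parameter values (distribution index eta, crossover probability pc) are illustrative choices, not values taken from the paper.

```python
import random

def sbx_pair(p1, p2, eta=2.0):
    """Simulated binary crossover (SBX) applied to one pair of real-valued genes.

    eta is the distribution index: a small eta spreads children far from the
    parents (exploration), a large eta keeps them close (exploitation).
    """
    u = random.random()
    if u <= 0.5:
        beta = (2.0 * u) ** (1.0 / (eta + 1.0))
    else:
        beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta + 1.0))
    c1 = 0.5 * ((1.0 + beta) * p1 + (1.0 - beta) * p2)
    c2 = 0.5 * ((1.0 - beta) * p1 + (1.0 + beta) * p2)
    return c1, c2

def sbx(parent1, parent2, eta=2.0, pc=0.9):
    """Apply SBX gene-wise to two real-parameter vectors with probability pc."""
    child1, child2 = list(parent1), list(parent2)
    if random.random() < pc:
        for i, (a, b) in enumerate(zip(parent1, parent2)):
            child1[i], child2[i] = sbx_pair(a, b, eta)
    return child1, child2
```

    The parent-centric spread controlled by eta is what gives real-parameter GAs with SBX the ES-like, self-adaptive behaviour discussed in the abstract.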

  20. PKA spectral effects on subcascade structures and free defect survival ratio as estimated by cascade-annealing computer simulation

    International Nuclear Information System (INIS)

    The free defect survival ratio is calculated by ''cascade-annealing'' computer simulation using the MARLOWE and modified DAIQUIRI codes in various cases of Primary Knock-on Atom (PKA) spectra. The number of subcascades is calculated by ''cut-off'' calculation using MARLOWE. The adequacy of these methods is checked by comparing the results with experiments (surface segregation measurements and Transmission Electron Microscope cascade defect observations). The correlation using the weighted average recoil energy as a parameter shows that the saturation of the free defect survival ratio at high PKA energies has a close relation to the cascade splitting into subcascades. (author)

  1. A Gradient-Simulated Annealing Algorithm of Pre-location-Based Best Fitting of Blank to Complex Surfaces Machining

    Institute of Scientific and Technical Information of China (English)

    MA Li-ming; JIANG Hong; WANG Xiao-chun

    2004-01-01

    The algorithm is divided into two steps. The first step pre-locates the blank by aligning its centre of gravity and approximate normal vector with those of destination surfaces, with largest overlap of projections of two objects on a plane perpendicular to the normal vector. The second step is optimizing an objective function by means of gradient-simulated annealing algorithm to get the best matching of a set of distributed points on the blank and destination surfaces. An example for machining hydroelectric turbine blades is given to verify the effectiveness of algorithm.
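
    The second, optimization step described above builds on the generic Metropolis-based simulated annealing loop. The skeleton below is a minimal, generic sketch of that loop, not the authors' gradient-simulated annealing implementation; the objective, neighbour move and cooling parameters are placeholders.

```python
import math
import random

def simulated_annealing(objective, x0, neighbour,
                        t0=1.0, t_min=1e-4, alpha=0.95, moves_per_t=100):
    """Generic Metropolis-based simulated annealing skeleton (minimisation)."""
    x, fx = x0, objective(x0)
    best, fbest = x, fx
    t = t0
    while t > t_min:
        for _ in range(moves_per_t):
            y = neighbour(x)
            fy = objective(y)
            # Downhill moves are always accepted; uphill moves with Boltzmann probability.
            if fy < fx or random.random() < math.exp(-(fy - fx) / t):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
        t *= alpha  # geometric cooling schedule
    return best, fbest

# Example: minimise a 1-D quadratic with Gaussian-perturbation neighbours
if __name__ == "__main__":
    best, fbest = simulated_annealing(
        objective=lambda x: (x - 3.0) ** 2,
        x0=10.0,
        neighbour=lambda x: x + random.gauss(0.0, 0.5),
    )
    print(best, fbest)
```

    A gradient-assisted variant, as used in the paper, would bias the neighbour move along the local descent direction rather than drawing it blindly.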

  2. Damage Detection Using the Modified Residual Force Vector and the Simulated Annealing Algorithm (SA)

    Directory of Open Access Journals (Sweden)

    Óscar Begambre

    2010-01-01

    Full Text Available In this work, the Simulated Annealing algorithm (SA) is employed to solve the inverse problem of damage detection in beams using modal data polluted with noise. The formulation of the objective function for the SA-based optimization procedure is founded on the modified residual force method. The SA used in this study outperformed a genetic algorithm (GA) on two difficult benchmark functions reported in the international literature. The proposed structural integrity assessment procedure was confirmed and validated numerically using Euler-Bernoulli beam theory and Finite Element Models (FEM) of cantilever and free-free beams.

  3. Adaptive kernel methods to simulate quantum phase space flow

    Directory of Open Access Journals (Sweden)

    H.López

    2006-01-01

    Full Text Available A technique for simulating quantum dynamics in phase space is discussed. It makes use of ensembles of classical trajectories to approximate the distribution functions and their derivatives by implementing Adaptive Kernel Density Estimation. It is found to improve the accuracy and stability of the simulations compared to more conventional particle methods. Formulation of the method in higher dimensions is straightforward.

  4. A market based active/reactive dispatch including transformer taps and reactor and capacitor banks using Simulated Annealing

    International Nuclear Information System (INIS)

    This paper describes an optimization model to be used by System Operators in order to validate the economic schedules obtained by Market Operators together with the injections from Bilateral Contracts. These studies will be performed off-line in the day before operation and the developed model is based on adjustment bids submitted by generators and loads and it is used by System Operators if that is necessary to enforce technical or security constraints. This model corresponds to an enhancement of an approach described in a previous paper and it now includes discrete components as transformer taps and reactor and capacitor banks. The resulting mixed integer formulation is solved using Simulated Annealing, a well known metaheuristic specially suited for combinatorial problems. Once the Simulated Annealing converges and the values of the discrete variables are fixed, the resulting non-linear continuous problem is solved using Sequential Linear Programming to get the final solution. The developed model corresponds to an AC version, it includes constraints related with the capability diagram of synchronous generators and variables allowing the computation of the active power required to balance active losses. Finally, the paper includes a Case Study based on the IEEE 118 bus system to illustrate the results that it is possible to obtain and their interest. (author)

  5. A market based active/reactive dispatch including transformer taps and reactor and capacitor banks using Simulated Annealing

    Energy Technology Data Exchange (ETDEWEB)

    Gomes, Mario Helder [Departamento de Engenharia Electrotecnica, Instituto Politecnico de Tomar, Quinta do Contador, Estrada da Serra, 2300 Tomar (Portugal); Saraiva, Joao Tome [INESC Porto, Faculdade de Engenharia, Universidade do Porto, Campus da FEUP, Rua Dr. Roberto Frias, 4200-465 Porto (Portugal)

    2009-06-15

    This paper describes an optimization model to be used by System Operators in order to validate the economic schedules obtained by Market Operators together with the injections from Bilateral Contracts. These studies will be performed off-line in the day before operation and the developed model is based on adjustment bids submitted by generators and loads and it is used by System Operators if that is necessary to enforce technical or security constraints. This model corresponds to an enhancement of an approach described in a previous paper and it now includes discrete components as transformer taps and reactor and capacitor banks. The resulting mixed integer formulation is solved using Simulated Annealing, a well known metaheuristic specially suited for combinatorial problems. Once the Simulated Annealing converges and the values of the discrete variables are fixed, the resulting non-linear continuous problem is solved using Sequential Linear Programming to get the final solution. The developed model corresponds to an AC version, it includes constraints related with the capability diagram of synchronous generators and variables allowing the computation of the active power required to balance active losses. Finally, the paper includes a Case Study based on the IEEE 118 bus system to illustrate the results that it is possible to obtain and their interest. (author)

  6. Simulation and Rapid Prototyping of Adaptive Control Systems using the Adaptive Blockset for Simulink

    DEFF Research Database (Denmark)

    Ravn, Ole

    1998-01-01

    The paper describes the design considerations and implementational aspects of the Adaptive Blockset for Simulink which has been developed in a prototype implementation. The concept behind the Adaptive Blockset for Simulink is to bridge the gap between simulation and prototype controller implementation. This is done using the code generation capabilities of Real Time Workshop in combination with C s-function blocks for adaptive control in Simulink. In the paper the design of each group of blocks normally found in adaptive controllers is outlined. The block types are identification, controller design, controller, and state variable filter. The use of the Adaptive Blockset is demonstrated using a simple laboratory setup. Both the use of the blockset for simulation and for rapid prototyping of a real-time controller are shown.

  7. 1-Dimensional simulation of thermal annealing in a commercial nuclear power plant reactor pressure vessel wall section

    Energy Technology Data Exchange (ETDEWEB)

    Nakos, J.T.; Rosinski, S.T.; Acton, R.U.

    1994-11-01

    The objective of this work was to provide experimental heat transfer boundary condition and reactor pressure vessel (RPV) section thermal response data that can be used to benchmark computer codes that simulate thermal annealing of RPVs. This specific project was designed to provide the Electric Power Research Institute (EPRI) with experimental data that could be used to support the development of a thermal annealing model. A secondary benefit is to provide additional experimental data (e.g., thermal response of concrete reactor cavity wall) that could be of use in an annealing demonstration project. The setup comprised a heater assembly, a 1.2 m x 1.2 m x 17.1 cm thick [4 ft x 4 ft x 6.75 in] section of an RPV (A533B ferritic steel with stainless steel cladding), a mockup of the "mirror" insulation between the RPV and the concrete reactor cavity wall, and a 25.4 cm [10 in] thick concrete wall, 2.1 in x 2.1 in [10 ft x 10 ft] square. Experiments were performed at temperature heat-up/cooldown rates of 7, 14, and 28°C/hr [12.5, 25, and 50°F/hr] as measured on the heated face. A peak temperature of 454°C [850°F] was maintained on the heated face until the concrete wall temperature reached equilibrium. Results are most representative of those RPV locations where the heat transfer would be 1-dimensional. Temperature was measured at multiple locations on the heated and unheated faces of the RPV section and the concrete wall. Incident heat flux was measured on the heated face, and absorbed heat flux estimates were generated from temperature measurements and an inverse heat conduction code. Through-wall temperature differences, concrete wall temperature response, heat flux absorbed into the RPV surface and incident on the surface are presented. All of these data are useful to modelers developing codes to simulate RPV annealing.

  8. 1-Dimensional simulation of thermal annealing in a commercial nuclear power plant reactor pressure vessel wall section

    International Nuclear Information System (INIS)

    The objective of this work was to provide experimental heat transfer boundary condition and reactor pressure vessel (RPV) section thermal response data that can be used to benchmark computer codes that simulate thermal annealing of RPVs. This specific project was designed to provide the Electric Power Research Institute (EPRI) with experimental data that could be used to support the development of a thermal annealing model. A secondary benefit is to provide additional experimental data (e.g., thermal response of concrete reactor cavity wall) that could be of use in an annealing demonstration project. The setup comprised a heater assembly, a 1.2 m x 1.2 m x 17.1 cm thick [4 ft x 4 ft x 6.75 in] section of an RPV (A533B ferritic steel with stainless steel cladding), a mockup of the "mirror" insulation between the RPV and the concrete reactor cavity wall, and a 25.4 cm [10 in] thick concrete wall, 2.1 in x 2.1 in [10 ft x 10 ft] square. Experiments were performed at temperature heat-up/cooldown rates of 7, 14, and 28°C/hr [12.5, 25, and 50°F/hr] as measured on the heated face. A peak temperature of 454°C [850°F] was maintained on the heated face until the concrete wall temperature reached equilibrium. Results are most representative of those RPV locations where the heat transfer would be 1-dimensional. Temperature was measured at multiple locations on the heated and unheated faces of the RPV section and the concrete wall. Incident heat flux was measured on the heated face, and absorbed heat flux estimates were generated from temperature measurements and an inverse heat conduction code. Through-wall temperature differences, concrete wall temperature response, heat flux absorbed into the RPV surface and incident on the surface are presented. All of these data are useful to modelers developing codes to simulate RPV annealing.

  9. Energy management of power-split plug-in hybrid electric vehicles based on simulated annealing and Pontryagin's minimum principle

    Science.gov (United States)

    Chen, Zheng; Mi, Chunting Chris; Xia, Bing; You, Chenwen

    2014-12-01

    In this paper, an energy management method is proposed for a power-split plug-in hybrid electric vehicle (PHEV). Through analyzing the PHEV powertrain, a series of quadratic equations are employed to approximate the vehicle's fuel-rate, using battery current as the input. Pontryagin's Minimum Principle (PMP) is introduced to find the battery current commands by solving the Hamiltonian function. Simulated Annealing (SA) algorithm is applied to calculate the engine-on power and the maximum current coefficient. Moreover, the battery state of health (SOH) is introduced to extend the application of the proposed algorithm. Simulation results verified that the proposed algorithm can reduce fuel-consumption compared to charge-depleting (CD) and charge-sustaining (CS) mode.
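
    A minimal sketch of the instantaneous PMP step described above, assuming the fuel-rate has been fitted as a quadratic in battery current and that the battery state-of-charge obeys dSOC/dt = -I/Q: the commanded current at a given costate is the minimiser of the Hamiltonian over the admissible current range. The coefficient names, limits, and the grid search are placeholders, not values or code from the paper.

```python
import numpy as np

def optimal_battery_current(costate, a, b, c, capacity_q, i_min, i_max, n=200):
    """Minimise the PMP Hamiltonian H(I) = fuel_rate(I) + costate * dSOC/dt.

    fuel_rate(I) is approximated as a*I**2 + b*I + c (quadratic fit), and the
    state-of-charge dynamics are assumed to be dSOC/dt = -I / capacity_q.
    """
    currents = np.linspace(i_min, i_max, n)
    fuel_rate = a * currents**2 + b * currents + c
    hamiltonian = fuel_rate + costate * (-currents / capacity_q)
    return currents[np.argmin(hamiltonian)]
```

    In the approach described above, quantities such as the engine-on power and the maximum current coefficient that would bound such a search are themselves tuned by simulated annealing rather than fixed by hand.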

  10. Adaptive LES Methodology for Turbulent Flow Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Oleg V. Vasilyev

    2008-06-12

    Although turbulent flows are common in the world around us, a solution to the fundamental equations that govern turbulence still eludes the scientific community. Turbulence has often been called one of the last unsolved problems in classical physics, yet it is clear that the need to accurately predict the effect of turbulent flows impacts virtually every field of science and engineering. As an example, a critical step in making modern computational tools useful in designing aircraft is to be able to accurately predict the lift, drag, and other aerodynamic characteristics in numerical simulations in a reasonable amount of time. Simulations that take months to years to complete are much less useful to the design cycle. Much work has been done toward this goal (Lee-Rausch et al. 2003, Jameson 2003) and as cost effective accurate tools for simulating turbulent flows evolve, we will all benefit from new scientific and engineering breakthroughs. The problem of simulating high Reynolds number (Re) turbulent flows of engineering and scientific interest would have been solved with the advent of Direct Numerical Simulation (DNS) techniques if unlimited computing power, memory, and time could be applied to each particular problem. Yet, given the current and near future computational resources that exist and a reasonable limit on the amount of time an engineer or scientist can wait for a result, the DNS technique will not be useful for more than 'unit' problems for the foreseeable future (Moin & Kim 1997, Jimenez & Moin 1991). The high computational cost for the DNS of three dimensional turbulent flows results from the fact that they have eddies of significant energy in a range of scales from the characteristic length scale of the flow all the way down to the Kolmogorov length scale. The actual cost of doing a three dimensional DNS scales as Re^(9/4) due to the large disparity in scales that need to be fully resolved. State-of-the-art DNS calculations of isotropic

  11. PASSATA - Object oriented numerical simulation software for adaptive optics

    CERN Document Server

    Agapito, G; Esposito, S

    2016-01-01

    We present the last version of the PyrAmid Simulator Software for Adaptive opTics Arcetri (PASSATA), an IDL and CUDA based object oriented software developed in the Adaptive Optics group of the Arcetri observatory for Monte-Carlo end-to-end adaptive optics simulations. The original aim of this software was to evaluate the performance of a single conjugate adaptive optics system for ground based telescope with a pyramid wavefront sensor. After some years of development, the current version of PASSATA is able to simulate several adaptive optics systems: single conjugate, multi conjugate and ground layer, with Shack Hartmann and Pyramid wavefront sensors. It can simulate from 8m to 40m class telescopes, with diffraction limited and resolved sources at finite or infinite distance from the pupil. The main advantages of this software are the versatility given by the object oriented approach and the speed given by the CUDA implementation of the most computational demanding routines. We describe the software with its...

  12. Chaotic Simulated Annealing by A Neural Network Model with Transient Chaos

    CERN Document Server

    Chen, Luonan; Aihara, Kazuyuki

    1997-01-01

    We propose a neural network model with transient chaos, or a transiently chaotic neural network (TCNN) as an approximation method for combinatorial optimization problem, by introducing transiently chaotic dynamics into neural networks. Unlike conventional neural networks only with point attractors, the proposed neural network has richer and more flexible dynamics, so that it can be expected to have higher ability of searching for globally optimal or near-optimal solutions. A significant property of this model is that the chaotic neurodynamics is temporarily generated for searching and self-organizing, and eventually vanishes with autonomous decreasing of a bifurcation parameter corresponding to the "temperature" in usual annealing process. Therefore, the neural network gradually approaches, through the transient chaos, to dynamical structure similar to such conventional models as the Hopfield neural network which converges to a stable equilibrium point. Since the optimization process of the transiently chaoti...

  13. Hybrid Strategy of Particle Swarm Optimization and Simulated Annealing for Optimizing Orthomorphisms

    Institute of Scientific and Technical Information of China (English)

    Tong Yan; Zhang Huanguo

    2012-01-01

    Orthomorphism on F2^n is a kind of elementary permutation with good cryptographic properties. This paper proposes a hybrid strategy of Particle Swarm Optimization (PSO) and Simulated Annealing (SA) for finding orthomorphisms with good cryptographic properties. By experiments based on this strategy, we obtain, for the first time in the open literature, orthomorphisms on F2^n (n = 5, 6, 7, 9, 10) with good cryptographic properties, and the optimal orthomorphism on F2^8 found in this paper also does better than the one proposed by Feng Dengguo et al. in the stream cipher Loiss in difference uniformity, algebraic degree, algebraic immunity and corresponding permutation polynomial degree. The PSO-SA hybrid strategy for optimizing orthomorphisms in this paper makes the design of orthomorphisms with good cryptographic properties automated, efficient and convenient, and thus provides a new approach to designing orthomorphisms.

  14. A real-time simulation facility for astronomical adaptive optics

    CERN Document Server

    Basden, Alastair

    2014-01-01

    In this paper we introduce the concept of real-time hardware-in-the-loop simulation for astronomical adaptive optics, and present the case for the requirement for such a facility. This real-time simulation, when linked with an adaptive optics real-time control system, provides an essential tool for the validation, verification and integration of the Extremely Large Telescope real-time control systems prior to commissioning at the telescope. We demonstrate that such a facility is crucial for the success of the future extremely large telescopes.

  15. A Biclustering Optimization Algorithm Based on Simulated Annealing and the Cultural Algorithm

    Institute of Scientific and Technical Information of China (English)

    朱娴; 马卫

    2011-01-01

    A bicluster is a grouping of a subset of genes and a subset of conditions that exhibits highly correlated expression activity across both rows and columns, i.e., a submatrix of the gene expression data matrix. A hybrid optimization algorithm based on simulated annealing and the cultural algorithm is presented: simulated annealing is embedded in the cultural algorithm framework as an evolution step of the population space, which helps overcome the probabilistic-jump drawback of simulated annealing. Experiments on a yeast dataset indicate that, at the cost of only a modest increase in computation time, the new algorithm searches out biclusters of high quality.

  16. SimulCAT: Windows Software for Simulating Computerized Adaptive Test Administration

    Science.gov (United States)

    Han, Kyung T.

    2012-01-01

    Most, if not all, computerized adaptive testing (CAT) programs use simulation techniques to develop and evaluate CAT program administration and operations, but such simulation tools are rarely available to the public. Up to now, several software tools have been available to conduct CAT simulations for research purposes; however, these existing…

  17. A Single-Machine Two-Agent Scheduling Problem by a Branch-and-Bound and Three Simulated Annealing Algorithms

    Directory of Open Access Journals (Sweden)

    Shangchia Liu

    2015-01-01

    Full Text Available In the field of distributed decision making, different agents share a common processing resource, and each agent wants to minimize a cost function depending on its jobs only. These issues arise in different application contexts, including real-time systems, integrated service networks, industrial districts, and telecommunication systems. Motivated by its importance for practical applications, we consider two-agent scheduling on a single machine where the objective is to minimize the total completion time of the jobs of the first agent, with the restriction that the total completion time of the jobs of the second agent may not exceed a given upper bound. For solving the proposed problem, a branch-and-bound algorithm and three simulated annealing algorithms are developed. In addition, extensive computational experiments are conducted to test the performance of the algorithms.
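
    As an illustration of how a simulated annealing heuristic can be set up for this kind of two-agent problem, the sketch below encodes a schedule as a permutation of all jobs, minimises agent A's total completion time, and enforces agent B's bound through a penalty term. It is a generic formulation for illustration only; the paper's branch-and-bound and its three SA variants are not reproduced here.

```python
import math
import random

def two_agent_sa(jobs_a, jobs_b, ub_b, t0=10.0, alpha=0.95, moves_per_t=200, t_min=1e-3):
    """Illustrative SA for two-agent single-machine scheduling.

    jobs_a, jobs_b: processing times of the jobs of agents A and B.
    ub_b: upper bound on the total completion time of agent B's jobs.
    Minimises agent A's total completion time; infeasibility is penalised.
    """
    jobs = [(p, "A") for p in jobs_a] + [(p, "B") for p in jobs_b]

    def cost(seq):
        clock = ct_a = ct_b = 0.0
        for p, agent in seq:
            clock += p
            if agent == "A":
                ct_a += clock
            else:
                ct_b += clock
        return ct_a + 1e6 * max(0.0, ct_b - ub_b)   # soft constraint on agent B

    seq = jobs[:]
    random.shuffle(seq)
    f = cost(seq)
    best, best_f, t = seq[:], f, t0
    while t > t_min:
        for _ in range(moves_per_t):
            i, j = random.sample(range(len(seq)), 2)   # swap-neighbourhood move
            cand = seq[:]
            cand[i], cand[j] = cand[j], cand[i]
            fc = cost(cand)
            if fc < f or random.random() < math.exp(-(fc - f) / t):
                seq, f = cand, fc
                if f < best_f:
                    best, best_f = seq[:], f
        t *= alpha
    return best, best_f
```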

  18. Evaluation and selection of the ship collaborative design resources based on AHP and genetic and simulated annealing algorithm

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    The characteristics of the design resources in ship collaborative design are described and a hierarchical model for the evaluation of the design resources is established. A comprehensive evaluation of the co-designers of the collaborative design resources is carried out from different aspects using the Analytic Hierarchy Process (AHP), and according to the evaluation results the candidates are determined. Meanwhile, based on the principle of minimum cost and starting from the relations between the design tasks and the corresponding co-designers, an optimizing selection model of the collaborators is established and a novel genetic algorithm combined with simulated annealing is proposed to realize the optimization. It overcomes the defects of the genetic algorithm, which may lead to premature convergence and local optima if used alone. The application of this method in a ship collaborative design system proves its feasibility and provides a quantitative method for the optimizing selection of the design resources.

  19. Study of structure and spectroscopy of water-hydroxide ion clusters: A combined simulated annealing and DFT-based approach

    Indian Academy of Sciences (India)

    Satyajit Guha; Soumya Ganguly Neogi; Pinaki Chaudhury

    2014-05-01

    In this paper, we explore the use of a stochastic optimizer, namely simulated annealing (SA), followed by a density functional theory (DFT)-based strategy for evaluating the structure and infrared spectroscopy of (H2O)n·OH− clusters where n = 1-6. We have shown that the use of SA can generate both global and local minimum structures of these cluster systems. We also perform DFT calculations, using the optimized coordinates obtained from SA as input, and extract the IR spectra of these systems. Finally, we compare our results with available theoretical and experimental data. There is a close correspondence between the computed frequencies from our theoretical study and the available experimental data. To further aid in understanding the details of the hydrogen bonds formed, we performed atoms-in-molecules calculations on all the global minimum structures to evaluate the relevant electron densities and critical points.

  20. Hydrogen isotope profiling of functionalised polystyrene blends using RBS/ERD and RBS/NRA with simulated annealing data analysis

    International Nuclear Information System (INIS)

    Full text: The behaviour of thin surface-active polystyrene (PS) films on silicon is being investigated. These films have amine functional groups which are attracted to the solid interface if they are fluorinated (N-PSF), and to the air interface if they are not (N-PS). To determine the interface enrichment of the species a 'sandwich' of deuterated- (DPS) and hydrogenated (HPS) films was prepared. 1.5MeV 4He ERD/RBS together with 0.7MeV 3He nuclear reaction analysis (NRA) to determine the D profile were applied to the films, and a self consistent analysis of all three spectra using the simulated annealing algorithm was made for each sample. The ERD data contains both H and D recoils, but the D profile does not have such good depth resolution as in the NRA data. The results are combined with data from neutron reflectivity

  1. ADAPTIVE ANNEALED IMPORTANCE SAMPLING FOR MULTIMODAL POSTERIOR EXPLORATION AND MODEL SELECTION WITH APPLICATION TO EXTRASOLAR PLANET DETECTION

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Bin, E-mail: bins@ieee.org [School of Computer Science and Technology, Nanjing University of Posts and Telecommunications, Nanjing 210023 (China)

    2014-07-01

    We describe an algorithm that can adaptively provide mixture summaries of multimodal posterior distributions. The parameter space of the involved posteriors ranges in size from a few dimensions to dozens of dimensions. This work was motivated by an astrophysical problem called extrasolar planet (exoplanet) detection, wherein the computation of stochastic integrals that are required for Bayesian model comparison is challenging. The difficulty comes from the highly nonlinear models that lead to multimodal posterior distributions. We resort to importance sampling (IS) to estimate the integrals, and thus translate the problem to be how to find a parametric approximation of the posterior. To capture the multimodal structure in the posterior, we initialize a mixture proposal distribution and then tailor its parameters elaborately to make it resemble the posterior to the greatest extent possible. We use the effective sample size (ESS) calculated based on the IS draws to measure the degree of approximation. The bigger the ESS is, the better the proposal resembles the posterior. A difficulty within this tailoring operation lies in the adjustment of the number of mixing components in the mixture proposal. Brute force methods just preset it as a large constant, which leads to an increase in the required computational resources. We provide an iterative delete/merge/add process, which works in tandem with an expectation-maximization step to tailor such a number online. The efficiency of our proposed method is tested via both simulation studies and real exoplanet data analysis.
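
    The effective sample size used above to measure how well the mixture proposal resembles the posterior is the standard estimate computed from self-normalised importance weights. A small sketch (the function names are ours, not the paper's code):

```python
import numpy as np

def effective_sample_size(log_target, log_proposal, draws):
    """Effective sample size (ESS) of an importance-sampling approximation.

    draws: samples from the proposal; log_target / log_proposal evaluate the
    unnormalised log densities at those samples. An ESS close to the number of
    draws means the proposal resembles the target well.
    """
    log_w = np.array([log_target(x) - log_proposal(x) for x in draws])
    log_w -= log_w.max()              # stabilise before exponentiating
    w = np.exp(log_w)
    w /= w.sum()                      # self-normalised weights
    return 1.0 / np.sum(w ** 2)
```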

  2. Permanent prostate implant using high activity seeds and inverse planning with fast simulated annealing algorithm: A 12-year Canadian experience

    International Nuclear Information System (INIS)

    Purpose: To report outcomes and toxicity of the first Canadian permanent prostate implant program. Methods and Materials: 396 consecutive patients (Gleason ≤6, initial prostate specific antigen (PSA) ≤10 and stage T1-T2a disease) were implanted between June 1994 and December 2001. The median follow-up was 60 months (maximum, 136 months). All patients were planned with the fast-simulated annealing inverse planning algorithm with high activity seeds (>0.76 U). Acute and late toxicity is reported for the first 213 patients using a modified RTOG toxicity scale. The Kaplan-Meier biochemical failure-free survival (bFFS) is reported according to the ASTRO and Houston definitions. Results: The bFFS at 60 months was 88.5% (90.5%) according to the ASTRO (Houston) definition and 91.4% (94.6%) in the low risk group (initial PSA ≤10 and Gleason ≤6 and Stage ≤T2a). Risk factors statistically associated with bFFS were: initial PSA >10, a Gleason score of 7-8, and stage T2b-T3. The mean D90 was 151 ± 36.1 Gy. The mean V100 was 85.4 ± 8.5% with a mean V150 of 60.1 ± 12.3%. Overall, the implants were well tolerated. In the first 6 months, 31.5% of the patients were free of genitourinary symptoms (GUs), 12.7% had Grade 3 GUs; 91.6% were free of gastrointestinal symptoms (GIs). After 6 months, 54.0% were GUs free, 1.4% had Grade 3 GUs; 95.8% were GIs free. Conclusion: Inverse planning with fast simulated annealing and high activity seeds gives a 5-year bFFS that is comparable with the best published series, with a low toxicity profile.

  3. The relative entropy is fundamental to adaptive resolution simulations

    Science.gov (United States)

    Kreis, Karsten; Potestio, Raffaello

    2016-07-01

    Adaptive resolution techniques are powerful methods for the efficient simulation of soft matter systems in which they simultaneously employ atomistic and coarse-grained (CG) force fields. In such simulations, two regions with different resolutions are coupled with each other via a hybrid transition region, and particles change their description on the fly when crossing this boundary. Here we show that the relative entropy, which provides a fundamental basis for many approaches in systematic coarse-graining, is also an effective instrument for the understanding of adaptive resolution simulation methodologies. We demonstrate that the use of coarse-grained potentials which minimize the relative entropy with respect to the atomistic system can help achieve a smoother transition between the different regions within the adaptive setup. Furthermore, we derive a quantitative relation between the width of the hybrid region and the seamlessness of the coupling. Our results do not only shed light on the what and how of adaptive resolution techniques but will also help setting up such simulations in an optimal manner.

  4. Adaptive resolution simulation of an atomistic protein in MARTINI water

    NARCIS (Netherlands)

    Zavadlav, Julija; Melo, Manuel Nuno; Marrink, Siewert J.; Praprotnik, Matej

    2014-01-01

    We present an adaptive resolution simulation of protein G in multiscale water. We couple atomistic water around the protein with mesoscopic water, where four water molecules are represented with one coarse-grained bead, farther away. We circumvent the difficulties that arise from coupling to the coa

  5. The behavior of adaptive bone-remodeling simulation models

    NARCIS (Netherlands)

    H.H. Weinans (Harrie); R. Huiskes (Rik); H.J. Grootenboer

    1992-01-01

    textabstractThe process of adaptive bone remodeling can be described mathematically and simulated in a computer model, integrated with the finite element method. In the model discussed here, cortical and trabecular bone are described as continuous materials with variable density. The remodeling rule

  6. Adaptive Sampling Algorithms for Probabilistic Risk Assessment of Nuclear Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Diego Mandelli; Dan Maljovec; Bei Wang; Valerio Pascucci; Peer-Timo Bremer

    2013-09-01

    Nuclear simulations are often computationally expensive, time-consuming, and high-dimensional with respect to the number of input parameters. Thus exploring the space of all possible simulation outcomes is infeasible using finite computing resources. During simulation-based probabilistic risk analysis, it is important to discover the relationship between a potentially large number of input parameters and the output of a simulation using as few simulation trials as possible. This is a typical context for performing adaptive sampling where a few observations are obtained from the simulation, a surrogate model is built to represent the simulation space, and new samples are selected based on the model constructed. The surrogate model is then updated based on the simulation results of the sampled points. In this way, we attempt to gain the most information possible with a small number of carefully selected sampled points, limiting the number of expensive trials needed to understand features of the simulation space. We analyze the specific use case of identifying the limit surface, i.e., the boundaries in the simulation space between system failure and system success. In this study, we explore several techniques for adaptively sampling the parameter space in order to reconstruct the limit surface. We focus on several adaptive sampling schemes. First, we seek to learn a global model of the entire simulation space using prediction models or neighborhood graphs and extract the limit surface as an iso-surface of the global model. Second, we estimate the limit surface by sampling in the neighborhood of the current estimate based on topological segmentations obtained locally. Our techniques draw inspirations from topological structure known as the Morse-Smale complex. We highlight the advantages and disadvantages of using a global prediction model versus local topological view of the simulation space, comparing several different strategies for adaptive sampling in both

  7. Adaptive thinking & leadership simulation game training for special forces officers.

    Energy Technology Data Exchange (ETDEWEB)

    Raybourn, Elaine Marie; Mendini, Kip (USA JFKSWCS DOTD, Ft. Bragg, NC); Heneghan, Jerry; Deagle, Edwin (USA JFKSWCS DOTD, Ft. Bragg, NC)

    2005-07-01

    Complex problem solving approaches and novel strategies employed by the military at the squad, team, and commander level are often best learned experimentally. Since live action exercises can be costly, advances in simulation game training technology offer exciting ways to enhance current training. Computer games provide an environment for active, critical learning. Games open up possibilities for simultaneous learning on multiple levels; players may learn from contextual information embedded in the dynamics of the game, the organic process generated by the game, and through the risks, benefits, costs, outcomes, and rewards of alternative strategies that result from decision making. In the present paper we discuss a multiplayer computer game simulation created for the Adaptive Thinking & Leadership (ATL) Program to train Special Forces Team Leaders. The ATL training simulation consists of a scripted single-player and an immersive multiplayer environment for classroom use which leverages immersive computer game technology. We define adaptive thinking as consisting of competencies such as negotiation and consensus building skills, the ability to communicate effectively, analyze ambiguous situations, be self-aware, think innovatively, and critically use effective problem solving skills. Each of these competencies is an essential element of leader development training for the U.S. Army Special Forces. The ATL simulation is used to augment experiential learning in the curriculum for the U.S. Army JFK Special Warfare Center & School (SWCS) course in Adaptive Thinking & Leadership. The school is incorporating the ATL simulation game into two additional training pipelines (PSYOPS and Civil Affairs Qualification Courses) that are also concerned with developing cultural awareness, interpersonal communication adaptability, and rapport-building skills. In the present paper, we discuss the design, development, and deployment of the training simulation, and emphasize how the

  8. Adaptive deployment of model reductions for tau-leaping simulation.

    Science.gov (United States)

    Wu, Sheng; Fu, Jin; Petzold, Linda R

    2015-05-28

    Multiple time scales in cellular chemical reaction systems often render the tau-leaping algorithm inefficient. Various model reductions have been proposed to accelerate tau-leaping simulations. However, these are often identified and deployed manually, requiring expert knowledge. This is time-consuming and prone to error. In previous work, we proposed a methodology for automatic identification and validation of model reduction opportunities for tau-leaping simulation. Here, we show how the model reductions can be automatically and adaptively deployed during the time course of a simulation. For multiscale systems, this can result in substantial speedups.
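
    For context, a single explicit tau-leaping step has the following shape; the model reductions discussed above replace groups of reactions or species with cheaper surrogates, but each leap remains a Poisson update of the state. The function below is a generic sketch, not the authors' implementation.

```python
import numpy as np

def tau_leap_step(state, rates, stoich, tau, rng=np.random.default_rng()):
    """One explicit tau-leaping step for a stochastic reaction network.

    state:  copy-number vector (length n_species)
    rates:  callable returning the propensity of each reaction given the state
    stoich: (n_reactions x n_species) stoichiometric change matrix
    tau:    leap size; each reaction fires a Poisson number of times.
    """
    a = rates(state)
    firings = rng.poisson(a * tau)            # number of firings per reaction
    return state + firings @ stoich

# Example: birth-death process (X -> X+1 at rate k1, X -> X-1 at rate k2*X)
if __name__ == "__main__":
    stoich = np.array([[1], [-1]])
    rates = lambda s: np.array([5.0, 0.1 * s[0]])
    x = np.array([40])
    for _ in range(100):
        x = np.maximum(tau_leap_step(x, rates, stoich, tau=0.1), 0)
    print(x)
```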

  9. Adaptive deployment of model reductions for tau-leaping simulation

    Science.gov (United States)

    Wu, Sheng; Fu, Jin; Petzold, Linda R.

    2015-05-01

    Multiple time scales in cellular chemical reaction systems often render the tau-leaping algorithm inefficient. Various model reductions have been proposed to accelerate tau-leaping simulations. However, these are often identified and deployed manually, requiring expert knowledge. This is time-consuming and prone to error. In previous work, we proposed a methodology for automatic identification and validation of model reduction opportunities for tau-leaping simulation. Here, we show how the model reductions can be automatically and adaptively deployed during the time course of a simulation. For multiscale systems, this can result in substantial speedups.

  10. Simulated annealing with a potential function with discontinuous gradient on Rd

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    In this paper, we have proven that the simulated annealing process with a potential function on Rd, the gradient of which is discontinuous, converges in probability to a neighborhood of the global minima of the potential function.
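
    For background, results of this type concern the continuous-time annealing diffusion with a logarithmically decreasing temperature. The display below is the standard formulation, given only as context; the abstract does not state the paper's exact hypotheses for a potential with discontinuous gradient, so this should not be read as the theorem proved there.

```latex
% Standard continuous-time annealing diffusion on R^d (background only):
\[
  dX_t \;=\; -\nabla U(X_t)\,dt \;+\; \sqrt{2\,T(t)}\,dW_t ,
  \qquad
  T(t) \;=\; \frac{c}{\log(2+t)} .
\]
% For c larger than a critical constant (set by the deepest non-global well of U),
% X_t converges in probability to a neighbourhood of the global minima of U.
```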

  11. The application of neural network integrated with genetic algorithm and simulated annealing for the simulation of rare earths separation processes by the solvent extraction technique using EHEHPA agent

    International Nuclear Information System (INIS)

    In the present work, a neural network has been used for mathematically modeling the equilibrium data of a mixture of two rare earth elements, namely Nd and Pr, with PC88A agent. A thermo-genetic algorithm based on the ideas of the genetic algorithm and the simulated annealing algorithm has been used in the training procedure of the neural networks, giving better results in comparison with the traditional modeling approach. The obtained neural network modeling the experimental data is further used in a computer program to simulate the solvent extraction process of the two elements Nd and Pr. Based on this computer program, various optional schemes for the separation of Nd and Pr have been investigated and proposed. (author)

  12. Parallel Genetic Algorithm / Simulated Annealing Hybrid Algorithm and its Applications

    Institute of Scientific and Technical Information of China (English)

    温平川; 徐晓东; 何先刚

    2003-01-01

    This paper presents a highly hybrid Genetic Algorithm / Simulated Annealing algorithm. The algorithm has been successfully implemented on a Beowulf PC cluster and applied to a set of standard function optimization problems. From the experimental results, it is easy to see that the proposed algorithm is not only effective but also robust.

  13. Adaptive image ray-tracing for astrophysical simulations

    CERN Document Server

    Parkin, E R

    2010-01-01

    A technique is presented for producing synthetic images from numerical simulations whereby the image resolution is adapted around prominent features. In so doing, adaptive image ray-tracing (AIR) improves the efficiency of a calculation by focusing computational effort where it is needed most. The results of test calculations show that a factor of ≳4 speed-up, and a commensurate reduction in the number of pixels required in the final image, can be achieved compared to an equivalent calculation with a fixed resolution image.

  14. A Gravitational Search Algorithm Based on Simulated Annealing

    Institute of Scientific and Technical Information of China (English)

    王立平; 肖乐意

    2014-01-01

    In the standard Gravitational Search Algorithm (GSA), the individual position-update strategy may damage good individuals, and the local search ability of the algorithm is weak; an improved algorithm is therefore proposed. The improved algorithm introduces the idea of simulated annealing into GSA: individual position updates are accepted according to the Metropolis criterion, and after the gravitational operation an annealing operation is applied to the best individual of each generation. To some extent this avoids blind individual moves and improves the local search ability, convergence speed and convergence accuracy of the algorithm. The experimental results demonstrate that the improvement strategy is effective, and that the improved algorithm has obvious advantages in convergence speed, convergence accuracy, etc.
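
    The Metropolis acceptance rule on which the improved position-update strategy relies can be stated in a few lines (an illustrative sketch, not the paper's code):

```python
import math
import random

def metropolis_accept(f_old, f_new, temperature):
    """Metropolis rule deciding whether an individual keeps its new position.

    Improving moves are always kept; worsening moves are kept with probability
    exp(-(f_new - f_old) / T), which shrinks as the temperature T is lowered.
    """
    if f_new <= f_old:
        return True
    return random.random() < math.exp(-(f_new - f_old) / temperature)
```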

  15. Simulations and measurements of annealed pyrolytic graphite-metal composite baseplates

    Science.gov (United States)

    Streb, F.; Ruhl, G.; Schubert, A.; Zeidler, H.; Penzel, M.; Flemmig, S.; Todaro, I.; Squatrito, R.; Lampke, T.

    2016-03-01

    We investigated the usability of anisotropic materials as inserts in aluminum-matrix-composite baseplates for typical high performance power semiconductor modules using finite-element simulations and transient plane source measurements. For simulations, several physical modules can be used, which are suitable for different thermal boundary conditions. By comparing different modules and options of heat transfer we found non-isothermal simulations to be closest to reality for the temperature distribution at the surface of the heat sink. We optimized the geometry of the graphite inserts for best heat dissipation and, based on these results, evaluated the thermal resistance of a typical power module using calculation-time-optimized steady-state simulations. Here we investigated the influence of thermal contact conductance (TCC) between metal matrix and inserts on the heat dissipation. We found improved heat dissipation compared to the plain metal baseplate for a TCC of 200 kW/(m²·K) and above. To verify the simulations we evaluated cast composite baseplates with two different insert geometries and measured their averaged lateral thermal conductivity using a transient plane source (HotDisk) technique at room temperature. For the composite baseplate we achieved local improvements in heat dissipation compared to the plain metal baseplate.

  16. Simulated annealing: in mathematical global optimization computation, hybrid with local or global search, and practical applications in crystallography and molecular modelling

    CERN Document Server

    Zhang, Jiapu

    2013-01-01

    Simulated annealing (SA) was inspired by annealing in metallurgy, a technique involving heating and controlled cooling of a material to increase the size of its crystals and reduce their defects, both of which are attributes of the material that depend on its thermodynamic free energy. In this paper, we first study SA in detail with regard to its practical implementation. Then, hybridizing pure SA with local (or global) search optimization methods allows us to design several effective and efficient global search optimization methods. In order to keep the original sense of SA, we clarify our understanding of SA in the crystallography and molecular modeling fields through studies of prion amyloid fibrils.
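
    One common way to realise the hybrid of SA with a local search is a basin-hopping-style loop in which every annealing move is followed by a local minimisation, so the Metropolis test compares local minima rather than raw perturbed points. The sketch below uses SciPy's Nelder-Mead minimiser and illustrates the pattern under these assumptions; it is not the author's specific protocol.

```python
import math
import numpy as np
from scipy.optimize import minimize

def sa_with_local_search(objective, x0, t0=1.0, alpha=0.95, outer=100, step=0.5,
                         rng=np.random.default_rng()):
    """Hybrid of simulated annealing with a local (derivative-free) search."""
    x = np.asarray(x0, dtype=float)
    x = minimize(objective, x, method="Nelder-Mead").x   # start from a local minimum
    fx = objective(x)
    best, fbest, t = x.copy(), fx, t0
    for _ in range(outer):
        cand = x + rng.normal(scale=step, size=x.shape)  # annealing perturbation
        cand = minimize(objective, cand, method="Nelder-Mead").x
        fc = objective(cand)
        # Metropolis test between the two local minima
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x.copy(), fx
        t *= alpha
    return best, fbest
```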

  17. Distribution of relaxation times from dielectric spectroscopy using Monte Carlo simulated annealing: Application to α-PVDF

    Science.gov (United States)

    Bello, A.; Laredo, E.; Grimau, M.

    1999-11-01

    The existence of a distribution of relaxation times has been widely used to describe the relaxation function versus frequency in glass-forming liquids. Several empirical distributions have been proposed, and the usual method is to fit the experimental data to a model that assumes one of these functions. Another alternative is to extract from the experimental data the discrete profile of the distribution function that best fits the experimental curve without any a priori assumption. To test this approach, a Monte Carlo algorithm using simulated annealing is used to best fit simulated dielectric loss data, ɛ''(ω), generated with Cole-Cole, Cole-Davidson, Havriliak-Negami, and Kohlrausch-Williams-Watts (KWW) functions. The relaxation times distribution, G(ln(τ)), is obtained as a histogram that follows very closely the analytical expression for the distributions that are known in these cases. Also, the temporal decay functions, φ(t), are evaluated and compared to a stretched exponential. The method is then applied to experimental data for α-polyvinylidene fluoride (PVDF) over a temperature range starting at 233 K; the value found, 87, characterizes this polymer as a relatively structurally strong material.
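
    A sketch of the histogram-extraction idea, assuming a superposition of Debye processes as the forward model (an assumption on our part; the paper's exact model may differ): one histogram bin is perturbed per move and the change is accepted with the Metropolis rule, so no analytical form of the distribution is imposed.

```python
import numpy as np

def debye_superposition(omega, tau_grid, g):
    """Dielectric loss from a discrete distribution of Debye relaxation times."""
    wt = np.outer(omega, tau_grid)
    return (g * wt / (1.0 + wt ** 2)).sum(axis=1)

def anneal_histogram(omega, eps_loss, tau_grid, steps=20000, t0=1.0, alpha=0.9995,
                     rng=np.random.default_rng()):
    """Recover the histogram g_k >= 0 that best reproduces measured eps''(omega)."""
    g = np.full(len(tau_grid), eps_loss.mean())
    cost = np.sum((debye_superposition(omega, tau_grid, g) - eps_loss) ** 2)
    t = t0
    for _ in range(steps):
        cand = g.copy()
        k = rng.integers(len(tau_grid))                       # perturb one bin
        cand[k] = max(0.0, cand[k] + rng.normal(scale=0.05 * (1.0 + cand[k])))
        c = np.sum((debye_superposition(omega, tau_grid, cand) - eps_loss) ** 2)
        if c < cost or rng.random() < np.exp(-(c - cost) / t):  # Metropolis rule
            g, cost = cand, c
        t *= alpha
    return g
```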

  18. Disaster Rescue Simulation based on Complex Adaptive Theory

    Directory of Open Access Journals (Sweden)

    Feng Jiang

    2013-05-01

    Full Text Available Disaster rescue is one of the key measures of disaster reduction. The rescue process is a complex process characterized by large scale, complicated structure, and non-linearity, and it is hard to describe and analyze with traditional methods. Based on complex adaptive theory, this paper analyzes the complex adaptation of the rescue process in terms of seven features: aggregation, nonlinearity, mobility, diversity, tagging, internal models, and building blocks. With the support of the Repast platform, an agent-based model including rescue agents and victim agents is proposed. Moreover, two simulations with different parameters are employed to examine the feasibility of the model. As a result, the proposed model is shown to be efficient in dealing with disaster rescue simulation and can provide a reference for decision making.

  19. Adaptive quantum computation in changing environments using projective simulation

    Science.gov (United States)

    Tiersch, M.; Ganahl, E. J.; Briegel, H. J.

    2015-08-01

    Quantum information processing devices need to be robust and stable against external noise and internal imperfections to ensure correct operation. In a setting of measurement-based quantum computation, we explore how an intelligent agent endowed with a projective simulator can act as controller to adapt measurement directions to an external stray field of unknown magnitude in a fixed direction. We assess the agent’s learning behavior in static and time-varying fields and explore composition strategies in the projective simulator to improve the agent’s performance. We demonstrate the applicability by correcting for stray fields in a measurement-based algorithm for Grover’s search. Thereby, we lay out a path for adaptive controllers based on intelligent agents for quantum information tasks.

  20. Nonlinear Adaptive Robust Force Control of Hydraulic Load Simulator

    Institute of Scientific and Technical Information of China (English)

    YAO Jianyong; JIAO Zongxia; YAO Bin; SHANG Yaoxing; DONG Wenbin

    2012-01-01

    This paper deals with the high performance force control of a hydraulic load simulator. Many previous works on hydraulic force control are based on linearized equations, but hydraulic inherent nonlinear properties and uncertainties prevent conventional feedback proportional-integral-derivative control from meeting high-performance requirements. In this paper, a nonlinear system model is derived and linear parameterization is made for adaptive control. Then a discontinuous projection-based nonlinear adaptive robust force controller is developed for the hydraulic load simulator. The proposed controller constructs an asymptotically stable adaptive controller and adaptation laws, which can compensate for the system nonlinearities and uncertain parameters. Meanwhile a well-designed robust controller is also developed to cope with the hydraulic system uncertain nonlinearities. The controller achieves a guaranteed transient performance and final tracking accuracy in the presence of both parametric uncertainties and uncertain nonlinearities; in the absence of uncertain nonlinearities, the scheme also achieves asymptotic tracking performance. Simulation and experiment comparative results are obtained to verify the high-performance nature of the proposed control strategy, and the tracking accuracy is greatly improved.

  1. UNFOLDING SIMULATIONS OF COLD- AND WARM-ADAPTED ELASTASES

    Directory of Open Access Journals (Sweden)

    Laura Riccardi; Elena Papaleo

    2010-11-01

    Full Text Available The Earth's surface is dominated by low temperature environments, which have been successfully colonized by several extremophilic organisms. Enzymes isolated from psychrophilic organisms are able to catalyze reactions at low temperatures at which enzymes from mesophiles or thermophiles are fully compromised. The current scenario on enzyme cold-adaptation suggests that these enzymes are characterized by higher catalytic efficiency at low temperatures, enhanced structural flexibility and lower thermostability. In the present contribution, molecular dynamics simulations in explicit solvent have been carried out at different high temperatures in order to investigate the unfolding process of cold- and warm-adapted homologous enzymes. In particular, we focused our attention on cold-adapted elastases, for which it was previously demonstrated that the psychrophilic enzyme presents higher localized flexibility in loops surrounding the catalytic site and the specificity pocket. The unfolding simulations show a slower unfolding process for the cold-adapted enzyme, but one characterized by a greater loss of intramolecular interactions and α-helices than the mesophilic counterpart.

  2. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

    Energy Technology Data Exchange (ETDEWEB)

    Sharma, Gulshan B., E-mail: gbsharma@ucalgary.ca [Emory University, Department of Radiology and Imaging Sciences, Spine and Orthopaedic Center, Atlanta, Georgia 30329 (United States); University of Pittsburgh, Swanson School of Engineering, Department of Bioengineering, Pittsburgh, Pennsylvania 15213 (United States); University of Calgary, Schulich School of Engineering, Department of Mechanical and Manufacturing Engineering, Calgary, Alberta T2N 1N4 (Canada); Robertson, Douglas D., E-mail: douglas.d.robertson@emory.edu [Emory University, Department of Radiology and Imaging Sciences, Spine and Orthopaedic Center, Atlanta, Georgia 30329 (United States); University of Pittsburgh, Swanson School of Engineering, Department of Bioengineering, Pittsburgh, Pennsylvania 15213 (United States)

    2013-07-01

    Shoulder arthroplasty success has been attributed to many factors including, bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design should withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally, simulating bone remodeling using an intact human scapula, initially resetting the scapular bone material properties to be uniform, numerically simulating sequential loading, and comparing the bone remodeling simulation results to the actual scapula’s material properties. Three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element’s remodeling stimulus was compared to its corresponding reference stimulus and its material properties modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent density were plotted and compared. Location of high and low predicted bone density was comparable to the actual specimen. High predicted bone density was greater than

  3. Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine

    International Nuclear Information System (INIS)

    Shoulder arthroplasty success has been attributed to many factors including, bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design should withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally, simulating bone remodeling using an intact human scapula, initially resetting the scapular bone material properties to be uniform, numerically simulating sequential loading, and comparing the bone remodeling simulation results to the actual scapula’s material properties. Three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element’s remodeling stimulus was compared to its corresponding reference stimulus and its material properties modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent density were plotted and compared. Location of high and low predicted bone density was comparable to the actual specimen. High predicted bone density was greater than

  4. Advanced Dynamically Adaptive Algorithms for Stochastic Simulations on Extreme Scales

    Energy Technology Data Exchange (ETDEWEB)

    Xiu, Dongbin [Purdue Univ., West Lafayette, IN (United States)

    2016-06-21

    The focus of the project is the development of mathematical methods and high-performance computational tools for stochastic simulations, with a particular emphasis on computations on extreme scales. The core of the project revolves around the design of highly efficient and scalable numerical algorithms that can adaptively and accurately, in high dimensional spaces, resolve stochastic problems with limited smoothness, even containing discontinuities.

  5. Improvement of flight simulator feeling using adaptive fuzzy backlash compensation

    OpenAIRE

    Amara, Zied; Bordeneuve-Guibé, Joël

    2007-01-01

    In this paper we addressed the problem of improving the control of DC motors used for the specific application of a 3 degrees of freedom moving base flight simulator. Indeed the presence of backlash in DC motors gearboxes induces shocks and naturally limits the flight feeling. In this paper, dynamic inversion with Fuzzy Logic is used to design an adaptive backlash compensator. The classification property of fuzzy logic techniques makes them a natural candidate for the rejection of errors indu...

  6. Quantum Annealing of Hard Problems

    OpenAIRE

    Jorg, Thomas; Krzakala, Florent; Kurchan, Jorge; Maggs, A C

    2009-01-01

    Quantum annealing is analogous to simulated annealing with a tunneling mechanism substituting for thermal activation. Its performance has been tested in numerical simulation with mixed conclusions. There is a class of optimization problems for which the efficiency can be studied analytically using techniques based on the statistical mechanics of spin glasses.

  7. The fast simulated annealing algorithm applied to the search problem in LEED

    Science.gov (United States)

    Nascimento, V. B.; de Carvalho, V. E.; de Castilho, C. M. C.; Costa, B. V.; Soares, E. A.

    2001-07-01

    In this work we present new results obtained from the application of the fast simulated annealing (FSA) algorithm to the surface structure determination of the Ag(1 1 0) and CdTe(1 1 0) systems. The influence of a control parameter, the "initial temperature", on the FSA search process was investigated. A scaling behaviour that measures the efficiency of a search method as a function of the number of parameters to be varied was obtained for the FSA algorithm, and indicated a favourable linear scaling (~N).

  8. Ensemble annealing of complex physical systems

    OpenAIRE

    Habeck, Michael

    2015-01-01

    Algorithms for simulating complex physical systems or solving difficult optimization problems often resort to an annealing process. Rather than simulating the system at the temperature of interest, an annealing algorithm starts at a temperature that is high enough to ensure ergodicity and gradually decreases it until the destination temperature is reached. This idea is used in popular algorithms such as parallel tempering and simulated annealing. A general problem with annealing methods is th...

  9. An adaptive nonlinear solution scheme for reservoir simulation

    Energy Technology Data Exchange (ETDEWEB)

    Lett, G.S. [Scientific Software - Intercomp, Inc., Denver, CO (United States)

    1996-12-31

    Numerical reservoir simulation involves solving large, nonlinear systems of PDE with strongly discontinuous coefficients. Because of the large demands on computer memory and CPU, most users must perform simulations on very coarse grids. The average properties of the fluids and rocks must be estimated on these grids. These coarse grid "effective" properties are costly to determine, and risky to use, since their optimal values depend on the fluid flow being simulated. Thus, they must be found by trial-and-error techniques, and the coarser the grid, the poorer the results. This paper describes a numerical reservoir simulator which accepts fine scale properties and automatically generates multiple levels of coarse grid rock and fluid properties. The fine grid properties and the coarse grid simulation results are used to estimate discretization errors with multilevel error expansions. These expansions are local, and identify areas requiring local grid refinement. These refinements are added adaptively by the simulator, and the resulting composite grid equations are solved by a nonlinear Fast Adaptive Composite (FAC) Grid method, with a damped Newton algorithm being used on each local grid. The nonsymmetric linear system of equations resulting from Newton's method is in turn solved by a preconditioned Conjugate Gradients-like algorithm. The scheme is demonstrated by performing fine and coarse grid simulations of several multiphase reservoirs from around the world.

  10. Spaceflight Sensorimotor Analogs: Simulating Acute and Adaptive Effects

    Science.gov (United States)

    Taylor, Laura C.; Harm, Deborah L.; Kozlovskaya, Inessa; Reschke, Millard F.; Wood, Scott J.

    2009-01-01

    Adaptive changes in sensorimotor function during spaceflight are reflected by spatial disorientation, motion sickness, gaze destabilization and decrements in balance, locomotion and eye-hand coordination that occur during and following transitions between different gravitational states. The purpose of this study was to conduct a meta-synthesis of data from spaceflight analogs to evaluate their effectiveness in simulating adaptive changes in sensorimotor function. METHODS. The analogs under review were categorized as either acute analogs used to simulate performance decrements accompanied with transient changes, or adaptive analogs used to drive sensorimotor learning to altered sensory feedback. The effectiveness of each analog was evaluated in terms of mechanisms of action, magnitude and time course of observed deficits compared to spaceflight data, and the effects of amplitude and exposure duration. RESULTS. Parabolic flight has been used extensively to examine effects of acute variation in gravitational loads, ranging from hypergravity to microgravity. More recently, galvanic vestibular stimulation has been used to elicit acute postural, locomotor and gaze dysfunction by disrupting vestibular afferents. Patient populations, e.g., with bilateral vestibular loss or cerebellar dysfunction, have been proposed to model acute sensorimotor dysfunction. Early research sponsored by NASA involved living onboard rotating rooms, which appeared to approximate the time course of adaptation and post-exposure recovery observed in astronauts following spaceflight. Exposure to different bed-rest paradigms (6 deg head down, dry immersion) results in motor deficits similar to those observed following spaceflight. Shorter adaptive analogs have incorporated virtual reality environments, visual distortion paradigms, exposure to conflicting tilt-translation cues, and exposure to 3Gx centrifugation. As with spaceflight, there is considerable variability in responses to most of the analogs

  11. INTRODUCCIÓN DE ELEMENTOS DE MEMORIA EN EL MÉTODO SIMULATED ANNEALING PARA RESOLVER PROBLEMAS DE PROGRAMACIÓN MULTIOBJETIVO DE MÁQUINAS PARALELAS INTRODUCTION OF MEMORY ELEMENTS IN SIMULATED ANNEALING METHOD TO SOLVE MULTIOBJECTIVE PARALLEL MACHINE SCHEDULING PROBLEMS

    Directory of Open Access Journals (Sweden)

    Felipe Baesler

    2008-12-01

    Full Text Available El presente artículo introduce una variante de la metaheurística simulated annealing, para la resolución de problemas de optimización multiobjetivo. Este enfoque se denomina MultiObjective Simulated Annealing with Random Trajectory Search, MOSARTS. Esta técnica agrega al algoritmo Simulated Annealing elementos de memoria de corto y largo plazo para realizar una búsqueda que permita balancear el esfuerzo entre todos los objetivos involucrados en el problema. Los resultados obtenidos se compararon con otras tres metodologías en un problema real de programación de máquinas paralelas, compuesto por 24 trabajos y 2 máquinas idénticas. Este problema corresponde a un caso de estudio real de la industria regional del aserrío. En los experimentos realizados, MOSARTS se comportó de mejor manera que el resto de las herramientas de comparación, encontrando mejores soluciones en términos de dominancia y dispersión. This paper introduces a variant of the metaheuristic simulated annealing, oriented to solve multiobjective optimization problems. This technique is called MultiObjective Simulated Annealing with Random Trajectory Search (MOSARTS). This technique incorporates short- and long-term memory concepts into Simulated Annealing in order to balance the search effort among all the objectives involved in the problem. The algorithm was tested against three different techniques on a real-life parallel machine scheduling problem, composed of 24 jobs and two identical machines. This problem represents a real-life case study of the local sawmill industry. The results showed that MOSARTS behaved much better than the other methods utilized, because it found better solutions in terms of dominance and frontier dispersion.

  12. Simulated annealing based algorithm for identifying mutated driver pathways in cancer.

    Science.gov (United States)

    Li, Hai-Tao; Zhang, Yu-Lang; Zheng, Chun-Hou; Wang, Hong-Qiang

    2014-01-01

    With the development of next-generation DNA sequencing technologies, large-scale cancer genomics projects can be implemented to help researchers to identify driver genes, driver mutations, and driver pathways, which promote cancer proliferation in large numbers of cancer patients. Hence, one of the remaining challenges is to distinguish functional mutations vital for cancer development, and filter out the unfunctional and random "passenger mutations." In this study, we introduce a modified method to solve the so-called maximum weight submatrix problem which is used to identify mutated driver pathways in cancer. The problem is based on two combinatorial properties, that is, coverage and exclusivity. Particularly, we enhance an integrative model which combines gene mutation and expression data. The experimental results on simulated data show that, compared with the other methods, our method is more efficient. Finally, we apply the proposed method on two real biological datasets. The results show that our proposed method is also applicable in real practice. PMID:24982873
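
    The maximum weight submatrix search can be illustrated with a plain simulated annealing loop over gene sets, scored by a weight that rewards patient coverage and penalizes overlap (lack of exclusivity). The weight function, move and cooling parameters below are generic assumptions for illustration and do not reproduce the authors' enhanced integrative model.

        import math, random

        def weight(gene_set, mutations):
            """W(M) = 2*|patients covered| - total mutations in M: favours high
            coverage with little overlap (approximate exclusivity)."""
            covered = set().union(*(mutations[g] for g in gene_set))
            return 2 * len(covered) - sum(len(mutations[g]) for g in gene_set)

        def sa_driver_pathway(mutations, k=3, t0=1.0, alpha=0.999, steps=20000):
            """Search for a k-gene set of high weight; `mutations` maps each gene
            to the set of patients in which it is mutated."""
            genes = list(mutations)
            current = random.sample(genes, k)
            best, temperature = list(current), t0
            for _ in range(steps):
                candidate = list(current)
                i = random.randrange(k)
                candidate[i] = random.choice([g for g in genes if g not in current])
                delta = weight(candidate, mutations) - weight(current, mutations)
                if delta >= 0 or random.random() < math.exp(delta / temperature):
                    current = candidate
                if weight(current, mutations) > weight(best, mutations):
                    best = list(current)
                temperature *= alpha
            return best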

  13. Displacement cascades and defects annealing in tungsten, Part I: Defect database from molecular dynamics simulations

    Energy Technology Data Exchange (ETDEWEB)

    Setyawan, Wahyu [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Nandipati, Giridhar [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Roche, Kenneth J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Univ. of Washington, Seattle, WA (United States); Heinisch, Howard L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wirth, Brian D. [Univ. of Tennessee, Knoxville, TN (United States); Oak Ridge National Lab., Oak Ridge, TN (United States); Kurtz, Richard J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-01

    Molecular dynamics simulations have been used to generate a comprehensive database of surviving defects due to displacement cascades in bulk tungsten. Twenty-one data points of primary knock-on atom (PKA) energies ranging from 100 eV (sub-threshold energy) to 100 keV (~780×Ed, where Ed = 128 eV is the average displacement threshold energy) have been completed at 300 K, 1025 K and 2050 K. Within this range of PKA energies, two regimes of power-law energy-dependence of the defect production are observed. A distinct power-law exponent characterizes the number of Frenkel pairs produced within each regime. The two regimes intersect at a transition energy which occurs at approximately 250×Ed. The transition energy also marks the onset of the formation of large self-interstitial atom (SIA) clusters (size 14 or more). The observed defect clustering behavior is asymmetric, with SIA clustering increasing with temperature, while the vacancy clustering decreases. This asymmetry increases with temperature such that at 2050 K (~0.5Tm) practically no large vacancy clusters are formed, meanwhile large SIA clusters appear in all simulations. The implication of such asymmetry on the long-term defect survival and damage accumulation is discussed. In addition, <100> {110} SIA loops are observed to form directly in the highest energy cascades, while vacancy <100> loops are observed to form at the lowest temperature and highest PKA energies, although the appearance of both the vacancy and SIA loops with Burgers vector of <100> type is relatively rare.

  14. Interpretation of residual gravity anomaly caused by simple shaped bodies using very fast simulated annealing global optimization

    Institute of Scientific and Technical Information of China (English)

    Arkoprovo Biswas

    2015-01-01

    A very fast simulated annealing (VFSA) global optimization is used to interpret residual gravity anomalies. Since VFSA optimization yields a large number of best-fitted models in a vast model space, the nature of uncertainty in the interpretation is also examined simultaneously in the present study. The results of VFSA optimization reveal that various parameters show a number of equivalent solutions when the shape of the target body is not known and the shape factor ‘q’ is optimized together with the other model parameters. The study reveals that the amplitude coefficient k is strongly dependent on the shape factor. This shows that there is a multi-model type of uncertainty between these two model parameters, derived from the analysis of cross-plots. However, the appraised values of the shape factor from various VFSA runs clearly indicate whether the subsurface structure is a sphere, a horizontal cylinder or a vertical cylinder. Accordingly, the exact shape factor (1.5 for a sphere, 1.0 for a horizontal cylinder and 0.5 for a vertical cylinder) is fixed and the optimization process is repeated. After fixing the shape factor, analysis of uncertainty and cross-plots shows a well-defined uni-model characteristic. The mean model computed after fixing the shape factor gives the most consistent results. Inversion of noise-free and noisy synthetic data as well as field data demonstrates the efficacy of the approach.

  15. Modeling and Simulated Annealing Optimization of Surface Roughness in CO2 Laser Nitrogen Cutting of Stainless Steel

    Directory of Open Access Journals (Sweden)

    M. Madić

    2013-09-01

    Full Text Available This paper presents a systematic methodology for empirical modeling and optimization of surface roughness in nitrogen-assisted CO2 laser cutting of stainless steel. The surface roughness prediction model was developed in terms of laser power, cutting speed, assist gas pressure and focus position by using an artificial neural network (ANN). To cover a wider range of laser cutting parameters and obtain an experimental database for the ANN model development, Taguchi's L27 orthogonal array was implemented in the experimental plan. The developed ANN model was expressed as an explicit nonlinear function, while the influence of laser cutting parameters and their interactions on surface roughness were analyzed by generating 2D and 3D plots. The final goal of the experimental study focuses on the determination of the optimum laser cutting parameters for the minimization of surface roughness. Since the solution space of the developed ANN model is complex, and the possibility of many local solutions is great, simulated annealing (SA) was selected as a method for the optimization of surface roughness.

  16. TPA: A Two-Phase Approach Using Simulated Annealing for the Optimization of Census Taker Routes in Mexico

    Directory of Open Access Journals (Sweden)

    Silvia Gaona

    2015-01-01

    Full Text Available Censuses in Mexico are taken by the National Institute of Statistics and Geography (INEGI). In this paper a Two-Phase Approach (TPA) to optimize the routes of INEGI’s census takers is presented. For each pollster, in the first phase, a route is produced by means of the Simulated Annealing (SA) heuristic, which attempts to minimize the travel distance subject to particular constraints. Whenever the route is unrealizable, it is made realizable in the second phase by constructing a visibility graph for each obstacle and applying Dijkstra’s algorithm to determine the shortest path in this graph. A tuning methodology based on the irace package was used to determine the parameter values for TPA on a subset of 150 instances provided by INEGI. The practical effectiveness of TPA was assessed on another subset of 1962 instances, comparing its performance with that of the in-use heuristic (INEGIH). The results show that TPA clearly outperforms INEGIH. The average improvement is 47.11%.
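
    The second phase described above reduces to a shortest-path query on a visibility graph. A minimal sketch using Python's heapq is shown below; the adjacency-list layout of the graph is an assumption, and the construction of the visibility graph from the obstacle polygons is not shown.

        import heapq

        def dijkstra_shortest_path(graph, src, dst):
            """Shortest path on a visibility graph given as an adjacency list
            {node: [(neighbour, distance), ...]}; assumes dst is reachable."""
            dist = {src: 0.0}
            prev = {}
            heap = [(0.0, src)]
            while heap:
                d, u = heapq.heappop(heap)
                if u == dst:
                    break
                if d > dist.get(u, float("inf")):
                    continue                      # stale heap entry
                for v, w in graph.get(u, []):
                    nd = d + w
                    if nd < dist.get(v, float("inf")):
                        dist[v], prev[v] = nd, u
                        heapq.heappush(heap, (nd, v))
            # rebuild the path by walking back from dst to src
            path, node = [dst], dst
            while node != src:
                node = prev[node]
                path.append(node)
            return path[::-1], dist[dst]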

  17. A general Monte Carlo/simulated annealing algorithm for resonance assignment in NMR of uniformly labeled biopolymers

    Science.gov (United States)

    Hu, Kan-Nian; Qiang, Wei; Tycko, Robert

    2011-01-01

    We describe a general computational approach to site-specific resonance assignments in multidimensional NMR studies of uniformly ¹⁵N,¹³C-labeled biopolymers, based on a simple Monte Carlo/simulated annealing (MCSA) algorithm contained in the program MCASSIGN2. Input to MCASSIGN2 includes lists of multidimensional signals in the NMR spectra with their possible residue-type assignments (which need not be unique), the biopolymer sequence, and a table that describes the connections that relate one signal list to another. As output, MCASSIGN2 produces a high-scoring sequential assignment of the multidimensional signals, using a score function that rewards good connections (i.e., agreement between relevant sets of chemical shifts in different signal lists) and penalizes bad connections, unassigned signals, and assignment gaps. Examination of a set of high-scoring assignments from a large number of independent runs allows one to determine whether a unique assignment exists for the entire sequence or parts thereof. We demonstrate the MCSA algorithm using two-dimensional (2D) and three-dimensional (3D) solid state NMR spectra of several model protein samples (α-spectrin SH3 domain and protein G/B1 microcrystals, HET-s(218–289) fibrils), obtained with magic-angle spinning and standard polarization transfer techniques. The MCSA algorithm and MCASSIGN2 program can accommodate arbitrary combinations of NMR spectra with arbitrary dimensionality, and can therefore be applied in many areas of solid state and solution NMR. PMID:21710190
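
    The score function sketched below illustrates the kind of reward/penalty bookkeeping the text describes (good connections rewarded; bad connections, unassigned signals and gaps penalized). The weights and the data layout are illustrative assumptions, not the MCASSIGN2 defaults.

        def assignment_score(assignment, connection_ok, w_good=1.0, w_bad=-2.0,
                             w_unassigned=-0.3, w_gap=-0.3):
            """assignment[i] is the signal index assigned to residue i (-1 = none);
            connection_ok[(a, b)] is True when signals a and b agree in the
            chemical shifts they share between neighbouring residues."""
            score = w_unassigned * sum(1 for a in assignment if a == -1)
            for i in range(len(assignment) - 1):
                a, b = assignment[i], assignment[i + 1]
                if a == -1 or b == -1:
                    score += w_gap                        # assignment gap
                elif connection_ok.get((a, b), False):
                    score += w_good                       # consistent sequential connection
                else:
                    score += w_bad                        # conflicting chemical shifts
            return score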

  18. An optimization method based on combination of cellular automata and simulated annealing for VVER-1000 NPP loading pattern

    International Nuclear Information System (INIS)

    This paper introduces a design methodology in the context of finding new and innovative design principles by means of optimization techniques. In this method cellular automata (CA) and simulated annealing (SA) were combined and used for solving the optimization problem. The method rests on two principles: the neighbourhood concept from CA, and the acceptance of each displacement based on the decrease of the objective function and the Boltzmann distribution from SA, which plays the role of the transition rule. The proposed method was used for solving the fuel management optimization problem in the VVER-1000 Russian reactor. Since the fuel management problem involves a huge amount of calculation for finding the best configuration of fuel assemblies in the reactor core, this method has been introduced to reduce the volume of calculation. In this study the reduction of the power peaking factor inside the reactor core of Bushehr NPP is considered as the objective function. The proposed optimization method is compared with the Hopfield neural network procedure that was used for solving this problem, and it has been shown that the results, speed and quality of the new method are comparable with it. Besides, the result is the optimum configuration, which is in agreement with the pattern proposed by the designer.
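
    A minimal sketch of the two ingredients named above, a CA-style neighbour move and the Boltzmann acceptance rule acting as the transition rule, is given below. The two-dimensional core layout, the swap move and the temperature handling are assumptions made purely for illustration.

        import math, random

        def accept(delta_objective, temperature):
            """SA transition rule: always accept an improvement, otherwise accept
            with Boltzmann probability exp(-delta/T)."""
            return (delta_objective <= 0
                    or random.random() < math.exp(-delta_objective / temperature))

        def neighbour_swap(core):
            """CA-style move: swap a fuel assembly with one of its lattice
            neighbours (core is a 2D list of assembly ids)."""
            rows, cols = len(core), len(core[0])
            i, j = random.randrange(rows), random.randrange(cols)
            di, dj = random.choice([(0, 1), (1, 0), (0, -1), (-1, 0)])
            ni, nj = (i + di) % rows, (j + dj) % cols
            new_core = [row[:] for row in core]
            new_core[i][j], new_core[ni][nj] = new_core[ni][nj], new_core[i][j]
            return new_core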

  19. Porous media microstructure reconstruction using pixel-based and object-based simulated annealing: comparison with other reconstruction methods

    Energy Technology Data Exchange (ETDEWEB)

    Diogenes, Alysson N.; Santos, Luis O.E. dos; Fernandes, Celso P. [Universidade Federal de Santa Catarina (UFSC), Florianopolis, SC (Brazil); Appoloni, Carlos R. [Universidade Estadual de Londrina (UEL), PR (Brazil)

    2008-07-01

    The physical properties of reservoir rocks are usually obtained in the laboratory, through standard experiments. These experiments are often very expensive and time-consuming. Hence, digital image analysis techniques are a very fast and low cost methodology for physical property prediction, requiring only geometrical parameters measured from thin sections of the rock microstructure. This research analyzes two methods for porous media reconstruction using the simulated annealing relaxation method. Using geometrical parameters measured from rock thin sections, it is possible to construct a three-dimensional (3D) model of the microstructure. We assume statistical homogeneity and isotropy, and the 3D model maintains porosity spatial correlation, chord size distribution and d3-4 distance transform distribution for a pixel-based reconstruction, and spatial correlation for an object-based reconstruction. The 2D and 3D preliminary results are compared with microstructures reconstructed by truncated Gaussian methods. As this research is in its beginning, only the 2D results will be presented. (author)

  20. [Application of simulated annealing method and neural network on optimizing soil sampling schemes based on road distribution].

    Science.gov (United States)

    Han, Zong-wei; Huang, Wei; Luo, Yun; Zhang, Chun-di; Qi, Da-cheng

    2015-03-01

    Taking the soil organic matter in eastern Zhongxiang County, Hubei Province, as the research object, thirteen sample sets from different regions were arranged around the road network, and their spatial configuration was optimized by the simulated annealing approach. The topographic factors of these thirteen sample sets, including slope, plane curvature, profile curvature, topographic wetness index, stream power index and sediment transport index, were extracted by terrain analysis. Based on the results of the optimization, a multiple linear regression model with topographic factors as independent variables was built. At the same time, a multilayer perceptron model based on the neural network approach was implemented. A comparison between these two models was then carried out. The results revealed that the proposed approach was practicable in optimizing the soil sampling scheme. The optimal configuration was capable of capturing soil-landscape knowledge exactly, and its accuracy was better than that of the original samples. This study designed a sampling configuration to study the soil attribute distribution by referring to the spatial layout of the road network, historical samples, and digital elevation data, which provides an effective means as well as a theoretical basis for determining the sampling configuration and displaying the spatial distribution of soil organic matter with low cost and high efficiency. PMID:26211074

  1. Dynamic cellular manufacturing system design considering alternative routing and part operation tradeoff using simulated annealing based genetic algorithm

    Indian Academy of Sciences (India)

    KAMAL DEEP; PARDEEP K SINGH

    2016-09-01

    In this paper, an integrated mathematical model of multi-period cell formation and part operation tradeoff in a dynamic cellular manufacturing system is proposed, in consideration of multiple part process routes. The paper puts emphasis on production flexibility (production/subcontracting of part operations) to satisfy the product demand requirement in different period segments of the planning horizon, considering production capacity shortage and/or sudden machine breakdown. The proposed model simultaneously generates machine cells and part families and selects the optimum process route instead of the user specifying predetermined routes. Conventional optimization methods for the optimal cell formation problem require a substantial amount of time and memory space. Hence a simulated annealing based genetic algorithm is proposed to explore the solution regions efficiently and to expedite the search of the solution space. To evaluate the computational performance of the proposed algorithm, different problem scenarios are adopted from the literature. The results confirm the effectiveness of the proposed approach in designing the manufacturing cells and minimizing the overall cost, considering various manufacturing aspects such as production volume, multiple process routes, production capacity, machine duplication, system reconfiguration, material handling and subcontracting of part operations.

  2. Hydrodynamical Adaptive Mesh Refinement Simulations of Disk Galaxies

    CERN Document Server

    Gibson, Brad K; Sanchez-Blazquez, Patricia; Teyssier, Romain; House, Elisa L; Brook, Chris B; Kawata, Daisuke

    2008-01-01

    To date, fully cosmological hydrodynamic disk simulations to redshift zero have only been undertaken with particle-based codes, such as GADGET, Gasoline, or GCD+. In light of the (supposed) limitations of traditional implementations of smoothed particle hydrodynamics (SPH), or at the very least, their respective idiosyncrasies, it is important to explore complementary approaches to the SPH paradigm to galaxy formation. We present the first high-resolution cosmological disk simulations to redshift zero using an adaptive mesh refinement (AMR)-based hydrodynamical code, in this case, RAMSES. We analyse the temporal and spatial evolution of the simulated stellar disks' vertical heating, velocity ellipsoids, stellar populations, vertical and radial abundance gradients (gas and stars), assembly/infall histories, warps/lopsideness, disk edges/truncations (gas and stars), ISM physics implementations, and compare and contrast these properties with our sample of cosmological SPH disks, generated with GCD+. These prelim...

  3. Ensemble annealing of complex physical systems

    CERN Document Server

    Habeck, Michael

    2015-01-01

    Algorithms for simulating complex physical systems or solving difficult optimization problems often resort to an annealing process. Rather than simulating the system at the temperature of interest, an annealing algorithm starts at a temperature that is high enough to ensure ergodicity and gradually decreases it until the destination temperature is reached. This idea is used in popular algorithms such as parallel tempering and simulated annealing. A general problem with annealing methods is that they require a temperature schedule. Choosing well-balanced temperature schedules can be tedious and time-consuming. Imbalanced schedules can have a negative impact on the convergence, runtime and success of annealing algorithms. This article outlines a unifying framework, ensemble annealing, that combines ideas from simulated annealing, histogram reweighting and nested sampling with concepts in thermodynamic control. Ensemble annealing simultaneously simulates a physical system and estimates its density of states. The...

  4. Simulated Annealing Based Algorithm for Identifying Mutated Driver Pathways in Cancer

    Directory of Open Access Journals (Sweden)

    Hai-Tao Li

    2014-01-01

    Full Text Available With the development of next-generation DNA sequencing technologies, large-scale cancer genomics projects can be implemented to help researchers to identify driver genes, driver mutations, and driver pathways, which promote cancer proliferation in large numbers of cancer patients. Hence, one of the remaining challenges is to distinguish functional mutations vital for cancer development, and filter out the unfunctional and random “passenger mutations.” In this study, we introduce a modified method to solve the so-called maximum weight submatrix problem which is used to identify mutated driver pathways in cancer. The problem is based on two combinatorial properties, that is, coverage and exclusivity. Particularly, we enhance an integrative model which combines gene mutation and expression data. The experimental results on simulated data show that, compared with the other methods, our method is more efficient. Finally, we apply the proposed method on two real biological datasets. The results show that our proposed method is also applicable in real practice.

  5. Searching for Stable SinCn Clusters: Combination of Stochastic Potential Surface Search and Pseudopotential Plane-Wave Car-Parinello Simulated Annealing Simulations

    Directory of Open Access Journals (Sweden)

    Larry W. Burggraf

    2013-07-01

    Full Text Available To find low energy SinCn structures out of hundreds to thousands of isomers we have developed a general method to search for stable isomeric structures that combines Stochastic Potential Surface Search and Pseudopotential Plane-Wave Density Functional Theory Car-Parinello Molecular Dynamics simulated annealing (PSPW-CPMD-SA). We enhanced the Saunders stochastic search method to generate random cluster structures used as seed structures for PSPW-CPMD-SA simulations. This method ensures that each SA simulation samples a different potential surface region to find the regional minimum structure. By iterations of this automated, parallel process on a high performance computer we located hundreds to more than a thousand stable isomers for each SinCn cluster. Among these, five to ten of the lowest energy isomers were further optimized using the B3LYP/cc-pVTZ method. We applied this method to SinCn (n = 4–12) clusters and found the lowest energy structures, most not previously reported. By analyzing the bonding patterns of the low energy structures of each SinCn cluster, we observed that carbon segregations tend to form condensed conjugated rings, while Si connects to unsaturated bonds at the periphery of the carbon segregation as single atoms or clusters when n is small; when n is large, a silicon network spans over the carbon segregation region.

  6. 模拟退火遗传算法在DOA估计技术中的应用%Application of simulated annealing genetic algorithm in DOA estimation technique

    Institute of Scientific and Technical Information of China (English)

    贾伟娜; 刘顺兰

    2014-01-01

    The simulated annealing genetic algorithm is a new global optimization algorithm formed by integrating simulated annealing into the genetic algorithm. The simulated annealing genetic algorithm is then applied to the WSF algorithm of the DOA estimation technique, in order to reduce the complexity of the WSF algorithm and improve the DOA estimation precision. At the same time, the new algorithm overcomes the low efficiency of the basic genetic algorithm in DOA estimation and its tendency to fall into local optima. Computer simulation results show that, compared with the basic genetic algorithm and the Gauss-Newton method, the DOA estimation technique based on the simulated annealing genetic algorithm has a higher resolution probability and a smaller mean square error.%将模拟退火思想融入到遗传算法中,形成了另一种优化算法,即模拟退火遗传算法,将其应用于加权子空间(WSF)算法的目标方位(DOA)估计技术中,以求降低WSF算法的运算复杂度,提高DOA估计精度,同时又解决了基本遗传算法在DOA估计中易陷入局部最优、后期搜索迟钝等问题。计算机仿真结果表明:采用模拟退火遗传算法的DOA估计技术在低信噪比条件下比采用基本遗传算法、高斯-牛顿算法有更高的分辨概率,更小的均方误差。

  7. A parallel adaptive finite difference algorithm for petroleum reservoir simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hoang, Hai Minh

    2005-07-01

    Adaptive finite difference methods for problems arising in the simulation of flow in porous media are considered. Such methods have been proven useful for overcoming limitations of computational resources and improving the resolution of the numerical solutions to a wide range of problems. Local refinement of the computational mesh where it is needed to improve the accuracy of solutions yields better solution resolution, representing a more efficient use of computational resources than is possible with traditional fixed-grid approaches. In this thesis, we propose a parallel adaptive cell-centered finite difference (PAFD) method for black-oil reservoir simulation models. This is an extension of the adaptive mesh refinement (AMR) methodology first developed by Berger and Oliger (1984) for the hyperbolic problem. Our algorithm is fully adaptive in time and space through the use of subcycling, in which finer grids are advanced at smaller time steps than the coarser ones. When coarse and fine grids reach the same advanced time level, they are synchronized to ensure that the global solution is conservative and satisfies the divergence constraint across all levels of refinement. The material in this thesis is subdivided into three overall parts. First we explain the methodology and intricacies of the AFD scheme. Then we extend a cell-centered finite difference discretization to a multilevel hierarchy of refined grids, and finally we employ the algorithm on a parallel computer. The results in this work show that the approach presented is robust and stable, demonstrating the increased solution accuracy due to local refinement and reduced computing resource consumption. (Author)
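
    The subcycling described above can be sketched schematically as follows. The stub Grid class stands in for the actual solver, and the refinement ratio, data structures and synchronization step are illustrative only.

        class Grid:
            """Stub grid holding only its level and simulated time (illustration only)."""
            def __init__(self, level):
                self.level, self.time = level, 0.0
            def step(self, dt):
                self.time += dt                 # a real code would advance the PDE here
            def synchronize_with(self, finer):
                # a real AMR code would apply a conservative flux correction here
                self.time = max(self.time, finer.time)

        def advance(level, dt, grids, refinement_ratio=2):
            """Berger-Oliger style subcycling: each finer level takes
            refinement_ratio smaller steps, then is synchronized with its parent."""
            grids[level].step(dt)
            if level + 1 < len(grids):
                for _ in range(refinement_ratio):
                    advance(level + 1, dt / refinement_ratio, grids, refinement_ratio)
                grids[level].synchronize_with(grids[level + 1])

        # usage: three levels of refinement advanced over one coarse step
        grids = [Grid(l) for l in range(3)]
        advance(0, 1.0, grids)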

  8. Simulation of Biochemical Pathway Adaptability Using Evolutionary Algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Bosl, W J

    2005-01-26

    The systems approach to genomics seeks quantitative and predictive descriptions of cells and organisms. However, both the theoretical and experimental methods necessary for such studies still need to be developed. We are far from understanding even the simplest collective behavior of biomolecules, cells or organisms. A key aspect of all biological problems, including environmental microbiology, the evolution of infectious diseases, and the adaptation of cancer cells, is the evolvability of genomes. This is particularly important for Genomes to Life missions, which tend to focus on the prospect of engineering microorganisms to achieve desired goals in environmental remediation, climate change mitigation, and energy production. All of these will require quantitative tools for understanding the evolvability of organisms. Laboratory biodefense goals will need quantitative tools for predicting complicated host-pathogen interactions and finding counter-measures. In this project, we seek to develop methods to simulate how external and internal signals cause the genetic apparatus to adapt and organize to produce complex biochemical systems to achieve survival. This project is specifically directed toward building a computational methodology for simulating the adaptability of genomes. This project investigated the feasibility of using a novel quantitative approach to studying the adaptability of genomes and biochemical pathways. This effort was intended to be the preliminary part of a larger, long-term effort between key leaders in computational and systems biology at Harvard University and LLNL, with Dr. Bosl as the lead PI. Scientific goals for the long-term project include the development and testing of new hypotheses to explain the observed adaptability of yeast biochemical pathways when the myosin-II gene is deleted and the development of a novel data-driven evolutionary computation as a way to connect exploratory computational simulation with hypothesis

  9. Adaptive mesh refinement and adjoint methods in geophysics simulations

    Science.gov (United States)

    Burstedde, Carsten

    2013-04-01

    It is an ongoing challenge to increase the resolution that can be achieved by numerical geophysics simulations. This applies to considering sub-kilometer mesh spacings in global-scale mantle convection simulations as well as to using frequencies up to 1 Hz in seismic wave propagation simulations. One central issue is the numerical cost, since for three-dimensional space discretizations, possibly combined with time stepping schemes, a doubling of resolution can lead to an increase in storage requirements and run time by factors between 8 and 16. A related challenge lies in the fact that an increase in resolution also increases the dimensionality of the model space that is needed to fully parametrize the physical properties of the simulated object (a.k.a. earth). Systems that exhibit a multiscale structure in space are candidates for employing adaptive mesh refinement, which varies the resolution locally. An example that we found well suited is the mantle, where plate boundaries and fault zones require a resolution on the km scale, while deeper areas can be treated with 50 or 100 km mesh spacings. This approach effectively reduces the number of computational variables by several orders of magnitude. While in this case it is possible to derive the local adaptation pattern from known physical parameters, it is often unclear which criteria are the most suitable for adaptation. We will present the goal-oriented error estimation procedure, where such criteria are derived from an objective functional that represents the observables to be computed most accurately. Even though this approach is well studied, it is rarely used in the geophysics community. A related strategy to make finer resolution manageable is to design methods that automate the inference of model parameters. Tweaking more than a handful of numbers and judging the quality of the simulation by ad hoc comparisons to known facts and observations is a tedious task and fundamentally limited by the turnaround times

  10. Direct numerical simulation of bubbles with parallelized adaptive mesh refinement

    International Nuclear Information System (INIS)

    The study of two-phase thermal-hydraulics is a major topic for nuclear engineering, for both the safety and the efficiency of nuclear facilities. In addition to experiments, numerical modeling helps to know precisely where bubbles appear and how they behave, in the core as well as in the steam generators. This work presents the finest scale of representation of two-phase flows, Direct Numerical Simulation of bubbles. We use the 'Di-phasic Low Mach Number' equation model. It is particularly adapted to low-Mach-number flows, that is to say flows whose velocity is much slower than the speed of sound; this is very typical of nuclear thermal-hydraulics conditions. Because we study bubbles, we capture the front between vapor and liquid phases thanks to a downward flux limiting numerical scheme. The specific discrete analysis technique this work introduces is well-balanced parallel Adaptive Mesh Refinement (AMR). With AMR, we refine the coarse grid on a batch of patches in order to locally increase precision in areas which matter more, and capture fine changes in the front location and its topology. We show that patch-based AMR is very well adapted for parallel computing. We use a variety of physical examples: forced advection, heat transfer, phase changes represented by a Stefan model, as well as the combination of all those models. We will present the results of those numerical simulations, as well as the speed-up compared to equivalent non-AMR simulations and to serial computation of the same problems. This document is made up of an abstract and the slides of the presentation. (author)

  11. 基于自适应遗传退火算法的配电网故障定位研究%Fault Location of Distribution Networks Based on Adaptive Genetic Annealing Algorithm

    Institute of Scientific and Technical Information of China (English)

    吕学勤; 陈树果; 田振宁

    2012-01-01

    Focusing on the problems of premature convergence and slow convergence of the standard genetic algorithm, this paper proposes a hybrid genetic algorithm (adaptive genetic annealing algorithm) to solve fault location in radial distribution networks. The algorithm adopts a selection mechanism combining the roulette strategy and the elitist strategy, so that the current best individual is always kept in the population, uses adaptive crossover and mutation probabilities to expand the search range of the population, and then introduces the simulated annealing algorithm to speed up the convergence rate in the later iterations. Finally, a simulation calculation is conducted for the IEEE 33-node distribution system, and the results indicate that the algorithm converges quickly and locates single or multiple faults accurately in real time. In addition, it also has good fault tolerance when the fault information is distorted.%针对标准遗传算法易早熟收敛以及收敛速度慢的问题,提出了一种混合遗传算法(自适应遗传退火算法)用于解决辐射状配电网故障定位问题.该算法采用轮盘赌和最优保存策略相结合的选择机制,使得当前最优个体始终保持在种群里,并结合自适应交叉、变异概率,扩大种群的搜索范围,继而引入模拟退火算法,加快迭代后期算法的收敛速度.最后,通过对IEEE-33节点配电系统进行仿真计算,结果表明,该算法能够对单点和多点故障进行实时、准确地定位,并在故障信息畸变的情况下,也能快速地得到准确结果.
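
    Adaptive crossover and mutation probabilities of the kind mentioned above are commonly implemented by lowering the rates for individuals fitter than the population average and raising them for weaker ones. The sketch below follows that common rule (in the spirit of Srinivas and Patnaik); the exact formula and constants used in the paper may differ.

        def adaptive_rates(fitness, f_max, f_avg, pc_max=0.9, pc_min=0.5,
                           pm_max=0.1, pm_min=0.01):
            """Return (crossover probability, mutation probability) for one individual."""
            if f_max <= f_avg:                    # degenerate population: use the high rates
                return pc_max, pm_max
            if fitness >= f_avg:                  # above-average individuals are protected
                scale = (f_max - fitness) / (f_max - f_avg)
                return (pc_min + (pc_max - pc_min) * scale,
                        pm_min + (pm_max - pm_min) * scale)
            return pc_max, pm_max                 # below-average individuals are disrupted more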

  12. Adaptive Techniques for Clustered N-Body Cosmological Simulations

    CERN Document Server

    Menon, Harshitha; Zheng, Gengbin; Jetley, Pritish; Kale, Laxmikant; Quinn, Thomas; Governato, Fabio

    2014-01-01

    ChaNGa is an N-body cosmology simulation application implemented using Charm++. In this paper, we present the parallel design of ChaNGa and address many challenges arising due to the high dynamic ranges of clustered datasets. We focus on optimizations based on adaptive techniques for scaling to more than 128K cores. We demonstrate strong scaling on up to 512K cores of Blue Waters evolving 12 and 24 billion particles. We also show strong scaling of highly clustered datasets on up to 128K cores.

  13. Adaptive resolution simulation of polarizable supramolecular coarse-grained water models

    NARCIS (Netherlands)

    Zavadlav, Julija; Melo, Manuel N.; Marrink, Siewert J.; Praprotnik, Matej

    2015-01-01

    Multiscale simulation methods, such as the adaptive resolution scheme, are becoming increasingly popular due to their significant computational advantages with respect to conventional atomistic simulations. For this kind of simulation, it is essential to develop accurate multiscale water models that

  14. Adapting a weather forecast model for greenhouse gas simulation

    Science.gov (United States)

    Polavarapu, S. M.; Neish, M.; Tanguay, M.; Girard, C.; de Grandpré, J.; Gravel, S.; Semeniuk, K.; Chan, D.

    2015-12-01

    The ability to simulate greenhouse gases on the global domain is useful for providing boundary conditions for regional flux inversions, as well as for providing reference data for bias correction of satellite measurements. Given the existence of operational weather and environmental prediction models and assimilation systems at Environment Canada, it makes sense to use these tools for greenhouse gas simulations. In this work, we describe the adaptations needed to reasonably simulate CO2 with a weather forecast model. The main challenges were the implementation of a mass conserving advection scheme, and the careful implementation of a mixing ratio defined with respect to dry air. The transport of tracers through convection was also added, and the vertical mixing through the boundary layer was slightly modified. With all these changes, the model conserves CO2 mass well on the annual time scale, and the high resolution (0.9 degree grid spacing) permits a good description of synoptic scale transport. The use of a coupled meteorological/tracer transport model also permits an assessment of approximations needed in offline transport model approaches, such as the neglect of water vapour mass when computing a tracer mixing ratio with respect to dry air.

  15. DESIGNING A DIFFRACTIVE OPTICAL ELEMENT FOR CONTROLLING THE BEAM PROFILE IN A THREE-DIMENSIONAL SPACE USING THE SIMULATED ANNEALING ALGORITHM

    Institute of Scientific and Technical Information of China (English)

    LIANG WEN-XI; ZHANG JING-JUAN; LÜ JUN-FENG; LIAO RUI

    2001-01-01

    We have designed a spatially quantized diffractive optical element (DOE) for controlling the beam profile in a three-dimensional space with the help of the simulated annealing (SA) algorithm. In this paper, we investigate the annealing schedule and the neighbourhood, which are the deterministic parameters of the process that guarantee the quality of the SA algorithm. The algorithm is employed to solve the discrete stochastic optimization problem of the design of a DOE. The objective function which constrains the optimization is also studied. The computed results demonstrate that the procedure of the algorithm converges stably to an optimal solution close to the global optimum within an acceptable computing time. The results meet the design requirement well and are applicable.
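
    The two ingredients investigated above can be sketched directly: a neighbourhood move that perturbs one quantized phase level of the DOE, and an annealing schedule. A simple geometric schedule is shown as a stand-in; it is a common choice and not necessarily the schedule studied in the paper.

        import random

        def neighbour(phase_levels, n_levels=16):
            """Perturb one DOE pixel by one quantized phase step (wrap-around)."""
            new = list(phase_levels)
            i = random.randrange(len(new))
            new[i] = (new[i] + random.choice((-1, 1))) % n_levels
            return new

        def geometric_schedule(t0=1.0, alpha=0.95):
            """Yield temperatures T_k = t0 * alpha**k."""
            t = t0
            while True:
                yield t
                t *= alpha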

  16. Adaptive model reduction for nonsmooth discrete element simulation

    CERN Document Server

    Servin, Martin

    2015-01-01

    A method for adaptive model order reduction for nonsmooth discrete element simulation is developed and analysed in numerical experiments. Regions of the granular media that collectively move as rigid bodies are substituted with rigid bodies of the corresponding shape and mass distribution. The method also supports particles merging with articulated multibody systems. A model approximation error is defined and used to derive conditions for when and where to apply model reduction and refinement back into particles and smaller rigid bodies. Three methods for refinement are proposed and tested: prediction from contact events, trial solutions computed in the background, and using split sensors. The computational performance can be increased by 5-50 times for model reduction levels between 70 and 95%.

  17. 分布式遗传模拟退火算法的火力打击目标分配优化%Optimization for Target Assignment in Fire Strike Based on Distributed Genetic Simulated Annealing Algorithm

    Institute of Scientific and Technical Information of China (English)

    吴坤鸿; 詹世贤

    2016-01-01

    根据火力打击规则,建立了多目标函数的目标分配模型,提出了分布式遗传模拟退火算法对模型进行求解。分布式遗传模拟退火算法基于经典遗传算法进行改进:将单目标串行搜索方式变成多目标分布式搜索方式,适用于多目标寻优问题求解;采用保留最优个体和轮盘赌相结合的方式进行个体选择,在交叉算子中引入模拟退火算法,使用自适应变异概率,较好地保持算法广度和深度搜索平衡。最后,通过仿真实验验证了算法的有效性和可靠性。%According to the rules of fire strike,a target assignment model is presented,and a Distributed Genetic Simulated Annealing algorithm (DGSA)is applied to resolve this model. DGSA is improved based on classic Genetic Algorithm (GA)as below:the single object serial-searched mode is changed to multiple objects distributed-searched mode,which is fitter for resolving multiobjective optimization; in order to keep a better balance between exploration and exploitation of algorithm,a method by coupling best one preservation and roulette wheel is established for individual selection,and simulated annealing algorithm is combined into crossover operation,and self -adaptive mutation probability is applied. Finally,the efficiency and reliability of DGSA is verified by simulation experiment.

  18. Quantum Annealing and Quantum Fluctuation Effect in Frustrated Ising Systems

    OpenAIRE

    Tanaka, Shu; Tamura, Ryo

    2012-01-01

    The quantum annealing method has attracted wide attention in statistical physics and information science, since it is expected to be a powerful method for obtaining the best solution of an optimization problem, as is simulated annealing. The quantum annealing method was incubated in quantum statistical physics. It is an alternative to simulated annealing, which is well adapted to many optimization problems. In simulated annealing, we obtain a solution of an optimization problem b...

  19. Quantum Annealing for Variational Bayes Inference

    OpenAIRE

    Sato, Issei; Kurihara, Kenichi; Tanaka, Shu; Nakagawa, Hiroshi; Miyashita, Seiji

    2014-01-01

    This paper presents studies on a deterministic annealing algorithm based on quantum annealing for variational Bayes (QAVB) inference, which can be seen as an extension of the simulated annealing for variational Bayes (SAVB) inference. QAVB is as easy as SAVB to implement. Experiments revealed QAVB finds a better local optimum than SAVB in terms of the variational free energy in latent Dirichlet allocation (LDA).

  20. Unsteady CFD simulations of a pump in part load conditions using scale-adaptive simulation

    Energy Technology Data Exchange (ETDEWEB)

    Lucius, A., E-mail: andreas.lucius@tu-clausthal.d [Institute of Applied Mechanics, Clausthal University of Technology, Adolph-Roemer Str. 2a, 38678 Clausthal-Zellerfeld (Germany); Brenner, G. [Institute of Applied Mechanics, Clausthal University of Technology, Adolph-Roemer Str. 2a, 38678 Clausthal-Zellerfeld (Germany)

    2010-12-15

    The scope of this work is to demonstrate the applicability of an eddy-resolving turbulence model in a turbomachinery configuration. The model combines the Large Eddy Simulation (LES) and the Reynolds Averaged Navier Stokes (RANS) approach. The point of interest of the present investigation is the unsteady rotating stall phenomenon occurring at low part load conditions. Since RANS turbulence models often fail to predict separation correctly, an LES-like model is expected to give superior results. In this investigation the scale-adaptive simulation (SAS) model is used. This model avoids the grid dependence appearing in the Detached Eddy Simulation (DES) modelling strategy. The simulations are validated with transient measurement data. The present results demonstrate that both models are able to predict the major stall frequency at part load. Results are similar for URANS and SAS, with advantages in predicting minor stall frequencies for the turbulence-resolving model.

  1. Computational Simulation of Hypervelocity Penetration Using Adaptive SPH Method

    Institute of Scientific and Technical Information of China (English)

    QIANG Hongfu; MENG Lijun

    2006-01-01

    The normal hypervelocity impact of a thin Al plate by an Al sphere was numerically simulated by using the adaptive smoothed particle hydrodynamics (ASPH) method. In this method, the isotropic smoothing algorithm of standard SPH is replaced with anisotropic smoothing involving ellipsoidal kernels whose axes evolve automatically to follow the mean particle spacing as it varies in time, space, and direction around each particle. Using the ASPH, the anisotropic volume changes under strong shock conditions are captured more accurately and clearly. The sophisticated features of meshless and Lagrangian nature inherent in the SPH method are kept for treating large deformations, large inhomogeneities and tracing free surfaces in the extremely transient impact process. A two-dimensional ASPH program was coded in C++. The developed hydrocode is examined for example problems of hypervelocity impacts of solid materials. The results obtained from the numerical simulation are compared with available experimental ones. Good agreement is observed.

  2. Hybrid Quantum Annealing for Clustering Problems

    OpenAIRE

    Tanaka, Shu; Tamura, Ryo; Sato, Issei; Kurihara, Kenichi

    2011-01-01

    We develop a hybrid type of quantum annealing in which we control temperature and quantum field simultaneously. We study the efficiency of the proposed quantum annealing and find a good schedule for changing the thermal fluctuation and the quantum fluctuation. In this paper, we focus on clustering problems, which are important topics in information science and engineering. We obtain better solutions of the clustering problem with the proposed quantum annealing than with standard simulated annealing.

  3. Quantum annealing: An introduction and new developments

    OpenAIRE

    Ohzeki, Masayuki; Nishimori, Hidetoshi

    2010-01-01

    Quantum annealing is a generic algorithm using quantum-mechanical fluctuations to search for the solution of an optimization problem. The present paper first reviews the fundamentals of quantum annealing and then reports on preliminary results for an alternative method. The review part includes the relationship of quantum annealing with classical simulated annealing. We next propose a novel quantum algorithm which might be available for hard optimization problems by using a classical-quantum ...

  4. Metaheurística Simulated Annealing para solução de problemas de planejamento florestal com restrições de integridade Simulated Annealing metaheuristic to solve forest planning problem with integer constraints

    Directory of Open Access Journals (Sweden)

    Flávio Lopes Rodrigues

    2004-04-01

    Full Text Available Os objetivos deste trabalho foram desenvolver e testar a metaheurística SA para solução de problemas de gerenciamento florestal com restrições de integridade. O algoritmo SA desenvolvido foi testado em quatro problemas, contendo entre 93 e 423 variáveis de decisão, sujeitos às restrições de singularidade, produção mínima e produção máxima, periodicamente. Todos os problemas tiveram como objetivo a maximização do valor presente líquido. O algoritmo SA foi codificado em linguagem delphi 5.0 e os testes foram efetuados em um microcomputador AMD K6II 500 MHZ, com memória RAM de 64 MB e disco rígido de 15GB. O desempenho da SA foi avaliado de acordo com as medidas de eficácia e eficiência. Os diferentes valores ou categorias dos parâmetros da SA foram testados e comparados quanto aos seus efeitos na eficácia do algoritmo. A seleção da melhor configuração de parâmetros foi feita com o teste L&O, a 1% de probabilidade, e as análises foram realizadas através de estatísticas descritivas. A melhor configuração de parâmetros propiciou à SA eficácia média de 95,36%, valor mínimo de 83,66%, valor máximo de 100% e coeficiente de variação igual a 3,18% do ótimo matemático obtido pelo algoritmo exato branch and bound. Para o problema de maior porte, a eficiência da SA foi dez vezes superior à eficiência do algoritmo exato branch and bound. O bom desempenho desta heurística reforçou as conclusões, tiradas em outros trabalhos, do seu enorme potencial para resolver importantes problemas de gerenciamento florestal de difícil solução pelos instrumentos computacionais da atualidade. The objectives of this work were to develop and test an algorithm based on the Simulated Annealing (SA) metaheuristic to solve problems of forest management with integer constraints. The SA algorithm developed was tested in five problems containing between 93 and 423 decision variables, periodically subject to singularity constraints, minimum

  5. Application of FastSLAM based on simulated annealing variance reduction in navigation and localization of AUV%模拟退火方差缩减的FastSLAM算法在AUV导航定位中的应用

    Institute of Scientific and Technical Information of China (English)

    王宏健; 王晶; 曲丽萍; 刘振业

    2013-01-01

    A FastSLAM algorithm based on variance reduction of particle weights is presented in order to address the decrease in estimation accuracy of AUV (autonomous underwater vehicle) localization caused by particle degeneracy and by the sample impoverishment that results from resampling in standard FastSLAM. The variance of the particle weights is decreased by generating an adaptive exponential fading factor, which comes from the idea of the cooling function in simulated annealing. The effective particle number is thereby increased, and this step replaces the resampling of standard FastSLAM. The kinematic model of the AUV, the feature model and the measurement models of the sensors are established, and feature extraction is performed with the Hough transform. An experiment on AUV simultaneous localization and mapping using the simulated annealing variance reduction FastSLAM was carried out on sea trial data. The results indicate that the method described in this paper maintains the diversity of the particles while weakening degeneracy, and at the same time enhances the accuracy and stability of the AUV navigation and localization system.%由于标准FastSLAM中存在粒子退化及重采样引起的粒子贫化,导致自主水下航行器(AUV)位置估计精度严重下降的问题,提出了一种基于粒子权值方差缩减的FastSLAM算法.利用模拟退火的降温函数产生自适应指数渐消因子来降低粒子权值的方差,进而增加有效粒子数,以此取代标准FastSLAM中的重采样步骤.建立AUV的运动学模型、特征模型及传感器的测量模型,通过霍夫变换进行特征提取.利用方差缩减FastSLAM算法,基于海试数据进行了AUV同步定位与构图仿真试验,结果表明所提方法能够保证粒子的多样性,并且降低粒子的退化程度,提高了AUV定位与地图构建系统的准确性及稳定性.
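
    The weight-flattening idea described above can be sketched as follows: an exponential fading factor, decreasing like an SA cooling schedule, is applied as an exponent to the particle weights, which reduces their variance and raises the effective particle number. The cooling law and constants are assumptions, not the paper's exact formulation.

        import numpy as np

        def anneal_weights(weights, step, t0=1.0, alpha=0.98, lam_min=0.1):
            """Flatten particle weights with an exponential fading factor."""
            lam = max(t0 * alpha ** step, lam_min)   # fading factor in (0, 1]
            w = np.asarray(weights, dtype=float) ** lam
            w /= w.sum()                             # renormalize the flattened weights
            n_eff = 1.0 / np.sum(w ** 2)             # effective particle number
            return w, n_eff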

  6. Grazing incidence X-ray diffraction study of the tilted phases of Langmuir films: Determination of molecular conformations using simulated annealing

    International Nuclear Information System (INIS)

    We have analyzed grazing incidence X-ray diffraction (GIXD) data from condensed phases of Langmuir films of long-chain fatty acids at the air-water interface using a new method consisting of a careful extraction of the structure factors followed by fitting of molecular parameters using simulated annealing. We show that the information contained in GIXD spectra is enough to obtain near-atomic structural information. In particular, we directly determine the orientation of the chain backbone planes and of the carboxylic headgroups, and we evaluate chain conformation defects

  7. Faster annealing schedules for quantum annealing

    OpenAIRE

    Morita, Satoshi

    2007-01-01

    New annealing schedules for quantum annealing are proposed based on the adiabatic theorem. These schedules exhibit faster decrease of the excitation probability than a linear schedule. To derive this conclusion, the asymptotic form of the excitation probability for quantum annealing is explicitly obtained in the limit of long annealing time. Its first-order term, which is inversely proportional to the square of the annealing time, is shown to be determined only by the information at the initi...

  8. Simulating adaptive wood harvest in a changing climate

    Science.gov (United States)

    Yousefpour, Rasoul; Nabel, Julia; Pongratz, Julia

    2016-04-01

    The world's forests experience substantial carbon exchange fluxes between land and atmosphere. Large carbon sinks occur in response to changes in environmental conditions (such as climate change and increased atmospheric CO2 concentrations), removing about one quarter of current anthropogenic CO2 emissions. Large sinks also occur due to regrowth of forest on areas of agricultural abandonment or forest management. Forest management, on the other hand, also leads to substantial amounts of carbon being eventually released to the atmosphere. Both sinks and sources attributable to forests are therefore dependent on the intensity of management. Forest management in turn depends on the availability of resources, which is influenced by environmental conditions and the sustainability of the management systems applied. Estimating future carbon fluxes therefore requires accounting for the interaction of environmental conditions, forest growth, and management. However, this interaction is not fully captured by current modeling approaches: Earth system models depict in detail interactions between climate, the carbon cycle, and vegetation growth, but use prescribed information on management. Resource needs and land management, however, are simulated by Integrated Assessment Models that typically only have coarse representations of the influence of environmental changes on vegetation growth and are typically based on the demand for wood driven by regional population growth and energy needs. Here we present a study that provides the link between environmental conditions, forest growth, and management. We extend the land component JSBACH of the Max Planck Institute's Earth system model (MPI-ESM) to simulate potential wood harvest in response to altered growth conditions and thus adaptive to changing climate and CO2 conditions. We apply the altered model to estimate potential wood harvest for future climates (representative concentration pathways, RCPs) for the management scenario of

  9. A pixel selection rule based on the number of different-phase neighbours for the simulated annealing reconstruction of sandstone microstructure.

    Science.gov (United States)

    Tang, T; Teng, Q; He, X; Luo, D

    2009-06-01

    Sandstone reservoir is one of the main types of oil and gas reservoirs in China. It has porous microstructure, which directly affects the transport properties of a sandstone. Hence, the study of porous microstructure is important to the exploration and exploitation of oil and gas. Three-dimensional microstructure of a sandstone can be reconstructed using the simulated annealing method based on statistical properties of its two-dimensional micrograph. The aim of reconstruction is to minimize the discrepancy between the statistical properties of the reconstructed microstructure and those of the two-dimensional image. To accelerate the rate of convergence, we proposed a different-phase neighbours (DPNs)-based pixel selection rule to replace the random pixel selection rule of the simulated annealing reconstruction. In this rule, pixels with the largest number of DPNs have the largest selection probability. The selection probabilities of other pixels are proportional to their DPNs. Microstructure reconstructed with the DPNs-based rule is compared with those with the random selection rule and two other biased pixel selection rules. The DPNs-based rule is the most effective in enhancing convergence. Permeability of the microstructure reconstructed with the DPNs-based rule is estimated by the Kozeny-Carman formula and is in good agreement with the one reconstructed with the random pixel selection rule.
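
A small sketch of the selection rule described above, assuming a binary pore/solid image and a 4-connected neighbourhood (the authors' implementation details, such as the neighbourhood definition, may differ):

```python
# Illustrative DPN-weighted pixel selection for a two-phase image: each pixel's
# selection probability is proportional to its number of different-phase
# neighbours (DPNs), so interface pixels are picked most often.
import numpy as np

rng = np.random.default_rng(0)
img = rng.integers(0, 2, size=(64, 64))   # toy two-phase microstructure

def dpn_map(image):
    """Count 4-connected neighbours whose phase differs from the centre pixel."""
    dpn = np.zeros_like(image, dtype=int)
    for dy, dx in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
        shifted = np.roll(np.roll(image, dy, axis=0), dx, axis=1)
        dpn += (shifted != image).astype(int)
    return dpn

def select_pixel(image, rng):
    """Draw a pixel index with probability proportional to its DPN count."""
    weights = dpn_map(image).ravel().astype(float)
    if weights.sum() == 0:                 # uniform image: fall back to uniform choice
        weights[:] = 1.0
    probs = weights / weights.sum()
    flat = rng.choice(image.size, p=probs)
    return np.unravel_index(flat, image.shape)

print("selected pixel:", select_pixel(img, rng))
```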

  10. Job shop scheduling based on Petri net and simulated annealing algorithm

    Institute of Scientific and Technical Information of China (English)

    齐继阳; 竺长安

    2011-01-01

    To solve the job shop scheduling problem, an enhanced timed transition Petri net and a simulated annealing algorithm are combined in this paper. The Petri net is used to describe the scheduling problem, with a transition sequence representing a solution; the simulated annealing algorithm then searches for the optimal or near-optimal solution. A memory function is introduced into the search process to avoid repeated searching, which improves both the efficiency and the quality of the solution. Finally, a simulation example demonstrates the effectiveness of the method.
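
The combination of an annealing search over transition sequences with a memory that blocks revisits can be sketched as follows. The objective below is a toy tardiness function over a job permutation, standing in for the Petri-net-based evaluation used in the paper:

```python
# Sketch: simulated annealing over a job/transition sequence with a simple
# memory set; candidate sequences already visited are skipped, avoiding
# repeated search. Problem data is invented for illustration.
import math
import random

random.seed(0)
n_jobs = 8
proc = [random.randint(1, 9) for _ in range(n_jobs)]      # toy processing times
due = [random.randint(5, 40) for _ in range(n_jobs)]      # toy due dates

def tardiness(seq):
    t, total = 0, 0
    for j in seq:
        t += proc[j]
        total += max(0, t - due[j])
    return total

def swap_neighbour(seq):
    i, j = random.sample(range(len(seq)), 2)
    new = seq[:]
    new[i], new[j] = new[j], new[i]
    return new

current = list(range(n_jobs))
cur_cost = tardiness(current)
best, best_cost = current[:], cur_cost
visited = {tuple(current)}                                 # memory of evaluated sequences
temp = 10.0
while temp > 0.05:
    for _ in range(50):
        cand = swap_neighbour(current)
        if tuple(cand) in visited:                         # memory function: skip repeats
            continue
        visited.add(tuple(cand))
        cost = tardiness(cand)
        if cost < cur_cost or random.random() < math.exp((cur_cost - cost) / temp):
            current, cur_cost = cand, cost
            if cur_cost < best_cost:
                best, best_cost = current[:], cur_cost
    temp *= 0.9
print("best sequence:", best, "total tardiness:", best_cost)
```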

  11. Quantum simulations of nuclei and nuclear pasta with the multiresolution adaptive numerical environment for scientific simulations

    Science.gov (United States)

    Sagert, I.; Fann, G. I.; Fattoyev, F. J.; Postnikov, S.; Horowitz, C. J.

    2016-05-01

    Background: Neutron star and supernova matter at densities just below the nuclear matter saturation density is expected to form a lattice of exotic shapes. These so-called nuclear pasta phases are caused by Coulomb frustration. Their elastic and transport properties are believed to play an important role for thermal and magnetic field evolution, rotation, and oscillation of neutron stars. Furthermore, they can impact neutrino opacities in core-collapse supernovae. Purpose: In this work, we present proof-of-principle three-dimensional (3D) Skyrme Hartree-Fock (SHF) simulations of nuclear pasta with the Multi-resolution ADaptive Numerical Environment for Scientific Simulations (MADNESS). Methods: We perform benchmark studies of 16O, 208Pb, and 238U nuclear ground states and calculate binding energies via 3D SHF simulations. Results are compared with experimentally measured binding energies as well as with theoretically predicted values from an established SHF code. The nuclear pasta simulation is initialized in the so-called waffle geometry as obtained by the Indiana University Molecular Dynamics (IUMD) code. The size of the unit cell is 24 fm with an average density of about ρ =0.05 fm-3 , proton fraction of Yp=0.3 , and temperature of T =0 MeV. Results: Our calculations reproduce the binding energies and shapes of light and heavy nuclei with different geometries. For the pasta simulation, we find that the final geometry is very similar to the initial waffle state. We compare calculations with and without spin-orbit forces. We find that while subtle differences are present, the pasta phase remains in the waffle geometry. Conclusions: Within the MADNESS framework, we can successfully perform calculations of inhomogeneous nuclear matter. By using pasta configurations from IUMD it is possible to explore different geometries and test the impact of self-consistent calculations on the latter.

  12. A hybrid genetic-simulated annealing algorithm for optimization of hydraulic manifold blocks

    Institute of Scientific and Technical Information of China (English)

    刘万辉; 田树军; 贾春强; 曹宇宁

    2008-01-01

    This paper establishes a mathematical model of multi-objective optimization with behavior constraints in solid space, based on the problem of optimal design of hydraulic manifold blocks (HMB). Because the genetic algorithm (GA) has limited local search ability when solving a massive combinatorial optimization problem, simulated annealing (SA) is combined with it, multi-parameter concatenated coding is adopted, and a memory function is added, forming a hybrid genetic-simulated annealing algorithm with memory. Examples show that the modified algorithm improves both the local search ability in the solution space and the solution quality.

  13. Numerical modeling of two-dimensional heat-transfer and temperature-based calibration using simulated annealing optimization method: Application to gas metal arc welding

    Directory of Open Access Journals (Sweden)

    Bjelić Mišo B.

    2016-01-01

    Full Text Available Simulation models of welding processes allow us to predict the influence of welding parameters on the temperature field during welding and, through the temperature field, their influence on weld geometry and microstructure. This article presents a numerical, finite-difference based model of heat transfer during welding of thin sheets. Unfortunately, the accuracy of the model depends on many parameters which cannot be prescribed accurately. In order to solve this problem, we have used the simulated annealing optimization method in combination with the presented numerical model. In this way we were able to determine uncertain values of the heat source parameters, arc efficiency, emissivity and enhanced conductivity. The calibration procedure was carried out using thermocouple measurements of temperatures during welding of P355GH steel. The obtained results were used as input for a simulation run, which showed that the presented calibration procedure can significantly improve the reliability of the heat transfer model. [Acknowledgements: National CEEPUS Office of the Czech Republic (project CIII-HR-0108-07-1314) and the Ministry of Education and Science of the Republic of Serbia (project TR37020)]
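
The calibration loop described above amounts to simulated annealing over a few uncertain physical parameters, scored by the mismatch between simulated and measured temperatures. A hedged sketch follows, with a toy surrogate in place of the finite-difference welding model and synthetic "measurements":

```python
# Sketch: simulated annealing adjusts uncertain parameters (here, arc efficiency
# and an enhanced-conductivity factor) to minimize the squared error between
# simulated and measured temperatures. The "model" is a toy surrogate.
import math
import random

random.seed(3)

def simulate_peak_temp(efficiency, k_factor, distance_mm):
    # toy surrogate: peak temperature falls off with distance, scaled by parameters
    return 25.0 + 1500.0 * efficiency * math.exp(-distance_mm / (10.0 * k_factor))

distances = [5.0, 10.0, 15.0, 20.0]
true_params = (0.8, 1.4)
measured = [simulate_peak_temp(*true_params, d) + random.gauss(0, 3) for d in distances]

def sse(params):
    eff, kf = params
    return sum((simulate_peak_temp(eff, kf, d) - m) ** 2 for d, m in zip(distances, measured))

current = (0.5, 1.0)
cur_err = sse(current)
best, best_err = current, cur_err
temp = 1000.0
while temp > 1e-3:
    cand = (min(1.0, max(0.1, current[0] + random.gauss(0, 0.05))),
            min(3.0, max(0.5, current[1] + random.gauss(0, 0.1))))
    err = sse(cand)
    if err < cur_err or random.random() < math.exp((cur_err - err) / temp):
        current, cur_err = cand, err
        if cur_err < best_err:
            best, best_err = current, cur_err
    temp *= 0.995
print("calibrated (efficiency, k_factor):", tuple(round(x, 3) for x in best))
```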

  14. Optimization of pulsed current GTAW process parameters for sintered hot forged AISI 4135 P/M steel welds by simulated annealing and genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Joseph, Joby; Muthukumaran, S. [National Institute of Technology, Tamil Nadu (India)

    2016-01-15

    Abundant improvements have occurred in materials handling, especially in metal joining. Pulsed current gas tungsten arc welding (PCGTAW) is one of the consequential fusion techniques. In this work, PCGTAW of AISI 4135 steel engendered through powder metallurgy (P/M) has been executed, and the process parameters have been investigated using Taguchi's L9 orthogonal array. The results show that the peak current (Ip), gas flow rate (GFR), welding speed (WS) and base current (Ib) are the critical parameters that strongly determine the tensile strength (TS) as well as the percentage of elongation (% Elong) of the joint. The practical value of applying the genetic algorithm (GA) and simulated annealing (SA) to the PCGTAW process has been verified by calculating the deviation between predicted and experimental welding process parameters.

  15. 1-(2-furoyl)-3,3-(diphenyl)thiourea: spectroscopic characterization and structural study from X-ray powder diffraction using simulated annealing

    Energy Technology Data Exchange (ETDEWEB)

    Estevez H, O.; Duque, J. [Universidad de La Habana, Instituto de Ciencia y Tecnologia de Materiales, 10400 La Habana (Cuba); Rodriguez H, J. [UNAM, Instituto de Investigaciones en Materiales, 04510 Mexico D. F. (Mexico); Yee M, H., E-mail: oestevezh@yahoo.com [Instituto Politecnico Nacional, Escuela Superior de Fisica y Matematicas, 07738 Mexico D. F. (Mexico)

    2015-07-01

    1-Furoyl-3,3-diphenylthiourea (FDFT) was synthesized and characterized by FTIR, ¹H and ¹³C NMR and ab initio X-ray powder structure analysis. FDFT crystallizes in the monoclinic space group P2₁ with a = 12.691(1), b = 6.026(2), c = 11.861(1) Å, β = 117.95(2)° and V = 801.5(3) Å³. The crystal structure has been determined from laboratory X-ray powder diffraction data using a direct-space global optimization strategy (simulated annealing) followed by Rietveld refinement. The thiourea group makes a dihedral angle of 73.8(6)° with the furoyl group. In the crystal structure, molecules are linked by van der Waals interactions, forming one-dimensional chains along the a axis. (Author)

  16. A Hybrid Genetic-Simulated Annealing Algorithm for the Location-Inventory-Routing Problem Considering Returns under E-Supply Chain Environment

    Directory of Open Access Journals (Sweden)

    Yanhui Li

    2013-01-01

    Full Text Available Facility location, inventory control, and vehicle route scheduling are critical and highly related problems in the design of logistics systems for e-business. Meanwhile, the return ratio in Internet sales is significantly higher than in traditional business. Much of the returned merchandise has no quality defects and can reenter sales channels after a simple repackaging process. Focusing on this problem in e-commerce logistics systems, we formulate a location-inventory-routing problem model with no-quality-defect returns. To solve this NP-hard problem, an effective hybrid genetic simulated annealing algorithm (HGSAA) is proposed. Results of numerical examples show that HGSAA outperforms GA on computing time, optimal solution, and computing stability. The proposed model is very useful in helping managers make the right decisions in an e-supply chain environment.

  17. A hybrid genetic-simulated annealing algorithm for the location-inventory-routing problem considering returns under e-supply chain environment.

    Science.gov (United States)

    Li, Yanhui; Guo, Hao; Wang, Lin; Fu, Jing

    2013-01-01

    Facility location, inventory control, and vehicle route scheduling are critical and highly related problems in the design of logistics systems for e-business. Meanwhile, the return ratio in Internet sales is significantly higher than in traditional business. Much of the returned merchandise has no quality defects and can reenter sales channels after a simple repackaging process. Focusing on this problem in e-commerce logistics systems, we formulate a location-inventory-routing problem model with no-quality-defect returns. To solve this NP-hard problem, an effective hybrid genetic simulated annealing algorithm (HGSAA) is proposed. Results of numerical examples show that HGSAA outperforms GA on computing time, optimal solution, and computing stability. The proposed model is very useful in helping managers make the right decisions in an e-supply chain environment.

  18. Definition of general topological equivalence in protein structures. A procedure involving comparison of properties and relationships through simulated annealing and dynamic programming.

    Science.gov (United States)

    Sali, A; Blundell, T L

    1990-03-20

    A protein is defined as an indexed string of elements at each level in the hierarchy of protein structure: sequence, secondary structure, super-secondary structure, etc. The elements, for example, residues or secondary structure segments such as helices or beta-strands, are associated with a series of properties and can be involved in a number of relationships with other elements. Element-by-element dissimilarity matrices are then computed and used in the alignment procedure based on the sequence alignment algorithm of Needleman & Wunsch, expanded by the simulated annealing technique to take into account relationships as well as properties. The utility of this method for exploring the variability of various aspects of protein structure and for comparing distantly related proteins is demonstrated by multiple alignment of serine proteinases, aspartic proteinase lobes and globins.
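
The dynamic-programming core referenced above (the Needleman & Wunsch algorithm) can be sketched for a pair of property strings; the simulated annealing layer that additionally folds in relationships between elements is omitted here:

```python
# Minimal Needleman-Wunsch-style dynamic programming sketch, aligning two
# "property strings" with a simple per-element dissimilarity and gap penalty.
def align(a, b, gap=1.0):
    n, m = len(a), len(b)
    # dissimilarity between elements: 0 if identical, 1 otherwise (assumed)
    def d(x, y):
        return 0.0 if x == y else 1.0
    cost = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        cost[i][0] = i * gap
    for j in range(1, m + 1):
        cost[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost[i][j] = min(cost[i - 1][j - 1] + d(a[i - 1], b[j - 1]),  # match/mismatch
                             cost[i - 1][j] + gap,                        # gap in b
                             cost[i][j - 1] + gap)                        # gap in a
    return cost[n][m]

# toy secondary-structure strings: H = helix, E = strand, C = coil
print(align("HHHCCEEE", "HHCCCEE"))
```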

  19. A Pseudo-Parallel Genetic Algorithm Integrating Simulated Annealing for Stochastic Location-Inventory-Routing Problem with Consideration of Returns in E-Commerce

    Directory of Open Access Journals (Sweden)

    Bailing Liu

    2015-01-01

    Full Text Available Facility location, inventory control, and vehicle route scheduling are three key issues to be settled in the design of logistics systems for e-commerce. Due to the online shopping features of e-commerce, customer returns are much more common than in traditional commerce. This paper studies a three-phase supply chain distribution system consisting of one supplier, a set of retailers, and a single type of product with a continuous review (Q, r) inventory policy. We formulate a stochastic location-inventory-routing problem (LIRP) model with no-quality-defect returns. To solve this NP-hard problem, a pseudo-parallel genetic algorithm integrating simulated annealing (PPGASA) is proposed. The computational results show that PPGASA outperforms GA on optimal solution, computing time, and computing stability.

  20. Synthesis of Linear Array of Parallel Dipole Antennas with Minimum Standing Wave Ratio Using Simulated Annealing and Particle Swarm Optimization approach

    Directory of Open Access Journals (Sweden)

    Banani Basu

    2010-05-01

    Full Text Available In this paper, we propose a technique based on two evolutionary algorithms, simulated annealing and particle swarm optimization, to design a linear array of half-wavelength-long parallel dipole antennas that will generate a pencil beam in the horizontal plane with minimum standing wave ratio (SWR) and fixed side lobe level (SLL). The dynamic range ratio of the current amplitude distribution is kept at a fixed value. Two different methods have been proposed with different inter-element spacing but the same current amplitude distribution. The first uses a fixed geometry and optimizes the excitation distribution on it. In the second case, further reduction of SWR is achieved via optimization of the inter-element spacing while keeping the amplitude distribution the same as before. The coupling effect between the elements is analyzed using the induced EMF method and minimized in terms of SWR. Numerical results obtained from SA are validated by comparison with results obtained using PSO.

  1. Optimization of pulsed current GTAW process parameters for sintered hot forged AISI 4135 P/M steel welds by simulated annealing and genetic algorithm

    International Nuclear Information System (INIS)

    Abundant improvements have occurred in materials handling, especially in metal joining. Pulsed current gas tungsten arc welding (PCGTAW) is one of the consequential fusion techniques. In this work, PCGTAW of AISI 4135 steel engendered through powder metallurgy (P/M) has been executed, and the process parameters have been investigated using Taguchi's L9 orthogonal array. The results show that the peak current (Ip), gas flow rate (GFR), welding speed (WS) and base current (Ib) are the critical parameters that strongly determine the tensile strength (TS) as well as the percentage of elongation (% Elong) of the joint. The practical value of applying the genetic algorithm (GA) and simulated annealing (SA) to the PCGTAW process has been verified by calculating the deviation between predicted and experimental welding process parameters

  2. A hybrid genetic-simulated annealing algorithm for the location-inventory-routing problem considering returns under e-supply chain environment.

    Science.gov (United States)

    Li, Yanhui; Guo, Hao; Wang, Lin; Fu, Jing

    2013-01-01

    Facility location, inventory control, and vehicle route scheduling are critical and highly related problems in the design of logistics systems for e-business. Meanwhile, the return ratio in Internet sales is significantly higher than in traditional business. Much of the returned merchandise has no quality defects and can reenter sales channels after a simple repackaging process. Focusing on this problem in e-commerce logistics systems, we formulate a location-inventory-routing problem model with no-quality-defect returns. To solve this NP-hard problem, an effective hybrid genetic simulated annealing algorithm (HGSAA) is proposed. Results of numerical examples show that HGSAA outperforms GA on computing time, optimal solution, and computing stability. The proposed model is very useful in helping managers make the right decisions in an e-supply chain environment. PMID:24489489

  3. Computerized adaptive measurement of depression: A simulation study

    Directory of Open Access Journals (Sweden)

    Mammen Oommen

    2004-05-01

    Full Text Available Abstract Background Efficient, accurate instruments for measuring depression are increasingly important in clinical practice. We developed a computerized adaptive version of the Beck Depression Inventory (BDI). We examined its efficiency and its usefulness in identifying Major Depressive Episodes (MDE) and in measuring depression severity. Methods Subjects were 744 participants in research studies in which each subject completed both the BDI and the SCID. In addition, 285 patients completed the Hamilton Depression Rating Scale. Results The adaptive BDI had an AUC as an indicator of a SCID diagnosis of MDE of 88%, equivalent to the full BDI. The adaptive BDI asked fewer questions than the full BDI (5.6 versus 21 items). The adaptive latent depression score correlated r = .92 with the BDI total score, and the latent depression score correlated more highly with the Hamilton (r = .74) than the BDI total score did (r = .70). Conclusions Adaptive testing for depression may provide greatly increased efficiency without loss of accuracy in identifying MDE or in measuring depression severity.
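
Computerized adaptive testing of this kind typically alternates item selection by maximum information with an update of the latent severity estimate. A self-contained sketch under a simple 2PL IRT model with an invented item bank (not the actual adaptive BDI items or parameters):

```python
# Sketch of an adaptive-testing loop: administer the item with maximum Fisher
# information at the current severity estimate, then update the estimate over a
# grid with a standard-normal prior. All item parameters are synthetic.
import numpy as np

rng = np.random.default_rng(0)
a = rng.uniform(0.8, 2.0, size=21)      # toy discrimination parameters
b = rng.uniform(-2.0, 2.0, size=21)     # toy difficulty parameters
true_theta = 1.0                        # simulated respondent's severity level

def p_endorse(theta, a_i, b_i):
    return 1.0 / (1.0 + np.exp(-a_i * (theta - b_i)))

def fisher_information(theta, a_i, b_i):
    p = p_endorse(theta, a_i, b_i)
    return a_i ** 2 * p * (1 - p)

theta, asked, responses = 0.0, [], []
for _ in range(6):                       # fixed-length CAT for simplicity
    available = [i for i in range(len(a)) if i not in asked]
    item = max(available, key=lambda i: fisher_information(theta, a[i], b[i]))
    asked.append(item)
    responses.append(rng.random() < p_endorse(true_theta, a[item], b[item]))
    # crude EAP-style update over a grid of theta values
    grid = np.linspace(-4, 4, 161)
    like = np.ones_like(grid)
    for i, r in zip(asked, responses):
        p = p_endorse(grid, a[i], b[i])
        like *= p if r else (1 - p)
    post = like * np.exp(-grid ** 2 / 2)         # standard-normal prior
    theta = float(np.sum(grid * post) / np.sum(post))
print("items asked:", asked, "estimated theta:", round(theta, 2))
```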

  4. A Model for Capturing Team Adaptation in Simulated Emergencies

    DEFF Research Database (Denmark)

    Paltved, Charlotte; Musaeus, Peter

    2013-01-01

    changes, adjust priorities and implement adjusted strategies were more likely to perform successfully in environments with unforeseen changes, in other words adaptability is the generalization of trained knowledge and skills to new, more difficult and more complex tasks. An interpretative approach...... events like closed-loop communication.1 A more nuanced understanding of team communication has the potential to enhance scholarship in interprofessional endeavours. In high risk environments, team performance depends on the ability of teams to quickly alter actions in response to rapidly changing...... conditions.2,3 However, research on team adaptation in healthcare is scarce.4 In this study, team adaptation in medical emergency teams was explored through the quality and content of updates. Updating is an ongoing process of incorporating interpretations based on new information with current beliefs.5...

  5. A New Heuristic Providing an Effective Initial Solution for a Simulated Annealing approach to Energy Resource Scheduling in Smart Grids

    DEFF Research Database (Denmark)

    Sousa, Tiago M; Morais, Hugo; Castro, R.;

    2014-01-01

    An intensive use of dispersed energy resources is expected for future power systems, including distributed generation, especially based on renewable sources, and electric vehicles. The system operation methods and tool must be adapted to the increased complexity, especially the optimal resource s...

  6. Adaptive resolution simulation of supramolecular water : The concurrent making, breaking, and remaking of water bundles

    NARCIS (Netherlands)

    Zavadlav, Julija; Marrink, Siewert J; Praprotnik, Matej

    2016-01-01

    The adaptive resolution scheme (AdResS) is a multiscale molecular dynamics simulation approach that can concurrently couple atomistic (AT) and coarse-grained (CG) resolution regions, i.e., the molecules can freely adapt their resolution according to their current position in the system. Coupling to

  7. Developing adaptive user interfaces using a game-based simulation environment

    NARCIS (Netherlands)

    Brake, G.M. te; Greef, T.E. de; Lindenberg, J.; Rypkema, J.A.; Smets-Noor, N.J.J.M.

    2006-01-01

    In dynamic settings, user interfaces can provide more optimal support if they adapt to the context of use. Providing adaptive user interfaces to first responders may therefore be fruitful. A cognitive engineering method that incorporates development iterations in both a simulated and a real-world en

  8. Toward a practical method for adaptive QM/MM simulations

    NARCIS (Netherlands)

    R.E. Bulo; B. Ensing; J. Sikkema; L. Visscher

    2009-01-01

    We present an accurate adaptive multiscale molecular dynamics method that will enable the detailed study of large molecular systems that mimic experiment. The method treats the reactive regions at the quantum mechanical level and the inactive environment regions at lower levels of accuracy, while at

  9. Parallel Simulation of the Shallow Water Equations on Structured Dynamically Adaptive Triangular Grids.

    OpenAIRE

    Vigh, Csaba Attila

    2013-01-01

    One of the most important computational challenges in the context of the numerical treatment of Partial Differential Equations is the generation, management, and dynamic adaptivity of grids. Dynamic adaptivity is extremely important in applications that require frequent changes of the grid pattern during a simulation run. One such application example is Tsunami simulation, where waves have to be tracked with highly resolved local grids. Arbitrary unstructured grids that can ...

  10. Application of simulated annealing in simulation and optimization of drying process of Zea mays malt

    Directory of Open Access Journals (Sweden)

    Marco A. C. Benvenga

    2011-10-01

    Full Text Available Kinetic simulation and drying process optimization of corn malt by simulated annealing (SA), for estimation of the temperature and time parameters that preserve maximum amylase activity in the obtained product, are presented here. Germinated corn seeds were dried at 54-76 °C in a convective dryer, with occasional measurement of moisture content and enzymatic activity. The experimental data obtained were submitted to modeling. Simulation and optimization of the drying process were carried out using the SA method, a randomized improvement algorithm analogous to the physical annealing process. Results showed that the seeds were dry after 3 h to 5 h of drying. Among the models used in this work, the kinetic model of water diffusion into corn seeds showed the best fit. Drying temperature and time showed a quadratic influence on the enzymatic activity. Optimization through SA found the best condition at 54 °C and between 5.6 h and 6.4 h of drying, with specific activity values in the corn malt of 5.26±0.06 SKB/mg at 15.69±0.10% remaining moisture.
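
Since the abstract reports a quadratic influence of temperature and time on enzymatic activity, the optimization step can be illustrated as simulated annealing over those two variables on an assumed concave quadratic surface (the coefficients below are made up, not the fitted model from the study):

```python
# Sketch: simulated annealing searching drying temperature and time on a toy
# quadratic response surface for remaining amylase activity.
import math
import random

random.seed(7)

def activity(temp_c, time_h):
    # toy concave quadratic with a maximum near 55 degC and 6 h
    return 16.0 - 0.02 * (temp_c - 55.0) ** 2 - 0.5 * (time_h - 6.0) ** 2

bounds = {"temp": (54.0, 76.0), "time": (3.0, 8.0)}
state = (65.0, 4.0)
cur_val = activity(*state)
best, best_val = state, cur_val
T = 5.0
while T > 1e-3:
    cand = (min(bounds["temp"][1], max(bounds["temp"][0], state[0] + random.gauss(0, 1.0))),
            min(bounds["time"][1], max(bounds["time"][0], state[1] + random.gauss(0, 0.3))))
    val = activity(*cand)
    if val > cur_val or random.random() < math.exp((val - cur_val) / T):
        state, cur_val = cand, val
        if cur_val > best_val:
            best, best_val = state, cur_val
    T *= 0.995
print("best (temperature degC, time h):", tuple(round(x, 1) for x in best))
```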

  11. Computer simulation program is adaptable to industrial processes

    Science.gov (United States)

    Schultz, F. E.

    1966-01-01

    The Reaction kinetics ablation program /REKAP/, developed to simulate ablation of various materials, provides mathematical formulations for computer programs which can simulate certain industrial processes. The programs are based on the use of nonsymmetrical difference equations that are employed to solve complex partial differential equation systems.

  12. The New England Climate Adaptation Project: Enhancing Local Readiness to Adapt to Climate Change through Role-Play Simulations

    Science.gov (United States)

    Rumore, D.; Kirshen, P. H.; Susskind, L.

    2014-12-01

    Despite scientific consensus that the climate is changing, local efforts to prepare for and manage climate change risks remain limited. How can we raise concern about climate change risks and enhance local readiness to adapt to climate change's effects? In this presentation, we will share the lessons learned from the New England Climate Adaptation Project (NECAP), a participatory action research project that tested science-based role-play simulations as a tool for educating the public about climate change risks and simulating collective risk management efforts. NECAP was a 2-year effort involving the Massachusetts Institute of Technology, the Consensus Building Institute, the National Estuarine Research Reserve System, and four coastal New England municipalities. During 2012-2013, the NECAP team produced downscaled climate change projections, a summary risk assessment, and a stakeholder assessment for each partner community. Working with local partners, we used these assessments to create a tailored, science-based role-play simulation for each site. Through a series of workshops in 2013, NECAP engaged between 115 and 170 diverse stakeholders and members of the public in each partner municipality in playing the simulation and a follow-up conversation about local climate change risks and possible adaptation strategies. Data were collected through before-and-after surveys administered to all workshop participants, follow-up interviews with 25 percent of workshop participants, public opinion polls conducted before and after our intervention, and meetings with public officials. This presentation will report our research findings and explain how science-based role-play simulations can be used to help communicate local climate change risks and enhance local readiness to adapt.

  13. Annealing evolutionary stochastic approximation Monte Carlo for global optimization

    KAUST Repository

    Liang, Faming

    2010-04-08

    In this paper, we propose a new algorithm, the so-called annealing evolutionary stochastic approximation Monte Carlo (AESAMC) algorithm as a general optimization technique, and study its convergence. AESAMC possesses a self-adjusting mechanism, whose target distribution can be adapted at each iteration according to the current samples. Thus, AESAMC falls into the class of adaptive Monte Carlo methods. This mechanism also makes AESAMC less trapped by local energy minima than nonadaptive MCMC algorithms. Under mild conditions, we show that AESAMC can converge weakly toward a neighboring set of global minima in the space of energy. AESAMC is tested on multiple optimization problems. The numerical results indicate that AESAMC can potentially outperform simulated annealing, the genetic algorithm, annealing stochastic approximation Monte Carlo, and some other metaheuristics in function optimization. © 2010 Springer Science+Business Media, LLC.

  14. Hydrological Modelling and Sensitivity Analysis Using Topmodel and Simulated Annealing Techniques.application To The Haute-mentue Catchment(switzerland).

    Science.gov (United States)

    Balin Talamba, D.; Higy, C.; Joerin, C.; Musy, A.

    The paper presents an application of hydrological modelling for the Haute-Mentue catchment, located in western Switzerland. A simplified version of Topmodel, developed in a Labview programming environment, was applied with the aim of modelling the hydrological processes of this catchment. Previous research carried out in this region outlined the importance of environmental tracers in studying the hydrological behaviour, and substantial knowledge has been accumulated concerning the mechanisms responsible for runoff generation. In conformity with the theoretical constraints, Topmodel was applied to a Haute-Mentue sub-catchment where tracing experiments consistently showed low contributions of soil water during flood events. The model was applied for two humid periods in 1998. First, the model was calibrated to provide the best estimates of total runoff. However, the simulated components (groundwater and rapid flow) deviated considerably from the reality indicated by the tracing experiments. Thus, a new calibration was performed including the additional information given by the environmental tracing. The calibration of the model was done using simulated annealing (SA) techniques, which are easy to implement and statistically allow convergence to a global minimum. The only problem is that the method is time- and computation-intensive. To improve this, a version of SA known as very fast simulated annealing (VFSA) was used. The principles are the same as for SA: the random search is guided by a certain probability distribution and the acceptance criterion is that of SA, but VFSA better takes into account the range of variation of each parameter. Practice with Topmodel showed that the energy function has different sensitivities along different dimensions of the parameter space. The VFSA algorithm allows differentiated search in relation with the

  15. Enzo+Moray: Radiation Hydrodynamics Adaptive Mesh Refinement Simulations with Adaptive Ray Tracing

    CERN Document Server

    Wise, John H

    2010-01-01

    We describe a photon-conserving radiative transfer algorithm, using a spatially-adaptive ray tracing scheme, and its parallel implementation into the adaptive mesh refinement (AMR) cosmological hydrodynamics code, Enzo. By coupling the solver with the energy equation and non-equilibrium chemistry network, our radiation hydrodynamics framework can be utilised to study a broad range of astrophysical problems, such as stellar and black hole (BH) feedback. Inaccuracies can arise from large timesteps and poor sampling, therefore we devised an adaptive time-stepping scheme and a fast approximation of the optically-thin radiation field with multiple sources. We test the method with several radiative transfer and radiation hydrodynamics tests that are given in Iliev et al. (2006, 2009). We further test our method with more dynamical situations, for example, the propagation of an ionisation front through a Rayleigh-Taylor instability, time-varying luminosities, and collimated radiation. The test suite also includes an...

  16. A comprehensive solution for simulating ultra-shallow junctions: From high dose/low energy implant to diffusion annealing

    International Nuclear Information System (INIS)

    This paper presents a global approach permitting accurate simulation of the process of ultra-shallow junctions. Physically based models of dopant implantation (BCA) and diffusion (including point and extended defects coupling) are integrated within a unique simulation tool. A useful set of the relevant parameters has been obtained through an original calibration methodology. It is shown that this approach provides an efficient tool for process modelling

  17. Aplikasi Simulasi Annealing Untuk Menyelesaikan Traveling Salesman Problem

    OpenAIRE

    Larasati, Tuti

    2012-01-01

    The traveling salesman problem is a combinatorial optimization problem that aims to find the route of minimum length. One algorithm that can be used to solve it is simulated annealing. Simulated annealing is an analogy of the cooling process of liquid metals, called annealing. Annealing is the metallurgical process of heating up a solid and then cooling it slowly until it crystallizes. This final project will show an analogy an...
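
A compact simulated-annealing treatment of the travelling salesman problem, in the spirit of the record above (random illustrative cities, 2-opt neighbourhood, geometric cooling):

```python
# Sketch: simulated annealing for a small random TSP instance.
import math
import random

random.seed(42)
cities = [(random.random(), random.random()) for _ in range(20)]

def tour_length(tour):
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def two_opt(tour):
    i, j = sorted(random.sample(range(len(tour)), 2))
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # reverse a segment

tour = list(range(len(cities)))
cost = tour_length(tour)
best, best_cost = tour[:], cost
temp = 1.0
while temp > 1e-4:
    cand = two_opt(tour)
    cand_cost = tour_length(cand)
    if cand_cost < cost or random.random() < math.exp((cost - cand_cost) / temp):
        tour, cost = cand, cand_cost
        if cost < best_cost:
            best, best_cost = tour[:], cost
    temp *= 0.999                       # geometric cooling
print("best tour length:", round(best_cost, 3))
```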

  18. Fuzzy Backstepping Torque Control Of Passive Torque Simulator With Algebraic Parameters Adaptation

    Science.gov (United States)

    Ullah, Nasim; Wang, Shaoping; Wang, Xingjian

    2015-07-01

    This work presents fuzzy backstepping control techniques applied to a load simulator to obtain good tracking performance in the presence of extra torque and nonlinear friction effects. Assuming that the parameters of the system are uncertain and bounded, an algebraic parameter adaptation algorithm is used to adapt the unknown parameters. The effect of the transient fuzzy estimation error on the parameter adaptation algorithm is analyzed, and the fuzzy estimation error is further compensated using a saturation-function-based adaptive control law working in parallel with the actual system to improve the transient performance of the closed-loop system. The saturation-function-based adaptive control term is large in the transient phase and settles to an optimal lower value in the steady state, for which the closed-loop system remains stable. The simulation results verify the validity of the proposed control method applied to the complex aerodynamic passive load simulator.

  19. Hybridisations Of Simulated Annealing And Modified Simplex Algorithms On A Path Of Steepest Ascent With Multi-Response For Optimal Parameter Settings Of ACO

    Science.gov (United States)

    Luangpaiboon, P.

    2009-10-01

    Many enterprises face extreme conditions with respect to, for instance, costs, quality, sales and services. Moreover, technology has always been intertwined with our demands, so most manufacturing or assembly lines adopt it and inevitably end up with more complicated processes. At this stage, product and service improvement must be differentiated from competitors in a sustainable way. Simulated process optimisation is therefore an alternative way of solving huge and complex problems. Metaheuristics are sequential processes that perform exploration and exploitation in the solution space, aiming to efficiently find near-optimal solutions with natural intelligence as a source of inspiration. One of the most well-known metaheuristics is Ant Colony Optimisation (ACO). This paper aims to assist with the complexity of using ACO in terms of its parameters: the number of iterations, ants and moves. Proper levels of these parameters are analysed on eight noisy non-linear continuous response surfaces. Considering the solution space in a specified region, some surfaces contain a global optimum and multiple local optima and some have a curved ridge. ACO parameters are determined through hybridisations of the Modified Simplex and Simulated Annealing methods on the path of Steepest Ascent (SAM). SAM was introduced to recommend preferable levels of ACO parameters via statistically significant regression analysis and Taguchi's signal-to-noise ratio. Other performance measures include minimax and mean squared error. A series of computational experiments using each algorithm was conducted. Experimental results were analysed in terms of mean, design points and best-so-far solutions. It was found that results obtained from a hybridisation with the stochastic procedures of the Simulated Annealing method were better than those using the Modified Simplex algorithm. However, the average execution time of experimental runs and number of design points using hybridisations were

  20. Comparison of Mesh Adaptivity Schemes in Finite ElementSimulation of Tube Extrusion Process

    Directory of Open Access Journals (Sweden)

    K. K. Pathak

    2008-05-01

    Full Text Available In this study, finite element simulation of the tube extrusion process has been carried out considering different mesh adaptivity schemes. A comparison of these schemes has been made based on stress, strain distribution, and load-stroke curves. Based on the finite element results, it is observed that the success of the computer simulation is dependent on the mesh refinement criteria.

  1. Simulating Computer Adaptive Testing With the Mood and Anxiety Symptom Questionnaire

    NARCIS (Netherlands)

    G. Flens; N. Smits; I. Carlier; A.M. van Hemert; E. de Beurs

    2015-01-01

    In a post hoc simulation study (N = 3,597 psychiatric outpatients), we investigated whether the efficiency of the 90-item Mood and Anxiety Symptom Questionnaire (MASQ) could be improved for assessing clinical subjects with computerized adaptive testing (CAT). A CAT simulation was performed on each o

  2. The Local Minima Problem in Hierarchical Classes Analysis: An Evaluation of a Simulated Annealing Algorithm and Various Multistart Procedures

    Science.gov (United States)

    Ceulemans, Eva; Van Mechelen, Iven; Leenen, Iwin

    2007-01-01

    Hierarchical classes models are quasi-order retaining Boolean decomposition models for N-way N-mode binary data. To fit these models to data, rationally started alternating least squares (or, equivalently, alternating least absolute deviations) algorithms have been proposed. Extensive simulation studies showed that these algorithms succeed quite…

  3. ENZO+MORAY: radiation hydrodynamics adaptive mesh refinement simulations with adaptive ray tracing

    Science.gov (United States)

    Wise, John H.; Abel, Tom

    2011-07-01

    We describe a photon-conserving radiative transfer algorithm, using a spatially-adaptive ray-tracing scheme, and its parallel implementation into the adaptive mesh refinement cosmological hydrodynamics code ENZO. By coupling the solver with the energy equation and non-equilibrium chemistry network, our radiation hydrodynamics framework can be utilized to study a broad range of astrophysical problems, such as stellar and black hole feedback. Inaccuracies can arise from large time-steps and poor sampling; therefore, we devised an adaptive time-stepping scheme and a fast approximation of the optically-thin radiation field with multiple sources. We test the method with several radiative transfer and radiation hydrodynamics tests that are given in Iliev et al. We further test our method with more dynamical situations, for example, the propagation of an ionization front through a Rayleigh-Taylor instability, time-varying luminosities and collimated radiation. The test suite also includes an expanding H II region in a magnetized medium, utilizing the newly implemented magnetohydrodynamics module in ENZO. This method linearly scales with the number of point sources and number of grid cells. Our implementation is scalable to 512 processors on distributed memory machines and can include the radiation pressure and secondary ionizations from X-ray radiation. It is included in the newest public release of ENZO.

  4. The transfer of adaptation between actual and simulated rotary stimulation.

    Science.gov (United States)

    Dobie, T G; May, J G; Gutierrez, C; Heller, S S

    1990-12-01

    It is well known that continued exposure to motion environments leads to adaptation, but it is not clear whether such changes are specific to the particular type of motion experienced. The present investigation sought to evaluate the extent of transfer between real motion and visually-induced apparent motion. In addition, the direction of motion was varied and these two factors, mode of exposure and direction of rotation, were examined in a cross-adaptational design. Thirty-two subjects were pre- and posttested on measures of disorientation after active bodily rotation and visually-induced self-vection. Two groups received ten consecutive trials of active bodily rotation (clockwise or counter-clockwise) for 4 consecutive days. Two other groups received ten consecutive trials of visually-induced self-vection (clockwise or counter-clockwise) in a rotating drum for 4 consecutive days. During the exposure phase, dizziness and self-vection increased over trials for the groups exposed to the drum, while dizziness remained unchanged over trials for the groups exposed to bodily rotation. Repeated exposure to bodily rotation resulted in improved walking performance over trials and days. Subjects exposed to bodily rotation exhibited increased tolerance to visually-induced self-vection; however, exposure to visually-induced self-vection did not result in greater tolerance to bodily rotation. No support for directional specificity was evident. PMID:2285397

  5. Adaptive Multiscale Finite Element Method for Subsurface Flow Simulation

    NARCIS (Netherlands)

    Van Esch, J.M.

    2010-01-01

    Natural geological formations generally show multiscale structural and functional heterogeneity evolving over many orders of magnitude in space and time. In subsurface hydrological simulations the geological model focuses on the structural hierarchy of physical sub units and the flow model addresses

  6. Intelligent Electric Power Systems with Active-Adaptive Electric Networks: Challenges for Simulation Tools

    Directory of Open Access Journals (Sweden)

    Ufa Ruslan A.

    2015-01-01

    Full Text Available The motivation for the presented research is based on the need to develop new methods and tools for adequate simulation of intelligent electric power systems with active-adaptive electric networks (IES), including Flexible Alternating Current Transmission System (FACTS) devices. The key requirements for the simulation were formulated. The presented analysis of IES simulation results confirms the need to use a hybrid modelling approach.

  7. Non-linear modeling of ¹H NMR metabonomic data using kernel-based orthogonal projections to latent structures optimized by simulated annealing

    Energy Technology Data Exchange (ETDEWEB)

    Fonville, Judith M., E-mail: j.fonville07@imperial.ac.uk [Biomolecular Medicine, Department of Surgery and Cancer, Faculty of Medicine, Imperial College London, Sir Alexander Fleming Building, South Kensington, London SW7 2AZ (United Kingdom); Bylesjoe, Max, E-mail: max.bylesjo@almacgroup.com [Almac Diagnostics, 19 Seagoe Industrial Estate, Craigavon BT63 5QD (United Kingdom); Coen, Muireann, E-mail: m.coen@imperial.ac.uk [Biomolecular Medicine, Department of Surgery and Cancer, Faculty of Medicine, Imperial College London, Sir Alexander Fleming Building, South Kensington, London SW7 2AZ (United Kingdom); Nicholson, Jeremy K., E-mail: j.nicholson@imperial.ac.uk [Biomolecular Medicine, Department of Surgery and Cancer, Faculty of Medicine, Imperial College London, Sir Alexander Fleming Building, South Kensington, London SW7 2AZ (United Kingdom); Holmes, Elaine, E-mail: elaine.holmes@imperial.ac.uk [Biomolecular Medicine, Department of Surgery and Cancer, Faculty of Medicine, Imperial College London, Sir Alexander Fleming Building, South Kensington, London SW7 2AZ (United Kingdom); Lindon, John C., E-mail: j.lindon@imperial.ac.uk [Biomolecular Medicine, Department of Surgery and Cancer, Faculty of Medicine, Imperial College London, Sir Alexander Fleming Building, South Kensington, London SW7 2AZ (United Kingdom); Rantalainen, Mattias, E-mail: rantalai@stats.ox.ac.uk [Department of Statistics, Oxford University, 1 South Parks Road, Oxford OX1 3TG (United Kingdom)

    2011-10-31

    Highlights: → Non-linear modeling of metabonomic data using K-OPLS. → Automated optimization of the kernel parameter by simulated annealing. → K-OPLS provides improved prediction performance for exemplar spectral data sets. → Software implementation available for R and Matlab under the GPL v2 license. - Abstract: Linear multivariate projection methods are frequently applied for predictive modeling of spectroscopic data in metabonomic studies. The OPLS method is a commonly used computational procedure for characterizing spectral metabonomic data, largely due to its favorable model interpretation properties providing separate descriptions of predictive variation and response-orthogonal structured noise. However, when the relationship between descriptor variables and the response is non-linear, conventional linear models will perform sub-optimally. In this study we have evaluated to what extent a non-linear model, kernel-based orthogonal projections to latent structures (K-OPLS), can provide enhanced predictive performance compared to the linear OPLS model. Just like its linear counterpart, K-OPLS provides separate model components for predictive variation and response-orthogonal structured noise. The improved model interpretation by this separate modeling is a property unique to K-OPLS in comparison to other kernel-based models. Simulated annealing (SA) was used for effective and automated optimization of the kernel-function parameter in K-OPLS (SA-K-OPLS). Our results reveal that the non-linear K-OPLS model provides improved prediction performance in three separate metabonomic data sets compared to the linear OPLS model. We also demonstrate how response-orthogonal K-OPLS components provide valuable biological interpretation of model and data. The metabonomic data sets were acquired using proton Nuclear Magnetic Resonance (NMR) spectroscopy, and include a study of the liver toxin galactosamine, a study of the nephrotoxin mercuric chloride and
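
The general pattern, selecting a kernel parameter by simulated annealing against a predictive-error criterion, can be sketched with ordinary kernel ridge regression standing in for K-OPLS (synthetic data; not the authors' SA-K-OPLS implementation):

```python
# Sketch: simulated annealing tunes an RBF kernel width against validation error
# of a kernel ridge regression model. Data and hyperparameters are synthetic.
import math
import random
import numpy as np

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(80)
X_tr, y_tr, X_va, y_va = X[:60], y[:60], X[60:], y[60:]

def rbf(A, B, sigma):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def validation_error(sigma, lam=1e-3):
    K = rbf(X_tr, X_tr, sigma)
    alpha = np.linalg.solve(K + lam * np.eye(len(X_tr)), y_tr)   # kernel ridge fit
    pred = rbf(X_va, X_tr, sigma) @ alpha
    return float(np.mean((pred - y_va) ** 2))

random.seed(0)
sigma, err = 5.0, validation_error(5.0)
best_sigma, best_err, T = sigma, err, 1.0
while T > 1e-3:
    cand = max(0.05, sigma * math.exp(random.gauss(0, 0.2)))     # multiplicative step
    cand_err = validation_error(cand)
    if cand_err < err or random.random() < math.exp((err - cand_err) / T):
        sigma, err = cand, cand_err
        if err < best_err:
            best_sigma, best_err = sigma, err
    T *= 0.99
print("selected kernel width:", round(best_sigma, 3), "val. MSE:", round(best_err, 4))
```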

  8. Block-Structured Adaptive Mesh Refinement Algorithms for Vlasov Simulation

    CERN Document Server

    Hittinger, J A F

    2012-01-01

    Direct discretization of continuum kinetic equations, like the Vlasov equation, are under-utilized because the distribution function generally exists in a high-dimensional (>3D) space and computational cost increases geometrically with dimension. We propose to use high-order finite-volume techniques with block-structured adaptive mesh refinement (AMR) to reduce the computational cost. The primary complication comes from a solution state comprised of variables of different dimensions. We develop the algorithms required to extend standard single-dimension block structured AMR to the multi-dimension case. Specifically, algorithms for reduction and injection operations that transfer data between mesh hierarchies of different dimensions are explained in detail. In addition, modifications to the basic AMR algorithm that enable the use of high-order spatial and temporal discretizations are discussed. Preliminary results for a standard 1D+1V Vlasov-Poisson test problem are presented. Results indicate that there is po...

  9. The numerical simulation tool for the MAORY multiconjugate adaptive optics system

    CERN Document Server

    Arcidiacono, Carmelo; Bregoli, Giovanni; Diolaiti, Emiliano; Foppiani, Italo; Agapito, Guido; Puglisi, Alfio; Xompero, Marco; Oberti, Sylvain; Cosentino, Giuseppe; Lombini, Matteo; Butler, Chris R; Ciliegi, Paolo; Cortecchia, Fausto; Patti, Mauro; Esposito, Simone; Feautrier, Philippe

    2016-01-01

    The Multiconjugate Adaptive Optics RelaY (MAORY) is an Adaptive Optics module to be mounted on the ESO European Extremely Large Telescope (E-ELT). It is a hybrid Natural and Laser Guide Star system that will perform the correction of the atmospheric turbulence volume above the telescope, feeding the Multi-AO Imaging Camera for Deep Observations Near Infrared spectro-imager (MICADO). We developed an end-to-end Monte Carlo adaptive optics simulation tool to investigate the performance of MAORY and its calibration, acquisition and operation strategies. MAORY will implement Multiconjugate Adaptive Optics combining Laser Guide Star (LGS) and Natural Guide Star (NGS) measurements. The simulation tool implements the various aspects of MAORY in an end-to-end fashion. The code has been developed using IDL and uses libraries in C++ and CUDA for efficiency improvements. Here we recall the code architecture, describe the modeled instrument components and the control strategies implemented in the code.

  10. Cluster Optimization and Parallelization of Simulations with Dynamically Adaptive Grids

    KAUST Repository

    Schreiber, Martin

    2013-01-01

    The present paper studies solvers for partial differential equations that work on dynamically adaptive grids stemming from spacetrees. Due to the underlying tree formalism, such grids efficiently can be decomposed into connected grid regions (clusters) on-the-fly. A graph on those clusters classified according to their grid invariancy, workload, multi-core affinity, and further meta data represents the inter-cluster communication. While stationary clusters already can be handled more efficiently than their dynamic counterparts, we propose to treat them as atomic grid entities and introduce a skip mechanism that allows the grid traversal to omit those regions completely. The communication graph ensures that the cluster data nevertheless are kept consistent, and several shared memory parallelization strategies are feasible. A hyperbolic benchmark that has to remesh selected mesh regions iteratively to preserve conforming tessellations acts as benchmark for the present work. We discuss runtime improvements resulting from the skip mechanism and the implications on shared memory performance and load balancing. © 2013 Springer-Verlag.

  11. Computerized adaptive testing of population psychological distress:simulation-based evaluation of GHQ-30

    OpenAIRE

    Stochl, Jan; Böhnke, Jan R.; Pickett, Kate E.; Croudace, Tim J.

    2015-01-01

    PURPOSE: Goldberg's General Health Questionnaire (GHQ) items are frequently used to assess psychological distress but no study to date has investigated the GHQ-30's potential for adaptive administration. In computerized adaptive testing (CAT) items are matched optimally to the targeted distress level of respondents instead of relying on fixed-length versions of instruments. We therefore calibrate GHQ-30 items and report a simulation study exploring the potential of this instrument for adaptiv...

  12. A hybrid simulated annealing approach to handle energy resource management considering an intensive use of electric vehicles

    DEFF Research Database (Denmark)

    Sousa, Tiago; Vale, Zita; Carvalho, Joao Paulo;

    2014-01-01

    The massification of electric vehicles (EVs) can have a significant impact on the power system, requiring a new approach for the energy resource management. The energy resource management has the objective to obtain the optimal scheduling of the available resources considering distributed...... to determine the best solution in a reasonable amount of time. This paper presents a hybrid artificial intelligence technique to solve a complex energy resource management problem with a large number of resources, including EVs, connected to the electric network. The hybrid approach combines simulated...... generators, storage units, demand response and EVs. The large number of resources causes more complexity in the energy resource management, taking several hours to reach the optimal solution which requires a quick solution for the next day. Therefore, it is necessary to use adequate optimization techniques...

  13. SINGULARITY ANALYSIS AND COMPARITIVE STUDY OF SIX DEGREE OF FREEDOM STEWART PLATFORM AS A ROBOTIC ARM BY HEURISTIC ALGORITHMS AND SIMULATED ANNEALING

    Directory of Open Access Journals (Sweden)

    ASHWIN MISHRA,

    2011-01-01

    Full Text Available In this study, singularity analysis of the six-degree-of-freedom (DOF) Stewart platform using various heuristic methods in a specified design configuration has been carried out. The Jacobian matrix of the Stewart platform is obtained, the absolute value of the determinant of the Jacobian is taken as the objective function, and the least value of this objective function is searched for in the reachable workspace of the Stewart platform so as to find the singular configurations. The singular configurations of the platform depend on the value of this objective function: if it is zero, the configuration is singular. The results obtained by the different methods, namely the genetic algorithm, particle swarm optimization and its variants, and simulated annealing, are compared with each other. The variable sets considered are the respective desirable platform motions in the form of translation and rotation in six degrees of freedom. This paper hence presents a comparative study of these algorithms based on the results obtained and highlights the advantage of each in terms of computational cost and accuracy.
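
The search itself is straightforward once |det(J)| is available as an objective. The sketch below uses simulated annealing to drive a placeholder pose-dependent Jacobian toward singularity; the jacobian() function is an invented stand-in, not the Stewart platform's actual Jacobian:

```python
# Sketch: simulated annealing minimizes |det(J)| over the six pose variables.
# Replace jacobian() with the platform's real Jacobian for an actual analysis.
import math
import random
import numpy as np

def jacobian(pose):
    # placeholder upper-triangular 6x6 matrix built from the pose
    x, y, z, roll, pitch, yaw = pose
    J = np.eye(6)
    J[0, 0] = z + math.sin(pitch)       # this diagonal term can vanish
    J[1, 1] = 1.0 + x * math.cos(yaw)
    J[2, 2] = 1.0 - y * roll
    J[0, 3] = x + y                     # off-diagonal couplings
    J[1, 4] = math.sin(roll) * z
    J[2, 5] = yaw - pitch
    return J

def objective(pose):
    return abs(np.linalg.det(jacobian(pose)))   # zero means a singular configuration

bounds = [(-0.5, 0.5)] * 3 + [(-0.6, 0.6)] * 3  # assumed reachable workspace
random.seed(1)
pose = [random.uniform(lo, hi) for lo, hi in bounds]
val = objective(pose)
best, best_val, T = pose[:], val, 1.0
while T > 1e-4:
    cand = [min(hi, max(lo, p + random.gauss(0, 0.05)))
            for p, (lo, hi) in zip(pose, bounds)]
    cand_val = objective(cand)
    if cand_val < val or random.random() < math.exp((val - cand_val) / T):
        pose, val = cand, cand_val
        if val < best_val:
            best, best_val = pose[:], val
    T *= 0.999
print("min |det J| found:", round(best_val, 6))
```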

  14. A Two-Stage Simulated Annealing Algorithm for the Many-to-Many Milk-Run Routing Problem with Pipeline Inventory Cost

    Directory of Open Access Journals (Sweden)

    Yu Lin

    2015-01-01

    Full Text Available In recent years, logistics systems with multiple suppliers and plants in neighboring regions have been flourishing worldwide. However, high logistics costs remain a problem for such systems due to lack of information sharing and cooperation. This paper proposes an extended mathematical model that minimizes transportation and pipeline inventory costs via the many-to-many Milk-run routing mode. Because the problem is NP hard, a two-stage heuristic algorithm is developed by comprehensively considering its characteristics. More specifically, an initial satisfactory solution is generated in the first stage through a greedy heuristic algorithm to minimize the total number of vehicle service nodes and the best insertion heuristic algorithm to determine each vehicle’s route. Then, a simulated annealing algorithm (SA with limited search scope is used to improve the initial satisfactory solution. Thirty numerical examples are employed to test the proposed algorithms. The experiment results demonstrate the effectiveness of this algorithm. Further, the superiority of the many-to-many transportation mode over other modes is demonstrated via two case studies.

  15. Solving uncapacitated multiple allocation p-hub center problem by Dijkstra’s algorithm-based genetic algorithm and simulated annealing

    Directory of Open Access Journals (Sweden)

    Masoud Rabbani

    2015-09-01

    Full Text Available In the existing literature there are a huge number of studies on p-hub median problems and on heuristic or metaheuristic algorithms for solving them, but no analogous body of literature exists for the counterpart problem, the p-hub center problem. In fact, since the p-hub center problem was introduced relatively recently and has a particular objective function, minimizing the maximum cost between origin-destination nodes, there are few studies investigating the problem and the challenges of solving it. In this study, after presenting a complete definition of the uncapacitated multiple allocation p-hub center problem (UMApHCP), two well-known metaheuristic algorithms are proposed to solve the problem for small-scale and large-scale standard data sets. These two algorithms are a single-solution-based algorithm, simulated annealing (SA), and a population-based metaheuristic, the genetic algorithm (GA). Because of the particular nature of the problem, Dijkstra's algorithm has been incorporated in the fitness function calculation of the proposed methods. The numerical results of running the GA and SA for standard test problems show that for smaller-scale test problems the single-solution-based SA performs better than the GA, but for larger-scale data sets the GA generally yields more desirable solutions.
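
    The abstract notes that Dijkstra's algorithm is embedded in the fitness calculation of both metaheuristics. One plausible way of doing this is sketched below: for a candidate hub set, every origin-destination cost is a shortest path on a network restricted to origin-hub, hub-hub (discounted) and hub-destination legs, and the worst such cost is the p-hub center objective. This assumes the usual multiple-allocation routing, a discount factor alpha = 0.75, and costs obeying the triangle inequality so that detours through extra spokes never pay off; the toy cost matrix is likewise an assumption.

      import heapq
      import itertools

      def dijkstra(n, adj, src):
          """Shortest distances from src over an adjacency dict {u: [(v, w), ...]}."""
          dist = [float("inf")] * n
          dist[src] = 0.0
          pq = [(0.0, src)]
          while pq:
              d, u = heapq.heappop(pq)
              if d > dist[u]:
                  continue
              for v, w in adj[u]:
                  if d + w < dist[v]:
                      dist[v] = d + w
                      heapq.heappush(pq, (dist[v], v))
          return dist

      def phub_center_cost(cost, hubs, alpha=0.75):
          """Fitness usable by both GA and SA: the worst origin-destination cost when
          every flow is routed origin -> hub -> hub -> destination."""
          n, hubset = len(cost), set(hubs)
          adj = {u: [] for u in range(n)}
          for u in range(n):
              for v in range(n):
                  if u == v:
                      continue
                  if u in hubset and v in hubset:
                      adj[u].append((v, alpha * cost[u][v]))   # discounted inter-hub leg
                  elif u in hubset or v in hubset:
                      adj[u].append((v, cost[u][v]))           # collection / distribution leg
          worst = 0.0
          for src in range(n):
              d = dijkstra(n, adj, src)
              worst = max(worst, max(d[v] for v in range(n) if v != src))
          return worst

      if __name__ == "__main__":
          cost = [[0, 4, 7, 9], [4, 0, 3, 6], [7, 3, 0, 2], [9, 6, 2, 0]]
          best = min(itertools.combinations(range(4), 2),
                     key=lambda h: phub_center_cost(cost, h))
          print("best hub pair:", best, "max OD cost:", round(phub_center_cost(cost, best), 2))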

  16. A robust hybrid fuzzy-simulated annealing-intelligent water drops approach for tuning a distribution static compensator nonlinear controller in a distribution system

    Science.gov (United States)

    Bagheri Tolabi, Hajar; Hosseini, Rahil; Shakarami, Mahmoud Reza

    2016-06-01

    This article presents a novel hybrid optimization approach for a nonlinear controller of a distribution static compensator (DSTATCOM). The DSTATCOM is connected to a distribution system with the distributed generation units. The nonlinear control is based on partial feedback linearization. Two proportional-integral-derivative (PID) controllers regulate the voltage and track the output in this control system. In the conventional scheme, the trial-and-error method is used to determine the PID controller coefficients. This article uses a combination of a fuzzy system, simulated annealing (SA) and intelligent water drops (IWD) algorithms to optimize the parameters of the controllers. The obtained results reveal that the response of the optimized controlled system is effectively improved by finding a high-quality solution. The results confirm that using the tuning method based on the fuzzy-SA-IWD can significantly decrease the settling and rising times, the maximum overshoot and the steady-state error of the voltage step response of the DSTATCOM. The proposed hybrid tuning method for the partial feedback linearizing (PFL) controller achieved better regulation of the direct current voltage for the capacitor within the DSTATCOM. Furthermore, in the event of a fault the proposed controller tuned by the fuzzy-SA-IWD method showed better performance than the conventional controller or the PFL controller without optimization by the fuzzy-SA-IWD method with regard to both fault duration and clearing times.

  17. Improvement of bio-corrosion resistance for Ti42Zr40Si15Ta3 metallic glasses in simulated body fluid by annealing within supercooled liquid region.

    Science.gov (United States)

    Huang, C H; Lai, J J; Wei, T Y; Chen, Y H; Wang, X; Kuan, S Y; Huang, J C

    2015-01-01

    The effects of nanocrystalline phases on the bio-corrosion behavior of highly bio-friendly Ti42Zr40Si15Ta3 metallic glasses in simulated body fluid were investigated, and the findings are compared with our previous observations for Zr53Cu30Ni9Al8 metallic glasses. The Ti42Zr40Si15Ta3 metallic glasses were annealed at temperatures above the glass transition temperature, Tg, for different time periods to produce different degrees of α-Ti nano-phases in the amorphous matrix. The nanocrystallized Ti42Zr40Si15Ta3 metallic glasses containing corrosion-resistant α-Ti phases exhibited more promising bio-corrosion resistance, due to their superior pitting resistance. This is distinctly different from the previous case of the Zr53Cu30Ni9Al8 metallic glasses, in which the reactive Zr2Cu phases induced serious galvanic corrosion and lowered the bio-corrosion resistance. Thus, whether a fully amorphous or a partially crystallized metallic glass exhibits better bio-corrosion resistance depends on the nature of the crystallized phase.

  18. a New Multimodal Multi-Criteria Route Planning Model by Integrating a Fuzzy-Ahp Weighting Method and a Simulated Annealing Algorithm

    Science.gov (United States)

    Ghaderi, F.; Pahlavani, P.

    2015-12-01

    A multimodal multi-criteria route planning (MMRP) system provides an optimal multimodal route from an origin point to a destination point considering two or more criteria, in a way that this route can be a combination of public and private transportation modes. In this paper, simulated annealing (SA) and the fuzzy analytical hierarchy process (fuzzy AHP) were combined in order to find this route. In this regard, the effective criteria that are significant for users in their trip were first determined. Then the weight of each criterion was calculated using the fuzzy AHP weighting method. The most important characteristic of this weighting method is the use of fuzzy numbers, which aids users in considering their uncertainty in the pairwise comparison of criteria. After determining the criteria weights, the proposed SA algorithm was used for determining an optimal route from an origin to a destination. One of the most important problems for a meta-heuristic algorithm is becoming trapped in local minima. In this study, five transportation modes, including subway, bus rapid transit (BRT), taxi, walking, and bus, were considered for moving between nodes. Also, the fare, the time, the user's bother, and the length of the path were considered as effective criteria for solving the problem. The proposed model was implemented for an area in the centre of Tehran in a MATLAB GUI. The results showed the high efficiency and speed of the proposed algorithm, which supports our analyses.

  19. Annealing simulations to determine the matrix interface structure of SiC quantum dots embedded in SiO2

    Energy Technology Data Exchange (ETDEWEB)

    Knaup, Jan M. [Department of Physics, Harvard University, 17 Oxford Street, Cambridge MA 02138 (United States); Bremen Center for Computational Materials Science, Universitaet Bremen, Am Fallturm 1, 28359 Bremen (Germany); Voeroes, Marton; Gali, Adam [Department of Atomic Physics, Budapest University of Technology and Economics, 1111 Budapest (Hungary); Deak, Peter; Frauenheim, Thomas [Bremen Center for Computational Materials Science, Universitaet Bremen, Am Fallturm 1, 28359 Bremen (Germany); Kaxiras, Efthimios [Department of Physics, Harvard University, 17 Oxford Street, Cambridge MA 02138 (United States)

    2010-02-15

    We use the density functional based tight-binding (SCC-DFTB) method in a quantum mechanics/molecular mechanics embedding scheme to simulate single SiC quantum dots of different shapes and with diameters of up to 1 nm, embedded in a block of α-quartz with cell vectors of a = 15 nm and c = 13 nm. First results show that during the annealing process three recurring motifs appear at the SiC/SiO2 interface of the embedded nanocrystals: Si-Si bonds, often in Si interstitial-like configurations; C-C bonds involving all possible combinations of sp3 and sp2 hybridized carbon; and threefold coordinated oxygen atoms. All of these defects, alone or in complexes, may play a crucial role in quenching the luminescence of these embedded nanocrystals. Detailed studies of the spectroscopic properties of the point defects identified in this work are indicated (copyright 2010 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  20. Optimal Nationwide Traveling Scheme Based on Simulated Annealing Algorithm

    Institute of Scientific and Technical Information of China (English)

    吕鹏举; 原杰; 吕菁华

    2011-01-01

    An optimal itinerary scheme for travelling through all provincial capitals, the municipalities, Hong Kong, Macao and Taipei is designed. The practical problems of finding the shortest path and the lowest cost for travelling to these places are analyzed. Taking into account the relationships between cost and route, and between duration and means of transportation, a model is established with the objectives of shortest path and minimum cost and time, and the simulated annealing algorithm is applied to solve it. A travel path that saves both money and time is obtained from this comprehensive consideration. The results show the correctness and practical value of this travel scheme.

  1. Role-play simulations for climate change adaptation education and engagement

    Science.gov (United States)

    Rumore, Danya; Schenk, Todd; Susskind, Lawrence

    2016-08-01

    In order to effectively adapt to climate change, public officials and other stakeholders need to rapidly enhance their understanding of local risks and their ability to collaboratively and adaptively respond to them. We argue that science-based role-play simulation exercises -- a type of 'serious game' involving face-to-face mock decision-making -- have considerable potential as education and engagement tools for enhancing readiness to adapt. Prior research suggests role-play simulations and other serious games can foster public learning and encourage collective action in public policy-making contexts. However, the effectiveness of such exercises in the context of climate change adaptation education and engagement has heretofore been underexplored. We share results from two research projects that demonstrate the effectiveness of role-play simulations in cultivating climate change adaptation literacy, enhancing collaborative capacity and facilitating social learning. Based on our findings, we suggest such exercises should be more widely embraced as part of adaptation professionals' education and engagement toolkits.

  2. Multi-level adaptive simulation of transient two-phase flow in heterogeneous porous media

    KAUST Repository

    Chueh, C.C.

    2010-10-01

    An implicit pressure and explicit saturation (IMPES) finite element method (FEM) incorporating a multi-level shock-type adaptive refinement technique is presented and applied to investigate transient two-phase flow in porous media. Local adaptive mesh refinement is implemented seamlessly with state-of-the-art artificial diffusion stabilization allowing simulations that achieve both high resolution and high accuracy. Two benchmark problems, modelling a single crack and a random porous medium, are used to demonstrate the robustness of the method and illustrate the capabilities of the adaptive refinement technique in resolving the saturation field and the complex interaction (transport phenomena) between two fluids in heterogeneous media. © 2010 Elsevier Ltd.

  3. Cosmological Shocks in Adaptive Mesh Refinement Simulations and the Acceleration of Cosmic Rays

    OpenAIRE

    Skillman, Samuel W.; O'Shea, Brian W.; Hallman, Eric J.; Burns, Jack O.; Norman, Michael L.

    2008-01-01

    We present new results characterizing cosmological shocks within adaptive mesh refinement N-Body/hydrodynamic simulations that are used to predict non-thermal components of large-scale structure. This represents the first study of shocks using adaptive mesh refinement. We propose a modified algorithm for finding shocks from those used on unigrid simulations that reduces the shock frequency of low Mach number shocks by a factor of ~3. We then apply our new technique to a large, (512 Mpc/h)^3, ...

  4. EVENT-DRIVEN SIMULATION OF INTEGRATE-AND-FIRE MODELS WITH SPIKE-FREQUENCY ADAPTATION

    Institute of Scientific and Technical Information of China (English)

    Lin Xianghong; Zhang Tianwen

    2009-01-01

    The evoked spike discharges of a neuron depend critically on the recent history of its electrical activity. A well-known example is the phenomenon of spike-frequency adaptation, a commonly observed property of neurons. In this paper, using a leaky integrate-and-fire model that includes an adaptation current, we propose an event-driven strategy to simulate integrate-and-fire models with spike-frequency adaptation. Such an approach is more precise than the traditional clock-driven numerical integration approach because the timing of spikes is treated exactly. In experiments, using the event-driven and clock-driven strategies we simulated the adaptation time course of a single neuron and of a random network with spike-timing-dependent plasticity. The results indicate that (1) the temporal precision of spiking events affects the dynamics of both single neurons and networks under the different simulation strategies, and (2) in the event-driven strategy the simulation time scales linearly with the total number of spiking events.
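
    The abstract describes the strategy in words only. The sketch below (a leaky integrate-and-fire neuron with a single exponential adaptation current; every constant is invented for illustration) shows the essence of an event-driven scheme: between spikes the state has a closed-form solution, so each spike time is located exactly by root finding instead of by clock-driven stepping, and the printed inter-spike intervals lengthen as the adaptation current accumulates.

      import math

      E_L, V_RESET, V_TH = -70.0, -65.0, -50.0   # resting, reset and threshold potentials (mV)
      TAU_M, TAU_W = 10.0, 100.0                 # membrane / adaptation time constants (ms)
      B = 5.0                                    # adaptation increment per spike (mV)
      I_EFF = 25.0                               # constant effective drive (mV)

      def v_between_spikes(t, v0, w0):
          """Closed-form membrane potential t ms after the last reset."""
          v_inf = E_L + I_EFF
          a = w0 * TAU_W / (TAU_W - TAU_M)
          return v_inf - a * math.exp(-t / TAU_W) + (v0 - v_inf + a) * math.exp(-t / TAU_M)

      def next_spike_time(v0, w0, t_max=2000.0, dt_scan=0.5, tol=1e-9):
          """First threshold crossing, found by a coarse scan followed by bisection."""
          t_lo, t = 0.0, dt_scan
          while t <= t_max:
              if v_between_spikes(t, v0, w0) >= V_TH:       # crossing bracketed in (t_lo, t]
                  t_hi = t
                  while t_hi - t_lo > tol:
                      mid = 0.5 * (t_lo + t_hi)
                      if v_between_spikes(mid, v0, w0) < V_TH:
                          t_lo = mid
                      else:
                          t_hi = mid
                  return t_hi
              t_lo, t = t, t + dt_scan
          return None                                        # no spike within t_max

      def simulate(n_spikes=10):
          t, v, w, spikes = 0.0, E_L, 0.0, []
          for _ in range(n_spikes):
              dt = next_spike_time(v, w)
              if dt is None:
                  break
              t += dt
              spikes.append(t)
              w = w * math.exp(-dt / TAU_W) + B   # adaptation decays over the interval, jumps at the spike
              v = V_RESET
          return spikes

      if __name__ == "__main__":
          times = simulate()
          print("inter-spike intervals (ms):", [round(b - a, 2) for a, b in zip(times, times[1:])])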

  5. The Self-Adaptive Fuzzy PID Controller in Actuator Simulated Loading System

    OpenAIRE

    Chuanhui Zhang; Xiaodong Song

    2013-01-01

    This paper analyzes the structure principle of the actuator simulated loading system with variable stiffness, and establishes the simplified model. What’s more, it also does a research on the application of the self-adaptive tuning of fuzzy PID(Proportion Integration Differentiation) in actuator simulated loading system with variable stiffness. Because the loading system is connected with the steering system by a spring rod, there must be strong coupling. Besides, there are also the parametri...

  6. Dynamically adaptive Lattice Boltzmann simulation of shallow water flows with the Peano framework

    KAUST Repository

    Neumann, Philipp

    2015-09-01

    © 2014 Elsevier Inc. All rights reserved. We present a dynamically adaptive Lattice Boltzmann (LB) implementation for solving the shallow water equations (SWEs). Our implementation extends an existing LB component of the Peano framework. We revise the modular design with respect to the incorporation of new simulation aspects and LB models. The basic SWE-LB implementation is validated in different breaking dam scenarios. We further provide a numerical study on stability of the MRT collision operator used in our simulations.

  7. Simulated Annealing Approach to the Optimal Synthesis of Distillation Column with Intermediate Heat Exchangers

    Institute of Scientific and Technical Information of China (English)

    安维中; 于凤娟; 董凤蕾; 胡仰栋

    2008-01-01

    This article presents a simulated annealing-based approach to the optimal synthesis of a distillation column considering intermediate heat exchanger arrangements. The number of intermediate condensers and/or intermediate reboilers, their placement locations, the operating pressure of the column, and the heat duties of the intermediate heat exchangers are treated as optimization variables. A novel coding procedure making use of an integer number series is proposed to represent and manipulate the structure of the system, and a stage-to-stage method is used for column design and cost calculation. With this representation procedure, the synthesis problem is formulated as a mixed integer nonlinear programming (MINLP) problem, which can then be solved with an improved simulated annealing algorithm. Two examples are presented to show the effectiveness of the suggested approach.

  8. Modeling and simulation of neutron induced changes and temperature annealing of Neff and changes in resistivity in high resistivity silicon detectors

    International Nuclear Information System (INIS)

    A model based on various single levels with different charge states (acceptors, donors, neutral, etc.) created by neutron radiation is proposed to describe the changes of the effective doping density in the space charge region, Neff(SCR), and of the resistivity in the electrically neutral bulk, ρ(ENB). This model explains well the effects of donor removal at low fluences, the increase of Neff(SCR) with neutron fluence (Φn) at high fluence ('type inversion' in the SCR), and the increase of resistivity in the ENB with Φn. The annealing of each single level with a charged state is modeled by an activation energy Ea, which gives the dependence of the annealing time constant τ on the annealing temperature. The model describes well the observed effect of room-temperature reverse annealing of Neff(SCR), which is probably caused by the difference between the annealing time constants of a donor level (long τ) and an acceptor level (short τ). The model also predicts that there is an annealing temperature window, at about 0 °C, that would freeze Neff(SCR) at the 'minimum' value for a few years before the reverse anneal. A -20 °C anneal, however, would freeze Neff(SCR) at the as-irradiated value, which is about 1.5-3 times higher than the 'minimum' value. The model also suggests that low resistivity (200 Ω·cm to 1 kΩ·cm) starting material may be harder, in terms of Neff(SCR), for keeping detectors within a working window of |Neff(SCR)| on the order of 10^12/cm^3 (Vfd = 180 V, 300 μm) after about 1×10^14 n/cm^2 of neutron irradiation. (orig.)
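
    The stated link between the activation energy Ea and the annealing time constant is the Arrhenius relation tau(T) = tau0 * exp(Ea / (kB * T)). The short calculation below evaluates it for an assumed Ea and prefactor to show how strongly tau stretches between +20 °C, 0 °C and -20 °C; the numbers are purely illustrative and are not the fitted constants of the cited model.

      import math

      K_B = 8.617e-5   # Boltzmann constant (eV/K)

      def annealing_time_constant(ea_ev, temp_c, tau0_s=1e-13):
          """Arrhenius form tau = tau0 * exp(Ea / (kB * T)); Ea and tau0 are assumed values."""
          t_kelvin = temp_c + 273.15
          return tau0_s * math.exp(ea_ev / (K_B * t_kelvin))

      if __name__ == "__main__":
          ea = 1.1   # eV, illustrative activation energy for one defect level
          for temp in (20.0, 0.0, -20.0):
              tau = annealing_time_constant(ea, temp)
              print(f"T = {temp:6.1f} C  ->  tau = {tau:9.3e} s  ({tau / 86400:10.1f} days)")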

  9. Improving the adaptability of simulated evolutionary swarm robots in dynamically changing environments.

    Directory of Open Access Journals (Sweden)

    Yao Yao

    Full Text Available One of the important challenges in the field of evolutionary robotics is the development of systems that can adapt to a changing environment. However, the ability to adapt to unknown and fluctuating environments is not straightforward. Here, we explore the adaptive potential of simulated swarm robots that contain a genomic encoding of a bio-inspired gene regulatory network (GRN. An artificial genome is combined with a flexible agent-based system, representing the activated part of the regulatory network that transduces environmental cues into phenotypic behaviour. Using an artificial life simulation framework that mimics a dynamically changing environment, we show that separating the static from the conditionally active part of the network contributes to a better adaptive behaviour. Furthermore, in contrast with most hitherto developed ANN-based systems that need to re-optimize their complete controller network from scratch each time they are subjected to novel conditions, our system uses its genome to store GRNs whose performance was optimized under a particular environmental condition for a sufficiently long time. When subjected to a new environment, the previous condition-specific GRN might become inactivated, but remains present. This ability to store 'good behaviour' and to disconnect it from the novel rewiring that is essential under a new condition allows faster re-adaptation if any of the previously observed environmental conditions is reencountered. As we show here, applying these evolutionary-based principles leads to accelerated and improved adaptive evolution in a non-stable environment.

  10. Numerical simulations of internal solitary waves interacting with uniform slopes using an adaptive model

    Science.gov (United States)

    Rickard, Graham; O'Callaghan, Joanne; Popinet, Stéphane

    Two-dimensional, non-linear, Boussinesq, non-hydrostatic simulations of internal solitary waves breaking and running up uniform slopes have been performed using an adaptive, finite volume fluid code "Gerris". It is demonstrated that the Gerris dynamical core performs well in this specific but important geophysical context. The "semi-structured" nature of Gerris is exploited to enhance model resolution along the slope where wave breaking and run-up occur. Comparison with laboratory experiments reveals that the generation of single and multiple turbulent surges ("boluses") as a function of slope angle is consistently reproduced by the model, comparable with observations and previous numerical simulations, suggesting aspects of the dynamical energy transfers are being represented by the model in two dimensions. Adaptivity is used to explore model convergence of the wave breaking dynamics, and it is shown that significant cpu memory and time savings are possible with adaptivity.

  11. The Self-Adaptive Fuzzy PID Controller in Actuator Simulated Loading System

    Directory of Open Access Journals (Sweden)

    Chuanhui Zhang

    2013-05-01

    Full Text Available This paper analyzes the structural principle of an actuator simulated loading system with variable stiffness and establishes a simplified model. It also investigates the application of self-adaptive fuzzy PID (Proportion Integration Differentiation) tuning in the actuator simulated loading system with variable stiffness. Because the loading system is connected with the steering system by a spring rod, there is strong coupling, and there are also parametric variations accompanying the variations of the stiffness. Based on feed-forward compensation of the disturbance introduced by the motion of the steering engine, the system performance can be improved by using fuzzy adaptive PID control to compensate for the changes in system parameters caused by the changes in stiffness. By combining fuzzy control with traditional PID control, fuzzy adaptive PID control is able to choose the controller parameters more properly.

  12. Simulation Research on Adaptive Control of a Six-degree-of-freedom Material-testing Machine

    Directory of Open Access Journals (Sweden)

    Dan Wang

    2014-02-01

    Full Text Available This paper presents an adaptive controller equipped with a stiffness estimation method for a novel material-testing machine, in order to alleviate the performance degradation caused by the stiffness variation of the tested specimen. The dynamic model of the proposed machine is built using the Kane method, and the kinematic model is established with a closed-form solution. The stiffness estimation method is developed based on the recursive least-squares method and the proposed stiffness equivalent matrix. The control performance of the adaptive controller is simulated in detail. The simulation results illustrate that the proposed controller can greatly improve the control performance of the target material-testing machine by online stiffness estimation and adaptive parameter tuning, especially in low-cycle fatigue (LCF) and high-cycle fatigue (HCF) tests.

  13. 3D Simulation of Flow with Free Surface Based on Adaptive Octree Mesh System

    Institute of Scientific and Technical Information of China (English)

    Li Shaowu; Zhuang Qian; Huang Xiaoyun; Wang Dong

    2015-01-01

    The technique of adaptive tree meshing is an effective way to reduce computational cost through automatic adjustment of cell size according to necessity. In the present study, the 2D numerical N-S solver based on an adaptive quadtree mesh system was extended to a 3D one, in which a spatially adaptive octree mesh system and a multiple particle level set method were adopted for convenience in dealing with the coexisting air-water-structure multiple-medium domain. The stretching process of a dumbbell was simulated and the results indicate that the meshes adapt well to the free surface. The collapsing process of a water column impinging on a circular cylinder was simulated, and from the results it can be seen that the processes of fluid splitting and merging are properly simulated. The interaction of second-order Stokes waves with a square cylinder was simulated, and the obtained drag force is consistent with the result of Morison's wave force formula with the coefficient values of the stable drag component and the inertial force component set as 2.54.

  14. Largenet2: an object-oriented programming library for simulating large adaptive networks

    CERN Document Server

    Zschaler, Gerd

    2012-01-01

    The largenet2 C++ library provides an infrastructure for the simulation of large dynamic and adaptive networks with discrete node and link states. The library is released as free software. It is available at http://rincedd.github.com/largenet2. Largenet2 is licensed under the Creative Commons Attribution-NonCommercial 3.0 Unported License.

  15. Adaptive Wavelet Collocation Method for Simulation of Time Dependent Maxwell's Equations

    CERN Document Server

    Li, Haojun; Rieder, Andreas; Freude, Wolfgang

    2012-01-01

    This paper investigates an adaptive wavelet collocation time domain method for the numerical solution of Maxwell's equations. In this method a computational grid is dynamically adapted at each time step by using the wavelet decomposition of the field at that time instant. In the regions where the fields are highly localized, the method assigns more grid points; and in the regions where the fields are sparse, there will be fewer grid points. On the adapted grid, update schemes with high spatial order and explicit time stepping are formulated. The method has a high compression rate, which substantially reduces the computational cost, allowing efficient use of computational resources. This adaptive wavelet collocation method is especially suitable for simulation of guided-wave optical devices.

  16. Simulation and Performance Analysis of Adaptive Filtering Algorithms in Noise Cancellation

    CERN Document Server

    Ferdouse, Lilatul; Nipa, Tamanna Haque; Jaigirdar, Fariha Tasmin

    2011-01-01

    Noise problems in signals have gained huge attention due to the need for noise-free output signals in numerous communication systems. The principle of adaptive noise cancellation is to acquire an estimate of the unwanted interfering signal and subtract it from the corrupted signal. The noise cancellation operation is controlled adaptively with the target of achieving an improved signal-to-noise ratio. This paper concentrates upon the analysis of adaptive noise cancellers using the Recursive Least Squares (RLS), Fast Transversal Recursive Least Squares (FTRLS) and Gradient Adaptive Lattice (GAL) algorithms. The performance analysis of the algorithms is done based on convergence behavior, convergence time, correlation coefficients and signal-to-noise ratio. After comparing all the simulated results we observed that GAL performs the best in noise cancellation in terms of correlation coefficient, SNR and convergence time. RLS, FTRLS and GAL were never evaluated and compared before on their performance in noise cancellation in ...

  17. Quantum Simulations of Nuclei and Nuclear Pasta with the Multi-resolution Adaptive Numerical Environment for Scientific Simulations

    CERN Document Server

    Sagert, I; Fattoyev, F J; Postnikov, S; Horowitz, C J

    2015-01-01

    Neutron star and supernova matter at densities just below the nuclear matter saturation density is expected to form a lattice of exotic shapes. These so-called nuclear pasta phases are caused by Coulomb frustration. Their elastic and transport properties are believed to play an important role for thermal and magnetic field evolution, rotation and oscillation of neutron stars. Furthermore, they can impact neutrino opacities in core-collapse supernovae. In this work, we present proof-of-principle 3D Skyrme Hartree-Fock (SHF) simulations of nuclear pasta with the Multi-resolution ADaptive Numerical Environment for Scientific Simulations (MADNESS). We perform benchmark studies of $^{16} \\mathrm{O}$, $^{208} \\mathrm{Pb}$ and $^{238} \\mathrm{U}$ nuclear ground states and calculate binding energies via 3D SHF simulations. Results are compared with experimentally measured binding energies as well as with theoretically predicted values from an established SHF code. The nuclear pasta simulation is initialized in the so...

  18. Optimal design of a water distribution system (WDS) applying the Simulated Annealing (SA) algorithm

    Directory of Open Access Journals (Sweden)

    Maikel Méndez-Morales

    2014-09-01

    Full Text Available This article presents the application of the Simulated Annealing (SA) algorithm to the optimal design of a water distribution system (WDS). SA is a metaheuristic search algorithm based on an analogy between the annealing process in metals (the controlled cooling of a body) and the solution of combinatorial optimization problems. The SA algorithm, together with various mathematical models, has been used successfully in the optimal design of WDSs. The full-scale WDS of the community of Marsella, in San Carlos, Costa Rica, was used as a case study. The SA algorithm was implemented by means of the well-known EPANET model, through the WaterNetGen extension. Three different automated variants of the SA algorithm were compared with the manual, trial-and-error design of the Marsella WDS, carried out using only unit pipe costs. The results show that the three automated SA schemes yielded unit costs below 0.49, expressed as a fraction of the original cost of the trial-and-error design. This demonstrates that the SA algorithm is capable of optimizing combinatorial problems linked to the minimum-cost design of full-scale water distribution systems.

  19. Application of artificial neural network coupled with genetic algorithm and simulated annealing to solve groundwater inflow problem to an advancing open pit mine

    Science.gov (United States)

    Bahrami, Saeed; Doulati Ardejani, Faramarz; Baafi, Ernest

    2016-05-01

    In this study, hybrid models are designed to predict groundwater inflow to an advancing open pit mine and the hydraulic head (HH) in observation wells at different distances from the centre of the pit during its advance. Hybrid methods coupling an artificial neural network (ANN) with genetic algorithm methods (ANN-GA) and with simulated annealing methods (ANN-SA) were utilised. The ratio of the depth of pit penetration into the aquifer to the aquifer thickness, the ratio of the pit bottom radius to its top radius, the inverse of the pit advance time, and the ratio of the HH in an observation well to the distance of that well from the centre of the pit were used as inputs to the networks. To achieve the objective, two hybrid models consisting of ANN-GA and ANN-SA with a 4-5-3-1 arrangement were designed. In addition, by switching the last argument of the input layer with the argument of the output layer of the two earlier models, two new models were developed to predict the HH in the observation wells for the period of the mining process. The accuracy and reliability of the models are verified by field data, the results of a numerical finite element model using SEEP/W, the outputs of simple ANNs and some well-known analytical solutions. The predicted results obtained by the hybrid methods are closer to the field data than the outputs of the analytical and simple ANN models. The results show that despite the use of fewer and simpler parameters by the hybrid models, the ANN-GA and to some extent the ANN-SA have the ability to compete with the numerical models.

  20. Efficacy of very fast simulated annealing global optimization method for interpretation of self-potential anomaly by different forward formulation over 2D inclined sheet type structure

    Science.gov (United States)

    Biswas, A.; Sharma, S. P.

    2012-12-01

    Self-potential anomaly is an important geophysical technique that measures the electrical potential due to natural current sources in the Earth's subsurface. An inclined sheet type model is a very familiar structure associated with mineralization, fault planes, groundwater flow and many other geological features which exhibit a self-potential anomaly. A number of linearized and global inversion approaches have been developed for the interpretation of SP anomalies over different structures for various purposes. The mathematical expression to compute the forward response over a two-dimensional dipping sheet type structure can be written in three different ways, using five variables in each case, and the complexity of the inversion differs among the three forward approaches. In the present study, an interpretation of self-potential anomalies using very fast simulated annealing global optimization has been developed, which yielded new insight into the uncertainty and equivalence in model parameters. Interpretation of the measured data yields the location of the causative body, the depth to its top, its extension, its dip and the quality of the causative body. A comparative assessment of the three different forward approaches in the interpretation of self-potential anomalies is performed to evaluate the efficacy of each approach in resolving the possible ambiguity. Even though each forward formulation yields the same forward response, the optimization of different sets of variables using different forward problems poses different kinds of ambiguity in the interpretation. The performance of the three approaches in optimization has been compared, and it is observed that out of the three methods, one approach is best and most suitable for this kind of study. Our VFSA approach has been tested on synthetic, noisy and field data for the three different methods to show the efficacy and suitability of the best method. It is important to use the forward problem in the optimization that yields the
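
    The abstract uses very fast simulated annealing (VFSA) without reproducing the algorithm. Below is a compact sketch of the usual VFSA ingredients, the temperature schedule T(k) = T0 * exp(-c * k**(1/D)) and the Cauchy-like parameter generation rule, applied to a placeholder curve-fitting misfit. The SP inclined-sheet forward formulas discussed in the abstract would slot in as the misfit function with the five sheet parameters as unknowns; the single shared acceptance temperature and every numerical setting here are simplifying assumptions.

      import math
      import random

      def vfsa_generate(x, bounds, temps):
          """Draw a new model with the VFSA (Ingber-style) generation rule; redraw any
          parameter that falls outside its bounds."""
          new = []
          for xi, (lo, hi), t in zip(x, bounds, temps):
              while True:
                  u = random.random()
                  step = t * ((1.0 + 1.0 / t) ** abs(2.0 * u - 1.0) - 1.0)
                  cand = xi + math.copysign(step, u - 0.5) * (hi - lo)
                  if lo <= cand <= hi:
                      new.append(cand)
                      break
          return new

      def vfsa(misfit, bounds, n_iter=3000, t0=1.0, c=1.0):
          d = len(bounds)
          x = [random.uniform(lo, hi) for lo, hi in bounds]
          f_x = misfit(x)
          best, f_best = x[:], f_x
          for k in range(1, n_iter + 1):
              temps = [t0 * math.exp(-c * k ** (1.0 / d))] * d   # one schedule for all parameters here
              cand = vfsa_generate(x, bounds, temps)
              f_cand = misfit(cand)
              if f_cand < f_x or random.random() < math.exp((f_x - f_cand) / temps[0]):
                  x, f_x = cand, f_cand
                  if f_x < f_best:
                      best, f_best = x[:], f_x
          return best, f_best

      if __name__ == "__main__":
          # Placeholder forward problem: recover a damped cosine from noisy samples.
          random.seed(0)
          true, xs = [2.0, 0.5, 1.3], [i * 0.1 for i in range(100)]
          fwd = lambda p, x: p[0] * math.exp(-p[1] * x) * math.cos(p[2] * x)
          data = [fwd(true, x) + random.gauss(0.0, 0.01) for x in xs]
          misfit = lambda p: sum((fwd(p, x) - d) ** 2 for x, d in zip(xs, data))
          est, err = vfsa(misfit, bounds=[(0.1, 5.0), (0.0, 2.0), (0.1, 5.0)])
          print("estimated parameters:", [round(v, 3) for v in est], "misfit:", round(err, 5))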

  1. Mesoscale Benchmark Demonstration Problem 1: Mesoscale Simulations of Intra-granular Fission Gas Bubbles in UO2 under Post-irradiation Thermal Annealing

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yulan; Hu, Shenyang Y.; Montgomery, Robert; Gao, Fei; Sun, Xin; Tonks, Michael; Biner, Bullent; Millet, Paul; Tikare, Veena; Radhakrishnan, Balasubramaniam; Andersson, David

    2012-04-11

    A study was conducted to evaluate the capabilities of different numerical methods used to represent microstructure behavior at the mesoscale for irradiated material using an idealized benchmark problem. The purpose of the mesoscale benchmark problem was to provide a common basis to assess several mesoscale methods with the objective of identifying the strengths and areas of improvement in the predictive modeling of microstructure evolution. In this work, mesoscale models (phase-field, Potts, and kinetic Monte Carlo) developed by PNNL, INL, SNL, and ORNL were used to calculate the evolution kinetics of intra-granular fission gas bubbles in UO2 fuel under post-irradiation thermal annealing conditions. The benchmark problem was constructed to include important microstructural evolution mechanisms on the kinetics of intra-granular fission gas bubble behavior such as the atomic diffusion of Xe atoms, U vacancies, and O vacancies, the effect of vacancy capture and emission from defects, and the elastic interaction of non-equilibrium gas bubbles. An idealized set of assumptions was imposed on the benchmark problem to simplify the mechanisms considered. The capability and numerical efficiency of different models are compared against selected experimental and simulation results. These comparisons find that the phase-field methods, by the nature of the free energy formulation, are able to represent a larger subset of the mechanisms influencing the intra-granular bubble growth and coarsening mechanisms in the idealized benchmark problem as compared to the Potts and kinetic Monte Carlo methods. It is recognized that the mesoscale benchmark problem as formulated does not specifically highlight the strengths of the discrete particle modeling used in the Potts and kinetic Monte Carlo methods. Future efforts are recommended to construct increasingly more complex mesoscale benchmark problems to further verify and validate the predictive capabilities of the mesoscale modeling

  2. Spatio-Temporal MEG Source Localization Using Simulated Annealing

    Institute of Scientific and Technical Information of China (English)

    霍小林; 李军; 刘正东

    2001-01-01

    Locating the sources of brain magnetic fields is a basic problem of magnetoencephalography (MEG), and the localization of multiple current dipoles is a difficult part of the MEG inverse problem. Through a study of the spatio-temporal source model (STSM) of MEG, a method combining spatio-temporal source modeling with simulated annealing is presented to locate multiple current dipoles. This method can overcome the tendency of other optimization methods to become trapped in a local minimum. The dipole parameters can be separated into linear and nonlinear components, and the dimensionality of the optimization can be reduced greatly by optimizing only the nonlinear components. Compared with MUSIC (MUltiple SIgnal Classification), this method correspondingly relaxes the requirement that the dipole sources be independent.
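
    The computational point emphasised by the abstract is that the dipole parameters split into nonlinear (position) and linear (moment) parts, so only the positions need to be annealed: for any trial positions the moment time courses follow from a linear least-squares fit, and the residual norm is the cost handed to the simulated annealing. A minimal numpy sketch of that separation is shown below; the lead field is a crude placeholder (a real study would use a spherical or boundary-element head model) and a single scalar moment per dipole is assumed.

      import numpy as np

      def lead_field(positions, sensors):
          """Placeholder lead-field matrix (n_sensors x n_dipoles); only its shape matters here."""
          d = np.linalg.norm(sensors[:, None, :] - positions[None, :, :], axis=2)
          return 1.0 / (d ** 2 + 1e-6)

      def stsm_cost(positions, sensors, data):
          """Cost over the nonlinear parameters only: the linear moments are solved by least squares."""
          L = lead_field(positions, sensors)
          moments, *_ = np.linalg.lstsq(L, data, rcond=None)   # (n_dipoles x n_times)
          return np.linalg.norm(data - L @ moments)

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          sensors = rng.uniform(-1.0, 1.0, size=(32, 3))
          true_pos = np.array([[0.2, 0.1, 0.4], [-0.3, 0.2, 0.5]])
          times = np.linspace(0.0, 1.0, 50)
          true_moments = np.vstack([np.sin(2 * np.pi * 5 * times), np.cos(2 * np.pi * 7 * times)])
          data = lead_field(true_pos, sensors) @ true_moments + 0.01 * rng.standard_normal((32, 50))
          print("cost at true positions:     ", round(stsm_cost(true_pos, sensors, data), 4))
          print("cost at perturbed positions:", round(stsm_cost(true_pos + 0.2, sensors, data), 4))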

  3. BENDING RAY-TRACING BASED ON SIMULATED ANNEALING METHOD

    Institute of Scientific and Technical Information of China (English)

    周竹生; 谢金伟

    2011-01-01

    This paper proposes a new ray-tracing method based on the concept of simulated annealing. With the new method, not only is the problem that traditional ray-tracing methods depend too heavily on pre-established initial ray paths well solved, but the construction of satisfactory ray paths and the calculation of the associated traveltimes between fixed sources and receivers are also ensured, even for models with very complicated velocity fields; ray paths whose traveltimes approach the global minimum are found successfully. Furthermore, the algorithm can also calculate ray paths with locally minimal traveltimes, which can easily be constrained by requiring the rays to pass through specified fixed points. The feasibility and stability of the method have been proved by trial results on theoretical models.

  4. Supply-chain management based on simulated annealing algorithm

    Institute of Scientific and Technical Information of China (English)

    董雪

    2012-01-01

    With economic globalization, more and more enterprises focus on their core competitiveness, and logistics operations have gradually been separated from production and processing. Therefore, how to manage the relationship between suppliers and producers effectively (supply-chain management) has become a focus of enterprise competition and profit. Previous solutions of supply-chain models have mostly been based on genetic algorithms, which are mature and effective but have poor local search ability and long computing times. This paper applies a simulated annealing algorithm to study and analyze the solution of a supply-chain management model, and an example is given to show its effectiveness.

  5. Local error estimates for adaptive simulation of the reaction-diffusion master equation via operator splitting

    Science.gov (United States)

    Hellander, Andreas; Lawson, Michael J.; Drawert, Brian; Petzold, Linda

    2014-06-01

    The efficiency of exact simulation methods for the reaction-diffusion master equation (RDME) is severely limited by the large number of diffusion events if the mesh is fine or if diffusion constants are large. Furthermore, inherent properties of exact kinetic-Monte Carlo simulation methods limit the efficiency of parallel implementations. Several approximate and hybrid methods have appeared that enable more efficient simulation of the RDME. A common feature to most of them is that they rely on splitting the system into its reaction and diffusion parts and updating them sequentially over a discrete timestep. This use of operator splitting enables more efficient simulation but it comes at the price of a temporal discretization error that depends on the size of the timestep. So far, existing methods have not attempted to estimate or control this error in a systematic manner. This makes the solvers hard to use for practitioners since they must guess an appropriate timestep. It also makes the solvers potentially less efficient than if the timesteps were adapted to control the error. Here, we derive estimates of the local error and propose a strategy to adaptively select the timestep when the RDME is simulated via a first order operator splitting. While the strategy is general and applicable to a wide range of approximate and hybrid methods, we exemplify it here by extending a previously published approximate method, the diffusive finite-state projection (DFSP) method, to incorporate temporal adaptivity.
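
    As a loose illustration of the timestep-control idea, the sketch below applies the same step-doubling local-error estimate and step-size controller to a deterministic reaction-diffusion stand-in (explicit diffusion plus an exactly integrated linear reaction), since a full stochastic RDME/DFSP solver is beyond a short example. The operators, rate constants, tolerance and the first-order Lie splitting are all assumptions made for the sketch.

      import numpy as np

      D, K_DECAY, K_PROD, DX = 0.1, 1.0, 0.5, 0.1

      def diffusion_step(u, dt):
          lap = (np.roll(u, 1) - 2.0 * u + np.roll(u, -1)) / DX ** 2
          return u + dt * D * lap

      def reaction_step(u, dt):
          u_inf = K_PROD / K_DECAY                 # linear production/decay, integrated exactly
          return u_inf + (u - u_inf) * np.exp(-K_DECAY * dt)

      def lie_split_step(u, dt):
          return reaction_step(diffusion_step(u, dt), dt)

      def adaptive_splitting(u, t_end, dt=1e-2, tol=1e-4):
          t = 0.0
          while t_end - t > 1e-12:
              dt = min(dt, t_end - t, 0.4 * DX ** 2 / D)      # cap for explicit-diffusion stability
              full = lie_split_step(u, dt)
              half = lie_split_step(lie_split_step(u, dt / 2.0), dt / 2.0)
              err = float(np.max(np.abs(full - half)))        # local splitting-error estimate
              if err <= tol:
                  u, t = half, t + dt                         # accept the more accurate solution
              dt *= 0.9 * min(2.0, max(0.2, (tol / max(err, 1e-16)) ** 0.5))
          return u

      if __name__ == "__main__":
          x = np.arange(0.0, 10.0, DX)
          u = adaptive_splitting(np.exp(-(x - 5.0) ** 2), t_end=2.0)
          print("total mass at t = 2:", round(float(u.sum() * DX), 4))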

  6. Local error estimates for adaptive simulation of the Reaction–Diffusion Master Equation via operator splitting

    Science.gov (United States)

    Hellander, Andreas; Lawson, Michael J; Drawert, Brian; Petzold, Linda

    2015-01-01

    The efficiency of exact simulation methods for the reaction-diffusion master equation (RDME) is severely limited by the large number of diffusion events if the mesh is fine or if diffusion constants are large. Furthermore, inherent properties of exact kinetic-Monte Carlo simulation methods limit the efficiency of parallel implementations. Several approximate and hybrid methods have appeared that enable more efficient simulation of the RDME. A common feature to most of them is that they rely on splitting the system into its reaction and diffusion parts and updating them sequentially over a discrete timestep. This use of operator splitting enables more efficient simulation but it comes at the price of a temporal discretization error that depends on the size of the timestep. So far, existing methods have not attempted to estimate or control this error in a systematic manner. This makes the solvers hard to use for practitioners since they must guess an appropriate timestep. It also makes the solvers potentially less efficient than if the timesteps are adapted to control the error. Here, we derive estimates of the local error and propose a strategy to adaptively select the timestep when the RDME is simulated via a first order operator splitting. While the strategy is general and applicable to a wide range of approximate and hybrid methods, we exemplify it here by extending a previously published approximate method, the Diffusive Finite-State Projection (DFSP) method, to incorporate temporal adaptivity. PMID:26865735

  7. High-Performance Reactive Fluid Flow Simulations Using Adaptive Mesh Refinement on Thousands of Processors

    Science.gov (United States)

    Calder, A. C.; Curtis, B. C.; Dursi, L. J.; Fryxell, B.; Henry, G.; MacNeice, P.; Olson, K.; Ricker, P.; Rosner, R.; Timmes, F. X.; Tufo, H. M.; Truran, J. W.; Zingale, M.

    We present simulations and performance results of nuclear burning fronts in supernovae on the largest domain and at the finest spatial resolution studied to date. These simulations were performed on the Intel ASCI-Red machine at Sandia National Laboratories using FLASH, a code developed at the Center for Astrophysical Thermonuclear Flashes at the University of Chicago. FLASH is a modular, adaptive mesh, parallel simulation code capable of handling compressible, reactive fluid flows in astrophysical environments. FLASH is written primarily in Fortran 90, uses the Message-Passing Interface library for inter-processor communication and portability, and employs the PARAMESH package to manage a block-structured adaptive mesh that places blocks only where the resolution is required and tracks rapidly changing flow features, such as detonation fronts, with ease. We describe the key algorithms and their implementation as well as the optimizations required to achieve sustained performance of 238 GFLOPS on 6420 processors of ASCI-Red in 64-bit arithmetic.

  8. Adaptive grids and numerical fluid simulations for scrape-off layer plasmas

    International Nuclear Information System (INIS)

    Magnetic confinement nuclear fusion experiments create plasmas with local temperatures in excess of 100 million Kelvin. In these experiments the scrape-off layer, which is the plasma region in direct contact with the device wall, is of central importance both for the quality of the energy confinement and the wall material lifetime. To study the behaviour of the scrape-off layer, in addition to experiments, numerical simulations are used. This work investigates the use of adaptive discretizations of space and compatible numerical methods for scrape-off layer simulations. The resulting algorithms allow dynamic adaptation of computational grids aligned to the magnetic fields to precisely capture the strongly anisotropic energy and particle transport in the plasma. The methods are applied to the multi-fluid plasma code B2, with the goal of reducing the runtime of simulations and extending the applicability of the code.

  9. Availability simulation software adaptation to the IFMIF accelerator facility RAMI analyses

    International Nuclear Information System (INIS)

    Highlights: • The reason why IFMIF RAMI analyses needs a simulation is explained. • Changes, modifications and software validations done to AvailSim are described. • First IFMIF RAMI results obtained with AvailSim 2.0 are shown. • Implications of AvailSim 2.0 in IFMIF RAMI analyses are evaluated. - Abstract: Several problems were found when using generic reliability tools to perform RAMI (Reliability Availability Maintainability Inspectability) studies for the IFMIF (International Fusion Materials Irradiation Facility) accelerator. A dedicated simulation tool was necessary to model properly the complexity of the accelerator facility. AvailSim, the availability simulation software used for the International Linear Collider (ILC) became an excellent option to fulfill RAMI analyses needs. Nevertheless, this software needed to be adapted and modified to simulate the IFMIF accelerator facility in a useful way for the RAMI analyses in the current design phase. Furthermore, some improvements and new features have been added to the software. This software has become a great tool to simulate the peculiarities of the IFMIF accelerator facility allowing obtaining a realistic availability simulation. Degraded operation simulation and maintenance strategies are the main relevant features. In this paper, the necessity of this software, main modifications to improve it and its adaptation to IFMIF RAMI analysis are described. Moreover, first results obtained with AvailSim 2.0 and a comparison with previous results is shown

  10. Availability simulation software adaptation to the IFMIF accelerator facility RAMI analyses

    Energy Technology Data Exchange (ETDEWEB)

    Bargalló, Enric, E-mail: enric.bargallo-font@upc.edu [Fusion Energy Engineering Laboratory (FEEL), Technical University of Catalonia (UPC) Barcelona-Tech, Barcelona (Spain); Sureda, Pere Joan [Fusion Energy Engineering Laboratory (FEEL), Technical University of Catalonia (UPC) Barcelona-Tech, Barcelona (Spain); Arroyo, Jose Manuel [Laboratorio Nacional de Fusión por Confinamiento Magnético – CIEMAT, Madrid (Spain); Abal, Javier; De Blas, Alfredo; Dies, Javier; Tapia, Carlos [Fusion Energy Engineering Laboratory (FEEL), Technical University of Catalonia (UPC) Barcelona-Tech, Barcelona (Spain); Mollá, Joaquín; Ibarra, Ángel [Laboratorio Nacional de Fusión por Confinamiento Magnético – CIEMAT, Madrid (Spain)

    2014-10-15

    Highlights: • The reason why IFMIF RAMI analyses needs a simulation is explained. • Changes, modifications and software validations done to AvailSim are described. • First IFMIF RAMI results obtained with AvailSim 2.0 are shown. • Implications of AvailSim 2.0 in IFMIF RAMI analyses are evaluated. - Abstract: Several problems were found when using generic reliability tools to perform RAMI (Reliability Availability Maintainability Inspectability) studies for the IFMIF (International Fusion Materials Irradiation Facility) accelerator. A dedicated simulation tool was necessary to model properly the complexity of the accelerator facility. AvailSim, the availability simulation software used for the International Linear Collider (ILC) became an excellent option to fulfill RAMI analyses needs. Nevertheless, this software needed to be adapted and modified to simulate the IFMIF accelerator facility in a useful way for the RAMI analyses in the current design phase. Furthermore, some improvements and new features have been added to the software. This software has become a great tool to simulate the peculiarities of the IFMIF accelerator facility allowing obtaining a realistic availability simulation. Degraded operation simulation and maintenance strategies are the main relevant features. In this paper, the necessity of this software, main modifications to improve it and its adaptation to IFMIF RAMI analysis are described. Moreover, first results obtained with AvailSim 2.0 and a comparison with previous results is shown.

  11. Adaptive finite element simulation of flow and transport applications on parallel computers

    Science.gov (United States)

    Kirk, Benjamin Shelton

    The subject of this work is the adaptive finite element simulation of problems arising in flow and transport applications on parallel computers. Of particular interest are new contributions to adaptive mesh refinement (AMR) in this parallel high-performance context, including novel work on data structures, treatment of constraints in a parallel setting, generality and extensibility via object-oriented programming, and the design/implementation of a flexible software framework. This technology and software capability then enables more robust, reliable treatment of multiscale--multiphysics problems and specific studies of fine scale interaction such as those in biological chemotaxis (Chapter 4) and high-speed shock physics for compressible flows (Chapter 5). The work begins by presenting an overview of key concepts and data structures employed in AMR simulations. Of particular interest is how these concepts are applied in the physics-independent software framework which is developed here and is the basis for all the numerical simulations performed in this work. This open-source software framework has been adopted by a number of researchers in the U.S. and abroad for use in a wide range of applications. The dynamic nature of adaptive simulations poses particular issues for efficient implementation on distributed-memory parallel architectures. Communication cost, computational load balance, and memory requirements must all be considered when developing adaptive software for this class of machines. Specific extensions to the adaptive data structures to enable implementation on parallel computers are therefore considered in detail. The libMesh framework for performing adaptive finite element simulations on parallel computers is developed to provide a concrete implementation of the above ideas. This physics-independent framework is applied to two distinct flow and transport applications classes in the subsequent application studies to illustrate the flexibility of the

  12. Resource load balancing scheme based on simulated annealing algorithm

    Institute of Scientific and Technical Information of China (English)

    金杉; 麦丰; 任波

    2011-01-01

    The load balancing problem of network resources in large-scale ERP systems is studied. Firstly, the requirements of this problem are modeled and analyzed theoretically; secondly, a heuristic objective function which satisfies the host and network constraints is designed, and the model is transformed into a degree-constrained minimum spanning tree problem; thirdly, a simulated annealing algorithm is designed to deal with this problem, and a load balancing scheme based on the algorithm, named LABS, is proposed. Theoretical analysis shows that the time complexity of running the scheme over the whole network is quadratic in the number of nodes, indicating good usability and scalability. The simulation results show that, by selecting appropriate heuristic factors, the scheme not only engages most of the nodes in cooperative load-balancing operations, but also significantly reduces the number of bottleneck nodes and the average resource utilization rate.
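
    The abstract maps the scheme onto a degree-constrained minimum spanning tree and solves it with simulated annealing, but gives no algorithmic detail. One common way to do this is sketched below: candidate spanning trees are encoded as Pruefer sequences (a node's tree degree equals its number of occurrences in the sequence plus one, so the degree cap becomes a simple count check), and a plain SA loop mutates one sequence entry at a time. The random instance, degree cap and cooling schedule are assumptions, not the paper's.

      import heapq
      import math
      import random
      from collections import Counter

      def prufer_to_edges(seq, n):
          """Decode a Pruefer sequence of length n-2 into the n-1 edges of its tree."""
          degree = [1] * n
          for s in seq:
              degree[s] += 1
          leaves = [i for i in range(n) if degree[i] == 1]
          heapq.heapify(leaves)
          edges = []
          for s in seq:
              leaf = heapq.heappop(leaves)
              edges.append((leaf, s))
              degree[leaf] -= 1
              degree[s] -= 1
              if degree[s] == 1:
                  heapq.heappush(leaves, s)
          u, v = [i for i in range(n) if degree[i] == 1]
          edges.append((u, v))
          return edges

      def feasible(seq, max_deg):
          return all(c <= max_deg - 1 for c in Counter(seq).values())

      def sa_dcmst(w, max_deg=3, n_iter=20000, t0=1.0, alpha=0.9997):
          n = len(w)
          cost = lambda s: sum(w[a][b] for a, b in prufer_to_edges(s, n))
          seq = [random.randrange(n) for _ in range(n - 2)]
          while not feasible(seq, max_deg):
              seq = [random.randrange(n) for _ in range(n - 2)]
          f_cur = cost(seq)
          best, f_best, t = seq[:], f_cur, t0
          for _ in range(n_iter):
              cand = seq[:]
              cand[random.randrange(len(cand))] = random.randrange(n)
              t *= alpha
              if not feasible(cand, max_deg):
                  continue
              f_cand = cost(cand)
              if f_cand < f_cur or random.random() < math.exp((f_cur - f_cand) / t):
                  seq, f_cur = cand, f_cand
                  if f_cur < f_best:
                      best, f_best = seq[:], f_cur
          return prufer_to_edges(best, n), f_best

      if __name__ == "__main__":
          random.seed(2)
          pts = [(random.random(), random.random()) for _ in range(8)]
          w = [[math.dist(p, q) for q in pts] for p in pts]
          edges, total = sa_dcmst(w)
          print("degree-constrained spanning tree:", edges, "cost:", round(total, 3))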

  13. Spatially adaptive radiation-hydrodynamical simulations of galaxy formation during cosmological reionization

    CERN Document Server

    Pawlik, Andreas H; Vecchia, Claudio Dalla

    2015-01-01

    We present a suite of cosmological radiation-hydrodynamical simulations of the assembly of galaxies driving the reionization of the intergalactic medium (IGM) at z >~ 6. The simulations account for the hydrodynamical feedback from photoionization heating and the explosion of massive stars as supernovae (SNe). Our reference simulation, which was carried out in a box of size 25 comoving Mpc/h using 2 x 512^3 particles, produces a reasonable reionization history and matches the observed UV luminosity function of galaxies. Simulations with different box sizes and resolutions are used to investigate numerical convergence, and simulations in which either SNe or photoionization heating or both are turned off, are used to investigate the role of feedback from star formation. Ionizing radiation is treated using accurate radiative transfer at the high spatially adaptive resolution at which the hydrodynamics is carried out. SN feedback strongly reduces the star formation rates (SFRs) over nearly the full mass range of s...

  14. Adapting the serial Alpgen event generator to simulate LHC collisions on millions of parallel threads

    CERN Document Server

    Childers, J T; LeCompte, T J; Papka, M E; Benjamin, D P

    2015-01-01

    As the LHC moves to higher energies and luminosity, the demand for computing resources increases accordingly and will soon outpace the growth of the Worldwide LHC Computing Grid. To meet this greater demand, event generation Monte Carlo was targeted for adaptation to run on Mira, the supercomputer at the Argonne Leadership Computing Facility. Alpgen is a Monte Carlo event generation application that is used by LHC experiments in the simulation of collisions that take place in the Large Hadron Collider. This paper details the process by which Alpgen was adapted from a single-processor serial-application to a large-scale parallel-application and the performance that was achieved.

  15. Reinforcement learning for adaptive threshold control of restorative brain-computer interfaces: a Bayesian simulation.

    Science.gov (United States)

    Bauer, Robert; Gharabaghi, Alireza

    2015-01-01

    Restorative brain-computer interfaces (BCI) are increasingly used to provide feedback of neuronal states in a bid to normalize pathological brain activity and achieve behavioral gains. However, patients and healthy subjects alike often show a large variability, or even inability, of brain self-regulation for BCI control, known as BCI illiteracy. Although current co-adaptive algorithms are powerful for assistive BCIs, their inherent class switching clashes with the operant conditioning goal of restorative BCIs. Moreover, due to the treatment rationale, the classifier of restorative BCIs usually has a constrained feature space, thus limiting the possibility of classifier adaptation. In this context, we applied a Bayesian model of neurofeedback and reinforcement learning for different threshold selection strategies to study the impact of threshold adaptation of a linear classifier on optimizing restorative BCIs. For each feedback iteration, we first determined the thresholds that result in minimal action entropy and maximal instructional efficiency. We then used the resulting vector for the simulation of continuous threshold adaptation. We could thus show that threshold adaptation can improve reinforcement learning, particularly in cases of BCI illiteracy. Finally, on the basis of information-theory, we provided an explanation for the achieved benefits of adaptive threshold setting. PMID:25729347

  16. Reinforcement learning for adaptive threshold control of restorative brain-computer interfaces: a Bayesian simulation

    Directory of Open Access Journals (Sweden)

    Robert eBauer

    2015-02-01

    Full Text Available Restorative brain-computer interfaces (BCI) are increasingly used to provide feedback of neuronal states in a bid to normalize pathological brain activity and achieve behavioral gains. However, patients and healthy subjects alike often show a large variability, or even inability, of brain self-regulation for BCI control, known as BCI illiteracy. Although current co-adaptive algorithms are powerful for assistive BCIs, their inherent class switching clashes with the operant conditioning goal of restorative BCIs. Moreover, due to the treatment rationale, the classifier of restorative BCIs usually has a constrained feature space, thus limiting the possibility of classifier adaptation. In this context, we applied a Bayesian model of neurofeedback and reinforcement learning for different threshold selection strategies to study the impact of threshold adaptation of a linear classifier on optimizing restorative BCIs. For each feedback iteration, we first determined the thresholds that result in minimal action entropy and maximal instructional efficiency. We then used the resulting vector for the simulation of continuous threshold adaptation. We could thus show that threshold adaptation can improve reinforcement learning, particularly in cases of BCI illiteracy. Finally, on the basis of information theory, we provided an explanation for the achieved benefits of adaptive threshold setting.

  17. Simulation of Old Urban Residential Area Evolution Based on Complex Adaptive System

    Institute of Scientific and Technical Information of China (English)

    YANG Fan; WANG Xiao-ming; HUA Hong

    2009-01-01

    On the basis of complex adaptive system theory, this paper proposes an agent-based model of the old urban residential area, in which residents and providers are the two adaptive agents. The behaviors of residents and providers in this model are trained with back propagation and simulated with the Swarm software based on environment-rules-agents interaction. The model simulates the evolution of the old urban residential area and analyzes the relations between that evolution and urban management, taking Chaozhou city as the background. The following results are obtained: (1) Simulation without government intervention indicates a trend of housing ageing, environmental deterioration, economic depression, and social filtering-down in the old urban residential area. If the development of the old urban residential area is controlled by developers in the market, whose desire is profit maximization, factors such as social justice and historic and cultural value will be ignored. (2) If the government carries out policies and measures that serve their original aims, the simulation reveals that the old urban residential area can adapt to its environment and keep developing sustainably. This conclusion emphasizes that the government must act as initiator and program maker, directly guiding residents and other providers in the development of the old urban residential area.

  18. 3D design and electric simulation of a silicon drift detector using a spiral biasing adapter

    Science.gov (United States)

    Li, Yu-yun; Xiong, Bo; Li, Zheng

    2016-09-01

    The detector system combining a spiral biasing adapter (SBA) with a silicon drift detector (SBA-SDD) differs greatly from the traditional silicon drift detector (SDD), including the spiral SDD. It has a spiral biasing adapter of the same design as a traditional spiral SDD and an SDD with concentric rings having the same radius. Compared with the traditional spiral SDD, the SBA-SDD separates the spiral's two functions of biasing adapter and p-n junction definition. In this paper, the SBA-SDD is simulated using the Sentaurus TCAD tool, a full 3D device simulation tool. The simulated electric characteristics include electric potential, electric field, electron concentration, and the single event effect. Because of the special design of the SBA-SDD, the SBA can generate an optimum drift electric field in the SDD, comparable with that of the conventional spiral SDD, while the SDD can be designed with concentric rings to reduce surface area. Also, the current and heat generated in the SBA are separated from the SDD. To study the single event response, we simulated the induced current caused by incident heavy ions (20 and 50 μm penetration length) with different linear energy transfer (LET). The SBA-SDD can be used just like a conventional SDD, for example as an X-ray detector for energy spectroscopy and imaging.

  19. Adaptive Flow Simulation of Turbulence in Subject-Specific Abdominal Aortic Aneurysm on Massively Parallel Computers

    Science.gov (United States)

    Sahni, Onkar; Jansen, Kenneth; Shephard, Mark; Taylor, Charles

    2007-11-01

    Flow within the healthy human vascular system is typically laminar, but diseased conditions can alter the geometry sufficiently to produce transitional/turbulent flows in the region focal to (and immediately downstream of) the diseased section. The mean unsteadiness (pulsatile or respiratory cycle) further complicates the situation, making traditional turbulence simulation techniques (e.g., Reynolds-averaged Navier-Stokes simulations (RANSS)) suspect. At the other extreme, direct numerical simulation (DNS), while fully appropriate, can lead to large computational expense, particularly when the simulations must be done quickly since they are intended to affect the outcome of a medical treatment (e.g., virtual surgical planning). Producing simulations in a clinically relevant time frame requires: 1) an adaptive meshing technique that closely matches the desired local mesh resolution in all three directions to the highly anisotropic physical length scales in the flow, 2) efficient solution algorithms, and 3) excellent scaling on massively parallel computers. In this presentation we demonstrate results for a subject-specific simulation of an abdominal aortic aneurysm using a stabilized finite element method on anisotropically adapted meshes consisting of O(10^8) elements over O(10^4) processors.

  20. Level-by-level artificial viscosity and visualization for MHD simulation with adaptive mesh refinement

    Science.gov (United States)

    Hatori, Tomoharu; Ito, Atsushi M.; Nunami, Masanori; Usui, Hideyuki; Miura, Hideaki

    2016-08-01

    We propose a numerical method to determine the artificial viscosity in magnetohydrodynamics (MHD) simulations with the adaptive mesh refinement (AMR) method, where the artificial viscosity is adaptively changed according to the resolution level of the AMR hierarchy. Although the suitable value of the artificial viscosity depends on the governing equations and the model of the target problem, it can be determined by von Neumann stability analysis. By means of the new method, the "level-by-level artificial viscosity method," MHD simulations of the Rayleigh-Taylor instability (RTI) are carried out with the AMR method. The validity of the level-by-level artificial viscosity method is confirmed by comparing the linear growth rates of the RTI between the AMR simulations and simple simulations with a uniform grid and uniform artificial viscosity whose resolution is the same as that of the highest level of the AMR simulation. Moreover, in the nonlinear phase of the RTI, the secondary instability is clearly observed, and the hierarchical data structure of the AMR calculation is visualized as high-resolution regions that float up like terraced fields. In applications of the method to general fluid simulations, the growth of small structures can be sufficiently reproduced, while the divergence of numerical solutions can be suppressed.
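
    The central idea above, tying the artificial viscosity coefficient to the local refinement level, can be sketched as follows. In this minimal illustration the per-level coefficient is scaled from the base-level value by a power of the grid-spacing ratio; the exponent and the base coefficient are assumptions made for the sketch, whereas the paper determines the admissible values from a von Neumann stability analysis of the discretized equations.

      # Illustrative sketch (not the paper's implementation) of a
      # "level-by-level" artificial viscosity: each AMR level gets its own
      # coefficient, scaled from the base-level value by the ratio of grid
      # spacings.  The exponent and the base coefficient are assumptions.
      def level_spacing(dx_base, level, refinement_ratio=2):
          """Grid spacing on a given AMR level (level 0 = coarsest)."""
          return dx_base / refinement_ratio**level

      def artificial_viscosity(nu_base, dx_base, level, refinement_ratio=2, exponent=2):
          """Viscosity coefficient adapted to the resolution of 'level'."""
          dx = level_spacing(dx_base, level, refinement_ratio)
          return nu_base * (dx / dx_base) ** exponent

      # Example: coefficients for a 4-level hierarchy with base spacing 1/256.
      for lvl in range(4):
          print(lvl, artificial_viscosity(nu_base=1e-3, dx_base=1.0 / 256, level=lvl))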

  1. Quantum Annealing of a Disordered Magnet

    OpenAIRE

    Brooke, J.; Bitko, D.; Rosenbaum, T. F.; Aeppli, G.

    2001-01-01

    Traditional simulated annealing utilizes thermal fluctuations for convergence in optimization problems. Quantum tunneling provides a different mechanism for moving between states, with the potential for reduced time scales. We compare thermal and quantum annealing in a model disordered Ising magnet, LiHo0.44Y0.56F4, where the effects of quantum mechanics can be tuned in the laboratory by varying a magnetic field applied transverse to the Ising axis. Our results indicat...

  2. Validation Through Simulations of a Cn2 Profiler for the ESO/VLT Adaptive Optics Facility

    CERN Document Server

    Garcia-Rissmann, A; Kolb, J; Louarn, M Le; Madec, P -Y; Neichel, B

    2015-01-01

    The Adaptive Optics Facility (AOF) project envisages transforming one of the VLT units into an adaptive telescope and providing its ESO (European Southern Observatory) second generation instruments with turbulence-corrected wavefronts. For MUSE and HAWK-I this correction will be achieved through the GALACSI and GRAAL AO modules working in conjunction with a 1170-actuator Deformable Secondary Mirror (DSM) and the new Laser Guide Star Facility (4LGSF). Multiple wavefront sensors will enable GLAO and LTAO capabilities, whose performance can greatly benefit from knowledge of the stratification of the turbulence in the atmosphere. This work, based entirely on end-to-end simulations, describes the validation tests conducted on a Cn2 profiler adapted for the AOF specifications. Because an absolute profile calibration is strongly dependent on a reliable knowledge of turbulence parameters r0 and L0, the tests presented here refer only to normalized output profiles. Uncertainties in the input parameters inherent t...

  3. Adapt

    Science.gov (United States)

    Bargatze, L. F.

    2015-12-01

    Active Data Archive Product Tracking (ADAPT) is a collection of software routines that permits one to generate XML metadata files to describe and register data products in support of the NASA Heliophysics Virtual Observatory VxO effort. ADAPT is also a philosophy. The ADAPT concept is to use any and all available metadata associated with scientific data to produce XML metadata descriptions in a consistent, uniform, and organized fashion to provide blanket access to the full complement of data stored on a targeted data server. In this poster, we present an application of ADAPT to describe all of the data products that are stored by using the Common Data Format (CDF) served out by the CDAWEB and SPDF data servers hosted at the NASA Goddard Space Flight Center. These data servers are the primary repositories for NASA Heliophysics data. For this purpose, the ADAPT routines have been used to generate data resource descriptions by using an XML schema named Space Physics Archive, Search, and Extract (SPASE). SPASE is the designated standard for documenting Heliophysics data products, as adopted by the Heliophysics Data and Model Consortium. The set of SPASE XML resource descriptions produced by ADAPT includes high-level descriptions of numerical data products, display data products, or catalogs and also includes low-level "Granule" descriptions. A SPASE Granule is effectively a universal access metadata resource; a Granule associates an individual data file (e.g. a CDF file) with a "parent" high-level data resource description, assigns a resource identifier to the file, and lists the corresponding access URL(s). The CDAWEB and SPDF file systems were queried to provide the input required by the ADAPT software to create an initial set of SPASE metadata resource descriptions. The CDAWEB and SPDF data repositories were then queried on a nightly basis and the CDF file lists were checked for any changes such as the occurrence of new, modified, or deleted
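
    As a rough illustration of the Granule concept described above, the following sketch emits a minimal SPASE-style XML record that associates one CDF file with a parent data-product description and lists its access URL. The element names follow common SPASE usage, but the exact schema fields, namespaces, and identifiers here are assumptions for the sketch and are not output of the ADAPT routines.

      # Minimal sketch (not the ADAPT code) of a SPASE-style "Granule" record
      # that associates a single CDF file with a parent data-product
      # description.  Element names follow common SPASE conventions, but the
      # identifiers and URL below are illustrative assumptions.
      import xml.etree.ElementTree as ET

      def make_granule(parent_id, granule_id, release_date, access_url):
          spase = ET.Element("Spase")
          granule = ET.SubElement(spase, "Granule")
          ET.SubElement(granule, "ResourceID").text = granule_id
          ET.SubElement(granule, "ReleaseDate").text = release_date
          ET.SubElement(granule, "ParentID").text = parent_id
          source = ET.SubElement(granule, "Source")
          ET.SubElement(source, "SourceType").text = "Data"
          ET.SubElement(source, "URL").text = access_url
          return ET.tostring(spase, encoding="unicode")

      print(make_granule(
          parent_id="spase://EXAMPLE/NumericalData/MissionX/InstrumentY/PT1M",
          granule_id="spase://EXAMPLE/Granule/MissionX/InstrumentY/PT1M/20150101",
          release_date="2015-01-02T00:00:00Z",
          access_url="https://example.invalid/data/missionx_20150101_v01.cdf",
      ))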

  4. Wavelet-based adaptive numerical simulation of unsteady 3D flow around a bluff body

    Science.gov (United States)

    de Stefano, Giuliano; Vasilyev, Oleg

    2012-11-01

    The unsteady three-dimensional flow past a two-dimensional bluff body is numerically simulated using a wavelet-based method. The body is modeled by exploiting the Brinkman volume-penalization method, which results in modifying the governing equations with the addition of an appropriate forcing term inside the spatial region occupied by the obstacle. The volume-penalized incompressible Navier-Stokes equations are numerically solved by means of the adaptive wavelet collocation method, where the non-uniform spatial grid is dynamically adapted to the flow evolution. The combined approach is successfully applied to the simulation of vortex shedding flow behind a stationary prism with square cross-section. The computation is conducted at transitional Reynolds numbers, where fundamental unstable three-dimensional vortical structures exist, and the unsteady forces arising from fluid-structure interaction are well predicted.
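
    The volume-penalization idea mentioned above, adding a forcing term that drives the velocity toward the obstacle velocity inside a masked region, can be written compactly. The sketch below (an illustration, not the paper's adaptive wavelet solver) shows the penalization term for a stationary square obstacle on a uniform grid; the mask, the permeability parameter, and the toy velocity field are assumptions.

      # Sketch of a Brinkman volume-penalization forcing term (illustrative,
      # not the paper's solver): inside the obstacle (mask chi = 1) the term
      # -chi/eta * (u - u_obstacle) drives the fluid velocity toward the
      # obstacle velocity; eta is a small permeability parameter.
      import numpy as np

      def penalization_forcing(u, chi, u_obstacle=0.0, eta=1e-3):
          """Forcing to add to the right-hand side of the momentum equation."""
          return -chi / eta * (u - u_obstacle)

      # Example: square obstacle in the middle of a 64x64 domain, unit inflow.
      n = 64
      u = np.ones((n, n))                       # x-velocity field (toy data)
      chi = np.zeros((n, n))
      chi[24:40, 24:40] = 1.0                   # mask = 1 inside the square prism
      f = penalization_forcing(u, chi)
      print(f.min(), f.max())                   # strong damping only inside the mask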

  5. AN ADAPTIVE EFG-FE COUPLING METHOD FOR THE NUMERICAL SIMULATION OF EXTRUSION PROCESSES

    Institute of Scientific and Technical Information of China (English)

    L.C.Liu; X.H.Dong; C.X.Li

    2008-01-01

    An adaptive EFG-FE coupling method is proposed and developed for the numerical simulation of lateral extrusion and forward-backward extrusion. Initially, the simulation is carried out with a conventional FE model. During the deformation process, mesh quality is checked at every incremental step. Distorted elements are automatically converted to EFG nodes, whereas the less distorted elements are retained. A new algorithm to generate EFG nodes and interface elements is presented. This method is capable of dealing with large deformation and has higher computational efficiency than using an EFG method alone. Numerical results demonstrate that the adaptive EFG-FE coupling method has reasonable accuracy and is effective for local bulk metal forming such as extrusion processes.
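
    The per-step mesh quality check that decides which distorted elements are converted to EFG nodes can be sketched as follows; the triangle quality measure, the threshold, and the toy mesh are illustrative assumptions rather than the criterion used in the paper.

      # Sketch (illustrative, not the paper's implementation) of a mesh quality
      # check used to decide which distorted finite elements are converted into
      # EFG nodes: triangle quality is normalized area over squared edge
      # lengths, and elements below a threshold are handed over to EFG.
      import math

      def triangle_quality(p1, p2, p3):
          """1.0 for an equilateral triangle, approaches 0 as it degenerates."""
          (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
          area = 0.5 * abs((x2 - x1) * (y3 - y1) - (x3 - x1) * (y2 - y1))
          edges_sq = ((x2 - x1) ** 2 + (y2 - y1) ** 2 +
                      (x3 - x2) ** 2 + (y3 - y2) ** 2 +
                      (x1 - x3) ** 2 + (y1 - y3) ** 2)
          return 4.0 * math.sqrt(3.0) * area / edges_sq if edges_sq > 0 else 0.0

      def split_mesh(elements, threshold=0.3):
          """Return (kept FE elements, nodes of distorted elements given to EFG)."""
          kept, efg_nodes = [], set()
          for elem in elements:
              if triangle_quality(*elem) < threshold:
                  efg_nodes.update(elem)      # convert the distorted element's nodes
              else:
                  kept.append(elem)
          return kept, efg_nodes

      mesh = [((0, 0), (1, 0), (0.5, 0.9)),    # well-shaped element
              ((0, 0), (1, 0), (0.5, 0.05))]   # severely distorted element
      print(split_mesh(mesh))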

  6. Simulating computer adaptive testing with the Mood and Anxiety Symptom Questionnaire.

    Science.gov (United States)

    Flens, Gerard; Smits, Niels; Carlier, Ingrid; van Hemert, Albert M; de Beurs, Edwin

    2016-08-01

    In a post hoc simulation study (N = 3,597 psychiatric outpatients), we investigated whether the efficiency of the 90-item Mood and Anxiety Symptom Questionnaire (MASQ) could be improved for assessing clinical subjects with computerized adaptive testing (CAT). A CAT simulation was performed on each of the 3 MASQ subscales (Positive Affect, Negative Affect, and Somatic Anxiety). With the CAT simulation's stopping rule set at a high level of measurement precision, the results showed that patients' test administration can be shortened substantially; the mean decrease in items used for the subscales ranged from 56% up to 74%. Furthermore, the predictive utility of the CAT simulations was sufficient for all MASQ scales. The findings reveal that developing a MASQ CAT for clinical subjects is useful as it leads to more efficient measurement without compromising the reliability of the test outcomes.
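
    A precision-based stopping rule of the kind used in the simulation above typically ends item administration once the standard error of the ability estimate falls below a target value. The sketch below shows such a loop for a simple one-parameter (Rasch) item model with maximum-information item selection; the item bank, the ability update, and the precision target are illustrative assumptions, not the MASQ CAT itself.

      # Illustrative CAT loop (not the MASQ CAT): Rasch (1-PL) items, the next
      # item chosen by maximum Fisher information at the current ability
      # estimate, and a stopping rule on the standard error of measurement.
      # Item difficulties, the true ability, and the SE target are toy values.
      import math, random

      random.seed(0)
      bank = [{"b": random.gauss(0.0, 1.0), "used": False} for _ in range(90)]
      theta, true_theta, se_target = 0.0, 0.8, 0.32

      def p_correct(theta, b):
          return 1.0 / (1.0 + math.exp(-(theta - b)))

      def information(theta, b):
          p = p_correct(theta, b)
          return p * (1.0 - p)

      administered = []
      while True:
          # Administer the unused item with maximum information at theta.
          item = max((i for i in bank if not i["used"]),
                     key=lambda i: information(theta, i["b"]))
          item["used"] = True
          response = 1 if random.random() < p_correct(true_theta, item["b"]) else 0
          administered.append((item, response))
          # One Newton-Raphson step on the log-likelihood to update theta.
          grad = sum(r - p_correct(theta, i["b"]) for i, r in administered)
          info = sum(information(theta, i["b"]) for i, _ in administered)
          theta = max(-4.0, min(4.0, theta + grad / info))
          if 1.0 / math.sqrt(info) < se_target or len(administered) == len(bank):
              break

      print(len(administered), round(theta, 2), round(1.0 / math.sqrt(info), 2))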

  7. Scale-adaptive simulation of a hot jet in cross flow

    Energy Technology Data Exchange (ETDEWEB)

    Duda, B M; Esteve, M-J [AIRBUS Operations S.A.S., Toulouse (France); Menter, F R; Hansen, T, E-mail: benjamin.duda@airbus.com [ANSYS Germany GmbH, Otterfing (Germany)

    2011-12-22

    The simulation of a hot jet in cross flow is of crucial interest for the aircraft industry as it directly impacts aircraft safety and global performance. Due to the highly transient and turbulent character of this flow, simulation strategies are necessary that resolve at least a part of the turbulence spectrum. The high Reynolds numbers for realistic aircraft applications do not permit the use of pure Large Eddy Simulations as the spatial and temporal resolution requirements for wall bounded flows are prohibitive in an industrial design process. For this reason, the hybrid approach of the Scale-Adaptive Simulation is employed, which retains attached boundary layers in well-established RANS regime and allows the resolution of turbulent fluctuations in areas with sufficient flow instabilities and grid refinement. To evaluate the influence of the underlying numerical grid, three meshing strategies are investigated and the results are validated against experimental data.

  8. Scale-adaptive simulation of a hot jet in cross flow

    Science.gov (United States)

    Duda, B. M.; Menter, F. R.; Hansen, T.; Esteve, M.-J.

    2011-12-01

    The simulation of a hot jet in cross flow is of crucial interest for the aircraft industry as it directly impacts aircraft safety and global performance. Due to the highly transient and turbulent character of this flow, simulation strategies are necessary that resolve at least a part of the turbulence spectrum. The high Reynolds numbers for realistic aircraft applications do not permit the use of pure Large Eddy Simulations as the spatial and temporal resolution requirements for wall bounded flows are prohibitive in an industrial design process. For this reason, the hybrid approach of the Scale-Adaptive Simulation is employed, which retains attached boundary layers in well-established RANS regime and allows the resolution of turbulent fluctuations in areas with sufficient flow instabilities and grid refinement. To evaluate the influence of the underlying numerical grid, three meshing strategies are investigated and the results are validated against experimental data.

  9. Predictive wind turbine simulation with an adaptive lattice Boltzmann method for moving boundaries

    Science.gov (United States)

    Deiterding, Ralf; Wood, Stephen L.

    2016-09-01

    Operating horizontal axis wind turbines create large-scale turbulent wake structures that affect the power output of downwind turbines considerably. The computational prediction of this phenomenon is challenging as efficient low dissipation schemes are necessary that represent the vorticity production by the moving structures accurately and that are able to transport wakes without significant artificial decay over distances of several rotor diameters. We have developed a parallel adaptive lattice Boltzmann method for large eddy simulation of turbulent weakly compressible flows with embedded moving structures that considers these requirements rather naturally and enables first principle simulations of wake-turbine interaction phenomena at reasonable computational costs. The paper describes the employed computational techniques and presents validation simulations for the Mexnext benchmark experiments as well as simulations of the wake propagation in the Scaled Wind Farm Technology (SWIFT) array consisting of three Vestas V27 turbines in triangular arrangement.

  10. Multi-GPU adaptation of a simulator of heart electric activity

    Directory of Open Access Journals (Sweden)

    Víctor M. García

    2013-12-01

    Full Text Available The simulation of the electrical activity of the heart is calculated by solving a large system of ordinary differential equations, which takes an enormous amount of computation time. In recent years, graphics processing units (GPUs) have been introduced in the field of high performance computing. These powerful computing devices have attracted research groups that need to simulate the electrical activity of the heart. The research group signing this paper has developed a simulator of cardiac electrical activity that runs on a single GPU. This article describes the adaptation and modification of the simulator to run on multiple GPUs. The results confirm that the technique significantly reduces the execution time compared to that obtained with a single GPU and allows the solution of larger problems.

  11. Harmonic amplitude-phase adaptive control and its application in three-axis simulators

    Institute of Scientific and Technical Information of China (English)

    ZHANG Yuan-sheng; YANG Yi-dong

    2005-01-01

    This paper proposes a compensation method that uses Harmonic Amplitude-Phase Adaptive Control (HAPAC) to increase the precision of sinusoidal motion simulators. It also expounds the HAPAC principle and structural disposition, develops the HAPAC control laws, and analyzes the system stability under HAPAC. A method for further improving the precision using online identification of the system's frequency-response models is presented. The tested data and tracking errors of the simulator demonstrate that HAPAC enables sinusoidal motions to achieve higher precision than common classical controls. HAPAC can also be used in other tracking systems requiring precise sinusoidal motions.
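
    The single-harmonic version of the amplitude-phase compensation idea can be sketched as follows: estimate the amplitude and phase of the measured output at the command frequency (here by correlation with sine and cosine references over one period) and slowly adjust the commanded amplitude and phase until the achieved motion matches the desired one. The simple attenuating-and-lagging plant, the adaptation gains, and all numerical values are illustrative assumptions, not the HAPAC laws developed in the paper.

      # Toy single-harmonic amplitude-phase adaptation (not the paper's HAPAC
      # law): the plant attenuates and delays a sinusoidal command; after each
      # cycle the achieved amplitude/phase are estimated by correlation with
      # sin/cos references and the command is corrected toward the desired
      # motion.  Plant gain/lag and adaptation gains are assumptions.
      import math

      desired_amp, desired_phase, w = 1.0, 0.0, 2.0 * math.pi   # 1 Hz reference
      plant_gain, plant_lag = 0.8, math.radians(-12.0)          # unknown to controller
      cmd_amp, cmd_phase = desired_amp, desired_phase
      n, dt = 200, 1.0 / 200.0                                  # samples per cycle

      for cycle in range(30):
          # Simulate one period of plant response to the current command.
          y = [plant_gain * cmd_amp * math.sin(w * k * dt + cmd_phase + plant_lag)
               for k in range(n)]
          # Fourier correlation at the fundamental to estimate achieved amp/phase.
          a = 2.0 / n * sum(yk * math.sin(w * k * dt) for k, yk in enumerate(y))
          b = 2.0 / n * sum(yk * math.cos(w * k * dt) for k, yk in enumerate(y))
          meas_amp, meas_phase = math.hypot(a, b), math.atan2(b, a)
          # Adapt command amplitude and phase toward the desired motion.
          cmd_amp += 0.5 * (desired_amp - meas_amp)
          cmd_phase += 0.5 * (desired_phase - meas_phase)

      print(round(meas_amp, 3), round(math.degrees(meas_phase), 2))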

  12. Aeroacoustic Simulation of Nose Landing Gear on Adaptive Unstructured Grids With FUN3D

    Science.gov (United States)

    Vatsa, Veer N.; Khorrami, Mehdi R.; Park, Michael A.; Lockhard, David P.

    2013-01-01

    Numerical simulations have been performed for a partially-dressed, cavity-closed nose landing gear configuration that was tested in NASA Langley's closed-wall Basic Aerodynamic Research Tunnel (BART) and in the University of Florida's open-jet acoustic facility known as the UFAFF. The unstructured-grid flow solver FUN3D, developed at NASA Langley Research Center, is used to compute the unsteady flow field for this configuration. Starting with a coarse grid, a series of successively finer grids were generated using the adaptive gridding methodology available in the FUN3D code. A hybrid Reynolds-averaged Navier-Stokes/large eddy simulation (RANS/LES) turbulence model is used for these computations. Time-averaged and instantaneous solutions obtained on these grids are compared with the measured data. In general, the correlation with the experimental data improves with grid refinement. A similar trend is observed for sound pressure levels obtained by using these CFD solutions as input to a Ffowcs Williams-Hawkings noise propagation code to compute the farfield noise levels. In general, the numerical solutions obtained on adapted grids compare well with the hand-tuned enriched fine grid solutions and experimental data. In addition, the grid adaptation strategy discussed here simplifies the grid generation process and results in improved computational efficiency of CFD simulations.

  13. Friction compensation for low velocity control of hydraulic flight motion simulator: A simple adaptive robust approach

    Institute of Scientific and Technical Information of China (English)

    Yao Jianyong; Jiao Zongxia; Han Songshan

    2013-01-01

    Low-velocity tracking capability is a key performance measure of a flight motion simulator (FMS), and it is mainly affected by nonlinear friction force. Though many compensation schemes with ad hoc friction models have been proposed, this paper deals with low-velocity control without a friction model, since this is easy to implement in practice. Firstly, a nonlinear model of the FMS middle frame, which is driven by a hydraulic rotary actuator, is built. Noting that in the low-velocity region the unmodeled friction force is mainly characterized by a slowly changing part, a simple adaptive law can be employed to learn this slowly changing part and compensate for it. To guarantee the boundedness of the adaptation process, a discontinuous projection is utilized and a robust scheme is then proposed. The controller achieves a prescribed output tracking transient performance and final tracking accuracy in general, while obtaining asymptotic output tracking in the absence of modeling errors. In addition, a saturated projection adaptive scheme is proposed to improve the global learning capability when the velocity becomes large, which might otherwise make the previously proposed projection-based adaptive law unstable. Theoretical and extensive experimental results are obtained to verify the high-performance nature of the proposed adaptive robust control strategy.
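
    The slowly changing friction term described above can be learned with a simple projected adaptation law: integrate the velocity tracking error into a disturbance estimate, clamp (project) the estimate to a known bound, and feed it forward in the control signal. The first-order velocity-loop model, the gains, and the bound in the sketch below are illustrative assumptions rather than the controller designed in the paper.

      # Toy projected adaptive compensation (illustrative, not the paper's ARC
      # design): a first-order velocity loop with an unknown slowly varying
      # friction disturbance d(t); the estimate d_hat is driven by the velocity
      # tracking error and clamped (discontinuous projection) to a known bound.
      import math

      J, dt, gain = 1.0, 1e-3, 20.0          # inertia, step, feedback gain (assumed)
      gamma, d_bound = 50.0, 2.0             # adaptation rate and projection bound
      v, d_hat = 0.0, 0.0

      def projection(x, bound):
          """Clamp the estimate so it stays inside the assumed disturbance bound."""
          return max(-bound, min(bound, x))

      for k in range(20000):
          t = k * dt
          v_ref = 0.05 * math.sin(0.5 * t)                    # low-velocity command
          d_true = 1.2 + 0.3 * math.sin(0.05 * t)             # slowly varying friction
          e = v_ref - v
          u = gain * e + d_hat                                # feedback + compensation
          v += dt / J * (u - d_true)                          # plant: J*dv/dt = u - d
          d_hat = projection(d_hat + dt * gamma * e, d_bound) # projected adaptation

      print(round(d_hat, 3), round(abs(v_ref - v), 5))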

  14. Research on Workshop's Layout Planning Based on the Genetic Simulated Annealing Hybrid Algorithm

    Institute of Scientific and Technical Information of China (English)

    宛剑业; 张飞超; 高丽媛; 刘卫博

    2016-01-01

    For the layout planning of the electronic throttle production workshop of company YY, this paper applies both the traditional SLP method and a genetic simulated annealing hybrid algorithm, and uses the Proplanner software to carry out a simulation study of the two layout schemes (Scheme 1 and Scheme 2) obtained with the two methods. The simulation results show that Scheme 2 is clearly better than Scheme 1 in terms of part transport distance, handling time, and handling cost, which indicates that, for workshop layout planning, the genetic simulated annealing hybrid algorithm is more feasible and rational than SLP.
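
    The hybrid idea, keeping the genetic algorithm's population search while accepting mutated offspring with a simulated annealing (Metropolis) criterion, can be sketched for a single-row layout in which departments are ordered to minimize a flow-weighted handling distance. The flow matrix, the operators, and the cooling schedule below are illustrative assumptions, not the scheme used in the study.

      # Illustrative genetic simulated annealing hybrid (not the study's code)
      # for a single-row layout: a permutation of departments is evolved with
      # crossover and mutation, and worse offspring are accepted with a
      # Metropolis criterion whose temperature is annealed each generation.
      import math, random

      random.seed(1)
      n = 8                                             # departments in the workshop
      flow = [[random.randint(0, 9) * (i != j) for j in range(n)] for i in range(n)]

      def handling_cost(order):
          """Total flow-weighted distance for departments placed on a line."""
          pos = {dept: k for k, dept in enumerate(order)}
          return sum(flow[i][j] * abs(pos[i] - pos[j]) for i in range(n) for j in range(n))

      def crossover(p1, p2):
          """Order crossover: keep a slice of p1, fill the rest in p2's order."""
          a, b = sorted(random.sample(range(n), 2))
          kept = p1[a:b]
          rest = [d for d in p2 if d not in kept]
          return rest[:a] + kept + rest[a:]

      def mutate(order):
          i, j = random.sample(range(n), 2)
          order = order[:]
          order[i], order[j] = order[j], order[i]
          return order

      pop = [random.sample(range(n), n) for _ in range(20)]
      temperature = 50.0
      for generation in range(200):
          pop.sort(key=handling_cost)
          new_pop = pop[:5]                              # elitism
          while len(new_pop) < len(pop):
              p1, p2 = random.sample(pop[:10], 2)
              child = mutate(crossover(p1, p2))
              parent_cost, child_cost = handling_cost(p1), handling_cost(child)
              # Simulated-annealing acceptance of worse offspring.
              if child_cost <= parent_cost or \
                 random.random() < math.exp((parent_cost - child_cost) / temperature):
                  new_pop.append(child)
              else:
                  new_pop.append(p1)
          pop, temperature = new_pop, temperature * 0.97

      best = min(pop, key=handling_cost)
      print(best, handling_cost(best))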

  15. Research on wireless sensor network data aggregation based on a simulated annealing genetic algorithm

    Institute of Scientific and Technical Information of China (English)

    张扬; 杨松涛; 张香芝

    2012-01-01

    This paper studies data fusion technology for wireless sensor networks (WSN). Sensor nodes have limited computing and communication capability, and WSNs are deployed with overlapping coverage, which leads to a large amount of redundant data; data fusion technology is therefore needed to eliminate redundant and invalid data and save network communication energy. Combining the global search strength of the genetic algorithm with the local search strength of the simulated annealing algorithm, a simulated annealing genetic algorithm (SA-GA) data fusion method for WSNs is proposed. The simulated annealing genetic algorithm quickly finds the optimal sensor node sequence for the mobile agent route and performs data fusion along it. Simulation results show that, compared with the genetic algorithm and the simulated annealing algorithm, SA-GA can find the globally optimal data fusion node sequence faster, fuse the data effectively, and achieve lower network energy consumption and network delay.

  16. Research on an intelligent test item generation system based on the simulated annealing algorithm

    Institute of Scientific and Technical Information of China (English)

    路鹏; 丛晓; 周东岱

    2013-01-01

    With the application of artificial intelligence techniques in the field of educational evaluation, computerized adaptive testing has gradually become one of the most important forms of educational assessment. In such a test, the computer dynamically updates its estimate of the learner's ability level and selects tailored questions for the learner from an item bank, which requires the system to have a relatively high execution efficiency in order to meet the needs of practical testing. To solve this problem, an intelligent item generation system based on the simulated annealing algorithm is proposed. The experimental results show that the method selects nearly optimal items from the item bank for learners while also greatly improving the item-selection efficiency of the system.
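
    One way a simulated annealing item selector of the kind described above can work is to treat the selected set of items as the state, swap one selected item for an unselected one as the neighbourhood move, and penalize deviation from a target amount of test information at the learner's current ability estimate. Everything in the sketch below (the item model, the target, the cooling schedule) is an illustrative assumption, not the system evaluated in the paper.

      # Sketch (illustrative assumptions throughout, not the paper's system) of
      # simulated annealing used to assemble a short test from an item bank:
      # the state is a subset of items, a neighbour swaps one selected item for
      # an unselected one, and the energy penalizes deviation from a target
      # amount of Fisher information at the learner's current ability estimate.
      import math, random

      random.seed(2)
      bank = [random.gauss(0.0, 1.2) for _ in range(300)]   # item difficulties
      theta, test_len, target_info = 0.4, 15, 3.2

      def item_info(b):
          p = 1.0 / (1.0 + math.exp(-(theta - b)))
          return p * (1.0 - p)

      def energy(selection):
          return abs(sum(item_info(bank[i]) for i in selection) - target_info)

      current = random.sample(range(len(bank)), test_len)
      current_e, temperature = energy(current), 1.0
      for step in range(5000):
          candidate = current[:]
          out_idx = random.randrange(test_len)
          swap_in = random.choice([i for i in range(len(bank)) if i not in current])
          candidate[out_idx] = swap_in
          cand_e = energy(candidate)
          if cand_e <= current_e or random.random() < math.exp((current_e - cand_e) / temperature):
              current, current_e = candidate, cand_e
          temperature *= 0.999

      print(round(current_e, 4))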

  17. Adaptive Resolution Simulation of Supramolecular Water: The Concurrent Making, Breaking, and Remaking of Water Bundles.

    Science.gov (United States)

    Zavadlav, Julija; Marrink, Siewert J; Praprotnik, Matej

    2016-08-01

    The adaptive resolution scheme (AdResS) is a multiscale molecular dynamics simulation approach that can concurrently couple atomistic (AT) and coarse-grained (CG) resolution regions, i.e., the molecules can freely adapt their resolution according to their current position in the system. Coupling to supramolecular CG models, where several molecules are represented as a single CG bead, is challenging, but it provides higher computational gains and connection to the established MARTINI CG force field. Difficulties that arise from such coupling have been so far bypassed with bundled AT water models, where additional harmonic bonds between oxygen atoms within a given supramolecular water bundle are introduced. While these models simplify the supramolecular coupling, they also cause in certain situations spurious artifacts, such as partial unfolding of biomolecules. In this work, we present a new clustering algorithm SWINGER that can concurrently make, break, and remake water bundles and in conjunction with the AdResS permits the use of original AT water models. We apply our approach to simulate a hybrid SPC/MARTINI water system and show that the essential properties of water are correctly reproduced with respect to the standard monoscale simulations. The developed hybrid water model can be used in biomolecular simulations, where a significant speed up can be obtained without compromising the accuracy of the AT water model. PMID:27409519

  18. Adaptive Resolution Simulation of Supramolecular Water: The Concurrent Making, Breaking, and Remaking of Water Bundles.

    Science.gov (United States)

    Zavadlav, Julija; Marrink, Siewert J; Praprotnik, Matej

    2016-08-01

    The adaptive resolution scheme (AdResS) is a multiscale molecular dynamics simulation approach that can concurrently couple atomistic (AT) and coarse-grained (CG) resolution regions, i.e., the molecules can freely adapt their resolution according to their current position in the system. Coupling to supramolecular CG models, where several molecules are represented as a single CG bead, is challenging, but it provides higher computational gains and connection to the established MARTINI CG force field. Difficulties that arise from such coupling have been so far bypassed with bundled AT water models, where additional harmonic bonds between oxygen atoms within a given supramolecular water bundle are introduced. While these models simplify the supramolecular coupling, they also cause in certain situations spurious artifacts, such as partial unfolding of biomolecules. In this work, we present a new clustering algorithm SWINGER that can concurrently make, break, and remake water bundles and in conjunction with the AdResS permits the use of original AT water models. We apply our approach to simulate a hybrid SPC/MARTINI water system and show that the essential properties of water are correctly reproduced with respect to the standard monoscale simulations. The developed hybrid water model can be used in biomolecular simulations, where a significant speed up can be obtained without compromising the accuracy of the AT water model.

  19. Goal-Oriented Self-Adaptive hp Finite Element Simulation of 3D DC Borehole Resistivity Simulations

    KAUST Repository

    Calo, Victor M.

    2011-05-14

    In this paper we present a goal-oriented self-adaptive hp Finite Element Method (hp-FEM) with shared data structures and a parallel multi-frontal direct solver. The algorithm automatically generates (without any user interaction) a sequence of meshes delivering exponential convergence of a prescribed quantity of interest with respect to the number of degrees of freedom. The sequence of meshes is generated from a given initial mesh, by performing h (breaking elements into smaller elements), p (adjusting polynomial orders of approximation) or hp (both) refinements on the finite elements. The new parallel implementation utilizes a computational mesh shared between multiple processors. All computational algorithms, including automatic hp goal-oriented adaptivity and the solver work fully in parallel. We describe the parallel self-adaptive hp-FEM algorithm with shared computational domain, as well as its efficiency measurements. We apply the methodology described to the three-dimensional simulation of the borehole resistivity measurement of direct current through casing in the presence of invasion.
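
    The marking step of a goal-oriented hp strategy like the one described above can be illustrated with synthetic data: each element carries a primal error indicator and an adjoint (goal) weight, the elements contributing most to the error in the quantity of interest are marked, and a smoothness estimate decides between p-enrichment and h-subdivision. All quantities and thresholds in the sketch are assumptions for illustration, not the paper's algorithm.

      # Toy sketch of a goal-oriented hp marking step (illustrative assumptions,
      # not the paper's algorithm): elements with the largest goal-weighted
      # error indicator are refined, with p-refinement where the local solution
      # looks smooth and h-refinement (subdivision) where it does not.
      import random

      random.seed(3)
      elements = [{"id": k,
                   "indicator": random.expovariate(5.0),   # primal residual estimate
                   "goal_weight": random.random(),          # adjoint-based weight
                   "smoothness": random.random(),           # decay-rate estimate
                   "p": 2}
                  for k in range(40)]

      def weighted_error(e):
          return e["indicator"] * e["goal_weight"]

      # Mark the elements responsible for most of the error in the quantity of interest.
      marked = sorted(elements, key=weighted_error, reverse=True)[:len(elements) // 5]

      refinements = []
      for e in marked:
          if e["smoothness"] > 0.7:            # smooth: raise polynomial order (p)
              refinements.append((e["id"], "p", e["p"] + 1))
          else:                                # non-smooth: subdivide element (h)
              refinements.append((e["id"], "h", e["p"]))

      print(refinements)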

  20. An Overview of Approaches to Modernize Quantum Annealing Using Local Searches

    OpenAIRE

    Chancellor, Nicholas

    2016-01-01

    I describe how real quantum annealers may be used to perform local (in state space) searches around specified states, rather than the global searches traditionally implemented in the quantum annealing algorithm. The quantum annealing algorithm is an analogue of simulated annealing, a classical numerical technique which is now obsolete. Hence, I explore strategies to use an annealer in a way which takes advantage of modern classical optimization algorithms, and additionally should be less sens...