Adaptive simulated annealing (ASA): Lessons learned
Ingber, L.
2000-01-01
Adaptive simulated annealing (ASA) is a global optimization algorithm based on an associated proof that the parameter space can be sampled much more efficiently than with previous simulated annealing algorithms. The author's ASA code has been publicly available for over two years. During this time the author has volunteered to help people via e-mail, and the feedback obtained has been used to further develop the code. Some lessons learned, in particular some which are relevant to ot...
Adaptive Simulated Annealing Based Protein Loop Modeling of Neurotoxins
陈杰; 黄丽娜; 彭志红
2003-01-01
A loop modeling method, adaptive simulated annealing, is proposed for ab initio prediction of protein loop structures, treating the prediction as an optimization problem of searching for the global minimum of a given energy function. An interface-friendly toolbox, LoopModeller, is developed for analysis and visualization on Windows and Linux systems in VC++ and OpenGL environments. Simulation results for three short-chain neurotoxins modeled by LoopModeller show that the proposed method is fast and efficient.
Sheng, Zheng, E-mail: 19994035@sina.com [College of Meteorology and Oceanography, PLA University of Science and Technology, Nanjing 211101 (China); Wang, Jun; Zhou, Bihua [National Defense Key Laboratory on Lightning Protection and Electromagnetic Camouflage, PLA University of Science and Technology, Nanjing 210007 (China); Zhou, Shudao [College of Meteorology and Oceanography, PLA University of Science and Technology, Nanjing 211101 (China); Collaborative Innovation Center on Forecast and Evaluation of Meteorological Disasters, Nanjing University of Information Science and Technology, Nanjing 210044 (China)
2014-03-15
This paper introduces a novel hybrid optimization algorithm to establish the parameters of chaotic systems. To deal with the weaknesses of the traditional cuckoo search algorithm, the proposed adaptive cuckoo search with simulated annealing algorithm incorporates an adaptive parameter-adjusting operation and a simulated annealing operation into the cuckoo search algorithm. Normally, the parameters of the cuckoo search algorithm are kept constant, which may decrease the efficiency of the algorithm. To balance and enhance the accuracy and convergence rate of the cuckoo search algorithm, the adaptive operation is introduced to tune the parameters properly. Besides, the local search capability of the cuckoo search algorithm is relatively weak, which may lower the quality of optimization, so the simulated annealing operation is merged into the cuckoo search algorithm to enhance the local search ability and improve the accuracy and reliability of the results. The functionality of the proposed hybrid algorithm is investigated on the Lorenz chaotic system under noiseless and noisy conditions, respectively. The numerical results demonstrate that the method can estimate parameters efficiently and accurately in both conditions. Finally, the results are compared with the traditional cuckoo search algorithm, the genetic algorithm, and particle swarm optimization. Simulation results demonstrate the effectiveness and superior performance of the proposed algorithm.
Simulated annealing versus quantum annealing
Troyer, Matthias
Based on simulated classical annealing and simulated quantum annealing using quantum Monte Carlo (QMC) simulations, I will explore the question of where physical or simulated quantum annealers may outperform classical optimization algorithms. Although the stochastic dynamics of QMC simulations is not the same as the unitary dynamics of a quantum system, I will first show that for the problem of quantum tunneling between two local minima, both QMC simulations and a physical system exhibit the same scaling of tunneling times with barrier height. The scaling in both cases is O(1/Δ²), where Δ is the tunneling splitting. An important consequence is that QMC simulations can be used to predict the performance of a quantum annealer for tunneling through a barrier. Furthermore, by using open instead of periodic boundary conditions in imaginary time, equivalent to a projector QMC algorithm, one obtains a quadratic speedup for QMC, achieving linear scaling in 1/Δ. I will then address the apparent contradiction between experiments on a D-Wave 2 system that failed to see evidence of quantum speedup and previous QMC results that indicated an advantage of quantum annealing over classical annealing for spin glasses. We find that this contradiction is resolved by taking the continuous-time limit in the QMC simulations, which then agree with the experimentally observed behavior and show no speedup for 2D spin glasses. However, QMC simulations with large time steps gain a further advantage: they "cheat" by ignoring what happens during a (large) time step, and can thus outperform both simulated quantum annealers and classical annealers. I will then address the question of how to optimally run a simulated or physical quantum annealer. Investigating the behavior of the tails of the distribution of runtimes for very hard instances, we find that adiabatically slow annealing is far from optimal. On the contrary, many repeated relatively fast annealing runs can be orders of magnitude faster for
Ry, Rexha Verdhora, E-mail: rexha.vry@gmail.com [Master Program of Geophysical Engineering, Faculty of Mining and Petroleum Engineering, Institut Teknologi Bandung, Jalan Ganesha No.10, Bandung 40132 (Indonesia); Nugraha, Andri Dian, E-mail: nugraha@gf.itb.ac.id [Global Geophysical Research Group, Faculty of Mining and Petroleum Engineering, Institut Teknologi Bandung, Jalan Ganesha No.10, Bandung 40132 (Indonesia)
2015-04-24
Observation of earthquakes is routinely used in tectonic activity monitoring, and also at local scales such as volcano-tectonic and geothermal activity monitoring. Precise hypocenter determination is essential; the process involves finding a hypocenter location that minimizes the error between observed and calculated travel times. When solving this nonlinear inverse problem, the simulated annealing inversion method can be applied as a global optimization technique whose convergence is independent of the initial model. In this study, we developed our own program code applying adaptive simulated annealing inversion in the Matlab environment. We applied this method to determine earthquake hypocenters for several data cases: regional tectonic, volcano-tectonic, and geothermal field. The travel times were calculated using the ray-tracing shooting method. We then compared the results with those of Geiger's method to analyze reliability. Our results show that hypocenter locations have smaller RMS errors than Geiger's results, which can be statistically associated with better solutions. The earthquake hypocenters also correlate well with geological structure in the study area. We recommend adaptive simulated annealing inversion for relocating hypocenters in order to obtain precise and accurate earthquake locations.
Adaptive MANET Multipath Routing Algorithm Based on the Simulated Annealing Approach
Sungwook Kim
2014-01-01
A mobile ad hoc network is a system of wireless mobile nodes that can freely and dynamically self-organize network topologies without any preexisting communication infrastructure. Due to characteristics like temporary topology and absence of centralized authority, routing is one of the major issues in ad hoc networks. In this paper, a new multipath routing scheme is proposed employing a simulated annealing approach. The proposed metaheuristic approach can achieve greater and reciprocal advantages in a hostile, dynamic, real-world network situation. The proposed routing scheme is therefore a powerful method for finding an effective solution to the mobile ad hoc network routing problem. Simulation results indicate that the proposed paradigm adapts best to the variation of dynamic network situations. The average remaining energy, network throughput, packet loss probability, and traffic load distribution are improved by about 10%, 10%, 5%, and 10%, respectively, over the existing schemes.
Iglesias-Marzoa, Ramón; Morales, María Jesús Arévalo
2015-01-01
The fitting of radial velocity curves is a frequent procedure in binary star and exoplanet research. In the majority of cases the fitting routines need to be fed with a set of initial parameter values and priors from which to begin the computations, and their results can be affected by local minima. We present a new code, the rvfit code, for fitting radial velocities of stellar binaries and exoplanets using an Adaptive Simulated Annealing (ASA) global minimization method, which quickly converges to a global minimum without the need to provide preliminary parameter values. We show the performance of the code using both synthetic and real data sets: double-lined binaries, single-lined binaries, and exoplanet systems. In all examples the Keplerian orbital parameters fitted by the rvfit code and their computed uncertainties are compared with literature solutions. Finally, we provide the source code with a working example and a detailed description of how to use it.
Generalized Simulated Annealing
Tsallis, Constantino; Stariolo, Daniel A.
1995-01-01
We propose a new stochastic algorithm (generalized simulated annealing) for computationally finding the global minimum of a given (not necessarily convex) energy/cost function defined in a continuous D-dimensional space. This algorithm recovers, as particular cases, the so called classical ("Boltzmann machine") and fast ("Cauchy machine") simulated annealings, and can be quicker than both. Key-words: simulated annealing; nonconvex optimization; gradient descent; generalized statistical mechan...
姚新; 李国杰
1991-01-01
Simulated annealing is a new kind of random search method developed in recent years. It can also be considered an extension of the classical hill-climbing method in AI: probabilistic hill-climbing. One of its most important features is its global convergence. The convergence of a simulated annealing algorithm is determined by the state generating probability, the state accepting probability, and the temperature decreasing rate. This paper gives a generalized simulated annealing algorithm with dynamic generating and accepting probabilities. The paper also shows that the generating and accepting probabilities can adopt many different kinds of distributions while global convergence is guaranteed.
Keystream Generator Based On Simulated Annealing
Ayad A. Abdulsalam
2011-01-01
Advances in the design of keystream generators using heuristic techniques are reported. A simulated annealing algorithm for generating random keystreams with large complexity is presented, and the simulated annealing technique is adapted to locate these requirements. The definitions of some cryptographic properties are generalized, providing a measure suitable for use as an objective function in a simulated annealing algorithm seeking randomness that satisfies both correlation immunity and large linear complexity. Results are presented demonstrating the effectiveness of the method.
On lumped models for thermodynamic properties of simulated annealing problems
Andresen, Bjarne; Hoffmann, Karl Heinz; Mosegaard, Klaus; Nulton, Jim; Pedersen, Jacob Mørch; Salamon, Peter
1988-01-01
The paper describes a new method for the estimation of thermodynamic properties for simulated annealing problems using data obtained during a simulated annealing run. The method works by estimating energy-to-energy transition probabilities and is well adapted to simulations such as simulated annealing, in which the system is never in equilibrium.
multicast using Simulated Annealing
Yezid Donoso
2005-01-01
This article presents a multiobjective optimization method for solving the load balancing problem in multicast transmission networks, based on the Simulated Annealing metaheuristic. The method minimizes four basic parameters to guarantee quality of service in multicast transmissions: source-destination delay, maximum link utilization, consumed bandwidth, and number of hops. The results returned by the heuristic are compared with the results produced by the mathematical model proposed in previous research.
Implementation of a Simulated Annealing algorithm for Matlab
Moins, Stephane
2002-01-01
In this report we describe an adaptive simulated annealing method for sizing the devices in analog circuits. The motivation for using an adaptive simulated annealing method for analog circuit design is to increase the efficiency of circuit design. To demonstrate the functionality and performance of the approach, an operational transconductance amplifier is simulated. The circuit is modeled with symbolic equations that are derived automatically by a simulator.
Recursive simulation of quantum annealing
Sowa, A P; Samson, J H; Savel'ev, S E; Zagoskin, A M; Heidel, S; Zúñiga-Anaya, J C
2015-01-01
The evaluation of the performance of adiabatic annealers is hindered by the lack of efficient algorithms for simulating their behaviour. We exploit the analyticity of the standard model for the adiabatic quantum process to develop an efficient recursive method for its numerical simulation in the case of both unitary and non-unitary evolution. Numerical simulations show distinctly different distributions for the most important figure of merit of adiabatic quantum computing, the success probability, in these two cases.
Simulation of Storm Occurrences Using Simulated Annealing.
Lokupitiya, Ravindra S.; Borgman, Leon E.; Anderson-Sprecher, Richard
2005-11-01
Modeling storm occurrences has become a vital part of hurricane prediction. In this paper, a method for simulating event occurrences using a simulated annealing algorithm is described. The method is illustrated using annual counts of hurricanes and of tropical storms in the Atlantic Ocean and Gulf of Mexico. Simulations closely match distributional properties, including possible correlations, in the historical data. For hurricanes, traditionally used Poisson and negative binomial processes also predict univariate properties well, but for tropical storms parametric methods are less successful. The authors determined that simulated annealing replicates properties of both series. Simulated annealing can be designed so that simulations mimic historical distributional properties to whatever degree is desired, including occurrence of extreme events and temporal patterning.
Using Simulated Annealing to Factor Numbers
Altschuler, Eric Lewin; Williams, Timothy J.
2014-01-01
Almost all public secure communication relies on the inability to factor large numbers. There is no known analytic or classical numeric method to rapidly factor large numbers. Shor [1] has shown that a quantum computer can factor numbers in polynomial time, but there is no practical quantum computer that can yet do such computations. We show that a simulated annealing [2] approach can be adapted to find factors of large numbers.
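The abstract does not give the authors' actual cost function, but the idea can be illustrated with a toy sketch: treat an odd trial divisor p as the annealing state and minimize the remainder n mod p, accepting uphill moves with Boltzmann probability. All constants and names here are illustrative assumptions, not the paper's method:

```python
import math
import random

def sa_factor(n, steps=20000, t0=50.0, seed=1):
    """Toy simulated-annealing factor search: the state is an odd trial
    divisor p, the cost is n % p, and uphill moves are accepted with
    Boltzmann probability. Returns a nontrivial factor or None."""
    rng = random.Random(seed)
    hi = int(math.isqrt(n))
    hi = hi if hi % 2 else hi - 1          # keep the upper bound odd
    p = rng.randrange(3, hi + 1, 2)        # random odd starting divisor
    cost = n % p
    t = t0
    for _ in range(steps):
        if cost == 0:
            return p                       # p divides n exactly
        # propose another odd divisor via an even-sized random step
        q = p + rng.choice((-2, 2)) * rng.randrange(1, 50)
        q = min(max(q, 3), hi)             # clamp to the odd-bounded range
        c = n % q
        if c < cost or rng.random() < math.exp(-(c - cost) / max(t, 1e-12)):
            p, cost = q, c                 # accept the move
        t *= 0.999                         # cool the temperature
    return p if cost == 0 else None

factor = sa_factor(101 * 103)              # 10403 = 101 * 103
```

For a semiprime this size the search space is tiny, so the walk reliably reaches the divisor; the point is only the shape of the adaptation, not competitiveness with real factoring algorithms.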
Berthiau, G.
1995-10-01
The circuit design problem consists in determining acceptable parameter values (resistors, capacitors, transistor geometries...) which allow the circuit to meet various user-given operational criteria (DC consumption, AC bandwidth, transient times...). This task is equivalent to a multidimensional and/or multiobjective optimization problem: n-variable functions have to be minimized in a hyper-rectangular domain; equality constraints can also be specified. A similar problem consists in fitting component models, where the optimization variables are the model parameters and one aims at minimizing a cost function built on the error between the model response and the data measured on the component. The optimization method chosen for this kind of problem is simulated annealing. This method, which comes from the combinatorial optimization domain, has been adapted and compared with other global optimization methods for continuous-variable problems. An efficient strategy of variable discretization and a set of complementary stopping criteria have been proposed. The different parameters of the method have been adjusted with analytical functions whose minima are known, classically used in the literature. Our simulated annealing algorithm has been coupled with the open electrical simulator SPICE-PAC, whose modular structure allows the chaining of simulations required by the circuit optimization process. We proposed, for high-dimensional problems, a partitioning technique which ensures proportionality between CPU time and the number of variables. To compare our method with others, we adapted three other methods from the combinatorial optimization domain (the threshold method, a genetic algorithm, and the Tabu search method). The tests have been performed on the same set of test functions, and the results allow a first comparison between these methods applied to continuous optimization variables. (Abstract Truncated)
Feasibility of Simulated Annealing Tomography
Vo, Nghia T; Moser, Herbert O
2014-01-01
Simulated annealing tomography (SAT) is a simple iterative image reconstruction technique which can yield a superior reconstruction compared with filtered back-projection (FBP). However, the very high computational cost of iteratively calculating discrete Radon transform (DRT) has limited the feasibility of this technique. In this paper, we propose an approach based on the pre-calculated intersection lengths array (PILA) which helps to remove the step of computing DRT in the simulated annealing procedure and speed up SAT by over 300 times. The enhancement of convergence speed of the reconstruction process using the best of multiple-estimate (BoME) strategy is introduced. The performance of SAT under different conditions and in comparison with other methods is demonstrated by numerical experiments.
Recursive Branching Simulated Annealing Algorithm
Bolcar, Matthew; Smith, J. Scott; Aronstein, David
2012-01-01
This innovation is a variation of a simulated-annealing optimization algorithm that uses a recursive-branching structure to parallelize the search of a parameter space for the globally optimal solution to an objective. The algorithm has been demonstrated to be more effective at searching a parameter space than traditional simulated-annealing methods for a particular problem of interest, and it can readily be applied to a wide variety of optimization problems, including those with a parameter space having both discrete-value parameters (combinatorial) and continuous-variable parameters. It can take the place of a conventional simulated-annealing, Monte Carlo, or random-walk algorithm. In a conventional simulated-annealing (SA) algorithm, a starting configuration is randomly selected within the parameter space. The algorithm randomly selects another configuration from the parameter space and evaluates the objective function for that configuration. If the objective function value is better than the previous value, the new configuration is adopted as the new point of interest in the parameter space. If the objective function value is worse than the previous value, the new configuration may be adopted, with a probability determined by a temperature parameter, used in analogy to annealing in metals. As the optimization continues, the region of the parameter space from which new configurations can be selected shrinks, and in conjunction with lowering the annealing temperature (and thus lowering the probability for adopting configurations in parameter space with worse objective functions), the algorithm can converge on the globally optimal configuration. The Recursive Branching Simulated Annealing (RBSA) algorithm shares some features with the SA algorithm, notably including the basic principles that a starting configuration is randomly selected from within the parameter space, the algorithm tests other configurations with the goal of finding the globally optimal
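The conventional SA loop described in the abstract (random start, Metropolis acceptance governed by a temperature, and a shrinking selection region) can be sketched as a generic minimal implementation; this is not the RBSA code, and all constants are arbitrary:

```python
import math
import random

def simulated_annealing(objective, low, high, t0=1.0, cooling=0.95,
                        steps=2000, seed=0):
    """Minimal 1-D simulated annealing: random start, Metropolis acceptance,
    and a proposal neighborhood that shrinks as the temperature is lowered."""
    rng = random.Random(seed)
    x = rng.uniform(low, high)          # random starting configuration
    fx = objective(x)
    t = t0
    radius = (high - low) / 2.0         # selection-region radius
    for _ in range(steps):
        # propose a configuration from the (shrinking) neighborhood
        cand = min(high, max(low, x + rng.uniform(-radius, radius)))
        fc = objective(cand)
        # always accept improvements; accept worse moves with probability exp(-dF/T)
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
        t *= cooling                    # lower the annealing temperature
        radius *= 0.999                 # shrink the selection region
    return x, fx

# Minimize a simple convex test function with known minimum at x = 2
x, fx = simulated_annealing(lambda x: (x - 2.0) ** 2, -10.0, 10.0)
```

The shrinking `radius` plays the role of the narrowing selection region mentioned in the abstract; RBSA adds recursive branching on top of this basic scheme.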
Very Fast Simulated Re-Annealing
Ingber, Lester
1989-01-01
An algorithm is developed to statistically find the best global fit of a nonlinear non-convex cost function over a D-dimensional space. It is argued that this algorithm permits an annealing schedule for "temperature" T decreasing exponentially in annealing-time k, T = T0 exp(−c k^(1/D)). The introduction of re-annealing also permits adaptation to changing sensitivities in the multidimensional parameter space. This annealing schedule is faster than fast Cauchy annealing, ...
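The quoted schedule can be evaluated directly; a small sketch (the values of T0, c, and k below are arbitrary illustrations) showing how the k^(1/D) exponent slows the cooling as the dimension grows:

```python
import math

def vfsr_temperature(t0, c, k, d):
    """Very fast simulated re-annealing schedule: T = T0 * exp(-c * k**(1/D)),
    with annealing-time index k and parameter-space dimension D."""
    return t0 * math.exp(-c * k ** (1.0 / d))

# With the same T0 and c, a higher dimension D means much slower cooling:
t_1d = vfsr_temperature(1.0, 1.0, 100, 1)   # exp(-100): effectively frozen
t_4d = vfsr_temperature(1.0, 1.0, 100, 4)   # exp(-100**0.25): still warm
```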
NEW SIMULATED ANNEALING ALGORITHMS FOR CONSTRAINED OPTIMIZATION
LINET ÖZDAMAR; CHANDRA SEKHAR PEDAMALLU
2010-01-01
We propose a Population based dual-sequence Non-Penalty Annealing algorithm (PNPA) for solving the general nonlinear constrained optimization problem. The PNPA maintains a population of solutions that are intermixed by crossover to supply a new starting solution for simulated annealing throughout the search. Every time the search gets stuck at a local optimum, this crossover procedure is triggered and the simulated annealing search re-starts from a new subspace. In both the crossover and simulate...
Quantum annealing speedup over simulated annealing on random Ising chains
Zanca, Tommaso
2015-01-01
We show clear evidence of a speedup of a quantum annealing (QA) Schrödinger dynamics over a Glauber master-equation simulated annealing (SA) for a random Ising model in one dimension. Annealings are tackled on equal footing, by a deterministic dynamics of the resulting Jordan-Wigner fermionic problems. We find that disorder, without frustration, makes both SA and real-time QA logarithmically slow in the annealing time τ, but QA shows a quadratic speedup with respect to SA. We also find that an imaginary-time Schrödinger QA dynamics provides a further exponential speedup, with an asymptotic residual error compatible with a power law τ^(−μ).
Simulated Annealing using Hybrid Monte Carlo
Salazar, Rafael; Toral, Raúl
1997-01-01
We propose a variant of the simulated annealing method for optimization in the multivariate analysis of differentiable functions. The method uses global updates via the hybrid Monte Carlo algorithm, in its generalized version, for the proposal of new configurations. We show how this choice can improve upon the performance of simulated annealing methods (mainly when the number of variables is large) by allowing a more effective searching scheme and a faster annealing schedule.
Cylinder packing by simulated annealing
M. Helena Correia
2000-12-01
This paper is motivated by the problem of loading identical items of circular base (tubes, rolls, ...) onto a rectangular base (the pallet). For practical reasons, all loaded items are considered to have the same height. The resolution of this problem consists in determining the positioning pattern of the circular bases of the items on the rectangular pallet while maximizing the number of items. This pattern is repeated for each layer stacked on the pallet. Two algorithms based on the Simulated Annealing metaheuristic have been developed and implemented. Tuning these algorithms' parameters required running intensive tests in order to improve their efficiency. The algorithms developed were easily extended to the case of non-identical circles.
Quantum annealing speedup over simulated annealing on random Ising chains
Zanca, Tommaso; Santoro, Giuseppe E.
2016-06-01
We show clear evidence of a quadratic speedup of a quantum annealing (QA) Schrödinger dynamics over a Glauber master equation simulated annealing (SA) for a random Ising model in one dimension, via an equal-footing exact deterministic dynamics of the Jordan-Wigner fermionized problems. This is remarkable, in view of the arguments of H. G. Katzgraber et al. [Phys. Rev. X 4, 021008 (2014), 10.1103/PhysRevX.4.021008], since SA does not encounter any phase transition, while QA does. We also find a second remarkable result: that a "quantum-inspired" imaginary-time Schrödinger QA provides a further exponential speedup, i.e., an asymptotic residual error decreasing as a power law τ^(−μ) of the annealing time τ.
Classical Simulated Annealing Using Quantum Analogues
La Cour, Brian R.; Troupe, James E.; Mark, Hans M.
2016-06-01
In this paper we consider the use of certain classical analogues to quantum tunneling behavior to improve the performance of simulated annealing on a discrete spin system of the general Ising form. Specifically, we consider the use of multiple simultaneous spin flips at each annealing step as an analogue to quantum spin coherence as well as modifications of the Boltzmann acceptance probability to mimic quantum tunneling. We find that the use of multiple spin flips can indeed be advantageous under certain annealing schedules, but only for long anneal times.
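A classical stand-in for the multiple-simultaneous-spin-flip moves described above can be sketched on a small ferromagnetic Ising ring. This is an illustrative reconstruction under assumed conventions (uniform coupling J, a linear annealing schedule, contiguous flip blocks), not the authors' code:

```python
import math
import random

def ising_energy(spins, j=1.0):
    """Energy of a 1-D Ising ring with uniform ferromagnetic coupling J."""
    n = len(spins)
    return -j * sum(spins[i] * spins[(i + 1) % n] for i in range(n))

def anneal_multiflip(n=32, steps=20000, t0=2.0, max_flips=3, seed=0):
    """SA on an Ising ring where each move flips a random block of
    1..max_flips adjacent spins (a classical stand-in for multi-spin moves)."""
    rng = random.Random(seed)
    spins = [rng.choice((-1, 1)) for _ in range(n)]
    e = ising_energy(spins)
    for step in range(steps):
        t = t0 * (1.0 - step / steps) + 1e-3       # linear annealing schedule
        k = rng.randint(1, max_flips)              # how many spins to flip
        start = rng.randrange(n)
        idx = [(start + i) % n for i in range(k)]  # contiguous block of spins
        for i in idx:
            spins[i] = -spins[i]
        e_new = ising_energy(spins)
        if e_new <= e or rng.random() < math.exp(-(e_new - e) / t):
            e = e_new                              # accept the move
        else:
            for i in idx:                          # reject: undo the flips
                spins[i] = -spins[i]
    return spins, e

spins, e = anneal_multiflip()
```

With max_flips = 1 this reduces to ordinary single-spin-flip SA, so the parameter isolates the effect of multi-spin moves that the paper studies; the ground-state energy of the ring is -n*J.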
Simulated Quantum Annealing Can Be Exponentially Faster than Classical Simulated Annealing
Crosson, Elizabeth; Harrow, Aram W.
2016-01-01
Simulated Quantum Annealing (SQA) is a Markov Chain Monte-Carlo algorithm that samples the equilibrium thermal state of a Quantum Annealing (QA) Hamiltonian. In addition to simulating quantum systems, SQA has also been proposed as another physics-inspired classical algorithm for combinatorial optimization, alongside classical simulated annealing. However, in many cases it remains an open challenge to determine the performance of both QA and SQA. One piece of evidence for the strength of Q...
An Introduction to Simulated Annealing
Albright, Brian
2007-01-01
An attempt to model the physical process of annealing led to the development of a type of combinatorial optimization algorithm that addresses the problem of becoming trapped in a local minimum. The author presents a Microsoft Excel spreadsheet that illustrates how this works.
Stochastic annealing simulation of cascades in metals
Heinisch, H.L.
1996-04-01
The stochastic annealing simulation code ALSOME is used to investigate quantitatively the differential production of mobile vacancy and SIA defects as a function of temperature for isolated 25 keV cascades in copper generated by MD simulations. The ALSOME code and cascade annealing simulations are described. The annealing simulations indicate that above Stage V, where the cascade vacancy clusters are unstable, nearly 80% of the post-quench vacancies escape the cascade volume, while about half of the post-quench SIAs remain in clusters. The results are sensitive to the relative fractions of SIAs that occur in small, highly mobile clusters and large stable clusters, respectively, which may be dependent on the cascade energy.
Constrained multi-global optimization using a penalty stretched simulated annealing framework
Pereira, Ana I.; Edite M.G.P. Fernandes
2009-01-01
This paper presents a new simulated annealing algorithm to solve constrained multi-global optimization problems. To compute all global solutions in a sequential manner, we combine the function stretching technique with the adaptive simulated annealing variant. Constraint-handling is carried out through a nondifferentiable penalty function. To benchmark our penalty stretched simulated annealing algorithm we solve a set of well-known problems. Our preliminary numerical results show that the alg...
Hierarchical Network Design Using Simulated Annealing
Thomadsen, Tommy; Clausen, Jens
2002-01-01
networks are described and a mathematical model is proposed for a two level version of the hierarchical network problem. The problem is to determine which edges should connect nodes, and how demand is routed in the network. The problem is solved heuristically using simulated annealing which as a sub...
Parallel simulated annealing algorithms for cell placement on hypercube multiprocessors
Banerjee, Prithviraj; Jones, Mark Howard; Sargent, Jeff S.
1990-01-01
Two parallel algorithms for standard cell placement using simulated annealing are developed to run on distributed-memory message-passing hypercube multiprocessors. The cells can be mapped in a two-dimensional area of a chip onto processors in an n-dimensional hypercube in two ways, such that both small and large cell exchange and displacement moves can be applied. The computation of the cost function in parallel among all the processors in the hypercube is described, along with a distributed data structure that needs to be stored in the hypercube to support the parallel cost evaluation. A novel tree broadcasting strategy is used extensively for updating cell locations in the parallel environment. A dynamic parallel annealing schedule estimates the errors due to interacting parallel moves and adapts the rate of synchronization automatically. Two novel approaches in controlling error in parallel algorithms are described: heuristic cell coloring and adaptive sequence control.
Learning FCM by chaotic simulated annealing
Fuzzy cognitive map (FCM) is a directed graph that shows the relations between essential components in complex systems. It is a very convenient, simple, and powerful tool that is used in numerous areas of application. Experts who are familiar with the system components and their relations can generate a related FCM. A significant gap arises, however, when human experts cannot produce an FCM, or when no expert is available to produce one; a new mechanism must then be used to bridge this gap. In this paper, a novel learning method is proposed to construct FCMs using chaotic simulated annealing (CSA). The proposed method is able not only to construct the FCM graph topology but also to extract the weights of the edges from input historical data. The efficiency of the proposed method is shown by comparing its results on some numerical examples with those of the simulated annealing (SA) method.
Simulated annealing in orbital flight planning
Soller, Jeffrey
1990-01-01
Simulated annealing is used to solve a minimum-fuel trajectory problem in the space station environment. The environment is unique because the space station will define the first true multivehicle environment in space. The optimization yields surfaces which are potentially complex, with multiple local minima. Because of the likelihood of these local minima, descent techniques are unable to offer robust solutions. Other deterministic optimization techniques were explored without success. The simulated annealing optimization is capable of identifying a minimum-fuel, two-burn trajectory subject to four constraints. Furthermore, the computational efforts involved in the optimization are such that missions could be planned on board the space station. Potential applications could include the on-site planning of rendezvous with a target craft or the emergency rescue of an astronaut. Future research will include multiwaypoint maneuvers, using a knowledge base to guide the optimization.
Code Generator for Quantum Simulated Annealing
Tucci, Robert R
2009-01-01
This paper introduces QuSAnn v1.2 and Multiplexor Expander v1.2, two Java applications available for free. (Source code included in the distribution.) QuSAnn is a "code generator" for quantum simulated annealing: after the user inputs some parameters, it outputs a quantum circuit for performing simulated annealing on a quantum computer. The quantum circuit implements the algorithm of Wocjan et al. (arXiv:0804.4259), which improves on the original algorithm of Somma et al. (arXiv:0712.1008). The quantum circuit generated by QuSAnn includes some quantum multiplexors. The application Multiplexor Expander allows the user to replace each of those multiplexors by a sequence of more elementary gates such as multiply controlled NOTs and qubit rotations.
Simulated Annealing with Tsallis Weights - A Numerical Comparison
Hansmann, Ulrich H.E.
1997-01-01
We discuss the use of Tsallis generalized mechanics in simulated annealing algorithms. For a small peptide it is shown that older implementations are not more effective than regular simulated annealing in finding ground state configurations. We propose a new implementation which leads to an improvement over regular simulated annealing.
Solving maximum cut problems by simulated annealing
Myklebust, Tor G. J.
2015-01-01
This paper gives a straightforward implementation of simulated annealing for solving maximum cut problems and compares its performance to that of some existing heuristic solvers. The formulation used is classical, dating to a 1989 paper of Johnson, Aragon, McGeoch, and Schevon. This implementation uses no structure peculiar to the maximum cut problem, but its low per-iteration cost allows it to find better solutions than were previously known for 40 of the 89 standard maximum cut instances te...
Binary Sparse Phase Retrieval via Simulated Annealing
Wei Peng
2016-01-01
This paper presents the Simulated Annealing Sparse PhAse Recovery (SASPAR) algorithm for reconstructing sparse binary signals from the phaseless magnitudes of their Fourier transform. A greedy, parameter-free version is also proposed for comparison. Extensive numerical simulations indicate that the method is quite effective and suggest the binary model is robust. The SASPAR algorithm is competitive with existing methods in efficiency and recovery rate, even with fewer Fourier measurements.
Reactor controller design using genetic algorithms with simulated annealing
This chapter presents a digital control system for the ITU TRIGA Mark-II reactor using genetic algorithms with simulated annealing. The basic principles of genetic algorithms for problem solving are inspired by the mechanism of natural selection. Natural selection is a biological process in which stronger individuals are likely to be winners in a competing environment. Genetic algorithms use a direct analogy of natural evolution. Genetic algorithms are global search techniques for optimisation, but they are poor at hill-climbing. Simulated annealing has the ability of probabilistic hill-climbing. Thus, the two techniques are combined here to obtain a fine-tuned algorithm that yields faster convergence and a more accurate search, by introducing a simulated-annealing-like mutation operator and an adaptive cooling schedule. In control system design, there are currently no systematic approaches to choosing the controller parameters to obtain the desired performance. The controller parameters are usually determined by trial and error with simulation and experimental analysis. A genetic algorithm is used to automatically and efficiently search for a set of controller parameters that gives better performance. (orig.)
Hypocoercivity in metastable settings and kinetic simulated annealing
Monmarché, Pierre
2015-01-01
Classical analysis of the simulated annealing algorithm is combined with the more recent hypocoercive method of distorted entropy to prove the convergence for large time of the kinetic Langevin annealing with logarithmic cooling schedule.
Simulated annealing algorithm for detecting graph isomorphism
Geng Xiutang; Zhang Kai
2008-01-01
Evolutionary computation techniques have mostly been used to solve various optimization problems, and it is well known that the graph isomorphism problem (GIP) is a nondeterministic polynomial problem. A simulated annealing (SA) algorithm for detecting graph isomorphism is proposed, and the proposed SA algorithm is well suited to deal with random graphs of large size. To verify the validity of the proposed SA algorithm, simulations are performed on three pairs of small graphs and four pairs of large random graphs with edge densities 0.5, 0.1, and 0.01, respectively. The simulation results show that the proposed SA algorithm can detect graph isomorphism with a high probability.
Simulated Annealing of Two Electron Density Solution Systems
Neto, Mario de Oliveira; Alonso, Ronaldo Luiz; Leite, Fabio Lima; Jr, Osvaldo N. Oliveira; Polikarpov, Igor; Mascarenhas, Yvonne Primerano
2008-01-01
Many structural studies have been performed with a combination of SAXS and simulated annealing to reconstruct three-dimensional models. Simulated annealing is suitable for the study of monodisperse, dilute, two-electron-density systems. In this chapter we show how the simulated annealing procedure can be used to minimize the discrepancy between two functions: the simulated intensity and the experimental one-dimensional SAXS curve. The goal was to find the most probable form for a prot...
MEDICAL STAFF SCHEDULING USING SIMULATED ANNEALING
Ladislav Rosocha
2015-07-01
Purpose: The efficiency of medical staff is a fundamental feature of healthcare facility quality. Therefore, better implementation of their preferences into the scheduling problem might not only raise the work-life balance of doctors and nurses, but may also result in better patient care. This paper focuses on optimization of medical staff preferences in the scheduling problem. Methodology/Approach: We propose a medical staff scheduling algorithm based on simulated annealing, a well-known method from statistical thermodynamics. We define hard constraints, which are linked to legal and working regulations, and minimize the violations of soft constraints, which are related to the quality of work, well-being, and work-life balance of staff. Findings: On a sample of 60 physicians and nurses from a gynecology department, we generated monthly schedules and optimized their preferences in terms of soft constraints. Our results indicate that the final value of the objective function optimized by the proposed algorithm has more than 18 times fewer violations of soft constraints than the initially generated random schedule that satisfied the hard constraints. Research Limitation/implication: Even though the global optimality of the final outcome is not guaranteed, a desirable solution was obtained in reasonable time. Originality/Value of paper: We show that the designed algorithm is able to successfully generate schedules regarding hard and soft constraints. Moreover, the presented method is significantly faster than standard schedule generation and is able to reschedule effectively due to the local neighborhood search characteristics of simulated annealing.
Measures of Fault Tolerance in Distributed Simulated Annealing
Prakash, Aaditya
2012-01-01
In this paper, we examine the different measures of fault tolerance in a distributed simulated annealing process. Optimization by simulated annealing on a distributed system is prone to various sources of failure. We analyse the simulated annealing algorithm, its architecture on a distributed platform, and potential sources of failure. We examine the behaviour of a fault-tolerant distributed system for the optimization task. We present possible methods to overcome the failures and achieve fault tolerance for t...
A Parallel Genetic Simulated Annealing Hybrid Algorithm for Task Scheduling
SHU Wanneng; ZHENG Shijue
2006-01-01
In this paper, combining the advantages of genetic algorithms and simulated annealing, we put forward a parallel genetic simulated annealing hybrid algorithm (PGSAHA) and apply it to solve the task scheduling problem in grid computing. It first generates a new group of individuals through genetic operations such as reproduction, crossover and mutation, and then simulated-anneals all the generated individuals independently. When the temperature in the cooling process no longer falls, the result is the global optimal solution. From the analysis and experimental results, it is concluded that this algorithm is superior to both the genetic algorithm and simulated annealing.
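The generate-then-anneal loop described above can be sketched on a toy scheduling instance (tasks of given durations assigned to identical machines, minimizing the makespan). The population size, cooling rate, mutation probability, and the instance itself are illustrative assumptions, not PGSAHA's actual parameters:

```python
import math
import random

def makespan(assign, durations, m):
    """Makespan (largest machine load) of a task-to-machine assignment."""
    loads = [0.0] * m
    for task, machine in enumerate(assign):
        loads[machine] += durations[task]
    return max(loads)

def sa_refine(assign, durations, m, rng, steps=200, t0=2.0):
    """Anneal one individual in place: move a random task to a random machine."""
    cost = makespan(assign, durations, m)
    for s in range(steps):
        t = t0 * 0.97 ** s  # geometric cooling (an illustrative choice)
        i = rng.randrange(len(assign))
        old = assign[i]
        assign[i] = rng.randrange(m)
        new_cost = makespan(assign, durations, m)
        if new_cost <= cost or rng.random() < math.exp((cost - new_cost) / t):
            cost = new_cost
        else:
            assign[i] = old  # reject: restore the previous machine
    return assign, cost

def gsa_schedule(durations, m, pop_size=20, generations=30, seed=1):
    """Hybrid loop: genetic operators create children, then every child is
    independently annealed, as in the generate-then-anneal scheme above."""
    rng = random.Random(seed)
    n = len(durations)
    pop = [[rng.randrange(m) for _ in range(n)] for _ in range(pop_size)]
    best, best_cost = None, float("inf")
    for _ in range(generations):
        scored = sorted(pop, key=lambda a: makespan(a, durations, m))
        parents = scored[: pop_size // 2]  # truncation selection
        children = []
        while len(children) < pop_size:
            p1, p2 = rng.sample(parents, 2)
            cut = rng.randrange(1, n)        # one-point crossover
            child = p1[:cut] + p2[cut:]
            if rng.random() < 0.2:           # mutation
                child[rng.randrange(n)] = rng.randrange(m)
            children.append(child)
        pop = []
        for child in children:               # anneal every individual
            refined, cost = sa_refine(child, durations, m, rng)
            pop.append(refined)
            if cost < best_cost:
                best, best_cost = refined[:], cost
    return best, best_cost
```

On a small instance such as durations [3, 3, 2, 2, 2] with two machines, the hybrid finds the optimal makespan of 6.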
Simulation of annealed polyelectrolytes in poor solvents
We present (semi-)grand canonical Monte Carlo simulations of annealed polyelectrolytes in poor solvent. Increasing the chemical potential of the charges, which is equal to the pH of the solution except for a trivial additive constant, in rather poor solvents, we find the first-order phase transition between a weakly charged globule and a highly charged extended chain predicted by theory. In the close-to-Θ-point regime, we investigate under which conditions pearl-necklace structures are stable. Most of the pearl-necklace parameters are found to obey the scaling relations predicted for quenched polyelectrolytes. However, similarly to the behavior known for this class of polyelectrolytes, we obtain large fluctuations in pearl number and size. In agreement with theoretical predictions, we find a non-uniform charge distribution between pearls and strings.
Tunneling through high energy barriers in simulated quantum annealing
Crosson, Elizabeth; Deng, Mingkai
2014-01-01
We analyze the performance of simulated quantum annealing (SQA) on an optimization problem for which simulated classical annealing (SA) is provably inefficient because of a high energy barrier. We present evidence that SQA can pass through this barrier to find the global minimum efficiently. This demonstrates the potential for SQA to inherit some of the advantages of quantum annealing (QA), since this problem has been previously shown to be efficiently solvable by quantum adiabatic optimization.
Simulated Annealing for Location Area Planning in Cellular networks
N. B. Prajapati
2010-03-01
LA planning in a cellular network is useful for minimizing the location management cost in a GSM network. In fact, the size of an LA can be optimized to create a balance between the LA update rate and the expected paging rate within the LA. To get an optimal result for LA planning in a cellular network, a simulated annealing algorithm is used. Simulated annealing gives optimal results in acceptable run-time.
Remote sensing of atmospheric duct parameters using simulated annealing
Zhao Xiao-Feng; Huang Si-Xun; Xiang Jie; Shi Wei-Lai
2011-01-01
Simulated annealing is one of the robust optimization schemes. Simulated annealing mimics the annealing process of the slow cooling of a heated metal to reach a stable minimum energy state. In this paper, we adopt simulated annealing to study the problem of the remote sensing of atmospheric duct parameters for two different geometries of propagation measurement. One is from a single emitter to an array of radio receivers (vertical measurements), and the other is from the radar clutter returns (horizontal measurements). Basic principles of simulated annealing and its applications to refractivity estimation are introduced. The performance of this method is validated using numerical experiments and field measurements collected at the East China Sea. The retrieved results demonstrate the feasibility of simulated annealing for near real-time atmospheric refractivity estimation. For comparison, the retrievals of the genetic algorithm are also presented. The comparisons indicate that the convergence speed of simulated annealing is faster than that of the genetic algorithm, while the anti-noise ability of the genetic algorithm is better than that of simulated annealing.
Simulated annealing with probabilistic analysis for solving traveling salesman problems
Hong, Pei-Yee; Lim, Yai-Fung; Ramli, Razamin; Khalid, Ruzelan
2013-09-01
Simulated Annealing (SA) is a widely used meta-heuristic inspired by the annealing process of recrystallization of metals. The efficiency of SA is therefore highly affected by the annealing schedule. In this paper, we present an empirical study that provides a comparable annealing schedule to solve symmetric traveling salesman problems (TSP). A randomized complete block design is also used in this study. The results show that different parameters do affect the efficiency of SA, and thus we propose the best found annealing schedule based on the Post Hoc test. SA was tested on seven selected benchmark problems of the symmetric TSP with the proposed annealing schedule. The performance of SA was evaluated empirically alongside benchmark solutions, with a simple analysis to validate the quality of the solutions. Computational results show that the proposed annealing schedule provides good-quality solutions.
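An annealing schedule of the kind compared in such studies can be sketched for the symmetric TSP with a 2-opt neighbourhood; the geometric cooling rate and the choice of initial temperature below are one plausible configuration, not the schedule this paper recommends:

```python
import math
import random

def tour_length(tour, dist):
    """Total length of a closed tour under distance matrix dist."""
    n = len(tour)
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

def sa_tsp(dist, steps=50000, alpha=0.999, seed=0):
    """SA for symmetric TSP: 2-opt moves, geometric cooling t <- alpha * t."""
    rng = random.Random(seed)
    n = len(dist)
    tour = list(range(n))
    rng.shuffle(tour)
    cost = tour_length(tour, dist)
    t = cost  # start hot, on the scale of the tour length
    best, best_cost = tour[:], cost
    for _ in range(steps):
        i, j = sorted(rng.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]  # 2-opt: reverse a segment
        cand_cost = tour_length(cand, dist)
        if cand_cost <= cost or rng.random() < math.exp((cost - cand_cost) / t):
            tour, cost = cand, cand_cost
            if cost < best_cost:
                best, best_cost = tour[:], cost
        t *= alpha  # geometric cooling
    return best, best_cost
```

On points placed in convex position (e.g. on a circle), the optimal tour is the hull order, which gives a convenient correctness check.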
Surface Structure of Hydroxyapatite from Simulated Annealing Molecular Dynamics Simulations.
Wu, Hong; Xu, Dingguo; Yang, Mingli; Zhang, Xingdong
2016-05-10
The surface structure of hydroxyapatite (HAP) is crucial for its bioactivity. Using a molecular dynamics simulated annealing method, we studied the structure of the HAP (100) surface and its variation with annealing temperature. In contrast to the commonly used HAP surface model, which is sliced from the HAP crystal and then relaxed at 0 K with first-principles or force-field calculations, a new surface structure with gradual changes from ordered inside to disordered on the surface was revealed. The disordering depends on the annealing temperature, Tmax. When Tmax increases up to the melting point, which was usually adopted in experiments, the disordering increases, as reflected by its radial distribution functions, structural factors, and atomic coordination numbers. The disordering of annealed structures does not change significantly when Tmax is above the melting point. The thickness of the disordered layers is about 10 Å. The surface energy of the structures annealed at high temperature is significantly less than that of the crystal structure relaxed at room temperature. A three-layer model of interior, middle, and surface layers was then proposed to describe the surface structure of HAP. The interior layer retains the atomic configurations of the crystal. The middle layer has its atoms moved and its groups rotated about their original locations. In the surface layer, the atomic arrangements are totally different from those in the crystal. In particular, the hydroxyl groups move outward and cover the Ca(2+) ions, leaving holes occupied by the phosphate groups. Our study suggested a new model with disordered surface structures for studying the interaction of HAP-based biomaterials with other molecules. PMID:27096760
List-Based Simulated Annealing Algorithm for Traveling Salesman Problem.
Zhan, Shi-hua; Lin, Juan; Zhang, Ze-jun; Zhong, Yi-wen
2016-01-01
The simulated annealing (SA) algorithm is a popular intelligent optimization algorithm which has been successfully applied in many fields. Parameter setting is a key factor in its performance, but it is also tedious work. To simplify parameter setting, we present a list-based simulated annealing (LBSA) algorithm to solve the traveling salesman problem (TSP). The LBSA algorithm uses a novel list-based cooling schedule to control the decrease of temperature. Specifically, a list of temperatures is created first, and then the maximum temperature in the list is used by the Metropolis acceptance criterion to decide whether to accept a candidate solution. The temperature list is adapted iteratively according to the topology of the solution space of the problem. The effectiveness and the parameter sensitivity of the list-based cooling schedule are illustrated through benchmark TSP problems. The LBSA algorithm, whose performance is robust over a wide range of parameter values, shows competitive performance compared with some other state-of-the-art algorithms. PMID:27034650
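A minimal sketch of the list-based cooling idea on a generic minimization problem follows: the Metropolis test always uses the maximum of a temperature list, and that maximum is adaptively replaced using the uphill moves that were actually accepted. The list length, iteration counts, and initial acceptance probability are illustrative assumptions rather than the paper's exact procedure:

```python
import math
import random

def lbsa_minimize(f, x0, neighbour, list_len=50, outer=300, inner=40, p0=0.5, seed=0):
    """List-based SA sketch: accept with the current maximum of a temperature
    list, then replace that maximum with a value derived from accepted uphill
    moves, so the schedule adapts itself to the landscape."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, best_f = x, fx
    # seed the list so a typical uphill move would be accepted with probability p0
    temps = []
    for _ in range(list_len):
        d = abs(f(neighbour(x, rng)) - fx)
        temps.append(d / -math.log(p0) if d > 0 else 1e-6)
    for _ in range(outer):
        t = max(temps)
        bad_sum, bad_n = 0.0, 0
        for _ in range(inner):
            y = neighbour(x, rng)
            fy = f(y)
            d = fy - fx
            if d <= 0:
                x, fx = y, fy
            else:
                r = rng.random()
                if r < math.exp(-d / t):
                    bad_sum += d / -math.log(r)  # temperature that just accepts this move
                    bad_n += 1
                    x, fx = y, fy
            if fx < best_f:
                best, best_f = x, fx
        if bad_n:
            temps.remove(t)               # adapt: replace the current maximum
            temps.append(bad_sum / bad_n)
    return best, best_f
```

Because each replacement value is smaller than the maximum it replaces, the effective temperature decreases without an explicit cooling-rate parameter.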
A laboratory flash furnace for strand annealing simulation
Page, J. H. R.
1995-08-01
The economic production of CRML steels depends on the use of continuous annealing. Successful development of improved CRML steels, the compositions of which have moved to lower carbon contents, is critically dependent on the rate of heating and its effect on transformation characteristics. As a result, accurate simulation of annealing conditions, particularly the heating rate, is essential. With this in mind, European Electrical Steels set criteria for a laboratory annealing facility that would, as far as was practicable, reproduce day-to-day continuous furnace operation. This paper outlines the design criteria, construction, and operation of the resulting annealing facility.
A NEW GENETIC SIMULATED ANNEALING ALGORITHM FOR FLOOD ROUTING MODEL
KANG Ling; WANG Cheng; JIANG Tie-bing
2004-01-01
In this paper, a new approach, Genetic Simulated Annealing (GSA), is proposed for optimizing the parameters in the Muskingum routing model. By integrating the simulated annealing method into the genetic algorithm, the hybrid method avoids some troubles of traditional methods, such as the arduous trial-and-error procedure, premature convergence in the genetic algorithm, and search blindness in simulated annealing. The principle and implementation procedure of this algorithm are described. Numerical experiments show that the GSA can adjust the optimization population, prevent premature convergence, and seek the global optimal result. Applications to the Nanyunhe River and Qingjiang River show that the proposed approach offers higher forecast accuracy and practicability.
Stochastic search in structural optimization - Genetic algorithms and simulated annealing
Hajela, Prabhat
1993-01-01
An account is given of illustrative applications of genetic algorithms and simulated annealing methods in structural optimization. The advantages of such stochastic search methods over traditional mathematical programming strategies are emphasized; it is noted that these methods offer a significantly higher probability of locating the global optimum in a multimodal design space. Both genetic-search and simulated annealing can be effectively used in problems with a mix of continuous, discrete, and integer design variables.
An improved simulated annealing algorithm for standard cell placement
Jones, Mark; Banerjee, Prithviraj
1988-01-01
Simulated annealing is a general-purpose Monte Carlo optimization technique that was applied to the problem of placing standard logic cells in a VLSI chip so that the total interconnection wire length is minimized. An improved standard cell placement algorithm that takes advantage of the performance enhancements that appear to come from parallelizing the uniprocessor simulated annealing algorithm is presented. An outline of this algorithm is given.
Nonsmooth trajectory optimization - An approach using continuous simulated annealing
Lu, Ping; Khan, M. A.
1993-01-01
An account is given of the properties of a continuous simulated annealing algorithm that can function as a global optimization tool for nonsmooth dynamic systems, as shown in the case of a trajectory-optimization program implementation. The approach is shown to successfully solve the problem of nonsmooth trajectory optimization for a high performance rigid-body aircraft. The results obtained demonstrate the superiority of the simulated annealing algorithm over widely used algorithms.
Model based matching using simulated annealing and a minimum representation size criterion
Ravichandran, B.; Sanderson, A. C.
1992-01-01
We define the model based matching problem in terms of the correspondence and transformation that relate the model and scene, and the search and evaluation measures needed to find the best correspondence and transformation. Simulated annealing is proposed as a method for search and optimization, and the minimum representation size criterion is used as the evaluation measure in an algorithm that finds the best correspondence. An algorithm based on simulated annealing is presented and evaluated. This algorithm is viewed as a part of an adaptive, hierarchical approach which provides robust results for a variety of model based matching problems.
Simulated quantum annealing of double-well and multiwell potentials.
Inack, E M; Pilati, S
2015-11-01
We analyze the performance of quantum annealing as a heuristic optimization method to find the absolute minimum of various continuous models, including landscapes with only two wells and also models with many competing minima and with disorder. The simulations performed using a projective quantum Monte Carlo (QMC) algorithm are compared with those based on the finite-temperature path-integral QMC technique and with classical annealing. We show that the projective QMC algorithm is more efficient than the finite-temperature QMC technique, and that both are inferior to classical annealing if this is performed with appropriate long-range moves. However, as the difficulty of the optimization problem increases, classical annealing loses efficiency, while the projective QMC algorithm keeps stable performance and is finally the most effective optimization tool. We discuss the implications of our results for the outstanding problem of testing the efficiency of adiabatic quantum computers using stochastic simulations performed on classical computers. PMID:26651813
Monte Carlo simulation of primary recrystallization and annealing twinning
The formation of annealing twins has been studied from the beginning of the 20th century and a variety of mechanisms have been suggested. Molecular dynamics simulations on the atomic scale have also been performed. This paper reports a microscale simulation of primary recrystallization and twinning of a nickel alloy based on the Monte Carlo approach. Different twin morphologies were simulated. A possible dependence of grain growth direction on twin formation during annealing was demonstrated. The formation of incoherent Σ3 and Σ9 boundaries is verified as the indirect outcome after coherent Σ3 formation
Quantum versus simulated annealing in wireless interference network optimization
Wang, Chi; Chen, Huo; Jonckheere, Edmond
2016-01-01
Quantum annealing (QA) serves as a specialized optimizer that is able to solve many NP-hard problems and that is believed to have a theoretical advantage over simulated annealing (SA) via quantum tunneling. With the introduction of the D-Wave programmable quantum annealer, a considerable amount of effort has been devoted to detect and quantify quantum speedup. While the debate over speedup remains inconclusive as of now, instead of attempting to show general quantum advantage, here, we focus on a novel real-world application of D-Wave in wireless networking—more specifically, the scheduling of the activation of the air-links for maximum throughput subject to interference avoidance near network nodes. In addition, D-Wave implementation is made error insensitive by a novel Hamiltonian extra penalty weight adjustment that enlarges the gap and substantially reduces the occurrence of interference violations resulting from inevitable spin bias and coupling errors. The major result of this paper is that quantum annealing benefits more than simulated annealing from this gap expansion process, both in terms of ST99 speedup and network queue occupancy. It is hoped that this could become a real-world application niche where the potential benefits of quantum annealing can be objectively assessed. PMID:27181056
On simulated annealing phase transitions in phylogeny reconstruction.
Strobl, Maximilian A R; Barker, Daniel
2016-08-01
Phylogeny reconstruction with global criteria is NP-complete or NP-hard, hence in general requires a heuristic search. We investigate the powerful, physically inspired, general-purpose heuristic simulated annealing, applied to phylogeny reconstruction. Simulated annealing mimics the physical process of annealing, where a liquid is gently cooled to form a crystal. During the search, periods of elevated specific heat occur, analogous to physical phase transitions. These simulated annealing phase transitions play a crucial role in the outcome of the search. Nevertheless, they have received comparatively little attention, whether for phylogeny or for other optimisation problems. We analyse simulated annealing phase transitions during searches for the optimal phylogenetic tree for 34 real-world multiple alignments. In the same way in which melting temperatures differ between materials, we observe distinct specific heat profiles for each input file. We propose that this reflects differences in the search landscape and can serve as a measure of problem difficulty and of the suitability of the algorithm's parameters. We discuss application in algorithmic optimisation and as a diagnostic to assess parameterisation before computationally costly, large phylogeny reconstructions are launched. Whilst the focus here lies on phylogeny reconstruction under maximum parsimony, it is plausible that our results are more widely applicable to optimisation procedures in science and industry. PMID:27150349
Simulated annealing optimization for multi-objective economic dispatch solution
Ismail ZIANE
2014-11-01
This paper presents a multi-objective Simulated Annealing Optimization to solve a Dynamic Generation Dispatch problem. In this work, the problem is formulated as a multi-objective one with two competing functions, namely economic cost and emission functions, subject to different constraints. The inequality constraints considered are the generating unit capacity limits, while the equality constraint is the generation-demand balance. To show the advantages of the proposed algorithm, it has been applied to solving multi-objective EELD problems in a 3-generator system with NOx and SO2 emissions. Results obtained with Simulated Annealing have been compared with other existing relevant approaches available in the literature. Experimental results show the proficiency of Simulated Annealing over other existing techniques in terms of robustness.
SIMULATED ANNEALING BASED POLYNOMIAL TIME QOS ROUTING ALGORITHM FOR MANETS
Liu Lianggui; Feng Guangzeng
2006-01-01
Multi-constrained Quality-of-Service (QoS) routing is a big challenge for Mobile Ad hoc Networks (MANETs), where the topology may change constantly. In this paper a novel QoS Routing Algorithm based on Simulated Annealing (SA_RA) is proposed. This algorithm first uses an energy function to translate multiple QoS weights into a single mixed metric and then seeks to find a feasible path by simulated annealing. The paper outlines the simulated annealing algorithm and analyzes the problems met when applying it to QoS Routing (QoSR) in MANETs. Theoretical analysis and experimental results demonstrate that the proposed method is an effective approximation algorithm, showing better performance than other pertinent algorithms in seeking the (approximate) optimal configuration in polynomial time.
The Simulated Annealing Algorithm Implemented by the MATLAB
Lin Lin
2012-11-01
This paper expounds the basic principle of the simulated annealing algorithm, applies it to solve a function optimization problem, and describes the realization of the algorithm in the MATLAB language. Results of the improved algorithm show that the method achieves global optimization of the function and effectively overcomes the tendency of derivative-based optimization algorithms to fall into local optima. The method not only deepens understanding of the simulated annealing process, but can also serve the purpose of designing intelligent systems.
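The basic acceptance loop this abstract describes can be sketched as follows (a minimal illustration in Python rather than MATLAB; the objective function, neighbourhood, and cooling schedule are our assumptions, not the paper's code):

```python
import math
import random

def simulated_annealing(f, x0, t0=1.0, t_min=1e-4, alpha=0.95, steps=100):
    """Minimize f starting from x0 with a geometric cooling schedule."""
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    while t > t_min:
        for _ in range(steps):
            # Propose a random neighbour of the current solution.
            y = x + random.uniform(-1.0, 1.0)
            fy = f(y)
            # Accept downhill moves always; uphill moves with the
            # Metropolis probability exp(-delta/t).
            if fy < fx or random.random() < math.exp(-(fy - fx) / t):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
        t *= alpha  # geometric cooling
    return best, fbest

random.seed(0)
# A multimodal test function with its global minimum at x = 0: a
# derivative-based local search started at x0 = 5 would get trapped.
f = lambda x: x * x + 10 * (1 - math.cos(2 * math.pi * x))
x, fx = simulated_annealing(f, x0=5.0)
```

The occasional acceptance of uphill moves at nonzero temperature is what lets the search escape the local minima that trap derivative-based methods, which is exactly the point the abstract makes.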
Coordination Hydrothermal Interconnection Java-Bali Using Simulated Annealing
Wicaksono, B.; Abdullah, A. G.; Saputra, W. S.
2016-04-01
Hydrothermal power plant coordination aims to minimize the total operating cost of the system, represented by fuel cost, subject to constraints during optimization. Several methods can be used to perform the optimization. Simulated Annealing (SA) is one method that can be used to solve such optimization problems. The method was inspired by the annealing, or cooling, process in the manufacture of materials composed of crystals. The basic principle of hydrothermal power plant coordination is the use of hydro power plants to supply the base load while thermal power plants supply the remaining load. This study used two hydro power plant units and six thermal power plant units with 25 buses, calculating transmission losses and considering power limits in each power plant unit, aided by MATLAB software. Hydrothermal power plant coordination using simulated annealing showed a total generation cost for 24 hours of 13,288,508.01.
Analysis of Trivium by a Simulated Annealing variant
Borghoff, Julia; Knudsen, Lars Ramkilde; Matusiewicz, Krystian
2010-01-01
… characteristic of equation systems that may be efficiently solvable by means of such algorithms is provided. As an example, we investigate equation systems induced by the problem of recovering the internal state of the stream cipher Trivium. We propose an improved variant of the simulated annealing method...
Function minimization with partially correct data via simulated annealing
Lorre, Jean J.
1988-01-01
The simulated annealing technique has been applied successfully to the problem of estimating the coefficients of a function in cases where only a portion of the data being fitted to the function is truly representative of the function, the rest being erroneous. Two examples are given, one in photometric function fitting and the other in pattern recognition. A schematic of the algorithm is provided.
Physical Mapping Using Simulated Annealing and Evolutionary Algorithms
Vesterstrøm, Jacob Svaneborg
2003-01-01
Physical mapping (PM) is a method of bioinformatics that assists in DNA sequencing. The goal is to determine the order of a collection of fragments taken from a DNA strand, given knowledge of certain unique DNA markers contained in the fragments. Simulated annealing (SA) is the most widely used...
A Simulated Annealing Methodology for Clusterwise Linear Regression.
DeSarbo, Wayne S.; And Others
1989-01-01
A method is presented that simultaneously estimates cluster membership and corresponding regression functions for a sample of observations or subjects. This methodology is presented with the simulated annealing-based algorithm. A set of Monte Carlo analyses is included to demonstrate the performance of the algorithm. (SLD)
Application of Simulated Annealing to Clustering Tuples in Databases.
Bell, D. A.; And Others
1990-01-01
Investigates the value of applying principles derived from simulated annealing to clustering tuples in database design, and compares this technique with a graph-collapsing clustering method. It is concluded that, while the new method does give superior results, the expense involved in algorithm run time is prohibitive. (24 references) (CLB)
Meta-Modeling by Symbolic Regression and Pareto Simulated Annealing
Stinstra, E.; Rennen, G.; Teeuwen, G.J.A.
2006-01-01
The subject of this paper is a new approach to Symbolic Regression. Other publications on Symbolic Regression use Genetic Programming. This paper describes an alternative method based on Pareto Simulated Annealing. Our method is based on linear regression for the estimation of constants. Interval arithm...
The afforestation problem: a heuristic method based on simulated annealing
Vidal, Rene Victor Valqui
1992-01-01
This paper presents the afforestation problem, that is the location and design of new forest compartments to be planted in a given area. This optimization problem is solved by a two-step heuristic method based on simulated annealing. Tests and experiences with this method are also presented....
Thermal, quantum and simulated quantum annealing: analytical comparisons for simple models
Bapst, Victor; Semerjian, Guilhem
2015-01-01
We study various annealing dynamics, both classical and quantum, for simple mean-field models and explain how to describe their behavior in the thermodynamic limit in terms of differential equations. In particular we emphasize the differences between quantum annealing (i.e. evolution with the Schrödinger equation) and simulated quantum annealing (i.e. annealing of a Quantum Monte Carlo simulation).
Particle Based Image Segmentation with Simulated Annealing
Everts, M.H.; Bekker, H.; Jalba, A.C.; Roerdink, J.B.T.M.
2007-01-01
The Charged Particle Model (CPM) is a physically motivated deformable model for shape recovery and segmentation. It simulates a system of charged particles moving in an electric field generated from the input image, whose positions in the equilibrium state are used for curve or surface reconstruction.
Molecular dynamics simulation of annealed ZnO surfaces
Min, Tjun Kit; Yoon, Tiem Leong [School of Physics, Universiti Sains Malaysia, 11800 USM, Penang (Malaysia); Lim, Thong Leng [Faculty of Engineering and Technology, Multimedia University, Melaka Campus, 75450 Melaka (Malaysia)
2015-04-24
The effect of thermally annealing a slab of wurtzite ZnO, terminated by two surfaces, (0001) (which is oxygen-terminated) and (0001̄) (which is Zn-terminated), is investigated via molecular dynamics simulation by using reactive force field (ReaxFF). We found that upon heating beyond a threshold temperature of ∼700 K, surface oxygen atoms begin to sublimate from the (0001) surface. The ratio of oxygen leaving the surface at a given temperature increases as the heating temperature increases. A range of phenomena occurring at the atomic level on the (0001) surface has also been explored, such as formation of oxygen dimers on the surface and evolution of partial charge distribution in the slab during the annealing process. It was found that the partial charge distribution as a function of the depth from the surface undergoes a qualitative change when the annealing temperature is above the threshold temperature.
Application of simulated annealing algorithm to optimizing sequencing of operation steps
Anonymous
2000-01-01
Discusses the optimization of machining operation sequencing by simulated annealing and the building of a simulated annealing optimization model, from which a new way to optimize operation sequencing can be developed.
Optimization of pipe networks including pumps by simulated annealing
Costa A.L.H.; Medeiros J.L.; Pessoa F.L.P.
2000-01-01
The objective of this work is to present an application of the simulated annealing method to the optimal design of pipe networks including pumps. Despite its importance, the optimization of pumped networks has not received much attention in the literature. The proposed search scheme explores the discrete space of the decision variables: pipe diameters and pump sizes. The behavior of the pumps is described through the characteristic curve, generating more realistic solutions. In order to demo...
Convergence of simulated annealing by the generalized transition probability
Nishimori, Hidetoshi; Inoue, Jun-Ichi
1998-01-01
We prove weak ergodicity of the inhomogeneous Markov process generated by the generalized transition probability of Tsallis and Stariolo under power-law decay of the temperature. We thus have a mathematical foundation to conjecture convergence of simulated annealing processes with the generalized transition probability to the minimum of the cost function. An explicitly solvable example in one dimension is analyzed in which the generalized transition probability leads to a fast convergence of ...
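For orientation, the generalized (Tsallis-Stariolo) transition probability referred to above is commonly written as follows (our notation, not the paper's; q is the Tsallis index, and the standard Metropolis rule is recovered in the limit q → 1):

```latex
P_q(\Delta E) =
\begin{cases}
1, & \Delta E \le 0, \\[2pt]
\left[\, 1 + (q-1)\,\dfrac{\Delta E}{T(t)} \,\right]^{-1/(q-1)}, & \Delta E > 0,
\end{cases}
\qquad \text{with power-law temperature decay } T(t) \propto t^{-\lambda}.
```

Here ΔE is the change in the cost function under a proposed move and T(t) the annealing temperature at step t; the power-law decay is the schedule under which the paper establishes weak ergodicity.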
Optimization of multiple-layer microperforated panels by simulated annealing
Ruiz Villamil, Heidi; Cobo, Pedro; Jacobsen, Finn
2011-01-01
Sound absorption by microperforated panels (MPP) has received increasing attention in the past years as an alternative to conventional porous absorbers in applications with special cleanliness and health requirements. The absorption curve of an MPP depends on four parameters: the hole diameter, the... Therefore, simulated annealing is proposed in this paper as a tool to solve the optimization problem of finding the best combination of the constitutive parameters of an ML-MPP providing the maximum average absorption within a prescribed frequency band.
Solving geometric constraints with genetic simulated annealing algorithm
刘生礼; 唐敏; 董金祥
2003-01-01
This paper applies a genetic simulated annealing algorithm (SAGA) to solving geometric constraint problems. The method makes full use of the advantages of SAGA and can handle under-/over-constrained problems naturally. It has advantages over the Newton-Raphson method (it is not sensitive to initial values), and its yielding of multiple solutions is an advantage over other optimization methods for multi-solution constraint systems. Our experiments have proved the robustness and efficiency of this method.
Simulated Annealing for the 0/1 Multidimensional Knapsack Problem
Fubin Qian; Rui Ding
2007-01-01
In this paper a simulated annealing (SA) algorithm is presented for the 0/1 multidimensional knapsack problem. Problem-specific knowledge is incorporated in the algorithm description and evaluation of parameters in order to look into the performance of finite-time implementations of SA. Computational results show that SA performs much better than a genetic algorithm in terms of solution time, whilst having a modest loss of solution quality.
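A finite-time SA implementation for this problem might look like the following sketch (illustrative only: the bit-flip neighbourhood, cooling schedule, and the small test instance are our assumptions, not the paper's parameterization):

```python
import math
import random

def sa_mkp(values, weights, capacities, t0=50.0, alpha=0.999, iters=20000):
    """Simulated annealing for the 0/1 multidimensional knapsack problem.

    values[j]     -- profit of item j
    weights[i][j] -- consumption of resource i by item j
    capacities[i] -- capacity of resource i
    """
    n = len(values)
    x = [0] * n  # start from the empty (always feasible) solution

    def feasible(sol):
        return all(sum(w[j] * sol[j] for j in range(n)) <= c
                   for w, c in zip(weights, capacities))

    def value(sol):
        return sum(values[j] * sol[j] for j in range(n))

    cur = value(x)
    best, best_val = x[:], cur
    t = t0
    for _ in range(iters):
        y = x[:]
        y[random.randrange(n)] ^= 1   # flip one item in or out
        if not feasible(y):
            continue                  # reject infeasible neighbours outright
        delta = value(y) - cur        # maximizing: uphill moves always accepted
        if delta >= 0 or random.random() < math.exp(delta / t):
            x, cur = y, cur + delta
            if cur > best_val:
                best, best_val = x[:], cur
        t = max(t * alpha, 1e-3)      # cool, with a small temperature floor
    return best, best_val

random.seed(1)
# Tiny invented instance: 4 items, 2 resource constraints.
values = [10, 40, 30, 50]
weights = [[5, 4, 6, 3],   # resource 1 usage per item
           [1, 2, 3, 2]]   # resource 2 usage per item
capacities = [10, 5]
sol, val = sa_mkp(values, weights, capacities)
```

Rejecting infeasible neighbours outright is only one way to incorporate the constraints; penalty terms in the objective are a common alternative.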
Reversible Jump MCMC Simulated Annealing for Neural Networks
Andrieu, Christophe; De Freitas, Nando; Doucet, Arnaud
2013-01-01
We propose a novel reversible jump Markov chain Monte Carlo (MCMC) simulated annealing algorithm to optimize radial basis function (RBF) networks. This algorithm enables us to maximize the joint posterior distribution of the network parameters and the number of basis functions. It performs a global search in the joint space of the parameters and number of parameters, thereby surmounting the problem of local minima. We also show that by calibrating a Bayesian model, we can obtain the classical...
Simulated annealing spectral clustering algorithm for image segmentation
Yifang Yang; Yuping Wang
2014-01-01
The similarity measure is crucial to the performance of spectral clustering. The Gaussian kernel function based on the Euclidean distance is usually adopted as the similarity measure. However, the Euclidean distance measure cannot fully reveal the complex distribution of data, and the result of spectral clustering is very sensitive to the scaling parameter. To solve these problems, a new manifold distance measure and a novel simulated annealing spectral clustering (SASC) algorithm based on the manifold distance measure are proposed. The simulated annealing based on genetic algorithm (SAGA), characterized by its rapid convergence to the global optimum, is used to cluster the sample points in the spectral mapping space. The proposed algorithm can not only reflect local and global consistency better, but also reduce the sensitivity of spectral clustering to the kernel parameter, which improves the algorithm's clustering performance. To efficiently apply the algorithm to image segmentation, the Nyström method is used to reduce the computational complexity. Experimental results show that compared with traditional clustering algorithms and popular spectral clustering algorithms, the proposed algorithm can achieve better clustering performance on several synthetic datasets, texture images and real images.
Adaptive Sampling in Hierarchical Simulation
Knap, J; Barton, N R; Hornung, R D; Arsenlis, A; Becker, R; Jefferson, D R
2007-07-09
We propose an adaptive sampling methodology for hierarchical multi-scale simulation. The method utilizes a moving kriging interpolation to significantly reduce the number of evaluations of finer-scale response functions to provide essential constitutive information to a coarser-scale simulation model. The underlying interpolation scheme is unstructured and adaptive to handle the transient nature of a simulation. To handle the dynamic construction and searching of a potentially large set of finer-scale response data, we employ a dynamic metric tree database. We study the performance of our adaptive sampling methodology for a two-level multi-scale model involving a coarse-scale finite element simulation and a finer-scale crystal plasticity based constitutive law.
Reticle Floorplanning and Simulated Wafer Dicing for Multiple-Project Wafers by Simulated Annealing
Lin, Rung-Bin; Wu, Meng-Chiou; Tsai, Shih-Cheng
2008-01-01
In this chapter we have demonstrated how simulated annealing is used to solve two NP-hard problems for MPW: simulated wafer dicing and reticle floorplanning. For simulated wafer dicing, we suggest that HVMIS-SA-Z be employed to find the wafer dicing plans, especially for low-volume production. As for reticle floorplanning, BT-VOCO and
Optimisation of electron beam characteristics by simulated annealing
With the development of technology in the field of treatment beam delivery, the possibility of tailoring radiation beams (via manipulation of the beam's phase space) is foreseeable. This investigation involved evaluating a method for determining the characteristics of pure electron beams which provide dose distributions that best approximate desired distributions. The aim is to determine which degrees of freedom are advantageous and worth pursuing in a clinical setting. A simulated annealing routine was developed to determine optimum electron beam characteristics. A set of beam elements is defined at the surface of a homogeneous water-equivalent phantom, defining discrete positions and angles of incidence, and electron energies. The optimal weighting of these elements is determined by the (generally approximate) solution to the linear equation Dw = d, where d represents the dose distribution calculated over the phantom, w the vector of (50–2×10⁴) beam element relative weights, and D a normalised matrix of dose deposition kernels. In the iterative annealing procedure, beam elements are randomly selected and beam weighting distributions are sampled and used to perturb the selected elements. Perturbations are accepted or rejected according to standard simulated annealing criteria. The result (after the algorithm has terminated on meeting an iteration or optimisation specification) is an approximate solution for the beam weight vector (w) specified by the above equation. This technique has been applied to several sample dose distributions and phase space restrictions. An example is given of the phase space obtained when endeavouring to conform to a rectangular 100% dose region with polyenergetic though normally incident electrons. For regular distributions, intuitive conclusions regarding the benefits of energy/angular manipulation may be made, whereas for complex distributions, variations in intensity over beam elements of varying energy and
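The iterative procedure described above (randomly select a beam element, perturb its weight, accept or reject by the standard SA criterion, so that Dw approximately matches d) can be sketched as follows; the toy kernel matrix D, desired dose d, and Gaussian perturbation distribution are invented for illustration and are not the paper's data:

```python
import math
import random

def anneal_beam_weights(D, d, iters=30000, t0=1.0, alpha=0.9997):
    """Find nonnegative weights w approximately solving D w = d by SA."""
    m, n = len(D), len(D[0])
    w = [1.0 / n] * n  # uniform initial beam element weights

    def residual(w):
        # Squared error between delivered dose D w and desired dose d.
        return sum((sum(D[i][j] * w[j] for j in range(n)) - d[i]) ** 2
                   for i in range(m))

    cost = residual(w)
    t = t0
    for _ in range(iters):
        j = random.randrange(n)               # randomly select a beam element
        old = w[j]
        w[j] = max(0.0, old + random.gauss(0.0, 0.05))  # perturb its weight
        new_cost = residual(w)
        # Standard simulated annealing acceptance criterion.
        if new_cost < cost or random.random() < math.exp((cost - new_cost) / t):
            cost = new_cost
        else:
            w[j] = old                        # reject: restore the old weight
        t *= alpha
    return w, cost

random.seed(2)
# Toy 3-voxel phantom with 2 beam elements (dose kernels are made up).
D = [[1.0, 0.2],
     [0.5, 0.5],
     [0.2, 1.0]]
d = [0.8, 0.6, 0.8]   # desired dose per voxel
w, cost = anneal_beam_weights(D, d)
```

Because this toy objective is a convex least-squares residual, the annealed weights settle close to the least-squares optimum; the clinical problem adds the constraints and scale that motivate annealing over direct solvers.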
Morgan, John A
2016-01-01
The method of simulated annealing is adapted to the temperature-emissivity separation (TES) problem. A patch of surface at the bottom of the atmosphere is assumed to be a greybody emitter with spectral emissivity ε(k) describable by a mixture of spectral endmembers. We prove that a simulated annealing search conducted according to a suitable schedule converges to a solution maximizing the a posteriori probability that spectral radiance detected at the top of the atmosphere originates from a patch with stipulated T and ε(k). Any such solution will be nonunique. The average of a large number of simulated annealing solutions, however, converges almost surely to a unique maximum a posteriori solution for T and ε(k). The limitation to a stipulated set of endmember emissivities may be relaxed by allowing the number of endmembers to grow without bound, and to be generic continuous functions of wavenumber with bounded first derivatives with respect to wavenumber.
Job Shop Scheduling Using Modified Simulated Annealing Algorithm
PV Senthiil
2014-09-01
Time and cost factors are increasingly important in today's global competitive market. The key problems faced by today's industries are the feasible allocation of various jobs to available resources, i.e., machines (scheduling), and the optimal utilization of those resources. Among the various problems in scheduling, job shop scheduling is the most complicated and requires a large computational effort to solve. A typical job shop scheduling problem has a set of jobs to be processed on a set of machines, with certain constraints and an objective function to be achieved. The most commonly considered objectives are the minimization of makespan, the minimization of tardiness (which leads to minimization of penalty cost), and the maximization of machine utilization. Machine shop scheduling can be done using various techniques such as standard dispatching rules and heuristic techniques like simulated annealing, tabu search and genetic algorithms. Here a typical job shop scheduling problem is solved using the simulated annealing (SA) technique, a heuristic search algorithm. SA is a generic neighbourhood search algorithm used to locate solutions very near the global optimum. A software program was developed on the VB platform for a typical job shop problem and test instances were run on it. Experimental results were further tuned by varying parameters and optimal results were obtained.
Simulated Annealing-Based Krill Herd Algorithm for Global Optimization
Gai-Ge Wang
2013-01-01
Recently, Gandomi and Alavi proposed a novel swarm intelligence method, called krill herd (KH), for global optimization. To enhance the performance of the KH method, in this paper a new improved meta-heuristic simulated annealing-based krill herd (SKH) method is proposed for optimization tasks. A new krill selecting (KS) operator is used to refine krill behavior when updating each krill's position, so as to enhance its reliability and robustness in dealing with optimization problems. The introduced KS operator involves a greedy strategy and accepts a few not-so-good solutions with a low probability, as originally used in simulated annealing (SA). In addition, a kind of elitism scheme is used to save the best individuals in the population during the krill updating process. The merits of these improvements are verified on fourteen standard benchmark functions, and experimental results show that, in most cases, the performance of this improved meta-heuristic SKH method is superior to, or at least highly competitive with, the standard KH and other optimization methods.
Simulated annealing technique to design minimum cost exchanger
Khalfe Nadeem M.
2011-01-01
Owing to the wide use of heat exchangers in industrial processes, their cost minimization is an important target for both designers and users. Traditional design approaches are based on iterative procedures which gradually change the design and geometric parameters to satisfy a given heat duty and constraints. Although well proven, this kind of approach is time consuming and may not lead to a cost-effective design, as no cost criteria are explicitly accounted for. The present study explores the use of a nontraditional optimization technique, called simulated annealing (SA), for the design optimization of shell-and-tube heat exchangers from an economic point of view. The optimization procedure involves the selection of the major geometric parameters such as tube diameter, tube length, baffle spacing, number of tube passes, tube layout, type of head and baffle cut, with minimization of total annual cost as the design target. The presented simulated annealing technique is simple in concept, has few parameters and is easy to implement. Furthermore, the SA algorithm explores good quality solutions quickly, giving the designer more degrees of freedom in the final choice with respect to traditional methods. The methodology takes into account the geometric and operational constraints typically recommended by design codes. Three case studies are presented to demonstrate the effectiveness and accuracy of the proposed algorithm. The SA approach is able to reduce the total cost of the heat exchanger compared to the cost obtained by the previously reported GA approach.
Time series forecasting using a TSK fuzzy system tuned with simulated annealing
Almaraashi, Majid; John, Robert; Coupland, Simon; Hopgood, Adrian
2010-01-01
In this paper, a combination of a Takagi-Sugeno fuzzy system (TSK) and simulated annealing is used to predict well-known time series by searching for the best configuration of the fuzzy system. Simulated annealing is used to optimise the parameters of the antecedent and consequent parts of the fuzzy system rules. The results of the proposed method are encouraging, indicating that simulated annealing and fuzzy logic combine well in time series prediction.
Optimal placement of excitations and sensors by simulated annealing
Salama, Moktar; Bruno, R.; Chen, G.-S.; Garba, J.
1989-01-01
The optimal placement of discrete actuators and sensors is posed as a combinatorial optimization problem. Two examples for truss structures were used for illustration; the first dealt with the optimal placement of passive dampers along existing truss members, and the second dealt with the optimal placement of a combination of a set of actuators and a set of sensors. Except for the simplest problems, an exact solution by enumeration involves a very large number of function evaluations, and is therefore computationally intractable. By contrast, the simulated annealing heuristic involves far fewer evaluations and is best suited for the class of problems considered. As an optimization tool, the effectiveness of the algorithm is enhanced by introducing a number of rules that incorporate knowledge about the physical behavior of the problem. Some of the suggested rules are necessarily problem dependent.
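A simulated annealing search over discrete placements of the kind described might be sketched as follows (illustrative: the swap-move neighbourhood and the toy `score` objective are our assumptions; the paper's problem-specific rules about physical behavior are not reproduced):

```python
import math
import random

def place_sensors(score, candidates, k, iters=5000, t0=5.0, alpha=0.997):
    """Choose k of the candidate locations maximizing score(subset) by SA."""
    chosen = set(random.sample(candidates, k))
    rest = set(candidates) - chosen
    cur = score(chosen)
    best, best_val = set(chosen), cur
    t = t0
    for _ in range(iters):
        # Move: swap one chosen location with one unchosen location,
        # so the number of placed sensors stays fixed at k.
        a = random.choice(tuple(chosen))
        b = random.choice(tuple(rest))
        trial = (chosen - {a}) | {b}
        val = score(trial)
        if val > cur or random.random() < math.exp((val - cur) / t):
            chosen, rest = trial, (rest - {b}) | {a}
            cur = val
            if cur > best_val:
                best, best_val = set(chosen), cur
        t *= alpha
    return best, best_val

random.seed(3)
# Invented objective: total location weight, penalized when two chosen
# locations are adjacent (a crude stand-in for redundant placement).
weight = [3, 1, 4, 1, 5, 9, 2, 6, 5, 3]
def score(s):
    return sum(weight[i] for i in s) - 2 * sum(1 for i in s if i + 1 in s)

best, val = place_sensors(score, list(range(10)), k=3)
```

Far fewer evaluations are needed than the C(10, 3) enumeration here, and the gap widens rapidly for realistic numbers of candidate truss members.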
Simulated annealing and joint manufacturing batch-sizing
Sarker Ruhul
2003-01-01
We address an important problem of a manufacturing system. The system procures raw materials from outside suppliers in a lot and processes them to produce finished goods. We propose an ordering policy for raw materials to meet the requirements of a production facility which, in turn, has to deliver finished products demanded by external buyers at fixed time intervals. First, a general cost model is developed considering both raw materials and finished products. Then this model is used to develop a simulated annealing approach to determining an optimal ordering policy for the procurement of raw materials, and the manufacturing batch size, that minimize the total cost of meeting customer demands in time. The solutions obtained were compared with those of traditional approaches. Numerical examples are presented.
Simulated Annealing Algorithm and Its Application in Irregular Polygons Packing
段国林; 王彩红; 张健楠
2003-01-01
The two-dimensional irregular polygon packing problem is very difficult to solve with traditional optimization methods. Simulated annealing (SA) is a stochastic optimization technique that can be used to solve packing problems. The whole process of SA is introduced first in this paper, and an extended neighborhood searching method in SA is analyzed in detail. A general SA module is given and used to lay out the irregular polygons. The judgment of intersection and other constraints on the irregular polygons are analyzed. Then an example from the paper of Stefan Jakobs is presented. Results show that this SA algorithm shortens the computation time and improves the solution.
Memoryless cooperative graph search based on the simulated annealing algorithm
Hou Jian; Yan Gang-Feng; Fan Zhen
2011-01-01
We have studied the problem of reaching a globally optimal segment in a graph-like environment with a single autonomous mobile agent or a group of such agents. Firstly, two efficient simulated-annealing-like algorithms are given for a single agent to solve the problem in a partially known environment and an unknown environment, respectively. We show that under both proposed control strategies the agent eventually converges to a globally optimal segment with probability 1. Secondly, we use multi-agent searching to simultaneously reduce the computational complexity and accelerate convergence, based on the algorithms given for a single agent. By exploiting graph partitioning, a gossip-consensus-based scheme is presented to update the key parameter, the radius of the graph, ensuring that the agents spend much less time finding a globally optimal segment.
Solving the dynamic berth allocation problem by simulated annealing
Lin, Shih-Wei; Ting, Ching-Jung
2014-03-01
Berth allocation, the first operation when vessels arrive at a port, is one of the major container port optimization problems. From both the port operator's and the ocean carriers' perspectives, minimizing the time a ship spends at berth is a key goal of port operations. This article focuses on two versions of the dynamic berth allocation problem (DBAP): the discrete and continuous cases. The first case assigns ships to a given set of berth positions; the second permits them to be moored anywhere along the berth. Simulated annealing (SA) approaches are proposed to solve the DBAP. The proposed SAs are tested on numerical instances of different sizes from the literature. Experimental results show that the proposed SA obtains the optimal solutions in all instances of the discrete case and updates 27 best known solutions in the continuous case.
Restoration of polarimetric SAR images using simulated annealing
Schou, Jesper; Skriver, Henning
2001-01-01
Filtering synthetic aperture radar (SAR) images ideally results in better estimates of the parameters characterizing the distributed targets in the images while preserving the structures of the nondistributed targets. However, these objectives are normally conflicting, often leading to a filtering approach favoring one of the objectives. An algorithm for estimating the radar cross-section (RCS) for intensity SAR images has previously been proposed in the literature, based on Markov random fields and the stochastic optimization method simulated annealing. A new version of the algorithm is presented, applicable to multilook polarimetric SAR images, resulting in an estimate of the mean covariance matrix rather than the RCS. Small windows are applied in the filtering, and due to the iterative nature of the approach, reasonable estimates of the polarimetric quantities characterizing the distributed targets…
Simulated Annealing Clustering for Optimum GPS Satellite Selection
M. Ranjbar
2012-05-01
This paper utilizes a clustering approach based on the simulated annealing (SA) method to select optimum satellite subsets from the visible satellites. Geometric dilution of precision (GDOP) is used as the criterion of optimality: the lower the GDOP value, the better the geometric strength, and vice versa. A key advantage of this method is that it does not need to calculate the inverse matrix, a time-consuming process, so a great reduction in computational cost is achieved. SA is a powerful technique for obtaining a close approximation to the global optimum of a given problem. The performance of the proposed method is evaluated using validation measures: the external measures entropy and purity quantify the extent to which cluster labels agree with the externally given class labels. The overall purity and entropy are 0.9015 and 0.3993, respectively, which is an excellent result.
Graeme J. Doole
2007-01-01
This short paper provides a simple introduction to how a simulation model implemented in Microsoft Excel® can be optimised using Visual Basic for Applications (VBA) programming and the compressed simulated annealing algorithm (Ohlmann et al., 2004; Ohlmann and Thomas, 2007). The standard simulated annealing procedure is obtained as a special case. Practical advice is also given for determining the parameters that guide the stochastic search process in an annealing algorithm.
Simulation of dopant diffusion and activation during flash lamp annealing
Zechner, Christoph [Synopsys Switzerland LLC, Affolternstrasse 52, CH-8050 Zurich (Switzerland); Matveev, Dmitri [Synopsys Switzerland LLC, Affolternstrasse 52, CH-8050 Zurich (Switzerland)], E-mail: matveev@synopsys.com; Zographos, Nikolas [Synopsys Switzerland LLC, Affolternstrasse 52, CH-8050 Zurich (Switzerland); Lerch, Wilfried; Paul, Silke [Mattson Thermal Products GmbH, Daimlerstrasse 10, D-89160 Dornstadt (Germany)
2008-12-05
A set of advanced models implemented in the simulator Sentaurus Process was applied to simulate ultra-shallow junction formation by flash lamp annealing (FLA). The full path transient enhanced diffusion model includes equations for small interstitial clusters (I2, I3, I4), {311} defects, and dislocation loops. A dopant-point-defect clustering model is used for dopant activation simulation. Several cluster types are considered: B2, B2I, B2I2, B3I, B3I2, B3I3 for boron, and As2, As2V, As3, As3V, As4, As4V for arsenic. Different charge states of point defects and dopant-point-defect pairs are taken into account to obtain accurate results in the high-doping-level region. The flux expressions in the three-phase segregation model include a dependence on the doping level and the point defect supersaturation. The FLA process was performed at various peak temperatures in a Mattson Millios fRTP system. The measured wafer temperature as a function of time allowed us to simulate the transient processes with high accuracy. Good agreement between secondary ion mass spectrometry (SIMS) and simulated profiles was achieved, and the dependence of sheet resistance on the FLA peak temperature was reproduced successfully.
A Knowledge-Based Simulated Annealing Algorithm to Multiple Satellites Mission Planning Problems
Da-Wei Jin; Li-Ning Xing
2013-01-01
The multiple-satellite mission planning problem is a complex combinatorial optimization problem. A knowledge-based simulated annealing algorithm is proposed for it. The experimental results suggest that the proposed algorithm is effective for the given problem, and the knowledge-based simulated annealing method provides a useful reference for improving existing optimization approaches.
A Simulated Annealing Algorithm for the Solution of a Scheduling Problem
Crescenzio Gallo
2004-01-01
An algorithm based on simulated annealing for solving scheduling problems is presented. The issues related to scheduling are discussed, together with the simulated annealing approximation method, its main parameters (freezing, temperature, cooling, number of neighbourhoods to explore), and the choices made in defining them to produce a good algorithm that solves the scheduling problem efficiently.
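The parameters listed in the abstract (freezing, temperature, cooling, neighbourhood exploration) map directly onto the control knobs of a standard simulated annealing loop. The sketch below is a generic illustration, not the author's algorithm; the job durations, swap neighbourhood, and all constants are illustrative assumptions:

```python
import math
import random

def anneal(cost, neighbor, s0, t0=100.0, cooling=0.9,
           moves_per_temp=50, freeze_t=0.01):
    """Generic SA loop exposing the four control parameters named above:
    initial temperature t0, geometric cooling factor, number of
    neighbourhood moves explored per temperature, and the freezing
    threshold that stops the search."""
    s, c = s0, cost(s0)
    best, best_c = s, c
    t = t0
    while t > freeze_t:                      # "freezing" stopping rule
        for _ in range(moves_per_temp):      # neighbourhood exploration
            s2 = neighbor(s)
            c2 = cost(s2)
            # accept improvements always, uphill moves with prob exp(-d/t)
            if c2 < c or random.random() < math.exp(-(c2 - c) / t):
                s, c = s2, c2
                if c < best_c:
                    best, best_c = s, c
        t *= cooling                         # geometric cooling
    return best, best_c

# Toy scheduling instance: order jobs to minimize total completion time.
random.seed(2)
durations = [5, 3, 8, 2, 7, 1]

def total_completion(order):
    clock, total = 0, 0
    for j in order:
        clock += durations[j]
        total += clock
    return total

def swap_two(order):
    i, j = random.sample(range(len(order)), 2)
    s = order[:]
    s[i], s[j] = s[j], s[i]
    return s

best_order, best_cost = anneal(total_completion, swap_two, list(range(6)))
```

For total completion time the optimum is the shortest-processing-time order, which the annealer should recover on an instance this small.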
Pelletier, Mariane
1998-01-01
We study convergence rates of $\mathbb{R}$-valued algorithms, especially in the case of multiple targets and simulated annealing. We make precise, for example, the convergence rate of simulated annealing algorithms, whose weak convergence to a distribution concentrated on the potential's minima had been established by Gelfand and Mitter and by Hwang and Sheu.
稻垣, 陽介; イナガキ, ヨウスケ; Yousuke, Inagaki
2007-01-01
The efficiency of a Monte Carlo simulated annealing algorithm based on the generalized statistics of Tsallis (GSA) is compared with conventional simulated annealing (CSA) based on Boltzmann-Gibbs statistics. Application to the discrete-time optimal growth problem demonstrates that replacing CSA with GSA has the potential to speed up optimization with no loss of accuracy in finding the optimal policy function.
A Simulated Annealing Algorithm for Training Empirical Potential Functions of Protein Folding
WANG Yu-hong; LI Wei
2005-01-01
This paper reports the local-minimum problem encountered when the current greedy algorithm is used to train the empirical potential function of protein folding on 8623 non-native structures of 31 globular proteins, and a solution to the problem based on the simulated annealing algorithm. This simulated annealing algorithm is indispensable for developing and testing highly refined empirical potential functions.
Time Simulation of Bone Adaptation
Bagge, Mette
1998-01-01
The structural adaptation of a three-dimensional finite element model of the proximal femur is considered. Presuming the bone possesses the optimal structure under the given loads, the bone material distribution is found by minimizing the strain energy averaged over ten load cases with a volume constraint. The optimized design is used as a start configuration for the remodeling simulation. The parameter characterizing the equilibrium level where no remodeling occurs is estimated from the optimization parameters. The loads vary in magnitude, location and direction, simulating time-dependent loading. The…
Application of Simulated Annealing and Related Algorithms to TWTA Design
Radke, Eric M.
2004-01-01
Simulated annealing (SA) is a stochastic optimization algorithm used to search for global minima in complex design surfaces where exhaustive searches are not computationally feasible. The algorithm is derived by simulating the annealing process, whereby a solid is heated to a liquid state and then cooled slowly to reach thermodynamic equilibrium at each temperature. The idea is that atoms in the solid continually bond and re-bond at various quantum energy levels, and with sufficient cooling time they will rearrange at the minimum energy state to form a perfect crystal. The distribution of energy levels is given by the Boltzmann distribution: as temperature drops, the probability of the presence of high-energy bonds decreases. In searching for an optimal design, local minima and discontinuities are often present in a design surface. SA presents a distinct advantage over other optimization algorithms in its ability to escape from these local minima. Just as high-energy atomic configurations are visited in the actual annealing process in order to eventually reach the minimum energy state, in SA highly non-optimal configurations are visited in order to find otherwise inaccessible global minima. The SA algorithm produces a Markov chain of points in the design space at each temperature, with a monotonically decreasing temperature. A random starting point is chosen, and the objective function is evaluated at that point. A stochastic perturbation is then made to the parameters of the point to arrive at a proposed new point in the design space, at which the objective function is evaluated as well. If the change in objective function values ΔE is negative, the proposed new point is accepted. If ΔE is positive, the proposed new point is accepted according to the Metropolis criterion: ρ(ΔE) = exp(−ΔE/T), where T is the temperature for the current Markov chain. The process then repeats for the remainder of the Markov chain, after which the temperature is
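The Markov-chain and Metropolis logic described above can be sketched as a short Python routine. The quadratic test function, step size, and cooling constants below are illustrative assumptions, not values from the article:

```python
import math
import random

def simulated_annealing(f, x0, t0=10.0, alpha=0.95, chain_len=100, t_min=1e-3):
    """Minimize f, running one Markov chain of chain_len proposals at each
    temperature of a monotonically decreasing geometric schedule."""
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    while t > t_min:
        for _ in range(chain_len):              # Markov chain at fixed T
            y = x + random.uniform(-1.0, 1.0)   # stochastic perturbation
            fy = f(y)
            delta = fy - fx
            # Metropolis criterion: accept improvements outright, accept
            # uphill moves with probability exp(-delta / t).
            if delta < 0 or random.random() < math.exp(-delta / t):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
        t *= alpha                              # cool before the next chain
    return best, fbest

random.seed(0)
xmin, fmin = simulated_annealing(lambda x: (x - 3.0) ** 2, x0=-5.0)
```

On this one-dimensional surface the accepted uphill moves early in the schedule are what let the chain wander away from the start point toward the global minimum at x = 3.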
Evolutionary algorithms, simulated annealing, and Tabu search: a comparative study
Youssef, Habib; Sait, Sadiq M.; Adiche, Hakim
1998-10-01
Evolutionary algorithms, simulated annealing (SA), and tabu search (TS) are general iterative algorithms for combinatorial optimization. The term evolutionary algorithm refers to any probabilistic algorithm whose design is inspired by evolutionary mechanisms found in biological species; the most widely known algorithms of this category are genetic algorithms (GA). GA, SA, and TS have been found to be very effective and robust in solving numerous problems from a wide range of application domains. Furthermore, they are even suitable for ill-posed problems where some of the parameters are not known beforehand. These properties are lacking in all traditional optimization techniques. In this paper we perform a comparative study among GA, SA, and TS. These algorithms have many similarities, but they also possess distinctive features, mainly in their strategies for searching the solution state space. The three heuristics are applied to the same optimization problem and compared with respect to (1) the quality of the best solution identified by each heuristic, (2) the progress of the search from the initial solution(s) until the stopping criteria are met, (3) the progress of the cost of the best solution as a function of time, and (4) the number of solutions found at successive intervals of the cost function. The benchmark problem is the floorplanning of very large scale integrated circuits, a hard multi-criteria optimization problem. Fuzzy logic is used to combine all objective criteria into a single fuzzy evaluation function, which is then used to rate competing solutions.
Simulated Annealing Technique for Routing in a Rectangular Mesh Network
Noraziah Adzhar
2014-01-01
In the process of automatic design for printed circuit boards (PCBs), the phase following cell placement is routing. Routing, however, is a notoriously difficult problem; even the simplest routing problem, which consists of a set of two-pin nets, is known to be NP-complete. In this research, the routing region is first tessellated into a uniform Nx×Ny array of square cells. The ultimate goal of a routing problem is to achieve complete automatic routing with minimal need for manual intervention, so the shortest path for all connections needs to be established. While the classical Dijkstra algorithm is guaranteed to find the shortest path for a single net, each routed net forms an obstacle for later paths. This adds complexity to routing later nets and makes their routes longer than optimal, or sometimes impossible to complete. Today's sequential routing often applies heuristic methods to further refine the solution: all nets are rerouted in a different order to improve the quality of the routing. This motivates us to apply simulated annealing, one of the metaheuristic methods, to our routing model to produce better candidate routing sequences.
Optimization of Electric Power Distribution Using Hybrid Simulated Annealing Approach
Walid Ahmed
2008-01-01
The key goal of electric power distribution companies is to provide a high quality of service at a low cost of operation. Growing customer needs require a redistribution of power over the various nodes of the distributed generation (DG) facilities. The redistribution might overload various parts of the network, which, if not correctly optimized, might increase the cost of maintenance and affect the overall network reliability. It is therefore urgently required to find a methodology that can effectively provide a scheme for redistributing the power and achieve the contractual objectives of both the customers and the power companies. In this paper, we explore our proposed idea of using a simulated annealing based local search technique to provide an efficient power load distribution for a distributed generation network. We apply our approach to the well-known IEEE14 and IEEE30 power systems as two test cases. The results show the significance of the proposed approach.
Traveling Salesman Approach for Solving Petrol Distribution Using Simulated Annealing
Zuhaimy Ismail
2008-01-01
This research presents an attempt to solve a logistics company's problem of delivering petrol to petrol stations in the state of Johor. The delivery system is formulated as a travelling salesman problem (TSP): finding an optimal route for visiting stations and returning to the point of origin, where the inter-station distances are symmetric and known. This real-world application is a deceptively simple combinatorial problem, defined simply as the time spent or distance travelled by a salesman visiting n cities (or nodes) cyclically; in one tour the vehicle visits each station just once and finishes where it started. This research presents the development of a solution engine based on a local search method known as the greedy method; with the greedy result as the initial solution, simulated annealing (SA) and tabu search (TS) are then used to improve the search and provide the best solution. A user-friendly optimization program developed using Microsoft C++ solves the TSP and provides solutions to future TSPs, which may be classified as daily or advanced management and engineering problems.
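The paper's engine was written in C++, but the greedy-construction-plus-annealing idea it describes can be sketched compactly in Python. The random station coordinates, cooling constants, and the segment-reversal (2-opt) neighbourhood below are illustrative assumptions, not the authors' exact operators:

```python
import math
import random

def tour_length(tour, pts):
    """Length of the cyclic tour through the given points."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def greedy_tour(pts):
    """Nearest-neighbour construction: the initial solution."""
    unvisited = set(range(1, len(pts)))
    tour = [0]
    while unvisited:
        last = tour[-1]
        nxt = min(unvisited, key=lambda j: math.dist(pts[last], pts[j]))
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour

def anneal_tour(pts, tour, t0=1.0, alpha=0.999, steps=20000):
    """Improve the greedy tour with 2-opt moves under a Metropolis rule."""
    cur, cur_len = tour[:], tour_length(tour, pts)
    best, best_len = cur, cur_len
    t = t0
    for _ in range(steps):
        i, j = sorted(random.sample(range(len(cur)), 2))
        cand = cur[:i] + cur[i:j + 1][::-1] + cur[j + 1:]  # reverse a segment
        cand_len = tour_length(cand, pts)
        if (cand_len < cur_len
                or random.random() < math.exp(-(cand_len - cur_len) / t)):
            cur, cur_len = cand, cand_len
            if cur_len < best_len:
                best, best_len = cur, cur_len
        t *= alpha
    return best, best_len

random.seed(1)
stations = [(random.random(), random.random()) for _ in range(20)]
init = greedy_tour(stations)
best, best_len = anneal_tour(stations, init)
```

Because the best tour found is tracked separately from the current one, the annealed result is never worse than the greedy starting tour.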
Efficient Hand off using Fuzzy and Simulated Annealing
Vikas.M.N
2012-02-01
This paper presents an efficient method for the handoff mechanism in cellular networks using optimization algorithms. The proposed approach integrates a fuzzy logic approach with a simulated annealing algorithm to automate the tuning process. The fuzzy controller carries out inference operations at high speed, whereas the tuning procedure works at a much lower rate. For the implementation described in this paper, a two-input, one-output fuzzy controller is considered. Both the inputs and the output have 8-bit resolution, and up to seven membership functions can be defined for each input or output over the universe of discourse. The fuzzy controller has two levels of pipelining, which allows overlapping of the arithmetic and inference operations. The SA tuning mechanism adjusts the triangular or singleton membership functions to minimize a cost function. The complete self-tuned fuzzy inference engine is implemented in a Xilinx SPARTAN3 XC3S200 series FPGA device. This paper describes various aspects of the implementation of the self-tuned handoff system.
Static Security Enhancement and Loss Minimization Using Simulated Annealing
A.Y. Abdelaziz
2013-03-01
This paper presents a developed algorithm for optimal placement of thyristor controlled series capacitors (TCSCs) to enhance power system static security and minimize the overall system power loss. Placing TCSCs at selected branches requires analysis of the system behavior under all possible contingencies. A selective procedure to determine the locations and settings of the TCSCs is presented. The locations are determined by evaluating a contingency sensitivity index (CSI) for a given power system branch over a given number of contingencies. This criterion is then used to develop a branch prioritizing index in order to rank the system branches suitable for placement of TCSCs. Optimal settings of the TCSCs are determined by the optimization technique of simulated annealing (SA), where settings are chosen to minimize the overall power system losses. The goal of the developed methodology is to enhance power system static security by alleviating or eliminating overloads on the transmission lines and maintaining the voltages at all load buses within their specified limits through the optimal placement and setting of TCSCs under single and double line outage network contingencies. The proposed algorithm is examined on different IEEE standard test systems to show its superiority in enhancing system static security and minimizing system losses.
I Gede Agus Widyadana
2002-01-01
This research compares a genetic algorithm and simulated annealing in terms of performance and processing time. The main purpose is to evaluate how well each algorithm minimizes makespan and total flowtime in a particular flowshop system. The algorithms' performance is measured by simulating problems with varying combinations of jobs and machines. The results show that simulated annealing outperforms the genetic algorithm by up to 90%. The genetic algorithm scores only on processing time, but the observed trend suggests that for problems with many jobs and many machines, simulated annealing will run faster than the genetic algorithm.
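The two criteria being minimized, makespan and total flowtime, can be computed for a permutation flow shop with a short recurrence. The two-machine, three-job processing times below are an illustrative example, not data from the study:

```python
def flowshop_objectives(perm, proc):
    """proc[m][j] = processing time of job j on machine m.
    Jobs visit machines 0..M-1 in order; returns (makespan, total flowtime)
    for the job sequence perm."""
    m = len(proc)
    finish = [0] * m              # completion time of the last job on each machine
    flowtime = 0
    for j in perm:
        for k in range(m):
            # A job starts on machine k once the machine is free AND the
            # job has finished on machine k-1.
            start = finish[k] if k == 0 else max(finish[k], finish[k - 1])
            finish[k] = start + proc[k][j]
        flowtime += finish[-1]    # this job's completion on the last machine
    return finish[-1], flowtime

proc = [[3, 1, 2],   # machine 0
        [2, 4, 1]]   # machine 1
```

Even on this tiny instance the sequence matters: ordering the jobs as [1, 0, 2] gives a makespan of 8 versus 10 for the natural order [0, 1, 2], which is exactly the kind of gap the metaheuristics exploit.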
余农; 吴昊; 吴常泳; 李范鸣; 吴立德
2003-01-01
A practical neural network model for morphological filtering and a simulated annealing algorithm for optimizing the network parameters are proposed in this paper. It is pointed out that optimally designing the morphological filtering network is in fact an optimal learning process of adjusting the network parameters (the structuring element, or SE for short) to accommodate the image environment. The network structure may then possess the characteristics of the image targets and so give specific information to the SE. Morphological filters formed in this way become intelligent and provide good filtering results and robust adaptability to complex, changing images. For application to moving-image target detection, a dynamic training algorithm is applied to the design process, using an asymptotically shrinking error and appropriate adjustment of the network weights. Experimental results show that the algorithm is invariant with respect to shift, scale and rotation of the moving target in continuous detection of moving targets.
Jonkers, PAE
2001-01-01
A single-electron simulation model is presented to describe the effect of annealing on current-perpendicular-to-plane giant magnetoresistance (CPP-GMR) systems. Progressive annealing is represented by a progressively increasing number of impurities occurring at the interfaces of adjacent layers constit
Sensitivity study on hydraulic well testing inversion using simulated annealing
For environmental remediation, management of nuclear waste disposal, or geothermal reservoir engineering, it is very important to evaluate the permeabilities, spacing, and sizes of the subsurface fractures which control ground water flow. Cluster variable aperture (CVA) simulated annealing has been used as an inversion technique to construct fluid flow models of fractured formations based on transient pressure data from hydraulic tests. A two-dimensional fracture network system is represented as a filled regular lattice of fracture elements. The algorithm iteratively changes the apertures of clusters of fracture elements, chosen randomly from a list of discrete apertures, to improve the match to observed pressure transients. The size of the clusters is held constant throughout the iterations. Sensitivity studies using simple fracture models with eight wells show that, in general, it is necessary to conduct interference tests using at least three different wells as the pumping well in order to reconstruct the fracture network with a transmissivity contrast of one order of magnitude, particularly when the cluster size is not known a priori. Because hydraulic inversion is inherently non-unique, it is important to utilize additional information. The authors investigated the relationship between the scale of heterogeneity and the optimum cluster size (and its shape) to enhance the reliability and convergence of the inversion. It appears that the optimal cluster size corresponds to about 20--40% of the practical range of the spatial correlation. Inversion results from the Raymond test site data are also presented, and the practical range of spatial correlation is evaluated to be about 5--10 m from the optimal cluster size in the inversion.
A Branch and Bound and Simulated Annealing Approach for Job Shop Scheduling
Tan Hui Woon; Sutinah Salim
2004-01-01
This paper presents two approaches to the solution of the job shop scheduling problem, namely branch and bound and simulated annealing. The objective is to schedule the jobs on the machines so that the total completion time is minimized. In the branch and bound approach, the job shop scheduling problem is represented by a disjunctive graph and the optimal schedule is obtained using the branch and bound algorithm, while simulated annealing is a local-search-based algorithm which...
Gregorius Satia Budhi
2003-01-01
A flexible manufacturing system (FMS) is a manufacturing system formed from several numerically controlled (NC) machines combined with a material handling system, so that different jobs can be processed by different machine sequences. FMS combines the high productivity and flexibility of the transfer line and job shop manufacturing systems. In this research, an activity-based costing (ABC) approach was used as the weight in searching for the operation route on the proper machine, so that the total production cost can be optimized. The search method used in this experiment is simulated annealing, a variant of the hill-climbing search method. The ideal operation time to process a part was used as the annealing schedule. Empirical tests show that using the ABC approach with simulated annealing to search for routes can optimize the total production cost, and that using the ideal operation time as the annealing schedule keeps the processing time well under control.
Theodorakos, I.; Zergioti, I.; Vamvakas, V.; Tsoukalas, D.; Raptis, Y. S.
2014-01-01
In this work, a picosecond diode-pumped solid-state laser and a nanosecond Nd:YAG laser have been used for the annealing and partial nano-crystallization of an amorphous silicon layer. These experiments were conducted as an alternative or complement to the plasma-enhanced chemical vapor deposition method for fabricating micromorph tandem solar cells. The laser experimental work was combined with simulations of the annealing process, in terms of the evolution of the temperature distribution, in order to predetermine the optimum annealing conditions. The structural properties of the annealed material were studied as a function of several annealing parameters (wavelength, pulse duration, fluence) by X-ray diffraction, SEM, and micro-Raman techniques.
Speagle, Joshua S; Eisenstein, Daniel J; Masters, Daniel C; Steinhardt, Charles L
2015-01-01
Using a grid of $\\sim 2$ million elements ($\\Delta z = 0.005$) adapted from COSMOS photometric redshift (photo-z) searches, we investigate the general properties of template-based photo-z likelihood surfaces. We find these surfaces are filled with numerous local minima and large degeneracies that generally confound rapid but "greedy" optimization schemes, even with additional stochastic sampling methods. In order to robustly and efficiently explore these surfaces, we develop BAD-Z [Brisk Annealing-Driven Redshifts (Z)], which combines ensemble Markov Chain Monte Carlo (MCMC) sampling with simulated annealing to sample arbitrarily large, pre-generated grids in approximately constant time. Using a mock catalog of 384,662 objects, we show BAD-Z samples $\\sim 40$ times more efficiently compared to a brute-force counterpart while maintaining similar levels of accuracy. Our results represent first steps toward designing template-fitting photo-z approaches limited mainly by memory constraints rather than computation...
The Adaptive Multi-scale Simulation Infrastructure
Tobin, William R. [Rensselaer Polytechnic Inst., Troy, NY (United States)
2015-09-01
The Adaptive Multi-scale Simulation Infrastructure (AMSI) is a set of libraries and tools developed to support the development, implementation, and execution of general multimodel simulations. Using a minimal set of simulation meta-data, AMSI allows minimally intrusive work to adapt existing single-scale simulations for use in multi-scale simulations. Support for dynamic runtime operations such as single- and multi-scale adaptive properties is a key focus of AMSI. Particular attention has been paid to the development of scale-sensitive load balancing operations, which allow single-scale simulations incorporated into a multi-scale simulation through AMSI to use standard load-balancing operations without affecting the integrity of the overall multi-scale simulation.
Crystallization on a sphere using the simulated annealing algorithm implemented on H.P.C. systems
de Voogd, J. M.; Sloot, P. M. A.; Verbraeck, A.; Kerckhoffs, E.J.H.
1993-01-01
The research presented here is a comparison of the scalability of the simulated annealing algorithm on a vector supercomputer (CRAY Y-MP) with the scalability of a parallel implementation on a massively parallel transputer surface (Parsytec GCel with 512 nodes of type T805). Some results of the annealing procedure applied to the crystallization of Lennard-Jones particles on a sphere are presented.
Hui Li; Xiong Wan; Taoli Liu; Zhongshou Liu; Yanhua Zhu
2007-01-01
Although emission spectral tomography (EST) combines emission spectral measurement with optical computed tomography (OCT), it is difficult to obtain transient emission data from a large number of views; therefore, high-precision OCT algorithms for few views ought to be studied for EST applications. To improve the reconstruction precision in the case of few views, a new computed tomography reconstruction algorithm based on a multipurpose optimality criterion and simulated annealing theory (the multi-criterion simulated annealing reconstruction technique, MCSART) is proposed. This algorithm can satisfy the least-squares criterion, the uniformity criterion, and the smoothness criterion simultaneously, and a globally optimal solution can be obtained with simulated annealing. Simulation experiments show that this algorithm is superior to traditional algorithms under various noise conditions.
Robot Path Planning Based on Simulated Annealing and Artificial Neural Networks
Xianmin Wei
2013-05-01
Full Text Available To address the limitations of current algorithms for global path planning of mobile robots, this study applies an improved simulated annealing algorithm combined with artificial neural networks to mobile robot path planning, in order to overcome the weaknesses of large-scale iterative computation and slow convergence. A best-reserved simulated annealing algorithm is introduced and effectively combined with other algorithms; the improved algorithm accelerates convergence and shortens computing time, so the global optimal solution can be obtained quickly. Because the energy function is taken to be the path length plus an obstacle-collision penalty function represented by neural networks, the planned path not only meets the shortest-path criterion but also avoids collisions with obstacles. Simulation results show that the improved algorithm effectively increases the speed of path planning while ensuring path quality.
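The energy function described in the abstract, path length plus an obstacle-collision penalty, can be sketched with plain simulated annealing. This is a minimal illustration only: the neural-network representation of the penalty is replaced by a simple distance-based penalty, and all names, constants, and the obstacle model are illustrative assumptions, not taken from the paper.

```python
import math
import random

def path_energy(path, obstacles, clearance=0.5, weight=100.0):
    """Energy = path length + weighted penalty for waypoints that come
    closer than `clearance` to any obstacle point."""
    length = sum(math.dist(path[i], path[i + 1]) for i in range(len(path) - 1))
    penalty = sum(max(0.0, clearance - math.dist(p, o))
                  for p in path for o in obstacles)
    return length + weight * penalty

def anneal_path(start, goal, obstacles, n_points=6, n_steps=20000, seed=2):
    rng = random.Random(seed)
    # Interior waypoints initialized on the straight start-goal line.
    path = [start] + [
        (start[0] + (goal[0] - start[0]) * k / (n_points + 1),
         start[1] + (goal[1] - start[1]) * k / (n_points + 1))
        for k in range(1, n_points + 1)] + [goal]
    e = path_energy(path, obstacles)
    best_path, best_e = path[:], e
    for k in range(1, n_steps + 1):
        t = 1.0 / math.sqrt(k)
        i = rng.randrange(1, len(path) - 1)   # perturb one interior waypoint
        cand = path[:]
        cand[i] = (path[i][0] + rng.gauss(0.0, 0.2),
                   path[i][1] + rng.gauss(0.0, 0.2))
        ce = path_energy(cand, obstacles)
        # Accept improvements always, worse paths with Boltzmann probability.
        if ce < e or rng.random() < math.exp((e - ce) / t):
            path, e = cand, ce
            if e < best_e:
                best_path, best_e = path[:], e
    return best_path, best_e

# One obstacle sitting on the straight line between start and goal, so the
# annealer must bend the path around it.
path, energy = anneal_path((0.0, 0.0), (4.0, 0.0), obstacles=[(2.0, 0.0)])
```

Because the penalty term is heavily weighted, minimizing the combined energy both shortens the path and pushes waypoints out of the obstacle's clearance zone, which is the trade-off the abstract describes.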
Simulation of annealing process effect on texture evolution of deep-drawing sheet St15
Jinghong Sun; Yazheng Liu; Leyu Zhou
2005-01-01
A two-dimensional cellular automaton method was used to simulate grain growth during the recrystallization annealing of deep-drawing sheet St15, taking the simulated recrystallization result and the experimentally measured annealing texture of deep-drawing sheet St15 as the initial condition and reference. By means of computer simulation, the microstructures and textures at different stages of grain growth were predicted. The grain size, shape, and texture become stable after grain growth at a constant temperature of 700°C for 10 h, and the favorable {111}-type texture components are dominant.
Zhu, Jiulong; Wang, Shijun
Presently, water resources in most watersheds in China are distributed according to administrative instructions. This allocation method has many disadvantages and hampers the instructive effect of market mechanisms on water allocation. The paper studies the South-to-North Water Transfer Project and discusses water allocation among the node lakes along the Project. First, it advances four assumptions. Second, it analyzes the constraint conditions of water allocation in light of the present state of water allocation in China. Third, it establishes a goal model of water allocation and sets up a systematic model from the perspective of the comprehensive profits of water utilization and the profits of the node lakes. Fourth, it discusses the calculation method of the model by means of a Simulated Annealing Hybrid Genetic Algorithm (SHGA). Finally, it validates the rationality and validity of the model with a simulation test.
Liang, Faming
2014-04-03
Simulated annealing has been widely used in the solution of optimization problems. As many researchers know, the global optima cannot be guaranteed to be located by simulated annealing unless a logarithmic cooling schedule is used; however, the logarithmic cooling schedule is so slow that the required CPU time is prohibitive. This article proposes a new stochastic optimization algorithm, the so-called simulated stochastic approximation annealing algorithm, which is a combination of simulated annealing and the stochastic approximation Monte Carlo algorithm. Under the framework of stochastic approximation, it is shown that the new algorithm can work with a cooling schedule in which the temperature decreases much faster than in the logarithmic cooling schedule, for example, a square-root cooling schedule, while still guaranteeing that the global optima are reached as the temperature tends to zero. The new algorithm has been tested on a few benchmark optimization problems, including feed-forward neural network training and protein folding. The numerical results indicate that the new algorithm can significantly outperform simulated annealing and other competitors. Supplementary materials for this article are available online.
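The contrast the abstract draws between logarithmic and faster square-root cooling schedules can be illustrated with a minimal generic annealer. The objective, neighborhood, and constants below are illustrative assumptions, not the paper's algorithm; in particular, the stochastic-approximation correction that makes the fast schedule theoretically sound is not sketched here.

```python
import math
import random

def anneal(energy, neighbor, x0, schedule, n_steps=20000, seed=0):
    """Generic simulated annealing loop with a pluggable cooling schedule."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    for k in range(1, n_steps + 1):
        t = schedule(k)
        y = neighbor(x, rng)
        de = energy(y) - e
        # Metropolis rule: accept downhill always, uphill with prob exp(-dE/T).
        if de <= 0 or rng.random() < math.exp(-de / t):
            x, e = y, e + de
            if e < best_e:
                best_x, best_e = x, e
    return best_x, best_e

# The two schedules contrasted in the abstract: logarithmic (convergence
# guarantee, impractically slow decay) vs square-root (much faster decay).
log_schedule = lambda k: 1.0 / math.log(k + 1)
sqrt_schedule = lambda k: 1.0 / math.sqrt(k)

# Toy multimodal objective with its global minimum at x = 0.
energy = lambda x: x * x + 4.0 * (1.0 - math.cos(2.0 * math.pi * x))
neighbor = lambda x, rng: x + rng.gauss(0.0, 1.0)

x_best, e_best = anneal(energy, neighbor, 5.0, sqrt_schedule)
```

At any step count the square-root schedule is already much colder than the logarithmic one, which is exactly why it cannot be justified within classical simulated annealing theory and needs the stochastic approximation framework of the paper.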
Laser annealing and simulation of amorphous silicon thin films for solar cell applications
Theodorakos, I.; Raptis, Y. S.; Vamvakas, V.; Tsoukalas, D.; Zergioti, I.
2014-03-01
In this work, a picosecond DPSS laser and a nanosecond Nd:YAG laser have been used for the annealing and partial nanocrystallization of an amorphous silicon layer. These experiments were conducted in order to improve the characteristics of a micromorph tandem solar cell. The laser annealing was attempted at 1064 nm in order to obtain the desired crystallization depth and ratio. Preliminary annealing processes with different parameters, such as fluence, repetition rate, and number of pulses, were tested. Irradiations were applied in the sub-melt regime in order to prevent significant diffusion of p- and n-dopants within the structure. The experimental work was combined with simulations of the laser annealing process, in terms of the evolution of the temperature distribution, using the Synopsys Sentaurus Process TCAD software. The optimum annealing conditions for the two pulse durations were determined. Experimentally determined optical properties of the samples, such as the absorption coefficient and reflectivity, were used for a more realistic simulation. From the simulation results, a temperature profile appropriate to yield the desired recrystallization was obtained for the case of ps pulses, which was verified by the experimental results. The annealed material was studied with respect to its structural properties by XRD, SEM, and micro-Raman techniques, providing consistent information on the characteristics of the nanocrystalline material produced by the laser annealing experiments. It was found that, with the use of ps pulses, the resulting polycrystalline region shows crystallization ratios similar to a PECVD-grown poly-silicon layer, with slightly larger nanocrystallite size.
Adaptive Multilevel Monte Carlo Simulation
Hoel, H
2011-08-23
This work generalizes the multilevel forward Euler Monte Carlo method introduced by Michael B. Giles (Oper. Res. 56(3):607-617, 2008) for the approximation of expected values depending on the solution to an Itô stochastic differential equation. That work proposed and analyzed a forward Euler multilevel Monte Carlo method based on a hierarchy of uniform time discretizations and control variates to reduce the computational effort required by a standard, single-level forward Euler Monte Carlo method. This work introduces an adaptive hierarchy of non-uniform time discretizations, generated by an adaptive algorithm introduced in (Anna Dzougoutov et al. Adaptive Monte Carlo algorithms for stopped diffusion. In Multiscale Methods in Science and Engineering, volume 44 of Lect. Notes Comput. Sci. Eng., pages 59-88. Springer, Berlin, 2005; Kyoung-Sook Moon et al. Stoch. Anal. Appl. 23(3):511-558, 2005; Kyoung-Sook Moon et al. An adaptive algorithm for ordinary, stochastic and partial differential equations. In Recent Advances in Adaptive Computation, volume 383 of Contemp. Math., pages 325-343. Amer. Math. Soc., Providence, RI, 2005). This form of the adaptive algorithm generates stochastic, path-dependent time steps and is based on a posteriori error expansions first developed in (Anders Szepessy et al. Comm. Pure Appl. Math. 54(10):1169-1214, 2001). Our numerical results for a stopped diffusion problem exhibit savings in the computational cost to achieve an accuracy of O(TOL), from O(TOL^-3) using a single-level version of the adaptive algorithm to O((TOL^-1 log(TOL))^2).
Physical and mathematical models and numerical simulation of the diffusion of implanted impurities during rapid thermal treatment of silicon structures are discussed. The calculated results agree with the experimental results to sufficient accuracy. A simulation software system has been developed and integrated into the ATHENA simulation system developed by Silvaco Inc. The program can simulate low-energy implantation of B, BF2, P, As, Sb, and C ions into silicon structures and the subsequent rapid thermal annealing. (authors)
Takahashi, Norio; Ebihara, Kenji; Yoshida, Koji; Nakata, Takayoshi; Ohashi, Ken; Miyata, Koji
1996-01-01
Factors affecting the convergence characteristics and the results obtained by an optimal design method combining the finite element method and simulated annealing are investigated systematically, and optimal parameters for the simulated annealing method are obtained. The optimal shape of the die mold for orientation of the magnetic powder (a nonlinear magnetostatic problem) is obtained using finite elements and simulated annealing. Experimental verification is also carried out.
Ruiz, Alfonso de la Fuente
2014-01-01
Brief description of the state of the art of some local optimization methods: quantum annealing. Quantum annealing (also known as alloying, crystallization, or tempering) is analogous to simulated annealing, but with thermal activation replaced by quantum tunneling. The class of algorithmic methods for quantum annealing (dubbed 'QA'), sometimes referred to by the Italian school as Quantum Stochastic Optimization ('QSO'), is a promising metaheuristic tool for solving local search problems in mult...
Optimal actuator placement on an active reflector using a modified simulated annealing technique
Kuo, Chin-Po; Bruno, Robin
1991-01-01
The development of a lightweight actuation system for maintaining the surface accuracy of a composite honeycomb panel using piezoelectric actuators is discussed. A modified simulated annealing technique is used to optimize the problem with both combinatorial and continuous criteria and with inequality constraints. Near optimal solutions for the location of the actuators, using combinatorial optimization, and for the required actuator forces, employing continuous optimization, are sought by means of the modified simulated annealing technique. The actuator locations are determined by first seeking a near optimum solution using the modified simulated annealing technique. The final actuator configuration consists of an arrangement wherein the piezoelectric actuators are placed along six radial lines. Numerical results showing the achievable surface correction by means of this configuration are presented.
Simulated annealing algorithm for TSP
朱静丽
2011-01-01
The traveling salesman problem (TSP) is a combinatorial optimization problem with NP-complete computational complexity. This paper analyzes the simulated annealing algorithm model, studies the feasibility of using simulated annealing to solve the TSP, and gives a concrete implementation of a simulated annealing algorithm for the TSP.
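A minimal implementation of the kind of simulated annealing TSP solver this abstract describes might look as follows. The 2-opt neighborhood and geometric cooling are common choices rather than details taken from the paper, and all parameter values are illustrative.

```python
import math
import random

def tour_length(tour, pts):
    """Total length of the closed tour visiting pts in the given order."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def anneal_tsp(pts, t0=1.0, cooling=0.999, n_steps=30000, seed=1):
    rng = random.Random(seed)
    n = len(pts)
    tour = list(range(n))
    rng.shuffle(tour)                     # random starting tour
    cur = tour_length(tour, pts)
    best_tour, best = tour[:], cur
    t = t0
    for _ in range(n_steps):
        i, j = sorted(rng.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # 2-opt reversal
        cl = tour_length(cand, pts)
        # Accept improvements always, worse tours with Boltzmann probability.
        if cl < cur or rng.random() < math.exp((cur - cl) / t):
            tour, cur = cand, cl
            if cur < best:
                best_tour, best = tour[:], cur
        t *= cooling                      # geometric cooling
    return best_tour, best

# Eight cities on a unit circle: the optimal tour is the circle itself,
# with length 16*sin(pi/8).
pts = [(math.cos(2 * math.pi * k / 8), math.sin(2 * math.pi * k / 8))
       for k in range(8)]
tour, length = anneal_tsp(pts)
```

On this toy instance the annealer recovers the convex-hull tour; for larger instances the cooling rate and step budget would need tuning.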
Sohn, Andrew; Biswas, Rupak
1996-01-01
Solving the hard Satisfiability Problem is time consuming even for modest-sized problem instances. Solving the Random L-SAT Problem is especially difficult due to the ratio of clauses to variables. This report presents a parallel synchronous simulated annealing method for solving the Random L-SAT Problem on a large-scale distributed-memory multiprocessor. In particular, we use a parallel synchronous simulated annealing procedure, called Generalized Speculative Computation, which guarantees the same decision sequence as sequential simulated annealing. To demonstrate the performance of the parallel method, we have selected problem instances varying in size from 100-variables/425-clauses to 5000-variables/21,250-clauses. Experimental results on the AP1000 multiprocessor indicate that our approach can satisfy 99.9 percent of the clauses while giving almost a 70-fold speedup on 500 processors.
Optimization of pressurized water reactor shuffling by simulated annealing with heuristics
Simulated-annealing optimization of reactor core loading patterns is implemented with support for design heuristics during candidate pattern generation. The SIMAN optimization module uses the advanced nodal method of SIMULATE-3 and the full cross-section detail of CASMO-3 to evaluate accurately the neutronic performance of each candidate, resulting in high-quality patterns. The use of heuristics within simulated annealing is explored. Heuristics improve the consistency of optimization results for both fast- and slow-annealing runs with no penalty from the exclusion of unusual candidates. Thus, the heuristic application of designer judgment during automated pattern generation is shown to be effective. The capability of the SIMAN module to find and evaluate families of loading patterns that satisfy design constraints and have good objective performance within practical run times is demonstrated. The use of automated evaluations of successive cycles to explore multicycle effects of design decisions is discussed
Sousa, Tiago M; Soares, Tiago; Morais, Hugo;
2016-01-01
The massive use of distributed generation and electric vehicles will lead to more complex management of the power system, requiring new approaches in the field of optimal resource scheduling. Electric vehicles with vehicle-to-grid capability can be useful to aggregator players in mitigating the intermittency of renewable sources and in ancillary services procurement. In this paper, an energy and ancillary services joint management model is proposed. A simulated annealing approach is used to solve the joint management for the following day, considering the minimization of the aggregator's total operation costs. The case study considers a distribution network with 33 buses, 66 distributed generators, and 2000 electric vehicles. The proposed simulated annealing is matched with a deterministic approach, allowing an effective and efficient comparison. The simulated annealing presents a...
Instantons in Quantum Annealing: Thermally Assisted Tunneling Vs Quantum Monte Carlo Simulations
Jiang, Zhang; Smelyanskiy, Vadim N.; Boixo, Sergio; Isakov, Sergei V.; Neven, Hartmut; Mazzola, Guglielmo; Troyer, Matthias
2015-01-01
Recent numerical results (arXiv:1512.02206) from Google suggested that the D-Wave quantum annealer may have an asymptotic speedup over simulated annealing; however, the asymptotic advantage disappears when it is compared to quantum Monte Carlo (a classical algorithm despite its name). We show analytically that the asymptotic scaling of quantum tunneling is exactly the same as the escape rate in quantum Monte Carlo for a class of problems. Thus, the Google result might be explained within our framework. We also find that the transition state in quantum Monte Carlo corresponds to the instanton solution in quantum tunneling problems, which is observed in numerical simulations.
Chu, Shuchuan; Roddick, John F.
2003-01-01
In this paper, a cluster generation algorithm for vector quantization using a tabu search approach with simulated annealing is proposed. The main idea of this algorithm is to use the tabu search approach to generate non-local moves for the clusters and to apply the simulated annealing technique to select the current best solution, thus improving cluster generation and reducing the mean squared error. Preliminary experimental results demonstrate that the proposed approach is superior to the tabu search approach with the Generalised Lloyd algorithm.
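The combination described, tabu-restricted non-local moves with a simulated annealing acceptance test, can be sketched on a toy one-dimensional vector quantization problem. The data set, tabu granularity, move distribution, and all constants below are illustrative assumptions, not the paper's.

```python
import math
import random
from collections import deque

def mse(codebook, data):
    """Mean squared error of quantizing each point to its nearest codeword."""
    return sum(min((x - c) ** 2 for c in codebook) for x in data) / len(data)

def tabu_sa_vq(data, k=3, n_steps=3000, tabu_len=20, seed=4):
    rng = random.Random(seed)
    codebook = rng.sample(data, k)
    e = mse(codebook, data)
    best, best_e = codebook[:], e
    tabu = deque(maxlen=tabu_len)     # recently vacated codeword positions
    for step in range(1, n_steps + 1):
        t = 1.0 / math.sqrt(step)
        i = rng.randrange(k)
        # Non-local move: reseed the codeword near a random data point.
        move = rng.choice(data) + rng.gauss(0.0, 0.3)
        if round(move, 1) in tabu:    # skip recently visited regions
            continue
        cand = codebook[:]
        cand[i] = move
        ce = mse(cand, data)
        # Simulated annealing acceptance test on the tabu-proposed move.
        if ce < e or rng.random() < math.exp((e - ce) / t):
            tabu.append(round(codebook[i], 1))
            codebook, e = cand, ce
            if e < best_e:
                best, best_e = codebook[:], e
    return best, best_e

# Three well-separated 1-D clusters centered at 0, 5, and 10.
data = [c + 0.1 * j for c in (0.0, 5.0, 10.0) for j in range(-2, 3)]
codebook, err = tabu_sa_vq(data)
```

The non-local reseeding moves let a codeword jump between distant clusters, which plain local perturbations cannot do; the tabu list discourages revisiting regions a codeword just left.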
Riaz, M. Tahir; Gutierrez Lopez, Jose Manuel; Pedersen, Jens Myrup;
2011-01-01
The paper presents a hybrid genetic and simulated annealing algorithm for implementing a Chordal Ring structure in an optical backbone network. In recent years, topologies based on regular graph structures have gained much interest due to their good communication properties for the physical topology of networks. Evolutionary algorithms have often been used to solve problems of a combinatorial nature that are extremely hard to solve by exact approaches. Both genetic and simulated annealing algorithms use a controlled stochastic method to search for the solution. The...
高红民; 周惠; 徐立中; 石爱业
2014-01-01
A hybrid feature selection and classification strategy is proposed based on a simulated annealing genetic algorithm and multiple-instance learning (MIL). The band selection method is based on subspace decomposition, combining the simulated annealing algorithm with the genetic algorithm by choosing different crossover and mutation probabilities as well as mutation individuals. MIL is then combined with image segmentation, clustering, and support vector machine algorithms to classify hyperspectral images. The experimental results show that the proposed method achieves a high classification accuracy of 93.13% with small training samples and overcomes the weaknesses of conventional methods.
Frausto-Solis, Juan; Liñán-García, Ernesto; Sánchez-Hernández, Juan Paulo; González-Barbosa, J Javier; González-Flores, Carlos; Castilla-Valdez, Guadalupe
2016-01-01
A new hybrid Multiphase Simulated Annealing Algorithm using Boltzmann and Bose-Einstein distributions (MPSABBE) is proposed. MPSABBE was designed for solving the Protein Folding Problem (PFP) instances. This new approach has four phases: (i) Multiquenching Phase (MQP), (ii) Boltzmann Annealing Phase (BAP), (iii) Bose-Einstein Annealing Phase (BEAP), and (iv) Dynamical Equilibrium Phase (DEP). BAP and BEAP are simulated annealing searching procedures based on Boltzmann and Bose-Einstein distributions, respectively. DEP is also a simulated annealing search procedure, which is applied at the final temperature of the fourth phase, which can be seen as a second Bose-Einstein phase. MQP is a search process that ranges from extremely high to high temperatures, applying a very fast cooling process, and is not very restrictive to accept new solutions. However, BAP and BEAP range from high to low and from low to very low temperatures, respectively. They are more restrictive for accepting new solutions. DEP uses a particular heuristic to detect the stochastic equilibrium by applying a least squares method during its execution. MPSABBE parameters are tuned with an analytical method, which considers the maximal and minimal deterioration of problem instances. MPSABBE was tested with several instances of PFP, showing that the use of both distributions is better than using only the Boltzmann distribution on the classical SA. PMID:27413369
Neutronic optimization in high conversion Th-233U fuel assembly with simulated annealing
This paper reports on fuel design optimization of a PWR operating in a self-sustainable Th-233U fuel cycle. A Monte Carlo simulated annealing method was used in order to identify the fuel assembly configuration with the most attractive breeding performance. Previous studies showed that breeding may be achieved by employing a heterogeneous seed-blanket fuel geometry. The arrangement of seed and blanket pins within the assemblies may be determined by varying the design parameters based on the basic reactor physics phenomena that affect breeding. However, the number of free parameters may still be prohibitively large for a systematic exploration of the design space for the optimal solution. Therefore, the Monte Carlo annealing algorithm for neutronic optimization is applied in order to identify the most favorable design. The objective of simulated annealing optimization is to find a set of design parameters which maximizes some given performance function (such as the relative period of net breeding) under specified constraints (such as fuel cycle length). The first objective of the study was to demonstrate that the simulated annealing optimization algorithm leads to the same fuel pin arrangement as was obtained in the previous studies, which used only basic physics phenomena as guidance for optimization. In the second part of this work, the simulated annealing method was used to optimize the fuel pin arrangement in a much larger fuel assembly, where basic physics intuition does not yield a clearly optimal configuration. The simulated annealing method was found to be very efficient in selecting the optimal design in both cases. In the future, this method will be used to optimize fuel assembly designs with a larger number of free parameters in order to determine the most favorable trade-off between breeding performance and core average power density. (authors)
Stochastic annealing simulation of copper under neutron irradiation
Heinisch, H.L. [Pacific Northwest National Lab., Richland, WA (United States); Singh, B.N. [Risoe National Lab., Roskilde (Denmark)
1998-03-01
This report is a summary of a presentation made at ICFRM-8 on computer simulations of defect accumulation during irradiation of copper to low doses at room temperature. The simulation results are in good agreement with experimental data on defect cluster densities in copper irradiated in RTNS-II.
Optimal design of hydraulic manifold blocks based on niching genetic simulated annealing algorithm
Jia Chunqiang; Yu Ling; Tian Shujun; Gao Yanming
2007-01-01
To solve the combinatorial optimization problem of integrated outer-layout and inner-connection schemes in the design of hydraulic manifold blocks (HMB), a hybrid genetic simulated annealing algorithm based on niche technology is presented. This hybrid algorithm, which combines a genetic algorithm, a simulated annealing algorithm, and niche technology, has a strong capability for global and local search, and all extrema can be found in a short time without strict requirements on preferences. For the complex constrained solid spatial layout problems in HMB, an optimization mathematical model is presented. The key technologies in the integrated layout and connection design of HMB, including the realization of coding, the annealing operation, and the genetic operation, are discussed. The framework of an HMB optimal design system based on the hybrid optimization strategy is proposed. An example is given to testify to the effectiveness and feasibility of the algorithm.
X-ray refinement of protein structures by simulated annealing: Test of the method on myohemerythrin
The recently developed method of structure factor refinement by molecular dynamics with simulated annealing is tested on the 118 residue protein myohemerythrin. A highly refined structure for this protein at 1.3/1.7 A resolution has recently been published. This is compared with the results of simulated annealing refinement (with no manual intervention) starting from an earlier model for the protein from a stage in the refinement when conventional least-squares methods could not improve the structure. Simulated annealing reduces the R factor at 2.5 A from 39 to 31%, with uniform temperature factors and no solvent molecules and with similar stereochemistry; the comparable value for the manually refined structure is 27.9%. Errors in backbone and sidechain positions up to about 3 A are corrected by the method. The error in backbone positions for roughly 85% of the initial structure is within this range, and in these regions the r.m.s. backbone error is reduced from 1.1 to 0.4 A. For the rest of the structure, including a region which was incorrectly built due to a sequence error, the procedure does not yield any improvement and manual intervention appears to be required. Nevertheless, the overall improvement in the structure results in electron density maps that are easier to interpret and permit identification of the errors in the structure. The general utility of the simulated annealing methodology in X-ray refinement is discussed. (orig.)
Computer-Assisted Scheduling of Army Unit Training: An Application of Simulated Annealing.
Hart, Roland J.; Goehring, Dwight J.
This report of an ongoing research project intended to provide computer assistance to Army units for the scheduling of training focuses on the feasibility of simulated annealing, a heuristic approach for solving scheduling problems. Following an executive summary and brief introduction, the document is divided into three sections. First, the Army…
Using genetic/simulated annealing algorithm to solve disassembly sequence planning
Wu Hao; Zuo Hongfu
2009-01-01
disassembly sequence. The solution methodology, based on the genetic/simulated annealing algorithm with a binary-tree algorithm, is given. Finally, an example is analyzed in detail, and the result shows that the model is correct and efficient.
MASTR: multiple alignment and structure prediction of non-coding RNAs using simulated annealing
Lindgreen, Stinus; Gardner, Paul P; Krogh, Anders
2007-01-01
multiple alignment of RNA sequences. Using Markov chain Monte Carlo in a simulated annealing framework, the algorithm MASTR (Multiple Alignment of STructural RNAs) iteratively improves both sequence alignment and structure prediction for a set of RNA sequences. This is done by minimizing a combined cost...
Improving Simulated Annealing by Recasting it as a Non-Cooperative Game
Wolpert, David; Bandari, Esfandiar; Tumer, Kagan
2001-01-01
The game-theoretic field of COllective INtelligence (COIN) concerns the design of computer-based players engaged in a non-cooperative game so that as those players pursue their self-interests, a pre-specified global goal for the collective computational system is achieved "as a side-effect". Previous implementations of COIN algorithms have outperformed conventional techniques by up to several orders of magnitude, on domains ranging from telecommunications control to optimization in congestion problems. Recent mathematical developments have revealed that these previously developed game-theory-motivated algorithms were based on only two of the three factors determining performance. Consideration of only the third factor would instead lead to conventional optimization techniques like simulated annealing that have little to do with non-cooperative games. In this paper we present an algorithm based on all three terms at once. This algorithm can be viewed as a way to modify simulated annealing by recasting it as a non-cooperative game, with each variable replaced by a player. This recasting allows us to leverage the intelligent behavior of the individual players to substantially improve the exploration step of the simulated annealing. Experiments are presented demonstrating that this recasting improves simulated annealing by several orders of magnitude for spin glass relaxation and bin-packing.
Improving Simulated Annealing by Replacing Its Variables with Game-Theoretic Utility Maximizers
Wolpert, David H.; Bandari, Esfandiar; Tumer, Kagan
2001-01-01
The game-theory field of Collective INtelligence (COIN) concerns the design of computer-based players engaged in a non-cooperative game so that as those players pursue their self-interests, a pre-specified global goal for the collective computational system is achieved as a side-effect. Previous implementations of COIN algorithms have outperformed conventional techniques by up to several orders of magnitude, on domains ranging from telecommunications control to optimization in congestion problems. Recent mathematical developments have revealed that these previously developed algorithms were based on only two of the three factors determining performance. Consideration of only the third factor would instead lead to conventional optimization techniques like simulated annealing that have little to do with non-cooperative games. In this paper we present an algorithm based on all three terms at once. This algorithm can be viewed as a way to modify simulated annealing by recasting it as a non-cooperative game, with each variable replaced by a player. This recasting allows us to leverage the intelligent behavior of the individual players to substantially improve the exploration step of the simulated annealing. Experiments are presented demonstrating that this recasting significantly improves simulated annealing for a model of an economic process run over an underlying small-worlds topology. Furthermore, these experiments reveal novel small-worlds phenomena, and highlight the shortcomings of conventional mechanism design in bounded rationality domains.
A simulated annealing-based method for learning Bayesian networks from statistical data
Janžura, Martin; Nielsen, Jan
2006-01-01
Vol. 21, No. 3 (2006), pp. 335-348. ISSN 0884-8173. R&D Projects: GA ČR GA201/03/0478. Institutional research plan: CEZ:AV0Z10750506. Keywords: Bayesian network; simulated annealing; Markov chain Monte Carlo. Subject RIV: BA - General Mathematics. Impact factor: 0.429, year: 2006
Inverse planning in external beam radiotherapy often requires a scalar objective function that incorporates importance factors to mimic the planner's preferences between conflicting objectives. Defining those importance factors is not straightforward, and frequently leads to an iterative process in which the importance factors become variables of the optimization problem. In order to avoid this drawback of inverse planning, optimization using algorithms better suited to multiobjective optimization, such as evolutionary algorithms, has been suggested. However, much inverse planning software, including a package based on simulated annealing developed at our institution, does not include multiobjective-oriented algorithms. This work investigates the performance of a modified simulated annealing algorithm used to drive aperture-based intensity-modulated radiotherapy inverse planning software in a multiobjective optimization framework. For a few test cases involving gastric cancer patients, the use of this new algorithm leads to an increase in optimization speed of a little more than a factor of 2 over a conventional simulated annealing algorithm, while giving a close approximation of the solutions produced by a standard simulated annealing. A simple graphical user interface designed to facilitate the decision-making process that follows an optimization is also presented.
Mori, Takaharu; Okamoto, Yuko
2009-10-01
Gramicidin A is a linear hydrophobic 15-residue peptide which consists of alternating D- and L-amino acids and forms a unique tertiary structure, called the β6.3-helix, to act as a cation-selective ion channel under natural conditions. In order to investigate the intrinsic ability of the gramicidin A monomer to form secondary structures, we performed folding simulations of gramicidin A using a simulated annealing molecular dynamics (MD) method in vacuum, mimicking a low-dielectric, homogeneous membrane environment. The initial conformation was fully extended. From 200 different MD runs, we obtained a right-handed β4.4-helix as the lowest-potential-energy structure, and the left-handed β4.4-helix and the right- and left-handed β6.3-helices as local-minimum energy states. These results are in accord with experiments on gramicidin A in homogeneous organic solvents. Our simulations showed a slight right-hand preference in the lower-energy conformations and a pronounced β-sheet-forming tendency throughout almost the entire sequence. In order to examine the stability of the obtained right-handed β6.3-helix and β4.4-helix structures in a more realistic membrane environment, we also performed all-atom MD simulations in explicit water, ions, and lipid molecules, starting from these β-helix structures. The results suggested that the β6.3-helix is more stable than the β4.4-helix in the inhomogeneous, explicit membrane environment, where the pore water and the hydrogen bonds between Trp side chains and lipid head groups further stabilize the β6.3-helix conformation.
Estimation of Mutual Coupling Coefficient of the Array by Simulated Annealing Algorithm
GAO Huo-tao; ZHENG Xia; LI Yong-xu
2005-01-01
In this paper we propose a method for estimating the mutual coupling coefficients among array antennas, based on the signal-subspace principle and the simulated annealing (SA) algorithm. Computer simulations illustrate the excellent performance of this method and demonstrate that it is statistically efficient. The benefit of the new method is that calibration signals and unknown signals can be received simultaneously during the course of calibration.
Wang Hongkai; Guan Yanyong; Xue Peijun
2008-01-01
In rough communication, because each agent speaks a different language and precise communication between agents is impossible, a concept translated among multiple agents loses some information, resulting in a coarser concept. The information loss varies with the translation sequence. To find the translation sequence in which the j-th agent taking part in rough communication receives maximum information, a simulated annealing algorithm is used. Analysis and simulation of this algorithm demonstrate its effectiveness.
Annealing of dislocation loops in dislocation dynamics simulations
Mordehai, Dan; Clouet, Emmanuel [SRMP, CEA-Saclay, 91191 Gif-sur-Yvette Cedex (France); Fivel, Marc; Verdier, Marc, E-mail: danmord@tx.technion.ac.il [CNRS/SIMAP, INPG, BP 75, 38402 St Martin d' Heres (France)
2009-07-15
We report of 3-dimensional discrete dislocation dynamics (DDD) simulations of dislocation loops coarsening by vacancy bulk diffusion. The calculation is based upon a model which couples the diffusion theory of vacancies to the DDD in order to obtain the climb rate of the dislocation segments. Calculation of isolated loops agrees with experimental observations, i.e. loops shrink or expand, depending on their type and vacancy supersaturation. When an array of dislocation loops of various sizes is considered, and the total number of vacancies in the simulation is maintained constant, the largest dislocations are found to increase in size at the expense of small ones, which disappear in a process known as Ostwald ripening.
Experiences with serial and parallel algorithms for channel routing using simulated annealing
Brouwer, Randall Jay
1988-01-01
Two algorithms for channel routing using simulated annealing are presented. Simulated annealing is an optimization methodology that allows the solution process to back out of local minima that may be entered through inappropriate selections. By properly controlling the annealing process, it is very likely that the optimal solution to an NP-complete problem such as channel routing may be found. The algorithm presented imposes very relaxed restrictions on the types of allowable transformations, including overlapping nets. By relaxing that restriction and controlling overlap situations with an appropriate cost function, the algorithm becomes very flexible and can be applied to many extensions of channel routing. The selection of the transformation utilizes a number of heuristics, still retaining the pseudorandom nature of simulated annealing. The algorithm was implemented both as a serial program for a workstation and as a parallel program designed for a hypercube computer. The details of the serial implementation are presented, including many of the heuristics used and some of the resulting solutions.
Destya Arisetyanti
2012-09-01
Full Text Available The Digital Video Broadcasting - Terrestrial (DVB-T) standard is implemented in a Single Frequency Network (SFN) configuration, in which all transmitters in a network operate on the same frequency channel and transmit at the same time. SFN is preferred over its predecessor, the Multi Frequency Network (MFN), because it uses spectrum more efficiently and provides a wider coverage area. Because the SFN configuration is based on Orthogonal Frequency Division Multiplexing (OFDM), the receiver side can handle multipath scenarios by combining signals from different transmitters. In this study, data on building heights and counts are applied through free-space and knife-edge propagation models to estimate received power and signal delay. Carrier (C) and carrier-to-interference (C/I) values are computed to assess signal quality at the receiver. Transmitter-location parameters are then optimized with the Simulated Annealing algorithm using the three best cooling schedules. Simulated Annealing is an optimization algorithm based on thermodynamics that simulates the annealing process. Simulated Annealing successfully extended the SFN coverage area, as evidenced by a large reduction in the number of receiver points with signal quality below the threshold.
Annealing simulations of nano-sized amorphous structures in SiC
A two-dimensional model of a nano-sized amorphous layer embedded in a perfect crystal has been developed, and the amorphous-to-crystalline (a-c) transition in 3C-SiC at 2000 K has been studied using molecular dynamics methods, with simulation times of up to 88 ns. Analysis of the a-c interfaces reveals that the recovery of the bond defects existing at the a-c interfaces plays an important role in recrystallization. During the recrystallization process, a second ordered phase, crystalline 2H-SiC, nucleates and grows, and this phase is stable for long simulation times. The crystallization mechanism is a two-step process separated by a longer period of second-phase stability. The kink sites formed at the interfaces between 2H- and 3C-SiC provide a low-energy path for 2H-SiC atoms to transfer to 3C-SiC atoms, a process which can be defined as a solid-phase epitaxial transformation (SPET). It is observed that the nano-sized amorphous structure can be fully recrystallized at 2000 K in SiC, which is in agreement with experimental observations.
Sheng Lu
2015-01-01
Full Text Available To solve the problem of parameter selection during the design of a magnetically coupled resonant wireless power transmission system (MCR-WPT), this paper proposes an improved genetic simulated annealing algorithm. First, the equivalent circuit of the system is analyzed and a nonlinear programming mathematical model is built. Second, in place of the penalty function method in the genetic algorithm, a selection strategy based on the distance between individuals is adopted to select individuals. In this way, the number of empirical parameters is reduced. Meanwhile, the convergence rate and searching ability are improved by calculating the crossover and mutation probabilities according to the variance of the population's fitness. Finally, a simulated annealing operator is added to increase the local search ability of the method. Simulations show that the improved method can escape local optima and reach the global optimum faster. The optimized system meets practical requirements.
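The variance-driven adaptation described above can be sketched as follows. This is a minimal illustration, not the paper's formula: the squashing function and the probability ranges are assumptions chosen for the example. The idea is that a low fitness variance signals a converging population, so crossover and mutation rates are raised to preserve diversity.

```python
def adaptive_rates(fitnesses, pc_range=(0.6, 0.9), pm_range=(0.01, 0.1)):
    """Scale crossover (pc) and mutation (pm) probabilities by population
    fitness variance: low variance (converging population) -> higher rates
    to keep diversity; high variance -> lower rates to exploit.

    The squash var/(var+1) mapping variance into (0, 1) is an illustrative
    assumption, not taken from the paper."""
    n = len(fitnesses)
    mean = sum(fitnesses) / n
    var = sum((f - mean) ** 2 for f in fitnesses) / n
    spread = var / (var + 1.0)  # 0 when converged, -> 1 when diverse
    pc = pc_range[1] - (pc_range[1] - pc_range[0]) * spread
    pm = pm_range[1] - (pm_range[1] - pm_range[0]) * spread
    return pc, pm
```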
Szu, Harold H.
1993-09-01
Classical artificial neural networks (ANN) and neurocomputing are reviewed for implementing real-time medical image diagnosis. An algorithm known as the self-reference matched filter, which emulates the spatio-temporal integration ability of the human visual system, might be utilized for multi-frame processing of medical imaging data. A Cauchy machine, implementing a fast simulated annealing schedule, can determine the degree of abnormality from the degree of orthogonality between the patient imagery and the class of features of healthy persons. An automatic inspection process based on multiple-modality image sequences is simulated by incorporating the following new developments: (1) 1-D space-filling Peano curves to preserve the 2-D neighborhood pixels' relationships; (2) fast simulated Cauchy annealing for the global optimization of self-feature extraction; and (3) a mini-max energy function for intra- and inter-cluster segregation, useful for top-down ANN designs.
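The fast (Cauchy) annealing schedule that distinguishes a Cauchy machine from Boltzmann annealing cools as T_k = T_0/(1+k) rather than T_0/ln(1+k); the faster schedule is admissible because Cauchy-distributed moves occasionally make long jumps. A minimal one-dimensional sketch, with an illustrative quadratic cost and arbitrary parameter values (not from the paper):

```python
import math
import random

def fast_cauchy_anneal(cost, x0, t0=10.0, steps=2000, seed=1):
    """Minimize `cost` with the fast (Cauchy) schedule T_k = T0 / (1 + k).

    Moves are Cauchy deviates scaled by the temperature, so long jumps
    remain possible early on while late moves become local."""
    rng = random.Random(seed)
    x, e = x0, cost(x0)
    best_x, best_e = x, e
    for k in range(steps):
        t = t0 / (1.0 + k)                                   # fast schedule
        step = t * math.tan(math.pi * (rng.random() - 0.5))  # Cauchy deviate
        cand = x + step
        ce = cost(cand)
        # Metropolis acceptance: always downhill, uphill with prob e^(-dE/T)
        if ce < e or rng.random() < math.exp(-(ce - e) / max(t, 1e-12)):
            x, e = cand, ce
            if e < best_e:
                best_x, best_e = x, e
    return best_x, best_e

x, e = fast_cauchy_anneal(lambda v: (v - 3.0) ** 2, x0=-20.0)
```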
DKIST Adaptive Optics System: Simulation Results
Marino, Jose; Schmidt, Dirk
2016-05-01
The 4 m class Daniel K. Inouye Solar Telescope (DKIST), currently under construction, will be equipped with an ultra high order solar adaptive optics (AO) system. The requirements and capabilities of such a solar AO system are beyond those of any other solar AO system currently in operation. We must rely on solar AO simulations to estimate and quantify its performance. We present performance estimation results of the DKIST AO system obtained with a new solar AO simulation tool. This simulation tool is a flexible and fast end-to-end solar AO simulator which produces accurate solar AO simulations while taking advantage of current multi-core computer technology. It relies on full imaging simulations of the extended field Shack-Hartmann wavefront sensor (WFS), which directly includes important secondary effects such as field dependent distortions and varying contrast of the WFS sub-aperture images.
An Improved Simulated Annealing Technique for Enhanced Mobility in Smart Cities
Amer, Hayder; Salman, Naveed; Hawes, Matthew; Chaqfeh, Moumena; Mihaylova, Lyudmila; Mayfield, Martin
2016-01-01
Vehicular traffic congestion is a significant problem that arises in many cities. This is due to the increasing number of vehicles that are driving on city roads of limited capacity. The vehicular congestion significantly impacts travel distance, travel time, fuel consumption and air pollution. Avoidance of traffic congestion and providing drivers with optimal paths are not trivial tasks. The key contribution of this work consists of the developed approach for dynamic calculation of optimal traffic routes. Two attributes (the average travel speed of the traffic and the roads’ length) are utilized by the proposed method to find the optimal paths. The average travel speed values can be obtained from the sensors deployed in smart cities and communicated to vehicles via the Internet of Vehicles and roadside communication units. The performance of the proposed algorithm is compared to three other algorithms: the simulated annealing weighted sum, the simulated annealing technique for order preference by similarity to the ideal solution and the Dijkstra algorithm. The weighted sum and technique for order preference by similarity to the ideal solution methods are used to formulate different attributes in the simulated annealing cost function. According to the Sheffield scenario, simulation results show that the improved simulated annealing technique for order preference by similarity to the ideal solution method improves the traffic performance in the presence of congestion by an overall average of 19.22% in terms of travel time, fuel consumption and CO2 emissions as compared to other algorithms; also, similar performance patterns were achieved for the Birmingham test scenario. PMID:27376289
Dao-Wei Bi
2007-05-01
Full Text Available The limited energy supply of wireless sensor networks poses a great challenge for the deployment of wireless sensor nodes. In this paper, we focus on energy-efficient coverage with distributed particle swarm optimization and simulated annealing. First, the energy-efficient coverage problem is formulated with sensing coverage and energy consumption models. We consider the network composed of stationary and mobile nodes. Second, coverage and energy metrics are presented to evaluate the coverage rate and energy consumption of a wireless sensor network, where a grid exclusion algorithm extracts the coverage state and Dijkstra's algorithm calculates the lowest cost path for communication. Then, a hybrid algorithm optimizes the energy consumption, in which particle swarm optimization and simulated annealing are combined to find the optimal deployment solution in a distributed manner. Simulated annealing is performed on multiple wireless sensor nodes, results of which are employed to correct the local and global best solution of particle swarm optimization. Simulations of wireless sensor node deployment verify that coverage performance can be guaranteed, energy consumption of communication is conserved after deployment optimization and the optimization performance is boosted by the distributed algorithm. Moreover, it is demonstrated that energy efficiency of wireless sensor networks is enhanced by the proposed optimization algorithm in target tracking applications.
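The hybrid described above, in which simulated annealing results correct the swarm's best solution, can be sketched roughly as follows. This is a generic illustration on a toy objective, not the paper's distributed, coverage-specific formulation; the inertia and acceleration coefficients are conventional PSO defaults, assumed for the example.

```python
import math
import random

def pso_sa(cost, dim=2, n_particles=12, iters=60, seed=2):
    """Hybrid PSO with a simulated-annealing correction step: after each
    swarm update, an SA chain started from the global best explores nearby
    states; any improvement it finds overwrites the global best."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_e = [cost(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_e[i])
    gbest, gbest_e = pbest[g][:], pbest_e[g]
    sa_x, sa_e, t = gbest[:], gbest_e, 1.0
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - pos[i][d])
                             + 1.5 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            e = cost(pos[i])
            if e < pbest_e[i]:
                pbest[i], pbest_e[i] = pos[i][:], e
                if e < gbest_e:
                    gbest, gbest_e = pos[i][:], e
        # SA correction of the global best: Metropolis walk at temperature t
        cand = [xd + rng.gauss(0.0, t) for xd in sa_x]
        ce = cost(cand)
        if ce < sa_e or rng.random() < math.exp(-(ce - sa_e) / t):
            sa_x, sa_e = cand, ce
        if sa_e < gbest_e:
            gbest, gbest_e = sa_x[:], sa_e
        t *= 0.92  # geometric cooling
    return gbest, gbest_e

best, best_e = pso_sa(lambda p: sum(x * x for x in p))
```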
Wrożyna, Andrzej; Pernach, Monika; Kuziak, Roman; Pietrzyk, Maciej
2016-04-01
Due to their exceptional strength properties combined with good workability, Advanced High-Strength Steels (AHSS) are commonly used in the automotive industry. Manufacturing of these steels is a complex process which requires precise control of technological parameters during thermo-mechanical treatment. Design of these processes can be significantly improved by numerical models of phase transformations. Evaluation of the predictive capabilities of such models, as far as their applicability in simulating thermal cycles for AHSS is concerned, was the objective of the paper. Two models were considered. The former was an upgrade of the JMAK equation, while the latter was an upgrade of the Leblond model. The models can be applied to any AHSS, though the examples quoted in the paper refer to a Dual Phase (DP) steel. Three series of experimental simulations were performed. The first included various thermal cycles going beyond the limitations of continuous annealing lines; the objective was to validate the models' behavior under more complex cooling conditions. The second set included experimental simulations of the thermal cycle characteristic of continuous annealing lines, and the capability of the models to properly describe phase transformations in this process was evaluated. The third set included data from an industrial continuous annealing line. Validation and verification confirmed the models' good predictive capabilities. Since it does not require application of the additivity rule, the upgraded Leblond model was selected as the better one for simulation of industrial processes in AHSS production.
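For reference, the classical isothermal JMAK equation that the first model upgrades gives the transformed phase fraction as X(t) = 1 − exp(−k tⁿ); the paper's upgrade and its handling of non-isothermal cycles are not reproduced here.

```python
import math

def jmak_fraction(t, k, n):
    """JMAK (Johnson-Mehl-Avrami-Kolmogorov) transformed phase fraction
    under isothermal conditions: X(t) = 1 - exp(-k * t**n), where k lumps
    nucleation and growth rates and n is the Avrami exponent."""
    return 1.0 - math.exp(-k * t ** n)
```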
Simulations of the effect of pulse annealing on optically-stimulated luminescence of quartz
Pulse annealing techniques are commonly used in OSL studies of quartz to obtain information on the kinetic parameters of OSL traps and hole reservoirs. In this paper, simulations of pulse annealing experiments are carried out using the comprehensive model for quartz developed by Bailey [2001. Towards a general kinetic model for optically and thermally stimulated luminescence of quartz. Radiat. Meas. 33, 17-45] for both natural and laboratory-irradiated aliquots. The results of the simulations are in qualitative agreement with, and reproduce, several unusual features of the experimental data of Wintle and Murray [1998. Towards the development of a preheat procedure for OSL dating of quartz. Radiat. Meas. 29, 81-94]. The simulations are also carried out using different heating rates, and show that pulse annealing experiments can be used to recover appropriate kinetic parameters for both the OSL traps and the hole reservoirs known to exist in quartz. The results of the simulations show the importance of these hole reservoirs in determining how the OSL signal depends upon the preheat temperature.
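The kinetic parameters recovered by such pulse-annealing simulations are typically the trap depth E and frequency factor s of first-order thermal detrapping, dn/dt = −n·s·exp(−E/kT). A sketch of the surviving trapped-charge fraction under a linear heating ramp follows; the parameter values and the simple explicit integration are illustrative assumptions, not the Bailey model itself.

```python
import math

def remaining_fraction(E_eV, s, t_max_C, rate_C_per_s=5.0, dt=0.01):
    """Fraction of trapped charge surviving a linear heating ramp from
    20 degC to t_max_C, for a first-order trap of depth E (eV) and
    frequency factor s (1/s):  dn/dt = -n * s * exp(-E / kT)."""
    k_b = 8.617e-5          # Boltzmann constant in eV/K
    n = 1.0                 # normalized initial trapped-charge population
    temp = 293.15           # start at ~20 degC, in kelvin
    while temp < t_max_C + 273.15:
        # integrate the first-order loss over one small time step dt
        n *= math.exp(-s * math.exp(-E_eV / (k_b * temp)) * dt)
        temp += rate_C_per_s * dt
    return n
```

A deeper trap (larger E) empties at higher temperature, so for the same ramp it retains more charge than a shallow one.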
Pereira, Ana I.; Lima, José; Costa, Paulo
2013-10-01
There are several approaches to humanoid robot gait planning. This problem presents a large number of unknown parameters that must be found to make the humanoid robot walk. Optimization in simulation models can be used to find the gait based on several criteria, such as energy minimization, acceleration, and step length, among others. This paper presents a comparison between two optimization methods, Stretched Simulated Annealing and the Genetic Algorithm, running in an accurate and stable simulation model. Final results show the comparative study and demonstrate that optimization is a valid gait planning technique.
Adaptive Optics Simulations for Siding Spring
Goodwin, Michael; Lambert, Andrew
2012-01-01
Using an observationally derived model optical turbulence profile (model-OTP), we have investigated the performance of Adaptive Optics (AO) at Siding Spring Observatory (SSO), Australia. The simulations cover the performance of the AO techniques of single conjugate adaptive optics (SCAO), multi-conjugate adaptive optics (MCAO) and ground-layer adaptive optics (GLAO). The simulation results presented in this paper predict the performance of these AO techniques as applied to the Australian National University (ANU) 2.3 m and Anglo-Australian Telescope (AAT) 3.9 m telescopes for the astronomical wavelength bands J, H and K. The results indicate that AO performance is best at the longer wavelengths (K-band) and in the best seeing conditions (sub 1-arcsecond). The most promising results are found for the GLAO simulations (field of view of 180 arcsec), with the field RMS for encircled energy 50% diameter (EE50d) being uniform and minimally affected by the free-atmosphere turbulence. The GLAO performance is reasonably good over...
Jin Shi-Feng; Wang Wei-Min; Zhou Jian-Kun; Guo Hong-Xuan; J.F. Webb; Bian Xiu-Fang
2005-01-01
The nanocrystallization behaviour of Zr70Cu20Ni10 metallic glass during isothermal annealing is studied by employing a Monte Carlo simulation incorporating a modified Ising model and a Q-state Potts model. Based on the simulated microstructure and differential scanning calorimetry curves, we find that the low crystal-amorphous interface energy of Ni plays an important role in the nanocrystallization of primary Zr2Ni. It is found that when T < TImax (where TImax is the temperature with the maximum nucleation rate), an increase of temperature results in a larger growth rate and a much finer microstructure for the primary Zr2Ni, which accords with the microstructure evolution in "flash annealing". Finally, the Zr2Ni/Zr2Cu interface energy σG contributes to the pinning effect of the primary nano-sized Zr2Ni grains in the subsequently formed normal Zr2Cu grains.
Optimizing the natural connectivity of scale-free networks using simulated annealing
Duan, Boping; Liu, Jing; Tang, Xianglong
2016-09-01
In real-world networks, the path between two nodes always plays a significant role in the fields of communication or transportation. In some cases, when one path fails, the two nodes can no longer communicate. Thus, it is necessary to provide alternative paths between nodes. In recent work, Wu et al. (2011) proposed the natural connectivity as a novel robustness measure of complex networks. The natural connectivity considers the redundancy of alternative paths in a network by computing the number of closed paths of all lengths. To enhance the robustness of networks in terms of the natural connectivity, in this paper, we propose a simulated annealing method to optimize the natural connectivity of scale-free networks without changing the degree distribution. The experimental results show that the simulated annealing method clearly outperforms other local search methods.
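A degree-preserving simulated annealing optimizer of this kind can be sketched with double-edge swaps, which leave every node's degree unchanged, and a Metropolis acceptance rule on the natural connectivity ln((1/N)·tr e^A). The series-based trace computation and all parameter values below are illustrative assumptions, not the authors' implementation.

```python
import math
import random

def natural_connectivity(A):
    """Natural connectivity ln((1/N) * trace(e^A)), with trace(e^A) =
    sum_i exp(lambda_i) computed via the power series sum_k tr(A^k)/k!
    (adequate for small graphs)."""
    n = len(A)
    M = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]  # A^0
    tr, fact = float(n), 1.0
    for k in range(1, 30):
        M = [[sum(M[i][l] * A[l][j] for l in range(n)) for j in range(n)]
             for i in range(n)]
        fact *= k
        tr += sum(M[i][i] for i in range(n)) / fact
    return math.log(tr / n)

def sa_rewire(A, steps=200, t0=0.1, seed=3):
    """Maximize natural connectivity by degree-preserving double-edge swaps
    accepted under a simulated-annealing criterion."""
    rng = random.Random(seed)
    A = [row[:] for row in A]
    score = natural_connectivity(A)
    n = len(A)
    for k in range(steps):
        t = t0 / (1 + k)
        edges = [(i, j) for i in range(n) for j in range(i + 1, n) if A[i][j]]
        (a, b), (c, d) = rng.sample(edges, 2)
        # Swap (a,b),(c,d) -> (a,d),(c,b); all four degrees are unchanged.
        if len({a, b, c, d}) < 4 or A[a][d] or A[c][b]:
            continue  # would create a self-loop or a multi-edge
        for (u, v), val in (((a, b), 0), ((c, d), 0), ((a, d), 1), ((c, b), 1)):
            A[u][v] = A[v][u] = val
        new = natural_connectivity(A)
        if new >= score or rng.random() < math.exp((new - score) / t):
            score = new
        else:  # reject: revert the swap
            for (u, v), val in (((a, b), 1), ((c, d), 1), ((a, d), 0), ((c, b), 0)):
                A[u][v] = A[v][u] = val
    return A, score
```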
Optimización Global Simulated Annealing
Francisco Sánchez Mares
2006-01-01
Full Text Available This paper presents an application of the global optimization method Simulated Annealing (SA). This technique has been applied in several areas of engineering as a robust and versatile strategy for successfully computing the global minimum of a function or a system of functions. To test the efficiency of the method, the global minima of an arbitrary function were located, and the numerical behavior of Simulated Annealing during convergence to the two solutions of the case study was evaluated.
吴剑锋; 朱学愚; 刘建立
1999-01-01
The genetic algorithm (GA) is a global, random search procedure based on the mechanics of natural selection and natural genetics. A new optimization method, the genetic algorithm-based simulated annealing penalty function (GASAPF), is presented to solve a groundwater management model. Compared with traditional gradient-based algorithms, the GA is straightforward and there is no need to calculate derivatives of the objective function. The GA is able to generate both convex and nonconvex points within the feasible region. Constraints are handled by a simulated annealing penalty technique, which helps ensure that the GA converges to the global, or at least a near-global, optimal solution. Results for a maximum-pumping example show that the GASAPF is very efficient and robust.
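One common reading of a simulated-annealing penalty function, sketched below under that assumption (the exact penalty form used in GASAPF may differ), is a penalty weight that stiffens as a temperature parameter is lowered across GA generations: infeasible solutions are tolerated early, when exploration matters, and suppressed late.

```python
def sa_penalty(objective, violations, temperature):
    """Annealing penalty for a maximization problem: penalized fitness
    F = f(x) - (1/T) * sum(max(0, g_i(x))^2). As T is lowered over the
    GA generations, the weight 1/T grows and constraint violations
    (positive g_i values) are punished ever more harshly."""
    v = sum(max(0.0, g) ** 2 for g in violations)
    return objective - v / temperature
```

A feasible solution (all g_i ≤ 0) keeps its raw objective value at any temperature; an infeasible one loses fitness faster as the run cools.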
Hand Motion Tracking Using Simulated Annealing Method in a Discrete Space
LIANG Wei; JIA Yun-de; LIU Tang-li; HAN Lei; WU Xin-xiao
2007-01-01
Hand tracking is a challenging problem due to the complexity of searching a 20+ degrees-of-freedom (DOF) space for an optimal estimate of the hand configuration. The feasible hand configurations are represented as a discrete space, which avoids the parameter learning required by general configuration-space representations. An extended simulated annealing method with particle filtering is then used to search for the optimal hand configuration in the proposed discrete space: a simplex search running on multiple processors predicts the hand motion instead of initializing the simulated annealing randomly, and particle filtering represents the state of the tracker at each layer for searching the high-dimensional configuration space. Experimental results show that the proposed method makes hand tracking more efficient and robust.
Fast and accurate protein substructure searching with simulated annealing and GPUs
Stivala Alex D
2010-09-01
Full Text Available Abstract Background Searching a database of protein structures for matches to a query structure, or for occurrences of a structural motif, is an important task in structural biology and bioinformatics. While there are many existing methods for structural similarity searching, faster and more accurate approaches are still required, and few current methods are capable of substructure (motif) searching. Results We developed an improved heuristic for tableau-based protein structure and substructure searching using simulated annealing that is as fast as or faster than, and comparable in accuracy to, some widely used existing methods. Furthermore, we created a parallel implementation on a modern graphics processing unit (GPU). Conclusions The GPU implementation achieves up to 34 times speedup over the CPU implementation of tableau-based structure search with simulated annealing, making it one of the fastest available methods. To the best of our knowledge, this is the first application of a GPU to the protein structural search problem.
The generalized simulated annealing algorithm in the low energy electron diffraction search problem
We present in this work results concerning the application of the generalized simulated annealing (GSA) algorithm to the LEED search problem. The influence of the visiting distribution function (defined by the so-called qV parameter) on the effectiveness of the method was investigated by applying the algorithm to structural searches optimizing two to ten parameters in a theory-theory comparison for the CdTe(110) system. Results, obtained with the scaling relation and the probability of convergence as a function of the number of parameters to be varied, indicate that the fast simulated annealing (FSA) approach (qV = 2.0) is the best search strategy.
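In generalized simulated annealing, the visiting temperature is commonly written in the Tsallis form T(t) = T(1)·(2^(qV−1) − 1)/((1+t)^(qV−1) − 1); for qV = 2 this reduces to a 1/t cooling law, consistent with the abstract's identification of qV = 2.0 with fast simulated annealing. A small sketch (the formula is quoted from the GSA literature as an assumption, not taken from this paper):

```python
def gsa_temperature(t, t1=1.0, qv=2.0):
    """Generalized simulated annealing visiting temperature (Tsallis form):
    T(t) = T(1) * (2**(qv-1) - 1) / ((1 + t)**(qv-1) - 1),  for t >= 1.
    At qv = 2 this becomes T(1)/t, i.e. fast-annealing-style cooling."""
    num = 2.0 ** (qv - 1.0) - 1.0
    den = (1.0 + t) ** (qv - 1.0) - 1.0
    return t1 * num / den
```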
Use of simulated annealing in standardization and optimization of the acerola wine production
Sheyla dos Santos Almeida
2014-06-01
Full Text Available In this study, seven wine samples were prepared varying the amount of acerola fruit pulp and the sugar content, using the simulated annealing technique to obtain the optimal sensory qualities and cost for the wine produced. S. cerevisiae yeast was used in the fermentation process, and the sensory attributes were evaluated using a hedonic scale. Acerola wines were classified as sweet, with 11°GL alcohol concentration and with aroma, taste, and color characteristics of the acerola fruit. The simulated annealing experiments showed that the best conditions were found at a mass ratio between 1/7.5 and 1/6 and total soluble solids between 28.6 and 29.0 °Brix, from which sensory acceptance scores of 6.9, 6.8, and 8.8 were obtained for color, aroma, and flavor, respectively, with a production cost 43-45% lower than the cost of traditional wines commercialized in Brazil.
Design of prestressed concrete precast road bridges with hybrid simulated annealing
Martí Albiñana, José Vicente; Gonzalez Vidosa, Fernando; Yepes Piqueras, Víctor; Alcalá González, Julián
2013-01-01
This paper describes one approach to the analysis and design of prestressed concrete precast road bridges, with double U-shaped cross-section and isostatic spans. The procedure used to solve the combinatorial problem is a variant of simulated annealing with a neighborhood move based on the mutation operator from the genetic algorithms (SAMO). This algorithm is applied to the economic cost of these structures at different stages of manufacturing, transportation and construction. The problem in...
Čapek, P.; Hejtmánek, Vladimír; Brabec, Libor; Zikánová, Arlette; Kočiřík, Milan
2009-01-01
Roč. 76, č. 2 (2009), s. 179-198. ISSN 0169-3913 R&D Projects: GA ČR GA203/05/0347 Institutional research plan: CEZ:AV0Z40720504; CEZ:AV0Z40400503 Keywords : stochastic reconstruction * simulated annealing * two-point cluster function Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 0.966, year: 2009
Paul, Gerald
2010-01-01
For almost two decades the question of whether tabu search (TS) or simulated annealing (SA) performs better for the quadratic assignment problem has been unresolved. To answer this question satisfactorily, we compare performance at various values of targeted solution quality, running each heuristic at its optimal number of iterations for each target. We find that for a number of varied problem instances, SA performs better for higher quality targets while TS performs better for lower quality targets.
EXTRUSION DIE PROFILE DESIGN USING SIMULATED ANNEALING ALGORITHM AND PARTICLE SWARM OPTIMIZATION
R.VENKETESAN
2010-08-01
Full Text Available In this paper, a new method is proposed for the optimum shape design of an extrusion die. The design problem is formulated as an unconstrained optimization problem. Non-traditional optimization techniques, namely the Simulated Annealing Algorithm and Particle Swarm Optimization, are used to minimize the extrusion force by optimizing the extrusion ratio and die cone angle. The internal power of deformation is also calculated and the results are compared.
Minimizing distortion and internal forces in truss structures by simulated annealing
Kincaid, Rex K.; Padula, Sharon L.
1990-01-01
Inaccuracies in the length of members and the diameters of joints of large space structures may produce unacceptable levels of surface distortion and internal forces. Here, two discrete optimization problems are formulated, one to minimize surface distortion (DSQRMS) and the other to minimize internal forces (FSQRMS). Both of these problems are based on the influence matrices generated by a small-deformation linear analysis. Good solutions are obtained for DSQRMS and FSQRMS through the use of a simulated annealing heuristic.
A Simulated Annealing Algorithm for the Optimization of Multistage Depressed Collector Efficiency
Vaden, Karl R.; Wilson, Jeffrey D.; Bulson, Brian A.
2002-01-01
The microwave traveling wave tube amplifier (TWTA) is widely used as a high-power transmitting source for space and airborne communications. One critical factor in designing a TWTA is the overall efficiency. However, overall efficiency is highly dependent upon collector efficiency; so collector design is critical to the performance of a TWTA. Therefore, NASA Glenn Research Center has developed an optimization algorithm based on Simulated Annealing to quickly design highly efficient multi-stage depressed collectors (MDC).
Application of simulated annealing to the biclustering of gene expression data
Bolshakova, Nadia; Cunningham, Padraig
2006-01-01
In a gene expression data matrix, a bicluster is a submatrix of genes and conditions that exhibits a high correlation of expression activity across both rows and columns. The problem of locating the most significant bicluster has been shown to be NP-complete. Heuristic approaches such as Cheng and Church's greedy node deletion algorithm have been previously employed. It is to be expected that stochastic search techniques such as evolutionary algorithms or simulated anneal...
Two-Dimensional IIR Filter Design Using Simulated Annealing Based Particle Swarm Optimization
Supriya Dhabal; Palaniandavar Venkateswaran
2014-01-01
We present a novel hybrid algorithm based on particle swarm optimization (PSO) and simulated annealing (SA) for the design of two-dimensional recursive digital filters. The proposed method, known as SA-PSO, integrates the global search ability of PSO with the local search ability of SA and offsets the weakness of each other. The acceptance criterion of Metropolis is included in the basic algorithm of PSO to increase the swarm’s diversity by accepting sometimes weaker solutions also. The exper...
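The key hybridization step, applying the Metropolis criterion inside PSO so that occasionally weaker solutions are accepted, can be sketched as follows. This is a minimal illustration in the spirit of the abstract, not the authors' SA-PSO; the inertia and acceleration coefficients are assumed values:

```python
import math
import random

def sa_pso(f, dim, lo, hi, n_particles=20, n_iter=300, t0=1.0, seed=2):
    """PSO with a Metropolis acceptance step: a particle's personal best may
    be replaced by a slightly worse position, which keeps the swarm diverse
    while the temperature is high."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]
    P = [x[:] for x in X]                    # personal bests
    pf = [f(x) for x in X]
    gi = min(range(n_particles), key=lambda i: pf[i])
    G, gf = P[gi][:], pf[gi]                 # global best (only improves)
    t = t0
    for _ in range(n_iter):
        for i in range(n_particles):
            for d in range(dim):
                V[i][d] = (0.7 * V[i][d]
                           + 1.5 * rng.random() * (P[i][d] - X[i][d])
                           + 1.5 * rng.random() * (G[d] - X[i][d]))
                X[i][d] = min(hi, max(lo, X[i][d] + V[i][d]))
            fx = f(X[i])
            # Metropolis criterion: sometimes adopt a weaker personal best
            if fx <= pf[i] or rng.random() < math.exp((pf[i] - fx) / t):
                P[i], pf[i] = X[i][:], fx
                if fx < gf:
                    G, gf = X[i][:], fx
        t *= 0.97                            # cooling: hybrid -> plain PSO
    return G, gf

sphere = lambda x: sum(v * v for v in x)     # toy objective
best, val = sa_pso(sphere, dim=3, lo=-5.0, hi=5.0)
```

As the temperature is cooled, the acceptance rule degenerates to "improvements only" and the hybrid behaves like standard PSO.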
Dao-Wei Bi; Sheng Wang; Jun-Jie Ma; Xue Wang
2007-01-01
The limited energy supply of wireless sensor networks poses a great challenge for the deployment of wireless sensor nodes. In this paper, we focus on energy-efficient coverage with distributed particle swarm optimization and simulated annealing. First, the energy-efficient coverage problem is formulated with sensing coverage and energy consumption models. We consider the network composed of stationary and mobile nodes. Second, coverage and energy metrics are presented to evaluate the coverage...
Sirisumrannukul, Somporn
2010-01-01
The network reconfiguration problem for reliability enhancement is solved by the developed simulated annealing algorithm in conjunction with reliability worth analysis, which provides an indirect measure of the cost implications associated with power failure. The objective is to minimize customer interruption cost with the constraints that all load points have to be electrically supplied and radially connected. It can be seen from the results of the RBTS and the 69-bus system that the total customer interrup...
Hussain, Faraz; Jha, Sumit K.; Jha, Susmit; Langmead, Christopher J.
2014-01-01
Stochastic models are increasingly used to study the behaviour of biochemical systems. While the structure of such models is often readily available from first principles, unknown quantitative features of the model are incorporated into the model as parameters. Algorithmic discovery of parameter values from experimentally observed facts remains a challenge for the computational systems biology community. We present a new parameter discovery algorithm that uses simulated annealing, sequential ...
SIMULATED ANNEALING ALGORITHM FOR SCHEDULING DIVISIBLE LOAD IN LARGE SCALE DATA GRIDS
Monir Abdullah; Mohamed, Othman; Hamidah Ibrahim; Shamala Subramaniam
2010-01-01
In many data grid applications, data can be decomposed into multiple independent sub-datasets and distributed for parallel execution and analysis. This property has been successfully exploited using Divisible Load Theory (DLT). Many scheduling approaches have been studied, but there is no optimal solution. This paper proposes a novel Simulated Annealing (SA) algorithm for scheduling divisible load in large-scale data grids. The SA algorithm is integrated with the DLT model and compared with th...
Dong Yunfeng
2013-01-01
The scheduling problem is a typical timetable problem in educational administration. For such NP-complete problems, the genetic algorithm suffers from premature convergence: it quickly converges not to the global optimal solution but to a local optimal solution. Therefore, we use the advantages of the simulated annealing algorithm to transform the fitness function and a chaotic sequence to control the crossover and mutation genetic ope...
A Phoenix++ Based New Genetic Algorithm Involving Mechanism of Simulated Annealing
Luokai Hu; Jin Liu; Chao Liang; Fuchuan Ni; Hang Chen
2015-01-01
The genetic algorithm easily falls into local optimal solutions. The simulated annealing algorithm may accept a non-optimal solution with a certain probability to jump out of a local optimal solution. On the other hand, given the lack of communication among genes in MapReduce-platform-based genetic algorithms, high-performance distributed computing technologies or platforms can further increase the execution efficiency of these traditional genetic algorithms. To this end, we propose a novel Phoenix++ based new g...
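The SA mechanism grafted onto a GA amounts to a probabilistic replacement rule. The following toy sketch (a mutation-only GA on the OneMax problem; all parameters are assumed, and the Phoenix++/MapReduce parallelism of the paper is deliberately omitted) illustrates the idea:

```python
import math
import random

def sa_accept_offspring(parent_fit, child_fit, t, rng):
    """SA mechanism inside a GA: a worse child still replaces its parent
    with probability exp(-loss / T) (maximization convention)."""
    if child_fit >= parent_fit:
        return True
    return rng.random() < math.exp((child_fit - parent_fit) / t)

def ga_sa_onemax(n_bits=30, pop=20, gens=200, t0=2.0, seed=6):
    """Toy GA on OneMax with mutation-only reproduction and SA-style
    replacement: worse offspring are tolerated early, rejected late."""
    rng = random.Random(seed)
    P = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop)]
    fitness = lambda ind: sum(ind)
    t = t0
    for _ in range(gens):
        for i in range(pop):
            child = P[i][:]
            child[rng.randrange(n_bits)] ^= 1        # point mutation
            if sa_accept_offspring(fitness(P[i]), fitness(child), t, rng):
                P[i] = child
        t *= 0.96                                    # cooling
    return max(fitness(ind) for ind in P)

best = ga_sa_onemax()
```

Early on, the high temperature lets lineages drift off local optima; as T falls, replacement becomes greedy and the population converges.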
Optimal design of a DC MHD pump by simulated annealing method
Bouali Khadidja
2014-01-01
Full Text Available In this paper, a design methodology for a magnetohydrodynamic pump is proposed. The methodology is based on direct interpretation of the design problem as an optimization problem. The simulated annealing method is used for the optimal design of a DC MHD pump. The optimization procedure uses an objective function, here the minimization of the mass. The constraints are both geometric and electromagnetic in type. The obtained results are reported.
A parallel simulated annealing algorithm for standard cell placement on a hypercube computer
Jones, Mark Howard
1987-01-01
A parallel version of a simulated annealing algorithm is presented which is targeted to run on a hypercube computer. A strategy for mapping the cells in a two dimensional area of a chip onto processors in an n-dimensional hypercube is proposed such that both small and large distance moves can be applied. Two types of moves are allowed: cell exchanges and cell displacements. The computation of the cost function in parallel among all the processors in the hypercube is described along with a distributed data structure that needs to be stored in the hypercube to support parallel cost evaluation. A novel tree broadcasting strategy is used extensively in the algorithm for updating cell locations in the parallel environment. Studies on the performance of the algorithm on example industrial circuits show that it is faster and gives better final placement results than the uniprocessor simulated annealing algorithms. An improved uniprocessor algorithm is proposed which is based on the improved results obtained from parallelization of the simulated annealing algorithm.
Validation of Sensor-Directed Spatial Simulated Annealing Soil Sampling Strategy.
Scudiero, Elia; Lesch, Scott M; Corwin, Dennis L
2016-07-01
Soil spatial variability has a profound influence on most agronomic and environmental processes at field and landscape scales, including site-specific management, vadose zone hydrology and transport, and soil quality. Mobile sensors are a practical means of mapping spatial variability because their measurements serve as a proxy for many soil properties, provided a sensor-soil calibration is conducted. A viable means of calibrating sensor measurements over soil properties is through linear regression modeling of sensor and target property data. In the present study, two sensor-directed, model-based, sampling scheme delineation methods were compared to validate recent applications of soil apparent electrical conductivity (EC)-directed spatial simulated annealing against the more established EC-directed response surface sampling design (RSSD) approach. A 6.8-ha study area near San Jacinto, CA, was surveyed for EC, and 30 soil sampling locations per sampling strategy were selected. Spatial simulated annealing and RSSD were compared for sensor calibration to a target soil property (i.e., salinity) and for evenness of spatial coverage of the study area, which is beneficial for mapping nontarget soil properties (i.e., those not correlated with EC). The results indicate that the linear modeling EC-salinity calibrations obtained from the two sampling schemes provided salinity maps characterized by similar errors. The maps of nontarget soil properties show similar errors across sampling strategies. The Spatial Simulated Annealing methodology is, therefore, validated, and its use in agronomic and environmental soil science applications is justified. PMID:27380070
Zhao, Yi; Cao, Xiangyu; Gao, Jun; Sun, Yu; Yang, Huanhuan; Liu, Xiao; Zhou, Yulong; Han, Tong; Chen, Wei
2016-04-01
We propose a new strategy to design broadband and wide-angle diffusion metasurfaces. An anisotropic structure which has opposite phases under x- and y-polarized incidence is employed as the “0” and “1” elements based on the concept of coding metamaterials. To obtain a uniform backward scattering under normal incidence, a Simulated Annealing algorithm is utilized to calculate the optimal layout. The proposed method provides an efficient way to design diffusion metasurfaces with a simple structure, which has been proved by both simulations and measurements. PMID:27034110
Guijarro, María; Pajares, Gonzalo; Herrera, P. Javier
2009-01-01
The increasing technology of high-resolution airborne image sensors, including those on board Unmanned Aerial Vehicles, demands automatic solutions for processing, either on-line or off-line, the huge amounts of image data sensed during the flights. The classification of natural spectral signatures in images is one potential application. The current tendency in classification is oriented towards the combination of simple classifiers. In this paper we propose a combined strategy based on the Deterministic Simulated Annealing (DSA) framework. The simple classifiers used are the well-tested supervised parametric Bayesian estimator and Fuzzy Clustering. DSA is an optimization approach which minimizes an energy function. The main contribution of DSA is its ability to avoid local minima during the optimization process thanks to the annealing scheme. It outperforms the simple classifiers used for the combination and some combined strategies, including a scheme based on fuzzy cognitive maps and an optimization approach based on the Hopfield neural network paradigm. PMID:22399989
Simulated Annealing Study on Structures and Energetics of CO2 in Argon Clusters
Le-cheng Wang; Dai-qian Xie
2011-01-01
The minimum-energy configurations and energetic properties of the ArN-CO2 (N=1-19) van der Waals clusters were investigated by a simulated annealing algorithm. A newly developed Ar-CO2 potential energy surface, together with the Aziz Ar-Ar interaction potential, was employed to construct the high-dimensional potential functions by a pairwise additive approximation. The global minimal conformations were optimized by sampling the glassy phase space with a carefully formulated annealing schedule. Unlike the lighter RgN-CO2 clusters, ArN-CO2 exhibits different size-dependent structural and energetic characteristics. Dramatic variations with the number of solvent atoms were found for small clusters. After the completion of the first solvation shell at N=17, the clusters evolved more smoothly.
Computer simulations of randomly branching polymers: annealed versus quenched branching structures
Rosa, Angelo; Everaers, Ralf
2016-08-01
We present computer simulations of three systems of randomly branching polymers in d = 3 dimensions: ideal trees and self-avoiding trees with annealed and quenched connectivities. In all cases, we performed a detailed analysis of tree connectivities, spatial conformations and statistical properties of linear paths on trees, and compared the results to the corresponding predictions of Flory theory. We confirm that, overall, the theory correctly predicts that trees with quenched ideal connectivity exhibit less overall swelling in good solvent than corresponding trees with annealed connectivity, even though they are more strongly stretched on the path level. At the same time, we emphasize the inadequacy of the Flory theory in predicting the behaviour of other, equally relevant, observables such as contact probabilities between tree nodes. We then show that contact probabilities can be aptly characterized by introducing a novel critical exponent, θ_path, which accounts for how they decay as a function of the node-to-node path distance on the tree.
Martinez-Limia, A. [Fraunhofer Institute of Integrated Systems and Device Technology, Schottkystrasse 10, 91058 Erlangen (Germany); Pichler, P. [Fraunhofer Institute of Integrated Systems and Device Technology, Schottkystrasse 10, 91058 Erlangen (Germany); Chair of Electron Devices, University of Erlangen-Nuremberg, Cauerstrasse 6, 91058 Erlangen (Germany)], E-mail: peter.pichler@iisb.fraunhofer.de; Lerch, W.; Paul, S. [Mattson Thermal Products GmbH, Daimlerstrasse 10, 89160 Dornstadt (Germany); Kheyrandish, H. [CSMA - MATS, Queens Road, Stoke on Trent ST4 7LQ (United Kingdom); Pakfar, A.; Tavernier, C. [STMicroelectronics SA, 850 rue Jean Monnet, 38926 Crolles (France)
2008-12-05
The possibility of using solid phase epitaxial regrowth (SPER) for activation of arsenic after amorphizing implantation in silicon is explored in this contribution and compared to spike annealing and published flash-annealing experiments. SPER takes advantage of the high activation level of the dopants after SPER combined with practically no dopant diffusion. We performed implantation and annealing experiments for three combinations of implantation energy and dose, and compared the results of SPER and spike annealing. The thermal stability of the dopant distribution was studied by subsequent post-annealing treatment for temperatures between 750 °C and 900 °C. The results of these experiments were included in the calibration of a diffusion and activation model for arsenic with high predictive capabilities. Additional simulations over a wide range of implantation energies were done to compare the efficiency of SPER, spike and flash annealing. The specific contributions to deactivation via different processes like clustering, precipitation, and segregation are discussed and annealing strategies to minimize the deactivation are proposed. Spike annealing seems to be the best solution for junctions of 25 nm or deeper, while for shallower junctions other processes combining preamorphization, multiple implantation steps, SPER, and/or flash annealing are needed.
Adaptive System Modeling for Spacecraft Simulation
Thomas, Justin
2011-01-01
This invention introduces a methodology and associated software tools for automatically learning spacecraft system models without any assumptions regarding system behavior. Data stream mining techniques were used to learn models for critical portions of the International Space Station (ISS) Electrical Power System (EPS). Evaluation on historical ISS telemetry data shows that adaptive system modeling reduces simulation error anywhere from 50 to 90 percent over existing approaches. The purpose of the methodology is to outline how someone can create accurate system models from sensor (telemetry) data. The purpose of the software is to support the methodology. The software provides analysis tools to design the adaptive models. The software also provides the algorithms to initially build system models and continuously update them from the latest streaming sensor data. The main strengths are as follows: Creates accurate spacecraft system models without in-depth system knowledge or any assumptions about system behavior. Automatically updates/calibrates system models using the latest streaming sensor data. Creates device specific models that capture the exact behavior of devices of the same type. Adapts to evolving systems. Can reduce computational complexity (faster simulations).
The results of object kinetic Monte Carlo (OKMC) simulations of the annealing of primary cascade damage in bulk tungsten using a comprehensive database of cascades obtained from molecular dynamics (Setyawan et al.) are described as a function of primary knock-on atom (PKA) energy at temperatures of 300, 1025 and 2050 K. An increase in SIA clustering coupled with a decrease in vacancy clustering with increasing temperature, in addition to the disparate mobilities of SIAs versus vacancies, causes an interesting effect of temperature on cascade annealing. The annealing efficiency (the ratio of the number of defects after and before annealing) exhibits an inverse U-shape curve as a function of temperature. The capabilities of the newly developed OKMC code KSOME (kinetic simulations of microstructure evolution) used to carry out these simulations are described.
Adaptive wavelet simulation of global ocean dynamics
N. K.-R. Kevlahan
2015-07-01
Full Text Available In order to easily enforce solid-wall boundary conditions in the presence of complex coastlines, we propose a new mass and energy conserving Brinkman penalization for the rotating shallow water equations. This penalization does not lead to higher wave speeds in the solid region. The error estimates for the penalization are derived analytically and verified numerically for linearized one dimensional equations. The penalization is implemented in a conservative dynamically adaptive wavelet method for the rotating shallow water equations on the sphere with bathymetry and coastline data from NOAA's ETOPO1 database. This code could form the dynamical core for a future global ocean model. The potential of the dynamically adaptive ocean model is illustrated by using it to simulate the 2004 Indonesian tsunami and wind-driven gyres.
Kumar, Pushpendra; Huber, Patrick
2016-04-01
The discovery of porous silicon formation in a silicon substrate in 1956, while electro-polishing crystalline Si in hydrofluoric acid (HF), triggered large-scale investigations of porous silicon formation and of the changes in its physical and chemical properties with thermal and chemical treatment. A nitrogen sorption study is used to investigate the effect of thermal annealing on electrochemically etched mesoporous silicon (PS). The PS was thermally annealed from 200 °C to 800 °C for 1 h in the presence of air. It was shown that the pore diameter and porosity of PS vary with annealing temperature. The experimentally obtained adsorption/desorption isotherms show hysteresis typical of capillary condensation in porous materials. A simulation study based on the Saam and Cole model was performed and compared with the experimentally observed sorption isotherms to study the physics behind hysteresis formation. We discuss the shape of the hysteresis loops in the framework of the morphology of the layers. The different behavior of adsorption and desorption of nitrogen in PS with pore diameter was discussed in terms of concave meniscus formation inside the pore space, which was shown to be related to the induced pressure as the pore diameter varies from 7.2 nm to 3.4 nm.
Simulated annealing algorithm for solving chambering student-case assignment problem
Ghazali, Saadiah; Abdul-Rahman, Syariza
2015-12-01
The project assignment problem is a popular practical problem that appears nowadays. The challenge of solving it grows with the complexity of preferences, the existence of real-world constraints, and increasing problem size. This study focuses on solving a chambering student-case assignment problem, which is classified under the project assignment problem, by using a simulated annealing algorithm. The project assignment problem is considered a hard combinatorial optimization problem, and solving it using a metaheuristic approach is advantageous because it can return a good solution in a reasonable time. The problem of assigning chambering students to cases has never been addressed in the literature before. Law graduates must read in chambers before they are qualified to become legal counsel, so assigning chambering students to cases is critically needed, especially when many preferences are involved. Hence, this study presents a preliminary study of the proposed project assignment problem. The objective is to minimize the total completion time for all students in solving the given cases. A minimum-cost greedy heuristic is employed to construct a feasible initial solution, and the search then proceeds with a simulated annealing algorithm for further improvement of solution quality. Analysis of the obtained results shows that the proposed simulated annealing algorithm greatly improves the solution constructed by the minimum-cost greedy heuristic. Hence, this research demonstrates the advantages of solving the project assignment problem using metaheuristic techniques.
Comparing of the Deterministic Simulated Annealing Methods for Quadratic Assignment Problem
Mehmet Güray ÜNSAL
2013-08-01
Full Text Available In this study, the threshold accepting and record-to-record travel methods, which belong to the simulated annealing family of meta-heuristics, are applied to the Quadratic Assignment Problem and statistically analyzed for significant differences in objective function values and CPU time. No significant differences are found between the two algorithms in terms of CPU time or objective function values. Consequently, for the Quadratic Assignment Problem, the two algorithms compared in this study have the same performance with respect to CPU time and objective function values.
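Threshold accepting and record-to-record travel differ from classical SA only in their deterministic acceptance rules (hence "deterministic simulated annealing"). A minimal sketch on a toy objective follows; the rules' schedules, the test function, and all constants are illustrative assumptions:

```python
import math
import random

def anneal(f, x0, accept, n_iter=3000, step=0.5, seed=4):
    """Generic descent loop; 'accept' is a deterministic rule (TA or RRT),
    so unlike classical SA the only randomness is the neighbour proposal."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for k in range(n_iter):
        y = x + rng.uniform(-step, step)
        fy = f(y)
        if accept(fx, fy, fbest, k, n_iter):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
    return best, fbest

# Threshold Accepting: accept any move no worse than a shrinking threshold
ta_rule = lambda fx, fy, record, k, n: fy - fx <= 2.0 * (1.0 - k / n)
# Record-to-Record Travel: accept any move within a fixed deviation of the record
rrt_rule = lambda fx, fy, record, k, n: fy <= record + 0.5

f = lambda x: 0.05 * x * x - math.cos(2.0 * x)   # toy multimodal objective
ta_x, ta_f = anneal(f, 1.0, ta_rule)
rrt_x, rrt_f = anneal(f, 1.0, rrt_rule)
```

Both rules tolerate limited uphill moves without evaluating an exponential acceptance probability, which is why the two methods are natural candidates for the head-to-head comparison in the abstract.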
Sousa, Tiago; Vale, Zita; Morais, Hugo
2013-01-01
The aggregation and management of Distributed Energy Resources (DERs) by a Virtual Power Player (VPP) is an important task in a smart grid context. The Energy Resource Management (ERM) of these DERs can become a hard and complex optimization problem. The large integration of several DERs...... Simulated Annealing (SA) approach to determine the ERM considering an intensive use of DERs, mainly EVs. In this paper, the possibility of applying Demand Response (DR) programs to the EVs is considered. Moreover, a trip reduce DR program is implemented. The SA methodology is tested on a 32-bus distribution...
Menin, O H; Martinez, A S; Costa, A M
2016-05-01
A generalized simulated annealing algorithm, combined with a suitable smoothing regularization function, is used to solve the inverse problem of X-ray spectrum reconstruction from attenuation data. The approach is to set the initial acceptance and visitation temperatures and to standardize the terms of the objective function so as to automate the algorithm to accommodate different spectra ranges. Experiments with both numerical and measured attenuation data are presented. Results show that the algorithm reconstructs spectral shapes accurately. It should be noted that the regularization function was formulated to guarantee a smooth spectrum; thus, the presented technique does not apply to X-ray spectra where characteristic radiation is present. PMID:26943902
Design of phase plates for shaping partially coherent beams by simulated annealing
Li Jian-Long; Lü Bai-Da
2008-01-01
Taking the Gaussian Schell-model beam as a typical example of partially coherent beams, this paper applies the simulated annealing (SA) algorithm to the design of phase plates for shaping partially coherent beams. A flow diagram is presented to illustrate the procedure of phase optimization by the SA algorithm. Numerical examples demonstrate the advantages of the SA algorithm in shaping partially coherent beams. A uniform flat-topped beam profile with maximum reconstruction error RE < 1.74% is achieved. A further extension of the approach is discussed.
Optimization of blade arrangement in a randomly mistuned cascade using simulated annealing
Thompson, Edward A.; Becus, Georges A.
1993-01-01
This paper presents preliminary results of an investigation on mistuning of bladed-disk assemblies aimed at capturing the benefits of mistuning on stability, while at the same time, minimizing the adverse effects on response by solving the following problem: given a set of N turbine blades, each being a small random perturbation of the same nominal blade, determine the best arrangement of the N blades in a mistuned cascade with regard to aeroelastic response. In the studies reported here, mistuning of the blades is restricted to small differences in torsional stiffness. The large combinatorial optimization problem of seeking the best arrangement by blade exchanges is solved using a simulated annealing algorithm.
Zhao Zhi-Jin; Zheng Shi-Lian; Xu Chun-Yun; Kong Xian-Zheng
2007-01-01
Hidden Markov models (HMMs) have been used to model burst error sources of wireless channels. This paper proposes a hybrid method of using genetic algorithm (GA) and simulated annealing (SA) to train HMM for discrete channel modelling. The proposed method is compared with pure GA, and experimental results show that the HMMs trained by the hybrid method can better describe the error sequences due to SA's ability of facilitating hill-climbing at the later stage of the search. The burst error statistics of the HMMs trained by the proposed method and the corresponding error sequences are also presented to validate the proposed method.
Extraction of Web Usage Profiles using Simulated Annealing Based Biclustering Approach
Rathipriya, R.; Thangavel, K.
2014-01-01
In this paper, a Simulated Annealing (SA) based biclustering approach is proposed in which SA is used as an optimization tool for biclustering of web usage data to identify the optimal user profile from the given web usage data. Extracted biclusters consist of correlated users whose usage behaviors are similar across a subset of the web pages of a web site, whereas these users are uncorrelated for the remaining pages. These results are very useful in web personalization so that...
The performance of simulated annealing in parameter estimation for vapor-liquid equilibrium modeling
A. Bonilla-Petriciolet
2007-03-01
Full Text Available In this paper we report the application and evaluation of the simulated annealing (SA) optimization method in parameter estimation for vapor-liquid equilibrium (VLE) modeling. We tested this optimization method using the classical least squares and error-in-variable approaches. The reliability and efficiency of the data-fitting procedure are also considered using different values for the algorithm parameters of the SA method. Our results indicate that this method, when properly implemented, is a robust procedure for nonlinear parameter estimation in thermodynamic models. However, in difficult problems it can still converge to local optima of the objective function.
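A minimal version of SA-driven least-squares parameter estimation can be sketched as follows; the exponential toy model, the synthetic data, and all algorithm settings are illustrative assumptions, not the VLE models or tuning of the study:

```python
import math
import random

def sse(params, xs, ys, model):
    """Classical least-squares objective."""
    return sum((model(x, params) - y) ** 2 for x, y in zip(xs, ys))

def sa_fit(xs, ys, model, p0, t0=1.0, alpha=0.995, n_iter=6000, seed=5):
    """SA over the parameter vector: temperature-scaled Gaussian perturbations
    with Metropolis acceptance; the best point ever visited is returned."""
    rng = random.Random(seed)
    p = list(p0)
    cost = sse(p, xs, ys, model)
    best, best_cost, t = p[:], cost, t0
    for _ in range(n_iter):
        q = [v + t * rng.gauss(0.0, 1.0) for v in p]   # shrinking step size
        c = sse(q, xs, ys, model)
        if c <= cost or rng.random() < math.exp((cost - c) / t):
            p, cost = q, c
            if c < best_cost:
                best, best_cost = q[:], c
        t *= alpha
    return best, best_cost

# Synthetic data generated from known parameters (2.0, 0.5)
model = lambda x, p: p[0] * math.exp(-p[1] * x)
xs = [0.1 * i for i in range(20)]
ys = [model(x, (2.0, 0.5)) for x in xs]
p, c = sa_fit(xs, ys, model, p0=(0.5, 0.1))
```

Tying the perturbation size to the temperature is one common implementation choice; as the abstract notes, such settings strongly affect whether the run reaches the global least-squares optimum or a local one.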
Multiobjective optimization in multicast network transmissions using Simulated Annealing
Yezid Donoso; Kadel Lacatt; Alfonso Jiménez
2005-01-01
This article presents a multiobjective optimization method for solving the load-balancing problem in multicast transmission networks, based on the Simulated Annealing meta-heuristic. The method minimizes four basic parameters to guarantee quality of service in multicast transmissions: origin-destination delay, maximum link utilization, consumed bandwidth, and number of hops. The results returned...
Neighbourhood generation mechanism applied in simulated annealing to job shop scheduling problems
Cruz-Chávez, Marco Antonio
2015-11-01
This paper presents a neighbourhood generation mechanism for job shop scheduling problems (JSSPs). To obtain a feasible neighbour with the generation mechanism, it is only necessary to permute an adjacent pair of operations in a schedule of the JSSP. If there is no slack time between the adjacent pair of operations that is permuted, then it is proven, through theory and experimentation, that the new neighbour (schedule) generated is feasible. It is demonstrated that the neighbourhood generation mechanism is very efficient and effective within a simulated annealing algorithm.
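The adjacent-pair permutation idea can be illustrated on a toy single-machine total-completion-time problem (rather than a true JSSP with machine routings); the cooling parameters and instance below are invented for the sketch:

```python
import math
import random

def adjacent_swap_neighbour(seq, rng):
    """Return a neighbour by permuting one adjacent pair of operations."""
    i = rng.randrange(len(seq) - 1)
    nb = list(seq)
    nb[i], nb[i + 1] = nb[i + 1], nb[i]
    return nb

def total_completion_time(order, p):
    """Toy objective: sum of job completion times on one machine."""
    t = done = 0
    for j in order:
        t += p[j]
        done += t
    return done

def sa_schedule(p, t0=10.0, cooling=0.9, iters=200, seed=1):
    rng = random.Random(seed)
    order = list(range(len(p)))
    rng.shuffle(order)
    cost = total_completion_time(order, p)
    best, best_cost = list(order), cost
    t = t0
    while t > 1e-3:
        for _ in range(iters):
            nb = adjacent_swap_neighbour(order, rng)
            nc = total_completion_time(nb, p)
            if nc <= cost or rng.random() < math.exp((cost - nc) / t):
                order, cost = nb, nc
                if cost < best_cost:
                    best, best_cost = list(order), cost
        t *= cooling
    return best, best_cost

p = [7, 3, 9, 1, 5]            # processing times; shortest-processing-time order is optimal
order, cost = sa_schedule(p)
```

In a real JSSP the neighbour would additionally be checked for slack time between the swapped operations, as the paper describes; the toy problem has no such constraint.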
Simulated annealing applied to two-dimensional low-beta reduced magnetohydrodynamics
The simulated annealing (SA) method is applied to two-dimensional (2D) low-beta reduced magnetohydrodynamics (R-MHD). We have successfully obtained stationary states of the system numerically by the SA method with Casimir invariants preserved. Since 2D low-beta R-MHD has two fields, the relaxation process becomes complex compared to a single-field system such as 2D Euler flow. The obtained stationary state can have fine structure. We have found that the fine structure appears because the relaxation processes differ between the kinetic energy and the magnetic energy.
Mori, Takaharu; Okamoto, Yuko
2008-03-01
Gramicidin A is a hydrophobic 15-residue peptide with alternating D- and L-amino acids, and it forms various conformations depending on its environment. For example, gramicidin A adopts a random coil or helical conformations, such as the β4.4-helix, β6.3-helix, and double-stranded helix, in organic solvents. To investigate the structural and dynamical properties of gramicidin A in water and in a hydrophobic environment, we performed molecular dynamics simulated annealing simulations with implicit solvent based on a generalized Born model. From the simulations, it was found that gramicidin A has a strong tendency to form a random-coil structure in water, while in the hydrophobic environment it becomes compact and can fold into right- and left-handed conformations of β-helix structures. We discuss the folding mechanism of the β-helix conformation of gramicidin A.
Chang Li; Coster, Daniel C.
2014-01-01
Much of the previous work in D-optimal design for regression models with correlated errors focused on polynomial models with a single predictor variable, in large part because of the intractability of an analytic solution. In this paper, we present a modified, improved simulated annealing algorithm, providing practical approaches to specifications of the annealing cooling parameters, thresholds, and search neighborhoods for the perturbation scheme, which finds approximate D-optimal designs fo...
段谟意
2012-01-01
In order to mitigate network congestion caused by node failures, a novel survivability evaluation method, SAICSA (Survivability Algorithm based on Immune Clonal Simulated Annealing), is proposed based on the immune clonal simulated annealing algorithm. In this method, clonal variation and clonal crossover rules are established first, and the optimal solution is obtained through the simulated annealing acceptance criterion as the annealing temperature tends to zero. Simulations with actual data were then conducted to study in depth the relationship between network survivability and influencing factors such as the number of failed edges and the initial temperature. Compared with SAIP (Simulated Annealing algorithm based on Immune Programming) and GSA (Genetic Simulated Annealing), the SAICSA algorithm shows better adaptability.
Temporary Workforce Planning with Firm Contracts: A Model and a Simulated Annealing Heuristic
Muhammad Al-Salamah
2011-01-01
The aim of this paper is to introduce a model for temporary staffing when temporary employment is managed by firm contracts, and to propose a simulated annealing-based method to solve the model. Temporary employment is a policy frequently used to adjust working-hour capacity to fluctuating demand. Temporary workforce planning models have been unnecessarily simplified to account only for periodic hiring and laying off; a company can review its workforce requirement every period and make hire-fire decisions accordingly, usually with a layoff cost. We present a more realistic temporary workforce planning model that assumes a firm contract between the worker and the company, which can extend over several periods. The model assumes the traditional constraints, such as inventory balance constraints, worker availability, and labor hour mix. The costs are the inventory holding cost, the training cost of the temporary workers, and the backorder cost. The mixed integer model developed for this case has been found to be difficult to solve even for small problem sizes; therefore, a simulated annealing algorithm is proposed to solve it. The performance of the SA algorithm is compared with the CPLEX solution.
Jiang, Chunhua; Yang, Guobin; Zhu, Peng; Nishioka, Michi; Yokoyama, Tatsuhiro; Zhou, Chen; Song, Huan; Lan, Ting; Zhao, Zhengyu; Zhang, Yuannong
2016-05-01
This paper presents a new method to reconstruct the vertical electron density profile from vertical Total Electron Content (TEC) using the simulated annealing algorithm. The present technique uses quasi-parabolic segments (QPS) to model the bottomside ionosphere. The initial parameters of the ionosphere model were determined from both the International Reference Ionosphere (IRI) (Bilitza et al., 2014) and vertical TEC (vTEC). Then, the simulated annealing algorithm was used to search for the best-fit parameters of the ionosphere model by comparison with the GPS-TEC. The performance and robustness of this technique were verified against ionosonde data. The critical frequency (foF2) and peak height (hmF2) of the F2 layer obtained from ionograms recorded at different locations and on different days were compared with those calculated by the proposed method. The analysis of results shows that the present method is promising for obtaining foF2 from vTEC. However, the accuracy of hmF2 needs to be improved in future work.
Prediction of Flood Warning in Taiwan Using Nonlinear SVM with Simulated Annealing Algorithm
Lee, C.
2013-12-01
Flooding is an important issue in Taiwan: the island's narrow, mountainous topography makes many of its rivers steep, and tropical depressions such as typhoons regularly cause them to flood. Predicting river flow under extreme rainfall is therefore important for the government when announcing flood warnings. Whenever a typhoon passes over Taiwan, floods occur along some rivers. In Taiwan, the warning is classified into three levels according to the warning water levels. The purpose of this study is to predict the flood warning level from information on precipitation, rainfall duration, and riverbed slope. To classify the warning level from this information, a machine learning model, a nonlinear support vector machine (SVM), is formulated. In addition, simulated annealing (SA), a probabilistic heuristic algorithm, is used to determine the optimal parameters of the SVM model. A case study of flood-prone rivers of different gradients in Taiwan is conducted. The contribution of this SVM model with simulated annealing is its ability to issue efficient flood warnings and keep residents along the rivers out of danger.
Redesigning rain gauges network in Johor using geostatistics and simulated annealing
Aziz, Mohd Khairul Bazli Mohd, E-mail: mkbazli@yahoo.com [Centre of Preparatory and General Studies, TATI University College, 24000 Kemaman, Terengganu, Malaysia and Department of Mathematical Sciences, Faculty of Science, Universiti Teknologi Malaysia, 81310 UTM Johor Bahru, Johor (Malaysia); Yusof, Fadhilah, E-mail: fadhilahy@utm.my [Department of Mathematical Sciences, Faculty of Science, Universiti Teknologi Malaysia, 81310 UTM Johor Bahru, Johor (Malaysia); Daud, Zalina Mohd, E-mail: zalina@ic.utm.my [UTM Razak School of Engineering and Advanced Technology, Universiti Teknologi Malaysia, UTM KL, 54100 Kuala Lumpur (Malaysia); Yusop, Zulkifli, E-mail: zulyusop@utm.my [Institute of Environmental and Water Resource Management (IPASA), Faculty of Civil Engineering, Universiti Teknologi Malaysia, 81310 UTM Johor Bahru, Johor (Malaysia); Kasno, Mohammad Afif, E-mail: mafifkasno@gmail.com [Malaysia - Japan International Institute of Technology (MJIIT), Universiti Teknologi Malaysia, UTM KL, 54100 Kuala Lumpur (Malaysia)
2015-02-03
Recently, many rainfall network design techniques have been developed, discussed and compared by many researchers. Present-day hydrological studies require higher levels of accuracy from collected data. In numerous basins, the rain gauge stations are located without clear scientific understanding. In this study, an attempt is made to redesign the rain gauge network for Johor, Malaysia, in order to meet the required level of accuracy preset by rainfall data users. The existing network of 84 rain gauges in Johor is optimized and redesigned into new locations using rainfall, humidity, solar radiation, temperature and wind speed data collected during the monsoon seasons (November - February) of 1975 until 2008. This study used a combination of a geostatistical method (the variance-reduction method) and simulated annealing as the optimization algorithm during the redesign process. The result shows that the new rain gauge locations provide the minimum estimated variance. This shows that the combination of geostatistics (the variance-reduction method) and simulated annealing is successful in developing a new optimum rain gauge network.
Using simulated annealing algorithms to solve the Schrödinger equation in muonic atoms
In many physical problems, the computation of exact wave functions for muons (particles about two hundred times heavier than electrons), bound in the extended Coulomb field created by the atomic nucleus, is required. Even though the problem is trivial under the assumption of point-like nuclear systems, consideration of the nuclear finite size necessitates the use of suitable numerical techniques. In the case of non-relativistic bound muons, solving the Schrödinger equation is reliable, but a relativistic description requires solving the Dirac equations for the bound muon. In the present contribution, as a first step, we attempt to derive a method for solving the Schrödinger equation on the basis of simulated annealing algorithms. To this end, one may optimize appropriate parametric expressions for the wave function of a muon orbiting around complex nuclei by employing the simulated annealing method recently constructed to minimize multi-parametric expressions in several physical applications.
Design and optimization of solid rocket motor Finocyl grain using simulated annealing
Ali Kamran; LIANG Guo-zhu
2011-01-01
The research effort outlined the application of a computer aided design (CAD)-centric technique to the design and optimization of solid rocket motor Finocyl (fin in cylinder) grain using simulated annealing. The proper method for constructing the grain configuration model, ballistic performance and optimizer integration for analysis was presented. Finocyl is a complex grain configuration, requiring thirteen variables to define the geometry. The large number of variables not only complicates the geometrical construction but also the optimization process. CAD representation encapsulates all of the geometric entities pertinent to the grain design in a parametric way, allowing manipulation of the grain entity (web), performing regression and automating geometrical data calculations. Robustness in avoiding local minima and an efficient capacity to explore the design space make simulated annealing an attractive choice as the optimizer. It is demonstrated, with a constrained optimization of Finocyl grain geometry for a homogeneous, isotropic propellant, uniform regression, and a quasi-steady, bulk-mode internal ballistics model, that the approach maximizes average thrust for required deviations from neutrality.
Simulating Astronomical Adaptive Optics Systems Using Yao
Rigaut, François; Van Dam, Marcos
2013-12-01
Adaptive Optics systems are at the heart of the coming Extremely Large Telescopes generation. Given the importance, complexity and required advances of these systems, being able to simulate them faithfully is key to their success, and thus to the success of the ELTs. The type of systems envisioned to be built for the ELTs cover most of the AO breeds, from NGS AO to multiple guide star Ground Layer, Laser Tomography and Multi-Conjugate AO systems, with typically a few thousand actuators. This represents a large step up from the current generation of AO systems, and accordingly a challenge for existing AO simulation packages. This is especially true as, in the past years, computer power has not been following Moore's law in its most common understanding; CPU clocks are hovering at about 3 GHz. Although the use of supercomputers is a possible solution to run these simulations, being able to use smaller machines has obvious advantages: cost, access, environmental issues. By using optimised code in an already proven AO simulation platform, we were able to run complex ELT AO simulations on very modest machines, including laptops. The platform is YAO. In this paper, we describe YAO, its architecture, its capabilities, the ELT-specific challenges and optimisations, and finally its performance. As an example, execution speed ranges from 5 iterations per second for a Laser Tomography AO system with 6 LGS and a 60x60-subaperture Shack-Hartmann wavefront sensor (including full physical image formation and detector characteristics) up to over 30 iterations/s for a single NGS AO system.
Annealing kinetics of single displacement cascades in Ni: An atomic scale computer simulation
In order to describe the long-term evolution of the defects produced by a displacement cascade, molecular dynamics (MD) and kinetic Monte Carlo (KMC) methods are employed. Using an empirical Ni interatomic potential in MD, the damage resulting from primary knock-on atom (PKA) energies up to 30 keV has been simulated. The annealing kinetics and the fraction of freely migrating defects (FMD) are determined for each single displacement cascade by a KMC code which is based on a set of parameters extracted mainly from MD simulations. It allows an atomistic study of the evolution of the initial damage over a time scale up to 100 s and the determination of the fraction of the defects that escape the KMC box, compared to those obtained by MD, as a function of temperature and PKA energy. It has been found that this fraction depends strongly on the temperature but reaches a saturation value above stage V
Kai Moriguchi
2015-01-01
We evaluated the potential of simulated annealing as a reliable method for optimizing thinning rates for single even-aged stands. Four types of yield models were used as benchmark models to examine the algorithm’s versatility. The thinning rate, which was constrained to 0–50% every 5 years at stand ages of 10–45 years, was optimized to maximize the net present value for one fixed rotation term (50 years). The best parameters for the simulated annealing were chosen from 113 patterns, using the mean of the net present value from 39 runs to ensure the best performance. We compared the solutions with those from coarse full enumeration to evaluate the method’s reliability, and with 39 runs of random search to evaluate its efficiency. In contrast to random search, the best run of simulated annealing for each of the four yield models resulted in a better solution than coarse full enumeration. However, variations in the objective function for two yield models obtained with simulated annealing were significantly larger than those of random search. In conclusion, simulated annealing with optimized parameters is more efficient for optimizing thinning rates than random search. However, it is necessary to execute multiple runs to obtain reliable solutions.
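The decision structure (a 0-50% thinning rate in 5% steps every 5 years from age 10 to 45, scored by net present value) can be sketched with a toy yield model. Everything below (linear stand growth of 8 m^3/yr, the timber price, and the 3% discount rate) is an invented stand-in for the paper's four yield models:

```python
import math
import random

RATES = [i * 0.05 for i in range(11)]          # allowed thinning rates: 0%..50% in 5% steps
AGES = list(range(10, 50, 5))                  # decision ages 10, 15, ..., 45

def npv(plan, price=50.0, discount=0.03):
    """Net present value of a thinning plan under a toy linear-growth yield model."""
    vol, value, age = 0.0, 0.0, 0
    for a, rate in zip(AGES, plan):
        vol += (a - age) * 8.0                 # growth since the last decision age
        age = a
        cut = vol * rate                       # thinned volume is sold immediately
        vol -= cut
        value += price * cut / (1 + discount) ** a
    vol += (50 - age) * 8.0                    # final harvest at the 50-year rotation end
    value += price * vol / (1 + discount) ** 50
    return value

def sa_thinning(iters=3000, t0=200.0, cooling=0.995, seed=2):
    rng = random.Random(seed)
    plan = [rng.choice(RATES) for _ in AGES]
    val = npv(plan)
    best, best_val = list(plan), val
    t = t0
    for _ in range(iters):
        cand = list(plan)
        cand[rng.randrange(len(cand))] = rng.choice(RATES)   # re-draw one period's rate
        cv = npv(cand)
        if cv >= val or rng.random() < math.exp((cv - val) / t):
            plan, val = cand, cv
            if val > best_val:
                best, best_val = list(plan), val
        t *= cooling
    return best, best_val

plan, value = sa_thinning()
```

The paper's recommendation to execute multiple runs corresponds to calling `sa_thinning` with several seeds and keeping the best result.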
Adaptive resolution simulation in equilibrium and beyond
Wang, H.; Agarwal, A.
2015-09-01
In this paper, we investigate the equilibrium statistical properties of both the force and potential interpolations of adaptive resolution simulation (AdResS) under the theoretical framework of grand-canonical-like AdResS (GC-AdResS). The thermodynamic relations between the higher and lower resolutions are derived by considering the absence of fundamental conservation laws in mechanics for both branches of AdResS. In order to investigate the applicability of the AdResS method in studying properties beyond equilibrium, we demonstrate the accuracy of AdResS in computing dynamical properties in two numerical examples: the velocity autocorrelation of pure water and the conformational relaxation of alanine dipeptide dissolved in water. Theoretical and technical open questions of the AdResS method are discussed at the end of the paper.
Sanchez Lopez, Hector [Universidad de Oriente, Santiago de Cuba (Cuba). Centro de Biofisica Medica]. E-mail: hsanchez@cbm.uo.edu.cu
2001-08-01
This work describes an alternative Simulated Annealing algorithm applied to the design of the main magnet for a magnetic resonance imaging machine. The algorithm uses a probabilistic radial basis neural network to classify the possible solutions before the objective function evaluation. This procedure reduces by up to 50% the number of iterations required to achieve the global maximum, compared with the standard SA algorithm. The algorithm was applied to design a 0.1050 Tesla four-coil resistive magnet, which produces a magnetic field 2.13 times more uniform than the solution given by SA. (author)
A simulated annealing approach to schedule optimization for the SES facility
Mcmahon, Mary Beth; Dean, Jack
1992-01-01
The Shuttle Engineering Simulator (SES) is a facility which houses the software and hardware for a variety of simulation systems. The simulators include the Autonomous Remote Manipulator, the Manned Maneuvering Unit, Orbiter/Space Station docking, and shuttle entry and landing. The SES simulators are used by various groups throughout NASA. For example, astronauts use the SES to practice maneuvers with the shuttle equipment; programmers use the SES to test flight software; and engineers use the SES for design and analysis studies. Due to its high demand, the SES is busy twenty-four hours a day and seven days a week. Scheduling the facility is a problem that is constantly growing and changing with the addition of new equipment. Currently a number of small independent programs have been developed to help solve the problem, but the long-term answer lies in finding a flexible, integrated system that provides the user with the ability to create, optimize, and edit the schedule. COMPASS is an interactive and highly flexible scheduling system. However, until recently COMPASS did not provide any optimization features. This paper describes the simulated annealing extension to COMPASS. It now allows the user to interweave schedule creation, revision, and optimization. This practical approach was necessary in order to satisfy the operational requirements of the SES.
This paper presents a new method for loading pattern optimization in a VVER-1000 reactor core. Because of the immensity of the search space in fuel management optimization problems, finding the optimum solution requires a huge amount of calculation in the classical method, while neural network models, with massively parallel structures, accompanied by the simulated annealing method, are powerful enough to find the best solution in a reasonable time. The Hopfield neural network operates as a local minimum searching algorithm, and simulated annealing is used to improve the result obtained from the neural network. Simulated annealing, because of its stochastic nature, can free the result of the Hopfield neural network from a local minimum and guide it to the global minimum. In this study, minimization of the radial power peaking factor inside the reactor core of Bushehr NPP is considered as the objective. The result is the optimum configuration, which is in agreement with the pattern proposed by the designer
Pharmacokinetic modeling of dynamic MR images using a simulated annealing-based optimization
Sawant, Amit R.; Reece, John H.; Reddick, Wilburn E.
2000-04-01
The aim of this work was to use dynamic contrast enhanced MR image (DEMRI) data to generate 'parameter images' which provide functional information about contrast agent access, in bone sarcoma. A simulated annealing based technique was applied to optimize the parameters of a pharmacokinetic model used to describe the kinetics of the tissue response during and after intravenous infusion of a paramagnetic contrast medium, Gd-DTPA. Optimization was performed on a pixel by pixel basis so as to minimize the sum of square deviations of the calculated values from the values obtained experimentally during dynamic contrast enhanced MR imaging. A cost function based on a priori information was introduced during the annealing procedure to ensure that the values obtained were within the expected ranges. The optimized parameters were used in the model to generate parameter images, which reveal functional information that is normally not visible in conventional Gd-DTPA enhanced MR images. This functional information, during and upon completion of pre-operative chemotherapy, is useful in predicting the probability of disease free survival.
Retrieval of Surface and Subsurface Moisture of Bare Soil Using Simulated Annealing
Tabatabaeenejad, A.; Moghaddam, M.
2009-12-01
Soil moisture is of fundamental importance to many hydrological and biological processes. Soil moisture information is vital to understanding the cycling of water, energy, and carbon in the Earth system. Knowledge of soil moisture is critical to agencies concerned with weather and climate, runoff potential and flood control, soil erosion, reservoir management, water quality, agricultural productivity, drought monitoring, and human health. The need to monitor the soil moisture on a global scale has motivated missions such as Soil Moisture Active and Passive (SMAP) [1]. Rough surface scattering models and remote sensing retrieval algorithms are essential in study of the soil moisture, because soil can be represented as a rough surface structure. Effects of soil moisture on the backscattered field have been studied since the 1960s, but soil moisture estimation remains a challenging problem and there is still a need for more accurate and more efficient inversion algorithms. It has been shown that the simulated annealing method is a powerful tool for inversion of the model parameters of rough surface structures [2]. The sensitivity of this method to measurement noise has also been investigated assuming a two-layer structure characterized by the layers dielectric constants, layer thickness, and statistical properties of the rough interfaces [2]. However, since the moisture profile varies with depth, it is sometimes necessary to model the rough surface as a layered structure with a rough interface on top and a stratified structure below where each layer is assumed to have a constant volumetric moisture content. In this work, we discretize the soil structure into several layers of constant moisture content to examine the effect of subsurface profile on the backscattering coefficient. We will show that while the moisture profile could vary in deeper layers, these layers do not affect the scattered electromagnetic field significantly. Therefore, we can use just a few layers
Fabrication of simulated plate fuel elements: Defining role of stress relief annealing
This study involved fabrication of simulated plate fuel elements. Uranium silicide of actual fuel elements was replaced with yttria. The fabrication stages were otherwise identical. The final cold rolled and/or straightened plates, without stress relief, showed an inverse relationship between bond strength and out of plane residual shear stress (τ13). Stress relief of τ13 was conducted over a range of temperatures/times (200–500 °C and 15–240 min) and led to corresponding improvements in bond strength. Fastest τ13 relief was obtained through 300 °C annealing. Elimination of microscopic shear bands, through recovery and partial recrystallization, was clearly the most effective mechanism of relieving τ13
FPGA PLACEMENT OPTIMIZATION BY TWO-STEP UNIFIED GENETIC ALGORITHM AND SIMULATED ANNEALING ALGORITHM
Yang Meng; A.E.A.Almaini; Wang Pengjun
2006-01-01
Genetic Algorithm (GA) is a biologically inspired technique widely used to solve numerous combinatorial optimization problems. It works on a population of individuals, not just one single solution. As a result, it avoids converging to a local optimum. However, it takes too much CPU time in the late stages of GA. On the other hand, in the late stages Simulated Annealing (SA) converges faster than GA, but it is easily trapped in a local optimum. In this letter, a useful method that unifies GA and SA is introduced, which exploits the global search ability of GA and the fast convergence of SA. The experimental results show that the proposed algorithm outperforms GA in terms of CPU time without degradation of performance. It also achieves a placement cost highly comparable to the state-of-the-art results obtained by the Versatile Place and Route (VPR) tool.
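The two-step "GA for global search, then SA for fast refinement" scheme can be sketched on a toy bitstring objective (OneMax); the operators, population size, and parameters below are illustrative assumptions, not those of the letter's placement algorithm:

```python
import math
import random

N = 16
fitness = lambda bits: sum(bits)               # toy objective: maximize the number of 1s

def ga_phase(rng, pop_size=20, gens=30):
    """Coarse global search: tournament selection, uniform crossover, bit-flip mutation."""
    pop = [[rng.randint(0, 1) for _ in range(N)] for _ in range(pop_size)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            a, b = (max(rng.sample(pop, 3), key=fitness) for _ in range(2))
            child = [x if rng.random() < 0.5 else y for x, y in zip(a, b)]
            if rng.random() < 0.3:             # occasional single-bit mutation
                child[rng.randrange(N)] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

def sa_phase(bits, rng, t0=1.0, cooling=0.99, iters=2000):
    """Fast local refinement of the GA result with single-bit-flip moves."""
    cur, cf = list(bits), fitness(bits)
    best, bf = list(cur), cf
    t = t0
    for _ in range(iters):
        cand = list(cur)
        cand[rng.randrange(N)] ^= 1
        nf = fitness(cand)
        if nf >= cf or rng.random() < math.exp((nf - cf) / t):
            cur, cf = cand, nf
            if cf > bf:
                best, bf = list(cur), cf
        t *= cooling
    return best, bf

rng = random.Random(5)
seed_bits = ga_phase(rng)                      # step 1: GA explores globally
best, bf = sa_phase(seed_bits, rng)            # step 2: SA converges quickly from there
```

In the FPGA setting the bitstring would be replaced by a placement and `fitness` by the placement cost, but the hand-off structure is the same.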
A Simulated Annealing method to solve a generalized maximal covering location problem
M. Saeed Jabalameli
2011-04-01
The maximal covering location problem (MCLP) seeks to locate a predefined number of facilities in order to maximize the number of covered demand points. In a classical sense, MCLP has three main implicit assumptions: all-or-nothing coverage, individual coverage, and a fixed coverage radius. By relaxing these assumptions, three classes of model formulations have been developed: gradual cover models, cooperative cover models, and variable radius models. In this paper, we develop a special form of MCLP which combines the characteristics of gradual cover, cooperative cover, and variable radius models. The proposed problem has many applications, such as locating cell phone towers. The model is formulated as a mixed integer non-linear program (MINLP). In addition, a simulated annealing algorithm is used to solve the resulting problem, and the performance of the proposed method is evaluated with a set of randomly generated problems.
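A minimal sketch of the combined gradual/cooperative cover idea, with SA searching over which k sites to open; the linear decay radii, toy coordinates, and swap move are assumptions for illustration, not the paper's MINLP formulation:

```python
import math
import random

def gradual_cover(d, r_full=1.0, r_max=3.0):
    """Full coverage within r_full, decaying linearly to zero at r_max."""
    if d <= r_full:
        return 1.0
    if d >= r_max:
        return 0.0
    return (r_max - d) / (r_max - r_full)

def covered_demand(chosen, sites, demand):
    total = 0.0
    for dx, dy, w in demand:
        # cooperative cover: partial signals from all open facilities add up, capped at 1
        sig = sum(gradual_cover(math.dist((dx, dy), sites[j])) for j in chosen)
        total += w * min(1.0, sig)
    return total

def sa_mclp(sites, demand, k, iters=1500, t0=1.0, cooling=0.995, seed=6):
    rng = random.Random(seed)
    chosen = set(rng.sample(range(len(sites)), k))
    val = covered_demand(chosen, sites, demand)
    best, best_val = set(chosen), val
    t = t0
    for _ in range(iters):
        out = rng.choice(sorted(chosen))                                    # close one site
        inn = rng.choice([s for s in range(len(sites)) if s not in chosen]) # open another
        cand = (chosen - {out}) | {inn}
        cv = covered_demand(cand, sites, demand)
        if cv >= val or rng.random() < math.exp((cv - val) / t):
            chosen, val = cand, cv
            if val > best_val:
                best, best_val = set(chosen), val
        t *= cooling
    return best, best_val

sites = [(0.0, 0.0), (10.0, 10.0), (5.0, 5.0)]
demand = [(0.0, 0.0, 1.0), (10.0, 10.0, 1.0)]
best, cov = sa_mclp(sites, demand, k=2)
```

A variable radius would enter by making `r_full`/`r_max` decision variables with an associated cost; the SA move set would then also perturb radii.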
COLSS Axial Power Distribution Synthesis using Artificial Neural Network with Simulated Annealing
Shim, K. W.; Oh, D. Y.; Kim, D. S.; Choi, Y. J.; Park, Y. H. [KEPCO Nuclear Fuel Company, Inc., Daejeon (Korea, Republic of)
2015-05-15
The core operating limit supervisory system (COLSS) is an application program implemented in the plant monitoring system (PMS) of nuclear power plants (NPPs). COLSS aids the operator in maintaining plant operation within selected limiting conditions for operation (LCOs), such as the departure from nucleate boiling ratio (DNBR) margin and the linear heat rate (LHR) margin. In order to calculate the above LCOs, the COLSS uses the core-averaged axial power distribution (APD). A 40-node APD is synthesized in COLSS from the 5-level in-core neutron flux detector signals using the Fourier series method. We propose an artificial neural network (ANN) with simulated annealing (SA) instead of the Fourier series method to synthesize the APD of COLSS. The proposed method is more accurate than the current method, as shown by the axial-shape RMS errors.
Marsh, Rebeccah E; Riauka, Terence A; McQuarrie, Steve A
2007-01-01
Increasingly, fractals are being incorporated into pharmacokinetic models to describe transport and chemical kinetic processes occurring in confined and heterogeneous spaces. However, fractal compartmental models lead to differential equations with power-law time-dependent kinetic rate coefficients that currently are not accommodated by common commercial software programs. This paper describes a parameter optimization method for fitting individual pharmacokinetic curves based on a simulated annealing (SA) algorithm, which always converged towards the global minimum and was independent of the initial parameter values and parameter bounds. In a comparison using a classical compartmental model, similar fits by the Gauss-Newton and Nelder-Mead simplex algorithms required stringent initial estimates and ranges for the model parameters. The SA algorithm is ideal for fitting a wide variety of pharmacokinetic models to clinical data, especially those for which there is weak prior knowledge of the parameter values, such as the fractal models. PMID:17706176
Solving a one-dimensional cutting stock problem by simulated annealing and tabu search
Jahromi, Meghdad HMA; Tavakkoli-Moghaddam, Reza; Makui, Ahmad; Shamsi, Abbas
2012-10-01
A cutting stock problem is one of the main and classical problems in operations research and is modeled as an LP problem. Because of its NP-hard nature, finding an optimal solution in reasonable time is extremely difficult, or at least uneconomical. In this paper, two meta-heuristic algorithms, namely simulated annealing (SA) and tabu search (TS), are proposed and developed for this type of complex and large-sized problem. To evaluate the efficiency of the proposed approaches, several problems are solved using SA and TS, and the results are compared. The results show that the proposed SA yields better objective function values than TS.
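Neither of these abstracts includes pseudocode; as background, the Metropolis acceptance loop that SA papers like these build on can be sketched as follows. The geometric cooling schedule, the ±1 neighborhood, and the toy quadratic objective are illustrative assumptions, not taken from the paper:

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, alpha=0.95, steps=2000, seed=0):
    """Generic SA loop: Metropolis acceptance with geometric cooling."""
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = neighbor(x, rng)
        fy = cost(y)
        # Always accept improving moves; accept worsening moves with prob exp(-delta/t)
        if fy <= fx or rng.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= alpha  # geometric cooling
    return best, fbest

# Toy instance: minimize (x - 7)^2 over the integers via +/-1 moves
best, fbest = simulated_annealing(
    cost=lambda x: (x - 7) ** 2,
    neighbor=lambda x, rng: x + rng.choice([-1, 1]),
    x0=0,
)
```

Real cutting stock applications replace the toy objective and neighborhood with pattern-based cost and move definitions; the acceptance rule stays the same.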
A hybrid Tabu search-simulated annealing method to solve quadratic assignment problem
Mohamad Amin Kaviani
2014-06-01
Quadratic assignment problem (QAP) has been considered one of the most complicated problems. The problem is NP-hard, and optimal solutions are not available for large-scale instances. This paper presents a hybrid method using tabu search and simulated annealing, called TABUSA, to solve the QAP. Using some well-known problems from QAPLIB generated by Burkard et al. (1997) [Burkard, R. E., Karisch, S. E., & Rendl, F. (1997). QAPLIB - a quadratic assignment problem library. Journal of Global Optimization, 10(4), 391-403.], the two methods, TABUSA and TS, are both coded in MATLAB and compared in terms of relative percentage deviation (RPD) for all instances. The performance of the proposed method is examined against tabu search, and the preliminary results indicate that the hybrid method is capable of solving real-world problems efficiently.
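The TABUSA internals are not given in the abstract; one plausible way to combine a tabu list with SA acceptance on the QAP's swap neighborhood is sketched below. The instance data, tabu-list length, and cooling parameters are all illustrative assumptions, not the paper's:

```python
import math
import random
from collections import deque

def qap_cost(perm, flow, dist):
    """Classic QAP objective: sum of flow[i][j] * dist[perm[i]][perm[j]]."""
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]]
               for i in range(n) for j in range(n))

def tabusa(flow, dist, t0=10.0, alpha=0.98, steps=3000, tabu_len=10, seed=1):
    rng = random.Random(seed)
    n = len(flow)
    # Keep the tabu list shorter than the swap neighborhood so a move is always possible
    tabu = deque(maxlen=min(tabu_len, n * (n - 1) // 2 - 1))
    perm = list(range(n))
    cost = qap_cost(perm, flow, dist)
    best, bcost = perm[:], cost
    t = t0
    for _ in range(steps):
        i, j = rng.sample(range(n), 2)
        if (min(i, j), max(i, j)) in tabu:
            continue  # recently used swap is tabu (aspiration criteria omitted)
        cand = perm[:]
        cand[i], cand[j] = cand[j], cand[i]
        c = qap_cost(cand, flow, dist)
        if c <= cost or rng.random() < math.exp((cost - c) / t):
            perm, cost = cand, c
            tabu.append((min(i, j), max(i, j)))
            if cost < bcost:
                best, bcost = perm[:], cost
        t = max(t * alpha, 1e-9)
    return best, bcost

# Tiny symmetric instance (values are made up, not from QAPLIB)
flow = [[0, 3, 0, 2], [3, 0, 0, 1], [0, 0, 0, 4], [2, 1, 4, 0]]
dist = [[0, 22, 53, 53], [22, 0, 40, 62], [53, 40, 0, 55], [53, 62, 55, 0]]
best, bcost = tabusa(flow, dist)
```

The tabu list here blocks only recently used swap pairs; a production implementation would also add an aspiration rule that overrides tabu status when a move improves on the best cost found so far.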
Deist, T. M.; Gorissen, B. L.
2016-02-01
High-dose-rate brachytherapy is a tumor treatment method where a highly radioactive source is brought in close proximity to the tumor. In this paper we develop a simulated annealing algorithm to optimize the dwell times at preselected dwell positions to maximize tumor coverage under dose-volume constraints on the organs at risk. Compared to existing algorithms, our algorithm has advantages in terms of speed and objective value and does not require an expensive general purpose solver. Its success mainly depends on exploiting the efficiency of matrix multiplication and a careful selection of the neighboring states. In this paper we outline its details and make an in-depth comparison with existing methods using real patient data.
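The abstract credits much of the algorithm's speed to exploiting matrix multiplication; the rank-1 dose update behind that idea can be sketched as follows. The matrix sizes, prescription dose, and coverage-only objective are simplifying assumptions, and the paper's dose-volume constraints on organs at risk are omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: dose-rate matrix D maps dwell times (s) to voxel doses (Gy)
n_vox, n_dwell, presc = 500, 20, 10.0
D = rng.uniform(0.0, 0.5, size=(n_vox, n_dwell))
t = np.full(n_dwell, 1.0)      # current dwell times
dose = D @ t                   # one matrix product per full evaluation

def coverage(dose):
    """Fraction of tumor voxels receiving at least the prescription dose."""
    return float(np.mean(dose >= presc))

temp = 1.0
best_cov = cov = coverage(dose)
for step in range(5000):
    k = rng.integers(n_dwell)
    delta = rng.normal(0.0, 0.2)
    if t[k] + delta < 0.0:
        continue                         # dwell times must stay non-negative
    new_dose = dose + D[:, k] * delta    # rank-1 update instead of recomputing D @ t
    new_cov = coverage(new_dose)
    # Maximize coverage: accept if no worse, else with Boltzmann probability
    if new_cov >= cov or rng.random() < np.exp((new_cov - cov) / temp):
        t[k] += delta
        dose = new_dose
        cov = new_cov
        best_cov = max(best_cov, cov)
    temp *= 0.999
```

Because only one dwell time changes per move, the incremental dose update costs O(n_vox) instead of the O(n_vox * n_dwell) of a full matrix product, which is one plausible reading of the speed advantage the paper describes.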
Dynamic surface matching using simulated annealing for patient positioning in radiotherapy
Over a course of radiotherapy treatment, a patient's body surface will undergo daily shape changes due to the unavoidable dynamics of internal/external forces. This places ever-increasing demands on conventional pre-treatment set-ups, which attempt to re-create on the treatment couch a patient orientation identical to that of the planned CT scan, using only a few coplanar skin markers for reference. Surface matching offers a potentially more effective method of patient set-up. A high-resolution height-map may be constructed using available CT data for any portion of a patient's body surface. Technology exists for the generation of real-time partial-surface height-maps at sub-second intervals within the treatment room. Using simulated annealing for the co-registration of both height-maps, it is possible to finely adjust a patient set-up in a manner consistent with the gross changes in body surface shape. (author)
Application of simulated annealing algorithm to improve work roll wear model in plate mills
[Author not listed]
2002-01-01
Employing a simulated annealing algorithm (SAA) and many measured data, a calculation model of work roll wear was built for the 2 800 mm 4-high mill of Wuhan Iron and Steel (Group) Co. (WISCO). The model is a semi-theoretical practical formula whose pattern and magnitude could hardly be determined with classical optimization methods, but the problem could be resolved by SAA. The model predicts the wear profiles of work rolls in a rolling unit with high precision. After one year of application, the results show that the model is feasible in engineering and can be applied to predict the wear profiles of work rolls in other mills.
Simulated Annealing for Ground State Energy of Ionized Donor Bound Excitons in Semiconductors
YAN Hai-Qing; TANG Chen; LIU Ming; ZHANG Hao; ZHANG Gui-Min
2004-01-01
We present a global optimization method, called simulated annealing, for computing the ground state energies of excitons. The proposed method does not require partial derivatives with respect to each variational parameter or the solution of an eigenequation, so it is simpler to program than the variational method and overcomes the major difficulties. The ground state energies of ionized-donor-bound excitons (D+, X) have been calculated variationally for all values of the effective electron-to-hole mass ratio σ and are compared with those obtained by the variational method. The results demonstrate that the proposed method is simple, accurate, and has advantages over the traditional methods in calculation.
Sochi, Taha
2014-01-01
In this paper, we propose and test an intuitive assumption that the pressure field in single conduits and networks of interconnected conduits adjusts itself to minimize the total energy consumption required for transporting a specific quantity of fluid. We test this assumption by using linear flow models of Newtonian fluids transported through rigid tubes and networks, in conjunction with a simulated annealing (SA) protocol to minimize the total energy cost. All the results confirm our hypothesis, as the SA algorithm produces results very close to those obtained from the traditional deterministic methods of identifying the flow fields by solving a set of simultaneous equations based on conservation principles. The same results apply to electric ohmic conductors and networks of interconnected ohmic conductors; computational experiments conducted in this regard confirm this extension. Further studies are required to test the energy minimization hypothesis for non-linear flow systems.
Application of simulated annealing to solve multi-objectives for aggregate production planning
Atiya, Bayda; Bakheet, Abdul Jabbar Khudhur; Abbas, Iraq Tereq; Bakar, Mohd. Rizam Abu; Soon, Lee Lai; Monsi, Mansor Bin
2016-06-01
Aggregate production planning (APP) is one of the most significant and complicated problems in production planning. It aims to set overall production levels for each product category to meet fluctuating or uncertain future demand, and to make decisions concerning hiring, firing, overtime, subcontracting, and inventory carrying levels. In this paper, we present a simulated annealing (SA) approach for multi-objective linear programming to solve APP. SA is considered a good tool for imprecise optimization problems. The proposed model minimizes total production and workforce costs. In this study, the proposed SA is compared with particle swarm optimization (PSO). The results show that the proposed SA is effective in reducing total production costs and requires minimal time.
Simulated annealing in networks for computing possible arrangements for red and green cones
Ahumada, Albert J., Jr.
1987-01-01
Attention is given to network models in which each of the cones of the retina is given a provisional color at random, and the cones are then allowed to determine the colors of their neighbors through an iterative process. A symmetric-structure spin-glass model has allowed arrays to be generated from completely random arrangements of red and green to arrays with approximately as much disorder as the parafoveal cones. Simulated annealing has also been added to the process in an attempt to generate color arrangements with greater regularity, and hence more revealing moiré patterns, than the arrangements yielded by quenched spin-glass processes. Attention is given to the perceptual implications of these results.
Simulated annealing approach to vascular structure with application to the coronary arteries.
Keelan, Jonathan; Chung, Emma M L; Hague, James P
2016-02-01
Do the complex processes of angiogenesis during organism development ultimately lead to a near optimal coronary vasculature in the organs of adult mammals? We examine this hypothesis using a powerful and universal method, built on physical and physiological principles, for the determination of globally energetically optimal arterial trees. The method is based on simulated annealing, and can be used to examine arteries in hollow organs with arbitrary tissue geometries. We demonstrate that the approach can generate in silico vasculatures which closely match porcine anatomical data for the coronary arteries on all length scales, and that the optimized arterial trees improve systematically as computational time increases. The method presented here is general, and could in principle be used to examine the arteries of other organs. Potential applications include improvement of medical imaging analysis and the design of vascular trees for artificial organs. PMID:26998317
In this paper, a microwave imaging technique for reconstructing the shape of two-dimensional perfectly conducting scatterers by means of a stochastic optimization approach is investigated. Based on the boundary condition and the measured scattered field derived from transverse magnetic illuminations, a set of nonlinear integral equations is obtained and the imaging problem is reformulated into an optimization problem. A hybrid approximation algorithm, called PSO-SA, is developed in this work to solve the inverse scattering problem. In the hybrid algorithm, particle swarm optimization (PSO) combines global and local search to find optimal results in reasonable time, and simulated annealing (SA) uses a certain probability to avoid being trapped in a local optimum. The hybrid approach elegantly combines the exploration ability of PSO with the exploitation ability of SA. Reconstruction results are compared with the exact shapes of some conducting cylinders, and good agreement with the original shapes is observed.
Prediction of cutting force in turning of UD-GFRP using mathematical model and simulated annealing
Gupta, Meenu; Gill, Surinder Kumar
2012-12-01
Glass fiber reinforced plastic (GFRP) composites are considered an alternative to heavy exotic materials, and accordingly the need for accurate machining of composites has increased enormously. During machining, the resulting cutting force is an important aspect. The present investigation deals with the study and development of a cutting force prediction model for the machining of unidirectional glass fiber reinforced plastic (UD-GFRP) composites using regression modeling and optimization by simulated annealing. The process parameters considered include cutting speed, feed rate, and depth of cut. The values predicted by the radial cutting force model are compared with the experimental values, and the predictions are quite close to the experimental values. The influences of different parameters in machining of UD-GFRP composites have been analyzed.
An improved hybrid topology optimization approach coupling simulated annealing and SIMP (SA-SIMP)
The Solid Isotropic Material with Penalization (SIMP) methodology has been used extensively due to its versatility and ease of implementation. However, one of its main drawbacks is that resulting topologies exhibit areas of intermediate densities which lack any physical meaning. This paper presents a hybrid methodology which couples simulated annealing and SIMP (SA-SIMP) in order to achieve solutions which are stiffer and predominantly black and white. Under a look-ahead strategy, the algorithm gradually fixes or removes those elements whose density resulting from SIMP is intermediate. Different strategies for selecting and fixing the fractional elements are examined using benchmark examples, which show that topologies resulting from SA-SIMP are more rigid than SIMP and predominantly black and white.
MULTIOBJECTIVE OPTIMAL DESIGN OF THREE-PHASE INDUCTION GENERATOR USING SIMULATED ANNEALING TECHNIQUE
R. Kannan
2010-05-01
Self-excited induction generators are growing in popularity due to their advantages over conventional synchronous generators. In this paper, the task of finding an optimal design of a three-phase self-excited induction generator is formulated as a multi-criterion optimization problem. The criterion functions in the example are the active material cost and the capacitance required for excitation under full load conditions to maintain rated voltage. The simulated annealing technique is used as a tool to solve the problem. The obtained results prove the effectiveness of the multi-objective approach, since it allows a good compromise among the proposed goals to be found and, above all, represents an efficacious tool for the designer.
An Archived Multi Objective Simulated Annealing Method to Discover Biclusters in Microarray Data
Mohsen Lashkargir
2011-01-01
With the advent of microarray technology it has become possible to measure thousands of gene expression values in a single experiment. Analysis of large-scale genomics data, notably gene expression, initially focused on clustering methods. Recently, biclustering techniques were proposed for revealing submatrices showing unique patterns. Biclustering, or simultaneous clustering of both genes and conditions, is challenging, particularly for the analysis of high-dimensional gene expression data in information retrieval, knowledge discovery, and data mining. In biclustering of microarray data, several objectives have to be optimized simultaneously, and often these objectives conflict with each other, so a multi-objective model is very suitable for solving this problem. Our method proposes an algorithm based on multi-objective simulated annealing for discovering biclusters in gene expression data. Experimental results on benchmark databases show a significant improvement in overlap among biclusters, coverage of elements in gene expression, and quality of biclusters.
Generation of optimal correlations by simulated annealing for ill-conditioned least-squares solution
A typical process for determining the parameters of an empirical correlation is to collect experimental measurements and apply the least-squares method with an over-determined set of data. Least-squares problems occur frequently in the parameter identification of linear/nonlinear dynamic models, and in model fitting using dimensionless variables in interfacial flow treatment, heat transfer, and pressure drop models. Owing to inevitable measurement noise and careless experimental design, the ill-posedness of the least-squares method can arise and limit the accuracy of the assumed correlation structures. In this paper, a simulated annealing method is proposed for estimating the power-law parameters of empirical correlations of experimental data. The method is applied to the determination of the hydrogen removal correlation used in reactor containment analysis. The analysis results show a remarkable improvement in accuracy and robustness for noisy measurement data. (author)
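As an illustration of the idea only (not the paper's hydrogen-removal correlation, whose multi-parameter form is not given here), fitting a one-term power-law y = a·x^b by SA on the sum of squared residuals might look like this; the step sizes, cooling rate, and synthetic data are assumptions:

```python
import math
import random

def fit_power_law(xs, ys, steps=20000, t0=1.0, alpha=0.9995, seed=0):
    """Fit y = a * x**b by simulated annealing on the sum of squared residuals."""
    rng = random.Random(seed)

    def sse(a, b):
        return sum((y - a * x ** b) ** 2 for x, y in zip(xs, ys))

    a, b = 1.0, 1.0
    f = sse(a, b)
    best = (a, b, f)
    t = t0
    for _ in range(steps):
        # Gaussian perturbation of both parameters at once
        da, db = rng.gauss(0, 0.05), rng.gauss(0, 0.05)
        fa = sse(a + da, b + db)
        if fa <= f or rng.random() < math.exp((f - fa) / t):
            a, b, f = a + da, b + db, fa
            if f < best[2]:
                best = (a, b, f)
        t *= alpha
    return best

# Synthetic data from y = 2.5 * x**0.8 (noise-free for illustration)
xs = [1, 2, 3, 4, 5, 6, 7, 8]
ys = [2.5 * x ** 0.8 for x in xs]
a, b, f = fit_power_law(xs, ys)
```

Unlike gradient-based least squares, this needs no derivatives and no careful initial guess, which is the robustness property the abstract emphasizes for noisy data.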
CODUSA--customize optimal donor using simulated annealing in heart transplantation.
Ansari, Daniel; Andersson, Bodil; Ohlsson, Mattias; Höglund, Peter; Andersson, Roland; Nilsson, Johan
2013-01-01
In heart transplantation, selection of an optimal recipient-donor match has been constrained by the lack of individualized prediction models. Here we developed a customized donor-matching model (CODUSA) for patients requiring heart transplantation, by combining simulated annealing and artificial neural networks. Using this approach, analyzing 59,698 adult heart transplant patients, we found that donor age matching was the variable most strongly associated with long-term survival. Female hearts were given to 21% of the women and 0% of the men, and recipients with blood group B received an identically matched blood group in only 18% of best-case matches, compared with 73% for the original match. By optimizing the donor profile, survival could be improved by 33 months. These findings strongly suggest that the CODUSA model can improve the ability to select an optimal match and avoid worst-case matches in the clinical setting. This is an important step towards personalized medicine. PMID:23722478
Vasios C.E.
2003-01-01
In the present work, a new method for the classification of event-related potentials (ERPs) is proposed. The proposed method consists of two modules: the feature extraction module and the classification module. The feature extraction module implements a multivariate autoregressive model in conjunction with the simulated annealing technique for the selection of optimum features from ERPs. The classification module is implemented with a single three-layer neural network trained with the back-propagation algorithm, and classifies the data into two classes: patients and control subjects. The method, in the form of a decision support system (DSS), has been thoroughly tested on a number of patient data sets (OCD, FES, depressives, and drug users), yielding successful classification rates of up to 100%.
Elias David Nino Ruiz
2013-05-01
This paper presents a novel hybrid metaheuristic based on the theory of deterministic swapping, the theory of evolution, and the simulated annealing metaheuristic for the multi-objective optimization of combinatorial problems. The proposed algorithm, named EMSA, is an improvement of the MODS algorithm. Unlike MODS, EMSA works using a search direction given by assigning weights to each objective function of the combinatorial problem. Also, to avoid local optima, EMSA uses the crossover strategy of genetic algorithms. Lastly, EMSA is tested using well-known instances of the bi-objective traveling salesman problem (TSP) from TSPLIB, and its results are compared with the MODS metaheuristic (its predecessor). The comparison was made using metrics from the specialized literature such as spacing, generational distance, inverse generational distance, and non-dominated generation vectors. In every case, the EMSA results on the metrics were better, and in some cases the superiority was 100%.
The redundancy allocation problem (RAP) is an important reliability optimization problem. This paper studies a specific RAP in which redundancy strategies are chosen. To do so, the choice of the redundancy strategy, between active and cold standby, is treated as a decision variable. The goal is to select the redundancy strategy, component, and redundancy level for each subsystem such that system reliability is maximized. Since RAP is an NP-hard problem, we propose an efficient simulated annealing algorithm (SA) to solve it. In addition, to evaluate the performance of the proposed algorithm, it is compared with well-known algorithms from the literature on different test problems. The results of the performance analysis show a relatively satisfactory efficiency of the proposed SA algorithm.
Jingwei Song
2014-01-01
A simulated annealing (SA) based variable-weighted forecast model is proposed to combine and weigh a local chaotic model, an artificial neural network (ANN), and a partial least squares support vector machine (PLS-SVM) to build a more accurate forecast model. The hybrid model was built, and its multistep-ahead prediction ability was tested, using daily MSW generation data from Seattle, Washington, the United States. The hybrid forecast model was shown to produce more accurate and reliable results, and to degrade less over longer predictions, than the three individual models. The average one-week-ahead prediction error was reduced from 11.21% (chaotic model), 12.93% (ANN), and 12.94% (PLS-SVM) to 9.38%. The five-week average was reduced from 13.02% (chaotic model), 15.69% (ANN), and 15.92% (PLS-SVM) to 11.27%.
Idoumghar, L. [Haute Alsace Univ., Mulhouse (France); Fodorean, D.; Mirraoui, A. [Univ. of Technology of Belfort-Montbeliard, Belfort (France). Dept. of Electrical Engineering and Control Systems
2010-03-09
Metaheuristic algorithms can solve complex optimization problems. A unique simulated annealing (SA) algorithm for multi-objective optimization was presented in this paper. The proposed SA algorithm was validated on five standard benchmark mathematical functions and used to improve the design of an inset permanent magnet motor with concentrated flux (IPMM-CF). The paper provided a description of the SA algorithm and discussed the results. The five benchmarks studied were Rastrigin's, Rosenbrock's, Michalewicz's, Schwefel's, and Noisy's functions. The findings were also compared with results obtained using the Ant Colony paradigm as well as a particle swarm algorithm. Conclusions and further research options were also offered. It was concluded that the proposed approach has better performance in terms of accuracy, convergence rate, stability, and robustness. 15 refs., 4 tabs., 9 figs.
Reconstruction of bremsstrahlung spectra from attenuation data using generalized simulated annealing
Menin, Olavo Henrique; da Costa, Alessandro Martins
2014-01-01
Thorough knowledge of an X-ray beam spectrum is mandatory to assess the quality of its source device. Since techniques to directly measure such spectra are expensive and laborious, X-ray spectrum reconstruction from attenuation data has been a promising alternative. However, such reconstruction corresponds mathematically to an inverse, nonlinear, and ill-posed problem, so solving it requires powerful optimization algorithms and good regularization functions. Here, we present a generalized simulated annealing algorithm combined with a suitable smoothing regularization function to solve the X-ray spectrum reconstruction inverse problem. We also propose an approach to set the initial acceptance and visitation temperatures, and a standardization of the objective function terms, to automate the algorithm for different spectral ranges. Numerical tests considering three different reference spectra with their attenuation curves are presented. Results show that the algori...
Cascade annealing of tungsten implanted with 5 keV noble gas atoms. A computer simulation
Kolk, G.J. van der; Veen, A. van; Caspers, L.M. (Interuniversitair Reactor Inst., Delft (Netherlands); Technische Hogeschool Delft (Netherlands)); Hosson, J.T.M. de (Rijksuniversiteit Groningen (Netherlands). Materials Science Centre)
1984-03-01
The trapping of vacancies by implanted atoms is calculated. After low-energy (5 keV) implantation of tungsten with heavy noble gas atoms, most of the implanted atoms are in substitutional positions with one or two vacancies closer than two lattice units. Under the influence of the lattice distortion around the implanted atoms, the vacancies follow a preferential migration path towards the implant during annealing. Migration energies close to the implanted atom are calculated with lattice relaxation simulations. Monte Carlo theory is applied to obtain trapping probabilities as a function of implant-vacancy separation and temperature. An estimate of the initial implant-vacancy separation follows from collision cascade calculations. The results show that nearby vacancies are trapped by the implanted atoms.
Left cardiac ventricle refinement of magnetic resonance images based on simulated annealing
In this work we present a methodology for the refinement of the segmentation of 2D magnetic resonance images. The algorithm proposed here begins with an initial segmentation and, through the addition and exclusion of pixels on the contour of the current segmentation, obtains the desired segmentation. At each step, two segmentations are available, the current one and a candidate; one of the two is selected according to deterministic (hybrid-technique refinement) or stochastic (refinement by simulated annealing) minimization of an energy function. This function is composed of terms that account for the contrast on the contour, the variance of the signal, and the shape of the segmented object. The methodology was evaluated on numerical phantoms and applied to real magnetic resonance images with success. This proposal can be easily extended to other kinds of image modalities. (author)
Simulated annealing for three-dimensional low-beta reduced MHD equilibria in cylindrical geometry
Furukawa, M
2016-01-01
Simulated annealing (SA) is applied to the three-dimensional (3D) equilibrium calculation of ideal, low-beta reduced MHD in cylindrical geometry. The SA is based on the theory of Hamiltonian mechanics. The dynamical equation of the original system, low-beta reduced MHD in this study, is modified so that the energy changes monotonically while preserving the Casimir invariants in the artificial dynamics. An equilibrium of the system is given by an extremum of the energy; therefore SA can be used as a method for calculating ideal MHD equilibria. Previous studies demonstrated that SA succeeds in reaching various MHD equilibria in a two-dimensional rectangular domain. In this paper, the theory is applied to the 3D equilibrium of ideal, low-beta reduced MHD. An example of an equilibrium with magnetic islands, obtained as a lower energy state, is shown. Several versions of the artificial dynamics that can effect smoothing are developed.
Discrete-State Simulated Annealing For Traveling-Wave Tube Slow-Wave Circuit Optimization
Wilson, Jeffrey D.; Bulson, Brian A.; Kory, Carol L.; Williams, W. Dan (Technical Monitor)
2001-01-01
Algorithms based on the global optimization technique of simulated annealing (SA) have proven useful in designing traveling-wave tube (TWT) slow-wave circuits for high RF power efficiency. The characteristic of SA that enables it to determine a globally optimized solution is its ability to accept non-improving moves in a controlled manner. In the initial stages of the optimization, the algorithm moves freely through configuration space, accepting most of the proposed designs. This freedom of movement allows non-intuitive designs to be explored rather than restricting the optimization to local improvement upon the initial configuration. As the optimization proceeds, the rate of acceptance of non-improving moves is gradually reduced until the algorithm converges to the optimized solution. The rate at which the freedom of movement is decreased is known as the annealing or cooling schedule of the SA algorithm. The main disadvantage of SA is that there is no rigorous theoretical foundation for determining the parameters of the cooling schedule. The choice of these parameters is highly problem dependent, and the designer needs to experiment in order to determine values that will provide a good optimization in a reasonable amount of computational time. This experimentation can absorb a large amount of time, especially when the algorithm is being applied to a new type of design. In order to eliminate this disadvantage, a variation of SA known as discrete-state simulated annealing (DSSA) was recently developed. DSSA provides the theoretical foundation for a generic cooling schedule which is problem independent. Results of similar quality to SA can be obtained, but without the extra computational time required to tune the cooling parameters. Two algorithm variations based on DSSA were developed and programmed into a Microsoft Excel spreadsheet graphical user interface (GUI) to the two-dimensional nonlinear multisignal helix traveling-wave amplifier analysis program TWA3.
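The cooling schedules discussed above are easy to state concretely. A small sketch contrasts the commonly used geometric schedule (the kind whose parameters must be hand-tuned, as the abstract describes) with the logarithmic schedule from classical SA convergence theory; DSSA's own problem-independent schedule is not reproduced here:

```python
import math

def geometric(t0, alpha, k):
    """T_k = t0 * alpha**k: fast cooling, but t0 and alpha need hand-tuning."""
    return t0 * alpha ** k

def logarithmic(c, k):
    """T_k = c / ln(k + 2): classical convergence guarantee, impractically slow."""
    return c / math.log(k + 2)

# After 1000 steps the geometric schedule is effectively frozen,
# while the logarithmic schedule has barely cooled at all.
t_geo = geometric(10.0, 0.99, 1000)
t_log = logarithmic(10.0, 1000)
```

The gap between the two illustrates the tuning dilemma: practical schedules cool far faster than theory permits, which is exactly the parameter-choice burden DSSA aims to remove.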
Farhat, Nabil H.
1987-01-01
Self-organization and learning is a distinctive feature of neural nets and processors that sets them apart from conventional approaches to signal processing. It leads to self-programmability which alleviates the problem of programming complexity in artificial neural nets. In this paper architectures for partitioning an optoelectronic analog of a neural net into distinct layers with prescribed interconnectivity pattern to enable stochastic learning by simulated annealing in the context of a Boltzmann machine are presented. Stochastic learning is of interest because of its relevance to the role of noise in biological neural nets. Practical considerations and methodologies for appreciably accelerating stochastic learning in such a multilayered net are described. These include the use of parallel optical computing of the global energy of the net, the use of fast nonvolatile programmable spatial light modulators to realize fast plasticity, optical generation of random number arrays, and an adaptive noisy thresholding scheme that also makes stochastic learning more biologically plausible. The findings reported predict optoelectronic chips that can be used in the realization of optical learning machines.
M. Amighpey; B. Voosoghi; Mahdi Motagh
2013-01-01
The estimation of earthquake source parameters using an earth surface displacement field in an elastic half-space leads to a complex nonlinear inverse problem that classic inverse methods are unable to solve. Global optimization methods such as simulated annealing are a good replacement for such problems. Simulated annealing is analogous to thermodynamic annealing where, under certain conditions, the chaotic motions of atoms in a melt can settle to form a crystal with minimal energy. Followin...
Multivariable Optimization: Quantum Annealing & Computation
Mukherjee, Sudip; Chakrabarti, Bikas K.
2014-01-01
Recent developments in quantum annealing techniques have indicated a potential advantage of quantum annealing for solving NP-hard optimization problems. In this article we briefly indicate and discuss the beneficial features of quantum annealing techniques and compare them with those of simulated annealing techniques. We then briefly discuss quantum annealing studies of some model spin glass and kinetically constrained systems.
Using Site Testing Data for Adaptive Optics Simulations
Herriot, Glen; Andersen, David; Conan, Rod; Ellerbroek, Brent; Gilles, Luc; Hickson, Paul; Jackson, Kate; Lardière, Olivier; Pfrommer, Thomas; Véran, Jean-Pierre; Wang, Lianqi
2011-01-01
Astronomical site-testing data play a vital role in the simulation, design, evaluation, and operation of adaptive optics systems for large telescopes. We present the example of TMT and its first-light facility adaptive optics system NFIRAOS, and illustrate the many simulations done based on site-testing data.
On the Simulation of Adaptive Measurements via Postselection
Dhillon, Vikram
2011-01-01
In this note we address the question of whether any quantum computational model that allows adaptive measurements can be simulated by a model that allows postselected measurements. We answer this question in the affirmative and prove that adaptive measurements can be simulated by postselection. We also discuss some potentially striking consequences of this result, such as the ability to solve #P problems.
Exploring the Use of Adaptively Restrained Particles for Graphics Simulations
Pierre-Luc Manteaux; François Faure; Stephane Redon; Marie-Paule Cani
2013-01-01
In this paper, we explore the use of Adaptively Restrained (AR) particles for graphics simulations. Contrary to previous methods, Adaptively Restrained Particle Simulations (ARPS) do not adapt time or space sampling, but rather switch the positional degrees of freedom of particles on and off, while letting their momenta evolve. Therefore, inter-particle forces do not have to be updated at each time step, in contrast with traditional methods that spend a lot of time ...
Elemental thin film depth profiles by ion beam analysis using simulated annealing - a new tool
Rutherford backscattering spectrometry (RBS) and related techniques have long been used to determine the elemental depth profiles in films a few nanometres to a few microns thick. However, although obtaining spectra is very easy, solving the inverse problem of extracting the depth profiles from the spectra is not possible analytically except for special cases. It is because these special cases include important classes of samples, and because skilled analysts are adept at extracting useful qualitative information from the data, that ion beam analysis is still an important technique. We have recently solved this inverse problem using the simulated annealing algorithm. We have implemented the solution in the 'IBA DataFurnace' code, which has been developed into a very versatile and general new software tool that analysts can now use to rapidly extract quantitative accurate depth profiles from real samples on an industrial scale. We review the features, applicability and validation of this new code together with other approaches to handling IBA (ion beam analysis) data, with particular attention being given to determining both the absolute accuracy of the depth profiles and statistically accurate error estimates. We include examples of analyses using RBS, non-Rutherford elastic scattering, elastic recoil detection and non-resonant nuclear reactions. High depth resolution and the use of multiple techniques simultaneously are both discussed. There is usually systematic ambiguity in IBA data and Butler's example of ambiguity (1990 Nucl. Instrum. Methods B 45 160-5) is reanalysed. Analyses are shown: of evaporated, sputtered, oxidized, ion implanted, ion beam mixed and annealed materials; of semiconductors, optical and magnetic multilayers, superconductors, tribological films and metals; and of oxides on Si, mixed metal silicides, boron nitride, GaN, SiC, mixed metal oxides, YBCO and polymers. (topical review)
Ahmed Fouad Ali
2014-05-01
This paper presents a new algorithm for solving large-scale global optimization problems based on hybridization of simulated annealing and the Nelder-Mead algorithm. The new algorithm is called the simulated Nelder-Mead algorithm with random variables updating (SNMRVU). SNMRVU starts with an initial solution, which is generated randomly, and the solution is then divided into partitions. A neighborhood zone is generated, a random number of partitions is selected, and the variable-updating process starts in order to generate trial neighbor solutions. This process helps the SNMRVU algorithm explore the region around the current solution. The Nelder-Mead algorithm is used in the final stage in order to improve the best solution found so far and accelerate convergence. The performance of the SNMRVU algorithm is evaluated using 27 scalable benchmark functions and compared with four algorithms. The results show that the SNMRVU algorithm is promising and produces high-quality solutions with low computational costs.
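The partition-based "random variables updating" neighbor move can be sketched as follows; the partition count, step size, and function name are illustrative assumptions, not the paper's actual settings:

```python
import random

def subset_neighbor(x, rng, n_partitions=4, step=0.2):
    """Perturb only the variables in a randomly chosen subset of partitions.

    Partition count and step size are assumed values for illustration;
    the SNMRVU paper's actual parameters may differ.
    """
    n = len(x)
    size = max(1, n // n_partitions)          # variables per partition
    starts = list(range(0, n, size))          # partition start indices
    chosen = rng.sample(starts, rng.randint(1, len(starts)))
    y = list(x)
    for s in chosen:
        for i in range(s, min(s + size, n)):
            y[i] += rng.uniform(-step, step)  # update only selected variables
    return y

rng = random.Random(1)
x = [0.0] * 8
y = subset_neighbor(x, rng)
# Only some coordinates change, so each move is a local, partial update.
changed = sum(1 for a, b in zip(x, y) if a != b)
```

Pairing this move with a standard SA acceptance loop, then polishing the best solution with a Nelder-Mead step, mirrors the two-stage structure the abstract describes.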
Identifying fracture-zone geometry using simulated annealing and hydraulic-connection data
Day-Lewis, F. D.; Hsieh, P.A.; Gorelick, S.M.
2000-01-01
A new approach is presented to condition geostatistical simulation of high-permeability zones in fractured rock to hydraulic-connection data. A simulated-annealing algorithm generates three-dimensional (3-D) realizations conditioned to borehole data, inferred hydraulic connections between packer-isolated borehole intervals, and an indicator (fracture zone or background-K bedrock) variogram model of spatial variability. We apply the method to data from the U.S. Geological Survey Mirror Lake Site in New Hampshire, where connected high-permeability fracture zones exert a strong control on fluid flow at the hundred-meter scale. Single-well hydraulic-packer tests indicate where permeable fracture zones intersect boreholes, and multiple-well pumping tests indicate the degree of hydraulic connection between boreholes. Borehole intervals connected by a fracture zone exhibit similar hydraulic responses, whereas intervals not connected by a fracture zone exhibit different responses. Our approach yields valuable insights into the 3-D geometry of fracture zones at Mirror Lake. Statistical analysis of the realizations yields maps of the probabilities of intersecting specific fracture zones with additional wells. Inverse flow modeling based on the assumption of equivalent porous media is used to estimate hydraulic conductivity and specific storage and to identify those fracture-zone geometries that are consistent with hydraulic test data.
Xin-Ze, Lu; Gui-Fang, Shao; Liang-You, Xu; Tun-Dong, Liu; Yu-Hua, Wen
2016-05-01
Alloy nanoparticles exhibit higher catalytic activity than monometallic nanoparticles, and their stable structures are of importance to their applications. We employ the simulated annealing algorithm to systematically explore the stable structure and segregation behavior of tetrahexahedral Pt–Pd–Cu–Au quaternary alloy nanoparticles. Three alloy nanoparticles consisting of 443 atoms, 1417 atoms, and 3285 atoms are considered and compared. The preferred positions of atoms in the nanoparticles are analyzed. The simulation results reveal that Cu and Au atoms tend to occupy the surface, Pt atoms preferentially occupy the middle layers, and Pd atoms tend to segregate to the inner layers. Furthermore, Au atoms present stronger surface segregation than Cu ones. This study provides a fundamental understanding of the structural features and segregation phenomena of multi-metallic nanoparticles. Project supported by the National Natural Science Foundation of China (Grant Nos. 51271156, 11474234, and 61403318) and the Natural Science Foundation of Fujian Province of China (Grant Nos. 2013J01255 and 2013J06002).
A Multi-Operator Based Simulated Annealing Approach For Robot Navigation in Uncertain Environments
Hui Miao
2010-04-01
Optimization methods such as simulated annealing (SA) and genetic algorithms (GA) are used for solving optimization problems. However, the computational processing time is crucial for real-time applications such as mobile robots. A multi-operator based SA approach, incorporating four additional mathematical operators, that can find the optimal path for robots in dynamic environments is proposed in this paper. It requires less computation time while giving better trade-offs among simplicity, far-field accuracy, and computational cost. The contributions of the work include the implementation of the simulated annealing algorithm for robot path planning in dynamic environments, and an enhanced new path planner for improving the efficiency of the path planning algorithm. The simulation results are compared with the previously published classic SA approach and the GA approach. The multi-operator based SA (MSA) approach is demonstrated through case studies not only to be effective in obtaining the optimal solution but also to be more efficient in both off-line and on-line processing for robot dynamic path planning.
H. Gohari
2016-08-01
Investigation of cutting forces and vibrations has critical significance in analyzing and understanding machining processes, as it can provide more details about cutting tool life and surface quality and integrity. The purpose of this work is to find the optimal milling process parameters in order to reduce the effect of the forced vibrations induced by the cutting process. Minimizing the cutting force fluctuation can result in a constant deflection during the milling process, which can lead to eliminating the chatter and resonance phenomena during machining. In order to determine the optimal process parameters, cutter diameter, helix angle, and depth of cut have been considered as input design factors, and the average surface roughness as a machining characteristic which can be used to evaluate the induced cutting vibrations. Experimental tests have been performed based on the Taguchi experimental design method. A mathematical regression model has been developed and used as an objective function in the simulated annealing algorithm for the process optimization. In addition, analysis of variance (ANOVA) has been implemented to find the most significant parameters and the optimal parameter levels. Another technique, based on the mechanistic cutting force model, has also been applied in order to simulate the cutting forces, which indicate the force fluctuations at the optimal parameter levels. The results from the previous three techniques show the same optimal milling parameters, which can be used in designing new tools in order to eliminate the effect of chatter and forced vibrations.
Highlights: • SA- and GA-based optimization of the loading pattern has been carried out. • The LEOPARD and MCRAC codes for a typical PWR have been used. • At high annealing rates, SA shows premature convergence. • Novel crossover and mutation operators are therefore proposed in this work. • Genetic Algorithms exhibit stagnation for small population sizes. - Abstract: A comparative study of Simulated Annealing and Genetic Algorithms based optimization of the loading pattern, with power profile flattening as the goal, has been carried out using the LEOPARD and MCRAC neutronic codes for a typical 300 MWe PWR. At high annealing rates, Simulated Annealing exhibited a tendency towards premature convergence, while at low annealing rates it failed to converge to the global minimum. New ‘batch composition preserving’ (bcp) Genetic Algorithms with novel crossover and mutation operators are proposed in this work which, consistent with earlier findings (Yamamoto, 1997), for small population sizes require computational effort comparable to Simulated Annealing with medium annealing rates. However, Genetic Algorithms exhibit stagnation for small population sizes. A hybrid Genetic Algorithms/Simulated Annealing scheme is proposed that utilizes an inner Simulated Annealing layer for further evolution of the population at the stagnation point. The hybrid scheme has been found to escape stagnation in bcp Genetic Algorithms and converge to the global minimum with about 51% more computational effort for small population sizes.
Active neutron measurements, such as the Differential Die-Away (DDA) technique involving a pulsed neutron generator, are widely applied to determine the fissile content of waste packages. Unfortunately, the main drawback of such techniques comes from the lack of knowledge of the waste matrix composition. Thus, matrix effect correction for the DDA measurement is an essential improvement in the field of fissile material content determination. Different solutions have been developed to compensate for the effect of the matrix on the neutron measurement interpretation. In this context, this paper describes an innovative matrix correction method we have developed with the goal of increasing the accuracy of the matrix effect correction and reducing the measurement time. The implementation of this method is based on the analysis of the raw signal with an optimisation algorithm, the simulated annealing algorithm. This algorithm needs a reference database of Multi-Channel Scaling (MCS) spectra to fit the raw signal. The construction of the MCS library involves a learning phase to define and acquire the DDA signals. This database has been provided by a set of active signals from experimental matrices (mock-up waste drums of 118 litres) recorded in a specific device dedicated to neutron measurement research and development at the Nuclear Measurement Laboratory of CEA-Cadarache, called PROMETHEE 6. The simulated annealing algorithm is applied to make use of the effect of the matrices on the total active signal of the DDA measurement. Furthermore, as this algorithm is directly applied to the raw active signal, it is very useful when active background contributions cannot be easily estimated and removed. Most of the cases tested during this work, which represents the feasibility phase of the method, are within a 4% agreement interval with the expected experimental value. Moreover, one can notice that without any compensation of the matrix effect, the classical DDA prompt
Quantum Annealing for Clustering
Kurihara, Kenichi; Tanaka, Shu; Miyashita, Seiji
2014-01-01
This paper studies quantum annealing (QA) for clustering, which can be seen as an extension of simulated annealing (SA). We derive a QA algorithm for clustering and propose an annealing schedule, which is crucial in practice. Experiments show the proposed QA algorithm finds better clustering assignments than SA. Furthermore, QA is as easy as SA to implement.
A permutation based simulated annealing algorithm to predict pseudoknotted RNA secondary structures.
Tsang, Herbert H; Wiese, Kay C
2015-01-01
Pseudoknots are RNA tertiary structures which perform essential biological functions. This paper discusses SARNA-Predict-pk, an RNA pseudoknotted secondary structure prediction algorithm based on Simulated Annealing (SA). The research presented here extends previous work on SARNA-Predict by extending the algorithm to include prediction of RNA secondary structures with pseudoknots. An evaluation of the performance of SARNA-Predict-pk in terms of prediction accuracy is made via comparison with several state-of-the-art prediction algorithms, using 20 individual known structures from seven RNA classes. We measured the sensitivity and specificity of nine prediction algorithms. Three of these are dynamic programming algorithms: Pseudoknot (pknotsRE), NUPACK, and pknotsRG-mfe. One uses a statistical clustering approach: Sfold. The other five are heuristic algorithms: SARNA-Predict-pk, ILM, STAR, IPknot, and HotKnots. The results presented in this paper demonstrate that SARNA-Predict-pk can outperform other state-of-the-art algorithms in terms of prediction accuracy. This supports the use of the proposed method for pseudoknotted RNA secondary structure prediction of other known structures. PMID:26558299
Simulated Annealing-Based Ant Colony Algorithm for Tugboat Scheduling Optimization
Qi Xu
2012-01-01
As the “first service station” for ships in the whole port logistics system, the tugboat operation system is one of the most important systems in port logistics. This paper formulates the tugboat scheduling problem as a multiprocessor task scheduling problem (MTSP) after analyzing the characteristics of tugboat operation. The model considers factors of multiple anchorage bases, different operation modes, and three stages of operations (berthing/shifting-berth/unberthing). The objective is to minimize the total operation time for all tugboats in a port. A hybrid simulated annealing-based ant colony algorithm is proposed to solve the addressed problem. Numerical experiments without the shifting-berth operation verified the effectiveness of the method and pointed out that more effective sailing may be possible if tugboats return to the anchorage base in a timely manner. Experiments with the shifting-berth operation show that the objective is most sensitive to the proportion of shifting-berth operations, is influenced slightly by the tugboat deployment scheme, and is not sensitive to handling operation times.
A. Mateos
2016-01-01
Technological advances are required to accommodate air traffic control systems for the future growth of air traffic. In particular, detection and resolution of conflicts between aircraft is a problem that has attracted much attention in the last decade, becoming vital for improving safety standards in free-flight unstructured environments. We propose using the archive simulated annealing-based multiobjective optimization algorithm to deal with this problem, accounting for three admissible maneuvers (velocity, turn, and altitude changes) in a multiobjective context. The minimization of the maneuver number and magnitude, time delays, or deviations in the leaving points are considered for analysis. The optimal values for the algorithm parameter set are identified for the most complex instance, in which all aircraft have conflicts with each other, for 5, 10, and 20 aircraft. Moreover, the performance of the proposed approach is analyzed by means of a comparison with the Pareto front, computed using brute force for 5 aircraft, and the algorithm is also illustrated with a random instance with 20 aircraft.
Fast simulated annealing inversion of surface waves on pavement using phase-velocity spectra
Ryden, N.; Park, C.B.
2006-01-01
The conventional inversion of surface waves depends on modal identification of measured dispersion curves, which can be ambiguous. It is possible to avoid mode-number identification and extraction by inverting the complete phase-velocity spectrum obtained from a multichannel record. We use the fast simulated annealing (FSA) global search algorithm to minimize the difference between the measured phase-velocity spectrum and that calculated from a theoretical layer model, including the field setup geometry. Results show that this algorithm can help one avoid getting trapped in local minima while searching for the best-matching layer model. The entire procedure is demonstrated on synthetic and field data for asphalt pavement. The viscoelastic properties of the top asphalt layer are taken into account, and the inverted asphalt stiffness as a function of frequency compares well with laboratory tests on core samples. The thickness and shear-wave velocity of the deeper embedded layers are resolved within 10% deviation from those values measured separately during pavement construction. The proposed method may be equally applicable to normal soil site investigation and in the field of ultrasonic testing of materials. © 2006 Society of Exploration Geophysicists.
Equilibrium properties of transition-metal ion-argon clusters via simulated annealing
Asher, Robert L.; Micha, David A.; Brucat, Philip J.
1992-01-01
The geometrical structures of M(+) (Ar)n ions, with n = 1-14, have been studied by the minimization of a many-body potential surface with a simulated annealing procedure. The minimization method is justified for finite systems through the use of an information theory approach. It is carried out for eight potential-energy surfaces constructed with two- and three-body terms parametrized from experimental data and ab initio results. The potentials should be representative of clusters of argon atoms with first-row transition-metal monocations of varying size. The calculated geometries for M(+) = Co(+) and V(+) possess radial shells with small (ca. 4-8) first-shell coordination number. The inclusion of an ion-induced-dipole-ion-induced-dipole interaction between argon atoms raises the energy and generally lowers the symmetry of the cluster by promoting incomplete shell closure. Rotational constants as well as electric dipole and quadrupole moments are quoted for the Co(+) (Ar)n and V(+) (Ar)n predicted structures.
Multiobjective Simulated Annealing-Based Clustering of Tissue Samples for Cancer Diagnosis.
Acharya, Sudipta; Saha, Sriparna; Thadisina, Yamini
2016-03-01
In the field of pattern recognition, the study of the gene expression profiles of different tissue samples over different experimental conditions has become feasible with the arrival of microarray-based technology. In cancer research, classification of tissue samples is necessary for cancer diagnosis, which can be done with the help of microarray technology. In this paper, we have presented a multiobjective optimization (MOO)-based clustering technique utilizing archived multiobjective simulated annealing (AMOSA) as the underlying optimization strategy for classification of tissue samples from cancer datasets. The presented clustering technique is evaluated on three open-source benchmark cancer datasets [brain tumor dataset, adult malignancy, and Small Round Blue Cell Tumors (SRBCT)]. In order to evaluate the quality or goodness of the produced clusters, two cluster quality measures, viz., adjusted Rand index and classification accuracy (%CoA), are calculated. Comparative results of the presented clustering algorithm with ten state-of-the-art existing clustering techniques are shown for the three benchmark datasets. Also, we have conducted a statistical significance test (t-test) to prove the superiority of the presented MOO-based clustering technique over other clustering techniques. Moreover, significant gene markers have been identified and demonstrated visually from the clustering solutions obtained. This study can have an important impact in the field of cancer subtype prediction. PMID:25706936
The determination of magnetic structures from neutron diffraction data is often carried out by trial and error. Much time is wasted on the examination of structures that are in fact symmetry forbidden. The technique of Representation Analysis (RA) [1] uses simple matrix calculations to provide model magnetic structures, but has fallen into disuse because of its tedious nature. New Windows-based code performs these calculations automatically. Integration with refinement packages based on Simulated Annealing (SA) [2], a.k.a. Reverse Monte Carlo (RMC), algorithms allows these models to be sequentially fitted against diffraction data. Magnetic structures also lend themselves well to the use of Genetic Algorithms (GA), and these techniques can be exploited to increase the symmetry of the refined structures. The best models determined can then be examined using conventional least-squares methods and software. The combination of SA, RA and GA creates a powerful new protocol for the determination of magnetic structures. (author) [1] E.F. Bertaut, Acta Cryst. A 24, 217 (1968); [2] R.L. McGreevy in Structural Modelling in Inorganic Crystallography, ed. R.A. Catlow, Academic Press (1997)
An interactive system for creating object models from range data based on simulated annealing
Hoff, W.A.; Hood, F.W.; King, R.H. [Colorado School of Mines, Golden, CO (United States). Center for Robotics and Intelligent Systems
1997-05-01
In hazardous applications such as remediation of buried waste and dismantlement of radioactive facilities, robots are an attractive solution. Sensing to recognize and locate objects is a critical need for robotic operations in unstructured environments. An accurate 3-D model of objects in the scene is necessary for efficient high level control of robots. Drawing upon concepts from supervisory control, the authors have developed an interactive system for creating object models from range data, based on simulated annealing. Site modeling is a task that is typically performed using purely manual or autonomous techniques, each of which has inherent strengths and weaknesses. However, an interactive modeling system combines the advantages of both manual and autonomous methods, to create a system that has high operator productivity as well as high flexibility and robustness. The system is unique in that it can work with very sparse range data, tolerate occlusions, and tolerate cluttered scenes. The authors have performed an informal evaluation with four operators on 16 different scenes, and have shown that the interactive system is superior to either manual or automatic methods in terms of task time and accuracy.
Finding a Hadamard Matrix by Simulated Annealing of Spin-Vectors
Suksmono, Andriyan Bayu
2016-01-01
Reformulation of a combinatorial problem into optimization of a statistical-mechanics system enables finding a better solution using heuristics derived from a physical process, such as simulated annealing (SA). In this paper, we present a Hadamard matrix (H-matrix) searching method based on SA on an Ising model. By equivalence, an H-matrix can be converted into an SH (semi-normalized Hadamard) matrix, whose first column is the unity vector and whose remaining columns are vectors with an equal number of -1 and +1 entries, called SH-vectors. We define SH spin-vectors to represent the SH vectors, which play the role of the spins in the Ising model. The topology of the lattice is generalized into a graph whose edges represent the orthogonality relationship among the SH spin-vectors. Starting from a randomly generated quasi H-matrix Q, which is a matrix similar to the SH-matrix but without imposing orthogonality, we perform the SA. The transitions of Q are conducted by random exchange of {+,-} spin-pairs within the SH-spin vectors whi...
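The spin-pair exchange search can be sketched for a small matrix. The energy function (sum of squared column inner products), temperatures, and step counts below are plausible assumptions rather than the paper's exact formulation:

```python
import math
import random

def hadamard_sa(n=4, seed=0, max_restarts=50, steps=5000):
    """SA search for an n x n Hadamard matrix in semi-normalized form.

    Columns are stored as lists; the first column is the all-ones vector
    and the rest are balanced +/-1 vectors, as in the abstract. Schedule
    parameters are illustrative assumptions.
    """
    def energy(cols):
        # Sum of squared inner products over distinct column pairs;
        # zero exactly when all columns are mutually orthogonal.
        e = 0
        for i in range(len(cols)):
            for j in range(i + 1, len(cols)):
                e += sum(a * b for a, b in zip(cols[i], cols[j])) ** 2
        return e

    rng = random.Random(seed)
    for _ in range(max_restarts):
        cols = [[1] * n] + [
            rng.sample([1] * (n // 2) + [-1] * (n // 2), n) for _ in range(n - 1)
        ]
        e, t = energy(cols), float(n)
        for _ in range(steps):
            if e == 0:
                return cols            # all columns orthogonal: H-matrix found
            c = rng.randrange(1, n)    # never touch the unity column
            plus = [i for i in range(n) if cols[c][i] == 1]
            minus = [i for i in range(n) if cols[c][i] == -1]
            i, j = rng.choice(plus), rng.choice(minus)
            cols[c][i], cols[c][j] = -1, 1        # exchange a {+,-} spin-pair
            ce = energy(cols)
            if ce <= e or rng.random() < math.exp((e - ce) / t):
                e = ce
            else:
                cols[c][i], cols[c][j] = 1, -1    # undo rejected move
            t *= 0.999
    return None

H = hadamard_sa(4)
```

The spin-pair exchange preserves the balance of each column, so every visited state stays a valid quasi H-matrix, and the search only has to drive the orthogonality energy to zero.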
Combined Simulated Annealing and Genetic Algorithm Approach to Bus Network Design
Liu, Li; Olszewski, Piotr; Goh, Pong-Chai
A new method, a combined simulated annealing (SA) and genetic algorithm (GA) approach, is proposed to solve the problem of bus route design and frequency setting for a given road network with fixed bus stop locations and fixed travel demand. The method involves two steps: a set of candidate routes is generated first, and then the best subset of these routes is selected by the combined SA and GA procedure. SA is the main process searching for a better solution to minimize the total system cost, comprising user and operator costs. GA is used as a sub-process to generate new solutions. Bus demand assignment on two alternative paths is performed at the solution evaluation stage. The method was implemented on four theoretical grid networks of different sizes and a benchmark network. Several GA operators (crossover and mutation) were utilized and tested for their effectiveness. The results show that the proposed method can efficiently converge to the optimal solution on a small network, but computation time increases significantly with network size. The method can also be used for other transport operation management problems.
SFN Coverage Optimization for DVB-T2 Digital TV Transmitters Using the Simulated Annealing Method
Adib Nur Ikhwan
2013-09-01
Digital TV broadcasting in Indonesia initially used the DVB-T (Digital Video Broadcasting-Terrestrial) standard, which in 2012 was replaced by DVB-T2 (Digital Video Broadcasting-Terrestrial Second Generation). Consequently, earlier studies, including digital TV coverage optimization, are no longer relevant. Coverage is one of the important aspects of digital TV broadcasting. In this work, SFN (Single Frequency Network) coverage optimization for digital TV transmitters is carried out with the SA (Simulated Annealing) method. The SA method searches for a solution by moving from one solution to another, selecting the solution with the smallest energy (fitness) function. The SA optimization is performed by varying the positions of the digital TV transmitters to find the best positions. The optimization uses 10 cooling schedules with two test runs each, in both 2K and 4K FFT modes. The result of this study is that the SFN coverage area of the DVB-T2 digital TV transmitters shows an average best relative coverage improvement of 2.348% with cooling schedule 7.
An archived multi-objective simulated annealing for a dynamic cellular manufacturing system
Shirazi, Hossein; Kia, Reza; Javadian, Nikbakhsh; Tavakkoli-Moghaddam, Reza
2014-05-01
To design a group layout of a cellular manufacturing system (CMS) in a dynamic environment, a multi-objective mixed-integer non-linear programming model is developed. The model integrates cell formation, group layout and production planning (PP) as three interrelated decisions involved in the design of a CMS. This paper provides an extensive coverage of important manufacturing features used in the design of CMSs and enhances the flexibility of an existing model in handling the fluctuations of part demands more economically by adding machine depot and PP decisions. Two conflicting objectives to be minimized are the total costs and the imbalance of workload among cells. As the considered objectives in this model are in conflict with each other, an archived multi-objective simulated annealing (AMOSA) algorithm is designed to find Pareto-optimal solutions. Matrix-based solution representation, a heuristic procedure generating an initial and feasible solution, and efficient mutation operators are the advantages of the designed AMOSA. To demonstrate the efficiency of the proposed algorithm, the performance of AMOSA is compared with an exact algorithm (the ε-constraint method) solved by the GAMS software and a well-known evolutionary algorithm, namely NSGA-II, for some randomly generated problems based on some comparison metrics. The obtained results show that the designed AMOSA can obtain satisfactory solutions for the multi-objective model.
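The archive that gives AMOSA its name maintains the non-dominated solutions found so far. A minimal sketch of Pareto dominance and archive updating follows; it is illustrative only, since the full AMOSA algorithm additionally uses clustering to limit archive size and a domination-amount-based acceptance rule:

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, candidate):
    """Insert candidate unless dominated; evict members it dominates."""
    if any(dominates(m, candidate) for m in archive):
        return archive                      # candidate is dominated: reject
    kept = [m for m in archive if not dominates(candidate, m)]
    return kept + [candidate]

# Two objectives, e.g. total cost and workload imbalance (minimize both).
arch = []
for pt in [(3, 5), (4, 4), (2, 6), (3, 3), (5, 1)]:
    arch = update_archive(arch, pt)
# arch now holds only the mutually non-dominated points.
```

Here (3, 3) evicts both (3, 5) and (4, 4), while (2, 6) and (5, 1) survive because each is best on one objective.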
Green Cloud: Smart Resource Allocation and Optimization using Simulated Annealing Technique
Akshat Dhingra
2014-05-01
Cloud computing aims to offer utility-based IT services by interconnecting large numbers of computers through a real-time communication network such as the Internet. There has been a significant increase in the power consumption of the data centres that host Cloud applications, owing to the growing popularity of Cloud computing among organisations in many fields. Hence, there is a need to develop solutions that save energy without compromising much on performance. Such solutions would not only help reduce the carbon footprint but would also cut costs without much compromise on SLA violations, thereby benefitting cloud service providers. In this paper, the simulated annealing optimization technique is used to continuously optimize the placement of VMs (Virtual Machines) over the hosts in order to minimize power consumption, thereby providing cost benefits to the service provider. The results make it evident that using virtualisation at the data-centre level and optimizing virtual resource allocation can significantly reduce the power consumption of the servers.
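The core idea (annealing over VM-to-host assignments under a linear power model) can be sketched as follows; the VM loads, host capacities, and wattages below are illustrative assumptions, not values from the paper's setup:

```python
import math
import random

VM_LOAD = [0.3, 0.2, 0.4, 0.1, 0.25]   # hypothetical CPU demands per VM
HOST_CAP = [1.0, 1.0, 1.0]             # three identical hosts
IDLE_W, PEAK_W = 70.0, 250.0           # assumed linear power model per active host

def power(assign):
    """Total power: idle cost for each active host plus a load-proportional
    term; overloaded (infeasible) assignments are rejected with infinity."""
    total = 0.0
    for h, cap in enumerate(HOST_CAP):
        load = sum(VM_LOAD[v] for v, host in enumerate(assign) if host == h)
        if load > cap:
            return float("inf")
        if load > 0:
            total += IDLE_W + (PEAK_W - IDLE_W) * load
    return total

def anneal_placement(steps=3000, t0=50.0, cooling=0.998, seed=0):
    """Simulated annealing over VM-to-host assignments."""
    rng = random.Random(seed)
    assign = [rng.randrange(len(HOST_CAP)) for _ in VM_LOAD]
    e = power(assign)
    best, best_e = assign[:], e
    t = t0
    for _ in range(steps):
        cand = assign[:]
        cand[rng.randrange(len(cand))] = rng.randrange(len(HOST_CAP))  # move one VM
        e_new = power(cand)
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            assign, e = cand, e_new
            if e < best_e:
                best, best_e = assign[:], e
        t *= cooling  # geometric cooling
    return best, best_e

placement, watts = anneal_placement()
```

With a linear power model, consolidating the 1.25 total load onto the minimum feasible number of hosts (two, here) minimizes the idle-power overhead, which is exactly the consolidation effect the abstract describes.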
Inversion of sonobuoy data from shallow-water sites with simulated annealing
Lindwall, Dennis; Brozena, John
2005-02-01
An enhanced simulated annealing algorithm is used to invert sparsely sampled seismic data collected with sonobuoys to obtain seafloor geoacoustic properties at two littoral marine environments as well as for a synthetic data set. Inversion of field data from a 750-m water-depth site using a water-gun sound source found a good solution, which included a pronounced subbottom reflector, after 6483 iterations over seven variables. Field data from a 250-m water-depth site using an air-gun source required 35,421 iterations for a good inversion solution because 30 variables had to be solved for, including the shot-to-receiver offsets. The sonobuoy-derived compressional wave velocity-depth (Vp-Z) models compare favorably with Vp-Z models derived from nearby, high-quality, multichannel seismic data. There are, however, substantial differences between seafloor reflection coefficients calculated from field models and seafloor reflection coefficients based on commonly used Vp regression curves (gradients). Reflection loss is higher at one field site and lower at the other than predicted from commonly used Vp gradients for terrigenous sediments. In addition, there are strong effects on reflection loss due to the subseafloor interfaces that are also not predicted by Vp gradients.
Jin Qin
2015-01-01
A stochastic multiproduct capacitated facility location problem involving a single supplier and multiple customers is investigated. Due to the stochastic demands, a reasonable amount of safety stock must be kept in the facilities to achieve suitable service levels, which results in increased inventory cost. Assuming that all stochastic demands are normally distributed, a nonlinear mixed-integer programming model is proposed, whose objective is to minimize the total cost, including transportation cost, inventory cost, operation cost, and setup cost. A combined simulated annealing (CSA) algorithm is presented to solve the model, in which the outer-layer subalgorithm optimizes the facility location decision and the inner-layer subalgorithm optimizes the demand allocation based on the determined facility location decision. The results obtained with this approach show that CSA is a robust and practical approach for solving a multiple-product problem, generating suboptimal facility location decisions and inventory policies. Meanwhile, we also found that the transportation cost and the demand deviation have the strongest influence on the optimal decision compared to the others.
Optimal pumping from Palmela water supply wells (Portugal) using simulated annealing
Fragoso, Teresa; Cunha, Maria Da Conceição; Lobo-Ferreira, João P.
2009-12-01
Aquifer systems are an important part of an integrated water resources management plan as foreseen in the European Union’s Water Framework Directive (2000). The sustainable development of these systems demands the use of all available techniques capable of handling the multidisciplinary features of the problems involved. The formulation and resolution of an optimization model is described for a planning and management problem based on the Palmela aquifer (Portugal), developed to supply a given number of demand centres. This problem is solved using one of the latest optimization techniques, the simulated annealing heuristic method, designed to find optimal solutions while avoiding getting trapped in local optima. The solution obtained, providing the well locations and the corresponding pumped flows to supply each centre, is analysed taking into account the objective function components and the constraints. It was found that the operation cost is the biggest share of the final cost, and the choice of wells is greatly affected by this fact. Another conclusion is that the solution takes advantage of economies of scale; that is, it points toward drilling a large-capacity well even if this increases the investment cost, rather than drilling several wells, which together would increase the operation costs.
魏关锋; 姚平经; LUO Xing; ROETZEL Wilfried
2004-01-01
The multi-stream heat exchanger network synthesis (HENS) problem can be formulated as a mixed-integer nonlinear programming model following Yee et al. Its nonconvex nature leads to the existence of more than one optimum and to computational difficulty for traditional algorithms in finding the global optimum. Compared with deterministic algorithms, evolutionary computation provides a promising approach to tackle this problem. In this paper, a mathematical model of the multi-stream heat exchanger network synthesis problem is set up. Unlike Yee et al., who assume isothermal mixing of stream splits and thus obtain linear constraints, non-isothermal mixing is supported here. As a consequence, nonlinear constraints result and the objective function becomes nonconvex. To solve the mathematical model, an algorithm named GA/SA (parallel genetic/simulated annealing algorithm) is detailed for application to the multi-stream heat exchanger network synthesis problem. The performance of the proposed approach is demonstrated with three examples, and the obtained solutions indicate that the presented approach is effective for multi-stream HENS.
Selecting Magnet Laminations Recipes Using the Method of Simulated Annealing
Russell, A. D.; Baiod, R.; Brown, B. C.; Harding, D. J.; Martin, P. S.
1997-05-01
The Fermilab Main Injector project is building 344 dipoles using more than 7000 tons of steel. Budget and logistical constraints required that steel production, lamination stamping and magnet fabrication proceed in parallel. There were significant run-to-run variations in the magnetic properties of the steel (Martin, P.S., et al., Variations in the Steel Properties and the Excitation Characteristics of FMI Dipoles, this conference). The large lamination size (>0.5 m coil opening) resulted in variations of gap height due to differences in stress relief in the steel after stamping. To minimize magnet-to-magnet strength and field shape variations the laminations were shuffled based on the available magnetic and mechanical data and assigned to magnets using a computer program based on the method of simulated annealing. The lamination sets selected by the program have produced magnets which easily satisfy the design requirements. Variations of the average magnet gap are an order of magnitude smaller than the variations in lamination gaps. This paper discusses observed gap variations, the program structure and the strength uniformity results.
Chiapetto, M. [SCK-CEN, Nuclear Materials Science Institute, Mol (Belgium); Unite Materiaux et Transformations (UMET), UMR 8207, Universite de Lille 1, ENSCL, Villeneuve d'Ascq (France); Becquart, C.S. [Unite Materiaux et Transformations (UMET), UMR 8207, Universite de Lille 1, ENSCL, Villeneuve d'Ascq (France); Laboratoire commun EDF-CNRS, Etude et Modelisation des Microstructures pour le Vieillissement des Materiaux (EM2VM) (France); Domain, C. [EDF R&D, Departement Materiaux et Mecanique des Composants, Les Renardieres, Moret sur Loing (France); Laboratoire commun EDF-CNRS, Etude et Modelisation des Microstructures pour le Vieillissement des Materiaux (EM2VM) (France); Malerba, L. [SCK-CEN, Nuclear Materials Science Institute, Mol (Belgium)]
2015-01-01
Post-irradiation annealing experiments are often used to obtain clearer information on the nature of defects produced by irradiation. However, their interpretation is not always straightforward without the support of physical models. We apply here a physically-based set of parameters for object kinetic Monte Carlo (OKMC) simulations of the nanostructural evolution of FeMnNi alloys under irradiation to the simulation of their post-irradiation isochronal annealing, from 290 to 600 °C. The model adopts a "grey alloy" scheme, i.e. the solute atoms are not introduced explicitly, only their effect on the properties of point-defect clusters is. Namely, it is assumed that both vacancy and SIA clusters are significantly slowed down by the solutes. The slowing down increases with size until the clusters become immobile. Specifically, the slowing down of SIA clusters by Mn and Ni can be justified in terms of the interaction between these atoms and crowdions in Fe. The results of the model compare quantitatively well with post-irradiation isochronal annealing experimental data, providing clear insight into the mechanisms that determine the disappearance or re-arrangement of defects as functions of annealing time and temperature. (copyright 2015 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)
Molecular dynamics simulations of solid state recrystallization and grain growth in iron nanoparticles containing 1436 atoms were carried out. During the period of relaxation of supercooled liquid drops and during thermal annealing of the solids they froze to, changes in disorder were followed by monitoring changes in energy and the migration of grain boundaries. All 27 polycrystalline nanoparticles, which were generated with different grain boundaries, were observed to recrystallize into single crystals during annealing. Larger grains consumed the smaller ones. In particular, two sets of solid particles, designated as A and B, each with two grains, were treated to generate 18 members of each set with different thermal histories. This provided small ensembles (of 18 members each) from which the rates at which the larger grain engulfed the smaller one could be determined. The rate was higher the smaller the degree of misorientation between the grains, a result contrary to the general rule based on published experiments, but the reason was clear. Crystal A, which happened to have a somewhat lower angle of misorientation, also had a higher population of defects, as confirmed by its higher energy. Accordingly, its driving force to recrystallize was greater. Although the mechanism of recrystallization is commonly called nucleation, our results, which probe the system on an atomic scale, were not able to identify nuclei unequivocally. By contrast, our technique can and does reveal nuclei in the freezing of liquids and in transformations from one solid phase to another. An alternative rationale for a nucleation-like process in our results is proposed. Graphical Abstract: Time dependence of energy per atom in the quenching of liquid nanoparticles A–C of iron. Nanoparticle C freezes directly into a single crystal but A and B freeze to solids with two grains. A and B eventually recrystallize into single crystals.
Sulaiman, Salina; Bade, Abdullah; Lee, Rechard; Tanalol, Siti Hasnah
2014-07-01
The Mass Spring Model (MSM) is highly efficient in terms of calculation and easy to implement. Mass, spring stiffness coefficient and damping constant are the three major components of an MSM. This paper focuses on identifying the spring stiffness coefficient and damping constant using an automated tuning method based on optimization, in order to generate a human liver model capable of responding quickly. To achieve this objective, two heuristic approaches are used, namely Simulated Annealing (SA) and a Genetic Algorithm (GA), on the human liver model data set. The mechanical properties taken into consideration are anisotropy and viscoelasticity. Optimization results from SA and GA are then implemented in the MSM to model two human livers, each with its SA or GA construction parameters. These techniques are applied using FEM construction parameters as the benchmark. The step-size responses of both models were obtained after the MSMs were solved using the Fourth-Order Runge-Kutta method (RK4) to compare the elasticity responses of the models. Remodelling time using manual calculation was compared against the heuristic optimization methods SA and GA, showing that a model with automatic construction is more realistic in terms of real-time interaction response time. Liver models generated using the SA and GA optimization techniques are compared with a liver model from manual calculation; the reconstruction time required for 1000 repetitions of SA and GA is faster than with the manual method. Meanwhile, comparison of the construction times of the SA and GA models indicates that the SA model is faster than GA by 0.110635 seconds per 1000 repetitions. Real-time interaction with mechanical properties depends on the rate of time and the speed of the remodelling process. Thus, SA and GA have proven suitable for enhancing the realism of simulated real-time interaction in liver remodelling.
Afanasiev, M.; Pratt, R. G.; Kamei, R.; McDowell, G.
2012-12-01
Crosshole seismic tomography has been used by Vale to provide geophysical images of mineralized massive sulfides in the Eastern Deeps deposit at Voisey's Bay, Labrador, Canada. To date, these data have been processed using traveltime tomography, and we seek to improve the resolution of these images by applying acoustic Waveform Tomography. Due to the computational cost of acoustic waveform modelling, local descent algorithms are employed in Waveform Tomography; due to non-linearity an initial model is required which predicts first-arrival traveltimes to within a half-cycle of the lowest frequency used. Because seismic velocity anisotropy can be significant in hardrock settings, the initial model must quantify the anisotropy in order to meet the half-cycle criterion. In our case study, significant velocity contrasts between the target massive sulfides and the surrounding country rock led to difficulties in generating an accurate anisotropy model through traveltime tomography, and our starting model for Waveform Tomography failed the half-cycle criterion at large offsets. We formulate a new, semi-global approach for finding the best-fit 1-D elliptical anisotropy model using simulated annealing. Through random perturbations to Thomsen's ε parameter, we explore the L2 norm of the frequency-domain phase residuals in the space of potential anisotropy models: if a perturbation decreases the residuals, it is always accepted, but if a perturbation increases the residuals, it is accepted with probability P = exp(-(Ei - E)/T). This is the Metropolis criterion, where Ei is the value of the residuals at the current iteration, E is the value of the residuals for the previously accepted model, and T is a probability control parameter, which is decreased over the course of the simulation via a preselected cooling schedule. Convergence to the global minimum of the residuals is guaranteed only for infinitely slow cooling, but in practice good results are obtained from a variety
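The acceptance rule quoted above is the standard Metropolis criterion. A minimal sketch, with a hypothetical 1-D energy function and geometric cooling standing in for the authors' phase-residual objective:

```python
import math
import random

def metropolis_accept(e_new, e_old, temperature, rng=random.random):
    """Metropolis criterion: always accept a perturbation that lowers the
    residual; otherwise accept with probability P = exp(-(E_i - E)/T)."""
    if e_new <= e_old:
        return True
    return rng() < math.exp(-(e_new - e_old) / temperature)

def anneal(energy, x0, perturb, t0=1.0, cooling=0.95, steps=500, seed=42):
    """Generic simulated annealing loop with a geometric cooling schedule."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    t = t0
    for _ in range(steps):
        cand = perturb(x, rng)
        e_cand = energy(cand)
        if metropolis_accept(e_cand, e, t, rng.random):
            x, e = cand, e_cand
            if e < best_e:
                best_x, best_e = x, e
        t *= cooling  # preselected cooling schedule
    return best_x, best_e

# Toy 1-D example: minimize (x - 3)^2 starting far from the optimum.
best, residual = anneal(lambda x: (x - 3.0) ** 2, -10.0,
                        lambda x, r: x + r.uniform(-1, 1))
```

Accepting some uphill moves while the temperature is high is what lets the search escape local minima; as T decreases, the loop degenerates into greedy descent, which is why the cooling rate matters so much in practice.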
Self-adaptive genetic algorithms with simulated binary crossover.
Deb, K; Beyer, H G
2001-01-01
Self-adaptation is an essential feature of natural evolution. However, in the context of function optimization, self-adaptation features of evolutionary search algorithms have been explored mainly with evolution strategy (ES) and evolutionary programming (EP). In this paper, we demonstrate the self-adaptive feature of real-parameter genetic algorithms (GAs) using a simulated binary crossover (SBX) operator and without any mutation operator. The connection between the working of self-adaptive ESs and real-parameter GAs with the SBX operator is also discussed. Thereafter, the self-adaptive behavior of real-parameter GAs is demonstrated on a number of test problems commonly used in the ES literature. The remarkable similarity in the working principle of real-parameter GAs and self-adaptive ESs shown in this study suggests the need for emphasizing further studies on self-adaptive GAs. PMID:11382356
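The SBX operator discussed above has a standard published form (Deb & Agrawal, 1995); the sketch below implements that textbook formula for a single real-valued gene, with the distribution index eta = 2.0 as an illustrative default rather than a value taken from this paper:

```python
import random

def sbx_pair(p1, p2, eta=2.0, rng=random.random):
    """Simulated binary crossover (SBX) for one real-valued gene.

    The spread factor beta is sampled from a polynomial distribution so that
    the offspring spread mimics single-point crossover on binary strings;
    larger eta keeps children closer to their parents."""
    u = rng()
    if u <= 0.5:
        beta = (2.0 * u) ** (1.0 / (eta + 1.0))
    else:
        beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta + 1.0))
    c1 = 0.5 * ((1.0 + beta) * p1 + (1.0 - beta) * p2)
    c2 = 0.5 * ((1.0 - beta) * p1 + (1.0 + beta) * p2)
    return c1, c2

# The children are always symmetric about the parents' mean, so crossover
# alone preserves the population mean; the spread of the population adapts.
c1, c2 = sbx_pair(1.0, 3.0)
```

This mean-preserving, spread-adapting behaviour is the property the abstract connects to self-adaptation in evolution strategies.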
Jun Wang
2015-01-01
The paper proposes a novel function expression method to forecast chaotic time series, using an improved genetic-simulated annealing (IGSA) algorithm to establish the optimal function expression that describes the behavior of the time series. In order to deal with the weaknesses associated with the genetic algorithm, the proposed algorithm incorporates the simulated annealing operation, which has strong local search ability, into the genetic algorithm to enhance the performance of optimization; in addition, the fitness function and genetic operators are also improved. Finally, the method is applied to the chaotic time series of the Quadratic and Rossler maps for validation. The effect of noise in the chaotic time series is also studied numerically. The numerical results verify that the method can forecast chaotic time series with high precision and effectiveness, and the forecasting precision under a certain level of noise is also satisfactory. It can be concluded that the IGSA algorithm is efficient and superior.
Assessment of a fuzzy based flood forecasting system optimized by simulated annealing
Reyhani Masouleh, Aida; Pakosch, Sabine; Disse, Markus
2010-05-01
Flood forecasting is an important tool to mitigate the harmful effects of floods. Among the many different approaches to forecasting, Fuzzy Logic (FL) is one that has been increasingly applied over the last decade. This method is principally based on the linguistic description of Rule Systems (RS). A RS is a specific combination of membership functions of input and output variables. Setting up the RS can be done either automatically or manually, and this choice can strongly influence the resulting rule systems. It is therefore the objective of this study to assess the influence that the parameters of an automated rule generation based on Simulated Annealing (SA) have on the resulting RS. The study area is the upper Main River area, located in the northern part of Bavaria, Germany. Data from the Mainleus gauge, with a catchment area of 1165 km2, were investigated over the period 1984 to 2004. The highest observed discharge of 357 m3/s was recorded in 1995. The input arguments of the FL model were daily precipitation, forecasted precipitation, antecedent precipitation index, temperature and melting rate. The FL model of this study has one output variable, daily discharge, and was independently set up for three different forecast lead times, namely one, two and three days ahead. In total, each RS comprised 55 rules, and all input and output variables were represented by five sets of trapezoidal and triangular fuzzy numbers. Simulated Annealing, a converging optimization algorithm, was applied to optimize the RSs in this study. In order to assess the influence of its parameters (number of iterations, temperature decrease rate, initial value for generating random numbers, initial temperature and two other parameters), they were individually varied while keeping the others fixed. With each of the resulting parameter sets, a fully automatic SA was applied to obtain optimized fuzzy rule systems for flood forecasting. Evaluation of the performance of the
Maurer Till
2005-04-01
Background: We have developed the program PERMOL for semi-automated homology modeling of proteins. It is based on restrained molecular dynamics using a simulated annealing protocol in torsion angle space. As the main restraints defining the optimal local geometry of the structure, weighted mean dihedral angles and their standard deviations are used, which are calculated with an algorithm described earlier by Döker et al. (1999, BBRC, 257, 348–350). The overall long-range contacts are established via a small number of distance restraints between atoms involved in hydrogen bonds and backbone atoms of conserved residues. Employing the restraints generated by PERMOL, three-dimensional structures are obtained using standard molecular dynamics programs such as DYANA or CNS. Results: To test this modeling approach, it has been used for predicting the structure of the histidine-containing phosphocarrier protein HPr from E. coli and the structure of the human peroxisome proliferator activated receptor γ (Ppar γ). The divergence between the modeled HPr and the previously determined X-ray structure was comparable to the divergence between the X-ray structure and the published NMR structure. The modeled structure of Ppar γ was also very close to the previously solved X-ray structure, with an RMSD of 0.262 nm for the backbone atoms. Conclusion: In summary, we present a new method for homology modeling capable of producing high-quality structure models. An advantage of the method is that it can be used in combination with incomplete NMR data to obtain reasonable structure models in accordance with the experimental data.
Automated integration of genomic physical mapping data via parallel simulated annealing
Slezak, T.
1994-06-01
The Human Genome Center at the Lawrence Livermore National Laboratory (LLNL) is nearing closure on a high-resolution physical map of human chromosome 19. We have built automated tools to assemble 15,000 fingerprinted cosmid clones into 800 contigs with minimal spanning paths identified. These islands are being ordered, oriented, and spanned by a variety of other techniques including: Fluorescence In Situ Hybridization (FISH) at 3 levels of resolution, ECO restriction fragment mapping across all contigs, and a multitude of different hybridization and PCR techniques to link cosmid, YAC, AC, PAC, and P1 clones. The FISH data provide us with partial order and distance data as well as orientation. We made the observation that map builders need a much rougher presentation of data than do map readers; the former wish to see raw data since these can expose errors or interesting biology. We further noted that by ignoring our length and distance data we could simplify our problem into one that could be readily attacked with optimization techniques. The data integration problem could then be seen as an M x N ordering of our N cosmid clones which "intersect" M larger objects, by defining "intersection" to mean either contig/map membership or hybridization results. Clearly, the goal of making an integrated map is now to rearrange the N cosmid clone "columns" such that the number of gaps in the object "rows" is minimized. Our FISH partially-ordered cosmid clones provide us with a set of constraints that cannot be violated by the rearrangement process. We solved the optimization problem via simulated annealing performed on a network of 40+ Unix machines in parallel, using a server/client model built on explicit socket calls. For current maps we can create a map in about 4 hours on the parallel net versus 4+ days on a single workstation. Our biologists are now using this software on a daily basis to guide their efforts toward final closure.
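The gap-minimization formulation described above (rearranging clone columns so the 1s in each object row form as few separate runs as possible) can be illustrated with a toy annealer; the membership matrix and simple swap move below are illustrative stand-ins, not LLNL's production code or data:

```python
import math
import random

# Toy membership matrix: rows are larger objects (contigs/probe results),
# columns are clones; 1 means the clone "intersects" the object.
MATRIX = [
    [1, 0, 1, 0, 1],
    [0, 1, 0, 1, 0],
    [1, 1, 0, 0, 1],
]

def gap_count(order):
    """Total gaps (breaks between runs of 1s) across all rows, for a given
    left-to-right ordering of the clone columns."""
    gaps = 0
    for row in MATRIX:
        runs, prev = 0, 0
        for col in order:
            if row[col] and not prev:
                runs += 1
            prev = row[col]
        gaps += max(runs - 1, 0)
    return gaps

def anneal_order(n_cols, steps=2000, t0=2.0, cooling=0.995, seed=1):
    """Anneal over column orderings using random pairwise swaps."""
    rng = random.Random(seed)
    order = list(range(n_cols))
    e = gap_count(order)
    best, best_e = order[:], e
    t = t0
    for _ in range(steps):
        i, j = rng.sample(range(n_cols), 2)
        order[i], order[j] = order[j], order[i]   # propose a column swap
        e_new = gap_count(order)
        if e_new <= e or rng.random() < math.exp((e - e_new) / t):
            e = e_new
            if e < best_e:
                best, best_e = order[:], e
        else:
            order[i], order[j] = order[j], order[i]  # undo rejected swap
        t *= cooling
    return best, best_e

order, gaps = anneal_order(5)
```

For this small matrix a zero-gap ordering exists (the rows' 1s can all be made contiguous), and the annealer finds one; the real problem adds FISH partial-order constraints that any proposed rearrangement must respect.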
Adaptive kernel methods to simulate quantum phase space flow
H. López
2006-01-01
A technique for simulating quantum dynamics in phase space is discussed. It makes use of ensembles of classical trajectories to approximate the distribution functions and their derivatives by implementing Adaptive Kernel Density Estimation. It is found to improve the accuracy and stability of the simulations compared to more conventional particle methods. Formulation of the method in higher dimensions is straightforward.
Lee, Cheng-Kuang
2014-12-10
© 2014 American Chemical Society. The nanomorphologies of the bulk heterojunction (BHJ) layer of polymer solar cells are extremely sensitive to the electrode materials and thermal annealing conditions. In this work, the correlations of electrode materials, thermal annealing sequences, and resultant BHJ nanomorphological details of the P3HT:PCBM BHJ polymer solar cell are studied by a series of large-scale, coarse-grained (CG) molecular simulations of a system composed of PEDOT:PSS/P3HT:PCBM/Al layers. Simulations are performed for various configurations of electrode materials as well as processing temperatures. The complex CG molecular data are characterized using a novel extension of our graph-based framework to quantify morphology and establish a link between morphology and processing conditions. Our analysis indicates that vertical phase segregation of the P3HT:PCBM blend strongly depends on the electrode material and thermal annealing schedule. A thin P3HT-rich film is formed on the top, regardless of the bottom electrode material, when the BHJ layer is exposed to the free surface during thermal annealing. In addition, preferential segregation of P3HT chains and PCBM molecules toward the PEDOT:PSS and Al electrodes, respectively, is observed. Detailed morphology analysis indicated that, surprisingly, vertical phase segregation does not affect the connectivity of donor/acceptor domains with the respective electrodes. However, the formation of P3HT/PCBM depletion zones next to the P3HT/PCBM-rich zones can be a potential bottleneck for electron/hole transport due to the increase in transport pathway length. Analysis in terms of the fraction of intra- and interchain charge transport revealed that the processing schedule affects the average vertical orientation of polymer chains, which may be crucial for enhanced charge transport, nongeminate recombination, and charge collection. The present study establishes a more detailed link between processing and morphology by combining multiscale molecular
Ravn, Ole
1998-01-01
The paper describes the design considerations and implementational aspects of the Adaptive Blockset for Simulink, which has been developed in a prototype implementation. The concept behind the Adaptive Blockset for Simulink is to bridge the gap between simulation and prototype controller implementation. This is done using the code generation capabilities of Real Time Workshop in combination with C s-function blocks for adaptive control in Simulink. In the paper the design of each group of blocks normally found in adaptive controllers is outlined. The block types are: identification, controller design, controller and state variable filter. The use of the Adaptive Blockset is demonstrated using a simple laboratory setup. Both the use of the blockset for simulation and for rapid prototyping of a real-time controller are shown.
Xiaobei Zhang; Yunhong Ding; Wei Hong; Xinliang Zhang; Dexiu Huang
2009-01-01
A simple approach based on six transfer cells and a simulated annealing algorithm for analyzing and tailoring the spectra of arbitrary microring resonator arrays is presented. Coupling coefficients, ring sizes, and waveguide lengths of microring resonator arrays can be arbitrary in this approach. After developing this approach, several examples are demonstrated and optimized for various configurations of microring resonator arrays. Simulation results show that this approach is intuitive, efficient, and intelligent for applications based on microring resonator arrays.
The heating and cooling curves during the batch annealing process of low carbon steel have been modeled using the finite element technique. This has made it possible to predict the transient thermal profile for every point of the annealed coils, particularly for the hottest and coldest ones. Through experimental measurements, the results have been adequately validated, since good agreement has been found between experimental values and those predicted by the model. Moreover, an Avrami recrystallization model has been coupled to this thermal balance computation. Interrupted annealing experiments have been made by measuring the recrystallized fraction at the extreme points of the coil for different times. These data made it possible to validate the developed recrystallization model through reasonably good numerical-experimental fitting. (Author) 6 refs
Adaptive LES Methodology for Turbulent Flow Simulations
Oleg V. Vasilyev
2008-06-12
Although turbulent flows are common in the world around us, a solution to the fundamental equations that govern turbulence still eludes the scientific community. Turbulence has often been called one of the last unsolved problems in classical physics, yet it is clear that the need to accurately predict the effect of turbulent flows impacts virtually every field of science and engineering. As an example, a critical step in making modern computational tools useful in designing aircraft is to be able to accurately predict the lift, drag, and other aerodynamic characteristics in numerical simulations in a reasonable amount of time. Simulations that take months to years to complete are much less useful to the design cycle. Much work has been done toward this goal (Lee-Rausch et al. 2003, Jameson 2003) and as cost-effective, accurate tools for simulating turbulent flows evolve, we will all benefit from new scientific and engineering breakthroughs. The problem of simulating high-Reynolds-number (Re) turbulent flows of engineering and scientific interest would have been solved with the advent of Direct Numerical Simulation (DNS) techniques if unlimited computing power, memory, and time could be applied to each particular problem. Yet, given the current and near-future computational resources that exist and a reasonable limit on the amount of time an engineer or scientist can wait for a result, the DNS technique will not be useful for more than 'unit' problems for the foreseeable future (Moin & Kim 1997, Jimenez & Moin 1991). The high computational cost of the DNS of three-dimensional turbulent flows results from the fact that they have eddies of significant energy in a range of scales from the characteristic length scale of the flow all the way down to the Kolmogorov length scale. The actual cost of doing a three-dimensional DNS scales as Re^(9/4) due to the large disparity in scales that need to be fully resolved. State-of-the-art DNS calculations of isotropic
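As a quick sanity check on the Re^(9/4) cost scaling mentioned above (treating it as a pure power law, ignoring constants and any model-specific overhead):

```python
def dns_cost_ratio(re_hi, re_lo):
    """Relative DNS cost implied by the Re^(9/4) scaling of resolved scales."""
    return (re_hi / re_lo) ** 2.25

# Raising the Reynolds number by a factor of 100 (e.g. 1e4 -> 1e6)
# multiplies the cost by 100^(9/4) = 10^4.5, i.e. roughly 30,000x.
ratio = dns_cost_ratio(1e6, 1e4)
```

This is the arithmetic behind the claim that DNS remains limited to "unit" problems: two orders of magnitude in Re cost more than four orders of magnitude in compute.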
Adaptive image ray-tracing for astrophysical simulations
Parkin, E. R.
2010-01-01
A technique is presented for producing synthetic images from numerical simulations whereby the image resolution is adapted around prominent features. In so doing, adaptive image ray-tracing (AIR) improves the efficiency of a calculation by focusing computational effort where it is needed most. The results of test calculations show that a factor of ≳4 speed-up, and a commensurate reduction in the number of pixels required in the final image, can be achieved compared to an equivalent calculat...
PASSATA - Object oriented numerical simulation software for adaptive optics
Agapito, G; Esposito, S
2016-01-01
We present the latest version of the PyrAmid Simulator Software for Adaptive opTics Arcetri (PASSATA), an IDL and CUDA based object oriented software developed in the Adaptive Optics group of the Arcetri observatory for Monte-Carlo end-to-end adaptive optics simulations. The original aim of this software was to evaluate the performance of a single conjugate adaptive optics system for a ground-based telescope with a pyramid wavefront sensor. After some years of development, the current version of PASSATA is able to simulate several adaptive optics systems: single conjugate, multi conjugate and ground layer, with Shack-Hartmann and Pyramid wavefront sensors. It can simulate telescopes from the 8 m to the 40 m class, with diffraction-limited and resolved sources at finite or infinite distance from the pupil. The main advantages of this software are the versatility given by the object oriented approach and the speed given by the CUDA implementation of the most computationally demanding routines. We describe the software with its...
A real-time simulation facility for astronomical adaptive optics
Basden, Alastair
2014-01-01
In this paper we introduce the concept of real-time hardware-in-the-loop simulation for astronomical adaptive optics, and present the case for the requirement for such a facility. This real-time simulation, when linked with an adaptive optics real-time control system, provides an essential tool for the validation, verification and integration of the Extremely Large Telescope real-time control systems prior to commissioning at the telescope. We demonstrate that such a facility is crucial for the success of the future extremely large telescopes.
Mahdi Sadeghzadeh
2014-02-01
The Genetic Algorithm is a population-based algorithm that has been applied successfully to many optimization problems. With the increase in computer attacks, the demand for a secure, efficient, and reliable Internet has grown. Cryptology, the science of hidden communication, comprises two branches: cryptography (encryption) and cryptanalysis. In this paper, several attacks on permutation ciphers based on genetic algorithms, tabu search, and simulated annealing are investigated, and the performance of the algorithms is measured and compared.
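As a rough illustration of the simulated-annealing side of such attacks, the sketch below searches the space of permutation keys. The `fitness` function is a placeholder assumption (in a real attack it would score how English-like the decrypted text's n-gram statistics are), not a detail from the paper:

```python
import math
import random

def simulated_annealing_key_search(fitness, n, t0=10.0, cooling=0.995, steps=5000):
    """Search the space of permutation keys of length n, maximizing `fitness`.

    `fitness` maps a permutation (tuple) to a score; in cryptanalysis it
    would measure how closely text decrypted under the key matches the
    statistics of the target language.
    """
    rng = random.Random(0)  # fixed seed so the sketch is reproducible
    key = list(range(n))
    rng.shuffle(key)
    score = fitness(tuple(key))
    best, best_score = list(key), score
    t = t0
    for _ in range(steps):
        i, j = rng.sample(range(n), 2)        # neighbor move: swap two slots
        key[i], key[j] = key[j], key[i]
        new_score = fitness(tuple(key))
        # Accept improvements always; accept worsenings with Boltzmann probability.
        if new_score >= score or rng.random() < math.exp((new_score - score) / t):
            score = new_score
            if score > best_score:
                best, best_score = list(key), score
        else:
            key[i], key[j] = key[j], key[i]   # reject: undo the swap
        t *= cooling                          # geometric cooling schedule
    return best, best_score
```

A toy fitness such as "number of fixed points of the permutation" can stand in for text scoring when experimenting with the sketch.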
MA Li-ming; JIANG Hong; WANG Xiao-chun
2004-01-01
The algorithm is divided into two steps. The first step pre-locates the blank by aligning its centre of gravity and approximate normal vector with those of the destination surfaces, with the largest overlap of the projections of the two objects on a plane perpendicular to the normal vector. The second step optimizes an objective function by means of a gradient-simulated annealing algorithm to get the best matching between a set of distributed points on the blank and the destination surfaces. An example of machining hydroelectric turbine blades is given to verify the effectiveness of the algorithm.
The free defect survival ratio is calculated by "cascade-annealing" computer simulation using the MARLOWE and modified DAIQUIRI codes for various cases of Primary Knock-on Atom (PKA) spectra. The number of subcascades is calculated by a "cut-off" calculation using MARLOWE. The adequacy of these methods is checked by comparing the results with experiments (surface segregation measurements and Transmission Electron Microscope cascade defect observations). The correlation using the weighted average recoil energy as a parameter shows that the saturation of the free defect survival ratio at high PKA energies has a close relation to the cascade splitting into subcascades. (author)
Hydrogenated amorphous silicon films are deposited by CVD onto insulating (silica) substrates for the fabrication of solar cells. 1.5 MeV 4He ERD/RBS was applied to the films, and a self-consistent depth profile of Si and H was obtained for each sample using the simulated annealing (SA) algorithm. The analytical procedure is described in detail, and the confidence limits of the profiles are obtained using the Markov Chain Monte Carlo method, which is a natural extension of the SA algorithm. We show how the results are of great benefit to the growers
A model has been developed for the rapid melting and resolidification of thin Si films induced by excimer-laser annealing. The key feature of this model is its ability to simulate lateral growth and random nucleation. The first component of the model is a set of rules for phase change. The second component is a set of functions for computing the latent heat and the displacement of the solid-liquid interface resulting from the phase change. The third component is an algorithm that allows for random nucleation based on classical nucleation theory. Consequently, the model enables the prediction of lateral growth length (LGL), as well as the calculation of other critical responses of the quenched film such as solid-liquid interface velocity and undercooling. Thin amorphous Si films with thicknesses of 30, 50, and 100 nm were annealed under various laser fluences to completely melt the films. The resulting LGLs were measured using a scanning electron microscope. Using physical parameters that were consistent with previous studies, the simulated LGL values agree well with the experimental results over a wide range of irradiation conditions. Sensitivity analysis was done to demonstrate the behavior of the model with respect to a select number of model parameters. Our simulations suggest that, for a given fluence, controlling the film's quenching rate is essential for increasing LGL. To this end, the model is an invaluable tool for evaluating and choosing irradiation strategies for increasing lateral growth in laser-crystallized silicon films
SimulCAT: Windows Software for Simulating Computerized Adaptive Test Administration
Han, Kyung T.
2012-01-01
Most, if not all, computerized adaptive testing (CAT) programs use simulation techniques to develop and evaluate CAT program administration and operations, but such simulation tools are rarely available to the public. Up to now, several software tools have been available to conduct CAT simulations for research purposes; however, these existing…
Óscar Begambre
2010-01-01
In this research, the Simulated Annealing (SA) algorithm is employed to solve the inverse problem of damage detection in beam-type structures using noise-polluted modal data. The formulation of the objective function for the SA-based optimization procedure is founded on the modified residual force method. The SA used in this study performed better than a Genetic Algorithm (GA) on two difficult benchmark functions reported in the international literature. The proposed structural damage-identification scheme was confirmed and validated numerically using Euler-Bernoulli beam theory and Finite Element Models (FEM) of cantilever and free-free beams.
Adaptive resolution simulation of an atomistic protein in MARTINI water
Zavadlav, Julija; Melo, Manuel Nuno; Marrink, Siewert J.; Praprotnik, Matej
2014-01-01
We present an adaptive resolution simulation of protein G in multiscale water. We couple atomistic water around the protein with mesoscopic water, where four water molecules are represented with one coarse-grained bead, farther away. We circumvent the difficulties that arise from coupling to the coa
Adaptive Sampling Algorithms for Probabilistic Risk Assessment of Nuclear Simulations
Diego Mandelli; Dan Maljovec; Bei Wang; Valerio Pascucci; Peer-Timo Bremer
2013-09-01
Nuclear simulations are often computationally expensive, time-consuming, and high-dimensional with respect to the number of input parameters. Thus, exploring the space of all possible simulation outcomes is infeasible using finite computing resources. During simulation-based probabilistic risk analysis, it is important to discover the relationship between a potentially large number of input parameters and the output of a simulation using as few simulation trials as possible. This is a typical context for performing adaptive sampling, where a few observations are obtained from the simulation, a surrogate model is built to represent the simulation space, and new samples are selected based on the model constructed. The surrogate model is then updated based on the simulation results of the sampled points. In this way, we attempt to gain the most information possible with a small number of carefully selected sampled points, limiting the number of expensive trials needed to understand features of the simulation space. We analyze the specific use case of identifying the limit surface, i.e., the boundaries in the simulation space between system failure and system success. In this study, we explore several techniques for adaptively sampling the parameter space in order to reconstruct the limit surface. We focus on several adaptive sampling schemes. First, we seek to learn a global model of the entire simulation space using prediction models or neighborhood graphs and extract the limit surface as an iso-surface of the global model. Second, we estimate the limit surface by sampling in the neighborhood of the current estimate based on topological segmentations obtained locally. Our techniques draw inspiration from the topological structure known as the Morse-Smale complex. We highlight the advantages and disadvantages of using a global prediction model versus local topological view of the simulation space, comparing several different strategies for adaptive sampling in both
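The adaptive-sampling loop described above (run a few simulations, build a cheap surrogate, then spend the remaining runs near the estimated limit surface) can be sketched minimally as follows. The disc-shaped failure region and the nearest-neighbor "surrogate" are illustrative stand-ins, not the authors' models:

```python
import math
import random

def simulator(x):
    """Stand-in for an expensive simulation: 'success' inside a disc."""
    return x[0] ** 2 + x[1] ** 2 < 0.5

def nearest_labels(x, samples, k=3):
    """Labels of the k labeled points closest to x (a crude local surrogate)."""
    return [lab for _, lab in sorted(samples, key=lambda s: math.dist(x, s[0]))[:k]]

def adaptive_sample(n_init=10, n_adapt=30, seed=1):
    rng = random.Random(seed)
    rand_pt = lambda: (rng.uniform(-1, 1), rng.uniform(-1, 1))
    # Phase 1: a few space-filling (here: random) runs of the simulator.
    samples = [(p, simulator(p)) for p in (rand_pt() for _ in range(n_init))]
    # Phase 2: spend each remaining run where nearby labels disagree,
    # i.e. where a candidate likely straddles the limit surface.
    for _ in range(n_adapt):
        cands = [rand_pt() for _ in range(50)]
        pick = max(cands, key=lambda c: len(set(nearest_labels(c, samples))))
        samples.append((pick, simulator(pick)))
    return samples
```

The disagreement score here plays the role the surrogate's boundary estimate plays in the paper: new runs concentrate along the success/failure frontier rather than being spread uniformly.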
Adaptive thinking & leadership simulation game training for special forces officers.
Raybourn, Elaine Marie; Mendini, Kip (USA JFKSWCS DOTD, Ft. Bragg, NC); Heneghan, Jerry; Deagle, Edwin (USA JFKSWCS DOTD, Ft. Bragg, NC)
2005-07-01
Complex problem-solving approaches and novel strategies employed by the military at the squad, team, and commander level are often best learned experientially. Since live action exercises can be costly, advances in simulation game training technology offer exciting ways to enhance current training. Computer games provide an environment for active, critical learning. Games open up possibilities for simultaneous learning on multiple levels; players may learn from contextual information embedded in the dynamics of the game, the organic process generated by the game, and through the risks, benefits, costs, outcomes, and rewards of alternative strategies that result from decision making. In the present paper we discuss a multiplayer computer game simulation created for the Adaptive Thinking & Leadership (ATL) Program to train Special Forces Team Leaders. The ATL training simulation consists of a scripted single-player and an immersive multiplayer environment for classroom use which leverages immersive computer game technology. We define adaptive thinking as consisting of competencies such as negotiation and consensus building skills, the ability to communicate effectively, analyze ambiguous situations, be self-aware, think innovatively, and critically use effective problem solving skills. Each of these competencies is an essential element of leader development training for the U.S. Army Special Forces. The ATL simulation is used to augment experiential learning in the curriculum for the U.S. Army JFK Special Warfare Center & School (SWCS) course in Adaptive Thinking & Leadership. The school is incorporating the ATL simulation game into two additional training pipelines (PSYOPS and Civil Affairs Qualification Courses) that are also concerned with developing cultural awareness, interpersonal communication adaptability, and rapport-building skills. In the present paper, we discuss the design, development, and deployment of the training simulation, and emphasize how the
Felipe Baesler; Reinaldo Moraga; Oscar Cornejo
2008-01-01
This article introduces a variant of the simulated annealing metaheuristic for solving multiobjective optimization problems. The approach is called MultiObjective Simulated Annealing with Random Trajectory Search (MOSARTS). This technique adds short- and long-term memory elements to the Simulated Annealing algorithm in order to carry out a search that balances the effort among all the objectives involved in the problem. The results obtained are compared...
A Model for Capturing Team Adaptation in Simulated Emergencies
Paltved, Charlotte; Musaeus, Peter
2013-01-01
Research on how teams adapt to unforeseen changes or non-routine events supports the idea that updating is somehow difficult to accomplish [6,7]. Methods: Thirty emergency physicians and nurses participated in a Simulator Instructor Course at SkejSim Medical Simulation and Skills Training, Aarhus, Denmark, in May-June 2012. The study was exempted from approval by the Danish National Committee of Health Research and has been reported to the Danish Data Protection Agency. All participants volunteered and written informed consent was obtained. Twenty-nine simulation scenarios were recorded and... changes, adjust priorities and implement adjusted strategies were more likely to perform successfully in environments with unforeseen changes; in other words, adaptability is the generalization of trained knowledge and skills to new, more difficult and more complex tasks. An interpretative approach is...
Colin, A. [STMicroelectronics, 850 rue Jean Monnet, 38926 Crolles Cedex (France); InESS - CNRS and Universite Louis Pasteur, BP 20 CR, 23 rue de Loess, 67037 Strasbourg Cedex 2 (France)], E-mail: alexis.colin@st.com; Morin, P.; Cacho, F. [STMicroelectronics, 850 rue Jean Monnet, 38926 Crolles Cedex (France); Bono, H. [LETI, CEA-Grenoble, 17 Rue des Martyrs, 38054 Grenoble Cedex 9 (France); Beneyton, R.; Bidaud, M. [STMicroelectronics, 850 rue Jean Monnet, 38926 Crolles Cedex (France); Mathiot, D.; Fogarassy, E. [InESS -CNRS and Universite Louis Pasteur, BP 20 CR, 23 rue de Loess, 67037 Strasbourg Cedex 2 (France)
2008-12-05
Due to the continuous CMOS transistor scaling requirements, sub-melt millisecond laser annealing has been introduced in 45 nm CMOS technology to enhance dopant activation without any additional diffusion. Because of the design, the device layout at the wafer surface introduces during this process significant variations of optical absorption and heat transfer that can induce temperature non-uniformities over the die, detrimental to the device and often called 'pattern effects'. The introduction of an absorbent layer above the wafer reduces the dispersion of the optical properties, but the temperature variations generated by the non-homogeneities of the thermal properties cannot be suppressed. The impossibility of directly measuring these local transient temperature effects on complex transistor layouts requires simulation. A thermal simulation has been developed and calibrated to accurately model the laser annealing with the real process parameters. This model is used to obtain the transient temperature distribution over the devices, which is needed to understand the laser impact on the transistors' performance. We demonstrate that the shallow trench isolation (STI) filled with silicon oxide is critical for these thermal pattern effects. Depending on the STI layout density, temperature variations up to 50 °C over a die are observed.
This paper describes an optimization model to be used by System Operators to validate the economic schedules obtained by Market Operators together with the injections from Bilateral Contracts. These studies will be performed off-line on the day before operation, and the developed model is based on adjustment bids submitted by generators and loads; it is used by System Operators when necessary to enforce technical or security constraints. This model corresponds to an enhancement of an approach described in a previous paper and now includes discrete components such as transformer taps and reactor and capacitor banks. The resulting mixed-integer formulation is solved using Simulated Annealing, a well-known metaheuristic especially suited to combinatorial problems. Once the Simulated Annealing converges and the values of the discrete variables are fixed, the resulting non-linear continuous problem is solved using Sequential Linear Programming to get the final solution. The developed model corresponds to an AC version; it includes constraints related to the capability diagram of synchronous generators and variables allowing the computation of the active power required to balance active losses. Finally, the paper includes a Case Study based on the IEEE 118 bus system to illustrate the results that it is possible to obtain and their interest. (author)
Chen, Zheng; Mi, Chunting Chris; Xia, Bing; You, Chenwen
2014-12-01
In this paper, an energy management method is proposed for a power-split plug-in hybrid electric vehicle (PHEV). Through analysis of the PHEV powertrain, a series of quadratic equations is employed to approximate the vehicle's fuel rate, using battery current as the input. Pontryagin's Minimum Principle (PMP) is introduced to find the battery current commands by solving the Hamiltonian function. The Simulated Annealing (SA) algorithm is applied to calculate the engine-on power and the maximum current coefficient. Moreover, the battery state of health (SOH) is introduced to extend the application of the proposed algorithm. Simulation results verify that the proposed algorithm can reduce fuel consumption compared to the charge-depleting (CD) and charge-sustaining (CS) modes.
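With a quadratic fuel-rate approximation, the PMP step reduces to minimizing the Hamiltonian H(I) = f(I) + λ·I over the feasible battery current. A minimal sketch, with hypothetical coefficients and limits (not the paper's values):

```python
# Illustrative PMP step: fuel rate modeled as f(I) = a*I^2 + b*I + c, with
# costate `lam` weighting battery current use. The coefficients and the
# current limits below are assumptions for demonstration only.
def pmp_optimal_current(a, b, c, lam, i_min=-100.0, i_max=200.0):
    # For a convex quadratic, H(I) = a*I^2 + (b + lam)*I + c is minimized
    # where dH/dI = 2*a*I + b + lam = 0, clipped to the feasible interval.
    i_star = -(b + lam) / (2 * a)
    return min(i_max, max(i_min, i_star))
```

In the paper's setting this step would be evaluated at each time instant, with the costate tuned so the battery state of charge meets its terminal constraint.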
The objective of this work was to provide experimental heat transfer boundary condition and reactor pressure vessel (RPV) section thermal response data that can be used to benchmark computer codes that simulate thermal annealing of RPVs. This specific project was designed to provide the Electric Power Research Institute (EPRI) with experimental data that could be used to support the development of a thermal annealing model. A secondary benefit is to provide additional experimental data (e.g., thermal response of the concrete reactor cavity wall) that could be of use in an annealing demonstration project. The setup comprised a heater assembly, a 1.2 m x 1.2 m x 17.1 cm thick [4 ft x 4 ft x 6.75 in] section of an RPV (A533B ferritic steel with stainless steel cladding), a mockup of the 'mirror' insulation between the RPV and the concrete reactor cavity wall, and a 25.4 cm [10 in] thick concrete wall, 3.0 m x 3.0 m [10 ft x 10 ft] square. Experiments were performed at temperature heat-up/cooldown rates of 7, 14, and 28 °C/hr [12.5, 25, and 50 °F/hr] as measured on the heated face. A peak temperature of 454 °C [850 °F] was maintained on the heated face until the concrete wall temperature reached equilibrium. Results are most representative of those RPV locations where the heat transfer would be one-dimensional. Temperature was measured at multiple locations on the heated and unheated faces of the RPV section and the concrete wall. Incident heat flux was measured on the heated face, and absorbed heat flux estimates were generated from temperature measurements and an inverse heat conduction code. Through-wall temperature differences, concrete wall temperature response, and the heat flux absorbed into and incident on the RPV surface are presented. All of these data are useful to modelers developing codes to simulate RPV annealing
Liu, Bin, E-mail: bins@ieee.org [School of Computer Science and Technology, Nanjing University of Posts and Telecommunications, Nanjing 210023 (China)
2014-07-01
We describe an algorithm that can adaptively provide mixture summaries of multimodal posterior distributions. The parameter space of the involved posteriors ranges in size from a few dimensions to dozens of dimensions. This work was motivated by an astrophysical problem called extrasolar planet (exoplanet) detection, wherein the computation of stochastic integrals that are required for Bayesian model comparison is challenging. The difficulty comes from the highly nonlinear models that lead to multimodal posterior distributions. We resort to importance sampling (IS) to estimate the integrals, and thus translate the problem to be how to find a parametric approximation of the posterior. To capture the multimodal structure in the posterior, we initialize a mixture proposal distribution and then tailor its parameters elaborately to make it resemble the posterior to the greatest extent possible. We use the effective sample size (ESS) calculated based on the IS draws to measure the degree of approximation. The bigger the ESS is, the better the proposal resembles the posterior. A difficulty within this tailoring operation lies in the adjustment of the number of mixing components in the mixture proposal. Brute force methods just preset it as a large constant, which leads to an increase in the required computational resources. We provide an iterative delete/merge/add process, which works in tandem with an expectation-maximization step to tailor such a number online. The efficiency of our proposed method is tested via both simulation studies and real exoplanet data analysis.
Disaster Rescue Simulation based on Complex Adaptive Theory
Feng Jiang
2013-05-01
Disaster rescue is one of the key measures of disaster reduction. The rescue process is complex, with the characteristics of large scale, complicated structure, and nonlinearity, and it is hard to describe and analyze with traditional methods. Based on complex adaptive theory, this paper analyzes the complex adaptivity of the rescue process in terms of seven features: aggregation, nonlinearity, mobility, diversity, tagging, internal models and building blocks. With the support of the Repast platform, an agent-based model including rescue agents and victim agents is proposed. Moreover, two simulations with different parameters are employed to examine the feasibility of the model. The proposed model is shown to be efficient in dealing with disaster rescue simulation and can provide a reference for decision making.
Nonlinear Adaptive Robust Force Control of Hydraulic Load Simulator
YAO Jianyong; JIAO Zongxia; YAO Bin; SHANG Yaoxing; DONG Wenbin
2012-01-01
This paper deals with the high-performance force control of a hydraulic load simulator. Many previous works on hydraulic force control are based on linearized equations, but the inherent nonlinear properties and uncertainties of hydraulic systems prevent conventional feedback proportional-integral-derivative control from meeting high-performance requirements. In this paper, a nonlinear system model is derived and linearly parameterized for adaptive control. Then a discontinuous projection-based nonlinear adaptive robust force controller is developed for the hydraulic load simulator. The proposed controller constructs an asymptotically stable adaptive controller and adaptation laws, which can compensate for the system nonlinearities and uncertain parameters. Meanwhile, a well-designed robust controller is also developed to cope with the hydraulic system's uncertain nonlinearities. The controller achieves guaranteed transient performance and final tracking accuracy in the presence of both parametric uncertainties and uncertain nonlinearities; in the absence of uncertain nonlinearities, the scheme also achieves asymptotic tracking performance. Simulation and experimental comparative results verify the high-performance nature of the proposed control strategy, and the tracking accuracy is greatly improved.
UNFOLDING SIMULATIONS OF COLD- AND WARM-ADAPTED ELASTASES
Laura Riccardi; Elena Papaleo
2010-11-01
The Earth's surface is dominated by low-temperature environments, which have been successfully colonized by several extremophilic organisms. Enzymes isolated from psychrophilic organisms are able to catalyze reactions at low temperatures at which enzymes from mesophiles or thermophiles are fully compromised. The current scenario on enzyme cold-adaptation suggests that these enzymes are characterized by higher catalytic efficiency at low temperatures, enhanced structural flexibility and lower thermostability. In the present contribution, molecular dynamics simulations in explicit solvent have been carried out at different high temperatures in order to investigate the unfolding process of cold- and warm-adapted homologous enzymes. In particular, we focused our attention on cold-adapted elastases, for which it was previously demonstrated that the psychrophilic enzyme presents higher localized flexibility in loops surrounding the catalytic site and the specificity pocket. The unfolding simulations show a slower unfolding process for the cold-adapted enzyme, but one characterized by a greater loss of intramolecular interactions and α-helices than its mesophilic counterpart.
Chaotic Simulated Annealing by A Neural Network Model with Transient Chaos
Chen, L; Chen, Luonan; Aihara, Kazuyuki
1997-01-01
We propose a neural network model with transient chaos, or a transiently chaotic neural network (TCNN), as an approximation method for combinatorial optimization problems, by introducing transiently chaotic dynamics into neural networks. Unlike conventional neural networks with only point attractors, the proposed neural network has richer and more flexible dynamics, so that it can be expected to have a higher ability to search for globally optimal or near-optimal solutions. A significant property of this model is that the chaotic neurodynamics is temporarily generated for searching and self-organizing, and eventually vanishes with the autonomous decrease of a bifurcation parameter corresponding to the "temperature" in the usual annealing process. Therefore, the neural network gradually approaches, through the transient chaos, a dynamical structure similar to such conventional models as the Hopfield neural network, which converges to a stable equilibrium point. Since the optimization process of the transiently chaoti...
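A single transiently chaotic neuron in the spirit of this model can be sketched as follows; the constants are illustrative choices, not the paper's values. The self-feedback strength z(t) plays the role of the annealing temperature: while it is large the output wanders chaotically, and as it decays the dynamics settle toward a fixed point:

```python
import math

def tcnn_trace(steps=300, k=0.9, alpha=0.015, z0=0.08, beta=0.01,
               i0=0.65, eps=0.004):
    """Iterate one transiently chaotic neuron and return its output trace.

    y: internal state; x: sigmoid output in (0, 1); z: chaotic self-feedback,
    annealed geometrically by (1 - beta) each step.
    """
    y, z, xs = 0.283, z0, []
    for _ in range(steps):
        x = 1.0 / (1.0 + math.exp(-y / eps))   # steep sigmoid activation
        y = k * y + alpha * x - z * (x - i0)   # decay + input - self-feedback
        z *= (1 - beta)                        # anneal the bifurcation parameter
        xs.append(x)
    return xs
```

Early in the trace the output jumps irregularly between values; late in the trace, with z nearly zero, it converges, mirroring the transition from chaotic search to Hopfield-like gradient dynamics described in the abstract.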
Hybrid Strategy of Particle Swarm Optimization and Simulated Annealing for Optimizing Orthomorphisms
Tong Yan; Zhang Huanguo
2012-01-01
Orthomorphisms on F2^n are a kind of elementary permutation with good cryptographic properties. This paper proposes a hybrid strategy of Particle Swarm Optimization (PSO) and Simulated Annealing (SA) for finding orthomorphisms with good cryptographic properties. Through experiments based on this strategy, we obtain orthomorphisms on F2^n (n = 5, 6, 7, 9, 10) with good cryptographic properties, reported in the open literature for the first time, and the optimal orthomorphism found in this paper also does better than the one proposed by Feng Dengguo et al. in the stream cipher Loiss in difference uniformity, algebraic degree, algebraic immunity, and corresponding permutation polynomial degree. The PSO-SA hybrid strategy for optimizing orthomorphisms makes the design of orthomorphisms with good cryptographic properties automated, efficient, and convenient, which offers a new approach to designing orthomorphisms.
Improvement of flight simulator feeling using adaptive fuzzy backlash compensation
Amara, Zied; Bordeneuve-Guibé, Joël
2007-01-01
In this paper we address the problem of improving the control of the DC motors used in the specific application of a 3-degrees-of-freedom moving-base flight simulator. Indeed, the presence of backlash in DC motor gearboxes induces shocks and naturally limits the flight feeling. Dynamic inversion with fuzzy logic is used to design an adaptive backlash compensator. The classification property of fuzzy logic techniques makes them a natural candidate for the rejection of errors indu...
Advanced Dynamically Adaptive Algorithms for Stochastic Simulations on Extreme Scales
Xiu, Dongbin [Purdue Univ., West Lafayette, IN (United States)
2016-06-21
The focus of the project is the development of mathematical methods and high-performance computational tools for stochastic simulations, with a particular emphasis on computations at extreme scales. The core of the project revolves around the design of highly efficient and scalable numerical algorithms that can adaptively and accurately resolve, in high-dimensional spaces, stochastic problems with limited smoothness, even those containing discontinuities.
Adaptive scapula bone remodeling computational simulation: Relevance to regenerative medicine
Sharma, Gulshan B., E-mail: gbsharma@ucalgary.ca [Emory University, Department of Radiology and Imaging Sciences, Spine and Orthopaedic Center, Atlanta, Georgia 30329 (United States); University of Pittsburgh, Swanson School of Engineering, Department of Bioengineering, Pittsburgh, Pennsylvania 15213 (United States); University of Calgary, Schulich School of Engineering, Department of Mechanical and Manufacturing Engineering, Calgary, Alberta T2N 1N4 (Canada); Robertson, Douglas D., E-mail: douglas.d.robertson@emory.edu [Emory University, Department of Radiology and Imaging Sciences, Spine and Orthopaedic Center, Atlanta, Georgia 30329 (United States); University of Pittsburgh, Swanson School of Engineering, Department of Bioengineering, Pittsburgh, Pennsylvania 15213 (United States)
2013-07-01
Shoulder arthroplasty success has been attributed to many factors including bone quality, soft tissue balancing, surgeon experience, and implant design. Improved long-term success is primarily limited by glenoid implant loosening. Prosthesis design examines materials and shape and determines whether the design should withstand a lifetime of use. Finite element (FE) analyses have been extensively used to study stresses and strains produced in implants and bone. However, these static analyses only measure a moment in time and not the adaptive response to the altered environment produced by the therapeutic intervention. Computational analyses that integrate remodeling rules predict how bone will respond over time. Recent work has shown that subject-specific two- and three-dimensional adaptive bone remodeling models are feasible and valid. Feasibility and validation were achieved computationally, simulating bone remodeling using an intact human scapula, initially resetting the scapular bone material properties to be uniform, numerically simulating sequential loading, and comparing the bone remodeling simulation results to the actual scapula’s material properties. A three-dimensional scapula FE bone model was created using volumetric computed tomography images. Muscle and joint load and boundary conditions were applied based on values reported in the literature. Internal bone remodeling was based on element strain-energy density. Initially, all bone elements were assigned a homogeneous density. All loads were applied for 10 iterations. After every iteration, each bone element’s remodeling stimulus was compared to its corresponding reference stimulus and its material properties modified. The simulation achieved convergence. At the end of the simulation the predicted and actual specimen bone apparent density were plotted and compared. Location of high and low predicted bone density was comparable to the actual specimen. High predicted bone density was greater than
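The iteration loop described above can be sketched as follows. The update rule, rate, and density bounds (`rate`, `rho_min`, `rho_max`) are illustrative assumptions, not the authors' actual parameters, and a real simulation would recompute the strain-energy density from the FE solution after every density change rather than hold it fixed.

```python
import numpy as np

def remodel_step(density, strain_energy_density, reference_stimulus,
                 rate=0.2, rho_min=0.01, rho_max=1.74):
    """One internal bone-remodeling iteration (hypothetical parameters).

    The remodeling stimulus is strain-energy density per unit bone mass;
    each element's density is driven toward equilibrium with its
    reference stimulus and clamped to plausible bounds.
    """
    stimulus = strain_energy_density / density
    d_rho = rate * (stimulus - reference_stimulus)
    return np.clip(density + d_rho, rho_min, rho_max)

# As in the study, start every element from a homogeneous density and
# iterate; here the element loads (SED values) are held fixed for brevity.
rho = np.full(4, 0.8)
sed = np.array([0.4, 0.8, 1.2, 1.6])  # per-element strain-energy density
for _ in range(10):
    rho = remodel_step(rho, sed, reference_stimulus=1.0)
```

Elements whose stimulus exceeds the reference densify, the others resorb, which is the qualitative behavior the abstract describes.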
Adaptive resolution simulation of an atomistic protein in MARTINI water
We present an adaptive resolution simulation of protein G in multiscale water. We couple atomistic water around the protein with mesoscopic water, where four water molecules are represented with one coarse-grained bead, farther away. We circumvent the difficulties that arise from coupling to the coarse-grained model via a 4-to-1 molecule coarse-grain mapping by using bundled water models, i.e., we restrict the relative movement of water molecules that are mapped to the same coarse-grained bead employing harmonic springs. The water molecules change their resolution from four molecules to one coarse-grained particle and vice versa adaptively on-the-fly. Having performed 15 ns long molecular dynamics simulations, we observe within our error bars no differences between structural (e.g., root-mean-squared deviation and fluctuations of backbone atoms, radius of gyration, the stability of native contacts and secondary structure, and the solvent accessible surface area) and dynamical properties of the protein in the adaptive resolution approach compared to the fully atomistically solvated model. Our multiscale model is compatible with the widely used MARTINI force field and will therefore significantly enhance the scope of biomolecular simulations
朱娴; 马卫
2011-01-01
A bicluster is a submatrix of a gene expression data matrix, i.e., a subset of rows (genes) and columns (conditions) whose entries exhibit highly correlated expression levels. This paper proposes a hybrid cultural optimization algorithm based on simulated annealing: simulated annealing is embedded in the cultural algorithm framework as an evolutionary process of the population space, which mitigates the drawback of simulated annealing's probabilistic jumps. Experiments on a yeast cell dataset show that, with only a modest increase in running time, the algorithm finds biclusters of high quality.
Spaceflight Sensorimotor Analogs: Simulating Acute and Adaptive Effects
Taylor, Laura C.; Harm, Deborah L.; Kozlovskaya, Inessa; Reschke, Millard F.; Wood, Scott J.
2009-01-01
Adaptive changes in sensorimotor function during spaceflight are reflected by spatial disorientation, motion sickness, gaze destabilization and decrements in balance, locomotion and eye-hand coordination that occur during and following transitions between different gravitational states. The purpose of this study was to conduct a meta-synthesis of data from spaceflight analogs to evaluate their effectiveness in simulating adaptive changes in sensorimotor function. METHODS. The analogs under review were categorized as either acute analogs used to simulate performance decrements associated with transient changes, or adaptive analogs used to drive sensorimotor learning to altered sensory feedback. The effectiveness of each analog was evaluated in terms of mechanisms of action, magnitude and time course of observed deficits compared to spaceflight data, and the effects of amplitude and exposure duration. RESULTS. Parabolic flight has been used extensively to examine effects of acute variation in gravitational loads, ranging from hypergravity to microgravity. More recently, galvanic vestibular stimulation has been used to elicit acute postural, locomotor and gaze dysfunction by disrupting vestibular afferents. Patient populations, e.g., with bilateral vestibular loss or cerebellar dysfunction, have been proposed to model acute sensorimotor dysfunction. Early research sponsored by NASA involved living onboard rotating rooms, which appeared to approximate the time course of adaptation and post-exposure recovery observed in astronauts following spaceflight. Exposure to different bed-rest paradigms (6 deg head down, dry immersion) results in motor deficits similar to those observed following spaceflight. Shorter adaptive analogs have incorporated virtual reality environments, visual distortion paradigms, exposure to conflicting tilt-translation cues, and exposure to 3Gx centrifugation. As with spaceflight, there is considerable variability in responses to most of the analogs
An adaptive nonlinear solution scheme for reservoir simulation
Lett, G.S. [Scientific Software - Intercomp, Inc., Denver, CO (United States)
1996-12-31
Numerical reservoir simulation involves solving large, nonlinear systems of PDE with strongly discontinuous coefficients. Because of the large demands on computer memory and CPU, most users must perform simulations on very coarse grids. The average properties of the fluids and rocks must be estimated on these grids. These coarse grid "effective" properties are costly to determine, and risky to use, since their optimal values depend on the fluid flow being simulated. Thus, they must be found by trial-and-error techniques, and the coarser the grid, the poorer the results. This paper describes a numerical reservoir simulator which accepts fine scale properties and automatically generates multiple levels of coarse grid rock and fluid properties. The fine grid properties and the coarse grid simulation results are used to estimate discretization errors with multilevel error expansions. These expansions are local, and identify areas requiring local grid refinement. These refinements are added adaptively by the simulator, and the resulting composite grid equations are solved by a nonlinear Fast Adaptive Composite (FAC) Grid method, with a damped Newton algorithm being used on each local grid. The nonsymmetric linear systems of equations resulting from Newton's method are in turn solved by a preconditioned Conjugate Gradients-like algorithm. The scheme is demonstrated by performing fine and coarse grid simulations of several multiphase reservoirs from around the world.
As far as stochastic optimization methods are concerned, Simulated Annealing (SA) and Genetic Algorithms (GA) have been successfully applied to fuel management, when using a single objective function. Recent work has shown that it is possible to use a true multi-objective approach (e.g. fresh fuel enrichment minimization and cycle length maximization, ...) based on GA. In that approach, ranking the individuals of the population is based on the non-dominance principle. It is shown that a similar approach can be applied to SA, which is traditionally single objective. In this approach, every time a solution is accepted, it is compared to other archived solutions using the non-dominance principle. At the end of the optimization search, one ends up with an archived population which actually represents the trade-off surface between all the objective functions of interest, among which the expert will then choose the best solution according to his priorities. (author)
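The archive update described above can be sketched as follows; the function names and objective tuples are illustrative, not the paper's notation. Each accepted SA solution is compared with the archive under the non-dominance principle, so the archive converges toward the trade-off surface.

```python
def dominates(a, b):
    """True if solution a dominates b: no worse in every objective and
    strictly better in at least one (objectives are minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, candidate):
    """Insert an accepted SA solution into the non-dominated archive."""
    if any(dominates(a, candidate) for a in archive):
        return archive  # candidate is dominated by the archive: discard it
    # keep only archive members not dominated by the candidate
    return [a for a in archive if not dominates(candidate, a)] + [candidate]

# e.g. objectives = (fresh fuel enrichment, -cycle length), both minimized
archive = []
for sol in [(3.0, 10.0), (2.5, 12.0), (3.5, 9.0), (2.8, 9.5), (4.0, 13.0)]:
    archive = update_archive(archive, sol)
```

After the loop the archive holds only mutually non-dominated solutions, i.e. the current estimate of the Pareto front.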
Sousa, Tiago; Vale, Zita; Carvalho, Joao Paulo
2014-01-01
The massification of electric vehicles (EVs) can have a significant impact on the power system, requiring a new approach for the energy resource management. The energy resource management has the objective to obtain the optimal scheduling of the available resources considering distributed...... to determine the best solution in a reasonable amount of time. This paper presents a hybrid artificial intelligence technique to solve a complex energy resource management problem with a large number of resources, including EVs, connected to the electric network. The hybrid approach combines...... simulated annealing (SA) and ant colony optimization (ACO) techniques. The case study concerns different EVs penetration levels. Comparisons with a previous SA approach and a deterministic technique are also presented. For 2000 EVs scenario, the proposed hybrid approach found a solution better than the...
The behaviour of thin surface-active polystyrene (PS) films on silicon is being investigated. These films have amine functional groups which are attracted to the solid interface if they are fluorinated (N-PSF), and to the air interface if they are not (N-PS). To determine the interface enrichment of the species a 'sandwich' of deuterated- (DPS) and hydrogenated (HPS) films was prepared. 1.5MeV 4He ERD/RBS together with 0.7MeV 3He nuclear reaction analysis (NRA) to determine the D profile were applied to the films, and a self consistent analysis of all three spectra using the simulated annealing algorithm was made for each sample. The ERD data contains both H and D recoils, but the D profile does not have such good depth resolution as in the NRA data. The results are combined with data from neutron reflectivity
Shangchia Liu
2015-01-01
In the field of distributed decision making, different agents share a common processing resource, and each agent wants to minimize a cost function depending on its jobs only. These issues arise in different application contexts, including real-time systems, integrated service networks, industrial districts, and telecommunication systems. Motivated by its importance in practical applications, we consider two-agent scheduling on a single machine, where the objective is to minimize the total completion time of the jobs of the first agent with the restriction that the total completion time of the jobs of the second agent may not exceed a given upper bound. To solve the proposed problem, a branch-and-bound algorithm and three simulated annealing algorithms are developed. In addition, extensive computational experiments are conducted to test the performance of the algorithms.
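A minimal sketch of one SA variant for this problem, assuming a swap neighborhood and a hard feasibility check on the second agent's bound; the parameter values and structure are illustrative, not the paper's actual algorithms.

```python
import math
import random

def completion_totals(seq, jobs):
    """jobs maps name -> (processing_time, agent). Returns the total
    completion time of each agent's jobs under sequence seq."""
    t, totals = 0, {1: 0, 2: 0}
    for name in seq:
        p, agent = jobs[name]
        t += p
        totals[agent] += t
    return totals[1], totals[2]

def anneal_schedule(jobs, upper_bound, temp=50.0, cooling=0.95,
                    steps=2000, seed=0):
    """Minimize agent 1's total completion time subject to agent 2's
    total completion time not exceeding upper_bound."""
    rng = random.Random(seed)
    seq = list(jobs)
    c1, c2 = completion_totals(seq, jobs)
    cost = c1 if c2 <= upper_bound else float("inf")
    best = (cost, seq[:])
    for _ in range(steps):
        i, k = rng.sample(range(len(seq)), 2)
        seq[i], seq[k] = seq[k], seq[i]        # swap-neighborhood move
        n1, n2 = completion_totals(seq, jobs)
        new_cost = n1 if n2 <= upper_bound else float("inf")
        if new_cost <= cost or rng.random() < math.exp((cost - new_cost) / temp):
            cost = new_cost
            if cost < best[0]:
                best = (cost, seq[:])
        else:
            seq[i], seq[k] = seq[k], seq[i]    # undo the rejected swap
        temp *= cooling
    return best

# Tiny instance: agent 1 owns jobs a1, a2; agent 2 owns job b1.
jobs = {"a1": (2, 1), "a2": (3, 1), "b1": (4, 2)}
best_cost, best_seq = anneal_schedule(jobs, upper_bound=9)
```

Infeasible sequences get infinite cost, so the Metropolis test rejects them with probability one while improvements are always kept.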
无
2006-01-01
The characteristics of the design resources in ship collaborative design are described, and a hierarchical model for the evaluation of the design resources is established. A comprehensive evaluation of the co-designers for the collaborative design resources is carried out from different aspects using the Analytic Hierarchy Process (AHP), and the candidates are determined according to the evaluation results. Meanwhile, based on the principle of minimum cost, and starting from the relations between the design tasks and the corresponding co-designers, an optimizing selection model of the collaborators is established, and a novel genetic algorithm combined with simulated annealing is proposed to realize the optimization. It overcomes the defects of the genetic algorithm, which may lead to premature convergence and local optima if used individually. The application of this method in a ship collaborative design system proves its feasibility and provides a quantitative method for the optimizing selection of design resources.
Satyajit Guha; Soumya Ganguly Neogi; Pinaki Chaudhury
2014-05-01
In this paper, we explore the use of a stochastic optimizer, namely simulated annealing (SA), followed by a density functional theory (DFT)-based strategy for evaluating the structure and infrared spectroscopy of (H2O)nOH− clusters, where n = 1-6. We have shown that the use of SA can generate both global and local structures of these cluster systems. We also perform a DFT calculation, using the optimized coordinates obtained from SA as input, and extract the IR spectra of these systems. Finally, we compare our results with available theoretical and experimental data. There is a close correspondence between the computed frequencies from our theoretical study and available experimental data. To further aid in understanding the details of the hydrogen bonds formed, we performed atoms-in-molecules calculations on all the global minimum structures to evaluate relevant electron densities and critical points.
Purpose: To report outcomes and toxicity of the first Canadian permanent prostate implant program. Methods and Materials: 396 consecutive patients (Gleason ≤6, initial prostate specific antigen (PSA) ≤10 and stage T1-T2a disease) were implanted between June 1994 and December 2001. The median follow-up is 60 months (maximum, 136 months). All patients were planned with a fast simulated annealing inverse planning algorithm with high activity seeds (> 0.76 U). Acute and late toxicity is reported for the first 213 patients using a modified RTOG toxicity scale. The Kaplan-Meier biochemical failure-free survival (bFFS) is reported according to the ASTRO and Houston definitions. Results: The bFFS at 60 months was 88.5% (90.5%) according to the ASTRO (Houston) definition and 91.4% (94.6%) in the low risk group (initial PSA ≤10 and Gleason ≤6 and Stage ≤T2a). Risk factors statistically associated with bFFS were: initial PSA >10, a Gleason score of 7-8, and stage T2b-T3. The mean D90 was 151 ± 36.1 Gy. The mean V100 was 85.4 ± 8.5% with a mean V150 of 60.1 ± 12.3%. Overall, the implants were well tolerated. In the first 6 months, 31.5% of the patients were free of genitourinary symptoms (GUs), 12.7% had Grade 3 GUs; 91.6% were free of gastrointestinal symptoms (GIs). After 6 months, 54.0% were GUs free, 1.4% had Grade 3 GUs; 95.8% were GIs free. Conclusion: The inverse planning with fast simulated annealing and high activity seeds gives a 5-year bFFS which is comparable with the best published series, with a low toxicity profile.
Hydrodynamical Adaptive Mesh Refinement Simulations of Disk Galaxies
Gibson, Brad K; Sanchez-Blazquez, Patricia; Teyssier, Romain; House, Elisa L; Brook, Chris B; Kawata, Daisuke
2008-01-01
To date, fully cosmological hydrodynamic disk simulations to redshift zero have only been undertaken with particle-based codes, such as GADGET, Gasoline, or GCD+. In light of the (supposed) limitations of traditional implementations of smoothed particle hydrodynamics (SPH), or at the very least, their respective idiosyncrasies, it is important to explore complementary approaches to the SPH paradigm for galaxy formation. We present the first high-resolution cosmological disk simulations to redshift zero using an adaptive mesh refinement (AMR)-based hydrodynamical code, in this case, RAMSES. We analyse the temporal and spatial evolution of the simulated stellar disks' vertical heating, velocity ellipsoids, stellar populations, vertical and radial abundance gradients (gas and stars), assembly/infall histories, warps/lopsideness, disk edges/truncations (gas and stars), ISM physics implementations, and compare and contrast these properties with our sample of cosmological SPH disks, generated with GCD+. These prelim...
SIMULATION AND PERFORMANCE ANALYSIS OF ADAPTIVE FILTER IN NOISE CANCELLATION
RAJ KUMAR THENUA
2010-09-01
Noise problems in the environment have gained attention due to the tremendous growth of technology that has led to noisy engines, heavy machinery, high speed wind buffeting and other noise sources. The problem of controlling the noise level has become the focus of a tremendous amount of research over the years. In the last few years various adaptive algorithms have been developed for noise cancellation. In this paper we present an implementation of LMS (Least Mean Square), NLMS (Normalized Least Mean Square) and RLS (Recursive Least Square) algorithms on the MATLAB platform with the intention of comparing their performance in noise cancellation. We simulate the adaptive filter in MATLAB with a noisy tone signal and a white noise signal and analyze the performance of the algorithms in terms of MSE (Mean Squared Error), percentage noise removal, computational complexity and stability. The obtained results show that RLS has the best performance, but at the cost of large computational complexity and memory requirements.
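A minimal LMS noise canceller along the lines described above; the tap count, step size, and test signals are illustrative assumptions (NLMS would additionally normalize the update by the input power, and RLS replaces the gradient step with a recursive least-squares update).

```python
import numpy as np

def lms_cancel(reference, primary, n_taps=8, mu=0.01):
    """Adapt an FIR filter so the filtered reference noise matches the
    noise component of the primary input; the error signal is the
    cleaned output."""
    w = np.zeros(n_taps)
    cleaned = np.zeros(len(primary))
    for n in range(n_taps - 1, len(primary)):
        x = reference[n - n_taps + 1:n + 1][::-1]  # newest sample first
        e = primary[n] - w @ x                     # error = desired - output
        w += mu * e * x                            # LMS weight update
        cleaned[n] = e                             # error converges to signal
    return cleaned, w

rng = np.random.default_rng(0)
t = np.arange(4000)
signal = np.sin(2 * np.pi * t / 100)       # clean tone
noise = rng.standard_normal(4000)          # reference noise input
primary = signal + 0.9 * noise             # tone corrupted by the noise
cleaned, w = lms_cancel(noise, primary)
```

Because the reference is correlated with the corrupting noise but not with the tone, the adapted filter learns the 0.9 coupling gain and the error output recovers the tone.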
Strategies in edge plasma simulation using adaptive dynamic nodalization techniques
A wide span of steady-state and transient edge plasma simulation problems require accurate discretization techniques and can then be treated with Finite Element (FE) and Finite Volume (FV) methods. The software used here to meet these meshing requirements is a 2D finite element grid generator. It can produce adaptive unstructured grids taking into consideration the flux surface characteristics. To comply with the common mesh handling features of FE/FV packages, some options have been added to the basic generation tool. These enhancements include quadrilateral meshes without non-regular transition elements, obtained by substituting them by transition constructions consisting of regular quadrilateral elements. Furthermore, triangular grids can be created with one edge parallel to the magnetic field and modified by the basic adaptation/realignment techniques. Enhanced code operation properties and processing capabilities are expected. (author)
Decentralized adaptive control of manipulators - Theory, simulation, and experimentation
Seraji, Homayoun
1989-01-01
The author presents a simple decentralized adaptive-control scheme for multijoint robot manipulators based on the independent joint control concept. The control objective is to achieve accurate tracking of desired joint trajectories. The proposed control scheme does not use the complex manipulator dynamic model, and each joint is controlled simply by a PID (proportional-integral-derivative) feedback controller and a position-velocity-acceleration feedforward controller, both with adjustable gains. Simulation results are given for a two-link direct-drive manipulator under adaptive independent joint control. The results illustrate trajectory tracking under coupled dynamics and varying payload. The proposed scheme is implemented on a MicroVAX II computer for motion control of the three major joints of a PUMA 560 arm. Experimental results are presented to demonstrate that trajectory tracking is achieved despite coupled nonlinear joint dynamics.
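The per-joint controller can be sketched as below. The gains, the unit-inertia test plant, and the step-regulation task are illustrative assumptions; in the actual scheme the gains are adjusted adaptively online rather than fixed.

```python
class JointController:
    """One joint's controller: PID feedback plus a
    position-velocity-acceleration feedforward term, with no coupling
    to the other joints' dynamics."""

    def __init__(self, kp, ki, kd, f0, f1, f2, dt):
        self.kp, self.ki, self.kd = kp, ki, kd   # PID feedback gains
        self.f0, self.f1, self.f2 = f0, f1, f2   # feedforward gains
        self.dt = dt
        self.integral = 0.0
        self.prev_err = 0.0

    def torque(self, q, q_des, qd_des, qdd_des):
        err = q_des - q
        self.integral += err * self.dt
        d_err = (err - self.prev_err) / self.dt
        self.prev_err = err
        feedback = self.kp * err + self.ki * self.integral + self.kd * d_err
        feedforward = self.f0 * q_des + self.f1 * qd_des + self.f2 * qdd_des
        return feedback + feedforward

# Track a step setpoint on a unit-inertia joint (double integrator plant).
dt = 0.01
ctrl = JointController(kp=20.0, ki=1.0, kd=9.0, f0=0.0, f1=0.0, f2=0.0, dt=dt)
q, qd = 0.0, 0.0
for _ in range(2000):
    u = ctrl.torque(q, q_des=1.0, qd_des=0.0, qdd_des=0.0)
    qd += u * dt      # qdd = u for unit inertia
    q += qd * dt
```

Each joint runs its own such controller independently, which is what makes the scheme decentralized.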
Simulated annealing with a potential function with discontinuous gradient on Rd
无
2001-01-01
In this paper, we have proven that the simulated annealing process with a potential function on Rd, of which the gradient is discontinuous, converges in probability to a neighborhood of the global minima of the potential function.
In the present work, a neural network has been used to mathematically model the equilibrium data of a mixture of two rare earth elements, namely Nd and Pr, with the PC88A agent. A thermo-genetic algorithm, based on the ideas of the genetic algorithm and the simulated annealing algorithm, has been used in the training procedure of the neural networks, giving better results in comparison with the traditional modeling approach. The neural network modeling the experimental data is further used in a computer program to simulate the solvent extraction process of the two elements Nd and Pr. Based on this computer program, various optional schemes for the separation of Nd and Pr have been investigated and proposed. (author)
A parallel adaptive finite difference algorithm for petroleum reservoir simulation
Hoang, Hai Minh
2005-07-01
Adaptive finite difference methods for problems arising in the simulation of flow in porous medium applications are considered. Such methods have been proven useful for overcoming limitations of computational resources and improving the resolution of the numerical solutions to a wide range of problems. Local refinement of the computational mesh where it is needed to improve the accuracy of solutions yields better solution resolution, representing more efficient use of computational resources than is possible with traditional fixed-grid approaches. In this thesis, we propose a parallel adaptive cell-centered finite difference (PAFD) method for black-oil reservoir simulation models. This is an extension of the adaptive mesh refinement (AMR) methodology first developed by Berger and Oliger (1984) for hyperbolic problems. Our algorithm is fully adaptive in time and space through the use of subcycling, in which finer grids are advanced at smaller time steps than the coarser ones. When coarse and fine grids reach the same advanced time level, they are synchronized to ensure that the global solution is conservative and satisfies the divergence constraint across all levels of refinement. The material in this thesis is subdivided into three overall parts. First we explain the methodology and intricacies of the AFD scheme. Then we extend a cell-centered finite difference approximation to a multilevel hierarchy of refined grids, and finally we employ the algorithm on a parallel computer. The results in this work show that the approach presented is robust and stable, demonstrating the increased solution accuracy due to local refinement and reduced computing resource consumption. (Author)
Adaptive hybrid simulations for multiscale stochastic reaction networks
The probability distribution describing the state of a Stochastic Reaction Network (SRN) evolves according to the Chemical Master Equation (CME). It is common to estimate its solution using Monte Carlo methods such as the Stochastic Simulation Algorithm (SSA). In many cases, these simulations can take an impractical amount of computational time. Therefore, many methods have been developed that approximate sample paths of the underlying stochastic process and estimate the solution of the CME. A prominent class of these methods includes hybrid methods that partition the set of species and the set of reactions into discrete and continuous subsets. Such a partition separates the dynamics into a discrete and a continuous part. Simulating such a stochastic process can be computationally much easier than simulating the exact discrete stochastic process with SSA. Moreover, the quasi-stationary assumption to approximate the dynamics of fast subnetworks can be applied for certain classes of networks. However, as the dynamics of a SRN evolves, these partitions may have to be adapted during the simulation. We develop a hybrid method that approximates the solution of a CME by automatically partitioning the reactions and species sets into discrete and continuous components and applying the quasi-stationary assumption on identifiable fast subnetworks. Our method does not require any user intervention and it adapts to exploit the changing timescale separation between reactions and/or changing magnitudes of copy-numbers of constituent species. We demonstrate the efficiency of the proposed method by considering examples from systems biology and showing that very good approximations to the exact probability distributions can be achieved in significantly less computational time. This is especially the case for systems with oscillatory dynamics, where the system dynamics change considerably throughout the time-period of interest.
Simulation of Biochemical Pathway Adaptability Using Evolutionary Algorithms
Bosl, W J
2005-01-26
The systems approach to genomics seeks quantitative and predictive descriptions of cells and organisms. However, both the theoretical and experimental methods necessary for such studies still need to be developed. We are far from understanding even the simplest collective behavior of biomolecules, cells or organisms. A key aspect to all biological problems, including environmental microbiology, evolution of infectious diseases, and the adaptation of cancer cells is the evolvability of genomes. This is particularly important for Genomes to Life missions, which tend to focus on the prospect of engineering microorganisms to achieve desired goals in environmental remediation and climate change mitigation, and energy production. All of these will require quantitative tools for understanding the evolvability of organisms. Laboratory biodefense goals will need quantitative tools for predicting complicated host-pathogen interactions and finding counter-measures. In this project, we seek to develop methods to simulate how external and internal signals cause the genetic apparatus to adapt and organize to produce complex biochemical systems to achieve survival. This project is specifically directed toward building a computational methodology for simulating the adaptability of genomes. This project investigated the feasibility of using a novel quantitative approach to studying the adaptability of genomes and biochemical pathways. This effort was intended to be the preliminary part of a larger, long-term effort between key leaders in computational and systems biology at Harvard University and LLNL, with Dr. Bosl as the lead PI. Scientific goals for the long-term project include the development and testing of new hypotheses to explain the observed adaptability of yeast biochemical pathways when the myosin-II gene is deleted and the development of a novel data-driven evolutionary computation as a way to connect exploratory computational simulation with hypothesis
Direct numerical simulation of bubbles with parallelized adaptive mesh refinement
The study of two-phase Thermal-Hydraulics is a major topic for Nuclear Engineering for both security and efficiency of nuclear facilities. In addition to experiments, numerical modeling helps to know precisely where bubbles appear and how they behave, in the core as well as in the steam generators. This work presents the finest scale of representation of two-phase flows, Direct Numerical Simulation of bubbles. We use the 'Di-phasic Low Mach Number' equation model. It is particularly adapted to low-Mach number flows, that is to say flows whose velocity is much slower than the speed of sound; this is very typical of nuclear thermal-hydraulics conditions. Because we study bubbles, we capture the front between vapor and liquid phases thanks to a downward flux limiting numerical scheme. The specific discrete analysis technique this work introduces is well-balanced parallel Adaptive Mesh Refinement (AMR). With AMR, we refine the coarse grid on a batch of patches in order to locally increase precision in areas which matter more, and capture fine changes in the front location and its topology. We show that patch-based AMR is very adapted for parallel computing. We use a variety of physical examples: forced advection, heat transfer, phase changes represented by a Stefan model, as well as the combination of all those models. We present the results of those numerical simulations, as well as the speed-up compared to equivalent non-AMR simulation and to serial computation of the same problems. This document is made up of an abstract and the slides of the presentation. (author)
Adaptive mesh refinement and adjoint methods in geophysics simulations
Burstedde, Carsten
2013-04-01
It is an ongoing challenge to increase the resolution that can be achieved by numerical geophysics simulations. This applies to considering sub-kilometer mesh spacings in global-scale mantle convection simulations as well as to using frequencies up to 1 Hz in seismic wave propagation simulations. One central issue is the numerical cost, since for three-dimensional space discretizations, possibly combined with time stepping schemes, a doubling of resolution can lead to an increase in storage requirements and run time by factors between 8 and 16. A related challenge lies in the fact that an increase in resolution also increases the dimensionality of the model space that is needed to fully parametrize the physical properties of the simulated object (a.k.a. earth). Systems that exhibit a multiscale structure in space are candidates for employing adaptive mesh refinement, which varies the resolution locally. An example that we found well suited is the mantle, where plate boundaries and fault zones require a resolution on the km scale, while deeper areas can be treated with 50 or 100 km mesh spacings. This approach effectively reduces the number of computational variables by several orders of magnitude. While in this case it is possible to derive the local adaptation pattern from known physical parameters, it is often unclear what are the most suitable criteria for adaptation. We will present the goal-oriented error estimation procedure, where such criteria are derived from an objective functional that represents the observables to be computed most accurately. Even though this approach is well studied, it is rarely used in the geophysics community. A related strategy to make finer resolution manageable is to design methods that automate the inference of model parameters. Tweaking more than a handful of numbers and judging the quality of the simulation by ad hoc comparisons to known facts and observations is a tedious task and fundamentally limited by the turnaround times
Simulations and measurements of annealed pyrolytic graphite-metal composite baseplates
Streb, F.; Ruhl, G.; Schubert, A.; Zeidler, H.; Penzel, M.; Flemmig, S.; Todaro, I.; Squatrito, R.; Lampke, T.
2016-03-01
We investigated the usability of anisotropic materials as inserts in aluminum-matrix-composite baseplates for typical high performance power semiconductor modules using finite-element simulations and transient plane source measurements. For simulations, several physical modules can be used, which are suitable for different thermal boundary conditions. By comparing different modules and options of heat transfer we found non-isothermal simulations to be closest to reality for temperature distribution at the surface of the heat sink. We optimized the geometry of the graphite inserts for best heat dissipation and based on these results evaluated the thermal resistance of a typical power module using calculation time optimized steady-state simulations. Here we investigated the influence of thermal contact conductance (TCC) between metal matrix and inserts on the heat dissipation. We found improved heat dissipation compared to the plain metal baseplate for a TCC of 200 kW/m2/K and above. To verify the simulations we evaluated cast composite baseplates with two different insert geometries and measured their averaged lateral thermal conductivity using a transient plane source (HotDisk) technique at room temperature. For the composite baseplate we achieved local improvements in heat dissipation compared to the plain metal baseplate.
温平川; 徐晓东; 何先刚
2003-01-01
This paper presents a highly hybrid Genetic Algorithm / Simulated Annealing algorithm. The algorithm has been successfully implemented on a Beowulf PC cluster and applied to a set of standard function optimization problems. The experimental results show that the proposed algorithm is not only effective but also robust.
A Gravitational Search Algorithm Based on Simulated Annealing
王立平; 肖乐意
2014-01-01
In the standard Gravitational Search Algorithm (GSA), the individual position-update strategy may damage individuals, and the local search ability is weak; an improved algorithm is therefore proposed. The proposed algorithm integrates a simulated annealing mechanism into GSA: it adopts an individual position-update strategy based on the Metropolis criterion and, after the gravity operation, applies an annealing operation to the best individual of each generation. To some extent this avoids blind individual moves and improves the local search ability, convergence speed, and convergence accuracy of the algorithm. The experimental results demonstrate that the improvement strategy is effective, and that the improved algorithm has clear advantages in convergence speed, convergence accuracy, etc.
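The Metropolis criterion mentioned above is the standard simulated-annealing acceptance rule; a minimal, generic sketch (an illustration of the rule itself, not this paper's GSA implementation; all names are placeholders):

```python
import math
import random

def metropolis_accept(delta_e, temperature, rng=random):
    """Metropolis criterion for minimization: always accept an improving
    move; accept a worsening move with probability exp(-delta_e / T)."""
    if delta_e <= 0:            # improvement: always accept
        return True
    if temperature <= 0:        # frozen system: reject all worsening moves
        return False
    return rng.random() < math.exp(-delta_e / temperature)
```

At high temperature almost every move is accepted (broad exploration); as the temperature is annealed toward zero the rule degenerates into pure greedy descent, which is what gives annealing-based hybrids their improved local search.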
A graph-based watershed merging using fuzzy C-means and simulated annealing for image segmentation
Vadiveloo, Mogana; Abdullah, Rosni; Rajeswari, Mandava
2015-12-01
In this paper, we have addressed the issue of over-segmented regions produced in watershed by merging the regions using a global feature. The global feature information is obtained by clustering the image in its feature space using Fuzzy C-Means (FCM) clustering. The over-segmented regions produced by performing watershed on the gradient of the image are then mapped to this global information in the feature space. Further to this, the global feature information is optimized using Simulated Annealing (SA). The optimal global feature information is used to derive the similarity criterion to merge the over-segmented watershed regions, which are represented by a region adjacency graph (RAG). The proposed method has been tested on a digital brain phantom simulated dataset to segment white matter (WM), gray matter (GM) and cerebrospinal fluid (CSF) soft tissue regions. The experiments showed that the proposed method performs statistically better than immersion watershed, merging an average of 95.242% of the regions, and yields an average accuracy improvement of 8.850% in comparison with RAG-based immersion watershed merging using global and local features.
Zhang, Jiapu
2013-01-01
Simulated annealing (SA) was inspired by annealing in metallurgy, a technique involving heating and controlled cooling of a material to increase the size of its crystals and reduce their defects, both of which are attributes that depend on the material's thermodynamic free energy. In this paper, we first study the practical implementation of SA in detail. Then, hybridizing pure SA with local (or global) search optimization methods allows us to design several effective and efficient global search optimization methods. In order to keep the original sense of SA, we clarify our understanding of SA in the crystallography and molecular modeling fields through studies of prion amyloid fibrils.
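The "pure SA" that such hybrids start from follows a common template; a minimal self-contained sketch under assumed parameter choices (geometric cooling, uniform neighbourhood moves on a 1-D objective), not the specific variant studied in the paper:

```python
import math
import random

def simulated_annealing(f, x0, step=0.5, t0=1.0, t_min=1e-6,
                        cooling=0.95, moves_per_temp=20, seed=42):
    """Plain simulated annealing for a 1-D objective f.
    Geometric cooling schedule, uniform moves, Metropolis acceptance."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    while t > t_min:
        for _ in range(moves_per_temp):
            cand = x + rng.uniform(-step, step)   # neighbourhood move
            delta = f(cand) - fx
            # Metropolis rule: accept improvements, sometimes worsenings
            if delta <= 0 or rng.random() < math.exp(-delta / t):
                x, fx = cand, fx + delta
                if fx < fbest:                    # track best-so-far
                    best, fbest = x, fx
        t *= cooling                              # geometric cooling
    return best, fbest
```

A hybrid in the spirit described above would replace the final low-temperature phase with a deterministic local search started from `best`.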
Adaptive and predictive control of a simulated robot arm.
Tolu, Silvia; Vanegas, Mauricio; Garrido, Jesús A; Luque, Niceto R; Ros, Eduardo
2013-06-01
In this work, a basic cerebellar neural layer and a machine learning engine are embedded in a recurrent loop which avoids dealing with the motor error or distal error problem. The presented approach learns the motor control based on available sensor error estimates (position, velocity, and acceleration) without explicitly knowing the motor errors. The paper focuses on how to decompose the input into different components in order to facilitate the learning process using an automatic incremental learning model (the locally weighted projection regression (LWPR) algorithm). LWPR incrementally learns the forward model of the robot arm and provides the cerebellar module with optimal pre-processed signals. We present a recurrent adaptive control architecture in which an adaptive feedback (AF) controller guarantees precise, compliant, and stable control during the manipulation of objects. This approach therefore efficiently integrates a bio-inspired module (cerebellar circuitry) with a machine learning component (LWPR). The cerebellar-LWPR synergy makes the robot adaptable to changing conditions. We evaluate how this scheme scales for robot arms with a high number of degrees of freedom (DOFs) using a simulated model of a robot arm of the new generation of lightweight robots (LWRs). PMID:23627657
Adaptive resolution simulation of polarizable supramolecular coarse-grained water models
Zavadlav, Julija; Melo, Manuel N.; Marrink, Siewert J.; Praprotnik, Matej
2015-01-01
Multiscale simulation methods, such as the adaptive resolution scheme, are becoming increasingly popular due to their significant computational advantages with respect to conventional atomistic simulations. For this kind of simulation, it is essential to develop accurate multiscale water models that
The fast simulated annealing algorithm applied to the search problem in LEED
Nascimento, V. B.; de Carvalho, V. E.; de Castilho, C. M. C.; Costa, B. V.; Soares, E. A.
2001-07-01
In this work we present new results obtained from the application of the fast simulated annealing (FSA) algorithm to the surface structure determination of the Ag(1 1 0) and CdTe(1 1 0) systems. The influence of a control parameter, the "initial temperature", on the FSA search process was investigated. A scaling behaviour, which measures the efficiency of a search method as a function of the number of parameters to be varied, was obtained for the FSA algorithm and indicated a favourable linear scaling (∼N).
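The defining difference between FSA and classical simulated annealing is its much faster, Cauchy-type cooling schedule, made admissible by FSA's heavier-tailed visiting distribution. A sketch of the two schedules (illustrative only; the classical schedule is shifted by one so it is defined at k = 0):

```python
import math

def fsa_temperature(t0, k):
    """Fast simulated annealing (Cauchy) schedule: T_k = T0 / (1 + k)."""
    return t0 / (1.0 + k)

def csa_temperature(t0, k):
    """Classical (Boltzmann) schedule: T_k = T0 / ln(2 + k).
    Cools only logarithmically, hence far more slowly than FSA."""
    return t0 / math.log(2.0 + k)
```

After 100 steps the FSA temperature has dropped by a factor of 100, while the classical schedule has dropped by less than a factor of 5; the control parameter `t0` is the "initial temperature" whose influence the abstract discusses.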
Quantum Annealing of Hard Problems
Jorg, Thomas; Krzakala, Florent; Kurchan, Jorge; Maggs, A C
2009-01-01
Quantum annealing is analogous to simulated annealing with a tunneling mechanism substituting for thermal activation. Its performance has been tested in numerical simulation with mixed conclusions. There is a class of optimization problems for which the efficiency can be studied analytically using techniques based on the statistical mechanics of spin glasses.
Simulated annealing based algorithm for identifying mutated driver pathways in cancer.
Li, Hai-Tao; Zhang, Yu-Lang; Zheng, Chun-Hou; Wang, Hong-Qiang
2014-01-01
With the development of next-generation DNA sequencing technologies, large-scale cancer genomics projects can be implemented to help researchers identify driver genes, driver mutations, and driver pathways, which promote cancer proliferation in large numbers of cancer patients. Hence, one of the remaining challenges is to distinguish the functional mutations vital for cancer development and filter out the nonfunctional, random "passenger mutations." In this study, we introduce a modified method to solve the so-called maximum weight submatrix problem, which is used to identify mutated driver pathways in cancer. The problem is based on two combinatorial properties, namely coverage and exclusivity. In particular, we enhance an integrative model which combines gene mutation and expression data. The experimental results on simulated data show that, compared with other methods, our method is more efficient. Finally, we apply the proposed method to two real biological datasets. The results show that our proposed method is also applicable in real practice. PMID:24982873
Adapting a weather forecast model for greenhouse gas simulation
Polavarapu, S. M.; Neish, M.; Tanguay, M.; Girard, C.; de Grandpré, J.; Gravel, S.; Semeniuk, K.; Chan, D.
2015-12-01
The ability to simulate greenhouse gases on the global domain is useful for providing boundary conditions for regional flux inversions, as well as for providing reference data for bias correction of satellite measurements. Given the existence of operational weather and environmental prediction models and assimilation systems at Environment Canada, it makes sense to use these tools for greenhouse gas simulations. In this work, we describe the adaptations needed to reasonably simulate CO2 with a weather forecast model. The main challenges were the implementation of a mass conserving advection scheme, and the careful implementation of a mixing ratio defined with respect to dry air. The transport of tracers through convection was also added, and the vertical mixing through the boundary layer was slightly modified. With all these changes, the model conserves CO2 mass well on the annual time scale, and the high resolution (0.9 degree grid spacing) permits a good description of synoptic scale transport. The use of a coupled meteorological/tracer transport model also permits an assessment of approximations needed in offline transport model approaches, such as the neglect of water vapour mass when computing a tracer mixing ratio with respect to dry air.
Silvia Gaona
2015-01-01
Censuses in Mexico are taken by the National Institute of Statistics and Geography (INEGI). In this paper a Two-Phase Approach (TPA) to optimize the routes of INEGI's census takers is presented. For each pollster, in the first phase, a route is produced by means of the Simulated Annealing (SA) heuristic, which attempts to minimize the travel distance subject to particular constraints. Whenever the route is unrealizable, it is made realizable in the second phase by constructing a visibility graph for each obstacle and applying Dijkstra's algorithm to determine the shortest path in this graph. A tuning methodology based on the irace package was used to determine the parameter values for TPA on a subset of 150 instances provided by INEGI. The practical effectiveness of TPA was assessed on another subset of 1962 instances, comparing its performance with that of the in-use heuristic (INEGIH). The results show that TPA clearly outperforms INEGIH, with an average improvement of 47.11%.
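The second phase described above rests on Dijkstra's algorithm over a visibility graph; a minimal sketch of the shortest-path step (the visibility-graph construction is omitted, and the adjacency-list encoding is an assumption of this illustration):

```python
import heapq

def dijkstra(graph, source, target):
    """Shortest path in a weighted graph given as
    {node: [(neighbour, weight), ...]}.  Returns (distance, path)."""
    dist = {source: 0.0}
    prev = {}
    heap = [(0.0, source)]   # priority queue of (distance, node)
    done = set()
    while heap:
        d, u = heapq.heappop(heap)
        if u in done:
            continue         # stale queue entry
        done.add(u)
        if u == target:
            break
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    # reconstruct path by walking predecessors back from the target
    path, node = [], target
    while node != source:
        path.append(node)
        node = prev[node]
    path.append(source)
    return dist[target], path[::-1]
```

In the paper's setting the nodes would be the route endpoints plus obstacle vertices, and edge weights the Euclidean lengths of mutually visible segments.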
Diogenes, Alysson N.; Santos, Luis O.E. dos; Fernandes, Celso P. [Universidade Federal de Santa Catarina (UFSC), Florianopolis, SC (Brazil); Appoloni, Carlos R. [Universidade Estadual de Londrina (UEL), PR (Brazil)
2008-07-01
The physical properties of reservoir rocks are usually obtained in the laboratory through standard experiments, which are often very expensive and time-consuming. Digital image analysis techniques are therefore a fast and low-cost methodology for predicting physical properties, requiring only geometrical parameters measured from thin sections of the rock microstructure. This research analyzes two methods for porous media reconstruction using the relaxation method simulated annealing. Using geometrical parameters measured from rock thin sections, it is possible to construct a three-dimensional (3D) model of the microstructure. We assume statistical homogeneity and isotropy, and the 3D model maintains the porosity spatial correlation, chord size distribution and d3-4 distance transform distribution for a pixel-based reconstruction, and the spatial correlation for an object-based reconstruction. The 2D and 3D preliminary results are compared with microstructures reconstructed by truncated Gaussian methods. As this research is at an early stage, only the 2D results are presented. (author)
Chang, Yin-Jung; Chen, Yu-Ting
2011-07-01
Broadband omnidirectional antireflection (AR) coatings for solar cells optimized using a simulated annealing (SA) algorithm incorporated with the solar (irradiance) spectrum at Earth's surface (AM1.5 radiation) are described. Material dispersions and reflections from the planar backside metal are considered in the rigorous electromagnetic calculations. Optimized AR coatings for bulk crystalline Si and thin-film CuIn1-xGaxSe2 (CIGS) solar cells as two representative cases are presented, and the effect of the solar spectrum on the AR coating designs is investigated. In general, the angle-averaged reflectance of a solar-spectrum-incorporated AR design is shown to be smaller and more uniform in the spectral range with relatively stronger solar irradiance. By incorporating the transparent conductive and buffer layers as part of the AR coating in CIGS solar cells (2 μm-thick CIGS layer), a single MgF2 layer could provide an average reflectance of 8.46% for wavelengths ranging from 350 nm to 1200 nm and incident angles from 0° to 80°. PMID:21747557
This paper introduces a design methodology in the context of finding new and innovative design principles by means of optimization techniques. In this method cellular automata (CA) and simulated annealing (SA) were combined and used to solve the optimization problem. The method contains two principles: the neighbourhood concept from CA, and, from SA, the acceptance of each displacement based on the decrease of the objective function and the Boltzmann distribution, which plays the role of the transition rule. The proposed method was used to solve the fuel management optimization problem in the VVER-1000 Russian reactor. Since the fuel management problem involves a huge amount of calculation to find the best configuration of fuel assemblies in the reactor core, this method was introduced to reduce the volume of calculation. In this study, reducing the power peaking factor inside the reactor core of the Bushehr NPP is considered as the objective function. The proposed optimization method is compared with the Hopfield neural network procedure previously used to solve this problem, and it is shown that the results, speed, and quality of the new method are comparable with it. Moreover, the resulting optimum configuration is in agreement with the pattern proposed by the designer.
Han, Zong-wei; Huang, Wei; Luo, Yun; Zhang, Chun-di; Qi, Da-cheng
2015-03-01
Taking the soil organic matter in eastern Zhongxiang County, Hubei Province, as the research object, thirteen sample sets from different regions were arranged around the road network, and their spatial configuration was optimized by the simulated annealing approach. The topographic factors of these thirteen sample sets, including slope, plane curvature, profile curvature, topographic wetness index, stream power index and sediment transport index, were extracted by terrain analysis. Based on the results of the optimization, a multiple linear regression model with topographic factors as independent variables was built. At the same time, a multilayer perceptron model based on the neural network approach was implemented, and the two models were then compared. The results revealed that the proposed approach is practicable for optimizing a soil sampling scheme. The optimal configuration was capable of capturing soil-landscape knowledge exactly, and its accuracy was better than that of the original samples. This study designed a sampling configuration for studying the soil attribute distribution by referring to the spatial layout of the road network, historical samples, and digital elevation data, which provides an effective means as well as a theoretical basis for determining a sampling configuration and displaying the spatial distribution of soil organic matter with low cost and high efficiency. PMID:26211074
吕学勤; 陈树果; 田振宁
2012-01-01
Focusing on the premature convergence and slow convergence of the standard genetic algorithm, this paper proposes a hybrid genetic algorithm (adaptive genetic annealing algorithm) to solve fault location in radial distribution networks. The algorithm adopts a selection mechanism combining the roulette-wheel strategy with an elitist strategy to keep the current best individual in the population, uses adaptive crossover and mutation probabilities to expand the population's search area, and then introduces simulated annealing to speed up convergence in the later iterations. Finally, a simulation of the IEEE 33-bus system shows that the algorithm locates single and multiple faults quickly and accurately, and that it retains good fault tolerance when the fault information is distorted.
Felipe Baesler
2008-12-01
This paper introduces a variant of the simulated annealing metaheuristic oriented to solving multiobjective optimization problems, called MultiObjective Simulated Annealing with Random Trajectory Search (MOSARTS). The technique adds short- and long-term memory concepts to simulated annealing in order to balance the search effort among all the objectives involved in the problem. The algorithm was tested against three other techniques on a real-life parallel machine scheduling problem composed of 24 jobs and two identical machines, a case study from the local sawmill industry. The results showed that MOSARTS behaved much better than the other methods, finding better solutions in terms of dominance and frontier dispersion.
Adaptive model reduction for nonsmooth discrete element simulation
Servin, Martin
2015-01-01
A method for adaptive model order reduction for nonsmooth discrete element simulation is developed and analysed in numerical experiments. Regions of the granular media that collectively move as rigid bodies are substituted with rigid bodies of the corresponding shape and mass distribution. The method also supports particles merging with articulated multibody systems. A model approximation error is defined and used to derive conditions for when and where to apply model reduction and refinement back into particles and smaller rigid bodies. Three methods for refinement are proposed and tested: prediction from contact events, trial solutions computed in the background and using split sensors. The computational performance can be increased by 5-50 times for model reduction levels between 70 and 95 %.
Adaptive model reduction for nonsmooth discrete element simulation
Servin, Martin; Wang, Da
2016-03-01
A method for adaptive model order reduction for nonsmooth discrete element simulation is developed and analysed in numerical experiments. Regions of the granular media that collectively move as rigid bodies are substituted with rigid bodies of the corresponding shape and mass distribution. The method also supports particles merging with articulated multibody systems. A model approximation error is defined and used to derive conditions for when and where to apply reduction and refinement back into particles and smaller rigid bodies. Three methods for refinement are proposed and tested: prediction from contact events, trial solutions computed in the background and using split sensors. The computational performance can be increased by 5-50 times for model reduction levels between 70 and 95 %.
Larry W. Burggraf
2013-07-01
To find low-energy SinCn structures out of hundreds to thousands of isomers we have developed a general method to search for stable isomeric structures that combines Stochastic Potential Surface Search and Pseudopotential Plane-Wave Density Functional Theory Car-Parrinello Molecular Dynamics simulated annealing (PSPW-CPMD-SA). We enhanced the Sunders stochastic search method to generate random cluster structures used as seed structures for PSPW-CPMD-SA simulations. This method ensures that each SA simulation samples a different region of the potential surface to find the regional minimum structure. By iterating this automated, parallel process on a high-performance computer we located hundreds to more than a thousand stable isomers for each SinCn cluster. Among these, the five to ten lowest-energy isomers were further optimized using the B3LYP/cc-pVTZ method. We applied this method to SinCn (n = 4–12) clusters and found the lowest-energy structures, most not previously reported. By analyzing the bonding patterns of the low-energy structures of each SinCn cluster, we observed that carbon segregations tend to form condensed conjugated rings, while Si connects to unsaturated bonds at the periphery of the carbon segregation, as single atoms or small clusters when n is small; when n is large, a silicon network spans the carbon segregation region.
贾伟娜; 刘顺兰
2014-01-01
The simulated annealing genetic algorithm is a global optimization algorithm formed by integrating simulated annealing into the genetic algorithm. It is applied here to the weighted subspace fitting (WSF) algorithm for direction-of-arrival (DOA) estimation, in order to reduce the computational complexity of the WSF algorithm and improve the DOA estimation precision, while also addressing the low efficiency and the tendency to fall into local optima of the basic genetic algorithm in DOA estimation. Computer simulation results show that, compared with the basic genetic algorithm and the Gauss-Newton method, DOA estimation based on the simulated annealing genetic algorithm achieves a higher resolution probability and a smaller mean square error under low signal-to-noise-ratio conditions.
Ensemble annealing of complex physical systems
Habeck, Michael
2015-01-01
Algorithms for simulating complex physical systems or solving difficult optimization problems often resort to an annealing process. Rather than simulating the system at the temperature of interest, an annealing algorithm starts at a temperature that is high enough to ensure ergodicity and gradually decreases it until the destination temperature is reached. This idea is used in popular algorithms such as parallel tempering and simulated annealing. A general problem with annealing methods is that they require a temperature schedule. Choosing well-balanced temperature schedules can be tedious and time-consuming. Imbalanced schedules can have a negative impact on the convergence, runtime and success of annealing algorithms. This article outlines a unifying framework, ensemble annealing, that combines ideas from simulated annealing, histogram reweighting and nested sampling with concepts in thermodynamic control. Ensemble annealing simultaneously simulates a physical system and estimates its density of states. The...
Unsteady CFD simulations of a pump in part load conditions using scale-adaptive simulation
Lucius, A., E-mail: andreas.lucius@tu-clausthal.d [Institute of Applied Mechanics, Clausthal University of Technology, Adolph-Roemer Str. 2a, 38678 Clausthal-Zellerfeld (Germany); Brenner, G. [Institute of Applied Mechanics, Clausthal University of Technology, Adolph-Roemer Str. 2a, 38678 Clausthal-Zellerfeld (Germany)
2010-12-15
The scope of this work is to demonstrate the applicability of an eddy resolving turbulence model in a turbomachinery configuration. The model combines the Large Eddy Simulation (LES) and the Reynolds Averaged Navier Stokes (RANS) approach. The point of interest of the present investigation is the unsteady rotating stall phenomenon occurring at low part load conditions. Since RANS turbulence models often fail to predict separation correctly, an LES-like model is expected to give superior results. In this investigation the scale-adaptive simulation (SAS) model is used. This model avoids the grid dependence appearing in the Detached Eddy Simulation (DES) modelling strategy. The simulations are validated with transient measurement data. The present results demonstrate that both models are able to predict the major stall frequency at part load. Results are similar for URANS and SAS, with the turbulence-resolving model showing advantages in predicting minor stall frequencies.
Zheng, Han; Zhang, Yingkai
2008-01-01
We propose a new adaptive sampling approach to determine free energy profiles with molecular dynamics simulations, called "repository based adaptive umbrella sampling" (RBAUS). Its main idea is that a sampling repository is continuously updated based on the latest simulation data, and the accumulated knowledge and sampling history are then employed to determine whether and how to update the biasing umbrella potential for subsequent simulations. In comparison with other adaptive me...
Studies suggest that clinical outcomes are improved in repeat trigeminal neuralgia (TN) Gamma Knife radiosurgery if a different part of the nerve from the previous radiosurgery is treated. The MR images taken in the first and repeat radiosurgery need to be coregistered to map the first radiosurgery volume onto the second treatment planning image. We propose a fully automatic and robust three-dimensional (3-D) mutual information (MI) based registration method engineered by a simulated annealing (SA) optimization technique. Commonly, Powell's method and the Downhill simplex (DS) method are the most popular for optimizing the MI objective function in medical image registration applications. However, due to the nonconvex property of the MI function, the robustness of those two methods is questionable, especially for our cases, where only 28 slices of MR T1 images were utilized. Our SA method obtained successful registration results for all 41 patients recruited in this study. On the other hand, Powell's method and the DS method failed to provide satisfactory registration for 11 and 9 patients, respectively. The overlapping volume ratio (OVR) is defined to quantify the degree of partial volume overlap between the first and second MR scan. Statistical results from a logistic regression procedure demonstrated that the probability of success of Powell's method tends to decrease as OVR decreases. Rigid registration with Powell's or the DS method is therefore not suitable for the TN radiosurgery application, where OVR is likely to be low. In summary, our experimental results demonstrated that the MI-based registration method with the SA optimization technique is a robust and reliable option when the number of slices in the imaging study is limited.
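The mutual-information objective that such registration methods maximize can be estimated from a joint intensity histogram; a pure-Python sketch on flat intensity lists (a generic illustration, not the paper's implementation; real pipelines bin image arrays with NumPy):

```python
import math
from collections import Counter

def mutual_information(img_a, img_b, bins=8):
    """Histogram estimate of the mutual information (in nats) between
    two equally sized intensity sequences."""
    assert len(img_a) == len(img_b)
    lo_a, hi_a = min(img_a), max(img_a)
    lo_b, hi_b = min(img_b), max(img_b)

    def bin_of(v, lo, hi):
        if hi == lo:                       # constant image: single bin
            return 0
        return min(bins - 1, int((v - lo) / (hi - lo) * bins))

    # joint histogram of co-occurring intensity bins
    joint = Counter()
    for a, b in zip(img_a, img_b):
        joint[(bin_of(a, lo_a, hi_a), bin_of(b, lo_b, hi_b))] += 1

    n = len(img_a)
    pa, pb = Counter(), Counter()          # marginal histograms
    for (i, j), c in joint.items():
        pa[i] += c
        pb[j] += c

    # MI = sum p(i,j) * log( p(i,j) / (p(i) p(j)) )
    mi = 0.0
    for (i, j), c in joint.items():
        pij = c / n
        mi += pij * math.log(pij / ((pa[i] / n) * (pb[j] / n)))
    return mi
```

The optimizer (SA, Powell, or Downhill simplex in the study above) repeatedly transforms one image and evaluates this score: perfectly aligned, mutually predictive images score high, while a constant or unrelated image scores near zero.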
Metriplectic Simulated Annealing
Morrison, P. J.; Flierl, G. R.
2015-11-01
Metriplectic dynamics is a general form for dynamical systems that represent the first and second laws of thermodynamics: energy conservation and entropy production. Entropy production provides asymptotic stability to equilibrium states, which because of constraints need not be trivial. The formalism will be used to perform quasigeostrophic computations, akin to earlier ones, for obtaining a variety of vortex states.
LIANG WEN-XI; ZHANG JING-JUAN; LÜ JUN-FENG; LIAO RUI
2001-01-01
We have designed a spatially quantized diffractive optical element (DOE) for controlling the beam profile in a three-dimensional space with the help of the simulated annealing (SA) algorithm. In this paper, we investigate the annealing schedule and the neighbourhood, which are the deterministic parameters of the process that govern the quality of the SA algorithm. The algorithm is employed to solve the discrete stochastic optimization problem of the design of a DOE. The objective function which constrains the optimization is also studied. The computed results demonstrate that the procedure converges stably to an optimal solution close to the global optimum within an acceptable computing time. The results meet the design requirements well and are applicable.
Simulating adaptive wood harvest in a changing climate
Yousefpour, Rasoul; Nabel, Julia; Pongratz, Julia
2016-04-01
The world's forests experience substantial carbon exchange fluxes between land and atmosphere. Large carbon sinks occur in response to changes in environmental conditions (such as climate change and increased atmospheric CO2 concentrations), removing about one quarter of current anthropogenic CO2 emissions. Large sinks also occur due to regrowth of forest on areas of agricultural abandonment or forest management. Forest management, on the other hand, also leads to substantial amounts of carbon eventually being released to the atmosphere. Both sinks and sources attributable to forests are therefore dependent on the intensity of management. Forest management in turn depends on the availability of resources, which is influenced by environmental conditions and the sustainability of the management systems applied. Estimating future carbon fluxes therefore requires accounting for the interaction of environmental conditions, forest growth, and management. However, this interaction is not fully captured by current modeling approaches: Earth system models depict in detail interactions between climate, the carbon cycle, and vegetation growth, but use prescribed information on management. Resource needs and land management, however, are simulated by Integrated Assessment Models, which typically have only coarse representations of the influence of environmental changes on vegetation growth and are typically driven by the demand for wood from regional population growth and energy needs. Here we present a study that provides the link between environmental conditions, forest growth and management. We extend the land component JSBACH of the Max Planck Institute's Earth system model (MPI-ESM) to simulate potential wood harvest in response to altered growth conditions, and thus as adaptive to changing climate and CO2 conditions. We apply the altered model to estimate potential wood harvest for future climates (representative concentration pathways, RCPs) for the management scenario of
Quantum Annealing and Quantum Fluctuation Effect in Frustrated Ising Systems
Tanaka, Shu; Tamura, Ryo
2012-01-01
The quantum annealing method has attracted wide attention in statistical physics and information science, since it is expected to be a powerful method for obtaining the best solution of an optimization problem, as is simulated annealing. The quantum annealing method was incubated in quantum statistical physics. It is an alternative to simulated annealing, which is well suited to many optimization problems. In simulated annealing, we obtain a solution of an optimization problem b...
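The classical simulated annealing procedure that quantum annealing is benchmarked against in abstracts like the one above can be sketched in a few lines. This is a generic illustration; the objective, neighbor move, and cooling constants below are ours, not taken from any of the papers listed here:

```python
import math
import random

def simulated_annealing(energy, neighbor, x0, t0=1.0, alpha=0.95, steps=2000, seed=0):
    """Minimize `energy` via Metropolis sampling with a geometric cooling schedule."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    best_x, best_e = x, e
    t = t0
    for _ in range(steps):
        cand = neighbor(x, rng)
        e_cand = energy(cand)
        # Always accept downhill moves; accept uphill moves with Boltzmann probability.
        if e_cand <= e or rng.random() < math.exp(-(e_cand - e) / t):
            x, e = cand, e_cand
            if e < best_e:
                best_x, best_e = x, e
        t *= alpha  # thermal fluctuations shrink as the schedule proceeds
    return best_x, best_e

# Usage: a 1-D double-well whose global minimum lies near x = -1.
f = lambda x: (x * x - 1.0) ** 2 + 0.1 * x
step = lambda x, rng: x + rng.uniform(-0.5, 0.5)
x_min, e_min = simulated_annealing(f, step, x0=3.0)
```

Quantum annealing replaces the thermal fluctuations controlled by `t` with quantum-mechanical fluctuations (a transverse field) that are reduced over the course of the run.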
Quantum Annealing for Variational Bayes Inference
Sato, Issei; Kurihara, Kenichi; Tanaka, Shu; Nakagawa, Hiroshi; Miyashita, Seiji
2014-01-01
This paper presents studies on a deterministic annealing algorithm based on quantum annealing for variational Bayes (QAVB) inference, which can be seen as an extension of simulated annealing for variational Bayes (SAVB) inference. QAVB is as easy to implement as SAVB. Experiments revealed that QAVB finds a better local optimum than SAVB in terms of the variational free energy in latent Dirichlet allocation (LDA).
We have analyzed grazing incidence X-ray diffraction (GIXD) data from condensed phases of Langmuir films of long-chain fatty acids at the air-water interface using a new method consisting of a careful extraction of the structure factors followed by fitting of molecular parameters using simulated annealing. We show that the information contained in GIXD spectra is enough to obtain near-atomic structural information. In particular, we directly determine the orientation of the chain backbone planes and of the carboxylic headgroups, and we evaluate chain conformation defects
Sagert, I.; Fann, G. I.; Fattoyev, F. J.; Postnikov, S.; Horowitz, C. J.
2016-05-01
Background: Neutron star and supernova matter at densities just below the nuclear matter saturation density is expected to form a lattice of exotic shapes. These so-called nuclear pasta phases are caused by Coulomb frustration. Their elastic and transport properties are believed to play an important role for thermal and magnetic field evolution, rotation, and oscillation of neutron stars. Furthermore, they can impact neutrino opacities in core-collapse supernovae. Purpose: In this work, we present proof-of-principle three-dimensional (3D) Skyrme Hartree-Fock (SHF) simulations of nuclear pasta with the Multi-resolution ADaptive Numerical Environment for Scientific Simulations (MADNESS). Methods: We perform benchmark studies of 16O, 208Pb, and 238U nuclear ground states and calculate binding energies via 3D SHF simulations. Results are compared with experimentally measured binding energies as well as with theoretically predicted values from an established SHF code. The nuclear pasta simulation is initialized in the so-called waffle geometry as obtained by the Indiana University Molecular Dynamics (IUMD) code. The size of the unit cell is 24 fm with an average density of about ρ =0.05 fm-3 , proton fraction of Yp=0.3 , and temperature of T =0 MeV. Results: Our calculations reproduce the binding energies and shapes of light and heavy nuclei with different geometries. For the pasta simulation, we find that the final geometry is very similar to the initial waffle state. We compare calculations with and without spin-orbit forces. We find that while subtle differences are present, the pasta phase remains in the waffle geometry. Conclusions: Within the MADNESS framework, we can successfully perform calculations of inhomogeneous nuclear matter. By using pasta configurations from IUMD it is possible to explore different geometries and test the impact of self-consistent calculations on the latter.
Hybrid Quantum Annealing for Clustering Problems
Tanaka, Shu; Tamura, Ryo; Sato, Issei; Kurihara, Kenichi
2011-01-01
We develop a hybrid type of quantum annealing in which we control temperature and quantum field simultaneously. We study the efficiency of the proposed quantum annealing and find a good schedule for changing thermal and quantum fluctuations. In this paper, we focus on clustering problems, which are important topics in information science and engineering. With the proposed quantum annealing, we obtain better solutions of the clustering problem than with standard simulated annealing.
Quantum annealing: An introduction and new developments
Ohzeki, Masayuki; Nishimori, Hidetoshi
2010-01-01
Quantum annealing is a generic algorithm using quantum-mechanical fluctuations to search for the solution of an optimization problem. The present paper first reviews the fundamentals of quantum annealing and then reports on preliminary results for an alternative method. The review part includes the relationship of quantum annealing with classical simulated annealing. We next propose a novel quantum algorithm which might be of use for hard optimization problems by using a classical-quantum ...
Use of sensitivity-information for the adaptive simulation of thermo-hydraulic system codes
This thesis describes the development of methods for the online adaptation of dynamic plant simulations of a thermal-hydraulic system code to measurement data. The described approaches are mainly based on the use of sensitivity information in different areas: statistical sensitivity measures are used to identify the parameters to be adapted, and online sensitivities are used for the parameter adjustment itself. For the parameter adjustment, the method of a "system-adapted heuristic adaptation with partial separation" (SAHAT) was developed, which combines certain variants of parameter estimation and control with supporting procedures to solve the basic problems. The applicability of the methods is shown by adaptive simulations of a PKL-III experiment and by selected transients in a nuclear power plant. Finally, the main perspectives for the application of a tracking simulator to a system code are identified.
Computerized adaptive measurement of depression: A simulation study
Mammen Oommen
2004-05-01
Abstract Background Efficient, accurate instruments for measuring depression are increasingly important in clinical practice. We developed a computerized adaptive version of the Beck Depression Inventory (BDI). We examined its efficiency and its usefulness in identifying Major Depressive Episodes (MDE) and in measuring depression severity. Methods Subjects were 744 participants in research studies in which each subject completed both the BDI and the SCID. In addition, 285 patients completed the Hamilton Depression Rating Scale. Results The adaptive BDI had an AUC of 88% as an indicator of a SCID diagnosis of MDE, equivalent to the full BDI. The adaptive BDI asked fewer questions than the full BDI (5.6 versus 21 items). The adaptive latent depression score correlated r = .92 with the BDI total score, and the latent depression score correlated more highly with the Hamilton (r = .74) than the BDI total score did (r = .70). Conclusions Adaptive testing for depression may provide greatly increased efficiency without loss of accuracy in identifying MDE or in measuring depression severity.
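The adaptive item-selection step behind instruments like the adaptive BDI is typically maximum-information selection under an item response model. The following sketch assumes a two-parameter logistic (2PL) model and a hypothetical item bank; the BDI study's actual model and calibration are not reproduced here:

```python
import math

def p_correct(theta, a, b):
    """2PL item response model: probability of a keyed response at trait level theta."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def fisher_info(theta, a, b):
    """Fisher information of a 2PL item at theta."""
    p = p_correct(theta, a, b)
    return a * a * p * (1.0 - p)

def next_item(theta, items, asked):
    """Adaptive step: administer the unasked item most informative at the current estimate."""
    pool = [i for i in range(len(items)) if i not in asked]
    return max(pool, key=lambda i: fisher_info(theta, *items[i]))

# Hypothetical item bank of (discrimination, difficulty) pairs.
bank = [(1.2, -1.0), (0.8, 0.0), (1.5, 0.5), (1.0, 2.0)]
chosen = next_item(theta=0.4, items=bank, asked=set())  # index 2: nearby difficulty, high discrimination
```

After each response, theta is re-estimated and the loop repeats until a precision or length criterion is met, which is how a 21-item inventory can shrink to roughly 5-6 administered items.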
Zavadlav, Julija; Marrink, Siewert J; Praprotnik, Matej
2016-01-01
The adaptive resolution scheme (AdResS) is a multiscale molecular dynamics simulation approach that can concurrently couple atomistic (AT) and coarse-grained (CG) resolution regions, i.e., the molecules can freely adapt their resolution according to their current position in the system. Coupling to
Aidin Delgoshaei
2016-06-01
In this paper, a new method is proposed for scheduling dynamic cellular manufacturing systems (D-CMS) in the presence of uncertain product demands. The aim of this method is to control the process of trading off between in-house manufacturing and outsourcing while product demands are uncertain and can vary from period to period. To solve the proposed problem, a hybrid Tabu Search and Simulated Annealing algorithm is developed to overcome the hardness of the proposed model, and the results are compared with Branch-and-Bound and Simulated Annealing algorithms. A Taguchi method (L27 orthogonal array) is used to estimate the parameters of the proposed method in order to solve experiments derived from the literature. An in-depth analysis is conducted on the results in consideration of various factors. For evaluating system imbalance under dynamic market demands, a new measuring index is developed. Our findings indicate that the uncertain condition of market demands affects the routing of product parts and may induce machine-load variations that lead to cell-load diversity. The results showed that the proposed hybrid method can provide solutions of better quality.
Design of a virtual environment with adaptation of immersion for wheelchair driving simulation
Goncalves, Frédéric
2014-01-01
This thesis focuses on the adaptation of immersion in wheelchair driving simulation. It studies the realism of the sensory feedback involved in a virtual environment in order to adapt the immersion according to user preferences and the task to perform. This work was performed within the AccesSim project, supported by the Ile de France region. The project objective is to develop a dynamic simulator to evaluate the accessibility of urban environments and to conduct training in the use ...
Vigh, Csaba Attila
2013-01-01
One of the most important computational challenges in the context of the numerical treatment of Partial Differential Equations is the generation, management, and dynamic adaptivity of grids. Dynamic adaptivity is extremely important in applications that require frequent changes of the grid pattern during a simulation run. One such application example is Tsunami simulation, where waves have to be tracked with highly resolved local grids. Arbitrary unstructured grids that can ...
Online Body Schema Adaptation Based on Internal Mental Simulation and Multisensory Feedback
Vicente, Pedro; Jamone, Lorenzo; Bernardino, Alexandre
2016-01-01
In this paper, we describe a novel approach to obtain automatic adaptation of the robot body schema and to improve the robot perceptual and motor skills based on this body knowledge. Predictions obtained through a mental simulation of the body are combined with the real sensory feedback to achieve two objectives simultaneously: body schema adaptation and markerless 6D hand pose estimation. The body schema consists of a computer graphics simulation of the robot, which includes the arm and head...
Rumore, D.; Kirshen, P. H.; Susskind, L.
2014-12-01
Despite scientific consensus that the climate is changing, local efforts to prepare for and manage climate change risks remain limited. How can we raise concern about climate change risks and enhance local readiness to adapt to climate change's effects? In this presentation, we will share the lessons learned from the New England Climate Adaptation Project (NECAP), a participatory action research project that tested science-based role-play simulations as a tool for educating the public about climate change risks and simulating collective risk management efforts. NECAP was a 2-year effort involving the Massachusetts Institute of Technology, the Consensus Building Institute, the National Estuarine Research Reserve System, and four coastal New England municipalities. During 2012-2013, the NECAP team produced downscaled climate change projections, a summary risk assessment, and a stakeholder assessment for each partner community. Working with local partners, we used these assessments to create a tailored, science-based role-play simulation for each site. Through a series of workshops in 2013, NECAP engaged between 115 and 170 diverse stakeholders and members of the public in each partner municipality in playing the simulation and in a follow-up conversation about local climate change risks and possible adaptation strategies. Data were collected through before-and-after surveys administered to all workshop participants, follow-up interviews with 25 percent of workshop participants, public opinion polls conducted before and after our intervention, and meetings with public officials. This presentation will report our research findings and explain how science-based role-play simulations can be used to help communicate local climate change risks and enhance local readiness to adapt.
Toward a practical method for adaptive QM/MM simulations
R.E. Bulo; B. Ensing; J. Sikkema; L. Visscher
2009-01-01
We present an accurate adaptive multiscale molecular dynamics method that will enable the detailed study of large molecular systems that mimic experiment. The method treats the reactive regions at the quantum mechanical level and the inactive environment regions at lower levels of accuracy, while at
Computer simulation program is adaptable to industrial processes
Schultz, F. E.
1966-01-01
The Reaction Kinetics Ablation Program (REKAP), developed to simulate the ablation of various materials, provides mathematical formulations for computer programs that can simulate certain industrial processes. The programs are based on the use of nonsymmetrical difference equations that are employed to solve complex partial differential equation systems.
Faster annealing schedules for quantum annealing
Morita, Satoshi
2007-01-01
New annealing schedules for quantum annealing are proposed based on the adiabatic theorem. These schedules exhibit faster decrease of the excitation probability than a linear schedule. To derive this conclusion, the asymptotic form of the excitation probability for quantum annealing is explicitly obtained in the limit of long annealing time. Its first-order term, which is inversely proportional to the square of the annealing time, is shown to be determined only by the information at the initi...
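As a rough illustration of what a non-linear annealing schedule looks like, the sketch below contrasts a linear transverse-field schedule with a power-law one sharing the same endpoints. The functional form here is ours for illustration only; Morita derives the actual schedules from the adiabatic theorem, showing that the linear schedule's excitation probability falls off as the inverse square of the annealing time:

```python
def linear_schedule(t, tau, gamma0=1.0):
    """Linear annealing schedule: the field decreases from gamma0 to 0 over time tau."""
    return gamma0 * (1.0 - t / tau)

def power_schedule(t, tau, gamma0=1.0, r=3):
    """Power-law schedule with the same endpoints but a faster overall decrease (r > 1)."""
    return gamma0 * (1.0 - t / tau) ** r
```

Both schedules start at `gamma0` and end at zero; the power-law one stays below the linear one everywhere in between, i.e. it spends less of the annealing time at strong quantum fluctuation.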
UNFOLDING SIMULATIONS OF COLD- AND WARM-ADAPTED ELASTASES
Riccardi, Laura; Papaleo, Elena
2010-01-01
The Earth's surface is dominated by low-temperature environments, which have been successfully colonized by several extremophilic organisms. Enzymes isolated from psychrophilic organisms are able to catalyze reactions at low temperatures, at which enzymes from mesophiles or thermophiles are fully compromised. The current scenario on enzyme cold-adaptation suggests that these enzymes are characterized by higher catalytic efficiency at low temperatures, enhanced structural flexibility and lower th...
A Bacterial-Based Algorithm to Simulate Complex Adaptative Systems
González Rodríguez, Diego; Hernández Carrión, José Rodolfo
2014-01-01
Bacteria have demonstrated an amazing capacity to overcome environmental changes by collective adaptation through genetic exchanges. Using a distributed communication system and sharing individual strategies, bacteria propagate mutations as innovations that allow them to survive in different environments. In this paper we present an agent-based model which is inspired by bacterial conjugation of DNA plasmids. In our approach, agents with bounded rationality interact in a common environment ...
This study presents an efficient methodology that derives design alternatives and performance criteria of safety functions/systems in commercial nuclear power plants. Determination of design alternatives and intermediate-level performance criteria is posed as a reliability allocation problem. The reliability allocation is performed for determination of reliabilities of safety functions/systems from top-level performance criteria. The reliability allocation is a very difficult multi-objective optimization problem (MOP) as well as a global optimization problem with many local minima. The weighted Chebyshev norm (WCN) approach in combination with an improved Metropolis algorithm of simulated annealing is developed and applied to the reliability allocation problem. The hierarchy of probabilistic safety criteria (PSC) may consist of three levels, which range from the overall top level (e.g., core damage frequency, acute fatality and latent cancer fatality) through the intermediate level (e.g., unavailability of safety systems/functions) to the low level (e.g., unavailability of components, component specifications or human error). In order to determine design alternatives of safety functions/systems and the intermediate-level PSC, the reliability allocation is performed from the top-level PSC. The intermediate level corresponds to an objective space and the top level is related to a risk space. The reliability allocation is performed by means of a concept of two-tier noninferior solutions in the objective and risk spaces within the top-level PSC. In this study, two kinds of two-tier noninferior solutions are defined: intolerable intermediate-level PSC and desirable design alternatives of safety functions/systems, which are determined from Sets 1 and 2, respectively. Set 1 is obtained by maximizing simultaneously not only safety function/system unavailabilities but also risks. Set 1 reflects safety function/system unavailabilities in the worst case. Hence, the
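The weighted Chebyshev norm scalarization at the heart of this allocation scheme is simple to state: score a candidate design by its largest weighted deviation from an ideal (utopia) point, then minimize that score, e.g. with simulated annealing. A minimal sketch with hypothetical numbers (the weights, objective values, and ideal point below are illustrative only, not from the study):

```python
def weighted_chebyshev(objectives, weights, ideal):
    """Weighted Chebyshev norm: the largest weighted deviation from the ideal point."""
    return max(w * abs(f - z) for f, w, z in zip(objectives, weights, ideal))

# Hypothetical two-objective point: (system unavailability, cost).
g = weighted_chebyshev([0.02, 150.0], weights=[10.0, 0.01], ideal=[0.0, 100.0])
# g = max(10 * 0.02, 0.01 * 50) = 0.5
```

Sweeping the weight vector and minimizing `g` for each choice traces out different noninferior (Pareto) solutions, which is what makes the WCN scalarization usable inside a single-objective optimizer such as simulated annealing.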
Fuzzy Backstepping Torque Control Of Passive Torque Simulator With Algebraic Parameters Adaptation
Ullah, Nasim; Wang, Shaoping; Wang, Xingjian
2015-07-01
This work presents fuzzy backstepping control techniques applied to the load simulator for good tracking performance in the presence of extra torque and nonlinear friction effects. Assuming that the parameters of the system are uncertain and bounded, an algebraic parameter adaptation algorithm is used to adapt the unknown parameters. The effect of the transient fuzzy estimation error on the parameter adaptation algorithm is analyzed, and the fuzzy estimation error is further compensated using a saturation-function-based adaptive control law working in parallel with the actual system to improve the transient performance of the closed-loop system. The saturation-function-based adaptive control term is large in the transient phase and settles to an optimal lower value in the steady state, for which the closed-loop system remains stable. The simulation results verify the validity of the proposed control method applied to the complex aerodynamic passive load simulator.
Cartesian Off-Body Grid Adaption for Viscous Time- Accurate Flow Simulation
Buning, Pieter G.; Pulliam, Thomas H.
2011-01-01
An improved solution adaption capability has been implemented in the OVERFLOW overset grid CFD code. Building on the Cartesian off-body approach inherent in OVERFLOW and the original adaptive refinement method developed by Meakin, the new scheme provides for automated creation of multiple levels of finer Cartesian grids. Refinement can be based on the undivided second-difference of the flow solution variables, or on a specific flow quantity such as vorticity. Coupled with load-balancing and an in-memory solution interpolation procedure, the adaption process provides very good performance for time-accurate simulations on parallel compute platforms. A method of using refined, thin body-fitted grids combined with adaption in the off-body grids is presented, which maximizes the part of the domain subject to adaption. Two- and three-dimensional examples are used to illustrate the effectiveness and performance of the adaption scheme.
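A refinement sensor based on the undivided second difference, as used above for off-body adaption, can be illustrated in one dimension. This is a schematic version of the idea, not OVERFLOW's implementation:

```python
def refine_flags(u, tol):
    """Flag cells whose undivided second difference exceeds tol (refinement candidates)."""
    flags = [False] * len(u)
    for i in range(1, len(u) - 1):
        # Undivided: no division by the mesh spacing, so the sensor targets
        # solution jumps rather than approximating a derivative.
        if abs(u[i + 1] - 2.0 * u[i] + u[i - 1]) > tol:
            flags[i] = True
    return flags

# A flat field with a step between indices 4 and 5:
# only the two cells straddling the jump are flagged.
field = [0.0] * 5 + [1.0] * 5
flagged = refine_flags(field, tol=0.5)
```

Flagged cells would then receive a finer Cartesian grid level; smooth regions, where the second difference is small, are left on the coarse grid.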
Duarte, Ricardo; Eskofier, Björn; Rumpf, Martin; Wiemeyer, Josef
2016-01-01
This report documents the program and the outcomes of Dagstuhl Seminar 15382 "Modeling and Simulation of Sport Games, Sport Movements, and Adaptations to Training". The primary goal of the seminar was the continuation of interdisciplinary and transdisciplinary research in sports and computer science with the emphasis on modeling and simulation technologies. In this seminar, experts on modeling and simulation from computer science, sport science, and industry were invited to discuss rece...
Ufa Ruslan A.
2015-01-01
The motivation of the presented research is based on the need to develop new methods and tools for adequate simulation of intelligent electric power systems (IES) with active-adaptive electric networks, including Flexible Alternating Current Transmission System (FACTS) devices. The key requirements for the simulation were formulated. The presented analysis of the simulation results of IES confirms the need to use a hybrid modelling approach.
刘万辉; 田树军; 贾春强; 曹宇宁
2008-01-01
This paper establishes a mathematical model of multi-objective optimization with behavior constraints in solid space, based on the problem of optimal design of hydraulic manifold blocks (HMB). Due to the limited local search ability of the genetic algorithm (GA) in solving a massive combinatorial optimization problem, simulated annealing (SA) is combined with it, multi-parameter concatenated coding is adopted, and a memory function is added. Thus a hybrid genetic-simulated annealing algorithm with a memory function is formed. Examples show that the modified algorithm can improve both the local search ability in the solution space and the solution quality.
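A hybrid of this general kind, with a GA-style population and mutation, SA-style acceptance to sharpen local search, and a memory of the best solution found, can be sketched as follows. The operators and constants are illustrative, not the paper's HMB-specific concatenated coding:

```python
import math
import random

def hybrid_ga_sa(fitness, init_pop, mutate, t0=1.0, alpha=0.95, generations=200, seed=1):
    """Population search with GA-style mutation, SA acceptance, and a best-so-far memory."""
    rng = random.Random(seed)
    pop = list(init_pop)
    memory = min(pop, key=fitness)   # the added "memory function": best individual ever seen
    t = t0
    for _ in range(generations):
        next_pop = []
        for x in pop:
            y = mutate(x, rng)
            d = fitness(y) - fitness(x)
            # SA acceptance compensates for the GA's weak local search:
            # keep improvements, and keep some worsened offspring while t is high.
            next_pop.append(y if d <= 0 or rng.random() < math.exp(-d / t) else x)
        pop = next_pop
        gen_best = min(pop, key=fitness)
        if fitness(gen_best) < fitness(memory):
            memory = gen_best
        t *= alpha   # cooling
    return memory

# Usage on a toy 1-D objective with its minimum at x = 2.
f = lambda x: (x - 2.0) ** 2
seed_rng = random.Random(7)
start = [seed_rng.uniform(-5.0, 5.0) for _ in range(8)]
best = hybrid_ga_sa(f, start, mutate=lambda x, rng: x + rng.gauss(0.0, 0.3))
```

The memory guarantees that the returned solution is never worse than any individual visited, even though the SA acceptance rule sometimes lets the population move uphill.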
Joseph, Joby; Muthukumaran, S. [National Institute of Technology, Tamil Nadu (India)
2016-01-15
Abundant improvements have occurred in materials handling, especially in metal joining. Pulsed current gas tungsten arc welding (PCGTAW) is one of the consequential fusion techniques. In this work, PCGTAW of AISI 4135 steel produced through powder metallurgy (P/M) has been executed, and the process parameters have been highlighted applying Taguchi's L9 orthogonal array. The results show that the peak current (Ip), gas flow rate (GFR), welding speed (WS) and base current (Ib) are the critical parameters that strongly determine the tensile strength (TS) as well as the percentage of elongation (% Elong) of the joint. The practical impact of applying the Genetic algorithm (GA) and Simulated annealing (SA) to the PCGTAW process has been authenticated by means of calculating the deviation between predicted and experimental welding process parameters.
Yanhui Li
2013-01-01
Facility location, inventory control, and vehicle route scheduling are critical and highly related problems in the design of logistics systems for e-business. Meanwhile, the return ratio in Internet sales is significantly higher than in traditional business. Much of the returned merchandise has no quality defects and can reenter sales channels after a simple repackaging process. Focusing on this problem in e-commerce logistics systems, we formulate a location-inventory-routing problem model with no-quality-defects returns. To solve this NP-hard problem, an effective hybrid genetic simulated annealing algorithm (HGSAA) is proposed. Results of numerical examples show that HGSAA outperforms GA on computing time, optimal solution, and computing stability. The proposed model is very useful in helping managers make the right decisions in an e-supply chain environment.
Bailing Liu
2015-01-01
Facility location, inventory control, and vehicle route scheduling are three key issues to be settled in the design of logistics systems for e-commerce. Due to the online shopping features of e-commerce, customer returns are much more frequent than in traditional commerce. This paper studies a three-phase supply chain distribution system consisting of one supplier, a set of retailers, and a single type of product with a continuous-review (Q, r) inventory policy. We formulate a stochastic location-inventory-routing problem (LIRP) model with no-quality-defects returns. To solve the NP-hard problem, a pseudo-parallel genetic algorithm integrating simulated annealing (PPGASA) is proposed. The computational results show that PPGASA outperforms GA on optimal solution, computing time, and computing stability.
Estevez H, O.; Duque, J. [Universidad de La Habana, Instituto de Ciencia y Tecnologia de Materiales, 10400 La Habana (Cuba); Rodriguez H, J. [UNAM, Instituto de Investigaciones en Materiales, 04510 Mexico D. F. (Mexico); Yee M, H., E-mail: oestevezh@yahoo.com [Instituto Politecnico Nacional, Escuela Superior de Fisica y Matematicas, 07738 Mexico D. F. (Mexico)
2015-07-01
1-Furoyl-3,3-diphenylthiourea (FDFT) was synthesized and characterized by FTIR, 1H and 13C NMR, and ab initio X-ray powder structure analysis. FDFT crystallizes in the monoclinic space group P21 with a = 12.691(1), b = 6.026(2), c = 11.861(1) A, β = 117.95(2)° and V = 801.5(3) A3. The crystal structure has been determined from laboratory X-ray powder diffraction data using a direct-space global optimization strategy (simulated annealing) followed by Rietveld refinement. The thiourea group makes a dihedral angle of 73.8(6)° with the furoyl group. In the crystal structure, molecules are linked by van der Waals interactions, forming one-dimensional chains along the a axis. (Author)
Sousa, Tiago M; Morais, Hugo; Castro, R.;
2014-01-01
An intensive use of dispersed energy resources is expected for future power systems, including distributed generation, especially based on renewable sources, and electric vehicles. The system operation methods and tools must be adapted to the increased complexity, especially the optimal resource s...... approach and presenting a very small error concerning the objective function with a low execution time for the scenario with 2000 vehicles....
The numerical simulation tool for the MAORY multiconjugate adaptive optics system
Arcidiacono, Carmelo; Bregoli, Giovanni; Diolaiti, Emiliano; Foppiani, Italo; Agapito, Guido; Puglisi, Alfio; Xompero, Marco; Oberti, Sylvain; Cosentino, Giuseppe; Lombini, Matteo; Butler, Chris R; Ciliegi, Paolo; Cortecchia, Fausto; Patti, Mauro; Esposito, Simone; Feautrier, Philippe
2016-01-01
The Multiconjugate Adaptive Optics RelaY (MAORY) is an Adaptive Optics module to be mounted on the ESO European Extremely Large Telescope (E-ELT). It is a hybrid Natural and Laser Guide Star system that will perform the correction of the atmospheric turbulence volume above the telescope, feeding the Multi-AO Imaging Camera for Deep Observations (MICADO) near-infrared spectro-imager. We developed an end-to-end Monte-Carlo adaptive optics simulation tool to investigate the performance of MAORY and the calibration, acquisition, and operation strategies. MAORY will implement Multiconjugate Adaptive Optics combining Laser Guide Star (LGS) and Natural Guide Star (NGS) measurements. The simulation tool implements the various aspects of MAORY in an end-to-end fashion. The code has been developed using IDL and uses libraries in C++ and CUDA for efficiency improvements. Here we recall the code architecture and describe the modeled instrument components and the control strategies implemented in the code.
Stochl, Jan; Böhnke, Jan R.; Pickett, Kate E.; Croudace, Tim J.
2015-01-01
PURPOSE: Goldberg's General Health Questionnaire (GHQ) items are frequently used to assess psychological distress, but no study to date has investigated the GHQ-30's potential for adaptive administration. In computerized adaptive testing (CAT), items are matched optimally to the targeted distress level of respondents instead of relying on fixed-length versions of instruments. We therefore calibrate GHQ-30 items and report a simulation study exploring the potential of this instrument for adaptiv...
An adaptative finite element method for turbulent flow simulations
After outlining the space and time discretization methods used in the N3S thermal-hydraulic code developed at EDF/NHL, we describe the capabilities of the peripheral version, the Adaptative Mesh, which comprises two separate parts: the error indicator computation and the development of a module subdividing elements, usable by the solid dynamics code ASTER and the electromagnetism code TRIFOU, also developed by R and DD. The error indicators implemented in N3S are described. They consist of a projection indicator quantifying the space error in laminar or turbulent flow calculations and a Navier-Stokes residue indicator calculated on each element. The method for subdivision of triangles into four sub-triangles and of tetrahedra into eight sub-tetrahedra is then presented with its advantages and drawbacks. It is illustrated by examples showing the efficiency of the module. The last example concerns the 2D case of flow behind a backward-facing step. (authors). 9 refs., 5 figs., 1 tab
Cluster Optimization and Parallelization of Simulations with Dynamically Adaptive Grids
Schreiber, Martin
2013-01-01
The present paper studies solvers for partial differential equations that work on dynamically adaptive grids stemming from spacetrees. Due to the underlying tree formalism, such grids efficiently can be decomposed into connected grid regions (clusters) on-the-fly. A graph on those clusters classified according to their grid invariancy, workload, multi-core affinity, and further meta data represents the inter-cluster communication. While stationary clusters already can be handled more efficiently than their dynamic counterparts, we propose to treat them as atomic grid entities and introduce a skip mechanism that allows the grid traversal to omit those regions completely. The communication graph ensures that the cluster data nevertheless are kept consistent, and several shared memory parallelization strategies are feasible. A hyperbolic benchmark that has to remesh selected mesh regions iteratively to preserve conforming tessellations acts as benchmark for the present work. We discuss runtime improvements resulting from the skip mechanism and the implications on shared memory performance and load balancing. © 2013 Springer-Verlag.
Numerical simulation of supersonic over/under expanded jets using adaptive grid
Supersonic under- and over-expanded jets were simulated numerically. In order to achieve the solution efficiently and with high resolution, an adaptive grid was used. The axisymmetric, compressible, time-dependent Navier-Stokes equations in body-fitted curvilinear coordinates were solved numerically. The equations were discretized using the control-volume method and the Van Leer flux-splitting approach, and were solved implicitly. The resulting computer code was used to simulate four cases of moderately and strongly under- and over-expanded jet flows. The results show that with grid adaptation the various features of this complicated flow can be observed. It was shown that the adaptation method is very efficient and is able to place fine grids near the high-gradient regions. (author)
Role-play simulations for climate change adaptation education and engagement
Rumore, Danya; Schenk, Todd; Susskind, Lawrence
2016-08-01
In order to effectively adapt to climate change, public officials and other stakeholders need to rapidly enhance their understanding of local risks and their ability to collaboratively and adaptively respond to them. We argue that science-based role-play simulation exercises -- a type of 'serious game' involving face-to-face mock decision-making -- have considerable potential as education and engagement tools for enhancing readiness to adapt. Prior research suggests role-play simulations and other serious games can foster public learning and encourage collective action in public policy-making contexts. However, the effectiveness of such exercises in the context of climate change adaptation education and engagement has heretofore been underexplored. We share results from two research projects that demonstrate the effectiveness of role-play simulations in cultivating climate change adaptation literacy, enhancing collaborative capacity and facilitating social learning. Based on our findings, we suggest such exercises should be more widely embraced as part of adaptation professionals' education and engagement toolkits.
Using statistical sensitivities for adaptation of a best-estimate thermo-hydraulic simulation model
On-line adaptation of best-estimate simulations of NPP behaviour to time-dependent measurement data can be used to ensure that simulations performed in parallel to plant operation develop synchronously with the real plant behaviour, even over extended periods of time. This opens up a range of applications, including operator support in non-standard situations, improved diagnostics, and validation of measurements in real plants or experimental facilities. A number of adaptation methods have been proposed and successfully applied to control problems. However, these methods are difficult to apply to best-estimate thermal-hydraulic codes, such as TRACE and ATHLET, with their large nonlinear differential equation systems and sophisticated time-integration techniques. This paper presents techniques that use statistical sensitivity measures to overcome those problems by reducing the number of parameters subject to adaptation. It describes how to identify the most significant parameters for adaptation and how this information can be used by combining: - decomposition techniques splitting the system into a small set of component parts with clearly defined interfaces, where boundary conditions can be derived from the measurement data; - filtering techniques to ensure that the time frame for adaptation is meaningful; - numerical sensitivities to find minimal error conditions. The suitability of combining those techniques is shown by application to an adaptive simulation of the PKL experiment.
Multi-level adaptive simulation of transient two-phase flow in heterogeneous porous media
Chueh, C.C.
2010-10-01
An implicit pressure and explicit saturation (IMPES) finite element method (FEM) incorporating a multi-level shock-type adaptive refinement technique is presented and applied to investigate transient two-phase flow in porous media. Local adaptive mesh refinement is implemented seamlessly with state-of-the-art artificial diffusion stabilization allowing simulations that achieve both high resolution and high accuracy. Two benchmark problems, modelling a single crack and a random porous medium, are used to demonstrate the robustness of the method and illustrate the capabilities of the adaptive refinement technique in resolving the saturation field and the complex interaction (transport phenomena) between two fluids in heterogeneous media. © 2010 Elsevier Ltd.
Marco A. C. Benvenga
2011-10-01
Kinetic simulation and drying-process optimization of corn malt by Simulated Annealing (SA), for estimation of the temperature and time parameters that preserve maximum amylase activity in the obtained product, are presented here. Germinated corn seeds were dried at 54-76 °C in a convective dryer, with periodic measurement of moisture content and enzymatic activity. The experimental data obtained were submitted to modeling. Simulation and optimization of the drying process were performed using the SA method, a randomized improvement algorithm analogous to the physical annealing process. Results showed that seeds were best dried between 3 h and 5 h. Among the models used in this work, the kinetic model of water diffusion into corn seeds showed the best fit. Drying temperature and time showed a quadratic influence on the enzymatic activity. Optimization through SA found the best condition at 54 °C and between 5.6 h and 6.4 h of drying, with a specific activity in the corn malt of 5.26±0.06 SKB/mg at 15.69±0.10% remaining moisture.
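The SA procedure used above can be illustrated with a minimal continuous-parameter sketch. The quadratic "activity loss" objective and all numerical values below are hypothetical stand-ins, not the authors' fitted kinetic model:

```python
import math
import random

def simulated_annealing(f, x0, bounds, t0=1.0, cooling=0.95, steps=2000, seed=1):
    """Minimise f over box-constrained parameters with a basic SA loop."""
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    best, fbest = list(x), fx
    temp = t0
    for _ in range(steps):
        # Propose a neighbour by perturbing one randomly chosen parameter.
        cand = list(x)
        i = rng.randrange(len(x))
        lo, hi = bounds[i]
        cand[i] = min(hi, max(lo, cand[i] + rng.gauss(0.0, 0.1 * (hi - lo))))
        fc = f(cand)
        # Metropolis rule: always accept improvements, sometimes accept worse moves.
        if fc < fx or rng.random() < math.exp((fx - fc) / max(temp, 1e-12)):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        temp *= cooling  # geometric cooling schedule
    return best, fbest

# Hypothetical smooth response surface: activity loss grows away from
# 54 degC and 6 h (a stand-in for the fitted kinetic model in the paper).
def activity_loss(p):
    T, t = p
    return (T - 54.0) ** 2 / 100.0 + (t - 6.0) ** 2

best, loss = simulated_annealing(activity_loss, [65.0, 4.0],
                                 bounds=[(54.0, 76.0), (3.0, 8.0)])
```

The occasional acceptance of worse moves at high temperature is what lets SA escape local optima of a multimodal response surface before the cooling schedule makes the search effectively greedy.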
Annealing evolutionary stochastic approximation Monte Carlo for global optimization
Liang, Faming
2010-04-08
In this paper, we propose a new algorithm, the so-called annealing evolutionary stochastic approximation Monte Carlo (AESAMC) algorithm, as a general optimization technique and study its convergence. AESAMC possesses a self-adjusting mechanism whose target distribution can be adapted at each iteration according to the current samples. Thus, AESAMC falls into the class of adaptive Monte Carlo methods. This mechanism also makes AESAMC less prone than nonadaptive MCMC algorithms to becoming trapped in local energy minima. Under mild conditions, we show that AESAMC can converge weakly toward a neighboring set of global minima in the space of energy. AESAMC is tested on multiple optimization problems. The numerical results indicate that AESAMC can potentially outperform simulated annealing, the genetic algorithm, annealing stochastic approximation Monte Carlo, and some other metaheuristics in function optimization. © 2010 Springer Science+Business Media, LLC.
This paper presents a global approach permitting accurate simulation of the ultra-shallow junction fabrication process. Physically based models of dopant implantation (BCA) and diffusion (including coupling of point and extended defects) are integrated within a single simulation tool. A useful set of the relevant parameters has been obtained through an original calibration methodology. It is shown that this approach provides an efficient tool for process modelling.
Cosmological Shocks in Adaptive Mesh Refinement Simulations and the Acceleration of Cosmic Rays
Skillman, Samuel W.; O'Shea, Brian W.; Hallman, Eric J.; Burns, Jack O.; Michael L. Norman
2008-01-01
We present new results characterizing cosmological shocks within adaptive mesh refinement N-body/hydrodynamic simulations that are used to predict non-thermal components of large-scale structure. This represents the first study of shocks using adaptive mesh refinement. We propose a modification of the shock-finding algorithms used in unigrid simulations that reduces the frequency of low Mach number shocks by a factor of ~3. We then apply our new technique to a large, (512 Mpc/h)^3, ...
Balin Talamba, D.; Higy, C.; Joerin, C.; Musy, A.
The paper presents an application concerning hydrological modelling for the Haute-Mentue catchment, located in western Switzerland. A simplified version of Topmodel, developed in a Labview programming environment, was applied with the aim of modelling the hydrological processes of this catchment. Previous research carried out in this region outlined the importance of environmental tracers in studying hydrological behaviour, and important knowledge has been accumulated during this period concerning the mechanisms responsible for runoff generation. In conformity with the theoretical constraints, Topmodel was applied to a Haute-Mentue sub-catchment where tracing experiments consistently showed low contributions of soil water during flood events. The model was applied for two humid periods in 1998. First, the model was calibrated to provide the best estimates of total runoff. However, the simulated components (groundwater and rapid flow) deviated far from the reality indicated by the tracing experiments. Thus, a new calibration was performed that included the additional information given by environmental tracing. The calibration of the model was done using simulated annealing (SA) techniques, which are easy to implement and statistically allow convergence to a global minimum. The only problem is that the method is time- and computer-consuming. To improve this, a version of SA known as very fast simulated annealing (VFSA) was used. The principles are the same as for the SA technique: the random search is guided by a certain probability distribution and the acceptance criterion is the same as for SA, but VFSA better takes into account the range of variation of each parameter. Practice with Topmodel showed that the energy function has different sensitivities along different dimensions of the parameter space. The VFSA algorithm allows differentiated search in relation with the
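The VFSA variant mentioned above can be sketched as follows. It uses Ingber's fat-tailed generating distribution with a per-parameter temperature schedule, which is what lets it respect each parameter's own range of variation; the two-parameter "energy" function and the cooling constant below are illustrative assumptions, not the Topmodel calibration itself:

```python
import math
import random

def vfsa(f, bounds, steps=2000, c=1.0, seed=0):
    """Very fast simulated annealing sketch: each parameter is perturbed by a
    fat-tailed draw scaled by the annealing temperature T(k) = exp(-c * k**(1/D)),
    and candidates are kept inside each parameter's own bounds."""
    rng = random.Random(seed)
    dim = len(bounds)
    x = [lo + rng.random() * (hi - lo) for lo, hi in bounds]
    fx = f(x)
    best, fbest = list(x), fx
    for k in range(1, steps + 1):
        temp = math.exp(-c * k ** (1.0 / dim))
        cand = []
        for xi, (lo, hi) in zip(x, bounds):
            while True:
                u = rng.random()
                # Ingber's generating distribution: heavy tails at any temperature,
                # so occasional large jumps remain possible even when nearly cold.
                y = math.copysign(1.0, u - 0.5) * temp * (
                    (1.0 + 1.0 / temp) ** abs(2.0 * u - 1.0) - 1.0)
                xn = xi + y * (hi - lo)
                if lo <= xn <= hi:
                    cand.append(xn)
                    break
        fc = f(cand)
        # Metropolis acceptance with the same annealing temperature.
        if fc < fx or rng.random() < math.exp((fx - fc) / max(temp, 1e-300)):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
    return best, fbest

# Illustrative two-parameter "energy" with different sensitivities per axis,
# mimicking the anisotropy the authors report for the Topmodel energy function.
energy = lambda p: 4.0 * (p[0] - 0.3) ** 2 + (p[1] + 0.2) ** 2
best, fbest = vfsa(energy, [(-1.0, 1.0), (-1.0, 1.0)])
```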
EVENT-DRIVEN SIMULATION OF INTEGRATE-AND-FIRE MODELS WITH SPIKE-FREQUENCY ADAPTATION
Lin Xianghong; Zhang Tianwen
2009-01-01
The evoked spike discharges of a neuron depend critically on the recent history of its electrical activity. A well-known example is the phenomenon of spike-frequency adaptation, a commonly observed property of neurons. In this paper, using a leaky integrate-and-fire model that includes an adaptation current, we propose an event-driven strategy to simulate integrate-and-fire models with spike-frequency adaptation. Such an approach is more precise than the traditional clock-driven numerical integration approach because the timing of spikes is treated exactly. In experiments, we simulated the adaptation time course of a single neuron and of a random network with spike-timing dependent plasticity using both event-driven and clock-driven strategies. The results indicate that (1) the temporal precision of spiking events affects the neuronal dynamics of both the single neuron and the network under the different simulation strategies, and (2) the simulation time scales linearly with the total number of spiking events in the event-driven simulation strategy.
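The event-driven idea can be sketched with a minimal adaptive leaky integrate-and-fire neuron: between input events the two-variable linear subthreshold system is advanced with its closed-form solution, so no fixed time step is needed. All parameter values below, and the simplification of checking the threshold only at event times, are assumptions for illustration rather than the paper's exact model:

```python
import math

class AdaptiveLIF:
    """Event-driven LIF neuron with adaptation current w:
    dV/dt = -V/tau_m - w,  dw/dt = -w/tau_w  (closed form between events)."""
    def __init__(self, tau_m=20.0, tau_w=100.0, v_thresh=1.0, v_reset=0.0, b=0.05):
        self.tau_m, self.tau_w = tau_m, tau_w
        self.v_thresh, self.v_reset, self.b = v_thresh, v_reset, b
        self.v, self.w, self.t = 0.0, 0.0, 0.0

    def _advance_to(self, t):
        # Exact two-exponential solution of the linear subthreshold dynamics
        # (valid for tau_m != tau_w).
        dt = t - self.t
        em, ew = math.exp(-dt / self.tau_m), math.exp(-dt / self.tau_w)
        a = self.w * self.tau_m * self.tau_w / (self.tau_m - self.tau_w)
        self.v = (self.v - a) * em + a * ew
        self.w *= ew
        self.t = t

    def receive(self, t, weight):
        """Process one input spike; return True if an output spike fires.
        (Simplification: the threshold is only checked at event times.)"""
        self._advance_to(t)
        self.v += weight
        if self.v >= self.v_thresh:
            self.v = self.v_reset
            self.w += self.b   # spike-triggered adaptation current increment
            return True
        return False

# Regular 0.6-strength input every 5 ms: firing slows as w accumulates,
# which is exactly the spike-frequency adaptation phenomenon.
neuron = AdaptiveLIF()
spikes = [float(t) for t in range(0, 200, 5) if neuron.receive(float(t), 0.6)]
```

Because the state is only touched when a spike arrives, the cost of a run scales with the number of spiking events, matching observation (2) in the abstract.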
Cari Pérez-Vives; César Albarrán-Diego; Santiago García-Lázaro; Teresa Ferrer-Blasco; Robert Montés-Micó
2014-01-01
Purpose: To compare optical and visual quality of implantable collamer lens (ICL) implantation and femtosecond laser in situ keratomileusis (F-LASIK) for myopia. Methods: The CRX1 adaptive optics visual simulator (Imagine Eyes, Orsay, France) was used to simulate the wavefront aberration pattern after the two surgical procedures for -3-diopter (D) and -6-D myopia. Visual acuity at different contrasts and contrast sensitivities at 10, 20, and 25 cycles/degree (cpd) were measured for 3-mm an...
Dynamically adaptive Lattice Boltzmann simulation of shallow water flows with the Peano framework
Neumann, Philipp
2015-09-01
© 2014 Elsevier Inc. All rights reserved. We present a dynamically adaptive Lattice Boltzmann (LB) implementation for solving the shallow water equations (SWEs). Our implementation extends an existing LB component of the Peano framework. We revise the modular design with respect to the incorporation of new simulation aspects and LB models. The basic SWE-LB implementation is validated in different breaking dam scenarios. We further provide a numerical study on stability of the MRT collision operator used in our simulations.
Luangpaiboon, P.
2009-10-01
Many entrepreneurs face extreme pressures, for instance in costs, quality, sales and services. Moreover, technology has always been intertwined with our demands, so most manufacturers and assembly lines adopt it and inevitably end up with more complicated processes. At this stage, product and service improvement is needed to stay ahead of competitors in a sustainable way. Simulated process optimisation is thus an alternative way of solving huge and complex problems. Metaheuristics are sequential processes that perform exploration and exploitation in the solution space, aiming to efficiently find near-optimal solutions with natural intelligence as a source of inspiration. One of the most well-known metaheuristics is Ant Colony Optimisation (ACO). This paper aims to ease the complexity of using ACO in terms of its parameters: the numbers of iterations, ants and moves. Proper levels of these parameters are analysed on eight noisy non-linear continuous response surfaces. Considering the solution space in a specified region, some surfaces contain a global optimum and multiple local optima, and some have a curved ridge. ACO parameters are determined through hybridisations of the Modified Simplex and Simulated Annealing methods on the path of Steepest Ascent (SAM). SAM was introduced to recommend preferable levels of ACO parameters via statistically significant regression analysis and Taguchi's signal-to-noise ratio. Other performance measures include minimax and mean squared error. A series of computational experiments using each algorithm was conducted. Experimental results were analysed in terms of mean, design points and best-so-far solutions. It was found that results obtained from a hybridisation with the stochastic procedures of the Simulated Annealing method were better than those using the Modified Simplex algorithm. However, the average execution time of experimental runs and number of design points using hybridisations were
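To make the tuned parameters concrete, here is a minimal ACO, applied to a small symmetric TSP rather than the paper's continuous surfaces, that exposes the quantities studied above: the numbers of iterations, ants, and moves (the tour-construction steps each ant takes). All numeric settings are illustrative defaults, not the recommended levels from the paper:

```python
import math
import random

def aco_tsp(points, n_ants=20, n_iters=50, alpha=1.0, beta=2.0, rho=0.5, seed=0):
    """Minimal Ant Colony Optimisation for a symmetric TSP."""
    rng = random.Random(seed)
    n = len(points)
    dist = [[math.dist(points[i], points[j]) or 1e-12 for j in range(n)]
            for i in range(n)]
    tau = [[1.0] * n for _ in range(n)]            # pheromone trails
    best_tour, best_len = None, float("inf")
    for _ in range(n_iters):
        tours = []
        for _ in range(n_ants):
            start = rng.randrange(n)
            tour, unvisited = [start], set(range(n)) - {start}
            while unvisited:
                i = tour[-1]
                # Transition rule: pheromone**alpha * (1/distance)**beta.
                weights = [(j, tau[i][j] ** alpha * (1.0 / dist[i][j]) ** beta)
                           for j in unvisited]
                r = rng.random() * sum(w for _, w in weights)
                acc, chosen = 0.0, weights[-1][0]   # fallback guards rounding
                for j, w in weights:
                    acc += w
                    if acc >= r:
                        chosen = j
                        break
                tour.append(chosen)
                unvisited.discard(chosen)
            length = sum(dist[tour[k]][tour[(k + 1) % n]] for k in range(n))
            tours.append((tour, length))
            if length < best_len:
                best_tour, best_len = tour, length
        # Evaporate, then deposit pheromone in proportion to tour quality.
        tau = [[(1.0 - rho) * t for t in row] for row in tau]
        for tour, length in tours:
            for k in range(n):
                i, j = tour[k], tour[(k + 1) % n]
                tau[i][j] += 1.0 / length
                tau[j][i] += 1.0 / length
    return best_tour, best_len

# Eight cities on a circle: the optimal tour visits them in angular order.
pts = [(math.cos(k * math.pi / 4), math.sin(k * math.pi / 4)) for k in range(8)]
tour, length = aco_tsp(pts)
```

Raising `n_ants` or `n_iters` improves solution quality at a linear cost in tour constructions, which is precisely the trade-off the parameter study above tries to optimise.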
Yao Yao
One of the important challenges in the field of evolutionary robotics is the development of systems that can adapt to a changing environment. However, the ability to adapt to unknown and fluctuating environments is not straightforward. Here, we explore the adaptive potential of simulated swarm robots that contain a genomic encoding of a bio-inspired gene regulatory network (GRN). An artificial genome is combined with a flexible agent-based system representing the activated part of the regulatory network, which transduces environmental cues into phenotypic behaviour. Using an artificial life simulation framework that mimics a dynamically changing environment, we show that separating the static from the conditionally active part of the network contributes to better adaptive behaviour. Furthermore, in contrast with most hitherto developed ANN-based systems, which need to re-optimize their complete controller network from scratch each time they are subjected to novel conditions, our system uses its genome to store GRNs whose performance was optimized under a particular environmental condition for a sufficiently long time. When subjected to a new environment, the previous condition-specific GRN might become inactivated, but it remains present. This ability to store 'good behaviour' and to disconnect it from the novel rewiring that is essential under a new condition allows faster re-adaptation if any of the previously observed environmental conditions is re-encountered. As we show here, applying these evolutionary principles leads to accelerated and improved adaptive evolution in a non-stable environment.
Ceulemans, Eva; Van Mechelen, Iven; Leenen, Iwin
2007-01-01
Hierarchical classes models are quasi-order retaining Boolean decomposition models for N-way N-mode binary data. To fit these models to data, rationally started alternating least squares (or, equivalently, alternating least absolute deviations) algorithms have been proposed. Extensive simulation studies showed that these algorithms succeed quite…
Simulation Research on Adaptive Control of a Six-degree-of-freedom Material-testing Machine
Dan Wang
2014-02-01
This paper presents an adaptive controller equipped with a stiffness estimation method for a novel material-testing machine, in order to alleviate the performance degradation caused by the stiffness variation of the tested specimen. The dynamic model of the proposed machine is built using the Kane method, and the kinematic model is established with a closed-form solution. The stiffness estimation method is developed based on the recursive least-squares method and the proposed stiffness equivalent matrix. The control performance of the adaptive controller is simulated in detail. The simulation results illustrate that the proposed controller can greatly improve the control performance of the target material-testing machine by online stiffness estimation and adaptive parameter tuning, especially in low-cycle fatigue (LCF) and high-cycle fatigue (HCF) tests.
The Self-Adaptive Fuzzy PID Controller in Actuator Simulated Loading System
Chuanhui Zhang
2013-05-01
This paper analyzes the structural principle of the actuator simulated loading system with variable stiffness and establishes a simplified model. It also investigates the application of self-adaptive tuning of fuzzy PID (Proportion Integration Differentiation) control in the actuator simulated loading system with variable stiffness. Because the loading system is connected to the steering system by a spring rod, there is strong coupling; in addition, parametric variations accompany the variations in stiffness. Based on feed-forward compensation of the disturbance introduced by the motion of the steering engine, system performance can be improved by using fuzzy adaptive PID control to compensate for the changes in system parameters caused by changes in stiffness. By combining fuzzy control with traditional PID control, fuzzy adaptive PID control is able to choose the parameters more appropriately.
3D Simulation of Flow with Free Surface Based on Adaptive Octree Mesh System
Li Shaowu; Zhuang Qian; Huang Xiaoyun; Wang Dong
2015-01-01
The technique of adaptive tree meshing is an effective way to reduce computational cost through automatic adjustment of cell size according to necessity. In the present study, a 2D numerical N-S solver based on an adaptive quadtree mesh system was extended to 3D, in which a spatially adaptive octree mesh system and a multiple particle level set method were adopted for convenience in dealing with the air-water-structure multi-medium coexisting domain. The stretching process of a dumbbell was simulated, and the results indicate that the meshes adapt well to the free surface. The collapse of a water column impinging on a circular cylinder was simulated, and the results show that the processes of fluid splitting and merging are properly captured. The interaction of second-order Stokes waves with a square cylinder was simulated, and the obtained drag force is consistent with the result of the Morison wave force formula with the coefficient values of the stable drag component and the inertial force component being set as 2.54.
Adaptive finite element method assisted by stochastic simulation of chemical systems
Cotter, S.L.; Vejchodský, Tomáš; Erban, R.
2013-01-01
Roč. 35, č. 1 (2013), B107-B131. ISSN 1064-8275 R&D Projects: GA AV ČR(CZ) IAA100190803 Institutional support: RVO:67985840 Keywords : chemical Fokker-Planck * adaptive meshes * stochastic simulation algorithm Subject RIV: BA - General Mathematics Impact factor: 1.940, year: 2013 http://epubs.siam.org/doi/abs/10.1137/120877374
Simulation and Performance Analysis of Adaptive Filtering Algorithms in Noise Cancellation
Ferdouse, Lilatul; Nipa, Tamanna Haque; Jaigirdar, Fariha Tasmin
2011-01-01
Noise problems in signals have gained huge attention due to the need for noise-free output signals in numerous communication systems. The principle of adaptive noise cancellation is to acquire an estimate of the unwanted interfering signal and subtract it from the corrupted signal. The noise cancellation operation is controlled adaptively with the target of achieving an improved signal-to-noise ratio. This paper concentrates upon the analysis of adaptive noise cancellers using the Recursive Least Squares (RLS), Fast Transversal Recursive Least Squares (FTRLS) and Gradient Adaptive Lattice (GAL) algorithms. The performance analysis of the algorithms is done based on convergence behavior, convergence time, correlation coefficients and signal-to-noise ratio. After comparing all the simulated results, we observed that GAL performs the best in noise cancellation in terms of correlation coefficient, SNR and convergence time. RLS, FTRLS and GAL were never evaluated and compared before on their performance in noise cancellation in ...
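The RLS variant in the comparison above can be sketched as a noise canceller: the filter predicts the interference from a reference input, and the residual is the cleaned signal. The filter order, forgetting factor, and the synthetic two-tap interference path below are assumptions for illustration, not the paper's experimental setup:

```python
import math
import random

class RLSFilter:
    """Recursive Least Squares adaptive filter used as a noise canceller."""
    def __init__(self, order=4, lam=0.999, delta=100.0):
        self.order, self.lam = order, lam
        self.w = [0.0] * order                  # adaptive weights
        self.P = [[delta if i == j else 0.0 for j in range(order)]
                  for i in range(order)]        # inverse correlation matrix
        self.x = [0.0] * order                  # reference tap-delay line

    def step(self, ref, primary):
        self.x = [ref] + self.x[:-1]
        x, M = self.x, self.order
        # Gain vector k = P x / (lam + x^T P x).
        Px = [sum(self.P[i][j] * x[j] for j in range(M)) for i in range(M)]
        denom = self.lam + sum(x[i] * Px[i] for i in range(M))
        k = [v / denom for v in Px]
        # Residual e = primary - (noise estimate); this is the cleaned output.
        e = primary - sum(w * xi for w, xi in zip(self.w, x))
        self.w = [w + ki * e for w, ki in zip(self.w, k)]
        self.P = [[(self.P[i][j] - k[i] * Px[j]) / self.lam for j in range(M)]
                  for i in range(M)]
        return e

# Synthetic scenario: sinusoidal signal corrupted by a filtered version of a
# measurable noise reference (the FIR path [0.8, -0.3] is an assumption).
rng = random.Random(0)
flt = RLSFilter()
hist, errs = [0.0, 0.0], []
for n in range(2000):
    s = math.sin(0.05 * n)                      # clean signal
    v = rng.uniform(-1.0, 1.0)                  # noise reference
    hist = [v] + hist[:-1]
    primary = s + 0.8 * hist[0] - 0.3 * hist[1]
    errs.append((flt.step(v, primary) - s) ** 2)
```

Because the desired signal is uncorrelated with the reference, the weights converge toward the interference path and the residual converges toward the clean signal, which is the mechanism behind the SNR improvement measured in the paper.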
Adaptive Wavelet Collocation Method for Simulation of Time Dependent Maxwell's Equations
Li, Haojun; Rieder, Andreas; Freude, Wolfgang
2012-01-01
This paper investigates an adaptive wavelet collocation time-domain method for the numerical solution of Maxwell's equations. In this method a computational grid is dynamically adapted at each time step by using the wavelet decomposition of the field at that time instant. In regions where the fields are highly localized, the method assigns more grid points; in regions where the fields are sparse, there are fewer grid points. On the adapted grid, update schemes with high spatial order and explicit time stepping are formulated. The method has a high compression rate, which substantially reduces the computational cost, allowing efficient use of computational resources. This adaptive wavelet collocation method is especially suitable for the simulation of guided-wave optical devices.
Application of Simulated Annealing to Solve the Traveling Salesman Problem
Larasati, Tuti
2012-01-01
The traveling salesman problem is a combinatorial optimization problem that aims to find the route of minimum total length. One algorithm that can be used to solve it is simulated annealing. Simulated annealing is an analogy of the cooling process of liquid metals, called annealing: the metallurgical process of heating up a solid and then cooling it slowly until it crystallizes. This final project presents an analogy an...
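The annealing analogy applied to the TSP can be sketched as follows: start from a random tour, propose 2-opt segment reversals, and accept worse tours with a probability that shrinks as the system "cools". The cooling schedule and step count are illustrative choices:

```python
import math
import random

def sa_tsp(points, t0=10.0, cooling=0.999, steps=20000, seed=0):
    """Simulated annealing for the TSP with 2-opt reversal moves."""
    rng = random.Random(seed)
    n = len(points)
    d = lambda i, j: math.dist(points[i], points[j])
    tour = list(range(n))
    rng.shuffle(tour)
    length = sum(d(tour[k], tour[(k + 1) % n]) for k in range(n))
    temp = t0
    for _ in range(steps):
        i, j = sorted(rng.sample(range(n), 2))
        if i == 0 and j == n - 1:       # reversing the whole cycle is a no-op
            continue
        a, b = tour[i - 1], tour[i]
        c, e = tour[j], tour[(j + 1) % n]
        # Length change from replacing edges (a,b),(c,e) by (a,c),(b,e).
        delta = d(a, c) + d(b, e) - d(a, b) - d(c, e)
        if delta < 0 or rng.random() < math.exp(-delta / max(temp, 1e-12)):
            tour[i:j + 1] = reversed(tour[i:j + 1])
            length += delta
        temp *= cooling                  # slow "cooling" of the metal
    return tour, length

# Ten cities on a circle; the shortest tour follows the circle.
pts = [(math.cos(k * math.pi / 5), math.sin(k * math.pi / 5)) for k in range(10)]
tour, length = sa_tsp(pts)
```

Note that only the length *change* of each reversal is evaluated, so a move costs O(1) rather than O(n); this is the standard trick that makes SA practical for larger instances.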
Sagert, I; Fattoyev, F J; Postnikov, S; Horowitz, C J
2015-01-01
Neutron star and supernova matter at densities just below the nuclear matter saturation density is expected to form a lattice of exotic shapes. These so-called nuclear pasta phases are caused by Coulomb frustration. Their elastic and transport properties are believed to play an important role for thermal and magnetic field evolution, rotation and oscillation of neutron stars. Furthermore, they can impact neutrino opacities in core-collapse supernovae. In this work, we present proof-of-principle 3D Skyrme Hartree-Fock (SHF) simulations of nuclear pasta with the Multi-resolution ADaptive Numerical Environment for Scientific Simulations (MADNESS). We perform benchmark studies of $^{16}\mathrm{O}$, $^{208}\mathrm{Pb}$ and $^{238}\mathrm{U}$ nuclear ground states and calculate binding energies via 3D SHF simulations. Results are compared with experimentally measured binding energies as well as with theoretically predicted values from an established SHF code. The nuclear pasta simulation is initialized in the so...
Highlights: → Non-linear modeling of metabonomic data using K-OPLS. → automated optimization of the kernel parameter by simulated annealing. → K-OPLS provides improved prediction performance for exemplar spectral data sets. → software implementation available for R and Matlab under GPL v2 license. - Abstract: Linear multivariate projection methods are frequently applied for predictive modeling of spectroscopic data in metabonomic studies. The OPLS method is a commonly used computational procedure for characterizing spectral metabonomic data, largely due to its favorable model interpretation properties providing separate descriptions of predictive variation and response-orthogonal structured noise. However, when the relationship between descriptor variables and the response is non-linear, conventional linear models will perform sub-optimally. In this study we have evaluated to what extent a non-linear model, kernel-based orthogonal projections to latent structures (K-OPLS), can provide enhanced predictive performance compared to the linear OPLS model. Just like its linear counterpart, K-OPLS provides separate model components for predictive variation and response-orthogonal structured noise. The improved model interpretation by this separate modeling is a property unique to K-OPLS in comparison to other kernel-based models. Simulated annealing (SA) was used for effective and automated optimization of the kernel-function parameter in K-OPLS (SA-K-OPLS). Our results reveal that the non-linear K-OPLS model provides improved prediction performance in three separate metabonomic data sets compared to the linear OPLS model. We also demonstrate how response-orthogonal K-OPLS components provide valuable biological interpretation of model and data. The metabonomic data sets were acquired using proton Nuclear Magnetic Resonance (NMR) spectroscopy, and include a study of the liver toxin galactosamine, a study of the nephrotoxin mercuric chloride and a study of
Parallel Adaptive Simulation of Detonation Waves Using a Weighted Essentially Non-Oscillatory Scheme
McMahon, Sean
The purpose of this thesis was to develop a code for building a better understanding of the physics of detonation waves. First, a detonation was simulated in one dimension using ZND theory. Then, using the 1D solution as an initial condition, a detonation was simulated in two dimensions using a weighted essentially non-oscillatory scheme on an adaptive mesh, with the smallest length scales equal to 2-3 flamelet lengths. The code development linking Chemkin for chemical kinetics to the adaptive mesh refinement flow solver was completed. The detonation evolved in a way that qualitatively matched the experimental observations; however, the simulation was unable to progress past the formation of the triple point.
Calder, A. C.; Curtis, B. C.; Dursi, L. J.; Fryxell, B.; Henry, G.; MacNeice, P.; Olson, K.; Ricker, P.; Rosner, R.; Timmes, F. X.; Tufo, H. M.; Truran, J. W.; Zingale, M.
We present simulations and performance results of nuclear burning fronts in supernovae on the largest domain and at the finest spatial resolution studied to date. These simulations were performed on the Intel ASCI-Red machine at Sandia National Laboratories using FLASH, a code developed at the Center for Astrophysical Thermonuclear Flashes at the University of Chicago. FLASH is a modular, adaptive mesh, parallel simulation code capable of handling compressible, reactive fluid flows in astrophysical environments. FLASH is written primarily in Fortran 90, uses the Message-Passing Interface library for inter-processor communication and portability, and employs the PARAMESH package to manage a block-structured adaptive mesh that places blocks only where the resolution is required and tracks rapidly changing flow features, such as detonation fronts, with ease. We describe the key algorithms and their implementation as well as the optimizations required to achieve sustained performance of 238 GFLOPS on 6420 processors of ASCI-Red in 64-bit arithmetic.
Adaptive grids and numerical fluid simulations for scrape-off layer plasmas
Magnetic confinement nuclear fusion experiments create plasmas with local temperatures in excess of 100 million Kelvin. In these experiments the scrape-off layer, which is the plasma region in direct contact with the device wall, is of central importance both for the quality of the energy confinement and the wall material lifetime. To study the behaviour of the scrape-off layer, in addition to experiments, numerical simulations are used. This work investigates the use of adaptive discretizations of space and compatible numerical methods for scrape-off layer simulations. The resulting algorithms allow dynamic adaptation of computational grids aligned to the magnetic fields to precisely capture the strongly anisotropic energy and particle transport in the plasma. The methods are applied to the multi-fluid plasma code B2, with the goal of reducing the runtime of simulations and extending the applicability of the code.
Availability simulation software adaptation to the IFMIF accelerator facility RAMI analyses
Highlights: • The reason why IFMIF RAMI analyses needs a simulation is explained. • Changes, modifications and software validations done to AvailSim are described. • First IFMIF RAMI results obtained with AvailSim 2.0 are shown. • Implications of AvailSim 2.0 in IFMIF RAMI analyses are evaluated. - Abstract: Several problems were found when using generic reliability tools to perform RAMI (Reliability Availability Maintainability Inspectability) studies for the IFMIF (International Fusion Materials Irradiation Facility) accelerator. A dedicated simulation tool was necessary to model properly the complexity of the accelerator facility. AvailSim, the availability simulation software used for the International Linear Collider (ILC) became an excellent option to fulfill RAMI analyses needs. Nevertheless, this software needed to be adapted and modified to simulate the IFMIF accelerator facility in a useful way for the RAMI analyses in the current design phase. Furthermore, some improvements and new features have been added to the software. This software has become a great tool to simulate the peculiarities of the IFMIF accelerator facility allowing obtaining a realistic availability simulation. Degraded operation simulation and maintenance strategies are the main relevant features. In this paper, the necessity of this software, main modifications to improve it and its adaptation to IFMIF RAMI analysis are described. Moreover, first results obtained with AvailSim 2.0 and a comparison with previous results is shown
A well-balanced numerical scheme for shallow water simulation on adaptive grids
The efficiency of solving the two-dimensional shallow-water equations (SWEs) is vital for the simulation of large-scale flood inundation. For flood flows over real topography, a local high-resolution method using adaptive grids is required to prevent loss of accuracy in the flow pattern while saving computational cost. This paper introduces an adaptive grid model whose adaptation criterion is calculated on the basis of the water level. Grid adaptation is performed by manipulating the subdivision levels of the computational grids. As the flow features vary during shallow-wave propagation, the local grid density changes adaptively and the stored neighbor-relationship information is updated correspondingly, achieving a balance between model accuracy and running efficiency. In this work, a well-balanced (WB) scheme for solving the SWEs is introduced. In the reconstruction of the Riemann states, the definition of the unique bottom elevation on grid interfaces is modified, and the numerical scheme is automatically pre-balanced. After validation against two idealized test cases, the proposed model is applied to simulate the flood inundation caused by a dam break of the Zhanghe Reservoir, Hubei province, China. The results show that the presented model is robust and well-balanced, exhibits good computational efficiency and numerical stability, and is therefore promising for practical application.
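A water-level-based adaptation criterion of the kind described above can be sketched as flagging cells for subdivision where the free-surface gradient is steep. The gradient thresholds and the doubling rule below are assumptions for illustration, not the paper's actual criterion.

```python
import numpy as np

def refinement_levels(eta, grad_tol=0.05, max_level=3):
    """Assign subdivision levels per coarse cell from the water-level field.

    eta : 2D array of water levels on the coarse grid.
    Hypothetical rule: a cell's level grows with its free-surface
    gradient; each extra level requires a doubling of the gradient.
    """
    gy, gx = np.gradient(eta)           # surface slopes along rows/cols
    grad = np.hypot(gx, gy)             # gradient magnitude
    levels = np.zeros(eta.shape, dtype=int)
    for lvl in range(1, max_level + 1):
        # steeper cells pass higher thresholds and end at higher levels
        levels[grad > grad_tol * 2.0 ** (lvl - 1)] = lvl
    return levels
```

Cells whose level changed between time steps would then be subdivided or coarsened, and the neighbor tables updated, as the abstract describes.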
Adaptive finite element simulation of flow and transport applications on parallel computers
Kirk, Benjamin Shelton
The subject of this work is the adaptive finite element simulation of problems arising in flow and transport applications on parallel computers. Of particular interest are new contributions to adaptive mesh refinement (AMR) in this parallel high-performance context, including novel work on data structures, treatment of constraints in a parallel setting, generality and extensibility via object-oriented programming, and the design and implementation of a flexible software framework. This technology and software capability then enables more robust, reliable treatment of multiscale-multiphysics problems and specific studies of fine-scale interaction, such as those in biological chemotaxis (Chapter 4) and high-speed shock physics for compressible flows (Chapter 5). The work begins by presenting an overview of key concepts and data structures employed in AMR simulations. Of particular interest is how these concepts are applied in the physics-independent software framework which is developed here and is the basis for all the numerical simulations performed in this work. This open-source software framework has been adopted by a number of researchers in the U.S. and abroad for use in a wide range of applications. The dynamic nature of adaptive simulations poses particular issues for efficient implementation on distributed-memory parallel architectures. Communication cost, computational load balance, and memory requirements must all be considered when developing adaptive software for this class of machines. Specific extensions to the adaptive data structures to enable implementation on parallel computers are therefore considered in detail. The libMesh framework for performing adaptive finite element simulations on parallel computers is developed to provide a concrete implementation of the above ideas. This physics-independent framework is applied to two distinct flow and transport application classes in the subsequent application studies to illustrate the flexibility of the
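The parent/child relationships at the heart of AMR data structures can be sketched as a tree of cells whose leaves carry the active solution. This is a minimal illustration of the concept; libMesh's actual element hierarchy is written in C++ and handles constraints, ghost elements, and parallel partitioning that this toy omits.

```python
class Cell:
    """Minimal quadtree cell: each refinement splits a cell into four
    children at half the size and one level deeper."""

    def __init__(self, x0, y0, size, level=0, parent=None):
        self.x0, self.y0, self.size = x0, y0, size
        self.level, self.parent = level, parent
        self.children = []

    def refine(self):
        half = self.size / 2.0
        for dx in (0.0, half):
            for dy in (0.0, half):
                self.children.append(
                    Cell(self.x0 + dx, self.y0 + dy, half,
                         self.level + 1, self))

    def active_cells(self):
        # leaves of the tree are the "active" cells carrying the solution
        if not self.children:
            return [self]
        return [leaf for c in self.children for leaf in c.active_cells()]
```

On a distributed-memory machine, the active leaves would be partitioned across processes, which is where the load-balance and communication-cost issues discussed above arise.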
Modeling and simulating the adaptive electrical properties of stochastic polymeric 3D networks
Memristors are passive two-terminal circuit elements that combine resistance and memory. Although memristors are in theory a very promising approach to fabricating hardware with adaptive properties, only very few implementations are able to show their basic properties. We recently developed stochastic polymeric matrices with a functionality that evidences the formation of self-assembled three-dimensional (3D) networks of memristors. We demonstrated that those networks show the typical hysteretic behavior observed in the ‘one input-one output’ memristive configuration. Interestingly, using different protocols to electrically stimulate the networks, we also observed that their adaptive properties are similar to those present in the nervous system. Here, we model and simulate the electrical properties of these self-assembled polymeric networks of memristors, the topology of which is defined stochastically. First, we show that the model recreates the hysteretic behavior observed in the real experiments. Second, we demonstrate that the modeled networks indeed have a 3D rather than a planar functionality. Finally, we show that the adaptive properties of the networks depend on their connectivity pattern. Our model was able to replicate the fundamental qualitative behavior of the real organic 3D memristor networks; through the simulations, we also explored other interesting properties, such as the relation between connectivity patterns and adaptive properties. Our model and simulations represent an interesting tool for understanding the very complex behavior of self-assembled memristor networks, which can finally help to predict and formulate hypotheses for future experiments.
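The hysteretic single-device behavior the networks are built from can be illustrated with the standard linear-dopant-drift memristor model (the HP model), which is not necessarily the model used in the paper; all parameter values below are generic illustrations.

```python
import math

def simulate_memristor(r_on=100.0, r_off=16e3, d=10e-9, mu=1e-14,
                       v_amp=1.0, freq=1.0, steps=2000):
    """One sinusoidal drive period across a linear-drift memristor.

    The internal state w (doped-region width) moves with the current,
    so the resistance depends on the charge history: the signature
    pinched hysteresis loop in the (v, i) plane.
    Returns a list of (voltage, current) samples.
    """
    dt = 1.0 / (freq * steps)
    w = 0.1 * d                          # initial doped-region width
    i_v = []
    for n in range(steps):
        v = v_amp * math.sin(2 * math.pi * freq * n * dt)
        r = r_on * (w / d) + r_off * (1.0 - w / d)   # state-dependent R
        i = v / r
        w += mu * r_on / d * i * dt      # linear dopant-drift update
        w = min(max(w, 0.0), d)          # clamp to physical bounds
        i_v.append((v, i))
    return i_v
```

In a network simulation, many such elements would be wired into the stochastic 3D topology and solved together with Kirchhoff's laws.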
Childers, J T; LeCompte, T J; Papka, M E; Benjamin, D P
2015-01-01
As the LHC moves to higher energies and luminosity, the demand for computing resources increases accordingly and will soon outpace the growth of the Worldwide LHC Computing Grid. To meet this greater demand, Monte Carlo event generation was targeted for adaptation to run on Mira, the supercomputer at the Argonne Leadership Computing Facility. Alpgen is a Monte Carlo event generation application used by LHC experiments in the simulation of collisions that take place in the Large Hadron Collider. This paper details the process by which Alpgen was adapted from a single-processor serial application to a large-scale parallel application and the performance that was achieved.
Pawlik, Andreas H; Vecchia, Claudio Dalla
2015-01-01
We present a suite of cosmological radiation-hydrodynamical simulations of the assembly of galaxies driving the reionization of the intergalactic medium (IGM) at z >~ 6. The simulations account for the hydrodynamical feedback from photoionization heating and the explosion of massive stars as supernovae (SNe). Our reference simulation, which was carried out in a box of size 25 comoving Mpc/h using 2 x 512^3 particles, produces a reasonable reionization history and matches the observed UV luminosity function of galaxies. Simulations with different box sizes and resolutions are used to investigate numerical convergence, and simulations in which either SNe or photoionization heating or both are turned off, are used to investigate the role of feedback from star formation. Ionizing radiation is treated using accurate radiative transfer at the high spatially adaptive resolution at which the hydrodynamics is carried out. SN feedback strongly reduces the star formation rates (SFRs) over nearly the full mass range of s...
Robert Bauer
2015-02-01
Restorative brain-computer interfaces (BCIs) are increasingly used to provide feedback of neuronal states in a bid to normalize pathological brain activity and achieve behavioral gains. However, patients and healthy subjects alike often show large variability in, or even inability of, brain self-regulation for BCI control, known as BCI illiteracy. Although current co-adaptive algorithms are powerful for assistive BCIs, their inherent class switching clashes with the operant-conditioning goal of restorative BCIs. Moreover, due to the treatment rationale, the classifier of a restorative BCI usually has a constrained feature space, thus limiting the possibility of classifier adaptation. In this context, we applied a Bayesian model of neurofeedback and reinforcement learning for different threshold-selection strategies to study the impact of threshold adaptation of a linear classifier on optimizing restorative BCIs. For each feedback iteration, we first determined the thresholds that result in minimal action entropy and maximal instructional efficiency. We then used the resulting vector for the simulation of continuous threshold adaptation. We could thus show that threshold adaptation can improve reinforcement learning, particularly in cases of BCI illiteracy. Finally, on the basis of information theory, we provided an explanation for the achieved benefits of adaptive threshold setting.
Bargatze, L. F.
2015-12-01
Active Data Archive Product Tracking (ADAPT) is a collection of software routines that permits one to generate XML metadata files to describe and register data products in support of the NASA Heliophysics Virtual Observatory VxO effort. ADAPT is also a philosophy. The ADAPT concept is to use any and all available metadata associated with scientific data to produce XML metadata descriptions in a consistent, uniform, and organized fashion to provide blanket access to the full complement of data stored on a targeted data server. In this poster, we present an application of ADAPT to describe all of the data products that are stored using the Common Data File (CDF) format and served out by the CDAWEB and SPDF data servers hosted at the NASA Goddard Space Flight Center. These data servers are the primary repositories for NASA Heliophysics data. For this purpose, the ADAPT routines have been used to generate data resource descriptions using an XML schema named Space Physics Archive, Search, and Extract (SPASE). SPASE is the designated standard for documenting Heliophysics data products, as adopted by the Heliophysics Data and Model Consortium. The set of SPASE XML resource descriptions produced by ADAPT includes high-level descriptions of numerical data products, display data products, and catalogs, and also includes low-level "Granule" descriptions. A SPASE Granule is effectively a universal access metadata resource; a Granule associates an individual data file (e.g. a CDF file) with a "parent" high-level data resource description, assigns a resource identifier to the file, and lists the corresponding access URL(s). The CDAWEB and SPDF file systems were queried to provide the input required by the ADAPT software to create an initial set of SPASE metadata resource descriptions. Then, the CDAWEB and SPDF data repositories were subsequently queried on a nightly basis and the CDF file lists were checked for any changes such as the occurrence of new, modified, or deleted
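A Granule record of the kind described above can be sketched as a small XML builder. This is a loose illustration: the element names only echo the SPASE vocabulary, and a real ADAPT-generated record carries many more required fields (release date, start/stop times, checksums, and so on).

```python
import xml.etree.ElementTree as ET

def granule_xml(resource_id, parent_id, url):
    """Build a minimal SPASE-like Granule description for one data file.

    resource_id : identifier assigned to this file (e.g. a spase:// ID).
    parent_id   : ID of the high-level data resource this file belongs to.
    url         : access URL of the individual data file (e.g. a CDF).
    """
    spase = ET.Element("Spase")
    granule = ET.SubElement(spase, "Granule")
    ET.SubElement(granule, "ResourceID").text = resource_id
    ET.SubElement(granule, "ParentID").text = parent_id
    source = ET.SubElement(granule, "Source")
    ET.SubElement(source, "URL").text = url
    return ET.tostring(spase, encoding="unicode")
```

The nightly update pass described in the abstract would then regenerate or delete such records as the underlying CDF file lists change.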