Enhanced Simulated Annealing for Solving Aggregate Production Planning
Directory of Open Access Journals (Sweden)
Mohd Rizam Abu Bakar
2016-01-01
Full Text Available Simulated annealing (SA) has been an effective means of addressing difficulties related to optimisation problems, and it is now a common research area with several productive applications, such as production planning. Because aggregate production planning (APP) is one of the most considerable problems in production planning, in this paper we present a multiobjective linear programming model for APP and optimise it with SA. In the course of optimising the APP problem, we found that the capability of SA was inadequate and its performance substandard, particularly for a sizable constrained APP problem with many decision variables and plenty of constraints. Since the algorithm works sequentially, the current state generates only one next state, which makes the search slower, and the search may become trapped in a local minimum that is the best solution in only part of the solution space. To enhance its performance and alleviate these deficiencies, a modified SA (MSA) is proposed. We augment the search space by starting with N+1 solutions instead of one. To analyse and compare the operation of the MSA against the standard SA and harmony search (HS), evaluations are made on the real performance of an industrial company and on simulations. The results show that, compared to SA and HS, MSA offers better-quality solutions with regard to convergence and accuracy.
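The paper's idea of widening the search by annealing N+1 starting solutions instead of one can be sketched as a minimal multi-start anneal (the toy objective and move operator are illustrative assumptions, not the APP model):

```python
import math
import random

def multi_start_sa(objective, starts, neighbor, t0=1.0, alpha=0.95,
                   sweeps=500, seed=0):
    """Minimal multi-start simulated annealing: anneal every starting
    solution in parallel and return the best state seen anywhere."""
    rng = random.Random(seed)
    states = list(starts)
    energies = [objective(s) for s in states]
    best_state, best_energy = min(zip(states, energies), key=lambda p: p[1])
    t = t0
    for _ in range(sweeps):
        for i, s in enumerate(states):
            cand = neighbor(s, rng)
            e_cand = objective(cand)
            # Metropolis rule: accept improvements always, and uphill moves
            # with probability exp(-dE/T) to escape local minima.
            if e_cand <= energies[i] or rng.random() < math.exp((energies[i] - e_cand) / t):
                states[i], energies[i] = cand, e_cand
                if e_cand < best_energy:
                    best_state, best_energy = cand, e_cand
        t *= alpha  # geometric cooling
    return best_state, best_energy

# Toy multimodal objective; global minima lie near x = +/-2.
f = lambda x: (x * x - 4) ** 2 + 0.3 * math.sin(10 * x)
starts = [random.Random(k).uniform(-5, 5) for k in range(4)]  # N+1 = 4 starts
move = lambda x, rng: x + rng.gauss(0, 0.3)
x_best, f_best = multi_start_sa(f, starts, move)
```

Because the chains share one cooling schedule but explore independently, a single chain stuck in a poor basin no longer dooms the whole run.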
Temporary Workforce Planning with Firm Contracts: A Model and a Simulated Annealing Heuristic
Directory of Open Access Journals (Sweden)
Muhammad Al-Salamah
2011-01-01
Full Text Available The aim of this paper is to introduce a model for temporary staffing when temporary employment is managed by firm contracts and to propose a simulated annealing-based method to solve the model. Temporary employment is a policy frequently used to adjust the working hour capacity to fluctuating demand. Temporary workforce planning models have been unnecessarily simplified to account for only periodic hiring and laying off; a company can review its workforce requirement every period and make hire-fire decisions accordingly, usually with a layoff cost. We present a more realistic temporary workforce planning model that assumes a firm contract between the worker and the company, which can extend to several periods. The model assumes the traditional constraints, such as inventory balance constraints, worker availability, and labor hour mix. The costs are the inventory holding cost, training cost of the temporary workers, and the backorder cost. The mixed integer model developed for this case has been found to be difficult to solve even for small problem sizes; therefore, a simulated annealing algorithm is proposed to solve the mixed integer model. The performance of the SA algorithm is compared with the CPLEX solution.
Application of simulated annealing to solve multi-objectives for aggregate production planning
Atiya, Bayda; Bakheet, Abdul Jabbar Khudhur; Abbas, Iraq Tereq; Bakar, Mohd. Rizam Abu; Soon, Lee Lai; Monsi, Mansor Bin
2016-06-01
Aggregate production planning (APP) is one of the most significant and complicated problems in production planning. It aims to set overall production levels for each product category to meet fluctuating or uncertain future demand, and to make decisions concerning hiring, firing, overtime, subcontracting, and inventory levels. In this paper, we present a simulated annealing (SA) approach for a multi-objective linear programming model of APP. SA is considered a good tool for imprecise optimization problems. The proposed model minimizes total production and workforce costs. In this study, the proposed SA is compared with particle swarm optimization (PSO). The results show that the proposed SA is effective in reducing total production costs and requires minimal time.
Using genetic/simulated annealing algorithm to solve disassembly sequence planning
Institute of Scientific and Technical Information of China (English)
Wu Hao; Zuo Hongfu
2009-01-01
disassembly sequence. The solution methodology based on the genetic/simulated annealing algorithm with a binary-tree algorithm is given. Finally, an example is analyzed in detail, and the result shows that the model is correct and efficient.
multicast using Simulated Annealing
Directory of Open Access Journals (Sweden)
Yezid Donoso
2005-01-01
Full Text Available This article presents a multiobjective optimisation method for solving the load-balancing problem in multicast transmission networks, based on the Simulated Annealing meta-heuristic. The method minimises four basic parameters to guarantee quality of service in multicast transmissions: origin-destination delay, maximum link utilisation, consumed bandwidth, and hop count. The results returned by the heuristic are compared with the results produced by the mathematical model proposed in previous research.
Recursive simulation of quantum annealing
Sowa, A P; Samson, J H; Savel'ev, S E; Zagoskin, A M; Heidel, S; Zúñiga-Anaya, J C
2015-01-01
The evaluation of the performance of adiabatic annealers is hindered by the lack of efficient algorithms for simulating their behaviour. We exploit the analyticity of the standard model of the adiabatic quantum process to develop an efficient recursive method for its numerical simulation in the case of both unitary and non-unitary evolution. Numerical simulations show distinctly different distributions for the most important figure of merit of adiabatic quantum computing, the success probability, in these two cases.
Feasibility of Simulated Annealing Tomography
Vo, Nghia T; Moser, Herbert O
2014-01-01
Simulated annealing tomography (SAT) is a simple iterative image reconstruction technique which can yield a superior reconstruction compared with filtered back-projection (FBP). However, the very high computational cost of iteratively calculating discrete Radon transform (DRT) has limited the feasibility of this technique. In this paper, we propose an approach based on the pre-calculated intersection lengths array (PILA) which helps to remove the step of computing DRT in the simulated annealing procedure and speed up SAT by over 300 times. The enhancement of convergence speed of the reconstruction process using the best of multiple-estimate (BoME) strategy is introduced. The performance of SAT under different conditions and in comparison with other methods is demonstrated by numerical experiments.
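The speed-up idea is that, with a pre-calculated intersection-length array, the forward projection can be updated incrementally after each pixel flip instead of recomputing the full discrete Radon transform. A toy sketch (the 4x4 instance, rays, and parameters are invented for illustration):

```python
import math
import random

# Toy 4x4 binary image with row and column "projections". rays[i][j] plays
# the role of the pre-calculated intersection-length array (PILA): the
# length of ray i inside pixel j (here simply 1.0 when the ray crosses it).
N = 4
true_img = [1, 0, 0, 1, 0, 1, 1, 0, 0, 1, 1, 0, 1, 0, 0, 1]
rays = [[1.0 if j // N == r else 0.0 for j in range(N * N)] for r in range(N)]
rays += [[1.0 if j % N == c else 0.0 for j in range(N * N)] for c in range(N)]
target = [sum(a * x for a, x in zip(row, true_img)) for row in rays]

rng = random.Random(1)
img = [0] * (N * N)
sino = [0.0] * len(rays)          # current forward projection of img
cost = sum((s - t) ** 2 for s, t in zip(sino, target))
temp = 2.0
for _ in range(4000):
    j = rng.randrange(N * N)
    delta = 1 - 2 * img[j]        # effect of flipping pixel j (0->1 or 1->0)
    # Incremental update: only rays that cross pixel j change, so the new
    # sinogram costs O(#rays) instead of a full discrete Radon transform.
    new_sino = [s + delta * row[j] for s, row in zip(sino, rays)]
    new_cost = sum((s - t) ** 2 for s, t in zip(new_sino, target))
    if new_cost <= cost or rng.random() < math.exp((cost - new_cost) / temp):
        img[j] ^= 1
        sino, cost = new_sino, new_cost
    temp *= 0.999
```

In a real SAT setting the intersection lengths are geometric line-pixel overlaps precomputed once per scan geometry; the incremental bookkeeping is what removes the per-iteration DRT cost.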
Residual entropy and simulated annealing
Ettelaie, R.; Moore, M. A.
1985-01-01
Determining the residual entropy in the simulated annealing approach to optimization is shown to provide useful information on the true ground state energy. The one-dimensional Ising spin glass is studied to exemplify the procedure and in this case the residual entropy is related to the number of one-spin flip stable metastable states. The residual entropy decreases to zero only logarithmically slowly with the inverse cooling rate.
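The residual-entropy determination described above follows from thermodynamic integration of the specific heat measured during the anneal; in standard notation (the symbols here are ours, not the paper's):

```latex
S_{\mathrm{res}} \;=\; S(T \to \infty) \;-\; \int_{0}^{\infty} \frac{C(T)}{T}\,\mathrm{d}T,
\qquad S(T \to \infty) = N k_{B} \ln 2 \quad \text{for } N \text{ Ising spins.}
```

A nonzero residual entropy at the end of a finite-rate anneal thus measures the weight of metastable states reached instead of the true ground state, consistent with the abstract's result that it vanishes only logarithmically slowly with the inverse cooling rate.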
Simulated annealing model of acupuncture
Shang, Charles; Szu, Harold
2015-05-01
The growth control singularity model suggests that acupuncture points (acupoints) originate from organizers in embryogenesis. Organizers are singular points in growth control. Acupuncture can perturb a system with effects similar to simulated annealing. In a clinical trial, the goal of a treatment is to relieve a certain disorder, which corresponds to reaching a certain local optimum in simulated annealing. The self-organizing effect of the system is limited and related to the person's general health and age. Perturbation at acupoints can lead to a stronger local excitation (analogous to a higher annealing temperature) than perturbation at non-singular points (placebo control points). This difference diminishes as the number of perturbed points increases, due to the wider distribution of the limited self-organizing activity. The model explains the following facts from systematic reviews of acupuncture trials: 1. A properly chosen single-acupoint treatment for a certain disorder can have highly repeatable efficacy above placebo. 2. When multiple acupoints are used, the result can be highly repeatable if the patients are relatively healthy and young, but is usually mixed if the patients are old, frail, and have multiple concurrent disorders, as the number of local optima or comorbidities increases. 3. As the number of acupoints used increases, the efficacy difference between sham and real acupuncture often diminishes. The model predicts that the efficacy of acupuncture is negatively correlated with disease chronicity, severity, and patient age. This is the first biological-physical model of acupuncture that can predict and guide clinical acupuncture research.
Keystream Generator Based On Simulated Annealing
Directory of Open Access Journals (Sweden)
Ayad A. Abdulsalam
2011-01-01
Full Text Available Advances in the design of keystream generators using heuristic techniques are reported. A simulated annealing algorithm for generating random keystreams with large complexity is presented, and the simulated annealing technique is adapted to meet these requirements. The definitions of some cryptographic properties are generalized, providing a measure suitable for use as an objective function in a simulated annealing algorithm that seeks keystreams satisfying both correlation immunity and large linear complexity. Results are presented demonstrating the effectiveness of the method.
Morton, Gerard C; Sankreacha, Raxa; Halina, Patrick; Loblaw, Andrew
2008-01-01
Dose distribution in a high-dose-rate (HDR) brachytherapy implant is optimized by adjusting source dwell positions and dwell times along the implanted catheters. Inverse planning with fast simulated annealing (IPSA) is a recently developed algorithm for anatomy-based inverse planning, capable of generating an optimized plan in less than 1 min. The purpose of this study is to compare dose distributions achieved using IPSA to those obtained with a graphical optimization (GrO) algorithm for prostate HDR brachytherapy. This is a retrospective study of 63 consecutive prostate HDR brachytherapy implants planned and treated using on-screen GrO to a dose of 10 Gy per implant. All plans were then recalculated using IPSA, without changing any parameters (contours, catheters, number or location of dwell positions). The IPSA and GrO plans were compared with respect to target coverage, conformality, dose homogeneity, and normal tissue dose. The mean volume of target treated to 100% of the prescription dose (V(100)) was 97.1% and 96.7%, and the mean conformal index 0.71 and 0.68, with GrO and IPSA, respectively. IPSA plans had a higher mean homogeneity index (0.69 vs. 0.63, p<0.001) and lower volume of target receiving 150% (30.2% vs. 35.6%, p<0.001) and 200% (10.7% vs. 12.7%, p<0.001) of the prescription dose. Mean doses to urethra, rectum, and bladder were all significantly lower with IPSA (p<0.001). IPSA plans tended to be more reproducible, with smaller standard deviations for all measured parameters. Plans generated using IPSA provide similar target coverage to those obtained using GrO, but with lower dose to normal structures and greater dose homogeneity.
Cylinder packing by simulated annealing
Directory of Open Access Journals (Sweden)
M. Helena Correia
2000-12-01
Full Text Available This paper is motivated by the problem of loading identical items with a circular base (tubes, rolls, ...) onto a rectangular base (the pallet). For practical reasons, all the loaded items are considered to have the same height. Solving this problem consists in determining the positioning pattern of the circular bases of the items on the rectangular pallet while maximizing the number of items. This pattern is repeated for each layer stacked on the pallet. Two algorithms based on the Simulated Annealing meta-heuristic have been developed and implemented. Tuning the parameters of these algorithms required intensive tests in order to improve their efficiency. The algorithms were easily extended to the case of non-identical circles.
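The annealing loop for such a packing problem can be sketched with a penalty objective over circle centres (the pallet dimensions, move size, and schedule below are illustrative assumptions, not the paper's parameters):

```python
import math
import random

W, H, R, NC = 6.0, 6.0, 1.0, 4    # pallet width/height, circle radius, count

def penalty(centers):
    """Total constraint violation: pairwise overlap plus boundary escape."""
    p = 0.0
    for i, (x, y) in enumerate(centers):
        p += max(0.0, R - x) + max(0.0, x - (W - R))
        p += max(0.0, R - y) + max(0.0, y - (H - R))
        for j in range(i + 1, len(centers)):
            p += max(0.0, 2 * R - math.dist(centers[i], centers[j]))
    return p

rng = random.Random(7)
centers = [(rng.uniform(R, W - R), rng.uniform(R, H - R)) for _ in range(NC)]
cost, t = penalty(centers), 1.0
for _ in range(20000):
    i = rng.randrange(NC)
    x, y = centers[i]
    cand = list(centers)
    cand[i] = (x + rng.gauss(0, 0.2), y + rng.gauss(0, 0.2))  # jiggle a circle
    c = penalty(cand)
    if c <= cost or rng.random() < math.exp((cost - c) / t):
        centers, cost = cand, c
    t *= 0.9995                   # geometric cooling
```

Maximising the item count would wrap this loop in an outer search that tries to insert one more circle and re-anneals until the penalty can no longer be driven to zero; extending to non-identical radii only changes the `2 * R` overlap term to `R[i] + R[j]`.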
Quantum Adiabatic Evolution Algorithms versus Simulated Annealing
Farhi, E; Gutmann, S; Farhi, Edward; Goldstone, Jeffrey; Gutmann, Sam
2002-01-01
We explain why quantum adiabatic evolution and simulated annealing perform similarly in certain examples of searching for the minimum of a cost function of n bits. In these examples each bit is treated symmetrically so the cost function depends only on the Hamming weight of the n bits. We also give two examples, closely related to these, where the similarity breaks down in that the quantum adiabatic algorithm succeeds in polynomial time whereas simulated annealing requires exponential time.
Stochastic annealing simulation of cascades in metals
Energy Technology Data Exchange (ETDEWEB)
Heinisch, H.L.
1996-04-01
The stochastic annealing simulation code ALSOME is used to investigate quantitatively the differential production of mobile vacancy and SIA defects as a function of temperature for isolated 25 keV cascades in copper generated by MD simulations. The ALSOME code and cascade annealing simulations are described. The annealing simulations indicate that above Stage V, where the cascade vacancy clusters are unstable, nearly 80% of the post-quench vacancies escape the cascade volume, while about half of the post-quench SIAs remain in clusters. The results are sensitive to the relative fractions of SIAs that occur in small, highly mobile clusters and large stable clusters, respectively, which may be dependent on the cascade energy.
Ghaderi, F.; Pahlavani, P.
2015-12-01
A multimodal multi-criteria route planning (MMRP) system provides an optimal multimodal route from an origin point to a destination point considering two or more criteria, where the route can combine public and private transportation modes. In this paper, simulated annealing (SA) and the fuzzy analytical hierarchy process (fuzzy AHP) were combined in order to find this route. Firstly, the criteria that are significant for users on their trip were determined. Then the weight of each criterion was calculated using the fuzzy AHP weighting method. The most important characteristic of this weighting method is its use of fuzzy numbers, which helps users express their uncertainty in pairwise comparisons of criteria. After determining the criteria weights, the proposed SA algorithm was used to determine an optimal route from an origin to a destination, addressing one of the most important problems of meta-heuristic algorithms: becoming trapped in local minima. In this study, five transportation modes, including subway, bus rapid transit (BRT), taxi, walking, and bus, were considered for moving between nodes. The fare, the time, the user's inconvenience, and the length of the path were considered as the criteria for solving the problem. The proposed model was implemented for an area in the centre of Tehran with a MATLAB GUI. The results showed the high efficiency and speed of the proposed algorithm, supporting our analyses.
An Application of Simulated Annealing to Scheduling Army Unit Training
1986-10-01
Simulated annealing operates by analogy to the metallurgical process that strengthens metals through successive heating and cooling. The method is highly... diminishing returns is observed. The simulated annealing heuristic operates by analogy to annealing in physical systems. Annealing in a physical
A simulated annealing technique for multi-objective simulation optimization
Mahmoud H. Alrefaei; Diabat, Ali H.
2009-01-01
In this paper, we present a simulated annealing algorithm for solving multi-objective simulation optimization problems. The algorithm is based on the idea of simulated annealing with constant temperature, and uses a rule for accepting a candidate solution that depends on the individual estimated objective function values. The algorithm is shown to converge almost surely to an optimal solution. It is applied to a multi-objective inventory problem; the numerical results show that the algorithm ...
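A minimal sketch of a constant-temperature acceptance rule driven by estimated objective values (the toy designs, noise model, and parameters below are illustrative assumptions, not the paper's inventory problem):

```python
import math
import random

# Five candidate designs; the true mean costs are unknown to the algorithm
# and can only be *estimated* through noisy simulation runs.
true_means = [3.0, 2.2, 1.0, 2.8, 1.9]

def simulate(i, rng, n=30):
    """Average of n noisy observations of design i (an estimated objective)."""
    return sum(true_means[i] + rng.gauss(0, 1.0) for _ in range(n)) / n

rng = random.Random(3)
current = 0
T = 0.3                                  # temperature held constant throughout
counts = [0] * len(true_means)
for _ in range(2000):
    cand = rng.randrange(len(true_means))
    # Acceptance compares *estimated* objective values, in the spirit of
    # the constant-temperature rule described above.
    d = simulate(cand, rng) - simulate(current, rng)
    if d <= 0 or rng.random() < math.exp(-d / T):
        current = cand
    counts[current] += 1
best = counts.index(max(counts))         # most-visited design is the estimate
```

With a fixed temperature the chain has a stationary distribution concentrated on low-cost designs, so the most-visited state serves as the optimum estimate even though every comparison is noisy.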
Simulated annealing algorithm for optimal capital growth
Luo, Yong; Zhu, Bo; Tang, Yong
2014-08-01
We investigate the problem of dynamic optimal capital growth of a portfolio. A general framework was developed in which one strives to maximize the expected logarithmic utility of the long-term growth rate. Exact optimization algorithms run into difficulties in this framework, which motivates applying a simulated annealing algorithm to optimize the capital growth of a given portfolio. Empirical results with real financial data indicate that the approach is promising for capital growth portfolios.
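The log-growth objective and an annealing loop over simplex-constrained weights can be sketched as follows (the scenario returns and parameters are invented for illustration, not real financial data):

```python
import math
import random

# Toy "historical" scenario returns for 3 assets (rows = scenarios).
scenarios = [
    [0.05, 0.01, -0.02],
    [-0.03, 0.01, 0.06],
    [0.10, 0.01, -0.05],
    [-0.06, 0.01, 0.08],
]

def growth(w):
    """Expected log growth rate of portfolio weights w over the scenarios."""
    return sum(math.log(1.0 + sum(wi * ri for wi, ri in zip(w, row)))
               for row in scenarios) / len(scenarios)

def project(w):
    """Clip to non-negative weights and renormalise onto the simplex."""
    w = [max(0.0, x) for x in w]
    s = sum(w) or 1.0
    return [x / s for x in w]

rng = random.Random(11)
w = [1 / 3] * 3
best_w, best_g = w, growth(w)
t = 0.1
for _ in range(5000):
    cand = project([x + rng.gauss(0, 0.05) for x in w])
    d = growth(cand) - growth(w)      # maximising, so the sign is flipped
    if d >= 0 or rng.random() < math.exp(d / t):
        w = cand
        if growth(w) > best_g:
            best_w, best_g = w, growth(w)
    t *= 0.999
```

Maximising mean log wealth is the Kelly criterion; the projection step keeps every candidate a valid long-only portfolio without needing a constrained solver.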
Binary Sparse Phase Retrieval via Simulated Annealing
Directory of Open Access Journals (Sweden)
Wei Peng
2016-01-01
Full Text Available This paper presents the Simulated Annealing Sparse PhAse Recovery (SASPAR) algorithm for reconstructing sparse binary signals from the phaseless magnitudes of their Fourier transforms. A greedy-strategy version, which is parameter-free, is also proposed for comparison. Extensive numerical simulations indicate that the method is quite effective and suggest the binary model is robust. The SASPAR algorithm is competitive with existing methods in efficiency and recovery rate, even with fewer Fourier measurements.
Comparative study of the performance of quantum annealing and simulated annealing.
Nishimori, Hidetoshi; Tsuda, Junichi; Knysh, Sergey
2015-01-01
Relations between simulated annealing and quantum annealing are studied by a mapping from the transition matrix of classical Markovian dynamics of the Ising model to a quantum Hamiltonian and vice versa. It is shown that these two operators, the transition matrix and the Hamiltonian, share the eigenvalue spectrum. Thus, if simulated annealing with slow temperature change does not encounter a difficulty caused by an exponentially long relaxation time at a first-order phase transition, the same is true for the corresponding process of quantum annealing in the adiabatic limit. One of the important differences between the classical-to-quantum mapping and the converse quantum-to-classical mapping is that the Markovian dynamics of a short-range Ising model is mapped to a short-range quantum system, but the converse mapping from a short-range quantum system to a classical one results in long-range interactions. This leads to a difference in efficiency: simulated annealing can be efficiently simulated by quantum annealing, but the converse is not necessarily true. We conclude that quantum annealing is easier to implement and is more flexible than simulated annealing. We also point out that the present mapping can be extended to accommodate explicit time dependence of temperature, which is used to justify the quantum-mechanical analysis of simulated annealing by Somma, Batista, and Ortiz. Additionally, an alternative method to solve the nonequilibrium dynamics of the one-dimensional Ising model is provided through the classical-to-quantum mapping.
MEDICAL STAFF SCHEDULING USING SIMULATED ANNEALING
Directory of Open Access Journals (Sweden)
Ladislav Rosocha
2015-07-01
Full Text Available Purpose: The efficiency of medical staff is a fundamental feature of healthcare facility quality. Better implementation of their preferences into the scheduling problem might therefore not only improve the work-life balance of doctors and nurses but may also result in better patient care. This paper focuses on optimization of medical staff preferences in the scheduling problem. Methodology/Approach: We propose a medical staff scheduling algorithm based on simulated annealing, a well-known method from statistical thermodynamics. We define hard constraints, which are linked to legal and working regulations, and minimize violations of soft constraints, which relate to the quality of work, psychological well-being, and work-life balance of staff. Findings: On a sample of 60 physicians and nurses from a gynecology department, we generated monthly schedules and optimized their preferences in terms of soft constraints. Our results indicate that the final value of the objective function optimized by the proposed algorithm has more than 18 times fewer soft-constraint violations than the initially generated random schedule that satisfied the hard constraints. Research Limitation/Implication: Even though the global optimality of the final outcome is not guaranteed, a desirable solution was obtained in reasonable time. Originality/Value of paper: We show that the designed algorithm is able to successfully generate schedules regarding hard and soft constraints. Moreover, the presented method is significantly faster than standard schedule generation and is able to reschedule effectively thanks to the local neighborhood search characteristics of simulated annealing.
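The hard/soft-constraint scheme can be sketched on a toy roster: the coverage (hard) constraint is preserved by the construction of the move, while the anneal minimises soft preference violations (staff size, preferences, and schedule length below are invented for illustration):

```python
import math
import random

STAFF, DAYS, COVER = 6, 7, 2             # 6 workers, 7 days, 2 on duty per day
rng = random.Random(5)
# Soft preference: pref[s][d] is True if worker s would rather not work day d.
pref = [[rng.random() < 0.25 for _ in range(DAYS)] for _ in range(STAFF)]

def soft_cost(sched):
    """Count assigned shifts that clash with worker preferences."""
    return sum(pref[s][d] for d in range(DAYS) for s in sched[d])

# Initial schedule satisfies the hard coverage constraint by construction.
sched = [rng.sample(range(STAFF), COVER) for _ in range(DAYS)]
cost, t = soft_cost(sched), 1.0
for _ in range(3000):
    d = rng.randrange(DAYS)
    off_duty = [s for s in range(STAFF) if s not in sched[d]]
    i, new = rng.randrange(COVER), rng.choice(off_duty)
    cand = [row[:] for row in sched]
    cand[d][i] = new                     # swap keeps per-day coverage intact
    c = soft_cost(cand)
    if c <= cost or rng.random() < math.exp((cost - c) / t):
        sched, cost = cand, c
    t *= 0.998
```

Restricting the move set to swaps that cannot break the hard constraints is a common trick: the anneal then only ever searches the feasible region, and the objective contains soft penalties alone.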
A Parallel Genetic Simulated Annealing Hybrid Algorithm for Task Scheduling
Institute of Scientific and Technical Information of China (English)
SHU Wanneng; ZHENG Shijue
2006-01-01
In this paper, combining the advantages of genetic algorithms and simulated annealing, we put forward a parallel genetic simulated annealing hybrid algorithm (PGSAHA) and apply it to solve the task scheduling problem in grid computing. It first generates a new group of individuals through genetic operations such as reproduction, crossover, and mutation, and then simulated-anneals all the generated individuals independently. When the temperature in the cooling process no longer falls, the result is the overall optimal solution. Analysis and experimental results show that this algorithm is superior to both the genetic algorithm and simulated annealing alone.
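A minimal sketch of the generate-then-anneal loop on a toy task-scheduling objective (the instance, operators, and parameters are illustrative, not PGSAHA's grid-computing setup):

```python
import math
import random

# Toy task-scheduling instance: assign 8 tasks to 3 machines, minimising
# the makespan (load of the busiest machine).
times = [4, 7, 2, 9, 3, 6, 5, 8]
M = 3

def makespan(assign):
    loads = [0] * M
    for task_time, m in zip(times, assign):
        loads[m] += task_time
    return max(loads)

rng = random.Random(6)

def anneal(ind, t0=3.0, alpha=0.95, steps=200):
    """Simulated-annealing refinement of a single individual."""
    cost, t = makespan(ind), t0
    for _ in range(steps):
        cand = ind[:]
        cand[rng.randrange(len(cand))] = rng.randrange(M)  # move one task
        c = makespan(cand)
        if c <= cost or rng.random() < math.exp((cost - c) / t):
            ind, cost = cand, c
        t *= alpha
    return ind

pop = [[rng.randrange(M) for _ in times] for _ in range(10)]
for _ in range(5):
    pop.sort(key=makespan)
    parents = pop[:5]                         # truncation selection
    children = []
    for _ in range(5):                        # one-point crossover + mutation
        a, b = rng.sample(parents, 2)
        cut = rng.randrange(1, len(times))
        child = a[:cut] + b[cut:]
        if rng.random() < 0.2:
            child[rng.randrange(len(child))] = rng.randrange(M)
        children.append(child)
    # Simulated-annealing refinement of every individual, as in the hybrid.
    pop = [anneal(ind) for ind in parents + children]

best = min(makespan(ind) for ind in pop)
```

The GA layer recombines good global structure while the per-individual anneal polishes local assignments; in a parallel implementation each `anneal` call runs independently.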
Institute of Scientific and Technical Information of China (English)
宛剑业; 张飞超; 高丽媛; 刘卫博
2016-01-01
To study the layout planning of the electronic accelerator production workshop of the YY company, this paper applies both the traditional SLP method and a genetic simulated annealing hybrid algorithm, and uses the Proplanner software to run simulations of plans 1 and 2 obtained by the two methods. The simulation results show that plan 2 is clearly better than plan 1 in parts-handling distance, handling time, and handling cost, which indicates that the genetic simulated annealing hybrid algorithm is more feasible and rational than SLP for workshop layout planning.
Hierarchical Network Design Using Simulated Annealing
DEFF Research Database (Denmark)
Thomadsen, Tommy; Clausen, Jens
2002-01-01
The hierarchical network problem is the problem of finding the least-cost network, with nodes divided into groups, edges connecting nodes within each group, and groups ordered in a hierarchy. The idea of hierarchical networks comes from telecommunication networks, where hierarchies exist. Hierarchical networks are described, and a mathematical model is proposed for a two-level version of the hierarchical network problem. The problem is to determine which edges should connect nodes and how demand is routed in the network. The problem is solved heuristically using simulated annealing, which as a sub-algorithm uses a construction algorithm to determine edges and route the demand. Performance for different versions of the algorithm is reported in terms of runtime and quality of the solutions. The algorithm is able to find solutions of reasonable quality in approximately 1 hour for networks with 100 nodes.
Remote sensing of atmospheric duct parameters using simulated annealing
Institute of Scientific and Technical Information of China (English)
Zhao Xiao-Feng; Huang Si-Xun; Xiang Jie; Shi Wei-Lai
2011-01-01
Simulated annealing is one of the robust optimization schemes. It mimics the annealing process in which a heated metal is cooled slowly to reach a stable minimum-energy state. In this paper, we adopt simulated annealing to study the remote sensing of atmospheric duct parameters for two different propagation-measurement geometries: one from a single emitter to an array of radio receivers (vertical measurements), and the other from radar clutter returns (horizontal measurements). Basic principles of simulated annealing and its application to refractivity estimation are introduced. The performance of the method is validated using numerical experiments and field measurements collected at the East China Sea. The retrieved results demonstrate the feasibility of simulated annealing for near real-time atmospheric refractivity estimation. For comparison, retrievals from a genetic algorithm are also presented. The comparisons indicate that the convergence speed of simulated annealing is faster than that of the genetic algorithm, while the noise robustness of the genetic algorithm is better than that of simulated annealing.
Simulated annealing with probabilistic analysis for solving traveling salesman problems
Hong, Pei-Yee; Lim, Yai-Fung; Ramli, Razamin; Khalid, Ruzelan
2013-09-01
Simulated annealing (SA) is a widely used meta-heuristic inspired by the annealing process of recrystallization of metals; the efficiency of SA is therefore highly affected by the annealing schedule. In this paper, we present an empirical study to provide a comparable annealing schedule for solving symmetric traveling salesman problems (TSP). A randomized complete block design is used in this study. The results show that different parameters do affect the efficiency of SA, and we propose the best-found annealing schedule based on the post hoc test. SA was tested on seven selected benchmark instances of the symmetric TSP with the proposed annealing schedule. The performance of SA was evaluated empirically against benchmark solutions, with simple analysis to validate the quality of solutions. Computational results show that the proposed annealing schedule provides good-quality solutions.
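A geometric annealing schedule of the kind being tuned above can be sketched for a small symmetric TSP with 2-opt moves (the instance and parameter values are illustrative assumptions):

```python
import math
import random

# Small symmetric TSP instance: random points in the unit square.
rng = random.Random(42)
pts = [(rng.random(), rng.random()) for _ in range(12)]

def tour_len(tour):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

tour = list(range(len(pts)))
cost = tour_len(tour)
t0, alpha = 1.0, 0.995            # the annealing schedule under study
t = t0
for _ in range(20000):
    i, j = sorted(rng.sample(range(len(pts)), 2))
    cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # 2-opt reversal
    c = tour_len(cand)
    if c <= cost or rng.random() < math.exp((cost - c) / t):
        tour, cost = cand, c
    t *= alpha                    # geometric cooling
```

Schedule tuning of the kind the paper describes amounts to a designed experiment over `t0`, `alpha`, and the iteration budget, scoring each combination by the final tour length on benchmark instances.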
A NEW GENETIC SIMULATED ANNEALING ALGORITHM FOR FLOOD ROUTING MODEL
Institute of Scientific and Technical Information of China (English)
KANG Ling; WANG Cheng; JIANG Tie-bing
2004-01-01
In this paper, a new approach, genetic simulated annealing (GSA), is proposed for optimizing the parameters in the Muskingum routing model. By integrating the simulated annealing method into the genetic algorithm, the hybrid method avoids some troubles of traditional methods, such as the arduous trial-and-error procedure, premature convergence in the genetic algorithm, and search blindness in simulated annealing. The principle and implementation procedure of this algorithm are described. Numerical experiments show that the GSA can adjust the optimization population, prevent premature convergence, and seek the global optimal result. Applications to the Nanyunhe River and Qingjiang River show that the proposed approach has higher forecast accuracy and practicability.
Kriging-approximation simulated annealing algorithm for groundwater modeling
Shen, C. H.
2015-12-01
Optimization algorithms are often applied to search for the best parameters of complex groundwater models, but running such models to evaluate the objective function can be time-consuming. This research proposes a Kriging-approximation simulated annealing algorithm. Kriging is a spatial statistics method used to interpolate unknown variables from surrounding given data. In the algorithm, the Kriging method is used to approximate the complicated objective function and is incorporated into simulated annealing. The contribution of the Kriging-approximation simulated annealing algorithm is to reduce calculation time and increase efficiency.
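The surrogate-assisted idea can be sketched as follows; for brevity this stand-in uses inverse-distance interpolation in place of a proper variogram-based Kriging predictor, with occasional true-model runs refining the cache (all names and parameters are illustrative assumptions):

```python
import math
import random

def expensive(x):
    """Stand-in for a slow groundwater-model run (1-D toy objective)."""
    return (x - 1.3) ** 2 + 0.1 * math.cos(5 * x)

def surrogate(x, data):
    """Inverse-distance interpolation over cached runs: a deliberately
    simple stand-in for a Kriging estimator."""
    num = den = 0.0
    for xi, yi in data:
        d = abs(x - xi)
        if d < 1e-9:
            return yi             # exact interpolation at sampled points
        wgt = 1.0 / d ** 2
        num += wgt * yi
        den += wgt
    return num / den

rng = random.Random(2)
data = [(x0, expensive(x0)) for x0 in (-3.0, 0.0, 3.0)]   # initial design
x, t = 0.0, 1.0
fx = surrogate(x, data)
for step in range(2000):
    cand = x + rng.gauss(0, 0.4)
    fc = surrogate(cand, data)
    # Annealing moves are judged on the cheap surrogate, not the model.
    if fc <= fx or rng.random() < math.exp((fx - fc) / t):
        x, fx = cand, fc
    if step % 100 == 0:                 # occasional true-model run refines
        data.append((x, expensive(x)))  # the surrogate near the search path
        fx = surrogate(x, data)
    t *= 0.998
```

The expensive model is called only once per hundred annealing moves; a real implementation would replace `surrogate` with a Kriging predictor fitted to `data` (and could use its variance estimate to decide where to spend true evaluations).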
An Evaluation of a Modified Simulated Annealing Algorithm for Various Formulations
1990-08-01
plans and procedures for the improved operation of existing systems (Reklaitis, Ravindran, & Ragsdell, 1983)." A gas pipeline flow problem is used to... simulated annealings, Journal of Statistical Physics, 45(5/6), 885-890. Reklaitis, G. V., Ravindran, A., & Ragsdell, K. M. (1983). Engineering
Wang, Wenlong; Machta, Jonathan; Katzgraber, Helmut G.
2015-07-01
Population annealing is a Monte Carlo algorithm that marries features from simulated-annealing and parallel-tempering Monte Carlo. As such, it is ideal to overcome large energy barriers in the free-energy landscape while minimizing a Hamiltonian. Thus, population-annealing Monte Carlo can be used as a heuristic to solve combinatorial optimization problems. We illustrate the capabilities of population-annealing Monte Carlo by computing ground states of the three-dimensional Ising spin glass with Gaussian disorder, while comparing to simulated-annealing and parallel-tempering Monte Carlo. Our results suggest that population annealing Monte Carlo is significantly more efficient than simulated annealing but comparable to parallel-tempering Monte Carlo for finding spin-glass ground states.
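The resampling step that distinguishes population annealing from plain simulated annealing can be sketched on a small ferromagnetic Ising ring (the sizes and schedule below are illustrative assumptions, not the paper's spin-glass setup):

```python
import math
import random

N, R, J = 10, 200, 1.0      # spins per replica, population size, coupling

def energy(s):
    """Energy of a ferromagnetic Ising ring configuration."""
    return -J * sum(s[i] * s[(i + 1) % N] for i in range(N))

rng = random.Random(9)
pop = [[rng.choice((-1, 1)) for _ in range(N)] for _ in range(R)]
beta_prev = 0.0
for beta in [0.1 * k for k in range(1, 21)]:   # annealing schedule in beta
    # Resampling step: weight each replica by exp(-dbeta * E) so the
    # population keeps representing the equilibrium ensemble as T drops.
    dbeta = beta - beta_prev
    w = [math.exp(-dbeta * energy(s)) for s in pop]
    pop = [list(pop[i]) for i in rng.choices(range(R), weights=w, k=R)]
    # Short Metropolis sweep at the new temperature.
    for s in pop:
        for _ in range(2 * N):
            i = rng.randrange(N)
            dE = 2 * J * s[i] * (s[i - 1] + s[(i + 1) % N])
            if dE <= 0 or rng.random() < math.exp(-beta * dE):
                s[i] = -s[i]
    beta_prev = beta

ground = min(energy(s) for s in pop)
```

Low-energy replicas are duplicated and high-energy ones culled at each temperature step, which is how the population, rather than any single chain, crosses free-energy barriers.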
Population annealing simulations of a binary hard-sphere mixture
Callaham, Jared; Machta, Jonathan
2017-06-01
Population annealing is a sequential Monte Carlo scheme well suited to simulating equilibrium states of systems with rough free energy landscapes. Here we use population annealing to study a binary mixture of hard spheres. Population annealing is a parallel version of simulated annealing with an extra resampling step that ensures that a population of replicas of the system represents the equilibrium ensemble at every packing fraction in an annealing schedule. The algorithm and its equilibration properties are described, and results are presented for a glass-forming fluid composed of a 50/50 mixture of hard spheres with diameter ratio of 1.4:1. For this system, we obtain precise results for the equation of state in the glassy regime up to packing fractions φ ≈ 0.60 and study deviations from the Boublik-Mansoori-Carnahan-Starling-Leland equation of state. For higher packing fractions, the algorithm falls out of equilibrium and a free volume fit predicts jamming at packing fraction φ ≈ 0.667. We conclude that population annealing is an effective tool for studying equilibrium glassy fluids and the jamming transition.
On simulated annealing phase transitions in phylogeny reconstruction.
Strobl, Maximilian A R; Barker, Daniel
2016-08-01
Phylogeny reconstruction with global criteria is NP-complete or NP-hard, hence in general requires a heuristic search. We investigate the powerful, physically inspired, general-purpose heuristic simulated annealing, applied to phylogeny reconstruction. Simulated annealing mimics the physical process of annealing, where a liquid is gently cooled to form a crystal. During the search, periods of elevated specific heat occur, analogous to physical phase transitions. These simulated annealing phase transitions play a crucial role in the outcome of the search. Nevertheless, they have received comparatively little attention, whether for phylogeny or for other optimisation problems. We analyse simulated annealing phase transitions during searches for the optimal phylogenetic tree for 34 real-world multiple alignments. In the same way in which melting temperatures differ between materials, we observe distinct specific heat profiles for each input file. We propose that this reflects differences in the search landscape, and that it can serve as a measure of problem difficulty and of the suitability of the algorithm's parameters. We discuss application in algorithmic optimisation and as a diagnostic to assess parameterisation before computationally costly, large phylogeny reconstructions are launched. Whilst the focus here lies on phylogeny reconstruction under maximum parsimony, it is plausible that our results are more widely applicable to optimisation procedures in science and industry.
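The specific-heat diagnostic used above follows from the fluctuation relation C(T) = (⟨E²⟩ − ⟨E⟩²)/T², estimated from the energies sampled at each temperature. A hedged sketch of how such a profile is computed, using a made-up toy energy over bit strings rather than a parsimony score:

```python
import math, random

random.seed(1)

# Toy rugged energy over bit strings (hypothetical landscape for illustration)
def energy(x):
    return sum(x) + 3 * (x[0] ^ x[-1])

def specific_heat_profile(n=12, temps=None, samples=2000):
    """Estimate C(T) = Var(E) / T^2 from Metropolis samples at each temperature."""
    if temps is None:
        temps = [2.0 * 0.85 ** k for k in range(15)]  # geometric cooling schedule
    x = [random.randint(0, 1) for _ in range(n)]
    profile = []
    for T in temps:
        es = []
        for _ in range(samples):
            i = random.randrange(n)
            old = energy(x)
            x[i] ^= 1                     # propose a single-bit flip
            dE = energy(x) - old
            if dE > 0 and random.random() >= math.exp(-dE / T):
                x[i] ^= 1                 # reject: undo the flip
            es.append(energy(x))
        mean = sum(es) / len(es)
        var = sum((e - mean) ** 2 for e in es) / len(es)
        profile.append((T, var / T ** 2))
    return profile

for T, C in specific_heat_profile():
    print(f"T={T:.3f}  C={C:.3f}")
```

Peaks in the printed profile mark temperature ranges where the energy fluctuates strongly, the analogue of the phase transitions discussed in the paper.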
SIMULATED ANNEALING BASED POLYNOMIAL TIME QOS ROUTING ALGORITHM FOR MANETS
Institute of Scientific and Technical Information of China (English)
Liu Lianggui; Feng Guangzeng
2006-01-01
Multi-constrained Quality-of-Service (QoS) routing is a big challenge for Mobile Ad hoc Networks (MANETs), where the topology may change constantly. In this paper a novel QoS Routing Algorithm based on Simulated Annealing (SA_RA) is proposed. This algorithm first uses an energy function to translate multiple QoS weights into a single mixed metric and then seeks a feasible path by simulated annealing. The paper outlines the simulated annealing algorithm and analyzes the problems met when applying it to QoS routing (QoSR) in MANETs. Theoretical analysis and experimental results demonstrate that the proposed method is an effective approximation algorithm, showing better performance than other pertinent algorithms in seeking the (approximate) optimal configuration in polynomial time.
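The core idea, collapsing several QoS link metrics into one energy and then annealing over feasible paths, can be sketched as follows. The graph, metric values, and weights below are invented for illustration; the paper's actual energy function and neighbourhood moves may differ.

```python
import math, random

random.seed(2)

# Tiny directed graph; each link carries hypothetical QoS metrics
links = {
    ("s", "a"): {"delay": 2, "cost": 1}, ("a", "t"): {"delay": 2, "cost": 4},
    ("s", "b"): {"delay": 5, "cost": 1}, ("b", "t"): {"delay": 1, "cost": 1},
    ("a", "b"): {"delay": 1, "cost": 1},
}
weights = {"delay": 1.0, "cost": 0.5}  # relative importance of each QoS weight

def path_energy(path):
    """Mixed metric: weighted sum of the link metrics along the path."""
    return sum(weights[m] * links[(u, v)][m]
               for u, v in zip(path, path[1:]) for m in weights)

def all_paths(u, t, seen=()):
    """Enumerate simple s-t paths (feasible in this toy instance)."""
    if u == t:
        yield seen + (t,)
        return
    for (a, b) in links:
        if a == u and b not in seen:
            yield from all_paths(b, t, seen + (u,))

paths = list(all_paths("s", "t"))
cur = random.choice(paths)
T = 5.0
while T > 0.01:
    cand = random.choice(paths)  # neighbour move: jump to another feasible path
    dE = path_energy(cand) - path_energy(cur)
    if dE <= 0 or random.random() < math.exp(-dE / T):
        cur = cand
    T *= 0.9
print(cur, path_energy(cur))
```

In a real MANET the candidate paths would be generated on the fly rather than enumerated, but the single-metric energy makes the multi-constraint problem amenable to standard SA acceptance.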
A theoretical comparison of evolutionary algorithms and simulated annealing
Energy Technology Data Exchange (ETDEWEB)
Hart, W.E.
1995-08-28
This paper theoretically compares the performance of simulated annealing and evolutionary algorithms. Our main result is that, under mild conditions, a wide variety of evolutionary algorithms can be shown to have greater performance than simulated annealing after a sufficiently large number of function evaluations. This class of EAs includes variants of evolution strategies and evolutionary programming, the canonical genetic algorithm, as well as a variety of genetic algorithms that have been applied to combinatorial optimization problems. The proof of this result is based on a performance analysis of a very general class of stochastic optimization algorithms, which has implications for the performance of a variety of other optimization algorithms.
Coordination Hydrothermal Interconnection Java-Bali Using Simulated Annealing
Wicaksono, B.; Abdullah, A. G.; Saputra, W. S.
2016-04-01
Hydrothermal power plant coordination aims to minimize the total operating cost of the system, represented by fuel cost, subject to constraints during optimization. Several methods can be used to perform this optimization. Simulated annealing (SA) is one such method; it was inspired by the annealing or cooling process in the manufacture of materials composed of crystals. The basic principle of hydrothermal power plant coordination is to use hydro power plants to supply the base load, while thermal power plants supply the remaining load. This study used two hydro power plant units and six thermal power plant units with 25 buses, calculating transmission losses and considering the power limits of each power plant unit, aided by MATLAB software. Hydrothermal coordination using simulated annealing yielded a total generation cost of 13,288,508.01 for 24 hours.
Meta-Modeling by Symbolic Regression and Pareto Simulated Annealing
Stinstra, E.; Rennen, G.; Teeuwen, G.J.A.
2006-01-01
The subject of this paper is a new approach to Symbolic Regression. Other publications on Symbolic Regression use Genetic Programming. This paper describes an alternative method based on Pareto Simulated Annealing. Our method is based on linear regression for the estimation of constants. Interval arithm
Analysis of Trivium by a Simulated Annealing variant
DEFF Research Database (Denmark)
Borghoff, Julia; Knudsen, Lars Ramkilde; Matusiewicz, Krystian
2010-01-01
We propose an improved variant of the simulated annealing method... A characteristic of equation systems that may be efficiently solvable by means of such algorithms is provided. As an example, we investigate equation systems induced by the problem of recovering the internal state of the stream cipher Trivium.
Adaptive Simulated Annealing Based Protein Loop Modeling of Neurotoxins
Institute of Scientific and Technical Information of China (English)
陈杰; 黄丽娜; 彭志红
2003-01-01
A loop modeling method, adaptive simulated annealing, for ab initio prediction of protein loop structures, as an optimization problem of searching the global minimum of a given energy function, is proposed. An interface-friendly toolbox-LoopModeller in Windows and Linux systems, VC++ and OpenGL environments is developed for analysis and visualization. Simulation results of three short-chain neurotoxins modeled by LoopModeller show that the method proposed is fast and efficient.
Molecular dynamics simulation of annealed ZnO surfaces
Energy Technology Data Exchange (ETDEWEB)
Min, Tjun Kit; Yoon, Tiem Leong [School of Physics, Universiti Sains Malaysia, 11800 USM, Penang (Malaysia); Lim, Thong Leng [Faculty of Engineering and Technology, Multimedia University, Melaka Campus, 75450 Melaka (Malaysia)
2015-04-24
The effect of thermally annealing a slab of wurtzite ZnO, terminated by two surfaces, (0001) (which is oxygen-terminated) and (0001̄) (which is Zn-terminated), is investigated via molecular dynamics simulation by using reactive force field (ReaxFF). We found that upon heating beyond a threshold temperature of ∼700 K, surface oxygen atoms begin to sublimate from the (0001) surface. The ratio of oxygen leaving the surface at a given temperature increases as the heating temperature increases. A range of phenomena occurring at the atomic level on the (0001) surface has also been explored, such as formation of oxygen dimers on the surface and evolution of partial charge distribution in the slab during the annealing process. It was found that the partial charge distribution as a function of the depth from the surface undergoes a qualitative change when the annealing temperature is above the threshold temperature.
An adaptive approach to the physical annealing strategy for simulated annealing
Hasegawa, M.
2013-02-01
A new and reasonable method for adaptive implementation of simulated annealing (SA) is studied on two types of random traveling salesman problems. The idea is based on a previous finding on the search characteristics of threshold algorithms, namely the primary role of relaxation dynamics in their finite-time optimization process. It is shown that the effective temperature for optimization can be predicted from the system's behavior, analogous to the stabilization phenomenon occurring in the heating process starting from a quenched solution. Subsequent slow cooling near the predicted point draws out the inherent optimizing ability of finite-time SA in a more straightforward manner than the conventional adaptive approach.
List-Based Simulated Annealing Algorithm for Traveling Salesman Problem.
Zhan, Shi-hua; Lin, Juan; Zhang, Ze-jun; Zhong, Yi-wen
2016-01-01
Simulated annealing (SA) algorithm is a popular intelligent optimization algorithm which has been successfully applied in many fields. Parameters' setting is a key factor for its performance, but it is also a tedious work. To simplify parameters setting, we present a list-based simulated annealing (LBSA) algorithm to solve traveling salesman problem (TSP). LBSA algorithm uses a novel list-based cooling schedule to control the decrease of temperature. Specifically, a list of temperatures is created first, and then the maximum temperature in list is used by Metropolis acceptance criterion to decide whether to accept a candidate solution. The temperature list is adapted iteratively according to the topology of the solution space of the problem. The effectiveness and the parameter sensitivity of the list-based cooling schedule are illustrated through benchmark TSP problems. The LBSA algorithm, whose performance is robust on a wide range of parameter values, shows competitive performance compared with some other state-of-the-art algorithms.
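A rough sketch of the list-based scheme on a toy TSP instance is given below. The initialisation of the temperature list and the rule of replacing its maximum with temperatures inferred from accepted uphill moves (t = −ΔE/ln r) follow the description above, but details such as list length, move counts, and the 2-opt neighbourhood are arbitrary choices here, not necessarily those of the paper.

```python
import math, random

random.seed(3)

# Random city coordinates (toy instance)
cities = [(random.random(), random.random()) for _ in range(8)]

def tour_len(t):
    return sum(math.dist(cities[t[i]], cities[t[(i + 1) % len(t)]])
               for i in range(len(t)))

def two_opt(t):
    i, j = sorted(random.sample(range(len(t)), 2))
    return t[:i] + t[i:j + 1][::-1] + t[j + 1:]

def lbsa(iters=200, list_len=20, moves=40):
    """List-based SA sketch: the max of a temperature list drives acceptance
    and is replaced by temperatures inferred from accepted uphill moves."""
    cur = list(range(len(cities)))
    random.shuffle(cur)
    best = cur[:]
    # Initialise the list from the energy spread of random perturbations
    temps = [abs(tour_len(two_opt(cur)) - tour_len(cur)) + 1e-9
             for _ in range(list_len)]
    for _ in range(iters):
        t_max = max(temps)
        accepted_uphill = []
        for _ in range(moves):
            cand = two_opt(cur)
            dE = tour_len(cand) - tour_len(cur)
            if dE <= 0:
                cur = cand
            else:
                r = max(random.random(), 1e-12)
                if r < math.exp(-dE / t_max):
                    cur = cand
                    accepted_uphill.append(-dE / math.log(r))  # t = -dE / ln(r)
            if tour_len(cur) < tour_len(best):
                best = cur[:]
        if accepted_uphill:  # adapt the list: replace its maximum temperature
            temps.remove(t_max)
            temps.append(sum(accepted_uphill) / len(accepted_uphill))
    return best

best = lbsa()
print(tour_len(best))
```

The appeal of the scheme is visible even in this sketch: no cooling rate is tuned by hand; the list adapts to the energy scale of the instance.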
Ranking important nodes in complex networks by simulated annealing
Sun, Yu; Yao, Pei-Yang; Wan, Lu-Jun; Shen, Jian; Zhong, Yun
2017-02-01
In this paper, a new method based on simulated annealing to rank important nodes in complex networks is presented. First, the concept of an importance sequence (IS), describing the relative importance of nodes in a complex network, is defined. Then, a measure used to evaluate the reasonability of an IS is designed. By treating an IS as a state of the network and the measure of its reasonability as the energy of that state, the method finds the ground state of the network by simulated annealing; in other words, it constructs a most reasonable IS. The results of experiments on real and artificial networks show that this ranking method not only is effective but also can be applied to different kinds of complex networks. Project supported by the National Natural Science Foundation of China (Grant No. 61573017) and the Natural Science Foundation of Shaanxi Province, China (Grant No. 2016JQ6062).
Variable neighbourhood simulated annealing algorithm for capacitated vehicle routing problems
Xiao, Yiyong; Zhao, Qiuhong; Kaku, Ikou; Mladenovic, Nenad
2014-04-01
This article presents the variable neighbourhood simulated annealing (VNSA) algorithm, a variant of the variable neighbourhood search (VNS) combined with simulated annealing (SA), for efficiently solving capacitated vehicle routing problems (CVRPs). In the new algorithm, the deterministic 'Move or not' criterion of the original VNS algorithm regarding the incumbent replacement is replaced by an SA probability, and the neighbourhood shifting of the original VNS (from near to far by k← k+1) is replaced by a neighbourhood shaking procedure following a specified rule. The geographical neighbourhood structure is introduced in constructing the neighbourhood structures for the CVRP of the string model. The proposed algorithm is tested against 39 well-known benchmark CVRP instances of different scales (small/middle, large, very large). The results show that the VNSA algorithm outperforms most existing algorithms in terms of computational effectiveness and efficiency, showing good performance in solving large and very large CVRPs.
Multi-Objective Simulated Annealing for Permutation Flow Shop Problems
Mokotoff, E.; Pérez, J.
2007-09-01
Real life scheduling problems require more than one criterion. Nevertheless, the complex nature of the Permutation Flow Shop problem has prevented the development of models with multiple criteria. Considering only one regular criterion, this scheduling problem was shown to be NP-complete. The Multi-Objective Simulated Annealing (MOSA) methods are metaheuristics based on Simulated Annealing to solve Multi-Objective Combinatorial Optimization (MOCO) problems, like the problem at hand. Starting from the general MOSA method introduced by Loukil et al. [1], we developed MOSA models to provide the decision maker with efficient solutions for the Permutation Flow Shop problem (common in the production of ceramic tiles). In this paper we present three models: two bicriteria models and one based on satisfaction levels for the main criterion.
Simulated Annealing for the 0/1 Multidimensional Knapsack Problem
Institute of Scientific and Technical Information of China (English)
Fubin Qian; Rui Ding
2007-01-01
In this paper a simulated annealing (SA) algorithm is presented for the 0/1 multidimensional knapsack problem. Problem-specific knowledge is incorporated in the algorithm description and evaluation of parameters in order to look into the performance of finite-time implementations of SA. Computational results show that SA performs much better than a genetic algorithm in terms of solution time, whilst having a modest loss of solution quality.
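A generic bit-flip SA for the 0/1 multidimensional knapsack might look like the following sketch. The profits, weights, and cooling parameters are invented, and the paper's problem-specific knowledge (e.g., repair or penalty schemes) is not reproduced here: infeasible neighbours are simply rejected.

```python
import math, random

random.seed(4)

# Toy 0/1 multidimensional knapsack (hypothetical data)
profit = [10, 13, 7, 8, 15, 9]
weight = [[2, 3, 1, 4, 5, 2],   # resource 1 consumption per item
          [3, 1, 2, 3, 4, 2]]   # resource 2 consumption per item
capacity = [9, 8]

def feasible(x):
    return all(sum(w[i] * x[i] for i in range(len(x))) <= c
               for w, c in zip(weight, capacity))

def value(x):
    return sum(p * b for p, b in zip(profit, x))

def sa_knapsack(T=10.0, alpha=0.999, steps=3000):
    x = [0] * len(profit)              # the empty knapsack is always feasible
    best = x[:]
    for _ in range(steps):
        cand = x[:]
        cand[random.randrange(len(x))] ^= 1   # flip one item in/out
        if feasible(cand):
            dE = value(x) - value(cand)       # maximization: dE < 0 is an improvement
            if dE <= 0 or random.random() < math.exp(-dE / T):
                x = cand
                if value(x) > value(best):
                    best = x[:]
        T = max(T * alpha, 0.01)
    return best

best = sa_knapsack()
print(best, value(best))
```

Real implementations add problem knowledge, e.g. greedy repair of infeasible flips or profit-density-biased moves, which is exactly the kind of tailoring the abstract refers to.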
Solving geometric constraints with genetic simulated annealing algorithm
Institute of Scientific and Technical Information of China (English)
刘生礼; 唐敏; 董金祥
2003-01-01
This paper applies the genetic simulated annealing algorithm (SAGA) to solving geometric constraint problems. This method makes full use of the advantages of SAGA and can handle under-/over-constrained problems naturally. It has an advantage over the Newton-Raphson method in that it is not sensitive to initial values, and its yielding of multiple solutions is an advantage over other optimization methods for multi-solution constraint systems. Our experiments have proved the robustness and efficiency of this method.
Rayleigh wave inversion using heat-bath simulated annealing algorithm
Lu, Yongxu; Peng, Suping; Du, Wenfeng; Zhang, Xiaoyang; Ma, Zhenyuan; Lin, Peng
2016-11-01
The dispersion of Rayleigh waves can be used to obtain near-surface shear (S)-wave velocity profiles. This is performed mainly by inversion of the phase velocity dispersion curves, which has been proven to be a highly nonlinear and multimodal problem, and it is unsuitable to use local search methods (LSMs) as the inversion algorithm. In this study, a new strategy is proposed based on a variant of simulated annealing (SA) algorithm. SA, which simulates the annealing procedure of crystalline solids in nature, is one of the global search methods (GSMs). There are many variants of SA, most of which contain two steps: the perturbation of model and the Metropolis-criterion-based acceptance of the new model. In this paper we propose a one-step SA variant known as heat-bath SA. To test the performance of the heat-bath SA, two models are created. Both noise-free and noisy synthetic data are generated. Levenberg-Marquardt (LM) algorithm and a variant of SA, known as the fast simulated annealing (FSA) algorithm, are also adopted for comparison. The inverted results of the synthetic data show that the heat-bath SA algorithm is a reasonable choice for Rayleigh wave dispersion curve inversion. Finally, a real-world inversion example from a coal mine in northwestern China is shown, which proves that the scheme we propose is applicable.
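The distinguishing feature of heat-bath SA, drawing the new value of a parameter directly from its conditional Boltzmann distribution instead of proposing and then accepting/rejecting, can be illustrated on a toy two-parameter misfit function. The misfit, grid, and schedule below are invented stand-ins for the dispersion-curve misfit of the paper.

```python
import math, random

random.seed(5)

# Toy misfit with a mild ripple (stands in for the dispersion-curve misfit)
def misfit(m):
    return (m[0] - 3.0) ** 2 + (m[1] + 1.0) ** 2 + 0.3 * math.sin(5 * m[0])

GRID = [i * 0.1 for i in range(-50, 51)]   # discretized parameter range

def heat_bath_step(m, k, T):
    """Draw parameter k from its conditional Boltzmann distribution:
    P(v) ∝ exp(-misfit(m with m[k]=v) / T). No accept/reject step is needed."""
    energies = []
    for v in GRID:
        trial = m[:]
        trial[k] = v
        energies.append(misfit(trial))
    e0 = min(energies)                      # stabilise the exponentials
    w = [math.exp(-(e - e0) / T) for e in energies]
    m[k] = random.choices(GRID, weights=w)[0]

m = [random.choice(GRID), random.choice(GRID)]
T = 2.0
while T > 0.01:
    for k in range(len(m)):                # one heat-bath sweep over parameters
        heat_bath_step(m, k, T)
    T *= 0.9
print(m, misfit(m))
```

As T shrinks, the conditional distribution concentrates on the grid values of lowest misfit, so the model drifts toward the minimum without any explicit rejection bookkeeping.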
Institute of Scientific and Technical Information of China (English)
巩敦卫; 曾现峰; 张勇
2013-01-01
Aiming at the path planning problem of a robot in a global static environment, a modified simulated annealing algorithm that is easy to implement was proposed. In this algorithm, a new method of generating new states was introduced by defining an off-barrier operator and an optimal search operator. The first operator utilizes a directional disturbance strategy to help path points jump off the obstacles. It not only ensures that the generated path is collision-free, but also enhances the search efficiency of the algorithm. The second one adjusts randomly chosen path points with dynamic ranges. Hence, it enables the algorithm to generate new positions in the whole search space, and enhances its global search capability. Finally, simulation results verify the effectiveness of the proposed algorithm.
Estimation of the parameters of ETAS models by Simulated Annealing
Lombardi, Anna Maria
2015-02-01
This paper proposes a new algorithm to estimate the maximum likelihood parameters of an Epidemic Type Aftershock Sequences (ETAS) model. It is based on Simulated Annealing, a versatile method that solves problems of global optimization and ensures convergence to a global optimum. The procedure is tested on both simulated and real catalogs. The main conclusion is that the method performs poorly as the size of the catalog decreases because the effect of the correlation of the ETAS parameters is more significant. These results give new insights into the ETAS model and the efficiency of the maximum-likelihood method within this context.
Simulated annealing spectral clustering algorithm for image segmentation
Institute of Scientific and Technical Information of China (English)
Yifang Yang; and Yuping Wang
2014-01-01
The similarity measure is crucial to the performance of spectral clustering. The Gaussian kernel function based on the Euclidean distance is usually adopted as the similarity measure. However, the Euclidean distance measure cannot fully reveal the complex distribution of data, and the result of spectral clustering is very sensitive to the scaling parameter. To solve these problems, a new manifold distance measure and a novel simulated annealing spectral clustering (SASC) algorithm based on the manifold distance measure are proposed. The simulated annealing based on genetic algorithm (SAGA), characterized by its rapid convergence to the global optimum, is used to cluster the sample points in the spectral mapping space. The proposed algorithm can not only reflect local and global consistency better, but also reduce the sensitivity of spectral clustering to the kernel parameter, which improves the algorithm's clustering performance. To efficiently apply the algorithm to image segmentation, the Nyström method is used to reduce the computational complexity. Experimental results show that, compared with traditional clustering algorithms and popular spectral clustering algorithms, the proposed algorithm can achieve better clustering performance on several synthetic datasets, texture images and real images.
Simulated annealing approach to the max cut problem
Sen, Sandip
1993-03-01
In this paper we address the problem of partitioning the nodes of a random graph into two sets, so as to maximize the sum of the weights on the edges connecting nodes belonging to different sets. This problem has important real-life counterparts, but has been proven to be NP-complete. As such, a number of heuristic solution techniques have been proposed in literature to address this problem. We propose a stochastic optimization technique, simulated annealing, to find solutions for the max cut problem. Our experiments verify that good solutions to the problem can be found using this algorithm in a reasonable amount of time.
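A basic SA for max cut flips one node between the two sets at a time; the gain of a flip can be computed incrementally from the incident edge weights. A minimal sketch on a random weighted graph (instance and schedule invented for illustration, not taken from the paper's experiments):

```python
import math, random

random.seed(6)

# Random weighted graph on 10 nodes (toy instance)
n = 10
w = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(i + 1, n):
        w[i][j] = w[j][i] = random.random()

def cut_value(side):
    return sum(w[i][j] for i in range(n) for j in range(i + 1, n)
               if side[i] != side[j])

def sa_maxcut(T=2.0, alpha=0.99, steps=4000):
    side = [random.randint(0, 1) for _ in range(n)]
    val = cut_value(side)
    best, best_val = side[:], val
    for _ in range(steps):
        i = random.randrange(n)
        # Gain from moving node i to the other side: same-side edges become cut,
        # cut edges become uncut.
        delta = sum(w[i][j] * (1 if side[i] == side[j] else -1)
                    for j in range(n) if j != i)
        if delta >= 0 or random.random() < math.exp(delta / T):
            side[i] ^= 1
            val += delta
            if val > best_val:
                best, best_val = side[:], val
        T *= alpha
    return best, best_val

side, value = sa_maxcut()
print(value)
```

Because each move touches a single node, the incremental `delta` keeps a step at O(n) instead of recomputing the full cut, which is what makes SA practical on larger random graphs.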
Optimal estuarine sediment monitoring network design with simulated annealing.
Nunes, L M; Caeiro, S; Cunha, M C; Ribeiro, L
2006-02-01
An objective function based on geostatistical variance reduction, constrained to the reproduction of the probability distribution functions of selected physical and chemical sediment variables, is applied to the selection of the best set of compliance monitoring stations in the Sado river estuary in Portugal. These stations were to be selected from a large set of sampling stations from a prior field campaign. Simulated annealing was chosen to solve the optimisation function model. Both the combinatorial problem structure and the resulting candidate sediment monitoring networks are discussed, and the optimal dimension and spatial distribution are proposed. An optimal network of sixty stations was obtained from an original 153-station sampling campaign.
Stochastic annealing simulations of defect interactions among subcascades
Energy Technology Data Exchange (ETDEWEB)
Heinisch, H.L. [Pacific Northwest National Lab., Richland, WA (United States); Singh, B.N.
1997-04-01
The effects of the subcascade structure of high energy cascades on the temperature dependencies of annihilation, clustering and free defect production are investigated. The subcascade structure is simulated by closely spaced groups of lower energy MD cascades. The simulation results illustrate the strong influence of the defect configuration existing in the primary damage state on subsequent intracascade evolution. Other significant factors affecting the evolution of the defect distribution are the large differences in mobility and stability of vacancy and interstitial defects and the rapid one-dimensional diffusion of small, glissile interstitial loops produced directly in cascades. Annealing simulations are also performed on high-energy, subcascade-producing cascades generated with the binary collision approximation and calibrated to MD results.
Sparse approximation problem: how rapid simulated annealing succeeds and fails
Obuchi, Tomoyuki; Kabashima, Yoshiyuki
2016-03-01
Information processing techniques based on sparseness have been actively studied in several disciplines. Among them, a mathematical framework to approximately express a given dataset by a combination of a small number of basis vectors of an overcomplete basis is termed the sparse approximation. In this paper, we apply simulated annealing, a metaheuristic algorithm for general optimization problems, to sparse approximation in the situation where the given data have a planted sparse representation and noise is present. The result in the noiseless case shows that our simulated annealing works well in a reasonable parameter region: the planted solution is found fairly rapidly. This is true even in the case where a common relaxation of the sparse approximation problem, the G-relaxation, is ineffective. On the other hand, when the dimensionality of the data is close to the number of non-zero components, another metastable state emerges, and our algorithm fails to find the planted solution. This phenomenon is associated with a first-order phase transition. In the case of very strong noise, it is no longer meaningful to search for the planted solution. In this situation, our algorithm determines a solution with close-to-minimum distortion fairly quickly.
Simulated annealing technique to design minimum cost exchanger
Directory of Open Access Journals (Sweden)
Khalfe Nadeem M.
2011-01-01
Full Text Available Owing to the wide utilization of heat exchangers in industrial processes, their cost minimization is an important target for both designers and users. Traditional design approaches are based on iterative procedures which gradually change the design and geometric parameters to satisfy a given heat duty and constraints. Although well proven, this kind of approach is time consuming and may not lead to a cost-effective design, as no cost criteria are explicitly accounted for. The present study explores the use of a nontraditional optimization technique, called simulated annealing (SA), for design optimization of shell and tube heat exchangers from an economic point of view. The optimization procedure involves the selection of the major geometric parameters, such as tube diameter, tube length, baffle spacing, number of tube passes, tube layout, type of head, baffle cut, etc., and minimization of total annual cost is considered as the design target. The presented simulated annealing technique is simple in concept, has few parameters, and is easy to implement. Furthermore, the SA algorithm explores good quality solutions quickly, giving the designer more degrees of freedom in the final choice with respect to traditional methods. The methodology takes into account the geometric and operational constraints typically recommended by design codes. Three different case studies are presented to demonstrate the effectiveness and accuracy of the proposed algorithm. The SA approach is able to reduce the total cost of the heat exchanger compared to the cost obtained by a previously reported GA approach.
Simulated Annealing-Based Krill Herd Algorithm for Global Optimization
Directory of Open Access Journals (Sweden)
Gai-Ge Wang
2013-01-01
Full Text Available Recently, Gandomi and Alavi proposed a novel swarm intelligence method, called krill herd (KH), for global optimization. To enhance the performance of the KH method, in this paper a new improved meta-heuristic simulated annealing-based krill herd (SKH) method is proposed for optimization tasks. A new krill selecting (KS) operator is used to refine krill behavior when updating the krill's position, so as to enhance its reliability and robustness in dealing with optimization problems. The introduced KS operator involves a greedy strategy and accepts a few not-so-good solutions with a low probability, as originally used in simulated annealing (SA). In addition, a kind of elitism scheme is used to save the best individuals in the population during the krill updating process. The merits of these improvements are verified on fourteen standard benchmark functions, and experimental results show that, in most cases, the performance of the improved meta-heuristic SKH method is superior to, or at least highly competitive with, the standard KH and other optimization methods.
Almaraashia, M.; John, Robert; Hopgood, A.; Ahmadi, S.
2016-01-01
This paper reports the use of simulated annealing to design more efficient fuzzy logic systems to model problems with associated uncertainties. Simulated annealing is used within this work as a method for learning the best configurations of interval and general type-2 fuzzy logic systems to maximize their modeling ability. The combination of simulated annealing with these models is presented in the modeling of four benchmark problems including real-world problems. The type-2 fuzzy logic syste...
spsann - optimization of sample patterns using spatial simulated annealing
Samuel-Rosa, Alessandro; Heuvelink, Gerard; Vasques, Gustavo; Anjos, Lúcia
2015-04-01
There are many algorithms and computer programs to optimize sample patterns, some private and others publicly available. A few have only been presented in scientific articles and textbooks. This dispersion and somewhat poor availability holds back their wider adoption and further development. We introduce spsann, a new R package for the optimization of sample patterns using spatial simulated annealing. R is the most popular environment for data processing and analysis. Spatial simulated annealing is a well-known method with widespread use for solving optimization problems in the soil and geosciences. This is mainly due to its robustness against local optima and its ease of implementation. spsann offers many optimization criteria: sampling for variogram estimation (number of points or point-pairs per lag distance class - PPL), trend estimation (association/correlation and marginal distribution of the covariates - ACDC), and spatial interpolation (mean squared shortest distance - MSSD). spsann also includes the mean or maximum universal kriging variance (MUKV) as an optimizing criterion, which is used when the model of spatial variation is known. PPL, ACDC and MSSD were combined (PAN) for sampling when we are ignorant about the model of spatial variation. spsann solves this multi-objective optimization problem by scaling the objective function values using their maximum absolute value or the mean value computed over 1000 random samples. Scaled values are aggregated using the weighted sum method. A graphical display allows one to follow how the sample pattern is being perturbed during the optimization, as well as the evolution of its energy state. It is possible to start perturbing many points and exponentially reduce the number of perturbed points. The maximum perturbation distance reduces linearly with the number of iterations. The acceptance probability also reduces exponentially with the number of iterations. R is memory hungry and spatial simulated annealing is a
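The perturbation scheme described above (move one sample point, shrink the maximum perturbation distance over iterations, accept worse patterns with decreasing probability) can be sketched for the MSSD criterion. The grid, sample size, and schedule below are arbitrary toy choices, not spsann defaults.

```python
import math, random

random.seed(7)

# Candidate prediction grid on the unit square (toy setting)
grid = [(x / 9, y / 9) for x in range(10) for y in range(10)]

def mssd(sample):
    """Mean squared shortest distance from grid nodes to the nearest sample point."""
    return sum(min((gx - sx) ** 2 + (gy - sy) ** 2 for sx, sy in sample)
               for gx, gy in grid) / len(grid)

def spatial_sa(n_pts=5, iters=1500, T=0.05, alpha=0.997, d_max=0.5):
    sample = [(random.random(), random.random()) for _ in range(n_pts)]
    e = mssd(sample)
    for it in range(iters):
        d = d_max * (1 - it / iters)          # perturbation distance shrinks linearly
        k = random.randrange(n_pts)
        sx, sy = sample[k]
        cand = sample[:]
        cand[k] = (min(1, max(0, sx + random.uniform(-d, d))),
                   min(1, max(0, sy + random.uniform(-d, d))))
        e_new = mssd(cand)
        if e_new <= e or random.random() < math.exp((e - e_new) / T):
            sample, e = cand, e_new           # accept the perturbed pattern
        T *= alpha                            # acceptance probability decays
    return sample, e

sample, e = spatial_sa()
print(round(e, 4))
```

Minimizing MSSD spreads the sample points evenly over the study area, which is the spatial-coverage behaviour the abstract attributes to this criterion.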
A Scheduling Algorithm Based on Petri Nets and Simulated Annealing
Directory of Open Access Journals (Sweden)
Rachida H. Ghoul
2007-01-01
Full Text Available This study addresses a hybrid flexible manufacturing system (HFMS) short-term scheduling problem. Based on the state of the art of general scheduling algorithms, we present the meta-heuristic we decided to apply to a given example of an HFMS, namely the simulated annealing (SA) algorithm. The HFMS model, based on hierarchical Petri nets, was used to represent the static and dynamic behavior of the HFMS and to design scheduling solutions. The hierarchical Petri net model was regarded as being made up of a set of single timed colored Petri net models. Each single model represents one process, composed of many operations and tasks. The complex scheduling problem was thus decomposed into simple sub-problems, and the scheduling algorithm was applied to each sub-model in order to resolve conflicts on shared production resources.
Memoryless cooperative graph search based on the simulated annealing algorithm
Institute of Scientific and Technical Information of China (English)
Hou Jian; Yan Gang-Feng; Fan Zhen
2011-01-01
We have studied the problem of reaching a globally optimal segment in a graph-like environment with a single autonomous mobile agent or a group of them. Firstly, two efficient simulated-annealing-like algorithms are given for a single agent to solve the problem in a partially known environment and an unknown environment, respectively. It is shown that under both proposed control strategies the agent will eventually converge to a globally optimal segment with probability 1. Secondly, we use multi-agent searching to simultaneously reduce the computational complexity and accelerate convergence, based on the algorithms given for a single agent. By exploiting graph partition, a gossip-consensus-based scheme is presented to update the key parameter, the radius of the graph, ensuring that the agents spend much less time finding a globally optimal segment.
Restoration of polarimetric SAR images using simulated annealing
DEFF Research Database (Denmark)
Schou, Jesper; Skriver, Henning
2001-01-01
Filtering synthetic aperture radar (SAR) images ideally results in better estimates of the parameters characterizing the distributed targets in the images while preserving the structures of the nondistributed targets. However, these objectives are normally conflicting, often leading to a filtering approach favoring one of the objectives. An algorithm for estimating the radar cross-section (RCS) for intensity SAR images has previously been proposed in the literature, based on Markov random fields and the stochastic optimization method simulated annealing. A new version of the algorithm is presented … are obtained while at the same time preserving most of the structures in the image. The algorithm is evaluated using multilook polarimetric L-band data from the Danish airborne EMISAR system, and the impact of the algorithm on the unsupervised H-α classification is demonstrated.
Optimization of multiple-layer microperforated panels by simulated annealing
DEFF Research Database (Denmark)
Ruiz Villamil, Heidi; Cobo, Pedro; Jacobsen, Finn
2011-01-01
Sound absorption by microperforated panels (MPP) has received increasing attention in the past years as an alternative to conventional porous absorbers in applications with special cleanliness and health requirements. The absorption curve of an MPP depends on four parameters: the hole diameter, the panel thickness, the perforation ratio, and the thickness of the air cavity between the panel and an impervious wall. It is possible to find a proper combination of these parameters that provides an MPP absorbing in one or two octave bands within the frequency range of interest for noise control. … Therefore, simulated annealing is proposed in this paper as a tool to solve the optimization problem of finding the best combination of the constitutive parameters of a multiple-layer MPP (ML-MPP) providing the maximum average absorption within a prescribed frequency band.
Simulated annealing and joint manufacturing batch-sizing
Directory of Open Access Journals (Sweden)
Sarker Ruhul
2003-01-01
Full Text Available We address an important problem of a manufacturing system. The system procures raw materials from outside suppliers in lots and processes them to produce finished goods. We propose an ordering policy for raw materials to meet the requirements of a production facility, which in turn must deliver finished products demanded by external buyers at fixed time intervals. First, a general cost model is developed considering both raw materials and finished products. This model is then used to develop a simulated annealing approach to determining an optimal ordering policy for the procurement of raw materials, and the manufacturing batch size, so as to minimize the total cost of meeting customer demands in time. The solutions obtained are compared with those of traditional approaches, and numerical examples are presented.
Fuzzy unit commitment solution - A novel twofold simulated annealing approach
Energy Technology Data Exchange (ETDEWEB)
Saber, Ahmed Yousuf; Senjyu, Tomonobu; Yona, Atsushi; Urasaki, Naomitsu [Faculty of Engineering, University of the Ryukyus, 1 Senbaru, Nishihara-cho Nakagami, Okinawa 903-0213 (Japan); Funabashi, Toshihisa [Meidensha Corporation, Riverside Building 36-2, Tokyo 103-8515 (Japan)
2007-10-15
The authors propose a twofold simulated annealing (twofold-SA) method for the optimization of a fuzzy unit commitment formulation. In the proposed method, simulated annealing (SA) and fuzzy logic are combined to obtain SA acceptance probabilities from fuzzy membership degrees. Fuzzy load is calculated from error statistics, and an initial solution is generated by a priority list method. The initial solution is decomposed into hourly schedules, and each hourly schedule is modified by the decomposed-SA using a bit-flipping operator. Fuzzy membership degrees are the selection attributes of the decomposed-SA. A new solution consists of these hourly schedules over the entire scheduling period after repair, as unit-wise constraints may not be fulfilled at the time of an individual hourly-schedule modification. This helps to detect and modify promising schedules of appropriate hours. In the coupling-SA, this new solution is accepted for the next iteration if its cost is less than that of the current solution; however, a higher-cost new solution is accepted with a probability given by the temperature-dependent total-cost membership function. The computation time of the proposed method is also improved by the imprecise tolerance of the fuzzy model. Besides, excess units with a system-dependent probability distribution help to handle constraints efficiently, and imprecise economic load dispatch (ELD) calculations are modified to save execution time. The proposed method is tested using standard reported data sets. Numerical results show an improvement in solution cost and time compared to the results obtained from other existing methods. (author)
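The acceptance rule described above, where a fuzzy membership degree rather than the raw cost difference drives the probability of keeping a worse solution, can be sketched as follows. The triangular membership function and the 20% tolerance are hypothetical illustrations, not the paper's actual definitions:

```python
import math
import random

def total_cost_membership(cost, best_cost, tol=0.2):
    """Hypothetical triangular membership: 1 at the best-known cost,
    falling linearly to 0 once cost exceeds it by tol (here 20%)."""
    excess = (cost - best_cost) / best_cost
    return min(1.0, max(0.0, 1.0 - excess / tol))

def accept(new_cost, cur_cost, best_cost, temperature, rng):
    """Accept improvements outright; accept worse solutions with a
    probability scaled by the membership degree and the temperature."""
    if new_cost < cur_cost:
        return True
    mu = total_cost_membership(new_cost, best_cost)
    return rng.random() < mu * math.exp(-1.0 / max(temperature, 1e-12))
```

As the temperature falls, the exponential factor shrinks, so even solutions with high membership are accepted less often, mirroring standard SA cooling.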
A Simulated Annealing Based Location Area Optimization in Next Generation Mobile Networks
Directory of Open Access Journals (Sweden)
Vilmos Simon
2007-01-01
Full Text Available Mobile networks have faced a rapid increase in the number of mobile users, and the solution for supporting the growing population is to reduce cell sizes and increase bandwidth reuse. This causes the number of location management operations and call deliveries to increase significantly, resulting in high signaling overhead. We focus on minimizing this overhead by efficient Location Area Planning (LAP). In this paper we seek to determine the location areas that minimize the registration cost, constrained by the paging cost. For that we propose a simulated annealing algorithm, which is applied to a basic location area partition of cells formed by a greedy algorithm. We used our realistic mobile environment simulator to generate input (cell-changing and incoming-call statistics) for our algorithm, and by comparing the values of the registration cost function we found that a significant reduction was achieved in the amount of signaling traffic.
Directory of Open Access Journals (Sweden)
I Gede Agus Widyadana
2002-01-01
Full Text Available This research compares the Genetic algorithm and Simulated Annealing in terms of performance and processing time. The main purpose is to assess the ability of both algorithms to minimize makespan and total flowtime in a particular flow shop system. The performance of the algorithms is evaluated by simulating problems with varying combinations of jobs and machines. The results show that Simulated Annealing outperforms the Genetic algorithm by up to 90%; the Genetic algorithm wins only on processing time, but the observed trend suggests that for problems with many jobs and many machines Simulated Annealing will run much faster than the Genetic algorithm. Keywords: Genetic algorithm, Simulated Annealing, flow shop, makespan, total flowtime.
Traveling Salesman Approach for Solving Petrol Distribution Using Simulated Annealing
Directory of Open Access Journals (Sweden)
Zuhaimy Ismail
2008-01-01
Full Text Available This research presents an attempt to solve a logistics company's problem of delivering petrol to petrol stations in the state of Johor. The delivery system is formulated as a travelling salesman problem (TSP): finding an optimal route that visits each station and returns to the point of origin, where the inter-station distances are symmetric and known. This real-world application is a deceptively simple combinatorial problem, and our approach is to develop solutions based on local search and meta-heuristics. We define the objective simply as the time spent or distance travelled by a salesman visiting n cities (or nodes) cyclically; in one tour the vehicle visits each station just once and finishes where it started. This research presents the development of a solution engine based on a local search method known as the Greedy Method; with its result as the initial solution, Simulated Annealing (SA) and Tabu Search (TS) are then used to improve the search and provide the best solution. A user-friendly optimization program developed in Microsoft C++ solves the TSP and provides solutions to future TSPs, which may be classified into daily or advanced management and engineering problems.
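The greedy-then-SA pipeline described above can be sketched compactly: a nearest-neighbour tour as the initial solution, improved by SA over 2-opt style segment reversals. This is an illustrative reconstruction, not the authors' program; the parameters and the toy four-station instance are assumptions:

```python
import math
import random

def tour_length(tour, dist):
    """Cyclic tour length under a symmetric distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def greedy_tour(dist, start=0):
    """Nearest-neighbour construction used as the initial solution."""
    n = len(dist)
    unvisited = set(range(n)) - {start}
    tour = [start]
    while unvisited:
        nxt = min(unvisited, key=lambda j: dist[tour[-1]][j])
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

def sa_tsp(dist, t0=5.0, alpha=0.99, iters=5000, seed=7):
    """Improve the greedy tour with SA over 2-opt style segment reversals."""
    rng = random.Random(seed)
    tour = greedy_tour(dist)
    cost = tour_length(tour, dist)
    best, best_cost, t = tour[:], cost, t0
    n = len(tour)
    for _ in range(iters):
        i, j = sorted(rng.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]  # reverse a segment
        cand_cost = tour_length(cand, dist)
        if cand_cost < cost or rng.random() < math.exp(-(cand_cost - cost) / t):
            tour, cost = cand, cand_cost
            if cost < best_cost:
                best, best_cost = tour[:], cost
        t *= alpha
    return best, best_cost

# Toy usage: four stations on a unit square; the optimal cyclic tour has length 4.
pts = [(0, 0), (0, 1), (1, 1), (1, 0)]
dist = [[math.hypot(px - qx, py - qy) for qx, qy in pts] for px, py in pts]
best_tour, best_len = sa_tsp(dist)
```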
Optical Design of Multilayer Achromatic Waveplate by Simulated Annealing Algorithm
Institute of Scientific and Technical Information of China (English)
Jun Ma; Jing-Shan Wang; Carsten Denker; Hai-Min Wang
2008-01-01
We applied a Monte Carlo method, the simulated annealing algorithm, to the design of multilayer achromatic waveplates. We present solutions for three-, six-, and ten-layer achromatic waveplates. The optimized retardance settings are found to be 89°51'39"±0°33'37" and 89°54'46"±0°22'4" for the six- and ten-layer waveplates, respectively, for a wavelength range from 1000 nm to 1800 nm. The polarimetric properties of multilayer waveplates are investigated in several numerical experiments. In contrast to the previously proposed three-layer achromatic waveplate, the fast axes of the new six- and ten-layer achromatic waveplates remain at fixed angles, independent of the wavelength. Two applications of multilayer achromatic waveplates are discussed: a general-purpose phase shifter and the birefringent filter in the Infrared Imaging Magnetograph (IRIM) system of the Big Bear Solar Observatory (BBSO). We also describe an experimental method to measure the retardance of waveplates.
Simulated Annealing Technique for Routing in a Rectangular Mesh Network
Directory of Open Access Journals (Sweden)
Noraziah Adzhar
2014-01-01
Full Text Available In the process of automatic design of printed circuit boards (PCBs), the phase following cell placement is routing. Routing is a notoriously difficult problem: even the simplest routing problem, consisting of a set of two-pin nets, is known to be NP-complete. In this research, the routing region is first tessellated into a uniform Nx×Ny array of square cells. The ultimate goal of a routing problem is to achieve complete automatic routing with minimal need for manual intervention; therefore, shortest paths for all connections need to be established. While the classical Dijkstra's algorithm is guaranteed to find the shortest path for a single net, each routed net forms obstacles for later paths. This adds complexity to routing later nets, making their routes longer than optimal or sometimes impossible to complete. Today's sequential routing often applies a heuristic method to further refine the solution, rerouting all nets in different orders to improve the quality of the routing. This motivates us to apply simulated annealing, one of the metaheuristic methods, to our routing model to produce better candidate sequences.
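The sequential baseline described above routes each net with Dijkstra's algorithm over the cell grid, with earlier nets becoming obstacles for later ones. A compact grid version (illustrative, with unit step costs; the function name and obstacle encoding are assumptions, not the paper's code):

```python
import heapq

def dijkstra_grid(nx, ny, src, dst, blocked=frozenset()):
    """Shortest path length on an nx-by-ny array of square cells with unit
    step cost; cells in `blocked` model obstacles left by earlier nets."""
    dist = {src: 0}
    pq = [(0, src)]
    while pq:
        d, (x, y) = heapq.heappop(pq)
        if (x, y) == dst:
            return d
        if d > dist.get((x, y), float("inf")):
            continue  # stale queue entry
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nxt[0] < nx and 0 <= nxt[1] < ny and nxt not in blocked:
                nd = d + 1
                if nd < dist.get(nxt, float("inf")):
                    dist[nxt] = nd
                    heapq.heappush(pq, (nd, nxt))
    return None  # the net cannot be completed
```

Returning `None` for an unreachable pin pair is exactly the failure mode that motivates reordering the nets, e.g. by simulated annealing over routing sequences as the abstract proposes.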
A Simulated Annealing Approach for the Train Design Optimization Problem
Directory of Open Access Journals (Sweden)
Federico Alonso-Pecina
2017-01-01
Full Text Available The Train Design Optimization Problem concerns making optimal decisions on the number and movement of locomotives and crews through a railway network, so as to satisfy requested pick-up and delivery of car blocks at stations. In a mathematical programming formulation, the objective function to minimize is composed of the costs associated with the movement of locomotives and cars, the loading/unloading operations, the number of locomotives, and the crews' return to their departure stations. The constraints include upper bounds on the number of car blocks per locomotive, the number of car block swaps, and the number of locomotives passing through railroad segments. We propose a heuristic method to solve this highly combinatorial problem in two steps. The first finds an initial feasible solution by means of an ad hoc algorithm. The second uses simulated annealing to improve the initial solution, followed by a procedure aiming to further reduce the number of needed locomotives. We show that our results are competitive with those found in the literature.
Hybrid annealing using a quantum simulator coupled to a classical computer
Graß, Tobias
2016-01-01
Finding the global minimum in a rugged potential landscape is a computationally hard task, often equivalent to relevant optimization problems. Simulated annealing is a computational technique which explores the configuration space by mimicking thermal noise. By slow cooling, it freezes the system in a low-energy configuration, but the algorithm often gets stuck in local minima. In quantum annealing, the thermal noise is replaced by controllable quantum fluctuations, and the technique can be implemented in modern quantum simulators. However, quantum-adiabatic schemes become prohibitively slow in the presence of quasidegeneracies. Here we propose a strategy which combines ideas from simulated annealing and quantum annealing. In such a hybrid algorithm, the outcome of a quantum simulator is processed on a classical device. While the quantum simulator explores the configuration space by repeatedly applying quantum fluctuations and performing projective measurements, the classical computer evaluates each configurati…
Differential evolution and simulated annealing algorithms for mechanical systems design
Directory of Open Access Journals (Sweden)
H. Saruhan
2014-09-01
Full Text Available In this study, nature-inspired algorithms, the Differential Evolution (DE) and Simulated Annealing (SA), are utilized to seek a global optimum solution for the weight of a ball-bearing link system assembly with constraints and mixed design variables. The Genetic Algorithm (GA) and the Evolution Strategy (ES) serve as references for the examination and validation of the DE and the SA. The main purpose is to minimize the weight of an assembly composed of a shaft and two ball bearings. The ball-bearing link system is used extensively in many machinery applications, and among mechanical systems designers pay great attention to it because of its significant industrial importance. The problem is complex and time-consuming due to mixed design variables and inequality constraints imposed on the objective function. The results showed that the DE and the SA performed reliably and converged to the global optimum solution, so their application can be very useful in many real-world mechanical system design problems. Moreover, the comparison confirms the effectiveness and superiority of the DE over the other algorithms (the SA, the GA, and the ES) in terms of solution quality: an assembly weight of 634,099 gr was obtained using the DE, while 671,616 gr, 728213.8 gr, and 729445.5 gr were obtained using the SA, the ES, and the GA, respectively.
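For readers unfamiliar with the DE side of this comparison, the classic DE/rand/1/bin scheme can be sketched for a box-constrained toy problem. This is a generic textbook variant, not the paper's implementation; the population size, F, CR, and the sphere test function are assumptions:

```python
import random

def differential_evolution(cost, bounds, pop_size=20, f=0.8, cr=0.9, gens=200, seed=3):
    """DE/rand/1/bin sketch for box-constrained minimization."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [cost(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # Three distinct donors, none equal to the target index i.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jr = rng.randrange(dim)  # guarantees at least one mutated gene
            trial = []
            for j in range(dim):
                if j == jr or rng.random() < cr:
                    v = pop[a][j] + f * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)  # clip to the box constraints
                else:
                    v = pop[i][j]
                trial.append(v)
            ft = cost(trial)
            if ft <= fit[i]:  # greedy one-to-one replacement
                pop[i], fit[i] = trial, ft
    k = min(range(pop_size), key=fit.__getitem__)
    return pop[k], fit[k]

# Toy usage: minimize the 2-D sphere function over [-5, 5]^2.
x, fx = differential_evolution(lambda v: sum(t * t for t in v), [(-5, 5), (-5, 5)])
```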
Sensitivity study on hydraulic well testing inversion using simulated annealing
Energy Technology Data Exchange (ETDEWEB)
Nakao, Shinsuke; Najita, J.; Karasaki, Kenzi
1997-11-01
For environmental remediation, management of nuclear waste disposal, or geothermal reservoir engineering, it is very important to evaluate the permeabilities, spacing, and sizes of the subsurface fractures which control ground water flow. Cluster variable aperture (CVA) simulated annealing has been used as an inversion technique to construct fluid flow models of fractured formations based on transient pressure data from hydraulic tests. A two-dimensional fracture network system is represented as a filled regular lattice of fracture elements. The algorithm iteratively changes the aperture of a cluster of fracture elements, chosen randomly from a list of discrete apertures, to improve the match to observed pressure transients. The size of the clusters is held constant throughout the iterations. Sensitivity studies using simple fracture models with eight wells show that, in general, it is necessary to conduct interference tests using at least three different wells as the pumping well in order to reconstruct the fracture network with a transmissivity contrast of one order of magnitude, particularly when the cluster size is not known a priori. Because hydraulic inversion is inherently non-unique, it is important to utilize additional information. The authors investigated the relationship between the scale of heterogeneity and the optimum cluster size (and its shape) to enhance the reliability and convergence of the inversion. It appears that a cluster size corresponding to about 20-40% of the practical range of the spatial correlation is optimal. Inversion results for the Raymond test site data are also presented, and the practical range of spatial correlation is evaluated to be about 5-10 m from the optimal cluster size in the inversion.
Directory of Open Access Journals (Sweden)
Dawei Chen
2015-01-01
Full Text Available This paper analyzes the impact factors and principles of siting urban refueling stations and proposes a three-stage method. The main objective of the method is to minimize refueling vehicles’ detour time. The first stage aims at identifying the most frequently traveled road segments for siting refueling stations. The second stage focuses on adding additional refueling stations to serve vehicles whose demands are not directly satisfied by the refueling stations identified in the first stage. The last stage further adjusts and optimizes the refueling station plan generated by the first two stages. A genetic simulated annealing algorithm is proposed to solve the optimization problem in the second stage and the results are compared to those from the genetic algorithm. A case study is also conducted to demonstrate the effectiveness of the proposed method and algorithm. The results indicate the proposed method can provide practical and effective solutions that help planners and government agencies make informed refueling station location decisions.
Wanneng Shu
2009-01-01
Quantum-inspired genetic algorithm (QGA) is applied to simulated annealing (SA) to develop a class of quantum-inspired simulated annealing genetic algorithm (QSAGA) for combinatorial optimization. With the condition of preserving QGA advantages, QSAGA takes advantage of the SA algorithm so as to avoid premature convergence. To demonstrate its effectiveness and applicability, experiments are carried out on the knapsack problem. The results show that QSAGA performs well, without premature conve...
Directory of Open Access Journals (Sweden)
Gregorius Satia Budhi
2003-01-01
Full Text Available A Flexible Manufacturing System (FMS) is a manufacturing system formed from several numerically controlled machines combined with a material handling system, so that different jobs can be processed by different machine sequences. FMS combines the high productivity and flexibility of the transfer line and job shop manufacturing systems. In this research, an Activity-Based Costing (ABC) approach was used as the weight in searching for the operation route on the proper machine, so that the total production cost can be optimized. The search method used in this experiment is Simulated Annealing, a variant of the Hill Climbing search method, with the ideal operation time to process a part used as the annealing schedule. Empirical tests showed that the use of the ABC approach and Simulated Annealing for route searching (routing) can optimize the total production cost, while using the ideal operation time as the annealing schedule keeps the processing time well under control.
Computer simulation of laser annealing of a nanostructured surface
Ivanov, D.; Marinov, I.; Gorbachev, Y.; Smirnov, A.; Krzhizhanovskaya, V.
2010-01-01
Laser annealing technology is used in mass production of new-generation semiconductor materials and nano-electronic devices like the MOS-based (metal-oxide-semiconductor) integrated circuits. Manufacturing sub-100 nm MOS devices demands application of ultra-shallow doping (junctions), which requires
Theodorakos, I.; Zergioti, I.; Vamvakas, V.; Tsoukalas, D.; Raptis, Y. S.
2014-01-01
In this work, a picosecond diode-pumped solid-state laser and a nanosecond Nd:YAG laser have been used for the annealing and partial nano-crystallization of an amorphous silicon layer. These experiments were conducted as an alternative/complement to the plasma-enhanced chemical vapor deposition method for the fabrication of micromorph tandem solar cells. The laser experimental work was combined with simulations of the annealing process, in terms of the evolution of the temperature distribution, in order to predetermine the optimum annealing conditions. The annealed material was studied as a function of several annealing parameters (wavelength, pulse duration, fluence) with regard to its structural properties, by X-ray diffraction, SEM, and micro-Raman techniques.
Green, P. L.
2015-02-01
This work details the Bayesian identification of a nonlinear dynamical system using a novel MCMC algorithm: 'Data Annealing'. Data Annealing is similar to Simulated Annealing in that it allows the Markov chain to easily clear 'local traps' in the target distribution. To achieve this, training data is fed into the likelihood such that its influence over the posterior is introduced gradually - this allows the annealing procedure to be conducted with reduced computational expense. Additionally, Data Annealing uses a proposal distribution which allows it to conduct a local search accompanied by occasional long jumps, reducing the chance that it will become stuck in local traps. Here it is used to identify an experimental nonlinear system. The resulting Markov chains are used to approximate the covariance matrices of the parameters in a set of competing models before the issue of model selection is tackled using the Deviance Information Criterion.
Simulation of annealing process effect on texture evolution of deep-drawing sheet St15
Institute of Scientific and Technical Information of China (English)
Jinghong Sun; Yazheng Liu; Leyu Zhou
2005-01-01
A two-dimensional cellular automaton method was used to simulate grain growth during the recrystallization annealing of deep-drawing sheet St15, taking the simulated result of recrystallization and the experimental result of the annealing texture of deep-drawing sheet St15 as the initial condition and reference. By means of computer simulation, the microstructures and textures of different periods of grain growth were predicted. It is found that the grain size, shape, and texture become stable after grain growth at a constant temperature of 700℃ for 10 h, and the advantageous {111} texture components are dominant.
Liang, Faming
2014-04-03
Simulated annealing has been widely used in the solution of optimization problems. As known by many researchers, the global optima cannot be guaranteed to be located by simulated annealing unless a logarithmic cooling schedule is used. However, the logarithmic cooling schedule is so slow that no one can afford to use this much CPU time. This article proposes a new stochastic optimization algorithm, the so-called simulated stochastic approximation annealing algorithm, which is a combination of simulated annealing and the stochastic approximation Monte Carlo algorithm. Under the framework of stochastic approximation, it is shown that the new algorithm can work with a cooling schedule in which the temperature can decrease much faster than in the logarithmic cooling schedule, for example, a square-root cooling schedule, while guaranteeing the global optima to be reached when the temperature tends to zero. The new algorithm has been tested on a few benchmark optimization problems, including feed-forward neural network training and protein-folding. The numerical results indicate that the new algorithm can significantly outperform simulated annealing and other competitors. Supplementary materials for this article are available online.
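The two cooling schedules contrasted in this abstract can be written down directly; a small illustration of how much faster a square-root schedule cools than the logarithmic one (the constants c and t0 are arbitrary choices for the sketch):

```python
import math

def log_schedule(k, c=1.0):
    """Classical logarithmic schedule T_k = c / log(k + 1), the slowest
    schedule for which plain SA still guarantees the global optimum."""
    return c / math.log(k + 1)

def sqrt_schedule(k, t0=1.0):
    """Square-root schedule T_k = t0 / sqrt(k + 1), the much faster
    cooling that the stochastic approximation framework permits."""
    return t0 / math.sqrt(k + 1)

# Compare the two temperatures at a few iteration counts.
for k in (10, 1000, 100000):
    print(k, round(log_schedule(k), 4), round(sqrt_schedule(k), 6))
```

At k = 100000 the logarithmic temperature is still near 0.09 while the square-root temperature has dropped below 0.004, which is the practical gap the article exploits.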
Energy Technology Data Exchange (ETDEWEB)
Saboonchi, Ahmad [Department of Mechanical Engineering, Isfahan University of Technology, Isfahan 84154 (Iran); Hassanpour, Saeid [Rayan Tahlil Sepahan Co., Isfahan Science and Technology Town, Isfahan 84155 (Iran); Abbasi, Shahram [R and D Department, Mobarakeh Steel Complex, Isfahan (Iran)
2008-11-15
Cold-rolled steel coils are annealed in batch furnaces to obtain desirable mechanical properties. Annealing operations involve heating and cooling cycles which take a long time due to the high weight of the coils under annealing. To reduce annealing time, a simulation code was developed that is capable of evaluating more effective schedules for annealing coils during the heating process. The code can also accurately determine the furnace turn-off time for different coil weights and charge dimensions. After studying many heating schedules and considering the heat transfer mechanisms in the annealing furnace, the schedule with the most advantages was selected as the new operating condition in the hydrogen annealing plant. All furnaces were adjusted to the new heating schedule after experiments had been carried out to verify the accuracy of the code and the fitness of the new operating condition. Comparison of similar yields of cold-rolled coils over two months revealed that, under the new heating schedule, the specific energy consumption of the furnaces decreased by 11%, the heating cycle time by 16%, and the hydrogen consumption by 14%. (author)
Hartmann, Matthias; Bogner, Ludwig
2008-05-01
Inverse treatment planning for intensity-modulated radiation therapy (IMRT) is complicated by several sources of error, which can cause deviations of optimized plans from the true optimal solution. These errors include the systematic and convergence error, the local minima error, and the optimizer convergence error. We minimize these errors by developing an inverse IMRT treatment planning system with a Monte Carlo based dose engine and both a simulated annealing search engine and a deterministic search engine. In addition, different generalized equivalent uniform dose (gEUD)-based and hybrid objective functions were implemented and investigated with simulated annealing. By means of a head-and-neck IMRT case we analyzed the properties of these gEUD-based objective functions, including the search space and the existence of local optima errors. We found evidence that the gEUD-based objective function of a previously published investigation results in an uncommon search space with a golf-hole structure. This special search space structure leads to trapping in local minima, making it extremely difficult to identify the true global minimum, even when using stochastic search engines. Moreover, for the same IMRT case several local optima were detected by comparing the solutions of 100 different trials using a gradient optimization algorithm with the global optimum computed by simulated annealing. We demonstrated that the hybrid objective function, which includes dose-based objectives for the target and gEUD-based objectives for normal tissue, results in equally good sparing of the critical structures as the pure gEUD objective function, with lower target dose maxima.
A Simulated Annealing Algorithm for the TSP
Institute of Scientific and Technical Information of China (English)
朱静丽
2011-01-01
The traveling salesman problem (TSP) is a combinatorial optimization problem with NP-complete computational complexity. This paper analyzes the simulated annealing algorithm model, studies the feasibility of solving the TSP with simulated annealing, and gives a concrete implementation of a simulated annealing algorithm for the TSP.
Cook, Darcy; Ferens, Ken; Kinsner, Witold
Simulated Annealing (SA) has been shown to be a successful technique for optimization problems and has been applied to both continuous function optimization and combinatorial optimization. There has been some work on modifying the SA algorithm to incorporate properties of chaotic processes, with the goal of reducing the time to converge to an optimal or a good solution. There are several variations of these chaotic simulated annealing (CSA) algorithms. In this paper a new variation of chaotic simulated annealing is proposed and applied to a combinatorial optimization problem in multiprocessor task allocation. The experiments show that the CSA algorithms reach a good solution faster than traditional SA algorithms in many cases because of a wider initial solution search.
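One common way CSA variants inject chaos is to replace the uniform random draw in the Metropolis acceptance test with iterates of a chaotic map such as the logistic map. A sketch under that assumption (the paper's exact variant may differ, and the class and function names here are hypothetical):

```python
import math

class LogisticMap:
    """Chaotic sequence x_{n+1} = r * x_n * (1 - x_n) with r = 4, used in
    place of uniform pseudo-random numbers in the SA acceptance test."""
    def __init__(self, x0=0.3, r=4.0):
        self.x, self.r = x0, r

    def next(self):
        self.x = self.r * self.x * (1.0 - self.x)
        return self.x

def chaotic_accept(delta, temperature, chaos):
    """Metropolis-style test driven by a chaotic draw instead of random()."""
    if delta <= 0:
        return True  # improvements are always accepted
    return chaos.next() < math.exp(-delta / temperature)
```

The chaotic iterates cover [0, 1] with a non-uniform density, which is what gives CSA its different (often wider) early exploration compared with uniform draws.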
Instantons in Quantum Annealing: Thermally Assisted Tunneling Vs Quantum Monte Carlo Simulations
Jiang, Zhang; Smelyanskiy, Vadim N.; Boixo, Sergio; Isakov, Sergei V.; Neven, Hartmut; Mazzola, Guglielmo; Troyer, Matthias
2015-01-01
A recent numerical result (arXiv:1512.02206) from Google suggested that the D-Wave quantum annealer may have an asymptotic speed-up over simulated annealing; however, the asymptotic advantage disappears when it is compared to quantum Monte Carlo (a classical algorithm despite its name). We show analytically that the asymptotic scaling of quantum tunneling is exactly the same as the escape rate in quantum Monte Carlo for a class of problems; thus, the Google result might be explained in our framework. We also find that the transition state in quantum Monte Carlo corresponds to the instanton solution in quantum tunneling problems, which is observed in numerical simulations.
Institute of Scientific and Technical Information of China (English)
高红民; 周惠; 徐立中; 石爱业
2014-01-01
A hybrid feature selection and classification strategy was proposed based on the simulated annealing genetic algorithm and multiple instance learning (MIL). A band selection method based on subspace decomposition was proposed, which combines the simulated annealing algorithm with the genetic algorithm in choosing different crossover and mutation probabilities, as well as mutation individuals. MIL was then combined with image segmentation, clustering, and support vector machine algorithms to classify hyperspectral images. The experimental results show that the proposed method achieves a high classification accuracy of 93.13% with small training samples, overcoming the weaknesses of conventional methods.
DEFF Research Database (Denmark)
Sousa, Tiago M; Soares, Tiago; Morais, Hugo
2016-01-01
The massive use of distributed generation and electric vehicles will lead to a more complex management of the power system, requiring new approaches to be used in the optimal resource scheduling field. Electric vehicles with vehicle-to-grid capability can be useful for the aggregator players...... of the aggregator's total operation costs. The case study considers a distribution network with 33 buses, 66 distributed generation units and 2000 electric vehicles. The proposed simulated annealing is compared with a deterministic approach, allowing an effective and efficient comparison. The simulated annealing presents......
DEFF Research Database (Denmark)
Riaz, M. Tahir; Gutierrez Lopez, Jose Manuel; Pedersen, Jens Myrup
2011-01-01
The paper presents a hybrid Genetic and Simulated Annealing algorithm for implementing the Chordal Ring structure in optical backbone networks. In recent years, topologies based on regular graph structures have gained a lot of interest due to their good communication properties for the physical topology...... of the networks. There have been many uses of evolutionary algorithms to solve problems which are combinatorial in nature and extremely hard to solve by exact approaches. Both genetic and simulated annealing algorithms use a controlled stochastic method to search the solution....... The paper combines the algorithms in order to analyze the impact of implementation performance....
Institute of Scientific and Technical Information of China (English)
CHUShuchuan; JohnF.Roddick
2003-01-01
In this paper, a cluster generation algorithm for vector quantization using a tabu search approach with simulated annealing is proposed. The main idea of this algorithm is to use the tabu search approach to generate non-local moves for the clusters and apply the simulated annealing technique to select the current best solution, thus improving the cluster generation and reducing the mean squared error. Preliminary experimental results demonstrate that the proposed approach is superior to the tabu search approach with the Generalised Lloyd algorithm.
Directory of Open Access Journals (Sweden)
Juan Frausto-Solis
2016-01-01
Full Text Available A new hybrid Multiphase Simulated Annealing Algorithm using Boltzmann and Bose-Einstein distributions (MPSABBE) is proposed. MPSABBE was designed for solving Protein Folding Problem (PFP) instances. This new approach has four phases: (i) Multiquenching Phase (MQP), (ii) Boltzmann Annealing Phase (BAP), (iii) Bose-Einstein Annealing Phase (BEAP), and (iv) Dynamical Equilibrium Phase (DEP). BAP and BEAP are simulated annealing searching procedures based on the Boltzmann and Bose-Einstein distributions, respectively. DEP is also a simulated annealing search procedure, applied at the final temperature; it can be seen as a second Bose-Einstein phase. MQP is a search process that ranges from extremely high to high temperatures, applying a very fast cooling process, and is not very restrictive in accepting new solutions. In contrast, BAP and BEAP range from high to low and from low to very low temperatures, respectively, and are more restrictive in accepting new solutions. DEP uses a particular heuristic to detect stochastic equilibrium by applying a least-squares method during its execution. MPSABBE parameters are tuned with an analytical method, which considers the maximal and minimal deterioration of problem instances. MPSABBE was tested with several instances of PFP, showing that using both distributions is better than using only the Boltzmann distribution, as in classical SA.
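The difference between the Boltzmann and Bose-Einstein acceptance rules can be illustrated numerically. The sketch below is one plausible reading, assuming a Metropolis-style rule built from each distribution; the abstract does not give MPSABBE's exact formulas, so these are labeled assumptions:

```python
import math

def p_boltzmann(dE, T):
    """Classical SA acceptance: accept improvements always,
    deteriorations with probability exp(-dE/T)."""
    if dE <= 0:
        return 1.0
    return math.exp(-dE / T)

def p_bose_einstein(dE, T):
    """Hypothetical acceptance built from the Bose-Einstein occupation
    number 1/(exp(dE/T) - 1), clipped to a valid probability."""
    if dE <= 0:
        return 1.0
    return min(1.0, 1.0 / (math.exp(dE / T) - 1.0))

pb = p_boltzmann(1.0, 1.0)
pbe = p_bose_einstein(1.0, 1.0)
```

Under this reading, the Bose-Einstein form accepts small deteriorations more readily at the same temperature, which matches the abstract's description of phases with different restrictiveness.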
Optimal design of hydraulic manifold blocks based on niching genetic simulated annealing algorithm
Institute of Scientific and Technical Information of China (English)
Jia Chunqiang; Yu Ling; Tian Shujun; Gao Yanming
2007-01-01
To solve the combinatorial optimization problem of integrated outer-layout and inner-connection schemes in the design of hydraulic manifold blocks (HMB), a hybrid genetic simulated annealing algorithm based on niche technology is presented. This hybrid algorithm, which combines a genetic algorithm, a simulated annealing algorithm and niche technology, has a strong capability in global and local search, and all extrema can be found in a short time without strict requirements on preferences. For the complex constrained solid spatial layout problems in HMB, an optimizing mathematical model is presented. The key technologies in the integrated layout and connection design of HMB, including the realization of coding, the annealing operation and the genetic operation, are discussed. The framework of an HMB optimal design system based on the hybrid optimization strategy is proposed. An example is given to verify the effectiveness and feasibility of the algorithm.
An Improved Simulated Annealing Technique for Enhanced Mobility in Smart Cities.
Amer, Hayder; Salman, Naveed; Hawes, Matthew; Chaqfeh, Moumena; Mihaylova, Lyudmila; Mayfield, Martin
2016-06-30
Vehicular traffic congestion is a significant problem that arises in many cities. This is due to the increasing number of vehicles that are driving on city roads of limited capacity. The vehicular congestion significantly impacts travel distance, travel time, fuel consumption and air pollution. Avoidance of traffic congestion and providing drivers with optimal paths are not trivial tasks. The key contribution of this work consists of the developed approach for dynamic calculation of optimal traffic routes. Two attributes (the average travel speed of the traffic and the roads' length) are utilized by the proposed method to find the optimal paths. The average travel speed values can be obtained from the sensors deployed in smart cities and communicated to vehicles via the Internet of Vehicles and roadside communication units. The performance of the proposed algorithm is compared to three other algorithms: the simulated annealing weighted sum, the simulated annealing technique for order preference by similarity to the ideal solution and the Dijkstra algorithm. The weighted sum and technique for order preference by similarity to the ideal solution methods are used to formulate different attributes in the simulated annealing cost function. According to the Sheffield scenario, simulation results show that the improved simulated annealing technique for order preference by similarity to the ideal solution method improves the traffic performance in the presence of congestion by an overall average of 19.22% in terms of travel time, fuel consumption and CO₂ emissions as compared to other algorithms; also, similar performance patterns were achieved for the Birmingham test scenario.
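The weighted-sum formulation of the route cost can be sketched as follows (Python; the weights and per-edge normalization are illustrative assumptions, since the abstract does not specify them):

```python
def weighted_sum_cost(path_edges, w_speed=0.5, w_length=0.5):
    """Weighted-sum cost for a candidate route. Each edge is a
    (length_km, avg_speed_kmh) pair; slower roads contribute a larger
    travel-time term, longer roads a larger length term."""
    total = 0.0
    for length_km, avg_speed_kmh in path_edges:
        travel_time_h = length_km / avg_speed_kmh
        total += w_length * length_km + w_speed * travel_time_h
    return total

# Two-edge route: 2 km at 40 km/h, then 1 km of congested 10 km/h road.
route_cost = weighted_sum_cost([(2.0, 40.0), (1.0, 10.0)])
```

Inside the simulated annealing search, candidate routes would be compared by this scalar cost; the TOPSIS variant instead ranks candidates by their distance to an ideal solution in attribute space.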
Stochastic annealing simulation of copper under neutron irradiation
Energy Technology Data Exchange (ETDEWEB)
Heinisch, H.L. [Pacific Northwest National Lab., Richland, WA (United States); Singh, B.N. [Risoe National Lab., Roskilde (Denmark)
1998-03-01
This report is a summary of a presentation made at ICFRM-8 on computer simulations of defect accumulation during irradiation of copper to low doses at room temperature. The simulation results are in good agreement with experimental data on defect cluster densities in copper irradiated in RTNS-II.
Directory of Open Access Journals (Sweden)
Thamilselvan Rakkiannan
2012-01-01
Full Text Available Problem statement: The Job Shop Scheduling Problem (JSSP is observed as one of the most difficult NP-hard, combinatorial problem. The problem consists of determining the most efficient schedule for jobs that are processed on several machines. Approach: In this study Genetic Algorithm (GA is integrated with the parallel version of Simulated Annealing Algorithm (SA is applied to the job shop scheduling problem. The proposed algorithm is implemented in a distributed environment using Remote Method Invocation concept. The new genetic operator and a parallel simulated annealing algorithm are developed for solving job shop scheduling. Results: The implementation is done successfully to examine the convergence and effectiveness of the proposed hybrid algorithm. The JSS problems tested with very well-known benchmark problems, which are considered to measure the quality of proposed system. Conclusion/Recommendations: The empirical results show that the proposed genetic algorithm with simulated annealing is quite successful to achieve better solution than the individual genetic or simulated annealing algorithm."
A Simulated Annealing Algorithm for Maximum Common Edge Subgraph Detection in Biological Networks
DEFF Research Database (Denmark)
Larsen, Simon; Alkærsig, Frederik G.; Ditzel, Henrik
2016-01-01
introduce a heuristic algorithm for the multiple maximum common edge subgraph problem that is able to detect large common substructures shared across multiple, real-world size networks efficiently. Our algorithm uses a combination of iterated local search, simulated annealing and a pheromone...
Improving Simulated Annealing by Recasting it as a Non-Cooperative Game
Wolpert, David; Bandari, Esfandiar; Tumer, Kagan
2001-01-01
The game-theoretic field of COllective INtelligence (COIN) concerns the design of computer-based players engaged in a non-cooperative game so that as those players pursue their self-interests, a pre-specified global goal for the collective computational system is achieved "as a side-effect". Previous implementations of COIN algorithms have outperformed conventional techniques by up to several orders of magnitude, on domains ranging from telecommunications control to optimization in congestion problems. Recent mathematical developments have revealed that these previously developed game-theory-motivated algorithms were based on only two of the three factors determining performance. Consideration of only the third factor would instead lead to conventional optimization techniques like simulated annealing that have little to do with non-cooperative games. In this paper we present an algorithm based on all three terms at once. This algorithm can be viewed as a way to modify simulated annealing by recasting it as a non-cooperative game, with each variable replaced by a player. This recasting allows us to leverage the intelligent behavior of the individual players to substantially improve the exploration step of the simulated annealing. Experiments are presented demonstrating that this recasting improves simulated annealing by several orders of magnitude for spin glass relaxation and bin-packing.
An archived multi-objective simulated annealing for a dynamic cellular manufacturing system
Shirazi, Hossein; Kia, Reza; Javadian, Nikbakhsh; Tavakkoli-Moghaddam, Reza
2014-05-01
To design a group layout of a cellular manufacturing system (CMS) in a dynamic environment, a multi-objective mixed-integer non-linear programming model is developed. The model integrates cell formation, group layout and production planning (PP) as three interrelated decisions involved in the design of a CMS. This paper provides an extensive coverage of important manufacturing features used in the design of CMSs and enhances the flexibility of an existing model in handling the fluctuations of part demands more economically by adding machine depot and PP decisions. Two conflicting objectives to be minimized are the total costs and the imbalance of workload among cells. As the considered objectives in this model are in conflict with each other, an archived multi-objective simulated annealing (AMOSA) algorithm is designed to find Pareto-optimal solutions. Matrix-based solution representation, a heuristic procedure generating an initial and feasible solution and efficient mutation operators are the advantages of the designed AMOSA. To demonstrate the efficiency of the proposed algorithm, the performance of AMOSA is compared with an exact algorithm (i.e., ε-constraint method) solved by the GAMS software and a well-known evolutionary algorithm, namely NSGA-II for some randomly generated problems based on some comparison metrics. The obtained results show that the designed AMOSA can obtain satisfactory solutions for the multi-objective model.
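The archive at the heart of AMOSA-style algorithms keeps only mutually non-dominated solutions. A minimal sketch of the dominance test and archive update for two minimization objectives (illustrative, not the designed AMOSA itself):

```python
def dominates(a, b):
    """a dominates b (minimization) if a is no worse in every objective
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, candidate):
    """Insert candidate unless some archive member dominates it;
    drop any archive members the candidate dominates."""
    if any(dominates(a, candidate) for a in archive):
        return archive
    return [a for a in archive if not dominates(candidate, a)] + [candidate]

# Objectives: (total cost, workload imbalance), both to be minimized.
archive = update_archive([(1, 5), (4, 2)], (2, 2))
```

The full AMOSA additionally uses the amount of domination to set acceptance probabilities; the sketch covers only the archive bookkeeping.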
Simulated Annealing Genetic Algorithm Based Schedule Risk Management of IT Outsourcing Project
Directory of Open Access Journals (Sweden)
Fuqiang Lu
2017-01-01
Full Text Available IT outsourcing is an effective way to enhance the core competitiveness of many enterprises. But the schedule risk of an IT outsourcing project may cause enormous economic loss to the enterprise. In this paper, Distributed Decision Making (DDM) theory and principal-agent theory are used to build a model for schedule risk management of IT outsourcing projects. In addition, a hybrid algorithm combining simulated annealing (SA) and a genetic algorithm (GA) is designed, namely, the simulated annealing genetic algorithm (SAGA). The effect of the proposed model on the schedule risk management problem is analyzed in a simulation experiment. Meanwhile, the simulation results of the three algorithms GA, SA, and SAGA show that SAGA is superior to the other two algorithms in terms of stability and convergence. Consequently, this paper provides a scientific quantitative proposal for decision makers who need to manage the schedule risk of IT outsourcing projects.
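Hybrids of this kind typically apply a Metropolis test to offspring produced by the genetic operators. The sketch below is a hypothetical SAGA generation over bit-string chromosomes, not the paper's exact procedure:

```python
import math
import random

def saga_step(population, fitness, T, rng):
    """One illustrative SAGA generation: tournament selection, one-point
    crossover, single bit-flip mutation, then SA-style acceptance of each
    child against its first parent (minimization)."""
    def tournament():
        a, b = rng.sample(population, 2)
        return a if fitness(a) <= fitness(b) else b

    next_pop = []
    for _ in range(len(population)):
        p1, p2 = tournament(), tournament()
        cut = rng.randrange(1, len(p1))
        child = p1[:cut] + p2[cut:]
        i = rng.randrange(len(child))
        child = child[:i] + [1 - child[i]] + child[i + 1:]
        # Metropolis test: keep the child if better, else with prob exp(-dE/T).
        dE = fitness(child) - fitness(p1)
        next_pop.append(child if dE <= 0 or rng.random() < math.exp(-dE / T) else p1)
    return next_pop

rng = random.Random(1)
population = [[rng.randrange(2) for _ in range(8)] for _ in range(6)]
new_pop = saga_step(population, fitness=lambda s: sum(s), T=1.0, rng=rng)
```

Cooling T across generations makes the acceptance increasingly greedy, which is what gives SAGA its late-stage convergence while the early high-temperature phase preserves diversity.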
Synthesis of optimal digital shapers with arbitrary noise using simulated annealing
Energy Technology Data Exchange (ETDEWEB)
Regadío, Alberto, E-mail: aregadio@srg.aut.uah.es [Department of Computer Engineering, Space Research Group, Universidad de Alcalá, 28805 Alcalá de Henares (Spain); Electronic Technology Area, Instituto Nacional de Técnica Aeroespacial, 28850 Torrejón de Ardoz (Spain); Sánchez-Prieto, Sebastián, E-mail: sebastian.sanchez@uah.es [Department of Computer Engineering, Space Research Group, Universidad de Alcalá, 28805 Alcalá de Henares (Spain); Tabero, Jesús, E-mail: taberogj@inta.es [Electronic Technology Area, Instituto Nacional de Técnica Aeroespacial, 28850 Torrejón de Ardoz (Spain)
2014-02-21
This paper presents the structure, design and implementation of a new way of determining the optimal shaping in time-domain for spectrometers by means of simulated annealing. The proposed algorithm is able to adjust automatically and in real-time the coefficients for shaping an input signal. A practical prototype was designed, implemented and tested on a PowerPC 405 embedded in a Field Programmable Gate Array (FPGA). Lastly, its performance and capabilities were measured using simulations and a neutron monitor.
Institute of Scientific and Technical Information of China (English)
Wang Hongkai; Guan Yanyong; Xue Peijun
2008-01-01
In rough communication, because each agent has a different language and cannot communicate precisely with the others, a concept translated among multiple agents loses some information, resulting in a coarser, rougher concept. With different translation sequences, the amount of information lost varies. To find the translation sequence in which the j-th agent taking part in rough communication obtains maximum information, a simulated annealing algorithm is used. Analysis and simulation of this algorithm demonstrate its effectiveness.
Directory of Open Access Journals (Sweden)
Destya Arisetyanti
2012-09-01
Full Text Available The Digital Video Broadcasting Terrestrial (DVB-T) standard is implemented in a Single Frequency Network (SFN) configuration, in which all transmitters in a network operate on the same frequency channel and transmit at the same time. SFN is preferred over its predecessor, the Multi Frequency Network (MFN), because it uses frequencies more efficiently and covers a wider area. On the receiver side, multipath scenarios are possible by combining signals from different transmitters, because the SFN configuration is based on Orthogonal Frequency Division Multiplexing (OFDM). In this study, building height and count data are applied through free-space and knife-edge propagation prediction models to estimate received power and signal delay. The carrier (C) and carrier-to-interference (C/I) values are computed to assess signal quality at the receiver. Transmitter location parameters are then optimized by the Simulated Annealing algorithm using the three best cooling schedules. Simulated Annealing is an optimization algorithm based on a thermodynamic system that simulates the annealing process. Simulated Annealing succeeded in extending the SFN coverage area, as shown by the reduction of most receiver points with signal quality below the threshold.
Directory of Open Access Journals (Sweden)
Chang Li
2014-01-01
Full Text Available Much of the previous work on D-optimal design for regression models with correlated errors focused on polynomial models with a single predictor variable, in large part because of the intractability of an analytic solution. In this paper, we present a modified, improved simulated annealing algorithm, providing practical approaches to specifying the annealing cooling parameters, thresholds, and search neighborhoods for the perturbation scheme, which finds approximate D-optimal designs for 2-way and 3-way polynomial regression for a variety of specific correlation structures with a given correlation coefficient. Results in each correlated-errors case are compared with the traditional simulated annealing algorithm, that is, the SA algorithm without our improvements. Our improved simulated annealing generally achieved higher D-efficiency than the traditional algorithm, especially when the correlation parameter was well away from 0.
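The cooling parameters and perturbation-scheme neighborhoods mentioned above can be sketched as follows (Python; the geometric schedule and the box-constrained one-coordinate move are illustrative choices, not the paper's specification):

```python
import random

def geometric_schedule(t0, alpha, k):
    """Geometric cooling: T_k = t0 * alpha**k."""
    return t0 * alpha ** k

def neighborhood(point, radius, rng, low=-1.0, high=1.0):
    """Perturbation scheme: move one coordinate of a candidate design
    point uniformly within +/- radius, clipped to the design region."""
    i = rng.randrange(len(point))
    new = list(point)
    new[i] = min(high, max(low, new[i] + rng.uniform(-radius, radius)))
    return new

rng = random.Random(42)
p = neighborhood([0.2, -0.4, 0.9], radius=0.3, rng=rng)
```

Shrinking `radius` together with the temperature is a common refinement: wide exploration early, fine local adjustment of the design points late.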
Optimal Lead-lag Controller for Distributed Generation Unit in Island Mode Using Simulated Annealing
Directory of Open Access Journals (Sweden)
A. Akbarimajd
2014-07-01
Full Text Available The active and reactive power components of a Distributed Generation (DG) unit are normally controlled by a conventional dq-current control strategy; however, after islanding, the dq-current controller, which is no longer able to complete the control task, is disabled, and a lead-lag control strategy optimized by simulated annealing is proposed for control of the DG unit in islanding mode. The Integral of Time multiplied by Absolute Error (ITAE) criterion is used as the cost function of the simulated annealing in order to achieve a smooth response and robust behavior. The proposed controller improved the robust stability margins of the system. Simulations with different load and input operating conditions verify the advantages of the proposed controller in comparison with a previously developed classic controller in terms of robustness and response time.
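The ITAE criterion used as the annealing cost function can be approximated from sampled error data; a minimal sketch (Python, left Riemann sum; the function and variable names are illustrative):

```python
def itae(times, errors, dt):
    """Integral of Time multiplied by Absolute Error (ITAE),
    approximated as sum(t * |e(t)| * dt) over sampled points.
    Weighting by t penalizes errors that persist late in the response."""
    return sum(t * abs(e) * dt for t, e in zip(times, errors))

# Three samples of a step-response error signal, 1 s apart.
value = itae([0.0, 1.0, 2.0], [1.0, -1.0, 1.0], dt=1.0)
```

In the optimization loop, each candidate lead-lag parameter set would be simulated and scored by this integral, with the annealing search minimizing it.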
Directory of Open Access Journals (Sweden)
Sheng Lu
2015-01-01
Full Text Available To solve the problem of parameter selection during the design of a magnetically coupled resonant wireless power transmission system (MCR-WPT), this paper proposes an improved genetic simulated annealing algorithm. Firstly, the equivalent circuit of the system is analyzed and a nonlinear programming mathematical model is built. Secondly, in place of the penalty-function method in the genetic algorithm, a selection strategy based on the distance between individuals is adopted, which reduces the number of empirical parameters. Meanwhile, the method improves the convergence rate and the searching ability by calculating the crossover and mutation probabilities according to the variance of the population's fitness. Finally, a simulated annealing operator is added to increase the local search ability of the method. Simulation shows that the improved method can escape local optima and find the global optimum faster. The optimized system achieves the practical requirements.
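The variance-driven adaptation of crossover and mutation probabilities can be sketched as below; the mapping from fitness variance to rates is an illustrative assumption, not the paper's exact formula:

```python
import statistics

def adaptive_rates(fitnesses, pc_range=(0.6, 0.9), pm_range=(0.01, 0.1)):
    """Scale crossover/mutation probabilities by population fitness
    variance: low variance (premature convergence) raises both rates to
    restore diversity; high variance lowers them to exploit."""
    var = statistics.pvariance(fitnesses)
    mean_sq = (sum(fitnesses) / len(fitnesses)) ** 2 or 1.0
    spread = min(1.0, var / mean_sq)  # 0 = converged, 1 = highly diverse
    pc = pc_range[1] - (pc_range[1] - pc_range[0]) * spread
    pm = pm_range[1] - (pm_range[1] - pm_range[0]) * spread
    return pc, pm

pc0, pm0 = adaptive_rates([5.0, 5.0, 5.0, 5.0])  # fully converged population
pc1, pm1 = adaptive_rates([1.0, 9.0, 2.0, 8.0])  # diverse population
```

Whatever the exact mapping, the design intent is the same: the operator rates react to the population state instead of staying fixed.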
Ohzeki, Masayuki
2017-01-01
Quantum annealing is a generic solver of the optimization problem that uses fictitious quantum fluctuation. Its simulation in classical computing is often performed using the quantum Monte Carlo simulation via the Suzuki–Trotter decomposition. However, the negative sign problem sometimes emerges in the simulation of quantum annealing with an elaborate driver Hamiltonian, since it belongs to a class of non-stoquastic Hamiltonians. In the present study, we propose an alternative way to avoid the negative sign problem involved in a particular class of the non-stoquastic Hamiltonians. To check the validity of the method, we demonstrate our method by applying it to a simple problem that includes the anti-ferromagnetic XX interaction, which is a typical instance of the non-stoquastic Hamiltonians. PMID:28112244
Simulated annealing: an application in fine particle magnetism
Energy Technology Data Exchange (ETDEWEB)
Legeratos, A.; Chantrell, R.W.; Wohlfarth, E.P.
1985-07-01
Using a model of a system of interacting fine ferromagnetic particles, a computer simulation of the dynamical approach to local or global minima of the system is developed for two different schedules of the application of ac and dc magnetic fields. The process of optimization, i.e., the achievement of a global minimum, depends on the rate of reduction of the ac field and on the symmetry of the ac field cycles. The calculations carried out to illustrate these effects include remanence curves and the zero field remanence for both schedules under different conditions. The growth of the magnetization during these processes was studied, and the interaction energy was calculated to best illustrate the optimization.
Wrożyna, Andrzej; Pernach, Monika; Kuziak, Roman; Pietrzyk, Maciej
2016-04-01
Due to their exceptional strength properties combined with good workability, Advanced High-Strength Steels (AHSS) are commonly used in the automotive industry. Manufacturing of these steels is a complex process which requires precise control of technological parameters during thermo-mechanical treatment. Design of these processes can be significantly improved by numerical models of phase transformations. Evaluation of the predictive capabilities of the models, as far as their applicability to the simulation of thermal cycles for AHSS is concerned, was the objective of the paper. Two models were considered. The former was an upgrade of the JMAK equation while the latter was an upgrade of the Leblond model. The models can be applied to any AHSS, though the examples quoted in the paper refer to Dual Phase (DP) steel. Three series of experimental simulations were performed. The first included various thermal cycles going beyond the limitations of continuous annealing lines; the objective was to validate model behavior in more complex cooling conditions. The second set of tests included experimental simulations of the thermal cycle characteristic of continuous annealing lines, evaluating the capability of the models to properly describe phase transformations in this process. The third set included data from an industrial continuous annealing line. Validation and verification of the models confirmed their good predictive capabilities. Since it does not require application of the additivity rule, the upgrade of the Leblond model was selected as the better one for simulation of industrial processes in AHSS production.
Energy Technology Data Exchange (ETDEWEB)
Sheng, Zheng, E-mail: 19994035@sina.com [College of Meteorology and Oceanography, PLA University of Science and Technology, Nanjing 211101 (China); Wang, Jun; Zhou, Bihua [National Defense Key Laboratory on Lightning Protection and Electromagnetic Camouflage, PLA University of Science and Technology, Nanjing 210007 (China); Zhou, Shudao [College of Meteorology and Oceanography, PLA University of Science and Technology, Nanjing 211101 (China); Collaborative Innovation Center on Forecast and Evaluation of Meteorological Disasters, Nanjing University of Information Science and Technology, Nanjing 210044 (China)
2014-03-15
This paper introduces a novel hybrid optimization algorithm to establish the parameters of chaotic systems. In order to deal with the weaknesses of the traditional cuckoo search algorithm, the proposed adaptive cuckoo search with simulated annealing algorithm is presented, which incorporates an adaptive parameter-adjusting operation and a simulated annealing operation into the cuckoo search algorithm. Normally, the parameters of the cuckoo search algorithm are kept constant, which may decrease the efficiency of the algorithm. For the purpose of balancing and enhancing the accuracy and convergence rate of the cuckoo search algorithm, the adaptive operation is presented to tune the parameters properly. Besides, the local search capability of the cuckoo search algorithm is relatively weak, which may decrease the quality of optimization. So the simulated annealing operation is merged into the cuckoo search algorithm to enhance the local search ability and improve the accuracy and reliability of the results. The functionality of the proposed hybrid algorithm is investigated through the Lorenz chaotic system under noiseless and noisy conditions, respectively. The numerical results demonstrate that the method can estimate parameters efficiently and accurately in both conditions. Finally, the results are compared with the traditional cuckoo search algorithm, genetic algorithm, and particle swarm optimization algorithm. Simulation results demonstrate the effectiveness and superior performance of the proposed algorithm.
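The SA operation merged into the cuckoo search can be sketched as a Metropolis-accepted local perturbation of the current nest (Python; the Gaussian step size and cooling constants are illustrative assumptions):

```python
import math
import random

def sa_local_step(x, cost, T, rng, step=0.1):
    """SA-style local move: perturb the current solution and apply
    Metropolis acceptance instead of cuckoo search's greedy replacement,
    so mildly worse candidates can still be accepted at high T."""
    y = [xi + rng.gauss(0.0, step) for xi in x]
    dE = cost(y) - cost(x)
    if dE <= 0 or rng.random() < math.exp(-dE / T):
        return y
    return x

# Toy parameter-estimation surrogate: minimize the squared parameter error.
rng = random.Random(0)
cost = lambda v: sum(xi * xi for xi in v)
x, T = [2.0, 2.0], 1.0
for _ in range(300):
    x = sa_local_step(x, cost, T, rng)
    T *= 0.97  # cooling makes late iterations nearly greedy
```

In the full hybrid, this local step would refine the nests produced by the Lévy-flight global moves of cuckoo search.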
Automatic Clustering Using Multi-objective Particle Swarm and Simulated Annealing.
Directory of Open Access Journals (Sweden)
Ahmad Abubaker
Full Text Available This paper puts forward a new automatic clustering algorithm based on Multi-Objective Particle Swarm Optimization and Simulated Annealing, "MOPSOSA". The proposed algorithm is capable of automatic clustering, which is appropriate for partitioning datasets into a suitable number of clusters. MOPSOSA combines the features of multi-objective based particle swarm optimization (PSO) and Multi-Objective Simulated Annealing (MOSA). Three cluster validity indices were optimized simultaneously to establish the suitable number of clusters and the appropriate clustering for a dataset. The first cluster validity index is centred on Euclidean distance, the second on the point symmetry distance, and the last cluster validity index is based on short distance. A number of algorithms have been compared with the MOPSOSA algorithm in resolving clustering problems by determining the actual number of clusters and optimal clustering. Computational experiments were carried out to study fourteen artificial and five real-life datasets.
Automatic Clustering Using Multi-objective Particle Swarm and Simulated Annealing.
Abubaker, Ahmad; Baharum, Adam; Alrefaei, Mahmoud
2015-01-01
This paper puts forward a new automatic clustering algorithm based on Multi-Objective Particle Swarm Optimization and Simulated Annealing, "MOPSOSA". The proposed algorithm is capable of automatic clustering which is appropriate for partitioning datasets to a suitable number of clusters. MOPSOSA combines the features of the multi-objective based particle swarm optimization (PSO) and the Multi-Objective Simulated Annealing (MOSA). Three cluster validity indices were optimized simultaneously to establish the suitable number of clusters and the appropriate clustering for a dataset. The first cluster validity index is centred on Euclidean distance, the second on the point symmetry distance, and the last cluster validity index is based on short distance. A number of algorithms have been compared with the MOPSOSA algorithm in resolving clustering problems by determining the actual number of clusters and optimal clustering. Computational experiments were carried out to study fourteen artificial and five real life datasets.
Fast and accurate protein substructure searching with simulated annealing and GPUs
Directory of Open Access Journals (Sweden)
Stivala Alex D
2010-09-01
Full Text Available Abstract Background Searching a database of protein structures for matches to a query structure, or occurrences of a structural motif, is an important task in structural biology and bioinformatics. While there are many existing methods for structural similarity searching, faster and more accurate approaches are still required, and few current methods are capable of substructure (motif) searching. Results We developed an improved heuristic for tableau-based protein structure and substructure searching using simulated annealing, which is as fast as or faster than, and comparable in accuracy with, some widely used existing methods. Furthermore, we created a parallel implementation on a modern graphics processing unit (GPU). Conclusions The GPU implementation achieves up to 34 times speedup over the CPU implementation of tableau-based structure search with simulated annealing, making it one of the fastest available methods. To the best of our knowledge, this is the first application of a GPU to the protein structural search problem.
Global Optimisation with Simulated Annealing
Directory of Open Access Journals (Sweden)
Francisco Sánchez Mares
2006-01-01
Full Text Available This work demonstrates the application of the global optimisation method Simulated Annealing (SA). This technique has been applied in several areas of engineering as a robust and versatile strategy for successfully computing the global minimum of a function or a system of functions. To test the efficiency of the method, the global minima of an arbitrary function were found, and the numerical behaviour of Simulated Annealing during convergence to the two solutions of the case study was evaluated.
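A bare-bones SA minimizer in the spirit of this abstract might look like the following sketch. The test function (with two symmetric global minima, echoing the case study's two solutions), the step size, and the cooling rate are all illustrative assumptions, not the paper's.

```python
import math
import random

def simulated_annealing(f, x0, t0=2.0, alpha=0.99, steps=5000, sigma=0.5, seed=1):
    """Bare-bones simulated annealing for a 1-D objective f."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    t = t0
    for _ in range(steps):
        y = x + rng.gauss(0.0, sigma)  # random neighbour
        fy = f(y)
        # accept downhill moves always, uphill moves with prob exp(-df/t)
        if fy < fx or rng.random() < math.exp((fx - fy) / max(t, 1e-12)):
            x, fx = y, fy
            if fx < best_f:
                best_x, best_f = x, fx
        t *= alpha  # geometric cooling
    return best_x, best_f

# A multimodal test function with two symmetric global minima at x = +2 and
# x = -2, loosely echoing the paper's two-solution case study (assumed here).
def two_well(x):
    return (x * x - 4.0) ** 2
```

Depending on the random seed and starting point, the walker settles into either of the two wells, which is exactly the behaviour one would examine when studying convergence to multiple solutions.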
Institute of Scientific and Technical Information of China (English)
Jin Shi-Feng; Wang Wei-Min; Zhou Jian-Kun; Guo Hong-Xuan; J.F. Webb; Bian Xiu-Fang
2005-01-01
The nanocrystallization behaviour of Zr70Cu20Ni10 metallic glass during isothermal annealing is studied by employing a Monte Carlo simulation incorporating a modified Ising model and a Q-state Potts model. Based on the simulated microstructure and differential scanning calorimetry curves, we find that the low crystal-amorphous interface energy of Ni plays an important role in the nanocrystallization of primary Zr2Ni. It is found that when T < TImax (where TImax is the temperature with the maximum nucleation rate), an increase of temperature results in a larger growth rate and a much finer microstructure for the primary Zr2Ni, which accords with the microstructure evolution in "flash annealing". Finally, the Zr2Ni/Zr2Cu interface energy σG contributes to the pinning effect of the primary nano-sized Zr2Ni grains within the later-formed normal Zr2Cu grains.
Research on a coal-mine gas monitoring system controlled by a simulated annealing algorithm
Zhou, Mengran; Li, Zhenbi
2007-12-01
This paper introduces the principle and schematic diagram of a gas monitoring system based on the infrared method. A simulated annealing algorithm is adopted to find the global optimum solution, with the Metropolis criterion driving an iterative combinatorial optimization under a decreasing control parameter, aimed at solving a large-scale combinatorial optimization problem. Experimental results obtained from the algorithm-training scheme and workflow indicate that simulated annealing applied to gas identification outperforms the traditional linear local search method: the algorithm iterates to the optimum rapidly, so the quality of the solution is improved efficiently, the CPU time is shortened, and the gas identification rate is increased. For mines with a high risk of gas outburst, advance forecasting of regional danger and disaster can be realized, improving the reliability of coal-mine safety.
Institute of Scientific and Technical Information of China (English)
吴剑锋; 朱学愚; 刘建立
1999-01-01
The genetic algorithm (GA) is a global and random search procedure based on the mechanics of natural selection and natural genetics. A new optimization method, the genetic algorithm-based simulated annealing penalty function (GASAPF), is presented to solve a groundwater management model. Compared with traditional gradient-based algorithms, the GA is straightforward and there is no need to calculate derivatives of the objective function. The GA is able to generate both convex and nonconvex points within the feasible region. Handling the constraints by the simulated annealing technique ensures that the GA converges to the global, or at least a near-global, optimal solution. Results for a maximum-pumping example show that the GASAPF is very efficient and robust in solving the optimization model.
Salcedo-Sanz, Sancho; Santiago-Mozos, Ricardo; Bousoño-Calzón, Carlos
2004-04-01
A hybrid Hopfield network-simulated annealing algorithm (HopSA) is presented for the frequency assignment problem (FAP) in satellite communications. The goal of this NP-complete problem is to minimize the cochannel interference between satellite communication systems by rearranging the frequency assignment, so that the systems can accommodate increasing demands. The HopSA algorithm consists of a fast digital Hopfield neural network, which manages the problem constraints, hybridized with simulated annealing, which improves the quality of the solutions obtained. We analyze the problem and its formulation, describe and discuss the HopSA algorithm, and solve a set of benchmark problems. The results obtained are compared with other existing approaches in order to show the performance of the HopSA approach.
Paul, Gerald
2010-01-01
For almost two decades the question of whether tabu search (TS) or simulated annealing (SA) performs better for the quadratic assignment problem has been unresolved. To answer this question satisfactorily, we compare performance at various values of targeted solution quality, running each heuristic at its optimal number of iterations for each target. We find that for a number of varied problem instances, SA performs better for higher quality targets while TS performs better for lower quality targets.
Directory of Open Access Journals (Sweden)
Kohei Arai
2012-07-01
Full Text Available A method for geophysical parameter estimation with microwave radiometer data based on Simulated Annealing (SA) is proposed. Geophysical parameters estimated from microwave radiometer data are closely related to each other, so simultaneous estimation imposes constraints in accordance with these relations. On the other hand, SA requires huge computational resources for convergence. In order to accelerate the convergence process, an oscillating decreasing function is proposed as the cool-down function. Experimental results show remarkable improvements in the geophysical parameter estimations.
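The abstract does not specify the oscillating decreasing cool-down function; one plausible form is a decaying envelope modulated by a bounded oscillation, sketched here with assumed constants.

```python
import math

def oscillating_cooldown(t0, k, tau=200.0, depth=0.5, period=25.0):
    """Cooling schedule for SA step k: exponential decay modulated by a
    bounded oscillation. The exact functional form used in the paper is
    not given in the abstract; this form and its constants are assumptions.

    The modulation factor stays in [1 - depth, 1 + depth] with depth < 1,
    so the temperature remains strictly positive while trending downward.
    """
    envelope = t0 * math.exp(-k / tau)
    modulation = 1.0 + depth * math.sin(2.0 * math.pi * k / period)
    return envelope * modulation
```

The periodic "reheats" let the walker climb out of local minima more often than a monotone schedule would, while the decaying envelope still forces eventual convergence.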
Paul, Gerald
2011-01-01
The quadratic assignment problem (QAP) is one of the most difficult combinatorial optimization problems. One of the most powerful and commonly used heuristics for obtaining approximations to the optimal solution of the QAP is simulated annealing (SA). We present an efficient implementation of the SA heuristic which performs more than 100 times faster than existing implementations for large problem sizes and large numbers of SA iterations.
A GPU implementation of the Simulated Annealing Heuristic for the Quadratic Assignment Problem
Paul, Gerald
2012-01-01
The quadratic assignment problem (QAP) is one of the most difficult combinatorial optimization problems. An effective heuristic for obtaining approximate solutions to the QAP is simulated annealing (SA). Here we describe an SA implementation for the QAP which runs on a graphics processing unit (GPU). GPUs are composed of low cost commodity graphics chips which in combination provide a powerful platform for general purpose parallel computing. For SA runs with large numbers of iterations, we fi...
Wenbo Wu; Jiahong Liang; Xinyu Yao; Baohong Liu
2014-01-01
This paper addresses the problem of task allocation in real-time distributed systems with the goal of maximizing the system reliability, which has been shown to be NP-hard. We take account of the deadline constraint to formulate this problem and then propose an algorithm called chaotic adaptive simulated annealing (XASA) to solve the problem. Firstly, XASA begins with chaotic optimization which takes a chaotic walk in the solution space and generates several local minima; secondly XASA improv...
Akbar, Akhmad Fanani; Nugraha, Andri Dian; Sule, Rachmat; Juanda, Aditya Abdurrahman
2013-09-01
Hypocenter determination of micro-earthquakes in the Mount "X-1" geothermal field has been conducted using simulated annealing and a guided error search method with a 1D seismic velocity model. In order to speed up the hypocenter determination process, a three-circle intersection method was used to guide the simulated annealing and guided error search. We used P- and S-arrival times from microseismic data. In the simulated annealing and guided error search processes, the minimum travel time from a source to a receiver was calculated by ray tracing with the shooting method. The resulting hypocenters occurred at depths of 3-4 km below mean sea level. These hypocenter distributions correlate with a previous study, which concluded that this is the most active microseismic area, the site of many fractures and of vertical circulation. The resulting hypocenter locations were then used as input to determine a 1D seismic velocity model using the joint hypocenter determination method. The VELEST results indicate a low Vp/Vs ratio at depths of 3-4 km; our interpretation is that this anomaly may be related to a rock layer saturated with vapor (gas or steam). Another feature is a high Vp/Vs ratio at depths of 1-3 km, which may be related to a rock layer saturated with fluid or to partial melting. We also analyzed the focal mechanisms of the microseismicity using the ISOLA method to determine the source characteristics of these events.
Energy Technology Data Exchange (ETDEWEB)
Ry, Rexha Verdhora, E-mail: rexha.vry@gmail.com [Master Program of Geophysical Engineering, Faculty of Mining and Petroleum Engineering, Institut Teknologi Bandung, Jalan Ganesha No.10, Bandung 40132 (Indonesia); Nugraha, Andri Dian, E-mail: nugraha@gf.itb.ac.id [Global Geophysical Research Group, Faculty of Mining and Petroleum Engineering, Institut Teknologi Bandung, Jalan Ganesha No.10, Bandung 40132 (Indonesia)
2015-04-24
Earthquake observation is used routinely and widely in monitoring tectonic activity, and also at local scales such as volcano-tectonic and geothermal activity monitoring. Precise hypocenter determination is necessary; the process involves finding the hypocenter location that minimizes the error between the observed and the calculated travel times. When solving this nonlinear inverse problem, the simulated annealing inversion method can be applied as a global optimization technique whose convergence is independent of the initial model. In this study, we developed our own program code applying adaptive simulated annealing inversion in a Matlab environment. We applied this method to determine earthquake hypocenters for several data cases: regional tectonic, volcano tectonic, and geothermal field. The travel times were calculated using the ray-tracing shooting method. We then compared the results with those of Geiger's method to analyze reliability. Our results show hypocenter locations with smaller RMS error than Geiger's results, which can be statistically associated with a better solution. The earthquake hypocenters also correlate well with the geological structure in the study area. We recommend using adaptive simulated annealing inversion to relocate hypocenter locations in order to obtain precise and accurate earthquake locations.
Thin film design using simulated annealing and study of the filter robustness
Boudet, Thierry; Chaton, Patrick
1996-08-01
Modern optical components require sophisticated coatings with tough specifications, and the design of optical multilayers has become a key activity of most laboratories and factories. A synthesis technique based on the simulated annealing algorithm is presented here. In this stochastic minimization, no starting solution is required; only the materials and technological constraints need to be specified. Moreover, the algorithm will always reach a final result. Because simulated annealing is a stochastic algorithm, a great number of state transitions is needed to reach a global minimum of the merit function used to evaluate the difference between the optical target and the calculated filter; nevertheless, the computing time remains reasonable on a workstation. A few examples show the performance of our program. It should also be pointed out that no refinement is needed at the end of the annealing, because the solution is already highly optimized. Nowadays the design of robust filters with low sensitivity to technological variations remains a key factor for manufacturers. This is why we have established criteria that quantify the robustness of the stacks; they also enable comparison of multilayers synthesized by different methods for the same target.
Directory of Open Access Journals (Sweden)
P. Wang
2015-04-01
Full Text Available In this paper, we introduce a novel image reconstruction algorithm with Least Squares Support Vector Machines (LS-SVM) and simulated annealing particle swarm optimization (APSO), named SAP. This algorithm introduces simulated annealing ideas into particle swarm optimization (PSO): it adopts a cooling-process function in place of the inertia weight function, constructing a time-variant inertia weight with an annealing mechanism. It then employs the APSO procedure to search for the optimized solution of Electrical Capacitance Tomography (ECT) image reconstruction. To overcome the soft-field characteristics of the ECT sensitivity field, image samples with typical flow patterns are chosen for training with LS-SVM. During training, the capacitance error caused by the soft-field characteristics is predicted and then used to construct the fitness function of the particle swarm optimization. Experimental results demonstrate that the proposed SAP algorithm has a quick convergence rate and outperforms the classic Landweber and Newton-Raphson algorithms on image reconstruction.
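A PSO variant in which the inertia weight "cools" like an SA temperature, in the spirit of the APSO described above, might be sketched as follows; the exponential cooling form and all constants are assumptions, not the paper's.

```python
import random

def apso_minimize(objective, dim, n_particles=20, iters=300,
                  w0=0.9, w_min=0.4, alpha=0.98, c1=1.5, c2=1.5):
    """PSO in which the inertia weight decays like an SA temperature.

    Sketch under assumptions: the geometric 'cooling' of the inertia
    weight stands in for the paper's time-variant inertia weight function.
    """
    rng = random.Random(7)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [list(p) for p in pos]
    pbest_c = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda k: pbest_c[k])
    gbest, gbest_c = list(pbest[g]), pbest_c[g]
    for t in range(iters):
        # inertia weight "cools" geometrically, like an SA temperature,
        # but never drops below w_min
        w = max(w_min, w0 * alpha ** t)
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            c = objective(pos[i])
            if c < pbest_c[i]:
                pbest[i], pbest_c[i] = list(pos[i]), c
                if c < gbest_c:
                    gbest, gbest_c = list(pos[i]), c
    return gbest, gbest_c
```

A high initial inertia favours exploration; as the weight cools, the swarm shifts towards exploitation around the global best, mirroring an annealing schedule.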
Guijarro, María; Pajares, Gonzalo; Herrera, P. Javier
2009-01-01
The increasing capability of high-resolution airborne image sensors, including those on board Unmanned Aerial Vehicles, demands automatic solutions for processing, either on-line or off-line, the huge amounts of image data sensed during flights. The classification of natural spectral signatures in images is one potential application. The current tendency in classification is oriented towards the combination of simple classifiers. In this paper we propose a combined strategy based on the Deterministic Simulated Annealing (DSA) framework. The simple classifiers used are the well-tested supervised parametric Bayesian estimator and Fuzzy Clustering. DSA is an optimization approach which minimizes an energy function; its main contribution is its ability to avoid local minima during the optimization process thanks to the annealing scheme. It outperforms the simple classifiers used in the combination and some combined strategies, including a scheme based on fuzzy cognitive maps and an optimization approach based on the Hopfield neural network paradigm. PMID: 22399989
Energy Technology Data Exchange (ETDEWEB)
Nandipati, Giridhar, E-mail: giridhar.nandipati@pnnl.gov [Pacific Northwest National Laboratory, Richland, WA (United States); Setyawan, Wahyu; Heinisch, Howard L. [Pacific Northwest National Laboratory, Richland, WA (United States); Roche, Kenneth J. [Pacific Northwest National Laboratory, Richland, WA (United States); Department of Physics, University of Washington, Seattle, WA 98195 (United States); Kurtz, Richard J. [Pacific Northwest National Laboratory, Richland, WA (United States); Wirth, Brian D. [University of Tennessee, Knoxville, TN (United States)
2015-07-15
The results of object kinetic Monte Carlo (OKMC) simulations of the annealing of primary cascade damage in bulk tungsten using a comprehensive database of cascades obtained from molecular dynamics (Setyawan et al.) are described as a function of primary knock-on atom (PKA) energy at temperatures of 300, 1025 and 2050 K. An increase in SIA clustering coupled with a decrease in vacancy clustering with increasing temperature, in addition to the disparate mobilities of SIAs versus vacancies, causes an interesting effect of temperature on cascade annealing. The annealing efficiency (the ratio of the number of defects after and before annealing) exhibits an inverse U-shape curve as a function of temperature. The capabilities of the newly developed OKMC code KSOME (kinetic simulations of microstructure evolution) used to carry out these simulations are described.
Kumar, Pushpendra; Huber, Patrick
2016-04-01
The discovery of porous silicon formation in silicon substrates in 1956, while electro-polishing crystalline Si in hydrofluoric acid (HF), triggered large-scale investigations of porous silicon formation and of the changes in its physical and chemical properties under thermal and chemical treatment. A nitrogen sorption study is used to investigate the effect of thermal annealing on electrochemically etched mesoporous silicon (PS). The PS was thermally annealed from 200°C to 800°C for 1 hr in the presence of air. It was shown that the pore diameter and porosity of PS vary with annealing temperature. The experimentally obtained adsorption/desorption isotherms show hysteresis typical of capillary condensation in porous materials. A simulation study based on the Saam and Cole model was performed and compared with the experimentally observed sorption isotherms to study the physics behind the hysteresis formation. We discuss the shape of the hysteresis loops in the framework of the morphology of the layers. The different behavior of nitrogen adsorption and desorption in PS with pore diameter is discussed in terms of concave menisci formation inside the pore space, which was shown to be related to the induced pressure as the pore diameter varies from 7.2 nm to 3.4 nm.
Simulated annealing algorithm for solving chambering student-case assignment problem
Ghazali, Saadiah; Abdul-Rahman, Syariza
2015-12-01
The project assignment problem is a popular practical problem, and the challenge of solving it rises with the complexity of preferences, the existence of real-world constraints, and increasing problem size. This study focuses on solving a chambering student-case assignment problem, classified under the project assignment problem, using a simulated annealing algorithm. The project assignment problem is a hard combinatorial optimization problem, and solving it with a metaheuristic approach has the advantage of returning a good solution in reasonable time. The problem of assigning chambering students to cases has never been addressed in the literature before. Law graduates must read in chambers before they are qualified to become legal counsel, so assigning the chambering students to cases is critically needed, especially when many preferences are involved. Hence, this study presents a preliminary study of the proposed problem. The objective is to minimize the total completion time for all students in solving the given cases. A minimum-cost greedy heuristic is employed to construct a feasible initial solution, and the search then proceeds with a simulated annealing algorithm for further improvement of solution quality. Analysis of the results shows that the proposed simulated annealing algorithm greatly improves the solution constructed by the minimum-cost greedy heuristic, demonstrating the advantages of solving the project assignment problem with metaheuristic techniques.
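The two-stage scheme described (minimum-cost greedy construction followed by SA improvement) can be sketched as below. The cost model (makespan over students), the reassignment move, and all parameter values are illustrative assumptions, since the abstract does not specify them.

```python
import math
import random

def greedy_assign(time_matrix):
    """Greedily give each case to the student who finishes it soonest given
    the load so far (illustrative stand-in for the paper's minimum-cost
    greedy heuristic)."""
    n_students = len(time_matrix)
    load = [0.0] * n_students
    assign = []
    for c in range(len(time_matrix[0])):
        s = min(range(n_students), key=lambda k: load[k] + time_matrix[k][c])
        assign.append(s)
        load[s] += time_matrix[s][c]
    return assign

def makespan(time_matrix, assign):
    """Assumed cost: the largest total working time over all students."""
    load = [0.0] * len(time_matrix)
    for c, s in enumerate(assign):
        load[s] += time_matrix[s][c]
    return max(load)

def sa_improve(time_matrix, assign, t0=5.0, alpha=0.995, steps=3000, seed=11):
    """Improve a feasible assignment with SA over single-case reassignments."""
    rng = random.Random(seed)
    n_students = len(time_matrix)
    cur, cur_c = list(assign), makespan(time_matrix, assign)
    best, best_c = list(cur), cur_c
    t = t0
    for _ in range(steps):
        cand = list(cur)
        cand[rng.randrange(len(cand))] = rng.randrange(n_students)
        cand_c = makespan(time_matrix, cand)
        # Metropolis acceptance on the makespan difference
        if cand_c < cur_c or rng.random() < math.exp((cur_c - cand_c) / max(t, 1e-12)):
            cur, cur_c = cand, cand_c
            if cur_c < best_c:
                best, best_c = list(cur), cur_c
        t *= alpha
    return best, best_c
```

Because the SA stage starts from the greedy solution and keeps the best state seen, its result is never worse than the greedy construction, matching the improvement reported in the abstract.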
Comparing of the Deterministic Simulated Annealing Methods for Quadratic Assignment Problem
Directory of Open Access Journals (Sweden)
Mehmet Güray ÜNSAL
2013-08-01
Full Text Available In this study, the threshold accepting and record-to-record travel methods, which belong to the simulated annealing family of meta-heuristics, are applied to the Quadratic Assignment Problem and analyzed for statistically significant differences in objective function values and CPU time. No significant differences are found between the two algorithms in terms of either CPU time or objective function values. Consequently, on the basis of the Quadratic Assignment Problem, the two algorithms compared in this study show the same performance with respect to CPU time and objective function values.
Design of phase plates for shaping partially coherent beams by simulated annealing
Institute of Scientific and Technical Information of China (English)
Li Jian-Long; Lü Bai-Da
2008-01-01
Taking the Gaussian Schell-model beam as a typical example of partially coherent beams, this paper applies the simulated annealing (SA) algorithm to the design of phase plates for shaping partially coherent beams. A flow diagram is presented to illustrate the procedure of phase optimization by the SA algorithm. Numerical examples demonstrate the advantages of the SA algorithm in shaping partially coherent beams. A uniform flat-topped beam profile with maximum reconstruction error RE < 1.74% is achieved. A further extension of the approach is discussed.
Total lineshape analysis of high-resolution NMR spectra powered by simulated annealing
Cheshkov, D. A.; Sinitsyn, D. O.; Sheberstov, K. F.; Chertkov, V. A.
2016-11-01
A novel algorithm for the total lineshape analysis of high-resolution NMR spectra has been developed. Global optimization by simulated annealing is applied, overcoming the main shortcoming of common approaches, which frequently return solutions at local rather than global minima. The algorithm has been verified on four-spin ABCD test systems and successfully used to analyze experimental NMR spectra of proline. The approach avoids a sophisticated manual setup of initial parameters and allows the analysis of complicated high-resolution NMR spectra to be conducted nearly automatically.
Institute of Scientific and Technical Information of China (English)
Zhao Zhi-Jin; Zheng Shi-Lian; Xu Chun-Yun; Kong Xian-Zheng
2007-01-01
Hidden Markov models (HMMs) have been used to model burst error sources of wireless channels. This paper proposes a hybrid method using a genetic algorithm (GA) and simulated annealing (SA) to train HMMs for discrete channel modelling. The proposed method is compared with a pure GA, and experimental results show that the HMMs trained by the hybrid method describe the error sequences better, owing to SA's ability to facilitate hill-climbing in the later stages of the search. The burst error statistics of the HMMs trained by the proposed method and of the corresponding error sequences are also presented to validate the method.
Simulated annealing applied to two-dimensional low-beta reduced magnetohydrodynamics
Energy Technology Data Exchange (ETDEWEB)
Chikasue, Y., E-mail: chikasue@ppl.k.u-tokyo.ac.jp [Graduate School of Frontier Sciences, University of Tokyo, 5-1-5 Kashiwanoha, Kashiwa-shi, Chiba 277-8561 (Japan); Furukawa, M., E-mail: furukawa@damp.tottori-u.ac.jp [Graduate School of Engineering, Tottori University, Minami 4-101, Koyama-cho, Tottori-shi, Tottori 680-8552 (Japan)
2015-02-15
The simulated annealing (SA) method is applied to two-dimensional (2D) low-beta reduced magnetohydrodynamics (R-MHD). We have successfully obtained stationary states of the system numerically by the SA method with Casimir invariants preserved. Since the 2D low-beta R-MHD has two fields, the relaxation process becomes complex compared to a single field system such as 2D Euler flow. The obtained stationary state can have fine structure. We have found that the fine structure appears because the relaxation processes are different between kinetic energy and magnetic energy.
Energy Technology Data Exchange (ETDEWEB)
Fleischer, M.; Jacobson, S.
1994-12-31
This paper presents a new empirical approach designed to illustrate the theory developed in Fleischer and Jacobson regarding entropy measures and the finite-time performance of the simulated annealing (SA) algorithm. The theory is tested using several experimental methodologies based on a new structure, generic configuration spaces, and polynomial transformations between NP-hard problems. Both approaches provide several ways to alter the configuration space and its associated entropy measure while preserving the value of the globally optimal solution. This makes it possible to illuminate the extent to which entropy measures impact the finite-time performance of the SA algorithm.
Hansen, S H
2004-01-01
We present a user-friendly tool for the analysis of data from Sunyaev-Zeldovich effect observations. The tool is based on the stochastic method of simulated annealing and allows the extraction of the central values and error bars of the three SZ parameters: the Comptonization parameter y, the peculiar velocity v_p, and the electron temperature T_e. The f77 code SASZ allows any number of observing frequencies and spectral band shapes. As an example we consider the SZ parameters for the Coma cluster.
Kerr, I. D.; Sankararamakrishnan, R; Smart, O.S.; Sansom, M S
1994-01-01
A parallel bundle of transmembrane (TM) alpha-helices surrounding a central pore is present in several classes of ion channel, including the nicotinic acetylcholine receptor (nAChR). We have modeled bundles of hydrophobic and of amphipathic helices using simulated annealing via restrained molecular dynamics. Bundles of Ala20 helices, with N = 4, 5, or 6 helices/bundle were generated. For all three N values the helices formed left-handed coiled coils, with pitches ranging from 160 A (N = 4) to...
Stochastic Global Optimization and Its Applications with Fuzzy Adaptive Simulated Annealing
Aguiar e Oliveira Junior, Hime; Petraglia, Antonio; Rembold Petraglia, Mariane; Augusta Soares Machado, Maria
2012-01-01
Stochastic global optimization is a very important subject, that has applications in virtually all areas of science and technology. Therefore there is nothing more opportune than writing a book about a successful and mature algorithm that turned out to be a good tool in solving difficult problems. Here we present some techniques for solving several problems by means of Fuzzy Adaptive Simulated Annealing (Fuzzy ASA), a fuzzy-controlled version of ASA, and by ASA itself. ASA is a sophisticated global optimization algorithm that is based upon ideas of the simulated annealing paradigm, coded in the C programming language and developed to statistically find the best global fit of a nonlinear constrained, non-convex cost function over a multi-dimensional space. By presenting detailed examples of its application we want to stimulate the reader’s intuition and make the use of Fuzzy ASA (or regular ASA) easier for everyone wishing to use these tools to solve problems. We kept formal mathematical requirements to a...
Soltani-Mohammadi, Saeed; Safa, Mohammad; Mokhtari, Hadi
2016-10-01
One of the most important stages in complementary exploration is optimally designing the additional drilling pattern, that is, defining the optimum number and locations of additional boreholes. A great deal of research has been carried out in this regard; in most of the proposed algorithms, kriging variance minimization, as a criterion for uncertainty assessment, is defined as the objective function, and the problem is solved through optimization methods. Although kriging variance implementation is known to have many advantages in objective function definition, it is not sensitive to local variability. As a result, the only factors evaluated for locating the additional boreholes are the initial data configuration and the variogram model parameters, while the effects of local variability are omitted. In this paper, with the goal of considering local variability in the assessment of boundary uncertainty, the application of combined variance is investigated to define the objective function. To verify the applicability of the proposed objective function, it is used to locate additional boreholes in the Esfordi phosphate mine through metaheuristic optimization methods such as simulated annealing and particle swarm optimization. Comparison of results from the proposed objective function and conventional methods indicates that the changes imposed on the objective function have made the algorithm output sensitive to variations of grade, the domain's boundaries, and the thickness of the mineralization domain. The comparison between the results of the different optimization algorithms shows that, for the presented case, particle swarm optimization is more appropriate than simulated annealing.
Redesigning rain gauges network in Johor using geostatistics and simulated annealing
Energy Technology Data Exchange (ETDEWEB)
Aziz, Mohd Khairul Bazli Mohd, E-mail: mkbazli@yahoo.com [Centre of Preparatory and General Studies, TATI University College, 24000 Kemaman, Terengganu, Malaysia and Department of Mathematical Sciences, Faculty of Science, Universiti Teknologi Malaysia, 81310 UTM Johor Bahru, Johor (Malaysia); Yusof, Fadhilah, E-mail: fadhilahy@utm.my [Department of Mathematical Sciences, Faculty of Science, Universiti Teknologi Malaysia, 81310 UTM Johor Bahru, Johor (Malaysia); Daud, Zalina Mohd, E-mail: zalina@ic.utm.my [UTM Razak School of Engineering and Advanced Technology, Universiti Teknologi Malaysia, UTM KL, 54100 Kuala Lumpur (Malaysia); Yusop, Zulkifli, E-mail: zulyusop@utm.my [Institute of Environmental and Water Resource Management (IPASA), Faculty of Civil Engineering, Universiti Teknologi Malaysia, 81310 UTM Johor Bahru, Johor (Malaysia); Kasno, Mohammad Afif, E-mail: mafifkasno@gmail.com [Malaysia - Japan International Institute of Technology (MJIIT), Universiti Teknologi Malaysia, UTM KL, 54100 Kuala Lumpur (Malaysia)
2015-02-03
Recently, many rainfall network design techniques have been developed, discussed and compared by many researchers. Present-day hydrological studies require higher levels of accuracy from collected data. In numerous basins, the rain gauge stations are located without clear scientific justification. In this study, an attempt is made to redesign the rain gauge network for Johor, Malaysia in order to meet the required level of accuracy preset by rainfall data users. The existing network of 84 rain gauges in Johor is optimized and redesigned into new locations by using rainfall, humidity, solar radiation, temperature and wind speed data collected during the monsoon season (November-February) from 1975 until 2008. This study used a combination of a geostatistical method (the variance-reduction method) and simulated annealing as the optimization algorithm during the redesign process. The result shows that the new rain gauge locations provide the minimum value of estimated variance. This shows that the combination of the geostatistical variance-reduction method and simulated annealing is successful in the development of a new optimum rain gauge network.
Design and optimization of solid rocket motor Finocyl grain using simulated annealing
Institute of Scientific and Technical Information of China (English)
Ali Kamran; LIANG Guo-zhu
2011-01-01
The research effort outlined the application of a computer aided design (CAD)-centric technique to the design and optimization of solid rocket motor Finocyl (fin in cylinder) grain using simulated annealing. The proper method for constructing the grain configuration model, ballistic performance and optimizer integration for analysis was presented. Finocyl is a complex grain configuration, requiring thirteen variables to define the geometry. The large number of variables complicates not only the geometrical construction but also the optimization process. The CAD representation encapsulates all of the geometric entities pertinent to the grain design in a parametric way, allowing manipulation of the grain entity (web), performing regression and automating geometrical data calculations. Robustness in avoiding local minima and an efficient capacity to explore the design space make simulated annealing an attractive choice as optimizer. It is demonstrated, with a constrained optimization of Finocyl grain geometry for a homogeneous, isotropic propellant, uniform regression, and a quasi-steady, bulk-mode internal ballistics model, that the method maximizes average thrust for required deviations from neutrality.
Directory of Open Access Journals (Sweden)
Kai Moriguchi
2015-01-01
Full Text Available We evaluated the potential of simulated annealing as a reliable method for optimizing thinning rates for single even-aged stands. Four types of yield models were used as benchmark models to examine the algorithm’s versatility. Thinning rate, which was constrained to 0–50% every 5 years at stand ages of 10–45 years, was optimized to maximize the net present value for one fixed rotation term (50 years. The best parameters for the simulated annealing were chosen from 113 patterns, using the mean of the net present value from 39 runs to ensure the best performance. We compared the solutions with those from coarse full enumeration to evaluate the method’s reliability and with 39 runs of random search to evaluate its efficiency. In contrast to random search, the best run of simulated annealing for each of the four yield models resulted in a better solution than coarse full enumeration. However, variations in the objective function for two yield models obtained with simulated annealing were significantly larger than those of random search. In conclusion, simulated annealing with optimized parameters is more efficient for optimizing thinning rates than random search. However, it is necessary to execute multiple runs to obtain reliable solutions.
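The thinning-rate search described above can be illustrated with a toy even-aged stand model. The yield model below is invented for this sketch (it is none of the paper's four benchmark models); the state is the vector of eight 5-yearly thinning rates at stand ages 10-45, each constrained to 0-50%, and the objective is net present value over one fixed 50-year rotation:

```python
import math
import random

def npv(rates, r=0.03, price=50.0):
    """Toy stand model: volume grows 12 m3/ha/yr toward a density-dependent
    ceiling; each thinning at ages 10..45 removes `rate` of standing volume
    and is sold; the stand is clear-cut at age 50. All numbers are invented."""
    vol, value = 5.0, 0.0
    for k, age in enumerate(range(5, 50, 5)):
        vol += 12.0 * 5.0 * (1.0 - vol / 400.0)        # 5-year growth increment
        if age >= 10:
            cut = rates[k - 1] * vol
            vol -= cut
            value += price * cut / (1 + r) ** age      # discounted thinning revenue
    vol += 12.0 * 5.0 * (1.0 - vol / 400.0)
    return value + price * vol / (1 + r) ** 50         # final felling at age 50

rng = random.Random(0)
x = [0.0] * 8                                          # thinning rate per entry, 0-0.5
f = npv(x)
best, fbest = x[:], f
T = 20.0
for _ in range(5000):
    cand = x[:]
    k = rng.randrange(8)
    cand[k] = min(0.5, max(0.0, cand[k] + rng.choice([-0.05, 0.05])))
    fc = npv(cand)
    if fc >= f or rng.random() < math.exp((fc - f) / T):   # maximization variant
        x, f = cand, fc
        if f > fbest:
            best, fbest = x[:], f
    T *= 0.999
```

Note the flipped acceptance test: NPV is maximized, so deteriorations are decreases in the objective.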
Adaptive MANET Multipath Routing Algorithm Based on the Simulated Annealing Approach
Directory of Open Access Journals (Sweden)
Sungwook Kim
2014-01-01
Full Text Available A mobile ad hoc network represents a system of wireless mobile nodes that can freely and dynamically self-organize network topologies without any preexisting communication infrastructure. Due to characteristics such as temporary topology and the absence of centralized authority, routing is one of the major issues in ad hoc networks. In this paper, a new multipath routing scheme is proposed that employs the simulated annealing approach. The proposed metaheuristic approach can achieve greater and reciprocal advantages in a hostile, dynamic, real-world network situation and is therefore a powerful method for finding an effective solution to the conflicting demands of the mobile ad hoc network routing problem. Simulation results indicate that the proposed scheme adapts best to the variation of dynamic network situations: the average remaining energy, network throughput, packet loss probability, and traffic load distribution are improved by about 10%, 10%, 5%, and 10%, respectively, over the existing schemes.
Research on Optimal Control for the Vehicle Suspension Based on the Simulated Annealing Algorithm
Directory of Open Access Journals (Sweden)
Jie Meng
2014-01-01
Full Text Available A method is designed to optimize the weight matrices of the LQR controller by using the simulated annealing algorithm. This method utilizes the random searching characteristics of the algorithm to optimize the weight matrices with a target function built from suspension performance indexes. It improves the design efficiency and control performance of LQR control and addresses the difficulty of defining the weight matrices for the LQR controller. A simulation of vehicle active chassis control is provided. The result shows that the active suspension using the LQR controller optimized by the simulated annealing algorithm performs better than both the chassis controlled by a conventionally tuned LQR and the passive one. Meanwhile, the problem of defining the weight matrices is largely resolved.
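One way to realize such a scheme is to wrap a discrete-time LQR solve inside an SA loop that perturbs the diagonal of Q and scores each candidate with a separate suspension-style performance index. The plant, weights, and index below are illustrative assumptions, not those of the paper:

```python
import numpy as np

def dlqr_gain(A, B, Q, R, iters=300):
    # Discrete-time LQR gain via Riccati value iteration (plain NumPy, no SciPy).
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

dt = 0.01
A = np.array([[1.0, dt], [-1.0 * dt, 1.0 - 0.2 * dt]])  # Euler-discretized mass-spring-damper
B = np.array([[0.0], [dt]])
R = np.array([[1.0]])

def performance(qdiag):
    # Suspension-style index (deliberately NOT the LQR cost): ride comfort plus peak effort.
    K = dlqr_gain(A, B, np.diag(qdiag), R)
    x = np.array([1.0, 0.0])
    J, umax = 0.0, 0.0
    for _ in range(600):
        u = -float((K @ x)[0])
        umax = max(umax, abs(u))
        x = A @ x + B[:, 0] * u
        J += abs(x[0]) * dt
    return J + 0.05 * umax

rng = np.random.default_rng(1)
q = np.array([10.0, 1.0])                  # initial guess for diag(Q)
f0 = performance(q)
fq, best, fbest = f0, q.copy(), f0
t = 1.0
for _ in range(120):
    # Log-normal multiplicative steps keep the weights positive.
    cand = np.clip(q * np.exp(rng.normal(0.0, 0.3, size=2)), 1e-2, 1e4)
    fc = performance(cand)
    if fc <= fq or rng.random() < np.exp((fq - fc) / t):  # Metropolis acceptance
        q, fq = cand, fc
        if fq < fbest:
            best, fbest = q.copy(), fq
    t *= 0.98
```

The key design point is that the SA objective (`performance`) is distinct from the quadratic cost that LQR itself minimizes, which is exactly why weight tuning is needed.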
Kinetic Monte Carlo simulations of boron activation in implanted Si under laser thermal annealing
Fisicaro, Giuseppe; Pelaz, Lourdes; Aboy, Maria; Lopez, Pedro; Italia, Markus; Huet, Karim; Cristiano, Filadelfo; Essa, Zahi; Yang, Qui; Bedel-Pereira, Elena; Quillec, Maurice; La Magna, Antonino
2014-02-01
We investigate the correlation between dopant activation and damage evolution in boron-implanted silicon under excimer laser irradiation. The dopant activation efficiency in the solid phase was measured under a wide range of irradiation conditions and simulated using coupled phase-field and kinetic Monte Carlo models. With the inclusion of dopant atoms, the presented code extends the capabilities of a previous version, allowing its definitive validation by means of detailed comparisons with experimental data. The stochastic method predicts the post-implant kinetics of the defect-dopant system in the far-from-equilibrium conditions caused by laser irradiation. The simulations explain the dopant activation dynamics and demonstrate that the competitive dopant-defect kinetics during the first laser annealing treatment dominates the activation phenomenon, stabilizing the system against additional laser irradiation steps.
The Politics of City Planning Simulations.
Kolson, Kenneth
This research paper presents an analysis of the computer simulation, SimCity, used for an urban city planning class. The data were gathered by actual use of the simulation and an electronic mail network was employed to secure impressions from users of the simulation. SimCity (developed by Maxis) provides the player with rules of human factors,…
Energy Technology Data Exchange (ETDEWEB)
Sanchez Lopez, Hector [Universidad de Oriente, Santiago de Cuba (Cuba). Centro de Biofisica Medica]. E-mail: hsanchez@cbm.uo.edu.cu
2001-08-01
This work describes an alternative simulated annealing algorithm applied to the design of the main magnet for a magnetic resonance imaging machine. The algorithm uses a probabilistic radial basis function neural network to classify the possible solutions before the objective function evaluation. This procedure reduces by up to 50% the number of iterations required to achieve the global maximum, compared with the standard SA algorithm. The algorithm was applied to design a 0.1050 T four-coil resistive magnet, which produces a magnetic field 2.13 times more uniform than the solution given by standard SA. (author)
Morgan, John A
2016-01-01
The method of simulated annealing is adapted to the temperature-emissivity separation (TES) problem. A patch of surface at the bottom of the atmosphere is assumed to be a greybody emitter with spectral emissivity $\epsilon(k)$ describable by a mixture of spectral endmembers. We prove that a simulated annealing search conducted according to a suitable schedule converges to a solution maximizing the $\textit{a posteriori}$ probability that spectral radiance detected at the top of the atmosphere originates from a patch with stipulated $T$ and $\epsilon(k)$. Any such solution will be nonunique. The average of a large number of simulated annealing solutions, however, converges almost surely to a unique maximum a posteriori solution for $T$ and $\epsilon(k)$. The limitation to a stipulated set of endmember emissivities may be relaxed by allowing the number of endmembers to grow without bound, and to be generic continuous functions of wavenumber with bounded first derivatives with respect to wavenumber.
Minimizing distortion and internal forces in truss structures by simulated annealing
Kincaid, Rex K.
1989-01-01
Inaccuracies in the lengths of members and the diameters of joints of large truss reflector backup structures may produce unacceptable levels of surface distortion and member forces. However, if the member lengths and joint diameters can be measured accurately, it is possible to configure the members and joints so that the root-mean-square (rms) surface error and/or rms member forces are minimized. Following Greene and Haftka (1989) it is assumed that the force vector f is linearly proportional to the member length errors e_M of dimension NMEMB (the number of members) and joint errors e_J of dimension NJOINT (the number of joints), and that the best-fit displacement vector d is a linear function of f. Let NNODES denote the number of positions on the surface of the truss where error influences are measured. The solution of the problem is discussed. To classify this problem, it was compared to a similar combinatorial optimization problem. In particular, when only the member length errors are considered, minimizing d^2_rms is equivalent to the quadratic assignment problem. The quadratic assignment problem is a well-known NP-complete problem in the operations research literature; hence minimizing d^2_rms is also an NP-complete problem. The focus of the research is the development of a simulated annealing algorithm to reduce d^2_rms. The appeal of this technique lies in its recent success on a variety of NP-complete combinatorial optimization problems, including the quadratic assignment problem. A physical analogy for simulated annealing is the way liquids freeze and crystallize. All computational experiments were done on a MicroVAX. The two-interchange heuristic is very fast but produces widely varying results. The two- and three-interchange heuristic provides less variability in the final objective function values but runs much more slowly. Simulated annealing produced the best objective function values for every starting configuration and
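The core of such an approach, SA over permutations with two-interchange moves, can be sketched for a generic quadratic assignment instance. The instance and parameters below are toy assumptions for illustration, not the truss data from the abstract:

```python
import math
import random

def qap_cost(F, D, p):
    # QAP objective: sum of flow F[i][j] times distance D[p[i]][p[j]].
    n = len(p)
    return sum(F[i][j] * D[p[i]][p[j]] for i in range(n) for j in range(n))

def anneal_qap(F, D, t0=10.0, alpha=0.999, steps=8000, seed=0):
    rng = random.Random(seed)
    n = len(F)
    p = list(range(n))
    rng.shuffle(p)
    c = qap_cost(F, D, p)
    best, cbest = p[:], c
    t = t0
    for _ in range(steps):
        i, j = rng.sample(range(n), 2)
        p[i], p[j] = p[j], p[i]          # two-interchange move
        c2 = qap_cost(F, D, p)
        if c2 <= c or rng.random() < math.exp((c - c2) / t):
            c = c2
            if c < cbest:
                best, cbest = p[:], c
        else:
            p[i], p[j] = p[j], p[i]      # undo the rejected swap
        t *= alpha
    return best, cbest

# Toy 5-facility instance (flows F, linear-arrangement distances D).
F = [[0, 3, 0, 2, 0],
     [3, 0, 0, 1, 4],
     [0, 0, 0, 0, 5],
     [2, 1, 0, 0, 0],
     [0, 4, 5, 0, 0]]
D = [[0, 1, 2, 3, 4],
     [1, 0, 1, 2, 3],
     [2, 1, 0, 1, 2],
     [3, 2, 1, 0, 1],
     [4, 3, 2, 1, 0]]
best, cbest = anneal_qap(F, D)
```

In the truss setting, the permutation would assign measured members to positions and the quadratic cost would be d^2_rms.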
Retrieval of Surface and Subsurface Moisture of Bare Soil Using Simulated Annealing
Tabatabaeenejad, A.; Moghaddam, M.
2009-12-01
Soil moisture is of fundamental importance to many hydrological and biological processes. Soil moisture information is vital to understanding the cycling of water, energy, and carbon in the Earth system. Knowledge of soil moisture is critical to agencies concerned with weather and climate, runoff potential and flood control, soil erosion, reservoir management, water quality, agricultural productivity, drought monitoring, and human health. The need to monitor soil moisture on a global scale has motivated missions such as Soil Moisture Active and Passive (SMAP) [1]. Rough surface scattering models and remote sensing retrieval algorithms are essential in the study of soil moisture, because soil can be represented as a rough surface structure. Effects of soil moisture on the backscattered field have been studied since the 1960s, but soil moisture estimation remains a challenging problem, and there is still a need for more accurate and more efficient inversion algorithms. It has been shown that the simulated annealing method is a powerful tool for inversion of the model parameters of rough surface structures [2]. The sensitivity of this method to measurement noise has also been investigated, assuming a two-layer structure characterized by the layers' dielectric constants, layer thickness, and statistical properties of the rough interfaces [2]. However, since the moisture profile varies with depth, it is sometimes necessary to model the rough surface as a layered structure with a rough interface on top and a stratified structure below, where each layer is assumed to have a constant volumetric moisture content. In this work, we discretize the soil structure into several layers of constant moisture content to examine the effect of the subsurface profile on the backscattering coefficient. We will show that while the moisture profile could vary in deeper layers, these layers do not affect the scattered electromagnetic field significantly. Therefore, we can use just a few layers
Schneider, Johannes J.; Puchta, Markus
2010-12-01
Simulated annealing is the classic physical optimization algorithm, which has been applied to a large variety of problems for many years. Over time, several adaptive mechanisms for decreasing the temperature, and thus controlling the acceptance of deteriorations, have been developed, based on measurement of the mean value and the variance of the energy. Here we propose a new simplified approach in which we consider the probability of accepting deteriorations as the main control parameter and derive the temperature by averaging over the last few deteriorations stored in a memory. We present results for the traveling salesman problem and demonstrate how the amount of data retained influences both the cooling schedule and the quality of the results.
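A minimal sketch of this control scheme, with the acceptance probability as the scheduled control parameter and the temperature derived from a memory of recent deteriorations, might look as follows for a small random TSP instance (all parameter values and the memory seed are our own assumptions, not the authors'):

```python
import math
import random
from collections import deque

def tour_length(pts, tour):
    return sum(math.dist(pts[tour[i - 1]], pts[tour[i]]) for i in range(len(tour)))

def adaptive_sa(pts, steps=12000, memory=25, p0=0.5, p_end=0.002, seed=0):
    rng = random.Random(seed)
    n = len(pts)
    tour = list(range(n))
    rng.shuffle(tour)
    cur = tour_length(pts, tour)
    best, cbest = tour[:], cur
    recent = deque([cur * 0.05], maxlen=memory)   # seed value so T is defined at step 0
    for k in range(steps):
        # The acceptance probability is the scheduled control parameter...
        p_acc = p0 * (p_end / p0) ** (k / steps)
        # ...and T is derived so an average remembered deterioration is accepted with p_acc.
        T = -(sum(recent) / len(recent)) / math.log(p_acc)
        i, j = sorted(rng.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]  # 2-opt segment reversal
        c2 = tour_length(pts, cand)
        if c2 > cur:
            recent.append(c2 - cur)               # remember this deterioration
        if c2 <= cur or rng.random() < math.exp((cur - c2) / T):
            tour, cur = cand, c2
            if cur < cbest:
                best, cbest = tour[:], cur
    return best, cbest

r = random.Random(1)
pts = [(r.random(), r.random()) for _ in range(20)]
best, cbest = adaptive_sa(pts)
```

Because T is tied to the deteriorations actually encountered, the schedule adapts to the energy landscape without measuring the energy variance explicitly.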
Evaluating strong measurement noise in data series with simulated annealing method
Carvalho, J.; Haase, M.; Lind, P. G.
2013-01-01
Many stochastic time series can be described by a Langevin equation composed of a deterministic and a stochastic dynamical part. Such a stochastic process can be reconstructed by means of a recently introduced nonparametric method, thus increasing the predictability, i.e. knowledge of the macroscopic drift and the microscopic diffusion functions. If the measurement of a stochastic process is affected by additional strong measurement noise, the reconstruction process cannot be applied. Here, we present a method for the reconstruction of stochastic processes in the presence of strong measurement noise, based on a suitably parametrized ansatz. At the core of the process is the minimization of the functional distance between terms containing the conditional moments taken from measurement data, and the corresponding ansatz functions. It is shown that a minimization of the distance by means of a simulated annealing procedure yields better results than a previously used Levenberg-Marquardt algorithm, which permits a...
COLSS Axial Power Distribution Synthesis using Artificial Neural Network with Simulated Annealing
Energy Technology Data Exchange (ETDEWEB)
Shim, K. W.; Oh, D. Y.; Kim, D. S.; Choi, Y. J.; Park, Y. H. [KEPCO Nuclear Fuel Company, Inc., Daejeon (Korea, Republic of)
2015-05-15
The core operating limit supervisory system (COLSS) is an application program implemented in the plant monitoring system (PMS) of nuclear power plants (NPPs). COLSS aids the operator in maintaining plant operation within selected limiting conditions for operation (LCOs), such as the departure from nucleate boiling ratio (DNBR) margin and the linear heat rate (LHR) margin. In order to calculate the above LCOs, COLSS uses the core averaged axial power distribution (APD). In COLSS, a 40-node APD is synthesized from the 5-level in-core neutron flux detector signals based on the Fourier series method. We propose an artificial neural network (ANN) with simulated annealing (SA) method in place of the Fourier series method to synthesize the APD of COLSS. The proposed method is more accurate than the current method, as shown by the axial-shape RMS errors.
Engineering phase shifter domains for multiple QPM using simulated annealing algorithm
Siva, Chellappa; Sunder Meetei, Toijam; Shiva, Prabhakar; Narayanan, Balaji; Arvind, Ganesh; Boomadevi, Shanmugam; Pandiyan, Krishnamoorthy
2017-10-01
We have utilized the general algorithm of simulated annealing (SA) to engineer the phase shifter domains in a quasi-phase-matching (QPM) device to generate multiple frequency conversion. SA is an algorithm generally used to find the global maximum or minimum of a given function. Here, we have utilized this algorithm to generate multiple QPM second harmonic generation (SHG) by distributing phase shifters suitably. In general, phase shifters are distributed in a QPM device with some specific profile along the length to generate multiple QPM SHG. Using the SA algorithm, the locations of these phase shifters can be easily identified to obtain the desired multiple QPM with higher conversion efficiency. The methodology to generate the desired multiple QPM SHG using the SA algorithm is discussed in detail.
A hybrid Tabu search-simulated annealing method to solve quadratic assignment problem
Directory of Open Access Journals (Sweden)
Mohamad Amin Kaviani
2014-06-01
Full Text Available The quadratic assignment problem (QAP) is considered one of the most complicated combinatorial problems. The problem is NP-hard and optimal solutions are not available for large-scale instances. This paper presents a hybrid method, called TABUSA, that combines tabu search and simulated annealing to solve the QAP. Using some well-known problems from QAPLIB, generated by Burkard et al. (1997) [Burkard, R. E., Karisch, S. E., & Rendl, F. (1997). QAPLIB - a quadratic assignment problem library. Journal of Global Optimization, 10(4), 391-403.], the two methods TABUSA and TS are coded in MATLAB and compared in terms of relative percentage deviation (RPD) on all instances. The performance of the proposed method is examined against tabu search, and the preliminary results indicate that the hybrid method is capable of solving real-world problems efficiently.
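A TABUSA-like hybrid can be sketched by adding a tabu list of recently used swaps (with an aspiration criterion) to a standard SA loop over permutations. The instance, tabu tenure, and cooling parameters below are illustrative assumptions, not taken from the paper:

```python
import math
import random
from collections import deque

def tabu_sa_qap(F, D, steps=4000, tabu_len=15, t0=20.0, alpha=0.997, seed=3):
    rng = random.Random(seed)
    n = len(F)

    def cost(p):
        return sum(F[i][j] * D[p[i]][p[j]] for i in range(n) for j in range(n))

    p = list(range(n))
    rng.shuffle(p)
    c = cost(p)
    best, cbest = p[:], c
    tabu = deque(maxlen=tabu_len)        # recently used swap pairs are forbidden
    t = t0
    for _ in range(steps):
        i, j = sorted(rng.sample(range(n), 2))
        p[i], p[j] = p[j], p[i]
        c2 = cost(p)
        # Aspiration criterion: a tabu move is still allowed if it yields a new best.
        blocked = (i, j) in tabu and c2 >= cbest
        if not blocked and (c2 <= c or rng.random() < math.exp((c - c2) / t)):
            c = c2
            tabu.append((i, j))
            if c < cbest:
                best, cbest = p[:], c
        else:
            p[i], p[j] = p[j], p[i]      # undo the move
        t *= alpha
    return best, cbest

# Toy instance: flows F and symmetric distances D for five facilities.
F = [[0, 5, 2, 4, 1],
     [5, 0, 3, 0, 2],
     [2, 3, 0, 0, 0],
     [4, 0, 0, 0, 5],
     [1, 2, 0, 5, 0]]
D = [[0, 1, 1, 2, 3],
     [1, 0, 2, 1, 2],
     [1, 2, 0, 1, 2],
     [2, 1, 1, 0, 1],
     [3, 2, 2, 1, 0]]
best, cbest = tabu_sa_qap(F, D)
```

The tabu memory discourages cycling between the same pair of swaps, while the Metropolis rule still allows controlled uphill moves.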
Fabrication of simulated plate fuel elements: Defining role of stress relief annealing
Kohli, D.; Rakesh, R.; Sinha, V. P.; Prasad, G. J.; Samajdar, I.
2014-04-01
This study involved fabrication of simulated plate fuel elements. Uranium silicide of actual fuel elements was replaced with yttria. The fabrication stages were otherwise identical. The final cold rolled and/or straightened plates, without stress relief, showed an inverse relationship between bond strength and out of plane residual shear stress (τ13). Stress relief of τ13 was conducted over a range of temperatures/times (200-500 °C and 15-240 min) and led to corresponding improvements in bond strength. Fastest τ13 relief was obtained through 300 °C annealing. Elimination of microscopic shear bands, through recovery and partial recrystallization, was clearly the most effective mechanism of relieving τ13.
Shape optimization of road tunnel cross-section by simulated annealing
Directory of Open Access Journals (Sweden)
Sobótka Maciej
2016-06-01
Full Text Available The paper concerns shape optimization of a tunnel excavation cross-section. The study incorporates the optimization procedure of simulated annealing (SA). The form of the cost function derives from the energetic optimality condition formulated in the authors' previous papers. The utilized algorithm builds on the optimization procedure already published by the authors. Unlike other approaches presented in the literature, the one introduced in this paper takes into consideration the practical requirement of preserving a fixed clearance gauge. Itasca Flac software is utilized in the numerical examples. The optimal excavation shapes are determined for five different in situ stress ratios. This factor significantly affects the optimal topology of the excavation: the resulting shapes are elongated in the direction of the greater principal stress. Moreover, the obtained optimal shapes have smooth contours circumscribing the gauge.
Simulated annealing for three-dimensional low-beta reduced MHD equilibria in cylindrical geometry
Furukawa, M
2016-01-01
Simulated annealing (SA) is applied to three-dimensional (3D) equilibrium calculation of ideal, low-beta reduced MHD in cylindrical geometry. The SA is based on the theory of Hamiltonian mechanics. The dynamical equation of the original system, low-beta reduced MHD in this study, is modified so that the energy changes monotonically while preserving the Casimir invariants in the artificial dynamics. An equilibrium of the system is given by an extremum of the energy; therefore SA can be used as a method for calculating ideal MHD equilibria. Previous studies demonstrated that SA succeeds in reaching various MHD equilibria in a two-dimensional rectangular domain. In this paper, the theory is applied to 3D equilibria of ideal, low-beta reduced MHD. An example of an equilibrium with magnetic islands, obtained as a lower-energy state, is shown. Several versions of the artificial dynamics that can effect smoothing are developed.
An Archived Multi Objective Simulated Annealing Method to Discover Biclusters in Microarray Data
Directory of Open Access Journals (Sweden)
Mohsen Lashkargir
2011-01-01
Full Text Available With the advent of microarray technology it has become possible to measure thousands of gene expression values in a single experiment. Analysis of large-scale genomics data, notably gene expression, initially focused on clustering methods. Recently, biclustering techniques were proposed for revealing submatrices showing unique patterns. Biclustering, or simultaneous clustering of both genes and conditions, is challenging, particularly for the analysis of high-dimensional gene expression data in information retrieval, knowledge discovery, and data mining. In biclustering of microarray data, several objectives have to be optimized simultaneously, and often these objectives are in conflict with each other. A multi-objective model is very suitable for solving this problem. We propose an algorithm based on multi-objective simulated annealing for discovering biclusters in gene expression data. Experimental results on benchmark databases show a significant improvement in the overlap among biclusters, the coverage of gene expression elements, and the quality of the biclusters.
Huq, Ashfia; Stephens, P W
2003-02-01
Recent advances in crystallographic computing and availability of high-resolution diffraction data have made it relatively easy to solve crystal structures from powders that would have traditionally required single crystal samples. The success of direct space methods depends heavily on starting with an accurate molecular model. In this paper we address the applicability of using these methods in finding subtleties such as disorder in the molecular conformation that might not be known a priori. We use ranitidine HCl as our test sample as it is known to have a conformational disorder from single crystal structural work. We redetermine the structure from powder data using simulated annealing and show that the conformational disorder is clearly revealed by this method.
Application of simulated annealing algorithm to improve work roll wear model in plate mills
Institute of Scientific and Technical Information of China (English)
[No author listed]
2002-01-01
Employing the Simulated Annealing Algorithm (SAA) and many measured data, a calculation model of work roll wear was built for the 2800 mm 4-high mill of Wuhan Iron and Steel (Group) Co. (WISCO). The model is a semi-theoretical practical formula whose form and coefficients could hardly be determined with classical optimization methods, but the problem was resolved by SAA. The model predicts the wear profiles of the work rolls in a rolling unit with high precision. After one year of application, the results show that the model is feasible in engineering and can be applied to predict the wear profiles of work rolls in other mills.
An infrared achromatic quarter-wave plate designed based on simulated annealing algorithm
Pang, Yajun; Zhang, Yinxin; Huang, Zhanhua; Yang, Huaidong
2017-03-01
Quarter-wave plates are primarily used to change the polarization state of light. Their retardation usually varies depending on the wavelength of the incident light. In this paper, the design and characteristics of an achromatic quarter-wave plate, which is formed by a cascaded system of birefringent plates, are studied. For the analysis of the combination, we use the Jones matrix method to derive general expressions for the equivalent retardation and the equivalent azimuth. The infrared achromatic quarter-wave plate is designed based on the simulated annealing (SA) algorithm. The maximum retardation variation and the maximum azimuth variation of this achromatic waveplate are only about 1.8° and 0.5°, respectively, over the entire wavelength range of 1250-1650 nm. This waveplate can change linearly polarized light into circularly polarized light with a less than 3.2% degree of linear polarization (DOLP) over that wide wavelength range.
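The design procedure described in the abstract can be imitated with a short numerical sketch: a Jones matrix per plate, an equivalent retardation extracted from the trace of the cascaded SU(2) matrix, and SA over the plate azimuths. The plate retardances (a quartz-like QWP-HWP-QWP stack centered at 1.45 µm), band sampling, and SA parameters below are our own assumptions, not the paper's design:

```python
import numpy as np

def retarder(delta, theta):
    # Jones matrix of a linear retarder: retardation delta, fast-axis azimuth theta.
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, s], [-s, c]])
    J = np.diag([np.exp(-1j * delta / 2), np.exp(1j * delta / 2)])
    return R.T @ J @ R

def equivalent_retardation(thetas, wavelengths, dnd):
    # For an SU(2) Jones matrix, trace = 2*cos(Gamma/2) gives the equivalent retardation.
    out = []
    for lam in wavelengths:
        M = np.eye(2, dtype=complex)
        for theta, t_i in zip(thetas, dnd):
            M = retarder(2 * np.pi * t_i / lam, theta) @ M
        out.append(2 * np.arccos(np.clip(M.trace().real / 2, -1.0, 1.0)))
    return np.array(out)

wavelengths = np.linspace(1.25, 1.65, 9)        # micrometres: the band in the abstract
dnd = np.array([1.45 / 4, 1.45 / 2, 1.45 / 4])  # assumed Δn·d per plate: QWP, HWP, QWP
target = np.pi / 2

def cost(thetas):
    return float(np.sum((equivalent_retardation(thetas, wavelengths, dnd) - target) ** 2))

rng = np.random.default_rng(7)
th = rng.uniform(0.0, np.pi, 3)
f0 = cost(th)
f, best, fbest = f0, th.copy(), f0
T = 0.5
for _ in range(2000):
    cand = th + rng.normal(0.0, 0.05, 3)       # perturb the three azimuth angles
    fc = cost(cand)
    if fc <= f or rng.random() < np.exp((f - fc) / T):
        th, f = cand, fc
        if f < fbest:
            best, fbest = th.copy(), f
    T *= 0.997
```

SA is attractive here because the retardation-flatness cost is a non-convex function of the azimuths with many local minima.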
Energy Technology Data Exchange (ETDEWEB)
Idoumghar, L. [Haute Alcace Univ., Mulhouse (France); Fodorean, D.; Mirraoui, A. [Univ. of Technology of Belfort-Montbeliard, Belfort (France). Dept. of Electrical Engineering and Control Systems
2010-03-09
Metaheuristics algorithms can solve complex optimization problems. A unique simulated annealing (SA) algorithm for multi-objective optimization was presented in this paper. The proposed SA algorithm was validated on five standard benchmark mathematical functions and improved the design of an inset permanent magnet motor with concentrated flux (IPMM-CF). The paper provided a description of the SA algorithm and discussed the results. The five benchmarks that were studied included Rastrigin's function; Rosenbrock's function; Michalewicz's function; Schwefel's function; and Noisy's function. The findings were also compared with results obtained by using the Ant Colony paradigm as well as with a particle swarm algorithm. Conclusions and further research options were also offered. It was concluded that the proposed approach has better performance in terms of accuracy, convergence rate, stability and robustness. 15 refs., 4 tabs., 9 figs.
Hussain, Faraz; Jha, Sumit K; Jha, Susmit; Langmead, Christopher J
2014-01-01
Stochastic models are increasingly used to study the behaviour of biochemical systems. While the structure of such models is often readily available from first principles, unknown quantitative features of the model are incorporated into the model as parameters. Algorithmic discovery of parameter values from experimentally observed facts remains a challenge for the computational systems biology community. We present a new parameter discovery algorithm that uses simulated annealing, sequential hypothesis testing, and statistical model checking to learn the parameters in a stochastic model. We apply our technique to a model of glucose and insulin metabolism used for in-silico validation of artificial pancreata and demonstrate its effectiveness by developing a parallel CUDA-based implementation for parameter synthesis in this model.
Directory of Open Access Journals (Sweden)
Vasios C.E.
2003-01-01
Full Text Available In the present work, a new method for the classification of Event Related Potentials (ERPs) is proposed. The proposed method consists of two modules: the feature extraction module and the classification module. The feature extraction module comprises the implementation of the Multivariate Autoregressive model in conjunction with the Simulated Annealing technique for the selection of optimum features from ERPs. The classification module is implemented with a single three-layer neural network, trained with the back-propagation algorithm, and classifies the data into two classes: patients and control subjects. The method, in the form of a Decision Support System (DSS), has been thoroughly tested on a number of patient data sets (OCD, FES, depressives, and drug users), resulting in successful classification rates of up to 100%.
Simulated Annealing for Ground State Energy of Ionized Donor Bound Excitons in Semiconductors
Institute of Scientific and Technical Information of China (English)
YAN Hai-Qing; TANG Chen; LIU Ming; ZHANG Hao; ZHANG Gui-Min
2004-01-01
We present a global optimization method, called simulated annealing, for the ground state energies of excitons. The proposed method does not require the partial derivatives with respect to each variational parameter or the solution of an eigenequation, so the present method is simpler in software programming than the variational method and overcomes its major difficulties. The ground state energies of ionized-donor-bound excitons (D+, X) have been calculated variationally for all values of the effective electron-to-hole mass ratio σ. They are compared with those obtained by the variational method. The results obtained demonstrate that the proposed method is simple, accurate, and has more advantages than the traditional methods in calculation.
Orito, Yukiko; Yamamoto, Hisashi; Tsujimura, Yasuhiro; Kambayashi, Yasushi
Portfolio optimization determines the proportion-weighted combination of assets in a portfolio in order to achieve investment targets. It is a multi-dimensional combinatorial optimization problem, and it is difficult for a portfolio constructed over a past period to keep its performance in a future period. In order to preserve good portfolio performance, we propose the extended information ratio as an objective function, using the information ratio, beta, prime beta, or correlation coefficient. We apply simulated annealing (SA) to optimize the portfolio employing the proposed ratio. For the SA, we generate neighbor solutions by an operation that changes the structure of the weights in the portfolio. In the numerical experiments, we show that our portfolios keep good performance even when the market trend of the future period differs from that of the past period.
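The neighbor operation described above, changing the structure of the weights, is not specified in detail in the abstract; a plausible minimal sketch (assuming a long-only, fully invested portfolio) shifts a small amount of weight between two assets while keeping the weights nonnegative and summing to one:

```python
import random

def neighbor(weights, rng, delta=0.05):
    """Move a small amount of weight from one asset to another,
    keeping the weights nonnegative and summing to one."""
    w = list(weights)
    i, j = rng.sample(range(len(w)), 2)
    amount = min(delta * rng.random(), w[i])  # cannot take more than asset i holds
    w[i] -= amount
    w[j] += amount
    return w

rng = random.Random(42)
w0 = [0.25, 0.25, 0.25, 0.25]
w1 = neighbor(w0, rng)
```

Because the move preserves the simplex constraint by construction, the SA loop never needs a repair or renormalization step.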
Two-Dimensional IIR Filter Design Using Simulated Annealing Based Particle Swarm Optimization
Directory of Open Access Journals (Sweden)
Supriya Dhabal
2014-01-01
Full Text Available We present a novel hybrid algorithm based on particle swarm optimization (PSO) and simulated annealing (SA) for the design of two-dimensional recursive digital filters. The proposed method, known as SA-PSO, integrates the global search ability of PSO with the local search ability of SA, each offsetting the weaknesses of the other. The Metropolis acceptance criterion is included in the basic PSO algorithm to increase the swarm's diversity by sometimes also accepting weaker solutions. The experimental results reveal that the performance of the optimal filter designed by the proposed SA-PSO method is improved. Further, the convergence behavior and optimization accuracy of the proposed method have been improved significantly, and the computational time is also reduced. In addition, the proposed SA-PSO method produces the best optimal solution with lower mean and variance, which indicates that the algorithm can be used more efficiently in realizing two-dimensional digital filters.
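The Metropolis acceptance criterion mentioned above can be sketched in isolation; the temperature value and the way it would be wired into a PSO update are assumptions here, not details from the paper:

```python
import math
import random

def metropolis_accept(f_new, f_old, temperature, rng):
    """Accept improving moves always; accept worse moves with
    probability exp(-(f_new - f_old) / T), which keeps the swarm diverse."""
    if f_new <= f_old:
        return True
    if temperature <= 0.0:
        return False
    return rng.random() < math.exp(-(f_new - f_old) / temperature)

rng = random.Random(1)
# Fraction of worse moves (delta = 1, T = 0.5) accepted over many trials;
# the expected acceptance rate is exp(-2), roughly 13.5%.
accepted = sum(metropolis_accept(2.0, 1.0, 0.5, rng) for _ in range(10000))
```

In a hybrid like SA-PSO, such a test would typically gate whether a particle adopts a candidate position that worsens its personal best.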
Directory of Open Access Journals (Sweden)
Jingwei Song
2014-01-01
Full Text Available A simulated annealing (SA) based variable-weighted forecast model is proposed to combine and weigh a local chaotic model, an artificial neural network (ANN), and a partial least square support vector machine (PLS-SVM) to build a more accurate forecast model. The hybrid model was built, and its multistep-ahead prediction ability was tested, on daily MSW generation data from Seattle, Washington, the United States. The hybrid forecast model proved to produce more accurate and reliable results and to degrade less in longer predictions than the three individual models. The average one-week-ahead prediction error has been reduced from 11.21% (chaotic model), 12.93% (ANN), and 12.94% (PLS-SVM) to 9.38%. The five-week average has been reduced from 13.02% (chaotic model), 15.69% (ANN), and 15.92% (PLS-SVM) to 11.27%.
Directory of Open Access Journals (Sweden)
Wenbo Wu
2014-01-01
Full Text Available This paper addresses the problem of task allocation in real-time distributed systems with the goal of maximizing system reliability, which has been shown to be NP-hard. We take the deadline constraint into account to formulate this problem and then propose an algorithm called chaotic adaptive simulated annealing (XASA) to solve it. Firstly, XASA begins with chaotic optimization, which takes a chaotic walk in the solution space and generates several local minima; secondly, XASA improves the SA algorithm via several adaptive schemes and continues to search for the optimum based on the results of the chaotic optimization. The effectiveness of XASA is evaluated by comparison with the traditional SA algorithm and an improved SA algorithm. The results show that XASA can achieve a satisfactory speedup without loss of solution quality.
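The chaotic-walk stage of such hybrids is often driven by the logistic map; the following sketch (an assumption about how XASA might seed its search, not the authors' code) uses it to scatter diverse starting points across a search box before annealing begins:

```python
def logistic_map_sequence(x0, n, mu=4.0):
    """Generate a chaotic sequence in (0, 1) with the logistic map x <- mu*x*(1-x)."""
    xs, x = [], x0
    for _ in range(n):
        x = mu * x * (1.0 - x)
        xs.append(x)
    return xs

def chaotic_starts(bounds, count, x0=0.7):
    """Map a chaotic sequence onto the search box to get diverse starting points."""
    dim = len(bounds)
    seq = logistic_map_sequence(x0, count * dim)
    starts = []
    for k in range(count):
        point = [lo + seq[k * dim + d] * (hi - lo) for d, (lo, hi) in enumerate(bounds)]
        starts.append(point)
    return starts

starts = chaotic_starts([(-5.0, 5.0), (-5.0, 5.0)], 4)
```

The ergodicity of the logistic map at mu = 4 spreads the seeds over the box without the clumping a plain pseudorandom draw can produce for small sample counts.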
Louie, J. N.; Basler-Reeder, K.; Kent, G. M.; Pullammanappallil, S. K.
2015-12-01
Simultaneous joint seismic-gravity optimization improves P-wave velocity models in areas with sharp lateral velocity contrasts. Optimization is achieved using simulated annealing, a metaheuristic global optimization algorithm that does not require an accurate initial model. Balancing the seismic-gravity objective function is accomplished by a novel approach based on analysis of Pareto charts. Gravity modeling uses a newly developed convolution algorithm, while seismic modeling utilizes the highly efficient Vidale eikonal equation traveltime generation technique. Synthetic tests show that joint optimization improves velocity model accuracy and provides velocity control below the deepest headwave raypath. Detailed first arrival picking followed by trial velocity modeling remediates inconsistent data. We use a set of highly refined first arrival picks to compare results of a convergent joint seismic-gravity optimization to the Plotrefa™ and SeisOpt® Pro™ velocity modeling packages. Plotrefa™ uses a nonlinear least squares approach that is initial-model dependent and produces shallow velocity artifacts. SeisOpt® Pro™ utilizes the simulated annealing algorithm and is limited to depths above the deepest raypath. Joint optimization increases the depth of constrained velocities, improving reflector coherency at depth. Kirchhoff prestack depth migrations reveal that joint optimization ameliorates shallow velocity artifacts caused by limitations in refraction ray coverage. Seismic and gravity data from the San Emidio Geothermal field of the northwest Basin and Range province demonstrate that joint optimization changes interpretation outcomes. The prior shallow-valley interpretation gives way to a deep valley model, while shallow antiformal reflectors that could have been interpreted as antiformal folds are flattened. Furthermore, joint optimization provides a clearer image of the rangefront fault. This technique can readily be applied to existing datasets and could
Directory of Open Access Journals (Sweden)
Min Wang
2017-01-01
Full Text Available PFC2D(3D) is commercial software that is commonly used to model crack initiation in rock and rock-like materials. For a PFC2D(3D) numerical simulation, a proper set of microparameters needs to be determined before the simulation. To obtain a proper set of microparameters for a PFC2D(3D) model based on the macroparameters obtained from physical experiments, a novel technique is presented in this paper. An improved simulated annealing algorithm was employed to calibrate the microparameters of the PFC2D(3D) numerical simulation model. A Python script completely controls the calibration process, which can terminate automatically based on a termination criterion. The calibration process is not based on establishing a relationship between microparameters and macroparameters; instead, the microparameters are calibrated according to the improved simulated annealing algorithm. Using the proposed approach, the microparameters of both the contact-bond model and the parallel-bond model in PFC2D(3D) can be determined. To verify the validity of calibrating PFC2D(3D) microparameters via the improved simulated annealing algorithm, some examples were selected from the literature. The corresponding numerical simulations were performed, and the results indicated that the proposed method is reliable for calibrating the microparameters of a PFC2D(3D) model.
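A calibration loop of the kind described, SA adjusting microparameters until the simulated macroparameters match measured ones, might be sketched as follows; `forward_model` stands in for an actual PFC2D(3D) run, and the toy model, move size, and termination tolerance are illustrative assumptions, not details from the paper:

```python
import math
import random

def macro_mismatch(micro, target_macro, forward_model):
    """Summed relative error between simulated and measured macroparameters."""
    simulated = forward_model(micro)
    return sum(abs(s - t) / abs(t) for s, t in zip(simulated, target_macro))

def calibrate(forward_model, target_macro, micro0, t0=1.0, cooling=0.99,
              steps=2000, tol=1e-3, seed=0):
    rng = random.Random(seed)
    micro, err = list(micro0), macro_mismatch(micro0, target_macro, forward_model)
    t = t0
    for _ in range(steps):
        if err < tol:  # termination criterion on the fit quality
            break
        # Multiplicative Gaussian perturbation of every microparameter.
        cand = [m * (1.0 + rng.gauss(0.0, 0.05)) for m in micro]
        cand_err = macro_mismatch(cand, target_macro, forward_model)
        if cand_err <= err or rng.random() < math.exp(-(cand_err - err) / t):
            micro, err = cand, cand_err
        t *= cooling
    return micro, err

# Toy stand-in for a PFC run: macroparameters as simple functions of microparameters.
toy_model = lambda m: (2.0 * m[0] + m[1], m[0] * m[1])
micro, err = calibrate(toy_model, target_macro=(5.0, 2.0), micro0=(1.0, 1.0))
```

In the real setting each `forward_model` call is an expensive particle-flow simulation, which is why the automatic-termination criterion matters.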
Indian Academy of Sciences (India)
KAMAL DEEP; PARDEEP K SINGH
2016-09-01
In this paper, an integrated mathematical model of multi-period cell formation and part-operation tradeoff in a dynamic cellular manufacturing system is proposed in consideration of multiple part process routes. The paper puts emphasis on production flexibility (production/subcontracting of part operations) to satisfy product demand requirements in different period segments of the planning horizon, considering production capacity shortage and/or sudden machine breakdown. The proposed model simultaneously generates machine cells and part families and selects the optimum process route instead of the user specifying predetermined routes. Conventional optimization methods for the optimal cell formation problem require a substantial amount of time and memory space. Hence, a simulated annealing based genetic algorithm is proposed to explore the solution space efficiently and to expedite the search. To evaluate the computability of the proposed algorithm, different problem scenarios are adopted from the literature. The results confirm the effectiveness of the proposed approach in designing the manufacturing cells and minimizing the overall cost, considering various manufacturing aspects such as production volume, multiple process routes, production capacity, machine duplication, system reconfiguration, material handling, and subcontracting of part operations.
Directory of Open Access Journals (Sweden)
M. Madić
2013-09-01
Full Text Available This paper presents a systematic methodology for empirical modeling and optimization of surface roughness in nitrogen-assisted CO2 laser cutting of stainless steel. The surface roughness prediction model was developed in terms of laser power, cutting speed, assist gas pressure, and focus position by using an artificial neural network (ANN). To cover a wider range of laser cutting parameters and obtain an experimental database for the ANN model development, Taguchi's L27 orthogonal array was implemented in the experimental plan. The developed ANN model was expressed as an explicit nonlinear function, while the influence of laser cutting parameters and their interactions on surface roughness was analyzed by generating 2D and 3D plots. The final goal of the experimental study focuses on the determination of the optimum laser cutting parameters for the minimization of surface roughness. Since the solution space of the developed ANN model is complex, and the possibility of many local solutions is great, simulated annealing (SA) was selected as the method for the optimization of surface roughness.
Elemental thin film depth profiles by ion beam analysis using simulated annealing - a new tool
Energy Technology Data Exchange (ETDEWEB)
Jeynes, C [University of Surrey Ion Beam Centre, Guildford, GU2 7XH (United Kingdom); Barradas, N P [Instituto Tecnologico e Nuclear, E.N. 10, Sacavem (Portugal); Marriott, P K [Department of Statistics, National University of Singapore, Singapore (Singapore); Boudreault, G [University of Surrey Ion Beam Centre, Guildford, GU2 7XH (United Kingdom); Jenkin, M [School of Electronics Computing and Mathematics, University of Surrey, Guildford (United Kingdom); Wendler, E [Friedrich-Schiller-Universitaet Jena, Institut fuer Festkoerperphysik, Jena (Germany); Webb, R P [University of Surrey Ion Beam Centre, Guildford, GU2 7XH (United Kingdom)
2003-04-07
Rutherford backscattering spectrometry (RBS) and related techniques have long been used to determine the elemental depth profiles in films a few nanometres to a few microns thick. However, although obtaining spectra is very easy, solving the inverse problem of extracting the depth profiles from the spectra is not possible analytically except for special cases. It is because these special cases include important classes of samples, and because skilled analysts are adept at extracting useful qualitative information from the data, that ion beam analysis is still an important technique. We have recently solved this inverse problem using the simulated annealing algorithm. We have implemented the solution in the 'IBA DataFurnace' code, which has been developed into a very versatile and general new software tool that analysts can now use to rapidly extract quantitative accurate depth profiles from real samples on an industrial scale. We review the features, applicability and validation of this new code together with other approaches to handling IBA (ion beam analysis) data, with particular attention being given to determining both the absolute accuracy of the depth profiles and statistically accurate error estimates. We include examples of analyses using RBS, non-Rutherford elastic scattering, elastic recoil detection and non-resonant nuclear reactions. High depth resolution and the use of multiple techniques simultaneously are both discussed. There is usually systematic ambiguity in IBA data and Butler's example of ambiguity (1990 Nucl. Instrum. Methods B 45 160-5) is reanalysed. Analyses are shown: of evaporated, sputtered, oxidized, ion implanted, ion beam mixed and annealed materials; of semiconductors, optical and magnetic multilayers, superconductors, tribological films and metals; and of oxides on Si, mixed metal silicides, boron nitride, GaN, SiC, mixed metal oxides, YBCO and polymers. (topical review)
Validation of Mission Plans Through Simulation
St-Pierre, J.; Melanson, P.; Brunet, C.; Crabtree, D.
2002-01-01
The purpose of a spacecraft mission planning system is to automatically generate safe and optimized mission plans for a single spacecraft, or for several functioning in unison. The system verifies user input syntax, conformance to commanding constraints, absence of duty cycle violations, timing conflicts, state conflicts, etc. Present-day constraint-based systems with state-based predictive models use verification rules derived from expert knowledge. A familiar solution found in Mission Operations Centers is to complement the planning system with a high-fidelity spacecraft simulator. Often a dedicated workstation, the simulator is frequently used for operator training and procedure validation, and may be interfaced to actual control stations with command and telemetry links. While there are distinct advantages to having a planning system offer realistic operator training using the actual flight control console, physical verification of data transfer across layers, and procedure validation, experience has revealed some drawbacks and inefficiencies in ground segment operations. With these considerations, two simulation-based mission plan validation projects are under way at the Canadian Space Agency (CSA): RVMP and ViSION. The tools proposed in these projects will automatically run scenarios and provide execution reports to operations planning personnel prior to actual command upload. This can provide an important safeguard against system or human errors that can only be detected with high-fidelity, interdependent spacecraft models running concurrently. The core element common to these projects is a spacecraft simulator, built with off-the-shelf components such as CAE's Real-Time Object-Based Simulation Environment (ROSE) technology, MathWork's MATLAB/Simulink, and Analytical Graphics' Satellite Tool Kit (STK). To complement these tools, additional components were developed, such as an emulated Spacecraft Test and Operations Language (STOL) interpreter and CCSDS TM
Lutsyshyn, Yaroslav
2016-01-01
We developed a CUDA-based parallelization of the annealing method for the inverse Laplace transform problem. The algorithm is based on an annealing procedure that minimizes the residue of the reconstruction of the spectral function. We introduce local updates which preserve the first two sum rules and allow an efficient parallel CUDA implementation. Annealing is performed with the Monte Carlo method on a population of Markov walkers. We propose an imprinted branching method to further improve the convergence of the anneal. The algorithm is tested on a truncated double-peak Lorentzian spectrum, with examples of how error in the input data affects the reconstruction.
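Annealing a population of independent Metropolis walkers, as described above, can be sketched serially; since each walker's update is independent, the inner loop maps naturally onto one GPU thread per walker. The objective, schedule, and parameters below are illustrative assumptions, not the paper's:

```python
import math
import random

def population_anneal(f, starts, t0=5.0, cooling=0.95, rounds=200, step=0.3, seed=0):
    """Anneal a population of independent Metropolis walkers.
    Serial here; each walker's update touches no shared state, so the
    inner loop parallelizes trivially (e.g. one CUDA thread per walker)."""
    rng = random.Random(seed)
    walkers = [(list(x), f(x)) for x in starts]
    t = t0
    for _ in range(rounds):
        new_walkers = []
        for x, fx in walkers:
            cand = [xi + rng.gauss(0.0, step) for xi in x]
            fc = f(cand)
            if fc <= fx or rng.random() < math.exp(-(fc - fx) / t):
                x, fx = cand, fc
            new_walkers.append((x, fx))
        walkers = new_walkers
        t *= cooling  # shared temperature schedule across the population
    return min(walkers, key=lambda w: w[1])

sphere = lambda x: sum(xi * xi for xi in x)
best, fbest = population_anneal(sphere, [[2.0, 2.0], [-3.0, 1.0], [0.5, -2.5]])
```

The paper's imprinted-branching step, which reweights the walker population, would slot in between rounds; it is omitted here.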
A Simulated Annealing based Optimization Algorithm for Automatic Variogram Model Fitting
Soltani-Mohammadi, Saeed; Safa, Mohammad
2016-09-01
Fitting a theoretical model to an experimental variogram is an important issue in geostatistical studies because if the variogram model parameters are tainted with uncertainty, the latter will spread into the results of estimations and simulations. Although the most popular fitting method is fitting by eye, in some cases automatic fitting is used, combining geostatistical principles with optimization techniques to: 1) provide a basic model to improve fitting by eye, 2) fit a model to a large number of experimental variograms in a short time, and 3) incorporate the variogram-related uncertainty in the model fitting. Effort has been made in this paper to improve the quality of the fitted model by improving the popular objective function (weighted least squares) in automatic fitting. Also, since the variogram model function and the number of structures (m) also affect the model quality, a program has been provided in the MATLAB software that can present optimum nested variogram models using the simulated annealing method. Finally, to select the most desirable model from among the single- and multi-structured fitted models, the cross-validation method has been used, and the best model is introduced to the user as the output. To check the capability of the proposed objective function and procedure, 3 case studies are presented.
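A common baseline for the weighted least squares objective mentioned above is the Cressie form over a chosen variogram model; the following sketch uses the standard textbook spherical model and pair-count weights (not necessarily the paper's exact formulation) to show the quantity an SA fitter would minimize over (nugget, sill, range):

```python
def spherical_variogram(h, nugget, sill, rng_a):
    """Spherical model: rises to nugget + sill at range rng_a, flat beyond."""
    if h >= rng_a:
        return nugget + sill
    r = h / rng_a
    return nugget + sill * (1.5 * r - 0.5 * r ** 3)

def weighted_least_squares(params, lags, gammas, pair_counts):
    """Cressie-style WLS: weight each lag by pair count over squared model value,
    so well-sampled short lags dominate the fit."""
    nugget, sill, rng_a = params
    total = 0.0
    for h, g, n in zip(lags, gammas, pair_counts):
        m = spherical_variogram(h, nugget, sill, rng_a)
        total += n * (g - m) ** 2 / (m ** 2)
    return total

# Hypothetical experimental variogram: lag distances, semivariances, pair counts.
lags = [10.0, 20.0, 30.0, 40.0]
gammas = [0.4, 0.7, 0.9, 1.0]
counts = [120, 110, 90, 60]
score = weighted_least_squares((0.1, 0.9, 35.0), lags, gammas, counts)
```

An SA fitter would perturb the three parameters and accept or reject moves by this score; nested models simply sum several such structures.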
Energy Technology Data Exchange (ETDEWEB)
Chiapetto, M. [SCK-CEN, Nuclear Materials Science Institute, Mol (Belgium); Unite Materiaux et Transformations (UMET), UMR 8207, Universite de Lille 1, ENSCL, Villeneuve d' Ascq (France); Becquart, C.S. [Unite Materiaux et Transformations (UMET), UMR 8207, Universite de Lille 1, ENSCL, Villeneuve d' Ascq (France); Laboratoire commun EDF-CNRS, Etude et Modelisation des Microstructures pour le Vieillissement des Materiaux (EM2VM) (France); Domain, C. [EDF R and D, Departement Materiaux et Mecanique des Composants, Les Renardieres, Moret sur Loing (France); Laboratoire commun EDF-CNRS, Etude et Modelisation des Microstructures pour le Vieillissement des Materiaux (EM2VM) (France); Malerba, L. [SCK-CEN, Nuclear Materials Science Institute, Mol (Belgium)
2015-01-01
Post-irradiation annealing experiments are often used to obtain clearer information on the nature of defects produced by irradiation. However, their interpretation is not always straightforward without the support of physical models. We apply here a physically-based set of parameters for object kinetic Monte Carlo (OKMC) simulations of the nanostructural evolution of FeMnNi alloys under irradiation to the simulation of their post-irradiation isochronal annealing, from 290 to 600 °C. The model adopts a "grey alloy" scheme, i.e. the solute atoms are not introduced explicitly; only their effect on the properties of point-defect clusters is. Namely, it is assumed that both vacancy and SIA clusters are significantly slowed down by the solutes. The slowing down increases with size until the clusters become immobile. Specifically, the slowing down of SIA clusters by Mn and Ni can be justified in terms of the interaction between these atoms and crowdions in Fe. The results of the model compare quantitatively well with post-irradiation isochronal annealing experimental data, providing clear insight into the mechanisms that determine the disappearance or re-arrangement of defects as functions of annealing time and temperature. (copyright 2015 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)
Directory of Open Access Journals (Sweden)
Helio Yochihiro Fuchigami
2014-08-01
Full Text Available This article addresses the problem of minimizing makespan on two parallel flow shops with proportional processing and setup times. The setup times are separate and sequence-independent. The parallel flow shop scheduling problem is a specific case of the well-known hybrid flow shop, characterized by a multistage production system with more than one machine working in parallel at each stage. This situation is very common in various kinds of companies, such as chemical, electronics, automotive, pharmaceutical, and food industries. This work proposes six Simulated Annealing algorithms, their perturbation schemes, and an algorithm for initial sequence generation. The study can be classified as “applied research” regarding its nature, “exploratory” regarding its objectives, and “experimental” regarding its procedures, with a “quantitative” approach. The proposed algorithms were effective regarding solution quality and computationally efficient. Results of Analysis of Variance (ANOVA) revealed no significant difference between the schemes in terms of makespan. The PS4 scheme, which moves a subsequence of jobs, is suggested for providing the best percentage of success. It was also found that there is a significant difference between the results of the algorithms for each value of the proportionality factor of the processing and setup times of the flow shops.
Directory of Open Access Journals (Sweden)
Ümmühan Başaran Filik
2010-01-01
Full Text Available This paper presents the solution of the unit commitment (UC) problem using the Modified Subgradient (MSG) method combined with the Simulated Annealing (SA) algorithm. The UC problem is one of the important hard-to-solve power system engineering problems. Lagrangian relaxation (LR) based methods are commonly used to solve the UC problem. The main disadvantage of this group of methods is the gap between the dual and the primal solution, which causes significant problems in the quality of the feasible solution. In this paper, the MSG method, which does not require any convexity or differentiability assumptions, is used for solving the UC problem. Depending on the initial value, the MSG method reaches a zero duality gap. The SA algorithm is used to assign an appropriate initial value for the MSG method. The major advantage of the proposed approach is that it guarantees a zero duality gap independently of the size of the problem. In order to show the advantages of this proposed approach, the four-unit Tuncbilek thermal plant and a ten-unit thermal plant commonly used in the literature are chosen as test systems. The penalty function (PF) method is also used for comparison with our proposed method in terms of total cost and UC schedule.
Wu, Zujian; Pang, Wei; Coghill, George M
2015-01-01
Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that our proposed integrative framework is feasible to learn the relationships between biochemical reactants qualitatively and to make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned from the proposed framework, biologists can further perform experimental study in wet laboratory. In this way, natural biochemical systems can be better understood.
Directory of Open Access Journals (Sweden)
A. Mateos
2016-01-01
Full Text Available Technological advances are required to accommodate air traffic control systems to the future growth of air traffic. In particular, detection and resolution of conflicts between aircraft is a problem that has attracted much attention in the last decade, becoming vital to improving safety standards in free-flight unstructured environments. We propose using the archived simulated annealing-based multiobjective optimization algorithm to deal with this problem, accounting for three admissible maneuvers (velocity, turn, and altitude changes) in a multiobjective context. The minimization of the number and magnitude of maneuvers, time delays, and deviations in the leaving points is considered for analysis. The optimal values for the algorithm parameter set are identified in the most complex instance, in which all aircraft have conflicts with each other, accounting for 5, 10, and 20 aircraft. Moreover, the performance of the proposed approach is analyzed by means of a comparison with the Pareto front, computed using brute force for 5 aircraft, and the algorithm is also illustrated with a random instance with 20 aircraft.
Simulated Annealing-Based Ant Colony Algorithm for Tugboat Scheduling Optimization
Directory of Open Access Journals (Sweden)
Qi Xu
2012-01-01
Full Text Available As the “first service station” for ships in the whole port logistics system, the tugboat operation system is one of the most important systems in port logistics. This paper formulates the tugboat scheduling problem as a multiprocessor task scheduling problem (MTSP) after analyzing the characteristics of tugboat operation. The model considers multiple anchorage bases, different operation modes, and three stages of operations (berthing/shifting-berth/unberthing). The objective is to minimize the total operation time for all tugboats in a port. A hybrid simulated annealing-based ant colony algorithm is proposed to solve the problem. Numerical experiments without the shifting-berth operation verified the effectiveness of the algorithm and showed that more effective sailing may be possible if tugboats return to the anchorage base in a timely manner; experiments with the shifting-berth operation showed that the objective is most sensitive to the proportion of shifting-berth operations, influenced slightly by the tugboat deployment scheme, and not sensitive to the handling operation times.
A memory structure adapted simulated annealing algorithm for a green vehicle routing problem.
Küçükoğlu, İlker; Ene, Seval; Aksoy, Aslı; Öztürk, Nursel
2015-03-01
Currently, reduction of carbon dioxide (CO2) emissions and fuel consumption has become a critical environmental problem and has attracted the attention of both academia and the industrial sector. Government regulations and customer demands are making environmental responsibility an increasingly important factor in overall supply chain operations. Within these operations, transportation has the most hazardous effects on the environment, i.e., CO2 emissions, fuel consumption, noise and toxic effects on the ecosystem. This study aims to construct vehicle routes with time windows that minimize the total fuel consumption and CO2 emissions. The green vehicle routing problem with time windows (G-VRPTW) is formulated using a mixed integer linear programming model. A memory structure adapted simulated annealing (MSA-SA) meta-heuristic algorithm is constructed due to the high complexity of the proposed problem and long solution times for practical applications. The proposed models are integrated with a fuel consumption and CO2 emissions calculation algorithm that considers the vehicle technical specifications, vehicle load, and transportation distance in a green supply chain environment. The proposed models are validated using well-known instances with different numbers of customers. The computational results indicate that the MSA-SA heuristic is capable of obtaining good G-VRPTW solutions within a reasonable amount of time by providing reductions in fuel consumption and CO2 emissions.
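The memory structure adapted into SA is described only at a high level in the abstract; one plausible reading, sketched below on a toy ordering problem (the memory size, schedule, and toy objective are assumptions, not the paper's design), keeps a bounded list of recently visited solutions and rejects candidates found in it, discouraging cycling:

```python
import math
import random

def sa_with_memory(f, x0, neighbor, t0=10.0, cooling=0.99, steps=5000,
                   memory_size=50, seed=0):
    """SA variant with a bounded memory of recently accepted solutions;
    candidates already in memory are rejected, discouraging cycling."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    memory = [x]
    t = t0
    for _ in range(steps):
        cand = neighbor(x, rng)
        if cand in memory:  # skip solutions seen recently
            continue
        fc = f(cand)
        if fc <= fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            memory.append(x)
            if len(memory) > memory_size:
                memory.pop(0)  # forget the oldest entry
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest

# Toy ordering problem: arrange values to minimize the summed adjacent gaps
# (stand-in for a route length; the optimum is the sorted order).
cities = [3, 1, 4, 1, 5, 9, 2, 6]
tour_len = lambda perm: sum(abs(perm[i] - perm[i + 1]) for i in range(len(perm) - 1))

def swap_two(perm, rng):
    i, j = rng.sample(range(len(perm)), 2)
    p = list(perm)
    p[i], p[j] = p[j], p[i]
    return tuple(p)

best, fbest = sa_with_memory(tour_len, tuple(cities), swap_two)
```

The real G-VRPTW objective would replace `tour_len` with the fuel-and-emissions model the paper couples to vehicle load and distance.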
SFN Coverage Optimization of DVB-T2 Digital TV Transmitters Using the Simulated Annealing Method
Directory of Open Access Journals (Sweden)
Adib Nur Ikhwan
2013-09-01
Full Text Available The digital TV broadcasting to be deployed in Indonesia initially used the DVB-T (Digital Video Broadcasting-Terrestrial) standard, which was replaced in 2012 by DVB-T2 (Digital Video Broadcasting-Terrestrial Second Generation). Previous studies, including digital TV coverage optimization, are therefore no longer relevant. Coverage is one of the important aspects of digital TV broadcasting. In this final project, SFN (Single Frequency Network) coverage optimization for digital TV transmitters is carried out with the SA (Simulated Annealing) method. The SA method searches for a solution by moving from one solution to another, selecting the solution with the smallest energy (fitness) function. The SA optimization is performed by varying the positions of the digital TV transmitters to obtain the best placement. The optimization uses 10 cooling schedules with two test runs each, in both the 2K and 4K FFT modes. The result of this study is that the SFN coverage of the DVB-T2 digital TV transmitters achieves a best average relative coverage improvement of 2.348% with cooling schedule 7.
Optimization Of Thermo-Electric Coolers Using Hybrid Genetic Algorithm And Simulated Annealing
Directory of Open Access Journals (Sweden)
Khanh Doan V.K.
2014-06-01
Full Text Available Thermo-electric Coolers (TECs) are nowadays applied in a wide range of thermal energy systems. This is due to their superior features: no refrigerant or moving parts are needed, they generate no electrical or acoustical noise, and they are environmentally friendly. Over the past decades, many studies have aimed to improve the efficiency of TECs by enhancing the material parameters and design parameters. The material parameters are restricted by currently available materials and module fabrication technologies. Therefore, the main objective of TEC design is to determine a set of design parameters such as leg area, leg length, and the number of legs. Two quantities that play an important role when considering the suitability of TECs in applications are the rate of refrigeration (ROR) and the coefficient of performance (COP). In this paper, a review of previous research is first conducted to show the diversity of optimization approaches in TEC design for enhancing performance and efficiency. Then, single-objective optimization problems (SOP) are solved using a Genetic Algorithm (GA) and Simulated Annealing (SA) to optimize the geometric properties so that TECs operate at near-optimal conditions. Both equality and inequality constraints are taken into consideration.
Simulated annealing (SA) for vehicle routing problems with soft time windows
Directory of Open Access Journals (Sweden)
Suphan Sodsoon
2014-12-01
Full Text Available The researchers applied and developed a meta-heuristic method to solve Vehicle Routing Problems with Soft Time Windows (VRPSTW). The case involves a single depot and multiple, generally sparse customers whose demands and service time windows differ. This combinatorial optimization problem is known to be NP-hard. The algorithm uses Simulated Annealing (SA) to determine near-optimal solutions in short solving times. After developing the algorithm, it is applied to examine the relevant factors and the optimal extension of the time windows, tested on Solomon's vehicle routing instances with time windows from the OR-Library with up to 25 customers, covering six problems: C101, C102, R101, R102, RC101, and RC102. The results show the optimal extension of the time windows to be 50%. Finally, comparing these solutions with the cases of hard and flexible time windows shows errors of approximately -28.57% in the number of vehicles and -28.57% in distance, with an average processing time of 45.5 s per problem.
Kang, Jiyoung; Yamasaki, Kazuhiko; Sano, Kuniaki; Tsutsui, Ken; Tsutsui, Kimiko M.; Tateno, Masaru
2017-01-01
Theoretical analyses of multivariate data have become increasingly important in various scientific disciplines. The multivariate curve resolution alternating least-squares (MCR-ALS) method is an integrated and systematic tool to decompose various types of spectral data into several pure spectra corresponding to distinct species. However, in the present study, the MCR-ALS calculation provided only unreasonable solutions when used to process the circular dichroism spectra of double-stranded DNA (228 bp) in complex with a DNA-binding peptide at various concentrations. To resolve this problem, we developed an algorithm that includes a simulated annealing (SA) protocol (the SA-MCR-ALS method) to facilitate the expansion of the sampling space. The analysis successfully decomposed the aforementioned data into three reasonable pure spectra. Thus, our SA-MCR-ALS scheme provides a useful tool for effective extended sampling, to investigate the substantial and detailed properties of various forms of multivariate data whose many degrees of freedom pose significant difficulties.
Qin, Jin; Xiang, Hui; Ye, Yong; Ni, Linglin
2015-01-01
A stochastic multiproduct capacitated facility location problem involving a single supplier and multiple customers is investigated. Due to the stochastic demands, a reasonable amount of safety stock must be kept in the facilities to achieve suitable service levels, which increases the inventory cost. Assuming that all stochastic demands are normally distributed, a nonlinear mixed-integer programming model is proposed whose objective is to minimize the total cost, including transportation cost, inventory cost, operation cost, and setup cost. A combined simulated annealing (CSA) algorithm is presented to solve the model, in which the outer-layer subalgorithm optimizes the facility location decision and the inner-layer subalgorithm optimizes the demand allocation for the given facility location decision. The results obtained with this approach show that CSA is a robust and practical approach for solving a multiple-product problem, generating suboptimal facility location decisions and inventory policies. Meanwhile, we also found that the transportation cost and the demand deviation have the strongest influence on the optimal decision compared to the other factors.
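The two-layer structure described above can be sketched in Python. This is not the paper's CSA: to keep the example short, the inner layer here is an exact greedy allocation (each customer served by its cheapest open facility) rather than a second annealing loop, and all data, costs, and parameters are invented.

```python
import math
import random

def inner_cost(open_fac, customers, fac_cost, trans_cost):
    # inner layer: assign each customer to its cheapest open facility (exact here)
    if not open_fac:
        return float("inf")
    assign = sum(min(trans_cost[c][f] for f in open_fac) for c in customers)
    return assign + sum(fac_cost[f] for f in open_fac)

def outer_sa(n_fac, customers, fac_cost, trans_cost, iters=3000, seed=0):
    # outer layer: anneal over which facilities are open
    rng = random.Random(seed)
    state = {rng.randrange(n_fac)}
    cost = inner_cost(state, customers, fac_cost, trans_cost)
    best_state, best_cost, t = set(state), cost, 1.0
    for _ in range(iters):
        cand = state ^ {rng.randrange(n_fac)}   # flip one facility open/closed
        c = inner_cost(cand, customers, fac_cost, trans_cost)
        if c < cost or rng.random() < math.exp((cost - c) / t):
            state, cost = cand, c
            if c < best_cost:
                best_state, best_cost = set(cand), c
        t *= 0.998
    return best_state, best_cost

rng = random.Random(3)
trans = [[rng.uniform(1.0, 10.0) for _ in range(5)] for _ in range(20)]
fac = [20.0] * 5
sites, total = outer_sa(5, range(20), fac, trans)
```

An empty candidate set gets infinite cost, so the Metropolis test rejects it automatically (exp of minus infinity is 0.0 in Python).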
Ghosh, P; Bagchi, M C
2009-01-01
With a view to the rational design of selective quinoxaline derivatives, 2D- and 3D-QSAR models have been developed for the prediction of anti-tubercular activities. Successful implementation of a predictive QSAR model largely depends on the selection of a preferred set of molecular descriptors that can signify the chemico-biological interaction. Genetic algorithm (GA) and simulated annealing (SA) are applied as variable selection methods for model development. 2D-QSAR modeling using GA- or SA-based partial least squares (GA-PLS and SA-PLS) methods identified some topological and electrostatic descriptors as important factors for anti-tubercular activity. Kohonen networks and counter-propagation artificial neural networks (CP-ANN), with GA- and SA-based feature selection, have also been applied to QSAR modeling of the quinoxaline compounds. Out of a variable pool of 380 molecular descriptors, predictive QSAR models are developed on the training set and validated on the test set compounds, and a comparative study of the relative effectiveness of the linear and non-linear approaches has been carried out. Further analysis using the 3D-QSAR technique identifies two models, obtained by the GA-PLS and SA-PLS methods, for anti-tubercular activity prediction. The influences of the steric and electrostatic field effects, shown by the contribution plots, are discussed. The results indicate that SA is a very effective variable selection approach for such 3D-QSAR modeling.
Institute of Scientific and Technical Information of China (English)
魏关锋; 姚平经; LUO Xing; ROETZEL Wilfried
2004-01-01
The multi-stream heat exchanger network synthesis (HENS) problem can be formulated as a mixed integer nonlinear programming model following Yee et al. Its nonconvex nature leads to the existence of more than one optimum and to computational difficulty for traditional algorithms in finding the global optimum. Compared with deterministic algorithms, evolutionary computation provides a promising approach to tackle this problem. In this paper, a mathematical model of the multi-stream heat exchanger network synthesis problem is set up. Unlike the assumption of isothermal mixing of stream splits, and thus the linear constraints, of Yee et al., non-isothermal mixing is supported. As a consequence, nonlinear constraints arise and nonconvexity is added to the objective function. To solve the mathematical model, an algorithm named GA/SA (parallel genetic/simulated annealing algorithm) is detailed for application to the multi-stream heat exchanger network synthesis problem. The performance of the proposed approach is demonstrated with three examples, and the obtained solutions indicate that the presented approach is effective for multi-stream HENS.
Finding a Hadamard Matrix by Simulated Annealing of Spin-Vectors
Suksmono, Andriyan Bayu
2016-01-01
Reformulating a combinatorial problem as the optimization of a statistical-mechanics system enables finding a better solution using heuristics derived from a physical process, such as Simulated Annealing (SA). In this paper, we present a Hadamard matrix (H-matrix) searching method based on SA on an Ising model. By equivalence, an H-matrix can be converted into a Semi-normalized Hadamard (SH) matrix, whose first column is the unity vector and whose remaining columns are vectors with equal numbers of -1 and +1, called SH-vectors. We define SH spin-vectors to represent the SH vectors, which play the role of the spins of the Ising model. The topology of the lattice is generalized into a graph whose edges represent the orthogonality relationships among the SH spin-vectors. Starting from a randomly generated quasi H-matrix Q, a matrix similar to the SH-matrix but without imposed orthogonality, we perform the SA. The transitions of Q are conducted by random exchange of {+,-} spin pairs within the SH spin-vectors whi...
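The move described above (random exchange of a {+,-} spin pair within an SH spin-vector, which preserves the balance of +1 and -1) can be sketched for a tiny order-4 case. The energy function, cooling schedule, and restart policy below are invented for illustration and are not claimed to be the paper's.

```python
import itertools
import math
import random

def energy(cols):
    # sum of squared inner products over distinct columns; zero iff all orthogonal
    return sum(sum(x * y for x, y in zip(a, b)) ** 2
               for a, b in itertools.combinations(cols, 2))

def sh_anneal(n=4, sweeps=200, seed=0):
    rng = random.Random(seed)
    ones = tuple([1] * n)
    while True:                                   # restart whenever a cycle freezes
        # quasi H-matrix: first column all ones, others balanced in +1/-1
        cols = [ones] + [tuple(rng.sample([1] * (n // 2) + [-1] * (n // 2), n))
                         for _ in range(n - 1)]
        e, t = energy(cols), float(n)
        for _ in range(sweeps):
            if e == 0:
                return [list(c) for c in cols]    # columns of an H-matrix
            j = rng.randrange(1, n)               # pick a non-first SH spin-vector
            col = list(cols[j])
            p = rng.choice([i for i in range(n) if col[i] == 1])
            m = rng.choice([i for i in range(n) if col[i] == -1])
            col[p], col[m] = col[m], col[p]       # exchange a {+,-} spin pair
            cand = cols[:j] + [tuple(col)] + cols[j + 1:]
            ce = energy(cand)
            if ce <= e or rng.random() < math.exp((e - ce) / t):
                cols, e = cand, ce
            t *= 0.98

H = sh_anneal(4)
```

For order 4 the state space is tiny, so the anneal-with-restarts loop terminates almost immediately; the interest of the SA formulation is that the same move and energy scale to larger orders.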
Directory of Open Access Journals (Sweden)
Masoud Rabbani
2016-02-01
Full Text Available This paper presents the capacitated Windy Rural Postman Problem with several vehicles. For this problem, two objectives are considered: minimization of the total cost of all vehicle routes, expressed as the sum of the traversal costs, and minimization of the maximum single-route cost, in order to find a set of equitable tours for the vehicles. A mathematical formulation is provided. The multi-objective simulated annealing (MOSA) algorithm has been modified to solve this bi-objective NP-hard problem. To increase algorithm performance, the Taguchi technique is applied to design experiments for tuning the parameters of the algorithm. Numerical experiments are presented to show the efficiency of the model. Finally, the results of the MOSA have been compared with the multi-objective Cuckoo Search (MOCS) algorithm to validate the performance of the proposed algorithm. The experimental results indicate that the proposed algorithm provides good solutions and performs significantly better than the MOCS.
An adaptive evolutionary multi-objective approach based on simulated annealing.
Li, H; Landa-Silva, D
2011-01-01
In some multi-objective metaheuristic algorithms, a multi-objective optimization problem is solved by decomposing it into one or more single-objective subproblems, each corresponding to one weighted aggregation function. For example, MOEA/D is an evolutionary multi-objective optimization (EMO) algorithm that attempts to optimize multiple subproblems simultaneously by evolving a population of solutions. However, the performance of MOEA/D depends highly on the initial setting and diversity of the weight vectors. In this paper, we present an improved version of MOEA/D, called EMOSA, which incorporates an advanced local search technique (simulated annealing) and adapts the search directions (weight vectors) corresponding to the various subproblems. In EMOSA, the weight vector of each subproblem is adaptively modified at the lowest temperature in order to diversify the search toward the unexplored parts of the Pareto-optimal front. Our computational results show that EMOSA outperforms six other well-established multi-objective metaheuristic algorithms on both the (constrained) multi-objective knapsack problem and the (unconstrained) multi-objective traveling salesman problem. Moreover, the effects of the main algorithmic components and parameter sensitivities on the search performance of EMOSA are experimentally investigated.
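The weighted aggregation functions mentioned above can be made concrete with the weighted Tchebycheff scalarization, a common choice in MOEA/D-style decomposition (the numbers below are an invented two-objective illustration, not data from the paper):

```python
def tchebycheff(f, weights, z_star):
    # scalarized cost of objective vector f for the subproblem with this weight
    # vector, measured against the ideal point z_star
    return max(w * abs(fi - zi) for w, fi, zi in zip(weights, f, z_star))

# two candidate solutions and two subproblems (weight vectors), ideal point (0, 0)
f1, f2 = (1.0, 5.0), (5.0, 1.0)
z = (0.0, 0.0)
# the subproblem weighted toward objective 1 prefers f1, and vice versa
prefers_f1 = tchebycheff(f1, (0.9, 0.1), z) < tchebycheff(f2, (0.9, 0.1), z)
prefers_f2 = tchebycheff(f2, (0.1, 0.9), z) < tchebycheff(f1, (0.1, 0.9), z)
```

Because each weight vector singles out a different region of the Pareto front, adapting the weight vectors, as EMOSA does, redirects individual subproblems toward unexplored parts of the front.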
Institute of Scientific and Technical Information of China (English)
张火明; 黄赛花; 管卫兵
2014-01-01
The highest similarity of static characteristics (both the horizontal and vertical restoring force-displacement characteristics of the total mooring system, and the tension-displacement characteristics of a representative single mooring line) between the truncated and full-depth systems is obtained by an annealing simulation algorithm for hybrid discrete variables (ASFHDV for short). A "baton" optimization approach utilizing ASFHDV is proposed: after each baton of optimization, if a few dimensional variables reach their upper or lower limits, the bounds of those variables are expanded. In consideration of the experimental requirements, the length of the upper mooring line should be no smaller than 8 m, and the diameter of the anchor chain on the bottom should be larger than 0.03 m. A 100,000 t turret-moored FPSO in a water depth of 304 m, with a truncated water depth of 76 m, is taken as an example of equivalent-water-depth truncated mooring system design; the optimization is performed to obtain the configuration parameters of the truncated mooring system. The numerical results indicate that the present truncated mooring system design is successful and effective.
Back-Analysis of Tunnel Response from Field Monitoring Using Simulated Annealing
Vardakos, Sotirios; Gutierrez, Marte; Xia, Caichu
2016-12-01
This paper deals with the use of field monitoring data to improve predictions of tunnel response during and after construction from numerical models. Computational models are powerful tools for the performance-based engineering analysis and design of geotechnical structures; however, the main challenge to their use is the paucity of information for establishing the input data needed to yield reliable predictions. Field monitoring can offer not only the means to verify modeling results but also faster and more reliable ways to determine model parameters and to improve the reliability of model predictions. Back-analysis involves the determination of the parameters required in computational models from field-monitored data, and is particularly suited to underground construction, where more information about ground conditions and response becomes available as construction progresses. A crucial component of back-analysis is an algorithm to find a set of input parameters that minimizes the difference between predicted and measured performance (e.g., in terms of deformations, stresses, or tunnel support loads). Methods of back-analysis can be broadly classified as direct and gradient-based optimization techniques. An alternative methodology for carrying out the nonlinear optimization involved in back-analysis is the use of heuristic techniques. Heuristic methods refer to experience-based techniques for problem-solving, learning, and discovery that find a solution which is not guaranteed to be fully optimal, but good enough for a given set of goals. This paper focuses on the use of the heuristic simulated annealing (SA) method in the back-analysis of tunnel responses from field-monitored data. SA emulates the metallurgical processing of metals such as steel by annealing, which involves a gradual and sufficiently slow cooling of a metal from the heated phase, leading to a final material with a minimum of imperfections.
Afanasiev, M.; Pratt, R. G.; Kamei, R.; McDowell, G.
2012-12-01
Crosshole seismic tomography has been used by Vale to provide geophysical images of mineralized massive sulfides in the Eastern Deeps deposit at Voisey's Bay, Labrador, Canada. To date, these data have been processed using traveltime tomography, and we seek to improve the resolution of these images by applying acoustic Waveform Tomography. Due to the computational cost of acoustic waveform modelling, local descent algorithms are employed in Waveform Tomography; due to non-linearity, an initial model is required which predicts first-arrival traveltimes to within a half-cycle of the lowest frequency used. Because seismic velocity anisotropy can be significant in hardrock settings, the initial model must quantify the anisotropy in order to meet the half-cycle criterion. In our case study, significant velocity contrasts between the target massive sulfides and the surrounding country rock led to difficulties in generating an accurate anisotropy model through traveltime tomography, and our starting model for Waveform Tomography failed the half-cycle criterion at large offsets. We formulate a new, semi-global approach for finding the best-fit 1-D elliptical anisotropy model using simulated annealing. Through random perturbations to Thomsen's ε parameter, we explore the L2 norm of the frequency-domain phase residuals in the space of potential anisotropy models: if a perturbation decreases the residuals, it is always accepted, but if a perturbation increases the residuals, it is accepted with the probability P = exp(-(Ei-E)/T). This is the Metropolis criterion, where Ei is the value of the residuals at the current iteration, E is the value of the residuals for the previously accepted model, and T is a probability control parameter, which is decreased over the course of the simulation via a preselected cooling schedule. Convergence to the global minimum of the residuals is guaranteed only for infinitely slow cooling, but in practice good results are obtained from a variety
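The acceptance rule quoted above, P = exp(-(Ei-E)/T) with a preselected cooling schedule, is easy to demonstrate on a toy one-parameter search. The quadratic misfit below merely stands in for the frequency-domain phase residuals; the bounds, step size, and schedule are invented for illustration.

```python
import math
import random

def metropolis_search(residual, lo, hi, t0=1.0, cooling=0.995, iters=4000, seed=0):
    rng = random.Random(seed)
    eps = rng.uniform(lo, hi)                    # current anisotropy parameter
    e = residual(eps)
    best_eps, best_e, t = eps, e, t0
    for _ in range(iters):
        cand = min(hi, max(lo, eps + rng.gauss(0, 0.02)))
        ei = residual(cand)
        # Metropolis criterion: accept downhill always, uphill with exp(-(Ei-E)/T)
        if ei <= e or rng.random() < math.exp(-(ei - e) / t):
            eps, e = cand, ei
            if ei < best_e:
                best_eps, best_e = cand, ei
        t *= cooling                             # preselected geometric cooling
    return best_eps, best_e

misfit = lambda eps: (eps - 0.12) ** 2           # toy L2 misfit, minimum at 0.12
eps_hat, _ = metropolis_search(misfit, 0.0, 0.5)
```

Early on (large T) the uphill acceptance probability is close to one, so the search can escape local minima; as T shrinks the rule degenerates into pure descent.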
Wang, Jun; Zhou, Bi-hua; Zhou, Shu-dao; Sheng, Zheng
2015-01-01
The paper proposes a novel function expression method to forecast chaotic time series, using an improved genetic-simulated annealing (IGSA) algorithm to establish the optimum function expression that describes the behavior of the time series. In order to deal with the weaknesses of the genetic algorithm, the proposed algorithm incorporates the simulated annealing operation, which has strong local search ability, into the genetic algorithm to enhance the performance of the optimization; in addition, the fitness function and genetic operators are also improved. Finally, the method is applied to the chaotic time series of the Quadratic and Rossler maps for validation. The effect of noise in the chaotic time series is also studied numerically. The numerical results verify that the method can forecast chaotic time series with high precision and effectiveness, and the forecasting precision under a certain level of noise is also satisfactory. It can be concluded that the IGSA algorithm is efficient and superior.
Ismail, W.; Hassan, R.; Payne, A.; Swift, S.
2011-01-01
This paper was delivered at AIME 2011: 13th Conference on Artificial Intelligence in Medicine. This paper presents a method for the detection and classification of blast cells of the M3 subtype versus other subtypes using simulated annealing and neural networks. In this paper, we increased our test set from 10 images to 20 images. We applied Hill Climbing, Simulated Annealing, and Genetic Algorithms for detecting the blast cells. As a result, simulated annealing is the “best” heuristic search for d...
Advanced Simulation and Computing Business Plan
Energy Technology Data Exchange (ETDEWEB)
Rummel, E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
2015-07-09
To maintain a credible nuclear weapons program, the National Nuclear Security Administration’s (NNSA’s) Office of Defense Programs (DP) needs to make certain that the capabilities, tools, and expert staff are in place and are able to deliver validated assessments. This requires a complete and robust simulation environment backed by an experimental program to test ASC Program models. This ASC Business Plan document encapsulates a complex set of elements, each of which is essential to the success of the simulation component of the Nuclear Security Enterprise. The ASC Business Plan addresses the hiring, mentoring, and retaining of programmatic technical staff responsible for building the simulation tools of the nuclear security complex. The ASC Business Plan describes how the ASC Program engages with industry partners, partners on whom the ASC Program relies for today’s and tomorrow’s high-performance architectures. Each piece in this chain is essential to assure policymakers, who must make decisions based on the results of simulations, that they are receiving all the actionable information they need.
Automated integration of genomic physical mapping data via parallel simulated annealing
Energy Technology Data Exchange (ETDEWEB)
Slezak, T.
1994-06-01
The Human Genome Center at the Lawrence Livermore National Laboratory (LLNL) is nearing closure on a high-resolution physical map of human chromosome 19. We have built automated tools to assemble 15,000 fingerprinted cosmid clones into 800 contigs with minimal spanning paths identified. These islands are being ordered, oriented, and spanned by a variety of other techniques including: fluorescence in situ hybridization (FISH) at 3 levels of resolution, EcoRI restriction fragment mapping across all contigs, and a multitude of different hybridization and PCR techniques to link cosmid, YAC, BAC, PAC, and P1 clones. The FISH data provide us with partial order and distance data as well as orientation. We made the observation that map builders need a much rougher presentation of data than do map readers; the former wish to see raw data, since these can expose errors or interesting biology. We further noted that by ignoring our length and distance data we could simplify our problem into one that could be readily attacked with optimization techniques. The data integration problem could then be seen as an M x N ordering of our N cosmid clones which "intersect" M larger objects, by defining "intersection" to mean either contig/map membership or hybridization results. Clearly, the goal of making an integrated map is to rearrange the N cosmid clone "columns" such that the number of gaps in the object "rows" is minimized. Our FISH partially-ordered cosmid clones provide a set of constraints that cannot be violated by the rearrangement process. We solved the optimization problem via simulated annealing performed on a network of 40+ Unix machines in parallel, using a server/client model built on explicit socket calls. For current maps we can create a map in about 4 hours on the parallel net versus 4+ days on a single workstation. Our biologists are now using this software on a daily basis to guide their efforts toward final closure.
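The M x N column-ordering objective described above can be sketched serially (the original ran in parallel on 40+ machines; the tiny membership matrix, cooling parameters, and swap move below are invented, and the hard FISH ordering constraints are omitted for brevity):

```python
import math
import random

def total_gaps(order, rows):
    # a row contributes (number of separate runs of its member columns - 1) gaps
    gaps = 0
    for row in rows:
        runs, inside = 0, False
        for c in order:
            if row[c] and not inside:
                runs, inside = runs + 1, True
            elif not row[c]:
                inside = False
        gaps += max(0, runs - 1)
    return gaps

def order_clones(rows, n, iters=20000, t0=2.0, alpha=0.9997, seed=0):
    rng = random.Random(seed)
    order = list(range(n))
    rng.shuffle(order)
    e, t = total_gaps(order, rows), t0
    best, best_e = order[:], e
    for _ in range(iters):
        i, j = rng.sample(range(n), 2)
        order[i], order[j] = order[j], order[i]      # swap two clone columns
        ce = total_gaps(order, rows)
        if ce <= e or rng.random() < math.exp((e - ce) / t):
            e = ce
            if ce < best_e:
                best, best_e = order[:], ce
        else:
            order[i], order[j] = order[j], order[i]  # undo the rejected swap
        t *= alpha
    return best, best_e

N = 8
# membership rows built from overlapping intervals, so a zero-gap order exists
rows = [[1 if lo <= c <= hi else 0 for c in range(N)]
        for lo, hi in [(0, 3), (2, 5), (4, 7)]]
order, gaps = order_clones(rows, N)
```

In the real system, moves violating the FISH partial order would simply be rejected before scoring, which is the usual way SA handles hard constraints.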
Directory of Open Access Journals (Sweden)
Maurer Till
2005-04-01
Full Text Available Abstract Background We have developed the program PERMOL for semi-automated homology modeling of proteins. It is based on restrained molecular dynamics using a simulated annealing protocol in torsion angle space. As the main restraints defining the optimal local geometry of the structure, weighted mean dihedral angles and their standard deviations are used, calculated with an algorithm described earlier by Döker et al. (1999, BBRC, 257, 348–350). The overall long-range contacts are established via a small number of distance restraints between atoms involved in hydrogen bonds and backbone atoms of conserved residues. Employing the restraints generated by PERMOL, three-dimensional structures are obtained using standard molecular dynamics programs such as DYANA or CNS. Results To test this modeling approach it has been used for predicting the structure of the histidine-containing phosphocarrier protein HPr from E. coli and the structure of the human peroxisome proliferator-activated receptor γ (PPARγ). The divergence between the modeled HPr and the previously determined X-ray structure was comparable to the divergence between the X-ray structure and the published NMR structure. The modeled structure of PPARγ was also very close to the previously solved X-ray structure, with an RMSD of 0.262 nm for the backbone atoms. Conclusion In summary, we present a new method for homology modeling capable of producing high-quality structure models. An advantage of the method is that it can be used in combination with incomplete NMR data to obtain reasonable structure models in accordance with the experimental data.
Corazza, S; Mündermann, L; Chaudhari, A M; Demattio, T; Cobelli, C; Andriacchi, T P
2006-06-01
Human motion capture is frequently used to study musculoskeletal biomechanics and clinical problems, as well as to provide realistic animation for the entertainment industry. The most popular technique for human motion capture uses markers placed on the skin, despite some important drawbacks, including the impediment to motion by the presence of skin markers and relative movement between the skin where the markers are placed and the underlying bone. The latter makes it difficult to estimate the motion of the underlying bone, which is the variable of interest for biomechanical and clinical applications. A model-based markerless motion capture system is presented in this study, which does not require the placement of any markers on the subject's body. The described method is based on visual hull reconstruction and an a priori model of the subject. A custom version of adapted fast simulated annealing has been developed to match the model to the visual hull. The tracking capability and a quantitative validation of the method were evaluated in a virtual environment for a complete gait cycle. The obtained mean errors, for an entire gait cycle, for knee and hip flexion are respectively 1.5° (±3.9°) and 2.0° (±3.0°), while for knee and hip adduction they are respectively 2.0° (±2.3°) and 1.1° (±1.7°). Results for the ankle and shoulder joints are also presented. Experimental results captured in a gait laboratory with a real subject are also shown to demonstrate the effectiveness and potential of the presented method in a clinical environment.
Salter, Bill Jean, Jr.
Purpose. The advent of new, so-called IVth-generation external beam radiation therapy treatment machines (e.g., Scanditronix's MM50 Racetrack Microtron) has raised the question of how the capabilities of these new machines might be exploited to produce extremely conformal dose distributions. Such machines can produce electron energies as high as 50 MeV and, due to their scanned-beam delivery of electron treatments, can modulate intensity, and even energy, within a broad field. Materials and methods. Two patients with 'challenging' tumor geometries were selected from the patient archives of the Cancer Therapy and Research Center (CTRC) in San Antonio, Texas. The treatment scheme that was tested allowed for twelve energy- and intensity-modulated beams, equally spaced about the patient; only intensity was modulated for the photon treatment. The elementary beams, incident from any of the twelve allowed directions, were assumed parallel, and the elementary electron beams were modeled by elementary beam data. The optimal arrangement of elementary beam energies and/or intensities was determined by Szu-Hartley Fast Simulated Annealing optimization. Optimized treatment plans were determined for each patient using both the high-energy, intensity- and energy-modulated electron (HIEME) modality and the 6 MV photon modality. The 'quality' of rival plans was scored using three different, popular objective functions: Root Mean Square (RMS), Maximize Dose Subject to Dose and Volume Limitations (MDVL; Morrill et al.), and Probability of Uncomplicated Tumor Control (PUTC). The scores of the two optimized treatments (i.e., HIEME and intensity-modulated photons) were compared to the score of the conventional plan with which the patient was actually treated. Results. The first patient presented a deeply located target volume partially surrounding the spinal cord. A healthy right kidney was immediately adjacent to the tumor volume, separated
Lee, Cheng-Kuang
2014-12-10
The nanomorphologies of the bulk heterojunction (BHJ) layer of polymer solar cells are extremely sensitive to the electrode materials and thermal annealing conditions. In this work, the correlations between electrode materials, thermal annealing sequences, and the resulting BHJ nanomorphological details of P3HT:PCBM BHJ polymer solar cells are studied by a series of large-scale, coarse-grained (CG) molecular simulations of a system composed of PEDOT:PSS/P3HT:PCBM/Al layers. Simulations are performed for various configurations of electrode materials as well as processing temperatures. The complex CG molecular data are characterized using a novel extension of our graph-based framework to quantify morphology and establish a link between morphology and processing conditions. Our analysis indicates that vertical phase segregation of the P3HT:PCBM blend strongly depends on the electrode material and thermal annealing schedule. A thin P3HT-rich film is formed on top, regardless of the bottom electrode material, when the BHJ layer is exposed to the free surface during thermal annealing. In addition, preferential segregation of P3HT chains and PCBM molecules toward the PEDOT:PSS and Al electrodes, respectively, is observed. Detailed morphology analysis indicated that, surprisingly, vertical phase segregation does not affect the connectivity of donor/acceptor domains with the respective electrodes. However, the formation of P3HT/PCBM depletion zones next to the P3HT/PCBM-rich zones can be a potential bottleneck for electron/hole transport due to the increase in transport pathway length. Analysis of the fractions of intra- and interchain charge transport revealed that the processing schedule affects the average vertical orientation of the polymer chains, which may be crucial for enhanced charge transport, nongeminate recombination, and charge collection. The present study establishes a more detailed link between processing and morphology by combining multiscale molecular
Chen, Hongwei; Kong, Xi; Chong, Bo; Qin, Gan; Zhou, Xianyi; Peng, Xinhua; Du, Jiangfeng
2011-03-01
The method of quantum annealing (QA) is a promising way of solving many optimization problems in both classical and quantum information theory. The main advantage of this approach, compared with the gate model, is the robustness of the operations against errors originating from both external controls and the environment. In this work, we experimentally demonstrate an application of QA to a simplified version of the traveling salesman problem by simulating the corresponding Schrödinger evolution with an NMR quantum simulator. The experimental results unambiguously yielded the optimal traveling route, in good agreement with the theoretical prediction.
Speagle, Joshua S; Eisenstein, Daniel J; Masters, Daniel C; Steinhardt, Charles L
2015-01-01
Using a grid of $\\sim 2$ million elements ($\\Delta z = 0.005$) adapted from COSMOS photometric redshift (photo-z) searches, we investigate the general properties of template-based photo-z likelihood surfaces. We find these surfaces are filled with numerous local minima and large degeneracies that generally confound rapid but "greedy" optimization schemes, even with additional stochastic sampling methods. In order to robustly and efficiently explore these surfaces, we develop BAD-Z [Brisk Annealing-Driven Redshifts (Z)], which combines ensemble Markov Chain Monte Carlo (MCMC) sampling with simulated annealing to sample arbitrarily large, pre-generated grids in approximately constant time. Using a mock catalog of 384,662 objects, we show BAD-Z samples $\\sim 40$ times more efficiently compared to a brute-force counterpart while maintaining similar levels of accuracy. Our results represent first steps toward designing template-fitting photo-z approaches limited mainly by memory constraints rather than computation...
Hao, Ge-Fei; Xu, Wei-Fang; Yang, Sheng-Gang; Yang, Guang-Fu
2015-10-23
Protein and peptide structure predictions are of paramount importance for understanding their functions, as well as their interactions with other molecules. However, the use of molecular simulation techniques to directly predict peptide structure from the primary amino acid sequence is always hindered by the rough topology of the conformational space and the limited simulation time scale. We developed a new strategy, named Multiple Simulated Annealing-Molecular Dynamics (MSA-MD), to identify the native states of a peptide and a miniprotein. A cluster of near-native structures could be obtained using the MSA-MD method, which turned out to be significantly more efficient in reaching the native structure than continuous MD and conventional SA-MD simulations.
Information Security Plan for Flight Simulator Applications
Slaughter, Jason
2011-01-01
The Department of Defense has a need for an identity management system that uses two-factor authentication to ensure that only the correct individuals get access to their top secret flight simulator program. Currently the Department of Defense does not have a web-based sign-in system. We will be creating a system that will allow them to access their programs, back office, and administrator functions remotely. A security plan outlining our security architecture will be delivered prior to the final code roll-out. The plan will document the encryption used and the security architecture applied in the final documentation. The code will be delivered in phases to work out any issues that may occur during the implementation
Institute of Scientific and Technical Information of China (English)
MALi-ming; JIANGHong; WANGXiao-chun
2004-01-01
The algorithm is divided into two steps. The first step pre-locates the blank by aligning its centre of gravity and approximate normal vector with those of the destination surfaces, maximizing the overlap of the projections of the two objects on a plane perpendicular to the normal vector. The second step optimizes an objective function by means of a gradient-simulated annealing algorithm to obtain the best match between a set of distributed points on the blank and the destination surfaces. An example of machining hydroelectric turbine blades is given to verify the effectiveness of the algorithm.
Directory of Open Access Journals (Sweden)
Óscar Begambre
2010-01-01
Full Text Available In this work, the Simulated Annealing algorithm (SA) is employed to solve the inverse problem of damage detection in beams using modal data polluted with noise. The formulation of the objective function for the SA-based optimization procedure is founded on the modified residual force method. The SA used in this study outperformed a genetic algorithm (GA) on two difficult benchmark functions reported in the international literature. The proposed integrity assessment procedure was confirmed and validated numerically using Euler-Bernoulli beam theory and finite element models (FEM) of cantilever and free-free beams.
Energy Technology Data Exchange (ETDEWEB)
Gomes, Mario Helder [Departamento de Engenharia Electrotecnica, Instituto Politecnico de Tomar, Quinta do Contador, Estrada da Serra, 2300 Tomar (Portugal); Saraiva, Joao Tome [INESC Porto, Faculdade de Engenharia, Universidade do Porto, Campus da FEUP, Rua Dr. Roberto Frias, 4200-465 Porto (Portugal)
2009-06-15
This paper describes an optimization model to be used by System Operators to validate the economic schedules obtained by Market Operators together with the injections from bilateral contracts. These studies are performed off-line on the day before operation; the developed model is based on adjustment bids submitted by generators and loads and is used by System Operators when necessary to enforce technical or security constraints. This model is an enhancement of an approach described in a previous paper and now includes discrete components such as transformer taps and reactor and capacitor banks. The resulting mixed-integer formulation is solved using simulated annealing, a well-known metaheuristic especially suited for combinatorial problems. Once the simulated annealing converges and the values of the discrete variables are fixed, the resulting non-linear continuous problem is solved using sequential linear programming to get the final solution. The developed model corresponds to an AC version; it includes constraints related to the capability diagram of synchronous generators and variables allowing the computation of the active power required to balance active losses. Finally, the paper includes a case study based on the IEEE 118 bus system to illustrate the results that can be obtained and their interest. (author)
Energy Technology Data Exchange (ETDEWEB)
Nakos, J.T.; Rosinski, S.T.; Acton, R.U.
1994-11-01
The objective of this work was to provide experimental heat transfer boundary condition and reactor pressure vessel (RPV) section thermal response data that can be used to benchmark computer codes that simulate thermal annealing of RPVs. This specific project was designed to provide the Electric Power Research Institute (EPRI) with experimental data that could be used to support the development of a thermal annealing model. A secondary benefit is to provide additional experimental data (e.g., thermal response of the concrete reactor cavity wall) that could be of use in an annealing demonstration project. The setup comprised a heater assembly, a 1.2 m × 1.2 m × 17.1 cm thick [4 ft × 4 ft × 6.75 in] section of an RPV (A533B ferritic steel with stainless steel cladding), a mockup of the "mirror" insulation between the RPV and the concrete reactor cavity wall, and a 25.4 cm [10 in] thick concrete wall, 2.1 m × 2.1 m [10 ft × 10 ft] square. Experiments were performed at heat-up/cooldown rates of 7, 14, and 28°C/hr [12.5, 25, and 50°F/hr] as measured on the heated face. A peak temperature of 454°C [850°F] was maintained on the heated face until the concrete wall temperature reached equilibrium. Results are most representative of those RPV locations where the heat transfer would be one-dimensional. Temperature was measured at multiple locations on the heated and unheated faces of the RPV section and the concrete wall. Incident heat flux was measured on the heated face, and absorbed heat flux estimates were generated from temperature measurements and an inverse heat conduction code. Through-wall temperature differences, concrete wall temperature response, and the heat flux absorbed into and incident on the RPV surface are presented. All of these data are useful to modelers developing codes to simulate RPV annealing.
An Interactive Simulation Tool for Production Planning in Bacon Factories
DEFF Research Database (Denmark)
Nielsen, Jens Frederik Dalsgaard; Nielsen, Kirsten Mølgaard
1994-01-01
The paper describes an interactive simulation tool for production planning in bacon factories. The main aim of the tool is to make it possible to combine the production plans of all parts of the factory.
Jia, F.; Lichti, D.
2017-09-01
The optimal network design problem has been well addressed in geodesy and photogrammetry but has not received the same attention for terrestrial laser scanner (TLS) networks. The goal of this research is to develop a complete design system that can automatically provide an optimal plan for high-accuracy, large-volume scanning networks. The aim of this paper is to use three heuristic optimization methods, simulated annealing (SA), the genetic algorithm (GA), and particle swarm optimization (PSO), to solve the first-order design (FOD) problem for a small-volume indoor network and to compare their performances. The room is simplified as discretized wall segments and possible viewpoints. Each possible viewpoint is evaluated with a score table representing the wall segments visible from it based on scanning geometry constraints. The goal is to find a minimum number of viewpoints that obtain complete coverage of all wall segments with a minimal sum of incidence angles. The different methods have been implemented and compared in terms of the quality of the solutions, runtime, and repeatability. The experimental environment was simulated from a room located on the University of Calgary campus where multiple scans are required due to occlusions from interior walls. The results obtained in this research show that PSO and GA provide similar solutions, while SA does not guarantee an optimal solution within a limited number of iterations. Overall, GA is considered the best choice for this problem based on its capability of providing an optimal solution and its fewer parameters to tune.
Chaotic Simulated Annealing by A Neural Network Model with Transient Chaos
Chen, L; Chen, Luonan; Aihara, Kazuyuki
1997-01-01
We propose a neural network model with transient chaos, or a transiently chaotic neural network (TCNN), as an approximation method for combinatorial optimization problems, by introducing transiently chaotic dynamics into neural networks. Unlike conventional neural networks with only point attractors, the proposed neural network has richer and more flexible dynamics, so it can be expected to have a higher ability to search for globally optimal or near-optimal solutions. A significant property of this model is that the chaotic neurodynamics are temporarily generated for searching and self-organizing, and eventually vanish with the autonomous decrease of a bifurcation parameter corresponding to the "temperature" in the usual annealing process. Therefore, the neural network gradually approaches, through the transient chaos, a dynamical structure similar to that of conventional models such as the Hopfield neural network, which converges to a stable equilibrium point. Since the optimization process of the transiently chaoti...
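A minimal single-neuron sketch of the transiently chaotic dynamics described above (the update form and all constants below are assumptions in the spirit of the model, not the paper's exact parameterization): the self-feedback weight plays the role of the annealed "temperature" and decays toward zero, after which the dynamics settle into a fixed point.

```python
import math

def tcnn_neuron(steps=3000, k=0.9, alpha=0.02, beta=0.005,
                z0=0.08, i0=0.65, eps=1.0 / 250):
    """Single transiently chaotic neuron (illustrative constants): the
    self-feedback weight z decays by (1 - beta) each step, so the
    dynamics can be chaotic at first and converge once z has vanished."""
    y, z = 0.283, z0
    outputs = []
    for _ in range(steps):
        x = 1.0 / (1.0 + math.exp(-y / eps))  # steep sigmoid output in (0, 1)
        y = k * y - z * (x - i0) + alpha      # internal state with self-feedback
        z *= 1.0 - beta                       # annealed bifurcation parameter
        outputs.append(x)
    return outputs

xs = tcnn_neuron()
```

Once `z` has decayed, the map is a simple contraction (rate `k`), which is the TCNN's Hopfield-like terminal regime.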
Directory of Open Access Journals (Sweden)
Doddy Kastanya
2017-02-01
Full Text Available In any reactor physics analysis, the instantaneous power distribution in the core can be calculated when the actual bundle-wise burnup distribution is known. Considering the fact that CANDU (Canada Deuterium Uranium) reactors utilize on-power refueling to compensate for the reduction of reactivity due to fuel burnup, in CANDU fuel management analyses snapshots of power and burnup distributions can be obtained by simulating and tracking the reactor operation over an extended period using various tools such as the *SIMULATE module of the Reactor Fueling Simulation Program (RFSP) code. However, for some studies, such as the evaluation of a conceptual design of a next-generation CANDU reactor, the preferred approach to obtain a snapshot of the power distribution in the core is based on the patterned-channel-age model implemented in the *INSTANTAN module of the RFSP code. The objective of this approach is to obtain a representative snapshot of core conditions quickly. At present, such patterns can be generated by a program called RANDIS, which is implemented within the *INSTANTAN module. In this work, we present an alternative approach to derive the patterned-channel-age model, in which a simulated-annealing-based algorithm is used to find patterns that produce reasonable power distributions.
Energy Technology Data Exchange (ETDEWEB)
Fabbri, Paolo; Trevisani, Sebastiano [Dipartimento di Geologia, Paleontologia e Geofisica, Universita degli Studi di Padova, via Giotto 1, 35127 Padova (Italy)
2005-10-01
The spatial distribution of groundwater temperatures in the low-temperature (60-86°C) geothermal Euganean field of northeastern Italy has been studied using a geostatistical approach. The data set consists of 186 temperatures measured in a fractured limestone reservoir over an area of 8 km². Investigation of the spatial continuity by means of variographic analysis revealed the presence of anisotropies that are apparently related to the particular geologic structure of the area. After inference of variogram models, a simulated annealing procedure was used to perform conditional simulations of temperature in the domain being studied. These simulations honor the data values and reproduce the spatial continuity inferred from the data. Post-processing of the simulations permits an assessment of temperature uncertainties. Maps of estimated temperatures, interquartile range, and the probability of exceeding a prescribed 80°C threshold were also computed. The methodology described could prove useful when siting new wells in a geothermal area. (author)
ArF-excimer-laser annealing of 3C-SiC films—diode characteristics and numerical simulation
Mizunami, T.; Toyama, N.
2003-09-01
We fabricated Schottky barrier diodes using 3C-SiC films deposited on Si(1 1 1) by lamp-assisted thermal chemical vapor deposition and annealed with an ArF excimer laser. Improvement in both the reverse current and the ideality factor was obtained with 1-3 pulses at energy densities of 1.4-1.6 J/cm² per pulse. We solved a heat equation numerically, assuming a transient liquid phase of SiC. The calculated threshold energy density for melting the surface was 0.9 J/cm². The thermal effects of the Si substrate on the SiC film are also discussed. The experimental optimum condition was consistent with the numerical simulation.
Speagle, Joshua S.; Capak, Peter L.; Eisenstein, Daniel J.; Masters, Daniel C.; Steinhardt, Charles L.
2016-10-01
Using a 4D grid of ˜2 million model parameters (Δz = 0.005) adapted from Cosmological Origins Survey photometric redshift (photo-z) searches, we investigate the general properties of template-based photo-z likelihood surfaces. We find these surfaces are filled with numerous local minima and large degeneracies that generally confound simplistic gradient-descent optimization schemes. We combine ensemble Markov Chain Monte Carlo sampling with simulated annealing to robustly and efficiently explore these surfaces in approximately constant time. Using a mock catalogue of 384 662 objects, we show our approach samples ˜40 times more efficiently compared to a `brute-force' counterpart while maintaining similar levels of accuracy. Our results represent first steps towards designing template-fitting photo-z approaches limited mainly by memory constraints rather than computation time.
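The abstract does not spell out the sampler's internals; as a hedged sketch, combining Metropolis-style moves with an annealed temperature over a pre-generated grid might look like the following (the toy likelihood surface, linear cooling schedule, and proposal set are all assumptions for illustration):

```python
import math
import random

def annealed_grid_search(loglike, n_cells, n_steps=5000, t0=25.0, seed=1):
    """Metropolis walk over a precomputed 1-D log-likelihood grid whose
    temperature is annealed from t0 down to 1, so early steps roam across
    degeneracies while late steps settle into the dominant mode."""
    rng = random.Random(seed)
    i = rng.randrange(n_cells)
    best, best_ll = i, loglike(i)
    for step in range(n_steps):
        t = max(1.0, t0 * (1.0 - step / n_steps))  # linear cooling, floored at T=1
        j = min(n_cells - 1, max(0, i + rng.choice([-3, -2, -1, 1, 2, 3])))
        delta = loglike(j) - loglike(i)
        if delta >= 0 or rng.random() < math.exp(delta / t):
            i = j
            if loglike(i) > best_ll:
                best, best_ll = i, loglike(i)
    return best, best_ll

# Toy unimodal photo-z-like surface with the mode at grid cell 150.
ll = lambda i: -((i - 150) / 20.0) ** 2
peak, peak_ll = annealed_grid_search(ll, n_cells=200)
```

Because the grid values are precomputed once, each step costs a dictionary or array lookup, which is what keeps the runtime roughly constant in grid size.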
DEFF Research Database (Denmark)
Sousa, Tiago M; Morais, Hugo; Castro, R.
2014-01-01
An intensive use of dispersed energy resources is expected for future power systems, including distributed generation, especially based on renewable sources, and electric vehicles. The system operation methods and tools must be adapted to the increased complexity, especially the optimal resource scheduling problem. Therefore, the use of metaheuristics is required to obtain good solutions in a reasonable amount of time. This paper proposes two new heuristics, called naive electric vehicles charge and discharge allocation and generation tournament based on cost, developed to obtain an initial solution to be used in the energy resource scheduling methodology based on simulated annealing previously developed by the authors. The case study considers two scenarios with 1000 and 2000 electric vehicles connected in a distribution network. The proposed heuristics are compared with a deterministic approach.
Indian Academy of Sciences (India)
Satyajit Guha; Soumya Ganguly Neogi; Pinaki Chaudhury
2014-05-01
In this paper, we explore the use of a stochastic optimizer, namely simulated annealing (SA), followed by a density functional theory (DFT)-based strategy for evaluating the structure and infrared spectroscopy of (H2O)nOH− clusters, where n = 1-6. We have shown that the use of SA can generate both global and local structures of these cluster systems. We also perform DFT calculations, using the optimized coordinates obtained from SA as input, and extract the IR spectra of these systems. Finally, we compare our results with available theoretical and experimental data. There is a close correspondence between the computed frequencies from our theoretical study and the available experimental data. To further aid in understanding the details of the hydrogen bonds formed, we performed atoms-in-molecules calculations on all the global minimum structures to evaluate relevant electron densities and critical points.
Directory of Open Access Journals (Sweden)
Shangchia Liu
2015-01-01
Full Text Available In the field of distributed decision making, different agents share a common processing resource, and each agent wants to minimize a cost function depending on its jobs only. These issues arise in different application contexts, including real-time systems, integrated service networks, industrial districts, and telecommunication systems. Motivated by its importance in practical applications, we consider two-agent scheduling on a single machine, where the objective is to minimize the total completion time of the jobs of the first agent subject to the restriction that the total completion time of the jobs of the second agent may not exceed a given upper bound. To solve the proposed problem, a branch-and-bound algorithm is developed for the optimal solution, together with three simulated annealing algorithms. In addition, extensive computational experiments are conducted to test the performance of the algorithms.
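To make the two competing objectives concrete, here is a small helper that evaluates a candidate single-machine sequence, returning each agent's total completion time (the 3-job instance is hypothetical, chosen only for illustration):

```python
def per_agent_total_completion(sequence, proc, agent_of):
    """Evaluate a single-machine job sequence: return the summed job
    completion times per agent, i.e. the two objectives in the
    two-agent scheduling problem described above."""
    t = 0
    totals = {}
    for job in sequence:
        t += proc[job]  # completion time of this job on the machine
        totals[agent_of[job]] = totals.get(agent_of[job], 0) + t
    return totals

# Hypothetical 3-job instance: jobs 1 and 3 belong to agent A, job 2 to B.
proc = {1: 2, 2: 3, 3: 1}
agent_of = {1: "A", 2: "B", 3: "A"}
print(per_agent_total_completion([3, 1, 2], proc, agent_of))  # {'A': 4, 'B': 6}
```

A branch-and-bound or SA search over permutations would call such an evaluator, keeping only sequences whose agent-B total stays within the upper bound.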
Institute of Scientific and Technical Information of China (English)
无
2006-01-01
The characteristics of the design resources in ship collaborative design are described, and a hierarchical model for the evaluation of the design resources is established. A comprehensive evaluation of the co-designers of the collaborative design resources is carried out from different aspects using the Analytic Hierarchy Process (AHP), and according to the evaluation results the candidates are determined. Meanwhile, based on the principle of minimum cost, and starting from the relations between the design tasks and the corresponding co-designers, an optimizing selection model of the collaborators is established, and a novel genetic algorithm combined with simulated annealing is proposed to realize the optimization. It overcomes the defects of the genetic algorithm, which may suffer from premature convergence and entrapment in local optima if used alone. The application of this method in a ship collaborative design system proves its feasibility and provides a quantitative method for the optimizing selection of design resources.
Addawe, Rizavel C.; Addawe, Joel M.; Magadia, Joselito C.
2016-10-01
Accurate forecasting of dengue cases would significantly improve epidemic prevention and control capabilities. This paper attempts to provide useful models in forecasting dengue epidemic specific to the young and adult population of Baguio City. To capture the seasonal variations in dengue incidence, this paper develops a robust modeling approach to identify and estimate seasonal autoregressive integrated moving average (SARIMA) models in the presence of additive outliers. Since the least squares estimators are not robust in the presence of outliers, we suggest a robust estimation based on winsorized and reweighted least squares estimators. A hybrid algorithm, Differential Evolution - Simulated Annealing (DESA), is used to identify and estimate the parameters of the optimal SARIMA model. The method is applied to the monthly reported dengue cases in Baguio City, Philippines.
Energy Technology Data Exchange (ETDEWEB)
Rincon, Luis [Universidad de Los Andes, Merida (Venezuela)
2001-03-01
A semiempirical simulated annealing molecular dynamics method using a fictitious Lagrangian has been developed for the study of the structural and electronic properties of micro- and nano-clusters. As an application of the present scheme, we study the structure of Naₙ clusters in the range n = 2-100 and compare the present calculations with some ab initio model calculations.
Tournus, Florent; Tamion, Alexandre; Hillion, Arnaud; Dupuis, Véronique
2016-12-01
Isothermal remanent magnetization (IRM) and direct-current demagnetization (DcD) measurements are powerful tools to qualitatively study the interactions (through the Δm parameter) between magnetic particles in a granular medium. For magnetic nanoparticles diluted in a matrix, it is possible to reach a regime where Δm is equal to zero, i.e., where interparticle interactions are negligible: one can then infer the intrinsic properties of the nanoparticles through measurements on an assembly, analyzed by a combined fit procedure (based on the Stoner-Wohlfarth and Néel models). Here we illustrate the benefits of a quantitative analysis of IRM curves for Co nanoparticles embedded in amorphous carbon (before and after annealing): while a large anisotropy increase might have been deduced from the other measurements, IRM curves provide an improved characterization of the nanomagnets' intrinsic properties, revealing that this is in fact not the case. This shows that IRM curves, which probe only the irreversible switching of nanomagnets, are complementary to the widely used low-field susceptibility curves.
Institute of Scientific and Technical Information of China (English)
温平川; 徐晓东; 何先刚
2003-01-01
This paper presents a highly hybrid genetic algorithm / simulated annealing algorithm. The algorithm has been successfully implemented on a Beowulf PC cluster and applied to a set of standard function optimization problems. The experimental results show that the proposed algorithm is not only effective but also robust.
Dou, Tai H.; Min, Yugang; Neylon, John; Thomas, David; Kupelian, Patrick; Santhanam, Anand P.
2016-03-01
Deformable image registration (DIR) is an important step in radiotherapy treatment planning. An optimal input registration parameter set is critical to achieve the best registration performance with a specific algorithm. In this paper, we investigated a parameter optimization strategy for optical-flow based DIR of the 4DCT lung anatomy. A novel fast simulated annealing algorithm with adaptive Monte Carlo sampling (FSA-AMC) was investigated for solving the complex non-convex parameter optimization problem. The registration error for a given parameter set was computed using the landmark-based mean target registration error (mTRE) between a given volumetric image pair. To reduce the computational time of the parameter optimization process, a GPU-based 3D dense optical-flow algorithm was employed for registering the lung volumes. Numerical analyses of the parameter optimization for the DIR were performed using 4DCT datasets generated with breathing motion models and open-source 4DCT datasets. Results showed that the proposed method efficiently estimated the optimum parameters for optical flow and closely matched the best registration parameters obtained using an exhaustive parameter search method.
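The FSA-AMC internals are not given in the abstract; the following is a generic sketch of fast annealing (temperature decaying as 1/k) with a Gaussian proposal width that adapts to the running acceptance rate, applied to a hypothetical one-dimensional cost standing in for the mTRE objective:

```python
import math
import random

def fsa_amc(cost, lo, hi, n_steps=4000, seed=3):
    """Sketch of fast simulated annealing with adaptive Monte Carlo
    sampling (assumed schedule): temperature follows T_k = 1/k, and the
    proposal width is retuned every 200 steps from the acceptance rate."""
    rng = random.Random(seed)
    x = rng.uniform(lo, hi)
    fx = cost(x)
    width = (hi - lo) / 4.0
    accepted = 0
    for k in range(1, n_steps + 1):
        t = 1.0 / k  # fast annealing schedule
        y = min(hi, max(lo, x + rng.gauss(0.0, width)))
        fy = cost(y)
        if fy <= fx or rng.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            accepted += 1
        if k % 200 == 0:  # adapt the proposal width toward a moderate acceptance rate
            rate = accepted / 200.0
            width *= 1.5 if rate > 0.4 else 0.7
            accepted = 0
    return x, fx

# Hypothetical smooth surrogate for a registration-error objective.
x_opt, f_opt = fsa_amc(lambda v: (v - 1.0) ** 2, -5.0, 5.0)
```

In the paper's setting, `cost` would wrap one GPU optical-flow registration plus an mTRE evaluation, which is why cheap surrogate evaluations and few total steps matter.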
Simulations and measurements of annealed pyrolytic graphite-metal composite baseplates
Streb, F.; Ruhl, G.; Schubert, A.; Zeidler, H.; Penzel, M.; Flemmig, S.; Todaro, I.; Squatrito, R.; Lampke, T.
2016-03-01
We investigated the usability of anisotropic materials as inserts in aluminum-matrix-composite baseplates for typical high-performance power semiconductor modules using finite-element simulations and transient plane source measurements. For the simulations, several physics modules can be used, suitable for different thermal boundary conditions. By comparing different modules and options of heat transfer, we found non-isothermal simulations to be closest to reality for the temperature distribution at the surface of the heat sink. We optimized the geometry of the graphite inserts for best heat dissipation and, based on these results, evaluated the thermal resistance of a typical power module using calculation-time-optimized steady-state simulations. Here we investigated the influence of the thermal contact conductance (TCC) between the metal matrix and the inserts on the heat dissipation. We found improved heat dissipation compared to the plain metal baseplate for a TCC of 200 kW/m²/K and above. To verify the simulations, we evaluated cast composite baseplates with two different insert geometries and measured their averaged lateral thermal conductivity using a transient plane source (HotDisk) technique at room temperature. For the composite baseplates we achieved local improvements in heat dissipation compared to the plain metal baseplate.
Zhang, Jiapu
2013-01-01
Simulated annealing (SA) was inspired by annealing in metallurgy, a technique involving heating and controlled cooling of a material to increase the size of its crystals and reduce their defects, both of which are attributes of the material that depend on its thermodynamic free energy. In this paper, we first study the practical implementation of SA in detail. Then, hybridizing pure SA with local (or global) search optimization methods allows us to design several effective and efficient global search optimization methods. In order to keep the original sense of SA, we clarify our understanding of SA in the crystallography and molecular modeling fields through studies of prion amyloid fibrils.
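As a concrete companion to the practical-implementation discussion above, here is a minimal generic SA sketch; the geometric cooling schedule, step size, and toy objective are illustrative assumptions, not the paper's settings:

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, t_min=1e-4,
                        alpha=0.95, iters_per_t=100, seed=0):
    """Generic SA: always accept improvements, accept uphill moves with
    probability exp(-delta/T), and cool T geometrically from t0 to t_min."""
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    while t > t_min:
        for _ in range(iters_per_t):
            y = neighbor(x, rng)
            fy = cost(y)
            delta = fy - fx
            # Metropolis acceptance criterion
            if delta <= 0 or rng.random() < math.exp(-delta / t):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
        t *= alpha  # geometric cooling schedule
    return best, fbest

# Toy usage: minimize a 1-D multimodal function starting far from the optimum.
f = lambda x: (x - 2.0) ** 2 + math.sin(5 * x)
step = lambda x, rng: x + rng.uniform(-0.5, 0.5)
x_star, f_star = simulated_annealing(f, step, x0=10.0)
```

The hybrid methods the paper alludes to would replace `neighbor` with a local-search move, or polish `best` with a deterministic optimizer after cooling.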
PowerPlan: An interactive Simulation Tool to Explore Electric Power Planning Options
de Vries, B; Benders, Reinerus; Almeida de, Aníbal; Rosenfeld, Arthur; Roturier, Jacques; Norgard, Jorgen
1994-01-01
In this chapter the simulation model PowerPlan is described briefly, with a focus on reliability and merit-order algorithms. Next, some general characteristics of energy models are reviewed and we indicate the position of PowerPlan. Finally, the results of a historical simulation for The Netherlands
Bello, A.; Laredo, E.; Grimau, M.
1999-11-01
The existence of a distribution of relaxation times has been widely used to describe the relaxation function versus frequency in glass-forming liquids. Several empirical distributions have been proposed, and the usual method is to fit the experimental data to a model that assumes one of these functions. An alternative is to extract from the experimental data the discrete profile of the distribution function that best fits the experimental curve, without any a priori assumption. To test this approach, a Monte Carlo algorithm using simulated annealing is used to best fit simulated dielectric loss data, ɛ''(ω), generated with Cole-Cole, Cole-Davidson, Havriliak-Negami, and Kohlrausch-Williams-Watts (KWW) functions. The relaxation time distribution, G(ln(τ)), is obtained as a histogram that follows very closely the analytical expression for the distributions that are known in these cases. Also, the temporal decay functions, φ(t), are evaluated and compared to a stretched exponential. The method is then applied to experimental data for α-poly(vinylidene fluoride) (PVDF) over a temperature range starting at 233 K; a value of 87 is obtained, which characterizes this polymer as a relatively structurally strong material.
Directory of Open Access Journals (Sweden)
Felipe Baesler
2008-12-01
Full Text Available This paper introduces a variant of the metaheuristic simulated annealing, oriented to the solution of multiobjective optimization problems, called MultiObjective Simulated Annealing with Random Trajectory Search (MOSARTS). This technique adds short- and long-term memory elements to the simulated annealing algorithm in order to balance the search effort among all the objectives involved in the problem. The algorithm was tested against three other techniques on a real-life parallel machine scheduling problem composed of 24 jobs and two identical machines, a case study from the regional sawmill industry. In the experiments performed, MOSARTS behaved much better than the comparison tools, finding better solutions in terms of dominance and dispersion of the frontier.
Directory of Open Access Journals (Sweden)
Silvia Gaona
2015-01-01
Full Text Available Censuses in Mexico are taken by the National Institute of Statistics and Geography (INEGI). In this paper a Two-Phase Approach (TPA) to optimize the routes of INEGI’s census takers is presented. For each pollster, in the first phase, a route is produced by means of the Simulated Annealing (SA) heuristic, which attempts to minimize the travel distance subject to particular constraints. Whenever the route is unrealizable, it is made realizable in the second phase by constructing a visibility graph for each obstacle and applying Dijkstra’s algorithm to determine the shortest path in this graph. A tuning methodology based on the irace package was used to determine the parameter values for TPA on a subset of 150 instances provided by INEGI. The practical effectiveness of TPA was assessed on another subset of 1962 instances, comparing its performance with that of the in-use heuristic (INEGIH). The results show that TPA clearly outperforms INEGIH. The average improvement is 47.11%.
Institute of Scientific and Technical Information of China (English)
De-xuan ZOU; Gai-ge WANG; Gai PAN; Hong-wei QI
2016-01-01
Outline-free floorplanning focuses on area and wirelength reductions, which are usually meaningless, since they can hardly satisfy modern design requirements. We concentrate on a more difficult and useful issue, fixed-outline floorplanning. This issue imposes fixed-outline constraints on the outline-free floorplanning, making the physical design more interesting and challenging. The contributions of this paper are primarily twofold. First, a modified simulated annealing (MSA) algorithm is proposed. In the beginning of the evolutionary process, a new attenuation formula is used to decrease the temperature slowly, to enhance MSA’s global searching capacity. After a period of time, the traditional attenuation formula is employed to decrease the temperature rapidly, to maintain MSA’s local searching capacity. Second, an excessive area model is designed to guide MSA to find feasible solutions readily. This can save much time for refining feasible solutions. Additionally, B*-tree representation is known as a very useful method for characterizing floorplanning. Therefore, it is employed to perform a perturbing operation for MSA. Finally, six groups of benchmark instances with different dead spaces and aspect ratios—circuits n10, n30, n50, n100, n200, and n300—are chosen to demonstrate the efficiency of our proposed method on fixed-outline floorplanning. Compared to several existing methods, the proposed method is more efficient in obtaining desirable objective function values associated with the chip area, wirelength, and fixed-outline constraints.
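The two-stage cooling idea can be sketched on a toy multimodal objective. The paper's attenuation formulas are not reproduced here; the schedule constants and the test function are assumptions, chosen only to show slow early cooling (global search) handing over to rapid geometric cooling (local refinement).

```python
import math, random

random.seed(1)

def f(x):
    # Toy multimodal objective standing in for the floorplan cost
    return x * x + 10.0 * (1.0 - math.cos(2.0 * math.pi * x))

def anneal(steps, switch_frac):
    """Two-stage cooling: slow attenuation early, fast geometric decay late."""
    x = random.uniform(-20, 20)
    fx = f(x)
    best_x, best_f = x, fx
    T = 50.0
    switch = int(steps * switch_frac)
    for k in range(steps):
        y = x + random.gauss(0, 1.0)
        fy = f(y)
        if fy < fx or random.random() < math.exp((fx - fy) / T):
            x, fx = y, fy
            if fx < best_f:
                best_x, best_f = x, fx
        if k < switch:
            T = 50.0 / (1.0 + 0.001 * k)   # slow attenuation: keeps exploring
        else:
            T *= 0.99                      # traditional rapid attenuation: refine locally
    return best_x, best_f

x_best, f_best = anneal(40000, 0.5)
print(round(x_best, 3), round(f_best, 4))
```

The slow phase lets the walker cross basins of the cosine landscape; the fast phase then freezes it into the basin it has found.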
Hu, Kan-Nian; Qiang, Wei; Tycko, Robert
2011-01-01
We describe a general computational approach to site-specific resonance assignments in multidimensional NMR studies of uniformly 15N,13C-labeled biopolymers, based on a simple Monte Carlo/simulated annealing (MCSA) algorithm contained in the program MCASSIGN2. Input to MCASSIGN2 includes lists of multidimensional signals in the NMR spectra with their possible residue-type assignments (which need not be unique), the biopolymer sequence, and a table that describes the connections that relate one signal list to another. As output, MCASSIGN2 produces a high-scoring sequential assignment of the multidimensional signals, using a score function that rewards good connections (i.e., agreement between relevant sets of chemical shifts in different signal lists) and penalizes bad connections, unassigned signals, and assignment gaps. Examination of a set of high-scoring assignments from a large number of independent runs allows one to determine whether a unique assignment exists for the entire sequence or parts thereof. We demonstrate the MCSA algorithm using two-dimensional (2D) and three-dimensional (3D) solid state NMR spectra of several model protein samples (α-spectrin SH3 domain and protein G/B1 microcrystals, HET-s218–289 fibrils), obtained with magic-angle spinning and standard polarization transfer techniques. The MCSA algorithm and MCASSIGN2 program can accommodate arbitrary combinations of NMR spectra with arbitrary dimensionality, and can therefore be applied in many areas of solid state and solution NMR. PMID:21710190
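The flavor of the score function can be sketched as follows. The shift fields, tolerance, and reward/penalty weights are illustrative assumptions rather than MCASSIGN2's actual parameters, and the full program anneals over candidate assignments; here we only score two assignments by hand.

```python
# Toy stand-in for the MCASSIGN2-style score: a candidate assignment maps each
# residue position to a signal index (or None).  Good sequential connections
# (matching chemical shifts between neighbors) are rewarded; bad connections,
# unassigned signals, and assignment gaps are penalized.
GOOD, BAD, UNASSIGNED, GAP = 2.0, -3.0, -1.0, -1.0
TOL = 0.2  # shift-match tolerance (ppm), an illustrative value

def score(assignment, signals):
    s = 0.0
    used = set(a for a in assignment if a is not None)
    s += UNASSIGNED * (len(signals) - len(used))
    for i in range(len(assignment) - 1):
        a, b = assignment[i], assignment[i + 1]
        if a is None or b is None:
            s += GAP
        elif abs(signals[a]["ca_next"] - signals[b]["ca"]) <= TOL:
            s += GOOD     # shifts agree across the connection
        else:
            s += BAD
    return s

# Three synthetic signals whose inter-residue CA shifts chain 0 -> 1 -> 2
signals = [{"ca": 55.0, "ca_next": 58.1},
           {"ca": 58.0, "ca_next": 61.0},
           {"ca": 61.1, "ca_next": 49.0}]

correct = [0, 1, 2]
scrambled = [2, 0, 1]
print(score(correct, signals), score(scrambled, signals))
```

A Monte Carlo/simulated annealing search over such assignments then looks for the high-scoring sequential placement, with repeated independent runs revealing whether it is unique.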
Directory of Open Access Journals (Sweden)
Andrés Iglesias
2016-01-01
Full Text Available Fitting curves to noisy data points is a difficult problem arising in many scientific and industrial domains. Although polynomial functions are usually applied to this task, there are many shapes that cannot be properly fitted by using this approach. In this paper, we tackle this issue by using rational Bézier curves. This is a very difficult problem that requires computing four different sets of unknowns (data parameters, poles, weights, and the curve degree), strongly related to each other in a highly nonlinear way. This leads to a difficult continuous nonlinear optimization problem. In this paper, we propose two simulated annealing schemas (the all-in-one schema and the sequential schema) to determine the data parameterization and the weights of the poles of the fitting curve. These schemas are combined with least-squares minimization and the Bayesian Information Criterion to calculate the poles and the optimal degree of the best fitting Bézier rational curve, respectively. We apply our methods to a benchmark of three carefully chosen examples of 2D and 3D noisy data points. Our experimental results show that this methodology (particularly, the sequential schema) outperforms previous polynomial-based approaches for our data fitting problem, even in the presence of noise of low-medium intensity.
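The role of the Bayesian Information Criterion in selecting the curve degree can be illustrated with a simpler stand-in: ordinary polynomial least squares instead of rational Bézier curves, with invented data and noise level. The BIC trades residual error against the number of parameters exactly as in the degree selection described above.

```python
import math, random

random.seed(3)

# Noisy samples of a cubic; the degree is selected by BIC, as the paper does
# for the rational Bezier degree (polynomials used here for brevity).
xs = [i / 20.0 for i in range(41)]
ys = [x ** 3 - x + 0.1 + random.gauss(0, 0.02) for x in xs]
n = len(xs)

def polyfit(deg):
    """Least-squares polynomial fit via normal equations (Gaussian elimination)."""
    m = deg + 1
    A = [[sum(x ** (i + j) for x in xs) for j in range(m)] for i in range(m)]
    b = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(m)]
    for col in range(m):                       # naive elimination, fine for tiny m
        piv = max(range(col, m), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, m):
            fct = A[r][col] / A[col][col]
            for c in range(col, m):
                A[r][c] -= fct * A[col][c]
            b[r] -= fct * b[col]
    coef = [0.0] * m
    for r in range(m - 1, -1, -1):
        coef[r] = (b[r] - sum(A[r][c] * coef[c] for c in range(r + 1, m))) / A[r][r]
    return coef

def bic(deg):
    coef = polyfit(deg)
    rss = sum((y - sum(c * x ** i for i, c in enumerate(coef))) ** 2
              for x, y in zip(xs, ys))
    return n * math.log(rss / n) + (deg + 1) * math.log(n)  # error term + complexity penalty

best_deg = min(range(1, 8), key=bic)
print(best_deg)
```

Low degrees underfit (large residual term) and high degrees pay the ln(n) penalty per extra parameter, so the criterion settles near the generating degree.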
Energy Technology Data Exchange (ETDEWEB)
Diogenes, Alysson N.; Santos, Luis O.E. dos; Fernandes, Celso P. [Universidade Federal de Santa Catarina (UFSC), Florianopolis, SC (Brazil); Appoloni, Carlos R. [Universidade Estadual de Londrina (UEL), PR (Brazil)
2008-07-01
The physical properties of reservoir rocks are usually obtained in the laboratory through standard experiments. These experiments are often very expensive and time-consuming. Hence, digital image analysis techniques offer a fast and low-cost methodology for predicting physical properties, knowing only geometrical parameters measured from thin sections of the rock microstructure. This research analyzes two methods for porous media reconstruction using the simulated annealing relaxation method. Using geometrical parameters measured from rock thin sections, it is possible to construct a three-dimensional (3D) model of the microstructure. We assume statistical homogeneity and isotropy, and the 3D model maintains the porosity spatial correlation, chord size distribution and d3-4 distance transform distribution for a pixel-based reconstruction, and the spatial correlation for an object-based reconstruction. The 2D and 3D preliminary results are compared with microstructures reconstructed by truncated Gaussian methods. As this research is at an early stage, only the 2D results are presented. (author)
Energy Technology Data Exchange (ETDEWEB)
Setyawan, Wahyu [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Nandipati, Giridhar [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Roche, Kenneth J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Univ. of Washington, Seattle, WA (United States); Heinisch, Howard L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wirth, Brian D. [Univ. of Tennessee, Knoxville, TN (United States); Oak Ridge National Lab., Oak Ridge, TN (United States); Kurtz, Richard J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
2015-07-01
Molecular dynamics simulations have been used to generate a comprehensive database of surviving defects due to displacement cascades in bulk tungsten. Twenty-one data points of primary knock-on atom (PKA) energies ranging from 100 eV (sub-threshold energy) to 100 keV (~780×Ed, where Ed = 128 eV is the average displacement threshold energy) have been completed at 300 K, 1025 K and 2050 K. Within this range of PKA energies, two regimes of power-law energy-dependence of the defect production are observed. A distinct power-law exponent characterizes the number of Frenkel pairs produced within each regime. The two regimes intersect at a transition energy which occurs at approximately 250×Ed. The transition energy also marks the onset of the formation of large self-interstitial atom (SIA) clusters (size 14 or more). The observed defect clustering behavior is asymmetric, with SIA clustering increasing with temperature, while the vacancy clustering decreases. This asymmetry increases with temperature such that at 2050 K (~0.5Tm) practically no large vacancy clusters are formed, meanwhile large SIA clusters appear in all simulations. The implication of such asymmetry on the long-term defect survival and damage accumulation is discussed. In addition, <100> {110} SIA loops are observed to form directly in the highest energy cascades, while vacancy <100> loops are observed to form at the lowest temperature and highest PKA energies, although the appearance of both the vacancy and SIA loops with Burgers vector of <100> type is relatively rare.
SYSTEM PLANNING WITH THE HANFORD WASTE OPERATIONS SIMULATOR
Energy Technology Data Exchange (ETDEWEB)
CRAWFORD TW; CERTA PJ; WELLS MN
2010-01-14
At the U. S. Department of Energy's Hanford Site in southeastern Washington State, 216 million liters (57 million gallons) of nuclear waste is currently stored in aging underground tanks, threatening the Columbia River. The River Protection Project (RPP), a fully integrated system of waste storage, retrieval, treatment, and disposal facilities, is in varying stages of design, construction, operation, and future planning. These facilities face many overlapping technical, regulatory, and financial hurdles to achieve site cleanup and closure. Program execution is ongoing, but completion is currently expected to take approximately 40 more years. Strategic planning for the treatment of Hanford tank waste is by nature a multi-faceted, complex and iterative process. To help manage the planning, a report referred to as the RPP System Plan is prepared to provide a basis for aligning the program scope with the cost and schedule, from upper-tier contracts to individual facility operating plans. The Hanford Tank Waste Operations Simulator (HTWOS), a dynamic flowsheet simulation and mass balance computer model, is used to simulate the current planned RPP mission, evaluate the impacts of changes to the mission, and assist in planning near-term facility operations. Development of additional modeling tools, including an operations research model and a cost model, will further improve long-term planning confidence. The most recent RPP System Plan, Revision 4, was published in September 2009.
Simulated Annealing Based Algorithm for Identifying Mutated Driver Pathways in Cancer
Directory of Open Access Journals (Sweden)
Hai-Tao Li
2014-01-01
Full Text Available With the development of next-generation DNA sequencing technologies, large-scale cancer genomics projects can be implemented to help researchers identify driver genes, driver mutations, and driver pathways, which promote cancer proliferation, in large numbers of cancer patients. Hence, one of the remaining challenges is to distinguish the functional mutations vital for cancer development and filter out the nonfunctional, random “passenger mutations.” In this study, we introduce a modified method to solve the so-called maximum weight submatrix problem, which is used to identify mutated driver pathways in cancer. The problem is based on two combinatorial properties, namely coverage and exclusivity. In particular, we enhance an integrative model which combines gene mutation and expression data. The experimental results on simulated data show that, compared with the other methods, our method is more efficient. Finally, we apply the proposed method to two real biological datasets. The results show that our proposed method is also applicable in practice.
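The coverage/exclusivity trade-off behind the maximum weight submatrix problem can be shown on a toy mutation matrix. The weight function follows the commonly used formulation (coverage rewarded, overlap penalized); exhaustive search stands in for the simulated annealing search used on real-size data, and the genes and patients are invented.

```python
from itertools import combinations

# Mutation data: for each gene, the set of patients in which it is mutated.
# A gene set M is scored by W(M) = 2*|coverage(M)| - sum_g |coverage(g)|,
# which rewards covering many patients and penalizes overlapping mutations.
mut = {
    "A": {"p1", "p2"},
    "B": {"p3", "p4"},
    "C": {"p5", "p6"},
    "D": {"p1", "p3", "p5"},   # overlaps A, B and C, so it is penalized
}

def weight(gene_set):
    covered = set().union(*(mut[g] for g in gene_set))
    return 2 * len(covered) - sum(len(mut[g]) for g in gene_set)

# Exhaustive search over small gene sets stands in for the SA search
best = max((frozenset(s) for k in range(1, 4) for s in combinations(mut, k)),
           key=weight)
print(sorted(best), weight(best))
```

The mutually exclusive trio {A, B, C} covers every patient with no overlap, so it beats any set containing the overlapping gene D, which is the behavior the coverage and exclusivity properties are meant to capture.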
Directory of Open Access Journals (Sweden)
Larry W. Burggraf
2013-07-01
Full Text Available To find low energy SinCn structures out of hundreds to thousands of isomers we have developed a general method to search for stable isomeric structures that combines Stochastic Potential Surface Search and Pseudopotential Plane-Wave Density Functional Theory Car-Parrinello Molecular Dynamics simulated annealing (PSPW-CPMD-SA). We enhanced the Saunders stochastic search method to generate random cluster structures used as seed structures for PSPW-CPMD-SA simulations. This method ensures that each SA simulation samples a different potential surface region to find the regional minimum structure. By iterating this automated, parallel process on a high performance computer we located hundreds to more than a thousand stable isomers for each SinCn cluster. Among these, five to ten of the lowest energy isomers were further optimized using the B3LYP/cc-pVTZ method. We applied this method to SinCn (n = 4–12) clusters and found the lowest energy structures, most not previously reported. By analyzing the bonding patterns of the low energy structures of each SinCn cluster, we observed that carbon segregations tend to form condensed conjugated rings, while Si connects to unsaturated bonds at the periphery of the carbon segregation as single atoms or clusters when n is small; when n is large, a silicon network spans over the carbon segregation region.
Simulation-based decision support for evaluating operational plans
Directory of Open Access Journals (Sweden)
Johan Schubert
2015-12-01
Full Text Available In this article, we describe simulation-based decision support techniques for the evaluation of operational plans within effects-based planning. Using a decision support tool, developers of operational plans can evaluate thousands of alternative plans against possible courses of events and decide which of these plans are capable of achieving a desired end state. The objective of this study is to examine the potential of a decision support system that helps operational analysts understand the consequences of numerous alternative plans through simulation and evaluation. In the effects-based approach to operations concept, an operational plan is described as a set of actions and effects, where each action may be performed in one of several alternative ways. Together, these action alternatives make up all possible plan instances, which are represented as a tree of action alternatives that may be searched for the most effective sequence of alternative actions. As a test case, we use an expeditionary operation with a plan of 43 actions and several alternatives for these actions, as well as a scenario of 40 group actors. Decision support for planners is provided by several methods that analyze the impact of a plan on the 40 actors, e.g., by visualizing time series of plan performance. Detailed decision support for finding the most influential actions of a plan is provided by sensitivity analysis and regression tree analysis. Finally, a decision maker may use the tool to determine the boundaries of an operation that it must not move beyond without risk of drastic failure. The significant contribution of this study is the presentation of an integrated approach for the evaluation of operational plans.
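The plan-instance idea can be sketched by enumerating action alternatives. The actions, alternatives, scores, and the multiplicative evaluation below are invented stand-ins for the simulation-based evaluation against actors described above.

```python
from itertools import product

# Each action has alternative ways to perform it; one choice per action
# yields a plan instance.  Scores are illustrative stand-ins for simulated
# plan-versus-actor outcomes.
actions = {
    "secure_port": {"air_assault": 0.7, "amphibious": 0.6},
    "deliver_aid": {"convoy": 0.5, "airlift": 0.8},
    "negotiate":   {"direct": 0.4, "mediated": 0.9},
}

def evaluate(instance):
    """Toy plan evaluation: product of per-action success estimates."""
    score = 1.0
    for action, alt in instance.items():
        score *= actions[action][alt]
    return score

names = list(actions)
instances = [dict(zip(names, combo))
             for combo in product(*(actions[a] for a in names))]
best = max(instances, key=evaluate)
print(best, round(evaluate(best), 3))
```

With a few dozen actions the instance tree becomes far too large to enumerate, which is why the tool searches it rather than listing every plan instance as done here.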
Combat Simulation Modeling in Naval Special Warfare Mission Planning.
1995-12-01
This thesis explores the potential role of combat simulation modeling in the Naval Special Warfare mission planning cycle. It discusses methods for...at the tactical level. The thesis concludes by discussing additional applications of combat simulation modeling within the Naval Special Warfare community and makes recommendations for its effective and efficient implementation.
Widjaja, Effendi; Garland, Marc
2002-07-15
A combination of singular value decomposition, entropy minimization, and simulated annealing was applied to a synthetic 7-species spectroscopic data set with added white noise. The pure spectra were highly overlapping. Global minima for selected objective functions were obtained for the transformation of the first seven right singular vectors. Simple Shannon-type entropy functions were used in the objective functions, and realistic physical constraints were imposed in the penalties. It was found that good first approximations for the pure component spectra could be obtained without the use of any a priori information. The present method outperformed two widely used routines, namely Simplisma and OPA-ALS, as well as IPCA. These results indicate that a combination of SVD, entropy minimization, and simulated annealing is a potentially powerful tool for spectral reconstructions from large real experimental systems. Copyright 2002 Wiley Periodicals, Inc.
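The core of the approach, rotating right singular vectors to minimize a Shannon-type entropy under a nonnegativity penalty, can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: a two-component synthetic mixture is used so the transformation reduces to a single rotation angle, searched by a grid scan instead of annealing, and the penalty weight is an assumed value.

```python
import numpy as np

# Two overlapping pure spectra and two mixture spectra built from them
x = np.linspace(0.0, 10.0, 200)
pure1 = np.exp(-((x - 4.0) ** 2))
pure2 = np.exp(-((x - 5.5) ** 2))
mixtures = np.array([0.7 * pure1 + 0.3 * pure2,
                     0.2 * pure1 + 0.8 * pure2])

# The first right singular vectors span the pure spectra
_, _, Vt = np.linalg.svd(mixtures, full_matrices=False)

def objective(theta):
    """Shannon-type entropy plus a nonnegativity penalty (the physical constraint)."""
    v = np.cos(theta) * Vt[0] + np.sin(theta) * Vt[1]
    if v.sum() < 0:
        v = -v                                   # fix the SVD sign ambiguity
    penalty = 1e4 * (np.minimum(v, 0.0) ** 2).sum()
    p = np.abs(v) / np.abs(v).sum()
    nz = p[p > 0]
    return -(nz * np.log(nz)).sum() + penalty

# A 1-D grid search suffices here; the paper anneals over the analogous
# transformation of the first seven right singular vectors.
thetas = np.linspace(0.0, np.pi, 2000)
best_theta = min(thetas, key=objective)
v = np.cos(best_theta) * Vt[0] + np.sin(best_theta) * Vt[1]
estimate = v if v.sum() > 0 else -v
estimate = estimate / estimate.max()
print(round(float(best_theta), 3))
```

Minimizing the entropy drives the rotated vector toward the narrowest nonnegative band, i.e., one of the pure component spectra, without any a priori information about their shapes.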
Directory of Open Access Journals (Sweden)
Min Dai
2013-01-01
Full Text Available Flexible flow-shop scheduling (FFS) with nonidentical parallel machines for minimizing the maximum completion time, or makespan, is a well-known combinatorial problem. Since the problem is known to be strongly NP-hard, it is typically addressed with heuristic optimization approaches or solved exactly only for approximated cases. In this paper, an improved genetic-simulated annealing algorithm (IGAA), which combines a genetic algorithm (GA) based on an encoding matrix with a simulated annealing algorithm (SAA) based on a hormone modulation mechanism, is proposed to achieve the optimal or a near-optimal solution. The novel hybrid algorithm tries to escape local optima and further explore the solution space. To evaluate the performance of IGAA, computational experiments are conducted and compared with results generated by different algorithms. Experimental results clearly demonstrate that the improved metaheuristic algorithm performs considerably well in terms of solution quality, and that it outperforms several other algorithms.
A Simulation Tool for Hurricane Evacuation Planning
Directory of Open Access Journals (Sweden)
Daniel J. Fonseca
2009-01-01
Full Text Available Atlantic hurricanes and severe tropical storms are a serious threat for the communities in the Gulf of Mexico region. Such storms are violent and destructive. In response to these dangers, coastal evacuation may be ordered. This paper describes the development of a simulation model to analyze the movement of vehicles through I-65, a major US Interstate highway that runs north off the coastal City of Mobile, Alabama, towards the State of Tennessee, during a massive evacuation originated by a disastrous event such as a hurricane. The constructed simulation platform consists of a primary and two secondary models. The primary model is based on the entry of vehicles from the 20 on-ramps to I-65. The two secondary models assist the primary model with related traffic events such as car breakdowns and accidents, traffic control measures, interarrival signaling, and unforeseen emergency incidents, among others. Statistical testing was performed on the data generated by the simulation model to identify variation in relevant traffic variables affecting the timely flow of vehicles travelling north. The performed statistical analysis focused on the closing of alternative on-ramps throughout the Interstate.
Hasegawa, M.
2011-03-01
The aim of the present study is to elucidate how simulated annealing (SA) works in its finite-time implementation, starting from the verification of its conventional optimization scenario based on equilibrium statistical mechanics. Two main experiments and one supplementary experiment, whose designs are inspired by concepts and methods developed for studies on liquids and glasses, are performed on two types of random traveling salesman problems. In the first experiment, a newly parameterized temperature schedule is introduced to simulate a quasistatic process along the scenario, and a parametric study is conducted to investigate the optimization characteristics of this adaptive cooling. In the second experiment, the search trajectory of the Metropolis algorithm (constant-temperature SA) is analyzed in the landscape paradigm in the hope of drawing a precise physical analogy by comparison with the corresponding dynamics of glass-forming molecular systems. These two experiments indicate that the effectiveness of finite-time SA comes not from equilibrium sampling at low temperature but from downward interbasin dynamics occurring before equilibrium. These dynamics work most effectively at an intermediate temperature that varies with the total search time, and this effective temperature is identified using the Deborah number. To test directly the role of these relaxation dynamics in the process of cooling, a supplementary experiment is performed using another parameterized temperature schedule with a piecewise variable cooling rate, and the effect of this biased cooling is examined systematically. The results show that the optimization performance is not only dependent on but also sensitive to cooling in the vicinity of the above effective temperature, and that this feature is interpreted as a consequence of the presence or absence of the workable interbasin dynamics. It is confirmed for the present instances that the effectiveness of finite-time SA derives from the glassy relaxation dynamics.
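The constant-temperature Metropolis search analyzed in the second experiment can be sketched on a small random TSP instance. The instance size, step budget, and 2-opt neighborhood are illustrative assumptions, not the paper's setup.

```python
import math, random

random.seed(4)

# Random Euclidean TSP instance
n = 30
pts = [(random.random(), random.random()) for _ in range(n)]

def length(tour):
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % n]]) for i in range(n))

def metropolis(T, steps=20000):
    """Constant-temperature SA with a 2-opt (segment reversal) neighborhood."""
    tour = list(range(n))
    random.shuffle(tour)
    cur = length(tour)
    for _ in range(steps):
        i, j = sorted(random.sample(range(n), 2))
        cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]
        d = length(cand) - cur
        if d < 0 or (T > 0 and random.random() < math.exp(-d / T)):
            tour, cur = cand, cur + d
    return cur

hot = metropolis(10.0)    # near-random walk: tours stay poor
cold = metropolis(0.0)    # pure descent: settles into a 2-opt local optimum
print(round(hot, 2), round(cold, 2))
```

At a very high temperature the walk barely leaves the random-tour regime, while the zero-temperature limit is pure descent into a local optimum; the intermediate-temperature regime between these extremes is where the interbasin dynamics discussed above operate.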
Simulation based planning of surgical interventions in pediatric cardiology
Marsden, Alison L.
2013-10-01
Hemodynamics plays an essential role in the progression and treatment of cardiovascular disease. However, while medical imaging provides increasingly detailed anatomical information, clinicians often have limited access to hemodynamic data that may be crucial to patient risk assessment and treatment planning. Computational simulations can now provide detailed hemodynamic data to augment clinical knowledge in both adult and pediatric applications. There is a particular need for simulation tools in pediatric cardiology, due to the wide variation in anatomy and physiology in congenital heart disease patients, necessitating individualized treatment plans. Despite great strides in medical imaging, enabling extraction of flow information from magnetic resonance and ultrasound imaging, simulations offer predictive capabilities that imaging alone cannot provide. Patient specific simulations can be used for in silico testing of new surgical designs, treatment planning, device testing, and patient risk stratification. Furthermore, simulations can be performed at no direct risk to the patient. In this paper, we outline the current state of the art in methods for cardiovascular blood flow simulation and virtual surgery. We then step through pressing challenges in the field, including multiscale modeling, boundary condition selection, optimization, and uncertainty quantification. Finally, we summarize simulation results of two representative examples from pediatric cardiology: single ventricle physiology, and coronary aneurysms caused by Kawasaki disease. These examples illustrate the potential impact of computational modeling tools in the clinical setting.
Transmission network expansion planning with simulation optimization
Energy Technology Data Exchange (ETDEWEB)
Bent, Russell W [Los Alamos National Laboratory; Berscheid, Alan [Los Alamos National Laboratory; Toole, G. Loren [Los Alamos National Laboratory
2010-01-01
Within the electric power literature the transmission expansion planning problem (TNEP) refers to the problem of how to upgrade an electric power network to meet future demands. As this problem is a complex, non-linear, and non-convex optimization problem, researchers have traditionally focused on approximate models. Often, their approaches are tightly coupled to the approximation choice. Until recently, these approximations have produced results that are straightforward to adapt to the more complex (real) problem. However, the power grid is evolving towards a state where the adaptations are no longer easy (e.g., large amounts of renewable generation with limited control), which necessitates new optimization techniques. In this paper, we propose a generalization of the powerful Limited Discrepancy Search (LDS) that encapsulates the complexity in a black box that may be queried for information about the quality of a proposed expansion. This allows the development of a new optimization algorithm that is independent of the underlying power model.
Institute of Scientific and Technical Information of China (English)
LIANG WEN-XI; ZHANG JING-JUAN; LÜ JUN-FENG; LIAO RUI
2001-01-01
We have designed a spatially quantized diffractive optical element (DOE) for controlling the beam profile in a three-dimensional space with the help of the simulated annealing (SA) algorithm. In this paper, we investigate the annealing schedule and the neighbourhood, the deterministic parameters of the process that govern the quality of the SA algorithm. The algorithm is employed to solve the discrete stochastic optimization problem of the design of a DOE. The objective function which constrains the optimization is also studied. The computed results demonstrate that the procedure converges stably to a solution close to the global optimum within an acceptable computing time. The results meet the design requirements well and are applicable.
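A generic sketch of SA over quantized design variables is shown below: a 1-D, 16-pixel, four-level phase element maximizing the intensity of one diffraction order. All parameters are illustrative assumptions, and the setting is far simpler than a real 3-D DOE design; it only shows how a discrete neighbourhood (re-quantizing one pixel) and a geometric annealing schedule interact.

```python
import cmath, math, random

random.seed(5)

N, LEVELS = 16, 4                  # 16 pixels, 4 quantized phase levels
TARGET = 3                         # far-field diffraction order to maximize

def intensity(phases):
    """Intensity of the TARGET order for a 1-D phase-only element (discrete Fourier sum)."""
    s = sum(cmath.exp(1j * (2.0 * math.pi * (p / LEVELS + k * TARGET / N)))
            for k, p in enumerate(phases))
    return abs(s) ** 2

phases = [random.randrange(LEVELS) for _ in range(N)]
cur = intensity(phases)
T = 30.0
for step in range(30000):
    k = random.randrange(N)        # neighbourhood: re-quantize a single pixel
    old = phases[k]
    phases[k] = random.randrange(LEVELS)
    new = intensity(phases)
    if new > cur or random.random() < math.exp((new - cur) / T):
        cur = new
    else:
        phases[k] = old            # reject the move
    T *= 0.9997                    # geometric annealing schedule
print(round(cur, 1), round(cur / N ** 2, 3))
```

The quantization caps the achievable diffraction efficiency below the ideal N² bound, and the schedule determines how reliably the search approaches that quantized optimum.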
Simulated annealing reveals the kinetic activity of SGLT1, a member of the LeuT structural family.
Longpré, Jean-Philippe; Sasseville, Louis J; Lapointe, Jean-Yves
2012-10-01
The Na(+)/glucose cotransporter (SGLT1) is the archetype of membrane proteins that use the electrochemical Na(+) gradient to drive uphill transport of a substrate. The crystal structure recently obtained for vSGLT strongly suggests that SGLT1 adopts the inverted repeat fold of the LeuT structural family for which several crystal structures are now available. What is largely missing is an accurate view of the rates at which SGLT1 transits between its different conformational states. In the present study, we used simulated annealing to analyze a large set of steady-state and pre-steady-state currents measured for human SGLT1 at different membrane potentials, and in the presence of different Na(+) and α-methyl-d-glucose (αMG) concentrations. The simplest kinetic model that could accurately reproduce the time course of the measured currents (down to the 2 ms time range) is a seven-state model (C(1) to C(7)) where the binding of the two Na(+) ions (C(4)→C(5)) is highly cooperative. In the forward direction (Na(+)/glucose influx), the model is characterized by two slow, electroneutral conformational changes (59 and 100 s(-1)) which represent reorientation of the free and of the fully loaded carrier between inside-facing and outside-facing conformations. From the inward-facing (C(1)) to the outward-facing Na-bound configuration (C(5)), 1.3 negative elementary charges are moved outward. Although extracellular glucose binding (C(5)→C(6)) is electroneutral, the next step (C(6)→C(7)) carries 0.7 positive charges inside the cell. Alignment of the seven-state model with a generalized model suggested by the structural data of the LeuT fold family suggests that electrogenic steps are associated with the movement of the so-called thin gates on each side of the substrate binding site. To our knowledge, this is the first model that can quantitatively describe the behavior of SGLT1 down to the 2 ms time domain. The model is highly symmetrical and in good agreement with the
Realistic Crowd Simulation with Density-Based Path Planning
van Toll, W.G.; Cook IV, A.F.; Geraerts, R.J.
2012-01-01
Virtual characters in games and simulations often need to plan visually convincing paths through a crowded environment. This paper describes how crowd density information can be used to guide a large number of characters through a crowded environment. Crowd density information helps characters avoid
Lean Supply Chain Planning: A Performance Evaluation through Simulation
Directory of Open Access Journals (Sweden)
Rossini Matteo
2016-01-01
Full Text Available Nowadays companies increasingly seek to improve their efficiency to excel in the market. At the same time, competition has moved from the firm level to the whole supply chain level. Supply chains are very complex systems, and lack of coordination among their members leads to inefficiency. The task of supply chain planning is to improve coordination among supply chain members, but which planning solution best improves efficiency is an open issue. Meanwhile, the Lean approach is becoming more and more popular among managers. Lean is recognized as an efficiency engine for production systems, but the effects of Lean implementation beyond single-firm boundaries are not clear. This paper aims at providing a theoretical and practical starting point for Lean implementation in supply chain planning. To this end, a discrete-event simulation (DES) model of a three-echelon, multi-product supply chain has been set up. Lean management is a very broad topic, and this paper focuses on two principles: “pull” and “create the flow”. A kanban system and setup-time and batch-size reductions are implemented in the lean-configured supply chain to apply “pull” and “create the flow”, respectively. The Lean implementations have been analyzed and compared with other supply chain planning policies: EOQ and information sharing (visibility). Supported by the simulation study, this paper shows that Lean supply chain planning is a competitive planning policy for increasing efficiency.
Naskar, Pulak; Talukder, Srijeeta; Chaudhury, Pinaki
2017-04-05
In this communication, we discuss the advantages of adaptive mutation simulated annealing (AMSA) over standard simulated annealing (SA) in studying the Coulombic explosion of (CO2)n(2+) clusters for n = 20-68, where 'n' is the size of the cluster. We demonstrate how AMSA overcomes the difficulties that can arise in conventional SA by dynamically adapting its parameters (only when needed) during the simulations, so that the search can escape high-energy basins (better exploration) without going astray (better convergence). The technique is also inherently capable of finding more than one minimum in a single run. For the (CO2)n(2+) cluster system we have found the critical limit to be n = 43, above which the attractive forces between individual units become greater than the large repulsive forces and the clusters stay intact as the energetically favoured isomers. This result is in good agreement with earlier studies. Moreover, we have studied the fragmentation patterns for the entire size range and found fission-type fragmentation to be the favoured mechanism for nearly all sizes.
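The adaptation idea can be sketched generically. This is not the authors' AMSA implementation, and a toy rugged energy function replaces the (CO2)n(2+) potential: the mutation width and temperature are enlarged only when the search stalls, letting it climb out of a basin, and they otherwise shrink back toward a conventional schedule.

```python
import math, random

random.seed(6)

def energy(v):
    # Rugged toy surface standing in for the cluster potential energy
    return sum(u * u + 10.0 * (1.0 - math.cos(2.0 * math.pi * u)) for u in v)

def amsa(steps=60000):
    """SA whose mutation width and temperature adapt only when the search stalls."""
    v = [random.uniform(-5, 5) for _ in range(3)]
    e = energy(v)
    best = e
    T, width, stall = 5.0, 1.0, 0
    for _ in range(steps):
        w = [u + random.gauss(0, width) for u in v]
        ew = energy(w)
        if ew < e or random.random() < math.exp((e - ew) / T):
            v, e, stall = w, ew, 0
            best = min(best, e)
        else:
            stall += 1
        if stall > 200:            # stuck in a basin: adapt parameters to escape
            T *= 2.0
            width = min(width * 1.5, 3.0)
            stall = 0
        else:                      # otherwise cool and narrow as in standard SA
            T = max(T * 0.9997, 1e-3)
            width = max(width * 0.99995, 0.05)
    return best

best = amsa()
print(round(best, 3))
```

Triggering the adaptation only on stalls is what distinguishes this from simple reheating schedules: exploration is paid for only when the current basin has been exhausted.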
Directory of Open Access Journals (Sweden)
Kumar Deepak
2015-12-01
Full Text Available Groundwater contamination due to leakage of gasoline is one of several causes that pollute the groundwater environment. In the past few years, in-situ bioremediation has attracted researchers because of its ability to remediate the contaminant at its site at low cost. This paper proposes the use of a new hybrid algorithm to optimize a multi-objective function which includes the cost of remediation as the first objective and the residual contaminant at the end of the remediation period as the second objective. The hybrid algorithm was formed by combining the methods of Differential Evolution, Genetic Algorithms, and Simulated Annealing. Support Vector Machines (SVM) were used as a virtual simulator for biodegradation of contaminants in the groundwater flow. The results obtained from the hybrid algorithm were compared with Differential Evolution (DE), the Non-dominated Sorting Genetic Algorithm (NSGA-II), and Simulated Annealing (SA). It was found that the proposed hybrid algorithm was capable of providing the best solutions. Fuzzy logic was used to find the best compromise solution, and finally a pumping rate strategy for groundwater remediation was presented for it. The results show that the cost incurred for the best compromise solution is intermediate between the highest and lowest costs incurred for the other non-dominated solutions.
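The fuzzy selection step can be sketched as follows, with made-up Pareto points rather than the paper's results; a linear membership function per objective, a common choice in multi-objective power and water studies, is assumed here.

```python
# Fuzzy best-compromise selection over a set of non-dominated solutions
# (remediation cost, residual contaminant); the numbers are illustrative.
pareto = [
    {"cost": 100.0, "residual": 9.0},
    {"cost": 140.0, "residual": 5.0},
    {"cost": 200.0, "residual": 4.5},
]

def memberships(solutions, objectives):
    """Linear fuzzy membership per objective: 1 at the best value, 0 at the worst."""
    mu = []
    for s in solutions:
        m = {}
        for obj in objectives:
            vals = [t[obj] for t in solutions]
            lo, hi = min(vals), max(vals)
            m[obj] = (hi - s[obj]) / (hi - lo)   # both objectives are minimized
        mu.append(m)
    return mu

mu = memberships(pareto, ["cost", "residual"])
# Normalized aggregate membership; the largest marks the best compromise
totals = [sum(m.values()) for m in mu]
scores = [t / sum(totals) for t in totals]
best_idx = max(range(len(pareto)), key=lambda i: scores[i])
print(best_idx, [round(s, 3) for s in scores])
```

The middle solution wins: it is neither the cheapest nor the cleanest, but it balances both memberships, which matches the intermediate-cost behavior reported above.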
Creating virtual humans for simulation-based training and planning
Energy Technology Data Exchange (ETDEWEB)
Stansfield, S.; Sobel, A.
1998-05-12
Sandia National Laboratories has developed a distributed, high-fidelity simulation system for training and planning small-team operations. The system provides an immersive environment populated by virtual objects and humans capable of displaying complex behaviors. The work has focused on developing the behaviors required to carry out complex tasks and decision making under stress. Central to this work are techniques for creating behaviors for virtual humans and for dynamically assigning behaviors to computer-generated forces (CGF) to allow scenarios without fixed outcomes. Two prototype systems have been developed that illustrate these capabilities: MediSim, a trainer for battlefield medics, and VRaptor, a system for planning, rehearsing and training assault operations.
Treatment planning in radiosurgery: parallel Monte Carlo simulation software
Energy Technology Data Exchange (ETDEWEB)
Scielzo, G. [Galliera Hospitals, Genova (Italy). Dept. of Hospital Physics; Grillo Ruggieri, F. [Galliera Hospitals, Genova (Italy). Dept. for Radiation Therapy; Modesti, M.; Felici, R. [Electronic Data System, Rome (Italy); Surridge, M. [University of Southampton (United Kingdom). Parallel Application Centre
1995-12-01
The main objective of this research was to evaluate the possibility of direct Monte Carlo simulation for accurate dosimetry with short computation time. We made use of: a graphics workstation, a linear accelerator, and water, PMMA and anthropomorphic phantoms, for validation purposes; ionometric, film and thermoluminescent techniques, for dosimetry; and a treatment planning system, for comparison. Benchmarking results suggest that short computing times can be obtained with the parallel version of EGS4 that was developed. Parallelism was obtained by assigning incident photons to separate processors, which required the development of a parallel random number generator. Validation consisted of phantom irradiation and comparison of predicted and measured values, with good agreement in PDD and dose profiles. Experiments on anthropomorphic phantoms (with inhomogeneities) were carried out, and these values are being compared with results obtained with the conventional treatment planning system.
Manufacturing Resource Planning Technology Based on Genetic Programming Simulation
Institute of Scientific and Technical Information of China (English)
GAO Shiwen; LIAO Wenhe; GUO Yu; LIU Jinshan; SU Yan
2009-01-01
Network-based manufacturing is a kind of distributed system that enables manufacturers to finish production tasks and grasp market opportunities even when their own manufacturing resources are insufficient. One of the main problems in network-based manufacturing is allocating resources and assigning tasks rationally according to a flexible resource distribution. Mapping rules and relations between production techniques and resources are proposed, followed by the definition of the resource unit. A genetic programming method for optimizing the manufacturing system is then put forward. Software for the simulation-based optimization system using genetic programming techniques has been developed, and the manufacturing resource planning problems of network-based manufacturing are solved by simulating the optimization methods. In theory, optimal proposals for hardware planning, company selection and scheduling can be obtained to support company managers in scientific decision-making.
A Graphical Interactive Simulation Environment for Production Planning in Bacon Factories
DEFF Research Database (Denmark)
Nielsen, Kirsten Mølgaard; Nielsen, Jens Frederik Dalsgaard
1994-01-01
The paper describes a graphical interactive simulation tool for production planning in bacon factories.
MRPPSim: A Multi-Robot Path Planning Simulation
Directory of Open Access Journals (Sweden)
Ebtehal Turki Saho Alotaibi
2016-08-01
Full Text Available The multi-robot path planning problem is an interesting research problem with great potential for several optimization problems. In the multi-robot path planning problem (MRPP) domain, robots must move from their start locations to their goal locations while avoiding collisions with each other. MRPP is relevant in several domains, including automated package handling inside warehouses, automated guided vehicles, planetary exploration, robotic mining, and video games. This work introduces MRPPSim, a new modeling, evaluation and simulation tool for multi-robot path planning algorithms and their applications. MRPPSim handles all aspects of multi-robot path planning algorithms: it unifies the representation of the input; it provides researchers with a set of evaluation models, each serving a set of objectives; and it provides a comprehensive method to evaluate and compare an algorithm's performance against algorithms that solve public benchmark problems, as shown in the literature. The work presented in this paper also provides a complete tool to reformat and control user input, covering critical small benchmark, biconnected, random and grid problems. Once all of this is performed, MRPPSim calculates the common performance measurements of multi-robot path planning algorithms in a unified way and animates the results so that researchers can follow their algorithms' executions. In addition, MRPPSim is designed as a set of models, each dedicated to a specific function, which allows new algorithms, evaluation models, or performance measurements to be easily plugged into the simulator.
Glowworm swarm optimization algorithm merging simulated annealing strategy
Institute of Scientific and Technical Information of China (English)
曹秀爽
2014-01-01
The artificial glowworm swarm optimization algorithm is a recent research direction in the field of swarm intelligence. The algorithm has been successful in complex function optimization, but it easily falls into local optima and converges slowly in later iterations. The simulated annealing algorithm has excellent global search ability. Combining their advantages, an improved glowworm swarm optimization algorithm based on a simulated annealing strategy is proposed. The simulated annealing strategy is integrated into the global search process of the glowworm swarm optimization algorithm, and a tempering strategy is integrated into the local search process of the hybrid algorithm to improve search precision, improving both the global and local search performance of glowworm swarm optimization. Simulation results show that the hybrid algorithm significantly increases solution accuracy and convergence speed and is a feasible and effective method.
Distribution System Optimization Planning Based on Plant Growth Simulation Algorithm
Institute of Scientific and Technical Information of China (English)
WANG Chun; CHENG Hao-zhong; HU Ze-chun; WANG Yi
2008-01-01
An approach for the integrated optimization of the construction/expansion capacity of high-voltage/medium-voltage (HV/MV) substations and the configuration of the MV radial distribution network was presented using the plant growth simulation algorithm (PGSA). The optimization considered the fixed costs of investment in lines and substations and the variable costs associated with operating the system, under branch capacity, substation capacity and bus voltage constraints. The chosen optimization variables considerably reduce the problem dimension and speed up the optimization. The effectiveness of the proposed approach was tested on a distribution system planning case.
Decision support for simulation-based operation planning
Schubert, Johan; Hörling, Pontus
2016-05-01
In this paper, we develop methods for analyzing large amounts of data from a military ground combat simulation system. Through a series of processes, we focus the big data set on situations that correspond to important questions and show advantageous outcomes. The result is a decision support methodology that provides commanders with results that answer specific questions of interest, such as what the consequences for the Blue side are in various Red scenarios or what a particular Blue force can withstand. This approach is a step toward taking the traditional data farming methodology from its analytical view into a prescriptive operation planning context and a decision making mode.
[Planning and simulation of minimally-invasive robotic heart surgery].
Coste-Manière, Eve; Adhami, Louaï; Severac-Bastide, Renault; Boissonnat, Jean-Daniel; Carpentier, Alain
2002-04-01
Due to their numerous advantages, mainly in terms of patient benefit, minimally invasive robotically assisted interventions are gaining importance in various surgical fields. However, this transition has its own challenges, which stem from both its novelty and its complexity. In this paper we propose to accompany surgeons in this transition by offering an integrated environment that enables them to make better use of this new technology. The proposed system is patient-specific and enables the planning, validation, simulation, teaching and archiving of robotically assisted interventions. The approach is illustrated for a coronary bypass graft using the da Vinci tele-operated robot.
Building Performance Simulation tools for planning of energy efficiency retrofits
DEFF Research Database (Denmark)
Mondrup, Thomas Fænø; Karlshøj, Jan; Vestergaard, Flemming
2014-01-01
Designing energy efficiency retrofits for existing buildings will bring environmental, economic, social, and health benefits. However, selecting specific retrofit strategies is complex and requires careful planning. In this study, we describe a methodology for adopting Building Performance Simulation (BPS) tools as energy and environmentally conscious decision-making aids. The methodology has been developed to screen buildings for potential improvements and to support the development of retrofit strategies. We present a case study of a Danish renovation project, implementing BPS approaches to energy efficiency retrofits in social housing. To generate energy savings, we focus on optimizing the building envelope. We evaluate alternative building envelope actions using procedural solar radiation and daylight simulations. In addition, we identify the digital information flow and the information...
Directory of Open Access Journals (Sweden)
Bjelić Mišo B.
2016-01-01
Full Text Available Simulation models of welding processes allow us to predict the influence of welding parameters on the temperature field during welding and, through the temperature field, their influence on weld geometry and microstructure. This article presents a numerical, finite-difference based model of heat transfer during welding of thin sheets. Unfortunately, the accuracy of the model depends on many parameters that cannot be accurately prescribed. To solve this problem, we used the simulated annealing optimization method in combination with the presented numerical model to determine uncertain values of the heat source parameters, arc efficiency, emissivity and enhanced conductivity. The calibration procedure used thermocouple measurements of temperatures during welding of P355GH steel. The obtained results were used as input for a simulation run, which showed that the presented calibration procedure can significantly improve the reliability of the heat transfer model. [Acknowledgements to the National CEEPUS Office of the Czech Republic (project CIII-HR-0108-07-1314) and to the Ministry of Education and Science of the Republic of Serbia (project TR37020)]
Energy Technology Data Exchange (ETDEWEB)
Estevez H, O.; Duque, J. [Universidad de La Habana, Instituto de Ciencia y Tecnologia de Materiales, 10400 La Habana (Cuba); Rodriguez H, J. [UNAM, Instituto de Investigaciones en Materiales, 04510 Mexico D. F. (Mexico); Yee M, H., E-mail: oestevezh@yahoo.com [Instituto Politecnico Nacional, Escuela Superior de Fisica y Matematicas, 07738 Mexico D. F. (Mexico)
2015-07-01
1-Furoyl-3,3-diphenylthiourea (FDFT) was synthesized and characterized by FTIR, ¹H and ¹³C NMR, and ab initio X-ray powder structure analysis. FDFT crystallizes in the monoclinic space group P2₁ with a = 12.691(1), b = 6.026(2), c = 11.861(1) Å, β = 117.95(2)° and V = 801.5(3) Å³. The crystal structure was determined from laboratory X-ray powder diffraction data using a direct-space global optimization strategy (simulated annealing) followed by Rietveld refinement. The thiourea group makes a dihedral angle of 73.8(6)° with the furoyl group. In the crystal structure, molecules are linked by van der Waals interactions, forming one-dimensional chains along the a axis. (Author)
Directory of Open Access Journals (Sweden)
Yanhui Li
2013-01-01
Full Text Available Facility location, inventory control, and vehicle route scheduling are critical and highly related problems in the design of logistics systems for e-business. Meanwhile, the return ratio in Internet sales is significantly higher than in traditional business, and much of the returned merchandise has no quality defects and can re-enter sales channels after a simple repackaging process. Focusing on this problem in e-commerce logistics systems, we formulate a location-inventory-routing problem model with no-quality-defect returns. To solve this NP-hard problem, an effective hybrid genetic simulated annealing algorithm (HGSAA) is proposed. Results of numerical examples show that HGSAA outperforms GA in computing time, optimal solution, and computing stability. The proposed model is very useful for helping managers make the right decisions in an e-supply chain environment.
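One common way to hybridise a genetic algorithm with simulated annealing is to accept offspring via the Metropolis criterion under a cooling temperature. The sketch below illustrates only that general pattern; the abstract does not specify HGSAA's actual operators, so all names, operators and parameters here are assumptions.

```python
import math
import random

def hybrid_ga_sa(fitness, init_pop, mutate, crossover,
                 t0=1.0, alpha=0.97, generations=400, seed=0):
    """Illustrative GA-SA hybrid (not necessarily the paper's HGSAA):
    each offspring replaces its parent if better, or with Metropolis
    probability exp(-delta/t) under a cooling temperature t
    (minimisation)."""
    rng = random.Random(seed)
    pop = list(init_pop)
    t = t0
    for _ in range(generations):
        nxt = []
        for parent in pop:
            mate = rng.choice(pop)
            child = mutate(crossover(parent, mate, rng), rng)
            d = fitness(child) - fitness(parent)
            if d <= 0 or rng.random() < math.exp(-d / max(t, 1e-12)):
                nxt.append(child)   # SA-style acceptance of offspring
            else:
                nxt.append(parent)
        pop = nxt
        t *= alpha                  # cool once per generation
    return min(pop, key=fitness)

# Toy usage: minimise the sum of squares of a 5-dimensional vector
f = lambda v: sum(x * x for x in v)
pop0 = []
for i in range(20):
    r = random.Random(i)
    pop0.append([r.uniform(-5, 5) for _ in range(5)])
cx = lambda a, b, rng: [x if rng.random() < 0.5 else y for x, y in zip(a, b)]
mut = lambda v, rng: [x + rng.gauss(0, 0.2) for x in v]
best = hybrid_ga_sa(f, pop0, mut, cx)
```

The annealed acceptance keeps population diversity early (worse offspring are sometimes kept) while the cooling schedule makes the late generations purely elitist.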
Energy Technology Data Exchange (ETDEWEB)
Joseph, Joby; Muthukumaran, S. [National Institute of Technology, Tamil Nadu (India)
2016-01-15
Abundant improvements have occurred in materials joining, especially in metal joining. Pulsed current gas tungsten arc welding (PCGTAW) is one of the consequential fusion techniques. In this work, PCGTAW of AISI 4135 steel produced through powder metallurgy (P/M) has been executed, and the process parameters have been studied by applying Taguchi's L9 orthogonal array. The results show that the peak current (Ip), gas flow rate (GFR), welding speed (WS) and base current (Ib) are the critical parameters that strongly determine the tensile strength (TS) and percentage of elongation (% Elong) of the joint. The practical impact of applying genetic algorithm (GA) and simulated annealing (SA) methods to the PCGTAW process has been validated by calculating the deviation between predicted and experimental welding process parameters.
Directory of Open Access Journals (Sweden)
Bailing Liu
2015-01-01
Full Text Available Facility location, inventory control, and vehicle route scheduling are three key issues to be settled in the design of logistics systems for e-commerce. Due to the online shopping features of e-commerce, customer returns are much more frequent than in traditional commerce. This paper studies a three-phase supply chain distribution system consisting of one supplier, a set of retailers, and a single type of product with a continuous review (Q, r) inventory policy. We formulate a stochastic location-inventory-routing problem (LIRP) model with no-quality-defect returns. To solve the NP-hard problem, a pseudo-parallel genetic algorithm integrating simulated annealing (PPGASA) is proposed. The computational results show that PPGASA outperforms GA in optimal solution, computing time, and computing stability.
Directory of Open Access Journals (Sweden)
Banani Basu
2010-05-01
Full Text Available In this paper, we propose a technique based on two evolutionary algorithms, simulated annealing and particle swarm optimization, to design a linear array of half-wavelength-long parallel dipole antennas that generates a pencil beam in the horizontal plane with minimum standing wave ratio (SWR) and fixed side lobe level (SLL). The dynamic range ratio of the current amplitude distribution is kept at a fixed value. Two different methods are proposed with different inter-element spacing but the same current amplitude distribution. The first uses a fixed geometry and optimizes the excitation distribution on it. The second further reduces the SWR by optimizing the inter-element spacing while keeping the amplitude distribution the same as before. The coupling effect between the elements is analyzed using the induced EMF method and minimized in terms of SWR. Numerical results obtained from SA are validated by comparing them with results obtained using PSO.
Kaplan, Sezgin; Rabadi, Ghaith
2013-01-01
This article addresses the aerial refuelling scheduling problem (ARSP), where a set of fighter jets (jobs) with certain ready times must be refuelled from tankers (machines) by their due dates; otherwise, they reach a low fuel level (deadline) incurring a high cost. ARSP is an identical parallel machine scheduling problem with release times and due date-to-deadline windows to minimize the total weighted tardiness. A simulated annealing (SA) and metaheuristic for randomized priority search (Meta-RaPS) with the newly introduced composite dispatching rule, apparent piecewise tardiness cost with ready times (APTCR), are applied to the problem. Computational experiments compared the algorithms' solutions to optimal solutions for small problems and to each other for larger problems. To obtain optimal solutions, a mixed integer program with a piecewise weighted tardiness objective function was solved for up to 12 jobs. The results show that Meta-RaPS performs better in terms of average relative error but SA is more efficient.
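The total weighted tardiness objective that both SA and Meta-RaPS minimise can be written down directly. The sketch below assumes a simple data layout (job id mapped to ready time, processing time, due date, weight), which is an illustrative choice rather than the paper's format, and it ignores the due-date-to-deadline window.

```python
def total_weighted_tardiness(jobs, schedule):
    """ARSP-style objective for an identical parallel machine schedule.
    `jobs` maps job id -> (ready, proc, due, weight); `schedule` is a
    list of job-id lists, one per machine (tanker), in service order."""
    total = 0.0
    for machine_jobs in schedule:
        t = 0.0
        for j in machine_jobs:
            ready, proc, due, w = jobs[j]
            t = max(t, ready) + proc        # wait for release, then process
            total += w * max(0.0, t - due)  # weighted tardiness
    return total

# Two tankers, four jets: (ready, proc, due, weight)
jobs = {
    1: (0.0, 3.0, 4.0, 2.0),
    2: (1.0, 2.0, 3.0, 1.0),
    3: (0.0, 4.0, 9.0, 1.0),
    4: (2.0, 1.0, 5.0, 3.0),
}
cost = total_weighted_tardiness(jobs, [[1, 2], [3, 4]])
# → 2.0 (job 2 finishes at time 5, two units past its due date, weight 1)
```

An SA heuristic for this problem would perturb `schedule` (e.g. by moving or swapping jobs between tankers) and accept moves with the usual Metropolis rule on this objective.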
Web Mining Based on Hybrid Simulated Annealing Genetic Algorithm and HMM
Institute of Scientific and Technical Information of China (English)
邹腊梅; 龚向坚
2012-01-01
The algorithm used to train hidden Markov models (HMMs) is a local-search algorithm that is sensitive to its initial parameters. Training a typical HMM with random parameters often leads to sub-optimal models, which are ineffective for mining Web information. The genetic algorithm (GA) has excellent global search ability but converges slowly and is prone to premature convergence, while simulated annealing (SA) has excellent local search ability but tends to wander randomly and lacks global search ability. Combining the advantages of the two, a hybrid simulated annealing genetic algorithm (SGA) is proposed. SGA chooses its best parameters by experiment and optimizes the initial HMM parameters in combination with Baum-Welch during Web mining, compensating for Baum-Welch's sensitivity to initial parameters. The experimental results on five extraction domains show that SGA significantly improves precision and recall.
Simulated annealing ant colony algorithm for the quadratic assignment problem (QAP)
Institute of Scientific and Technical Information of China (English)
朱经纬; 芮挺; 蒋新胜; 张金林
2011-01-01
A simulated annealing ant colony algorithm is presented to tackle the quadratic assignment problem (QAP). The simulated annealing mechanism is introduced into the ant colony algorithm by setting a temperature that changes with the iterations. After each tour construction, the solution set obtained by the colony is taken as the candidate set, and an update set is generated by accepting solutions from the candidate set with a probability determined by the current temperature. The update set is used to update the pheromone trail matrix, and in each update the best solution is used to reinforce the trail information. The trail matrix is reset when the algorithm stagnates. Computational experiments demonstrate that the algorithm has high stability and convergence speed.
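The QAP objective and the temperature-controlled acceptance of candidate solutions described above can be sketched as follows. The acceptance routine is an illustrative reading of the abstract, not the authors' exact update rule.

```python
import math
import random

def qap_cost(perm, flow, dist):
    """QAP objective: sum of flow[i][j] * dist[perm[i]][perm[j]] over
    all facility pairs, for an assignment perm of facilities to
    locations."""
    n = len(perm)
    return sum(flow[i][j] * dist[perm[i]][perm[j]]
               for i in range(n) for j in range(n))

def accept_candidates(candidates, current_cost, temp, flow, dist, rng):
    """SA-style filter on the colony's candidate set: keep improving
    permutations, and worse ones with Metropolis probability
    (illustrative assumption about the authors' update rule)."""
    kept = []
    for perm in candidates:
        d = qap_cost(perm, flow, dist) - current_cost
        if d <= 0 or rng.random() < math.exp(-d / max(temp, 1e-12)):
            kept.append(perm)
    return kept

# Tiny three-facility instance
flow = [[0, 5, 2], [5, 0, 3], [2, 3, 0]]
dist = [[0, 1, 4], [1, 0, 2], [4, 2, 0]]
base = qap_cost([0, 1, 2], flow, dist)  # → 38
```

At high temperature nearly every candidate survives into the update set (broad pheromone reinforcement); as the temperature drops, only improving permutations get through, so the trail matrix concentrates on good assignments.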
Directory of Open Access Journals (Sweden)
Marco A. C. Benvenga
2011-10-01
Full Text Available Kinetic simulation and drying process optimization of corn malt by simulated annealing (SA), for estimation of the temperature and time parameters that preserve maximum amylase activity in the obtained product, are presented here. Germinated corn seeds were dried at 54-76 °C in a convective dryer, with occasional measurement of moisture content and enzymatic activity. The experimental data obtained were submitted to modeling, and simulation and optimization of the drying process were performed using the SA method, a randomized improvement algorithm analogous to the physical annealing process. Results showed that seeds were best dried for between 3 h and 5 h. Among the models used in this work, the kinetic model of water diffusion into corn seeds showed the best fit. Drying temperature and time showed a quadratic influence on the enzymatic activity. Optimization through SA found the best condition at 54 °C and between 5.6 h and 6.4 h of drying, with a specific activity in the corn malt of 5.26±0.06 SKB/mg and 15.69±0.10% remaining moisture.
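The quadratic influence of temperature and time on enzymatic activity lends itself to a simple annealing search over the drying conditions. The sketch below uses an entirely hypothetical quadratic response surface (not the paper's fitted model) peaking near 54 °C and 6 h, just to show how SA would locate the best drying condition inside the experimental box.

```python
import math
import random

def activity(T, t):
    # Hypothetical quadratic response surface (illustration only,
    # not the fitted model from the paper): peak near 54 C, 6 h.
    return -0.05 * (T - 54.0) ** 2 - 2.0 * (t - 6.0) ** 2 + 100.0

rng = random.Random(1)
T, t = 65.0, 4.0                  # start inside the 54-76 C, 3-8 h box
best = (T, t, activity(T, t))
temp = 5.0
for _ in range(5000):
    Tn = min(76.0, max(54.0, T + rng.uniform(-1.0, 1.0)))
    tn = min(8.0, max(3.0, t + rng.uniform(-0.25, 0.25)))
    delta = activity(Tn, tn) - activity(T, t)
    # Maximisation: accept improvements, or worse moves with exp(delta/temp)
    if delta >= 0 or rng.random() < math.exp(delta / temp):
        T, t = Tn, tn
        if activity(T, t) > best[2]:
            best = (T, t, activity(T, t))
    temp *= 0.999                 # slow geometric cooling
```

With the real fitted model in place of `activity`, the same loop would recover an optimum analogous to the paper's 54 °C / ~6 h condition.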
Balin Talamba, D.; Higy, C.; Joerin, C.; Musy, A.
The paper presents an application of hydrological modelling for the Haute-Mentue catchment, located in western Switzerland. A simplified version of Topmodel, developed in a Labview programming environment, was applied with the aim of modelling the hydrological processes of this catchment. Previous research carried out in this region outlined the importance of environmental tracers in studying hydrological behaviour, and important knowledge has been accumulated concerning the mechanisms responsible for runoff generation. In conformity with the theoretical constraints, Topmodel was applied to a Haute-Mentue sub-catchment where tracing experiments showed constantly low contributions of soil water during flood events. The model was applied for two humid periods in 1998. First, the model was calibrated to provide the best estimates of total runoff; however, the simulated components (groundwater and rapid flow) deviated far from the reality indicated by the tracing experiments. Thus, a new calibration was performed including additional information given by the environmental tracing. The calibration of the model was done using simulated annealing (SA) techniques, which are easy to implement and statistically allow convergence to a global minimum. The only problem is that the method is time- and computer-consuming. To improve this, a version of SA known as very fast simulated annealing (VFSA) was used. The principles are the same as for SA: the random search is guided by a certain probability distribution and the acceptance criterion is the same, but VFSA better takes into account the range of variation of each parameter. Practice with Topmodel showed that the energy function has different sensitivities along different dimensions of the parameter space. The VFSA algorithm allows differentiated search in relation with the
Asian Rhinoplasty: Preoperative Simulation and Planning Using Adobe Photoshop.
Kiranantawat, Kidakorn; Nguyen, Anh H
2015-11-01
A rhinoplasty in Asians differs from a rhinoplasty performed in patients of other ethnicities. Surgeons should understand the concept of Asian beauty, the nasal anatomy of Asians, and common problems encountered while operating on the Asian nose. With this understanding, surgeons can set appropriate goals, choose proper operative procedures, and provide an outcome that satisfies patients. In this article the authors define the concept of an Asian rhinoplasty, a paradigm shift from the traditional on-top augmentation rhinoplasty to a structurally integrated augmentation rhinoplasty, and provide a step-by-step procedure for using Adobe Photoshop preoperatively to simulate the expected surgical outcome for patients and to develop a preoperative plan for surgeons.
Simulation and optimization models for emergency medical systems planning.
Bettinelli, Andrea; Cordone, Roberto; Ficarelli, Federico; Righini, Giovanni
2014-01-01
The authors address strategic planning problems for emergency medical systems (EMS). In particular, the three following critical decisions are considered: i) how many ambulances to deploy in a given territory at any given point in time to meet the forecasted demand with an appropriate response time; ii) when ambulances should be used for serving non-urgent requests and when they should better be kept idle for possible incoming urgent requests; iii) how to define an optimal mix of contracts for renting ambulances from private associations to meet the forecasted demand at minimum cost. Analytical models for decision support, based on queuing theory, discrete-event simulation, and integer linear programming, are presented. Computational experiments were carried out on real data from the city of Milan, Italy.
A Simulated Annealing Heuristic for the Consistent Vehicle Routing Problem
Institute of Scientific and Technical Information of China (English)
刘恒宇; 汝宜红
2015-01-01
In view of the "service consistency" characteristic of the consistent vehicle routing problem, a template-based simulated annealing heuristic (TSA) is proposed to find better solutions. The algorithm works in two stages: in the first stage, template routes are obtained; in the second stage, the template routes serve as a reference for determining the daily vehicle routing schedules. The simulated annealing heuristic is applied in both stages. Numerical experiments on small- and medium-scale benchmark data sets compare the results of TSA with those of the ConRTR and TTS algorithms; both the routing schedules and the "service consistency" indicators are improved. The results show that using TSA to plan vehicle delivery routes not only reduces operating costs but also improves delivery service quality.
Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases
Energy Technology Data Exchange (ETDEWEB)
Carlsbad Field Office
2007-11-13
The Performance Demonstration Program (PDP) for headspace gases distributes blind audit samples in a gas matrix for analysis of volatile organic compounds (VOCs). Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility’s compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document
Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases
Energy Technology Data Exchange (ETDEWEB)
Carlsbad Field Office
2007-11-19
The Performance Demonstration Program (PDP) for headspace gases distributes blind audit samples in a gas matrix for analysis of volatile organic compounds (VOCs). Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility’s compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document
Dosimetry audit simulation of treatment planning system in multicenters radiotherapy
Kasmuri, S.; Pawiro, S. A.
2017-07-01
Treatment Planning System (TPS) is an important modality that determines radiotherapy outcome. A TPS requires input data obtained through commissioning, and errors can potentially occur at this stage; an error here may result in a systematic error. The aim of this study is to verify TPS dosimetry and determine the deviation range between calculated and measured dose. This study used the CIRS 002LFC phantom, representing the human thorax, and simulated all stages of external beam radiotherapy. The phantom was scanned with a CT scanner, and eight test cases similar to clinical practice situations were planned and tested in four radiotherapy centers. Doses were measured using a 0.6 cc ionization chamber. The results showed that, in general, the deviations of all test cases in the four centers were within the agreement criteria, with average deviations of about -0.17±1.59%, -1.64±1.92%, 0.34±1.34% and 0.13±1.81%. The conclusion of this study is that all TPSs involved showed good performance. The superposition algorithm showed somewhat poorer performance than either the anisotropic analytical algorithm (AAA) or the convolution algorithm, with average deviations of about -1.64±1.92%, -0.17±1.59% and -0.27±1.51%, respectively.
A Hybrid Analytical/Simulation Modeling Approach for Planning and Optimizing Mass Tactical Airborne Operations
David Douglas Briggs
1995-05-01
Thesis, May 1995. Simulation modeling presents itself as an excellent alternate tool for planning because it allows for the modeling of highly complex …
A Constrained CA Model for Planning Simulation Incorporating Institutional Constraints
Institute of Scientific and Technical Information of China (English)
2010-01-01
In recent years it has become prevalent to simulate urban growth by means of cellular automata (CA) modeling, which is based on self-organizing theories and differs from system dynamics modeling. Since the urban system is decidedly complex, CA models applied to urban growth simulation should take into consideration not only the neighborhood influence but also other factors influencing urban development. We put forward the concept of a complex constrained CA (CC-CA) model, which integrates the constraining conditions of neighborhood, macro socio-economy, space and institution. In particular, constrained construction zoning is considered in the CC-CA model as one institutional constraint. The paper introduces the conceptual CC-CA model together with its transition rules. Based on the CC-CA model for Beijing, we discuss the complex constraints on the city's urban development and show how institutional constraints can be set in planning scenarios to control the urban growth pattern of Beijing.
Application of WEAP Simulation Model to Hengshui City Water Planning
Institute of Scientific and Technical Information of China (English)
OJEKUNLE Z O; ZHAO Lin; LI Manzhou; YANG Zhen; TAN Xin
2007-01-01
Like many river basins in China, water resources in the Fudong Pai River are almost fully allocated. This paper seeks to assess and evaluate water resource problems using water evaluation and planning (WEAP) model via its application to Hengshui Basin of Fudong Pai River. This model allows the simulation and analysis of various water allocation scenarios and, above all, scenarios of users' behavior. Water demand management is one of the options discussed in detail. Simulations are proposed for diverse climatic situations from dry years to normal years and results are discussed. Within the limits of data availability, it appears that most water users are not able to meet all their requirements from the river, and that even the ecological reserve will not be fully met during certain years. But the adoption of water demand management procedures offers opportunities for remedying this situation during normal hydrological years. However, it appears that demand management alone will not suffice during dry years. Nevertheless, the ease of use of the model and its user-friendly interfaces make it particularly useful for discussions and dialogue on water resources management among stakeholders.
Energy Technology Data Exchange (ETDEWEB)
Fonville, Judith M., E-mail: j.fonville07@imperial.ac.uk [Biomolecular Medicine, Department of Surgery and Cancer, Faculty of Medicine, Imperial College London, Sir Alexander Fleming Building, South Kensington, London SW7 2AZ (United Kingdom); Bylesjoe, Max, E-mail: max.bylesjo@almacgroup.com [Almac Diagnostics, 19 Seagoe Industrial Estate, Craigavon BT63 5QD (United Kingdom); Coen, Muireann, E-mail: m.coen@imperial.ac.uk [Biomolecular Medicine, Department of Surgery and Cancer, Faculty of Medicine, Imperial College London, Sir Alexander Fleming Building, South Kensington, London SW7 2AZ (United Kingdom); Nicholson, Jeremy K., E-mail: j.nicholson@imperial.ac.uk [Biomolecular Medicine, Department of Surgery and Cancer, Faculty of Medicine, Imperial College London, Sir Alexander Fleming Building, South Kensington, London SW7 2AZ (United Kingdom); Holmes, Elaine, E-mail: elaine.holmes@imperial.ac.uk [Biomolecular Medicine, Department of Surgery and Cancer, Faculty of Medicine, Imperial College London, Sir Alexander Fleming Building, South Kensington, London SW7 2AZ (United Kingdom); Lindon, John C., E-mail: j.lindon@imperial.ac.uk [Biomolecular Medicine, Department of Surgery and Cancer, Faculty of Medicine, Imperial College London, Sir Alexander Fleming Building, South Kensington, London SW7 2AZ (United Kingdom); Rantalainen, Mattias, E-mail: rantalai@stats.ox.ac.uk [Department of Statistics, Oxford University, 1 South Parks Road, Oxford OX1 3TG (United Kingdom)
2011-10-31
Highlights: • Non-linear modeling of metabonomic data using K-OPLS. • Automated optimization of the kernel parameter by simulated annealing. • K-OPLS provides improved prediction performance for exemplar spectral data sets. • Software implementation available for R and Matlab under the GPL v2 license. - Abstract: Linear multivariate projection methods are frequently applied for predictive modeling of spectroscopic data in metabonomic studies. The OPLS method is a commonly used computational procedure for characterizing spectral metabonomic data, largely due to its favorable model interpretation properties providing separate descriptions of predictive variation and response-orthogonal structured noise. However, when the relationship between descriptor variables and the response is non-linear, conventional linear models will perform sub-optimally. In this study we have evaluated to what extent a non-linear model, kernel-based orthogonal projections to latent structures (K-OPLS), can provide enhanced predictive performance compared to the linear OPLS model. Just like its linear counterpart, K-OPLS provides separate model components for predictive variation and response-orthogonal structured noise. The improved model interpretation afforded by this separate modeling is a property unique to K-OPLS in comparison to other kernel-based models. Simulated annealing (SA) was used for effective and automated optimization of the kernel-function parameter in K-OPLS (SA-K-OPLS). Our results reveal that the non-linear K-OPLS model provides improved prediction performance in three separate metabonomic data sets compared to the linear OPLS model. We also demonstrate how response-orthogonal K-OPLS components provide valuable biological interpretation of model and data. The metabonomic data sets were acquired using proton Nuclear Magnetic Resonance (NMR) spectroscopy, and include a study of the liver toxin galactosamine, a study of the nephrotoxin mercuric chloride and
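The SA step the abstract describes can be sketched generically. This is a hedged, hypothetical stand-in rather than the authors' SA-K-OPLS code: here `objective` would be the cross-validated prediction error of a K-OPLS model as a function of the kernel width, and the cooling schedule and proposal step are illustrative assumptions.

```python
import math
import random

def sa_minimize(objective, lo, hi, t0=1.0, cooling=0.95, steps=200, seed=1):
    """Generic 1-D simulated annealing over [lo, hi], e.g. for tuning a
    kernel-function parameter by cross-validated error."""
    rng = random.Random(seed)
    x = rng.uniform(lo, hi)          # random starting parameter value
    fx = objective(x)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        # Gaussian proposal, clipped to the search interval.
        cand = min(hi, max(lo, x + rng.gauss(0, (hi - lo) * 0.1)))
        fc = objective(cand)
        # Accept improvements always; accept worse moves with probability
        # exp(-delta / t), which shrinks as the temperature cools.
        if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling
    return best, fbest
```

In the paper's setting the objective evaluation is the expensive part (a full model fit per candidate), which is why an automated, derivative-free search such as SA is attractive.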
Energy Technology Data Exchange (ETDEWEB)
Berthiau, G.
1995-10-01
The circuit design problem consists in determining acceptable parameter values (resistors, capacitors, transistor geometries, ...) which allow the circuit to meet various user-given operational criteria (DC consumption, AC bandwidth, transient times, ...). This task is equivalent to a multidimensional and/or multiobjective optimization problem: n-variable functions have to be minimized over a hyper-rectangular domain, and equality constraints may optionally be specified. A similar problem consists in fitting component models; there, the optimization variables are the model parameters, and one aims at minimizing a cost function built on the error between the model response and the data measured on the component. The optimization method chosen for this kind of problem is the simulated annealing method. This method, which originates in combinatorial optimization, has been adapted to continuous-variable problems and compared with other global optimization methods. An efficient strategy for discretizing the variables and a set of complementary stopping criteria have been proposed. The different parameters of the method have been tuned on analytical functions whose minima are known and which are classically used in the literature. Our simulated annealing algorithm has been coupled with the open electrical simulator SPICE-PAC, whose modular structure allows the chaining of simulations required by the circuit optimization process. For high-dimensional problems, we proposed a partitioning technique which ensures proportionality between CPU time and the number of variables. To compare our method with others, we adapted three further methods from combinatorial optimization - the threshold method, a genetic algorithm and the Tabu search method. The tests were performed on the same set of test functions, and the results allow a first comparison between these methods applied to continuous optimization variables. (Abstract Truncated)
Institute of Scientific and Technical Information of China (English)
赵敬和; 谢玲
2011-01-01
The traveling salesman problem (TSP) is an NP-complete problem that is easy to describe but hard to solve: the number of possible routes grows exponentially with the number of cities. This paper uses, for the first time, a simulated annealing algorithm implemented in LabVIEW to solve the problem. Simulation results show that LabVIEW's array operations can effectively implement simulated annealing for the TSP. Compared with other methods, this approach is simpler, more practical, more precise and faster, and it is suitable for TSP instances with any number of cities.
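The accept/reject loop underlying such a solver can be sketched in Python (not the paper's LabVIEW implementation; the swap neighbourhood and geometric cooling schedule are illustrative assumptions):

```python
import math
import random

def tour_length(tour, dist):
    """Total length of the closed tour over the distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def simulated_annealing_tsp(dist, t0=100.0, cooling=0.995, t_min=1e-3, seed=0):
    """Minimal SA for the TSP: two-city-swap neighbourhood, geometric cooling."""
    rng = random.Random(seed)
    n = len(dist)
    tour = list(range(n))
    rng.shuffle(tour)                # random initial tour
    best = tour[:]
    t = t0
    while t > t_min:
        i, j = rng.sample(range(n), 2)
        cand = tour[:]
        cand[i], cand[j] = cand[j], cand[i]   # swap two cities
        delta = tour_length(cand, dist) - tour_length(tour, dist)
        # Accept improvements always; accept worse tours with Boltzmann probability.
        if delta < 0 or rng.random() < math.exp(-delta / t):
            tour = cand
            if tour_length(tour, dist) < tour_length(best, dist):
                best = tour[:]
        t *= cooling
    return best
```

The probabilistic acceptance of worse tours at high temperature is what lets SA escape the local minima that trap greedy sequential search, the drawback the first abstract above highlights.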
RTSTEP regional transportation simulation tool for emergency planning - final report.
Energy Technology Data Exchange (ETDEWEB)
Ley, H.; Sokolov, V.; Hope, M.; Auld, J.; Zhang, K.; Park, Y.; Kang, X. (Energy Systems)
2012-01-20
such materials over a large area, with responders trying to mitigate the immediate danger to the population in a variety of ways that may change over time (e.g., in-place evacuation, staged evacuations, and declarations of growing evacuation zones over time). In addition, available resources will be marshaled in unusual ways, such as the repurposing of transit vehicles to support mass evacuations. Thus, any simulation strategy will need to be able to address highly dynamic effects and will need to be able to handle any mode of ground transportation. Depending on the urgency and timeline of the event, emergency responders may also direct evacuees to leave largely on foot, keeping roadways as clear as possible for emergency responders, logistics, mass transport, and law enforcement. This RTSTEP project developed a regional emergency evacuation modeling tool for the Chicago Metropolitan Area that emergency responders can use to pre-plan evacuation strategies and compare different response strategies on the basis of a rather realistic model of the underlying complex transportation system. This approach is a significant improvement over existing response strategies that are largely based on experience gained from small-scale events, anecdotal evidence, and extrapolation to the scale of the assumed emergency. The new tool will thus add to the toolbox available to emergency response planners to help them design appropriate generalized procedures and strategies that lead to an improved outcome when used during an actual event.
Institute of Scientific and Technical Information of China (English)
齐小刚; 王云鹤
2011-01-01
To address the parameter-setting problem in applying Hopfield neural networks, the working principle of the Hopfield network is described and the selection of model parameters for solving the traveling salesman problem (TSP) is analyzed. An evaluation function for the network is established by normalizing the output data, and a simulated annealing algorithm is then introduced to select the optimal parameters. Experimental results show that, with optimized parameters, the Hopfield neural network model obtains the optimal solution of the TSP more effectively and more quickly.
Huang, C H; Lai, J J; Wei, T Y; Chen, Y H; Wang, X; Kuan, S Y; Huang, J C
2015-01-01
The effects of nanocrystalline phases on the bio-corrosion behavior of highly bio-friendly Ti42Zr40Si15Ta3 metallic glasses in simulated body fluid were investigated, and the findings are compared with our previous observations of Zr53Cu30Ni9Al8 metallic glasses. The Ti42Zr40Si15Ta3 metallic glasses were annealed at temperatures above the glass transition temperature, Tg, for different periods of time to produce different degrees of α-Ti nano-phases in the amorphous matrix. The nanocrystallized Ti42Zr40Si15Ta3 metallic glasses containing corrosion-resistant α-Ti phases exhibited more promising bio-corrosion resistance, owing to their superior pitting resistance. This is distinctly different from the previous case of the Zr53Cu30Ni9Al8 metallic glasses, in which reactive Zr2Cu phases induced serious galvanic corrosion and lower bio-corrosion resistance. Thus, whether a fully amorphous or a partially crystallized metallic glass exhibits better bio-corrosion resistance depends on the nature of the crystallized phase.
Directory of Open Access Journals (Sweden)
Yu Lin
2015-01-01
Full Text Available In recent years, logistics systems with multiple suppliers and plants in neighboring regions have been flourishing worldwide. However, high logistics costs remain a problem for such systems due to lack of information sharing and cooperation. This paper proposes an extended mathematical model that minimizes transportation and pipeline inventory costs via the many-to-many Milk-run routing mode. Because the problem is NP hard, a two-stage heuristic algorithm is developed by comprehensively considering its characteristics. More specifically, an initial satisfactory solution is generated in the first stage through a greedy heuristic algorithm to minimize the total number of vehicle service nodes and the best insertion heuristic algorithm to determine each vehicle’s route. Then, a simulated annealing algorithm (SA with limited search scope is used to improve the initial satisfactory solution. Thirty numerical examples are employed to test the proposed algorithms. The experiment results demonstrate the effectiveness of this algorithm. Further, the superiority of the many-to-many transportation mode over other modes is demonstrated via two case studies.
Li, Ji; Hu, Guoqing; Zhou, Yonghong; Zou, Chong; Peng, Wei; Alam Sm, Jahangir
2017-04-19
As a solution with a high performance-to-cost ratio for differential pressure measurement, piezo-resistive differential pressure sensors are widely used in engineering processes. However, their performance is severely affected by the environmental temperature and the static pressure applied to them. To correct the non-linear measuring characteristics of the piezo-resistive differential pressure sensor, compensation must jointly consider these two aspects. Advantages such as nonlinear approximation capability, highly desirable generalization ability and computational efficiency make the kernel extreme learning machine (KELM) a practical approach for this critical task. Since the KELM model is intrinsically sensitive to the regularization parameter and the kernel parameter, a searching scheme combining the coupled simulated annealing (CSA) algorithm and the Nelder-Mead simplex algorithm is adopted to find an optimal KELM parameter set. A calibration experiment at different working pressure levels was conducted across the working temperature range to assess the proposed method. In comparison with other compensation models such as the back-propagation neural network (BP), radial basis function neural network (RBF), particle swarm optimization optimized support vector machine (PSO-SVM), particle swarm optimization optimized least squares support vector machine (PSO-LSSVM) and extreme learning machine (ELM), the compensation results show that the presented algorithm exhibits a more satisfactory performance with respect to temperature compensation and synthetic compensation problems.
Directory of Open Access Journals (Sweden)
ASHWIN MISHRA,
2011-01-01
Full Text Available In this study, a singularity analysis of the six-degree-of-freedom (DOF) Stewart platform in a specified design configuration has been carried out using various heuristic methods. The Jacobian matrix of the Stewart platform is obtained, the absolute value of the determinant of the Jacobian is taken as the objective function, and the least value of this objective function is searched for in the reachable workspace of the platform so as to find the singular configurations. A configuration is singular if the value of this objective function is zero. The results obtained by the different methods, namely the genetic algorithm, particle swarm optimization and its variants, and simulated annealing, are compared with one another. The variable sets considered are the desired platform motions in the form of translation and rotation in six degrees of freedom. This paper hence presents a proper comparative study of these algorithms based on the results obtained and highlights the advantage of each in terms of computational cost and accuracy.
Optimal Nationwide Traveling Scheme Based on Simulated Annealing Algorithm
Institute of Scientific and Technical Information of China (English)
吕鹏举; 原杰; 吕菁华
2011-01-01
An optimal itinerary for traveling through all provincial capitals, the municipalities, Hong Kong, Macao and Taipei is designed. The practical problems of finding the shortest path and lowest cost for such a trip are analyzed. Taking into account the relationships among cost, route, duration and mode of transportation, a model is established with the shortest path and least cost and time as objectives, and a simulated annealing algorithm is applied to solve it, yielding a travel path that jointly saves money and time. The results demonstrate the correctness and practical value of the proposed travel scheme.
Bagheri Tolabi, Hajar; Hosseini, Rahil; Shakarami, Mahmoud Reza
2016-06-01
This article presents a novel hybrid optimization approach for a nonlinear controller of a distribution static compensator (DSTATCOM). The DSTATCOM is connected to a distribution system with the distributed generation units. The nonlinear control is based on partial feedback linearization. Two proportional-integral-derivative (PID) controllers regulate the voltage and track the output in this control system. In the conventional scheme, the trial-and-error method is used to determine the PID controller coefficients. This article uses a combination of a fuzzy system, simulated annealing (SA) and intelligent water drops (IWD) algorithms to optimize the parameters of the controllers. The obtained results reveal that the response of the optimized controlled system is effectively improved by finding a high-quality solution. The results confirm that using the tuning method based on the fuzzy-SA-IWD can significantly decrease the settling and rising times, the maximum overshoot and the steady-state error of the voltage step response of the DSTATCOM. The proposed hybrid tuning method for the partial feedback linearizing (PFL) controller achieved better regulation of the direct current voltage for the capacitor within the DSTATCOM. Furthermore, in the event of a fault the proposed controller tuned by the fuzzy-SA-IWD method showed better performance than the conventional controller or the PFL controller without optimization by the fuzzy-SA-IWD method with regard to both fault duration and clearing times.
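A hedged illustration of the underlying idea, not the paper's fuzzy-SA-IWD method: plain SA tuning the gains of a PI controller on a toy first-order plant (a hypothetical stand-in for the DSTATCOM voltage loop), minimizing the integral of squared error of the step response instead of hand trial-and-error.

```python
import math
import random

def step_response_cost(kp, ki, dt=0.01, steps=500):
    """Integral of squared error for a PI-controlled first-order plant
    dy/dt = -y + u (a hypothetical stand-in, not the DSTATCOM model)."""
    y, integ, cost = 0.0, 0.0, 0.0
    for _ in range(steps):
        e = 1.0 - y              # unit step reference
        integ += e * dt
        u = kp * e + ki * integ  # PI control law
        y += dt * (-y + u)       # forward-Euler integration of the plant
        cost += e * e * dt
    return cost

def sa_tune_pi(t0=1.0, cooling=0.95, steps=300, seed=2):
    """Simulated annealing over the (kp, ki) gain pair."""
    rng = random.Random(seed)
    gains = [1.0, 1.0]
    c = step_response_cost(*gains)
    best, cbest = gains[:], c
    t = t0
    for _ in range(steps):
        # Gaussian perturbation of the gains, kept non-negative.
        cand = [max(0.0, g + rng.gauss(0, 0.3)) for g in gains]
        cc = step_response_cost(*cand)
        if cc < c or rng.random() < math.exp(-(cc - c) / t):
            gains, c = cand, cc
            if c < cbest:
                best, cbest = gains[:], c
        t *= cooling
    return best, cbest
```

The paper's contribution is precisely in replacing such a bare SA search with a fuzzy-SA-IWD hybrid and a richer closed-loop objective; this sketch only shows the baseline mechanism being improved upon.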
Semi-automatic simulation model generation of virtual dynamic networks for production flow planning
Krenczyk, D.; Skolud, B.; Olender, M.
2016-08-01
Computer modelling, simulation and visualization of production flow allow the efficiency of the production planning process in dynamic manufacturing networks to be increased. The use of a semi-automatic model generation concept, based on a parametric approach, to support production planning processes is presented. The presented approach allows simulation and visualization to be used for verifying production plans and alternative topologies of manufacturing network configurations, together with the automatic generation of a series of production flow scenarios. Computational examples using the Enterprise Dynamics simulation software, comprising the production planning and control steps for a manufacturing network, are also presented.
Health care planning and education via gaming-simulation: a two-stage experiment.
Gagnon, J H; Greenblat, C S
1977-01-01
A two-stage process of gaming-simulation design was conducted: the first stage of design concerned national planning for hemophilia care; the second stage of design was for gaming-simulation concerning the problems of hemophilia patients and health care providers. The planning design was intended to be adaptable to large-scale planning for a variety of health care problems. The educational game was designed using data developed in designing the planning game. A broad range of policy-makers participated in the planning game.
Hase, Chris
2010-01-01
In August 2003, the Secretary of Defense (SECDEF) established the Adaptive Planning (AP) initiative [1] with the objectives of reducing the time necessary to develop and revise Combatant Commander (COCOM) contingency plans and increasing SECDEF plan visibility. In addition to reducing the traditional plan development timeline from twenty-four months to less than twelve months (with a goal of six months) [2], AP increased plan visibility to Department of Defense (DoD) leadership through In-Progress Reviews (IPRs). The IPR process, as well as the increased number of campaign and contingency plans COCOMs had to develop, increased the workload while the number of planners remained fixed. Several efforts, from collaborative planning tools to streamlined processes, were initiated to compensate for the increased workload, enabling COCOMs to better meet shorter planning timelines. This paper examines the Joint Strategic Capabilities Plan (JSCP)-directed contingency planning and staffing requirements assigned to a combatant commander staff through the lens of modeling and simulation. The dynamics of developing a COCOM plan are captured with an ExtendSim [3] simulation. The resulting analysis provides a quantifiable means of measuring a combatant commander staff's workload associated with developing and staffing JSCP [4]-directed contingency plans against COCOM capability/capacity. Modeling and simulation bring significant opportunities in measuring the sensitivity of key variables in the assessment of workload against capability/capacity. Understanding the relationship between plan complexity, number of plans, planning processes, and number of planners on the one hand and the time required for plan development on the other provides valuable information to DoD leadership. Through modeling and simulation, AP leadership can gain greater insight for key decisions on where best to allocate scarce resources in an effort to meet DoD planning objectives.
Institute of Scientific and Technical Information of China (English)
许闻清; 陈剑
2011-01-01
Given the respective merits and drawbacks of the genetic algorithm and the simulated annealing algorithm, the two algorithms were combined. The crossover and mutation probabilities are adjusted dynamically to overcome the premature-convergence phenomenon of the genetic algorithm, yielding an improved genetic simulated annealing algorithm, which is applied to the optimization of a powertrain mounting system.
Test Plan for the Boiling Water Reactor Dry Cask Simulator
Energy Technology Data Exchange (ETDEWEB)
Durbin, Samuel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lindgren, Eric R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2015-11-01
canister. The symmetric single assembly geometry with well-controlled boundary conditions simplifies interpretation of results. Various configurations of outer concentric ducting will be used to mimic conditions for above and below-ground storage configurations of vertical, dry cask systems with canisters. Radial and axial temperature profiles will be measured for a wide range of decay power and helium cask pressures. Of particular interest is the evaluation of the effect of increased helium pressure on allowable heat load and the effect of simulated wind on a simplified below ground vent configuration. While incorporating the best available information, this test plan is subject to changes due to improved understanding from modeling or from as-built deviations to designs. As-built conditions and actual procedures will be documented in the final test report.
Heilala, Juhani; Montonen, Jari; Jarvinen, Paula; Kivikunnas, Sauli
2010-01-01
The chapter is a summary of the following national public research projects: "Integrated dynamic simulation model of enterprise for planning of operations" (1997-1999); "Integrated dynamic customer driven production network management using operative simulation" (2000-2001); and "Integrated dynamic electronic production and suppliers control and planning of resources" (2000-2001). Development work was later carried out in the projects "Modelling and simulation of manufacturing systems for value n...
Bahrami, Saeed; Doulati Ardejani, Faramarz; Baafi, Ernest
2016-05-01
In this study, hybrid models are designed to predict groundwater inflow to an advancing open pit mine and the hydraulic head (HH) in observation wells at different distances from the centre of the pit during its advance. Hybrid methods coupling artificial neural network (ANN) with genetic algorithm (GA) methods (ANN-GA), and simulated annealing (SA) methods (ANN-SA), were utilised. Ratios of depth of pit penetration in aquifer to aquifer thickness, pit bottom radius to its top radius, inverse of pit advance time and the HH in the observation wells to the distance of observation wells from the centre of the pit were used as inputs to the networks. To achieve the objective two hybrid models consisting of ANN-GA and ANN-SA with 4-5-3-1 arrangement were designed. In addition, by switching the last argument of the input layer with the argument of the output layer of two earlier models, two new models were developed to predict the HH in the observation wells for the period of the mining process. The accuracy and reliability of models are verified by field data, results of a numerical finite element model using SEEP/W, outputs of simple ANNs and some well-known analytical solutions. Predicted results obtained by the hybrid methods are closer to the field data compared to the outputs of analytical and simple ANN models. Results show that despite the use of fewer and simpler parameters by the hybrid models, the ANN-GA and to some extent the ANN-SA have the ability to compete with the numerical models.
Spatio-Temporal MEG Source Localization Using Simulated Annealing
Institute of Scientific and Technical Information of China (English)
霍小林; 李军; 刘正东
2001-01-01
Locating the sources of brain magnetic fields is a basic problem of magnetoencephalography (MEG), and the localization of multiple current dipoles is a difficult aspect of the MEG inverse problem. Through a study of the spatio-temporal source model (STSM) of MEG, a method combining spatio-temporal source modeling with simulated annealing is presented for locating multiple current dipoles. This method overcomes the tendency of other optimization methods to become trapped in local minima. The dipole parameters can be separated into linear and nonlinear components; optimizing only the nonlinear components greatly reduces the dimensionality of the optimization. Compared with MUSIC (MUltiple SIgnal Classification), this method correspondingly relaxes the requirement that the dipole sources be independent.
BENDING RAY-TRACING BASED ON SIMULATED ANNEALING METHOD
Institute of Scientific and Technical Information of China (English)
周竹生; 谢金伟
2011-01-01
This paper proposes a new ray-tracing method based on the concept of simulated annealing. The new method not only resolves the traditional ray-tracing methods' excessive dependence on pre-established initial ray paths, but also ensures the construction of satisfactory ray paths, and the calculation of the associated traveltimes, between fixed sources and receivers even for models with highly complicated velocity fields, successfully finding the ray paths whose traveltimes are globally minimal. Furthermore, the algorithm can also compute ray paths with locally minimal traveltimes, and these can easily be constrained by requiring rays to pass through specified fixed points. Trial results on theoretical models demonstrate the feasibility and stability of the method.
Directory of Open Access Journals (Sweden)
Maikel Méndez-Morales
2014-09-01
Full Text Available This article presents the application of the simulated annealing (SA) algorithm to the optimal design of a water distribution system (WDS). SA is a metaheuristic search algorithm based on an analogy between the annealing process in metals (the controlled cooling of a body) and the solution of combinatorial optimization problems. The SA algorithm, together with various mathematical models, has been used successfully in the optimal design of WDSs. The full-scale WDS of the community of Marsella, in San Carlos, Costa Rica, was used as a case study. The SA algorithm was implemented in the well-known EPANET model through the WaterNetGen extension. Three different automated variants of the SA algorithm were compared with the manual trial-and-error design of the Marsella WDS, using unit pipe costs only. The results show that all three automated SA schemes produced unit costs below 0.49 as a fraction of the original cost of the trial-and-error design scheme. This demonstrates that the SA algorithm is capable of optimizing combinatorial problems tied to the minimum-cost design of full-scale water distribution systems.
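The combinatorial problem SA solves here, picking one catalogue diameter per pipe to minimise cost under hydraulic constraints, can be sketched as follows. This is a hedged stand-in, not the EPANET/WaterNetGen workflow: pipe lengths, unit costs, loss factors and the head-loss budget are all made-up illustration data:

```python
import math
import random

# Toy SA pipe-sizing sketch: choose one diameter per pipe from a discrete
# catalogue, minimising cost while a crude aggregate head-loss budget is
# enforced through a penalty term. All numbers are illustrative assumptions.

random.seed(7)
lengths = [100, 250, 180, 90, 300]                     # pipe lengths, m
catalogue = {50: 8.0, 75: 12.0, 100: 18.0, 150: 30.0}  # diameter mm -> cost/m
loss = {50: 0.40, 75: 0.15, 100: 0.06, 150: 0.02}      # head loss per metre
BUDGET = 60.0                                          # allowed total head loss

def cost(design):
    c = sum(catalogue[d] * L for d, L in zip(design, lengths))
    h = sum(loss[d] * L for d, L in zip(design, lengths))
    return c + 1e4 * max(0.0, h - BUDGET)              # penalise infeasibility

state = [50] * len(lengths)         # cheapest design, almost surely infeasible
best = state[:]
T = 5000.0
for _ in range(4000):
    cand = state[:]
    cand[random.randrange(len(cand))] = random.choice(list(catalogue))
    dE = cost(cand) - cost(state)
    if dE < 0 or random.random() < math.exp(-dE / T):
        state = cand
        if cost(state) < cost(best):
            best = state[:]
    T *= 0.999
```

A real WDS design would evaluate nodal pressures with a hydraulic solver instead of the additive loss budget used here; only the neighbourhood move and acceptance rule carry over.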
Biswas, A.; Sharma, S. P.
2012-12-01
Self-potential (SP) is an important geophysical technique that measures the electrical potential due to natural current sources in the Earth's subsurface. An inclined sheet-type model is a very familiar structure associated with mineralization, fault planes, groundwater flow and many other geological features that exhibit self-potential anomalies. A number of linearized and global inversion approaches have been developed for the interpretation of SP anomalies over different structures for various purposes. The mathematical expression for computing the forward response over a two-dimensional dipping sheet-type structure can be written in three different ways, using five variables in each case, and the complexity of the inversion differs among the three forward approaches. In the present study, an interpretation of self-potential anomalies using very fast simulated annealing (VFSA) global optimization has been developed, which yielded new insight into the uncertainty and equivalence in model parameters. Interpretation of the measured data yields the location of the causative body, the depth to its top, its extension, its dip and its quality. A comparative evaluation of the three forward approaches is performed to assess the efficacy of each approach in resolving possible ambiguities. Although each forward formulation yields the same forward response, optimizing different sets of variables using different forward problems poses different kinds of ambiguity in the interpretation. The performance of the three approaches in optimization has been compared, and it is observed that one of the three methods is best suited to this kind of study. Our VFSA approach has been tested on synthetic, noisy and field data for the three different methods to show the efficacy and suitability of the best method. It is important to use the forward problem in the optimization that yields the
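Very fast simulated annealing differs from standard SA mainly in its temperature-dependent, Cauchy-like move generator and its fast cooling schedule. A minimal sketch in the spirit of Ingber's VFSA follows; the quadratic objective stands in for the SP forward-response misfit, and the bounds, schedule constants and minimum location are illustrative assumptions:

```python
import math
import random

# Minimal VFSA sketch: bounded parameters, Cauchy-like move generation,
# and the T(k) = T0 * exp(-c * k**(1/D)) cooling schedule.

random.seed(3)
LO, HI = [-5.0, -5.0], [5.0, 5.0]
D = len(LO)

def objective(p):
    # assumed test function with its minimum at (1, -2); a stand-in for
    # the misfit between observed and modelled SP anomalies
    return (p[0] - 1.0) ** 2 + (p[1] + 2.0) ** 2

def vfsa_move(x, T):
    """Wide jumps while T is high, increasingly local steps as T falls."""
    new = []
    for i in range(D):
        u = random.random()
        step = math.copysign(
            T * ((1.0 + 1.0 / T) ** abs(2.0 * u - 1.0) - 1.0), u - 0.5)
        xi = x[i] + step * (HI[i] - LO[i])
        new.append(min(HI[i], max(LO[i], xi)))   # clip to the model bounds
    return new

x = [random.uniform(lo, hi) for lo, hi in zip(LO, HI)]
e_init = e = objective(x)
best, best_e = x[:], e
T0, c = 1.0, 1.0
for k in range(1, 3001):
    T = T0 * math.exp(-c * k ** (1.0 / D))       # VFSA cooling schedule
    cand = vfsa_move(x, T)
    ec = objective(cand)
    if ec < e or random.random() < math.exp(-(ec - e) / T):
        x, e = cand, ec
        if e < best_e:
            best, best_e = x[:], e
```

The heavy-tailed move generator still produces occasional long jumps at low temperature, which is what lets VFSA escape local minima even under its aggressive cooling.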
Fonville, Judith M; Bylesjö, Max; Coen, Muireann; Nicholson, Jeremy K; Holmes, Elaine; Lindon, John C; Rantalainen, Mattias
2011-10-31
Linear multivariate projection methods are frequently applied for predictive modeling of spectroscopic data in metabonomic studies. The OPLS method is a commonly used computational procedure for characterizing spectral metabonomic data, largely due to its favorable model interpretation properties providing separate descriptions of predictive variation and response-orthogonal structured noise. However, when the relationship between descriptor variables and the response is non-linear, conventional linear models will perform sub-optimally. In this study we have evaluated to what extent a non-linear model, kernel-based orthogonal projections to latent structures (K-OPLS), can provide enhanced predictive performance compared to the linear OPLS model. Just like its linear counterpart, K-OPLS provides separate model components for predictive variation and response-orthogonal structured noise. The improved model interpretation by this separate modeling is a property unique to K-OPLS in comparison to other kernel-based models. Simulated annealing (SA) was used for effective and automated optimization of the kernel-function parameter in K-OPLS (SA-K-OPLS). Our results reveal that the non-linear K-OPLS model provides improved prediction performance in three separate metabonomic data sets compared to the linear OPLS model. We also demonstrate how response-orthogonal K-OPLS components provide valuable biological interpretation of model and data. The metabonomic data sets were acquired using proton Nuclear Magnetic Resonance (NMR) spectroscopy, and include a study of the liver toxin galactosamine, a study of the nephrotoxin mercuric chloride and a study of Trypanosoma brucei brucei infection. Automated and user-friendly procedures for the kernel-optimization have been incorporated into version 1.1.1 of the freely available K-OPLS software package for both R and Matlab to enable easy application of K-OPLS for non-linear prediction modeling.
Energy Technology Data Exchange (ETDEWEB)
Li, Yulan; Hu, Shenyang Y.; Montgomery, Robert; Gao, Fei; Sun, Xin; Tonks, Michael; Biner, Bullent; Millet, Paul; Tikare, Veena; Radhakrishnan, Balasubramaniam; Andersson, David
2012-04-11
A study was conducted to evaluate the capabilities of different numerical methods used to represent microstructure behavior at the mesoscale for irradiated material using an idealized benchmark problem. The purpose of the mesoscale benchmark problem was to provide a common basis to assess several mesoscale methods with the objective of identifying the strengths and areas of improvement in the predictive modeling of microstructure evolution. In this work, mesoscale models (phase-field, Potts, and kinetic Monte Carlo) developed by PNNL, INL, SNL, and ORNL were used to calculate the evolution kinetics of intra-granular fission gas bubbles in UO2 fuel under post-irradiation thermal annealing conditions. The benchmark problem was constructed to include important microstructural evolution mechanisms on the kinetics of intra-granular fission gas bubble behavior such as the atomic diffusion of Xe atoms, U vacancies, and O vacancies, the effect of vacancy capture and emission from defects, and the elastic interaction of non-equilibrium gas bubbles. An idealized set of assumptions was imposed on the benchmark problem to simplify the mechanisms considered. The capability and numerical efficiency of different models are compared against selected experimental and simulation results. These comparisons find that the phase-field methods, by the nature of the free energy formulation, are able to represent a larger subset of the mechanisms influencing the intra-granular bubble growth and coarsening mechanisms in the idealized benchmark problem as compared to the Potts and kinetic Monte Carlo methods. It is recognized that the mesoscale benchmark problem as formulated does not specifically highlight the strengths of the discrete particle modeling used in the Potts and kinetic Monte Carlo methods. Future efforts are recommended to construct increasingly more complex mesoscale benchmark problems to further verify and validate the predictive capabilities of the mesoscale modeling
Supply-chain management based on simulated annealing algorithm
Institute of Scientific and Technical Information of China (English)
董雪
2012-01-01
With economic globalization, more and more enterprises are focusing on their core competitiveness, and logistics operations are gradually being separated from production and processing into fairly independent units. How to manage the relationship between suppliers and producers effectively (i.e., supply-chain management) has therefore become a focus of enterprise competition and profit. Previous solutions to supply-chain models have typically been based on genetic algorithms, which, although mature and effective, have poor local search ability and long computing times. This paper applies a simulated annealing algorithm to the solution of a supply-chain management model and demonstrates its effectiveness with a worked example.
Becker, Kathrin; Stauber, Martin; Schwarz, Frank; Beißbarth, Tim
2015-09-01
We propose a novel 3D-2D registration approach for micro-computed tomography (μCT) and histology (HI), constructed for dental implant biopsies, that finds the position and normal vector of the oblique slice from μCT that corresponds to HI. During image pre-processing, the implants and the bone tissue are segmented using a combination of thresholding, morphological filters and component labeling. After this, chamfer matching is employed to register the implant edges, and fine registration of the bone tissues is achieved using simulated annealing. The method was tested on n=10 biopsies, obtained at 20 weeks after non-submerged healing in the canine mandible. The specimens were scanned with μCT 100 and processed for hard tissue sectioning. After registration, we assessed the agreement of bone-to-implant contact (BIC) using automated and manual measurements. Statistical analysis was conducted to test the agreement of the BIC measurements in the registered samples. Registration was successful for all specimens and agreement of the respective binary images was high (median: 0.90, 1.-3. Qu.: 0.89-0.91). Both automated (median: 0.82, 1.-3. Qu.: 0.75-0.85) and manual (median: 0.61, 1.-3. Qu.: 0.52-0.67) BIC measures from μCT were significantly positively correlated with HI (median: 0.65, 1.-3. Qu.: 0.59-0.72) (manual: R(2)=0.87, automated: R(2)=0.75, p<0.001). The results show that this method yields promising results and that μCT may become a valid alternative to assess osseointegration in three dimensions.
Modernizing quantum annealing using local searches
Chancellor, Nicholas
2017-02-01
I describe how real quantum annealers may be used to perform local (in state space) searches around specified states, rather than the global searches traditionally implemented in the quantum annealing algorithm (QAA). Such protocols will have numerous advantages over simple quantum annealing. By using such searches the effect of problem mis-specification can be reduced, as only energy differences between the searched states will be relevant. The QAA is an analogue of simulated annealing, a classical numerical technique which has now been superseded. Hence, I explore two strategies to use an annealer in a way which takes advantage of modern classical optimization algorithms. Specifically, I show how sequential calls to quantum annealers can be used to construct analogues of population annealing and parallel tempering which use quantum searches as subroutines. The techniques given here can be applied not only to optimization, but also to sampling. I examine the feasibility of these protocols on real devices and note that implementing such protocols should require minimal if any change to the current design of the flux qubit-based annealers by D-Wave Systems Inc. I further provide proof-of-principle numerical experiments based on quantum Monte Carlo that demonstrate simple examples of the discussed techniques.
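The classical algorithms the paper proposes to rebuild around quantum searches can themselves be sketched compactly. Below is a proof-of-concept classical parallel tempering run, in which several Metropolis replicas at different temperatures explore a rugged energy landscape and periodically attempt replica exchanges; the landscape and temperature ladder are illustrative choices, not anything from the paper:

```python
import math
import random

# Classical parallel tempering: Metropolis moves within each replica,
# plus swap attempts between adjacent temperatures with acceptance
# min(1, exp[(1/T_i - 1/T_j) * (E_i - E_j)]).

random.seed(11)

def energy(x):
    # multimodal toy landscape; global minimum near x ~ 2.2
    return 0.1 * (x - 2.0) ** 2 + math.sin(5.0 * x)

temps = [0.05, 0.2, 0.8, 3.0]               # cold -> hot temperature ladder
xs = [random.uniform(-6.0, 6.0) for _ in temps]
best_e = init_best = min(energy(x) for x in xs)

for sweep in range(4000):
    # one Metropolis move per replica
    for i, T in enumerate(temps):
        cand = xs[i] + random.gauss(0.0, 0.5)
        dE = energy(cand) - energy(xs[i])
        if dE < 0 or random.random() < math.exp(-dE / T):
            xs[i] = cand
    # attempt a swap between one random adjacent pair of replicas
    i = random.randrange(len(temps) - 1)
    delta = (1.0 / temps[i] - 1.0 / temps[i + 1]) * (
        energy(xs[i]) - energy(xs[i + 1]))
    if delta >= 0 or random.random() < math.exp(delta):
        xs[i], xs[i + 1] = xs[i + 1], xs[i]
    best_e = min(best_e, min(energy(x) for x in xs))
```

In the paper's proposal, the within-replica Metropolis move would be replaced by a call to a quantum annealer performing a local search around the replica's current state.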
Simulation modelling for resource allocation and planning in the health sector.
Lehaney, B; Hlupic, V
1995-12-01
This paper provides a review of the use of simulation for resource planning in the health sector. Case examples of simulation in health are provided, and the modelling problems are explored. The successes and failures of simulation modelling in this context are examined, and an approach for improving the processes, and outcomes, by the use of soft systems methodology, is suggested.
Perceived Speech Privacy in Computer Simulated Open-plan Offices
DEFF Research Database (Denmark)
Pop, Claudiu B.; Rindel, Jens Holger
2005-01-01
In open plan offices the lack of speech privacy between the workstations is one of the major acoustic problems. Improving the speech privacy in an open plan design is therefore the main concern for a successful open plan environment. The project described in this paper aimed to find an objective parameter that correlates well with the perceived degree of speech privacy and to derive a clear method for evaluating the acoustic conditions in open plan offices. Acoustic measurements were carried out in an open plan office, followed by data analysis at the Acoustic Department, DTU. A computer model of the actual office was developed using the ODEON room acoustic software, and this allowed a systematic investigation of the possible influence of various acoustic conditions on the speech privacy. Four different versions of acoustic treatment of the office were used and three different distances from
Energy Technology Data Exchange (ETDEWEB)
Monsalve, A.; Artigas, A.; Celentano, D.; Melendez, F.
2004-07-01
The heating and cooling curves during the batch annealing process of low-carbon steel have been modelled using the finite element technique. This has made it possible to predict the transient thermal profile for every point of the annealed coils, particularly for the hottest and coldest ones. The results have been adequately validated through experimental measurements, with good agreement found between experimental values and those predicted by the model. Moreover, an Avrami recrystallization model has been coupled to this thermal balance computation. Interrupted annealing experiments have been performed by measuring the recrystallized fraction at the extreme points of the coil for different times. These data made it possible to validate the developed recrystallization model through reasonably good numerical-experimental fits. (Author) 6 refs.
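The Avrami recrystallization kinetics referenced above follow the JMAK equation, X(t) = 1 - exp(-k t^n). A minimal sketch, with the rate constant k and exponent n chosen as illustrative values rather than the fitted coefficients from the paper:

```python
import math

# JMAK (Avrami) kinetics: recrystallized fraction as a function of
# annealing time. k and n are illustrative, not fitted values.

def recrystallized_fraction(t, k=0.05, n=2.0):
    """Fraction recrystallized after annealing time t (JMAK equation)."""
    return 1.0 - math.exp(-k * t ** n)

# the fraction grows monotonically from 0 toward 1 with annealing time
fractions = [recrystallized_fraction(t) for t in (0.0, 2.0, 5.0, 10.0)]
```

In the coupled model, t and the effective k vary point-by-point with the local thermal history supplied by the finite element temperature solution.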
Use of mental simulations to change theory of planned behaviour variables
Armitage, Christopher J.; Reidy, John G.
2008-01-01
Objectives. The predictive validity of the theory of planned behaviour is well established, but much less is known about: (a) whether there are causal relationships between key components of the model and (b) how to go about changing the theory of planned behaviour variables. This study tested the ability of outcome and process simulations to change variables specified in the theory of planned behaviour in relation to blood donation. Design. Participants (N = 146) were randomized to one of ...
Prototyping and validating requirements of radiation and nuclear emergency plan simulator
Hamid, AHA.; Rozan, MZA.; Ibrahim, R.; Deris, S.; Selamat, A.
2015-04-01
Unrealistic, impractical, inadequate and ambiguous mechanisms in an organization's radiological and nuclear emergency preparedness and response (EPR) plan cause emergency plan disorder and severe disasters; 65.6% of these situations result from poorly defined and unidentified roles and duties of the disaster coordinator. Such unexpected conditions bring huge aftermath to the first responders, operators, workers, patients and the community at large. Hence, in this report we discuss the prototyping and validation of the Malaysian radiation and nuclear emergency preparedness and response plan simulation model (EPRM). A prototyping technique was required to formalize the simulation model requirements, and prototyping as systems requirements validation was carried out to endorse the correctness of the model itself against the stakeholders' intentions in resolving this organizational incapability. We have made assumptions for the proposed emergency preparedness and response model (EPRM) in the simulation software. Those assumptions provided a twofold set of expected mechanisms: planning and handling of the respective emergency plan, as well as dealing with the hazard involved. The resulting RANEPF (Radiation and Nuclear Emergency Planning Framework) simulator demonstrates the training emergency response prerequisites rather than the intervention principles alone. The demonstrations involved the determination of the casualties' absorbed-dose range screening and the coordination of the capacity planning of the expected trauma triage. Through user-centred design and a sociotechnical approach, the RANEPF simulator was strategized and simplified, though it certainly remains complex.
Energy Technology Data Exchange (ETDEWEB)
Nakos, J.; Rosinski, S.; Acton, R.; Strait, B.; Schulze, D. [Sandia National Labs., Albuquerque, NM (United States)
1994-09-01
This paper reviews a proof-of-principle annealing process conducted on an RV section. Test conditions and set-up are described, photographs of the test setup are presented, and the results of various temperature measurements are reported.
Kniepert, Juliane; Lange, Ilja; van der Kaap, Niels J.; Koster, L. Jan Anton; Neher, Dieter
2014-01-01
Time-delayed collection field (TDCF) and bias-amplified charge extraction (BACE) are applied to as-prepared and annealed poly(3-hexylthiophene):[6,6]-phenyl C-71 butyric acid methyl ester (P3HT:PCBM) blends coated from chloroform. Despite large differences in fill factor, short-circuit current, and
Institute of Scientific and Technical Information of China (English)
罗晨; 李渊; 刘勇; 刘晓明
2012-01-01
To address the slow convergence of the standard genetic algorithm in task allocation, this paper gives a formal specification of task allocation in a multi-agent system and proposes a simulated annealing genetic algorithm (SAGA) that integrates simulated annealing into the genetic search. The basic idea and key steps of SAGA are presented in detail, and the algorithm is validated by simulation experiments. The simulation results show that SAGA achieves better convergence speed and optimization results than the standard genetic algorithm.
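One common way to integrate simulated annealing into a genetic algorithm, sketched here on a toy task-allocation problem, is to subject each offspring to a Metropolis test against its parent with a temperature that cools every generation. This is a hedged illustration of the SAGA idea, not the paper's algorithm; the task costs and all parameters are made-up data:

```python
import math
import random

# SAGA-style hybrid: GA operators (crossover + mutation) generate
# offspring; an SA acceptance rule decides whether each offspring
# replaces its parent. Objective: assign tasks to agents minimising
# the maximum agent load (the makespan).

random.seed(2)
TASKS = [4, 8, 1, 7, 3, 6, 2, 5, 9, 4]      # processing cost of each task
AGENTS = 3

def makespan(assign):
    """Maximum total load over all agents for a task->agent assignment."""
    loads = [0] * AGENTS
    for cost, agent in zip(TASKS, assign):
        loads[agent] += cost
    return max(loads)

def crossover(a, b):
    cut = random.randrange(1, len(TASKS))
    return a[:cut] + b[cut:]                 # single-point crossover

def mutate(assign):
    child = assign[:]
    child[random.randrange(len(child))] = random.randrange(AGENTS)
    return child

pop = [[random.randrange(AGENTS) for _ in TASKS] for _ in range(20)]
T = 10.0
for gen in range(150):
    nxt = []
    for parent in pop:
        child = mutate(crossover(parent, random.choice(pop)))
        dE = makespan(child) - makespan(parent)
        # SA-style acceptance: tolerant of worse offspring early on,
        # effectively greedy once the temperature is small
        if dE < 0 or random.random() < math.exp(-dE / T):
            nxt.append(child)
        else:
            nxt.append(parent)
    pop = nxt
    T *= 0.97

best = min(pop, key=makespan)
```

The Metropolis replacement preserves population diversity in early generations (countering premature convergence, the GA weakness the abstract cites) while the cooling schedule makes the late generations behave like a greedy local search.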
Lean engineering for planning systems redesign - staff participation by simulation
van der Zee, D.J.; Pool, A.; Wijngaard, J.; Mason, S.J.; Hill, R.R.; Moench, L.; Rose, O.
2008-01-01
Lean manufacturing aims at flexible and efficient manufacturing systems by reducing waste in all forms, such as, production of defective parts, excess inventory, unnecessary processing steps, and unnecessary movements of people or materials. Recent research stresses the need to include planning syst
Strategic planning for skills and simulation labs in colleges of nursing.
Gantt, Laura T
2010-01-01
While simulation laboratories for clinical nursing education are predicted to grow, budget cuts may threaten these programs. One of the ways to develop a new lab, as well as to keep an existing one on track, is to develop and regularly update a strategic plan. The process of planning not only helps keep the lab faculty and staff apprised of the challenges to be faced, but it also helps to keep senior level management engaged by reason of the need for their input and approval of the plan. The strategic planning documents drafted by those who supervised the development of the new building and Concepts Integration Labs (CILs) helped guide and orient faculty and other personnel hired to implement the plan and fulfill the vision. As the CILs strategic plan was formalized, the draft plans, including the SWOT analysis, were reviewed to provide historical perspective, stimulate discussion, and to make sure old or potential mistakes were not repeated.
Institute of Scientific and Technical Information of China (English)
张扬; 杨松涛; 张香芝
2012-01-01
This paper investigates data fusion in wireless sensor networks (WSNs). Sensor nodes have limited computing and communication ability, and WSNs are deployed with overlapping coverage, producing large amounts of redundant data; data fusion is therefore used to eliminate redundant and invalid data and save network communication energy. Combining the global search of the genetic algorithm with the local search of simulated annealing, a simulated annealing genetic algorithm (SA-GA) data fusion method for WSNs is proposed. The algorithm quickly finds the optimal sensor node sequence for mobile-agent routing and fuses the data along it. Simulation results show that, compared with the genetic algorithm and the simulated annealing algorithm alone, SA-GA finds the globally optimal data fusion node sequence more quickly, fuses the data effectively, and achieves lower network energy consumption and network delay.
The use of discrete-event simulation modelling to improve radiation therapy planning processes.
Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven
2009-07-01
The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.
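The event-queue mechanics underlying a DES model like the one described can be sketched in a few lines. This is a toy single-planner queue, not the study's Arena model; the arrival and service rates are illustrative assumptions:

```python
import heapq
import random

# Minimal discrete-event simulation: jobs arrive at a single planner,
# wait FIFO, and we measure average time in system. Events live in a
# priority queue ordered by simulated time.

random.seed(4)
N_JOBS = 200
events = []                       # priority queue of (time, kind, job_id)
t = 0.0
for j in range(N_JOBS):
    t += random.expovariate(1.0 / 30.0)      # mean 30 min between arrivals
    heapq.heappush(events, (t, "arrive", j))

waiting = []                      # FIFO queue of waiting job ids
busy = False
arrived = {}
finished = {}
while events:
    now, kind, j = heapq.heappop(events)
    if kind == "arrive":
        arrived[j] = now
        waiting.append(j)
    else:                         # "finish": the planner frees up
        finished[j] = now
        busy = False
    if not busy and waiting:
        nxt = waiting.pop(0)      # start the next waiting job immediately
        busy = True
        heapq.heappush(
            events,
            (now + random.expovariate(1.0 / 25.0), "finish", nxt))

avg_in_system = sum(finished[j] - arrived[j] for j in finished) / N_JOBS
```

A full model of the planning process would add job classes, multiple resources (oncologists, planners, dosimetrists) and interruption delays, but the event loop structure stays the same.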
Models for Planning and Simulation in Computer Assisted Orthognatic Surgery
Chabanas, Matthieu; Marecaux, Christophe; Payan, Yohan; Boutault, Franck
2002-01-01
This paper addresses two aspects required to establish a plan in orthognathic surgery. First, a 3D cephalometric analysis, which is clinically essential for the therapeutic decision. Second, an original method to build a biomechanical model of the patient's facial soft tissue, which provides an evaluation of the aesthetic outcome of an intervention. Both points are developed within the context of a clinical application for computer-aided maxillofacial surgery.
Advanced Simulation and Computing FY17 Implementation Plan, Version 0
Energy Technology Data Exchange (ETDEWEB)
McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hendrickson, Bruce [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wade, Doug [National Nuclear Security Administration (NNSA), Washington, DC (United States). Office of Advanced Simulation and Computing and Institutional Research and Development; Hoang, Thuc [National Nuclear Security Administration (NNSA), Washington, DC (United States). Computational Systems and Software Environment
2016-08-29
The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.
Institute of Scientific and Technical Information of China (English)
朱均燕; 温永仙
2013-01-01
Building on the traditional simulated annealing algorithm, a new method for handling the boundary values when generating new solutions is proposed and applied to the two-dimensional Toy model. Structure predictions for four Fibonacci sequences show that the algorithm is feasible and effective.
Institute of Scientific and Technical Information of China (English)
覃德泽
2011-01-01
An optimization algorithm based on simulated annealing is proposed to solve the routing problem. The simulated annealing (SA) algorithm searches for the best routing strategy using the weighted cumulative expected transmission time as the cost function. Simulations based on 802.11 wireless networks compare the network throughput and packet loss ratio of the SA-based routing algorithm and the shortest-path routing algorithm. The results show that the SA-based routing algorithm outperforms shortest-path routing.
Institute of Scientific and Technical Information of China (English)
王家文; 王岩; 陈前; 李伟; 陈钰青; 靳书岩; 牛伟; 陈凤霞
2014-01-01
Based on thermal-mechanical simulation tests, a dynamic recrystallization (DRX) model of the annealed GH4169 alloy was established, and the finite element software DEFORM-3D was used to simulate the DRX volume fraction distribution of cylindrical samples under different compressive deformation conditions. Combining quantitative metallographic analysis and electron backscatter diffraction (EBSD) analysis with the finite element results, the effects of the deformation parameters on the microstructure at the centre of the cylindrical GH4169 samples were investigated. The results show that increasing the deformation temperature and lowering the strain rate both promote the deformation homogeneity of the cylindrical GH4169 samples during thermal-mechanical compression, and that lowering the strain rate accelerates the transformation of low-angle grain boundaries into high-angle grain boundaries. The nucleation mechanism of dynamic recrystallization in GH4169 is discontinuous dynamic recrystallization dominated by bulging of the original grain boundaries, and under the tested deformation conditions the evolution of twin boundaries plays an important role in the dynamic recrystallization process. The differences between the experimental and simulated results, and their causes, are also analysed.
BRUS2. An energy system simulator for long term planning
DEFF Research Database (Denmark)
Skytte, K.; Skjerk Christensen, P.
1999-01-01
BRUS2 is a technical-economic bottom-up scenario model. The objective of BRUS2 is to provide decision-makers with information on consequences of given trends of parameters of society like population growth and productivity, and of political goals, e.g., energy saving initiatives. BRUS2 simulates ...
Training Community Modeling and Simulation Business Plan: 2009 Edition
2010-04-01
HLA user community • Develop Federate compliance test tools • Revise the HLA Object Model Template (OMT) • Update the Distributed Simulation ...
Simulator Training Requirements and Effectiveness Study (STRES): Future Research Plans.
1981-01-01
simulation technology. The AFHRL/OT program, using the ASPT and SAAC devices, is already embarked on an extensive visual technology research effort, one ... facilities that would be required to conduct the research described. In some cases, specific research devices are mentioned, such as ASPT, SAAC, and the ... configuration of a particular device cannot be foreseen at this point (e.g., the ASPT might have a variety of possible specific cockpit configurations), no ...
Virtual tryout planning in automotive industry based on simulation metamodels
Harsch, D.; Heingärtner, J.; Hortig, D.; Hora, P.
2016-11-01
Deep drawn sheet metal parts are increasingly designed to the feasibility limit, so achieving robust manufacturing is often challenging. Fluctuations in process and material properties often lead to robustness problems. Therefore, numerical simulations are used to detect the critical regions. To improve agreement with the real process conditions, the material data are acquired through a variety of experiments. Furthermore, the force distribution is taken into account. The simulation metamodel contains the virtual knowledge of a particular forming process, determined from a series of finite element simulations with variable input parameters. Based on the metamodels, virtual process windows can be displayed for different configurations. This helps to improve the operating point as well as to adjust process settings in case the process becomes unstable. Furthermore, tool tryout time can be shortened by transferring the virtual knowledge contained in the metamodels to the optimisation of the drawbeads. This allows the tool manufacturer to focus on the essentials, saving time and revealing complex relationships.
Simulation-based planning of surgical interventions in pediatric cardiology
Marsden, Alison
2012-11-01
Hemodynamics plays an essential role in the progression and treatment of cardiovascular disease. This is particularly true in pediatric cardiology, due to the wide variation in anatomy observed in congenital heart disease patients. While medical imaging provides increasingly detailed anatomical information, clinicians currently have limited knowledge of important fluid mechanical parameters. Treatment decisions are therefore often made using anatomical information alone, despite the known links between fluid mechanics and disease progression. Patient-specific simulations now offer the means to provide this missing information, and, more importantly, to perform in-silico testing of new surgical designs at no risk to the patient. In this talk, we will outline the current state of the art in methods for cardiovascular blood flow simulation and virtual surgery. We will then present new methodology for coupling optimization with simulation and uncertainty quantification to customize treatments for individual patients. Finally, we will present examples in pediatric cardiology that illustrate the potential impact of these tools in the clinical setting.
Jiménez-Delgado, Juan J; Paulano-Godino, Félix; PulidoRam-Ramírez, Rubén; Jiménez-Pérez, J Roberto
2016-05-01
The development of support systems for surgery significantly increases the likelihood of obtaining satisfactory results. In the case of fracture reduction interventions, these systems enable surgery planning, training, monitoring and assessment. They improve fracture stabilization, minimize health risks and reduce surgery time. Planning a bone fracture reduction by means of computer-assisted simulation involves several semiautomatic or automatic steps. The simulation deals with the correct position of osseous fragments and fixation devices for a fracture reduction. Currently, to the best of our knowledge, there are no computer-assisted methods to plan an entire fracture reduction process. This paper presents an overall scheme of the computer-based process for planning a bone fracture reduction, as described above, and details its main steps, the most commonly proposed techniques and their main shortcomings. In addition, challenges and new trends in this research field are depicted and analyzed.
Simulation-optimization model for production planning in the blood supply chain.
Osorio, Andres F; Brailsford, Sally C; Smith, Honora K; Forero-Matiz, Sonia P; Camacho-Rodríguez, Bernardo A
2016-06-04
Production planning in the blood supply chain is a challenging task. Many complex factors such as uncertain supply and demand, blood group proportions, shelf life constraints and different collection and production methods have to be taken into account, and thus advanced methodologies are required for decision making. This paper presents an integrated simulation-optimization model to support both strategic and operational decisions in production planning. Discrete-event simulation is used to represent the flows through the supply chain, incorporating collection, production, storing and distribution. On the other hand, an integer linear optimization model running over a rolling planning horizon is used to support daily decisions, such as the required number of donors, collection methods and production planning. This approach is evaluated using real data from a blood center in Colombia. The results show that, using the proposed model, key indicators such as shortages, outdated units, donors required and cost are improved.
Goh, Yang Miang; Askar Ali, Mohamed Jawad
2016-08-01
One of the key challenges in improving construction safety and health is the management of safety behavior. From a systems point of view, workers work unsafely because of system-level issues such as poor safety culture, excessive production pressure, inadequate allocation of resources and time, and lack of training. These systemic issues should be eradicated or minimized during planning. However, there is a lack of detailed planning tools to help managers assess the impact of their upstream decisions on worker safety behavior. Even though simulation has been used in construction planning, the review conducted in this study showed that construction safety management research has not been exploiting the potential of simulation techniques. Thus, a hybrid simulation framework is proposed to facilitate the integration of safety management considerations into construction activity simulation. The hybrid framework has discrete event simulation (DES) at its core, but heterogeneous, interactive and intelligent (able to make decisions) agents replace traditional entities and resources. In addition, some of the cognitive processes and physiological aspects of agents are captured using a system dynamics (SD) approach. The combination of DES, agent-based simulation (ABS) and SD allows a more "natural" representation of the complex dynamics in construction activities. The proposed hybrid framework was demonstrated using a hypothetical case study. In addition, given the lack of application of the factorial experiment approach in safety management simulation, the case study demonstrated sensitivity analysis and a factorial experiment to guide future research.
Xia, J J; Gateno, J; Teichgraeber, J F; Yuan, P; Chen, K-C; Li, J; Zhang, X; Tang, Z; Alfi, D M
2015-12-01
The success of craniomaxillofacial (CMF) surgery depends not only on the surgical techniques, but also on an accurate surgical plan. The adoption of computer-aided surgical simulation (CASS) has created a paradigm shift in surgical planning. However, planning an orthognathic operation using CASS differs fundamentally from planning using traditional methods. With this in mind, the Surgical Planning Laboratory of Houston Methodist Research Institute has developed a CASS protocol designed specifically for orthognathic surgery. The purpose of this article is to present an algorithm using virtual tools for planning a double-jaw orthognathic operation. This paper will serve as an operation manual for surgeons wanting to incorporate CASS into their clinical practice.
Energy Technology Data Exchange (ETDEWEB)
Tahvili, Sahar [Mälardalen University (Sweden); Österberg, Jonas; Silvestrov, Sergei [Division of Applied Mathematics, Mälardalen University (Sweden); Biteus, Jonas [Scania CV (Sweden)
2014-12-10
One of the most important factors in the operations of many corporations today is maximizing profit, and one important tool to that effect is the optimization of maintenance activities. Maintenance is, at the highest level, divided into two major areas: corrective maintenance (CM) and preventive maintenance (PM). When optimizing maintenance activities by means of a maintenance plan or policy, we seek the best activities to perform at each point in time, be they PM or CM. We explore the use of stochastic simulation, genetic algorithms and other tools for solving complex maintenance planning optimization problems in terms of a suggested framework model based on discrete event simulation.
A SIMULATION-AS-A-SERVICE FRAMEWORK FACILITATING WEBGIS BASED INSTALLATION PLANNING
Directory of Open Access Journals (Sweden)
Z. Zheng
2017-09-01
Full Text Available Installation planning is constrained by both natural and social conditions, especially for spatially sparse but functionally connected facilities. Simulation is important for properly deploying facilities in space and configuring their functions so that they form a cohesive and supportive system that meets users' operational needs. Based on requirement analysis, we propose a framework that combines GIS and agent simulation to overcome the shortcomings of traditional GIS in temporal analysis and task simulation. In this framework, agent-based simulation runs as a service on the server and exposes basic simulation functions, such as scenario configuration, simulation control, and simulation data retrieval, to installation planners. At the same time, the simulation service is able to use various kinds of geoprocessing services in the agents' process logic to make sophisticated spatial inferences and analyses. This simulation-as-a-service framework has many potential benefits, such as ease of use, on-demand operation, shared understanding, and boosted performance. Finally, we present a preliminary implementation of this concept using the ArcGIS javascript api 4.0 and ArcGIS for Server, showing how trip planning and driving can be carried out by agents.
Planning acetabular fracture reduction using patient-specific multibody simulation of the hip
Oliveri, Hadrien; Boudissa, Mehdi; Tonetti, Jerome; Chabanas, Matthieu
2017-03-01
Acetabular fractures are a challenge in orthopedic surgery. Computer-aided solutions have been proposed to segment bone fragments, simulate the fracture reduction or design the osteosynthesis fixation plates. This paper addresses the simulation part, which is usually carried out by freely moving bone fragments with six degrees of freedom to reproduce the pre-fracture state. Instead, we propose a different paradigm, closer to surgeons' actual requirements: to simulate the surgical procedure itself rather than the desired result. A simple, patient-specific, biomechanical multibody model is proposed, integrating the main ligaments and muscles of the hip joint while accounting for contacts between bone fragments. The main surgical tools and actions can be simulated, such as clamps, Schanz screws or traction of the femur. Simulations are computed interactively, which enables clinicians to evaluate different strategies for optimal surgical planning. Six retrospective cases were studied, with simple and complex fracture patterns. After interactively building the models from preoperative CT, the gestures described in the surgical reports were reproduced. The results of the simulations could then be compared with postoperative CT data. A qualitative study shows that the model behavior is excellent and the simulated reductions fit the observed data. A more quantitative analysis is currently being completed. Two cases are particularly significant, as the surgical reduction actually failed in them. Simulations show that it was indeed not possible to reduce these fractures with the chosen approach. Had our simulator been used, better planning might have spared these patients a second surgery.
Xia, J; Samman, N; Yeung, R W; Shen, S G; Wang, D; Ip, H H; Tideman, H
2000-01-01
A new integrated computer system, the 3-dimensional (3D) virtual reality surgical planning and simulation workbench for orthognathic surgery (VRSP), is presented. Five major functions are implemented in this system: post-processing and reconstruction of computed tomographic (CT) data, transformation of 3D unique coordinate system geometry, generation of 3D color facial soft tissue models, virtual surgical planning and simulation, and presurgical prediction of soft tissue changes. The basic mensuration functions, such as linear and spatial measurements, are also included. The surgical planning and simulation are based on 3D CT reconstructions, whereas soft tissue prediction is based on an individualized, texture-mapped, color facial soft tissue model. The surgeon "enters" the virtual operatory with virtual reality equipment, "holds" a virtual scalpel, and "operates" on a virtual patient to accomplish actual surgical planning, simulation of the surgical procedure, and prediction of soft tissue changes before surgery. As a final result, a quantitative osteotomy-simulated bone model and predicted color facial model with photorealistic quality can be visualized from any arbitrary viewing point in a personal computer system. This system can be installed in any hospital for daily use.
Saber, Deborah A; Strout, Kelley; Caruso, Lisa Swanson; Ingwell-Spolan, Charlene; Koplovsky, Aiden
2017-10-01
Many natural and man-made disasters require the assistance from teams of health care professionals. Knowing that continuing education about disaster simulation training is essential to nursing students, nurses, and emergency first responders (e.g., emergency medical technicians, firefighters, police officers), a university in the northeastern United States planned and implemented an interprofessional mass casualty incident (MCI) disaster simulation using the Project Management Body of Knowledge (PMBOK) management framework. The school of nursing and University Volunteer Ambulance Corps (UVAC) worked together to simulate a bus crash with disaster victim actors to provide continued education for community first responders and train nursing students on the MCI process. This article explains the simulation activity, planning process, and achieved outcomes. J Contin Educ Nurs. 2017;48(10):447-453. Copyright 2017, SLACK Incorporated.
Sündermann, Simon H; Gessat, Michael; Maier, Willibald; Kempfert, Jörg; Frauenfelder, Thomas; Nguyen, Thi D L; Maisano, Francesco; Falk, Volkmar
2015-01-01
We tested the hypothesis that simulated three-dimensional prosthesis overlay procedure planning may support valve selection in transcatheter aortic valve implantation (TAVI) procedures. Preoperative multidimensional computed tomography (MDCT) data sets from 81 consecutive TAVI patients were included in the study. A planning tool was developed, which semiautomatically creates a three-dimensional model of the aortic root from these data. Three-dimensional templates of the commonly used TAVI implants are spatially registered with the patient data and presented as graphic overlay. Fourteen physicians used the tool to perform retrospective planning of TAVI procedures. Results of prosthesis sizing were compared with the prosthesis size used in the actually performed procedure, and the patients were accordingly divided into three groups: those with equal size (concordance with retrospective planning), oversizing (retrospective planning of a smaller prosthesis), and undersizing (retrospective planning of a larger prosthesis). In the oversizing group, 85% of the patients had new pacemaker implantation. In the undersizing group, in 66%, at least mild paravalvular leakage was observed (greater than grade 1 in one third of the cases). In 46% of the patients in the equal-size group, neither of these complications was observed. Three-dimensional prosthesis overlay in MDCT-derived patient data for patient-specific planning of TAVI procedures is feasible. It may improve valve selection compared with two-dimensional MDCT planning and thus yield better outcomes.
Experience Report: Constraint-Based Modelling and Simulation of Railway Emergency Response Plans
DEFF Research Database (Denmark)
Debois, Søren; Hildebrandt, Thomas; Sandberg, Lene
2016-01-01
We report on experiences from a case study applying a constraint-based process-modelling and -simulation tool, dcrgraphs.net, to the modelling and rehearsal of railway emergency response plans with domain experts. The case study confirmed the approach as a viable means for domain experts to analyse ... and security processes in the Danish public transport sector and their dependency on ICT.
Agent-based analysis and simulation of meta-reasoning processes in strategic naval planning
Hoogendoorn, M.; Jonker, C.M.; Maanen, P.P. van; Treur, J.
2009-01-01
This paper presents analysis and simulation of meta-reasoning processes based on an agent-based meta-level architecture for strategic reasoning in naval planning. The architecture was designed as a generic agent model and instantiated with decision knowledge acquired from naval domain experts and was ...
Realistic tool-tissue interaction models for surgical simulation and planning
Misra, Sarthak
2009-01-01
Surgical simulators present a safe and potentially effective method for surgical training, and can also be used in pre- and intra-operative surgical planning. Realistic modeling of medical interventions involving tool-tissue interactions has been considered to be a key requirement in the development
Directory of Open Access Journals (Sweden)
Vahid Moslemi
2011-03-01
Full Text Available Introduction: In brachytherapy, radioactive sources are placed close to the tumor; therefore, small changes in their positions can cause large changes in the dose distribution. This emphasizes the need for computerized treatment planning. The usual method for treatment planning of cervix brachytherapy uses conventional radiographs in the Manchester system. Nowadays, because of their advantages in locating the source positions and the surrounding tissues, CT and MRI images are replacing conventional radiographs. In this study, we used CT images in Monte Carlo-based dose calculation for brachytherapy treatment planning, using an interface software to create the geometry file required by the MCNP code. The aim of using the interface software is to facilitate and speed up the geometry set-up for simulations based on the patient’s anatomy. This paper examines the feasibility of this method in cervix brachytherapy and assesses its accuracy and speed. Material and Methods: For dosimetric measurements regarding the treatment plan, a pelvic phantom was made from polyethylene in which the treatment applicators could be placed. For simulations using CT images, the phantom was scanned at 120 kVp. Using an interface software written in MATLAB, the CT images were converted into an MCNP input file and the simulation was then performed. Results: Using the interface software, preparation time for the simulations of the applicator and surrounding structures was approximately 3 minutes; the corresponding time needed with conventional MCNP geometry entry was approximately 1 hour. The discrepancy between the simulated and measured doses to point A was 1.7% of the prescribed dose. The corresponding dose differences between the two methods in rectum and bladder were 3.0% and 3.7% of the prescribed dose, respectively. Comparing the results of simulation using the interface software with those of simulation using the standard MCNP geometry entry showed a less than 1
Evolutionary View Planning for Optimized UAV Terrain Modeling in a Simulated Environment
Directory of Open Access Journals (Sweden)
Ronald A. Martin
2015-12-01
Full Text Available This work demonstrates the use of genetic algorithms in optimized view planning for 3D reconstruction applications using small unmanned aerial vehicles (UAVs. The quality of UAV site models is currently highly dependent on manual pilot operations or grid-based automation solutions. When applied to 3D structures, these approaches can result in gaps in the total coverage or inconsistency in final model resolution. Genetic algorithms can effectively explore the search space to locate image positions that produce high quality models in terms of coverage and accuracy. A fitness function is defined, and optimization parameters are selected through semi-exhaustive search. A novel simulation environment for evaluating view plans is demonstrated using terrain generation software. The view planning algorithm is tested in two separate simulation cases: a water drainage structure and a reservoir levee, as representative samples of infrastructure monitoring. The optimized flight plan is compared against three alternate flight plans in each case. The optimized view plan is found to yield terrain models with up to 43% greater accuracy than a standard grid flight pattern, while maintaining comparable coverage and completeness.
Directory of Open Access Journals (Sweden)
Fatimazahra BARRAMOU
2012-12-01
Full Text Available In this research, a new agent-based approach for simulating complex systems with spatial dynamics is presented. We propose an architecture based on coupling two systems: multi-agent systems and geographic information systems. We also propose a generic model of agent-oriented simulation that we apply to the field of land use planning. Indeed, simulating the evolution of the urban system is key to helping decision makers anticipate the needs of the city in terms of installing new equipment and opening new urbanization areas to accommodate the new population.
Institute of Scientific and Technical Information of China (English)
吴坤鸿; 詹世贤
2016-01-01
According to the rules of fire strike, a multi-objective target assignment model is presented, and a distributed genetic simulated annealing algorithm (DGSA) is applied to solve it. DGSA improves on the classic genetic algorithm (GA) as follows: the single-objective serial search is changed to a multi-objective distributed search, which is better suited to multiobjective optimization; individuals are selected by combining best-individual preservation with roulette-wheel selection; a simulated annealing step is embedded in the crossover operator; and a self-adaptive mutation probability is used, which together maintain a good balance between breadth and depth of search. Finally, the efficiency and reliability of DGSA are verified by simulation experiments.
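The selection scheme described above, best-individual preservation combined with roulette-wheel selection, can be sketched as follows. The bitstring population and fitness function are illustrative assumptions, and the SA-tempered crossover and adaptive mutation of the full DGSA are omitted:

```python
import random

def select(population, fitness, rng):
    """Elitism plus roulette wheel: keep the single best individual,
    then fill the remaining slots by fitness-proportional sampling."""
    best = max(population, key=fitness)
    total = sum(fitness(ind) for ind in population)
    chosen = [best]                        # best-individual preservation
    while len(chosen) < len(population):
        pick = rng.uniform(0, total)       # spin the roulette wheel
        acc = 0.0
        for ind in population:
            acc += fitness(ind)
            if acc >= pick:
                chosen.append(ind)
                break
    return chosen

# Toy usage: maximize the number of 1-bits (fitness offset keeps it positive).
pop = ["1111", "1010", "0000", "0110"]
fit = lambda s: 1 + s.count("1")
new_pop = select(pop, fit, random.Random(1))
```

The elitist slot guarantees the best solution found so far survives, while the roulette wheel keeps selection pressure proportional to fitness for the rest of the population.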
Cognitive engine based on binary ant colony simulated annealing algorithm
Institute of Scientific and Technical Information of China (English)
夏龄; 冯文江
2012-01-01
In a cognitive radio system, the cognitive engine dynamically configures the radio's working parameters according to changes in the communication environment and user requirements. Addressing the intelligent optimization problem in the cognitive engine, a binary ant colony simulated annealing (BAC&SA) algorithm is proposed for cognitive radio parameter optimization. The algorithm introduces simulated annealing (SA) into binary ant colony optimization (BACO), combining the rapid optimization ability of BACO with the probabilistic jump property of SA, which effectively prevents BACO from getting trapped in local optima. Simulation results show that, compared with the genetic algorithm (GA) and BACO, a cognitive engine based on BAC&SA has a clear advantage in global search ability and average fitness.
Institute of Scientific and Technical Information of China (English)
王宏健; 王晶; 曲丽萍; 刘振业
2013-01-01
A FastSLAM algorithm based on particle-weight variance reduction is presented to counteract the loss of estimation accuracy in AUV (autonomous underwater vehicle) localization caused by particle degeneracy and by the sample impoverishment that resampling induces in standard FastSLAM. The variance of the particle weights is reduced by an adaptive exponential fading factor derived from the cooling function of simulated annealing, which increases the effective particle number; this variance-reduction step replaces the resampling step of standard FastSLAM. The kinematic model of the AUV, the feature model and the sensor measurement models are established, and features are extracted with the Hough transform. Simultaneous localization and mapping experiments based on sea-trial data indicate that the method maintains particle diversity, weakens degeneracy, and improves the accuracy and stability of the AUV navigation and localization system.
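The key device here, an exponential fading factor borrowed from the SA cooling schedule and used to temper particle weights, can be sketched as below. The decay rate, the floor value, and the function names are illustrative assumptions:

```python
import math

def fading_factor(step, lam=0.05, floor=0.2):
    """Exponential 'cooling'-style fading factor, decaying from 1 toward floor."""
    return floor + (1.0 - floor) * math.exp(-lam * step)

def flatten_weights(weights, step):
    """Raise each weight to a power below 1 (the fading factor), then
    renormalise. Tempering flattens the weight distribution, so more
    particles keep a non-negligible weight."""
    f = fading_factor(step)
    tempered = [w ** f for w in weights]
    s = sum(tempered)
    return [w / s for w in tempered]
```

Because tempering shrinks the variance of the weights, the effective sample size 1/Σwᵢ² grows, which is what allows this step to stand in for resampling without discarding particles.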
Modified simulated annealing algorithm for the flexible job-shop scheduling problem
Institute of Scientific and Technical Information of China (English)
李俊; 刘志雄; 张煜; 贺晶晶
2015-01-01
A modified simulated annealing algorithm is put forward for the flexible job-shop scheduling problem. It uses two individual encoding methods taken from particle swarm optimization, based respectively on particle-position rounding and on roulette probability assignment, and employs three different local search methods to constitute the neighborhood structure. Computational results show that the modified simulated annealing algorithm solves the flexible job-shop scheduling problem more effectively than particle swarm optimization, hybrid particle swarm optimization, and standard simulated annealing. The roulette-probability-assignment encoding renders the algorithm more effective than position rounding, and the swap-based local search improves the solving performance more than the other two local search methods.
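The core loop, simulated annealing whose neighbourhood is generated by a local search move (here a swap move, the variant the abstract reports as best), can be sketched on a toy sequencing objective. The cooling parameters and the weighted-completion-time stand-in for the job-shop objective are assumptions, not the paper's model:

```python
import math
import random

def swap_neighbor(seq, rng):
    """Local search move used as the SA neighbourhood: swap two positions."""
    i, j = rng.sample(range(len(seq)), 2)
    s = list(seq)
    s[i], s[j] = s[j], s[i]
    return tuple(s)

def anneal(init, cost, t0=5.0, alpha=0.97, steps=2000, seed=0):
    rng = random.Random(seed)
    cur, best, t = init, init, t0
    for _ in range(steps):
        cand = swap_neighbor(cur, rng)
        d = cost(cand) - cost(cur)
        if d < 0 or rng.random() < math.exp(-d / max(t, 1e-12)):
            cur = cand                     # Metropolis acceptance
        if cost(cur) < cost(best):
            best = cur
        t *= alpha                         # geometric cooling
    return best

# Toy stand-in for a scheduling objective: total completion time of jobs
# processed in sequence order (illustrative, not the flexible job-shop model).
times = {"a": 4, "b": 2, "c": 1, "d": 3}
def cost(seq):
    done, total = 0, 0
    for job in seq:
        done += times[job]
        total += done
    return total

best = anneal(("a", "b", "c", "d"), cost)
```

On this toy instance the annealer recovers the shortest-processing-time order, whose total completion time of 20 is optimal.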
Preoperative surgical planning and simulation of complex cranial base tumors in virtual reality
Institute of Scientific and Technical Information of China (English)
YI Zhi-qiang; LI Liang; MO Da-peng; ZHANG Jia-yong; ZHANG Yang; BAO Sheng-de
2008-01-01
The extremely complex anatomic relationships among bone, tumor, blood vessels and cranial nerves remain a big challenge in cranial base tumor surgery. Therefore, a good understanding of the patient-specific anatomy and preoperative planning are helpful and crucial for neurosurgeons. Three-dimensional (3-D) visualization of various imaging techniques has been widely explored to enhance the comprehension of volumetric data for surgical planning. We used the Dextroscope Virtual Reality (VR) System (Singapore, Volume Interaction Pte Ltd; software: RadioDexter 1.0) to optimize preoperative planning of complex cranial base tumors. This system uses patient-specific, coregistered, fused radiology data sets that may be viewed stereoscopically and manipulated in a virtual reality environment. This article describes our experience with the Dextroscope VR system in preoperative surgical planning and simulation for 5 patients with complex cranial base tumors and evaluates the clinical usefulness of the system.
An Overview of Approaches to Modernize Quantum Annealing Using Local Searches
Directory of Open Access Journals (Sweden)
Nicholas Chancellor
2016-06-01
Full Text Available I describe how real quantum annealers may be used to perform local (in state space) searches around specified states, rather than the global searches traditionally implemented in the quantum annealing algorithm. The quantum annealing algorithm is an analogue of simulated annealing, a classical numerical technique which is now obsolete. Hence, I explore strategies to use an annealer in a way which takes advantage of modern classical optimization algorithms and which, additionally, should be less sensitive to problem mis-specification than the traditional quantum annealing algorithm.
Directory of Open Access Journals (Sweden)
Micaeil Mollazadeh
2010-06-01
Full Text Available Introduction: GafChromic EBT films are one of the self-developing and modern films commercially available for dosimetric verification of treatment planning systems (TPSs. Their high spatial resolution, low energy dependence and near-tissue equivalence make them suitable for verification of dose distributions in radiation therapy. This study was designed to evaluate the dosimetric parameters of the RtDosePlan TPS such as PDD curves, lateral beam profiles, and isodose curves measured in a water phantom using EBT Radiochromic film and EGSnrc Monte Carlo (MC simulation. Methods and Materials: A Microtek color scanner was used as the film scanning system, where the response in the red color channel was extracted and used for the analyses. A calibration curve was measured using pieces of irradiated films to specific doses. The film was mounted inside the phantom parallel to the beam's central axis and was irradiated in a standard setup (SSD = 80 cm, FS = 10×10 cm2 with a 60Co machine. The BEAMnrc and the DOSXYZnrc codes were used to simulate the Co-60 machine and extracting the voxel-based phantom. The phantom's acquired CT data were transferred to the TPS using DICOM files. Results: Distance-To-Agreement (DTA and Dose Difference (DD among the TPS predictions, measurements and MC calculations were all within the acceptance criteria (DD=3%, DTA=3 mm. Conclusion: This study shows that EBT film is an appropriate tool for verification of 2D dose distribution predicted by a TPS system. In addition, it is concluded that MC simulation with the BEAMnrc code has a suitable potential for verifying dose distributions.
Energy Technology Data Exchange (ETDEWEB)
Sanvicente Sanchez, H.; Solis, J. F.
2003-07-01
To set pipe diameters in the least-cost design of a water distribution network is a strongly nonlinear constrained problem with multiple local optima, and its solution space has many infeasible regions. The heuristic optimization algorithm called Simulated Annealing (SA) is a global method that has been used to perform stochastic searches in the problem's solution space, bettering the performance of other methods. This paper proposes a problem formulation with penalty functions which, among other advantages, allows the stochastic walk performed by the SA algorithm to be less sinuous, crossing infeasible regions. This approach improves the algorithm's efficiency, for the same error level, with respect to a classical constrained formulation. (Author) 17 refs.
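The penalised formulation described above can be sketched as follows. This is a minimal illustration, not the authors' exact model: the toy objective, the constraint, the penalty weight `mu`, and the geometric cooling schedule are all placeholder assumptions.

```python
import math
import random

def anneal_with_penalty(neighbor, cost, violation, x0, mu=1000.0, t0=1.0,
                        t_min=1e-3, alpha=0.95, steps_per_t=100, seed=0):
    """Simulated annealing on cost(x) + mu * violation(x): the penalty term
    lets the random walk cross infeasible regions instead of rejecting them."""
    rng = random.Random(seed)
    x = x0
    fx = cost(x) + mu * violation(x)
    best, best_f = x, fx
    t = t0
    while t > t_min:
        for _ in range(steps_per_t):
            y = neighbor(x, rng)
            fy = cost(y) + mu * violation(y)
            # Metropolis rule: accept improvements always, and worse moves
            # with probability exp(-delta / t).
            if fy <= fx or rng.random() < math.exp(-(fy - fx) / t):
                x, fx = y, fy
                if fx < best_f:
                    best, best_f = x, fx
        t *= alpha  # geometric cooling
    return best, best_f

# Toy problem: minimise (x - 3)^2 subject to x <= 2, constraint as a penalty.
cost = lambda x: (x - 3.0) ** 2
violation = lambda x: max(0.0, x - 2.0)            # amount of infeasibility
neighbor = lambda x, rng: x + rng.uniform(-0.5, 0.5)
best, best_f = anneal_with_penalty(neighbor, cost, violation, x0=0.0)
```

The constrained optimum is x = 2 with cost 1; because infeasibility is only penalised, not forbidden, the walk can briefly visit the region x > 2 on its way there, which is exactly the less sinuous behaviour the formulation is after.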
Energy Technology Data Exchange (ETDEWEB)
Wimmer, Thomas, E-mail: thomas.wimmer@medunigraz.at; Srimathveeravalli, Govindarajan; Gutta, Narendra [Memorial Sloan-Kettering Cancer Center, Interventional Radiology Service, Department of Radiology (United States); Ezell, Paula C. [The Rockefeller University, Research Animal Resource Center, Memorial Sloan-Kettering Cancer Center, Weill Cornell Medical College (United States); Monette, Sebastien [The Rockefeller University, Laboratory of Comparative Pathology, Memorial Sloan-Kettering Cancer Center, Weill Cornell Medical College (United States); Maybody, Majid; Erinjery, Joseph P.; Durack, Jeremy C. [Memorial Sloan-Kettering Cancer Center, Interventional Radiology Service, Department of Radiology (United States); Coleman, Jonathan A. [Memorial Sloan-Kettering Cancer Center, Urology Service, Department of Surgery (United States); Solomon, Stephen B. [Memorial Sloan-Kettering Cancer Center, Interventional Radiology Service, Department of Radiology (United States)
2015-02-15
Purpose: Numerical simulations are used for treatment planning in clinical applications of irreversible electroporation (IRE) to determine ablation size and shape. To assess the reliability of simulations for treatment planning, we compared simulation results with empiric outcomes of renal IRE using computed tomography (CT) and histology in an animal model. Methods: The ablation size and shape for six different IRE parameter sets (70–90 pulses, 2,000–2,700 V, 70–100 µs) for monopolar and bipolar electrodes was simulated using a numerical model. Employing these treatment parameters, 35 CT-guided IRE ablations were created in both kidneys of six pigs and followed up with CT immediately and after 24 h. Histopathology was analyzed from postablation day 1. Results: Ablation zones on CT measured 81 ± 18 % (day 0, p ≤ 0.05) and 115 ± 18 % (day 1, p ≤ 0.09) of the simulated size for monopolar electrodes, and 190 ± 33 % (day 0, p ≤ 0.001) and 234 ± 12 % (day 1, p ≤ 0.0001) for bipolar electrodes. Histopathology indicated smaller ablation zones than simulated (71 ± 41 %, p ≤ 0.047) and measured on CT (47 ± 16 %, p ≤ 0.005) with complete ablation of kidney parenchyma within the central zone and incomplete ablation in the periphery. Conclusion: Both numerical simulations for planning renal IRE and CT measurements may overestimate the size of ablation compared to histology, and ablation effects may be incomplete in the periphery.
Improved hybrid particle swarm algorithm based on adaptive simulated annealing
Institute of Scientific and Technical Information of China (English)
杨文光; 严哲; 隋丽丽
2015-01-01
In order to enhance the ability to solve the traveling salesman problem (TSP), the hybrid particle swarm optimization (PSO) algorithm with simulated annealing is improved by introducing an adaptive optimization strategy. The hybrid PSO algorithm with crossover and mutation easily falls into local optima, whereas the adaptive simulated annealing algorithm can escape local optima and search globally, so combining the two balances global and local search. The added adaptive strategy provides a criterion for judging whether a particle has fallen into a local extremum, and adaptive re-optimization is then performed with a certain probability, strengthening the global search capability. Comparison with experimental results of the hybrid particle swarm algorithm shows the effectiveness of the proposed algorithm.
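The simulated annealing component of such a hybrid can be illustrated on a tiny TSP instance. This is a sketch only: the adaptive strategy and the particle swarm machinery of the paper are omitted, and the hexagon instance, cooling schedule, and 2-opt neighbourhood are illustrative choices.

```python
import math
import random

def tour_length(tour, dist):
    n = len(tour)
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

def sa_tsp(dist, t0=10.0, t_min=1e-3, alpha=0.99, moves_per_t=10, seed=0):
    """Simulated annealing for the TSP using 2-opt (segment reversal) moves."""
    rng = random.Random(seed)
    n = len(dist)
    tour = list(range(n))
    rng.shuffle(tour)
    cur = tour_length(tour, dist)
    best, best_len = tour[:], cur
    t = t0
    while t > t_min:
        for _ in range(moves_per_t):
            i, j = sorted(rng.sample(range(n), 2))
            cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]  # reverse a segment
            cand_len = tour_length(cand, dist)
            # Metropolis acceptance: downhill always, uphill with prob exp(-delta/t).
            if cand_len <= cur or rng.random() < math.exp((cur - cand_len) / t):
                tour, cur = cand, cand_len
                if cur < best_len:
                    best, best_len = tour[:], cur
        t *= alpha
    return best, best_len

# Six cities on a regular hexagon; the optimal tour is the perimeter (length 6).
pts = [(math.cos(2 * math.pi * k / 6), math.sin(2 * math.pi * k / 6)) for k in range(6)]
dist = [[math.hypot(ax - bx, ay - by) for bx, by in pts] for ax, ay in pts]
best, best_len = sa_tsp(dist)
```

The uphill acceptance at non-zero temperature is what lets the annealer leave a local optimum; the paper's adaptive strategy decides, per particle, when such an escape attempt is worth triggering.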
Exploring first-order phase transitions with population annealing
Barash, Lev Yu.; Weigel, Martin; Shchur, Lev N.; Janke, Wolfhard
2017-03-01
Population annealing is a hybrid of sequential and Markov chain Monte Carlo methods geared towards the efficient parallel simulation of systems with complex free-energy landscapes. Systems with first-order phase transitions are among the problems in computational physics that are difficult to tackle with standard methods such as local-update simulations in the canonical ensemble, for example with the Metropolis algorithm. It is hence interesting to see whether such transitions can be more easily studied using population annealing. We report here our preliminary observations from population annealing runs for the two-dimensional Potts model with q > 4, where it undergoes a first-order transition.
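A minimal population annealing loop, on a toy continuous double-well energy rather than the Potts model studied here, might look like this. The population size, the linear inverse-temperature schedule, and the proposal width are arbitrary illustrative choices.

```python
import math
import random

def population_annealing(energy, propose, init, n_pop=200, n_temps=20,
                         d_beta=0.1, sweeps=5, seed=1):
    """Population annealing: carry a population of replicas from high to low
    temperature, resampling replicas by Boltzmann reweighting factors at each
    temperature step and equilibrating with Metropolis updates."""
    rng = random.Random(seed)
    pop = [init(rng) for _ in range(n_pop)]
    for k in range(n_temps):
        beta = (k + 1) * d_beta                   # inverse temperature schedule
        # Resample proportionally to the reweighting factor exp(-d_beta * E).
        w = [math.exp(-d_beta * energy(x)) for x in pop]
        pop = rng.choices(pop, weights=w, k=n_pop)
        # Metropolis equilibration at the new temperature.
        new_pop = []
        for x in pop:
            for _ in range(sweeps):
                y = propose(x, rng)
                de = energy(y) - energy(x)
                if de <= 0 or rng.random() < math.exp(-beta * de):
                    x = y
            new_pop.append(x)
        pop = new_pop
    return pop

# Toy double-well energy with minima at x = -1 and x = +1.
energy = lambda x: (x * x - 1.0) ** 2
propose = lambda x, rng: x + rng.uniform(-0.3, 0.3)
init = lambda rng: rng.uniform(-2.0, 2.0)
pop = population_annealing(energy, propose, init)
mean_e = sum(energy(x) for x in pop) / len(pop)
```

The resampling step is the sequential Monte Carlo half of the hybrid: high-energy replicas are pruned and low-energy ones duplicated, which is what helps the method cross the free-energy barriers that trap single-chain local-update simulations at a first-order transition.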
Renal Tumor Cryoablation Planning. The Efficiency of Simulation on Reconstructed 3D CT Scan
Directory of Open Access Journals (Sweden)
Ciprian Valerian LUCAN
2010-12-01
Full Text Available Introduction & Objective: The risks of nephron-sparing surgical techniques are related to tumor relationships with adjacent anatomic structures. The complexity of renal anatomy drives the interest in developing tools for 3D reconstruction and surgery simulation. The aim of this article was to assess simulation on reconstructed 3D CT scans used for planning cryoablation. Material & Method: A prospective randomized study was performed between Jan. 2007 and July 2009 on 27 patients who underwent retroperitoneoscopic cryoablation of T1a renal tumors (RC). All patients were assessed preoperatively by CT scan, also used for 3D volume rendering. In Gr.A, patients underwent surgery planning by simulation on the 3D CT scan. In Gr.B, patients underwent standard RC. The two groups were compared in terms of surgical time, bleeding, postoperative drainage, analgesics requirement, hospital stay, and time to socio-professional reintegration. Results: Fourteen patients underwent preoperative cryoablation planning (Gr.A) and 13 patients underwent standard RC (Gr.B). All parameters analyzed were shorter in Gr.A. On multivariate logistic regression, only the shortening of surgical time (138.79±5.51 min in Gr.A vs. 140.92±5.54 min in Gr.B) and bleeding (164.29±60.22 mL in Gr.A vs. 215.38±100.80 mL in Gr.B) achieved statistical significance (p<0.05). The number of cryoneedles assessed by simulation had 92.52% accuracy when compared with those effectively used. Conclusions: Simulation of cryoablation using a reconstructed 3D CT scan improves surgical results. The application used for simulation was able to accurately assess the number of cryoneedles required for tumor ablation, as well as their direction and approach.
Simulation modeling perspectives of the Bangladesh family planning and female education system.
Teel, J H; Ragade, R K
1984-07-01
A systems dynamics simulation study of the interaction of various social subsystems in the People's Republic of Bangladesh is chosen to address integrated planning concerns. It is concluded that one should not underestimate the potential of noneconomic societal forces: They can have a positive impact on slowing population growth and improving the quality of life. Methodologies included: fuzzy profiles for choosing primary variables; interpretive impact matrices to generate the systems dynamics equations; interactive computer capabilities for purposes other than simulation runs; modeling log file to note modeling assumptions, changes, and redefinitions; and microcomputer portability.
Benefits of a clinical planning and coordination module: a simulation study
DEFF Research Database (Denmark)
Jensen, Sanne; Vingtoft, Søren; Nøhr, Christian
2013-01-01
Digital Clinical Practice Guidelines are commonly used in Danish health care. Planning and decision support are particularly important to patients with chronic diseases, who often are in contact with General Practitioners, Community Nurses and hospitals. In the Capital Region of Denmark, the potential benefits of a planning and coordination module have been assessed in a full-scale simulation test including 18 health care professionals. The results showed that health care professionals can benefit from such a module. Furthermore, unexpected new possible benefits concerning communication and quality management emerged during the test, and potential new groups of users were identified.
Selection of a Planning Horizon for a Hybrid Microgrid Using Simulated Wind Forecasts
2014-12-01
Emily M. Craparo; Dashi I. Singham (Naval Postgraduate School, 1411 Cunningham Road, Monterey, CA, 93943, USA); Mumtaz Karatas (Turkish Naval Academy, Tuzla, Istanbul, 34942, Turkey). ABSTRACT Hybrid microgrids containing renewable energy ... produced is at least as great as the total load. Energy is produced by generators, wind turbines, purchases from the commercial grid, and discharge of the ...
Kotevski Živko; Jovanoski Bojan; Minovski Robert
2015-01-01
Production planning and control (PPC) systems are the base of all production facilities. In today's environment, having a good PPC system generates many benefits for a company, but having an excellent PPC system provides great competitive advantage and serious cost reduction in many fields. In order to reach the point of having excellent PPC, companies turn more and more to the newest software tools, simulation being one example. Considering today's advanced computer technolog...
Cassidy, Jeffrey; Betz, Vaughn; Lilge, Lothar
2015-03-01
Photodynamic therapy (PDT) delivers a localized cytotoxic dose that is a function of tissue oxygen availability, photosensitive drug concentration, and light fluence. Providing safe and effective PDT requires an understanding of all three elements and the physiological response to the radicals generated. Interstitial PDT (IPDT) for solid tumours poses particular challenges due to complex organ geometries and the associated limitations for diffusion-theory-based fluence rate prediction, in addition to restricted access for light delivery and dose monitoring. As a first step towards enabling a complete prospective IPDT treatment-planning platform, we demonstrate use of our previously developed FullMonte tetrahedral Monte Carlo simulation engine for modeling of the interstitial fluence field due to intravesicular insertion of brief light sources. The goal is to enable a complete treatment planning and monitoring work flow analogous to that used in ionizing radiation therapy, including plan evaluation through dose-volume histograms and algorithmic treatment plan optimization. FullMonte is, to our knowledge, the fastest open-source tetrahedral MC light propagation software. Using custom hardware acceleration, we achieve 4x faster computing with 67x better power efficiency for limited-size meshes compared to the software implementation. Ongoing work will improve the performance advantage to 16x with unlimited mesh size, enabling algorithmic plan optimization in reasonable time. Using FullMonte, we demonstrate significant new plan-evaluation capabilities including fluence field visualization, generation of organ dose-volume histograms, and rendering of isofluence surfaces for a representative bladder cancer mesh from a real patient. We also discuss the advantages of MC simulations for dose-volume histogram generation and the need for online personalized fluence-rate monitoring.
Quantum annealing with all-to-all connected nonlinear oscillators
DEFF Research Database (Denmark)
Puri, Shruti; Andersen, Christian Kraglund; Grimsmo, Arne L.
2017-01-01
Quantum annealing aims at solving combinatorial optimization problems mapped to Ising interactions between quantum spins. Here, with the objective of developing a noise-resilient annealer, we propose a paradigm for quantum annealing with a scalable network of two-photon-driven Kerr-nonlinear resonators. Each resonator encodes an Ising spin in a robust degenerate subspace formed by two coherent states of opposite phases. A fully connected optimization problem is mapped to local fields driving the resonators, which are connected with only local four-body interactions. We describe an adiabatic annealing protocol in this system and analyse its performance in the presence of photon loss. Numerical simulations indicate substantial resilience to this noise channel, leading to a high success probability for quantum annealing. Finally, we propose a realistic circuit QED implementation of this promising...
Complex Urban Simulations and Sustainable Urban Planning with Spatial and Social Implications
Becker, T.; Boschert, S.; Hempel, L.; Höffken, S.; Obst, B.
2013-09-01
Cities can be seen as complex systems of heterogeneous processes with a high variety of different influences (e.g. social, infrastructural, economic, and political impacts). This especially applies to tasks concerning urban development of existing assets. The optimization of traffic flows, reduction of emissions, improvement of energy efficiency, but also urban climate and landscape planning issues require the involvement of many different actors, balancing different perspectives, and divergent claims. The increasing complexities of planning and decision processes make high demands on professionals of various disciplines, government departments, and municipal decision-makers. In the long term, topics like urban resilience, energy management, risk and resource management have to be taken into account and reflected in future projects, but always related to socio-spatial and governmental aspects. Accordingly, it is important to develop models to be able to understand and analyze the outcomes and effects of governmental measures and planning on the urban environment. Thus, a more systematic approach is needed - going away from well-defined city models to city system models. The purpose is to describe urban processes not only quantitatively, but to grasp their qualitative complexity and interdependencies, by modeling and simulating existing urban systems. This contribution will present the City System Model (CSM) concept closely related to an Urban Energy Planning use case, will highlight the methodology, and focus on first results and findings from an ongoing interdisciplinary research project and use case to improve the basis of information for decision-makers and politicians about urban planning decisions.
Capacity planning for maternal-fetal medicine using discrete event simulation.
Ferraro, Nicole M; Reamer, Courtney B; Reynolds, Thomas A; Howell, Lori J; Moldenhauer, Julie S; Day, Theodore Eugene
2015-07-01
Maternal-fetal medicine is a rapidly growing field requiring collaboration from many subspecialties. We provide an evidence-based estimate of capacity needs for our clinic, as well as demonstrate how simulation can aid in capacity planning in similar environments. A Discrete Event Simulation of the Center for Fetal Diagnosis and Treatment and Special Delivery Unit at The Children's Hospital of Philadelphia was designed and validated. This model was then used to determine the time until demand overwhelms inpatient bed availability under increasing capacity. No significant deviation was found between historical inpatient censuses and simulated censuses for the validation phase (p = 0.889). Prospectively increasing capacity was found to delay time to balk (the inability of the center to provide bed space for a patient in need of admission). With current capacity, the model predicts mean time to balk of 276 days. Adding three beds delays mean time to first balk to 762 days; an additional six beds to 1,335 days. Providing sufficient access is a patient safety issue, and good planning is crucial for targeting infrastructure investments appropriately. Computer-simulated analysis can provide an evidence base for both medical and administrative decision making in a complex clinical environment.
Xia, J. J.; Gateno, J.; Teichgraeber, J. F.; Yuan, P.; Chen, K.-C.; Li, J.; Zhang, X.; Tang, Z.; Alfi, D. M.
2015-01-01
The success of craniomaxillofacial (CMF) surgery depends not only on the surgical techniques, but also on an accurate surgical plan. The adoption of computer-aided surgical simulation (CASS) has created a paradigm shift in surgical planning. However, planning an orthognathic operation using CASS differs fundamentally from planning using traditional methods. With this in mind, the Surgical Planning Laboratory of Houston Methodist Research Institute has developed a CASS protocol designed specifically for orthognathic surgery. The purpose of this article is to present an algorithm using virtual tools for planning a double-jaw orthognathic operation. This paper will serve as an operation manual for surgeons wanting to incorporate CASS into their clinical practice. PMID:26573562
Grain coarsening mechanism of Cu thin films by rapid annealing
Energy Technology Data Exchange (ETDEWEB)
Sasajima, Yasushi, E-mail: sasajima@mx.ibaraki.ac.jp; Kageyama, Junpei; Khoo, Khyoupin; Onuki, Jin
2010-09-30
Cu thin films have been produced by an electroplating method using a nominal 9N anode and nominal 6N CuSO4·5H2O electrolyte. Film samples were heat-treated by two procedures: conventional isothermal annealing in hydrogen atmosphere (abbreviated as H2 annealing) and rapid thermal annealing with an infrared lamp (abbreviated as RTA). After heat treatment, the average grain diameters and the grain orientation distributions were examined by electron backscattering pattern analysis. The RTA samples (400 °C for 5 min) have a larger average grain diameter, more uniform grain distribution and higher ratio of (111) orientation than H2-annealed samples (400 °C for 30 min). This means that RTA can produce films with coarser and more uniformly distributed grains than H2 annealing within a short time, i.e. only a few minutes. To clarify the grain coarsening mechanism, grain growth by RTA was simulated using the phase field method. The simulated grain diameter reaches its maximum at a heating rate of the same order as that in the actual RTA experiment. The maximum grain diameter is larger than that obtained by H2 annealing with the same annealing time at the isothermal stage as in RTA. The distribution of the misorientation was analyzed, which led to a proposed grain growth model for the RTA method.
Robustness of Recommended Farm Plans in England under Climate Change: A Monte Carlo Simulation
Energy Technology Data Exchange (ETDEWEB)
Gibbons, J.M.; Ramsden, S.J. [Division of Agricultural Sciences, University of Nottingham, Sutton Bonington Campus, Loughborough, LE12 5RD (United Kingdom)
2005-01-01
A methodology is described for estimating robustness of recommended farm plans under climate change while maintaining a meaningful representation of the underlying farm system. Monte Carlo Simulation (MCS) of crop yield data is used in conjunction with a fully specified farm-level model and output from a field worktime model. Estimates of farm net margin, enterprise mix (choice and area of enterprises), labour, machinery, storage and animal housing under mean crop yields and field worktimes for current (2000s) and 2050s conditions are generated. MCS is used to estimate the effect of crop yield variation on farm profitability and enterprise mix for the same periods by running the farm-level model with no constraints and running it constrained to the mean data plan. Estimates of robustness, measured as the percentage difference and the probability of exceeding the mean farm net-margin, were calculated from the outputs from these runs. For three representative farm types, mean farm net margin increased; however changes in robustness as shown by percentage difference in farm net margin depended on farm type while the probability of exceeding the mean plan net-margin decreased by 2050 indicating an increase in robustness. The most robust farm type had a diversified mix of enterprises and required no additional fixed resources by the 2050s. The least robust farm type was in a marginal location and mean plan recommendations for the 2050s required additional investment in fixed resources, particularly irrigation. It is concluded that the information provided by the methodology would be particularly useful to farmers: where mean data plans are not robust, MCS results could be used with financial planning techniques to minimise the impact of variability, rather than using high cost inputs to reduce variability per se.
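The core MCS robustness measure, the probability of exceeding the mean-data plan's net margin, can be sketched for a single hypothetical enterprise. The farm-level model with enterprise mix and worktime constraints is far richer; all numbers below are placeholders.

```python
import random

def p_exceed_mean_plan(yield_mean, yield_sd, price, fixed_cost, area,
                       n_draws=10000, seed=42):
    """Monte Carlo estimate of the probability that realised farm net margin
    exceeds the net margin of the plan built on mean yields."""
    rng = random.Random(seed)
    mean_margin = area * yield_mean * price - fixed_cost
    exceed = 0
    for _ in range(n_draws):
        y = rng.gauss(yield_mean, yield_sd)      # stochastic crop yield (t/ha)
        if area * y * price - fixed_cost > mean_margin:
            exceed += 1
    return exceed / n_draws

# Hypothetical single-crop farm: 100 ha, 8 +/- 1 t/ha, price 120/t, 50k fixed costs.
p = p_exceed_mean_plan(yield_mean=8.0, yield_sd=1.0, price=120.0,
                       fixed_cost=50000.0, area=100.0)
```

With a symmetric yield distribution and no constraints this probability sits near 0.5; in the full farm-level model, skewed yields, binding resource constraints and enterprise re-mixing are what push it away from that value and make it informative about robustness.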
Composition dependent thermal annealing behaviour of ion tracks in apatite
Energy Technology Data Exchange (ETDEWEB)
Nadzri, A., E-mail: allina.nadzri@anu.edu.au [Department of Electronic Materials Engineering, Research School of Physics and Engineering, Australian National University, Canberra, ACT 2601 (Australia); Schauries, D.; Mota-Santiago, P.; Muradoglu, S. [Department of Electronic Materials Engineering, Research School of Physics and Engineering, Australian National University, Canberra, ACT 2601 (Australia); Trautmann, C. [GSI Helmholtz Centre for Heavy Ion Research, Planckstrasse 1, 64291 Darmstadt (Germany); Technische Universität Darmstadt, 64287 Darmstadt (Germany); Gleadow, A.J.W. [School of Earth Science, University of Melbourne, Melbourne, VIC 3010 (Australia); Hawley, A. [Australian Synchrotron, 800 Blackburn Road, Clayton, VIC 3168 (Australia); Kluth, P. [Department of Electronic Materials Engineering, Research School of Physics and Engineering, Australian National University, Canberra, ACT 2601 (Australia)
2016-07-15
Natural apatite samples with different F/Cl content from a variety of geological locations (Durango, Mexico; Mud Tank, Australia; and Snarum, Norway) were irradiated with swift heavy ions to simulate fission tracks. The annealing kinetics of the resulting ion tracks was investigated using synchrotron-based small-angle X-ray scattering (SAXS) combined with ex situ annealing. The activation energies for track recrystallization were extracted and are consistent with previous studies using track etching; tracks in the chlorine-rich Snarum apatite are more resistant to annealing than those in the other compositions.
Evidence for quantum annealing with more than one hundred qubits
Boixo, Sergio; Rønnow, Troels F.; Isakov, Sergei V.; Wang, Zhihui; Wecker, David; Lidar, Daniel A.; Martinis, John M.; Troyer, Matthias
2014-03-01
Quantum technology is maturing to the point where quantum devices, such as quantum communication systems, quantum random number generators and quantum simulators, may be built with capabilities exceeding those of classical computers. A quantum annealer, in particular, solves optimization problems by evolving a known initial configuration at non-zero temperature towards the ground state of a Hamiltonian encoding a given problem. Here, we present results from tests on a 108-qubit D-Wave One device based on superconducting flux qubits. By studying correlations we find that the device performance is inconsistent with classical annealing or with its being governed by classical spin dynamics. In contrast, we find that the device correlates well with simulated quantum annealing. We find further evidence for quantum annealing in the form of small-gap avoided level crossings characterizing the hard problems. To assess the computational power of the device we compare it against optimized classical algorithms.
Directory of Open Access Journals (Sweden)
Y. Li
2012-01-01
Full Text Available The Xiaojiaqiao barrier lake, which was the second largest barrier lake formed by the Wenchuan Earthquake, had seriously threatened the lives and property of the population downstream. The lake was finally dredged successfully on 7 June 2008. Because of the limited time available to conduct an inundation potential analysis and make an evacuation plan, barrier lake information extraction and real-time dam break flood simulation should be carried out quickly, integrating remote sensing and geographic information system (GIS) techniques with hydrologic/hydraulic analysis. In this paper, a technical framework and several key techniques for this real-time preliminary evacuation planning are introduced. An object-oriented method was used to extract hydrological information on the barrier lake from unmanned aerial vehicle (UAV) remote sensing images. The real-time flood routine was calculated by using shallow-water equations, which were solved by means of a finite volume scheme on multiblock structured grids. The results of the hydraulic computations are visualized and analyzed in a 3-D geographic information system for inundation potential analysis, and an emergency response plan is made. The results show that if either a full-break or a half-break situation had occurred for the Chapinghe barrier lake on 19 May 2008, then the Xiaoba Town region and the Sangzao Town region would have been affected, but the downstream towns would have been less influenced. Preliminary evacuation plans under different dam break situations can be effectively made using these methods.
Institute of Scientific and Technical Information of China (English)
毛力; 刘兴阳; 沈明明
2011-01-01
In view of the respective advantages and disadvantages of K-harmonic means (KHM) and simulated annealing particle swarm optimization (SAPSO), a hybrid clustering algorithm combining the two (KHM-SAPSO) is presented. Using KHM, the particle swarm is divided into several sub-groups, and each particle iteratively updates its position based on its individual extreme value and the global extreme value of the sub-group it belongs to. The simulated annealing technique prevents premature convergence and improves calculation accuracy. The new hybrid algorithm was evaluated on the Iris, Zoo, Wine and Image Segmentation data sets, with the F-measure used to assess clustering quality. The experimental results indicate that the new algorithm significantly improves clustering effectiveness by avoiding entrapment in local optima and enhances the global search capability while achieving a faster convergence rate. The algorithm has been adopted by the aquaculture water quality analysis system of a freshwater breeding base in Wuxi, where it is running effectively.
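A bare K-harmonic means iteration, without the SAPSO layer, can be sketched as follows. The membership and weight formulas follow the standard KHM formulation; the two-blob data set and the power parameter p are illustrative assumptions.

```python
import math
import random

def khm(points, centers, p=3.5, iters=100):
    """One K-harmonic means run: soft memberships and per-point weights are
    derived from powers of the distances to all centres, which makes the
    method much less sensitive to initialisation than plain k-means."""
    eps = 1e-9
    for _ in range(iters):
        acc = [[0.0, 0.0, 0.0] for _ in centers]   # sums of m*w*x, m*w*y, m*w
        for x, y in points:
            d = [max(eps, math.hypot(x - cx, y - cy)) for cx, cy in centers]
            s_p2 = sum(di ** (-p - 2) for di in d)
            s_p = sum(di ** (-p) for di in d)
            w = s_p2 / (s_p * s_p)                 # weight of this point
            for j, dj in enumerate(d):
                m = dj ** (-p - 2) / s_p2          # membership of point in centre j
                acc[j][0] += m * w * x
                acc[j][1] += m * w * y
                acc[j][2] += m * w
        centers = [(sx / sw, sy / sw) for sx, sy, sw in acc]
    return centers

# Two well-separated Gaussian blobs around (0, 0) and (10, 10).
rng = random.Random(1)
pts = ([(rng.gauss(0, 0.5), rng.gauss(0, 0.5)) for _ in range(100)] +
       [(rng.gauss(10, 0.5), rng.gauss(10, 0.5)) for _ in range(100)])
centers = sorted(khm(pts, [pts[0], pts[-1]]))      # one seed point per blob
```

In the hybrid, this deterministic update supplies fast local refinement, while the SAPSO sub-groups with annealed acceptance supply the global exploration that plain KHM lacks.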
Institute of Scientific and Technical Information of China (English)
陈雄峰; 吴景岚; 朱文兴
2014-01-01
A hybrid genetic simulated annealing algorithm is presented for solving VLSI standard cell placement problems with up to millions of cells. First, to make the genetic algorithm capable of handling very large scale standard cell placement, the strategies of small population size, dynamic population updating, and crossover localization are adopted, and the global and local searches of the genetic algorithm are coordinated. Then, by introducing hill climbing (HC) and simulated annealing (SA) into the framework of the genetic algorithm and the internal procedures of its operators, an effective crossover operator named net-cycle crossover and local search algorithms for the placement problem are designed to further improve the evolutionary efficiency of the algorithm and the quality of its placement results. Within the algorithm, the HC method focuses on array placement and the SA method on non-array placement. Experimental results on the Peko suite3, Peko suite4 and ISPD04 benchmark circuits show that the proposed algorithm can handle array placements with 10,000–1,600,000 cells and non-array placements with 10,000–210,000 cells, and can effectively improve the quality of placement results within a reasonable running time.
Optimal Acceleration-Velocity-Bounded Trajectory Planning in Dynamic Crowd Simulation
Directory of Open Access Journals (Sweden)
Fu Yue-wen
2014-01-01
Full Text Available Creating complex and realistic crowd behaviors, such as pedestrian navigation behavior with dynamic obstacles, is a difficult and time-consuming task. In this paper, we study one special type of crowd, composed of urgent individuals, normal individuals, and normal groups. We use three steps to construct the crowd simulation in a dynamic environment. The first is that the urgent individuals move forward along a given path around dynamic obstacles and other crowd members. An optimal acceleration-velocity-bounded trajectory planning method is utilized to model their behaviors, which ensures that the durations of the generated trajectories are minimal and that the urgent individuals are collision-free with dynamic obstacles (e.g., dynamic vehicles). In the second step, a pushing model is adopted to simulate the interactions between urgent members and normal ones, which ensures that the computational cost of the optimal trajectory planning is acceptable. The third step imitates the interactions among normal members using collision avoidance behavior and flocking behavior. Various simulation results demonstrate that these three steps produce realistic crowd phenomena just like the real world.
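The time-optimality notion used for the urgent individuals can be illustrated in one dimension, where the acceleration- and velocity-bounded minimum-time profile has a closed form. This is a simplification of the 2-D collision-free planning problem treated in the paper.

```python
import math

def min_time_profile(distance, v_max, a_max):
    """Minimum duration of a rest-to-rest move under velocity and acceleration
    bounds: a trapezoidal speed profile, degenerating to a triangular one when
    the distance is too short to reach v_max."""
    d_ramps = v_max * v_max / a_max          # distance used accelerating + braking
    if distance >= d_ramps:
        # Trapezoid: ramp up, cruise at v_max, ramp down.
        t_ramp = v_max / a_max
        t_cruise = (distance - d_ramps) / v_max
        return 2.0 * t_ramp + t_cruise
    # Triangle: peak speed below v_max, all time spent ramping.
    return 2.0 * math.sqrt(distance / a_max)

# A 10 m move with v_max = 2 m/s and a_max = 1 m/s^2:
# ramp up 2 s (2 m), cruise 3 s (6 m), ramp down 2 s (2 m).
t = min_time_profile(10.0, v_max=2.0, a_max=1.0)
```

The planner in the paper solves the same kind of bounded-acceleration, bounded-velocity minimum-time problem, but in 2-D and subject to time-varying collision constraints from moving obstacles and other pedestrians.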
Computer aided process planning and die design in simulation environment in sheet metal forming
Tisza, Miklós; Lukács, Zsolt
2013-12-01
Over the past 10-15 years, Computer Aided Process Planning and Die Design have evolved into some of the most important engineering tools in sheet metal forming, particularly in the automotive industry. This emerging role is further reinforced by the rapid development of Finite Element Modeling. The purpose of this paper is to give a general overview of recent achievements in this very important field of sheet metal forming and to introduce some special results of this development activity. Accordingly, an integrated process simulation and die design system developed at the Department of Mechanical Engineering of the University of Miskolc will be analyzed. The proposed integrated solutions have great practical importance for improving the global competitiveness of sheet metal forming in this very important segment of industry. The concept described in this paper may have specific value both for process planning and die design engineers.
Simulation-Based Planning and Control of Transport Flows in Port Logistic Systems
Directory of Open Access Journals (Sweden)
Antonio Diogo Passos Lima
2015-01-01
Full Text Available In highly dynamic and uncertain transport conditions, transport transit time has to be continuously monitored to ensure the service level at a proper cost. The aim of this research is to propose and test a procedure that allows agile planning and control of transport flows in port logistic systems. The procedure couples an agent-based simulation with a queueing theory model. In this paper, the transport scheduling performed by an agent at the intermodal terminal is taken into consideration. The decision-making agent takes into account data acquired at remote points of the system. The obtained results indicate the relevance of continuously considering the expected transit time and further waiting times along port logistic systems for transport planning and control.
Tumlinson, Katherine; Speizer, Ilene S.; Curtis, Sian L.; Pence, Brian W.
2014-01-01
Despite widespread endorsement within the field of international family planning regarding the importance of quality of care as a reproductive right, the field has yet to develop validated data collection instruments to accurately assess quality in terms of its public health importance. This study, conducted among 19 higher volume public and private facilities in Kisumu, Kenya, used the simulated client method to test the validity of three standard data collection instruments included in large-scale facility surveys: provider interviews, client interviews, and observation of client-provider interactions. Results found low specificity and positive predictive values in each of the three instruments for a number of quality indicators, suggesting that quality of care may be overestimated by traditional methods. Revised approaches to measuring family planning service quality may be needed to ensure accurate assessment of programs and to better inform quality improvement interventions. PMID:25469929
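The validity measures reported above (specificity and positive predictive value) come from a 2×2 cross-classification of each survey instrument against the simulated-client gold standard. The counts below are purely hypothetical, for illustration of the arithmetic only, not the study's data.

```python
# Standard validity metrics from a 2x2 confusion table, where the
# simulated-client observation is treated as the gold standard and the
# survey instrument (provider/client interview, observation) as the test.
def validity(tp, fp, fn, tn):
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),   # positive predictive value
        "npv": tn / (tn + fn),   # negative predictive value
    }

m = validity(tp=12, fp=18, fn=3, tn=7)  # hypothetical indicator counts
print(m["specificity"], m["ppv"])
```

With these counts, specificity is 7/25 and PPV is 12/30, illustrating how many false positives (quality reported but not actually delivered) drive both metrics down, i.e., quality being overestimated.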
Energy Technology Data Exchange (ETDEWEB)
Dinges, S.; Koswig, S.; Buchali, A.; Wurm, R.; Schlenger, L.; Boehmer, D.; Budach, V. [Humboldt-Universitaet Berlin (Germany). Klinik fuer Strahlentherapie
1998-10-01
Purpose: The exact coverage of the lymph nodes and optimal shielding of the organs at risk are necessary for patients with Hodgkin's disease or malignant lymphoma to guarantee a high cure rate and a low rate of late effects in normal tissue. The purpose of this study was to compare conventional simulation and blocking with virtual simulation in terms of coverage of the target volume and shielding of the organs at risk in this highly curable patient group. Patients and Methods: In 10 patients diagnosed with Hodgkin's disease and 5 patients with a non-Hodgkin lymphoma, radiation treatment planning for a mantle field or a para-aortic field with inclusion of the spleen was performed both conventionally and with virtual simulation. With the conventional technique, irradiation portals were defined during fluoroscopy, and shielding of the organs at risk was drawn onto the simulation films based on information from previous X-ray films, CT or MRI scans. For virtual simulation, contouring of the target volumes and organs at risk (e.g., the kidneys) and the definition of the irradiation portals were performed with the AcQSim™ software package on a VoxelQ™ workstation (Picker Inc.). This was done in a beam's eye view environment on a current CT scan acquired in the treatment position. Both sets of irradiation portals were compared in terms of coverage of the target volume and shielding of the organs at risk. Results: Planning of a mantle field in the conventional way resulted in incomplete coverage of the right hilus in 4/15 cases and of the left in 1/15 cases, respectively. The spleen and the splenic hilus were not covered completely in 5/15 and 6/15 cases, respectively. The left kidney was adequately shielded in only two thirds (10/15) of the conventionally planned fields. The planning time required for virtual simulation was reduced for the patient, but was increased for the physician because of the more time-consuming contouring procedure
Improved mapping of the travelling salesman problem for quantum annealing
Troyer, Matthias; Heim, Bettina; Brown, Ethan; Wecker, David
2015-03-01
We consider the quantum adiabatic algorithm as applied to the travelling salesman problem (TSP). We introduce a novel mapping of the TSP to an Ising spin glass Hamiltonian and compare it to previously known mappings. Through direct perturbative analysis, unitary evolution, and simulated quantum annealing, we show this new mapping to be significantly superior. We discuss how this advantage can translate to actual physical implementations of the TSP on quantum annealers.
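For context, the conventional TSP-to-Ising encoding that such work builds on can be written as a QUBO energy over binary variables x[v][t] (city v visited at step t), with a penalty weight A enforcing the permutation constraints and a weight B on the tour length. The sketch below evaluates that standard energy; the penalty values are illustrative, and this is the textbook mapping, not the paper's improved one.

```python
def qubo_energy(x, dist, A=10.0, B=1.0):
    """Energy of the standard TSP QUBO: x[v][t] = 1 iff city v is at step t."""
    n = len(dist)
    e = 0.0
    for t in range(n):                       # exactly one city per time step
        e += A * (sum(x[v][t] for v in range(n)) - 1) ** 2
    for v in range(n):                       # each city visited exactly once
        e += A * (sum(x[v][t] for t in range(n)) - 1) ** 2
    for u in range(n):                       # tour-length coupling term
        for v in range(n):
            if u == v:
                continue
            for t in range(n):
                e += B * dist[u][v] * x[u][t] * x[v][(t + 1) % n]
    return e

dist = [[0, 1, 2], [1, 0, 1], [2, 1, 0]]
tour = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]     # valid tour 0 -> 1 -> 2 -> 0
print(qubo_energy(tour, dist))               # penalties vanish; tour length remains
```

For a valid tour the quadratic penalties vanish and only the length term (here 1 + 1 + 2 = 4) survives; invalid assignments pay A per violated constraint, which is what makes the ground state encode the optimal tour.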
Energy Technology Data Exchange (ETDEWEB)
Haeggstaahl, Daniel [Maelardalen Univ., Vaesteraas (Sweden); Dotzauer, Erik [AB Fortum, Stockholm (Sweden)
2004-12-01
Production planning in Combined Heat and Power (CHP) systems is considered. The focus is on the development and use of mathematical models and methods. Different aspects of production planning are discussed, including weather and load predictions. Questions relevant to the different planning horizons are illuminated. The main purpose of short-term (one week) planning is to decide when to start and stop the production units, and how to use the heat storage. The main conclusion from the outline of the pros and cons of commercial planning software is that several packages use Mixed Integer Programming (MIP); in that sense they are similar. Building a production planning model means that the planning problem is formulated as a mathematical optimization problem. The accuracy of the input data determines the practical level of detail of the model. Two alternatives to the methods used in today's commercial programs are proposed: stochastic optimization and simulator-based optimization. The basic concepts of mathematical optimization are outlined. A simulator-based model for short-term planning is developed. The purpose is to minimize the production costs, which depend on the heat demand in the district heating system, prices of electricity and fuels, emission taxes and fees, etc. The problem is simplified by not including any time-linking conditions. The process model is developed in IPSEpro, a heat and mass-balance software from SimTech Simulation Technology. TOMLAB, an optimization toolbox in MATLAB, is used as the optimizer. Three different solvers are applied: glcFast, glcCluster and SNOPT. The link between TOMLAB and IPSEpro is accomplished using the Microsoft COM technology. MATLAB is the automation client and controls IPSEpro and TOMLAB. The simulator-based model is applied to the CHP plant in Eskilstuna. Two days are chosen and analyzed, and the optimized production is compared to the measured. A sensitivity analysis on how variations in outdoor
Kum, Oyeon
2007-11-01
Customized cancer radiation treatment planning for each patient is very useful for both patient and doctor because it provides the ability to deliver higher doses to a more accurately defined tumor while delivering lower doses to organs at risk and normal tissues. This can be realized by building an accurate planning simulation system that provides better treatment strategies based on each patient's tomographic data, such as CT, MRI, PET, or SPECT. In this study, we develop a real-time online client-server/client collaborative environment between the client (health care professionals or hospitals) and the server/client over a secure network using telematics (the integrated use of telecommunications and medical informatics). The implementation is based on a point-to-point communication scheme between client and server/client following the WYSIWIS (what you see is what I see) paradigm. After uploading the patient tomographic data, the client is able to collaborate with the server/client for treatment planning. Consequently, the level of health care services can be improved, specifically for small radiotherapy clinics in rural or remote areas that do not possess much experience or equipment such as a treatment planning simulator. The telematics service of the system can also be used to provide continuing medical education in radiotherapy. Moreover, the system is easy to use: a client who is familiar with the Windows™ operating system can use it, because it is designed and built around a user-friendly concept. The system does not require the client to perform ongoing hardware and software maintenance and updates; these are performed automatically by the server.
SU-E-J-72: Geant4 Simulations of Spot-Scanned Proton Beam Treatment Plans
Energy Technology Data Exchange (ETDEWEB)
Kanehira, T; Sutherland, K; Matsuura, T; Umegaki, K; Shirato, H [Hokkaido University, Sapporo, Hokkaido (Japan)
2014-06-01
Purpose: To evaluate density inhomogeneities which can affect dose distributions for real-time image gated spot-scanning proton therapy (RGPT), a dose calculation system based on Geant4 was developed using spot position data from the treatment planning system VQA (Hitachi Ltd., Tokyo). Methods: A Geant4 application was developed to simulate spot-scanned proton beams at Hokkaido University Hospital. A CT scan (0.98 × 0.98 × 1.25 mm) was performed for prostate cancer treatment with three or four gold markers (diameter 1.5 mm, volume 1.77 mm³) inserted in or near the target tumor. The CT data was read into VQA. A spot scanning plan was generated and exported to text files specifying the beam energy and position of each spot. The text files were converted and read into our Geant4-based software. The spot position was converted into steering magnet field strength (in tesla) for our beam nozzle. Individual protons were tracked from the vacuum chamber, through the helium chamber, steering magnets, dose monitors, etc., in a straight, horizontal line. The patient CT data was converted into materials with variable density and placed in a parametrized volume at the isocenter. Gold fiducial markers were represented in the CT data by two adjacent voxels (volume 2.38 mm³). 600,000 proton histories were tracked for each target spot. As one beam contained about 1,000 spots, approximately 600 million histories were recorded for each beam on a blade server. Two plans were considered: a two-beam horizontal opposed plan (90 and 270 degrees) and a three-beam plan (0, 90 and 270 degrees). Results: We are able to convert spot scanning plans from VQA and simulate them with our Geant4-based code. Our system can be used to evaluate the dose reduction caused by gold markers used for RGPT. Conclusion: Our Geant4 application is able to calculate dose distributions for spot-scanned proton therapy.
Institute of Scientific and Technical Information of China (English)
杨建宇; 岳彦利; 宋海荣; 汤赛; 叶思菁; 徐凡
2015-01-01
Monitoring of cultivated land quality is an important measure for ensuring the sustainable use of cultivated land resources, improving productivity, and strengthening the management, protection, and rational use of cultivated land, and it is of great significance for sustained food security. This paper proposes a sampling-layout optimization method for cultivated land quality monitoring based on spatial simulated annealing (SSA). A set of optimal samples generated by SSA forms the base monitoring network. On this basis, grading factors whose attributes have changed, together with the corresponding changed regions, are extracted from multiple periods of cultivated land grading results to delineate potential change zones; combined with the actual conditions of the study area, expert knowledge, and anomaly monitoring points, the base sample points are then optimized through addition, deletion, and replacement to produce the final monitoring points. Taking Daxing District, Beijing, as a case study, 55 monitoring points were finally laid out. The results show that points laid out by this method predict cultivated land quality with higher accuracy than traditional simple random sampling and stratified sampling, and can effectively predict county-level cultivated land quality and monitor its changes. Due to the spatial correlation among samples, traditional methods such as simple random sampling, stratified sampling and systematic sampling are inefficient for this task, which motivates the SSA-based spatial sampling and optimization approach.
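Spatial simulated annealing for sample layout can be sketched on a toy grid. The sketch below assumes the common MMSD criterion (mean distance from every grid cell to its nearest sample point) as the objective; the study's actual criterion additionally incorporates grade-change zones and expert knowledge, which are omitted here.

```python
# Spatial simulated annealing (SSA) sketch: move one sample point at a
# time and accept by the Metropolis rule, minimizing the mean distance
# from each grid cell to its nearest monitoring point (MMSD).
import math
import random

GRID = [(i, j) for i in range(10) for j in range(10)]  # toy 10x10 study area

def mmsd(samples):
    return sum(min(math.dist(g, s) for s in samples) for g in GRID) / len(GRID)

def ssa(n_samples=5, steps=400, t0=1.0, cooling=0.99):
    random.seed(7)
    cur = random.sample(GRID, n_samples)
    cost, t = mmsd(cur), t0
    for _ in range(steps):
        cand = cur[:]
        cand[random.randrange(n_samples)] = random.choice(GRID)  # relocate one point
        c = mmsd(cand)
        if c < cost or random.random() < math.exp((cost - c) / t):
            cur, cost = cand, c
        t *= cooling
    return cur, cost

pts, cost = ssa()
```

The accepted layout spreads the points across the grid; a degenerate layout with all points stacked in one corner scores far worse under the same criterion.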
Computer-assisted three-dimensional surgical planning and simulation: 3D virtual osteotomy.
Xia, J; Ip, H H; Samman, N; Wang, D; Kot, C S; Yeung, R W; Tideman, H
2000-02-01
A computer-assisted three-dimensional virtual osteotomy system for orthognathic surgery (CAVOS) is presented. The virtual reality workbench is used for surgical planning. The surgeon is immersed in a virtual reality environment with stereo eyewear, holds a virtual "scalpel" (a 3D mouse), and operates on a "real" patient (3D visualization) to obtain a presurgical prediction (3D bony segment movements). Virtual surgery on a computer-generated 3D head model is simulated and can be visualized from any arbitrary viewpoint on a personal computer system.
Directory of Open Access Journals (Sweden)
Baizid Khelifa
2016-01-01
Full Text Available This paper presents IRoSim, an Industrial Robotics Simulation Design Planning and Optimization platform that we developed based on the SolidWorks API. The main objective is to integrate features from mechanical and robotics CAD software into the same platform in order to facilitate the development process through a friendly interaction interface. The platform supports the main steps of developing a given robotized task, such as: defining the task, CAD learning of the end-effector's trajectory, checking the manipulator's reachability to perform the task, simulating the motion, and preventing possible collisions along the trajectory. To assess the usability of the proposed platform, a car door painting task using a 6-degree-of-freedom industrial manipulator has been developed.
Kok, H P; van den Berg, C A T; Bel, A; Crezee, J
2013-10-01
Accurate thermal simulations in hyperthermia treatment planning require discrete modeling of large blood vessels. The very long computation time of the finite difference based DIscrete VAsculature model (DIVA) developed for this purpose is impractical for clinical applications. In this work, a fast steady-state thermal solver was developed for simulations with realistic 3D vessel networks. Additionally, an efficient temperature-based optimization method including the thermal effect of discrete vasculature was developed. The steady-state energy balance for vasculature and tissue was described by a linear system, which was solved with an iterative method on the graphics processing unit. Temperature calculations during optimization were performed by superposition of several precomputed temperature distributions, calculated with the developed thermal solver. The thermal solver and optimization were applied to a human anatomy, with the prostate as the target region, heated with the eight-waveguide 70 MHz AMC-8 system. Realistic 3D pelvic vasculature was obtained from angiography. Both the arterial and venous vessel networks consisted of 174 segments and 93 endpoints with a diameter of 1.2 mm. Calculation of the steady-state temperature distribution took about 3.3 h with the original DIVA model, while the newly developed method took only about 1-1.5 min. Temperature-based optimization with and without taking the vasculature into account showed differences in optimized waveguide power of more than a factor of 2, and the optimized tumor T90 differed by up to about 0.5°C. This shows the necessity of taking discrete vasculature into account during optimization. A very fast method was developed for thermal simulations with realistic 3D vessel networks. The short simulation time allows online calculations and makes temperature optimization with realistic vasculature feasible, which is an important step forward in hyperthermia treatment planning.
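The superposition trick that makes the optimization fast rests on linearity: if the steady-state bioheat system is linear in absorbed power, the temperature rise for any waveguide power setting is a weighted sum of precomputed unit-power temperature distributions. The sketch below illustrates only this principle, with tiny made-up 1-D arrays standing in for the 3-D volumes.

```python
# Superposition of precomputed unit-power temperature rises: evaluating a
# new power setting is a cheap weighted sum, so no re-solve of the full
# thermal system is needed inside the optimization loop.
def temperature(powers, unit_rises, baseline=37.0):
    """T(x) = baseline + sum_i p_i * dT_i(x), per spatial sample."""
    n = len(unit_rises[0])
    return [baseline + sum(p * r[i] for p, r in zip(powers, unit_rises))
            for i in range(n)]

# Hypothetical unit-power temperature rises of two waveguides at three points
rise_a = [0.1, 0.4, 0.2]
rise_b = [0.2, 0.3, 0.1]

t = temperature([2.0, 1.0], [rise_a, rise_b])
print(t)
```

With these illustrative numbers the field evaluates to about [37.4, 38.1, 37.5] °C; an optimizer can then search the power weights freely, since each evaluation costs only a dot product per voxel.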
Energy Technology Data Exchange (ETDEWEB)
Driscoll, P.C.; Gronenborn, A.M.; Beress, L.; Clore, G.M. (National Institutes of Health, Bethesda, MD (USA))
1989-03-07
The three-dimensional solution structure of the antihypertensive and antiviral protein BDS-I from the sea anemone Anemonia sulcata has been determined on the basis of 489 interproton and 24 hydrogen-bonding distance restraints supplemented by 23 φ backbone and 21 χ1 side-chain torsion angle restraints derived from nuclear magnetic resonance (NMR) measurements. A total of 42 structures was calculated by a hybrid metric matrix distance geometry-dynamical simulated annealing approach. Both the backbone and side-chain atom positions are well defined. The average atomic rms difference between the 42 individual SA structures and the mean structure obtained by averaging their coordinates is 0.67 ± 0.12 Å for the backbone atoms and 0.90 ± 0.17 Å for all atoms. The core of the protein is formed by a triple-stranded antiparallel β-sheet composed of residues 14-16 (strand 1), 30-34 (strand 2), and 37-41 (strand 3), with an additional mini antiparallel β-sheet at the N-terminus (residues 6-9). The first and second strands of the triple-stranded antiparallel β-sheet are connected by a long exposed loop. A number of side-chain interactions are discussed in light of the structure.
Institute of Scientific and Technical Information of China (English)
钱晓杨
2013-01-01
By examining the correlations and the multicollinearity among physical fitness indicators, this paper improves on the existing linear regression model and proposes a new optimized ridge regression estimation algorithm that determines the ridge parameter k by simulated annealing. Experiments using mean squared error and common-sense checks as reference standards demonstrate the accuracy and reliability of the algorithm.
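The combination described above can be sketched directly: the ridge estimator is β(k) = (XᵀX + kI)⁻¹Xᵀy, and simulated annealing searches over k. This is a minimal sketch under stated assumptions, not the paper's algorithm: it scores k by in-sample MSE on toy data (in practice one would score on held-out data, since in-sample MSE always favors small k), and all numbers are illustrative.

```python
# Ridge regression with a simulated-annealing search over the ridge
# parameter k. beta(k) = (X^T X + k I)^{-1} X^T y, solved by naive
# Gauss-Jordan elimination to keep the sketch dependency-free.
import math
import random

def ridge_beta(X, y, k):
    n = len(X[0])
    A = [[sum(X[r][i] * X[r][j] for r in range(len(X))) + (k if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    b = [sum(X[r][i] * y[r] for r in range(len(X))) for i in range(n)]
    for col in range(n):                      # Gauss-Jordan elimination
        piv = A[col][col]
        for j in range(col, n):
            A[col][j] /= piv
        b[col] /= piv
        for r in range(n):
            if r != col:
                f = A[r][col]
                for j in range(col, n):
                    A[r][j] -= f * A[col][j]
                b[r] -= f * b[col]
    return b

def mse(X, y, beta):
    return sum((sum(w * x for w, x in zip(beta, row)) - t) ** 2
               for row, t in zip(X, y)) / len(y)

def sa_pick_k(X, y, steps=200):
    random.seed(3)
    k = best_k = 1.0
    cost = best_cost = mse(X, y, ridge_beta(X, y, k))
    t = 1.0
    for _ in range(steps):
        cand = max(1e-6, k + random.uniform(-0.2, 0.2))  # perturb k, keep > 0
        c = mse(X, y, ridge_beta(X, y, cand))
        if c < cost or random.random() < math.exp((cost - c) / t):
            k, cost = cand, c
            if c < best_cost:
                best_k, best_cost = cand, c
        t *= 0.98
    return best_k

X = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]   # toy design matrix
y = [1.0, 2.0, 3.0]
k = sa_pick_k(X, y)
```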
Energy Technology Data Exchange (ETDEWEB)
Li, Taoran, E-mail: taoran.li.duke@gmail.com; Wu, Qiuwen; Yang, Yun; Rodrigues, Anna; Yin, Fang-Fang; Jackie Wu, Q. [Department of Radiation Oncology, Duke University Medical Center Durham, North Carolina 27710 (United States)
2015-01-15
Purpose: An important challenge facing online adaptive radiation therapy is the development of feasible and efficient quality assurance (QA). This project aimed to validate the deliverability of online adapted plans and develop a proof-of-concept online delivery monitoring system for online adaptive radiation therapy QA. Methods: The first part of this project benchmarked automatically adapted online prostate treatment plans using traditional portal dosimetry IMRT QA. The portal dosimetry QA results of online adapted plans were compared to original (unadapted) plans as well as randomly selected prostate IMRT plans from our clinic. In the second part, an online delivery monitoring system was designed and validated via a simulated treatment with intentional multileaf collimator (MLC) errors. This system was based on inputs from the dynamic machine information (DMI), which continuously reports actual MLC positions and machine monitor units (MUs) at intervals of 50 ms or less during delivery. Based on the DMI, the system performed two levels of monitoring/verification during the delivery: (1) dynamic monitoring of cumulative fluence errors resulting from leaf position deviations and visualization using fluence error maps (FEMs); and (2) verification of MLC positions against the treatment plan for potential errors in MLC motion and data transfer at each control point. Validation of the online delivery monitoring system was performed by introducing intentional systematic MLC errors (ranging from 0.5 to 2 mm) to the DMI files for both leaf banks. These DMI files were analyzed by the proposed system to evaluate the system’s performance in quantifying errors and revealing the source of errors, as well as to understand patterns in the FEMs. In addition, FEMs from 210 actual prostate IMRT beams were analyzed using the proposed system to further validate its ability to catch and identify errors, as well as establish error magnitude baselines for prostate IMRT delivery.
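The fluence-error-map idea can be sketched in a much-reduced form. The sketch below assumes a single leaf pair on a 1-D pixel axis: at each control point the pair exposes an interval, each control point deposits its MU increment into the exposed pixels, and the FEM is the actual-minus-planned cumulative fluence. The control points and the 1-pixel leaf lag are hypothetical.

```python
# Proof-of-concept fluence error map (FEM) for one leaf pair in 1-D:
# accumulate MU per exposed pixel for planned and actual leaf sequences,
# then subtract to localize where leaf-position errors changed the dose.
def fluence(control_points, n_pix=10):
    f = [0.0] * n_pix
    for left, right, mu in control_points:   # leaf edges in pixel units
        for p in range(n_pix):
            if left <= p < right:
                f[p] += mu
    return f

planned = [(2, 6, 1.0), (3, 7, 1.0)]
actual  = [(2, 6, 1.0), (4, 7, 1.0)]         # leaf lags 1 pixel at CP 2
fem = [a - p for a, p in zip(fluence(actual), fluence(planned))]
print(fem)  # pixel 3 is under-dosed; all other pixels unaffected
```

Because the map accumulates over control points, a transient deviation leaves a localized signature while a systematic bank offset shifts the whole field edge, which is how such a system can reveal the error source.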
The Traverse Planning Process for the DRATS 2010 Analog Field Simulations
Horz, Friedrich; Gruener, John; Lofgren, Gary; Skinner, James A., Jr.; Graf, Jodi; Seibert, Marc
2011-01-01
Traverse planning concentrates on optimizing the science return within the overall objectives of planetary surface missions or their analog field simulations. Such simulations were conducted in the San Francisco Volcanic Field, northern Arizona, from Aug. 26 to Sept. 17, 2010 and involved some 200 individuals in the field, with some 40 geoscientists composing the science team. The purpose of these Desert Research and Technology Studies (DRATS) is to exercise and evaluate developmental hardware, software and operational concepts in a mission-like, fully integrated setting under the direction of an onsite Mobile Mission Control Center (MMCC). DRATS 2010 focused on the simultaneous operation of 2 rovers, a historic first. Each vehicle was manned by an astronaut-commander and an experienced field geologist. Having 2 rovers and crews in the field mandated substantially more complex science and mission control operations compared to the single-rover DRATS tests of 2008 and 2009, or the Apollo lunar missions. For instance, the science support function was distributed over 2 "back rooms", one for each rover, with both "tactical" teams operating independently and simultaneously during the actual traverses. Synthesis and integration of the daily findings and forward planning for the next day(s) were accomplished overnight by yet another "strategic" science team.
Wieslander, Elinore; Knöös, Tommy
2003-10-01
An increasing number of patients receiving radiation therapy have metallic implants such as hip prostheses. Therefore, beams are normally set up to avoid irradiation through the implant; however, this cannot always be accomplished. In such situations, knowledge of the accuracy of the used treatment planning system (TPS) is required. Two algorithms, the pencil beam (PB) and the collapsed cone (CC), are implemented in the studied TPS. Comparisons are made with Monte Carlo simulations for 6 and 18 MV. The studied materials are steel, CoCrMo, Orthinox® (a stainless steel alloy and registered trademark of Stryker Corporation), TiAlV and Ti. Monte Carlo simulated depth dose curves and dose profiles are compared to CC and PB calculated data. The CC algorithm shows overall a better agreement with Monte Carlo than the PB algorithm. Thus, it is recommended to use the CC algorithm to get the most accurate dose calculation both for the planning target volume and for tissues adjacent to the implants when beams are set up to pass through implants.
Energy Technology Data Exchange (ETDEWEB)
Sanchez Camacho, Enrique; Andreu Alvarez, Joaquin [Universidad Politecnica de Valencia (Spain)
2001-06-01
Two numerical procedures, based on the Genetic Algorithm (GA) and Simulated Annealing (SA), are developed to solve the problem of expanding the capacity of a water resource system. The problem was divided into two subproblems: capital availability and operating policy. Both are optimization-simulation models; the first is solved by the GA and SA, respectively, while the second is solved using the Out-of-kilter algorithm (OKA) in both models. The objective function considers the usual benefits and costs in this kind of system, such as irrigation and hydropower benefits and the costs of dam construction and system maintenance. The strengths and weaknesses of both models are evaluated by comparing their results with those obtained with the branch-and-bound technique, which was classically used to solve this kind of problem.
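The two-level structure above (a global search over expansion decisions, with an inner model evaluating each candidate) can be sketched with SA as the outer search. The sketch below replaces the out-of-kilter network-flow operation model with a stand-in net-present-value evaluation, and every project cost, benefit, and budget figure is hypothetical.

```python
# SA over capacity-expansion decisions: each candidate is a set of
# build/no-build bits; an inner evaluation (here a simple NPV with a
# capital constraint, standing in for the operation-policy model)
# scores it, and SA accepts by the Metropolis rule (maximization).
import math
import random

PROJECTS = [  # (capital cost, annual net operating benefit) -- illustrative
    (100.0, 22.0), (60.0, 9.0), (80.0, 19.0), (40.0, 5.0),
]
BUDGET = 180.0

def npv(build, years=10, rate=0.05):
    cost = sum(c for (c, _), b in zip(PROJECTS, build) if b)
    if cost > BUDGET:
        return -1e9                      # infeasible: capital constraint
    annual = sum(v for (_, v), b in zip(PROJECTS, build) if b)
    return -cost + annual * sum(1 / (1 + rate) ** y for y in range(1, years + 1))

def sa(steps=300):
    random.seed(11)
    cur = [False] * len(PROJECTS)
    val = npv(cur)
    best, best_val, t = cur[:], val, 5.0
    for _ in range(steps):
        cand = cur[:]
        cand[random.randrange(len(PROJECTS))] ^= True   # flip one decision
        v = npv(cand)
        if v > val or random.random() < math.exp((v - val) / t):
            cur, val = cand, v
            if v > best_val:
                best, best_val = cand[:], v
        t *= 0.98
    return best, best_val

plan, value = sa()
```

Swapping the outer loop for a GA (population of bit vectors, crossover, mutation) while keeping the same inner evaluation gives the companion procedure the abstract compares against.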
TU-A-304-02: Treatment Simulation, Planning and Delivery for SBRT
Energy Technology Data Exchange (ETDEWEB)
Yang, Y.
2015-06-15
Increased use of SBRT and hypofractionation in radiation oncology practice has posed a number of challenges to medical physicists, ranging from planning, image-guided patient setup and on-treatment monitoring, to quality assurance (QA) and dose delivery. This symposium is designed to provide the updated knowledge necessary for the safe and efficient implementation of SBRT on various linac platforms, including the emerging digital linacs equipped with high-dose-rate FFF beams. Issues related to 4D CT, PET and MRI simulation, 3D/4D CBCT guided patient setup, real-time image guidance during SBRT dose delivery using gated/un-gated VMAT or IMRT, and technical advancements in QA of SBRT (in particular, strategies dealing with high-dose-rate FFF beams) will be addressed. The symposium will help the attendees gain a comprehensive understanding of the SBRT workflow and facilitate their clinical implementation of state-of-the-art imaging and planning techniques. Learning Objectives: Present background knowledge of SBRT, describe essential requirements for safe implementation of SBRT, and discuss issues specific to SBRT treatment planning and QA. Update on the use of multi-dimensional (3D and 4D) and multi-modality (CT, beam-level X-ray imaging, pre- and on-treatment 3D/4D MRI, PET, robotic ultrasound, etc.) imaging for reliable guidance of SBRT. Provide a comprehensive overview of emerging digital linacs and summarize the key geometric and dosimetric features of the new generation of linacs for substantially improved SBRT. Discuss treatment planning and quality assurance issues specific to SBRT. Research grant from Varian Medical Systems.
Gaube, Veronika; Remesch, Alexander
2013-01-01
Interest in assessing the sustainability of socio-ecological systems of urban areas has increased notably, with additional attention generated due to the fact that half the world's population now lives in cities. Urban areas face both a changing urban population size and increasing sustainability issues in terms of providing good socioeconomic and environmental living conditions. Urban planning has to deal with both challenges. Households play a major role by being affected by urban planning decisions on the one hand and by being responsible – among many other factors – for the environmental performance of a city (e.g. energy use). We here present an agent-based decision model referring to the city of Vienna, the capital of Austria, with a population of about 1.7 million (2.3 million within the metropolitan area, the latter being more than 25% of Austria's total population). Since the early 1990s, after decades of negative population growth, Vienna has been experiencing a steady increase in population, mainly driven by immigration. The aim of the agent-based decision model is to simulate new residential patterns of different household types based on demographic development and migration scenarios. Model results were used to assess spatial patterns of energy use caused by different household types in the four scenarios (1) conventional urban planning, (2) sustainable urban planning, (3) expensive centre and (4) no green area preference. Outcomes show that changes in preferences of households relating to the presence of nearby green areas have the most important impact on the distribution of households across the small-scaled city area. Additionally, the results demonstrate the importance of the distribution of different household types regarding spatial patterns of energy use. PMID:27667962
Simulation of heat exchanger network (HEN) and planning the optimum cleaning schedule
Energy Technology Data Exchange (ETDEWEB)
Sanaye, Sepehr [Energy Systems Improvement Laboratory, Mechanical Engineering Department, Iran University of Science and Technology (IUST), Narmak, Tehran 16488 (Iran, Islamic Republic of)]. E-mail: sepehr@iust.ac.ir; Niroomand, Behzad [Energy Systems Improvement Laboratory, Mechanical Engineering Department, Iran University of Science and Technology (IUST), Narmak, Tehran 16488 (Iran, Islamic Republic of)
2007-05-15
Modeling and simulation of heat exchanger networks for estimating the amount of fouling, variations in the overall heat transfer coefficient, and variations in the outlet temperatures of hot and cold streams has a significant effect on production analysis. In this analysis, parameters such as the exchangers' types and arrangements, their heat transfer surface areas, mass flow rates of hot and cold streams, heat transfer coefficients and the variation of fouling with time are required input data. The main goal is to find the variations of the outlet temperatures of the hot and cold streams with time in order to plan the optimum cleaning schedule of heat exchangers that provides the minimum operational cost or the maximum amount of savings. In this paper, the simulation of heat exchanger networks is performed by choosing an asymptotic fouling function. Two main parameters in the asymptotic fouling formation model, i.e. the decay time of fouling formation (τ) and the asymptotic fouling resistance (R_f^∞), were obtained from empirical data as input parameters to the simulation relations. These data were extracted from the technical history sheets of the Khorasan Petrochemical Plant to guarantee consistency between the model outputs and the real operating conditions. The output results of the software program developed, including the variations with time of the outlet temperatures of the hot and cold streams, the heat transfer coefficient and the heat transfer rate in the exchangers, are presented for two case studies. Then, an objective function (operational cost) was defined, and the optimal cleaning schedules of the HEN (heat exchanger network) in the Urea and Ammonia units were found by minimizing the objective function using a numerical search method. Based on this minimization procedure, the decision was made whether a heat exchanger should be cleaned or continue to operate. The final result was the most cost-effective plan for the HEN cleaning schedule.
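The asymptotic fouling law named above is commonly written as R_f(t) = R_f^∞ (1 − e^(−t/τ)). The sketch below, with illustrative parameter values that are not taken from the plant data in the paper, shows how such a fouling resistance degrades the overall heat transfer coefficient over time:

```python
import math

def fouling_resistance(t, r_inf, tau):
    """Asymptotic fouling model: R_f(t) = R_inf * (1 - exp(-t / tau))."""
    return r_inf * (1.0 - math.exp(-t / tau))

def overall_u(u_clean, r_f):
    """Fouling adds a series thermal resistance: 1/U = 1/U_clean + R_f."""
    return 1.0 / (1.0 / u_clean + r_f)

# Illustrative values only -- not the Khorasan plant data:
u_clean = 500.0   # W/(m^2 K), clean overall heat transfer coefficient
r_inf = 0.002     # m^2 K/W, asymptotic fouling resistance
tau = 30.0        # days, decay time of fouling formation

for day in (0, 30, 90, 365):
    u = overall_u(u_clean, fouling_resistance(day, r_inf, tau))
    print(f"day {day:3d}: U = {u:5.1f} W/(m^2 K)")
```

A cleaning-schedule optimizer of the kind described would then weigh the operating cost of the degraded U against the cost of taking each exchanger offline for cleaning.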
Institute of Scientific and Technical Information of China (English)
张廷龙; 孙睿; 胡波; 冯丽超
2011-01-01
Ecological process models, built on explicit mechanisms, can simulate the behaviour and characteristics of terrestrial ecosystems well, but their numerous parameters become a bottleneck in practical application. Taking the Biome-BGC model as an example, this paper uses a simulated annealing algorithm to optimise its physiological and ecological parameters. In the optimisation process, the parameters to be optimised were first selected and then optimised stepwise. With the optimised parameters, the model simulations agreed much more closely with the observed data, and the parameter optimisation effectively reduced the uncertainty of model simulation. The process and method of parameter optimisation used in this paper provide a worked example and an approach for the parameter identification and optimisation of ecological process models, and can help extend the regions to which such models are applied.
A Personified Annealing Algorithm for Circles Packing Problem
Institute of Scientific and Technical Information of China (English)
ZHANG De-Fu; LI Xin
2005-01-01
The circles packing problem is NP-hard and difficult to solve. In this paper, a hybrid search strategy for the circles packing problem is discussed. A way of generating new configurations is presented by simulating the movement of elastic objects, which avoids the blindness of simulated annealing search and makes the iteration process converge quickly. Inspired by people's life experiences, an effective personified strategy for jumping out of local minima is given. Based on the simulated annealing idea and the personification strategy, an effective personified annealing algorithm for the circles packing problem is developed. Numerical experiments on benchmark problem instances show that the proposed algorithm outperforms the best algorithms in the literature.
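The "personified" escape from local minima described above can be imitated inside a generic simulated annealing loop. The `perturb` move below, which fires after a run of non-improving steps, is an illustrative stand-in for the paper's strategy, not its actual algorithm, and the toy objective is hypothetical:

```python
import math
import random

def simulated_annealing(energy, neighbor, x0, t0=1.0, cooling=0.95,
                        steps=2000, stall_limit=200, perturb=None, seed=0):
    """Generic SA loop. `perturb` imitates a 'personified' escape move:
    after `stall_limit` non-improving steps, jump far from the incumbent."""
    rng = random.Random(seed)
    x = x0
    e = energy(x)
    best_x, best_e = x, e
    t, stall = t0, 0
    for _ in range(steps):
        y = neighbor(x, rng)
        de = energy(y) - e
        # Metropolis acceptance: always take improvements, sometimes worse moves.
        if de < 0 or rng.random() < math.exp(-de / max(t, 1e-12)):
            x, e = y, e + de
        if e < best_e:
            best_x, best_e, stall = x, e, 0
        else:
            stall += 1
            if perturb is not None and stall >= stall_limit:
                x = perturb(best_x, rng)       # large jump out of the basin
                e, stall = energy(x), 0
        t *= cooling
    return best_x, best_e

# Toy multimodal objective with global minimum 0 at x = 0:
f = lambda x: x * x + 10.0 * (1.0 - math.cos(x))
step = lambda x, rng: x + rng.uniform(-0.5, 0.5)
kick = lambda x, rng: x + rng.uniform(-5.0, 5.0)
x_best, e_best = simulated_annealing(f, step, x0=6.0, perturb=kick)
```

Without the `kick`, a cooled chain starting at x = 6 tends to settle in the nearby local basin; the large jump gives it repeated chances to reach the global one.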
Banura, Natsuo; Nishimoto, Kohei; Murase, Kenya
2016-01-01
This study was undertaken to develop a system for heat transfer simulation for optimization and treatment planning of magnetic hyperthermia treatment (MHT) using magnetic particle imaging (MPI). First, we performed phantom experiments to obtain the regression equation between the MPI pixel value and the specific absorption rate (SAR) of magnetic nanoparticles (MNPs), from which the MPI pixel value was converted to the SAR value in the simulation. Second, we generated the geometries for use in the simulation by processing X-ray computed tomography (CT) and MPI images of tumor-bearing mice injected intratumorally with MNPs (Resovist). The geometries and MPI images were then imported into software based on a finite element method (COMSOL Multiphysics) to compute the time-dependent temperature distribution for 20 min after the start of MHT. There was an excellent correlation between the MPI pixel value and the SAR value (r = 0.956). There was good agreement between the time course of the temperature rise in the t...
Caruso, G.; Bartels, H. W.; Iseli, M.; Meyder, R.; Nordlinder, S.; Pasler, V.; Porfiri, M. T.
2006-01-01
Code validation activities have been promoted inside the European fusion development agreement (EFDA) to test the capability of codes in simulating accident phenomena in fusion facilities and, specifically, in the International thermonuclear experimental reactor (ITER). This work includes a comparison between three different computer codes (CONSEN, MAGS and MELCOR) and one analytical model (ITER Model) in simulating cryogenic helium releases into the vacuum vessel (VV) which contains hot structures. The scope was the evaluation of the transient pressure inside the VV. The results will be used to design a vent duct (equivalent diameter, length and roughness) to allow pressure relief for the protection of the VV, which has a maximum design pressure of 200 kPa. The model geometry is a simplified scheme preserving the main features of the ITER design. Based on the results of the simulations, a matrix of experiments was developed to validate the calculated results and to design the vent duct for the ITER VV. The experiments are planned to be performed in the EVITA test facility, located in the CEA Cadarache research centre (France).
Simulation-aided planning of quality-oriented personnel structures in production systems.
Zülch, Gert; Krüger, Jan; Schindele, Hermann; Rottinger, Sven
2003-07-01
This paper presents research activities associated with the development of a simulation tool for modelling human reliability in production systems. This dynamic model enables the planner to determine the consequences of changes in human reliability on the quality of the production processes and the products. The model is built upon the basis of a tool for human reliability analysis ESAT (Experten-System zur Aufgaben-Taxonomie; Aufgabentaxonomie: Ein Verfahren zur Ermittlung der menschlichen Leistung bei der Durchführung von Aufgaben, Messerschmitt-Bölkow-Blohm, Ottobrunn, 1990.) and a personnel-oriented simulation programme ESPE (Engpassorientierte Simulation von Personalstrukturen; Ein engpassorientierter Ansatz zur simulationsunterstützten Planung von Personalstrukturen, Dissertation, Karlsruhe University, 1994), developed at the ifab-Institute of Human and Industrial Engineering at the University of Karlsruhe. In addition to the definition and the calculation of the human error probabilities, the consequences of the human errors (i.e. rework and waste) for the quality of the processes and the products were also implemented. This method is able to systematically plan quality-oriented assignments of personnel to functions and workplaces (personnel structures) in production systems. The effectiveness of the method is demonstrated by a case study.
Offline motion planning and simulation of two-robot welding coordination
Zhang, Tie; Ouyang, Fan
2012-03-01
This paper focuses on two-robot welding coordination for complex curved seams, in which one robot grasps the workpiece and the other holds the torch, with both robots working on the same workpiece simultaneously. The dual-robot coordinate system is built first, and a three-point calibration method for the two robots' relative base coordinate systems is presented. A non-master/slave scheme is then chosen for motion planning: it sets the pose-versus-time function of a point u on the workpiece and automatically calculates the two robot end-effector trajectories through the constraint relationship matrix. Moreover, downhand welding is employed, which guarantees that the torch and the seam stay in good contact throughout the welding. Finally, a SolidWorks-SimMechanics simulation platform is established, and a simulation of curved steel pipe welding is conducted. The simulation results show that the welding process meets the requirements of downhand welding; the joint displacement curves are smooth and continuous, and no joint velocities exceed their working ranges.
Eliminating Inconsistencies in Simulation and Treatment Planning Orders in Radiation Therapy
Energy Technology Data Exchange (ETDEWEB)
Santanam, Lakshmi, E-mail: lsantanam@radonc.wustl.edu [Department of Radiation Oncology, Washington University School of Medicine, St. Louis, Missouri (United States); Brame, Ryan S.; Lindsey, Andrew; Dewees, Todd; Danieley, Jon; Labrash, Jason; Parikh, Parag; Bradley, Jeffrey; Zoberi, Imran; Michalski, Jeff; Mutic, Sasa [Department of Radiation Oncology, Washington University School of Medicine, St. Louis, Missouri (United States)
2013-02-01
Purpose: To identify deficiencies with simulation and treatment planning orders and to develop corrective measures to improve safety and quality. Methods and Materials: At Washington University, the DMAIIC formalism is used for process management, whereby the process is understood as comprising Define, Measure, Analyze, Improve, Implement, and Control activities. Two complementary tools were used to provide quantitative assessments: failure modes and effects analysis, and reported event data. The events were classified by the user according to severity. The event rates (ie, number of events divided by the number of opportunities to generate an event) related to simulation and treatment plan orders were determined. Results: We analyzed event data from the period 2008-2009 to design an intelligent SIMulation and treatment PLanning Electronic (SIMPLE) order system. Before implementation of SIMPLE, event rates of 0.16 (420 of 2558) for a group of physicians that was subsequently used as a pilot group and 0.13 (787 of 6023) for all physicians were obtained. An interdisciplinary group evaluated the options and decided to replace the Microsoft Word-based form with a Web-based order system. This order system has mandatory fields and context-sensitive logic, offers the ability to create templates, and enables an automated process for communication of orders through an enterprise management system. After the implementation of SIMPLE, the event rate decreased to 0.09 (96 of 1001) for the pilot group and to 0.06 (145 of 2140) for all physicians (P<.0001). The average time to complete the SIMPLE form was 3 minutes, as compared with 7 minutes for the Word-based form. The number of severe events decreased from 10.7% (45 of 420) and 12.1% (96 of 787) to 6.2% (6 of 96) and 10.3% (15 of 145) for the pilot group and all physicians, respectively. Conclusions: There was a dramatic reduction in the total number of events and in the number of potentially severe events through use of the SIMPLE system.
Jin, Cheng-Jie; Wang, Wei; Jiang, Rui
2016-08-01
The proper setting of traffic signals at signalized intersections is one of the most important tasks in traffic control and management. This paper has evaluated the four-phase traffic signal plans at a four-leg intersection via cellular automaton simulations. Each leg consists of three lanes, an exclusive left-turn lane, a through lane, and a through/right-turn lane. For a comparison, we also evaluate the two-phase signal plan. The diagram of the intersection states in the space of inflow rate versus turning ratio has been presented, which exhibits four regions: In region I/II/III, congestion will propagate upstream and laterally and result in queue spillover with both signal plans/two-phase signal plan/four-phase signal plan, respectively. Therefore, neither signal plan works in region I, and only the four-phase signal plan/two-phase signal plan works in region II/III. In region IV, both signal plans work, but two-phase signal plan performs better in terms of average delays of vehicles. Finally, we study the diagram of the intersection states and average delays in the asymmetrical configurations.
SIMULATION BASED PLANNING FOR DESIGN OF WASHING MACHINE USING H/W & S/W CO-DESIGN
Directory of Open Access Journals (Sweden)
Rajesh Kumar Garg
2010-11-01
Full Text Available A simulator is designed and developed for helping an embedded system development team plan its time schedule and work flow and distribute human resources and effort over the various phases of washing machine development. The simulator helps in identifying the development phases of the washing machine needed for successful operation of the overall system. The development process of the washing machine has been carried out in a cost-effective way by planning, managing and simulating the system. The simulator gives results in the form of critical and near-critical phases, while treating the remaining phases as concurrent development activities. The simulator has been a handy tool for engineers in developing the system in an optimal manner. A hardware/software co-design approach, in which hardware and software designers work together to develop a system, is used in this work.
Austenite formation during intercritical annealing
A. Lis; J. Lis
2008-01-01
Purpose: of this paper is to determine the effect of soft annealing of the initial microstructure of the 6Mn16 steel on the kinetics of austenite formation during subsequent intercritical annealing.Design/methodology/approach: Analytical TEM point analysis with an EDAX system attached to a Philips CM20 was used to evaluate the concentration of Mn, Ni and Cr in the microstructure constituents of the multiphase steel, mainly the bainite-martensite islands.Findings: The increase in soft annealing time from 1-60 hou...
A Hybrid Simulated Annealing Algorithm for the Three-Dimensional Packing Problem
Institute of Scientific and Technical Information of China (English)
张德富; 彭煜; 朱文兴; 陈火旺
2009-01-01
This paper presents an efficient hybrid simulated annealing algorithm for the three-dimensional container loading problem (3D-CLP). The 3D-CLP is the problem of loading a subset of a given set of rectangular boxes into a rectangular container so that the stowed volume is maximized. The algorithm introduced in this paper is based on three components. First, complex block generation: unlike in traditional algorithms, a complex block can contain any number of boxes of different types, subject to certain restrictions. Second, a basic heuristic: a new construction heuristic used to generate a feasible packing solution from a packing sequence. Third, a simulated annealing algorithm: built on the complex blocks and the basic heuristic, it encodes a feasible packing solution as a packing sequence and searches the encoding space for an approximate optimal solution. 1500 benchmark instances with weakly and strongly heterogeneous boxes are considered in this paper. The computational results show that the volume utilization of the hybrid algorithm outperforms current excellent algorithms for the considered problem.
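The packing-sequence encoding described above can be sketched in one dimension: a permutation of boxes is decoded by a greedy constructive heuristic, and simulated annealing searches the permutation space with a swap neighborhood. The decoder below is a much-simplified stand-in for the paper's complex-block heuristic, and the instance is hypothetical:

```python
import math
import random

def decode(sequence, lengths, capacity):
    """Greedy constructive decoder: place boxes in sequence order,
    skipping any that no longer fit. Returns the loaded length."""
    used = 0.0
    for i in sequence:
        if used + lengths[i] <= capacity:
            used += lengths[i]
    return used

def anneal(lengths, capacity, steps=5000, t0=1.0, cooling=0.999, seed=1):
    """SA over packing sequences with a swap neighborhood (maximization)."""
    rng = random.Random(seed)
    seq = list(range(len(lengths)))
    cur = best = decode(seq, lengths, capacity)
    t = t0
    for _ in range(steps):
        i, j = rng.randrange(len(seq)), rng.randrange(len(seq))
        seq[i], seq[j] = seq[j], seq[i]            # propose a swap
        cand = decode(seq, lengths, capacity)
        if cand >= cur or rng.random() < math.exp((cand - cur) / max(t, 1e-12)):
            cur = cand
            best = max(best, cur)
        else:
            seq[i], seq[j] = seq[j], seq[i]        # reject: undo the swap
        t *= cooling
    return best

lengths = [7, 5, 4, 4, 3, 2, 2, 1]
print(anneal(lengths, capacity=10))
```

The same encode/decode split carries over to three dimensions: only the decoder changes, while the annealing loop over sequences stays the same.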
Advanced Simulation and Computing Fiscal Year 2016 Implementation Plan, Version 0
Energy Technology Data Exchange (ETDEWEB)
McCoy, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hendrickson, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
2015-08-27
The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The purpose of this IP is to outline key work requirements to be performed and to control individual work activities within the scope of work. Contractors may not deviate from this plan without a revised WA or subsequent IP.
Population Annealing: Theory and Application in Spin Glasses
Machta, Jonathan; Wang, Wenlong; Katzgraber, Helmut G.
Population annealing is an efficient sequential Monte Carlo algorithm for simulating equilibrium states of systems with rough free energy landscapes. The theory of population annealing is presented, and systematic and statistical errors are discussed. The behavior of the algorithm is studied in the context of large-scale simulations of the three-dimensional Ising spin glass and the performance of the algorithm is compared to parallel tempering. It is found that the two algorithms are similar in efficiency though with different strengths and weaknesses. Supported by NSF DMR-1151387, DMR-1208046 and DMR-1507506.
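The resample-then-equilibrate cycle of population annealing can be illustrated on a small ferromagnetic Ising ring. This is a minimal sketch of the algorithm's structure only, not the large-scale spin-glass setup of the abstract, and all parameter values are illustrative:

```python
import math
import random

def energy(s, J=1.0):
    """Ising ring energy: E = -J * sum_i s_i s_{i+1} (periodic boundary)."""
    return -J * sum(s[i] * s[(i + 1) % len(s)] for i in range(len(s)))

def metropolis_sweep(s, beta, rng, J=1.0):
    """One Metropolis sweep: n single-spin flip attempts at inverse temp beta."""
    n = len(s)
    for _ in range(n):
        i = rng.randrange(n)
        dE = 2.0 * J * s[i] * (s[i - 1] + s[(i + 1) % n])
        if dE <= 0 or rng.random() < math.exp(-beta * dE):
            s[i] = -s[i]

def population_annealing(n=10, pop=200, n_beta=11, sweeps=5, seed=2):
    """Anneal a replica population from beta=0 to beta=1: at each step,
    resample replicas with weights exp(-dbeta * E), then equilibrate."""
    rng = random.Random(seed)
    betas = [k / (n_beta - 1) for k in range(n_beta)]
    reps = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(pop)]
    for b0, b1 in zip(betas, betas[1:]):
        w = [math.exp(-(b1 - b0) * energy(r)) for r in reps]
        reps = [list(r) for r in rng.choices(reps, weights=w, k=pop)]
        for r in reps:
            for _ in range(sweeps):
                metropolis_sweep(r, b1, rng)
    return sum(energy(r) for r in reps) / pop

print(population_annealing())  # population-mean energy at the final beta
```

The resampling step is what distinguishes population annealing from running `pop` independent simulated annealing chains: weight is continually shifted toward replicas in low-energy regions.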
Institute of Scientific and Technical Information of China (English)
彭碧涛; 周永务
2011-01-01
The classical vehicle routing problem considers only the weight of the loaded goods and ignores other loading constraints, such as the loading space constraint. In this paper, the vehicle routing problem with a three-dimensional loading constraint is addressed. A heuristic is proposed for goods loading such that the three-dimensional loading constraint is satisfied, and a two-stage algorithm based on simulated annealing is designed to solve the problem: the first stage obtains an initial solution with the heuristic, and the second stage improves it by simulated annealing. A number of benchmark problems are used to test the proposed method. The results show that the algorithm solves the problem effectively.
Institute of Scientific and Technical Information of China (English)
陈香
2013-01-01
In order to arrange interview panels so that interviews are fair and objective, this paper discusses the panel-assignment problem and establishes a mathematical model for it; the model is a complex nonlinear integer programming problem. A genetic algorithm with bin-packing encoding, simulated annealing, multi-point crossover and neighbourhood-search mutation is proposed to solve the model. It is illustrated with an example in which 30 experts interview 300 students, with four experts per interview panel. The results show that the improved genetic algorithm can efficiently find an approximate optimal solution that satisfies the requirements for fair and reasonable interview scheduling.
Directory of Open Access Journals (Sweden)
Aitang Xing
2016-06-01
Full Text Available Purpose: The purpose of this paper is to describe a practical approach to commissioning and quality assurance (QA) of a dedicated wide-bore 3 Tesla (3T) magnetic resonance imaging (MRI) scanner for radiotherapy planning. Methods: A comprehensive commissioning protocol focusing on radiotherapy (RT) specific requirements was developed and performed. RT specific tests included: uniformity characteristics of the radio-frequency (RF) coil, couch top attenuation, geometric distortion, laser and couch movement, and an end-to-end radiotherapy treatment planning test. General tests for overall system performance and safety measurements were also performed. Results: The use of pre-scan based intensity correction increased the uniformity from 61.7% to 97% (body flexible coil), from 50% to 90% (large flexible coil) and from 51% to 98% (small flexible coil). The RT flat top couch decreased the signal-to-noise ratio (SNR) by an average of 42%. The mean and maximum geometric distortion was found to be 1.25 mm and 4.08 mm for three-dimensional (3D) corrected image acquisition, and 2.07 mm and 7.88 mm for two-dimensional (2D) corrected image acquisition, over a 500 mm × 375 mm × 252 mm field of view (FOV). The accuracy of the laser and couch movement was less than ±1 mm. The standard deviation of registration parameters for the end-to-end test was less than 0.41 mm. An on-going QA program was developed to monitor the system's performance. Conclusion: A number of RT specific tests have been described for commissioning and subsequent performance monitoring of a dedicated MRI simulator (MRI-Sim). These tests have been important in establishing and maintaining its operation for RT planning.
Radiation annealing in cuprous oxide
DEFF Research Database (Denmark)
Vajda, P.
1966-01-01
Experimental results from high-intensity gamma-irradiation of cuprous oxide are used to investigate the annealing of defects with increasing radiation dose. The results are analysed on the basis of the Balarin and Hauser (1965) statistical model of radiation annealing, giving a square-root relationship between the rate of change of resistivity and the resistivity change. The saturation defect density at room temperature is estimated on the basis of a model for defect creation in cuprous oxide.
Quantum annealing with manufactured spins.
Johnson, M W; Amin, M H S; Gildert, S; Lanting, T; Hamze, F; Dickson, N; Harris, R; Berkley, A J; Johansson, J; Bunyk, P; Chapple, E M; Enderud, C; Hilton, J P; Karimi, K; Ladizinsky, E; Ladizinsky, N; Oh, T; Perminov, I; Rich, C; Thom, M C; Tolkacheva, E; Truncik, C J S; Uchaikin, S; Wang, J; Wilson, B; Rose, G
2011-05-12
Many interesting but practically intractable problems can be reduced to that of finding the ground state of a system of interacting spins; however, finding such a ground state remains computationally difficult. It is believed that the ground state of some naturally occurring spin systems can be effectively attained through a process called quantum annealing. If it could be harnessed, quantum annealing might improve on known methods for solving certain types of problem. However, physical investigation of quantum annealing has been largely confined to microscopic spins in condensed-matter systems. Here we use quantum annealing to find the ground state of an artificial Ising spin system comprising an array of eight superconducting flux quantum bits with programmable spin-spin couplings. We observe a clear signature of quantum annealing, distinguishable from classical thermal annealing through the temperature dependence of the time at which the system dynamics freezes. Our implementation can be configured in situ to realize a wide variety of different spin networks, each of which can be monitored as it moves towards a low-energy configuration. This programmable artificial spin network bridges the gap between the theoretical study of ideal isolated spin networks and the experimental investigation of bulk magnetic samples. Moreover, with an increased number of spins, such a system may provide a practical physical means to implement a quantum algorithm, possibly allowing more-effective approaches to solving certain classes of hard combinatorial optimization problems.
Shortcuts to adiabaticity for quantum annealing
Takahashi, Kazutaka
2017-01-01
We study the Ising Hamiltonian with a transverse field term to simulate the quantum annealing. Using shortcuts to adiabaticity, we design the time dependence of the Hamiltonian. The dynamical invariant is obtained by the mean-field ansatz, and the Hamiltonian is designed by the inverse engineering. We show that the time dependence of physical quantities such as the magnetization is independent of the speed of the Hamiltonian variation in the infinite-range model. We also show that rotating transverse magnetic fields are useful to achieve the ideal time evolution.
Interference Alignment Using Variational Mean Field Annealing
DEFF Research Database (Denmark)
Badiu, Mihai Alin; Guillaud, Maxime; Fleury, Bernard Henri
2014-01-01
We study the problem of interference alignment in the multiple-input multiple-output interference channel. Aiming at minimizing the interference leakage power relative to the receiver noise level, we use the deterministic annealing approach to solve the optimization problem. In the corresponding [...] for interference alignment. We also show that the iterative leakage minimization algorithm by Gomadam et al. and the alternating minimization algorithm by Peters and Heath, Jr. are instances of our method. Finally, we assess the performance of the proposed algorithm through computer simulations.
Institute of Scientific and Technical Information of China (English)
Feng Shaodong; Li Guangxia; Feng Qi
2011-01-01
Burst Time Plan (BTP) generation is the key to resource allocation in Broadband Satellite Multimedia (BSM) systems. The main purpose of this paper is to minimize the system response time to users' requests caused by BTP generation, while maintaining the Quality of Service (QoS) and improving the channel utilization efficiency. Traditionally, the BTP is generated periodically in order to simplify the implementation of the resource allocation algorithm. Based on our analysis, we find that the Periodical BTP Generation (P-BTPG) method cannot guarantee the delay performance, channel utilization efficiency and QoS simultaneously, especially when capacity requests arrive randomly. The Optimized BTP Generation (O-BTPG) method is given based on the optimal scheduling period and scheduling latency, without considering the signaling overhead. Finally, a novel Asynchronous BTP Generation (A-BTPG) method is proposed, which is invoked according to users' requests. A BSM system application scenario is simulated. Simulation results show that A-BTPG is a trade-off between performance and signaling overhead that improves system performance irrespective of the traffic pattern. This method can be used in ATM onboard switching satellite systems and can further be extended to Digital Video Broadcasting-Return Channel Satellite (DVB-RCS) systems or IP onboard routing BSM systems in the future.
Directory of Open Access Journals (Sweden)
Kotevski Živko
2015-01-01
Full Text Available Production planning and control (PPC) systems are the basis of all production facilities. In today's environment, a good PPC system generates many benefits for a company, and an excellent PPC system provides a great competitive advantage and a serious reduction of costs in many areas. To reach the point of having an excellent PPC system, companies turn more and more to the newest software tools, such as simulation. Given today's advanced computer technology, by using simulation in this area companies gain a strong asset when dealing with different kinds of waste, delays, overstock, bottlenecks and loss of time in general. This model is applicable in almost all production facilities. Taking into account the different scrap percentages for the pieces that form the end product, a detailed model and analysis were made in order to determine the optimal starting parameters. First, all the conditions of the company were determined, and a conceptual model was created along with all assumptions. Then the model was verified and validated, and finally a cost-benefit analysis was conducted in order to obtain clear results.
Simulated Annealing in the Variable Landscape
Hasegawa, Manabu; Kim, Chang Ju
An experimental analysis is conducted to test whether the appropriate introduction of the smoothness-temperature schedule enhances the optimizing ability of the MASSS method, the combination of the Metropolis algorithm (MA) and the search-space smoothing (SSS) method. The test is performed on two types of random traveling salesman problems. The results show that the optimization performance of the MA is substantially improved by a single smoothing alone and slightly more by a single smoothing with cooling and by a de-smoothing process with heating. The performance is compared to that of the parallel tempering method and a clear advantage of the idea of smoothing is observed depending on the problem.
Field sampling scheme optimization using simulated annealing
CSIR Research Space (South Africa)
Debba, Pravesh
2010-10-01
Full Text Available to derive optimal sampling schemes. 2. Hyperspectral remote sensing In the study of electro-magnetic physics, when energy in the form of light interacts with a material, part of the energy at certain wavelength is absorbed, transmitted, emitted...
Metriplectic simulated annealing for quasigeostrophic flow
Morrison, P. J.; Flierl, G. R.
2016-11-01
Metriplectic dynamics is a general form for dynamical systems that embodies the first and second laws of thermodynamics: energy conservation and entropy production. The formalism provides an H-theorem for relaxation to nontrivial equilibrium states. Upon choosing enstrophy as the entropy and potential vorticity of the form q = ∇²Ψ + T(x), recent results of computations, akin to those of [...], will be described for various topography functions T(x), including a ridge (T = exp(-x²/2)) and random functions. Interpretation of the results, in particular their sensitivity to the chosen entropy function, will be discussed. PJM supported by U.S. Dept. of Energy Contract # DE-FG05-80ET-53088.
Nakazawa, Hisato; Mori, Yoshimasa; Komori, Masataka; Tsugawa, Takahiko; Shibamoto, Yuta; Kobayashi, Tatsuya; Hashizume, Chisa; Uchiyama, Yukio; Hagiwara, Masahiro
2014-05-01
Fractionated stereotactic radiotherapy (SRT) is performed with a linear accelerator-based system such as Novalis. Recently, Gamma Knife Perfexion (PFX) featured the Extend system with relocatable fixation devices available for SRT. In this study, the dosimetric results of these two modalities were compared from the viewpoint of conformity, heterogeneity and gradient in target covering. A total of 14 patients with skull base tumors were treated with Novalis intensity-modulated (IM)-SRT. Treatment was planned on an iPlan workstation. Five- to seven-beam IM-SRT was performed in 14-18 fractions with a fraction dose of 2.5 or 3 Gy. With these patients' data, additional treatment planning was simulated using a GammaPlan workstation for PFX-SRT. Reference CT images with planning structure contour sets on iPlan, including the planning target volume (PTV, 1.1-102.2 ml) and organs at risk, were exported to GammaPlan in DICOM-RT format. Dosimetric results for Novalis IM-SRT and PFX-SRT were evaluated in the same prescription doses. The isocenter number of PFX was between 12 and 50 at the isodose contour of 50-60%. The PTV coverage was 95-99% for Novalis and 94-98% for PFX. The conformity index (CI) was 1.11-1.61 and 1.04-1.15, the homogeneity index (HI) was 1.1-3.62 and 2.3-3.25, and the gradient index (GI) was 3.72-7.97 and 2.54-3.39 for Novalis and PFX, respectively. PTV coverage by Novalis and PFX was almost equivalent. PFX was superior in CI and GI, and Novalis was better in HI. Better conformality would be achieved by PFX, when the homogeneity inside tumors is less important.
OʼHara, Susan
2014-01-01
Nurses have increasingly been regarded as critical members of the planning team as architects recognize their knowledge and value. The nurses' role as knowledge experts can be expanded further, to leading efforts that integrate clinical, operational, and architectural expertise through simulation modeling. Simulation modeling allows for the optimal merging of multifactorial data to understand the current state of the intensive care unit and predict future states. Nurses can champion the simulation modeling process and reap the benefits of a cost-effective way to test new designs, processes, staffing models, and future programming trends prior to implementation. Simulation modeling is an evidence-based planning approach for integrating the sciences with real client data to offer solutions for improving patient care.
Solving Set Cover with Pairs Problem using Quantum Annealing
Cao, Yudong; Jiang, Shuxian; Perouli, Debbie; Kais, Sabre
2016-09-01
Here we consider using quantum annealing to solve Set Cover with Pairs (SCP), an NP-hard combinatorial optimization problem that plays an important role in networking, computational biology, and biochemistry. We show an explicit construction of Ising Hamiltonians whose ground states encode the solution of SCP instances. We numerically simulate the time-dependent Schrödinger equation in order to test the performance of quantum annealing for random instances and compare with that of simulated annealing. We also discuss explicit embedding strategies for realizing our Hamiltonian construction on the D-wave type restricted Ising Hamiltonian based on Chimera graphs. Our embedding on the Chimera graph preserves the structure of the original SCP instance; in particular, the embeddings for general complete bipartite graphs and logical disjunctions may be of broader use than the specific problem we deal with.
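The classical simulated annealing baseline referred to above can be sketched for a toy Ising Hamiltonian; the instance, cooling schedule and all parameter values below are illustrative assumptions, not taken from the paper:

```python
import math
import random

def ising_energy(spins, h, J):
    """Energy of an Ising configuration: E = sum_i h_i s_i + sum_{i<j} J_ij s_i s_j."""
    e = sum(h[i] * s for i, s in enumerate(spins))
    e += sum(Jij * spins[i] * spins[j] for (i, j), Jij in J.items())
    return e

def simulated_annealing(h, J, steps=20000, t0=5.0, t1=0.01, seed=0):
    """Metropolis single-spin-flip annealing with a geometric cooling schedule."""
    rng = random.Random(seed)
    n = len(h)
    spins = [rng.choice((-1, 1)) for _ in range(n)]
    energy = ising_energy(spins, h, J)
    best, best_e = spins[:], energy
    for k in range(steps):
        t = t0 * (t1 / t0) ** (k / steps)       # geometric cooling t0 -> t1
        i = rng.randrange(n)
        spins[i] = -spins[i]                     # propose a single spin flip
        new_e = ising_energy(spins, h, J)
        if new_e <= energy or rng.random() < math.exp((energy - new_e) / t):
            energy = new_e                       # accept (Metropolis rule)
            if energy < best_e:
                best, best_e = spins[:], energy
        else:
            spins[i] = -spins[i]                 # reject: undo the flip
    return best, best_e

# Toy instance: two ferromagnetically coupled spins with a small bias field.
h = [0.1, -0.2]
J = {(0, 1): -1.0}                               # negative J favours alignment
state, e = simulated_annealing(h, J)
```

For a real SCP instance, `h` and `J` would be derived from the Ising-Hamiltonian construction described in the paper rather than written by hand.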
Five-fold twin formation during annealing of nanocrystalline Cu
Energy Technology Data Exchange (ETDEWEB)
Bringa, E M; Farkas, D; Caro, A; Wang, Y M; McNaney, J; Smith, R
2009-05-20
Contrary to the common belief that many-fold twins, or star twins, in nanophase materials arise from significant external stresses, we report molecular dynamics simulations of 5 nm grain size samples annealed at 800 K for nearly 0.5 ns at zero external pressure, showing the formation of five-fold star twins under the action of the large internal stresses responsible for grain growth and microstructural evolution. The structure of the many-fold twins is remarkably similar to those we have found to occur under uniaxial shock loading of nanocrystalline NiW samples with grain sizes of ~5-30 nm. The mechanism of formation of the many-fold twins is discussed in the light of the simulations and experiments.
Energy Technology Data Exchange (ETDEWEB)
Kurtz, R.J.; Heasler, P.G.; Baird, D.B. [Pacific Northwest Lab., Richland, WA (United States)
1994-02-01
This report summarizes the results of three previous studies to evaluate and compare the effectiveness of sampling plans for steam generator tube inspections. An analytical evaluation and Monte Carlo simulation techniques were the methods used to evaluate sampling plan performance. To test the performance of candidate sampling plans under a variety of conditions, ranges of inspection system reliability were considered along with different distributions of tube degradation. Results from the eddy current reliability studies performed with the retired-from-service Surry 2A steam generator were utilized to guide the selection of appropriate probability of detection and flaw sizing models for use in the analysis. Different distributions of tube degradation were selected to span the range of conditions that might exist in operating steam generators. The principal means of evaluating sampling performance was to determine the effectiveness of the sampling plan for detecting and plugging defective tubes. A summary of key results from the eddy current reliability studies is presented. The analytical and Monte Carlo simulation analyses are discussed along with a synopsis of key results and conclusions.
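The Monte Carlo evaluation of sampling-plan effectiveness described above can be sketched as follows; the tube counts, sample fraction and probability-of-detection value are hypothetical placeholders, not the report's actual Surry 2A figures:

```python
import random

def simulate_inspection(n_tubes=3000, n_defective=30, sample_frac=0.2,
                        pod=0.8, trials=20000, seed=1):
    """Monte Carlo estimate of the chance a random sample catches >= 1 defective tube.

    pod: probability of detection for an inspected defective tube (illustrative
    value; a real study would use a POD model fitted to eddy current data).
    """
    rng = random.Random(seed)
    sample_size = int(n_tubes * sample_frac)
    hits = 0
    for _ in range(trials):
        sample = rng.sample(range(n_tubes), sample_size)
        # In this toy model, tubes 0..n_defective-1 are the defective ones.
        detected = any(t < n_defective and rng.random() < pod for t in sample)
        hits += detected
    return hits / trials

p = simulate_inspection()
```

Repeating this over ranges of `pod` and degradation distributions is the essence of the sampling-plan comparison the report describes.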
Simulation in Pre-departure Training for Residents Planning Clinical Work in a Low-Income Country
Directory of Open Access Journals (Sweden)
Kevin R. Schwartz
2015-12-01
Full Text Available Introduction: Increasingly, pediatric and emergency medicine (EM) residents are pursuing clinical rotations in low-income countries. Optimal pre-departure preparation for such rotations has not yet been established. High-fidelity simulation represents a potentially effective modality for such preparation. This study was designed to assess whether a pre-departure high-fidelity medical simulation curriculum is effective in helping to prepare residents for clinical rotations in a low-income country. Methods: 43 pediatric and EM residents planning clinical rotations in Liberia, West Africa, participated in a simulation-based curriculum focused on severe pediatric malaria and malnutrition and were then assessed by survey at three time points: pre-simulation, post-simulation, and after returning from work abroad. Results: Prior to simulation, 1/43 (2%) participants reported they were comfortable with the diagnosis and management of severe malnutrition; this increased to 30/42 (71%) after simulation and 24/31 (77%) after working abroad. Prior to simulation, 1/43 (2%) of residents reported comfort with the diagnosis and management of severe malaria; this increased to 26/42 (62%) after simulation and 28/31 (90%) after working abroad; 36/42 (86%) of residents agreed that a simulation-based global health curriculum is more useful than a didactic curriculum alone, and 41/42 (98%) felt a simulator-based curriculum should be offered to all residents planning a clinical trip to a low-income country. Conclusion: High-fidelity simulation is effective in increasing residents’ self-rated comfort in management of pediatric malaria and malnutrition and a majority of participating residents feel it should be included as a component of pre-departure training for all residents rotating clinically to low-income countries.
Institute of Scientific and Technical Information of China (English)
项灏; 张俊
2012-01-01
The advantages and disadvantages of the simulated annealing algorithm, the genetic algorithm and the ordinary quantum genetic algorithm were analyzed. To exploit the population diversity and rapid convergence of the real-coded double-chain quantum genetic algorithm, it was combined with the simulated annealing algorithm, and a real-coded double-chain quantum genetic simulated annealing algorithm was put forward on the basis of simulating the cosmic evolution of celestial bodies. The initial weights and thresholds of a BP neural network were improved with this new algorithm, and the improved BP neural network was applied to intelligent fault diagnosis. The simulation results show that the algorithm is effective.
M.P. Schilperoord (Michel)
2005-01-01
“Complexity in Foresight” is a new synthetic paradigm that crosses areas in strategic planning and the complexity sciences. It connects the fields of agent-based simulation and complex adaptive systems, and provides the overall blueprint for the construction of a new generation of toolkits
Xia, J; Samman, N; Yeung, R W; Wang, D; Shen, S G; Ip, H H; Tideman, H
2000-08-01
The purpose of this paper is to report a new technique for three-dimensional facial soft-tissue-change prediction after simulated orthognathic surgical planning. A scheme for soft tissue deformation, "Computer-assisted three-dimensional virtual reality soft tissue planning and prediction for orthognathic surgery (CASP)", is presented. The surgical planning was based on three-dimensional reconstructed CT visualization. Soft tissue changes were predicted by two newly devised algorithms: Surface Normal-based Model Deformation Algorithm and Ray Projection-based Model Deformation Algorithm. A three-dimensional color facial texture-mapping technique was also used for generating the color photo-realistic facial model. As a final result, a predicted and simulated patient's color facial model can be visualized from arbitrary viewing points.
Simulation Planning for Sustainable Use of Land Resources: Case study in Diamou
Directory of Open Access Journals (Sweden)
Diallo Yacouba
2009-01-01
Full Text Available In this study, we present a simulation planning scheme to project land and land-resource use changes at a local scale for Diamou (Mali). Problem statement: All the land cover types were under the influence of the human and livestock populations. Diamou has undergone changes in land cover over the last decades. The shifting cultivation system practiced was probably the main reason for this state of affairs. Moreover, the dryness and the extensive character of pastoral activities had contributed to the general degradation of natural resources. The principal objective of our study was to contribute to the sustainable use of land resources from 1999-2010. Approach: Using formulas, the resource supply and demand were estimated based on statistical data derived from a comprehensive review of the literature. The resource balance (the difference between supply and demand) was estimated for the years 1999 and 2010. The resource demand was measured by average consumption needs person-1 day-1 multiplied by the population. For the livestock population, the biomass demand and supply were measured based on TLU dietary requirements and the pastureland carrying capacity. The diagrams of resource balances were drawn using Microsoft Word and the simulated land use area schema using ArcGIS. Results: With the present approach, it was found that in 1999 the fuel wood and cereal balances were negative, while the drinking water and biomass balances were positive. The dominant land use categories were pastureland and cropland, occupying about 52 and 45% respectively of the total area of 8876 ha. Except for the biomass balance, in 2010 all the resource balances were negative. The drinking water and fuel wood deficits were equal to 439 and 2801 m³ respectively. The dominant land use class, cropland, covered approximately 45% of the total area. Conclusion: Studies had indicated the cereal, fuel wood and drink water resources deficit in
Peschmann, K. R.; Parker, D. L.; Smith, V.
1982-11-01
An abundant number of different CT scanner models have been developed in the past ten years, meeting increasing standards of performance. From the beginning they have remained comparatively expensive pieces of equipment, due not only to their technical complexity but also to the difficulties involved in assessing "true" specifications (avoiding "overdesign"). Our aim has been to provide, for radiation therapy treatment planning, a low-cost CT scanner system featuring large freedom in patient positioning. We have taken advantage of the concurrent, tremendously increased amount of knowledge and experience in the technical area of CT. By way of extensive computer simulations, we gained confidence that an inexpensive C-arm simulator gantry and a simple one-phase, two-pulse generator in connection with a standard x-ray tube could be used without sacrificing image quality. These components have been complemented by a commercial high-precision shaft encoder, a simple and effective fan beam collimator, a high-precision, high-efficiency luminescence crystal-silicon photodiode detector with 256 channels, low-noise electronic preamplifier and sampling filter stages, a simplified data acquisition system furnished by Toshiba/Analogic, and an LSI 11/23 microcomputer plus data storage disk, as well as various smaller interfaces linking the electrical components. The quality of CT scan pictures of phantoms performed by the end of last year confirmed that this simple approach works well. As a next step we intend to upgrade this system with an array processor in order to shorten reconstruction time to one minute per slice. We estimate that the system including this processor could be manufactured for a selling price of $210,000.
Institute of Scientific and Technical Information of China (English)
刘佳; 梁秋丽; 王书青; 陈立潮
2014-01-01
The glowworm swarm optimization (GSO) algorithm was studied. To address the shortcomings of the basic GSO algorithm, such as low optimization precision, slow convergence and a tendency to fall into local optima, an improved GSO algorithm was proposed in this paper. The Boltzmann selection mechanism was applied to the movement of glowworms in order to increase the diversity of the population, and the simulated annealing algorithm was introduced to obtain a more accurate solution in this new algorithm. The new algorithm greatly improves the ability to find the global optimum, as well as convergence and accuracy. The feasibility and effectiveness of the new approach were verified through function tests. The experimental results show that the proposed algorithm is significantly superior to the original GSO.
Institute of Scientific and Technical Information of China (English)
麻士东; 龚光红; 韩亮; 宋晓
2011-01-01
In air-to-ground attacks by helicopter formations, target assignment plays an important role in completing military tasks. An integrated interest function and the principles of helicopter target assignment are established. Target assignment in a helicopter formation's air-to-ground attack is realized using a hybrid strategy combining the ant colony algorithm and the simulated annealing algorithm. To address the defect of deciding the current best solution by the integrated interest function alone, the hybrid strategy is improved to decide the current best solution by the amount of pheromone. The best solution can thus be determined both by the integrated interest function and by the amount of pheromone, avoiding poor individual assignment interests and slow convergence while maximizing the overall interest of the assignment. A test validates the improved hybrid algorithm, and the results indicate that it finds the optimal solution more reliably and converges more quickly than before, with more reasonable assignment results.
Institute of Scientific and Technical Information of China (English)
路鹏; 丛晓; 周东岱
2013-01-01
With the application of artificial intelligence techniques in the field of educational evaluation, computerized adaptive testing has gradually become one of the most important educational evaluation methods. In such a test, the computer dynamically updates the ability level of the learner and selects tailored questions from the question bank, which requires the system to have relatively high execution efficiency in order to meet the needs of the test. To solve this problem, an intelligent question-selection system based on the simulated annealing algorithm is proposed. The experimental results show that the method selects nearly optimal questions from the question bank for learners while greatly improving the question-selection efficiency of the system.
Inversion of Size Distribution of Fire Smoke Particles Based on Simulated Annealing
Institute of Scientific and Technical Information of China (English)
李耀东; 张启兴; 邓小玖; 张永明
2011-01-01
Inversion of the size distribution of fire smoke particles is an ill-conditioned problem, which tends to lose the global optimal solution by becoming trapped in local minima. After analyzing in detail the effect of random noise on the angular distribution of the light-scattering Mueller matrix elements of smoke particle ensembles, the simulated annealing algorithm, with its powerful global search ability, was used to invert the size distributions of monodisperse and lognormally distributed systems under a spherical model. With random noise at 3% of the maximum signal, the errors of the inversion results are all less than 0.3%. The inversion program was then applied to the scattered-light data of fractal smoke aggregates, yielding the optical equivalent radius, under the spherical model, of fire smoke aggregates with different fractal dimensions; the radius of gyration of the aggregates has an approximately linear relationship with the optical equivalent radius.
Thermal Annealing of Exfoliated Graphene
Directory of Open Access Journals (Sweden)
Wang Xueshen
2013-01-01
Full Text Available Monolayer graphene is obtained by mechanical exfoliation using Scotch tape. The effects of thermal annealing on the tape residues and the edges of graphene are investigated. Atomic force microscope images show that almost all the residues can be removed in N2/H2 at 400°C, whereas they only agglomerate in vacuum. Raman spectra of the annealed graphene show that both the 2D peak and the G peak blueshift, the full width at half maximum (FWHM) of the 2D peak becomes larger, and the intensity ratio of the 2D peak to the G peak decreases. The edges of graphene are completely attached to the surface of the substrate after annealing.
PDCI Wide-Area Damping Control: PSLF Simulations of the 2016 Open and Closed Loop Test Plan
Energy Technology Data Exchange (ETDEWEB)
Wilches Bernal, Felipe [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pierre, Brian Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Elliott, Ryan Thomas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schoenwald, David A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Byrne, Raymond H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Neely, Jason C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Trudnowski, Daniel J. [Montana Tech of the Univ. of Montana, Butte, MT (United States); Donnelly, Matthew K. [Montana Tech of the Univ. of Montana, Butte, MT (United States)
2017-03-01
To demonstrate and validate the performance of the wide-area damping control system, the project plans to conduct closed-loop tests on the PDCI in summer/fall 2016. A test plan details the open and closed loop tests to be conducted on the PDCI using the wide-area damping control system. To ensure the appropriate level of preparedness, simulations were performed in order to predict and evaluate any possible unsafe operations before hardware experiments are attempted. This report contains the results from these simulations using the power system dynamics software PSLF (Power System Load Flow, trademark of GE). The simulations use the WECC (Western Electricity Coordinating Council) 2016 light summer and heavy summer base cases.
Evaluation of a performance appraisal framework for radiation therapists in planning and simulation
Energy Technology Data Exchange (ETDEWEB)
Becker, Jillian, E-mail: jillian.becker@health.qld.gov.au [Radiation Oncology Mater Centre, South Brisbane, Queensland (Australia); Bridge, Pete [School of Clinical Sciences, Queensland University of Technology, Brisbane, Queensland (Australia); Brown, Elizabeth; Lusk, Ryan; Ferrari-Anderson, Janet [Radiation Oncology, Princess Alexandra Hospital, Brisbane, Queensland (Australia); Radiation Oncology Mater Centre, South Brisbane, Queensland (Australia)
2015-06-15
Constantly evolving technology and techniques within radiation therapy require practitioners to maintain a continuous approach to professional development and training. Systems of performance appraisal and adoption of regular feedback mechanisms are vital to support this development yet frequently lack structure and rely on informal peer support. A Radiation Therapy Performance Appraisal Framework (RT-PAF) for radiation therapists in planning and simulation was developed to define expectations of practice and promote a supportive and objective culture of performance and skills appraisal. Evaluation of the framework was conducted via an anonymous online survey tool. Nine peer reviewers and fourteen recipients provided feedback on its effectiveness and the challenges and limitations of the approach. Findings from the evaluation were positive and suggested that both groups gained benefit from and expressed a strong interest in embedding the approach more routinely. Respondents identified common challenges related to the limited ability to implement suggested development strategies; this was strongly associated with time and rostering issues. This framework successfully defined expectations for practice and provided a fair and objective feedback process that focussed on skills development. It empowered staff to maintain their skills and reach their professional potential. Management support, particularly in regard to provision of protected time was highlighted as critical to the framework's ongoing success. The demonstrated benefits arising in terms of staff satisfaction and development highlight the importance of this commitment to the modern radiation therapy workforce.
Salb, Tobias; Brief, Jakob; Welzel, Thomas; Giesler, Bjoern; Hassfeld, Steffan; Muehling, Joachim; Dillmann, Ruediger
2003-05-01
In this paper we present recent developments and pre-clinical validation results of our approach to augmented reality (AR) in craniofacial surgery. A commercial Sony Glasstron display is used for optical see-through overlay of surgical planning and simulation results onto a patient inside the operating room (OR). For tracking of the glasses, the patient and various medical instruments, an NDI Polaris system is used as the standard solution. A complementary inside-out navigation approach has been realized with a panoramic camera mounted on the head of the surgeon for tracking of fiducials placed on the walls of the OR. Further tasks described include the calibration of the head-mounted display (HMD), the registration of virtual objects with the real world, and the detection of occlusions in the object overlay with the help of two miniature CCD cameras. The evaluation of our work took place in the laboratory environment and showed promising results. Future work will concentrate on optimizing the technical features of the prototype and on developing a system for everyday clinical use.
Chapter 8: Planning Tools to Simulate and Optimize Neighborhood Energy Systems
Energy Technology Data Exchange (ETDEWEB)
Zhivov, Alexander Michael; Case, Michael Patrick; Jank, Reinhard; Eicker, Ursula; Booth, Samuel
2017-03-15
This section introduces different energy modeling tools available in Europe and the USA for the community energy master planning process, ranging from strategic Urban Energy Planning to more detailed Local Energy Planning. Two modeling tools used for Energy Master Planning of primarily residential communities, the 3D city model with CityGML and the Net Zero Planner tool developed for US Department of Defense installations, are described in more detail.
An Adaptive Filtering Algorithm using Mean Field Annealing Techniques
Persson, Per; Nordebo, Sven; Claesson, Ingvar
2002-01-01
We present a new approach to discrete adaptive filtering based on the mean field annealing algorithm. The main idea is to find the discrete filter vector that minimizes the matrix form of the Wiener-Hopf equations in a least-squares sense by a generalized mean field annealing algorithm. It is indicated by simulations that this approach, with complexity O(M^2) where M is the filter length, finds a solution comparable to the one obtained by the recursive least squares (RLS) algorithm but without ...
Gerlach, Kathy D; Spreng, R Nathan; Madore, Kevin P; Schacter, Daniel L
2014-12-01
We spend much of our daily lives imagining how we can reach future goals and what will happen when we attain them. Despite the prevalence of such goal-directed simulations, neuroimaging studies on planning have mainly focused on executive processes in the frontal lobe. This experiment examined the neural basis of process simulations, during which participants imagined themselves going through steps toward attaining a goal, and outcome simulations, during which participants imagined events they associated with achieving a goal. In the scanner, participants engaged in these simulation tasks and an odd/even control task. We hypothesized that process simulations would recruit default and frontoparietal control network regions, and that outcome simulations, which allow us to anticipate the affective consequences of achieving goals, would recruit default and reward-processing regions. Our analysis of brain activity that covaried with process and outcome simulations confirmed these hypotheses. A functional connectivity analysis with posterior cingulate, dorsolateral prefrontal cortex and anterior inferior parietal lobule seeds showed that their activity was correlated during process simulations and associated with a distributed network of default and frontoparietal control network regions. During outcome simulations, medial prefrontal cortex and amygdala seeds covaried together and formed a functional network with default and reward-processing regions.
Lehmer, R.; Ingram, C.; Jovic, S.; Alderete, J.; Brown, D.; Carpenter, D.; LaForce, S.; Panda, R.; Walker, J.; Chaplin, P.;
2006-01-01
The Virtual Airspace Simulation Technology - Real-Time (VAST-RT) Project, an element of NASA's Virtual Airspace Modeling and Simulation (VAMS) Project, has been developing a distributed simulation capability that supports an extensible and expandable real-time, human-in-the-loop airspace simulation environment. The VAST-RT system architecture is based on the DoD High Level Architecture (HLA) and the VAST-RT HLA Toolbox, a common interface implementation that incorporates a number of novel design features. The scope of the initial VAST-RT integration activity (Capability 1) included the high-fidelity human-in-the-loop simulation facilities located at NASA/Ames Research Center and medium-fidelity pseudo-piloted target generators, such as the Airspace Traffic Generator (ATG) being developed as part of VAST-RT, as well as other real-time tools. This capability has been demonstrated in a gate-to-gate simulation. VAST-RT's Capability 2A has recently been completed, and this paper will discuss the improved integration of the real-time assets into VAST-RT, including the development of tools to integrate data collected across the simulation environment into a single data set for the researcher. Current plans for the completion of the VAST-RT distributed simulation environment (Capability 2B) and its use to evaluate future airspace capacity-enhancing concepts being developed by VAMS will be discussed. Additionally, the simulation environment's application to other airspace and airport research projects is addressed.
Dose/volume-response relations for rectal morbidity using planned and simulated motion-inclusive dose distributions
Thor, Maria; Apte, Aditya; Deasy, Joseph O; Karlsdóttir, Àsa; Moiseenko, Vitali; Liu, Mitchell; Muren, Ludvig Paul
2014-01-01
Background and purpose: Many dose-limiting normal tissues in radiotherapy (RT) display considerable internal motion between fractions over a course of treatment, potentially reducing the appropriateness of using planned dose distributions to predict morbidity. Accounting explicitly for rectal motion could improve the predictive power of modelling rectal morbidity. To test this, we simulated the effect of motion in two cohorts. Materials and methods: The included patients (232 and 159 cases) received RT for prostate cancer to 70 and 74 Gy. Motion-inclusive dose distributions were introduced as simulations of random or systematic motion to the planned dose distributions. Six rectal morbidity endpoints were analysed. A probit model using the QUANTEC recommended parameters was also applied to the cohorts. Results: The differences in associations using the planned over the motion-inclusive dose distributions were modest. Statistically significant associations were obtained with four of the endpoints, mainly at high doses (55–70 Gy), using both the planned and the motion-inclusive dose distributions, primarily when simulating random motion. The strongest associations were observed for GI toxicity and rectal bleeding (Rs=0.12–0.21; Rs=0.11–0.20). Applying the probit model, significant associations were found for tenesmus and rectal bleeding (Rs=0.13, p=0.02). Conclusion: Equally strong associations with rectal morbidity were observed at high doses (>55 Gy), for the planned and the simulated dose distributions including in particular random rectal motion. Future studies should explore patient-specific descriptions of rectal motion to achieve improved predictive power. PMID:24231236
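The probit (Lyman-type) NTCP model referred to above has the general form NTCP = Φ((gEUD − TD50)/(m·TD50)). A sketch follows, with illustrative rectal parameter values of the order reported in the QUANTEC reviews; the study's exact parameters are not reproduced here:

```python
import math

def probit_ntcp(geud_gy, td50=76.9, m=0.13):
    """Lyman probit NTCP: Phi((gEUD - TD50) / (m * TD50)).

    td50 and m are illustrative rectal-bleeding values of the order reported
    by QUANTEC, used here only to show the shape of the model.
    """
    t = (geud_gy - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

def geud(doses, volumes, n=0.09):
    """Generalised EUD for a (dose, fractional volume) DVH; small n suits a
    serial-like organ such as the rectum (illustrative value)."""
    a = 1.0 / n
    return sum(v * d ** a for d, v in zip(doses, volumes)) ** (1.0 / a)
```

By construction, NTCP is exactly 0.5 when gEUD equals TD50, and the slope around that point is set by m.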
Directory of Open Access Journals (Sweden)
Utkarsh Gautam
2015-05-01
Full Text Available Addressing the need for exploration of benthic zones using autonomous underwater vehicles, this paper presents a simulation of optimised path planning from a source node to a destination node for the autonomous underwater vehicle SLOCUM Glider in a near-bottom ocean environment. Near-bottom ocean current data from the Bedford Institute of Oceanography, Canada, have been used for this simulation. A cost function is formulated to describe the dynamics of the autonomous underwater vehicle in near-bottom ocean currents. This cost function is then optimised using various biologically-inspired algorithms: the genetic algorithm, ant colony optimisation and particle swarm optimisation. The path planning simulation is also performed using the Q-learning technique and the results are compared with the biologically-inspired algorithms. The results clearly show that the Q-learning algorithm has lower computational complexity than the biologically-inspired algorithms, and its environment is also easier to simulate. Hence this paper presents an effective path planning technique, which has been tested on the SLOCUM Glider and may be extended for use in any standard autonomous underwater vehicle. Defence Science Journal, Vol. 65, No. 3, May 2015, pp. 220-225, DOI: http://dx.doi.org/10.14429/dsj.65.7855
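A tabular Q-learning path planner of the kind compared above can be sketched on a toy grid; the grid world, rewards and hyperparameters are illustrative stand-ins for the glider's near-bottom current environment, which would enter through the reward function:

```python
import random

def q_learning_path(grid_w=5, grid_h=5, goal=(4, 4), episodes=3000,
                    alpha=0.5, gamma=0.95, eps=0.2, seed=0):
    """Tabular Q-learning on a toy current-free grid; the vehicle moves N/S/E/W.

    Real use would replace the unit move cost with a current-dependent cost.
    """
    rng = random.Random(seed)
    actions = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    q = {}

    def best(s):
        return max(range(4), key=lambda a: q.get((s, a), 0.0))

    for _ in range(episodes):
        s = (0, 0)
        while s != goal:
            a = rng.randrange(4) if rng.random() < eps else best(s)  # eps-greedy
            dx, dy = actions[a]
            ns = (min(max(s[0] + dx, 0), grid_w - 1),
                  min(max(s[1] + dy, 0), grid_h - 1))
            r = 0.0 if ns == goal else -1.0                  # unit cost per move
            target = r + (0.0 if ns == goal else gamma * q.get((ns, best(ns)), 0.0))
            q[(s, a)] = (1 - alpha) * q.get((s, a), 0.0) + alpha * target
            s = ns
    # Greedy rollout of the learned policy from the start node.
    path, s = [(0, 0)], (0, 0)
    while s != goal and len(path) < 50:
        dx, dy = actions[best(s)]
        s = (min(max(s[0] + dx, 0), grid_w - 1),
             min(max(s[1] + dy, 0), grid_h - 1))
        path.append(s)
    return path

path = q_learning_path()
```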
A flexible annealing chaotic neural network to maximum clique problem.
Yang, Gang; Tang, Zheng; Zhang, Zhiqiang; Zhu, Yunyi
2007-06-01
Based on an analysis and comparison of several annealing strategies, we present a flexible annealing chaotic neural network with flexible control and a quick convergence rate for optimization problems. The proposed network has rich and adjustable chaotic dynamics at the beginning, and then converges quickly to stable states. We test the network on the maximum clique problem with graphs from the DIMACS clique instances, and with p-random and k-random graphs. The simulations show that the flexible annealing chaotic neural network can get satisfactory solutions in very little time and few steps. Comparison with other chaotic neural networks shows that the proposed network has superior execution efficiency and a better ability to find optimal or near-optimal solutions.
Chaotic Multiquenching Annealing Applied to the Protein Folding Problem
Liñan-García, Ernesto; Sánchez-Pérez, Mishael; Sánchez-Hernández, Juan Paulo
2014-01-01
The Chaotic Multiquenching Annealing algorithm (CMQA) is proposed. CMQA is a new algorithm applied to the protein folding problem (PFP). The algorithm is divided into three phases: (i) the multiquenching phase (MQP), (ii) the annealing phase (AP), and (iii) the dynamical equilibrium phase (DEP). MQP enforces several stages of quick quenching processes that include chaotic functions; the chaotic functions can increase the exploration potential of the solution space of PFP. AP implements a simulated annealing algorithm (SA) with an exponential cooling function. MQP and AP are delimited by different temperature ranges: MQP is applied over a range of temperatures from extremely high to very high values, while AP searches for solutions over a range from high to extremely low values. DEP finds the equilibrium dynamically by applying the least squares method. CMQA is tested on several instances of PFP. PMID:24790563
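The three phases can be sketched on a toy one-dimensional energy landscape standing in for a real protein-folding energy; every function, constant, and temperature range below is an illustrative assumption, not the authors' implementation.

```python
import math, random

random.seed(1)

def energy(x):
    """Toy 1-D landscape standing in for a protein-folding energy."""
    return 0.1 * x * x + math.sin(5.0 * x)

def logistic(z):
    """Chaotic map driving the quenching perturbations."""
    return 4.0 * z * (1.0 - z)

def accept(dE, T):
    return dE < 0 or random.random() < math.exp(-dE / T)

x, z = 5.0, 0.7

# (i) Multiquenching phase: short quenches from extremely high to very
# high temperatures, with chaotic step sizes to widen exploration.
for T in (1e4, 1e3, 1e2):
    for _ in range(200):
        z = logistic(z)
        cand = x + (z - 0.5) * 4.0
        if accept(energy(cand) - energy(x), T):
            x = cand

# (ii) Annealing phase: classical SA with exponential cooling T <- alpha*T.
T, alpha, history = 10.0, 0.95, []
while T > 1e-3:
    for _ in range(50):
        cand = x + random.gauss(0.0, 0.5)
        if accept(energy(cand) - energy(x), T):
            x = cand
    history.append(energy(x))
    T *= alpha

# (iii) Dynamical equilibrium phase: least-squares slope of the recent
# energy history; a near-zero slope signals dynamic equilibrium.
n = 30
ys, xs = history[-n:], list(range(n))
xbar, ybar = sum(xs) / n, sum(ys) / n
slope = (sum((u - xbar) * (v - ybar) for u, v in zip(xs, ys))
         / sum((u - xbar) ** 2 for u in xs))
print(round(energy(x), 3), round(slope, 5))
```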
Enhancement of GMI Effect in Silicon Steels by Furnace Annealing
Institute of Scientific and Technical Information of China (English)
C.Sirisathitkul; P. Jantaratana
2009-01-01
The ratio and sensitivity of the giant magnetoimpedance (GMI) in grain-oriented silicon steels (Fe-4.5%Si) are improved after furnace annealing in air for 20 min. After annealing at 800 °C, the GMI sensitivity rises from 1.29%/Oe to 1.91%/Oe and the ratio increases from 237% to 294%, with a decrease in the characteristic frequency. The results are attributable to an increase in the transverse magnetic permeability during the heat treatment. Finite element simulations show that the GMI effect can be interpreted as the modification of the current distribution by the applied magnetic field via the transverse permeability. In the annealed samples, the larger transverse permeability allows a higher GMI ratio and sensitivity.
Chaotic Multiquenching Annealing Applied to the Protein Folding Problem
Directory of Open Access Journals (Sweden)
Juan Frausto-Solis
2014-01-01
Full Text Available The Chaotic Multiquenching Annealing algorithm (CMQA) is proposed. CMQA is a new algorithm applied to the protein folding problem (PFP). The algorithm is divided into three phases: (i) the multiquenching phase (MQP), (ii) the annealing phase (AP), and (iii) the dynamical equilibrium phase (DEP). MQP enforces several stages of quick quenching processes that include chaotic functions; the chaotic functions can increase the exploration potential of the solution space of PFP. AP implements a simulated annealing algorithm (SA) with an exponential cooling function. MQP and AP are delimited by different temperature ranges: MQP is applied over a range of temperatures from extremely high to very high values, while AP searches for solutions over a range from high to extremely low values. DEP finds the equilibrium dynamically by applying the least squares method. CMQA is tested on several instances of PFP.
Stochastic Evolutionary Algorithms for Planning Robot Paths
Fink, Wolfgang; Aghazarian, Hrand; Huntsberger, Terrance; Terrile, Richard
2006-01-01
A computer program implements stochastic evolutionary algorithms for planning and optimizing collision-free paths for robots and their jointed limbs. Stochastic evolutionary algorithms can be made to produce acceptably close approximations to exact, optimal solutions for path-planning problems while often demanding much less computation than do exhaustive-search and deterministic inverse-kinematics algorithms that have been used previously for this purpose. Hence, the present software is better suited for application aboard robots having limited computing capabilities (see figure). The stochastic aspect lies in the use of simulated annealing to (1) prevent trapping of an optimization algorithm in local minima of an energy-like error measure by which the fitness of a trial solution is evaluated while (2) ensuring that the entire multidimensional configuration and parameter space of the path-planning problem is sampled efficiently with respect to both robot joint angles and computation time. Simulated annealing is an established technique for avoiding local minima in multidimensional optimization problems, but has not, until now, been applied to planning collision-free robot paths by use of low-power computers.
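The stochastic core described above, Metropolis acceptance of occasional uphill moves so the planner escapes local minima of the energy-like error measure, can be sketched for a planar waypoint path. The scene (one circular obstacle), the penalty weight, and the cooling schedule are illustrative assumptions, not the program's actual formulation.

```python
import math, random

random.seed(2)

# Hypothetical planar scene: circular obstacle of radius 2 at (5, 5).
OBS, R = (5.0, 5.0), 2.0
START, GOAL = (0.0, 0.0), (10.0, 10.0)

def energy(waypoints):
    """Energy-like error: path length plus a penalty for obstacle hits."""
    path = [START] + waypoints + [GOAL]
    length = sum(math.dist(a, b) for a, b in zip(path, path[1:]))
    penalty = sum(max(0.0, R - math.dist(p, OBS)) for p in path)
    return length + 50.0 * penalty

# Initial guess: a straight line sampled right through the obstacle.
pts = [(10.0 * i / 6, 10.0 * i / 6) for i in range(1, 6)]

T = 5.0
while T > 1e-3:
    i = random.randrange(len(pts))
    cand = list(pts)
    cand[i] = (pts[i][0] + random.gauss(0.0, 0.5),
               pts[i][1] + random.gauss(0.0, 0.5))
    dE = energy(cand) - energy(pts)
    # Metropolis rule: uphill moves survive with probability exp(-dE/T),
    # which is what prevents trapping in local minima.
    if dE < 0 or random.random() < math.exp(-dE / T):
        pts = cand
    T *= 0.999

print(round(energy(pts), 2))
```

A real planner would perturb joint angles rather than Cartesian waypoints and check link collisions, but the acceptance rule and cooling schedule play the same role.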
Annealing properties of rice starch.
Thermal properties of starch can be modified by annealing, i.e., a pre-treatment in excessive amounts of water at temperatures below the gelatinization temperatures. This treatment is known to improve the crystalline properties, and is a useful tool to gain a better control of the functional proper...
Annealing-induced shape recovery in thin film metallic glass
Energy Technology Data Exchange (ETDEWEB)
Negussie, Alemu Tesfaye; Diyatmika, Wahyu [Department of Materials Science and Engineering, National Taiwan University of Science and Technology, Taipei 10607, Taiwan (China); Chu, J.P., E-mail: jpchu@mail.ntust.edu.tw [Department of Materials Science and Engineering, National Taiwan University of Science and Technology, Taipei 10607, Taiwan (China); Shen, Y.L. [Department of Mechanical Engineering, University of New Mexico, Albuquerque, NM 87131 (United States); Jang, J.S.C. [Department of Mechanical Engineering, National Central University, Chung-Li 32001, Taiwan (China); Hsueh, C.H. [Department of Materials Science and Engineering, National Taiwan University, Taipei 10617, Taiwan (China)
2014-11-15
Highlights: • Annealing-induced shape recovery of thin film metallic glass is examined. • Shape recovery becomes obvious with increasing temperature and holding time. • Minimum roughness is obtained when annealed within the supercooled liquid region. • The amount of free volume in the film plays a role in the shape recovery. • Numerical simulation confirms the shape recovery upon annealing. - Abstract: The shape recovery property of a sputtered Zr{sub 50.3}Cu{sub 28.1}Al{sub 14}Ni{sub 7.6} (in at.%) thin film metallic glass upon heating is examined. Owing to surface tension-driven viscous flow, the indentation shape recovers to different extents at various temperatures and holding times. A maximum of 59.8% indentation depth recovery is achieved after annealing within the supercooled liquid region (SCLR). The amount of free volume in the film is found to play a role in the recovery. Atomic force microscopy results reveal a decrease in film roughness to a minimum value within the SCLR. To elucidate the experimentally observed shape recovery, numerical modeling has been employed; it shows that the depressed region caused by indentation is elevated after annealing.
Directory of Open Access Journals (Sweden)
Quintiliano Siqueira Schroden Nomelini
2009-12-01
Full Text Available A genetic map is a diagram representing genes and their respective positions on a chromosome. Genetic maps are essential for locating genes involved in the genetic control of quantitative traits or of other traits of economic interest. In this work, the efficiency of the simulated annealing (SA), rapid chain delineation (RCD), and branch and bound (BB) algorithms for the construction of genetic maps is evaluated through computational simulation of data. Under the conditions evaluated, branch and bound was the fastest algorithm, and both it and rapid chain delineation were 100% efficient. The efficiency of simulated annealing for marker ordering varied with the number of markers: it was 100% for 5 and 10 markers, 99.8% for 15 markers, and 99.2% for 20 markers.
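Marker ordering of the kind compared above is essentially a seriation problem: find the permutation of markers minimising the sum of adjacent recombination distances (SAR). A minimal simulated-annealing sketch on a synthetic distance matrix, not real recombination estimates, could look like this:

```python
import math, random

random.seed(3)

# Synthetic stand-in for two-point recombination estimates: the pairwise
# "distance" between markers grows with their separation in the true order.
n = 10
d = [[abs(i - j) for j in range(n)] for i in range(n)]

def sar(order):
    """Sum of adjacent recombination distances, the ordering criterion."""
    return sum(d[a][b] for a, b in zip(order, order[1:]))

order = list(range(n))
random.shuffle(order)

T = 10.0
while T > 1e-3:
    i, j = sorted(random.sample(range(n), 2))
    cand = order[:i] + order[i:j + 1][::-1] + order[j + 1:]  # reverse a segment
    dE = sar(cand) - sar(order)
    if dE < 0 or random.random() < math.exp(-dE / T):
        order = cand
    T *= 0.999

# For this synthetic matrix the best SAR is n - 1 = 9 (true order or mirror).
print(order, sar(order))
```

Segment reversal is a natural move here because a marker order and its mirror image are equivalent, which is also why efficiency is reported per linkage group rather than per orientation.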
Commissioning of a Geant4 based treatment plan simulation tool: linac model and dicom-rt interface
Cornelius, Iwan; Middlebrook, Nigel; Poole, Christopher; Oborn, Brad; Langton, Christian
2011-01-01
A Geant4 based simulation tool has been developed to perform Monte Carlo modelling of a 6 MV Varian™ iX Clinac. The computer aided design interface of Geant4 was used to accurately model the LINAC components, including the Millennium multi-leaf collimators (MLCs). The simulation tool was verified via simulation of standard commissioning dosimetry data acquired with an ionisation chamber in a water phantom. Verification of the MLC model was achieved by simulation of leaf leakage measurements performed using Gafchromic™ film in a solid water phantom. An absolute dose calibration capability was added by including a virtual monitor chamber in the simulation. Furthermore, a DICOM-RT interface was integrated with the application to allow the simulation of treatment plans in radiotherapy. The ability of the simulation tool to accurately model leaf movements and doses at each control point was verified by simulation of a widely used intensity-modulated radiation therapy (IMRT) quality assurance (QA) technique, the...
Accuracy of a decision aid for advance care planning: simulated end-of-life decision making.
Levi, Benjamin H; Heverley, Steven R; Green, Michael J
2011-01-01
Advance directives have been criticized for failing to help physicians make decisions consistent with patients' wishes. This pilot study sought to determine if an interactive, computer-based decision aid that generates an advance directive can help physicians accurately translate patients' wishes into treatment decisions. We recruited 19 patient-participants who had each previously created an advance directive using a computer-based decision aid, and 14 physicians who had no prior knowledge of the patient-participants. For each advance directive, three physicians were randomly assigned to review the advance directive and make five to six treatment decisions for each of six (potentially) end-of-life clinical scenarios. From the three individual physicians' responses, a "consensus physician response" was generated for each treatment decision (total decisions = 32). This consensus response was shared with the patient whose advance directive had been reviewed, and she/he was then asked to indicate how well the physician translated his/her wishes into clinical decisions. Patient-participants agreed with the consensus physician responses 84 percent (508/608) of the time, including 82 percent agreement on whether to provide mechanical ventilation, and 75 percent on decisions about cardiopulmonary resuscitation (CPR). Across the six vignettes, patient-participants' rating of how well physicians translated their advance directive into medical decisions was 8.4 (range = 6.5-10, where 1 = extremely poorly, and 10 = extremely well). Physicians' overall rating of their confidence at accurately translating patients' wishes into clinical decisions was 7.8 (range = 6.1-9.3, 1 = not at all confident, 10 = extremely confident). For simulated cases, a computer-based decision aid for advance care planning can help physicians more confidently make end-of-life decisions that patients will endorse.
A coherent quantum annealer with Rydberg atoms
Glaetzle, A. W.; van Bijnen, R. M. W.; Zoller, P.; Lechner, W.
2017-06-01
There is a significant ongoing effort in realizing quantum annealing with different physical platforms. The challenge is to achieve a fully programmable quantum device featuring coherent adiabatic quantum dynamics. Here we show that combining the well-developed quantum simulation toolbox for Rydberg atoms with the recently proposed Lechner-Hauke-Zoller (LHZ) architecture allows one to build a prototype for a coherent adiabatic quantum computer with all-to-all Ising interactions and, therefore, a platform for quantum annealing. In LHZ an infinite-range spin-glass is mapped onto the low energy subspace of a spin-1/2 lattice gauge model with quasi-local four-body parity constraints. This spin model can be emulated in a natural way with Rubidium and Caesium atoms in a bipartite optical lattice involving laser-dressed Rydberg-Rydberg interactions, which are several orders of magnitude larger than the relevant decoherence rates. This makes the exploration of coherent quantum enhanced optimization protocols accessible with state-of-the-art atomic physics experiments.
A coherent quantum annealer with Rydberg atoms.
Glaetzle, A W; van Bijnen, R M W; Zoller, P; Lechner, W
2017-06-22
There is a significant ongoing effort in realizing quantum annealing with different physical platforms. The challenge is to achieve a fully programmable quantum device featuring coherent adiabatic quantum dynamics. Here we show that combining the well-developed quantum simulation toolbox for Rydberg atoms with the recently proposed Lechner-Hauke-Zoller (LHZ) architecture allows one to build a prototype for a coherent adiabatic quantum computer with all-to-all Ising interactions and, therefore, a platform for quantum annealing. In LHZ an infinite-range spin-glass is mapped onto the low energy subspace of a spin-1/2 lattice gauge model with quasi-local four-body parity constraints. This spin model can be emulated in a natural way with Rubidium and Caesium atoms in a bipartite optical lattice involving laser-dressed Rydberg-Rydberg interactions, which are several orders of magnitude larger than the relevant decoherence rates. This makes the exploration of coherent quantum enhanced optimization protocols accessible with state-of-the-art atomic physics experiments.
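The optimisation target of such an annealer, the ground state of an all-to-all Ising model, can be illustrated classically. The sketch below runs plain simulated annealing on a small random infinite-range spin glass; the couplings are toy values and the encoding is the bare Ising form, not the LHZ parity-constrained mapping.

```python
import math, random

random.seed(6)

# Toy all-to-all Ising model H = sum_{i<j} J_ij s_i s_j, random couplings.
N = 12
J = {(i, j): random.choice([-1.0, 1.0])
     for i in range(N) for j in range(i + 1, N)}

def energy(s):
    return sum(J[i, j] * s[i] * s[j] for (i, j) in J)

s = [random.choice([-1, 1]) for _ in range(N)]
T = 5.0
while T > 0.01:
    i = random.randrange(N)
    # Flipping spin i changes the energy by -2 * s_i * (local field on i).
    dE = -2 * s[i] * sum(J[min(i, k), max(i, k)] * s[k]
                         for k in range(N) if k != i)
    if dE < 0 or random.random() < math.exp(-dE / T):
        s[i] = -s[i]
    T *= 0.999

print(energy(s))
```

A quantum annealer tackles the same cost function but replaces thermal fluctuations with quantum ones; in LHZ the all-to-all couplings additionally become local fields plus four-body parity constraints.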
Institute of Scientific and Technical Information of China (English)
You Hai-Long; Zhang Chun-Fu
2009-01-01
In this paper, the effects of optical interference and annealing on the performance of P3HT:PCBM-based organic solar cells are studied in detail. Owing to the optical interference effect, the short-circuit current density (J_SC) shows obvious oscillatory behaviour as the active-layer thickness varies. With the help of the simulated results, the devices are optimized around the first two optical interference peaks; the optimized thicknesses are found to be 80 and 208 nm. The study of the effect of annealing on performance indicates that post-annealing is more favourable than pre-annealing. Based on post-annealing, different annealing temperatures are tested; the optimized annealing condition is 160 °C for 10 min in a nitrogen atmosphere. The device achieves an open-circuit voltage (V_OC) of about 0.65 V and a power conversion efficiency as high as 4.0% around the second interference peak.
Institute of Scientific and Technical Information of China (English)
任世科; 何正文; 徐渝
2012-01-01
This paper addresses the project payment scheduling problem from the joint perspective of the two contract parties. In this problem, payments are attached to events, and the task is to arrange the payment events, the execution modes of activities, and the occurrence times of events so as to maximize the joint revenue of the two parties while keeping the arrangement acceptable to both. On the basis of the problem definition, an optimization model composed of two submodels is constructed. In view of the strong NP-hardness of the problem, a simulated annealing heuristic algorithm consisting of two modules is developed. The algorithm is tested on a set of randomly generated standard instances, and the results show that it is an efficient algorithm for the problem studied. Finally, an example illustrates the significance of the study, and the following conclusion is drawn: if the two contract parties make decisions according to their own preferences, they will reach an outcome that is bad for both; if instead they take a cooperative attitude and coordinate with each other, they will obtain more profit from the project. This research can provide decision support for the two contract parties in negotiations over project payment scheduling.
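The core trade-off, where the contractor prefers early payment events, the client late ones, and a simulated annealing search arranges event times to maximise joint value, can be sketched on a hypothetical five-event instance. The amounts, time windows, and discount rates below are invented for illustration and omit the paper's activity modes and two-module structure.

```python
import math, random

random.seed(4)

# Hypothetical instance: event k pays the contractor amount[k], which the
# client pays out; each party discounts cash flows at its own rate.
amount = [100.0, 80.0, 120.0, 60.0, 140.0]
earliest = [2, 4, 6, 8, 10]   # precedence-driven earliest occurrence times
latest = [6, 8, 10, 12, 14]   # contractual deadlines
r_contractor, r_client = 0.03, 0.01

def joint_npv(times):
    """Sum of both parties' NPVs; only the timing is being optimised."""
    v_in = sum(a * math.exp(-r_contractor * t) for a, t in zip(amount, times))
    v_out = -sum(a * math.exp(-r_client * t) for a, t in zip(amount, times))
    return v_in + v_out

times = list(latest)          # start from the latest feasible schedule
T = 1.0
while T > 1e-4:
    k = random.randrange(len(times))
    cand = list(times)
    cand[k] = random.randint(earliest[k], latest[k])
    dE = joint_npv(times) - joint_npv(cand)   # maximising, so flip the sign
    if dE < 0 or random.random() < math.exp(-dE / T):
        times = cand
    T *= 0.995

print(times, round(joint_npv(times), 2))
```

With the contractor discounting more heavily than the client, the joint objective here improves monotonically as events move earlier, so the search drifts toward the earliest feasible times; the interesting cases in the paper arise when modes and precedence couple the event times.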
Mayer, I.S.; Zhou, Q.; Lo, J.; Abspoel, L.; Keijser, X.; Olsen, E.; Nixon, E.; Kannen, A.
2012-01-01
Marine ecosystems around the globe are increasingly affected by human activities such as fisheries, shipping, offshore petroleum developments, wind farms, recreation, tourism and more. Whereas the necessity and urgency to regulate and plan competing marine spatial claims is growing, the planning and
Energy Technology Data Exchange (ETDEWEB)
Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia; Pilch, Martin M.
2009-01-01
The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Administration (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. The plan also identifies the responsibilities of ASC management and of the software project teams in implementing the software quality practices and in assessing progress towards their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective SQE practices. The plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.
Energy Technology Data Exchange (ETDEWEB)
Pascau, Javier, E-mail: jpascau@mce.hggm.es [Unidad de Medicina y Cirugia Experimental, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Departamento de Bioingenieria e Ingenieria Aeroespacial, Universidad Carlos III de Madrid, Madrid (Spain); Santos Miranda, Juan Antonio [Servicio de Oncologia Radioterapica, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Facultad de Medicina, Universidad Complutense de Madrid, Madrid (Spain); Calvo, Felipe A. [Servicio de Oncologia Radioterapica, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Facultad de Medicina, Universidad Complutense de Madrid, Madrid (Spain); Departamento de Oncologia, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Bouche, Ana; Morillo, Virgina [Consorcio Hospitalario Provincial de Castellon, Castellon (Spain); Gonzalez-San Segundo, Carmen [Servicio de Oncologia Radioterapica, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Facultad de Medicina, Universidad Complutense de Madrid, Madrid (Spain); Ferrer, Carlos; Lopez Tarjuelo, Juan [Consorcio Hospitalario Provincial de Castellon, Castellon (Spain); and others
2012-06-01
Purpose: Intraoperative electron beam radiation therapy (IOERT) involves a modified strategy of conventional radiation therapy and surgery. The lack of specific planning tools limits the spread of this technique. The purpose of the present study is to describe a new simulation and planning tool and its initial evaluation by clinical users. Methods and Materials: The tool works on a preoperative computed tomography scan. A physician contours regions to be treated and protected and simulates applicator positioning, calculating isodoses and the corresponding dose-volume histograms depending on the selected electron energy. Three radiation oncologists evaluated data from 15 IOERT patients, including different tumor locations. Segmentation masks, applicator positions, and treatment parameters were compared. Results: High parameter agreement was found in the following cases: three breast and three rectal cancer, retroperitoneal sarcoma, and rectal and ovary monotopic recurrences. All radiation oncologists performed similar segmentations of tumors and high-risk areas. The average applicator position difference was 1.2 ± 0.95 cm. The remaining cancer sites showed higher deviations because of differences in the criteria for segmenting high-risk areas (one rectal, one pancreas) and different surgical access simulated (two rectal, one Ewing sarcoma). Conclusions: The results show that this new tool can be used to simulate IOERT cases involving different anatomic locations, and that preplanning has to be carried out with specialized surgical input.
Propagating self-sustained annealing of radiation-induced interstitial complexes
Bokov, P. M.; Selyshchev, P. A.
2016-02-01
Propagating self-sustained annealing of radiation-induced defects resulting from a thermal-concentration instability is studied. The defects considered in the model are complexes, each consisting of one impurity atom and one interstitial atom. A crystal with defects stores excess energy, which is transformed into heat when the defects anneal. The annealing auto-wave has been simulated, and its front and speed have been obtained. It is shown that annealing occurs in a narrow region of time and space, with two kinds of behaviour: in the first, the speed of the auto-wave oscillates around a constant mean value and the temperature front oscillates in a complex way; in the second, the propagation speed is constant and the temperature and concentration fronts look like sigmoid functions.
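A crude one-dimensional sketch of this thermal-concentration feedback can be written with explicit finite differences; all parameter values below are illustrative assumptions, not the paper's model constants.

```python
import math

# 1-D fields: T is a dimensionless temperature, C the concentration of
# impurity-interstitial complexes. Annealing releases stored energy,
# which heats the lattice and accelerates annealing in neighbouring cells.
nx, dx, dt = 100, 1.0, 0.01
D, E_stored, k0, Ea = 1.0, 50.0, 1.0, 10.0

T = [1.0] * nx
C = [1.0] * nx
for i in range(5):
    T[i] = 10.0          # ignite annealing at the left edge

def rate(temp):
    """Arrhenius annealing rate of the complexes."""
    return k0 * math.exp(-Ea / temp)

for _ in range(8000):
    lap = [0.0] * nx
    for i in range(1, nx - 1):
        lap[i] = (T[i - 1] - 2.0 * T[i] + T[i + 1]) / dx ** 2
    for i in range(nx):
        dC = -rate(T[i]) * C[i]
        T[i] += dt * (D * lap[i] - E_stored * dC)   # -dC > 0: released heat
        C[i] += dt * dC

# Front position: rightmost cell where most complexes have annealed out.
front = max((i for i in range(nx) if C[i] < 0.5), default=0)
print(front, round(C[0], 4))
```

The self-sustained character shows up as a burned-out region (low C, high T) expanding from the ignition zone, in the same spirit as a combustion wave.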
Burgner, Jessica; Kahrs, Lüder Alexander; Raczkowsky, Jörg; Wörn, Heinz
2009-01-01
Material processing using lasers has become a widely used method, especially in industrial automation. Such systems are mostly based on a precise model of the laser process and the corresponding parameterization. Beyond industrial use, the laser has also become an integral instrument for treating human tissue in medicine. Human tissue, as an inhomogeneous material to process, poses the question of how to determine a model that reflects the interaction processes with a specific laser. It has recently been shown that the pulsed CO2 laser is suitable for ablating bony and cartilage tissue. Until now, this thermo-mechanical bone ablation has not been characterized as a discrete process. In order to plan and simulate the ablation process at the correct level of detail, parameterization is indispensable. We developed a planning and simulation environment, determined parameters by confocal measurements of bone specimens, and use these results to transfer planned cutting trajectories into a pulse sequence and corresponding robot locations.
Verma, Savita Arora; Jung, Yoon Chul
2017-01-01
This presentation provides an overview of the ATD-2 project and of an integrated simulation of surface and airspace operations used to evaluate the procedures of the IADS system and surface metering capabilities via a high-fidelity human-in-the-loop (HITL) simulation. Two HITL facilities, Future Flight Central (FFC) and the Airspace Operations Laboratory (AOL), are integrated to simulate surface operations at Charlotte Douglas International Airport (CLT) and airspace in the CLT TRACON and Washington Center.
2004-01-01
The primary goal of Access 5 is to allow safe, reliable and routine operations of High Altitude-Long Endurance Remotely Operated Aircraft (HALE ROAs) within the National Airspace System (NAS). Step 1 of Access 5 addresses the policies, procedures, technologies and implementation issues of introducing such operations into the NAS above pressure altitude 40,000 ft (Flight Level 400 or FL400). Routine HALE ROA activity within the NAS represents a potentially significant change to the tasks and concerns of NAS users, service providers and other stakeholders. Due to the complexity of the NAS, and the importance of maintaining current high levels of safety in the NAS, any significant changes must be thoroughly evaluated prior to implementation. The Access 5 community has been tasked with performing this detailed evaluation of routine HALE ROA activities in the NAS, and providing to key NAS stakeholders a set of recommended policies and procedures to achieve this goal. Extensive simulation, in concert with a directed flight demonstration program, is intended to provide the required supporting evidence that these recommendations are based on sound methods and offer a clear roadmap to achieving safe, reliable and routine HALE ROA operations in the NAS. Through coordination with NAS service providers and policy makers, and with significant input from HALE ROA manufacturers, operators and pilots, this document presents the detailed simulation plan for Step 1 of Access 5. A brief background of the Access 5 project will be presented with focus on Steps 1 and 2, concerning HALE ROA operations above FL400 and FL180, respectively. An overview of the project management structure follows, with particular emphasis on the role of the Simulation IPT and its relationships to other project entities. This discussion will include a description of work packages assigned to the Simulation IPT, and present the specific goals to be achieved for each simulation work package, along with the associated
Strong white photoluminescence from annealed zeolites
Energy Technology Data Exchange (ETDEWEB)
Bai, Zhenhua, E-mail: baizh46@gmail.com [School of Chemical and Biomedical Engineering, Nanyang Technological University, Singapore 637457 (Singapore); Fujii, Minoru; Imakita, Kenji; Hayashi, Shinji [Department of Electrical and Electronic Engineering, Graduate School of Engineering, Kobe University, Rokkodai, Nada, Kobe 657-8501 (Japan)
2014-01-15
The optical properties of zeolites annealed at various temperatures are investigated for the first time. The annealed zeolites exhibit strong white photoluminescence (PL) under ultraviolet light excitation. With increasing annealing temperature, the emission intensity of the annealed zeolites first increases and then decreases. At the same time, the PL peak red-shifts from 495 nm to 530 nm and then returns to 500 nm. The strongest emission appears at an annealing temperature of 500 °C, and the quantum yield of that sample is measured to be ∼10%. The PL lifetime monotonically increases from 223 μs to 251 μs with increasing annealing temperature. The origin of the white PL is ascribed to oxygen vacancies formed during the annealing process. -- Highlights: • The optical properties of zeolites annealed at various temperatures are investigated. • The annealed zeolites exhibit strong white photoluminescence. • The maximum PL enhancement reaches as large as 62 times. • The lifetime shows little dependence on annealing temperature. • The origin of the white emission is ascribed to oxygen vacancies.
Adaptive genetic algorithm for path planning of loosely coordinated multi-robot manipulators
Institute of Scientific and Technical Information of China (English)
高胜; 赵杰; 蔡鹤皋
2003-01-01
A novel adaptive genetic algorithm, ASAGA, which can dynamically modify the parameters of a genetic algorithm through a simulated annealing mechanism, is proposed for path planning of loosely coordinated multi-robot manipulators. Over the task space of the multi-robot system, a decoupled planning strategy is also applied to the evolutionary process, which enables the multi-robot system to avoid deadlock and the computation of the composite C-space. Finally, two representative tests are given to validate ASAGA and the decoupled planning strategy.
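The parameter-adaptation idea, letting a simulated-annealing temperature schedule control the genetic operators and the acceptance of offspring, can be sketched on a toy continuous objective. The population size, schedules, and objective below are illustrative assumptions, not ASAGA's actual operators or the multi-robot cost function.

```python
import math, random

random.seed(5)

def fitness(x):
    """Toy objective standing in for a path-cost evaluation (minimise)."""
    return sum(xi * xi for xi in x)

POP, DIM = 30, 5
pop = [[random.uniform(-5.0, 5.0) for _ in range(DIM)] for _ in range(POP)]
T0, Tend, gens = 10.0, 0.01, 200

for g in range(gens):
    T = T0 * (Tend / T0) ** (g / (gens - 1))   # annealing schedule
    p_mut = 0.05 + 0.5 * T / T0                # wide mutation early, fine late
    sigma = 0.1 + T / T0
    nxt = []
    for _ in range(POP):
        a, b = random.sample(pop, 2)
        cut = random.randrange(1, DIM)
        child = a[:cut] + b[cut:]              # one-point crossover
        child = [xi + random.gauss(0.0, sigma) if random.random() < p_mut
                 else xi for xi in child]
        # Metropolis acceptance against the better parent: tolerant of bad
        # offspring at high T (diversity), greedy as T falls.
        parent = min(a, b, key=fitness)
        dE = fitness(child) - fitness(parent)
        nxt.append(child if dE < 0 or random.random() < math.exp(-dE / T)
                   else parent)
    pop = nxt

best = min(pop, key=fitness)
print(round(fitness(best), 4))
```

Tying both the mutation parameters and the acceptance rule to one temperature is what gives such hybrids their "explore early, exploit late" behaviour without hand-tuned per-generation rates.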