WorldWideScience

Sample records for planning simulated annealing

  1. Conventional treatment planning optimization using simulated annealing

    International Nuclear Information System (INIS)

    Morrill, S.M.; Langer, M.; Lane, R.G.

    1995-01-01

    Purpose: Simulated annealing (SA) allows realistic biological and clinical cost functions to be incorporated into treatment plan optimization. However, a drawback to the clinical implementation of SA optimization is that large numbers of beams appear in the final solution, some with insignificant weights, preventing delivery of these optimized plans using conventional (limited to a few coplanar beams) radiation therapy. A preliminary study suggested two promising algorithms for restricting the number of beam weights. The purpose of this investigation was to compare these two algorithms using our current SA algorithm, with the aim of producing an algorithm that allows clinically useful radiation therapy treatment planning optimization. Method: Our current SA algorithm, Variable Stepsize Generalized Simulated Annealing (VSGSA), was modified with two algorithms to restrict the number of beam weights in the final solution. The first selected combinations of a fixed number of beams from the complete solution space at each iterative step of the optimization process. The second reduced the allowed number of beams by a factor of two at periodic steps during the optimization until only the specified number of beams remained. Results of optimizing beam weights and angles with these algorithms were compared on a standard set of abdominal cases. The solution space was defined as a set of 36 custom-shaped open and wedge-filtered fields at 10 deg. increments with a constant target volume margin of 1.2 cm. For each case a clinically accepted cost function, the minimum tumor dose, was maximized subject to a set of normal tissue binary dose-volume constraints. For this study, the optimized plan was restricted to four (4) fields suitable for delivery with conventional therapy equipment.
Results: The table gives the mean value of the minimum target dose obtained for each algorithm averaged over 5 different runs and the comparable manual treatment
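    The second restriction strategy amounts to a simple halving schedule over the candidate beam pool. A minimal sketch reconstructed from the abstract (the function name is our own; the weight re-optimization between stages is not shown):

```python
def beam_reduction_schedule(total_beams, final_beams):
    """Allowed beam counts for the second restriction strategy: the
    candidate pool is halved at periodic steps of the annealing run
    until only final_beams remain."""
    counts = [total_beams]
    while counts[-1] > final_beams:
        counts.append(max(final_beams, counts[-1] // 2))
    return counts

# 36 candidate fields reduced to a 4-field plan: 36 -> 18 -> 9 -> 4
schedule = beam_reduction_schedule(36, 4)
```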

  2. Enhanced Simulated Annealing for Solving Aggregate Production Planning

    Directory of Open Access Journals (Sweden)

    Mohd Rizam Abu Bakar

    2016-01-01

    Full Text Available Simulated annealing (SA) is an effective means of addressing difficult optimisation problems, and it is now a common research topic with several productive applications, such as production planning. Because aggregate production planning (APP) is one of the most significant problems in production planning, in this paper we present a multiobjective linear programming model for APP and optimise it with SA. In the course of optimising the APP problem, we found that the capability of SA was inadequate and its performance substandard, particularly for a sizable constrained APP problem with many decision variables and plenty of constraints. Because the algorithm works sequentially, each current state generates only one successor state, which slows the search; moreover, the search may become trapped in a local minimum that is the best solution in only part of the solution space. To enhance its performance and alleviate these deficiencies, a modified SA (MSA) is proposed. We augment the search space by starting with N+1 solutions instead of one. To analyse and investigate the operation of MSA against the standard SA and harmony search (HS), evaluations are made using real data from an industrial company and simulation. The results show that, compared to SA and HS, MSA offers better-quality solutions with regard to convergence and accuracy.
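    The core MSA idea of annealing N+1 solutions instead of one can be sketched as follows; the independent coupling of the states and all parameter values here are illustrative assumptions, not the paper's exact method:

```python
import math
import random

def multi_start_sa(cost, neighbor, starts, t0=1.0, alpha=0.95, steps=200, seed=0):
    """Anneal several candidate solutions in parallel and return the best.

    Sketch of the multi-start idea only: each of the N+1 states is updated
    independently with Metropolis acceptance under a shared geometric
    cooling schedule.
    """
    rng = random.Random(seed)
    states = list(starts)
    energies = [cost(s) for s in states]
    best, best_e = min(zip(states, energies), key=lambda p: p[1])
    t = t0
    for _ in range(steps):
        for i in range(len(states)):
            cand = neighbor(states[i], rng)
            cand_e = cost(cand)
            delta = cand_e - energies[i]
            # downhill moves always accepted; uphill with prob exp(-delta/T)
            if delta <= 0 or rng.random() < math.exp(-delta / t):
                states[i], energies[i] = cand, cand_e
                if cand_e < best_e:
                    best, best_e = cand, cand_e
        t *= alpha
    return best, best_e

# Toy problem: minimize (x - 3)^2 from 5 scattered starting points.
cost = lambda x: (x - 3.0) ** 2
neighbor = lambda x, rng: x + rng.uniform(-0.5, 0.5)
best, best_e = multi_start_sa(cost, neighbor, starts=[-10.0, -5.0, 0.0, 5.0, 10.0])
```

    Starting from several states makes it far less likely that every chain is trapped in the same poor region of the solution space.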

  3. Multiobjective optimization with a modified simulated annealing algorithm for external beam radiotherapy treatment planning

    International Nuclear Information System (INIS)

    Aubry, Jean-Francois; Beaulieu, Frederic; Sevigny, Caroline; Beaulieu, Luc; Tremblay, Daniel

    2006-01-01

    Inverse planning in external beam radiotherapy often requires a scalar objective function that incorporates importance factors to mimic the planner's preferences between conflicting objectives. Defining those importance factors is not straightforward and frequently leads to an iterative process in which the importance factors become variables of the optimization problem. To avoid this drawback of inverse planning, optimization using algorithms better suited to multiobjective optimization, such as evolutionary algorithms, has been suggested. However, much inverse planning software, including a simulated-annealing-based package developed at our institution, does not include multiobjective-oriented algorithms. This work investigates the performance of a modified simulated annealing algorithm used to drive aperture-based intensity-modulated radiotherapy inverse planning software in a multiobjective optimization framework. For a few test cases involving gastric cancer patients, the new algorithm increases optimization speed by a little more than a factor of 2 over a conventional simulated annealing algorithm, while giving a close approximation of the solutions produced by standard simulated annealing. A simple graphical user interface designed to facilitate the decision-making process that follows an optimization is also presented.

  4. A simulated annealing approach to supplier selection aware inventory planning

    OpenAIRE

    Turk, Seda; Miller, Simon; Özcan, Ender; John, Robert

    2015-01-01

    Selection of an appropriate supplier is a crucial and challenging task in the effective management of a supply chain. Also, appropriate inventory management is critical to the success of a supply chain operation. In recent years, there has been a growing interest in the area of selection of an appropriate vendor and creating good inventory planning using supplier selection information. In this paper, we consider both of these tasks in a two-stage approach employing Interval Type-2 Fuzzy Sets ...

  5. Optimization of Gamma Knife treatment planning via guided evolutionary simulated annealing

    International Nuclear Information System (INIS)

    Zhang Pengpeng; Dean, David; Metzger, Andrew; Sibata, Claudio

    2001-01-01

    We present a method for generating optimized Gamma Knife™ (Elekta, Stockholm, Sweden) radiosurgery treatment plans. This semiautomatic method produces a highly conformal shot-packing plan for the irradiation of an intracranial tumor. We simulate optimal treatment planning criteria with a probability function that is linked to every voxel in a volumetric (MR or CT) region of interest. This sigmoidal P+ parameter models the requirement of conformality (i.e., tumor ablation and normal tissue sparing). After determination of initial radiosurgery treatment parameters, a guided evolutionary simulated annealing (GESA) algorithm is used to find the optimal size, position, and weight for each shot. The three-dimensional GESA algorithm searches the shot parameter space more thoroughly than is possible during manual shot packing and provides one plan that is suited to the treatment criteria of the attending neurosurgeon and radiation oncologist. The result is a more conformal plan, which also reduces redundancy and saves treatment administration time.

  6. Temporary Workforce Planning with Firm Contracts: A Model and a Simulated Annealing Heuristic

    Directory of Open Access Journals (Sweden)

    Muhammad Al-Salamah

    2011-01-01

    Full Text Available The aim of this paper is to introduce a model for temporary staffing when temporary employment is managed by firm contracts and to propose a simulated annealing-based method to solve the model. Temporary employment is a policy frequently used to adjust the working hour capacity to fluctuating demand. Temporary workforce planning models have been unnecessarily simplified to account for only periodic hiring and laying off; a company can review its workforce requirement every period and make hire-fire decisions accordingly, usually with a layoff cost. We present a more realistic temporary workforce planning model that assumes a firm contract between the worker and the company, which can extend to several periods. The model assumes the traditional constraints, such as inventory balance constraints, worker availability, and labor hour mix. The costs are the inventory holding cost, training cost of the temporary workers, and the backorder cost. The mixed integer model developed for this case has been found to be difficult to solve even for small problem sizes; therefore, a simulated annealing algorithm is proposed to solve the mixed integer model. The performance of the SA algorithm is compared with the CPLEX solution.

  7. A study of inverse planning by simulated annealing for photon beams modulated by a multileaf collimator

    International Nuclear Information System (INIS)

    Grant, Walter; Carol, Mark; Geis, Paul; Boyer, Arthur L.

    1995-01-01

    Purpose/Objective: To demonstrate the feasibility of inverse planning for multiple fixed-field conformal therapy with a prototype simulated annealing technique, and to deliver the treatment plan with an engineering-prototype dynamic multileaf collimator. Methods and Materials: A version of the NOMOS inverse-planning algorithm was used to compute weighting distributions over the areas of multiple fixed-gantry fields. The algorithm uses simulated annealing and a cost function based on physical dose, and is a modification of a NOMOS Peacock planning implementation being used clinically. The computed weighting distributions represented the relative intensities over small 0.5 cm x 1.0 cm areas of the fields. The inverse planning was carried out on a Sun Model 20 computer with four processors. Between five and nine fixed-gantry beams were used in the plans. The weighting distributions were rendered into leaf-setting sequences using an algorithm developed for use with a Varian experimental dynamic multileaf collimator. The sequences were saved as computer files in a format that was used to drive the Varian control system. X-ray fields of 6-MV and 18-MV energies were planned and delivered using tumor target and sensitive-structure volumes segmented from clinical CT scans. Results: The resulting beam-modulation sequences could be loaded into the accelerator control systems and initiated. Each fixed-gantry-angle beam was delivered in 30 s to 50 s. The resulting dose distributions were measured in quasi-anatomical phantoms using film. Dose distributions that could achieve significant tissue sparing were demonstrated. There was good agreement between the delivered dose distributions and the planned distributions. Conclusion: The prototype inverse-planning system under development by NOMOS can be integrated with the prototype dynamic-delivery system being developed by Varian Associates. Should these commercial entities choose to offer compatible FDA

  8. Simulated annealing and circuit layout

    NARCIS (Netherlands)

    Aarts, E.H.L.; Laarhoven, van P.J.M.

    1991-01-01

    We discuss the problem of approximately solving circuit layout problems by simulated annealing. For this we first summarize the theoretical concepts of the simulated annealing algorithm using the theory of homogeneous and inhomogeneous Markov chains. Next we briefly review general aspects of the

  9. Placement by thermodynamic simulated annealing

    International Nuclear Information System (INIS)

    Vicente, Juan de; Lanchares, Juan; Hermida, Roman

    2003-01-01

    Combinatorial optimization problems arise in different fields of science and engineering. There exist some general techniques for coping with these problems, such as simulated annealing (SA). In spite of SA's success, it usually requires costly experimental studies to fine-tune the most suitable annealing schedule. In this Letter, the classical integrated circuit placement problem is addressed by Thermodynamic Simulated Annealing (TSA). TSA provides a new annealing schedule derived from thermodynamic laws. Unlike SA, the temperature in TSA is free to evolve and its value is continuously updated from the variation of state functions such as the internal energy and entropy. Thereby, TSA achieves the high-quality results of SA while providing interesting adaptive features.

  10. Global optimization and simulated annealing

    NARCIS (Netherlands)

    Dekkers, A.; Aarts, E.H.L.

    1988-01-01

    In this paper we are concerned with global optimization, which can be defined as the problem of finding points on a bounded subset of R^n in which some real-valued function f assumes its optimal (i.e. maximal or minimal) value. We present a stochastic approach which is based on the simulated annealing
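    The canonical simulated annealing loop for such bounded continuous problems can be sketched as follows; the geometric cooling schedule and Gaussian step distribution are illustrative choices, not the paper's:

```python
import math
import random

def simulated_annealing(f, lo, hi, t0=1.0, alpha=0.99, steps=5000, seed=0):
    """Minimize f over the box [lo, hi] in R^n with Metropolis acceptance:
    downhill moves are always taken, uphill moves with probability
    exp(-delta/T), and T follows a geometric cooling schedule."""
    rng = random.Random(seed)
    n = len(lo)
    x = [rng.uniform(lo[i], hi[i]) for i in range(n)]
    fx = f(x)
    best, best_f = x[:], fx
    t = t0
    for _ in range(steps):
        # perturb one coordinate, clamped to the feasible box
        i = rng.randrange(n)
        cand = x[:]
        cand[i] = min(hi[i], max(lo[i], cand[i] + rng.gauss(0.0, 0.1 * (hi[i] - lo[i]))))
        fc = f(cand)
        if fc <= fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < best_f:
                best, best_f = x[:], fx
        t *= alpha
    return best, best_f

# Toy: minimize the 2-D sphere function on [-5, 5]^2 (optimum 0 at the origin).
best, best_f = simulated_annealing(lambda v: v[0] ** 2 + v[1] ** 2,
                                   [-5.0, -5.0], [5.0, 5.0])
```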

  11. Simulated annealing model of acupuncture

    Science.gov (United States)

    Shang, Charles; Szu, Harold

    2015-05-01

    The growth control singularity model suggests that acupuncture points (acupoints) originate from organizers in embryogenesis. Organizers are singular points in growth control. Acupuncture can perturb a system with effects similar to simulated annealing. In a clinical trial, the goal of a treatment is to relieve a certain disorder, which corresponds to reaching a certain local optimum in simulated annealing. The self-organizing capacity of the system is limited and related to the person's general health and age. Perturbation at acupoints can lead to a stronger local excitation (analogous to a higher annealing temperature) than perturbation at non-singular points (placebo control points). This difference diminishes as the number of perturbed points increases, because the limited self-organizing activity is distributed more widely. The model explains the following facts from systematic reviews of acupuncture trials: 1. Properly chosen single-acupoint treatment for certain disorders can lead to highly repeatable efficacy above placebo. 2. When multiple acupoints are used, the result can be highly repeatable if the patients are relatively healthy and young, but is usually mixed if the patients are old, frail, and have multiple disorders at the same time, as the number of local optima or comorbidities increases. 3. As the number of acupoints used increases, the efficacy difference between sham and real acupuncture often diminishes. The model predicts that the efficacy of acupuncture is negatively correlated with disease chronicity, severity, and the patient's age. This is the first biological-physical model of acupuncture that can predict and guide clinical acupuncture research.

  12. Simulated annealing with constant thermodynamic speed

    International Nuclear Information System (INIS)

    Salamon, P.; Ruppeiner, G.; Liao, L.; Pedersen, J.

    1987-01-01

    Arguments are presented to the effect that the optimal annealing schedule for simulated annealing proceeds with constant thermodynamic speed, i.e., with dT/dt = -vT/(ε√C), where T is the temperature, ε is the relaxation time, C is the heat capacity, t is the time, and v is the thermodynamic speed. Experimental results consistent with this conjecture are presented from simulated annealing on graph partitioning problems. (orig.)
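    The constant-speed schedule is easy to integrate numerically once the relaxation time and heat capacity are known; `eps` and `heat_capacity` below are placeholders for quantities that a real annealing run would estimate on the fly:

```python
import math

def constant_speed_schedule(t0, v, eps, heat_capacity, dt, steps):
    """Forward-Euler integration of dT/dt = -v*T/(eps*sqrt(C(T))).

    eps (relaxation time) and heat_capacity are illustrative inputs;
    both are in general temperature-dependent system properties.
    """
    temps = [t0]
    for _ in range(steps):
        t = temps[-1]
        temps.append(t + dt * (-v * t / (eps * math.sqrt(heat_capacity(t)))))
    return temps

# With constant eps and C the schedule reduces to exponential decay,
# T(t) = T0 * exp(-v * t / (eps * sqrt(C))).
temps = constant_speed_schedule(t0=10.0, v=1.0, eps=1.0,
                                heat_capacity=lambda T: 4.0,
                                dt=0.01, steps=1000)
```

    For temperature-dependent C, the schedule automatically slows the cooling near phase-transition-like regions where the heat capacity peaks.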

  13. Cylinder packing by simulated annealing

    Directory of Open Access Journals (Sweden)

    M. Helena Correia

    2000-12-01

    Full Text Available This paper is motivated by the problem of loading identical items of circular base (tubes, rolls, ...) into a rectangular base (the pallet). For practical reasons, all the loaded items are considered to have the same height. Solving this problem consists in determining the positioning pattern of the circular bases of the items on the rectangular pallet while maximizing the number of items. This pattern is repeated for each layer stacked on the pallet. Two algorithms based on the Simulated Annealing meta-heuristic have been developed and implemented, and tuning their parameters required intensive tests to achieve good efficiency. The algorithms were easily extended to the case of non-identical circles.
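    A minimal sketch of an SA formulation for this kind of packing problem, assuming a fixed number of circles and an overlap-penalty cost (the paper's actual algorithms and parameters are not reproduced here):

```python
import math
import random

def pack_circles(n, w, h, r, t0=1.0, alpha=0.9995, steps=20000, seed=0):
    """Try to place n equal circles of radius r inside a w x h rectangle.

    The state is the list of centres, the cost is the total pairwise
    overlap depth, and SA perturbs one centre at a time; a residual of
    zero means a feasible packing of n circles was found.
    """
    rng = random.Random(seed)
    pts = [[rng.uniform(r, w - r), rng.uniform(r, h - r)] for _ in range(n)]

    def overlap(p):
        c = 0.0
        for i in range(n):
            for j in range(i + 1, n):
                d = math.hypot(p[i][0] - p[j][0], p[i][1] - p[j][1])
                c += max(0.0, 2.0 * r - d)  # overlap depth of the pair
        return c

    cur = overlap(pts)
    t = t0
    for _ in range(steps):
        i = rng.randrange(n)
        old = pts[i][:]
        # Gaussian move of one centre, clamped so the circle stays inside
        pts[i][0] = min(w - r, max(r, pts[i][0] + rng.gauss(0.0, r / 2)))
        pts[i][1] = min(h - r, max(r, pts[i][1] + rng.gauss(0.0, r / 2)))
        new = overlap(pts)
        if new <= cur or rng.random() < math.exp((cur - new) / t):
            cur = new
        else:
            pts[i] = old  # undo rejected move
        t *= alpha
    return pts, cur

# Four circles of radius 1 in a 6 x 6 rectangle leave plenty of slack.
pts, residual = pack_circles(n=4, w=6.0, h=6.0, r=1.0)
```

    Maximizing the number of items can then be approached by binary search on n, accepting the largest n for which the residual reaches zero.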

  14. Inverse planning anatomy-based dose optimization for HDR-brachytherapy of the prostate using fast simulated annealing algorithm and dedicated objective function

    International Nuclear Information System (INIS)

    Lessard, Etienne; Pouliot, Jean

    2001-01-01

    An anatomy-based dose optimization algorithm is developed to automatically and rapidly produce highly conformal dose coverage of the target volume while minimizing urethra, bladder, and rectal doses in the delivery of a high-dose-rate (HDR) brachytherapy boost for the treatment of prostate cancer. The dwell times are optimized using an inverse planning simulated annealing algorithm (IPSA) governed entirely by the anatomy extracted from a CT and by a dedicated objective function (cost function) reflecting the clinical prescription and constraints. With this inverse planning approach, the focus is on the physician's prescription and constraints instead of on the technical limitations. Consequently, the physician's control over the treatment is improved. The capacity of this algorithm to represent the physician's prescription is presented for a clinical prostate case. The computation time (CPU) for IPSA optimization is less than 1 min (41 s for 142 915 iterations) for a typical clinical case, allowing fast and practical dose optimization. The achievement of highly conformal dose coverage of the target volume opens the possibility of delivering a higher dose to the prostate without inducing overdosage of the urethra and the normal tissues surrounding the prostate. Moreover, using the same concept, it will be possible to deliver a boost dose to a delimited tumor volume within the prostate. Finally, this method can be easily extended to other anatomical sites.

  15. Intelligent medical image processing by simulated annealing

    International Nuclear Information System (INIS)

    Ohyama, Nagaaki

    1992-01-01

    Image processing is widely used in the medical field and has already become very important, especially for image reconstruction purposes. In this paper, it is shown that image processing can be classified into four categories: passive, active, intelligent, and visual image processing. These four classes are first explained through several examples; the comparison shows that passive image processing does not give better results than the others. Intelligent image processing is then addressed, and the simulated annealing method is introduced. Owing to the flexibility of simulated annealing, formulated intelligence is shown to be easily introduced into an image reconstruction problem. As a practical example, 3D blood vessel reconstruction from a small number of projections, which is insufficient for conventional methods to give good reconstruction, is proposed, and computer simulation clearly shows the effectiveness of the simulated annealing method. Before concluding, medical file systems such as IS&C (Image Save and Carry) are pointed out to have potential for formulating knowledge, which is indispensable for intelligent image processing. The paper concludes by summarizing the advantages of simulated annealing. (author)

  16. Thin-film designs by simulated annealing

    Science.gov (United States)

    Boudet, T.; Chaton, P.; Herault, L.; Gonon, G.; Jouanet, L.; Keller, P.

    1996-11-01

    With the increasing power of computers, new methods for the synthesis of optical multilayer systems have appeared. Among these, the simulated annealing algorithm has proved its efficiency in several fields of physics. We show its performance in the field of optical multilayer systems through different filter designs.

  17. Job shop scheduling by simulated annealing

    NARCIS (Netherlands)

    Laarhoven, van P.J.M.; Aarts, E.H.L.; Lenstra, J.K.

    1992-01-01

    We describe an approximation algorithm for the problem of finding the minimum makespan in a job shop. The algorithm is based on simulated annealing, a generalization of the well known iterative improvement approach to combinatorial optimization problems. The generalization involves the acceptance of

  18. Finite-time thermodynamics and simulated annealing

    International Nuclear Information System (INIS)

    Andresen, B.

    1989-01-01

    When the general, global optimization technique simulated annealing was introduced by Kirkpatrick et al. (1983), the mathematical algorithm was based on an analogy to the statistical mechanical behavior of real physical systems like spin glasses, hence the name. In the intervening years the method has proven exceptionally useful for a great variety of extremely complicated problems, notably NP-hard problems like the travelling salesman, DNA sequencing, and graph partitioning. Only a few highly optimized heuristic algorithms (e.g. Lin, Kernighan 1973) have outperformed simulated annealing on their respective problems (Johnson et al. 1989). Simulated annealing in its current form relies only on the static quantity 'energy' to describe the system, whereas questions of rate, as in the temperature path (annealing schedule, see below), are left to intuition. We extend the connection to physical systems and take over further components from thermodynamics, such as ensemble, heat capacity, and relaxation time. Finally we refer to finite-time thermodynamics (Andresen, Salomon, Berry 1984) for a dynamical estimate of the optimal temperature path. (orig.)

  19. On lumped models for thermodynamic properties of simulated annealing problems

    International Nuclear Information System (INIS)

    Andresen, B.; Pedersen, J.M.; Salamon, P.; Hoffmann, K.H.; Mosegaard, K.; Nulton, J.

    1987-01-01

    The paper describes a new method for the estimation of thermodynamic properties for simulated annealing problems using data obtained during a simulated annealing run. The method works by estimating energy-to-energy transition probabilities and is well adapted to simulations such as simulated annealing, in which the system is never in equilibrium. (orig.)

  20. A note on simulated annealing to computer laboratory scheduling ...

    African Journals Online (AJOL)

    The concepts, principles, and implementation of simulated annealing as a modern heuristic technique are presented. The simulated annealing algorithm is used to solve a real-life computer laboratory scheduling problem in order to maximize the use of scarce and insufficient resources. KEY WORDS: Simulated Annealing ...

  1. Learning FCM by chaotic simulated annealing

    International Nuclear Information System (INIS)

    Alizadeh, Somayeh; Ghazanfari, Mehdi

    2009-01-01

    A fuzzy cognitive map (FCM) is a directed graph which shows the relations between essential components in complex systems. It is a very convenient, simple, and powerful tool used in numerous areas of application. Experts who are familiar with the system components and their relations can generate a related FCM. A gap arises when human experts cannot produce the FCM, or when no expert is available to produce it; a new mechanism must then be used to bridge this gap. In this paper, a novel learning method is proposed to construct FCMs using chaotic simulated annealing (CSA). The proposed method is able not only to construct the FCM graph topology but also to extract the weights of the edges from input historical data. The efficiency of the proposed method is shown by comparing its results on some numerical examples with those of the simulated annealing (SA) method.

  2. Simulated annealing algorithm for optimal capital growth

    Science.gov (United States)

    Luo, Yong; Zhu, Bo; Tang, Yong

    2014-08-01

    We investigate the problem of dynamic optimal capital growth of a portfolio. A general framework was developed in which one strives to maximize the expected logarithmic utility of the long-term growth rate. Exact optimization algorithms run into difficulties in this framework, which motivates applying a simulated annealing algorithm to optimize the capital growth of a given portfolio. Empirical results with real financial data indicate that the approach is promising for capital growth portfolios.

  3. Binary Sparse Phase Retrieval via Simulated Annealing

    Directory of Open Access Journals (Sweden)

    Wei Peng

    2016-01-01

    Full Text Available This paper presents the Simulated Annealing Sparse PhAse Recovery (SASPAR) algorithm for reconstructing sparse binary signals from the phaseless magnitudes of their Fourier transform. A greedy-strategy version, which is a parameter-free algorithm, is also proposed for comparison. Numerical simulations indicate that our method is quite effective and suggest that the binary model is robust. The SASPAR algorithm is competitive with existing methods in its efficiency and high recovery rate, even with fewer Fourier measurements.

  4. Simulated Annealing metaheuristic to solve forest planning problems with integer constraints

    Directory of Open Access Journals (Sweden)

    Flávio Lopes Rodrigues

    2004-04-01

    Full Text Available The objectives of this work were to develop and test the SA metaheuristic for solving forest management problems with integer constraints. The SA algorithm developed was tested on four problems containing between 93 and 423 decision variables, subject to singularity constraints and periodic minimum- and maximum-production constraints. All problems had the maximization of net present value as their objective. The SA algorithm was coded in Delphi 5.0 and the tests were run on an AMD K6-II 500 MHz microcomputer with 64 MB of RAM and a 15 GB hard disk. The performance of SA was evaluated according to measures of efficacy and efficiency. Different values or categories of the SA parameters were tested and compared with respect to their effects on the efficacy of the algorithm. The best parameter configuration was selected with the L&O test at 1% probability, and the analyses were carried out using descriptive statistics. With the best parameter configuration, SA achieved a mean efficacy of 95.36%, a minimum of 83.66%, a maximum of 100%, and a coefficient of variation of 3.18% relative to the mathematical optimum obtained by the exact branch-and-bound algorithm. For the largest problem, the efficiency of SA was ten times that of the exact branch-and-bound algorithm. The good performance of this heuristic reinforces conclusions, drawn in other works, about its enormous potential for solving important forest management problems that are hard to solve with current computational tools.

  5. Hierarchical Network Design Using Simulated Annealing

    DEFF Research Database (Denmark)

    Thomadsen, Tommy; Clausen, Jens

    2002-01-01

    networks are described and a mathematical model is proposed for a two-level version of the hierarchical network problem. The problem is to determine which edges should connect nodes and how demand is routed in the network. The problem is solved heuristically using simulated annealing, which as a sub-algorithm uses a construction algorithm to determine edges and route the demand. Performance for different versions of the algorithm is reported in terms of runtime and quality of the solutions. The algorithm is able to find solutions of reasonable quality in approximately 1 hour for networks with 100 nodes.

  6. Simulated annealing for tensor network states

    International Nuclear Information System (INIS)

    Iblisdir, S

    2014-01-01

    Markov chains for probability distributions related to matrix product states and one-dimensional Hamiltonians are introduced. With appropriate ‘inverse temperature’ schedules, these chains can be combined into a simulated annealing scheme for ground states of such Hamiltonians. Numerical experiments suggest that a linear, i.e., fast, schedule is possible in non-trivial cases. A natural extension of these chains to two-dimensional settings is next presented and tested. The obtained results compare well with Euclidean evolution. The proposed Markov chains are easy to implement and are inherently sign problem free (even for fermionic degrees of freedom). (paper)

  7. MEDICAL STAFF SCHEDULING USING SIMULATED ANNEALING

    Directory of Open Access Journals (Sweden)

    Ladislav Rosocha

    2015-07-01

    Full Text Available Purpose: The efficiency of medical staff is a fundamental feature of healthcare facility quality. Better implementation of their preferences into the scheduling problem might not only improve the work-life balance of doctors and nurses but also result in better patient care. This paper focuses on the optimization of medical staff preferences in the scheduling problem. Methodology/Approach: We propose a medical staff scheduling algorithm based on simulated annealing, a well-known method from statistical thermodynamics. We define hard constraints, which are linked to legal and working regulations, and minimize the violations of soft constraints, which are related to the quality of work, psychological well-being, and work-life balance of staff. Findings: On a sample of 60 physicians and nurses from a gynecology department, we generated monthly schedules and optimized their preferences in terms of soft constraints. Our results indicate that the final schedule produced by the proposed algorithm has more than 18 times fewer soft-constraint violations than the initially generated random schedule that satisfied the hard constraints. Research Limitation/Implication: Even though the global optimality of the final outcome is not guaranteed, a desirable solution was obtained in reasonable time. Originality/Value of paper: We show that the designed algorithm is able to successfully generate schedules regarding hard and soft constraints. Moreover, the presented method is significantly faster than standard schedule generation and is able to reschedule effectively owing to the local neighborhood search characteristics of simulated annealing.

  8. PENJADWALAN FLOWSHOP DENGAN MENGGUNAKAN SIMULATED ANNEALING

    Directory of Open Access Journals (Sweden)

    Muhammad Firdaus

    2015-04-01

    Full Text Available This article applies a machine scheduling technique, Simulated Annealing (SA), to schedule 8 jobs on 5 machines so as to minimize makespan. A flowshop production line is chosen as a case study for data collection, with the aim of reducing the jobs' makespan. The article also performs a sensitivity analysis to explore the implications of changing SA parameters such as temperature. The results show that the SA algorithm reduces the completion time of the jobs by about 5 hours compared with the existing method. Moreover, total idle time of the machines is reduced by 2.18 per cent using the SA technique. The sensitivity analysis indicates a significant relationship between changes in temperature and both makespan and computation time.
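
    The flowshop makespan recurrence and a swap-neighbourhood SA loop can be sketched as follows. The 8x5 processing-time matrix here is a random placeholder, not the article's data:

```python
import math
import random

random.seed(42)

# hypothetical processing times proc[job][machine] for 8 jobs on 5 machines
proc = [[random.randint(1, 9) for _ in range(5)] for _ in range(8)]

def makespan(order):
    # classic permutation-flowshop recurrence: comp[m] holds the finish
    # time of the most recent job on machine m
    comp = [0] * 5
    for j in order:
        for m in range(5):
            start = max(comp[m], comp[m - 1] if m else 0)
            comp[m] = start + proc[j][m]
    return comp[-1]

def anneal(order, t=50.0, alpha=0.99, steps=3000):
    cost = makespan(order)
    best, best_cost = order[:], cost
    for _ in range(steps):
        i, k = random.sample(range(len(order)), 2)
        cand = order[:]
        cand[i], cand[k] = cand[k], cand[i]      # swap two jobs
        c = makespan(cand)
        if c < cost or random.random() < math.exp((cost - c) / t):
            order, cost = cand, c
            if cost < best_cost:
                best, best_cost = order[:], cost
        t *= alpha
    return best, best_cost

start = list(range(8))
best_order, best_makespan = anneal(start)
```

    The swap move keeps the solution a valid permutation of the jobs, so only the makespan needs re-evaluation after each perturbation.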

  9. Very fast simulated re-annealing

    OpenAIRE

    L. Ingber

    1989-01-01

    Draft. An algorithm is developed to statistically find the best global fit of a nonlinear non-convex cost function over a D-dimensional space. It is argued that this algorithm permits an annealing schedule for "temperature" T decreasing exponentially in annealing-time k, T = T0 exp(-c k^(1/D)). The introduction of re-annealing also permits adaptation to changing sensitivities in the multidimensional parameter space. This annealing schedule is faster than fast Cauchy annealing, ...
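
    The quoted schedule T = T0 exp(-c k^(1/D)) can be written down directly; this is a toy illustration of the formula only, not Ingber's ASA code:

```python
import math

def vfsr_temperature(t0, c, k, dim):
    # T_k = T0 * exp(-c * k**(1/D)): exponential in k**(1/D), so for
    # large k it falls below the fast (Cauchy) schedule T0 / k
    return t0 * math.exp(-c * k ** (1.0 / dim))

# the schedule is monotonically decreasing in annealing-time k
temps = [vfsr_temperature(10.0, 1.0, k, dim=4) for k in range(1, 6)]
```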

  10. Simulated annealing approach for solving economic load dispatch ...

    African Journals Online (AJOL)

    user

    thermodynamics to solve economic load dispatch (ELD) problems. ... evolutionary programming algorithm has been successfully applied for solving the ... concept behind the simulated annealing (SA) optimization is discussed in Section 3.

  11. Simulated annealing image reconstruction for positron emission tomography

    Energy Technology Data Exchange (ETDEWEB)

    Sundermann, E; Lemahieu, I; Desmedt, P [Department of Electronics and Information Systems, University of Ghent, St. Pietersnieuwstraat 41, B-9000 Ghent, Belgium (Belgium)

    1994-12-31

    In Positron Emission Tomography (PET), images have to be reconstructed from noisy projection data. The noise on the PET data can be modeled by a Poisson distribution. In this paper, we present the results of using the simulated annealing technique to reconstruct PET images. Various parameter settings of the simulated annealing algorithm are discussed and optimized. The reconstructed images are of good quality and high contrast in comparison with other reconstruction techniques. (authors). 11 refs., 2 figs.

  12. Simulated annealing image reconstruction for positron emission tomography

    International Nuclear Information System (INIS)

    Sundermann, E.; Lemahieu, I.; Desmedt, P.

    1994-01-01

    In Positron Emission Tomography (PET), images have to be reconstructed from noisy projection data. The noise on the PET data can be modeled by a Poisson distribution. In this paper, we present the results of using the simulated annealing technique to reconstruct PET images. Various parameter settings of the simulated annealing algorithm are discussed and optimized. The reconstructed images are of good quality and high contrast in comparison with other reconstruction techniques. (authors)

  13. Stochastic search in structural optimization - Genetic algorithms and simulated annealing

    Science.gov (United States)

    Hajela, Prabhat

    1993-01-01

    An account is given of illustrative applications of genetic algorithms and simulated annealing methods in structural optimization. The advantages of such stochastic search methods over traditional mathematical programming strategies are emphasized; it is noted that these methods offer a significantly higher probability of locating the global optimum in a multimodal design space. Both genetic-search and simulated annealing can be effectively used in problems with a mix of continuous, discrete, and integer design variables.

  14. On simulated annealing phase transitions in phylogeny reconstruction.

    Science.gov (United States)

    Strobl, Maximilian A R; Barker, Daniel

    2016-08-01

    Phylogeny reconstruction with global criteria is NP-complete or NP-hard, hence in general requires a heuristic search. We investigate the powerful, physically inspired, general-purpose heuristic simulated annealing, applied to phylogeny reconstruction. Simulated annealing mimics the physical process of annealing, where a liquid is gently cooled to form a crystal. During the search, periods of elevated specific heat occur, analogous to physical phase transitions. These simulated annealing phase transitions play a crucial role in the outcome of the search. Nevertheless, they have received comparably little attention, for phylogeny or other optimisation problems. We analyse simulated annealing phase transitions during searches for the optimal phylogenetic tree for 34 real-world multiple alignments. In the same way in which melting temperatures differ between materials, we observe distinct specific heat profiles for each input file. We propose this reflects differences in the search landscape and can serve as a measure for problem difficulty and for suitability of the algorithm's parameters. We discuss application in algorithmic optimisation and as a diagnostic to assess parameterisation before computationally costly, large phylogeny reconstructions are launched. Whilst the focus here lies on phylogeny reconstruction under maximum parsimony, it is plausible that our results are more widely applicable to optimisation procedures in science and industry. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  15. Angular filter refractometry analysis using simulated annealing.

    Science.gov (United States)

    Angland, P; Haberberger, D; Ivancic, S T; Froula, D H

    2017-10-01

    Angular filter refractometry (AFR) is a novel technique used to characterize the density profiles of laser-produced, long-scale-length plasmas [Haberberger et al., Phys. Plasmas 21, 056304 (2014)]. A new method of analysis for AFR images was developed using an annealing algorithm to iteratively converge upon a solution. A synthetic AFR image is constructed from a user-defined density profile described by eight parameters, and the algorithm systematically alters the parameters until the comparison is optimized. The optimization and statistical uncertainty calculation are based on the minimization of the χ² test statistic. The algorithm was successfully applied to experimental data of plasma expanding from a flat, laser-irradiated target, resulting in an average uncertainty in the density profile of 5%-20% in the region of interest.
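
    The fit-by-annealing idea, a parameterized synthetic profile perturbed iteratively to minimise χ², can be sketched with a two-parameter stand-in for the paper's eight-parameter density profile. Everything below (profile shape, noise level, schedule) is an invented illustration:

```python
import math
import random

random.seed(3)

# synthetic "measurement": an exponential profile a * exp(-x / b) plus noise;
# a and b stand in for the paper's eight density-profile parameters
TRUE_A, TRUE_B, SIGMA = 2.0, 1.5, 0.02
xs = [0.1 * i for i in range(50)]
data = [TRUE_A * math.exp(-x / TRUE_B) + random.gauss(0, SIGMA) for x in xs]

def chi2(a, b):
    # chi-squared statistic of the model against the synthetic measurement
    return sum(((a * math.exp(-x / b) - d) / SIGMA) ** 2
               for x, d in zip(xs, data))

a, b = 1.0, 1.0                       # deliberately poor starting guess
cost = chi2(a, b)
best, best_cost = (a, b), cost
t = 100.0
for _ in range(4000):
    ca = a + random.gauss(0, 0.05)
    cb = max(0.1, b + random.gauss(0, 0.05))  # keep the scale length positive
    c = chi2(ca, cb)
    if c < cost or random.random() < math.exp((cost - c) / t):
        a, b, cost = ca, cb, c
        if cost < best_cost:
            best, best_cost = (a, b), cost
    t *= 0.998
```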

  16. Simulated annealing algorithm for reactor in-core design optimizations

    International Nuclear Information System (INIS)

    Zhong Wenfa; Zhou Quan; Zhong Zhaopeng

    2001-01-01

    A nuclear reactor must be optimized for in-core fuel management to make full use of the fuel, to reduce the operating cost, and to flatten the power distribution reasonably. The author presents a simulated annealing algorithm: an objective function and a punishment function were provided for optimizing the reactor physics design, with the punishment function used to drive the simulated annealing algorithm. The practical design of the NHR-200 was calculated. The results show that K_eff can be increased by 2.5% and the power distribution can be flattened.
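
    The objective-plus-punishment structure is generic and can be sketched on a one-dimensional toy problem. The reactor model itself is not reproduced; the function, bound, and coefficients below are invented for illustration:

```python
import math
import random

random.seed(1)

def objective(x):
    # stand-in objective to minimise (plays the role of the physics figure of merit)
    return (x - 3.0) ** 2

def punishment(x):
    # constraint x <= 2.5 enforced softly via a quadratic punishment term
    return 100.0 * max(0.0, x - 2.5) ** 2

def cost(x):
    return objective(x) + punishment(x)

x, t = 0.0, 5.0
best, best_cost = x, cost(x)
for _ in range(4000):
    cand = x + random.uniform(-0.2, 0.2)
    dc = cost(cand) - cost(x)
    if dc < 0 or random.random() < math.exp(-dc / t):
        x = cand
        if cost(x) < best_cost:
            best, best_cost = x, cost(x)
    t *= 0.998
# the penalised optimum of this toy lies just past the boundary, near x = 2.505
```

    The punishment term converts the constrained problem into an unconstrained one the annealer can handle; the larger its coefficient, the closer the annealed solution sits to the feasible region.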

  17. Meta-Modeling by Symbolic Regression and Pareto Simulated Annealing

    NARCIS (Netherlands)

    Stinstra, E.; Rennen, G.; Teeuwen, G.J.A.

    2006-01-01

    The subject of this paper is a new approach to Symbolic Regression. Other publications on Symbolic Regression use Genetic Programming. This paper describes an alternative method based on Pareto Simulated Annealing. Our method is based on linear regression for the estimation of constants. Interval

  18. Correction of measured multiplicity distributions by the simulated annealing method

    International Nuclear Information System (INIS)

    Hafidouni, M.

    1993-01-01

    Simulated annealing is a method used to solve combinatorial optimization problems. It is used here for the correction of the observed multiplicity distribution from S-Pb collisions at 200 GeV/c per nucleon. (author) 11 refs., 2 figs

  19. The afforestation problem: a heuristic method based on simulated annealing

    DEFF Research Database (Denmark)

    Vidal, Rene Victor Valqui

    1992-01-01

    This paper presents the afforestation problem, that is, the location and design of new forest compartments to be planted in a given area. This optimization problem is solved by a two-step heuristic method based on simulated annealing. Tests and experiences with this method are also presented.

  20. Molecular dynamics simulation of annealed ZnO surfaces

    Energy Technology Data Exchange (ETDEWEB)

    Min, Tjun Kit; Yoon, Tiem Leong [School of Physics, Universiti Sains Malaysia, 11800 USM, Penang (Malaysia); Lim, Thong Leng [Faculty of Engineering and Technology, Multimedia University, Melaka Campus, 75450 Melaka (Malaysia)

    2015-04-24

    The effect of thermally annealing a slab of wurtzite ZnO, terminated by two surfaces, (0001) (oxygen-terminated) and (0001̄) (Zn-terminated), is investigated via molecular dynamics simulation using the reactive force field ReaxFF. We found that upon heating beyond a threshold temperature of ∼700 K, surface oxygen atoms begin to sublimate from the (0001) surface. The fraction of oxygen leaving the surface at a given temperature increases as the heating temperature increases. A range of phenomena occurring at the atomic level on the (0001) surface has also been explored, such as the formation of oxygen dimers on the surface and the evolution of the partial charge distribution in the slab during the annealing process. It was found that the partial charge distribution as a function of depth from the surface undergoes a qualitative change when the annealing temperature is above the threshold temperature.

  1. Reactor controller design using genetic algorithms with simulated annealing

    International Nuclear Information System (INIS)

    Erkan, K.; Buetuen, E.

    2000-01-01

    This chapter presents a digital control system for the ITU TRIGA Mark-II reactor using genetic algorithms combined with simulated annealing. The basic principles of genetic algorithms for problem solving are inspired by the mechanism of natural selection, a biological process in which stronger individuals are likely to be the winners in a competing environment; genetic algorithms use a direct analogy of natural evolution. Genetic algorithms are global search techniques for optimisation, but they are poor at hill-climbing, whereas simulated annealing has the ability of probabilistic hill-climbing. Thus, the two techniques are combined here to obtain a fine-tuned algorithm that yields faster convergence and a more accurate search, by introducing a new mutation operator in the style of simulated annealing and an adaptive cooling schedule. In control system design there are currently no systematic approaches to choosing the controller parameters to obtain the desired performance; the controller parameters are usually determined by trial and error with simulation and experimental analysis. The genetic algorithm is used to automatically and efficiently search for a set of controller parameters that gives better performance. (orig.)

  2. Ranking important nodes in complex networks by simulated annealing

    International Nuclear Information System (INIS)

    Sun Yu; Yao Pei-Yang; Shen Jian; Zhong Yun; Wan Lu-Jun

    2017-01-01

    In this paper, a new method based on simulated annealing to rank important nodes in complex networks is presented. First, the concept of an importance sequence (IS), describing the relative importance of nodes in complex networks, is defined. Then a measure used to evaluate the reasonability of an IS is designed. By treating an IS as a state of the complex network and the measure of its reasonability as the energy of that state, the method finds the ground state of the network by simulated annealing; in other words, the method constructs a most reasonable IS. The results of experiments on real and artificial networks show that this ranking method not only is effective but also can be applied to different kinds of complex networks. (paper)

  3. Selection of views to materialize using simulated annealing algorithms

    Science.gov (United States)

    Zhou, Lijuan; Liu, Chi; Wang, Hongfeng; Liu, Daixin

    2002-03-01

    A data warehouse contains many materialized views over the data provided by distributed heterogeneous databases, for the purpose of efficiently implementing decision-support or OLAP queries. It is important to select the right views to materialize so that they answer a given set of queries; the goal is to minimize the combined cost of query evaluation and view maintenance. In this paper, we address and design algorithms for selecting a set of views to be materialized so that the sum of the cost of processing a set of queries and maintaining the materialized views is minimized. We develop an approach using simulated annealing algorithms to solve this problem: first we explore simulated annealing to optimize the selection of materialized views, then we use experiments to demonstrate the approach. We implemented our algorithms, and a performance study shows that the proposed algorithm performs better and gives an optimal solution.

  4. Annealing simulation of cascade damage using MARLOWE-DAIQUIRI codes

    International Nuclear Information System (INIS)

    Muroga, Takeo

    1984-01-01

    The localization effect of the defects generated by cascade damage on the properties of solids was studied using a computer code based on the two-body collision approximation and the Monte Carlo method. The MARLOWE and DAIQUIRI codes were partly improved to fit the present calculation of cascade-damage annealing. The purpose of this study is to investigate the behavior of defects under simulated reactor irradiation conditions. Calculations were made for alpha iron (BCC), with the threshold energy set at 40 eV. The temperature dependence of annealing and the growth of clusters were studied, as was the overlapping effect of cascades: first the extreme case of overlapping was studied, then practical cases were estimated by interpolation. The degree of cascade overlap corresponds to the irradiation rate. The interaction between cascades and dislocations was studied, and the annealing of primary knock-on atom (PKA) damage in alpha iron was calculated. At low temperature the effect of dislocations was large, but growth of vacancy clusters was not seen; at high temperature the effect of dislocations was small. The simulation of various ion irradiations and the growth efficiency of defects were evaluated. (Kato, T.)

  5. Parallel simulated annealing algorithms for cell placement on hypercube multiprocessors

    Science.gov (United States)

    Banerjee, Prithviraj; Jones, Mark Howard; Sargent, Jeff S.

    1990-01-01

    Two parallel algorithms for standard cell placement using simulated annealing are developed to run on distributed-memory message-passing hypercube multiprocessors. The cells can be mapped in a two-dimensional area of a chip onto processors in an n-dimensional hypercube in two ways, such that both small and large cell exchange and displacement moves can be applied. The computation of the cost function in parallel among all the processors in the hypercube is described, along with a distributed data structure that needs to be stored in the hypercube to support the parallel cost evaluation. A novel tree broadcasting strategy is used extensively for updating cell locations in the parallel environment. A dynamic parallel annealing schedule estimates the errors due to interacting parallel moves and adapts the rate of synchronization automatically. Two novel approaches in controlling error in parallel algorithms are described: heuristic cell coloring and adaptive sequence control.

  6. Restoration of polarimetric SAR images using simulated annealing

    DEFF Research Database (Denmark)

    Schou, Jesper; Skriver, Henning

    2001-01-01

    Filtering synthetic aperture radar (SAR) images ideally results in better estimates of the parameters characterizing the distributed targets in the images while preserving the structures of the nondistributed targets. However, these objectives are normally conflicting, often leading to a filtering approach favoring one of the objectives. An algorithm for estimating the radar cross-section (RCS) for intensity SAR images has previously been proposed in the literature, based on Markov random fields and the stochastic optimization method simulated annealing. A new version of the algorithm is presented...

  7. Combined Simulated Annealing Algorithm for the Discrete Facility Location Problem

    Directory of Open Access Journals (Sweden)

    Jin Qin

    2012-01-01

    Full Text Available The combined simulated annealing (CSA) algorithm was developed for the discrete facility location problem (DFLP) in this paper. The method is a two-layer algorithm, in which the external subalgorithm optimizes the facility location decision while the internal subalgorithm optimizes the allocation of customer demand under the determined location decision. The performance of the CSA is tested on 30 instances of different sizes. The computational results show that CSA works much better than the previous algorithm for the DFLP and offers a reasonable new alternative solution method for it.

  8. Analysis of Trivium by a Simulated Annealing variant

    DEFF Research Database (Denmark)

    Borghoff, Julia; Knudsen, Lars Ramkilde; Matusiewicz, Krystian

    2010-01-01

    This paper proposes a new method of solving certain classes of systems of multivariate equations over the binary field, and its cryptanalytical applications. We show how heuristic optimization methods such as hill climbing algorithms can be relevant to solving systems of multivariate equations. A characteristic of equation systems that may be efficiently solvable by means of such algorithms is provided. As an example, we investigate equation systems induced by the problem of recovering the internal state of the stream cipher Trivium. We propose an improved variant of the simulated annealing method...

  9. Stochastic annealing simulations of defect interactions among subcascades

    Energy Technology Data Exchange (ETDEWEB)

    Heinisch, H.L. [Pacific Northwest National Lab., Richland, WA (United States); Singh, B.N.

    1997-04-01

    The effects of the subcascade structure of high energy cascades on the temperature dependencies of annihilation, clustering and free defect production are investigated. The subcascade structure is simulated by closely spaced groups of lower energy MD cascades. The simulation results illustrate the strong influence of the defect configuration existing in the primary damage state on subsequent intracascade evolution. Other significant factors affecting the evolution of the defect distribution are the large differences in mobility and stability of vacancy and interstitial defects and the rapid one-dimensional diffusion of small, glissile interstitial loops produced directly in cascades. Annealing simulations are also performed on high-energy, subcascade-producing cascades generated with the binary collision approximation and calibrated to MD results.

  10. Optimisation of electron beam characteristics by simulated annealing

    International Nuclear Information System (INIS)

    Ebert, M.A.; University of Adelaide, SA; Hoban, P.W.

    1996-01-01

    Full text: With the development of technology in the field of treatment beam delivery, the possibility of tailoring radiation beams (via manipulation of the beam's phase space) is foreseeable. This investigation involved evaluating a method for determining the characteristics of pure electron beams which provide dose distributions that best approximate desired distributions. The aim is to determine which degrees of freedom are advantageous and worth pursuing in a clinical setting. A simulated annealing routine was developed to determine optimum electron beam characteristics. A set of beam elements is defined at the surface of a homogeneous water-equivalent phantom, defining discrete positions and angles of incidence, and electron energies. The optimal weighting of these elements is determined by the (generally approximate) solution to the linear equation Dw = d, where d represents the dose distribution calculated over the phantom, w the vector of (50 to 2×10^4) beam element relative weights, and D a normalised matrix of dose deposition kernels. In the iterative annealing procedure, beam elements are randomly selected, and beam weighting distributions are sampled and used to perturb the selected elements. Perturbations are accepted or rejected according to standard simulated annealing criteria. The result (after the algorithm has terminated on meeting an iteration or optimisation specification) is an approximate solution for the beam weight vector w specified by the above equation. This technique has been applied to several sample dose distributions and phase-space restrictions. An example is given of the phase space obtained when endeavouring to conform to a rectangular 100% dose region with polyenergetic though normally incident electrons. For regular distributions, intuitive conclusions regarding the benefits of energy/angular manipulation may be made, whereas for complex distributions, variations in intensity over beam elements of varying energy and

  11. Simulated Annealing-Based Krill Herd Algorithm for Global Optimization

    Directory of Open Access Journals (Sweden)

    Gai-Ge Wang

    2013-01-01

    Full Text Available Recently, Gandomi and Alavi proposed a novel swarm-intelligence method, called krill herd (KH), for global optimization. To enhance the performance of the KH method, in this paper a new improved meta-heuristic, simulated annealing-based krill herd (SKH), is proposed for optimization tasks. A new krill selecting (KS) operator is used to refine krill behavior when updating the krill's position, so as to enhance its reliability and robustness when dealing with optimization problems. The introduced KS operator involves a greedy strategy and the acceptance of a few not-so-good solutions with a low probability, as originally used in simulated annealing (SA). In addition, a kind of elitism scheme is used to save the best individuals in the population during the krill updating process. The merits of these improvements are verified on fourteen standard benchmark functions, and experimental results show that, in most cases, the performance of this improved meta-heuristic SKH method is superior to, or at least highly competitive with, the standard KH and other optimization methods.
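
    The SA-style selection inside the KS operator reduces to the standard Metropolis acceptance rule, sketched here on its own. This is a generic helper, not the authors' code:

```python
import math
import random

def sa_accept(delta, temperature):
    # greedy on improvements; accept a worse move (delta > 0) with
    # probability exp(-delta / temperature), as in simulated annealing
    if delta <= 0:
        return True
    return random.random() < math.exp(-delta / temperature)

random.seed(7)
# at high temperature most worsening moves pass; at low temperature almost none
hot = sum(sa_accept(1.0, 10.0) for _ in range(1000))
cold = sum(sa_accept(1.0, 0.1) for _ in range(1000))
```

    Embedding such a rule in a population method lets occasional worse updates through early on (exploration) while becoming effectively greedy as the temperature drops.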

  12. List-Based Simulated Annealing Algorithm for Traveling Salesman Problem

    Directory of Open Access Journals (Sweden)

    Shi-hua Zhan

    2016-01-01

    Full Text Available Simulated annealing (SA) is a popular intelligent optimization algorithm which has been successfully applied in many fields. Parameter setting is a key factor in its performance, but it is also tedious work. To simplify parameter setting, we present a list-based simulated annealing (LBSA) algorithm to solve the traveling salesman problem (TSP). The LBSA algorithm uses a novel list-based cooling schedule to control the decrease of temperature: a list of temperatures is created first, and then the maximum temperature in the list is used by the Metropolis acceptance criterion to decide whether to accept a candidate solution. The temperature list is adapted iteratively according to the topology of the solution space of the problem. The effectiveness and the parameter sensitivity of the list-based cooling schedule are illustrated on benchmark TSP instances. The LBSA algorithm, whose performance is robust over a wide range of parameter values, shows competitive performance compared with some other state-of-the-art algorithms.
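
    The core of the list-based cooling schedule, take the maximum temperature in the list for the Metropolis test and, when a worse move is accepted, replace that maximum with a temperature derived from the move, can be sketched as follows. This is a simplified reading of the scheme, demonstrated on a toy 1-D problem rather than the TSP:

```python
import math
import random

random.seed(5)

def lbsa_accept(cost, cand_cost, temps):
    # list-based rule: use the maximum temperature in the list; on accepting
    # a worse move, replace that maximum with -(delta) / ln(r), which is
    # always smaller than it, so the list cools adaptively (simplified sketch)
    if cand_cost <= cost:
        return True
    t = max(temps)
    r = max(random.random(), 1e-300)          # guard against log(0)
    if r < math.exp(-(cand_cost - cost) / t):
        temps.remove(t)
        temps.append(-(cand_cost - cost) / math.log(r))
        return True
    return False

temps = [random.uniform(1.0, 10.0) for _ in range(20)]  # initial temperature list
x = 10.0
for _ in range(2000):
    cand = x + random.uniform(-1.0, 1.0)
    if lbsa_accept(x * x, cand * cand, temps):          # minimise f(x) = x^2
        x = cand
```

    Because each replacement temperature satisfies -(delta)/ln(r) < max(temps), the list's maximum never increases, so no explicit cooling-rate parameter is needed.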

  13. Differential evolution-simulated annealing for multiple sequence alignment

    Science.gov (United States)

    Addawe, R. C.; Addawe, J. M.; Sueño, M. R. K.; Magadia, J. C.

    2017-10-01

    Multiple sequence alignments (MSAs) are used in the analysis of molecular evolution and sequence-structure relationships. In this paper, a hybrid algorithm, Differential Evolution - Simulated Annealing (DESA), is applied to optimizing multiple sequence alignments based on structural information, non-gaps percentage and totally conserved columns. DESA is a robust algorithm characterized by self-organization, mutation, crossover, and an SA-like selection scheme for the strategy parameters. Here, the MSA problem is treated as a multi-objective optimization problem for the hybrid evolutionary algorithm; we therefore name the algorithm DESA-MSA. Simulated sequences and alignments were generated to evaluate the accuracy and efficiency of DESA-MSA using different indel sizes, sequence lengths, deletion rates and insertion rates. The proposed hybrid algorithm obtained acceptable solutions, particularly for the MSA problem evaluated on the three objectives.

  14. spsann - optimization of sample patterns using spatial simulated annealing

    Science.gov (United States)

    Samuel-Rosa, Alessandro; Heuvelink, Gerard; Vasques, Gustavo; Anjos, Lúcia

    2015-04-01

    There are many algorithms and computer programs to optimize sample patterns, some private and others publicly available. A few have only been presented in scientific articles and textbooks. This dispersion and somewhat poor availability hold back their wider adoption and further development. We introduce spsann, a new R package for the optimization of sample patterns using spatial simulated annealing. R is the most popular environment for data processing and analysis. Spatial simulated annealing is a well-known method in widespread use for solving optimization problems in the soil and geo-sciences, mainly due to its robustness against local optima and its ease of implementation. spsann offers many optimization criteria: for variogram estimation (number of points or point-pairs per lag distance class - PPL), trend estimation (association/correlation and marginal distribution of the covariates - ACDC), and spatial interpolation (mean squared shortest distance - MSSD). spsann also includes the mean or maximum universal kriging variance (MUKV) as an optimizing criterion, which is used when the model of spatial variation is known. PPL, ACDC and MSSD were combined (PAN) for sampling when we are ignorant about the model of spatial variation. spsann solves this multi-objective optimization problem by scaling the objective function values using their maximum absolute value or the mean value computed over 1000 random samples. Scaled values are aggregated using the weighted sum method. A graphical display allows the user to follow how the sample pattern is being perturbed during the optimization, as well as the evolution of its energy state. It is possible to start by perturbing many points and exponentially reduce the number of perturbed points. The maximum perturbation distance reduces linearly with the number of iterations. The acceptance probability also reduces exponentially with the number of iterations.
    R is memory hungry and spatial simulated annealing is a
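
    The MSSD criterion for spatial coverage, and the perturb-accept loop of spatial simulated annealing around it, can be sketched in a few lines. This is a plain-Python toy on the unit square (spsann itself is an R package); the grid size, sample size, and schedule are invented:

```python
import math
import random

random.seed(11)

# prediction grid: 10 x 10 nodes on the unit square
grid = [(i / 9.0, j / 9.0) for i in range(10) for j in range(10)]

def mssd(sample):
    # mean squared shortest distance from each grid node to the sample pattern
    return sum(min((gx - sx) ** 2 + (gy - sy) ** 2 for sx, sy in sample)
               for gx, gy in grid) / len(grid)

def jitter(p, d):
    # move a point by at most d in each coordinate, clamped to the unit square
    return (min(1.0, max(0.0, p[0] + random.uniform(-d, d))),
            min(1.0, max(0.0, p[1] + random.uniform(-d, d))))

sample = [(random.random(), random.random()) for _ in range(5)]
cost = mssd(sample)
initial_cost = cost
best, best_cost = sample[:], cost
t = 0.05
for it in range(1500):
    d = max(0.02, 0.2 * (1.0 - it / 1500.0))  # perturbation distance shrinks linearly
    i = random.randrange(len(sample))
    cand = sample[:]
    cand[i] = jitter(sample[i], d)
    c = mssd(cand)
    if c < cost or random.random() < math.exp((cost - c) / t):
        sample, cost = cand, c
        if cost < best_cost:
            best, best_cost = sample[:], cost
    t *= 0.997
```

    Minimising MSSD spreads the sample points evenly over the grid, which is the coverage objective spsann uses for spatial interpolation designs.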

  15. Geometric Optimization of Thermo-electric Coolers Using Simulated Annealing

    International Nuclear Information System (INIS)

    Khanh, D V K; Vasant, P M; Elamvazuthi, I; Dieu, V N

    2015-01-01

    The field of thermo-electric coolers (TECs) has grown drastically in recent years. In extreme environments such as thermal energy and gas drilling operations, a TEC is an effective cooling mechanism for instruments. However, limitations such as the relatively low energy-conversion efficiency and the ability to dissipate only a limited heat flux may seriously shorten the lifetime and degrade the performance of the instrument. Until now, much research has been conducted to improve the efficiency of TECs. The material parameters are the most significant, but they are restricted by currently available materials and module fabrication technologies. Therefore, the main objective of finding the optimal TEC design is to define a set of design parameters. In this paper, a new method of optimizing the dimensions of TECs using simulated annealing (SA) to maximize the rate of refrigeration (ROR) is proposed. Equality and inequality constraints were taken into consideration. This work reveals that SA shows better performance than Cheng's work. (paper)

  16. Memoryless cooperative graph search based on the simulated annealing algorithm

    International Nuclear Information System (INIS)

    Hou Jian; Yan Gang-Feng; Fan Zhen

    2011-01-01

    We have studied the problem of reaching a globally optimal segment in a graph-like environment with a single autonomous mobile agent or a group of them. First, two efficient simulated-annealing-like algorithms are given for a single agent to solve the problem in a partially known environment and in an unknown environment, respectively. We show that under both proposed control strategies the agent will eventually converge to a globally optimal segment with probability 1. Second, we use multi-agent searching to simultaneously reduce the computational complexity and accelerate convergence, based on the algorithms given for a single agent. By exploiting graph partitioning, a gossip-consensus-based scheme is presented to update the key parameter, the radius of the graph, ensuring that the agents spend much less time finding a globally optimal segment. (interdisciplinary physics and related areas of science and technology)

  17. Simulated annealing and joint manufacturing batch-sizing

    Directory of Open Access Journals (Sweden)

    Sarker Ruhul

    2003-01-01

    Full Text Available We address an important problem of a manufacturing system. The system procures raw materials from outside suppliers in a lot and processes them to produce finished goods. The paper proposes an ordering policy for raw materials to meet the requirements of a production facility which, in turn, must deliver finished products demanded by external buyers at fixed time intervals. First, a general cost model is developed considering both raw materials and finished products. This model is then used in a simulated annealing approach to determine an optimal ordering policy for the procurement of raw materials, together with the manufacturing batch size, that minimizes the total cost of meeting customer demands in time. The solutions obtained were compared with those of traditional approaches. Numerical examples are presented.

  18. Simulated annealing in adaptive optics for imaging the eye retina

    International Nuclear Information System (INIS)

    Zommer, S.; Adler, J.; Lipson, S. G.; Ribak, E.

    2004-01-01

    Full Text:Adaptive optics is a method designed to correct deformed images in real time. Once the distorted wavefront is known, a deformable mirror is used to compensate the aberrations and return the wavefront to a plane wave. This study concentrates on methods that omit wave front sensing from the reconstruction process. Such methods use stochastic algorithms to find the extremum of a certain sharpness function, thereby correcting the image without any information on the wavefront. Theoretical work [l] has shown that the optical problem can be mapped onto a model for crystal roughening. The main algorithm applied is simulated annealing. We present a first hardware realization of this algorithm in an adaptive optics system designed to image the retina of the human eye

  19. A simulated annealing approach for redesigning a warehouse network problem

    Science.gov (United States)

    Khairuddin, Rozieana; Marlizawati Zainuddin, Zaitul; Jiun, Gan Jia

    2017-09-01

    Nowadays, several companies consider downsizing their distribution networks in ways that involve consolidation or phase-out of some of their current warehousing facilities, owing to increasing competition, mounting cost pressure, and the opportunity to exploit economies of scale. Consequently, changes in the economic situation after a certain period of time require an adjustment of the network model in order to obtain the optimal cost under current economic conditions. This paper develops a mixed-integer linear programming model for a two-echelon warehouse network redesign problem with a capacitated plant and uncapacitated warehouses. The main contribution of this study is the consideration of capacity constraints for existing warehouses. A simulated annealing algorithm is proposed to tackle the model. Numerical results showed that the proposed model and solution method are practical.

  20. A simulated annealing algorithm for the robust decomposition of temporal horizons in production planning problems

    Directory of Open Access Journals (Sweden)

    José Fidel Torres Delgado

    2007-06-01

    Full Text Available The problem of robust decomposition of temporal horizons in production planning was first introduced in [1]. Later, in [2], Torres suggested starting from an integer solution found by dynamic programming and then improving it with a simulated annealing algorithm. According to [2], more needs to be known about the ability of this algorithm to improve the initial solution and about the impact of its control parameters on the quality of the solutions found. In this work we develop this proposal and analyze in depth the ability of the simulated annealing algorithm to improve the initial solution. The computational experiments conducted determined that the cooling scheme and the cooling rate have a significant effect on the quality of the final solution. It was also established that the solution found depends strongly on the characteristics of the operations plan, with better solutions obtained for plans with shorter time horizons.

  1. PERFORMANCE COMPARISON OF GENETIC ALGORITHM AND SIMULATED ANNEALING FOR MULTIPLE-OBJECTIVE FLOWSHOP SCHEDULING

    Directory of Open Access Journals (Sweden)

    I Gede Agus Widyadana

    2002-01-01

    Full Text Available This research compares a genetic algorithm and simulated annealing in terms of performance and processing time. The main purpose is to assess the ability of both algorithms to minimize makespan and total flowtime in a particular flowshop system. The performance of the algorithms is evaluated by simulating problems with varying combinations of jobs and machines. The results show that simulated annealing outperforms the genetic algorithm by up to 90%. The genetic algorithm scored better only in processing time, but the observed trend suggests that for problems with many jobs and many machines, simulated annealing will run faster than the genetic algorithm. Keywords: genetic algorithm, simulated annealing, flow shop, makespan, total flowtime.

  2. Finding a Hadamard matrix by simulated annealing of spin vectors

    Science.gov (United States)

    Bayu Suksmono, Andriyan

    2017-05-01

    Reformulating a combinatorial problem as the optimization of a statistical-mechanics system enables finding a better solution using heuristics derived from a physical process, such as simulated annealing (SA). In this paper, we present a Hadamard matrix (H-matrix) searching method based on SA on an Ising model. By equivalence, an H-matrix can be converted into a seminormalized Hadamard (SH) matrix, whose first column is the unit vector and whose remaining columns, called SH-vectors, have equal numbers of -1 and +1 entries. We define SH spin vectors to represent the SH vectors; they play a role similar to the spins in an Ising model. The topology of the lattice is generalized to a graph whose edges represent orthogonality relationships among the SH spin vectors. Starting from a randomly generated quasi H-matrix Q, a matrix similar to the SH-matrix but without the orthogonality requirement, we perform the SA. Transitions of Q are conducted by random exchanges of {+, -} spin pairs within the SH spin vectors, following the Metropolis update rule. As it transitions toward zero energy, the Q-matrix evolves along a Markov chain toward an orthogonal matrix, at which point an H-matrix has been found. We demonstrate the capability of the proposed method to find some low-order H-matrices, including ones that cannot be trivially constructed by the Sylvester method.
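
    The annealing loop described above can be sketched in a few lines. This is a minimal illustration, not the author's implementation: the energy function (sum of squared column inner products, which vanishes exactly when the matrix is Hadamard), the move set and the cooling parameters are all illustrative assumptions.

```python
import math
import random

def energy(H):
    # Sum of squared inner products over distinct column pairs;
    # zero exactly when all columns are mutually orthogonal.
    n = len(H)
    return sum(sum(H[i][j] * H[i][k] for i in range(n)) ** 2
               for j in range(n) for k in range(j + 1, n))

def sa_hadamard(n=4, t0=8.0, cooling=0.95, t_min=0.1, steps=4000, seed=1):
    rng = random.Random(seed)
    # Seminormalized start: first column all +1, remaining columns balanced.
    H = [[1] * n for _ in range(n)]
    for j in range(1, n):
        col = [1] * (n // 2) + [-1] * (n // 2)
        rng.shuffle(col)
        for i in range(n):
            H[i][j] = col[i]
    t, e = t0, energy(H)
    for _ in range(steps):
        if e == 0:
            break                         # orthogonal: an H-matrix is found
        j = rng.randrange(1, n)           # pick a non-first column
        a = rng.choice([i for i in range(n) if H[i][j] == 1])
        b = rng.choice([i for i in range(n) if H[i][j] == -1])
        H[a][j], H[b][j] = -1, 1          # exchange a {+, -} spin pair
        e_new = energy(H)
        # Metropolis rule: always accept downhill, sometimes accept uphill
        if e_new <= e or rng.random() < math.exp(-(e_new - e) / t):
            e = e_new
        else:
            H[a][j], H[b][j] = 1, -1      # revert the exchange
        t = max(t * cooling, t_min)
    return H, e
```

    Because the spin-pair exchange preserves the balance of each column, the first column stays orthogonal to all others throughout the search, mirroring the seminormalized form used in the paper.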

  3. Simulated Annealing Technique for Routing in a Rectangular Mesh Network

    Directory of Open Access Journals (Sweden)

    Noraziah Adzhar

    2014-01-01

    Full Text Available In the automatic design of printed circuit boards (PCBs), the phase following cell placement is routing. Routing is a notoriously difficult problem; even the simplest routing problem, consisting of a set of two-pin nets, is known to be NP-complete. In this research, the routing region is first tessellated into a uniform Nx×Ny array of square cells. The ultimate goal of a routing problem is to achieve complete automatic routing with minimal need for manual intervention, so a shortest path needs to be established for every connection. While the classical Dijkstra algorithm is guaranteed to find the shortest path for a single net, each routed net becomes an obstacle for later paths. This complicates the routing of later nets, making their routes longer than optimal or sometimes impossible to complete. Today's sequential routers often apply heuristic methods to further refine the solution: all nets are rerouted in different orders to improve the quality of the routing. This motivates us to apply simulated annealing, a metaheuristic method, to our routing model to produce better candidate sequences.
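
    As a toy illustration of annealing over routing sequences (not the paper's routing model), suppose a hypothetical matrix `penalty[a][b]` estimates the extra wirelength net `b` incurs when it is routed after net `a`; simulated annealing over net orderings can then search for a low-cost sequence. The cost model and all parameters below are assumptions made for illustration.

```python
import math
import random

def order_cost(seq, penalty):
    # penalty[a][b]: extra wirelength net b suffers if routed after net a
    return sum(penalty[a][b]
               for i, a in enumerate(seq)
               for b in seq[i + 1:])

def sa_order(penalty, t0=10.0, cooling=0.995, steps=5000, seed=7):
    rng = random.Random(seed)
    n = len(penalty)
    seq = list(range(n))
    rng.shuffle(seq)
    cost = order_cost(seq, penalty)
    best, best_cost = seq[:], cost
    t = t0
    for _ in range(steps):
        i, j = rng.sample(range(n), 2)        # propose: swap two nets
        seq[i], seq[j] = seq[j], seq[i]
        c = order_cost(seq, penalty)
        if c <= cost or rng.random() < math.exp(-(c - cost) / t):
            cost = c                          # Metropolis accept
            if c < best_cost:
                best, best_cost = seq[:], c
        else:
            seq[i], seq[j] = seq[j], seq[i]   # reject: undo the swap
        t *= cooling
    return best, best_cost
```

    In a real router, `order_cost` would be replaced by actually rerouting the nets in the proposed order and measuring total wirelength or completion rate.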

  4. Simulation of short-term annealing of displacement cascades in FCC metals

    International Nuclear Information System (INIS)

    Heinisch, H.L.; Doran, D.G.; Schwartz, D.M.

    1980-01-01

    Computer models have been developed for the simulation of high energy displacement cascades. The objective is the generation of defect production functions for use in correlation analysis of radiation effects in fusion reactor materials. In particular, the stochastic cascade annealing simulation code SCAS has been developed and used to model the short-term annealing behavior of simulated cascades in FCC metals. The code is fast enough to make annealing of high energy cascades practical. Sets of cascades from 5 keV to 100 keV in copper were generated by the binary collision code MARLOWE.

  5. Differential evolution and simulated annealing algorithms for mechanical systems design

    Directory of Open Access Journals (Sweden)

    H. Saruhan

    2014-09-01

    Full Text Available In this study, two nature-inspired algorithms, Differential Evolution (DE) and Simulated Annealing (SA), are utilized to seek a global optimum solution for the weight of a ball bearings link system assembly with constraints and mixed design variables. The Genetic Algorithm (GA) and the Evolution Strategy (ES) serve as references for the examination and validation of the DE and the SA. The main purpose is to minimize the weight of an assembly composed of a shaft and two ball bearings. The ball bearings link system is used extensively in many machinery applications, and designers pay great attention to it because of its significant industrial importance. The problem is complex and time consuming due to the mixed design variables and the inequality constraints imposed on the objective function. The results showed that the DE and the SA converged reliably on the global optimum solution, so their application to mechanical system design can be very useful in many real-world design problems. Moreover, the comparison confirms the effectiveness and superiority of the DE over the other algorithms (the SA, the GA, and the ES) in terms of solution quality. A ball bearings link system assembly weight of 634,099 gr was obtained using the DE, while 671,616 gr, 728213.8 gr, and 729445.5 gr were obtained using the SA, the ES, and the GA, respectively.

  6. Sensitivity study on hydraulic well testing inversion using simulated annealing

    International Nuclear Information System (INIS)

    Nakao, Shinsuke; Najita, J.; Karasaki, Kenzi

    1997-11-01

    For environmental remediation, management of nuclear waste disposal, or geothermal reservoir engineering, it is very important to evaluate the permeabilities, spacing, and sizes of the subsurface fractures which control ground water flow. Cluster variable aperture (CVA) simulated annealing has been used as an inversion technique to construct fluid flow models of fractured formations based on transient pressure data from hydraulic tests. A two-dimensional fracture network system is represented as a filled regular lattice of fracture elements. The algorithm iteratively changes the apertures of clusters of fracture elements, chosen randomly from a list of discrete apertures, to improve the match to observed pressure transients. The size of the clusters is held constant throughout the iterations. Sensitivity studies using simple fracture models with eight wells show that, in general, it is necessary to conduct interference tests using at least three different wells as pumping wells in order to reconstruct a fracture network with a transmissivity contrast of one order of magnitude, particularly when the cluster size is not known a priori. Because hydraulic inversion is inherently non-unique, it is important to utilize additional information. The authors investigated the relationship between the scale of heterogeneity and the optimum cluster size (and its shape) to enhance the reliability and convergence of the inversion. It appears that a cluster size corresponding to about 20--40% of the practical range of the spatial correlation is optimal. Inversion results for the Raymond test site data are also presented, and the practical range of spatial correlation is evaluated to be about 5--10 m from the optimal cluster size in the inversion.

  7. Sensitivity study on hydraulic well testing inversion using simulated annealing

    Energy Technology Data Exchange (ETDEWEB)

    Nakao, Shinsuke; Najita, J.; Karasaki, Kenzi

    1997-11-01

    For environmental remediation, management of nuclear waste disposal, or geothermal reservoir engineering, it is very important to evaluate the permeabilities, spacing, and sizes of the subsurface fractures which control ground water flow. Cluster variable aperture (CVA) simulated annealing has been used as an inversion technique to construct fluid flow models of fractured formations based on transient pressure data from hydraulic tests. A two-dimensional fracture network system is represented as a filled regular lattice of fracture elements. The algorithm iteratively changes the apertures of clusters of fracture elements, chosen randomly from a list of discrete apertures, to improve the match to observed pressure transients. The size of the clusters is held constant throughout the iterations. Sensitivity studies using simple fracture models with eight wells show that, in general, it is necessary to conduct interference tests using at least three different wells as pumping wells in order to reconstruct a fracture network with a transmissivity contrast of one order of magnitude, particularly when the cluster size is not known a priori. Because hydraulic inversion is inherently non-unique, it is important to utilize additional information. The authors investigated the relationship between the scale of heterogeneity and the optimum cluster size (and its shape) to enhance the reliability and convergence of the inversion. It appears that a cluster size corresponding to about 20--40% of the practical range of the spatial correlation is optimal. Inversion results for the Raymond test site data are also presented, and the practical range of spatial correlation is evaluated to be about 5--10 m from the optimal cluster size in the inversion.

  8. New technique for global solar radiation forecasting using simulated annealing and genetic algorithms

    International Nuclear Information System (INIS)

    Tolabi, H.B.; Ayob, S.M.

    2014-01-01

    In this paper, a novel approach based on the simulated annealing algorithm, a meta-heuristic method, is implemented in MATLAB to estimate the monthly average daily global solar radiation on a horizontal surface for six cities of Iran with different climates. A search method based on a genetic algorithm is applied to accelerate problem solving. Results show that simulated annealing combined with a genetic algorithm search is a suitable method for finding the global solar radiation. (author)
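
    The abstract does not give the underlying regression model. As a hedged sketch, simulated annealing can fit the two coefficients of an Angström-Prescott-type relation, H/H0 = a + b*(S/S0), linking sunshine fraction to clearness index; the data, parameter names and annealing settings below are illustrative assumptions, not the paper's.

```python
import math
import random

def sse(a, b, data):
    # sum of squared residuals of the regression H/H0 = a + b*(S/S0)
    return sum((h - (a + b * s)) ** 2 for s, h in data)

def sa_fit(data, t0=1.0, cooling=0.999, steps=30000, seed=0):
    rng = random.Random(seed)
    a, b = rng.uniform(0.0, 1.0), rng.uniform(0.0, 1.0)
    cost = sse(a, b, data)
    best = (a, b, cost)
    t = t0
    for _ in range(steps):
        sigma = 0.02 if t > 0.01 else 0.005   # shrink moves as we cool
        a2 = a + rng.gauss(0.0, sigma)
        b2 = b + rng.gauss(0.0, sigma)
        c2 = sse(a2, b2, data)
        if c2 <= cost or rng.random() < math.exp(-(c2 - cost) / t):
            a, b, cost = a2, b2, c2           # Metropolis accept
            if cost < best[2]:
                best = (a, b, cost)
        t *= cooling
    return best
```

    For a linear model like this, least squares would of course be exact; the value of a stochastic search appears when the radiation model is nonlinear in its parameters, as in the paper's setting.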

  9. ACTIVITY-BASED COSTING AND SIMULATED ANNEALING FOR ROUTE SEARCHING IN FLEXIBLE MANUFACTURING SYSTEMS

    Directory of Open Access Journals (Sweden)

    Gregorius Satia Budhi

    2003-01-01

    Full Text Available A Flexible Manufacturing System (FMS) is a manufacturing system formed from several numerically controlled machines combined with a material handling system, so that different jobs can be processed by different machine sequences. FMS combines the high productivity of the transfer line with the flexibility of the job shop manufacturing system. In this research, an Activity-Based Costing (ABC) approach is used as the weight in searching for the operation route on the proper machine, so that total production cost can be optimized. The search method used in this experiment is simulated annealing, a variant of the hill climbing search method. The ideal operation time to process a part is used as the annealing schedule. Empirical tests show that using the ABC approach with simulated annealing for route searching can optimize total production cost, and that using the ideal operation time as the annealing schedule keeps processing time well under control.

  10. SAGRAD: A Program for Neural Network Training with Simulated Annealing and the Conjugate Gradient Method.

    Science.gov (United States)

    Bernal, Javier; Torres-Jimenez, Jose

    2015-01-01

    SAGRAD (Simulated Annealing GRADient), a Fortran 77 program for computing neural networks for classification using batch learning, is discussed. Neural network training in SAGRAD is based on a combination of simulated annealing and Møller's scaled conjugate gradient algorithm, the latter a variation of the traditional conjugate gradient method, better suited for the nonquadratic nature of neural networks. Different aspects of the implementation of the training process in SAGRAD are discussed, such as the efficient computation of gradients and multiplication of vectors by Hessian matrices that are required by Møller's algorithm; the (re)initialization of weights with simulated annealing required to (re)start Møller's algorithm the first time and each time thereafter that it shows insufficient progress in reaching a possibly local minimum; and the use of simulated annealing when Møller's algorithm, after possibly making considerable progress, becomes stuck at a local minimum or flat area of weight space. Outlines of the scaled conjugate gradient algorithm, the simulated annealing procedure and the training process used in SAGRAD are presented together with results from running SAGRAD on two examples of training data.
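
    SAGRAD's division of labor (simulated annealing to (re)initialize weights, a gradient method to refine them) can be sketched on a toy one-dimensional multimodal function. This is an illustrative analogue, not SAGRAD's Fortran implementation: the objective, the plain gradient descent standing in for Møller's scaled conjugate gradient, and all parameters are assumptions.

```python
import math
import random

def f(x):
    # toy multimodal objective standing in for a network's error surface
    return 0.1 * x * x + math.sin(3.0 * x)

def grad(x):
    return 0.2 * x + 3.0 * math.cos(3.0 * x)

def gradient_descent(x, lr=0.02, iters=300):
    # refinement stage: deterministic gradient steps into the nearest minimum
    for _ in range(iters):
        x -= lr * grad(x)
    return x

def sagrad_like(seed=0, rounds=20, t0=2.0, cooling=0.7):
    # SA proposes fresh starting points; gradient descent refines each one.
    rng = random.Random(seed)
    x = gradient_descent(rng.uniform(-3.0, 3.0))
    best, t = x, t0
    for _ in range(rounds):
        cand = gradient_descent(x + rng.gauss(0.0, 2.0))
        # Metropolis rule decides whether to move to the refined candidate
        if f(cand) <= f(x) or rng.random() < math.exp(-(f(cand) - f(x)) / t):
            x = cand
        if f(x) < f(best):
            best = x
        t *= cooling
    return best
```

    Gradient descent alone stops in whichever local minimum its start point selects; the annealed restarts let the search escape such minima, which is the role simulated annealing plays in SAGRAD.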

  11. Simulated Stochastic Approximation Annealing for Global Optimization With a Square-Root Cooling Schedule

    KAUST Repository

    Liang, Faming

    2014-04-03

    Simulated annealing has been widely used in the solution of optimization problems. As known by many researchers, the global optima cannot be guaranteed to be located by simulated annealing unless a logarithmic cooling schedule is used. However, the logarithmic cooling schedule is so slow that no one can afford to use this much CPU time. This article proposes a new stochastic optimization algorithm, the so-called simulated stochastic approximation annealing algorithm, which is a combination of simulated annealing and the stochastic approximation Monte Carlo algorithm. Under the framework of stochastic approximation, it is shown that the new algorithm can work with a cooling schedule in which the temperature can decrease much faster than in the logarithmic cooling schedule, for example, a square-root cooling schedule, while guaranteeing the global optima to be reached when the temperature tends to zero. The new algorithm has been tested on a few benchmark optimization problems, including feed-forward neural network training and protein-folding. The numerical results indicate that the new algorithm can significantly outperform simulated annealing and other competitors. Supplementary materials for this article are available online.
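
    To see why the faster schedule matters in practice, compare how many iterations each schedule needs to cool below a given temperature; the constants and target below are arbitrary illustrative choices.

```python
import math

def t_log(k, c=1.0):
    # classical logarithmic schedule with a convergence guarantee for plain SA
    return c / math.log(k + 1)

def t_sqrt(k, c=1.0):
    # square-root schedule usable under the stochastic-approximation framework
    return c / math.sqrt(k)

def steps_to_reach(target, schedule):
    # doubling search for the first power of two where the schedule
    # has cooled to the target temperature
    k = 1
    while schedule(k) > target:
        k *= 2
        if k > 10 ** 9:          # give up: effectively unreachable
            return None
    return k
```

    With c = 1, the square-root schedule reaches T = 0.05 within a few hundred iterations, while the logarithmic schedule needs on the order of e^20 (hundreds of millions) of iterations, which is the CPU-time problem the abstract describes.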

  12. New heating schedule in hydrogen annealing furnace based on process simulation for less energy consumption

    International Nuclear Information System (INIS)

    Saboonchi, Ahmad; Hassanpour, Saeid; Abbasi, Shahram

    2008-01-01

    Cold rolled steel coils are annealed in batch furnaces to obtain desirable mechanical properties. Annealing operations involve heating and cooling cycles which take a long time owing to the high weight of the coils being annealed. To reduce annealing time, a simulation code was developed that is capable of evaluating more effective schedules for annealing coils during the heating process. The code can also accurately determine the furnace turn-off time for different coil weights and charge dimensions. After studying many heating schedules and considering the heat transfer mechanisms in the annealing furnace, a new schedule with the most advantages was selected as the new operating condition in the hydrogen annealing plant. All furnaces were adjusted to the new heating schedule after experiments had been carried out to verify the accuracy of the code and the fitness of the new operating condition. Comparison of similar yields of cold rolled coils over two months revealed that, under the new heating schedule, the specific energy consumption of the furnaces decreased by 11%, the heating cycle time by 16%, and the hydrogen consumption by 14%.

  13. New heating schedule in hydrogen annealing furnace based on process simulation for less energy consumption

    Energy Technology Data Exchange (ETDEWEB)

    Saboonchi, Ahmad [Department of Mechanical Engineering, Isfahan University of Technology, Isfahan 84154 (Iran); Hassanpour, Saeid [Rayan Tahlil Sepahan Co., Isfahan Science and Technology Town, Isfahan 84155 (Iran); Abbasi, Shahram [R and D Department, Mobarakeh Steel Complex, Isfahan (Iran)

    2008-11-15

    Cold rolled steel coils are annealed in batch furnaces to obtain desirable mechanical properties. Annealing operations involve heating and cooling cycles which take a long time owing to the high weight of the coils being annealed. To reduce annealing time, a simulation code was developed that is capable of evaluating more effective schedules for annealing coils during the heating process. The code can also accurately determine the furnace turn-off time for different coil weights and charge dimensions. After studying many heating schedules and considering the heat transfer mechanisms in the annealing furnace, a new schedule with the most advantages was selected as the new operating condition in the hydrogen annealing plant. All furnaces were adjusted to the new heating schedule after experiments had been carried out to verify the accuracy of the code and the fitness of the new operating condition. Comparison of similar yields of cold rolled coils over two months revealed that, under the new heating schedule, the specific energy consumption of the furnaces decreased by 11%, the heating cycle time by 16%, and the hydrogen consumption by 14%. (author)

  14. Simulation, hardware implementation and control of a multilevel inverter with simulated annealing algorithm

    Directory of Open Access Journals (Sweden)

    Fayçal Chabni

    2017-09-01

    Full Text Available Harmonic pollution is a very common issue in power electronics; harmonics can cause multiple problems for power converters and electrical loads alike. This paper introduces a modulation method called selective harmonic elimination pulse width modulation (SHEPWM), which allows the elimination of harmonics of specific orders while also controlling the amplitude of the fundamental component of the output voltage. In this work, the SHEPWM strategy is applied to a five-level cascade inverter. The objective of this study is to demonstrate the full control provided by SHEPWM over any rank of harmonics, using the simulated annealing optimization algorithm, and to regulate the amplitude of the fundamental component at any desired value. Simulation and experimental results are presented.
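
    A sketch of how simulated annealing can solve the SHE equations for a five-level (two-cell) cascaded inverter: with quarter-wave symmetry, harmonic n of the stepped waveform is proportional to cos(n*a1) + cos(n*a2), so we ask the fundamental to hit a target m while the 5th harmonic vanishes. The target value, move sizes and cooling schedule are illustrative assumptions, not the paper's settings.

```python
import math
import random

def cost(a1, a2, m):
    # quarter-wave symmetric 5-level waveform: harmonic n ~ cos(n*a1) + cos(n*a2)
    fund = math.cos(a1) + math.cos(a2) - m      # fundamental vs. target
    h5 = math.cos(5 * a1) + math.cos(5 * a2)    # 5th harmonic to eliminate
    return fund * fund + h5 * h5

def sa_angles(m=1.2, t0=1.0, cooling=0.999, steps=30000, seed=4):
    rng = random.Random(seed)
    a1, a2 = 0.3, 1.0
    c = cost(a1, a2, m)
    best = (a1, a2, c)
    t = t0
    for _ in range(steps):
        s = 0.1 if t > 0.01 else 0.01           # shrink moves as we cool
        b1 = min(max(a1 + rng.gauss(0.0, s), 0.0), math.pi / 2)
        b2 = min(max(a2 + rng.gauss(0.0, s), 0.0), math.pi / 2)
        if b1 >= b2:
            continue                            # keep the angle ordering
        c2 = cost(b1, b2, m)
        if c2 <= c or rng.random() < math.exp(-(c2 - c) / t):
            a1, a2, c = b1, b2, c2              # Metropolis accept
            if c < best[2]:
                best = (a1, a2, c)
        t *= cooling
    return best
```

    The returned angles a1 < a2 (in radians, first quarter cycle) drive the two H-bridge cells; adding more cells adds one cosine term and one controllable harmonic per cell.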

  15. Simulation of the diffusion of implanted impurities in silicon structures at the rapid thermal annealing

    International Nuclear Information System (INIS)

    Komarov, F.F.; Komarov, A.F.; Mironov, A.M.; Makarevich, Yu.V.; Miskevich, S.A.; Zayats, G.M.

    2011-01-01

    Physical and mathematical models and numerical simulation of the diffusion of implanted impurities during rapid thermal treatment of silicon structures are discussed. The calculated results agree with experimental results to sufficient accuracy. A simulation software system has been developed and integrated into the ATHENA simulation system from Silvaco Inc. This program can simulate the low-energy implantation of B, BF2, P, As, Sb, and C ions into silicon structures and the subsequent rapid thermal annealing. (authors)

  16. Annealing of ion irradiated high Tc Josephson junctions studied by numerical simulations

    International Nuclear Information System (INIS)

    Sirena, M.; Matzen, S.; Bergeal, N.; Lesueur, J.; Faini, G.; Bernard, R.; Briatico, J.; Crete, D. G.

    2009-01-01

    Recently, annealing of ion irradiated high Tc Josephson junctions (JJs) has been studied experimentally with a view to improving their reproducibility. Here we present numerical simulations, based on random walk and Monte Carlo calculations, of the evolution of JJ characteristics such as the transition temperature Tc' and its spread ΔTc', and compare them with experimental results on junctions irradiated with 100 and 150 keV oxygen ions and annealed at low temperatures (below 80 deg. C). We have successfully used a vacancy-interstitial annihilation mechanism to describe the evolution of Tc' and the homogeneity of a JJ array, analyzing the evolution of the mean defect density and its distribution width. Annealing first increases the spread in Tc' for short annealing times, owing to the stochastic nature of the process, but then tends to reduce it for longer times, which is interesting for technological applications.

  17. Optimization of pressurized water reactor shuffling by simulated annealing with heuristics

    International Nuclear Information System (INIS)

    Stevens, J.G.; Smith, K.S.; Rempe, K.R.; Downar, T.J.

    1995-01-01

    Simulated-annealing optimization of reactor core loading patterns is implemented with support for design heuristics during candidate pattern generation. The SIMAN optimization module uses the advanced nodal method of SIMULATE-3 and the full cross-section detail of CASMO-3 to evaluate accurately the neutronic performance of each candidate, resulting in high-quality patterns. The use of heuristics within simulated annealing is explored. Heuristics improve the consistency of optimization results for both fast- and slow-annealing runs with no penalty from the exclusion of unusual candidates. Thus, the heuristic application of designer judgment during automated pattern generation is shown to be effective. The capability of the SIMAN module to find and evaluate families of loading patterns that satisfy design constraints and have good objective performance within practical run times is demonstrated. The use of automated evaluations of successive cycles to explore multicycle effects of design decisions is discussed

  18. Simulated annealing to handle energy and ancillary services joint management considering electric vehicles

    DEFF Research Database (Denmark)

    Sousa, Tiago M; Soares, Tiago; Morais, Hugo

    2016-01-01

    The massive use of distributed generation and electric vehicles will lead to a more complex management of the power system, requiring new approaches in the field of optimal resource scheduling. Electric vehicles with vehicle-to-grid capability can help aggregator players mitigate the intermittency of renewable sources and procure ancillary services. In this paper, an energy and ancillary services joint management model is proposed. A simulated annealing approach is used to solve the joint management for the following day, considering the minimization of the aggregator's total operation costs. The case study considers a distribution network with 33 buses, 66 distributed generators and 2000 electric vehicles. The proposed simulated annealing is matched against a deterministic approach, allowing an effective and efficient comparison.

  19. Optimization of the energy production for the Baghdara hydropower plant in Afghanistan using simulated annealing

    Energy Technology Data Exchange (ETDEWEB)

    Ayros, E.; Hildebrandt, H.; Peissner, K. [Fichtner GmbH und Co. KG, Stuttgart (Germany). Wasserbau und Wasserkraftwerke; Bardossy, A. [Stuttgart Univ. (Germany). Inst. fuer Wasserbau

    2008-07-01

    Simulated annealing (SA) is an optimization method, analogous to thermodynamic annealing, that offers a new alternative for optimizing the energy production of hydropower systems with storage capabilities. The SA algorithm is presented here and applied to maximize the energy production of the Baghdara hydropower plant in Afghanistan. The results were also compared with those of a non-linear programming (NLP) optimization method. (orig.)

  20. Instantons in Quantum Annealing: Thermally Assisted Tunneling Vs Quantum Monte Carlo Simulations

    Science.gov (United States)

    Jiang, Zhang; Smelyanskiy, Vadim N.; Boixo, Sergio; Isakov, Sergei V.; Neven, Hartmut; Mazzola, Guglielmo; Troyer, Matthias

    2015-01-01

    A recent numerical result from Google (arXiv:1512.02206) suggested that the D-Wave quantum annealer may have an asymptotic speed-up over simulated annealing; however, the asymptotic advantage disappears when it is compared to quantum Monte Carlo (a classical algorithm despite its name). We show analytically that the asymptotic scaling of quantum tunneling is exactly the same as the escape rate in quantum Monte Carlo for a class of problems. Thus, the Google result can be explained within our framework. We also find that the transition state in quantum Monte Carlo corresponds to the instanton solution in quantum tunneling problems, which is observed in numerical simulations.

  1. Ideal versus real: simulated annealing of experimentally derived and geometric platinum nanoparticles

    Science.gov (United States)

    Ellaby, Tom; Aarons, Jolyon; Varambhia, Aakash; Jones, Lewys; Nellist, Peter; Ozkaya, Dogan; Sarwar, Misbah; Thompsett, David; Skylaris, Chris-Kriton

    2018-04-01

    Platinum nanoparticles find significant use as catalysts in industrial applications such as fuel cells. Research into their design has focussed heavily on nanoparticle size and shape, as they greatly influence activity. Using high throughput, high precision electron microscopy, the structures of commercially available Pt catalysts have been determined, and we have used classical and quantum atomistic simulations to examine and compare them with geometric cuboctahedral and truncated octahedral structures. A simulated annealing procedure was used both to explore the potential energy surface at different temperatures and to assess the effect that annealing would have on the catalytic activity of nanoparticles with different geometries and sizes. The differences in response to annealing between the real and geometric nanoparticles are discussed in terms of thermal stability, coordination number and the proportion of optimal binding sites on the surface of the nanoparticles. We find that annealing both experimental and geometric nanoparticles results in structures that appear similar in shape and predicted activity, using oxygen adsorption as a measure. Annealing is predicted to increase the catalytic activity in all cases except the truncated octahedra, where it has the opposite effect. As our simulations have been performed with a classical force field, we also assess its suitability for describing the potential energy of such nanoparticles by comparing with large-scale density functional theory calculations.

  2. Multiphase Simulated Annealing Based on Boltzmann and Bose-Einstein Distribution Applied to Protein Folding Problem.

    Science.gov (United States)

    Frausto-Solis, Juan; Liñán-García, Ernesto; Sánchez-Hernández, Juan Paulo; González-Barbosa, J Javier; González-Flores, Carlos; Castilla-Valdez, Guadalupe

    2016-01-01

    A new hybrid Multiphase Simulated Annealing Algorithm using Boltzmann and Bose-Einstein distributions (MPSABBE) is proposed. MPSABBE was designed for solving Protein Folding Problem (PFP) instances. This new approach has four phases: (i) Multiquenching Phase (MQP), (ii) Boltzmann Annealing Phase (BAP), (iii) Bose-Einstein Annealing Phase (BEAP), and (iv) Dynamical Equilibrium Phase (DEP). BAP and BEAP are simulated annealing search procedures based on the Boltzmann and Bose-Einstein distributions, respectively. DEP is also a simulated annealing search procedure, applied at the final temperature; it can be seen as a second Bose-Einstein phase. MQP is a search process that ranges from extremely high to high temperatures, applying a very fast cooling process, and is not very restrictive in accepting new solutions. However, BAP and BEAP range from high to low and from low to very low temperatures, respectively, and are more restrictive in accepting new solutions. DEP uses a particular heuristic to detect stochastic equilibrium by applying a least squares method during its execution. MPSABBE parameters are tuned with an analytical method, which considers the maximal and minimal deterioration of problem instances. MPSABBE was tested with several instances of PFP, showing that using both distributions is better than using only the Boltzmann distribution as in classical SA.
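    The distinguishing ingredient of such a scheme is the acceptance rule. Below is a minimal sketch of the two acceptance criteria, with the Bose-Einstein form taken as 1/(exp(ΔE/T) - 1) clamped to [0, 1] (our illustrative assumption, not the authors' exact formulation), applied to a one-dimensional multimodal test function rather than a protein model:

```python
import math
import random

def boltzmann_accept(delta, t, rng):
    # Classical Metropolis rule: accept uphill moves with probability exp(-dE/T).
    return delta <= 0 or rng.random() < math.exp(-delta / t)

def bose_einstein_accept(delta, t, rng):
    # Bose-Einstein-style rule: probability 1/(exp(dE/T) - 1), clamped to [0, 1].
    if delta <= 0:
        return True
    if delta / t > 700:  # exp() would overflow; acceptance is effectively zero
        return False
    return rng.random() < min(1.0, 1.0 / (math.exp(delta / t) - 1.0))

def anneal(f, x0, accept, t0=2.0, t_end=1e-3, steps=30000, seed=7):
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best_x, best_f = x, fx
    for k in range(steps):
        t = t0 * (t_end / t0) ** (k / steps)
        cand = x + rng.gauss(0.0, 1.0)  # proposals wide enough to hop between wells
        fc = f(cand)
        if accept(fc - fx, t, rng):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x, fx
    return best_x, best_f

# 1-D Rastrigin function: many local minima, global minimum f(0) = 0.
f = lambda x: x * x - 10.0 * math.cos(2.0 * math.pi * x) + 10.0
xb, fb = anneal(f, 4.3, boltzmann_accept)
xe, fe = anneal(f, 4.3, bose_einstein_accept)
```

    Note that the Bose-Einstein rule is more permissive for small uphill moves and sharper for large ones, which is one way to read the paper's motivation for mixing the two phases.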

  3. Reconstruction of X-rays spectra of clinical linear accelerators using the generalized simulated annealing method

    International Nuclear Information System (INIS)

    Manrique, John Peter O.; Costa, Alessandro M.

    2016-01-01

    The spectral distribution of megavoltage X-rays used in radiotherapy departments is a fundamental quantity from which, in principle, all relevant information required for radiotherapy treatments can be determined. To calculate the dose delivered to patients undergoing radiation therapy, treatment planning systems (TPS) are used; these make use of convolution and superposition algorithms and require prior knowledge of the photon fluence spectrum to perform three-dimensional dose calculations, ensuring better accuracy in the tumor control probabilities while keeping the normal tissue complication probabilities low. In this work we have obtained the photon fluence spectrum of the 6 MV X-ray beam of a SIEMENS ONCOR linear accelerator, using an inverse method to reconstruct the photon spectra from transmission curves measured for different thicknesses of aluminum; the method used for reconstruction of the spectra is a stochastic technique known as generalized simulated annealing (GSA), based on the quasi-equilibrium statistics of Tsallis. To validate the reconstructed spectra we calculated the percentage depth dose (PDD) curve for the 6 MV beam, using Monte Carlo simulation with the Penelope code, and from the PDD we then calculated the beam quality index TPR20/10. (author)
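    The character of the inverse problem can be illustrated with a toy version: given transmission measurements through known absorber thicknesses, recover the spectral weights by simulated annealing. The attenuation coefficients and bin structure below are hypothetical placeholders, not the paper's data, and plain Metropolis acceptance stands in for the Tsallis-based GSA rule:

```python
import math
import random

# Hypothetical attenuation coefficients (1/cm) of aluminum for three energy bins.
MU = [0.50, 0.30, 0.20]
THICKNESS = [0.0, 1.0, 2.0, 4.0, 8.0, 12.0]  # absorber thicknesses in cm

def transmission(weights, x):
    return sum(w * math.exp(-m * x) for w, m in zip(weights, MU))

def residual(weights, data):
    return sum((transmission(weights, x) - t) ** 2 for x, t in data)

def anneal_spectrum(data, t0=0.1, t_end=1e-6, steps=40000, seed=3):
    rng = random.Random(seed)
    w = [1.0 / 3.0] * 3                    # start from a flat spectrum
    cost = residual(w, data)
    best, best_cost = list(w), cost
    for k in range(steps):
        t = t0 * (t_end / t0) ** (k / steps)
        i = rng.randrange(3)
        cand = list(w)
        cand[i] = min(1.0, max(0.0, cand[i] + rng.gauss(0.0, 0.02)))
        s = sum(cand)
        cand = [c / s for c in cand]       # keep the spectrum normalized
        c_cost = residual(cand, data)
        if c_cost < cost or rng.random() < math.exp(-(c_cost - cost) / t):
            w, cost = cand, c_cost
            if cost < best_cost:
                best, best_cost = list(w), cost
    return best, best_cost

# Synthetic "measured" transmission curve generated from a known spectrum.
true_w = [0.2, 0.5, 0.3]
data = [(x, transmission(true_w, x)) for x in THICKNESS]
w_fit, fit_cost = anneal_spectrum(data)
```

    The ill-conditioning the abstract mentions shows up here too: very different weight vectors can produce nearly identical transmission curves, which is why a robust stochastic search is attractive.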

  4. An Improved Simulated Annealing Technique for Enhanced Mobility in Smart Cities.

    Science.gov (United States)

    Amer, Hayder; Salman, Naveed; Hawes, Matthew; Chaqfeh, Moumena; Mihaylova, Lyudmila; Mayfield, Martin

    2016-06-30

    Vehicular traffic congestion is a significant problem that arises in many cities. This is due to the increasing number of vehicles that are driving on city roads of limited capacity. The vehicular congestion significantly impacts travel distance, travel time, fuel consumption and air pollution. Avoidance of traffic congestion and providing drivers with optimal paths are not trivial tasks. The key contribution of this work consists of the developed approach for dynamic calculation of optimal traffic routes. Two attributes (the average travel speed of the traffic and the roads' length) are utilized by the proposed method to find the optimal paths. The average travel speed values can be obtained from the sensors deployed in smart cities and communicated to vehicles via the Internet of Vehicles and roadside communication units. The performance of the proposed algorithm is compared to three other algorithms: the simulated annealing weighted sum, the simulated annealing technique for order preference by similarity to the ideal solution and the Dijkstra algorithm. The weighted sum and technique for order preference by similarity to the ideal solution methods are used to formulate different attributes in the simulated annealing cost function. According to the Sheffield scenario, simulation results show that the improved simulated annealing technique for order preference by similarity to the ideal solution method improves the traffic performance in the presence of congestion by an overall average of 19.22% in terms of travel time, fuel consumption and CO₂ emissions as compared to other algorithms; also, similar performance patterns were achieved for the Birmingham test scenario.
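    A minimal sketch of the idea, using plain Metropolis acceptance over randomly proposed routes in a hypothetical five-node network (edge lengths and speeds are invented, and the paper's TOPSIS aggregation of attributes is replaced here by a single travel-time objective):

```python
import math
import random

# Hypothetical road network: edge -> (length, average travel speed).
EDGES = {
    ("A", "B"): (2.0, 1.0), ("B", "D"): (2.0, 1.0),
    ("A", "C"): (3.0, 1.5), ("C", "D"): (3.0, 3.0),
    ("A", "D"): (10.0, 2.0),
}
GRAPH = {}
for (u, v), _ in EDGES.items():
    GRAPH.setdefault(u, []).append(v)
    GRAPH.setdefault(v, []).append(u)

def travel_time(path):
    total = 0.0
    for u, v in zip(path, path[1:]):
        length, speed = EDGES.get((u, v)) or EDGES[(v, u)]
        total += length / speed
    return total

def random_path(src, dst, rng):
    # Random simple path by a non-revisiting walk; retry if it dead-ends.
    while True:
        path, seen = [src], {src}
        while path[-1] != dst:
            nxt = [n for n in GRAPH[path[-1]] if n not in seen]
            if not nxt:
                break
            n = rng.choice(nxt)
            path.append(n)
            seen.add(n)
        if path[-1] == dst:
            return path

def anneal_route(src, dst, t0=2.0, t_end=0.01, steps=2000, seed=5):
    rng = random.Random(seed)
    path = random_path(src, dst, rng)
    cost = travel_time(path)
    best, best_cost = path, cost
    for k in range(steps):
        t = t0 * (t_end / t0) ** (k / steps)
        cand = random_path(src, dst, rng)   # propose a fresh random route
        c = travel_time(cand)
        if c < cost or rng.random() < math.exp(-(c - cost) / t):
            path, cost = cand, c
            if cost < best_cost:
                best, best_cost = path, cost
    return best, best_cost

route, t_min = anneal_route("A", "D")
```

    In this tiny graph the direct edge A-D is slowest despite being a single hop, and the annealer settles on the detour through C.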

  5. Defect production in simulated cascades: Cascade quenching and short-term annealing

    International Nuclear Information System (INIS)

    Heinisch, H.L.

    1983-01-01

    Defect production in displacement cascades in copper has been modeled using the MARLOWE code to generate cascades and the stochastic annealing code ALSOME to simulate cascade quenching and short-term annealing of isolated cascades. Quenching is accomplished by using exaggerated values for defect mobilities and for critical reaction distances in ALSOME for a very short time. The quenched cascades are then short-term annealed with normal parameter values. The quenching parameter values were empirically determined by comparison with results of resistivity measurements. Throughout the collisional, quenching and short-term annealing phases of cascade development, the high energy cascades continue to behave as a collection of independent lower energy lobes. For recoils above about 30 keV the total number of defects and the numbers of free defects scale with the damage energy. As the energy decreases from 30 keV, defect production varies with the changing nature of the cascade configuration, resulting in more defects per unit damage energy. The simulated annealing of a low fluence of interacting cascades revealed an interstitial shielding effect on depleted zones during Stage I recovery. (orig.)

  6. Loading pattern optimization by multi-objective simulated annealing with screening technique

    International Nuclear Information System (INIS)

    Tong, K. P.; Hyun, C. L.; Hyung, K. J.; Chang, H. K.

    2006-01-01

    This paper presents a new multi-objective function which is made up of the main objective term as well as penalty terms related to the constraints. All the terms are represented in the same functional form and the coefficient of each term is normalized so that each term has equal weighting in the subsequent simulated annealing optimization calculations. The screening technique introduced in the previous work is also adopted in order to save computer time in the 3-D neutronics evaluation of trial loading patterns. For a numerical test of the new multi-objective function in loading pattern optimization, the optimum loading patterns for the initial and the cycle 7 reload PWR core of Yonggwang Unit 4 are calculated by the simulated annealing algorithm with the screening technique. A total of 10 optimum loading patterns were obtained for the initial core through 10 independent simulated annealing optimization runs. For the cycle 7 reload core, one optimum loading pattern was obtained from a single simulated annealing optimization run. More SA optimization runs will be conducted to obtain optimum loading patterns for the cycle 7 reload core, and the results will be presented in future work. (authors)

  7. Improving Simulated Annealing by Replacing Its Variables with Game-Theoretic Utility Maximizers

    Science.gov (United States)

    Wolpert, David H.; Bandari, Esfandiar; Tumer, Kagan

    2001-01-01

    The game-theory field of COllective INtelligence (COIN) concerns the design of computer-based players engaged in a non-cooperative game so that as those players pursue their self-interests, a pre-specified global goal for the collective computational system is achieved as a side-effect. Previous implementations of COIN algorithms have outperformed conventional techniques by up to several orders of magnitude, on domains ranging from telecommunications control to optimization in congestion problems. Recent mathematical developments have revealed that these previously developed algorithms were based on only two of the three factors determining performance. Consideration of only the third factor would instead lead to conventional optimization techniques like simulated annealing that have little to do with non-cooperative games. In this paper we present an algorithm based on all three terms at once. This algorithm can be viewed as a way to modify simulated annealing by recasting it as a non-cooperative game, with each variable replaced by a player. This recasting allows us to leverage the intelligent behavior of the individual players to substantially improve the exploration step of the simulated annealing. Experiments are presented demonstrating that this recasting significantly improves simulated annealing for a model of an economic process run over an underlying small-worlds topology. Furthermore, these experiments reveal novel small-worlds phenomena, and highlight the shortcomings of conventional mechanism design in bounded rationality domains.

  8. Comparison of Lasserre's Measure-based Bounds for Polynomial Optimization to Bounds Obtained by Simulated Annealing

    NARCIS (Netherlands)

    de Klerk, Etienne; Laurent, Monique

    We consider the problem of minimizing a continuous function f over a compact set K. We compare the hierarchy of upper bounds proposed by Lasserre in [SIAM J. Optim. 21(3) (2011), pp. 864-885] to bounds that may be obtained from simulated annealing. We show that, when f is a polynomial and K a convex

  9. Inverse simulated annealing: Improvements and application to amorphous InSb

    OpenAIRE

    Los, Jan H.; Gabardi, Silvia; Bernasconi, Marco; Kühne, Thomas D.

    2014-01-01

    An improved inverse simulated annealing method is presented to determine the structure of complex disordered systems from first principles in agreement with available experimental data or desired predetermined target properties. The effectiveness of this method is demonstrated by revisiting the structure of amorphous InSb. The resulting network is mostly tetrahedral and in excellent agreement with available experimental data.

  10. A hybrid Genetic and Simulated Annealing Algorithm for Chordal Ring implementation in large-scale networks

    DEFF Research Database (Denmark)

    Riaz, M. Tahir; Gutierrez Lopez, Jose Manuel; Pedersen, Jens Myrup

    2011-01-01

    The paper presents a hybrid Genetic and Simulated Annealing algorithm for implementing Chordal Ring structure in optical backbone network. In recent years, topologies based on regular graph structures gained a lot of interest due to their good communication properties for physical topology of the...

  11. A comparison of an algorithm for automated sequential beam orientation selection (Cycle) with simulated annealing

    International Nuclear Information System (INIS)

    Woudstra, Evert; Heijmen, Ben J M; Storchi, Pascal R M

    2008-01-01

    Some time ago we developed and published a new deterministic algorithm (called Cycle) for automatic selection of beam orientations in radiotherapy. This algorithm is a plan generation process aiming at the prescribed PTV dose within hard dose and dose-volume constraints. The algorithm allows a large number of input orientations to be used and selects only the most efficient orientations that survive the selection process. Efficiency is determined by a score function and is more or less equal to the extent of uninhibited access to the PTV for a specific beam during the selection process. In this paper we compare the capabilities of fast simulated annealing (FSA) and Cycle for cases where local optima are supposed to be present. Five pancreas and five oesophagus cases previously treated in our institute were selected for this comparison. Plans were generated for FSA and Cycle, using the same hard dose and dose-volume constraints, and the largest achievable PTV doses obtained from these algorithms were compared. The largest achieved PTV dose values were generally very similar for the two algorithms. In some cases FSA resulted in a slightly higher PTV dose than Cycle, at the cost of switching on substantially more beam orientations than Cycle. In other cases, when Cycle generated the solution with the highest PTV dose using only a limited number of non-zero weight beams, FSA seemed to have some difficulty in switching off the unfavourable directions. Cycle was faster than FSA, especially for large-dimensional feasible spaces. In conclusion, for the cases studied in this paper, we have found that despite the inherent drawback of sequential search as used by Cycle (where Cycle could probably get trapped in a local optimum), Cycle is nevertheless able to find comparable or sometimes slightly better treatment plans in comparison with FSA (which in theory finds the global optimum) especially in large-dimensional beam weight spaces

  12. Simulated Annealing Genetic Algorithm Based Schedule Risk Management of IT Outsourcing Project

    Directory of Open Access Journals (Sweden)

    Fuqiang Lu

    2017-01-01

    IT outsourcing is an effective way to enhance the core competitiveness of many enterprises, but the schedule risk of an IT outsourcing project may cause enormous economic loss to the enterprise. In this paper, Distributed Decision Making (DDM) theory and principal-agent theory are used to build a model for schedule risk management of IT outsourcing projects. In addition, a hybrid algorithm combining simulated annealing (SA) and a genetic algorithm (GA) is designed, namely, the simulated annealing genetic algorithm (SAGA). The effect of the proposed model on the schedule risk management problem is analyzed in a simulation experiment. Meanwhile, the simulation results of the three algorithms GA, SA, and SAGA show that SAGA is superior to the other two algorithms in terms of stability and convergence. Consequently, this paper provides a scientific quantitative proposal for decision makers who need to manage the schedule risk of IT outsourcing projects.
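    A minimal sketch of the SAGA idea, applied to a generic continuous test function rather than the paper's schedule risk model: a genetic algorithm whose mutation step is filtered by a Metropolis acceptance test with a shared cooling temperature (all parameter values are illustrative assumptions):

```python
import math
import random

def saga(f, dim=5, pop=20, gens=150, t0=1.0, alpha=0.97, seed=2):
    # Hybrid GA + SA sketch: tournament selection and uniform crossover as in a
    # GA, but a mutated child that is worse than its parent is still kept with
    # the Metropolis probability exp(-dF/T), where T follows a cooling schedule.
    rng = random.Random(seed)
    P = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(pop)]
    t = t0
    best = min(P, key=f)
    for _ in range(gens):
        nxt = []
        for _ in range(pop):
            a = min(rng.sample(P, 3), key=f)   # tournament selection
            b = min(rng.sample(P, 3), key=f)
            child = [x if rng.random() < 0.5 else y for x, y in zip(a, b)]
            mutant = [x + rng.gauss(0.0, 0.2) for x in child]
            d = f(mutant) - f(child)
            if d <= 0 or rng.random() < math.exp(-d / t):  # SA acceptance test
                child = mutant
            nxt.append(child)
        P = nxt
        t *= alpha                              # shared cooling schedule
        cand = min(P, key=f)
        if f(cand) < f(best):
            best = cand
    return best, f(best)

sphere = lambda x: sum(v * v for v in x)
x_best, f_best = saga(sphere)
```

    Early on (high T) the SA filter lets the population explore; as T falls it behaves like greedy hill-climbing, which is the stability/convergence trade-off the abstract attributes to SAGA.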

  13. Phase diagram of 2D Hubbard model by simulated annealing mean field approximation

    International Nuclear Information System (INIS)

    Kato, Masaru; Kitagaki, Takashi

    1991-01-01

    In order to investigate the stable magnetic structure of the Hubbard model on a square lattice, we utilize the dynamical simulated annealing method proposed by R. Car and M. Parrinello. Simulations on a 10 x 10 lattice system with 80 electrons, under the assumption of a collinear magnetic structure, show that the most stable state is an incommensurate spin density wave state with periodic domain walls. (orig.)

  14. Experiences with serial and parallel algorithms for channel routing using simulated annealing

    Science.gov (United States)

    Brouwer, Randall Jay

    1988-01-01

    Two algorithms for channel routing using simulated annealing are presented. Simulated annealing is an optimization methodology which allows the solution process to back up out of local minima that may be encountered by inappropriate selections. By properly controlling the annealing process, it is very likely that the optimal solution to an NP-complete problem such as channel routing may be found. The algorithm presented imposes very relaxed restrictions on the types of allowable transformations, including overlapping nets. By relaxing this restriction and controlling overlap situations with an appropriate cost function, the algorithm becomes very flexible and can be applied to many extensions of channel routing. The selection of transformations utilizes a number of heuristics, still retaining the pseudorandom nature of simulated annealing. The algorithm was implemented as a serial program for a workstation, and a parallel program designed for a hypercube computer. The details of the serial implementation are presented, including many of the heuristics used and some of the resulting solutions.

  15. Resorting the NIST undulator using simulated annealing for field error reduction

    International Nuclear Information System (INIS)

    Denbeaux, Greg; Johnson, Lewis E.; Madey, John M.J.

    2000-01-01

    We have used a simulated annealing algorithm to sort the samarium cobalt blocks and vanadium permendur poles in the hybrid NIST undulator to optimize the spectrum of the emitted light. While simulated annealing has proven highly effective in sorting the SmCo blocks in pure REC undulators, the reliance on magnetically 'soft' poles operating near saturation to concentrate the flux in hybrid undulators introduces a pair of additional variables - the permeability and saturation induction of the poles - which limit the utility of the assumption of superposition on which most simulated annealing codes rely. Detailed magnetic measurements clearly demonstrated the failure of the superposition principle due to random variations in the permeability in the 'unsorted' NIST undulator. To deal with the issue, we measured both the magnetization of the REC blocks and the permeability of the NIST's integrated vanadium permendur poles, and implemented a sorting criterion that minimized the pole-to-pole variations in permeability to satisfy the criteria for realization of superposition on a nearest-neighbor basis. Though still imperfect, the computed spectrum of the radiation from the re-sorted and annealed NIST undulator is significantly superior to that of the original, unsorted device

  16. The Parameters Optimization of MCR-WPT System Based on the Improved Genetic Simulated Annealing Algorithm

    Directory of Open Access Journals (Sweden)

    Sheng Lu

    2015-01-01

    To solve the problem of parameter selection during the design of a magnetically coupled resonant wireless power transmission (MCR-WPT) system, this paper proposes an improved genetic simulated annealing algorithm. Firstly, the equivalent circuit of the system is analyzed and a nonlinear programming mathematical model is built. Secondly, in place of the penalty function method in the genetic algorithm, a selection strategy based on the distance between individuals is adopted, which reduces the number of empirical parameters. Meanwhile, the convergence rate and searching ability are improved by calculating the crossover and mutation probabilities according to the variance of the population's fitness. At last, a simulated annealing operator is added to increase the local search ability of the method. Simulations show that the improved method can escape local optima and reach the global optimum faster, and the optimized system achieves the practical requirements.

  17. Simulated parallel annealing within a neighborhood for optimization of biomechanical systems.

    Science.gov (United States)

    Higginson, J S; Neptune, R R; Anderson, F C

    2005-09-01

    Optimization problems for biomechanical systems have become extremely complex. Simulated annealing (SA) algorithms have performed well in a variety of test problems and biomechanical applications; however, despite advances in computer speed, convergence to optimal solutions for systems of even moderate complexity has remained prohibitive. The objective of this study was to develop a portable parallel version of a SA algorithm for solving optimization problems in biomechanics. The algorithm for simulated parallel annealing within a neighborhood (SPAN) was designed to minimize interprocessor communication time and closely retain the heuristics of the serial SA algorithm. The computational speed of the SPAN algorithm scaled linearly with the number of processors on different computer platforms for a simple quadratic test problem and for a more complex forward dynamic simulation of human pedaling.
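    The SPAN strategy of farming candidate evaluations out to workers while keeping the serial acceptance heuristics can be conveyed with a small sketch (a thread-pool toy on the Rosenbrock function, not the authors' actual implementation; batching candidates from the current point before applying the Metropolis test sequentially is our simplifying assumption):

```python
import math
import random
from concurrent.futures import ThreadPoolExecutor

def parallel_anneal(f, x0, workers=4, rounds=400, cands=8, t0=5.0, t_end=1e-3, seed=9):
    # Each round, a pool of workers evaluates a neighborhood of candidate moves
    # in parallel; the Metropolis test is then applied to the pre-evaluated
    # candidates in sequence, mimicking the serial algorithm's heuristics.
    rng = random.Random(seed)
    x, fx = list(x0), f(x0)
    best, best_f = list(x0), fx
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for k in range(rounds):
            t = t0 * (t_end / t0) ** (k / rounds)
            neigh = [[c + rng.gauss(0.0, 0.5) for c in x] for _ in range(cands)]
            costs = list(pool.map(f, neigh))     # parallel objective evaluations
            for cand, fc in zip(neigh, costs):
                if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
                    x, fx = cand, fc
                    if fx < best_f:
                        best, best_f = list(x), fx
    return best, best_f

# 2-D Rosenbrock function: narrow curved valley, global minimum f(1, 1) = 0.
rosenbrock = lambda v: sum(100.0 * (v[i + 1] - v[i] ** 2) ** 2 + (1.0 - v[i]) ** 2
                           for i in range(len(v) - 1))
xb, fb = parallel_anneal(rosenbrock, [-1.5, 2.0])
```

    For a cheap objective like this the threading overhead dominates; the payoff the paper reports comes when each evaluation is an expensive forward dynamic simulation.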

  18. Parameter identification based on modified simulated annealing differential evolution algorithm for giant magnetostrictive actuator

    Science.gov (United States)

    Gao, Xiaohui; Liu, Yongguang

    2018-01-01

    There is a strongly nonlinear relationship between input and output in the giant magnetostrictive actuator (GMA), so establishing a mathematical model and identifying its parameters is very important for studying its characteristics and improving control accuracy. The current-displacement model is first built based on Jiles-Atherton (J-A) model theory, the Ampere loop theorem and a stress-magnetism coupling model. Then the relationships between the unknown parameters and the hysteresis loops are studied to determine the data-taking scope. The modified simulated annealing differential evolution algorithm (MSADEA) is proposed, taking full advantage of the differential evolution algorithm's fast convergence and the simulated annealing algorithm's jumping property to enhance convergence speed and performance. Simulation and experiment results show that this algorithm is not only simple and efficient, but also has fast convergence speed and high identification accuracy.

  19. Atomic scale simulations of arsenic ion implantation and annealing in silicon

    International Nuclear Information System (INIS)

    Caturla, M.J.; Diaz de la Rubia, T.; Jaraiz, M.

    1995-01-01

    We present results of multiple-time-scale simulations of 5, 10 and 15 keV low temperature ion implantation of arsenic on silicon (100), followed by high temperature anneals. The simulations start with a molecular dynamics (MD) calculation of the primary state of damage after 10 ps. The results are then coupled to a kinetic Monte Carlo (MC) simulation of bulk defect diffusion and clustering. Dose accumulation is achieved by assuming that at low temperatures the damage produced in the lattice is stable. After the desired dose is accumulated, the system is annealed at 800 degrees C for several seconds. The results provide information on the evolution of the damage microstructure over macroscopic length and time scales and afford direct comparison to experimental results. We discuss the database of inputs to the MC model and how it affects the diffusion process

  20. Cascade annealing: an overview

    International Nuclear Information System (INIS)

    Doran, D.G.; Schiffgens, J.O.

    1976-04-01

    Concepts and an overview of radiation displacement damage modeling and annealing kinetics are presented. Short-term annealing methodology is described and results of annealing simulations performed on damage cascades generated using the Marlowe and Cascade programs are included. Observations concerning the inconsistencies and inadequacies of current methods are presented along with simulation of high energy cascades and simulation of longer-term annealing
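    The flavor of such short-term annealing simulations can be conveyed by a toy kinetic model (not ALSOME itself, and with invented parameter values): interstitials random-walk on a periodic lattice and annihilate on contact with vacancies, so the surviving defect count falls as annealing proceeds:

```python
import random

def short_term_anneal(pairs=10, size=20, steps=2000, seed=11):
    # Toy stochastic annealing of Frenkel pairs: interstitials random-walk on a
    # periodic 2-D lattice and annihilate when they land on a vacancy site.
    rng = random.Random(seed)
    sites = [(x, y) for x in range(size) for y in range(size)]
    chosen = rng.sample(sites, 2 * pairs)          # distinct defect positions
    vacancies = set(chosen[:pairs])
    interstitials = list(chosen[pairs:])
    for _ in range(steps):
        survivors = []
        for (x, y) in interstitials:
            dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            x, y = (x + dx) % size, (y + dy) % size
            if (x, y) in vacancies:
                vacancies.discard((x, y))          # recombination removes both
            else:
                survivors.append((x, y))
        interstitials = survivors
        if not interstitials:
            break
    return len(vacancies) + len(interstitials)

remaining = short_term_anneal()
```

    Real codes add defect species, mobilities, clustering and capture radii on top of this skeleton; exaggerating the mobilities for a short interval, as described above, is how the quenching phase is mimicked.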

  1. Physical Mapping Using Simulated Annealing and Evolutionary Algorithms

    DEFF Research Database (Denmark)

    Vesterstrøm, Jacob Svaneborg

    2003-01-01

    optimization method when searching for an ordering of the fragments in PM. In this paper, we applied an evolutionary algorithm to the problem, and compared its performance to that of SA and local search on simulated PM data, in order to determine the important factors in finding a good ordering of the segments....... The analysis highlights the importance of a good PM model, a well-correlated fitness function, and high quality hybridization data. We suggest that future work in PM should focus on design of more reliable fitness functions and on developing error-screening algorithms....

  2. An Improved Simulated Annealing Technique for Enhanced Mobility in Smart Cities

    Directory of Open Access Journals (Sweden)

    Hayder Amer

    2016-06-01

    Vehicular traffic congestion is a significant problem that arises in many cities. This is due to the increasing number of vehicles that are driving on city roads of limited capacity. The vehicular congestion significantly impacts travel distance, travel time, fuel consumption and air pollution. Avoidance of traffic congestion and providing drivers with optimal paths are not trivial tasks. The key contribution of this work consists of the developed approach for dynamic calculation of optimal traffic routes. Two attributes (the average travel speed of the traffic and the roads’ length) are utilized by the proposed method to find the optimal paths. The average travel speed values can be obtained from the sensors deployed in smart cities and communicated to vehicles via the Internet of Vehicles and roadside communication units. The performance of the proposed algorithm is compared to three other algorithms: the simulated annealing weighted sum, the simulated annealing technique for order preference by similarity to the ideal solution and the Dijkstra algorithm. The weighted sum and technique for order preference by similarity to the ideal solution methods are used to formulate different attributes in the simulated annealing cost function. According to the Sheffield scenario, simulation results show that the improved simulated annealing technique for order preference by similarity to the ideal solution method improves the traffic performance in the presence of congestion by an overall average of 19.22% in terms of travel time, fuel consumption and CO2 emissions as compared to other algorithms; also, similar performance patterns were achieved for the Birmingham test scenario.

  3. Parameter estimation for chaotic systems using a hybrid adaptive cuckoo search with simulated annealing algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Sheng, Zheng, E-mail: 19994035@sina.com [College of Meteorology and Oceanography, PLA University of Science and Technology, Nanjing 211101 (China); Wang, Jun; Zhou, Bihua [National Defense Key Laboratory on Lightning Protection and Electromagnetic Camouflage, PLA University of Science and Technology, Nanjing 210007 (China); Zhou, Shudao [College of Meteorology and Oceanography, PLA University of Science and Technology, Nanjing 211101 (China); Collaborative Innovation Center on Forecast and Evaluation of Meteorological Disasters, Nanjing University of Information Science and Technology, Nanjing 210044 (China)

    2014-03-15

    This paper introduces a novel hybrid optimization algorithm to establish the parameters of chaotic systems. In order to deal with the weaknesses of the traditional cuckoo search algorithm, the proposed adaptive cuckoo search with simulated annealing algorithm is presented, which incorporates the adaptive parameter adjusting operation and the simulated annealing operation in the cuckoo search algorithm. Normally, the parameters of the cuckoo search algorithm are kept constant, which may decrease the efficiency of the algorithm. For the purpose of balancing and enhancing the accuracy and convergence rate of the cuckoo search algorithm, the adaptive operation is presented to tune the parameters properly. Besides, the local search capability of the cuckoo search algorithm is relatively weak, which may decrease the quality of optimization. So the simulated annealing operation is merged into the cuckoo search algorithm to enhance the local search ability and improve the accuracy and reliability of the results. The functionality of the proposed hybrid algorithm is investigated through the Lorenz chaotic system under noiseless and noisy conditions, respectively. The numerical results demonstrate that the method can estimate parameters efficiently and accurately under both conditions. Finally, the results are compared with the traditional cuckoo search algorithm, genetic algorithm, and particle swarm optimization algorithm. Simulation results demonstrate the effectiveness and superior performance of the proposed algorithm.
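    A minimal sketch of the hybrid, with Mantegna's algorithm for the Lévy step and a Metropolis test in place of greedy nest replacement (all parameter values are illustrative, not those tuned in the paper, and a simple sphere function stands in for the Lorenz parameter estimation objective):

```python
import math
import random

def levy_step(rng, beta=1.5):
    # Mantegna's algorithm for a Levy-stable step length.
    num = math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
    den = math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2)
    sigma = (num / den) ** (1 / beta)
    u, v = rng.gauss(0.0, sigma), rng.gauss(0.0, 1.0)
    return u / abs(v) ** (1 / beta)

def cuckoo_sa(f, dim=3, nests=15, iters=500, pa=0.25, t0=1.0, alpha=0.98, seed=4):
    rng = random.Random(seed)
    N = [[rng.uniform(-5.0, 5.0) for _ in range(dim)] for _ in range(nests)]
    t = t0
    for _ in range(iters):
        i = rng.randrange(nests)
        cand = [x + 0.1 * levy_step(rng) for x in N[i]]   # Levy-flight move
        j = rng.randrange(nests)
        d = f(cand) - f(N[j])
        # Simulated annealing acceptance instead of the usual greedy replacement.
        if d <= 0 or rng.random() < math.exp(-d / t):
            N[j] = cand
        # Abandon a fraction pa of the worst nests; the best nests survive.
        N.sort(key=f)
        for k in range(int((1 - pa) * nests), nests):
            N[k] = [rng.uniform(-5.0, 5.0) for _ in range(dim)]
        t *= alpha
    best = min(N, key=f)
    return best, f(best)

sphere = lambda x: sum(v * v for v in x)
xb, fb = cuckoo_sa(sphere)
```

    The SA acceptance lets a slightly worse cuckoo egg displace a nest early in the run, which is one way to realize the improved local search the abstract describes.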

  4. Parameter estimation for chaotic systems using a hybrid adaptive cuckoo search with simulated annealing algorithm

    International Nuclear Information System (INIS)

    Sheng, Zheng; Wang, Jun; Zhou, Bihua; Zhou, Shudao

    2014-01-01

    This paper introduces a novel hybrid optimization algorithm to establish the parameters of chaotic systems. In order to deal with the weaknesses of the traditional cuckoo search algorithm, the proposed adaptive cuckoo search with simulated annealing algorithm is presented, which incorporates the adaptive parameter adjusting operation and the simulated annealing operation in the cuckoo search algorithm. Normally, the parameters of the cuckoo search algorithm are kept constant, which may decrease the efficiency of the algorithm. For the purpose of balancing and enhancing the accuracy and convergence rate of the cuckoo search algorithm, the adaptive operation is presented to tune the parameters properly. Besides, the local search capability of the cuckoo search algorithm is relatively weak, which may decrease the quality of optimization. So the simulated annealing operation is merged into the cuckoo search algorithm to enhance the local search ability and improve the accuracy and reliability of the results. The functionality of the proposed hybrid algorithm is investigated through the Lorenz chaotic system under noiseless and noisy conditions, respectively. The numerical results demonstrate that the method can estimate parameters efficiently and accurately under both conditions. Finally, the results are compared with the traditional cuckoo search algorithm, genetic algorithm, and particle swarm optimization algorithm. Simulation results demonstrate the effectiveness and superior performance of the proposed algorithm

  5. Application of simulated annealing for simultaneous retrieval of particle size distribution and refractive index

    International Nuclear Information System (INIS)

    Ma, Lin; Kranendonk, Laura; Cai, Weiwei; Zhao, Yan; Baba, Justin S.

    2009-01-01

    This paper describes the application of the simulated annealing technique for the simultaneous retrieval of particle size distribution and refractive index based on polarization modulated scattering (PMS) measurements. The PMS technique is a well-established method to measure multiple elements of the Mueller scattering matrix. However, the inference of the scatterers' properties (e.g., the size distribution function and refractive index) from such measurements involves solving an ill-conditioned inverse problem. In this paper, a new inversion technique is demonstrated to infer particle properties from PMS measurements. The new technique formulates the inverse problem as a minimization problem, which is then solved by simulated annealing. Both numerical and experimental investigations of the new inversion technique are presented. The results demonstrate the robustness and reliability of the new algorithm and support its expanded application in scientific and technological areas involving particulates/aerosols.

  6. A Simulated Annealing-Based Heuristic Algorithm for Job Shop Scheduling to Minimize Lateness

    Directory of Open Access Journals (Sweden)

    Rui Zhang

    2013-04-01

    A decomposition-based optimization algorithm is proposed for solving large job shop scheduling problems with the objective of minimizing the maximum lateness. First, we use constraint propagation theory to derive the orientation of a portion of the disjunctive arcs. Then we use a simulated annealing algorithm to find a decomposition policy which satisfies the maximum number of oriented disjunctive arcs. Subsequently, each subproblem (corresponding to a subset of operations as determined by the decomposition policy) is successively solved with a simulated annealing algorithm, which leads to a feasible solution to the original job shop scheduling problem. Computational experiments are carried out on adapted benchmark problems, and the results show the proposed algorithm is effective and efficient in terms of solution quality and time performance.
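    The single-machine special case makes the approach easy to illustrate: simulated annealing over job permutations with swap moves recovers the optimal maximum lateness, which on one machine is also given by the earliest-due-date (EDD) rule. The job data below are invented for illustration:

```python
import math
import random

JOBS = [(3, 6), (2, 4), (4, 12), (1, 3), (5, 11)]  # (processing_time, due_date)

def max_lateness(order):
    # Maximum lateness max_j (C_j - d_j) of a job sequence on one machine.
    t, worst = 0, float("-inf")
    for i in order:
        p, d = JOBS[i]
        t += p
        worst = max(worst, t - d)
    return worst

def anneal_schedule(t0=10.0, t_end=0.01, steps=5000, seed=6):
    rng = random.Random(seed)
    order = list(range(len(JOBS)))
    rng.shuffle(order)
    cost = max_lateness(order)
    best, best_cost = list(order), cost
    for k in range(steps):
        t = t0 * (t_end / t0) ** (k / steps)
        i, j = rng.sample(range(len(JOBS)), 2)
        order[i], order[j] = order[j], order[i]      # swap two jobs
        c = max_lateness(order)
        if c < cost or rng.random() < math.exp(-(c - cost) / t):
            cost = c
            if cost < best_cost:
                best, best_cost = list(order), cost
        else:
            order[i], order[j] = order[j], order[i]  # undo the swap
    return best, best_cost

schedule, l_max = anneal_schedule()
edd = sorted(range(len(JOBS)), key=lambda i: JOBS[i][1])  # earliest-due-date baseline
```

    In a full job shop the objective is evaluated on a disjunctive graph rather than a single sequence, but the move/accept/cool skeleton is the same.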

  7. Fast and accurate protein substructure searching with simulated annealing and GPUs

    Directory of Open Access Journals (Sweden)

    Stivala Alex D

    2010-09-01

    Full Text Available Abstract Background Searching a database of protein structures for matches to a query structure, or occurrences of a structural motif, is an important task in structural biology and bioinformatics. While there are many existing methods for structural similarity searching, faster and more accurate approaches are still required, and few current methods are capable of substructure (motif) searching. Results We developed an improved heuristic for tableau-based protein structure and substructure searching using simulated annealing that is as fast as or faster than, and comparable in accuracy with, some widely used existing methods. Furthermore, we created a parallel implementation on a modern graphics processing unit (GPU). Conclusions The GPU implementation achieves up to 34 times speedup over the CPU implementation of tableau-based structure search with simulated annealing, making it one of the fastest available methods. To the best of our knowledge, this is the first application of a GPU to the protein structural search problem.

  8. Use of simulated annealing in standardization and optimization of the acerola wine production

    Directory of Open Access Journals (Sweden)

    Sheyla dos Santos Almeida

    2014-06-01

    Full Text Available In this study, seven wine samples were prepared, varying the amount of acerola fruit pulp and the sugar content, using the simulated annealing technique to obtain the optimal sensory qualities and cost for the wine produced. S. cerevisiae yeast was used in the fermentation process and the sensory attributes were evaluated using a hedonic scale. Acerola wines were classified as sweet, with an alcohol concentration of 11 °GL and with aroma, taste, and color characteristics of the acerola fruit. The simulated annealing experiments showed that the best conditions were found at a mass ratio between 1/7.5 and 1/6 and total soluble solids between 28.6 and 29.0 °Brix, from which sensory acceptance scores of 6.9, 6.8, and 8.8 were obtained for color, aroma, and flavor, respectively, with a production cost 43-45% lower than that of traditional wines commercialized in Brazil.

  9. Reconstruction of bremsstrahlung spectra from attenuation data using generalized simulated annealing

    International Nuclear Information System (INIS)

    Menin, O.H.; Martinez, A.S.; Costa, A.M.

    2016-01-01

    A generalized simulated annealing algorithm, combined with a suitable smoothing regularization function, is used to solve the inverse problem of X-ray spectrum reconstruction from attenuation data. The approach is to set the initial acceptance and visitation temperatures and to standardize the terms of the objective function so as to automate the algorithm to accommodate different spectra ranges. Experiments with both numerical and measured attenuation data are presented. Results show that the algorithm reconstructs spectra shapes accurately. It should be noted that in this algorithm the regularization function was formulated to guarantee a smooth spectrum; thus, the presented technique does not apply to X-ray spectra where characteristic radiation is present. - Highlights: • X-ray spectra reconstruction from attenuation data using generalized simulated annealing. • The algorithm employs a smoothing regularization function and sets the initial acceptance and visitation temperatures. • The algorithm is automated by standardizing the terms of the objective function. • The algorithm is compared with classical methods.

  10. PedMine – A simulated annealing algorithm to identify maximally unrelated individuals in population isolates

    OpenAIRE

    Douglas, Julie A.; Sandefur, Conner I.

    2008-01-01

    In family-based genetic studies, it is often useful to identify a subset of unrelated individuals. When such studies are conducted in population isolates, however, most if not all individuals are often detectably related to each other. To identify a set of maximally unrelated (or equivalently, minimally related) individuals, we have implemented simulated annealing, a general-purpose algorithm for solving difficult combinatorial optimization problems. We illustrate our method on data from a ge...

  11. Compatible topologies and parameters for NMR structure determination of carbohydrates by simulated annealing

    OpenAIRE

    Feng, Yingang

    2017-01-01

    The use of NMR methods to determine the three-dimensional structures of carbohydrates and glycoproteins is still challenging, in part because of the lack of standard protocols. In order to increase the convenience of structure determination, the topology and parameter files for carbohydrates in the program Crystallography & NMR System (CNS) were investigated and new files were developed to be compatible with the standard simulated annealing protocols for proteins and nucleic acids. Recalculat...

  12. A study on three dimensional layout design by the simulated annealing method

    International Nuclear Information System (INIS)

    Jang, Seung Ho

    2008-01-01

    Modern engineered products are becoming increasingly complicated and most consumers prefer compact designs. Layout design plays an important role in many engineered products. The objective of this study is to suggest a method to apply the simulated annealing method to the arbitrarily shaped three-dimensional component layout design problem. The suggested method not only optimizes the packing density but also satisfies constraint conditions among the components. The algorithm and its implementation as suggested in this paper are extendable to other research objectives

  13. EIT image regularization by a new Multi-Objective Simulated Annealing algorithm.

    Science.gov (United States)

    Castro Martins, Thiago; Sales Guerra Tsuzuki, Marcos

    2015-01-01

    Multi-Objective Optimization can be used to produce regularized Electrical Impedance Tomography (EIT) images where the weight of the regularization term is not known a priori. This paper proposes a novel Multi-Objective Optimization algorithm based on Simulated Annealing tailored for EIT image reconstruction. Images are reconstructed from experimental data and compared with images from other Multi and Single Objective optimization methods. A significant performance enhancement from traditional techniques can be inferred from the results.

  14. Defect production in simulated cascades: cascade quenching and short-term annealing

    International Nuclear Information System (INIS)

    Heinisch, H.L.

    1982-01-01

    Defect production in high energy displacement cascades has been modeled using the computer code MARLOWE to generate the cascades and the stochastic computer code ALSOME to simulate the cascade quenching and short-term annealing of isolated cascades. The quenching is accomplished by using ALSOME with exaggerated values for defect mobilities and critical reaction distances for recombination and clustering, which are in effect until the number of defect pairs is equal to the value determined from resistivity experiments at 4 K. Then normal mobilities and reaction distances are used during short-term annealing to a point representative of Stage III recovery. Effects of cascade interactions at low fluences are also being investigated. The quenching parameter values were empirically determined for 30 keV cascades. The results agree well with experimental information throughout the range from 1 keV to 100 keV. Even after quenching and short-term annealing the high energy cascades behave as a collection of lower energy subcascades and lobes. Cascades generated in a crystal having thermal displacements were found to be in better agreement with experiments after quenching and annealing than those generated in a non-thermal crystal

  15. Direct comparison of quantum and simulated annealing on a fully connected Ising ferromagnet

    Science.gov (United States)

    Wauters, Matteo M.; Fazio, Rosario; Nishimori, Hidetoshi; Santoro, Giuseppe E.

    2017-08-01

    We compare the performance of quantum annealing (QA, through Schrödinger dynamics) and simulated annealing (SA, through a classical master equation) on the p-spin infinite-range ferromagnetic Ising model, by slowly driving the system across its equilibrium, quantum or classical, phase transition. When the phase transition is second order (p = 2, the familiar two-spin Ising interaction) SA shows a remarkable exponential speed-up over QA. For a first-order phase transition (p ≥ 3, i.e., with multispin Ising interactions), in contrast, the classical annealing dynamics appears to remain stuck in the disordered phase, while we have clear evidence that QA shows a residual energy which decreases towards zero when the total annealing time τ increases, albeit in a rather slow (logarithmic) fashion. This is one of the rare examples where a limited quantum speedup, a speedup by QA over SA, has been shown to exist by direct solutions of the Schrödinger and master equations in combination with a nonequilibrium Landau-Zener analysis. We also analyze the imaginary-time QA dynamics of the model, finding a 1/τ² behavior for all finite values of p, as predicted by the adiabatic theorem of quantum mechanics. The Grover-search limit p (odd) → ∞ is also discussed.

  16. A parallel simulated annealing algorithm for standard cell placement on a hypercube computer

    Science.gov (United States)

    Jones, Mark Howard

    1987-01-01

    A parallel version of a simulated annealing algorithm is presented which is targeted to run on a hypercube computer. A strategy for mapping the cells in a two dimensional area of a chip onto processors in an n-dimensional hypercube is proposed such that both small and large distance moves can be applied. Two types of moves are allowed: cell exchanges and cell displacements. The computation of the cost function in parallel among all the processors in the hypercube is described along with a distributed data structure that needs to be stored in the hypercube to support parallel cost evaluation. A novel tree broadcasting strategy is used extensively in the algorithm for updating cell locations in the parallel environment. Studies on the performance of the algorithm on example industrial circuits show that it is faster and gives better final placement results than the uniprocessor simulated annealing algorithms. An improved uniprocessor algorithm is proposed which is based on the improved results obtained from parallelization of the simulated annealing algorithm.

  17. Improve earthquake hypocenter using adaptive simulated annealing inversion in regional tectonic, volcano tectonic, and geothermal observation

    Energy Technology Data Exchange (ETDEWEB)

    Ry, Rexha Verdhora, E-mail: rexha.vry@gmail.com [Master Program of Geophysical Engineering, Faculty of Mining and Petroleum Engineering, Institut Teknologi Bandung, Jalan Ganesha No.10, Bandung 40132 (Indonesia); Nugraha, Andri Dian, E-mail: nugraha@gf.itb.ac.id [Global Geophysical Research Group, Faculty of Mining and Petroleum Engineering, Institut Teknologi Bandung, Jalan Ganesha No.10, Bandung 40132 (Indonesia)

    2015-04-24

    Observation of earthquakes is routinely used in tectonic activity monitoring on a wide scale, and also on a local scale in volcano tectonic and geothermal activity monitoring. It is necessary to determine precise hypocenter locations, a process that involves finding a hypocenter location that minimizes the error between the observed and the calculated travel times. When solving this nonlinear inverse problem, simulated annealing inversion can be applied as a global optimization method whose solution convergence is independent of the initial model. In this study, we developed our own program code applying adaptive simulated annealing inversion in the MATLAB environment. We applied this method to determine earthquake hypocenters using several data cases: regional tectonic, volcano tectonic, and geothermal field. The travel times were calculated using the ray-tracing shooting method. We then compared the results with those of Geiger's method to analyze their reliability. Our results show that the hypocenter locations have smaller RMS errors than Geiger's results, which can be statistically associated with better solutions. The earthquake hypocenters also correlate well with the geological structure in the study area. We recommend using adaptive simulated annealing inversion to relocate hypocenters in order to obtain precise and accurate earthquake locations.

  18. Siting and sizing of distributed generators based on improved simulated annealing particle swarm optimization.

    Science.gov (United States)

    Su, Hongsheng

    2017-12-18

    Distributed power grids generally contain multiple diverse types of distributed generators (DGs). Traditional particle swarm optimization (PSO) and simulated annealing PSO (SA-PSO) algorithms have some deficiencies in the site selection and capacity determination of DGs, such as slow convergence and easily falling into local optima. In this paper, an improved SA-PSO (ISA-PSO) algorithm is proposed by introducing the crossover and mutation operators of the genetic algorithm (GA) into SA-PSO, so that the algorithm's capabilities in global searching and local exploration are both strengthened. In addition, diverse types of DGs are made equivalent to four types of nodes in power flow calculation by the backward/forward sweep method, and reactive power sharing principles and allocation theory are applied to determine the initial reactive power value and execute subsequent corrections, thus giving the algorithm a better start to speed up convergence. Finally, a mathematical model of minimum economic cost is established for the siting and sizing of DGs under the location and capacity uncertainties of each single DG. Its objective function considers the investment and operation cost of DGs, grid loss cost, annual electricity purchase cost, and environmental pollution cost, and the constraints include power flow, bus voltage, conductor current, and DG capacity. Through application to an IEEE 33-node distribution system, it is found that the proposed method achieves better economic efficiency and a safer voltage level than traditional PSO and SA-PSO algorithms, and is a more effective planning method for the siting and sizing of DGs in distributed power grids.

  19. Identification of exploration strategies for electric power distribution network using simulated annealing; Identificação de estratégias de exploração de redes de distribuição de energia eléctrica utilizando simulated annealing

    Energy Technology Data Exchange (ETDEWEB)

    Pereira, Jorge; Saraiva, J. Tome; Leao, Maria Teresa Ponce de [Instituto de Engenharia de Sistemas e Computadores (INESC), Porto (Portugal). E-mail: jpereira@inescn.pt; jsaraiva@inescn.pt; mleao@inescn.pt

    1999-07-01

    This paper presents a model for the identification of optimal operating strategies for electric power distribution networks, with the aim of minimizing active power losses. This objective can be attained by modifying the transformer tap connections or changing the capacitor banks on duty. In addition, voltage ranges for each bus and current intensity limits for the branches are specified in order to make the model more realistic. The paper describes the use of simulated annealing to overcome the resulting difficulties. Applying the method to the problem allows the identification of solutions based on exact models. The application is illustrated with results obtained using an IEEE test network and a network based on a real distribution system with 645 buses.

  20. Image-Based Airborne Sensors: A Combined Approach for Spectral Signatures Classification through Deterministic Simulated Annealing

    Science.gov (United States)

    Guijarro, María; Pajares, Gonzalo; Herrera, P. Javier

    2009-01-01

    The increasing capability of high-resolution airborne image sensors, including those on board Unmanned Aerial Vehicles, demands automatic solutions for processing, either on-line or off-line, the huge amounts of image data sensed during flights. The classification of natural spectral signatures in images is one potential application. The current trend in classification is towards the combination of simple classifiers. In this paper we propose a combined strategy based on the Deterministic Simulated Annealing (DSA) framework. The simple classifiers used are the well-tested supervised parametric Bayesian estimator and Fuzzy Clustering. DSA is an optimization approach which minimizes an energy function. The main contribution of DSA is its ability to avoid local minima during the optimization process thanks to the annealing scheme. It outperforms the simple classifiers used for the combination and some combined strategies, including a scheme based on fuzzy cognitive maps and an optimization approach based on the Hopfield neural network paradigm. PMID:22399989

  1. Simulated annealing method for electronic circuits design: adaptation and comparison with other optimization methods

    International Nuclear Information System (INIS)

    Berthiau, G.

    1995-10-01

    The circuit design problem consists in determining acceptable parameter values (resistors, capacitors, transistor geometries, ...) which allow the circuit to meet various user-given operational criteria (DC consumption, AC bandwidth, transient times, ...). This task is equivalent to a multidimensional and/or multiobjective optimization problem: n-variable functions have to be minimized in a hyper-rectangular domain; equality constraints can optionally be specified. A similar problem consists in fitting component models. In this case, the optimization variables are the model parameters and one aims at minimizing a cost function built on the error between the model response and the data measured on the component. The optimization method chosen for this kind of problem is the simulated annealing method. This method, coming from the combinatorial optimization domain, has been adapted and compared with other global optimization methods for continuous-variable problems. An efficient strategy of variable discretization and a set of complementary stopping criteria have been proposed. The different parameters of the method have been adjusted using analytical functions whose minima are known, classically used in the literature. Our simulated annealing algorithm has been coupled with the open electrical simulator SPICE-PAC, whose modular structure allows the chaining of simulations required by the circuit optimization process. We proposed, for high-dimensional problems, a partitioning technique which ensures proportionality between CPU time and the number of variables. To compare our method with others, we adapted three other methods from the combinatorial optimization domain - the threshold method, a genetic algorithm and the Tabu search method. The tests were performed on the same set of test functions and the results allow a first comparison between these methods applied to continuous optimization variables. Finally, our simulated annealing program

  2. Concept for Multi-cycle Nuclear Fuel Optimization Based On Parallel Simulated Annealing With Mixing of States

    International Nuclear Information System (INIS)

    Kropaczek, David J.

    2008-01-01

    A new concept for performing nuclear fuel optimization over a multi-cycle planning horizon is presented. The method provides for an implicit coupling between traditionally separate in-core and out-of-core fuel management decisions including determination of: fresh fuel batch size, enrichment and bundle design; exposed fuel reuse; and core loading pattern. The algorithm uses simulated annealing optimization, modified with a technique called mixing of states that allows for deployment in a scalable parallel environment. Analysis of algorithm performance for a transition cycle design (i.e. a PWR 6 month cycle length extension) demonstrates the feasibility of the approach as a production tool for fuel procurement and multi-cycle core design. (authors)

  3. Displacement cascades and defect annealing in tungsten, Part II: Object kinetic Monte Carlo simulation of tungsten cascade aging

    Energy Technology Data Exchange (ETDEWEB)

    Nandipati, Giridhar, E-mail: giridhar.nandipati@pnnl.gov [Pacific Northwest National Laboratory, Richland, WA (United States); Setyawan, Wahyu; Heinisch, Howard L. [Pacific Northwest National Laboratory, Richland, WA (United States); Roche, Kenneth J. [Pacific Northwest National Laboratory, Richland, WA (United States); Department of Physics, University of Washington, Seattle, WA 98195 (United States); Kurtz, Richard J. [Pacific Northwest National Laboratory, Richland, WA (United States); Wirth, Brian D. [University of Tennessee, Knoxville, TN (United States)

    2015-07-15

    The results of object kinetic Monte Carlo (OKMC) simulations of the annealing of primary cascade damage in bulk tungsten using a comprehensive database of cascades obtained from molecular dynamics (Setyawan et al.) are described as a function of primary knock-on atom (PKA) energy at temperatures of 300, 1025 and 2050 K. An increase in SIA clustering coupled with a decrease in vacancy clustering with increasing temperature, in addition to the disparate mobilities of SIAs versus vacancies, causes an interesting effect of temperature on cascade annealing. The annealing efficiency (the ratio of the number of defects after and before annealing) exhibits an inverse U-shape curve as a function of temperature. The capabilities of the newly developed OKMC code KSOME (kinetic simulations of microstructure evolution) used to carry out these simulations are described.

  5. Annealing effect on thermodynamic and physical properties of mesoporous silicon: A simulation and nitrogen sorption study

    Science.gov (United States)

    Kumar, Pushpendra; Huber, Patrick

    2016-04-01

    The discovery of porous silicon formation in silicon substrates in 1956, while electro-polishing crystalline Si in hydrofluoric acid (HF), triggered large-scale investigations of porous silicon formation and of the changes in its physical and chemical properties with thermal and chemical treatment. A nitrogen sorption study is used to investigate the effect of thermal annealing on electrochemically etched mesoporous silicon (PS). The PS was thermally annealed from 200 °C to 800 °C for 1 h in the presence of air. It was shown that the pore diameter and porosity of PS vary with annealing temperature. The experimentally obtained adsorption/desorption isotherms show hysteresis typical of capillary condensation in porous materials. A simulation study based on the Saam and Cole model was performed and compared with the experimentally observed sorption isotherms to study the physics behind hysteresis formation. We discuss the shape of the hysteresis loops in the framework of the morphology of the layers. The different behavior of nitrogen adsorption and desorption in PS with pore diameter is discussed in terms of concave menisci formation inside the pore space, which was shown to be related to the induced pressure as the pore diameter varies from 7.2 nm to 3.4 nm.

  6. Simulated annealing of displacement cascades in FCC metals. 1. Beeler cascades

    International Nuclear Information System (INIS)

    Doran, D.G.; Burnett, R.A.

    1974-09-01

    An important source of damage to structural materials in fast reactors is the displacement of atoms from normal lattice sites. A high energy neutron may impart sufficient energy to an atom to initiate a displacement cascade consisting of a localized high density of hundreds of interstitials and vacancies. These defects subsequently interact to form clusters and to reduce their density by mutual annihilation. This short-term annealing of an isolated cascade has been simulated at high and low temperatures using a correlated random walk model. The cascade representations used were developed by Beeler and the point defect properties were based on the model of γ-iron by Johnson. Low temperature anneals, characterized by no vacancy migration and a 104-site annihilation region (AR), resulted in 49 defect pairs at 20 keV and 11 pairs at 5 keV. High temperature anneals, characterized by both interstitial and vacancy migration and a 32-site AR, resulted in 68 pairs at 20 keV and 18 pairs at 5 keV when no cluster dissociation was permitted; most of the vacancies were in immobile clusters. These high temperature values dropped to 40 and 14 upon dissolution of the vacancy clusters. Parameter studies showed that, at a given temperature, the large AR resulted in about one-half as many defects as the small AR. Cluster size distributions and examples of spatial configurations are included. (U.S.)

  7. Simulated annealing algorithm for solving chambering student-case assignment problem

    Science.gov (United States)

    Ghazali, Saadiah; Abdul-Rahman, Syariza

    2015-12-01

    The project assignment problem is a popular practical problem that arises nowadays. The challenge of solving it grows with the complexity of preferences, the existence of real-world constraints, and increasing problem size. This study focuses on solving a chambering student-case assignment problem, classified under the project assignment problem, by using a simulated annealing algorithm. The project assignment problem is considered a hard combinatorial optimization problem, and solving it using a metaheuristic approach is advantageous because it can return a good solution in a reasonable time. The problem of assigning chambering students to cases has never been addressed in the literature before. In the proposed problem, law graduates must complete chambering before they are qualified to become legal counsel. Thus, assigning chambering students to cases is critically needed, especially when many preferences are involved. Hence, this study presents a preliminary study of the proposed project assignment problem. The objective of the study is to minimize the total completion time for all students in solving the given cases. This study employed a minimum-cost greedy heuristic to construct a feasible initial solution. The search then proceeds with a simulated annealing algorithm for further improvement of solution quality. The analysis of the obtained results shows that the proposed simulated annealing algorithm greatly improved the solution constructed by the minimum-cost greedy heuristic. Hence, this research demonstrates the advantages of solving the project assignment problem using metaheuristic techniques.
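    The pipeline this record describes — a minimum-cost greedy heuristic to build a feasible start, followed by simulated annealing to improve it — can be sketched on a toy cost matrix. The student/case data, function names, and all parameters below are invented for illustration and are not the paper's instance:

```python
import math
import random

def total_time(assign, cost):
    """Total completion time: assign[s] is the case given to student s."""
    return sum(cost[s][c] for s, c in enumerate(assign))

def greedy_init(cost):
    """Minimum-cost greedy construction: each student in turn takes
    the cheapest case still unassigned."""
    n = len(cost)
    assign, free = [None] * n, set(range(n))
    for s in range(n):
        c = min(free, key=lambda c: cost[s][c])
        assign[s] = c
        free.remove(c)
    return assign

def sa_assign(cost, t0=5.0, cooling=0.99, steps=2000, seed=1):
    """Improve the greedy assignment with simulated annealing,
    swapping the cases of two students at each step."""
    rng = random.Random(seed)
    assign = greedy_init(cost)
    cur = best = total_time(assign, cost)
    best_assign, temp = assign[:], t0
    for _ in range(steps):
        i, k = rng.sample(range(len(assign)), 2)
        assign[i], assign[k] = assign[k], assign[i]
        cand = total_time(assign, cost)
        if cand <= cur or rng.random() < math.exp((cur - cand) / temp):
            cur = cand
            if cur < best:
                best, best_assign = cur, assign[:]
        else:
            assign[i], assign[k] = assign[k], assign[i]  # undo rejected swap
        temp *= cooling
    return best_assign, best
```

    On the 3x3 example used in the test, the greedy start costs 11 and a single swap found by the annealer reaches the brute-force optimum of 7, illustrating why the two-stage construct-then-improve design pays off.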

  8. First-order design of geodetic networks using the simulated annealing method

    Science.gov (United States)

    Berné, J. L.; Baselga, S.

    2004-09-01

    The general problem of the optimal design for a geodetic network subject to any extrinsic factors, namely the first-order design problem, can be dealt with as a numeric optimization problem. The classic theory of this problem and the optimization methods are revised. Then the innovative use of the simulated annealing method, which has been successfully applied in other fields, is presented for this classical geodetic problem. This method, belonging to iterative heuristic techniques in operational research, uses a thermodynamical analogy to crystalline networks to offer a solution that converges probabilistically to the global optimum. Basic formulation and some examples are studied.

  9. Simulated annealing CFAR threshold selection for South African ship detection in ASAR imagery

    CSIR Research Space (South Africa)

    Schwegmann, CP

    2014-07-01

    Full Text Available Fig. 3. The iterative procedure of Simulated Annealing. Starting at some initial threshold plane Ti(x, y), each iteration tests... if the new solution T is better than the previous best solution Tb(x, y); the current threshold plane is altered if the candidate is better, and a worse candidate may still be accepted when a random number is less than the Boltzmann probability. A possible "bad" candidate can thus replace the current best. A new threshold plane Tb(x, y) is defined which is mapped to the 2D distribution map...
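    The acceptance step this record describes — keep any improvement, and let a worse candidate replace the current solution with the Boltzmann probability — is the standard Metropolis criterion. A minimal sketch (the function name is ours, not from the paper):

```python
import math
import random

def metropolis_accept(candidate_cost, current_cost, temperature, rng):
    """Metropolis rule used in simulated annealing: improvements are
    always accepted; a worse candidate is accepted with probability
    exp(-(increase in cost) / temperature)."""
    if candidate_cost <= current_cost:
        return True
    return rng.random() < math.exp(-(candidate_cost - current_cost) / temperature)
```

    At high temperature almost any candidate passes, which lets the search escape local minima; as the temperature is lowered the rule degenerates into greedy descent.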

  10. The performance of simulated annealing in parameter estimation for vapor-liquid equilibrium modeling

    Directory of Open Access Journals (Sweden)

    A. Bonilla-Petriciolet

    2007-03-01

    Full Text Available In this paper we report the application and evaluation of the simulated annealing (SA) optimization method in parameter estimation for vapor-liquid equilibrium (VLE) modeling. We tested this optimization method using the classical least-squares and error-in-variable approaches. The reliability and efficiency of the data-fitting procedure are also considered using different values for the algorithm parameters of the SA method. Our results indicate that this method, when properly implemented, is a robust procedure for nonlinear parameter estimation in thermodynamic models. However, in difficult problems it can still converge to local optima of the objective function.
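    The classical least-squares use of SA described here can be sketched for a generic model-fitting problem. The linear toy model, Gaussian step size, and cooling schedule below are illustrative assumptions, not the VLE models or settings of the paper:

```python
import math
import random

def sa_fit(xs, ys, model, theta0, t0=1.0, cooling=0.995, steps=4000, seed=2):
    """Least-squares parameter estimation by simulated annealing:
    Gaussian moves in parameter space, Metropolis acceptance."""
    rng = random.Random(seed)

    def sse(theta):
        # classical least-squares objective
        return sum((y - model(x, theta)) ** 2 for x, y in zip(xs, ys))

    theta = list(theta0)
    cur = sse(theta)
    best, best_theta, temp = cur, theta[:], t0
    for _ in range(steps):
        cand = [v + rng.gauss(0.0, 0.1) for v in theta]  # perturb all parameters
        c = sse(cand)
        if c <= cur or rng.random() < math.exp((cur - c) / temp):
            theta, cur = cand, c
            if c < best:
                best, best_theta = c, cand[:]
        temp *= cooling
    return best_theta, best
```

    Because only downhill moves survive once the temperature is low, the run ends in a local optimum of the objective, which is exactly the caveat the record notes for difficult problems.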

  11. Protein structure predictions with Monte Carlo simulated annealing: Case for the β-sheet

    Science.gov (United States)

    Okamoto, Y.; Fukugita, M.; Kawai, H.; Nakazawa, T.

    Work is continued on the prediction of the three-dimensional structures of peptides and proteins with Monte Carlo simulated annealing, using only a generic energy function and the amino acid sequence as input. We report that a β-sheet-like structure is successfully predicted for a fragment of bovine pancreatic trypsin inhibitor which is known to have the β-sheet structure in nature. Together with the results for the α-helix structure reported earlier, this means that a successful prediction can be made, at least at a qualitative level, for the two dominant building blocks of proteins, α-helix and β-sheet, from the information of the amino acid sequence alone.

  12. Neighbourhood generation mechanism applied in simulated annealing to job shop scheduling problems

    Science.gov (United States)

    Cruz-Chávez, Marco Antonio

    2015-11-01

    This paper presents a neighbourhood generation mechanism for job shop scheduling problems (JSSPs). In order to obtain a feasible neighbour with the generation mechanism, it is only necessary to permute an adjacent pair of operations in a schedule of the JSSP. If there is no slack time between the adjacent pair of operations that is permuted, then it is proven, through theory and experimentation, that the new neighbour (schedule) generated is feasible. It is demonstrated that the neighbourhood generation mechanism is very efficient and effective in simulated annealing.
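    The move itself — permuting one adjacent pair in the operation sequence — is simple to sketch; the slack-time feasibility condition proved in the paper is omitted, so this is only the generation step, with an invented function name:

```python
import random

def adjacent_swap_neighbour(schedule, rng):
    """Generate a neighbour by permuting one randomly chosen adjacent
    pair of operations (feasibility screening omitted)."""
    i = rng.randrange(len(schedule) - 1)
    nb = schedule[:]                    # copy: leave the original intact
    nb[i], nb[i + 1] = nb[i + 1], nb[i]
    return nb
```

    Every neighbour differs from its parent in exactly two adjacent positions, so the neighbourhood of an n-operation schedule has only n-1 members, which is what makes the mechanism cheap inside an annealing loop.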

  13. Modified Backtracking Search Optimization Algorithm Inspired by Simulated Annealing for Constrained Engineering Optimization Problems

    Directory of Open Access Journals (Sweden)

    Hailong Wang

    2018-01-01

    The backtracking search optimization algorithm (BSA) is a population-based evolutionary algorithm for numerical optimization problems. BSA has a powerful global exploration capacity, but its local exploitation capability is relatively poor, which affects the convergence speed of the algorithm. In this paper, we propose a modified BSA inspired by simulated annealing (BSAISA) to overcome this deficiency of BSA. In the BSAISA, the amplitude control factor (F) is modified based on the Metropolis criterion in simulated annealing. The redesigned F is adaptively decreased as the number of iterations increases and does not introduce extra parameters. A self-adaptive ε-constrained method is used to handle the strict constraints. We compared the performance of the proposed BSAISA with BSA and other well-known algorithms on thirteen constrained benchmarks and five engineering design problems. The simulation results demonstrate that BSAISA is more effective than BSA and competitive with other well-known algorithms in terms of convergence speed.
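The idea of an amplitude factor F that decays as iterations increase, in the spirit of an annealing temperature, can be illustrated with a temperature-like decay. This is only a plausible sketch of the behaviour the abstract describes; the exact BSAISA formula for F is defined in the paper:

```python
import math

def amplitude_factor(iteration, max_iter):
    """Illustrative, temperature-like amplitude control factor: large in
    early iterations (wide search steps), decaying towards zero as the
    schedule cools. Not the paper's exact formula."""
    t = 1.0 - iteration / max_iter           # "temperature" falls to 0
    return math.exp(-1.0 / max(t, 1e-9))     # in (0, 1), monotone decreasing

factors = [amplitude_factor(i, 100) for i in range(100)]
```

Because the decay depends only on the iteration counter, it adds no extra tunable parameters, matching the property the abstract emphasizes.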

  14. Optimization of Multiple Traveling Salesman Problem Based on Simulated Annealing Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Xu Mingji

    2017-01-01

    Hierarchical genetic algorithms are very effective for solving multi-variable optimization problems. This paper analyzes the advantages and disadvantages of the hierarchical genetic algorithm and puts forward an improved simulated annealing genetic algorithm. The new algorithm is applied to the multiple traveling salesman problem, where it improves the quality of the solution. First, it improves the design of the hierarchical chromosome structure by removing redundancy and suggests a suffix design for chromosomes. Second, to address the premature convergence of genetic algorithms, it proposes a self-identifying crossover operator and mutation. Third, to counter the weak local search ability of genetic algorithms, it stretches the fitness by hybridizing the genetic algorithm with the simulated annealing algorithm. Fourth, it simulates problems with N traveling salesmen and M cities to verify the algorithm's feasibility. The simulation and calculation show that the improved algorithm converges quickly to a good global solution, which makes it encouraging for practical use.

  15. Optimization of cladding parameters for resisting corrosion on low carbon steels using simulated annealing algorithm

    Science.gov (United States)

    Balan, A. V.; Shivasankaran, N.; Magibalan, S.

    2018-04-01

    Low carbon steels used in chemical industries are frequently affected by corrosion. Cladding is a surfacing process that deposits a thick layer of filler metal on highly corrosive materials to achieve corrosion resistance. Flux cored arc welding (FCAW) is preferred for cladding because of its improved efficiency and higher deposition rate. In this cladding process, the effect of corrosion can be minimized by controlling the output responses: minimizing dilution and penetration while maximizing bead width, reinforcement and ferrite number. This paper deals with the multi-objective optimization of flux cored arc welding responses for super duplex stainless steel by controlling process parameters such as wire feed rate, welding speed, nozzle-to-plate distance and welding gun angle using the simulated annealing technique. A regression equation was developed and validated using the ANOVA technique. The multi-objective optimization of the weld bead parameters was carried out using simulated annealing to obtain the optimum bead geometry for reducing corrosion. The potentiodynamic polarization test reveals the balanced formation of fine particles of ferrite and austenite with a desensitized microstructure in the optimized clad bead.

  16. Redesigning rain gauges network in Johor using geostatistics and simulated annealing

    Energy Technology Data Exchange (ETDEWEB)

    Aziz, Mohd Khairul Bazli Mohd, E-mail: mkbazli@yahoo.com [Centre of Preparatory and General Studies, TATI University College, 24000 Kemaman, Terengganu, Malaysia and Department of Mathematical Sciences, Faculty of Science, Universiti Teknologi Malaysia, 81310 UTM Johor Bahru, Johor (Malaysia); Yusof, Fadhilah, E-mail: fadhilahy@utm.my [Department of Mathematical Sciences, Faculty of Science, Universiti Teknologi Malaysia, 81310 UTM Johor Bahru, Johor (Malaysia); Daud, Zalina Mohd, E-mail: zalina@ic.utm.my [UTM Razak School of Engineering and Advanced Technology, Universiti Teknologi Malaysia, UTM KL, 54100 Kuala Lumpur (Malaysia); Yusop, Zulkifli, E-mail: zulyusop@utm.my [Institute of Environmental and Water Resource Management (IPASA), Faculty of Civil Engineering, Universiti Teknologi Malaysia, 81310 UTM Johor Bahru, Johor (Malaysia); Kasno, Mohammad Afif, E-mail: mafifkasno@gmail.com [Malaysia - Japan International Institute of Technology (MJIIT), Universiti Teknologi Malaysia, UTM KL, 54100 Kuala Lumpur (Malaysia)

    2015-02-03

    Recently, many rainfall network design techniques have been developed, discussed and compared by many researchers. Present-day hydrological studies require higher levels of accuracy from collected data. In numerous basins, the rain gauge stations are located without a clear scientific rationale. In this study, an attempt is made to redesign the rain gauge network for Johor, Malaysia in order to meet the required level of accuracy preset by rainfall data users. The existing network of 84 rain gauges in Johor is optimized and redesigned into new locations using rainfall, humidity, solar radiation, temperature and wind speed data collected during the monsoon seasons (November to February) from 1975 to 2008. This study used a combination of a geostatistical method (the variance-reduction method) and simulated annealing as the optimization algorithm during the redesign process. The results show that the new rain gauge locations provide the minimum value of the estimated variance. This shows that the combination of the geostatistical variance-reduction method and simulated annealing is successful in developing the new optimum rain gauge network.

  17. IMPROVEMENT OF RECOGNITION QUALITY IN DEEP LEARNING NETWORKS BY SIMULATED ANNEALING METHOD

    Directory of Open Access Journals (Sweden)

    A. S. Potapov

    2014-09-01

    The subject of this research is deep learning methods, in which automatic construction of feature transforms takes place in pattern recognition tasks. Multilayer autoencoders were taken as the considered type of deep learning network. The autoencoders perform a nonlinear feature transform with logistic regression as an upper classification layer. In order to verify the hypothesis that the recognition rate of deep learning networks, which are traditionally trained layer-by-layer by gradient descent, can be improved by global optimization of their parameters, a new method has been designed and implemented. The method applies simulated annealing to tune the connection weights of the autoencoders while the regression layer is simultaneously trained by stochastic gradient descent. Experiments on the standard MNIST handwritten digit database have shown a decrease in the recognition error rate by a factor of 1.1 to 1.5 for the modified method compared to the traditional method based on local optimization. Thus, no overfitting effect appears, and the possibility of improving recognition quality in deep learning networks by global optimization methods (in terms of increasing the recognition probability) is confirmed. The research results can be applied to improving the probability of pattern recognition in fields that require automatic construction of nonlinear feature transforms, in particular image recognition. Keywords: pattern recognition, deep learning, autoencoder, logistic regression, simulated annealing.

  18. Redesigning rain gauges network in Johor using geostatistics and simulated annealing

    International Nuclear Information System (INIS)

    Aziz, Mohd Khairul Bazli Mohd; Yusof, Fadhilah; Daud, Zalina Mohd; Yusop, Zulkifli; Kasno, Mohammad Afif

    2015-01-01

    Recently, many rainfall network design techniques have been developed, discussed and compared by many researchers. Present-day hydrological studies require higher levels of accuracy from collected data. In numerous basins, the rain gauge stations are located without a clear scientific rationale. In this study, an attempt is made to redesign the rain gauge network for Johor, Malaysia in order to meet the required level of accuracy preset by rainfall data users. The existing network of 84 rain gauges in Johor is optimized and redesigned into new locations using rainfall, humidity, solar radiation, temperature and wind speed data collected during the monsoon seasons (November to February) from 1975 to 2008. This study used a combination of a geostatistical method (the variance-reduction method) and simulated annealing as the optimization algorithm during the redesign process. The results show that the new rain gauge locations provide the minimum value of the estimated variance. This shows that the combination of the geostatistical variance-reduction method and simulated annealing is successful in developing the new optimum rain gauge network.

  19. Solving the patient zero inverse problem by using generalized simulated annealing

    Science.gov (United States)

    Menin, Olavo H.; Bauch, Chris T.

    2018-01-01

    Identifying patient zero - the initially infected source of a given outbreak - is an important step in epidemiological investigations of both existing and emerging infectious diseases. Here, the use of the Generalized Simulated Annealing algorithm (GSA) to solve the inverse problem of finding the source of an outbreak is studied. The classical disease natural histories susceptible-infected (SI), susceptible-infected-susceptible (SIS), susceptible-infected-recovered (SIR) and susceptible-infected-recovered-susceptible (SIRS) in a regular lattice are addressed. Both the position of patient zero and its time of infection are considered unknown. The algorithm performance with respect to the generalization parameter q˜v and the fraction ρ of infected nodes for whom infection was ascertained is assessed. Numerical experiments show the algorithm is able to retrieve the epidemic source with good accuracy, even when ρ is small, but present no evidence to support that GSA performs better than its classical version. Our results suggest that simulated annealing could be a helpful tool for identifying patient zero in an outbreak where not all cases can be ascertained.
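GSA generalizes the classical acceptance rule using Tsallis statistics. A sketch of the generalized acceptance probability follows (the parameter name q_a and all values are illustrative; the classical Metropolis rule exp(-ΔE/T) is recovered in the limit q_a → 1):

```python
def gsa_acceptance(delta_e, temperature, q_a=1.5):
    """Generalized (Tsallis) acceptance probability used by GSA:
    [1 + (q_a - 1) * delta_e / T] ** (1 / (1 - q_a)) for worse moves."""
    if delta_e <= 0:
        return 1.0  # always accept improvements
    base = 1.0 + (q_a - 1.0) * delta_e / temperature
    if base <= 0.0:
        return 0.0  # outside the support of the generalized distribution
    return base ** (1.0 / (1.0 - q_a))

p_improve = gsa_acceptance(-0.5, 1.0)   # improving move
p_worse = gsa_acceptance(2.0, 1.0)      # deteriorating move
```

With q_a = 1.5, ΔE = 2 and T = 1 this gives p_worse = (1 + 0.5·2)^(-2) = 0.25; heavier-tailed acceptance (and visiting) distributions are what distinguish GSA from classical SA in the comparison above.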

  20. Comparison of particle swarm optimization and simulated annealing for locating additional boreholes considering combined variance minimization

    Science.gov (United States)

    Soltani-Mohammadi, Saeed; Safa, Mohammad; Mokhtari, Hadi

    2016-10-01

    One of the most important stages in complementary exploration is the optimal design of the additional drilling pattern, i.e., defining the optimum number and location of additional boreholes. A great deal of research has been carried out in this regard; in most of the proposed algorithms, kriging variance minimization is defined as the objective function for uncertainty assessment, and the problem is solved through optimization methods. Although the kriging variance has many advantages for defining the objective function, it is not sensitive to local variability. As a result, the only factors evaluated for locating the additional boreholes are the initial data configuration and the variogram model parameters, and the effects of local variability are omitted. In this paper, with the goal of considering local variability in the assessment of boundary uncertainty, the application of the combined variance is investigated to define the objective function. To verify the applicability of the proposed objective function, it is used to locate additional boreholes in the Esfordi phosphate mine through metaheuristic optimization methods, namely simulated annealing and particle swarm optimization. A comparison of results from the proposed objective function and conventional methods indicates that the changes imposed on the objective function make the algorithm output sensitive to variations of grade, the domain's boundaries and the thickness of the mineralization domain. The comparison between the results of the different optimization algorithms shows that, for the presented case, particle swarm optimization is more appropriate than simulated annealing.

  1. Redesigning rain gauges network in Johor using geostatistics and simulated annealing

    Science.gov (United States)

    Aziz, Mohd Khairul Bazli Mohd; Yusof, Fadhilah; Daud, Zalina Mohd; Yusop, Zulkifli; Kasno, Mohammad Afif

    2015-02-01

    Recently, many rainfall network design techniques have been developed, discussed and compared by many researchers. Present-day hydrological studies require higher levels of accuracy from collected data. In numerous basins, the rain gauge stations are located without a clear scientific rationale. In this study, an attempt is made to redesign the rain gauge network for Johor, Malaysia in order to meet the required level of accuracy preset by rainfall data users. The existing network of 84 rain gauges in Johor is optimized and redesigned into new locations using rainfall, humidity, solar radiation, temperature and wind speed data collected during the monsoon seasons (November to February) from 1975 to 2008. This study used a combination of a geostatistical method (the variance-reduction method) and simulated annealing as the optimization algorithm during the redesign process. The results show that the new rain gauge locations provide the minimum value of the estimated variance. This shows that the combination of the geostatistical variance-reduction method and simulated annealing is successful in developing the new optimum rain gauge network.

  2. Prediction of Flood Warning in Taiwan Using Nonlinear SVM with Simulated Annealing Algorithm

    Science.gov (United States)

    Lee, C.

    2013-12-01

    Flooding is an important issue in Taiwan because the island's narrow, high topography makes many of its rivers steep, and tropical depressions such as typhoons frequently cause these rivers to flood. Predicting river flow under extreme rainfall is therefore important for the government when announcing flood warnings; every time a typhoon passes through Taiwan, floods occur along some rivers. In Taiwan, warnings are classified into three levels according to the warning water levels. The purpose of this study is to predict the flood warning level from information on precipitation, rainfall duration and the slope of the riverbed. To classify the flood warning level from this information, a machine learning model, the nonlinear support vector machine (SVM), is formulated. In addition, simulated annealing (SA), a probabilistic heuristic algorithm, is used to determine the optimal parameters of the SVM model. A case study of flood-prone rivers of different gradients in Taiwan is conducted. The contribution of this SVM model with simulated annealing is its capability of making efficient flood warning announcements and keeping residents along the rivers safe from the danger of floods.

  3. Lattice Boltzmann simulation of flow and heat transfer in random porous media constructed by simulated annealing algorithm

    International Nuclear Information System (INIS)

    Liu, Minghua; Shi, Yong; Yan, Jiashu; Yan, Yuying

    2017-01-01

    Highlights: • A numerical capability combining the lattice Boltzmann method with the simulated annealing algorithm is developed. • Digitized representations of random porous media are constructed using limited but meaningful statistical descriptors. • Pore-scale flow and heat transfer information in random porous media is obtained by lattice Boltzmann simulation. • The effective properties at the representative elementary volume scale are well specified using appropriate upscale averaging. - Abstract: In this article, the lattice Boltzmann (LB) method for transport phenomena is combined with the simulated annealing (SA) algorithm for digitized porous-medium construction to study flow and heat transfer in random porous media. Importantly, in contrast to previous studies which simplify porous media as arrays of regularly shaped objects or effective pore networks, the LB + SA method in this article can model statistically meaningful random porous structures with irregular morphology, and simulate pore-scale transport processes inside them. Pore-scale isothermal flow and heat conduction in a set of constructed random porous media characterized by statistical descriptors were then simulated using the LB + SA method. The corresponding averages over the computational volumes and the related effective transport properties were also computed from these pore-scale numerical results. Good agreement between the numerical results and theoretical predictions or experimental data at the representative elementary volume scale was found. The numerical simulations in this article demonstrate that the combination of the LB method with the SA algorithm is a viable and powerful numerical strategy for simulating transport phenomena in random porous media with complex geometries.

  4. An Evaluation of the Use of Simulated Annealing to Optimize Thinning Rates for Single Even-Aged Stands

    Directory of Open Access Journals (Sweden)

    Kai Moriguchi

    2015-01-01

    We evaluated the potential of simulated annealing as a reliable method for optimizing the thinning rates of single even-aged stands. Four types of yield models were used as benchmark models to examine the algorithm's versatility. The thinning rate, constrained to 0-50% every 5 years at stand ages of 10-45 years, was optimized to maximize the net present value for one fixed rotation term (50 years). The best parameters for simulated annealing were chosen from 113 patterns, using the mean net present value from 39 runs to ensure the best performance. We compared the solutions with those from coarse full enumeration to evaluate the method's reliability, and with 39 runs of random search to evaluate its efficiency. In contrast to random search, the best run of simulated annealing for each of the four yield models produced a better solution than coarse full enumeration. However, for two of the yield models, variations in the objective function obtained with simulated annealing were significantly larger than those of random search. In conclusion, simulated annealing with optimized parameters is more efficient for optimizing thinning rates than random search, but multiple runs are necessary to obtain reliable solutions.

  5. Adaptive MANET Multipath Routing Algorithm Based on the Simulated Annealing Approach

    Directory of Open Access Journals (Sweden)

    Sungwook Kim

    2014-01-01

    A mobile ad hoc network is a system of wireless mobile nodes that can freely and dynamically self-organize into network topologies without any preexisting communication infrastructure. Due to characteristics such as temporary topology and the absence of centralized authority, routing is one of the major issues in ad hoc networks. In this paper, a new multipath routing scheme is proposed employing a simulated annealing approach. The proposed metaheuristic can achieve mutual advantages in hostile, dynamic, real-world network situations and is therefore a powerful method for finding an effective solution to the mobile ad hoc network routing problem. Simulation results indicate that the proposed paradigm adapts best to the variation of dynamic network situations. The average remaining energy, network throughput, packet loss probability, and traffic load distribution are improved by about 10%, 10%, 5%, and 10%, respectively, over the existing schemes.

  6. Determination of electron clinical spectra from percentage depth dose (PDD) curves by classical simulated annealing method

    International Nuclear Information System (INIS)

    Visbal, Jorge H. Wilches; Costa, Alessandro M.

    2016-01-01

    The percentage depth dose (PDD) of electron beams is an important item of data in radiation therapy since it describes their dosimetric properties. Accurate transport theory and the Monte Carlo method have shown obvious differences between the dose distribution of the electron beams of a clinical accelerator in a water phantom and the dose distribution, in water, of monoenergetic electrons at the accelerator's nominal energy. In radiotherapy, the electron spectrum should be considered to improve the accuracy of dose calculation, since the shape of the PDD curve depends on the way radiation particles deposit their energy in the patient or phantom, that is, on the spectrum. There are three principal approaches to obtaining electron energy spectra from the central-axis PDD: the Monte Carlo method, direct measurement and inverse reconstruction. In this work the simulated annealing method is presented as a practical, reliable and simple approach to inverse reconstruction, being an optimal alternative to the other options. (author)

  7. Intelligent simulated annealing algorithm applied to the optimization of the main magnet for magnetic resonance imaging machine; Algoritmo simulated annealing inteligente aplicado a la optimizacion del iman principal de una maquina de resonancia magnetica de imagenes

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez Lopez, Hector [Universidad de Oriente, Santiago de Cuba (Cuba). Centro de Biofisica Medica]. E-mail: hsanchez@cbm.uo.edu.cu

    2001-08-01

    This work describes an alternative simulated annealing algorithm applied to the design of the main magnet of a magnetic resonance imaging machine. The algorithm uses a probabilistic radial basis neural network to classify the possible solutions before the objective function evaluation. This procedure reduces by up to 50% the number of iterations required by simulated annealing to reach the global maximum, compared with the standard SA algorithm. The algorithm was applied to design a 0.1050 T four-coil resistive magnet, which produces a magnetic field 2.13 times more uniform than the solution given by SA. (author)

  8. STATIC JOB SHOP SCHEDULING USING THE SIMULATED ANNEALING METHOD TO MINIMIZE MAKESPAN

    Directory of Open Access Journals (Sweden)

    Moh. Husen

    2015-10-01

    Scheduling is a very important aspect for a company, because it is one of the elements of production planning and control: it allows the company to deliver goods at the agreed time and to achieve the minimum total completion time. In this study, scheduling with the simulated annealing (SA) method, implemented with the help of Matlab, is expected to produce a shorter total completion time (makespan) than the company's existing schedule. The SA method simulates the annealing process used in producing crystalline or metallic materials; the goal of that process is to obtain a well-ordered crystal structure using as little energy as possible. The problem faced by the company is that it has not considered makespan in product completion and production scheduling for its single boarding-house package product. Production data show delays in the completion time (makespan), so the company has to add 2-5 more days to finish all the products. Using the SA method yields a makespan of 23 hours, 2 hours faster than the initial schedule.

  9. Intelligent simulated annealing algorithm applied to the optimization of the main magnet for magnetic resonance imaging machine

    International Nuclear Information System (INIS)

    Sanchez Lopez, Hector

    2001-01-01

    This work describes an alternative simulated annealing algorithm applied to the design of the main magnet of a magnetic resonance imaging machine. The algorithm uses a probabilistic radial basis neural network to classify the possible solutions before the objective function evaluation. This procedure reduces by up to 50% the number of iterations required by simulated annealing to reach the global maximum, compared with the standard SA algorithm. The algorithm was applied to design a 0.1050 T four-coil resistive magnet, which produces a magnetic field 2.13 times more uniform than the solution given by SA. (author)

  10. Optimization of permanent-magnet undulator magnets ordering using simulated annealing algorithm

    International Nuclear Information System (INIS)

    Chen Nian; He Duohui; Li Ge; Jia Qika; Zhang Pengfei; Xu Hongliang; Cai Genwang

    2005-01-01

    A pure permanent-magnet undulator consists of many magnet blocks. The unavoidable remanence divergence of these magnets causes undulator magnetic field errors, which affect the functional mode of the storage ring and the quality of the spontaneous emission spectrum. By optimizing the ordering of the permanent-magnet undulator magnets with a simulated annealing algorithm before installing them, the first field integral can be reduced to 10⁻⁶ T·m, the second integral to 10⁻⁶ T·m², and the peak field error to less than 10⁻⁴. The optimized results are independent of the initial solution. This paper describes the optimization process in detail and puts forward a method to quickly calculate the peak field error and field integrals from the magnet remanence. (authors)
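The ordering optimization described above can be caricatured as simulated annealing over a permutation of measured remanence deviations, with pairwise swaps as moves. The objective here is a toy alternating-sum proxy for the first field integral, and all values are illustrative; a real code would evaluate the actual field integrals from the magnet data:

```python
import math
import random

random.seed(5)

# Measured remanence deviations of the magnet blocks (illustrative values).
deviations = [random.gauss(0.0, 1.0) for _ in range(20)]

def field_error(order):
    """Toy proxy for the first field integral: the absolute alternating
    sum of remanence deviations along the undulator."""
    return abs(sum(((-1) ** i) * deviations[k] for i, k in enumerate(order)))

def anneal_order(n, steps=4000, t0=1.0, cooling=0.999):
    """Simulated annealing over the magnet ordering with pairwise swaps."""
    order = list(range(n))
    f_cur = field_error(order)
    best, f_best, t = order[:], f_cur, t0
    for _ in range(steps):
        i, j = random.sample(range(n), 2)
        order[i], order[j] = order[j], order[i]          # swap two magnets
        f_new = field_error(order)
        if f_new <= f_cur or random.random() < math.exp(-(f_new - f_cur) / t):
            f_cur = f_new                                 # accept the swap
            if f_cur < f_best:
                best, f_best = order[:], f_cur
        else:
            order[i], order[j] = order[j], order[i]       # undo the swap
        t *= cooling
    return best, f_best

order, err = anneal_order(len(deviations))
```

The paper's fast evaluation of the peak field error and field integrals matters precisely because this objective is recomputed at every swap.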

  11. Parameter discovery in stochastic biological models using simulated annealing and statistical model checking.

    Science.gov (United States)

    Hussain, Faraz; Jha, Sumit K; Jha, Susmit; Langmead, Christopher J

    2014-01-01

    Stochastic models are increasingly used to study the behaviour of biochemical systems. While the structure of such models is often readily available from first principles, unknown quantitative features of the model are incorporated into it as parameters. Algorithmic discovery of parameter values from experimentally observed facts remains a challenge for the computational systems biology community. We present a new parameter discovery algorithm that uses simulated annealing, sequential hypothesis testing, and statistical model checking to learn the parameters in a stochastic model. We apply our technique to a model of glucose and insulin metabolism used for in-silico validation of artificial pancreata and demonstrate its effectiveness by developing a parallel CUDA-based implementation for parameter synthesis in this model.

  12. Extended Information Ratio for Portfolio Optimization Using Simulated Annealing with Constrained Neighborhood

    Science.gov (United States)

    Orito, Yukiko; Yamamoto, Hisashi; Tsujimura, Yasuhiro; Kambayashi, Yasushi

    Portfolio optimization determines the proportion-weighted combination of assets in a portfolio in order to achieve investment targets. It is a multi-dimensional combinatorial optimization problem, and it is difficult for a portfolio constructed in a past period to keep its performance in a future period. In order to maintain good portfolio performance, we propose the extended information ratio as an objective function, using the information ratio, beta, prime beta, or correlation coefficient. We apply simulated annealing (SA) to optimize the portfolio employing the proposed ratio. For the SA, we create a neighbor by an operation that changes the structure of the weights in the portfolio. In numerical experiments, we show that our portfolios keep good performance even when the market trend of the future period differs from that of the past period.
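The weight-changing neighbour operation mentioned above might look like the following sketch, which moves a small amount of weight from one asset to another and keeps the weights non-negative (the shift size and the list representation are illustrative assumptions, not the paper's exact operation):

```python
import random

random.seed(3)

def neighbour_weights(weights, shift=0.05):
    """Generate a neighbouring portfolio by moving a small amount of
    weight from one asset to another; the total weight stays at 1."""
    w = weights[:]
    i, j = random.sample(range(len(w)), 2)
    moved = min(shift, w[i])   # never push a weight below zero
    w[i] -= moved
    w[j] += moved
    return w

w0 = [0.25, 0.25, 0.25, 0.25]  # an equal-weight starting portfolio
w1 = neighbour_weights(w0)
```

Because each move conserves the total weight, every neighbour visited by the SA remains a valid fully invested portfolio.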

  13. Microwave imaging for conducting scatterers by hybrid particle swarm optimization with simulated annealing

    International Nuclear Information System (INIS)

    Mhamdi, B.; Grayaa, K.; Aguili, T.

    2011-01-01

    In this paper, a microwave imaging technique for reconstructing the shape of two-dimensional perfectly conducting scatterers by means of a stochastic optimization approach is investigated. Based on the boundary condition and the scattered field measured under transverse magnetic illumination, a set of nonlinear integral equations is obtained and the imaging problem is reformulated into an optimization problem. A hybrid approximation algorithm, called PSO-SA, is developed in this work to solve the inverse scattering problem. In the hybrid algorithm, particle swarm optimization (PSO) combines global and local search to find the optimal assignment in reasonable time, while simulated annealing (SA) accepts worse solutions with a certain probability to avoid being trapped in a local optimum. The hybrid approach elegantly combines the exploration ability of PSO with the exploitation ability of SA. Reconstruction results are compared with the exact shapes of some conducting cylinders, and good agreement with the original shapes is observed.

  14. An improved hybrid topology optimization approach coupling simulated annealing and SIMP (SA-SIMP)

    International Nuclear Information System (INIS)

    Garcia-Lopez, N P; Sanchez-Silva, M; Medaglia, A L; Chateauneuf, A

    2010-01-01

    The Solid Isotropic Material with Penalization (SIMP) methodology has been used extensively due to its versatility and ease of implementation. However, one of its main drawbacks is that the resulting topologies exhibit areas of intermediate density which lack any physical meaning. This paper presents a hybrid methodology coupling simulated annealing and SIMP (SA-SIMP) in order to achieve solutions which are stiffer and predominantly black and white. Under a look-ahead strategy, the algorithm gradually fixes or removes those elements whose density resulting from SIMP is intermediate. Different strategies for selecting and fixing the fractional elements are examined using benchmark examples, which show that the topologies resulting from SA-SIMP are stiffer than those from SIMP and predominantly black and white.

  15. Simulated annealing with restart strategy for the blood pickup routing problem

    Science.gov (United States)

    Yu, V. F.; Iswari, T.; Normasari, N. M. E.; Asih, A. M. S.; Ting, H.

    2018-04-01

    This study develops a simulated annealing heuristic with restart strategy (SA_RS) for solving the blood pickup routing problem (BPRP). BPRP minimizes the total length of the routes for blood bag collection between a blood bank and a set of donation sites, each associated with a time window constraint that must be observed. The proposed SA_RS is implemented in C++ and tested on benchmark instances of the vehicle routing problem with time windows to verify its performance. The algorithm is then tested on some newly generated BPRP instances and the results are compared with those obtained by CPLEX. Experimental results show that the proposed SA_RS heuristic effectively solves BPRP.
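A restart strategy of the kind SA_RS employs can be sketched as follows: when the best-so-far solution has not improved for a fixed number of iterations, the search restarts from that incumbent. All parameter values and the one-dimensional objective are illustrative assumptions, not the paper's exact SA_RS:

```python
import math
import random

random.seed(4)

def sa_with_restart(objective, x0, step=1.0, t0=1.0, cooling=0.9,
                    steps=500, restart_after=25):
    """Simulated annealing with a simple restart strategy: if the best
    solution found so far has not improved for `restart_after` iterations,
    the search jumps back to that best solution."""
    current, f_cur = x0, objective(x0)
    best, f_best = current, f_cur
    t, stall = t0, 0
    for _ in range(steps):
        cand = current + random.uniform(-step, step)   # continuous neighbour
        f_cand = objective(cand)
        delta = f_cand - f_cur
        if delta <= 0 or random.random() < math.exp(-delta / t):
            current, f_cur = cand, f_cand              # Metropolis acceptance
        if f_cur < f_best:
            best, f_best = current, f_cur
            stall = 0
        else:
            stall += 1
            if stall >= restart_after:                 # restart from best
                current, f_cur, stall = best, f_best, 0
        t *= cooling
    return best, f_best

x_best, f_min = sa_with_restart(lambda x: (x - 3.0) ** 2, 0.0)
```

In the routing setting of the abstract the state would be a set of routes and the neighbour a route modification, but the restart bookkeeping is the same.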

  16. Compatible topologies and parameters for NMR structure determination of carbohydrates by simulated annealing.

    Science.gov (United States)

    Feng, Yingang

    2017-01-01

    The use of NMR methods to determine the three-dimensional structures of carbohydrates and glycoproteins is still challenging, in part because of the lack of standard protocols. In order to increase the convenience of structure determination, the topology and parameter files for carbohydrates in the program Crystallography & NMR System (CNS) were investigated and new files were developed to be compatible with the standard simulated annealing protocols for proteins and nucleic acids. Recalculating the published structures of protein-carbohydrate complexes and glycosylated proteins demonstrates that the results are comparable to the published structures which employed more complex procedures for structure calculation. Integrating the new carbohydrate parameters into the standard structure calculation protocol will facilitate three-dimensional structural study of carbohydrates and glycosylated proteins by NMR spectroscopy.

  17. A Simulated Annealing method to solve a generalized maximal covering location problem

    Directory of Open Access Journals (Sweden)

    M. Saeed Jabalameli

    2011-04-01

    The maximal covering location problem (MCLP) seeks to locate a predefined number of facilities in order to maximize the number of covered demand points. In its classical form, MCLP rests on three implicit assumptions: all-or-nothing coverage, individual coverage, and a fixed coverage radius. Relaxing these assumptions leads to three classes of extended formulations: gradual cover models, cooperative cover models, and variable radius models. In this paper, we develop a special form of MCLP which combines the characteristics of gradual cover models, cooperative cover models, and variable radius models. The proposed problem has many applications, such as locating cell phone towers. The model is formulated as a mixed integer non-linear program (MINLP). A simulated annealing algorithm is used to solve the resulting problem, and the performance of the proposed method is evaluated on a set of randomly generated problems.
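A minimal sketch of the gradual and cooperative coverage ideas combined with an SA site-selection loop (radii, weights and the swap neighborhood are invented; the paper's MINLP also varies the radius, which is omitted here):

```python
import math
import random

def covered_demand(facilities, demands, r_full=1.0, r_zero=2.5):
    """Gradual + cooperative cover: each facility emits a signal that decays
    linearly from 1 (inside r_full) to 0 (beyond r_zero); signals add up, and
    a demand point is covered in proportion to its total signal, capped at 1."""
    total = 0.0
    for dx, dy, w in demands:
        sig = 0.0
        for fx, fy in facilities:
            d = math.hypot(dx - fx, dy - fy)
            if d <= r_full:
                sig += 1.0
            elif d < r_zero:
                sig += (r_zero - d) / (r_zero - r_full)
        total += w * min(sig, 1.0)
    return total

def sa_mclp(candidates, demands, p, iters=2000, t0=1.0, alpha=0.998, seed=0):
    """Pick p facility sites from `candidates` by simulated annealing."""
    rng = random.Random(seed)
    cur = rng.sample(range(len(candidates)), p)
    val = covered_demand([candidates[i] for i in cur], demands)
    best, best_val = cur[:], val
    t = t0
    for _ in range(iters):
        cand = cur[:]
        out = rng.randrange(p)                        # swap one chosen site...
        pool = [i for i in range(len(candidates)) if i not in cand]
        cand[out] = rng.choice(pool)                  # ...for an unused one
        v = covered_demand([candidates[i] for i in cand], demands)
        if v > val or rng.random() < math.exp((v - val) / t):
            cur, val = cand, v
            if v > best_val:
                best, best_val = cand[:], v
        t *= alpha
    return best, best_val
```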

  18. Optimization of PWR fuel assembly radial enrichment and burnable poison location based on adaptive simulated annealing

    International Nuclear Information System (INIS)

    Rogers, Timothy; Ragusa, Jean; Schultz, Stephen; St Clair, Robert

    2009-01-01

    The focus of this paper is to present a concurrent optimization scheme for the radial pin enrichment and burnable poison location in PWR fuel assemblies. The methodology is based on the Adaptive Simulated Annealing (ASA) technique, coupled with a neutron lattice physics code to update the cost function values. In this work, the variations in the pin U-235 enrichment are variables to be optimized radially, i.e., pin by pin. We consider the optimization of two categories of fuel assemblies, with and without Gadolinium burnable poison pins. When burnable poisons are present, both the radial distribution of enrichment and the poison locations are variables in the optimization process. Results for 15 x 15 PWR fuel assembly designs are provided.

  19. Improved Genetic and Simulating Annealing Algorithms to Solve the Traveling Salesman Problem Using Constraint Programming

    Directory of Open Access Journals (Sweden)

    M. Abdul-Niby

    2016-04-01

    The Traveling Salesman Problem (TSP) is an integer programming problem that falls into the category of NP-hard problems. As the problem becomes larger, there is no guarantee that optimal tours will be found within reasonable computation time. Heuristic techniques, like genetic algorithms and simulated annealing, can solve TSP instances with different levels of accuracy. Choosing which algorithm to use in order to get the best solution is still a hard choice. This paper suggests domain reduction as a tool to be combined with any meta-heuristic so that the obtained results will be almost the same regardless of the meta-heuristic chosen. The hybrid approach of combining domain reduction with any meta-heuristic thus addresses the challenge of choosing an algorithm that matches the TSP instance in order to get the best results.

  20. An efficient simulated annealing algorithm for the redundancy allocation problem with a choice of redundancy strategies

    International Nuclear Information System (INIS)

    Chambari, Amirhossain; Najafi, Amir Abbas; Rahmati, Seyed Habib A.; Karimi, Aida

    2013-01-01

    The redundancy allocation problem (RAP) is an important reliability optimization problem. This paper studies a specific RAP in which the redundancy strategies are chosen. To do so, the choice of redundancy strategy, between active and cold standby, is treated as a decision variable. The goal is to select the redundancy strategy, component, and redundancy level for each subsystem such that system reliability is maximized. Since RAP is an NP-hard problem, we propose an efficient simulated annealing (SA) algorithm to solve it. In addition, to evaluate the performance of the proposed algorithm, it is compared with well-known algorithms in the literature on different test problems. The results of the performance analysis show a relatively satisfactory efficiency of the proposed SA algorithm.
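The core decision variables — per subsystem, a redundancy strategy (active vs. cold standby) and a redundancy level — can be sketched in a toy SA loop for a series system (a hedged illustration with invented failure rates, budget and neighborhood; the paper's component-choice dimension and constraints are simplified away):

```python
import math
import random

def subsys_rel(strategy, n, lam=0.1, t=10.0):
    """Reliability of one subsystem built from n identical exponential components."""
    if strategy == "active":        # hot (active) parallel redundancy
        return 1.0 - (1.0 - math.exp(-lam * t)) ** n
    # cold standby with a perfect switch: Erlang survival function
    return math.exp(-lam * t) * sum((lam * t) ** k / math.factorial(k)
                                    for k in range(n))

def sa_rap(m=4, max_n=5, budget=12, iters=4000, t0=0.05, alpha=0.999, seed=0):
    """Choose a (strategy, level) pair per subsystem of an m-stage series system."""
    rng = random.Random(seed)
    cur = [("active", 1)] * m
    def rel(x):
        prod = 1.0
        for s, n in x:
            prod *= subsys_rel(s, n)
        return prod
    def cost(x):
        return sum(n for _, n in x)
    best, temp = list(cur), t0
    for _ in range(iters):
        cand = list(cur)
        i = rng.randrange(m)
        s, n = cand[i]
        if rng.random() < 0.5:                       # flip the strategy...
            s = "cold" if s == "active" else "active"
        else:                                        # ...or adjust the level
            n = min(max_n, max(1, n + rng.choice((-1, 1))))
        cand[i] = (s, n)
        if cost(cand) <= budget:                     # reject infeasible designs
            d = rel(cand) - rel(cur)
            if d > 0 or rng.random() < math.exp(d / temp):
                cur = cand
                if rel(cur) > rel(best):
                    best = list(cur)
        temp *= alpha
    return best, rel(best)
```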

  1. REPAIR SHOP JOB SCHEDULING WITH PARALLEL OPERATORS AND MULTIPLE CONSTRAINTS USING SIMULATED ANNEALING

    Directory of Open Access Journals (Sweden)

    N. Shivasankaran

    2013-04-01

    Scheduling problems are generally NP-complete combinatorial optimization problems, multi-objective and multi-constrained in nature. Repair shop job sequencing and operator allocation is one such NP-complete problem. For such problems, an efficient technique is required that explores a wide range of the solution space. This paper applies simulated annealing, a meta-heuristic, to solve the complex car sequencing and operator allocation problem in a car repair shop. The algorithm is tested with several constraint settings, and the solution quality exceeds the results reported in the literature with high convergence speed and accuracy. This algorithm can be considered quite effective where other heuristic routines fail.

  2. Two-Dimensional IIR Filter Design Using Simulated Annealing Based Particle Swarm Optimization

    Directory of Open Access Journals (Sweden)

    Supriya Dhabal

    2014-01-01

    We present a novel hybrid algorithm based on particle swarm optimization (PSO) and simulated annealing (SA) for the design of two-dimensional recursive digital filters. The proposed method, known as SA-PSO, integrates the global search ability of PSO with the local search ability of SA, each offsetting the weaknesses of the other. The Metropolis acceptance criterion is included in the basic PSO algorithm to increase the swarm's diversity by sometimes also accepting weaker solutions. The experimental results reveal that the performance of the optimal filter designed by the proposed SA-PSO method is improved. Further, the convergence behavior and optimization accuracy of the proposed method are improved significantly, and computational time is reduced. In addition, the proposed SA-PSO method produces the best optimal solution with lower mean and variance, which indicates that the algorithm can be used more efficiently in realizing two-dimensional digital filters.
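The hybridization idea — a Metropolis acceptance test grafted onto the personal-best update of PSO — can be sketched on a generic continuous minimization problem (a minimal illustration with invented parameters, not the authors' filter-design code):

```python
import math
import random

def sa_pso(f, dim=2, n=20, iters=300, t0=1.0, alpha=0.97,
           w=0.7, c1=1.5, c2=1.5, seed=0):
    """PSO with a Metropolis acceptance test on personal-best updates."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [list(p) for p in pos]
    gbest = list(min(pbest, key=f))
    t = t0
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            delta = f(pos[i]) - f(pbest[i])
            # Metropolis criterion: occasionally adopt a weaker personal best
            # to keep the swarm diverse early in the search
            if delta < 0 or rng.random() < math.exp(-delta / t):
                pbest[i] = list(pos[i])
            if f(pbest[i]) < f(gbest):
                gbest = list(pbest[i])
        t *= alpha          # geometric cooling of the acceptance temperature
    return gbest, f(gbest)
```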

  3. Discrete-State Simulated Annealing For Traveling-Wave Tube Slow-Wave Circuit Optimization

    Science.gov (United States)

    Wilson, Jeffrey D.; Bulson, Brian A.; Kory, Carol L.; Williams, W. Dan (Technical Monitor)

    2001-01-01

    Algorithms based on the global optimization technique of simulated annealing (SA) have proven useful in designing traveling-wave tube (TWT) slow-wave circuits for high RF power efficiency. The characteristic of SA that enables it to determine a globally optimized solution is its ability to accept non-improving moves in a controlled manner. In the initial stages of the optimization, the algorithm moves freely through configuration space, accepting most of the proposed designs. This freedom of movement allows non-intuitive designs to be explored rather than restricting the optimization to local improvement upon the initial configuration. As the optimization proceeds, the rate of acceptance of non-improving moves is gradually reduced until the algorithm converges to the optimized solution. The rate at which the freedom of movement is decreased is known as the annealing or cooling schedule of the SA algorithm. The main disadvantage of SA is that there is no rigorous theoretical foundation for determining the parameters of the cooling schedule. The choice of these parameters is highly problem dependent, and the designer needs to experiment in order to determine values that will provide a good optimization in a reasonable amount of computational time. This experimentation can absorb a large amount of time, especially when the algorithm is being applied to a new type of design. In order to eliminate this disadvantage, a variation of SA known as discrete-state simulated annealing (DSSA) was recently developed. DSSA provides the theoretical foundation for a generic cooling schedule which is problem independent. Results of similar quality to SA can be obtained, but without the extra computational time required to tune the cooling parameters. Two algorithm variations based on DSSA were developed and programmed into a Microsoft Excel spreadsheet graphical user interface (GUI) to the two-dimensional nonlinear multisignal helix traveling-wave amplifier analysis program TWA3
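The conventional SA skeleton whose cooling parameters the abstract says must be hand-tuned can be sketched as follows (a generic sketch, not the DSSA algorithm or the TWA3 code; all names and values are illustrative):

```python
import math
import random

def anneal(f, neighbor, x0, t0=1.0, t_min=1e-4, alpha=0.95,
           moves_per_temp=50, seed=0):
    """Plain SA skeleton: t0, alpha and moves_per_temp are exactly the
    problem-dependent cooling-schedule knobs the abstract discusses."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    t = t0
    while t > t_min:
        for _ in range(moves_per_temp):
            y = neighbor(x, rng)
            fy = f(y)
            # Metropolis rule: always accept improvements, sometimes accept
            # non-improving moves with probability exp(-delta / t)
            if fy < fx or rng.random() < math.exp((fx - fy) / t):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
        t *= alpha      # geometric cooling: the schedule DSSA aims to replace
    return best, fbest
```

A one-dimensional usage example: `anneal(lambda x: (x - 3.0) ** 2, lambda x, rng: x + rng.gauss(0, 0.5), x0=0.0)` converges near the minimizer x = 3.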

  4. An evolutionary programming based simulated annealing method for solving the unit commitment problem

    Energy Technology Data Exchange (ETDEWEB)

    Christober Asir Rajan, C. [Department of EEE, Pondicherry Engineering College, Pondicherry 605014 (India); Mohan, M.R. [Department of EEE, Anna University, Chennai 600 025 (India)

    2007-09-15

    This paper presents a new approach to solving the short-term unit commitment problem using an evolutionary programming based simulated annealing method. The objective is to find the generation schedule that minimizes the total operating cost subject to a variety of constraints; that is, to find the optimal generating unit commitment in the power system for the next H hours. Evolutionary programming, a global optimisation technique for solving the unit commitment problem, operates on a system designed to encode each unit's operating schedule with regard to its minimum up/down time. The unit commitment schedule is coded as a string of symbols, and an initial population of parent solutions is generated at random. Each schedule is formed by committing all the units according to their initial status ("flat start"). The parents are obtained from a pre-defined set of solutions, i.e. each solution is adjusted to meet the requirements. A random recommitment is then carried out with respect to the units' minimum down times, and simulated annealing improves the status. The best population is selected by an evolutionary strategy. The Neyveli Thermal Power Station (NTPS) Unit-II in India demonstrates the effectiveness of the proposed approach; extensive studies have also been performed for different power systems consisting of 10, 26 and 34 generating units. Numerical results compare the cost solutions and computation time obtained using the evolutionary programming method and conventional methods like dynamic programming, Lagrangian relaxation, simulated annealing and tabu search in reaching a proper unit commitment. (author)

  5. Exploration of DGVM Parameter Solution Space Using Simulated Annealing: Implications for Forecast Uncertainties

    Science.gov (United States)

    Wells, J. R.; Kim, J. B.

    2011-12-01

    Parameters in dynamic global vegetation models (DGVMs) are thought to be weakly constrained and can be a significant source of errors and uncertainties. DGVMs use between 5 and 26 plant functional types (PFTs) to represent the average plant life form in each simulated plot, and each PFT typically has a dozen or more parameters that define the way it uses resources and responds to the simulated growing environment. Sensitivity analysis explores how varying parameters affects the output, but does not fully explore the parameter solution space. The solution space for DGVM parameter values is thought to be complex and non-linear, and multiple sets of acceptable parameters may exist. In published studies, PFT parameters are estimated from published literature, often from a single published value, and are then "tuned" using somewhat arbitrary trial-and-error methods. BIOMAP is a new DGVM created by fusing the MAPSS biogeography model with Biome-BGC. It represents the vegetation of North America using 26 PFTs. We are using simulated annealing, a global search method, to systematically and objectively explore the solution space for the BIOMAP PFTs and the system parameters important for plant water use. We defined the boundaries of the solution space by obtaining maximum and minimum values from published literature and, where those were not available, using +/-20% of current values. We used stratified random sampling to select a set of grid cells representing the vegetation of the conterminous USA. The simulated annealing algorithm is applied to the parameters for spin-up and a transient run during the historical period 1961-1990. A set of parameter values is considered acceptable if the associated simulation run produces a modern potential vegetation distribution map that is as accurate as one produced by trial-and-error calibration. We expect to confirm that the solution space is non-linear and complex, and that
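The bounded-parameter calibration idea can be sketched with a toy stand-in for the model (everything here is invented for illustration — the two-parameter "model", the bounds, and the temperature-scaled proposal step; a real DGVM run replaces `model` and is far more expensive):

```python
import math
import random

# Toy stand-in for a model run: two parameters -> predicted values
def model(params, env):
    a, b = params
    return [a * e + b * math.sin(e) for e in env]

def calibrate(env, observed, bounds, iters=6000, t0=1.0, alpha=0.999, seed=0):
    """Anneal parameters within literature-derived bounds to fit observations."""
    rng = random.Random(seed)
    cur = [rng.uniform(lo, hi) for lo, hi in bounds]
    def sse(p):                        # misfit against the observed data
        return sum((m - o) ** 2 for m, o in zip(model(p, env), observed))
    cur_e = sse(cur)
    best, best_e = list(cur), cur_e
    t = t0
    for _ in range(iters):
        # proposal step shrinks with temperature so late moves refine the fit,
        # and clipping keeps every proposal inside the literature bounds
        cand = [min(hi, max(lo, v + rng.gauss(0, 0.3 * (hi - lo) * max(t, 0.01))))
                for v, (lo, hi) in zip(cur, bounds)]
        ce = sse(cand)
        if ce < cur_e or rng.random() < math.exp((cur_e - ce) / t):
            cur, cur_e = cand, ce
            if ce < best_e:
                best, best_e = list(cand), ce
        t *= alpha
    return best, best_e
```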

  6. Fusion Simulation Program Execution Plan

    International Nuclear Information System (INIS)

    Brooks, Jeffrey

    2011-01-01

    The overall science goal of the FSP is to develop predictive simulation capability for magnetically confined fusion plasmas at an unprecedented level of integration and fidelity. This will directly support and enable effective U.S. participation in research related to the International Thermonuclear Experimental Reactor (ITER) and the overall mission of delivering practical fusion energy. The FSP will address a rich set of scientific issues together with experimental programs, producing validated integrated physics results. This is very well aligned with the mission of the ITER Organization to coordinate with its members the integrated modeling and control of fusion plasmas, including benchmarking and validation activities. [1]. Initial FSP research will focus on two critical areas: 1) the plasma edge and 2) whole device modeling including disruption avoidance. The first of these problems involves the narrow plasma boundary layer and its complex interactions with the plasma core and the surrounding material wall. The second requires development of a computationally tractable, but comprehensive model that describes all equilibrium and dynamic processes at a sufficient level of detail to provide useful prediction of the temporal evolution of fusion plasma experiments. The initial driver for the whole device model (WDM) will be prediction and avoidance of discharge-terminating disruptions, especially at high performance, which are a critical impediment to successful operation of machines like ITER. If disruptions prove unable to be avoided, their associated dynamics and effects will be addressed in the next phase of the FSP. The FSP plan targets the needed modeling capabilities by developing Integrated Science Applications (ISAs) specific to their needs. 
The Pedestal-Boundary model will include boundary magnetic topology, cross-field transport of multi-species plasmas, parallel plasma transport, neutral transport, atomic physics and interactions with the plasma wall

  7. Bead Game Simulation. Lesson Plan.

    Science.gov (United States)

    Ripp, Ken

    This lesson plan offers students the opportunity to participate in the three basic economic systems (market, command, and tradition). By working in each of the systems, students will internalize the fundamental values present in each system and will gain insights into the basic advantages and disadvantages of each system. The lesson plan provides…

  8. Elemental thin film depth profiles by ion beam analysis using simulated annealing - a new tool

    International Nuclear Information System (INIS)

    Jeynes, C; Barradas, N P; Marriott, P K; Boudreault, G; Jenkin, M; Wendler, E; Webb, R P

    2003-01-01

    Rutherford backscattering spectrometry (RBS) and related techniques have long been used to determine the elemental depth profiles in films a few nanometres to a few microns thick. However, although obtaining spectra is very easy, solving the inverse problem of extracting the depth profiles from the spectra is not possible analytically except for special cases. It is because these special cases include important classes of samples, and because skilled analysts are adept at extracting useful qualitative information from the data, that ion beam analysis is still an important technique. We have recently solved this inverse problem using the simulated annealing algorithm. We have implemented the solution in the 'IBA DataFurnace' code, which has been developed into a very versatile and general new software tool that analysts can now use to rapidly extract quantitative accurate depth profiles from real samples on an industrial scale. We review the features, applicability and validation of this new code together with other approaches to handling IBA (ion beam analysis) data, with particular attention being given to determining both the absolute accuracy of the depth profiles and statistically accurate error estimates. We include examples of analyses using RBS, non-Rutherford elastic scattering, elastic recoil detection and non-resonant nuclear reactions. High depth resolution and the use of multiple techniques simultaneously are both discussed. There is usually systematic ambiguity in IBA data and Butler's example of ambiguity (1990 Nucl. Instrum. Methods B 45 160-5) is reanalysed. Analyses are shown: of evaporated, sputtered, oxidized, ion implanted, ion beam mixed and annealed materials; of semiconductors, optical and magnetic multilayers, superconductors, tribological films and metals; and of oxides on Si, mixed metal silicides, boron nitride, GaN, SiC, mixed metal oxides, YBCO and polymers. (topical review)

  9. Modeling and Simulated Annealing Optimization of Surface Roughness in CO2 Laser Nitrogen Cutting of Stainless Steel

    Directory of Open Access Journals (Sweden)

    M. Madić

    2013-09-01

    This paper presents a systematic methodology for empirical modeling and optimization of surface roughness in nitrogen-assisted CO2 laser cutting of stainless steel. The surface roughness prediction model was developed in terms of laser power, cutting speed, assist gas pressure and focus position using an artificial neural network (ANN). To cover a wider range of laser cutting parameters and obtain an experimental database for the ANN model development, Taguchi's L27 orthogonal array was implemented in the experimental plan. The developed ANN model was expressed as an explicit nonlinear function, while the influence of laser cutting parameters and their interactions on surface roughness was analyzed by generating 2D and 3D plots. The final goal of the experimental study focuses on the determination of the optimum laser cutting parameters for the minimization of surface roughness. Since the solution space of the developed ANN model is complex, and the possibility of many local solutions is great, simulated annealing (SA) was selected as the method for the optimization of surface roughness.

  10. Improvement of the matrix effect compensation in active neutron measurement by simulated annealing algorithm (June 2009)

    International Nuclear Information System (INIS)

    Raoux, A. C.; Loridon, J.; Mariani, A.; Passard, C.

    2009-01-01

    Active neutron measurements, such as the Differential Die-Away (DDA) technique involving a pulsed neutron generator, are widely applied to determine the fissile content of waste packages. Unfortunately, the main drawback of such techniques comes from the lack of knowledge of the waste matrix composition. Thus, matrix effect correction for the DDA measurement is an essential improvement in the field of fissile material content determination. Different solutions have been developed to compensate for the effect of the matrix on the interpretation of the neutron measurement. In this context, this paper describes an innovative matrix correction method developed with the goal of increasing the accuracy of the matrix effect correction and reducing the measurement time. The implementation of this method is based on the analysis of the raw signal with an optimisation algorithm called the simulated annealing algorithm. This algorithm needs a reference database of Multi-Channel Scaling (MCS) spectra to fit the raw signal. The construction of the MCS library involves a learning phase to define and acquire the DDA signals. This database has been provided by a set of active signals from experimental matrices (mock-up waste drums of 118 litres) recorded in a specific device dedicated to neutron measurement research and development at the Nuclear Measurement Laboratory of CEA-Cadarache, called PROMETHEE 6. The simulated annealing algorithm is applied to make use of the effect of the matrices on the total active signal of the DDA measurement. Furthermore, as this algorithm is applied directly to the raw active signal, it is very useful when active background contributions cannot be easily estimated and removed. Most of the cases tested during this work, which represents the feasibility phase of the method, are within a 4% agreement interval with the expected experimental value. Moreover, one can notice that without any compensation of the matrix effect, the classical DDA prompt

  12. Integration of electromagnetic induction sensor data in soil sampling scheme optimization using simulated annealing.

    Science.gov (United States)

    Barca, E; Castrignanò, A; Buttafuoco, G; De Benedetto, D; Passarella, G

    2015-07-01

    Soil survey is generally time-consuming, labor-intensive, and costly. Optimization of the sampling scheme allows one to reduce the number of sampling points without decreasing, or even while increasing, the accuracy of the investigated attribute. Maps of bulk soil electrical conductivity (ECa) recorded with electromagnetic induction (EMI) sensors can effectively direct soil sampling design for assessing the spatial variability of soil moisture. A protocol using a field-scale bulk ECa survey has been applied in an agricultural field in the Apulia region (southeastern Italy). Spatial simulated annealing was used to optimize the spatial soil sampling scheme, taking into account sampling constraints, field boundaries, and preliminary observations. Three optimization criteria were used: the first criterion (minimization of mean of the shortest distances, MMSD) optimizes the spreading of the point observations over the entire field by minimizing the expectation of the distance between an arbitrarily chosen point and its nearest observation; the second criterion (minimization of weighted mean of the shortest distances, MWMSD) is a weighted version of the MMSD, which uses the digital gradient of the gridded ECa data as a weighting function; and the third criterion (mean of average ordinary kriging variance, MAOKV) minimizes the mean kriging estimation variance of the target variable, utilizing the variogram model of soil water content estimated in a previous trial. The procedures, or a combination of them, were tested and compared in a real case. Simulated annealing was implemented with the software MSANOS, able to define or redesign any sampling scheme by increasing or decreasing the original sampling locations. The output consists of the computed sampling scheme, the convergence time, and the cooling law, which can be an invaluable support to the process of sampling design. The proposed approach found the optimal solution in a reasonable computation time.
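The MMSD criterion can be sketched with a toy spatial-annealing loop on a unit square "field" (a minimal illustration, not the MSANOS implementation; the grid, jitter step and temperatures are invented, and the weighting and kriging criteria are omitted):

```python
import math
import random

def mmsd(samples, grid):
    """Mean, over a dense evaluation grid, of the distance to the nearest sample."""
    return sum(min(math.dist(g, s) for s in samples) for g in grid) / len(grid)

def spatial_sa(n_samples=8, iters=1500, t0=0.05, alpha=0.997, seed=0):
    """Spread n_samples points over the unit field by annealing the MMSD."""
    rng = random.Random(seed)
    grid = [(i / 9.0, j / 9.0) for i in range(10) for j in range(10)]
    cur = [(rng.random(), rng.random()) for _ in range(n_samples)]
    cur_v = mmsd(cur, grid)
    best, best_v = list(cur), cur_v
    t = t0
    for _ in range(iters):
        cand = list(cur)
        k = rng.randrange(n_samples)
        x, y = cand[k]                        # jitter one sampling location,
        cand[k] = (min(1.0, max(0.0, x + rng.gauss(0, 0.1))),
                   min(1.0, max(0.0, y + rng.gauss(0, 0.1))))
        v = mmsd(cand, grid)
        if v < cur_v or rng.random() < math.exp((cur_v - v) / t):
            cur, cur_v = cand, v
            if v < best_v:
                best, best_v = list(cand), v
        t *= alpha
    return best, best_v
```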

  13. Communication Systems Simulation Laboratory (CSSL): Simulation Planning Guide

    Science.gov (United States)

    Schlesinger, Adam

    2012-01-01

    The simulation process, milestones and inputs are unknowns to first-time users of the CSSL. The Simulation Planning Guide aids in establishing expectations for both NASA and non-NASA facility customers. The potential audience for this guide includes both internal and commercial spaceflight hardware/software developers. It is intended to assist their engineering personnel in simulation planning and execution. Material covered includes a roadmap of the simulation process, roles and responsibilities of facility and user, major milestones, facility capabilities, and inputs required by the facility. Samples of deliverables, facility interfaces, and inputs necessary to define scope, cost, and schedule are included as an appendix to the guide.

  14. Kinetic Monte Carlo simulation of nanostructural evolution under post-irradiation annealing in dilute FeMnNi

    Energy Technology Data Exchange (ETDEWEB)

    Chiapetto, M. [SCK-CEN, Nuclear Materials Science Institute, Mol (Belgium); Unite Materiaux et Transformations (UMET), UMR 8207, Universite de Lille 1, ENSCL, Villeneuve d' Ascq (France); Becquart, C.S. [Unite Materiaux et Transformations (UMET), UMR 8207, Universite de Lille 1, ENSCL, Villeneuve d' Ascq (France); Laboratoire commun EDF-CNRS, Etude et Modelisation des Microstructures pour le Vieillissement des Materiaux (EM2VM) (France); Domain, C. [EDF R and D, Departement Materiaux et Mecanique des Composants, Les Renardieres, Moret sur Loing (France); Laboratoire commun EDF-CNRS, Etude et Modelisation des Microstructures pour le Vieillissement des Materiaux (EM2VM) (France); Malerba, L. [SCK-CEN, Nuclear Materials Science Institute, Mol (Belgium)

    2015-01-01

    Post-irradiation annealing experiments are often used to obtain clearer information on the nature of defects produced by irradiation. However, their interpretation is not always straightforward without the support of physical models. We apply here a physically-based set of parameters for object kinetic Monte Carlo (OKMC) simulations of the nanostructural evolution of FeMnNi alloys under irradiation to the simulation of their post-irradiation isochronal annealing, from 290 to 600 C. The model adopts a "grey alloy" scheme, i.e. the solute atoms are not introduced explicitly, only their effect on the properties of point-defect clusters is. Namely, it is assumed that both vacancy and SIA clusters are significantly slowed down by the solutes. The slowing down increases with size until the clusters become immobile. Specifically, the slowing down of SIA clusters by Mn and Ni can be justified in terms of the interaction between these atoms and crowdions in Fe. The results of the model compare quantitatively well with post-irradiation isochronal annealing experimental data, providing clear insight into the mechanisms that determine the disappearance or re-arrangement of defects as functions of annealing time and temperature. (copyright 2015 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  15. A memory structure adapted simulated annealing algorithm for a green vehicle routing problem.

    Science.gov (United States)

    Küçükoğlu, İlker; Ene, Seval; Aksoy, Aslı; Öztürk, Nursel

    2015-03-01

    Currently, reduction of carbon dioxide (CO2) emissions and fuel consumption has become a critical environmental problem and has attracted the attention of both academia and the industrial sector. Government regulations and customer demands are making environmental responsibility an increasingly important factor in overall supply chain operations. Within these operations, transportation has the most hazardous effects on the environment, i.e., CO2 emissions, fuel consumption, noise and toxic effects on the ecosystem. This study aims to construct vehicle routes with time windows that minimize the total fuel consumption and CO2 emissions. The green vehicle routing problem with time windows (G-VRPTW) is formulated using a mixed integer linear programming model. A memory structure adapted simulated annealing (MSA-SA) meta-heuristic algorithm is constructed due to the high complexity of the proposed problem and long solution times for practical applications. The proposed models are integrated with a fuel consumption and CO2 emissions calculation algorithm that considers the vehicle technical specifications, vehicle load, and transportation distance in a green supply chain environment. The proposed models are validated using well-known instances with different numbers of customers. The computational results indicate that the MSA-SA heuristic is capable of obtaining good G-VRPTW solutions within a reasonable amount of time by providing reductions in fuel consumption and CO2 emissions.

  16. Simulated annealing (SA) to vehicle routing problems with soft time windows

    Directory of Open Access Journals (Sweden)

    Suphan Sodsoon

    2014-12-01

    Full Text Available The researcher applied and developed a meta-heuristic method to solve the Vehicle Routing Problem with Soft Time Windows (VRPSTW). The case considers a single depot and multiple, generally sparse customers, each with a known but different demand and a specific time period in which to receive it. The problem is a representative combinatorial optimization problem and is known to be NP-hard. The algorithm uses Simulated Annealing (SA) to determine optimum solutions in short solving times. After developing the algorithm, it was applied to examine the relevant factors and the optimum extension of the time windows, testing these factors on Solomon's vehicle routing problems with time windows from the OR-Library, with a maximum of 25 customers. Six problems were used: C101, C102, R101, R102, RC101 and RC102. The results show the optimum extension of the time windows to be at the 50% level. Finally, comparing these answers with the cases of fixed and flexible time windows, the percentage error in the number of vehicles was approximately -28.57% and the percentage error in distance approximately -28.57%, with the algorithm spending an average processing time of 45.5 s per problem.
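
    The soft-time-window idea above, where early or late arrivals are allowed but penalized in the objective, can be sketched as a penalized route cost. This is an illustrative sketch, not the paper's implementation; the penalty weights `alpha` and `beta` and the arrival-time representation are assumptions.

```python
def route_cost(travel_dist, arrivals, windows, alpha=1.0, beta=1.0):
    """Route cost under soft time windows: travel distance plus a linear
    penalty for arriving before (earliness) or after (lateness) each
    customer's preferred window."""
    penalty = 0.0
    for t, (earliest, latest) in zip(arrivals, windows):
        if t < earliest:
            penalty += alpha * (earliest - t)  # arrived too early
        elif t > latest:
            penalty += beta * (t - latest)     # arrived too late
    return travel_dist + penalty
```

    An SA search would then compare candidate routes by this single scalar, trading extra distance against window violations.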

  17. Multi-Objective Optimization for Pure Permanent-Magnet Undulator Magnets Ordering Using Modified Simulated Annealing

    CERN Document Server

    Chen Nian; Li, Ge

    2004-01-01

    Undulator field errors influence the electron beam trajectories and lower the radiation quality. The angular deflection of the electron beam is determined by the first field integral, the orbital displacement of the electron beam is determined by the second field integral, and the radiation quality can be evaluated by the rms field error or phase error. Appropriate ordering of magnets can greatly reduce these errors. We apply a modified simulated annealing algorithm to this multi-objective optimization problem, taking the first field integral, second field integral and rms field error as objective functions. An undulator with small field errors can be designed by this method within a reasonable calculation time, even for the case of hundreds of magnets (first field integral reduced to 10⁻⁶ T·m, second integral to 10⁻⁶ T·m², and rms field error to 0.01%). Thus, the field correction after assembly of the undulator will be greatly simplified. This paper gives the optimizing process in detail and puts forward a new method to quickly calculate the rms field e...

  18. Simulated Annealing-Based Ant Colony Algorithm for Tugboat Scheduling Optimization

    Directory of Open Access Journals (Sweden)

    Qi Xu

    2012-01-01

    Full Text Available As the “first service station” for ships in the whole port logistics system, the tugboat operation system is one of the most important systems in port logistics. This paper formulates the tugboat scheduling problem as a multiprocessor task scheduling problem (MTSP) after analyzing the characteristics of tugboat operation. The model considers multiple anchorage bases, different operation modes, and three stages of operations (berthing/shifting-berth/unberthing). The objective is to minimize the total operation time for all tugboats in a port. A hybrid simulated annealing-based ant colony algorithm is proposed to solve the problem. Numerical experiments without the shifting-berth operation verified the effectiveness of the algorithm and showed that more efficient sailing is possible if tugboats return to the anchorage base promptly. Experiments with the shifting-berth operation show that the objective is most sensitive to the proportion of shifting-berth operations, is influenced only slightly by the tugboat deployment scheme, and is not sensitive to the handling operation times.

  19. Study on the mechanism and efficiency of simulated annealing using an LP optimization benchmark problem - 113

    International Nuclear Information System (INIS)

    Qianqian, Li; Xiaofeng, Jiang; Shaohong, Zhang

    2010-01-01

    The Simulated Annealing Algorithm (SAA) for solving combinatorial optimization problems is a popular method for loading pattern optimization. The main purpose of this paper is to understand the underlying search mechanism of SAA and to study its efficiency. In this study, a general SAA that employs random pair exchange of fuel assemblies to search for the optimum fuel Loading Pattern (LP) is applied to an exhaustively searched LP optimization benchmark problem. All the possible LPs of the benchmark problem have been enumerated and evaluated via the very fast and accurate Hybrid Harmonics and Linear Perturbation (HHLP) method, such that the mechanism of SA for LP optimization can be explicitly analyzed and its search efficiency evaluated. The generic core geometry itself dictates that only a small number of LPs can be generated by performing random single pair exchanges and that these LPs are necessarily mostly similar to the initial LP. This phase-space effect turns out to be the basic mechanism in SAA that explains its efficiency and good local search ability. A measure of search efficiency is introduced which shows that the stochastic nature of SAA greatly influences the variability of its search efficiency. It is also found that using the fuel assembly k-infinity distribution as a technique to filter the LPs can significantly enhance the SAA search efficiency. (authors)
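
    The random pair-exchange move described above is the core of this style of SAA. A minimal generic sketch follows; the geometric cooling schedule, step count and cost function are illustrative assumptions, not taken from the paper.

```python
import math
import random

def sa_pair_exchange(pattern, cost, t0=1.0, cooling=0.95, steps=2000, seed=0):
    """Simulated annealing over permutations using random pair exchange:
    each move swaps two randomly chosen positions, and worse moves are
    accepted with the Metropolis probability exp(-delta/T)."""
    rng = random.Random(seed)
    current = list(pattern)
    current_cost = cost(current)
    best, best_cost = list(current), current_cost
    t = t0
    for _ in range(steps):
        i, j = rng.sample(range(len(current)), 2)
        candidate = list(current)
        candidate[i], candidate[j] = candidate[j], candidate[i]  # pair exchange
        delta = cost(candidate) - current_cost
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            current, current_cost = candidate, current_cost + delta
            if current_cost < best_cost:
                best, best_cost = list(current), current_cost
        t = max(t * cooling, 1e-9)  # geometric cooling, floored to avoid div-by-zero
    return best, best_cost
```

    Because every move is a swap, the candidate is always a permutation of the initial pattern, which mirrors the paper's observation that single pair exchanges produce LPs mostly similar to the initial LP.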

  20. An Integrated Qualitative and Quantitative Biochemical Model Learning Framework Using Evolutionary Strategy and Simulated Annealing.

    Science.gov (United States)

    Wu, Zujian; Pang, Wei; Coghill, George M

    2015-01-01

    Both qualitative and quantitative model learning frameworks for biochemical systems have been studied in computational systems biology. In this research, after introducing two forms of pre-defined component patterns to represent biochemical models, we propose an integrative qualitative and quantitative modelling framework for inferring biochemical systems. In the proposed framework, interactions between reactants in the candidate models for a target biochemical system are evolved and eventually identified by the application of a qualitative model learning approach with an evolution strategy. Kinetic rates of the models generated from qualitative model learning are then further optimised by employing a quantitative approach with simulated annealing. Experimental results indicate that our proposed integrative framework can learn the relationships between biochemical reactants qualitatively and make the model replicate the behaviours of the target system by optimising the kinetic rates quantitatively. Moreover, potential reactants of a target biochemical system can be discovered by hypothesising complex reactants in the synthetic models. Based on the biochemical models learned from the proposed framework, biologists can further perform experimental studies in the wet laboratory. In this way, natural biochemical systems can be better understood.

  1. An interactive system for creating object models from range data based on simulated annealing

    International Nuclear Information System (INIS)

    Hoff, W.A.; Hood, F.W.; King, R.H.

    1997-01-01

    In hazardous applications such as remediation of buried waste and dismantlement of radioactive facilities, robots are an attractive solution. Sensing to recognize and locate objects is a critical need for robotic operations in unstructured environments. An accurate 3-D model of objects in the scene is necessary for efficient high-level control of robots. Drawing upon concepts from supervisory control, the authors have developed an interactive system for creating object models from range data, based on simulated annealing. Site modeling is a task that is typically performed using purely manual or autonomous techniques, each of which has inherent strengths and weaknesses. However, an interactive modeling system combines the advantages of both manual and autonomous methods, to create a system that has high operator productivity as well as high flexibility and robustness. The system is unique in that it can work with very sparse range data, tolerate occlusions, and tolerate cluttered scenes. The authors have performed an informal evaluation with four operators on 16 different scenes, and have shown that the interactive system is superior to either manual or automatic methods in terms of task time and accuracy.

  2. Optimization Of Thermo-Electric Coolers Using Hybrid Genetic Algorithm And Simulated Annealing

    Directory of Open Access Journals (Sweden)

    Khanh Doan V.K.

    2014-06-01

    Full Text Available Thermo-electric Coolers (TECs) nowadays are applied in a wide range of thermal energy systems. This is due to their superior features: no refrigerant or dynamic parts are needed, they generate no electrical or acoustical noise, and they are environmentally friendly. Over the past decades, much research has been devoted to improving the efficiency of TECs by enhancing the material parameters and design parameters. The material parameters are restricted by currently available materials and module fabrication technologies. Therefore, the main objective of TEC design is to determine a set of design parameters such as leg area, leg length and the number of legs. Two elements that play an important role when considering the suitability of TECs in applications are the rate of refrigeration (ROR) and the coefficient of performance (COP). In this paper, some previous research is reviewed to show the diversity of optimization approaches used in the design of TECs for enhancing performance and efficiency. After that, single-objective optimization problems (SOP) are solved using a Genetic Algorithm (GA) and Simulated Annealing (SA) to optimize the geometric properties so that TECs will operate at near-optimal conditions. Equality and inequality constraints were taken into consideration.

  3. Combined Simulated Annealing and Genetic Algorithm Approach to Bus Network Design

    Science.gov (United States)

    Liu, Li; Olszewski, Piotr; Goh, Pong-Chai

    A new method - combined simulated annealing (SA) and genetic algorithm (GA) approach is proposed to solve the problem of bus route design and frequency setting for a given road network with fixed bus stop locations and fixed travel demand. The method involves two steps: a set of candidate routes is generated first and then the best subset of these routes is selected by the combined SA and GA procedure. SA is the main process to search for a better solution to minimize the total system cost, comprising user and operator costs. GA is used as a sub-process to generate new solutions. Bus demand assignment on two alternative paths is performed at the solution evaluation stage. The method was implemented on four theoretical grid networks of different size and a benchmark network. Several GA operators (crossover and mutation) were utilized and tested for their effectiveness. The results show that the proposed method can efficiently converge to the optimal solution on a small network but computation time increases significantly with network size. The method can also be used for other transport operation management problems.
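
    A minimal sketch of how a GA operator can serve as the proposal step inside an SA acceptance loop, in the spirit of the combined method above. The one-point crossover, the replace-the-worse-parent rule and the list encoding are illustrative assumptions, not the authors' procedure.

```python
import math
import random

def hybrid_step(population, cost, temperature, rng):
    """One hybrid iteration: GA crossover proposes a child solution, and
    SA's Metropolis rule decides whether it replaces the worse parent."""
    p1, p2 = rng.sample(population, 2)
    cut = rng.randrange(1, len(p1))
    child = p1[:cut] + p2[cut:]                      # one-point crossover
    worse = p1 if cost(p1) >= cost(p2) else p2
    delta = cost(child) - cost(worse)
    if delta <= 0 or rng.random() < math.exp(-delta / temperature):
        population[population.index(worse)] = child  # SA-style acceptance
    return population
```

    Repeating this step while lowering `temperature` gives the SA outer loop its usual exploration-then-exploitation behavior, with GA supplying structured new solutions instead of random perturbations.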

  4. Fast simulated annealing inversion of surface waves on pavement using phase-velocity spectra

    Science.gov (United States)

    Ryden, N.; Park, C.B.

    2006-01-01

    The conventional inversion of surface waves depends on modal identification of measured dispersion curves, which can be ambiguous. It is possible to avoid mode-number identification and extraction by inverting the complete phase-velocity spectrum obtained from a multichannel record. We use the fast simulated annealing (FSA) global search algorithm to minimize the difference between the measured phase-velocity spectrum and that calculated from a theoretical layer model, including the field setup geometry. Results show that this algorithm can help one avoid getting trapped in local minima while searching for the best-matching layer model. The entire procedure is demonstrated on synthetic and field data for asphalt pavement. The viscoelastic properties of the top asphalt layer are taken into account, and the inverted asphalt stiffness as a function of frequency compares well with laboratory tests on core samples. The thickness and shear-wave velocity of the deeper embedded layers are resolved within 10% deviation from those values measured separately during pavement construction. The proposed method may be equally applicable to normal soil site investigation and in the field of ultrasonic testing of materials. © 2006 Society of Exploration Geophysicists.
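
    Fast simulated annealing differs from classical SA mainly in its cooling schedule: temperature falls inversely with iteration number rather than logarithmically, so the search cools much faster. A sketch of the two schedules for comparison; the abstract does not state the exact schedule used, so the forms below are the standard textbook ones.

```python
import math

def boltzmann_temperature(t0, k):
    """Classical (Boltzmann) SA schedule: very slow logarithmic cooling."""
    return t0 / math.log(k + 2)

def fast_temperature(t0, k):
    """Fast SA (Cauchy-style) schedule: temperature falls as ~1/k."""
    return t0 / (k + 1)
```

    The faster decay is what makes FSA practical for expensive forward models such as the layered phase-velocity computation here, at the cost of weaker convergence guarantees.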

  5. Optimal design of minimum mean-square error noise reduction algorithms using the simulated annealing technique.

    Science.gov (United States)

    Bai, Mingsian R; Hsieh, Ping-Ju; Hur, Kur-Nan

    2009-02-01

    The performance of the minimum mean-square error noise reduction (MMSE-NR) algorithm in conjunction with time-recursive averaging (TRA) for noise estimation is found to be very sensitive to the choice of two recursion parameters. To address this problem in a more systematic manner, this paper proposes an optimization method to efficiently search for the optimal parameters of the MMSE-TRA-NR algorithms. The objective function is based on a regression model, whereas the optimization process is carried out with the simulated annealing algorithm, which is well suited for problems with many local optima. Another NR algorithm proposed in the paper employs linear prediction coding as a preprocessor for extracting the correlated portion of human speech. Objective and subjective tests were undertaken to compare the optimized MMSE-TRA-NR algorithm with several conventional NR algorithms. The results of the subjective tests were processed using analysis of variance to assess statistical significance. A post hoc test, Tukey's Honestly Significant Difference, was conducted to further assess the pairwise differences between the NR algorithms.

  6. A proposal simulated annealing algorithm for proportional parallel flow shops with separated setup times

    Directory of Open Access Journals (Sweden)

    Helio Yochihiro Fuchigami

    2014-08-01

    Full Text Available This article addresses the problem of minimizing makespan on two parallel flow shops with proportional processing and setup times. The setup times are separated and sequence-independent. The parallel flow shop scheduling problem is a specific case of the well-known hybrid flow shop, characterized by a multistage production system with more than one machine working in parallel at each stage. This situation is very common in various kinds of companies, such as the chemical, electronics, automotive, pharmaceutical and food industries. This work proposes six Simulated Annealing algorithms, their perturbation schemes, and an algorithm for initial sequence generation. The study can be classified as “applied research” regarding its nature, “exploratory” in its objectives and “experimental” in its procedures, with a “quantitative” approach. The proposed algorithms were effective in solution quality and computationally efficient. Analysis of Variance (ANOVA) revealed no significant difference between the perturbation schemes in terms of makespan. The PS4 scheme, which moves a subsequence of jobs, is suggested because it provides the best success rate. It was also found that there is a significant difference between the results of the algorithms for each value of the proportionality factor of the processing and setup times of the flow shops.

  7. Power system restoration: planning and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Hazarika, D. [Assam Engineering Coll., Dept. of Electrical Engineering, Assam (India); Sinha, A.K. [Inidan Inst. of Technology, Dept. of Electrical Engineering, Kharagpur (India)

    2003-03-01

    This paper describes a restoration guidance simulator, which allows a power system operator/planner to simulate and plan restoration events in an interactive mode. The simulator provides a list of restoration events according to priority, based on restoration rules and a list of priority loads. It also provides, in an interactive mode, the list of events which become possible as the system grows during restoration. Further, each selected event is validated through a load flow and other analytical tools to show the consequences of implementing the planned event. (Author)

  8. Planning by Search through Simulations.

    Science.gov (United States)

    1985-10-01

    ...ing, intervals during the execution of the plan (e.g., a thesis advisor is used discontinuously by a graduate student during his thesis research). [The remainder of this record is illegible OCR output.]

  9. Crosshole Tomography, Waveform Inversion, and Anisotropy: A Combined Approach Using Simulated Annealing

    Science.gov (United States)

    Afanasiev, M.; Pratt, R. G.; Kamei, R.; McDowell, G.

    2012-12-01

    Crosshole seismic tomography has been used by Vale to provide geophysical images of mineralized massive sulfides in the Eastern Deeps deposit at Voisey's Bay, Labrador, Canada. To date, these data have been processed using traveltime tomography, and we seek to improve the resolution of these images by applying acoustic Waveform Tomography. Due to the computational cost of acoustic waveform modelling, local descent algorithms are employed in Waveform Tomography; due to non-linearity, an initial model is required which predicts first-arrival traveltimes to within a half-cycle of the lowest frequency used. Because seismic velocity anisotropy can be significant in hardrock settings, the initial model must quantify the anisotropy in order to meet the half-cycle criterion. In our case study, significant velocity contrasts between the target massive sulfides and the surrounding country rock led to difficulties in generating an accurate anisotropy model through traveltime tomography, and our starting model for Waveform Tomography failed the half-cycle criterion at large offsets. We formulate a new, semi-global approach for finding the best-fit 1-D elliptical anisotropy model using simulated annealing. Through random perturbations to Thomsen's ε parameter, we explore the L2 norm of the frequency-domain phase residuals in the space of potential anisotropy models: if a perturbation decreases the residuals, it is always accepted, but if a perturbation increases the residuals, it is accepted with the probability P = exp(-(Ei-E)/T). This is the Metropolis criterion, where Ei is the value of the residuals at the current iteration, E is the value of the residuals for the previously accepted model, and T is a probability control parameter, which is decreased over the course of the simulation via a preselected cooling schedule.
Convergence to the global minimum of the residuals is guaranteed only for infinitely slow cooling, but in practice good results are obtained from a variety
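
    The Metropolis criterion quoted above translates directly into code. Symbols follow the abstract: `e_new` is the residual of the perturbed model (Ei), `e_old` the residual of the previously accepted model (E), and `t` the temperature-like control parameter; the function signature itself is an illustrative assumption.

```python
import math
import random

def metropolis_accept(e_new, e_old, t, uniform=random.random):
    """Accept a perturbation always if it lowers the residual; otherwise
    accept with probability P = exp(-(e_new - e_old) / t)."""
    if e_new <= e_old:
        return True
    return uniform() < math.exp(-(e_new - e_old) / t)
```

    As `t` shrinks under the cooling schedule, uphill moves are accepted less and less often, which is what drives the anneal from exploration toward a local refinement of the best-fit anisotropy model.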

  10. Forecasting nonlinear chaotic time series with function expression method based on an improved genetic-simulated annealing algorithm.

    Science.gov (United States)

    Wang, Jun; Zhou, Bi-hua; Zhou, Shu-dao; Sheng, Zheng

    2015-01-01

    The paper proposes a novel function expression method to forecast chaotic time series, using an improved genetic-simulated annealing (IGSA) algorithm to establish the optimum function expression that describes the behavior of the time series. In order to deal with the weaknesses associated with the genetic algorithm, the proposed algorithm incorporates the simulated annealing operation, which has strong local search ability, into the genetic algorithm to enhance the performance of optimization; in addition, the fitness function and genetic operators are also improved. Finally, the method is applied to the chaotic time series of the quadratic and Rössler maps for validation. The effect of noise in the chaotic time series is also studied numerically. The numerical results verify that the method can forecast chaotic time series with high precision and effectiveness, and that the forecasting precision in the presence of a certain level of noise is also satisfactory. It can be concluded that the IGSA algorithm is efficient and superior.

  11. Operations planning simulation: Model study

    Science.gov (United States)

    1974-01-01

    The use of simulation modeling for the identification of system sensitivities to internal and external forces and variables is discussed. The technique provides a means of exploring alternate system procedures and processes, so that these alternatives may be considered on a mutually comparative basis, permitting the selection of a mode or modes of operation which have potential advantages to the system user and the operator. These advantages, as measurements of system efficiency, are: (1) the ability to meet specific schedules for operations, mission or mission-readiness requirements, or performance standards; and (2) the ability to accomplish the objectives within cost-effective limits.

  12. Solving a multi-objective manufacturing cell scheduling problem with the consideration of warehouses using a simulated annealing based procedure

    Directory of Open Access Journals (Sweden)

    Adrián A. Toncovich

    2019-01-01

    Full Text Available The competition manufacturing companies face has driven the development of novel and efficient methods that enhance the decision-making process. In this work, a specific flow shop scheduling problem of practical interest in the industry is presented and formalized using a mathematical programming model. The problem considers a manufacturing system arranged as a work cell that takes into account the transport operations of raw material and final products between the manufacturing cell and warehouses. For solving this problem, we present a multiobjective metaheuristic strategy based on simulated annealing, the Pareto Archived Simulated Annealing (PASA). We tested this strategy on two kinds of benchmark problem sets proposed by the authors. The first group is composed of small-sized problems. On these tests, PASA was able to obtain optimal or near-optimal solutions in significantly short computing times. To complete the analysis, we compared these results to the exact Pareto front of the instances obtained with the augmented ε-constraint method. Then, we also tested the algorithm on a set of larger problems to evaluate its performance in more extensive search spaces. We performed this assessment through an analysis of the hypervolume metric. Both sets of tests showed the competitiveness of the Pareto Archived Simulated Annealing for efficiently solving this problem and obtaining good quality solutions while using reasonable computational resources.
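
    The archive at the heart of Pareto Archived Simulated Annealing keeps only mutually non-dominated solutions. A minimal update rule for a minimization problem is sketched below; this illustrates the general archiving idea, not the authors' code, and the tuple-of-objectives encoding is an assumption.

```python
def dominates(a, b):
    """True if objective vector a is no worse than b in every objective and
    strictly better in at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, candidate):
    """Try to insert a candidate into a Pareto archive.
    Returns (new_archive, accepted)."""
    if any(dominates(member, candidate) for member in archive):
        return archive, False                    # candidate is dominated
    kept = [m for m in archive if not dominates(candidate, m)]
    return kept + [candidate], True              # drop newly dominated members
```

    The SA loop then uses the archive both as the output approximation of the Pareto front and, typically, as part of the acceptance test for new candidate schedules.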

  13. WEAR PERFORMANCE OPTIMIZATION OF SILICON NITRIDE USING GENETIC AND SIMULATED ANNEALING ALGORITHM

    Directory of Open Access Journals (Sweden)

    SACHIN GHALME

    2017-12-01

    Full Text Available Replacing a damaged joint with a suitable alternative material is a prime requirement for a patient who has arthritis. Generation of wear particles in the artificial joint during action or movement is a serious issue and leads to aseptic loosening of the joint. Research in the field of bio-tribology is trying to evaluate materials with minimum wear volume loss so as to extend joint life. Silicon nitride (Si3N4) is a non-oxide ceramic suggested as a new alternative for hip/knee joint replacement. Hexagonal boron nitride (hBN) is recommended as a solid lubricant additive to improve the wear performance of Si3N4. In this paper, an attempt has been made to evaluate the optimum combination of load and % volume of hBN in Si3N4 to minimize wear volume loss (WVL). The experiments were conducted according to the Design of Experiments (DoE) Taguchi method and a mathematical model was developed. Further, this model was processed with a Genetic Algorithm (GA) and Simulated Annealing (SA) to find the optimum percentage of hBN in Si3N4 to minimize wear volume loss against an alumina (Al2O3) counterface. The Taguchi method indicates a 15 N load and 8% volume of hBN to minimize the WVL of Si3N4, while GA and SA optimization give a 11.08 N load with 12.115% volume of hBN and a 11.0789 N load with 12.128% volume of hBN, respectively, to minimize WVL in Si3N4.

  14. Automated integration of genomic physical mapping data via parallel simulated annealing

    Energy Technology Data Exchange (ETDEWEB)

    Slezak, T.

    1994-06-01

    The Human Genome Center at the Lawrence Livermore National Laboratory (LLNL) is nearing closure on a high-resolution physical map of human chromosome 19. We have built automated tools to assemble 15,000 fingerprinted cosmid clones into 800 contigs with minimal spanning paths identified. These islands are being ordered, oriented, and spanned by a variety of other techniques including: Fluorescence In Situ Hybridization (FISH) at 3 levels of resolution, EcoRI restriction fragment mapping across all contigs, and a multitude of different hybridization and PCR techniques to link cosmid, YAC, BAC, PAC, and P1 clones. The FISH data provide us with partial order and distance data as well as orientation. We made the observation that map builders need a much rougher presentation of data than do map readers; the former wish to see raw data since these can expose errors or interesting biology. We further noted that by ignoring our length and distance data we could simplify our problem into one that could be readily attacked with optimization techniques. The data integration problem could then be seen as an M x N ordering of our N cosmid clones which "intersect" M larger objects, by defining "intersection" to mean either contig/map membership or hybridization results. Clearly, the goal of making an integrated map is now to rearrange the N cosmid clone "columns" such that the number of gaps on the object "rows" is minimized. Our FISH partially-ordered cosmid clones provide us with a set of constraints that cannot be violated by the rearrangement process. We solved the optimization problem via simulated annealing performed on a network of 40+ Unix machines in parallel, using a server/client model built on explicit socket calls. For current maps we can create a map in about 4 hours on the parallel net versus 4+ days on a single workstation. Our biologists are now using this software on a daily basis to guide their efforts toward final closure.
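
    The objective described above, rearranging clone "columns" so that each object "row" has as few gaps as possible, can be scored with a simple run-counting function. The 0/1 membership-matrix encoding below is an assumption for illustration, not the LLNL data format.

```python
def total_gaps(matrix, order):
    """Number of extra 1-runs summed over all rows for a given column order.
    A row whose 1s form a single contiguous block contributes zero gaps."""
    gaps = 0
    for row in matrix:
        runs, prev = 0, 0
        for col in order:
            if row[col] and not prev:
                runs += 1    # a new block of 1s starts here
            prev = row[col]
        gaps += max(0, runs - 1)
    return gaps
```

    An annealer would perturb `order` (subject to the FISH ordering constraints) and use this gap count, or a constrained variant of it, as the energy to minimize.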

  15. A restraint molecular dynamics and simulated annealing approach for protein homology modeling utilizing mean angles

    Directory of Open Access Journals (Sweden)

    Maurer Till

    2005-04-01

    Full Text Available Abstract Background We have developed the program PERMOL for semi-automated homology modeling of proteins. It is based on restrained molecular dynamics using a simulated annealing protocol in torsion angle space. As the main restraints defining the optimal local geometry of the structure, weighted mean dihedral angles and their standard deviations are used, calculated with an algorithm described earlier by Döker et al. (1999, BBRC, 257, 348–350). The overall long-range contacts are established via a small number of distance restraints between atoms involved in hydrogen bonds and backbone atoms of conserved residues. Employing the restraints generated by PERMOL, three-dimensional structures are obtained using standard molecular dynamics programs such as DYANA or CNS. Results To test this modeling approach it has been used for predicting the structure of the histidine-containing phosphocarrier protein HPr from E. coli and the structure of the human peroxisome proliferator-activated receptor γ (PPARγ). The divergence between the modeled HPr and the previously determined X-ray structure was comparable to the divergence between the X-ray structure and the published NMR structure. The modeled structure of PPARγ was also very close to the previously solved X-ray structure, with an RMSD of 0.262 nm for the backbone atoms. Conclusion In summary, we present a new method for homology modeling capable of producing high-quality structure models. An advantage of the method is that it can be used in combination with incomplete NMR data to obtain reasonable structure models in accordance with the experimental data.

  16. Optimization of a hydrometric network extension using specific flow, kriging and simulated annealing

    Science.gov (United States)

    Chebbi, Afef; Kebaili Bargaoui, Zoubeida; Abid, Nesrine; da Conceição Cunha, Maria

    2017-12-01

    In hydrometric stations, water levels are continuously observed and discharge rating curves are constantly updated to achieve accurate river level and discharge observations. An adequate spatial distribution of hydrological gauging stations is of great interest for river regime characterization, water infrastructure design, water resources management and ecological surveys. Due to the increase of riverside population and the associated flood risk, hydrological networks constantly need to be developed. This paper suggests taking advantage of kriging approaches to improve the design of a hydrometric network. The context deals with the application of an optimization approach using ordinary kriging and simulated annealing (SA) in order to identify the best locations to install new hydrometric gauges. The task at hand is to extend an existing hydrometric network in order to estimate, at ungauged sites, the average specific annual discharge, which is a key basin descriptor. This methodology is developed for the hydrometric network of the transboundary Medjerda River in the north of Tunisia. A Geographic Information System (GIS) is adopted to delineate basin limits and centroids, the latter being used to assign the locations of basins in the kriging development. Scenarios where the size of an existing 12-station network is alternatively increased by 1, 2, 3, 4 and 5 new station(s) are investigated using geo-regression and minimization of the variance of kriging errors. The analysis of the optimized locations from one scenario to another shows a perfect conformity with respect to the location of the new sites. The new locations ensure a better spatial coverage of the study area, as seen in the increase of both the average and maximum inter-station distances after optimization. The optimization procedure selects the basins that ensure the shifting of the mean drainage area towards higher specific discharges.

  17. Electrode Materials, Thermal Annealing Sequences, and Lateral/Vertical Phase Separation of Polymer Solar Cells from Multiscale Molecular Simulations

    KAUST Repository

    Lee, Cheng-Kuang

    2014-12-10

    © 2014 American Chemical Society. The nanomorphologies of the bulk heterojunction (BHJ) layer of polymer solar cells are extremely sensitive to the electrode materials and thermal annealing conditions. In this work, the correlations of electrode materials, thermal annealing sequences, and resultant BHJ nanomorphological details of a P3HT:PCBM BHJ polymer solar cell are studied by a series of large-scale, coarse-grained (CG) molecular simulations of a system comprised of PEDOT:PSS/P3HT:PCBM/Al layers. Simulations are performed for various configurations of electrode materials as well as processing temperatures. The complex CG molecular data are characterized using a novel extension of our graph-based framework to quantify morphology and establish a link between morphology and processing conditions. Our analysis indicates that vertical phase segregation of the P3HT:PCBM blend strongly depends on the electrode material and thermal annealing schedule. A thin P3HT-rich film is formed on the top, regardless of bottom electrode material, when the BHJ layer is exposed to the free surface during thermal annealing. In addition, preferential segregation of P3HT chains and PCBM molecules toward the PEDOT:PSS and Al electrodes, respectively, is observed. Detailed morphology analysis indicated that, surprisingly, vertical phase segregation does not affect the connectivity of donor/acceptor domains with respective electrodes. However, the formation of P3HT/PCBM depletion zones next to the P3HT/PCBM-rich zones can be a potential bottleneck for electron/hole transport due to the increase in transport pathway length. Analysis in terms of the fraction of intra- and interchain charge transport revealed that the processing schedule affects the average vertical orientation of polymer chains, which may be crucial for enhanced charge transport, nongeminate recombination, and charge collection.
The present study establishes a more detailed link between processing and morphology by combining multiscale molecular simulations with graph-based morphology analysis.

  18. Numerical and experimental simulation of mechanical and microstructural transformations in Batch annealing steels

    International Nuclear Information System (INIS)

    Monsalve, A.; Artigas, A.; Celentano, D.; Melendez, F.

    2004-01-01

    The heating and cooling curves during the batch annealing process of low carbon steel have been modeled using the finite element technique. This allowed the transient thermal profile to be predicted for every point of the annealed coils, particularly the hottest and coldest ones. The results have been adequately validated through experimental measurements, with good agreement found between experimental values and those predicted by the model. Moreover, an Avrami recrystallization model has been coupled to this thermal computation. Interrupted annealing experiments have been carried out by measuring the recrystallized fraction at the extreme points of the coil for different times. These data made it possible to validate the developed recrystallization model through reasonably good numerical-experimental fits. (Author) 6 refs

  19. Advanced Simulation and Computing Business Plan

    Energy Technology Data Exchange (ETDEWEB)

    Rummel, E. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-07-09

    To maintain a credible nuclear weapons program, the National Nuclear Security Administration’s (NNSA’s) Office of Defense Programs (DP) needs to make certain that the capabilities, tools, and expert staff are in place and are able to deliver validated assessments. This requires a complete and robust simulation environment backed by an experimental program to test ASC Program models. This ASC Business Plan document encapsulates a complex set of elements, each of which is essential to the success of the simulation component of the Nuclear Security Enterprise. The ASC Business Plan addresses the hiring, mentoring, and retaining of programmatic technical staff responsible for building the simulation tools of the nuclear security complex. The ASC Business Plan describes how the ASC Program engages with the industry partners on whom it relies for today’s and tomorrow’s high performance architectures. Each piece in this chain is essential to assure policymakers, who must make decisions based on the results of simulations, that they are receiving all the actionable information they need.

  20. PKA spectral effects on subcascade structures and free defect survival ratio as estimated by cascade-annealing computer simulation

    International Nuclear Information System (INIS)

    Muroga, Takeo

    1990-01-01

    The free defect survival ratio is calculated by "cascade-annealing" computer simulation using the MARLOWE and modified DAIQUIRI codes in various cases of Primary Knock-on Atom (PKA) spectra. The number of subcascades is calculated by "cut-off" calculation using MARLOWE. The adequacy of these methods is checked by comparing the results with experiments (surface segregation measurements and Transmission Electron Microscope cascade defect observations). The correlation using the weighted average recoil energy as a parameter shows that the saturation of the free defect survival ratio at high PKA energies has a close relation to the cascade splitting into subcascades. (author)

  1. Characterisation of amorphous silicon alloys by RBS/ERD with self consistent data analysis using simulated annealing

    International Nuclear Information System (INIS)

    Barradas, N.P.; Wendler, E.; Jeynes, C.; Summers, S.; Reehal, H.S.

    1999-01-01

    Full text: Hydrogenated amorphous silicon films are deposited by CVD onto insulating (silica) substrates for the fabrication of solar cells. 1.5 MeV ⁴He ERD/RBS is applied to the films, and a self-consistent depth profile of Si and H was obtained for each sample using the simulated annealing (SA) algorithm. The analytical procedure is described in detail, and the confidence limits of the profiles are obtained using the Markov Chain Monte Carlo method, which is a natural extension of the SA algorithm. We show how the results are of great benefit to the growers

  2. A New Heuristic Providing an Effective Initial Solution for a Simulated Annealing approach to Energy Resource Scheduling in Smart Grids

    DEFF Research Database (Denmark)

    Sousa, Tiago M; Morais, Hugo; Castro, R.

    2014-01-01

    scheduling problem. Therefore, the use of metaheuristics is required to obtain good solutions in a reasonable amount of time. This paper proposes two new heuristics, called naive electric vehicles charge and discharge allocation and generation tournament based on cost, developed to obtain an initial solution...... to be used in the energy resource scheduling methodology based on simulated annealing previously developed by the authors. The case study considers two scenarios with 1000 and 2000 electric vehicles connected in a distribution network. The proposed heuristics are compared with a deterministic approach...

  3. A market based active/reactive dispatch including transformer taps and reactor and capacitor banks using Simulated Annealing

    International Nuclear Information System (INIS)

    Gomes, Mario Helder; Saraiva, Joao Tome

    2009-01-01

    This paper describes an optimization model to be used by System Operators to validate the economic schedules obtained by Market Operators together with the injections from Bilateral Contracts. These studies are performed off-line on the day before operation; the developed model is based on adjustment bids submitted by generators and loads and is used by System Operators if necessary to enforce technical or security constraints. The model corresponds to an enhancement of an approach described in a previous paper and now includes discrete components such as transformer taps and reactor and capacitor banks. The resulting mixed integer formulation is solved using Simulated Annealing, a well-known metaheuristic especially suited for combinatorial problems. Once the Simulated Annealing converges and the values of the discrete variables are fixed, the resulting non-linear continuous problem is solved using Sequential Linear Programming to get the final solution. The developed model corresponds to an AC version; it includes constraints related to the capability diagram of synchronous generators and variables allowing the computation of the active power required to balance active losses. Finally, the paper includes a Case Study based on the IEEE 118 bus system to illustrate the results that can be obtained and their interest. (author)

  4. Multi–criteria evaluation and simulated annealing for delimiting high priority habitats of Alectoris chukar and Phasianus colchicus in Iran

    Directory of Open Access Journals (Sweden)

    Momeni Dehaghi, I.

    2018-01-01

    Full Text Available Habitat degradation and hunting are among the most important causes of population decline for Alectoris chukar and Phasianus colchicus, two of the most threatened game species in the Golestan Province of Iran. Limited data on the distribution and location of high-quality habitats for the two species make conservation efforts more difficult in the province. We used multi-criteria evaluation (MCE) as a coarse-filter approach to refine the general distribution areas into habitat suitability maps for the species. We then used these maps as input to simulated annealing, a heuristic algorithm, through Marxan in order to prioritize areas for conservation of the two species. To find the optimal solution, we tested various boundary length modifier (BLM) values in the simulated annealing process. Our results showed that the MCE approach was useful to refine general habitat maps. Assessment of the selected reserves confirmed the suitability of the selected areas (mainly neighboring the current reserves), making their management easier and more feasible. The total area of the selected reserves was about 476 km². As the current reserves of the Golestan Province represent only 23% of the optimal area, further protected areas should be considered to efficiently conserve these two species.

  5. Simulated Annealing-based Optimal Proportional-Integral-Derivative (PID) Controller Design: A Case Study on Nonlinear Quadcopter Dynamics

    Science.gov (United States)

    Nemirsky, Kristofer Kevin

    In this thesis, the history and evolution of rotor aircraft with simulated annealing-based PID applications were reviewed, and quadcopter dynamics were presented. The dynamics of a quadcopter were then modeled, analyzed, and linearized. A cascaded loop architecture with PID controllers was used to stabilize the plant dynamics, which was improved upon through the application of simulated annealing (SA). A Simulink model was developed to test the controllers and verify the functionality of the proposed control system design. In addition, the data that the Simulink model provided were compared with flight data to establish the validity of the derived dynamics as a proper mathematical model of the true dynamics of the quadcopter system. Then, the SA-based global optimization procedure was applied to obtain optimized PID parameters. It was observed that the gains tuned through the SA algorithm produced a better performing PID controller than the original manually tuned one. Next, we investigated the uncertain dynamics of the quadcopter setup. After adding uncertainty to the gyroscopic effects associated with pitch-and-roll rate dynamics, the controllers were shown to be robust against the added uncertainty. A discussion follows to summarize SA-based PID controller design and performance outcomes. Lastly, future work on SA application to multi-input multi-output (MIMO) systems is briefly discussed.
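The SA tuning loop this abstract describes can be sketched generically. The toy first-order plant, step sizes, and cooling schedule below are illustrative assumptions, not the thesis's actual quadcopter model or tuning setup:

```python
import math
import random

def step_response_cost(gains, n=200, dt=0.02):
    """Integral-squared-error of a unit-step command for a toy
    first-order plant x' = -x + u under a discrete PID law."""
    kp, ki, kd = gains
    x = 0.0          # plant state
    integ = 0.0      # integral of the error
    prev_err = 1.0   # error before the first sample (setpoint - x0)
    cost = 0.0
    for _ in range(n):
        err = 1.0 - x
        integ += err * dt
        deriv = (err - prev_err) / dt
        prev_err = err
        u = kp * err + ki * integ + kd * deriv
        x += dt * (-x + u)
        if not math.isfinite(x) or abs(x) > 1e9:
            return float("inf")   # unstable gain set
        cost += err * err * dt
    return cost

def anneal_pid(steps=500, t0=0.5, alpha=0.99, seed=0):
    """Anneal (kp, ki, kd) starting from a crude manual tuning."""
    rng = random.Random(seed)
    gains = [1.0, 0.0, 0.0]
    cost = step_response_cost(gains)
    best_gains, best_cost = gains[:], cost
    t = t0
    for _ in range(steps):
        # Gaussian perturbation of the gains, kept non-negative
        cand = [max(0.0, g + rng.gauss(0.0, 0.2)) for g in gains]
        c = step_response_cost(cand)
        # Metropolis rule: always accept improvements, sometimes accept worse
        if math.isfinite(c) and (c <= cost or
                                 rng.random() < math.exp((cost - c) / t)):
            gains, cost = cand, c
            if c < best_cost:
                best_gains, best_cost = cand[:], c
        t *= alpha                # geometric cooling
    return best_gains, best_cost
```

As in the thesis, the annealed gains should outperform the crude starting point on the chosen cost, since the best-so-far solution is retained throughout.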

  6. A dynamic programming–enhanced simulated annealing algorithm for solving bi-objective cell formation problem with duplicate machines

    Directory of Open Access Journals (Sweden)

    Mohammad Mohammadi

    2015-04-01

    Full Text Available The cell formation process is one of the first and most important steps in designing cellular manufacturing systems. It consists of identifying part families according to similarities in the design, shape, and processing of parts, and dedicating machines to each part family based on the operations required by the parts. In this study, a hybrid method based on a combination of a simulated annealing algorithm and dynamic programming was developed to solve a bi-objective cell formation problem with duplicate machines. In the proposed hybrid method, each solution is represented as a permutation of parts, which is created by the simulated annealing algorithm; dynamic programming is used to partition this permutation into part families and determine the number of machines in each cell such that the total dissimilarity between the parts and the total machine investment cost are minimized. The performance of the algorithm was evaluated by numerical experiments of different sizes. Our computational experiments indicated that the results were very encouraging in terms of computational time and solution quality.
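The permutation-plus-DP decomposition described above can be illustrated with a small sketch: SA searches over part permutations while DP optimally splits each permutation into contiguous cells. The dissimilarity matrix, single fixed machine cost per cell, and swap neighbourhood below are simplifying assumptions, not the paper's bi-objective formulation:

```python
import math
import random

def block_cost(perm, i, j, dissim, machine_cost):
    """Cost of one cell holding parts perm[i:j]: total pairwise
    dissimilarity inside the cell plus a fixed machine investment."""
    total = machine_cost
    for a in range(i, j):
        for b in range(a + 1, j):
            total += dissim[perm[a]][perm[b]]
    return total

def best_partition(perm, dissim, machine_cost):
    """DP over prefixes: f[j] = min over i of f[i] + cost of cell perm[i:j]."""
    n = len(perm)
    f = [0.0] + [math.inf] * n
    for j in range(1, n + 1):
        for i in range(j):
            c = f[i] + block_cost(perm, i, j, dissim, machine_cost)
            if c < f[j]:
                f[j] = c
    return f[n]

def anneal_cells(dissim, machine_cost, steps=2000, t0=1.0, alpha=0.999, seed=0):
    """SA proposes part permutations; DP scores each one optimally."""
    rng = random.Random(seed)
    n = len(dissim)
    perm = list(range(n))
    cost = best_partition(perm, dissim, machine_cost)
    best_cost, best_perm = cost, perm[:]
    t = t0
    for _ in range(steps):
        i, j = rng.sample(range(n), 2)
        perm[i], perm[j] = perm[j], perm[i]      # swap-neighbour move
        new = best_partition(perm, dissim, machine_cost)
        if new <= cost or rng.random() < math.exp((cost - new) / t):
            cost = new
            if cost < best_cost:
                best_cost, best_perm = cost, perm[:]
        else:
            perm[i], perm[j] = perm[j], perm[i]  # rejected: undo the swap
        t *= alpha
    return best_cost, best_perm
```

On a 6-part instance with two obvious clusters, the DP step alone already recovers the two-cell grouping whenever SA delivers a permutation that keeps each cluster contiguous.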

  7. Simulation of Defect Reduction in Block Copolymer Thin Films by Solvent Annealing

    Energy Technology Data Exchange (ETDEWEB)

    Hur, Su-Mi; Khaira, Gurdaman S.; Ramírez-Hernández, Abelardo; Müller, Marcus; Nealey, Paul F.; de Pablo, Juan J.

    2015-01-20

    Solvent annealing provides an effective means to control the self-assembly of block copolymer (BCP) thin films. Multiple effects, including swelling, shrinkage, and morphological transitions, act in concert to yield ordered or disordered structures. The current understanding of these processes is limited; by relying on a theoretically informed coarse-grained model of block copolymers, a conceptual framework is presented that permits prediction and rationalization of experimentally observed behaviors. Through proper selection of several process conditions, it is shown that a narrow window of solvent pressures exists over which one can direct a BCP material to form well-ordered, defect-free structures.

  8. 1-Dimensional simulation of thermal annealing in a commercial nuclear power plant reactor pressure vessel wall section

    International Nuclear Information System (INIS)

    Nakos, J.T.; Rosinski, S.T.; Acton, R.U.

    1994-11-01

    The objective of this work was to provide experimental heat transfer boundary condition and reactor pressure vessel (RPV) section thermal response data that can be used to benchmark computer codes that simulate thermal annealing of RPVs. This specific project was designed to provide the Electric Power Research Institute (EPRI) with experimental data that could be used to support the development of a thermal annealing model. A secondary benefit is to provide additional experimental data (e.g., thermal response of the concrete reactor cavity wall) that could be of use in an annealing demonstration project. The setup comprised a heater assembly, a 1.2 m x 1.2 m x 17.1 cm thick [4 ft x 4 ft x 6.75 in] section of an RPV (A533B ferritic steel with stainless steel cladding), a mockup of the "mirror" insulation between the RPV and the concrete reactor cavity wall, and a 25.4 cm [10 in] thick concrete wall, 3.0 m x 3.0 m [10 ft x 10 ft] square. Experiments were performed at heat-up/cooldown rates of 7, 14, and 28 degrees C/hr [12.5, 25, and 50 degrees F/hr] as measured on the heated face. A peak temperature of 454 degrees C [850 degrees F] was maintained on the heated face until the concrete wall temperature reached equilibrium. Results are most representative of those RPV locations where the heat transfer would be 1-dimensional. Temperature was measured at multiple locations on the heated and unheated faces of the RPV section and the concrete wall. Incident heat flux was measured on the heated face, and absorbed heat flux estimates were generated from temperature measurements and an inverse heat conduction code. Through-wall temperature differences, concrete wall temperature response, and heat flux absorbed into the RPV surface and incident on the surface are presented. All of these data are useful to modelers developing codes to simulate RPV annealing.

  9. Convection methodology for fission track annealing: direct and inverse numerical simulations in the multi-exponential case

    International Nuclear Information System (INIS)

    Miellou, J.C.; Igli, H.; Grivet, M.; Rebetez, M.; Chambaudet, A.

    1994-01-01

    In minerals, the uranium fission tracks are sensitive to temperature and time. The consequence is that the etchable lengths are reduced. To simulate the phenomenon, at the last International Conference on Nuclear Tracks in solids at Beijing in 1992, we proposed a convection model for fission track annealing based on a reaction situation associated with only one activation energy. Moreover a simple inverse method based on the resolution of an ordinary differential equation was described, making it possible to retrace the thermal history in this mono-exponential situation. The aim of this paper is to consider a more involved class of models including multi-exponentials associated with several activation energies. We shall describe in this framework the modelling of the direct phenomenon and the resolution of the inverse problem. Results of numerical simulations and comparison with the mono-exponential case will be presented. 5 refs. (author)

  10. Electrical Impedance Tomography Reconstruction Through Simulated Annealing using a New Outside-in Heuristic and GPU Parallelization

    International Nuclear Information System (INIS)

    Tavares, R S; Tsuzuki, M S G; Martins, T C

    2012-01-01

    Electrical Impedance Tomography (EIT) is an imaging technique that attempts to reconstruct the conductivity distribution inside an object from electrical currents and potentials applied and measured at its surface. The EIT reconstruction problem is approached as an optimization problem, where the difference between the simulated and measured distributions must be minimized. This optimization problem can be solved using Simulated Annealing (SA), but at a high computational cost. To reduce the computational load, it is possible to use an incomplete evaluation of the objective function. This algorithm was found to exhibit an outside-in behavior, determining the impedance of the external elements first, similar to a layer-stripping algorithm. A new outside-in heuristic exploiting this property is proposed. The paper also presents the impact of using GPUs to parallelize matrix-vector multiplication and triangular solvers. Results with experimental data are presented. The outside-in heuristic proved faster than the conventional SA algorithm.

  11. A fitting algorithm based on simulated annealing techniques for efficiency calibration of HPGe detectors using different mathematical functions

    Energy Technology Data Exchange (ETDEWEB)

    Hurtado, S. [Servicio de Radioisotopos, Centro de Investigacion, Tecnologia e Innovacion (CITIUS), Universidad de Sevilla, Avda. Reina Mercedes s/n, 41012 Sevilla (Spain)], E-mail: shurtado@us.es; Garcia-Leon, M. [Departamento de Fisica Atomica, Molecular y Nuclear, Facultad de Fisica, Universidad de Sevilla, Aptd. 1065, 41080 Sevilla (Spain); Garcia-Tenorio, R. [Departamento de Fisica Aplicada II, E.T.S.A. Universidad de Sevilla, Avda, Reina Mercedes 2, 41012 Sevilla (Spain)

    2008-09-11

    In this work several mathematical functions are compared in order to perform the full-energy peak efficiency calibration of HPGe detectors using a 126 cm³ HPGe coaxial detector and gamma-ray energies ranging from 36 to 1460 keV. Statistical tests and Monte Carlo simulations were used to study the performance of the fitting curve equations. Furthermore, the fitting procedure of these complex functional forms to experimental data is a non-linear multi-parameter minimization problem. In gamma-ray spectrometry, non-linear least-squares fitting algorithms (Levenberg-Marquardt method) usually provide fast convergence while minimizing the reduced chi-square χR², although they sometimes reach only local minima. To overcome that shortcoming, a hybrid algorithm based on simulated annealing (HSA) techniques is proposed. Additionally, a new function is suggested that models the efficiency curve of germanium detectors in gamma-ray spectrometry.
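The idea of attacking the χ² surface with an annealing search instead of a purely gradient-based fit can be sketched as follows. The log-polynomial efficiency model, synthetic data, and cooling schedule are illustrative assumptions, not the paper's HSA algorithm or its new efficiency function:

```python
import math
import random

# Synthetic full-energy-peak efficiency data from a log-polynomial model
# ln(eff) = a0 + a1*ln(E) + a2*ln(E)^2 (coefficients are made up).
TRUE = (-1.0, 0.55, -0.09)
ENERGIES = [46.5, 59.5, 88.0, 122.1, 344.3, 661.7, 834.8, 1115.5, 1332.5, 1460.8]

def model(params, e):
    a0, a1, a2 = params
    x = math.log(e)
    return math.exp(a0 + a1 * x + a2 * x * x)

DATA = [model(TRUE, e) for e in ENERGIES]
SIGMA = [0.05 * d for d in DATA]            # 5% relative uncertainties

def chi2(params):
    """Weighted sum of squared residuals over all calibration points."""
    return sum(((model(params, e) - d) / s) ** 2
               for e, d, s in zip(ENERGIES, DATA, SIGMA))

def anneal_fit(steps=4000, t0=50.0, alpha=0.998, seed=3):
    """SA walk over (a0, a1, a2), retaining the best chi-square seen."""
    rng = random.Random(seed)
    params = (0.0, 0.0, 0.0)
    c = chi2(params)
    best, best_c = params, c
    t = t0
    for _ in range(steps):
        scale = 0.05 + 0.1 * t / t0          # larger moves while hot
        cand = tuple(p + rng.gauss(0.0, scale) for p in params)
        nc = chi2(cand)
        if nc <= c or rng.random() < math.exp((c - nc) / t):
            params, c = cand, nc
            if nc < best_c:
                best, best_c = cand, nc
        t *= alpha
    return best, best_c
```

In a hybrid scheme like the one the abstract describes, the annealed parameters would then seed a local least-squares refinement; the sketch stops at the SA stage.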

  12. A Comparison of the Simulated Annealing and Harmony Search Algorithms in Picking Order Sequence Application

    Directory of Open Access Journals (Sweden)

    Tanti Octavia

    2017-12-01

    Full Text Available Implementation of mobile rack warehouses is common in the manufacturing industry because it can minimize the warehouse area used. Applying picking orders in the retrieval of Stock Keeping Units (SKUs) in mobile rack warehouses can speed up order loading. This research aims to find out which algorithm is better for picking order sequencing in a mobile rack warehouse. The algorithms used are Simulated Annealing (SA) and Harmony Search (HS), and both are compared in terms of the gap with the shortest path method. The results show that the HS algorithm produces a better solution than the SA algorithm with lower CPU time, but the convergence rate of HS is lower than that of SA. HS was able to produce a better solution than the shortest path method in 9 of 15 cases, while SA did so in only 8.

  13. a Comparison of Simulated Annealing, Genetic Algorithm and Particle Swarm Optimization in Optimal First-Order Design of Indoor Tls Networks

    Science.gov (United States)

    Jia, F.; Lichti, D.

    2017-09-01

    The optimal network design problem has been well addressed in geodesy and photogrammetry but has not received the same attention for terrestrial laser scanner (TLS) networks. The goal of this research is to develop a complete design system that can automatically provide an optimal plan for high-accuracy, large-volume scanning networks. The aim in this paper is to use three heuristic optimization methods, simulated annealing (SA), genetic algorithm (GA) and particle swarm optimization (PSO), to solve the first-order design (FOD) problem for a small-volume indoor network and compare their performances. The room is simplified as discretized wall segments and possible viewpoints. Each possible viewpoint is evaluated with a score table representing the wall segments visible from each viewpoint based on scanning geometry constraints. The goal is to find a minimum number of viewpoints that can obtain complete coverage of all wall segments with a minimal sum of incidence angles. The different methods have been implemented and compared in terms of the quality of the solutions, runtime and repeatability. The experiment environment was simulated from a room located on the University of Calgary campus where multiple scans are required due to occlusions from interior walls. The results obtained in this research show that PSO and GA provide similar solutions while SA does not guarantee an optimal solution within limited iterations. Overall, GA is considered the best choice for this problem based on its capability of providing an optimal solution and fewer parameters to tune.

  14. SARAPAN-A simulated-annealing-based tool to generate random patterned-channel-age in CANDU fuel management analyses

    Energy Technology Data Exchange (ETDEWEB)

    Kastanya, Doddy [Safety and Licensing Department, Candesco Division of Kinectrics Inc., Toronto (Canada)

    2017-02-15

    In any reactor physics analysis, the instantaneous power distribution in the core can be calculated when the actual bundle-wise burnup distribution is known. Considering the fact that CANDU (Canada Deuterium Uranium) utilizes on-power refueling to compensate for the reduction of reactivity due to fuel burnup, in the CANDU fuel management analysis, snapshots of power and burnup distributions can be obtained by simulating and tracking the reactor operation over an extended period using various tools such as the *SIMULATE module of the Reactor Fueling Simulation Program (RFSP) code. However, for some studies, such as an evaluation of a conceptual design of a next-generation CANDU reactor, the preferred approach to obtain a snapshot of the power distribution in the core is based on the patterned-channel-age model implemented in the *INSTANTAN module of the RFSP code. The objective of this approach is to obtain a representative snapshot of core conditions quickly. At present, such patterns could be generated by using a program called RANDIS, which is implemented within the *INSTANTAN module. In this work, we present an alternative approach to derive the patterned-channel-age model where a simulated-annealing-based algorithm is used to find such patterns, which produce reasonable power distributions.

  15. SARAPAN—A Simulated-Annealing-Based Tool to Generate Random Patterned-Channel-Age in CANDU Fuel Management Analyses

    Directory of Open Access Journals (Sweden)

    Doddy Kastanya

    2017-02-01

    Full Text Available In any reactor physics analysis, the instantaneous power distribution in the core can be calculated when the actual bundle-wise burnup distribution is known. Considering the fact that CANDU (Canada Deuterium Uranium utilizes on-power refueling to compensate for the reduction of reactivity due to fuel burnup, in the CANDU fuel management analysis, snapshots of power and burnup distributions can be obtained by simulating and tracking the reactor operation over an extended period using various tools such as the *SIMULATE module of the Reactor Fueling Simulation Program (RFSP code. However, for some studies, such as an evaluation of a conceptual design of a next-generation CANDU reactor, the preferred approach to obtain a snapshot of the power distribution in the core is based on the patterned-channel-age model implemented in the *INSTANTAN module of the RFSP code. The objective of this approach is to obtain a representative snapshot of core conditions quickly. At present, such patterns could be generated by using a program called RANDIS, which is implemented within the *INSTANTAN module. In this work, we present an alternative approach to derive the patterned-channel-age model where a simulated-annealing-based algorithm is used to find such patterns, which produce reasonable power distributions.

  16. A Single-Machine Two-Agent Scheduling Problem by a Branch-and-Bound and Three Simulated Annealing Algorithms

    Directory of Open Access Journals (Sweden)

    Shangchia Liu

    2015-01-01

    Full Text Available In the field of distributed decision making, different agents share a common processing resource, and each agent wants to minimize a cost function depending on its jobs only. These issues arise in different application contexts, including real-time systems, integrated service networks, industrial districts, and telecommunication systems. Motivated by its importance in practical applications, we consider two-agent scheduling on a single machine where the objective is to minimize the total completion time of the jobs of the first agent, with the restriction that an upper bound is imposed on the total completion time of the jobs of the second agent. For solving the proposed problem, a branch-and-bound algorithm is developed for the optimal solution, along with three simulated annealing algorithms. In addition, extensive computational experiments are conducted to test the performance of the algorithms.

  17. A hybrid simulated annealing approach to handle energy resource management considering an intensive use of electric vehicles

    DEFF Research Database (Denmark)

    Sousa, Tiago; Vale, Zita; Carvalho, Joao Paulo

    2014-01-01

    The massification of electric vehicles (EVs) can have a significant impact on the power system, requiring a new approach for the energy resource management. The energy resource management has the objective to obtain the optimal scheduling of the available resources considering distributed...... to determine the best solution in a reasonable amount of time. This paper presents a hybrid artificial intelligence technique to solve a complex energy resource management problem with a large number of resources, including EVs, connected to the electric network. The hybrid approach combines simulated...... annealing (SA) and ant colony optimization (ACO) techniques. The case study concerns different EVs penetration levels. Comparisons with a previous SA approach and a deterministic technique are also presented. For 2000 EVs scenario, the proposed hybrid approach found a solution better than the previous SA...

  18. Optimization of seasonal ARIMA models using differential evolution - simulated annealing (DESA) algorithm in forecasting dengue cases in Baguio City

    Science.gov (United States)

    Addawe, Rizavel C.; Addawe, Joel M.; Magadia, Joselito C.

    2016-10-01

    Accurate forecasting of dengue cases would significantly improve epidemic prevention and control capabilities. This paper attempts to provide useful models in forecasting dengue epidemic specific to the young and adult population of Baguio City. To capture the seasonal variations in dengue incidence, this paper develops a robust modeling approach to identify and estimate seasonal autoregressive integrated moving average (SARIMA) models in the presence of additive outliers. Since the least squares estimators are not robust in the presence of outliers, we suggest a robust estimation based on winsorized and reweighted least squares estimators. A hybrid algorithm, Differential Evolution - Simulated Annealing (DESA), is used to identify and estimate the parameters of the optimal SARIMA model. The method is applied to the monthly reported dengue cases in Baguio City, Philippines.

  19. Anisotropy evolution of nanoparticles under annealing: Benefits of isothermal remanent magnetization simulation

    Science.gov (United States)

    Tournus, Florent; Tamion, Alexandre; Hillion, Arnaud; Dupuis, Véronique

    2016-12-01

    Isothermal remanent magnetization (IRM) combined with Direct current demagnetization (DcD) are powerful tools to qualitatively study the interactions (through the Δm parameter) between magnetic particles in a granular medium. For magnetic nanoparticles diluted in a matrix, it is possible to reach a regime where Δm is equal to zero, i.e. where interparticle interactions are negligible: one can then infer the intrinsic properties of nanoparticles through measurements on an assembly, which are analyzed by a combined fit procedure (based on the Stoner-Wohlfarth and Néel models). Here we illustrate the benefits of a quantitative analysis of IRM curves, for Co nanoparticles embedded in amorphous carbon (before and after annealing): while a large anisotropy increase may have been deduced from the other measurements, IRM curves provide an improved characterization of the nanomagnets' intrinsic properties, revealing that it is in fact not the case. This shows that IRM curves, which only probe the irreversible switching of nanomagnets, are complementary to widely used low field susceptibility curves.

  20. Anisotropy evolution of nanoparticles under annealing: Benefits of isothermal remanent magnetization simulation

    International Nuclear Information System (INIS)

    Tournus, Florent; Tamion, Alexandre; Hillion, Arnaud; Dupuis, Véronique

    2016-01-01

    Isothermal remanent magnetization (IRM) combined with Direct current demagnetization (DcD) are powerful tools to qualitatively study the interactions (through the Δm parameter) between magnetic particles in a granular medium. For magnetic nanoparticles diluted in a matrix, it is possible to reach a regime where Δm is equal to zero, i.e. where interparticle interactions are negligible: one can then infer the intrinsic properties of nanoparticles through measurements on an assembly, which are analyzed by a combined fit procedure (based on the Stoner–Wohlfarth and Néel models). Here we illustrate the benefits of a quantitative analysis of IRM curves, for Co nanoparticles embedded in amorphous carbon (before and after annealing): while a large anisotropy increase may have been deduced from the other measurements, IRM curves provide an improved characterization of the nanomagnets' intrinsic properties, revealing that it is in fact not the case. This shows that IRM curves, which only probe the irreversible switching of nanomagnets, are complementary to widely used low field susceptibility curves.

  1. Anisotropy evolution of nanoparticles under annealing: Benefits of isothermal remanent magnetization simulation

    Energy Technology Data Exchange (ETDEWEB)

    Tournus, Florent, E-mail: florent.tournus@univ-lyon1.fr; Tamion, Alexandre; Hillion, Arnaud; Dupuis, Véronique

    2016-12-01

    Isothermal remanent magnetization (IRM) combined with Direct current demagnetization (DcD) are powerful tools to qualitatively study the interactions (through the Δm parameter) between magnetic particles in a granular medium. For magnetic nanoparticles diluted in a matrix, it is possible to reach a regime where Δm is equal to zero, i.e. where interparticle interactions are negligible: one can then infer the intrinsic properties of nanoparticles through measurements on an assembly, which are analyzed by a combined fit procedure (based on the Stoner–Wohlfarth and Néel models). Here we illustrate the benefits of a quantitative analysis of IRM curves, for Co nanoparticles embedded in amorphous carbon (before and after annealing): while a large anisotropy increase may have been deduced from the other measurements, IRM curves provide an improved characterization of the nanomagnets' intrinsic properties, revealing that it is in fact not the case. This shows that IRM curves, which only probe the irreversible switching of nanomagnets, are complementary to widely used low field susceptibility curves.

  2. The atomic-scale nucleation mechanism of NiTi metallic glasses upon isothermal annealing studied via molecular dynamics simulations.

    Science.gov (United States)

    Li, Yang; Li, JiaHao; Liu, BaiXin

    2015-10-28

    Nucleation is one of the most essential transformation paths in phase transition and exerts a significant influence on the crystallization process. Molecular dynamics simulations were performed to investigate the atomic-scale nucleation mechanisms of NiTi metallic glasses upon devitrification at various temperatures (700 K, 750 K, 800 K, and 850 K). Our simulations reveal that at 700 K and 750 K, nucleation is polynuclear with a high nucleation density, while at 800 K it is mononuclear. The underlying nucleation mechanisms have been clarified, showing that nucleation can be induced either by the initial ordered clusters (IOCs) or by other precursors of nuclei evolved directly from the supercooled liquid. IOCs and the other precursors stem from the thermal fluctuations of bond orientational order in supercooled liquids during the quenching process and during the annealing process, respectively. The simulation results not only elucidate how the underlying nucleation mechanisms vary with temperature, but also unveil the origin of nucleation. These discoveries offer new insights into the devitrification mechanism of metallic glasses.

  3. The application of neural networks integrated with genetic algorithms and simulated annealing for the simulation of rare earth separation processes by the solvent extraction technique using EHEHPA agent

    International Nuclear Information System (INIS)

    Tran Ngoc Ha; Pham Thi Hong Ha

    2003-01-01

    In the present work, a neural network has been used to mathematically model the equilibrium data of a mixture of two rare earth elements, Nd and Pr, with PC88A agent. A thermo-genetic algorithm, based on the ideas of the genetic algorithm and the simulated annealing algorithm, has been used in the training procedure of the neural networks, giving better results than the traditional modeling approach. The neural network modeling the experimental data is then used in a computer program to simulate the solvent extraction process for the two elements Nd and Pr. Based on this computer program, various optional schemes for the separation of Nd and Pr have been investigated and proposed. (author)

  4. An Interactive Simulation Tool for Production Planning in Bacon Factories

    DEFF Research Database (Denmark)

    Nielsen, Jens Frederik Dalsgaard; Nielsen, Kirsten Mølgaard

    1994-01-01

    The paper describes an interactive simulation tool for production planning in bacon factories. The main aim of the tool is to make it possible to combine the production plans of all parts of the factory...

  5. Compact nuclear simulator and its upgrade plan

    International Nuclear Information System (INIS)

    Kwon, Kee-Choon; Park, Jae-Chang; Jung, Chul-Hwan; Lee, Jang-Soo; Kim, Jang-Yeol

    1997-01-01

    The Compact Nuclear Simulator (CNS) was installed at the nuclear training center of the Korea Atomic Energy Research Institute (KAERI) in 1988. The CNS has been used for training non-operator personnel, such as NSSS design engineers, maintenance personnel, and inspectors of the regulatory body, and for testing fuzzy control algorithms. The CNS mathematical model represents a three-loop Westinghouse Pressurized Water Reactor (PWR) of 993 MWe, referred to as Kori Units 3 and 4 in Korea. However, the main computer (Micro VAX II), the interface card between the main computer and the operator panel, and the graphic display system suffer frequent troubles due to obsolescence and a lack of spare parts. Accordingly, the CNS hardware should be replaced by state-of-the-art equipment. There are plans to replace the main computer with an HP workstation, the dedicated interface card with a PLC-based interface system, and the graphic interface system with an X-terminal-based full graphics system. The full graphics user interface supports easy and friendly interaction between the CNS and its users. The software for the instructor console will also be modified from a text-based to a Motif-based user interface, which provides more efficient and easier operation of the instructor console. The real-time executive software programmed under the Micro VMS operating system should also be replaced by software programmed under the HP-UX operating system. (author)

  6. Ensemble annealing of complex physical systems

    OpenAIRE

    Habeck, Michael

    2015-01-01

    Algorithms for simulating complex physical systems or solving difficult optimization problems often resort to an annealing process. Rather than simulating the system at the temperature of interest, an annealing algorithm starts at a temperature that is high enough to ensure ergodicity and gradually decreases it until the destination temperature is reached. This idea is used in popular algorithms such as parallel tempering and simulated annealing. A general problem with annealing methods is th...
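The annealing process this record describes — starting hot enough to ensure ergodicity and cooling gradually — can be sketched with a minimal simulated-annealing loop. This is a generic illustration, not the ensemble-annealing algorithm of the record; the toy cost function, neighbor move, and geometric cooling schedule are arbitrary choices for the example:

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t_start=10.0, t_end=1e-3, cooling=0.95, seed=0):
    """Generic simulated annealing: start hot (near-ergodic), cool gradually."""
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t_start
    while t > t_end:
        y = neighbor(x, rng)
        fy = cost(y)
        # Always accept downhill moves; accept uphill moves with Boltzmann probability.
        if fy <= fx or rng.random() < math.exp((fx - fy) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # geometric cooling schedule
    return best, fbest

# Toy example: a 1-D double-well cost with two unequal minima.
cost = lambda x: (x * x - 1.0) ** 2 + 0.3 * x
step = lambda x, rng: x + rng.uniform(-0.5, 0.5)
x_opt, f_opt = simulated_annealing(cost, step, x0=2.0)
```

Lowering `cooling` toward 0 quenches the system quickly (risking a local minimum), while values near 1 approximate the slow schedules these records rely on.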

  7. INTRODUCTION OF MEMORY ELEMENTS IN THE SIMULATED ANNEALING METHOD TO SOLVE MULTIOBJECTIVE PARALLEL MACHINE SCHEDULING PROBLEMS

    Directory of Open Access Journals (Sweden)

    Felipe Baesler

    2008-12-01

    Full Text Available This paper introduces a variant of the simulated annealing metaheuristic for solving multiobjective optimization problems, called MultiObjective Simulated Annealing with Random Trajectory Search (MOSARTS). The technique adds short- and long-term memory elements to the Simulated Annealing algorithm in order to balance the search effort among all the objectives involved in the problem. The algorithm was tested against three other methodologies on a real-life parallel machine scheduling problem composed of 24 jobs and two identical machines, a case study from the regional sawmill industry. In the experiments performed, MOSARTS behaved better than the other tools compared, finding better solutions in terms of dominance and frontier dispersion.

  8. Novel approach for tomographic reconstruction of gas concentration distributions in air: Use of smooth basis functions and simulated annealing

    Science.gov (United States)

    Drescher, A. C.; Gadgil, A. J.; Price, P. N.; Nazaroff, W. W.

    Optical remote sensing and iterative computed tomography (CT) can be applied to measure the spatial distribution of gaseous pollutant concentrations. We conducted chamber experiments to test this combination of techniques using an open path Fourier transform infrared spectrometer (OP-FTIR) and a standard algebraic reconstruction technique (ART). Although ART converged to solutions that showed excellent agreement with the measured ray-integral concentrations, the solutions were inconsistent with simultaneously gathered point-sample concentration measurements. A new CT method was developed that combines (1) the superposition of bivariate Gaussians to represent the concentration distribution and (2) a simulated annealing minimization routine to find the parameters of the Gaussian basis functions that result in the best fit to the ray-integral concentration data. This method, named smooth basis function minimization (SBFM), generated reconstructions that agreed well, both qualitatively and quantitatively, with the concentration profiles generated from point sampling. We present an analysis of two sets of experimental data that compares the performance of ART and SBFM. We conclude that SBFM is a superior CT reconstruction method for practical indoor and outdoor air monitoring applications.
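The SBFM idea above — representing the concentration field as a superposition of bivariate Gaussians and comparing modeled ray integrals to measured ones — can be sketched as follows. This is a simplified illustration assuming isotropic Gaussians and straight-line rays; the simulated annealing minimization itself is omitted, and the function and parameter names are our own, not from the paper:

```python
import numpy as np

def gaussian_field(params, xy):
    """Concentration as a superposition of (isotropic, for brevity) Gaussians.
    params: rows of (amplitude, x0, y0, sigma); xy: array of (x, y) points."""
    c = np.zeros(len(xy))
    for a, x0, y0, s in params:
        d2 = (xy[:, 0] - x0) ** 2 + (xy[:, 1] - y0) ** 2
        c += a * np.exp(-d2 / (2.0 * s ** 2))
    return c

def ray_integral(params, p, q, n=401):
    """Approximate the path integral of the field along segment p -> q."""
    t = np.linspace(0.0, 1.0, n)
    pts = np.outer(1 - t, p) + np.outer(t, q)
    length = np.linalg.norm(np.asarray(q) - np.asarray(p))
    return gaussian_field(params, pts).mean() * length

def objective(params, rays, measured):
    """Sum of squared errors between modeled and measured ray integrals;
    this is what an annealing routine would minimize over params."""
    model = np.array([ray_integral(params, p, q) for p, q in rays])
    return float(np.sum((model - measured) ** 2))

# One unit Gaussian; one ray straight through its center.
params = np.array([[1.0, 0.0, 0.0, 1.0]])
val = ray_integral(params, (-5.0, 0.0), (5.0, 0.0))  # ≈ sqrt(2π)
```

The minimizer would perturb the Gaussian parameters, re-evaluate `objective`, and accept or reject moves with an annealing criterion until the ray-integral residuals are small.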

  9. Porous media microstructure reconstruction using pixel-based and object-based simulated annealing: comparison with other reconstruction methods

    Energy Technology Data Exchange (ETDEWEB)

    Diogenes, Alysson N.; Santos, Luis O.E. dos; Fernandes, Celso P. [Universidade Federal de Santa Catarina (UFSC), Florianopolis, SC (Brazil); Appoloni, Carlos R. [Universidade Estadual de Londrina (UEL), PR (Brazil)

    2008-07-01

    The physical properties of reservoir rocks are usually obtained in the laboratory through standard experiments, which are often very expensive and time-consuming. Hence, digital image analysis techniques offer a fast and low-cost methodology for predicting physical properties, knowing only geometrical parameters measured from thin sections of the rock microstructure. This research analyzes two methods for porous media reconstruction using the simulated annealing relaxation method. Using geometrical parameters measured from rock thin sections, it is possible to construct a three-dimensional (3D) model of the microstructure. We assume statistical homogeneity and isotropy, and the 3D model maintains the porosity spatial correlation, chord size distribution and d3-4 distance transform distribution for a pixel-based reconstruction, and the spatial correlation for an object-based reconstruction. The 2D and 3D preliminary results are compared with microstructures reconstructed by truncated Gaussian methods. As this research is in its early stage, only the 2D results are presented. (author)

  10. [Application of simulated annealing method and neural network on optimizing soil sampling schemes based on road distribution].

    Science.gov (United States)

    Han, Zong-wei; Huang, Wei; Luo, Yun; Zhang, Chun-di; Qi, Da-cheng

    2015-03-01

    Taking the soil organic matter in eastern Zhongxiang County, Hubei Province, as the research object, thirteen sample sets from different regions were arranged around the road network, and their spatial configuration was optimized by the simulated annealing approach. The topographic factors of these thirteen sample sets, including slope, plane curvature, profile curvature, topographic wetness index, stream power index and sediment transport index, were extracted by terrain analysis. Based on the results of the optimization, a multiple linear regression model with the topographic factors as independent variables was built, and a multilayer perceptron model based on the neural network approach was implemented; the two models were then compared. The results revealed that the proposed approach is practicable for optimizing soil sampling schemes: the optimized configuration captured soil-landscape relationships accurately, with better accuracy than the original samples. This study designed a sampling configuration for studying the soil attribute distribution by referring to the spatial layout of the road network, historical samples, and digital elevation data, providing an effective means as well as a theoretical basis for determining sampling configurations and mapping the spatial distribution of soil organic matter at low cost and high efficiency.
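Optimizing a spatial sample configuration by simulated annealing, as in this record, can be illustrated schematically: swap one chosen site for an unused candidate and accept or reject the swap by an annealing criterion. This is not the authors' algorithm; the coverage criterion (mean distance to the nearest sample) and the toy grid of candidate locations are invented for the example:

```python
import math
import random

def mean_nearest_distance(samples, cells):
    """Criterion: average distance from each candidate cell to its nearest sample."""
    total = 0.0
    for cx, cy in cells:
        total += min(math.hypot(cx - sx, cy - sy) for sx, sy in samples)
    return total / len(cells)

def optimize_sampling(cells, k, iters=2000, t0=1.0, cooling=0.999, seed=1):
    """Simulated annealing over sample configurations: swap one site per step."""
    rng = random.Random(seed)
    samples = rng.sample(cells, k)
    score = mean_nearest_distance(samples, cells)
    best, best_score = samples, score
    t = t0
    for _ in range(iters):
        out = rng.randrange(k)
        candidate = rng.choice(cells)
        if candidate in samples:
            continue
        trial = samples[:out] + [candidate] + samples[out + 1:]
        s = mean_nearest_distance(trial, cells)
        # Annealing acceptance: always improve, sometimes accept worse configs.
        if s <= score or rng.random() < math.exp((score - s) / t):
            samples, score = trial, s
            if score < best_score:
                best, best_score = samples, score
        t *= cooling
    return best, best_score

# Toy 10 x 10 grid of candidate sampling locations.
grid = [(x, y) for x in range(10) for y in range(10)]
chosen, spread = optimize_sampling(grid, k=5)
```

A real application would replace the coverage criterion with a soil-landscape objective and restrict candidates to road-accessible cells.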

  11. Displacement cascades and defects annealing in tungsten, Part I: Defect database from molecular dynamics simulations

    Energy Technology Data Exchange (ETDEWEB)

    Setyawan, Wahyu [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Nandipati, Giridhar [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Roche, Kenneth J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Univ. of Washington, Seattle, WA (United States); Heinisch, Howard L. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Wirth, Brian D. [Univ. of Tennessee, Knoxville, TN (United States); Oak Ridge National Lab., Oak Ridge, TN (United States); Kurtz, Richard J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-01

    Molecular dynamics simulations have been used to generate a comprehensive database of surviving defects due to displacement cascades in bulk tungsten. Twenty-one data points of primary knock-on atom (PKA) energies ranging from 100 eV (sub-threshold energy) to 100 keV (~780×Ed, where Ed = 128 eV is the average displacement threshold energy) have been completed at 300 K, 1025 K and 2050 K. Within this range of PKA energies, two regimes of power-law energy-dependence of the defect production are observed. A distinct power-law exponent characterizes the number of Frenkel pairs produced within each regime. The two regimes intersect at a transition energy which occurs at approximately 250×Ed. The transition energy also marks the onset of the formation of large self-interstitial atom (SIA) clusters (size 14 or more). The observed defect clustering behavior is asymmetric, with SIA clustering increasing with temperature, while the vacancy clustering decreases. This asymmetry increases with temperature such that at 2050 K (~0.5Tm) practically no large vacancy clusters are formed, meanwhile large SIA clusters appear in all simulations. The implication of such asymmetry on the long-term defect survival and damage accumulation is discussed. In addition, <100> {110} SIA loops are observed to form directly in the highest energy cascades, while vacancy <100> loops are observed to form at the lowest temperature and highest PKA energies, although the appearance of both the vacancy and SIA loops with Burgers vector of <100> type is relatively rare.

  12. Edge Simulation Laboratory Progress and Plans

    International Nuclear Information System (INIS)

    Cohen, R

    2007-01-01

    The Edge Simulation Laboratory (ESL) is a project to develop a gyrokinetic code for MFE edge plasmas based on continuum (Eulerian) techniques. ESL is a base-program activity of OFES, with an allied algorithm research activity funded by the OASCR base math program. ESL OFES funds directly support about 0.8 FTE of career staff at LLNL, a postdoc and a small fraction of an FTE at GA, and a graduate student at UCSD. In addition, the allied OASCR program funds about 1/2 FTE each in the computation directorates at LBNL and LLNL. OFES ESL funding for LLNL and UCSD began in fall 2005, while funding for GA and the math team began about a year ago. ESL's continuum approach is a complement to the PIC-based methods of the CPES Project, and was selected (1) because of concerns about noise issues associated with PIC in the high-density-contrast environment of the edge pedestal, (2) to be able to exploit advanced numerical methods developed for fluid codes, and (3) to build upon the successes of core continuum gyrokinetic codes such as GYRO, GS2 and GENE. The ESL project presently has three components: TEMPEST, a full-f, full-geometry (single-null divertor, or arbitrary-shape closed flux surfaces) code in E, μ (energy, magnetic-moment) coordinates; EGK, a simple-geometry rapid-prototype code, presently δf; and the math component, which is developing and implementing algorithms for a next-generation code. Progress would be accelerated if we could find funding for a fourth, computer science, component, which would develop software infrastructure, provide user support, and address needs for data handling and analysis. We summarize the status and plans for the three funded activities.

  13. Searching for stable Si(n)C(n) clusters: combination of stochastic potential surface search and pseudopotential plane-wave Car-Parinello simulated annealing simulations.

    Science.gov (United States)

    Duan, Xiaofeng F; Burggraf, Larry W; Huang, Lingyu

    2013-07-22

    To find low-energy Si(n)C(n) structures out of hundreds to thousands of isomers, we have developed a general method to search for stable isomeric structures that combines Stochastic Potential Surface Search and Pseudopotential Plane-Wave Density Functional Theory Car-Parinello Molecular Dynamics simulated annealing (PSPW-CPMD-SA). We enhanced the Sunders stochastic search method to generate random cluster structures used as seed structures for PSPW-CPMD-SA simulations. This method ensures that each SA simulation samples a different potential surface region to find the regional minimum structure. By iterating this automated, parallel process on a high performance computer we located hundreds to more than a thousand stable isomers for each Si(n)C(n) cluster. Among these, five to ten of the lowest energy isomers were further optimized using the B3LYP/cc-pVTZ method. We applied this method to Si(n)C(n) (n = 4-12) clusters and found the lowest energy structures, most not previously reported. By analyzing the bonding patterns of the low energy structures of each Si(n)C(n) cluster, we observed that carbon segregations tend to form condensed conjugated rings, while Si connects to unsaturated bonds at the periphery of the carbon segregation, as single atoms or clusters when n is small; when n is large, a silicon network spans the carbon segregation region.

  14. Searching for Stable SinCn Clusters: Combination of Stochastic Potential Surface Search and Pseudopotential Plane-Wave Car-Parinello Simulated Annealing Simulations

    Directory of Open Access Journals (Sweden)

    Larry W. Burggraf

    2013-07-01

    Full Text Available To find low-energy SinCn structures out of hundreds to thousands of isomers, we have developed a general method to search for stable isomeric structures that combines Stochastic Potential Surface Search and Pseudopotential Plane-Wave Density Functional Theory Car-Parinello Molecular Dynamics simulated annealing (PSPW-CPMD-SA). We enhanced the Sunders stochastic search method to generate random cluster structures used as seed structures for PSPW-CPMD-SA simulations. This method ensures that each SA simulation samples a different potential surface region to find the regional minimum structure. By iterating this automated, parallel process on a high performance computer we located hundreds to more than a thousand stable isomers for each SinCn cluster. Among these, five to ten of the lowest energy isomers were further optimized using the B3LYP/cc-pVTZ method. We applied this method to SinCn (n = 4–12) clusters and found the lowest energy structures, most not previously reported. By analyzing the bonding patterns of the low energy structures of each SinCn cluster, we observed that carbon segregations tend to form condensed conjugated rings, while Si connects to unsaturated bonds at the periphery of the carbon segregation, as single atoms or clusters when n is small; when n is large, a silicon network spans the carbon segregation region.

  15. Analysis of optimisation method for a two-stroke piston ring using the Finite Element Method and the Simulated Annealing Method

    Science.gov (United States)

    Kaliszewski, M.; Mazuro, P.

    2016-09-01

    The Simulated Annealing Method is tested as an optimisation technique for sealing piston ring geometry. The aim of the optimisation is to develop a ring geometry that exerts the demanded pressure on the cylinder while being bent to fit it. A method of FEM analysis of an arbitrary piston ring geometry is implemented in ANSYS software. The demanded pressure function (based on formulae presented by A. Iskra) as well as the objective function are introduced. A geometry definition constructed from polynomials in a radial coordinate system is presented and discussed. A possible application of the Simulated Annealing Method to a piston ring optimisation task is proposed and visualised. Difficulties that can lead to a lack of convergence of the optimisation are presented, and an example of an unsuccessful optimisation performed in APDL is discussed. A possible direction for further improvement of the optimisation is proposed.

  16. Semiconductor annealing

    International Nuclear Information System (INIS)

    Young, J.M.; Scovell, P.D.

    1982-01-01

    A process for annealing crystal damage in ion implanted semiconductor devices in which the device is rapidly heated to a temperature between 450 and 900 °C and allowed to cool. It has been found that such heating of the device to these relatively low temperatures results in rapid annealing. In one application the device may be heated on a graphite element mounted between electrodes in an inert atmosphere in a chamber. (author)

  17. Plan Validation Using DES and Agent-based Simulation

    National Research Council Canada - National Science Library

    Wong, Teck H; Ong, Kim S

    2008-01-01

    .... This thesis explores the possibility of using a multi-agent system (MAS) to generate the aggressor's air strike plans, which could be coupled with a low resolution Discrete Event Simulation (DES...

  18. Semiconductor annealing

    International Nuclear Information System (INIS)

    Young, J.M.; Scovell, P.D.

    1981-01-01

    A process for annealing crystal damage in ion implanted semiconductor devices is described in which the device is rapidly heated to a temperature between 450 and 600 °C and allowed to cool. It has been found that such heating of the device to these relatively low temperatures results in rapid annealing. In one application the device may be heated on a graphite element mounted between electrodes in an inert atmosphere in a chamber. The process may be enhanced by the application of optical radiation from a Xenon lamp. (author)

  19. The Paper Airplane Challenge: A Market Economy Simulation. Lesson Plan.

    Science.gov (United States)

    Owens, Kimberly

    This lesson plan features a classroom simulation that helps students understand the characteristics of a market economic system. The lesson plan states a purpose; cites student objectives; suggests a time duration; lists materials needed; and details a step-by-step teaching procedure. The "Paper Airplane Challenge" handout is attached. (BT)

  20. System Planning With The Hanford Waste Operations Simulator

    International Nuclear Information System (INIS)

    Crawford, T.W.; Certa, P.J.; Wells, M.N.

    2010-01-01

    At the U. S. Department of Energy's Hanford Site in southeastern Washington State, 216 million liters (57 million gallons) of nuclear waste is currently stored in aging underground tanks, threatening the Columbia River. The River Protection Project (RPP), a fully integrated system of waste storage, retrieval, treatment, and disposal facilities, is in varying stages of design, construction, operation, and future planning. These facilities face many overlapping technical, regulatory, and financial hurdles to achieve site cleanup and closure. Program execution is ongoing, but completion is currently expected to take approximately 40 more years. Strategic planning for the treatment of Hanford tank waste is by nature a multi-faceted, complex and iterative process. To help manage the planning, a report referred to as the RPP System Plan is prepared to provide a basis for aligning the program scope with the cost and schedule, from upper-tier contracts to individual facility operating plans. The Hanford Tank Waste Operations Simulator (HTWOS), a dynamic flowsheet simulation and mass balance computer model, is used to simulate the current planned RPP mission, evaluate the impacts of changes to the mission, and assist in planning near-term facility operations. Development of additional modeling tools, including an operations research model and a cost model, will further improve long-term planning confidence. The most recent RPP System Plan, Revision 4, was published in September 2009.

  1. Quantum Annealing and Quantum Fluctuation Effect in Frustrated Ising Systems

    OpenAIRE

    Tanaka, Shu; Tamura, Ryo

    2012-01-01

    The quantum annealing method has attracted wide attention in statistical physics and information science, since it is expected to be a powerful method for obtaining the best solution of an optimization problem, as is simulated annealing. The quantum annealing method was incubated in quantum statistical physics. It is an alternative to simulated annealing, which is well adapted to many optimization problems. In simulated annealing, we obtain a solution of optimization problem b...

  2. ANALYTICAL AND SIMULATION PLANNING MODEL OF URBAN PASSENGER TRANSPORT

    Directory of Open Access Journals (Sweden)

    Andrey Borisovich Nikolaev

    2017-09-01

    Full Text Available The article describes the structure of an analytical and simulation model for making informed decisions in the planning of urban passenger transport. A UML diagram describing the relationships of the proposed model's classes was designed. A description is given of the main agents of the model, developed in the AnyLogic simulation environment. A user interface integrated with a GIS map was designed. Simulation results are also provided, supporting conclusions about the model's validity and the possibility of using it to solve planning problems of urban passenger transport.

  3. A Simulation Tool for Hurricane Evacuation Planning

    Directory of Open Access Journals (Sweden)

    Daniel J. Fonseca

    2009-01-01

    Full Text Available Atlantic hurricanes and severe tropical storms are a serious threat to communities in the Gulf of Mexico region. Such storms are violent and destructive. In response to these dangers, coastal evacuation may be ordered. This paper describes the development of a simulation model to analyze the movement of vehicles through I-65, a major US Interstate highway that runs north from the coastal city of Mobile, Alabama, towards the State of Tennessee, during a massive evacuation originated by a disastrous event such as a hurricane. The constructed simulation platform consists of a primary and two secondary models. The primary model is based on the entry of vehicles from the 20 on-ramps to I-65. The two secondary models assist the primary model with related traffic events such as car breakdowns and accidents, traffic control measures, interarrival signaling, and unforeseen emergency incidents, among others. Statistical testing was performed on the data generated by the simulation model to identify variation in relevant traffic variables affecting the timely flow of vehicles travelling north. The statistical analysis focused on the closing of alternative on-ramps throughout the Interstate.

  4. Simulation-based decision support for evaluating operational plans

    Directory of Open Access Journals (Sweden)

    Johan Schubert

    2015-12-01

    Full Text Available In this article, we describe simulation-based decision support techniques for evaluation of operational plans within effects-based planning. Using a decision support tool, developers of operational plans are able to evaluate thousands of alternative plans against possible courses of events and decide which of these plans are capable of achieving a desired end state. The objective of this study is to examine the potential of a decision support system that helps operational analysts understand the consequences of numerous alternative plans through simulation and evaluation. Operational plans are described in the effects-based approach to operations concept as a set of actions and effects. For each action, we examine several different alternative ways to perform the action. We use a representation where a plan consists of several actions that should be performed. Each action may be performed in one of several different alternative ways. Together these action alternatives make up all possible plan instances, which are represented as a tree of action alternatives that may be searched for the most effective sequence of alternative actions. As a test case, we use an expeditionary operation with a plan of 43 actions and several alternatives for these actions, as well as a scenario of 40 group actors. Decision support for planners is provided by several methods that analyze the impact of a plan on the 40 actors, e.g., by visualizing time series of plan performance. Detailed decision support for finding the most influential actions of a plan is presented by using sensitivity analysis and regression tree analysis. Finally, a decision maker may use the tool to determine the boundaries of an operation that it must not move beyond without risk of drastic failure. The significant contribution of this study is the presentation of an integrated approach for evaluation of operational plans.
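The plan-instance tree described above — every plan instance is one alternative chosen per action — can be enumerated as a Cartesian product of the per-action alternatives. The action and alternative names below are hypothetical, not taken from the study's scenario:

```python
from itertools import product

# Each action has several alternative ways to perform it (hypothetical labels).
actions = {
    "secure_port": ["air_assault", "amphibious"],
    "deliver_aid": ["convoy", "airdrop", "sealift"],
    "restore_power": ["repair_grid", "mobile_generators"],
}

# Every plan instance picks exactly one alternative per action:
# 2 * 3 * 2 = 12 instances to simulate and evaluate.
plan_instances = [dict(zip(actions, combo)) for combo in product(*actions.values())]
```

With 43 actions and several alternatives each, this product grows combinatorially, which is why the study searches the tree and simulates thousands of instances rather than exhaustively evaluating all of them.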

  5. The effect of residual thermal stresses on the fatigue crack growth of laser-surface-annealed AISI 304 stainless steel Part I: computer simulation

    International Nuclear Information System (INIS)

    Shiue, R.K.; Chang, C.T.; Young, M.C.; Tsay, L.W.

    2004-01-01

    The effect of residual thermal stresses on the fatigue crack growth of laser-surface-annealed AISI 304 stainless steel, especially the effect of stress redistribution ahead of the crack tip, was extensively evaluated in this study. Based on the finite element simulation, the longitudinal residual tensile stress field has a width of roughly 20 mm on the laser-irradiated surface and is symmetric with respect to the centerline of the laser-annealed zone (LAZ), while residual compressive stresses are distributed over a wide region away from the LAZ. After introducing a notch perpendicular to the LAZ, the distribution of longitudinal residual stresses became unsymmetrical about the centerline of the LAZ, with high residual compressive stresses within a narrow range ahead of the notch tip. The improved crack growth resistance of the laser-annealed specimen might be attributed to those induced compressive stresses. As the notch tip passed through the centerline of the LAZ, the residual stress ahead of the notch tip reverted completely to residual tensile stress, and it remained tensile even when the notch tip extended deeply into the LAZ. Additionally, the presence of residual tensile stress ahead of the notch tip did not accelerate the fatigue crack growth rate in the compact tension specimen.

  6. Grazing incidence X-ray diffraction study of the tilted phases of Langmuir films: Determination of molecular conformations using simulated annealing

    International Nuclear Information System (INIS)

    Pignat, J.; Daillant, J.; Cantin, S.; Perrot, F.; Konovalov, O.

    2007-01-01

    We have analyzed grazing incidence X-ray diffraction (GIXD) data from condensed phases of Langmuir films of long-chain fatty acids at the air-water interface using a new method consisting of a careful extraction of the structure factors followed by fitting of molecular parameters using simulated annealing. We show that the information contained in GIXD spectra is enough to obtain near-atomic structural information. In particular, we directly determine the orientation of the chain backbone planes and of the carboxylic headgroups, and we evaluate chain conformation defects.

  7. Grazing incidence X-ray diffraction study of the tilted phases of Langmuir films: Determination of molecular conformations using simulated annealing

    Energy Technology Data Exchange (ETDEWEB)

    Pignat, J. [LIONS/Service de Chimie Moleculaire, CEA-Saclay bat. 125, F-91191 Gif-sur-Yvette Cedex (France); LPPI, universite de Cergy-Pontoise, 5 mail Gay-Lussac Neuville/Oise, 95031 Cergy-Pontoise Cedex (France); Daillant, J. [LIONS/Service de Chimie Moleculaire, CEA-Saclay bat. 125, F-91191 Gif-sur-Yvette Cedex (France)]. E-mail: jean.daillant@cea.fr; Cantin, S. [LPPI, universite de Cergy-Pontoise, 5 mail Gay-Lussac Neuville/Oise, 95031 Cergy-Pontoise Cedex (France); Perrot, F. [LPPI, universite de Cergy-Pontoise, 5 mail Gay-Lussac Neuville/Oise, 95031 Cergy-Pontoise Cedex (France); Konovalov, O. [ESRF, 6 rue Jules Horowitz, BP220, 38043 Grenoble Cedex (France)

    2007-05-23

    We have analyzed grazing incidence X-ray diffraction (GIXD) data from condensed phases of Langmuir films of long-chain fatty acids at the air-water interface using a new method consisting of a careful extraction of the structure factors followed by fitting of molecular parameters using simulated annealing. We show that the information contained in GIXD spectra is enough to obtain near-atomic structural information. In particular, we directly determine the orientation of the chain backbone planes and of the carboxylic headgroups, and we evaluate chain conformation defects.

  8. Multi-objective optimization of in-situ bioremediation of groundwater using a hybrid metaheuristic technique based on differential evolution, genetic algorithms and simulated annealing

    Directory of Open Access Journals (Sweden)

    Kumar Deepak

    2015-12-01

    Full Text Available Groundwater contamination due to leakage of gasoline is one of several causes that pollute the groundwater environment. In the past few years, in-situ bioremediation has attracted researchers because of its ability to remediate the contaminant at its site with a low cost of remediation. This paper proposes the use of a new hybrid algorithm to optimize a multi-objective function which includes the cost of remediation as the first objective and the residual contaminant at the end of the remediation period as the second objective. The hybrid algorithm was formed by combining the methods of Differential Evolution, Genetic Algorithms and Simulated Annealing. Support Vector Machines (SVM) were used as a virtual simulator for biodegradation of contaminants in the groundwater flow. The results obtained from the hybrid algorithm were compared with Differential Evolution (DE), Non-Dominated Sorting Genetic Algorithm (NSGA-II) and Simulated Annealing (SA). It was found that the proposed hybrid algorithm was capable of providing the best solution. Fuzzy logic was used to find the best compromising solution, and finally a pumping rate strategy for groundwater remediation was presented for the best compromising solution. The results show that the cost incurred for the best compromising solution is intermediate between the highest and lowest costs incurred for the other non-dominated solutions.
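
The fuzzy-logic step used above to pick the best compromise from a set of non-dominated solutions is commonly implemented with linear membership functions per objective. A minimal sketch under that assumption (the cost/residual pairs below are hypothetical, not the paper's data):

```python
def fuzzy_best_compromise(front):
    """Pick the best compromise from a Pareto front using linear fuzzy memberships.

    Assumes each objective is minimized and has some spread across the front.
    """
    n_obj = len(front[0])
    lo = [min(p[j] for p in front) for j in range(n_obj)]
    hi = [max(p[j] for p in front) for j in range(n_obj)]

    def membership(p):
        # mu_j = 1 at the per-objective best value, 0 at the worst, linear between
        return sum((hi[j] - p[j]) / (hi[j] - lo[j]) for j in range(n_obj))

    return max(front, key=membership)

# Hypothetical non-dominated (remediation cost, residual contaminant) pairs.
front = [(10.0, 0.9), (14.0, 0.5), (20.0, 0.2), (30.0, 0.15)]
best = fuzzy_best_compromise(front)  # the point with the highest summed membership
```

The extreme points of the front score 1.0 (best in one objective, worst in the other), so an interior trade-off solution is typically selected.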

  9. Simulation based planning of surgical interventions in pediatric cardiology

    Science.gov (United States)

    Marsden, Alison L.

    2013-10-01

    Hemodynamics plays an essential role in the progression and treatment of cardiovascular disease. However, while medical imaging provides increasingly detailed anatomical information, clinicians often have limited access to hemodynamic data that may be crucial to patient risk assessment and treatment planning. Computational simulations can now provide detailed hemodynamic data to augment clinical knowledge in both adult and pediatric applications. There is a particular need for simulation tools in pediatric cardiology, due to the wide variation in anatomy and physiology in congenital heart disease patients, necessitating individualized treatment plans. Despite great strides in medical imaging, enabling extraction of flow information from magnetic resonance and ultrasound imaging, simulations offer predictive capabilities that imaging alone cannot provide. Patient specific simulations can be used for in silico testing of new surgical designs, treatment planning, device testing, and patient risk stratification. Furthermore, simulations can be performed at no direct risk to the patient. In this paper, we outline the current state of the art in methods for cardiovascular blood flow simulation and virtual surgery. We then step through pressing challenges in the field, including multiscale modeling, boundary condition selection, optimization, and uncertainty quantification. Finally, we summarize simulation results of two representative examples from pediatric cardiology: single ventricle physiology, and coronary aneurysms caused by Kawasaki disease. These examples illustrate the potential impact of computational modeling tools in the clinical setting.

  10. A Simulation-as-a-Service Framework Facilitating WebGIS-Based Installation Planning

    Science.gov (United States)

    Zheng, Z.; Chang, Z. Y.; Fei, Y. F.

    2017-09-01

    Installation planning is constrained by both natural and social conditions, especially for spatially sparse but functionally connected facilities. Simulation is important for the proper spatial deployment and functional configuration of facilities, so that they form a cohesive and supportive system meeting users' operational needs. Based on requirement analysis, we propose a framework that combines GIS and agent simulation to overcome the shortcomings of traditional GIS in temporal analysis and task simulation. In this framework, agent-based simulation runs as a service on the server and exposes basic simulation functions, such as scenario configuration, simulation control, and simulation data retrieval, to installation planners. At the same time, the simulation service is able to utilize various kinds of geoprocessing services in the agents' process logic to make sophisticated spatial inferences and analyses. This simulation-as-a-service framework has many potential benefits, such as ease of use, on-demand availability, shared understanding, and boosted performance. At the end, we present a preliminary implementation of this concept using the ArcGIS JavaScript API 4.0 and ArcGIS for Server, showing how trip planning and driving can be carried out by agents.

  11. Planning organization and productivity simulation tool for maritime container terminals

    Directory of Open Access Journals (Sweden)

    B. Beškovnik

    2010-09-01

    Full Text Available The article describes a proposed planning organization and productivity simulation tool, with special emphasis on the optimization of operations in a maritime container terminal. With the application of an adequate model framework for traffic and technical-technological forecasting, infrastructure and manpower planning, and productivity simulation, it is possible to measure and increase the productivity of the whole subsystem of the maritime container terminal. The emphasis is mainly put on setting up a planning organization in order to collect important information and consequently raise productivity. This is the main task and goal of terminal management, which must develop elements and strategies for optimal operational and financial production. An adequate planning structure must use simplified but efficient simulation tools enabling owners and management to take a vast number of sound financial and operational decisions. Considering all the important and very dynamic facts of the container and shipping industry, the proposed simulation tool provides a helpful instrument for checking productivity and its variation over time, and for monitoring the competitive position of a given maritime terminal against terminals of the same group. Therefore, the management of every maritime container terminal must establish an appropriate internal planning system as a mechanism for strategic decision support, relating basically to the assessment of the best development and optimization solutions for the infrastructure and suprastructure of the entire system.

  12. Evaluating impact of market changes on increasing cell-load variation in dynamic cellular manufacturing systems using a hybrid Tabu search and simulated annealing algorithms

    Directory of Open Access Journals (Sweden)

    Aidin Delgoshaei

    2016-06-01

    Full Text Available In this paper, a new method is proposed for scheduling dynamic cellular manufacturing systems (D-CMS) in the presence of uncertain product demands. The aim of this method is to control the process of trading off between in-house manufacturing and outsourcing while product demands are uncertain and can vary from period to period. To solve the proposed problem, a hybrid Tabu Search and Simulated Annealing algorithm is developed to overcome the hardness of the proposed model, and the results are then compared with Branch and Bound and Simulated Annealing algorithms. A Taguchi method (L_27 orthogonal array optimization) is used to estimate the parameters of the proposed method in order to solve experiments derived from the literature. An in-depth analysis is conducted on the results in consideration of various factors. For evaluating the system imbalance under dynamic market demands, a new measuring index is developed. Our findings indicate that the uncertain condition of market demands affects the routing of product parts and may induce machine-load variations that yield cell-load diversity. The results showed that the proposed hybrid method can provide solutions of better quality.
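
A Tabu Search / Simulated Annealing hybrid of the kind described above typically augments the SA acceptance loop with a short-term memory that forbids revisiting recent solutions. A hedged sketch over bit-strings (the toy cost function standing in for in-house/outsource trade-offs is an assumption, not the paper's D-CMS model):

```python
import math
import random
from collections import deque

def anneal_tabu(cost, x0, t0=2.0, cooling=0.97, steps=500, tabu_len=20, seed=3):
    """Simulated annealing over bit-strings with a tabu list of recent states."""
    rng = random.Random(seed)
    x, fx = list(x0), cost(x0)
    best, fbest = list(x), fx
    tabu = deque(maxlen=tabu_len)  # short-term memory of visited solutions
    t = t0
    for _ in range(steps):
        i = rng.randrange(len(x))
        y = list(x)
        y[i] ^= 1                   # flip one in-house/outsource decision
        key = tuple(y)
        if key in tabu:             # forbid recently visited solutions
            continue
        fy = cost(y)
        # Metropolis acceptance: always downhill, uphill with Boltzmann probability.
        if fy <= fx or rng.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            tabu.append(key)
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling
    return best, fbest

# Toy cost: penalty for outsourcing (bit = 1) plus a cell-imbalance penalty.
cost = lambda b: 3 * sum(b) + abs(sum(b) - len(b) // 2)
b_opt, c_opt = anneal_tabu(cost, [1] * 10)
```

The tabu memory discourages cycling between neighboring states while the temperature is still high; its length is a tuning parameter.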

  13. Lean Supply Chain Planning: A Performance Evaluation through Simulation

    Directory of Open Access Journals (Sweden)

    Rossini Matteo

    2016-01-01

    Full Text Available Nowadays, companies increasingly look to improve their efficiency to excel in the market. At the same time, competition has moved from the firm level to the whole supply chain level. Supply chains are very complex systems, and lack of coordination among their members leads to inefficiency. The task of supply chain planning is to improve coordination among supply chain members. Which planning solution best improves efficiency is an open issue. On the other hand, the Lean approach is becoming more and more popular among managers. The Lean approach is recognized as an efficiency engine for production systems, but the effect of Lean implementation beyond single-firm boundaries is not clear. This paper aims at providing a theoretical and practical starting point for Lean implementation in supply chain planning. To reach it, a DES simulation model of a three-echelon, multi-product supply chain has been set up. Lean management is a very broad topic, and this paper focuses on the two principles of "pull" and "create the flow". A Kanban system, and setup-time and batch-size reductions, are implemented in the lean-configured supply chain to apply "pull" and "create the flow", respectively. The Lean principle implementations have been analyzed and compared with other supply chain planning policies: EOQ and information sharing (Visibility). Supported by the simulation study, this paper shows that Lean supply chain planning is a competitive planning policy for increasing efficiency.

  14. Empowering stakeholders through simulation in water resources planning

    International Nuclear Information System (INIS)

    Palmer, R.N.; Keyes, A.M.; Fisher, S.

    1993-01-01

    During the past two years, researchers at the University of Washington (UW) have had the unique opportunity to facilitate and observe the development of drought planning activities associated with the National Drought Study (NDS) and its Drought Preparedness Study (DPS) sites as sponsored by the Institute for Water Resources of the US Army Corps of Engineers. Each of the DPS sites is unique, with different study objectives and institutional constraints. However, one uniform requirement of the study is to develop tactical and strategic drought plans that can be successfully implemented within the study region. At the onset of the study, it was recognized that successful implementation is directly related to the active involvement of affected parties and agencies (denoted as stakeholders) and the degree to which they support the plan's conclusions. Their involvement is also necessary because the problems addressed by the DPSs require the experience and knowledge of a variety of water resource interests in order to arrive at effective alternatives. Their support of the plan conclusions enables regional implementation. Several techniques were used to encourage stakeholder participation in the planning process. Individuals representing the stakeholders had a wide range of professional backgrounds. This paper concentrates on one specific approach found useful in encouraging comprehensive and meaningful participation by a wide range of stakeholders: the development of object-oriented simulation models for the water resource systems under study. Simulation models were used to develop tactical and strategic drought plans and to ensure the acceptance of the plans by building consensus among the stakeholders. The remainder of this paper describes how simulation models became a part of the National Drought Study, the procedures used to develop the DPS models, and how the models empowered stakeholders.

  15. Determination of performance criteria of safety systems in a nuclear power plant via simulated annealing optimization method

    International Nuclear Information System (INIS)

    Jung, Woo Sik

    1993-02-01

    This study presents an efficient methodology that derives design alternatives and performance criteria of safety functions/systems in commercial nuclear power plants. Determination of design alternatives and intermediate-level performance criteria is posed as a reliability allocation problem. The reliability allocation is performed to determine reliabilities of safety functions/systems from top-level performance criteria. The reliability allocation is a very difficult multi-objective optimization problem (MOP) as well as a global optimization problem with many local minima. The weighted Chebyshev norm (WCN) approach, in combination with an improved Metropolis algorithm of simulated annealing, is developed and applied to the reliability allocation problem. The hierarchy of probabilistic safety criteria (PSC) may consist of three levels, ranging from the overall top level (e.g., core damage frequency, acute fatality and latent cancer fatality) through the intermediate level (e.g., unavailability of safety systems/functions) to the low level (e.g., unavailability of components, component specifications or human error). In order to determine design alternatives of safety functions/systems and the intermediate-level PSC, the reliability allocation is performed from the top-level PSC. The intermediate level corresponds to an objective space and the top level is related to a risk space. The reliability allocation is performed by means of a concept of two-tier noninferior solutions in the objective and risk spaces within the top-level PSC. In this study, two kinds of two-tier noninferior solutions are defined: intolerable intermediate-level PSC and desirable design alternatives of safety functions/systems, determined from Sets 1 and 2, respectively. Set 1 is obtained by simultaneously maximizing not only safety function/system unavailabilities but also risks. Set 1 reflects safety function/system unavailabilities in the worst case. Hence, the
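
The Metropolis algorithm of simulated annealing mentioned above follows a standard accept/reject loop. A generic, hedged sketch (the toy objective, neighborhood move, and geometric cooling schedule are illustrative assumptions, not the study's improved variant):

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=1.0, cooling=0.95, steps=2000, seed=0):
    """Minimize `cost` with the Metropolis acceptance rule and geometric cooling."""
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    for _ in range(steps):
        y = neighbor(x, rng)
        fy = cost(y)
        # Accept downhill moves always; uphill moves with probability exp(-dE/T).
        if fy <= fx or rng.random() < math.exp(-(fy - fx) / t):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
        t *= cooling  # geometric cooling schedule
    return best, fbest

# Toy objective with many local minima.
f = lambda x: x * x + 10 * math.sin(3 * x)
step = lambda x, rng: x + rng.uniform(-0.5, 0.5)
x_opt, f_opt = simulated_annealing(f, step, x0=4.0)
```

Early on, the high temperature lets the chain escape local minima; as the temperature decays, the search becomes effectively greedy around the best basin found.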

  16. Treatment planning for a small animal using Monte Carlo simulation

    International Nuclear Information System (INIS)

    Chow, James C. L.; Leung, Michael K. K.

    2007-01-01

    The development of a small animal model for radiotherapy research requires a complete setup of customized imaging equipment, irradiators, and planning software that matches the sizes of the subjects. The purpose of this study is to develop and demonstrate the use of a flexible in-house research environment for treatment planning on small animals. The software package, called DOSCTP, provides a user-friendly platform for DICOM computed tomography-based Monte Carlo dose calculation using the EGSnrcMP-based DOSXYZnrc code. Validation of the treatment planning was performed by comparing the dose distributions for simple photon beam geometries calculated through the Pinnacle3 treatment planning system and measurements. A treatment plan for a mouse based on a CT image set using a 360-deg photon arc is demonstrated. It is shown that it is possible to create 3D conformal treatment plans for small animals with consideration of inhomogeneities using small photon beam field sizes in the diameter range of 0.5-5 cm, with the conformal dose covering the target volume while sparing the surrounding critical tissue. It is also found that Monte Carlo simulation is suitable for carrying out treatment planning dose calculation for small animal anatomy, with a voxel size about one order of magnitude smaller than that of the human.

  17. A novel approach in optimization problem for research reactors fuel plate using a synergy between cellular automata and quasi-simulated annealing methods

    International Nuclear Information System (INIS)

    Barati, Ramin

    2014-01-01

    Highlights: • An innovative optimization technique for multi-objective optimization is presented. • The technique utilizes a combination of CA and quasi-simulated annealing. • Mass and deformation of the fuel plate are considered as objective functions. • Computational burden is significantly reduced compared to classic tools. - Abstract: This paper presents a new and innovative optimization technique utilizing a combination of cellular automata (CA) and quasi-simulated annealing (QSA) as a solver for conceptual design optimization, which is indeed a multi-objective optimization problem. Integrating CA and QSA into a unified optimizer tool has great potential for solving multi-objective optimization problems. Simulating neighborhood effects while taking local information into account (from CA), and accepting transitions based on decrease of the objective function and the Boltzmann distribution (from QSA) as the transition rule, make this tool effective in multi-objective optimization. Optimization of fuel plate safety design, while taking into account major goals of conceptual design such as improving reliability and lifetime - which are the most significant elements during shutdown - is a major multi-objective optimization problem. Due to the huge search space of the fuel plate optimization problem, finding the optimum solution with classical methods requires a huge amount of calculation and CPU time. The CA models, utilizing local information, require considerably less computation. In this study, minimizing both the mass and the deformation of the fuel plate of a multipurpose research reactor (MPRR) are considered as objective functions. The results, speed, and quality of the proposed method are comparable with those of genetic algorithm and neural network methods previously applied to this problem.

  18. Angular filter refractometry analysis using simulated annealing [An improved method for characterizing plasma density profiles using angular filter refractometry]

    International Nuclear Information System (INIS)

    Angland, P.; Haberberger, D.; Ivancic, S. T.; Froula, D. H.

    2017-01-01

    Here, a new method of analysis for angular filter refractometry images was developed to characterize laser-produced, long-scale-length plasmas, using an annealing algorithm to iteratively converge upon a solution. Angular filter refractometry (AFR) is a novel technique used to characterize the density profiles of laser-produced, long-scale-length plasmas. A synthetic AFR image is constructed from a user-defined density profile described by eight parameters, and the algorithm systematically alters the parameters until the comparison with the measured image is optimized. The optimization and statistical uncertainty calculation are based on minimization of the χ2 test statistic. The algorithm was successfully applied to experimental data of plasma expanding from a flat, laser-irradiated target, resulting in an average uncertainty in the density profile of 5-10% in the region of interest.
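
Annealing a parameterized profile against data by minimizing χ2, as described above, can be sketched generically. The exponential density profile, its two parameters, and the multiplicative step sizes below are illustrative assumptions, not the paper's eight-parameter AFR model:

```python
import math
import random

def chi2(params, xs, ys, sigma):
    """Chi-square misfit of an exponential profile n(x) = n0 * exp(-x / L)."""
    n0, L = params
    return sum(((n0 * math.exp(-x / L) - y) / sigma) ** 2 for x, y in zip(xs, ys))

def anneal_fit(xs, ys, sigma, p0, t0=5.0, cooling=0.99, steps=3000, seed=1):
    """Anneal the profile parameters to minimize chi2 against the data."""
    rng = random.Random(seed)
    p, c = list(p0), chi2(p0, xs, ys, sigma)
    best, cbest = list(p), c
    t = t0
    for _ in range(steps):
        # Propose a multiplicative perturbation of each parameter (keeps them positive).
        q = [p[0] * (1 + rng.uniform(-0.05, 0.05)),
             p[1] * (1 + rng.uniform(-0.05, 0.05))]
        cq = chi2(q, xs, ys, sigma)
        if cq <= c or rng.random() < math.exp(-(cq - c) / t):
            p, c = q, cq
            if c < cbest:
                best, cbest = list(p), c
        t *= cooling
    return best, cbest

# Synthetic "measured" profile with known parameters n0 = 2.0, L = 1.5.
xs = [0.1 * i for i in range(30)]
ys = [2.0 * math.exp(-x / 1.5) for x in xs]
(best_n0, best_L), c_final = anneal_fit(xs, ys, sigma=0.05, p0=(1.0, 1.0))
```

The statistical uncertainty on each parameter could then be estimated from the curvature of χ2 around the minimum, mirroring the paper's approach.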

  19. Creating virtual humans for simulation-based training and planning

    Energy Technology Data Exchange (ETDEWEB)

    Stansfield, S.; Sobel, A.

    1998-05-12

    Sandia National Laboratories has developed a distributed, high-fidelity simulation system for training and planning small-team operations. The system provides an immersive environment populated by virtual objects and humans capable of displaying complex behaviors. The work has focused on developing the behaviors required to carry out complex tasks and decision making under stress. Central to this work are techniques for creating behaviors for virtual humans and for dynamically assigning behaviors to CGF to allow scenarios without fixed outcomes. Two prototype systems have been developed that illustrate these capabilities: MediSim, a trainer for battlefield medics, and VRaptor, a system for planning, rehearsing and training assault operations.

  20. A High-Speed Train Operation Plan Inspection Simulation Model

    Directory of Open Access Journals (Sweden)

    Yang Rui

    2018-01-01

    Full Text Available We developed a train operation simulation tool to inspect a train operation plan. Applying an improved Petri net, the train was regarded as a token, and lines and stations were regarded as places, in accordance with high-speed train operation characteristics and network function. Location changes and running-information transfer of the high-speed train were realized by customizing a variety of transitions. The model was built based on the concept of component combination, considering random disturbances in the process of train running. The simulation framework can be generated quickly, and the system operation can be completed according to the different test requirements and the required network data. We tested the simulation tool on the real-world Wuhan-Guangzhou high-speed line. The results showed that the proposed model is workable and that the simulation results basically coincide with objective reality; the tool can not only test the feasibility of a high-speed train operation plan, but can also be used as a support model to develop a simulation platform with more capabilities.

  1. Reconstruction of X-rays spectra of clinical linear accelerators using the generalized simulated annealing method; Reconstrucao de espectros de raios-X de aceleradores lineares clinicos usando o metodo de recozimento simulado generalizado

    Energy Technology Data Exchange (ETDEWEB)

    Manrique, John Peter O.; Costa, Alessandro M., E-mail: johnp067@usp.br, E-mail: amcosta@usp.br [Universidade de Sao Paulo (USP), Ribeirao Preto, SP (Brazil)

    2016-07-01

    The spectral distribution of megavoltage X-rays used in radiotherapy departments is a fundamental quantity from which, in principle, all relevant information required for radiotherapy treatments can be determined. To calculate the dose delivered to a patient undergoing radiation therapy, treatment planning systems (TPS) are used, which employ convolution and superposition algorithms and require prior knowledge of the photon fluence spectrum to perform three-dimensional dose calculations, thus ensuring better accuracy in the tumor control probabilities while keeping the normal tissue complication probabilities low. In this work we have obtained the photon fluence spectrum of the 6 MV X-ray beam of the SIEMENS ONCOR linear accelerator, using an inverse method to reconstruct the photon spectra from transmission curves measured for different thicknesses of aluminum; the method used for reconstruction of the spectra is a stochastic technique known as generalized simulated annealing (GSA), based on the quasi-equilibrium statistics of Tsallis. For the validation of the reconstructed spectra we calculated the percentage depth dose (PDD) curve for the 6 MV energy, using Monte Carlo simulation with the PENELOPE code, and from the PDD we then calculated the beam quality index TPR{sub 20/10}. (author)
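
Generalized simulated annealing replaces the Boltzmann acceptance factor with a Tsallis q-exponential, recovering exp(-dE/T) as q -> 1. A heavily simplified sketch (the cooling schedule, Gaussian visiting step, and q value are illustrative assumptions, not the GSA variant used in the paper):

```python
import math
import random

def tsallis_accept(delta, t, q=1.5):
    """Generalized (Tsallis) acceptance probability for an uphill move of size delta.

    For q -> 1 this reduces to the Boltzmann factor exp(-delta / t).
    """
    if delta <= 0:
        return 1.0
    arg = 1.0 + (q - 1.0) * delta / t
    return arg ** (-1.0 / (q - 1.0)) if arg > 0 else 0.0

def gsa_minimize(cost, x0, t0=2.0, steps=3000, q=1.5, seed=5):
    rng = random.Random(seed)
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    for k in range(1, steps + 1):
        t = t0 / math.log(math.e + k)      # slow, log-like cooling (illustrative)
        y = x + rng.gauss(0, 1) * t / t0   # wider jumps while the system is hot
        fy = cost(y)
        if rng.random() < tsallis_accept(fy - fx, t, q):
            x, fx = y, fy
            if fx < fbest:
                best, fbest = x, fx
    return best, fbest

x_opt, f_opt = gsa_minimize(lambda x: (x - 2.0) ** 2, x0=10.0)
```

In the paper's setting, the state would be a binned photon spectrum and the cost a misfit between measured and predicted aluminum transmission curves; the scalar toy problem here only illustrates the acceptance rule.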

  2. Treatment planning in radiosurgery: parallel Monte Carlo simulation software

    Energy Technology Data Exchange (ETDEWEB)

    Scielzo, G [Galliera Hospitals, Genova (Italy). Dept. of Hospital Physics; Grillo Ruggieri, F [Galliera Hospitals, Genova (Italy) Dept. for Radiation Therapy; Modesti, M; Felici, R [Electronic Data System, Rome (Italy); Surridge, M [University of South Hampton (United Kingdom). Parallel Apllication Centre

    1995-12-01

    The main objective of this research was to evaluate the possibility of direct Monte Carlo simulation for accurate dosimetry with short computation times. We made use of: a graphics workstation, a linear accelerator, and water, PMMA and anthropomorphic phantoms, for validation purposes; ionometric, film and thermoluminescent techniques, for dosimetry; and a treatment planning system, for comparison. Benchmarking results suggest that short computing times can be obtained with use of the parallel version of EGS4 that was developed. Parallelism was obtained by assigning simulation incident photons to separate processors, and the development of a parallel random number generator was necessary. Validation consisted of phantom irradiation; comparison of predicted and measured values showed good agreement in PDD and dose profiles. Experiments on anthropomorphic phantoms (with inhomogeneities) were carried out, and these values are being compared with results obtained with the conventional treatment planning system.

  3. Annealing evolutionary stochastic approximation Monte Carlo for global optimization

    KAUST Repository

    Liang, Faming

    2010-01-01

    outperform simulated annealing, the genetic algorithm, annealing stochastic approximation Monte Carlo, and some other metaheuristics in function optimization. © 2010 Springer Science+Business Media, LLC.

  4. Optimization of pulsed current GTAW process parameters for sintered hot forged AISI 4135 P/M steel welds by simulated annealing and genetic algorithm

    International Nuclear Information System (INIS)

    Joseph, Joby; Muthukumaran, S.

    2016-01-01

    Abundant improvements have occurred in materials handling, especially in metal joining. Pulsed current gas tungsten arc welding (PCGTAW) is one of the consequential fusion techniques. In this work, PCGTAW of AISI 4135 steel engendered through powder metallurgy (P/M) has been executed, and the process parameters have been optimized by applying Taguchi's L9 orthogonal array. The results show that the peak current (Ip), gas flow rate (GFR), welding speed (WS) and base current (Ib) are the critical parameters that strongly determine the tensile strength (TS) as well as the percentage of elongation (% Elong) of the joint. The practical impact of applying Genetic algorithm (GA) and Simulated annealing (SA) to the PCGTAW process has been validated by calculating the deviation between predicted and experimental welding process parameters

  5. Optimization of pulsed current GTAW process parameters for sintered hot forged AISI 4135 P/M steel welds by simulated annealing and genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Joseph, Joby; Muthukumaran, S. [National Institute of Technology, Tamil Nadu (India)

    2016-01-15

    Abundant improvements have occurred in materials handling, especially in metal joining. Pulsed current gas tungsten arc welding (PCGTAW) is one of the consequential fusion techniques. In this work, PCGTAW of AISI 4135 steel engendered through powder metallurgy (P/M) has been executed, and the process parameters have been optimized by applying Taguchi's L9 orthogonal array. The results show that the peak current (Ip), gas flow rate (GFR), welding speed (WS) and base current (Ib) are the critical parameters that strongly determine the tensile strength (TS) as well as the percentage of elongation (% Elong) of the joint. The practical impact of applying Genetic algorithm (GA) and Simulated annealing (SA) to the PCGTAW process has been validated by calculating the deviation between predicted and experimental welding process parameters.

  6. A Hybrid Genetic-Simulated Annealing Algorithm for the Location-Inventory-Routing Problem Considering Returns under E-Supply Chain Environment

    Directory of Open Access Journals (Sweden)

    Yanhui Li

    2013-01-01

    Full Text Available Facility location, inventory control, and vehicle route scheduling are critical and highly related problems in the design of logistics systems for e-business. Meanwhile, the return ratio in Internet sales is significantly higher than in traditional business. Much of the returned merchandise has no quality defects and can reenter sales channels after a simple repackaging process. Focusing on this problem in e-commerce logistics systems, we formulate a location-inventory-routing problem model with no-quality-defect returns. To solve this NP-hard problem, an effective hybrid genetic simulated annealing algorithm (HGSAA) is proposed. Results of numerical examples show that HGSAA outperforms GA on computing time, optimal solution, and computing stability. The proposed model is very useful in helping managers make the right decisions in an e-supply chain environment.
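
Genetic-simulated-annealing hybrids like the HGSAA above commonly keep the GA's crossover and mutation but let an SA-style test decide whether an offspring replaces its parent. A hedged sketch over bit-strings (the toy cost function is a stand-in for the location-inventory-routing objective, not the paper's model):

```python
import math
import random

def hybrid_ga_sa(cost, n_bits=12, pop_size=20, gens=60, t0=1.0, cooling=0.95, seed=7):
    """GA whose offspring replace parents via a simulated-annealing acceptance test."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    t = t0
    for _ in range(gens):
        nxt = []
        for parent in pop:
            mate = rng.choice(pop)
            cut = rng.randrange(1, n_bits)
            child = parent[:cut] + mate[cut:]          # one-point crossover
            i = rng.randrange(n_bits)
            child[i] ^= 1                              # bit-flip mutation
            d = cost(child) - cost(parent)
            # SA-style acceptance: keep worse children with Boltzmann probability,
            # which preserves diversity early and turns greedy as t decays.
            nxt.append(child if d <= 0 or rng.random() < math.exp(-d / t) else parent)
        pop = nxt
        t *= cooling
    return min(pop, key=cost)

# Toy cost: Hamming distance from a target pattern (a stand-in for routing cost).
target = [1, 0] * 6
cost = lambda b: sum(x != y for x, y in zip(b, target))
best = hybrid_ga_sa(cost)
```

The annealed acceptance is what distinguishes the hybrid from a plain GA: early generations tolerate worse offspring, which counteracts premature convergence.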

  7. 1-(2-furoyl)-3,3-(diphenyl)thiourea: spectroscopic characterization and structural study from X-ray powder diffraction using simulated annealing

    Energy Technology Data Exchange (ETDEWEB)

    Estevez H, O.; Duque, J. [Universidad de La Habana, Instituto de Ciencia y Tecnologia de Materiales, 10400 La Habana (Cuba); Rodriguez H, J. [UNAM, Instituto de Investigaciones en Materiales, 04510 Mexico D. F. (Mexico); Yee M, H., E-mail: oestevezh@yahoo.com [Instituto Politecnico Nacional, Escuela Superior de Fisica y Matematicas, 07738 Mexico D. F. (Mexico)

    2015-07-01

    1-Furoyl-3,3-diphenylthiourea (FDFT) was synthesized and characterized by FTIR, {sup 1}H and {sup 13}C NMR, and ab initio X-ray powder structure analysis. FDFT crystallizes in the monoclinic space group P2{sub 1} with a = 12.691(1), b = 6.026(2), c = 11.861(1) A, β = 117.95(2) deg and V = 801.5(3) A{sup 3}. The crystal structure has been determined from laboratory X-ray powder diffraction data using a direct-space global optimization strategy (simulated annealing) followed by Rietveld refinement. The thiourea group makes a dihedral angle of 73.8(6) deg with the furoyl group. In the crystal structure, molecules are linked by van der Waals interactions, forming one-dimensional chains along the a axis. (Author)

  8. A hybrid genetic-simulated annealing algorithm for the location-inventory-routing problem considering returns under e-supply chain environment.

    Science.gov (United States)

    Li, Yanhui; Guo, Hao; Wang, Lin; Fu, Jing

    2013-01-01

    Facility location, inventory control, and vehicle route scheduling are critical and highly related problems in the design of logistics systems for e-business. Meanwhile, the return ratio in Internet sales is significantly higher than in traditional business. Much of the returned merchandise has no quality defects and can reenter sales channels after a simple repackaging process. Focusing on this problem in e-commerce logistics systems, we formulate a location-inventory-routing problem model with no-quality-defect returns. To solve this NP-hard problem, an effective hybrid genetic simulated annealing algorithm (HGSAA) is proposed. Results of numerical examples show that HGSAA outperforms GA on computing time, optimal solution, and computing stability. The proposed model is very useful in helping managers make the right decisions in an e-supply chain environment.

  9. A Hybrid Genetic-Simulated Annealing Algorithm for the Location-Inventory-Routing Problem Considering Returns under E-Supply Chain Environment

    Science.gov (United States)

    Guo, Hao; Fu, Jing

    2013-01-01

    Facility location, inventory control, and vehicle route scheduling are critical and highly related problems in the design of a logistics system for e-business. Meanwhile, the return ratio in Internet sales is significantly higher than in traditional business, and much of the returned merchandise has no quality defects and can re-enter sales channels after a simple repackaging process. Focusing on this problem in e-commerce logistics systems, we formulate a location-inventory-routing problem model with no-quality-defect returns. To solve this NP-hard problem, an effective hybrid genetic simulated annealing algorithm (HGSAA) is proposed. Results of numerical examples show that HGSAA outperforms a GA in computing time, solution quality, and computing stability. The proposed model is very useful in helping managers make the right decisions in an e-supply chain environment. PMID:24489489

  10. A Pseudo-Parallel Genetic Algorithm Integrating Simulated Annealing for Stochastic Location-Inventory-Routing Problem with Consideration of Returns in E-Commerce

    Directory of Open Access Journals (Sweden)

    Bailing Liu

    2015-01-01

    Full Text Available Facility location, inventory control, and vehicle route scheduling are three key issues to be settled in the design of a logistics system for e-commerce. Due to the online shopping features of e-commerce, customer returns are much more common than in traditional commerce. This paper studies a three-phase supply chain distribution system consisting of one supplier, a set of retailers, and a single type of product with a continuous review (Q, r) inventory policy. We formulate a stochastic location-inventory-routing problem (LIRP) model with no-quality-defect returns. To solve this NP-hard problem, a pseudo-parallel genetic algorithm integrating simulated annealing (PPGASA) is proposed. The computational results show that PPGASA outperforms a GA in solution quality, computing time, and computing stability.

  11. FCFPYRO simulation of the first year FCF hot operation plan

    International Nuclear Information System (INIS)

    Liaw, J.R.; Li, S.X.; Benedict, R.W.

    1996-01-01

    A simulation study has been successfully completed according to the first-year FCF operational plan for the treatment of EBR-II spent fuels. Material flow by nuclide for each processing step, and radioactive decay during the process, are considered. The FCFPYRO code package is a very useful tool for providing the step-by-step information essential to analyses of operational strategy, process chemistry, heat removal, criticality safety, and radiological health issues in FCF.

  12. Optimization of parameter values for complex pulse sequences by simulated annealing: application to 3D MP-RAGE imaging of the brain.

    Science.gov (United States)

    Epstein, F H; Mugler, J P; Brookeman, J R

    1994-02-01

    A number of pulse sequence techniques, including magnetization-prepared gradient echo (MP-GRE), segmented GRE, and hybrid RARE, employ a relatively large number of variable pulse sequence parameters and acquire the image data during a transient signal evolution. These sequences have recently been proposed and/or used for clinical applications in the brain, spine, liver, and coronary arteries. Thus, the need for a method of deriving optimal pulse sequence parameter values for this class of sequences now exists. Due to the complexity of these sequences, conventional optimization approaches, such as applying differential calculus to signal difference equations, are inadequate. We have developed a general framework for adapting the simulated annealing algorithm to pulse sequence parameter value optimization, and applied this framework to the specific case of optimizing the white matter-gray matter signal difference for a T1-weighted variable flip angle 3D MP-RAGE sequence. Using our algorithm, the values of 35 sequence parameters, including the magnetization-preparation RF pulse flip angle and delay time, 32 flip angles in the variable flip angle gradient-echo acquisition sequence, and the magnetization recovery time, were derived. Optimized 3D MP-RAGE achieved up to a 130% increase in white matter-gray matter signal difference compared with optimized 3D RF-spoiled FLASH with the same total acquisition time. The simulated annealing approach was effective at deriving optimal parameter values for a specific 3D MP-RAGE imaging objective, and may be useful for other imaging objectives and sequences in this general class.
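
    The parameter-optimization pattern described above can be sketched generically. The code below is not the MP-RAGE signal model (the objective is a hypothetical smooth surrogate with a known optimum); it illustrates simulated annealing over a bounded vector of sequence-like parameters with a geometric cooling schedule:

```python
import math, random

def simulated_annealing(objective, x0, lo, hi, steps=5000, t0=1.0, t_end=1e-3, seed=0):
    """Maximize `objective` over a bounded parameter vector by simulated
    annealing with a geometric cooling schedule and per-parameter moves."""
    rng = random.Random(seed)
    x = list(x0)
    fx = objective(x)
    best, fbest = x[:], fx
    alpha = (t_end / t0) ** (1.0 / steps)   # geometric decay reaching t_end
    temp = t0
    for _ in range(steps):
        i = rng.randrange(len(x))           # perturb one randomly chosen parameter
        cand = x[:]
        cand[i] = min(hi[i], max(lo[i], cand[i] + rng.gauss(0, 0.1 * (hi[i] - lo[i]))))
        fc = objective(cand)
        # Metropolis rule for maximization: accept uphill always, downhill sometimes
        if fc >= fx or rng.random() < math.exp((fc - fx) / temp):
            x, fx = cand, fc
            if fx > fbest:
                best, fbest = x[:], fx
        temp *= alpha
    return best, fbest

# Hypothetical smooth surrogate for a "signal difference", maximal (= 0) at
# parameter values 0.3, 0.6, 0.9, ... -- a stand-in, not an MR signal equation
def surrogate(p):
    return -sum((v - 0.3 * (i + 1)) ** 2 for i, v in enumerate(p))

n = 5
best, fbest = simulated_annealing(surrogate, [0.0] * n, [-1.0] * n, [2.0] * n)
print([round(v, 2) for v in best], round(fbest, 4))
```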

  13. Application of simulated annealing in the simulation and optimization of the drying process of Zea mays malt

    Directory of Open Access Journals (Sweden)

    Marco A. C. Benvenga

    2011-10-01

    Full Text Available Kinetic simulation and optimization of the drying process of corn malt by simulated annealing (SA), to estimate the temperature and time parameters that preserve maximum amylase activity in the obtained product, are presented here. Germinated corn seeds were dried at 54-76 °C in a convective dryer, with periodic measurement of moisture content and enzymatic activity. The experimental data obtained were submitted to modeling, and simulation and optimization of the drying process were carried out using the SA method, a randomized improvement algorithm analogous to the physical annealing process. Results showed that the seeds were dry after 3 h to 5 h of drying. Among the models used in this work, the kinetic model of water diffusion into corn seeds showed the best fit. Drying temperature and time showed a quadratic influence on the enzymatic activity. Optimization through SA found the best condition at 54 °C and between 5.6 h and 6.4 h of drying, yielding a specific activity in the corn malt of 5.26±0.06 SKB/mg at 15.69±0.10% remaining moisture.

  14. A Graphical Interactive Simulation Environment for Production Planning in Bacon Factories

    DEFF Research Database (Denmark)

    Nielsen, Kirsten Mølgaard; Nielsen, Jens Frederik Dalsgaard

    1994-01-01

    The paper describes a graphical interactive simulation tool for production planning in bacon factories.

  15. Mathematical foundation of quantum annealing

    International Nuclear Information System (INIS)

    Morita, Satoshi; Nishimori, Hidetoshi

    2008-01-01

    Quantum annealing is a generic name for quantum algorithms that use quantum-mechanical fluctuations to search for the solution of an optimization problem. It shares its basic idea with quantum adiabatic evolution, studied actively in quantum computation. The present paper reviews the mathematical and theoretical foundations of quantum annealing. In particular, theorems are presented for the conditions under which quantum annealing converges to the target optimal state after an infinite-time evolution following the Schroedinger or stochastic (Monte Carlo) dynamics. It is proved that the same asymptotic behavior of the control parameter guarantees convergence for both the Schroedinger dynamics and the stochastic dynamics, in spite of the essential difference between these two types of dynamics. Also described are prescriptions to reduce errors in the final approximate solution obtained after a long but finite dynamical evolution of quantum annealing. It is shown that errors can be reduced significantly by an ingenious choice of annealing schedule (the time dependence of the control parameter) without qualitatively compromising computational complexity. A review is given of the derivation of the convergence condition for classical simulated annealing from the viewpoint of quantum adiabaticity, using a classical-quantum mapping.
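
    For context, the classical counterpart referred to in the last sentence, the well-known logarithmic cooling condition for simulated annealing (due to Geman and Geman), can be written compactly; here T(t) is the temperature at step t and c is a constant of the order of the largest energy barrier:

```latex
% Sufficient condition for convergence in probability of classical
% simulated annealing to the ground state (logarithmic schedule):
T(t) \;\ge\; \frac{c}{\ln(t + 2)}, \qquad t = 0, 1, 2, \ldots
```

    Schedules that cool faster than this (e.g. geometric cooling) are common in practice but forfeit the asymptotic convergence guarantee.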

  16. Electrode Materials, Thermal Annealing Sequences, and Lateral/Vertical Phase Separation of Polymer Solar Cells from Multiscale Molecular Simulations

    KAUST Repository

    Lee, Cheng-Kuang; Wodo, Olga; Ganapathysubramanian, Baskar; Pao, Chun-Wei

    2014-01-01

    Simulations are performed for various configurations of electrode materials as well as processing temperature. The complex CG molecular data are characterized using a novel extension of our graph-based framework to quantify morphology and establish a link...

  17. A comprehensive solution for simulating ultra-shallow junctions: From high dose/low energy implant to diffusion annealing

    International Nuclear Information System (INIS)

    Boucard, F.; Roger, F.; Chakarov, I.; Zhuk, V.; Temkin, M.; Montagner, X.; Guichard, E.; Mathiot, D.

    2005-01-01

    This paper presents a global approach permitting accurate simulation of the process of ultra-shallow junctions. Physically based models of dopant implantation (BCA) and diffusion (including point and extended defects coupling) are integrated within a unique simulation tool. A useful set of the relevant parameters has been obtained through an original calibration methodology. It is shown that this approach provides an efficient tool for process modelling.

  18. A comprehensive solution for simulating ultra-shallow junctions: From high dose/low energy implant to diffusion annealing

    Energy Technology Data Exchange (ETDEWEB)

    Boucard, F. [Silvaco Data Systems, 55 Rue Blaise Pascal, F38330 Montbonnot (France)]. E-mail: Frederic.Boucard@silvaco.com; Roger, F. [Silvaco Data Systems, 55 Rue Blaise Pascal, F38330 Montbonnot (France); Chakarov, I. [Silvaco Data Systems, 55 Rue Blaise Pascal, F38330 Montbonnot (France); Zhuk, V. [Silvaco Data Systems, 55 Rue Blaise Pascal, F38330 Montbonnot (France); Temkin, M. [Silvaco Data Systems, 55 Rue Blaise Pascal, F38330 Montbonnot (France); Montagner, X. [Silvaco Data Systems, 55 Rue Blaise Pascal, F38330 Montbonnot (France); Guichard, E. [Silvaco Data Systems, 55 Rue Blaise Pascal, F38330 Montbonnot (France); Mathiot, D. [InESS, CNRS and Universite Louis Pasteur, 23 Rue du Loess, F67037 Strasbourg (France)]. E-mail: Daniel.Mathiot@iness.c-strasbourg.fr

    2005-12-05

    This paper presents a global approach permitting accurate simulation of the process of ultra-shallow junctions. Physically based models of dopant implantation (BCA) and diffusion (including point and extended defects coupling) are integrated within a unique simulation tool. A useful set of the relevant parameters has been obtained through an original calibration methodology. It is shown that this approach provides an efficient tool for process modelling.

  19. Simulation-based planning for theater air warfare

    Science.gov (United States)

    Popken, Douglas A.; Cox, Louis A., Jr.

    2004-08-01

    Planning for Theatre Air Warfare can be represented as a hierarchy of decisions. At the top level, surviving airframes must be assigned to roles (e.g., Air Defense, Counter Air, Close Air Support, and AAF Suppression) in each time period in response to changing enemy air defense capabilities, remaining targets, and roles of opposing aircraft. At the middle level, aircraft are allocated to specific targets to support their assigned roles. At the lowest level, routing and engagement decisions are made for individual missions. The decisions at each level form a set of time-sequenced Courses of Action taken by opposing forces. This paper introduces a set of simulation-based optimization heuristics operating within this planning hierarchy to optimize allocations of aircraft. The algorithms estimate distributions for stochastic outcomes of the pairs of Red/Blue decisions. Rather than using traditional stochastic dynamic programming to determine optimal strategies, we use an innovative combination of heuristics, simulation-optimization, and mathematical programming. Blue decisions are guided by a stochastic hill-climbing search algorithm while Red decisions are found by optimizing over a continuous representation of the decision space. Stochastic outcomes are then provided by fast, Lanchester-type attrition simulations. This paper summarizes preliminary results from top and middle level models.
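
    The simulation-optimization idea above can be sketched with a stochastic hill climber that scores candidate role allocations by averaging noisy simulation runs. The payoff model below is a toy stand-in, not a Lanchester attrition model, and the role weights are hypothetical:

```python
import random

def simulate_outcome(alloc, rng):
    """Toy stochastic 'engagement' model: noisy payoff with diminishing
    returns per role. A stand-in for a fast attrition simulation."""
    weights = [3.0, 2.0, 1.5, 1.0]          # hypothetical value of each role
    base = sum(w * a ** 0.5 for w, a in zip(weights, alloc))
    return base + rng.gauss(0, 0.5)

def mean_payoff(alloc, rng, runs=30):
    """Estimate expected payoff by averaging repeated simulation runs."""
    return sum(simulate_outcome(alloc, rng) for _ in range(runs)) / runs

def hill_climb(total=20, roles=4, iters=300, seed=0):
    """Stochastic hill climbing over integer allocations of `total` aircraft
    to `roles` roles: move one aircraft between roles, keep if estimated better."""
    rng = random.Random(seed)
    alloc = [total // roles] * roles
    score = mean_payoff(alloc, rng)
    for _ in range(iters):
        i, j = rng.sample(range(roles), 2)
        if alloc[i] == 0:
            continue
        cand = alloc[:]
        cand[i] -= 1
        cand[j] += 1
        cand_score = mean_payoff(cand, rng)
        if cand_score > score:              # accept only estimated improvements
            alloc, score = cand, cand_score
    return alloc, score

alloc, score = hill_climb()
print(alloc, round(score, 2))
```

    Averaging over repeated runs is what makes hill climbing on a stochastic objective workable; the run count trades evaluation cost against the noise in each comparison.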

  20. Temperature Scaling Law for Quantum Annealing Optimizers.

    Science.gov (United States)

    Albash, Tameem; Martin-Mayor, Victor; Hen, Itay

    2017-09-15

    Physical implementations of quantum annealing unavoidably operate at finite temperatures. We point to a fundamental limitation of fixed finite-temperature quantum annealers that prevents them from functioning as competitive scalable optimizers, and show that to serve as optimizers, annealer temperatures must be appropriately scaled down with problem size. We derive a temperature scaling law dictating that the temperature must drop at the very least in a logarithmic manner, but possibly also as a power law, with problem size. We corroborate our results by experiments and simulations and discuss their implications for practical annealers.

  1. Building Performance Simulation tools for planning of energy efficiency retrofits

    DEFF Research Database (Denmark)

    Mondrup, Thomas Fænø; Karlshøj, Jan; Vestergaard, Flemming

    2014-01-01

    Designing energy efficiency retrofits for existing buildings will bring environmental, economic, social, and health benefits. However, selecting specific retrofit strategies is complex and requires careful planning. In this study, we describe a methodology for adopting Building Performance Simulation (BPS) tools as energy and environmentally conscious decision-making aids. The methodology has been developed to screen buildings for potential improvements and to support the development of retrofit strategies. We present a case study of a Danish renovation project, implementing BPS approaches to energy efficiency retrofits in social housing. To generate energy savings, we focus on optimizing the building envelope. We evaluate alternative building envelope actions using procedural solar radiation and daylight simulations. In addition, we identify the digital information flow and the information...

  2. Path planning of master-slave manipulator using graphic simulator

    International Nuclear Information System (INIS)

    Lee, J. Y.; Kim, S. H.; Song, T. K.; Park, B. S.; Yoon, J. S.

    2002-01-01

    To remotely handle high-level radioactive materials such as spent fuel, master-slave manipulators are generally used as remote handling equipment in the hot cell. To analyze their motion and to implement a training system based on virtual reality technology, a simulator for the M-S manipulator using computer graphics was developed. The parts are modeled in 3-D graphics, assembled, and assigned kinematics. The inverse kinematics of the manipulator is defined, and the slave is coupled with the master according to the manipulator's specification. Also, a virtual work cell identical to the real environment is implemented in the graphical environment, and a path planning method using collision detection for the manipulator is proposed. This graphic simulator can be used effectively in designing maintenance processes for hot cell equipment and enhances the reliability of spent fuel management.

  3. Virtual environment simulation as a tool to support evacuation planning

    International Nuclear Information System (INIS)

    Mol, Antonio C.; Grecco, Claudio H.S.; Santos, Isaac J.A.L.; Carvalho, Paulo V.R.; Jorge, Carlos A.F.; Sales, Douglas S.; Couto, Pedro M.; Botelho, Felipe M.; Bastos, Felipe R.

    2007-01-01

    This work is a preliminary study of the use of a free game engine as a tool to build and navigate virtual environments, with a good degree of realism, for virtual simulations of evacuation from buildings and risk zones. To achieve this goal, some adjustments to the game engine were implemented. A real four-floor building, consisting of rooms with furniture and people, was implemented virtually. Simulations of different simple evacuation scenarios were performed, measuring the total time spent in each case. The measured times were compared with the corresponding real evacuation times, measured in the real building. The first results demonstrated that the virtual environment built with the free game engine can reproduce the real situation at a satisfactory level. However, it is important to emphasize that such virtual simulations serve only as an aid in the planning of real evacuation exercises, and as such must never substitute for the latter. (author)

  4. Non-linear modeling of 1H NMR metabonomic data using kernel-based orthogonal projections to latent structures optimized by simulated annealing

    International Nuclear Information System (INIS)

    Fonville, Judith M.; Bylesjoe, Max; Coen, Muireann; Nicholson, Jeremy K.; Holmes, Elaine; Lindon, John C.; Rantalainen, Mattias

    2011-01-01

    Highlights: → Non-linear modeling of metabonomic data using K-OPLS. → Automated optimization of the kernel parameter by simulated annealing. → K-OPLS provides improved prediction performance for exemplar spectral data sets. → Software implementation available for R and Matlab under the GPL v2 license. - Abstract: Linear multivariate projection methods are frequently applied for predictive modeling of spectroscopic data in metabonomic studies. The OPLS method is a commonly used computational procedure for characterizing spectral metabonomic data, largely due to its favorable model interpretation properties, providing separate descriptions of predictive variation and response-orthogonal structured noise. However, when the relationship between descriptor variables and the response is non-linear, conventional linear models perform sub-optimally. In this study we evaluated to what extent a non-linear model, kernel-based orthogonal projections to latent structures (K-OPLS), can provide enhanced predictive performance compared to the linear OPLS model. Just like its linear counterpart, K-OPLS provides separate model components for predictive variation and response-orthogonal structured noise. The improved model interpretation afforded by this separate modeling is a property unique to K-OPLS in comparison to other kernel-based models. Simulated annealing (SA) was used for effective and automated optimization of the kernel-function parameter in K-OPLS (SA-K-OPLS). Our results reveal that the non-linear K-OPLS model provides improved prediction performance in three separate metabonomic data sets compared to the linear OPLS model. We also demonstrate how response-orthogonal K-OPLS components provide valuable biological interpretation of model and data. The metabonomic data sets were acquired using proton Nuclear Magnetic Resonance (NMR) spectroscopy, and include a study of the liver toxin galactosamine, a study of the nephrotoxin mercuric chloride and a study of
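
    The SA-driven kernel-parameter tuning described above can be illustrated on a much simpler kernel model. The sketch below substitutes a Nadaraya-Watson kernel smoother for K-OPLS (a stand-in, not the authors' model) and uses simulated annealing to tune the Gaussian kernel width against leave-one-out error; the data are synthetic:

```python
import math, random

def loo_error(x, y, width):
    """Leave-one-out mean squared error of a Gaussian (Nadaraya-Watson)
    kernel smoother with the given kernel width."""
    err = 0.0
    for i in range(len(x)):
        num = den = 0.0
        for j in range(len(x)):
            if i == j:
                continue
            w = math.exp(-((x[i] - x[j]) ** 2) / (2 * width ** 2))
            num += w * y[j]
            den += w
        pred = num / den if den > 0 else 0.0
        err += (y[i] - pred) ** 2
    return err / len(x)

def anneal_width(x, y, w0=1.0, steps=400, t0=0.5, seed=3):
    """Simulated annealing over the (log of the) kernel width, mirroring
    the idea of SA-based kernel-parameter optimization."""
    rng = random.Random(seed)
    logw = math.log(w0)
    e = loo_error(x, y, math.exp(logw))
    best_logw, best_e = logw, e
    temp = t0
    for _ in range(steps):
        cand = logw + rng.gauss(0, 0.3)
        ec = loo_error(x, y, math.exp(cand))
        if ec <= e or rng.random() < math.exp((e - ec) / temp):
            logw, e = cand, ec
            if e < best_e:
                best_logw, best_e = logw, e
        temp *= 0.99
    return math.exp(best_logw), best_e

# Noisy nonlinear toy data (hypothetical, standing in for spectral descriptors)
rng = random.Random(0)
xs = [i / 10 for i in range(40)]
ys = [math.sin(v) + rng.gauss(0, 0.1) for v in xs]
width, err = anneal_width(xs, ys)
print(round(width, 3), round(err, 4))
```

    Searching in log space keeps the width positive and makes multiplicative step sizes natural, a common choice for scale-like kernel parameters.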

  5. Simulated annealing method for electronic circuits design: adaptation and comparison with other optimization methods; La methode du recuit simule pour la conception des circuits electroniques: adaptation et comparaison avec d'autres methodes d'optimisation

    Energy Technology Data Exchange (ETDEWEB)

    Berthiau, G

    1995-10-01

    The circuit design problem consists in determining acceptable parameter values (resistors, capacitors, transistor geometries ...) which allow the circuit to meet various user-given operational criteria (DC consumption, AC bandwidth, transient times ...). This task is equivalent to a multidimensional and/or multi-objective optimization problem: n-variable functions have to be minimized in a hyper-rectangular domain; equality constraints may optionally be specified. A similar problem consists in fitting component models. In that case, the optimization variables are the model parameters, and the aim is to minimize a cost function built on the error between the model response and the data measured on the component. The optimization method chosen for this kind of problem is the simulated annealing method. This method, which comes from the combinatorial optimization domain, has been adapted and compared with other global optimization methods for continuous-variable problems. An efficient strategy of variable discretization and a set of complementary stopping criteria have been proposed. The different parameters of the method have been adjusted with analytical functions whose minima are known, classically used in the literature. Our simulated annealing algorithm has been coupled with the open electrical simulator SPICE-PAC, whose modular structure allows the chaining of the simulations required by the circuit optimization process. For high-dimensional problems, we proposed a partitioning technique which ensures proportionality between CPU time and the number of variables. To compare our method with others, we adapted three other methods from the combinatorial optimization domain: the threshold method, a genetic algorithm and the Tabu search method. The tests have been performed on the same set of test functions, and the results allow a first comparison between these methods applied to continuous optimization variables. (Abstract Truncated)

  6. Planning of overhead contact lines and simulation of the pantograph running; Oberleitungsplanung und Simulation des Stromabnehmerlaufes

    Energy Technology Data Exchange (ETDEWEB)

    Hofbauer, Gerhard [ALPINE-ENERGIE Oesterreich GmbH, Linz (Austria); Hofbauer, Werner

    2009-07-01

    Using the software FLTG, all planning steps for overhead contact lines can be carried out based on the parameters of the contact line type and the line data. Contact line supports and individual spans are presented graphically. The geometric interaction of pantograph and contact line can be simulated, taking into account the pantograph type, its sway and the wind action. Thus, the suitability of a line for the interoperability of the trans-European rail system can be demonstrated. (orig.)

  7. Study on Temperature and Synthetic Compensation of Piezo-Resistive Differential Pressure Sensors by Coupled Simulated Annealing and Simplex Optimized Kernel Extreme Learning Machine.

    Science.gov (United States)

    Li, Ji; Hu, Guoqing; Zhou, Yonghong; Zou, Chong; Peng, Wei; Alam Sm, Jahangir

    2017-04-19

    As a solution with a high performance-cost ratio for differential pressure measurement, piezo-resistive differential pressure sensors are widely used in engineering processes. However, their performance is severely affected by the environmental temperature and the static pressure applied to them. In order to correct the non-linear measuring characteristics of the piezo-resistive differential pressure sensor, compensation actions should synthetically consider these two aspects. Advantages such as nonlinear approximation capability, highly desirable generalization ability and computational efficiency make the kernel extreme learning machine (KELM) a practical approach for this critical task. Since the KELM model is intrinsically sensitive to the regularization parameter and the kernel parameter, a searching scheme combining the coupled simulated annealing (CSA) algorithm and the Nelder-Mead simplex algorithm is adopted to find an optimal KELM parameter set. A calibration experiment at different working pressure levels was conducted within the operating temperature range to assess the proposed method. In comparison with other compensation models such as the back-propagation neural network (BP), radial basis function neural network (RBF), particle swarm optimization optimized support vector machine (PSO-SVM), particle swarm optimization optimized least squares support vector machine (PSO-LSSVM) and extreme learning machine (ELM), the compensation results show that the presented compensation algorithm exhibits a more satisfactory performance with respect to temperature compensation and synthetic compensation problems.
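
    The coupled simulated annealing step of the search scheme can be sketched as follows. This is a simplified coupling rule, not the exact CSA formulation or the authors' KELM objective; the quadratic objective stands in for the cross-validation error surface over the two KELM parameters:

```python
import math, random

def coupled_sa(objective, dim, m=5, steps=400, t0=1.0, seed=7):
    """Simplified coupled simulated annealing: m chains explore in parallel,
    and each chain's acceptance probability is weighted by its energy
    relative to the ensemble, so poorly placed chains move more freely.
    (A simplified coupling rule, not the exact CSA formulation.)"""
    rng = random.Random(seed)
    xs = [[rng.uniform(-3, 3) for _ in range(dim)] for _ in range(m)]
    es = [objective(x) for x in xs]
    best, ebest = min(zip(xs, es), key=lambda p: p[1])
    temp = t0
    for _ in range(steps):
        emax = max(es)
        # coupling term: normalizes acceptance across the whole ensemble
        gamma = sum(math.exp((e - emax) / temp) for e in es)
        for i in range(m):
            cand = [v + rng.gauss(0, 0.2) for v in xs[i]]
            ec = objective(cand)
            accept = math.exp((es[i] - emax) / temp) / gamma  # worse chain, higher prob
            if ec < es[i] or rng.random() < accept:
                xs[i], es[i] = cand, ec
                if ec < ebest:
                    best, ebest = cand[:], ec
        temp *= 0.99
    return best, ebest

# Toy objective with minimum 0 at (1, -2) -- a hypothetical stand-in for the
# KELM cross-validation error over (regularization, kernel) parameters
obj = lambda p: (p[0] - 1) ** 2 + (p[1] + 2) ** 2
best, ebest = coupled_sa(obj, 2)
print([round(v, 2) for v in best], round(ebest, 4))
```

    In the full scheme described above, a local simplex search would then refine the best point returned by the coupled annealing stage.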

  8. Spectral fitting for signal assignment and structural analysis of uniformly {sup 13}C-labeled solid proteins by simulated annealing based on chemical shifts and spin dynamics

    Energy Technology Data Exchange (ETDEWEB)

    Matsuki, Yoh; Akutsu, Hideo; Fujiwara, Toshimichi [Osaka University, Institute for Protein Research (Japan)], E-mail: tfjwr@protein.osaka-u.ac.jp

    2007-08-15

    We describe an approach for signal assignment and structural analysis with a suite of two-dimensional {sup 13}C-{sup 13}C magic-angle-spinning solid-state NMR spectra of uniformly {sup 13}C-labeled peptides and proteins. We directly fit the calculated spectra to experimental ones by simulated annealing in the restrained molecular dynamics program CNS, as a function of atomic coordinates. The spectra are calculated from the conformation-dependent chemical shifts obtained with SHIFTX and the cross-peak intensities computed for recoupled dipolar interactions. This method was applied to a membrane-bound 14-residue peptide, mastoparan-X. The obtained C', C{sup {alpha}} and C{sup {beta}} chemical shifts agreed with those reported previously, with precisions of 0.2, 0.7 and 0.4 ppm, respectively. This spectral fitting program also provides backbone dihedral angles with a precision of about 50 deg. from the spectra, even with resonance overlaps. The restraints on the angles were improved by applying the protein database program TALOS to the obtained chemical shifts. The peptide structure provided by these restraints was consistent with the reported structure, with a backbone RMSD of about 1 A.

  9. Improvement of bio-corrosion resistance for Ti42Zr40Si15Ta3 metallic glasses in simulated body fluid by annealing within supercooled liquid region.

    Science.gov (United States)

    Huang, C H; Lai, J J; Wei, T Y; Chen, Y H; Wang, X; Kuan, S Y; Huang, J C

    2015-01-01

    The effects of nanocrystalline phases on the bio-corrosion behavior of highly bio-friendly Ti42Zr40Si15Ta3 metallic glasses in simulated body fluid were investigated, and the findings are compared with our previous observations on the Zr53Cu30Ni9Al8 metallic glasses. The Ti42Zr40Si15Ta3 metallic glasses were annealed at temperatures above the glass transition temperature, Tg, for different periods of time, resulting in different fractions of α-Ti nano-phases in the amorphous matrix. The nanocrystallized Ti42Zr40Si15Ta3 metallic glasses containing corrosion-resistant α-Ti phases exhibited more promising bio-corrosion resistance, due to their superior pitting resistance. This is distinctly different from the previous case of the Zr53Cu30Ni9Al8 metallic glasses, in which the reactive Zr2Cu phases induced serious galvanic corrosion and lowered the bio-corrosion resistance. Thus, whether the fully amorphous or the partially crystallized metallic glass exhibits better bio-corrosion resistance depends on the nature of the crystallized phase. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. Global minimum-energy structure and spectroscopic properties of I2(*-) x n H2O clusters: a Monte Carlo simulated annealing study.

    Science.gov (United States)

    Pathak, Arup Kumar; Mukherjee, Tulsi; Maity, Dilip Kumar

    2010-01-18

    The vibrational (IR and Raman) and photoelectron spectral properties of hydrated iodine-dimer radical-anion clusters, I(2)(*-) x n H(2)O (n=1-10), are presented. Several initial guess structures are considered for each size of cluster to locate the global minimum-energy structure by applying a Monte Carlo simulated annealing procedure including spin-orbit interaction. In the Raman spectrum, hydration reduces the intensity of the I-I stretching band but enhances the intensity of the O-H stretching band of water. Raman spectra of more highly hydrated clusters appear to be simpler than the corresponding IR spectra. Vibrational bands due to simultaneous stretching vibrations of O-H bonds in a cyclic water network are observed for I(2)(*-) x n H(2)O clusters with n > or = 3. The vertical detachment energy (VDE) profile shows stepwise saturation that indicates closing of the geometrical shell in the hydrated clusters on addition of every four water molecules. The calculated VDE of finite-size small hydrated clusters is extrapolated to evaluate the bulk VDE value of I(2)(*-) in aqueous solution as 7.6 eV at the CCSD(T) level of theory. Structure and spectroscopic properties of these hydrated clusters are compared with those of hydrated clusters of Cl(2)(*-) and Br(2)(*-).

  11. Rapid thermal pulse annealing

    International Nuclear Information System (INIS)

    Miller, M.G.; Koehn, B.W.; Chaplin, R.L.

    1976-01-01

    Characteristics of recovery processes have been investigated for cases of heating a sample to successively higher temperatures by means of isochronal annealing or by using a rapid pulse annealing. A recovery spectrum shows the same features independent of which annealing procedure is used. In order to determine which technique provides the best resolution, a study was made of how two independent first-order processes are separated for different heating rates and time increments of the annealing pulses. It is shown that the pulse-anneal method offers definite advantages over isochronal annealing when annealing for short time increments. Experimental data obtained by the pulse-anneal technique are given for the various substages of stage I of aluminium. (author)

  12. Solving Assembly Sequence Planning using Angle Modulated Simulated Kalman Filter

    Science.gov (United States)

    Mustapa, Ainizar; Yusof, Zulkifli Md.; Adam, Asrul; Muhammad, Badaruddin; Ibrahim, Zuwairie

    2018-03-01

    This paper presents an implementation of the Simulated Kalman Filter (SKF) algorithm for optimizing an Assembly Sequence Planning (ASP) problem. The SKF search strategy contains three simple steps: predict, measure, and estimate. The main objective of ASP is to determine the sequence of component installation that shortens assembly time or saves assembly cost. Initially, a permutation sequence is generated to represent each agent. Each agent is then subjected to a precedence-matrix constraint to produce a feasible assembly sequence. Next, the Angle Modulated SKF (AMSKF) is proposed for solving the ASP problem. The main idea of the angle-modulated approach to solving combinatorial optimization problems is to use a function, g(x), to create a continuous signal. The performance of the proposed AMSKF is compared against previous works solving ASP with BGSA, BPSO, and MSPSO. Using a case study of ASP, the results show that AMSKF outperformed all the other algorithms in obtaining the best solution.
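
    The angle-modulation idea mentioned above (using a generating function g(x) to turn a few real-valued variables into a bit string) can be shown directly. The sketch uses the standard four-coefficient generating function from the angle-modulation literature; the decoding of bits into a feasible assembly sequence, and the SKF search over the coefficients, are omitted:

```python
import math

def angle_modulated_bits(coeffs, n_bits):
    """Angle modulation: map four real coefficients (a, b, c, d) to a bit
    string by sampling the generating function
        g(x) = sin(2*pi*(x - a) * b * cos(2*pi*(x - a) * c)) + d
    at x = 0, 1, ..., n_bits - 1 and thresholding at zero. The continuous
    optimizer (here, SKF) only has to search the four reals."""
    a, b, c, d = coeffs
    bits = []
    for x in range(n_bits):
        g = math.sin(2 * math.pi * (x - a) * b * math.cos(2 * math.pi * (x - a) * c)) + d
        bits.append(1 if g > 0 else 0)
    return bits

# Example: a 4-real coefficient vector expands to a 12-bit candidate solution
bits = angle_modulated_bits((0.0, 0.5, 0.25, 0.1), 12)
print(bits)
```

    This reduction from an n-bit search space to four continuous variables is what lets a continuous optimizer like SKF tackle binary-encoded combinatorial problems.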

  13. Asian Rhinoplasty: Preoperative Simulation and Planning Using Adobe Photoshop.

    Science.gov (United States)

    Kiranantawat, Kidakorn; Nguyen, Anh H

    2015-11-01

    A rhinoplasty in Asians differs from a rhinoplasty performed in patients of other ethnicities. Surgeons should understand the concept of Asian beauty, the nasal anatomy of Asians, and common problems encountered while operating on the Asian nose. With this understanding, surgeons can set appropriate goals, choose proper operative procedures, and provide an outcome that satisfies patients. In this article the authors define the concept of an Asian rhinoplasty (a paradigm shift from the traditional on-top augmentation rhinoplasty to a structurally integrated augmentation rhinoplasty) and provide a step-by-step procedure for the use of Adobe Photoshop as a preoperative program to simulate the expected surgical outcome for patients and to develop a preoperative plan for surgeons.

  14. High-temperature annealing of graphite: A molecular dynamics study

    Science.gov (United States)

    Petersen, Andrew; Gillette, Victor

    2018-05-01

    A modified AIREBO potential was developed to simulate the effects of thermal annealing on the structure and physical properties of damaged graphite. AIREBO parameter modifications were made to reproduce Density Functional Theory interstitial results. These changes to the potential resulted in high-temperature annealing of the model, as measured by stored-energy reduction. These results show some resemblance to experimental high-temperature annealing results, and show promise that annealing effects in graphite are accessible with molecular dynamics and reactive potentials.

  15. Energy and Delay Optimization of Heterogeneous Multicore Wireless Multimedia Sensor Nodes by Adaptive Genetic-Simulated Annealing Algorithm

    Directory of Open Access Journals (Sweden)

    Xing Liu

    2018-01-01

    Full Text Available Energy efficiency and delay optimization are significant for the proliferation of wireless multimedia sensor networks (WMSN). In this article, an energy-efficient, delay-efficient hardware/software co-optimization platform is investigated to minimize the energy cost while guaranteeing the deadlines of real-time WMSN tasks. First, a multicore reconfigurable WMSN hardware platform is designed and implemented. This platform uses both a heterogeneous multicore architecture and the dynamic voltage and frequency scaling (DVFS) technique. By this means, the nodes can adjust their hardware characteristics dynamically according to the software's run-time context. Consequently, the software can be executed more efficiently, with less energy cost and shorter execution time. Then, based on this hardware platform, an energy and delay multiobjective optimization algorithm and a DVFS adaptation algorithm are investigated. These algorithms aim to find the global energy-optimal solution within an acceptable calculation time and to strip the time redundancy from the task execution process. Thus, the energy efficiency of the WMSN node can be improved significantly even under strict execution-time constraints. Simulation and real-world experiments showed that the proposed approaches can decrease the energy cost by more than 29% compared to a traditional single-core WMSN node. Moreover, the node can react quickly to time-sensitive events.
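    The energy/delay trade-off exploited by DVFS can be illustrated with a toy first-order CMOS model; the capacitance, voltages and frequencies below are invented illustrative numbers, not values from the article:

```python
def dvfs_energy_delay(cycles, freq_hz, volt, cap_farads=1e-9):
    """Toy DVFS model: dynamic power P = C * V^2 * f,
    execution time t = cycles / f, energy E = P * t."""
    power = cap_farads * volt ** 2 * freq_hz
    time_s = cycles / freq_hz
    return power * time_s, time_s

# Same 2e8-cycle task at a high and a scaled-down operating point
e_hi, t_hi = dvfs_energy_delay(2e8, 200e6, 1.2)
e_lo, t_lo = dvfs_energy_delay(2e8, 100e6, 0.9)
```

    Since E = C * V^2 * cycles, lowering voltage (which permits a lower frequency) cuts energy per task, at the price of a longer execution time; that is exactly the tension the article's deadline-constrained optimization has to balance.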

  16. Comparison of CT-based 3D treatment planning with simulator planning of pelvic irradiation of primary cervical carcinoma

    International Nuclear Information System (INIS)

    Knocke, T.H.; Pokrajac, B.; Fellner, C.; Poetter, R.

    1999-01-01

    In a prospective study of 20 consecutive patients with primary cervical carcinoma in Stages I to III, simulator planning of a 4-field box technique was performed. After the planning target volume (PTV) was defined in the 3D planning system, the field configuration of the simulator planning was transferred. The resulting plan was compared to a second one based on the defined PTV and evaluated regarding possible geographical misses and encompassment of the PTV by the treated volume (ICRU). Volumes of open and shaped portals were calculated for both techniques. Planning by simulation resulted in 1 geographical miss, and in 10 more cases the encompassment of the PTV by the treated volume was inadequate. For a PTV of 1,729 cm³ (mean), the mean volume defined by simulation was 3,120 cm³ for the open portals and 2,702 cm³ for the shaped portals. The volume reduction by blocks was 13.4% (mean). With CT-based 3D treatment planning, the volume of the open portals was enlarged by 3.3% (mean) to 3,224 cm³. The resulting mean volume of the shaped portals was 2,458 cm³; the reduction compared to the open portals was 23.8% (mean). The treated volumes were 244 cm³ or 9% (mean) smaller compared to simulator planning. The treated volume/planning target volume ratio decreased from 1.59 to 1.42. (orig.) [de

  17. Dosimetry audit simulation of treatment planning system in multicenters radiotherapy

    Science.gov (United States)

    Kasmuri, S.; Pawiro, S. A.

    2017-07-01

    The treatment planning system (TPS) is an important modality that determines radiotherapy outcome. A TPS requires input data obtained through commissioning, where errors can potentially occur; errors at this stage may result in systematic errors. The aim of this study was to verify TPS dosimetry and determine the range of deviation between calculated and measured dose. This study used the CIRS 002LFC phantom, representing the human thorax, and simulated all stages of external beam radiotherapy. The phantom was scanned using a CT scanner, and 8 test cases similar to clinical practice situations were planned and then tested in four radiotherapy centers. Dose was measured using a 0.6 cc ionization chamber. The results of this study showed that, generally, the deviation for all test cases in the four centers was within the agreement criteria, with average deviations of about -0.17±1.59%, -1.64±1.92%, 0.34±1.34% and 0.13±1.81%. The conclusion of this study was that all TPSs involved showed good performance. The superposition algorithm showed rather poorer performance than either the anisotropic analytical algorithm (AAA) or the convolution algorithm, with average deviations of about -1.64±1.92%, -0.17±1.59% and -0.27±1.51%, respectively.
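    The per-center figures quoted above have the form mean±standard deviation of the percent difference between calculated and measured dose. A minimal sketch of that statistic, with hypothetical dose pairs:

```python
import statistics

def percent_deviation(calculated, measured):
    """Percent deviation of TPS-calculated dose from measured dose."""
    return (calculated - measured) / measured * 100.0

# Hypothetical calculated/measured dose pairs in Gy (illustrative only)
pairs = [(2.01, 2.00), (1.96, 2.00), (2.03, 2.00), (1.99, 2.00)]
devs = [percent_deviation(c, m) for c, m in pairs]
mean_dev = statistics.mean(devs)   # average deviation across test cases
std_dev = statistics.stdev(devs)   # spread, reported after the ± sign
```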

  18. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    International Nuclear Information System (INIS)

    2006-01-01

    The Performance Demonstration Program (PDP) for headspace gases distributes sample gases of volatile organic compounds (VOCs) for analysis. Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility's compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document. Participating measurement

  19. Performance Demonstration Program Plan for Analysis of Simulated Headspace Gases

    International Nuclear Information System (INIS)

    2007-01-01

    The Performance Demonstration Program (PDP) for headspace gases distributes blind audit samples in a gas matrix for analysis of volatile organic compounds (VOCs). Participating measurement facilities (i.e., fixed laboratories, mobile analysis systems, and on-line analytical systems) are located across the United States. Each sample distribution is termed a PDP cycle. These evaluation cycles provide an objective measure of the reliability of measurements performed for transuranic (TRU) waste characterization. The primary documents governing the conduct of the PDP are the Quality Assurance Program Document (QAPD) (DOE/CBFO-94-1012) and the Waste Isolation Pilot Plant (WIPP) Waste Analysis Plan (WAP) contained in the Hazardous Waste Facility Permit (NM4890139088-TSDF) issued by the New Mexico Environment Department (NMED). The WAP requires participation in the PDP; the PDP must comply with the QAPD and the WAP. This plan implements the general requirements of the QAPD and the applicable requirements of the WAP for the Headspace Gas (HSG) PDP. Participating measurement facilities analyze blind audit samples of simulated TRU waste package headspace gases according to the criteria set by this PDP Plan. Blind audit samples (hereafter referred to as PDP samples) are used as an independent means to assess each measurement facility's compliance with the WAP quality assurance objectives (QAOs). To the extent possible, the concentrations of VOC analytes in the PDP samples encompass the range of concentrations anticipated in actual TRU waste package headspace gas samples. Analyses of headspace gases are required by the WIPP to demonstrate compliance with regulatory requirements. These analyses must be performed by measurement facilities that have demonstrated acceptable performance in this PDP. These analyses are referred to as WIPP analyses and the TRU waste package headspace gas samples on which they are performed are referred to as WIPP samples in this document

  20. Mesoscale Benchmark Demonstration Problem 1: Mesoscale Simulations of Intra-granular Fission Gas Bubbles in UO2 under Post-irradiation Thermal Annealing

    Energy Technology Data Exchange (ETDEWEB)

    Li, Yulan; Hu, Shenyang Y.; Montgomery, Robert; Gao, Fei; Sun, Xin; Tonks, Michael; Biner, Bullent; Millet, Paul; Tikare, Veena; Radhakrishnan, Balasubramaniam; Andersson, David

    2012-04-11

    A study was conducted to evaluate the capabilities of different numerical methods used to represent microstructure behavior at the mesoscale for irradiated material using an idealized benchmark problem. The purpose of the mesoscale benchmark problem was to provide a common basis to assess several mesoscale methods with the objective of identifying the strengths and areas of improvement in the predictive modeling of microstructure evolution. In this work, mesoscale models (phase-field, Potts, and kinetic Monte Carlo) developed by PNNL, INL, SNL, and ORNL were used to calculate the evolution kinetics of intra-granular fission gas bubbles in UO2 fuel under post-irradiation thermal annealing conditions. The benchmark problem was constructed to include important microstructural evolution mechanisms on the kinetics of intra-granular fission gas bubble behavior such as the atomic diffusion of Xe atoms, U vacancies, and O vacancies, the effect of vacancy capture and emission from defects, and the elastic interaction of non-equilibrium gas bubbles. An idealized set of assumptions was imposed on the benchmark problem to simplify the mechanisms considered. The capability and numerical efficiency of different models are compared against selected experimental and simulation results. These comparisons find that the phase-field methods, by the nature of the free energy formulation, are able to represent a larger subset of the mechanisms influencing the intra-granular bubble growth and coarsening mechanisms in the idealized benchmark problem as compared to the Potts and kinetic Monte Carlo methods. It is recognized that the mesoscale benchmark problem as formulated does not specifically highlight the strengths of the discrete particle modeling used in the Potts and kinetic Monte Carlo methods. Future efforts are recommended to construct increasingly more complex mesoscale benchmark problems to further verify and validate the predictive capabilities of the mesoscale modeling

  1. Efficacy of very fast simulated annealing global optimization method for interpretation of self-potential anomaly by different forward formulation over 2D inclined sheet type structure

    Science.gov (United States)

    Biswas, A.; Sharma, S. P.

    2012-12-01

    Self-potential (SP) anomaly is an important geophysical technique that measures the electrical potential due to natural current sources in the Earth's subsurface. An inclined sheet-type model is a very familiar structure associated with mineralization, fault planes, groundwater flow and many other geological features that exhibit self-potential anomalies. A number of linearized and global inversion approaches have been developed for the interpretation of SP anomalies over different structures for various purposes. The mathematical expression to compute the forward response over a two-dimensional dipping sheet-type structure can be written in three different ways, using five variables in each case, and the complexity of inversion differs between the three forward approaches. In the present study, interpretation of self-potential anomalies using very fast simulated annealing (VFSA) global optimization has been developed, which yielded new insight into the uncertainty and equivalence in model parameters. Interpretation of the measured data yields the location of the causative body, depth to the top, extension, dip and quality of the causative body. A comparative evaluation of the three forward approaches in the interpretation of self-potential anomalies is performed to assess the efficacy of each approach in resolving possible ambiguity. Even though each forward formulation yields the same forward response, optimizing different sets of variables using different forward problems poses different kinds of ambiguity in the interpretation. The performance of the three approaches in optimization has been compared, and it is observed that one of the three methods is best suited for this kind of study. Our VFSA approach has been tested on synthetic, noisy and field data for the three different methods to show the efficacy and suitability of the best method. It is important to use the forward problem in the optimization that yields the
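    A minimal sketch of the VFSA machinery, assuming Ingber's temperature-dependent (Cauchy-like) move generation and an exponential cooling schedule; the toy two-parameter misfit below stands in for the sheet-model forward problems discussed above:

```python
import math, random

def vfsa_step(x, temp, lo, hi):
    """Ingber-style VFSA move: y = sgn(u-1/2) * T * ((1 + 1/T)**|2u-1| - 1),
    scaled by the parameter range and clamped to the bounds."""
    u = random.random()
    y = math.copysign(temp * ((1 + 1 / temp) ** abs(2 * u - 1) - 1), u - 0.5)
    return min(hi, max(lo, x + y * (hi - lo)))

def vfsa(objective, bounds, t0=1.0, c=1.0, iters=2000, seed=1):
    """Minimal single-chain VFSA sketch (illustrative, not the authors' code)."""
    random.seed(seed)
    ndim = len(bounds)
    x = [random.uniform(lo, hi) for lo, hi in bounds]
    fx = objective(x)
    best, fbest = list(x), fx
    for k in range(1, iters + 1):
        temp = t0 * math.exp(-c * k ** (1.0 / ndim))  # VFSA cooling schedule
        cand = [vfsa_step(xi, temp, lo, hi) for xi, (lo, hi) in zip(x, bounds)]
        fc = objective(cand)
        # Metropolis acceptance at the current temperature
        if fc < fx or random.random() < math.exp(-(fc - fx) / max(temp, 1e-12)):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
    return best, fbest

# Toy misfit whose minimum is at the "true model" (1.0, -2.0)
best, fbest = vfsa(lambda m: (m[0] - 1) ** 2 + (m[1] + 2) ** 2,
                   [(-5, 5), (-5, 5)])
```

    In a real SP inversion the lambda would be replaced by the misfit between observed anomalies and one of the three forward formulations, with the five sheet parameters as the model vector.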

  2. Optimal design of a water distribution system (WDS) applying the Simulated Annealing (SA) algorithm

    Directory of Open Access Journals (Sweden)

    Maikel Méndez-Morales

    2014-09-01

    Full Text Available This article presents the application of the Simulated Annealing (SA) algorithm to the optimal design of a water distribution system (WDS). SA is a metaheuristic search algorithm based on an analogy between the annealing process in metals (the controlled cooling of a body) and the solution of combinatorial optimization problems. The SA algorithm, together with various mathematical models, has been used successfully in the optimal design of WDSs. The full-scale WDS of the community of Marsella, in San Carlos, Costa Rica, was used as a case study. The SA algorithm was implemented in the well-known EPANET model through the WaterNetGen extension. Three different automated variants of the SA algorithm were compared with the manual trial-and-error design of the Marsella WDS, using only unit pipe costs. The results show that all three automated SA schemes yielded unit costs below 0.49 as a fraction of the original cost of the trial-and-error design. This demonstrates that the SA algorithm is capable of optimizing combinatorial problems tied to the minimum-cost design of full-scale water distribution systems.
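    The combinatorial design step can be sketched with a simulated annealing toy that assumes a penalty formulation; the pipe options and the single aggregate capacity constraint below are invented stand-ins for the hydraulic model that EPANET/WaterNetGen would supply:

```python
import math, random

def sa_pipe_design(options, min_total_capacity, iters=5000, t0=10.0, seed=7):
    """Toy combinatorial SA: pick one diameter option per pipe to minimize
    cost, with a large penalty when total capacity falls short."""
    random.seed(seed)
    n = len(options)  # options[i] = list of (unit_cost, capacity) per pipe i

    def energy(sol):
        cost = sum(options[i][sol[i]][0] for i in range(n))
        cap = sum(options[i][sol[i]][1] for i in range(n))
        return cost + 1000.0 * max(0.0, min_total_capacity - cap)

    sol = [random.randrange(len(o)) for o in options]
    e = energy(sol)
    best, ebest = list(sol), e
    for k in range(iters):
        temp = t0 * 0.999 ** k  # geometric cooling schedule
        cand = list(sol)
        i = random.randrange(n)
        cand[i] = random.randrange(len(options[i]))  # resize one pipe
        ec = energy(cand)
        if ec < e or random.random() < math.exp(-(ec - e) / max(temp, 1e-9)):
            sol, e = cand, ec
            if e < ebest:
                best, ebest = list(sol), e
    return best, ebest

# Three pipes, each with small/medium/large (unit_cost, capacity) options
opts = [[(1.0, 1.0), (2.0, 2.5), (4.0, 5.0)]] * 3
best, ebest = sa_pipe_design(opts, min_total_capacity=6.0)
```

    Here the cheapest feasible design (two medium pipes plus one small, capacity exactly 6.0) has cost 5.0; a real WDS run would score each candidate with a hydraulic solver instead of the capacity sum.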

  3. Modernizing quantum annealing using local searches

    International Nuclear Information System (INIS)

    Chancellor, Nicholas

    2017-01-01

    I describe how real quantum annealers may be used to perform local (in state space) searches around specified states, rather than the global searches traditionally implemented in the quantum annealing algorithm (QAA). Such protocols will have numerous advantages over simple quantum annealing. By using such searches the effect of problem mis-specification can be reduced, as only energy differences between the searched states will be relevant. The QAA is an analogue of simulated annealing, a classical numerical technique which has now been superseded. Hence, I explore two strategies to use an annealer in a way which takes advantage of modern classical optimization algorithms. Specifically, I show how sequential calls to quantum annealers can be used to construct analogues of population annealing and parallel tempering which use quantum searches as subroutines. The techniques given here can be applied not only to optimization, but also to sampling. I examine the feasibility of these protocols on real devices and note that implementing such protocols should require minimal if any change to the current design of the flux qubit-based annealers by D-Wave Systems Inc. I further provide proof-of-principle numerical experiments based on quantum Monte Carlo that demonstrate simple examples of the discussed techniques. (paper)
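    A classical parallel-tempering sketch of the kind the paper proposes quantum analogues of; the temperatures, step size and toy landscape are illustrative assumptions:

```python
import math, random

def parallel_tempering(energy, dim, temps=(0.1, 1.0, 10.0),
                       iters=3000, step=0.5, seed=3):
    """Minimal parallel-tempering sketch: Metropolis moves within each
    replica plus swap attempts between adjacent temperatures."""
    random.seed(seed)
    replicas = [[random.uniform(-5, 5) for _ in range(dim)] for _ in temps]
    energies = [energy(r) for r in replicas]
    best = min(energies)
    for _ in range(iters):
        # Local move in each replica at its own temperature
        for i, temp in enumerate(temps):
            cand = [x + random.gauss(0, step) for x in replicas[i]]
            de = energy(cand) - energies[i]
            if de < 0 or random.random() < math.exp(-de / temp):
                replicas[i], energies[i] = cand, energies[i] + de
                best = min(best, energies[i])
        # Swap attempt between a random adjacent temperature pair:
        # accept with probability min(1, exp(d)), d = (b_i - b_j)(E_i - E_j)
        i = random.randrange(len(temps) - 1)
        d = (1 / temps[i] - 1 / temps[i + 1]) * (energies[i] - energies[i + 1])
        if d > 0 or random.random() < math.exp(d):
            replicas[i], replicas[i + 1] = replicas[i + 1], replicas[i]
            energies[i], energies[i + 1] = energies[i + 1], energies[i]
    return best

# Rugged 1-D toy landscape with global minimum 0 at x = 0
best = parallel_tempering(lambda x: x[0] ** 2 + math.sin(5 * x[0]) ** 2, 1)
```

    In the scheme the paper discusses, the local Metropolis sweep would be replaced by a call to the quantum annealer performing a state-space search around the replica's current state.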

  4. RTSTEP regional transportation simulation tool for emergency planning - final report.

    Energy Technology Data Exchange (ETDEWEB)

    Ley, H.; Sokolov, V.; Hope, M.; Auld, J.; Zhang, K.; Park, Y.; Kang, X. (Energy Systems)

    2012-01-20

    such materials over a large area, with responders trying to mitigate the immediate danger to the population in a variety of ways that may change over time (e.g., in-place evacuation, staged evacuations, and declarations of growing evacuation zones over time). In addition, available resources will be marshaled in unusual ways, such as the repurposing of transit vehicles to support mass evacuations. Thus, any simulation strategy will need to be able to address highly dynamic effects and will need to be able to handle any mode of ground transportation. Depending on the urgency and timeline of the event, emergency responders may also direct evacuees to leave largely on foot, keeping roadways as clear as possible for emergency responders, logistics, mass transport, and law enforcement. This RTSTEP project developed a regional emergency evacuation modeling tool for the Chicago Metropolitan Area that emergency responders can use to pre-plan evacuation strategies and compare different response strategies on the basis of a rather realistic model of the underlying complex transportation system. This approach is a significant improvement over existing response strategies that are largely based on experience gained from small-scale events, anecdotal evidence, and extrapolation to the scale of the assumed emergency. The new tool will thus add to the toolbox available to emergency response planners to help them design appropriate generalized procedures and strategies that lead to an improved outcome when used during an actual event.

  5. Performance demonstration program plan for analysis of simulated headspace gases

    International Nuclear Information System (INIS)

    1995-06-01

    The Performance Demonstration Program (PDP) for analysis of headspace gases will consist of regular distribution and analyses of test standards to evaluate the capability for analyzing VOCs, hydrogen, and methane in the headspace of transuranic (TRU) waste throughout the Department of Energy (DOE) complex. Each distribution is termed a PDP cycle. These evaluation cycles will provide an objective measure of the reliability of measurements performed for TRU waste characterization. Laboratory performance will be demonstrated by the successful analysis of blind audit samples of simulated TRU waste drum headspace gases according to the criteria set within the text of this Program Plan. Blind audit samples (hereinafter referred to as PDP samples) will be used as an independent means to assess laboratory performance regarding compliance with the QAPP QAOs. The concentration of analytes in the PDP samples will encompass the range of concentrations anticipated in actual waste characterization gas samples. Analyses which are required by the WIPP to demonstrate compliance with various regulatory requirements and which are included in the PDP must be performed by laboratories which have demonstrated acceptable performance in the PDP

  6. Test Plan for the Boiling Water Reactor Dry Cask Simulator

    Energy Technology Data Exchange (ETDEWEB)

    Durbin, Samuel [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Lindgren, Eric R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    canister. The symmetric single assembly geometry with well-controlled boundary conditions simplifies interpretation of results. Various configurations of outer concentric ducting will be used to mimic conditions for above and below-ground storage configurations of vertical, dry cask systems with canisters. Radial and axial temperature profiles will be measured for a wide range of decay power and helium cask pressures. Of particular interest is the evaluation of the effect of increased helium pressure on allowable heat load and the effect of simulated wind on a simplified below ground vent configuration. While incorporating the best available information, this test plan is subject to changes due to improved understanding from modeling or from as-built deviations to designs. As-built conditions and actual procedures will be documented in the final test report.

  7. Perceived Speech Privacy in Computer Simulated Open-plan Offices

    DEFF Research Database (Denmark)

    Pop, Claudiu B.; Rindel, Jens Holger

    2005-01-01

    In open plan offices the lack of speech privacy between the workstations is one of the major acoustic problems. Improving the speech privacy in an open plan design is therefore the main concern for a successful open plan environment. The project described in this paper aimed to find an objective parameter that correlates well with the perceived degree of speech privacy and to derive a clear method for evaluating the acoustic conditions in open plan offices. Acoustic measurements were carried out in an open plan office, followed by data analysis at the Acoustic Department, DTU. A computer model

  8. Determination of photon contamination dose of clinical electron beams using the generalized simulated annealing method; Determinação da dose dos fótons contaminantes de feixes de elétrons clínicos usando o Método de Recozimento Simulado Generalizado

    Energy Technology Data Exchange (ETDEWEB)

    Visbal, Jorge H. Wilches; Costa, Alessandro M. da, E-mail: jhwilchev@gmail.com [Universidade de Sao Paulo (USP), Ribeirão Preto, SP (Brazil). Faculdade de Filosofia, Ciências e Letras

    2017-07-01

    Clinical electron beams are composed of a mixture of pure electrons and Bremsstrahlung photons produced in the structures of the accelerator head as well as in the air. Accurate knowledge of these components is important for dose calculation and treatment planning. There are at least two approaches to determine the contribution of photons to the percentage depth dose of clinical electrons: a) an analytical method that calculates the photon dose from a prior determination of the incident Bremsstrahlung photon spectrum; b) an adjustment method based on a semi-empirical biexponential formula whose four parameters must be established by optimization methods. The results show that the generalized simulated annealing method can calculate the photon contamination dose, overestimating the dose in the tail by no more than 0.6% of the maximum dose (electrons and photons). (author)
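    The adjustment method can be sketched as follows, assuming a generic biexponential form D(z) = p1*exp(-p2*z) + p3*exp(-p4*z) (the exact published formula may differ) and a plain simulated-annealing fit as a stand-in for the generalized simulated annealing used in the paper:

```python
import math, random

def biexp(z, p):
    """Assumed semi-empirical biexponential photon-tail model with
    four parameters, as in the adjustment method."""
    return p[0] * math.exp(-p[1] * z) + p[2] * math.exp(-p[3] * z)

def fit_sa(zs, ds, p0, iters=4000, t0=1.0, seed=5):
    """Fit the four parameters by a simple simulated-annealing search."""
    random.seed(seed)

    def misfit(p):
        return sum((biexp(z, p) - d) ** 2 for z, d in zip(zs, ds))

    p, e = list(p0), misfit(p0)
    best, ebest = list(p), e
    for k in range(iters):
        temp = t0 * 0.998 ** k
        # Gaussian perturbation, keeping all parameters positive
        cand = [max(1e-6, pi + random.gauss(0, 0.05)) for pi in p]
        ec = misfit(cand)
        if ec < e or random.random() < math.exp(-(ec - e) / max(temp, 1e-9)):
            p, e = cand, ec
            if e < ebest:
                best, ebest = list(p), e
    return best, ebest

# Synthetic depth-dose tail (arbitrary units) from known parameters
truth = [1.0, 0.3, 0.5, 1.5]
zs = [0.5 * i for i in range(20)]
ds = [biexp(z, truth) for z in zs]
params, err = fit_sa(zs, ds, p0=[0.8, 0.5, 0.3, 1.0])
```

    With measured percentage-depth-dose data in place of the synthetic tail, the fitted curve gives the photon contamination component to subtract from the total dose.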

  9. Adaptive Planning: Understanding Organizational Workload to Capability/ Capacity through Modeling and Simulation

    Science.gov (United States)

    Hase, Chris

    2010-01-01

    In August 2003, the Secretary of Defense (SECDEF) established the Adaptive Planning (AP) initiative [1] with the objective of reducing the time necessary to develop and revise Combatant Commander (COCOM) contingency plans and increasing SECDEF plan visibility. In addition to reducing the traditional plan development timeline from twenty-four months to less than twelve months (with a goal of six months) [2], AP increased plan visibility to Department of Defense (DoD) leadership through In-Progress Reviews (IPRs). The IPR process, as well as the increased number of campaign and contingency plans COCOMs had to develop, increased the workload while the number of planners remained fixed. Several efforts, from collaborative planning tools to streamlined processes, were initiated to compensate for the increased workload, enabling COCOMs to better meet shorter planning timelines. This paper examines the Joint Strategic Capabilities Plan (JSCP) directed contingency planning and staffing requirements assigned to a combatant commander staff through the lens of modeling and simulation. The dynamics of developing a COCOM plan are captured with an ExtendSim [3] simulation. The resulting analysis provides a quantifiable means by which to measure a combatant commander staff's workload associated with developing and staffing JSCP [4] directed contingency plans against COCOM capability/capacity. Modeling and simulation bring significant opportunities in measuring the sensitivity of key variables in the assessment of workload to capability/capacity analysis. Gaining an understanding of the relationship between plan complexity, number of plans, planning processes, and number of planners and the time required for plan development provides valuable information to DoD leadership. Through modeling and simulation, AP leadership can gain greater insight for making key decisions on where best to allocate scarce resources in an effort to meet DoD planning objectives.
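    The workload-to-capacity question can be illustrated with a toy Monte Carlo in the spirit of (but much simpler than) the ExtendSim model; all effort figures below are invented:

```python
import random

def simulate_workload(n_plans, n_planners, mean_months=6.0,
                      trials=1000, seed=11):
    """Toy Monte Carlo: each plan needs a random effort in planner-months,
    plans are assigned to the least-loaded planner, and the makespan
    (months until all plans are finished) is averaged over trials."""
    random.seed(seed)
    total = 0.0
    for _ in range(trials):
        loads = [0.0] * n_planners
        for _ in range(n_plans):
            effort = random.uniform(0.5 * mean_months, 1.5 * mean_months)
            loads[loads.index(min(loads))] += effort  # greedy assignment
        total += max(loads)
    return total / trials

# More planners shorten the expected time to complete the same plan set
m4 = simulate_workload(12, 4)
m6 = simulate_workload(12, 6)
```

    Varying plan complexity, plan count and staff size in a model like this is the kind of sensitivity analysis the paper describes, only with a far richer process model behind it.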

  10. Sequential use of simulation and optimization in analysis and planning

    Science.gov (United States)

    Hans R. Zuuring; Jimmie D. Chew; J. Greg Jones

    2000-01-01

    Management activities are analyzed at landscape scales employing both simulation and optimization. SIMPPLLE, a stochastic simulation modeling system, is initially applied to assess the risks associated with a specific natural process occurring on the current landscape without management treatments, but with fire suppression. These simulation results are input into...

  11. WE-H-BRC-09: Simulated Errors in Mock Radiotherapy Plans to Quantify the Effectiveness of the Physics Plan Review

    International Nuclear Information System (INIS)

    Gopan, O; Kalet, A; Smith, W; Hendrickson, K; Kim, M; Young, L; Nyflot, M; Chvetsov, A; Phillips, M; Ford, E

    2016-01-01

    Purpose: A standard tool for ensuring the quality of radiation therapy treatments is the initial physics plan review. However, little is known about its performance in practice. The goal of this study is to measure the effectiveness of physics plan review by introducing simulated errors into “mock” treatment plans and measuring the performance of plan review by physicists. Methods: We generated six mock treatment plans containing multiple errors. These errors were based on incident learning system data both within the department and internationally (SAFRON). These errors were scored for severity and frequency. Those with the highest scores were included in the simulations (13 errors total). Observer bias was minimized using a multiple co-correlated distractor approach. Eight physicists reviewed these plans for errors, with each physicist reviewing, on average, 3/6 plans. The confidence interval for the proportion of errors detected was computed using the Wilson score interval. Results: Simulated errors were detected in 65% of reviews [51–75%] (95% confidence interval [CI] in brackets). The following error scenarios had the highest detection rates: incorrect isocenter in DRRs/CBCT (91% [73–98%]) and a planned dose different from the prescribed dose (100% [61–100%]). Errors with low detection rates involved incorrect field parameters in the record and verify system (38% [18–61%]) and incorrect isocenter localization in the planning system (29% [8–64%]). Though pre-treatment QA failure was reliably identified (100%), less than 20% of participants reported the error that caused the failure. Conclusion: This is one of the first quantitative studies of error detection. Although physics plan review is a key safety measure and can identify some errors with high fidelity, other errors are more challenging to detect. These data will guide future work on standardization and automation. Creating new checks or improving existing ones (i.e., via automation) will help in

  12. WE-H-BRC-09: Simulated Errors in Mock Radiotherapy Plans to Quantify the Effectiveness of the Physics Plan Review

    Energy Technology Data Exchange (ETDEWEB)

    Gopan, O; Kalet, A; Smith, W; Hendrickson, K; Kim, M; Young, L; Nyflot, M; Chvetsov, A; Phillips, M; Ford, E [University of Washington, Seattle, WA (United States)

    2016-06-15

    Purpose: A standard tool for ensuring the quality of radiation therapy treatments is the initial physics plan review. However, little is known about its performance in practice. The goal of this study is to measure the effectiveness of physics plan review by introducing simulated errors into “mock” treatment plans and measuring the performance of plan review by physicists. Methods: We generated six mock treatment plans containing multiple errors. These errors were based on incident learning system data both within the department and internationally (SAFRON). These errors were scored for severity and frequency. Those with the highest scores were included in the simulations (13 errors total). Observer bias was minimized using a multiple co-correlated distractor approach. Eight physicists reviewed these plans for errors, with each physicist reviewing, on average, 3/6 plans. The confidence interval for the proportion of errors detected was computed using the Wilson score interval. Results: Simulated errors were detected in 65% of reviews [51–75%] (95% confidence interval [CI] in brackets). The following error scenarios had the highest detection rates: incorrect isocenter in DRRs/CBCT (91% [73–98%]) and a planned dose different from the prescribed dose (100% [61–100%]). Errors with low detection rates involved incorrect field parameters in the record and verify system (38% [18–61%]) and incorrect isocenter localization in the planning system (29% [8–64%]). Though pre-treatment QA failure was reliably identified (100%), less than 20% of participants reported the error that caused the failure. Conclusion: This is one of the first quantitative studies of error detection. Although physics plan review is a key safety measure and can identify some errors with high fidelity, other errors are more challenging to detect. These data will guide future work on standardization and automation. Creating new checks or improving existing ones (i.e., via automation) will help in
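    The Wilson score interval used for the detection rates above can be computed directly; the counts in the example are illustrative, not the study's raw data:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion
    (z = 1.96 for 95% confidence)."""
    phat = successes / n
    denom = 1 + z ** 2 / n
    center = (phat + z ** 2 / (2 * n)) / denom
    half = z * math.sqrt(phat * (1 - phat) / n + z ** 2 / (4 * n ** 2)) / denom
    return center - half, center + half

# e.g. 15 detections in 23 reviews (about 65%)
lo, hi = wilson_interval(15, 23)
```

    Unlike the simpler Wald interval, the Wilson interval stays inside [0, 1] and behaves sensibly for the small review counts involved here, which is why it suits detection rates like 100% [61–100%].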

  13. Planning of general practitioners in the Netherlands: a simulation model.

    NARCIS (Netherlands)

    Greuningen, M. van; Batenburg, R.S.; Velden, L.F.J. van der

    2010-01-01

    Manpower planning can be an important instrument to control shortages (or oversupply) within the health care labour market. The Netherlands is one of the countries that have a relatively long tradition of manpower planning in health care. In 1973 the government introduced the numerus clausus for the

  14. Internet-based system for simulation-based medical planning for cardiovascular disease.

    Science.gov (United States)

    Steele, Brooke N; Draney, Mary T; Ku, Joy P; Taylor, Charles A

    2003-06-01

    Current practice in vascular surgery utilizes only diagnostic and empirical data to plan treatments, which does not enable quantitative a priori prediction of the outcomes of interventions. We have previously described simulation-based medical planning methods to model blood flow in arteries and plan medical treatments based on physiologic models. An important consideration for the design of these patient-specific modeling systems is the accessibility to physicians with modest computational resources. We describe a simulation-based medical planning environment developed for the World Wide Web (WWW) using the Virtual Reality Modeling Language (VRML) and the Java programming language.

  15. Enhancing Student’s Understanding in Entrepreneurship Through Business Plan Simulation

    Directory of Open Access Journals (Sweden)

    Guzairy M.

    2018-01-01

    Full Text Available A business plan is an important document for entrepreneurs to guide them in managing their business. A business plan also helps entrepreneurs strategize their business and manage future growth. That is why the Malaysian government has required all Higher Education Providers to make entrepreneurship education a compulsory course. One of the learning outcomes of entrepreneurship education is that students can write an effective business plan. This study focused on enhancing students' understanding of entrepreneurship through business plan simulation. It also considers which factors most facilitate the business simulation that helps students prepare an effective business plan. The study uses a quantitative approach with a pre- and post-test design: 114 students took part as respondents in the business simulation and answered pre- and post-surveys. The key finding is that student characteristics after playing the simulation contribute most to facilitating business plan learning. The results show that business plan simulation can enhance undergraduate students' understanding of entrepreneurship by having them prepare an effective business plan before opening a new startup.

  16. Comparison of optimization algorithms in intensity-modulated radiation therapy planning

    Science.gov (United States)

    Kendrick, Rachel

    Intensity-modulated radiation therapy is used to better conform the radiation dose to the target while avoiding healthy tissue. Planning programs employ optimization methods to search for the best fluence of each photon beam, and therefore to create the best treatment plan. The Computational Environment for Radiotherapy Research (CERR), a program written in MATLAB, was used to examine some commonly used algorithms for one 5-beam plan. Algorithms included the genetic algorithm, quadratic programming, pattern search, constrained nonlinear optimization, simulated annealing, the optimization method used in Varian Eclipse™, and some hybrids of these. Quadratic programming, simulated annealing, and a quadratic/simulated annealing hybrid were also separately compared using different prescription doses. The results of each dose-volume histogram as well as the visual dose color wash were used to compare the plans. CERR's built-in quadratic programming provided the best overall plan, but its avoidance of the organ-at-risk was rivaled by other programs. Hybrids of quadratic programming with some of these algorithms suggest the possibility of better planning programs, as shown by the improved quadratic/simulated annealing plan compared to the simulated annealing algorithm alone. Further experimentation will be done to improve cost functions and computational time.
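
    Simulated annealing, one of the algorithms compared in this record, explores the beam-weight space by occasionally accepting worse solutions with a temperature-dependent probability. A generic sketch (an illustration, not the CERR/MATLAB implementation; the 5-beam cost function below is hypothetical):

```python
import math
import random

def simulated_annealing(cost, x0, step=0.1, t0=1.0, t_min=1e-3,
                        alpha=0.95, iters_per_t=100, seed=42):
    """Generic simulated annealing minimizer over a real-valued vector."""
    rng = random.Random(seed)
    x, fx = list(x0), cost(x0)
    t = t0
    while t > t_min:
        for _ in range(iters_per_t):
            cand = [xi + rng.uniform(-step, step) for xi in x]
            fc = cost(cand)
            # always accept improvements; accept worse moves with Boltzmann probability
            if fc < fx or rng.random() < math.exp((fx - fc) / t):
                x, fx = cand, fc
        t *= alpha  # geometric cooling schedule
    return x, fx

# hypothetical 5-beam "fluence" cost: hit a prescription of 10 total dose units
# while lightly penalising large individual beam weights
cost = lambda w: (sum(w) - 10.0) ** 2 + 0.01 * sum(wi * wi for wi in w)
weights, best = simulated_annealing(cost, [0.0] * 5)
```

    The acceptance rule is what distinguishes it from greedy local search: at high temperature almost any move is accepted, and the search gradually becomes greedy as the temperature falls.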

  17. Training Community Modeling and Simulation Business Plan: 2008 Edition

    Science.gov (United States)

    2009-12-01

    … and Cyber Constructive Environment – Information Operations System; ASCOT: Airspace Control and Operations Trainer; ASDA: Advanced Seal Delivery System. The ASDA simulates a submarine training system for providing stealthy submerged transportation for insertion into Special …

  18. Specification of Training Simulator Fidelity: A Research Plan

    Science.gov (United States)

    1982-02-01

    … Knowledge -- Dunnette (1976) has recently reviewed the literature in the areas of human skills, abilities, and knowledges. The establishment of what types … management; 6. Other than rational user responses to R&D studies and to training simulators; 7. Deficiencies in training simulator design … proficient at managing the introduction of training innovations by applying those factors that can be controlled to influence acceptance (p. 19) …

  19. Infrared thermal annealing device

    International Nuclear Information System (INIS)

    Gladys, M.J.; Clarke, I.; O'Connor, D.J.

    2003-01-01

    A device for annealing samples within an ultrahigh vacuum (UHV) scanning tunneling microscopy system was designed, constructed, and tested. The device is based on illuminating the sample with infrared radiation from outside the UHV chamber with a tungsten projector bulb. The apparatus uses an elliptical mirror to focus the beam through a sapphire viewport for low absorption. Experiments were conducted on clean Pd(100) and annealing temperatures in excess of 1000 K were easily reached

  20. Lean engineering for planning systems redesign - staff participation by simulation

    NARCIS (Netherlands)

    van der Zee, D.J.; Pool, A.; Wijngaard, J.; Mason, S.J.; Hill, R.R.; Moench, L.; Rose, O.

    2008-01-01

    Lean manufacturing aims at flexible and efficient manufacturing systems by reducing waste in all forms, such as, production of defective parts, excess inventory, unnecessary processing steps, and unnecessary movements of people or materials. Recent research stresses the need to include planning

  1. Prototyping and validating requirements of radiation and nuclear emergency plan simulator

    Energy Technology Data Exchange (ETDEWEB)

    Hamid, AHA., E-mail: amyhamijah@nm.gov.my [Malaysian Nuclear Agency (NM), Bangi, 43000 Kajang, Selangor (Malaysia); Faculty of Computing, Universiti Teknologi Malaysia (UTM), Skudai, 81310 Johor Bahru, Johor (Malaysia); Rozan, MZA.; Ibrahim, R.; Deris, S.; Selamat, A. [Faculty of Computing, Universiti Teknologi Malaysia (UTM), Skudai, 81310 Johor Bahru, Johor (Malaysia)

    2015-04-29

    Organizational incapability produces unrealistic, impractical, inadequate and ambiguous mechanisms in radiological and nuclear emergency preparedness and response (EPR) plans, causing emergency plan disorder and severe disasters. In 65.6% of cases these situations result from poorly defined and unidentified roles and duties of the disaster coordinator. Such unexpected conditions bring huge consequences for first responders, operators, workers, patients and the community at large. Hence, in this report we discuss the prototyping and validation of requirements for Malaysia's radiation and nuclear emergency preparedness and response plan simulation model (EPRM). A prototyping technique was required to formalize the simulation model requirements. Prototyping as systems requirements validation was carried out to endorse the correctness of the model itself against the stakeholders' intentions in resolving this organizational incapability. We made assumptions for the proposed emergency preparedness and response model (EPRM) through the simulation software. Those assumptions provided two expected mechanisms: planning and handling of the respective emergency plan, as well as managing the hazard involved. This model, called the RANEPF (Radiation and Nuclear Emergency Planning Framework) simulator, demonstrates the training emergency response prerequisites rather than the intervention principles alone. The demonstrations involved the determination of the casualties' absorbed dose range screening and the coordination of the capacity planning of the expected trauma triage. Through a user-centred design and sociotechnical approach, the RANEPF simulator was strategized and simplified, though certainly it remains complex.

  2. Prototyping and validating requirements of radiation and nuclear emergency plan simulator

    International Nuclear Information System (INIS)

    Hamid, AHA.; Rozan, MZA.; Ibrahim, R.; Deris, S.; Selamat, A.

    2015-01-01

    Organizational incapability produces unrealistic, impractical, inadequate and ambiguous mechanisms in radiological and nuclear emergency preparedness and response (EPR) plans, causing emergency plan disorder and severe disasters. In 65.6% of cases these situations result from poorly defined and unidentified roles and duties of the disaster coordinator. Such unexpected conditions bring huge consequences for first responders, operators, workers, patients and the community at large. Hence, in this report we discuss the prototyping and validation of requirements for Malaysia's radiation and nuclear emergency preparedness and response plan simulation model (EPRM). A prototyping technique was required to formalize the simulation model requirements. Prototyping as systems requirements validation was carried out to endorse the correctness of the model itself against the stakeholders' intentions in resolving this organizational incapability. We made assumptions for the proposed emergency preparedness and response model (EPRM) through the simulation software. Those assumptions provided two expected mechanisms: planning and handling of the respective emergency plan, as well as managing the hazard involved. This model, called the RANEPF (Radiation and Nuclear Emergency Planning Framework) simulator, demonstrates the training emergency response prerequisites rather than the intervention principles alone. The demonstrations involved the determination of the casualties' absorbed dose range screening and the coordination of the capacity planning of the expected trauma triage. Through a user-centred design and sociotechnical approach, the RANEPF simulator was strategized and simplified, though certainly it remains complex.

  3. Prototyping and validating requirements of radiation and nuclear emergency plan simulator

    Science.gov (United States)

    Hamid, AHA.; Rozan, MZA.; Ibrahim, R.; Deris, S.; Selamat, A.

    2015-04-01

    Organizational incapability produces unrealistic, impractical, inadequate and ambiguous mechanisms in radiological and nuclear emergency preparedness and response (EPR) plans, causing emergency plan disorder and severe disasters. In 65.6% of cases these situations result from poorly defined and unidentified roles and duties of the disaster coordinator. Such unexpected conditions bring huge consequences for first responders, operators, workers, patients and the community at large. Hence, in this report we discuss the prototyping and validation of requirements for Malaysia's radiation and nuclear emergency preparedness and response plan simulation model (EPRM). A prototyping technique was required to formalize the simulation model requirements. Prototyping as systems requirements validation was carried out to endorse the correctness of the model itself against the stakeholders' intentions in resolving this organizational incapability. We made assumptions for the proposed emergency preparedness and response model (EPRM) through the simulation software. Those assumptions provided two expected mechanisms: planning and handling of the respective emergency plan, as well as managing the hazard involved. This model, called the RANEPF (Radiation and Nuclear Emergency Planning Framework) simulator, demonstrates the training emergency response prerequisites rather than the intervention principles alone. The demonstrations involved the determination of the casualties' absorbed dose range screening and the coordination of the capacity planning of the expected trauma triage. Through a user-centred design and sociotechnical approach, the RANEPF simulator was strategized and simplified, though certainly it remains complex.

  4. Simulation to Support Local Search in Trajectory Optimization Planning

    Science.gov (United States)

    Morris, Robert A.; Venable, K. Brent; Lindsey, James

    2012-01-01

    NASA and the international community are investing in the development of a commercial transportation infrastructure that includes the increased use of rotorcraft, specifically helicopters and civil tilt rotors. However, there is significant concern over the impact of noise on the communities surrounding the transportation facilities. One way to address the rotorcraft noise problem is to exploit powerful search techniques from artificial intelligence to design low-noise flight profiles, which can then be evaluated in simulation or through field tests. This paper investigates the use of simulation based on predictive physical models to facilitate the search for low-noise trajectories using a class of automated search algorithms called local search. A novel feature of this approach is the ability to incorporate constraints that address passenger safety and comfort directly into the problem formulation.
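
    The local search family the record refers to can be sketched generically: repeatedly perturb the current trajectory and keep the perturbation only if the simulated cost drops. A minimal sketch under stated assumptions (the waypoint-altitude representation, the noise proxy, and the comfort penalty below are all hypothetical; the paper scores candidates with a physics-based rotorcraft noise simulation):

```python
import random

def local_search(cost, x0, neighbor, iters=5000, seed=0):
    """Plain stochastic local search: move only to lower-cost neighbours."""
    rng = random.Random(seed)
    x, fx = list(x0), cost(x0)
    for _ in range(iters):
        y = neighbor(x, rng)
        fy = cost(y)
        if fy < fx:
            x, fx = y, fy
    return x, fx

# hypothetical toy objective over waypoint altitudes: flying low is "noisy",
# steep altitude changes are "uncomfortable" (a stand-in constraint)
def cost(alts):
    noise = sum(1.0 / (a + 1.0) for a in alts)
    discomfort = sum(max(0.0, abs(alts[i + 1] - alts[i]) - 2.0)
                     for i in range(len(alts) - 1))
    return noise + 10.0 * discomfort

def neighbor(alts, rng):
    y = list(alts)
    i = rng.randrange(len(y))
    y[i] = min(10.0, max(0.0, y[i] + rng.uniform(-0.5, 0.5)))
    return y

profile, best = local_search(cost, [0.0] * 6, neighbor)
```

    Folding the comfort limit into the cost as a penalty term mirrors the paper's point about incorporating safety and comfort constraints directly into the problem formulation.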

  5. Advanced Simulation and Computing FY17 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, Michel [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, Bill [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Hendrickson, Bruce [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Wade, Doug [National Nuclear Security Administration (NNSA), Washington, DC (United States). Office of Advanced Simulation and Computing and Institutional Research and Development; Hoang, Thuc [National Nuclear Security Administration (NNSA), Washington, DC (United States). Computational Systems and Software Environment

    2016-08-29

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources that support annual stockpile assessment and certification, study advanced nuclear weapons design and manufacturing processes, analyze accident scenarios and weapons aging, and provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balance of resources, including technical staff, hardware, simulation software, and computer science solutions. ASC is now focused on increasing predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (sufficient resolution, dimensionality, and scientific details), and quantifying critical margins and uncertainties. Resolving each issue requires increasingly difficult analyses because the aging process has progressively moved the stockpile further away from the original test base. Where possible, the program also enables the use of high performance computing (HPC) and simulation tools to address broader national security needs, such as foreign nuclear weapon assessments and counter nuclear terrorism.

  6. BRUS2. An energy system simulator for long term planning

    DEFF Research Database (Denmark)

    Skytte, K.; Skjerk Christensen, P.

    1999-01-01

    BRUS2 is a technical-economic bottom-up scenario model. The objective of BRUS2 is to provide decision-makers with information on consequences of given trends of parameters of society like population growth and productivity, and of political goals, e.g., energy saving initiatives. BRUS2 simulates ...

  7. Energy efficient process planning based on numerical simulations

    OpenAIRE

    Neugebauer, Reimund; Hochmuth, C.; Schmidt, G.; Dix, M.

    2011-01-01

    The main goal of energy-efficient manufacturing is to generate products with maximum value-added at minimum energy consumption. To this end, in metal cutting processes, it is necessary to reduce the specific cutting energy while, at the same time, precision requirements have to be ensured. Precision is critical in metal cutting processes because they often constitute the final stages of metalworking chains. This paper presents a method for the planning of energy-efficient machining processes ...

  8. Combined Log Inventory and Process Simulation Models for the Planning and Control of Sawmill Operations

    Science.gov (United States)

    Guillermo A. Mendoza; Roger J. Meimban; Philip A. Araman; William G. Luppold

    1991-01-01

    A log inventory model and a real-time hardwood process simulation model were developed and combined into an integrated production planning and control system for hardwood sawmills. The log inventory model was designed to monitor and periodically update the status of the logs in the log yard. The process simulation model was designed to estimate various sawmill...

  9. Simulation model for planning metallurgical treatment of large-size billets

    International Nuclear Information System (INIS)

    Timofeev, M.A.; Echeistova, L.A.; Kuznetsov, V.G.; Semakin, S.V.; Krivonogov, A.B.

    1989-01-01

    The computerized simulation system ''Ritm'' for planning metallurgical treatment of billets is developed. Three principles, specifying the organization structure of the treatment cycle are formulated as follows: a cycling principle, a priority principle and a principle of group treatment. The ''Ritm'' software consists of three independent operating systems: preparation of source data, simulation, data output

  10. The use of discrete-event simulation modelling to improve radiation therapy planning processes.

    Science.gov (United States)

    Werker, Greg; Sauré, Antoine; French, John; Shechter, Steven

    2009-07-01

    The planning portion of the radiation therapy treatment process at the British Columbia Cancer Agency is efficient but nevertheless contains room for improvement. The purpose of this study is to show how a discrete-event simulation (DES) model can be used to represent this complex process and to suggest improvements that may reduce the planning time and ultimately reduce overall waiting times. A simulation model of the radiation therapy (RT) planning process was constructed using the Arena simulation software, representing the complexities of the system. Several types of inputs feed into the model; these inputs come from historical data, a staff survey, and interviews with planners. The simulation model was validated against historical data and then used to test various scenarios to identify and quantify potential improvements to the RT planning process. Simulation modelling is an attractive tool for describing complex systems, and can be used to identify improvements to the processes involved. It is possible to use this technique in the area of radiation therapy planning with the intent of reducing process times and subsequent delays for patient treatment. In this particular system, reducing the variability and length of oncologist-related delays contributes most to improving the planning time.
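
    The discrete-event approach described in this record can be illustrated far more simply than the full Arena model: keep a priority queue of timestamped events and process them in order. A toy single-planner queue (the interarrival and service distributions are assumptions for illustration; the study's inputs came from historical data, a staff survey, and interviews):

```python
import heapq
import random

def simulate_planning(n_patients, mean_interarrival=6.0, mean_service=5.0, seed=1):
    """Single-planner queue as a discrete-event simulation on a heapq.
    Returns the mean time a plan spends in the system (same time units)."""
    rng = random.Random(seed)
    events = []  # (time, tie_breaker, kind, patient_id)
    t, seq = 0.0, 0
    for i in range(n_patients):
        t += rng.expovariate(1.0 / mean_interarrival)
        heapq.heappush(events, (t, seq, "arrive", i))
        seq += 1
    arrival, waiting = {}, []
    server_busy, total, done = False, 0.0, 0
    while events:
        time, _, kind, pid = heapq.heappop(events)
        if kind == "arrive":
            arrival[pid] = time
            if server_busy:
                waiting.append(pid)  # planner occupied: join the queue
            else:
                server_busy = True
                heapq.heappush(events, (time + rng.expovariate(1.0 / mean_service),
                                        seq, "depart", pid))
                seq += 1
        else:  # departure: record time in system, start next queued job
            total += time - arrival[pid]
            done += 1
            if waiting:
                nxt = waiting.pop(0)
                heapq.heappush(events, (time + rng.expovariate(1.0 / mean_service),
                                        seq, "depart", nxt))
                seq += 1
            else:
                server_busy = False
    return total / done

mean_time = simulate_planning(2000)
```

    Even this toy model reproduces the study's qualitative point: with utilization near capacity, reducing the variability of a single delay-prone step (here, service time) shortens the overall time in system disproportionately.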

  11. Strategic planning for skills and simulation labs in colleges of nursing.

    Science.gov (United States)

    Gantt, Laura T

    2010-01-01

    While simulation laboratories for clinical nursing education are predicted to grow, budget cuts may threaten these programs. One of the ways to develop a new lab, as well as to keep an existing one on track, is to develop and regularly update a strategic plan. The process of planning not only helps keep the lab faculty and staff apprised of the challenges to be faced, but it also helps to keep senior level management engaged by reason of the need for their input and approval of the plan. The strategic planning documents drafted by those who supervised the development of the new building and Concepts Integration Labs (CILs) helped guide and orient faculty and other personnel hired to implement the plan and fulfill the vision. As the CILs strategic plan was formalized, the draft plans, including the SWOT analysis, were reviewed to provide historical perspective, stimulate discussion, and to make sure old or potential mistakes were not repeated.

  12. Training Community Modeling and Simulation Business Plan: 2009 Edition

    Science.gov (United States)

    2010-04-01


  13. Simulation-based planning of surgical interventions in pediatric cardiology

    Science.gov (United States)

    Marsden, Alison

    2012-11-01

    Hemodynamics plays an essential role in the progression and treatment of cardiovascular disease. This is particularly true in pediatric cardiology, due to the wide variation in anatomy observed in congenital heart disease patients. While medical imaging provides increasingly detailed anatomical information, clinicians currently have limited knowledge of important fluid mechanical parameters. Treatment decisions are therefore often made using anatomical information alone, despite the known links between fluid mechanics and disease progression. Patient-specific simulations now offer the means to provide this missing information, and, more importantly, to perform in-silico testing of new surgical designs at no risk to the patient. In this talk, we will outline the current state of the art in methods for cardiovascular blood flow simulation and virtual surgery. We will then present new methodology for coupling optimization with simulation and uncertainty quantification to customize treatments for individual patients. Finally, we will present examples in pediatric cardiology that illustrate the potential impact of these tools in the clinical setting.

  14. Multi-institutional comparison of simulated treatment delivery errors in ssIMRT, manually planned VMAT and autoplan-VMAT plans for nasopharyngeal radiotherapy

    DEFF Research Database (Denmark)

    Pogson, Elise M; Aruguman, Sankar; Hansen, Christian R

    2017-01-01

    PURPOSE: To quantify the impact of simulated errors for nasopharynx radiotherapy across multiple institutions and planning techniques (auto-plan generated Volumetric Modulated Arc Therapy (ap-VMAT), manually planned VMAT (mp-VMAT) and manually planned step and shoot Intensity Modulated Radiation Therapy (mp-ssIMRT)). METHODS: Ten patients were retrospectively planned with VMAT according to three institutions' protocols. Within one institution, two further treatment plans were generated using differing treatment planning techniques. This resulted in mp-ssIMRT, mp-VMAT, and ap-VMAT plans. Introduced

  15. GPU accelerated population annealing algorithm

    Science.gov (United States)

    Barash, Lev Yu.; Weigel, Martin; Borovský, Michal; Janke, Wolfhard; Shchur, Lev N.

    2017-11-01

    Population annealing is a promising recent approach for Monte Carlo simulations in statistical physics, in particular for the simulation of systems with complex free-energy landscapes. It is a hybrid method, combining importance sampling through Markov chains with elements of sequential Monte Carlo in the form of population control. While it appears to provide algorithmic capabilities for the simulation of such systems that are roughly comparable to those of more established approaches such as parallel tempering, it is intrinsically much more suitable for massively parallel computing. Here, we tap into this structural advantage and present a highly optimized implementation of the population annealing algorithm on GPUs that promises speed-ups of several orders of magnitude as compared to a serial implementation on CPUs. While the sample code is for simulations of the 2D ferromagnetic Ising model, it should be easily adapted for simulations of other spin models, including disordered systems. Our code includes implementations of some advanced algorithmic features that have only recently been suggested, namely the automatic adaptation of temperature steps and a multi-histogram analysis of the data at different temperatures. Program Files doi:http://dx.doi.org/10.17632/sgzt4b7b3m.1 Licensing provisions: Creative Commons Attribution license (CC BY 4.0) Programming language: C, CUDA External routines/libraries: NVIDIA CUDA Toolkit 6.5 or newer Nature of problem: The program calculates the internal energy, specific heat, several magnetization moments, entropy and free energy of the 2D Ising model on square lattices of edge length L with periodic boundary conditions as a function of inverse temperature β. Solution method: The code uses population annealing, a hybrid method combining Markov chain updates with population control. The code is implemented for NVIDIA GPUs using the CUDA language and employs advanced techniques such as multi-spin coding, adaptive temperature
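
    The hybrid structure the record describes, Markov-chain updates plus sequential-Monte-Carlo population control, fits in a few lines of serial code. A toy sketch on a 10-spin 1D Ising ring (a deliberately small stand-in; the record's program is a CUDA implementation for the 2D model, and the population size, schedule and sweep counts below are illustrative assumptions):

```python
import math
import random

def population_annealing(energy, propose, init, betas, pop_size=200, sweeps=10, seed=0):
    """Toy population annealing: at each inverse-temperature step, resample
    replicas by Boltzmann reweighting, then equilibrate each replica with
    single-spin Metropolis updates."""
    rng = random.Random(seed)
    pop = [init(rng) for _ in range(pop_size)]
    beta_prev = 0.0
    for beta in betas:
        # population control: w_i proportional to exp(-(beta - beta_prev) * E_i)
        ws = [math.exp(-(beta - beta_prev) * energy(x)) for x in pop]
        pop = [list(x) for x in rng.choices(pop, weights=ws, k=pop_size)]
        # Markov-chain equilibration at the new inverse temperature
        for x in pop:
            for _ in range(sweeps * len(x)):
                y = propose(x, rng)
                d_e = energy(y) - energy(x)
                if d_e <= 0 or rng.random() < math.exp(-beta * d_e):
                    x[:] = y
        beta_prev = beta
    return pop

# tiny 1D Ising ring with periodic boundary: ground-state energy is -L
L = 10
energy = lambda s: -sum(s[i] * s[(i + 1) % L] for i in range(L))

def propose(s, rng):
    t = list(s)
    i = rng.randrange(len(t))
    t[i] = -t[i]  # single spin flip
    return t

init = lambda rng: [rng.choice([-1, 1]) for _ in range(L)]
pop = population_annealing(energy, propose, init, betas=[0.2 * k for k in range(1, 11)])
mean_E = sum(energy(x) for x in pop) / len(pop)
```

    Because each replica's Metropolis updates are independent between resampling steps, the inner loop parallelizes trivially, which is the structural advantage the authors exploit on GPUs.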

  16. Irradiation test plan of the simulated DUPIC fuel

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Ki Kwang; Yang, M. S.; Kim, B. K. [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-11-01

    Simulated DUPIC fuel was irradiated from Aug. 4, 1999 to Oct. 4, 1999 in order to produce data on its in-core behavior, to verify the design of the non-instrumented DUPIC capsule developed, and to confirm the irradiation requirements of DUPIC fuel at HANARO. The welding process was certified for manufacturing the mini-element, and simulated DUPIC fuel rods were manufactured with simulated DUPIC pellets through examination and test. The non-instrumented capsule for an irradiation test of DUPIC fuel was designed and manufactured with reference to the design specification of the HANARO fuel; it is to be the design basis of the instrumented capsule under consideration. A verification experiment, checking whether the capsule loaded in the OR4 hole meets the HANARO requirements under normal operating conditions, was carried out along with a structural analysis. The items for this experiment included the pressure drop test, vibration test, integrity test, etc. Each experimental result met the HANARO operational requirements. For the safety analysis of the DUPIC non-instrumented capsule loaded in the HANARO core, the nuclear/mechanical compatibility, thermodynamic compatibility, and integrity of the irradiation samples according to reactor conditions, as well as the safety of the HANARO, were analyzed. In addition, the core reactivity effects during the irradiation test of the DUPIC capsule were discussed. The average power of each fuel rod in the DUPIC capsule was calculated, and the maximum linear power reflecting the axial peaking power factor from the MCNP results was evaluated. From these calculation results, the HANARO core safety was evaluated. At the end of this report, similar overseas cases are introduced. 9 refs., 16 figs., 10 tabs. (Author)

  17. Basic considerations in simulated treatment planning for the Stanford Medical Pion Generator (SMPG)

    International Nuclear Information System (INIS)

    Pistenma, D.A.; Li, G.C.; Bagshaw, M.A.

    1977-01-01

    Recent interest in charged heavy particle irradiation is based upon expected improved local tumor control rates because of the greater precision in dose localization and the increased biological effectiveness of the high linear energy transfer ionization of particle beams in their stopping regions (Bragg peaks). A novel 60-beam cylindrical geometry pion spectrometer designed for a hospital-based pion therapy facility has been constructed at Stanford. In conjunction with the development and testing of the SMPG, a program of simulated treatment planning is being conducted. This paper presents basic considerations in treatment planning for pions and other charged heavy particles. It also presents the status of simulated treatment planning calculations for the SMPG, including a discussion of the principles of irradiating hypothetical tumor volumes, illustrated by examples of simplified treatment plans incorporating tissue-density inhomogeneity corrections. Also presented are considerations for realistic simulated treatment planning calculations using computerized tomographic scan cross sections of actual patients, and a conceptual plan for an integrated treatment planning and patient treatment system for the SMPG

  18. Quantum annealing for combinatorial clustering

    Science.gov (United States)

    Kumar, Vaibhaw; Bass, Gideon; Tomlin, Casey; Dulny, Joseph

    2018-02-01

    Clustering is a powerful machine learning technique that groups "similar" data points based on their characteristics. Many clustering algorithms work by approximating the minimization of an objective function, namely the sum of within-the-cluster distances between points. The straightforward approach involves examining all the possible assignments of points to each of the clusters. This approach guarantees the solution will be a global minimum; however, the number of possible assignments scales quickly with the number of data points and becomes computationally intractable even for very small datasets. In order to circumvent this issue, cost function minima are found using popular local search-based heuristic approaches such as k-means and hierarchical clustering. Due to their greedy nature, such techniques do not guarantee that a global minimum will be found and can lead to sub-optimal clustering assignments. Other classes of global search-based techniques, such as simulated annealing, tabu search, and genetic algorithms, may offer better quality results but can be too time-consuming to implement. In this work, we describe how quantum annealing can be used to carry out clustering. We map the clustering objective to a quadratic binary optimization problem and discuss two clustering algorithms which are then implemented on commercially available quantum annealing hardware, as well as on a purely classical solver "qbsolv." The first algorithm assigns N data points to K clusters, and the second one can be used to perform binary clustering in a hierarchical manner. We present our results in the form of benchmarks against well-known k-means clustering and discuss the advantages and disadvantages of the proposed techniques.
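
    The record's mapping from clustering to a quadratic binary optimization problem can be made concrete for the 2-cluster case: with x_i = 1 meaning "point i in cluster A", the within-cluster distance sum expands, up to a constant, into linear and pairwise QUBO coefficients. A sketch with a brute-force solver standing in for the annealing hardware (the points and the exhaustive minimizer are illustrative assumptions; a real run would hand Q to a quantum annealer or qbsolv):

```python
import itertools

def clustering_qubo(points):
    """Build a QUBO for 2-cluster assignment from squared Euclidean distances."""
    n = len(points)
    d = [[sum((a - b) ** 2 for a, b in zip(points[i], points[j]))
          for j in range(n)] for i in range(n)]
    Q = {}
    for i in range(n):
        for j in range(i + 1, n):
            # d_ij * (x_i x_j + (1 - x_i)(1 - x_j)) expands to
            # d_ij * (2 x_i x_j - x_i - x_j) plus a constant
            Q[(i, j)] = 2 * d[i][j]
            Q[(i, i)] = Q.get((i, i), 0) - d[i][j]
            Q[(j, j)] = Q.get((j, j), 0) - d[i][j]
    return Q

def brute_force_qubo(Q, n):
    """Exhaustive classical minimizer, standing in for the annealer."""
    best_x, best_e = None, float("inf")
    for bits in itertools.product([0, 1], repeat=n):
        e = sum(v * bits[i] * bits[j] for (i, j), v in Q.items())
        if e < best_e:
            best_x, best_e = bits, e
    return best_x

points = [(0, 0), (0, 1), (5, 5), (5, 6)]
x = brute_force_qubo(clustering_qubo(points), len(points))
```

    The exhaustive loop makes the record's intractability point explicit: it visits 2^N assignments, which is exactly the scaling that motivates handing Q to annealing hardware instead.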

  19. Clinical treatment planning for stereotactic radiotherapy, evaluation by Monte Carlo simulation

    International Nuclear Information System (INIS)

    Kairn, T.; Aland, T.; Kenny, J.; Knight, R.T.; Crowe, S.B.; Langton, C.M.; Franich, R.D.; Johnston, P.N.

    2010-01-01

Full text: This study re-evaluates the doses delivered by a series of clinical stereotactic radiotherapy treatments, to test the accuracy of treatment planning predictions for very small radiation fields. Stereotactic radiotherapy treatment plans for meningiomas near the petrous temporal bone and the foramen magnum (incorporating fields smaller than 1 cm2) were examined using Monte Carlo simulations. Important differences between treatment planning predictions and Monte Carlo calculations of doses delivered to stereotactic radiotherapy patients are apparent. For example, in one case the Monte Carlo calculation shows that delivery of a planned meningioma treatment would spare the patient's critical structures (eyes, brainstem) more effectively than the treatment plan predicted, and therefore suggests that this patient could safely receive an increased dose to their tumour. Monte Carlo simulations can be used to test the dose predictions made by a conventional treatment planning system for dosimetrically challenging small fields, and can thereby suggest valuable modifications to clinical treatment plans. This research was funded by the Wesley Research Institute, Australia. The authors wish to thank Andrew Fielding and David Schlect for valuable discussions of aspects of this work. The authors are also grateful to Muhammad Kakakhel, for assisting with the design and calibration of our linear accelerator model, and to the stereotactic radiation therapy team at Premion, who designed the treatment plans. Computational resources and services used in this work were provided by the HPC and Research Support Unit, QUT, Brisbane, Australia. (author)

  20. A hybrid simulation approach for integrating safety behavior into construction planning: An earthmoving case study.

    Science.gov (United States)

    Goh, Yang Miang; Askar Ali, Mohamed Jawad

    2016-08-01

    One of the key challenges in improving construction safety and health is the management of safety behavior. From a system point of view, workers work unsafely due to system level issues such as poor safety culture, excessive production pressure, inadequate allocation of resources and time and lack of training. These systemic issues should be eradicated or minimized during planning. However, there is a lack of detailed planning tools to help managers assess the impact of their upstream decisions on worker safety behavior. Even though simulation had been used in construction planning, the review conducted in this study showed that construction safety management research had not been exploiting the potential of simulation techniques. Thus, a hybrid simulation framework is proposed to facilitate integration of safety management considerations into construction activity simulation. The hybrid framework consists of discrete event simulation (DES) as the core, but heterogeneous, interactive and intelligent (able to make decisions) agents replace traditional entities and resources. In addition, some of the cognitive processes and physiological aspects of agents are captured using system dynamics (SD) approach. The combination of DES, agent-based simulation (ABS) and SD allows a more "natural" representation of the complex dynamics in construction activities. The proposed hybrid framework was demonstrated using a hypothetical case study. In addition, due to the lack of application of factorial experiment approach in safety management simulation, the case study demonstrated sensitivity analysis and factorial experiment to guide future research. Copyright © 2015 Elsevier Ltd. All rights reserved.
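The combination described above can be sketched in miniature: a discrete event loop drives the schedule, a worker agent decides between safe and unsafe behavior, and a crude feedback variable plays the role of the system-dynamics state. This is not the authors' framework; the behavioral rule, the probabilities, and the pressure formula below are invented purely for illustration:

```python
import heapq
import random

class Worker:
    """Agent whose chance of acting safely erodes as schedule pressure rises."""
    def __init__(self, rng):
        self.rng = rng

    def acts_safely(self, pressure):
        # Invented behavioral rule: 95% baseline compliance, reduced by
        # pressure, floored at 20%.
        return self.rng.random() < max(0.2, 0.95 - 0.5 * pressure)

def simulate(n_tasks=200, deadline=150.0, seed=1):
    rng = random.Random(seed)
    worker = Worker(rng)
    clock, done, incidents = 0.0, 0, 0
    events = [(0.0, 0)]                       # event queue: (time, task id)
    while events:
        clock, _task = heapq.heappop(events)
        # Crude system-dynamics-style state: pressure grows when much work
        # remains relative to the time left before the deadline.
        remaining = n_tasks - done
        pressure = min(1.0, remaining / max(deadline - clock, 1e-9) / 2.0)
        safe = worker.acts_safely(pressure)
        duration = 1.0 if safe else 0.7       # unsafe shortcuts are faster...
        if not safe and rng.random() < 0.05:  # ...but carry incident risk
            incidents += 1
        done += 1
        if done < n_tasks:
            heapq.heappush(events, (clock + duration, done))
    # Returns the start time of the final task and the incident count.
    return clock, incidents
```

Running such a model under different deadlines or compliance baselines is the kind of factorial experiment the paper recommends: tightening the schedule speeds completion but raises the expected incident count, which is exactly the upstream trade-off planners need to see.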

  1. Simulation-optimization model for production planning in the blood supply chain.

    Science.gov (United States)

    Osorio, Andres F; Brailsford, Sally C; Smith, Honora K; Forero-Matiz, Sonia P; Camacho-Rodríguez, Bernardo A

    2017-12-01

Production planning in the blood supply chain is a challenging task. Many complex factors, such as uncertain supply and demand, blood group proportions, shelf life constraints and different collection and production methods, have to be taken into account, and thus advanced methodologies are required for decision making. This paper presents an integrated simulation-optimization model to support both strategic and operational decisions in production planning. Discrete-event simulation is used to represent the flows through the supply chain, incorporating collection, production, storage and distribution. In addition, an integer linear optimization model running over a rolling planning horizon is used to support daily decisions, such as the required number of donors, collection methods and production planning. This approach is evaluated using real data from a blood center in Colombia. The results show that, using the proposed model, key indicators such as shortages, outdated units, donors required and cost are improved.
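A heavily simplified stand-in for this approach shows how a daily collection decision interacts with shelf life, shortages and outdates. The demand range, target stock level and order-up-to rule below are invented for illustration; the paper's actual model replaces the heuristic with an integer linear program solved over a rolling horizon:

```python
import random

def simulate_blood_bank(days=365, shelf_life=42, target=60, seed=11):
    """Daily order-up-to heuristic for a perishable blood inventory."""
    rng = random.Random(seed)
    inventory = []                     # remaining shelf life (days) per unit
    shortages = outdates = collected = 0
    for _ in range(days):
        # Daily decision: collect enough donations to restore `target` stock
        # (an order-up-to rule standing in for the rolling-horizon ILP).
        need = max(0, target - len(inventory))
        inventory.extend([shelf_life] * need)
        collected += need
        # Issue the oldest units first against stochastic daily demand.
        demand = rng.randint(5, 25)
        inventory.sort()               # ascending remaining life = oldest first
        issued = min(demand, len(inventory))
        shortages += demand - issued
        inventory = inventory[issued:]
        # Age the remaining stock; units reaching zero shelf life are outdated.
        inventory = [t - 1 for t in inventory]
        outdates += sum(1 for t in inventory if t == 0)
        inventory = [t for t in inventory if t > 0]
    return collected, shortages, outdates
```

Sweeping `target` in such a loop exposes the core tension the paper optimizes: too little stock produces shortages, too much produces outdated units, and the collection effort (donors required) scales with both.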

  2. A SIMULATION-AS-A-SERVICE FRAMEWORK FACILITATING WEBGIS BASED INSTALLATION PLANNING

    Directory of Open Access Journals (Sweden)

    Z. Zheng

    2017-09-01

Full Text Available Installation planning is constrained by both natural and social conditions, especially for spatially sparse but functionally connected facilities. Simulation is important for the proper deployment in space, and configuration in function, of facilities, so that they form a cohesive and supportive system that meets users' operational needs. Based on requirement analysis, we propose a framework combining GIS and agent simulation to overcome the shortcomings of traditional GIS in temporal analysis and task simulation. In this framework, agent-based simulation runs as a service on the server and exposes basic simulation functions, such as scenario configuration, simulation control, and simulation data retrieval, to installation planners. At the same time, the simulation service is able to utilize various kinds of geoprocessing services in the agents' process logic to make sophisticated spatial inferences and analyses. This simulation-as-a-service framework has many potential benefits: it is easy to use and available on demand, promotes shared understanding, and boosts performance. Finally, we present a preliminary implementation of this concept using the ArcGIS JavaScript API 4.0 and ArcGIS for Server, showing how trip planning and driving can be carried out by agents.

  3. Protocol for quality control of scanners used in the simulation of radiotherapy treatment planning

    International Nuclear Information System (INIS)

    Yanes, Yaima; Alfonso, Rodolfo; Silvestre, Ileana

    2009-01-01

Computed tomography (CT) has become the fundamental imaging tool of modern radiation therapy, used to locate targets and critical organs and to plan dose. Scanners used for these purposes require a strict quality assurance program, which differs in many respects from the monitoring required for purely diagnostic use. The aim of this work was the design and validation of a quality control protocol applicable to any CT scanner used for radiotherapy simulation and planning. (author)

  4. Population annealing: Theory and application in spin glasses

    OpenAIRE

    Wang, Wenlong; Machta, Jonathan; Katzgraber, Helmut G.

    2015-01-01

    Population annealing is an efficient sequential Monte Carlo algorithm for simulating equilibrium states of systems with rough free energy landscapes. The theory of population annealing is presented, and systematic and statistical errors are discussed. The behavior of the algorithm is studied in the context of large-scale simulations of the three-dimensional Ising spin glass and the performance of the algorithm is compared to parallel tempering. It is found that the two algorithms are similar ...
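A toy version of the algorithm shows its two ingredients: resampling replicas in proportion to their Boltzmann weight as the inverse temperature is raised, then equilibrating each replica with Metropolis sweeps. This is our own minimal sketch for a 1D Ising ring (population size, schedule and sweep count chosen arbitrarily), not the authors' large-scale 3D spin-glass code:

```python
import math
import random

def ising_energy(spins):
    # 1D Ising ring with J = 1: E = -sum_i s_i * s_{i+1} (periodic).
    n = len(spins)
    return -sum(spins[i] * spins[(i + 1) % n] for i in range(n))

def _resample_index(x, weights):
    # Pick the index whose cumulative weight first exceeds x.
    for i, w in enumerate(weights):
        x -= w
        if x <= 0:
            return i
    return len(weights) - 1

def population_annealing(n=16, pop=200, n_steps=10, beta_max=1.0,
                         sweeps=5, seed=3):
    rng = random.Random(seed)
    betas = [beta_max * k / n_steps for k in range(n_steps + 1)]
    # Initialize the population at infinite temperature (random spins).
    reps = [[rng.choice((-1, 1)) for _ in range(n)] for _ in range(pop)]
    for b_old, b_new in zip(betas, betas[1:]):
        # Resample replicas with Boltzmann weights exp(-(b_new - b_old) * E),
        # keeping the population size fixed.
        w = [math.exp(-(b_new - b_old) * ising_energy(r)) for r in reps]
        tot = sum(w)
        reps = [reps[_resample_index(rng.random() * tot, w)][:]
                for _ in range(pop)]
        # Equilibrate every replica at b_new with single-spin Metropolis moves.
        for r in reps:
            for _ in range(sweeps * n):
                i = rng.randrange(n)
                dE = 2 * r[i] * (r[i - 1] + r[(i + 1) % n])
                if dE <= 0 or rng.random() < math.exp(-b_new * dE):
                    r[i] = -r[i]
    return reps
```

The resampling step is what distinguishes population annealing from independent simulated annealing runs: replicas that have found low-energy regions of a rough landscape are duplicated, while high-energy replicas are culled, and the sum of weights along the schedule also yields a free energy estimate.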

  5. Solving complex maintenance planning optimization problems using stochastic simulation and multi-criteria fuzzy decision making

    International Nuclear Information System (INIS)

    Tahvili, Sahar; Österberg, Jonas; Silvestrov, Sergei; Biteus, Jonas

    2014-01-01

One of the most important factors in the operations of many corporations today is maximizing profit, and one important tool to that end is the optimization of maintenance activities. Maintenance activities are, at the highest level, divided into two major areas: corrective maintenance (CM) and preventive maintenance (PM). When optimizing maintenance activities through a maintenance plan or policy, we seek to find the best activities to perform at each point in time, be it PM or CM. We explore the use of stochastic simulation, genetic algorithms and other tools for solving complex maintenance planning optimization problems in terms of a suggested framework model based on discrete event simulation.

  6. Solving complex maintenance planning optimization problems using stochastic simulation and multi-criteria fuzzy decision making

    Energy Technology Data Exchange (ETDEWEB)

    Tahvili, Sahar [Mälardalen University (Sweden); Österberg, Jonas; Silvestrov, Sergei [Division of Applied Mathematics, Mälardalen University (Sweden); Biteus, Jonas [Scania CV (Sweden)

    2014-12-10

One of the most important factors in the operations of many corporations today is maximizing profit, and one important tool to that end is the optimization of maintenance activities. Maintenance activities are, at the highest level, divided into two major areas: corrective maintenance (CM) and preventive maintenance (PM). When optimizing maintenance activities through a maintenance plan or policy, we seek to find the best activities to perform at each point in time, be it PM or CM. We explore the use of stochastic simulation, genetic algorithms and other tools for solving complex maintenance planning optimization problems in terms of a suggested framework model based on discrete event simulation.
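A minimal stochastic simulation in the spirit of the PM-versus-CM trade-off can compare maintenance policies by Monte Carlo estimation of the long-run cost rate. The Weibull wear-out model, cost figures and age-replacement rule below are our own assumptions for illustration, not the authors' framework:

```python
import random

def simulate_policy(pm_interval, horizon=10000.0, char_life=120.0,
                    cm_cost=50.0, pm_cost=5.0, runs=200, seed=7):
    """Monte Carlo estimate of the long-run cost rate of age replacement:
    replace preventively at pm_interval, or correctively on failure.
    Lifetimes are Weibull(shape=2, scale=char_life), i.e. wear-out."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(runs):
        t = cost = 0.0
        while t < horizon:
            # random.weibullvariate(alpha, beta): alpha = scale, beta = shape.
            life = char_life * rng.weibullvariate(1.0, 2.0)
            if life < pm_interval:
                t += life
                cost += cm_cost          # corrective maintenance on failure
            else:
                t += pm_interval
                cost += pm_cost          # cheaper planned replacement
        total += cost / t
    return total / runs

if __name__ == "__main__":
    for interval in (30.0, 60.0, 120.0, float("inf")):
        print(interval, round(simulate_policy(interval), 4))
```

With cheap PM and wear-out (increasing-hazard) lifetimes, an intermediate PM interval minimizes the cost rate, whereas with memoryless exponential lifetimes PM would bring no benefit at all; exposing such structure is precisely what simulation-based maintenance planning frameworks are built for.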

  7. An Interprofessional Approach to Continuing Education With Mass Casualty Simulation: Planning and Execution.

    Science.gov (United States)

    Saber, Deborah A; Strout, Kelley; Caruso, Lisa Swanson; Ingwell-Spolan, Charlene; Koplovsky, Aiden

    2017-10-01

    Many natural and man-made disasters require the assistance from teams of health care professionals. Knowing that continuing education about disaster simulation training is essential to nursing students, nurses, and emergency first responders (e.g., emergency medical technicians, firefighters, police officers), a university in the northeastern United States planned and implemented an interprofessional mass casualty incident (MCI) disaster simulation using the Project Management Body of Knowledge (PMBOK) management framework. The school of nursing and University Volunteer Ambulance Corps (UVAC) worked together to simulate a bus crash with disaster victim actors to provide continued education for community first responders and train nursing students on the MCI process. This article explains the simulation activity, planning process, and achieved outcomes. J Contin Educ Nurs. 2017;48(10):447-453. Copyright 2017, SLACK Incorporated.

  8. The potential impact of urban growth simulation on the long-term planning of our cities

    CSIR Research Space (South Africa)

    Waldeck, L

    2012-10-01

Full Text Available: Conference presentation on the potential impact of urban growth simulation on the long-term planning of our cities, presented by Dr Louis Waldeck at the 4th Biennial Conference on 10 October 2012. The opening slides motivate urban growth simulation by the need to reduce carbon footprints and resource consumption, unabated urbanisation, the quest for sustainable development, and the question of what makes a city sustainable, quoting Maurice Strong: with most of the population concentrated in cities and the opportunities to gain efficiencies, cities are the most important arena for intervention.

  9. Development of a Searchable Database of Cryoablation Simulations for Use in Treatment Planning.

    Science.gov (United States)

    Boas, F Edward; Srimathveeravalli, Govindarajan; Durack, Jeremy C; Kaye, Elena A; Erinjeri, Joseph P; Ziv, Etay; Maybody, Majid; Yarmohammadi, Hooman; Solomon, Stephen B

    2017-05-01

    To create and validate a planning tool for multiple-probe cryoablation, using simulations of ice ball size and shape for various ablation probe configurations, ablation times, and types of tissue ablated. Ice ball size and shape was simulated using the Pennes bioheat equation. Five thousand six hundred and seventy different cryoablation procedures were simulated, using 1-6 cryoablation probes and 1-2 cm spacing between probes. The resulting ice ball was measured along three perpendicular axes and recorded in a database. Simulated ice ball sizes were compared to gel experiments (26 measurements) and clinical cryoablation cases (42 measurements). The clinical cryoablation measurements were obtained from a HIPAA-compliant retrospective review of kidney and liver cryoablation procedures between January 2015 and February 2016. Finally, we created a web-based cryoablation planning tool, which uses the cryoablation simulation database to look up the probe spacing and ablation time that produces the desired ice ball shape and dimensions. Average absolute error between the simulated and experimentally measured ice balls was 1 mm in gel experiments and 4 mm in clinical cryoablation cases. The simulations accurately predicted the degree of synergy in multiple-probe ablations. The cryoablation simulation database covers a wide range of ice ball sizes and shapes up to 9.8 cm. Cryoablation simulations accurately predict the ice ball size in multiple-probe ablations. The cryoablation database can be used to plan ablation procedures: given the desired ice ball size and shape, it will find the number and type of probes, probe configuration and spacing, and ablation time required.
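The Pennes bioheat equation underlying these simulations balances conduction, blood perfusion and metabolic heating. The standard form is given below with conventional symbol names; the authors' boundary conditions and tissue parameters are not specified in the abstract:

```latex
\rho c \,\frac{\partial T}{\partial t}
  = \nabla \cdot \left( k \nabla T \right)
  + \omega_b \rho_b c_b \,(T_a - T)
  + Q_m
```

Here T is the tissue temperature; ρ, c and k are the tissue density, specific heat and thermal conductivity; ω_b, ρ_b and c_b are the blood perfusion rate, density and specific heat; T_a is the arterial blood temperature; and Q_m is the metabolic heat generation. For cryoablation the equation is integrated with temperature-dependent properties, since frozen and unfrozen tissue differ strongly in conductivity and heat capacity.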

  10. Development of a Searchable Database of Cryoablation Simulations for Use in Treatment Planning

    Energy Technology Data Exchange (ETDEWEB)

    Boas, F. Edward, E-mail: boasf@mskcc.org; Srimathveeravalli, Govindarajan, E-mail: srimaths@mskcc.org; Durack, Jeremy C., E-mail: durackj@mskcc.org [Memorial Sloan Kettering Cancer Center, Department of Radiology (United States); Kaye, Elena A., E-mail: kayee@mskcc.org [Memorial Sloan Kettering Cancer Center, Department of Medical Physics (United States); Erinjeri, Joseph P., E-mail: erinjerj@mskcc.org; Ziv, Etay, E-mail: zive@mskcc.org; Maybody, Majid, E-mail: maybodym@mskcc.org; Yarmohammadi, Hooman, E-mail: yarmohah@mskcc.org; Solomon, Stephen B., E-mail: solomons@mskcc.org [Memorial Sloan Kettering Cancer Center, Department of Radiology (United States)

    2017-05-15

Purpose: To create and validate a planning tool for multiple-probe cryoablation, using simulations of ice ball size and shape for various ablation probe configurations, ablation times, and types of tissue ablated. Materials and Methods: Ice ball size and shape was simulated using the Pennes bioheat equation. Five thousand six hundred and seventy different cryoablation procedures were simulated, using 1–6 cryoablation probes and 1–2 cm spacing between probes. The resulting ice ball was measured along three perpendicular axes and recorded in a database. Simulated ice ball sizes were compared to gel experiments (26 measurements) and clinical cryoablation cases (42 measurements). The clinical cryoablation measurements were obtained from a HIPAA-compliant retrospective review of kidney and liver cryoablation procedures between January 2015 and February 2016. Finally, we created a web-based cryoablation planning tool, which uses the cryoablation simulation database to look up the probe spacing and ablation time that produces the desired ice ball shape and dimensions. Results: Average absolute error between the simulated and experimentally measured ice balls was 1 mm in gel experiments and 4 mm in clinical cryoablation cases. The simulations accurately predicted the degree of synergy in multiple-probe ablations. The cryoablation simulation database covers a wide range of ice ball sizes and shapes up to 9.8 cm. Conclusion: Cryoablation simulations accurately predict the ice ball size in multiple-probe ablations. The cryoablation database can be used to plan ablation procedures: given the desired ice ball size and shape, it will find the number and type of probes, probe configuration and spacing, and ablation time required.

  11. Development of a Searchable Database of Cryoablation Simulations for Use in Treatment Planning

    International Nuclear Information System (INIS)

    Boas, F. Edward; Srimathveeravalli, Govindarajan; Durack, Jeremy C.; Kaye, Elena A.; Erinjeri, Joseph P.; Ziv, Etay; Maybody, Majid; Yarmohammadi, Hooman; Solomon, Stephen B.

    2017-01-01

Purpose: To create and validate a planning tool for multiple-probe cryoablation, using simulations of ice ball size and shape for various ablation probe configurations, ablation times, and types of tissue ablated. Materials and Methods: Ice ball size and shape was simulated using the Pennes bioheat equation. Five thousand six hundred and seventy different cryoablation procedures were simulated, using 1–6 cryoablation probes and 1–2 cm spacing between probes. The resulting ice ball was measured along three perpendicular axes and recorded in a database. Simulated ice ball sizes were compared to gel experiments (26 measurements) and clinical cryoablation cases (42 measurements). The clinical cryoablation measurements were obtained from a HIPAA-compliant retrospective review of kidney and liver cryoablation procedures between January 2015 and February 2016. Finally, we created a web-based cryoablation planning tool, which uses the cryoablation simulation database to look up the probe spacing and ablation time that produces the desired ice ball shape and dimensions. Results: Average absolute error between the simulated and experimentally measured ice balls was 1 mm in gel experiments and 4 mm in clinical cryoablation cases. The simulations accurately predicted the degree of synergy in multiple-probe ablations. The cryoablation simulation database covers a wide range of ice ball sizes and shapes up to 9.8 cm. Conclusion: Cryoablation simulations accurately predict the ice ball size in multiple-probe ablations. The cryoablation database can be used to plan ablation procedures: given the desired ice ball size and shape, it will find the number and type of probes, probe configuration and spacing, and ablation time required.

  12. CT-Based Brachytherapy Treatment Planning using Monte Carlo Simulation Aided by an Interface Software

    Directory of Open Access Journals (Sweden)

    Vahid Moslemi

    2011-03-01

Full Text Available Introduction: In brachytherapy, radioactive sources are placed close to the tumor; therefore, small changes in their positions can cause large changes in the dose distribution. This emphasizes the need for computerized treatment planning. The usual method for treatment planning of cervix brachytherapy uses conventional radiographs in the Manchester system. Nowadays, because of their advantages in locating the source positions and the surrounding tissues, CT and MRI images are replacing conventional radiographs. In this study, we used CT images in Monte Carlo based dose calculation for brachytherapy treatment planning, using an interface software to create the geometry file required by the MCNP code. The aim of using the interface software is to facilitate and speed up the geometry set-up for simulations based on the patient's anatomy. This paper examines the feasibility of this method in cervix brachytherapy and assesses its accuracy and speed. Material and Methods: For dosimetric measurements regarding the treatment plan, a pelvic phantom was made from polyethylene in which the treatment applicators could be placed. For simulations using CT images, the phantom was scanned at 120 kVp. Using an interface software written in MATLAB, the CT images were converted into an MCNP input file and the simulation was then performed. Results: Using the interface software, preparation time for the simulations of the applicator and surrounding structures was approximately 3 minutes; the corresponding time needed with conventional MCNP geometry entry was approximately 1 hour. The discrepancy between the simulated and measured doses to point A was 1.7% of the prescribed dose. The corresponding dose differences between the two methods in rectum and bladder were 3.0% and 3.7% of the prescribed dose, respectively. Comparing the results of simulation using the interface software with those of simulation using the standard MCNP geometry entry showed a less than 1

  13. Patient dose simulation in X-ray CT using a radiation treatment-planning system

    International Nuclear Information System (INIS)

    Nakae, Yasuo; Oda, Masahiko; Minamoto, Takahiro

    2003-01-01

    Medical irradiation dosage has been increasing with the development of new radiological equipment and new techniques like interventional radiology. It is fair to say that patient dose has been increased as a result of the development of multi-slice CT. A number of studies on the irradiation dose of CT have been reported, and the computed tomography dose index (CTDI) is now used as a general means of determining CT dose. However, patient dose distribution in the body varies with the patient's constitution, bowel gas in the body, and conditions of exposure. In this study, patient dose was analyzed from the viewpoint of dose distribution, using a radiation treatment-planning computer. Percent depth dose (PDD) and the off-center ratio (OCR) of the CT beam are needed to calculate dose distribution by the planning computer. Therefore, X-ray CT data were measured with various apparatuses, and beam data were sent to the planning computer. Measurement and simulation doses in the elliptical phantom (Mix-Dp: water equivalent material) were collated, and the CT irradiation dose was determined for patient dose simulation. The rotational radiation treatment technique was used to obtain the patient dose distribution of CT, and patient dose was evaluated through simulation of the dose distribution. CT images of the thorax were sent to the planning computer and simulated. The result was that the patient dose distribution of the thorax was obtained for CT examination. (author)

  14. A Simulation for Managing Complexity in Sales and Operations Planning Decisions

    Science.gov (United States)

    DuHadway, Scott; Dreyfus, David

    2017-01-01

    Within the classroom it is often difficult to convey the complexities and intricacies that go into making sales and operations planning decisions. This article describes an in-class simulation that allows students to gain hands-on experience with the complexities in making forecasting, inventory, and supplier selection decisions as part of the…

  15. Image formation simulation for computer-aided inspection planning of machine vision systems

    Science.gov (United States)

    Irgenfried, Stephan; Bergmann, Stephan; Mohammadikaji, Mahsa; Beyerer, Jürgen; Dachsbacher, Carsten; Wörn, Heinz

    2017-06-01

    In this work, a simulation toolset for Computer Aided Inspection Planning (CAIP) of systems for automated optical inspection (AOI) is presented along with a versatile two-robot-setup for verification of simulation and system planning results. The toolset helps to narrow down the large design space of optical inspection systems in interaction with a system expert. The image formation taking place in optical inspection systems is simulated using GPU-based real time graphics and high quality off-line-rendering. The simulation pipeline allows a stepwise optimization of the system, from fast evaluation of surface patch visibility based on real time graphics up to evaluation of image processing results based on off-line global illumination calculation. A focus of this work is on the dependency of simulation quality on measuring, modeling and parameterizing the optical surface properties of the object to be inspected. The applicability to real world problems is demonstrated by taking the example of planning a 3D laser scanner application. Qualitative and quantitative comparison results of synthetic and real images are presented.

  16. Test plan for Fauske and Associates to perform tube propagation experiments with simulated Hanford tank wastes

    International Nuclear Information System (INIS)

    Carlson, C.D.; Babad, H.

    1996-05-01

    This test plan, prepared at Pacific Northwest National Laboratory for Westinghouse Hanford Company, provides guidance for performing tube propagation experiments on simulated Hanford tank wastes and on actual tank waste samples. Simulant compositions are defined and an experimental logic tree is provided for Fauske and Associates (FAI) to perform the experiments. From this guidance, methods and equipment for small-scale tube propagation experiments to be performed at the Hanford Site on actual tank samples will be developed. Propagation behavior of wastes will directly support the safety analysis (SARR) for the organic tanks. Tube propagation may be the definitive tool for determining the relative reactivity of the wastes contained in the Hanford tanks. FAI have performed tube propagation studies previously on simple two- and three-component surrogate mixtures. The simulant defined in this test plan more closely represents actual tank composition. Data will be used to support preparation of criteria for determining the relative safety of the organic bearing wastes

  17. Preoperative surgical planning and simulation of complex cranial base tumors in virtual reality

    Institute of Scientific and Technical Information of China (English)

    YI Zhi-qiang; LI Liang; MO Da-peng; ZHANG Jia-yong; ZHANG Yang; BAO Sheng-de

    2008-01-01

The extremely complex anatomic relationships among bone, tumor, blood vessels and cranial nerves remain a big challenge for cranial base tumor surgery. Therefore, a good understanding of the patient-specific anatomy and a preoperative plan are helpful and crucial for the neurosurgeons. Three-dimensional (3-D) visualization of various imaging techniques has been widely explored to enhance the comprehension of volumetric data for surgical planning.1 We used the Destroscope Virtual Reality (VR) System (Singapore, Volume Interaction Pte Ltd; software: RadioDexter 1.0) to optimize preoperative planning for complex cranial base tumors. This system uses patient-specific, coregistered, fused radiology data sets that may be viewed stereoscopically and can be manipulated in a virtual reality environment. This article describes our experience with the Destroscope VR system in preoperative surgical planning and simulation for 5 patients with complex cranial base tumors and evaluates the clinical usefulness of this system.

  18. Planning Irreversible Electroporation in the Porcine Kidney: Are Numerical Simulations Reliable for Predicting Empiric Ablation Outcomes?

    International Nuclear Information System (INIS)

    Wimmer, Thomas; Srimathveeravalli, Govindarajan; Gutta, Narendra; Ezell, Paula C.; Monette, Sebastien; Maybody, Majid; Erinjery, Joseph P.; Durack, Jeremy C.; Coleman, Jonathan A.; Solomon, Stephen B.

    2015-01-01

Purpose: Numerical simulations are used for treatment planning in clinical applications of irreversible electroporation (IRE) to determine ablation size and shape. To assess the reliability of simulations for treatment planning, we compared simulation results with empiric outcomes of renal IRE using computed tomography (CT) and histology in an animal model. Methods: The ablation size and shape for six different IRE parameter sets (70–90 pulses, 2,000–2,700 V, 70–100 µs) for monopolar and bipolar electrodes was simulated using a numerical model. Employing these treatment parameters, 35 CT-guided IRE ablations were created in both kidneys of six pigs and followed up with CT immediately and after 24 h. Histopathology was analyzed from postablation day 1. Results: Ablation zones on CT measured 81 ± 18 % (day 0, p ≤ 0.05) and 115 ± 18 % (day 1, p ≤ 0.09) of the simulated size for monopolar electrodes, and 190 ± 33 % (day 0, p ≤ 0.001) and 234 ± 12 % (day 1, p ≤ 0.0001) for bipolar electrodes. Histopathology indicated smaller ablation zones than simulated (71 ± 41 %, p ≤ 0.047) and measured on CT (47 ± 16 %, p ≤ 0.005), with complete ablation of kidney parenchyma within the central zone and incomplete ablation in the periphery. Conclusion: Both numerical simulations for planning renal IRE and CT measurements may overestimate the size of ablation compared to histology, and ablation effects may be incomplete in the periphery.

  19. Reactor pressure vessel thermal annealing

    International Nuclear Information System (INIS)

    Lee, A.D.

    1997-01-01

    The steel plates and/or forgings and welds in the beltline region of a reactor pressure vessel (RPV) are subject to embrittlement from neutron irradiation. This embrittlement causes the fracture toughness of the beltline materials to be less than the fracture toughness of the unirradiated material. Material properties of RPVs that have been irradiated and embrittled are recoverable through thermal annealing of the vessel. The amount of recovery primarily depends on the level of the irradiation embrittlement, the chemical composition of the steel, and the annealing temperature and time. Since annealing is an option for extending the service lives of RPVs or establishing less restrictive pressure-temperature (P-T) limits; the industry, the Department of Energy (DOE) and the Nuclear Regulatory Commission (NRC) have assisted in efforts to determine the viability of thermal annealing for embrittlement recovery. General guidance for in-service annealing is provided in American Society for Testing and Materials (ASTM) Standard E 509-86. In addition, the American Society of Mechanical Engineers (ASME) Code Case N-557 addresses annealing conditions (temperature and duration), temperature monitoring, evaluation of loadings, and non-destructive examination techniques. The NRC thermal annealing rule (10 CFR 50.66) was approved by the Commission and published in the Federal Register on December 19, 1995. The Regulatory Guide on thermal annealing (RG 1.162) was processed in parallel with the rule package and was published on February 15, 1996. RG 1.162 contains a listing of issues that need to be addressed for thermal annealing of an RPV. The RG also provides alternatives for predicting re-embrittlement trends after the thermal anneal has been completed. This paper gives an overview of methodology and recent technical references that are associated with thermal annealing. Results from the DOE annealing prototype demonstration project, as well as NRC activities related to the

  20. Functional image-based radiotherapy planning for non-small cell lung cancer: A simulation study

    International Nuclear Information System (INIS)

    Bates, Emma L.; Bragg, Christopher M.; Wild, Jim M.; Hatton, Matthew Q.F.; Ireland, Rob H.

    2009-01-01

Background and purpose: To investigate the incorporation of data from single-photon emission computed tomography (SPECT) or hyperpolarized helium-3 magnetic resonance imaging (3He-MRI) into intensity-modulated radiotherapy (IMRT) planning for non-small cell lung cancer (NSCLC). Material and methods: Seven scenarios were simulated that represent cases of NSCLC with significant functional lung defects. Two independent IMRT plans were produced for each scenario; one to minimise the total lung volume receiving ≥20 Gy (V20), and the other to minimise only the functional lung volume receiving ≥20 Gy (FV20). Dose-volume characteristics and a plan quality index related to planning target volume coverage by the 95% isodose (VPTV95/FV20) were compared between anatomical and functional plans using the Wilcoxon signed ranks test. Results: Compared to anatomical IMRT plans, functional planning reduced FV20 (median 2.7%, range 0.6-3.5%, p = 0.02) and total lung V20 (median 1.5%, 0.5-2.7%, p = 0.02), with a small reduction in mean functional lung dose (median 0.4 Gy, 0-0.7 Gy, p = 0.03). There were no significant differences in target volume coverage or organ-at-risk doses. The plan quality index was improved for functional plans (median increase 1.4, range 0-11.8, p = 0.02). Conclusions: Statistically significant reductions in FV20, V20 and mean functional lung dose are possible when IMRT planning is supplemented by functional information derived from SPECT or 3He-MRI.
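The endpoints reported above are simple dose-volume counts once a dose grid and a lung mask are available. The sketch below is our own minimal illustration (the study itself computed these metrics inside a treatment planning system); it treats the dose distribution as a flat list of voxel doses with a parallel boolean mask:

```python
def volume_at_dose(dose, mask=None, threshold=20.0):
    """Percent of the (masked) voxel volume receiving >= threshold Gy.

    dose: flat sequence of voxel doses in Gy; mask: parallel sequence of
    booleans selecting lung (or functional-lung) voxels. Uniform voxel
    volume is assumed, so volume percent equals voxel-count percent.
    """
    if mask is None:
        mask = [True] * len(dose)
    selected = [d for d, keep in zip(dose, mask) if keep]
    hot = sum(1 for d in selected if d >= threshold)
    return 100.0 * hot / len(selected)
```

V20 would use the anatomical lung mask, while FV20 restricts the mask to voxels scored as functional on SPECT or 3He-MRI; functional planning then aims to shrink the latter without degrading target coverage.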

  1. 77 FR 31026 - Use of Computer Simulation of the United States Blood Supply in Support of Planning for Emergency...

    Science.gov (United States)

    2012-05-24

    ...] Use of Computer Simulation of the United States Blood Supply in Support of Planning for Emergency... entitled: ``Use of Computer Simulation of the United States Blood Supply in Support of Planning for... and panel discussions with experts from academia, regulated industry, government, and other...

  2. PULSim: User-Based Adaptable Simulation Tool for Railway Planning and Operations

    Directory of Open Access Journals (Sweden)

    Yong Cui

    2018-01-01

    Full Text Available Simulation methods are widely used in the field of railway planning and operations. Several commercial software tools are currently available that not only provide functionality for railway simulation but also enable further evaluation and optimisation of the network for scheduling, dispatching, and capacity research. However, these tools fall short with respect to the standards they utilise as well as their published interfaces. For an end-user, the basic mechanisms and assumptions built into a simulation tool are unknown, which limits the true potential of these software tools. One of the most critical issues is that users are unable to define a sophisticated workflow, integrated over several rounds of simulation with adjustable parameters and settings. This paper develops and describes a user-based, customisable platform. As preconditions of the platform, the design aspects for modelling the components of a railway system and building the workflow of railway simulation are elaborated in detail. Based on the model and the workflow, an integrated simulation platform with open interfaces is developed. Users and researchers gain the ability to rapidly develop their own algorithms, supported by the tailored simulation process in a flexible manner. The productivity of using simulation tools for further evaluation and optimisation will be significantly improved through the user-adaptable open interfaces.

  3. Radiofrequency ablation of hepatic tumors: simulation, planning, and contribution of virtual reality and haptics.

    Science.gov (United States)

    Villard, Caroline; Soler, Luc; Gangi, Afshin

    2005-08-01

    For radiofrequency ablation (RFA) of liver tumors, evaluation of the vascular architecture, prediction of post-RFA necrosis, and the choice of a suitable needle placement strategy using conventional radiological techniques remain difficult. In an attempt to enhance the safety of RFA, a 3D simulation, treatment planning, and training tool has been developed that simulates the insertion of the needle and the necrosis of the treated area, and proposes an optimal needle placement. The 3D scenes are automatically reconstructed from enhanced spiral CT scans. The simulator takes into account the cooling effect of local vessels greater than 3 mm in diameter, making necrosis shapes more realistic. Optimal needle positioning can be generated automatically by the software to produce complete destruction of the tumor while maximally sparing the healthy liver and all major structures to be avoided. We also studied how virtual reality and haptic devices can help make simulation and training realistic and effective.

  4. Composition dependent thermal annealing behaviour of ion tracks in apatite

    Energy Technology Data Exchange (ETDEWEB)

    Nadzri, A., E-mail: allina.nadzri@anu.edu.au [Department of Electronic Materials Engineering, Research School of Physics and Engineering, Australian National University, Canberra, ACT 2601 (Australia); Schauries, D.; Mota-Santiago, P.; Muradoglu, S. [Department of Electronic Materials Engineering, Research School of Physics and Engineering, Australian National University, Canberra, ACT 2601 (Australia); Trautmann, C. [GSI Helmholtz Centre for Heavy Ion Research, Planckstrasse 1, 64291 Darmstadt (Germany); Technische Universität Darmstadt, 64287 Darmstadt (Germany); Gleadow, A.J.W. [School of Earth Science, University of Melbourne, Melbourne, VIC 3010 (Australia); Hawley, A. [Australian Synchrotron, 800 Blackburn Road, Clayton, VIC 3168 (Australia); Kluth, P. [Department of Electronic Materials Engineering, Research School of Physics and Engineering, Australian National University, Canberra, ACT 2601 (Australia)

    2016-07-15

    Natural apatite samples with different F/Cl content from a variety of geological locations (Durango, Mexico; Mud Tank, Australia; and Snarum, Norway) were irradiated with swift heavy ions to simulate fission tracks. The annealing kinetics of the resulting ion tracks was investigated using synchrotron-based small-angle X-ray scattering (SAXS) combined with ex situ annealing. The activation energies for track recrystallization were extracted and are consistent with previous track-etching studies; tracks in the chlorine-rich Snarum apatite are more resistant to annealing than those in the other compositions.

  5. Renal Tumor Cryoablation Planning. The Efficiency of Simulation on Reconstructed 3D CT Scan

    Directory of Open Access Journals (Sweden)

    Ciprian Valerian LUCAN

    2010-12-01

    Full Text Available Introduction & Objective: The risks of nephron-sparing surgical techniques are related to the tumor's relationships with adjacent anatomic structures. The complexity of the renal anatomy drives the interest in developing tools for 3D reconstruction and surgery simulation. The aim of this article was to assess simulation on reconstructed 3D CT scans used for planning cryoablation. Material & Method: A prospective randomized study was performed between Jan. 2007 and July 2009 on 27 patients who underwent retroperitoneoscopic cryoablation of T1a renal tumors (RC). All patients were assessed preoperatively by CT scan, also used for 3D volume rendering. In Gr.A, the patients underwent surgical planning by simulation on the 3D CT scan. In Gr.B, patients underwent standard RC. The two groups were compared in terms of surgical time, bleeding, postoperative drainage, analgesics requirement, hospital stay, and time to socio-professional reintegration. Results: Fourteen patients underwent preoperative cryoablation planning (Gr.A) and 13 patients underwent standard RC (Gr.B). All parameters analyzed were shorter in Gr.A. On multivariate logistic regression, only the shortening of surgical time (138.79±5.51 min in Gr.A vs. 140.92±5.54 min in Gr.B) and bleeding (164.29±60.22 mL in Gr.A vs. 215.38±100.80 mL in Gr.B) achieved statistical significance (p<0.05). The number of cryoneedles assessed by simulation had a 92.52% accuracy when compared with those effectively used. Conclusions: Simulation of cryoablation using a reconstructed 3D CT scan improves the surgical results. The application used for simulation was able to accurately assess the number of cryoneedles required for tumor ablation, as well as their direction and approach.

  6. Simulation of textile manufacturing processes for planning, scheduling, and quality control purposes

    Science.gov (United States)

    Cropper, A. E.; Wang, Z.

    1995-08-01

    Simulation, as a management information tool, has been applied to engineering manufacture and assembly operations. The application of the principles to textile manufacturing (fiber to fabric) is discussed. The particular problems and solutions in applying the simulation software package to the yarn production processes are discussed, with an indication of how the software achieves the production schedule. The system appears to have application in planning, scheduling, and quality assurance, the latter being a result of the traceability possibilities through a process involving mixing and splitting of material.

  7. Inverse planning IMRT

    International Nuclear Information System (INIS)

    Rosenwald, J.-C.

    2008-01-01

    The lecture addressed the following topics: Optimizing radiotherapy dose distribution; IMRT contributes to optimization of energy deposition; Inverse vs direct planning; Main steps of IMRT; Background of inverse planning; General principle of inverse planning; The 3 main components of IMRT inverse planning; The simplest cost function (deviation from prescribed dose); The driving variable : the beamlet intensity; Minimizing a 'cost function' (or 'objective function') - the walker (or skier) analogy; Application to IMRT optimization (the gradient method); The gradient method - discussion; The simulated annealing method; The optimization criteria - discussion; Hard and soft constraints; Dose volume constraints; Typical user interface for definition of optimization criteria; Biological constraints (Equivalent Uniform Dose); The result of the optimization process; Semi-automatic solutions for IMRT; Generalisation of the optimization problem; Driving and driven variables used in RT optimization; Towards multi-criteria optimization; and Conclusions for the optimization phase. (P.A.)
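The outline's "simplest cost function" and gradient method can be made concrete with a small sketch: minimise the squared deviation of voxel doses from a prescription over non-negative beamlet intensities (the "driving variable" of the lecture). The dose-influence matrix, dimensions, prescription, and step size below are hypothetical illustration values, not taken from the lecture.

```python
import numpy as np

# Hypothetical dose-influence matrix: dose[i] = sum_j D[i, j] * w[j],
# where w[j] is the intensity of beamlet j (the driving variable).
rng = np.random.default_rng(0)
D = rng.uniform(0.0, 1.0, size=(50, 10))   # 50 voxels, 10 beamlets
prescribed = np.full(50, 60.0)             # prescribed dose per voxel (Gy)

w = np.zeros(10)                           # beamlet intensities
step = 1e-3
for _ in range(5000):
    residual = D @ w - prescribed          # deviation from prescription
    grad = 2.0 * D.T @ residual            # gradient of sum(residual**2)
    w = np.maximum(w - step * grad, 0.0)   # project onto w >= 0

cost = float(np.sum((D @ w - prescribed) ** 2))  # final quadratic cost
```

Projecting onto w >= 0 after each step keeps the intensities physical; the step size must stay below 2 divided by the largest eigenvalue of 2·DᵀD for the iteration to converge.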

  8. Sci-Fri PM: Radiation Therapy, Planning, Imaging, and Special Techniques - 05: A novel respiratory motion simulation program for VMAT treatment plans: a phantom validation study

    International Nuclear Information System (INIS)

    Hubley, Emily; Pierce, Greg; Ploquin, Nicolas

    2016-01-01

    Purpose: To develop and validate a computational method to simulate craniocaudal respiratory motion in a VMAT treatment plan. Methods: Three 4DCTs of the QUASAR respiratory motion phantom were acquired with a 2 cm water-density spherical tumour embedded in cedar to simulate lung. The phantom oscillated sinusoidally with an amplitude of 2 cm and periods of 3, 4, and 5 seconds. An ITV was contoured and a 5 mm PTV margin was added. High and low modulation factor VMAT plans were created for each scan. An in-house program was developed to simulate respiratory motion in the treatment plans by shifting the MLC leaf positions relative to the phantom. Each plan was delivered to the phantom and the dose was measured using Gafchromic film. The measured and calculated plans were compared using an absolute dose gamma analysis (3%/3 mm). Results: The average gamma pass rates for the low and high modulation plans were 91.1% and 51.4%, respectively. The difference between the high and low modulation plans' gamma pass rates is likely related to the different sampling frequency of the respiratory curve and the higher MLC leaf speeds in the high modulation plan. A high modulation plan has a slower gantry speed and therefore samples the breathing cycle at a coarser frequency, leading to inaccuracies between the measured and planned doses. Conclusion: A simple program, including a novel method for increasing sampling frequency beyond the control point frequency, has been developed to simulate respiratory motion in VMAT plans by shifting the MLC leaf positions.

  9. Sci-Fri PM: Radiation Therapy, Planning, Imaging, and Special Techniques - 05: A novel respiratory motion simulation program for VMAT treatment plans: a phantom validation study

    Energy Technology Data Exchange (ETDEWEB)

    Hubley, Emily; Pierce, Greg; Ploquin, Nicolas [University of Calgary, Tom Baker Cancer Centre, Tom Baker Cancer Centre (Canada)

    2016-08-15

    Purpose: To develop and validate a computational method to simulate craniocaudal respiratory motion in a VMAT treatment plan. Methods: Three 4DCTs of the QUASAR respiratory motion phantom were acquired with a 2 cm water-density spherical tumour embedded in cedar to simulate lung. The phantom oscillated sinusoidally with an amplitude of 2 cm and periods of 3, 4, and 5 seconds. An ITV was contoured and a 5 mm PTV margin was added. High and low modulation factor VMAT plans were created for each scan. An in-house program was developed to simulate respiratory motion in the treatment plans by shifting the MLC leaf positions relative to the phantom. Each plan was delivered to the phantom and the dose was measured using Gafchromic film. The measured and calculated plans were compared using an absolute dose gamma analysis (3%/3 mm). Results: The average gamma pass rates for the low and high modulation plans were 91.1% and 51.4%, respectively. The difference between the high and low modulation plans' gamma pass rates is likely related to the different sampling frequency of the respiratory curve and the higher MLC leaf speeds in the high modulation plan. A high modulation plan has a slower gantry speed and therefore samples the breathing cycle at a coarser frequency, leading to inaccuracies between the measured and planned doses. Conclusion: A simple program, including a novel method for increasing sampling frequency beyond the control point frequency, has been developed to simulate respiratory motion in VMAT plans by shifting the MLC leaf positions.
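The absolute-dose gamma analysis (3%/3 mm) used for plan comparison above checks, for each measured point, whether some nearby calculated point agrees in dose within 3% or lies within 3 mm. A deliberately simplified 1D sketch (clinical implementations work on 2D film planes or 3D grids and interpolate between grid points; the dose profile here is synthetic):

```python
import numpy as np

def gamma_pass_rate(measured, calculated, dx, dose_tol=0.03, dist_tol=3.0):
    """1D global gamma analysis: fraction of measured points with gamma <= 1.

    dose_tol is a fraction of the maximum calculated dose (global
    normalisation); dist_tol and the grid spacing dx are in mm.
    """
    norm = dose_tol * calculated.max()
    pos = np.arange(len(calculated)) * dx
    gammas = []
    for i, d_m in enumerate(measured):
        dose_term = (calculated - d_m) / norm      # dose difference
        dist_term = (pos - i * dx) / dist_tol      # distance to agreement
        gammas.append(np.sqrt(dose_term**2 + dist_term**2).min())
    return float((np.asarray(gammas) <= 1.0).mean())

# Synthetic dose profile: identical measured and calculated doses pass
# everywhere, so the pass rate is 1.0.
profile = 2.0 * np.exp(-0.5 * ((np.arange(100) - 50.0) / 10.0) ** 2)
rate = gamma_pass_rate(profile, profile, dx=1.0)   # -> 1.0
```

Scaling the measured profile up by 50% makes points near the dose maximum fail, dropping the pass rate below 1.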

  10. Selection of a Planning Horizon for a Hybrid Microgrid Using Simulated Wind Forecasts

    Science.gov (United States)

    2014-12-01

    Selection of a Planning Horizon for a Hybrid Microgrid Using Simulated Wind Forecasts. Mumtaz Karatas (Turkish Naval Academy, Tuzla, Istanbul, 34942, Turkey); Emily M. Craparo and Dashi I. Singham (Naval Postgraduate School, 1411 Cunningham Road, Monterey, CA, 93943, USA). Abstract (fragment): Hybrid microgrids containing renewable energy ... microgrid robustness and efficiency and may provide operators with real-time guidance and control policies for microgrid operation.

  11. Influence of alloying and secondary annealing on anneal hardening ...

    Indian Academy of Sciences (India)

    Unknown

    Influence of alloying and secondary annealing on anneal hardening effect at sintered copper alloys. SVETLANA NESTOROVIC. Technical Faculty Bor, University of Belgrade, Bor, Yugoslavia. MS received 11 February 2004; revised 29 October 2004. Abstract. This paper reports results of investigation carried out on sintered ...

  12. New development of integrated CT simulation system for radiation therapy planning

    International Nuclear Information System (INIS)

    Kushima, Takeyuki; Kono, Michio

    1993-01-01

    In order to put more accurate radiotherapy into practice, a radiotherapy planning system using CT, named the CT simulation system, has been developed and introduced at Kobe University Hospital. The CT simulation system consists of a CT scanner, an image-processing workstation, and a laser marking system. The target area of radiation is determined on each axial CT image of the scout view on the workstation. Three-dimensional treatment planning is feasible on the basis of the two-dimensional tumor information in the axial CT images. After setting the treatment parameters, the contour of the radiation field in the beam's eye view and the isocenter position are calculated by computer. This system makes it possible to choose an appropriate irradiation method and an optimal dose distribution. In the present study we examined the fundamental capability of this system. The laser marking system proved to have a very high degree of accuracy. The outcome of a phantom test raised the strong possibility that this system may be applied clinically. In addition to these basic findings, this paper describes preliminary clinical observations that support the good reproducibility of the radiation field projected with the CT simulator. In conclusion, this system is of high value for radiation therapy planning. (author)

  13. DOE's annealing prototype demonstration projects

    International Nuclear Information System (INIS)

    Warren, J.; Nakos, J.; Rochau, G.

    1997-01-01

    One of the challenges U.S. utilities face in addressing technical issues associated with the aging of nuclear power plants is the long-term effect of plant operation on reactor pressure vessels (RPVs). As a nuclear plant operates, its RPV is exposed to neutrons. For certain plants, this neutron exposure can cause embrittlement of some of the RPV welds which can shorten the useful life of the RPV. This RPV embrittlement issue has the potential to affect the continued operation of a number of operating U.S. pressurized water reactor (PWR) plants. However, RPV material properties affected by long-term irradiation are recoverable through a thermal annealing treatment of the RPV. Although a dozen Russian-designed RPVs and several U.S. military vessels have been successfully annealed, U.S. utilities have stated that a successful annealing demonstration of a U.S. RPV is a prerequisite for annealing a licensed U.S. nuclear power plant. In May 1995, the Department of Energy's Sandia National Laboratories awarded two cost-shared contracts to evaluate the feasibility of annealing U.S. licensed plants by conducting an anneal of an installed RPV using two different heating technologies. The contracts were awarded to the American Society of Mechanical Engineers (ASME) Center for Research and Technology Development (CRTD) and MPR Associates (MPR). The ASME team completed its annealing prototype demonstration in July 1996, using an indirect gas furnace at the uncompleted Public Service of Indiana's Marble Hill nuclear power plant. The MPR team's annealing prototype demonstration was scheduled to be completed in early 1997, using a direct heat electrical furnace at the uncompleted Consumers Power Company's nuclear power plant at Midland, Michigan. This paper describes the Department's annealing prototype demonstration goals and objectives; the tasks, deliverables, and results to date for each annealing prototype demonstration; and the remaining annealing technology challenges

  14. Planning of development strategy for establishment of advanced simulation of nuclear system

    International Nuclear Information System (INIS)

    Chung, Bubdong; Ko, Wonil; Kwon Junhyun

    2013-12-01

    In this report, the long-term development plan for each technical area is proposed, together with a plan for a coupled code system. A consolidated code system for safety analysis is proposed for future needs, and the computing hardware needed for the advanced simulation is also proposed. The best approach for future safety analysis simulation capabilities may be a dual-path program, i.e., development programs for an integrated analysis tool and for multi-scale/multi-physics analysis tools, where the former aims at reducing uncertainty and the latter at enhancing accuracy. The integrated analysis tool with risk-informed safety margin quantification requires a significant extension of the phenomenological and geometric capabilities of existing reactor safety analysis software, so that detailed simulations can reduce the uncertainties. For the multi-scale, multi-physics analysis tools, simplifications of complex phenomenological models and dependencies have been made in current safety analyses to accommodate computer hardware limitations. With the advent of modern computer hardware, these limitations may be removed to permit greater accuracy in representing the physical behavior of materials in design-basis and beyond-design-basis conditions, and hence more accurate assessment of the true safety margins based on first-principles methodology. These proposals can be used to develop the advanced simulation project, to formulate the organization, and to establish a high-performance computing system at KAERI.

  15. Fast dose planning Monte Carlo simulations in inhomogeneous phantoms submerged in uniform, static magnetic fields

    International Nuclear Information System (INIS)

    Yanez, R.; Dempsey, J. F.

    2007-01-01

    We present studies in support of the development of a magnetic resonance imaging (MRI) guided intensity modulated radiation therapy (IMRT) device for the treatment of cancer patients. Fast and accurate computation of the absorbed ionizing radiation dose delivered in the presence of the MRI magnetic field is required for clinical implementation. The fast Monte Carlo simulation code DPM, optimized for radiotherapy treatment planning, is modified to simulate absorbed doses in uniform, static magnetic fields, and benchmarked against PENELOPE. Simulations of dose deposition in inhomogeneous phantoms, in which a low-density material is sandwiched in water, show that a lower MRI field strength (0.3 T) is preferable in order to avoid dose build-up near material boundaries. (authors)

  16. Modeling, simulation, and optimal initiation planning for needle insertion into the liver.

    Science.gov (United States)

    Sharifi Sedeh, R; Ahmadian, M T; Janabi-Sharifi, F

    2010-04-01

    Needle insertion simulation and planning systems (SPSs) will play an important role in diminishing inappropriate insertions into soft tissues and resultant complications. Difficulties in SPS development are due in large part to the computational requirements of the extensive calculations in finite element (FE) models of tissue. For clinical feasibility, the computational speed of SPSs must be improved. At the same time, a realistic model of tissue properties that reflects large and velocity-dependent deformations must be employed. The purpose of this study is to address the aforementioned difficulties by presenting a cost-effective SPS platform for needle insertions into the liver. The study was constrained to planar (2D) cases, but can be extended to 3D insertions. To accommodate large and velocity-dependent deformations, a hyperviscoelastic model was devised to produce an FE model of liver tissue. Material constants were identified by a genetic algorithm applied to the experimental results of unconfined compressions of bovine liver. The approach for SPS involves B-spline interpolations of sample data generated from the FE model of liver. Two interpolation-based models are introduced to approximate puncture times and to approximate the coordinates of FE model nodes interacting with the needle tip as a function of the needle initiation pose; the latter was also a function of postpuncture time. A real-time simulation framework is provided, and its computational benefit is highlighted by comparing its performance with the FE method. A planning algorithm for optimal needle initiation was designed, and its effectiveness was evaluated by analyzing its accuracy in reaching a random set of targets at different resolutions of sampled data using the FE model. The proposed simulation framework can easily surpass haptic rates (>500 Hz), even with a high pose resolution level (approximately 30). The computational time required to update the coordinates of the node at the
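The record's core speed trick, replacing on-line finite-element solves with B-spline interpolation of precomputed samples, can be illustrated with a toy one-dimensional force-depth curve; the sample model and all numbers below are hypothetical stand-ins for FE output:

```python
import numpy as np
from scipy.interpolate import make_interp_spline

# Hypothetical offline stage: sparse samples of a needle force vs. insertion
# depth curve, standing in for expensive finite-element solutions.
depth = np.linspace(0.0, 40.0, 9)                       # mm
force = 0.02 * depth**1.5 + 0.5 * np.sin(0.2 * depth)   # toy response (N)

spline = make_interp_spline(depth, force, k=3)          # cubic B-spline

# Online stage: haptic-rate queries become cheap spline evaluations.
query = np.linspace(0.0, 40.0, 2000)
interp = spline(query)

# The spline reproduces the sampled values exactly at the sample points.
err_at_samples = float(np.max(np.abs(spline(depth) - force)))
```

Each haptic-rate query then costs only a spline evaluation instead of an FE solve, which is how update rates above 500 Hz become feasible.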

  17. Radiation annealing in cuprous oxide

    DEFF Research Database (Denmark)

    Vajda, P.

    1966-01-01

    Experimental results from high-intensity gamma-irradiation of cuprous oxide are used to investigate the annealing of defects with increasing radiation dose. The results are analysed on the basis of the Balarin and Hauser (1965) statistical model of radiation annealing, giving a square...

  18. Extrapolation of zircon fission-track annealing models

    International Nuclear Information System (INIS)

    Palissari, R.; Guedes, S.; Curvo, E.A.C.; Moreira, P.A.F.P.; Tello, C.A.; Hadler, J.C.

    2013-01-01

    One of the purposes of this study is to give further constraints on the temperature range of the zircon partial annealing zone over a geological time scale using data from borehole zircon samples, which have experienced stable temperatures for ∼1 Ma. In this way, the extrapolation problem is explicitly addressed by fitting the zircon annealing models with geological timescale data. Several empirical model formulations have been proposed to perform these calibrations and are compared in this work. The basic form proposed for annealing models is the Arrhenius-type model. There are other annealing models that are based on the same general formulation. These empirical model equations have been preferred because of the great number of phenomena, from track formation to chemical etching, that are not well understood. However, there are two other models which try to establish a direct correlation between their parameters and the related phenomena. To compare the response of the different annealing models, thermal indexes, such as the closure temperature, the total annealing temperature and the partial annealing zone, have been calculated and compared with field evidence. After comparing the different models, it was concluded that the fanning curvilinear models yield the best agreement between predicted index temperatures and field evidence. - Highlights: ► Geological data were used along with lab data for improving model extrapolation. ► Index temperatures were simulated for testing model extrapolation. ► Curvilinear Arrhenius models produced better geological temperature predictions
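As a minimal sketch of the Arrhenius-type form mentioned above, a parallel model g = c0 + c1·ln(t) − c2/T links laboratory and geological time scales through iso-annealing contours. The coefficients and the simple g = 1 − r transform below are hypothetical, chosen only so that a one-hour laboratory anneal and a ~1 Ma geological hold fall on the same contour; real calibrations fit coefficients to etched or SAXS track-length data.

```python
import numpy as np

# Parallel Arrhenius annealing model (all coefficients hypothetical):
#   g = c0 + c1*ln(t) - c2/T,   r = 1 - g   (r: reduced track length)
# Higher temperature T (K) or longer time t (s) means more annealing.
c0, c1, c2 = 1.798, 0.05, 1064.0

def reduced_length(t_seconds, T_kelvin):
    g = c0 + c1 * np.log(t_seconds) - c2 / T_kelvin
    return float(np.clip(1.0 - g, 0.0, 1.0))

# Temperature producing a given r at fixed t (iso-annealing contour):
def iso_temperature(t_seconds, r):
    return c2 / (c0 + c1 * np.log(t_seconds) - (1.0 - r))

hour = 3600.0
mega_year = 3.156e13
T_lab = iso_temperature(hour, 0.5)        # ~623 K in a 1-hour lab anneal
T_geo = iso_temperature(mega_year, 0.5)   # ~373 K over ~1 Ma
```

The large gap between T_lab and T_geo for the same degree of annealing is exactly the extrapolation that borehole data are used to constrain.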

  19. Optimization of a water resource system expansion using the Genetic Algorithm and Simulated Annealing methods; Optimizacion de la expansion de un sistema de recursos hidricos utilizados las metodologias del algoritmo genetico y el recocido simulado

    Energy Technology Data Exchange (ETDEWEB)

    Sanchez Camacho, Enrique; Andreu Alvarez, Joaquin [Universidad Politecnica de Valencia (Spain)

    2001-06-01

    Two numerical procedures, based on the Genetic Algorithm (GA) and Simulated Annealing (SA), are developed to solve the problem of expanding the capacity of a water resource system. The problem was divided into two subproblems: capital availability and operation policy. Both are optimisation-simulation models; the first is solved by means of the GA and SA, in each case, while the second is solved using the Out-of-kilter algorithm (OKA) in both models. The objective function considers the usual benefits and costs in this kind of system, such as irrigation and hydropower benefits, and the costs of dam construction and system maintenance. The strengths and weaknesses of both models are evaluated by comparing their results with those obtained with the branch and bound technique, which was classically used to solve this kind of problem.
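The outer search of the record's two-level scheme can be sketched with simulated annealing over binary build decisions. Here a toy benefit function with a budget penalty stands in for the inner operation-policy simulation (which the record solves with the Out-of-kilter algorithm), and all cost and benefit figures are hypothetical:

```python
import math
import random

# Outer SA search over which capacity expansions to build; a simple
# penalised benefit function replaces the inner operation-policy
# simulation. All costs/benefits are hypothetical.
random.seed(1)
cost    = [40, 30, 25, 50, 10]   # construction cost per candidate project
benefit = [60, 35, 40, 55, 12]   # irrigation + hydropower benefit
budget  = 100

def objective(x):
    c = sum(ci for ci, xi in zip(cost, x) if xi)
    b = sum(bi for bi, xi in zip(benefit, x) if xi)
    return b if c <= budget else b - 10 * (c - budget)  # penalise overruns

x = [0] * len(cost)
cur = objective(x)
best, best_val = x[:], cur
T = 10.0
for _ in range(2000):
    y = x[:]
    y[random.randrange(len(y))] ^= 1          # flip one build decision
    cand = objective(y)
    if cand >= cur or random.random() < math.exp((cand - cur) / T):
        x, cur = y, cand
        if cur > best_val:
            best, best_val = x[:], cur
    T *= 0.995                                # geometric cooling
```

A genetic algorithm would replace the single-state flip/accept loop with a population, crossover, and mutation, but would optimise the same objective.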

  20. Simulation of complex data structures for planning of studies with focus on biomarker comparison.

    Science.gov (United States)

    Schulz, Andreas; Zöller, Daniela; Nickels, Stefan; Beutel, Manfred E; Blettner, Maria; Wild, Philipp S; Binder, Harald

    2017-06-13

    There are a growing number of observational studies that do not only focus on single biomarkers for predicting an outcome event, but address questions in a multivariable setting. For example, when quantifying the added value of new biomarkers in addition to established risk factors, the aim might be to rank several new markers with respect to their prediction performance. This makes it important to consider the marker correlation structure for planning such a study. Because of the complexity, a simulation approach may be required to adequately assess sample size or other aspects, such as the choice of a performance measure. In a simulation study based on real data, we investigated how to generate covariates with realistic distributions and what generating model should be used for the outcome, aiming to determine the least amount of information and complexity needed to obtain realistic results. As a basis for the simulation a large epidemiological cohort study, the Gutenberg Health Study, was used. The added value of markers was quantified and ranked in subsampling data sets of this population data, and simulation approaches were judged by the quality of the ranking. One of the evaluated approaches, the random forest, requires original data at the individual level. Therefore, the effect of the size of a pilot study for random forest based simulation was also investigated. We found that simple logistic regression models failed to adequately generate realistic data, even with extensions such as interaction terms or non-linear effects. The random forest approach was seen to be more appropriate for simulation of complex data structures. Pilot studies starting at about 250 observations were seen to provide a reasonable level of information for this approach. We advise avoiding oversimplified regression models for simulation, in particular when focusing on multivariable research questions. 
More generally, a simulation should be based on real data for adequately reflecting
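The study design above, ranking the added value of correlated markers on top of an established risk factor, can be sketched with a toy data-generating model. The correlation matrix, effect sizes, and the simple rank-based AUC below are hypothetical illustration choices, far simpler than the random forest generator the record recommends:

```python
import numpy as np

# Toy setup: one established risk factor plus two correlated candidate
# markers; outcome follows a logistic model (all effect sizes hypothetical).
rng = np.random.default_rng(3)
n = 4000
covm = np.array([[1.0, 0.3, 0.3],
                 [0.3, 1.0, 0.5],
                 [0.3, 0.5, 1.0]])
X = rng.multivariate_normal(np.zeros(3), covm, size=n)
logit = -1.0 + 1.0 * X[:, 0] + 0.8 * X[:, 1] + 0.2 * X[:, 2]
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

def auc(score, outcome):
    # Rank-based AUC: probability that a case outranks a control.
    order = np.argsort(score)
    ranks = np.empty(n)
    ranks[order] = np.arange(1, n + 1)
    n1 = outcome.sum()
    return (ranks[outcome].sum() - n1 * (n1 + 1) / 2) / (n1 * (n - n1))

base = auc(X[:, 0], y)                      # established factor alone
added = {f"marker{j}": auc(X[:, 0] + X[:, j], y) - base for j in (1, 2)}
```

With these effect sizes, marker1 (coefficient 0.8) shows a clearly larger added AUC than marker2 (coefficient 0.2), reproducing the kind of ranking the study evaluates.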

  1. Simulation of complex data structures for planning of studies with focus on biomarker comparison

    Directory of Open Access Journals (Sweden)

    Andreas Schulz

    2017-06-01

    Full Text Available Abstract Background There are a growing number of observational studies that do not only focus on single biomarkers for predicting an outcome event, but address questions in a multivariable setting. For example, when quantifying the added value of new biomarkers in addition to established risk factors, the aim might be to rank several new markers with respect to their prediction performance. This makes it important to consider the marker correlation structure for planning such a study. Because of the complexity, a simulation approach may be required to adequately assess sample size or other aspects, such as the choice of a performance measure. Methods In a simulation study based on real data, we investigated how to generate covariates with realistic distributions and what generating model should be used for the outcome, aiming to determine the least amount of information and complexity needed to obtain realistic results. As a basis for the simulation a large epidemiological cohort study, the Gutenberg Health Study, was used. The added value of markers was quantified and ranked in subsampling data sets of this population data, and simulation approaches were judged by the quality of the ranking. One of the evaluated approaches, the random forest, requires original data at the individual level. Therefore, the effect of the size of a pilot study for random forest based simulation was also investigated. Results We found that simple logistic regression models failed to adequately generate realistic data, even with extensions such as interaction terms or non-linear effects. The random forest approach was seen to be more appropriate for simulation of complex data structures. Pilot studies starting at about 250 observations were seen to provide a reasonable level of information for this approach. Conclusions We advise avoiding oversimplified regression models for simulation, in particular when focusing on multivariable research questions. 
More generally

  2. Using a simulated annealing algorithm to optimize the parameters of the Biome-BGC model

    Institute of Scientific and Technical Information of China (English)

    张廷龙; 孙睿; 胡波; 冯丽超

    2011-01-01

    Ecological process models are built on well-defined mechanisms and can simulate the dynamic behaviour and features of terrestrial ecosystems well, but their numerous parameters can become a bottleneck in practical applications. In this paper, a simulated annealing algorithm was used to optimize the physiological and ecological parameters of the Biome-BGC model. The parameters to be optimized were first selected, and then optimized step by step. Using the optimized parameters, the model simulation results were much closer to the observed data, showing that parameter optimization can effectively reduce the uncertainty of model simulation. The optimization process and method used in this paper provide a worked example for the parameter identification and optimization of ecological process models, and can help to expand the area of application of such models.
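The record's parameter-optimisation step can be sketched with a generic continuous-parameter simulated annealing loop. The two-parameter stand-in model, proposal scale, and cooling schedule below are hypothetical and merely illustrate fitting parameters to observations by minimising squared error (Biome-BGC itself has far more parameters):

```python
import math
import random

# Continuous-parameter SA: fit two hypothetical model parameters to
# observations by minimising the sum of squared errors. The "model"
# a*x + b*x^2 is a toy stand-in, not Biome-BGC.
random.seed(42)
true_a, true_b = 2.0, -0.5
xs = [0.1 * i for i in range(50)]
obs = [true_a * x + true_b * x * x for x in xs]

def sse(a, b):
    return sum((a * x + b * x * x - y) ** 2 for x, y in zip(xs, obs))

a, b = 0.0, 0.0                      # initial parameter guess
cur = sse(a, b)
T = 1.0
for _ in range(20000):
    a2 = a + random.gauss(0.0, 0.1)  # Gaussian proposal on each parameter
    b2 = b + random.gauss(0.0, 0.1)
    cand = sse(a2, b2)
    d = cand - cur
    if d <= 0 or random.random() < math.exp(-d / T):
        a, b, cur = a2, b2, cand
    T = max(T * 0.9995, 1e-3)        # geometric cooling with a floor
```

Accepting occasional uphill moves early (large T) lets the search escape local minima before the cooling schedule makes it effectively greedy.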

  3. Multi-period multi-objective electricity generation expansion planning problem with Monte-Carlo simulation

    Energy Technology Data Exchange (ETDEWEB)

    Tekiner, Hatice [Industrial Engineering, College of Engineering and Natural Sciences, Istanbul Sehir University, 2 Ahmet Bayman Rd, Istanbul (Turkey); Coit, David W. [Department of Industrial and Systems Engineering, Rutgers University, 96 Frelinghuysen Rd., Piscataway, NJ (United States); Felder, Frank A. [Edward J. Bloustein School of Planning and Public Policy, Rutgers University, Piscataway, NJ (United States)

    2010-12-15

    A new approach to the electricity generation expansion problem is proposed to minimize simultaneously multiple objectives, such as cost and air emissions, including CO{sub 2} and NO{sub x}, over a long term planning horizon. In this problem, system expansion decisions are made to select the type of power generation, such as coal, nuclear, wind, etc., where the new generation asset should be located, and at which time period expansion should take place. We are able to find a Pareto front for the multi-objective generation expansion planning problem that explicitly considers availability of the system components over the planning horizon and operational dispatching decisions. Monte-Carlo simulation is used to generate numerous scenarios based on the component availabilities and anticipated demand for energy. The problem is then formulated as a mixed integer linear program, and optimal solutions are found based on the simulated scenarios with a combined objective function considering the multiple problem objectives. The different objectives are combined using dimensionless weights and a Pareto front can be determined by varying these weights. The mathematical model is demonstrated on an example problem with interesting results indicating how expansion decisions vary depending on whether minimizing cost or minimizing greenhouse gas emissions or pollutants is given higher priority. (author)
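    The weighted-sum treatment of the two objectives over Monte-Carlo demand scenarios can be sketched as below. All option names, costs, emission rates, and normalizing constants are hypothetical placeholders, not data from the paper, and a full model would choose portfolios over time rather than a single option:

```python
import random

# Hypothetical expansion options: (name, annualized capacity cost, tCO2 per MWh).
OPTIONS = [("coal", 100.0, 0.9), ("gas", 130.0, 0.4),
           ("nuclear", 220.0, 0.0), ("wind", 180.0, 0.02)]

rng = random.Random(1)
# Monte-Carlo scenarios for anticipated demand (MWh).
scenarios = [max(0.0, rng.gauss(1000.0, 150.0)) for _ in range(500)]

def evaluate(option):
    """Expected cost and emissions of an option across the simulated scenarios."""
    _, cap_cost, emis_rate = option
    mean_demand = sum(scenarios) / len(scenarios)
    exp_cost = cap_cost + 0.05 * mean_demand   # capacity + crude fuel-cost proxy
    exp_emis = emis_rate * mean_demand
    return exp_cost, exp_emis

scored = {opt[0]: evaluate(opt) for opt in OPTIONS}

# Combine the objectives with dimensionless weights and sweep the weight to
# trace out the Pareto-preferred choices (220 and 900 are rough scale factors).
pareto_choices = set()
for w in [i / 10.0 for i in range(11)]:
    best = min(OPTIONS, key=lambda o: w * scored[o[0]][0] / 220.0
                                      + (1 - w) * scored[o[0]][1] / 900.0)
    pareto_choices.add(best[0])
```

    Sweeping the weight reproduces the qualitative result in the abstract: cost-dominated weights favor the cheap, high-emission option, while emission-dominated weights favor the clean, expensive one.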

  4. Using discrete-event simulation in strategic capacity planning for an outpatient physical therapy service.

    Science.gov (United States)

    Rau, Chi-Lun; Tsai, Pei-Fang Jennifer; Liang, Sheau-Farn Max; Tan, Jhih-Cian; Syu, Hong-Cheng; Jheng, Yue-Ling; Ciou, Ting-Syuan; Jaw, Fu-Shan

    2013-12-01

    This study uses a simulation model as a tool for strategic capacity planning for an outpatient physical therapy clinic in Taipei, Taiwan. The clinic provides a wide range of physical treatments, with 6 full-time therapists in each session. We constructed a discrete-event simulation model to study the dynamics of patient mixes with realistic treatment plans, and to estimate the practical capacity of the physical therapy room. The changes in time-related and space-related performance measurements were used to evaluate the impact of various strategies on the capacity of the clinic. The simulation results confirmed that the clinic is extremely patient-oriented, with a bottleneck occurring at the traction units for Intermittent Pelvic Traction (IPT), with usage at 58.9 %. Sensitivity analysis showed that attending to more patients would significantly increase the number of patients staying for overtime sessions. We found that pooling the therapists produced beneficial results. The average waiting time per patient could be reduced by 45 % when we pooled 2 therapists. We found that treating up to 12 new patients per session had no significantly negative impact on returning patients. Moreover, we found that the average waiting time for new patients decreased if they were given priority over returning patients when called by the therapists.
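    The pooling result can be illustrated with a minimal discrete-event simulation. The arrival and treatment rates below are illustrative assumptions, not the Taipei clinic's data:

```python
import heapq
import random

def simulate_clinic(n_patients, n_therapists, mean_interarrival,
                    mean_treatment, seed=0):
    """Minimal discrete-event simulation of a pooled-therapist clinic.
    Returns the average patient waiting time."""
    rng = random.Random(seed)
    t = 0.0
    arrivals = []
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / mean_interarrival)
        arrivals.append(t)
    free_at = [0.0] * n_therapists   # time each therapist next becomes free
    heapq.heapify(free_at)
    total_wait = 0.0
    for arr in arrivals:
        free = heapq.heappop(free_at)        # earliest-available therapist
        start = max(arr, free)               # wait only if all are busy
        total_wait += start - arr
        heapq.heappush(free_at, start + rng.expovariate(1.0 / mean_treatment))
    return total_wait / n_patients

# Same per-therapist load (80% utilization) with and without pooling:
w1 = simulate_clinic(5000, 1, mean_interarrival=10.0, mean_treatment=8.0)
w2 = simulate_clinic(5000, 2, mean_interarrival=5.0, mean_treatment=8.0)
```

    At identical utilization, the pooled two-therapist system produces a markedly shorter average wait, the same qualitative effect the study reports.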

  5. A multileaf collimator phantom for the quality assurance of radiation therapy planning systems and CT simulators

    International Nuclear Information System (INIS)

    McNiven, Andrea; Kron, Tomas; Van Dyk, Jake

    2004-01-01

    Purpose: The evolution of three-dimensional conformal radiation treatment has led to the use of multileaf collimators (MLCs) in intensity-modulated radiation therapy (IMRT) and other treatment techniques to increase the conformity of the dose distribution. A new quality assurance (QA) phantom has been designed to check the handling of MLC settings in treatment planning and delivery. Methods and materials: The phantom consists of a Perspex block with stepped edges that can be rotated in all planes. The design allows for the assessment of several MLC and micro-MLC types from various manufacturers, and is therefore applicable to most radiation therapy institutions employing MLCs. The phantom is scanned with computed tomography (CT) in the same way as a patient, and QA assessments can be made of field edge display for a variety of shapes and orientations on both radiation treatment planning systems (RTPS) and computed tomography simulators. Results: The dimensions of the phantom were verified to be physically correct within an uncertainty range of 0-0.7 mm. Errors in leaf position larger than 1 mm were easily identified by multiple observers. Conclusions: The MLC geometry phantom is a useful tool in the QA of radiation therapy, with application to RTPS, CT simulators, and virtual simulation packages with MLC display capabilities.

  6. Optimal Acceleration-Velocity-Bounded Trajectory Planning in Dynamic Crowd Simulation

    Directory of Open Access Journals (Sweden)

    Fu Yue-wen

    2014-01-01

    Full Text Available Creating complex and realistic crowd behaviors, such as pedestrian navigation behavior with dynamic obstacles, is a difficult and time-consuming task. In this paper, we study one special type of crowd which is composed of urgent individuals, normal individuals, and normal groups. We use three steps to construct the crowd simulation in a dynamic environment. In the first, the urgent individuals move forward along a given path around dynamic obstacles and other crowd members. An optimal acceleration-velocity-bounded trajectory planning method is used to model their behaviors, which ensures that the durations of the generated trajectories are minimal and that the urgent individuals are collision-free with dynamic obstacles (e.g., dynamic vehicles). In the second step, a pushing model is adopted to simulate the interactions between urgent members and normal ones, which ensures that the computational cost of the optimal trajectory planning is acceptable. The third step imitates the interactions among normal members using collision avoidance behavior and flocking behavior. Various simulation results demonstrate that these three steps produce realistic crowd phenomena similar to the real world.

  7. Algorithm for planning a double-jaw orthognathic surgery using a computer-aided surgical simulation (CASS) protocol. Part 1: planning sequence

    Science.gov (United States)

    Xia, J. J.; Gateno, J.; Teichgraeber, J. F.; Yuan, P.; Chen, K.-C.; Li, J.; Zhang, X.; Tang, Z.; Alfi, D. M.

    2015-01-01

    The success of craniomaxillofacial (CMF) surgery depends not only on the surgical techniques, but also on an accurate surgical plan. The adoption of computer-aided surgical simulation (CASS) has created a paradigm shift in surgical planning. However, planning an orthognathic operation using CASS differs fundamentally from planning using traditional methods. With this in mind, the Surgical Planning Laboratory of Houston Methodist Research Institute has developed a CASS protocol designed specifically for orthognathic surgery. The purpose of this article is to present an algorithm using virtual tools for planning a double-jaw orthognathic operation. This paper will serve as an operation manual for surgeons wanting to incorporate CASS into their clinical practice. PMID:26573562

  8. Preoperative simulation for the planning of microsurgical clipping of intracranial aneurysms.

    Science.gov (United States)

    Marinho, Paulo; Vermandel, Maximilien; Bourgeois, Philippe; Lejeune, Jean-Paul; Mordon, Serge; Thines, Laurent

    2014-12-01

    The safety and success of intracranial aneurysm (IA) surgery could be improved through the dedicated application of simulation covering the procedure from the 3-dimensional (3D) description of the surgical scene to the visual representation of the clip application. We aimed in this study to validate the technical feasibility and clinical relevance of such a protocol. All patients preoperatively underwent 3D magnetic resonance imaging and 3D computed tomography angiography to build 3D reconstructions of the brain, cerebral arteries, and surrounding cranial bone. These 3D models were segmented and merged using Osirix, a DICOM image processing application. This provided the surgical scene that was subsequently imported into Blender, a modeling platform for 3D animation. Digitized clips and appliers could then be manipulated in the virtual operative environment, allowing the visual simulation of clipping. This simulation protocol was assessed in a series of 10 IAs by 2 neurosurgeons. The protocol was feasible in all patients. The visual similarity between the surgical scene and the operative view was excellent in 100% of the cases, and the identification of the vascular structures was accurate in 90% of the cases. The neurosurgeons found the simulation helpful for planning the surgical approach (ie, the bone flap, cisternal opening, and arterial tree exposure) in 100% of the cases. The correct number of final clip(s) needed was predicted from the simulation in 90% of the cases. The preoperatively expected characteristics of the optimal clip(s) (ie, their number, shape, size, and orientation) were validated during surgery in 80% of the cases. This study confirmed that visual simulation of IA clipping based on the processing of high-resolution 3D imaging can be effective. This is a new and important step toward the development of a more sophisticated integrated simulation platform dedicated to cerebrovascular surgery.

  9. Comparing Pre- and Post-Operative Fontan Hemodynamic Simulations: Implications for the Reliability of Surgical Planning

    Science.gov (United States)

    Haggerty, Christopher M.; de Zélicourt, Diane A.; Restrepo, Maria; Rossignac, Jarek; Spray, Thomas L.; Kanter, Kirk R.; Fogel, Mark A.; Yoganathan, Ajit P.

    2012-01-01

    Background Virtual modeling of cardiothoracic surgery is a new paradigm that allows for systematic exploration of various operative strategies and uses engineering principles to predict the optimal patient-specific plan. This study investigates the predictive accuracy of such methods for the surgical palliation of single ventricle heart defects. Methods Computational fluid dynamics (CFD)-based surgical planning was used to model the Fontan procedure for four patients prior to surgery. The objective for each was to identify the operative strategy that best distributed hepatic blood flow to the pulmonary arteries. Post-operative magnetic resonance data were acquired to compare (via CFD) the post-operative hemodynamics with predictions. Results Despite variations in physiologic boundary conditions (e.g., cardiac output, venous flows) and the exact geometry of the surgical baffle, sufficient agreement was observed with respect to hepatic flow distribution (90% confidence interval, 14 ± 4.3% difference). There was also good agreement of flow-normalized energetic efficiency predictions (19 ± 4.8% error). Conclusions The hemodynamic outcomes of prospective patient-specific surgical planning of the Fontan procedure are described for the first time with good quantitative comparisons between preoperatively predicted and postoperative simulations. These results demonstrate that surgical planning can be a useful tool for single ventricle cardiothoracic surgery with the ability to deliver significant clinical impact. PMID:22777126

  10. Semi-automatic watershed medical image segmentation methods for customized cancer radiation treatment planning simulation

    International Nuclear Information System (INIS)

    Kum Oyeon; Kim Hye Kyung; Max, N.

    2007-01-01

    A cancer radiation treatment planning simulation requires image segmentation to define the gross tumor volume, clinical target volume, and planning target volume. Manual segmentation, the usual practice in clinical settings, depends on the operator's experience and may, in addition, vary from trial to trial even for the same operator. To overcome this difficulty, we developed semi-automatic watershed medical image segmentation tools using both the top-down watershed algorithm in the Insight Segmentation and Registration Toolkit (ITK) and Vincent-Soille's bottom-up watershed algorithm with region merging. We applied our algorithms to segment two- and three-dimensional head phantom CT data and to find pixel (or voxel) numbers for each segmented area, which are needed for radiation treatment optimization. A semi-automatic method is useful to avoid errors incurred by both human and machine sources, and provides clear and visible information for pedagogical purposes. (orig.)
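    The core of marker-based watershed segmentation is priority flooding: seeded regions grow outward, always claiming the lowest-intensity unlabeled neighbor next. A minimal 2D sketch (not the ITK or Vincent-Soille implementations, and without their region-merging step) on a toy image:

```python
import heapq
import numpy as np

def watershed(image, markers):
    """Marker-based watershed by priority flooding: unlabeled pixels are
    claimed by a neighboring labeled region, lowest intensity first."""
    labels = markers.copy()
    rows, cols = image.shape
    heap = []
    for r in range(rows):
        for c in range(cols):
            if labels[r, c]:
                heapq.heappush(heap, (image[r, c], r, c))
    while heap:
        _, r, c = heapq.heappop(heap)
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and labels[nr, nc] == 0:
                labels[nr, nc] = labels[r, c]   # claimed by the flooding region
                heapq.heappush(heap, (image[nr, nc], nr, nc))
    return labels

# Two flat basins separated by a high-intensity ridge, one seed marker each.
img = np.array([[0, 0, 5, 0, 0],
                [0, 0, 5, 0, 0],
                [0, 0, 5, 0, 0]], dtype=float)
seeds = np.zeros_like(img, dtype=int)
seeds[1, 0], seeds[1, 4] = 1, 2
segmented = watershed(img, seeds)
```

    The flood fills each basin with its seed's label before climbing the ridge, so the label boundary falls on the high-intensity column, which is what makes watershed attractive for delineating target volumes.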

  11. Simulation-Based Planning and Control of Transport Flows in Port Logistic Systems

    Directory of Open Access Journals (Sweden)

    Antonio Diogo Passos Lima

    2015-01-01

    Full Text Available In highly dynamic and uncertain transport conditions, transport transit time has to be continuously monitored so that the service level is ensured at a proper cost. The aim of this research is to propose and to test a procedure which allows agile planning and control of transport flows in port logistic systems. The procedure couples an agent-based simulation and a queueing theory model. In this paper, the transport scheduling performed by an agent at the intermodal terminal was taken into consideration. The decision-making agent takes into account data acquired at remote points of the system. The obtained results indicate the relevance of continuously considering, for transport planning and control, the expected transit time and further waiting times along port logistic systems.
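    The queueing-theory side of such a procedure can be illustrated with the standard M/M/1 formulas for expected waiting and transit (sojourn) times; the arrival and service rates below are illustrative, not values from the study:

```python
def mm1_times(arrival_rate, service_rate):
    """Steady-state M/M/1 metrics: utilization, mean wait in queue,
    and mean total time in system (waiting + service)."""
    if arrival_rate >= service_rate:
        raise ValueError("queue is unstable: arrival rate must be below service rate")
    rho = arrival_rate / service_rate               # utilization
    wq = rho / (service_rate - arrival_rate)        # mean waiting time in queue
    w = 1.0 / (service_rate - arrival_rate)         # mean time in system
    return rho, wq, w

# Example: trucks arriving at 4/h to a terminal gate that serves 5/h.
rho, wq, w = mm1_times(4.0, 5.0)   # rho = 0.8, wq = 0.8 h, w = 1.0 h
```

    Fed with arrival and service rates estimated from remote measurement points, such expressions give the expected transit time that the planning agent monitors against the target service level.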

  12. SU-E-J-72: Geant4 Simulations of Spot-Scanned Proton Beam Treatment Plans

    Energy Technology Data Exchange (ETDEWEB)

    Kanehira, T; Sutherland, K; Matsuura, T; Umegaki, K; Shirato, H [Hokkaido University, Sapporo, Hokkaido (Japan)

    2014-06-01

    Purpose: To evaluate density inhomogeneities which can affect dose distributions for real-time image gated spot-scanning proton therapy (RGPT), a dose calculation system, using treatment planning system VQA (Hitachi Ltd., Tokyo) spot position data, was developed based on Geant4. Methods: A Geant4 application was developed to simulate spot-scanned proton beams at Hokkaido University Hospital. A CT scan (0.98 × 0.98 × 1.25 mm) was performed for prostate cancer treatment with three or four inserted gold markers (diameter 1.5 mm, volume 1.77 mm3) in or near the target tumor. The CT data was read into VQA. A spot scanning plan was generated and exported to text files, specifying the beam energy and position of each spot. The text files were converted and read into our Geant4-based software. The spot position was converted into steering magnet field strength (in Tesla) for our beam nozzle. Individual protons were tracked from the vacuum chamber, through the helium chamber, steering magnets, dose monitors, etc., in a straight, horizontal line. The patient CT data was converted into materials with variable density and placed in a parametrized volume at the isocenter. Gold fiducial markers were represented in the CT data by two adjacent voxels (volume 2.38 mm3). 600,000 proton histories were tracked for each target spot. As one beam contained about 1,000 spots, approximately 600 million histories were recorded for each beam on a blade server. Two plans were considered: a two-beam horizontal opposed arrangement (90 and 270 degrees) and a three-beam arrangement (0, 90, and 270 degrees). Results: We are able to convert spot scanning plans from VQA and simulate them with our Geant4-based code. Our system can be used to evaluate the effect of dose reduction caused by gold markers used for RGPT. Conclusion: Our Geant4 application is able to calculate dose distributions for spot-scanned proton therapy.

  13. Faster-than-real-time robot simulation for plan development and robot safety

    International Nuclear Information System (INIS)

    Crane, C.D. III; Dalton, R.; Ogles, J.; Tulenko, J.S.; Zhou, X.

    1990-01-01

    The University of Florida, in cooperation with the Universities of Texas, Tennessee, and Michigan and Oak Ridge National Laboratory (ORNL), is developing an advanced robotic system for the US Department of Energy under the University Program for Robotics for Advanced Reactors. As part of this program, the University of Florida has been pursuing the development of a faster-than-real-time robotic simulation program for planning and control of mobile robotic operations to ensure the efficient and safe operation of mobile robots in nuclear power plants and other hazardous environments

  14. Mathematical simulation of dose fields in the planning of repair stuff irradiation

    International Nuclear Information System (INIS)

    Tashlykov, O.L.; Shcheklein, S.E.; Markelov, N.I.

    2004-01-01

    The role of the planning stage in the optimization cycle when organizing repair work at NPPs is discussed. The methods used for forecasting radiation doses to personnel engaged in repair work are considered. The importance of dose simulation problems, which involve estimating dose rate values at different points of the working area and the working time spent in the corresponding radiation fields, is shown. Calculated data on the distribution of γ radiation dose rate fields from surface and linear sources are given [ru]

  15. Multiplatform Mission Planning and Operations Simulation Environment for Adaptive Remote Sensors

    Science.gov (United States)

    Smith, G.; Ball, C.; O'Brien, A.; Johnson, J. T.

    2017-12-01

    We report on the design and development of mission simulator libraries to support the emerging field of adaptive remote sensors. We will outline the current state of the art in adaptive sensing, provide analysis of how the current approach to performing observing system simulation experiments (OSSEs) must be changed to enable adaptive sensors for remote sensing, and present an architecture to enable their inclusion in future OSSEs. The growing potential of sensors capable of real-time adaptation of their operational parameters calls for a new class of mission planning and simulation tools. Existing simulation tools used in OSSEs assume a fixed set of sensor parameters in terms of observation geometry, frequencies used, resolution, or observation time, which allows simplifications to be made in the simulation and allows sensor observation errors to be characterized a priori. Adaptive sensors may vary these parameters depending on the details of the scene observed, so that sensor performance is not simple to model without conducting OSSE simulations that include sensor adaptation in response to the varying observational environment. Adaptive sensors are of significance to resource-constrained, small satellite platforms because they enable the management of power and data volumes while providing methods for multiple sensors to collaborate. The new class of OSSEs required to utilize adaptive sensors located on multiple platforms must answer the question: If the physical act of sensing has a cost, how does the system determine if the science value of a measurement is worth the cost, and how should that cost be shared among the collaborating sensors? Here we propose to answer this question using an architecture structured around three modules: ADAPT, MANAGE and COLLABORATE. 
The ADAPT module is a set of routines to facilitate modeling of adaptive sensors, the MANAGE module will implement a set of routines to facilitate simulations of sensor resource management when power and data

  16. Production Planning with Respect to Uncertainties: Simulator-Based Production Planning of Medium-Sized Combined Heat and Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Haeggstaahl, Daniel [Maelardalen Univ., Vaesteraas (Sweden); Dotzauer, Erik [AB Fortum, Stockholm (Sweden)

    2004-12-01

    Production planning in Combined Heat and Power (CHP) systems is considered. The focus is on the development and use of mathematical models and methods. Different aspects of production planning are discussed, including weather and load predictions. Questions relevant to the different planning horizons are illuminated. The main purpose of short-term (one week) planning is to decide when to start and stop the production units, and to decide how to use the heat storage. The main conclusion from the outline of pros and cons of commercial planning software is that several packages use Mixed Integer Programming (MIP); in that sense they are similar. Building a production planning model means that the planning problem is formulated as a mathematical optimization problem. The accuracy of the input data determines the practical detail level of the model. Two alternatives to the methods used in today's commercial programs are proposed: stochastic optimization and simulator-based optimization. The basic concepts of mathematical optimization are outlined. A simulator-based model for short-term planning is developed. The purpose is to minimize the production costs, depending on the heat demand in the district heating system, prices of electricity and fuels, emission taxes and fees, etc. The problem is simplified by not including any time-linking conditions. The process model is developed in IPSEpro, a heat and mass-balance software from SimTech Simulation Technology. TOMLAB, an optimization toolbox in MATLAB, is used as the optimizer. Three different solvers are applied: glcFast, glcCluster and SNOPT. The link between TOMLAB and IPSEpro is accomplished using the Microsoft COM technology. MATLAB is the automation client and contains the control of IPSEpro and TOMLAB. The simulator-based model is applied to the CHP plant in Eskilstuna. Two days are chosen and analyzed. The optimized production is compared to the measured production. A sensitivity analysis on how variations in outdoor
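    With no time-linking conditions, each period can be planned independently: choose which units to run and how to load them so that heat demand is met at minimum cost. A brute-force sketch with hypothetical unit data (not the Eskilstuna plant, and omitting electricity revenue and heat storage):

```python
from itertools import product

# Hypothetical units: (name, min heat, max heat, startup cost, cost per unit heat).
UNITS = [("CHP", 10.0, 60.0, 500.0, 18.0),
         ("boiler", 0.0, 40.0, 50.0, 30.0)]

def dispatch(demand):
    """Single-period planning without time-linking conditions: enumerate
    on/off statuses, load running units in merit order, and return the
    cheapest feasible (cost, status, loads)."""
    best = None
    for status in product([0, 1], repeat=len(UNITS)):
        running = [u for u, s in zip(UNITS, status) if s]
        cap_min = sum(u[1] for u in running)
        cap_max = sum(u[2] for u in running)
        if not running or not (cap_min <= demand <= cap_max):
            continue
        load = {u[0]: u[1] for u in running}          # start at minimum load
        remaining = demand - cap_min
        for u in sorted(running, key=lambda u: u[4]): # cheapest energy first
            extra = min(u[2] - u[1], remaining)
            load[u[0]] += extra
            remaining -= extra
        cost = sum(u[3] + u[4] * load[u[0]] for u in running)
        if best is None or cost < best[0]:
            best = (cost, status, load)
    return best

plan = dispatch(50.0)   # the CHP unit alone covers this demand most cheaply
```

    Reintroducing start/stop costs across periods or heat storage links the periods together, which is exactly what pushes commercial tools toward MIP formulations.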

  17. Annealing relaxation of ultrasmall gold nanostructures

    Science.gov (United States)

    Chaban, Vitaly

    2015-01-01

    Besides serving as an excellent gift on proper occasions, gold finds applications in the life sciences, particularly in diagnostics and therapeutics. These applications were made possible by gold nanoparticles, which differ drastically from macroscopic gold. The versatile surface chemistry of gold nanoparticles allows coating with small molecules, polymers, and biological recognition molecules. Theoretical investigation of nanoscale gold is not trivial because of the numerous metastable states in these systems. Unlike previous studies, this work obtains equilibrium structures using annealing simulations within the recently introduced PM7-MD method. Geometries of the ultrasmall gold nanostructures with chalcogen coverage are described at finite temperature for the first time.

  18. Strategic energy planning: Modelling and simulating energy market behaviours using system thinking and systems dynamics principles

    International Nuclear Information System (INIS)

    Papageorgiou, George Nathaniel

    2005-01-01

    In the face of limited energy reserves and the global warming phenomenon, Europe is undergoing a transition from rapidly depleting fossil fuels to renewable unconventional energy sources. During this transition period, energy shortfalls will occur and energy prices will increase in an oscillating manner. As a result of the turbulence and dynamism that will accompany the transition period, energy analysts need new appropriate methods, techniques and tools in order to develop forecasts for the behaviour of energy markets, which would assist in long-term strategic energy planning and policy analysis. This paper reviews energy market behaviour as related to policy formation and, from a dynamic point of view, through the use of "systems thinking" and "system dynamics" principles, provides a framework for modelling the energy production and consumption process in relation to their environment. Thereby, effective energy planning can be developed via computerised simulation using policy experimentation. In a demonstration model depicted in this paper, it is shown that disasters resulting from seemingly attractive policies can be avoided by using simple computer simulation. (Author)
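    A system-dynamics model of this kind reduces to stocks, flows, and feedback loops integrated over time. A toy sketch with invented coefficients (not the paper's demonstration model): fossil reserves deplete, scarcity raises the price, and the higher price both curbs demand and accelerates renewable build-out — two balancing loops:

```python
def simulate_energy_market(years=50, dt=0.25):
    """Euler-integrated stock-and-flow sketch of an energy-market transition.
    Returns a history of (time, price, fossil reserves, renewable supply)."""
    reserves = 1000.0     # fossil stock (arbitrary units)
    renewables = 5.0      # renewable supply per year
    history = []
    t = 0.0
    while t < years:
        price = 50.0 * (1000.0 / max(reserves, 1.0)) ** 0.5  # scarcity pricing
        demand = 40.0 * (50.0 / price)                       # price-elastic demand
        fossil_use = max(0.0, demand - renewables)           # renewables used first
        reserves -= fossil_use * dt                          # stock depletion
        renewables += 0.02 * price * dt                      # investment response
        history.append((t, price, reserves, renewables))
        t += dt
    return history

hist = simulate_energy_market()
```

    Running the model shows the qualitative story of the abstract: prices rise as reserves fall, and the price signal itself drives the substitution toward renewables; policy experiments amount to changing the feedback coefficients and re-running the simulation.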

  19. The Traverse Planning Process for the Drats 2010 Analog Field Simulations

    Science.gov (United States)

    Horz, Friedrich; Gruener, John; Lofgren, Gary; Skinner, James A., Jr.; Graf, Jodi; Seibert, Marc

    2011-01-01

    Traverse planning concentrates on optimizing the science return within the overall objectives of planetary surface missions or their analog field simulations. Such simulations were conducted in the San Francisco Volcanic Field, northern Arizona, from Aug. 26 to Sept 17, 2010 and involved some 200 individuals in the field, with some 40 geoscientists composing the science team. The purpose of these Desert Research and Technology Studies (DRATS) is to exercise and evaluate developmental hardware, software and operational concepts in a mission-like, fully-integrated, setting under the direction of an onsite Mobile Mission Control Center(MMCC). DRATS 2010 focused on the simultaneous operation of 2 rovers, a historic first. Each vehicle was manned by an astronaut-commander and an experienced field geologist. Having 2 rovers and crews in the field mandated substantially more complex science and mission control operations compared to the single rover DRATS tests of 2008 and 2009, or the Apollo lunar missions. For instance, the science support function was distributed over 2 "back rooms", one for each rover, with both "tactical" teams operating independently and simultaneously during the actual traverses. Synthesis and integration of the daily findings and forward planning for the next day(s) was accomplished overnight by yet another "strategic" science team.

  20. Dose perturbation in the presence of metallic implants: treatment planning system versus Monte Carlo simulations

    International Nuclear Information System (INIS)

    Wieslander, Elinore; Knoeoes, Tommy

    2003-01-01

    An increasing number of patients receiving radiation therapy have metallic implants such as hip prostheses. Therefore, beams are normally set up to avoid irradiation through the implant; however, this cannot always be accomplished. In such situations, knowledge of the accuracy of the treatment planning system (TPS) used is required. Two algorithms, the pencil beam (PB) and the collapsed cone (CC), are implemented in the studied TPS. Comparisons are made with Monte Carlo simulations for 6 and 18 MV. The studied materials are steel, CoCrMo, Orthinox (a stainless steel alloy and registered trademark of Stryker Corporation), TiAlV and Ti. Monte Carlo simulated depth dose curves and dose profiles are compared to CC and PB calculated data. The CC algorithm shows overall better agreement with Monte Carlo than the PB algorithm. Thus, it is recommended to use the CC algorithm to get the most accurate dose calculation, both for the planning target volume and for tissues adjacent to the implants, when beams are set up to pass through implants.

  1. Quality assurance for online adapted treatment plans: Benchmarking and delivery monitoring simulation

    International Nuclear Information System (INIS)

    Li, Taoran; Wu, Qiuwen; Yang, Yun; Rodrigues, Anna; Yin, Fang-Fang; Jackie Wu, Q.

    2015-01-01

    Purpose: An important challenge facing online adaptive radiation therapy is the development of feasible and efficient quality assurance (QA). This project aimed to validate the deliverability of online adapted plans and develop a proof-of-concept online delivery monitoring system for online adaptive radiation therapy QA. Methods: The first part of this project benchmarked automatically online adapted prostate treatment plans using traditional portal dosimetry IMRT QA. The portal dosimetry QA results of online adapted plans were compared to original (unadapted) plans as well as randomly selected prostate IMRT plans from our clinic. In the second part, an online delivery monitoring system was designed and validated via a simulated treatment with intentional multileaf collimator (MLC) errors. This system was based on inputs from the dynamic machine information (DMI), which continuously reports actual MLC positions and machine monitor units (MUs) at intervals of 50 ms or less during delivery. Based on the DMI, the system performed two levels of monitoring/verification during the delivery: (1) dynamic monitoring of cumulative fluence errors resulting from leaf position deviations and visualization using fluence error maps (FEMs); and (2) verification of MLC positions against the treatment plan for potential errors in MLC motion and data transfer at each control point. Validation of the online delivery monitoring system was performed by introducing intentional systematic MLC errors (ranging from 0.5 to 2 mm) to the DMI files for both leaf banks. These DMI files were analyzed by the proposed system to evaluate the system’s performance in quantifying errors and revealing the source of errors, as well as to understand patterns in the FEMs. In addition, FEMs from 210 actual prostate IMRT beams were analyzed using the proposed system to further validate its ability to catch and identify errors, as well as establish error magnitude baselines for prostate IMRT delivery
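    The fluence error map idea — accumulating, over all control points, the difference between the aperture the plan prescribes and the aperture actually delivered — can be sketched as follows. The array shapes, field width, and the injected 2 mm error are illustrative assumptions, not the paper's DMI format:

```python
import numpy as np

def fluence_error_map(planned, actual, width=60):
    """Accumulate a per-leaf-row fluence error map from planned vs. delivered
    MLC positions. `planned`/`actual` have shape (n_control_points,
    n_leaf_pairs, 2), holding (left, right) leaf edges on a 1 mm grid."""
    n_cp, n_pairs, _ = planned.shape
    fem = np.zeros((n_pairs, width))
    x = np.arange(width)
    for cp in range(n_cp):
        for row in range(n_pairs):
            open_plan = (x >= planned[cp, row, 0]) & (x < planned[cp, row, 1])
            open_act = (x >= actual[cp, row, 0]) & (x < actual[cp, row, 1])
            # Delivered-minus-planned aperture, summed over control points.
            fem[row] += open_act.astype(float) - open_plan.astype(float)
    return fem

# Simulated delivery: every leaf on one bank is 2 mm off at all 10 control points.
planned = np.tile(np.array([[20.0, 40.0]]), (10, 5, 1))
actual = planned.copy()
actual[:, :, 0] += 2.0            # intentional systematic left-bank error
fem = fluence_error_map(planned, actual)
```

    A systematic error shows up as a consistent stripe of underdose along the offending bank's field edge, growing with each control point, which is what lets the monitoring system distinguish systematic MLC errors from random jitter.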

Online adapted plans were
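    The two-level monitoring described in this record can be sketched in a simplified 1D fluence model. The data layout (`planned_left`, `actual_left`, ...), grid resolution, and tolerance below are illustrative assumptions, not the authors' DMI format:

```python
# Sketch of the two-level delivery monitoring in a simplified 1D fluence model.
# Each control point records planned/actual leaf edges (mm) and its MU.

GRID_MM = 1.0       # fluence-map resolution along the leaf-travel axis
FIELD_MM = 100.0    # width of the modeled field

def segment_fluence(left, right, mu):
    """Fluence of one segment: mu deposited inside the open [left, right) gap."""
    n = int(FIELD_MM / GRID_MM)
    return [mu if left <= i * GRID_MM < right else 0.0 for i in range(n)]

def fluence_error_map(control_points):
    """Level 1: accumulate (actual - planned) fluence over all control points."""
    n = int(FIELD_MM / GRID_MM)
    fem = [0.0] * n
    for cp in control_points:
        planned = segment_fluence(cp["planned_left"], cp["planned_right"], cp["mu"])
        actual = segment_fluence(cp["actual_left"], cp["actual_right"], cp["mu"])
        for i in range(n):
            fem[i] += actual[i] - planned[i]
    return fem

def check_leaf_positions(control_points, tol_mm=0.5):
    """Level 2: flag control points whose leaf edges deviate beyond tol_mm."""
    flagged = []
    for k, cp in enumerate(control_points):
        dev = max(abs(cp["actual_left"] - cp["planned_left"]),
                  abs(cp["actual_right"] - cp["planned_right"]))
        if dev > tol_mm:
            flagged.append((k, dev))
    return flagged

# A 2 mm systematic error on one bank, as in the validation experiment:
cps = [dict(planned_left=30.0, planned_right=70.0,
            actual_left=32.0, actual_right=70.0, mu=10.0)]
fem = fluence_error_map(cps)
```

    A systematic shift of one bank shows up in the error map as a band of missing (negative) fluence at the field edge, while the control-point check reports the raw positional deviation.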

  4. Music playlist generation by adapted simulated annealing

    NARCIS (Netherlands)

    Pauws, S.C.; Verhaegh, W.F.J.; Vossen, M.P.H.

    2008-01-01

    We present the design of an algorithm for use in an interactive music system that automatically generates music playlists that fit the music preferences of a user. To this end, we introduce a formal model, define the problem of automatic playlist generation (APG), and prove its NP-hardness. We use a
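    A playlist generator in the spirit of this record can be sketched with simulated annealing over song attributes. The attribute model, cost function, and cooling schedule below are invented for illustration and are not the authors' formal APG model:

```python
import math
import random

# Illustrative simulated-annealing playlist generator: songs are (tempo BPM,
# energy) tuples; the cost penalizes distance from a preference plus abrupt
# tempo jumps between consecutive songs.

random.seed(1)
PREF = (120.0, 0.6)  # preferred tempo and energy

def cost(playlist):
    c = sum(abs(t - PREF[0]) / 100 + abs(e - PREF[1]) for t, e in playlist)
    c += sum(abs(playlist[i + 1][0] - playlist[i][0]) / 100
             for i in range(len(playlist) - 1))
    return c

def anneal(library, length, t0=1.0, cooling=0.995, steps=5000):
    state = random.sample(library, length)
    best, best_cost, temp = list(state), cost(state), t0
    for _ in range(steps):
        cand = list(state)
        # Neighborhood move: replace one slot with a random library song
        # (duplicates are allowed in this sketch).
        cand[random.randrange(length)] = random.choice(library)
        delta = cost(cand) - cost(state)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            state = cand
            if cost(state) < best_cost:
                best, best_cost = list(state), cost(state)
        temp *= cooling
    return best, best_cost

library = [(random.uniform(60, 200), random.uniform(0.0, 1.0)) for _ in range(200)]
playlist, final_cost = anneal(library, length=10)
```

    The annealing acceptance rule occasionally takes cost-increasing moves early on, which is what lets the search escape poor local playlists before the temperature decays.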

  5. Field sampling scheme optimization using simulated annealing

    CSIR Research Space (South Africa)

    Debba, Pravesh

    2010-10-01

    Full Text Available: silica (quartz, chalcedony, and opal) → alunite → kaolinite → illite → smectite → chlorite. Associated with this mineral alteration are high sulphidation gold deposits and low sulphidation base metal deposits. Gold mineralization is located... of vuggy (porous) quartz, opal and gray and black chalcedony veins. Vuggy quartz (porous quartz) is formed from extreme leaching of the host rock. It hosts high sulphidation gold mineralization and is evidence for a hypogene event. Alteration...

  6. Air atmosphere annealing effects on LSO:Ce crystal

    Czech Academy of Sciences Publication Activity Database

    Ding, D.; Feng, H.; Ren, G.; Nikl, Martin; Qin, L.; Pan, S.; Yang, F.

    2010-01-01

    Roč. 57, č. 3 (2010), s. 1272-1277 ISSN 0018-9499 R&D Projects: GA MŠk ME08034 Institutional research plan: CEZ:AV0Z10100521 Keywords : annealing * cerium * LSO * luminescence Subject RIV: BM - Solid Matter Physics ; Magnetism Impact factor: 1.519, year: 2010

  7. Simulation of heat exchanger network (HEN) and planning the optimum cleaning schedule

    International Nuclear Information System (INIS)

    Sanaye, Sepehr; Niroomand, Behzad

    2007-01-01

    Modeling and simulation of heat exchanger networks for estimating the amount of fouling, variations in overall heat transfer coefficient, and variations in outlet temperatures of hot and cold streams has a significant effect on production analysis. In this analysis, parameters such as the exchangers' types and arrangements, their heat transfer surface areas, mass flow rates of hot and cold streams, heat transfer coefficients and variations of fouling with time are required input data. The main goal is to find the variations of the outlet temperatures of the hot and cold streams with time to plan the optimum cleaning schedule of heat exchangers that provides the minimum operational cost or maximum amount of savings. In this paper, the simulation of heat exchanger networks is performed by choosing an asymptotic fouling function. Two main parameters in the asymptotic fouling formation model, i.e. the decay time of fouling formation (τ) and the asymptotic fouling resistance (R_f^∼), were obtained from empirical data as input parameters to the simulation relations. These data were extracted from the technical history sheets of the Khorasan Petrochemical Plant to guarantee the consistency between our model outputs and the real operating conditions. The output results of the software program developed, including the variations with time of the outlet temperatures of the hot and cold streams, the heat transfer coefficient and the heat transfer rate in the exchangers, are presented for two case studies. Then, an objective function (operational cost) was defined, and the optimal cleaning schedule of the HEN (heat exchanger network) in the Urea and Ammonia units was found by minimizing the objective function using a numerical search method. Based on this minimization procedure, the decision was made whether a heat exchanger should be cleaned or continue to operate. The final result was the most cost effective plan for the HEN cleaning schedule. The corresponding savings by
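    The asymptotic fouling model named in this record can be made concrete: R_f(t) = R_f^∼ (1 − e^(−t/τ)), with the fouled overall heat transfer coefficient following 1/U(t) = 1/U_clean + R_f(t). The numerical values below are illustrative, not the plant data from the paper:

```python
import math

# Asymptotic fouling: R_f(t) = R_f_inf * (1 - exp(-t / tau)); the fouled
# overall heat transfer coefficient then follows 1/U(t) = 1/U_clean + R_f(t).

U_CLEAN = 800.0    # W/(m^2 K), clean overall heat transfer coefficient
R_F_INF = 5.0e-4   # m^2 K/W, asymptotic fouling resistance
TAU = 90.0         # days, decay time of fouling formation

def fouling_resistance(t_days):
    return R_F_INF * (1.0 - math.exp(-t_days / TAU))

def u_fouled(t_days):
    return 1.0 / (1.0 / U_CLEAN + fouling_resistance(t_days))
```

    Cleaning resets R_f to zero, so the optimal-schedule search is a trade-off between this decaying heat-transfer performance and the cost of taking an exchanger out of service.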

  8. Simulation of heat exchanger network (HEN) and planning the optimum cleaning schedule

    Energy Technology Data Exchange (ETDEWEB)

    Sanaye, Sepehr [Energy Systems Improvement Laboratory, Mechanical Engineering Department, Iran University of Science and Technology (IUST), Narmak, Tehran 16488 (Iran, Islamic Republic of)]. E-mail: sepehr@iust.ac.ir; Niroomand, Behzad [Energy Systems Improvement Laboratory, Mechanical Engineering Department, Iran University of Science and Technology (IUST), Narmak, Tehran 16488 (Iran, Islamic Republic of)

    2007-05-15

    Modeling and simulation of heat exchanger networks for estimating the amount of fouling, variations in overall heat transfer coefficient, and variations in outlet temperatures of hot and cold streams has a significant effect on production analysis. In this analysis, parameters such as the exchangers' types and arrangements, their heat transfer surface areas, mass flow rates of hot and cold streams, heat transfer coefficients and variations of fouling with time are required input data. The main goal is to find the variations of the outlet temperatures of the hot and cold streams with time to plan the optimum cleaning schedule of heat exchangers that provides the minimum operational cost or maximum amount of savings. In this paper, the simulation of heat exchanger networks is performed by choosing an asymptotic fouling function. Two main parameters in the asymptotic fouling formation model, i.e. the decay time of fouling formation (τ) and the asymptotic fouling resistance (R_f^∼), were obtained from empirical data as input parameters to the simulation relations. These data were extracted from the technical history sheets of the Khorasan Petrochemical Plant to guarantee the consistency between our model outputs and the real operating conditions. The output results of the software program developed, including the variations with time of the outlet temperatures of the hot and cold streams, the heat transfer coefficient and the heat transfer rate in the exchangers, are presented for two case studies. Then, an objective function (operational cost) was defined, and the optimal cleaning schedule of the HEN (heat exchanger network) in the Urea and Ammonia units was found by minimizing the objective function using a numerical search method. Based on this minimization procedure, the decision was made whether a heat exchanger should be cleaned or continue to operate. The final result was the most cost effective plan for the HEN cleaning schedule. The

  9. MRI-based treatment plan simulation and adaptation for ion radiotherapy using a classification-based approach

    International Nuclear Information System (INIS)

    Rank, Christopher M; Tremmel, Christoph; Hünemohr, Nora; Nagel, Armin M; Jäkel, Oliver; Greilich, Steffen

    2013-01-01

    In order to benefit from the highly conformal irradiation of tumors in ion radiotherapy, sophisticated treatment planning and simulation are required. The purpose of this study was to investigate the potential of MRI for ion radiotherapy treatment plan simulation and adaptation using a classification-based approach. Firstly, a voxelwise tissue classification was applied to derive pseudo CT numbers from MR images using up to 8 contrasts. Appropriate MR sequences and parameters were evaluated in cross-validation studies of three phantoms. Secondly, ion radiotherapy treatment plans were optimized using both MRI-based pseudo CT and reference CT and recalculated on reference CT. Finally, a target shift was simulated and a treatment plan adapted to the shift was optimized on a pseudo CT and compared to reference CT optimizations without plan adaptation. The derivation of pseudo CT values led to mean absolute errors in the range of 81-95 HU. The most significant deviations appeared at borders between air and different tissue classes and originated from partial volume effects. Simulations of ion radiotherapy treatment plans using pseudo CT for optimization revealed only small underdosages in distal regions of a target volume, with deviations of the mean PTV dose of 1.4-3.1% compared to reference CT optimizations. A plan adapted to the target volume shift and optimized on the pseudo CT exhibited target dose coverage comparable to that of a non-adapted plan optimized on a reference CT. We were able to show that an MRI-based derivation of pseudo CT values using a purely statistical classification approach is feasible, although no physical relationship exists. Large errors appeared for compact bone classes and came from an imperfect distinction of bones and other tissue types in MRI. In simulations of treatment plans, it was demonstrated that these deviations are comparable to the uncertainties of a target volume shift of 2 mm in two directions, indicating that especially
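    The voxelwise classification step can be sketched with a nearest-centroid classifier mapping multi-contrast MR intensities to bulk CT numbers. The tissue classes, centroids, contrasts and HU values below are illustrative, not the study's trained classifier:

```python
# Sketch of a classification-based pseudo CT: each voxel's multi-contrast
# MR feature vector (here T1, T2, UTE) is assigned to the nearest tissue
# class centroid, which maps to a bulk CT number.

CLASSES = {
    "air":    ((0.05, 0.05, 0.05), -1000),
    "fat":    ((0.90, 0.60, 0.70),  -100),
    "muscle": ((0.55, 0.45, 0.60),    40),
    "bone":   ((0.20, 0.15, 0.80),   700),
}

def pseudo_ct(voxel):
    """Nearest-centroid tissue classification -> bulk HU."""
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    label = min(CLASSES, key=lambda c: dist2(voxel, CLASSES[c][0]))
    return CLASSES[label][1]

def mean_absolute_error(pred, ref):
    return sum(abs(p - r) for p, r in zip(pred, ref)) / len(ref)

voxels = [(0.04, 0.06, 0.05), (0.88, 0.62, 0.71), (0.22, 0.14, 0.79)]
reference_hu = [-980, -90, 650]   # hypothetical reference CT values
pred = [pseudo_ct(v) for v in voxels]
```

    The mean absolute error against a reference CT is then the same figure of merit the abstract reports (81-95 HU in the study); here it simply measures the bulk-assignment error on the toy voxels.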

  10. TU-A-304-02: Treatment Simulation, Planning and Delivery for SBRT

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Y.

    2015-06-15

    Increased use of SBRT and hypofractionation in radiation oncology practice has posed a number of challenges to medical physicists, ranging from planning, image-guided patient setup and on-treatment monitoring, to quality assurance (QA) and dose delivery. This symposium is designed to provide updated knowledge necessary for the safe and efficient implementation of SBRT on various linac platforms, including the emerging digital linacs equipped with high dose rate FFF beams. Issues related to 4D CT, PET and MRI simulations, 3D/4D CBCT guided patient setup, real-time image guidance during SBRT dose delivery using gated/un-gated VMAT or IMRT, and technical advancements in QA of SBRT (in particular, strategies dealing with high dose rate FFF beams) will be addressed. The symposium will help the attendees to gain a comprehensive understanding of the SBRT workflow and facilitate their clinical implementation of state-of-the-art imaging and planning techniques. Learning Objectives: Present background knowledge of SBRT, describe essential requirements for safe implementation of SBRT, and discuss issues specific to SBRT treatment planning and QA. Update on the use of multi-dimensional (3D and 4D) and multi-modality (CT, beam-level X-ray imaging, pre- and on-treatment 3D/4D MRI, PET, robotic ultrasound, etc.) imaging for reliable guidance of SBRT. Provide a comprehensive overview of emerging digital linacs and summarize the key geometric and dosimetric features of the new generation of linacs for substantially improved SBRT. Discuss treatment planning and quality assurance issues specific to SBRT. Research grant from Varian Medical Systems.

  11. Photon energy-modulated radiotherapy: Monte Carlo simulation and treatment planning study

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jong Min; Kim, Jung-in; Heon Choi, Chang; Chie, Eui Kyu; Kim, Il Han; Ye, Sung-Joon [Interdiciplinary Program in Radiation Applied Life Science, Seoul National University, Seoul, 110-744, Korea and Department of Radiation Oncology, Seoul National University Hospital, Seoul, 110-744 (Korea, Republic of); Interdiciplinary Program in Radiation Applied Life Science, Seoul National University, Seoul, 110-744 (Korea, Republic of); Department of Radiation Oncology, Seoul National University Hospital, Seoul, 110-744 (Korea, Republic of); Interdiciplinary Program in Radiation Applied Life Science, Seoul National University, Seoul, 110-744 (Korea, Republic of) and Department of Radiation Oncology, Seoul National University Hospital, Seoul, 110-744 (Korea, Republic of); Interdiciplinary Program in Radiation Applied Life Science, Seoul National University, Seoul, 110-744 (Korea, Republic of); Department of Radiation Oncology, Seoul National University Hospital, Seoul, 110-744 (Korea, Republic of) and Department of Intelligent Convergence Systems, Seoul National University, Seoul, 151-742 (Korea, Republic of)

    2012-03-15

    Purpose: To demonstrate the feasibility of photon energy-modulated radiotherapy during beam-on time. Methods: A cylindrical device made of aluminum was conceptually proposed as an energy modulator. The frame of the device was connected with 20 tubes through which mercury could be injected or drained to adjust the thickness of mercury along the beam axis. In Monte Carlo (MC) simulations, the flattening filter of a 6 or 10 MV linac was replaced with the device. The thickness of mercury inside the device varied from 0 to 40 mm at field sizes of 5 × 5 cm² (FS5), 10 × 10 cm² (FS10), and 20 × 20 cm² (FS20). At least 5 billion histories were followed for each simulation to create phase space files at 100 cm source to surface distance (SSD). In-water beam data were acquired by additional MC simulations using the above phase space files. A treatment planning system (TPS) was commissioned to generate a virtual machine using the MC-generated beam data. Intensity modulated radiation therapy (IMRT) plans for six clinical cases were generated using conventional 6 MV, 6 MV flattening filter free, and energy-modulated photon beams of the virtual machine. Results: With increasing mercury thickness, percentage depth doses (PDD) of modulated 6 and 10 MV beams beyond the depth of dose maximum increased continuously. The PDD increase at depths of 10 and 20 cm for modulated 6 MV was 4.8% and 5.2% at FS5, 3.9% and 5.0% at FS10, and 3.2%-4.9% at FS20 as the thickness of mercury increased from 0 to 20 mm. The same for modulated 10 MV was 4.5% and 5.0% at FS5, 3.8% and 4.7% at FS10, and 4.1% and 4.8% at FS20 as the thickness of mercury increased from 0 to 25 mm. The outputs of modulated 6 MV with 20 mm mercury and of modulated 10 MV with 25 mm mercury were reduced to 30% and 56% of those of the conventional linac, respectively. The energy-modulated IMRT plans had lower integral doses than the 6 MV IMRT or 6 MV flattening filter free plans for tumors located in the

  12. Eliminating Inconsistencies in Simulation and Treatment Planning Orders in Radiation Therapy

    International Nuclear Information System (INIS)

    Santanam, Lakshmi; Brame, Ryan S.; Lindsey, Andrew; Dewees, Todd; Danieley, Jon; Labrash, Jason; Parikh, Parag; Bradley, Jeffrey; Zoberi, Imran; Michalski, Jeff; Mutic, Sasa

    2013-01-01

    Purpose: To identify deficiencies with simulation and treatment planning orders and to develop corrective measures to improve safety and quality. Methods and Materials: At Washington University, the DMAIIC formalism is used for process management, whereby the process is understood as comprising Define, Measure, Analyze, Improve, Implement, and Control activities. Two complementary tools were used to provide quantitative assessments: failure modes and effects analysis and reported event data. The events were classified by the user according to severity. The event rates (ie, number of events divided by the number of opportunities to generate an event) related to simulation and treatment plan orders were determined. Results: We analyzed event data from the period 2008-2009 to design an intelligent SIMulation and treatment PLanning Electronic (SIMPLE) order system. Before implementation of SIMPLE, event rates of 0.16 (420 of 2558) for a group of physicians that were subsequently used as a pilot group and 0.13 (787 of 6023) for all physicians were obtained. An interdisciplinary group evaluated and decided to replace the Microsoft Word-based form with a Web-based order system. This order system has mandatory fields and context-sensitive logic, an ability to create templates, and enables an automated process for communication of orders through an enterprise management system. After the implementation of the SIMPLE order, the event rate decreased to 0.09 (96 of 1001) for the pilot group and to 0.06 (145 of 2140) for all physicians (P<.0001). The average time to complete the SIMPLE form was 3 minutes, as compared with 7 minutes for the Word-based form. The number of severe events decreased from 10.7% (45 of 420) and 12.1% (96 of 787) to 6.2% (6 of 96) and 10.3% (15 of 145) for the pilot group and all physicians, respectively. Conclusions: There was a dramatic reduction in the total and the number of potentially severe events through use of the SIMPLE system. In addition
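    The event rates quoted in this record are simple quotients (events divided by opportunities); reproducing them suggests the published figures truncate rather than round the second decimal. That is an observation from the arithmetic, not something stated by the authors:

```python
# Reproducing the event rates quoted in the record (events / opportunities).
def event_rate(events, opportunities):
    return events / opportunities

before_pilot = event_rate(420, 2558)   # pilot group, before SIMPLE
before_all = event_rate(787, 6023)     # all physicians, before SIMPLE
after_pilot = event_rate(96, 1001)     # pilot group, after SIMPLE
after_all = event_rate(145, 2140)      # all physicians, after SIMPLE

def truncate2(x):
    """Drop (not round) everything after the second decimal."""
    return int(x * 100) / 100

rates = [truncate2(r) for r in (before_pilot, before_all, after_pilot, after_all)]
```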

  13. Simulation of 3D-treatment plans in head and neck tumors aided by matching of digitally reconstructed radiographs (DRR) and on-line distortion corrected simulator images

    International Nuclear Information System (INIS)

    Lohr, Frank; Schramm, Oliver; Schraube, Peter; Sroka-Perez, Gabriele; Seeber, Steffen; Schlepple, Gerd; Schlegel, Wolfgang; Wannenmacher, Michael

    1997-01-01

    Background and purpose: Simulation of 3D-treatment plans for head and neck malignancy is difficult due to complex anatomy. Therefore, CT-simulation and stereotactic techniques are becoming more common in treatment preparation, overcoming the need for simulation. However, if simulation is still performed, it is an important step in the treatment preparation/execution chain, since simulation errors, if not detected immediately, can compromise the success of treatment. A recently developed PC-based system for on-line image matching and comparison of digitally reconstructed radiographs (DRR) and distortion corrected simulator monitor images, which enables instant correction of field placement errors during the simulation process, was evaluated. The range of field placement errors with non-computer-aided simulation is reported. Materials and methods: For 14 patients, either a primary 3D-treatment plan or a 3D-boost plan after initial treatment with opposing laterals for head and neck malignancy with a coplanar or non-coplanar two- or three-field technique was simulated. After determining the robustness of the matching process and the accuracy of field placement error detection with phantom measurements, DRRs were generated from the treatment planning CT-dataset of each patient and were interactively matched with on-line simulator images that had undergone correction for geometrical distortion, using a landmark algorithm. Translational field placement errors in all three planes as well as in-plane rotational errors were studied and corrected immediately. Results: The interactive matching process is very robust, with a tolerance of <2 mm when suitable anatomical landmarks are chosen. The accuracy of detection of translational errors in phantom measurements was <1 mm, and for in-plane rotational errors the accuracy had a maximum of only 1.5 deg. For patient simulation, the mean absolute distance of the planned versus simulated isocenter was 6.4 ± 3.9 mm. The in
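    A landmark-based rigid match between a DRR and a distortion-corrected simulator image can be sketched as a closed-form 2D least-squares fit of rotation plus translation. This is the generic Procrustes/Kabsch solution, not necessarily the authors' algorithm, and the landmark coordinates are invented:

```python
import math

def rigid_fit(src, dst):
    """Closed-form least-squares 2D rotation + translation mapping src -> dst."""
    n = len(src)
    csx = sum(p[0] for p in src) / n
    csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n
    cdy = sum(p[1] for p in dst) / n
    num = den = 0.0
    for (x, y), (u, v) in zip(src, dst):
        ax, ay = x - csx, y - csy     # centered source landmark
        bx, by = u - cdx, v - cdy     # centered destination landmark
        num += ax * by - ay * bx      # sum of cross products
        den += ax * bx + ay * by      # sum of dot products
    theta = math.atan2(num, den)
    c, s = math.cos(theta), math.sin(theta)
    tx = cdx - (c * csx - s * csy)
    ty = cdy - (s * csx + c * csy)
    return theta, tx, ty

# Recover a known transform: 1.5 deg in-plane rotation plus a 6.4 mm shift.
theta_true = math.radians(1.5)
c, s = math.cos(theta_true), math.sin(theta_true)
src = [(0.0, 0.0), (40.0, 5.0), (10.0, 30.0), (-20.0, 15.0)]
dst = [(c * x - s * y + 6.4, s * x + c * y) for x, y in src]
theta, tx, ty = rigid_fit(src, dst)
```

    Because the destination points here are an exact rigid transform of the source, the fit recovers the simulated rotation and shift to floating-point precision; with noisy landmarks it returns the least-squares estimate instead.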

  14. Secure environment for real-time tele-collaboration on virtual simulation of radiation treatment planning.

    Science.gov (United States)

    Ntasis, Efthymios; Maniatis, Theofanis A; Nikita, Konstantina S

    2003-01-01

    A secure framework is described for real-time tele-collaboration on Virtual Simulation procedure of Radiation Treatment Planning. An integrated approach is followed clustering the security issues faced by the system into organizational issues, security issues over the LAN and security issues over the LAN-to-LAN connection. The design and the implementation of the security services are performed according to the identified security requirements, along with the need for real time communication between the collaborating health care professionals. A detailed description of the implementation is given, presenting a solution, which can directly be tailored to other tele-collaboration services in the field of health care. The pilot study of the proposed security components proves the feasibility of the secure environment, and the consistency with the high performance demands of the application.

  15. Advanced Simulation and Computing Fiscal Year 2016 Implementation Plan, Version 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Archer, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Hendrickson, B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-27

    The Stockpile Stewardship Program (SSP) is an integrated technical program for maintaining the safety, surety, and reliability of the U.S. nuclear stockpile. The SSP uses nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of experimental facilities and programs, and the computational capabilities to support these programs. The purpose of this IP is to outline key work requirements to be performed and to control individual work activities within the scope of work. Contractors may not deviate from this plan without a revised WA or subsequent IP.

  16. Simulation as a planning tool for job-shop production environment

    Science.gov (United States)

    Maram, Venkataramana; Nawawi, Mohd Kamal Bin Mohd; Rahman, Syariza Abdul; Sultan, Sultan Juma

    2015-12-01

    In this paper, we use the discrete event simulation software ARENA® as a planning tool for a job shop production environment. We considered a job shop that produces three types of jigs with different sequences of operations, in order to study and improve shop floor performance. The purpose of the study is to identify options for improving machine utilization and reducing job waiting times at bottleneck machines. First, the performance of the existing system was evaluated using ARENA®, and improvement opportunities were identified by analyzing the base system results. Second, the model was updated with the most economical options. The proposed new system outperforms the current base system, with an 816% improvement in delay times at the paint shop (capacity increased from 2 to 3) and jig cycle times reduced by 92% for Jig 1, 65% for Jig 2 and 41% for Jig 3; hence, the new proposal was recommended.
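    The kind of job-shop model described here (jobs with distinct routings, FIFO queueing at machines, utilization and completion statistics) can be sketched without ARENA® as a minimal event-driven simulation. The routings, processing times and release times below are invented:

```python
import heapq
from collections import deque

# Minimal event-driven job-shop simulation. Each jig type has a routing of
# (machine, processing time) steps; one FIFO server per machine.

ROUTINGS = {
    "Jig1": [("cut", 4.0), ("paint", 6.0), ("assemble", 3.0)],
    "Jig2": [("cut", 5.0), ("paint", 4.0)],
    "Jig3": [("paint", 7.0), ("assemble", 2.0)],
}

def simulate(jobs):
    """jobs: list of (release_time, jig_type).
    Returns (completion time per job, utilization per machine)."""
    events, seq = [], 0                      # (time, seq, kind, job, step)
    for i, (release, _) in enumerate(jobs):
        heapq.heappush(events, (release, seq, "arrive", i, 0))
        seq += 1
    queues, idle, busy, done = {}, {}, {}, {}
    while events:
        now, _, kind, job, step = heapq.heappop(events)
        routing = ROUTINGS[jobs[job][1]]
        if kind == "arrive":                 # job reaches machine of `step`
            machine = routing[step][0]
            queues.setdefault(machine, deque()).append((job, step))
            idle.setdefault(machine, True)
        else:                                # "depart": step finished
            machine = routing[step][0]
            idle[machine] = True
            if step + 1 < len(routing):
                heapq.heappush(events, (now, seq, "arrive", job, step + 1))
                seq += 1
            else:
                done[job] = now
        for m, q in queues.items():          # start service where possible
            if idle[m] and q:
                j, s = q.popleft()
                dur = ROUTINGS[jobs[j][1]][s][1]
                idle[m] = False
                busy[m] = busy.get(m, 0.0) + dur
                heapq.heappush(events, (now + dur, seq, "depart", j, s))
                seq += 1
    makespan = max(done.values())
    return done, {m: b / makespan for m, b in busy.items()}

jobs = [(0.0, "Jig1"), (0.0, "Jig2"), (1.0, "Jig3"), (2.0, "Jig1")]
done, util = simulate(jobs)
```

    Even on this toy instance the paint machine emerges as the bottleneck (highest utilization), which mirrors the role the paint shop plays in the record's findings.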

  17. A review of computer-aided oral and maxillofacial surgery: planning, simulation and navigation.

    Science.gov (United States)

    Chen, Xiaojun; Xu, Lu; Sun, Yi; Politis, Constantinus

    2016-11-01

    Currently, oral and maxillofacial surgery (OMFS) still poses a significant challenge for surgeons due to the anatomic complexity and limited field of view of the oral cavity. With the great development of computer technologies, computer-aided surgery has been widely used for minimizing the risks and improving the precision of surgery. Areas covered: The major goal of this paper is to provide a comprehensive reference source on the current and future development of computer-aided OMFS, including surgical planning, simulation and navigation, for relevant researchers. Expert commentary: Compared with traditional OMFS, computer-aided OMFS overcomes the disadvantage that treatment of the anatomically complex maxillofacial region depends almost exclusively on the experience of the surgeon.

  18. Commissioning and quality control of a dedicated wide bore 3T MRI simulator for radiotherapy planning

    Directory of Open Access Journals (Sweden)

    Aitang Xing

    2016-06-01

    Full Text Available Purpose: The purpose of this paper is to describe a practical approach to commissioning and quality assurance (QA) of a dedicated wide-bore 3 Tesla (3T) magnetic resonance imaging (MRI) scanner for radiotherapy planning. Methods: A comprehensive commissioning protocol focusing on radiotherapy (RT) specific requirements was developed and performed. RT specific tests included: uniformity characteristics of the radio-frequency (RF) coils, couch top attenuation, geometric distortion, laser and couch movement, and an end-to-end radiotherapy treatment planning test. General tests for overall system performance and safety measurements were also performed. Results: The use of pre-scan based intensity correction increased the uniformity from 61.7% to 97% (body flexible coil), from 50% to 90% (large flexible coil) and from 51% to 98% (small flexible coil). The RT flat top couch decreased the signal-to-noise ratio (SNR) by an average of 42%. The mean and maximum geometric distortion was found to be 1.25 mm and 4.08 mm for three-dimensional (3D) corrected image acquisition, and 2.07 mm and 7.88 mm for two-dimensional (2D) corrected image acquisition, over a 500 mm × 375 mm × 252 mm field of view (FOV). The accuracy of the laser and couch movement was less than ±1 mm. The standard deviation of registration parameters for the end-to-end test was less than 0.41 mm. An on-going QA program was developed to monitor the system's performance. Conclusion: A number of RT specific tests have been described for commissioning and subsequent performance monitoring of a dedicated MRI simulator (MRI-Sim). These tests have been important in establishing and maintaining its operation for RT planning.
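    The uniformity figures quoted can be computed with the standard percent integral uniformity formula, PIU = 100 × (1 − (S_max − S_min)/(S_max + S_min)). That this exact metric was used in the paper is an assumption, and the ROI signals below are invented:

```python
# Percent integral uniformity (PIU) as commonly defined for MRI QA:
# PIU = 100 * (1 - (S_max - S_min) / (S_max + S_min)).

def percent_integral_uniformity(signal):
    s_max, s_min = max(signal), min(signal)
    return 100.0 * (1.0 - (s_max - s_min) / (s_max + s_min))

before = [520, 680, 900, 1100, 1240]   # ROI signals before intensity correction
after = [980, 1000, 1010, 1020, 1015]  # after pre-scan based correction
```

    Narrowing the spread of ROI signals (as pre-scan intensity correction does) drives the metric toward 100%, matching the direction of the improvements reported for the three coils.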

  19. A study on a comparative analysis of 2D and 3D planning using CT simulator for transbronchial brachytherapy

    International Nuclear Information System (INIS)

    Seo, Dong Rin; Kim, Dae Sup; Back, Geum Mun

    2013-01-01

    In transbronchial brachytherapy, two-dimensional treatment planning makes it difficult to identify the location of the tumor in the affected area. In this study, we performed a comparative analysis of patient treatment planning using a CT simulator. The analysis covered patients who visited the hospital up to June 2012. CT images were acquired with the CT simulator, and two-dimensional and three-dimensional treatment plans were compared using the Oncentra Brachy planning system (Nucletron, Netherlands). The location of the catheter was confirmed at each treatment planning session for fractionated transbronchial brachytherapy. GTV volumes were 3.5 cm³ and 3.3 cm³. The dose distribution of the tumor was also easy to determine, and dose delivery errors were confirmed from the dose distribution of the prescribed dose for the GTV: 92% for the first treatment and 88% for the second. To compensate for the limitations of two-dimensional treatment planning, a process for the accurate identification and analysis of the treatment volume and dose distribution needs to be tested. A process that quantitatively determines the dose delivery error and reflects it in the treatment planning is required.

  20. A study on a comparative analysis of 2D and 3D planning using CT simulator for transbronchial brachytherapy

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Dong Rin; Kim, Dae Sup; Back, Geum Mun [Dept. of Radiation Oncology, Asan Medical Center, Seoul (Korea, Republic of)

    2013-03-15

    In transbronchial brachytherapy, two-dimensional treatment planning makes it difficult to identify the location of the tumor in the affected area. In this study, we performed a comparative analysis of patient treatment planning using a CT simulator. The analysis covered patients who visited the hospital up to June 2012. CT images were acquired with the CT simulator, and two-dimensional and three-dimensional treatment plans were compared using the Oncentra Brachy planning system (Nucletron, Netherlands). The location of the catheter was confirmed at each treatment planning session for fractionated transbronchial brachytherapy. GTV volumes were 3.5 cm³ and 3.3 cm³. The dose distribution of the tumor was also easy to determine, and dose delivery errors were confirmed from the dose distribution of the prescribed dose for the GTV: 92% for the first treatment and 88% for the second. To compensate for the limitations of two-dimensional treatment planning, a process for the accurate identification and analysis of the treatment volume and dose distribution needs to be tested. A process that quantitatively determines the dose delivery error and reflects it in the treatment planning is required.

  1. Assembly Line Productivity Assessment by Comparing Optimization-Simulation Algorithms of Trajectory Planning for Industrial Robots

    Directory of Open Access Journals (Sweden)

    Francisco Rubio

    2015-01-01

    Full Text Available In this paper an analysis of productivity is carried out based on the resolution of the trajectory planning problem for industrial robots. The analysis entails economic considerations, thus overcoming some limitations of the existing literature. Two methodologies based on optimization-simulation procedures are compared to calculate the time needed to perform an industrial robot task. The simulation methodology relies on the use of the robotics and automation software GRASP. The optimization methodology developed in this work is based on the kinematics and the dynamics of industrial robots. It allows us to pose a multiobjective optimization problem to assess the trade-offs between the economic variables by means of Pareto fronts. The comparison is carried out for different examples and from a multidisciplinary point of view, to determine the impact of using each method. Results show the opportunity costs of not using the methodology with optimized time trajectories. Furthermore, it allows companies to stay competitive thanks to quick adaptation to rapidly changing markets.
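    Assessing trade-offs by means of Pareto fronts can be illustrated generically: given candidate trajectories scored on two objectives (say cycle time and energy cost, both to be minimized), keep the non-dominated ones. The candidate values below are invented:

```python
# Generic Pareto-front extraction for a two-objective (minimize both)
# trade-off, e.g. (cycle time, energy cost) per candidate trajectory.

def pareto_front(points):
    front = []
    for p in points:
        dominated = any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)
        if not dominated:
            front.append(p)
    return sorted(front)   # note: exact duplicate points would eliminate each other

candidates = [(2.0, 9.0), (2.5, 7.0), (3.0, 5.0),
              (3.1, 6.0), (4.0, 4.5), (2.2, 10.0)]
front = pareto_front(candidates)
```

    Points off the front, such as (3.1, 6.0) here, are strictly worse than some alternative in both objectives, which is exactly the "opportunity cost" the study quantifies for non-optimized trajectories.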

  2. An optimization algorithm for simulation-based planning of low-income housing projects

    Directory of Open Access Journals (Sweden)

    Mohamed M. Marzouk

    2010-10-01

    Full Text Available Construction of low-income housing projects is a replicated process and is associated with uncertainties that arise from the unavailability of resources. Government agencies and/or contractors have to select a construction system that meets low-income housing projects constraints including project conditions, technical, financial and time constraints. This research presents a framework, using computer simulation, which aids government authorities and contractors in the planning of low-income housing projects. The proposed framework estimates the time and cost required for the construction of low-income housing using pre-cast hollow core with hollow blocks bearing walls. Five main components constitute the proposed framework: a network builder module, a construction alternative selection module, a simulation module, an optimization module and a reporting module. An optimization module utilizing a genetic algorithm enables the defining of different options and ranges of parameters associated with low-income housing projects that influence the duration and total cost of the pre-cast hollow core with hollow blocks bearing walls method. A computer prototype, named LIHouse_Sim, was developed in MS Visual Basic 6.0 as proof of concept for the proposed framework. A numerical example is presented to demonstrate the use of the developed framework and to illustrate its essential features.
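    The genetic-algorithm optimization module described here can be sketched generically: choose one construction method per work package so as to minimize total cost under a project-duration cap. The options, costs, and GA parameters below are invented and this is not the LIHouse_Sim model itself:

```python
import random

# GA sketch for construction-alternative selection: one method per work
# package; minimize cost with a penalty for exceeding the duration cap.

random.seed(7)

OPTIONS = [  # per work package: [(duration, cost), ...] for each method
    [(10, 50), (8, 70), (6, 95)],
    [(12, 40), (9, 60), (7, 85)],
    [(15, 30), (11, 55), (8, 80)],
    [(9, 45), (7, 65), (5, 90)],
]
MAX_DURATION = 38

def fitness(chrom):
    dur = sum(OPTIONS[i][g][0] for i, g in enumerate(chrom))
    cost = sum(OPTIONS[i][g][1] for i, g in enumerate(chrom))
    return cost + 1000 * max(0, dur - MAX_DURATION)   # penalize infeasibility

def evolve(pop_size=30, generations=60, mutation=0.1):
    pop = [[random.randrange(3) for _ in OPTIONS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]        # elitist selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, len(OPTIONS))
            child = a[:cut] + b[cut:]         # one-point crossover
            if random.random() < mutation:
                child[random.randrange(len(OPTIONS))] = random.randrange(3)
            children.append(child)
        pop = parents + children
    best = min(pop, key=fitness)
    return best, fitness(best)

best, best_cost = evolve()
```

    The large constant in the penalty term makes any schedule over the duration cap costlier than every feasible one, so the elitist GA settles on a feasible low-cost selection.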

  3. Simulation model for improved production planning and control through quality, cycle time and batch size management

    Directory of Open Access Journals (Sweden)

    Kotevski Živko

    2015-01-01

    Full Text Available Production planning and control (PPC) systems are the basis of all production facilities. In today's environment, a good PPC system generates many benefits for a company, and an excellent PPC system provides great competitive advantage and serious cost reduction in many fields. To reach the point of having an excellent PPC system, companies turn more and more to the newest software tools, simulation being one example. Given today's advanced computer technology, using simulation in this area gives companies a strong asset when dealing with different kinds of waste, delays, overstock, bottlenecks and loss of time in general. This model is applicable in almost all production facilities. Taking into account the different scrap percentages for the pieces that form the end product, a detailed model and analysis were made in order to determine the optimal starting parameters. First, all the conditions of the company were determined and a conceptual model was created along with all assumptions. The model was then verified and validated, and finally a cost-benefit analysis was conducted in order to obtain clear results.

  4. Management of the Bohunice RPVs annealing procedures

    International Nuclear Information System (INIS)

    Repka, M.

    1994-01-01

    This paper describes the realization in 1993 of the annealing regeneration programme for the RPVs of units 1 and 2 of NPP V-1 (EBO). The following steps are described in detail: the preparatory works, the annealing procedure realization schedule and safety management, starting from zero conditions through assembly of the annealing apparatus, the annealing procedure itself, cooling down, and disassembly of the annealing apparatus. Finally, the annealing programmes of both RPVs, including the dosimetry measurements, are discussed and evaluated. (author). 3 figs

  5. A bi-level integrated generation-transmission planning model incorporating the impacts of demand response by operation simulation

    International Nuclear Information System (INIS)

    Zhang, Ning; Hu, Zhaoguang; Springer, Cecilia; Li, Yanning; Shen, Bo

    2016-01-01

    Highlights: • We put forward a novel bi-level integrated power system planning model. • Generation expansion planning and transmission expansion planning are combined. • The effects of two sorts of demand response in reducing peak load are considered. • Operation simulation is conducted to reflect the actual effects of demand response. • The interactions between the two levels can guarantee a reasonably optimal result. - Abstract: If resources on the power supply side, in the transmission system, and on the demand side are all considered together, an expansion scheme that is optimal from the perspective of the whole system can be achieved. In this paper, generation expansion planning and transmission expansion planning are combined into one model. Moreover, the effects of demand response in reducing peak load are taken into account in the planning model, which can cut back the generation expansion capacity and transmission expansion capacity. Existing approaches to considering demand response in planning tend to overestimate its impact on peak load reduction: they usually focus on power reduction at the moment of peak load, without considering situations in which the load at another moment may unexpectedly become the new peak due to demand response. These situations are analyzed in this paper. Accordingly, a novel approach to incorporating demand response in a planning model is proposed, utilizing a modified unit commitment model with demand response. The planning model is thereby a bi-level model with interactions between generation-transmission expansion planning and operation simulation, to reflect the actual effects of demand response and find a reasonably optimal planning result.

  6. Quantum annealing versus classical machine learning applied to a simplified computational biology problem

    Science.gov (United States)

    Li, Richard Y.; Di Felice, Rosa; Rohs, Remo; Lidar, Daniel A.

    2018-01-01

    Transcription factors regulate gene expression, but how these proteins recognize and specifically bind to their DNA targets is still debated. Machine learning models are effective means to reveal interaction mechanisms. Here we studied the ability of a quantum machine learning approach to predict binding specificity. Using simplified datasets of a small number of DNA sequences derived from actual binding affinity experiments, we trained a commercially available quantum annealer to classify and rank transcription factor binding. The results were compared to state-of-the-art classical approaches for the same simplified datasets, including simulated annealing, simulated quantum annealing, multiple linear regression, LASSO, and extreme gradient boosting. Despite technological limitations, we find a slight advantage in classification performance and nearly equal ranking performance using the quantum annealer for these fairly small training data sets. Thus, we propose that quantum annealing might be an effective method to implement machine learning for certain computational biology problems. PMID:29652405

  7. Quantum annealing versus classical machine learning applied to a simplified computational biology problem

    Science.gov (United States)

    Li, Richard Y.; Di Felice, Rosa; Rohs, Remo; Lidar, Daniel A.

    2018-03-01

    Transcription factors regulate gene expression, but how these proteins recognize and specifically bind to their DNA targets is still debated. Machine learning models are effective means to reveal interaction mechanisms. Here we studied the ability of a quantum machine learning approach to classify and rank binding affinities. Using simplified data sets of a small number of DNA sequences derived from actual binding affinity experiments, we trained a commercially available quantum annealer to classify and rank transcription factor binding. The results were compared to state-of-the-art classical approaches for the same simplified data sets, including simulated annealing, simulated quantum annealing, multiple linear regression, LASSO, and extreme gradient boosting. Despite technological limitations, we find a slight advantage in classification performance and nearly equal ranking performance using the quantum annealer for these fairly small training data sets. Thus, we propose that quantum annealing might be an effective method to implement machine learning for certain computational biology problems.

  8. Planning intensive care unit design using computer simulation modeling: optimizing integration of clinical, operational, and architectural requirements.

    Science.gov (United States)

    O'Hara, Susan

    2014-01-01

    Nurses have increasingly been regarded as critical members of the planning team as architects recognize their knowledge and value. But the nurses' role as knowledge experts can be expanded to leading efforts to integrate the clinical, operational, and architectural expertise through simulation modeling. Simulation modeling allows for the optimal merge of multifactorial data to understand the current state of the intensive care unit and predict future states. Nurses can champion the simulation modeling process and reap the benefits of a cost-effective way to test new designs, processes, staffing models, and future programming trends prior to implementation. Simulation modeling is an evidence-based planning approach, a standard, for integrating the sciences with real client data, to offer solutions for improving patient care.

  9. Chaotic Multiquenching Annealing Applied to the Protein Folding Problem

    Directory of Open Access Journals (Sweden)

    Juan Frausto-Solis

    2014-01-01

    Full Text Available The Chaotic Multiquenching Annealing algorithm (CMQA) is proposed. CMQA is a new algorithm applied to the protein folding problem (PFP). The algorithm is divided into three phases: (i) the multiquenching phase (MQP), (ii) the annealing phase (AP), and (iii) the dynamical equilibrium phase (DEP). MQP enforces several stages of quick quenching processes that include chaotic functions. The chaotic functions can increase the exploration potential of the solution space of PFP. The AP implements a simulated annealing algorithm (SA) with an exponential cooling function. MQP and AP are delimited by different ranges of temperatures: MQP is applied over a range of temperatures from extremely high to very high values, while AP searches for solutions over a range of temperatures from high to extremely low values. The DEP finds the equilibrium in a dynamic way by applying the least squares method. CMQA is tested with several instances of PFP.
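The AP component described above is standard simulated annealing with exponential (geometric) cooling. A textbook-style sketch of that loop, purely illustrative and not the authors' CMQA code, with hypothetical parameter defaults:

```python
import math
import random

def simulated_annealing(energy, neighbour, x0, t0=10.0, alpha=0.95,
                        steps=2000, seed=1):
    """Plain SA with exponential cooling T_k = t0 * alpha**k.

    `energy` maps a state to its cost; `neighbour(x, rng)` proposes a
    perturbed state. Worse moves are accepted with the Metropolis
    probability exp(-dE / T), which shrinks as T cools."""
    rng = random.Random(seed)
    x, e = x0, energy(x0)
    best, best_e = x, e
    t = t0
    for _ in range(steps):
        y = neighbour(x, rng)
        ey = energy(y)
        if ey < e or rng.random() < math.exp(-(ey - e) / t):
            x, e = y, ey
            if e < best_e:
                best, best_e = x, e
        t = max(t * alpha, 1e-12)  # exponential cooling, floored
    return best, best_e
```

For PFP, the state would be a conformation and `energy` a folding potential; the sketch works for any state representation with those two callables.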

  10. Using ProModel as a simulation tool to assist plant layout design and planning: Case study plastic packaging factory

    OpenAIRE

    Pochamarn Tearwattanarattikal; Suwadee Namphacharoen; Chonthicha Chamrasporn

    2008-01-01

    This study concerns the application of a simulation model to assist decision making on expanding capacity and on plant layout design and planning. The plant layout design concept is developed first to create the physical layouts; the simulation model is then used to test the capability of the plant to meet various demand forecast scenarios. The study employed the ProModel package as a tool, using the model to compare the performances in terms of % utilization, characteristics of WIP and ability to meet due dates....

  11. Simulation in Pre-departure Training for Residents Planning Clinical Work in a Low-Income Country

    Directory of Open Access Journals (Sweden)

    Kevin R. Schwartz

    2015-12-01

    Full Text Available Introduction: Increasingly, pediatric and emergency medicine (EM) residents are pursuing clinical rotations in low-income countries. Optimal pre-departure preparation for such rotations has not yet been established. High-fidelity simulation represents a potentially effective modality for such preparation. This study was designed to assess whether a pre-departure high-fidelity medical simulation curriculum is effective in helping to prepare residents for clinical rotations in a low-income country. Methods: 43 pediatric and EM residents planning clinical rotations in Liberia, West Africa, participated in a simulation-based curriculum focused on severe pediatric malaria and malnutrition and were then assessed by survey at three time points: pre-simulation, post-simulation, and after returning from work abroad. Results: Prior to simulation, 1/43 (2%) participants reported they were comfortable with the diagnosis and management of severe malnutrition; this increased to 30/42 (71%) after simulation and 24/31 (77%) after working abroad. Prior to simulation, 1/43 (2%) of residents reported comfort with the diagnosis and management of severe malaria; this increased to 26/42 (62%) after simulation and 28/31 (90%) after working abroad. 36/42 (86%) of residents agreed that a simulation-based global health curriculum is more useful than a didactic curriculum alone, and 41/42 (98%) felt a simulator-based curriculum should be offered to all residents planning a clinical trip to a low-income country. Conclusion: High-fidelity simulation is effective in increasing residents’ self-rated comfort in management of pediatric malaria and malnutrition, and a majority of participating residents feel it should be included as a component of pre-departure training for all residents rotating clinically to low-income countries.

  12. Kinetics of annealing of irradiated surveillance pressure vessel steel

    International Nuclear Information System (INIS)

    Harvey, D.J.; Wechsler, M.S.

    1982-01-01

    Indentation hardness measurements as a function of annealing were made on broken halves of Charpy impact surveillance samples. The samples had been irradiated in commercial power reactors to a neutron fluence of approximately 1 × 10¹⁸ neutrons per cm², E > 1 MeV, at a temperature of about 300 °C (570 °F). Results are reported for the weld metal, which showed greater radiation hardening than the base plate or heat-affected zone material. Isochronal and isothermal anneals were conducted on the irradiated surveillance samples and on unirradiated control samples. No hardness changes upon annealing occurred for the control samples. The recovery in hardness for the irradiated samples took place mostly between 400 and 500 °C. Based on the Meechan-Brinkman method of analysis, the activation energy for annealing was found to be 0.60 ± 0.06 eV. According to computer simulation calculations of Beeler, the activation energy for migration of vacancies in alpha iron is about 0.67 eV. Therefore, the results of this preliminary study appear to be consistent with a mechanism of annealing of radiation damage in pressure vessel steels based on the migration of radiation-produced lattice vacancies
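The Meechan-Brinkman analysis itself is not reproduced in the abstract, but the underlying Arrhenius relation, an annealing rate proportional to exp(-E_a / k_B T), means an activation energy can be estimated from the slope of ln(rate) versus 1/(k_B T). A sketch using synthetic data, not the surveillance measurements:

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def activation_energy(temps_k, rates):
    """Estimate E_a (eV) as minus the least-squares slope of
    ln(rate) against 1/(k_B * T), per the Arrhenius relation."""
    xs = [1.0 / (K_B * t) for t in temps_k]
    ys = [math.log(r) for r in rates]
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    slope = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
             / sum((x - xbar) ** 2 for x in xs))
    return -slope
```

With rates generated from a known 0.60 eV barrier, the fit recovers that value exactly, which is a useful sanity check before applying such a fit to real isothermal-anneal data.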

  13. Non-stoquastic Hamiltonians in quantum annealing via geometric phases

    Science.gov (United States)

    Vinci, Walter; Lidar, Daniel A.

    2017-09-01

    We argue that a complete description of quantum annealing implemented with continuous variables must take into account the non-adiabatic Aharonov-Anandan geometric phase that arises when the system Hamiltonian changes during the anneal. We show that this geometric effect leads to the appearance of non-stoquasticity in the effective quantum Ising Hamiltonians that are typically used to describe quantum annealing with flux qubits. We explicitly demonstrate the effect of this geometric non-stoquasticity when quantum annealing is performed with a system of one and two coupled flux qubits. The realization of non-stoquastic Hamiltonians has important implications from a computational complexity perspective, since it is believed that in many cases quantum annealing with stoquastic Hamiltonians can be efficiently simulated via classical algorithms such as Quantum Monte Carlo. It is well known that the direct implementation of non-stoquastic Hamiltonians with flux qubits is particularly challenging. Our results suggest an alternative path for the implementation of non-stoquasticity via geometric phases that can be exploited for computational purposes.

  14. Latent interface-trap buildup in power VDMOSFETs: new experimental evidence and numerical simulation

    International Nuclear Information System (INIS)

    Ristic, G.F.; Jaksic, A.B.; Pejovic, M.M.

    1999-01-01

    The paper presents new experimental evidence of the latent interface-trap buildup during annealing of gamma-ray irradiated power VDMOSFETs. We try to reveal the nature of this still ill-understood phenomenon by isothermal annealing, switching temperature annealing and switching bias annealing experiments. The results of numerical simulation of interface-trap kinetics during annealing are also shown. (authors)

  15. Effects of optical interference and annealing on the performance of poly (3-hexylthiophene): fullerene based solar cells

    International Nuclear Information System (INIS)

    Hai-Long, You; Chun-Fu, Zhang

    2009-01-01

    In this paper, the effects of optical interference and annealing on the performance of P3HT:PCBM based organic solar cells are studied in detail. Due to the optical interference effect, the short-circuit current density (J_SC) shows obvious oscillatory behaviour with the variation of active layer thickness. With the help of the simulated results, the devices are optimized around the first two optical interference peaks. It is found that the optimized thicknesses are 80 and 208 nm. The study of the effect of annealing on the performance indicates that post-annealing is more favourable than pre-annealing. Based on post-annealing, different annealing temperatures are tested. The optimized annealing condition is 160 °C for 10 min in a nitrogen atmosphere. The device shows that the open-circuit voltage (V_OC) reaches about 0.65 V and the power conversion efficiency is as high as 4.0% around the second interference peak

  16. Preoperative planning with three-dimensional reconstruction of patient's anatomy, rapid prototyping and simulation for endoscopic mitral valve repair.

    Science.gov (United States)

    Sardari Nia, Peyman; Heuts, Samuel; Daemen, Jean; Luyten, Peter; Vainer, Jindrich; Hoorntje, Jan; Cheriex, Emile; Maessen, Jos

    2017-02-01

    Mitral valve repair performed by an experienced surgeon is superior to mitral valve replacement for degenerative mitral valve disease; however, many surgeons are still deterred from adopting this procedure because of a steep learning curve. Simulation-based training and planning could improve surgical performance and reduce the learning curve. The aim of this study was to develop a patient-specific simulation for mitral valve repair and provide a proof of concept of personalized medicine in a patient prospectively planned for mitral valve surgery. A 65-year-old male with severe symptomatic mitral valve regurgitation was referred to our mitral valve heart team. On the basis of three-dimensional (3D) transoesophageal echocardiography and computed tomography, 3D reconstructions of the patient's anatomy were constructed. By navigating through these reconstructions, the repair options and surgical access were chosen (minimally invasive repair). Using rapid prototyping and negative mould fabrication, we developed a process to cast a patient-specific mitral valve silicone replica for preoperative repair in a high-fidelity simulator. Mitral valve and negative mould were printed in systole to capture the pathology when the valve closes. A patient-specific mitral valve silicone replica was cast and mounted in the simulator. All repair techniques could be performed in the simulator to choose the best repair strategy. As the valve was printed in systole, no special testing other than adjusting the coaptation area was required. Subsequently, the patient was operated on, the mitral valve pathology was validated and the repair was successfully performed as in the simulation. The patient-specific simulation and planning could be applied to surgical training, starting a (minimally invasive) mitral valve repair programme, planning of complex cases and the evaluation of new interventional techniques. The personalized medicine could be a possible pathway towards enhancing reproducibility.

  17. Pattern Laser Annealing by a Pulsed Laser

    Science.gov (United States)

    Komiya, Yoshio; Hoh, Koichiro; Murakami, Koichi; Takahashi, Tetsuo; Tarui, Yasuo

    1981-10-01

    Preliminary experiments with contact-type pattern laser annealing were made for local polycrystallization of a-Si, local evaporation of a-Si and local formation of Ni-Si alloy. These experiments showed that the mask patterns can be replicated as annealed regions with a resolution of a few microns on substrates. To overcome shortcomings due to the contact type pattern annealing, a projection type reduction pattern laser annealing system is proposed for resistless low temperature pattern forming processes.

  18. Continued Development Of An Inexpensive Simulator Based CT Scanner For Radiation Therapy Treatment Planning

    Science.gov (United States)

    Peschmann, K. R.; Parker, D. L.; Smith, V.

    1982-11-01

    An abundant number of different CT scanner models have been developed in the past ten years, meeting increasing standards of performance. From the beginning they have remained comparatively expensive pieces of equipment. This is due not only to their technical complexity but also to the difficulties involved in assessing "true" specifications (avoiding "overdesign"). Our aim has been to provide, for Radiation Therapy Treatment Planning, a low-cost CT scanner system featuring large freedom in patient positioning. We have taken advantage of the concurrent, tremendously increased amount of knowledge and experience in the technical area of CT. By way of extensive computer simulations we gained confidence that an inexpensive C-arm simulator gantry and a simple one-phase, two-pulse generator in connection with a standard x-ray tube could be used without sacrificing image quality. These components have been complemented by a commercial high-precision shaft encoder, a simple and effective fan beam collimator, a high-precision, high-efficiency luminescence crystal-silicon photodiode detector with 256 channels, low-noise electronic preamplifier and sampling filter stages, a simplified data acquisition system furnished by Toshiba/Analogic, and an LSI 11/23 microcomputer plus data storage disk, as well as various smaller interfaces linking the electrical components. The quality of CT scan pictures of phantoms, obtained by the end of last year, confirmed that this simple approach is working well. As a next step we intend to upgrade this system with an array processor in order to shorten reconstruction time to one minute per slice. We estimate that the system including this processor could be manufactured for a selling price of $210,000.

  19. Rapid thermal annealing of phosphorus implanted silicon

    International Nuclear Information System (INIS)

    Lee, Y.H.; Pogany, A.; Harrison, H.B.; Williams, J.S.

    1985-01-01

    Rapid thermal annealing (RTA) of phosphorus-implanted silicon has been investigated by four-point probe and Van der Pauw methods and by transmission electron microscopy. The results have been compared to furnace annealing. Experiments show that RTA, even at temperatures as low as 605 °C, results in good electrical properties with little remnant damage and compares favourably with furnace annealing.

  20. Computational Multiqubit Tunnelling in Programmable Quantum Annealers

    Science.gov (United States)

    2016-08-25

    Received 3 Jun 2015; accepted 26 Nov 2015; published 7 Jan 2016. Quantum tunnelling has been hypothesized as an advantageous physical resource for optimization in quantum annealing. This article devises a computational probe for tunnelling and shows that multiqubit tunnelling plays a computational role in a currently available programmable quantum annealer.

  1. Evaluation of a performance appraisal framework for radiation therapists in planning and simulation

    Energy Technology Data Exchange (ETDEWEB)

    Becker, Jillian, E-mail: jillian.becker@health.qld.gov.au [Radiation Oncology Mater Centre, South Brisbane, Queensland (Australia); Bridge, Pete [School of Clinical Sciences, Queensland University of Technology, Brisbane, Queensland (Australia); Brown, Elizabeth; Lusk, Ryan; Ferrari-Anderson, Janet [Radiation Oncology, Princess Alexandra Hospital, Brisbane, Queensland (Australia); Radiation Oncology Mater Centre, South Brisbane, Queensland (Australia)

    2015-06-15

    Constantly evolving technology and techniques within radiation therapy require practitioners to maintain a continuous approach to professional development and training. Systems of performance appraisal and adoption of regular feedback mechanisms are vital to support this development yet frequently lack structure and rely on informal peer support. A Radiation Therapy Performance Appraisal Framework (RT-PAF) for radiation therapists in planning and simulation was developed to define expectations of practice and promote a supportive and objective culture of performance and skills appraisal. Evaluation of the framework was conducted via an anonymous online survey tool. Nine peer reviewers and fourteen recipients provided feedback on its effectiveness and the challenges and limitations of the approach. Findings from the evaluation were positive and suggested that both groups gained benefit from and expressed a strong interest in embedding the approach more routinely. Respondents identified common challenges related to the limited ability to implement suggested development strategies; this was strongly associated with time and rostering issues. This framework successfully defined expectations for practice and provided a fair and objective feedback process that focussed on skills development. It empowered staff to maintain their skills and reach their professional potential. Management support, particularly in regard to provision of protected time was highlighted as critical to the framework's ongoing success. The demonstrated benefits arising in terms of staff satisfaction and development highlight the importance of this commitment to the modern radiation therapy workforce.

  2. Evaluation of a performance appraisal framework for radiation therapists in planning and simulation

    International Nuclear Information System (INIS)

    Becker, Jillian; Bridge, Pete; Brown, Elizabeth; Lusk, Ryan; Ferrari-Anderson, Janet

    2015-01-01

    Constantly evolving technology and techniques within radiation therapy require practitioners to maintain a continuous approach to professional development and training. Systems of performance appraisal and adoption of regular feedback mechanisms are vital to support this development yet frequently lack structure and rely on informal peer support. A Radiation Therapy Performance Appraisal Framework (RT-PAF) for radiation therapists in planning and simulation was developed to define expectations of practice and promote a supportive and objective culture of performance and skills appraisal. Evaluation of the framework was conducted via an anonymous online survey tool. Nine peer reviewers and fourteen recipients provided feedback on its effectiveness and the challenges and limitations of the approach. Findings from the evaluation were positive and suggested that both groups gained benefit from and expressed a strong interest in embedding the approach more routinely. Respondents identified common challenges related to the limited ability to implement suggested development strategies; this was strongly associated with time and rostering issues. This framework successfully defined expectations for practice and provided a fair and objective feedback process that focussed on skills development. It empowered staff to maintain their skills and reach their professional potential. Management support, particularly in regard to provision of protected time was highlighted as critical to the framework's ongoing success. The demonstrated benefits arising in terms of staff satisfaction and development highlight the importance of this commitment to the modern radiation therapy workforce

  3. Generalizable open source urban water portfolio simulation framework demonstrated using a multi-objective risk-based planning benchmark problem.

    Science.gov (United States)

    Trindade, B. C.; Reed, P. M.

    2017-12-01

    The growing access and reduced cost for computing power in recent years has promoted rapid development and application of multi-objective water supply portfolio planning. As this trend continues, there is a pressing need for flexible risk-based simulation frameworks and improved algorithm benchmarking for emerging classes of water supply planning and management problems. This work contributes the Water Utilities Management and Planning (WUMP) model: a generalizable and open source simulation framework designed to capture how water utilities can minimize operational and financial risks by regionally coordinating planning and management choices, i.e. making more efficient and coordinated use of restrictions, water transfers and financial hedging combined with possible construction of new infrastructure. We introduce the WUMP simulation framework as part of a new multi-objective benchmark problem for planning and management of regionally integrated water utility companies. In this problem, a group of fictitious water utilities seek to balance the use of the mentioned reliability-driven actions (e.g., restrictions, water transfers and infrastructure pathways) and their inherent financial risks. Several traits of this problem make it ideal for a benchmark problem, namely the presence of (1) strong non-linearities and discontinuities in the Pareto front caused by the step-wise nature of the decision making formulation and by the abrupt addition of storage through infrastructure construction, (2) noise due to the stochastic nature of the streamflows and water demands, and (3) non-separability resulting from the cooperative formulation of the problem, in which decisions made by one stakeholder may substantially impact others. Both the open source WUMP simulation framework and its demonstration in a challenging benchmarking example hold value for promoting broader advances in urban water supply portfolio planning for regions confronting change.

  4. PDCI Wide-Area Damping Control: PSLF Simulations of the 2016 Open and Closed Loop Test Plan

    Energy Technology Data Exchange (ETDEWEB)

    Wilches Bernal, Felipe [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pierre, Brian Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Elliott, Ryan Thomas [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Schoenwald, David A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Byrne, Raymond H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Neely, Jason C. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Trudnowski, Daniel J. [Montana Tech of the Univ. of Montana, Butte, MT (United States); Donnelly, Matthew K. [Montana Tech of the Univ. of Montana, Butte, MT (United States)

    2017-03-01

    To demonstrate and validate the performance of the wide-area damping control system, the project plans to conduct closed-loop tests on the PDCI in summer/fall 2016. A test plan details the open and closed loop tests to be conducted on the PDCI using the wide-area damping control system. To ensure the appropriate level of preparedness, simulations were performed in order to predict and evaluate any possible unsafe operations before hardware experiments are attempted. This report contains the results from these simulations using the power system dynamics software PSLF (Power System Load Flow, trademark of GE). The simulations use the WECC (Western Electricity Coordinating Council) 2016 light summer and heavy summer base cases.

  5. Future planning: default network activity couples with frontoparietal control network and reward-processing regions during process and outcome simulations.

    Science.gov (United States)

    Gerlach, Kathy D; Spreng, R Nathan; Madore, Kevin P; Schacter, Daniel L

    2014-12-01

    We spend much of our daily lives imagining how we can reach future goals and what will happen when we attain them. Despite the prevalence of such goal-directed simulations, neuroimaging studies on planning have mainly focused on executive processes in the frontal lobe. This experiment examined the neural basis of process simulations, during which participants imagined themselves going through steps toward attaining a goal, and outcome simulations, during which participants imagined events they associated with achieving a goal. In the scanner, participants engaged in these simulation tasks and an odd/even control task. We hypothesized that process simulations would recruit default and frontoparietal control network regions, and that outcome simulations, which allow us to anticipate the affective consequences of achieving goals, would recruit default and reward-processing regions. Our analysis of brain activity that covaried with process and outcome simulations confirmed these hypotheses. A functional connectivity analysis with posterior cingulate, dorsolateral prefrontal cortex and anterior inferior parietal lobule seeds showed that their activity was correlated during process simulations and associated with a distributed network of default and frontoparietal control network regions. During outcome simulations, medial prefrontal cortex and amygdala seeds covaried together and formed a functional network with default and reward-processing regions.

  6. Computational algorithm for molybdenite concentrate annealing

    International Nuclear Information System (INIS)

    Alkatseva, V.M.

    1995-01-01

    A computational algorithm is presented for the annealing of molybdenite concentrate with granulated return dust and of granulated molybdenite concentrate. The algorithm differs from known analogues for sulphide raw material annealing by including the calculation of the return dust mass in stationary annealing; the latter quantity varies from the return dust mass value obtained in the first iteration step. Masses of solid products are determined by the distribution of concentrate annealing products, including return dust and bentonite. The algorithm can also be applied to annealing computations for other sulphide materials. 3 refs

  7. Plasma assisted heat treatment: annealing

    International Nuclear Information System (INIS)

    Brunatto, S F; Guimaraes, N V

    2009-01-01

    This work describes a new dc plasma application in the metallurgical-mechanical field, called plasma assisted heat treatment, and presents the first results for annealing. Annealing treatments were performed on 90%-reduction cold-rolled niobium samples at 900 °C for 60 min, using two different heating methods: (a) a hollow cathode discharge (HCD) configuration and (b) a plasma oven configuration. The evolution of the samples' recrystallization was determined by means of microstructure, microhardness and softening rate characterization. The results indicate that bombardment by plasma species (ions and neutrals) in the HCD plays an important role in activating the recrystallization process and could offer technological and economic advantages for the heat treatment of metallic materials. (fast track communication)

  8. Engineering Task Plan for simulated riser installation by use of rotary drilling

    International Nuclear Information System (INIS)

    Barnes, G.A.

    1995-12-01

    This task is being performed to demonstrate the feasibility of the best riser installation alternative identified in the Engineering Study. This Engineering Task Plan (ETP) will be the WHC project management plan for the riser installation demonstration activities

  9. Chapter 8: Planning Tools to Simulate and Optimize Neighborhood Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Zhivov, Alexander Michael; Case, Michael Patrick; Jank, Reinhard; Eicker, Ursula; Booth, Samuel

    2017-03-15

    This section introduces different energy modeling tools available in Europe and the USA for the community energy master planning process, ranging from strategic Urban Energy Planning to more detailed Local Energy Planning. Two modeling tools used for Energy Master Planning of primarily residential communities, the 3D city model with CityGML and the Net Zero Planner tool developed for US Department of Defense installations, are described in more detail.

  10. Annealing of ion implanted silicon

    International Nuclear Information System (INIS)

    Chivers, D.; Smith, B.J.; Stephen, J.; Fisher, M.

    1980-09-01

    The newer uses of ion implantation require a higher dose rate. This has led to the introduction of high-beam-current implanters, in which the wafers move in front of a stationary beam to give a scanning effect. This can lead to non-uniform heating of the wafer, and the sheet resistance of the layers can be very non-uniform following thermal annealing. Non-uniformity in the effective doping, both over a single wafer and from one wafer to another, can affect the usefulness of ion implantation in high dose rate applications. Experiments to determine the extent of non-uniformity in sheet resistance, and to see whether it is correlated with the annealing scheme, have been carried out. Details of the implantation parameters are given. It was found that the best results were obtained when layers were annealed at the maximum possible temperature. For arsenic, phosphorus and antimony layers, improvements were observed up to 1200 °C, and for boron up to 950 °C. Usually it is best to heat the layer directly to the maximum temperature to produce the most uniform layer; with phosphorus layers, however, it is better to pre-heat to 1050 °C. (U.K.)

  11. Propagating self-sustained annealing of radiation-induced interstitial complexes

    International Nuclear Information System (INIS)

    Bokov, P M; Selyshchev, P A

    2016-01-01

    A propagating self-sustained annealing of radiation-induced defects resulting from a thermal-concentration instability is studied. The defects considered in the model are complexes, each consisting of one impurity atom and one interstitial atom. A crystal with defects has extra energy, which is transformed into heat during defect annealing. Simulation of the annealing auto-wave has been performed, and the front and the speed of the auto-wave have been obtained. It is shown that annealing occurs in a narrow region of time and space. There are two kinds of such annealing behaviour. In the first case, the speed of the auto-wave oscillates near a constant mean value and the temperature front oscillates in a complex way. In the second case, the speed of propagation is constant and the temperature and concentration fronts look like sigmoid functions. (paper)
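
    The feedback loop the model describes, stored defect energy released as heat, which in turn accelerates annealing ahead of the front, can be caricatured in one dimension. All coefficients below are invented for illustration and are not the authors' parameters:

```python
# 1-D caricature of a self-sustained annealing wave: defect annealing
# releases heat, heat diffuses forward and speeds up annealing ahead of
# the front. Coefficients are illustrative, not from the paper.
import math

n, dx, dt = 200, 1.0, 0.1
T = [0.1] * n                      # dimensionless temperature field
C = [1.0] * n                      # defect (complex) concentration
for i in range(5):
    T[i] = 2.0                     # hot "spark" at the left edge

def rate(temp):
    """Arrhenius-like annealing rate."""
    return math.exp(-1.0 / temp)

for _ in range(4000):
    r = [rate(T[i]) * C[i] for i in range(n)]
    newT = T[:]
    for i in range(1, n - 1):
        lap = (T[i - 1] - 2.0 * T[i] + T[i + 1]) / dx ** 2
        # diffusion + heat released by annealing - relaxation to ambient
        newT[i] = T[i] + dt * (0.5 * lap + 5.0 * r[i] - 0.05 * (T[i] - 0.1))
    T = newT
    C = [max(0.0, C[i] - dt * r[i]) for i in range(n)]

# concentration near the spark vs. far ahead of it
print(round(C[5], 3), round(C[150], 3))
```

    Whether the front keeps moving or dies out depends on the balance between released heat and losses, which is exactly the instability threshold the paper analyses.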

  12. Dose/volume–response relations for rectal morbidity using planned and simulated motion-inclusive dose distributions

    International Nuclear Information System (INIS)

    Thor, Maria; Apte, Aditya; Deasy, Joseph O.; Karlsdóttir, Àsa; Moiseenko, Vitali; Liu, Mitchell; Muren, Ludvig Paul

    2013-01-01

    Background and purpose: Many dose-limiting normal tissues in radiotherapy (RT) display considerable internal motion between fractions over a course of treatment, potentially reducing the appropriateness of using planned dose distributions to predict morbidity. Accounting explicitly for rectal motion could improve the predictive power of modelling rectal morbidity. To test this, we simulated the effect of motion in two cohorts. Materials and methods: The included patients (232 and 159 cases) received RT for prostate cancer to 70 and 74 Gy. Motion-inclusive dose distributions were introduced as simulations of random or systematic motion applied to the planned dose distributions. Six rectal morbidity endpoints were analysed. A probit model using the QUANTEC-recommended parameters was also applied to the cohorts. Results: The differences in associations using the planned versus the motion-inclusive dose distributions were modest. Statistically significant associations were obtained with four of the endpoints, mainly at high doses (55–70 Gy), using both the planned and the motion-inclusive dose distributions, primarily when simulating random motion. The strongest associations were observed for GI toxicity and rectal bleeding (Rs = 0.12–0.21; Rs = 0.11–0.20). Applying the probit model, significant associations were found for tenesmus and rectal bleeding (Rs = 0.13, p = 0.02). Conclusion: Equally strong associations with rectal morbidity were observed at high doses (>55 Gy) for the planned and the simulated dose distributions, in particular those including random rectal motion. Future studies should explore patient-specific descriptions of rectal motion to achieve improved predictive power.

  13. Determination of electron clinical spectra from percentage depth dose (PDD) curves by classical simulated annealing method; Determinacao de espectros de energia de eletrons clinicos a partir de curvas de porcentagem de dose em profundidade (PDP) utilizando o metodo de recozimento simulado classico

    Energy Technology Data Exchange (ETDEWEB)

    Visbal, Jorge H. Wilches; Costa, Alessandro M., E-mail: jhwilchev@usp.br [Universidade de Sao Paulo (USP), Ribeirao Preto (USP), SP (Brazil). Faculdade de Filosofia, Ciencias e Letras

    2016-07-01

    The percentage depth dose (PDD) of electron beams represents an important item of data in radiation therapy, since it describes their dosimetric properties. Accurate transport theory, and the Monte Carlo method, have shown obvious differences between the dose distribution of the electron beams of a clinical accelerator in a water phantom and the dose distribution, in water, of monoenergetic electrons at the accelerator's nominal energy. In radiotherapy, the energy spectrum of the electrons should be considered to improve the accuracy of dose calculation, because the electron beams that reach the surface after travelling through the internal structures of the accelerator are not in fact monoenergetic. There are three principal approaches for obtaining electron energy spectra from the central-axis PDD: the Monte Carlo method, direct measurement, and inverse reconstruction. In this work, the simulated annealing method is presented as a practical, reliable and simple approach to inverse reconstruction and an optimal alternative to the other options. (author)
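
    The inverse-reconstruction approach can be illustrated with a toy classical simulated annealing loop that adjusts spectral weights until the weighted sum of monoenergetic depth-dose curves reproduces a "measured" PDD. The Gaussian curve shape, energy grid and cooling schedule below are crude stand-ins chosen for illustration, not the authors' model:

```python
# Toy inverse reconstruction: classical simulated annealing fits spectral
# weights so the weighted sum of monoenergetic depth-dose curves matches a
# synthetic "measured" PDD. The Gaussian fall-off is an assumed stand-in
# for real monoenergetic electron depth-dose curves.
import math
import random

random.seed(3)
depths = [0.2 * k for k in range(30)]            # depth grid (cm)
energies = [6.0, 9.0, 12.0, 15.0]                # assumed spectral bins (MeV)

def mono_pdd(e, z):
    # crude analytic stand-in for a monoenergetic depth-dose curve
    return math.exp(-((z - 0.3 * e) ** 2) / (0.15 * e))

true_w = [0.1, 0.2, 0.5, 0.2]                    # "unknown" spectrum to recover
measured = [sum(w * mono_pdd(e, z) for w, e in zip(true_w, energies))
            for z in depths]

def misfit(w):
    """Sum-of-squares mismatch between modelled and measured PDD."""
    return sum((sum(wi * mono_pdd(e, z) for wi, e in zip(w, energies)) - m) ** 2
               for z, m in zip(depths, measured))

w = [0.25] * 4                                   # flat initial guess
cost, t = misfit(w), 1e-2
best_w, best_cost = w, cost
for _ in range(20000):
    cand = [max(0.0, wi + random.gauss(0.0, 0.03)) for wi in w]
    s = sum(cand) or 1.0
    cand = [wi / s for wi in cand]               # keep the spectrum normalised
    d = misfit(cand) - cost
    if d < 0 or random.random() < math.exp(-d / t):  # Metropolis acceptance
        w, cost = cand, cost + d
        if cost < best_cost:
            best_w, best_cost = w, cost
    t *= 0.9996                                  # geometric cooling

print([round(x, 2) for x in best_w])
```

    In a real reconstruction the analytic curves would be replaced by measured or Monte Carlo monoenergetic PDDs, and the misfit would be evaluated against the clinical PDD.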

  14. Simulation-based power calculations for planning a two-stage individual participant data meta-analysis.

    Science.gov (United States)

    Ensor, Joie; Burke, Danielle L; Snell, Kym I E; Hemming, Karla; Riley, Richard D

    2018-05-18

    Researchers and funders should consider the statistical power of planned Individual Participant Data (IPD) meta-analysis projects, as they are often time-consuming and costly. We propose simulation-based power calculations utilising a two-stage framework, and illustrate the approach for a planned IPD meta-analysis of randomised trials with continuous outcomes where the aim is to identify treatment-covariate interactions. The simulation approach has four steps: (i) specify an underlying (data generating) statistical model for trials in the IPD meta-analysis; (ii) use readily available information (e.g. from publications) and prior knowledge (e.g. number of studies promising IPD) to specify model parameter values (e.g. control group mean, intervention effect, treatment-covariate interaction); (iii) simulate an IPD meta-analysis dataset of a particular size from the model, and apply a two-stage IPD meta-analysis to obtain the summary estimate of interest (e.g. interaction effect) and its associated p-value; (iv) repeat the previous step (e.g. thousands of times), then estimate the power to detect a genuine effect by the proportion of summary estimates with a significant p-value. In a planned IPD meta-analysis of lifestyle interventions to reduce weight gain in pregnancy, 14 trials (1183 patients) promised their IPD to examine a treatment-BMI interaction (i.e. whether baseline BMI modifies intervention effect on weight gain). Using our simulation-based approach, we estimated the power of this planned two-stage IPD meta-analysis. Pre-specified adjustment for prognostic factors would increase power further. Incorrect dichotomisation of BMI would reduce power by over 20%, similar to immediately throwing away IPD from ten trials. Simulation-based power calculations could inform the planning and funding of IPD projects, and should be used routinely.
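
    Steps (i)-(iv) can be sketched in code. The sketch below is a hypothetical two-stage IPD power simulation for a treatment-covariate interaction with continuous outcomes; every parameter value (trial sizes, effect sizes, residual noise) is invented for illustration and is not taken from the pregnancy example.

```python
# Hypothetical sketch of steps (i)-(iv): simulation-based power for a
# two-stage IPD meta-analysis of a treatment-covariate interaction.
# All parameter values are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)

def simulate_trial(n, interaction):
    """Step (i): data-generating model for one randomised trial."""
    treat = rng.integers(0, 2, n)            # 1:1 randomisation
    bmi = rng.normal(0.0, 4.0, n)            # centred covariate (e.g. BMI)
    y = 10.0 - 1.0 * treat + interaction * treat * bmi + rng.normal(0.0, 4.0, n)
    return treat, bmi, y

def interaction_estimate(treat, bmi, y):
    """Stage 1: per-trial OLS; return interaction estimate and its variance."""
    X = np.column_stack([np.ones_like(y), treat, bmi, treat * bmi])
    XtX_inv = np.linalg.inv(X.T @ X)
    beta = XtX_inv @ X.T @ y
    resid = y - X @ beta
    sigma2 = resid @ resid / (len(y) - X.shape[1])
    return beta[3], sigma2 * XtX_inv[3, 3]

def power(n_trials=14, n_per_trial=85, interaction=-0.1, n_sims=500):
    hits = 0
    for _ in range(n_sims):                  # step (iv): repeat many times
        est, var = [], []
        for _ in range(n_trials):            # step (iii): one IPD dataset
            b, v = interaction_estimate(*simulate_trial(n_per_trial, interaction))
            est.append(b)
            var.append(v)
        w = 1.0 / np.asarray(var)            # stage 2: inverse-variance pooling
        pooled = np.sum(w * est) / np.sum(w)
        se = np.sqrt(1.0 / np.sum(w))
        if abs(pooled / se) > 1.96:          # two-sided 5% significance test
            hits += 1
    return hits / n_sims

print(f"estimated power: {power():.2f}")
```

    Setting `interaction` to zero should return roughly the 5% type I error rate, which is a quick sanity check on a simulation of this kind.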

  15. The influence of annealing temperature and time on the efficiency of pentacene: PTCDI organic solar cells

    Directory of Open Access Journals (Sweden)

    Mehmet Biber

    In this study, the fabrication of polycyclic aromatic hydrocarbon/perylene tetracarboxylic di-imide (PTCDI) donor/acceptor solar cells using the physical vapour deposition technique in a class-1000 glove box is presented. An ITO/PEDOT:PSS/Pentacene/PTCDI/Al (ITO = indium tin oxide; PEDOT:PSS = poly(3,4-ethylenedioxythiophene) polystyrene sulfonate) solar cell was obtained, and a power conversion efficiency, PCE (η), of about 0.33% was measured under simulated solar illumination of 300 W/m2. Furthermore, the effects on η of the annealing temperature (100 and 150 °C) and of the annealing time at 100 °C (5 and 10 min) have also been investigated. In general, thermal annealing deteriorated the characteristic parameters of the Pentacene/PTCDI solar cell: both the fill factor (FF) and η decreased after annealing, and decreased further with increasing annealing time. Atomic force microscopy (AFM) images showed that phase segregation and grain size increased while the surface roughness of the pentacene film decreased, and these effects reduced η. The η values of the solar cell were determined as 0.33%, 0.12% and 0.06% before annealing and after annealing at 100 and 150 °C, respectively. Keywords: Organic solar cells, PTCDI, Pentacene, Annealing

  16. Strong white photoluminescence from annealed zeolites

    International Nuclear Information System (INIS)

    Bai, Zhenhua; Fujii, Minoru; Imakita, Kenji; Hayashi, Shinji

    2014-01-01

    The optical properties of zeolites annealed at various temperatures are investigated for the first time. The annealed zeolites exhibit strong white photoluminescence (PL) under ultraviolet light excitation. With increasing annealing temperature, the emission intensity of the annealed zeolites first increases and then decreases. At the same time, the PL peak red-shifts from 495 nm to 530 nm, and then returns to 500 nm. The strongest emission appears when the annealing temperature is 500 °C. The quantum yield of the sample is measured to be ∼10%. The PL lifetime increases monotonically from 223 μs to 251 μs with increasing annealing temperature. The origin of the white PL is ascribed to oxygen vacancies formed during the annealing process. -- Highlights: • The optical properties of zeolites annealed at various temperatures are investigated. • The annealed zeolites exhibit strong white photoluminescence. • The maximum PL enhancement reaches as large as 62 times. • The lifetime shows little dependence on annealing temperature. • The origin of the white emission is ascribed to oxygen vacancies

  17. Simulating my own or others' action plans?--Motor representations, not visual representations are recalled in motor memory.

    Directory of Open Access Journals (Sweden)

    Christian Seegelke

    Action plans are not generated from scratch for each movement; rather, features of recently generated plans are recalled for subsequent movements. This study investigated whether the observation of an action is sufficient to trigger plan recall processes. Participant dyads performed an object manipulation task in which one participant transported a plunger from an outer platform to a center platform of different heights (first move). Subsequently, either the same participant (intra-individual task condition) or the other participant (inter-individual task condition) returned the plunger to the outer platform (return moves). Grasp heights were inversely related to center target height and similar irrespective of direction (first vs. return move) and task condition (intra- vs. inter-individual). Moreover, participants' return move grasp heights were highly correlated with their own, but not with their partners', first move grasp heights. Our findings provide evidence that a simulated action plan resembles a plan of how the observer would execute that action (based on a motor representation) rather than a plan of the actually observed action (based on a visual representation).

  18. A Simulation Study for Radiation Treatment Planning Based on the Atomic Physics of the Proton-Boron Fusion Reaction

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Sunmi; Yoon, Do-Kun; Shin, Han-Back; Jung, Joo-Young; Kim, Moo-Sub; Kim, Kyeong-Hyeon; Jang, Hong-Seok; Suh, Tae Suk [the Catholic University of Korea, Seoul (Korea, Republic of)

    2017-03-15

    The purpose of this research is to demonstrate, based on a Monte Carlo simulation code, the procedure of radiation treatment planning for proton-boron fusion therapy (PBFT). A discrete proton beam (60-120 MeV) relevant to the Bragg peak was simulated using the Monte Carlo N-Particle eXtended (MCNPX, Ver. 2.6.0, Los Alamos National Laboratory, Los Alamos, NM, USA) code. After computed tomography (CT) scanning of a virtual water phantom including air cavities, the acquired CT images were converted using the simulation source code. We set boron uptake regions (BURs) in the simulated water phantom to achieve the proton-boron fusion reaction, and proton sources irradiated the BURs in the phantom. The acquired dose maps were overlapped with the original CT image of the phantom to analyze the dose-volume histogram (DVH). We successfully confirmed amplification of the proton dose (average: 130%) at the target regions. From the DVH result for each simulation, we acquired a relatively accurate dose map for the treatment. The simulation characterized the dose distribution and verified the feasibility of PBFT. We observed the variation in proton range and developed a tumor-targeting technique for treatment that is more accurate and powerful than both conventional proton therapy and boron neutron capture therapy.

  19. Enhancing Student’s Understanding in Entrepreneurship Through Business Plan Simulation

    OpenAIRE

    Guzairy M.; Mohamad N.; Yunus A.R.

    2018-01-01

    A business plan is an important document that guides entrepreneurs in managing their business; it also helps them strategize and manage future growth. This is why the Malaysian government has required all higher education providers to offer entrepreneurship education as a compulsory course. One of the learning outcomes of entrepreneurship education is that students can write an effective business plan. This study focused on enhancing students' understanding in entrepren...

  20. Patient-specific surgical simulator for the pre-operative planning of single-incision laparoscopic surgery with bimanual robots.

    Science.gov (United States)

    Turini, Giuseppe; Moglia, Andrea; Ferrari, Vincenzo; Ferrari, Mauro; Mosca, Franco

    2012-01-01

    The trend of surgical robotics is to follow the evolution of laparoscopy, which is now moving towards single-incision laparoscopic surgery. The main drawback of this approach is the limited maneuverability of the surgical tools. Promising solutions to improve the surgeon's dexterity are based on bimanual robots. However, since both robot arms are completely inserted into the patient's body, issues related to possible unwanted collisions with structures adjacent to the target organ may arise. This paper presents a simulator based on patient-specific data for the positioning and workspace evaluation of bimanual surgical robots in the pre-operative planning of single-incision laparoscopic surgery. The simulator, designed for the pre-operative planning of robotic laparoscopic interventions, was tested by five expert surgeons who evaluated its main functionalities and provided an overall rating for the system. The proposed system demonstrated good performance and usability, and was designed to integrate both present and future bimanual surgical robots.

  1. Merging Methods to Manage Uncertainty: Combining Simulation Modeling and Scenario Planning to Inform Resource Management Under Climate Change

    Science.gov (United States)

    Miller, B. W.; Schuurman, G. W.; Symstad, A.; Fisichelli, N. A.; Frid, L.

    2017-12-01

    Managing natural resources in this era of anthropogenic climate change is fraught with uncertainties around how ecosystems will respond to management actions and a changing climate. Scenario planning (oftentimes implemented as a qualitative, participatory exercise for exploring multiple possible futures) is a valuable tool for addressing this challenge. However, this approach may face limits in resolving responses of complex systems to altered climate and management conditions, and may not provide the scientific credibility that managers often require to support actions that depart from current practice. Quantitative information on projected climate changes and ecological responses is rapidly growing and evolving, but this information is often not at a scale or in a form that is `actionable' for resource managers. We describe a project that sought to create usable information for resource managers in the northern Great Plains by combining qualitative and quantitative methods. In particular, researchers, resource managers, and climate adaptation specialists co-produced a simulation model in conjunction with scenario planning workshops to inform natural resource management in southwest South Dakota. Scenario planning for a wide range of resources facilitated open-minded thinking about a set of divergent and challenging, yet relevant and plausible, climate scenarios and management alternatives that could be implemented in the simulation. With stakeholder input throughout the process, we built a simulation of key vegetation types, grazing, exotic plants, fire, and the effects of climate and management on rangeland productivity and composition. By simulating multiple land management jurisdictions, climate scenarios, and management alternatives, the model highlighted important tradeoffs between herd sizes and vegetation composition, and between the short- versus long-term costs of invasive species management. It also identified impactful uncertainties related to the

  2. A simulation framework for the evaluation of production planning and order management strategies in the sawmilling industry

    OpenAIRE

    Dumetz , Ludwig; Gaudreault , Jonathan; Thomas , André; Marier , Philippe; Lehoux , Nadia; Bril El-Haouzi , Hind

    2015-01-01

    Raw material heterogeneity, complex transformation processes, and divergent product flows make sawmilling operations difficult to manage. Most North American lumber sawmills apply a make-to-stock production strategy, some accepting/refusing orders according to available-to-promise (ATP) quantities, while a few use more advanced approaches. This article introduces a simulation framework allowing comparing and evaluating different production planning strategies as well as or...

  3. Flight plan optimization

    Science.gov (United States)

    Dharmaseelan, Anoop; Adistambha, Keyne D.

    2015-05-01

    Fuel accounts for 40 percent of the operating cost of an airline. Fuel cost can be minimized by planning flights on optimized routes, and routes can be optimized by searching for the best connections based on the cost function defined by the airline. The most common algorithm used to optimize route search is Dijkstra's. Dijkstra's algorithm produces a static result, and the time taken for the search is relatively long. This paper experiments with a new algorithm to optimize route search that combines the principles of simulated annealing and genetic algorithms. The experimental route search results presented are shown to be computationally fast and accurate compared with timings from a genetic algorithm. The new algorithm is well suited to the random routing feature that is highly sought by many regional operators.
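
    The simulated annealing half of such a hybrid can be sketched as a Metropolis search over waypoint orderings with 2-opt moves. The waypoints, cost function (plain Euclidean distance standing in for fuel cost), and cooling schedule below are invented for illustration and are not from the paper:

```python
# Sketch of simulated annealing over route orderings with 2-opt moves.
# Waypoint coordinates and the cooling schedule are illustrative only.
import math
import random

random.seed(0)
waypoints = [(random.random(), random.random()) for _ in range(12)]

def route_cost(order):
    """Toy fuel cost: total Euclidean length of the route."""
    return sum(math.dist(waypoints[a], waypoints[b])
               for a, b in zip(order, order[1:]))

def anneal(order, t0=1.0, cooling=0.995, steps=20000):
    cost = route_cost(order)
    best, best_cost = order, cost
    t = t0
    for _ in range(steps):
        i, j = sorted(random.sample(range(len(order)), 2))
        cand = order[:i] + order[i:j + 1][::-1] + order[j + 1:]  # 2-opt move
        delta = route_cost(cand) - cost
        # Metropolis rule: accept improvements always, worsenings sometimes
        if delta < 0 or random.random() < math.exp(-delta / t):
            order, cost = cand, cost + delta
            if cost < best_cost:
                best, best_cost = order, cost
        t *= cooling                                             # cool down
    return best, best_cost

start = list(range(len(waypoints)))
best, best_cost = anneal(start)
print(round(route_cost(start), 3), "->", round(best_cost, 3))
```

    A genetic-algorithm layer could then recombine several annealed orderings, which is roughly the kind of hybrid the abstract describes.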

  4. Three-Dimensional Liver Surgery Simulation: Computer-Assisted Surgical Planning with Three-Dimensional Simulation Software and Three-Dimensional Printing.

    Science.gov (United States)

    Oshiro, Yukio; Ohkohchi, Nobuhiro

    2017-06-01

    To perform accurate hepatectomy without injury, it is necessary to understand the anatomical relationships among the branches of Glisson's sheath, the hepatic veins, and the tumor. In Japan, three-dimensional (3D) preoperative simulation for liver surgery is becoming increasingly common, and liver 3D modeling and 3D hepatectomy simulation with 3D analysis software have been covered by universal healthcare insurance since 2012. Herein, we review the history of virtual hepatectomy using computer-assisted surgery (CAS) and our research to date, and we discuss the future prospects of CAS. We have used the SYNAPSE VINCENT medical imaging system (Fujifilm Medical, Tokyo, Japan) for 3D visualization and virtual resection of the liver since 2010. We developed a novel fusion imaging technique combining 3D computed tomography (CT) with magnetic resonance imaging (MRI); the fusion image enables us to easily visualize the anatomic relationships among the hepatic arteries, portal veins, bile duct, and tumor in the hepatic hilum. In 2013, we developed an original software package, called Liversim, that enables real-time deformation of the liver using physical simulation, and a randomized controlled trial has recently been conducted to evaluate the use of Liversim and SYNAPSE VINCENT for preoperative simulation and planning. Furthermore, we developed a novel hollow 3D-printed liver model whose surface is covered with frames; this model is useful for safe liver resection, offers better visibility, and costs one-third as much to produce as a previous model. Preoperative simulation and navigation with CAS in liver resection are expected to aid in planning and conducting surgery, as well as in surgical education. Thus, novel CAS systems will contribute not only to the performance of reliable hepatectomy but also to surgical education.

  5. Freeform fabrication of tissue-simulating phantom for potential use of surgical planning in conjoined twins separation surgery.

    Science.gov (United States)

    Shen, Shuwei; Wang, Haili; Xue, Yue; Yuan, Li; Zhou, Ximing; Zhao, Zuhua; Dong, Erbao; Liu, Bin; Liu, Wendong; Cromeens, Barrett; Adler, Brent; Besner, Gail; Xu, Ronald X

    2017-09-08

    Preoperative assessment of tissue anatomy and accurate surgical planning are crucial in conjoined twin separation surgery. We developed a new method that combines three-dimensional (3D) printing, assembling, and casting to produce anatomic models of high fidelity for surgical planning. The relevant anatomic features of the conjoined twins were captured by computed tomography (CT), classified into five organ groups, and reconstructed as five computer models. Among these organ groups, the skeleton was produced by fused deposition modeling (FDM) using acrylonitrile-butadiene-styrene. For the other four organ groups, shell molds were prepared by FDM and cast with silica gel to simulate soft tissues, with contrast enhancement pigments added to simulate different CT and visual contrasts. The produced models were assembled, positioned firmly within a 3D-printed shell mold simulating the skin boundary, and cast with transparent silica gel. The produced phantom was subjected to a further CT scan, which was compared with the patient data for fidelity evaluation. Data analysis showed that the produced model reproduced the geometric features of the original CT data with an overall mean deviation of less than 2 mm, indicating the clinical potential of this method for surgical planning in conjoined twin separation surgery.

  6. An observation planning algorithm applied to multi-objective astronomical observations and its simulation in COSMOS field

    Science.gov (United States)

    Jin, Yi; Gu, Yonggang; Zhai, Chao

    2012-09-01

    Multi-object fiber spectroscopic sky surveys are now booming, examples being LAMOST, already built by China; the BIGBOSS project put forward by the U.S. Lawrence Berkeley National Lab; and the GTC (Gran Telescopio Canarias) telescope developed by the United States, Mexico and Spain. They all use or will use this approach, and each fiber can be moved within a certain area to reach one astronomical target, so observation planning is particularly important for these sky surveys. An observation planning algorithm for multi-objective astronomical observations is developed. It avoids collision and interference between the fiber positioning units in the focal plane during observation in one field of view (FOV), and the objects of interest can be observed in a limited number of rounds with maximum efficiency. Observation simulations can also be made for a wide field of view through multi-FOV observation. After the observation plan is built, a simulation is made in the COSMOS field using the GTC telescope. Galaxies, stars and high-redshift LBG galaxies of interest are selected after removal of the mask area, which may contain bright stars. A nine-FOV simulation is then completed, and the observation efficiency and fiber utilization ratio for every round are given. In addition, allocating a certain number of fibers to background sky, giving different weights to different objects, and how to move the FOV to improve the overall observation efficiency are discussed.

  7. Loviisa Unit One: Annealing - healing

    Energy Technology Data Exchange (ETDEWEB)

    Kohopaeae, J.; Virsu, R. [ed.; Henriksson, A. [ed.

    1997-11-01

    Unit 1 of the Loviisa nuclear power plant was annealed in connection with the refuelling outage in the summer of 1996. This type of heat treatment restored the toughness properties of the pressure vessel weld, which had been embrittled by neutron radiation, so that it is almost equivalent to a new weld. The treatment itself was an ordinary metallurgical procedure that took only a few days, but the material studies that preceded it began over fifteen years ago and have put IVO at the forefront of worldwide expertise in the area of radiation embrittlement.

  8. SU-F-T-403: Impact of Dose Reduction for Simulation CT On Radiation Therapy Treatment Planning

    Energy Technology Data Exchange (ETDEWEB)

    Liang, Q; Shah, P; Li, S; Miyamoto, C [Temple University Hospital, Philadelphia, PA (United States)

    2016-06-15

    Purpose: To investigate the feasibility of applying ALARA principles to current treatment planning CT scans. The study aims to quantitatively verify that lower dose scans do not alter treatment planning. Method: A Gammex 467 tissue characterization phantom with inserts of 14 different materials was scanned at seven different mA levels (30∼300 mA), and the CT numbers of the inserts were measured. Auto contouring of bone and lung in the treatment planning system (Pinnacle) was used to evaluate the effect of CT number accuracy, from a treatment planning perspective, on the 30 mA and 300 mA scanned images. A head CT scan intended for a 3D whole brain radiation treatment was evaluated. Dose calculations were performed on normally scanned images using the clinical protocol (120 kVp, Smart mA, maximum 291 mA), and on the same images with added simulated noise mimicking a 70 mA scan. Plan parameters including isocenter, beam arrangements, block shapes, dose grid size and resolution, and prescriptions were kept the same for these two plans, and the calculated monitor units (MUs) were compared. Results: No significant degradation of CT number accuracy was found at lower dose levels in either the phantom scans or the patient images with added noise. The CT numbers remained consistent at tube currents above 60 mA. The auto-contoured volumes for lung and cortical bone differed by 0.3% and 0.12%, respectively, between 30 mA and 300 mA. The two forward plans created on the regular and low dose images gave the same calculated MUs, with 98.3% of points showing <1% dose difference. Conclusion: Both phantom and patient studies quantitatively verified that low dose CT provides similar quality for treatment planning at 20–25% of the regular scan dose. Therefore, there is potential to optimize the simulation CT scan protocol to fulfil the ALARA principle and limit unnecessary radiation exposure to non-targeted tissues.

  9. A trajectory planning scheme for spacecraft in the space station environment. M.S. Thesis - University of California

    Science.gov (United States)

    Soller, Jeffrey Alan; Grunwald, Arthur J.; Ellis, Stephen R.

    1991-01-01

    Simulated annealing is used to solve a minimum-fuel trajectory problem in the space station environment. The environment is special because the space station will define a multivehicle environment in space. The optimization surface is a complex nonlinear function of the initial conditions of the chase and target craft, and small perturbations in the input conditions can result in abrupt changes to the optimization surface. Since no prior knowledge about the number or location of local minima on the surface is available, the optimization must be capable of functioning on a multimodal surface. It has been reported in the literature that the simulated annealing algorithm is more effective on such surfaces than descent techniques using random starting points. The simulated annealing optimization was found to be capable of identifying a minimum-fuel, two-burn trajectory subject to four constraints, which are integrated into the optimization using a barrier method. The computations required to solve the optimization are fast enough that missions could be planned on board the space station. Potential applications for on-board mission planning are numerous. Future research topics may include optimal planning of multi-waypoint maneuvers using a knowledge base to guide the optimization, and a study aimed at developing robust annealing schedules for potential on-board missions.
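
    The barrier-method idea, folding constraints into the annealing cost so that infeasible states are rejected or penalized, can be sketched on a toy objective. The two-variable "fuel" function and single circular constraint below are stand-ins; the actual orbital dynamics and the four mission constraints are not reproduced:

```python
# Sketch of simulated annealing with a log-barrier constraint term.
# Objective and constraint are toy stand-ins for the two-burn problem.
import math
import random

random.seed(7)

def fuel(x, y):
    """Toy multimodal 'fuel' objective."""
    return (x - 1.0) ** 2 + (y + 2.0) ** 2 + 0.3 * math.sin(5 * x) * math.sin(5 * y)

def cost(x, y):
    g = 4.0 - (x ** 2 + y ** 2)    # feasibility margin: x^2 + y^2 < 4
    if g <= 0.0:
        return float("inf")        # outside the barrier: never accepted
    return fuel(x, y) - 0.05 * math.log(g)   # log-barrier penalty

x, y = 0.0, 0.0                    # feasible starting state
c, t = cost(x, y), 1.0
best = (c, x, y)
for _ in range(30000):
    nx, ny = x + random.gauss(0.0, 0.1), y + random.gauss(0.0, 0.1)
    nc = cost(nx, ny)
    # Metropolis rule; exp(-inf) = 0, so infeasible moves are never taken
    if nc < c or random.random() < math.exp(-(nc - c) / t):
        x, y, c = nx, ny, nc
        if c < best[0]:
            best = (c, x, y)
    t *= 0.9997

print(tuple(round(v, 3) for v in best))
```

    Because infeasible states carry infinite cost, the best point returned is guaranteed to satisfy the constraint, which mirrors how a barrier method keeps the annealer inside the feasible region.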

  10. An Innovative Tool for Intraoperative Electron Beam Radiotherapy Simulation and Planning: Description and Initial Evaluation by Radiation Oncologists

    Energy Technology Data Exchange (ETDEWEB)

    Pascau, Javier, E-mail: jpascau@mce.hggm.es [Unidad de Medicina y Cirugia Experimental, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Departamento de Bioingenieria e Ingenieria Aeroespacial, Universidad Carlos III de Madrid, Madrid (Spain); Santos Miranda, Juan Antonio [Servicio de Oncologia Radioterapica, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Facultad de Medicina, Universidad Complutense de Madrid, Madrid (Spain); Calvo, Felipe A. [Servicio de Oncologia Radioterapica, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Facultad de Medicina, Universidad Complutense de Madrid, Madrid (Spain); Departamento de Oncologia, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Bouche, Ana; Morillo, Virgina [Consorcio Hospitalario Provincial de Castellon, Castellon (Spain); Gonzalez-San Segundo, Carmen [Servicio de Oncologia Radioterapica, Hospital General Universitario Gregorio Maranon, Madrid (Spain); Facultad de Medicina, Universidad Complutense de Madrid, Madrid (Spain); Ferrer, Carlos; Lopez Tarjuelo, Juan [Consorcio Hospitalario Provincial de Castellon, Castellon (Spain); and others

    2012-06-01

    Purpose: Intraoperative electron beam radiation therapy (IOERT) involves a modified strategy of conventional radiation therapy and surgery. The lack of specific planning tools limits the spread of this technique. The purpose of the present study is to describe a new simulation and planning tool and its initial evaluation by clinical users. Methods and Materials: The tool works on a preoperative computed tomography scan. A physician contours regions to be treated and protected and simulates applicator positioning, calculating isodoses and the corresponding dose-volume histograms depending on the selected electron energy. Three radiation oncologists evaluated data from 15 IOERT patients, including different tumor locations. Segmentation masks, applicator positions, and treatment parameters were compared. Results: High parameter agreement was found in the following cases: three breast and three rectal cancer, retroperitoneal sarcoma, and rectal and ovary monotopic recurrences. All radiation oncologists performed similar segmentations of tumors and high-risk areas. The average applicator position difference was 1.2 ± 0.95 cm. The remaining cancer sites showed higher deviations because of differences in the criteria for segmenting high-risk areas (one rectal, one pancreas) and different surgical access simulated (two rectal, one Ewing sarcoma). Conclusions: The results show that this new tool can be used to simulate IOERT cases involving different anatomic locations, and that preplanning has to be carried out with specialized surgical input.

  11. High Performance Electrical Modeling and Simulation Software Normal Environment Verification and Validation Plan, Version 1.0; TOPICAL

    International Nuclear Information System (INIS)

    WIX, STEVEN D.; BOGDAN, CAROLYN W.; MARCHIONDO JR., JULIO P.; DEVENEY, MICHAEL F.; NUNEZ, ALBERT V.

    2002-01-01

    The requirements in modeling and simulation are driven by two fundamental changes in the nuclear weapons landscape: (1) The Comprehensive Test Ban Treaty and (2) The Stockpile Life Extension Program which extends weapon lifetimes well beyond their originally anticipated field lifetimes. The move from confidence based on nuclear testing to confidence based on predictive simulation forces a profound change in the performance asked of codes. The scope of this document is to improve the confidence in the computational results by demonstration and documentation of the predictive capability of electrical circuit codes and the underlying conceptual, mathematical and numerical models as applied to a specific stockpile driver. This document describes the High Performance Electrical Modeling and Simulation software normal environment Verification and Validation Plan

  12. First application of quantum annealing to IMRT beamlet intensity optimization

    International Nuclear Information System (INIS)

    Nazareth, Daryl P; Spaans, Jason D

    2015-01-01

    Optimization methods are critical to radiation therapy. A new technology, quantum annealing (QA), employs novel hardware and software techniques to address various discrete optimization problems in many fields. We report on the first application of quantum annealing to the process of beamlet intensity optimization for IMRT. We apply recently developed hardware which natively exploits quantum mechanical effects for improved optimization. The new algorithm, called QA, is most similar to simulated annealing, but relies on natural processes to directly minimize a system's free energy. A simple quantum system is slowly evolved into a classical system representing the objective function. If the evolution is sufficiently slow, there are probabilistic guarantees that a global minimum will be located. To apply QA to IMRT-type optimization, two prostate cases were considered. A reduced number of beamlets was employed, due to current QA hardware limitations. The beamlet dose matrices were computed using CERR, and an objective function was defined based on typical clinical constraints, including dose-volume objectives, which result in a complex non-convex search space. The objective function was discretized, and the QA method was compared to two standard optimization methods, simulated annealing and Tabu search, run on a conventional computing cluster. Based on several runs, the average final objective function value achieved by QA was 16.9 for the first patient, compared with 10.0 for Tabu search and 6.7 for the simulated annealing (SA) method. For the second patient, the values were 70.7 for QA, 120.0 for Tabu search, and 22.9 for SA. The QA algorithm required 27–38% of the time required by the other two methods. In this first application of hardware-enabled QA to IMRT optimization, its performance is comparable to Tabu search, but less effective than SA in terms of final objective function values. However, it was 3–4 times faster than the other two methods.
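    The discretized beamlet-intensity search space described above can be illustrated with a classical simulated-annealing baseline over integer intensity levels. The toy dose matrix, the quadratic-plus-overdose objective, and all numbers below are illustrative assumptions, not the paper's clinical data or exact cost function.

```python
import math
import random
import numpy as np

def objective(w, D, target_dose, oar_mask, oar_limit):
    """Clinical-style objective: squared deviation from the prescription in
    the target, plus a one-sided overdose penalty in the organ at risk."""
    dose = D @ w
    target_term = np.sum((dose[~oar_mask] - target_dose) ** 2)
    oar_term = np.sum(np.maximum(dose[oar_mask] - oar_limit, 0.0) ** 2)
    return target_term + 10.0 * oar_term

def anneal_beamlets(D, target_dose, oar_mask, oar_limit,
                    levels=8, iters=4000, seed=1):
    """SA over discretized beamlet intensities (0..levels-1), mirroring the
    discretized search space fed to an annealing-type optimizer."""
    rng = random.Random(seed)
    w = np.zeros(D.shape[1])
    e = objective(w, D, target_dose, oar_mask, oar_limit)
    best_w, best_e = w.copy(), e
    t = 1.0
    for _ in range(iters):
        i = rng.randrange(len(w))
        old = w[i]
        w[i] = rng.randrange(levels)          # propose a new intensity level
        e_new = objective(w, D, target_dose, oar_mask, oar_limit)
        if e_new < e or rng.random() < math.exp(-(e_new - e) / t):
            e = e_new
            if e < best_e:
                best_w, best_e = w.copy(), e
        else:
            w[i] = old                         # reject: restore old level
        t *= 0.998
    return best_w, best_e

# Tiny example: 5 voxels (last two are OAR), 4 beamlets.
D = np.array([[1.0, 0.2, 0.0, 0.1],
              [0.8, 0.9, 0.1, 0.0],
              [0.1, 1.0, 0.8, 0.2],
              [0.0, 0.1, 0.9, 1.0],
              [0.2, 0.0, 0.3, 0.9]])
oar = np.array([False, False, False, True, True])
w_best, e_best = anneal_beamlets(D, target_dose=4.0, oar_mask=oar, oar_limit=2.0)
```

    The dose-volume objectives used clinically make this landscape non-convex, which is why annealing-type methods are attractive here.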

  13. High Altitude Long Endurance Remotely Operated Aircraft - National Airspace System Integration - Simulation IPT: Detailed Airspace Operations Simulation Plan. Version 1.0

    Science.gov (United States)

    2004-01-01

    The primary goal of Access 5 is to allow safe, reliable and routine operations of High Altitude-Long Endurance Remotely Operated Aircraft (HALE ROAs) within the National Airspace System (NAS). Step 1 of Access 5 addresses the policies, procedures, technologies and implementation issues of introducing such operations into the NAS above pressure altitude 40,000 ft (Flight Level 400 or FL400). Routine HALE ROA activity within the NAS represents a potentially significant change to the tasks and concerns of NAS users, service providers and other stakeholders. Due to the complexity of the NAS, and the importance of maintaining current high levels of safety in the NAS, any significant changes must be thoroughly evaluated prior to implementation. The Access 5 community has been tasked with performing this detailed evaluation of routine HALE-ROA activities in the NAS, and providing to key NAS stakeholders a set of recommended policies and procedures to achieve this goal. Extensive simulation, in concert with a directed flight demonstration program are intended to provide the required supporting evidence that these recommendations are based on sound methods and offer a clear roadmap to achieving safe, reliable and routine HALE ROA operations in the NAS. Through coordination with NAS service providers and policy makers, and with significant input from HALE-ROA manufacturers, operators and pilots, this document presents the detailed simulation plan for Step 1 of Access 5. A brief background of the Access 5 project will be presented with focus on Steps 1 and 2, concerning HALE-ROA operations above FL400 and FL180 respectively. An overview of project management structure follows with particular emphasis on the role of the Simulation IPT and its relationships to other project entities. 
This discussion will include a description of work packages assigned to the Simulation IPT, and present the specific goals to be achieved for each simulation work package, along with the associated

  14. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    Energy Technology Data Exchange (ETDEWEB)

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia; Pilch, Martin M.

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Agency (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

  15. Spatial policy, planning and infrastructure investment: lessons from urban simulations in three South African cities

    CSIR Research Space (South Africa)

    Coetzee, M

    2014-05-01

    Full Text Available from work conducted as part of a Department of Science and Technology (DST)-funded Integrated Planning and Development Modelling (IPDM) project, the article argues that decisions about infrastructure investment in South African metropolitan areas ought...

  16. Transforming network simulation data to semantic data for network attack planning

    CSIR Research Space (South Africa)

    Chan, Ke Fai Peter

    2017-03-01

    Full Text Available study was performed, using the Common Open Research Emulator (CORE), to generate the necessary network simulation data. The simulation data was analysed, and then transformed into linked data. The result of the transformation is a data file that adheres...

  17. Simulation in Quality Management – An Approach to Improve Inspection Planning

    Directory of Open Access Journals (Sweden)

    H.-A. Crostack

    2005-01-01

    Full Text Available Production is a multi-step process involving many different articles produced in different jobs by various machining stations. Quality inspection has to be integrated in the production sequence in order to ensure the conformance of the products. The interactions between manufacturing processes and inspections are very complex, since three aspects (quality, cost and time) must all be considered at the same time while determining the suitable inspection strategy. Therefore, a simulation approach was introduced to solve this problem. The simulator called QUINTE [the QUINTE simulator has been developed at the University of Dortmund in the course of two research projects funded by the German Federal Ministry of Economics and Labour (BMWA: Bundesministerium für Wirtschaft und Arbeit), the Arbeitsgemeinschaft industrieller Forschungsvereinigungen (AiF, Cologne/Germany) and the Forschungsgemeinschaft Qualität, Frankfurt a.M./Germany] was developed to simulate the machining as well as the inspection. It can be used to investigate and evaluate inspection strategies in manufacturing processes. The investigation into the application of the QUINTE simulator in industry was carried out at two pilot companies. The results show the validity of this simulator. An attempt to run QUINTE in a user-friendly environment, i.e., the commercial simulation software Arena®, is also described in this paper. NOTATION: QUINTE – Qualität in der Teilefertigung (Quality in the manufacturing process)

  18. Simulations

    CERN Document Server

    Ngada, Narcisse

    2015-06-15

    The complexity and cost of building and running high-power electrical systems make the use of simulations unavoidable. The simulations available today provide great understanding about how systems really operate. This paper helps the reader to gain an insight into simulation in the field of power converters for particle accelerators. Starting with the definition and basic principles of simulation, two simulation types, as well as their leading tools, are presented: analog and numerical simulations. Some practical applications of each simulation type are also considered. The final conclusion then summarizes the main important items to keep in mind before opting for a simulation tool or before performing a simulation.

  19. A comparison of forward planning and optimised inverse planning

    International Nuclear Information System (INIS)

    Oldham, Mark; Neal, Anthony; Webb, Steve

    1995-01-01

    A radiotherapy treatment plan optimisation algorithm has been applied to 48 prostate plans and the results compared with those of an experienced human planner. Twelve patients were used in the study, and 3-, 4-, 6- and 8-field plans (with standard coplanar beam angles for each plan type) were optimised by both the human planner and the optimisation algorithm. The human planner optimised the plans by conventional forward planning techniques. The optimisation algorithm was based on fast simulated annealing. 'Importance factors' assigned to different regions of the patient provide a method for controlling the algorithm, and it was found that the same values gave good results for almost all plans. The plans were compared on the basis of dose statistics, normal-tissue-complication-probability (NTCP) and tumour-control-probability (TCP). The results show that the optimisation algorithm yielded results that were at least as good as the human planner's for all plan types, and on the whole slightly better. A study of the beam weights chosen by the optimisation algorithm and the planner will be presented. The optimisation algorithm showed greater variation in response to individual patient geometry. For simple (e.g. 3-field) plans it was found to consistently achieve slightly higher TCP and lower NTCP values. For more complicated (e.g. 8-field) plans the optimisation also achieved slightly better results, with generally fewer beams. The optimisation time was always ≤5 minutes, a factor of up to 20 times faster than the human planner.
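    The 'importance factors' controlling the algorithm can be illustrated with a weighted quadratic plan cost. The quadratic form, region names, and dose values below are illustrative assumptions, not the paper's exact cost function.

```python
import numpy as np

def plan_cost(dose, regions, prescription, importance):
    """Importance-weighted quadratic cost over patient regions: each region
    contributes its mean squared deviation from the prescribed dose, scaled
    by an importance factor that steers the optimiser's trade-offs."""
    total = 0.0
    for name, voxels in regions.items():
        total += importance[name] * np.mean((dose[voxels] - prescription[name]) ** 2)
    return total

# Illustrative 5-voxel patient: 3 target voxels, 2 rectum voxels.
regions = {"target": [0, 1, 2], "rectum": [3, 4]}
prescription = {"target": 60.0, "rectum": 0.0}
importance = {"target": 1.0, "rectum": 0.2}

plan_a = np.array([60.0, 59.0, 61.0, 20.0, 10.0])  # good target, high rectum dose
plan_b = np.array([55.0, 65.0, 60.0, 5.0, 5.0])    # worse target, spared rectum
cost_a = plan_cost(plan_a, regions, prescription, importance)
cost_b = plan_cost(plan_b, regions, prescription, importance)
```

    Raising the rectum importance factor would reverse the ranking of two such plans, which is how a single set of factors can encode clinical priorities across many patients.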

  20. Laser annealing of ion implanted silicon

    International Nuclear Information System (INIS)

    White, C.W.; Narayan, J.; Young, R.T.

    1978-11-01

    The physical and electrical properties of ion implanted silicon annealed with high powered ruby laser radiation are summarized. Results show that pulsed laser annealing can lead to a complete removal of extended defects in the implanted region accompanied by incorporation of dopants into lattice sites even when their concentration far exceeds the solid solubility limit

  1. Annealed star-branched polyelectrolytes in solution

    NARCIS (Netherlands)

    Klein Wolterink, J.; Male, van J.; Cohen Stuart, M.A.; Koopal, L.K.; Zhulina, E.B.; Borisov, O.V.

    2002-01-01

    Equilibrium conformations of annealed star-branched polyelectrolytes (polyacids) are calculated with a numerical self-consistent-field (SCF) model. From the calculations we obtain also the size and charge of annealed polyelectrolyte stars as a function of the number of arms, pH, and the ionic

  2. Annealed silver-islands for enhanced optical absorption in organic solar cell

    Energy Technology Data Exchange (ETDEWEB)

    Otieno, Francis, E-mail: frankotienoo@gmail.com [Material Physics Research Institute, School of Physics, University of the Witwatersrand, Private Bag 3, Wits, 2050 Johannesburg (South Africa); Materials for Energy Research Group, University of the Witwatersrand, Private Bag 3, Wits, 2050 Johannesburg (South Africa); Airo, Mildred [School of Chemistry, University of the Witwatersrand, Private Bag 3, Wits, 2050 (South Africa); Ranganathan, Kamalakannan [School of Chemistry, University of the Witwatersrand, Private Bag 3, Wits, 2050 (South Africa); DST-NRF Centre of Strong Materials and the Molecular Sciences Institute, School of Chemistry, University of the Witwatersrand, 2193 Johannesburg (South Africa); Wamwangi, Daniel [Material Physics Research Institute, School of Physics, University of the Witwatersrand, Private Bag 3, Wits, 2050 Johannesburg (South Africa); Materials for Energy Research Group, University of the Witwatersrand, Private Bag 3, Wits, 2050 Johannesburg (South Africa)

    2016-01-01

    Silver nano-islands are explored for enhancing optical absorption and photo-conversion efficiency in organic solar cells (OSCs) based on the surface plasmon resonance effect under diverse annealing conditions. Ag nano-islands were deposited by RF magnetron sputtering at 15 W for 10 s and subsequently annealed between 100 °C and 250 °C in air and argon ambients. The optical properties of the reconstructed Ag islands demonstrate an increase and a blue shift in the absorption bands with increasing annealing temperature. This is the localized surface plasmon effect due to Ag islands of diverse sizes, shapes and coverages. The increase in optical absorption with temperature is attributed to changes in island shape and density, as corroborated by atomic force microscopy and TEM. As a proof of concept, an organic solar cell was characterized by current–voltage (I–V) measurements in the dark and under simulated solar white light. Incorporation of annealed Ag islands yielded an efficiency increase of 4–24%. - Highlights: • RF sputtering can be used to produce Ag NPs at low power. • Annealing enhances size and shape reconstruction as well as inter-particle separation. • Annealing in an argon ambient is more suitable than in air. • Ag NPs annealed at 250 °C enhance device absorption and PCE by up to 24%.

  3. Luminescence sensitivity changes in quartz as a result of annealing

    DEFF Research Database (Denmark)

    Bøtter-Jensen, L.; Agersnap Larsen, N.; Mejdahl, V.

    1995-01-01

    archaeological samples show very different OSL sensitivities. In this paper we report on studies of the effect of high-temperature annealing on the OSL and phototransferred TL (PTTL) signals from sedimentary and synthetic quartz. A dramatic enhancement of both OSL and PTTL sensitivity was found, especially … in the temperature range 500–800 °C. Computer simulations of the possible effects are shown to produce data that agree in all essential details with the experimental observations. It is further demonstrated that the enhanced OSL sensitivity as a function of annealing temperature is not a pre-dose effect. … of magnitude less per unit radiation than that for heated material. The reason these temperature-induced sensitivity changes occur in quartz is presently not well understood. This phenomenon is also seen in the related area of luminescence dating, in which sedimentary quartz and quartz from heated …

  4. PEGASO - simulation model for the operation of nuclear power plants for planning purposes

    International Nuclear Information System (INIS)

    Ribeiro, A.A.T.; Muniz, A.A.

    1979-07-01

    The user manual for PEGASO is presented. PEGASO consists of a set of programs whose objective is to simulate the monthly operation of nuclear power plants (up to 10 NPPs), determining the principal physical parameters and criticality. (Author) [pt

  5. Inter-Enterprise Planning of Manufacturing Systems Applying Simulation with IPR Protection

    Science.gov (United States)

    Mertins, Kai; Rabe, Markus

    Discrete Event Simulation is a well-proved method to analyse the dynamic behaviour of manufacturing systems. However, simulation application is still poor for external supply chains or virtual enterprises, encompassing several legal entities. Most conventional simulation systems provide no means to protect intellectual property rights (IPR), nor methods to support cross-enterprise teamwork. This paper describes a solution to keep enterprise models private, but still provide their functionality for cross-enterprise evaluation purposes. Applying the new modelling system, the inter-enterprise business process is specified by the user, including a specification of the objects exchanged between the local models. The required environment for a distributed simulation is generated automatically. The mechanisms have been tested with a large supply chain model.

  6. Understanding the microwave annealing of silicon

    Directory of Open Access Journals (Sweden)

    Chaochao Fu

    2017-03-01

    Full Text Available Though microwave annealing appears very appealing due to its unique features, the lack of an in-depth understanding and of an accurate model hinders its application in semiconductor processing. In this paper, a physics-based model and accurate calculation for the microwave annealing of silicon are presented. Both thermal effects, including ohmic conduction loss and dielectric polarization loss, and non-thermal effects are thoroughly analyzed. We designed unique experiments to verify the mechanism and extract the relevant parameters. We also explicitly illustrate the dynamic interaction processes of the microwave annealing of silicon. This work provides an in-depth understanding that can expedite the application of microwave annealing in semiconductor processing and opens the door to implementing microwave annealing in future research and applications.
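    The two thermal loss channels named above enter the standard volumetric heating expression for a lossy medium. A minimal sketch follows; the 2.45 GHz frequency and the material values in the example are illustrative assumptions, not the paper's measured parameters.

```python
import math

EPS0 = 8.854e-12  # vacuum permittivity, F/m

def microwave_power_density(e_rms, freq_hz, sigma, eps_imag):
    """Volumetric heating (W/m^3) in a lossy medium: ohmic conduction loss
    plus dielectric polarization loss,
    P = (sigma + omega * eps0 * eps'') * E_rms^2."""
    omega = 2.0 * math.pi * freq_hz
    return (sigma + omega * EPS0 * eps_imag) * e_rms ** 2

# Illustrative numbers: 2.45 GHz field in a moderately conductive silicon sample.
p = microwave_power_density(e_rms=1.0e4, freq_hz=2.45e9, sigma=1.0, eps_imag=0.1)
```

    For conductive silicon at microwave frequencies the ohmic term typically dominates the dielectric term, which is one reason doping level strongly affects heating rates.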

  7. Reduced annealing temperatures in silicon solar cells

    Science.gov (United States)

    Weinberg, I.; Swartz, C. K.

    1981-01-01

    Cells irradiated to a fluence of 5×10¹³ cm⁻² showed short-circuit-current annealing at 200 °C, with complete annealing occurring at 275 °C. Cells irradiated to 1×10¹⁴ cm⁻² showed a reduction in annealing temperature from the usual 500 °C to 300 °C. Annealing kinetics studies yield an activation energy of (1.5 ± 0.2) eV for the low-fluence, low-temperature anneal. Comparison with activation energies previously obtained indicates that the presently obtained activation energy is consistent with the presence of either the divacancy or the carbon-interstitial carbon-substitutional pair, a result which agrees with the conclusion based on defect behavior in boron-doped silicon.
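    Activation energies like the one quoted above are typically extracted from annealing time constants via an Arrhenius relation. A minimal two-temperature sketch follows; the time constants in the example are invented for illustration and are not the paper's data.

```python
import math

K_B = 8.617e-5  # Boltzmann constant, eV/K

def activation_energy(tau1, temp1_k, tau2, temp2_k):
    """Arrhenius estimate of the activation energy from annealing time
    constants at two temperatures: tau = tau0 * exp(Ea / (k_B * T)), so
    Ea = k_B * ln(tau1 / tau2) / (1/T1 - 1/T2)."""
    return K_B * math.log(tau1 / tau2) / (1.0 / temp1_k - 1.0 / temp2_k)

# Illustrative (invented) time constants: 1000 s at 200 C, 10 s at 275 C.
ea = activation_energy(1000.0, 473.15, 10.0, 548.15)
```

    In practice the fit uses annealing rates at several temperatures, but the two-point form above shows why a modest temperature window can pin down the activation energy.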

  8. Protocol for quality control of scanners used in the simulation of radiotherapy treatment planning

    International Nuclear Information System (INIS)

    Morales, Jorge l; Alfonso, Rodolfo; Vega, Manuel

    2009-01-01

    Treatment planning of HDR brachytherapy with Ir-192 is performed at INOR on the basis of semi-orthogonal X-ray images. In the case of mold implants for head and neck lesions used to boost the external-beam radiation dose, combining the isodose distributions of both modalities provides valuable information. CT imaging of the patient with the applicator-fitted mold in place makes it possible to obtain three-dimensional dose distributions in different anatomical views. The aim of this study was to implement verification of post-plan dose distributions and the possibility of combined distributions. (author)

  9. Preoperative planning of thoracic surgery with use of three-dimensional reconstruction, rapid prototyping, simulation and virtual navigation

    Science.gov (United States)

    Heuts, Samuel; Maessen, Jos G.

    2016-01-01

    For the past decades, surgeries have become more complex due to the increasing age of the patient population referred for thoracic surgery, more complex pathology, and the emergence of minimally invasive thoracic surgery. Together with the early detection of thoracic disease as a result of innovations in diagnostic possibilities and the paradigm shift to personalized medicine, preoperative planning is becoming an indispensable and crucial aspect of surgery. Several new techniques facilitating this paradigm shift have emerged. Preoperative marking and staining of lesions are already a widely accepted method of preoperative planning in thoracic surgery. However, three-dimensional (3D) image reconstruction, virtual simulation, and rapid prototyping (RP) are still in the development phase. These new techniques are expected to become an important part of the standard work-up of patients undergoing thoracic surgery in the future. This review aims at graphically presenting and summarizing these new diagnostic and therapeutic tools. PMID:29078505

  10. Advanced Simulation and Computing FY08-09 Implementation Plan Volume 2 Revision 0

    International Nuclear Information System (INIS)

    McCoy, M; Kusnezov, D; Bikkel, T; Hopson, J

    2007-01-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the safety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future nonnuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. 
Moreover, ASC has restructured its business model from one

  11. Advanced Simulation and Computing FY10-FY11 Implementation Plan Volume 2, Rev. 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, R; Peery, J; McCoy, M; Hopson, J

    2009-09-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  12. Advanced Simulation and Computing FY09-FY10 Implementation Plan Volume 2, Rev. 1

    Energy Technology Data Exchange (ETDEWEB)

    Kissel, L

    2009-04-01

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. 
Moreover, ASC has restructured its business model from one that

  13. Advanced Simulation and Computing FY09-FY10 Implementation Plan, Volume 2, Revision 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Meisner, R; Hopson, J; Peery, J; McCoy, M

    2008-10-07

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from one

  14. Advanced Simulation and Computing FY10-11 Implementation Plan Volume 2, Rev. 0

    Energy Technology Data Exchange (ETDEWEB)

    Carnes, B

    2009-06-08

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. 
Moreover, ASC has restructured its business model from one that

  15. Advanced Simulation and Computing Fiscal Year 2011-2012 Implementation Plan, Revision 0

    Energy Technology Data Exchange (ETDEWEB)

    McCoy, M; Phillips, J; Hopson, J; Meisner, R

    2010-04-22

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering (D&E) programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable stockpile Life Extension Programs (LEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional (3D) simulation environment while maintaining support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model

  16. Advanced Simulation and Computing FY08-09 Implementation Plan, Volume 2, Revision 0.5

    Energy Technology Data Exchange (ETDEWEB)

    Kusnezov, D; Bickel, T; McCoy, M; Hopson, J

    2007-09-13

    The Stockpile Stewardship Program (SSP) is a single, highly integrated technical program for maintaining the surety and reliability of the U.S. nuclear stockpile. The SSP uses past nuclear test data along with current and future non-nuclear test data, computational modeling and simulation, and experimental facilities to advance understanding of nuclear weapons. It includes stockpile surveillance, experimental research, development and engineering programs, and an appropriately scaled production capability to support stockpile requirements. This integrated national program requires the continued use of current facilities and programs along with new experimental facilities and computational enhancements to support these programs. The Advanced Simulation and Computing Program (ASC) is a cornerstone of the SSP, providing simulation capabilities and computational resources to support the annual stockpile assessment and certification, to study advanced nuclear-weapons design and manufacturing processes, to analyze accident scenarios and weapons aging, and to provide the tools to enable Stockpile Life Extension Programs (SLEPs) and the resolution of Significant Finding Investigations (SFIs). This requires a balanced resource, including technical staff, hardware, simulation software, and computer science solutions. In its first decade, the ASC strategy focused on demonstrating simulation capabilities of unprecedented scale in three spatial dimensions. In its second decade, ASC is focused on increasing its predictive capabilities in a three-dimensional simulation environment while maintaining the support to the SSP. The program continues to improve its unique tools for solving progressively more difficult stockpile problems (focused on sufficient resolution, dimensionality and scientific details); to quantify critical margins and uncertainties (QMU); and to resolve increasingly difficult analyses needed for the SSP. Moreover, ASC has restructured its business model from

  17. Electrical properties and annealing kinetics study of laser-annealed ion-implanted silicon

    International Nuclear Information System (INIS)

    Wang, K.L.; Liu, Y.S.; Kirkpatrick, C.G.; Possin, G.E.

    1979-01-01

    This paper describes measurements of electrical properties and the regrowth behavior of ion-implanted silicon annealed with an 80-ns (FWHM) laser pulse at 1.06 μm. The experimental results include: (1) a determination of the threshold energy density required for melting using a transient optical reflectivity technique, (2) measurements of dopant distribution using Rutherford backscattering spectroscopy, (3) characterization of electrical properties by measuring reverse leakage current densities of laser-annealed and thermal-annealed mesa diodes, (4) determination of annealed junction depth using an electron-beam-induced-current technique, and (5) a deep-level-transient spectroscopic study of residual defects. In particular, by measuring these properties of a diode annealed at a condition near the threshold energy density for liquid phase epitaxial regrowth, we have found certain correlations among these various annealing behaviors and electrical properties of laser-annealed ion-implanted silicon diodes.

  18. Multiple methods for multiple futures: Integrating qualitative scenario planning and quantitative simulation modeling for natural resource decision making

    Science.gov (United States)

    Symstad, Amy J.; Fisichelli, Nicholas A.; Miller, Brian W.; Rowland, Erika; Schuurman, Gregor W.

    2017-01-01

    Scenario planning helps managers incorporate climate change into their natural resource decision making through a structured “what-if” process of identifying key uncertainties and potential impacts and responses. Although qualitative scenarios, in which ecosystem responses to climate change are derived via expert opinion, often suffice for managers to begin addressing climate change in their planning, this approach may face limits in resolving the responses of complex systems to altered climate conditions. In addition, this approach may fall short of the scientific credibility managers often require to take actions that differ from current practice. Quantitative simulation modeling of ecosystem response to climate conditions and management actions can provide this credibility, but its utility is limited unless the modeling addresses the most impactful and management-relevant uncertainties and incorporates realistic management actions. We use a case study to compare and contrast management implications derived from qualitative scenario narratives and from scenarios supported by quantitative simulations. We then describe an analytical framework that refines the case study’s integrated approach in order to improve applicability of results to management decisions. The case study illustrates the value of an integrated approach for identifying counterintuitive system dynamics, refining understanding of complex relationships, clarifying the magnitude and timing of changes, identifying and checking the validity of assumptions about resource responses to climate, and refining management directions. Our proposed analytical framework retains qualitative scenario planning as a core element because its participatory approach builds understanding for both managers and scientists, lays the groundwork to focus quantitative simulations on key system dynamics, and clarifies the challenges that subsequent decision making must address.

  19. Dosimetric study of prostate brachytherapy using techniques of Monte-Carlo simulation, experimental measurements and comparison with a treatment plan

    International Nuclear Information System (INIS)

    Teles, Pedro; Barros, Silvia; Vaz, Pedro; Goncalves, Isabel; Facure, Alessandro; Rosa, Luiz da; Santos, Maira; Pereira Junior, Pedro Paulo; Zankl, Maria

    2013-01-01

    Prostate brachytherapy is a radiotherapy technique that consists of inserting a number of radioactive seeds (usually containing the radionuclides 125I, 241Am or 103Pd) within or in the vicinity of prostate tumor tissue. The main objective of this technique is to maximize the radiation dose to the tumor while minimizing it in healthy tissues and organs, in order to reduce morbidity. The absorbed dose distribution in the prostate using this technique is usually non-homogeneous and time dependent. Various parameters, such as the type of seed, the attenuation interactions between seeds, their geometrical arrangement within the prostate, the actual geometry of the seeds, and subsequent swelling of the prostate gland after implantation, greatly influence the absorbed dose distribution in the prostate and surrounding areas. Quantification of these parameters is therefore extremely important for dose optimization and for improving conventional treatment plans, which
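    The Monte Carlo approach described in this record can be illustrated with a deliberately simplified sketch: tallying where photons from a single point-like seed deposit their energy in water, using radial shells. The numbers below (a 28 keV mean photon energy for a 125I-like source, a water attenuation coefficient of 0.38/cm, full energy deposition at the first interaction point) are illustrative assumptions for the sketch only, not data from the study, which used detailed seed geometries and voxel phantoms.

```python
import math
import random

# Toy Monte Carlo: radial energy deposition around a single point seed in water.
# Illustrative assumptions (not TG-43 or study data):
#   E_PHOTON -- mean photon energy of a 125I-like source, in eV (~28 keV)
#   MU       -- total linear attenuation coefficient of water at that energy (1/cm)
E_PHOTON = 28.0e3
MU = 0.38
N_PHOTONS = 100_000
SHELL_W = 0.25      # radial shell width, cm
N_SHELLS = 20

random.seed(42)
deposit = [0.0] * N_SHELLS  # energy tallied per shell, eV

for _ in range(N_PHOTONS):
    # Emission is isotropic; for a point source only the distance to the first
    # interaction matters, sampled from the exponential path-length distribution.
    r = -math.log(random.random()) / MU
    shell = int(r / SHELL_W)
    if shell < N_SHELLS:
        # Crude approximation: the photon deposits all its energy locally.
        deposit[shell] += E_PHOTON

# Convert shell tallies to energy per unit volume (a dose proxy).
for i in range(N_SHELLS):
    r_in, r_out = i * SHELL_W, (i + 1) * SHELL_W
    vol = 4.0 / 3.0 * math.pi * (r_out**3 - r_in**3)
    deposit[i] /= vol

print(f"shell 0: {deposit[0]:.3e} eV/cm^3, shell 10: {deposit[10]:.3e} eV/cm^3")
```

    The steep falloff of the dose proxy with distance is the behavior that makes seed placement and interseed attenuation so influential in the full simulations; a production code would additionally transport scattered photons, model the seed capsule, and score dose in tissue voxels.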