WorldWideScience

Sample records for optimization simulation tooling

  1. Biopharmaceutical Process Optimization with Simulation and Scheduling Tools

    Directory of Open Access Journals (Sweden)

    Demetri Petrides

    2014-09-01

    Full Text Available Design and assessment activities associated with a biopharmaceutical process are performed at different levels of detail, based on the stage of development that the product is in. Preliminary “back-of-the-envelope” assessments are performed early in the development lifecycle, whereas detailed design and evaluation are performed prior to the construction of a new facility. Both the preliminary and detailed design of integrated biopharmaceutical processes can be greatly assisted by the use of process simulators, discrete event simulators or finite capacity scheduling tools. This report describes the use of such tools for bioprocess development, design, and manufacturing. The report is divided into three sections. Section One provides introductory information and explains the purpose of bioprocess simulation. Section Two focuses on the detailed modeling of a single batch bioprocess that represents the manufacturing of a therapeutic monoclonal antibody (MAb). This type of analysis is typically performed by engineers engaged in the development and optimization of such processes. Section Three focuses on production planning and scheduling models for multiproduct plants.

  2. BASIMO - Borehole Heat Exchanger Array Simulation and Optimization Tool

    Science.gov (United States)

    Schulte, Daniel O.; Welsch, Bastian; Rühaak, Wolfram; Bär, Kristian; Sass, Ingo

    2017-04-01

    Arrays of borehole heat exchangers are an increasingly popular source for renewable energy. Furthermore, they can serve as borehole thermal energy storage (BTES) systems for seasonally fluctuating heat sources like solar thermal energy or district heating grids. The high temperature level of these heat sources prohibits the use of the shallow subsurface for environmental reasons. Therefore, deeper reservoirs have to be accessed instead. The increased depth of the systems results in high investment costs and has hindered the implementation of this technology until now. Therefore, research on medium-deep BTES systems relies on numerical simulation models. Current simulation tools cannot - or only to some extent - describe key features like partly insulated boreholes unless they run fully discretized models of the borehole heat exchangers. However, fully discretized models often come at a high computational cost, especially for large arrays of borehole heat exchangers. We give an update on the development of BASIMO: a tool that uses one-dimensional thermal resistance and capacity models for the borehole heat exchangers coupled with a numerical finite element model for the subsurface heat transport in a dual-continuum approach. An unstructured tetrahedral mesh bypasses the limitations of structured grids for borehole path geometries, while the thermal resistance and capacity model is improved to account for borehole heat exchanger properties changing with depth. Thereby, partly insulated boreholes can be considered in the model. Furthermore, BASIMO can be used to improve the design of BTES systems: the tool allows for automated parameter variations and is readily coupled to other code, such as mathematical optimization algorithms. Optimization can be used to determine the required minimum system size or to increase the system performance.
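
    The one-dimensional thermal resistance and capacity approach can be illustrated with a minimal lumped sketch. The toy function below is hypothetical and not BASIMO's formulation: a single borehole fluid node with heat capacity C (J/K) exchanges heat with the ground across a borehole thermal resistance R (K/W), marched with explicit Euler.

```python
def fluid_temperature(t0, t_ground, R, C, dt, steps):
    """Explicit Euler march of a lumped borehole fluid temperature:

        dT/dt = (t_ground - T) / (R * C)

    the simplest possible thermal resistance and capacity model of one
    borehole node (illustrative only; R, C are lumped values)."""
    T = t0
    history = [T]
    for _ in range(steps):
        T += dt * (t_ground - T) / (R * C)   # relax toward ground temperature
        history.append(T)
    return history
```

    Stacking several such nodes along the borehole, with R varying by depth, is the basic route to the depth-dependent (e.g. partly insulated) behavior the abstract describes.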

  3. BASIMO - Borehole Heat Exchanger Array Simulation and Optimization Tool

    Science.gov (United States)

    Schulte, Daniel; Rühaak, Wolfram; Welsch, Bastian; Bär, Kristian; Sass, Ingo

    2016-04-01

    Borehole heat exchangers represent a well-established technology that is pushing into new fields of application and novel modifications. Current simulation tools cannot - or only to some extent - describe features like inclined or partly insulated boreholes unless they run fully discretized models of the borehole heat exchangers. However, fully discretized models often come at a high computational cost, especially for large arrays of borehole heat exchangers. We present a tool that uses one-dimensional thermal resistance and capacity models for the borehole heat exchangers coupled with a numerical finite element model for the subsurface heat transport. An unstructured tetrahedral mesh bypasses the limitations of structured grids for borehole path geometries, while the thermal resistance and capacity model is improved to account for borehole heat exchanger properties changing with depth. The presented tool benefits from the fast analytical solution of the thermal interactions within the boreholes while still allowing for a detailed consideration of the borehole heat exchanger properties.

  4. SIMULATION AS A TOOL FOR PROCESS OPTIMIZATION OF LOGISTIC SYSTEMS

    Directory of Open Access Journals (Sweden)

    Radko Popovič

    2015-09-01

    Full Text Available The paper deals with the simulation of production processes, in particular with the Tecnomatix Process Simulate module of the Siemens Tecnomatix software suite. Tecnomatix Process Simulate is designed for building new or modifying existing production processes. A simulation created in this software allows fast testing of planned changes or improvements to the production processes. On the basis of the simulation, a picture of the future real production system can be formed. A 3D simulation can reflect the actual status and conditions of the running system and, after improvements are applied, can show the probable shape of the production system.

  5. Simulation tools

    CERN Document Server

    Jenni, F

    2006-01-01

    In the last two decades, simulation tools have made a significant contribution to the great progress in the development of power electronics. Time to market was shortened and development costs were reduced drastically. Falling costs, as well as improved speed and precision, opened new fields of application. Today, continuous and switched circuits can be mixed. A good number of powerful simulation tools is available, and users have to choose the one best suited to their application. Here a simple rule applies: the best available simulation tool is the tool the user is already used to (provided it can solve the task). Abilities, speed, user friendliness and other features are continuously being improved, even though these tools are already powerful and comfortable. This paper aims at giving the reader an insight into the simulation of power electronics. Starting with a short description of the fundamentals of a simulation tool as well as properties of tools, several tools are presented. Starting with simplified models ...

  6. Optimization Efficiency of Monte Carlo Simulation Tool for Evanescent Wave Spectroscopy Fiber-Optic Probe

    Directory of Open Access Journals (Sweden)

    Daniel Khankin

    2012-01-01

    Full Text Available In a previous work, we described the simulation tool FOPS 3D (Khankin et al., 2001), which can simulate the full three-dimensional geometrical structure of a fiber and the propagation of a light beam sent through it. In this paper we focus on three major points: the first concerns the improvements made to the simulation tool; the second, the optimizations implemented with respect to the efficiency of the calculations. Finally, the major research advance over our previous works is the presented simulation of the optimal absorbance value as a function of bending angle for a given uncladded-part diameter; it is suggested that fiber bending may improve the efficiency of recording the relevant measurements. This is the third iteration of the FOPS development process (Mann et al., 2009), which was significantly optimized by decreasing memory usage and increasing CPU utilization.

  7. Computer Simulation as a Tool for Analyzing and Optimizing Real-Life Processes

    Directory of Open Access Journals (Sweden)

    Tomáš Domonkos

    2010-06-01

    Full Text Available In some real-life situations, the analysis of complicated systems using the standard analytical methods of operational research is a particularly difficult endeavor, owing to the system's complicated structure or the impossibility of reaching a mathematical solution. In such cases, simulation modeling as a tool for supporting practical decision-making offers a possible way forward. The aim of this paper is to characterize briefly the development of discrete-event simulation methodology over the past 50 years in light of the evolution of various simulation programs, to describe the essentials of the simulation tool Simul8, and to show, on the basis of a case study, how complicated real-life systems can be analyzed and optimized.
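
    The discrete-event mechanism underlying tools such as Simul8 can be sketched in a few lines: a time-ordered heap of events advances the simulation clock and triggers state changes. The single-server queue below is an illustrative toy, not taken from the paper's case study.

```python
import heapq
import random

def simulate_queue(n_customers, arrival_rate=1.0, service_rate=1.5, seed=42):
    """Discrete-event loop for a single-server (M/M/1) queue.

    A min-heap of (time, kind) events drives the simulation clock -
    the core mechanism behind discrete-event simulation packages."""
    random.seed(seed)
    events = [(random.expovariate(arrival_rate), "arrival")]
    waiting = []            # arrival times of customers in the queue
    server_busy = False
    arrivals = departures = 0
    total_wait = 0.0

    while departures < n_customers:
        t, kind = heapq.heappop(events)
        if kind == "arrival":
            arrivals += 1
            if arrivals < n_customers:          # keep feeding the system
                heapq.heappush(
                    events, (t + random.expovariate(arrival_rate), "arrival"))
            if server_busy:
                waiting.append(t)
            else:                               # start service immediately
                server_busy = True
                heapq.heappush(
                    events, (t + random.expovariate(service_rate), "departure"))
        else:                                   # departure
            departures += 1
            if waiting:                         # next customer starts service
                total_wait += t - waiting.pop(0)
                heapq.heappush(
                    events, (t + random.expovariate(service_rate), "departure"))
            else:
                server_busy = False

    return total_wait / n_customers             # mean time spent waiting
```

    Replacing the exponential draws with empirical distributions and adding more event kinds (machine breakdown, shift change) is essentially what a graphical tool automates.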

  8. Laser additive manufacturing of multimaterial tool inserts: a simulation-based optimization study

    Science.gov (United States)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2017-02-01

    Selective laser melting is fast evolving into an industrially applicable manufacturing process. While components produced from high-value materials, such as Ti6Al4V and Inconel 718 alloys, are already being produced, the processing of multi-material components still remains to be achieved by laser additive manufacturing. The physical handling of multiple materials in an SLM setup continues to be a primary challenge, along with the selection of process parameters/plans to achieve the desired results - both challenges requiring considerable experimental undertakings. Consequently, numerical process modelling has been adopted towards tackling the latter challenge in an effective manner. In this paper, a numerical simulation based optimization study is undertaken to enable selective laser melting of multi-material tool inserts. A standard copper specimen covered by a thin layer of nickel is chosen, over which a layer of steel has been deposited using a cold-spray technique, so as to protect the microstructure of the Ni during selective laser melting. The process modelled thus entails additively manufacturing a steel tool insert around the multi-material specimen, with the goal of achieving a dense product while preventing recrystallization in the nickel layer. The process is simulated using a high-fidelity thermo-microstructural model with constant processing parameters to capture the effect on the nickel layer. Based on the results, key structural and process parameters are identified, and subsequently an optimization study is conducted using evolutionary algorithms to determine the appropriate process parameter values as well as the processing sequence. The optimized process plan is then used to manufacture real multi-material tool insert samples by selective laser melting.

  9. Dynamic simulation tools for the analysis and optimization of novel collection, filtration and sample preparation systems

    Energy Technology Data Exchange (ETDEWEB)

    Clague, D; Weisgraber, T; Rockway, J; McBride, K

    2006-02-12

    The focus of the research effort described here is to develop novel simulation tools to address design and optimization needs in the general class of problems that involve species and fluid (liquid and gas phase) transport through sieving media. This was primarily motivated by the heightened attention on Chem/Bio early detection systems, which, among other needs, require high-efficiency filtration, collection and sample preparation systems. Hence, the stated goal was to develop the computational analysis tools necessary to optimize these critical operations. This new capability is designed to characterize system efficiencies based on the details of the microstructure and environmental effects. To accomplish this, new lattice Boltzmann simulation capabilities were developed to include detailed microstructure descriptions, the relevant surface forces that mediate species capture and release, and temperature effects for both liquid and gas phase systems. While developing the capability, actual demonstration and model systems (and subsystems) of national and programmatic interest were targeted to demonstrate the capability. As a result, where possible, experimental verification of the computational capability was performed either directly using Digital Particle Image Velocimetry or against published results.
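
    The lattice Boltzmann method mentioned above can be illustrated in one dimension. The sketch below is a toy D1Q3 scheme for pure diffusion of a species concentration (the project's actual code handles 3D microstructures and surface forces): each step relaxes the populations toward a local equilibrium (collision) and then shifts the moving populations (streaming).

```python
def lbm_diffusion(n=50, steps=200, w=(2/3, 1/6, 1/6), tau=0.8):
    """D1Q3 lattice Boltzmann relaxation for pure diffusion of a species
    concentration; a toy version of the transport kernel described above."""
    # distributions f[i][x] for lattice velocities 0, +1, -1
    f = [[0.0] * n for _ in range(3)]
    mid = n // 2
    for i in range(3):                 # initial concentration: a unit spike
        f[i][mid] = w[i]
    for _ in range(steps):
        # collision: relax each population toward equilibrium w_i * rho
        rho = [f[0][x] + f[1][x] + f[2][x] for x in range(n)]
        for i in range(3):
            for x in range(n):
                f[i][x] += (w[i] * rho[x] - f[i][x]) / tau
        # streaming: shift the moving populations (periodic boundaries)
        f[1] = [f[1][(x - 1) % n] for x in range(n)]
        f[2] = [f[2][(x + 1) % n] for x in range(n)]
    return [f[0][x] + f[1][x] + f[2][x] for x in range(n)]
```

    Mass is conserved exactly by both steps, which is one reason the method suits transport through complex sieving geometries: obstacles enter only through modified streaming rules at solid nodes.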

  10. Development of a Groundwater Transport Simulation Tool for Remedial Process Optimization

    Energy Technology Data Exchange (ETDEWEB)

    Ivarson, Kristine A.; Hanson, James P.; Tonkin, M.; Miller, Charles W.; Baker, S.

    2015-01-14

    The groundwater remedy for hexavalent chromium at the Hanford Site includes operation of five large pump-and-treat systems along the Columbia River. The systems at the 100-HR-3 and 100-KR-4 groundwater operable units treat a total of about 9,840 liters per minute (2,600 gallons per minute) of groundwater to remove hexavalent chromium, and cover an area of nearly 26 square kilometers (10 square miles). The pump-and-treat systems result in large scale manipulation of groundwater flow direction, velocities, and most importantly, the contaminant plumes. Tracking of the plumes and predicting needed system modifications is part of the remedial process optimization, and is a continual process with the goal of reducing costs and shortening the timeframe to achieve the cleanup goals. While most of the initial system evaluations are conducted by assessing performance (e.g., reduction in contaminant concentration in groundwater and changes in inferred plume size), changes to the well field are often recommended. To determine the placement for new wells, well realignments, and modifications to pumping rates, it is important to be able to predict resultant plume changes. In smaller systems, it may be effective to make small scale changes periodically and adjust modifications based on groundwater monitoring results. Due to the expansive nature of the remediation systems at Hanford, however, additional tools were needed to predict the plume reactions to system changes. A computer simulation tool was developed to support pumping rate recommendations for optimization of large pump-and-treat groundwater remedy systems. This tool, called the Pumping Optimization Model, or POM, is based on a 1-layer derivation of a multi-layer contaminant transport model using MODFLOW and MT3D.
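
    A one-dimensional toy stand-in for the plume-transport prediction (not the POM/MODFLOW/MT3D formulation) shows the two ingredients the tool balances: advection at the groundwater velocity and dispersion.

```python
def advect_disperse(c, v=0.5, D=0.1, dx=1.0, dt=0.5, steps=100):
    """Explicit upwind finite-difference march of a 1-D contaminant plume:

        dc/dt = -v * dc/dx + D * d2c/dx2

    with fixed-value (zero) boundaries; all parameters are illustrative."""
    n = len(c)
    for _ in range(steps):
        new = c[:]
        for i in range(1, n - 1):
            adv = -v * (c[i] - c[i - 1]) / dx            # upwind advection
            disp = D * (c[i + 1] - 2 * c[i] + c[i - 1]) / dx ** 2
            new[i] = c[i] + dt * (adv + disp)
        c = new
    return c
```

    Changing v cell-by-cell is the 1-D analogue of what a pumping-rate change does to the flow field: the plume's predicted drift and spreading respond immediately, which is the question the optimization model is built to answer.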

  11. Laser additive manufacturing of multimaterial tool inserts: a simulation-based optimization study

    DEFF Research Database (Denmark)

    Mohanty, Sankhya; Hattel, Jesper Henri

    2017-01-01

    Selective laser melting is fast evolving into an industrially applicable manufacturing process. While components produced from high-value materials, such as Ti6Al4V and Inconel 718 alloys, are already being produced, the processing of multi-material components still remains to be achieved by using...... laser additive manufacturing. The physical handling of multi-material in a SLM setup continues to be a primary challenge along with the selection of process parameters/plan to achieve the desired results – both challenges requiring considerable experimental undertakings. Consequently, numerical process...... modelling has been adopted towards tackling the latter challenge in an effective manner. In this paper, a numerical simulation based optimization study is undertaken to enable selective laser melting of multi-material tool inserts. A standard copper specimen covered by a thin layer of nickel is chosen, over...

  12. Final report on LDRD project: Simulation/optimization tools for system variability analysis

    Energy Technology Data Exchange (ETDEWEB)

    R. L. Bierbaum; R. F. Billau; J. E. Campbell; K. D. Marx; R. J. Sikorski; B. M. Thompson; S. D. Wix

    1999-10-01

    This work was conducted during FY98 (Proposal Number 98-0036) and FY99 (Proposal Number 99-0818) under the auspices of the Sandia National Laboratories Laboratory-Directed Research and Development (LDRD) program. Electrical simulation typically treats a single data point in the very large input space of component properties. For electrical simulation to reach its full potential as a design tool, it must be able to address the unavoidable variability and uncertainty in component properties. Component variability is strongly related to the design margin (and reliability) of the end product. During the course of this project, both tools and methodologies were developed to enable analysis of variability in the context of electrical simulation tools. Two avenues to link relevant tools were also developed, and the resultant toolset was applied to a major component.
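
    The variability analysis described here can be illustrated with a minimal Monte-Carlo sweep over component tolerances; the resistive divider and the 5 % tolerance figure below are hypothetical examples, not taken from the project.

```python
import random

def divider_spread(n=10000, seed=7):
    """Monte-Carlo spread of a resistive divider output when both
    resistors carry a 5 % tolerance, treated as a 3-sigma bound.
    Returns (mean, standard deviation) of the output voltage."""
    random.seed(seed)
    vin, r1_nom, r2_nom = 10.0, 1000.0, 1000.0
    outs = []
    for _ in range(n):
        r1 = random.gauss(r1_nom, 0.05 * r1_nom / 3)   # 5 % ~ 3 sigma
        r2 = random.gauss(r2_nom, 0.05 * r2_nom / 3)
        outs.append(vin * r2 / (r1 + r2))              # divider output
    mean = sum(outs) / n
    var = sum((v - mean) ** 2 for v in outs) / n
    return mean, var ** 0.5
```

    Replacing the closed-form divider with a circuit-simulator call per sample is the same loop at scale, and the resulting output distribution is what connects component variability to design margin.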

  13. Chapter 8: Planning Tools to Simulate and Optimize Neighborhood Energy Systems

    Energy Technology Data Exchange (ETDEWEB)

    Zhivov, Alexander Michael; Case, Michael Patrick; Jank, Reinhard; Eicker, Ursula; Booth, Samuel

    2017-03-15

    This section introduces different energy modeling tools available in Europe and the USA for the community energy master planning process, ranging from strategic Urban Energy Planning to more detailed Local Energy Planning. Two modeling tools used for Energy Master Planning of primarily residential communities, the 3D city model with CityGML and the Net Zero Planner tool developed for US Department of Defense installations, are described in more detail.

  14. Method for vibration response simulation and sensor placement optimization of a machine tool spindle system with a bearing defect.

    Science.gov (United States)

    Cao, Hongrui; Niu, Linkai; He, Zhengjia

    2012-01-01

    Bearing defects are one of the most important mechanical sources of vibration and noise generation in machine tool spindles. In this study, an integrated finite element (FE) model is proposed to predict the vibration responses of a spindle bearing system with localized bearing defects, and the sensor placement for better detection of bearing faults is then optimized. A nonlinear bearing model is developed based on Jones' bearing theory, while the drawbar, shaft and housing are modeled as Timoshenko beams. The bearing model is then integrated into the FE model of the drawbar/shaft/housing by assembling the equations of motion. The Newmark time integration method is used to solve the vibration responses numerically. The FE model of the spindle-bearing system was verified by conducting dynamic tests. Then, the localized bearing defects were modeled, and the vibration responses generated by an outer ring defect were simulated as an illustration. The optimization scheme for the sensor placement was carried out on the test spindle. The results showed that the optimal sensor placement depends on the vibration modes under different boundary conditions and on the transfer path between the excitation and the response.
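
    The Newmark time integration step can be sketched for a single degree of freedom; the mass, damping and stiffness values below are illustrative, while the study applies the same update to the assembled drawbar/shaft/housing system.

```python
def newmark_sdof(m=1.0, c=0.1, k=4.0, u0=1.0, v0=0.0,
                 dt=0.01, steps=2000, beta=0.25, gamma=0.5):
    """Average-acceleration Newmark integration of the free vibration
    m*u'' + c*u' + k*u = 0 for one degree of freedom."""
    u, v = u0, v0
    a = (-c * v - k * u) / m
    out = [u]
    for _ in range(steps):
        # predictors from the current state
        u_p = u + dt * v + dt * dt * (0.5 - beta) * a
        v_p = v + dt * (1.0 - gamma) * a
        # enforce equilibrium at t + dt to get the new acceleration
        a_new = -(c * v_p + k * u_p) / (m + gamma * dt * c + beta * dt * dt * k)
        u = u_p + beta * dt * dt * a_new
        v = v_p + gamma * dt * a_new
        a = a_new
        out.append(u)
    return out
```

    With beta = 1/4 and gamma = 1/2 the scheme is unconditionally stable, which is why it is a standard choice for the stiff, high-frequency content a bearing-defect excitation injects into a spindle model.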

  15. Simulation and Application of GPOPS for a Trajectory Optimization and Mission Planning Tool

    Science.gov (United States)

    2010-03-01

    I. Introduction. Hollywood has romanticized the concept of the strategic supercomputer and war. Movies imply that the all...challenge 4.2, Globally Deliver Full Spectrum of Kinetic Energy. The short-term goal for this challenge is to hit a target from anywhere in theatre.

  16. Optimization of Friction Stir Welding Tool Advance Speed via Monte-Carlo Simulation of the Friction Stir Welding Process

    Directory of Open Access Journals (Sweden)

    Kirk A. Fraser

    2014-04-01

    Full Text Available Recognition of the friction stir welding process is growing in the aeronautical and aerospace industries. To make the process more available to the structural fabrication industry (buildings and bridges), being able to model the process to determine the highest speed of advance possible that will not cause unwanted welding defects is desirable. A numerical solution to the transient two-dimensional heat diffusion equation for the friction stir welding process is presented. A non-linear heat generation term based on an arbitrary piecewise linear model of friction as a function of temperature is used. The solution is used to solve for the temperature distribution in the Al 6061-T6 work pieces. The finite difference solution of the non-linear problem is used to perform a Monte-Carlo simulation (MCS). A polynomial response surface (maximum welding temperature as a function of advancing and rotational speed) is constructed from the MCS results. The response surface is used to determine the optimum tool speed of advance and rotational speed. The exterior penalty method is used to find the highest speed of advance and the associated rotational speed of the tool for the FSW process considered. We show that good agreement with experimental optimization work is possible with this simplified model. Using our approach an optimal weld pitch of 0.52 mm/rev is obtained for 3.18 mm thick AA6061-T6 plate. Our method provides an estimate of the optimal welding parameters in less than 30 min of calculation time.

  17. Optimization of Friction Stir Welding Tool Advance Speed via Monte-Carlo Simulation of the Friction Stir Welding Process.

    Science.gov (United States)

    Fraser, Kirk A; St-Georges, Lyne; Kiss, Laszlo I

    2014-04-30

    Recognition of the friction stir welding process is growing in the aeronautical and aerospace industries. To make the process more available to the structural fabrication industry (buildings and bridges), being able to model the process to determine the highest speed of advance possible that will not cause unwanted welding defects is desirable. A numerical solution to the transient two-dimensional heat diffusion equation for the friction stir welding process is presented. A non-linear heat generation term based on an arbitrary piecewise linear model of friction as a function of temperature is used. The solution is used to solve for the temperature distribution in the Al 6061-T6 work pieces. The finite difference solution of the non-linear problem is used to perform a Monte-Carlo simulation (MCS). A polynomial response surface (maximum welding temperature as a function of advancing and rotational speed) is constructed from the MCS results. The response surface is used to determine the optimum tool speed of advance and rotational speed. The exterior penalty method is used to find the highest speed of advance and the associated rotational speed of the tool for the FSW process considered. We show that good agreement with experimental optimization work is possible with this simplified model. Using our approach an optimal weld pitch of 0.52 mm/rev is obtained for 3.18 mm thick AA6061-T6 plate. Our method provides an estimate of the optimal welding parameters in less than 30 min of calculation time.
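
    The optimization step can be sketched as follows. The response surface t_max below is hypothetical (the paper fits its polynomial to Monte-Carlo results of the heat-diffusion model); the exterior penalty method then converts the temperature window into penalty terms and searches for the highest feasible advance speed.

```python
def optimize_advance(t_limit=500.0, t_weld=450.0):
    """Exterior-penalty search for the highest advance speed that keeps
    the peak weld temperature inside a process window.  The response
    surface t_max is HYPOTHETICAL, standing in for the polynomial the
    paper fits to Monte-Carlo results of the heat model."""
    def t_max(v, w):                       # illustrative surface (deg C):
        return 300.0 + 0.8 * w + 40.0 / v  # hotter with rpm w, cooler with v
    def objective(v, w, r):
        over = max(0.0, t_max(v, w) - t_limit)    # too hot -> defects
        under = max(0.0, t_weld - t_max(v, w))    # too cold -> no bond
        return -v + r * (over * over + under * under)
    v_best = w_best = None
    r = 10.0
    for _ in range(4):                     # exterior penalty: raise r in stages
        best = None
        for i in range(1, 101):            # advance speed 0.1 .. 10.0 mm/s
            v = i * 0.1
            for w in range(100, 2001, 25): # rotational speed, rpm
                val = objective(v, w, r)
                if best is None or val < best[0]:
                    best = (val, v, w)
        _, v_best, w_best = best
        r *= 10.0
    return v_best, w_best
```

    As the penalty weight r grows, infeasible points are priced out and the search settles on the fastest advance speed whose predicted peak temperature stays inside the window.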

  18. Simulation Tool for Inventory Models: SIMIN

    OpenAIRE

    Pratiksha Saxen; Tulsi Kushwaha

    2014-01-01

    In this paper, an integrated simulation optimization model for the inventory system is developed. An effective algorithm is developed to evaluate and analyze the back-end stored simulation results. This paper proposes simulation tool SIMIN (Inventory Simulation) to simulate inventory models. SIMIN is a tool which simulates and compares the results of different inventory models. To overcome various practical restrictive assumptions, SIMIN provides values for a number of performance measurement...

  19. Handbook of simulation optimization

    CERN Document Server

    Fu, Michael C

    2014-01-01

    The Handbook of Simulation Optimization presents an overview of the state of the art of simulation optimization, providing a survey of the most well-established approaches for optimizing stochastic simulation models and a sampling of recent research advances in theory and methodology. Leading contributors cover such topics as discrete optimization via simulation, ranking and selection, efficient simulation budget allocation, random search methods, response surface methodology, stochastic gradient estimation, stochastic approximation, sample average approximation, stochastic constraints, variance reduction techniques, model-based stochastic search methods and Markov decision processes. This single volume should serve as a reference for those already in the field and as a means for those new to the field for understanding and applying the main approaches. The intended audience includes researchers, practitioners and graduate students in the business/engineering fields of operations research, management science,...
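
    As a taste of the methods surveyed, the Robbins-Monro stochastic approximation iteration, one of the handbook's core topics, can be sketched on a toy problem: minimize E[(x - 4)^2] when only noisy gradient observations are available (the target value 4 and noise level are illustrative).

```python
import random

def stochastic_approximation(steps=5000, seed=3):
    """Robbins-Monro iteration x_{k+1} = x_k - a_k * g(x_k), where g is a
    noisy observation of the gradient of E[(x - 4)^2]."""
    random.seed(seed)
    x = 10.0
    for k in range(1, steps + 1):
        noisy_grad = 2.0 * (x - 4.0) + random.gauss(0.0, 1.0)
        x -= noisy_grad / k   # a_k = 1/k: sum a_k = inf, sum a_k^2 < inf
    return x
```

    The diminishing step sizes average out the noise while still moving far enough to reach the optimum, which is the defining trade-off of stochastic approximation.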

  20. FASTBUS simulation tools

    Energy Technology Data Exchange (ETDEWEB)

    Dean, T.D. (Stanford Linear Accelerator Center, Menlo Park, CA (United States)); Haney, M.J. (Illinois Univ., Urbana, IL (United States))

    1991-10-01

    A generalized model of a FASTBUS master is presented. The model is used with simulation tools to aid in the specification, design, and production of FASTBUS slave modules. The model provides a mechanism to interact with the electrical schematics and software models to predict performance. The model is written in the IEEE std 1076-1987 hardware description language VHDL. A model of the ATC logic is also presented. VHDL was chosen to provide portability to various platforms and simulation tools. The models, in conjunction with most commercially available simulators, will perform all of the transactions specified in IEEE std 960-1989. The models may be used to study the behavior of electrical schematics and other software models and detect violations of the FASTBUS protocol. For example, a hardware design of a slave module could be studied, protocol violations detected and corrected before committing money to prototype development. The master model accepts a stream of high level commands from an ASCII file to initiate FASTBUS transactions. The high level command language is based on the FASTBUS standard routines listed in IEEE std 1177-1989. Using this standard-based command language to direct the model of the master, hardware engineers can simulate FASTBUS transactions in the language used by physicists and programmers to operate FASTBUS systems. 15 refs., 6 figs.

  1. Terascale Optimal PDE Simulations

    Energy Technology Data Exchange (ETDEWEB)

    David Keyes

    2009-07-28

    The Terascale Optimal PDE Solvers (TOPS) Integrated Software Infrastructure Center (ISIC) was created to develop and implement algorithms and support scientific investigations performed by DOE-sponsored researchers. These simulations often involve the solution of partial differential equations (PDEs) on terascale computers. The TOPS Center researched, developed and deployed an integrated toolkit of open-source, optimal complexity solvers for the nonlinear partial differential equations that arise in many DOE application areas, including fusion, accelerator design, global climate change and reactive chemistry. The algorithms created as part of this project were also designed to reduce current computational bottlenecks by orders of magnitude on terascale computers, enabling scientific simulation on a scale heretofore impossible.

  2. Optimization Specifications for CUDA Code Restructuring Tool

    KAUST Repository

    Khan, Ayaz

    2017-03-13

    In this work we have developed a restructuring software tool (RT-CUDA) following the proposed optimization specifications to bridge the gap between high-level languages and the machine-dependent CUDA environment. RT-CUDA takes a C program and converts it into an optimized CUDA kernel, with user directives in a configuration file guiding the compiler. RT-CUDA also allows transparent invocation of the most optimized external math libraries, such as cuSparse and cuBLAS, enabling efficient design of linear algebra solvers. We expect RT-CUDA to be needed by many KSA industries dealing with science and engineering simulation on massively parallel computers such as NVIDIA GPUs.

  3. NEMO. A novel techno-economic tool suite for simulating and optimizing solutions for grid integration of electric vehicles and charging stations

    Energy Technology Data Exchange (ETDEWEB)

    Erge, Thomas; Stillahn, Thies; Dallmer-Zerbe, Kilian; Wille-Haussmann, Bernhard [Fraunhofer Institute for Solar Energy Systems ISE, Freiburg (Germany)]

    2013-07-01

    With an increasing use of electric vehicles (EV), grid operators need to predict energy flows depending on electromobility use profiles in order to adjust grid infrastructure and operation control accordingly. Tools and methodologies are required to characterize grid problems resulting from the interconnection of EVs with the grid. The simulation and optimization tool suite NEMO (Novel E-MObility grid model) was developed within a European research project and is currently being tested using realistic showcases. It is a combination of three professional tools. One of the tools aims at combined techno-economic design and operation, primarily modeling plants on contracts or the spot market while at the same time participating in balancing markets. The second tool is designed for planning grid extension or reinforcement, while the third tool is mainly used to quickly discover potential conflicts of grid operation approaches through load flow analysis. The tool suite is used to investigate real showcases in Denmark, Germany and the Netherlands. First studies show that significant alleviation of stress on distribution grid lines could be achieved by a few intelligent restrictions on EV charging procedures.

  4. Dynamic optimization case studies in DYNOPT tool

    Science.gov (United States)

    Ozana, Stepan; Pies, Martin; Docekal, Tomas

    2016-06-01

    Dynamic programming is typically applied to optimization problems. As analytical solutions are generally very difficult to obtain, dedicated software tools are widely used. These software packages are often third-party products bound to standard simulation software tools on the market. As typical examples of such tools, TOMLAB and DYNOPT can be effectively applied to the solution of dynamic programming problems. DYNOPT is presented in this paper because of its licensing policy (a free product under the GPL) and its simplicity of use. DYNOPT is a set of MATLAB functions for determining an optimal control trajectory, given a description of the process, the cost to be minimized, and equality and inequality constraints, using the method of orthogonal collocation on finite elements. The actual optimal control problem is solved by complete parameterization of both the control and the state profile vectors. It is assumed that the optimized dynamic model may be described by a set of ordinary differential equations (ODEs) or differential-algebraic equations (DAEs). This collection of functions extends the capability of the MATLAB Optimization Toolbox. The paper introduces the use of DYNOPT in the field of dynamic optimization problems by means of case studies involving chosen laboratory physical educational models.
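
    DYNOPT itself uses orthogonal collocation on finite elements; as a simpler illustration of the same complete-parameterization idea, the sketch below solves a toy optimal-control problem by direct single shooting with a piecewise-constant control (the problem and all numbers are illustrative).

```python
def single_shooting(n_seg=10, t_final=1.0, iters=200):
    """Direct single shooting for a toy optimal-control problem:
    minimize the integral of (x^2 + u^2) subject to x' = u, x(0) = 1,
    with the control parameterized as a piecewise-constant vector u."""
    dt = t_final / n_seg
    def cost(u):
        x, J = 1.0, 0.0
        for ui in u:
            J += (x * x + ui * ui) * dt      # rectangle-rule cost
            x += ui * dt                     # explicit Euler dynamics
        return J
    u = [0.0] * n_seg
    step = 0.5
    for _ in range(iters):                   # coordinate descent on u
        improved = False
        for i in range(n_seg):
            for cand in (u[i] + step, u[i] - step):
                trial = u[:i] + [cand] + u[i + 1:]
                if cost(trial) < cost(u):
                    u = trial
                    improved = True
        if not improved:
            step *= 0.5                      # shrink the search step
    return u, cost(u)
```

    The continuous optimum of this linear-quadratic problem has cost tanh(1) ≈ 0.76, so the discretized result should land nearby; collocation methods like DYNOPT's achieve the same with far fewer ODE integrations and handle path constraints directly.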

  5. Autonomous tools for Grid management, monitoring and optimization

    CERN Document Server

    Wislicki, Wojciech

    2007-01-01

    We outline the design and lines of development of autonomous tools for computing Grid management, monitoring and optimization. The management is proposed to be based on the notion of utility. Grid optimization is considered to be application-oriented. A generic Grid simulator is proposed as an optimization tool for Grid structure and functionality.

  6. Advanced Simulation and Optimization Tools for Dynamic Aperture of Non-scaling FFAGs and Accelerators including Modern User Interfaces

    Energy Technology Data Exchange (ETDEWEB)

    Mills, F.; Makino, Kyoko; Berz, Martin; Johnstone, C.

    2010-09-01

    With the U.S. experimental effort in HEP largely located at laboratories supporting the operations of large, highly specialized accelerators, colliding beam facilities, and detector facilities, the understanding and prediction of high energy particle accelerators becomes critical to the overall success of the DOE HEP program. One area in which small businesses can contribute to the ongoing success of the U.S. program in HEP is through innovations in computer techniques and sophistication in the modeling of high-energy accelerators. Accelerator modeling at these facilities is performed by experts, with the product generally highly specific and representative only of in-house accelerators or special-interest accelerator problems. Development of new types of accelerators like FFAGs, with their wide choice of parameter modifications, complicated fields, and the simultaneous need to efficiently handle very large emittance beams, requires the availability of new simulation environments to assure predictability in operation. Here, ease of use and good interfaces are critical to realizing a successful model, or to optimizing a new design or the working parameters of machines. In Phase I, various core modules for the design and analysis of FFAGs were developed, and Graphical User Interfaces (GUIs) were investigated as an alternative to the more general yet less easily manageable console-type output COSY provides.

  7. Ion trap simulation tools.

    Energy Technology Data Exchange (ETDEWEB)

    Hamlet, Benjamin Roger

    2009-02-01

    Ion traps present a potential architecture for future quantum computers. These computers are of interest due to their increased power over classical computers stemming from the superposition of states and the resulting capability to simultaneously perform many computations. This paper describes a software application used to prepare and visualize simulations of trapping and maneuvering ions in ion traps.

  8. Optimizing Event-Driven Simulations

    CERN Document Server

    De Michele, Cristiano

    2010-01-01

    Event-driven molecular dynamics is a valuable tool in condensed and soft matter physics when particles can be modeled as hard objects or, more generally, when their interaction potential can be modeled in a stepwise fashion. The hard-sphere model has indeed been widely used in both computational and theoretical descriptions of physical systems. Recent developments of computational techniques allow simulations of hard rigid objects of generic shape. In the present paper we describe some optimizations for event-driven simulations that offer a significant speedup over previous methods. In particular, we describe a generalization of the well-known linked-list method and an improvement on the nearest-neighbor lists method we recently proposed.
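The linked-list (cell list) idea behind such optimizations can be illustrated with a minimal 2-D sketch (my own illustration, not the authors' code): particles are binned into cells no smaller than the interaction range, so candidate collision partners need only be sought in a particle's own and adjacent cells rather than among all N(N-1)/2 pairs.

```python
import itertools

def build_cell_list(positions, box, cell_size):
    """Bin 2-D particle positions into square cells of side >= cell_size."""
    n_cells = max(1, int(box // cell_size))
    side = box / n_cells
    cells = {}
    for i, (x, y) in enumerate(positions):
        key = (int(x // side) % n_cells, int(y // side) % n_cells)
        cells.setdefault(key, []).append(i)
    return cells, n_cells

def candidate_pairs(cells, n_cells):
    """Collision candidates: pairs sharing a cell or sitting in adjacent
    (periodic) cells; distant pairs are never examined."""
    pairs = set()
    for (cx, cy), members in cells.items():
        for dx, dy in itertools.product((-1, 0, 1), repeat=2):
            neigh = ((cx + dx) % n_cells, (cy + dy) % n_cells)
            for i in members:
                for j in cells.get(neigh, ()):
                    if i < j:
                        pairs.add((i, j))
    return pairs
```

For hard-sphere event prediction, only the pairs returned here need collision-time computations, which is what turns the quadratic pair search into a near-linear one.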

  9. Application of simulation models for the optimization of business processes

    Science.gov (United States)

    Jašek, Roman; Sedláček, Michal; Chramcov, Bronislav; Dvořák, Jiří

    2016-06-01

    The paper deals with the application of modeling and simulation tools to the optimization of business processes, in particular the optimization of signal flow in a security company. The Simul8 software was selected as the modeling tool; it supports process modeling based on discrete event simulation and enables the creation of a visual model of production and distribution processes.

  10. Multiphysics simulation electromechanical system applications and optimization

    CERN Document Server

    Dede, Ercan M; Nomura, Tsuyoshi

    2014-01-01

    This book highlights a unique combination of numerical tools and strategies for handling the challenges of multiphysics simulation, with a specific focus on electromechanical systems as the target application. Features: introduces the concept of design via simulation, along with the role of multiphysics simulation in today's engineering environment; discusses the importance of structural optimization techniques in the design and development of electromechanical systems; provides an overview of the physics commonly involved with electromechanical systems for applications such as electronics, ma

  11. Coupled multiscale simulation and optimization in nanoelectronics

    CERN Document Server

    2015-01-01

    Designing complex integrated circuits relies heavily on mathematical methods and calls for suitable simulation and optimization tools. The current design approach involves simulations and optimizations in different physical domains (device, circuit, thermal, electromagnetic) and in a range of electrical engineering disciplines (logic, timing, power, crosstalk, signal integrity, system functionality). COMSON was a Marie Curie Research Training Network created to meet these new scientific and training challenges by (a) developing new descriptive models that take these mutual dependencies into account, (b) combining these models with existing circuit descriptions in new simulation strategies, and (c) developing new optimization techniques that will accommodate new designs. The book presents the main project results in the fields of PDAE modeling and simulation, model order reduction techniques and optimization, based on merging the know-how of three major European semiconductor companies with the combined expe...

  12. The Xygra gun simulation tool.

    Energy Technology Data Exchange (ETDEWEB)

    Garasi, Christopher Joseph; Lamppa, Derek C.; Aubuchon, Matthew S.; Shirley, David Noyes; Robinson, Allen Conrad; Russo, Thomas V.

    2008-12-01

    Inductive electromagnetic launchers, or coilguns, use discrete solenoidal coils to accelerate a coaxial conductive armature. To date, Sandia has been using an internally developed code, SLINGSHOT, as a point-mass lumped circuit element simulation tool for modeling coilgun behavior for design and verification purposes. This code has shortcomings in terms of accurately modeling gun performance under stressful electromagnetic propulsion environments. To correct for these limitations, it was decided to attempt to closely couple two Sandia simulation codes, Xyce and ALEGRA, to develop a more rigorous simulation capability for demanding launch applications. This report summarizes the modifications made to each respective code and the path forward to completing interfacing between them.

  13. Automated Multivariate Optimization Tool for Energy Analysis: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Ellis, P. G.; Griffith, B. T.; Long, N.; Torcellini, P. A.; Crawley, D.

    2006-07-01

    Building energy simulations are often used for trial-and-error evaluation of "what-if" options in building design -- a limited search for an optimal solution, or "optimization". Computerized searching has the potential to automate the input and output, evaluate many options, and perform enough simulations to account for the complex interactions among combinations of options. This paper describes ongoing efforts to develop such a tool. The optimization tool employs multiple modules, including a graphical user interface, a database, a preprocessor, the EnergyPlus simulation engine, an optimization engine, and a simulation run manager. Each module is described and the overall application architecture is summarized.
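The search loop such a tool automates can be sketched in a few lines (a hypothetical stand-in: the real tool dispatches EnergyPlus runs, whereas here a toy cost function plays the role of one annual-energy simulation):

```python
import itertools

def optimize(options, simulate):
    """Exhaustively evaluate every combination of design options.

    `options` maps a design parameter to its candidate values; `simulate`
    stands in for a full simulation run (e.g. an EnergyPlus job) that
    returns an annual energy figure to be minimized.
    """
    best = None
    for combo in itertools.product(*options.values()):
        design = dict(zip(options.keys(), combo))
        energy = simulate(design)
        if best is None or energy < best[1]:
            best = (design, energy)
    return best

# Toy cost surface standing in for the simulation engine.
def toy_simulate(d):
    return (d["insulation"] - 2) ** 2 + (d["glazing"] - 1) ** 2 + 10.0

options = {"insulation": [1, 2, 3], "glazing": [0, 1, 2]}
design, energy = optimize(options, toy_simulate)
```

A production tool replaces the exhaustive loop with a smarter search strategy and a run manager that farms simulations out in parallel, but the evaluate-compare-keep-best skeleton is the same.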

  14. OPTIMIZATION METHODS AND SEO TOOLS

    Directory of Open Access Journals (Sweden)

    Maria Cristina ENACHE

    2014-06-01

    Full Text Available SEO is the activity of optimizing Web pages or whole sites in order to make them more search engine friendly, thus getting higher positions in search results. Search engine optimization (SEO involves designing, writing, and coding a website in a way that helps to improve the volume and quality of traffic to your website from people using search engines. While Search Engine Optimization is the focus of this booklet, keep in mind that it is one of many marketing techniques. A brief overview of other marketing techniques is provided at the end of this booklet.

  15. TOPFARM wind farm optimization tool

    DEFF Research Database (Denmark)

    Réthoré, Pierre-Elouan; Fuglsang, Peter; Larsen, Torben J.;

    A wind farm optimization framework is presented in detail and demonstrated on two test cases: 1) Middelgrunden and 2) Stags Holt/Coldham. A detailed flow model describing the instationary flow within a wind farm is used together with an aeroelastic model to determine production and fatigue loading of wind farm wind turbines. Based on generic load cases, the wind farm production and fatigue evaluations are subsequently condensed in a large pre-calculated database for rapid calculation of lifetime equivalent loads and energy production in the optimization loop. The objective function defining the optimization problem includes elements such as energy production, turbine degradation, operation and maintenance costs, electrical grid costs and foundation costs. The objective function is optimized using a dedicated multi-fidelity approach with the locations of individual turbines in the wind farm spanning the design space.

  16. TOPFARM wind farm optimization tool

    Energy Technology Data Exchange (ETDEWEB)

    Rethore, P.-E.; Fuglsang, P.; Larsen, Torben J.; Buhl, T.; Larsen, Gunner C.

    2011-02-15

    A wind farm optimization framework is presented in detail and demonstrated on two test cases: 1) Middelgrunden and 2) Stags Holt/Coldham. A detailed flow model describing the instationary flow within a wind farm is used together with an aeroelastic model to determine production and fatigue loading of wind farm wind turbines. Based on generic load cases, the wind farm production and fatigue evaluations are subsequently condensed in a large pre-calculated database for rapid calculation of lifetime equivalent loads and energy production in the optimization loop. The objective function defining the optimization problem includes elements such as energy production, turbine degradation, operation and maintenance costs, electrical grid costs and foundation costs. The objective function is optimized using a dedicated multi-fidelity approach with the locations of individual turbines in the wind farm spanning the design space. The results are overall satisfying and give some interesting insights into the pros and cons of the design choices. They show in particular that the inclusion of fatigue load costs gives rise to some additional details in comparison with pure power-based optimization. The Middelgrunden test case resulted in an improvement of the financial balance of 2.1 M Euro, originating from a very large increase in the energy production value of 9.3 M Euro, mainly counterbalanced by increased electrical grid costs. The Stags Holt/Coldham test case resulted in an improvement of the financial balance of 3.1 M Euro. (Author)

  17. Optimizing Methods in Simulation

    Science.gov (United States)

    1981-08-01

    ...exploited by Kiefer and Wolfowitz (1959). Wald (1943) used the criterion of D-optimality in some other context, and it was so named by Kiefer and... of discrepancy between the observed and expected value is obtained in terms of mean squared errors (MSE). Consider the model E(Y|x) = a + bx and V(Y|x) = σ². Let L ≤ x ≤ U be the interval of possible x values. MSE(x) is the mean squared error of x as obtained from y. Let w(x) be a weight...

  18. General model for boring tool optimization

    Science.gov (United States)

    Moraru, G. M.; rbes, M. V. Ze; Popescu, L. G.

    2016-08-01

    Optimizing a tool (and therefore a boring tool) consists of improving its performance by maximizing the objective functions chosen by the designer and/or the user. Numerous features and performance requirements demanded by tool users contribute to defining and implementing the proposed objective functions. The incorporation of new features makes the cutting tool competitive in the market and able to meet user requirements.

  19. 10 CFR 434.606 - Simulation tool.

    Science.gov (United States)

    2010-01-01

    10 Energy 3 (2010-01-01). DEPARTMENT... RESIDENTIAL BUILDINGS, Building Energy Compliance Alternative, § 434.606 Simulation tool. 606.1 The criteria established in subsection 521 for the selection of a simulation tool shall be followed when using...

  20. Concrete Plant Operations Optimization Using Combined Simulation and Genetic Algorithms

    NARCIS (Netherlands)

    Cao, Ming; Lu, Ming; Zhang, Jian-Ping

    2004-01-01

    This work presents a new approach for concrete plant operations optimization by combining a ready mixed concrete (RMC) production simulation tool (called HKCONSIM) with a genetic algorithm (GA) based optimization procedure. A revamped HKCONSIM computer system can be used to automate the simulation m
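The coupling the abstract describes — a GA whose fitness evaluations are simulation runs — can be sketched generically (a hypothetical illustration; HKCONSIM is not public, so a toy fitness function stands in for one simulation run per candidate):

```python
import random

def ga_optimize(evaluate, n_vars, pop_size=20, generations=40, seed=1):
    """Minimal GA loop; each `evaluate` call is where a simulation run
    (HKCONSIM in the paper) would score one candidate plant setup."""
    rng = random.Random(seed)
    pop = [[rng.uniform(0, 10) for _ in range(n_vars)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=evaluate)
        parents = pop[: pop_size // 2]               # truncation selection
        children = []
        while len(parents) + len(children) < pop_size:
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_vars) if n_vars > 1 else 0
            child = a[:cut] + b[cut:]                # one-point crossover
            child[rng.randrange(n_vars)] += rng.gauss(0, 0.5)  # mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=evaluate)
```

The expensive part is always the fitness call; keeping the simulator behind a single `evaluate` function is what lets the GA and the simulation tool be developed and swapped independently.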

  3. Simulating the Farm Production System Using the MONARC Simulation Tool

    Institute of Scientific and Technical Information of China (English)

    Y. Wu; I. C. Legrand; et al.

    2001-01-01

    The simulation program developed by the "Models of Networked Analysis at Regional Centers" (MONARC) project is a powerful and flexible tool for simulating the behavior of large-scale distributed computing systems. In this study, we further validate this simulation tool on a large-scale distributed farm computing system. We also report the use of this simulation tool to identify the bottlenecks and limitations of our farm system.

  4. Overview of Simulation Tools for Smart Grids

    DEFF Research Database (Denmark)

    The aim of this report, "D2.1 – Overview of Simulation Tools for Smart Grids", is to provide an overview of the different simulation tools available, i.e. developed and in use, at the different research centres. Required new tool capabilities are identified and extensions to the existing packages are indicated. An analysis of the emerging power system challenges, together with a review of the main topics regarding smart grids, is provided in Chapter 1. The requirements for the simulation tools and the list of available tools in the different research centres and their main characteristics are reported in Chapter 2. The main aspects of the different tools and their purpose of analysis are listed in Chapter 3, along with the main topics concerning the new requirements for tools to allow a proper study in the smart grid context. Gaps, capabilities and model consolidation of the analysed tools...

  5. Humanoid robot simulation with a joint trajectory optimized controller

    OpenAIRE

    2008-01-01

    This paper describes a joint-trajectory-optimized controller for a humanoid robot simulator that follows the real robot's characteristics. As simulation is a powerful tool for speeding up control software development, the proposed accurate simulator helps fulfil this goal. The simulator, based on the Open Dynamics Engine and the GLScene graphics library, provides instant visual feedback. The proposed simulator, with realistic dynamics, makes it possible to design and test behaviours and control strat...

  6. Tool Support for Software Lookup Table Optimization

    Directory of Open Access Journals (Sweden)

    Chris Wilcox

    2011-01-01

    Full Text Available A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0× and 6.9× for two molecular biology algorithms, 1.4× for a molecular dynamics program, 2.1× to 2.8× for a neural network application, and 4.6× for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches.
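The core trade Mesa automates — replacing a costly elementary-function call with a table of precomputed values at a bounded accuracy cost — can be sketched as follows (a generic illustration, not Mesa's generated code):

```python
import math

def build_lut(fn, lo, hi, n):
    """Sample fn at n+1 evenly spaced points on [lo, hi]."""
    step = (hi - lo) / n
    return [fn(lo + i * step) for i in range(n + 1)], lo, step

def lut_eval(table, lo, step, x):
    """Nearest-sample lookup: the error is bounded by fn's variation over
    half a step, so n controls the accuracy/memory trade-off."""
    return table[int(round((x - lo) / step))]

# 1024 entries give |error| <= ~step/2 for a Lipschitz-1 function like sin.
table, lo, step = build_lut(math.sin, 0.0, math.pi, 1024)
```

The "domain profiling" and "error analysis" steps in the paper correspond to choosing `[lo, hi]` from the values the application actually passes in, and choosing `n` so the worst-case table error stays inside the tolerated approximation.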

  7. Robotic Mission Simulation Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Energid Technologies proposes a software tool to predict robotic mission performance and support supervision of robotic missions even when environments and...

  8. Simulation of Oscillatory Working Tool

    Directory of Open Access Journals (Sweden)

    Carmen Debeleac

    2010-01-01

    Full Text Available The paper presents a study of the resistance forces in soil cutting, with emphasis on their dependence on working tool motion during the loading process and dynamic regimes. The periodic process of cutting soil with a tool (blade) is described. Different intervals in the cycle of steady-state motion of the tool, and several interaction regimes, are considered. The analysis is based on a non-linear approximation of the dependence of the soil resistance force on tool motion. Finally, the influence of frequency on the laws governing the interaction in the cyclic process is established.

  9. Use of simulation tools to optimize the operation in the Guardamar del Segura WWTP (Alicante, Spain); Optimizacion de la operacion de la EDAR de Guardamar del Segura mediante la utilizacion de herramientas de simulacion

    Energy Technology Data Exchange (ETDEWEB)

    Rey Gosalbez, H.; Gracia Igelmo, M. de; Larrea Urcola, M. A.; Morenilla Martinez, J. J.; Bernacer Bonora, I.; Santos Asensi, J. M.

    2007-07-01

    The present work shows the improvements obtained in the Guardamar del Segura WWTP with regard to both energy savings and purification efficiency. The WWTP has been modelled and implemented in the WEST simulation platform. The model has been calibrated and validated experimentally in order to guarantee the predictive capacity of the simulator, which has served to detect and diagnose the plant's operating problems. In addition, more advisable operating strategies have been explored, considering the capacities of the plant and the possible operating scenarios. The results obtained after the application of these strategies demonstrate the potential of simulation tools, which, as an effective support in determining the necessary interventions, make it possible to optimize operating costs without sacrificing the quality of the treated water. (Author) 9 refs.

  10. Data Mining and Optimization Tools for Developing Engine Parameters Tools

    Science.gov (United States)

    Dhawan, Atam P.

    1998-01-01

    This project was awarded for understanding the problem and developing a plan for data mining tools for use in designing and implementing an Engine Condition Monitoring System. Tricia Erhardt and I studied the problem domain for developing an Engine Condition Monitoring system using the sparse and non-standardized datasets to be made available through a consortium at NASA Lewis Research Center. We visited NASA three times to discuss additional issues related to the dataset, which was not made available to us. We discussed and developed a general framework of data mining and optimization tools to extract useful information from sparse and non-standard datasets. These discussions led to the training of Tricia Erhardt to develop genetic-algorithm-based search programs, which were written in C++ and used to demonstrate the capability of the GA in searching for an optimal solution in noisy datasets. From the study and discussions with NASA LeRC personnel, we then prepared a proposal, which is being submitted to NASA, for future work on the development of data mining algorithms for engine condition monitoring. The proposed set of algorithms uses wavelet processing to create a multi-resolution pyramid of the data for GA-based multi-resolution optimal search.

  12. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    This paper describes the modelling, simulating and optimizing, including experimental verification, carried out as part of a Ph.D. project written and supervised, respectively, by the authors. The work covers the dynamic performance of both water-tube boilers and fire-tube boilers. A detailed dynamic model of the boiler has been developed and simulations carried out by means of the Matlab integration routines. The model is prepared as a dynamic model consisting of both ordinary differential equations and algebraic equations, together formulated as a Differential-Algebraic-Equation system. Being able to operate a boiler plant dynamically means that the boiler designs must be able to absorb any fluctuations in water level and temperature gradients resulting from the pressure change in the boiler. On the one hand, a large water-/steam space may be required, i.e. to build the boiler as big as possible. Due...

  13. Simulation Tool For Energy Consumption and Production

    DEFF Research Database (Denmark)

    Nysteen, Michael; Mynderup, Henrik; Poulsen, Bjarne

    2013-01-01

    Only little work has been done in the area of simulating and visualizing the energy consumption in smart homes. This paper presents a prototype simulation tool that allows graphical modeling of a home. Based on the modeled homes, the user is able to simulate the energy consumption and compare scenarios. The simulations are based on dynamic weather and energy price data as well...

  14. Versatile and Extensible, Continuous-Thrust Trajectory Optimization Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to develop an innovative, versatile and extensible, continuous-thrust trajectory optimization tool for planetary mission design and optimization of...

  15. Open source Modeling and optimization tools for Planning

    Energy Technology Data Exchange (ETDEWEB)

    Peles, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger-scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  16. Simulation-based optimization parametric optimization techniques and reinforcement learning

    CERN Document Server

    Gosavi, Abhijit

    2003-01-01

    Simulation-Based Optimization: Parametric Optimization Techniques and Reinforcement Learning introduces the evolving area of simulation-based optimization. The book's objective is two-fold: (1) It examines the mathematical governing principles of simulation-based optimization, thereby providing the reader with the ability to model relevant real-life problems using these techniques. (2) It outlines the computational technology underlying these methods. Taken together these two aspects demonstrate that the mathematical and computational methods discussed in this book do work. Broadly speaking, the book has two parts: (1) parametric (static) optimization and (2) control (dynamic) optimization. Some of the book's special features are: *An accessible introduction to reinforcement learning and parametric-optimization techniques. *A step-by-step description of several algorithms of simulation-based optimization. *A clear and simple introduction to the methodology of neural networks. *A gentle introduction to converg...
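One classic technique from the parametric side of this field, simultaneous perturbation stochastic approximation (SPSA), estimates a gradient from just two simulation runs per iteration regardless of the number of parameters. A minimal sketch (my illustration with Spall's standard gain exponents, not code from the book):

```python
import random

def spsa(loss, theta, iters=200, a=0.1, c=0.1, seed=0):
    """Two-measurement SPSA: perturb all coordinates at once with a
    random +-1 vector; each `loss` call stands in for a simulation run."""
    rng = random.Random(seed)
    theta = list(theta)
    for k in range(1, iters + 1):
        ak, ck = a / k ** 0.602, c / k ** 0.101   # decaying gain sequences
        delta = [rng.choice((-1, 1)) for _ in theta]
        y_plus = loss([t + ck * d for t, d in zip(theta, delta)])
        y_minus = loss([t - ck * d for t, d in zip(theta, delta)])
        g = (y_plus - y_minus) / (2 * ck)
        # 1/delta_i equals delta_i for +-1 perturbations
        theta = [t - ak * g * d for t, d in zip(theta, delta)]
    return theta
```

Because the two measurements perturb every parameter simultaneously, the per-iteration simulation budget stays constant as the dimension grows, which is precisely why such methods suit expensive simulators.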

  17. Simulations of optimized anguilliform swimming.

    Science.gov (United States)

    Kern, Stefan; Koumoutsakos, Petros

    2006-12-01

    The hydrodynamics of anguilliform swimming motions was investigated using three-dimensional simulations of the fluid flow past a self-propelled body. The motion of the body is not specified a priori, but is instead obtained through an evolutionary algorithm used to optimize the swimming efficiency and the burst swimming speed. The results of the present simulations support the hypothesis that anguilliform swimmers modify their kinematics according to different objectives and provide a quantitative analysis of the swimming motion and the forces experienced by the body. The kinematics of burst swimming is characterized by the large amplitude of the tail undulations while the anterior part of the body remains straight. In contrast, during efficient swimming behavior significant lateral undulation occurs along the entire length of the body. In turn, during burst swimming, the majority of the thrust is generated at the tail, whereas in the efficient swimming mode, in addition to the tail, the middle of the body contributes significantly to the thrust. The burst swimming velocity is 42% higher and the propulsive efficiency is 15% lower than the respective values during efficient swimming. The wake, for both swimming modes, consists largely of a double row of vortex rings with an axis aligned with the swimming direction. The vortex rings are responsible for producing lateral jets of fluid, which has been documented in prior experimental studies. We note that the primary wake vortices are qualitatively similar in both swimming modes except that the wake vortex rings are stronger and relatively more elongated in the fast swimming mode. The present results provide quantitative information of three-dimensional fluid-body interactions that may complement related experimental studies. In addition they enable a detailed quantitative analysis, which may be difficult to obtain experimentally, of the different swimming modes linking the kinematics of the motion with the forces

  18. Dynamic fault simulation of wind turbines using commercial simulation tools

    DEFF Research Database (Denmark)

    Lund, Torsten; Eek, Jarle; Uski, Sanna

    2005-01-01

    This paper compares the commercial simulation tools PSCAD/EMTDC, PowerFactory, SIMPOW and PSS/E for analysing fault sequences defined in the Danish grid code requirements for wind turbines connected at a voltage level below 100 kV. Both symmetrical and unsymmetrical faults are analysed. The deviations, and the reasons for the deviations between the tools, are stated. The simulation models are implemented using the built-in library components of the simulation tools, with the exception of the mechanical drive-train model, which had to be user-modeled in PowerFactory and PSS/E.

  19. FDTD simulation tools for UWB antenna analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Brocato, Robert Wesley

    2004-12-01

    This paper describes the development of a set of software tools useful for analyzing ultra-wideband (UWB) antennas and structures. These tools are used to perform finite difference time domain (FDTD) simulation of a conical antenna with continuous wave (CW) and UWB pulsed excitations. The antenna is analyzed using spherical coordinate-based FDTD equations that are derived from first principles. The simulation results for CW excitation are compared to simulation and measured results from published sources; the results for UWB excitation are new.

  20. Probabilistic fire simulator - Monte Carlo simulation tool for fire scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Hostikka, S.; Keski-Rahkonen, O. [VTT Building and Transport, Espoo (Finland)

    2002-11-01

    A risk analysis tool is developed for computing the distributions of fire model output variables. The tool, called Probabilistic Fire Simulator, combines Monte Carlo simulation and the CFAST two-zone fire model. In this work, it is used to calculate the failure probability of redundant cables and fire detector activation times in a cable tunnel fire. The sensitivity of the output variables to the input variables is calculated in terms of rank order correlations. (orig.)
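The sensitivity measure mentioned — rank order (Spearman) correlation between sampled inputs and model outputs — can be sketched with a toy model standing in for the fire model (my illustration, not the PFS code):

```python
import random

def rank(values):
    """Position of each value in sorted order (no ties expected here)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for pos, i in enumerate(order):
        r[i] = pos
    return r

def spearman(xs, ys):
    """Rank-order correlation: Pearson correlation computed on ranks."""
    rx, ry = rank(xs), rank(ys)
    mean = (len(xs) - 1) / 2
    cov = sum((a - mean) * (b - mean) for a, b in zip(rx, ry))
    var = sum((a - mean) ** 2 for a in rx)
    return cov / var

rng = random.Random(0)
# Toy stand-in for the fire model: the output is dominated by input q.
samples = [(rng.uniform(1, 2), rng.uniform(0, 1)) for _ in range(500)]
outputs = [10 * q + 0.1 * d for q, d in samples]
s_q = spearman([q for q, _ in samples], outputs)
s_d = spearman([d for _, d in samples], outputs)
```

Ranking the inputs by |correlation| then tells the analyst which uncertain parameters (here `q`) drive the spread of the output distribution and deserve better characterization.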

  1. Stochastic Simulation Tool for Aerospace Structural Analysis

    Science.gov (United States)

    Knight, Norman F.; Moore, David F.

    2006-01-01

    Stochastic simulation refers to incorporating the effects of design tolerances and uncertainties into the design analysis model and then determining their influence on the design. A high-level evaluation of one such stochastic simulation tool, the MSC.Robust Design tool by MSC.Software Corporation, has been conducted. This stochastic simulation tool provides structural analysts with a tool to interrogate their structural design based on their mathematical description of the design problem using finite element analysis methods. This tool leverages the analyst's prior investment in finite element model development of a particular design. The original finite element model is treated as the baseline structural analysis model for the stochastic simulations that are to be performed. A Monte Carlo approach is used by MSC.Robust Design to determine the effects of scatter in design input variables on response output parameters. The tool was not designed to provide a probabilistic assessment, but to assist engineers in understanding cause and effect. It is driven by a graphical-user interface and retains the engineer-in-the-loop strategy for design evaluation and improvement. The application problem for the evaluation is chosen to be a two-dimensional shell finite element model of a Space Shuttle wing leading-edge panel under re-entry aerodynamic loading. MSC.Robust Design adds value to the analysis effort by rapidly being able to identify design input variables whose variability causes the most influence in response output parameters.

  2. Simulation Tool For Energy Consumption and Production

    DEFF Research Database (Denmark)

    Nysteen, Michael; Mynderup, Henrik; Poulsen, Bjarne

    2013-01-01

    In order to promote adoption of the smart grid with the general public, it is necessary to be able to visualize the benefits of a smart home. Software tools that model the effects can help significantly with this. However, only little work has been done in the area of simulating and visualizing the energy consumption in smart homes. This paper presents a prototype simulation tool that allows graphical modeling of a home. Based on the modeled homes, the user is able to simulate the energy consumption and compare scenarios. The simulations are based on dynamic weather and energy price data as well as a controller unit of the user's choice. The results of the simulations can be compared using a dynamic reporting window that allows the user to create custom charts of the data. The application has been designed such that it can easily be extended with additional controller units, price and weather data...

  3. A Data Simulator Tool for NIRCam

    Science.gov (United States)

    Hilbert, Bryan; Canipe, Alicia Michelle; Robberto, Massimo; NIRCam Team at STScI

    2017-06-01

    We present a new data simulator tool capable of producing high fidelity simulated data for NIRCam. This simulator produces “raw” multiaccum integrations, each composed of multiple non-destructive detector readouts. This is equivalent to data from real observations prior to the application of any calibration steps. Our primary motivation for creating this tool is to produce realistic data with which to test the JWST calibration pipeline steps, from basic calibration through the production of mosaic images and associated source catalogs. However, data created from this tool may also be useful to observers wishing to have example data to test custom data reduction or analysis software. The simulator begins with a real NIRCam dark current integration and adds synthetic astronomical sources. In this way, the simulated integration is guaranteed to contain all of the same noise characteristics and detector effects that will be present in real NIRCam observations. The output format of the simulated data is such that the files can be immediately run through the standard JWST calibration pipelines. Currently the tool supports the creation of NIRCam imaging and dispersed (wide field slitless) observations, including moving target (non-sidereal tracking) and time series observation data.

  4. Study on Tool Path Optimization in Multi-axis NC Machining

    Directory of Open Access Journals (Sweden)

    Niu Xinghua

    2015-01-01

    Full Text Available This paper presents a new generation algorithm for tool paths based on the optimization of traditional algorithms. Then, the tool path on an impeller is generated with UG software, and it is used to compare and verify the effect of the optimization. Finally, VERICUT software, which can simulate the whole manufacturing process, is utilized to verify the feasibility of the optimized algorithm.

  5. A simulation tool for brassiness studies.

    Science.gov (United States)

    Gilbert, Joël; Menguy, Ludovic; Campbell, Murray

    2008-04-01

    A frequency-domain numerical model of brass instrument sound production is proposed as a tool to predict their brassiness, defined as the rate of spectral enrichment with increasing dynamic level. It is based on generalized Burgers equations dedicated to weakly nonlinear wave propagation in nonuniform ducts, and is an extension of previous work by Menguy and Gilbert [Acta Acustica 86, 798-810 (2000)], initially limited to short cylindrical tubes. The relevance of the present tool is evaluated by carrying out simulations over distances longer than typical shock formation distances, and by doing preliminary simulations of periodic regimes in a typical brass trombone bore geometry.

  6. Simulation Based Optimization for World Line Card Production System

    Directory of Open Access Journals (Sweden)

    Sinan APAK

    2012-07-01

    Full Text Available A simulation based decision support system is one of the commonly used tools to examine complex production systems. The simulation approach provides process modules which can be adjusted with certain parameters by using data relatively easily obtainable in the production process. The World Line Card production system simulation is developed to evaluate the optimality of the existing production line using a discrete event simulation model with a variety of alternative proposals. The current production system is analysed by a simulation model, emphasizing the bottlenecks and the poorly utilized production line. Our analysis identified some improvements and efficient solutions for the existing system.

  7. Realtime simulation tools in the CHORALE workshop

    Science.gov (United States)

    Cathala, Thierry; Le Goff, Alain; Gozard, Patrick; Latger, Jean

    2006-05-01

    CHORALE (simulated Optronic Acoustic Radar battlefield) is used by the French DGA/DET (Directorate for Evaluation of the French Ministry of Defense) to perform multi-sensor simulations. CHORALE enables the user to create virtual and realistic multi-spectral 3D scenes, and generate the physical signal received by a sensor, typically an IR sensor. To evaluate their efficiency in visible and infrared wavelengths, simulation tools that give a good representation of physical phenomena are used. This article describes the elements used to prepare data (3D database, materials, scenario, ...) for the simulation, and the set of tools (SE-FAST-IR) used in CHORALE for the Real Time simulation in the infrared spectrum. The SE-FAST-IR package allows the compilation and visualization of 3D databases for infrared simulations. It enables one to visualize complex and large databases for a wide set of real and pseudo-real time applications. SE-FAST-IR is based on the physical model of the Non Real Time tool of the CHORALE workshop. It automatically computes radiance textures, OpenGL light source and fog-law parameters for predefined thermal and atmospheric conditions, specified by the user.

  8. Optimizing reversible simulation of injective functions

    DEFF Research Database (Denmark)

    Yokoyama, Tetsuo; Axelsen, Holger Bock; Glück, Robert

    2012-01-01

    Bennett showed that a clean reversible simulation of injective programs is possible without returning the input of a program as additional output. His method involves two computation and two uncomputation phases. This paper proposes an optimization of Bennett’s simulation that requires only half ......-coding program is further optimized by conserving the model over the text-generation phase. This paper may thus provide a new view on developing efficient reversible simulations for a certain class of injective functions....

  9. Simulated annealing algorithm for optimal capital growth

    Science.gov (United States)

    Luo, Yong; Zhu, Bo; Tang, Yong

    2014-08-01

    We investigate the problem of dynamic optimal capital growth of a portfolio. A general framework was developed in which one strives to maximize the expected logarithmic utility of the long-term growth rate. Exact optimization algorithms run into difficulties in this framework, which motivates the investigation of applying a simulated annealing algorithm to optimize the capital growth of a given portfolio. Empirical results with real financial data indicate that the approach is promising for capital growth portfolios.
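    The acceptance-and-cooling loop of a simulated annealing optimizer for expected log growth can be sketched as follows. The return scenarios, step size, and cooling schedule are illustrative assumptions, not the authors' settings or data:

    ```python
    import math
    import random

    def expected_log_growth(weights, scenarios):
        """Average log of the portfolio's gross return over return scenarios."""
        total = 0.0
        for gross_returns in scenarios:
            portfolio_gross = sum(w * r for w, r in zip(weights, gross_returns))
            total += math.log(portfolio_gross)
        return total / len(scenarios)

    def normalize(w):
        s = sum(w)
        return [x / s for x in w]

    random.seed(1)
    # Toy gross-return scenarios for three assets (illustrative, not real data).
    scenarios = [[1.0 + random.gauss(mu, 0.1) for mu in (0.02, 0.05, 0.01)]
                 for _ in range(200)]

    w = normalize([1.0, 1.0, 1.0])
    cur_val = expected_log_growth(w, scenarios)
    best, best_val = w, cur_val
    temp = 0.1
    for _ in range(2000):
        # Perturb the weights, keeping them positive and summing to one.
        cand = normalize([max(1e-6, x + random.gauss(0.0, 0.05)) for x in w])
        val = expected_log_growth(cand, scenarios)
        # Metropolis acceptance: always take improvements, sometimes accept worse.
        if val > cur_val or random.random() < math.exp((val - cur_val) / temp):
            w, cur_val = cand, val
            if cur_val > best_val:
                best, best_val = w, cur_val
        temp *= 0.999  # geometric cooling schedule

    print("best expected log growth:", round(best_val, 4))
    ```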

  10. Advances in Intelligent Modelling and Simulation Simulation Tools and Applications

    CERN Document Server

    Oplatková, Zuzana; Carvalho, Marco; Kisiel-Dorohinicki, Marek

    2012-01-01

    The human capacity to abstract complex systems and phenomena into simplified models has played a critical role in the rapid evolution of our modern industrial processes and scientific research. As a science and an art, Modelling and Simulation have been one of the core enablers of this remarkable human trait, and have become a topic of great importance for researchers and practitioners. This book was created to compile some of the most recent concepts, advances, challenges and ideas associated with Intelligent Modelling and Simulation frameworks, tools and applications. The first chapter discusses the important aspects of a human interaction and the correct interpretation of results during simulations. The second chapter gets to the heart of the analysis of entrepreneurship by means of agent-based modelling and simulations. The following three chapters bring together the central theme of simulation frameworks, first describing an agent-based simulation framework, then a simulator for electrical machines, and...

  11. Simulation and OR (operations research) in combination for practical optimization

    NARCIS (Netherlands)

    N. van Dijk; E. van der Sluis; R. Haijema; A. Al-Ibrahim; J. van der Wal

    2005-01-01

    Should we pool capacities or not? This is a question that one can regularly be confronted with in operations and service management. It is a question that necessarily requires a combination of queueing (as OR discipline) and simulation (as evaluative tool) and further steps for optimization. It will

  12. Desensitized Optimal Filtering and Sensor Fusion Tool Kit Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Research on desensitized optimal filtering techniques and a navigation and sensor fusion tool kit using advanced filtering techniques is proposed. Research focuses...

  13. Simulation Tool for Dielectric Barrier Discharge Plasma Actuators

    Science.gov (United States)

    Likhanskii, Alexander

    2014-01-01

    Traditional approaches for active flow separation control using dielectric barrier discharge (DBD) plasma actuators are limited to relatively low speed flows and atmospheric conditions. This results in low feasibility of the DBDs for aerospace applications. For active flow control at turbine blades, fixed wings, and rotary wings and on hypersonic vehicles, DBD plasma actuators must perform at a wide range of conditions, including rarified flows and combustion mixtures. An efficient, comprehensive, physically based DBD simulation tool can optimize DBD plasma actuators for different operation conditions. Researchers are developing a DBD plasma actuator simulation tool for a wide range of ambient gas pressures. The tool will treat DBD using either kinetic, fluid, or hybrid models, depending on the DBD operational condition.

  14. A simulated annealing technique for multi-objective simulation optimization

    OpenAIRE

    Mahmoud H. Alrefaei; Diabat, Ali H.

    2009-01-01

    In this paper, we present a simulated annealing algorithm for solving multi-objective simulation optimization problems. The algorithm is based on the idea of simulated annealing with constant temperature, and uses a rule for accepting a candidate solution that depends on the individual estimated objective function values. The algorithm is shown to converge almost surely to an optimal solution. It is applied to a multi-objective inventory problem; the numerical results show that the algorithm ...

  15. The Virtual Habitat - a tool for Life Support Systems optimization

    Science.gov (United States)

    Czupalla, Markus; Dirlich, Thomas; Harder, Jan; Pfeiffer, Matthias

    In the course of designing Life Support Systems (LSS), a great multitude of concepts and combinations of subsystems and components are developed. In order to find an optimal LSS solution, thus the right combination of subsystems, the parameters for the definition of the optimization itself have to be determined. The frequently used Equivalent Systems Mass (ESM) based trade study approach for life support systems is well suited for phase A conceptual design evaluations. The ESM approach allows an efficient evaluation of LSS on a component or subsystem level. The necessary next step in the process is the design, evaluation and optimization of the LSS on a system level. For the system level LSS design a classic ESM-based trade study does not seem able to provide the information that is necessary to evaluate the concept correctly. Important decisive criteria such as system stability, controllability and effectiveness are not represented in the ESM approach. These parameters directly and decisively impact the scientific efficiency of the crew, and thereby the mission in total. Thus, for system level optimization these criteria must be included alongside the ESM in a new integral optimization method. In order to be able to apply such an integral criterion, dynamic modeling of most involved LSS subsystems, especially of the human crew, is necessary. Only then does the required information about the efficiency of the LSS over time, e.g. the system's stability, become available. In an effort to establish a dynamic simulation environment for habitats in extreme environmental conditions, the "Virtual Habitat" tool is being developed by the Human Spaceflight Group of the Technische Universität München (TUM). The paper discussed here presents the concept of the Virtual Habitat simulation. It discusses in what way the simulation tool enables a prediction of system characteristics and required information demanded by an integral optimization criterion. In general the

  16. A tool for simulating parallel branch-and-bound methods

    Directory of Open Access Journals (Sweden)

    Golubeva Yana

    2016-01-01

    Full Text Available The Branch-and-Bound method is known as one of the most powerful but very resource consuming global optimization methods. Parallel and distributed computing can efficiently cope with this issue. The major difficulty in the parallel B&B method is the need for dynamic load redistribution. Therefore design and study of load balancing algorithms is a separate and very important research topic. This paper presents a tool for simulating the parallel Branch-and-Bound method. The simulator allows one to run load balancing algorithms with various numbers of processors, sizes of the search tree, and characteristics of the supercomputer’s interconnect, thereby fostering deep study of load distribution strategies. The process of resolution of the optimization problem by the B&B method is replaced by a stochastic branching process. Data exchanges are modeled using the concept of logical time. The user-friendly graphical interface to the simulator provides efficient visualization and convenient performance analysis.
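    Replacing the actual B&B resolution process by a stochastic branching process, as the simulator does, can be illustrated with a toy model in which each subproblem either branches into two children or is pruned; the branching probability and depth cap below are arbitrary illustrative parameters, not the paper's:

    ```python
    import random

    def simulate_bnb_tree(branch_prob, max_depth, rng):
        """Grow a random tree: each node either branches into two subproblems
        (with probability branch_prob) or is pruned by the bound (a leaf).
        Returns the number of nodes 'processed'."""
        nodes = 0
        stack = [0]  # stack of node depths, mimicking depth-first B&B
        while stack:
            depth = stack.pop()
            nodes += 1
            if depth < max_depth and rng.random() < branch_prob:
                stack.extend([depth + 1, depth + 1])
        return nodes

    rng = random.Random(7)
    sizes = [simulate_bnb_tree(0.45, 30, rng) for _ in range(1000)]
    print("mean simulated tree size:", sum(sizes) / len(sizes))
    ```

    With a branching probability below 0.5 the process is subcritical, so tree sizes stay finite on average; a load-balancing study would distribute these synthetic nodes over simulated processors.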

  17. A tool for simulating parallel branch-and-bound methods

    Science.gov (United States)

    Golubeva, Yana; Orlov, Yury; Posypkin, Mikhail

    2016-01-01

    The Branch-and-Bound method is known as one of the most powerful but very resource consuming global optimization methods. Parallel and distributed computing can efficiently cope with this issue. The major difficulty in the parallel B&B method is the need for dynamic load redistribution. Therefore design and study of load balancing algorithms is a separate and very important research topic. This paper presents a tool for simulating the parallel Branch-and-Bound method. The simulator allows one to run load balancing algorithms with various numbers of processors, sizes of the search tree, and characteristics of the supercomputer's interconnect, thereby fostering deep study of load distribution strategies. The process of resolution of the optimization problem by the B&B method is replaced by a stochastic branching process. Data exchanges are modeled using the concept of logical time. The user-friendly graphical interface to the simulator provides efficient visualization and convenient performance analysis.

  18. Simulation tool of a Supernova search with VST

    OpenAIRE

    Calvi, R.; Cappellaro, E; Botticella, M. T.; Riello, M.

    2007-01-01

    To improve the estimate of SN rates for all types as a function of redshift, a three-year SN search with the VST telescope has been proposed and accepted. To help planning an optimal strategy for the search, we have developed a simulation tool used to predict the numbers of Supernovae of different types which are expected to be discovered in a magnitude-limited survey. In our simulation a most important ingredient has been the determination of the K-correction as function of redshift for ever...

  19. Induction generator models in dynamic simulation tools

    DEFF Research Database (Denmark)

    Knudsen, Hans; Akhmatov, Vladislav

    1999-01-01

    For AC networks with a large amount of induction generators (windmills) the paper demonstrates a significant discrepancy in the simulated voltage recovery after fault in weak networks when comparing dynamic and transient stability descriptions, and the reasons for the discrepancies are explained. It is found to be possible to include a transient model in dynamic stability tools and, then, obtain correct results also in dynamic tools. The representation of the rotating system influences the voltage recovery shape, which is an important observation in case of windmills, where a heavy mill is connected......

  20. Simulation platform to model, optimize and design wind turbines

    Energy Technology Data Exchange (ETDEWEB)

    Iov, F.; Hansen, A.D.; Soerensen, P.; Blaabjerg, F.

    2004-03-01

    This report is a general overview of the results obtained in the project 'Electrical Design and Control. Simulation Platform to Model, Optimize and Design Wind Turbines'. The motivation for this research project is the ever-increasing wind energy penetration into the power network. Therefore, the main goal of the project is to create a model database in different simulation tools for system optimization of wind turbine systems. Using this model database a simultaneous optimization of the aerodynamic, mechanical, electrical and control systems over the whole range of wind speeds and grid characteristics can be achieved. The report is structured in six chapters. First, the background of this project and the main goals as well as the structure of the simulation platform are given. The main topologies for wind turbines, which have been taken into account during the project, are briefly presented. Then, the considered simulation tools, namely HAWC, DIgSILENT, Saber and Matlab/Simulink, which have been used in this simulation platform, are described. The focus here is on the modelling and simulation time scale aspects. The abilities of these tools are complementary and they can together cover all the modelling aspects of wind turbines, e.g. mechanical loads, power quality, switching, control and grid faults. However, other simulation packages, e.g. PSCAD/EMTDC, can easily be added to the simulation platform. New models and new control algorithms for wind turbine systems have been developed and tested in these tools. All these models are collected in dedicated libraries in Matlab/Simulink as well as in Saber. Some simulation results from the considered tools are presented for MW wind turbines. These simulation results focus on fixed-speed and variable speed/pitch wind turbines. A good agreement with the real behaviour of these systems is obtained for each simulation tool. These models can easily be extended to model different kinds of wind turbines or large wind

  1. optimal assembly line balancing using simulation techniques

    African Journals Online (AJOL)

    user


  2. Educational Tool for Optimal Controller Tuning Using Evolutionary Strategies

    Science.gov (United States)

    Carmona Morales, D.; Jimenez-Hornero, J. E.; Vazquez, F.; Morilla, F.

    2012-01-01

    In this paper, an optimal tuning tool is presented for control structures based on multivariable proportional-integral-derivative (PID) control, using genetic algorithms as an alternative to traditional optimization algorithms. From an educational point of view, this tool provides students with the necessary means to consolidate their knowledge on…


  4. A Simulation Tool for Downlink Long Term Evolution-advanced

    Directory of Open Access Journals (Sweden)

    Huda Adibah Mohd Ramli

    2014-11-01

    Full Text Available Long Term Evolution-Advanced (LTE-A) is an emerging mobile cellular system envisaged to provide better quality of multimedia applications. Packet scheduling becomes paramount as the LTE-A delivers multimedia applications using packet switching technology. Given that LTE-A is a new technology, its ability to satisfy the Quality of Service (QoS) requirements of multimedia applications demands further performance study. At present, a number of LTE-A simulators are available. However, these simulators in general are too specific in nature or their source codes are not publicly accessible for the research communities. As such, this study presents a novel simulation tool to assist the research communities to study the downlink LTE-A and further optimize packet scheduling performance. This simulation tool accurately models the downlink LTE-A taking user mobility, carrier aggregation, packet scheduling and other aspects that are relevant to the research communities into consideration. The efficacy of the simulation tool is validated through performance study of a number of well-known packet scheduling algorithms.
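    As an example of the well-known packet scheduling algorithms such a simulator is used to study, a proportional-fair scheduler can be sketched in a few lines: in each TTI the resource goes to the user with the highest ratio of instantaneous rate to smoothed average throughput. The channel rates and smoothing constant below are illustrative assumptions, not taken from the tool:

    ```python
    import random

    def proportional_fair(rates_per_tti, smoothing=0.05):
        """Each TTI, grant the resource to the user maximizing
        instantaneous rate / exponentially smoothed average throughput."""
        n_users = len(rates_per_tti[0])
        avg = [1e-6] * n_users    # smoothed throughput per user
        served = [0.0] * n_users  # total data delivered per user
        for rates in rates_per_tti:
            chosen = max(range(n_users), key=lambda u: rates[u] / avg[u])
            for u in range(n_users):
                inst = rates[u] if u == chosen else 0.0
                avg[u] = (1.0 - smoothing) * avg[u] + smoothing * inst
                served[u] += inst
        return served

    random.seed(3)
    # Two users: one with a good channel, one with a poor one (illustrative rates).
    ttis = [[random.uniform(5.0, 10.0), random.uniform(1.0, 2.0)]
            for _ in range(2000)]
    served = proportional_fair(ttis)
    print("data served per user:", [round(s) for s in served])
    ```

    The good-channel user receives more data overall, but the poor-channel user is never starved — the trade-off proportional-fair scheduling is designed to strike.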

  5. USMC Logistics Resource Allocation Optimization Tool

    Science.gov (United States)

    2015-12-01

    Keywords: Logistics, Supply Chain Management, Modeling and Optimization, Linear Program. ...many optimization methods which are applicable to a logistical transportation problem is elusive at best. The method presented through MARPOM is...

  6. Modelling, simulating and optimizing Boilers

    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels

    2003-01-01

    of the boiler has been developed and simulations carried out by means of the Matlab integration routines. The model is prepared as a dynamic model consisting of both ordinary differential equations and algebraic equations, together formulated as a Differential-Algebraic- Equation system. Being able to operate...

  7. A Powerful Optimization Tool for Analog Integrated Circuits Design

    Directory of Open Access Journals (Sweden)

    M. Kubar

    2013-09-01

    Full Text Available This paper presents a new optimization tool for analog circuit design. The proposed tool is based on a robust version of the differential evolution optimization method. Corners of technology, temperature, voltage and current supplies are taken into account during the optimization. This ensures robust resulting circuits; such circuits usually do not need any schematic change and are ready for layout. The newly developed tool is implemented directly in the Cadence design environment to achieve a very short setup time for the optimization task. The design automation procedure was enhanced by an optimization watchdog feature, created to control optimization progress and, moreover, to reduce the search space to produce a better design in a shorter time. The optimization algorithm presented in this paper was successfully tested on several design examples.
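    The differential evolution method the tool builds on can be sketched in its classic DE/rand/1/bin-style form, shown here on a toy objective rather than a circuit netlist; all parameter values are illustrative and this is not the paper's implementation:

    ```python
    import random

    def differential_evolution(func, bounds, pop_size=20, f_weight=0.8,
                               crossover=0.9, generations=100, seed=0):
        """Minimal DE/rand/1/bin-style sketch with greedy selection."""
        rng = random.Random(seed)
        dim = len(bounds)
        pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
        scores = [func(ind) for ind in pop]
        for _ in range(generations):
            for i in range(pop_size):
                # Pick three distinct donors, none equal to the target vector.
                a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
                trial = []
                for d in range(dim):
                    if rng.random() < crossover:
                        v = pop[a][d] + f_weight * (pop[b][d] - pop[c][d])
                    else:
                        v = pop[i][d]
                    lo, hi = bounds[d]
                    trial.append(min(max(v, lo), hi))  # clip to the search box
                score = func(trial)
                if score < scores[i]:  # keep the trial only if it improves
                    pop[i], scores[i] = trial, score
        best = min(range(pop_size), key=scores.__getitem__)
        return pop[best], scores[best]

    # Example: minimize the 3-D sphere function.
    best_x, best_f = differential_evolution(lambda x: sum(v * v for v in x),
                                            [(-5.0, 5.0)] * 3)
    print("best objective:", best_f)
    ```

    A corner-aware variant, as described in the record, would evaluate `func` at every process/temperature/supply corner and score the worst case.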

  8. SOFI Simulation Tool: A Software Package for Simulating and Testing Super-Resolution Optical Fluctuation Imaging.

    Science.gov (United States)

    Girsault, Arik; Lukes, Tomas; Sharipov, Azat; Geissbuehler, Stefan; Leutenegger, Marcel; Vandenberg, Wim; Dedecker, Peter; Hofkens, Johan; Lasser, Theo

    2016-01-01

    Super-resolution optical fluctuation imaging (SOFI) allows one to perform sub-diffraction fluorescence microscopy of living cells. By analyzing the acquired image sequence with an advanced correlation method, i.e. a high-order cross-cumulant analysis, super-resolution in all three spatial dimensions can be achieved. Here we introduce a software tool for a simple qualitative comparison of SOFI images under simulated conditions considering parameters of the microscope setup and essential properties of the biological sample. This tool incorporates SOFI and STORM algorithms, displays and describes the SOFI image processing steps in a tutorial-like fashion. Fast testing of various parameters simplifies the parameter optimization prior to experimental work. The performance of the simulation tool is demonstrated by comparing simulated results with experimentally acquired data.

  9. First Report of the Simulation Optimization Group

    CERN Document Server

    Rimoldi, A; Dell'Acqua, A; Froidevaux, D; Gianotti, F; Guyot, C; Hinchliffe, I; Jakobs, K; Marshall, Z; Nasati, A; Quarrie, D; Unal, G; Young, C

    2008-01-01

    This is the first report of the ATLAS Simulation Optimization Group, established in June of 2007. This article justifies the selected Geant4 version, physics list, and range cuts to be used by the default ATLAS simulation for initial data taking and beyond. The current status of several projects, including detector description, simulation validation, studies of additional Geant4 parameters, and cavern background, are reported.

  10. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, K.; Condra, T.; Houbak, Niels

    2003-01-01

    , and the total stress level (i.e. stresses introduced due to internal pressure plus stresses introduced due to temperature gradients) must always be kept below the allowable stress level. In this way, the increased water-/steam space that should allow for better dynamic performance, in the end causes limited...... freedom with respect to dynamic operation of the plant. By means of an objective function including as well the price of the plant as a quantification of the value of dynamic operation of the plant an optimization is carried out. The dynamic model of the boiler plant is applied to define parts...


  12. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels

    2004-01-01

    on the boiler) have been defined. Furthermore a number of constraints related to: minimum and maximum boiler load gradient, minimum boiler size, Shrinking and Swelling and Steam Space Load have been defined. For defining the constraints related to the required boiler volume a dynamic model for simulating the boiler...... performance has been developed. Outputs from the simulations are shrinking and swelling of water level in the drum during for example a start-up of the boiler; these figures combined with the requirements with respect to allowable water level fluctuations in the drum define the requirements with respect to drum...... size. The model has been formulated with a specified building-up of the pressure during the start-up of the plant, i.e. the steam production during start-up of the boiler is output from the model. The steam outputs together with requirements with respect to steam space load have been utilized to define...

  13. DNA – A General Energy System Simulation Tool

    DEFF Research Database (Denmark)

    Elmegaard, Brian; Houbak, Niels

    2005-01-01

    operation. The program decides at runtime to apply the DAE solver if the system contains differential equations. This makes it easy to extend an existing steady state model to simulate dynamic operation of the plant. The use of the program is illustrated by examples of gas turbine models. The paper also......The paper reviews the development of the energy system simulation tool DNA (Dynamic Network Analysis). DNA has been developed since 1989 to be able to handle models of any kind of energy system based on the control volume approach, usually systems of lumped parameter components. DNA has proven...... to be a useful tool in the analysis and optimization of several types of thermal systems: Steam turbines, gas turbines, fuels cells, gasification, refrigeration and heat pumps for both conventional fossil fuels and different types of biomass. DNA is applicable for models of both steady state and dynamic...

  14. Performance Optimization of the ATLAS Detector Simulation

    CERN Document Server

    AUTHOR|(CDS)2091018

    In the thesis at hand the current performance of the ATLAS detector simulation, part of the Athena framework, is analyzed and possible optimizations are examined. For this purpose the event based sampling profiler VTune Amplifier by Intel is utilized. As the most important metric to measure improvements, the total execution time of the simulation of $t\\bar{t}$ events is also considered. All efforts are focused on structural changes, which do not influence the simulation output and can be attributed to CPU specific issues, especially front end stalls and vectorization. The most promising change is the activation of profile guided optimization for Geant4, which is a critical external dependency of the simulation. Profile guided optimization gives an average improvement of $8.9\\%$ and $10.0\\%$ for the two considered cases at the cost of one additional compilation (instrumented binaries) and execution (training to obtain profiling data) at build time.

  15. Topology and boundary shape optimization as an integrated design tool

    Science.gov (United States)

    Bendsoe, Martin Philip; Rodrigues, Helder Carrico

    1990-01-01

    The optimal topology of a two dimensional linear elastic body can be computed by regarding the body as a domain of the plane with a high density of material. Such an optimal topology can then be used as the basis for a shape optimization method that computes the optimal form of the boundary curves of the body. This results in an efficient and reliable design tool, which can be implemented via common FEM mesh generator and CAD type input-output facilities.

  16. OpenSimulator Interoperability with DRDC Simulation Tools: Compatibility Study

    Science.gov (United States)

    2014-09-01

    These tools include Blender, AutoCAD, 3DS Max and SketchUp. Thus, DRDC use case (1) was demonstrated as feasible. However... supplementary support for the creation of content and resources compatible with OpenSimulator. These include programs such as Blender, GIMP, AutoCAD, 3DS MAX... support the PDS, PDMS, ACIS (.sat), MicroStation (.dgn), and AutoCAD (.dwg) formats, but not COLLADA directly [14]. An important limitation of

  17. Quenching Simulation of PM Coated Tools

    Institute of Scientific and Technical Information of China (English)

    Axel Höfter; Werner Theisen; Christoph Broeckmann

    2004-01-01

    HIP cladding is a powder metallurgical coating technique used in the production of wear parts and tools. In many cases the composite components consist of carbide-free hot-work steel as base material and wear resistant carbide-rich PM cold-work steel as coating material. To ensure operativeness a heat treatment matched to the substrate and coating material is required. Dissimilar phase transformation behaviour and different thermal expansion coefficients of layer and substrate entail inner stresses affecting the transformation kinetics in turn. In order to get a deeper insight into these effects Finite Element simulation tools are used. On the one hand, the transient heat conduction problem of the quenching process has to be solved. Non-linear boundary conditions and phase transformation of both, substrate and layer are considered. On the other hand, the mechanical response is calculated. The overall aim of the investigation is an improvement of common heat treatment techniques used for HIP cladded wear parts.

  18. Quenching Simulation of PM Coated Tools

    Institute of Scientific and Technical Information of China (English)

    Axel Höfter; Werner Theisen; Christoph Broeckmann

    2004-01-01

    HIP cladding is a powder metallurgical coating technique used in the production of wear parts and tools. In many cases the composite components consist of carbide-free hot-work steel as base material and wear resistant carbide-rich PM cold-work steel as coating material. To ensure operativeness a heat treatment matched to the substrate and coating material is required. Dissimilar phase transformation behaviour and different thermal expansion coefficients of layer and substrate entail inner stresses affecting the transformation kinetics in turn. In order to get a deeper insight into these effects Finite Element simulation tools are used. On the one hand, the transient heat conduction problem of the quenching process has to be solved. Non-linear boundary conditions and phase transformation of both, substrate and layer are considered. On the other hand, the mechanical response is calculated. The overall aim of the investigation is an improvement of common heat treatment techniques used for HIP cladded wear parts.
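
    The transient heat-conduction step of such a quenching simulation can be illustrated with a one-dimensional explicit finite-difference scheme. This is a generic sketch with invented material values and fixed-temperature boundaries, not the Finite Element model or the PM steel data of the paper:

```python
# Explicit finite-difference solution of 1D transient heat conduction,
# dT/dt = alpha * d2T/dx2, as a toy stand-in for the quenching step.
# All material and process values below are illustrative placeholders.

def quench_1d(n=50, length=0.01, alpha=5e-6, t_surface=300.0,
              t_initial=1100.0, dt=1e-4, steps=2000):
    """Return the temperature profile of a bar quenched at both ends."""
    dx = length / (n - 1)
    r = alpha * dt / dx ** 2          # explicit scheme requires r <= 0.5
    assert r <= 0.5, "time step too large for explicit scheme"
    temp = [t_initial] * n
    temp[0] = temp[-1] = t_surface    # quenchant fixes the surface temperature
    for _ in range(steps):
        new = temp[:]
        for i in range(1, n - 1):
            new[i] = temp[i] + r * (temp[i - 1] - 2 * temp[i] + temp[i + 1])
        new[0] = new[-1] = t_surface
        temp = new
    return temp

profile = quench_1d()
```

    After 0.2 s of simulated time the surface nodes sit at the quenchant temperature while the core is still near the initial temperature, which is the regime in which transformation stresses build up.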

  19. MODELLING, SIMULATING AND OPTIMIZING BOILERS

    DEFF Research Database (Denmark)

    Sørensen, Kim; Condra, Thomas Joseph; Houbak, Niels

    2004-01-01

    on the boiler) have been defined. Furthermore a number of constraints related to: minimum and maximum boiler load gradient, minimum boiler size, Shrinking and Swelling and Steam Space Load have been defined. For defining the constraints related to the required boiler volume a dynamic model for simulating the boiler...... size. The model has been formulated with a specified building-up of the pressure during the start-up of the plant, i.e. the steam production during start-up of the boiler is output from the model. The steam outputs together with requirements with respect to steam space load have been utilized to define...... of the boiler is (with an acceptable accuracy) proportional with the volume of the boiler. For the dynamic operation capability a cost function penalizing limited dynamic operation capability and vice versa has been defined. The main idea is that it by means of the parameters in this function is possible to fit its...

  20. Simulating Protein Conformations through Global Optimization

    CERN Document Server

    Mucherino, A; Pardalos, P M

    2008-01-01

    Many researchers have been working on the protein folding problem for more than half a century. Protein folding is indeed one of the major unsolved problems in science. In this work, we discuss a model for the simulation of protein conformations. This simple model is based on the idea of imposing a few geometric requirements on chains of atoms representing the backbone of a protein conformation. The model leads to the formulation of a global optimization problem, whose solutions correspond to conformations satisfying the desired requirements. The global optimization problem is solved by the recently proposed Monkey Search algorithm. The simplicity of the optimization problem and the effectiveness of the meta-heuristic search allowed the simulation of a large set of high-quality conformations. We show that, even though only a few geometric requirements are imposed, some of the simulated conformations turn out to be similar (in terms of RMSD) to conformations that real proteins actually have in nature.
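
    The optimization view taken by the abstract can be illustrated with a toy: put penalty terms on the bend angles of a chain and minimize them by random local search. This generic hill climb stands in for the Monkey Search algorithm, whose details are not given here; the energy function and all parameters are invented for the sketch:

```python
import math
import random

# Toy "geometric requirement" energy: each bend angle of the chain is
# penalized for deviating from an ideal value. A random-restart-free
# hill climb (NOT the paper's Monkey Search) drives the energy down.

def energy(angles, ideal=math.radians(120)):
    return sum((a - ideal) ** 2 for a in angles)

def hill_climb(n=10, iters=2000, seed=0):
    rng = random.Random(seed)
    angles = [rng.uniform(0, math.pi) for _ in range(n)]
    best = energy(angles)
    for _ in range(iters):
        i = rng.randrange(n)
        trial = angles[:]
        trial[i] += rng.gauss(0, 0.1)   # perturb one angle
        e = energy(trial)
        if e < best:                    # accept only improvements
            angles, best = trial, e
    return best

best = hill_climb()
```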

  1. Optimizing Large-Scale ODE Simulations

    CERN Document Server

    Mulansky, Mario

    2014-01-01

    We present a strategy to speed up Runge-Kutta-based ODE simulations of large systems with nearest-neighbor coupling. We identify the cache/memory bandwidth as the crucial performance bottleneck. To reduce the required bandwidth, we introduce a granularity in the simulation and identify the optimal cluster size in a performance study. This leads to a considerable performance increase and transforms the algorithm from bandwidth bound to CPU bound. By additionally employing SIMD instructions we are able to boost the efficiency even further. In the end, a total performance increase of up to a factor of three is observed when using cache optimization and SIMD instructions compared to a standard implementation. All simulation codes are written in C++ and made publicly available. By using the modern C++ libraries Boost.odeint and Boost.SIMD, these optimizations can be implemented with minimal programming effort.

  2. GEOSSAV: a simulation tool for subsurface applications

    Science.gov (United States)

    Regli, Christian; Rosenthaler, Lukas; Huggenberger, Peter

    2004-04-01

    Geostatistical Environment fOr Subsurface Simulation And Visualization (GEOSSAV) is a tool for the integration of hard and soft data into stochastic simulation and visualization of distributions of geological structures and hydrogeological properties in the subsurface. GEOSSAV, as an interface to selected geostatistical modules (bicalib, gamv, vargplt, and sisim) from the Geostatistical Software LIBrary, GSLIB (GSLIB: Geostatistical Software Library and User's Guide, 2nd Edition, Oxford University Press, Oxford, 1998, 369pp), can be used for data analysis, variogram computation of regularly or irregularly spaced data, and sequential indicator simulation of subsurface heterogeneities. Sequential indicator simulation, based on various kriging techniques (simple, ordinary, and Bayesian), is suitable for the simulation of continuous variables such as hydraulic conductivity of an aquifer or chemical concentrations at a contaminated site, and categorical variables which indicate the presence or absence of a particular lithofacies. The software integration platform and development environment of GEOSSAV is Tool command language (Tcl) with its graphical user interface, Toolkit (Tk), and a number of Tcl/Tk extensions. The standard Open Graphics Library application programming interface is used for rendering three-dimensional (3D) data distributions and for slicing perpendicular to the main coordinate axis. Export options for finite-difference groundwater models allow either files that characterize single model layers (which are saved in ASCII matrix format) or files that characterize the complete 3D flow model setup for MODFLOW-based groundwater simulation systems (which are saved in block-centered flow package files (User's documentation for MODFLOW-96, an update to the US Geological Survey modular finite-difference ground-water flow model, Geological Survey Open-File Report 96-485, Reston, VA, 1996, 56pp)). 
GEOSSAV can be used whenever stochastic solutions are preferred

  3. Desensitized Optimal Filtering and Sensor Fusion Tool Kit Project

    Data.gov (United States)

    National Aeronautics and Space Administration — It is proposed to develop desensitized optimal filtering techniques and to implement these algorithms in a navigation and sensor fusion tool kit. These proposed...

  4. Security constrained optimal power flow by modern optimization tools

    African Journals Online (AJOL)

    The fertilization is divided into self and Cross Pollination. The self ... Blossom steadiness can be considered as the generation l. 4. ..... discovered considering the base case is 801.8436, and this esteem is .... Gaing Z., and ChangR., 2006, Security-constrained optimal power flow by mixed-integer genetic algorithm with.

  5. Computational tool for simulation of power and refrigeration cycles

    Science.gov (United States)

    Córdoba Tuta, E.; Reyes Orozco, M.

    2016-07-01

    Small improvements in the thermal efficiency of power cycles bring huge cost savings in the production of electricity; for that reason, a tool for the simulation of power cycles makes it possible to model the optimal changes for best performance. There is also a big boom in research on the Organic Rankine Cycle (ORC), which aims to generate electricity at low power through cogeneration, and in which the working fluid is usually a refrigerant. A tool for designing the elements of an ORC cycle and selecting the working fluid would be helpful, because heat sources from cogeneration vary widely and each case requires a custom design. This work presents the development of multiplatform software for the simulation of power and refrigeration cycles, implemented in C++ with a graphical interface built in the multiplatform environment Qt; it runs on Windows and Linux. The tool allows the design of custom power cycles, selection of the type of fluid (thermodynamic properties are calculated through the CoolProp library), calculation of the plant efficiency, identification of the flow fractions in each branch and, finally, generation of a highly instructive report in PDF format via LaTeX.
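
    The core efficiency calculation such a tool performs can be sketched from the specific enthalpies at the four state points of a Rankine cycle. The numbers below are illustrative placeholders, not properties looked up through the CoolProp library the paper uses:

```python
# Thermal efficiency of a simple Rankine cycle from state-point
# enthalpies (kJ/kg). State 1: pump inlet, 2: boiler inlet,
# 3: turbine inlet, 4: condenser inlet.

def rankine_efficiency(h1, h2, h3, h4):
    w_turbine = h3 - h4      # specific turbine work
    w_pump = h2 - h1         # specific pump work
    q_in = h3 - h2           # heat added in the boiler
    return (w_turbine - w_pump) / q_in

# Illustrative enthalpies for a low-pressure steam cycle (made up):
eta = rankine_efficiency(h1=192.0, h2=195.0, h3=3350.0, h4=2250.0)
```

    With these placeholder values the cycle converts roughly a third of the heat input into net work, which is the kind of figure a simulation tool lets the designer push upward by varying the state points.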

  6. Optimal design and uncertainty quantification in blood flow simulations for congenital heart disease

    Science.gov (United States)

    Marsden, Alison

    2009-11-01

    Recent work has demonstrated substantial progress in capabilities for patient-specific cardiovascular flow simulations. Recent advances include increasingly complex geometries, physiological flow conditions, and fluid structure interaction. However inputs to these simulations, including medical image data, catheter-derived pressures and material properties, can have significant uncertainties associated with them. For simulations to predict clinically useful and reliable output information, it is necessary to quantify the effects of input uncertainties on outputs of interest. In addition, blood flow simulation tools can now be efficiently coupled to shape optimization algorithms for surgery design applications, and these tools should incorporate uncertainty information. We present a unified framework to systematically and efficiently account for uncertainties in simulations using adaptive stochastic collocation. In addition, we present a framework for derivative-free optimization of cardiovascular geometries, and layer these tools to perform optimization under uncertainty. These methods are demonstrated using simulations and surgery optimization to improve hemodynamics in pediatric cardiology applications.
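
    A minimal non-intrusive stochastic collocation step can be sketched with Gauss-Hermite quadrature: the uncertain input is sampled at the quadrature nodes, the model is evaluated there, and output statistics come from the weighted sum. The quadratic `model` below is a stand-in for an expensive flow simulation, not anything from the paper:

```python
import numpy as np

# Propagate a Gaussian uncertain input through a black-box model by
# evaluating it only at Gauss-Hermite collocation nodes.

def model(x):
    return x ** 2 + 1.0          # placeholder for a simulation output

def collocation_mean(mu, sigma, n_nodes=5):
    # hermegauss uses the probabilists' weight exp(-x^2/2);
    # dividing by the weight sum normalizes to a probability measure.
    nodes, weights = np.polynomial.hermite_e.hermegauss(n_nodes)
    samples = mu + sigma * nodes
    return np.sum(weights * model(samples)) / np.sum(weights)

mean = collocation_mean(mu=0.0, sigma=1.0)
```

    For a standard normal input, E[x² + 1] = 2, and five nodes recover it exactly because the quadrature integrates low-order polynomials without error.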

  7. A Thermal Simulation Tool for Building and Its Interoperability through the Building Information Modeling (BIM Platform

    Directory of Open Access Journals (Sweden)

    Christophe Nicolle

    2013-05-01

    Full Text Available This paper describes potential challenges and opportunities for using thermal simulation tools to optimize building performance. After reviewing current trends in thermal simulation, it outlines major criteria for the evaluation of building thermal simulation tools based on specifications and capabilities in interoperability. Details are discussed including workflow of data exchange of multiple thermal analyses such as the BIM-based application. The present analysis focuses on selected thermal simulation tools that provide functionalities to exchange data with other tools, in order to obtain a picture of their basic working principles and to identify selection criteria for generic thermal tools in BIM. The significance of, and barriers to, integrated design with BIM and building thermal simulation tools are also discussed.

  8. A tool for study of optimal decision trees

    KAUST Repository

    Alkhalid, Abdulaziz

    2010-01-01

    The paper describes a tool which allows us, for relatively small decision tables, to perform consecutive optimization of decision trees relative to various complexity measures such as number of nodes, average depth, and depth, and to find the parameters and the number of optimal decision trees. © 2010 Springer-Verlag Berlin Heidelberg.
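
    The complexity measures named in the abstract are straightforward to compute for a given tree. In this sketch a tree is a nested tuple (a leaf is a label, an internal node is an attribute followed by its subtrees); the representation is invented for illustration, not the tool's actual data structure:

```python
# Complexity measures for decision trees: number of nodes, depth,
# and average leaf depth, computed recursively.

def is_leaf(tree):
    return not isinstance(tree, tuple)

def count_nodes(tree):
    if is_leaf(tree):
        return 1
    return 1 + sum(count_nodes(child) for child in tree[1:])

def depth(tree):
    if is_leaf(tree):
        return 0
    return 1 + max(depth(child) for child in tree[1:])

def average_leaf_depth(tree):
    def walk(t, d):
        if is_leaf(t):
            return [d]
        out = []
        for child in t[1:]:
            out.extend(walk(child, d + 1))
        return out
    depths = walk(tree, 0)
    return sum(depths) / len(depths)

# A two-attribute toy tree: test x1; if it fails, test x2.
tree = ("x1", "yes", ("x2", "no", "yes"))
```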

  9. Watershed Management Optimization Support Tool (WMOST) v2: Theoretical Documentation

    Science.gov (United States)

    The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that evaluates the relative cost-effectiveness of management practices at the local or watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed c...

  10. Visualizing simulated learning experiences through the use of informatics tools.

    Science.gov (United States)

    Thompson, Teri L; Warren, Judith J

    2009-01-01

    High-fidelity simulation technology is a growing educational technology. Designing effective simulations requires the use of informatics tools such as UML modeling. This poster demonstrates the steps in modeling a simulation exercise.

  11. Modeling, simulation and optimization of bipedal walking

    CERN Document Server

    Berns, Karsten

    2013-01-01

    The model-based investigation of motions of anthropomorphic systems is an important interdisciplinary research topic involving specialists from many fields such as Robotics, Biomechanics, Physiology, Orthopedics, Psychology, Neurosciences, Sports, Computer Graphics and Applied Mathematics. This book presents a study of basic locomotion forms such as walking and running is of particular interest due to the high demand on dynamic coordination, actuator efficiency and balance control. Mathematical models and numerical simulation and optimization techniques are explained, in combination with experimental data, which can help to better understand the basic underlying mechanisms of these motions and to improve them. Example topics treated in this book are Modeling techniques for anthropomorphic bipedal walking systems Optimized walking motions for different objective functions Identification of objective functions from measurements Simulation and optimization approaches for humanoid robots Biologically inspired con...

  12. Simulation-optimization via Kriging and bootstrapping : A survey

    NARCIS (Netherlands)

    Kleijnen, Jack P.C.

    2014-01-01

    This article surveys optimization of simulated systems. The simulation may be either deterministic or random. The survey reflects the author’s extensive experience with simulation-optimization through Kriging (or Gaussian process) metamodels, analysed through parametric bootstrapping for determinist
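
    The Kriging metamodel at the heart of the survey can be sketched in a few lines: fit a Gaussian-process interpolant to a handful of simulation outputs and predict between the design points. This is simple Kriging with a Gaussian correlation function and a tiny nugget for numerical stability, a sketch rather than the survey's full methodology:

```python
import numpy as np

# Simple Kriging metamodel of a deterministic "simulation" (here sin).

def kernel(a, b, theta=10.0):
    # Gaussian correlation between two sets of 1D points.
    return np.exp(-theta * (a[:, None] - b[None, :]) ** 2)

x_design = np.array([0.0, 0.25, 0.5, 0.75, 1.0])   # design of experiments
y_design = np.sin(2 * np.pi * x_design)            # simulation outputs

x_new = np.array([0.25, 0.4])                      # prediction points
k_dd = kernel(x_design, x_design) + 1e-10 * np.eye(len(x_design))
k_nd = kernel(x_new, x_design)
y_pred = k_nd @ np.linalg.solve(k_dd, y_design)
```

    The metamodel interpolates the design points exactly (up to the nugget) and gives a cheap surrogate in between, which is what makes simulation-optimization through Kriging affordable.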

  13. Silicon epitaxy process recipe and tool configuration optimization

    Science.gov (United States)

    Moy, W. H.; Cheong, K. Y.

    2017-07-01

    Silicon epitaxy is widely used in semiconductor fabrication due to its ability to produce high quality and low cost thin film. Epitaxy optimized process condition with respect to the process recipe and tool for the maximization of n-type epitaxial production has been investigated. For standard recipe of an epitaxy process, there are seven main steps, namely purge, ramp, bake, stab, deposition, post and cooling. This project focuses on the recipe optimization on ramp, bake and stab steps. For the tool configuration, cool-down step has been optimized. Impact on slip, haze, wafers warpage and crystal originated particles have been investigated.

  14. HAMMER: Reweighting tool for simulated data samples

    CERN Document Server

    Duell, Stephan; Ligeti, Zoltan; Papucci, Michele; Robinson, Dean

    2016-01-01

    Modern flavour physics experiments, such as Belle II or LHCb, require large samples of generated Monte Carlo events. Monte Carlo events often are processed in a sophisticated chain that includes a simulation of the detector response. The generation and reconstruction of large samples is resource-intensive and in principle would need to be repeated if e.g. parameters responsible for the underlying models change due to new measurements or new insights. To avoid having to regenerate large samples, we work on a tool, The Helicity Amplitude Module for Matrix Element Reweighting (HAMMER), which allows one to easily reweight existing events in the context of semileptonic b → q ℓ ν̄ℓ analyses to new model parameters or new physics scenarios.
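
    The reweighting idea can be illustrated with a toy: each stored event carries the differential rate under the model it was generated with, and multiplying by the ratio of new to old rate turns the old sample into an estimate under the new model. The one-dimensional "rates" below are invented stand-ins, not semileptonic amplitudes:

```python
import random

# Reweight events generated under old_rate so that weighted averages
# reproduce expectations under new_rate, without regenerating anything.

def old_rate(x):
    return 1.0               # uniform density on [0, 1]

def new_rate(x):
    return 2.0 * x           # linear density on [0, 1]

rng = random.Random(1)
events = [rng.random() for _ in range(100000)]    # generated under old_rate
weights = [new_rate(x) / old_rate(x) for x in events]

# Weighted mean of x approaches E[x] under the new model (= 2/3).
mean_new = sum(w * x for w, x in zip(weights, events)) / sum(weights)
```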

  15. A Network Simulation Tool for Task Scheduling

    Directory of Open Access Journals (Sweden)

    Ondřej Votava

    2012-01-01

    Full Text Available Distributed computing may be looked at from many points of view. Task scheduling is the viewpoint, where a distributed application can be described as a Directed Acyclic Graph and every node of the graph is executed independently. There are, however, data dependencies and the nodes have to be executed in a specified order. Hence the parallelism of the execution is limited. The scheduling problem is difficult and therefore heuristics are used. However, many inaccuracies are caused by the model used for the system, in which the heuristics are being tested. In this paper we present a tool for simulating the execution of the distributed application on a “real” computer network, and try to tell how the execution is influenced compared to the model.
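
    The DAG model described above can be sketched with a minimal list scheduler: a task becomes ready when all its predecessors have finished, and each ready task is placed on the earliest-available processor. Communication delays, which the paper's network simulation is precisely about, are ignored in this sketch:

```python
from collections import deque

# Greedy list scheduling of a task DAG on identical processors.
# durations: {task: run time}; deps: {task: set of predecessor tasks}.

def schedule(durations, deps, n_proc=2):
    indeg = {t: len(deps.get(t, ())) for t in durations}
    succ = {t: [] for t in durations}
    for t, preds in deps.items():
        for p in preds:
            succ[p].append(t)
    ready = deque(sorted(t for t, d in indeg.items() if d == 0))
    proc_free = [0.0] * n_proc
    finish = {}
    while ready:
        t = ready.popleft()
        # earliest start: all predecessors done
        est = max((finish[p] for p in deps.get(t, ())), default=0.0)
        i = min(range(n_proc), key=lambda k: proc_free[k])
        start = max(est, proc_free[i])
        finish[t] = start + durations[t]
        proc_free[i] = finish[t]
        for s in succ[t]:
            indeg[s] -= 1
            if indeg[s] == 0:
                ready.append(s)
    return finish

durations = {"a": 2, "b": 3, "c": 1, "d": 2}
deps = {"c": {"a", "b"}, "d": {"c"}}
finish = schedule(durations, deps)
```

    With two processors, a and b run in parallel, c must wait for both, and d follows c, so the makespan is 6.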

  16. Contingency Contractor Optimization Phase 3 Sustainment Software Design Document - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa; Jones, Katherine A

    2016-05-01

    This document describes the final software design of the Contingency Contractor Optimization Tool - Prototype. Its purpose is to provide the overall architecture of the software and the logic behind this architecture. Documentation for the individual classes is provided in the application Javadoc. The Contingency Contractor Optimization project is intended to address Department of Defense mandates by delivering a centralized strategic planning tool that allows senior decision makers to quickly and accurately assess the impacts, risks, and mitigation strategies associated with utilizing contract support. The Contingency Contractor Optimization Tool - Prototype was developed in Phase 3 of the OSD ATL Contingency Contractor Optimization project to support strategic planning for contingency contractors. The planning tool uses a model to optimize the Total Force mix by minimizing the combined total costs for selected mission scenarios. The model optimizes the match of personnel types (military, DoD civilian, and contractors) and capabilities to meet mission requirements as effectively as possible, based on risk, cost, and other requirements.

  17. Building Performance Simulation tools for planning of energy efficiency retrofits

    DEFF Research Database (Denmark)

    Mondrup, Thomas Fænø; Karlshøj, Jan; Vestergaard, Flemming

    2014-01-01

    Designing energy efficiency retrofits for existing buildings will bring environmental, economic, social, and health benefits. However, selecting specific retrofit strategies is complex and requires careful planning. In this study, we describe a methodology for adopting Building Performance...... to energy efficiency retrofits in social housing. To generate energy savings, we focus on optimizing the building envelope. We evaluate alternative building envelope actions using procedural solar radiation and daylight simulations. In addition, we identify the digital information flow and the information...... Simulation (BPS) tools as energy and environmentally conscious decision-making aids. The methodology has been developed to screen buildings for potential improvements and to support the development of retrofit strategies. We present a case study of a Danish renovation project, implementing BPS approaches...

  18. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, J.

    2014-12-01

    The deployment and use of lithium-ion batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite of tools. This suite of tools pairs NREL's high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic, long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.
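
    The shape of such a lifetime prediction can be illustrated with a simple capacity-fade model combining calendar aging (proportional to the square root of time) with cycling loss. The functional form is a common textbook simplification and the coefficients are invented; neither is NREL's fitted degradation model:

```python
import math

# Illustrative lithium-ion capacity fade: calendar aging ~ sqrt(t)
# plus cycling loss proportional to depth of discharge and cycle count.

def remaining_capacity(days, cycles_per_day, dod,
                       k_cal=0.0008, k_cyc=5e-5):
    """Return remaining capacity as a fraction of nominal (1.0 = new)."""
    calendar_loss = k_cal * math.sqrt(days)
    cycling_loss = k_cyc * dod * cycles_per_day * days
    return max(0.0, 1.0 - calendar_loss - cycling_loss)

# Five years of daily cycling at 80% depth of discharge:
cap = remaining_capacity(days=365 * 5, cycles_per_day=1.0, dod=0.8)
```

    Even this crude model shows why multi-year evaluation matters: the two loss terms grow on different time scales, so short tests mis-rank use strategies.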

  19. Optimal Hamiltonian Simulation by Quantum Signal Processing

    Science.gov (United States)

    Low, Guang Hao; Chuang, Isaac L.

    2017-01-01

    The physics of quantum mechanics is the inspiration for, and underlies, quantum computation. As such, one expects physical intuition to be highly influential in the understanding and design of many quantum algorithms, particularly simulation of physical systems. Surprisingly, this has been challenging, with current Hamiltonian simulation algorithms remaining abstract and often the result of sophisticated but unintuitive constructions. We contend that physical intuition can lead to optimal simulation methods by showing that a focus on simple single-qubit rotations elegantly furnishes an optimal algorithm for Hamiltonian simulation, a universal problem that encapsulates all the power of quantum computation. Specifically, we show that the query complexity of implementing time evolution by a d-sparse Hamiltonian Ĥ for time-interval t with error ε is O[td‖Ĥ‖_max + log(1/ε)/log log(1/ε)], which matches lower bounds in all parameters. This connection is made through a general three-step "quantum signal processing" methodology, comprised of (i) transducing eigenvalues of Ĥ into a single ancilla qubit, (ii) transforming these eigenvalues through an optimal-length sequence of single-qubit rotations, and (iii) projecting this ancilla with near-unity success probability.

  20. A simulation tool for dynamic contrast enhanced MRI.

    Directory of Open Access Journals (Sweden)

    Nicolas Adrien Pannetier

    Full Text Available The quantification of bolus-tracking MRI techniques remains challenging. The acquisition usually relies on one contrast and the analysis on a simplified model of the various phenomena that arise within a voxel, leading to inaccurate perfusion estimates. To evaluate how simplifications in the interstitial model impact perfusion estimates, we propose a numerical tool to simulate the MR signal provided by a dynamic contrast enhanced (DCE MRI experiment. Our model encompasses the intrinsic R1 and R2 relaxations, the magnetic field perturbations induced by susceptibility interfaces (vessels and cells, the diffusion of the water protons, the blood flow, the permeability of the vessel wall to the contrast agent (CA and the constrained diffusion of the CA within the voxel. The blood compartment is modeled as a uniform compartment. The different blocks of the simulation are validated and compared to classical models. The impact of the CA diffusivity on the permeability and blood volume estimates is evaluated. Simulations demonstrate that the CA diffusivity slightly impacts the permeability estimates (< 5% for classical blood flow and CA diffusion. The effect of long echo times is investigated. Simulations show that DCE-MRI performed with an echo time TE = 5 ms may already lead to significant underestimation of the blood volume (up to 30% lower for brain tumor permeability values. The potential and the versatility of the proposed implementation are evaluated by running the simulation with realistic vascular geometry obtained from two-photon microscopy and with impermeable cells in the extravascular environment. In conclusion, the proposed simulation tool describes DCE-MRI experiments and may be used to evaluate and optimize acquisition and processing strategies.
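
    One of the simplified interstitial models the abstract alludes to is the standard Tofts model, whose tissue concentration is the convolution of the plasma input with an exponential kernel. With an exponentially decaying plasma input the convolution has a closed form, sketched below with illustrative parameter values:

```python
import math

# Standard Tofts model with plasma input C_p(t) = a * exp(-m t):
# C_t(t) = Ktrans * integral of C_p(tau) * exp(-kep (t - tau)) dtau
#        = Ktrans * a * (exp(-m t) - exp(-kep t)) / (kep - m),
# where kep = Ktrans / ve. Parameter values are illustrative only.

def tissue_concentration(t, ktrans=0.1, ve=0.2, a=1.0, m=0.05):
    kep = ktrans / ve
    return ktrans * a * (math.exp(-m * t) - math.exp(-kep * t)) / (kep - m)

# Sample the uptake-and-washout curve over five minutes (seconds):
c = [tissue_concentration(t) for t in range(0, 300, 30)]
```

    The curve starts at zero, rises as the agent leaks into the interstitium, and then washes out; a full voxel-scale simulator like the one described adds the relaxation, susceptibility and diffusion effects this closed form ignores.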

  1. Kanban simulation model for production process optimization

    Directory of Open Access Journals (Sweden)

    Golchev Riste

    2015-01-01

    Full Text Available A long time has passed since the KANBAN system was established as an efficient method for coping with excessive inventory. Still, the possibilities for its improvement through integration with other approaches should be investigated further. The basic research challenge of this paper is to present the benefits of KANBAN implementation supported with Discrete Event Simulation (DES. In that direction, at the beginning, the basics of the KANBAN system are presented with emphasis on the information and material flow, together with a methodology for implementation of the KANBAN system. An analysis of combining simulation with this methodology is presented. The paper is concluded with a practical example which shows that, through understanding the philosophy of the KANBAN implementation methodology and the simulation methodology, a simulation model can be created which can serve as a basis for a variety of experiments that can be conducted within a short period of time, resulting in production process optimization.
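
    The kanban mechanism lends itself to a very small discrete-event sketch: an upstream stage may only produce when a card (container slot) is free, and finishing a downstream job frees one. Times are deterministic here for clarity; a DES study of the kind the paper describes would randomize them:

```python
# Two-stage kanban loop with a fixed number of cards. The upstream
# stage produces a job only when a kanban is free; consuming job
# i - kanbans downstream frees the card for job i.

def simulate(n_jobs=20, kanbans=3, t_make=1.0, t_consume=1.5):
    make_done = [0.0] * n_jobs
    consume_done = [0.0] * n_jobs
    for i in range(n_jobs):
        card_free = consume_done[i - kanbans] if i >= kanbans else 0.0
        prev_make = make_done[i - 1] if i > 0 else 0.0
        make_done[i] = max(card_free, prev_make) + t_make
        prev_consume = consume_done[i - 1] if i > 0 else 0.0
        consume_done[i] = max(make_done[i], prev_consume) + t_consume
    return consume_done[-1]    # makespan

makespan = simulate()
```

    With three cards the slower downstream stage sets the pace (makespan 31.0); with a single card the stages serialize and the makespan jumps to 50.0, the kind of trade-off such a simulation model makes visible.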

  2. Integration of Advanced Simulation and Visualization for Manufacturing Process Optimization

    Science.gov (United States)

    Zhou, Chenn; Wang, Jichao; Tang, Guangwu; Moreland, John; Fu, Dong; Wu, Bin

    2016-05-01

    The integration of simulation and visualization can provide a cost-effective tool for process optimization, design, scale-up and troubleshooting. The Center for Innovation through Visualization and Simulation (CIVS) at Purdue University Northwest has developed methodologies for such integration with applications in various manufacturing processes. The methodologies have proven to be useful for virtual design and virtual training to provide solutions addressing issues on energy, environment, productivity, safety, and quality in steel and other industries. In collaboration with its industrial partners, CIVS has provided solutions to companies, saving over US$38 million. CIVS is currently working with the steel industry to establish an industry-led Steel Manufacturing Simulation and Visualization Consortium through the support of a National Institute of Standards and Technology AMTech Planning Grant. The consortium focuses on supporting development and implementation of simulation and visualization technologies to advance steel manufacturing across the value chain.

  3. Optimal visual-haptic integration with articulated tools.

    Science.gov (United States)

    Takahashi, Chie; Watt, Simon J

    2017-05-01

    When we feel and see an object, the nervous system integrates visual and haptic information optimally, exploiting the redundancy in multiple signals to estimate properties more precisely than is possible from either signal alone. We examined whether optimal integration is similarly achieved when using articulated tools. Such tools (tongs, pliers, etc) are a defining characteristic of human hand function, but complicate the classical sensory 'correspondence problem' underlying multisensory integration. Optimal integration requires establishing the relationship between signals acquired by different sensors (hand and eye) and, therefore, in fundamentally unrelated units. The system must also determine when signals refer to the same property of the world-seeing and feeling the same thing-and only integrate those that do. This could be achieved by comparing the pattern of current visual and haptic input to known statistics of their normal relationship. Articulated tools disrupt this relationship, however, by altering the geometrical relationship between object properties and hand posture (the haptic signal). We examined whether different tool configurations are taken into account in visual-haptic integration. We indexed integration by measuring the precision of size estimates, and compared our results to optimal predictions from a maximum-likelihood integrator. Integration was near optimal, independent of tool configuration/hand posture, provided that visual and haptic signals referred to the same object in the world. Thus, sensory correspondence was determined correctly (trial-by-trial), taking tool configuration into account. This reveals highly flexible multisensory integration underlying tool use, consistent with the brain constructing internal models of tools' properties.
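
    The maximum-likelihood prediction the study tests has a compact closed form: the integrated estimate is a reliability-weighted average of the visual and haptic estimates, and its variance is lower than either cue's alone. The numbers below are illustrative, not the study's data:

```python
# Optimal (maximum-likelihood) integration of two noisy size estimates.
# Each cue contributes in proportion to its reliability (inverse variance).

def integrate(mu_v, var_v, mu_h, var_h):
    w_v = var_h / (var_v + var_h)            # weight on vision
    mu = w_v * mu_v + (1 - w_v) * mu_h       # combined estimate
    var = var_v * var_h / (var_v + var_h)    # combined variance
    return mu, var

# Vision says 50 mm (variance 4), haptics says 54 mm (variance 8):
mu, var = integrate(mu_v=50.0, var_v=4.0, mu_h=54.0, var_h=8.0)
```

    The combined variance (8/3) is below the better single cue's (4), which is exactly the precision benefit the authors use to index near-optimal integration across tool configurations.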

  4. PIMMS tools for capturing metadata about simulations

    Science.gov (United States)

    Pascoe, Charlotte; Devine, Gerard; Tourte, Gregory; Pascoe, Stephen; Lawrence, Bryan; Barjat, Hannah

    2013-04-01

    PIMMS (Portable Infrastructure for the Metafor Metadata System) provides a method for consistent and comprehensive documentation of modelling activities that enables the sharing of simulation data and model configuration information. The aim of PIMMS is to package the metadata infrastructure developed by Metafor for CMIP5 so that it can be used by climate modelling groups in UK Universities. PIMMS tools capture information about simulations from the design of experiments to the implementation of experiments via simulations that run models. PIMMS uses the Metafor methodology which consists of a Common Information Model (CIM), Controlled Vocabularies (CV) and software tools. PIMMS software tools provide for the creation and consumption of CIM content via a web services infrastructure and portal developed by the ES-DOC community. PIMMS metadata integrates with the ESGF data infrastructure via the mapping of vocabularies onto ESGF facets. There are three paradigms of PIMMS metadata collection: Model Intercomparision Projects (MIPs) where a standard set of questions is asked of all models which perform standard sets of experiments. Disciplinary level metadata collection where a standard set of questions is asked of all models but experiments are specified by users. Bespoke metadata creation where the users define questions about both models and experiments. Examples will be shown of how PIMMS has been configured to suit each of these three paradigms. In each case PIMMS allows users to provide additional metadata beyond that which is asked for in an initial deployment. The primary target for PIMMS is the UK climate modelling community where it is common practice to reuse model configurations from other researchers. This culture of collaboration exists in part because climate models are very complex with many variables that can be modified. 
Therefore it has become common practice to begin a series of experiments by using another climate model configuration as a starting

  5. Preliminary Development of an Object-Oriented Optimization Tool

    Science.gov (United States)

    Pak, Chan-gi

    2011-01-01

    The National Aeronautics and Space Administration Dryden Flight Research Center has developed a FORTRAN-based object-oriented optimization (O3) tool that leverages existing tools and practices and allows easy integration and adoption of new state-of-the-art software. The object-oriented framework can integrate the analysis codes for multiple disciplines, as opposed to relying on one code to perform analysis for all disciplines. Optimization can thus take place within each discipline module, or in a loop between the central executive module and the discipline modules, or both. Six sample optimization problems are presented. The first four sample problems are based on simple mathematical equations; the fifth and sixth problems consider a three-bar truss, which is a classical example in structural synthesis. Instructions for preparing input data for the O3 tool are presented.
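
    The three-bar truss mentioned as the fifth and sixth sample problems is a classic of structural synthesis: minimize the volume 2√2·A₁ + A₂ subject to stress limits. A brute-force grid search, standing in for the O3 tool's optimization machinery, recovers the textbook optimum; the two stress expressions below are the standard textbook formulation (load P = 2, allowable stress σ = 2, unit bar length):

```python
import math

# Three-bar truss sizing by exhaustive grid search over the two
# cross-sectional areas a1 (outer bars) and a2 (middle bar).

def stresses(a1, a2, p=2.0):
    denom = math.sqrt(2) * a1 * a1 + 2.0 * a1 * a2
    s1 = p * (math.sqrt(2) * a1 + a2) / denom   # outer-bar stress
    s2 = p / (a1 + math.sqrt(2) * a2)           # middle-bar stress
    return s1, s2

def grid_search(sigma=2.0, step=0.01):
    best_v, best_areas = float("inf"), None
    for i in range(1, 200):
        a1 = i * step
        for j in range(1, 200):
            a2 = j * step
            if max(stresses(a1, a2)) <= sigma:
                v = 2.0 * math.sqrt(2) * a1 + a2   # volume objective
                if v < best_v:
                    best_v, best_areas = v, (a1, a2)
    return best_v, best_areas

volume, areas = grid_search()
```

    The grid optimum lands near the known continuous solution (A₁ ≈ 0.79, A₂ ≈ 0.41, volume ≈ 2.64), with the outer-bar stress constraint active.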

  6. Decision-Theoretic Methods in Simulation Optimization

    Science.gov (United States)

    2014-09-24

    Los Alamos National Lab: Frazier visited LANL, hosted by Frank Alexander, in January 2013, where he discussed the use of simulation optimization methods for... Alexander, Turab Lookman, and others from LANL, at the Materials Informatics Workshop at the Santa Fe Institute in April 2013. In February 2014, Frazier

  7. Optimizing the integrated design of boilers - simulation

    DEFF Research Database (Denmark)

    Sørensen, Kim; Karstensen, Claus M. S.; Condra, Thomas Joseph

    2004-01-01

    .) it is important to see the 3 components as an integrated unit and optimize these as such. This means that the burner must be designed and optimized exactly to the pressure part where it is utilized, the control system must have a configuration optimal for the pressure part and burner where it is utilized etc...... together with Aalborg University and The Technical University of Denmark carried out a project to develop the Model-based Multivariable Control System. This is foreseen to be a control system utilizing the continuously increasing computational possibilities to take all the important operation parameters...... formulated as Differential-Algebraic-Equation (DAE) systems. For integration in SIMULINK the models have been index-reduced to Ordinary-Differential-Equation (ODE) systems. The simulations have been carried out by means of the MATLAB/SIMULINK integration routines. For verifying the models developed...

  8. An Interactive Simulation Tool for Production Planning in Bacon Factories

    DEFF Research Database (Denmark)

    Nielsen, Jens Frederik Dalsgaard; Nielsen, Kirsten Mølgaard

    1994-01-01

    The paper describes an interactive simulation tool for production planning in bacon factories. The main aim of the tool is to make it possible to combine the production plans of all parts of the factory...

  9. Battery Lifetime Analysis and Simulation Tool (BLAST) Documentation

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, J. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2014-12-01

    The deployment and use of lithium-ion (Li-ion) batteries in automotive and stationary energy storage applications must be optimized to justify their high up-front costs. Given that batteries degrade with use and storage, such optimizations must evaluate many years of operation. As the degradation mechanisms are sensitive to temperature, state-of-charge (SOC) histories, current levels, and cycle depth and frequency, it is important to model both the battery and the application to a high level of detail to ensure battery response is accurately predicted. To address these issues, the National Renewable Energy Laboratory (NREL) has developed the Battery Lifetime Analysis and Simulation Tool (BLAST) suite. This suite of tools pairs NREL’s high-fidelity battery degradation model with a battery electrical and thermal performance model, application-specific electrical and thermal performance models of the larger system (e.g., an electric vehicle), application-specific system use data (e.g., vehicle travel patterns and driving data), and historic climate data from cities across the United States. This provides highly realistic long-term predictions of battery response and thereby enables quantitative comparisons of varied battery use strategies.

  10. Contingency Contractor Optimization Phase 3 Sustainment Third-Party Software List - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David; Frazier, Christopher Rawls; Bandlow, Alisa

    2016-05-01

    The Contingency Contractor Optimization Tool - Prototype (CCOT-P) requires several third-party software packages. These are documented below for each of the CCOT-P elements: client, web server, database server, solver, web application and polling application.

  11. Optimization of Process Parameters of Tool Wear in Turning Operation

    Directory of Open Access Journals (Sweden)

    Manik Barman

    2015-04-01

    Full Text Available Tool wear is of great concern in machining industries since it affects the surface quality, dimensional accuracy and production cost of the materials/components. In the present study, twenty-seven experiments were conducted as per a 3-parameter, 3-level full factorial design for a turning operation on a mild steel specimen with a high-speed steel (HSS) cutting tool. An experimental investigation of cutting tool wear and a mathematical model for tool wear estimation are reported in this paper. The model was simulated by computer programming, and it has been found that this model is capable of estimating the wear rate of the cutting tool and provides an optimum set of process parameters for minimum tool wear.
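The 3-parameter, 3-level full factorial sweep described in this record can be sketched as below. The power-law wear model, its coefficients, and the parameter levels are illustrative placeholders, not the paper's fitted model:

```python
from itertools import product

# Hypothetical power-law tool wear model (coefficients are illustrative):
# wear rate grows with cutting speed, feed, and depth of cut.
def wear_rate(speed, feed, depth):
    return 1e-4 * speed**1.8 * feed**0.7 * depth**0.4

# 3-parameter, 3-level full factorial design: 3**3 = 27 runs.
speeds = [20.0, 30.0, 40.0]   # m/min
feeds  = [0.05, 0.10, 0.15]   # mm/rev
depths = [0.5, 1.0, 1.5]      # mm

runs = [(s, f, d, wear_rate(s, f, d)) for s, f, d in product(speeds, feeds, depths)]
best = min(runs, key=lambda r: r[3])
print(f"27 runs; minimum wear at speed={best[0]}, feed={best[1]}, depth={best[2]}")
```

With a monotonic wear model like this one, the optimum sits at the lowest levels; a fitted model with interaction terms could place it at an interior design point.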

  12. Ergodic optimization in the expanding case concepts, tools and applications

    CERN Document Server

    Garibaldi, Eduardo

    2017-01-01

    This book focuses on the interpretation of ergodic optimal problems as questions of variational dynamics, employing a comparable approach to that of the Aubry-Mather theory for Lagrangian systems. Ergodic optimization is primarily concerned with the study of optimizing probability measures. This work presents and discusses the fundamental concepts of the theory, including the use and relevance of sub-actions as analogues to subsolutions of the Hamilton-Jacobi equation. Further, it provides evidence for the impressively broad applicability of the tools inspired by the weak KAM theory.

  13. Compiler Optimization: A Case for the Transformation Tool Contest

    Directory of Open Access Journals (Sweden)

    Sebastian Buchwald

    2011-11-01

    Full Text Available An optimizing compiler consists of a front end parsing a textual programming language into an intermediate representation (IR), a middle end performing optimizations on the IR, and a back end lowering the IR to a target representation (TR) built of operations supported by the target hardware. In modern compiler construction, graph-based IRs are employed. Optimization and lowering tasks can then be implemented with graph transformation rules. This case provides two compiler tasks to evaluate the participating tools regarding performance.

  14. OPTIMIZATION OF A WAVE CANCELLATION MULTIHULL SHIP USING CFD TOOLS

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A simple CFD tool, coupled to a discrete surface representation and a gradient-based optimization procedure, is applied to the design of optimal hull forms and optimal arrangement of hulls for a wave cancellation multihull ship. The CFD tool, which is used to estimate the wave drag, is based on the zeroth-order slender ship approximation. The hull surface is represented by a triangulation, and almost every grid point on the surface can be used as a design variable. A smooth surface is obtained via a simplified pseudo-shell problem. The optimal design process consists of two steps. The optimal center and outer hull forms are determined independently in the first step, where each hull keeps the same displacement as the original design while the wave drag is minimized. The optimal outer-hull arrangement is determined in the second step for the optimal center and outer hull forms obtained in the first step. Results indicate that the new design can achieve a large wave drag reduction in comparison to the original design configuration.
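The core idea of the first step, gradient-based drag minimization with the displacement held fixed, can be sketched with a toy quadratic drag surrogate standing in for the slender-ship model (all numbers illustrative). Subtracting the mean of the gradient projects each step onto the constant-displacement constraint:

```python
# Toy quadratic wave-drag surrogate over 5 design variables (grid-point offsets).
def drag(x):
    return sum((xi - 0.2 * i) ** 2 for i, xi in enumerate(x))

def grad(x):
    return [2 * (xi - 0.2 * i) for i, xi in enumerate(x)]

x = [1.0] * 5
target = sum(x)              # fixed "displacement" (here, sum of offsets)
step = 0.1
for _ in range(200):
    g = grad(x)
    mean_g = sum(g) / len(g)
    # Projected gradient step: removing the mean gradient keeps sum(x) constant.
    x = [xi - step * (gi - mean_g) for xi, gi in zip(x, g)]

print(f"displacement held at {sum(x):.6f}; offsets: {[round(v, 3) for v in x]}")
```

The iterate converges to the drag minimizer within the fixed-displacement subspace, mirroring how each hull is reshaped without changing its displacement.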

  15. Advanced computational tools for optimization and uncertainty quantification of carbon capture processes

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David C. [U.S. DOE; Ng, Brenda [Lawrence Livermore National Laboratory; Eslick, John [Carnegie Mellon University

    2014-01-01

    Advanced multi-scale modeling and simulation has the potential to dramatically reduce development time, resulting in considerable cost savings. The Carbon Capture Simulation Initiative (CCSI) is a partnership among national laboratories, industry and universities that is developing, demonstrating, and deploying a suite of multi-scale modeling and simulation tools. One significant computational tool is FOQUS, a Framework for Optimization and Quantification of Uncertainty and Sensitivity, which enables basic data submodels, including thermodynamics and kinetics, to be used within detailed process models to rapidly synthesize and optimize a process and determine the level of uncertainty associated with the resulting process. The overall approach of CCSI is described with a more detailed discussion of FOQUS and its application to carbon capture systems.
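The kind of uncertainty quantification FOQUS performs can be illustrated by Monte Carlo propagation of submodel parameter uncertainty through a process model. The capture-rate model, its parameters, and their distributions below are illustrative stand-ins, not a CCSI model:

```python
import math
import random
import statistics

random.seed(3)

def capture_fraction(k, T):
    # Hypothetical first-order absorber: fraction captured over a fixed
    # residence time, with an Arrhenius-like temperature dependence.
    return 1.0 - math.exp(-k * math.exp(-2000.0 / T))

# Sample uncertain submodel parameters and propagate through the process model.
samples = []
for _ in range(10_000):
    k = random.gauss(50.0, 5.0)    # uncertain kinetic rate constant
    T = random.gauss(320.0, 4.0)   # uncertain operating temperature, K
    samples.append(capture_fraction(k, T))

mean = statistics.fmean(samples)
sd = statistics.stdev(samples)
print(f"capture fraction: {mean:.3f} +/- {sd:.3f}")
```

The output's spread quantifies how uncertainty in basic data submodels carries through to a process-level prediction.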

  16. Silvicultural decisions based on simulation-optimization systems

    Energy Technology Data Exchange (ETDEWEB)

    Cao, Tianjian

    2010-05-15

    Forest management is facing new challenges under climate change. By adjusting thinning regimes, conventional forest management can be adapted to various objectives of utilization of forest resources, such as wood quality, forest bioenergy, and carbon sequestration. This thesis aims to develop and apply a simulation-optimization system as a tool for an interdisciplinary understanding of the interactions between wood science, forest ecology, and forest economics. In this thesis, the OptiFor software was developed for forest resources management. The OptiFor simulation-optimization system integrated the process-based growth model PipeQual, wood quality models, biomass production and carbon emission models, as well as energy wood and commercial logging models, into a single optimization model. Osyczka's direct and random search algorithm was employed to identify optimal values for a set of decision variables. The numerical studies in this thesis broadened our current knowledge and understanding of the relationships between wood science, forest ecology, and forest economics. The results for timber production show that optimal thinning regimes depend on site quality and initial stand characteristics. Taking wood properties into account, our results show that increasing the intensity of thinning resulted in lower wood density and shorter fibers. The addition of nutrients accelerated volume growth, but lowered wood quality for Norway spruce. Integrating energy wood harvesting into conventional forest management showed that conventional forest management without energy wood harvesting was still superior in sparse stands of Scots pine. Energy wood from pre-commercial thinning turned out to be optimal for dense stands. When carbon balance is taken into account, our results show that changing carbon assessment methods leads to very different optimal thinning regimes and average carbon stocks. Raising the carbon price resulted in longer rotations and a higher mean annual
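The random-search side of a simulation-optimization loop like OptiFor's can be sketched as follows. The objective function here is a toy stand-in with a single interior optimum; the real system evaluates PipeQual growth simulations coupled with economic models, and the decision-variable names are illustrative:

```python
import random

random.seed(7)

# Illustrative stand-in objective: "net present value" as a function of two
# decision variables, thinning intensity (0-1) and rotation length (years).
def npv(thinning, rotation):
    return -((thinning - 0.3) ** 2) - 0.001 * (rotation - 80.0) ** 2

best, best_val = None, float("-inf")
for _ in range(5000):   # pure random search over the feasible box
    cand = (random.uniform(0.0, 1.0), random.uniform(40.0, 120.0))
    val = npv(*cand)
    if val > best_val:
        best, best_val = cand, val

print(f"thinning ~ {best[0]:.2f}, rotation ~ {best[1]:.0f} yr")
```

In the real system each `npv` call would be a full stand-growth simulation, which is why sample-efficient direct search variants matter.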

  17. Simulation and optimization of fractional crystallization processes

    DEFF Research Database (Denmark)

    Thomsen, Kaj; Rasmussen, Peter; Gani, Rafiqul

    1998-01-01

    A general method for the calculation of various types of phase diagrams for aqueous electrolyte mixtures is outlined. It is shown how the thermodynamic equilibrium precipitation process can be used to satisfy the operational needs of industrial crystallizer/centrifuge units. Examples of simulation...... and optimization of fractional crystallization processes are shown. In one of these examples, a process with multiple steady states is analyzed. The thermodynamic model applied for describing the highly non-ideal aqueous electrolyte systems is the Extended UNIQUAC model. (C) 1998 Published by Elsevier Science Ltd...

  18. Draught risk index tool for building energy simulations

    DEFF Research Database (Denmark)

    Vorre, Mette Havgaard; Jensen, Rasmus Lund; Nielsen, Peter V.

    2014-01-01

    Flow elements combined with a building energy simulation tool can be used to indicate areas and periods when there is a risk of draught in a room. The study tests this concept by making a tool for post-processing of data from building energy simulations. The objective is to show indications...

  19. 10 CFR 434.507 - Calculation procedure and simulation tool.

    Science.gov (United States)

    2010-01-01

    ... Calculation procedure and simulation tool. 507.1 The Prototype or Reference Buildings shall be modeled using... Section 434.507, Title 10 (Energy), 2010-01-01. Department of Energy, Energy Conservation, Energy Code for New Federal Commercial...

  20. Tool Steel Heat Treatment Optimization Using Neural Network Modeling

    Science.gov (United States)

    Podgornik, Bojan; Belič, Igor; Leskovšek, Vojteh; Godec, Matjaz

    2016-11-01

    Optimization of tool steel properties and corresponding heat treatment is mainly based on a trial and error approach, which requires tremendous experimental work and resources. Therefore, there is a huge need for tools allowing prediction of mechanical properties of tool steels as a function of composition and heat treatment process variables. The aim of the present work was to explore the potential and possibilities of artificial neural network-based modeling to select and optimize vacuum heat treatment conditions depending on the hot work tool steel composition and required properties. In the current case, training of the feedforward neural network, with an error backpropagation training scheme and four layers of neurons (8-20-20-2), was based on the experimentally obtained tempering diagrams for ten different hot work tool steel compositions and at least two austenitizing temperatures. Results show that this type of modeling can be successfully used for detailed and multifunctional analysis of different influential parameters as well as to optimize the heat treatment process of hot work tool steels depending on the composition. In terms of composition, V was found to be the most beneficial alloying element, increasing hardness and fracture toughness of hot work tool steel; Si, Mn, and Cr increase hardness but lead to reduced fracture toughness, while Mo has the opposite effect. The optimum concentration providing high KIc/HRC ratios would include 0.75 pct Si, 0.4 pct Mn, 5.1 pct Cr, 1.5 pct Mo, and 0.5 pct V, with the optimum heat treatment performed at lower austenitizing and intermediate tempering temperatures.
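The 8-20-20-2 topology mentioned above maps naturally onto a small feedforward pass. The sketch below only shows the forward computation with random placeholder weights; in practice the weights come from backpropagation training on the tempering diagrams, and the choice of tanh hidden activations is an assumption:

```python
import numpy as np

rng = np.random.default_rng(0)

# Layer sizes from the 8-20-20-2 scheme: 8 inputs (composition and
# heat-treatment variables), two hidden layers of 20 neurons, and 2 outputs
# (hardness, fracture toughness). Weights are random placeholders.
sizes = [8, 20, 20, 2]
weights = [rng.normal(0.0, 0.1, (m, n)) for m, n in zip(sizes[:-1], sizes[1:])]
biases  = [np.zeros(n) for n in sizes[1:]]

def forward(x):
    for W, b in zip(weights[:-1], biases[:-1]):
        x = np.tanh(x @ W + b)           # hidden layers: tanh activation
    return x @ weights[-1] + biases[-1]  # linear output layer

# e.g. normalized Si, Mn, Cr, Mo, V contents plus austenitizing and
# tempering temperatures (8 features in total).
x = rng.normal(size=8)
y = forward(x)
print(y.shape)
```

Once trained, sweeping the two temperature inputs while holding the composition fixed reproduces the kind of multifunctional parameter analysis described in the record.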

  1. Simultaneous Propulsion System and Trajectory Optimization

    Science.gov (United States)

    Hendricks, Eric S.; Falck, Robert D.; Gray, Justin S.

    2017-01-01

    A number of new aircraft concepts have recently been proposed which tightly couple the propulsion system design and operation with the overall vehicle design and performance characteristics. These concepts include propulsion technology such as boundary layer ingestion, hybrid electric propulsion systems, distributed propulsion systems and variable cycle engines. Initial studies examining these concepts have typically used a traditional decoupled approach to aircraft design where the aerodynamics and propulsion designs are done a priori and tabular data is used to provide inexpensive lookups to the trajectory analysis. However, the cost of generating the tabular data begins to grow exponentially when newer aircraft concepts require consideration of additional operational parameters such as multiple throttle settings, angle-of-attack effects on the propulsion system, or propulsion throttle setting effects on aerodynamics. This paper proposes a new modeling approach that eliminates the need to generate tabular data, instead allowing an expensive propulsion or aerodynamic analysis to be directly integrated into the trajectory analysis model and the entire design problem optimized in a fully coupled manner. The new method is demonstrated by implementing a canonical optimal control problem, the F-4 minimum time-to-climb trajectory optimization, using three relatively new analysis tools: OpenMDAO, PyCycle and Pointer. PyCycle and Pointer both provide analytic derivatives, and OpenMDAO enables the two tools to be combined into a coupled model that can be run in an efficient parallel manner that helps to offset the increased cost of the more expensive propulsion analysis. Results generated with this model serve as a validation of the tightly coupled design method and guide future studies to examine aircraft concepts with more complex operational dependencies for the aerodynamic and propulsion models.

  2. Simulating an Optimizing Model of Currency Substitution

    Directory of Open Access Journals (Sweden)

    Leonardo Leiderman

    1992-03-01

    Full Text Available This paper reports simulations based on the parameter estimates of an intertemporal model of currency substitution under nonexpected utility obtained by Bufman and Leiderman (1991). Here we first study the quantitative impact of changes in the degree of dollarization and in the elasticity of currency substitution on government seigniorage. Then we examine whether the model can account for the comovement of consumption growth and assets' returns after the 1985 stabilization program, and in particular for the consumption boom of 1986-87. The results are generally encouraging for future applications of optimizing models of currency substitution to policy and practical issues.

  3. Atmospheric extinction in simulation tools for solar tower plants

    Science.gov (United States)

    Hanrieder, Natalie; Wilbert, Stefan; Schroedter-Homscheidt, Marion; Schnell, Franziska; Guevara, Diana Mancera; Buck, Reiner; Giuliano, Stefano; Pitz-Paal, Robert

    2017-06-01

    Atmospheric extinction causes significant radiation losses between the heliostat field and the receiver in a solar tower plant. These losses vary with site and time. State of the art is that in ray-tracing and plant optimization tools, atmospheric extinction is included by choosing between a few constant standard atmospheric conditions. Even though some tools allow the consideration of site- and time-dependent extinction data, such data sets are nearly never available. This paper summarizes and compares the most common model equations implemented in several ray-tracing tools. There are already several methods developed and published to measure extinction on-site. An overview of the existing methods is also given here. Ray-tracing simulations of one exemplary tower plant at the Plataforma Solar de Almería (PSA) are presented to estimate the plant yield deviations between simulations using standard model equations instead of extinction time series. For PSA, the effect of atmospheric extinction accounts for losses between 1.6 and 7%. This range is caused by considering overload dumping or not. Applying standard clear or hazy model equations instead of extinction time series leads to an underestimation of the annual plant yield at PSA. The discussion of the effect of extinction in tower plants has to include overload dumping. Situations in which overload dumping occurs are mostly connected to high radiation levels and low atmospheric extinction. Therefore it can be recommended that project developers consider site- and time-dependent extinction data, especially at hazy sites. A reduced uncertainty of the plant yield prediction can significantly reduce costs due to smaller risk margins for financing and EPCs. The generation of extinction data for several locations in the form of representative yearly time series or geographical maps should be further elaborated.
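One common form for the extinction model equations compared in such tools is an exponential (Beer-Lambert) attenuation with slant range between heliostat and receiver. The extinction coefficients below are illustrative stand-ins for "clear" and "hazy" standard conditions, not values from the paper:

```python
import math

# Transmittance of the heliostat-to-receiver path under exponential
# (Beer-Lambert) attenuation. beta_per_m is the extinction coefficient.
def transmittance(slant_range_m, beta_per_m):
    return math.exp(-beta_per_m * slant_range_m)

for label, beta in [("clear", 1e-5), ("hazy", 1e-4)]:
    for d in (500.0, 1000.0, 1500.0):
        loss = 1.0 - transmittance(d, beta)
        print(f"{label}: {d:.0f} m -> {100 * loss:.1f}% loss")
```

Because losses grow with slant range, far-field heliostats dominate the extinction penalty, which is why site-specific coefficients matter most for large fields on hazy sites.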

  4. Numerical Tool Path Optimization for Conventional Sheet Metal Spinning Processes

    Science.gov (United States)

    Rentsch, Benedikt; Manopulo, Niko; Hora, Pavel

    2016-08-01

    To this day, conventional sheet metal spinning processes are designed with a very low degree of automation. They are usually executed by experienced personnel, who actively adjust the tool paths during production. The practically unlimited freedom in designing the tool paths enables the efficient manufacturing of complex geometries on one hand, but is challenging to translate into a standardized procedure on the other. The present study aims to propose a systematic methodology, based on a 3D FEM model combined with a numerical optimization strategy, in order to design tool paths. The accurate numerical modelling of the spinning process is firstly discussed, followed by an analysis of appropriate objective functions and constraints required to obtain a failure free tool path design.

  5. Solving complex maintenance planning optimization problems using stochastic simulation and multi-criteria fuzzy decision making

    Energy Technology Data Exchange (ETDEWEB)

    Tahvili, Sahar [Mälardalen University (Sweden); Österberg, Jonas; Silvestrov, Sergei [Division of Applied Mathematics, Mälardalen University (Sweden); Biteus, Jonas [Scania CV (Sweden)

    2014-12-10

    One of the most important factors in the operations of many corporations today is to maximize profit, and one important tool to that effect is the optimization of maintenance activities. Maintenance activities are, at the highest level, divided into two major areas, corrective maintenance (CM) and preventive maintenance (PM). When optimizing maintenance activities, by a maintenance plan or policy, we seek to find the best activities to perform at each point in time, be it PM or CM. We explore the use of stochastic simulation, genetic algorithms and other tools for solving complex maintenance planning optimization problems in terms of a suggested framework model based on discrete event simulation.
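The CM-versus-PM trade-off at the heart of such planning problems can be sketched with a small Monte Carlo model: random failures trigger costly corrective repairs unless a planned service renews the component first. The failure distribution, costs, and candidate intervals are illustrative, and a plain grid search stands in for the genetic algorithm:

```python
import random

random.seed(1)

CM_COST, PM_COST = 10.0, 1.0   # corrective repair costs far more than planned service
HORIZON = 1000.0               # simulated operating hours

def sample_failure():
    # Weibull-distributed time to failure (scale and shape are illustrative).
    return random.weibullvariate(120.0, 2.0)

def simulate_cost(pm_interval, runs=2000):
    """Average maintenance cost over the horizon for a given PM interval."""
    total = 0.0
    for _ in range(runs):
        t = cost = 0.0
        while t < HORIZON:
            ttf = sample_failure()
            if ttf < pm_interval:      # component fails before planned service
                t += ttf
                cost += CM_COST
            else:                      # PM renews the component first
                t += pm_interval
                cost += PM_COST
        total += cost
    return total / runs

# Grid search over candidate PM intervals (a GA would search the same space).
candidates = [40, 60, 80, 100, 150, 200]
best = min(candidates, key=simulate_cost)
print("best PM interval:", best)
```

Too-frequent PM wastes service cost, too-rare PM exposes the component to expensive failures; the simulation locates the interval balancing the two.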

  6. Quantum simulation using fidelity-profile optimization

    Science.gov (United States)

    Manu, V. S.; Kumar, Anil

    2014-05-01

    Experimental quantum simulation of a Hamiltonian H requires unitary operator decomposition (UOD) of its evolution unitary U = exp(-iHt) in terms of native unitary operators of the experimental system. Here, using a genetic algorithm, we numerically evaluate the most generic UOD (valid over a continuous range of Hamiltonian parameters) of the unitary operator U, termed fidelity-profile optimization. The optimization is obtained by systematically evaluating the functional dependence of experimental unitary operators (such as single-qubit rotations and time-evolution unitaries of the system interactions) to the Hamiltonian (H) parameters. Using this technique, we have solved the experimental unitary decomposition of a controlled-phase gate (for any phase value), the evolution unitary of the Heisenberg XY interaction, and simulation of the Dzyaloshinskii-Moriya (DM) interaction in the presence of the Heisenberg XY interaction. Using these decompositions, we studied the entanglement dynamics of a Bell state in the DM interaction and experimentally verified the entanglement preservation procedure of Hou et al. [Ann. Phys. (N.Y.) 327, 292 (2012), 10.1016/j.aop.2011.08.004] in a nuclear magnetic resonance quantum information processor.

  7. Simulation Tools Prevent Signal Interference on Spacecraft

    Science.gov (United States)

    2014-01-01

    NASA engineers use simulation software to detect and prevent interference between different radio frequency (RF) systems on a rocket and satellite before launch. To speed up the process, Kennedy Space Center awarded SBIR funding to Champaign, Illinois-based Delcross Technologies LLC, which added a drag-and-drop feature to its commercial simulation software, resulting in less time spent preparing for the analysis.

  8. HAM-Tools – a whole building simulation tool in Annex 41

    DEFF Research Database (Denmark)

    Kalagasidis, Angela Sasic; Rode, Carsten; Woloszyn, Monika

    2008-01-01

    HAM-Tools is a building simulation software. The main task of this tool is to simulate transfer processes related to building physics, i.e. heat, air and moisture transport in buildings and building components in operating conditions. The scope of the ECBCS Annex 41 “Whole Building Heat, Air and ...

  9. Memory Optimization for Phase-field Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Derek Gaston; John Peterson; Andrew Slaughter; Cody Permann; David Andrs

    2014-08-01

    Phase-field simulations are computationally and memory intensive applications. Many of the phase-field simulations being conducted in support of NEAMS were not capable of running on “normal clusters” with 2-4GB of RAM per core, and instead required specialized “big-memory” clusters with 64GB per core. To address this issue, the MOOSE team developed a new Python-based utility called MemoryLogger, and applied it to locate, diagnose, and eradicate memory bottlenecks within the MOOSE framework. MemoryLogger allows for a better understanding of the memory usage of an application being run in parallel across a cluster. Memory usage information is captured for every individual process in a parallel job, and communicated to the head node of the cluster. Console text output from the application itself is automatically matched with this memory usage information to produce a detailed picture of memory usage over time, making it straightforward to identify the subroutines which contribute most to the application’s peak memory usage. The information produced by the MemoryLogger quickly and effectively narrows the search for memory optimizations to the most data-intensive parts of the simulation.
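MemoryLogger itself samples per-process usage across a cluster and matches it with console output; its API is not shown in the record. A minimal single-process analogue of the underlying idea, locating which allocations drive peak memory, can be built with the standard-library tracemalloc module (the `build_field` helper is a hypothetical stand-in for a phase-field data structure):

```python
import tracemalloc

# Profile a single process: where does peak memory come from?
tracemalloc.start()

def build_field(n):
    # Stand-in for a memory-hungry phase-field data structure.
    return [[0.0] * n for _ in range(n)]

field = build_field(300)

current, peak = tracemalloc.get_traced_memory()
top = tracemalloc.take_snapshot().statistics("lineno")[0]
print(f"peak ~ {peak // 1024} KiB; hottest line: {top.traceback[0].lineno}")
tracemalloc.stop()
```

Aggregating such per-process reports to a head node, time-aligned with each rank's console output, is the extra step the cluster-wide tool performs.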

  10. Response simulation and theoretical calibration of a dual-induction resistivity LWD tool

    Institute of Scientific and Technical Information of China (English)

    Xu Wei; Ke Shi-Zhen; Li An-Zong; Chen Peng; Zhu Jun; Zhang Wei

    2014-01-01

    In this paper, responses of a new dual-induction resistivity logging-while-drilling (LWD) tool in 3D inhomogeneous formation models are simulated by the vector finite element method (VFEM), the influences of the borehole, invaded zone, surrounding strata, and tool eccentricity are analyzed, and calibration loop parameters and calibration coefficients of the LWD tool are discussed. The results show that the tool has a greater depth of investigation than that of the existing electromagnetic propagation LWD tools and is more sensitive to azimuthal conductivity. Both deep and medium induction responses have linear relationships with the formation conductivity, considering optimal calibration loop parameters and calibration coefficients. Due to the different depths of investigation and resolution, deep induction and medium induction are affected differently by the formation model parameters, thereby having different correction factors. The simulation results can provide theoretical references for the research and interpretation of the dual-induction resistivity LWD tools.

  11. Visual Dynamic Simulation and Optimization of Zhangjiuhe Diversion Project

    Institute of Scientific and Technical Information of China (English)

    ZHONG Denghua; LIU Jianmin; XIONG Kaizhi; FU Jinqiang

    2008-01-01

    With the aim of visualizing the real-time simulation calculation of a water delivery system (WDS), a structural drawing-oriented (SDO) simulation technique was presented and applied to the Zhangjiuhe Diversion Project, a long-distance water delivery system constructed for drawing water from the Zhangjiuhe River to Kunming city. Taking SIMULINK software as the simulation platform, the technique established a visual dynamic simulation model for the system. The simulation procedure of the system was simplified, and the efficiency of modeling was also enhanced owing to the modularization and reuse of the simulation program. Furthermore, a self-optimization model was presented. Based on the digital simulation models, an online controlled-optimization link was added, and the input data can be continually optimized according to the feedback information of the simulation output. The system was thus optimized automatically. Built upon MATLAB software, simulation optimization of the Zhangjiuhe Diversion Project was achieved, which provides a new way for the research of optimal operation of WDS.

  12. Modeling and Simulation Tools for Heavy Lift Airships

    Science.gov (United States)

    Hochstetler, Ron; Chachad, Girish; Hardy, Gordon; Blanken, Matthew; Melton, John

    2016-01-01

    For conventional fixed wing and rotary wing aircraft a variety of modeling and simulation tools have been developed to provide designers the means to thoroughly investigate proposed designs and operational concepts. However, lighter-than-air (LTA) airships, hybrid air vehicles, and aerostats have some important aspects that are different from heavier-than-air (HTA) vehicles. In order to account for these differences, modifications are required to the standard design tools to fully characterize the LTA vehicle design and performance parameters. To address these LTA design and operational factors, LTA development organizations have created unique proprietary modeling tools, often at their own expense. An expansion of this limited LTA tool set could be accomplished by leveraging existing modeling and simulation capabilities available in the national laboratories and public research centers. Development of an expanded set of publicly available LTA modeling and simulation tools for LTA developers would mitigate the reliance on proprietary LTA design tools in use today. A set of well researched, open source, high fidelity LTA design modeling and simulation tools would advance LTA vehicle development and also provide the analytical basis for accurate LTA operational cost assessments. This paper will present the modeling and analysis tool capabilities required for LTA vehicle design, analysis of operations, and full life-cycle support. The tools currently available will be surveyed to identify the gaps between their capabilities and the LTA industry's needs. Options for development of new modeling and analysis capabilities to supplement contemporary tools will also be presented.

  13. Methods and tools for analysis and optimization of power plants

    Energy Technology Data Exchange (ETDEWEB)

    Assadi, Mohsen

    2000-09-01

    The most noticeable advantage of the introduction of computer-aided tools in the field of power generation has been the ability to study the plant's performance prior to the construction phase. The results of these studies have made it possible to change and adjust the plant layout to match the pre-defined requirements. Further development of computers in recent years has opened up for the implementation of new features in the existing tools and also for the development of new tools for specific applications, like thermodynamic and economic optimization, prediction of the remaining component life time, and fault diagnostics, resulting in improvement of the plant's performance, availability and reliability. The most common tools for pre-design studies are heat and mass balance programs. Further thermodynamic and economic optimization of plant layouts, generated by the heat and mass balance programs, can be accomplished by using pinch programs, exergy analysis and thermoeconomics. Surveillance and fault diagnostics of existing systems can be performed by using tools like condition monitoring systems and artificial neural networks. The increased number of tools and their various construction and application areas make the choice of the most adequate tool for a certain application difficult. In this thesis the development of different categories of tools and techniques, and their application areas, are reviewed and presented. Case studies on both existing and theoretical power plant layouts have been performed using different commercially available tools to illuminate their advantages and shortcomings. The development of power plant technology and the requirements for new tools and measurement systems have been briefly reviewed. This thesis also contains programming techniques and calculation methods concerning part-load calculations using local linearization, which have been implemented in an in-house heat and mass balance program developed by the author.

  14. Comparison of simplified and advanced building simulation tool with measured data

    DEFF Research Database (Denmark)

    Christensen, Jørgen Erik; Schiønning, Peder; Dethlefsen, Espen

    2013-01-01

    In the future, building design must progress to a format where CO2-neutral societies are optimized as a whole and innovative technologies are integrated. The purpose of this paper is to demonstrate the problems of using a simplified design tool to simulate a complicated building and how this may not give...

  15. Process Optimization with Simulation Modeling in a Manufacturing System

    Directory of Open Access Journals (Sweden)

    Akbel Yildiz

    2011-04-01

    Full Text Available Computer simulation has become an important tool for modeling systems in the last ten years, owing to parallel improvements in computer technology. Companies turn to computer-based system modeling and simulation not to lose income or time to their competitors, but to plan future investments while both command the same labor force, resources and technology. This study is an implementation at a machine spare parts manufacturing factory located in a city in Turkey. The purpose of the study is to increase utilization rates and optimize the manufacturing process in order to decrease production costs by identifying the bottlenecks in the manufacturing system. ProModel simulation software was therefore used to model the production line of the factory. The production line consists of nineteen work stations and was modeled for the two most manufactured products. Manufacturing in the factory is organized in two-week batch production periods, and the simulation model was run and replicated ten times to obtain results. Statistics, including the existing capacity usage of the work stations along the whole production line, were thus obtained to identify the bottlenecks at the critical work stations and machines. Using the simulation model to create scenarios by changing system parameters (the cycle times of the work stations, the total production quantity, the batch sizes and the shifts of the factory) helped to formulate suggestions.
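
As a minimal illustration of the bottleneck idea in the abstract above, station utilizations can be computed from cycle times and demand; the station with the highest utilization is the bottleneck. The station names, cycle times and shift figures are invented:

```python
# Utilization = time demanded of a station / time available; the bottleneck
# is the station with the highest utilization.
cycle_time_min = {"lathe": 4.0, "mill": 6.5, "drill": 3.0}   # minutes per part
parts_per_shift = 60
shift_minutes = 480.0

utilization = {s: parts_per_shift * t / shift_minutes
               for s, t in cycle_time_min.items()}
bottleneck = max(utilization, key=utilization.get)
```

A discrete-event model such as the ProModel one adds queueing, breakdowns and batching on top of this static picture, but the bottleneck diagnosis rests on the same utilization statistics.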

  16. A Novel Optimization Tool for Automated Design of Integrated Circuits based on MOGSA

    Directory of Open Access Journals (Sweden)

    Maryam Dehbashian

    2011-11-01

    Full Text Available In this paper a novel optimization method based on the Multi-Objective Gravitational Search Algorithm (MOGSA) is presented for the automated design of analog integrated circuits. The recommended method first simulates a selected circuit using a simulator, and the simulated results are then optimized by the MOGSA algorithm. This process is repeated until the optimum result is reached. The main programs of the proposed method have been implemented in MATLAB, while the analog circuits are simulated with HSPICE software. To show the capability of this method, its proficiency is examined in the optimization of analog integrated circuit design. In this paper, an analog circuit sizing scheme (the optimum automated design of a temperature-independent differential op-amp using a Widlar current source) is illustrated as a case study. The computer results obtained from implementing this method indicate that the design specifications are closely met. Moreover, according to various design criteria, this tool can give designers more options to choose a desirable scheme from among the suggested results by proposing a varied set of answers. MOGSA, the proposed algorithm, introduces a novel method for multi-objective optimization on the basis of the Gravitational Search Algorithm, in which the concept of “Pareto-optimality” is used to determine “non-dominated” positions, together with an external repository to keep these positions. To ensure the accuracy of MOGSA's performance, the algorithm is validated using several standard test functions from the specialized literature. Final results indicate that our method is highly competitive with current multi-objective optimization algorithms.
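
The “non-dominated” positions kept in MOGSA's external repository are defined by Pareto dominance. A minimal sketch of that test, assuming minimization (the objective vectors are illustrative, not taken from the paper):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimization):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(population):
    """External repository: keep only positions no other member dominates."""
    return [p for p in population
            if not any(dominates(q, p) for q in population if q != p)]

# Illustrative 2-objective values, e.g. (power, area) of candidate circuit sizings.
pop = [(1.0, 5.0), (2.0, 2.0), (3.0, 4.0), (4.0, 1.0)]
front = non_dominated(pop)   # (3.0, 4.0) is dominated by (2.0, 2.0)
```

The repository of such non-dominated positions is what gives the designer the "varied set of answers" mentioned in the abstract.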

  17. A Robust Method to Integrate End-to-End Mission Architecture Optimization Tools

    Science.gov (United States)

    Lugo, Rafael; Litton, Daniel; Qu, Min; Shidner, Jeremy; Powell, Richard

    2016-01-01

    End-to-end mission simulations include multiple phases of flight. For example, an end-to-end Mars mission simulation may include launch from Earth, interplanetary transit to Mars, and entry, descent and landing. Each phase of flight is optimized to meet specified constraints and often depends on and impacts subsequent phases. The design and optimization tools and methodologies used to combine the different aspects of the end-to-end framework, and their impact on mission planning, are presented. This work focuses on a robust implementation of a Multidisciplinary Design Analysis and Optimization (MDAO) method that offers the flexibility to quickly adapt to changing mission design requirements. Different simulations tailored to the liftoff, ascent, and atmospheric entry phases of a trajectory are integrated and optimized in the MDAO program Isight, which provides the user a graphical interface to link simulation inputs and outputs. This approach provides many advantages to mission planners, as it is easily adapted to different mission scenarios and can improve the understanding of the integrated system performance within a particular mission configuration. A Mars direct entry mission using the Space Launch System (SLS) is presented as a generic end-to-end case study. For the given launch period, the SLS launch performance is traded for improved orbit geometry alignment, resulting in an optimized net payload that is comparable to that in the SLS Mission Planner's Guide.

  18. Simulation tools for robotics research and assessment

    Science.gov (United States)

    Fields, MaryAnne; Brewer, Ralph; Edge, Harris L.; Pusey, Jason L.; Weller, Ed; Patel, Dilip G.; DiBerardino, Charles A.

    2016-05-01

    The Robotics Collaborative Technology Alliance (RCTA) program focuses on four overlapping technology areas: Perception, Intelligence, Human-Robot Interaction (HRI), and Dexterous Manipulation and Unique Mobility (DMUM). In addition, the RCTA program has a requirement to assess progress of this research in standalone as well as integrated form. Since the research is evolving and the robotic platforms with unique mobility and dexterous manipulation are in the early development stage and very expensive, an alternate approach is needed for efficient assessment. Simulation of robotic systems, platforms, sensors, and algorithms, is an attractive alternative to expensive field-based testing. Simulation can provide insight during development and debugging unavailable by many other means. This paper explores the maturity of robotic simulation systems for applications to real-world problems in robotic systems research. Open source (such as Gazebo and Moby), commercial (Simulink, Actin, LMS), government (ANVEL/VANE), and the RCTA-developed RIVET simulation environments are examined with respect to their application in the robotic research domains of Perception, Intelligence, HRI, and DMUM. Tradeoffs for applications to representative problems from each domain are presented, along with known deficiencies and disadvantages. In particular, no single robotic simulation environment adequately covers the needs of the robotic researcher in all of the domains. Simulation for DMUM poses unique constraints on the development of physics-based computational models of the robot, the environment and objects within the environment, and the interactions between them. Most current robot simulations focus on quasi-static systems, but dynamic robotic motion places an increased emphasis on the accuracy of the computational models. In order to understand the interaction of dynamic multi-body systems, such as limbed robots, with the environment, it may be necessary to build component

  19. Productivity simulation model for optimization of maritime container terminals

    Directory of Open Access Journals (Sweden)

    Elen TWRDY

    2009-01-01

    Full Text Available This article describes a proposed productivity simulation model enabling container terminal operators to find optimization possibilities. A study of more than forty terminals was carried out in order to provide a helping tool for maritime container terminals. By applying an adequate simulation model, it is possible to measure and increase the productivity in all subsystems of the maritime container terminal. Management of a maritime container terminal involves a vast number of different financial and operational decisions. Financial decisions are often directly connected with investments in infrastructure and handling equipment. Such investments are very expensive; therefore, they must pay back the invested money as soon as possible. On the other hand, some terminals are limited in their physical extension and are forced to increase annual throughput solely with sophisticated equipment on the berth side and in the yard. Considering these important facts of the container and shipping industry, the proposed simulation model provides a means of checking productivity and its variation over time, and of monitoring the competitiveness of a given maritime terminal against terminals of the same group.

  20. Optimal control and quantum simulations in superconducting quantum devices

    Energy Technology Data Exchange (ETDEWEB)

    Egger, Daniel J.

    2014-10-31

    Quantum optimal control theory is the science of steering quantum systems. In this thesis we show how to overcome the obstacles in implementing optimal control for superconducting quantum bits, a promising candidate for the creation of a quantum computer. Building such a device will require the tools of optimal control. We develop pulse shapes to solve a frequency crowding problem and create controlled-Z gates. A methodology is developed for the optimisation towards a target non-unitary process. We show how to tune up control pulses for a generic quantum system in an automated way using a combination of open- and closed-loop optimal control. This will help the scaling of quantum technologies, since algorithms can calibrate control pulses far more efficiently than humans. Additionally, we show how circuit QED can be brought to the novel regime of multi-mode ultrastrong coupling using a left-handed transmission line coupled to a right-handed one. We then propose to use this system as an analogue quantum simulator for the Spin-Boson model to show how dissipation arises in quantum systems.

  1. Speed optimized influence matrix processing in inverse treatment planning tools

    Energy Technology Data Exchange (ETDEWEB)

    Ziegenhein, Peter; Wilkens, Jan J; Nill, Simeon; Oelfke, Uwe [German Cancer Research Center (DKFZ), Department of Medical Physics in Radiation Oncology, Im Neuenheimer Feld 280, 69120 Heidelberg (Germany); Ludwig, Thomas [University of Heidelberg, Institute of Computer Science, Research Group Parallel and Distributed Systems, Im Neuenheimer Feld 348, 69120 Heidelberg (Germany)], E-mail: p.ziegenhein@dkfz.de, E-mail: u.oelfke@dkfz.de

    2008-05-07

    An optimal plan in modern treatment planning tools is found through the use of an iterative optimization algorithm, which deals with a large amount of patient-related data and a large number of treatment parameters to be optimized. Calculating a good plan is thus a very time-consuming process, which limits the application for patients in clinics and for research activities aiming for more accuracy. A common technique to handle the vast amount of radiation dose data is the concept of the influence matrix (DIJ), which stores the dose contribution of each bixel to the patient in the main memory of the computer. This study revealed that a bottleneck for the optimization time arises from the transfer of the dose data between the memory and the CPU. In this note, we introduce a new method which speeds up the transport of the stored dose data to the CPU. As an example we used the DIJ approach as implemented in our treatment planning tool KonRad, developed at the German Cancer Research Center (DKFZ) in Heidelberg. A data cycle reordering method is proposed to take advantage of modern memory hardware. This induces a minimal eviction policy, which results in memory behaviour that makes the algorithm 2.6 times faster than the naive implementation. Although our method is described for the DIJ approach implemented in KonRad, we believe that any other planning tool which uses a similar approach to store the dose data will also benefit from the described methods. (note)
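
The gain described above comes from accessing the stored dose data in the order it is laid out in memory. A toy sketch of the underlying dose accumulation, a weighted sum of influence-matrix rows swept contiguously row by row (the sizes and values are invented, and the actual KonRad reordering is more involved):

```python
# Dose accumulation dose = sum_i w_i * DIJ[i] over a dense influence matrix
# stored row-major (one row per bixel). Sweeping whole rows keeps the memory
# access pattern sequential, which is the idea behind cache-friendly
# reordering of the dose data. Toy sizes only.
n_bixels, n_voxels = 4, 6
DIJ = [[(i + 1) * 0.1 for _ in range(n_voxels)] for i in range(n_bixels)]
w = [1.0, 0.5, 0.0, 2.0]   # bixel weights from the optimizer

dose = [0.0] * n_voxels
for i, row in enumerate(DIJ):          # contiguous, row-by-row sweep
    wi = w[i]
    if wi == 0.0:                      # zero-weight bixels are skipped entirely
        continue
    for j, dij in enumerate(row):
        dose[j] += wi * dij
```

Iterating voxel-first over the same data would stride across rows and repeatedly evict cache lines, which is exactly the behaviour the paper's reordering avoids.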

  2. Simulation Tool for GNSS Ocean Surface Reflections

    DEFF Research Database (Denmark)

    Høeg, Per; von Benzon, Hans-Henrik; Durgonics, Tibor

    2015-01-01

    on the International Space Station, are focusing on GNSS ocean reflection measurements. Thus, simulation studies highlighting the assumptions for the data retrievals and the precision and the accuracy of such measurements are of interest for assessing the observational method.The theory of propagation of microwaves...

  3. Software tool for the prosthetic foot modeling and stiffness optimization.

    Science.gov (United States)

    Strbac, Matija; Popović, Dejan B

    2012-01-01

    We present the procedure for the optimization of the stiffness of the prosthetic foot. The procedure allows the selection of the elements of the foot and the materials used for the design. The procedure is based on the optimization where the cost function is the minimization of the difference between the knee joint torques of healthy walking and walking with the transfemoral prosthesis. We present a simulation environment that allows the user to interactively vary the foot geometry and track the changes in the knee torque that arise from these adjustments. The software allows the estimation of the optimal prosthetic foot elasticity and geometry. We show that altering model attributes such as the length of the elastic foot segment or its elasticity leads to significant changes in the estimated knee torque required for a given trajectory.

  4. Software Tool for the Prosthetic Foot Modeling and Stiffness Optimization

    Directory of Open Access Journals (Sweden)

    Matija Štrbac

    2012-01-01

    Full Text Available We present the procedure for the optimization of the stiffness of the prosthetic foot. The procedure allows the selection of the elements of the foot and the materials used for the design. The procedure is based on the optimization where the cost function is the minimization of the difference between the knee joint torques of healthy walking and walking with the transfemoral prosthesis. We present a simulation environment that allows the user to interactively vary the foot geometry and track the changes in the knee torque that arise from these adjustments. The software allows the estimation of the optimal prosthetic foot elasticity and geometry. We show that altering model attributes such as the length of the elastic foot segment or its elasticity leads to significant changes in the estimated knee torque required for a given trajectory.
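
The cost function described above, the squared difference between healthy knee torques and those produced with the prosthesis, can be sketched with a toy spring model of the elastic foot segment. All numbers and the torque model are invented for illustration:

```python
# Reference "healthy" knee torque over the stance phase and the corresponding
# deflection of the elastic foot segment (invented values).
healthy_torque = [10.0, 25.0, 40.0, 30.0, 12.0]    # N*m
deflection     = [0.01, 0.025, 0.04, 0.03, 0.012]  # m

def cost(k):
    """Sum of squared differences between healthy torques and a simple
    spring-based prosthesis model (torque ~ stiffness * deflection)."""
    prosthetic = [k * d for d in deflection]
    return sum((h - p) ** 2 for h, p in zip(healthy_torque, prosthetic))

stiffness_grid = range(500, 1501, 10)   # candidate stiffness values, N*m per m
k_best = min(stiffness_grid, key=cost)  # grid search standing in for the optimizer
```

The published procedure varies geometry and material jointly; the grid search here only illustrates the shape of the cost function over a single stiffness parameter.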

  5. Unified Nonlinear Flight Dynamics and Aeroelastic Simulator Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — ZONA Technology, Inc. (ZONA) proposes a R&D effort to develop a Unified Nonlinear Flight Dynamics and Aeroelastic Simulator (UNFDAS) Tool that will combine...

  6. High Fidelity Regolith Simulation Tool for ISRU Applications Project

    Data.gov (United States)

    National Aeronautics and Space Administration — NASA has serious unmet needs for simulation tools capable of predicting the behavior of lunar regolith in proposed excavation, transport and handling systems....

  7. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, A.; Jauch, Clemens; Soerensen, P.

    The present report describes the dynamic wind turbine models implemented in the power system simulation tool DIgSILENT. The developed models are a part of the results of a national research project, whose overall objective is to create a model database in different simulation tools. The report...... provides a description of the wind turbine modelling, both at a component level and at a system level....

  8. Simulation Tool for GNSS Ocean Surface Reflections

    Science.gov (United States)

    Høeg, Per; von Benzon, Hans-Henrik; Durgonics, Tibor

    2015-04-01

    GNSS coherent and incoherent reflected signals have the potential of deriving large-scale parameters of ocean surfaces, such as barotropic variability, eddy currents and fronts, Rossby waves, coastal upwelling, mean ocean surface heights, and patterns of the general ocean circulation. In the reflection zone the measurements may yield parameters such as sea surface roughness, winds, waves, heights and tilts from the spectral measurements. Previous measurements from mountain tops and airplanes have demonstrated such results. The coming satellite missions, CYGNSS, COSMIC-2, and GEROS on the International Space Station, are focusing on GNSS ocean reflection measurements. Thus, simulation studies highlighting the assumptions behind the data retrievals and the precision and accuracy of such measurements are of interest for assessing the observational method. The theory of propagation of microwaves in the atmosphere is well established, and methods for propagation modeling range from ray tracing to numerical solutions of the wave equation. Besides ray tracing, there are propagation methods that use mode theory and a finite difference solution of the parabolic equation. The presented propagator is based on the solution of the parabolic equation. The parabolic equation in our simulator is solved using the split-step sine transformation. The Earth's surface is modeled with the use of an impedance model. The value of the Earth impedance is given as a function of the range along the surface of the Earth. This impedance concept gives an accurate lower boundary condition in the determination of the electromagnetic field, and makes it possible to simulate reflections and the effects of transitions between different media. A semi-isotropic Phillips spectrum is used to represent the air-sea interaction. Simulated GPS ocean surface reflections will be presented and discussed based on different ocean characteristics, and the spectra of the simulated surface reflections will be analyzed.
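
The split-step solution of the parabolic equation mentioned above alternates a diffraction step in the spectral domain with a refraction step in the spatial domain. The simulator described uses a split-step sine transformation with an impedance lower boundary; the toy sketch below substitutes a unitary discrete Fourier transform, an invented phase screen and a free boundary, and only illustrates the splitting itself:

```python
import cmath

def dft(u, sign):
    """Unitary discrete Fourier transform (naive O(n^2), adequate for a sketch)."""
    n = len(u)
    s = n ** -0.5
    return [s * sum(u[k] * cmath.exp(sign * 2j * cmath.pi * j * k / n)
                    for k in range(n)) for j in range(n)]

def split_step(u, dz, k0, dx, phase_screen):
    """One split step: diffraction applied in the spectral domain,
    refraction applied as a phase screen in the spatial domain."""
    n = len(u)
    spec = dft(u, -1)
    for j in range(n):
        kx = 2 * cmath.pi * (j if j <= n // 2 else j - n) / (n * dx)
        spec[j] *= cmath.exp(-1j * kx ** 2 * dz / (2 * k0))   # diffraction
    u = dft(spec, +1)
    return [ui * cmath.exp(1j * k0 * phase_screen[i] * dz)    # refraction
            for i, ui in enumerate(u)]

n = 16
u0 = [cmath.exp(-((i - n / 2) / 3.0) ** 2) for i in range(n)]  # toy Gaussian field
u1 = split_step(u0, dz=1.0, k0=2 * cmath.pi, dx=0.5,
                phase_screen=[1e-4 * i for i in range(n)])     # invented refractivity
```

Because both sub-steps are unit-modulus multiplications in their respective domains, the field's total power is conserved exactly, a useful sanity check on any split-step implementation.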

  9. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    Directory of Open Access Journals (Sweden)

    Helen V. Hsieh

    2017-05-01

    Full Text Available Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy-to-use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor's office in Manhattan to a rural medical clinic in low-resource settings. The simplicity of the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of the delivery geometry. In this paper, we review conventional optimization and then focus on the latter, outlining analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness.

  10. A Simulation Tool for Hurricane Evacuation Planning

    Directory of Open Access Journals (Sweden)

    Daniel J. Fonseca

    2009-01-01

    Full Text Available Atlantic hurricanes and severe tropical storms are a serious threat for the communities in the Gulf of Mexico region. Such storms are violent and destructive. In response to these dangers, coastal evacuation may be ordered. This paper describes the development of a simulation model to analyze the movement of vehicles through I-65, a major US Interstate highway that runs north from the coastal city of Mobile, Alabama, towards the State of Tennessee, during a massive evacuation originated by a disastrous event such as a hurricane. The constructed simulation platform consists of a primary and two secondary models. The primary model is based on the entry of vehicles from the 20 on-ramps to I-65. The two secondary models assist the primary model with related traffic events such as car breakdowns and accidents, traffic control measures, interarrival signaling, and unforeseen emergency incidents, among others. Statistical testing was performed on the data generated by the simulation model to identify variation in relevant traffic variables affecting the timely flow of vehicles travelling north. The statistical analysis focused on the closing of alternative on-ramps throughout the Interstate.

  11. Performance optimization of web-based medical simulation.

    Science.gov (United States)

    Halic, Tansel; Ahn, Woojin; De, Suvranu

    2013-01-01

    This paper presents a technique for performance optimization of multimodal interactive web-based medical simulation. A web-based simulation framework is promising for easy access and wide dissemination of medical simulation. However, the real-time performance of the simulation depends highly on the hardware capability of the client side. Providing consistent simulation across different hardware is critical for reliable medical simulation. This paper proposes a non-linear mixed integer programming model to optimize the performance of visualization and physics computation while considering hardware capability and application-specific constraints. The optimization model identifies and parameterizes the rendering and computing capabilities of the client hardware using an exploratory proxy code. The parameters are utilized to determine the optimized simulation conditions, including texture sizes, mesh sizes and canvas resolution. The test results show that the optimization model not only achieves the desired frames per second but also resolves visual artifacts due to low-performance hardware.
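
The optimization described above can be illustrated with an exhaustive search over discrete simulation settings that maximizes a quality measure subject to a frame-rate constraint. The performance model, candidate settings and target below are invented stand-ins for the paper's proxy-code measurements and mixed integer program:

```python
from itertools import product

# Candidate discrete settings: texture size, mesh resolution, canvas resolution.
textures = [256, 512, 1024]
meshes   = [500, 1000, 2000]            # element counts
canvases = [(640, 480), (1280, 720)]

def predicted_fps(tex, mesh, canvas):
    """Invented performance model standing in for the proxy-code measurement."""
    pixels = canvas[0] * canvas[1]
    return 1.0e8 / (tex * mesh + 4 * pixels)

def quality(tex, mesh, canvas):
    """Invented fidelity measure: more texels, elements and pixels is better."""
    return tex * mesh * canvas[0] * canvas[1]

TARGET_FPS = 30.0
best = max((c for c in product(textures, meshes, canvases)
            if predicted_fps(*c) >= TARGET_FPS),
           key=lambda c: quality(*c))
```

The paper's model is a non-linear mixed integer program solved properly rather than enumerated, but the structure is the same: maximize fidelity subject to the client hardware sustaining the target frame rate.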

  12. Reactor2D: A tool for simulation of shock deformation

    Science.gov (United States)

    Kraus, Eugeny I.; Shabalin, Ivan I.

    2016-10-01

    The basic steps for creating a numerical tool to simulate the deformation and failure processes of complex technical objects (CTO) are presented. Calculations of shock loading of CTO at both low and high speeds are carried out, demonstrating the efficiency of the numerical tools created.

  13. Simulation Tools for Power Electronics Courses Based on Java Technologies

    Science.gov (United States)

    Canesin, Carlos A.; Goncalves, Flavio A. S.; Sampaio, Leonardo P.

    2010-01-01

    This paper presents interactive power electronics educational tools. These interactive tools make use of the benefits of Java language to provide a dynamic and interactive approach to simulating steady-state ideal rectifiers (uncontrolled and controlled; single-phase and three-phase). Additionally, this paper discusses the development and use of…

  14. CPN Tools for Editing, Simulating, and Analysing Coloured Petri Nets

    DEFF Research Database (Denmark)

    Ratzer, Anne Vinter; Wells, Lisa Marie; Lassen, Henry Michael

    2003-01-01

    CPN Tools is a tool for editing, simulating and analysing Coloured Petri Nets. The GUI is based on advanced interaction techniques, such as toolglasses, marking menus, and bi-manual interaction. Feedback facilities provide contextual error messages and indicate dependency relationships between net...

  16. Contingency Contractor Optimization Phase 3 Sustainment Database Design Document - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Frazier, Christopher Rawls; Durfee, Justin David; Bandlow, Alisa; Gearhart, Jared Lee; Jones, Katherine A

    2016-05-01

    The Contingency Contractor Optimization Tool – Prototype (CCOT-P) database is used to store input and output data for the linear program model described in [1]. The database allows queries to retrieve this data, and supports updating and inserting new input data.

  17. A Hybrid Analytical/Simulation Modeling Approach for Planning and Optimizing Mass Tactical Airborne Operations

    Science.gov (United States)

    1995-05-01

    A Hybrid Analytical/Simulation Modeling Approach for Planning and Optimizing Mass Tactical Airborne Operations, by David Douglas Briggs, M.S.B.A. (thesis, technical report, May 1995). Simulation modeling presents itself as an excellent alternate tool for planning because it allows for the modeling of highly complex

  18. Building energy demand aggregation and simulation tools

    DEFF Research Database (Denmark)

    Gianniou, Panagiota; Heller, Alfred; Rode, Carsten

    2015-01-01

    Nowadays, the minimization of energy consumption and the optimization of efficiency of the overall energy grid have been in the agenda of most national and international energy policies. At the same time, urbanization has put cities under the microscope towards achieving cost-effective energy...... savings due to their compact and highly dense form. Thus, accurate estimation of energy demand of cities is of high importance to policy-makers and energy planners. This calls for automated methods that can be easily expandable to higher levels of aggregation, ranging from clusters of buildings...... to neighbourhoods and cities. Buildings occupy a key place in the development of smart cities as they represent an important potential to integrate smart energy solutions. Building energy consumption affects significantly the performance of the entire energy network. Therefore, a realistic estimation...

  19. Risk Reduction and Training using Simulation Based Tools - 12180

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Irin P. [Newport News Shipbuilding, Newport News, Virginia 23607 (United States)

    2012-07-01

    Process Modeling and Simulation (M and S) has been used for many years in manufacturing and similar domains, as part of an industrial engineer's tool box. Traditionally, however, this technique has been employed in small, isolated projects where models were created from scratch, often making it time and cost prohibitive. Newport News Shipbuilding (NNS) has recognized the value of this predictive technique and what it offers in terms of risk reduction, cost avoidance and on-schedule performance of highly complex work. To facilitate implementation, NNS has been maturing a process and the software to rapidly deploy and reuse M and S based decision support tools in a variety of environments. Some examples of successful applications by NNS of this technique in the nuclear domain are a reactor refueling simulation based tool, a fuel handling facility simulation based tool and a tool for dynamic radiation exposure tracking. The next generation of M and S applications include expanding simulation based tools into immersive and interactive training. The applications discussed here take a tool box approach to creating simulation based decision support tools for maximum utility and return on investment. This approach involves creating a collection of simulation tools that can be used individually or integrated together for a larger application. The refueling simulation integrates with the fuel handling facility simulation to understand every aspect and dependency of the fuel handling evolutions. This approach translates nicely to other complex domains where real system experimentation is not feasible, such as nuclear fuel lifecycle and waste management. Similar concepts can also be applied to different types of simulation techniques. For example, a process simulation of liquid waste operations may be useful to streamline and plan operations, while a chemical model of the liquid waste composition is an important tool for making decisions with respect to waste disposition

  20. Spacecraft Component Adaptive Layout Environment (SCALE): An efficient optimization tool

    Science.gov (United States)

    Fakoor, Mahdi; Ghoreishi, Seyed Mohammad Navid; Sabaghzadeh, Hossein

    2016-11-01

    For finding the optimum layout of spacecraft subsystems, important factors such as the center of gravity, moments of inertia, thermal distribution, natural frequencies, etc. should be taken into account. This large number of effective parameters makes the optimum layout process for spacecraft subsystems complex and time consuming. In this paper, an automatic tool based on multi-objective optimization methods is proposed for the three-dimensional layout of spacecraft subsystems. In this regard, an efficient Spacecraft Component Adaptive Layout Environment (SCALE) is produced by the integration of modeling, FEM, and optimization software. SCALE automatically provides optimal solutions for the three-dimensional layout of spacecraft subsystems while considering important constraints such as the center of gravity, moments of inertia, thermal distribution, natural frequencies and structural strength. In order to show the superiority and efficiency of SCALE, layouts of a telecommunication spacecraft and a remote sensing spacecraft are performed. The results show that the objective function values for the layouts obtained by using SCALE are much better than those of the traditional one, i.e. the Reference Baseline Solution (RBS) proposed by the engineering system team. This indicates the good performance and ability of SCALE in finding the optimal layout of spacecraft subsystems.

  1. 10 CFR 434.521 - The simulation tool.

    Science.gov (United States)

    2010-01-01

    ... RESIDENTIAL BUILDINGS Building Energy Cost Compliance Alternative § 434.521 The simulation tool. 521.1 Annual energy consumption shall be simulated with a multi-zone, 8760 hours per year building energy model. The... buildings. In addition, models shall be capable of translating the Design Energy Consumption into......

  2. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    Science.gov (United States)

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development.

  3. Optimizing Muscle Parameters in Musculoskeletal Modeling Using Monte Carlo Simulations

    Science.gov (United States)

    Hanson, Andrea; Reed, Erik; Cavanagh, Peter

    2011-01-01

    Astronauts assigned to long-duration missions experience bone and muscle atrophy in the lower limbs. The use of musculoskeletal simulation software has become a useful tool for modeling joint and muscle forces during human activity in reduced gravity as access to direct experimentation is limited. Knowledge of muscle and joint loads can better inform the design of exercise protocols and exercise countermeasure equipment. In this study, the LifeModeler(TM) (San Clemente, CA) biomechanics simulation software was used to model a squat exercise. The initial model using default parameters yielded physiologically reasonable hip-joint forces. However, no activation was predicted in some large muscles such as rectus femoris, which have been shown to be active in 1-g performance of the activity. Parametric testing was conducted using Monte Carlo methods and combinatorial reduction to find a muscle parameter set that more closely matched physiologically observed activation patterns during the squat exercise. Peak hip joint force using the default parameters was 2.96 times body weight (BW) and increased to 3.21 BW in an optimized, feature-selected test case. The rectus femoris was predicted to peak at 60.1% activation following muscle recruitment optimization, compared to 19.2% activation with default parameters. These results indicate the critical role that muscle parameters play in joint force estimation and the need for exploration of the solution space to achieve physiologically realistic muscle activation.
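The parametric testing described above amounts to sampling candidate muscle parameter sets at random and keeping the set whose simulated output best matches observation. A minimal Monte Carlo search of that kind is sketched below; the parameter names, bounds, and loss function are purely illustrative stand-ins, not LifeModeler quantities:

```python
import random

def monte_carlo_search(loss, bounds, n_trials=2000, seed=0):
    """Draw random parameter sets within bounds and keep the one whose
    simulated output best matches the observed target (smallest loss)."""
    rng = random.Random(seed)
    best, best_loss = None, float("inf")
    for _ in range(n_trials):
        params = {k: rng.uniform(lo, hi) for k, (lo, hi) in bounds.items()}
        cur = loss(params)
        if cur < best_loss:
            best, best_loss = params, cur
    return best, best_loss

# Hypothetical 2-parameter example with a known optimum at (0.5, 0.5);
# a real study would run the musculoskeletal simulation inside `loss`.
bounds = {"a": (0.0, 1.0), "b": (0.0, 1.0)}
best, best_loss = monte_carlo_search(
    lambda p: (p["a"] - 0.5) ** 2 + (p["b"] - 0.5) ** 2, bounds)
```

In practice the loss would compare predicted muscle activations against the physiologically observed patterns, and combinatorial reduction would prune the parameter space between sampling rounds.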

  4. Numerical tools for Molten salt reactor simulation

    Energy Technology Data Exchange (ETDEWEB)

    Doligez, X.; Heuer, D.; Merle-Lucotte, E.; Allibert, M.; Ghetta, V. [LPSC-IN2P3-CNRS/Universite Joseph Fourier/Grenoble-INP, 53 Avenue des Martyrs, 38026 Grenoble Cedex (France)

    2009-06-15

    Molten salt reactors (MSR) are basically different from other reactors mainly because the fuel is liquid: it flows through the core, pipes, pumps and heat exchangers. Previous studies showed that a particular configuration of a molten salt reactor perfectly fulfils the criteria chosen by the Generation 4 International Forum (GIF). This configuration, called the non-moderated Thorium Molten Salt Reactor, is a 1000 GW electrical thorium-cycle-based molten salt reactor with no moderator inside the core. Consequently, the neutron spectrum is fast. The reactor is coupled with a salt control unit, which complicates the studies. Reactor simulation is based on solving Bateman's equations, which give the population of each nucleus inside the core at each moment. Because of the MSR's fundamental characteristics, those equations have to be modified by adding two terms: a fertile/fissile alimentation for reactivity and salt composition control, and the associated reprocessing term. The equations become: dN_i/dt = Σ_{j≠i} λ_{j→i} N_j + Σ_{j≠i} ⟨σ_{j→i}φ⟩ N_j − λ_i N_i − ⟨σ_i φ⟩ N_i − λ_chem N_i + A, where λ_chem represents the reprocessing capacities and A represents the fertile/fissile alimentation. All our studies are made with a homemade code, REM, which is a precision-driven code for material evolution. The neutron flux and neutron reaction rates are calculated with MCNP, and the temporal integration is performed with a fourth-order Runge-Kutta method. This code, REM, whose calculation scheme will be described in the paper, does not allow a coupling flexible enough between the reprocessing and the core physics. Indeed, the reprocessing terms in the previous equation (λ_chem) are set for the whole evolution, which can last several hundred years. A new way is to drive the chemical needs to keep the core critical. Therefore, we are

  5. ICOOL: A TOOL FOR MUON COLLIDER SIMULATIONS.

    Energy Technology Data Exchange (ETDEWEB)

    FERNOW,R.C.

    2001-09-28

    Current ideas for designing neutrino factories [1,2] and muon colliders [3] require unique configurations of fields and materials to prepare the muon beam for acceleration. This so-called front-end system must accomplish the goals of phase rotation, bunching and cooling. We have continued the development of a 3-D tracking code, ICOOL [4], for examining possible muon collider front-end configurations. A system is described in terms of a series of longitudinal regions with associated material and field properties. The tracking takes place in a coordinate system that follows a reference orbit through the system. The code takes into account decays and interactions of ≈50-500 MeV/c muons in matter. Material geometry regions include cylinders and wedges. A number of analytic models are provided for describing the field configurations. Simple diagnostics are built into the code, including calculation of emittances and correlations, longitudinal traces, histograms and scatter plots. A number of auxiliary codes can be used for pre-processing, post-processing and optimization.

  6. CPN/Tools: A Tool for Editing and Simulating Coloured Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt; Christensen, Søren; Ravn, Katrine

    2001-01-01

    CPN/Tools is a major redesign of the popular Design/CPN tool for editing, simulation and state space analysis of Coloured Petri Nets. The new interface is based on advanced interaction techniques, including bi-manual interaction, toolglasses and marking menus and a new metaphor for managing...

  7. Finite Element Modeling, Simulation, Tools, and Capabilities at Superform

    Science.gov (United States)

    Raman, Hari; Barnes, A. J.

    2010-06-01

    Over the past thirty years Superform has been a pioneer in the SPF arena, having developed a keen understanding of the process and a range of unique forming techniques to meet varying market needs. Superform’s high-profile list of customers includes Boeing, Airbus, Aston Martin, Ford, and Rolls Royce. One of the more recent additions to Superform’s technical know-how is finite element modeling and simulation. Finite element modeling is a powerful numerical technique which when applied to SPF provides a host of benefits including accurate prediction of strain levels in a part, presence of wrinkles and predicting pressure cycles optimized for time and part thickness. This paper outlines a brief history of finite element modeling applied to SPF and then reviews some of the modeling tools and techniques that Superform have applied and continue to do so to successfully superplastically form complex-shaped parts. The advantages of employing modeling at the design stage are discussed and illustrated with real-world examples.

  8. PLIO: a generic tool for real-time operational predictive optimal control of water networks.

    Science.gov (United States)

    Cembrano, G; Quevedo, J; Puig, V; Pérez, R; Figueras, J; Verdejo, J M; Escaler, I; Ramón, G; Barnet, G; Rodríguez, P; Casas, M

    2011-01-01

    This paper presents a generic tool, named PLIO, that allows the real-time operational control of water networks to be implemented. Control strategies are generated using predictive optimal control techniques. This tool supports flow management in a large water supply and distribution system including reservoirs, open-flow channels for water transport, water treatment plants, pressurized water pipe networks, tanks, flow/pressure control elements and a telemetry/telecontrol system. Predictive optimal control is used to generate flow control strategies from the sources to the consumer areas to meet future demands with appropriate pressure levels, optimizing operational goals such as network safety volumes and flow control stability. PLIO allows the user to build the network model graphically and then automatically generate the model equations used by the predictive optimal controller. Additionally, PLIO can work off-line (in simulation) and on-line (in real-time mode). The case study of Santiago, Chile is presented to exemplify the control results obtained using PLIO off-line (in simulation).

  9. Optimization of multiple-layer microperforated panels by simulated annealing

    DEFF Research Database (Denmark)

    Ruiz Villamil, Heidi; Cobo, Pedro; Jacobsen, Finn

    2011-01-01

    Sound absorption by microperforated panels (MPP) has received increasing attention in the past years as an alternative to conventional porous absorbers in applications with special cleanliness and health requirements. The absorption curve of an MPP depends on four parameters: the hole diameter, the panel thickness, the perforation ratio, and the thickness of the air cavity between the panel and an impervious wall. It is possible to find a proper combination of these parameters that provides an MPP absorbing in one or two octave bands within the frequency range of interest for noise control. Therefore, simulated annealing is proposed in this paper as a tool to solve the optimization problem of finding the best combination of the constitutive parameters of a multiple-layer MPP (ML-MPP) providing the maximum average absorption within a prescribed frequency band.
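A generic simulated annealing loop over the four constitutive parameters can be sketched as follows. The objective below is a hypothetical stand-in (distance from an assumed optimum); a real run would instead score the panel's average absorption over the target band from its acoustic impedance:

```python
import math, random

def simulated_annealing(cost, x0, bounds, steps=5000, T0=1.0, seed=0):
    """Generic simulated annealing minimizer over continuous parameters."""
    rng = random.Random(seed)
    x, c = list(x0), cost(x0)
    best_x, best_c = list(x), c
    for k in range(steps):
        T = T0 * (1.0 - k / steps)              # linear cooling schedule
        i = rng.randrange(len(x))               # perturb one parameter
        lo, hi = bounds[i]
        cand = list(x)
        cand[i] = min(hi, max(lo, x[i] + rng.gauss(0.0, 0.1 * (hi - lo))))
        cc = cost(cand)
        # accept improvements always; worse moves with Boltzmann probability
        if cc < c or rng.random() < math.exp(-(cc - c) / max(T, 1e-12)):
            x, c = cand, cc
            if c < best_c:
                best_x, best_c = list(x), c
    return best_x, best_c

# Illustrative parameters: hole diameter (mm), panel thickness (mm),
# perforation ratio (%), cavity depth (mm); the "optimum" is invented.
target = [0.3, 0.5, 1.0, 30.0]
bounds = [(0.05, 1.0), (0.1, 2.0), (0.5, 5.0), (5.0, 100.0)]
cost = lambda p: sum((pi - ti) ** 2 for pi, ti in zip(p, target))
best, best_cost = simulated_annealing(cost, [b[0] for b in bounds], bounds)
```

The annealing schedule and proposal widths are design choices; the paper's actual cooling parameters are not assumed here.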

  10. Software simulator for design and optimization of the kaleidoscopes for the surface reflectance measurement

    Science.gov (United States)

    Havran, Vlastimil; Bittner, Jiří; Čáp, Jiří; Hošek, Jan; Macúchová, Karolina; Němcová, Šárka

    2015-01-01

    Realistic reproduction of appearance of real-world materials by means of computer graphics requires accurate measurement and reconstruction of surface reflectance properties. We propose an interactive software simulation tool for modeling properties of a kaleidoscopic optical system for surface reflectance measurement. We use ray tracing to obtain fine grain simulation results corresponding to the resolution of a simulated image sensor and computing the reflections inside this system based on planar mirrors. We allow for a simulation of different geometric configurations of a kaleidoscope such as the number of mirrors, the length, and the taper angle. For accelerating the computation and delivering interactivity we use parallel processing of large groups of rays. Apart from the interactive mode our tool also features batch optimization suitable for automatic search for optimized kaleidoscope designs. We discuss the possibilities of the simulation and present some preliminary results obtained by using it in practice.

  11. Simulation and optimization of electromagnetohydrodynamic flows

    Science.gov (United States)

    Dennis, Brian Harrison

    2000-10-01

    Electromagnetohydrodynamics (EMHD) is the study of the flow of electrically conducting incompressible fluids in applied electric and magnetic fields. The goal of this research was to develop and implement a numerical method for the simulation and optimization of steady viscous planar and axisymmetric EMHD flows. A finite element method based on least-squares variational principles, known as the least-squares finite element method (LSFEM), was used to discretize the governing system of partial differential equations. The use of LSFEM allows equal-order approximation functions for all unknowns and is stable for high Reynolds numbers. In addition, the LSFEM allows the enforcement of the divergence constraint on the magnetic field in a straightforward manner. The associated linear algebraic system is symmetric and positive definite. A new second-order theoretical model of the combined interaction of externally applied electric and magnetic fields and viscous incompressible fluid flows was rewritten as a system of first-order partial differential equations, making it suitable for the application of LSFEM. The method was implemented in an object-oriented fashion using the C++ programming language. Both h and p-type finite elements were implemented in the software. The p-type finite elements were developed using hierarchical basis functions based on Jacobi polynomials. The hierarchical basis leads to a linear algebraic system with a natural multilevel structure that is well suited to adaptive enrichment. The sparse linear systems were solved either by direct sparse LU factorization or by iterative methods. Two iterative methods were implemented in the software, one based on a Jacobi-preconditioned conjugate gradient and the other on a multigrid-like technique that uses the hierarchy of basis functions instead of a hierarchy of finer grids. The software was tested against analytic solutions for the Navier-Stokes equations and for channel flows through transverse

  12. Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB.

    Science.gov (United States)

    Lee, Leng-Feng; Umberger, Brian R

    2016-01-01

    Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1 degree of freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1-2 hours) using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraints Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB. 
This
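As a stand-alone illustration of the direct collocation technique the abstract describes, the sketch below transcribes a point-to-point task for a 1-degree-of-freedom double integrator (a stand-in for the musculoskeletal dynamics) into a nonlinear program via trapezoidal collocation, solved here with SciPy's SLSQP rather than the IPOPT or fmincon solvers used in the paper:

```python
import numpy as np
from scipy.optimize import minimize

N, T = 11, 1.0                 # collocation nodes and movement duration
h = T / (N - 1)

def unpack(z):
    return z[:N], z[N:2 * N], z[2 * N:]   # position, velocity, control

def defects(z):
    # Trapezoidal collocation: the dynamics x' = v, v' = u must hold
    # between adjacent nodes (imposed as equality constraints).
    x, v, u = unpack(z)
    dx = x[1:] - x[:-1] - 0.5 * h * (v[1:] + v[:-1])
    dv = v[1:] - v[:-1] - 0.5 * h * (u[1:] + u[:-1])
    return np.concatenate([dx, dv])

def boundary(z):
    # Start at rest at 0, end at rest at 1 (a discrete movement task).
    x, v, _ = unpack(z)
    return np.array([x[0], v[0], x[-1] - 1.0, v[-1]])

effort = lambda z: h * float(np.sum(unpack(z)[2] ** 2))   # control effort
z0 = np.concatenate([np.linspace(0, 1, N), np.zeros(N), np.zeros(N)])
res = minimize(effort, z0, method="SLSQP",
               constraints=[{"type": "eq", "fun": defects},
                            {"type": "eq", "fun": boundary}],
               options={"maxiter": 500})
x, v, u = unpack(res.x)
```

Because all unknown states and controls enter the optimization simultaneously, the constraint Jacobian is sparse, which is the structure IPOPT exploits for the speed advantage reported in the abstract.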

  13. Generating optimal control simulations of musculoskeletal movement using OpenSim and MATLAB

    Directory of Open Access Journals (Sweden)

    Leng-Feng Lee

    2016-01-01

    Full Text Available Computer modeling, simulation and optimization are powerful tools that have seen increased use in biomechanics research. Dynamic optimizations can be categorized as either data-tracking or predictive problems. The data-tracking approach has been used extensively to address human movement problems of clinical relevance. The predictive approach also holds great promise, but has seen limited use in clinical applications. Enhanced software tools would facilitate the application of predictive musculoskeletal simulations to clinically-relevant research. The open-source software OpenSim provides tools for generating tracking simulations but not predictive simulations. However, OpenSim includes an extensive application programming interface that permits extending its capabilities with scripting languages such as MATLAB. In the work presented here, we combine the computational tools provided by MATLAB with the musculoskeletal modeling capabilities of OpenSim to create a framework for generating predictive simulations of musculoskeletal movement based on direct collocation optimal control techniques. In many cases, the direct collocation approach can be used to solve optimal control problems considerably faster than traditional shooting methods. Cyclical and discrete movement problems were solved using a simple 1 degree of freedom musculoskeletal model and a model of the human lower limb, respectively. The problems could be solved in reasonable amounts of time (several seconds to 1–2 hours using the open-source IPOPT solver. The problems could also be solved using the fmincon solver that is included with MATLAB, but the computation times were excessively long for all but the smallest of problems. The performance advantage for IPOPT was derived primarily by exploiting sparsity in the constraints Jacobian. The framework presented here provides a powerful and flexible approach for generating optimal control simulations of musculoskeletal movement using

  14. Optimizing Chromatographic Separation: An Experiment Using an HPLC Simulator

    Science.gov (United States)

    Shalliker, R. A.; Kayillo, S.; Dennis, G. R.

    2008-01-01

    Optimization of a chromatographic separation within the time constraints of a laboratory session is practically impossible. However, by employing a HPLC simulator, experiments can be designed that allow students to develop an appreciation of the complexities involved in optimization procedures. In the present exercise, a HPLC simulator from "JCE…

  16. Agent-based model of laser hair removal: A treatment optimization and patient education tool

    Directory of Open Access Journals (Sweden)

    Eapen Bell

    2009-01-01

    Full Text Available Background: Tracking of the various parameters associated with laser hair removal is tedious and time consuming. The currently available mathematical models are not simple enough for physicians to use as a treatment optimization and patient education tool. Aim: The aim of the study was to develop a mathematical model for laser hair removal using agent-based modeling and to make a user-friendly simulation environment. Methods: The model was created using NetLogo. The hairs were modeled as agents oscillating between anagen and telogen. The variables were assigned based on published data whenever possible, and the various paths an agent could take were coded as conditional statements. The improvement was assessed using an arbitrary index which takes into account the mean diameter and pigmentation along with the number and length of hairs visible above the surface. A few of the commonly encountered scenarios were simulated using the model. Results: The model is made freely available online (http://www.gulfdoctor.net/model/lhr.htm. The limited number of simulations performed indicated that an eight-week gap between laser sessions may be more effective than a four-week gap. Conclusions: The simulation provides a reliable tool for treatment optimization and patient education, as obtaining relevant clinical data is slow and labor-intensive. Its visual interface and online availability make it useful for everyday use.
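The anagen/telogen agent cycle described above can be mimicked in a few lines. This is a hypothetical Python analogue of the NetLogo model, not a reproduction of it: the phase-switch probability, anagen fraction, and kill probability are illustrative placeholders, not clinical values:

```python
import random

def simulate(sessions, gap_weeks, n_hairs=1000, anagen_frac=0.85,
             kill_prob=0.3, weeks=52, seed=1):
    """Toy agent model: each hair cycles between anagen and telogen; a
    laser session permanently removes a fraction of hairs that happen to
    be in anagen at session time. Returns the number of surviving hairs."""
    rng = random.Random(seed)
    hairs = [{"alive": True, "anagen": rng.random() < anagen_frac}
             for _ in range(n_hairs)]
    session_weeks = {s * gap_weeks for s in range(sessions)}
    for week in range(weeks):
        for h in hairs:
            if h["alive"] and rng.random() < 0.05:   # weekly phase switch
                h["anagen"] = not h["anagen"]
        if week in session_weeks:
            for h in hairs:
                if h["alive"] and h["anagen"] and rng.random() < kill_prob:
                    h["alive"] = False
    return sum(h["alive"] for h in hairs)

remaining = simulate(sessions=6, gap_weeks=8)
```

Varying `gap_weeks` in such a model is the kind of experiment the abstract reports when comparing eight-week and four-week session gaps.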

  17. Numerical simulation of azimuth electromagnetic wave tool response based on self-adaptive FEM

    Science.gov (United States)

    Li, Hui; Shen, Yi-Ze

    2017-07-01

    Azimuth electromagnetic wave logging is a new type of electromagnetic prospecting technology. It can detect weak electromagnetic wave signals and realize real-time formation conductivity imaging. To effectively optimize the measurement accuracy of the azimuth electromagnetic wave imaging tool, an efficient numerical simulation algorithm is required. In this paper, a self-adaptive finite element method (FEM) has been used to investigate the azimuth electromagnetic wave logging tool response by adjusting the antenna array system under different geological conditions. Numerical simulation examples show the accuracy and efficiency of the method, and provide a physical interpretation of the amplitude attenuation and phase shift of the electromagnetic wave signal. Meanwhile, the high-accuracy numerical simulation results are of great value for azimuth electromagnetic wave imaging tool calibration and data interpretation.

  18. Optimized constitutive equation of material property based on inverse modeling for aluminum alloy hydroforming simulation

    Institute of Scientific and Technical Information of China (English)

    LANG Li-hui; LI Tao; ZHOU Xian-bin; B. E. KRISTENSEN; J. DANCKERT; K. B. NIELSEN

    2006-01-01

    By using aluminum alloys, the properties of the material in sheet hydroforming were obtained based on the identification of parameters for constitutive models by inverse modeling, in which the friction coefficients were also considered in 2D and 3D simulations. Using the simulation parameters identified by inverse modeling, some key process parameters in sheet hydroforming, including tool dimensions and pre-bulging, were investigated and optimized. Based on the optimized parameters, the sheet hydroforming process can be analyzed more accurately to improve the robustness of the design. The results prove that simulations based on the identified parameters are in good agreement with experiments.
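As a much-simplified stand-in for this kind of inverse identification, the sketch below fits the two parameters of a Hollomon hardening law, stress = K·strain^n, to (here synthetic, noiseless) test data by log-linear least squares. The constitutive law, K, and n values are illustrative assumptions, not the paper's identified model:

```python
import numpy as np

def fit_hollomon(strain, stress):
    """Identify K and n in stress = K * strain**n by least squares on the
    log-transformed data: ln(stress) = ln(K) + n * ln(strain)."""
    A = np.vstack([np.ones_like(strain), np.log(strain)]).T
    (logK, n), *_ = np.linalg.lstsq(A, np.log(stress), rcond=None)
    return np.exp(logK), n

eps = np.linspace(0.01, 0.2, 50)
K_true, n_true = 400.0, 0.23          # illustrative aluminum-like values
sig = K_true * eps ** n_true          # synthetic "measured" flow curve
K, n = fit_hollomon(eps, sig)
```

A full inverse-modeling loop would instead run the forming simulation for each candidate parameter set and minimize the mismatch against measured force or thickness data.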

  19. Asymptotically Optimal Simulation Budget Allocation under Fixed Confidence Level by Ordinal Optimization

    Institute of Scientific and Technical Information of China (English)

    WANG Jian-feng; SUN Chun-Lin; CHEN Yong-qing

    2004-01-01

    Ordinal optimization concentrates on isolating a subset of good designs with high probability and reduces the required simulation time dramatically for discrete-event simulation. To obtain the same probability level, we may optimally allocate our computing budget among different designs, instead of simulating all designs equally. In this paper we present an effective approach to optimally allocate the computing budget for discrete-event system simulation. While ordinal optimization can dramatically reduce the computation cost, our approach can further reduce the already-low cost.
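A common concrete form of this idea is the Optimal Computing Budget Allocation (OCBA) rule: replications are allocated in proportion to each design's noise-to-gap ratio, so close competitors of the current best receive most of the budget. The sketch below implements the standard asymptotic allocation formulas (assuming distinct sample means; ties would need special handling):

```python
import math

def ocba_allocation(means, stds, budget):
    """Split `budget` replications across designs so the probability of
    correctly selecting the best (lowest-mean) design is asymptotically
    maximized: N_i proportional to (std_i / gap_i)^2 for i != best, and
    N_best = std_best * sqrt(sum over i of (N_i / std_i)^2)."""
    b = min(range(len(means)), key=lambda i: means[i])
    ratios = [0.0] * len(means)
    for i, (m, s) in enumerate(zip(means, stds)):
        if i != b:
            ratios[i] = (s / (m - means[b])) ** 2
    ratios[b] = stds[b] * math.sqrt(sum((r / stds[i]) ** 2
                                        for i, r in enumerate(ratios)
                                        if i != b))
    total = sum(ratios)
    return [round(budget * r / total) for r in ratios]

# Three designs: the runner-up (mean 2.0) gets far more budget than the
# clearly inferior design (mean 3.0).
alloc = ocba_allocation([1.0, 2.0, 3.0], [1.0, 1.0, 1.0], 100)
```

In a sequential procedure, the means and standard deviations are re-estimated after each batch and the allocation recomputed.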

  20. Computer simulation tests of optimized neutron powder diffractometer configurations

    Energy Technology Data Exchange (ETDEWEB)

    Cussen, L.D., E-mail: Leo@CussenConsulting.com [Cussen Consulting, 23 Burgundy Drive, Doncaster 3108 (Australia); Lieutenant, K., E-mail: Klaus.Lieutenant@helmholtz-berlin.de [Helmholtz Zentrum Berlin, Hahn-Meitner Platz 1, 14109 Berlin (Germany)

    2016-06-21

    Recent work has developed a new mathematical approach to optimally choose beam elements for constant wavelength neutron powder diffractometers. This article compares Monte Carlo computer simulations of existing instruments with simulations of instruments using configurations chosen using the new approach. The simulations show that large performance improvements over current best practice are possible. The tests here are limited to instruments optimized for samples with a cubic structure which differs from the optimization for triclinic structure samples. A novel primary spectrometer design is discussed and simulation tests show that it performs as expected and allows a single instrument to operate flexibly over a wide range of measurement resolution.

  1. A NEO population generation and observation simulation software tool

    Science.gov (United States)

    Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

    One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC) which is going to aggregate measurement data from a fully-integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data is needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to be able to detect NEOs and which observation strategies work the best. Because of this, sensor simulation software was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool can be distinguished into the components ``Population Generator'' and ``Observation Simulator''. The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: the existing so-called ``Bottke Model'' (Bottke et al. 2000, 2002) and the new ``Granvik Model'' (Granvik et al. 2014, in preparation), which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropriate files for the plotting tool ``gnuplot''. The tool's Observation Simulator component yields the Observation Simulation and Observation Analysis functions.
Users can define sensor systems using ground- or space-based locations as well as

  2. Optimization of Operations Resources via Discrete Event Simulation Modeling

    Science.gov (United States)

    Joshi, B.; Morris, D.; White, N.; Unal, R.

    1996-01-01

    The resource levels required for operation and support of reusable launch vehicles are typically defined through discrete event simulation modeling. Minimizing these resources constitutes an optimization problem involving discrete variables and simulation. Conventional approaches to solve such optimization problems involving integer valued decision variables are the pattern search and statistical methods. However, in a simulation environment that is characterized by search spaces of unknown topology and stochastic measures, these optimization approaches often prove inadequate. In this paper, we have explored the applicability of genetic algorithms to the simulation domain. Genetic algorithms provide a robust search strategy that does not require continuity and differentiability of the problem domain. The genetic algorithm successfully minimized the operation and support activities for a space vehicle, through a discrete event simulation model. The practical issues associated with simulation optimization, such as stochastic variables and constraints, were also taken into consideration.
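A minimal genetic algorithm over integer decision vectors (e.g. crew or equipment counts feeding a discrete-event simulation) can be sketched as below. The objective is a placeholder; in the paper's setting, evaluating `cost` would mean running the discrete event simulation model:

```python
import random

def genetic_search(cost, n_vars, lo, hi, pop_size=30, gens=60, seed=2):
    """Toy elitist genetic algorithm over integer decision vectors."""
    rng = random.Random(seed)
    pop = [[rng.randint(lo, hi) for _ in range(n_vars)]
           for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=cost)                    # rank by simulated cost
        elite = pop[: pop_size // 2]          # elitist selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n_vars)
            child = a[:cut] + b[cut:]         # one-point crossover
            if rng.random() < 0.2:            # occasional +-1 mutation
                i = rng.randrange(n_vars)
                child[i] = min(hi, max(lo, child[i] + rng.choice((-1, 1))))
            children.append(child)
        pop = elite + children
    return min(pop, key=cost)

# Placeholder objective with a known optimum of 3 for every resource.
cost = lambda x: sum((v - 3) ** 2 for v in x)
best = genetic_search(cost, n_vars=4, lo=0, hi=10)
```

Because the search only compares candidate costs and never differentiates them, it tolerates the stochastic, non-smooth objectives typical of simulation-based optimization, which is the robustness property the abstract highlights.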

  3. Contingency Contractor Optimization Phase 3 Sustainment Requirements Document Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Bandlow, Alisa; Durfee, Justin David; Frazier, Christopher Rawls; Jones, Katherine A; Gearhart, Jared Lee

    2016-05-01

    This requirements document serves as an addendum to the Contingency Contractor Optimization Phase 2, Requirements Document [1] and Phase 3 Requirements Document [2]. The Phase 2 Requirements document focused on the high-level requirements for the tool. The Phase 3 Requirements document provided more detailed requirements to which the engineering prototype was built in Phase 3. This document will provide detailed requirements for features and enhancements being added to the production pilot in the Phase 3 Sustainment.

  4. Hygrothermal Numerical Simulation Tools Applied to Building Physics

    CERN Document Server

    Delgado, João M P Q; Ramos, Nuno M M; Freitas, Vasco Peixoto

    2013-01-01

    This book presents a critical review of the development and application of hygrothermal analysis methods to simulate the coupled transport processes of Heat, Air, and Moisture (HAM) transfer for one- or multidimensional cases. During the past few decades there has been relevant development in this field of study and an increase in the professional use of tools that simulate some of the physical phenomena that are involved in Heat, Air and Moisture conditions in building components or elements. Although a significant number of hygrothermal models are referred to in the literature, the vast majority of them are not easily available to the public outside the institutions where they were developed, which restricts the analysis of this book to only 14 hygrothermal modelling tools. The special features of this book are (a) a state-of-the-art of numerical simulation tools applied to building physics, (b) the boundary conditions importance, (c) the material properties, namely, experimental methods for the measuremen...

  5. FEM simulation of infeed rotary swaging with structured tools

    Directory of Open Access Journals (Sweden)

    Herrmann Marius

    2015-01-01

    Full Text Available Rotary swaging is an incremental cold forming process for rods and tubes. Infeed rotary swaging with structure in the reduction zone of the tools is investigated using a two dimensional finite element simulation. A few geometrical parameters are varied, for cosine and skew stairway shapes. The effective tool angle is kept constant. The influence is evaluated by the radial and axial process forces. Furthermore, the material flow is visualized by the neutral plane. The simulation results are quantitatively compared to each other to analyse the reaction force FA, which acts against the feeding force. Also, the results serve to find suitable geometries to be transferred to rotary swaging tools for practical application. It is shown that the shapes have a significant effect on the forces and the location of the neutral plane. Finally a first swaging tool is modified with an exemplary geometry for experimental investigations.

  6. Efficient Simulation and Optimal Control for Vehicle Propulsion

    OpenAIRE

    Fröberg, Anders

    2008-01-01

    Efficient drive cycle simulation of longitudinal vehicle propulsion models is an important aid for the design and analysis of power trains. Tools on the market today mainly use two different methods for such simulations, forward dynamic or quasi-static inverse simulation. Here, known theory for stable inversion of non-linear systems is used in order to combine the fast simulation times of quasi-static inverse simulation with the ability to include transient dynamics as in the forward dynamic...

  7. Status of Simulation Tools for the ILD ScECAL

    OpenAIRE

    Kotera, Katsushige; Anduze, Marc; Boudry, Vincent; Brient, Jean-Claude; Jeans, Daniel; Kawagoe, Kiyotomo; Miyamoto, Akiya; de Freitas, Paulo Mora; Musat, Gabriel; Ono, Hiroaki; Takeshita, Tohru; Uozumi, Satoru

    2010-01-01

    The scintillator-strip electromagnetic calorimeter (ScECAL) is one of the calorimeter technologies for the ILC. To achieve fine granularity from the strip-segmented layers, the strips in odd layers are orthogonal with respect to those in the even layers. In order to extract the best performance from such a detector concept, a special reconstruction method and simulation tools are being developed in the ILD collaboration. This manuscript reports the status of development of those tools.

  8. Distribution view: a tool to write and simulate distributions

    OpenAIRE

    Coelho, José; Branco, Fernando; Oliveira, Teresa

    2006-01-01

    In our work we present a tool to write and simulate distributions. This tool allows the user to write mathematical expressions which can contain not only functions and variables, but also statistical distributions, including mixtures. Each time the expression is evaluated, a value is generated for each inner distribution according to that distribution and used to determine the expression value. The inversion method can be used in this language, allowing to generate all distributions...
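The inversion method mentioned in this abstract can be sketched in a few lines. This is a generic illustration of inverse-transform sampling, using an exponential distribution as the example; it is not the tool's own code.

```python
import math
import random

def sample_exponential(rate, u=None):
    """Inverse-transform sampling: the exponential CDF is
    F(x) = 1 - exp(-rate * x), so F^{-1}(u) = -ln(1 - u) / rate."""
    if u is None:
        u = random.random()  # uniform draw on [0, 1)
    return -math.log(1.0 - u) / rate

random.seed(0)
samples = [sample_exponential(2.0) for _ in range(100_000)]
mean = sum(samples) / len(samples)  # close to 1 / rate = 0.5
```

Each time an expression containing such a distribution is evaluated, one draw like this is generated, which matches the evaluation model described in the abstract.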

  9. Applications of the Renewable Energy Network Optimization Tool

    Science.gov (United States)

    Alliss, R.; Link, R.; Apling, D.; Kiley, H.; Mason, M.; Darmenova, K.

    2010-12-01

    As the renewable energy industry continues to grow, so does the requirement for atmospheric modeling and analysis tools to maximize both wind and solar power. Renewable energy generation is variable, however, presenting challenges for electrical grid operation and requiring a variety of measures to adequately firm power. These measures include the production of non-renewable generation during times when renewables are not available. One strategy for minimizing the variability of renewable energy production is site diversity. Assuming that a network of renewable energy systems feeds a common electrical grid, site diversity ensures that when one system on the network has a reduction in generation, others on the same grid make up the difference. The site-diversity strategy can be used to mitigate the intermittency in alternative energy production systems while still maximizing saleable energy. The Renewable Energy Network Optimization Tool (ReNOT) has recently been developed to study the merits of site optimization for wind farms. The modeling system has a plug-in architecture that allows us to accommodate a wide variety of renewable energy system designs and performance metrics. The Weather Research and Forecasting (WRF) mesoscale model is applied to generate high-resolution wind databases to support the site selection of wind farms. These databases are generated on High Performance Computing systems such as the Rocky Mountain Supercomputing Center (RMSC). The databases are then accessed by ReNOT and an optimized site selection is developed. We can accommodate numerous constraints (e.g., number of sites, the geographic extent of the optimization, proximity to high-voltage transport lines, etc.). As part of our collaboration with RMSC and the State of Montana, a study was performed to estimate the optimal locations of a network of wind farms. 
Comparisons were made to four existing wind farm locations in Montana including Glacier with a 210 MW name plate capacity, Horseshoe

  10. Physics validation of detector simulation tools for LHC

    CERN Document Server

    Beringer, J

    2004-01-01

    Extensive studies aimed at validating the physics processes built into the detector simulation tools Geant4 and Fluka are in progress within all Large Hadron Collider (LHC) experiments, within the collaborations developing these tools, and within the LHC Computing Grid (LCG) Simulation Physics Validation Project, which has become the primary forum for these activities. This work includes detailed comparisons with test beam data, as well as benchmark studies of simple geometries and materials with single incident particles of various energies for which experimental data is available. We give an overview of these validation activities with emphasis on the latest results.

  11. A Coupling Tool for Parallel Molecular Dynamics-Continuum Simulations

    KAUST Repository

    Neumann, Philipp

    2012-06-01

    We present a tool for coupling Molecular Dynamics and continuum solvers. It is written in C++ and is meant to support the developers of hybrid molecular - continuum simulations in terms of both realisation of the respective coupling algorithm as well as parallel execution of the hybrid simulation. We describe the implementational concept of the tool and its parallel extensions. We particularly focus on the parallel execution of particle insertions into dense molecular systems and propose a respective parallel algorithm. Our implementations are validated for serial and parallel setups in two and three dimensions. © 2012 IEEE.

  12. Using Open Source Tools to Create a Mobile Optimized, Crowdsourced Translation Tool

    Directory of Open Access Journals (Sweden)

    Evviva Weinraub Lajoie

    2014-04-01

    Full Text Available In late 2012, OSU Libraries and Press partnered with Maria's Libraries, an NGO in rural Kenya, to provide users the ability to crowdsource translations of folk tales and existing children's books into a variety of African languages, sub-languages, and dialects. Together, these two organizations have been creating a mobile optimized platform using open source libraries such as Wink Toolkit (a library which provides mobile-friendly interaction from a website) and Globalize3 to allow for multiple translations of database entries in a Ruby on Rails application. Research regarding successes of similar tools has been utilized in providing a consistent user interface. The OSU Libraries & Press team delivered a proof-of-concept tool that has the opportunity to promote technology exploration, improve early childhood literacy, change the way we approach foreign language learning, and provide opportunities for cost-effective, multi-language publishing.

  13. Automating Initial Guess Generation for High Fidelity Trajectory Optimization Tools

    Science.gov (United States)

    Villa, Benjamin; Lantoine, Gregory; Sims, Jon; Whiffen, Gregory

    2013-01-01

    Many academic studies in spaceflight dynamics rely on simplified dynamical models, such as restricted three-body models or averaged forms of the equations of motion of an orbiter. In practice, the end result of these preliminary orbit studies needs to be transformed into more realistic models, in particular to generate good initial guesses for high-fidelity trajectory optimization tools like Mystic. This paper reviews and extends some of the approaches used in the literature to perform such a task, and explores the inherent trade-offs of such a transformation with a view toward automating it for the case of ballistic arcs. Sample test cases in the libration point regimes and small body orbiter transfers are presented.

  14. Development and optimization of FJP tools and their practical verification

    Science.gov (United States)

    Messelink, Wilhelmus A. C. M.; Waeger, Reto; Meeder, Mark; Looser, Herbert; Wons, Torsten; Heiniger, Kurt C.; Faehnle, Oliver W.

    2005-09-01

    This article presents the recent achievements with Jules Verne, a sub-aperture polishing technique closely related to Fluid Jet Polishing [1]. Whereas FJP typically applies a nozzle stand-off distance of millimeters to centimeters, JV uses a stand-off distance down to 50 μm. The objective is to generate a non-directional fluid flow parallel to the surface, which is specifically suited to reduce the surface roughness [2, 3]. Different characteristic Jules Verne nozzle geometries have been designed and numerically simulated using Computational Fluid Dynamics (CFD). To verify these simulations, the flow of fluid and particles of these nozzles has been visualized in a measurement setup developed specifically for this purpose. A simplified JV nozzle geometry is positioned in a measurement setup and the gap between tool and surface has been observed by an ICCD camera. In order to be able to visualize the motion of the abrasives, the particles have been coated with fluorescence. Furthermore, these nozzles have been manufactured and tested in a practical environment using a modified polishing machine. The results of these laboratory and practical tests are presented and discussed, demonstrating that the CFD simulations are in good agreement with the experiments. It was possible to qualitatively predict the material removal on the processed glass surface, due to the implementation of appropriate erosion models [4, 5] in the CFD software.

  15. Electric Vehicle Scenario Simulator Tool for Smart Grid Operators

    OpenAIRE

    Hugo Morais; Zita Vale; João Soares; Cristina Lobo; Bruno Canizes

    2012-01-01

    This paper presents a simulator for electric vehicles in the context of smart grids and distribution networks. It aims to support network operators’ planning and operations but can be used by other entities for related studies. The paper describes the parameters supported by the current version of the Electric Vehicle Scenario Simulator (EVeSSi) tool and its current algorithm. EVeSSi enables the definition of electric vehicles scenarios on distribution networks using a built-in movement engin...

  17. GMOseek: a user friendly tool for optimized GMO testing.

    Science.gov (United States)

    Morisset, Dany; Novak, Petra Kralj; Zupanič, Darko; Gruden, Kristina; Lavrač, Nada; Žel, Jana

    2014-08-01

    With the increasing pace of new Genetically Modified Organisms (GMOs) authorized or in the pipeline for commercialization worldwide, the task of the laboratories in charge of testing the compliance of food, feed or seed samples with the relevant regulations has become difficult and costly. Many of them have already adopted the so-called "matrix approach" to rationalize the resources and efforts used and to increase their efficiency within a limited budget. Most of the time, the "matrix approach" is implemented using limited information and a proprietary computational tool (if any) to efficiently use the available data. The developed GMOseek software is designed to support decision making in all phases of routine GMO laboratory testing, including the interpretation of wet-lab results. The tool makes use of a tabulated matrix of GM events and their genetic elements, of the laboratory analysis history, and of the available information about the sample at hand. The tool uses an optimization approach to suggest the screening assays best suited to the given sample. The practical GMOseek user interface allows the user to customize the search for a cost-efficient combination of screening assays to be employed on a given sample. It further guides the user in selecting appropriate analyses to determine the presence of individual GM events in the analyzed sample, and it helps in taking a final decision regarding the GMO composition of the sample. GMOseek can also be used to evaluate new, previously unused GMO screening targets and to estimate the profitability of developing new GMO screening methods. The presented freely available software tool offers GMO testing laboratories the possibility to select combinations of assays (e.g. quantitative real-time PCR tests) needed for their task, by allowing the expert to express his/her preferences in terms of multiplexing and cost. 
The utility of GMOseek is exemplified by analyzing selected food, feed and seed samples from a national reference
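The "matrix approach" described above can be read as a cost-aware set-cover problem: pick the cheapest combination of screening assays whose detected GM events jointly cover everything that must be tested. A minimal greedy sketch, with entirely hypothetical assay names, costs, and coverage (not GMOseek's actual matrix or algorithm):

```python
# Hypothetical assay -> (cost, set of GM events detected) matrix.
# Names and values are illustrative only, not real GMOseek data.
assays = {
    "p35S":  (1.0, {"MON810", "Bt11", "GA21", "NK603"}),
    "tNOS":  (1.0, {"MON810", "GT73", "MON89788"}),
    "bar":   (1.5, {"T25", "Bt11"}),
    "pat":   (1.5, {"T25", "GA21"}),
    "epsps": (2.0, {"GT73", "NK603", "MON89788"}),
}
targets = {"MON810", "Bt11", "GA21", "NK603", "T25", "GT73", "MON89788"}

def greedy_assay_selection(assays, targets):
    """Pick assays one by one, each time choosing the best
    cost-per-newly-covered-event ratio (greedy weighted set cover)."""
    uncovered, chosen = set(targets), []
    while uncovered:
        name, (cost, events) = min(
            ((n, a) for n, a in assays.items() if a[1] & uncovered),
            key=lambda item: item[1][0] / len(item[1][1] & uncovered),
        )
        chosen.append(name)
        uncovered -= events
    return chosen

plan = greedy_assay_selection(assays, targets)
```

A real implementation would also weigh multiplexing preferences, as the abstract notes, and greedy selection only approximates the true cost optimum.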

  18. Solid-state-drives (SSDs) modeling simulation tools & strategies

    CERN Document Server

    2017-01-01

    This book introduces simulation tools and strategies for complex systems of solid-state-drives (SSDs), which consist of a flash multi-core microcontroller plus NAND flash memories. It provides a broad overview of the most popular simulation tools, with special focus on open source solutions. VSSIM, NANDFlashSim and DiskSim are benchmarked against the performance of real SSDs under different traffic workloads. The pros and cons of each simulator are analyzed, and it is clearly indicated which kind of answers each of them can give and at what price. It is explained that speed and precision do not go hand in hand, and that it is important to understand when to simulate what, and with which tool. Being able to simulate SSD performance is mandatory to meet time-to-market, together with product cost and quality. Over the last few years the authors developed an advanced simulator named "SSDExplorer" which has been used to evaluate multiple phenomena with great accuracy, from QoS (Quality Of Service) to Read Retry, fr...

  19. Urban-Climate Adaptation Tool: Optimizing Green Infrastructure

    Science.gov (United States)

    Fellows, J. D.; Bhaduri, B. L.

    2016-12-01

    Cities have an opportunity to become more resilient to future climate change, and greener, through investments made in urban infrastructure today. However, most cities lack access to the credible high-resolution climate change projections and other environmental information needed to assess and address potential vulnerabilities from future climate variability. Therefore, we present an integrated framework for developing an urban climate adaptation tool (Urban-CAT). The initial focus of Urban-CAT is to optimize the placement of green infrastructure (e.g., green roofs, porous pavements, retention basins, etc.) to better control stormwater runoff and lower the ambient urban temperature. Urban-CAT consists of four modules. Firstly, it provides climate projections at different spatial resolutions for quantifying the urban landscape. Secondly, this projected data is combined with socio-economic and other environmental data, using leading and lagging indicators, for assessing landscape vulnerability to climate extremes (e.g., urban flooding). Thirdly, a neighborhood-scale modeling approach is presented for identifying candidate areas for adaptation strategies (e.g., green infrastructure as an adaptation strategy for urban flooding). Finally, all these capabilities are made available as a web-based tool to support decision-making and communication at the neighborhood and city levels. This presentation will highlight the methods that drive each of the modules, demo some of the capabilities using Knoxville, Tennessee as a case study, and discuss the challenges of working with communities to incorporate climate change into their planning. The next step for Urban-CAT is to add capabilities to create a comprehensive climate adaptation tool, including energy, transportation, health, and other key urban services.

  20. Comparison of discrete event simulation tools in an academic environment

    Directory of Open Access Journals (Sweden)

    Mario Jadrić

    2014-12-01

    Full Text Available A new research model for simulation software evaluation is proposed, consisting of three main categories of criteria: the modeling capabilities, the simulation capabilities, and the input/output analysis possibilities of the explored tools, each with respective sub-criteria. Using the presented model, two discrete event simulation tools are evaluated in detail using a task-centred scenario. Both tools (Arena and ExtendSim) were used for teaching discrete event simulation in preceding academic years. With the aim to inspect their effectiveness and to help determine which tool is more suitable for students, i.e. academic purposes, we used a simple simulation model of entities competing for limited resources. The main goal was to measure subjective (primarily attitude) and objective indicators while using the tools on the same simulation scenario. The subjects were first-year students of Master studies in Information Management at the Faculty of Economics in Split taking a course in Business Process Simulations (BPS). In a controlled environment (a computer lab), two groups of students were given detailed, step-by-step instructions for building models using both tools, first using ExtendSim then Arena or vice versa. Subjective indicators (students' attitudes) were collected using an online survey completed immediately upon building each model. Subjective indicators primarily include students' personal estimations of Arena and ExtendSim capabilities/features for model building, model simulation and result analysis. Objective indicators were measured using specialised software that logs information on the user's behavior while performing a particular task on their computer, such as the distance crossed by the mouse during model building, the number of mouse clicks, usage of the mouse wheel, and the speed achieved. The results indicate that ExtendSim is clearly preferred over Arena with regard to subjective indicators, while the objective indicators are

  1. Ropossum: An Authoring Tool for Designing, Optimizing and Solving Cut the Rope Levels

    DEFF Research Database (Denmark)

    Shaker, Mohammad; Shaker, Noor; Togelius, Julian

    2013-01-01

    We present a demonstration of Ropossum, an authoring tool for the generation and testing of levels of the physics-based game Cut the Rope. Ropossum integrates many features: (1) automatic design of complete solvable content, (2) incorporation of the designer’s input through the creation of complete or partial designs, (3) automatic checks for playability and (4) optimization of a given design based on playability. The system includes a physics engine to simulate the game and an evolutionary framework to evolve content, as well as an AI reasoning agent to check for playability. The system is optimised...

  2. Forging tool shape optimization using pseudo inverse approach and adaptive incremental approach

    Science.gov (United States)

    Halouani, A.; Meng, F. J.; Li, Y. M.; Labergère, C.; Abbès, B.; Lafon, P.; Guo, Y. Q.

    2013-05-01

    This paper presents a simplified finite element method called the "Pseudo Inverse Approach" (PIA) for tool shape design and optimization in multi-step cold forging processes. The approach is based on knowledge of the final part shape. Some intermediate configurations are introduced and corrected by using a free surface method to consider the deformation paths without contact treatment. A robust direct algorithm of plasticity is implemented by using the equivalent stress notion and tensile curve. Numerical tests have shown that the PIA is very fast compared to the incremental approach. The PIA is used in an optimization procedure to automatically design the shapes of the preform tools. Our objective is to find the optimal preforms which minimize the equivalent plastic strain and punch force. The preform shapes are defined by B-Spline curves. A simulated annealing algorithm is adopted for the optimization procedure. The forging results obtained by the PIA are compared to those obtained by the incremental approach to show the efficiency and accuracy of the PIA.
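The optimization loop described here (simulated annealing over shape parameters such as B-spline control points) can be sketched generically. In the paper the objective would be a PIA forging evaluation of equivalent plastic strain and punch force; below, a toy quadratic with a known minimum stands in for that solver, and all parameter values are illustrative.

```python
import math
import random

def objective(ctrl_pts):
    """Stand-in for the PIA evaluation of a preform shape.
    Toy quadratic: minimum when every control-point height equals 1.0."""
    return sum((h - 1.0) ** 2 for h in ctrl_pts)

def simulated_annealing(x0, n_iter=20_000, t0=1.0, cooling=0.9995, step=0.1):
    """Classic simulated annealing: always accept improvements,
    accept worse candidates with Boltzmann probability exp(-df/t)."""
    random.seed(42)
    x, fx = list(x0), objective(x0)
    best, fbest = list(x), fx
    t = t0
    for _ in range(n_iter):
        cand = [h + random.uniform(-step, step) for h in x]
        fc = objective(cand)
        if fc < fx or random.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
            if fx < fbest:
                best, fbest = list(x), fx
        t *= cooling  # geometric cooling schedule
    return best, fbest

best, fbest = simulated_annealing([0.0] * 5)  # 5 control-point heights
```

Swapping the toy objective for a call into a forging solver reproduces the workflow the abstract describes.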

  3. Optimal design of supply chain network under uncertainty environment using hybrid analytical and simulation modeling approach

    Science.gov (United States)

    Chiadamrong, N.; Piyathanavong, V.

    2017-04-01

    Models that aim to optimize the design of supply chain networks have gained more interest in the supply chain literature. Mixed-integer linear programming and discrete-event simulation are widely used for such optimization problems. We present a hybrid approach to support decisions for supply chain network design using a combination of analytical and discrete-event simulation models. The proposed approach is based on iterative procedures that continue until the difference between subsequent solutions satisfies a pre-determined termination criterion. The effectiveness of the proposed approach is illustrated by an example, which shows results closer to optimal, with much faster solving times, than those obtained from a conventional simulation-based optimization model. The efficacy of this hybrid approach is promising and it can be applied as a powerful tool in designing real supply chain networks. It also provides the possibility to model and solve more realistic problems, which incorporate dynamism and uncertainty.

  4. Development of a framework for optimization of reservoir simulation studies

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Jiang; Delshad, Mojdeh; Sepehrnoori, Kamy [The University of Texas at Austin, Austin, TX (United States)

    2007-10-15

    We have developed a framework that distributes multiple reservoir simulations on a cluster of CPUs for fast and efficient process optimization studies. This platform utilizes several commercial reservoir simulators for flow simulations, an experimental design and a Monte Carlo algorithm with a global optimization search engine to identify the optimum combination of reservoir decision factors under uncertainty. This approach is applied to a well placement design for a field-scale development exercise. The uncertainties considered are in the fault structure, porosity and permeability, PVT, and relative permeabilities. The results indicate that the approach is practical and efficient for performing reservoir optimization studies. (author)
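The workflow this abstract describes (evaluating decision factors such as well placement against Monte Carlo realizations of uncertain reservoir properties) can be illustrated with a toy sketch. The "simulator" below is a made-up one-line function, not a real reservoir model; the point is the sample-then-rank structure.

```python
import random

def simulate_production(well_x, perm):
    """Toy stand-in for one reservoir simulation run: production peaks
    when the well sits on the high-permeability spot (here at x = perm)."""
    return 100.0 - (well_x - perm) ** 2

def expected_production(well_x, perm_samples):
    """Average the simulated outcome over all uncertainty realizations."""
    return sum(simulate_production(well_x, p) for p in perm_samples) / len(perm_samples)

random.seed(7)
# Monte Carlo realizations of the uncertain permeability "sweet spot".
perm_samples = [random.gauss(5.0, 1.0) for _ in range(200)]

# Rank candidate well locations by expected production under uncertainty.
candidates = [2.0, 4.0, 5.0, 6.0, 8.0]
best = max(candidates, key=lambda x: expected_production(x, perm_samples))
```

The framework in the paper distributes the inner simulation calls across a CPU cluster and drives the outer search with experimental design plus a global optimizer, rather than enumerating a fixed candidate list.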

  5. Field sampling scheme optimization using simulated annealing

    CSIR Research Space (South Africa)

    Debba, Pravesh

    2010-10-01

    Full Text Available to derive optimal sampling schemes. 2. Hyperspectral remote sensing In the study of electro-magnetic physics, when energy in the form of light interacts with a material, part of the energy at certain wavelength is absorbed, transmitted, emitted...

  6. CLAIRE, an event-driven simulation tool for testing software

    Energy Technology Data Exchange (ETDEWEB)

    Raguideau, J.; Schoen, D.; Henry, J.Y.; Boulc`h, J.

    1994-06-15

    CLAIRE is a software tool created to perform validations on executable codes or on specifications of distributed real-time applications for nuclear safety. CLAIRE can be used both to verify the safety properties by modelling the specifications, and also to validate the final code by simulating the behaviour of its equipment and software interfaces. It can be used to observe and provide dynamic control of the simulation process, and also to record changes to the simulated data for off-line analysis. (R.P.).

  7. NavLab, a Generic Simulation and Post-processing Tool for Navigation

    Directory of Open Access Journals (Sweden)

    Kenneth Gade

    2005-07-01

    Full Text Available The ambition of getting one common tool for a great variety of navigation tasks was the background for the development of NavLab (Navigation Laboratory). The main emphasis during the development has been a solid theoretical foundation with a stringent mathematical representation to ensure that statistical optimality is maintained throughout the entire system. NavLab is implemented in Matlab and consists of a simulator and an estimator.
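Estimators of the kind this abstract emphasizes (statistically optimal navigation filters) are typically built on the Kalman recursion. A minimal scalar sketch of that recursion, assuming a random-walk state model; this is the textbook algorithm, not NavLab's actual implementation:

```python
def kalman_1d(measurements, q=0.01, r=1.0, x0=0.0, p0=1.0):
    """Minimal scalar Kalman filter.
    q: process noise variance, r: measurement noise variance,
    x0/p0: prior state estimate and its variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                  # predict: state uncertainty grows
        k = p / (p + r)            # Kalman gain
        x = x + k * (z - x)        # update with the measurement residual
        p = (1.0 - k) * p          # posterior variance shrinks
        estimates.append(x)
    return estimates

est = kalman_1d([1.2, 0.8, 1.1, 0.9, 1.0])
```

The gain k weights measurement against prediction exactly in proportion to their variances, which is where the "statistical optimality" of such estimators comes from.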

  8. statnet: Software Tools for the Representation, Visualization, Analysis and Simulation of Network Data

    Directory of Open Access Journals (Sweden)

    Mark S. Handcock

    2007-12-01

    Full Text Available statnet is a suite of software packages for statistical network analysis. The packages implement recent advances in network modeling based on exponential-family random graph models (ERGMs). The components of the package provide a comprehensive framework for ERGM-based network modeling, including tools for model estimation, model evaluation, model-based network simulation, and network visualization. This broad functionality is powered by a central Markov chain Monte Carlo (MCMC) algorithm. The coding is optimized for speed and robustness.

  9. CPN Tools for Editing, Simulating, and Analysing Coloured Petri Nets

    DEFF Research Database (Denmark)

    Ratzer, Anne Vinter; Wells, Lisa Marie; Lassen, Henry Michael

    2003-01-01

    elements. The tool features incremental syntax checking and code generation which take place while a net is being constructed. A fast simulator efficiently handles both untimed and timed nets. Full and partial state spaces can be generated and analysed, and a standard state space report contains...

  10. TNO-ADVANCE: a modular powertrain simulation and design tool

    NARCIS (Netherlands)

    Venne, J.W.C. van de; Smokers, R.T.M.

    2000-01-01

    To support its activities in the field of conventional and hybrid vehicles, TNO has developed ADVANCE, a modular simulation tool for the design and evaluation of advanced powertrains. In this paper the various features and the potential of ADVANCE are described and illustrated by means of three case

  11. Simulation modeling: a powerful tool for process improvement.

    Science.gov (United States)

    Boxerman, S B

    1996-01-01

    Simulation modeling provides an efficient means of examining the operation of a system under a variety of alternative conditions. This tool can potentially enhance a benchmarking project by providing a means for evaluating proposed modifications to the system or process under study.

  12. Simulation as a tool for managing Ebola infection

    Directory of Open Access Journals (Sweden)

    Somsri Wiwanitkit

    2015-06-01

    Full Text Available Ebola virus disease is a deadly infection of present concern. The outbreak in Africa has become a great concern globally. Several attempts have been launched to manage the problem since its first appearance in Africa. The use of simulation as a tool to manage the problem is very interesting. In this short article, the author briefly summarizes and discusses this topic.

  13. Simulation training tools for nonlethal weapons using gaming environments

    Science.gov (United States)

    Donne, Alexsana; Eagan, Justin; Tse, Gabriel; Vanderslice, Tom; Woods, Jerry

    2006-05-01

    Modern simulation techniques have a growing role in evaluating new technologies and in developing cost-effective training programs. A mission simulator facilitates the productive exchange of ideas by demonstrating concepts through compellingly realistic computer simulation. Revolutionary advances in 3D simulation technology have made it possible for desktop computers to process strikingly realistic and complex interactions with results depicted in real-time. Computer games now allow multiple real human players and "artificially intelligent" (AI) simulated robots to play together. Advances in computer processing power have compensated for the inherently intensive calculations required for complex simulation scenarios. The main components of the leading game engines have been released for user modifications, enabling game enthusiasts and amateur programmers to advance the state-of-the-art in AI and computer simulation technologies. It is now possible to simulate sophisticated and realistic conflict situations in order to evaluate the impact of non-lethal devices as well as conflict resolution procedures using such devices. Simulations can reduce training costs as end users: learn what a device does and doesn't do prior to use, understand responses to the device prior to deployment, determine if the device is appropriate for their situational responses, and train with new devices and techniques before purchasing hardware. This paper will present the status of SARA's mission simulation development activities, based on the Half-Life game engine, for the purpose of evaluating the latest non-lethal weapon devices and developing training tools for such devices.

  14. Optimizing reversible simulation of injective functions

    DEFF Research Database (Denmark)

    Yokoyama, Tetsuo; Axelsen, Holger Bock; Glück, Robert

    2012-01-01

    of the computation and uncomputation steps for a class of injective programs. A practical consequence is that the reversible simulation runs twice as fast as Bennett’s simulation. The proposed method is demonstrated by developing lossless encoders and decoders for run-length encoding and range coding. The range...
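The run-length example mentioned in this abstract is easy to make concrete: an encoder/decoder pair where the encoder is injective on its domain, so the decoder recovers the input exactly. A minimal sketch (not the paper's reversible-language implementation):

```python
def rle_encode(s):
    """Run-length encode a string into (char, count) pairs.
    Injective: distinct inputs always produce distinct run lists."""
    runs = []
    for ch in s:
        if runs and runs[-1][0] == ch:
            runs[-1] = (ch, runs[-1][1] + 1)  # extend the current run
        else:
            runs.append((ch, 1))              # start a new run
    return runs

def rle_decode(runs):
    """Exact inverse of rle_encode: no information is lost."""
    return "".join(ch * n for ch, n in runs)

msg = "aaabccccd"
encoded = rle_encode(msg)
assert rle_decode(encoded) == msg
```

Because the function is injective, a reversible simulation of it need not keep the input around as garbage data, which is precisely the class of programs the paper optimizes.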

  15. Beam Delivery Simulation - Recent Developments and Optimization

    CERN Document Server

    AUTHOR|(INSPIRE)INSPIRE-00232566; Boogert, Stewart Takashi; Garcia-Morales, H; Gibson, Stephen; Kwee-Hinzmann, Regina; Nevay, Laurence James; Deacon, Lawrence Charles

    2015-01-01

    Beam Delivery Simulation (BDSIM) is a particle tracking code that simulates the passage of particles through both the magnetic accelerator lattice as well as their interaction with the material of the accelerator itself. The Geant4 toolkit is used to give a full range of physics processes needed to simulate both the interaction of primary particles and the production and subsequent propagation of secondaries. BDSIM has already been used to simulate linear accelerators such as the International Linear Collider (ILC) and the Compact Linear Collider (CLIC), but it has recently been adapted to simulate circular accelerators as well, producing loss maps for the Large Hadron Collider (LHC). In this paper the most recent developments, which extend BDSIM’s functionality as well as improve its efficiency are presented. Improvement and refactorisation of the tracking algorithms are presented alongside improved automatic geometry construction for increased particle tracking speed.

  16. A Survey of Stochastic Simulation and Optimization Methods in Signal Processing

    Science.gov (United States)

    Pereyra, Marcelo; Schniter, Philip; Chouzenoux, Emilie; Pesquet, Jean-Christophe; Tourneret, Jean-Yves; Hero, Alfred O.; McLaughlin, Steve

    2016-03-01

    Modern signal processing (SP) methods rely very heavily on probability and statistics to solve challenging SP problems. SP methods are now expected to deal with ever more complex models, requiring ever more sophisticated computational inference techniques. This has driven the development of statistical SP methods based on stochastic simulation and optimization. Stochastic simulation and optimization algorithms are computationally intensive tools for performing statistical inference in models that are analytically intractable and beyond the scope of deterministic inference methods. They have been recently successfully applied to many difficult problems involving complex statistical models and sophisticated (often Bayesian) statistical inference techniques. This survey paper offers an introduction to stochastic simulation and optimization methods in signal and image processing. The paper addresses a variety of high-dimensional Markov chain Monte Carlo (MCMC) methods as well as deterministic surrogate methods, such as variational Bayes, the Bethe approach, belief and expectation propagation and approximate message passing algorithms. It also discusses a range of optimization methods that have been adopted to solve stochastic problems, as well as stochastic methods for deterministic optimization. Subsequently, areas of overlap between simulation and optimization, in particular optimization-within-MCMC and MCMC-driven optimization are discussed.
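The simplest of the MCMC methods such surveys cover is random-walk Metropolis, which needs only the target's log-density up to a constant. A minimal sketch, here targeting a standard normal (illustrative parameters, not from the survey):

```python
import math
import random

def metropolis(log_target, x0=0.0, n=50_000, step=1.0, seed=1):
    """Random-walk Metropolis sampler.
    Proposes uniform perturbations and accepts with probability
    min(1, target(cand)/target(x)), computed in log space."""
    rng = random.Random(seed)
    x, lx = x0, log_target(x0)
    chain = []
    for _ in range(n):
        cand = x + rng.uniform(-step, step)
        lc = log_target(cand)
        # Accept upward moves always, downward moves probabilistically.
        if lc >= lx or math.log(rng.random()) < lc - lx:
            x, lx = cand, lc
        chain.append(x)
    return chain

# Target: standard normal, log-density known only up to a constant.
chain = metropolis(lambda v: -0.5 * v * v)
mean = sum(chain) / len(chain)
var = sum((v - mean) ** 2 for v in chain) / len(chain)
```

The chain's empirical mean and variance approach the target's 0 and 1; the optimization-within-MCMC ideas the survey discusses build on exactly this kind of kernel.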

  17. Serious Games and Simulation as Tools for Education

    Directory of Open Access Journals (Sweden)

    Luca Mori

    2012-06-01

    Full Text Available The increasing adoption of computer-based "serious games" as digital tools for education requires addressing the question of the role of simulation in the teaching and learning process. Whereas many recent studies have stressed the benefits of digital games in a variety of learning contexts, this paper approaches the problem of the misuse and limitations of computer-based simulations, and argues that we still need to understand when a digital serious game is actually better than other non-computer-based simulation experiences. Considering that the distinction between the two types of simulation does not mean that they are incompatible, the final question that I address concerns the best ways to combine computer-based and non-computer-based simulation techniques.

  18. Electric Vehicle Scenario Simulator Tool for Smart Grid Operators

    Directory of Open Access Journals (Sweden)

    Hugo Morais

    2012-06-01

    Full Text Available This paper presents a simulator for electric vehicles in the context of smart grids and distribution networks. It aims to support network operators’ planning and operations, but can be used by other entities for related studies. The paper describes the parameters supported by the current version of the Electric Vehicle Scenario Simulator (EVeSSi) tool and its current algorithm. EVeSSi enables the definition of electric vehicle scenarios on distribution networks using a built-in movement engine. The scenarios created with EVeSSi can be used by external tools (e.g., power flow) for specific analyses, for instance of grid impacts. Two scenarios are briefly presented to illustrate the simulator’s capabilities.

  19. Monte Carlo Simulations of Neutron Oil well Logging Tools

    CERN Document Server

    Azcurra, M

    2002-01-01

    Monte Carlo simulations of simple neutron oil well logging tools in typical geological formations are presented. The simulated tools consist of both 14 MeV pulsed and continuous Am-Be neutron sources, with time-gated and continuous gamma ray detectors respectively. The geological formation consists of pure limestone with 15% absolute porosity over a wide range of oil saturation. The particle transport was performed with the Monte Carlo N-Particle Transport Code System, MCNP-4B. Several gamma ray spectra were obtained at the detector position that allow composition analysis of the formation to be performed. In particular, the C/O ratio was analyzed as an indicator of oil saturation. Further calculations are proposed to simulate actual detector responses in order to contribute to understanding the relation between the detector response and the formation composition.

  20. Free and open source simulation tools for the design of power processing units for photovoltaic systems

    Directory of Open Access Journals (Sweden)

    Sergio Morales-Hernández

    2015-06-01

    Full Text Available Renewable energy sources, including solar photovoltaic, require electronic circuits that serve as an interface between the transducer device and the device or system that uses the energy. Moreover, the energy efficiency and the cost of the system can be compromised if this electronic circuit is not designed properly. Given that the electrical characteristics of photovoltaic devices are nonlinear and that the most efficient electronic circuits for power processing are naturally discontinuous, a detailed dynamic analysis is required to optimize the design. This analysis should be supported by computer simulation tools. In this paper a comparison between two software tools for dynamic system simulation is performed to determine their usefulness in the design process of photovoltaic systems, particularly for the power processing units. Using a photovoltaic system for battery charging as a case study, it was determined that the Scicoslab tool was the most suitable.

  1. Flow Simulation and Optimization of Plasma Reactors for Coal Gasification

    Science.gov (United States)

    Ji, Chunjun; Zhang, Yingzi; Ma, Tengcai

    2003-10-01

    This paper reports a 3-D numerical simulation system to analyze the complicated flow in plasma reactors for coal gasification, which involves complex chemical reactions, two-phase flow, and plasma effects. On the basis of the analytic results, the distributions of density, temperature, and component concentrations are obtained, and a different plasma reactor configuration is proposed to optimize the flow parameters. The numerical simulation results show an improved conversion ratio for the coal gasification. Different kinds of chemical reaction models are used to simulate the complex flow inside the reactor. It can be concluded that the numerical simulation system can be very useful for the design and optimization of plasma reactors.

  2. Optimizing the integrated design of boilers - simulation

    DEFF Research Database (Denmark)

    Sørensen, Kim; Karstensen, Claus M. S.; Condra, Thomas Joseph

    2004-01-01

    .) it is important to see the 3 components as an integrated unit and optimize these as such. This means that the burner must be designed and optimized exactly to the pressure part where it is utilized, the control system must have a configuration optimal for the pressure part and burner where it is utilized etc....... Traditionally boiler control systems have been designed in a rather simple manner consisting of a feed water controller and a pressure controller; two controllers which, in principle, operated without any interaction - for more details on boiler control see [4]. During the last year Aalborg Industries A/S has...... that are difficult to estimate/calculate have (on the basis of the tests) been determined by means of a least-square data fitting, the minimums have been found by means of a Gauss-Newton algorithm and physically verified afterwards. The dynamic boiler model will be applied for developing controllers and adapting...

  3. Temperature variable optimization for precision machine tool thermal error compensation on optimal threshold

    Science.gov (United States)

    Zhang, Ting; Ye, Wenhua; Liang, Ruijun; Lou, Peihuang; Yang, Xiaolan

    2013-01-01

    Machine tool thermal error is an important cause of poor machining accuracy, and thermal error compensation is a primary technology for accuracy control. To build a thermal error model, temperature variables need to be divided into several groups on an appropriate threshold. Currently, the group threshold value is mainly determined by the researcher's experience, and few studies focus on the group threshold in temperature variable grouping. Since the threshold is important in error compensation, this paper aims to find an optimal threshold to realize temperature variable optimization in thermal error modeling. Firstly, the correlation coefficient is used to express the membership grade of temperature variables, and the theory of fuzzy transitive closure is applied to obtain the relational matrix of the temperature variables. Concepts such as compact degree and separable degree are introduced, and an evaluation model of temperature variable clustering is built. The optimal threshold and the best temperature variable clustering can be obtained by setting the maximum value of the evaluation model as the objective. Finally, correlation coefficients between the temperature variables and the thermal error are calculated in order to find the optimum temperature variables for thermal error modeling. An experiment was conducted on a precision horizontal machining center, with three displacement sensors measuring spindle thermal error and twenty-nine temperature sensors detecting the machining center temperature. The experimental result shows that the new method of temperature variable optimization on an optimal threshold successfully worked out a best threshold value interval and chose seven temperature variables from the twenty-nine temperature measuring points. The model residual in the z direction is within 3 μm. The proposed variable optimization method has a simple computing process and good modeling accuracy, which makes it well suited for thermal error compensation.
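
The grouping step described above can be illustrated with a simplified sketch: pairwise correlation coefficients are computed between sensor series, and variables are merged transitively whenever their correlation exceeds a chosen threshold. This is a hypothetical simplification of the paper's fuzzy transitive-closure clustering; the sensor series and the threshold value are invented for the demonstration.

```python
def correlation(a, b):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

def group_variables(series, threshold):
    """Union variables transitively whenever pairwise |corr| >= threshold."""
    n = len(series)
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if abs(correlation(series[i], series[j])) >= threshold:
                parent[find(i)] = find(j)
    groups = {}
    for i in range(n):
        groups.setdefault(find(i), []).append(i)
    return sorted(groups.values())

# Three synthetic temperature series: sensors 0 and 1 track each other,
# sensor 2 is unrelated to both.
t = range(10)
s0 = [2.0 * x for x in t]
s1 = [2.0 * x + 0.1 for x in t]
s2 = [5.0, -1.0, 4.0, 0.0, 3.0, -2.0, 6.0, 1.0, 2.0, 0.5]
clusters = group_variables([s0, s1, s2], threshold=0.9)
```

Sweeping the threshold and scoring each resulting partition (the paper's compact/separable evaluation) would then select the best grouping; here a fixed threshold of 0.9 puts sensors 0 and 1 into one group and leaves sensor 2 alone.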

  4. LOGISTICS ALTERNATIVE IN SIMULATION AND OPTIMIZATION THE SUPPLY CHAIN

    Directory of Open Access Journals (Sweden)

    Liliana CONDRATCHI

    2013-01-01

    Full Text Available The purpose of this work is to present a general framework to support operational decisions for supply chain networks using a combination of an optimization model and discrete-event simulation. The simulation model includes nonlinear and stochastic elements, whereas the optimization model represents a simplified version. Based on initial simulation runs, cost parameters, production times, and transportation times are estimated for the optimization model. The solutions of the optimization model are translated into decision rules for the discrete-event simulation. This procedure is applied iteratively until the difference between subsequent solutions is small enough. The method is applied successfully to several test examples and is shown to deliver competitive results much faster than conventional mixed-integer models in a stochastic environment. It provides the possibility to model and solve more realistic problems (incorporating dynamism and uncertainty) in an acceptable way.
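
The iterative loop described above (simulate, re-estimate parameters, re-optimize, repeat until solutions stabilize) can be sketched in miniature. Here the "simulation" is a noisy quadratic cost and the "optimization model" is a gradient step on the estimated parameters; both are invented stand-ins, not the paper's supply chain models.

```python
import random

def simulate_cost(x, rng):
    # Stochastic "simulation": true cost (x - 7)^2 + 10, plus observation noise.
    return (x - 7.0) ** 2 + 10.0 + rng.gauss(0.0, 0.5)

def estimate_gradient(x, rng, h=0.5, reps=30):
    # Estimate the local cost slope from replicated simulation runs.
    up = sum(simulate_cost(x + h, rng) for _ in range(reps)) / reps
    down = sum(simulate_cost(x - h, rng) for _ in range(reps)) / reps
    return (up - down) / (2 * h)

def simulation_optimization(x0=0.0, step=0.4, tol=0.05, max_iter=100, seed=1):
    rng = random.Random(seed)
    x = x0
    for _ in range(max_iter):
        g = estimate_gradient(x, rng)   # "simulation" phase
        x_new = x - step * g            # simplified "optimization" phase
        if abs(x_new - x) < tol:        # subsequent solutions close enough
            return x_new
        x = x_new
    return x

best = simulation_optimization()        # should settle near x = 7
```

The decision variable converges to the neighborhood of the true optimum even though each cost evaluation is noisy, which is the essence of alternating a cheap simplified optimization with an expensive stochastic simulation.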

  5. Robust Optimization in Simulation : Taguchi and Response Surface Methodology

    NARCIS (Netherlands)

    Dellino, G.; Kleijnen, J.P.C.; Meloni, C.

    2008-01-01

    Optimization of simulated systems is tackled by many methods, but most methods assume known environments. This article, however, develops a 'robust' methodology for uncertain environments. This methodology uses Taguchi's view of the uncertain world, but replaces his statistical techniques by

  6. Initiation Style Optimization of Aimed Warhead by Numerical Simulation

    Institute of Scientific and Technical Information of China (English)

    WEI Ji-feng; LI Na; WEN Yu-quan; WANG Wen-jie

    2008-01-01

    The kill characteristics of an aimed warhead were studied. With emphasis on the improvement of the initiation system, experiments and three-dimensional numerical investigations were carried out. Simulation results for the side three-initiation-point scheme fit the experiments well. The optimal initiation style was obtained through further simulation. It shows that the effective fragments and the effective kill energy of the optimal scheme increase by 12.8% and 10.1% respectively.

  7. Localized Overheating Phenomena and Optimization of Spark-Plasma Sintering Tooling Design

    Directory of Open Access Journals (Sweden)

    Darold G. Martin

    2013-06-01

    Full Text Available The present paper shows the application of a three-dimensional coupled electrical, thermal, mechanical finite element macro-scale modeling framework of Spark Plasma Sintering (SPS) to an actual problem of SPS tooling overheating encountered during SPS experimentation. The overheating phenomenon is analyzed by varying the geometry of the tooling that exhibits the problem, namely by modeling various tooling configurations involving sequences of disk-shaped spacers with step-wise increasing radii. The analysis is conducted by means of finite element simulations, intended to obtain temperature spatial distributions in the graphite press-forms, including punches, dies, and spacers; to identify the temperature peaks and their respective timing; and to propose a more suitable SPS tooling configuration with the avoidance of the overheating as the final aim. Electric current-based Joule heating, heat transfer, mechanical conditions, and densification are embedded in the model, utilizing the finite-element software COMSOL™, which possesses a distinguishing ability to couple multiple physics. Thereby the implementation of a finite element method applicable to a broad range of SPS procedures is carried out, together with the more specific optimization of the SPS tooling design when dealing with excessive heating phenomena.

  8. Elektrisk Design og Styring. Simulation Platform to Model, Optimize and Design Wind Turbines

    DEFF Research Database (Denmark)

    Iov, Florin; Hansen, A. D.; Soerensen, P.;

    This report is a general overview of the results obtained in the project “Electrical Design and Control. Simulation Platform to Model, Optimize and Design Wind Turbines”. The report is structured in six chapters. First, the background of this project and the main goals as well as the structure...... of the simulation platform is given. The main topologies for wind turbines, which have been taken into account during the project, are briefly presented. Then, the considered simulation tools, namely HAWC, DIgSILENT, Saber and Matlab/Simulink, which have been used in this simulation platform, are described. The focus here...... is on the modelling and simulation time scale aspects. The abilities of these tools are complementary and together they can cover all the modelling aspects of wind turbines, e.g. mechanical loads, power quality, switching, control and grid faults. New models and new control algorithms for wind turbine systems have...

  9. Design of GNSS Performance Analysis and Simulation Tools as a Web Portal

    Directory of Open Access Journals (Sweden)

    S. Tadic

    2014-11-01

    Full Text Available This paper considers the design of a web portal for the validation of the behavior of GNSS applications in different environments. The tool provides positioning performance analysis and a comparison to benchmark devices. The web portal incorporates a 3D synthetic data generator to compute the propagation and reception of radio-navigation signals in a 3D virtual environment. This radio propagation simulator uses ray-tracing to calculate interactions between the GNSS signal and the local environment. For faster execution on a GPU platform, the simulator uses BVH optimization. The work is verified in field trials and by using reference software.

  10. An optimal inventory policy under certainty distributed demand for cutting tools with stochastically distributed lifespan

    National Research Council Canada - National Science Library

    Li, Cun Rong; Cheng, Jiadong

    2015-01-01

    .... An optimal inventory policy with general demand (OIPGD) was developed with which the allowable stopping time for tools, order-up-to-level inventory, and order cycle can be optimally determined by an exhaustive...

  11. Managing Retention Use of Simulation and Optimization

    Science.gov (United States)

    2010-01-01

    [Briefing-slide residue: the record lists Navy force-shaping tools for managing losses and excesses, including High Year Tenure (E4-E6), Temporary Early Retirement Authority, Perform to Serve, Selective Reenlistment Bonus, Selective Early Retirement, advancement planning, and pay and incentive policies, shown against nominal and actual length-of-service profiles.]

  12. Transmission network expansion planning with simulation optimization

    Energy Technology Data Exchange (ETDEWEB)

    Bent, Russell W [Los Alamos National Laboratory; Berscheid, Alan [Los Alamos National Laboratory; Toole, G. Loren [Los Alamos National Laboratory

    2010-01-01

    Within the electric power literature the transmission expansion planning problem (TNEP) refers to the problem of how to upgrade an electric power network to meet future demands. As this problem is a complex, non-linear, and non-convex optimization problem, researchers have traditionally focused on approximate models. Often, their approaches are tightly coupled to the approximation choice. Until recently, these approximations have produced results that are straightforward to adapt to the more complex (real) problem. However, the power grid is evolving towards a state where the adaptations are no longer easy (i.e. large amounts of limited control, renewable generation) that necessitates new optimization techniques. In this paper, we propose a generalization of the powerful Limited Discrepancy Search (LDS) that encapsulates the complexity in a black box that may be queried for information about the quality of a proposed expansion. This allows the development of a new optimization algorithm that is independent of the underlying power model.
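
The black-box flavor of Limited Discrepancy Search described above can be sketched as follows: explore only plans that deviate from a heuristic's preferred choices in at most k positions, scoring each candidate through an opaque evaluator. The binary decision encoding, the toy evaluator, and all names are illustrative assumptions, not the authors' implementation.

```python
def lds_assignments(depth, k, preferred):
    """All 0/1 assignments deviating from `preferred` in at most k positions."""
    def rec(i, remaining):
        if i == depth:
            yield ()
            return
        for tail in rec(i + 1, remaining):        # follow the heuristic
            yield (preferred[i],) + tail
        if remaining > 0:                         # spend one discrepancy
            for tail in rec(i + 1, remaining - 1):
                yield (1 - preferred[i],) + tail
    return rec(0, k)

def lds_search(depth, k, preferred, evaluate):
    best, best_val = None, float("inf")
    for plan in lds_assignments(depth, k, preferred):
        val = evaluate(plan)                      # black-box quality query
        if val < best_val:
            best, best_val = plan, val
    return best, best_val

# Toy black box: cost is the number of mismatches against a hidden optimum.
hidden = (1, 0, 1, 1)
preferred = (1, 0, 0, 1)                          # heuristic is wrong in one slot
best, val = lds_search(4, 1, preferred,
                       lambda p: sum(a != b for a, b in zip(p, hidden)))
```

With one allowed discrepancy the search visits only 5 of the 16 plans yet still recovers the hidden optimum, since the heuristic errs in a single position.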

  13. Parametric Optimization and Prediction Tool for Excavation and Prospecting Tasks Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Honeybee Robotics therefore proposed to develop a software tool for facilitating prospecting and excavation system trades in support of selecting an optimal...

  14. A framework for response surface methodology for simulation optimization

    NARCIS (Netherlands)

    H.G. Neddermeijer; G.J. van Oortmarssen (Gerrit); N. Piersma (Nanda); R. Dekker (Rommert)

    2000-01-01

    We develop a framework for automated optimization of stochastic simulation models using Response Surface Methodology. The framework is especially intended for simulation models where the calculation of the corresponding stochastic response function is very expensive or time-consuming. Re

  15. Optimization of simulated inventory systems: OptQuest and alternatives

    NARCIS (Netherlands)

    Kleijnen, J.P.C.; Wan, J.

    2006-01-01

    This article illustrates simulation optimization through an (s, S) inventory management system. In this system, the goal function to be minimized is the expected value of specific inventory costs. Moreover, specific constraints must be satisfied for some random simulation responses, namely the service

  17. Using computer simulation for optimal staffing: A case for the patient registration process of a hospital.

    Science.gov (United States)

    Shim, Sung J; Kumar, Arun; Jiao, Roger

    2017-01-01

    Some healthcare managers use computer simulation to assist with staffing. As staffing actions are usually slow to evolve and long term in nature, computer simulation can provide the opportunity to evaluate different alternatives at substantially lower costs with fewer risks. Using computer simulation, this paper seeks to determine the optimal number and allocation of clerks involved in the patient registration process of a hospital. This paper is based on a case study conducted in a hospital and uses historical data provided by the hospital in simulating the patient registration process. The simulation results indicate that computer simulation can be an effective decision supporting tool in modeling the patient registration process and evaluating the effects of changes in the number and allocation of clerks in the process. Based upon a case study applying real-world data, the results of this paper would be beneficial to those who consider utilizing computer simulation for staffing decisions.
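
The kind of staffing evaluation this abstract describes can be sketched with a toy single-queue, multi-clerk simulation of patient registration. The arrival and service rates, the 5-minute wait target, and the function names are invented for illustration; a real study would fit them to the hospital's historical data.

```python
import heapq
import random

def avg_wait(n_clerks, n_patients=5000, mean_interarrival=2.0,
             mean_service=5.0, seed=42):
    """Average wait (minutes) in a single-queue, multi-clerk FCFS model."""
    rng = random.Random(seed)
    free_at = [0.0] * n_clerks                  # time each clerk becomes free
    heapq.heapify(free_at)
    t, total_wait = 0.0, 0.0
    for _ in range(n_patients):
        t += rng.expovariate(1.0 / mean_interarrival)    # next patient arrives
        start = max(t, free_at[0])                       # earliest-free clerk
        total_wait += start - t
        heapq.heapreplace(free_at, start + rng.expovariate(1.0 / mean_service))
    return total_wait / n_patients

# Smallest staffing level keeping the average wait under a 5-minute target.
needed = next(c for c in range(1, 10) if avg_wait(c) < 5.0)
```

Replaying the same simulation while varying only the number of clerks is exactly the low-cost, low-risk what-if evaluation the abstract attributes to computer simulation.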

  18. Simulation Process Analysis of Rubber Shock Absorber for Machine Tool

    Directory of Open Access Journals (Sweden)

    Chai Rong Xia

    2016-01-01

    Full Text Available The simulation of a rubber shock absorber for a machine tool was studied. A simple material model of rubber was obtained through the finite element analysis software ABAQUS. The compression speed and the hardness of the rubber material were considered in order to obtain the deformation law of the rubber shock absorber. The locations of fatigue were identified from the simulation results. The results show that the fatigue position is distributed in the corners of the shock absorber. The degree of deformation increases with increasing compression speed, and the hardness of the rubber material is proportional to the deformation.

  19. Ergonomics and simulation tools for service & industrial process improvement

    Science.gov (United States)

    Sánchez, A.; García, M.

    2012-04-01

    Human interaction within designed processes is a very important factor in how efficiently any process will operate. How a human will function in relation to a process is not easy to predict. Traditionally, ergonomic considerations have been evaluated outside of the 3D product design. Nowadays, 3D process design and simulation tools give us this opportunity from the earliest stages of the design process. They can also be used to improve current processes in order to increase human comfort, productivity, and safety. This work shows a methodology using 3D design and simulation tools to improve industrial and service processes. The objective of this methodology is the detection, evaluation, and control of work-related musculoskeletal disorders (WMSDs).

  20. Final Report of the Simulation Optimization Task Force

    CERN Document Server

    Rimoldi, A; Dell'Acqua, A; Froidevaux, D; Gianotti, F; Guyot, C; Hinchliffe, I; Jakobs, K; Marshall, Z; Nisati, A; Quarrie, D; Unal, G; Young, C

    2009-01-01

    This is the final report of the ATLAS Simulation Optimization Task Force, established in June of 2007. This note justifies the selected Geant4 version, physics list, and range cuts to be used by the default ATLAS simulation for initial data taking and beyond. The current status of several projects, including detector description, simulation validation, studies of additional Geant4 parameters, and cavern background, are reported.

  1. An Optimization Method for Simulator Using Probability Statistic Model

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    An optimization method is presented that can be easily applied in a retargetable simulator. The essence of the method is to reduce the redundant information in operation codes, exploiting the widely varying execution frequencies of instructions. By recoding the operation codes in the loading part of the simulator, the number of bit comparisons needed to identify an instruction is reduced, and thus the performance of the simulator is improved. Theoretical analysis and experimental results both prove the validity of this method.
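
The recoding idea can be illustrated with Huffman coding over invented opcode frequencies: frequently executed opcodes receive shorter codes, lowering the expected number of bit comparisons relative to a fixed-width encoding. This is a sketch of the general principle, not the paper's exact scheme.

```python
import heapq

def huffman_codes(freqs):
    """Assign bit strings so that frequent opcodes receive shorter codes."""
    heap = [(f, i, {op: ""}) for i, (op, f) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    uid = len(heap)            # tie-breaker so the dicts are never compared
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        merged = {op: "0" + code for op, code in left.items()}
        merged.update({op: "1" + code for op, code in right.items()})
        heapq.heappush(heap, (f1 + f2, uid, merged))
        uid += 1
    return heap[0][2]

# Invented execution frequencies (percent) for a five-opcode instruction set.
freqs = {"LOAD": 45, "ADD": 25, "STORE": 15, "JMP": 10, "NOP": 5}
codes = huffman_codes(freqs)
avg_bits = sum(freqs[op] * len(codes[op]) for op in freqs) / sum(freqs.values())
# A fixed-width encoding of 5 opcodes needs 3 bits per decode on average;
# the frequency-aware recoding needs only avg_bits.
```

For these frequencies the most common opcode decodes after a single bit comparison, and the expected comparison count drops from 3.0 to 2.0.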

  2. Coating-substrate-simulations applied to HFQ® forming tools

    Directory of Open Access Journals (Sweden)

    Leopold Jürgen

    2015-01-01

    Full Text Available In this paper a comparative analysis of coating-substrate simulations applied to HFQ™ forming tools is presented. When using the solution heat treatment, cold die forming and quenching process, known as HFQ™, for forming hardened aluminium alloy automotive panel parts, coating-substrate systems have to satisfy unique requirements. Numerical experiments, based on the Advanced Adaptive FE method, are finally presented.

  3. Simulation and Optimization of Contactless Power Transfer System for Rotary Ultrasonic Machining

    Directory of Open Access Journals (Sweden)

    Wang Xinwei

    2016-01-01

    Full Text Available In today’s rotary ultrasonic machining (RUM), the power transfer system is based on a contactless power system (a rotary transformer) rather than a slip ring, which cannot cope with high-speed rotation of the tool. The efficiency of the rotary transformer is vital to the whole rotary ultrasonic machine. This paper focuses on simulation of the rotary transformer and on enhancing its efficiency by optimizing the three main factors that influence it: the gap between the two ferrite cores, the ratio of length to width of the ferrite core, and the thickness of the ferrite. The finite element model of the rotary transformer was built on the Maxwell platform, and the simulation and optimization work was based on this model. The optimization results, compared with the initial simulation result, showed an approximately 18% enhancement in efficiency, from 77.69% to 95.2%.

  4. Development of Interpretive Simulation Tool for the Proton Radiography Technique

    CERN Document Server

    Levy, M C; Wilks, S C; Ross, J S; Huntington, C M; Fiuza, F; Baring, M G; Park, H- S

    2014-01-01

    Proton radiography is a useful diagnostic of high energy density (HED) plasmas under active theoretical and experimental development. In this paper we describe a new simulation tool that interacts realistic laser-driven point-like proton sources with three-dimensional electromagnetic fields of arbitrary strength and structure and synthesizes the associated high-resolution proton radiograph. The present tool's numerical approach captures all relevant physics effects, including effects related to the formation of caustics. Electromagnetic fields can be imported from PIC or hydrodynamic codes in a streamlined fashion, and a library of electromagnetic field `primitives' is also provided. This latter capability allows users to add a primitive, modify the field strength, rotate a primitive, and so on, while quickly generating a high-resolution radiograph at each step. In this way, our tool enables the user to deconstruct features in a radiograph and interpret them in connection to specific underlying electromagneti...

  5. Simulations for optimization of SSRF BPM

    Institute of Scientific and Technical Information of China (English)

    YANG Nuo; LIU Gui-Min

    2003-01-01

    The wake field, impedance, and output signal of the SSRF BPM have been calculated and analyzed by using the numerical simulation code MAFIA. The narrow-band impedance of the BPM, arising from the formation of resonances in its cavity-like structure, is either harmful to the beam or limits the performance of the BPM itself, and should be reduced to tolerable levels. The calculated results show that there are three main peaks in the impedance spectrum of the SSRF BPM prototype, two of which are above the limit. After many simulations for different shapes of the BPM button, a new structure for the SSRF BPM has been found with only one main peak in the impedance spectrum, which is under the limit. Its output signal also meets the requirement.

  6. CLIC Telescope optimization with ALLPIX simulation

    CERN Document Server

    Qi, Wu

    2015-01-01

    A simulation study of the CLIC-EUDET telescope resolution with MIMOSA 26 as reference sensors under DESY (5.6 GeV electron beam) and CERN-SPS (120-180 GeV pion^{-} beam) conditions. During the study, a virtual DUT sensor with a cylindrical sensing area was defined and used with the ALLPIX software. By changing the configuration of the telescope, some results for DESY's setup were found to agree with the theoretical calculation.

  7. Programmable physical parameter optimization for particle plasma simulations

    Science.gov (United States)

    Ragan-Kelley, Benjamin; Verboncoeur, John; Lin, Ming-Chieh

    2012-10-01

    We have developed a scheme for interactive and programmable optimization of physical parameters for plasma simulations. The simulation code Object-Oriented Plasma Device 1-D (OOPD1) has been adapted to a Python interface, allowing sophisticated user or program interaction with simulations, and detailed numerical analysis via numpy. Because the analysis/diagnostic interface is the same as the input mechanism (the Python programming language), it is straightforward to optimize simulation parameters based on analysis of previous runs and automate the optimization process using a user-determined scheme and criteria. An example use case of the Child-Langmuir space charge limit in bipolar flow is demonstrated, where the beam current is iterated upon by measuring the relationship of the measured current and the injected current.
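
The automate-the-optimization idea the abstract describes can be sketched as follows. `run_diode` is an invented stand-in for an actual OOPD1 run (the real OOPD1 Python API is not reproduced here); the loop simply raises the injected current until the transmitted current saturates, mimicking a search for the space-charge limit.

```python
def run_diode(injected, limit=4.0):
    # Stand-in "simulation": transmitted current saturates at the
    # space-charge limit (units and value are invented).
    return min(injected, limit)

def find_limit(start=0.5, factor=1.5, tol=1e-3, max_iter=60):
    """Raise the injected current until the transmitted current saturates."""
    injected = start
    prev = run_diode(injected)
    for _ in range(max_iter):
        injected *= factor
        out = run_diode(injected)
        if out - prev < tol:          # transmitted current stopped growing
            return out
        prev = out
    return prev

limit = find_limit()
```

Because the driver script both launches runs and analyzes their results in the same language, the stopping criterion can be any user-defined function of previous runs, which is the point of the programmable interface.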

  8. Simulation and Optimization of Turning-Milling Complex Machining

    Directory of Open Access Journals (Sweden)

    Shihong Guo

    2013-05-01

    Full Text Available In this study, a turning-milling complex machining simulation platform is established based on the VERICUT NC machining simulation and optimization platform, with the WFL M65 turning-milling complex machining center as the research object. Taking a barrel body part as an example, simulated machining and checking of related process issues in the machining process are carried out, and the factors affecting processing efficiency are analyzed and optimized. The application indicates that the research results effectively realize simulation of the turning-milling complex machining process and verification and optimization of the NC machining program, improve processing efficiency and quality, raise the application level of enterprise turning-milling complex machining centers, and promote the development of turning-milling complex machining technology.

  9. Simulation Optimization of the Crossdock Door Assignment Problem

    CERN Document Server

    Aickelin, Uwe

    2008-01-01

    The purpose of this report is to present the Crossdock Door Assignment Problem, which involves assigning destinations to outbound dock doors of crossdock centres such that the travel distance by material handling equipment is minimized. We propose a twofold solution: simulation, and optimization of the simulation model (simulation optimization). The novel aspect of our solution approach is that we intend to use simulation to derive a more realistic objective function and use Memetic algorithms to find an optimal solution. The main advantage of using Memetic algorithms is that they combine a local search with Genetic Algorithms. The Crossdock Door Assignment Problem is a new application domain for Memetic Algorithms, and it is as yet unknown how they will perform.
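
A memetic algorithm of the kind proposed (a Genetic Algorithm combined with local search) can be sketched on a toy door-assignment instance. The flows, distances, and the separable cost model below are invented for illustration; real instances would use measured travel distances and freight volumes, and likely a richer objective.

```python
import random

FLOW = [50, 10, 30, 5, 20]         # freight volume per destination (invented)
DIST = [1.0, 2.0, 3.0, 4.0, 5.0]   # travel distance per dock door (invented)

def cost(perm):
    # perm[d] is the door assigned to destination d.
    return sum(f * DIST[p] for f, p in zip(FLOW, perm))

def two_swap(perm):
    """Local search: swap door pairs while any swap lowers the cost."""
    best = cost(perm)
    improved = True
    while improved:
        improved = False
        for i in range(len(perm)):
            for j in range(i + 1, len(perm)):
                perm[i], perm[j] = perm[j], perm[i]
                c = cost(perm)
                if c < best:
                    best, improved = c, True
                else:
                    perm[i], perm[j] = perm[j], perm[i]   # undo
    return perm

def memetic(pop_size=10, generations=30, seed=3):
    rng = random.Random(seed)
    n = len(FLOW)
    pop = [two_swap(rng.sample(range(n), n)) for _ in range(pop_size)]
    for _ in range(generations):
        parent = min(rng.sample(pop, 3), key=cost)        # tournament selection
        child = parent[:]
        i, j = rng.sample(range(n), 2)                    # swap mutation
        child[i], child[j] = child[j], child[i]
        child = two_swap(child)                           # memetic refinement
        pop.sort(key=cost)
        if cost(child) < cost(pop[-1]):
            pop[-1] = child                               # replace the worst
    return min(pop, key=cost)

best = memetic()   # high-flow destinations end up at the nearest doors
```

The local-search step applied to every individual is what distinguishes the memetic scheme from a plain Genetic Algorithm; on this separable toy cost it drives each individual to the sorted-by-flow optimum.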

  10. A Data Management System for International Space Station Simulation Tools

    Science.gov (United States)

    Betts, Bradley J.; DelMundo, Rommel; Elcott, Sharif; McIntosh, Dawn; Niehaus, Brian; Papasin, Richard; Mah, Robert W.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Groups associated with the design, operational, and training aspects of the International Space Station make extensive use of modeling and simulation tools. Users of these tools often need to access and manipulate large quantities of data associated with the station, ranging from design documents to wiring diagrams. Retrieving and manipulating this data directly within the simulation and modeling environment can provide substantial benefit to users. An approach for providing these kinds of data management services, including a database schema and class structure, is presented. Implementation details are also provided as a data management system is integrated into the Intelligent Virtual Station, a modeling and simulation tool developed by the NASA Ames Smart Systems Research Laboratory. One use of the Intelligent Virtual Station is generating station-related training procedures in a virtual environment. The data management component allows users to quickly and easily retrieve information related to objects on the station, enhancing their ability to generate accurate procedures. Users can associate new information with objects and have that information stored in a database.

  11. Distribution System Optimization Planning Based on Plant Growth Simulation Algorithm

    Institute of Scientific and Technical Information of China (English)

    WANG Chun; CHENG Hao-zhong; HU Ze-chun; WANG Yi

    2008-01-01

    An approach for the integrated optimization of the construction/expansion capacity of high-voltage/medium-voltage (HV/MV) substations and the configuration of the MV radial distribution network is presented, using the plant growth simulation algorithm (PGSA). In the optimization process, the fixed costs corresponding to the investment in lines and substations and the variable costs associated with the operation of the system were considered under the constraints of branch capacity, substation capacity, and bus voltage. The treatment of the optimization variables considerably reduces their dimension and speeds up the optimization process. The effectiveness of the proposed approach was tested on a distribution system planning case.

  12. Nonsmooth Optimization Algorithms, System Theory, and Software Tools

    Science.gov (United States)

    1993-04-13

    D. Q. Mayne, "A Method of Centers Based on Barrier Functions for Solving Optimal Control Problems with Continuum State and Control Constraints", SIAM J. Control and Opt., Vol. 31, No. 1, pp.

  13. TOPFARM – A Tool for Wind Farm Optimization

    DEFF Research Database (Denmark)

    Larsen, Gunner Chr.; Réthoré, Pierre-Elouan

    2013-01-01

    TOPFARM takes the investor's perspective and performs an economic optimization of a wind farm layout over the lifetime of the wind farm. This economic optimization of wind farm layout differs significantly from the traditional power-output optimization. The major differences...

  14. A Simple Evacuation Modeling and Simulation Tool for First Responders

    Energy Technology Data Exchange (ETDEWEB)

    Koch, Daniel B [ORNL; Payne, Patricia W [ORNL

    2015-01-01

    Although modeling and simulation of mass evacuations during a natural or man-made disaster is an ongoing and vigorous area of study, tool adoption by front-line first responders is uneven. Some of the factors that account for this situation include cost and complexity of the software. For several years, Oak Ridge National Laboratory has been actively developing the free Incident Management Preparedness and Coordination Toolkit (IMPACT) to address these issues. One of the components of IMPACT is a multi-agent simulation module for area-based and path-based evacuations. The user interface is designed so that anyone familiar with typical computer drawing tools can quickly author a geospatially-correct evacuation visualization suitable for table-top exercises. Since IMPACT is designed for use in the field where network communications may not be available, quick on-site evacuation alternatives can be evaluated to keep pace with a fluid threat situation. Realism is enhanced by incorporating collision avoidance into the simulation. Statistics are gathered as the simulation unfolds, including, most importantly, time-to-evacuate, to help first responders choose the best course of action.
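    The core idea of an agent-based evacuation with collision avoidance and a time-to-evacuate statistic can be sketched very simply. This is an illustration only, not IMPACT's model; the grid, movement rule, and exit handling are all this sketch's assumptions.

    ```python
    def evacuate(agents, exit_pos):
        """Toy area evacuation: agents occupy grid cells, one per cell.
        Each time step, every agent steps along the axis with the larger
        remaining distance to the exit, moving only if the target cell is
        free (naive collision avoidance). Returns time-to-evacuate in steps."""
        occupied = set(agents)
        steps = 0
        while occupied:
            steps += 1
            # agents nearest the exit move first, freeing cells behind them
            for x, y in sorted(occupied, key=lambda p: abs(p[0] - exit_pos[0])
                               + abs(p[1] - exit_pos[1])):
                ex, ey = exit_pos
                if (x, y) == exit_pos:            # already on the exit cell
                    occupied.discard((x, y))
                    continue
                if abs(ex - x) >= abs(ey - y):
                    nxt = (x + (1 if ex > x else -1), y)
                else:
                    nxt = (x, y + (1 if ey > y else -1))
                if nxt == exit_pos:
                    occupied.discard((x, y))      # agent leaves via the exit
                elif nxt not in occupied:
                    occupied.discard((x, y))
                    occupied.add(nxt)
        return steps
    ```

    A single agent three cells from the exit evacuates in three steps; adding a second agent in its path does not slow it down, because the nearer agent moves first and frees its cell.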

  15. Tools for evaluating team performance in simulation-based training.

    Science.gov (United States)

    Rosen, Michael A; Weaver, Sallie J; Lazzara, Elizabeth H; Salas, Eduardo; Wu, Teresa; Silvestri, Salvatore; Schiebel, Nicola; Almeida, Sandra; King, Heidi B

    2010-10-01

    Teamwork training constitutes one of the core approaches for moving healthcare systems toward increased levels of quality and safety, and simulation provides a powerful method of delivering this training, especially for fast-paced and dynamic specialty areas such as Emergency Medicine. Team performance measurement and evaluation plays an integral role in ensuring that simulation-based training for teams (SBTT) is systematic and effective. However, this component of SBTT systems is frequently overlooked. This article addresses this gap by providing a review and practical introduction to the process of developing and implementing evaluation systems in SBTT. First, an overview of team performance evaluation is provided. Second, best practices for measuring team performance in simulation are reviewed. Third, some of the prominent measurement tools in the literature are summarized and discussed relative to the best practices. Subsequently, implications of the review are discussed for the practice of training teamwork in Emergency Medicine.

  16. Monte Carlo Simulation Tool Installation and Operation Guide

    Energy Technology Data Exchange (ETDEWEB)

    Aguayo Navarrete, Estanislao; Ankney, Austin S.; Berguson, Timothy J.; Kouzes, Richard T.; Orrell, John L.; Troy, Meredith D.; Wiseman, Clinton G.

    2013-09-02

    This document provides information on software and procedures for Monte Carlo simulations based on the Geant4 toolkit, the ROOT data analysis software and the CRY cosmic ray library. These tools have been chosen for their application to shield design and activation studies as part of the simulation task for the Majorana Collaboration. This document includes instructions for installation, operation and modification of the simulation code in a high cyber-security computing environment, such as the Pacific Northwest National Laboratory network. It is intended as a living document, and will be periodically updated. It is a starting point for information collection by an experimenter, and is not the definitive source. Users should consult with one of the authors for guidance on how to find the most current information for their needs.

  17. Tools for evaluating team performance in simulation-based training

    Science.gov (United States)

    Rosen, Michael A; Weaver, Sallie J; Lazzara, Elizabeth H; Salas, Eduardo; Wu, Teresa; Silvestri, Salvatore; Schiebel, Nicola; Almeida, Sandra; King, Heidi B

    2010-01-01

    Teamwork training constitutes one of the core approaches for moving healthcare systems toward increased levels of quality and safety, and simulation provides a powerful method of delivering this training, especially for fast-paced and dynamic specialty areas such as Emergency Medicine. Team performance measurement and evaluation plays an integral role in ensuring that simulation-based training for teams (SBTT) is systematic and effective. However, this component of SBTT systems is frequently overlooked. This article addresses this gap by providing a review and practical introduction to the process of developing and implementing evaluation systems in SBTT. First, an overview of team performance evaluation is provided. Second, best practices for measuring team performance in simulation are reviewed. Third, some of the prominent measurement tools in the literature are summarized and discussed relative to the best practices. Subsequently, implications of the review are discussed for the practice of training teamwork in Emergency Medicine. PMID:21063558

  18. Contingency Contractor Optimization Phase 3 Sustainment Platform Requirements - Contingency Contractor Optimization Tool - Prototype

    Energy Technology Data Exchange (ETDEWEB)

    Durfee, Justin David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Frazier, Christopher Rawls [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bandlow, Alisa [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gearhart, Jared Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jones, Katherine A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-06-01

    Sandia National Laboratories (Sandia) is in Phase 3 Sustainment of development of a prototype tool, currently referred to as the Contingency Contractor Optimization Tool - Prototype (CCOT-P), under the direction of OSD Program Support. CCOT-P is intended to help provide senior Department of Defense (DoD) leaders with comprehensive insight into the global availability, readiness and capabilities of the Total Force Mix. The CCOT-P will allow senior decision makers to quickly and accurately assess the impacts, risks and mitigating strategies for proposed changes to force/capability assignments, apportionments and allocation options, focusing specifically on contingency contractor planning. During Phase 2 of the program, conducted during fiscal year 2012, Sandia developed an electronic storyboard prototype of the Contingency Contractor Optimization Tool that can be used for communication with senior decision makers and other Operational Contract Support (OCS) stakeholders. Phase 3 used feedback from demonstrations of the electronic storyboard prototype to develop an engineering prototype for planners to evaluate. Sandia worked with the DoD and Joint Chiefs of Staff strategic planning community to get feedback and input to ensure that the engineering prototype was developed to closely align with future planning needs. The intended deployment environment was also a key consideration as this prototype was developed. Initial release of the engineering prototype was done on servers at Sandia in the middle of Phase 3. In 2013, the tool was installed on a production pilot server managed by the OUSD(AT&L) eBusiness Center. The purpose of this document is to specify the CCOT-P engineering prototype platform requirements as of May 2016. Sandia developed the CCOT-P engineering prototype using common technologies to minimize the likelihood of deployment issues. CCOT-P engineering prototype was architected and designed to be as independent as possible of the major deployment

  19. Rigorous simulation: a tool to enhance decision making

    Energy Technology Data Exchange (ETDEWEB)

    Neiva, Raquel; Larson, Mel; Baks, Arjan [KBC Advanced Technologies plc, Surrey (United Kingdom)

    2012-07-01

    The world refining industries continue to be challenged by population growth (increased demand), regional market changes and the pressure of regulatory requirements to operate a 'green' refinery. Environmental regulations are reducing the value and use of heavy fuel oils, pushing refiners to convert more of the heavier products, or even heavier crudes, into lighter products while meeting increasingly stringent transportation fuel specifications. As a result, actions are required to establish a sustainable advantage for future success. Rigorous simulation provides a key advantage, improving the timing and efficiency of capital investment and maximizing profitability. Sustainably maximizing profit through rigorous modeling is achieved through enhanced performance monitoring and improved Linear Programme (LP) model accuracy. This paper contains examples of these two items; the combination of both increases overall rates of return. As refiners consider optimizing existing assets and expanding projects, the process agreed upon to achieve these goals is key to a successful profit improvement. Rigorous kinetic simulation with detailed fractionation allows existing asset utilization to be optimized while focusing the capital investment on the new unit(s), thereby optimizing the overall strategic plan and return on investment. Monitoring of individual process units works as a mechanism for validating and optimizing plant performance. Unit monitoring is important to rectify poor performance and increase profitability. The key to a good LP relies upon the accuracy of the data used to generate the LP sub-model data. The value of rigorous unit monitoring is that the results are consistently heat and mass balanced, and are unique to a refiner's unit/refinery. With the improved match of the refinery operation, the rigorous simulation models will allow capturing more accurately the nonlinearity of those process units and therefore provide correct

  20. Robust optimal design of experiments for model discrimination using an interactive software tool.

    Directory of Open Access Journals (Sweden)

    Johannes Stegmaier

    Full Text Available Mathematical modeling of biochemical processes significantly contributes to a better understanding of biological functionality and underlying dynamic mechanisms. To support time-consuming and costly lab experiments, kinetic reaction equations can be formulated as a set of ordinary differential equations, which in turn allows hypothetical models to be simulated and compared in silico. To identify new experimental designs that are able to discriminate between investigated models, the approach used in this work solves a semi-infinite constrained nonlinear optimization problem using derivative-based numerical algorithms. The method takes parameter variabilities into account, such that new experimental designs are robust against parameter changes while maintaining the optimal potential to discriminate between hypothetical models. In this contribution we present a newly developed software tool that offers a convenient graphical user interface for model discrimination. We demonstrate the beneficial operation of the discrimination approach and the usefulness of the software tool by analyzing a realistic benchmark experiment from the literature. New robust optimal designs that allow discrimination between the investigated model hypotheses of the benchmark experiment are successfully calculated and yield promising results. The involved robustification approach provides maximally discriminating experiments for the worst parameter configurations, which can be used to estimate the meaningfulness of upcoming experiments. A major benefit of the graphical user interface is the ability to interactively investigate the model behavior and the clear arrangement of numerous variables. In addition to a brief theoretical overview of the discrimination method and the functionality of the software tool, the importance of robustness of experimental designs against parameter variability is demonstrated on a biochemical benchmark problem. The software is licensed under the GNU

  1. Feed drive modelling for the simulation of tool path tracking in multi-axis High Speed Machining

    CERN Document Server

    Prévost, David; Lartigue, Claire; Dumur, Didier

    2011-01-01

    Within the context of High Speed Machining, it is essential to manage the trajectory generation to achieve both high surface quality and high productivity. As feed drives are one part of the set Machine tool - Numerical Controller, it is necessary to improve their performances to optimize feed drive dynamics during trajectory follow up. Hence, this paper deals with the modelling of the feed drive in the case of multi axis machining. This model can be used for the simulation of axis dynamics and tool-path tracking to tune parameters and optimize new frameworks of command strategies. A procedure of identification based on modern NC capabilities is presented and applied to industrial HSM centres. Efficiency of this modelling is assessed by experimental verifications on various representative trajectories. After implementing a Generalized Predictive Control, reliable simulations are performed thanks to the model. These simulations can then be used to tune parameters of this new framework according to the tool-pat...

  2. Visualization in simulation tools: requirements and a tool specification to support the teaching of dynamic biological processes.

    Science.gov (United States)

    Jørgensen, Katarina M; Haddow, Pauline C

    2011-08-01

    Simulation tools are playing an increasingly important role behind advances in the field of systems biology. However, the current generation of biological science students has either little or no experience with such tools. As such, this educational glitch is limiting both the potential use of such tools as well as the potential for tighter cooperation between the designers and users. Although some simulation tool producers encourage their use in teaching, little attempt has hitherto been made to analyze and discuss their suitability as an educational tool for noncomputing science students. In general, today's simulation tools assume that the user has a stronger mathematical and computing background than that which is found in most biological science curricula, thus making the introduction of such tools a considerable pedagogical challenge. This paper provides an evaluation of the pedagogical attributes of existing simulation tools for cell signal transduction based on Cognitive Load theory. Further, design recommendations for an improved educational simulation tool are provided. The study is based on simulation tools for cell signal transduction. However, the discussions are relevant to a broader biological simulation tool set.

  3. Simulation and optimization of a smart reconfigurable aperture antenna

    Science.gov (United States)

    Washington, Gregory N.; Yoon, Hwan-Sik; Theunissen, Wilhelmus H.

    2002-07-01

    The work in this study develops the framework for the placement and actuation of novel mechanically reconfigurable dual-offset contour beam reflector antennas (DCBRA). Towards that end, the methodology for the antennas' design is defined. The antenna designed in this study employs piezoelectrically driven ball-screw actuators. These actuators are attached to a flexible subreflector surface and are used to vary the radiation pattern. In addition, two separate optimization problems are stated and solved: actuator position optimization and actuation value optimization. For the former, a method termed the Greatest Error Suppression method is proposed, where the position of each actuator is decided one by one after each evaluation of the error between the desired subreflector shape and the actual subreflector shape. For the latter, a mathematical analysis shows that there exists only one optimal configuration. Two optimization techniques are used for the second problem: the Simulated Annealing algorithm and a simple univariate optimization technique. The univariate technique always generates the same optimal configuration for different initial configurations, and it gives the lower bound in the evaluation of the error. The Simulated Annealing algorithm is a stochastic technique used to search for the global optimum. Finally, as an example, the results of the proposed optimization techniques are presented for the generation of a subreflector shape for the geographical outline of Brazil.
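    Simulated annealing as used for the actuation-value problem can be illustrated generically. The sketch below is not the paper's implementation: the quadratic "shape error" objective and all parameter values are stand-in assumptions.

    ```python
    import math
    import random

    def simulated_annealing(f, x0, step=0.5, t0=1.0, cooling=0.995,
                            iters=2000, seed=1):
        """Generic simulated annealing: perturb one coordinate at a time
        and accept uphill moves with probability exp(-delta/T), where the
        temperature T decays geometrically."""
        rng = random.Random(seed)
        x, fx = list(x0), f(x0)
        best, fbest = list(x), fx
        t = t0
        for _ in range(iters):
            i = rng.randrange(len(x))
            cand = list(x)
            cand[i] += rng.uniform(-step, step)
            fc = f(cand)
            # always accept improvements; sometimes accept uphill moves
            if fc < fx or rng.random() < math.exp(-(fc - fx) / t):
                x, fx = cand, fc
                if fx < fbest:
                    best, fbest = list(x), fx
            t *= cooling
        return best, fbest

    # stand-in for the error between desired and actual subreflector shape,
    # with actuator strokes as the decision variables
    target = [1.0, -2.0, 0.5]
    err = lambda x: sum((xi - ti) ** 2 for xi, ti in zip(x, target))
    best, fbest = simulated_annealing(err, [0.0, 0.0, 0.0])
    ```

    As the temperature falls, the acceptance rule approaches pure descent, which is why late iterations mostly refine rather than explore.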

  4. Statistical Testing of Optimality Conditions in Multiresponse Simulation-based Optimization (Revision of 2005-81)

    NARCIS (Netherlands)

    Bettonvil, B.W.M.; Del Castillo, E.; Kleijnen, J.P.C.

    2007-01-01

    This paper studies simulation-based optimization with multiple outputs. It assumes that the simulation model has one random objective function and must satisfy given constraints on the other random outputs. It presents a statistical procedure for testing whether a specific input combination

  5. Statistical Testing of Optimality Conditions in Multiresponse Simulation-based Optimization (Revision of 2005-81)

    NARCIS (Netherlands)

    Bettonvil, B.W.M.; Del Castillo, E.; Kleijnen, J.P.C.

    2007-01-01

    This paper studies simulation-based optimization with multiple outputs. It assumes that the simulation model has one random objective function and must satisfy given constraints on the other random outputs. It presents a statistical procedure for testing whether a specific input combination (propo

  6. Genetic-Algorithm Tool For Search And Optimization

    Science.gov (United States)

    Wang, Lui; Bayer, Steven

    1995-01-01

    SPLICER computer program used to solve search and optimization problems. Genetic algorithms adaptive search procedures (i.e., problem-solving methods) based loosely on processes of natural selection and Darwinian "survival of fittest." Algorithms apply genetically inspired operators to populations of potential solutions in iterative fashion, creating new populations while searching for optimal or nearly optimal solution to problem at hand. Written in Think C.
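    SPLICER itself is written in Think C and its internals are not given in the record. The genetically inspired operators it describes (selection, crossover, mutation over a population of candidate solutions) can be sketched generically, here in Python with assumed parameter values, on the classic OneMax bitstring problem.

    ```python
    import random

    def genetic_algorithm(fitness, n_bits=20, pop_size=30, gens=60,
                          cx_rate=0.9, mut_rate=0.02, seed=2):
        """Minimal generational GA: tournament selection, one-point
        crossover, bit-flip mutation, best individual tracked across
        generations."""
        rng = random.Random(seed)
        pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
        best = max(pop, key=fitness)
        for _ in range(gens):
            nxt = []
            while len(nxt) < pop_size:
                # tournament selection: best of 3 random individuals
                p1 = max(rng.sample(pop, 3), key=fitness)
                p2 = max(rng.sample(pop, 3), key=fitness)
                c1, c2 = p1[:], p2[:]
                if rng.random() < cx_rate:       # one-point crossover
                    cut = rng.randrange(1, n_bits)
                    c1 = p1[:cut] + p2[cut:]
                    c2 = p2[:cut] + p1[cut:]
                for child in (c1, c2):           # bit-flip mutation
                    for i in range(n_bits):
                        if rng.random() < mut_rate:
                            child[i] ^= 1
                    nxt.append(child)
            pop = nxt[:pop_size]
            best = max(pop + [best], key=fitness)
        return best

    # OneMax: fitness is simply the number of 1-bits in the string
    best = genetic_algorithm(sum)
    ```

    On this toy problem the population converges to the all-ones string within a few dozen generations.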

  7. Structure optimization and simulation analysis of the quartz micromachined gyroscope

    Science.gov (United States)

    Wu, Xuezhong; Wang, Haoxu; Xie, Liqiang; Dong, Peitao

    2014-03-01

    Structure optimization and simulation analysis of the quartz micromachined gyroscope are reported in this paper. The relationships between the structure parameters and the frequencies of work mode were analysed by finite element analysis. The structure parameters of the quartz micromachined gyroscope were optimized to reduce the difference between the frequencies of the drive mode and the sense mode. The simulation results were proved by testing the prototype gyroscope, which was fabricated by micro-electromechanical systems (MEMS) technology. Therefore, the frequencies of the drive mode and the sense mode can match each other by the structure optimization and simulation analysis of the quartz micromachined gyroscope, which is helpful in the design of the high sensitivity quartz micromachined gyroscope.

  8. Structure optimization and simulation analysis of the quartz micromachined gyroscope

    Directory of Open Access Journals (Sweden)

    Xuezhong Wu

    2014-02-01

    Full Text Available Structure optimization and simulation analysis of the quartz micromachined gyroscope are reported in this paper. The relationships between the structure parameters and the frequencies of work mode were analysed by finite element analysis. The structure parameters of the quartz micromachined gyroscope were optimized to reduce the difference between the frequencies of the drive mode and the sense mode. The simulation results were proved by testing the prototype gyroscope, which was fabricated by micro-electromechanical systems (MEMS) technology. Therefore, the frequencies of the drive mode and the sense mode can match each other by the structure optimization and simulation analysis of the quartz micromachined gyroscope, which is helpful in the design of the high sensitivity quartz micromachined gyroscope.

  9. Simultaneous computation within a sequential process simulation tool

    Directory of Open Access Journals (Sweden)

    G. Endrestøl

    1989-10-01

    Full Text Available The paper describes an equation solver superstructure developed for a sequential modular dynamic process simulation system as part of a Eureka project with Norwegian and British participation. The purpose of the development was combining some of the advantages of equation based and purely sequential systems, enabling implicit treatment of key variables independent of module boundaries, and use of numerical integration techniques suitable for each individual type of variable. For training simulator applications the main advantages are gains in speed due to increased stability limits on time steps and improved consistency of simulation results. The system is split into an off-line analysis phase and an on-line equation solver. The off-line processing consists of automatic determination of the topological structure of the system connectivity from standard process description files and derivation of an optimized sparse matrix solution procedure for the resulting set of equations. The on-line routine collects equation coefficients from involved modules, solves the combined sets of structured equations, and stores the results appropriately. This method minimizes the processing cost during the actual simulation. The solver has been applied in the Veslefrikk training simulator project.

  10. A distributed computing tool for generating neural simulation databases.

    Science.gov (United States)

    Calin-Jageman, Robert J; Katz, Paul S

    2006-12-01

    After developing a model neuron or network, it is important to systematically explore its behavior across a wide range of parameter values or experimental conditions, or both. However, compiling a very large set of simulation runs is challenging because it typically requires both access to and expertise with high-performance computing facilities. To lower the barrier for large-scale model analysis, we have developed NeuronPM, a client/server application that creates a "screen-saver" cluster for running simulations in NEURON (Hines & Carnevale, 1997). NeuronPM provides a user-friendly way to use existing computing resources to catalog the performance of a neural simulation across a wide range of parameter values and experimental conditions. The NeuronPM client is a Windows-based screen saver, and the NeuronPM server can be hosted on any Apache/PHP/MySQL server. During idle time, the client retrieves model files and work assignments from the server, invokes NEURON to run the simulation, and returns results to the server. Administrative panels make it simple to upload model files, define the parameters and conditions to vary, and then monitor client status and work progress. NeuronPM is open-source freeware and is available for download at http://neuronpm.homeip.net . It is a useful entry-level tool for systematically analyzing complex neuron and network simulations.
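    The work-distribution idea behind such a screen-saver cluster (enumerate every parameter combination, queue work units, let idle clients pull assignments and report results) can be sketched compactly. This is an illustration only, not NeuronPM's code; threads stand in for the remote clients, and the function names are this sketch's assumptions.

    ```python
    import itertools
    import queue
    import threading

    def run_sweep(model, param_grid, n_workers=4):
        """Enumerate every parameter combination, queue the work units,
        and let idle workers pull assignments and report results back."""
        names = sorted(param_grid)
        work = queue.Queue()
        for combo in itertools.product(*(param_grid[n] for n in names)):
            work.put(dict(zip(names, combo)))
        results, lock = [], threading.Lock()

        def worker():
            while True:
                try:
                    params = work.get_nowait()   # claim a work assignment
                except queue.Empty:
                    return                       # no work left
                out = model(**params)            # run the "simulation"
                with lock:
                    results.append((params, out))

        threads = [threading.Thread(target=worker) for _ in range(n_workers)]
        for t in threads:
            t.start()
        for t in threads:
            t.join()
        return results

    # toy "model": every (a, b) combination in the grid is catalogued
    grid = {"a": [1, 2, 3], "b": [10, 20]}
    results = run_sweep(lambda a, b: a * b, grid, n_workers=3)
    ```

    A real deployment would replace the queue with the server's database and the worker threads with remote clients, but the claim/run/report loop is the same.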

  11. Application of Artificial Intelligence Methods of Tool Path Optimization in CNC Machines: A Review

    Directory of Open Access Journals (Sweden)

    Khashayar Danesh Narooei

    2014-08-01

    Full Text Available Today, in most metal machining processes, Computer Numerical Control (CNC) machine tools have become very popular due to their efficiency and repeatability in achieving high-accuracy positioning. One of the factors that governs productivity is the tool path travelled while cutting a workpiece. It has been proved that determination of optimal cutting parameters can enhance the machining results to reach high efficiency and minimize the machining cost. In various publications and articles, scientists and researchers have adapted several Artificial Intelligence (AI) methods, or hybrid methods, for tool path optimization, such as Genetic Algorithms (GA), Artificial Neural Networks (ANN), Artificial Immune Systems (AIS), Ant Colony Optimization (ACO) and Particle Swarm Optimization (PSO). This study presents a review of research on tool path optimization with different types of AI methods, showing the capability of using different types of optimization methods in the CNC machining process.
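    Tool path ordering (e.g., the sequence in which hole centres are visited) is essentially a travelling-salesman problem. As a point of reference for the AI methods the review covers, here is a classical baseline, not one of the reviewed methods: nearest-neighbour construction followed by 2-opt improvement, minimizing the traverse distance between points.

    ```python
    import math

    def tour_length(points, order):
        """Total traverse distance of visiting points in the given order."""
        return sum(math.dist(points[order[i]], points[order[i + 1]])
                   for i in range(len(order) - 1))

    def optimize_tool_path(points):
        """Nearest-neighbour construction plus 2-opt improvement."""
        # nearest-neighbour construction, starting from the first point
        unvisited = list(range(1, len(points)))
        order = [0]
        while unvisited:
            last = points[order[-1]]
            nxt = min(unvisited, key=lambda i: math.dist(points[i], last))
            order.append(nxt)
            unvisited.remove(nxt)
        # 2-opt: reverse segments as long as it shortens the path
        improved = True
        while improved:
            improved = False
            for i in range(1, len(order) - 1):
                for j in range(i + 1, len(order)):
                    cand = order[:i] + order[i:j + 1][::-1] + order[j + 1:]
                    if tour_length(points, cand) < tour_length(points, order) - 1e-12:
                        order, improved = cand, True
        return order

    # four collinear hole centres given out of order
    points = [(0, 0), (3, 0), (1, 0), (2, 0)]
    order = optimize_tool_path(points)
    ```

    GA, ACO and PSO approaches from the review tackle the same objective but search the permutation space stochastically rather than greedily.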

  12. Monte Carlo Simulation as a Research Management Tool

    Energy Technology Data Exchange (ETDEWEB)

    Douglas, L. J.

    1986-06-01

    Monte Carlo simulation provides a research manager with a performance monitoring tool to supplement the standard schedule- and resource-based tools such as the Program Evaluation and Review Technique (PERT) and Critical Path Method (CPM). The value of the Monte Carlo simulation in a research environment is that it 1) provides a method for ranking competing processes, 2) couples technical improvements to the process economics, and 3) provides a mechanism to determine the value of research dollars. In this paper the Monte Carlo simulation approach is developed and applied to the evaluation of three competing processes for converting lignocellulosic biomass to ethanol. The technique is shown to be useful for ranking the processes and illustrating the importance of the timeframe of the analysis on the decision process. The results show that acid hydrolysis processes have higher potential for near-term application (2-5 years), while the enzymatic hydrolysis approach has an equal chance to be competitive in the long term (beyond 10 years).
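    The ranking idea can be illustrated with a small Monte Carlo sketch. The triangular distributions and all figures below are made up for illustration; they are not the report's data, and "acid" versus "enzymatic" here are only labels for two hypothetical processes.

    ```python
    import random

    def simulate_unit_cost(yield_tri, cost_tri, runs=20000, seed=3):
        """Draw an uncertain batch yield (units/batch) and batch cost from
        triangular (low, high, mode) distributions and return the sampled
        cost-per-unit distribution."""
        rng = random.Random(seed)
        return [rng.triangular(*cost_tri) / rng.triangular(*yield_tri)
                for _ in range(runs)]

    # two hypothetical competing processes (illustrative numbers only)
    acid = simulate_unit_cost((80, 120, 100), (900, 1500, 1100), seed=3)
    enzymatic = simulate_unit_cost((80, 120, 100), (1100, 2000, 1600), seed=4)

    # probability that the first process yields the cheaper unit cost
    p_acid_cheaper = sum(a < b for a, b in zip(acid, enzymatic)) / len(acid)
    ```

    Ranking by such probabilities, rather than by point estimates, is what lets the technique couple technical uncertainty to process economics.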

  13. USMC Inventory Control Using Optimization Modeling and Discrete Event Simulation

    Science.gov (United States)

    2016-09-01

    A brief discussion is provided of the current techniques in which optimization and simulation are used to improve supply chain and inventory management processes. Simulating a full combat environment is most likely impractical, which is not the case in established supply chain networks. Source: [6] C. Almeder, M. Preusser and R. F. Hartl, "Simulation and Optimization of Supply Chains: Alternative or Complementary..."

  14. Modeling and Simulation Tools: From Systems Biology to Systems Medicine.

    Science.gov (United States)

    Olivier, Brett G; Swat, Maciej J; Moné, Martijn J

    2016-01-01

    Modeling is an integral component of modern biology. In this chapter we look into the role of the model, as it pertains to Systems Medicine, and the software that is required to instantiate and run it. We do this by comparing the development, implementation, and characteristics of tools that have been developed to work with two divergent methodologies: Systems Biology and Pharmacometrics. From the Systems Biology perspective we consider the concept of "Software as a Medical Device" and what this may imply for the migration of research-oriented, simulation software into the domain of human health. In our second perspective, we see how in practice hundreds of computational tools already accompany drug discovery and development at every stage of the process. Standardized exchange formats are required to streamline the model exchange between tools, which would minimize translation errors and reduce the required time. With the emergence, almost 15 years ago, of the SBML standard, a large part of the domain of interest is already covered and models can be shared and passed from software to software without recoding them. Until recently the last stage of the process, the pharmacometric analysis used in clinical studies carried out on subject populations, lacked such an exchange medium. We describe a new emerging exchange format in Pharmacometrics which covers the non-linear mixed effects models, the standard statistical model type used in this area. By interfacing these two formats the entire domain can be covered by complementary standards and, subsequently, by the corresponding tools.

  15. Metabolic engineering with systems biology tools to optimize production of prokaryotic secondary metabolites

    DEFF Research Database (Denmark)

    Kim, Hyun Uk; Charusanti, Pep; Lee, Sang Yup;

    2016-01-01

    for the optimal production of various prokaryotic secondary metabolites: native versus heterologous hosts (e.g., Escherichia coli) and rational versus random approaches. This comparative analysis is followed by discussions on systems biology tools deployed in optimizing the production of secondary metabolites. ... The potential contributions of additional systems biology tools are also discussed in the context of current challenges encountered during optimization of secondary metabolite production. ...

  16. Flow Simulation and Optimization of Plasma Reactors for Coal Gasification

    Institute of Scientific and Technical Information of China (English)

    冀春俊; 张英姿; 马腾才

    2003-01-01

    This paper reports a 3-d numerical simulation system to analyze the complicated flow in plasma reactors for coal gasification, which involves complex chemical reactions, two-phase flow and plasma effects. On the basis of the analytic results, the distributions of the density, temperature and components' concentration are obtained, and a different plasma reactor configuration is proposed to optimize the flow parameters. The numerical simulation results show an improved conversion ratio of the coal gasification. Different kinds of chemical reaction models are used to simulate the complex flow inside the reactor. It can be concluded that the numerical simulation system can be very useful for the design and optimization of the plasma reactor.

  17. Optimization and Simulation in the Danish Fishing Industry

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg; Clausen, Jens

    We consider the Danish fishing industry from a holistic viewpoint, and give a review of the main aspects, and the important actors. We also consider supply chain theory, and identify both theoretically, and based on other application areas, e.g. other fresh food industries, how optimization...... and simulation can be applied in a holistic modeling framework. Using the insights into supply chain theory and the Danish fishing industry, we investigate how the fishing industry as a whole may benefit from the formulation and use of mathematical optimization and simulation models. Finally, an appendix...

  18. Metamodelling Approach and Software Tools for Physical Modelling and Simulation

    Directory of Open Access Journals (Sweden)

    Vitaliy Mezhuyev

    2015-02-01

    Full Text Available In computer science, the metamodelling approach is becoming more and more popular for the purpose of software systems development. In this paper, we discuss the applicability of the metamodelling approach to the development of software tools for physical modelling and simulation. To define a metamodel for physical modelling, an analysis of physical models is carried out. This analysis reveals the invariant physical structures, which we propose to use as the basic abstractions of the physical metamodel. It is a system of geometrical objects, allowing a spatial structure of physical models to be built and a distribution of physical properties to be set. To such a geometry of distributed physical properties, different mathematical methods can be applied. To validate the proposed metamodelling approach, we consider the developed prototypes of software tools.

  19. The surgical ensemble: choreography as a simulation and training tool.

    Science.gov (United States)

    Satava, Richard M; Hunter, Anne Marie

    2011-09-01

    Team training and interprofessional training have recently emerged as critical new simulations that enhance performance by coordinating communication, leadership, professional, and, to a certain extent, technical skills. In describing these new training tools, the term choreography has been loosely used, but no critical appraisal of the role of the science of choreography has been applied to a surgical procedure. By analogy, the surgical team, including anesthetists, surgeons, nurses, and technicians, constitutes a complete ensemble, whose physical actions and interactions constitute the "performance of surgery." There are very specific "elements" (tools) that are basic to choreography, such as space, timing, rhythm, energy, cues, transitions, and especially rehearsal. This review explores whether such a metaphor is appropriate and the possibility of applying the science of choreography to the surgical team in the operating theater.

  20. Continuously Optimized Reliable Energy (CORE) Microgrid: Models & Tools (Fact Sheet)

    Energy Technology Data Exchange (ETDEWEB)

    2013-07-01

    This brochure describes Continuously Optimized Reliable Energy (CORE), a trademarked process NREL employs to produce conceptual microgrid designs. This systems-based process enables designs to be optimized for economic value, energy surety, and sustainability. Capabilities NREL offers in support of microgrid design are explained.

  1. Hypersonic Control Modeling and Simulation Tool for Lifting Towed Ballutes Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Global Aerospace Corporation proposes to develop a hypersonic control modeling and simulation tool for hypersonic aeroassist vehicles. Our control and simulation...

  2. Decision Tool for optimal deployment of radar systems

    NARCIS (Netherlands)

    Vogel, M.H.

    1995-01-01

    A Decision Tool for air defence is presented. This Decision Tool, when provided with information about the radar, the environment, and the expected class of targets, informs the radar operator about detection probabilities. This assists the radar operator to select the optimum radar parameters.

  4. Flow simulation and optimization of plasma reactors for coal gasification

    Energy Technology Data Exchange (ETDEWEB)

    Ji, C.J.; Zhang, Y.Z.; Ma, T.C. [Dalian University of Technology, Dalian (China). Power Engineering Dept.

    2003-10-01

    This paper reports a 3-D numerical simulation system to analyze the complicated flow in plasma reactors for coal gasification, which involve complex chemical reaction, two-phase flow and plasma effect. On the basis of analytic results, the distribution of the density, temperature and components' concentration are obtained and a different plasma reactor configuration is proposed to optimize the flow parameters. The numerical simulation results show an improved conversion ratio of the coal gasification. Different kinds of chemical reaction models are used to simulate the complex flow inside the reactor. It can be concluded that the numerical simulation system can be very useful for the design and optimization of the plasma reactor.

  5. Patient simulation: a literary synthesis of assessment tools in anesthesiology.

    Science.gov (United States)

    Edler, Alice A; Fanning, Ruth G; Chen, Michael I; Claure, Rebecca; Almazan, Dondee; Struyk, Brain; Seiden, Samuel C

    2009-12-20

    High-fidelity patient simulation (HFPS) has been hypothesized as a modality for assessing competency of knowledge and skill in patient simulation, but uniform methods for HFPS performance assessment (PA) have not yet been completely achieved. Anesthesiology as a field founded the HFPS discipline and also leads in its PA. This project reviews the types, quality, and designated purpose of HFPS PA tools in anesthesiology. We systematically reviewed anesthesiology literature referenced in PubMed to assess the quality and reliability of available PA tools in HFPS. Of 412 articles identified, 50 met our inclusion criteria. Seventy-seven percent of the studies have been published since 2000; more recent studies demonstrated higher quality. Investigators reported a variety of test construction and validation methods. The most commonly reported test construction methods included "modified Delphi techniques" for item selection, reliability measurement using inter-rater agreement, and intra-class correlations between test items or subtests. Modern test theory, in particular generalizability theory, was used in nine (18%) of the studies. Test score validity has been addressed in multiple investigations and has shown a significant improvement in reporting accuracy. However, the assessment of predictive validity has been low across the majority of studies. Usability and practicality of testing occasions and tools were only anecdotally reported. To more completely comply with the gold standards for PA design, both the shared experience of experts and the recognition of test construction standards, including reliability and validity measurements, instrument piloting, rater training, and explicit identification of the purpose and proposed use of the assessment tool, are required.

  6. Terascale Optimal PDE Simulations (TOPS) Center

    Energy Technology Data Exchange (ETDEWEB)

    Professor Olof B. Widlund

    2007-07-09

    Our work has focused on the development and analysis of domain decomposition algorithms for a variety of problems arising in continuum mechanics modeling. In particular, we have extended and analyzed FETI-DP and BDDC algorithms; these iterative solvers were first introduced and studied by Charbel Farhat and his collaborators, see [11, 45, 12], and by Clark Dohrmann of SANDIA, Albuquerque, see [43, 2, 1], respectively. These two closely related families of methods are of particular interest since they are used more extensively than other iterative substructuring methods to solve very large and difficult problems. Thus, the FETI algorithms are part of the SALINAS system developed by the SANDIA National Laboratories for very large scale computations, and as already noted, BDDC was first developed by a SANDIA scientist, Dr. Clark Dohrmann. The FETI algorithms are also making inroads in commercial engineering software systems. We also note that the analysis of these algorithms poses very real mathematical challenges. The success in developing this theory has, in several instances, led to significant improvements in the performance of these algorithms. A very desirable feature of these iterative substructuring and other domain decomposition algorithms is that they respect the memory hierarchy of modern parallel and distributed computing systems, which is essential for approaching peak floating point performance. The development of improved methods, together with more powerful computer systems, is making it possible to carry out simulations in three dimensions, with quite high resolution, relatively easily. This work is supported by high quality software systems, such as Argonne's PETSc library, which facilitates code development as well as the access to a variety of parallel and distributed computer systems. The success in finding scalable and robust domain decomposition algorithms for very large number of processors and very large finite element problems is, e

  7. Dynamic wind turbine models in power system simulation tool

    DEFF Research Database (Denmark)

    Hansen, Anca D.; Iov, Florin; Sørensen, Poul

    This report presents a collection of models and control strategies developed and implemented in the power system simulation tool PowerFactory DIgSILENT for different wind turbine concepts. It is the second edition of Risø-R-1400(EN) and it gathers and describes a whole wind turbine model database...... strategies have different goals e.g. fast response over disturbances, optimum power efficiency over a wider range of wind speeds, voltage ride-through capability including grid support. A dynamic model of a DC connection for active stall wind farms to the grid including the control is also implemented...

  8. Eddy current NDE performance demonstrations using simulation tools

    Energy Technology Data Exchange (ETDEWEB)

    Maurice, L. [EDF - CEIDRE, 2 rue Ampere, 93206 Saint-Denis Cedex 1 (France); Costan, V.; Guillot, E.; Thomas, P. [EDF - R and D, THEMIS, 1, avenue du General de Gaulle, 92141 Clamart (France)

    2013-01-25

    To carry out performance demonstrations of the Eddy-Current NDE processes applied on French nuclear power plants, EDF studies the possibility of using simulation tools as an alternative to measurements on steam generator tube mock-ups. This paper focuses on the strategy led by EDF to assess and use code_Carmel3D and Civa, on the case of Eddy-Current NDE of wear problems which may appear in the U-shape region of steam generator tubes due to the rubbing of anti-vibration bars.

  9. A Simulation Framework for Optimal Energy Storage Sizing

    OpenAIRE

    Carlos Suazo-Martínez; Eduardo Pereira-Bonvallet; Rodrigo Palma-Behnke

    2014-01-01

    Despite the increasing interest in Energy Storage Systems (ESS), quantification of their technical and economical benefits remains a challenge. To assess the use of ESS, a simulation approach for ESS optimal sizing is presented. The algorithm is based on an adapted Unit Commitment, including ESS operational constraints, and the use of high performance computing (HPC). Multiple short-term simulations are carried out within a multiple year horizon. Evaluation is performed for Chile's No...

  10. Optimization and Simulation in Drug Development - Review and Analysis

    DEFF Research Database (Denmark)

    Schjødt-Eriksen, Jens; Clausen, Jens

    2003-01-01

    We give a review of pharmaceutical R&D and mathematical simulation and optimization methods used to support decision making within the pharmaceutical development process. The complex nature of drug development is pointed out through a description of the various phases of the pharmaceutical...... development process. A part of the paper is dedicated to the use of simulation techniques to support clinical trials. The paper ends with a section describing portfolio modelling methods in the context of the pharmaceutical industry....

  12. Research on virtual dynamic optimization design for NC machine tools

    Institute of Scientific and Technical Information of China (English)

    HU Ru-fu; GUI Zhong-hua; CHEN Xiao-ping; SUN Qing-hong

    2006-01-01

    Virtual dynamic optimization design can avoid the repeated process from design to trial manufacture and test. The designer can analyze and optimize the product structures in a virtual visualization environment, so the design cycle is shortened and the cost is reduced. The paper analyzes the peculiarities of virtual optimization design and puts forward the approach and workflow for implementing it. The optimization of an internal grinder was studied as an example, by establishing a precise finite element model, modifying the layout of stiffened plates, redesigning the parameters of the worktable, and using the techniques of modal frequency revision and multiple tuned dampers. Comparing the optimized grinder with the original one shows that the entire machine's first-order natural frequency is enhanced by 17%, and the response displacement of the grinding head has dropped by 28% at the first-order natural frequency and by 41% at the second-order natural frequency. Overall, the dynamic performance of the internal grinder was optimized.

  13. OPTIMAL ALGORITHM FOR NO TOOL-RETRACTIONS CONTOUR-PARALLEL OFFSET TOOL-PATH LINKING

    Institute of Scientific and Technical Information of China (English)

    HAO Yongtao; JIANG Lili

    2007-01-01

    A contour-parallel offset (CPO) tool-path linking algorithm is derived without tool-retractions and with the largest practicability. The concept of a "tool-path loop tree" (TPL-tree) providing the information on the parent/child relationships among the tool-path loops (TPLs) is presented. The direction, tool-path loop, leaf/branch, layer number, and the corresponding points of the TPL-tree are introduced. By defining a TPL as a vector, and by traveling throughout the tree, a CPO tool-path without tool-retractions can be derived.
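
    The loop-splicing idea behind such a tree traversal can be sketched as follows. This is a hypothetical illustration, not the paper's algorithm: the Loop class, the nearest-vertex linking rule, and the geometry are all invented for demonstration.

```python
# Hypothetical sketch of TPL-tree linking: each tool-path loop (TPL) knows
# its child loops, and a depth-first walk splices every child into its
# parent at the parent vertex nearest to that child, so the tool never
# retracts between loops.

class Loop:
    def __init__(self, points, children=None):
        self.points = points              # closed polyline: list of (x, y)
        self.children = children or []

def link_loops(loop):
    """Return one continuous tool-path covering this loop and all children."""
    # choose, for every child, the parent vertex where it will be entered
    entry = {}
    for child in loop.children:
        dists = [min((px - cx) ** 2 + (py - cy) ** 2
                     for cx, cy in child.points)
                 for px, py in loop.points]
        entry.setdefault(dists.index(min(dists)), []).append(child)
    path = []
    for i, p in enumerate(loop.points):
        path.append(p)
        for child in entry.get(i, []):
            path.extend(link_loops(child))  # machine the child loop
            path.append(p)                  # rejoin the parent, no retraction
    path.append(loop.points[0])             # close the parent contour
    return path

outer = Loop([(0, 0), (10, 0), (10, 10), (0, 10)],
             [Loop([(4, 4), (6, 4), (6, 6), (4, 6)])])
path = link_loops(outer)
```

    The resulting path is a single continuous polyline that starts and ends at the outer loop's first vertex and visits the inner loop in between.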

  14. A new simulation tool for testing smart home recognition algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Bouchard, K.; Ajroud, A.; Bouchard, B.; Bouzouane, A. [Chicoutimi Univ. du Quebec, Saguenay, PQ (Canada)

    2010-08-13

    Smart home technologies are being actively researched. However, many scientists studying this topic of research do not own smart home infrastructure, and therefore, are unable to conduct proper experiments in a concrete environment with real data. In order to address this problem and to help researchers working in the field of activity recognition, this paper presented a novel and flexible three-dimensional smart home infrastructure simulator called SIMACT developed in Java. The paper discussed the proposed simulator including the software architecture, implementation, and scripting. It also discussed the actual initiative to gather real data that would be incorporated in the simulator as real case pre-recorded scenarios. An overview of related work and future developments were also presented. It was concluded that SIMACT is a user-friendly three-dimensional animated simulator that offers a simple interface that could be easily adapted to individual needs and could become a very useful tool for the community with its XML scripting language, many options, customizable three-dimensional frame and its set of pre-defined real case scenarios. 18 refs., 4 figs.

  15. Validation of Multibody Program to Optimize Simulated Trajectories II Parachute Simulation with Interacting Forces

    Science.gov (United States)

    Raiszadeh, Behzad; Queen, Eric M.; Hotchko, Nathaniel J.

    2009-01-01

    A capability to simulate trajectories of multiple interacting rigid bodies has been developed, tested and validated. This capability uses the Program to Optimize Simulated Trajectories II (POST 2). The standard version of POST 2 allows trajectory simulation of multiple bodies without force interaction. In the current implementation, the force interaction between the parachute and the suspended bodies has been modeled using flexible lines, allowing accurate trajectory simulation of the individual bodies in flight. The POST 2 multibody capability is intended to be general purpose and applicable to any parachute entry trajectory simulation. This research paper explains the motivation for multibody parachute simulation, discusses implementation methods, and presents validation of this capability.
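
    The flexible-line coupling described above can be illustrated with a tension-only spring between two bodies: the line transmits force when taut and none when slack. The function name, the linear stiffness model, and all numbers are assumptions for illustration, not POST 2 internals.

```python
# Toy model of a "flexible line" force between a parachute and a suspended
# body: a tension-only spring acting along the line between the two bodies.

def line_force(p1, p2, natural_length, stiffness):
    """Force on the body at p1, directed toward the body at p2."""
    d = [b - a for a, b in zip(p1, p2)]
    dist = sum(c * c for c in d) ** 0.5
    stretch = dist - natural_length
    if stretch <= 0.0:
        return [0.0, 0.0, 0.0]            # a slack line carries no load
    return [stiffness * stretch * c / dist for c in d]

taut = line_force([0.0, 0.0, 0.0], [0.0, 0.0, 10.0], 5.0, 2.0)   # stretched 5 m
slack = line_force([0.0, 0.0, 0.0], [0.0, 0.0, 3.0], 5.0, 2.0)   # line is slack
```

    Making the coupling one-sided in this way is what lets each body keep its own independent trajectory while still interacting with the others in flight.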

  16. Conceptual air sparging decision tool in support of the development of an air sparging optimization decision tool

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    The enclosed document describes a conceptual decision tool (hereinafter, Tool) for determining applicability of and for optimizing air sparging systems. The Tool was developed by a multi-disciplinary team of internationally recognized experts in air sparging technology, led by a group of project and task managers at Parsons Engineering Science, Inc. (Parsons ES). The team included Mr. Douglas Downey and Dr. Robert Hinchee of Parsons ES, Dr. Paul Johnson of Arizona State University, Dr. Richard Johnson of Oregon Graduate Institute, and Mr. Michael Marley of Envirogen, Inc. User Community Panel Review was coordinated by Dr. Robert Siegrist of Colorado School of Mines (also of Oak Ridge National Laboratory) and Dr. Thomas Brouns of Battelle/Pacific Northwest Laboratory. The Tool is intended to provide guidance to field practitioners and environmental managers for evaluating the applicability and optimization of air sparging as a remedial action technique.

  17. Automatic CT simulation optimization for radiation therapy: A general strategy

    Energy Technology Data Exchange (ETDEWEB)

    Li, Hua, E-mail: huli@radonc.wustl.edu; Chen, Hsin-Chen; Tan, Jun; Gay, Hiram; Michalski, Jeff M.; Mutic, Sasa [Department of Radiation Oncology, Washington University, St. Louis, Missouri 63110 (United States); Yu, Lifeng [Department of Radiology, Mayo Clinic, Rochester, Minnesota 55905 (United States); Anastasio, Mark A. [Department of Biomedical Engineering, Washington University, St. Louis, Missouri 63110 (United States); Low, Daniel A. [Department of Radiation Oncology, University of California Los Angeles, Los Angeles, California 90095 (United States)

    2014-03-15

    Purpose: In radiation therapy, x-ray computed tomography (CT) simulation protocol specifications should be driven by the treatment planning requirements in lieu of duplicating diagnostic CT screening protocols. The purpose of this study was to develop a general strategy that allows for automatically, prospectively, and objectively determining the optimal patient-specific CT simulation protocols based on radiation-therapy goals, namely, maintenance of contouring quality and integrity while minimizing patient CT simulation dose. Methods: The authors proposed a general prediction strategy that provides automatic optimal CT simulation protocol selection as a function of patient size and treatment planning task. The optimal protocol is the one that delivers the minimum dose required to provide a CT simulation scan that yields accurate contours. Accurate treatment plans depend on accurate contours in order to conform the dose to actual tumor and normal organ positions. An image quality index, defined to characterize how simulation scan quality affects contour delineation, was developed and used to benchmark the contouring accuracy and treatment plan quality within the prediction strategy. A clinical workflow was developed to select the optimal CT simulation protocols incorporating patient size, target delineation, and radiation dose efficiency. An experimental study using an anthropomorphic pelvis phantom with added-bolus layers was used to demonstrate how the proposed prediction strategy could be implemented and how the optimal CT simulation protocols could be selected for prostate cancer patients based on patient size and treatment planning task. Clinical IMRT prostate treatment plans for seven CT scans with varied image quality indices were separately optimized and compared to verify the trace of target and organ dosimetry coverage. Results: Based on the phantom study, the optimal image quality index for accurate manual prostate contouring was 4.4.
The optimal tube

  18. Automatic Calibration Tool for Hydrologic Simulation Program-FORTRAN Using a Shuffled Complex Evolution Algorithm

    Directory of Open Access Journals (Sweden)

    Chounghyun Seong

    2015-02-01

    Full Text Available Hydrologic Simulation Program-Fortran (HSPF) model calibration is typically done manually due to the lack of an automated calibration tool as well as the difficulty of balancing the objective functions to be considered. This paper discusses the development and demonstration of an automated calibration tool for HSPF (HSPF-SCE). HSPF-SCE was developed using the open source software “R”. The tool employs the Shuffled Complex Evolution optimization algorithm (SCE-UA) to produce a pool of qualified calibration parameter sets from which the modeler chooses a single set of calibrated parameters. Six calibration criteria specified in the Expert System for the Calibration of HSPF (HSPEXP) decision support tool were combined to develop a single, composite objective function for HSPF-SCE. The HSPF-SCE tool was demonstrated, and automated and manually calibrated model performance were compared using three Virginia watersheds, where HSPF models had been previously prepared for bacteria total maximum daily load (TMDL) development. The example applications demonstrate that HSPF-SCE can be an effective tool for calibrating HSPF.
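
    The key design move, folding several calibration criteria into one composite objective that a single optimizer can minimize, can be sketched as below. The criterion names, tolerances, and weighting scheme are invented for demonstration; they are not HSPF-SCE's actual values.

```python
# Illustrative sketch of combining multiple calibration criteria into one
# composite objective, in the spirit of HSPF-SCE combining the six HSPEXP
# criteria for SCE-UA. All numbers here are hypothetical.

def composite_objective(errors, tolerances, weights=None):
    """Each criterion contributes its absolute error normalized by the
    tolerance allowed for it; an error right at its tolerance scores 1.0."""
    weights = weights or [1.0] * len(errors)
    return sum(w * abs(e) / t for w, e, t in zip(weights, errors, tolerances))

# e.g. percent errors in total volume, low-flow recession, storm peaks
errors = [8.0, -5.0, 12.0]          # simulated-vs-observed percent errors
tolerances = [10.0, 10.0, 15.0]     # acceptance thresholds, HSPEXP-style
score = composite_objective(errors, tolerances)   # lower is better
```

    Normalizing by per-criterion tolerances keeps criteria with different natural scales (volumes, recession rates, peaks) commensurable before they are summed.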

  19. Shape optimization as a tool to design biocatalytic microreactors

    DEFF Research Database (Denmark)

    Pereira Rosinha Grundtvig, Ines; Daugaard, Anders Egede; Woodley, John

    2017-01-01

    conditions. However, common reactor types used in (bio)chemical processes do not always give the optimal conditions for executing the reaction, and it is therefore necessary to look into new approaches to further improve the performance of reactors. The new application of shape optimization described...... in this paper has as its main goal the design of a reactor by compensating for the limitations of the reaction system by modifying the reactor configuration. Random search was the optimization method chosen for transforming the initial reactor configuration to a more optimal one. The case study presented here...... investigates the impact of a change to the microreactor shape on the active mixing of two parallel streams (one containing an enzyme, amino transaminase, and the other the substrates, acetophenone and isopropylamine) and consequently its influence on the reaction yield. Compared to the original reactor...

  20. Developing the Next Generation of Tools for Simulating Galaxy Outflows

    Science.gov (United States)

    Scannapieco, Evan

    Outflows are observed in starbursting galaxies of all masses and at all cosmological epochs. They play a key role throughout the history of the Universe: shaping the galaxy mass-metallicity relation, drastically affecting the content and number density of dwarf galaxies, and transforming the chemical composition of the intergalactic medium. Yet, a complete model of galaxy outflows has proven to be elusive, as it requires both a better understanding of the evolution of the turbulent, multiphase gas in and around starbursting galaxies, and better tools to reproduce this evolution in galaxy-scale simulations. Here we propose to conduct a detailed series of numerical simulations designed to help develop such next-generation tools for the simulation of galaxy outflows. The program will consist of three types of direct numerical simulations, each of which will be targeted to allow galaxy-scale simulations to more accurately model key microphysical processes and their observational consequences. Our first set of simulations will be targeted at better modeling the starbursting interstellar medium (ISM) from which galaxy outflows are driven. The surface densities in starbursting galaxies are much larger than those in the Milky Way, resulting in larger gravitational accelerations and random velocities exceeding 30 or even 100 km/s. Under these conditions, the thermal stability of the ISM is changed dramatically, due to the sharp peak in gas cooling efficiency at roughly 200,000 K. Our simulations will carefully quantify the key ways in which this medium differs from the local ISM, and the consequences of these differences for when, where, and how outflows are driven. A second set of simulations will be targeted at better modeling the observed properties of rapidly cooling, highly turbulent gas. Because gas cooling in and around starbursts is extremely efficient, turbulent motions are often supersonic, which leads to a distribution of ionization states that is vastly different than

  1. GMOseek: a user friendly tool for optimized GMO testing

    National Research Council Canada - National Science Library

    Morisset, Dany; Novak, Petra Kralj; Zupanič, Darko; Gruden, Kristina; Lavrač, Nada; Žel, Jana

    2014-01-01

    ...) computational tool to efficiently use the available data. The developed GMOseek software is designed to support decision making in all the phases of routine GMO laboratory testing, including the interpretation...

  2. Effects of machining parameters on tool life and its optimization in turning mild steel with brazed carbide cutting tool

    Science.gov (United States)

    Dasgupta, S.; Mukherjee, S.

    2016-09-01

    One of the most significant factors in metal cutting is tool life. In this research work, the effects of machining parameters on tool life under a wet machining environment were studied. Tool life characteristics of a brazed carbide cutting tool machining mild steel were examined, and the machining parameters were optimized based on the Taguchi design of experiments. The experiments were conducted using three factors, spindle speed, feed rate and depth of cut, each having three levels. Nine experiments were performed on a high speed semi-automatic precision central lathe. ANOVA was used to determine the level of importance of the machining parameters on tool life. The optimum machining parameter combination was obtained by the analysis of the S/N ratio. A mathematical model based on multiple regression analysis was developed to predict the tool life. Taguchi's orthogonal array analysis revealed the optimal combination of parameters at the lower levels of spindle speed, feed rate and depth of cut, which are 550 rpm, 0.2 mm/rev and 0.5 mm respectively. The Main Effects plot reiterated the same. The variation of tool life with different process parameters has been plotted. Feed rate has the most significant effect on tool life, followed by spindle speed and depth of cut.
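
    The L9 orthogonal-array analysis the abstract describes can be sketched as follows, using the larger-is-better S/N ratio for tool life. The tool-life measurements and the spindle-speed levels above 550 rpm are invented so the example is self-contained; only the reported optimum levels follow the abstract.

```python
import math

# Minimal Taguchi L9 analysis: three factors at three levels, scored with
# the larger-is-better signal-to-noise ratio. Data are hypothetical.

levels = {"speed": [550, 700, 850],   # rpm (levels above 550 are assumed)
          "feed": [0.2, 0.3, 0.4],    # mm/rev
          "doc": [0.5, 1.0, 1.5]}     # depth of cut, mm
L9 = [(0, 0, 0), (0, 1, 1), (0, 2, 2),
      (1, 0, 1), (1, 1, 2), (1, 2, 0),
      (2, 0, 2), (2, 1, 0), (2, 2, 1)]
life = [55, 52, 45, 42, 33, 40, 38, 30, 24]   # minutes, invented

def sn_larger_is_better(y):
    return -10 * math.log10(1 / y ** 2)       # single replicate per run

best = {}
for f, factor in enumerate(["speed", "feed", "doc"]):
    means = []
    for lvl in range(3):
        sn = [sn_larger_is_better(life[i])
              for i, run in enumerate(L9) if run[f] == lvl]
        means.append(sum(sn) / len(sn))       # mean S/N at this level
    best[factor] = levels[factor][means.index(max(means))]
```

    With these invented lives the analysis recovers the abstract's optimum at the lowest level of each factor: 550 rpm, 0.2 mm/rev and 0.5 mm.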

  3. Topology Optimization as a Conceptual Tool for Designing New Airframes

    OpenAIRE

    2016-01-01

    During the two last decades, topology optimization has grown to be an accepted and used method to produce conceptual designs. Topology optimization is traditionally carried out on a component level, but in this project, the possibility to apply it to airframe design on a full scale aeroplane model is evaluated. The project features a conceptual flying-wing design on which the study is to be carried out. Inertia Relief is used to constrain the aeroplane instead of traditional single point cons...

  4. Robust Optimization in Simulation : Taguchi and Response Surface Methodology

    NARCIS (Netherlands)

    Dellino, G.; Kleijnen, J.P.C.; Meloni, C.

    2008-01-01

    Optimization of simulated systems is tackled by many methods, but most methods assume known environments. This article, however, develops a 'robust' methodology for uncertain environments. This methodology uses Taguchi's view of the uncertain world, but replaces his statistical techniques by Response Surface Methodology.

  5. Optimized firing. Numerical simulation of flow; Optimierte Feuerung. Numerische Stroemungssimulation

    Energy Technology Data Exchange (ETDEWEB)

    Klasen, T. [Inpro-Consult (Germany); Floetgen, A.

    2007-07-01

    With the aid of numerical flow simulation at the beginning of boiler design, geometrical and process details can be optimized. An example is shown for a feeding stoker with combined dust firing in an existing boiler plant for biogenic fuels. (GL)

  6. Robust Optimization in Simulation : Taguchi and Krige Combined

    NARCIS (Netherlands)

    Dellino, G.; Kleijnen, Jack P.C.; Meloni, C.

    2009-01-01

    Optimization of simulated systems is the goal of many methods, but most methods assume known environments. We, however, develop a 'robust' methodology that accounts for uncertain environments. Our methodology uses Taguchi's view of the uncertain world, but replaces his statistical techniques by Kriging.
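
    The Taguchi-style robust view used in these two records can be illustrated with a toy crossed design: a decision variable is evaluated across environmental noise scenarios, and the mean response is traded off against its variability. The quadratic response function and the grid search below are invented stand-ins; the papers themselves fit response-surface or Kriging metamodels rather than searching a grid.

```python
import statistics

# Toy robust optimization: cross a decision variable x with noise
# scenarios e and penalize variability, not just the mean response.

noise = [-1.0, 0.0, 1.0]                      # environmental scenarios

def response(x, e):
    return (x - 2.0) ** 2 + e * x             # hypothetical simulation output

def robust_objective(x, weight=1.0):
    ys = [response(x, e) for e in noise]
    return statistics.mean(ys) + weight * statistics.stdev(ys)

candidates = [i / 10 for i in range(41)]      # grid over 0.0 .. 4.0
x_nominal = min(candidates,
                key=lambda x: statistics.mean([response(x, e) for e in noise]))
x_robust = min(candidates, key=robust_objective)
```

    Here the nominal optimum is x = 2.0, while the robust one shifts to x = 1.5: a slightly worse mean is accepted in exchange for a less variable response across environments.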

  7. A Software Tool for Optimal Sizing of PV Systems in Malaysia

    OpenAIRE

    Tamer Khatib; Azah Mohamed; K. Sopian

    2012-01-01

    This paper presents a MATLAB based user friendly software tool called PV.MY for optimal sizing of photovoltaic (PV) systems. The software has the capabilities of predicting meteorological variables such as solar energy, ambient temperature and wind speed using artificial neural networks (ANN), optimizes the PV module/array tilt angle, optimizes the inverter size and calculates optimal capacities of PV array, battery, wind turbine and diesel generator in hybrid PV systems. The ANN based mo...

  8. The Big Data Tools Impact on Development of Simulation-Concerned Academic Disciplines

    Directory of Open Access Journals (Sweden)

    A. A. Sukhobokov

    2015-01-01

    Full Text Available The article gives a definition of Big Data on the basis of 5V (Volume, Variety, Velocity, Veracity, Value) as well as shows examples of tasks that require using Big Data tools in a diversity of areas, namely: health, education, financial services, industry, agriculture, logistics, retail, information technology, telecommunications and others. An overview of Big Data tools is delivered, including open source products and the IBM Bluemix and SAP HANA platforms. Examples of architecture of corporate data processing and management systems using Big Data tools are shown for big Internet companies and for enterprises in traditional industries. Within the overview, a classification of Big Data tools is proposed that fills gaps of previously developed similar classifications. The new classification contains 19 classes and allows embracing several hundred existing and emerging products. The rise and use of Big Data tools, in addition to solving practical problems, affects the development of scientific disciplines concerned with the simulation of technical, natural or socioeconomic systems and the solution of practical problems based on the developed models. New schools arise in these disciplines. These new schools address the tasks peculiar to each discipline, but for systems with a much bigger number of internal elements and connections between them. The characteristics of the problems to be solved under the new schools do not always meet the criteria for Big Data. It is suggested to identify Big Data as a part of the theory of sorting and searching algorithms. In other disciplines the new schools are called by analogy with Big Data: Big Calculation in numerical methods, Big Simulation in simulation modeling, Big Management in the management of socio-economic systems, Big Optimal Control in optimal control theory. The paper shows examples of tasks and methods to be developed within the new schools. The observed tendency is not limited to the considered disciplines: there are

  9. Simulation and optimization of an industrial PSA unit

    Directory of Open Access Journals (Sweden)

    Barg C.

    2000-01-01

    Full Text Available Pressure Swing Adsorption (PSA) units have been used as a low cost alternative to the usual gas separation processes. Their largest commercial application is for hydrogen purification systems. Several studies have been made about the simulation of pressure swing adsorption units, but there are only a few reports on the optimization of such processes. The objective of this study is to simulate and optimize an industrial PSA unit for hydrogen purification. This unit consists of six beds, each of which has three layers of different kinds of adsorbents. The main impurities are methane, carbon monoxide and hydrogen sulfide. The product stream has 99.99% purity in hydrogen, and the recovery is around 90%. A mathematical model for a commercial PSA unit is developed. The cycle time and the pressure swing steps are optimized. All the features concerned with complex commercial processes are considered.

  10. Applied simulation and optimization in logistics, industrial and aeronautical practice

    CERN Document Server

    Mota, Idalia; Serrano, Daniel

    2015-01-01

    Presenting techniques, case studies and methodologies that combine simulation approaches with optimization techniques, this book provides solutions to common industrial problems in fields ranging from manufacturing and logistics to aeronautics, where the common denominator is the combination of simulation's flexibility with optimization's robustness. Providing readers with a comprehensive guide for tackling similar issues in industrial environments, the text explores novel ways to address industrial problems through hybrid simulation-optimization approaches that benefit from the advantages of both paradigms, in order to solve important problems in the service industry, production processes and supply chains, such as scheduling, routing and resource allocation, among others.

  11. A Simulation Approach to Statistical Estimation of Multiperiod Optimal Portfolios

    Directory of Open Access Journals (Sweden)

    Hiroshi Shiraishi

    2012-01-01

    Full Text Available This paper discusses a simulation-based method for solving discrete-time multiperiod portfolio choice problems under an AR(1) return process. The method is applicable even if the distributions of the return processes are unknown. We first generate simulated sample paths of the random returns by using an AR bootstrap. Then, for each sample path and each investment time, we obtain an optimal portfolio estimator that maximizes a constant relative risk aversion (CRRA) utility function. When an investor considers an optimal investment strategy with portfolio rebalancing, it is convenient to introduce a value function. The most important difference between single-period and multiperiod portfolio choice problems is that the value function is time dependent. Our method handles this time dependency by using bootstrapped sample paths. Numerical studies are provided to examine the validity of the method; the results show the necessity of accounting for the time dependency of the value function.
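As a rough illustration of the path-generation step (not the authors' code; the function names and the least-squares AR(1) fit are assumptions), an AR(1) residual bootstrap and a CRRA utility can be sketched as:

```python
import numpy as np

def ar1_bootstrap_paths(returns, n_paths, horizon, rng=None):
    """Generate bootstrap sample paths of returns assumed to follow an AR(1) process.

    Fits r_t = c + phi * r_{t-1} + e_t by least squares, then resamples the
    empirical residuals with replacement to build synthetic return paths.
    """
    rng = np.random.default_rng(rng)
    r = np.asarray(returns, dtype=float)
    x, y = r[:-1], r[1:]
    phi, c = np.polyfit(x, y, 1)           # slope = phi, intercept = c
    resid = y - (c + phi * x)              # empirical residuals
    paths = np.empty((n_paths, horizon))
    for i in range(n_paths):
        prev = r[-1]                       # start each path from the last observation
        for t in range(horizon):
            e = rng.choice(resid)          # i.i.d. resampling of residuals
            prev = c + phi * prev + e
            paths[i, t] = prev
    return paths

def crra_utility(wealth, gamma):
    """Constant relative risk aversion (CRRA) utility."""
    if gamma == 1.0:
        return np.log(wealth)
    return wealth ** (1.0 - gamma) / (1.0 - gamma)
```

For each bootstrapped path, a portfolio weight maximizing the expected CRRA utility of terminal wealth would then be estimated backwards in time through the value function.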

  12. Optimization Of Assembly Line Of Printed Circuit Board Using Simulation

    Directory of Open Access Journals (Sweden)

    Kamal Alzameli

    2015-08-01

    Full Text Available The development of an assembly line does not stop with any single study, even one covering very detailed sectors of the line or spanning from customer order to product delivery; continuous improvement must remain vigorous for ongoing productivity gains and innovation. The optimization of a printed circuit board assembly line includes hard and soft optimization: hard optimization involves hardware changes to the design, while soft optimization is achieved through simulation and analysis of the current assembly line. The aim of this research is to determine the bottlenecks, find a solution, and develop an improvement method by changing the configuration of the setup on the assembly line until a feasible optimal configuration is found. The software used for the simulation is Arena. The data were collected from a real assembly line; best engineering assumptions were used where no permanent data were available, such as for the manual stations. A further aim is to obtain better yield by improving the cycle time of the PCB assembly process. The outcome of these optimization efforts is to deliver good-quality products, reduce cost, minimize time to delivery and meet customer expectations.
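The bottleneck-detection idea can be illustrated with a minimal discrete-event sketch of a serial line (a toy stand-in for the Arena model; the station times and line structure below are invented for illustration):

```python
import numpy as np

def simulate_serial_line(proc_times, n_boards, rng=None):
    """Simulate a serial PCB assembly line (one machine per station, infinite buffers).

    proc_times: mean processing time per station; exponential service is assumed.
    Returns per-station utilization and the line makespan; the bottleneck is
    the station with the highest utilization.
    """
    rng = np.random.default_rng(rng)
    n_st = len(proc_times)
    done = np.zeros(n_st)                  # time each station next becomes free
    busy = np.zeros(n_st)                  # accumulated busy time per station
    finish = 0.0
    for _ in range(n_boards):
        t = 0.0                            # all boards released at time zero
        for s in range(n_st):
            start = max(t, done[s])        # wait until the station is free
            service = rng.exponential(proc_times[s])
            done[s] = start + service
            busy[s] += service
            t = done[s]
        finish = t
    return busy / finish, finish

util, makespan = simulate_serial_line([2.0, 5.0, 1.5], n_boards=500, rng=42)
bottleneck = int(util.argmax())            # the slowest station dominates
```

Reconfiguring the line (e.g. duplicating the bottleneck station) and re-running the simulation mirrors the iterate-until-optimal workflow described in the abstract.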

  13. User guide to SUNDT. A simulation tool for ultrasonic NDT

    Energy Technology Data Exchange (ETDEWEB)

    Wirdelius, H. [SAQ Kontroll AB, Moelndal (Sweden)

    2000-08-01

    Mathematical modelling of the ultrasonic NDT situation has become an emerging discipline with broadening industrial interest in the recent decade. New and stronger demands on the reliability of procedures and methods applied in, e.g., the nuclear and pressure vessel industries have reinforced this trend. To qualify the procedures, extensive experimental work on test blocks is normally required. A thoroughly validated model can serve as an alternative and a complement to this experimental work, reducing the extensive cost associated with it. The present report describes the SUNDT software (Simulation tool for Ultrasonic NDT). Besides serving as a user guide to the software, it also pinpoints the software's modelling capabilities and restrictions. SUNDT is a Windows-based pre- and postprocessor using the UTDefect model as its mathematical kernel. The software simulates the whole testing procedure with contact probes (of arbitrary type, angle and size) acting in pulse-echo or tandem inspection situations, and includes a large number of defect models. The simulated test piece is currently restricted to a homogeneous and isotropic material, and the model does not include attenuation due to absorption (viscous effects) or grain-boundary scattering. The report also incorporates a short account of previous validations and verifications against experimental investigations and comparisons with other existing simulation software. The major part of the report presents and visualises the various options within the pre- and postprocessor. To exemplify its capability, a specific simulation is followed from setting the parameters, through running the kernel, to visualisation of the result.

  14. Optimization of simulated moving bed (SMB) chromatography: a multi-level optimization procedure

    DEFF Research Database (Denmark)

    Jørgensen, Sten Bay; Lim, Young-il

    2004-01-01

    This paper presents a multi-level optimization strategy to obtain optimum operating conditions (four flowrates and cycle time) of nonlinear simulated moving bed chromatography. The multi-level optimization procedure (MLOP) approaches systematically from initialization to optimization with two...... objective functions (productivity and desorbent consumption), employing the standing wave analysis, the true moving bed (TMB) model and the simulated moving bed (SMB) model. The procedure is constructed on a non-worse solution property advancing level by level and its solution does not mean a global optimum....... That is, the lower desorbent consumption under the higher productivity is successively obtained on the basis of the SMB model, as the two SMB-model optimizations are repeated using a standard SQP (successive quadratic programming) algorithm. This approach takes advantage of the TMB model and surmounts...
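The level-by-level idea of repeatedly re-optimizing with an SQP solver can be sketched on a toy surrogate (the model functions and numbers below are invented for illustration; a real optimization would evaluate the SMB model instead, and SciPy's SLSQP stands in for a standard SQP code):

```python
import numpy as np
from scipy.optimize import minimize

# Toy surrogate: x = (feed flowrate, desorbent flowrate); purely illustrative.
def productivity(x): return x[0]
def desorbent(x):    return x[1]
def purity(x):       return 0.95 + 0.04 * x[1] / (1.0 + x[0])

bounds = [(0.1, 2.0), (0.1, 2.0)]
purity_con = {"type": "ineq", "fun": lambda x: purity(x) - 0.99}

# Level 1: maximize productivity subject to the purity constraint.
r1 = minimize(lambda x: -productivity(x), x0=[0.5, 0.5],
              method="SLSQP", bounds=bounds, constraints=[purity_con])

# Level 2: minimize desorbent consumption without losing the achieved productivity
# (the non-worse-solution property: each level keeps the previous level's gains).
cons2 = [purity_con,
         {"type": "ineq",
          "fun": lambda x: productivity(x) - productivity(r1.x) + 1e-6}]
r2 = minimize(desorbent, x0=r1.x, method="SLSQP",
              bounds=bounds, constraints=cons2)
```

The two sequential solves mimic the paper's repeated SMB-model optimizations: productivity first, then desorbent consumption at no worse productivity.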

  15. Optimizing the simulation of riverine species flow preferences

    Science.gov (United States)

    Kiesel, Jens; Pfannerstill, Matthias; Guse, Björn; Kakouei, Karan; Jähnig, Sonja C.; Fohrer, Nicola

    2016-04-01

    Riverine biota have distinct demands on the discharge regime. To quantify these demands, discharge time series are translated into ecohydrological indicators, e.g. the magnitude, timing or duration of baseflow or peak flow events. These indicators are then related to species occurrence and/or absence to establish the response of aquatic species to hydrological conditions. Such links can be used in conjunction with hydrological simulations to predict species occurrences. If the differences between observed and simulated ecohydrological indicator values are too large, such predictions can be wrong. Indicator differences can be due to poor input data quality and simplified model algorithms, but they also depend on how the model was optimised. For instance, if the model was optimised towards a single objective function, e.g. minimising the difference between simulated and observed Q95, the differences between simulated and observed high flow indicators will be smaller than those for baseflow indicators. In this study, we work towards assessing this error as a function of how the model is optimised. The assessment is based on a multi-objective versus single-objective model optimisation, realised in the following four-step approach: (1) sets of highly relevant ecohydrological indicators are defined; (2) the hydrologic model is optimised using a multi-objective function that combines all indicators; (3) the hydrologic model is optimised using single-objective functions, with one optimisation round for each indicator; and (4) the differences between all optimisation methods are calculated. By assessing these absolute (simulated vs. observed) and relative (simulated vs. simulated) differences, we can evaluate the magnitude of the possible error band when optimising a hydrological model towards different ecohydrological indicators. This assessment can be used to optimise hydrological models for depicting preferences of riverine biota more effectively and
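A hedged sketch of what indicator-based objective functions of this kind might look like (the indicator definitions follow one common percentile convention and are not taken from the study; all names are illustrative):

```python
import numpy as np

def ecohydro_indicators(q):
    """A few example ecohydrological indicators from a daily discharge series."""
    q = np.asarray(q, dtype=float)
    return {
        "Q95": np.percentile(q, 5),    # flow exceeded 95% of the time (one convention)
        "Q05": np.percentile(q, 95),   # high-flow indicator
        "mean": q.mean(),              # overall water balance
    }

def single_objective(sim, obs, key):
    """Relative error of one indicator (single-objective calibration target)."""
    s = ecohydro_indicators(sim)[key]
    o = ecohydro_indicators(obs)[key]
    return abs(s - o) / max(abs(o), 1e-12)

def multi_objective(sim, obs, keys=("Q95", "Q05", "mean")):
    """Equal-weight combination of all indicator errors (multi-objective target)."""
    return sum(single_objective(sim, obs, k) for k in keys) / len(keys)
```

Calibrating once against `multi_objective` and once per indicator against `single_objective`, then comparing the resulting indicator errors, corresponds to steps (2)-(4) of the approach described above.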

  16. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.

    Science.gov (United States)

    Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana

    2016-01-01

    The omnipresent need for optimisation requires constant improvement of companies' business processes (BPs). Minimising the risk of an inappropriate BP being implemented is usually done by simulating the newly developed BP under various initial conditions and "what-if" scenarios. An effective business process simulation software (BPSS) tool is a prerequisite for accurate analysis of a BP. Characterising a BPSS tool is a challenging task due to complex selection criteria that include the quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is difficult, and various decision support models are therefore employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools based on their technical characteristics, employing DEX and the qualitative-to-quantitative (QQ) methodology, so that the decision expert feeds in the required information in a systematic and user-friendly manner. The proposed approach makes three significant contributions. Firstly, the hierarchical model is easily extendible with new criteria in the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool implementing the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results.

  17. Qualitative and Quantitative Integrated Modeling for Stochastic Simulation and Optimization

    Directory of Open Access Journals (Sweden)

    Xuefeng Yan

    2013-01-01

    Full Text Available The simulation and optimization of an actual physical system are usually constructed on the basis of stochastic models, which inherently have both qualitative and quantitative characteristics. Most modeling specifications and frameworks find it difficult to describe the qualitative model directly. In order to deal with expert knowledge, uncertain reasoning, and other qualitative information, a combined qualitative and quantitative modeling specification is proposed based on a hierarchical model structure framework comprising the meta-meta model, the meta-model and the high-level model. A description logic system is defined for the formal definition and verification of the new modeling specification. A stochastic defense simulation was developed to illustrate how to model the system and optimize the result. The results show that the proposed method can describe complex systems more comprehensively, and that the survival probability of the target is higher when qualitative models are introduced into the quantitative simulation.

  18. Future missions for observing Earth's changing gravity field: a closed-loop simulation tool

    Science.gov (United States)

    Visser, P. N.

    2008-12-01

    The GRACE mission has successfully demonstrated the observation from space of Earth's changing gravity field at length and time scales of typically 1000 km and 10-30 days, respectively. Many scientific communities strongly advocate the need for continuity in observing Earth's gravity field from space, and a strong interest has been expressed in gravity missions that allow a more detailed sampling of the field in both time and space. Designing a gravity field mission for the future is a complicated process that involves many trade-offs, such as those between spatial resolution, temporal resolution and financial budget. Moreover, it involves the optimization of many parameters: orbital parameters (height, inclination), the distinction between which gravity sources to observe or correct for (for example, are gravity changes due to ocean currents a nuisance or a signal to be retrieved?), observation techniques (low-low satellite-to-satellite tracking, satellite gravity gradiometry, accelerometers), and satellite control systems (drag-free or not). A comprehensive tool has been developed and implemented that allows the closed-loop simulation of gravity field retrievals for different satellite mission scenarios. This paper describes the tool and demonstrates its capabilities through a few case studies. Acknowledgments. The research being done with the closed-loop simulation tool is partially funded by the European Space Agency (ESA). An important component of the tool is the GEODYN software, kindly provided by NASA Goddard Space Flight Center in Greenbelt, Maryland.

  19. An Indirect Simulation-Optimization Model for Determining Optimal TMDL Allocation under Uncertainty

    Directory of Open Access Journals (Sweden)

    Feng Zhou

    2015-11-01

    Full Text Available An indirect simulation-optimization model framework with enhanced computational efficiency and risk-based decision-making capability was developed to determine optimal total maximum daily load (TMDL) allocations under uncertainty. To convert the traditional direct simulation-optimization model into our indirect equivalent framework, we proposed a two-step strategy: (1) application of interval regression equations derived by a Bayesian recursive regression tree (BRRT v2) algorithm, which approximates the original hydrodynamic and water-quality simulation models and accurately quantifies the inherent nonlinear relationship between nutrient load reductions and the credible interval of algal biomass at a given confidence level; and (2) incorporation of the calibrated interval regression equations into an uncertain optimization framework, which is further converted to our indirect equivalent framework by the enhanced-interval linear programming (EILP) method and provides approximately optimal solutions at various risk levels. The proposed strategy was applied to the nutrient TMDL allocation for the Swift Creek Reservoir (Chesterfield County, VA) to identify the minimum nutrient load allocations required from eight sub-watersheds to ensure compliance with user-specified chlorophyll criteria. Our results indicated that the BRRT-EILP model identifies critical sub-watersheds faster than the traditional model and requires smaller reductions in nutrient loadings than traditional stochastic simulation and trial-and-error (TAE) approaches. This suggests that the proposed framework performs better in optimal TMDL development than traditional simulation-optimization models, while providing extreme and non-extreme tradeoff analysis under uncertainty for risk-based decision making.

  20. A new methodology for sizing hybrid photovoltaic-wind energy system using simulation and optimization tools = Uma nova metodologia para dimensionamento de sistemas híbridos de energia (solar-eólica utilizando ferramentas de simulação e otimização

    Directory of Open Access Journals (Sweden)

    Samuel Nelson Melegari de Souza

    2005-01-01

    Full Text Available This paper presents a new methodology for sizing an autonomous photovoltaic-wind hybrid energy system with battery storage, using simulation and optimization tools. The developed model is useful for energizing remote rural areas and produces a system with minimum cost and high reliability, based on the concept of Loss of Power Supply Probability (LPSP) applied over consecutive hours. Several scenarios are calculated and compared, using different numbers of consecutive hours and different LPSP values. As a result, a complete sizing of the system and a long-term cost evaluation are presented.
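The LPSP-over-consecutive-hours concept can be sketched as follows (a simplified hourly energy balance with an idealized battery; the exact formulation in the paper may differ, and all names are illustrative):

```python
import numpy as np

def lpsp_consecutive(supply, load, battery_kwh, window):
    """Loss of Power Supply Probability over consecutive-hour windows.

    Steps an hourly energy balance through a simple battery model, flags hours
    in which the load cannot be met, and returns the fraction of `window`-hour
    blocks that contain at least one deficit hour.
    """
    soc = battery_kwh                        # start with a full battery
    deficit = np.zeros(len(load), dtype=bool)
    for t, (s, l) in enumerate(zip(supply, load)):
        soc = min(battery_kwh, soc + s - l)  # charge-limited energy balance
        if soc < 0:
            deficit[t] = True                # unmet load this hour
            soc = 0.0
    blocks = [deficit[i:i + window].any()
              for i in range(0, len(deficit) - window + 1)]
    return float(np.mean(blocks))
```

Sizing then amounts to searching over PV area, turbine rating and battery capacity for the cheapest combination whose LPSP stays below the target value.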

  1. Interactive simulations as teaching tools for engineering mechanics courses

    Science.gov (United States)

    Carbonell, Victoria; Romero, Carlos; Martínez, Elvira; Flórez, Mercedes

    2013-07-01

    This study aimed to gauge the effect of interactive simulations in class as an active teaching strategy for a mechanics course. Engineering analysis and design often use the properties of planar sections in calculations. In the stress analysis of a beam under bending and torsional loads, cross-sectional properties are used to determine the stress and displacement distributions in the beam cross section. The centroid, moments and products of inertia of an area made up of several common shapes (usually rectangles) may thus be obtained by adding the moments of inertia of the component areas (U-shape, L-shape, C-shape, etc.). This procedure is used to calculate the second moments of structural shapes in engineering practice, because the determination of their moments of inertia is necessary for the design of structural components. This paper presents examples of interactive simulations developed for teaching the ‘Mechanics and mechanisms’ course at the Universidad Politecnica de Madrid, Spain. The simulations focus on fundamental topics such as centroids, the properties of the moment of inertia, second moments of inertia with respect to two axes, principal moments of inertia and Mohr's circle for plane stress, and were composed using the GeoGebra software. These learning tools feature animations, graphics and interactivity, and were designed to encourage student participation and engagement in active learning activities, to effectively explain and illustrate course topics, and to build student problem-solving skills.
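The composite-area procedure described above can be sketched in code (the U-shape dimensions are invented for illustration; the computation itself is the standard parallel axis theorem):

```python
def composite_inertia(rects):
    """Centroid height and second moment of area about the horizontal centroidal
    axis for a section composed of rectangles.

    rects: list of (width, height, y_centroid) for each component rectangle.
    Uses the parallel axis theorem: Ix = sum(b*h**3/12 + A*d**2).
    """
    area = sum(b * h for b, h, y in rects)
    y_bar = sum(b * h * y for b, h, y in rects) / area       # composite centroid
    ix = sum(b * h ** 3 / 12.0 + b * h * (y - y_bar) ** 2    # own + transfer term
             for b, h, y in rects)
    return y_bar, ix

# U-shape built from two vertical legs and a bottom flange (example dimensions).
u_shape = [
    (1.0, 4.0, 3.0),   # left leg: 1 wide, 4 tall, centroid at y = 3
    (1.0, 4.0, 3.0),   # right leg
    (4.0, 1.0, 0.5),   # bottom flange: 4 wide, 1 tall, centroid at y = 0.5
]
y_bar, ix = composite_inertia(u_shape)
```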

  2. WINS. Market Simulation Tool for Facilitating Wind Energy Integration

    Energy Technology Data Exchange (ETDEWEB)

    Shahidehpour, Mohammad [Illinois Inst. of Technology, Chicago, IL (United States)

    2012-10-30

    Integrating 20% or more wind energy into the system and transmitting large amounts of wind energy over long distances will require a decision-making capability that can handle very large-scale power systems with tens of thousands of buses and lines. There is a need to explore innovative analytical and implementation solutions for continuing reliable operations with the most economical integration of additional wind energy into power systems. A number of wind integration solution paths involve the adoption of new operating policies, dynamic scheduling of wind power across interties, pooling of integration services, and new transmission scheduling practices. Such practices can be examined with the decision tool developed by this project. This project developed a very efficient decision tool called the Wind INtegration Simulator (WINS) and applied it to facilitate wind energy integration studies. WINS focuses on augmenting existing power utility capabilities to support collaborative planning, analysis, and wind integration project implementations. WINS can also simulate energy storage facilities, so that feasibility studies of integrated wind energy system applications can be performed for systems with high wind energy penetration. The development of WINS represents a major expansion of the POwer Market Simulator (POMS), a very efficient decision tool developed by IIT that has been used extensively for power system studies for decades.
Specifically, WINS provides the following advantages: (1) an integrated framework for the comprehensive modeling of DC transmission configurations, including mono-pole, bi-pole, tri-pole, back-to-back, and multi-terminal connections, as well as AC/DC converter models including current source converters (CSC) and voltage source converters (VSC); (2) an existing shortcoming of traditional decision tools for wind integration is the limited availability of user interface, i.e., decision

  3. Managing simulation-based training: A framework for optimizing learning, cost, and time

    Science.gov (United States)

    Richmond, Noah Joseph

    This study provides a management framework for optimizing training programs for learning, cost, and time when using simulation-based training (SBT) and reality-based training (RBT) as resources. Simulation is shown to be an effective means of implementing activity substitution as a way to reduce risk. The risk profiles of 22 US Air Force vehicles are calculated, and the potential risk reduction is calculated under the assumption of perfect substitutability of RBT and SBT. Methods are subsequently developed to relax the assumption of perfect substitutability. The transfer effectiveness ratio (TER) concept is defined and modeled as a function of the quality of the simulator used and the requirements of the activity trained. The Navy F/A-18 is then analyzed in a case study illustrating how learning can be maximized subject to constraints on cost and time, as well as to the decision maker's preferences for the proportional and absolute use of simulation. Solution methods for optimizing multiple activities across shared resources are then provided. Finally, a simulation strategy comprising an operations planning program (OPP), an implementation program (IP), an acquisition program (AP), and a pedagogical research program (PRP) is detailed. The study provides the theoretical tools to understand how to leverage SBT, a case study demonstrating these tools' efficacy, and a set of policy recommendations to enable the US military to better utilize SBT in the future.
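The TER-based substitution idea can be sketched numerically (a toy model, not the study's method: the cost figures, the proportional-use cap, and the brute-force search are all illustrative assumptions):

```python
def learning_gain(rbt_hours, sbt_hours, ter):
    """Effective training value when SBT substitutes for RBT.

    `ter` (transfer effectiveness ratio) is the RBT hours of value delivered
    per SBT hour; ter = 1 would mean perfect substitutability.
    """
    return rbt_hours + ter * sbt_hours

def best_mix(budget, rbt_cost, sbt_cost, ter, max_sbt_fraction):
    """Brute-force the RBT/SBT hour mix that maximizes learning under a budget
    and a cap on the proportion of simulated training hours."""
    best = (0.0, 0.0, 0.0)                  # (gain, rbt_hours, sbt_hours)
    steps = 1000
    for i in range(steps + 1):
        rbt = (budget / rbt_cost) * i / steps
        sbt = (budget - rbt * rbt_cost) / sbt_cost
        total = rbt + sbt
        if total > 0 and sbt / total > max_sbt_fraction:
            continue                        # violates the proportional-use cap
        gain = learning_gain(rbt, sbt, ter)
        if gain > best[0]:
            best = (gain, rbt, sbt)
    return best
```

Even a crude search like this shows the framework's core trade-off: cheap simulator hours are worth buying up to the point where the TER discount and the decision maker's proportional-use preference bind.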

  4. OPTIMIZATION METHOD FOR VIRTUAL PRODUCT DEVELOPMENT BASED ON SIMULATION METAMODEL AND ITS APPLICATION

    Institute of Scientific and Technical Information of China (English)

    Pan Jun; Fan Xiumin; Ma Dengzhe; Jin Ye

    2003-01-01

    Virtual product development (VPD) is essentially based on simulation. Due to computational inefficiency, traditional engineering simulation software and optimization methods are inadequate for the optimization problems arising in VPD. An optimization method based on simulation metamodels is therefore proposed to satisfy the needs of the complex optimal designs driven by VPD. This method extends the current design of experiments (DOE) with various metamodeling technologies. Simulation metamodels are built to approximate detailed simulation codes, so as to provide a link between optimization and simulation, or to serve as a bridge for simulation software integration across different domains. An example of the optimal design of a composite material structure is used to demonstrate the newly introduced method.

  5. SBMLSimulator: A Java Tool for Model Simulation and Parameter Estimation in Systems Biology

    Directory of Open Access Journals (Sweden)

    Alexander Dörr

    2014-12-01

    Full Text Available The identification of suitable model parameters for biochemical reactions has been recognized as a quite difficult endeavor. Parameter values from the literature or from experiments can often not be combined directly in complex reaction systems. Nature-inspired optimization techniques can find appropriate sets of parameters that calibrate a model to experimentally obtained time-series data. We present SBMLsimulator, a tool that combines the Systems Biology Simulation Core Library for the dynamic simulation of biochemical models with the heuristic optimization framework EvA2. SBMLsimulator provides an intuitive graphical user interface with various options as well as a fully featured command-line interface for large-scale and script-based model simulation and calibration. In a parameter estimation study based on a published model and artificial data, we demonstrate the capability of SBMLsimulator to identify parameters. SBMLsimulator is useful both for the interactive simulation and exploration of the parameter space and for large-scale model calibration and estimation of uncertain parameter values.
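SBMLsimulator itself is a Java tool; purely to illustrate the calibration concept (fitting model parameters to artificial time-series data), a miniature parameter estimation on a one-parameter ODE can be sketched in Python:

```python
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import minimize_scalar

# Toy one-parameter model: dx/dt = -k*x. We calibrate k against artificial
# noisy data, mimicking in miniature what SBMLsimulator does on SBML models.
def model(x, t, k):
    return -k * x

t = np.linspace(0.0, 5.0, 20)
true_k, x0 = 0.8, 10.0
data = odeint(model, x0, t, args=(true_k,)).ravel()
data = data + np.random.default_rng(0).normal(0.0, 0.05, data.shape)  # noise

def sse(k):
    """Sum of squared errors between simulated and 'measured' trajectories."""
    sim = odeint(model, x0, t, args=(k,)).ravel()
    return float(np.sum((sim - data) ** 2))

fit = minimize_scalar(sse, bounds=(0.01, 5.0), method="bounded")
```

Here a bounded scalar search replaces the EvA2 heuristics; for multi-parameter SBML models a population-based optimizer over the same kind of objective would be used instead.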

  6. Molecular dynamics simulation of subnanometric tool-workpiece contact on a force sensor-integrated fast tool servo for ultra-precision microcutting

    Energy Technology Data Exchange (ETDEWEB)

    Cai, Yindi [Department of Nanomechanics, Tohoku University, Sendai 980-8579 (Japan); Chen, Yuan-Liu, E-mail: yuanliuchen@nano.mech.tohoku.ac.jp [Department of Nanomechanics, Tohoku University, Sendai 980-8579 (Japan); Shimizu, Yuki; Ito, So; Gao, Wei [Department of Nanomechanics, Tohoku University, Sendai 980-8579 (Japan); Zhang, Liangchi [School of Mechanical and Manufacturing Engineering, The University of New South Wales, NSW 2052 (Australia)

    2016-04-30

    Highlights:
    • Subnanometric contact between a diamond tool and a copper workpiece surface is investigated by MD simulation.
    • A multi-relaxation time technique is proposed to eliminate the influence of atom vibrations.
    • The accuracy of the elastic-plastic transition contact depth estimation is improved by observing the residual defects.
    • The simulation results are beneficial for the optimization of next-generation microcutting instruments.
    Abstract: This paper investigates the contact characteristics between a copper workpiece and a diamond tool in a force sensor-integrated fast tool servo (FS-FTS) for single-point diamond microcutting and in-process measurement of ultra-precision surface forms of the workpiece. Molecular dynamics (MD) simulations are carried out to identify the subnanometric elastic-plastic transition contact depth at which plastic deformation in the workpiece is initiated. This critical depth can be used to optimize the FS-FTS as well as the cutting/measurement process. It is shown that the vibrations of the copper atoms in the MD model have a great influence on the subnanometric MD simulation results. A multi-relaxation time method is then proposed to reduce the influence of the atom vibrations, based on the fact that the dominant vibration component has a certain period determined by the size of the MD model. It is also identified that, for a subnanometric contact depth, the position of the tool tip at which the contact force is zero during the retracting operation of the tool does not correspond to the final depth of the permanent contact impression on the workpiece surface. The accuracy of identification of the transition contact depth is therefore improved by observing the residual defects on the workpiece surface after tool retraction.

  7. Nuclear fuel cycle system simulation tool based on high-fidelity component modeling

    Energy Technology Data Exchange (ETDEWEB)

    Ames, David E.

    2014-02-01

    The DOE is currently directing extensive research into developing fuel cycle technologies that will enable the safe, secure, economic, and sustainable expansion of nuclear energy. The task is formidable considering the numerous fuel cycle options, the large dynamic systems that each represents, and the necessity of accurately predicting their behavior. The path to successfully developing and implementing an advanced fuel cycle is highly dependent on the modeling capabilities and simulation tools available for performing relevant analysis to assist stakeholders in decision making. Therefore, a high-fidelity fuel cycle simulation tool that performs system analysis, including uncertainty quantification and optimization, was developed. The resulting simulator also includes the capability to calculate environmental impact measures for individual components and for the system. An integrated system method and analysis approach that provides consistent and comprehensive evaluations of advanced fuel cycles was developed. A general approach was utilized, allowing the system to be modified to provide analysis for other systems with similar attributes; this provides a framework for simulating many different fuel cycle options. Two example fuel cycle configurations were developed to take advantage of used fuel recycling and transmutation capabilities in waste management scenarios leading to minimized waste inventories.

  8. A Simulation Framework for Optimal Energy Storage Sizing

    Directory of Open Access Journals (Sweden)

    Carlos Suazo-Martínez

    2014-05-01

    Full Text Available Despite the increasing interest in Energy Storage Systems (ESS), quantification of their technical and economic benefits remains a challenge. To assess the use of ESS, a simulation approach for optimal ESS sizing is presented. The algorithm is based on an adapted unit commitment that includes ESS operational constraints, and on the use of high-performance computing (HPC). Multiple short-term simulations are carried out within a multiple-year horizon. The evaluation is performed for Chile's Northern Interconnected Power System (SING). The authors show that a single-year evaluation can lead to sub-optimal results when determining the optimal ESS size; it is therefore advisable to perform long-term evaluations of ESS. The importance of detailed simulation for adequately assessing ESS contributions and fully capturing storage value is also discussed. Furthermore, the robustness of the optimal sizing approach is evaluated by means of a sensitivity analysis. The results suggest that regulatory frameworks should recognize multiple value streams from storage in order to encourage greater ESS integration.

  9. Mixed integer simulation optimization for optimal hydraulic fracturing and production of shale gas fields

    Science.gov (United States)

    Li, J. C.; Gong, B.; Wang, H. G.

    2016-08-01

    Optimal development of shale gas fields involves designing the most productive fracturing network for hydraulic stimulation and operating the wells appropriately throughout the production period. A hydraulic fracturing network design (well placement, number of fracturing stages, and fracture lengths) is defined by specifying a set of integer-ordered blocks in which to drill wells and create fractures in a discrete shale gas reservoir model, while the well control variables, such as bottom-hole pressures or production rates, are real-valued. Shale gas development problems can therefore be formulated mathematically as mixed-integer optimization models. A shale gas reservoir simulator is used to evaluate the production performance of a given hydraulic fracturing and well control plan. Finding the optimal fracturing design and well operation is challenging because the problem is a mixed-integer optimization problem and entails computationally expensive reservoir simulations. A dynamic simplex interpolation-based alternate subspace (DSIAS) search method is applied to the mixed-integer optimization problems associated with shale gas development projects. The optimization performance is demonstrated with an example case from the development of the Barnett Shale field, and the results of DSIAS are compared with those of a pattern search algorithm.

  10. Research advances in coupling bionic optimization design method for CNC machine tools based on ergonomics

    Directory of Open Access Journals (Sweden)

    Shihao LIU

    2015-06-01

    Full Text Available Currently, the dynamic and static performance of most Chinese CNC machine tools lags behind that of comparable foreign products, and the human-centered design demands of CNC machine tool users are often ignored, so the overall competitiveness of domestic CNC machine tools is relatively low. To address this problem, ergonomics and coupling bionics are adopted to study a collaborative optimization design method for CNC machine tools, building on domestic and foreign research on machine tool design methods. A "man-machine-environment" interaction mechanism for CNC machine tools is built with the help of ergonomics, from which ergonomic design criteria for CNC machine tools are obtained. Taking coupling bionics as the theoretical basis, the multiple coupling mechanism of "morphology-structure-function-adaptive growth" in biological structures is studied, structures with beneficial mechanical performance are extracted, and a structural coupling bionic design technology for CNC machine tools is derived by combining these findings with the similarity principle. Combining the ergonomic design criteria with the coupling bionic design technology, and considering the interaction and coupling mechanisms among CNC machine tool performance measures, a new multi-objective optimization design method is obtained and verified through prototype experiments. The new optimization design method not only helps improve the whole machine's dynamic and static performance, but also has bright prospects owing to its human-oriented design concept.

  11. Tokamak Scenario Trajectory Optimization Using Fast Integrated Simulations

    Science.gov (United States)

    Urban, Jakub; Artaud, Jean-François; Vahala, Linda; Vahala, George

    2015-11-01

    We employ a fast integrated tokamak simulator, METIS, for optimizing tokamak discharge trajectories. METIS is based on scaling laws and simplified transport equations, is validated on existing experiments, and can simulate a full tokamak discharge in about one minute. Rapid free-boundary equilibrium post-processing using FREEBIE provides estimates of PF coil currents and forces. We employ several optimization strategies for key trajectories, such as Ip or heating power, of a model ITER hybrid discharge. Local and global algorithms with single or multiple objective functions show how to reach optimum performance, stationarity, or minimum flux consumption. We constrain fundamental operation parameters, such as the ramp-up rate, PF coil currents and forces, and heating power. As an example, we demonstrate the benefit of current over-shoot for the hybrid mode, consistent with previous results; this particular optimization took less than 2 hours on a single PC. Overall, we have established a powerful approach for rapid, non-linear tokamak scenario optimization under operational constraints, pertinent to the design and operation of existing and future devices.

  12. When teams shift among processes: insights from simulation and optimization.

    Science.gov (United States)

    Kennedy, Deanna M; McComb, Sara A

    2014-09-01

    This article introduces process shifts to study the temporal interplay among transition and action processes espoused in the recurring phase model proposed by Marks, Mathieu, and Zaccaro (2001). Process shifts are those points in time when teams complete a focal process and change to another process. By using team communication patterns to measure process shifts, this research explores (a) when teams shift among different transition processes and initiate action processes and (b) the potential of different interventions, such as communication directives, to manipulate process shift timing and order and, ultimately, team performance. Virtual experiments are employed to compare data from observed laboratory teams not receiving interventions, simulated teams receiving interventions, and optimal simulated teams generated using genetic algorithm procedures. Our results offer insights about the potential for different interventions to affect team performance. Moreover, certain interventions may promote discussions about key issues (e.g., tactical strategies) and facilitate shifting among transition processes in a manner that emulates optimal simulated teams' communication patterns. Thus, we contribute to theory regarding team processes in 2 important ways. First, we present process shifts as a way to explore the timing of when teams shift from transition to action processes. Second, we use virtual experimentation to identify those interventions with the greatest potential to affect performance by changing when teams shift among processes. Additionally, we employ computational methods including neural networks, simulation, and optimization, thereby demonstrating their applicability in conducting team research.

  13. Parametric Optimization and Prediction Tool for Lunar Surface Systems Excavation Tasks Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Honeybee Robotics proposes to develop a software tool for facilitating lunar excavation system trades in support of selecting an optimal architecture. This will...

  14. Toolkit of Available EPA Green Infrastructure Modeling Software: Watershed Management Optimization Support Tool (WMOST)

    Science.gov (United States)

    Watershed Management Optimization Support Tool (WMOST) is a software application designed tofacilitate integrated water resources management across wet and dry climate regions. It allows waterresources managers and planners to screen a wide range of practices across their watersh...

  15. PROBEmer: A web-based software tool for selecting optimal DNA oligos

    National Research Council Canada - National Science Library

    Emrich, Scott J; Lowe, Mary; Delcher, Arthur L

    2003-01-01

    PROBEmer (http://probemer.cs.loyola.edu) is a web-based software tool that enables a researcher to select optimal oligos for PCR applications and multiplex detection platforms including oligonucleotide microarrays and bead-based arrays...

  17. 76 FR 5832 - International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA...

    Science.gov (United States)

    2011-02-02

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF LABOR Employment and Training Administration International Business Machines (IBM), Software Group Business Unit... at International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools...

  18. Software tool for horizontal-axis wind turbine simulation

    Energy Technology Data Exchange (ETDEWEB)

    Vitale, A.J. [Instituto Argentino de Oceanografia, Camino La Carrindanga Km. 7, 5 CC 804, B8000FWB Bahia Blanca (Argentina); Universidad Tecnologica Nacional Facultad Regional Bahia Blanca, GESE, 11 de Abril 461, B8000LMI Bahia Blanca (Argentina); Dpto. de Ing. Electrica y de Computadoras, Universidad Nacional del Sur, Av. Alem 1253, 8000 Bahia Blanca (Argentina); Rossi, A.P. [Universidad Tecnologica Nacional Facultad Regional Bahia Blanca, GESE, 11 de Abril 461, B8000LMI Bahia Blanca (Argentina); Dpto. de Ing. Electrica y de Computadoras, Universidad Nacional del Sur, Av. Alem 1253, 8000 Bahia Blanca (Argentina)

    2008-07-15

    The main problem in a wind turbine generator design project is designing blades capable of satisfying the specific energy requirement of an electric system with optimum performance. Once the blade has been designed for optimum operation at a particular rotor angular speed, it is necessary to determine the overall performance of the rotor under the range of wind speeds that it will encounter. A software tool that simulates low-power, horizontal-axis wind turbines was developed for this purpose. With this program, the user can calculate the rotor power output for any combination of wind and rotor speeds, with definite blade shape and airfoil characteristics. The software also provides information about the distribution of forces along the blade span for different operational conditions. (author)
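
    The core calculation such a tool performs, rotor power for a given wind/rotor speed combination, can be sketched as follows. The power-coefficient curve below is an assumed analytic form for illustration, not the blade-element model of the cited tool.

    ```python
    import math

    def power_coefficient(tsr):
        # Illustrative Cp(lambda) curve for a small HAWT, peaking at a
        # tip-speed ratio of 7 with Cp = 0.44 (assumed shape; any real
        # curve must stay below the Betz limit of 16/27).
        if tsr <= 0:
            return 0.0
        x = tsr / 7.0
        return max(0.0, 0.44 * x * math.exp(1.0 - x))

    def rotor_power(wind_speed, rotor_speed, radius, rho=1.225):
        # P = 0.5 * rho * A * Cp(lambda) * v^3, with rotor_speed in rad/s
        # and the tip-speed ratio lambda = omega * R / v.
        tsr = rotor_speed * radius / wind_speed
        area = math.pi * radius ** 2
        return 0.5 * rho * area * power_coefficient(tsr) * wind_speed ** 3
    ```

    Sweeping `rotor_speed` for a fixed `wind_speed` (or vice versa) reproduces the kind of power map the tool provides for arbitrary wind/rotor speed combinations.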

  19. APRON: A Cellular Processor Array Simulation and Hardware Design Tool

    Directory of Open Access Journals (Sweden)

    David R. W. Barr

    2009-01-01

    Full Text Available We present a software environment for the efficient simulation of cellular processor arrays (CPAs. This software (APRON is used to explore algorithms that are designed for massively parallel fine-grained processor arrays, topographic multilayer neural networks, vision chips with SIMD processor arrays, and related architectures. The software uses a highly optimised core combined with a flexible compiler to provide the user with tools for the design of new processor array hardware architectures and the emulation of existing devices. We present performance benchmarks for the software processor array implemented on standard commodity microprocessors. APRON can be configured to use additional processing hardware if necessary and can be used as a complete graphical user interface and development environment for new or existing CPA systems, allowing more users to develop algorithms for CPA systems.

  20. Optimal estuarine sediment monitoring network design with simulated annealing.

    Science.gov (United States)

    Nunes, L M; Caeiro, S; Cunha, M C; Ribeiro, L

    2006-02-01

    An objective function based on geostatistical variance reduction, constrained to the reproduction of the probability distribution functions of selected physical and chemical sediment variables, is applied to the selection of the best set of compliance monitoring stations in the Sado river estuary in Portugal. These stations were to be selected from a large set of sampling stations from a prior field campaign. Simulated annealing was chosen to solve the optimisation function model. Both the combinatorial problem structure and the resulting candidate sediment monitoring networks are discussed, and the optimal dimension and spatial distribution are proposed. An optimal network of sixty stations was obtained from an original 153-station sampling campaign.
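
    The combinatorial structure, choosing a fixed-size subset of stations so as to minimize an estimation-variance objective, can be sketched with a generic simulated-annealing loop. The objective below is a toy surrogate (mean squared distance to the nearest selected station), standing in for the geostatistical variance-reduction criterion of the paper.

    ```python
    import math
    import random

    def coverage_variance(selected, stations):
        # Toy surrogate for the geostatistical estimation variance:
        # mean squared distance from every candidate station to its
        # nearest selected monitoring station (lower is better).
        total = 0.0
        for s in stations:
            total += min((s[0] - t[0]) ** 2 + (s[1] - t[1]) ** 2 for t in selected)
        return total / len(stations)

    def anneal_network(stations, k, steps=4000, t0=1.0, seed=1):
        # Simulated annealing over subsets: swap one selected station for
        # an unselected one, accepting uphill moves with the Metropolis
        # probability under a linear cooling schedule.
        rng = random.Random(seed)
        current = rng.sample(stations, k)
        cost = coverage_variance(current, stations)
        best, best_cost = list(current), cost
        for i in range(steps):
            temp = t0 * (1.0 - i / steps) + 1e-9
            cand = list(current)
            cand[rng.randrange(k)] = rng.choice(
                [s for s in stations if s not in current])
            cand_cost = coverage_variance(cand, stations)
            if (cand_cost < cost
                    or rng.random() < math.exp((cost - cand_cost) / temp)):
                current, cost = cand, cand_cost
                if cost < best_cost:
                    best, best_cost = list(current), cost
        return best, best_cost

    random.seed(0)
    pts = [(random.random(), random.random()) for _ in range(60)]
    network, variance = anneal_network(pts, k=8)
    ```

    In the paper the candidate set came from a 153-station field campaign and the accepted networks were additionally constrained to reproduce the probability distributions of the sediment variables; that constraint is omitted here.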

  1. Automatic, optimized interface placement in forward flux sampling simulations

    CERN Document Server

    Kratzer, Kai; Allen, Rosalind J

    2013-01-01

    Forward flux sampling (FFS) provides a convenient and efficient way to simulate rare events in equilibrium or non-equilibrium systems. FFS ratchets the system from an initial state to a final state via a series of interfaces in phase space. The efficiency of FFS depends sensitively on the positions of the interfaces. We present two alternative methods for placing interfaces automatically and adaptively in their optimal locations, on-the-fly as an FFS simulation progresses, without prior knowledge or user intervention. These methods allow the FFS simulation to advance efficiently through bottlenecks in phase space by placing more interfaces where the probability of advancement is lower. The methods are demonstrated both for a single-particle test problem and for the crystallization of Yukawa particles. By removing the need for manual interface placement, our methods both facilitate the setting up of FFS simulations and improve their performance, especially for rare events which involve complex trajectories thr...

  2. Designing a new tool for modeling and simulation of discrete-event based systems

    OpenAIRE

    2009-01-01

    This paper discusses the design, development, and application of a new Petri net simulator for modeling and simulation of discrete-event systems (e.g., information systems). The new tool is called GPenSIM (General Purpose Petri Net Simulator). First, the paper presents the motivation for developing a new tool through a brief literature study. Second, the design and architecture of the tool are described. Finally, an application example illustrates the use of the tool.

  3. Aircraft Course Optimization Tool Using GPOPS MATLAB Code

    Science.gov (United States)

    2012-03-01

    experiences when the problem becomes too complex. Acknowledgements: This thesis would never have come to fruition without the help of those around me. I must...Florida, and Stanford University's Sparse Nonlinear OPTimizer (SNOPT) solver. The addition of several ACOT-specific scripts frames the problem to the GPOPS... experiences with the two-lobe radar cross section is the discontinuity where the RCS is 1 m2; however, it is thought this is ignored due to the discrete

  4. Optimal Facility Location Tool for Logistics Battle Command (LBC)

    Science.gov (United States)

    2015-08-01

    possible A/SPODs. Appendix B. VBA Code: The following screen shots of the VBA code are provided to document the code used to run the model ...research is to develop a model that optimizes the selection of Air and Sea Ports of Debarkation and intermediate logistical distribution centers while...conduct transportation in a theater of operations. The model developed by AFIT uses a network approach to solve a multi-objective model. The model was

  5. Simulated Interactive Research Experiments as Educational Tools for Advanced Science

    Science.gov (United States)

    Tomandl, Mathias; Mieling, Thomas; Losert-Valiente Kroon, Christiane M.; Hopf, Martin; Arndt, Markus

    2015-09-01

    Experimental research has become complex and thus a challenge to science education. Only very few students can typically be trained on advanced scientific equipment. It is therefore important to find new tools that allow all students to acquire laboratory skills individually and independent of where they are located. In a design-based research process we have investigated the feasibility of using a virtual laboratory as a photo-realistic and scientifically valid representation of advanced scientific infrastructure to teach modern experimental science, here, molecular quantum optics. We found a concept based on three educational principles that allows undergraduate students to become acquainted with procedures and concepts of a modern research field. We find a significant increase in student understanding using our Simulated Interactive Research Experiment (SiReX), by evaluating the learning outcomes with semi-structured interviews in a pre/post design. This suggests that this concept of an educational tool can be generalized to disseminate findings in other fields.

  8. Tools for Optimizing Management of a Spatially Variable Organic Field

    Directory of Open Access Journals (Sweden)

    Thomas Panagopoulos

    2015-03-01

    Full Text Available Geostatistical tools were used to estimate spatial relations between wheat yield and soil parameters under organic farming field conditions. Thematic maps of each factor were created as raster images in R software using kriging. The Geographic Resources Analysis Support System (GRASS) calculated the principal component analysis raster images for soil parameters and yield. The rasters arising from PC1 of the soil parameters and of the yield parameters showed a high linear correlation (r = 0.75) and explained 48.50% of the data variance. The data show that durum wheat yield is strongly affected by soil parameter variability, and thus, the average production can be substantially lower than its potential. Soil water content was the limiting factor for grain yield, and not nitrate as in other similar studies. The use of precision agriculture tools helped reduce the level of complexity among the measured parameters by grouping several parameters, demonstrating that precision agriculture tools can be applied in small organic fields, reducing costs and increasing wheat yield. Consequently, site-specific applications could be expected to improve the yield without excessively increasing the cost for farmers, and to enhance environmental and economic benefits.
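
    The central step, correlating PC1 of the soil parameters with yield, can be sketched with plain numpy. The synthetic data below are assumed for illustration (soil water content driving both nitrate and yield, so PC1 picks up the common factor); the study itself used kriged rasters in R and GRASS.

    ```python
    import numpy as np

    def first_principal_component(X):
        # PC1 scores of standardized data X (rows = sampling points,
        # columns = soil parameters). Note the sign of PC1 is arbitrary.
        Z = (X - X.mean(axis=0)) / X.std(axis=0)
        U, S, Vt = np.linalg.svd(Z, full_matrices=False)
        return U[:, 0] * S[0]

    def pearson_r(a, b):
        a = a - a.mean()
        b = b - b.mean()
        return float(a @ b / np.sqrt((a @ a) * (b @ b)))

    # Synthetic field data (assumed numbers, for illustration only)
    rng = np.random.default_rng(3)
    n = 200
    water = rng.normal(0.25, 0.05, n)            # volumetric water content
    nitrate = 15.0 + 40.0 * water + rng.normal(0.0, 2.0, n)
    ph = rng.normal(6.5, 0.3, n)                 # independent of the others
    soil = np.column_stack([water, nitrate, ph])
    yield_t = 2.0 + 8.0 * water + 0.01 * nitrate + rng.normal(0.0, 0.1, n)

    pc1 = first_principal_component(soil)
    r = pearson_r(pc1, yield_t)                  # strong |r|, sign arbitrary
    ```

    Because water content drives the yield in this synthetic example, the correlation between PC1 and yield is strong, mirroring the grouping effect the study reports.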

  9. ConvAn: a convergence analyzing tool for optimization of biochemical networks.

    Science.gov (United States)

    Kostromins, Andrejs; Mozga, Ivars; Stalidzans, Egils

    2012-01-01

    Dynamic models of biochemical networks are usually described by systems of nonlinear differential equations. When models are optimized for parameter estimation or for the design of new properties, mainly numerical methods are used. This causes problems of optimization predictability: most numerical optimization methods have stochastic properties, and convergence of the objective function to the global optimum is hardly predictable. Determining a suitable optimization method and the necessary duration of optimization becomes critical when evaluating a large number of combinations of adjustable parameters or when working with large dynamic models. The task is complex because of the variety of optimization methods and software tools and the nonlinearity of models in different parameter spaces. The software tool ConvAn analyzes the statistical properties of convergence dynamics for optimization runs with a particular optimization method, model, software tool, set of optimization-method parameters, and number of adjustable model parameters. Convergence curves can be normalized automatically to enable comparison of different methods and models on the same scale. With ConvAn's biochemistry-adapted graphical user interface, different optimization methods can be compared in terms of their ability to find the global optimum, or values close to it, and the computational time necessary to reach them. Optimization performance can be estimated for different numbers of adjustable parameters, and the necessary optimization time can be assessed statistically as a function of the required optimization accuracy. Optimization methods that are unsuitable for a particular task because of poor repeatability or convergence properties can be rejected. The software ConvAn is freely available at www.biosystems.lv/convan.

  10. CFD-Based Design Optimization Tool Developed for Subsonic Inlet

    Science.gov (United States)

    1995-01-01

    The traditional approach to the design of engine inlets for commercial transport aircraft is a tedious process that ends with a less-than-optimum design. With the advent of high-speed computers and the availability of more accurate and reliable computational fluid dynamics (CFD) solvers, numerical optimization can effectively be used to design an aerodynamic inlet lip that enhances engine performance. The designers' experience at Boeing Corporation showed that beyond some upper limit on the peak Mach number on the inlet surface, engine performance degrades excessively. Thus, our objective was to optimize efficiency (minimize the peak Mach number) at maximum cruise without compromising performance at other operating conditions. The NASA Lewis Research Center, in collaboration with Boeing, developed an integrated procedure to find the optimum shape of a subsonic inlet lip, combining the CFD code NPARC with the numerical optimization code ADS. We used a GRAPE-based three-dimensional grid generator to help automate the optimization procedure. The inlet lip shape at the crown and the keel was described as a superellipse, with the superellipse exponents and radii ratios as design variables. Three operating conditions were considered in this study: cruise, takeoff, and rolling takeoff. Three-dimensional Euler computations were carried out to obtain the flow field. At the initial design, the peak Mach numbers for maximum cruise, takeoff, and rolling takeoff conditions were 0.88, 1.772, and 1.61, respectively; the acceptable upper limits on the takeoff and rolling takeoff Mach numbers were 1.55 and 1.45. Since the initial design provided by Boeing was found to be optimum with respect to the maximum cruise condition, the sum of the peak Mach numbers at takeoff and rolling takeoff was minimized in the current study while the maximum cruise Mach number was constrained to be close to that of the existing design. With this objective, the

  11. Operational Excellence through Schedule Optimization and Production Simulation of Application Specific Integrated Circuits.

    Energy Technology Data Exchange (ETDEWEB)

    Flory, John Andrew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Padilla, Denise D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gauthier, John H. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Zwerneman, April Marie [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Miller, Steven P [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2016-05-01

    Upcoming weapon programs require an aggressive increase in Application Specific Integrated Circuit (ASIC) production at Sandia National Laboratories (SNL). SNL has developed unique modeling and optimization tools that have been instrumental in improving ASIC production productivity and efficiency, identifying optimal operational and tactical execution plans under resource constraints, and providing confidence in successful mission execution. With ten products and unprecedented levels of demand, a single set of shared resources, highly variable processes, and the need for external supplier task synchronization, scheduling is an integral part of successful manufacturing. The scheduler uses an iterative multi-objective genetic algorithm and a multi-dimensional performance evaluator. Schedule feasibility is assessed using a discrete event simulation (DES) that incorporates operational uncertainty, variability, and resource availability. The tools provide rapid scenario assessments and responses to variances in the operational environment, and have been used to inform major equipment investments and workforce planning decisions in multiple SNL facilities.
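
    The schedule-feasibility idea, replaying a candidate schedule through a discrete event simulation with stochastic processing times and reading off risk percentiles, can be sketched in a few lines. The job counts, durations, and variability factor below are assumed for illustration; the SNL tools model many more resources and constraints.

    ```python
    import heapq
    import random

    def simulate_makespan(job_means, machines, rng):
        # Discrete-event sketch: jobs are released in schedule order to
        # the earliest-available machine; durations vary around their
        # planned means to model operational uncertainty. A toy stand-in
        # for the full DES described in the report.
        free_at = [0.0] * machines            # next-free time of each machine
        heapq.heapify(free_at)
        makespan = 0.0
        for mean in job_means:
            start = heapq.heappop(free_at)    # earliest machine serves the job
            duration = mean * rng.uniform(0.8, 1.4)
            end = start + duration
            makespan = max(makespan, end)
            heapq.heappush(free_at, end)
        return makespan

    rng = random.Random(7)
    planned = [5.0] * 10                      # ten lots, 5 time units each (assumed)
    runs = sorted(simulate_makespan(planned, machines=2, rng=rng)
                  for _ in range(500))
    p50 = runs[len(runs) // 2]
    p90 = runs[int(0.9 * len(runs))]          # schedule-risk percentile
    ```

    Replicating the simulation many times turns a single deterministic schedule into a distribution of completion times, which is what gives planners confidence (or not) in mission execution under variability.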

  12. Generating optimal initial conditions for smooth particle hydrodynamics (SPH) simulations

    Energy Technology Data Exchange (ETDEWEB)

    Diehl, Steven [Los Alamos National Laboratory; Rockefeller, Gabriel M [Los Alamos National Laboratory; Fryer, Christopher L [Los Alamos National Laboratory

    2008-01-01

    We present a new optimal method for setting up initial conditions for smoothed particle hydrodynamics (SPH) simulations, which may also be of interest for N-body simulations. The new method is based on weighted Voronoi tessellations (WVTs) and can meet arbitrarily complex spatial resolution requirements. We conduct a comprehensive review of existing SPH setup methods and outline their advantages, limitations, and drawbacks. A serial version of our WVT setup method is publicly available, and we give detailed instructions on how to easily implement the new method on top of an existing parallel SPH code.

  13. Modified Sequential Kriging Optimization for Multidisciplinary Complex Product Simulation

    Institute of Scientific and Technical Information of China (English)

    Wang Hao; Wang Shaoping; Mileta M.Tomovic

    2010-01-01

    To address the high cost of computer simulation in optimization problems, Kriging surrogate models are widely used to decrease computation time. Since sequential Kriging optimization is time consuming, this article extends the expected-improvement criterion and puts forward a modified sequential Kriging optimization (MSKO). The method adds more than one sample point at a time, turning two successive optimization problems into one. Before the Kriging model is re-fitted, the new sample points are verified to ensure that they do not overlap previous ones and that the distance between any two sample points is not too small. The article presents a double stopping criterion to keep the root mean square error (RMSE) of the final surrogate model at an acceptable level. An example shows that MSKO approaches the global optimum quickly and accurately, regardless of the initial point. An application to an active suspension indicates that the proposed method is effective.
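
    For reference, the expected-improvement criterion that sequential Kriging optimization maximizes (and that MSKO extends to several points per iteration) has, in its standard form for minimization,

    ```latex
    \mathrm{EI}(x) =
      \bigl(y_{\min} - \hat{y}(x)\bigr)\,
      \Phi\!\left(\frac{y_{\min} - \hat{y}(x)}{s(x)}\right)
      + s(x)\,\varphi\!\left(\frac{y_{\min} - \hat{y}(x)}{s(x)}\right)
    ```

    where \(\hat{y}(x)\) is the Kriging predictor, \(s(x)\) its prediction standard error, \(y_{\min}\) the best objective value sampled so far, and \(\Phi\), \(\varphi\) the standard normal distribution and density functions. The first term rewards exploitation (predicted improvement), the second exploration (predictive uncertainty).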

  14. Proposing "the burns suite" as a novel simulation tool for advancing the delivery of burns education.

    Science.gov (United States)

    Sadideen, Hazim; Wilson, David; Moiemen, Naiem; Kneebone, Roger

    2014-01-01

    Educational theory highlights the importance of contextualized simulation for effective learning. We explored this concept in a burns scenario in a novel, low-cost, high-fidelity, portable, immersive simulation environment (referred to as distributed simulation). This contextualized simulation/distributed simulation combination was named "The Burns Suite" (TBS). A pediatric burn resuscitation scenario was selected after high trainee demand. It was designed on Advanced Trauma and Life Support and Emergency Management of Severe Burns principles and refined using expert opinion through cognitive task analysis. TBS contained "realism" props, briefed nurses, and a simulated patient. Novices and experts were recruited. Five-point Likert-type questionnaires were developed for face and content validity. Cronbach's α was calculated for scale reliability. Semistructured interviews captured responses for qualitative thematic analysis allowing for data triangulation. Twelve participants completed TBS scenario. Mean face and content validity ratings were high (4.6 and 4.5, respectively; range, 4-5). The internal consistency of questions was high. Qualitative data analysis revealed that participants felt 1) the experience was "real" and they were "able to behave as if in a real resuscitation environment," and 2) TBS "addressed what Advanced Trauma and Life Support and Emergency Management of Severe Burns didn't" (including the efficacy of incorporating nontechnical skills). TBS provides a novel, effective simulation tool to significantly advance the delivery of burns education. Recreating clinical challenge is crucial to optimize simulation training. This low-cost approach also has major implications for surgical education, particularly during increasing financial austerity. Alternative scenarios and/or procedures can be recreated within TBS, providing a diverse educational immersive simulation experience.

  15. Optimized GPU simulation of continuous-spin glass models

    CERN Document Server

    Yavors'kii, Taras

    2012-01-01

    We develop a highly optimized code for simulating the Edwards-Anderson Heisenberg model on graphics processing units (GPUs). Using a number of computational tricks such as tiling, data compression and appropriate memory layouts, the simulation code combining over-relaxation, heat bath and parallel tempering moves achieves a peak performance of 0.29 ns per spin update on realistic system sizes, corresponding to a more than 150-fold speed-up over a serial CPU reference implementation. The optimized implementation is used to study the spin-glass transition in a random external magnetic field to probe the existence of a de Almeida-Thouless line in the model, for which we give benchmark results.
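
    The over-relaxation move mentioned above is the microcanonical reflection of each spin about its local exchange field. A serial numpy sketch on a 1D ring (the paper uses 3D lattices and CUDA; geometry and couplings here are assumed) shows why it parallelizes well: updating one sublattice at a time keeps every spin's neighbours fixed, so the move is exactly energy-preserving and embarrassingly parallel within each half-sweep.

    ```python
    import numpy as np

    def ring_energy(spins, J):
        # E = -sum_i J_i (s_i . s_{i+1}) on a periodic chain.
        return -float(np.sum(J * np.einsum('id,id->i',
                                           spins, np.roll(spins, -1, axis=0))))

    def overrelaxation_sweep(spins, J):
        # Reflect each spin about its local field h_i:
        #   s' = 2 (s.h) h / |h|^2 - s
        # This preserves |s| and s.h, hence the total energy, as long as
        # neighbours are not updated simultaneously (checkerboard order).
        n = spins.shape[0]
        for parity in (0, 1):
            idx = np.arange(parity, n, 2)
            h = (np.roll(J, 1)[idx, None] * spins[(idx - 1) % n]
                 + J[idx, None] * spins[(idx + 1) % n])
            sh = np.einsum('id,id->i', spins[idx], h)
            hh = np.einsum('id,id->i', h, h) + 1e-300   # guard divide-by-zero
            spins[idx] = 2.0 * (sh / hh)[:, None] * h - spins[idx]

    rng = np.random.default_rng(0)
    n = 64                                    # even, so the ring is bipartite
    spins = rng.normal(size=(n, 3))
    spins /= np.linalg.norm(spins, axis=1, keepdims=True)
    J = rng.normal(size=n)                    # random spin-glass couplings
    e_before = ring_energy(spins, J)
    for _ in range(10):
        overrelaxation_sweep(spins, J)
    e_after = ring_energy(spins, J)
    ```

    Because the move is deterministic and energy-conserving, it decorrelates configurations cheaply; in the paper it is interleaved with heat-bath and parallel-tempering moves, which supply the actual thermalization.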

  17. OPTIMAL WELL LOCATOR (OWL): A SCREENING TOOL FOR EVALUATING LOCATIONS OF MONITORING WELLS

    Science.gov (United States)

    The Optimal Well Locator (OWL) program was designed and developed by USEPA as a screening tool to evaluate and optimize the placement of wells in long-term monitoring networks at small sites. The first objective of the OWL program is to allow the user to visualize the change ...

  18. Optimization Tool for Direct Water Cooling System of High Power IGBT Modules

    DEFF Research Database (Denmark)

    Bahman, Amir Sajjad; Blaabjerg, Frede

    2016-01-01

    important issue for thermal design engineers. This paper aims to present a user friendly optimization tool for direct water cooling system of a high power module which enables the cooling system designer to identify the optimized solution depending on customer load profiles and available pump power. CFD...

  19. Identifying Cost-Effective Water Resources Management Strategies: Watershed Management Optimization Support Tool (WMOST)

    Science.gov (United States)

    The Watershed Management Optimization Support Tool (WMOST) is a public-domain software application designed to aid decision makers with integrated water resources management. The tool allows water resource managers and planners to screen a wide-range of management practices for c...

  20. Watershed Management Optimization Support Tool (WMOST) v2: User Manual and Case Studies

    Science.gov (United States)

    The Watershed Management Optimization Support Tool (WMOST) is a decision support tool that evaluates the relative cost-effectiveness of management practices at the local or watershed scale. WMOST models the environmental effects and costs of management decisions in a watershed c...

  1. The Simulation and Optimization of Aspheric Plastic Lens Injection Molding

    Institute of Scientific and Technical Information of China (English)

    WEN Jialing; WEN Pengfei

    2005-01-01

    For the purpose of reducing the volumetric shrinkage and its variation, the injection molding process of an aspheric plastic lens was simulated, and several process parameters, including holding pressure, melt temperature, mold temperature, fill time, holding pressure time and cooling time, were optimized using an orthogonal experimental design method. Finally, the optimum process parameters and the degree of influence of each parameter on the average volumetric shrinkage and the volumetric shrinkage variation were obtained.
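
    An orthogonal experimental design of this kind can be analyzed by comparing mean responses per factor level. The sketch below runs a main-effects analysis on a small L4(2^3) array; the factor names and shrinkage responses are invented for illustration (the paper's actual design has six factors), not taken from the study.

```python
# Hypothetical main-effects analysis for a Taguchi-style orthogonal design.
# Rows of L4: coded levels (0/1) of three illustrative process factors.
L4 = [(0, 0, 0), (0, 1, 1), (1, 0, 1), (1, 1, 0)]
factors = ["holding_pressure", "melt_temperature", "mold_temperature"]
shrinkage = [2.1, 1.8, 1.6, 1.9]     # illustrative volumetric shrinkage, %

def main_effects(array, response):
    """Mean response at each level of each factor."""
    effects = []
    for f in range(len(array[0])):
        levels = {}
        for row, y in zip(array, response):
            levels.setdefault(row[f], []).append(y)
        effects.append({lvl: sum(ys) / len(ys) for lvl, ys in levels.items()})
    return effects

effects = main_effects(L4, shrinkage)
# Optimum setting: for each factor, the level with the lowest mean shrinkage.
best_levels = [min(e, key=e.get) for e in effects]
```

The orthogonality of the array is what lets each factor's effect be averaged out independently of the others.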

  2. Simulation Application for Optimization of Solar Collector Array

    OpenAIRE

    Igor Shesho*,; Done Tashevsk

    2014-01-01

    Solar systems offer a comparatively low output density, so increasing the output always means a corresponding increase in the size of the collector area. Collector arrays are therefore occasionally constructed with different azimuth angles and/or slopes, which may be imposed by the location and the structure available to mount the collectors. In this paper, a simulation application is developed for optimizing the solar collector array position and the number of collectors in regard of...

  3. Optimal Results and Numerical Simulations for Flow Shop Scheduling Problems

    Directory of Open Access Journals (Sweden)

    Tao Ren

    2012-01-01

    Full Text Available This paper considers the m-machine flow shop problem with two objectives: makespan with release dates and total quadratic completion time, respectively. For Fm|rj|Cmax, we prove the asymptotic optimality of any dense schedule when the problem scale is large enough. For Fm‖ΣCj2, an improvement strategy with local search is presented to improve the performance of the classical SPT heuristic. At the end of the paper, simulations show the effectiveness of the improvement strategy.
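
    The SPT-plus-local-search idea can be sketched directly: order jobs by total processing time, then keep applying adjacent pairwise swaps while they reduce the total squared completion time. The instance below is a made-up 3-job, 2-machine example, not one from the paper.

```python
def completion_times(seq, p):
    """Completion time of each job on the last machine of a permutation
    flow shop; p[j][k] is the processing time of job j on machine k."""
    m = len(p[0])
    prev = [0.0] * m                 # completion times of the previous job
    out = []
    for j in seq:
        cur = [0.0] * m
        for k in range(m):
            start = max(prev[k], cur[k - 1] if k else 0.0)
            cur[k] = start + p[j][k]
        prev = cur
        out.append(cur[-1])
    return out

def total_sq_completion(seq, p):
    return sum(c * c for c in completion_times(seq, p))

def spt_with_local_search(p):
    """SPT seed (jobs sorted by total processing time) improved by
    adjacent pairwise swaps while they reduce the objective."""
    seq = sorted(range(len(p)), key=lambda j: sum(p[j]))
    improved = True
    while improved:
        improved = False
        for i in range(len(seq) - 1):
            cand = seq[:]
            cand[i], cand[i + 1] = cand[i + 1], cand[i]
            if total_sq_completion(cand, p) < total_sq_completion(seq, p):
                seq, improved = cand, True
    return seq

p = [[3, 2], [1, 1], [2, 3]]         # made-up 3-job, 2-machine instance
best_seq = spt_with_local_search(p)
```

On this instance the swap step matters: plain SPT yields [1, 0, 2] with objective 121, which the local search improves to 104.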

  4. Scheduling Patients’ Appointments: Allocation of Healthcare Service Using Simulation Optimization

    Directory of Open Access Journals (Sweden)

    Ping-Shun Chen

    2015-01-01

    Full Text Available In the service industry, scheduling medical procedures causes difficulties for both patients and management. Factors such as fluctuations in customer demand and service time affect the appointment scheduling systems’ performance in terms of, for example, patients’ waiting time, idle time of resources, and total cost/profits. This research implements four appointment scheduling policies, i.e., constant arrival, mixed patient arrival, three-section pattern arrival, and irregular arrival, in an ultrasound department of a hospital in Taiwan. By simulating the four implemented policies’ optimization procedures, optimal or near-optimal solutions can be obtained for patients per arrival, patients’ inter-arrival time, and the number of the time slots for arrived patients. Furthermore, three objective functions are tested, and the results are discussed. The managerial implications and discussions are summarized to demonstrate how outcomes can be useful for hospital managers seeking to allocate their healthcare service capacities.
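
    The core of such an appointment-scheduling study is a queueing simulation. A minimal single-server sketch of a constant-arrival policy is shown below; the arrival spacing and service times are invented for illustration, not the hospital's data.

```python
def simulate_clinic(arrivals, service_times):
    """Single-server FIFO clinic: returns (mean patient wait, server idle time)."""
    t = 0.0                      # time at which the server next becomes free
    waits, idle = [], 0.0
    for arr, svc in zip(arrivals, service_times):
        if arr > t:
            idle += arr - t      # server waits for the next patient
            t = arr
        waits.append(t - arr)    # patient waits for the server
        t += svc
    return sum(waits) / len(waits), idle

# Constant-arrival policy, illustrative numbers: one patient every 15 min.
arrivals = [0.0, 15.0, 30.0, 45.0]
service = [20.0, 10.0, 15.0, 10.0]
mean_wait, idle_time = simulate_clinic(arrivals, service)
```

Wrapping such a simulation in a search over the arrival pattern (inter-arrival time, patients per slot) is what turns it into the simulation-optimization procedure the abstract describes.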

  5. spsann - optimization of sample patterns using spatial simulated annealing

    Science.gov (United States)

    Samuel-Rosa, Alessandro; Heuvelink, Gerard; Vasques, Gustavo; Anjos, Lúcia

    2015-04-01

    There are many algorithms and computer programs to optimize sample patterns, some private and others publicly available. A few have only been presented in scientific articles and textbooks. This dispersion and somewhat poor availability holds back their wider adoption and further development. We introduce spsann, a new R package for the optimization of sample patterns using spatial simulated annealing. R is the most popular environment for data processing and analysis. Spatial simulated annealing is a well-known method with widespread use for solving optimization problems in the soil and geosciences, mainly because of its robustness against local optima and ease of implementation. spsann offers many optimizing criteria for sampling for variogram estimation (number of points or point-pairs per lag distance class - PPL), trend estimation (association/correlation and marginal distribution of the covariates - ACDC), and spatial interpolation (mean squared shortest distance - MSSD). spsann also includes the mean or maximum universal kriging variance (MUKV) as an optimizing criterion, used when the model of spatial variation is known. PPL, ACDC and MSSD can be combined (PAN) for sampling when the model of spatial variation is unknown. spsann solves this multi-objective optimization problem by scaling the objective function values using their maximum absolute value or the mean value computed over 1000 random samples; the scaled values are aggregated using the weighted-sum method. A graphical display allows the user to follow how the sample pattern is perturbed during the optimization, as well as the evolution of its energy state. It is possible to start by perturbing many points and to reduce the number of perturbed points exponentially. The maximum perturbation distance decreases linearly with the number of iterations, and the acceptance probability decreases exponentially with the number of iterations.
R is memory hungry and spatial simulated annealing is a
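
    The MSSD criterion and the annealing schedule described above can be sketched compactly. The toy below (plain Python, not the R package) perturbs one random sample point at a time with a shrinking maximum perturbation distance and an exponentially decaying temperature, and keeps the best pattern found; the grid, point count and schedule constants are illustrative assumptions.

```python
import math
import random

def mssd(samples, grid):
    """Mean squared shortest distance from each grid node to the sample set."""
    return sum(min((gx - sx) ** 2 + (gy - sy) ** 2 for sx, sy in samples)
               for gx, gy in grid) / len(grid)

def ssa_optimize(n_samples, grid, iters=2000, t0=1.0, seed=42):
    """Spatial simulated annealing: jitter one sample point at a time,
    accept per the Metropolis rule, and keep the best pattern seen."""
    rng = random.Random(seed)
    pts = [(rng.random(), rng.random()) for _ in range(n_samples)]
    energy = mssd(pts, grid)
    best_pts, best_e = pts, energy
    for i in range(iters):
        temp = t0 * math.exp(-5.0 * i / iters)   # exponential cooling
        dmax = 0.5 * (1.0 - i / iters)           # shrinking max perturbation
        k = rng.randrange(n_samples)
        cand = pts[:]
        x, y = cand[k]
        cand[k] = (min(1.0, max(0.0, x + rng.uniform(-dmax, dmax))),
                   min(1.0, max(0.0, y + rng.uniform(-dmax, dmax))))
        e = mssd(cand, grid)
        if e < energy or rng.random() < math.exp((energy - e) / max(temp, 1e-12)):
            pts, energy = cand, e
            if energy < best_e:
                best_pts, best_e = pts, energy
    return best_pts, best_e

# 10 x 10 prediction grid on the unit square; optimize 5 sample locations.
grid = [(i / 9.0, j / 9.0) for i in range(10) for j in range(10)]
best_pts, best_e = ssa_optimize(5, grid)
```

Minimizing MSSD spreads the sample points evenly over the prediction grid, which is why it serves as a spatial-interpolation criterion.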

  6. Using simulation-based optimization to improve performance at a tire manufacturing company

    Directory of Open Access Journals (Sweden)

    Mohamad Darayi

    2013-04-01

    Full Text Available In this paper, a simulation-optimization-based decision support tool has been developed to study capacity enhancement scenarios in a tire manufacturing company located in Iran. This company is experiencing challenges in synchronizing production output with customer demand, causing an unbalanced work-in-process (WIP) inventory distribution throughout the tire manufacturing process. However, a new opportunity to increase the supply of raw materials by fifty percent, together with the expected growth in market demand, necessitates this study of the current company situation. This research, supported by the company, analyzes whether the ongoing production logistics system can respond to the increased market demand, considering the raw material expansion. Implementation of a proposed hybrid push/pull production control strategy, together with facility capacity enhancement options in bottleneck stations and/or heterogeneous lines within the plant, is investigated by the proposed simulation-optimization methodology.

  7. Effective Energy Simulation and Optimal Design of Side-lit Buildings with Venetian Blinds

    Science.gov (United States)

    Cheng, Tian

    Venetian blinds are popularly used in buildings to control the amount of incoming daylight, improving visual comfort and reducing heat gains in air-conditioning systems. Studies have shown that the proper design and operation of window systems can result in significant energy savings in both lighting and cooling. However, no convenient computer tool currently allows effective and efficient optimization of the envelope of side-lit buildings with blinds. Three computer tools widely used for the above-mentioned purpose, Adeline, DOE2 and EnergyPlus, have been experimentally examined in this study. Results indicate that the former two give unacceptable accuracy due to the unrealistic assumptions they adopt, while the last may generate large errors in certain conditions. Moreover, current computer tools have to conduct hourly energy simulations, which are not necessary for life-cycle energy analysis and optimal design, to provide annual cooling loads. This is not computationally efficient, and it is particularly unsuitable for optimally designing a building at the initial stage, because the impacts of many design variations and optional features have to be evaluated. A methodology is therefore developed for efficient and effective thermal and daylighting simulation and the optimal design of buildings with blinds. Based on geometric optics and the radiosity method, a mathematical model is developed to simulate the daylighting behavior of venetian blinds. Indoor illuminance at any reference point can be computed directly and efficiently. The models have been validated with both experiments and simulations with Radiance. Validation results show that indoor illuminances computed by the new models agree well with the measured data, and that their accuracy is equivalent to that of Radiance. The computational efficiency of the new models is much higher than that of Radiance as well as EnergyPlus.
Two new methods are developed for the thermal simulation of buildings. A

  8. Using "The Burns Suite" as a Novel High Fidelity Simulation Tool for Interprofessional and Teamwork Training.

    Science.gov (United States)

    Sadideen, Hazim; Wilson, David; Moiemen, Naiem; Kneebone, Roger

    2016-01-01

    Educational theory highlights the importance of contextualized simulation for effective learning. The authors recently published the concept of "The Burns Suite" (TBS) as a novel tool to advance the delivery of burns education for residents/clinicians. Effectively, TBS represents a low-cost, high-fidelity, portable, immersive simulation environment. Recently, simulation-based team training (SBTT) has been advocated as a means to improve interprofessional practice. The authors aimed to explore the role of TBS in SBTT. A realistic pediatric burn resuscitation scenario was designed based on "advanced trauma and life support" and "emergency management of severe burns" principles, refined utilizing expert opinion through cognitive task analysis. The focus of this analysis was on the nontechnical and interpersonal skills of clinicians and nurses within the scenario, mirroring what happens in real life. Five-point Likert-type questionnaires were developed for face and content validity. Cronbach's alpha was calculated for scale reliability. Semistructured interviews captured responses for qualitative thematic analysis, allowing for data triangulation. Twenty-two participants completed the TBS resuscitation scenario. Mean face and content validity ratings were high (4.4 and 4.7, respectively; range 4-5). The internal consistency of questions was high. Qualitative data analysis revealed two new themes. Participants reported that the experience felt particularly authentic because the simulation had high psychological and social fidelity, and there was a demand for such a facility to be made available to improve nontechnical skills and interprofessional relations. TBS provides a realistic, novel tool for SBTT, addressing both nontechnical and interprofessional team skills. Recreating clinical challenge is crucial to optimize SBTT. With a better understanding of the theories underpinning simulation and interprofessional education, future simulation scenarios can be designed to provide

  9. Integrating Multibody Simulation and CFD: toward Complex Multidisciplinary Design Optimization

    Science.gov (United States)

    Pieri, Stefano; Poloni, Carlo; Mühlmeier, Martin

    This paper describes the use of integrated multidisciplinary analysis and optimization of a race car model on a predefined circuit. The objective is the definition of the most efficient geometric configuration that can guarantee the lowest lap time. In order to carry out this study, it has been necessary to interface the design optimization software modeFRONTIER with the following software packages: CATIA v5, a three-dimensional CAD package, used for the definition of the parametric geometry; A.D.A.M.S./Motorsport, a multi-body dynamic simulation package; IcemCFD, a mesh generator, for the automatic generation of the CFD grid; and CFX, a Navier-Stokes code, for the prediction of fluid-dynamic forces. The process integration makes it possible to compute, for each geometric configuration, a set of aerodynamic coefficients that are then used in the multibody simulation to compute the lap time. Finally, an automatic optimization procedure is started and the lap time is minimized. The whole process is executed on a Linux cluster running CFD simulations in parallel.

  10. Simulation based optimized beam velocity in additive manufacturing

    Science.gov (United States)

    Vignat, Frédéric; Béraud, Nicolas; Villeneuve, François

    2017-08-01

    Manufacturing good parts with additive technologies relies on melt pool dimensions and temperature, which are controlled by manufacturing strategies often decided on the machine side. Strategies are built on the beam path and a variable energy input. Beam paths are often a mix of contour and hatching strategies filling the contours at each slice. The energy input depends on beam intensity and speed and is determined from simple thermal models so as to control melt pool dimensions and temperature and ensure porosity-free material. These models take into account variations in the thermal environment, such as overhanging surfaces or back-and-forth hatching paths. However, not all situations are correctly handled, and precision is limited. This paper proposes a new method to determine the energy input from a full build-chamber 3D thermal simulation. Using the results of the simulation, the energy is modified to keep the melt pool temperature in a predetermined range. The paper first presents an experimental method to determine the optimal temperature range. In a second part, the method to optimize the beam speed from the simulation results is presented. Finally, the optimized beam path is tested in the EBM machine and the built parts are compared with parts built with an ordinary beam path.
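
    The speed-correction loop can be illustrated with a toy line-energy model: if the predicted melt-pool temperature is above the target range, the beam is sped up, and vice versa. The model T ∝ P/v and all numbers below are invented for illustration; the paper derives temperatures from a full 3D thermal simulation of the build chamber instead.

```python
def optimize_speed(power, v0, t_range, temp_model, iters=50):
    """Iteratively nudge the beam speed until the predicted melt-pool
    temperature falls inside the target range."""
    t_lo, t_hi = t_range
    v = v0
    for _ in range(iters):
        temp = temp_model(power, v)
        if temp > t_hi:
            v *= 1.1        # too hot: move the beam faster
        elif temp < t_lo:
            v *= 0.9        # too cold: slow the beam down
        else:
            break
    return v, temp_model(power, v)

# Toy thermal model: melt-pool temperature proportional to line energy P/v
# (purely illustrative units and constants).
model = lambda p, v: 2000.0 * p / v
v_opt, t_final = optimize_speed(power=60.0, v0=100.0,
                                t_range=(900.0, 1100.0), temp_model=model)
```

In the paper the same feedback idea is applied point by point along the beam path, using the simulated local thermal environment rather than a closed-form model.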

  11. FEMSA: A Finite Element Simulation Tool for Quasi-static Seismic Deformation Modeling

    Science.gov (United States)

    Volpe, M.; Melini, D.; Piersanti, A.

    2006-12-01

    Modeling postseismic deformation is an increasingly valuable tool in earthquake seismology. In particular, the Finite Element (FE) numerical method allows accurate modeling of complex faulting geometries, inhomogeneous materials and realistic viscous flow, making it an excellent tool to investigate specific phenomena related to earthquakes. We developed a FE simulation tool, FEMSA (Finite Element Modeling for Seismic Applications), to model quasi-static deformation generated by faulting sources. The approach allows arbitrary faulting sources to be implemented automatically and calculates the displacement and stress fields induced by slip on the fault. The package makes use of the capabilities of CalculiX, a non-commercial FE software package designed to solve field problems, and is freely distributed. The main advantages of the method are reliability, wide diffusion and flexibility, allowing geometrical and/or rheological heterogeneities to be included in a mechanical analysis. We carried out an optimization study on boundary conditions as well as a series of benchmark simulations on test cases, and we also verified the capability of our approach to handle 3D heterogeneities within the domain. Here, we present our package and show some simple examples of application.

  12. Reliability Simulation and Design Optimization for Mechanical Maintenance

    Institute of Scientific and Technical Information of China (English)

    LIU Deshun; HUANG Liangpei; YUE Wenhui; XU Xiaoyan

    2009-01-01

    The reliability model of a mechanical product system must be reconstructed, and maintenance costs increase, because failed parts can be replaced with new components during service; this should be accounted for in system design. In this paper, a reliability model and a reliability-based design optimization methodology for maintenance are presented. First, based on the time-to-failure density functions of the parts of the system, the age distributions of all parts during service are investigated and a reliability model of the mechanical system under maintenance is developed. Then, reliability simulations of systems with Weibull probability density functions are performed, and the system minimum reliability and steady reliability under maintenance are defined based on reliability simulation over the life cycle of the system. Thirdly, a maintenance cost model is developed based on the replacement rates of the parts, and a reliability-based design optimization model for maintenance is presented, in which total life cycle cost is the design objective and system reliability the design constraint. Finally, the methodology is applied to the design of a link ring for a chain conveyor, showing that an optimal design with the lowest maintenance cost can be obtained while the minimum reliability and steady reliability of the system satisfy the reliability requirements during service.
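
    The replacement-rate part of such a model lends itself to a small Monte Carlo sketch: draw Weibull times to failure, renew the part at each failure, and count replacements over the service horizon. The parameters below are illustrative and are not taken from the chain-conveyor case study.

```python
import random

def expected_replacements(shape, scale, horizon, n_runs=2000, seed=1):
    """Monte Carlo renewal estimate: expected number of part replacements
    over a service horizon with Weibull time-to-failure."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_runs):
        t = 0.0
        while True:
            t += rng.weibullvariate(scale, shape)   # args: (scale, shape)
            if t > horizon:
                break
            total += 1                              # part failed: replace it
    return total / n_runs

# Illustrative wear-out part: characteristic life 1000 h, shape 2.0,
# over a 3000 h service horizon.
mean_repl = expected_replacements(shape=2.0, scale=1000.0, horizon=3000.0)
```

Multiplying such per-part replacement counts by unit replacement costs gives exactly the kind of maintenance cost term the abstract's optimization model minimizes.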

  13. Simulated Annealing-Based Krill Herd Algorithm for Global Optimization

    Directory of Open Access Journals (Sweden)

    Gai-Ge Wang

    2013-01-01

    Full Text Available Recently, Gandomi and Alavi proposed a novel swarm intelligence method, called krill herd (KH), for global optimization. To enhance the performance of the KH method, in this paper a new improved meta-heuristic, simulated annealing-based krill herd (SKH), is proposed for optimization tasks. A new krill selecting (KS) operator is used to refine krill behavior when updating the krill's position, so as to enhance its reliability and robustness in dealing with optimization problems. The introduced KS operator combines a greedy strategy with accepting a few not-so-good solutions with a low probability, as originally used in simulated annealing (SA). In addition, a kind of elitism scheme is used to save the best individuals in the population during the krill updating process. The merits of these improvements are verified on fourteen standard benchmark functions, and experimental results show that, in most cases, the performance of the improved meta-heuristic SKH method is superior to, or at least highly competitive with, the standard KH and other optimization methods.
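
    The KS operator's acceptance rule is essentially the Metropolis criterion from SA: always accept improvements, and accept a worse solution with probability exp(-Δ/T). The sketch below applies that rule to a plain random-walk search on the sphere function with an elitist best-so-far record; it is a simplified stand-in, not the full SKH algorithm, and all tuning constants are illustrative.

```python
import math
import random

def sa_accept(old_f, new_f, temp, rng):
    """Greedy acceptance plus a low-probability acceptance of worse
    solutions, in the spirit of the paper's KS operator."""
    if new_f <= old_f:
        return True
    return rng.random() < math.exp((old_f - new_f) / temp)

def minimize_sphere(dim=5, iters=3000, seed=7):
    rng = random.Random(seed)
    x = [rng.uniform(-5.0, 5.0) for _ in range(dim)]
    f = sum(v * v for v in x)
    best_x, best_f = x[:], f                 # elitism: keep the best ever seen
    for i in range(iters):
        temp = 10.0 * (1.0 - i / iters) + 1e-6   # linear cooling schedule
        cand = [v + rng.gauss(0.0, 0.3) for v in x]
        cf = sum(v * v for v in cand)
        if sa_accept(f, cf, temp, rng):
            x, f = cand, cf
            if f < best_f:
                best_x, best_f = x[:], f
    return best_x, best_f

best_x, best_f = minimize_sphere()
```

In SKH the same acceptance and elitism ideas wrap the krill position-update equations rather than a Gaussian random walk.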

  14. Automatic design optimization tool for passive structural control systems

    Science.gov (United States)

    Mojolic, Cristian; Hulea, Radu; Parv, Bianca Roxana

    2017-07-01

    The present paper proposes an automatic dynamic process to find the parameters of seismic isolation systems applied to large-span structures. Three seismic isolation solutions are proposed for the model of the new Slatina Sport Hall. The first case uses a friction pendulum system (FP), the second uses High Damping Rubber Bearings (HDRB), and Lead Rubber Bearings (LRB) are used in the last case. The isolation level is placed at the top end of the roof-supporting columns. The aim is to calculate the parameters of each isolation system so that the whole structure's first vibration period is the one desired by the user. The model is computed with the SAP2000 software. To solve the optimization problem, an optimization process based on Genetic Algorithms (GA) has been developed in Matlab. With the use of the API (Application Programming Interface) libraries, a two-way link is created between the two programs to exchange results and link parameters. The main goal is to find the best seismic isolation method for each desired modal period so that the bending moment in the supporting columns is minimized.
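
    The GA-to-model coupling can be mimicked with a toy single-degree-of-freedom structure in place of the SAP2000 model: the GA searches for an isolator stiffness whose first period T = 2π√(m/k) matches a target. The mass, search range and GA settings below are all illustrative assumptions, not the paper's values.

```python
import math
import random

def period(k, mass=500e3):
    """First period of a toy SDOF stand-in for the structural model,
    T = 2*pi*sqrt(m/k); mass in kg, stiffness k in N/m (illustrative)."""
    return 2.0 * math.pi * math.sqrt(mass / k)

def ga_find_stiffness(t_target, pop=30, gens=60, seed=3):
    """Genetic search for an isolator stiffness whose period hits t_target."""
    rng = random.Random(seed)
    lo, hi = 1e5, 1e8                            # stiffness bounds, N/m
    fit = lambda k: abs(period(k) - t_target)
    xs = [10.0 ** rng.uniform(5.0, 8.0) for _ in range(pop)]  # log-uniform init
    for _ in range(gens):
        xs.sort(key=fit)
        elite = xs[: pop // 3]                   # selection: best third survives
        children = []
        while len(elite) + len(children) < pop:
            a, b = rng.sample(elite, 2)
            c = 0.5 * (a + b)                    # blend crossover
            c *= 1.0 + rng.gauss(0.0, 0.05)      # small relative mutation
            children.append(min(hi, max(lo, c)))
        xs = elite + children
    return min(xs, key=fit)

k_opt = ga_find_stiffness(t_target=3.0)
```

In the paper each fitness evaluation is a full SAP2000 modal analysis reached through the API link, which is why the GA's population size and generation count matter for runtime.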

  15. A Tool for Optimizing Observation Planning for Faint Moving Objects

    Science.gov (United States)

    Arredondo, Anicia; Bosh, Amanda S.; Levine, Stephen

    2016-10-01

    Observations of small solar system bodies such as trans-Neptunian objects and Centaurs are vital for understanding the basic properties of these small members of our solar system. Because these objects are often very faint, large telescopes and long exposures may be necessary, which can result in crowded fields in which the target of interest may be blended with a field star. For accurate photometry and astrometry, observations must be planned to occur when the target is free of background stars; this restriction results in limited observing windows. We have created a tool that can be used to plan observations of faint moving objects. Features of the tool include estimates of best times to observe (when the object is not too near another object), a finder chart output, a list of possible astrometric and photometric reference stars, and an exposure time calculator. This work makes use of the USNOFS Image and Catalogue Archive operated by the United States Naval Observatory, Flagstaff Station (S.E. Levine and D.G. Monet 2000), the JPL Horizons online ephemeris service (Giorgini et al. 1996), the Minor Planet Center's MPChecker (http://cgi.minorplanetcenter.net/cgi-bin/checkmp.cgi), and source extraction software SExtractor (Bertin & Arnouts 1996). Support for this work was provided by NASA SSO grant NNX15AJ82G.
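
    The "free of background stars" condition reduces to a separation test between the target's ephemeris positions and catalog star positions. A toy flat-sky sketch is given below; the coordinates, threshold and geometry are invented, not the tool's actual catalog logic.

```python
def clear_times(ephemeris, stars, min_sep):
    """Times at which the moving target is farther than min_sep (same
    angular units) from every catalog star, i.e. safe to observe."""
    def sep2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    return [t for t, pos in ephemeris
            if all(sep2(pos, star) > min_sep ** 2 for star in stars)]

# Invented flat-sky ephemeris: (time, (x, y)) positions of the target.
eph = [(0, (0.0, 0.0)), (1, (1.0, 0.0)), (2, (2.0, 0.0)), (3, (3.0, 0.0))]
stars = [(1.1, 0.0), (2.9, 0.2)]           # invented field-star positions
ok = clear_times(eph, stars, min_sep=0.5)  # observing windows for the target
```

A production version would use proper spherical separations and star magnitudes, since a faint star far below the target's brightness need not block the observation.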

  16. Brute force optimization: combining mass energy simulation and life cycle analysis to optimize building design

    Energy Technology Data Exchange (ETDEWEB)

    Fix, Stuart; Richman, Russell [Department of Architectural Science, Faculty of Engineering, Architecture and Science, Ryerson University (Canada)], email: sfix@ryerson.ca, email: richman@ryerson.ca

    2011-07-01

    With the depletion of energy resources and rising concerns about the environment, building designers are shifting towards green building designs. However, since no design optimization method for an entire building exists, a significant degree of uncertainty is involved in design decisions. The aim of this paper is to present the brute-force optimization process, a method for removing this uncertainty from green building design. The method relies on the selection of optimization criteria, after which a large number of simulations are performed. A demonstration pilot was carried out in Toronto in which over one million design permutations were evaluated. Results showed that parameters such as total building area, window performance and infiltration level are the most important for the lifetime energy consumption of a building. The study pointed out the most important parameters to optimize in order to reduce a building's energy consumption.
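
    Brute-force design optimization is, at its core, exhaustive enumeration of a parameter grid. The sketch below sweeps a tiny hypothetical design space with a made-up lifetime-energy model; the real study swept over one million permutations, each evaluated with a full energy simulation.

```python
from itertools import product

# Hypothetical 3-parameter design space (invented values, not the study's).
windows = {"double": 1.2, "triple": 0.8}      # nominal U-value, W/m2K
infiltration = {"leaky": 1.5, "tight": 0.6}   # air changes per hour
areas = [150, 200, 250]                       # floor area, m2

def lifetime_energy(area, u_value, ach, years=50):
    """Toy annual loss (envelope + infiltration) scaled over the lifetime."""
    return years * area * (10.0 * u_value + 8.0 * ach)

designs = product(windows, infiltration, areas)
best = min(designs,
           key=lambda d: lifetime_energy(d[2], windows[d[0]], infiltration[d[1]]))
```

The cost of the approach is multiplicative: each added parameter multiplies the number of simulations, which is why the study's permutation count ran into the millions.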

  17. QCAD simulation and optimization of semiconductor double quantum dots

    Energy Technology Data Exchange (ETDEWEB)

    Nielsen, Erik; Gao, Xujiao; Kalashnikova, Irina; Muller, Richard Partain; Salinger, Andrew Gerhard; Young, Ralph Watson

    2013-12-01

    We present the Quantum Computer Aided Design (QCAD) simulator that targets modeling quantum devices, particularly silicon double quantum dots (DQDs) developed for quantum qubits. The simulator has three differentiating features: (i) its core contains nonlinear Poisson, effective mass Schrodinger, and Configuration Interaction solvers that have massively parallel capability for high simulation throughput, and can be run individually or combined self-consistently for 1D/2D/3D quantum devices; (ii) the core solvers show superior convergence even at near-zero-Kelvin temperatures, which is critical for modeling quantum computing devices; (iii) it couples with the optimization engine Dakota, which enables optimization of gate voltages in DQDs for multiple desired targets. The Poisson solver includes Maxwell-Boltzmann and Fermi-Dirac statistics, supports Dirichlet, Neumann, interface charge, and Robin boundary conditions, and includes the effect of dopant incomplete ionization. The solver has shown robust nonlinear convergence even in the milli-Kelvin temperature range, and has been extensively used to quickly obtain the semiclassical electrostatic potential in DQD devices. The self-consistent Schrodinger-Poisson solver has achieved robust and monotonic convergence behavior for 1D/2D/3D quantum devices at very low temperatures by using a predictor-corrector iteration scheme. The QCAD simulator enables the calculation of dot-to-gate capacitances, and comparison with experiment and between solvers. It is observed that computed capacitances are in the right ballpark when compared to experiment, and that quantum confinement increases capacitance when the number of electrons is fixed in a quantum dot. In addition, the coupling of QCAD with Dakota makes it possible to rapidly identify which device layouts are more likely to lead to few-electron quantum dots. Very efficient QCAD simulations on a large number of fabricated and proposed Si DQDs have made it possible to provide fast feedback for design

  18. Automatic differentiation tools in the dynamic simulation of chemical engineering processes

    Directory of Open Access Journals (Sweden)

    Castro M.C.

    2000-01-01

    Full Text Available Automatic Differentiation is a relatively recent technique for differentiating functions that is applied directly to the source code which computes the function, written in standard programming languages. The technique permits the automation of the differentiation step, which is crucial for dynamic simulation and optimization of processes. The derivative values obtained with AD are exact (to roundoff). The theoretical exactness of AD comes from the fact that it uses the same rules of differentiation as differential calculus, but these rules are applied to an algorithmic specification of the function rather than to a formula. The main purpose of this contribution is to discuss the impact of Automatic Differentiation in the field of dynamic simulation of chemical engineering processes. The influence of the differentiation technique on the behavior of the integration code, the performance of the generated code and the incorporation of AD tools in consistent initialization tools are discussed from the viewpoint of dynamic simulation of typical models in chemical engineering.
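
    Forward-mode automatic differentiation can be demonstrated in a few lines with dual numbers: each value carries its derivative alongside it, and every arithmetic operation applies the chain rule, which is exactly why the result is exact to roundoff. This toy supports only addition and multiplication, unlike production AD tools:

```python
class Dual:
    """Minimal forward-mode AD value: carries (value, derivative) and
    applies the chain rule operation by operation."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def _wrap(self, o):
        return o if isinstance(o, Dual) else Dual(o)
    def __add__(self, o):
        o = self._wrap(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __mul__(self, o):
        o = self._wrap(o)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def derivative(f, x):
    """Derivative of f at x, exact to roundoff."""
    return f(Dual(x, 1.0)).dot

# d/dx (3x^2 + 2x + 1) at x = 2 is 6x + 2 = 14.
g = derivative(lambda x: 3 * x * x + 2 * x + 1, 2.0)
```

The same chain-rule-per-operation idea is what source-transformation AD tools apply to whole simulation codes rather than to single expressions.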

  19. TAX PLANNING: OPTIMIZATION TOOL OF DEBTS TOWARDS THE BUDGET

    Directory of Open Access Journals (Sweden)

    Anatol GRAUR

    2017-06-01

    Full Text Available Tax planning is a complex of measures consisting in the reduction of tax payments within the law. Tax planning at an enterprise starts from the initial structuring of businesses and activities and can be carried out both at the entity level (corporate) and at the individual level (individual). In contrast to tax evasion, tax planning is performed only within the law, by avoiding taxes. Avoiding or reducing taxes is possible by organizing activities in such a way that the law allows a reduction of the tax base or the tax rate. Optimization of tax payments is possible by organizing the work so that the legislation allows the tax base and tax rates to be reduced and tax incentives to be applied.

  20. Collaboration pathway(s) using new tools for optimizing operational climate monitoring from space

    Science.gov (United States)

    Helmuth, Douglas B.; Selva, Daniel; Dwyer, Morgan M.

    2014-10-01

    Consistently collecting the earth's climate signatures remains a priority for world governments and international scientific organizations. Architecting a solution requires transforming scientific missions into an optimized, robust `operational' constellation that addresses the needs of decision makers, scientific investigators and global users for trusted data. The application of new tools offers pathways for global architecture collaboration. Recent (2014) rule-based decision engine modeling runs that targeted optimizing the intended NPOESS architecture become a surrogate for global operational climate monitoring architecture(s). These rule-based systems tools provide valuable insight for global climate architectures, through the comparison and evaluation of the alternatives considered and the exhaustive range of trade space explored. A representative optimization of a global ECV (essential climate variables) climate monitoring architecture is explored and described in some detail, with thoughts on appropriate rule-based valuations. The optimization tools suggest and support global collaboration pathways and will hopefully elicit responses from the audience and climate science stakeholders.

  1. Semi-automatic tool to ease the creation and optimization of GPU programs

    DEFF Research Database (Denmark)

    Jepsen, Jacob

    2014-01-01

    We present a tool that reduces the development time of GPU-executable code. We implement a catalogue of common optimizations specific to the GPU architecture. Through the tool, the programmer can semi-automatically transform a computationally-intensive code section into GPU-executable form...... and apply optimizations thereto. Based on experiments, the code generated by the tool can be 3-256X faster than code generated by an OpenACC compiler, 4-37X faster than optimized CPU code, and attain up to 25% of peak performance of the GPU. We found that by using pattern-matching rules, many...... of the transformations can be performed automatically, which makes the tool usable for both novices and experts in GPU programming....

  2. A simulator tool set for evaluating HEVC/SHVC streaming

    Science.gov (United States)

    Al Hadhrami, Tawfik; Nightingale, James; Wang, Qi; Grecos, Christos; Kehtarnavaz, Nasser

    2015-02-01

    Video streaming and other multimedia applications account for an ever increasing proportion of all network traffic. The recent adoption of High Efficiency Video Coding (HEVC) as the H.265 standard provides many opportunities for new and improved multimedia services and applications in the consumer domain. Since the delivery of version one of H.265, the Joint Collaborative Team on Video Coding has been working towards standardisation of a scalable extension (SHVC) to the H.265 standard and a series of range extensions and new profiles. As these enhancements are added to the standard, the range of potential applications and research opportunities will expand. For example, the use of video is also growing rapidly in other sectors such as safety, security, defence and health, with real-time high-quality video transmission playing an important role in areas like critical infrastructure monitoring and disaster management. Each of these may benefit from the application of enhanced HEVC/H.265 and SHVC capabilities. The majority of existing research into HEVC/H.265 transmission has focused on the consumer domain, addressing issues such as broadcast transmission and delivery to mobile devices, with the lack of freely available tools widely cited as an obstacle to conducting this type of research. In this paper we present a toolset which facilitates the transmission and evaluation of HEVC/H.265 and SHVC encoded video on the popular open source NCTUns simulator. Our toolset provides researchers with a modular, easy-to-use platform for evaluating video transmission and adaptation proposals on large-scale wired, wireless and hybrid architectures. The toolset consists of pre-processing, transmission, SHVC adaptation and post-processing tools to gather and analyse statistics.
It has been implemented using HM15 and SHM5, the latest versions of the HEVC and SHVC reference software implementations to ensure that currently adopted proposals for scalable and range extensions to

  3. OPTIMIZATION OF SURFACE ROUGHNESS AND TOOL FLANK WEAR IN TURNING OF AISI 304 AUSTENITIC STAINLESS STEEL WITH CVD COATED TOOL

    Directory of Open Access Journals (Sweden)

    M. KALADHAR

    2013-04-01

    Full Text Available AISI 304 austenitic stainless steel is a popular grade used in various fields of manufacturing because of its high ductility, high durability and excellent corrosion resistance. High work hardening, low heat conductivity and high built-up edge (BUE) formation make it a difficult-to-machine material. Poor surface quality and rapid tool wear are common problems encountered while machining it. In the present work, an attempt has been made to explore the influence of machining parameters on two performance measures, surface roughness and flank wear, in turning of AISI 304 austenitic stainless steel with a two-layer chemical vapour deposition (CVD) coated tool. To achieve this, the Taguchi approach has been employed. The results revealed that the cutting speed most significantly influences both surface roughness and flank wear. In addition, the optimal setting of process parameters and optimal ranges of the performance measures are predicted.
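    The Taguchi analysis mentioned above ranks parameter settings by a signal-to-noise (S/N) ratio; for responses like surface roughness and flank wear, the "smaller-is-better" form applies. A minimal sketch follows; the roughness replicates and the two cutting-speed settings are invented for illustration, not taken from the paper.

```python
import math

def sn_smaller_is_better(values):
    """Taguchi "smaller-is-better" S/N ratio in dB:
    SN = -10 * log10(mean(y^2)); a higher value is better."""
    return -10.0 * math.log10(sum(v * v for v in values) / len(values))

# Hypothetical surface-roughness replicates (Ra, micrometres) for two
# cutting-speed settings; the setting with the higher S/N ratio wins.
trial_a = [1.2, 1.3, 1.1]   # lower cutting speed
trial_b = [0.8, 0.9, 0.85]  # higher cutting speed
best = max([("A", trial_a), ("B", trial_b)],
           key=lambda t: sn_smaller_is_better(t[1]))
print(best[0])  # → B
```

    In a full Taguchi study the same ratio is computed per orthogonal-array row, then averaged per factor level to rank factor influence.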

  4. Laser: a Tool for Optimization and Enhancement of Analytical Methods

    Energy Technology Data Exchange (ETDEWEB)

    Preisler, Jan [Iowa State Univ., Ames, IA (United States)

    1997-01-01

    In this work, we use lasers to enhance the possibilities of laser desorption methods and to optimize the coating procedure for capillary electrophoresis (CE). We use several different instrumental arrangements to characterize matrix-assisted laser desorption (MALD) at atmospheric pressure and in vacuum. In imaging mode, a 488-nm argon-ion laser beam is deflected by two acousto-optic deflectors to scan plumes desorbed at atmospheric pressure via absorption. All absorbing species, including neutral molecules, are monitored. Interesting features, e.g. differences between the initial plume and subsequent plumes desorbed from the same spot, or the formation of two plumes from one laser shot, are observed. Total plume absorbance can be correlated with the acoustic signal generated by the desorption event. A model equation for the plume velocity as a function of time is proposed. Alternatively, the use of a static laser beam for observation enables reliable determination of plume velocities even when they are very high. Static scattering detection reveals the negative influence of particle spallation on the MS signal. Ion formation during MALD was monitored using 193-nm light to photodissociate a portion of the insulin ion plume. These results define the optimal conditions for desorbing analytes from matrices, as opposed to achieving a compromise between efficient desorption and efficient ionization as is practiced in mass spectrometry. In the CE experiment, we examined changes in a poly(ethylene oxide) (PEO) coating by continuously monitoring the electroosmotic flow (EOF) in a fused-silica capillary during electrophoresis. An imaging CCD camera was used to follow the motion of a fluorescent neutral marker zone, excited by a 488-nm Ar-ion laser, along the length of the capillary. The PEO coating was shown to reduce the velocity of the EOF by more than an order of magnitude compared to a bare capillary at pH 7.0. The coating protocol was important, especially at an intermediate pH of 7.7. 
The increase of p

  6. Illumination pattern optimization for fluorescence tomography: theory and simulation studies.

    Science.gov (United States)

    Dutta, Joyita; Ahn, Sangtae; Joshi, Anand A; Leahy, Richard M

    2010-05-21

    Fluorescence molecular tomography is a powerful tool for 3D visualization of molecular targets and pathways in vivo in small animals. Owing to the high degrees of absorption and scattering of light through tissue, the fluorescence tomographic inverse problem is inherently ill-posed. In order to improve source localization and the conditioning of the light propagation model, multiple sets of data are acquired by illuminating the animal surface with different spatial patterns of near-infrared light. However, the choice of these patterns in most experimental setups is ad hoc and suboptimal. This paper presents a systematic approach for designing efficient illumination patterns for fluorescence tomography. Our objective here is to determine how to optimally illuminate the animal surface so as to maximize the information content in the acquired data. We achieve this by improving the conditioning of the Fisher information matrix. We parameterize the spatial illumination patterns and formulate our problem as a constrained optimization problem that, for a fixed number of illumination patterns, yields the optimal set of patterns. For geometric insight, we used our method to generate a set of three optimal patterns for an optically homogeneous, regular geometrical shape and observed expected symmetries in the result. We also generated a set of six optimal patterns for an optically homogeneous cuboidal phantom set up in the transillumination mode. Finally, we computed optimal illumination patterns for an optically inhomogeneous realistically shaped mouse atlas for different given numbers of patterns. The regularized pseudoinverse matrix, generated using the singular value decomposition, was employed to reconstruct the point spread function for each set of patterns in the presence of a sample fluorescent point source deep inside the mouse atlas. We have evaluated the performance of our method by examining the singular value spectra as well as plots of average spatial
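    The core idea above, improving the conditioning of the Fisher information matrix, can be sketched with stand-in data: among candidate illumination patterns, each contributing a block of rows to a sensitivity matrix J, pick the fixed-size subset whose J^T J has the smallest condition number. The random matrices below are placeholders for a real light-propagation model, and exhaustive search stands in for the paper's constrained optimization.

```python
from itertools import combinations

import numpy as np

rng = np.random.default_rng(3)
# Six candidate patterns, each contributing 5 measurement rows over 4 unknowns.
candidates = [rng.normal(size=(5, 4)) for _ in range(6)]

def fim_cond(subset):
    """Condition number of the Fisher information matrix J^T J built by
    stacking the sensitivity blocks of the chosen patterns."""
    J = np.vstack([candidates[i] for i in subset])
    return np.linalg.cond(J.T @ J)

# Exhaustively pick the best-conditioned set of 3 patterns out of 6.
best = min(combinations(range(6), 3), key=fim_cond)
print(best, round(fim_cond(best), 2))
```

    A better-conditioned information matrix translates directly into a better-behaved regularized pseudoinverse at reconstruction time.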

  7. Using Cotton Model Simulations to Estimate Optimally Profitable Irrigation Strategies

    Science.gov (United States)

    Mauget, S. A.; Leiker, G.; Sapkota, P.; Johnson, J.; Maas, S.

    2011-12-01

    In recent decades irrigation pumping from the Ogallala Aquifer has led to declines in saturated thickness that have not been compensated for by natural recharge, which has led to questions about the long-term viability of agriculture in the cotton producing areas of west Texas. Adopting irrigation management strategies that optimize profitability while reducing irrigation waste is one way of conserving the aquifer's water resource. Here, a database of modeled cotton yields generated under drip and center pivot irrigated and dryland production scenarios is used in a stochastic dominance analysis that identifies such strategies under varying commodity price and pumping cost conditions. This database and analysis approach will serve as the foundation for a web-based decision support tool that will help producers identify optimal irrigation treatments under specified cotton price, electricity cost, and depth to water table conditions.

  8. Supply chain simulation tools and techniques: a survey

    NARCIS (Netherlands)

    Kleijnen, J.P.C.

    2005-01-01

    The main contribution of this paper is twofold: it surveys different types of simulation for supply chain management; it discusses several methodological issues. These different types of simulation are spreadsheet simulation, system dynamics, discrete-event simulation and business games. Which simul

  9. OPEP: a tool for the optimal partitioning of electric properties.

    Science.gov (United States)

    Angyán, János G; Chipot, Christophe; Dehez, François; Hättig, Christof; Jansen, Georg; Millot, Claude

    2003-06-01

    OPEP is a suite of FORTRAN programs targeted at the optimal partitioning of molecular electric properties. It includes an interactive module for the construction of Cartesian grids of points, on which either the molecular electrostatic potential or the induction energy is mapped. The generation of distributed multipoles and polarizabilities is achieved using either the formalism of the normal equations of the least-squares problem, which restates the fitting procedure in terms of simple matrix operations, or a statistical approach, which provides a pictorial description of the distributed models of multipoles and polarizabilities, thereby allowing the pinpointing of pathological cases. Molecular symmetry is accounted for by means of local atomic frames, which are generated in an automated fashion. A Tcl/Tk graphical user interface wraps the suite of programs, thereby making OPEP a user-friendly package for building models of distributed multipoles and polarizabilities. OPEP is an open-source suite of programs distributed free of charge under the GNU general public license (GPL) at http://www.lctn.uhp-nancy.fr/Opep.
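    The normal-equations route to distributed models can be illustrated with the simplest case: fitting point charges at fixed sites to reproduce a potential sampled on a grid. The site positions, reference charges, and grid below are invented for the sketch and have nothing to do with OPEP's actual input format.

```python
import numpy as np

# Fixed atomic sites (atomic units) and a reference potential built from
# known charges; a noise-free fit should recover those charges exactly.
sites = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0]])
q_true = np.array([0.4, -0.4])

rng = np.random.default_rng(0)
# Grid of sample points kept well away from the nuclei.
grid = rng.uniform(-4.0, 4.0, size=(200, 3)) + np.array([0.75, 0.0, 5.0])

# Design matrix A[i, j] = 1 / |r_i - R_j|, so the potential is V = A @ q.
A = 1.0 / np.linalg.norm(grid[:, None, :] - sites[None, :, :], axis=2)
v_ref = A @ q_true

# Normal equations of the least-squares problem: (A^T A) q = A^T V.
q_fit = np.linalg.solve(A.T @ A, A.T @ v_ref)
print(np.round(q_fit, 6))  # recovers [0.4, -0.4]
```

    Restating the fit as plain matrix operations is exactly what makes the normal-equations formulation attractive for larger multipole and polarizability models.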

  10. Extension of an Object-Oriented Optimization Tool: User's Reference Manual

    Science.gov (United States)

    Pak, Chan-Gi; Truong, Samson S.

    2015-01-01

    The National Aeronautics and Space Administration Armstrong Flight Research Center has developed a cost-effective and flexible object-oriented optimization (O3) tool that leverages existing tools and practices and allows easy integration and adoption of new state-of-the-art software. This object-oriented framework can integrate the analysis codes for multiple disciplines, as opposed to relying on one code to perform analysis for all disciplines. Optimization can thus take place within each discipline module, or in a loop between the O3 tool and the discipline modules, or both. Six different sample mathematical problems are presented to demonstrate the performance of the O3 tool. Instructions for preparing input data for the O3 tool are detailed in this user's manual.

  11. Optimizing Friction Stir Welding via Statistical Design of Tool Geometry and Process Parameters

    Science.gov (United States)

    Blignault, C.; Hattingh, D. G.; James, M. N.

    2012-06-01

    This article considers optimization procedures for friction stir welding (FSW) in 5083-H321 aluminum alloy, via control of weld process parameters and tool design modifications. It demonstrates the potential utility of the "force footprint" (FF) diagram in providing a real-time graphical user interface (GUI) for process optimization of FSW. Multiple force, torque, and temperature responses were recorded during FS welding using 24 different tool pin geometries, and these data were statistically analyzed to determine the relative influence of a number of combinations of important process and tool geometry parameters on tensile strength. Desirability profile charts are presented, which show the influence of seven key combinations of weld process variables on tensile strength. The model developed in this study allows the weld tensile strength to be predicted for other combinations of tool geometry and process parameters to within an average error of 13%. General guidelines for tool profile selection and the likelihood of influencing weld tensile strength are also provided.

  12. A two-parameter preliminary optimization study for a fluidized-bed boiler through a comprehensive mathematical simulator

    Energy Technology Data Exchange (ETDEWEB)

    Rabi, Jose A.; Souza-Santos, Marcio L. de [Universidade Estadual de Campinas, SP (Brazil). Faculdade de Engenharia Mecanica. Dept. de Energia]. E-mails: jrabi@fem.unicamp.br; dss@fem.unicamp.br

    2000-07-01

    Modeling and simulation of fluidized-bed equipment have demonstrated their importance as tools for the design and optimization of industrial equipment. Accordingly, this work carries out an optimization study of a fluidized-bed boiler with the aid of a comprehensive mathematical simulator. The configuration data of the boiler are based on a particular Babcock and Wilcox Co. (USA) test unit. Due to their importance, the number of tubes in the bed section and the air excess are chosen as the parameters upon which the optimization study is based. In turn, the fixed-carbon conversion factor and the boiler efficiency are chosen as two distinct optimization objectives. The results from both preliminary searches are compared. The present work is intended only as a study of possible routes for the future optimization of larger boilers. Nonetheless, the present discussion may give some insight into the equipment's behavior. (author)

  13. Solar assisted heat pump on air collectors: A simulation tool

    Energy Technology Data Exchange (ETDEWEB)

    Karagiorgas, Michalis; Galatis, Kostas; Tsagouri, Manolis [Department of Mechanical Engineering Educators, ASPETE, N. Iraklio, GR 14121 (Greece); Tsoutsos, Theocharis [Environmental Engineering Dept., Technical University of Crete, Technical University Campus, GR 73100, Chania (Greece); Botzios-Valaskakis, Aristotelis [Centre for Renewable Energy Sources (CRES), 19th km Marathon Ave., GR 19001, Pikermi (Greece)

    2010-01-15

    The heating system of the bioclimatic building of the Greek National Centre for Renewable Energy Sources (CRES) comprises two heating plants: the first one includes an air source heat pump, Solar Air Collectors (SACs) and a heat distribution system (comprising a fan coil unit network); the second one is, mainly, a geothermal heat pump unit to cover the ground floor thermal needs. The SAC configuration as well as the fraction of the building heating load covered by the heating plant are assessed in two operation modes: the direct mode (hot air from the collectors is supplied directly to the heated space) and the indirect mode (warm air from the SAC, or its mixture with ambient air, is not supplied directly to the heated space but indirectly into the evaporator of the air source heat pump). The technique of the indirect mode of heating aims at maximizing the efficiency of the SAC, saving electrical power consumed by the compressor of the heat pump, and therefore at optimizing the coefficient of performance (COP) of the heat pump due to the increased intake of ambient thermal energy by means of the SAC. Results are given for three research objectives: assessment of the heat pump efficiency, whether in direct or indirect heating mode; assessment of the overall heating plant efficiency on a daily or hourly basis; and assessment of the credibility of the suggested simulation model TSAGAIR by comparing its results with the TRNSYS ones. (author)

  14. Optimization of metabolite detection by quantum mechanics simulations in magnetic resonance spectroscopy.

    Science.gov (United States)

    Gambarota, Giulio

    2016-09-03

    Magnetic resonance spectroscopy (MRS) is a well established modality for investigating tissue metabolism in vivo. In recent years, many efforts by the scientific community have been directed towards the improvement of metabolite detection and quantitation. Quantum mechanics simulations allow for investigations of the MR signal behaviour of metabolites; thus, they provide an essential tool in the optimization of metabolite detection. In this review, we will examine quantum mechanics simulations based on the density matrix formalism. The density matrix was introduced by von Neumann in 1927 to take into account statistical effects within the theory of quantum mechanics. We will discuss the main steps of the density matrix simulation of an arbitrary spin system and show some examples for the strongly coupled two spin system.

  15. Simulation of stochastic systems via polynomial chaos expansions and convex optimization

    CERN Document Server

    Fagiano, Lorenzo

    2012-01-01

    Polynomial Chaos Expansions represent a powerful tool to simulate stochastic models of dynamical systems. Yet, deriving the expansion's coefficients for complex systems might require a significant and non-trivial manipulation of the model, or the computation of large numbers of simulation runs, rendering the approach too time consuming and impracticable for applications with more than a handful of random variables. We introduce a novel computationally tractable technique for computing the coefficients of polynomial chaos expansions. The approach exploits a regularization technique with a particular choice of weighting matrices, which makes it possible to take into account the specific features of Polynomial Chaos expansions. The method, completely based on convex optimization, can be applied to problems with a large number of random variables and uses a modest number of Monte Carlo simulations, while avoiding model manipulations. Additional information on the stochastic process, when available, can be also incorporated i...
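    The coefficient-fitting step can be shown on a one-variable toy problem (plain least squares here, rather than the paper's regularized variant): for a standard-normal input, f(x) = x² expands exactly as He₀ + He₂ in probabilists' Hermite polynomials, so a small Monte Carlo sample recovers the chaos coefficients (1, 0, 1).

```python
import numpy as np

# Monte Carlo samples of the random input and the model response.
rng = np.random.default_rng(1)
x = rng.standard_normal(50)
y = x ** 2

# Basis: probabilists' Hermite polynomials He0 = 1, He1 = x, He2 = x^2 - 1,
# which are orthogonal under the standard normal measure.
Phi = np.column_stack([np.ones_like(x), x, x ** 2 - 1.0])

# Least-squares estimate of the chaos coefficients; exact here because
# f lies in the span of the basis.
coeffs, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print(np.round(coeffs, 6))
```

    For responses outside the basis span, the fit is only approximate, which is where the weighted regularization described in the abstract comes into play.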

  16. Optimized molecular force field for sulfur hexafluoride simulations.

    Science.gov (United States)

    Olivet, Aurelio; Vega, Lourdes F

    2007-04-14

    An optimized molecular force field for sulfur hexafluoride (SF6) simulations is presented in this work. The new force field for SF6 contains two parts: a Lennard-Jones potential that deals with F-F intermolecular interactions, and a second term dealing with the intramolecular forces. In this second part, the flexibility of the molecule is explicitly considered by 6 harmonic stretch terms, modeling the S-F chemical bonds, and 12 harmonic bending terms, modeling the F-S-F angular deformations. The parameters of the new force field have been obtained by a multivariable optimization procedure, whose main feature is the simultaneous fitting of all force field parameters, using as reference data several equilibrium properties (vapor pressure, saturated liquid density, and surface tension) and the shear viscosity. The new force field clearly improves the description of the phase envelope and of the rest of the properties as compared to previous simulations with a rigid model of the same molecule [A. Olivet et al., J. Chem. Phys. 123, 194508 (2005)]. Results for the optimized force field concerning the vapor-liquid coexistence curve, several thermodynamic states in the homogeneous gas and liquid regions, and transport coefficients of SF6 are in good agreement with available experimental data.
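    For reference, the intermolecular part has the standard 12-6 Lennard-Jones form. The sketch below uses illustrative ε and σ values, not the fitted F-F parameters from the paper, and checks the textbook property that the well minimum sits at r = 2^(1/6)·σ with depth −ε.

```python
def lj(r, epsilon, sigma):
    """12-6 Lennard-Jones pair potential:
    U(r) = 4*eps*((sigma/r)**12 - (sigma/r)**6)."""
    sr6 = (sigma / r) ** 6
    return 4.0 * epsilon * (sr6 * sr6 - sr6)

sigma, epsilon = 2.95, 0.25   # illustrative values only
r_min = 2.0 ** (1.0 / 6.0) * sigma
print(round(lj(r_min, epsilon, sigma), 6))  # well depth equals -epsilon
```

    In the full force field, this pair term is summed over all F-F site pairs of different molecules, on top of the harmonic stretch and bend terms.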

  17. The Human Group Optimizer (HGO): Mimicking the collective intelligence of human groups as an optimization tool for combinatorial problems

    CERN Document Server

    De Vincenzo, Ilario; Carbone, Giuseppe

    2016-01-01

    A large number of optimization algorithms have been developed by researchers to solve a variety of complex problems in the operations management area. We present a novel optimization algorithm belonging to the class of swarm intelligence optimization methods. The algorithm mimics the decision-making process of human groups and exploits the dynamics of this process as an optimization tool for combinatorial problems. In order to achieve this aim, a continuous-time Markov process is proposed to describe the behavior of a population of socially interacting agents, modelling how humans in a group modify their opinions driven by self-interest and consensus seeking. As in the case of a collection of spins, the dynamics of such a system is characterized by a phase transition from low to high values of the overall consensus (magnetization). We recognize this phase transition as being associated with the emergence of a collective superior intelligence of the population. While this state is active, a cooling schedule is a...

  18. Optimizations for the EcoPod field identification tool

    Directory of Open Access Journals (Sweden)

    Yu YuanYuan

    2008-03-01

    Full Text Available Abstract Background We sketch our species identification tool for palm-sized computers that helps knowledgeable observers with census activities. An algorithm turns an identification matrix into a minimal-length series of questions that guide the operator towards identification. Historic observation data from the census geographic area helps minimize question volume. We explore how much historic data is required to boost performance, and whether the use of history negatively impacts identification of rare species. We also explore how characteristics of the matrix interact with the algorithm, and how best to predict the probability of observing a previously unseen species. Results Point counts of birds taken at Stanford University's Jasper Ridge Biological Preserve between 2000 and 2005 were used to examine the algorithm. A computer identified species by correctly answering and counting the algorithm's questions. We also explored how the character density of the key matrix and the theoretical minimum number of questions for each bird in the matrix influenced the algorithm. Our investigation of the required probability smoothing determined whether Laplace smoothing of observation probabilities was sufficient, or whether the more complex Good-Turing technique is required. Conclusion Historic data improved identification speed, but only impacted the top 25% most frequently observed birds. For rare birds the history-based algorithms did not impose a noticeable penalty in the number of questions required for identification. For our dataset, neither the age of the historic data nor the number of observation years impacted the algorithm. Density of characters for different taxa in the identification matrix did not impact the algorithms. Intrinsic differences in identifying different birds did affect the algorithm, but the differences affected the baseline method of not using historic data to exactly the same degree. We found that Laplace smoothing

  19. Review of ASITIC (Analysis and Simulation of Inductors and Transformers for Integrated Circuits) Tool to Design Inductor on Chip

    Directory of Open Access Journals (Sweden)

    M.Zamin Ali Khan

    2012-07-01

    Full Text Available Passive elements such as inductors, capacitors, and transformers have the potential to improve the performance of key RF building blocks. Their use, though, necessitates proper modeling not only of electrostatic and magnetostatic effects, but also of electromagnetic parasitic substrate coupling. A custom computer-aided-design tool called ASITIC, "Analysis and Simulation of Inductors and Transformers for Integrated Circuits", is described, which is used for the analysis, design, and optimization of passive devices. This tool allows circuit and process engineers to design and optimize the geometry of passive devices and the process parameters to meet electrical specifications. The losses in the passive devices determine the achievable gain and power dissipation. Optimization of such passive devices is thus an integral part of the design of such building blocks.

  20. A Fixed-Wing Aircraft Simulation Tool for Improving the efficiency of DoD Acquisition

    Science.gov (United States)

    2015-10-05

    Oct 2008-Sep 2015. Scott A. Morton and David R... A multi-disciplinary fixed-wing virtual aircraft simulation tool incorporating aerodynamics, structural dynamics, kinematics, and kinetics. Kestrel allows...testing.

  1. Assessing recovery feasibility for piping plovers using optimization and simulation

    Science.gov (United States)

    Larson, M.A.; Ryan, M.R.; Murphy, R.K.

    2003-01-01

    Optimization and simulation modeling can be used to account for demographic and economic factors simultaneously in a comprehensive analysis of endangered-species population recovery. This is a powerful approach that is broadly applicable but underutilized in conservation biology. We applied the approach to a population recovery analysis of threatened and endangered piping plovers (Charadrius melodus) in the Great Plains of North America. Predator exclusion increases the reproductive success of piping plovers, but the most cost-efficient strategy of applying predator exclusion and the number of protected breeding pairs necessary to prevent further population declines were unknown. We developed a linear programming model to define strategies that would either maximize fledging rates or minimize financial costs by allocating plover pairs to 1 of 6 types of protection. We evaluated the optimal strategies using a stochastic population simulation model. The minimum cost to achieve a 20% chance of stabilizing simulated populations was approximately $1-11 million over 50 years. Increasing reproductive success to 1.24 fledglings/pair at minimal cost in any given area required fencing 85% of pairs at managed sites but cost 23% less than the current approach. Maximum fledging rates resulted in >20% of simulated populations reaching recovery goals in 30-50 years at cumulative costs of <$16 million. Protecting plover pairs within 50 km of natural resource agency field offices was sufficient to increase simulated populations to established recovery goals. A range-wide management plan needs to be developed and implemented to foster the involvement and cooperation among managers that will be necessary for recovery efforts to be successful. We also discuss how our approach can be applied to a variety of wildlife management issues.
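    The allocation problem described above, assigning breeding pairs to protection types so as to maximize fledglings under a budget, is a small linear program. Since the numbers below are invented (not the paper's six protection types, costs, or fledging rates), a brute-force search over integer allocations stands in for the LP solver.

```python
from itertools import product

# Hypothetical protection types: (name, fledglings per pair, cost per pair).
types = [("none", 0.6, 0.0), ("fence", 1.3, 2.0), ("fence+monitor", 1.5, 3.5)]
PAIRS, BUDGET = 10, 14.0

best = None
for alloc in product(range(PAIRS + 1), repeat=len(types)):
    if sum(alloc) != PAIRS:
        continue  # every pair gets exactly one protection type
    cost = sum(n * t[2] for n, t in zip(alloc, types))
    if cost > BUDGET:
        continue  # respect the budget
    fledged = sum(n * t[1] for n, t in zip(alloc, types))
    if best is None or fledged > best[0]:
        best = (fledged, alloc)

# With these numbers, the budget is best spent fencing 7 of the 10 pairs.
print(best)
```

    The paper's two-stage design then feeds such an optimal allocation into a stochastic population simulation to test whether it actually stabilizes the population.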

  2. Simulation Application for Optimization of Solar Collector Array

    Directory of Open Access Journals (Sweden)

    Igor Shesho

    2014-01-01

    Full Text Available Solar systems offer a comparatively low output density, so increasing the output always means a corresponding increase in the size of the collector area. Thus collector arrays are occasionally constructed with different azimuth angles and/or slopes, which may be imposed by the location and the structure available to mount the collectors. In this paper, a simulation application is developed for optimizing the position and number of collectors in a solar collector array with regard to maximum annual energy gain and thermal efficiency. The analyzed solar collector array has parallel- and serial-connected solar collectors with different tilts, orientations and thermal characteristics. Measurements are performed to determine the thermal performance of the system. Using the programming language INSEL, a simulation program is developed for the analyzed system, where optimization is done through parametric runs. The accent is on the SE-orientated collectors, regarding their tilt and number, comparing two solution scenarios with the current system configuration in terms of efficiency and total annual energy gain. The first scenario envisages changing the tilt angle of the solar panels on the SE orientation from 35° to 25°, while the second scenario envisages retaining the existing angle of 35° and adding an additional solar collector. Scenario 1 yields more than 13% energy gain on an annual basis, while Scenario 2 gives a 2% higher thermal efficiency.

  3. Integrated gasification combined cycle (IGCC) process simulation and optimization

    Energy Technology Data Exchange (ETDEWEB)

    Emun, F.; Gadalla, M.; Majozi, T.; Boer, D. [University of Rovira & Virgili, Tarragona (Spain). Dept. of Chemical Engineering

    2010-03-05

    The integrated gasification combined cycle (IGCC) is an electrical power generation system which offers efficient generation from coal with a lower effect on the environment than conventional coal power plants. However, further improving its efficiency and thereby lowering emissions are important tasks in achieving a more sustainable energy production. In this paper, a process simulation tool is proposed for the simulation of IGCC. This tool is used to improve IGCC's efficiency and environmental performance through an analysis of the operating conditions, together with process integration studies. Pinch analysis principles and process integration insights are then employed to make topological changes to the flowsheet to improve the energy efficiency and minimize the operation costs. Process data of the Texaco gasifier and the associated plants (coal preparation, air separation unit, gas cleaning, sulfur recovery, gas turbine, steam turbine and the heat recovery steam generator) are considered as a base case, and simulated using Aspen Plus. The results of the parameter analysis and heat integration studies indicate that a thermal efficiency of 45% can be reached, while a significant decrease in CO2 and SOx emissions is observed. The CO2 and SOx emission levels reached are 698 kg/MWh and 0.15 kg/MWh, respectively. Application of pinch analysis determines energy targets, and also identifies potential modifications for further improvement of the overall energy efficiency. Benefits of energy integration and steam production possibilities can further be quantified. Overall benefits can be translated into minimum operation costs and atmospheric emissions.

  4. Global Optimization for Black-box Simulation via Sequential Intrinsic Kriging

    NARCIS (Netherlands)

    Mehdad, E.; Kleijnen, Jack P.C.

    2014-01-01

    In this paper we investigate global optimization for black-box simulations using metamodels to guide this optimization. As a novel metamodel we introduce intrinsic Kriging, for either deterministic or random simulation. For deterministic simulation we study the famous `efficient global optimization'

  5. Molecular dynamics simulation of subnanometric tool-workpiece contact on a force sensor-integrated fast tool servo for ultra-precision microcutting

    Science.gov (United States)

    Cai, Yindi; Chen, Yuan-Liu; Shimizu, Yuki; Ito, So; Gao, Wei; Zhang, Liangchi

    2016-04-01

    This paper investigates the contact characteristics between a copper workpiece and a diamond tool in a force sensor-integrated fast tool servo (FS-FTS) for single point diamond microcutting and in-process measurement of ultra-precision surface forms of the workpiece. Molecular dynamics (MD) simulations are carried out to identify the subnanometric elastic-plastic transition contact depth, at which the plastic deformation in the workpiece is initiated. This critical depth can be used to optimize the FS-FTS as well as the cutting/measurement process. It is clarified that the vibrations of the copper atoms in the MD model have a great influence on the subnanometric MD simulation results. A multi-relaxation time method is then proposed to reduce the influence of the atom vibrations based on the fact that the dominant vibration component has a certain period determined by the size of the MD model. It is also identified that for a subnanometric contact depth, the position of the tool tip for the contact force to be zero during the retracting operation of the tool does not correspond to the final depth of the permanent contact impression on the workpiece surface. The accuracy for identification of the transition contact depth is then improved by observing the residual defects on the workpiece surface after the tool retracting.

  6. Stochastic simulation and robust design optimization of integrated photonic filters

    Directory of Open Access Journals (Sweden)

    Weng Tsui-Wei

    2017-01-01

    Manufacturing variations are becoming an unavoidable issue in modern fabrication processes; therefore, it is crucial to be able to include stochastic uncertainties in the design phase. In this paper, integrated photonic coupled ring resonator filters are considered as an example of significant interest. The sparsity structure in photonic circuits is exploited to construct a sparse combined generalized polynomial chaos model, which is then used to analyze related statistics and perform robust design optimization. Simulation results show that the optimized circuits are more robust to fabrication process variations and achieve a reduction of 11%–35% in the mean square errors of the 3 dB bandwidth compared to unoptimized nominal designs.
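    As a loose illustration of robust design under fabrication variation (the paper's sparse generalized polynomial chaos model is not reproduced here), the sketch below uses plain Monte Carlo sampling to minimize the mean square error of a toy one-parameter response against its target; the `bandwidth` model, target, and noise level are all invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def bandwidth(gap, sigma=0.0, n=2000):
    """Toy stand-in for a ring-filter 3 dB bandwidth model: a smooth
    nominal response evaluated under random fabrication perturbations."""
    perturb = rng.normal(0.0, sigma, size=n)
    return 1.0 / (1.0 + (gap + perturb) ** 2)   # arbitrary smooth response

def robust_mse(gap, target=0.8, sigma=0.05):
    """Mean square error of the response against its target, averaged
    over fabrication variations -- the robust objective."""
    return float(np.mean((bandwidth(gap, sigma) - target) ** 2))

# crude 1-D robust design: pick the gap that minimizes the averaged MSE
gaps = np.linspace(0.0, 1.0, 101)
best_gap = gaps[np.argmin([robust_mse(g) for g in gaps])]
```

Polynomial chaos replaces the brute-force sampling inside `robust_mse` with a cheap surrogate, but the robust objective being minimized is the same.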

  7. Determination of oil well production performance using artificial neural network (ANN) linked to the particle swarm optimization (PSO) tool

    Directory of Open Access Journals (Sweden)

    Mohammad Ali Ahmadi

    2015-06-01

    In this work, novel and rigorous methods based on two intelligent approaches, an artificial neural network (ANN) linked to the particle swarm optimization (PSO) tool, are developed to precisely forecast the productivity of horizontal wells under pseudo-steady-state conditions. A very good match is found between the modeling output and real data taken from the literature, with a very low average absolute error percentage (<0.82%). The developed techniques can also be incorporated into numerical reservoir simulation packages to improve accuracy and enable better parametric sensitivity analysis.
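    A minimal sketch of the PSO half of such a scheme: a plain global-best particle swarm searches over the inputs of a surrogate model, where an arbitrary smooth function stands in for the trained ANN; all parameters and names are illustrative, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(1)

def surrogate(x):
    """Stand-in for a trained ANN predicting well productivity;
    an arbitrary smooth function with a known maximum at (2, -1)."""
    return -((x[..., 0] - 2.0) ** 2 + (x[..., 1] + 1.0) ** 2)

def pso_maximize(f, bounds, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5):
    """Plain global-best particle swarm optimization."""
    lo, hi = bounds
    x = rng.uniform(lo, hi, size=(n_particles, len(lo)))
    v = np.zeros_like(x)
    pbest, pbest_f = x.copy(), f(x)
    gbest = pbest[np.argmax(pbest_f)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        fx = f(x)
        better = fx > pbest_f
        pbest[better], pbest_f[better] = x[better], fx[better]
        gbest = pbest[np.argmax(pbest_f)].copy()
    return gbest

best = pso_maximize(surrogate, (np.array([-5.0, -5.0]), np.array([5.0, 5.0])))
```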

  8. Development Concept of Guaranteed Verification Electric Power System Simulation Tools and Its Realization

    Directory of Open Access Journals (Sweden)

    Gusev Alexander

    2015-01-01

    This article analyzes the reliability and verification problems of widespread electric power system (EPS) simulation tools, all of which rely on numerical methods for ordinary differential equations. The concept of guaranteed verification of EPS simulation tools, and the structure of its realization, are described. Both are based on a Simulator capable of continuous, decomposition-free, three-phase EPS simulation in real time, over an unlimited range, with guaranteed accuracy. The Simulator's output can be verified using only quasi-steady-state regime data received from SCADA, and such a Simulator can serve as the reference model for verifying any EPS simulation tool.

  9. An integrated multidisciplinary design optimization method for computer numerical control machine tool development

    Directory of Open Access Journals (Sweden)

    Zaifang Zhang

    2015-02-01

    A computer numerical control machine tool is a typical complex product involving multidisciplinary fields, complex structure, and high-performance requirements, and it is difficult to identify the overall optimal solution of the machine tool structure for its multiple objectives. A new integrated multidisciplinary design optimization method is therefore proposed, using Latin hypercube sampling, a Kriging approximate model, and a multi-objective genetic algorithm. The design space and parametric model are built by choosing appropriate design variables and their value ranges. Samples in the design space are generated by the optimal Latin hypercube method, and the contributions of the design variables to design performance are discussed to aid the designer's judgment. The Kriging model is built using polynomial approximation according to the response outputs of these samples. The multidisciplinary design model is established with three optimization objectives (mass, deformation, and first-order natural frequency) and two constraints (second-order and third-order natural frequencies). The optimal solution is identified by a multi-objective genetic algorithm. The proposed method is applied in a case study for a typical computer numerical control machine tool; in the optimal solution, the mass decreases by 3.35% and the first-order natural frequency increases by 4.34% relative to the original design.
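    Latin hypercube sampling, the first step of the workflow above, can be sketched in a few lines: each dimension is split into n equal-probability strata, one draw is taken per stratum, and the strata are shuffled independently across dimensions (a minimal implementation, not the authors' "optimal" variant).

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng=None):
    """Latin hypercube sampling: one point in each of n equal-probability
    strata per dimension, randomly paired across dimensions."""
    rng = np.random.default_rng(rng)
    bounds = np.asarray(bounds, dtype=float)      # shape (dim, 2)
    dim = bounds.shape[0]
    # one stratified uniform draw per stratum in [0, 1)
    u = (rng.random((n_samples, dim)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(dim):                          # decouple strata across dimensions
        rng.shuffle(u[:, j])
    return bounds[:, 0] + u * (bounds[:, 1] - bounds[:, 0])

samples = latin_hypercube(10, [[0.0, 1.0], [10.0, 20.0]], rng=0)
```

Each column of `samples` contains exactly one point per stratum, which is what gives LHS better space-filling than plain random sampling at the same budget.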

  10. Virtual reality simulation for the optimization of endovascular procedures: current perspectives

    Directory of Open Access Journals (Sweden)

    Rudarakanchana N

    2015-03-01

    Nung Rudarakanchana,1 Isabelle Van Herzeele,2 Liesbeth Desender,2 Nicholas JW Cheshire1 1Department of Surgery, Imperial College London, London, UK; 2Department of Thoracic and Vascular Surgery, Ghent University Hospital, Ghent, Belgium. On behalf of EVEREST (European Virtual reality Endovascular RESearch Team). Abstract: Endovascular technologies are rapidly evolving, often requiring coordination and cooperation between clinicians and technicians from diverse specialties. These multidisciplinary interactions lead to challenges that are reflected in the high rate of errors occurring during endovascular procedures. Endovascular virtual reality (VR) simulation has evolved from simple benchtop devices to full physics simulators with advanced haptics, dynamic imaging, and physiological controls. The latest developments in this field include the use of fully immersive simulated hybrid angiosuites to train whole endovascular teams in crisis resource management, and novel technologies that enable practitioners to build VR simulations based on patient-specific anatomy. As our understanding of the skills, both technical and nontechnical, required for optimal endovascular performance improves, the requisite tools for objective assessment of these skills are being developed and will further enable the use of VR simulation in the training and assessment of endovascular interventionalists and their entire teams. Simulation training that allows deliberate practice without danger to patients may be key to bridging the gap between new endovascular technology and improved patient outcomes. Keywords: virtual reality, simulation, endovascular, aneurysm

  11. Optimization of hydrogen vehicle refueling via dynamic simulation

    DEFF Research Database (Denmark)

    Rothuizen, Erasmus Damgaard; Mérida, W.; Rokni, Masoud

    2013-01-01

    A dynamic model has been developed to analyze and optimize the thermodynamics and design of hydrogen refueling stations. The model is based on Dymola software and incorporates discrete components. Two refueling station designs were simulated and compared. The modeling results indicate that pressure...... loss in the vehicle's storage system is one of the main factors determining the mass flow and peak cooling requirements of the refueling process. The design of the refueling station does not influence the refueling of the vehicle when the requirements of the technical information report J2601 from...

  12. Global optimization of tool path for five-axis flank milling with a cylindrical cutter

    Institute of Scientific and Technical Information of China (English)

    DING Han; ZHU LiMin

    2009-01-01

    In this paper, optimum positioning of cylindrical cutter for five-axis flank milling of non-developable ruled surface is addressed from the perspective of surface approximation. Based on the developed interchangeability principle, global optimization of the five-axis tool path is modeled as approximation of the tool envelope surface to the data points on the design surface following the minimum zone criterion recommended by ANSI and ISO standards for tolerance evaluation. By using the signed point-to-surface distance function, tool path plannings for semi-finish and finish millings are formulated as two constrained optimization problems in a unified framework. Based on the second order Taylor approximation of the distance function, a sequential approximation algorithm along with a hierarchical algorithmic structure is developed for the optimization. Numerical examples are presented to confirm the validity of the proposed approach.

  14. Optimizing Grippers for Compensating Pose Uncertainties by Dynamic Simulation

    DEFF Research Database (Denmark)

    Wolniakowski, Adam; Kramberger, Aljaž; Gams, Andrej

    2016-01-01

    Gripper design process is one of the interesting challenges in the context of grasping within industry. Typically, simple parallel-finger grippers, which are easy to install and maintain, are used in platforms for robotic grasping. The context switches in these platforms require frequent exchange......, we have presented a method to automatically compute the optimal finger shapes for defined task contexts in simulation. In this paper, we show the performance of our method in an industrial grasping scenario. We first analyze the uncertainties of the used vision system, which are the major source...... of grasping error. Then, we perform the experiments, both in simulation and in a real setting. The experiments confirmed the validity of our approach. The computed finger design was employed in a real industrial assembly scenario....

  15. Edge control in CNC polishing, paper 2: simulation and validation of tool influence functions on edges.

    Science.gov (United States)

    Li, Hongyu; Walker, David; Yu, Guoyu; Sayle, Andrew; Messelink, Wilhelmus; Evans, Rob; Beaucamp, Anthony

    2013-01-14

    Edge mis-figure is regarded as one of the most difficult technical issues in manufacturing the segments of extremely large telescopes, and it can dominate key aspects of performance. A novel edge-control technique has been developed, based on the 'Precessions' polishing technique, for which accurate and stable edge tool influence functions (TIFs) are crucial. In the first paper in this series [D. Walker, Opt. Express 20, 19787-19798 (2012)], multiple parameters were experimentally optimized using an extended set of experiments. The first purpose of this new work is to 'short circuit' that procedure through modeling, which also opens the prospect of optimizing local (as distinct from global) polishing for edge mis-figure, now under separate development. This paper presents a model that can predict edge TIFs based on surface-speed profiles and pressure distributions over the polishing spot at the edge of the part, the latter calculated by finite element analysis and verified by direct force measurement. It also presents a hybrid measurement method for edge TIFs to verify the simulation results. Experimental and simulation results show good agreement.
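    Most polishing TIF models rest on Preston's hypothesis: local removal rate is proportional to pressure times relative surface speed. The sketch below evaluates this for an assumed Gaussian pressure spot spinning about its own centre, which yields the characteristic ring-shaped influence function (purely illustrative; the paper obtains the pressure distribution from FEA, and all values here are invented).

```python
import numpy as np

# Preston's hypothesis: removal rate = k * pressure * relative speed.
k_preston = 1e-3                       # assumed Preston coefficient
r = np.linspace(0.0, 3.0, 300)         # radial coordinate within the spot
pressure = np.exp(-r ** 2)             # assumed Gaussian pressure profile
speed = 2.0 * np.pi * 5.0 * r          # surface speed of a 5 Hz spin about the centre
tif = k_preston * pressure * speed     # removal per unit dwell time

peak_radius = r[np.argmax(tif)]        # analytically at 1/sqrt(2) for this profile
```

The zero removal at the spot centre (`speed = 0` there) is exactly why edge and centre behaviour of spinning tools needs careful modelling.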

  16. Combining On-Line Characterization Tools with Modern Software Environments for Optimal Operation of Polymerization Processes

    Directory of Open Access Journals (Sweden)

    Navid Ghadipasha

    2016-02-01

    This paper discusses the initial steps towards the formulation and implementation of a generic and flexible model-centric framework for integrated simulation, estimation, optimization, and feedback control of polymerization processes. For the first time, it combines the powerful capabilities of the automatic continuous on-line monitoring of polymerization (ACOMP) system with a modern simulation, estimation, and optimization software environment in an integrated scheme for the optimal operation of polymeric processes. An initial validation of the framework was performed for modelling and optimization using literature data, illustrating the flexibility of the method under different systems and conditions. Subsequently, the off-line capabilities of the system were fully tested experimentally for model validation, parameter estimation, and process optimization using ACOMP data. Experimental results are provided for the free radical solution polymerization of methyl methacrylate.

  17. Development of simulation tools for numerical investigation and computer-aided design (CAD) of gyrotrons

    Science.gov (United States)

    Damyanova, M.; Sabchevski, S.; Zhelyazkov, I.; Vasileva, E.; Balabanova, E.; Dankov, P.; Malinov, P.

    2016-10-01

    As the most powerful CW sources of coherent radiation in the sub-terahertz to terahertz frequency range, gyrotrons have demonstrated remarkable potential for numerous novel and prospective applications in fundamental physical research and technology. Among them are powerful gyrotrons for electron cyclotron resonance heating (ECRH) and current drive (ECCD) of magnetically confined plasma in various reactors for controlled thermonuclear fusion (e.g., tokamaks and most notably ITER), high-frequency gyrotrons for sub-terahertz spectroscopy (for example NMR-DNP, XDMR, study of the hyperfine structure of positronium, etc.), gyrotrons for thermal processing, and so on. Modelling and simulation are indispensable tools for numerical studies, computer-aided design (CAD), and optimization of such sophisticated vacuum tubes (fast-wave devices), which operate on a physical principle known as the electron cyclotron resonance maser (ECRM) instability. In recent years, our research team has been involved in the development of physical models and problem-oriented software packages for numerical analysis and CAD of different gyrotrons in the framework of a broad international collaboration. In this paper we present the current status of our simulation tools (the GYROSIM and GYREOSS packages) and illustrate their functionality with results of recent numerical experiments. Finally, we provide an outlook on the envisaged further development of the computer codes and the computational modules belonging to these packages and specialized to different subsystems of the gyrotrons.

  18. Optimal Machine Tools Selection Using Interval-Valued Data FCM Clustering Algorithm

    Directory of Open Access Journals (Sweden)

    Yupeng Xin

    2014-01-01

    Machine tool selection directly affects production rates, accuracy, and flexibility. In order to quickly and accurately select appropriate machine tools in machining process planning, this paper proposes an optimal machine tool selection method based on an interval-valued data fuzzy C-means (FCM) clustering algorithm. We define the machining capability meta (MAE) as the smallest unit describing the machining capacity of machine tools, and establish an MAE library based on the MAE information model. According to the manufacturing process requirements, MAEs can be queried from the library. Subsequently, the interval-valued data FCM algorithm is used to select appropriate machine tools for the manufacturing process: by computing the matching degree between the machining constraints of the manufacturing process and the MAEs, the most appropriate MAEs and their corresponding machine tools are obtained. Finally, a case study of an aeroengine exhaust duct part demonstrates the applicability of the proposed method.
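    A minimal scalar-valued fuzzy C-means sketch, standing in for the interval-valued variant used in the paper; the two synthetic "capability vector" blobs and all parameters are invented for the example.

```python
import numpy as np

def fuzzy_cmeans(X, c, m=2.0, iters=100, rng=None):
    """Plain fuzzy C-means: alternate membership and centroid updates."""
    rng = np.random.default_rng(rng)
    U = rng.random((len(X), c))
    U /= U.sum(axis=1, keepdims=True)            # fuzzy memberships, rows sum to 1
    for _ in range(iters):
        W = U ** m                               # fuzzified memberships
        centers = (W.T @ X) / W.sum(axis=0)[:, None]
        d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2.0 / (m - 1.0)))       # inverse-distance memberships
        U /= U.sum(axis=1, keepdims=True)
    return centers, U

# two well-separated blobs standing in for machine-tool capability vectors
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0.0, 0.1, (20, 2)), rng.normal(5.0, 0.1, (20, 2))])
centers, U = fuzzy_cmeans(X, c=2, rng=2)
```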

  19. On-line Optimization-Based Simulators for Fractured and Non-fractured Reservoirs

    Energy Technology Data Exchange (ETDEWEB)

    Milind D. Deo

    2005-08-31

    Oil field development is a multi-million dollar business, and reservoir simulation is often used to guide the field management and development process. Reservoir characterization and geologic modeling tools have become increasingly sophisticated, and as a result the geologic models produced are complex. Most reservoirs are fractured to a certain extent; new geologic characterization methods are making it possible to map features such as faults and fractures field-wide, and significant progress has been made in predicting the properties of faults and fractured zones. Traditionally, finite difference methods have been employed in discretizing the domains created by geologic means; for complex geometries, finite-element methods of discretization may be more suitable. Since reservoir simulation is a mature science, some of the advances in numerical methods (linear and nonlinear solvers and parallel computing) have not been fully realized in the implementation of most simulators. The purpose of this project was to address some of these issues. One goal of the project was to develop a series of finite-element simulators to handle problems of complex geometry, including systems containing faults and fractures. The idea was to incorporate the most modern computing tools: modular object-oriented computer languages, the most sophisticated linear and nonlinear solvers, parallel computing methods, and good visualization tools. Another task of the project was to demonstrate the construction of fractures and faults in a reservoir using the available data and to assign properties to these features. Once the reservoir model is in place, it is desirable to find the operating conditions which provide the best reservoir performance; this can be accomplished by utilizing optimization tools and coupling them with reservoir simulation. Optimization-based reservoir simulation was one of the

  20. Optimization of suspension smelting technology by computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Lilius, K.; Jokilaakso, A.; Ahokainen, T.; Teppo, O.; Yang Yongxiang [Helsinki Univ. of Technology, Otaniemi (Finland). Lab. of Materials Processing and Powder Metallurgy

    1994-12-31

    The flash smelting process has been extensively studied during its over 40 years of existence. Laboratory or pilot scale experiments cannot, however, predict all the complicated and coupled phenomena taking place in a flash smelting furnace. The development of commercial modelling software and increasingly efficient computers has given researchers a new tool for more comprehensive investigation of the transport and combustion processes taking place in the flash smelting process. A flash smelting furnace and a waste-heat boiler geometry have been simulated in two- and three-dimensional laboratory models, which have, in turn, been modeled using commercial Computational Fluid Dynamics (CFD) software. The computer simulation has then been extended to an industrial-scale furnace and waste-heat boiler. The work has proceeded from cold gas flow to heat transfer, combustion, and two-phase flow simulations. In the present approach, the modelling task has been divided into submodels: outlet values of one model are used as inlet values in the subsequent submodel. Heat transfer calculations have been carried out starting from very basic considerations; different options of the software and heat transfer modes have been tested, and hot gas flow through the furnace and boiler has been simulated. Validation of the models was carried out with temperature measurements from the uptake shaft. One geometrical variation of the uptake shaft has also been simulated, namely turning its outlet by 90 degrees. Combustion of sulphides is approximated with gaseous combustion using a built-in combustion model of the software. The waste-heat boiler has been simulated first as an empty geometry, then with gradually added approximated radiation curtains and convection tube bundles. Both convective and radiative heat transfer were considered.

  1. Statistical Testing of Optimality Conditions in Multiresponse Simulation-Based Optimization (Replaced by Discussion Paper 2007-45)

    NARCIS (Netherlands)

    Bettonvil, B.W.M.; Del Castillo, E.; Kleijnen, Jack P.C.

    2005-01-01

    This paper derives a novel procedure for testing the Karush-Kuhn-Tucker (KKT) first-order optimality conditions in models with multiple random responses. Such models arise in simulation-based optimization with multivariate outputs. This paper focuses on expensive simulations, which have small sample
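    The deterministic core of such a test, checking the KKT stationarity condition grad f = Σ λ_i grad g_i over the active constraints with nonnegative λ_i, can be sketched as follows (the paper's statistical treatment of noisy responses is not reproduced; the example problem is invented).

```python
import numpy as np

def num_grad(f, x, h=1e-6):
    """Central-difference gradient."""
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def kkt_stationarity(f, active_cons, x):
    """Least-squares multipliers for grad f(x) = sum_i lam_i * grad g_i(x)
    over the active constraints; returns (multipliers, residual norm)."""
    G = np.column_stack([num_grad(g, x) for g in active_cons])
    lam, *_ = np.linalg.lstsq(G, num_grad(f, x), rcond=None)
    resid = np.linalg.norm(G @ lam - num_grad(f, x))
    return lam, resid

# example: minimize x1^2 + x2^2 subject to x1 + x2 >= 1 (active at the optimum)
f = lambda x: x[0] ** 2 + x[1] ** 2
g = lambda x: x[0] + x[1] - 1.0
lam, resid = kkt_stationarity(f, [g], np.array([0.5, 0.5]))
```

A small residual together with nonnegative multipliers indicates the candidate point satisfies the first-order conditions; the paper's contribution is doing this when the gradients are only estimated from random simulation output.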

  2. Professors' and students' perceptions and experiences of computational simulations as learning tools

    Science.gov (United States)

    Magana de Leon, Alejandra De Jesus

    Computational simulations are becoming a critical component of scientific and engineering research, and now are becoming an important component for learning. This dissertation provides findings from a multifaceted research study exploring the ways computational simulations have been perceived and experienced as learning tools by instructors and students. Three studies were designed with an increasing focus on the aspects of learning and instructing with computational simulation tools. Study One used a student survey with undergraduate and graduate students whose instructors enhanced their teaching using online computational tools. Results of this survey were used to identify students' perceptions and experiences with these simulations as learning tools. The results provided both an evaluation of the instructional design and an indicator of which instructors were selected in Study Two. Study Two used a phenomenographic research design resulting in a two dimensional outcome space with six qualitatively different ways instructors perceived their learning outcomes associated with using simulation tools as part of students' learning experiences. Results from this work provide a framework for identifying major learning objectives to promote learning with computational simulation tools. Study Three used a grounded theory methodology to expand on instructors' learning objectives to include their perceptions of formative assessment and pedagogy. These perceptions were compared and contrasted with students' perceptions associated with learning with computational tools. The study is organized around three phases and analyzed as a collection of case studies focused on the instructors and their students' perceptions and experiences of computational simulations as learning tools. This third study resulted in a model for using computational simulations as learning tools. This model indicates the potential of integrating the computational simulation tools into formal learning

  3. Collaboration pathway(s) using new tools for optimizing `operational' climate monitoring from space

    Science.gov (United States)

    Helmuth, Douglas B.; Selva, Daniel; Dwyer, Morgan M.

    2015-09-01

    Consistently collecting the earth's climate signatures remains a priority for world governments and international scientific organizations. Architecting a long-term solution requires transforming scientific missions into an optimized, robust 'operational' constellation that addresses the collective needs of policy makers, scientific communities and global academic users for trusted data. The application of new tools offers pathways for global architecture collaboration. Recent rule-based expert system (RBES) optimization modeling of the intended NPOESS architecture becomes a surrogate for global operational climate monitoring architecture(s). These rule-based tools provide valuable insight for global climate architectures through comparison and evaluation of alternatives and the sheer range of trade space explored. Optimization of climate monitoring architecture(s) for a partial list of ECVs (essential climate variables) is explored and described in detail, with dialogue on appropriate rule-based valuations. These optimization tools suggest advantages of global collaboration and elicit responses from the audience and climate science community. This paper focuses on recent research exploring the joint requirement implications of the high-profile NPOESS architecture and extends the research and tools to optimization for a climate-centric case study, reflecting work from the SPIE RS Conferences of 2013 and 2014, abridged for simplification [30, 32]. First, the heavily scrutinized NPOESS architecture inspired the recent research question: was complexity (as a cost/risk factor) overlooked when considering the benefits of aggregating different missions onto a single platform? Now, years later, the question has completely reversed: should agencies consider disaggregation as the answer? We discuss what some academic research suggests.
    Second, using the GCOS requirements of earth climate observations via ECVs (essential climate variables), many collected from space-based sensors, and accepting their

  4. Update on HCDstruct - A Tool for Hybrid Wing Body Conceptual Design and Structural Optimization

    Science.gov (United States)

    Gern, Frank H.

    2015-01-01

    HCDstruct is a Matlab® based software tool to rapidly build a finite element model for structural optimization of hybrid wing body (HWB) aircraft at the conceptual design level. The tool uses outputs from a Flight Optimization System (FLOPS) performance analysis together with a conceptual outer mold line of the vehicle, e.g. created by Vehicle Sketch Pad (VSP), to generate a set of MSC Nastran® bulk data files. These files can readily be used to perform a structural optimization and weight estimation using Nastran’s® Solution 200 multidisciplinary optimization solver. Initially developed at NASA Langley Research Center to perform increased fidelity conceptual level HWB centerbody structural analyses, HCDstruct has grown into a complete HWB structural sizing and weight estimation tool, including a fully flexible aeroelastic loads analysis. Recent upgrades to the tool include the expansion to a full wing tip-to-wing tip model for asymmetric analyses like engine out conditions and dynamic overswings, as well as a fully actuated trailing edge, featuring up to 15 independently actuated control surfaces and twin tails. Several example applications of the HCDstruct tool are presented.
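    The kind of free-field Nastran bulk data such a tool emits can be sketched in a few lines; the helper names are hypothetical and the single quad panel below is purely illustrative, not HCDstruct's actual output.

```python
def grid_card(gid, x, y, z):
    """Free-field Nastran GRID bulk data entry (node definition)."""
    return f"GRID,{gid},,{x:.3f},{y:.3f},{z:.3f}"

def cquad4_card(eid, pid, g1, g2, g3, g4):
    """Free-field Nastran CQUAD4 shell element entry."""
    return f"CQUAD4,{eid},{pid},{g1},{g2},{g3},{g4}"

# a single quad panel: four grids and one shell element referencing them
bulk = [grid_card(i + 1, x, y, 0.0)
        for i, (x, y) in enumerate([(0, 0), (1, 0), (1, 1), (0, 1)])]
bulk.append(cquad4_card(1, 100, 1, 2, 3, 4))
deck = "\n".join(bulk)
```

A real sizing model adds property (PSHELL), material (MAT1), load, and design-variable entries, but the generation pattern is the same: format structured records line by line into the bulk data files.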

  5. Object-Oriented Multi-Disciplinary Design, Analysis, and Optimization Tool

    Science.gov (United States)

    Pak, Chan-gi

    2011-01-01

    An Object-Oriented Optimization (O3) tool was developed that leverages existing tools and practices, and allows the easy integration and adoption of new state-of-the-art software. At the heart of the O3 tool is the Central Executive Module (CEM), which can integrate disparate software packages in a cross platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. This object-oriented framework can integrate the analysis codes for multiple disciplines instead of relying on one code to perform the analysis for all disciplines. The CEM was written in FORTRAN and the script commands for each performance index were submitted through the use of the FORTRAN Call System command. In this CEM, the user chooses an optimization methodology, defines objective and constraint functions from performance indices, and provides starting and side constraints for continuous as well as discrete design variables. The structural analysis modules such as computations of the structural weight, stress, deflection, buckling, and flutter and divergence speeds have been developed and incorporated into the O3 tool to build an object-oriented Multidisciplinary Design, Analysis, and Optimization (MDAO) tool.

  6. Tools for Performance Simulation of Heat, Air and Moisture Conditions of Whole Buildings

    DEFF Research Database (Denmark)

    Woloszyn, Monika; Rode, Carsten

    2008-01-01

    Humidity of indoor air is an important factor influencing the air quality and energy consumption of buildings as well as durability of building components. Indoor humidity depends on several factors, such as moisture sources, air change, sorption in materials and possible condensation. Since all...... and moisture transfer processes that take place in “whole buildings” by considering all relevant parts of its constituents. It is believed that full understanding of these processes for the whole building is absolutely crucial for future energy optimization of buildings, as this cannot take place without...... these phenomena are strongly dependent on each other, numerical predictions of indoor humidity need to be integrated into combined heat and airflow simulation tools. The purpose of a recent international collaborative project, IEA ECBCS Annex 41, has been to advance development in modelling the integral heat, air...

  7. Synthesize, optimize, analyze, repeat (SOAR): Application of neural network tools to ECG patient monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Watrous, R.; Towell, G.; Glassman, M.S. [Siemens Corporate Research, Princeton, NJ (United States)

    1995-12-31

    Results are reported from the application of tools for synthesizing, optimizing, and analyzing neural networks to an ECG patient monitoring task. A neural network was synthesized from a rule-based classifier and optimized over a set of normal and abnormal heartbeats. The classification error rate on a separate and larger test set was reduced by a factor of 2. When the network was analyzed and reduced in size by 40%, the same level of performance was maintained.

  8. Modelling and Optimization of Technological Process for Magnetron Synthesis of AlTiN Nanocomposite Films on Cutting Tools

    Science.gov (United States)

    Kozhina, T. D.

    2016-04-01

    The paper highlights the results of research on developing a mechanism to model the technological process for magnetron synthesis of nanocomposite films on cutting tools, which provides their specified physical and mechanical characteristics by controlling pulsed plasma parameters. The paper presents optimal conditions for AlTiN coating deposition on cutting tools, selected according to the ion energy of the sputtered atoms, in order to provide the specified physical and mechanical characteristics.

  9. Enhancing simulation of efficiency with analytical tools. [combining computer simulation and analytical techniques for cost reduction

    Science.gov (United States)

    Seltzer, S. M.

    1974-01-01

    Some means of combining computer simulation and analytical techniques are indicated in order to mutually enhance their efficiency as design tools and to motivate those involved in engineering design to consider using such combinations. While the idea is not new, heavy reliance on computers often seems to overshadow the potential utility of analytical tools. Although the example used is drawn from the area of dynamics and control, the principles espoused are applicable to other fields. In the example, the parameter plane stability analysis technique is described briefly and extended beyond that reported in the literature to increase its utility (through a simple set of recursive formulas) and its applicability (through portrayal of the effect of varying the sampling period of the computer). The numerical values rapidly selected by analysis were found to be correct for the hybrid computer simulation for which they were needed. This obviated the need for cut-and-try methods to choose the numerical values, thereby saving both time and computer utilization.

  10. Parametric Optimization Through Numerical Simulation of VCR Diesel Engine

    Science.gov (United States)

    Ganji, Prabhakara Rao; Mahmood, Al-Qarttani Abdulrahman Shakir; Kandula, Aasrith; Raju, Vysyaraju Rajesh Khana; Rao, Surapaneni Srinivasa

    2016-06-01

    In the present study, the Variable Compression Ratio (VCR) engine was analyzed numerically using the CONVERGE™ Computational Fluid Dynamics code in order to optimize design/operating parameters such as Compression Ratio (CR), Start of Injection (SOI) and Exhaust Gas Recirculation (EGR). The VCR engine was run at 100% load to test its performance and was validated for the standard configuration. Simulations were performed by varying CR (18-14), SOI (17°-26° bTDC) and EGR (0-15%) at a constant fuel injection pressure of 230 bar and speed of 1500 rpm. The effect of each of these parameters on pressure, oxides of nitrogen (NOx) and soot is presented. Finally, regression equations were developed for pressure, NOx and soot from the simulation results. The regression equations were solved for multi-objective criteria in order to reduce NOx and soot while maintaining the baseline performance. The optimized configuration was tested for validation and found satisfactory.
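
    The final step of this record, solving fitted regression equations under multi-objective criteria, can be sketched as a weighted-sum scalarization over the stated parameter ranges. The linear surrogate coefficients, weights, and starting point below are invented for illustration, not fitted engine data:

```python
import numpy as np
from scipy.optimize import minimize

# Linear surrogates standing in for the paper's fitted regression equations;
# all coefficients are invented placeholders, not engine data.
def nox(x):
    cr, soi, egr = x
    return 8.0 + 0.6 * cr + 0.3 * soi - 0.5 * egr

def soot(x):
    cr, soi, egr = x
    return 5.0 - 0.2 * cr - 0.1 * soi + 0.4 * egr

def scalarized(x, w=0.5):
    # weighted-sum scalarization of the two emission objectives
    return w * nox(x) + (1.0 - w) * soot(x)

# CR, SOI (deg bTDC) and EGR (%) ranges taken from the abstract
bounds = [(14, 18), (17, 26), (0, 15)]
res = minimize(scalarized, x0=np.array([16.0, 21.0, 7.5]), bounds=bounds)
print(res.x)
```

    A baseline-performance requirement (e.g. a lower bound on cylinder pressure) would enter as an additional constraint rather than a term in the scalarized objective.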

  11. Validation, Optimization and Simulation of a Solar Thermoelectric Generator Model

    Science.gov (United States)

    Madkhali, Hadi Ali; Hamil, Ali; Lee, HoSung

    2017-08-01

    This study explores thermoelectrics as a viable option for small-scale solar thermal applications. Thermoelectric technology is based on the Seebeck effect, which states that a voltage is induced when a temperature gradient is applied to the junctions of two differing materials. This research proposes to analyze, validate, simulate, and optimize a prototype solar thermoelectric generator (STEG) model in order to increase efficiency. The intent is to further develop STEGs as a viable and productive energy source that limits pollution and reduces the cost of energy production. An empirical study (Kraemer et al. in Nat Mater 10:532, 2011) on the solar thermoelectric generator reported a high efficiency of 4.6%. The system had a vacuum glass enclosure, a flat panel (absorber), a thermoelectric generator and water circulation for the cold side. The theoretical and numerical approach of this current study validated the experimental results from Kraemer's study to a high degree. The numerical simulation process utilizes a two-stage approach in ANSYS software for Fluent and Thermal-Electric Systems. The solar load model technique uses solar radiation under AM 1.5G conditions in Fluent. This analytical model applies Dr. Ho Sung Lee's theory of optimal design to improve the performance of the STEG system by using dimensionless parameters. Applying this theory, with two cover glasses and radiation shields, the STEG model can achieve a maximum efficiency of 7%.

  12. Parametric Optimization Through Numerical Simulation of VCR Diesel Engine

    Science.gov (United States)

    Ganji, Prabhakara Rao; Mahmood, Al-Qarttani Abdulrahman Shakir; Kandula, Aasrith; Raju, Vysyaraju Rajesh Khana; Rao, Surapaneni Srinivasa

    2017-08-01

    In the present study, the Variable Compression Ratio (VCR) engine was analyzed numerically using the CONVERGE™ Computational Fluid Dynamics code in order to optimize design/operating parameters such as Compression Ratio (CR), Start of Injection (SOI) and Exhaust Gas Recirculation (EGR). The VCR engine was run at 100% load to test its performance and was validated for the standard configuration. Simulations were performed by varying CR (18-14), SOI (17°-26° bTDC) and EGR (0-15%) at a constant fuel injection pressure of 230 bar and speed of 1500 rpm. The effect of each of these parameters on pressure, oxides of nitrogen (NOx) and soot is presented. Finally, regression equations were developed for pressure, NOx and soot from the simulation results. The regression equations were solved for multi-objective criteria in order to reduce NOx and soot while maintaining the baseline performance. The optimized configuration was tested for validation and found satisfactory.

  13. Development of a Multi-Event Trajectory Optimization Tool for Noise-Optimized Approach Route Design

    NARCIS (Netherlands)

    Braakenburg, M.L.; Hartjes, S.; Visser, H.G.; Hebly, S.J.

    2011-01-01

    This paper presents preliminary results from an ongoing research effort towards the development of a multi-event trajectory optimization methodology that allows to synthesize RNAV approach routes that minimize a cumulative measure of noise, taking into account the total noise effect aggregated for a

  14. A Software Tool for Optimal Sizing of PV Systems in Malaysia

    Directory of Open Access Journals (Sweden)

    Tamer Khatib

    2012-01-01

    Full Text Available This paper presents a MATLAB-based, user-friendly software tool called PV.MY for optimal sizing of photovoltaic (PV) systems. The software has the capabilities of predicting meteorological variables such as solar energy, ambient temperature and wind speed using an artificial neural network (ANN); it optimizes the PV module/array tilt angle, optimizes the inverter size and calculates optimal capacities of the PV array, battery, wind turbine and diesel generator in hybrid PV systems. The ANN-based model for meteorological prediction uses four meteorological variables, namely sunshine ratio, day number and location coordinates. As for PV system sizing, iterative methods are used to determine the optimal sizing of three types of PV systems: the standalone PV system, the hybrid PV/wind system and the hybrid PV/diesel generator system. The loss of load probability (LLP) technique is used for optimization, in which the energy source capacities are the variables to be optimized subject to a very low LLP. As for determining the optimal PV panel tilt angle and inverter size, the Liu and Jordan model for solar energy incident on a tilted surface is used in optimizing the monthly tilt angle, while a model for the inverter efficiency curve is used in the optimization of the inverter size.
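
    The LLP-based iterative sizing described above can be sketched roughly as follows. The hourly solar and load series, component prices, and the 1% LLP target are all synthetic placeholders, not PV.MY's ANN-predicted meteorology or actual cost model:

```python
import numpy as np

# Synthetic hourly resource and load series (placeholders only)
rng = np.random.default_rng(0)
hours = 24 * 365
load = np.full(hours, 1.0)                                   # kW, flat demand
solar = np.clip(rng.normal(1.2, 0.6, hours), 0.0, None)      # kW per kWp

def llp(pv_kwp, batt_kwh):
    """Fraction of annual demand not served (loss of load probability)."""
    soc, unmet = batt_kwh, 0.0
    for s, d in zip(pv_kwp * solar, load):
        soc += s - d
        if soc < 0.0:
            unmet -= soc          # accumulate unserved energy
            soc = 0.0
        soc = min(soc, batt_kwh)  # surplus beyond battery capacity is lost
    return unmet / load.sum()

# Iterate over candidate sizes; keep the cheapest meeting LLP <= 1 %.
best = None
for pv in np.arange(0.5, 3.01, 0.25):
    for batt in np.arange(0.0, 20.1, 2.0):
        if llp(pv, batt) <= 0.01:
            cost = 1000.0 * pv + 200.0 * batt    # hypothetical unit prices
            if best is None or cost < best[0]:
                best = (cost, pv, batt)
print(best)
```

    The hybrid PV/wind and PV/diesel cases extend the same loop with additional capacity variables in the energy balance.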

  15. Simulation-based optimal Bayesian experimental design for nonlinear systems

    KAUST Repository

    Huan, Xun

    2013-01-01

    The optimal selection of experimental conditions is essential to maximizing the value of data for inference and prediction, particularly in situations where experiments are time-consuming and expensive to conduct. We propose a general mathematical framework and an algorithmic approach for optimal experimental design with nonlinear simulation-based models; in particular, we focus on finding sets of experiments that provide the most information about targeted sets of parameters. Our framework employs a Bayesian statistical setting, which provides a foundation for inference from noisy, indirect, and incomplete data, and a natural mechanism for incorporating heterogeneous sources of information. An objective function is constructed from information theoretic measures, reflecting expected information gain from proposed combinations of experiments. Polynomial chaos approximations and a two-stage Monte Carlo sampling method are used to evaluate the expected information gain. Stochastic approximation algorithms are then used to make optimization feasible in computationally intensive and high-dimensional settings. These algorithms are demonstrated on model problems and on nonlinear parameter inference problems arising in detailed combustion kinetics. © 2012 Elsevier Inc.
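
    The two-stage Monte Carlo evaluation of expected information gain mentioned in this record can be illustrated on a toy linear-Gaussian model, y = theta·d + noise, with a standard normal prior. The model, noise level, and sample sizes are invented; the outer prior sample is reused in the inner average so the evidence estimate is never zero:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 0.1   # observation noise; toy model y = theta * d + noise

def log_lik(y, theta, d):
    return -0.5 * ((y - theta * d) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

def eig(d, n_outer=500, n_inner=500):
    """Two-stage (nested) Monte Carlo estimate of expected information gain."""
    thetas = rng.standard_normal(n_outer)                   # prior draws
    ys = thetas * d + sigma * rng.standard_normal(n_outer)  # simulated data
    total = 0.0
    for y, th in zip(ys, thetas):
        # include the generating theta so the evidence estimate stays positive
        inner = np.concatenate(([th], rng.standard_normal(n_inner)))
        evidence = np.mean(np.exp(log_lik(y, inner, d)))
        total += log_lik(y, th, d) - np.log(evidence)
    return total / n_outer

# A larger design amplitude d separates parameter values better, so the
# estimated information gain should grow with d.
print(eig(0.1), eig(1.0))
```

    The paper replaces the expensive forward model inside this loop with polynomial chaos surrogates and optimizes over d with stochastic approximation; the nested estimator itself is the common structural core.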

  16. Simulation and optimization of SOFC-BCHP system

    Institute of Scientific and Technical Information of China (English)

    ZHANG Xing-mei; ZHAO Xi-ling; DUAN Chang-gui

    2009-01-01

    As the prime mover of a dispersed energy system, high-temperature solid oxide fuel cells (SOFC) are highly efficient with large heat recovery. This study presents a simulation of an SOFC building-based cooling, heat and power (BCHP) system, which can meet the basic power and heating (cooling) requirements of the designated customers. The peak power load can be met by the power grid, while the peak heating (cooling) load can be met by backup equipment. In order to solve the economic dispatch problem of the energy system, a constrained nonlinear optimization model has been developed. The production costs can be minimized subject to both the equality constraints of the customer's heat and power demands and inequality constraints on the equipment capacities. The sequential quadratic programming method has been used to search for the solution. The study indicates that the model can be used to optimize the system's capacities and operating strategy. An office building case has been computed, and it is shown that the model can serve in the design and optimization of SOFC-BCHP systems.
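
    The dispatch formulation described here, equality constraints on heat and power demand plus capacity bounds, solved by sequential quadratic programming, might look like the following sketch using SciPy's SLSQP. All prices, the heat-recovery ratio, demands, and capacities are invented for illustration:

```python
from scipy.optimize import minimize

p_dem, q_dem = 80.0, 60.0        # electric and thermal demand (kW), invented

def cost(x):
    sofc, grid, boiler = x
    # marginal production costs, hypothetical $/kWh-type prices
    return 0.10 * sofc + 0.18 * grid + 0.06 * boiler

cons = [
    {"type": "eq", "fun": lambda x: x[0] + x[1] - p_dem},        # power balance
    {"type": "eq", "fun": lambda x: 0.8 * x[0] + x[2] - q_dem},  # heat balance
]
bounds = [(0, 100), (0, 200), (0, 150)]  # SOFC, grid, backup-boiler capacities
res = minimize(cost, x0=[40, 40, 30], bounds=bounds,
               constraints=cons, method="SLSQP")
print(res.x)   # SOFC output, grid purchase, boiler output
```

    With these numbers the SOFC runs as hard as the heat balance allows (its recovered heat is worth more than boiler heat), and the grid covers the remaining electric load.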

  17. Time domain simulation and modeling of power electronic circuits. Development of a simulation tool

    Energy Technology Data Exchange (ETDEWEB)

    Mo, O.

    1993-08-01

    This thesis presents the results of a study on the topic: Time domain modeling and simulation of power electronic circuits. The objectives of the presented work have been to improve and expand the simulation program KREAN. This also included search for, development and implementation of models suited for analysis of power electronic circuits. The main contribution of the work is the improved KREAN program itself and the models created for the program. Further, the work has led to this thesis which is a documentation of the applied methods. The thesis shows how to create a power electronic simulation tool and how to meet the special problems encountered in power electronic circuits. Among the major improvements of KREAN are: Better methods for solution of nonlinear algebraic equations. Major modifications have been implemented in the modified Newton iteration method. The old method suffered from insufficient control of the iteration error. Improved efficiency, accuracy and robustness of the breakpoint detection methods (breakpoints are time instants of discontinuous behavior in models). A new linear circuit now replaces the nonlinear modules at each stage in the iteration. The old one could give serious errors in the results and was not applicable after introduction of voltage response terminals. Several new models have been implemented as KREAN modules. Together with the old basic ones, they form a powerful set for simulation of power electronics. The thesis describes the applied methods, the implemented models and also presents results from study of the accuracy and efficiency of the program. The applied methods in the program are stated to be good enough for most simulation purposes. 100 refs., 93 figs., 14 tabs.

  18. A study on optimization of hybrid drive train using Advanced Vehicle Simulator (ADVISOR)

    Science.gov (United States)

    Same, Adam; Stipe, Alex; Grossman, David; Park, Jae Wan

    This study investigates the advantages and disadvantages of three hybrid drive train configurations: series, parallel, and "through-the-ground" parallel. Power flow simulations are conducted with the MATLAB/Simulink-based software ADVISOR. These simulations are then applied to the UC Davis SAE Formula Hybrid vehicle. ADVISOR performs simulation calculations for vehicle position using a combined backward/forward method. These simulations are used to study how efficiency and agility are affected by the motor, fuel converter, and hybrid configuration. Three different vehicle models are developed to optimize the drive train of a vehicle for three stages of the SAE Formula Hybrid competition: autocross, endurance, and acceleration. Input cycles are created based on rough estimates of track geometry. The output from these ADVISOR simulations is a series of plots of velocity profile and energy storage State of Charge that provide a good estimate of how the Formula Hybrid vehicle will perform on the given course. The most noticeable discrepancy between the input cycle and the actual velocity profile of the vehicle occurs during deceleration. A weighted ranking system is developed to organize the simulation results and to determine the best drive train configuration for the Formula Hybrid vehicle. Results show that the through-the-ground parallel configuration with front-mounted motors achieves an optimal balance of efficiency, simplicity, and cost. ADVISOR is proven to be a useful tool for vehicle power train design for the SAE Formula Hybrid competition. This vehicle model based on ADVISOR simulation is applicable to various studies concerning performance and efficiency of hybrid drive trains.
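
    The weighted ranking system mentioned in this record reduces to scoring each configuration per event and sorting by the weighted sum. The per-event scores and weights below are made-up placeholders, not the study's simulation outputs:

```python
# Hypothetical per-event scores (higher is better) for the three
# drive-train configurations; "ttg_parallel" = through-the-ground parallel.
scores = {
    "series":       {"autocross": 6, "endurance": 8, "acceleration": 5},
    "parallel":     {"autocross": 7, "endurance": 6, "acceleration": 8},
    "ttg_parallel": {"autocross": 8, "endurance": 7, "acceleration": 7},
}
weights = {"autocross": 0.3, "endurance": 0.5, "acceleration": 0.2}

ranked = sorted(
    scores,
    key=lambda cfg: sum(weights[e] * s for e, s in scores[cfg].items()),
    reverse=True,
)
print(ranked)   # → ['ttg_parallel', 'series', 'parallel']
```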

  19. A study on optimization of hybrid drive train using Advanced Vehicle Simulator (ADVISOR)

    Energy Technology Data Exchange (ETDEWEB)

    Same, Adam; Stipe, Alex; Grossman, David; Park, Jae Wan [Department of Mechanical and Aeronautical Engineering, University of California, Davis, One Shields Ave, Davis, CA 95616 (United States)

    2010-10-01

    This study investigates the advantages and disadvantages of three hybrid drive train configurations: series, parallel, and "through-the-ground" parallel. Power flow simulations are conducted with the MATLAB/Simulink-based software ADVISOR. These simulations are then applied to the UC Davis SAE Formula Hybrid vehicle. ADVISOR performs simulation calculations for vehicle position using a combined backward/forward method. These simulations are used to study how efficiency and agility are affected by the motor, fuel converter, and hybrid configuration. Three different vehicle models are developed to optimize the drive train of a vehicle for three stages of the SAE Formula Hybrid competition: autocross, endurance, and acceleration. Input cycles are created based on rough estimates of track geometry. The output from these ADVISOR simulations is a series of plots of velocity profile and energy storage State of Charge that provide a good estimate of how the Formula Hybrid vehicle will perform on the given course. The most noticeable discrepancy between the input cycle and the actual velocity profile of the vehicle occurs during deceleration. A weighted ranking system is developed to organize the simulation results and to determine the best drive train configuration for the Formula Hybrid vehicle. Results show that the through-the-ground parallel configuration with front-mounted motors achieves an optimal balance of efficiency, simplicity, and cost. ADVISOR is proven to be a useful tool for vehicle power train design for the SAE Formula Hybrid competition. This vehicle model based on ADVISOR simulation is applicable to various studies concerning performance and efficiency of hybrid drive trains. (author)

  20. D-VASim: A Software Tool to Simulate and Analyze Genetic Logic Circuits

    DEFF Research Database (Denmark)

    Baig, Hasan; Madsen, Jan

    2016-01-01

    -stage researchers with limited experience in the field of biology. The Solution: Using LabVIEW to develop a user-friendly simulation tool named Dynamic Virtual Analyzer and Simulator (D-VASim), which is the first software tool in the domain of synthetic biology that provides a virtual laboratory environment...

  1. D-VASim: A Software Tool to Simulate and Analyze Genetic Logic Circuits

    DEFF Research Database (Denmark)

    Baig, Hasan; Madsen, Jan

    2016-01-01

    -stage researchers with limited experience in the field of biology. The Solution: Using LabVIEW to develop a user-friendly simulation tool named Dynamic Virtual Analyzer and Simulator (D-VASim), which is the first software tool in the domain of synthetic biology that provides a virtual laboratory environment...

  2. Application of Maintenance Simulate System of NC Machine Tools in Teaching

    Directory of Open Access Journals (Sweden)

    He Xin

    2017-01-01

    Full Text Available The project design is the foundation of training, which can ensure the usefulness of a human resource development system. The Training Project Maintenance Simulate System of NC Machine Tools is presented based on an analysis of the contradiction between supply and demand. This paper introduces several schemes for the Maintenance Simulate System of NC Machine Tools.

  3. Design optimization of the tool structure for stamping an automotive part with the high strength steel

    Science.gov (United States)

    Kim, S. H.; Choi, H. J.; Rho, J. D.; Kim, K. P.; Park, K. D.; Kwon, B. K.; Cho, C. H.; Kang, M. J.; Bae, S. M.

    2013-12-01

    Optimum shape design of the tool structure is carried out in order to decrease the deformation and the stress caused by the large stamping load, with a simultaneous reduction of weight. Topology optimization is carried out to design the shape of the rib structure, and the Taguchi method is utilized to optimize the core shape. As a result of optimization, the weight of the rib and core structures is reduced by 3.1%, and the deformation and the stress of the rib and core structures are decreased by 10.6% and 3.7%, respectively, compared to the initial design.

  4. Multidisciplinary Design, Analysis, and Optimization Tool Development Using a Genetic Algorithm

    Science.gov (United States)

    Pak, Chan-gi; Li, Wesley

    2009-01-01

    Multidisciplinary design, analysis, and optimization using a genetic algorithm is being developed at the National Aeronautics and Space Administration Dryden Flight Research Center (Edwards, California) to automate the analysis and design process by leveraging existing tools to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. This is a promising technology, but it faces many challenges in large-scale, real-world application. This report describes current approaches, recent results, and challenges for multidisciplinary design, analysis, and optimization as demonstrated by experience with the Ikhana fire pod design.

  5. RTSTEP regional transportation simulation tool for emergency planning - final report.

    Energy Technology Data Exchange (ETDEWEB)

    Ley, H.; Sokolov, V.; Hope, M.; Auld, J.; Zhang, K.; Park, Y.; Kang, X. (Energy Systems)

    2012-01-20

    such materials over a large area, with responders trying to mitigate the immediate danger to the population in a variety of ways that may change over time (e.g., in-place evacuation, staged evacuations, and declarations of growing evacuation zones over time). In addition, available resources will be marshaled in unusual ways, such as the repurposing of transit vehicles to support mass evacuations. Thus, any simulation strategy will need to be able to address highly dynamic effects and will need to be able to handle any mode of ground transportation. Depending on the urgency and timeline of the event, emergency responders may also direct evacuees to leave largely on foot, keeping roadways as clear as possible for emergency responders, logistics, mass transport, and law enforcement. This RTSTEP project developed a regional emergency evacuation modeling tool for the Chicago Metropolitan Area that emergency responders can use to pre-plan evacuation strategies and compare different response strategies on the basis of a rather realistic model of the underlying complex transportation system. This approach is a significant improvement over existing response strategies that are largely based on experience gained from small-scale events, anecdotal evidence, and extrapolation to the scale of the assumed emergency. The new tool will thus add to the toolbox available to emergency response planners to help them design appropriate generalized procedures and strategies that lead to an improved outcome when used during an actual event.

  6. MicroarrayDesigner: an online search tool and repository for near-optimal microarray experimental designs

    Directory of Open Access Journals (Sweden)

    Ferhatosmanoglu Nilgun

    2009-09-01

    Full Text Available Abstract Background Dual-channel microarray experiments are commonly employed for inference of differential gene expressions across varying organisms and experimental conditions. The design of dual-channel microarray experiments that can help minimize the errors in the resulting inferences has recently received increasing attention. However, a general and scalable search tool and a corresponding database of optimal designs were still missing. Description An efficient and scalable search method for finding near-optimal dual-channel microarray designs, based on a greedy hill-climbing optimization strategy, has been developed. It is empirically shown that this method can successfully and efficiently find near-optimal designs. Additionally, an improved interwoven loop design construction algorithm has been developed to provide an easily computable general class of near-optimal designs. Finally, in order to make the best results readily available to biologists, a continuously evolving catalog of near-optimal designs is provided. Conclusion A new search algorithm and database for near-optimal microarray designs have been developed. The search tool and the database are accessible via the World Wide Web at http://db.cse.ohio-state.edu/MicroarrayDesigner. Source code and binary distributions are available for academic use upon request.
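
    The greedy hill-climbing strategy described in this record can be sketched generically: treat a dual-channel design as a set of edges between conditions (one array per edge), score a design, and repeatedly swap one array for a better one. The score below, total pairwise path distance in the design graph, is a crude stand-in for the paper's statistical efficiency criterion; the instance sizes are arbitrary:

```python
import itertools
import random

def score(edges, n):
    """Sum of all-pairs shortest-path distances in the design graph
    (disconnected pairs are penalized with distance n)."""
    adj = {v: set() for v in range(n)}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    total = 0
    for s in range(n):
        dist, frontier = {s: 0}, [s]
        while frontier:                      # breadth-first search from s
            nxt = []
            for u in frontier:
                for v in adj[u]:
                    if v not in dist:
                        dist[v] = dist[u] + 1
                        nxt.append(v)
            frontier = nxt
        total += sum(dist.get(t, n) for t in range(n))
    return total

def hill_climb(n=6, n_arrays=6, iters=200, seed=0):
    rng = random.Random(seed)
    pool = list(itertools.combinations(range(n), 2))
    design = rng.sample(pool, n_arrays)      # random initial design
    best = score(design, n)
    for _ in range(iters):
        cand = design.copy()
        cand[rng.randrange(n_arrays)] = rng.choice(
            [e for e in pool if e not in design])
        s = score(cand, n)
        if s < best:                         # greedy: accept improvements only
            design, best = cand, s
    return design, best

design, best = hill_climb()
print(design, best)
```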

  7. An improved model for the oPtImal Measurement Probes Allocation tool

    Energy Technology Data Exchange (ETDEWEB)

    Sterle, C., E-mail: claudio.sterle@unina.it [Consorzio CREATE/Dipartimento di Ingegneria Elettrica e delle Tecnologie dell’Informazione, Università degli Studi di Napoli Federico II, Via Claudio 21, 80125 Napoli (Italy); Neto, A.C. [Fusion for Energy, 08019 Barcelona (Spain); De Tommasi, G. [Consorzio CREATE/Dipartimento di Ingegneria Elettrica e delle Tecnologie dell’Informazione, Università degli Studi di Napoli Federico II, Via Claudio 21, 80125 Napoli (Italy)

    2015-10-15

    Highlights: • The problem of optimally allocating the probes of a diagnostic system is tackled. • The problem is decomposed in two consecutive optimization problems. • Two original ILP models are proposed and sequentially solved to optimality. • The proposed ILP models improve and extend the previous work present in literature. • Real size instances have been optimally solved with very low computation time. - Abstract: The oPtImal Measurement Probes Allocation (PIMPA) tool has been recently proposed in [1] to maximize the reliability of a tokamak diagnostic system against the failure of one or more of the processing nodes. PIMPA is based on the solution of integer linear programming (ILP) problems, and it minimizes the effect of the failure of a data acquisition component. The first formulation of the PIMPA model did not support the concept of individual slots. This work presents an improved ILP model that addresses the above-mentioned problem by taking into account all the individual probes.
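
    PIMPA solves the allocation as an ILP; for intuition, the underlying idea, spreading redundant probes across acquisition nodes so that a single node failure loses as few signals as possible, can be brute-forced on a tiny invented instance:

```python
import itertools

# Hypothetical instance: three signals, each measured by a redundant probe pair
probes = ["p1", "p1b", "p2", "p2b", "p3", "p3b"]
signal = {p: p.rstrip("b") for p in probes}   # p1 and p1b measure signal p1, ...
nodes = [0, 1, 2]                             # data acquisition nodes

def lost_signals(alloc):
    """Worst case over single-node failures: number of signals whose
    probes were all assigned to the failed node."""
    n_signals = len(set(signal.values()))
    worst = 0
    for n in nodes:
        surviving = {signal[p] for p, a in zip(probes, alloc) if a != n}
        worst = max(worst, n_signals - len(surviving))
    return worst

# Exhaustive search over all 3^6 assignments of probes to nodes
best = min(itertools.product(nodes, repeat=len(probes)), key=lost_signals)
print(best, lost_signals(best))
```

    An ILP formulation scales this to real instance sizes, which is exactly what the exhaustive search above cannot do.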

  8. Stochastic Global Optimization and Its Applications with Fuzzy Adaptive Simulated Annealing

    CERN Document Server

    Aguiar e Oliveira Junior, Hime; Petraglia, Antonio; Rembold Petraglia, Mariane; Augusta Soares Machado, Maria

    2012-01-01

    Stochastic global optimization is a very important subject that has applications in virtually all areas of science and technology. Therefore there is nothing more opportune than writing a book about a successful and mature algorithm that turned out to be a good tool in solving difficult problems. Here we present some techniques for solving several problems by means of Fuzzy Adaptive Simulated Annealing (Fuzzy ASA), a fuzzy-controlled version of ASA, and by ASA itself. ASA is a sophisticated global optimization algorithm that is based upon ideas of the simulated annealing paradigm, coded in the C programming language and developed to statistically find the best global fit of a nonlinear constrained, non-convex cost function over a multi-dimensional space. By presenting detailed examples of its application we want to stimulate the reader’s intuition and make the use of Fuzzy ASA (or regular ASA) easier for everyone wishing to use these tools to solve problems. We kept formal mathematical requirements to a...
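
    For readers new to the paradigm, the core of simulated annealing fits in a few lines. This is plain SA with a geometric cooling schedule, not the adaptive or fuzzy-controlled ASA the book covers; the test function and all parameters are arbitrary:

```python
import math
import random

def cost(x):
    # a one-dimensional non-convex function with many local minima
    return x * x + 10 * math.sin(3 * x)

rng = random.Random(42)
x, best_x = 4.0, 4.0
T = 5.0                                   # initial temperature
for step in range(5000):
    cand = x + rng.gauss(0, 0.5)          # random local move
    d = cost(cand) - cost(x)
    # accept downhill moves always, uphill moves with probability exp(-d/T)
    if d < 0 or rng.random() < math.exp(-d / T):
        x = cand
        if cost(x) < cost(best_x):
            best_x = x
    T *= 0.999                            # geometric cooling
print(best_x, cost(best_x))
```

    ASA's contribution, and Fuzzy ASA's on top of it, is essentially in adapting the move distribution and cooling schedule automatically instead of fixing them as above.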

  9. OPTIMIZING LAYOUT OF URBAN STREET CANYON USING NUMERICAL SIMULATION COUPLING WITH MATHEMATICAL OPTIMIZATION

    Institute of Scientific and Technical Information of China (English)

    WANG Jia-song; ZHAO Bao-qing; YE Chun; YANG De-qing; HUANG Zhen

    2006-01-01

    Optimizing the layout of an urban street canyon to achieve the maximum environmental benefit should become a new principle for modern urban street design and planning. This paper aims to find the optimal street canyon layout from the viewpoint of environmental protection by using a two-dimensional numerical simulation model with a turbulence model, coupled with a mathematical optimization method. The total pollutant concentration within and at the top of the specific street canyons was taken as the objective function, and the height of one side of the canyon as the constraint. A nonlinearly improved constrained variable metric solver was used. The effect of the heights of the leeward and windward buildings on the integrated pollutant dispersion was studied to achieve the most beneficial configuration of the urban geometry. An optimized layout for an asymmetrical street canyon was obtained. It was further found that a step-down street canyon with a large height difference is generally a good layout for reducing pollutant accumulation in the street canyon.

  10. Simulation and optimization of biomass harvest and transport system

    Energy Technology Data Exchange (ETDEWEB)

    Busato, Patricia; Berruto, Remigio; Piccarolo, Pietro [University of Turin (Italy). Dipt. di Economia e Ingegneria Agraria, Forestale e Ambientale (DEIAFA)], E-mail: patrizia.busato@unito.it

    2008-07-01

    The implementation of a biomass supply chain requires good feasibility studies. Since biomass is characterized by low value and low energy density, logistic costs are an important component in assessing this feasibility. Designing the logistics and estimating the costs is a complex task because the process consists of multiple, intensively interlinked work processes. Bottlenecks within transport or unloading operations can reduce system capacity below the capacity of the harvester, while a well-matched system can lower the cost of producing forages. The overall goal of this study was to present the combined use of simulation and linear programming models to optimize the flow of biomass from field to power plant. The simulation predicted the overall system performance. The results from the simulation model were then used as input to the linear programming model, which chose the best combination of equipment for each field distance and yield in order to minimize the logistic costs while satisfying constraints such as the number of hours available for harvest and the area to be harvested. The presented case study refers to a corn silage harvest over an area of 4000 ha (72000 tDM). The logistic operation costs ranged from 17.00 Euros/tDM for a 10 km to 31.86 Euros/tDM for a 40 km biomass collection radius. The average unit costs were 306 Euros/ha and 574 Euros/ha, respectively. (author)
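
    The linear programming step described here, choosing equipment to minimize cost subject to area and time constraints, can be illustrated with SciPy's `linprog`. The equipment options, costs, and the hour budget below are invented, not taken from the study:

```python
from scipy.optimize import linprog

# Decision variables: hectares assigned to three hypothetical transport
# options (small trailer, large trailer, truck)
cost_per_ha = [28.0, 22.0, 25.0]     # Euros/ha, invented
hours_per_ha = [0.9, 0.6, 0.5]       # transport hours/ha, invented
area = 4000.0                        # ha to harvest (as in the case study)
hours_avail = 2300.0                 # harvest-window time budget, invented

res = linprog(
    c=cost_per_ha,                       # minimize total cost
    A_ub=[hours_per_ha], b_ub=[hours_avail],   # stay within the time budget
    A_eq=[[1.0, 1.0, 1.0]], b_eq=[area],       # cover the whole area
    bounds=[(0, None)] * 3,
)
print(res.x, res.fun)
```

    With these numbers the cheapest option alone would exceed the time budget, so the LP mixes it with the faster truck, the kind of trade-off the simulation outputs feed into the real model.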

  11. The Use of the Articulated Total Body Model as a Robot Dynamics Simulation Tool

    Science.gov (United States)

    1988-07-01

    The Use of the Articulated Total Body Model as a Robot Dynamics Simulation Tool (AARL-SR-90-512; AD-A235 930), Louise A. [...]. In this paper the use of the ATB model as a robot dynamics simulation tool is discussed and various simulations are demonstrated.

  12. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    Science.gov (United States)

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

    A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and shows the software performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model, developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values have been observed to corroborate well with the extensive experimental investigations, which were found to be consistent under varying operating conditions like operating pressure, operating flow rate, and draw solute concentration. A low value of the relative error (RE = 0.09) and a high value of the Willmott d-index (d = 0.981) reflected a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater.
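
    The two agreement metrics quoted in this record, mean relative error and the Willmott index of agreement, are straightforward to compute; the observed/predicted values below are placeholders, not data from the plant:

```python
import numpy as np

obs = np.array([10.0, 12.5, 15.2, 18.0, 21.1])    # toy observed series
pred = np.array([10.4, 12.1, 15.9, 17.5, 21.9])   # toy model predictions

# mean relative error
rel_err = np.mean(np.abs(pred - obs) / obs)

# Willmott index of agreement d (1 = perfect agreement)
o_bar = obs.mean()
willmott_d = 1 - np.sum((pred - obs) ** 2) / np.sum(
    (np.abs(pred - o_bar) + np.abs(obs - o_bar)) ** 2
)
print(round(rel_err, 3), round(willmott_d, 3))    # → 0.037 0.995
```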

  13. A Tool for Optimizing the Build Performance of Large Software Code Bases

    NARCIS (Netherlands)

    Telea, Alexandru; Voinea, Lucian; Kontogiannis, K; Tjortjis, C; Winter, A

    2008-01-01

    We present Build Analyzer, a tool that helps developers optimize the build performance of huge systems written in C. Due to complex C header dependencies, even small code changes can cause extremely long rebuilds, which are problematic when code is shared and modified by teams of hundreds of individuals.

  14. AngelStow: A Commercial Optimization-Based Decision Support Tool for Stowage Planning

    DEFF Research Database (Denmark)

    Delgado-Ortegon, Alberto; Jensen, Rune Møller; Guilbert, Nicolas

    save port fees, optimize use of vessel capacity, and reduce bunker consumption. Stowage Coordinators (SCs) produce these plans manually with the help of graphical tools, but high-quality SPs are hard to generate with the limited support they provide. In this abstract, we introduce AngelStow which...

  15. Design tool for large solar hot water systems - Uniform optimization of components and economy

    NARCIS (Netherlands)

    Visser, H.

    1996-01-01

    In close collaboration with the parties concerned, i.e. both the sellers and investors, a design and optimization method for large solar hot water systems is being developed. In order to support investors in achieving the feasibility of such systems, the normalized method including software tool for

  16. A Tool for Optimizing the Build Performance of Large Software Code Bases

    NARCIS (Netherlands)

    Telea, Alexandru; Voinea, Lucian; Kontogiannis, K; Tjortjis, C; Winter, A

    2008-01-01

    We present Build Analyzer, a tool that helps developers optimize the build performance of huge systems written in C. Due to complex C header dependencies, even small code changes can cause extremely long rebuilds, which are problematic when code is shared and modified by teams of hundreds of

  17. An optimal inventory policy under certainty distributed demand for cutting tools with stochastically distributed lifespan

    Directory of Open Access Journals (Sweden)

    Cun Rong Li

    2015-12-01

    Full Text Available Traditional inventory policy has been deeply investigated for various kinds of demand in different industrial sectors, and many researchers have conducted more extensive explorations of inventory policy, including its combination with the manufacturing process, detailed attributes of the purchased products, etc. During the manufacturing process, the lifespan of cutting tools has a significant impact on both the quantity of inventory and the production cost. In this paper, the impact of the maximum allowable stopping time for cutting tools on production-inventory policy under general production demands was investigated. An optimal inventory policy with general demand (OIPGD) was developed, with which the allowable stopping time for tools, the order-up-to-level inventory, and the order cycle can be optimally determined by an exhaustive search algorithm. Examples with different distributions of tool lifespan and production demand are presented to show the implementation of the OIPGD model. The results and the sensitivity analysis of the parameters show that an optimized combination of tool allowable stopping time, order-up-to-level, and order cycle time can dramatically reduce the total cost of the whole inventory activity.
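
    The exhaustive-search idea behind OIPGD can be illustrated with a toy sketch. All cost figures, the exponential lifespan distribution and the parameter grid below are invented for illustration; the paper's actual cost model and demand distributions are more detailed.

    ```python
    import random

    def expected_cost(stop_time, order_up_to, cycle, n_trials=2000, seed=1):
        """Monte-Carlo estimate of average cost per cycle (toy model).

        Illustrative assumptions: tool lifespans ~ exponential(mean = 8 h),
        holding cost 0.5 per unused tool per cycle, fixed order cost 40,
        emergency-purchase cost 100 per missing tool, and a penalty of 25
        each time a tool fails before its allowed stopping time."""
        rng = random.Random(seed)
        total = 0.0
        for _ in range(n_trials):
            used = failures = 0
            t = 0.0
            while t < cycle:                      # consume tools over one cycle
                life = rng.expovariate(1 / 8.0)
                if life < stop_time:              # tool failed in service
                    failures += 1
                t += min(life, stop_time)         # replaced at stop_time at latest
                used += 1
            holding = 0.5 * max(order_up_to - used, 0)
            shortage = 100.0 * max(used - order_up_to, 0)
            total += 40.0 + holding + shortage + 25.0 * failures
        return total / n_trials

    # Exhaustive search over a small parameter grid, as in the OIPGD scheme.
    best = min(
        ((s, q, c) for s in (4, 6, 8) for q in (5, 10, 15, 20) for c in (24, 48)),
        key=lambda p: expected_cost(*p),
    )
    print("best (stop_time, order_up_to, cycle):", best)
    ```

    The grid here is tiny; the point is only the structure of the search: one cost evaluation per (stopping time, order-up-to-level, cycle) triple, and a plain minimum over the grid.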

  18. Optimizing tritium extraction from a Permeator Against Vacuum (PAV) by dimensional design using different tritium transport modeling tools

    Energy Technology Data Exchange (ETDEWEB)

    Martinez, P., E-mail: pablomiguel.martinez@ciemat.es [CIEMAT-LNF (Laboratorio Nacional de Fusion), Madrid (Spain); Moreno, C. [CIEMAT-LNF (Laboratorio Nacional de Fusion), Madrid (Spain); Martinez, I. [SENER Ingenieria y Sistemas, Provenca 392, 4a 08025 Barcelona (Spain); Sedano, L. [CIEMAT-LNF (Laboratorio Nacional de Fusion), Madrid (Spain)

    2012-08-15

    The Permeator Against Vacuum (PAV) has been conceived as the simplest, most cost-effective and reliable technology dedicated to tritium extraction from breeding liquid metals. An optimal design of a PAV requires a detailed hydraulic design optimization for the established operational ranges (HCLL at low velocities of ~1 mm/s, or DCLL in the range of tens of cm/s). The present work analyses the dependency of PAV extraction efficiency on the design parameters for an optimal on-line Tritium Extraction System (TES). Three different models have been built for that purpose: one based on physically refined 1D tritium transport computation using TMAP7 (the only simulation tool with QA for ITER), and two more detailed models in a 2D/3D FEM tool (COMSOL Multiphysics 4.0). The geometry used in this work is a simplification of the Fuskite® conceptual design developed at CIEMAT, consisting of a set of cylindrical and concentric α-Fe double membranes enclosing a vacuumed space and in contact with in-pipe flowing LiPb eutectic. The aim of this paper is to take the first steps towards establishing the optimal design parameters of a PAV and to evaluate the state of the art of these models.

  19. Modane: A Design Support Tool for Numerical Simulation Codes

    Directory of Open Access Journals (Sweden)

    Lelandais Benoît

    2016-07-01

    Full Text Available The continually increasing power of supercomputers allows numerical simulation codes to take more complex physical phenomena into account. Therefore, physicists and mathematicians have to implement complex algorithms using cutting-edge technologies and integrate them into large simulators. The CEA-DAM has been studying for several years the contribution of UML/MDE technologies to its simulator development cycle. The Modane application is one of the results of this work.

  20. Optimized multiple quantum MAS lineshape simulations in solid state NMR

    Science.gov (United States)

    Brouwer, William J.; Davis, Michael C.; Mueller, Karl T.

    2009-10-01

    The majority of nuclei available for study in solid state Nuclear Magnetic Resonance have half-integer spin I>1/2, with corresponding electric quadrupole moment. As such, they may couple with a surrounding electric field gradient. This effect introduces anisotropic line broadening to spectra, arising from distinct chemical species within polycrystalline solids. In Multiple Quantum Magic Angle Spinning (MQMAS) experiments, a second frequency dimension is created, devoid of quadrupolar anisotropy. As a result, the center of gravity of peaks in the high resolution dimension is a function of isotropic second order quadrupole and chemical shift alone. However, for complex materials, these parameters take on a stochastic nature due in turn to structural and chemical disorder. Lineshapes may still overlap in the isotropic dimension, complicating the task of assignment and interpretation. A distributed computational approach is presented here which permits simulation of the two-dimensional MQMAS spectrum, generated by random variates from model distributions of isotropic chemical and quadrupole shifts. Owing to the non-convex nature of the residual sum of squares (RSS) function between experimental and simulated spectra, simulated annealing is used to optimize the simulation parameters. In this manner, local chemical environments for disordered materials may be characterized, and via a re-sampling approach, error estimates for parameters produced. Program summary: Program title: mqmasOPT. Catalogue identifier: AEEC_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEEC_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 3650. No. of bytes in distributed program, including test data, etc.: 73 853. Distribution format: tar.gz. Programming language: C, OCTAVE. Computer: UNIX
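
    The simulated-annealing fit that mqmasOPT performs can be sketched generically. The real program simulates full two-dimensional MQMAS lineshapes; here a one-parameter Gaussian centre fit stands in for the spectrum model, purely to illustrate the RSS-minimizing annealing loop (all numbers are invented):

    ```python
    import math, random

    def model(center, xs):
        # Stand-in "spectrum": unit-height Gaussian lineshape of fixed width.
        return [math.exp(-((x - center) ** 2) / 2.0) for x in xs]

    def rss(a, b):
        # Residual sum of squares between two sampled spectra.
        return sum((u - v) ** 2 for u, v in zip(a, b))

    def anneal(observed, xs, t0=1.0, cooling=0.95, steps=400, seed=7):
        """Minimise RSS(observed, model(center)) by simulated annealing."""
        rng = random.Random(seed)
        center = 0.0
        best = cur = rss(observed, model(center, xs))
        best_center, t = center, t0
        for _ in range(steps):
            cand = center + rng.gauss(0, 0.5)          # random perturbation
            c = rss(observed, model(cand, xs))
            # Accept downhill moves always, uphill moves with Boltzmann probability.
            if c < cur or rng.random() < math.exp((cur - c) / t):
                center, cur = cand, c
                if cur < best:
                    best, best_center = cur, center
            t *= cooling                               # geometric cooling schedule
        return best_center, best

    xs = [i * 0.1 for i in range(-50, 51)]
    observed = model(2.0, xs)          # synthetic "experiment" centred at 2.0
    fit_center, fit_rss = anneal(observed, xs)
    print(fit_center, fit_rss)
    ```

    The re-sampling error estimates mentioned in the abstract would amount to repeating this fit on perturbed copies of `observed` and reporting the spread of the recovered parameters.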

  1. SIMULATION SYSTEM FOR FIVE-AXIS NC MACHINING USING GENERAL CUTTING TOOL

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A simulation system for five-axis NC machining using general cutting tools is presented. This system differs from other simulation systems in that it focuses not only on geometric simulation but also on collision detection, which is usually not included in NC machining simulation. In addition, the estimation of cutting forces is discussed. In order to obtain high efficiency, all algorithms use the swept-volume modeling technique, so the simulation system is compact and can be performed efficiently.

  2. Optimization of Operation Sequence in CNC Machine Tools Using Genetic Algorithm

    Science.gov (United States)

    Abu Qudeiri, Jaber; Yamamoto, Hidehiko; Ramli, Rizauddin

    The productivity of machine tools is significantly improved by using microcomputer-based CAD/CAM systems for NC program generation. Currently, many commercial CAD/CAM packages that provide automatic NC programming have been developed and applied to various cutting processes, and many cutting processes are machined by CNC machine tools. In this paper, we attempt to find an efficient solution approach to determine the best sequence for a set of operations located at asymmetrical positions and on different levels. In order to find the sequence of operations that achieves the shortest cutting tool travel path (CTTP), a genetic algorithm is introduced. After the sequence is optimized, the G-codes that encode the travel path are created. CTTP can be formulated as a special case of the traveling salesman problem (TSP). The incorporation of the genetic algorithm and TSP can be included in commercial CAD/CAM packages to optimize the CTTP during automatic generation of NC programs.
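
    Since CTTP reduces to a TSP over the hole or feature locations, the GA step can be sketched as a minimal permutation GA with order crossover and swap mutation. The hole coordinates, population size and rates below are illustrative placeholders, not the paper's settings:

    ```python
    import random

    def tour_length(tour, pts):
        # Closed-tour Euclidean length; tour[i-1] wraps to the last city at i=0.
        return sum(
            ((pts[tour[i]][0] - pts[tour[i - 1]][0]) ** 2
             + (pts[tour[i]][1] - pts[tour[i - 1]][1]) ** 2) ** 0.5
            for i in range(len(tour))
        )

    def order_crossover(p1, p2, rng):
        # Copy a slice from parent 1, fill remaining cities in parent-2 order.
        a, b = sorted(rng.sample(range(len(p1)), 2))
        child = [None] * len(p1)
        child[a:b] = p1[a:b]
        fill = [g for g in p2 if g not in child]
        for i in range(len(child)):
            if child[i] is None:
                child[i] = fill.pop(0)
        return child

    def ga_tsp(pts, pop_size=40, gens=120, seed=3):
        rng = random.Random(seed)
        pop = [rng.sample(range(len(pts)), len(pts)) for _ in range(pop_size)]
        for _ in range(gens):
            pop.sort(key=lambda t: tour_length(t, pts))
            nxt = pop[:4]                                  # elitism
            while len(nxt) < pop_size:
                c = order_crossover(*rng.sample(pop[:20], 2), rng)
                if rng.random() < 0.3:                     # swap mutation
                    i, j = rng.sample(range(len(c)), 2)
                    c[i], c[j] = c[j], c[i]
                nxt.append(c)
            pop = nxt
        return min(pop, key=lambda t: tour_length(t, pts))

    # Hypothetical hole positions on a workpiece (machine-table coordinates).
    holes = [(0, 0), (4, 0), (4, 3), (0, 3), (2, 5), (6, 2), (1, 1)]
    best = ga_tsp(holes)
    print(best, tour_length(best, holes))
    ```

    In the CAD/CAM setting the recovered permutation would then be translated into the G-code operation sequence.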

  3. Simulation Modeling and Statistical Network Tools for Improving Collaboration in Military Logistics

    Science.gov (United States)

    2008-10-01

    AFRL-RH-WP-TR-2009-0110. This final technical report (contract FA8650-07-1-6848) describes the research findings of the project Simulation Modeling and Statistical Network Tools for Improving Collaboration in Military Logistics.

  4. Aeroelastic Simulation Tool for Inflatable Ballute Aerocapture Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project will develop a much-needed multidisciplinary analysis tool for predicting the impact of aeroelastic effects on the functionality of inflatable...

  6. Coupled Aeroheating and Ablative Thermal Response Simulation Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — A predictive tool with tight coupling of the fluid and thermal physics will give insights into the conservatism of the uncoupled design process and could lead to...

  7. CFD simulations to optimize the blade design of water wheels

    Science.gov (United States)

    Quaranta, Emanuele; Revelli, Roberto

    2017-05-01

    At low-head sites and low discharges, water wheels are among the most convenient hydropower converters to install. The aim of this work is to improve the performance of an existing breastshot water wheel by changing the blade shape using computational fluid dynamics (CFD) simulations. Three blade profiles are investigated: the profile of the existing blades, a circular profile and an elliptical profile. The results are validated by performing experimental tests on the wheel with the existing profile. The numerical results show that the efficiency of breastshot wheels is affected by the blade profile. The average increase in efficiency using the new circular profile is about 4 % with respect to the profile of the existing blades.

  8. Simulation of optimal exploitation of an open geotermal loop

    Science.gov (United States)

    Vaganova, N. A.; Filimonov, M. Yu

    2016-10-01

    Geothermal aquifers are a renewable resource of heat and energy. To exploit these resources, open geothermal systems consisting of injection and production wells are commonly used. As a rule, such systems consist of two wells: hot water from the production well is used and becomes cooler, and the injection well returns this cold water to the aquifer. To simulate such an open geothermal system, a three-dimensional nonstationary mathematical model and numerical algorithms are developed, taking into account the most important physical and technical parameters of the wells, to describe the processes of heat transfer and thermal water filtration in the aquifer. Results of numerical calculations, which, in particular, are used to determine optimal parameters for a geothermal system in the North Caucasus, are presented.

  9. Wind hybrid electrical supply system: behaviour simulation and sizing optimization

    Science.gov (United States)

    Notton, G.; Cristofari, C.; Poggi, P.; Muselli, M.

    2001-04-01

    Using a global approach, a wind hybrid system operation is simulated and the evolution of several parameters is analysed, such as the wasted energy, the fuel consumption and the role of the wind turbine subsystem in the global production. This analysis shows that all the energies which take part in the system operation are more dependent on the wind turbine size than on the battery storage capacity. A storage of 2 or 3 days is sufficient, because an increase in storage beyond these values does not have a notable impact on the performance of the wind hybrid system. Finally, a cost study is performed to determine the optimal configuration of the system conducive to the lowest cost of electricity production.
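
    The kind of hourly energy-balance simulation behind such a sizing study can be sketched as follows. The wind and load profiles, battery model and round-trip efficiency are invented placeholders, not the paper's data:

    ```python
    def simulate(wind_kw, load_kw, batt_kwh, eff=0.85):
        """One pass over hourly series: returns (wasted_kWh, fuel_kWh).

        Surplus wind charges the battery (with efficiency `eff`) until full
        and the rest is wasted; deficits are covered by the battery first,
        then by a backup generator whose output is counted as `fuel`."""
        soc = wasted = fuel = 0.0
        for w, l in zip(wind_kw, load_kw):
            surplus = w - l
            if surplus >= 0:
                charge = min(surplus * eff, batt_kwh - soc)
                soc += charge
                wasted += surplus - charge / eff
            else:
                discharge = min(-surplus, soc)
                soc -= discharge
                fuel += -surplus - discharge
        return wasted, fuel

    # Toy 48-h profiles: variable wind vs. a flat 2 kW load.
    wind = [3, 4, 1, 0, 0, 2, 5, 6, 4, 2, 1, 0] * 4
    load = [2] * 48
    for batt in (0, 5, 10, 20):
        print(batt, simulate(wind, load, batt))
    ```

    Sweeping the battery size this way illustrates the qualitative trade-off the abstract reports: beyond a few days of storage, extra capacity changes wasted energy and fuel use only marginally.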

  10. Optimal Numerical Schemes for Compressible Large Eddy Simulations

    Science.gov (United States)

    Edoh, Ayaboe; Karagozian, Ann; Sankaran, Venkateswaran; Merkle, Charles

    2014-11-01

    The design of optimal numerical schemes for subgrid scale (SGS) models in LES of reactive flows remains an area of continuing challenge. It has been shown that significant differences in solution can arise due to the choice of the SGS model's numerical scheme and its inherent dissipation properties, which can be exacerbated in combustion computations. This presentation considers the individual roles of artificial dissipation, filtering, secondary conservation (Kinetic Energy Preservation), and collocated versus staggered grid arrangements with respect to the dissipation and dispersion characteristics and their overall impact on the robustness and accuracy for time-dependent simulations of relevance to reacting and non-reacting LES. We utilize von Neumann stability analysis in order to quantify these effects and to determine the relative strengths and weaknesses of the different approaches. Distribution A: Approved for public release, distribution unlimited. Supported by AFOSR (PM: Dr. F. Fahroo).
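
    A von Neumann check of the kind described can be sketched in a few lines. For first-order upwind applied to linear advection, the amplification factor is g(θ) = 1 − c(1 − e^(−iθ)), and |g| ≤ 1 exactly when the CFL number c ≤ 1. This is standard textbook material, not the presentation's specific schemes:

    ```python
    import cmath, math

    def upwind_amplification(c, theta):
        """Amplification factor of first-order upwind for u_t + a u_x = 0,
        with CFL number c = a*dt/dx and wavenumber angle theta = k*dx."""
        return 1 - c * (1 - cmath.exp(-1j * theta))

    def max_amplification(c, n=721):
        # Scan theta over [0, 2*pi); any |g| > 1 signals instability.
        return max(abs(upwind_amplification(c, 2 * math.pi * k / n))
                   for k in range(n))

    for c in (0.5, 1.0, 1.5):
        print(c, round(max_amplification(c), 4))
    ```

    The same scan, applied to a dissipation or filter operator, quantifies exactly the kind of scheme-dependent damping the abstract discusses.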

  11. Optimization-based Fluid Simulation on Unstructured Meshes

    DEFF Research Database (Denmark)

    Misztal, Marek Krzysztof; Bridson, Robert; Erleben, Kenny;

    We present a novel approach to fluid simulation, allowing us to take into account the surface energy in a precise manner. This new approach combines a novel, topology-adaptive approach to deformable interface tracking, called the deformable simplicial complexes method (DSC), with an optimization-based, linear finite element method for solving the incompressible Euler equations. The deformable simplicial complexes track the surface of the fluid: the fluid-air interface is represented explicitly as a piecewise linear surface which is a subset of a tetrahedralization of the space, such that the interface can be also represented implicitly as a set of faces separating tetrahedra marked as inside from the ones marked as outside. This representation introduces insignificant and controllable numerical diffusion, allows robust topological adaptivity and provides both a volumetric finite element mesh...

  12. Simulation and optimization models for emergency medical systems planning.

    Science.gov (United States)

    Bettinelli, Andrea; Cordone, Roberto; Ficarelli, Federico; Righini, Giovanni

    2014-01-01

    The authors address strategic planning problems for emergency medical systems (EMS). In particular, the three following critical decisions are considered: i) how many ambulances to deploy in a given territory at any given point in time, to meet the forecasted demand with an appropriate response time; ii) when ambulances should be used for serving non-urgent requests, and when they should rather be kept idle for possible incoming urgent requests; iii) how to define an optimal mix of contracts for renting ambulances from private associations to meet the forecasted demand at minimum cost. Analytical models for decision support, based on queueing theory, discrete-event simulation, and integer linear programming, are presented. Computational experiments have been done on real data from the city of Milan, Italy.
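
    The queueing side of decision i) can be sketched with the classic Erlang-C model, a standard M/M/c result; the paper's models are richer, so this is only the textbook core: find the fewest ambulances c such that the probability an urgent call must wait stays below a target.

    ```python
    import math

    def erlang_c(c, a):
        """P(wait > 0) in an M/M/c queue with offered load a = lambda/mu (Erlangs)."""
        if a >= c:
            return 1.0                      # unstable: every call eventually waits
        s = sum(a ** k / math.factorial(k) for k in range(c))
        top = a ** c / math.factorial(c) * c / (c - a)
        return top / (s + top)

    def min_servers(a, p_target):
        # Smallest c with P(wait) below the target probability.
        c = max(1, math.ceil(a))
        while erlang_c(c, a) > p_target:
            c += 1
        return c

    # E.g. 6 calls/h with 30 min mean service time -> offered load a = 3 Erlangs.
    print(min_servers(3.0, 0.10))
    ```

    The call rate, service time and 10 % waiting target are illustrative; in the paper such figures come from the forecasted demand.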

  13. Dynamic Simulation and Optimization of Nuclear Hydrogen Production Systems

    Energy Technology Data Exchange (ETDEWEB)

    Paul I. Barton; Mujid S. Kaximi; Georgios Bollas; Patricio Ramirez Munoz

    2009-07-31

    This project is part of a research effort to design a hydrogen plant and its interface with a nuclear reactor. The project developed a dynamic modeling, simulation and optimization environment for nuclear hydrogen production systems. A hybrid discrete/continuous model captures both the continuous dynamics of the nuclear plant, the hydrogen plant, and their interface, along with discrete events such as major upsets. This hybrid model makes use of accurate thermodynamic sub-models for the description of phase and reaction equilibria in the thermochemical reactor. Use of the detailed thermodynamic models will allow researchers to examine the process in detail and have confidence in the accuracy of the property package they use.

  14. Business Simulation Games: Effective Teaching Tools or Window Dressing?

    Science.gov (United States)

    Tanner, John R.; Stewart, Geoffrey; Totaro, Michael W.; Hargrave, Melissa

    2012-01-01

    Business simulations serve as learning platforms that stimulate the "gaming" interest of students, that provide a structured learning environment, and that should help manage the time resources of faculty. Simulations appear to provide a context where students feel learning can take place. However, faculty perception of simulation…

  15. The validity of arthroscopic simulators and performance tools

    NARCIS (Netherlands)

    Stunt, J.J.

    2017-01-01

    As there is a growing demand for more time-efficient and effective methods for medical training without putting patients at risk, the role of simulation keeps expanding. Validation of simulators should precede implementation in medical curricula. However, only a small minority of available medical s

  16. Design and optimization of new simulated moving bed plants

    Directory of Open Access Journals (Sweden)

    D. C. S. Azevedo

    2006-06-01

    Full Text Available The simulated moving bed (SMB technology has attracted considerable attention for its efficiency as a chromatographic adsorptive separation. It has been increasingly applied to the separation of binary mixtures with low separation factors, namely to separate isomers. Although quite a vast amount of information has been published concerning the simulation and design of operating conditions of existing SMB plants, fewer works have addressed the question of design and optimisation of geometric parameters and operating conditions of a new adsorber, especially when mass transfer resistances are significant. The present work extends an algorithm developed elsewhere to design SMB equipment and optimize its operating conditions and applies it to the case of fructose-glucose separation using a cation-exchange resin as stationary phase in order to obtain nearly pure fructose in the extract and glucose in the raffinate. The constraints were set as 99% purity for both products. The objective function was chosen to be the adsorbent productivity. The algorithm attempted to find the minimum column lengths for increasing throughputs, which met the required purity constraint. Then, the best construction parameters and operating conditions were chosen as those for which the adsorbent productivity was maximum. The effects of the safety margins applied on the velocity ratios in sections 1 and 4 were examined and a heuristic rule for optimum eluent flowrate was derived. The effect of the purity requirements was also investigated. Finally, the calculated optimal operating points, in terms of flowrate ratios in SMB sections 2 and 3, were analysed in the frame of the equilibrium theory. Sound coherence was verified, which confirmed the accuracy and adequacy of the extended algorithm for the design and optimisation of a SMB adsorber with strong mass transfer effects.
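
    The design loop the abstract describes can be sketched with a mock purity model: for each throughput, find the minimum column length meeting the purity constraint, then keep the configuration with the highest adsorbent productivity. The real algorithm evaluates a full SMB column model; the purity function below is an invented placeholder:

    ```python
    import math

    def purity(length_cm, throughput):
        """Placeholder: purity grows with column length, drops with throughput."""
        return 100.0 * (1.0 - math.exp(-length_cm / (20.0 + 4.0 * throughput)))

    def min_length(throughput, target=99.0, lmax=500.0, step=1.0):
        # Smallest column length meeting the purity constraint, if any.
        L = step
        while L <= lmax:
            if purity(L, throughput) >= target:
                return L
            L += step
        return None

    best = None
    for q in range(1, 11):                      # throughput levels (arbitrary units)
        L = min_length(q)
        if L is None:
            continue
        productivity = q / L                    # throughput per unit adsorbent
        if best is None or productivity > best[2]:
            best = (q, L, productivity)
    print(best)
    ```

    Safety margins on the section flow-rate ratios would enter as extra constraints on each candidate configuration before the productivity comparison.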

  17. Numerical simulation of the stress – strain state of technological tools for fine drawing

    OpenAIRE

    Порубов, А. В.; Мельникова, Т. Е.

    2014-01-01

    An urgent task is to ensure a long service life of the technological tool, namely the expensive diamond dies, which can significantly improve the efficiency of wire production. The work presents a strength evaluation of the technological tool: numerical simulation of the stress-strain state of the diamond drawing tool in the finite-element package ANSYS, and calculation of the strain and stress state of the diamond drawing tool for drawing copper and nickel-plated wire under the operating pressure and the pressing fo...

  18. Computer simulation of optimal sensor locations in loading identification

    Science.gov (United States)

    Li, Dong-Sheng; Li, Hong-Nan; Guo, Xing L.

    2003-07-01

    A method is presented for the selection of a set of sensor locations from a larger candidate set for the purpose of structural loading identification. The method ranks the candidate sensor locations according to their effectiveness for identifying the given known loadings. Measurement locations that yield abnormal jumps in identification results or increase the condition number of the frequency response function are removed. The final sensor configuration tends to minimize the error of the loading identification results and the condition number of the frequency response function. The initial candidate set is selected based on the modal kinetic energy distribution, which gives a measure of the dynamic contribution of each physical degree of freedom to each of the target mode shapes of interest. In addition, the excitation location is considered when selecting appropriate response measurement locations. The method was successfully applied to optimal sensor location selection and loading identification of a uniform cantilever beam in experiments. It is shown that computer simulation is a good way to select optimal sensor locations for loading identification.

  19. A Simulated Annealing Approach for the Train Design Optimization Problem

    Directory of Open Access Journals (Sweden)

    Federico Alonso-Pecina

    2017-01-01

    Full Text Available The Train Design Optimization Problem regards making optimal decisions on the number and movement of locomotives and crews through a railway network, so as to satisfy requested pick-up and delivery of car blocks at stations. In a mathematical programming formulation, the objective function to minimize is composed of the costs associated with the movement of locomotives and cars, the loading/unloading operations, the number of locomotives, and the crews’ return to their departure stations. The constraints include upper bounds for number of car blocks per locomotive, number of car block swaps, and number of locomotives passing through railroad segments. We propose here a heuristic method to solve this highly combinatorial problem in two steps. The first one finds an initial, feasible solution by means of an ad hoc algorithm. The second step uses the simulated annealing concept to improve the initial solution, followed by a procedure aiming to further reduce the number of needed locomotives. We show that our results are competitive with those found in the literature.

  20. Optimal Subinterval Selection Approach for Power System Transient Stability Simulation

    Directory of Open Access Journals (Sweden)

    Soobae Kim

    2015-10-01

    Full Text Available Power system transient stability analysis requires an appropriate integration time step to avoid numerical instability as well as to reduce computational demands. For fast system dynamics, which vary more rapidly than the time step covers, a fraction of the time step, called a subinterval, is used. However, the optimal value of this subinterval is not easily determined, because an analysis of the system dynamics might be required. This selection is usually made from engineering experience, and perhaps trial and error. This paper proposes an optimal subinterval selection approach for power system transient stability analysis, which is based on modal analysis using a single machine infinite bus (SMIB) system. Fast system dynamics are identified with the modal analysis, and the SMIB system is used focusing on fast local modes. An appropriate subinterval time step from the proposed approach can reduce the computational burden and achieve accurate simulation responses as well. The performance of the proposed method is demonstrated with the GSO 37-bus system.
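
    The idea of choosing the subinterval from the fastest mode can be illustrated with a linearized SMIB swing equation (all parameter values are invented, and the 2/|λmax| bound with a safety factor is only a common rule of thumb for explicit integration, not the paper's exact criterion):

    ```python
    import cmath, math

    # Illustrative SMIB linearization: M*d'' + D*d' + K*d = 0
    M, D, K = 0.026, 0.12, 1.8       # inertia, damping, synchronizing coeff. (invented)

    # State matrix A = [[0, 1], [-K/M, -D/M]]; eigenvalues from the
    # characteristic polynomial  lam^2 + (D/M)*lam + K/M = 0.
    b, c = D / M, K / M
    disc = cmath.sqrt(b * b - 4 * c)
    lams = [(-b + disc) / 2, (-b - disc) / 2]
    lam_max = max(abs(l) for l in lams)

    # Rule-of-thumb explicit step bound, with a safety factor of 10.
    dt_limit = 2.0 / lam_max
    subinterval = dt_limit / 10.0
    print(lam_max, dt_limit, subinterval)
    ```

    In the multi-machine case the same computation would be run on the fast local mode identified by the SMIB reduction, giving the subinterval directly instead of relying on trial and error.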