WorldWideScience

Sample records for economic tools algorithms

  1. Advanced Fuel Cycle Economic Tools, Algorithms, and Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    David E. Shropshire

    2009-05-01

    The Advanced Fuel Cycle Initiative (AFCI) Systems Analysis supports engineering economic analyses and trade-studies, and requires a requisite reference cost basis to support adequate analysis rigor. In this regard, the AFCI program has created a reference set of economic documentation. The documentation consists of the “Advanced Fuel Cycle (AFC) Cost Basis” report (Shropshire, et al. 2007), “AFCI Economic Analysis” report, and the “AFCI Economic Tools, Algorithms, and Methodologies Report.” Together, these documents provide the reference cost basis, cost modeling basis, and methodologies needed to support AFCI economic analysis. The application of the reference cost data in the cost and econometric systems analysis models will be supported by this report. These methodologies include: the energy/environment/economic evaluation of nuclear technology penetration in the energy market—domestic and internationally—and impacts on AFCI facility deployment, uranium resource modeling to inform the front-end fuel cycle costs, facility first-of-a-kind to nth-of-a-kind learning with application to deployment of AFCI facilities, cost tradeoffs to meet nuclear non-proliferation requirements, and international nuclear facility supply/demand analysis. The economic analysis will be performed using two cost models. VISION.ECON will be used to evaluate and compare costs under dynamic conditions, consistent with the cases and analysis performed by the AFCI Systems Analysis team. Generation IV Excel Calculations of Nuclear Systems (G4-ECONS) will provide static (snapshot-in-time) cost analysis and will provide a check on the dynamic results. In future analysis, additional AFCI measures may be developed to show the value of AFCI in closing the fuel cycle. Comparisons can show AFCI in terms of reduced global proliferation (e.g., reduction in enrichment), greater sustainability through preservation of a natural resource (e.g., reduction in uranium ore depletion), value from

  2. Economic dispatch optimization algorithm based on particle diffusion

    International Nuclear Information System (INIS)

    Han, Li; Romero, Carlos E.; Yao, Zheng

    2015-01-01

    Highlights: • A dispatch model that considers fuel, emissions control and wind power cost is built. • An optimization algorithm named diffusion particle optimization (DPO) is proposed. • DPO was used to analyze the impact of wind power risk and emissions on dispatch. - Abstract: Due to the widespread installation of emissions control equipment in fossil fuel-fired power plants, the cost of emissions control needs to be considered, together with the plant fuel cost, in providing economic power dispatch of those units to the grid. On the other hand, while using wind power decreases the overall power generation cost for the power grid, it poses a risk to a traditional grid, because of its inherent stochastic characteristics. Therefore, an economic dispatch optimization model needs to consider all of the fuel cost, emissions control cost and wind power cost for each of the generating units comprising the fleet that meets the required grid power demand. In this study, an optimization algorithm referred to as diffusion particle optimization (DPO) is proposed to solve such a complex optimization problem. In this algorithm, Brownian motion theory is used to guide the movement of particles so that the particles can search for an optimal solution over the entire definition region. Several benchmark functions and power grid system data were used to test the performance of DPO, and the results were compared to traditional algorithms used for economic dispatch optimization, such as particle swarm optimization and the artificial bee colony algorithm. It was found that DPO is less likely to be trapped in local optima. Across results from different power systems, DPO was able to find economic dispatch solutions with lower costs. DPO was also used to analyze the impact of wind power risk and fossil unit emissions coefficients on power dispatch. The results are encouraging for the use of DPO as a dynamic tool for economic dispatch of the power grid.
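    As a hedged illustration of the kind of dispatch model described above (a generic textbook form, not the paper's exact formulation), a combined objective over N fossil units plus wind can be written as:

    ```latex
    \min_{P_1,\dots,P_N,\,P_w}\; \sum_{i=1}^{N}\bigl[F_i(P_i) + E_i(P_i)\bigr] + C_w(P_w)
    \quad \text{s.t.} \quad \sum_{i=1}^{N} P_i + P_w = P_D + P_L, \qquad P_i^{\min}\le P_i\le P_i^{\max},
    ```

    where F_i is the fuel cost and E_i the emissions control cost of unit i, C_w a wind cost term capturing the stochastic over/under-estimation risk, P_D the demand and P_L the transmission losses.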

  3. Economic dispatch using chaotic bat algorithm

    International Nuclear Information System (INIS)

    Adarsh, B.R.; Raghunathan, T.; Jayabarathi, T.; Yang, Xin-She

    2016-01-01

    This paper presents the application of a new metaheuristic optimization algorithm, the chaotic bat algorithm, for solving the economic dispatch problem involving a number of equality and inequality constraints such as power balance, prohibited operating zones and ramp rate limits. Transmission losses and multiple fuel options are also considered for some problems. The chaotic bat algorithm, a variant of the basic bat algorithm, is obtained by incorporating chaotic sequences to enhance its performance. Five different example problems comprising 6, 13, 20, 40 and 160 generating units are solved to demonstrate the effectiveness of the algorithm. The algorithm requires little tuning by the user, and the results obtained show that it either outperforms or compares favorably with several existing techniques reported in the literature. - Highlights: • The chaotic bat algorithm, a new metaheuristic optimization algorithm, has been used. • The problem solved – the economic dispatch problem – is nonlinear and discontinuous. • It has a number of equality and inequality constraints. • The algorithm has been demonstrated to be applicable to high dimensional problems.
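    For reference, the core update rules of the basic bat algorithm that the chaotic variant builds on are (generic form; the specific chaotic maps used in the paper are an assumption here):

    ```latex
    f_i = f_{\min} + (f_{\max}-f_{\min})\,\beta, \qquad
    v_i^{t} = v_i^{t-1} + \bigl(x_i^{t-1}-x_{*}\bigr)\,f_i, \qquad
    x_i^{t} = x_i^{t-1} + v_i^{t},
    ```

    where x_* is the current best solution and beta is a uniform random number in [0,1]; a chaotic variant typically replaces beta (or the loudness/pulse-rate parameters) with a chaotic sequence such as the logistic map z_{k+1} = 4 z_k (1 - z_k).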

  4. Adaptive engineering management tools of enterprise economic security

    Directory of Open Access Journals (Sweden)

    G.E. Krokhicheva

    2018-06-01

    Full Text Available This paper discusses the organizational and methodological foundations and methods used to forecast, analyze and scale down threats and risks in the sphere of economic security, to solve the adaptation problems, and to implement and evaluate the potency of protective measures. The object of the conducted research is associated with various economic activities of commercial enterprises operating in the Rostov region. The suggested model of the formation and functioning of adaptive engineering tools for managing economic security, in the form of a derivative balance of the enterprise resources and the sources of their formation, will allow the proprietors, executive board and managerial staff to obtain necessary information within the requested context regarding the enterprise's vital economic interests. In addition, the paper pays attention to the methodological aspects of the accounting description and estimation of the iterative achievements toward the desired adaptation results, implemented within the framework of the described iterative algorithm aimed at ensuring strategic prediction.

  5. Economic and Financial Analysis Tools | Energy Analysis | NREL

    Science.gov (United States)

    Economic and Financial Analysis Tools. Use these economic and financial analysis tools, such as the Job and Economic Development Impact (JEDI) Model: easy-to-use, spreadsheet-based tools to analyze the economic impacts of constructing and operating power generation and biofuel plants at the

  6. Innovative algorithms and tools summary and outlook

    International Nuclear Information System (INIS)

    Fine, V.E.; Naumann, N.A.

    2003-01-01

    This contribution provides a summary of 'Session IV: Innovative Algorithms and Tools'. The 'tools' portion of Session IV comprised three oral sessions with 12 talks by seven speakers, and three posters. The algorithmic part was covered by 15 talks and three poster presentations. We try to summarize the main development directions and to state our personal views and interpretation of them, as well as an outlook

  7. Genetic Algorithm Based Economic Dispatch with Valve Point Effect

    Energy Technology Data Exchange (ETDEWEB)

    Park, Jong Nam; Park, Kyung Won; Kim, Ji Hong; Kim, Jin O [Hanyang University (Korea, Republic of)

    1999-03-01

    This paper presents a new genetic algorithm approach to the economic dispatch problem with valve point discontinuities. The proposed approach improves the ability of genetic algorithms to solve the economic dispatch problem with valve point discontinuities through an improved death penalty method, generation-apart elitism, atavism and sexual selection with sexual distinction. Numerical results on a test system consisting of 13 thermal units show that the proposed approach is faster, more robust and more powerful than conventional genetic algorithms. (author). 8 refs., 10 figs.
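    The valve-point effect referred to above is commonly modeled by adding a rectified sinusoidal term to the quadratic fuel cost; a generic form (an illustrative assumption, not the paper's test-system data) is:

    ```latex
    F_i(P_i) = a_i + b_i P_i + c_i P_i^{2} + \bigl|\, e_i \sin\!\bigl(f_i\,(P_i^{\min} - P_i)\bigr)\bigr|,
    ```

    where a_i, b_i, c_i, e_i and f_i are the cost coefficients of unit i; the absolute-value sine term makes the cost non-smooth and multimodal, which is what motivates heuristic methods such as genetic algorithms.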

  8. DIDACTIC TOOLS FOR THE STUDENTS’ ALGORITHMIC THINKING DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    T. P. Pushkaryeva

    2017-01-01

    Full Text Available Introduction. Modern engineers must possess a high potential of cognitive abilities, in particular algorithmic thinking (AT). In this regard, the training of future experts (university graduates of technical specialities) has to provide knowledge of the principles and ways of designing various algorithms, the ability to analyze them, and to choose the most suitable variants for engineering activity. For full formation of AT skills it is necessary to consider all channels of psychological perception and cognitive processing of educational information: visual, auditory, and kinesthetic. The aim of the present research is the theoretical basis for the design, development and use of resources for the successful development of AT during training in programming. Methodology and research methods. The methodology of the research involves the basic theses of cognitive psychology and the information approach to organizing the educational process. The research used the following methods: analysis; modeling of cognitive processes; design of training tools that take into account the mentality and peculiarities of information perception; and diagnostics of the efficiency of the didactic tools. Results. A three-level model for training future engineers in programming, aimed at the development of AT skills, was developed. The model includes three components: aesthetic, simulative, and conceptual. Stages of mastering a new discipline are allocated. It is shown that, for the development of AT skills in programming training, it is necessary to use kinesthetic tools at the stage of mental algorithmic map formation, and algorithmic animation and algorithmic mental maps at the stage of algorithmic model and conceptual image formation. Kinesthetic tools for the development of students’ AT skills in algorithmization and programming are designed. The use of kinesthetic training simulators in the educational process provides effective development of the algorithmic style of

  9. Economic modeling using evolutionary algorithms : the effect of binary encoding of strategies

    NARCIS (Netherlands)

    Waltman, L.R.; Eck, van N.J.; Dekker, Rommert; Kaymak, U.

    2011-01-01

    We are concerned with evolutionary algorithms that are employed for economic modeling purposes. We focus in particular on evolutionary algorithms that use a binary encoding of strategies. These algorithms, commonly referred to as genetic algorithms, are popular in agent-based computational economics

  10. Commodities Trading: An Essential Economic Tool.

    Science.gov (United States)

    Welch, Mary A., Ed.

    1989-01-01

    This issue focuses on commodities trading as an essential economic tool. Activities include critical thinking about marketing decisions and discussion on how futures markets and options are used as important economic tools. Discussion questions and a special student project are included. (EH)

  11. JPSS CGS Tools For Rapid Algorithm Updates

    Science.gov (United States)

    Smith, D. C.; Grant, K. D.

    2011-12-01

    Northrop Grumman has developed tools and processes to enable changes to be evaluated, tested, and moved into the operational baseline in a rapid and efficient manner. This presentation will provide an overview of the tools available to the Cal/Val teams to ensure rapid and accurate assessment of algorithm changes, along with the processes in place to ensure baseline integrity. [1] K. Grant and G. Route, "JPSS CGS Tools for Rapid Algorithm Updates," NOAA 2011 Satellite Direct Readout Conference, Miami FL, Poster, Apr 2011. [2] K. Grant, G. Route and B. Reed, "JPSS CGS Tools for Rapid Algorithm Updates," AMS 91st Annual Meeting, Seattle WA, Poster, Jan 2011. [3] K. Grant, G. Route and B. Reed, "JPSS CGS Tools for Rapid Algorithm Updates," AGU 2010 Fall Meeting, San Francisco CA, Oral Presentation, Dec 2010.

  12. Heat Transfer Search Algorithm for Non-convex Economic Dispatch Problems

    Science.gov (United States)

    Hazra, Abhik; Das, Saborni; Basu, Mousumi

    2018-03-01

    This paper presents the Heat Transfer Search (HTS) algorithm for the non-linear economic dispatch problem. The HTS algorithm is based on the laws of thermodynamics and heat transfer. The proficiency of the suggested technique is demonstrated on three dissimilar, complicated economic dispatch problems with valve point effect; prohibited operating zones; and multiple fuels with valve point effect. Test results obtained with the suggested technique for the economic dispatch problem have been compared with those obtained from other reported evolutionary techniques. It has been observed that the suggested HTS yields superior solutions.

  13. Benchmarking of nuclear economics tools

    International Nuclear Information System (INIS)

    Moore, Megan; Korinny, Andriy; Shropshire, David; Sadhankar, Ramesh

    2017-01-01

    Highlights: • INPRO and GIF economic tools exhibited good alignment in total capital cost estimation. • Subtle discrepancies in the cost result from differences in financing and the fuel cycle assumptions. • A common set of assumptions was found to reduce the discrepancies to 1% or less. • Opportunities for harmonisation of economic tools exists. - Abstract: Benchmarking of the economics methodologies developed by the Generation IV International Forum (GIF) and the International Atomic Energy Agency’s International Project on Innovative Nuclear Reactors and Fuel Cycles (INPRO), was performed for three Generation IV nuclear energy systems. The Economic Modeling Working Group of GIF developed an Excel based spreadsheet package, G4ECONS (Generation 4 Excel-based Calculation Of Nuclear Systems), to calculate the total capital investment cost (TCIC) and the levelised unit energy cost (LUEC). G4ECONS is sufficiently generic in the sense that it can accept the types of projected input, performance and cost data that are expected to become available for Generation IV systems through various development phases and that it can model both open and closed fuel cycles. The Nuclear Energy System Assessment (NESA) Economic Support Tool (NEST) was developed to enable an economic analysis using the INPRO methodology to easily calculate outputs including the TCIC, LUEC and other financial figures of merit including internal rate of return, return of investment and net present value. NEST is also Excel based and can be used to evaluate nuclear reactor systems using the open fuel cycle, MOX (mixed oxide) fuel recycling and closed cycles. A Super Critical Water-cooled Reactor system with an open fuel cycle and two Fast Reactor systems, one with a break-even fuel cycle and another with a burner fuel cycle, were selected for the benchmarking exercise. Published data on capital and operating costs were used for economics analyses using G4ECONS and NEST tools. Both G4ECONS and
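    For orientation, a levelised unit energy cost of the kind both tools report can be written in the standard discounted-cash-flow form (a generic expression, assumed here rather than quoted from the G4ECONS or NEST documentation):

    ```latex
    \mathrm{LUEC} = \frac{\sum_{t}\bigl(I_t + O_t + F_t\bigr)\,(1+r)^{-t}}{\sum_{t} E_t\,(1+r)^{-t}},
    ```

    where I_t, O_t and F_t are capital, operation and fuel-cycle expenditures in year t, E_t is the electricity generated in year t, and r is the discount rate.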

  14. NPOESS Tools for Rapid Algorithm Updates

    Science.gov (United States)

    Route, G.; Grant, K. D.; Hughes, B.; Reed, B.

    2009-12-01

    The National Oceanic and Atmospheric Administration (NOAA), Department of Defense (DoD), and National Aeronautics and Space Administration (NASA) are jointly acquiring the next-generation weather and environmental satellite system; the National Polar-orbiting Operational Environmental Satellite System (NPOESS). NPOESS replaces the current Polar-orbiting Operational Environmental Satellites (POES) managed by NOAA and the Defense Meteorological Satellite Program (DMSP) managed by the DoD. The NPOESS satellites carry a suite of sensors that collect meteorological, oceanographic, climatological, and solar-geophysical observations of the earth, atmosphere, and space. The ground data processing segment for NPOESS is the Interface Data Processing Segment (IDPS), developed by Raytheon Intelligence and Information Systems. The IDPS processes both NPP and NPOESS satellite data to provide environmental data products to NOAA and DoD processing centers operated by the United States government. Northrop Grumman Aerospace Systems Algorithms and Data Products (A&DP) organization is responsible for the algorithms that produce the EDRs, including their quality aspects. As the Calibration and Validation activities move forward following both the NPP launch and subsequent NPOESS launches, rapid algorithm updates may be required. Raytheon and Northrop Grumman have developed tools and processes to enable changes to be evaluated, tested, and moved into the operational baseline in a rapid and efficient manner. This presentation will provide an overview of the tools available to the Cal/Val teams to ensure rapid and accurate assessment of algorithm changes, along with the processes in place to ensure baseline integrity.

  15. Introduction to genetic algorithms as a modeling tool

    International Nuclear Information System (INIS)

    Wildberger, A.M.; Hickok, K.A.

    1990-01-01

    Genetic algorithms are search and classification techniques modeled on natural adaptive systems. This is an introduction to their use as a modeling tool with emphasis on prospects for their application in the power industry. It is intended to provide enough background information for its audience to begin to follow technical developments in genetic algorithms and to recognize those which might impact on electric power engineering. Beginning with a discussion of genetic algorithms and their origin as a model of biological adaptation, their advantages and disadvantages are described in comparison with other modeling tools such as simulation and neural networks in order to provide guidance in selecting appropriate applications. In particular, their use is described for improving expert systems from actual data and they are suggested as an aid in building mathematical models. Using the Thermal Performance Advisor as an example, it is suggested how genetic algorithms might be used to make a conventional expert system and mathematical model of a power plant adapt automatically to changes in the plant's characteristics

  16. An FMS Dynamic Production Scheduling Algorithm Considering Cutting Tool Failure and Cutting Tool Life

    International Nuclear Information System (INIS)

    Setiawan, A; Wangsaputra, R; Halim, A H; Martawirya, Y Y

    2016-01-01

    This paper deals with Flexible Manufacturing System (FMS) production rescheduling due to unavailability of cutting tools, caused either by cutting tool failure or by reaching the tool life limit. The FMS consists of parallel identical machines integrated with an automatic material handling system and runs fully automatically. Each machine has the same cutting tool configuration, consisting of different geometrical cutting tool types in each tool magazine. A job usually takes two stages. Each stage has sequential operations allocated to machines considering the cutting tool life. In a real situation, a cutting tool can fail before the cutting tool life is reached. The objective of this paper is to develop a dynamic scheduling algorithm for when a cutting tool breaks during unmanned operation and rescheduling is needed. The algorithm consists of four steps. The first step generates the initial schedule, the second step determines the cutting tool failure time, the third step determines the system status at the cutting tool failure time, and the fourth step reschedules the unfinished jobs. The approaches used to solve the problem are complete-reactive scheduling and robust-proactive scheduling. The new schedules differ from the initial schedule in the starting and completion times of each operation. (paper)

  17. Simultaneous Scheduling of Jobs, AGVs and Tools Considering Tool Transfer Times in Multi Machine FMS By SOS Algorithm

    Science.gov (United States)

    Sivarami Reddy, N.; Ramamurthy, D. V., Dr.; Prahlada Rao, K., Dr.

    2017-08-01

    This article addresses simultaneous scheduling of machines, AGVs and tools, where machines are allowed to share tools, considering transfer times of jobs and tools between machines, to generate the best optimal sequences that minimize makespan in a multi-machine Flexible Manufacturing System (FMS). Performance of an FMS is expected to improve by effective utilization of its resources, through proper integration and synchronization of their scheduling. The Symbiotic Organisms Search (SOS) algorithm is a potent and proven alternative for solving optimization problems like scheduling. The proposed SOS algorithm is tested on 22 job sets with makespan as the objective for scheduling of machines and tools, where machines are allowed to share tools without considering transfer times of jobs and tools, and the results are compared with those of existing methods. The results show that SOS outperformed the existing methods. The same SOS algorithm is then used for simultaneous scheduling of machines, AGVs and tools, where machines are allowed to share tools, considering transfer times of jobs and tools, to determine the best optimal sequences that minimize makespan.

  18. Harmony search algorithm for solving combined heat and power economic dispatch problems

    Energy Technology Data Exchange (ETDEWEB)

    Khorram, Esmaile, E-mail: eskhor@aut.ac.i [Department of Applied Mathematics, Faculty of Mathematics and Computer Science, Amirkabir University of Technology, No. 424, Hafez Ave., 15914 Tehran (Iran, Islamic Republic of); Jaberipour, Majid, E-mail: Majid.Jaberipour@gmail.co [Department of Applied Mathematics, Faculty of Mathematics and Computer Science, Amirkabir University of Technology, No. 424, Hafez Ave., 15914 Tehran (Iran, Islamic Republic of)

    2011-02-15

    Economic dispatch (ED) is one of the key optimization problems in electric power system operation. The problem grows complex if one or more units produce both power and heat. Combined heat and power economic dispatch (CHPED) problem is a complicated problem that needs powerful methods to solve. This paper presents a harmony search (EDHS) algorithm to solve CHPED. Some standard examples are presented to demonstrate the effectiveness of this algorithm in obtaining the optimal solution. In all cases, the solutions obtained using EDHS algorithm are better than those obtained by other methods.

  19. A new honey bee mating optimization algorithm for non-smooth economic dispatch

    International Nuclear Information System (INIS)

    Niknam, Taher; Mojarrad, Hasan Doagou; Meymand, Hamed Zeinoddini; Firouzi, Bahman Bahmani

    2011-01-01

    The non-storage characteristics of electricity and the increasing fuel costs worldwide call for the need to operate the systems more economically. Economic dispatch (ED) is one of the most important optimization problems in power systems. ED has the objective of dividing the power demand among the online generators economically while satisfying various constraints. The importance of economic dispatch is to get maximum usable power using minimum resources. To solve the static ED problem, honey bee mating algorithm (HBMO) can be used. The basic disadvantage of the original HBMO algorithm is the fact that it may miss the optimum and provide a near optimum solution in a limited runtime period. In order to avoid this shortcoming, we propose a new method that improves the mating process of HBMO and also, combines the improved HBMO with a Chaotic Local Search (CLS) called Chaotic Improved Honey Bee Mating Optimization (CIHBMO). The proposed algorithm is used to solve ED problems taking into account the nonlinear generator characteristics such as prohibited operation zones, multi-fuel and valve-point loading effects. The CIHBMO algorithm is tested on three test systems and compared with other methods in the literature. Results have shown that the proposed method is efficient and fast for ED problems with non-smooth and non-continuous fuel cost functions. Moreover, the optimal power dispatch obtained by the algorithm is superior to previous reported results. -- Research highlights: →Economic dispatch. →Reducing electrical energy loss. →Saving electrical energy. →Optimal operation.
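    A minimal sketch of a chaotic local search step of the general kind combined with HBMO here, assuming a logistic-map sequence and a fixed search radius around the incumbent best solution (the names, radius schedule and acceptance rule are illustrative, not the paper's):

    ```python
    import numpy as np

    def chaotic_local_search(best, cost, lower, upper, radius=0.1, iters=50, z=0.7):
        """Perturb the incumbent dispatch vector with a logistic-map sequence
        and keep any improvement (greedy acceptance)."""
        best = np.asarray(best, dtype=float)
        lower, upper = np.asarray(lower, float), np.asarray(upper, float)
        best_cost = cost(best)
        span = radius * (upper - lower)
        for _ in range(iters):
            z = 4.0 * z * (1.0 - z)                  # logistic map, z stays in (0, 1)
            candidate = np.clip(best + span * (2.0 * z - 1.0), lower, upper)
            c = cost(candidate)
            if c < best_cost:
                best, best_cost = candidate, c
        return best, best_cost
    ```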

  20. Combined heat and power economic dispatch by harmony search algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Vasebi, A.; Bathaee, S.M.T. [Power System Research Laboratory, Department of Electrical and Electronic Engineering, K.N.Toosi University of Technology, 322-Mirdamad Avenue West, 19697 Tehran (Iran); Fesanghary, M. [Department of Mechanical Engineering, Amirkabir University of Technology, 424-Hafez Avenue, Tehran (Iran)

    2007-12-15

    The optimal utilization of multiple combined heat and power (CHP) systems is a complicated problem that needs powerful methods to solve. This paper presents a harmony search (HS) algorithm to solve the combined heat and power economic dispatch (CHPED) problem. The HS algorithm is a recently developed meta-heuristic algorithm, and has been very successful in a wide variety of optimization problems. The method is illustrated using a test case taken from the literature as well as a new one proposed by authors. Numerical results reveal that the proposed algorithm can find better solutions when compared to conventional methods and is an efficient search algorithm for CHPED problem. (author)

  1. A simple two stage optimization algorithm for constrained power economic dispatch

    International Nuclear Information System (INIS)

    Huang, G.; Song, K.

    1994-01-01

    A simple two stage optimization algorithm is proposed and investigated for fast computation of constrained power economic dispatch control problems. The method is a simple demonstration of the hierarchical aggregation-disaggregation (HAD) concept. The algorithm first solves an aggregated problem to obtain an initial solution. This aggregated problem turns out to be the classical economic dispatch formulation, and it can be solved in 1% of the overall computation time. In the second stage, a linear programming method finds the optimal solution which satisfies power balance constraints, generation and transmission inequality constraints and security constraints. Implementation of the algorithm for IEEE systems and EPRI Scenario systems shows that the two stage method obtains an average speedup ratio of 10.64 as compared to the classical LP-based method
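    A minimal sketch of the kind of first-stage (aggregated) classical economic dispatch described above, assuming quadratic costs b_i P + c_i P^2 and solving the equal-incremental-cost condition by bisection on the marginal price (illustrative only; the paper's aggregation details and the second-stage LP are not reproduced):

    ```python
    def classical_dispatch(b, c, pmin, pmax, demand, tol=1e-6):
        """Equal incremental cost dispatch for quadratic costs, ignoring losses.
        Assumes sum(pmin) <= demand <= sum(pmax)."""
        def outputs(lam):
            # Optimal output of each unit at marginal price lam, clipped to its limits.
            return [min(max((lam - bi) / (2.0 * ci), lo), hi)
                    for bi, ci, lo, hi in zip(b, c, pmin, pmax)]

        lo_lam = min(bi + 2.0 * ci * lo for bi, ci, lo in zip(b, c, pmin))
        hi_lam = max(bi + 2.0 * ci * hi for bi, ci, hi in zip(b, c, pmax))
        while hi_lam - lo_lam > tol:
            lam = 0.5 * (lo_lam + hi_lam)
            if sum(outputs(lam)) < demand:
                lo_lam = lam          # total output too low: raise the marginal price
            else:
                hi_lam = lam
        return outputs(0.5 * (lo_lam + hi_lam))

    # Hypothetical 3-unit example, 850 MW demand
    print(classical_dispatch(b=[5.3, 5.5, 5.8], c=[0.004, 0.006, 0.009],
                             pmin=[150, 100, 50], pmax=[600, 400, 200], demand=850))
    ```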

  2. NPP/NPOESS Tools for Rapid Algorithm Updates

    Science.gov (United States)

    Route, G.; Grant, K. D.; Hughes, R.

    2010-12-01

    The National Oceanic and Atmospheric Administration (NOAA), Department of Defense (DoD), and National Aeronautics and Space Administration (NASA) are jointly acquiring the next-generation weather and environmental satellite system; the National Polar-orbiting Operational Environmental Satellite System (NPOESS). NPOESS replaces the current Polar-orbiting Operational Environmental Satellites (POES) managed by NOAA and the Defense Meteorological Satellite Program (DMSP) managed by the DoD. The NPOESS Preparatory Project (NPP) and NPOESS satellites will carry a suite of sensors that collect meteorological, oceanographic, climatological, and solar-geophysical observations of the earth, atmosphere, and space. The ground data processing segment for NPOESS is the Interface Data Processing Segment (IDPS), developed by Raytheon Intelligence and Information Systems. The IDPS processes both NPP and NPOESS satellite data to provide environmental data products to NOAA and DoD processing centers operated by the United States government. The Northrop Grumman Aerospace Systems (NGAS) Algorithms and Data Products (A&DP) organization is responsible for the algorithms that produce the Environmental Data Records (EDRs), including their quality aspects. As the Calibration and Validation (Cal/Val) activities move forward following both the NPP launch and subsequent NPOESS launches, rapid algorithm updates may be required. Raytheon and Northrop Grumman have developed tools and processes to enable changes to be evaluated, tested, and moved into the operational baseline in a rapid and efficient manner. This presentation will provide an overview of the tools available to the Cal/Val teams to ensure rapid and accurate assessment of algorithm changes, along with the processes in place to ensure baseline integrity.

  3. Innovative Software Algorithms and Tools parallel sessions summary

    International Nuclear Information System (INIS)

    Gaines, Irwin

    2001-01-01

    A variety of results were presented in the poster and 5 parallel sessions of the Innovative Software, Algorithms and Tools (ISAT) sessions. I will briefly summarize these presentations and attempt to identify some unifying trends

  4. A new accurate curvature matching and optimal tool based five-axis machining algorithm

    International Nuclear Information System (INIS)

    Lin, Than; Lee, Jae Woo; Bohez, Erik L. J.

    2009-01-01

    Free-form surfaces are widely used in CAD systems to describe the part surface. Today, the most advanced machining of free-form surfaces is done in five-axis machining using a flat end mill cutter. However, five-axis machining requires complex algorithms for gouging avoidance, collision detection and powerful computer-aided manufacturing (CAM) systems to support various operations. An accurate and efficient method is proposed for five-axis CNC machining of free-form surfaces. The proposed algorithm selects the best tool and plans the tool path autonomously using curvature matching and integrated inverse kinematics of the machine tool. The new algorithm uses the real cutter contact tool path generated by the inverse kinematics and not the linearized piecewise real cutter location tool path

  5. Optimal Economic Operation of Islanded Microgrid by Using a Modified PSO Algorithm

    Directory of Open Access Journals (Sweden)

    Yiwei Ma

    2015-01-01

    Full Text Available An optimal economic operation method is presented to attain a joint-optimization of cost reduction and operation strategy for islanded microgrid, which includes renewable energy source, the diesel generator, and battery storage system. The optimization objective is to minimize the overall generating cost involving depreciation cost, operation cost, emission cost, and economic subsidy available for renewable energy source, while satisfying various equality and inequality constraints. A novel dynamic optimization process is proposed based on two different operation control modes where diesel generator or battery storage acts as the master unit to maintain the system frequency and voltage stability, and a modified particle swarm optimization algorithm is applied to get faster solution to the practical economic operation problem of islanded microgrid. With the example system of an actual islanded microgrid in Dongao Island, China, the proposed models, dynamic optimization strategy, and solution algorithm are verified and the influences of different operation strategies and optimization algorithms on the economic operation are discussed. The results achieved demonstrate the effectiveness and feasibility of the proposed method.

  6. ESHOPPS: A COMPUTATIONAL TOOL TO AID THE TEACHING OF SHORTEST PATH ALGORITHMS

    Directory of Open Access Journals (Sweden)

    S. J. de A. LIMA

    2015-07-01

    Full Text Available The development of a computational tool called EShoPPS – Environment for Shortest Path Problem Solving, which is used to assist students in understanding the working of Dijkstra, Greedy search and A*(star algorithms is presented in this paper. Such algorithms are commonly taught in graduate and undergraduate courses of Engineering and Informatics and are used for solving many optimization problems that can be characterized as Shortest Path Problem. The EShoPPS is an interactive tool that allows students to create a graph representing the problem and also helps in developing their knowledge of each specific algorithm. Experiments performed with 155 students of undergraduate and graduate courses such as Industrial Engineering, Computer Science and Information Systems have shown that by using the EShoPPS tool students were able to improve their interpretation of investigated algorithms.
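    For context, a compact version of the textbook Dijkstra algorithm that such a teaching tool visualises (a generic sketch, not EShoPPS code):

    ```python
    import heapq

    def dijkstra(graph, source):
        """graph: dict node -> list of (neighbor, weight), weights >= 0.
        Returns a dict of shortest-path distances from source."""
        dist = {source: 0}
        heap = [(0, source)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float("inf")):
                continue                      # stale queue entry
            for v, w in graph.get(u, []):
                nd = d + w
                if nd < dist.get(v, float("inf")):
                    dist[v] = nd
                    heapq.heappush(heap, (nd, v))
        return dist

    # Hypothetical example graph
    g = {"A": [("B", 2), ("C", 5)], "B": [("C", 1), ("D", 4)], "C": [("D", 1)], "D": []}
    print(dijkstra(g, "A"))   # {'A': 0, 'B': 2, 'C': 3, 'D': 4}
    ```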

  7. Clustering Algorithm As A Planning Support Tool For Rural Electrification Optimization

    Directory of Open Access Journals (Sweden)

    Ronaldo Pornillosa Parreno Jr

    2015-08-01

    Full Text Available Abstract In this study a clustering algorithm was developed to optimize electrification plans by screening and grouping potential customers to be supplied with electricity. The algorithm provides a different approach to the clustering problem which combines conceptual and distance-based clustering algorithms to analyze potential clusters, using a spanning tree with the shortest possible edge weight and creating final cluster trees based on a test of inconsistency for the edges. The clustering criteria consist of a commonly used distance measure with the addition of household information as the basis for the ability to pay (ATP) value. The combination of these two parameters resulted in more significant and realistic clusters, since the distance measure alone could not capture the effect of household characteristics in screening the most sensible groupings of households. In addition, the implications of varying geographical features were incorporated in the algorithm by using a routing index across the locations of the households. This new approach to connecting the households in an area was applied in an actual case study of one village or barangay that was not yet energized. The clustering algorithm generated cluster trees which can become the theoretical basis for power utilities to plan the initial network arrangement of electrification. Scenario analysis conducted on the two strategies of clustering the households provided different alternatives for the optimization of the cost of electrification. Furthermore, the benefits associated with the two strategies formulated from the two scenarios were evaluated using the benefit-cost ratio (BC) to determine which is more economically advantageous. The results of the study showed that the clustering algorithm proved to be effective in solving the electrification optimization problem and serves its purpose as a planning support tool which can facilitate electrification in rural areas and achieve cost-effectiveness.
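    A minimal sketch of the spanning-tree-with-inconsistent-edge idea described above, assuming the networkx library, Euclidean distances and a simple global threshold for "inconsistency" (the paper's actual criterion also uses ability-to-pay and routing information, which is not reproduced here):

    ```python
    import networkx as nx

    def mst_clusters(locations, threshold_factor=2.0):
        """locations: dict household -> (x, y). Builds a complete graph with
        Euclidean edge weights, takes its minimum spanning tree, removes edges much
        longer than the average MST edge, and returns the resulting clusters."""
        g = nx.Graph()
        nodes = list(locations)
        for i, a in enumerate(nodes):
            for b in nodes[i + 1:]:
                (xa, ya), (xb, yb) = locations[a], locations[b]
                g.add_edge(a, b, weight=((xa - xb) ** 2 + (ya - yb) ** 2) ** 0.5)
        mst = nx.minimum_spanning_tree(g)
        mean_w = sum(d["weight"] for _, _, d in mst.edges(data=True)) / mst.number_of_edges()
        long_edges = [(u, v) for u, v, d in mst.edges(data=True)
                      if d["weight"] > threshold_factor * mean_w]
        mst.remove_edges_from(long_edges)        # cut the "inconsistent" edges
        return [set(c) for c in nx.connected_components(mst)]
    ```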

  8. Sounds unheard of evolutionary algorithms as creative tools for the contemporary composer

    DEFF Research Database (Denmark)

    Dahlstedt, Palle

    2004-01-01

    Evolutionary algorithms are studied as tools for generating novel musical material in the form of musical scores and synthesized sounds. The choice of genetic representation defines a space of potential music. This space is explored using evolutionary algorithms, in search of useful musical material... composed with the tools described in the thesis are presented.

  9. A Parallel Adaptive Particle Swarm Optimization Algorithm for Economic/Environmental Power Dispatch

    Directory of Open Access Journals (Sweden)

    Jinchao Li

    2012-01-01

    Full Text Available A parallel adaptive particle swarm optimization algorithm (PAPSO) is proposed for economic/environmental power dispatch, which can overcome premature convergence, slow convergence in the late evolutionary phase, and the lack of good direction in the particles’ evolutionary process. The search population is randomly divided into several subpopulations. Then, for each subpopulation, the optimal solution is searched for synchronously using the proposed method, and thus parallel computing is realized. To avoid converging to a local optimum, a crossover operator is introduced to exchange information among the subpopulations while the diversity of the population is sustained. Simulation results show that the proposed algorithm can effectively solve the economic/environmental operation problem of hydropower generating units. Performance comparisons show that the solution from the proposed method is better than those from the conventional particle swarm algorithm and other optimization algorithms.
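    For reference, the canonical particle swarm updates that such variants build on (the adaptive parameters and subpopulation crossover described above are specific to the paper and not shown):

    ```latex
    v_i^{t+1} = w\,v_i^{t} + c_1 r_1 \bigl(p_i^{\mathrm{best}} - x_i^{t}\bigr) + c_2 r_2 \bigl(g^{\mathrm{best}} - x_i^{t}\bigr),
    \qquad x_i^{t+1} = x_i^{t} + v_i^{t+1},
    ```

    with inertia weight w, acceleration coefficients c_1, c_2 and uniform random numbers r_1, r_2 in [0,1].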

  10. Applied economic model development algorithm for electronics company

    Directory of Open Access Journals (Sweden)

    Mikhailov I.

    2017-01-01

    Full Text Available The purpose of this paper is to report on experience gained in creating practical methods and algorithms that simplify the development of applied decision support systems. It reports on an algorithm which is the result of two years of research and has more than a year of practical verification. In the testing of electronic components, the moment the contract is concluded is the point at which the greatest managerial mistakes can be made: at this stage it is difficult to achieve a realistic assessment of the time limit and of the wage fund for future work. The creation of an estimating model is a possible way to solve this problem. The article presents an algorithm for the creation of such models. The algorithm is based on the example of developing an analytical model that serves to estimate the amount of work. The paper lists the algorithm’s stages and explains their meaning together with the participants’ goals. The implementation of the algorithm has made possible a twofold acceleration of the development of these models and fulfilment of management’s requirements. The resulting models have had a significant economic effect. A new set of tasks was identified for further theoretical study.

  11. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were carefully reviewed and selected from a total of 162 submissions.

  12. An improved harmony search algorithm for power economic load dispatch

    Energy Technology Data Exchange (ETDEWEB)

    Santos Coelho, Leandro dos [Pontifical Catholic University of Parana, PUCPR, Industrial and Systems Engineering Graduate Program, PPGEPS, Imaculada Conceicao, 1155, 80215-901 Curitiba, PR (Brazil)], E-mail: leandro.coelho@pucpr.br; Mariani, Viviana Cocco [Pontifical Catholic University of Parana, PUCPR, Department of Mechanical Engineering, PPGEM, Imaculada Conceicao, 1155, 80215-901 Curitiba, PR (Brazil)], E-mail: viviana.mariani@pucpr.br

    2009-10-15

    A meta-heuristic algorithm called harmony search (HS), mimicking the improvisation process of music players, has been recently developed. The HS algorithm has been successful in several optimization problems. The HS algorithm does not require derivative information and uses stochastic random search instead of a gradient search. In addition, the HS algorithm is simple in concept, few in parameters, and easy in implementation. This paper presents an improved harmony search (IHS) algorithm based on exponential distribution for solving economic dispatch problems. A 13-unit test system with incremental fuel cost function taking into account the valve-point loading effects is used to illustrate the effectiveness of the proposed IHS method. Numerical results show that the IHS method has good convergence property. Furthermore, the generation costs of the IHS method are lower than those of the classical HS and other optimization algorithms reported in recent literature.
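    A minimal sketch of a basic harmony-search improvisation step, assuming standard HMCR/PAR/bandwidth parameters (the IHS variant in the paper draws its adjustments from an exponential distribution, which this generic sketch does not reproduce):

    ```python
    import random

    def improvise(memory, lower, upper, hmcr=0.9, par=0.3, bw=0.01):
        """Create one new harmony (candidate solution) from the harmony memory.
        `memory` is a list of solution vectors; bounds are per-variable lists."""
        new = []
        for j in range(len(lower)):
            if random.random() < hmcr:                 # memory consideration
                value = random.choice(memory)[j]
                if random.random() < par:              # pitch adjustment
                    value += bw * (upper[j] - lower[j]) * random.uniform(-1, 1)
            else:                                      # random selection
                value = random.uniform(lower[j], upper[j])
            new.append(min(max(value, lower[j]), upper[j]))
        return new
    ```

    In a full HS run the new harmony replaces the worst member of the memory whenever it has a lower generation cost, and the loop repeats until a stopping criterion is met.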

  13. An improved harmony search algorithm for power economic load dispatch

    Energy Technology Data Exchange (ETDEWEB)

    Coelho, Leandro dos Santos [Pontifical Catholic Univ. of Parana, PUCPR, Industrial and Systems Engineering Graduate Program, PPGEPS, Imaculada Conceicao, 1155, 80215-901 Curitiba, PR (Brazil); Mariani, Viviana Cocco [Pontifical Catholic Univ. of Parana, PUCPR, Dept. of Mechanical Engineering, PPGEM, Imaculada Conceicao, 1155, 80215-901 Curitiba, PR (Brazil)

    2009-10-15

    A meta-heuristic algorithm called harmony search (HS), mimicking the improvisation process of music players, has been recently developed. The HS algorithm has been successful in several optimization problems. The HS algorithm does not require derivative information and uses stochastic random search instead of a gradient search. In addition, the HS algorithm is simple in concept, few in parameters, and easy in implementation. This paper presents an improved harmony search (IHS) algorithm based on exponential distribution for solving economic dispatch problems. A 13-unit test system with incremental fuel cost function taking into account the valve-point loading effects is used to illustrate the effectiveness of the proposed IHS method. Numerical results show that the IHS method has good convergence property. Furthermore, the generation costs of the IHS method are lower than those of the classical HS and other optimization algorithms reported in recent literature. (author)

  14. An improved harmony search algorithm for power economic load dispatch

    International Nuclear Information System (INIS)

    Santos Coelho, Leandro dos; Mariani, Viviana Cocco

    2009-01-01

    A meta-heuristic algorithm called harmony search (HS), mimicking the improvisation process of music players, has been recently developed. The HS algorithm has been successful in several optimization problems. The HS algorithm does not require derivative information and uses stochastic random search instead of a gradient search. In addition, the HS algorithm is simple in concept, few in parameters, and easy in implementation. This paper presents an improved harmony search (IHS) algorithm based on exponential distribution for solving economic dispatch problems. A 13-unit test system with incremental fuel cost function taking into account the valve-point loading effects is used to illustrate the effectiveness of the proposed IHS method. Numerical results show that the IHS method has good convergence property. Furthermore, the generation costs of the IHS method are lower than those of the classical HS and other optimization algorithms reported in recent literature.

  15. ECONOMIC MECHANISMS OF POPULATION PROTECTION AGAINST PENSION RISKS AS A TOOL FOR SOCIO-ECONOMIC DEVELOPMENT OF THE REGION

    Directory of Open Access Journals (Sweden)

    Lyubov V. Grigoryeva

    2017-06-01

    Full Text Available The problem of finding effective economic instruments for the socio-economic development of the regions has recently become increasingly relevant. Strengthening regional differentiation, the divergence of leading and lagging regions, and the lack of the regions' own resources have forced regional authorities to use new instruments of territorial development, in particular economic mechanisms of protection against pension risks. The use of these mechanisms has a dual effect (economic and social), due to the attraction to the regions of additional financial resources in the form of “long money” and the increased protection of citizens against pension risks (an increase in the level of pension payments). The analysis of the current use of economic mechanisms of protection against pension risks in the regions of the Southern Federal District helped to articulate key issues of their use, in particular: low pension literacy of the population; distrust of specialized financial institutions; the investment policy of the regions does not take into account the possibility of attracting private pension funds into regional projects; and there is no mechanism to support regional national pension funds. Territorial analysis revealed the potential application of economic mechanisms to protect against pension risks in the regions of the Southern Federal District as a tool for socio-economic development, which is based on the existence of regional pension funds and insurance companies (providing services for pension insurance), as well as participation in private pension provision. The Krasnodar, Rostov and Volgograd regions have the highest potential among the regions of the Southern Federal District, as regional national pension funds already exist there and participation in private pension insurance is confirmed by the statistics of the contributions paid. The study of existing economic mechanisms to protect against pension risks will provide the

  16. Applicability of genetic algorithms to parameter estimation of economic models

    Directory of Open Access Journals (Sweden)

    Marcel Ševela

    2004-01-01

    Full Text Available The paper concentrates on the capability of genetic algorithms for parameter estimation of non-linear economic models. In the paper we test the ability of genetic algorithms to estimate the parameters of a demand function for durable goods and simultaneously search for the parameters of the genetic algorithm that lead to maximum effectiveness of the computation. Genetic algorithms connect deterministic iterative computation methods with stochastic methods. In the genetic algorithm approach each possible solution is represented by one individual, and the lives of all generations of individuals run under a few parameters of the genetic algorithm. Our simulations resulted in an optimal mutation rate of 15% of all bits in the chromosomes and an optimal elitism rate of 20%. We could not set an optimal generation size, because it shows a positive correlation with the effectiveness of the genetic algorithm over the whole range under research, although its impact is decreasing. The genetic algorithm used was most sensitive to the mutation rate, and then to the generation size. The sensitivity to the elitism rate is not as strong.

  17. Annealed Demon Algorithms Solving the Environmental / Economic Dispatch Problem

    Directory of Open Access Journals (Sweden)

    Aristidis VLACHOS

    2013-06-01

    Full Text Available This paper presents an efficient and reliable Annealed Demon (AD) algorithm for the Environmental/Economic Dispatch (EED) problem. The EED problem is a multi-objective non-linear optimization problem with constraints. This problem is one of the fundamental issues in power system operation. The generation system comprises thermal generators whose emissions involve sulphur oxides (SO2) and nitrogen oxides (NOx). The aim is to minimize the total fuel cost of the system and control emissions. The proposed AD algorithm is applied to the EED of a simple power system.

  18. OPTIMIZING USABILITY OF AN ECONOMIC DECISION SUPPORT TOOL: PROTOTYPE OF THE EQUIPT TOOL.

    Science.gov (United States)

    Cheung, Kei Long; Hiligsmann, Mickaël; Präger, Maximilian; Jones, Teresa; Józwiak-Hagymásy, Judit; Muñoz, Celia; Lester-George, Adam; Pokhrel, Subhash; López-Nicolás, Ángel; Trapero-Bertran, Marta; Evers, Silvia M A A; de Vries, Hein

    2018-01-01

    Economic decision-support tools can provide valuable information for tobacco control stakeholders, but their usability may impact the adoption of such tools. This study aims to illustrate a mixed-method usability evaluation of an economic decision-support tool for tobacco control, using the EQUIPT ROI tool prototype as a case study. A cross-sectional mixed methods design was used, including a heuristic evaluation, a thinking aloud approach, and a questionnaire testing and exploring the usability of the Return on Investment tool. A total of sixty-six users evaluated the tool (thinking aloud) and completed the questionnaire. For the heuristic evaluation, four experts evaluated the interface. In total, twenty-one percent of the respondents perceived good usability. A total of 118 usability problems were identified, of which twenty-six were categorized as most severe, indicating high priority to fix them before implementation. Combining user-based and expert-based evaluation methods is recommended, as these were shown to identify unique usability problems. The evaluation provides input to optimize the usability of a decision-support tool, and may serve as a vantage point for other developers conducting usability evaluations to refine similar tools before wide-scale implementation. Such studies could reduce implementation gaps by optimizing usability, enhancing in turn the research impact of such interventions.

  19. Economic environmental dispatch using BSA algorithm

    Science.gov (United States)

    Jihane, Kartite; Mohamed, Cherkaoui

    2018-05-01

    The economic environmental dispatch (EED) problem is an important issue, especially in the field of fossil fuel power plant systems. It allows the network manager to choose, among different units, the most optimized in terms of fuel costs and emission level. The objective of this paper is to minimize the fuel cost with emissions as a constraint; the test is conducted for two cases, a six-generator unit system and a ten-generator unit system, for the same power demand of 1200 MW. The simulation has been computed in MATLAB and the results show the robustness of the Backtracking Search optimization Algorithm (BSA) and the impact of the load demand on emissions.

  20. Multiobjective Economic Load Dispatch in 3-D Space by Genetic Algorithm

    Science.gov (United States)

    Jain, N. K.; Nangia, Uma; Singh, Iqbal

    2017-10-01

    This paper presents the application of genetic algorithm to Multiobjective Economic Load Dispatch (MELD) problem considering fuel cost, transmission losses and environmental pollution as objective functions. The MELD problem has been formulated using constraint method. The non-inferior set for IEEE 5, 14 and 30-bus system has been generated by using genetic algorithm and the target point has been obtained by using maximization of minimum relative attainments.

  1. Springback Simulation and Tool Surface Compensation Algorithm for Sheet Metal Forming

    International Nuclear Information System (INIS)

    Shen Guozhe; Hu Ping; Zhang Xiangkui; Chen Xiaobin; Li Xiaoda

    2005-01-01

    Springback is an unavoidable forming defect in the sheet metal forming process. How to calculate springback accurately is a big challenge for a lot of FEA software. Springback compensation makes the stamped final part conform to the designed part shape by modifying the tool surface, which depends on an accurate springback amount. However, the meshed data from numerical simulation are expressed as nodes and elements, and such data cannot be supplied directly as tool surface CAD data. In this paper, a tool surface compensation algorithm based on a numerical simulation technique for the springback process is proposed, in which the independently developed dynamic explicit springback algorithm (DESA) is used to simulate the springback amount. When doing the tool surface compensation, the springback amount of a projected point can be obtained by interpolation of the springback amounts of the projected element's nodes, so the modified values of the tool surface can be calculated in reverse. After repeating the springback and compensation calculations 1∼3 times, a reasonable tool surface mesh is obtained. Finally, the FEM data on the compensated tool surface are fitted into a surface by CAD modeling software. The examination of a real industrial part shows the validity of the present method
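    The iterative compensation described above is commonly expressed as a displacement-adjustment update (a generic formulation assumed here, not quoted from the paper):

    ```latex
    T^{(k+1)} = T^{(k)} - \alpha \bigl(S^{(k)} - D\bigr),
    ```

    where D is the designed part shape, T^{(k)} the current tool surface, S^{(k)} the simulated sprung-back shape obtained with that tool, and alpha a relaxation factor (often close to 1); repeating the update 1∼3 times, as above, drives S^{(k)} toward D.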

  2. ANN-Benchmarks: A Benchmarking Tool for Approximate Nearest Neighbor Algorithms

    DEFF Research Database (Denmark)

    Aumüller, Martin; Bernhardsson, Erik; Faithfull, Alexander

    2017-01-01

    This paper describes ANN-Benchmarks, a tool for evaluating the performance of in-memory approximate nearest neighbor algorithms. It provides a standard interface for measuring the performance and quality achieved by nearest neighbor algorithms on different standard data sets. It supports several... visualise these as images, plots, and websites with interactive plots. ANN-Benchmarks aims to provide a constantly updated overview of the current state of the art of k-NN algorithms. In the short term, this overview allows users to choose the correct k-NN algorithm and parameters... for their similarity search task; in the longer term, algorithm designers will be able to use this overview to test and refine automatic parameter tuning. The paper gives an overview of the system, evaluates the results of the benchmark, and points out directions for future work. Interestingly, very different...

  3. Model-based fault diagnosis techniques design schemes, algorithms, and tools

    CERN Document Server

    Ding, Steven

    2008-01-01

    The objective of this book is to introduce basic model-based FDI schemes, advanced analysis and design algorithms, and the needed mathematical and control theory tools at a level for graduate students and researchers as well as for engineers. This is a textbook with extensive examples and references. Most methods are given in the form of an algorithm that enables a direct implementation in a programme. Comparisons among different methods are included when possible.

  4. Tools and Algorithms for the Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2004, held in Barcelona, Spain in March/April 2004. The 37 revised full papers and 6 revised tool demonstration papers presented were...... carefully reviewed and selected from a total of 162 submissions. The papers are organized in topical sections on theorem proving, probabilistic model checking, testing, tools, explicit state and Petri nets, scheduling, constraint solving, timed systems, case studies, software, temporal logic, abstraction...

  5. Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds

    Science.gov (United States)

    Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni

    2012-09-01

    Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains for providing wireless communications services on demand. Each new user session request particularly requires the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources of SDR cloud data centers and the numerous session requests at certain hours of a day require efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of computing resource management tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools to evaluate different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupation and that a tradeoff exists between cluster size and algorithm complexity.

  6. A Decomposition Algorithm for Mean-Variance Economic Model Predictive Control of Stochastic Linear Systems

    DEFF Research Database (Denmark)

    Sokoler, Leo Emil; Dammann, Bernd; Madsen, Henrik

    2014-01-01

    This paper presents a decomposition algorithm for solving the optimal control problem (OCP) that arises in Mean-Variance Economic Model Predictive Control of stochastic linear systems. The algorithm applies the alternating direction method of multipliers to a reformulation of the OCP...
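    For reference, the scaled-form iteration of the alternating direction method of multipliers that such a decomposition applies (generic ADMM for min f(x) + g(z) subject to Ax + Bz = c; the paper's specific OCP reformulation is not shown):

    ```latex
    x^{k+1} = \arg\min_{x}\; f(x) + \tfrac{\rho}{2}\,\lVert Ax + Bz^{k} - c + u^{k}\rVert_2^2, \qquad
    z^{k+1} = \arg\min_{z}\; g(z) + \tfrac{\rho}{2}\,\lVert Ax^{k+1} + Bz - c + u^{k}\rVert_2^2, \qquad
    u^{k+1} = u^{k} + Ax^{k+1} + Bz^{k+1} - c.
    ```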

  7. Nonsmooth Optimization Algorithms, System Theory, and Software Tools

    Science.gov (United States)

    1993-04-13

    "Nonsmooth Optimization Algorithms, System Theory, and Software Tools" (AFOSR-90-0068). Author: Elijah Polak, Professor and Principal Investigator.

  8. Tools and Algorithms for Construction and Analysis of Systems

    DEFF Research Database (Denmark)

    This book constitutes the refereed proceedings of the 6th International Conference on Tools and Algorithms for the Construction and Analysis of Systems, TACAS 2000, held as part of ETAPS 2000 in Berlin, Germany, in March/April 2000. The 33 revised full papers presented together with one invited...... paper and two short tool descriptions were carefully reviewed and selected from a total of 107 submissions. The papers are organized in topical sections on software and formal methods, formal methods, timed and hybrid systems, infinite and parameterized systems, diagnostic and test generation, efficient...

  9. Entropic algorithms and the lid method as exploration tools for complex landscapes

    DEFF Research Database (Denmark)

    Barettin, Daniele; Sibani, Paolo

    2011-01-01

    to a single valley, are key to understanding the dynamical properties of such systems. In this paper we combine the lid algorithm, a tool for landscape exploration previously applied to a range of models, with the Wang-Swendsen algorithm. To test this improved exploration tool, we consider a paradigmatic complex...... system, the Edwards-Anderson model in two and three spatial dimensions. We find a striking difference between the energy dependence of the local density of states in the two cases: nearly flat in the first case, and nearly exponential in the second. The lid dependence of the data is analyzed to estimate...

  10. Hydro-Thermal-Wind Generation Scheduling Considering Economic and Environmental Factors Using Heuristic Algorithms

    Directory of Open Access Journals (Sweden)

    Suresh K. Damodaran

    2018-02-01

    Full Text Available Hydro-thermal-wind generation scheduling (HTWGS) with economic and environmental factors is a multi-objective complex nonlinear power system optimization problem with many equality and inequality constraints. The objective of the problem is to generate an hour-by-hour optimum schedule of hydro-thermal-wind power plants to attain the least emission of pollutants from thermal plants and a reduced generation cost of thermal and wind plants for a 24-h period, satisfying the system constraints. The paper presents a detailed framework of the HTWGS problem and proposes a modified particle swarm optimization (MPSO) algorithm for evolving a solution. The competency of selected heuristic algorithms, representing different heuristic groups, viz. the binary coded genetic algorithm (BCGA), particle swarm optimization (PSO), improved harmony search (IHS), and JAYA algorithm, for searching for an optimal solution to HTWGS considering economic and environmental factors was investigated in a trial system consisting of a multi-stream cascaded system with four reservoirs, three thermal plants, and two wind plants. Appropriate mathematical models were used for representing the water discharge, generation cost, and pollutant emission of respective power plants incorporated in the system. Statistical analysis was performed to check the consistency and reliability of the proposed algorithm. The simulation results indicated that the proposed MPSO algorithm provided a better solution to the problem of HTWGS, with a reduced generation cost and the least emission, when compared with the other heuristic algorithms considered.
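
    The record above describes a modified PSO but does not spell out its update rules. As a point of reference only, the standard PSO velocity and position update that such variants build on can be sketched as follows; the parameter values, bounds and function names are illustrative assumptions, not taken from the paper:

      import random

      def pso_step(positions, velocities, pbest, gbest, w=0.7, c1=1.5, c2=1.5,
                   lower=0.0, upper=1.0):
          """One generic PSO iteration: update the velocity and position of every particle."""
          for i, x in enumerate(positions):
              for d in range(len(x)):
                  r1, r2 = random.random(), random.random()
                  velocities[i][d] = (w * velocities[i][d]
                                      + c1 * r1 * (pbest[i][d] - x[d])
                                      + c2 * r2 * (gbest[d] - x[d]))
                  # clamp each decision variable (e.g. a unit's output power) to its bounds
                  x[d] = min(max(x[d] + velocities[i][d], lower), upper)
          return positions, velocities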

  11. Genetic Algorithms vs. Artificial Neural Networks in Economic Forecasting Process

    Directory of Open Access Journals (Sweden)

    Nicolae Morariu

    2008-01-01

    Full Text Available This paper aims to describe the implementation of a neural network and a genetic algorithm system in order to forecast certain economic indicators of a free market economy. In a free market economy, the forecasting process precedes economic planning (a management function), providing important information for the outcome of the latter process. Forecasting represents a starting point in setting targets for a firm, an organization or even a branch of the economy. Thus, the forecasting method used can significantly influence the evolution of an entity. In the following we describe the forecasting of an economic indicator using two intelligent systems. The differences between the results obtained by these two systems are described in chapter IV.

  12. Improved differential evolution algorithms for handling economic dispatch optimization with generator constraints

    International Nuclear Information System (INIS)

    Coelho, Leandro dos Santos; Mariani, Viviana Cocco

    2007-01-01

    Global optimization based on evolutionary algorithms can be used as an important component for many engineering optimization problems. Evolutionary algorithms have yielded promising results for solving nonlinear, non-differentiable and multi-modal optimization problems in the power systems area. Differential evolution (DE) is a simple and efficient evolutionary algorithm for function optimization over continuous spaces. It has reportedly outperformed search heuristics when tested over both benchmark and real-world problems. This paper proposes improved DE algorithms for solving economic load dispatch problems that take into account nonlinear generator features such as ramp rate limits and prohibited operating zones in power system operation. The DE algorithm and its variants are validated for two test systems consisting of 6 and 15 thermal units. The various DE approaches outperform other state-of-the-art algorithms reported in the literature in solving load dispatch problems with generator constraints
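
    For orientation, the classical DE/rand/1/bin generation step that improved DE variants start from can be sketched as below; the bounds, control parameters and the minimizing fitness callback are illustrative assumptions rather than the paper's settings:

      import random

      def de_rand_1_bin(pop, fitness, F=0.5, CR=0.9, bounds=(0.0, 1.0)):
          """One generation of classical DE/rand/1/bin over a population of real vectors."""
          dim = len(pop[0])
          new_pop = []
          for i, target in enumerate(pop):
              a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
              j_rand = random.randrange(dim)
              trial = []
              for j in range(dim):
                  if random.random() < CR or j == j_rand:
                      v = a[j] + F * (b[j] - c[j])                 # mutation + crossover
                  else:
                      v = target[j]                                # inherit from the target vector
                  trial.append(min(max(v, bounds[0]), bounds[1]))  # respect box constraints
              # greedy selection: keep the trial vector only if it is at least as good
              new_pop.append(trial if fitness(trial) <= fitness(target) else target)
          return new_pop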

  13. Differential Forms: A New Tool in Economics

    Science.gov (United States)

    Mimkes, Jürgen

    Econophysics is the transfer of methods from the natural to the socio-economic sciences. This concept has first been applied to finance [1], but it is now also used in various applications of economics and social sciences [2,3]. The present paper focuses on problems in macro economics and growth. 1. Neoclassical theory [4,5] neglects the “ex post” property of income and growth. Income Y(K, L) is assumed to be a function of capital and labor. But functions cannot model the “ex post” character of income. 2. Neoclassical theory is based on a Cobb-Douglas function [6] with variable elasticity α, which may be fitted to economic data. But an undefined elasticity α leads to a descriptive rather than a predictive economic theory. The present paper introduces a new tool - differential forms and path-dependent integrals - to macro economics. This is a solution to the problems above: 1. The integral of non-exact differential forms is path dependent and can only be calculated “ex post”, like income and economic growth. 2. Non-exact differential forms can be made exact by an integrating factor; this leads to a new, well defined, unique production function F and a predictive economic theory.
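
    A minimal sketch of the mathematical idea, written in generic notation (the coefficient functions a, b and the integrating factor \lambda are placeholders, not the paper's own symbols):

      \delta Y = a(K,L)\,dK + b(K,L)\,dL, \qquad \oint_{\Gamma} \delta Y \neq 0
      \quad \text{(path dependent, i.e. only computable ``ex post'')}

      dF = \frac{1}{\lambda(K,L)}\,\delta Y, \qquad \oint_{\Gamma} dF = 0
      \quad \text{(exact; defines a unique production function } F\text{)}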

  14. Symbiotic organisms search algorithm for dynamic economic dispatch with valve-point effects

    Science.gov (United States)

    Sonmez, Yusuf; Kahraman, H. Tolga; Dosoglu, M. Kenan; Guvenc, Ugur; Duman, Serhat

    2017-05-01

    In this study, the symbiotic organisms search (SOS) algorithm is proposed to solve the dynamic economic dispatch problem with valve-point effects, which is one of the most important problems of the modern power system. Practical constraints such as valve-point effects, ramp rate limits and prohibited operating zones have been taken into account. The proposed algorithm was tested on five different test cases for 5-unit, 10-unit and 13-unit systems. The obtained results have been compared with other well-known metaheuristic methods reported before. The results show that the proposed algorithm has good convergence and produces better results than the other methods.
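
    For reference, dispatch studies of this kind usually model the valve-point effect by adding a rectified sinusoidal term to the quadratic fuel cost of unit i; this generic form is shown below and is not necessarily the exact formulation used in the record above:

      F_i(P_i) = a_i P_i^2 + b_i P_i + c_i
                 + \left| e_i \sin\!\bigl( f_i \, ( P_i^{\min} - P_i ) \bigr) \right|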

  15. Thermo-economic multi-objective optimization of solar dish-Stirling engine by implementing evolutionary algorithm

    International Nuclear Information System (INIS)

    Ahmadi, Mohammad H.; Sayyaadi, Hoseyn; Mohammadi, Amir H.; Barranco-Jimenez, Marco A.

    2013-01-01

    Highlights: • Thermo-economic multi-objective optimization of a solar dish-Stirling engine is studied. • Application of the evolutionary algorithm is investigated. • Error analysis is done to quantify the error of the investigation. - Abstract: In recent years, remarkable attention has been drawn to the Stirling engine due to noticeable advantages; for instance, many resources such as biomass, fossil fuels and solar energy can be used as its heat source. A great number of studies have been conducted on the Stirling engine, and finite-time thermo-economics is one of the approaches used. In the present study, the dimensionless thermo-economic objective function, thermal efficiency and dimensionless power output are optimized for a dish-Stirling system using finite-time thermo-economic analysis and the NSGA-II algorithm. Optimized answers are chosen from the results using three decision-making methods. Error analysis is done to quantify the error of the investigation

  16. The chaotic global best artificial bee colony algorithm for the multi-area economic/emission dispatch

    International Nuclear Information System (INIS)

    Secui, Dinu Calin

    2015-01-01

    This paper suggests a chaotic optimizing method, based on the GBABC (global best artificial bee colony algorithm), where the random sequences used in updating the solutions of this algorithm are replaced with chaotic sequences generated by chaotic maps. The new algorithm, called CGBABC (chaotic global best artificial bee colony algorithm), is used to solve the multi-area economic/emission dispatch problem taking into consideration the valve-point effects, the transmission line losses, multi-fuel sources, prohibited operating zones, tie line capacity and power transfer cost between different areas of the system. The behaviour of the CGBABC algorithm is studied considering ten chaotic maps, both one-dimensional and bi-dimensional, with various probability density functions. The CGBABC algorithm's performance including a variety of chaotic maps is tested on five systems (6-unit, 10-unit, 16-unit, 40-unit and 120-unit) with different characteristics, constraints and sizes. The comparison of results highlights a hierarchy in the chaotic maps included in the CGBABC algorithm and shows that it performs better than the classical ABC algorithm, the GBABC algorithm and other optimization techniques. - Highlights: • A chaotic global best ABC algorithm (CGBABC) is presented. • CGBABC is applied for solving the multi-area economic/emission dispatch problem. • Valve-point effects, multi-fuel sources, POZ, transmission losses were considered. • The algorithm is tested on five systems having 6, 10, 16, 40 and 120 thermal units. • CGBABC algorithm outperforms several optimization techniques.
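
    As a sketch of the basic idea (using the logistic map only; the record above considers ten different one- and two-dimensional maps), a chaotic sequence that can replace uniform random numbers in such an update might be generated as follows:

      def logistic_sequence(n, x0=0.7, r=4.0):
          """Generate n chaotic values in (0, 1) with the logistic map x <- r*x*(1-x)."""
          seq, x = [], x0
          for _ in range(n):
              x = r * x * (1.0 - x)
              seq.append(x)
          return seq

      # These values can stand in for the uniform random numbers drawn when
      # perturbing candidate solutions in an ABC-type update rule.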

  17. Economic Tools for Managing Nitrogen in Coastal Watersheds ...

    Science.gov (United States)

    Watershed managers are interested in using economics to communicate the value of estuarine resources to the wider community, determine the most cost-effective means to reduce nitrogen pollution, and evaluate the benefits of taking action to improve coastal ecosystems. We spoke to coastal watershed managers who had commissioned economic studies and found that they were largely satisfied with the information and their ability to communicate the importance of coastal ecosystems. However, while managers were able to use these studies as communication tools, methods used in some studies were inconsistent with what some economists consider best practices. In addition, many watershed managers are grappling with how to implement nitrogen management activities in a way that is both cost-effective and achieves environmental goals, while maintaining public support. These and other issues led to this project. Our intent is to provide information to watershed managers and others interested in watershed management – such as National Estuary Programs, local governments, or nongovernmental organizations – on economic tools for managing nitrogen in coastal watersheds, and to economists and other analysts who are interested in assisting them in meeting their needs. Watershed management requires balancing scientific, political, and social issues to solve environmental problems. This document summarizes questions that watershed managers have about using economic analysis, and g

  18. A novel evolutionary algorithm for dynamic economic dispatch with energy saving and emission reduction in power system integrated wind power

    International Nuclear Information System (INIS)

    Liao, Gwo-Ching

    2011-01-01

    An optimization algorithm, the Chaotic Quantum Genetic Algorithm (CQGA), is proposed in this paper to solve the economic dispatch problem that includes a wind farm. In addition to detailed models of economic dispatch and their associated constraints, the wind power effect is also included in this paper. The chaotic quantum genetic algorithm used to solve the economic dispatch process is discussed with real scenarios used for the simulation tests. After comparing the proposed algorithm with several other algorithms commonly used to solve optimization problems, the results show that the proposed algorithm is able to find the optimal solution quickly and accurately (i.e. to obtain the minimum cost for power generation in the shortest time). At the end, the impact on total cost savings for power generation after adding (or not adding) wind power generation is also discussed. The actual implementation results prove that the proposed algorithm is economical, fast and practical. They are quite valuable for further research. -- Research highlights: → The Quantum Genetic Algorithm can effectively improve global search ability. → It can achieve true global optimal solutions. → The CPU computation time is less than that of the other algorithms adopted in this paper.

  19. A recursive economic dispatch algorithm for assessing the cost of thermal generator schedules

    International Nuclear Information System (INIS)

    Wong, K.P.; Doan, K.

    1992-01-01

    This paper develops an efficient, recursive algorithm for determining the economic power dispatch of thermal generators within the unit commitment environment. A method for incorporating the operating limits of all on-line generators and the limits due to ramping generators is developed in the paper. The developed algorithm is amenable to computer implementation using the artificial intelligence programming language Prolog. The performance of the developed algorithm is demonstrated through its application to evaluating the costs of dispatching 13 thermal generators within a generator schedule over a 24-hour scheduling horizon

  20. Economic Consequence Analysis of Disasters: The ECAT Software Tool

    Energy Technology Data Exchange (ETDEWEB)

    Rose, Adam; Prager, Fynn; Chen, Zhenhua; Chatterjee, Samrat; Wei, Dan; Heatwole, Nathaniel; Warren, Eric

    2017-04-15

    This study develops a methodology for rapidly obtaining approximate estimates of the economic consequences from numerous natural, man-made and technological threats. This software tool is intended for use by various decision makers and analysts to obtain estimates rapidly. It is programmed in Excel and Visual Basic for Applications (VBA) to facilitate its use. This tool is called E-CAT (Economic Consequence Analysis Tool) and accounts for the cumulative direct and indirect impacts (including resilience and behavioral factors that significantly affect base estimates) on the U.S. economy. E-CAT is intended to be a major step toward advancing the current state of economic consequence analysis (ECA) and also contributing to and developing interest in further research into complex but rapid turnaround approaches. The essence of the methodology involves running numerous simulations in a computable general equilibrium (CGE) model for each threat, yielding synthetic data for the estimation of a single regression equation based on the identification of key explanatory variables (threat characteristics and background conditions). This transforms the results of a complex model, which is beyond the reach of most users, into a "reduced form" model that is readily comprehensible. Functionality has been built into E-CAT so that its users can switch various consequence categories on and off in order to create customized profiles of economic consequences of numerous risk events. E-CAT incorporates uncertainty on both the input and output side in the course of the analysis.

  1. Continuous grasp algorithm applied to economic dispatch problem of thermal units

    Energy Technology Data Exchange (ETDEWEB)

    Vianna Neto, Julio Xavier [Pontifical Catholic University of Parana - PUCPR, Curitiba, PR (Brazil). Undergraduate Program at Mechatronics Engineering; Bernert, Diego Luis de Andrade; Coelho, Leandro dos Santos [Pontifical Catholic University of Parana - PUCPR, Curitiba, PR (Brazil). Industrial and Systems Engineering Graduate Program, LAS/PPGEPS], e-mail: leandro.coelho@pucpr.br

    2010-07-01

    The economic dispatch problem (EDP) is one of the fundamental issues in power systems for obtaining benefits in stability, reliability and security. Its objective is to allocate the power demand among committed generators in the most economical manner, while all physical and operational constraints are satisfied. The cost of power generation, particularly in fossil fuel plants, is very high, and economic dispatch helps save a significant amount of revenue. Recently, as an alternative to conventional mathematical approaches, modern heuristic optimization techniques such as simulated annealing, evolutionary algorithms, neural networks, ant colony optimization, and tabu search have been given much attention by many researchers due to their ability to find an almost global optimal solution in EDPs. On the other hand, continuous GRASP (C-GRASP) is a stochastic local search metaheuristic for finding cost-efficient solutions to continuous global optimization problems subject to box constraints. Like a greedy randomized adaptive search procedure (GRASP), a C-GRASP is a multi-start procedure where a starting solution for local improvement is constructed in a greedy randomized fashion. The C-GRASP algorithm is validated for a test system consisting of fifteen units that takes into account spinning reserve and prohibited operating zone constraints. (author)
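
    A heavily simplified multi-start skeleton of the idea is sketched below; the real C-GRASP construction phase uses a greedy randomized, coordinate-wise search that is not reproduced here, and all parameter values and function names are illustrative assumptions:

      import random

      def c_grasp(objective, dim, bounds, starts=20, steps=200, step_size=0.05):
          """Multi-start skeleton: build a randomized solution in the box, then improve locally."""
          lo, hi = bounds
          best_x, best_f = None, float("inf")
          for _ in range(starts):
              x = [random.uniform(lo, hi) for _ in range(dim)]   # randomized construction (simplified)
              for _ in range(steps):                             # local improvement phase
                  cand = [min(max(v + random.uniform(-step_size, step_size) * (hi - lo), lo), hi)
                          for v in x]
                  if objective(cand) < objective(x):
                      x = cand
              if objective(x) < best_f:
                  best_x, best_f = x, objective(x)
          return best_x, best_f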

  2. An Efficient Meta Heuristic Algorithm to Solve Economic Load Dispatch Problems

    Directory of Open Access Journals (Sweden)

    R Subramanian

    2013-12-01

    Full Text Available The Economic Load Dispatch (ELD) problem in power generation systems is to reduce the total fuel cost of generating electric power. This paper presents an efficient Modified Firefly Algorithm (MFA) for solving the ELD problem. The main objective of the problem is to minimize the total fuel cost of the generating units, which have quadratic cost functions, subject to limits on generator true power output and transmission losses. The MFA is a stochastic metaheuristic approach based on the idealized behaviour of the flashing characteristics of fireflies. This paper presents an application of MFA to ELD for a six-generator test case system. MFA is applied to the ELD problem and its solution quality and computational efficiency are compared to Genetic Algorithm (GA), Differential Evolution (DE), Particle Swarm Optimization (PSO), Artificial Bee Colony optimization (ABC), Biogeography-Based Optimization (BBO), Bacterial Foraging Optimization (BFO) and Firefly Algorithm (FA) techniques. The simulation results show that the proposed algorithm outperforms previous optimization methods.
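
    As background, the standard firefly move rule that modified variants such as MFA typically build on can be sketched as follows; the parameter values and names are illustrative and nothing below is taken from the record above:

      import math
      import random

      def firefly_move(xi, xj, beta0=1.0, gamma=1.0, alpha=0.2):
          """Move firefly i toward a brighter firefly j using the standard attractiveness rule."""
          r2 = sum((a - b) ** 2 for a, b in zip(xi, xj))   # squared distance between the two fireflies
          beta = beta0 * math.exp(-gamma * r2)             # attractiveness decays with distance
          return [a + beta * (b - a) + alpha * (random.random() - 0.5)
                  for a, b in zip(xi, xj)]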

  3. Economic Analysis on Value Chain of Taxi Fleet with Battery-Swapping Mode Using Multiobjective Genetic Algorithm

    OpenAIRE

    Ning, Guobao; Zhen, Zijian; Wang, Peng; Li, Yang; Yin, Huaixian

    2012-01-01

    This paper presents an economic analysis model on value chain of taxi fleet with battery-swapping mode in a pilot city. In the model, economic benefits of charging-swapping station group, taxi company, and taxi driver in the region have been taken into consideration. Thus, the model is a multiobjective function and multiobjective genetic algorithm is used to solve this problem. According to the real data collected from the pilot city, the multiobjective genetic algorithm is tested as an effec...

  4. BONDI-97 A novel neutron energy spectrum unfolding tool using a genetic algorithm

    CERN Document Server

    Mukherjee, B

    1999-01-01

    The neutron spectrum unfolding procedure using the count rate data obtained from a set of Bonner sphere neutron detectors requires the solution of the Fredholm integral equation of the first kind by using complex mathematical methods. This paper reports a new approach for the unfolding of neutron spectra using the Genetic Algorithm tool BONDI-97 (BOnner sphere Neutron DIfferentiation). The BONDI-97 was used as the input for Genetic Algorithm engine EVOLVER to search for a globally optimised solution vector from a population of randomly generated solutions. This solution vector corresponds to the unfolded neutron energy spectrum. The Genetic Algorithm engine emulates the Darwinian 'Survival of the Fittest' strategy, the key ingredient of the 'Theory of Evolution'. The spectra of 241Am/Be (alpha,n) and 239Pu/Be (alpha,n) neutron sources were unfolded using the BONDI-97 tool. (author)

  5. Optimization of IBF parameters based on adaptive tool-path algorithm

    Science.gov (United States)

    Deng, Wen Hui; Chen, Xian Hua; Jin, Hui Liang; Zhong, Bo; Hou, Jin; Li, An Qi

    2018-03-01

    As a kind of Computer Controlled Optical Surfacing (CCOS) technology, Ion Beam Figuring (IBF) has obvious advantages in the control of surface accuracy, surface roughness and subsurface damage. The superiority and characteristics of IBF in optical component processing are analyzed from the point of view of the removal mechanism. To obtain a more effective and automatic tool path carrying dwell-time information, a novel algorithm is proposed in this paper. Based on the removal functions obtained with our IBF equipment and the adaptive tool path, optimized parameters are derived by analyzing the residual error that would be created in the polishing process. A Φ600 mm plane reflector element was used as a simulation instance. The simulation result shows that after four combinations of processing, the surface accuracy in terms of the PV (Peak Valley) value and the RMS (Root Mean Square) value was reduced from 110.22 nm and 13.998 nm to 4.81 nm and 0.495 nm, respectively, over the 98% aperture. The result shows that the algorithm and optimized parameters provide a good theoretical basis for high-precision IBF processing.

  6. A Modified Artificial Bee Colony Algorithm Application for Economic Environmental Dispatch

    Science.gov (United States)

    Tarafdar Hagh, M.; Baghban Orandi, Omid

    2018-03-01

    In conventional fossil-fuel power systems, the economic environmental dispatch (EED) problem is a major problem: it optimally determines the output power of the generating units such that the total production cost and the emission level are minimized simultaneously, while all the constraints of the units and the system are properly satisfied. To solve the EED problem, which is a non-convex optimization problem, a modified artificial bee colony (MABC) algorithm is proposed in this paper. Using the weighted sum method, the algorithm is applied to two test systems, and the obtained results are compared with other reported results. The comparison of results clearly confirms the superiority and efficiency of the proposed method.

  7. A new modified artificial bee colony algorithm for the economic dispatch problem

    International Nuclear Information System (INIS)

    Secui, Dinu Calin

    2015-01-01

    Highlights: • A new modified ABC algorithm (MABC) is proposed to solve the EcD/EmD problem. • Valve-point effects, ramp-rate limits, POZ, transmission losses were considered. • The algorithm is tested on four systems having 6, 13, 40 and 52 thermal units. • MABC algorithm outperforms several optimization techniques. - Abstract: In this paper a new modified artificial bee colony algorithm (MABC) is proposed to solve the economic dispatch problem by taking into account the valve-point effects, pollutant emissions and various operating constraints of the generating units. The MABC algorithm introduces a new relation to update the solutions within the search space, in order to increase the algorithm's ability to avoid premature convergence and to find stable and high-quality solutions. Moreover, to strengthen the MABC algorithm's performance, it is endowed with a chaotic sequence generated by both a cat map and a logistic map. The MABC algorithm's behaviour is investigated for several combinations resulting from three generating modalities of the chaotic sequences and two selection schemes of the solutions. The performance of the MABC variants is tested on four systems having six, thirteen, forty and fifty-two thermal generating units. The comparison of the results shows that the MABC variants perform better than the classical ABC algorithm and other optimization techniques
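
    For context only, the classical ABC food-source update that the record's new relation replaces is commonly written as v_ij = x_ij + phi_ij (x_ij - x_kj); a minimal sketch, with all names being illustrative assumptions, is:

      import random

      def abc_neighbour(x, population):
          """Classical ABC update: perturb one dimension of x toward/away from another food source."""
          k = random.choice([p for p in population if p is not x])  # a different food source
          j = random.randrange(len(x))                              # one randomly chosen dimension
          phi = random.uniform(-1.0, 1.0)
          v = list(x)
          v[j] = x[j] + phi * (x[j] - k[j])
          return v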

  8. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    Energy Technology Data Exchange (ETDEWEB)

    Gould, Nathan [Department of Computer Science, The College of New Jersey, Ewing, NJ (United States); Hendy, Oliver [Department of Biology, The College of New Jersey, Ewing, NJ (United States); Papamichail, Dimitris, E-mail: papamicd@tcnj.edu [Department of Computer Science, The College of New Jersey, Ewing, NJ (United States)

    2014-10-06

    Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations.
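
    As a toy illustration of the "first generation" singular-objective tools mentioned above, a naive codon-usage optimizer simply back-translates a protein using the host's most frequent codon for each amino acid; the table below is a made-up fragment, not real usage data:

      # Hypothetical codon-usage fragment: the most frequent codon per amino acid in some host.
      PREFERRED_CODON = {"M": "ATG", "W": "TGG", "K": "AAA", "E": "GAA", "L": "CTG"}

      def codon_optimize(protein):
          """Naive single-objective back-translation: pick the host's preferred codon per residue."""
          return "".join(PREFERRED_CODON[aa] for aa in protein)

      print(codon_optimize("MEKL"))   # -> ATGGAAAAACTG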

  9. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    International Nuclear Information System (INIS)

    Gould, Nathan; Hendy, Oliver; Papamichail, Dimitris

    2014-01-01

    Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations.

  10. Algorithms and programming tools for image processing on the MPP:3

    Science.gov (United States)

    Reeves, Anthony P.

    1987-01-01

    This is the third and final report on the work done for NASA Grant 5-403 on Algorithms and Programming Tools for Image Processing on the MPP:3. All the work done for this grant is summarized in the introduction. Work done since August 1986 is reported in detail. Research for this grant falls under the following headings: (1) fundamental algorithms for the MPP; (2) programming utilities for the MPP; (3) the Parallel Pascal Development System; and (4) performance analysis. In this report, the results of two efforts are reported: region growing, and performance analysis of important characteristic algorithms. In each case, timing results from MPP implementations are included. A paper is included in which parallel algorithms for region growing on the MPP are discussed. These algorithms permit different sized regions to be merged in parallel. Details on the implementation and performance of several important MPP algorithms are given. These include a number of standard permutations, the FFT, convolution, arbitrary data mappings, image warping, and pyramid operations, all of which have been implemented on the MPP. The permutation and image warping functions have been included in the standard development system library.

  11. A modified Symbiotic Organisms Search algorithm for large scale economic dispatch problem with valve-point effects

    International Nuclear Information System (INIS)

    Secui, Dinu Calin

    2016-01-01

    This paper proposes a new metaheuristic algorithm, called the Modified Symbiotic Organisms Search (MSOS) algorithm, to solve the economic dispatch problem considering the valve-point effects, the prohibited operating zones (POZ), the transmission line losses, multi-fuel sources, as well as other operating constraints of the generating units and power system. The MSOS algorithm introduces, in all of its phases, new relations to update the solutions, to improve its capacity to identify stable and high-quality solutions in a reasonable time. Furthermore, to increase the MSOS algorithm's capacity to explore the most promising zones, it is endowed with a chaotic component generated by the logistic map. The performance of the modified algorithm and of the original Symbiotic Organisms Search (SOS) algorithm is tested on five systems of different characteristics, constraints and dimensions (13-unit, 40-unit, 80-unit, 160-unit and 320-unit). The results obtained by applying the proposed algorithm (MSOS) show that it performs better than other optimization techniques recently used in solving the economic dispatch problem with valve-point effects. - Highlights: • A new modified SOS algorithm (MSOS) is proposed to solve the EcD problem. • Valve-point effects, ramp-rate limits, POZ, multi-fuel sources, transmission losses were considered. • The algorithm is tested on five systems having 13, 40, 80, 160 and 320 thermal units. • MSOS algorithm outperforms many other optimization techniques.

  12. Cross entropy-based memetic algorithms: An application study over the tool switching problem

    Directory of Open Access Journals (Sweden)

    Jhon Edgar Amaya

    2013-05-01

    Full Text Available This paper presents a parameterized schema for building memetic algorithms based on cross-entropy (CE) methods. This novel schema is general in nature, and features multiple probability mass functions and Lamarckian learning. The applicability of the approach is assessed by considering the Tool Switching Problem, a complex combinatorial problem in the field of Flexible Manufacturing Systems. An exhaustive evaluation (including techniques ranging from local search and evolutionary algorithms to constructive methods) provides evidence of the effectiveness of CE-based memetic algorithms.

  13. Assistance tool for commissioning new algorithms in ionizing radiation therapy planning systems

    International Nuclear Information System (INIS)

    Reinado, D.; Ricos, B.; Alonso, S.; Chinillach, N.; Bellido, P.; Tortosa, R.

    2013-01-01

    Commissioning a new planning algorithm involves a large number of working hours and measurements. In order to streamline the commissioning of the AAA and Acuros XB algorithms within the Eclipse (v.10) planning system marketed by Varian, a tool has been developed in Microsoft Excel format that gathers the different tests to be performed. (Author)

  14. GenTree: A Tool for Syntactic Analysis Based on the Cocke-Younger-Kasami Algorithm

    Directory of Open Access Journals (Sweden)

    - Wijanarto

    2017-04-01

    Full Text Available Syntactic analysis is a series of processes for validating whether a string is accepted by a language. Understanding how the reduction of grammar rules builds a parse tree is the part that is difficult to explain. This paper describes the design of a tool that automates the derivation of an input string through the grammar rules into a tree, visualized as images either as files or on screen, together with a performance evaluation of the tool and an analysis of students' understanding when using it. The Cocke-Younger-Kasami (CYK) algorithm was selected as the case study for parsing techniques over Context Free Grammars (CFG) in Chomsky Normal Form (CNF). The results indicate that the model was successfully implemented in an application named genTree (Generator Tree). Performance measurements over variations in grammar complexity and input-string length are reported (29.13% for complexities 7 and 8.50% for complexity 20; for long input strings, processing-time figures of 3.3 and 66.98% as well as 29 and 6.19% are given). A t-test comparing a control group of students against the experimental group gave t = 5.336 with df = 74 and p = 0.001 at the 0.05 (5% significance level. The percentage of correct answers increased by 58% for the difficult variation and 83% for the easy variation; conversely, wrong answers decreased by 60% for the difficult variation, 100% for the medium variation and 57% for the easy variation, and the percentage of students not attempting the questions decreased by 60% for the difficult variation, 44% for the medium variation and 13% for the easy variation. It can be concluded that the application runs efficiently and optimally, and can also effectively improve students' understanding in learning automata with the CYK algorithm as a case study. Keywords—Tool, Analysis, Syntax, Algorithms, Trees
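
    To make the parsing step concrete, a minimal CYK membership test for a CNF grammar can be written as follows; the grammar and dictionary encoding are illustrative assumptions and are not taken from genTree itself:

      def cyk(word, grammar, start="S"):
          """CYK membership test for a CNF grammar given as {lhs: [rhs, ...]}, where each
          rhs is either a terminal string or a tuple (B, C) of nonterminals."""
          n = len(word)
          table = [[set() for _ in range(n)] for _ in range(n)]
          for i, ch in enumerate(word):                          # length-1 substrings
              table[i][i] = {A for A, rules in grammar.items() if ch in rules}
          for length in range(2, n + 1):                         # longer substrings, bottom-up
              for i in range(n - length + 1):
                  j = i + length - 1
                  for k in range(i, j):                          # split point
                      for A, rules in grammar.items():
                          for rhs in rules:
                              if isinstance(rhs, tuple):
                                  B, C = rhs
                                  if B in table[i][k] and C in table[k + 1][j]:
                                      table[i][j].add(A)
          return start in table[0][n - 1]

      # Illustrative CNF grammar for {a^n b^n : n >= 1}:
      G = {"S": [("A", "T"), ("A", "B")], "T": [("S", "B")], "A": ["a"], "B": ["b"]}
      print(cyk("aabb", G), cyk("abb", G))   # True False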

  15. Parallel genetic algorithm as a tool for nuclear reactors reload

    International Nuclear Information System (INIS)

    Santos, Darley Roberto G.; Schirru, Roberto

    1999-01-01

    This work presents a tool which can be used by designers in order to obtain better solutions, in terms of computational cost, to nuclear reactor reload problems. It is known that the nuclear fuel reload design problem is a complex combinatorial one. Generally, iterative processes are the most used because they generate answers that satisfy all restrictions. The model presented here uses Artificial Intelligence techniques, more precisely Genetic Algorithm techniques, combined with parallelization techniques. Tests of the tool presented here were highly satisfactory, due to a considerable reduction in computational time. (author)

  16. Have Economic Educators Embraced Social Media as a Teaching Tool?

    Science.gov (United States)

    Al-Bahrani, Abdullah; Patel, Darshak; Sheridan, Brandon J.

    2017-01-01

    In this article, the authors discuss the results of a study of the perceptions of a national sample of economics faculty members from various institutions regarding the use of social media as a teaching tool in and out of the economics classroom. In the past few years, social media has become globally popular, and its use is ubiquitous among…

  17. Understanding the stakeholders' intention to use economic decision-support tools: A cross-sectional study with the tobacco return on investment tool.

    Science.gov (United States)

    Cheung, Kei Long; Evers, Silvia M A A; Hiligsmann, Mickaël; Vokó, Zoltán; Pokhrel, Subhash; Jones, Teresa; Muñoz, Celia; Wolfenstetter, Silke B; Józwiak-Hagymásy, Judit; de Vries, Hein

    2016-01-01

    Despite an increased number of economic evaluations of tobacco control interventions, the uptake by stakeholders continues to be limited. Understanding the underlying mechanism by which stakeholders adopt such economic decision-support tools is therefore important. By applying the I-Change Model, this study aims to identify which factors determine potential uptake of an economic decision-support tool, i.e., the Return on Investment tool. Stakeholders (decision-makers, purchasers of services/pharma products, professionals/service providers, evidence generators and advocates of health promotion) were interviewed in five countries, using an I-Change based questionnaire. MANOVAs were conducted to assess differences between intenders and non-intenders regarding beliefs. A multiple regression analysis was conducted to identify the main explanatory variables of intention to use an economic decision-support tool. Ninety-three stakeholders participated. Significant differences in beliefs were found between non-intenders and intenders: risk perception, attitude, social support, and self-efficacy towards using the tool. Regression showed that demographics, pre-motivational, and motivational factors explained 69% of the variation in intention. This study is the first to provide a theoretical framework to understand differences in beliefs between stakeholders who do or do not intend to use economic decision-support tools, and to empirically corroborate the framework. This contributes to our understanding of the facilitators and barriers to the uptake of these studies. Copyright © 2015 The Authors. Published by Elsevier Ireland Ltd. All rights reserved.

  18. Application of multiple tabu search algorithm to solve dynamic economic dispatch considering generator constraints

    International Nuclear Information System (INIS)

    Pothiya, Saravuth; Ngamroo, Issarachai; Kongprawechnon, Waree

    2008-01-01

    This paper presents a new optimization technique based on a multiple tabu search algorithm (MTS) to solve the dynamic economic dispatch (ED) problem with generator constraints. In the constrained dynamic ED problem, the load demand and spinning reserve capacity, as well as some practical operating constraints of generators such as ramp rate limits and prohibited operating zones, are taken into consideration. The MTS algorithm introduces additional mechanisms such as initialization, adaptive searches, multiple searches, crossover and a restarting process. To show its efficiency, the MTS algorithm is applied to solve constrained dynamic ED problems of power systems with 6 and 15 units. The results obtained from the MTS algorithm are compared to those achieved from conventional approaches such as simulated annealing (SA), genetic algorithm (GA), tabu search (TS) algorithm and particle swarm optimization (PSO). The experimental results show that the proposed MTS algorithm is able to obtain higher-quality solutions efficiently and with less computational time than the conventional approaches

  19. An algorithm and a Tool for Wavelength Allocation in OMS-SP Ring Architecture

    DEFF Research Database (Denmark)

    Riaz, Muhammad Tahir; Pedersen, Jens Myrup; Madsen, Ole Brun

    2006-01-01

    OMS-SP ring is one of the well known architectures in Wavelength Division Multiplexing based optical fiber networks. The architecture supports a restorable full mesh in an optical fiber ring using multiple light wavelengths. The paper presents an algorithm to allocate wavelengths in the OMS-SP ring architecture. A tool is also introduced which implements the algorithm and assigns wavelengths. The proposed algorithm uses a smaller number of wavelengths than the classical allocation method. The algorithm is described and results are presented.

  20. An improved Pattern Search based algorithm to solve the Dynamic Economic Dispatch problem with valve-point effect

    International Nuclear Information System (INIS)

    Alsumait, J.S.; Qasem, M.; Sykulski, J.K.; Al-Othman, A.K.

    2010-01-01

    In this paper, an improved algorithm based on the Pattern Search (PS) method to solve the Dynamic Economic Dispatch is proposed. The algorithm maintains the essential unit ramp rate constraint, along with all other necessary constraints, not only for the time horizon of operation (24 h), but it also preserves these constraints through the transition to the next time horizon (next day) in order to avoid discontinuity of power system operation. The Dynamic Economic and Emission Dispatch (DEED) problem is also considered. The load balance constraints, operating limits, valve-point loading and network losses are included in the models of both DED and DEED. The numerical results demonstrate the significance of the improved algorithm and verify its performance.
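
    As background on the underlying search mechanism (not the paper's improved variant), a basic coordinate pattern search can be sketched as follows; the objective, starting point and parameters are illustrative assumptions:

      def pattern_search(objective, x0, step=1.0, tol=1e-6, shrink=0.5):
          """Basic compass pattern search: poll +/- step along each axis, move to any
          improving point, otherwise shrink the mesh until it falls below tol."""
          x = list(x0)
          fx = objective(x)
          while step > tol:
              improved = False
              for i in range(len(x)):
                  for delta in (step, -step):
                      trial = list(x)
                      trial[i] += delta
                      ft = objective(trial)
                      if ft < fx:
                          x, fx, improved = trial, ft, True
                          break
                  if improved:
                      break
              if not improved:
                  step *= shrink
          return x, fx

      # Example: minimize a simple unconstrained quadratic (illustrative only)
      print(pattern_search(lambda v: (v[0] - 3) ** 2 + (v[1] + 1) ** 2, [0.0, 0.0]))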

  1. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    Directory of Open Access Journals (Sweden)

    Nathan eGould

    2014-10-01

    Full Text Available Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de-novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations.

  2. PCTFPeval: a web tool for benchmarking newly developed algorithms for predicting cooperative transcription factor pairs in yeast.

    Science.gov (United States)

    Lai, Fu-Jou; Chang, Hong-Tsun; Wu, Wei-Sheng

    2015-01-01

    Computational identification of cooperative transcription factor (TF) pairs helps understand the combinatorial regulation of gene expression in eukaryotic cells. Many advanced algorithms have been proposed to predict cooperative TF pairs in yeast. However, it is still difficult to conduct a comprehensive and objective performance comparison of different algorithms because of the lack of sufficient performance indices and adequate overall performance scores. To solve this problem, in our previous study (published in BMC Systems Biology 2014), we adopted/proposed eight performance indices and designed two overall performance scores to compare the performance of 14 existing algorithms for predicting cooperative TF pairs in yeast. Most importantly, our performance comparison framework can be applied to comprehensively and objectively evaluate the performance of a newly developed algorithm. However, to use our framework, researchers have to put in a lot of effort to construct it first. To save researchers' time and effort, here we develop a web tool to implement our performance comparison framework, featuring fast data processing, a comprehensive performance comparison and an easy-to-use web interface. The developed tool is called PCTFPeval (Predicted Cooperative TF Pair evaluator), written in the PHP and Python programming languages. The friendly web interface allows users to input a list of predicted cooperative TF pairs from their algorithm and select (i) the compared algorithms among the 15 existing algorithms, (ii) the performance indices among the eight existing indices, and (iii) the overall performance scores from two possible choices. The comprehensive performance comparison results are then generated in tens of seconds and shown as both bar charts and tables. The original comparison results of each compared algorithm and each selected performance index can be downloaded as text files for further analyses. Allowing users to select eight existing performance indices and 15

  3. Algorithms: economical computation of functions of real matrices

    International Nuclear Information System (INIS)

    Weiss, Z.

    1991-01-01

    An algorithm is presented which economizes on the calculation of F(A), where A is a real matrix and F(x) a real-valued function of x, using spectral analysis. Assuming the availability of software for the calculation of the complete set of eigenvalues and eigenvectors of A, it is shown that the complex matrix arithmetic involved in the subsequent operations leading from A to F(A) can be reduced to a size comparable with the analogous problem in real matrix arithmetic. Savings in CPU time and storage have been achieved by explicitly utilizing the property that the complex eigenvalues of a real matrix appear in complex conjugate pairs. (author)
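
    A naive reference implementation of the spectral route from A to F(A) is sketched below; it does not include the economization described above and simply uses complex arithmetic throughout, with the example matrix and names being illustrative assumptions:

      import numpy as np

      def matrix_function(A, f):
          """Evaluate f(A) for a real, diagonalizable matrix A via its spectral decomposition.
          Complex eigenvalues of a real matrix occur in conjugate pairs, so for a real-valued
          f the imaginary parts cancel and can be discarded."""
          w, V = np.linalg.eig(A)                      # eigenvalues/eigenvectors (possibly complex)
          F = V @ np.diag(f(w)) @ np.linalg.inv(V)
          return F.real

      A = np.array([[0.0, -1.0], [1.0, 0.0]])          # eigenvalues are +i and -i
      print(matrix_function(A, np.exp))                # matrix exponential of a rotation generator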

  4. Condition monitoring of face milling tool using K-star algorithm and histogram features of vibration signal

    Directory of Open Access Journals (Sweden)

    C.K. Madhusudana

    2016-09-01

    Full Text Available This paper deals with fault diagnosis of a face milling tool based on a machine learning approach using histogram features and the K-star algorithm. Vibration signals of the milling tool under healthy and different fault conditions are acquired during machining of the steel alloy 42CrMo4. Histogram features are extracted from the acquired signals. A decision tree is used to select the salient features out of all the extracted features, and these selected features are used as input to the classifier. The K-star algorithm is used as the classifier, and the output of the model is utilised to study and classify the different conditions of the face milling tool. Based on the experimental results, the K-star algorithm provided a classification accuracy in the range of 94% to 96% with histogram features, which is acceptable for fault diagnosis.

  5. A Homogeneous and Self-Dual Interior-Point Linear Programming Algorithm for Economic Model Predictive Control

    DEFF Research Database (Denmark)

    Sokoler, Leo Emil; Frison, Gianluca; Skajaa, Anders

    2015-01-01

    We develop an efficient homogeneous and self-dual interior-point method (IPM) for the linear programs arising in economic model predictive control of constrained linear systems with linear objective functions. The algorithm is based on a Riccati iteration procedure, which is adapted to the linear...... system of equations solved in homogeneous and self-dual IPMs. Fast convergence is further achieved using a warm-start strategy. We implement the algorithm in MATLAB and C. Its performance is tested using a conceptual power management case study. Closed loop simulations show that 1) the proposed algorithm...

  6. Short term economic emission power scheduling of hydrothermal energy systems using improved water cycle algorithm

    International Nuclear Information System (INIS)

    Haroon, S.S.; Malik, T.N.

    2017-01-01

    Due to increasing environmental concerns, the demand for clean and green energy and the concern about atmospheric pollution are growing. Hence, power utilities are forced to keep their emissions within the prescribed limits. Therefore, the minimization of fuel cost as well as exhaust gas emissions is becoming an important and challenging task in the short-term scheduling of hydro-thermal energy systems. This paper proposes a novel algorithm known as WCA-ER (Water Cycle Algorithm with Evaporation Rate) to investigate the short-term EEPSHES (Economic Emission Power Scheduling of Hydrothermal Energy Systems). The WCA is inspired by the natural hydrologic cycle: rainfall forms streams, the streams flow towards rivers, and the rivers finally flow towards the sea. The worth of WCA-ER has been tested on the standard economic emission power scheduling test system for hydrothermal energy systems consisting of four hydropower and three thermal plants. The problem has been investigated for three case studies: (i) ECS (Economic Cost Scheduling), (ii) ES (Economic Emission Scheduling) and (iii) ECES (Economic Cost and Emission Scheduling). The results obtained show that WCA-ER is superior to many other methods in the literature in yielding lower fuel cost and emissions. (author)

  7. Panorama 2018 - Overview of economic carbon pricing tools worldwide

    International Nuclear Information System (INIS)

    Coussy, Paula

    2018-01-01

    The Paris Agreement signed at COP21 came into effect in November 2016. This agreement aims to hold the increase in global average temperature to below 2 deg. C and pursue efforts to limit the rise to 1.5 deg. C by 2100. Governments and local jurisdictions must now implement an economic and regulatory framework to encourage greenhouse gas reductions. One of the economic tools available is carbon pricing. It varies greatly in form and value at international level and is deployed in all sectors of the economy. (author)

  8. Panorama 2017 - Overview of economic carbon pricing tools worldwide

    International Nuclear Information System (INIS)

    Coussy, Paula

    2017-06-01

    The Paris Agreement signed at COP21 came into effect in November 2016. This agreement aims to hold the increase in global average temperature to below 2 deg. C and pursue efforts to limit the rise to 1.5 deg. C by 2100. Governments and local jurisdictions must now implement an economic and regulatory framework to encourage greenhouse gas reductions. One of the economic tools available is carbon pricing. It varies greatly in form and value at international level and is deployed in all sectors of the economy

  9. Techno-economic optimization of a shell and tube heat exchanger by genetic and particle swarm algorithms

    International Nuclear Information System (INIS)

    Sadeghzadeh, H.; Ehyaei, M.A.; Rosen, M.A.

    2015-01-01

    Highlights: • Calculating pressure drop and heat transfer coefficient by the Delaware method. • The accuracy of the Delaware method is higher than that of the Kern method. • The results of the PSO are better than the results of the GA. • The optimization yields the best and most economical design. - Abstract: The use of genetic and particle swarm algorithms in the design of techno-economically optimum shell-and-tube heat exchangers is demonstrated. A cost function (including costs of the heat exchanger based on surface area and power consumption to overcome pressure drops) is the objective function, which is to be minimized. Selected decision variables include tube diameter, central baffle spacing and shell diameter. The Delaware method is used to calculate the heat transfer coefficient and the shell-side pressure drop. The accuracy and efficiency of the suggested algorithm and the Delaware method are investigated. A comparison of the results obtained by the two algorithms shows that results obtained with the particle swarm optimization method are superior to those obtained with the genetic algorithm method. By comparing these results with those from various references employing the Kern method and other algorithms, it is shown that the Delaware method accompanied by genetic and particle swarm algorithms achieves better optimization results, based on assessments for two case studies

  10. A modified gravitational search algorithm based on a non-dominated sorting genetic approach for hydro-thermal-wind economic emission dispatching

    International Nuclear Information System (INIS)

    Chen, Fang; Zhou, Jianzhong; Wang, Chao; Li, Chunlong; Lu, Peng

    2017-01-01

    Wind power is a type of clean and renewable energy, and reasonable utilization of wind power is beneficial to environmental protection and economic development. Therefore, a short-term hydro-thermal-wind economic emission dispatching (SHTW-EED) problem is presented in this paper. The proposed problem aims to distribute the load among hydro, thermal and wind power units to simultaneously minimize economic cost and pollutant emission. To solve the SHTW-EED problem with complex constraints, a modified gravitational search algorithm based on the non-dominated sorting genetic algorithm-III (MGSA-NSGA-III) is proposed. In the proposed MGSA-NSGA-III, a non-dominated sorting approach, reference-point based selection mechanism and chaotic mutation strategy are applied to improve the evolutionary process of the original gravitational search algorithm (GSA) and maintain the distribution diversity of Pareto optimal solutions. Moreover, a parallel computing strategy is introduced to improve the computational efficiency. Finally, the proposed MGSA-NSGA-III is applied to a typical hydro-thermal-wind system to verify its feasibility and effectiveness. The simulation results indicate that the proposed algorithm can obtain low economic cost and small pollutant emission when dealing with the SHTW-EED problem. - Highlights: • A hybrid algorithm is proposed to handle hydro-thermal-wind power dispatching. • Several improvement strategies are applied to the algorithm. • A parallel computing strategy is applied to improve computational efficiency. • Two cases are analyzed to verify the efficiency of the optimization model.

  11. Introducing GEOPHIRES v2.0: Updated Geothermal Techno-Economic Simulation Tool

    Energy Technology Data Exchange (ETDEWEB)

    Beckers, Koenraad J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); McCabe, Kevin [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-02-14

    This paper presents an updated version of the geothermal techno-economic simulation tool GEOPHIRES (GEOthermal energy for Production of Heat and electricity ('IR') Economically Simulated). GEOPHIRES combines engineering models of the reservoir, wellbores, and surface plant facilities of a geothermal plant with an economic model to estimate the capital and operation and maintenance costs, lifetime energy production, and overall levelized cost of energy. The available end-use options are electricity, direct-use heat, and cogeneration. The main updates in the new version include conversion of the source code from FORTRAN to Python, the option to import temperature data (e.g., measured or from stand-alone reservoir simulator), updated cost correlations, and more flexibility in selecting the time step and number of injection and production wells. In this paper, we provide an overview of all the updates and two case studies to illustrate the tool's new capabilities.

  12. Artificial bee colony algorithm for economic load dispatch with wind power energy

    Directory of Open Access Journals (Sweden)

    Safari Amin

    2016-01-01

    Full Text Available This paper presents an efficient Artificial Bee Colony (ABC) algorithm for solving large scale economic load dispatch (ELD) problems in power networks. To realize the ELD, the valve-point loading effect, system load demand, power losses, ramp rate limits and prohibited operation zones are considered here. Simulations were performed on four different power systems with 3, 6, 15 and 40 generating units, and the results are compared for two configurations of each power system: one with a wind power generator and one without a wind power generator. The results of this study reveal that the proposed approach is able to find better ELD solutions than those of previous algorithms.

  13. Introducing GEOPHIRES v2.0: Updated Geothermal Techno-Economic Simulation Tool: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Beckers, Koenraad J [National Renewable Energy Laboratory (NREL), Golden, CO (United States); McCabe, Kevin [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2018-02-16

    This paper presents an updated version of the geothermal techno-economic simulation tool GEOPHIRES (GEOthermal Energy for Production of Heat and electricity (IR) Economically Simulated). GEOPHIRES combines reservoir, wellbore, surface plant and economic models to estimate the capital, and operation and maintenance costs, lifetime energy production, and overall levelized cost of energy of a geothermal plant. The available end-use options are electricity, direct-use heat and cogeneration. The main updates in the new version include conversion of the source code from FORTRAN to Python, the option to couple to an external reservoir simulator, updated cost correlations, and more flexibility in selecting the time step and number of injection and production wells. An overview of all the updates and two case-studies to illustrate the tool's new capabilities are provided in this paper.

  14. A new algorithm for combined dynamic economic emission dispatch with security constraints

    International Nuclear Information System (INIS)

    Arul, R.; Velusami, S.; Ravi, G.

    2015-01-01

    The primary objective of the CDEED (combined dynamic economic emission dispatch) problem is to determine the optimal power generation schedule for the online generating units over the time horizon considered while simultaneously minimizing the emission level and satisfying the generator and system constraints. The CDEED problem is a bi-objective optimization problem, where generation cost and emission are considered as two competing objective functions. This bi-objective CDEED problem is represented as a single-objective optimization problem by assigning different weights to each objective function. The weights are varied in steps, one compromise solution is generated for each variation, and finally a fuzzy-based selection method is used to select the best compromise solution from the set of compromise solutions obtained. In order to make the test systems reflect a real power system model, security constraints are also taken into account. Three new versions of DHS (differential harmony search) algorithms have been proposed to solve the CDEED problems. The feasibility of the proposed algorithms is demonstrated on IEEE-26 and IEEE-39 bus systems. The result obtained by the proposed CSADHS (chaotic self-adaptive differential harmony search) algorithm is found to be better than EP (evolutionary programming), DHS, and the other proposed algorithms in terms of solution quality, convergence speed and computation time. - Highlights: • In this paper, three new algorithms, CDHS, SADHS and CSADHS, are proposed. • They solve DED with emission, poz's, spinning reserve and security constraints. • Results obtained by the proposed CSADHS algorithm are better than the others. • The proposed CSADHS algorithm has faster convergence characteristics than the others.

  15. A Dantzig-Wolfe decomposition algorithm for linear economic model predictive control of dynamically decoupled subsystems

    DEFF Research Database (Denmark)

    Sokoler, Leo Emil; Standardi, Laura; Edlund, Kristian

    2014-01-01

    This paper presents a warm-started Dantzig–Wolfe decomposition algorithm tailored to economic model predictive control of dynamically decoupled subsystems. We formulate the constrained optimal control problem solved at each sampling instant as a linear program with state space constraints, input limits, input rate limits, and soft output limits. The objective function of the linear program is related directly to the cost of operating the subsystems, and the cost of violating the soft output constraints. Simulations for large-scale economic power dispatch problems show that the proposed algorithm is significantly faster than both state-of-the-art linear programming solvers, and a structure exploiting implementation of the alternating direction method of multipliers. It is also demonstrated that the control strategy presented in this paper can be tuned using a weighted ℓ1-regularization term...

  16. A hybrid multi-objective cultural algorithm for short-term environmental/economic hydrothermal scheduling

    International Nuclear Information System (INIS)

    Lu Youlin; Zhou Jianzhong; Qin Hui; Wang Ying; Zhang Yongchuan

    2011-01-01

    Research highlights: → Multi-objective optimization model of short-term environmental/economic hydrothermal scheduling. → A hybrid multi-objective cultural algorithm (HMOCA) is presented. → New heuristic constraint handling methods are proposed. → Better quality solutions by reducing fuel cost and emission effects simultaneously are obtained. -- Abstract: The short-term environmental/economic hydrothermal scheduling (SEEHS) with the consideration of multiple objectives is a complicated non-linear constrained optimization problem with non-smooth and non-convex characteristics. In this paper, a multi-objective optimization model of SEEHS is proposed that considers the minimization of fuel cost and emission effects simultaneously, and the transmission loss, the water transport delays between connected reservoirs as well as the valve-point effects of thermal plants are taken into consideration to formulate the problem precisely. Meanwhile, a hybrid multi-objective cultural algorithm (HMOCA) is presented to deal with the SEEHS problem by optimizing both objectives simultaneously. The proposed method integrates the differential evolution (DE) algorithm into the framework of the cultural algorithm model to implement the evolution of the population space, and two knowledge structures in the belief space are redefined according to the characteristics of DE and the SEEHS problem to avoid premature convergence effectively. Moreover, in order to deal with the complicated constraints effectively, new heuristic constraint handling methods without any penalty factor settings are proposed in this paper. The feasibility and effectiveness of the proposed HMOCA method are demonstrated by two case studies of a hydrothermal power system. The simulation results reveal that, compared with other methods established recently, HMOCA can get better quality solutions by reducing fuel cost and emission effects simultaneously.

  17. Development of Gis Tool for the Solution of Minimum Spanning Tree Problem using Prim's Algorithm

    Science.gov (United States)

    Dutta, S.; Patra, D.; Shankar, H.; Alok Verma, P.

    2014-11-01

    A minimum spanning tree (MST) of a connected, undirected and weighted network is a tree of that network consisting of all its nodes such that the sum of the weights of all its edges is minimum among all possible spanning trees of the same network. In this study, we have developed a new GIS tool using the well-known Prim's algorithm to construct the minimum spanning tree of a connected, undirected and weighted road network. This algorithm is based on the weight (adjacency) matrix of a weighted network and helps to solve complex network MST problems easily, efficiently and effectively. The selection of an appropriate algorithm is essential; otherwise it will be very hard to get an optimal result. In the case of a road transportation network, it is essential to find optimal results by considering all the necessary points based on a cost factor (time or distance). This paper is based on solving the Minimum Spanning Tree (MST) problem of a road network by finding its minimum span while considering all the important network junction points. GIS technology is usually used to solve network-related problems like the optimal path problem, travelling salesman problem, vehicle routing problems, location-allocation problems, etc. Therefore, in this study we have developed a customized GIS tool using a Python script in ArcGIS software for the solution of the MST problem for the road transportation network of Dehradun city, considering distance and time as the impedance (cost) factors. It has a number of advantages: users do not need deep knowledge of the subject, as the tool is user-friendly and gives access to information varied and adapted to the needs of the users. This GIS tool for MST can be applied to a nationwide plan called Prime Minister Gram Sadak Yojana in India to provide optimal all-weather road connectivity to unconnected villages (points). This tool is also useful for constructing highways or railways spanning several
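
    A minimal sketch of Prim's algorithm on a weight (adjacency) matrix, in the spirit of the GIS tool described above but independent of ArcGIS; the example road network and weights are hypothetical.

```python
import math

def prim_mst(w):
    """Prim's algorithm on a symmetric weight matrix; w[i][j] = math.inf if no edge.
    Returns the MST as a list of (u, v, weight) edges."""
    n = len(w)
    in_tree = [False] * n
    best_cost = [math.inf] * n     # cheapest edge weight connecting each node to the tree
    parent = [-1] * n
    best_cost[0] = 0.0
    edges = []
    for _ in range(n):
        u = min((i for i in range(n) if not in_tree[i]), key=lambda i: best_cost[i])
        in_tree[u] = True
        if parent[u] >= 0:
            edges.append((parent[u], u, w[parent[u]][u]))
        for v in range(n):
            if not in_tree[v] and w[u][v] < best_cost[v]:
                best_cost[v], parent[v] = w[u][v], u
    return edges

# Hypothetical 4-node road network (weights = travel times in minutes).
INF = math.inf
w = [[INF, 4, 1, INF],
     [4, INF, 2, 5],
     [1, 2, INF, 8],
     [INF, 5, 8, INF]]
print(prim_mst(w))   # e.g. [(0, 2, 1), (2, 1, 2), (1, 3, 5)]
```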

  18. Static economic dispatch incorporating wind farm using Flower pollination algorithm

    Directory of Open Access Journals (Sweden)

    Suresh Velamuri

    2016-09-01

    Full Text Available Renewable energy is one of the cleanest and cheapest forms of energy and helps in minimizing the carbon footprint. Due to its lower environmental impact and economic benefits, integration of renewable energy sources with the existing network has gained attention. In this paper, the impact of wind energy on a power system network is analysed using static economic dispatch (SED). The wind energy is integrated with the existing thermal systems. Here, the generation scheduling is optimized using the Flower pollination algorithm (FPA) due to its robustness in solving nonlinear problems. Integration of wind power in the existing system increases the complexity due to its stochastic nature. A Weibull distribution function is used to model the stochastic nature of wind. Scenarios without and with wind power penetration are discussed in detail. The analysis is carried out by considering the losses and installing the wind farm at different locations in the system. The proposed methodology is tested and validated on a standard IEEE 30 bus system.
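
    A small sketch of the wind modelling idea mentioned above: wind speeds are sampled from a Weibull distribution and mapped through a simplified turbine power curve. The shape and scale parameters and turbine ratings are illustrative assumptions, not values from the paper.

```python
import numpy as np

def expected_wind_power(k=2.0, c=8.0, v_cut_in=3.0, v_rated=12.0, v_cut_out=25.0,
                        p_rated=50.0, n=10000, seed=1):
    """Expected wind farm output (MW) from Weibull-distributed wind speeds."""
    rng = np.random.default_rng(seed)
    v = c * rng.weibull(k, n)                     # Weibull(shape k, scale c) wind speeds in m/s
    p = np.zeros(n)
    ramp = (v >= v_cut_in) & (v < v_rated)        # simple cubic ramp between cut-in and rated speed
    p[ramp] = p_rated * ((v[ramp] - v_cut_in) / (v_rated - v_cut_in)) ** 3
    p[(v >= v_rated) & (v <= v_cut_out)] = p_rated
    return p.mean()

print(f"Expected wind power: {expected_wind_power():.1f} MW")
```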

  19. Heuristic algorithms for solving of the tool routing problem for CNC cutting machines

    Science.gov (United States)

    Chentsov, P. A.; Petunin, A. A.; Sesekin, A. N.; Shipacheva, E. N.; Sholohov, A. E.

    2015-11-01

    The article is devoted to the problem of minimizing the path of the cutting tool for shape cutting machines. This problem can be interpreted as a generalized traveling salesman problem. An earlier version of a dynamic programming method to solve this problem was developed. Unfortunately, this method can process no more than thirty circuits. In this regard, the task of constructing a quasi-optimal route becomes relevant. In this paper we propose several variants of quasi-optimal greedy algorithms. A comparison of the results of the exact and approximate algorithms is given.
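
    One simple quasi-optimal greedy strategy of the kind discussed above (not necessarily the authors' exact variant) is the nearest-neighbour rule: always move the cutting tool to the closest unvisited pierce point. The coordinates below are hypothetical.

```python
import math

def nearest_neighbour_route(points, start=0):
    """Greedy tour over pierce points: always jump to the closest unvisited point."""
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    route, unvisited = [start], set(range(len(points))) - {start}
    total = 0.0
    while unvisited:
        cur = route[-1]
        nxt = min(unvisited, key=lambda j: dist(points[cur], points[j]))
        total += dist(points[cur], points[nxt])
        route.append(nxt)
        unvisited.remove(nxt)
    return route, total

# Hypothetical pierce points (mm) on a sheet to be cut.
points = [(0, 0), (120, 30), (40, 80), (200, 10), (90, 150)]
print(nearest_neighbour_route(points))
```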

  20. Economic Load Dispatch of Thermal Generating Units Considering the Addition of Wind Power Generation Using the Firefly Algorithm

    Directory of Open Access Journals (Sweden)

    Ridho Syahrial Ibrahim

    2017-03-01

    Full Text Available The rise of global warming concerns and the limited availability of natural resources have led to the construction of many power plants based on renewable energy, one of which is wind power. In this paper, the firefly algorithm is applied to optimize the total generation cost of two test systems, without and with the addition of wind power. The simulation results show that adding wind power plants to the power system does not always lower the total generation cost. In addition, the simulation results also show that the firefly algorithm as an optimization method can solve the economic load dispatch (ELD) problem better than other previously applied methods, namely particle swarm optimization (PSO), the bat algorithm (BA), biogeography-based optimization (BBO) and the plant growth simulation algorithm (PGSA), with total cost savings differing by between 0.32% ($50) and 9.27% ($11,884).

  1. Evolutionary algorithms for multi-objective energetic and economic optimization in thermal system design

    International Nuclear Information System (INIS)

    Toffolo, A.; Lazzaretto, A.

    2002-01-01

    Thermoeconomic analyses in thermal system design are always focused on the economic objective. However, knowledge of only the economic minimum may not be sufficient in the decision making process, since solutions with a higher thermodynamic efficiency, in spite of small increases in total costs, may result in much more interesting designs due to changes in energy market prices or in energy policies. This paper suggests how to perform a multi-objective optimization in order to find solutions that simultaneously satisfy exergetic and economic objectives. This corresponds to a search for the set of Pareto optimal solutions with respect to the two competing objectives. The optimization process is carried out by an evolutionary algorithm that features a new diversity-preserving mechanism, using the well-known CGAM problem as a test case. (author)

  2. A Fast Inspection of Tool Electrode and Drilling Depth in EDM Drilling by Detection Line Algorithm.

    Science.gov (United States)

    Huang, Kuo-Yi

    2008-08-21

    The purpose of this study was to develop a novel measurement method using a machine vision system. Besides using image processing techniques, the proposed system employs a detection line algorithm that detects the tool electrode length and drilling depth of a workpiece accurately and effectively. Different boundaries of areas on the tool electrode are defined: a baseline between base and normal areas, a ND-line between normal and drilling areas (accumulating carbon area), and a DD-line between drilling area and dielectric fluid droplet on the electrode tip. Accordingly, image processing techniques are employed to extract a tool electrode image, and the centroid, eigenvector, and principle axis of the tool electrode are determined. The developed detection line algorithm (DLA) is then used to detect the baseline, ND-line, and DD-line along the direction of the principle axis. Finally, the tool electrode length and drilling depth of the workpiece are estimated via detected baseline, ND-line, and DD-line. Experimental results show good accuracy and efficiency in estimation of the tool electrode length and drilling depth under different conditions. Hence, this research may provide a reference for industrial application in EDM drilling measurement.

  3. A Dantzig-Wolfe Decomposition Algorithm for Linear Economic MPC of a Power Plant Portfolio

    DEFF Research Database (Denmark)

    Standardi, Laura; Edlund, Kristian; Poulsen, Niels Kjølstad

    2012-01-01

    Future power systems will consist of a large number of decentralized power producers and a large number of controllable power consumers in addition to stochastic power producers such as wind turbines and solar power plants. Control of such large scale systems requires new control algorithms. In this paper, we formulate the control of such a system as an Economic Model Predictive Control (MPC) problem. When the power producers and controllable power consumers have linear dynamics, the Economic MPC may be expressed as a linear program and we apply Dantzig-Wolfe decomposition for its solution.

  4. New algorithms for motion error detection of numerical control machine tool by laser tracking measurement on the basis of GPS principle

    Science.gov (United States)

    Wang, Jindong; Chen, Peng; Deng, Yufen; Guo, Junjie

    2018-01-01

    As a three-dimensional measuring instrument, the laser tracker is widely used in industrial measurement. To avoid the influence of angle measurement error on the overall measurement accuracy, multi-station and time-sharing measurement with a laser tracker is introduced in this paper on the basis of the global positioning system (GPS) principle. For the proposed method, how to accurately determine the coordinates of each measuring point from a large amount of measured data is a critical issue. Taking the detection of motion errors of a numerical control machine tool as an example, the corresponding measurement algorithms are investigated thoroughly. By establishing the mathematical model of machine tool motion error detection with this method, an analytical algorithm for base station calibration and measuring point determination is deduced that does not require selecting an initial iterative value in the calculation. However, when the motion area of the machine tool lies in a 2D plane, the coefficient matrix of the base station calibration is singular, which produces a distorted result. In order to overcome the limitation of the original algorithm, an improved analytical algorithm is also derived. Meanwhile, the calibration accuracy of the base station with the improved algorithm is compared with that of the original analytical algorithm and some iterative algorithms, such as the Gauss-Newton algorithm and the Levenberg-Marquardt algorithm. The experiment further verifies the feasibility and effectiveness of the improved algorithm. In addition, the different motion areas of the machine tool have a certain influence on the calibration accuracy of the base station, and the influence of measurement error on the calibration result of the base station, which depends on the condition number of the coefficient matrix, is analyzed.
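
    A sketch of the multi-station idea underlying the paper: a measuring-point position is recovered from distances to several base stations by nonlinear least squares. The station coordinates and distances are synthetic, and scipy's general-purpose solver stands in for the analytical and iterative algorithms compared in the paper.

```python
import numpy as np
from scipy.optimize import least_squares

# Synthetic base-station coordinates (m) and distance measurements to one point.
stations = np.array([[0.0, 0.0, 0.0],
                     [2.0, 0.0, 0.0],
                     [0.0, 2.0, 0.0],
                     [0.0, 0.0, 1.5]])
true_point = np.array([0.8, 0.6, 0.4])
distances = np.linalg.norm(stations - true_point, axis=1)   # ideal laser-tracker readings

def residuals(p):
    """Difference between modelled and measured station-to-point distances."""
    return np.linalg.norm(stations - p, axis=1) - distances

sol = least_squares(residuals, x0=np.zeros(3))
print(sol.x)   # should recover approximately [0.8, 0.6, 0.4]
```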

  5. A solution to the economic dispatch using EP based SA algorithm on large scale power system

    Energy Technology Data Exchange (ETDEWEB)

    Christober Asir Rajan, C. [Department of EEE, Pondicherry Engineering College, Pondicherry 605 014 (India)

    2010-07-15

    This paper develops a new approach for solving the Economic Load Dispatch (ELD) problem using an integrated algorithm based on Evolutionary Programming (EP) and Simulated Annealing (SA) on a large scale power system. Classical methods employed for solving Economic Load Dispatch are calculus-based. For generator units having quadratic fuel cost functions, the classical techniques ignore or flatten out portions of the incremental fuel cost curves and so may have difficulties in determining the global optimum solution for non-differentiable fuel cost functions. To overcome these problems, the intelligent techniques, namely Evolutionary Programming and Simulated Annealing, are employed. These optimization techniques are capable of determining global or near-global optimum dispatch solutions. The validity and effectiveness of the proposed integrated algorithm have been tested with a 66-bus Indian utility system and the IEEE 5-bus, 30-bus and 118-bus systems, and the test results are compared with the results obtained from other methods. Numerical results show that the proposed integrated algorithm can provide accurate solutions within reasonable time for any type of fuel cost function. (author)

  6. Retro-Techno-Economic Analysis: Using (Bio)Process Systems Engineering Tools to Attain Process Target Values

    DEFF Research Database (Denmark)

    Furlan, Felipe F.; Costa, Caliane B B; Secchi, Argimiro R.

    2016-01-01

    Economic analysis, allied to process systems engineering tools, can provide useful insights about process techno-economic feasibility. More interestingly, rather than being used to evaluate specific process conditions, this techno-economic analysis can be turned upside down to achieve target values...

  7. Optimization of dynamic economic dispatch with valve-point effect using chaotic sequence based differential evolution algorithms

    International Nuclear Information System (INIS)

    He Dakuo; Dong Gang; Wang Fuli; Mao Zhizhong

    2011-01-01

    A chaotic sequence based differential evolution (DE) approach for solving the dynamic economic dispatch problem (DEDP) with valve-point effect is presented in this paper. The proposed method combines the DE algorithm with a local search technique to improve the performance of the algorithm. DE is the main optimizer, while an approximated model for local search is applied to fine-tune the solution of the DE run. To accelerate convergence of DE, a series of constraint handling rules is adopted. An initial population generated using a chaotic sequence improves the performance of the proposed algorithm. The combined algorithm is validated for two test systems consisting of 10 and 13 thermal units whose incremental fuel cost functions take into account the valve-point loading effects. The proposed combined method outperforms other algorithms reported in the literature for DEDP considering valve-point effects.
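
    A minimal sketch of chaotic-sequence population initialization as mentioned above: a logistic map generates well-spread values in (0, 1) that are scaled to each decision variable's range. The map parameter, seed and generator limits are illustrative assumptions.

```python
import numpy as np

def chaotic_initial_population(pop_size, lower, upper, mu=4.0, seed=0.37):
    """Initialize a DE population with a logistic-map chaotic sequence x_{k+1} = mu*x_k*(1-x_k)."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    dim = lower.size
    pop = np.empty((pop_size, dim))
    x = seed
    for i in range(pop_size):
        for j in range(dim):
            x = mu * x * (1.0 - x)                 # chaotic iterate in (0, 1)
            pop[i, j] = lower[j] + x * (upper[j] - lower[j])
    return pop

# Illustrative 10-unit dispatch problem: generator limits in MW.
pop = chaotic_initial_population(40, lower=[50] * 10, upper=[300] * 10)
print(pop.shape, pop.min(), pop.max())
```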

  8. Tool path planning of hole-making operations in ejector plate of injection mould using modified shuffled frog leaping algorithm

    Directory of Open Access Journals (Sweden)

    Amol M. Dalavi

    2016-07-01

    Full Text Available Optimization of hole-making operations plays a vital role in the manufacturing industry. Tool travel and tool switch planning are the two major issues in hole-making operations. Many industrial applications such as moulds, dies, engine blocks, automotive parts, etc. require machining of a large number of holes. A large number of machining operations like drilling, enlargement or tapping/reaming are required to achieve the final size of each individual hole, which gives rise to a number of possible sequences for completing the hole-making operations on the part, depending upon the hole locations and the tool sequence to be followed. It is necessary to find the optimal sequence of operations which minimizes the total processing cost of hole-making operations. Therefore, in this work an attempt is made to reduce the total processing cost of hole-making operations by applying a relatively new optimization algorithm known as the shuffled frog leaping algorithm, and a proposed modified shuffled frog leaping algorithm, for the determination of the optimal sequence of hole-making operations. An industrial application example of the ejector plate of an injection mould is considered in this work to demonstrate the proposed approach. The results obtained by the shuffled frog leaping algorithm and the proposed modified shuffled frog leaping algorithm are compared with each other. It is seen from the obtained results that the results of the proposed modified shuffled frog leaping algorithm are superior to those obtained using the shuffled frog leaping algorithm.

  9. Socio-economic analysis: a tool for assessing the potential of nanotechnologies

    International Nuclear Information System (INIS)

    Brignon, Jean-Marc

    2011-01-01

    Cost-Benefit Analysis (CBA) has a long history, especially in the USA, of being used for the assessment of new regulation, new infrastructure and more recently for new technologies. Under the denomination of Socio-Economic Analysis (SEA), this concept is used in EU safety and environmental regulation, especially for the placing of chemicals on the market (REACh regulation) and the operation of industrial installations (Industrial Emissions Directive). As far as REACh and other EU legislation apply specifically to nanomaterials in the future, SEA might become an important assessment tool for nanotechnologies. The most important asset of SEA regarding nanomaterials is the comparison with alternatives in socio-economic scenarios, which is key for the understanding of how a nanomaterial 'socially' performs in comparison with its alternatives. 'Industrial economics' methods should be introduced in SEAs to make industry and the regulator share common concepts and visions about economic competitiveness implications of regulating nanotechnologies. SEA and Life Cycle Analysis (LCA) can complement each other: Socio-Economic LCA are increasingly seen as a complete assessment tool for nanotechnologies, but the perspectives of Social LCA and SEA are different and the respective merits and limitations of both approaches should be kept in mind. SEA is a 'pragmatic regulatory impact analysis' that uses a cost/benefit framework analysis but remains open to other disciplines than economics, and open to the participation of stakeholders for the construction of scenarios of the deployment of technologies and the identification of alternatives. SEA is 'pragmatic' in the sense that it is driven by the purpose to assess 'what happens' with the introduction of nanotechnology, and uses methodologies such as Life Cycle Analysis only as far as they really contribute to that goal. We think that, being pragmatic, SEA is also adaptive, which is a key quality to handle the novelty of

  10. Socio-economic analysis: a tool for assessing the potential of nanotechnologies

    Science.gov (United States)

    Brignon, Jean-Marc

    2011-07-01

    Cost-Benefit Analysis (CBA) has a long history, especially in the USA, of being used for the assessment of new regulation, new infrastructure and more recently for new technologies. Under the denomination of Socio-Economic Analysis (SEA), this concept is used in EU safety and environmental regulation, especially for the placing of chemicals on the market (REACh regulation) and the operation of industrial installations (Industrial Emissions Directive). As far as REACh and other EU legislation apply specifically to nanomaterials in the future, SEA might become an important assessment tool for nanotechnologies. The most important asset of SEA regarding nanomaterials is the comparison with alternatives in socio-economic scenarios, which is key for the understanding of how a nanomaterial "socially" performs in comparison with its alternatives. "Industrial economics" methods should be introduced in SEAs to make industry and the regulator share common concepts and visions about economic competitiveness implications of regulating nanotechnologies. SEA and Life Cycle Analysis (LCA) can complement each other: Socio-Economic LCA are increasingly seen as a complete assessment tool for nanotechnologies, but the perspectives of Social LCA and SEA are different and the respective merits and limitations of both approaches should be kept in mind. SEA is a "pragmatic regulatory impact analysis" that uses a cost/benefit framework analysis but remains open to other disciplines than economics, and open to the participation of stakeholders for the construction of scenarios of the deployment of technologies and the identification of alternatives. SEA is "pragmatic" in the sense that it is driven by the purpose to assess "what happens" with the introduction of nanotechnology, and uses methodologies such as Life Cycle Analysis only as far as they really contribute to that goal. We think that, being pragmatic, SEA is also adaptive, which is a key quality to handle the novelty of

  11. Towards a New Approach of the Economic Intelligence Process: Basic Concepts, Analysis Methods and Informational Tools

    Directory of Open Access Journals (Sweden)

    Sorin Briciu

    2009-04-01

    Full Text Available One of the obvious trends in the current business environment is increased competition. In this context, organizations are becoming more and more aware of the importance of knowledge as a key factor in obtaining competitive advantage. A possible solution in knowledge management is Economic Intelligence (EI), which involves the collection, evaluation, processing, analysis, and dissemination of economic data (about products, clients, competitors, etc.) inside organizations. The availability of massive quantities of data, correlated with advances in information and communication technology allowing for the filtering and processing of these data, provides new tools for the production of economic intelligence. The research is focused on innovative aspects of the economic intelligence process (models of analysis, activities, methods and informational tools) and provides practical guidelines for initiating this process. In this paper, we try: (a) to contribute to a coherent view of the economic intelligence process (approaches, stages, fields of application); (b) to describe the most important models of analysis related to this process; (c) to analyze the activities, methods and tools associated with each stage of an EI process.

  12. The Use of Economic Impact Studies as a Service Learning Tool in Undergraduate Business Programs

    Science.gov (United States)

    Misner, John M.

    2004-01-01

    This paper examines the use of community based economic impact studies as service learning tools for undergraduate business programs. Economic impact studies are used to measure the economic benefits of a variety of activities such as community redevelopment, tourism, and expansions of existing facilities for both private and public producers.…

  13. Economic Analysis on Value Chain of Taxi Fleet with Battery-Swapping Mode Using Multiobjective Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Guobao Ning

    2012-01-01

    Full Text Available This paper presents an economic analysis model of the value chain of a taxi fleet with battery-swapping mode in a pilot city. In the model, the economic benefits of the charging-swapping station group, the taxi company, and the taxi drivers in the region have been taken into consideration. Thus, the model is a multiobjective function, and a multiobjective genetic algorithm is used to solve this problem. On real data collected from the pilot city, the multiobjective genetic algorithm is shown to be an effective method for solving this problem. Furthermore, the effects of the price of electricity, the price of the battery package, the life cycle of the battery package, the cost of battery-swapping devices and infrastructure, and the driving mileage per day on the benefits of the value holders are analyzed, which provides a theoretical and practical reference for the deployment of electric vehicles, national subsidy criteria adjustment, technological innovation guidance, commercial mode selection, and infrastructure construction.

  14. Process planning optimization on turning machine tool using a hybrid genetic algorithm with local search approach

    Directory of Open Access Journals (Sweden)

    Yuliang Su

    2015-04-01

    Full Text Available A turning machine tool is a new type of machine tool equipped with more than one spindle and turret. The distinctive simultaneous and parallel processing abilities of a turning machine tool increase the complexity of process planning. The operations must not only be sequenced to satisfy precedence constraints, but also be scheduled with multiple objectives such as minimizing machining cost, maximizing utilization of the turning machine tool, and so on. To solve this problem, a hybrid genetic algorithm is proposed to generate optimal process plans based on a mixed 0-1 integer programming model. An operation precedence graph is used to represent precedence constraints and to help generate a feasible initial population for the hybrid genetic algorithm. An encoding strategy based on a data structure was developed to represent process plans digitally in order to form the solution space. In addition, a local search approach for optimizing the assignments of available turrets is added to incorporate scheduling into process planning. A real-world case is used to show that the proposed approach can avoid infeasible solutions and effectively generate a globally optimal process plan.
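
    A small sketch of how a precedence-feasible initial operation sequence can be drawn from an operation precedence graph, as described above: operations are emitted in random order but only after all of their predecessors. The example operations and precedence relations are hypothetical.

```python
import random

def random_feasible_sequence(precedence, rng=random.Random(0)):
    """Random operation sequence respecting a precedence graph {op: set(of required predecessors)}."""
    remaining = {op: set(pred) for op, pred in precedence.items()}
    sequence = []
    while remaining:
        ready = [op for op, pred in remaining.items() if not pred]   # operations with no pending predecessors
        op = rng.choice(ready)
        sequence.append(op)
        del remaining[op]
        for pred in remaining.values():
            pred.discard(op)
    return sequence

# Hypothetical turning operations: rough turn before finish turn, drill before tap, etc.
precedence = {"rough_turn": set(), "finish_turn": {"rough_turn"},
              "drill": {"rough_turn"}, "tap": {"drill"}, "groove": {"finish_turn"}}
print(random_feasible_sequence(precedence))
```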

  15. Special Section on "Tools and Algorithms for the Construction and Analysis of Systems"

    DEFF Research Database (Denmark)

    2006-01-01

    This special section contains the revised and expanded versions of eight of the papers from the 10th International Conference on Tools and Algorithms for the Construction and Analysis of Systems (TACAS) held in March/April 2004 in Barcelona, Spain. The conference proceedings appeared as volume 2988 in the Lecture Notes in Computer Science series published by Springer. TACAS is a forum for researchers, developers and users interested in rigorously based tools for the construction and analysis of systems. The conference serves to bridge the gaps between different communities – including but not limited to those devoted to formal methods, software and hardware verification, static analysis, programming languages, software engineering, real-time systems, and communications protocols – that share common interests in, and techniques for, tool development. Other more theoretical papers from the conference...

  16. The West Midlands breast cancer screening status algorithm - methodology and use as an audit tool.

    Science.gov (United States)

    Lawrence, Gill; Kearins, Olive; O'Sullivan, Emma; Tappenden, Nancy; Wallis, Matthew; Walton, Jackie

    2005-01-01

    To illustrate the ability of the West Midlands breast screening status algorithm to assign a screening status to women with malignant breast cancer, and its uses as a quality assurance and audit tool. Breast cancers diagnosed between the introduction of the National Health Service [NHS] Breast Screening Programme and 31 March 2001 were obtained from the West Midlands Cancer Intelligence Unit (WMCIU). Screen-detected tumours were identified via breast screening units, and the remaining cancers were assigned to one of eight screening status categories. Multiple primaries and recurrences were excluded. A screening status was assigned to 14,680 women (96% of the cohort examined), 110 cancers were not registered at the WMCIU and the cohort included 120 screen-detected recurrences. The West Midlands breast screening status algorithm is a robust simple tool which can be used to derive data to evaluate the efficacy and impact of the NHS Breast Screening Programme.

  17. Multidisciplinary Design, Analysis, and Optimization Tool Development Using a Genetic Algorithm

    Science.gov (United States)

    Pak, Chan-gi; Li, Wesley

    2009-01-01

    Multidisciplinary design, analysis, and optimization using a genetic algorithm is being developed at the National Aeronautics and Space Administration Dryden Flight Research Center (Edwards, California) to automate the analysis and design process by leveraging existing tools to enable true multidisciplinary optimization in the preliminary design stage of subsonic, transonic, supersonic, and hypersonic aircraft. This is a promising technology, but faces many challenges in large-scale, real-world application. This report describes current approaches, recent results, and challenges for multidisciplinary design, analysis, and optimization as demonstrated by experience with the Ikhana fire pod design.

  18. Towards an Advanced Modelling of Complex Economic Phenomena Pretopological and Topological Uncertainty Research Tools

    CERN Document Server

    Aluja, Jaime Gil

    2012-01-01

    Little by little we are being provided with an arsenal of operative instruments of a non-numerical nature, in the shape of models and algorithms, capable of providing answers to the “aggressions” which our economics and management systems must withstand, coming from an environment full of turmoil. In the work we are presenting, we dare to propose a set of elements from which we hope will arise approaches capable of renewing those structures of economic thought which are upheld by the geometrical idea. The concepts of pretopology and topology, habitually marginalized in economics and management studies, have centred our interest in recent times. We consider that it is not possible today to conceive formal structures capable of representing the Darwinian concept of economic behaviour without recourse to this fundamental generalisation of metric spaces. In our attempts to find a solid base for the structures proposed for the treatment of economic phenomena, we have frequently resorted to the theory ...

  19. Comparison between dynamic programming and genetic algorithm for hydro unit economic load dispatch

    Directory of Open Access Journals (Sweden)

    Bin Xu

    2014-10-01

    Full Text Available The hydro unit economic load dispatch (ELD) is of great importance in energy conservation and emission reduction. Dynamic programming (DP) and genetic algorithm (GA) are two representative algorithms for solving ELD problems. The goal of this study was to examine the performance of DP and GA when applied to ELD. We established numerical experiments to conduct performance comparisons between DP and GA with two given schemes. The schemes included comparing the CPU time of the algorithms when they had the same solution quality, and comparing the solution quality when they had the same CPU time. The numerical experiments were applied to the Three Gorges Reservoir in China, which is equipped with 26 hydro generation units. Through the experiments we found the relation between the performance of the algorithms and the number of units. Results show that GA is adept at searching for optimal solutions in low-dimensional cases. In some cases, such as with a number of units of less than 10, GA's performance is superior to that of a coarse-grid DP. However, GA loses its superiority in high-dimensional cases. DP is powerful in obtaining stable and high-quality solutions. Its performance can be maintained even while searching over a large solution space. Nevertheless, due to its exhaustive enumerating nature, it costs excessive time in low-dimensional cases.
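
    A coarse-grid DP sketch in the spirit of the comparison above: the total load is discretized and allocated unit by unit so that cumulative cost is minimized, then the allocation is traced back. The unit limits and quadratic cost coefficients are illustrative, not Three Gorges data.

```python
import math

def dp_dispatch(units, demand, step=5):
    """Coarse-grid dynamic programming for economic load dispatch.
    units: list of (p_min, p_max, cost_fn); demand and step in MW."""
    n_states = demand // step + 1
    best = [math.inf] * n_states          # best[d]: min cost of supplying d*step MW with the units so far
    best[0], choice = 0.0, []
    for p_min, p_max, cost in units:
        new_best = [math.inf] * n_states
        pick = [0] * n_states             # power picked for this unit when reaching each state
        for d in range(n_states):
            if best[d] == math.inf:
                continue
            for p in range(p_min, p_max + 1, step):
                nd = d + p // step
                if nd < n_states and best[d] + cost(p) < new_best[nd]:
                    new_best[nd], pick[nd] = best[d] + cost(p), p
        best, choice = new_best, choice + [pick]
    # Trace back the optimal allocation for the full demand.
    alloc, d = [], n_states - 1
    for pick in reversed(choice):
        alloc.append(pick[d])
        d -= pick[d] // step
    return best[-1], list(reversed(alloc))

# Three illustrative units with quadratic cost curves (MW limits, cost in $/h).
units = [(50, 200, lambda p: 0.004 * p * p + 6 * p + 100),
         (40, 150, lambda p: 0.006 * p * p + 5 * p + 80),
         (30, 120, lambda p: 0.009 * p * p + 4 * p + 60)]
print(dp_dispatch(units, demand=300))
```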

  20. ADVISHE: A new tool to report validation of health-economic decision models

    NARCIS (Netherlands)

    Vemer, P.; Corro Ramos, I.; Van Voorn, G.; Al, M.J.; Feenstra, T.L.

    2014-01-01

    Background: Modelers and reimbursement decision makers could both profit from a more systematic reporting of the efforts to validate health-economic (HE) models. Objectives: Development of a tool to systematically report validation efforts of HE decision models and their outcomes. Methods: A gross

  1. Value chain analysis of CO2 storage by using the Ecco tool: Storage economics

    NARCIS (Netherlands)

    Loeve, D.; Bos, C.; Chitu, A.; Loveseth, S.; Wahl, P.E.; Coussy, P.; Eickhoff, C.

    2013-01-01

    The ECCO Tool [1, 2] has been developed in the “ECCO – European value chain for CO2” project [3]. ECCO was a collaborating project under the 7th framework programme for research of the EU. The ECCO Tool is a software program designed to evaluate quantitatively the post-tax economics of Carbon

  2. ProBiS tools (algorithm, database, and web servers) for predicting and modeling of biologically interesting proteins.

    Science.gov (United States)

    Konc, Janez; Janežič, Dušanka

    2017-09-01

    ProBiS (Protein Binding Sites) Tools consist of algorithm, database, and web servers for prediction of binding sites and protein ligands based on the detection of structurally similar binding sites in the Protein Data Bank. In this article, we review the operations that ProBiS Tools perform, provide comments on the evolution of the tools, and give some implementation details. We review some of its applications to biologically interesting proteins. ProBiS Tools are freely available at http://probis.cmm.ki.si and http://probis.nih.gov. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Integration of life cycle assessment software with tools for economic and sustainability analyses and process simulation for sustainable process design

    DEFF Research Database (Denmark)

    Kalakul, Sawitree; Malakul, Pomthong; Siemanond, Kitipat

    2014-01-01

    The sustainable future of the world challenges engineers to develop chemical process designs that are not only technically and economically feasible but also environmentally friendly. Life cycle assessment (LCA) is a tool for identifying and quantifying environmental impacts of the chemical product and/or the process that makes it. It can be used in conjunction with process simulation and economic analysis tools to evaluate the design of any existing and/or new chemical-biochemical process and to propose improvement options in order to arrive at the best design among various alternatives. In this work, LCA software is integrated with other process design tools such as sustainable design (SustainPro), economic analysis (ECON) and process simulation. The software framework contains four main tools: Tool-I is for life cycle inventory (LCI) knowledge management that enables easy maintenance and future expansion of the LCI database; Tool...

  4. Use of a quality improvement tool, the prioritization matrix, to identify and prioritize triage software algorithm enhancement.

    Science.gov (United States)

    North, Frederick; Varkey, Prathiba; Caraballo, Pedro; Vsetecka, Darlene; Bartel, Greg

    2007-10-11

    Complex decision support software can require significant effort in maintenance and enhancement. A quality improvement tool, the prioritization matrix, was successfully used to guide software enhancement of algorithms in a symptom assessment call center.

  5. Hybrid SOA-SQP algorithm for dynamic economic dispatch with valve-point effects

    Energy Technology Data Exchange (ETDEWEB)

    Sivasubramani, S.; Swarup, K.S. [Department of Electrical Engineering, Indian Institute of Technology Madras, Chennai 600036 (India)

    2010-12-15

    This paper proposes a hybrid technique combining a new heuristic algorithm named the seeker optimization algorithm (SOA) and the sequential quadratic programming (SQP) method for solving the dynamic economic dispatch problem with valve-point effects. The SOA is based on the concept of simulating the act of human searching, where the search direction is based on the empirical gradient (EG) obtained by evaluating the response to position changes, and the step length is based on uncertainty reasoning using a simple fuzzy rule. In this paper, SOA is used as a base-level search, which can give a good direction toward the global optimal region, and SQP as a local search to fine-tune the solution obtained from SOA. Thus SQP guides SOA to find an optimal or near-optimal solution in the complex search space. Two test systems, i.e., a 5-unit system with losses and a 10-unit system without losses, have been used to validate the efficiency of the proposed hybrid method. Simulation results clearly show that the proposed method outperforms the existing method in terms of solution quality. (author)
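
    The hybrid idea above (heuristic global search followed by gradient-based local refinement) can be approximated with off-the-shelf tools; the minimal sketch below uses scipy's SLSQP solver as the local SQP stage on an illustrative 3-unit quadratic-cost dispatch problem, with the starting point standing in for a solution returned by the global heuristic.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative 3-unit dispatch: quadratic fuel costs a*p^2 + b*p + c, demand 400 MW.
a = np.array([0.004, 0.006, 0.009])
b = np.array([6.0, 5.0, 4.0])
c = np.array([100.0, 80.0, 60.0])
bounds = [(100, 300), (80, 250), (50, 150)]
demand = 400.0

cost = lambda p: float(np.sum(a * p**2 + b * p + c))
power_balance = {"type": "eq", "fun": lambda p: np.sum(p) - demand}

# x0 would come from the global heuristic (SOA in the paper); here it is a rough guess.
x0 = np.array([200.0, 120.0, 80.0])
res = minimize(cost, x0, method="SLSQP", bounds=bounds, constraints=[power_balance])
print(res.x, res.fun)
```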

  6. A simulator-independent optimization tool based on genetic algorithm applied to nuclear reactor design

    International Nuclear Information System (INIS)

    Abreu Pereira, Claudio Marcio Nascimento do; Schirru, Roberto; Martinez, Aquilino Senra

    1999-01-01

    An engineering optimization tool based on a genetic algorithm is presented, implemented according to the method proposed in recent work that has demonstrated the feasibility of using this technique in nuclear reactor core designs. The tool is simulator-independent in the sense that it can be customized to use most simulators that read input parameters from formatted text files and write their outputs to a text file. As nuclear reactor simulators generally use this kind of interface, the proposed tool plays an important role in nuclear reactor design. Research reactors may often use non-conventional design approaches, causing different situations that may lead the nuclear engineer to face new optimization problems. In this case, a good optimization technique, together with its customizing facility and a friendly man-machine interface, could be very useful. Here, the tool is described and some advantages are outlined. (author)
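
    A sketch of the simulator-independent coupling described above: the optimizer writes candidate parameters into a templated input file, runs the external simulator, and parses a figure of merit back from its text output. The file names, template placeholder syntax, simulator command and output pattern are all hypothetical.

```python
import re
import subprocess
from pathlib import Path

def evaluate_candidate(params, template="reactor.tmpl", workdir="run"):
    """Fitness evaluation of one candidate via a text-file-driven simulator (hypothetical names)."""
    Path(workdir).mkdir(exist_ok=True)
    text = Path(template).read_text()
    for name, value in params.items():
        text = text.replace(f"@{name}@", f"{value:.6g}")      # fill placeholders like @enrichment@
    infile = Path(workdir, "case.inp")
    infile.write_text(text)
    subprocess.run(["simulator", infile.name], cwd=workdir, check=True)
    out = Path(workdir, "case.out").read_text()
    match = re.search(r"k-effective\s*=\s*([\d.Ee+-]+)", out)  # hypothetical output line
    return float(match.group(1))

# The genetic algorithm would call this as its fitness function, e.g.:
# fitness = evaluate_candidate({"enrichment": 3.2, "pitch": 1.26})
```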

  7. The Index of Sustainable Economic Welfare (ISEW) as a tool in the sustainabledevelopment – Poland case

    NARCIS (Netherlands)

    Swiatkowska, Marta

    2008-01-01

    The research is based on the index of sustainable economic welfare (ISEW) as a tool in sustainable development. The new index was developed in answer to the growing number of critiques of the GDP indicator, which measures only the economic activity of a

  8. A Replacement Algorithm for Capital Items that Depreciate with Time

    International Nuclear Information System (INIS)

    Wweru, R.M

    1999-01-01

    The replacement algorithm is centred on the prediction of the replacement cost and the determination of the most economical replacement policy. For items whose efficiency depreciates over their life spans, e.g. machine tools, vehicles, etc., the prediction of costs involves those factors which contribute to increased operating cost, forced idle time, increased scrap, increased repair cost, etc. The alternative to the increased cost of operating aging equipment is the cost of replacing the old equipment with a new one. There is some age at which the replacement of the old equipment is more economical than continuing to run it at the increased operating cost (Johnson R D, Siskin B R, 1989). This algorithm uses certain cost relationships that are vital in the minimization of total costs, and is focused on capital equipment that depreciates with time as opposed to items with a probabilistic life span.
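
    A minimal numeric sketch of the policy such an algorithm seeks: given a purchase price, operating costs that rise with age and resale values that fall, the most economical replacement age minimizes the average total cost per year of ownership. All figures are illustrative.

```python
def best_replacement_age(purchase_price, annual_operating_costs, resale_values):
    """Return the replacement age (years) minimizing the average cost per year of keeping the item."""
    best_age, best_avg = None, float("inf")
    cumulative_operating = 0.0
    for year in range(1, len(annual_operating_costs) + 1):
        cumulative_operating += annual_operating_costs[year - 1]
        total = purchase_price - resale_values[year - 1] + cumulative_operating
        avg = total / year
        if avg < best_avg:
            best_age, best_avg = year, avg
    return best_age, best_avg

# Illustrative machine tool: $20,000 new, operating cost rises with age, resale value falls.
operating = [2000, 2500, 3200, 4200, 5600, 7400]
resale = [14000, 10000, 7500, 5500, 4000, 3000]
print(best_replacement_age(20000, operating, resale))   # (4, 6600.0) for these figures
```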

  9. Genetic algorithms

    Science.gov (United States)

    Wang, Lui; Bayer, Steven E.

    1991-01-01

    Genetic algorithms are mathematical, highly parallel, adaptive search procedures (i.e., problem solving methods) based loosely on the processes of natural genetics and Darwinian survival of the fittest. Basic genetic algorithms concepts are introduced, genetic algorithm applications are introduced, and results are presented from a project to develop a software tool that will enable the widespread use of genetic algorithm technology.
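
    A bare-bones genetic algorithm sketch (tournament selection, one-point crossover, bit-flip mutation) maximizing a toy one-max fitness function; it only illustrates the basic concepts this record introduces, not the software tool it describes.

```python
import random

def genetic_algorithm(fitness, n_bits=20, pop_size=40, generations=100,
                      p_cross=0.9, p_mut=0.02, rng=random.Random(0)):
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = [(fitness(ind), ind) for ind in pop]
        def tournament():                      # pick the fitter of two random individuals
            a, b = rng.sample(scored, 2)
            return (a if a[0] >= b[0] else b)[1]
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            if rng.random() < p_cross:          # one-point crossover
                cut = rng.randrange(1, n_bits)
                p1, p2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            # bit-flip mutation on both children
            nxt += [[1 - g if rng.random() < p_mut else g for g in child] for child in (p1, p2)]
        pop = nxt[:pop_size]
    return max(pop, key=fitness)

# Toy "one-max" problem: fitness is simply the number of 1 bits.
best = genetic_algorithm(fitness=sum)
print(best, sum(best))
```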

  10. Combined heat and power economic dispatch by a fish school search algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Leonardo Trigueiro dos; Costa e Silva, Marsil de Athayde [Undergraduate in Mechatronics Engineering, Pontifical Catholic University of Parana, Curitiba, PR (Brazil); Coelho, Leandro dos Santos [Industrial and Systems Engineering Graduate Program, PPGEPS, Pontifical Catholic University of Parana, Curitiba, PR (Brazil)], e-mail: leandro.coelho@pucpr.br

    2010-07-01

    The conversion of primary fossil fuels, such as coal and gas, to electricity is a relatively inefficient process. Even the most modern combined cycle plants can only achieve efficiencies of between 50 and 60%. A great portion of the energy wasted in this conversion process is released to the environment as waste heat. The principle of combined heat and power, also known as cogeneration, is to recover and make beneficial use of this heat, significantly raising the overall efficiency of the conversion process. However, the optimal utilization of multiple combined heat and power systems is a complicated problem which needs powerful methods to solve. This paper presents a fish school search (FSS) algorithm to solve the combined heat and power economic dispatch problem. FSS is a novel approach recently proposed to perform search in complex optimization problems. Some simulations presented in the literature indicated that FSS can outperform many bio-inspired algorithms, mainly on multimodal functions. The search process in FSS is carried out by a population of limited-memory individuals - the fishes. Each fish represents a possible solution to the problem. Similarly to particle swarm optimization or genetic algorithms, search guidance in FSS is driven by the success of some individual members of the population. A recently proposed four-unit system, which is a benchmark case in the power systems field, is used as a case study in this paper. (author)

  11. Tool path in torus tool CNC machining

    Directory of Open Access Journals (Sweden)

    XU Ying

    2016-10-01

    Full Text Available This paper is about tool paths in torus tool CNC machining. The mathematical model of the torus tool is established. The tool path planning algorithm is determined through calculation of the cutter location, boundary discretization, calculation of adjacent tool paths and so on; according to the conversion formula, the cutter contact points are converted to cutter location points and these points are then fitted to a tool path. Lastly, the path planning algorithm is implemented in Matlab: the cutter location points for the torus tool are calculated and the points are fitted to a tool path. Using UG software, another tool path for the same free surface is simulated from the same data. A comparison of the two tool paths shows that using the torus tool is more efficient.
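
    A sketch of the commonly used cutter-contact to cutter-location conversion for a torus tool, assuming a vertical tool axis and contact on the outer lower quadrant of the torus; this is the textbook relation, not necessarily the exact conversion formula of the paper, and the example point is hypothetical.

```python
import numpy as np

def torus_cc_to_cl(cc, normal, tool_radius, corner_radius, axis=(0.0, 0.0, 1.0)):
    """Cutter-contact (CC) point to cutter-location (CL) point (tool tip on the axis)
    for a toroidal cutter. Assumes the part surface normal points toward the tool
    and is not parallel to the tool axis."""
    n = np.asarray(normal, float) / np.linalg.norm(normal)
    a = np.asarray(axis, float) / np.linalg.norm(axis)
    rc = tool_radius - corner_radius                     # radius of the torus centre circle
    centre = np.asarray(cc, float) + corner_radius * n   # centre of the contacting corner sphere
    horiz = n - np.dot(n, a) * a                         # horizontal part of n points toward the tool axis
    u = horiz / np.linalg.norm(horiz)
    return centre + rc * u - corner_radius * a           # move to the axis, then down to the tip

# Hypothetical contact point on a 45-degree surface; 10 mm tool radius, 2 mm corner radius.
cl = torus_cc_to_cl(cc=[30.0, 15.0, 5.0], normal=[0.0, 0.7071, 0.7071],
                    tool_radius=10.0, corner_radius=2.0)
print(cl)   # approximately [30.0, 24.41, 4.41]
```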

  12. Multi-objective optimum design of fast tool servo based on improved differential evolution algorithm

    International Nuclear Information System (INIS)

    Zhu, Zhiwei; Zhou, Xiaoqin; Liu, Qiang; Zhao, Shaoxin

    2011-01-01

    The flexure-based mechanism is a promising realization of a fast tool servo (FTS), and the optimum determination of flexure hinge parameters is one of the most important elements in FTS design. This paper presents a multi-objective optimization approach to optimizing the dimension and position parameters of the flexure-based mechanism, based on an improved differential evolution algorithm embedding chaos and a nonlinear simulated annealing algorithm. The results of the optimum design show that the proposed algorithm has excellent performance, and a well-balanced compromise is made between the two conflicting objectives, the stroke and the natural frequency of the FTS mechanism. Validation tests based on finite element analysis (FEA) show good agreement with the results obtained by using the proposed theoretical algorithm of this paper. Finally, a series of experimental tests is conducted to validate the design process and assess the performance of the FTS mechanism. The designed FTS reaches a stroke of up to 10.25 μm with at least 2 kHz bandwidth. Both the FEA and the experimental results demonstrate that the parameters of the flexure-based mechanism determined by the proposed approach can achieve the specified performance, and that the proposed approach is suitable for the optimum design of high-performance FTS mechanisms.

  13. IT Tools and their Use in Strategy Creation in Respect of Economic Results of a Company

    Directory of Open Access Journals (Sweden)

    Ladislav Pálka

    2016-01-01

    Full Text Available Purpose of the article: The article analyzes the current state of information technology in terms of its use in the creation of a company's strategy in relation to monitoring the company's economic results. It investigates, identifies and evaluates the overall situation of the concept and principles of these tools, their effectiveness in drawing up the strategy and strategic company goals, the ability to perform a variety of economic analyses without the need for complex operation and understanding, and also the effective evaluation of data for planning support, management and decision-making, leading to the overall success of a company. The reason for this monitoring is the considerable difference between strategic company planning and its real results. Methodology/methods: In terms of methodology, a literature review of the current state of the issue has been used. – Primary: interviews, observations, expert estimation. – Secondary: evaluation of data from the IS database, documentation of seminars. – Quantitative research: mapping the orientation of the issue, confrontation with the theory. – Qualitative research: projective, structured interviews (with users and suppliers). Scientific aim: The main aim of the work is to solve the problems of management and evaluation of the economic process with respect to information technology tools in connection with the formation of corporate strategy and the monitoring of the financial results of the company. The reason for selecting the above-mentioned issue is the fact that information technology resources are currently not used in the creation of corporate strategy, specifically in the area of economic goals. Findings: To describe the situation in the region and to clearly define the basic problems used as a basis for the use of IT support tools in the creation of corporate strategy, namely economic goals and the use of feedback from information support tools for assessing

  14. Short-term economic environmental hydrothermal scheduling using improved multi-objective gravitational search algorithm

    International Nuclear Information System (INIS)

    Li, Chunlong; Zhou, Jianzhong; Lu, Peng; Wang, Chao

    2015-01-01

    Highlights: • Improved multi-objective gravitational search algorithm. • An elite archive set is proposed to guide the evolutionary process. • Neighborhood searching mechanism to improve local search ability. • Chaotic mutation is adopted to avoid premature convergence. • A feasible space method is proposed to handle hydro plant constraints. - Abstract: With growing concerns about energy and the environment, short-term economic environmental hydrothermal scheduling (SEEHS) plays an increasingly important role in power systems. Because of the two objectives and various constraints, SEEHS is a complex multi-objective optimization problem (MOOP). In order to solve the problem, we propose an improved multi-objective gravitational search algorithm (IMOGSA) in this paper. In IMOGSA, the mass of the agent is redefined by multiple objectives to make it suitable for MOOP. An elite archive set is proposed to keep Pareto optimal solutions and guide the evolutionary process. For balancing exploration and exploitation, a neighborhood searching mechanism is presented to cooperate with chaotic mutation. Moreover, a novel method based on feasible space is proposed to handle hydro plant constraints during SEEHS, and a violation adjustment method is adopted to handle the power balance constraint. To verify its effectiveness, the proposed IMOGSA is applied to a hydrothermal system in two different case studies. The simulation results show that IMOGSA has a competitive performance in SEEHS when compared with other established algorithms.

  15. Toward a synthetic economic systems modeling tool for sustainable exploitation of ecosystems.

    Science.gov (United States)

    Richardson, Colin; Courvisanos, Jerry; Crawford, John W

    2011-02-01

    Environmental resources that underpin the basic human needs of water, energy, and food are predicted to become in such short supply by 2050 that global security and the well-being of millions will be under threat. These natural commodities have been allowed to reach crisis levels of supply because of a failure of economic systems to sustain them. This is largely because there have been no means of integrating their exploitation into any economic model that effectively addresses ecological systemic failures in a way that provides an integrated ecological-economic tool that can monitor and evaluate market and policy targets. We review the reasons for this and recent attempts to address the problem while identifying outstanding issues. The key elements of a policy-oriented economic model that integrates ecosystem processes are described and form the basis of a proposed new synthesis approach. The approach is illustrated by an indicative case study that develops a simple model for rainfed and irrigated food production in the Murray-Darling basin of southeastern Australia. © 2011 New York Academy of Sciences.

  16. Artificial Immune Systems as a Modern Tool for Solving Multi-Purpose Optimization Tasks in the Field of Logistics

    Directory of Open Access Journals (Sweden)

    Skitsko Volodymyr I.

    2017-03-01

    Full Text Available The article investigates various aspects of the functioning of artificial immune systems and their use in solving different tasks. The analysis of the studied literature showed that combinations of artificial immune systems, in particular with genetic algorithms, the particle swarm optimization method, artificial neural networks, etc., are nowadays used to solve different tasks. However, little attention is paid to solving economic tasks. The article presents the basic terminology of artificial immune systems; the steps of the clonal selection algorithm are described, and a brief description of the negative selection algorithm, the immune network algorithm and the dendritic algorithm is given; conceptual aspects of the use of an artificial immune system for solving multi-purpose optimization problems are formulated, and an example of solving a problem in the field of logistics is described. Artificial immune systems as a means of solving various weakly structured, multi-criteria and multi-purpose economic tasks, in particular in the sphere of logistics, are a promising tool that requires further research. Therefore, it is advisable in the future to focus on the use of the various existing immune algorithms for solving various economic problems.
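
    A compact sketch of the clonal selection idea mentioned above: the best antibodies are cloned in proportion to their affinity and mutated in inverse proportion to it, here applied to a toy logistics-flavoured cost function. All parameter choices are illustrative.

```python
import random

def clonal_selection(cost, dim, bounds, pop_size=20, n_select=5, clone_factor=3,
                     generations=100, rng=random.Random(0)):
    """CLONALG-style minimization of `cost` over a box-bounded real vector."""
    lo, hi = bounds
    new = lambda: [rng.uniform(lo, hi) for _ in range(dim)]
    pop = [new() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        clones = []
        for rank, antibody in enumerate(pop[:n_select]):
            n_clones = clone_factor * (n_select - rank)          # more clones for better antibodies
            step = (hi - lo) * 0.1 * (rank + 1) / n_select       # worse antibodies mutate more
            for _ in range(n_clones):
                clones.append([min(hi, max(lo, x + rng.gauss(0, step))) for x in antibody])
        pool = pop[:n_select] + clones + [new() for _ in range(3)]  # keep elites, add fresh antibodies
        pool.sort(key=cost)
        pop = pool[:pop_size]
    return pop[0], cost(pop[0])

# Toy logistics-flavoured objective: squared deviation from a target allocation.
target = [3.0, -1.0, 2.0, 0.5]
cost = lambda x: sum((xi - ti) ** 2 for xi, ti in zip(x, target))
print(clonal_selection(cost, dim=4, bounds=(-5.0, 5.0)))
```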

  17. Effects of visualization on algorithm comprehension

    Science.gov (United States)

    Mulvey, Matthew

    Computer science students are expected to learn and apply a variety of core algorithms which are an essential part of the field. Any one of these algorithms by itself is not necessarily extremely complex, but remembering the large variety of algorithms and the differences between them is challenging. To address this challenge, we present a novel algorithm visualization tool designed to enhance students' understanding of Dijkstra's algorithm by allowing them to discover the rules of the algorithm for themselves. It is hoped that a deeper understanding of the algorithm will help students correctly select, adapt and apply the appropriate algorithm when presented with a problem to solve, and that what is learned here will be applicable to the design of other visualization tools designed to teach different algorithms. Our visualization tool is currently in the prototype stage, and this thesis will discuss the pedagogical approach that informs its design, as well as the results of some initial usability testing. Finally, to clarify the direction for further development of the tool, four different variations of the prototype were implemented, and the instructional effectiveness of each was assessed by having a small sample of participants use the different versions of the prototype and then take a quiz to assess their comprehension of the algorithm.
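    Since the visualization targets Dijkstra's algorithm, a standard textbook implementation (not the thesis prototype) helps make explicit the rules the tool wants students to discover.

```python
import heapq

def dijkstra(graph, source):
    """Dijkstra's shortest-path algorithm on a weighted digraph.

    graph: dict mapping node -> list of (neighbour, non-negative weight) pairs.
    Returns a dict of shortest distances from `source`.
    """
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale queue entry, node already settled
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Example: dijkstra({"a": [("b", 2), ("c", 5)], "b": [("c", 1)], "c": []}, "a")
# returns {"a": 0, "b": 2, "c": 3}
```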

  18. Genetic algorithms and Monte Carlo simulation for optimal plant design

    International Nuclear Information System (INIS)

    Cantoni, M.; Marseguerra, M.; Zio, E.

    2000-01-01

    We present an approach to optimal plant design (choice of system layout and components) under conflicting safety and economic constraints, based upon the coupling of a Monte Carlo evaluation of plant operation with a Genetic Algorithms maximization procedure. The Monte Carlo simulation model provides a flexible tool, which enables one to describe relevant aspects of plant design and operation, such as standby modes and deteriorating repairs, not easily captured by analytical models. The effects of deteriorating repairs are described by means of a modified Brown-Proschan model of imperfect repair which accounts for the possibility of an increased proneness to failure of a component after a repair. The transitions of a component from standby to active, and vice versa, are simulated using a multiplicative correlation model. The genetic algorithm procedure is employed to optimize a profit function which accounts for the plant safety and economic performance and which is evaluated, for each possible design, by the above Monte Carlo simulation. In order to avoid an overwhelming use of computer time, for each potential solution proposed by the genetic algorithm we perform only a few hundred Monte Carlo histories and then exploit the fact that, during the evolution of the genetic algorithm population, the fit chromosomes appear repeatedly many times, so that the results for the solutions of interest (i.e. the best ones) attain statistical significance.
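    One way to picture the coupling described above, under the assumption of a placeholder plant model, is to let the genetic algorithm's fitness function run only a few Monte Carlo histories per call and accumulate the statistics of chromosomes that reappear across generations; the sketch below is illustrative only and all names are invented.

```python
import random
from collections import defaultdict

# Accumulated Monte Carlo statistics per chromosome: because fit designs reappear over
# the generations, their profit estimates sharpen without dedicated long runs.
stats = defaultdict(lambda: [0.0, 0])            # chromosome -> [sum of profits, n histories]

def simulate_history(design):
    """Placeholder for the plant-operation Monte Carlo model (standby, imperfect repair, ...)."""
    return sum(design) + random.gauss(0.0, 1.0)

def fitness(design, histories_per_call=200):
    """Running mean profit over all histories ever simulated for this chromosome."""
    key = tuple(design)
    for _ in range(histories_per_call):
        stats[key][0] += simulate_history(design)
        stats[key][1] += 1
    total, n = stats[key]
    return total / n
```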

  19. U.S. - GERMAN BILATERAL WORKING GROUP WORKSHOP ON: ECONOMIC TOOLS FOR SUSTAINABLE BROWNFIELDS REDEVELOPMENT

    Science.gov (United States)

    This CD-ROM contains information from a two-day workshop discussing innovative brownfields financing and economic strategies in the United States and Germany. A special emphasis was given to the identification of advantages and disadvantages of different financial tools, economi...

  20. Utilization of debate as an educational tool to learn health economics for dental students in Malaysia.

    Science.gov (United States)

    Khan, Saad A; Omar, Hanan; Babar, Muneer Gohar; Toh, Chooi G

    2012-12-01

    Health economics, a special branch of science applying economic principles to the health delivery system, is a relatively young subdiscipline. The literature is scanty about teaching health economics in the medical and dental fields. Delivery methods of this topic vary from one university to another, with lectures, seminars, and independent learning reported as teaching/learning tools used for the topic. Ideally, debates should foster the development of logical reasoning and communication skills. Health economics in dentistry is taught under the community oral health module that constitutes part of an outcome-based dental curriculum in a private dental school in Kuala Lumpur, Malaysia. For this study, the students were divided into two groups: active participants (active debaters) and supporting participants (nonactive debaters). The debate style chosen for this activity was parliamentary style. Active and nonactive debaters' perceptions were evaluated before and after the activity through a structured questionnaire using a five-point rating scale addressing the topic and perceptions about debate as an educational tool. Cronbach's alpha coefficient was used as a measure of internal consistency for the questionnaire items. Among a total of eighty-two third-year dental students of two successive cohorts (thirty-eight students and forty-four students), seventy-three completed the questionnaire, yielding a response rate of 89 percent. Students' responses to the questionnaire were analyzed with the Kruskal-Wallis analysis of variance test. Results revealed that the students felt that their interest in debate, knowledge of the topic, and reinforcement of the previous knowledge had improved following participation in the debate. Within the limitations of this study, it can be concluded that debate was a useful tool in teaching health economics to dental students.

  1. Operation management of daily economic dispatch using novel hybrid particle swarm optimization and gravitational search algorithm with hybrid mutation strategy

    Science.gov (United States)

    Wang, Yan; Huang, Song; Ji, Zhicheng

    2017-07-01

    This paper presents a hybrid particle swarm optimization and gravitational search algorithm based on a hybrid mutation strategy (HGSAPSO-M) to optimize economic dispatch (ED) including distributed generations (DGs) considering market-based energy pricing. A daily ED model was formulated and a hybrid mutation strategy was adopted in HGSAPSO-M. The hybrid mutation strategy includes two mutation operators: chaotic mutation and Gaussian mutation. The proposed algorithm was tested on the IEEE 33-bus system, and the results show that the approach is effective for this problem.

  2. Scalability of Comparative Analysis, Novel Algorithms and Tools (MICW - Metagenomics Informatics Challenges Workshop: 10K Genomes at a Time)

    Energy Technology Data Exchange (ETDEWEB)

    Mavrommatis, Kostas

    2011-10-12

    DOE JGI's Kostas Mavrommatis, chair of the Scalability of Comparative Analysis, Novel Algorithms and Tools panel, at the Metagenomics Informatics Challenges Workshop held at the DOE JGI on October 12-13, 2011.

  3. Unconventional Algorithms: Complementarity of Axiomatics and Construction

    Directory of Open Access Journals (Sweden)

    Gordana Dodig Crnkovic

    2012-10-01

    Full Text Available In this paper, we analyze axiomatic and constructive issues of unconventional computations from a methodological and philosophical point of view. We explain how the new models of algorithms and unconventional computations change the algorithmic universe, making it open and allowing increased flexibility and expressive power that augment creativity. At the same time, the greater power of new types of algorithms also results in the greater complexity of the algorithmic universe, transforming it into the algorithmic multiverse and demanding new tools for its study. That is why we analyze new powerful tools brought forth by local mathematics, local logics, logical varieties and the axiomatic theory of algorithms, automata and computation. We demonstrate how these new tools allow efficient navigation in the algorithmic multiverse. Further work includes study of natural computation by unconventional algorithms and constructive approaches.

  4. An Accurate FFPA-PSR Estimator Algorithm and Tool for Software Effort Estimation

    Directory of Open Access Journals (Sweden)

    Senthil Kumar Murugesan

    2015-01-01

    Full Text Available Software companies are now keen to provide secure software together with accurate and reliable estimates of their products' development effort. Therefore, there is a need to develop a hybrid tool which provides all the necessary features. This paper proposes a hybrid estimator algorithm and model which incorporates quality metrics, a reliability factor, and a security factor into a fuzzy-based function point analysis. Initially, this method utilizes a fuzzy-based estimate to control the uncertainty in the software size with the help of a triangular fuzzy set at the early development stage. Secondly, the function point analysis is extended by the security and reliability factors in the calculation. Finally, the performance metrics are added to the effort estimation to improve accuracy. The experimentation is done with different project data sets on the hybrid tool, and the results are compared with the existing models. It shows that the proposed method not only improves the accuracy but also increases the reliability, as well as the security, of the product.
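    A minimal sketch of the fuzzy sizing step, assuming a triangular fuzzy number defuzzified by its centroid and purely illustrative adjustment factors (this is not the FFPA-PSR tool itself):

```python
def defuzzify_triangular(low, mode, high):
    """Centroid defuzzification of a triangular fuzzy number (low <= mode <= high)."""
    return (low + mode + high) / 3.0

def effort_estimate(fp_low, fp_mode, fp_high,
                    reliability=1.0, security=1.0, hours_per_fp=8.0):
    """Toy fuzzy function-point effort estimate.

    The fuzzy size captures early-stage uncertainty; the reliability and security
    multipliers stand in for the extended adjustment factors described in the record.
    """
    size = defuzzify_triangular(fp_low, fp_mode, fp_high)   # crisp function points
    return size * reliability * security * hours_per_fp     # effort in person-hours

# e.g. effort_estimate(180, 220, 300, reliability=1.1, security=1.05) is roughly 2156 person-hours
```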

  5. An interactive economic GIS tool for Europe using map objects for Java

    Science.gov (United States)

    Srinivasan, Vaishnavi

    Europe is one of the world's seven continents and comprises approximately 50 countries, all rich in culture, traditions, economy and biodiversity, among other things. This thesis focuses on creating a GIS application about Europe which gives an overview of the continent in various aspects. It covers 50 countries, including information on financial centers, currencies, population, GDP growth, private banks, central banks, stock exchanges, and the coat of arms and flag of each country, accessible through the HotLink Tool. A reference link is also provided for a more detailed treatment of the above-mentioned aspects. The other part of the thesis focuses mainly on the economics of the European Union as well as of each country independently, which gives a thorough picture of the current investment climate in Europe. Part of this idea is to ensure transparency after the financial crisis of 2008. Further, the capital markets of the European Union and other European countries are examined to provide a clear picture of their present financial situation. The application can help improve policy and decision making, foreign investment and the business environment for various development organizations, so this GIS application will be an effective tool for users to understand investment risks by learning about the economic conditions of Europe.

  6. Model-Based Fault Diagnosis Techniques Design Schemes, Algorithms and Tools

    CERN Document Server

    Ding, Steven X

    2013-01-01

    Guaranteeing a high system performance over a wide operating range is an important issue surrounding the design of automatic control systems with successively increasing complexity. As a key technology in the search for a solution, advanced fault detection and identification (FDI) is receiving considerable attention. This book introduces basic model-based FDI schemes, advanced analysis and design algorithms, and mathematical and control-theoretic tools. This second edition of Model-Based Fault Diagnosis Techniques contains: new material on fault isolation and identification, and fault detection in feedback control loops; extended and revised treatment of systematic threshold determination for systems with both deterministic unknown inputs and stochastic noises; addition of the continuously-stirred tank heater as a representative process-industrial benchmark; and enhanced discussion of residual evaluation in stochastic processes. Model-based Fault Diagno...

  7. Developing an eye-tracking algorithm as a potential tool for early diagnosis of autism spectrum disorder in children.

    Directory of Open Access Journals (Sweden)

    Natalia I Vargas-Cuentas

    Full Text Available Autism spectrum disorder (ASD) currently affects nearly 1 in 160 children worldwide. In over two-thirds of evaluations, no validated diagnostics are used, and gold standard diagnostic tools are used in less than 5% of evaluations. Currently, the diagnosis of ASD requires lengthy and expensive tests, in addition to clinical confirmation. Therefore, fast, cheap, portable, and easy-to-administer screening instruments for ASD are required. Several studies have shown that children with ASD have a lower preference for social scenes compared with children without ASD. Based on this, eye-tracking and measurement of gaze preference for social scenes have been used as a screening tool for ASD. Currently available eye-tracking software requires intensive calibration, training, or holding of the head to prevent interference with gaze recognition, limiting its use in children with ASD. In this study, we designed a simple eye-tracking algorithm that does not require calibration or head holding, as a platform for future validation of a cost-effective potential ASD screening instrument. This system operates on a portable and inexpensive tablet to measure gaze preference of children for social compared to abstract scenes. A child watches a one-minute stimulus video composed of a social scene projected on the left side and an abstract scene projected on the right side of the tablet's screen. We designed five stimulus videos by changing the social/abstract scenes. Every child observed all five videos in random order. We developed an eye-tracking algorithm that calculates the child's gaze preference for the social and abstract scenes, estimated as the percentage of the accumulated time that the child observes the left or right side of the screen, respectively. Twenty-three children without a prior history of ASD and 8 children with a clinical diagnosis of ASD were evaluated. The recorded video of the child's eye movement was analyzed both manually by an observer
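    The gaze-preference measure described here reduces to counting which half of the screen each valid gaze sample falls on; a toy sketch with hypothetical names follows.

```python
def gaze_preference(gaze_x, screen_width):
    """Fraction of valid gaze samples on the left (social) vs right (abstract) half.

    gaze_x: iterable of horizontal gaze estimates in pixels, None for undetected frames.
    """
    left = right = 0
    for x in gaze_x:
        if x is None:
            continue                      # frame where the eye could not be tracked
        if x < screen_width / 2:
            left += 1
        else:
            right += 1
    total = left + right
    if total == 0:
        return None, None
    return left / total, right / total    # e.g. (0.68, 0.32) indicates preference for the social scene
```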

  8. Modified Cuckoo Search Algorithm for Solving Nonconvex Economic Load Dispatch Problems

    Directory of Open Access Journals (Sweden)

    Thang Trung Nguyen

    2016-01-01

    Full Text Available This paper presents the application of a modified cuckoo search algorithm (MCSA) for solving economic load dispatch (ELD) problems. The MCSA method is developed to improve the search ability and solution quality of the conventional CSA method. In the MCSA, the evaluation of eggs divides the initial eggs into two groups: the top group with good quality and the abandoned group with worse quality. Moreover, the updated step size in MCSA is adapted when generating new solutions for the abandoned group and the top group via Lévy flights, so that a large zone is searched at the beginning and a local zone is foraged as the maximum number of iterations is approached. The MCSA method has been tested on different systems with different characteristics of thermal units and constraints. The comparison with other methods in the literature indicates that the MCSA method can be a powerful method for solving the ELD problem.
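    The Lévy-flight step at the heart of cuckoo search is commonly drawn with Mantegna's algorithm; the sketch below is a generic illustration rather than the MCSA implementation, and the adaptive step-size schedule of the paper is only represented by the step_size argument.

```python
import math
import numpy as np

def levy_step(dim, beta=1.5):
    """Draw one Lévy-flight step using Mantegna's algorithm (stability index beta in (1, 2])."""
    sigma_u = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2) /
               (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = np.random.normal(0.0, sigma_u, dim)
    v = np.random.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def new_nest(nest, best, step_size):
    """Cuckoo-search style move of one nest relative to the current best via a Lévy flight.

    step_size plays the role of the adaptive step discussed in the record: larger early on
    for global search, smaller near the iteration limit for local refinement.
    """
    return nest + step_size * levy_step(nest.size) * (nest - best)
```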

  9. Introduction to Evolutionary Algorithms

    CERN Document Server

    Yu, Xinjie

    2010-01-01

    Evolutionary algorithms (EAs) are becoming increasingly attractive for researchers from various disciplines, such as operations research, computer science, industrial engineering, electrical engineering, social science, economics, etc. This book presents an insightful, comprehensive, and up-to-date treatment of EAs, such as genetic algorithms, differential evolution, evolution strategy, constraint optimization, multimodal optimization, multiobjective optimization, combinatorial optimization, evolvable hardware, estimation of distribution algorithms, ant colony optimization, particle swarm opti

  10. Zombie algorithms: a timesaving remote sensing systems engineering tool

    Science.gov (United States)

    Ardanuy, Philip E.; Powell, Dylan C.; Marley, Stephen

    2008-08-01

    In modern horror fiction, zombies are generally undead corpses brought back from the dead by supernatural or scientific means, and are rarely under anyone's direct control. They typically have very limited intelligence, and hunger for the flesh of the living [1]. Typical spectroradiometric or hyperspectral instruments provide calibrated radiances for a number of remote sensing algorithms. The algorithms typically must meet specified latency and availability requirements while yielding products at the required quality. These systems, whether research, operational, or a hybrid, are typically cost constrained. Complexity of the algorithms can be high, and may evolve and mature over time as sensor characterization changes, product validation occurs, and areas of scientific basis improvement are identified and completed. This suggests the need for a systems engineering process for algorithm maintenance that is agile, cost efficient, repeatable, and predictable. Experience on remote sensing science data systems suggests the benefits of "plug-n-play" concepts of operation. The concept, while intuitively simple, can be challenging to implement in practice. The use of zombie algorithms (empty shells that outwardly resemble the form, fit, and function of a "complete" algorithm without the implemented theoretical basis) gives ground systems advantages equivalent to those obtained by integrating sensor engineering models onto the spacecraft bus. Combined with a mature, repeatable process for incorporating the theoretical basis, or scientific core, into the "head" of the zombie algorithm, along with associated scripting and registration, this provides an easy "on ramp" for the rapid and low-risk integration of scientific applications into operational systems.

  11. Efficient RNA structure comparison algorithms.

    Science.gov (United States)

    Arslan, Abdullah N; Anandan, Jithendar; Fry, Eric; Monschke, Keith; Ganneboina, Nitin; Bowerman, Jason

    2017-12-01

    The recently proposed relative addressing-based ([Formula: see text]) RNA secondary structure representation has important features by which an RNA structure database can be stored in a suffix array. A fast substructure search algorithm has been proposed based on binary search on this suffix array. Using this substructure search algorithm, we present a fast algorithm that finds the largest common substructure of given multiple RNA structures in [Formula: see text] format. The multiple RNA structure comparison problem is NP-hard in its general formulation. We introduce a new problem for comparing multiple RNA structures with a stricter similarity definition and objective, and we propose an algorithm that solves this problem efficiently. We also develop another comparison algorithm that iteratively calls this algorithm to locate nonoverlapping large common substructures in the compared RNAs. With the resulting tools, we improved the RNASSAC website (linked from http://faculty.tamuc.edu/aarslan ). This website now also includes two drawing tools: one specialized for preparing RNA substructures that can be used as input by the search tool, and another for automatically drawing the entire RNA structure from a given structure sequence.

  12. Enhanced clinical pharmacy service targeting tools: risk-predictive algorithms.

    Science.gov (United States)

    El Hajji, Feras W D; Scullin, Claire; Scott, Michael G; McElnay, James C

    2015-04-01

    This study aimed to determine the value of using a mix of clinical pharmacy data and routine hospital admission spell data in the development of predictive algorithms. Exploration of risk factors in hospitalized patients, together with the targeting strategies devised, will enable the prioritization of clinical pharmacy services to optimize patient outcomes. Predictive algorithms were developed in a number of detailed steps using a 75% sample of integrated medicines management (IMM) patients, and validated using the remaining 25%. IMM patients receive targeted clinical pharmacy input throughout their hospital stay. The algorithms were applied to the validation sample, and a predicted risk probability was generated for each patient from the coefficients. Risk thresholds for the algorithms were determined by identifying the cut-off points of the risk scores at which the algorithms would have the highest discriminative performance. Clinical pharmacy staffing levels were obtained from the pharmacy department staffing database. The numbers of previous emergency admissions and admission medicines, together with age-adjusted co-morbidity and diuretic receipt, formed a 12-month post-discharge and/or readmission risk algorithm. Age-adjusted co-morbidity proved to be the best index to predict mortality. Increased numbers of clinical pharmacy staff at ward level were correlated with a reduction in the risk-adjusted mortality index (RAMI). The algorithms created were valid in predicting the risk of in-hospital and post-discharge mortality and the risk of hospital readmission 3, 6 and 12 months post-discharge. The provision of ward-based clinical pharmacy services is a key component in reducing RAMI and enabling the full benefits of pharmacy input to patient care to be realized. © 2014 John Wiley & Sons, Ltd.
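    The risk algorithms described here amount to a fitted score compared against a threshold chosen for best discrimination; the sketch below assumes a logistic form with made-up coefficient names and values, purely for illustration.

```python
import math

def readmission_risk(coefficients, intercept, patient):
    """Risk probability from a fitted logistic model (illustrative coefficients only).

    coefficients: dict of predictor name -> model coefficient
    patient:      dict of predictor name -> observed value
    """
    z = intercept + sum(coefficients[k] * patient.get(k, 0.0) for k in coefficients)
    return 1.0 / (1.0 + math.exp(-z))

def flag_for_pharmacy_review(prob, threshold=0.30):
    """Apply the risk threshold chosen at the cut-off with best discriminative performance."""
    return prob >= threshold

# Example with invented numbers:
# p = readmission_risk({"prev_emergency_admissions": 0.4, "admission_medicines": 0.1,
#                       "age_adjusted_comorbidity": 0.3, "diuretic": 0.5}, -3.0,
#                      {"prev_emergency_admissions": 2, "admission_medicines": 8,
#                       "age_adjusted_comorbidity": 3, "diuretic": 1})
```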

  13. A constriction factor based particle swarm optimisation algorithm to solve the economic dispatch problem including losses

    Energy Technology Data Exchange (ETDEWEB)

    Young, Steven; Montakhab, Mohammad; Nouri, Hassan

    2011-07-15

    Economic dispatch (ED) is one of the most important problems to be solved in power generation, as fractional percentage fuel reductions represent significant cost savings. ED seeks to optimise the power generated by each generating unit in a system in order to find the minimum operating cost at a required load demand, whilst ensuring both equality and inequality constraints are met. For the process of optimisation, a model must be created for each generating unit. The particle swarm optimisation technique is an evolutionary computation technique and one of the most powerful methods for solving global optimisation problems. The aim of this paper is to add a constriction factor to the particle swarm optimisation algorithm (CFBPSO). Results show that the algorithm is very good at solving the ED problem; for CFBPSO to work in a practical environment, valve point effects and transmission losses should be included in future work.
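    The constriction factor referred to in this record is usually the Clerc-Kennedy coefficient; a minimal sketch of the corresponding velocity update (the standard formulation, not necessarily the paper's exact variant) is given below.

```python
import math
import numpy as np

def constriction_factor(c1=2.05, c2=2.05):
    """Clerc-Kennedy constriction coefficient; with c1 = c2 = 2.05 this gives about 0.729."""
    phi = c1 + c2                                   # must satisfy phi > 4
    return 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))

def pso_velocity(v, x, pbest, gbest, c1=2.05, c2=2.05):
    """Velocity update of a constriction-factor PSO (CFBPSO-style sketch)."""
    chi = constriction_factor(c1, c2)
    r1, r2 = np.random.rand(*x.shape), np.random.rand(*x.shape)
    return chi * (v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x))
```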

  14. Environmental/economic dispatch problem of power system by using an enhanced multi-objective differential evolution algorithm

    International Nuclear Information System (INIS)

    Lu Youlin; Zhou Jianzhong; Qin Hui; Wang Ying; Zhang Yongchuan

    2011-01-01

    An enhanced multi-objective differential evolution algorithm (EMODE) is proposed in this paper to solve the environmental/economic dispatch (EED) problem by simultaneously considering the minimization of fuel cost and emission effects. In the proposed algorithm, an elitist archive technique is adopted to retain the non-dominated solutions obtained during the evolutionary process, and the operators of DE are modified according to the characteristics of multi-objective optimization problems. Moreover, in order to avoid premature convergence, a local random search (LRS) operator is integrated with the proposed method to improve the convergence performance. In view of the difficulties of handling the complicated constraints of the EED problem, a new heuristic constraint handling method without any penalty factor settings is presented. The feasibility and effectiveness of the proposed EMODE method are demonstrated for a test power system. Compared with other methods, EMODE can obtain higher quality solutions, reducing the fuel cost and the emission effects simultaneously.
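    The elitist archive technique mentioned above can be illustrated with a small non-dominated archive update; the sketch below is generic Pareto bookkeeping, not EMODE itself.

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation of all objectives)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, candidate):
    """Insert a (solution, objectives) pair into an elitist non-dominated archive.

    The candidate is added only if no archived member dominates it; archived members
    dominated by the candidate are removed. For EED the objectives would be
    (fuel cost, emission).
    """
    _, c_obj = candidate
    if any(dominates(a_obj, c_obj) for _, a_obj in archive):
        return archive                               # candidate is dominated, discard it
    archive = [(s, o) for s, o in archive if not dominates(c_obj, o)]
    archive.append(candidate)
    return archive
```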

  15. Denni Algorithm An Enhanced Of SMS (Scan, Move and Sort) Algorithm

    Science.gov (United States)

    Aprilsyah Lubis, Denni; Salim Sitompul, Opim; Marwan; Tulus; Andri Budiman, M.

    2017-12-01

    Sorting has long been a productive area for algorithm researchers, and considerable effort has been invested in devising better sorting algorithms. For this purpose, many existing sorting algorithms have been examined in terms of the efficiency of their algorithmic complexity. Efficient sorting is important to optimize the use of other algorithms that require sorted lists to work correctly. Sorting is considered a fundamental problem in the study of algorithms for several reasons: the need to sort information is inherent in many applications; algorithms often use sorting as a key subroutine; many essential design techniques are represented in the body of sorting algorithms; and many engineering issues come to the fore when implementing sorting algorithms. Many algorithms are well known for sorting unordered lists, and one algorithm that makes the process of sorting more economical and efficient is the SMS (Scan, Move and Sort) algorithm, an enhancement of Quicksort introduced by Rami Mansi in 2010. This paper presents a new sorting algorithm called the Denni algorithm. The Denni algorithm is an enhancement of the SMS algorithm in the average and worst cases. The Denni algorithm was compared with the SMS algorithm and the results were promising.

  16. Improved algorithms and advanced features of the CAD to MC conversion tool McCad

    International Nuclear Information System (INIS)

    Lu, L.; Fischer, U.; Pereslavtsev, P.

    2014-01-01

    Highlights: •The latest improvements of the McCad conversion approach, including the decomposition and void filling algorithms, are presented. •An advanced interface for materials editing and assignment has been developed and added to the McCAD GUI. •These improvements have been tested and successfully applied to DEMO and ITER NBI (Neutral Beam Injector) applications. •The performance of the CAD model conversion process is shown to be significantly improved. -- Abstract: McCad is a geometry conversion tool developed at KIT to enable the automatic bi-directional conversion of CAD models into the Monte Carlo (MC) geometries utilized for neutronics calculations (CAD to MC) and, reversed (MC to CAD), for visualization purposes. The paper presents the latest improvements of the conversion algorithms, including improved decomposition, void filling and an advanced interface for materials editing and assignment. The new implementations and features were tested on fusion neutronics applications to the DEMO and ITER NBI (Neutral Beam Injector) models. The results demonstrate greater stability and enhanced efficiency of the McCad conversion process.

  17. Object-Oriented Economic Power Dispatch of Electrical Power System with minimum pollution using a Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    T. Bouktir

    2005-06-01

    Full Text Available This paper presents a solution of the optimal power flow (OPF) problem of an electrical power system via a real-coded genetic algorithm. The objective is to minimize the total fuel cost of generation and the environmental pollution caused by fossil-based thermal generating units, while maintaining an acceptable system performance in terms of limits on generator real and reactive power outputs, bus voltages, shunt capacitors/reactors, transformer tap settings and power flows of transmission lines. CPU time can be reduced by decomposing the optimization constraints into active constraints, which directly affect the cost function and are manipulated directly by the GA, and passive constraints, such as generator bus voltages and transformer tap settings, which are maintained within their soft limits using a conventional constrained load flow. The algorithm was developed in an object-oriented fashion, in the C++ programming language. This choice satisfies the requirements of flexibility, extensibility, maintainability and data integrity. The economic power dispatch is applied to the IEEE 30-bus model system (6 generators, 41 lines and 20 loads). The numerical results demonstrate the effectiveness of the stochastic search algorithm, since it can provide accurate dispatch solutions within reasonable time. Further analyses indicate that this method is effective for large-scale power systems.

  18. Conceptual model of management steadfast economic development production-economic systems

    OpenAIRE

    Prokhorova, V.

    2010-01-01

    The article is devoted to the development of a conceptual model for managing the sustainable economic development of industrial-economic systems. Its features are defined, an impulse algorithm is proposed, and the interconnection of the management contours for the sustainable economic development of industrial-economic systems is investigated.

  19. Methods and tools to simulate the effect of economic instruments in complex water resources systems. Application to the Jucar river basin.

    Science.gov (United States)

    Lopez-Nicolas, Antonio; Pulido-Velazquez, Manuel

    2014-05-01

    The main challenge of the BLUEPRINT to safeguard Europe's water resources (EC, 2012) is to guarantee that enough good quality water is available for people's needs, the economy and the environment. In this sense, economic policy instruments such as water pricing policies and water markets can be applied to enhance efficient use of water. This paper presents a method based on hydro-economic tools to assess the effect of economic instruments on water resource systems. Hydro-economic models allow integrated analysis of water supply, demand and infrastructure operation at the river basin scale, by simultaneously combining engineering, hydrologic and economic aspects of water resources management. The method makes use of the simulation and optimization hydro-economic tools SIMGAMS and OPTIGAMS. The simulation tool SIMGAMS allocates water resources among the users according to priorities and operating rules, and evaluates the economic scarcity costs of the system by using economic demand functions. The model's objective function is designed so that the system aims to meet the operational targets (ranked according to priorities) in each month while following the system operating rules. The optimization tool OPTIGAMS allocates water resources based on an economic efficiency criterion: maximizing net benefits or, alternatively, minimizing the total water scarcity and operating cost of water use. SIMGAMS allows the simulation of incentive water pricing policies based on marginal resource opportunity costs (MROC; Pulido-Velazquez et al., 2013). Storage-dependent step pricing functions are derived from the time series of MROC values at a certain reservoir in the system. These water pricing policies are defined based on water availability in the system (scarcity pricing), so that when water storage is high the MROC is low, while low storage (drought periods) is associated with high MROC and therefore high prices. We also illustrate the use of OPTIGAMS to simulate the effect of ideal water
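    A storage-dependent step pricing function of the kind derived from MROC time series can be sketched as a simple lookup; all breakpoints and prices below are invented for illustration.

```python
def scarcity_price(storage_fraction, steps=((0.2, 0.30), (0.5, 0.12), (1.01, 0.02))):
    """Storage-dependent step pricing function (illustrative numbers only).

    storage_fraction: current reservoir storage as a fraction of capacity.
    steps: (upper bound of storage fraction, price in EUR/m3) pairs; low storage -> high price.
    """
    for upper, price in steps:
        if storage_fraction < upper:
            return price
    return steps[-1][1]

# scarcity_price(0.15) -> 0.30 EUR/m3 (drought); scarcity_price(0.8) -> 0.02 EUR/m3 (wet conditions)
```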

  20. Archimedean copula estimation of distribution algorithm based on artificial bee colony algorithm

    Institute of Scientific and Technical Information of China (English)

    Haidong Xu; Mingyan Jiang; Kun Xu

    2015-01-01

    The artificial bee colony (ABC) algorithm is a competitive stochastic population-based optimization algorithm. However, the ABC algorithm does not use the social information and lacks knowledge of the problem structure, which leads to insufficiency in both convergence speed and searching precision. The Archimedean copula estimation of distribution algorithm (ACEDA) is a relatively simple, time-economic and multivariate correlated EDA. This paper proposes a novel hybrid algorithm based on the ABC algorithm and ACEDA called the Archimedean copula estimation of distribution based on the artificial bee colony (ACABC) algorithm. The hybrid algorithm utilizes ACEDA to estimate the distribution model and then uses the information to help artificial bees search more efficiently in the search space. Six benchmark functions are introduced to assess the performance of the ACABC algorithm on numerical function optimization. Experimental results show that the ACABC algorithm converges much faster with greater precision compared with the ABC algorithm, ACEDA and the global best (gbest)-guided ABC (GABC) algorithm in most of the experiments.

  1. Tools of Realization of Social Responsibility of Industrial Business for Sustainable Socio-economic Development of Mining Region's Rural Territory

    Science.gov (United States)

    Jurzina, Tatyana; Egorova, Natalia; Zaruba, Natalia; Kosinskij, Peter

    2017-11-01

    Modern conditions of the Russian economy make especially relevant the questions of the social responsibility of industrial business in a mining region for the sustainable social and economic development of its rural territories; this demands a search for new strategies, tools and ways of positioning and increasing the competitiveness of the enterprises carrying out entrepreneurial activity in the territory. The article examines the problems of the influence of industrial enterprises on the territory in which they operate and substantiates the theoretical basis for forming practical tools (a mechanism) for realizing the social responsibility of business for the sustainable social and economic development of the rural territories of the mining region.

  2. Tools of Realization of Social Responsibility of Industrial Business for Sustainable Socio-economic Development of Mining Region's Rural Territory

    Directory of Open Access Journals (Sweden)

    Jurzina Tatyana

    2017-01-01

    Full Text Available Modern conditions of the Russian economy make especially relevant the questions of the social responsibility of industrial business in a mining region for the sustainable social and economic development of its rural territories; this demands a search for new strategies, tools and ways of positioning and increasing the competitiveness of the enterprises carrying out entrepreneurial activity in the territory. The article examines the problems of the influence of industrial enterprises on the territory in which they operate and substantiates the theoretical basis for forming practical tools (a mechanism) for realizing the social responsibility of business for the sustainable social and economic development of the rural territories of the mining region.

  3. Thermal-economic optimisation of a CHP gas turbine system by applying a fit-problem genetic algorithm

    Science.gov (United States)

    Ferreira, Ana C. M.; Teixeira, Senhorinha F. C. F.; Silva, Rui G.; Silva, Ângela M.

    2018-04-01

    Cogeneration allows the optimal use of primary energy sources and significant reductions in carbon emissions. Its use has great potential for applications in the residential sector. This study aims to develop a methodology for the thermal-economic optimisation of a small-scale micro-gas turbine for cogeneration purposes, able to fulfil domestic energy needs with a thermal power output of 125 kW. A constrained non-linear optimisation model was built. The objective function is the maximisation of the annual worth from the combined heat and power production, representing the balance between the annual incomes and the expenditures, subject to physical and economic constraints. A genetic algorithm coded in the Java programming language was developed. An optimal micro-gas turbine able to produce 103.5 kW of electrical power with a positive annual profit (i.e. 11,925 €/year) was identified. The investment can be recovered in 4 years and 9 months, which is less than half of the system's expected lifetime.

  4. Problem solving with genetic algorithms and Splicer

    Science.gov (United States)

    Bayer, Steven E.; Wang, Lui

    1991-01-01

    Genetic algorithms are highly parallel, adaptive search procedures (i.e., problem-solving methods) loosely based on the processes of population genetics and Darwinian survival of the fittest. Genetic algorithms have proven useful in domains where other optimization techniques perform poorly. The main purpose of the paper is to discuss a NASA-sponsored software development project to develop a general-purpose tool for using genetic algorithms. The tool, called Splicer, can be used to solve a wide variety of optimization problems and is currently available from NASA and COSMIC. This discussion is preceded by an introduction to basic genetic algorithm concepts and a discussion of genetic algorithm applications.

  5. System capacity and economic modeling computer tool for satellite mobile communications systems

    Science.gov (United States)

    Wiedeman, Robert A.; Wen, Doong; Mccracken, Albert G.

    1988-01-01

    A unique computer modeling tool that combines an engineering tool with a financial analysis program is described. The resulting combination yields a flexible economic model that can predict the cost effectiveness of various mobile systems. Cost modeling is necessary in order to ascertain if a given system with a finite satellite resource is capable of supporting itself financially and to determine what services can be supported. Personal computer techniques using Lotus 123 are used for the model in order to provide as universal an application as possible such that the model can be used and modified to fit many situations and conditions. The output of the engineering portion of the model consists of a channel capacity analysis and link calculations for several qualities of service using up to 16 types of earth terminal configurations. The outputs of the financial model are a revenue analysis, an income statement, and a cost model validation section.

  6. Pathology economic model tool: a novel approach to workflow and budget cost analysis in an anatomic pathology laboratory.

    Science.gov (United States)

    Muirhead, David; Aoun, Patricia; Powell, Michael; Juncker, Flemming; Mollerup, Jens

    2010-08-01

    The need for higher efficiency, maximum quality, and faster turnaround time is a continuous focus for anatomic pathology laboratories and drives changes in work scheduling, instrumentation, and management control systems. To determine the costs of generating routine, special, and immunohistochemical microscopic slides in a large, academic anatomic pathology laboratory using a top-down approach. The Pathology Economic Model Tool was used to analyze workflow processes at The Nebraska Medical Center's anatomic pathology laboratory. Data from the analysis were used to generate complete cost estimates, which included not only materials, consumables, and instrumentation but also specific labor and overhead components for each of the laboratory's subareas. The cost data generated by the Pathology Economic Model Tool were compared with the cost estimates generated using relative value units. Despite the use of automated systems for different processes, the workflow in the laboratory was found to be relatively labor intensive. The effect of labor and overhead on per-slide costs was significantly underestimated by traditional relative-value unit calculations when compared with the Pathology Economic Model Tool. Specific workflow defects with significant contributions to the cost per slide were identified. The cost of providing routine, special, and immunohistochemical slides may be significantly underestimated by traditional methods that rely on relative value units. Furthermore, a comprehensive analysis may identify specific workflow processes requiring improvement.

  7. Development and evaluation of an algorithm-based tool for Medication Management in nursing homes: the AMBER study protocol.

    Science.gov (United States)

    Erzkamp, Susanne; Rose, Olaf

    2018-04-20

    Residents of nursing homes are susceptible to risks from medication. Medication Reviews (MR) can increase clinical outcomes and the quality of medication therapy. Limited resources and barriers between healthcare practitioners are potential obstructions to performing MR in nursing homes. Focusing on frequent and relevant problems can support pharmacists in the provision of pharmaceutical care services. This study aims to develop and evaluate an algorithm-based tool that facilitates the provision of Medication Management in clinical practice. This study is subdivided into three phases. In phase I, semistructured interviews with healthcare practitioners and patients will be performed, and a mixed methods approach will be chosen. Qualitative content analysis and the rating of the aspects concerning the frequency and relevance of problems in the medication process in nursing homes will be performed. In phase II, a systematic review of the current literature on problems and interventions will be conducted. The findings will be narratively presented. The results of both phases will be combined to develop an algorithm for MRs. For further refinement of the aspects detected, a Delphi survey will be conducted. In conclusion, a tool for clinical practice will be created. In phase III, the tool will be tested on MRs in nursing homes. In addition, effectiveness, acceptance, feasibility and reproducibility will be assessed. The primary outcome of phase III will be the reduction of drug-related problems (DRPs), which will be detected using the tool. The secondary outcomes will be the proportion of DRPs, the acceptance of pharmaceutical recommendations and the expenditure of time using the tool and inter-rater reliability. This study intervention is approved by the local Ethics Committee. The findings of the study will be presented at national and international scientific conferences and will be published in peer-reviewed journals. DRKS00010995. © Article author(s) (or their

  8. Explaining algorithms using metaphors

    CERN Document Server

    Forišek, Michal

    2013-01-01

    There is a significant difference between designing a new algorithm, proving its correctness, and teaching it to an audience. When teaching algorithms, the teacher's main goal should be to convey the underlying ideas and to help the students form correct mental models related to the algorithm. This process can often be facilitated by using suitable metaphors. This work provides a set of novel metaphors identified and developed as suitable tools for teaching many of the 'classic textbook' algorithms taught in undergraduate courses worldwide. Each chapter provides exercises and didactic notes fo

  9. Implementation of an Evidence-Based and Content Validated Standardized Ostomy Algorithm Tool in Home Care: A Quality Improvement Project.

    Science.gov (United States)

    Bare, Kimberly; Drain, Jerri; Timko-Progar, Monica; Stallings, Bobbie; Smith, Kimberly; Ward, Naomi; Wright, Sandra

    Many nurses have limited experience with ostomy management. We sought to provide a standardized approach to ostomy education and management to support nurses in early identification of stomal and peristomal complications, pouching problems, and provide standardized solutions for managing ostomy care in general while improving utilization of formulary products. This article describes development and testing of an ostomy algorithm tool.

  10. Economics of Agroforestry

    Science.gov (United States)

    D. Evan Mercer; Frederick W. Cubbage; Gregory E. Frey

    2014-01-01

    This chapter provides principles, literature and a case study about the economics of agroforestry. We examine necessary conditions for achieving efficiency in agroforestry system design and economic analysis tools for assessing the efficiency and adoptability of agroforestry. The tools presented here (capital budgeting, linear programming, production frontier analysis...
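    Capital budgeting, the first tool listed above, rests on discounting cash flows; a toy net-present-value sketch with placeholder agroforestry figures follows.

```python
def npv(cash_flows, rate):
    """Net present value of yearly cash flows; cash_flows[0] occurs at year 0 (establishment)."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

# Toy alley-cropping example (all figures are placeholders, in $/ha):
# establishment cost in year 0, crop margins for a few years, a timber harvest at the end.
flows = [-1200, 300, 300, 250, 250, 250, 4000]
print(round(npv(flows, 0.05), 2))   # positive NPV at a 5% discount rate, so adoptable on paper
```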

  11. MATHEMATICAL METHODS AND TOOLS OF TRENDS’ RESEARCH IN THE EVOLUTIONARY DEVELOPMENT OF THE NATURAL AND ECONOMIC PROCESSES

    OpenAIRE

    Kymratova A. M.

    2015-01-01

    The present study was carried out in view of the fact that, to date, there is no more or less complete theory for predicting time series with memory. This determines the urgency and necessity of developing new mathematical methods and algorithms to detect the potential predictability of series with memory and to construct adequate predictive models. Classical methods of forecasting economic time series are based on the mathematical apparatus of econometrics. It is carr...

  12. Thermo-economic design optimization of parabolic trough solar plants for industrial process heat applications with memetic algorithms

    International Nuclear Information System (INIS)

    Silva, R.; Berenguel, M.; Pérez, M.; Fernández-Garcia, A.

    2014-01-01

    Highlights: • A thermo-economic optimization of a parabolic-trough solar plant for industrial process heat applications is developed. • An analysis of the influence of economic cost functions on optimal design point location is presented. • A multi-objective optimization approach to the design routine is proposed. • A sensitivity analysis of the optimal point location to economic, operational, and ambient conditions is developed. • Design optimization of a parabolic trough plant for a reference industrial application is developed. - Abstract: A thermo-economic design optimization of a parabolic trough solar plant for industrial processes with memetic algorithms is developed. The design domain variables considered in the optimization routine are the number of collectors in series, number of collector rows, row spacing, and storage volume. Life cycle savings, levelized cost of energy, and payback time objective functions are compared to study their influence on optimal design point location. Furthermore, a multi-objective optimization approach is proposed to analyze the design problem from a multi-criteria economic point of view. An extensive set of optimization cases is performed to estimate the influence of fuel price trend, plant location, demand profile, operating conditions, solar field orientation, and radiation uncertainty on the optimal design. The results show that thermo-economic design optimization based on short-term criteria such as payback time leads to smaller plants with higher solar field efficiencies and smaller solar fractions, while optimization criteria based on the long-term performance of the plants, such as life-cycle-savings-based optimization, lead to the reverse conclusion. The role of plant location and the future evolution of gas prices in the thermo-economic performance of the solar plant has also been analyzed. Thermo-economic optimization of a parabolic trough solar plant design for the reference industrial
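    Two of the objective functions compared in this record, the levelized cost of energy and the payback time, have standard textbook definitions; the sketch below uses those generic forms, not necessarily the paper's exact formulation.

```python
def lcoe(capex, annual_opex, annual_energy_kwh, lifetime_years, discount_rate):
    """Levelized cost of energy: discounted lifetime costs over discounted lifetime output."""
    costs = capex + sum(annual_opex / (1 + discount_rate) ** t
                        for t in range(1, lifetime_years + 1))
    energy = sum(annual_energy_kwh / (1 + discount_rate) ** t
                 for t in range(1, lifetime_years + 1))
    return costs / energy

def simple_payback(capex, annual_savings):
    """Years needed for undiscounted fuel savings to repay the solar field investment."""
    return capex / annual_savings
```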

  13. A decision algorithm for patch spraying

    DEFF Research Database (Denmark)

    Christensen, Svend; Heisel, Torben; Walter, Mette

    2003-01-01

    A method that estimates an economically optimal herbicide dose according to site-specific weed composition and density is presented in this paper. The method, termed a 'decision algorithm for patch spraying' (DAPS), was evaluated in a 5-year experiment in Denmark. DAPS consists of a competition model, a herbicide dose–response model and an algorithm that estimates the economically optimal doses. The experiment was designed to compare herbicide treatments with DAPS recommendations and the Danish decision support system PC-Plant Protection. The results did not show any significant grain yield difference...
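    A DAPS-style recommendation combines a dose-response curve with a competition model and picks the dose with the highest net return; the sketch below uses a generic logistic dose-response and a Cousens-type yield-loss curve with invented parameter values, not the published DAPS models.

```python
import numpy as np

def net_return(dose, weed_density, price_per_kg=0.15, attainable_yield=8000.0,
               herbicide_cost_per_unit=30.0, i=0.02, a=0.6, ed50=0.5, slope=2.0):
    """Toy net-return model for one field patch (all parameters are illustrative).

    Weed survival follows a logistic dose-response curve; surviving weeds reduce yield
    through a hyperbolic (Cousens-type) competition model with parameters i and a.
    """
    survival = 1.0 / (1.0 + (dose / ed50) ** slope) if dose > 0 else 1.0
    surviving = weed_density * survival
    yield_loss_frac = i * surviving / (1.0 + i * surviving / a)
    revenue = price_per_kg * attainable_yield * (1.0 - yield_loss_frac)
    return revenue - herbicide_cost_per_unit * dose

def optimal_dose(weed_density, doses=np.linspace(0.0, 1.0, 101)):
    """Grid search for the economically optimal dose on one patch."""
    returns = [net_return(d, weed_density) for d in doses]
    return doses[int(np.argmax(returns))]
```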

  14. Loading pattern optimization using ant colony algorithm

    International Nuclear Information System (INIS)

    Hoareau, Fabrice

    2008-01-01

    Electricite de France (EDF) operates 58 nuclear power plants (NPPs) of the pressurized water reactor type. The loading pattern optimization of these NPPs is currently done by EDF expert engineers. Within this framework, EDF R and D has developed automatic optimization tools that assist the experts. LOOP is an industrial tool, developed by EDF R and D and based on a simulated annealing algorithm. In order to improve the results of such automatic tools, new optimization methods have to be tested. Ant Colony Optimization (ACO) algorithms are recent methods that have given very good results on combinatorial optimization problems. In order to evaluate the performance of such methods on loading pattern optimization, direct comparisons between LOOP and a mock-up based on the Max-Min Ant System algorithm (a particular variant of ACO algorithms) were made on realistic test-cases. It is shown that the results obtained by the ACO mock-up are very similar to those of LOOP. Future research will focus on improving these encouraging results by using parallelization and by hybridizing the ACO algorithm with local search procedures. (author)
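    The Max-Min Ant System variant mentioned here differs from basic ant colony optimization mainly in its pheromone update; a generic sketch of that update (not the EDF mock-up) follows.

```python
import numpy as np

def mmas_update(tau, best_tour, best_cost, rho=0.02, tau_min=0.01, tau_max=1.0):
    """Max-Min Ant System pheromone update (generic sketch, not the LOOP/mock-up code).

    tau: pheromone matrix. Only the best tour found deposits pheromone, and all
    values are clamped to [tau_min, tau_max] to avoid premature stagnation.
    """
    tau *= (1.0 - rho)                             # evaporation on every edge
    deposit = 1.0 / best_cost
    for i, j in zip(best_tour, best_tour[1:]):
        tau[i, j] += deposit
    return np.clip(tau, tau_min, tau_max)
```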

  15. Parallel Multi-Objective Genetic Algorithm for Short-Term Economic Environmental Hydrothermal Scheduling

    Directory of Open Access Journals (Sweden)

    Zhong-Kai Feng

    2017-01-01

    Full Text Available With the increasingly serious energy crisis and environmental pollution, the short-term economic environmental hydrothermal scheduling (SEEHTS) problem is becoming more and more important in modern electrical power systems. In order to handle the SEEHTS problem efficiently, the parallel multi-objective genetic algorithm (PMOGA) is proposed in this paper. Based on the Fork/Join parallel framework, PMOGA divides the whole population of individuals into several subpopulations which evolve on different cores simultaneously. In this way, PMOGA can avoid the wastage of computational resources and increase population diversity. Moreover, a constraint handling technique is used to handle the complex constraints in SEEHTS, and a selection strategy based on constraint violation is employed to ensure convergence speed and solution feasibility. The results from a hydrothermal system in different cases indicate that PMOGA can make the most of system resources to significantly improve computing efficiency and solution quality. Moreover, PMOGA has competitive performance in SEEHTS when compared with several other methods reported in the previous literature, providing a new approach for the operation of hydrothermal systems.
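    The Fork/Join framework in the record is Java-specific; an analogous island-model sketch using Python multiprocessing (with a toy objective and invented parameters, not PMOGA) illustrates how subpopulations can evolve concurrently on separate cores.

```python
from multiprocessing import Pool
import random

def evolve_subpopulation(args):
    """Evolve one island for a number of generations (toy minimisation of sum of squares)."""
    subpop, generations = args
    for _ in range(generations):
        subpop.sort(key=lambda ind: sum(x * x for x in ind))
        parents = subpop[: len(subpop) // 2]
        children = [[x + random.gauss(0.0, 0.1) for x in random.choice(parents)]
                    for _ in range(len(subpop) - len(parents))]
        subpop = parents + children
    return subpop

if __name__ == "__main__":
    islands = [[[random.uniform(-5, 5) for _ in range(3)] for _ in range(20)] for _ in range(4)]
    with Pool(processes=4) as pool:            # one island per core, evolved concurrently
        islands = pool.map(evolve_subpopulation, [(isl, 50) for isl in islands])
    best = min((ind for isl in islands for ind in isl), key=lambda ind: sum(x * x for x in ind))
    print(best)
```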

  16. Methodology, Algorithms, and Emerging Tool for Automated Design of Intelligent Integrated Multi-Sensor Systems

    Directory of Open Access Journals (Sweden)

    Andreas König

    2009-11-01

    Full Text Available The emergence of novel sensing elements, computing nodes, wireless communication and integration technology provides unprecedented possibilities for the design and application of intelligent systems. Each new application system must be designed from scratch, employing sophisticated methods ranging from conventional signal processing to computational intelligence. Currently, a significant part of this overall algorithmic chain of the computational system model still has to be assembled manually by experienced designers in a time and labor consuming process. In this research work, this challenge is picked up and a methodology and algorithms for automated design of intelligent integrated and resource-aware multi-sensor systems employing multi-objective evolutionary computation are introduced. The proposed methodology tackles the challenge of rapid-prototyping of such systems under realization constraints and, additionally, includes features of system instance specific self-correction for sustained operation of a large volume and in a dynamically changing environment. The extension of these concepts to the reconfigurable hardware platform renders so called self-x sensor systems, which stands, e.g., for self-monitoring, -calibrating, -trimming, and -repairing/-healing systems. Selected experimental results prove the applicability and effectiveness of our proposed methodology and emerging tool. By our approach, competitive results were achieved with regard to classification accuracy, flexibility, and design speed under additional design constraints.

  17. Linear feature detection algorithm for astronomical surveys - I. Algorithm description

    Science.gov (United States)

    Bektešević, Dino; Vinković, Dejan

    2017-11-01

    Computer vision algorithms are powerful tools in astronomical image analyses, especially when automation of object detection and extraction is required. Modern object detection algorithms in astronomy are oriented towards detection of stars and galaxies, ignoring completely the detection of existing linear features. With the emergence of wide-field sky surveys, linear features attract scientific interest as possible trails of fast flybys of near-Earth asteroids and meteors. In this work, we describe a new linear feature detection algorithm designed specifically for implementation in big data astronomy. The algorithm combines a series of algorithmic steps that first remove other objects (stars and galaxies) from the image and then enhance the line to enable more efficient line detection with the Hough algorithm. The rate of false positives is greatly reduced thanks to a step that replaces possible line segments with rectangles and then compares lines fitted to the rectangles with the lines obtained directly from the image. The speed of the algorithm and its applicability in astronomical surveys are also discussed.
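    The line-enhancement steps are specific to the paper, but the final Hough stage is standard; a minimal accumulator-based Hough transform sketch (not the survey pipeline code) is given below.

```python
import numpy as np

def hough_lines(binary_image, n_theta=180):
    """Straight-line Hough transform of a binary image (1 = edge/trail pixel).

    Returns the accumulator plus the theta and rho axes; peaks in the accumulator
    correspond to candidate lines rho = x*cos(theta) + y*sin(theta).
    """
    h, w = binary_image.shape
    thetas = np.deg2rad(np.arange(0, 180, 180 / n_theta))
    diag = int(np.ceil(np.hypot(h, w)))
    rhos = np.arange(-diag, diag + 1)
    accumulator = np.zeros((len(rhos), len(thetas)), dtype=np.int64)
    ys, xs = np.nonzero(binary_image)
    for x, y in zip(xs, ys):
        # Vote for every (rho, theta) pair consistent with this pixel.
        r = np.round(x * np.cos(thetas) + y * np.sin(thetas)).astype(int) + diag
        accumulator[r, np.arange(len(thetas))] += 1
    return accumulator, thetas, rhos
```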

  18. Data and software tools for gamma radiation spectral threat detection and nuclide identification algorithm development and evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Portnoy, David; Fisher, Brian; Phifer, Daniel

    2015-06-01

    The detection of radiological and nuclear threats is extremely important to national security. The federal government is spending significant resources developing new detection systems and attempting to increase the performance of existing ones. The detection of illicit radionuclides that may pose a radiological or nuclear threat is a challenging problem complicated by benign radiation sources (e.g., cat litter and medical treatments), shielding, and large variations in background radiation. Although there is a growing acceptance within the community that concentrating efforts on algorithm development (independent of the specifics of fully assembled systems) has the potential for significant overall system performance gains, there are two major hindrances to advancements in gamma spectral analysis algorithms under the current paradigm: access to data and common performance metrics along with baseline performance measures. Because many of the signatures collected during performance measurement campaigns are classified, dissemination to algorithm developers is extremely limited. This leaves developers no choice but to collect their own data if they are lucky enough to have access to material and sensors. This is often combined with their own definition of metrics for measuring performance. These two conditions make it all but impossible for developers and external reviewers to make meaningful comparisons between algorithms. Without meaningful comparisons, performance advancements become very hard to achieve and (more importantly) recognize. The objective of this work is to overcome these obstacles by developing and freely distributing real and synthetically generated gamma-spectra data sets as well as software tools for performance evaluation with associated performance baselines to national labs, academic institutions, government agencies, and industry. At present, datasets for two tracks, or application domains, have been developed: one that includes temporal

  19. Data and software tools for gamma radiation spectral threat detection and nuclide identification algorithm development and evaluation

    International Nuclear Information System (INIS)

    Portnoy, David; Fisher, Brian; Phifer, Daniel

    2015-01-01

    The detection of radiological and nuclear threats is extremely important to national security. The federal government is spending significant resources developing new detection systems and attempting to increase the performance of existing ones. The detection of illicit radionuclides that may pose a radiological or nuclear threat is a challenging problem complicated by benign radiation sources (e.g., cat litter and medical treatments), shielding, and large variations in background radiation. Although there is a growing acceptance within the community that concentrating efforts on algorithm development (independent of the specifics of fully assembled systems) has the potential for significant overall system performance gains, there are two major hindrances to advancements in gamma spectral analysis algorithms under the current paradigm: access to data and common performance metrics along with baseline performance measures. Because many of the signatures collected during performance measurement campaigns are classified, dissemination to algorithm developers is extremely limited. This leaves developers no choice but to collect their own data if they are lucky enough to have access to material and sensors. This is often combined with their own definition of metrics for measuring performance. These two conditions make it all but impossible for developers and external reviewers to make meaningful comparisons between algorithms. Without meaningful comparisons, performance advancements become very hard to achieve and (more importantly) recognize. The objective of this work is to overcome these obstacles by developing and freely distributing real and synthetically generated gamma-spectra data sets as well as software tools for performance evaluation with associated performance baselines to national labs, academic institutions, government agencies, and industry. At present, datasets for two tracks, or application domains, have been developed: one that includes temporal

  20. Methodological Tools for the Assessment of Ecological and Socio-Economic Environment in the Region within the Limits of the Sustainability of Biosphere

    Directory of Open Access Journals (Sweden)

    Aleksey Yuryevich Davankov

    2016-12-01

Full Text Available The article is devoted to the study of the ecological and socio-economic environment and to the development of an effective methodological tool for assessing its stability. The tool makes it possible to ascertain whether the level of economic activity of a region remains within the limits of the sustainability of the biosphere. The regional system is considered as the totality of industrial enterprises, social infrastructure and natural environment that creates a specific territorial ecological and socio-economic environment, whose stability depends on the level of economic activity measured against the capacity of the territorial ecosystem. The article substantiates the use of a technique for the comparative assessment of energy indicators of economic activity, together with an indicator of the ecological capacity of the territory, which allows a better estimate of the sustainability of the region within the limits of the sustainability of the biosphere. The method also allows the development of the studied territory to be forecast by measuring the overall energy flow on the basis of closed material and energy flows. The research produced a sustainability indicator for the ecological and socio-economic environment of the Ural Federal District: the Yamalo-Nenets Autonomous District is the most stable and the Chelyabinsk region the least stable, which is associated with both natural conditions and the specificities of the economic structure. The labour productivity indicator, expressed in energy units, highlighted regions with rich natural resources; in these regions there are significant material flows in the electricity industry, which leads to a large proportion of greenhouse gas emissions. The assessment of the demographic capacity fully correlates with the calculations of the stability indicator of the regional system and the analysis of labour

  1. Twitter as a teaching tool in the Social Sciences faculties. A case study from the Economic History

    Directory of Open Access Journals (Sweden)

    Misael Arturo López Zapico

    2013-08-01

Full Text Available The increasing use of social networking among university students eases the way for teachers to use these kinds of tools towards achieving the objectives set in the European Higher Education Area. In this sense, Twitter appears as a highly versatile learning tool that fits well with the skill-based education approach, as evidenced by the literature. This paper describes the methodology, and discusses the results, of three experiments that took place during the 2011-2012 academic year at the School of Economics and Business of the University of Oviedo, in which Twitter was used to debate today’s economic crisis. The indicators obtained lead to the conclusion that microblogging services are a suitable tool not only for teaching Economic History but for teaching any of the Social Sciences.

  2. Estimation of electricity demand of Iran using two heuristic algorithms

    International Nuclear Information System (INIS)

    Amjadi, M.H.; Nezamabadi-pour, H.; Farsangi, M.M.

    2010-01-01

This paper deals with estimation of the electricity demand of Iran based on economic indicators using the Particle Swarm Optimization (PSO) algorithm. The estimation is based on Gross Domestic Product (GDP), population, number of customers and average electricity price, using two different estimation models: a linear model and a non-linear model. The models are fitted to 21 years of actual data (1980-2000) and are then used to estimate the electricity demand of the target years (2001-2006); the results are compared with the actual demand during this period. Furthermore, to validate the results obtained by PSO, a genetic algorithm (GA) is applied to the same problem. The results show that PSO is a useful optimization tool for fitting both models and can be used as an alternative approach to estimating future electricity demand.
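
    The record does not include the data or fitted coefficients, so the following is only a minimal sketch of the approach it describes: a linear demand model driven by GDP, population, number of customers and electricity price, whose weights are fitted by a simple global-best PSO minimizing the sum of squared errors. The synthetic data, bounds and PSO constants are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the 21 annual records (GDP, population, customers, price).
X = rng.uniform(0.5, 2.0, size=(21, 4))          # hypothetical, normalised indicators
true_w = np.array([1.0, 3.0, 2.0, 1.5, -0.8])    # assumed "true" coefficients
demand = true_w[0] + X @ true_w[1:] + rng.normal(0, 0.05, 21)

def sse(w):
    """Sum of squared errors of the linear demand model for weights w."""
    pred = w[0] + X @ w[1:]
    return float(np.sum((pred - demand) ** 2))

# Minimal global-best PSO.
n_particles, n_dims, iters = 30, 5, 200
pos = rng.uniform(-5, 5, size=(n_particles, n_dims))
vel = np.zeros_like(pos)
pbest, pbest_val = pos.copy(), np.array([sse(p) for p in pos])
gbest = pbest[pbest_val.argmin()].copy()

w_in, c1, c2 = 0.7, 1.5, 1.5                     # typical PSO constants (assumed)
for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, n_dims))
    vel = w_in * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    vals = np.array([sse(p) for p in pos])
    improved = vals < pbest_val
    pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
    gbest = pbest[pbest_val.argmin()].copy()

print("fitted coefficients:", np.round(gbest, 2))
print("residual SSE:", round(sse(gbest), 4))
```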

  3. Improved quantum-inspired evolutionary algorithm with diversity information applied to economic dispatch problem with prohibited operating zones

    International Nuclear Information System (INIS)

    Vianna Neto, Julio Xavier; Andrade Bernert, Diego Luis de; Santos Coelho, Leandro dos

    2011-01-01

The objective of the economic dispatch problem (EDP) of electric power generation, whose characteristics are complex and highly nonlinear, is to schedule the committed generating unit outputs so as to meet the required load demand at minimum operating cost while satisfying all unit and system equality and inequality constraints. Recently, as an alternative to the conventional mathematical approaches, modern meta-heuristic optimization techniques have received much attention from researchers due to their ability to find near-global optimal solutions to EDPs. Research on merging evolutionary computation and quantum computation has been under way since the late 1990s. Inspired by quantum computation, this paper presents an improved quantum-inspired evolutionary algorithm (IQEA) based on the diversity information of the population. A classical quantum-inspired evolutionary algorithm (QEA) and the IQEA were implemented and validated on a benchmark EDP with 15 thermal generators with prohibited operating zones. The results for the benchmark problem show that the proposed IQEA approach provides promising results when compared to various methods available in the literature.
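
    The abstract does not specify the solution encoding or the rotation-gate schedule, so the sketch below only illustrates the generic quantum-inspired EA mechanics it builds on: each individual is a vector of qubit angles, binary solutions are sampled ("observed") from those angles, and the angles are rotated a small step toward the best observed solution. The toy objective (maximizing the number of ones) and the step size are assumptions; a real dispatch application would evaluate fuel cost and constraint penalties instead.

```python
import numpy as np

rng = np.random.default_rng(1)

n_bits, pop, iters = 20, 10, 100
delta = 0.05 * np.pi                     # assumed rotation-gate step

def fitness(bits):
    # Toy stand-in objective; a real EDP would evaluate fuel cost plus penalties here.
    return bits.sum()

# Each qubit is represented by an angle theta; P(bit = 1) = sin(theta)^2.
theta = np.full((pop, n_bits), np.pi / 4)
best_bits, best_fit = None, -np.inf

for _ in range(iters):
    # "Observe" binary solutions from the qubit individuals.
    bits = (rng.random((pop, n_bits)) < np.sin(theta) ** 2).astype(int)
    fits = np.array([fitness(b) for b in bits])
    if fits.max() > best_fit:
        best_fit, best_bits = fits.max(), bits[fits.argmax()].copy()
    # Rotate every qubit a small step toward the best observation so far.
    direction = np.where(best_bits == 1, 1.0, -1.0)
    theta = np.clip(theta + delta * direction, 0.0, np.pi / 2)

print("best fitness:", best_fit, "bits:", best_bits)
```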

  4. Improved quantum-inspired evolutionary algorithm with diversity information applied to economic dispatch problem with prohibited operating zones

    Energy Technology Data Exchange (ETDEWEB)

    Vianna Neto, Julio Xavier, E-mail: julio.neto@onda.com.b [Pontifical Catholic University of Parana, PUCPR, Undergraduate Program at Mechatronics Engineering, Imaculada Conceicao, 1155, Zip code 80215-901, Curitiba, Parana (Brazil); Andrade Bernert, Diego Luis de, E-mail: dbernert@gmail.co [Pontifical Catholic University of Parana, PUCPR, Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Imaculada Conceicao, 1155, Zip code 80215-901, Curitiba, Parana (Brazil); Santos Coelho, Leandro dos, E-mail: leandro.coelho@pucpr.b [Pontifical Catholic University of Parana, PUCPR, Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Imaculada Conceicao, 1155, Zip code 80215-901, Curitiba, Parana (Brazil)

    2011-01-15

The objective of the economic dispatch problem (EDP) of electric power generation, whose characteristics are complex and highly nonlinear, is to schedule the committed generating unit outputs so as to meet the required load demand at minimum operating cost while satisfying all unit and system equality and inequality constraints. Recently, as an alternative to the conventional mathematical approaches, modern meta-heuristic optimization techniques have received much attention from researchers due to their ability to find near-global optimal solutions to EDPs. Research on merging evolutionary computation and quantum computation has been under way since the late 1990s. Inspired by quantum computation, this paper presents an improved quantum-inspired evolutionary algorithm (IQEA) based on the diversity information of the population. A classical quantum-inspired evolutionary algorithm (QEA) and the IQEA were implemented and validated on a benchmark EDP with 15 thermal generators with prohibited operating zones. The results for the benchmark problem show that the proposed IQEA approach provides promising results when compared to various methods available in the literature.

  5. Improved quantum-inspired evolutionary algorithm with diversity information applied to economic dispatch problem with prohibited operating zones

    Energy Technology Data Exchange (ETDEWEB)

    Neto, Julio Xavier Vianna [Pontifical Catholic University of Parana, PUCPR, Undergraduate Program at Mechatronics Engineering, Imaculada Conceicao, 1155, Zip code 80215-901, Curitiba, Parana (Brazil); Bernert, Diego Luis de Andrade; Coelho, Leandro dos Santos [Pontifical Catholic University of Parana, PUCPR, Industrial and Systems Engineering Graduate Program, LAS/PPGEPS, Imaculada Conceicao, 1155, Zip code 80215-901, Curitiba, Parana (Brazil)

    2011-01-15

The objective of the economic dispatch problem (EDP) of electric power generation, whose characteristics are complex and highly nonlinear, is to schedule the committed generating unit outputs so as to meet the required load demand at minimum operating cost while satisfying all unit and system equality and inequality constraints. Recently, as an alternative to the conventional mathematical approaches, modern meta-heuristic optimization techniques have received much attention from researchers due to their ability to find near-global optimal solutions to EDPs. Research on merging evolutionary computation and quantum computation has been under way since the late 1990s. Inspired by quantum computation, this paper presents an improved quantum-inspired evolutionary algorithm (IQEA) based on the diversity information of the population. A classical quantum-inspired evolutionary algorithm (QEA) and the IQEA were implemented and validated on a benchmark EDP with 15 thermal generators with prohibited operating zones. The results for the benchmark problem show that the proposed IQEA approach provides promising results when compared to various methods available in the literature. (author)

  6. Physics and Algorithm Enhancements for a Validated MCNP/X Monte Carlo Simulation Tool, Phase VII

    International Nuclear Information System (INIS)

    McKinney, Gregg W.

    2012-01-01

    Currently the US lacks an end-to-end (i.e., source-to-detector) radiation transport simulation code with predictive capability for the broad range of DHS nuclear material detection applications. For example, gaps in the physics, along with inadequate analysis algorithms, make it difficult for Monte Carlo simulations to provide a comprehensive evaluation, design, and optimization of proposed interrogation systems. With the development and implementation of several key physics and algorithm enhancements, along with needed improvements in evaluated data and benchmark measurements, the MCNP/X Monte Carlo codes will provide designers, operators, and systems analysts with a validated tool for developing state-of-the-art active and passive detection systems. This project is currently in its seventh year (Phase VII). This presentation will review thirty enhancements that have been implemented in MCNPX over the last 3 years and were included in the 2011 release of version 2.7.0. These improvements include 12 physics enhancements, 4 source enhancements, 8 tally enhancements, and 6 other enhancements. Examples and results will be provided for each of these features. The presentation will also discuss the eight enhancements that will be migrated into MCNP6 over the upcoming year.

  7. Vibration suppression in cutting tools using collocated piezoelectric sensors/actuators with an adaptive control algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Radecki, Peter P [Los Alamos National Laboratory; Farinholt, Kevin M [Los Alamos National Laboratory; Park, Gyuhae [Los Alamos National Laboratory; Bement, Matthew T [Los Alamos National Laboratory

    2008-01-01

    The machining process is very important in many engineering applications. In high precision machining, surface finish is strongly correlated with vibrations and the dynamic interactions between the part and the cutting tool. Parameters affecting these vibrations and dynamic interactions, such as spindle speed, cut depth, feed rate, and the part's material properties can vary in real-time, resulting in unexpected or undesirable effects on the surface finish of the machining product. The focus of this research is the development of an improved machining process through the use of active vibration damping. The tool holder employs a high bandwidth piezoelectric actuator with an adaptive positive position feedback control algorithm for vibration and chatter suppression. In addition, instead of using external sensors, the proposed approach investigates the use of a collocated piezoelectric sensor for measuring the dynamic responses from machining processes. The performance of this method is evaluated by comparing the surface finishes obtained with active vibration control versus baseline uncontrolled cuts. Considerable improvement in surface finish (up to 50%) was observed for applications in modern day machining.

  8. Women Empowerment and Participation in Economic Activities: Indispensable Tools for Self-Reliance and Development of Nigerian Society

    Science.gov (United States)

    E. N., Ekesionye; A. N., Okolo

    2012-01-01

    The objective of the study was to examine women empowerment and participation in economic activities as tools for self-reliance and development of the Nigerian society. Research questions and hypothesis were used to guide the study. Structured questionnaire was used as the major instrument for data collection. Copies of questionnaires were…

  9. The Biobank Economic Modeling Tool (BEMT): Online Financial Planning to Facilitate Biobank Sustainability

    Science.gov (United States)

    Odeh, Hana; Miranda, Lisa; Rao, Abhi; Vaught, Jim; Greenman, Howard; McLean, Jeffrey; Reed, Daniel; Memon, Sarfraz; Fombonne, Benjamin; Guan, Ping

    2015-01-01

    Background: Biospecimens are essential resources for advancing basic and translational research. However, there are little data available regarding the costs associated with operating a biobank, and few resources to enable their long-term sustainability. To support the research community in this effort, the National Institutes of Health, National Cancer Institute's Biorepositories and Biospecimen Research Branch has developed the Biobank Economic Modeling Tool (BEMT). The tool is accessible at http://biospecimens.cancer.gov/resources/bemt.asp. Methods: To obtain market-based cost information and to inform the development of the tool, a survey was designed and sent to 423 biobank managers and directors across the world. The survey contained questions regarding infrastructure investments, salary costs, funding options, types of biospecimen resources and services offered, as well as biospecimen pricing and service-related costs. Results: A total of 106 responses were received. The data were anonymized, aggregated, and used to create a comprehensive database of cost and pricing information that was integrated into the web-based tool, the BEMT. The BEMT was built to allow the user to input cost and pricing data through a seven-step process to build a cost profile for their biobank, define direct and indirect costs, determine cost recovery fees, perform financial forecasting, and query the anonymized survey data from comparable biobanks. Conclusion: A survey was conducted to obtain a greater understanding of the costs involved in operating a biobank. The anonymized survey data was then used to develop the BEMT, a cost modeling tool for biobanks. Users of the tool will be able to create a cost profile for their biobanks' specimens, products and services, establish pricing, and allocate costs for biospecimens based on percent cost recovered, and perform project-specific cost analyses and financial forecasting. PMID:26697911

  10. The Biobank Economic Modeling Tool (BEMT): Online Financial Planning to Facilitate Biobank Sustainability.

    Science.gov (United States)

    Odeh, Hana; Miranda, Lisa; Rao, Abhi; Vaught, Jim; Greenman, Howard; McLean, Jeffrey; Reed, Daniel; Memon, Sarfraz; Fombonne, Benjamin; Guan, Ping; Moore, Helen M

    2015-12-01

    Biospecimens are essential resources for advancing basic and translational research. However, there are little data available regarding the costs associated with operating a biobank, and few resources to enable their long-term sustainability. To support the research community in this effort, the National Institutes of Health, National Cancer Institute's Biorepositories and Biospecimen Research Branch has developed the Biobank Economic Modeling Tool (BEMT). The tool is accessible at http://biospecimens.cancer.gov/resources/bemt.asp. To obtain market-based cost information and to inform the development of the tool, a survey was designed and sent to 423 biobank managers and directors across the world. The survey contained questions regarding infrastructure investments, salary costs, funding options, types of biospecimen resources and services offered, as well as biospecimen pricing and service-related costs. A total of 106 responses were received. The data were anonymized, aggregated, and used to create a comprehensive database of cost and pricing information that was integrated into the web-based tool, the BEMT. The BEMT was built to allow the user to input cost and pricing data through a seven-step process to build a cost profile for their biobank, define direct and indirect costs, determine cost recovery fees, perform financial forecasting, and query the anonymized survey data from comparable biobanks. A survey was conducted to obtain a greater understanding of the costs involved in operating a biobank. The anonymized survey data was then used to develop the BEMT, a cost modeling tool for biobanks. Users of the tool will be able to create a cost profile for their biobanks' specimens, products and services, establish pricing, and allocate costs for biospecimens based on percent cost recovered, and perform project-specific cost analyses and financial forecasting.
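
    The two BEMT records above describe the cost-profile and cost-recovery workflow only at a high level, so the following is just a toy illustration of the arithmetic involved: a fully loaded unit cost is built from direct costs plus an allocated share of indirect costs, and a fee is then set from the fraction of cost the biobank intends to recover. All figures and the recovery fraction are invented.

```python
# Toy cost-recovery calculation in the spirit of a biobank cost profile (all figures invented).
direct_costs = {            # annual direct costs attributable to specimen processing
    "staff": 180_000.0,
    "consumables": 40_000.0,
    "storage": 25_000.0,
}
indirect_costs = 60_000.0   # facilities, administration, equipment depreciation
indirect_allocation = 0.5   # share of indirect costs allocated to this product line
specimens_per_year = 8_000
cost_recovery_target = 0.6  # recover 60% of cost through fees, the rest from grants

total_cost = sum(direct_costs.values()) + indirect_allocation * indirect_costs
unit_cost = total_cost / specimens_per_year
fee_per_specimen = unit_cost * cost_recovery_target

print(f"fully loaded cost per specimen: ${unit_cost:.2f}")
print(f"fee at {cost_recovery_target:.0%} cost recovery: ${fee_per_specimen:.2f}")
```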

  11. An Effective Framework For Economic Dispatch Using Modified Harmony Search Algorithm

    Directory of Open Access Journals (Sweden)

    Advik Kumar

    2017-09-01

Full Text Available Ever-increasing wind power generation has led to high penetration of renewable energy sources in modern power systems, and solving the economic dispatch (ED) problem under these conditions requires finding the optimal allocation of output power between wind turbines and thermal units so as to provide greater reliability and efficiency. The dynamic nature of wind energy introduces uncertainty into the power system, so an effective probabilistic method is needed to capture this unpredictability and make the analysis more realistic. This paper presents a heuristic optimization method based on the harmony search (HS) algorithm to solve non-convex ED problems while accounting for the uncertainty introduced by wind turbines. To keep the investigation practical, the proposed probabilistic ED (PED) approach considers prohibited operating zones (POZ), system spinning reserve, ramp rate limits and multiple fuel options. The Point Estimate Method (PEM) is used within the PED model to represent the uncertainty of wind speed at the turbines. Optimal solutions are presented for various test systems, and these solutions demonstrate the cost benefits of the approach over existing ED techniques.

  12. Optimal Power Flow Using Gbest-Guided Cuckoo Search Algorithm with Feedback Control Strategy and Constraint Domination Rule

    Directory of Open Access Journals (Sweden)

    Gonggui Chen

    2017-01-01

Full Text Available Optimal power flow (OPF) is well known as a significant optimization tool for the secure and economic operation of power systems, and the OPF problem is a complex nonlinear, nondifferentiable programming problem. This paper therefore proposes a Gbest-guided cuckoo search algorithm with a feedback control strategy and a constraint domination rule, named the FCGCS algorithm, for solving the OPF problem. The FCGCS algorithm is guided by the global best solution to strengthen its exploitation ability. The feedback control strategy dynamically regulates the control parameters according to feedback values observed during the simulation, and the constraint domination rule efficiently handles inequality constraints on state variables, which is superior to the traditional penalty function method. The performance of the FCGCS algorithm is tested and validated on the IEEE 30-bus and IEEE 57-bus example systems, and simulation results are compared with methods recently reported in the literature. The comparison indicates that FCGCS can provide high-quality feasible solutions for different OPF problems.
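
    The record gives no parameter values, so the following is only a rough sketch of the gbest-guided cuckoo search step it refers to: new candidates are generated by Lévy flights biased toward the global best, and a fraction of the worst nests is abandoned and rebuilt each generation. The test function, Lévy exponent, step scale and abandonment probability are assumptions; an OPF application would replace the toy objective with a load-flow-based cost.

```python
import numpy as np
from math import gamma

rng = np.random.default_rng(2)

def sphere(x):
    # Toy objective standing in for the OPF cost evaluation.
    return float(np.sum(x ** 2))

def levy_step(dim, beta=1.5):
    """Mantegna's algorithm for Lévy-distributed step lengths."""
    sigma = (gamma(1 + beta) * np.sin(np.pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma, dim)
    v = rng.normal(0, 1, dim)
    return u / np.abs(v) ** (1 / beta)

n_nests, dim, iters, pa = 25, 5, 200, 0.25        # pa: abandonment probability (assumed)
nests = rng.uniform(-5, 5, (n_nests, dim))
fit = np.array([sphere(n) for n in nests])
gbest = nests[fit.argmin()].copy()

for _ in range(iters):
    for i in range(n_nests):
        # Gbest-guided Lévy flight: bias the random walk toward the global best.
        cand = nests[i] + 0.01 * levy_step(dim) * (gbest - nests[i])
        if sphere(cand) < fit[i]:
            nests[i], fit[i] = cand, sphere(cand)
    # Abandon a fraction pa of nests and rebuild them randomly.
    abandon = rng.random(n_nests) < pa
    nests[abandon] = rng.uniform(-5, 5, (abandon.sum(), dim))
    fit[abandon] = [sphere(n) for n in nests[abandon]]
    gbest = nests[fit.argmin()].copy()

print("best cost:", round(fit.min(), 6))
```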

  13. Kinect as a Tool for Gait Analysis: Validation of a Real-Time Joint Extraction Algorithm Working in Side View

    Directory of Open Access Journals (Sweden)

    Enea Cippitelli

    2015-01-01

Full Text Available The Microsoft Kinect sensor has gained attention as a tool for gait analysis for several years. Despite the many advantages the sensor provides, however, the lack of a native capability to extract joints from the side view of a human body still limits the adoption of the device to a number of relevant applications. This paper presents an algorithm to locate and estimate the trajectories of up to six joints extracted from the side depth view of a human body captured by the Kinect device. The algorithm is then applied to extract data that can be exploited to provide an objective score for the “Get Up and Go Test”, which is typically adopted for gait analysis in rehabilitation fields. Starting from the depth-data stream provided by the Microsoft Kinect sensor, the proposed algorithm relies on anthropometric models only, to locate and identify the positions of the joints. Differently from machine learning approaches, this solution avoids complex computations, which usually require significant resources. The reliability of the information about the joint position output by the algorithm is evaluated by comparison to a marker-based system. Tests show that the trajectories extracted by the proposed algorithm adhere to the reference curves better than the ones obtained from the skeleton generated by the native applications provided within the Microsoft Kinect (Microsoft Corporation, Redmond, WA, USA, 2013) and OpenNI (OpenNI organization, Tel Aviv, Israel, 2013) Software Development Kits.

  14. Kinect as a Tool for Gait Analysis: Validation of a Real-Time Joint Extraction Algorithm Working in Side View

    Science.gov (United States)

    Cippitelli, Enea; Gasparrini, Samuele; Spinsante, Susanna; Gambi, Ennio

    2015-01-01

    The Microsoft Kinect sensor has gained attention as a tool for gait analysis for several years. Despite the many advantages the sensor provides, however, the lack of a native capability to extract joints from the side view of a human body still limits the adoption of the device to a number of relevant applications. This paper presents an algorithm to locate and estimate the trajectories of up to six joints extracted from the side depth view of a human body captured by the Kinect device. The algorithm is then applied to extract data that can be exploited to provide an objective score for the “Get Up and Go Test”, which is typically adopted for gait analysis in rehabilitation fields. Starting from the depth-data stream provided by the Microsoft Kinect sensor, the proposed algorithm relies on anthropometric models only, to locate and identify the positions of the joints. Differently from machine learning approaches, this solution avoids complex computations, which usually require significant resources. The reliability of the information about the joint position output by the algorithm is evaluated by comparison to a marker-based system. Tests show that the trajectories extracted by the proposed algorithm adhere to the reference curves better than the ones obtained from the skeleton generated by the native applications provided within the Microsoft Kinect (Microsoft Corporation, Redmond, WA, USA, 2013) and OpenNI (OpenNI organization, Tel Aviv, Israel, 2013) Software Development Kits. PMID:25594588

  15. Algorithms as fetish: Faith and possibility in algorithmic work

    Directory of Open Access Journals (Sweden)

    Suzanne L Thomas

    2018-01-01

    Full Text Available Algorithms are powerful because we invest in them the power to do things. With such promise, they can transform the ordinary, say snapshots along a robotic vacuum cleaner’s route, into something much more, such as a clean home. Echoing David Graeber’s revision of fetishism, we argue that this easy slip from technical capabilities to broader claims betrays not the “magic” of algorithms but rather the dynamics of their exchange. Fetishes are not indicators of false thinking, but social contracts in material form. They mediate emerging distributions of power often too nascent, too slippery or too disconcerting to directly acknowledge. Drawing primarily on 2016 ethnographic research with computer vision professionals, we show how faith in what algorithms can do shapes the social encounters and exchanges of their production. By analyzing algorithms through the lens of fetishism, we can see the social and economic investment in some people’s labor over others. We also see everyday opportunities for social creativity and change. We conclude that what is problematic about algorithms is not their fetishization but instead their stabilization into full-fledged gods and demons – the more deserving objects of critique.

  16. Confronting Decision Cliffs: Diagnostic Assessment of Multi-Objective Evolutionary Algorithms' Performance for Addressing Uncertain Environmental Thresholds

    Science.gov (United States)

    Ward, V. L.; Singh, R.; Reed, P. M.; Keller, K.

    2014-12-01

    As water resources problems typically involve several stakeholders with conflicting objectives, multi-objective evolutionary algorithms (MOEAs) are now key tools for understanding management tradeoffs. Given the growing complexity of water planning problems, it is important to establish if an algorithm can consistently perform well on a given class of problems. This knowledge allows the decision analyst to focus on eliciting and evaluating appropriate problem formulations. This study proposes a multi-objective adaptation of the classic environmental economics "Lake Problem" as a computationally simple but mathematically challenging MOEA benchmarking problem. The lake problem abstracts a fictional town on a lake which hopes to maximize its economic benefit without degrading the lake's water quality to a eutrophic (polluted) state through excessive phosphorus loading. The problem poses the challenge of maintaining economic activity while confronting the uncertainty of potentially crossing a nonlinear and potentially irreversible pollution threshold beyond which the lake is eutrophic. Objectives for optimization are maximizing economic benefit from lake pollution, maximizing water quality, maximizing the reliability of remaining below the environmental threshold, and minimizing the probability that the town will have to drastically change pollution policies in any given year. The multi-objective formulation incorporates uncertainty with a stochastic phosphorus inflow abstracting non-point source pollution. We performed comprehensive diagnostics using 6 algorithms: Borg, MOEAD, eMOEA, eNSGAII, GDE3, and NSGAII to ascertain their controllability, reliability, efficiency, and effectiveness. The lake problem abstracts elements of many current water resources and climate related management applications where there is the potential for crossing irreversible, nonlinear thresholds. We show that many modern MOEAs can fail on this test problem, indicating its suitability as a
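
    The abstract does not write out the lake dynamics, so the sketch below uses the commonly cited shallow-lake formulation as an assumed stand-in: lake phosphorus follows X_{t+1} = X_t + a_t + X_t^q / (1 + X_t^q) - b X_t plus a stochastic natural inflow, and economic benefit is the discounted sum of loadings a_t. The parameter values, inflow distribution and the fixed loading policy are illustrative assumptions, not those used in the study; an MOEA would optimize the policy against the four objectives listed above.

```python
import numpy as np

rng = np.random.default_rng(3)

# Assumed shallow-lake parameters (commonly used benchmark-style values).
b, q = 0.42, 2.0            # phosphorus decay rate and recycling exponent
alpha, delta = 0.4, 0.98    # benefit weight and discount factor
T, n_samples = 100, 500
threshold = 0.5             # assumed critical phosphorus level for these b, q

def evaluate(policy):
    """Return (mean discounted benefit, reliability of staying below the threshold)."""
    benefits, reliable = [], 0
    for _ in range(n_samples):
        X, benefit, ok = 0.0, 0.0, True
        inflow = rng.lognormal(mean=np.log(0.02), sigma=0.25, size=T)  # natural P inflow
        for t in range(T):
            a = policy[t]
            X = X + a + inflow[t] + X ** q / (1 + X ** q) - b * X
            benefit += alpha * a * delta ** t
            ok = ok and (X < threshold)
        benefits.append(benefit)
        reliable += ok
    return np.mean(benefits), reliable / n_samples

# A fixed, conservative loading policy (the decision vector an MOEA would optimize).
policy = np.full(T, 0.02)
print(evaluate(policy))
```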

  17. Forest FIRE and FIRE wood : tools for tree automata and tree algorithms

    NARCIS (Netherlands)

    Cleophas, L.G.W.A.; Piskorski, J.; Watson, B.W.; Yli-Jyrä, A.

    2009-01-01

    Pattern matching, acceptance, and parsing algorithms on node-labeled, ordered, ranked trees ('tree algorithms') are important for applications such as instruction selection and tree transformation/term rewriting. Many such algorithms have been developed. They often are based on results from such

  18. Economic evaluation of the one-hour rule-out and rule-in algorithm for acute myocardial infarction using the high-sensitivity cardiac troponin T assay in the emergency department.

    Directory of Open Access Journals (Sweden)

    Apoorva Ambavane

Full Text Available The 1-hour (1-h) algorithm triages patients presenting to the emergency department (ED) with suspected acute myocardial infarction (AMI) towards "rule-out," "rule-in," or "observation," depending on baseline and 1-h levels of high-sensitivity cardiac troponin (hs-cTn). The economic consequences of applying the accelerated 1-h algorithm are unknown. We performed a post-hoc economic analysis in a large, diagnostic, multicenter study of hs-cTnT using central adjudication of the final diagnosis by two independent cardiologists. Length of stay (LoS), resource utilization (RU), and predicted diagnostic accuracy of the 1-h algorithm compared to standard of care (SoC) in the ED were estimated. The ED LoS, RU, and accuracy of the 1-h algorithm were compared to those achieved by the SoC at ED discharge. Expert opinion was sought to characterize clinical implementation of the 1-h algorithm, which required blood draws at ED presentation and at 1 h, after which "rule-in" patients were transferred for coronary angiography, "rule-out" patients underwent outpatient stress testing, and "observation" patients received SoC. Unit costs were for the United Kingdom, Switzerland, and Germany. The sensitivity and specificity of the 1-h algorithm were 87% and 96%, respectively, compared to 69% and 98% for SoC. The mean ED LoS for the 1-h algorithm was 4.3 h versus 6.5 h for SoC, a reduction of 33%. The 1-h algorithm was associated with reductions in RU, driven largely by the shorter ED LoS for patients with a diagnosis other than AMI. The estimated total costs per patient were £2,480 for the 1-h algorithm compared to £4,561 for SoC, a reduction of up to 46%. The analysis shows that use of the 1-h algorithm is associated with a reduction in overall AMI diagnostic costs, provided it is carefully implemented in clinical practice. These results need to be prospectively validated in the future.

  19. Algorithmic tools for interpreting vital signs.

    Science.gov (United States)

    Rathbun, Melina C; Ruth-Sahd, Lisa A

    2009-07-01

    Today's complex world of nursing practice challenges nurse educators to develop teaching methods that promote critical thinking skills and foster quick problem solving in the novice nurse. Traditional pedagogies previously used in the classroom and clinical setting are no longer adequate to prepare nursing students for entry into practice. In addition, educators have expressed frustration when encouraging students to apply newly learned theoretical content to direct the care of assigned patients in the clinical setting. This article presents algorithms as an innovative teaching strategy to guide novice student nurses in the interpretation and decision making related to vital sign assessment in an acute care setting.

  20. Incorporation of a health economic modelling tool into public health commissioning: Evidence use in a politicised context.

    Science.gov (United States)

    Sanders, Tom; Grove, Amy; Salway, Sarah; Hampshaw, Susan; Goyder, Elizabeth

    2017-08-01

This paper explores how commissioners working in an English local government authority (LA) viewed a health economic decision tool for planning services in relation to diabetes. We conducted 15 interviews and 2 focus groups between July 2015 and February 2016, with commissioners (including public health managers, data analysts and council members). Two overlapping themes were identified explaining the obstacles and enablers of using such a tool in commissioning: a) evidence cultures, and b) system interdependency. The former highlighted the diverse evidence cultures present in the LA with politicians influenced by the 'soft' social care agendas affecting their local population and treating local opinion as evidence, whilst public health managers prioritised the scientific view of evidence informed by research. System interdependency further complicated the decision making process by recognizing interlinking with departments and other disease groups. To achieve legitimacy within the commissioning arena health economic modelling needs to function effectively in a highly politicised environment where decisions are made not only on the basis of research evidence, but on grounds of 'soft' data, personal opinion and intelligence. In this context decisions become politicised, with multiple opinions seeking a voice. The way that such decisions are negotiated and which ones establish authority is of importance. We analyse the data using Larson's (1990) discursive field concept to show how the tool becomes an object of research push and pull likely to be used instrumentally by stakeholders to advance specific agendas, not a means of informing complex decisions. In conclusion, LA decision making is underpinned by a transactional business ethic which is a further potential 'pull' mechanism for the incorporation of health economic modelling in local commissioning.

  1. TECHNOLOGICAL ELEMENTS OF THE SYSTEM OF STRATEGIC PLANNING AS TOOLS FOR PROVIDING THE ECONOMIC DEVELOPMENT OF THE SERVICES SPHERE

    Directory of Open Access Journals (Sweden)

    V. V. Gromov

    2015-01-01

Full Text Available The purpose of the article is to determine the composition of the technological elements of the strategic planning system, whose interaction is aimed at achieving the planned economic results while the factors of the macro- and microenvironment that influence institutions and economic activities in the services sphere keep changing. The article is structured around the logical sequence of interactions between the technological elements of strategic planning and the way they counteract negative factors of the external and internal environment. The active interaction of these technological planning elements is intended to ensure long-term development planning by the governing bodies of economic entities in the service sector and, ultimately, sustainable economic growth. The author's contribution is to generalize the definition of the target composition and orientation of the technological elements of strategic planning for the development of institutions and industry components of the service sector.

  2. Administrative and economic tools of environmental protection

    OpenAIRE

    Staničová, Anna

    2010-01-01

This diploma thesis deals with administrative and economic instruments of environmental protection, which represent the most important groups of instruments of environmental protection. Administrative and economic instruments are means and methods that affect human behavior in relation to the environment. The thesis is systematically divided into two main parts, each of which is subdivided into chapters and subchapters. The first part of the thesis is general in focus and provides an overview of ...

  3. UTV Expansion Pack: Special-Purpose Rank-Revealing Algorithms

    DEFF Research Database (Denmark)

    Fierro, Ricardo D.; Hansen, Per Christian

    2005-01-01

This collection of Matlab 7.0 software supplements and complements the package UTV Tools from 1999, and includes implementations of special-purpose rank-revealing algorithms developed since the publication of the original package. We provide algorithms for computing and modifying symmetric rank-revealing VSV decompositions, we expand the algorithms for the ULLV decomposition of a matrix pair to handle interference-type problems with a rank-deficient covariance matrix, and we provide a robust and reliable Lanczos algorithm which - despite its simplicity - is able to capture all the dominant singular values of a sparse or structured matrix. These new algorithms have applications in signal processing, optimization and LSI information retrieval.

  4. Dynamic Harmony Search with Polynomial Mutation Algorithm for Valve-Point Economic Load Dispatch

    Directory of Open Access Journals (Sweden)

    M. Karthikeyan

    2015-01-01

mutation (DHSPM) algorithm to solve the ORPD problem. In the DHSPM algorithm, the key parameters of the HS algorithm, namely the harmony memory considering rate (HMCR) and pitch adjusting rate (PAR), are changed dynamically, so there is no need to predefine these parameters. Additionally, polynomial mutation is inserted into the updating step of the HS algorithm to favor exploration and exploitation of the search space. The DHSPM algorithm is tested on three power system cases consisting of 3, 13, and 40 thermal units. The computational results show that the DHSPM algorithm is more effective at finding better solutions than other computational intelligence based methods.
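
    The record does not give the dynamic parameter schedules, so the following is only an illustrative sketch of the kind of update it describes: a basic harmony search loop in which HMCR and PAR vary linearly over the iterations instead of being fixed, with a polynomial mutation step perturbing each new harmony. The toy objective, bounds, schedules and mutation distribution index are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def cost(x):
    # Toy stand-in for a (valve-point style) fuel-cost function.
    return float(np.sum(x ** 2 + 10 * np.abs(np.sin(x))))

dim, hms, iters = 5, 20, 2000
low, high = -10.0, 10.0
memory = rng.uniform(low, high, (hms, dim))
fits = np.array([cost(h) for h in memory])

def poly_mutation(x, eta=20.0, p=0.1):
    """Polynomial mutation applied coordinate-wise with probability p."""
    y = x.copy()
    for j in range(dim):
        if rng.random() < p:
            u = rng.random()
            delta = (2 * u) ** (1 / (eta + 1)) - 1 if u < 0.5 else 1 - (2 * (1 - u)) ** (1 / (eta + 1))
            y[j] = np.clip(y[j] + delta * (high - low), low, high)
    return y

for it in range(iters):
    frac = it / iters
    hmcr = 0.70 + 0.25 * frac          # assumed dynamic schedule: 0.70 -> 0.95
    par = 0.45 - 0.25 * frac           # assumed dynamic schedule: 0.45 -> 0.20
    bw = (high - low) * 0.01
    new = np.empty(dim)
    for j in range(dim):
        if rng.random() < hmcr:
            new[j] = memory[rng.integers(hms), j]                               # memory consideration
            if rng.random() < par:
                new[j] = np.clip(new[j] + bw * rng.uniform(-1, 1), low, high)   # pitch adjustment
        else:
            new[j] = rng.uniform(low, high)                                     # random selection
    new = poly_mutation(new)
    f = cost(new)
    worst = fits.argmax()
    if f < fits[worst]:
        memory[worst], fits[worst] = new, f

print("best cost found:", round(fits.min(), 4))
```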

  5. Monitoring the Implementation of State Regulation of National Economic Security

    Directory of Open Access Journals (Sweden)

    Hubarieva Iryna O.

    2018-03-01

Full Text Available The aim of the article is to improve the methodological tools for monitoring the implementation of state regulation of national economic security. Approaches to defining the essence of the concept of “national economic security” are generalized, and assessing its level is identified as a key element in monitoring state regulation in this area. The article recommends an improved assessment methodology whose calculation algorithm includes four interrelated components (economic, political, social and spiritual) and relies on correlation and cluster analysis and taxonomy methods; this makes it possible to determine the level and disproportions of development and can serve as a basis for monitoring the implementation of state regulation of national economic security. Such an approach makes it possible to determine the place (rank) that a country occupies among a set of countries and the dynamics of its rank over time, to identify problem components, and to monitor the effectiveness of state regulation. In the course of the research it was determined that the economic sphere is the main problem component of ensuring the security of Ukraine’s economy. The analysis identified the most problematic partial indicators in the economic sphere of Ukraine: economic globalization, uneven economic development, level of infrastructure, level of financial market development, level of economic instability, and macroeconomic stability. These indicators show persistently negative dynamics and a downward trend, which calls for immediate intervention by state bodies to ensure national economic security.

  6. Simple sorting algorithm test based on CUDA

    OpenAIRE

    Meng, Hongyu; Guo, Fangjin

    2015-01-01

With the development of computing technology, CUDA has become a very important tool. In computer programming, sorting algorithms are widely used. There are many simple sorting algorithms, such as enumeration sort, bubble sort and merge sort. In this paper, we test some simple sorting algorithms implemented with CUDA and draw some useful conclusions.

  7. Early Termination of Dantzig-Wolfe Algorithm for Economic MPC

    DEFF Research Database (Denmark)

    Standardi, Laura; Sokoler, Leo Emil; Poulsen, Niels Kjølstad

    2013-01-01

In this paper we apply Economic Model Predictive Control (MPC) to balance power supply and demand in future power systems in the most economic way. The control problem is formulated as a linear program with a block-angular structure, solved by the implementation of the Dantzig

  8. Fast geometric algorithms

    International Nuclear Information System (INIS)

    Noga, M.T.

    1984-01-01

This thesis addresses a number of important problems that fall within the framework of the new discipline of Computational Geometry. The list of topics covered includes sorting and selection, convex hull algorithms, the L1 hull, determination of the minimum encasing rectangle of a set of points, the Euclidean and L1 diameter of a set of points, the metric traveling salesman problem, and finding the superrange of star-shaped and monotone polygons. The main theme of all the work was to develop a set of very fast state-of-the-art algorithms that supersede any rivals in terms of speed and ease of implementation. In some cases existing algorithms were refined; for others, new techniques were developed that add to the present database of fast adaptive geometric algorithms. What emerges is a collection of techniques that is successful at merging modern tools developed in the analysis of algorithms with those of classical geometry.

  9. Diagnostic tests and algorithms used in the investigation of haematuria: systematic reviews and economic evaluation.

    Science.gov (United States)

    Rodgers, M; Nixon, J; Hempel, S; Aho, T; Kelly, J; Neal, D; Duffy, S; Ritchie, G; Kleijnen, J; Westwood, M

    2006-06-01

    of false results, but evidence was lacking regarding the accuracy of routine microscopy and estimates were adopted for the model. The model for imaging the upper urinary tract showed that US detects more tumours than IVU at one-third of the cost, and is also associated with fewer false results. For any cause of haematuria, CT was shown to have a mean incremental cost-effectiveness ratio of pounds sterling 9939 in comparison with the next best option, US. When US is followed up with CT for negative results with persistent haematuria, it dominates the initial use of CT alone, with a saving of pounds sterling 235,000 for the evaluation of 1000 patients. The model for investigation of the lower urinary tract showed that for low-risk patients the use of immediate cystoscopy could be avoided if cystoscopy were used for follow-up patients with a negative initial test using tumour markers and/or cytology, resulting in a saving of pounds sterling 483,000 for the evaluation of 1000 patients. The clinical and economic impact on delayed detection of both upper and lower urinary tract tumours through the use of follow-up testing should be evaluated in future studies. There are insufficient data currently available to derive an evidence-based algorithm of the diagnostic pathway for haematuria. A hypothetical algorithm based on the opinion and practice of clinical experts in the review team, other published algorithms and the results of economic modelling is presented in this report. This algorithm is presented, for comparative purposes, alongside current US and UK guidelines. The ideas contained in these algorithms and the specific questions outlined should form the basis of future research. Quality assessment of the diagnostic accuracy studies included in this review highlighted several areas of deficiency.

  10. Algorithms in Singular

    Directory of Open Access Journals (Sweden)

    Hans Schonemann

    1996-12-01

Full Text Available Some algorithms for singularity theory and algebraic geometry. The use of Grobner basis computations for treating systems of polynomial equations has become an important tool in many areas. This paper introduces the concept of standard bases (a generalization of Grobner bases) and its application to some problems from algebraic geometry. The examples are presented as SINGULAR commands. A general introduction to Grobner bases can be found in the textbook [CLO], and an introduction to syzygies in [E] and [St1]. SINGULAR is a computer algebra system for computing information about singularities, for use in algebraic geometry. The basic algorithms in SINGULAR are several variants of a general standard basis algorithm for general monomial orderings (see [GG]). This includes well-orderings (the Buchberger algorithm [B1], [B2]) and tangent cone orderings (the Mora algorithm [M1], [MPT]) as special cases: it is able to work with non-homogeneous and homogeneous input and also to compute in the localization of the polynomial ring at 0. Recent versions include algorithms to factorize polynomials and a factorizing Grobner basis algorithm. For a complete description of SINGULAR see [Si].
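
    As a quick illustration of the kind of computation the paper presents as SINGULAR commands, here is a small Groebner basis calculation done in Python with SymPy rather than SINGULAR; the ideal and the monomial ordering are arbitrary illustrative choices.

```python
from sympy import groebner, symbols

x, y, z = symbols("x y z")

# An arbitrary illustrative ideal; a standard basis reveals its structure.
polys = [x**2 + y**2 + z**2 - 1, x*y - z, y*z - x]

# Groebner basis with respect to the lexicographic ordering x > y > z.
G = groebner(polys, x, y, z, order="lex")
for g in G.exprs:
    print(g)
```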

  11. Optimal Design of Passive Power Filters Based on Pseudo-parallel Genetic Algorithm

    Science.gov (United States)

    Li, Pei; Li, Hongbo; Gao, Nannan; Niu, Lin; Guo, Liangfeng; Pei, Ying; Zhang, Yanyan; Xu, Minmin; Chen, Kerui

    2017-05-01

The economic cost together with the filter efficiency is taken as the target for optimizing the parameters of the passive filter. Furthermore, a method combining a pseudo-parallel genetic algorithm with an adaptive genetic algorithm is adopted in this paper. In the early stages the pseudo-parallel genetic algorithm is introduced to increase population diversity, and the adaptive genetic algorithm is used in the late stages to reduce the workload. At the same time, the migration rate of the pseudo-parallel genetic algorithm is improved so that it changes adaptively with population diversity. Simulation results show that the filter designed by the proposed method has a better filtering effect with lower economic cost, and can be used in engineering.

  12. AeroADL: applying the integration of the Suomi-NPP science algorithms with the Algorithm Development Library to the calibration and validation task

    Science.gov (United States)

    Houchin, J. S.

    2014-09-01

    A common problem for the off-line validation of the calibration algorithms and algorithm coefficients is being able to run science data through the exact same software used for on-line calibration of that data. The Joint Polar Satellite System (JPSS) program solved part of this problem by making the Algorithm Development Library (ADL) available, which allows the operational algorithm code to be compiled and run on a desktop Linux workstation using flat file input and output. However, this solved only part of the problem, as the toolkit and methods to initiate the processing of data through the algorithms were geared specifically toward the algorithm developer, not the calibration analyst. In algorithm development mode, a limited number of sets of test data are staged for the algorithm once, and then run through the algorithm over and over as the software is developed and debugged. In calibration analyst mode, we are continually running new data sets through the algorithm, which requires significant effort to stage each of those data sets for the algorithm without additional tools. AeroADL solves this second problem by providing a set of scripts that wrap the ADL tools, providing both efficient means to stage and process an input data set, to override static calibration coefficient look-up-tables (LUT) with experimental versions of those tables, and to manage a library containing multiple versions of each of the static LUT files in such a way that the correct set of LUTs required for each algorithm are automatically provided to the algorithm without analyst effort. Using AeroADL, The Aerospace Corporation's analyst team has demonstrated the ability to quickly and efficiently perform analysis tasks for both the VIIRS and OMPS sensors with minimal training on the software tools.

  13. An Empirical Derivation of the Run Time of the Bubble Sort Algorithm.

    Science.gov (United States)

    Gonzales, Michael G.

    1984-01-01

Suggests a moving pictorial tool to help teach principles of the bubble sort algorithm. Develops such a tool applied to an unsorted list of numbers and describes a method to derive the run time of the algorithm. The method can be modified to derive the run times of various other algorithms. (JN)
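
    The article's derivation is not reproduced in the record, but the idea of empirically deriving a run time can be sketched as follows: time bubble sort on random lists of increasing size and check that the ratio t/n^2 stays roughly constant, which supports the expected quadratic growth. The list sizes and the fitting approach are illustrative choices.

```python
import time
import random
import numpy as np

def bubble_sort(a):
    """Classic bubble sort: repeatedly swap adjacent out-of-order elements."""
    n = len(a)
    for i in range(n - 1):
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
    return a

sizes = [200, 400, 800, 1600]
times = []
for n in sizes:
    data = [random.random() for _ in range(n)]
    start = time.perf_counter()
    bubble_sort(data)
    times.append(time.perf_counter() - start)

# Fit t ~ c * n^2; a roughly constant c across sizes supports the quadratic model.
coeffs = [t / n**2 for t, n in zip(times, sizes)]
print("per-n^2 constants:", np.round(coeffs, 10))
```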

  14. A Gradient-Based Multistart Algorithm for Multimodal Aerodynamic Shape Optimization Problems Based on Free-Form Deformation

    Science.gov (United States)

    Streuber, Gregg Mitchell

    Environmental and economic factors motivate the pursuit of more fuel-efficient aircraft designs. Aerodynamic shape optimization is a powerful tool in this effort, but is hampered by the presence of multimodality in many design spaces. Gradient-based multistart optimization uses a sampling algorithm and multiple parallel optimizations to reliably apply fast gradient-based optimization to moderately multimodal problems. Ensuring that the sampled geometries remain physically realizable requires manually developing specialized linear constraints for each class of problem. Utilizing free-form deformation geometry control allows these linear constraints to be written in a geometry-independent fashion, greatly easing the process of applying the algorithm to new problems. This algorithm was used to assess the presence of multimodality when optimizing a wing in subsonic and transonic flows, under inviscid and viscous conditions, and a blended wing-body under transonic, viscous conditions. Multimodality was present in every wing case, while the blended wing-body was found to be generally unimodal.

  15. SeqCompress: an algorithm for biological sequence compression.

    Science.gov (United States)

    Sardaraz, Muhammad; Tahir, Muhammad; Ikram, Ataul Aziz; Bajwa, Hassan

    2014-10-01

The growth of Next Generation Sequencing technologies presents significant research challenges, specifically to design bioinformatics tools that handle massive amounts of data efficiently. Biological sequence data storage cost has become a noticeable proportion of the total cost of data generation and analysis. In particular, the increase in DNA sequencing rate is significantly outstripping the rate of increase in disk storage capacity, and data volumes may eventually exceed available storage. It is essential to develop algorithms that handle large data sets via better memory management. This article presents a DNA sequence compression algorithm, SeqCompress, that copes with the space complexity of biological sequences. The algorithm is based on lossless data compression and uses a statistical model as well as arithmetic coding to compress DNA sequences. The proposed algorithm is compared with recent specialized compression tools for biological sequences. Experimental results show that the proposed algorithm has better compression gain than other existing algorithms.

  16. Economic analysis model for total energy and economic systems

    International Nuclear Information System (INIS)

    Shoji, Katsuhiko; Yasukawa, Shigeru; Sato, Osamu

    1980-09-01

This report describes the framing of an economic analysis model developed as a tool for total energy systems. To project and analyze future energy systems, it is important to analyze the relation between the energy system and the economic structure, and we prepared an economic analysis model suited to this purpose. A distinguishing feature of the model is that it treats energy-related matters in more detail than other economic ones, and that it forecasts long-term economic progress rather than short-term economic fluctuation. From the viewpoint of economics, the model is a long-term multi-sectoral economic analysis model of the open Leontief type. The model gave appropriate results in both fitting tests and forecasting estimation. (author)
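
    The report's sector definitions and coefficients are not given in the record, so the following is only a minimal sketch of the open Leontief input-output calculation such a model rests on: gross sector outputs x satisfy x = Ax + d, hence x = (I - A)^{-1} d, where A holds the technical coefficients and d the final demand. The three-sector matrix and demand vector are invented for illustration.

```python
import numpy as np

# Hypothetical technical coefficients A[i, j]: input from sector i needed per unit
# of output of sector j (sectors: energy, manufacturing, services).
A = np.array([
    [0.10, 0.25, 0.05],
    [0.20, 0.15, 0.10],
    [0.05, 0.20, 0.10],
])

# Hypothetical final demand for each sector's output.
d = np.array([50.0, 120.0, 200.0])

# Open Leontief model: x = A x + d  =>  x = (I - A)^{-1} d
x = np.linalg.solve(np.eye(3) - A, d)

print("gross outputs needed per sector:", np.round(x, 1))
print("intermediate use:", np.round(A @ x, 1))
```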

  17. Adaptive Differential Evolution Approach for Constrained Economic Power Dispatch with Prohibited Operating Zones

    Directory of Open Access Journals (Sweden)

    Abdellatif HAMOUDA

    2011-12-01

Full Text Available Economic power dispatch (EPD) is one of the main tools for the optimal operation and planning of modern power systems. To solve the EPD problem, most conventional calculus-based methods rely on the assumption that the fuel cost characteristic of a generating unit is a continuous and convex function, which results in inaccurate dispatch. This paper presents the design and application of an efficient adaptive differential evolution (ADE) algorithm for the solution of the economic power dispatch problem in which the non-convex characteristics of the generators, such as prohibited operating zones and ramp rate limits of practical generator operation, are considered. The 26-bus benchmark test system with 6 units having prohibited operating zones and ramp rate limits was used for testing and validation purposes. The results obtained demonstrate the effectiveness of the proposed method for solving the non-convex economic dispatch problem.
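
    The unit data for the 26-bus test system are not reproduced in the record, so the sketch below only illustrates the general problem formulation with invented numbers: quadratic fuel costs for three units, a power-balance penalty, and a penalty for operating inside a prohibited zone, minimized here with SciPy's differential_evolution as a stand-in for the paper's adaptive DE.

```python
import numpy as np
from scipy.optimize import differential_evolution

# Hypothetical unit data: cost = a + b*P + c*P^2, with output limits and one POZ each.
a = np.array([100.0, 120.0, 90.0])
b = np.array([2.0, 1.8, 2.2])
c = np.array([0.010, 0.012, 0.008])
bounds = [(50, 200), (50, 250), (50, 150)]
poz = [(120, 140), (160, 180), (90, 100)]   # prohibited operating zones (MW)
demand = 420.0                              # MW

def cost(P):
    fuel = np.sum(a + b * P + c * P**2)
    balance_pen = 1e4 * abs(np.sum(P) - demand)           # power-balance penalty
    poz_pen = 0.0
    for Pi, (lo, hi) in zip(P, poz):
        if lo < Pi < hi:                                   # inside a prohibited zone
            poz_pen += 1e4 * min(Pi - lo, hi - Pi)
    return fuel + balance_pen + poz_pen

result = differential_evolution(cost, bounds, seed=0, maxiter=500, tol=1e-8)
print("dispatch (MW):", np.round(result.x, 1), "total:", round(result.x.sum(), 1))
print("cost:", round(result.fun, 2))
```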

  18. Development of an integrated economic decision-support tool for the remediation of contaminated sites. Overview note

    International Nuclear Information System (INIS)

    Samson, R.; Bage, G.

    2004-05-01

    This report concludes the first design phase of an innovative software tool which, when completed, will allow managers of contaminated sites to make optimal decisions with respect to site remediation. The principal objective of the project was to develop the foundations for decision-support software (SITE VII) which will allow a comprehensive and rigorous approach to the comparison of remediation scenarios for sites contaminated with petroleum hydrocarbons. During this first phase of the project, the NSERC Industrial Chair in Site Remediation and Management of the Ecole Polytechnique de Montreal has completed four stages in the design of a decision-support tool that could be applied by any site manager using a simple computer. These four stages are: refinement of a technico-economic evaluation model; development of databases for five soil remediation technologies; design of a structure for integration of the databases with the technico-economic model; and simulation of the remediation of a contaminated site using the technico-economic model and a subset of the databases. In the interim report, the emphasis was placed on the development of the technico-economic model, supported by a very simple, single-technology simulation of remediation. In the present report, the priority is placed on the integration of the different components required for the creation of decision-support software based on the technico-economic model. An entire chapter of this report is devoted to elaborating the decision structure of the software. The treatment of information within the software is shown schematically and explained step-by-step. Five remediation technologies are handled by the software: three in-situ technologies (bio-venting, bio-slurping, bio-sparging) and two ex-situ technologies (thermal desorption, Bio-pile treatment). A technology file has been created for each technology, containing a brief description of the technology, its performance, its criteria of applicability

  19. Synthesis algorithm of VLSI multipliers for ASIC

    Science.gov (United States)

    Chua, O. H.; Eldin, A. G.

    1993-01-01

    Multipliers are critical sub-blocks in ASIC design, especially for digital signal processing and communications applications. A flexible multiplier synthesis tool is developed which is capable of generating multiplier blocks for word size in the range of 4 to 256 bits. A comparison of existing multiplier algorithms is made in terms of speed, silicon area, and suitability for automated synthesis and verification of its VLSI implementation. The algorithm divides the range of supported word sizes into sub-ranges and provides each sub-range with a specific multiplier architecture for optimal speed and area. The algorithm of the synthesis tool and the multiplier architectures are presented. Circuit implementation and the automated synthesis methodology are discussed.

  20. Development and analysis of an economizer control strategy algorithm to promote an opportunity for energy savings in air conditioning installations

    Energy Technology Data Exchange (ETDEWEB)

    Neto, Jose H.M.; Azevedo, Walter L. [Centro Federal de Educacao Tecnologica de Minas Gerais (CEFET), Belo Horizonte, MG (Brazil). Dept. de Engenharia Mecanica]. E-mail: henrique@daem.des.cefetmg.br

    2000-07-01

    This work presents a control strategy algorithm termed an enthalpy economizer. The objective of this strategy is to determine the adequate fractions of outside and return air flow rates entering a cooling coil based on the analysis of the outside, return and supply air enthalpies, rather than on the analysis of dry bulb temperatures alone. The proposed algorithm predicts the actual opening position of the outside and return air dampers in order to provide the lowest mixed-air enthalpy. First, the psychrometric properties of the outside and return air are calculated from actual measurements of the dry and wet bulb temperatures. Then, three distinct cases are analyzed: the enthalpy of the outside air is lower than the enthalpy of the supply air (free cooling); the enthalpy of the outside air is higher than the enthalpy of the return air; and the enthalpy of the outside air is lower than the enthalpy of the return air and higher than the enthalpy of the supply air. Different outside air conditions were selected to represent typical weather data of Brazilian cities, as well as typical return air conditions. It was found that the enthalpy control strategy could promote an opportunity for energy savings mainly during mild nights and wintertime periods, as well as during warm afternoons and summertime periods, depending on the outside air relative humidity. The proposed algorithm works well and can be integrated into commercial automation software to reduce energy consumption and electricity demand. (author)
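
    The three enthalpy cases described above map directly onto a small damper-control routine. The sketch below is a simplified reading of that logic with hypothetical enthalpy values and a made-up minimum outside-air fraction; the actual algorithm additionally derives the enthalpies from measured dry- and wet-bulb temperatures.

```python
def economizer_damper_fraction(h_out, h_ret, h_sup, min_oa=0.15):
    """Return the outside-air damper fraction (0..1) from the three enthalpy
    cases described in the abstract (enthalpies in kJ/kg of dry air)."""
    if h_out <= h_sup:
        # Case 1: outside air is below the supply target -> free cooling.
        # Mix outside and return air so the mixture lands on the supply enthalpy.
        if abs(h_ret - h_out) < 1e-9:
            return 1.0
        frac = (h_ret - h_sup) / (h_ret - h_out)
        return max(min_oa, min(1.0, frac))
    if h_out >= h_ret:
        # Case 2: outside air is worse than return air -> minimum outside air only.
        return min_oa
    # Case 3: outside-air enthalpy lies between supply and return -> full outside
    # air; the cooling coil removes the remaining load.
    return 1.0

# Example with hypothetical readings: a mild, dry night.
print(economizer_damper_fraction(h_out=35.0, h_ret=55.0, h_sup=40.0))  # 0.75 (free cooling)
```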

  1. Data Mining and Optimization Tools for Developing Engine Parameters Tools

    Science.gov (United States)

    Dhawan, Atam P.

    1998-01-01

    This project was awarded for understanding the problem and developing a plan for data mining tools for use in designing and implementing an Engine Condition Monitoring System. Tricia Erhardt and I studied the problem domain for developing an engine condition monitoring system using the sparse and non-standardized datasets to be made available through a consortium at NASA Lewis Research Center. We visited NASA three times to discuss additional issues related to the dataset, which was not made available to us. We discussed and developed a general framework of data mining and optimization tools to extract useful information from sparse and non-standard datasets. These discussions led to the training of Tricia Erhardt to develop genetic algorithm (GA) based search programs, which were written in C++ and used to demonstrate the capability of the GA in searching for an optimal solution in noisy datasets. From the study and discussion with NASA LeRC personnel, we then prepared a proposal, which is being submitted to NASA for future work on the development of data mining algorithms for engine condition monitoring. The proposed set of algorithms uses wavelet processing to create a multi-resolution pyramid of the data for GA-based multi-resolution optimal search.

  2. An algorithm of computing inhomogeneous differential equations for definite integrals

    OpenAIRE

    Nakayama, Hiromasa; Nishiyama, Kenta

    2010-01-01

    We give an algorithm to compute inhomogeneous differential equations for definite integrals with parameters. The algorithm is based on the integration algorithm for $D$-modules by Oaku. The main tool in the algorithm is the Gröbner basis method in the ring of differential operators.

  3. Systematic methods and tools for design of sustainable chemical processes for CO2 utilization

    DEFF Research Database (Denmark)

    Kongpanna, Pichayapan; Babi, Deenesh K.; Pavarajarn, Varong

    2016-01-01

    A systematic computer-aided framework for sustainable process design is presented together with its application to the synthesis and generation of processing networks for dimethyl carbonate (DMC) production with CO2 utilization. The framework integrated with various methods, tools, algorithms......-stage involves selection and analysis of the identified networks as a base case design in terms of operational feasibility, economics, life cycle assessment factors and sustainability measures, which are employed to establish targets for improvement in the next-stage. The innovation-stage involves generation...

  4. Immersive Algorithms: Better Visualization with Less Information

    DEFF Research Database (Denmark)

    Bille, Philip; Gørtz, Inge Li

    2017-01-01

    Visualizing algorithms, such as drawings, slideshow presentations, animations, videos, and software tools, is a key concept to enhance and support student learning. A typical visualization of an algorithm shows the data and then performs computation on the data. For instance, a standard visualization

  5. DWFS: A Wrapper Feature Selection Tool Based on a Parallel Genetic Algorithm

    KAUST Repository

    Soufan, Othman

    2015-02-26

    Many scientific problems can be formulated as classification tasks. Data that harbor relevant information are usually described by a large number of features. Frequently, many of these features are irrelevant for the class prediction. The efficient implementation of classification models requires identification of suitable combinations of features. The smaller number of features reduces the problem's dimensionality and may result in higher classification performance. We developed DWFS, a web-based tool that allows for efficient selection of features for a variety of problems. DWFS follows the wrapper paradigm and applies a search strategy based on Genetic Algorithms (GAs). A parallel GA implementation simultaneously examines and evaluates a large number of candidate collections of features. DWFS also integrates various filtering methods that may be applied as a pre-processing step in the feature selection process. Furthermore, weights and parameters in the fitness function of the GA can be adjusted according to the application requirements. Experiments using heterogeneous datasets from different biomedical applications demonstrate that DWFS is fast and leads to a significant reduction of the number of features without sacrificing performance as compared to several widely used existing methods. DWFS can be accessed online at www.cbrc.kaust.edu.sa/dwfs.

  6. DWFS: A Wrapper Feature Selection Tool Based on a Parallel Genetic Algorithm

    KAUST Repository

    Soufan, Othman; Kleftogiannis, Dimitrios A.; Kalnis, Panos; Bajic, Vladimir B.

    2015-01-01

    Many scientific problems can be formulated as classification tasks. Data that harbor relevant information are usually described by a large number of features. Frequently, many of these features are irrelevant for the class prediction. The efficient implementation of classification models requires identification of suitable combinations of features. The smaller number of features reduces the problem's dimensionality and may result in higher classification performance. We developed DWFS, a web-based tool that allows for efficient selection of features for a variety of problems. DWFS follows the wrapper paradigm and applies a search strategy based on Genetic Algorithms (GAs). A parallel GA implementation simultaneously examines and evaluates a large number of candidate collections of features. DWFS also integrates various filtering methods that may be applied as a pre-processing step in the feature selection process. Furthermore, weights and parameters in the fitness function of the GA can be adjusted according to the application requirements. Experiments using heterogeneous datasets from different biomedical applications demonstrate that DWFS is fast and leads to a significant reduction of the number of features without sacrificing performance as compared to several widely used existing methods. DWFS can be accessed online at www.cbrc.kaust.edu.sa/dwfs.
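
    DWFS itself is a web-based tool, so the sketch below is only an illustration of the wrapper paradigm it follows: binary feature masks are evolved by a small GA and each mask is scored by the cross-validated accuracy of a classifier trained on the selected columns. The dataset, classifier, operators and parameters are placeholders (scikit-learn is assumed to be available), not DWFS internals.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(1)
X, y = make_classification(n_samples=200, n_features=30, n_informative=5, random_state=1)

def fitness(mask):
    # Wrapper paradigm: the classifier itself scores each candidate feature subset.
    if mask.sum() == 0:
        return 0.0
    acc = cross_val_score(KNeighborsClassifier(), X[:, mask.astype(bool)], y, cv=3).mean()
    return acc - 0.01 * mask.sum() / len(mask)   # small parsimony pressure

pop = rng.integers(0, 2, size=(20, X.shape[1]))   # population of binary feature masks
for _ in range(15):
    scores = np.array([fitness(m) for m in pop])
    parents = pop[np.argsort(scores)[-10:]]                     # truncation selection
    children = []
    while len(children) < len(pop):
        p1, p2 = parents[rng.integers(0, len(parents), 2)]
        cut = rng.integers(1, X.shape[1])                       # one-point crossover
        child = np.concatenate([p1[:cut], p2[cut:]])
        flip = rng.random(X.shape[1]) < 0.05                    # bit-flip mutation
        children.append(np.where(flip, 1 - child, child))
    pop = np.array(children)

best = max(pop, key=fitness)
print("selected features:", np.flatnonzero(best))
```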

  7. Cryptographic protocol security analysis based on bounded constructing algorithm

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    An efficient approach to analyzing cryptographic protocols is to develop automatic analysis tools based on formal methods. However, this approach has encountered a high computational complexity problem because protocol participants are arbitrary, their message structures are complex, and their executions are concurrent. We propose an efficient automatic verification algorithm for analyzing cryptographic protocols based on the recently proposed Cryptographic Protocol Algebra (CPA) model, in which algebraic techniques are used to simplify the description of cryptographic protocols and their executions. Redundant states generated in the analysis process are greatly reduced by introducing a new algebraic technique called the Universal Polynomial Equation, and the algorithm can be used to verify the correctness of protocols in an infinite state space. We have implemented an efficient automatic analysis tool for cryptographic protocols, called ACT-SPA, based on this algorithm, and used the tool to check more than 20 cryptographic protocols. The analysis results show that this tool is more efficient, and an attack instance not previously reported was found by using this tool.

  8. Modular Regularization Algorithms

    DEFF Research Database (Denmark)

    Jacobsen, Michael

    2004-01-01

    The class of linear ill-posed problems is introduced along with a range of standard numerical tools and basic concepts from linear algebra, statistics and optimization. Known algorithms for solving linear inverse ill-posed problems are analyzed to determine how they can be decomposed into indepen...

  9. Novel medical image enhancement algorithms

    Science.gov (United States)

    Agaian, Sos; McClendon, Stephen A.

    2010-01-01

    In this paper, we present two novel medical image enhancement algorithms. The first, a global image enhancement algorithm, utilizes an alpha-trimmed mean filter as its backbone to sharpen images. The second algorithm uses a cascaded unsharp masking technique to separate the high frequency components of an image in order for them to be enhanced using a modified adaptive contrast enhancement algorithm. Experimental results from enhancing electron microscopy, radiological, CT scan and MRI scan images, using the MATLAB environment, are then compared to the original images as well as other enhancement methods, such as histogram equalization and two forms of adaptive contrast enhancement. An image processing scheme for electron microscopy images of Purkinje cells will also be implemented and utilized as a comparison tool to evaluate the performance of our algorithm.
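
    The alpha-trimmed mean filter named as the backbone of the first algorithm is simple enough to sketch. The minimal NumPy version below (not the authors' full enhancement pipeline) discards the d smallest and d largest pixels in each window before averaging.

```python
import numpy as np

def alpha_trimmed_mean(img, size=3, d=1):
    """Alpha-trimmed mean filter: in each size x size window, discard the d
    smallest and d largest pixels and average the rest (grayscale float image)."""
    pad = size // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            window = np.sort(padded[i:i + size, j:j + size].ravel())
            out[i, j] = window[d:len(window) - d].mean()
    return out

# Toy example: a flat patch with a single impulse ("salt") pixel is smoothed out.
img = np.full((5, 5), 10.0)
img[2, 2] = 255.0
print(alpha_trimmed_mean(img)[2, 2])
```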

  10. New software tool for dynamic radiological characterisation and monitoring in nuclear sites

    International Nuclear Information System (INIS)

    Szoeke, Istvan; Louka, Michael N.; Mark, Niels K.; Bryntesen, Tom R.; Bratteli, Joachim; Edvardsen, Svein T.; Gustavsen, Morten A.; Toppe, Aleksander L.; Johnsen, Terje; Rindahl, Grete

    2012-01-01

    The Halden Reactor Project (HRP) is a jointly sponsored international cooperation under the aegis of the Organisation for Economic Co-operation and Development - Nuclear Energy Agency. Extensive and valuable guidance and tools connected to the safe and reliable operation of nuclear facilities have been elaborated over the years within the frame of this programme. The HRP has achieved particularly strong results in virtual-reality based tools for real-time area and personal monitoring. The techniques developed earlier are now being supplemented to enhance the planning and monitoring capabilities and to support general radiological characterisation connected to nuclear sites and facilities. Due to the complexity and abundance of the input information required, software tools dedicated to the radiological characterisation of contaminated materials, buildings, land and groundwater are applied to review, evaluate and visualise the data. Characterisation of the radiation situation in a realistic environment can be very complex, and efficient visualisation of the data to the user is not straightforward. The monitoring and planning tools elaborated in the frame of the HRP feature very sophisticated three-dimensional (3D) high definition visualisation and user interfaces to promote easy interpretation of the input data. The visualisation tools permit dynamic visualisation of radiation fields in virtual or augmented reality by various techniques, and real-time personal monitoring of humanoid models. In addition, new techniques are being elaborated to visualise the 3D distribution of activities in structures and materials. The dosimetric algorithms feeding information to the visualisation and user interface of these planning tools include deterministic radiation transport techniques suitable for fast photon dose estimates in cases where the physical, radiometric and spectrometric characteristics of the gamma sources are known. The basic deterministic model, implemented in earlier

  11. Consensus algorithm in smart grid and communication networks

    Science.gov (United States)

    Alfagee, Husain Abdulaziz

    On a daily basis, consensus theory attracts more and more researchers from different areas of interest, who apply its techniques to solve technical problems in ways that are faster, more reliable, and even more precise than ever before. A power system network is one of those fields in which consensus theory is employed extensively. The use of the consensus algorithm to solve the Economic Dispatch and Load Restoration Problems is a good example. Instead of a conventional central controller, some researchers have explored an algorithm to solve the above mentioned problems in a distributed manner, using the consensus algorithm based on calculation methods, i.e., non-estimation methods, for updating the information consensus matrix. Starting from this point of solving these types of problems in a distributed fashion using the consensus algorithm, we have implemented a new advanced consensus algorithm. It is based on adaptive estimation techniques, such as the Gradient Algorithm and the Recursive Least Square Algorithm, to solve the same problems. This advanced work was tested on different case studies that had formerly been explored, as seen in references 5, 7, and 18. Three and five generators, or agents, with different topologies correspond to the Economic Dispatch Problem, and the IEEE 16-Bus power system corresponds to the Load Restoration Problem. In all the cases we have studied, the results met our expectations with extreme accuracy and completely matched the results of the previous researchers. There is little question that this research proves the capability and dependability of using the consensus algorithm, based on estimation methods such as the Gradient Algorithm and the Recursive Least Square Algorithm, to solve such power problems.
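
    The thesis abstract does not reproduce its update equations, and its adaptive estimation variants go beyond a short example. As background, the sketch below shows the basic leader-follower incremental-cost-consensus idea commonly used for distributed economic dispatch: neighbouring units average their incremental-cost estimates while one leader unit injects the power mismatch. The three-unit data, weights and gains are invented.

```python
import numpy as np

# Hypothetical 3-generator system with quadratic costs C_i(P) = a_i + b_i*P + c_i*P^2.
b = np.array([2.0, 1.8, 2.2])
c = np.array([0.010, 0.012, 0.008])
pmin, pmax = np.array([10.0, 10.0, 10.0]), np.array([300.0, 300.0, 300.0])
demand = 450.0

# Row-stochastic weights of a small communication graph (each unit talks to its neighbours).
W = np.array([[0.50, 0.25, 0.25],
              [0.25, 0.50, 0.25],
              [0.25, 0.25, 0.50]])

lam = np.full(3, 3.0)          # initial incremental-cost estimates ($/MWh)
eps = 0.002                    # feedback gain on the power mismatch (leader = unit 0)

for _ in range(2000):
    P = np.clip((lam - b) / (2 * c), pmin, pmax)   # each unit's local optimal output
    mismatch = demand - P.sum()
    lam = W @ lam                                   # consensus step over neighbours
    lam[0] += eps * mismatch                        # leader injects the mismatch

print("lambda:", np.round(lam, 3), "dispatch:", np.round(P, 1), "total:", round(P.sum(), 1))
```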

  12. A Wavelet Kernel-Based Primal Twin Support Vector Machine for Economic Development Prediction

    Directory of Open Access Journals (Sweden)

    Fang Su

    2013-01-01

    Full Text Available Economic development forecasting allows planners to choose the right strategies for the future. This study proposes an economic development prediction method based on the wavelet kernel-based primal twin support vector machine algorithm. As gross domestic product (GDP) is an important indicator to measure economic development, economic development prediction means GDP prediction in this study. The wavelet kernel-based primal twin support vector machine algorithm solves two smaller-sized quadratic programming problems instead of solving a single large one as in the traditional support vector machine algorithm. Economic development data of Anhui province from 1992 to 2009 are used to study the prediction performance of the wavelet kernel-based primal twin support vector machine algorithm. A comparison of the mean prediction error between the wavelet kernel-based primal twin support vector machine and traditional support vector machine models, trained on samples with 3–5 dimensional input vectors, is given in this paper. The testing results show that the economic development prediction accuracy of the wavelet kernel-based primal twin support vector machine model is better than that of the traditional support vector machine.
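
    The wavelet kernel referred to above is usually built from a Morlet-type mother wavelet. The sketch below shows one common formulation of such a kernel and, for illustration only, plugs it into an off-the-shelf support vector regressor rather than the primal twin SVM of the study; the "GDP" series, the kernel width and the regressor settings are all invented.

```python
import numpy as np
from sklearn.svm import SVR

def wavelet_kernel(X, Y, a=4.0):
    """Morlet-type wavelet kernel K(x, y) = prod_i cos(1.75*u_i) * exp(-u_i**2 / 2),
    with u_i = (x_i - y_i) / a (a common construction in the SVM literature)."""
    X, Y = np.atleast_2d(X), np.atleast_2d(Y)
    U = (X[:, None, :] - Y[None, :, :]) / a
    return np.prod(np.cos(1.75 * U) * np.exp(-U**2 / 2.0), axis=2)

# Toy "economic series": a smooth made-up GDP-like curve, predicted one step ahead.
t = np.arange(18, dtype=float)
gdp = 100.0 * 1.08**t                       # hypothetical 8% growth series
X_train, y_train = t[:-1].reshape(-1, 1), gdp[1:]

model = SVR(kernel=wavelet_kernel, C=1e4, epsilon=1.0)
model.fit(X_train, y_train)
print("fitted value at t = 10:", round(float(model.predict([[10.0]])[0]), 1))
```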

  13. Artifact removal algorithms for stroke detection using a multistatic MIST beamforming algorithm.

    Science.gov (United States)

    Ricci, E; Di Domenico, S; Cianca, E; Rossi, T

    2015-01-01

    Microwave imaging (MWI) has recently been shown to be a promising imaging modality for low-complexity, low-cost and fast brain imaging tools, which could play a fundamental role in efficiently managing emergencies related to stroke and hemorrhages. This paper focuses on the UWB radar imaging approach and in particular on the processing algorithms for the backscattered signals. Assuming the use of the multistatic version of the MIST (Microwave Imaging Space-Time) beamforming algorithm, developed by Hagness et al. for the early detection of breast cancer, the paper proposes and compares two artifact removal algorithms. Artifact removal is an essential step in any UWB radar imaging system, and the artifact removal algorithms considered so far have been shown not to be effective in the specific scenario of brain imaging. First, the paper proposes modifications of a known artifact removal algorithm. These modifications are shown to be effective in achieving good localization accuracy and fewer false positives. However, the main contribution is the proposal of an artifact removal algorithm based on statistical methods, which achieves even better performance with much lower computational complexity.

  14. The South Florida Ecosystem Portfolio Model - A Map-Based Multicriteria Ecological, Economic, and Community Land-Use Planning Tool

    Science.gov (United States)

    Labiosa, William B.; Bernknopf, Richard; Hearn, Paul; Hogan, Dianna; Strong, David; Pearlstine, Leonard; Mathie, Amy M.; Wein, Anne M.; Gillen, Kevin; Wachter, Susan

    2009-01-01

    The South Florida Ecosystem Portfolio Model (EPM) prototype is a regional land-use planning Web tool that integrates ecological, economic, and social information and values of relevance to decision-makers and stakeholders. The EPM uses a multicriteria evaluation framework that builds on geographic information system-based (GIS) analysis and spatially-explicit models that characterize important ecological, economic, and societal endpoints and consequences that are sensitive to regional land-use/land-cover (LULC) change. The EPM uses both economics (monetized) and multiattribute utility (nonmonetized) approaches to valuing these endpoints and consequences. This hybrid approach represents a methodological middle ground between rigorous economic and ecological/ environmental scientific approaches. The EPM sacrifices some degree of economic- and ecological-forecasting precision to gain methodological transparency, spatial explicitness, and transferability, while maintaining credibility. After all, even small steps in the direction of including ecosystem services evaluation are an improvement over current land-use planning practice (Boyd and Wainger, 2003). There are many participants involved in land-use decision-making in South Florida, including local, regional, State, and Federal agencies, developers, environmental groups, agricultural groups, and other stakeholders (South Florida Regional Planning Council, 2003, 2004). The EPM's multicriteria evaluation framework is designed to cut across the objectives and knowledge bases of all of these participants. This approach places fundamental importance on social equity and stakeholder participation in land-use decision-making, but makes no attempt to determine normative socially 'optimal' land-use plans. The EPM is thus a map-based set of evaluation tools for planners and stakeholders to use in their deliberations of what is 'best', considering a balancing of disparate interests within a regional perspective. Although

  15. CSA: An efficient algorithm to improve circular DNA multiple alignment

    Directory of Open Access Journals (Sweden)

    Pereira Luísa

    2009-07-01

    Full Text Available Abstract Background The comparison of homologous sequences from different species is an essential approach to reconstruct the evolutionary history of species and of the genes they harbour in their genomes. Several complete mitochondrial and nuclear genomes are now available, increasing the importance of using multiple sequence alignment algorithms in comparative genomics. MtDNA has long been used in phylogenetic analysis and errors in the alignments can lead to errors in the interpretation of evolutionary information. Although a large number of multiple sequence alignment algorithms have been proposed to date, they all deal with linear DNA and cannot directly handle circular DNA. Researchers interested in aligning circular DNA sequences must first rotate them to the "right" place using an essentially manual process, before they can use multiple sequence alignment tools. Results In this paper we propose an efficient algorithm that identifies the most interesting region at which to cut circular genomes in order to improve phylogenetic analysis when using standard multiple sequence alignment algorithms. This algorithm identifies the largest chain of non-repeated longest subsequences common to a set of circular mitochondrial DNA sequences. All the sequences are then rotated and made linear for multiple alignment purposes. To evaluate the effectiveness of this new tool, three different sets of mitochondrial DNA sequences were considered. Other tests considering randomly rotated sequences were also performed. The software package Arlequin was used to evaluate the standard genetic measures of the alignments obtained with and without the use of the CSA algorithm with two well known multiple alignment algorithms, the CLUSTALW and the MAVID tools, and also the visualization tool SinicView. Conclusion The results show that a circularization and rotation pre-processing step significantly improves the efficiency of publicly available multiple sequence alignment

  16. USING GEM - GLOBAL ECONOMIC MODEL IN ACHIEVING A GLOBAL ECONOMIC FORECAST

    Directory of Open Access Journals (Sweden)

    Camelia Madalina Orac

    2013-12-01

    Full Text Available The global economic development model has proved to be insufficiently reliable under the recent economic crisis. As a result, the entire theoretical construction of the global economy needs rethinking and reorientation. In this context, it is quite clear that only through the effective use of specific techniques and tools of economic-mathematical modeling, statistics, regional analysis and economic forecasting is it possible to obtain an overview of the future economy.

  17. Fractal Landscape Algorithms for Environmental Simulations

    Science.gov (United States)

    Mao, H.; Moran, S.

    2014-12-01

    Natural science and geographical research are now able to take advantage of environmental simulations that more accurately test experimental hypotheses, resulting in deeper understanding. Experiments affected by the natural environment can benefit from 3D landscape simulations capable of simulating a variety of terrains and environmental phenomena. Such simulations can employ random terrain generation algorithms that dynamically simulate environments to test specific models against a variety of factors. Through the use of noise functions such as Perlin noise and Simplex noise, and of the diamond-square algorithm, computers can generate simulations that model a variety of landscapes and ecosystems. This study shows how these algorithms work together to create realistic landscapes. By seeding values into the diamond-square algorithm, one can control the shape of the landscape. Perlin noise and Simplex noise are also used to simulate moisture and temperature. The smooth gradient created by coherent noise allows more realistic landscapes to be simulated. Terrain generation algorithms can be used in environmental studies and physics simulations. Potential studies that would benefit from such simulations include the geophysical impact of flash floods or drought on a particular region, and regional impacts on low-lying areas due to global warming and rising sea levels. Furthermore, terrain generation algorithms also serve as aesthetic tools to display landscapes (Google Earth) and to simulate planetary landscapes. Hence, they can be used as tools to assist science education. Algorithms used to generate these natural phenomena provide scientists with a different approach to analyzing our world. The random algorithms used in terrain generation not only contribute to generating the terrains themselves, but are also capable of simulating weather patterns.
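
    As a concrete illustration of the diamond-square algorithm mentioned above, the sketch below generates a (2^n + 1)-sided heightmap; seeding the four corner values controls the overall shape of the landscape, as the abstract notes. The roughness parameter and corner values are arbitrary.

```python
import numpy as np

def diamond_square(n, roughness=1.0, corners=(0.0, 0.0, 0.0, 0.0), seed=0):
    """Generate a (2**n + 1) x (2**n + 1) fractal heightmap with diamond-square."""
    rng = np.random.default_rng(seed)
    size = 2**n + 1
    h = np.zeros((size, size))
    h[0, 0], h[0, -1], h[-1, 0], h[-1, -1] = corners     # seed values shape the terrain
    step, scale = size - 1, roughness
    while step > 1:
        half = step // 2
        # Diamond step: centre of each square = mean of its 4 corners + noise.
        for i in range(half, size, step):
            for j in range(half, size, step):
                mean = (h[i - half, j - half] + h[i - half, j + half] +
                        h[i + half, j - half] + h[i + half, j + half]) / 4.0
                h[i, j] = mean + rng.uniform(-scale, scale)
        # Square step: midpoint of each edge = mean of its (up to 4) neighbours + noise.
        for i in range(0, size, half):
            for j in range((i + half) % step, size, step):
                nbrs = [h[x, y] for x, y in ((i - half, j), (i + half, j),
                                             (i, j - half), (i, j + half))
                        if 0 <= x < size and 0 <= y < size]
                h[i, j] = sum(nbrs) / len(nbrs) + rng.uniform(-scale, scale)
        step, scale = half, scale / 2.0
    return h

terrain = diamond_square(5, roughness=2.0, corners=(0, 5, 5, 10))
print(terrain.shape, round(float(terrain.mean()), 2))
```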

  18. Pure intelligent monitoring system for steam economizer trips

    Directory of Open Access Journals (Sweden)

    Basim Ismail Firas

    2017-01-01

    Full Text Available The steam economizer is one of the main pieces of equipment in a power plant. Certain economizer behaviours lead to failure and shutdown of the entire power plant, which in turn increases operating and maintenance costs. Detecting the cause at an early stage helps maintain normal and safe operating conditions of the power plant. However, such monitoring is hard to achieve because of limitations such as the system's learning ability and its weakness beyond its domain of expertise. To address these problems, an intelligent modelling system specialized in steam economizer trips has been proposed and coded within the MATLAB environment as a potential fault detection and diagnosis (FDD) system. An integrated plant data preparation framework covering 10 trips was studied, and the corresponding variables were used as framework variables. The most influential operational variables were trained and validated by adopting an Artificial Neural Network (ANN). The Extreme Learning Machine (ELM) neural network methodology has been adopted as the major computational intelligence tool in the system. It is shown that the ANN can be implemented for monitoring process faults in thermal power plants, and the faster learning obtained with the Extreme Learning Machine was confirmed as well.
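
    The ELM training rule mentioned above is compact enough to sketch: hidden-layer weights are drawn at random once and only the output weights are obtained by a least-squares fit. The version below is a generic ELM on synthetic two-variable "trip / no-trip" data, not the plant model of the study.

```python
import numpy as np

class ELM:
    """Minimal extreme learning machine: random hidden layer + least-squares output."""
    def __init__(self, n_hidden=50, seed=0):
        self.n_hidden, self.rng = n_hidden, np.random.default_rng(seed)

    def fit(self, X, y):
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))   # fixed random weights
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)                              # hidden activations
        self.beta, *_ = np.linalg.lstsq(H, y, rcond=None)             # solve output weights
        return self

    def predict(self, X):
        return np.tanh(X @ self.W + self.b) @ self.beta

# Synthetic two-variable data standing in for economizer measurements and trip labels.
rng = np.random.default_rng(1)
X = rng.normal(size=(300, 2))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)                       # made-up trip rule
model = ELM(n_hidden=40).fit(X, y)
acc = ((model.predict(X) > 0.5) == y).mean()
print("training accuracy:", round(float(acc), 3))
```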

  19. Introduction to Economic Analysis

    OpenAIRE

    R. Preston McAfee

    2005-01-01

    This book presents introductory economics ("principles") material using standard mathematical tools, including calculus. It is designed for a relatively sophisticated undergraduate who has not taken a basic university course in economics. It also contains the standard intermediate microeconomics material and some material that ought to be standard but is not. The book can easily serve as an intermediate microeconomics text. The focus of this book is on the conceptual tools and not on fluff. M...

  20. Research on prediction of agricultural machinery total power based on grey model optimized by genetic algorithm

    Science.gov (United States)

    Xie, Yan; Li, Mu; Zhou, Jin; Zheng, Chang-zheng

    2009-07-01

    Agricultural machinery total power is an important index that reflects and evaluates the level of agricultural mechanization. It is the power source of agricultural production and one of the main factors in enhancing comprehensive agricultural production capacity, expanding the scale of production and increasing farmers' income. Its demand is affected by natural, economic, technological, social and other "grey" factors. Therefore, grey system theory can be used to analyze the development of agricultural machinery total power. A method based on a genetic algorithm optimizing the grey modeling process is introduced in this paper. This method makes full use of the advantages of the grey prediction model and of the genetic algorithm's ability to find a global optimum, so the prediction model is more accurate. Using data from a province, the GM (1, 1) model for predicting agricultural machinery total power was built based on grey system theory and the genetic algorithm. The result indicates that the model can be used as an effective tool for predicting agricultural machinery total power.
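
    The GM(1,1) construction behind the abstract can be sketched in a few lines: the raw series is accumulated, the model dx1/dt + a*x1 = b is fitted by least squares on the background values, and forecasts are recovered by differencing. The genetic-algorithm step of the paper (which would tune the modeling parameters) is omitted, and the input series is invented.

```python
import numpy as np

def gm11_forecast(x0, steps=3):
    """Grey GM(1,1) forecast: fit dx1/dt + a*x1 = b on the cumulative series x1."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)
    z1 = 0.5 * (x1[1:] + x1[:-1])                 # background (mean-generated) values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]
    def x1_hat(k):                                # time-response function, k = 0, 1, 2, ...
        return (x0[0] - b / a) * np.exp(-a * k) + b / a
    n = len(x0)
    return [x1_hat(k) - x1_hat(k - 1) for k in range(n, n + steps)]

# Hypothetical agricultural machinery total power series (10^4 kW), roughly growing.
series = [180.0, 195.2, 212.4, 230.1, 251.3, 270.8]
print([round(v, 1) for v in gm11_forecast(series, steps=3)])
```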

  1. Algorithms for the Computation of Debris Risk

    Science.gov (United States)

    Matney, Mark J.

    2017-01-01

    Determining the risks from space debris involves a number of statistical calculations. These calculations inevitably involve assumptions about geometry - including the physical geometry of orbits and the geometry of satellites. A number of tools have been developed in NASA’s Orbital Debris Program Office to handle these calculations, many of which have never been published before. These include algorithms that are used in NASA’s Orbital Debris Engineering Model ORDEM 3.0, as well as other tools useful for computing orbital collision rates and ground casualty risks. This paper presents an introduction to these algorithms and the assumptions upon which they are based.

  2. Economic incentives as a policy tool to promote safety and health at work.

    Science.gov (United States)

    Kankaanpää, Eila

    2010-06-01

    Incentives are regarded as a promising policy tool for promoting occupational safety and health (OSH). This article discusses the potential of different kinds of incentives in light of economic theory and evidence from research. When incentives are used as a policy tool, it implies the existence of an institution that has both the interest and the power to apply incentives to stakeholders, usually to employers. Governments can support employers' investments in OSH through subsidies and tax structures. These incentives are successful only if the demand for OSH responds to the change in the price of OSH investments and if the suppliers of OSH are able to increase their production smoothly. Otherwise, the subsidy will only lead to higher prices for OSH goods. Both public and private insurance companies can differentiate insurance premiums according to past claim behavior (experience rating). There is evidence that this can effectively lower the frequency of claims, but not the severity of cases. This paper concludes that incentives do not directly lead to improvement. When incentives are introduced, their objective(s) should be clear and the end result (i.e. what the incentive aims to promote) should be known to be effective in achieving healthy and safe workplaces.

  3. Internal Audit as a Tool for Combating Economic Fraud. Case Study of the Misappropriation Process of Company’s Assets

    Directory of Open Access Journals (Sweden)

    Michał Falkowski

    2010-12-01

    Full Text Available The paper deals with the problem of economic fraud and the role of internal audit as a tool for preventing it. As the economic downturn intensifies, disputes and other difficulties arise more frequently. More and more often, employees and contracting parties try to shift their own losses onto other economic entities. When internal rules are broken, or simply have not been established, organizations are exposed to risks and problems that they are often not used to dealing with. As the analyzed case study showed, threats of economic fraud can also come from inside the company. Embezzlement through expense reimbursement is one of the most “popular” ways to steal money from inside the company. To prevent such situations from happening, the internal audit unit has to perform assurance and consulting actions to deter this particular and any other type of fraud. When actual fraud occurs, an important element is the proper division of roles between the internal auditor and the forensic specialist, who is adequately prepared to lead the investigation, find evidence, and bring fraudsters to justice.

  4. Genetic testing in the European Union: does economic evaluation matter?

    Science.gov (United States)

    Antoñanzas, Fernando; Rodríguez-Ibeas, R; Hutter, M F; Lorente, R; Juárez, C; Pinillos, M

    2012-10-01

    We review the published economic evaluation studies applied to genetic technologies in the EU to identify the main diseases addressed by these studies and the ways the studies were conducted, and to assess the efficiency of these new technologies. The final aim of this review was to understand the potential of the economic evaluations performed to date as a tool to contribute to decision making in this area. We reviewed a set of articles found in several databases up to March 2010. Literature searches were made in the following databases: PubMed; Euronheed; Centre for Reviews and Dissemination of the University of York-Health Technology Assessment, Database of Abstracts of Reviews of Effects, NHS Economic Evaluation Database; and Scopus. The search algorithm was "(screening or diagnosis) and genetic and (cost or economic) and (country EU27)". We included studies if they met the following criteria: (1) a genetic technology was analysed; (2) human DNA was tested for; (3) the analysis was a real economic evaluation or a cost study; and (4) the articles were related to an EU Member State. We initially found 3,559 papers on genetic testing, but only 92 articles of economic analysis, referring to a wide range of genetic diseases, matched the inclusion criteria. The most studied diseases were as follows: cystic fibrosis (12), breast and ovarian cancer (8), hereditary hemochromatosis (6), Down's syndrome (7), colorectal cancer (5), familial hypercholesterolaemia (5), prostate cancer (4), and thrombophilia (4). Genetic tests were mostly used for screening purposes, and cost-effectiveness analysis was the most common type of economic study. The analysed gene technologies are deemed to be efficient for some specific population groups and screening algorithms, since their cost-effectiveness ratios were below the commonly accepted threshold of 30,000€. Economic evaluation of genetic technologies matters but the number of published studies is still

  5. GENDER DISPARITIES REGARDING WAGE AS A MOTIVATIONAL TOOL IN THE CURRENT ECONOMIC CONTEXT

    Directory of Open Access Journals (Sweden)

    DEMYEN SUZANA

    2014-02-01

    Full Text Available The deepening of globalization, negative demographic trends both nationally and internationally, the emigration phenomenon, and the long-term effects of the economic crisis are the main challenges in building broad support for, and encouraging, fair and effective management of human resources, regardless of the industry in which an organization is active. Motivation raises a series of problems that need to be solved in order to generate both individual and team performance, and the wage is seen as one of the most important motivational tools. Although the gender gap in wages has become less severe, certain issues can still be identified that need to be solved, regardless of the most recent trends in management

  6. DiVinE-CUDA - A Tool for GPU Accelerated LTL Model Checking

    Directory of Open Access Journals (Sweden)

    Jiří Barnat

    2009-12-01

    Full Text Available In this paper we present a tool that performs CUDA-accelerated LTL Model Checking. The tool exploits the parallel MAP algorithm, adapted to the NVIDIA CUDA architecture, in order to efficiently detect the presence of accepting cycles in a directed graph. Accepting cycle detection is the core algorithmic procedure in automata-based LTL Model Checking. We demonstrate that the tool outperforms the non-accelerated version of the algorithm, and we discuss the limits of the tool and what we intend to do in the future to overcome them.

  7. Elementary functions algorithms and implementation

    CERN Document Server

    Muller, Jean-Michel

    2016-01-01

    This textbook presents the concepts and tools necessary to understand, build, and implement algorithms for computing elementary functions (e.g., logarithms, exponentials, and the trigonometric functions). Both hardware- and software-oriented algorithms are included, along with issues related to accurate floating-point implementation. This third edition has been updated and expanded to incorporate the most recent advances in the field, new elementary function algorithms, and function software. After a preliminary chapter that briefly introduces some fundamental concepts of computer arithmetic, such as floating-point arithmetic and redundant number systems, the text is divided into three main parts. Part I considers the computation of elementary functions using algorithms based on polynomial or rational approximations and using table-based methods; the final chapter in this section deals with basic principles of multiple-precision arithmetic. Part II is devoted to a presentation of “shift-and-add” algorithm...
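
    As a small example of the "shift-and-add" family covered in Part II, here is a textbook rotation-mode CORDIC that computes sine and cosine using only a small arctangent table and operations that reduce to shifts and adds in fixed-point hardware; it is an illustration, not code from the book.

```python
import math

def cordic_sin_cos(theta, iterations=32):
    """Rotation-mode CORDIC: rotate (1, 0) towards angle theta (radians, |theta| < pi/2)
    using micro-rotations by atan(2^-i), then undo the accumulated gain K."""
    angles = [math.atan(2.0**-i) for i in range(iterations)]      # small arctangent table
    K = 1.0
    for i in range(iterations):                                   # total gain of micro-rotations
        K *= 1.0 / math.sqrt(1.0 + 2.0**(-2 * i))
    x, y, z = 1.0, 0.0, theta
    for i in range(iterations):
        d = 1.0 if z >= 0 else -1.0                               # steer towards residual angle
        x, y = x - d * y * 2.0**-i, y + d * x * 2.0**-i           # shift-and-add micro-rotation
        z -= d * angles[i]
    return y * K, x * K                                           # (sin, cos)

s, c = cordic_sin_cos(math.pi / 6)
print(round(s, 6), round(c, 6))   # ~0.5, ~0.866025
```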

  8. A proposed tool to integrate environmental and economical assessments of products

    International Nuclear Information System (INIS)

    Senthil, Kumaran D.; Ong, S.K.; Nee, A.Y.C.; Tan, Reginald B.H.

    2003-01-01

    An attempt has been made to interpret the outcomes of a Life Cycle Assessment (LCA) in terms of environmental costs. This attempt ensures the environmental accountability of the products while LCA ensures their eco-friendly nature. Keeping this as an objective, a Life Cycle Environmental Cost Analysis (LCECA) model was developed. This new tool incorporates costing into the LCA practice. This model prescribes a life cycle environmental cost model to estimate and correlate the effects of these costs in all the life cycle stages of the product. The newly developed categories of eco-costs are: costs of effluent treatment/control/disposal, environmental management systems, eco-taxes, rehabilitation, energy and savings of recycling and reuse strategies. The mathematical model of LCECA determines quantitative expressions between the total cost of products and the various eco-costs. The eco-costs of the alternatives are compared with the computational LCECA model. This method enables the environmental as well as the economic assessment of products, which leads to cost-effective, eco-friendly design of products

  9. A design tool for direct and non-stochastic calculations of near-field radiative transfer in complex structures: The NF-RT-FDTD algorithm

    Science.gov (United States)

    Didari, Azadeh; Pinar Mengüç, M.

    2017-08-01

    Advances in nanotechnology and nanophotonics are inextricably linked with the need for reliable computational algorithms to be adapted as design tools for the development of new concepts in energy harvesting, radiative cooling, nanolithography and nano-scale manufacturing, among others. In this paper, we provide an outline for such a computational tool, named NF-RT-FDTD, to determine the near-field radiative transfer between structured surfaces using Finite Difference Time Domain method. NF-RT-FDTD is a direct and non-stochastic algorithm, which accounts for the statistical nature of the thermal radiation and is easily applicable to any arbitrary geometry at thermal equilibrium. We present a review of the fundamental relations for far- and near-field radiative transfer between different geometries with nano-scale surface and volumetric features and gaps, and then we discuss the details of the NF-RT-FDTD formulation, its application to sample geometries and outline its future expansion to more complex geometries. In addition, we briefly discuss some of the recent numerical works for direct and indirect calculations of near-field thermal radiation transfer, including Scattering Matrix method, Finite Difference Time Domain method (FDTD), Wiener Chaos Expansion, Fluctuating Surface Current (FSC), Fluctuating Volume Current (FVC) and Thermal Discrete Dipole Approximations (TDDA).

  10. Dereplication, Aggregation and Scoring Tool (DAS Tool) v1.0

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-01

    Communities of uncultivated microbes are critical to ecosystem function and microorganism health, and a key objective of metagenomic studies is to analyze organism-specific metabolic pathways and reconstruct community interaction networks. This requires accurate assignment of genes to genomes, yet existing binning methods often fail to predict a reasonable number of genomes and report many bins of low quality and completeness. Furthermore, the performance of existing algorithms varies between samples and biotypes. Here, we present a dereplication, aggregation and scoring strategy, DAS Tool, that combines the strengths of a flexible set of established binning algorithms. DAS Tool applied to a constructed community generated more accurate bins than any automated method. Further, when applied to samples of different complexity, including soil, natural oil seeps, and the human gut, DAS Tool recovered substantially more near-complete genomes than any single binning method alone. Included were three genomes from a novel lineage. The ability to reconstruct many near-complete genomes from metagenomics data will greatly advance genome-centric analyses of ecosystems.

  11. Assistance tool for the commissioning of new algorithms in radiation therapy treatment planning systems; Herramienta de asistencia en el comisionado de nuevos algoritmos de sistemas de planificacion de terapia con radiaciones ionizantes

    Energy Technology Data Exchange (ETDEWEB)

    Reinado, D.; Ricos, B.; Alonso, S.; Chinillach, N.; Bellido, P.; Tortosa, R.

    2013-07-01

    The commissioning of a new calculation algorithm involves a large number of hours of work and measurements. In order to optimize the commissioning of the AAA and Acuros XB algorithms within the Eclipse (v.10) treatment planning system marketed by Varian, a tool has been developed in Microsoft Excel format that includes the different tests to be performed. (Author)

  12. NEMO. A novel techno-economic tool suite for simulating and optimizing solutions for grid integration of electric vehicles and charging stations

    Energy Technology Data Exchange (ETDEWEB)

    Erge, Thomas; Stillahn, Thies; Dallmer-Zerbe, Kilian; Wille-Haussmann, Bernhard [Frauenhofer Institut for Solar Energy Systems ISE, Freiburg (Germany)

    2013-07-01

    With the increasing use of electric vehicles (EV), grid operators need to predict energy flows depending on electromobility use profiles in order to adjust grid infrastructure and operation control accordingly. Tools and methodologies are required to characterize grid problems resulting from the interconnection of EVs with the grid. The simulation and optimization tool suite NEMO (Novel E-MObility grid model) was developed within a European research project and is currently being tested using realistic showcases. It is a combination of three professional tools. One of the tools aims at combined techno-economic design and operation, primarily modeling plants on contracts or the spot market while at the same time participating in balancing markets. The second tool is designed for planning grid extension or reinforcement, while the third tool is mainly used to quickly discover potential conflicts of grid operation approaches through load flow analysis. The tool suite is used to investigate real showcases in Denmark, Germany and the Netherlands. First studies show that significant alleviation of stress on distribution grid lines could be achieved by few but intelligent restrictions to EV charging procedures.

  13. NEMO. A novel techno-economic tool suite for simulating and optimizing solutions for grid integration of electric vehicles and charging stations

    International Nuclear Information System (INIS)

    Erge, Thomas; Stillahn, Thies; Dallmer-Zerbe, Kilian; Wille-Haussmann, Bernhard

    2013-01-01

    With the increasing use of electric vehicles (EV), grid operators need to predict energy flows depending on electromobility use profiles in order to adjust grid infrastructure and operation control accordingly. Tools and methodologies are required to characterize grid problems resulting from the interconnection of EVs with the grid. The simulation and optimization tool suite NEMO (Novel E-MObility grid model) was developed within a European research project and is currently being tested using realistic showcases. It is a combination of three professional tools. One of the tools aims at combined techno-economic design and operation, primarily modeling plants on contracts or the spot market while at the same time participating in balancing markets. The second tool is designed for planning grid extension or reinforcement, while the third tool is mainly used to quickly discover potential conflicts of grid operation approaches through load flow analysis. The tool suite is used to investigate real showcases in Denmark, Germany and the Netherlands. First studies show that significant alleviation of stress on distribution grid lines could be achieved by few but intelligent restrictions to EV charging procedures.

  14. Designing algorithms using CAD technologies

    Directory of Open Access Journals (Sweden)

    Alin IORDACHE

    2008-01-01

    Full Text Available A representative example of an eLearning-platform modular application, ‘Logical diagrams’, is intended to be a useful learning and testing tool for the beginner programmer, but also for the more experienced one. The problem this application tries to solve concerns young programmers who forget the fundamentals of the domain, algorithmics. Logical diagrams are a graphic representation of an algorithm, which uses different geometrical figures (parallelograms, rectangles, rhombuses, circles) with particular meanings, called blocks, connected to one another to reveal the flow of the algorithm. The role of this application is to help the user build the diagram for the algorithm and then automatically generate the C code and test it.

  15. Methods and Algorithms for Economic MPC in Power Production Planning

    DEFF Research Database (Denmark)

    Sokoler, Leo Emil

    in real-time. A generator can represent a producer of electricity, a consumer of electricity, or possibly both. Examples of generators are heat pumps, electric vehicles, wind turbines, virtual power plants, solar cells, and conventional fuel-fired thermal power plants. Although this thesis is mainly...... concerned with EMPC for minutes-ahead production planning, we show that the proposed EMPC scheme can be extended to days-ahead planning (including unit commitment) as well. The power generation from renewable energy sources such as wind and solar power is inherently uncertain and variable. A portfolio...... design an algorithm based on the alternating direction method of multipliers (ADMM) to solve input-constrained OCPs with convex objective functions. The OCPs that occur in EMPC of dynamically decoupled subsystems, e.g. power generators, have a block-angular structure. Subsystem decomposition algorithms...
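
    The thesis abstract mentions an ADMM scheme for input-constrained OCPs with convex objectives. Stripped of the MPC and subsystem structure, the underlying splitting can be sketched on a generic box-constrained quadratic program: one unconstrained quadratic step, one projection (clipping) step, and a dual update. The problem data below are arbitrary.

```python
import numpy as np

def admm_box_qp(P, q, lb, ub, rho=1.0, iters=200):
    """Minimize 0.5*x'Px + q'x subject to lb <= x <= ub via ADMM:
    alternate an unconstrained quadratic step with a projection onto the box."""
    n = len(q)
    x = z = u = np.zeros(n)
    L = np.linalg.cholesky(P + rho * np.eye(n))          # factor once, reuse every iteration
    for _ in range(iters):
        rhs = -q + rho * (z - u)
        x = np.linalg.solve(L.T, np.linalg.solve(L, rhs))    # x-update: linear solve
        z = np.clip(x + u, lb, ub)                           # z-update: projection (clipping)
        u = u + x - z                                        # scaled dual update
    return z

P = np.array([[4.0, 1.0], [1.0, 2.0]])
q = np.array([-8.0, -6.0])
print(np.round(admm_box_qp(P, q, lb=np.zeros(2), ub=np.ones(2)), 4))   # expect [1, 1]
```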

  16. The G4-ECONS Economic Evaluation Tool for Generation IV Reactor Systems and its Proposed Application to Deliberately Small Reactor Systems and Proposed New Nuclear Fuel Cycle Facilities. Annex IX

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2013-12-15

    At the outset of the international Generation IV programme, it was decided that the six candidate reactor systems will ultimately be evaluated on the basis of safety, sustainability, non-proliferation attributes, technical readiness and projected economics. It is likely that the same factors will influence the evaluation of deliberately small reactor systems and new fuel cycle facilities, such as reprocessing plants that are being considered under the more recent Global Nuclear Energy Partnership (GNEP). This annex describes how the development of an economic modelling system has evolved to address the issue of economic competitiveness for both the Generation IV and GNEP programmes. In 2004, the Generation IV Economic Modelling Working Group (EMWG) commissioned the development of a Microsoft Excel based model capable of calculating the levelized unit electricity cost (LUEC) in mills/kW.h (1 mill = $0.001) or $/MW.h for multiple types of reactor system being developed under the Generation IV programme. This overall modelling system is now called the Generation IV spreadsheet calculation of nuclear systems (G4-ECONS), and is being expanded to calculate costs of energy products in addition to electricity, such as hydrogen and desalinated water. A version has also been developed to evaluate the costs of products or services from fuel cycle facilities. The cost estimating methodology and algorithms are explained in detail in the Generation IV Cost Estimating Guidelines and in the G4-ECONS User's Manual. The model was constructed with relatively simple economic algorithms such that it could be used by almost any nation without regard to country specific taxation, cost accounting, depreciation or capital cost recovery methodologies. It was also designed with transparency to the user in mind (i.e. all algorithms and cell contents are visible to the user). A short description of version 1.0 G4-ECONS-R (reactor economics model) has also been published in the

  17. The G4-ECONS Economic Evaluation Tool for Generation IV Reactor Systems and its Proposed Application to Deliberately Small Reactor Systems and Proposed New Nuclear Fuel Cycle Facilities. Annex IX

    International Nuclear Information System (INIS)

    2013-01-01

    At the outset of the international Generation IV programme, it was decided that the six candidate reactor systems will ultimately be evaluated on the basis of safety, sustainability, non-proliferation attributes, technical readiness and projected economics. It is likely that the same factors will influence the evaluation of deliberately small reactor systems and new fuel cycle facilities, such as reprocessing plants that are being considered under the more recent Global Nuclear Energy Partnership (GNEP). This annex describes how the development of an economic modelling system has evolved to address the issue of economic competitiveness for both the Generation IV and GNEP programmes. In 2004, the Generation IV Economic Modelling Working Group (EMWG) commissioned the development of a Microsoft Excel based model capable of calculating the levelized unit electricity cost (LUEC) in mills/kW.h (1 mill = $0.001) or $/MW.h for multiple types of reactor system being developed under the Generation IV programme. This overall modelling system is now called the Generation IV spreadsheet calculation of nuclear systems (G4-ECONS), and is being expanded to calculate costs of energy products in addition to electricity, such as hydrogen and desalinated water. A version has also been developed to evaluate the costs of products or services from fuel cycle facilities. The cost estimating methodology and algorithms are explained in detail in the Generation IV Cost Estimating Guidelines and in the G4-ECONS User's Manual. The model was constructed with relatively simple economic algorithms such that it could be used by almost any nation without regard to country specific taxation, cost accounting, depreciation or capital cost recovery methodologies. It was also designed with transparency to the user in mind (i.e. all algorithms and cell contents are visible to the user). A short description of version 1.0 G4-ECONS-R (reactor economics model) has also been published in the Proceedings of
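
    The published G4-ECONS algorithms are considerably more detailed than can be reproduced here, but the core levelization idea can be sketched: annualize the overnight capital cost with a capital recovery factor and divide the total annual cost by the annual generation. All figures below are illustrative placeholders, not G4-ECONS data or defaults.

```python
def levelized_cost(overnight_cost, capacity_mw, capacity_factor,
                   fixed_om, fuel_cost, discount_rate, life_years):
    """Very simplified levelized unit electricity cost in $/MWh (equivalently
    mills/kWh, since 1 mill = $0.001, so 1 mill/kWh = 1 $/MWh)."""
    # Capital recovery factor turns the overnight cost into an equivalent annual payment.
    crf = (discount_rate * (1 + discount_rate)**life_years
           / ((1 + discount_rate)**life_years - 1))
    annual_capital = overnight_cost * crf
    annual_mwh = capacity_mw * 8760 * capacity_factor
    annual_cost = annual_capital + fixed_om + fuel_cost
    return annual_cost / annual_mwh

# Illustrative 1000 MW plant: $4.0e9 overnight, 90% capacity factor, 5% rate, 40-year life.
luec = levelized_cost(4.0e9, 1000, 0.90, fixed_om=8.0e7, fuel_cost=6.0e7,
                      discount_rate=0.05, life_years=40)
print(round(luec, 1), "$/MWh")
```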

  18. Advanced order management in ERM systems: the tic-tac-toe algorithm

    Science.gov (United States)

    Badell, Mariana; Fernandez, Elena; Puigjaner, Luis

    2000-10-01

    The concept behind improved enterprise resource planning (ERP) systems is the overall integration of the whole enterprise functionality into the management system through financial links. Converting current software into real management decision tools requires crucial changes in the current approach to ERP systems. This evolution must be able to incorporate technological achievements both properly and in time. The exploitation phase of plants needs an open web-based environment for collaborative business-engineering with on-line schedulers. Today's short lifecycles of products and processes require sharp and finely tuned management actions that must be guided by scheduling tools. Additionally, such actions must be able to keep track of money movements related to supply chain events. Thus, the necessary outputs require financial-production integration at the scheduling level, as proposed in the new approach of enterprise management (ERM) systems. Within this framework, the economic analysis of the due-date policy and its optimization become essential in order to dynamically manage realistic and optimal delivery dates with a price-time trade-off during marketing activities. In this work we propose a scheduling tool with a web-based interface conducted by autonomous agents, in which precise economic information relative to plant and business actions and their effects is provided. It aims to attain a better arrangement of marketing and production events in order to face the bid/bargain process during e-commerce. Additionally, management systems require real-time execution and an efficient transaction-oriented approach capable of dynamically adopting realistic and optimal actions to support marketing management. To this end, the TicTacToe algorithm provides sequence optimization with acceptable tolerances in realistic time.

  19. Bank Portfolio Structure and Economic Absorption Theory of Economic Development: A Theoretical Proposition

    Directory of Open Access Journals (Sweden)

    Uduak B. UBOM

    2016-11-01

    Full Text Available The focus of this article is the theoretical proposition of a Bank Portfolio Structure and Economic Absorption Theory of economic development. Specifically, the work sought to establish the basis of bank portfolio rigidity and to identify the causes of economic absorption problems and their implications for economic development. Theoretical and conceptual research designs were used. The existing literature was reviewed using an archival retrieval approach, library searches and internet exploration, and the information obtained was judgmentally, logically and qualitatively analyzed. It was found, among other things, that bank portfolio rigidity stems from regulatory policy defects, such as the use of inconsistent monetary policy tools (e.g. high liquidity and cash ratios) and compelling banks to adhere to regulatory requirements, and that the lack of an adequate and high-quality stock of infrastructure and technology is a basic cause of economic absorption problems. Above all, a low level of economic absorption was found to hinder effective contributions of banks to economic development. It was therefore recommended that the regulatory tools used by central banks be aligned with the development needs of the economy and the direction of governments. Monetary policy tools such as liquidity and cash ratios should also be moderated and stabilized for stable bank portfolio performance, together with aggressive improvement in the stock and quality of infrastructure and technology within an economy. With the new theory, it is expected that policy formulations and adjustments concerning bank portfolio structure and management will be designed with adequate flexibility and a focus on long-term loans and investments, coupled with an improved stock and quality of infrastructure, to enhance economic development. This theory therefore provides another frontier of research on bank portfolio structure and contributions to economic development.

  20. Enhancement of combined heat and power economic dispatch using self adaptive real-coded genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Subbaraj, P. [Kalasalingam University, Srivilliputhur, Tamilnadu 626 190 (India); Rengaraj, R. [Electrical and Electronics Engineering, S.S.N. College of Engineering, Old Mahabalipuram Road, Thirupporur (T.K), Kalavakkam, Kancheepuram (Dist.) 603 110, Tamilnadu (India); Salivahanan, S. [S.S.N. College of Engineering, Old Mahabalipuram Road, Thirupporur (T.K), Kalavakkam, Kancheepuram (Dist.) 603 110, Tamilnadu (India)

    2009-06-15

    In this paper, a self-adaptive real-coded genetic algorithm (SARGA) is implemented to solve the combined heat and power economic dispatch (CHPED) problem. Self-adaptation is achieved by means of tournament selection together with simulated binary crossover (SBX). The selection process has a powerful exploration capability by creating tournaments between two solutions; the better solution is chosen and placed in the mating pool, leading to better convergence and a reduced computational burden. The SARGA integrates a parameterless penalty constraint-handling strategy and simultaneously handles equality and inequality constraints. Population diversity is introduced through the distribution index of the SBX operator to create better offspring; this higher diversity increases the probability of reaching the global optimum and prevents premature convergence. The SARGA is applied to the CHPED problem with a bounded feasible operating region that has a large number of local minima. The numerical results demonstrate that the proposed method can find a solution towards the global optimum and compares favourably with other recent methods in terms of solution quality, constraint handling and computation time. (author)
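
    The abstract names the two operators but gives no pseudocode. Purely as an illustration of what binary tournament selection and SBX with a distribution index look like, a minimal sketch (function names and the eta default are assumptions, not taken from the paper) might be:

```python
import random

def sbx_crossover(parent1, parent2, eta=2.0):
    """Simulated binary crossover (SBX) on two real-coded parent vectors.
    eta is the distribution index: small values spread offspring widely
    (more diversity), large values keep offspring close to the parents."""
    child1, child2 = [], []
    for x1, x2 in zip(parent1, parent2):
        u = random.random()
        if u <= 0.5:
            beta = (2.0 * u) ** (1.0 / (eta + 1.0))
        else:
            beta = (1.0 / (2.0 * (1.0 - u))) ** (1.0 / (eta + 1.0))
        child1.append(0.5 * ((1.0 + beta) * x1 + (1.0 - beta) * x2))
        child2.append(0.5 * ((1.0 - beta) * x1 + (1.0 + beta) * x2))
    return child1, child2

def tournament_select(population, fitness):
    """Binary tournament selection for a minimisation problem: draw two
    candidates at random and keep the one with the lower fitness value."""
    i, j = random.sample(range(len(population)), 2)
    return population[i] if fitness[i] <= fitness[j] else population[j]

# toy usage: two 3-variable parents (values are arbitrary)
print(sbx_crossover([10.0, 50.0, 0.3], [12.0, 40.0, 0.8]))
```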

  1. Sustainable logistics and transportation optimization models and algorithms

    CERN Document Server

    Gakis, Konstantinos; Pardalos, Panos

    2017-01-01

    Focused on the logistics and transportation operations within a supply chain, this book brings together the latest models, algorithms, and optimization possibilities. Logistics and transportation problems are examined within a sustainability perspective to offer a comprehensive assessment of environmental, social, ethical, and economic performance measures. Featured models, techniques, and algorithms may be used to construct policies on alternative transportation modes and technologies, green logistics, and incentives by the incorporation of environmental, economic, and social measures. Researchers, professionals, and graduate students in urban regional planning, logistics, transport systems, optimization, supply chain management, business administration, information science, mathematics, and industrial and systems engineering will find the real life and interdisciplinary issues presented in this book informative and useful.

  2. Algorithms for the Computation of Debris Risks

    Science.gov (United States)

    Matney, Mark

    2017-01-01

    Determining the risks from space debris involves a number of statistical calculations. These calculations inevitably involve assumptions about geometry - including the physical geometry of orbits and the geometry of non-spherical satellites. A number of tools have been developed in NASA's Orbital Debris Program Office to handle these calculations, many of which have never been published before. These include algorithms that are used in NASA's Orbital Debris Engineering Model ORDEM 3.0, as well as other tools useful for computing orbital collision rates and ground casualty risks. This paper will present an introduction to these algorithms and the assumptions upon which they are based.

  3. Distributed Economic Dispatch in Microgrids Based on Cooperative Reinforcement Learning.

    Science.gov (United States)

    Liu, Weirong; Zhuang, Peng; Liang, Hao; Peng, Jun; Huang, Zhiwu

    2018-06-01

    Microgrids incorporating distributed generation (DG) units and energy storage (ES) devices are expected to play increasingly important roles in future power systems. Yet achieving efficient distributed economic dispatch in microgrids is a challenging issue due to the randomness and nonlinear characteristics of DG units and loads. This paper proposes a cooperative reinforcement learning algorithm for distributed economic dispatch in microgrids. Utilizing the learning algorithm avoids the difficulty of stochastic modeling and high computational complexity. In the cooperative reinforcement learning algorithm, function approximation is leveraged to deal with the large and continuous state spaces, and a diffusion strategy is incorporated to coordinate the actions of DG units and ES devices. Based on the proposed algorithm, each node in the microgrid only needs to communicate with its local neighbors, without relying on any centralized controllers. Algorithm convergence is analyzed, and simulations based on real-world meteorological and load data are conducted to validate the performance of the proposed algorithm.
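
    The abstract does not spell out the update equations. As a rough, generic sketch of the adapt-then-combine pattern behind a diffusion strategy (the linear value approximation, the toy graph and all names below are assumptions for illustration, not the paper's formulation):

```python
import numpy as np

def local_adapt(theta, feature, td_error, alpha=0.05):
    """Adapt step: each node nudges its own parameter vector using a
    locally observed temporal-difference error (linear approximation)."""
    return theta + alpha * td_error * feature

def diffusion_combine(thetas, neighbors, weights):
    """Combine step: each node mixes its neighbours' parameter vectors with
    non-negative weights that sum to one per node, so coordination needs
    only local communication and no central controller."""
    return {i: sum(weights[i][j] * thetas[j] for j in nbrs)
            for i, nbrs in neighbors.items()}

# toy usage: three nodes on a line graph (all numbers are made up)
thetas = {0: np.zeros(3), 1: np.ones(3), 2: 2.0 * np.ones(3)}
neighbors = {0: [0, 1], 1: [0, 1, 2], 2: [1, 2]}
weights = {0: {0: 0.5, 1: 0.5},
           1: {0: 1 / 3, 1: 1 / 3, 2: 1 / 3},
           2: {1: 0.5, 2: 0.5}}
thetas = {i: local_adapt(t, np.ones(3), td_error=0.1) for i, t in thetas.items()}
print(diffusion_combine(thetas, neighbors, weights))
```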

  4. Algorithms and programming tools for image processing on the MPP, part 2

    Science.gov (United States)

    Reeves, Anthony P.

    1986-01-01

    A number of algorithms were developed for image warping and pyramid image filtering. Techniques were investigated for the parallel processing of a large number of independent, irregularly shaped regions on the MPP. In addition, some utilities for dealing with very long vectors and for sorting were developed. Documentation pages for the algorithms which are available for distribution are given. The performance of the MPP for a number of basic data manipulations was determined. From these results it is possible to predict the efficiency of the MPP for a number of algorithms and applications. The Parallel Pascal development system, which is a portable programming environment for the MPP, was improved and better documentation, including a tutorial, was written. This environment allows programs for the MPP to be developed on any conventional computer system; it consists of a set of system programs and a library of general purpose Parallel Pascal functions. The algorithms were tested on the MPP and a presentation on the development system was made to the MPP users group. The UNIX version of the Parallel Pascal System was distributed to a number of new sites.

  5. A review on economic emission dispatch problems using quantum computational intelligence

    Science.gov (United States)

    Mahdi, Fahad Parvez; Vasant, Pandian; Kallimani, Vish; Abdullah-Al-Wadud, M.

    2016-11-01

    Economic emission dispatch (EED) problems are among the most crucial problems in power systems. Growing energy demand, the limitation of natural resources and global warming place this topic at the center of discussion and research. This paper reviews the use of Quantum Computational Intelligence (QCI) in solving economic emission dispatch problems. QCI techniques such as the Quantum Genetic Algorithm (QGA) and the Quantum Particle Swarm Optimization (QPSO) algorithm are discussed here. This paper aims to encourage researchers to use more QCI-based algorithms to obtain better optimal results when solving EED problems.
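
    The review itself presents no equations; for orientation only, the standard QPSO position update that this literature builds on can be sketched as follows (the contraction-expansion factor beta and all variable names are illustrative, not taken from the paper):

```python
import math
import random

def qpso_step(positions, pbest, gbest, beta=0.75):
    """One quantum-behaved PSO (QPSO) iteration: every particle is re-drawn
    around a local attractor (a random mix of its personal best and the
    global best), with a spread governed by the mean-best position and the
    contraction-expansion coefficient beta."""
    n, dim = len(positions), len(gbest)
    mbest = [sum(pb[d] for pb in pbest) / n for d in range(dim)]
    new_positions = []
    for i, x in enumerate(positions):
        new_x = []
        for d in range(dim):
            phi = random.random()
            attractor = phi * pbest[i][d] + (1.0 - phi) * gbest[d]
            u = random.random() or 1e-12          # avoid log(0)
            delta = beta * abs(mbest[d] - x[d]) * math.log(1.0 / u)
            new_x.append(attractor + delta if random.random() < 0.5 else attractor - delta)
        new_positions.append(new_x)
    return new_positions

# toy usage: two particles in two dimensions (numbers are arbitrary)
print(qpso_step([[1.0, 2.0], [3.0, 0.5]], [[1.0, 1.5], [2.5, 0.5]], [1.2, 1.0]))
```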

  6. SaTool - a Software Tool for Structural Analysis of Complex Automation Systems

    DEFF Research Database (Denmark)

    Blanke, Mogens; Lorentzen, Torsten

    2006-01-01

    The paper introduces SaTool, a tool for structural analysis; the use of the Matlab (R)-based implementation is presented, and special features motivated by industrial users are introduced. Salient features of the tool are presented, including the ability to specify the behavior of a complex system at a high level of functional abstraction, to analyze single and multiple fault scenarios, and to automatically generate parity relations for diagnosis of the system in normal and impaired conditions. User interface and algorithmic details are presented.

  7. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis.

    Science.gov (United States)

    Simonyan, Vahan; Mazumder, Raja

    2014-09-30

    The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  8. High-Performance Integrated Virtual Environment (HIVE) Tools and Applications for Big Data Analysis

    Directory of Open Access Journals (Sweden)

    Vahan Simonyan

    2014-09-01

    Full Text Available The High-performance Integrated Virtual Environment (HIVE) is a high-throughput cloud-based infrastructure developed for the storage and analysis of genomic and associated biological data. HIVE consists of a web-accessible interface for authorized users to deposit, retrieve, share, annotate, compute and visualize Next-generation Sequencing (NGS) data in a scalable and highly efficient fashion. The platform contains a distributed storage library and a distributed computational powerhouse linked seamlessly. Resources available through the interface include algorithms, tools and applications developed exclusively for the HIVE platform, as well as commonly used external tools adapted to operate within the parallel architecture of the system. HIVE is composed of a flexible infrastructure, which allows for simple implementation of new algorithms and tools. Currently, available HIVE tools include sequence alignment and nucleotide variation profiling tools, metagenomic analyzers, phylogenetic tree-building tools using NGS data, clone discovery algorithms, and recombination analysis algorithms. In addition to tools, HIVE also provides knowledgebases that can be used in conjunction with the tools for NGS sequence and metadata analysis.

  9. Implementation of fuzzy logic control algorithm in embedded ...

    African Journals Online (AJOL)

    Fuzzy logic control algorithm solves problems that are difficult to address with traditional control techniques. This paper describes an implementation of fuzzy logic control algorithm using inexpensive hardware as well as how to use fuzzy logic to tackle a specific control problem without any special software tools. As a case ...

  10. [Fostering of health economics in Germany].

    Science.gov (United States)

    Ulrich, V

    2012-05-01

    Health economics is now well established in Germany with the aim to apply economic tools to answer problems in health and health care. After a short review of the international development of health economics and the development in Germany in particular, the article looks at selected recent topics of health economic analysis in Germany (economic evaluation, industrial economics, health and education).

  11. A Web-Based Tool to Interpolate Nitrogen Loading Using a Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Youn Shik Park

    2014-09-01

    Full Text Available Water quality data may not be collected at a high frequency, nor over the full range of streamflow conditions. For instance, water quality data are often collected monthly, biweekly, or weekly, since collecting and analyzing water quality samples is costly compared to streamflow data. Regression models are often used to interpolate pollutant loads from measurements made intermittently. The Web-based Load Interpolation Tool (LOADIN) was developed to provide user-friendly interfaces and to allow use of streamflow and water quality data from the U.S. Geological Survey (USGS) via web access. LOADIN has a regression model assuming that the instantaneous load is comprised of the pollutant load based on streamflow and the pollutant load variation within the period. The regression model has eight coefficients determined by a genetic algorithm from measured water quality data. LOADIN was applied to eleven water quality datasets from USGS gage stations located in Illinois, Indiana, Michigan, Minnesota, and Wisconsin, with drainage areas from 44 km2 to 1,847,170 km2. Measured loads were calculated by multiplying the nitrogen data by the streamflow data associated with them. The estimated nitrogen loads and measured loads were evaluated using Nash-Sutcliffe Efficiency (NSE) and the coefficient of determination (R2). NSE ranged from 0.45 to 0.91, and R2 ranged from 0.51 to 0.91 for nitrogen load estimation.
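
    As a small illustration of the goodness-of-fit measure quoted above, the Nash-Sutcliffe Efficiency can be computed as follows (the observed and simulated load values are made-up numbers, not data from the study):

```python
def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe Efficiency: 1 is a perfect fit, 0 means the model is
    no better than simply predicting the mean of the observations."""
    mean_obs = sum(observed) / len(observed)
    num = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    den = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - num / den

# hypothetical measured vs estimated nitrogen loads (not data from the study)
obs = [12.0, 8.5, 30.1, 4.2, 15.7]
sim = [11.2, 9.0, 27.8, 5.1, 16.4]
print(round(nash_sutcliffe(obs, sim), 3))
```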

  12. Interactive animation of fault-tolerant parallel algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Apgar, S.W.

    1992-02-01

    Animation of algorithms makes understanding them intuitively easier. This paper describes the software tool Raft (Robust Animator of Fault Tolerant Algorithms). The Raft system allows the user to animate a number of parallel algorithms which achieve fault tolerant execution. In particular, we use it to illustrate the key Write-All problem. It has an extensive user-interface which allows a choice of the number of processors, the number of elements in the Write-All array, and the adversary to control the processor failures. The novelty of the system is that the interface allows the user to create new on-line adversaries as the algorithm executes.

  13. Efficient ecologic and economic operational rules for dammed systems by means of nondominated sorting genetic algorithm II

    Science.gov (United States)

    Niayifar, A.; Perona, P.

    2015-12-01

    River impoundment by dams is known to strongly affect the natural flow regime and, in turn, the river attributes and the related ecosystem biodiversity. Making hydropower sustainable implies seeking innovative operational policies able to generate dynamic environmental flows while maintaining economic efficiency. For dammed systems, we build the ecological and economic efficiency plot for non-proportional flow redistribution operational rules compared to minimal flow operational rules. As for the case of small hydropower plants (e.g., see the companion paper by Gorla et al., this session), we use a four-parameter Fermi-Dirac statistical distribution to mathematically formulate non-proportional redistribution rules. These rules allocate a fraction of water to the riverine environment depending on the current reservoir inflows and storage. Riverine ecological benefits associated with dynamic environmental flows are computed by integrating the Weighted Usable Area (WUA) for fishes with Richter's hydrological indicators. Then, we apply the nondominated sorting genetic algorithm II (NSGA-II) to an ensemble of non-proportional and minimal flow redistribution rules in order to generate the Pareto frontier showing the system performances in the ecologic and economic space. This fast and elitist multiobjective optimization method is eventually applied to a case study. It is found that non-proportional dynamic flow releases ensure maximal power production on the one hand, while conciliating ecological sustainability on the other. Much of the improvement in the environmental indicator is seen to arise from a better use of the reservoir storage dynamics, which allows the system to capture and attenuate flood events while recovering part of them for energy production. In conclusion, adopting such new operational policies would unravel a spectrum of globally efficient performances of the dammed system when compared with those resulting from policies based on constant minimum flow releases.
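
    The abstract mentions a four-parameter Fermi-Dirac form but does not reproduce it. One plausible parameterisation of such a redistribution rule, given here only as a sketch (the parameter names, the values and the inflow-only dependence are assumptions; the actual rule also depends on reservoir storage), is:

```python
import math

def environmental_fraction(inflow, f_min, f_max, q_half, tau):
    """Fermi-Dirac-type redistribution rule (illustrative parameterisation):
    the fraction of the current inflow released to the river decays smoothly
    from f_max at low inflows to f_min at high inflows, with the transition
    centred at q_half and its sharpness set by tau."""
    return f_min + (f_max - f_min) / (1.0 + math.exp((inflow - q_half) / tau))

# hypothetical numbers: release most of a small inflow, little of a flood flow
for q in (2.0, 10.0, 50.0):
    print(q, round(environmental_fraction(q, f_min=0.1, f_max=0.9, q_half=12.0, tau=4.0), 2))
```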

  14. NDT-Tool: A case tool to deal with requirements in web information systems

    OpenAIRE

    Escalona Cuaresma, María José; Torres Valderrama, Jesús; Mejías Risoto, Manuel

    2003-01-01

    The progress of the Internet and the rising interest in developing systems for the web environment have given rise to several methodological proposals intended to serve as a suitable reference in the development process. However, there is a gap in CASE tool support [3][4][6]. This work presents a CASE tool named NDT-Tool that allows applying the algorithms and techniques proposed in NDT (Navigational Development Techniques) [2], a methodological proposal to specify, analyze and desi...

  15. GeneYenta: a phenotype-based rare disease case matching tool based on online dating algorithms for the acceleration of exome interpretation.

    Science.gov (United States)

    Gottlieb, Michael M; Arenillas, David J; Maithripala, Savanie; Maurer, Zachary D; Tarailo Graovac, Maja; Armstrong, Linlea; Patel, Millan; van Karnebeek, Clara; Wasserman, Wyeth W

    2015-04-01

    Advances in next-generation sequencing (NGS) technologies have helped reveal causal variants for genetic diseases. In order to establish causality, it is often necessary to compare the genomes of unrelated individuals with similar disease phenotypes to identify common disrupted genes. When working with cases of rare genetic disorders, finding similar individuals can be extremely difficult. We introduce a web tool, GeneYenta, which facilitates the matchmaking process, allowing clinicians to coordinate detailed comparisons for phenotypically similar cases. Importantly, the system is focused on phenotype annotation, with explicit limitations on highly confidential data that create barriers to participation. The procedure for matching of patient phenotypes, inspired by online dating services, uses an ontology-based semantic case matching algorithm with attribute weighting. We evaluate the capacity of the system using a curated reference data set and 19 clinician-entered cases, comparing four matching algorithms. We find that the inclusion of clinician weights can augment phenotype matching. © 2015 WILEY PERIODICALS, INC.
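
    The exact GeneYenta scoring function is not given in the abstract. As a deliberately simplified stand-in that only illustrates the idea of clinician-weighted phenotype overlap (the HPO term IDs, weights and normalisation are all hypothetical, and the ontology-distance part of the real algorithm is omitted):

```python
def weighted_match_score(query_terms, candidate_terms, weights):
    """Toy stand-in for attribute-weighted phenotype matching: score a
    candidate case by the clinician-assigned weights of the phenotype terms
    it shares with the query, normalised by the query's total weight. The
    real GeneYenta algorithm additionally uses ontology (HPO) relationships
    between non-identical terms, which are omitted here."""
    total = sum(weights.get(t, 1.0) for t in query_terms)
    shared = query_terms & candidate_terms
    return sum(weights.get(t, 1.0) for t in shared) / total if total else 0.0

# hypothetical HPO term sets and clinician weights (illustrative only)
query = {"HP:0001250", "HP:0001263", "HP:0000252"}
candidate = {"HP:0001250", "HP:0000252", "HP:0004322"}
weights = {"HP:0001250": 2.0, "HP:0001263": 1.0, "HP:0000252": 1.5}
print(round(weighted_match_score(query, candidate, weights), 2))
```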

  16. Advances in mathematical economics

    CERN Document Server

    Maruyama, Toru

    2015-01-01

    The series is designed to bring together those mathematicians who are seriously interested in getting new challenging stimuli from economic theories with those economists who are seeking effective mathematical tools for their research. A lot of economic problems can be formulated as constrained optimizations and equilibration of their solutions. Various mathematical theories have been supplying economists with indispensable machineries for these problems arising in economic theory. Conversely, mathematicians have been stimulated by various mathematical difficulties raised by economic theories.

  17. Advances in mathematical economics

    CERN Document Server

    Maruyama, Toru

    2014-01-01

    A lot of economic problems can be formulated as constrained optimizations and equilibration of their solutions. Various mathematical theories have been supplying economists with indispensable machineries for these problems arising in economic theory. Conversely, mathematicians have been stimulated by various mathematical difficulties raised by economic theories. The series is designed to bring together those mathematicians who are seriously interested in getting new challenging stimuli from economic theories with those economists who are seeking effective mathematical tools for their research.

  18. Advances in mathematical economics

    CERN Document Server

    Yamazaki, Akira

    2006-01-01

    A lot of economic problems can be formulated as constrained optimizations and equilibration of their solutions. Various mathematical theories have been supplying economists with indispensable machineries for these problems arising in economic theory. Conversely, mathematicians have been stimulated by various mathematical difficulties raised by economic theories. The series is designed to bring together those mathematicians who are seriously interested in getting new challenging stimuli from economic theories with those economists who are seeking effective mathematical tools for their research.

  19. Advances in mathematical economics

    CERN Document Server

    Yamazaki, Akira

    2006-01-01

    A lot of economic problems can be formulated as constrained optimizations and equilibration of their solutions. Various mathematical theories have been supplying economists with indispensable machineries for these problems arising in economic theory. Conversely, mathematicians have been stimulated by various mathematical difficulties raised by economic theories. The series is designed to bring together those mathematicians who are seriously interested in getting new challenging stimuli from economic theories with those economists who are seeking effective mathematical tools for their research.

  20. Advances in mathematical economics

    CERN Document Server

    Maruyama, Toru

    2017-01-01

    The series is designed to bring together those mathematicians who are seriously interested in getting new challenging stimuli from economic theories with those economists who are seeking effective mathematical tools for their research. A lot of economic problems can be formulated as constrained optimizations and equilibration of their solutions. Various mathematical theories have been supplying economists with indispensable machineries for these problems arising in economic theory. Conversely, mathematicians have been stimulated by various mathematical difficulties raised by economic theories.

  1. Advances in mathematical economics

    CERN Document Server

    Maruyama, Toru

    2016-01-01

    The series is designed to bring together those mathematicians who are seriously interested in getting new challenging stimuli from economic theories with those economists who are seeking effective mathematical tools for their research. A lot of economic problems can be formulated as constrained optimizations and equilibration of their solutions. Various mathematical theories have been supplying economists with indispensable machineries for these problems arising in economic theory. Conversely, mathematicians have been stimulated by various mathematical difficulties raised by economic theories.

  2. Dynamic contrast-enhanced MRI of the prostate. Comparison of two different post-processing algorithms

    International Nuclear Information System (INIS)

    Beyersdorff, Dirk; Franiel, T.; Luedemann, L.; Dietz, E.; Galler, D.; Marchot, P.

    2011-01-01

    Purpose: To evaluate the usefulness of a commercially available post-processing software tool for detecting prostate cancer on dynamic contrast-enhanced magnetic resonance imaging (MRI) and to compare the results to those obtained with a custom-made post-processing algorithm already tested under clinical conditions. Materials and Methods: Forty-eight patients with proven prostate cancer were examined by standard MRI supplemented by dynamic contrast-enhanced dual susceptibility contrast (DCE-DSC) MRI prior to prostatectomy. A custom-made post-processing algorithm was used to analyze the MRI data sets and the results were compared to those obtained using a post-processing algorithm from Invivo Corporation (Dyna CAD for Prostate) applied to dynamic T1-weighted images. Histology was used as the gold standard. Results: The sensitivity for prostate cancer detection was 78 % for the custom-made algorithm and 60 % for the commercial algorithm and the specificity was 79 % and 82 %, respectively. The accuracy was 79 % for our algorithm and 77.5 % for the commercial software tool. The chi-square test (McNemar-Bowker test) yielded no significant differences between the two tools (p = 0.06). Conclusion: The two investigated post-processing algorithms did not differ in terms of prostate cancer detection. The commercially available software tool allows reliable and fast analysis of dynamic contrast-enhanced MRI for the detection of prostate cancer. (orig.)

  3. A new multi-objective reserve constrained combined heat and power dynamic economic emission dispatch

    International Nuclear Information System (INIS)

    Niknam, Taher; Azizipanah-Abarghooee, Rasoul; Roosta, Alireza; Amiri, Babak

    2012-01-01

    Combined heat and power units are playing an ever increasing role in conventional power stations due to advantages such as reduced emissions and operational cost savings. This paper investigates a more practical formulation of the complex non-convex, non-smooth and non-linear multi-objective dynamic economic emission dispatch that incorporates combined heat and power units. Integrating these types of units, and their power ramp constraints, requires an efficient tool to cope with the joint characteristics of power and heat. Unlike previous approaches, the spinning reserve requirements of the system are explicitly formulated in the problem. A new multi-objective optimisation method based on an enhanced firefly algorithm is then proposed to achieve a set of non-dominated (Pareto-optimal) solutions. A new tuning parameter based on a chaotic mechanism and novel self-adaptive probabilistic mutation strategies are used to improve the overall performance of the algorithm. The numerical results demonstrate how the proposed framework can be applied in real-time studies. -- Highlights: ► Investigate a practical formulation of the DEED (Dynamic Economic Emission Dispatch). ► Consider combined heat and power units. ► Consider power ramp constraints. ► Consider the system spinning reserve requirements. ► Present a new multi-objective optimization based on an enhanced firefly algorithm.
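
    The enhanced firefly algorithm itself is not detailed in the abstract. For context, the basic firefly move it builds on can be sketched as follows (parameter values are illustrative defaults; the paper's chaotic tuning and self-adaptive mutation are not reproduced):

```python
import math
import random

def firefly_move(x_i, x_j, beta0=1.0, gamma=1.0, alpha=0.2):
    """Basic firefly move: firefly i is attracted towards a brighter firefly j
    with a strength that decays with their squared distance, plus a small
    random-walk term."""
    r2 = sum((a - b) ** 2 for a, b in zip(x_i, x_j))
    attraction = beta0 * math.exp(-gamma * r2)
    return [a + attraction * (b - a) + alpha * (random.random() - 0.5)
            for a, b in zip(x_i, x_j)]

# toy usage in two dimensions (all numbers arbitrary)
print(firefly_move([1.0, 2.0], [0.5, 1.5]))
```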

  4. Data structures and algorithm analysis in C++

    CERN Document Server

    Shaffer, Clifford A

    2011-01-01

    With its focus on creating efficient data structures and algorithms, this comprehensive text helps readers understand how to select or design the tools that will best solve specific problems. It uses Microsoft C++ as the programming language and is suitable for second-year data structure courses and computer science courses in algorithm analysis.Techniques for representing data are presented within the context of assessing costs and benefits, promoting an understanding of the principles of algorithm analysis and the effects of a chosen physical medium. The text also explores tradeoff issues, f

  5. Data structures and algorithm analysis in Java

    CERN Document Server

    Shaffer, Clifford A

    2011-01-01

    With its focus on creating efficient data structures and algorithms, this comprehensive text helps readers understand how to select or design the tools that will best solve specific problems. It uses Java as the programming language and is suitable for second-year data structure courses and computer science courses in algorithm analysis. Techniques for representing data are presented within the context of assessing costs and benefits, promoting an understanding of the principles of algorithm analysis and the effects of a chosen physical medium. The text also explores tradeoff issues, familiari

  6. Glint Field Trial Results and Application to Glint Threshold Distance Algorithm

    National Research Council Canada - National Science Library

    Chevalier, William

    1998-01-01

    .... Glint threshold algorithm. Software adjustments would tentatively be made to the existing algorithm to improve glint threshold distance calculation accuracy, making the modified model a better iterative eye armor design tool...

  7. Tools for signal compression applications to speech and audio coding

    CERN Document Server

    Moreau, Nicolas

    2013-01-01

    This book presents the tools and algorithms required to compress/uncompress signals such as speech and music. These algorithms are largely used in mobile phones, DVD players, HDTV sets, etc. In a first, rather theoretical, part, this book presents the standard tools used in compression systems: scalar and vector quantization, predictive quantization, transform quantization, entropy coding. In particular, we show the consistency between these different tools. The second part explains how these tools are used in the latest speech and audio coders. The third part gives Matlab programs simulating t

  8. THE IMPORTANCE OF LAW AND ECONOMICS IN THE CONTEMPORARY ECONOMIC REALITY

    OpenAIRE

    Pomaskow, Joanna

    2015-01-01

    The law and economics movement can improve the functioning of companies doing business in the contemporary, rapidly changing reality. The tensions between the idea of efficiency and the idea of justice cause difficulties in applying, in legal solutions, the tools proposed by the representatives of the law and economics movement. Economics proposes a new, fresh look at the law, which makes it easier to assess and influence the growth of its transparency. Perspective can therefor...

  9. Algorithms in invariant theory

    CERN Document Server

    Sturmfels, Bernd

    2008-01-01

    J. Kung and G.-C. Rota, in their 1984 paper, write: "Like the Arabian phoenix rising out of its ashes, the theory of invariants, pronounced dead at the turn of the century, is once again at the forefront of mathematics". The book of Sturmfels is both an easy-to-read textbook for invariant theory and a challenging research monograph that introduces a new approach to the algorithmic side of invariant theory. The Groebner bases method is the main tool by which the central problems in invariant theory become amenable to algorithmic solutions. Students will find the book an easy introduction to this "classical and new" area of mathematics. Researchers in mathematics, symbolic computation, and computer science will get access to a wealth of research ideas, hints for applications, outlines and details of algorithms, worked out examples, and research problems.

  10. Some multigrid algorithms for SIMD machines

    Energy Technology Data Exchange (ETDEWEB)

    Dendy, J.E. Jr. [Los Alamos National Lab., NM (United States)

    1996-12-31

    Previously a semicoarsening multigrid algorithm suitable for use on SIMD architectures was investigated. Through the use of new software tools, the performance of this algorithm has been considerably improved. The method has also been extended to three space dimensions. The method performs well for strongly anisotropic problems and for problems with coefficients jumping by orders of magnitude across internal interfaces. The parallel efficiency of this method is analyzed, and its actual performance on the CM-5 is compared with its performance on the CRAY-YMP. A standard coarsening multigrid algorithm is also considered, and we compare its performance on these two platforms as well.

  19. Erratum to ''Johnson's algorithm : A key to solve optimally or approximately flowshop scheduling problems with unavailability periods'' [International Journal of Production Economics 121 (2009) 81-87]

    OpenAIRE

    Rapine , Christophe

    2013-01-01

    International audience; In Allaoui H., Artiba A., ''Johnson's algorithm: A key to solve optimally or approximately flowshop scheduling problems with unavailability periods'' [International Journal of Production Economics 121 (2009)], the authors propose optimality conditions for the Johnson sequence in the presence of one unavailability period on the first machine and claim a performance guarantee of 2 when several unavailability periods may occur. We establish in this note that these condit...

  12. The Parallel SBAS-DInSAR algorithm: an effective and scalable tool for Earth's surface displacement retrieval

    Science.gov (United States)

    Zinno, Ivana; De Luca, Claudio; Elefante, Stefano; Imperatore, Pasquale; Manunta, Michele; Casu, Francesco

    2014-05-01

    Differential Synthetic Aperture Radar Interferometry (DInSAR) is an effective technique to estimate and monitor ground displacements with centimetre accuracy [1]. In the last decade, advanced DInSAR algorithms, such as the Small Baseline Subset (SBAS) [2] approach, which is aimed at following the temporal evolution of ground deformation, have proven to be significantly useful remote sensing tools for the geoscience communities as well as for those related to hazard monitoring and risk mitigation. The DInSAR scenario is currently characterized by a large and steadily increasing availability of huge SAR data archives that have a broad range of diversified features according to the characteristics of the employed sensor. Indeed, besides the old-generation sensors, which include the ERS, ENVISAT and RADARSAT systems, the new generation of X-band constellations, such as COSMO-SkyMed and TerraSAR-X, has permitted an overall study of ground deformations with unprecedented detail thanks to their improved spatial resolution and reduced revisit time. Furthermore, the incoming ESA Sentinel-1 SAR satellite is characterized by a global coverage acquisition strategy and a 12-day revisit time and will therefore further contribute to improving deformation analysis and monitoring capabilities. However, in this context, the capability to process such huge SAR data archives is strongly limited by the existing DInSAR algorithms, which are not specifically designed to exploit modern high performance computational infrastructures (e.g. cluster, grid and cloud computing platforms). The goal of this paper is to present a parallel version of the SBAS algorithm (P-SBAS) which is based on a dual-level parallelization approach and embraces combined parallel strategies [3], [4]. A detailed description of the P-SBAS algorithm will be provided together with a scalability analysis focused on studying its performance. In particular, a P-SBAS scalability analysis with respect to the number of exploited CPUs has

  13. Genetic algorithm essentials

    CERN Document Server

    Kramer, Oliver

    2017-01-01

    This book introduces readers to genetic algorithms (GAs) with an emphasis on making the concepts, algorithms, and applications discussed as easy to understand as possible. Further, it avoids a great deal of formalisms and thus opens the subject to a broader audience in comparison to manuscripts overloaded by notations and equations. The book is divided into three parts, the first of which provides an introduction to GAs, starting with basic concepts like evolutionary operators and continuing with an overview of strategies for tuning and controlling parameters. In turn, the second part focuses on solution space variants like multimodal, constrained, and multi-objective solution spaces. Lastly, the third part briefly introduces theoretical tools for GAs, the intersections and hybridizations with machine learning, and highlights selected promising applications.

  14. Conducting systematic reviews of economic evaluations.

    Science.gov (United States)

    Gomersall, Judith Streak; Jadotte, Yuri Tertilus; Xue, Yifan; Lockwood, Suzi; Riddle, Dru; Preda, Alin

    2015-09-01

    In 2012, a working group was established to review and enhance the Joanna Briggs Institute (JBI) guidance for conducting systematic reviews of evidence from economic evaluations addressing questions about health intervention cost-effectiveness. The objective is to present the outcomes of the working group. The group conducted three activities to inform the new guidance: a review of the literature on the utility/futility of systematic reviews of economic evaluations and consideration of its implications for updating the existing methodology; an assessment of the critical appraisal tool in the existing guidance against criteria that promote validity in economic evaluation research and against two other commonly used tools; and a workshop. The debate in the literature on the limitations/value of systematic reviews of economic evidence cautions that such reviews are unlikely to generate one-size-fits-all answers to questions about the cost-effectiveness of interventions and their comparators. Informed by this finding, the working group adjusted the framing of the objectives definition in the existing JBI methodology. The shift is away from defining the objective as determining a single cost-effectiveness measure and toward summarizing study estimates of cost-effectiveness and, informed by consideration of the included study characteristics (patient, setting, intervention components, etc.), identifying conditions conducive to lowering costs and maximizing health benefits. The existing critical appraisal tool was included in the new guidance. The new guidance also recommends that a tool designed specifically for appraising model-based studies be used together with the generic appraisal tool for economic evaluations when evaluating model-based evaluations. The guidance produced by the group offers reviewers guidance for each step of the systematic review process, which are the same steps followed in JBI reviews of other

  15. Fuels planning: science synthesis and integration; economic uses fact sheet 03: economic impacts of fuel treatments

    Science.gov (United States)

    Rocky Mountain Research Station USDA Forest Service

    2004-01-01

    With increased interest in reducing hazardous fuels in dry inland forests of the American West, agencies and the public will want to know the economic impacts of fuel reduction treatments. This fact sheet discusses the economic impact tool, a component of My Fuel Treatment Planner, for evaluating economic impacts.

  16. Methodology for evaluation of economic security of industrial enterprises

    OpenAIRE

    Kopytko Marta Ivanovna

    2014-01-01

    This paper investigates the features of evaluating the economic security of industrial enterprises: an algorithm for the complex evaluation of the economic security of industrial enterprises over time, a system of criteria with their limit values, and the dynamics of their change, used to determine the level of economic security of an industrial enterprise in terms of its components.

  17. Unified Lambert Tool for Massively Parallel Applications in Space Situational Awareness

    Science.gov (United States)

    Woollands, Robyn M.; Read, Julie; Hernandez, Kevin; Probe, Austin; Junkins, John L.

    2018-03-01

    This paper introduces a parallel-compiled tool that combines several of our recently developed methods for solving the perturbed Lambert problem using modified Chebyshev-Picard iteration. This tool (unified Lambert tool) consists of four individual algorithms, each of which is unique and better suited for solving a particular type of orbit transfer. The first is a Keplerian Lambert solver, which is used to provide a good initial guess (warm start) for solving the perturbed problem. It is also used to determine the appropriate algorithm to call for solving the perturbed problem. The arc length or true anomaly angle spanned by the transfer trajectory is the parameter that governs the automated selection of the appropriate perturbed algorithm, and is based on the respective algorithm convergence characteristics. The second algorithm solves the perturbed Lambert problem using the modified Chebyshev-Picard iteration two-point boundary value solver. This algorithm does not require a Newton-like shooting method and is the most efficient of the perturbed solvers presented herein, however the domain of convergence is limited to about a third of an orbit and is dependent on eccentricity. The third algorithm extends the domain of convergence of the modified Chebyshev-Picard iteration two-point boundary value solver to about 90% of an orbit, through regularization with the Kustaanheimo-Stiefel transformation. This is the second most efficient of the perturbed set of algorithms. The fourth algorithm uses the method of particular solutions and the modified Chebyshev-Picard iteration initial value solver for solving multiple revolution perturbed transfers. This method does require "shooting" but differs from Newton-like shooting methods in that it does not require propagation of a state transition matrix. The unified Lambert tool makes use of the General Mission Analysis Tool and we use it to compute thousands of perturbed Lambert trajectories in parallel on the Space Situational

  18. Drinking Water Consequences Tools. A Literature Review

    Energy Technology Data Exchange (ETDEWEB)

    Pasqualini, Donatella [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-12

    In support of the goals of the Department of Homeland Security's (DHS) National Protection and Programs Directorate and the Federal Emergency Management Agency, the DHS Office of Science and Technology is seeking to develop and/or modify consequence assessment tools to enable drinking water system owners/operators to estimate the societal and economic consequences of drinking water disruption due to threats and hazards. This work will expand the breadth of consequence estimation methods and tools using the best available data describing water distribution infrastructure, owner/asset-level economic losses, regional-scale economic activity, and health. In addition, this project will deploy the consequence methodology and capability within a web-based platform. This report is intended to support the DHS effort by providing a literature review of existing tools for assessing the consequences of disruptions to water and wastewater systems. The review includes tools that assess water system resilience, vulnerability, and risk. This will help in understanding the gaps and limitations of these tools in order to plan for the development of the next-generation consequence tool for water and wastewater system disruption.

  19. A dataflow analysis tool for parallel processing of algorithms

    Science.gov (United States)

    Jones, Robert L., III

    1993-01-01

    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on a set of identical parallel processors. Typical applications include signal processing and control law problems. Graph analysis techniques are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool is shown to facilitate the application of the design process to a given problem.

  20. Macro-economic environmental models

    International Nuclear Information System (INIS)

    Wier, M.

    1993-01-01

    In the present report, an introduction to macro-economic environmental models is given. The role of the models as a tool for policy analysis is discussed. Future applications, as well as the limitations given by the data, are brought into focus. The economic-ecological system is described. A set of guidelines for implementation of the system in a traditional economic macro-model is proposed. The characteristics of empirical national and international environmental macro-economic models to date are highlighted. Special attention is paid to the main economic causalities and their consequences for the environmental policy recommendations set by the models. (au) (41 refs.)

  1. A Linear Algorithm for Black Scholes Economic Model

    Directory of Open Access Journals (Sweden)

    Dumitru FANACHE

    2008-01-01

    Full Text Available The pricing of options is a very important problem encountered in the financial domain. The famous Black-Scholes model provides an explicit closed-form solution for the values of certain (European-style) call and put options. But for many other options, either there is no closed-form solution, or, if such closed-form solutions exist, the formulas exhibiting them are complicated and difficult to evaluate accurately by conventional methods. The aim of this paper is to study the possibility of obtaining the numerical solution of the Black-Scholes equation in parallel, by means of several processors, using the finite difference method. A comparison between the complexity of the parallel algorithm and the serial one is given.
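
    The paper's parallel scheme is not reproduced in the abstract. A serial sketch of the kind of explicit finite-difference pricing it parallelises, for a European call (grid sizes, boundary choices and the final interpolation are implementation assumptions, not taken from the paper), could look like this:

```python
import math

def bs_call_explicit_fd(S0, K, r, sigma, T, M=100, N=10000):
    """European call priced with an explicit finite-difference scheme for the
    Black-Scholes PDE (a plain serial sweep). M is the number of price steps,
    N the number of time steps; N must be large enough for stability."""
    S_max = 4.0 * K
    dS, dt = S_max / M, T / N
    V = [max(i * dS - K, 0.0) for i in range(M + 1)]     # payoff at maturity
    for n in range(1, N + 1):
        tau = n * dt                                     # time to maturity at the new level
        V_new = [0.0] * (M + 1)
        V_new[0] = 0.0                                   # call is worthless at S = 0
        V_new[M] = S_max - K * math.exp(-r * tau)        # deep in-the-money boundary
        for i in range(1, M):
            a = 0.5 * dt * (sigma ** 2 * i ** 2 - r * i)
            b = 1.0 - dt * (sigma ** 2 * i ** 2 + r)
            c = 0.5 * dt * (sigma ** 2 * i ** 2 + r * i)
            V_new[i] = a * V[i - 1] + b * V[i] + c * V[i + 1]
        V = V_new
    i = int(S0 / dS)                                     # linear interpolation at S0
    w = (S0 - i * dS) / dS
    return (1.0 - w) * V[i] + w * V[min(i + 1, M)]

# S0=100, K=100, r=5%, sigma=20%, T=1y: the closed-form value is about 10.45
print(round(bs_call_explicit_fd(100.0, 100.0, 0.05, 0.2, 1.0), 2))
```

    In a parallel version of such a sweep, each processor could own a contiguous block of the price grid and exchange only its boundary values with neighbouring processors at every time step.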

  2. Optimal Wind Turbines Micrositing in Onshore Wind Farms Using Fuzzy Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Jun Yang

    2015-01-01

    Full Text Available With the fast growth in the number and size of installed wind farms (WFs) around the world, optimal wind turbine (WT) micrositing has become a challenge from both technological and mathematical points of view. An appropriate layout of wind turbines is crucial to obtain adequate performance with respect to the development and operation of the wind power plant during its life span. This work presents a fuzzy genetic algorithm (FGA) for maximizing the economic profitability of the project. The algorithm considers a new WF model including several factors important to the design of the layout. The model consists of wake loss, terrain effect, and economic benefits, which can be calculated from the locations of the wind turbines. The results demonstrate that the algorithm performs better than a genetic algorithm in terms of the maximum net annual value of the wind power plant and the computational burden.

  3. Template Generation and Selection Algorithms

    NARCIS (Netherlands)

    Guo, Y.; Smit, Gerardus Johannes Maria; Broersma, Haitze J.; Heysters, P.M.; Badaway, W.; Ismail, Y.

    The availability of high-level design entry tooling is crucial for the viability of any reconfigurable SoC architecture. This paper presents a template generation method to extract functional equivalent structures, i.e. templates, from a control data flow graph. By inspecting the graph the algorithm

  4. MODELLING OF TOURISM SERVICE DYNAMICS UNDER THE INFLUENCE OF ECONOMIC PATTERN OF SOCIETY

    Directory of Open Access Journals (Sweden)

    Lesya Buyak

    2016-11-01

    Full Text Available Tourism as a phenomenon of social life is a derivative of social development. Its appearance is attributed to the industrial stage of human development, which was characterized by accelerated development of the productive forces, a deepening division of labour, and the development of urbanization processes. Accelerated innovation related to scientific and technological progress contributed to the overall socio-economic development of certain countries, improved the living standards of their populations, and changed the nature of work and ways of life, especially in the XX century, through urbanization and changes in the settlement system, the post-industrial phase of economic development, a deepening comprehension of environmental issues and the global dimension of humanity, and the humanization of all spheres of public life. The increase in tourist flows in all regions complicates the management of enterprises and partnership schemes in tourism and is accompanied by rising levels of consumer education, and therefore by higher quality requirements for the end tourism product and a rapid increase in supply; there is thus a need to study the characteristics of consumer behaviour, to search for existing reserves to build the capacity of individual enterprises, and to isolate and effectively use methods and tools that influence the choices of consumers. The development and implementation of an effective mechanism for the formation of market supply requires an assessment of consumer behaviour through quantitative and qualitative indicators. The rapid development of tourism, of course, helps determine the types and methods of calculating these indicators. These problems and targeted research are considered in this article. The subject of research is the concepts and tools of analysis and mathematical modelling of the economic structure of society in the dynamics of tourist services. The research methodology comprises economic and mathematical models, algorithms and processes

  5. Formation of the Regional System of Small and Medium Enterprises in the Current Economic Conditions

    Directory of Open Access Journals (Sweden)

    Sergey Aleksandrovich Korobov

    2016-10-01

    Full Text Available In connection with the growing importance of small and medium enterprises as a crucial element of an innovation-oriented economy, the implementation of measures to support and promote small and medium enterprises at the regional level should be based on the rational development of the resources available to regional authorities. Therefore, for the development and adoption of effective (rational) management decisions on the development of small and medium business, it is important to use cognitive tools of analysis - modern technologies of system analysis. The article assesses government measures on the formation of a regional system for the development of small and medium enterprises using four criteria proposed by the author; provides a cognitive map of the interaction of resources in the formation of such a regional system; and presents an algorithm for the formation of a regional system of small and medium business development. The study is based on a comprehensive and comparative analysis of state measures for the formation of a regional system of small and medium enterprise development in the context of the resource-oriented approach, and on graphical analysis within cognitive modelling of the causal relationships between the resources available to regional authorities and the stages of formation of the regional system of development of small and medium enterprises in modern economic conditions, represented in the form of an algorithm. The author concludes that the tools of cognitive analysis can be successfully applied in the formation of a regional system for the development of small and medium enterprises, as they make it possible to ensure the maximum socio-economic efficiency of harnessing the region's resources.

  6. Using Economic Impact Models as an Educational Tool in Community Economic Development Programming: Lessons from Pennsylvania and Wisconsin.

    Science.gov (United States)

    Shields, Martin; Deller, Steven C.

    2003-01-01

    Outlines an educational process designed to help provide communities with economic, social, and political information using community economic impact modeling. Describes the process of community meetings using economic impact, community demographics, and fiscal impact modules and the local preconditions that help make the process successful. (SK)

  7. Evaluation of train-speed control algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Slavik, M.M. [BKS Advantech (Pty.) Ltd., Pretoria (South Africa)

    2000-07-01

    A relatively simple and fast simulator has been developed and used for the preliminary testing of train cruise-control algorithms. The simulation is done in software on a PC. The simulator is used to gauge the consequences and feasibility of a cruise-control strategy prior to more elaborate testing and evaluation. The tool was used to design and pre-test a train-cruise control algorithm called NSS, which does not require knowledge of exact train mass, vertical alignment, or actual braking force. Only continuous measurements on the speed of the train and electrical current are required. With this modest input, the NSS algorithm effected speed changes smoothly and efficiently for a wide range of operating conditions. (orig.)

  8. Quantum algorithms for computational nuclear physics

    Directory of Open Access Journals (Sweden)

    Višňák Jakub

    2015-01-01

    Full Text Available While quantum algorithms have been studied as an efficient tool for stationary-state energy determination in the case of molecular quantum systems, no similar study of analogous problems in computational nuclear physics (computation of energy levels of nuclei from empirical nucleon-nucleon or quark-quark potentials) has been realized yet. Although the difference between the above-mentioned studies might seem negligible, it will be examined. First steps towards a particular simulation (on a classical computer) of the Iterative Phase Estimation Algorithm for deuterium and tritium nuclei energy level computation will be carried out with the aim of proving the feasibility of the algorithm (and its extensibility to heavier nuclei) for its possible practical realization on a real quantum computer.

  9. Reliability concepts applied to cutting tool change time

    Energy Technology Data Exchange (ETDEWEB)

    Patino Rodriguez, Carmen Elena, E-mail: cpatino@udea.edu.c [Department of Industrial Engineering, University of Antioquia, Medellin (Colombia); Department of Mechatronics and Mechanical Systems, Polytechnic School, University of Sao Paulo, Sao Paulo (Brazil); Francisco Martha de Souza, Gilberto [Department of Mechatronics and Mechanical Systems, Polytechnic School, University of Sao Paulo, Sao Paulo (Brazil)

    2010-08-15

    This paper presents a reliability-based analysis for calculating critical tool life in machining processes. It is possible to determine the running time for each tool involved in the process by obtaining the operations sequence for the machining procedure. Usually, the reliability of an operation depends on three independent factors: operator, machine-tool and cutting tool. The reliability of a part manufacturing process is mainly determined by the cutting time for each job and by the sequence of operations, defined by the series configuration. An algorithm is presented to define when the cutting tool must be changed. The proposed algorithm is used to evaluate the reliability of a manufacturing process composed of turning and drilling operations. The reliability of the turning operation is modeled based on data presented in the literature, and from experimental results, a statistical distribution of drilling tool wear was defined, and the reliability of the drilling process was modeled.
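
    As an illustration of the series-configuration reasoning described above, a minimal sketch of how per-operation tool reliabilities could combine and drive a tool-change decision follows (the Weibull parameters and the reliability threshold are invented, not the values fitted in the paper):

```python
import math

def weibull_reliability(t, beta, eta):
    """Probability that a tool survives a total cutting time t under a
    Weibull life model with shape beta and scale eta."""
    return math.exp(-((t / eta) ** beta))

def process_reliability(operations):
    """Series configuration: the part is conforming only if the tool of every
    operation survives its cutting time, so the reliabilities multiply."""
    r = 1.0
    for cutting_time, beta, eta in operations:
        r *= weibull_reliability(cutting_time, beta, eta)
    return r

def parts_before_tool_change(time_per_part, beta, eta, r_min=0.90):
    """Number of consecutive parts one tool can cut before its reliability
    over the accumulated cutting time drops below the threshold r_min."""
    n = 0
    while weibull_reliability((n + 1) * time_per_part, beta, eta) >= r_min:
        n += 1
    return n

# hypothetical turning (2.5 min/part) and drilling (1.0 min/part) operations;
# all Weibull parameters are illustrative, not the fitted values from the paper
ops = [(2.5, 2.0, 60.0), (1.0, 1.8, 45.0)]
print(round(process_reliability(ops), 4))
print(parts_before_tool_change(2.5, 2.0, 60.0))
```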

  10. Reliability concepts applied to cutting tool change time

    International Nuclear Information System (INIS)

    Patino Rodriguez, Carmen Elena; Francisco Martha de Souza, Gilberto

    2010-01-01

    This paper presents a reliability-based analysis for calculating critical tool life in machining processes. It is possible to determine the running time for each tool involved in the process by obtaining the operations sequence for the machining procedure. Usually, the reliability of an operation depends on three independent factors: operator, machine-tool and cutting tool. The reliability of a part manufacturing process is mainly determined by the cutting time for each job and by the sequence of operations, defined by the series configuration. An algorithm is presented to define when the cutting tool must be changed. The proposed algorithm is used to evaluate the reliability of a manufacturing process composed of turning and drilling operations. The reliability of the turning operation is modeled based on data presented in the literature, and from experimental results, a statistical distribution of drilling tool wear was defined, and the reliability of the drilling process was modeled.

  11. Management quality of tools in the planned housing casting

    Directory of Open Access Journals (Sweden)

    Jaworski J.

    2007-01-01

    Full Text Available The Kaizen method of housing casting design is presented in the paper. An algorithm that makes it possible to identify the tools limiting the efficiency of the tooling process is formulated. A tool management system consisting of various components is shown.

  12. Reliability Based Spare Parts Management Using Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Rahul Upadhyay

    2015-08-01

    Full Text Available Effective and efficient inventory management is the key to the economic sustainability of capital-intensive modern industries. Inventory grows exponentially with the complexity and size of the equipment fleet. A substantial amount of capital is required for maintaining an inventory, and therefore its optimization is beneficial for smooth operation of the project at minimum inventory cost. The size, and hence the cost, of the inventory is influenced by a large number of factors, which makes the optimization problem complex. This work presents a model to solve the problem of optimizing the spare parts inventory. The novelty of this study lies in the fact that the developed method could tackle not only an artificial test case but also a real-world industrial problem. Various investigators have developed methods and semi-analytical tools for obtaining optimum solutions for this problem. In this study, a non-traditional optimization tool, namely genetic algorithms (GA), is utilized. Apart from this, Cox's regression analysis is also used to incorporate the effect of some environmental factors on the demand for spares. This shows the efficacy and applicability of non-traditional optimization tools like GA to solve these problems. The research illustrates the proposed model with the analysis of data taken from a fleet of dumpers operated in a large surface coal mine. The optimum time schedules suggested by this GA-based model are found to be cost effective. A sensitivity analysis is also conducted for this industrial problem. An objective function is developed, and factors like the effect of season and production pressure (overloading towards the financial year-end) are included in the equations. Statistical analysis of the collected operational and performance data was carried out with the help of Easy-Fit Ver-5.5. The analysis gives the shape and scale parameters of the theoretical Weibull distribution. The Cox's regression coefficient corresponding to excessive loading
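
    None of the fitted parameters are given in the abstract. As a purely illustrative sketch of how a Weibull life model and a Cox-style covariate multiplier could feed an expected spare-demand estimate (every number and name below is made up):

```python
import math

def weibull_mtbf(beta, eta):
    """Mean time between failures for a Weibull(beta, eta) life model."""
    return eta * math.gamma(1.0 + 1.0 / beta)

def expected_spares(fleet_size, horizon_hours, beta, eta, covariate_factor=1.0):
    """Rough expected spare demand over a planning horizon: fleet operating
    hours divided by the MTBF, scaled by a multiplicative covariate factor
    (e.g. season or year-end production pressure) in the spirit of a Cox
    proportional-hazards adjustment."""
    return covariate_factor * fleet_size * horizon_hours / weibull_mtbf(beta, eta)

# every number here is invented for illustration only
print(round(expected_spares(fleet_size=20, horizon_hours=720,
                            beta=1.6, eta=900.0, covariate_factor=1.25), 1))
```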

  13. jClustering, an open framework for the development of 4D clustering algorithms.

    Directory of Open Access Journals (Sweden)

    José María Mateos-Pérez

    Full Text Available We present jClustering, an open framework for the design of clustering algorithms in dynamic medical imaging. We developed this tool because of the difficulty involved in manually segmenting dynamic PET images and the lack of availability of source code for published segmentation algorithms. Providing an easily extensible open tool encourages publication of source code to facilitate the process of comparing algorithms and provide interested third parties with the opportunity to review code. The internal structure of the framework allows an external developer to implement new algorithms easily and quickly, focusing only on the particulars of the method being implemented and not on image data handling and preprocessing. This tool has been coded in Java and is presented as an ImageJ plugin in order to take advantage of all the functionalities offered by this imaging analysis platform. Both binary packages and source code have been published, the latter under a free software license (GNU General Public License to allow modification if necessary.

  14. Economics-based optimal control of greenhouse tomato crop production

    NARCIS (Netherlands)

    Tap, F.

    2000-01-01

    The design and testing of an optimal control algorithm, based on scientific models of greenhouse and tomato crop and an economic criterion (goal function), to control greenhouse climate, is described. An important characteristic of this control is that it aims at maximising an economic

  15. Comparison of AI techniques to solve combined economic emission dispatch problem with line flow constraints

    Energy Technology Data Exchange (ETDEWEB)

    Jacob Raglend, I. [School of Electrical Sciences, Noorul Islam University, Kumaracoil 629 180 (India); Veeravalli, Sowjanya; Sailaja, Kasanur; Sudheera, B. [School of Electrical Sciences, Vellore Institute of Technology, Vellore 632 004 (India); Kothari, D.P. [FNAE, FNASC, SMIEEE, Vellore Institute of Technology University, Vellore 632 014 (India)

    2010-07-15

    A comparative study has been made of the solutions obtained for the combined economic emission dispatch (CEED) problem with line flow constraints, using different intelligent techniques, for the regulated power system so as to ensure a practical, economical and secure generation schedule. The objective of the paper is to minimize the total production cost of the power generation. Economic load dispatch (ELD) and economic emission dispatch (EED) have been applied to obtain the optimal fuel cost of generating units. Combined economic emission dispatch (CEED) is obtained by considering both the economic and the emission objectives. This bi-objective CEED problem is converted into a single objective function using the price penalty factor approach. In this paper, intelligent techniques such as genetic algorithm (GA), evolutionary programming (EP), particle swarm optimization (PSO) and differential evolution (DE) are applied to obtain CEED solutions for the IEEE 30-bus system and a 15-unit system. The proposed algorithm introduces an efficient CEED approach that obtains the minimum operating cost while satisfying unit, emission and network constraints. The proposed algorithm has been tested on two sample systems, viz. the IEEE 30-bus system and a 15-unit system. The results obtained by the various artificial intelligence techniques are compared with respect to solution time, total production cost and convergence criteria. The solutions obtained are quite encouraging and useful in the economic emission environment. The algorithm and simulation are carried out using Matlab software. (author)
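
    As a rough illustration of the price penalty factor approach mentioned above, each unit's emission curve can be weighted by the ratio of its maximum fuel cost to its maximum emission and added to the fuel cost, giving a single objective; the quadratic coefficients, unit limits and candidate dispatch below are invented numbers, not the IEEE 30-bus or 15-unit data.

```python
import numpy as np

# Assumed quadratic fuel-cost and emission curves for three units: a + b*P + c*P^2
fuel  = [(100, 2.00, 0.010), (120, 1.80, 0.012), (90, 2.20, 0.008)]   # $/h
emis  = [(10, 0.30, 0.0020), (12, 0.25, 0.0025), (8, 0.35, 0.0015)]   # kg/h
p_max = [200.0, 180.0, 220.0]                                         # MW limits

def curve(coeffs, p):
    a, b, c = coeffs
    return a + b * p + c * p * p

# Price penalty factor h_i = fuel cost at full output / emission at full output
h = [curve(fuel[i], p_max[i]) / curve(emis[i], p_max[i]) for i in range(3)]

def combined_objective(p):
    """Single-objective CEED: fuel cost plus penalty-weighted emission."""
    return sum(curve(fuel[i], p[i]) + h[i] * curve(emis[i], p[i]) for i in range(3))

p = np.array([150.0, 120.0, 160.0])   # a candidate dispatch (MW)
print("Penalty factors:", [round(x, 2) for x in h])
print("Combined cost  :", round(combined_objective(p), 1))
```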

  16. Comparison of AI techniques to solve combined economic emission dispatch problem with line flow constraints

    International Nuclear Information System (INIS)

    Jacob Raglend, I.; Veeravalli, Sowjanya; Sailaja, Kasanur; Sudheera, B.; Kothari, D.P.

    2010-01-01

    A comparative study has been made of the solutions obtained for the combined economic emission dispatch (CEED) problem with line flow constraints, using different intelligent techniques, for the regulated power system so as to ensure a practical, economical and secure generation schedule. The objective of the paper is to minimize the total production cost of the power generation. Economic load dispatch (ELD) and economic emission dispatch (EED) have been applied to obtain the optimal fuel cost of generating units. Combined economic emission dispatch (CEED) is obtained by considering both the economic and the emission objectives. This bi-objective CEED problem is converted into a single objective function using the price penalty factor approach. In this paper, intelligent techniques such as genetic algorithm (GA), evolutionary programming (EP), particle swarm optimization (PSO) and differential evolution (DE) are applied to obtain CEED solutions for the IEEE 30-bus system and a 15-unit system. The proposed algorithm introduces an efficient CEED approach that obtains the minimum operating cost while satisfying unit, emission and network constraints. The proposed algorithm has been tested on two sample systems, viz. the IEEE 30-bus system and a 15-unit system. The results obtained by the various artificial intelligence techniques are compared with respect to solution time, total production cost and convergence criteria. The solutions obtained are quite encouraging and useful in the economic emission environment. The algorithm and simulation are carried out using Matlab software. (author)

  17. Efficient algorithms for flow simulation related to nuclear reactor safety

    International Nuclear Information System (INIS)

    Gornak, Tatiana

    2013-01-01

    Safety analysis is of ultimate importance for operating Nuclear Power Plants (NPP). The overall modeling and simulation of physical and chemical processes occurring in the course of an accident is an interdisciplinary problem and has origins in fluid dynamics, numerical analysis, reactor technology and computer programming. The aim of the study is therefore to create the foundations of a multi-dimensional non-isothermal fluid model for an NPP containment and a software tool based on it. The numerical simulations allow one to analyze and predict the behavior of NPP systems under different working and accident conditions, to develop proper action plans for minimizing the risks of accidents, and/or to minimize the consequences of possible accidents. A very large number of scenarios have to be simulated, and at the same time acceptable accuracy for the critical parameters, such as radioactive pollution, temperature, etc., has to be achieved. The existing software tools are either too slow or not accurate enough. This thesis deals with developing customized algorithms and software tools for the simulation of isothermal and non-isothermal flows in a containment pool of an NPP. Requirements for such software are formulated, and appropriate algorithms are presented. The goal of the work is to achieve a balance between accuracy and speed of calculation and to develop a customized algorithm for this special case. Different discretization and solution approaches are studied, and those which correspond best to the formulated goal are selected, adjusted and, where possible, analysed. A fast directional splitting algorithm for the Navier-Stokes equations in complicated geometries, in the presence of solid and porous obstacles, is at the core of the approach. Developing a suitable pre-processor and customized domain decomposition algorithms is an essential part of the overall algorithm and software. Results from numerical simulations in test geometries and in real geometries are presented and discussed.

  18. Lot-sizing algorithms with applications to engineering and economics

    DEFF Research Database (Denmark)

    Vidal, Rene Victor Valqui; Ferreira, Jose S.

    1984-01-01

    of time-varying parameters. A comparison of the efficiency of the new solution procedures with well-known methods is developed. New applications of the techniques described within the fields of engineering (optimal design of a pump-pipe system) and economics (a model for import-planning) are referred to...

  19. Interior point algorithms theory and analysis

    CERN Document Server

    Ye, Yinyu

    2011-01-01

    The first comprehensive review of the theory and practice of one of today's most powerful optimization techniques. The explosive growth of research into and development of interior point algorithms over the past two decades has significantly improved the complexity of linear programming and yielded some of today's most sophisticated computing techniques. This book offers a comprehensive and thorough treatment of the theory, analysis, and implementation of this powerful computational tool. Interior Point Algorithms provides detailed coverage of all basic and advanced aspects of the subject.

  20. Secured Economic Dispatch Algorithm using GSDF Matrix

    Directory of Open Access Journals (Sweden)

    Slimane SOUAG

    2014-02-01

    Full Text Available In this paper we present a new method for solving the secured economic dispatch problem using the DC power flow method and Generation Shift Distribution Factors (GSDF). A graphical interface in LabVIEW has been created as a virtual instrument. The DC power flow reduces the power flow problem to a set of linear equations, which makes the iterative calculation very fast, and the GSDF matrix captures the effect of single and multiple generator MW changes on the transmission lines. The effectiveness of the method developed is demonstrated through its application to an IEEE 14-bus test system. The calculation results show excellent performance of the proposed method with regard to computation time and quality of results.
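
    A small sketch of the two linear-algebra ingredients, DC power flow and the GSDF matrix, on an invented 3-bus network; the line reactances, slack-bus choice and sign conventions are assumptions for illustration, not the paper's IEEE 14-bus case or its LabVIEW implementation.

```python
import numpy as np

# 3-bus example: lines (from, to, reactance x in p.u.); bus 0 is taken as the slack
lines = [(0, 1, 0.10), (1, 2, 0.20), (0, 2, 0.25)]
n_bus = 3

# Build the susceptance matrix B and remove the slack row/column
B = np.zeros((n_bus, n_bus))
for i, j, x in lines:
    b = 1.0 / x
    B[i, i] += b; B[j, j] += b
    B[i, j] -= b; B[j, i] -= b
B_red = B[1:, 1:]

def dc_flows(p_inject):
    """Line flows (p.u.) for a vector of bus injections; the slack absorbs the balance."""
    theta = np.zeros(n_bus)
    theta[1:] = np.linalg.solve(B_red, p_inject[1:])
    return np.array([(theta[i] - theta[j]) / x for i, j, x in lines])

# GSDF: change in each line flow per 1 p.u. injection change at a bus,
# compensated by the slack bus
base = dc_flows(np.zeros(n_bus))
gsdf = np.zeros((len(lines), n_bus))
for k in range(1, n_bus):
    delta = np.zeros(n_bus)
    delta[k] = 1.0
    gsdf[:, k] = dc_flows(delta) - base
print("GSDF matrix (lines x buses):\n", np.round(gsdf, 3))
```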

  1. Sensitivity analysis for photovoltaic water pumping systems: Energetic and economic studies

    International Nuclear Information System (INIS)

    Yahyaoui, Imene; Atieh, Ahmad; Serna, Alvaro; Tadeo, Fernando

    2017-01-01

    Highlights: • An algorithm for sizing PV water pumping components is studied in depth. • The strategy ensures the system autonomy and the pumping of the needed water. • The algorithm is tested with measured data and compared with the results of HOMER. • An economic study of systems equipped with a diesel generator in three countries is detailed. - Abstract: In agricultural remote areas where electrical energy is required to supply water pumping plants, photovoltaic modules are considered a good option to generate electricity. The reliability of autonomous photovoltaic water pumping plants depends essentially on the size of the system components, which should meet the criteria related to the plant autonomy and the water volume required for irrigation. In this context, this research paper proposes an approach to size the elements of an autonomous photovoltaic system equipped with an energy storage device (a battery bank), which is used to supply a water-pumping plant with electricity. The proposed approach determines the optimal surface of the photovoltaic modules, the optimal capacity of the battery bank and the volume of the water storage tank. The optimization approach takes into account the monthly average solar radiation, the fulfillment of the water needed for the crops’ irrigation and the number of days of autonomy. Measured climatic data for 10 ha situated in Northern Tunisia and planted with tomato are used in the optimization process, which is conducted over the tomato vegetative cycle (from March to July). The optimal results achieved for this farm are 101.5 m² of photovoltaic module surface, 1680 A h/12 V for the battery bank and 1800 m³ for the volume of the water storage tank. Then, to verify the reliability of the proposed optimization approach, the results of the proposed sizing algorithm are compared with those of a commercial optimization tool named HOMER; the comparison shows better results for the proposed approach. Finally, the economic reliability of the

  2. Exact and Heuristic Algorithms for Runway Scheduling

    Science.gov (United States)

    Malik, Waqar A.; Jung, Yoon C.

    2016-01-01

    This paper explores the Single Runway Scheduling (SRS) problem with arrivals, departures, and crossing aircraft on the airport surface. Constraints for wake vortex separations, departure area navigation separations and departure time window restrictions are explicitly considered. The main objective of this research is to develop exact and heuristic based algorithms that can be used in real-time decision support tools for Air Traffic Control Tower (ATCT) controllers. The paper provides a multi-objective dynamic programming (DP) based algorithm that finds the exact solution to the SRS problem, but may prove unusable for application in real-time environment due to large computation times for moderate sized problems. We next propose a second algorithm that uses heuristics to restrict the search space for the DP based algorithm. A third algorithm based on a combination of insertion and local search (ILS) heuristics is then presented. Simulation conducted for the east side of Dallas/Fort Worth International Airport allows comparison of the three proposed algorithms and indicates that the ILS algorithm performs favorably in its ability to find efficient solutions and its computation times.

  3. ViSAPy: a Python tool for biophysics-based generation of virtual spiking activity for evaluation of spike-sorting algorithms.

    Science.gov (United States)

    Hagen, Espen; Ness, Torbjørn V; Khosrowshahi, Amir; Sørensen, Christina; Fyhn, Marianne; Hafting, Torkel; Franke, Felix; Einevoll, Gaute T

    2015-04-30

    New, silicon-based multielectrodes comprising hundreds or more electrode contacts offer the possibility to record spike trains from thousands of neurons simultaneously. This potential cannot be realized unless accurate, reliable automated methods for spike sorting are developed, in turn requiring benchmarking data sets with known ground-truth spike times. We here present a general simulation tool for computing benchmarking data for evaluation of spike-sorting algorithms entitled ViSAPy (Virtual Spiking Activity in Python). The tool is based on a well-established biophysical forward-modeling scheme and is implemented as a Python package built on top of the neuronal simulator NEURON and the Python tool LFPy. ViSAPy allows for arbitrary combinations of multicompartmental neuron models and geometries of recording multielectrodes. Three example benchmarking data sets are generated, i.e., tetrode and polytrode data mimicking in vivo cortical recordings and microelectrode array (MEA) recordings of in vitro activity in salamander retinas. The synthesized example benchmarking data mimics salient features of typical experimental recordings, for example, spike waveforms depending on interspike interval. ViSAPy goes beyond existing methods as it includes biologically realistic model noise, synaptic activation by recurrent spiking networks, finite-sized electrode contacts, and allows for inhomogeneous electrical conductivities. ViSAPy is optimized to allow for generation of long time series of benchmarking data, spanning minutes of biological time, by parallel execution on multi-core computers. ViSAPy is an open-ended tool as it can be generalized to produce benchmarking data or arbitrary recording-electrode geometries and with various levels of complexity. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  4. Multi-Objective Optimization of the Hedging Model for reservoir Operation Using Evolutionary Algorithms

    Directory of Open Access Journals (Sweden)

    sadegh sadeghitabas

    2015-12-01

    Full Text Available Multi-objective problems rarely provide a single optimal solution; rather, they yield a set of optimal outputs (Pareto fronts). Solving these problems was previously accomplished by using simplifying methods such as the weighting-coefficient method, which converts a multi-objective problem into a single objective function. However, robust tools such as multi-objective meta-heuristic algorithms have recently been developed for solving these problems. The hedging model is one of the classic problems in reservoir operation and is generally employed for mitigating drought impacts in water resources management. According to this method, even when it is possible to supply the total planned demands, only portions of the demands are met, saving water by allowing small deficits in the current conditions in order to avoid or reduce severe deficits in the future. The approach depends heavily on economic and social considerations. In the present study, the meta-heuristic algorithms NSGA-II, MOPSO, SPEA-II and AMALGAM are used for the multi-objective optimization of the hedging model. For this purpose, the rationing factors involved in Taleghan dam operation are optimized over a 35-year statistical period of inflow. There are two objective functions: (a) minimizing the modified shortage index, and (b) maximizing the reliability index (i.e., two opposing objectives). The results show that the above algorithms can generate a wide range of optimal solutions. Among the algorithms, AMALGAM is found to produce a better Pareto front for the values of the objective function, indicating its more satisfactory performance.
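
    A minimal sketch of a single-reservoir hedging rule of the kind being optimized, assuming invented inflows, demand and a simple trigger-volume/rationing-factor policy; the shortage and reliability indicators computed at the end are simplified stand-ins for the paper's modified shortage index and reliability index, and none of the numbers relate to the Taleghan dam.

```python
# When available water drops below a trigger volume, only a fraction
# (the rationing factor) of the demand is released, to avoid severe future deficits.
def simulate(inflows, demand, capacity, trigger, rationing_factor, storage=50.0):
    shortages = []
    for q in inflows:
        available = min(storage + q, capacity)
        target = demand if available >= trigger else rationing_factor * demand
        release = min(target, available)
        shortages.append(demand - release)
        storage = available - release
    return shortages

inflows = [30, 10, 5, 8, 40, 60, 20, 5, 4, 35]     # monthly inflow volumes (illustrative)
shortages = simulate(inflows, demand=25, capacity=120, trigger=60, rationing_factor=0.6)

# Two conflicting indicators, mirroring the bi-objective formulation
shortage_index = sum((s / 25) ** 2 for s in shortages) / len(shortages)
reliability = sum(1 for s in shortages if s == 0) / len(shortages)
print("Shortage index:", round(shortage_index, 3), " Reliability:", round(reliability, 2))
```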

  5. Designing machines for lattice physics and algorithm investigation

    International Nuclear Information System (INIS)

    Fischler, M.; Atac, R.; Cook, A.

    1989-10-01

    Special-purpose computers are appropriate tools for the study of lattice gauge theory. While these machines deliver considerable processing power, it is also important to be able to program complex physics ideas and investigate algorithms on them. We examine features that facilitate coding of physics problems, and flexibility in algorithms. Appropriate balances among power, memory, communications and I/O capabilities are presented. 10 refs

  6. Management of Ecological and Economic Security of Industrial Enterprises

    Directory of Open Access Journals (Sweden)

    Ivantsova Elena Anatolyevna

    2014-11-01

    Full Text Available The purpose of this study was the modeling of the ecological and economic security of production processes in an industrial plant using methods of fuzzy logic. The subject of the research is methods for modeling systems of ecological and economic security of industrial enterprises, based on adapting fuzzy set theory to this problem. In the research process the following scientific methods and techniques were applied: scientific abstraction, analysis, synthesis, grouping, comparison, etc. Along with the traditional methods, the authors used tools for simulation modeling, fuzzy set systems and computer simulation in MatLab. The informational and empirical basis of the research was formed from the data of the Federal Service of State Statistics and the territorial subdivisions of the Ministry of Economic Development of the Russian Federation, Internet resources, research by Russian and foreign scientists, and expert assessments. The article presents the authors’ method of ensuring ecological and economic security in the enterprise by means of fuzzy logic, based on the quantitative assessment of threat indicators in MatLab, together with the results of visualizing the fuzzy-set modeling of ecological and economic security. The algorithm for calculating the conditional environmental pressure on water resources and the atmosphere made it possible to determine the dependence between the cost of wastewater treatment and the economic damage from pollution, to evaluate the effectiveness of various conservation programs, and to analyze their impact on environmental sustainability. The authors also developed complex fuzzy models and implemented them in the MatLab Fuzzy Logic Toolbox, which allowed an integrated assessment of the state of the enterprise’s environmental safety and a comparison of the values of these threats. The author presents the author’s methodology and the evaluation

  7. A flexible fuzzy regression algorithm for forecasting oil consumption estimation

    International Nuclear Information System (INIS)

    Azadeh, A.; Khakestani, M.; Saberi, M.

    2009-01-01

    Oil consumption plays a vital role in the socio-economic development of most countries. This study presents a flexible fuzzy regression algorithm for forecasting oil consumption based on standard economic indicators. The standard indicators are annual population, cost of crude oil imports, gross domestic product (GDP) and annual oil production in the last period. The proposed algorithm uses analysis of variance (ANOVA) to select either fuzzy regression or conventional regression for future demand estimation. The significance of the proposed algorithm is threefold. First, it is flexible and identifies the best model based on the results of ANOVA and minimum mean absolute percentage error (MAPE), whereas previous studies select the best fitted fuzzy regression model based on MAPE or other relative error results. Second, the proposed model may identify conventional regression as the best model for future oil consumption forecasting because of its dynamic structure, whereas previous studies assume that fuzzy regression always provides the best solutions and estimations. Third, it utilizes the most standard independent variables for the regression models. To show the applicability and superiority of the proposed flexible fuzzy regression algorithm, data for oil consumption in Canada, the United States, Japan and Australia from 1990 to 2005 are used. The results show that the flexible algorithm provides an accurate solution to the oil consumption estimation problem. The algorithm may be used by policy makers to accurately foresee the behavior of oil consumption in various regions.
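
    A sketch of the selection step only, choosing between two candidate regressions by MAPE; the second candidate here is an ordinary regression on log-transformed indicators standing in for the fuzzy regression model (which the paper fits separately, with ANOVA guiding the choice), and the data are invented rather than the 1990-2005 country series.

```python
import numpy as np

# Illustrative data: consumption vs. [constant, population, GDP]-style indicators
X = np.array([[1.0, 30.0, 1.2], [1.0, 31.0, 1.3], [1.0, 32.5, 1.5],
              [1.0, 33.0, 1.6], [1.0, 34.2, 1.8], [1.0, 35.0, 2.0]])
y = np.array([90.0, 93.0, 97.0, 99.0, 104.0, 108.0])

def mape(actual, predicted):
    return float(np.mean(np.abs((actual - predicted) / actual)) * 100)

# Candidate 1: conventional linear regression (least squares)
beta1, *_ = np.linalg.lstsq(X, y, rcond=None)
pred1 = X @ beta1

# Candidate 2: a stand-in for the fuzzy regression model -- here simply a
# regression on log-transformed indicators, used only to show the selection logic
X2 = np.column_stack([X[:, 0], np.log(X[:, 1]), np.log(X[:, 2])])
beta2, *_ = np.linalg.lstsq(X2, y, rcond=None)
pred2 = X2 @ beta2

errors = {"conventional": mape(y, pred1), "alternative": mape(y, pred2)}
best = min(errors, key=errors.get)
print(errors, "-> selected:", best)
```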

  8. Improved algorithm for quantum separability and entanglement detection

    International Nuclear Information System (INIS)

    Ioannou, L.M.; Ekert, A.K.; Travaglione, B.C.; Cheung, D.

    2004-01-01

    Determining whether a quantum state is separable or entangled is a problem of fundamental importance in quantum information science. It has recently been shown that this problem is NP-hard, suggesting that an efficient, general solution does not exist. There is a highly inefficient 'basic algorithm' for solving the quantum separability problem which follows from the definition of a separable state. By exploiting specific properties of the set of separable states, we introduce a classical algorithm that solves the problem significantly faster than the 'basic algorithm', allowing a feasible separability test where none previously existed, e.g., in 3x3-dimensional systems. Our algorithm also provides a unique tool in the experimental detection of entanglement

  9. Leveraging Python Interoperability Tools to Improve Sapphire's Usability

    Energy Technology Data Exchange (ETDEWEB)

    Gezahegne, A; Love, N S

    2007-12-10

    The Sapphire project at the Center for Applied Scientific Computing (CASC) develops and applies an extensive set of data mining algorithms for the analysis of large data sets. Sapphire's algorithms are currently available as a set of C++ libraries. However many users prefer higher level scripting languages such as Python for their ease of use and flexibility. In this report, we evaluate four interoperability tools for the purpose of wrapping Sapphire's core functionality with Python. Exposing Sapphire's functionality through a Python interface would increase its usability and connect its algorithms to existing Python tools.

  10. An efficient algorithm for bi-objective combined heat and power production planning under the emission trading scheme

    International Nuclear Information System (INIS)

    Rong, Aiying; Figueira, José Rui; Lahdelma, Risto

    2014-01-01

    Highlights: • Define fuel mix setting for the bi-objective CHP environmental/economic dispatch. • Develop an efficient algorithm for constructing the Pareto frontier for the problem. • Time complexity analysis is conducted for the proposed algorithm. • The algorithm is theoretically compared against a traditional algorithm. • The efficiency of the algorithm is justified by numerical results. - Abstract: The growing environmental awareness and the apparent conflicts between economic and environmental objectives turn energy planning problems naturally into multi-objective optimization problems. In the current study, mixed fuel combustion is considered as an option to achieve a tradeoff between the economic objective (associated with fuel cost) and the emission objective (measured in CO2 emission cost according to fuels and the emission allowance price), because a fuel with higher emissions is usually cheaper than one with lower emissions. Combined heat and power (CHP) production is an important high-efficiency technology to promote under the emission trading scheme. In CHP production, the production planning of both commodities must be done in coordination. A long-term planning problem decomposes into thousands of hourly subproblems. In this paper, a bi-objective multi-period linear programming CHP planning model is presented first. Then, an efficient specialized merging algorithm for constructing the exact Pareto frontier (PF) of the problem is presented. The algorithm is theoretically and empirically compared against a modified dichotomic search algorithm. The efficiency and effectiveness of the algorithm are justified

  11. Towards Automatic Controller Design using Multi-Objective Evolutionary Algorithms

    DEFF Research Database (Denmark)

    Pedersen, Gerulf

    of evolutionary computation, a choice was made to use multi-objective algorithms for the purpose of aiding in automatic controller design. More specifically, the choice was made to use the Non-dominated Sorting Genetic Algorithm II (NSGAII), which is one of the most potent algorithms currently in use...... for automatic controller design. However, because the field of evolutionary computation is relatively unknown in the field of control engineering, this thesis also includes a comprehensive introduction to the basic field of evolutionary computation as well as a description of how the field has previously been......In order to design the controllers of tomorrow, a need has risen for tools that can aid in the design of these. A desire to use evolutionary computation as a tool to achieve that goal is what gave inspiration for the work contained in this thesis. After having studied the foundations...

  12. Environmental Optimization Using the WAste Reduction Algorithm (WAR)

    Science.gov (United States)

    Traditionally chemical process designs were optimized using purely economic measures such as rate of return. EPA scientists developed the WAste Reduction algorithm (WAR) so that environmental impacts of designs could easily be evaluated. The goal of WAR is to reduce environme...

  13. A high-performance spatial database based approach for pathology imaging algorithm evaluation.

    Science.gov (United States)

    Wang, Fusheng; Kong, Jun; Gao, Jingjing; Cooper, Lee A D; Kurc, Tahsin; Zhou, Zhengwen; Adler, David; Vergara-Niedermayr, Cristobal; Katigbak, Bryan; Brat, Daniel J; Saltz, Joel H

    2013-01-01

    Algorithm evaluation provides a means to characterize variability across image analysis algorithms, validate algorithms by comparison with human annotations, combine results from multiple algorithms for performance improvement, and facilitate algorithm sensitivity studies. The sizes of images and image analysis results in pathology image analysis pose significant challenges in algorithm evaluation. We present an efficient parallel spatial database approach to model, normalize, manage, and query large volumes of analytical image result data. This provides an efficient platform for algorithm evaluation. Our experiments with a set of brain tumor images demonstrate the application, scalability, and effectiveness of the platform. The paper describes an approach and platform for evaluation of pathology image analysis algorithms. The platform facilitates algorithm evaluation through a high-performance database built on the Pathology Analytic Imaging Standards (PAIS) data model. (1) Develop a framework to support algorithm evaluation by modeling and managing analytical results and human annotations from pathology images; (2) Create a robust data normalization tool for converting, validating, and fixing spatial data from algorithm or human annotations; (3) Develop a set of queries to support data sampling and result comparisons; (4) Achieve high performance computation capacity via a parallel data management infrastructure, parallel data loading and spatial indexing optimizations in this infrastructure. We have considered two scenarios for algorithm evaluation: (1) algorithm comparison where multiple result sets from different methods are compared and consolidated; and (2) algorithm validation where algorithm results are compared with human annotations. We have developed a spatial normalization toolkit to validate and normalize spatial boundaries produced by image analysis algorithms or human annotations. The validated data were formatted based on the PAIS data model and

  14. Algorithms for Scheduling and Network Problems

    Science.gov (United States)

    1991-09-01

    time. We already know, by Lemma 2.2.1, that WOPT = O(log(mpU)), so if we could solve this integer program optimally we would be done. However, the... Folydirat, 15:177-191, 1982. [6] I.S. Belov and Ya. N. Stolin. An algorithm in a single path operations scheduling problem. In Mathematical Economics and

  15. The Methodological Aspect of the Diagnostics of Objects in Economics

    Directory of Open Access Journals (Sweden)

    Grynko Pavlo О.

    2017-12-01

    Full Text Available On the basis of morphological analysis, the article defines the concept of «diagnostics in economics». The differences between diagnostics in economics, economic diagnostics, and analysis in economics are analyzed. The types, functions and principles of diagnostics in economics under modern conditions are substantiated. The interrelationship of diagnostics with other management functions in economics is specified. The logic and content of the stages of the diagnostic technology in economics are clarified. The reliability of diagnostics in economics is determined by the analytical tools applied in its implementation. Recommendations on analytical tools for carrying out diagnostics in economics are substantiated. A structural-logical scheme of diagnostics in economics, strengthening its scientific basis and providing objectivity in practical implementation, is proposed.

  16. Linear Time Local Approximation Algorithm for Maximum Stable Marriage

    Directory of Open Access Journals (Sweden)

    Zoltán Király

    2013-08-01

    Full Text Available We consider a two-sided market under incomplete preference lists with ties, where the goal is to find a maximum size stable matching. The problem is APX-hard, and a 3/2-approximation was given by McDermid [1]. This algorithm has a non-linear running time and, more importantly, needs global knowledge of all preference lists. We present a very natural, economically reasonable, local, linear time algorithm with the same ratio, using some ideas of Paluch [2]. In this algorithm every person makes decisions using only their own list and some information requested from the members of that list (as in the case of the famous algorithm of Gale and Shapley). Some consequences for the Hospitals/Residents problem are also discussed.
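
    For reference, a short sketch of the classical Gale-Shapley proposal algorithm mentioned in the abstract, assuming complete strict preference lists; it is not the paper's 3/2-approximation for ties and incomplete lists.

```python
from collections import deque

def gale_shapley(men_prefs, women_prefs):
    """Classical Gale-Shapley for complete strict lists: men propose, women keep
    their best proposer so far; the result is a stable matching."""
    free_men = deque(men_prefs)
    next_choice = {m: 0 for m in men_prefs}
    rank = {w: {m: i for i, m in enumerate(prefs)} for w, prefs in women_prefs.items()}
    engaged = {}                                   # woman -> man
    while free_men:
        m = free_men.popleft()
        w = men_prefs[m][next_choice[m]]
        next_choice[m] += 1
        if w not in engaged:
            engaged[w] = m
        elif rank[w][m] < rank[w][engaged[w]]:     # w prefers the new proposer
            free_men.append(engaged[w])
            engaged[w] = m
        else:
            free_men.append(m)                     # rejected, will propose further down his list
    return {m: w for w, m in engaged.items()}

men = {"a": ["x", "y"], "b": ["y", "x"]}
women = {"x": ["b", "a"], "y": ["a", "b"]}
print(gale_shapley(men, women))    # {'a': 'x', 'b': 'y'} -- a stable matching
```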

  17. Ecological and resource economics as ecosystem management tools

    Science.gov (United States)

    Stephen Farber; Dennis. Bradley

    1999-01-01

    Economic pressures on ecosystems will only intensify in the future. Increased population levels, settlement patterns, and increased incomes will raise the demands for ecosystem resources and their services. The pressure to transform ecosystem natural assets into marketable commodities, whether by harvesting and mining resources or altering landscapes through...

  18. A tool for the auto-management of Units of Learning: the Link Tool

    NARCIS (Netherlands)

    Pérez-Sanagustín, Mar; Cherian, Roy; Hernández-Leo, Davinia; Griffiths, Dai; Blat, Josep

    2009-01-01

    Pérez-Sanagustín, M., Cherian, R., Hernández-Leo, D., Griffiths, D., & Blat, J. (2010). A tool for the auto-management of Units of Learning: the Link Tool. In D. Griffiths, & R. Koper (Eds.), Rethinking Learning and Employment at a Time of Economic Uncertainty. Proceedings of the 6th TENCompetence

  19. Development of a web-based tool for the assessment of health and economic outcomes of the European Innovation Partnership on Active and Healthy Ageing (EIP on AHA).

    Science.gov (United States)

    Boehler, Christian E H; de Graaf, Gimon; Steuten, Lotte; Yang, Yaling; Abadie, Fabienne

    2015-01-01

    The European Innovation Partnership on Active and Healthy Ageing (EIP on AHA) is a European Commission led policy initiative to address the challenges of demographic change in Europe. For monitoring the health and economic impact of the social and technological innovations carried out by more than 500 stakeholder groups ('commitments') participating in the EIP on AHA, a generic and flexible web-based monitoring and assessment tool is currently being developed. This paper describes the approach for developing and implementing this web-based tool, its main characteristics and its capability to provide specific outcomes that are of value to the developers of an intervention, as well as a series of case studies planned before wider rollout. The tool builds up from a variety of surrogate endpoints commonly used across the diverse set of EIP on AHA commitments in order to estimate health and economic outcomes in terms of incremental changes in quality adjusted life years (QALYs) as well as health and social care utilisation. A highly adaptable Markov model with initially three mutually exclusive health states ('baseline health', 'deteriorated health' and 'death') provides the basis for the tool, which draws on an extensive database of epidemiological, economic and effectiveness data and also allows further customisation through remote data entry, enabling more accurate and context-specific estimation of intervention impact. Both probabilistic sensitivity analysis and deterministic scenario analysis allow the impact of parameter uncertainty on intervention outcomes to be assessed. A set of case studies, ranging from the pre-market assessment of early healthcare technologies to the retrospective analysis of established care pathways, will be carried out before public rollout, which is envisaged for the end of 2015. Monitoring the activities carried out within the EIP on AHA requires an approach that is both flexible and consistent in the way health and economic impact is estimated across
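
    A minimal sketch of the kind of three-state Markov cohort calculation underlying such a tool, with invented transition probabilities, utilities, costs and discount rate; the actual tool draws these values from its database and from remote data entry, so none of the numbers below come from the EIP on AHA work.

```python
import numpy as np

# States: 'baseline health', 'deteriorated health', 'death' (purely illustrative inputs)
P_usual = np.array([[0.85, 0.10, 0.05],
                    [0.00, 0.80, 0.20],
                    [0.00, 0.00, 1.00]])
P_intervention = np.array([[0.90, 0.07, 0.03],
                           [0.00, 0.85, 0.15],
                           [0.00, 0.00, 1.00]])
utility = np.array([0.85, 0.55, 0.0])          # quality weight per state (per year)
annual_cost = np.array([1000.0, 4000.0, 0.0])  # care utilisation cost per state

def run(P, years=10, discount=0.035):
    state = np.array([1.0, 0.0, 0.0])          # whole cohort starts in 'baseline health'
    qalys = costs = 0.0
    for t in range(years):
        state = state @ P                       # one annual transition
        d = 1.0 / (1.0 + discount) ** (t + 1)   # discount factor for this cycle
        qalys += d * (state @ utility)
        costs += d * (state @ annual_cost)
    return qalys, costs

q0, c0 = run(P_usual)
q1, c1 = run(P_intervention)
print("Incremental QALYs: %.3f  Incremental cost: %.0f" % (q1 - q0, c1 - c0))
```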

  20. The firms Sandvik Coromant and Walter instruments choice algorithmization

    Directory of Open Access Journals (Sweden)

    Євген Іванович Іванов

    2015-11-01

    Full Text Available This article describes typical algorithms for choosing modern cutting tools made by the foreign firms Sandvik Coromant and Walter. The use of modern tools is effective on both new and old equipment. The algorithms ensure the orderly work of engineers in developing new or upgrading old processes, and may also be useful for students enrolled in the specialty "Mechanical Engineering". A correctly chosen tool makes it possible to quickly recoup the cost of new equipment and to significantly improve the performance of old equipment. Currently, all cutting tools can be divided into the following groups: (a) solid; (b) composite; (c) assembly; (d) modular (set-type). In composite cutting tools the parts and holders are attached permanently; for example, the attachment can be fixed by welding or soldering. In assembly and modular cutting tools the parts are detachable. A modular tool consists of separate assembly units (modules) with standardized mounting surfaces. Thus one and the same cutter head may be attached to holders (mandrels, housings) of different configuration and function. The choice of the cutting part of such tools includes determining the shape and size of the indexable insert (SMP), the geometry of its front surface, the corner radius and the tool material. Selecting the holder (mandrel, body) includes determining its type and size. It is also necessary to take into account the capabilities of the technological equipment (the type and size of the mounting surfaces of the tool holder and machine spindle). After selecting the tool it is necessary to determine the working regimes.

  1. New HIV Testing Algorithm: Promising Tool in the Fight Against HIV

    Centers for Disease Control (CDC) Podcasts

    In this podcast, CDC’s Dr. Phil Peters discusses the new HIV testing algorithm and how this latest technology can improve the diagnosis of acute HIV infection. Early detection of HIV is critical to saving lives, getting patients into treatment, and preventing transmission.

  2. Feicim: A browser for data and algorithms

    International Nuclear Information System (INIS)

    Lazar, Z I; McNulty, R; Kechadi, T

    2008-01-01

    As programming and programming environments become increasingly complex, more effort must be invested in presenting the user with a simple yet comprehensive interface. Feicim is a tool that unifies the representation of data and algorithms. It provides resource discovery of data-files, data-content and algorithm implementation through an intuitive graphical user interface. It allows local or remote data stored on Grid type platforms to be accessed by the users, the viewing and creation of user-defined or collaboration-defined algorithms, the implementation of algorithms, and the production of output data-files and/or histograms. An application of Feicim is illustrated using the LHCb data. It provides a graphical view of the Gaudi architecture, LHCb event data model, and interfaces to the file catalogue. Feicim is particularly suited to such frameworks as Gaudi which consider algorithms as objects [2]. Instant viewing of any LHCb data will be of particular value in the commissioning of the detector and for quickly familiarizing newcomers to the data and software environment

  3. Multi-stage phase retrieval algorithm based upon the gyrator transform.

    Science.gov (United States)

    Rodrigo, José A; Duadi, Hamootal; Alieva, Tatiana; Zalevsky, Zeev

    2010-01-18

    The gyrator transform is a useful tool for optical information processing applications. In this work we propose a multi-stage phase retrieval approach based on this operation as well as on the well-known Gerchberg-Saxton algorithm. It results in an iterative algorithm able to retrieve the phase information using several measurements of the gyrator transform power spectrum. The viability and performance of the proposed algorithm is demonstrated by means of several numerical simulations and experimental results.
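
    A minimal sketch of the underlying Gerchberg-Saxton iteration, with the FFT standing in for the gyrator transform (which is not available in NumPy) and a single simulated power-spectrum measurement rather than the several gyrator-domain measurements used in the paper; convergence to the true phase is therefore not guaranteed in this simplified setting.

```python
import numpy as np

rng = np.random.default_rng(0)

def gerchberg_saxton(source_amp, target_amp, iterations=200):
    """Two-plane Gerchberg-Saxton: alternately enforce the known source amplitude
    and the measured far-field amplitude, keeping only the phase each time."""
    phase = rng.uniform(0, 2 * np.pi, source_amp.shape)
    field = source_amp * np.exp(1j * phase)
    for _ in range(iterations):
        far = np.fft.fft2(field)
        far = target_amp * np.exp(1j * np.angle(far))      # impose measured amplitude
        field = np.fft.ifft2(far)
        field = source_amp * np.exp(1j * np.angle(field))  # impose known source amplitude
    return np.angle(field)                                  # retrieved phase estimate

# Synthetic test: known smooth phase object under uniform illumination
n = 64
true_phase = np.outer(np.hanning(n), np.hanning(n)) * np.pi
source_amp = np.ones((n, n))
target_amp = np.abs(np.fft.fft2(source_amp * np.exp(1j * true_phase)))
estimate = gerchberg_saxton(source_amp, target_amp)
print("correlation with true phase:",
      round(float(np.corrcoef(true_phase.ravel(), estimate.ravel())[0, 1]), 3))
```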

  4. Multi-stage phase retrieval algorithm based upon the gyrator transform

    OpenAIRE

    Rodrigo Martín-Romo, José Augusto; Duadi, Hamootal; Alieva, Tatiana Krasheninnikova; Zalevsky, Zeev

    2010-01-01

    The gyrator transform is a useful tool for optical information processing applications. In this work we propose a multi-stage phase retrieval approach based on this operation as well as on the well-known Gerchberg-Saxton algorithm. It results in an iterative algorithm able to retrieve the phase information using several measurements of the gyrator transform power spectrum. The viability and performance of the proposed algorithm is demonstrated by means of several numerical simulations and exp...

  5. FORMATION OF ECONOMIC MECHANISM OF INDUSTRIAL ENTERPRISE DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Shestakova E. V.

    2015-06-01

    Full Text Available In current research on social and economic systems, the synergetic approach, according to which the enterprise represents an open, self-organizing (spontaneous) system, is gaining ground. Such a representation of the enterprise in the context of modern economic science requires the development of new mechanisms and management tools. The purpose of research in the field of synergetic management is the development of mechanisms of self-organization, as well as the informational filling of their elements. The complexity of self-organization processes dictates the need to integrate separate types of mechanisms that differ in the method of creation, uniformity of elements, complexity, strategic orientation, target orientation and management functions. Thus, the integrated mechanism of self-organization of the enterprise represents a multi-level system of interconnected mechanisms (organizational, economic, informational, motivational) differentiated by elements. The article reveals the content of the economic development mechanism of the enterprise; its purposes, subjects, objects, principles, methods, tools and resources are considered. On the basis of research into the features of development of social and economic systems, the correspondence between the stages of the enterprise lifecycle and the stages of the self-organization process is established. The principles of the economic development mechanism of the enterprise are substantiated: financial independence, self-sufficiency, economic feasibility, responsibility, resource capability, economic control and interest. Methods of the economic mechanism (planning and forecasting, marketing activity, economic diagnostics, financial-credit policy, economic incentives) are identified and the corresponding tools are substantiated. Features of the realization of the economic mechanism at the stages of dynamic balance and bifurcation are established. The practical importance of the results of the research consists in the development of mechanisms for the development of industrial enterprises that promote the achievement of long-term competitive

  6. Android Malware Classification Using K-Means Clustering Algorithm

    Science.gov (United States)

    Hamid, Isredza Rahmi A.; Syafiqah Khalid, Nur; Azma Abdullah, Nurul; Rahman, Nurul Hidayah Ab; Chai Wen, Chuah

    2017-08-01

    Malware is designed to gain access to or damage a computer system without the user's knowledge, and attackers also exploit malware to commit crime or fraud. This paper proposes an Android malware classification approach based on the K-Means clustering algorithm. We evaluate the proposed model in terms of accuracy using machine learning algorithms. Two datasets were selected to demonstrate the application of the K-Means clustering algorithm: the Virus Total and Malgenome datasets. We classify the Android malware into three clusters, namely ransomware, scareware and goodware. Nine features were considered for each type of dataset, such as Lock Detected, Text Detected, Text Score, Encryption Detected, Threat, Porn, Law, Copyright and Moneypak. We used IBM SPSS Statistics software for data classification and WEKA tools to evaluate the built clusters. The proposed K-Means clustering algorithm shows promising results with high accuracy when tested using the Random Forest algorithm.
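
    A minimal k-means sketch on invented feature vectors standing in for the nine features listed above; the paper itself used IBM SPSS and WEKA rather than code like this, so the clustering below only illustrates the algorithm.

```python
import numpy as np

rng = np.random.default_rng(42)

def kmeans(X, k=3, iters=100):
    """Plain k-means: assign each sample to its nearest centroid, then recompute centroids."""
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None, :] - centroids[None, :, :]) ** 2).sum(-1), axis=1)
        new = np.array([X[labels == j].mean(axis=0) if np.any(labels == j) else centroids[j]
                        for j in range(k)])
        if np.allclose(new, centroids):
            break
        centroids = new
    return labels, centroids

# Illustrative 3-feature vectors (e.g. lock detected, text score, encryption detected)
X = np.vstack([rng.normal([1, 8, 1], 0.3, (20, 3)),     # ransomware-like samples
               rng.normal([0, 6, 0], 0.3, (20, 3)),     # scareware-like samples
               rng.normal([0, 1, 0], 0.3, (20, 3))])    # goodware-like samples
labels, _ = kmeans(X, k=3)
print("cluster sizes:", np.bincount(labels))
```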

  7. Current practices in economic appraisal

    NARCIS (Netherlands)

    Mossink, J.C.M.

    2000-01-01

    By means of economic appraisal, the costs and the benefits of health, environment and safety management can be made clear, both at the national level and at the company level. As such it is a tool in advocating good practices. This paper explores the possibilities of economic appraisal for policy

  8. A tool for model based diagnostics of the AGS Booster

    International Nuclear Information System (INIS)

    Luccio, A.

    1993-01-01

    A model-based algorithmic tool was developed to search for lattice errors by a systematic analysis of orbit data in the AGS Booster synchrotron. The algorithm employs transfer matrices calculated with MAD between points in the ring. Iterative model fitting of the data allows one to find and eventually correct magnet displacements and angles or field errors. The tool, implemented on a HP-Apollo workstation system, has proved very general and of immediate physical interpretation

  9. Genetic Algorithm Optimizes Q-LAW Control Parameters

    Science.gov (United States)

    Lee, Seungwon; von Allmen, Paul; Petropoulos, Anastassios; Terrile, Richard

    2008-01-01

    A document discusses a multi-objective, genetic algorithm designed to optimize Lyapunov feedback control law (Q-law) parameters in order to efficiently find Pareto-optimal solutions for low-thrust trajectories for electronic propulsion systems. These would be propellant-optimal solutions for a given flight time, or flight time optimal solutions for a given propellant requirement. The approximate solutions are used as good initial solutions for high-fidelity optimization tools. When the good initial solutions are used, the high-fidelity optimization tools quickly converge to a locally optimal solution near the initial solution. Q-law control parameters are represented as real-valued genes in the genetic algorithm. The performances of the Q-law control parameters are evaluated in the multi-objective space (flight time vs. propellant mass) and sorted by the non-dominated sorting method that assigns a better fitness value to the solutions that are dominated by a fewer number of other solutions. With the ranking result, the genetic algorithm encourages the solutions with higher fitness values to participate in the reproduction process, improving the solutions in the evolution process. The population of solutions converges to the Pareto front that is permitted within the Q-law control parameter space.

  10. Dicotyledon Weed Quantification Algorithm for Selective Herbicide Application in Maize Crops.

    Science.gov (United States)

    Laursen, Morten Stigaard; Jørgensen, Rasmus Nyholm; Midtiby, Henrik Skov; Jensen, Kjeld; Christiansen, Martin Peter; Giselsson, Thomas Mosgaard; Mortensen, Anders Krogh; Jensen, Peter Kryger

    2016-11-04

    The stricter legislation within the European Union for the regulation of herbicides that are prone to leaching causes a greater economic burden on the agricultural industry through taxation. Owing to the increased economic burden, research in reducing herbicide usage has been prompted. High-resolution images from digital cameras support the studying of plant characteristics. These images can also be utilized to analyze shape and texture characteristics for weed identification. Instead of detecting weed patches, weed density can be estimated at a sub-patch level, through which even the identification of a single plant is possible. The aim of this study is to adapt the monocot and dicot coverage ratio vision (MoDiCoVi) algorithm to estimate dicotyledon leaf cover, perform grid spraying in real time, and present initial results in terms of potential herbicide savings in maize. The authors designed and executed an automated, large-scale field trial supported by the Armadillo autonomous tool carrier robot. The field trial consisted of 299 maize plots. Half of the plots (parcels) were planned with additional seeded weeds; the other half were planned with naturally occurring weeds. The in-situ evaluation showed that, compared to conventional broadcast spraying, the proposed method can reduce herbicide usage by 65% without measurable loss in biological effect.
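
    A toy sketch of the grid-spraying decision that follows from a per-cell dicot coverage estimate; the coverage values are randomly generated stand-ins for MoDiCoVi output and the 5% spray threshold is an assumption, so the printed saving is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(7)

# Illustrative dicot leaf-cover estimates (fraction of pixels) per grid cell
coverage = rng.beta(a=0.4, b=8.0, size=(20, 30))     # most cells nearly weed-free
SPRAY_THRESHOLD = 0.05                               # assumed decision threshold

spray_map = coverage > SPRAY_THRESHOLD               # True = open the nozzle over this cell
saving = 1.0 - spray_map.mean()                      # compared with broadcast spraying
print(f"cells sprayed: {spray_map.sum()} of {spray_map.size}")
print(f"herbicide saving vs. broadcast: {saving:.0%}")
```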

  11. Environmental economics: Saving lives, money, and ecosystems ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2010-10-07

    Oct 7, 2010 ... Environmental economics gives developing countries a unique tool to make ... provides decision-makers facing tough economic and environmental choices with vital ... Emerging economies a new force in international giving.

  12. Algorithm for detecting violations of traffic rules based on computer vision approaches

    Directory of Open Access Journals (Sweden)

    Ibadov Samir

    2017-01-01

    Full Text Available We propose a new algorithm for automatically detecting violations of traffic rules, aimed at improving pedestrian safety at unregulated pedestrian crossings. The algorithm proceeds in several steps: zebra-crossing detection, car detection and pedestrian detection. For car detection, we use the Faster R-CNN deep learning tool. The algorithm shows promising results in detecting violations of traffic rules.

  13. SOL: A Library for Scalable Online Learning Algorithms

    OpenAIRE

    Wu, Yue; Hoi, Steven C. H.; Liu, Chenghao; Lu, Jing; Sahoo, Doyen; Yu, Nenghai

    2016-01-01

    SOL is an open-source library for scalable online learning algorithms, and is particularly suitable for learning with high-dimensional data. The library provides a family of regular and sparse online learning algorithms for large-scale binary and multi-class classification tasks with high efficiency, scalability, portability, and extensibility. SOL was implemented in C++, and provided with a collection of easy-to-use command-line tools, python wrappers and library calls for users and develope...

  14. Practical mine ventilation optimization based on genetic algorithms for free splitting networks

    Energy Technology Data Exchange (ETDEWEB)

    Acuna, E.; Maynard, R.; Hall, S. [Laurentian Univ., Sudbury, ON (Canada). Mirarco Mining Innovation; Hardcastle, S.G.; Li, G. [Natural Resources Canada, Sudbury, ON (Canada). CANMET Mining and Mineral Sciences Laboratories; Lowndes, I.S. [Nottingham Univ., Nottingham (United Kingdom). Process and Environmental Research Division; Tonnos, A. [Bestech, Sudbury, ON (Canada)

    2010-07-01

    The method used to optimize the design and operation of mine ventilation has generally been based on case studies and expert knowledge. It has yet to benefit from optimization techniques used and proven in other fields of engineering. Currently, optimization of mine ventilation systems is a manual based decision process performed by an experienced mine ventilation specialist assisted by commercial ventilation distribution solvers. These analysis tools are widely used in the mining industry to evaluate the practical and economic viability of alternative ventilation system configurations. The scenario which is usually selected is the one that reports the lowest energy consumption while delivering the required airflow distribution. Since most commercial solvers do not have an integrated optimization algorithm network, the process of generating a series of potential ventilation solutions using the conventional iterative design strategy can be time consuming. For that reason, a genetic algorithm (GA) optimization routine was developed in combination with a ventilation solver to determine the potential optimal solutions of a primary mine ventilation system based on a free splitting network. The optimization method was used in a small size mine ventilation network. The technique was shown to have the capacity to generate good feasible solutions and improve upon the manual results obtained by mine ventilation specialists. 9 refs., 7 tabs., 3 figs.

  15. Semer: a simple calculational tool for the economic evaluations of reactor systems and associated innovations

    International Nuclear Information System (INIS)

    Nisan, S.; Rouyer, J.L.

    2001-01-01

    This paper summarises part of our on-going investigations on the economic evaluation of various nuclear and fossil energy systems and related innovations. These investigations are principally concerned with the development of the code system SEMER and its validation. SEMER has been developed to furnish top management and project leaders with a simple tool for cost evaluations, enabling a choice between competing technological options. The cost evaluation models currently integrated in the SEMER system already cover a very wide range of electricity producing systems and, where relevant, their associated fuel cycles. The "global models" allow rapid but relatively approximate overall cost estimations (about 15% error). These include almost all the electricity producing systems using fossil energies (oil, coal, gas, including gas turbines with combined cycles) and nuclear reactor systems including all the French PWRs, HTRs, compact PWRs, and PWRs for nuclear propulsion systems. (author)

  16. Advancing School-Based Interventions through Economic Analysis

    Science.gov (United States)

    Olsson, Tina M.; Ferrer-Wreder, Laura; Eninger, Lilianne

    2014-01-01

    Commentators interested in school-based prevention programs point to the importance of economic issues for the future of prevention efforts. Many of the processes and aims of prevention science are dependent upon prevention resources. Although economic analysis is an essential tool for assessing resource use, the attention given economic analysis…

  17. Effectiveness of the random sequential absorption algorithm in the analysis of volume elements with nanoplatelets

    DEFF Research Database (Denmark)

    Pontefisso, Alessandro; Zappalorto, Michele; Quaresimin, Marino

    2016-01-01

    In this work, a study of the Random Sequential Absorption (RSA) algorithm in the generation of nanoplatelet Volume Elements (VEs) is carried out. The effect of the algorithm input parameters on the reinforcement distribution is studied through the implementation of statistical tools, showing...... that the platelet distribution is systematically affected by these parameters. The consequence is that a parametric analysis of the VE input parameters may be biased by hidden differences in the filler distribution. The same statistical tools used in the analysis are implemented in a modified RSA algorithm...
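
    A minimal 2D sketch of the Random Sequential Absorption idea, placing non-overlapping disks as stand-ins for nanoplatelets; the box size, radius and attempt count are arbitrary assumptions, and the code ignores the orientation and aspect-ratio handling a real platelet volume-element generator needs.

```python
import random

random.seed(3)

BOX, RADIUS = 1.0, 0.03

def rsa_disks(max_attempts=20000):
    """RSA: propose random centres one at a time and accept a disk only if it
    overlaps none of the previously accepted disks."""
    placed = []
    for _ in range(max_attempts):
        x = random.uniform(RADIUS, BOX - RADIUS)
        y = random.uniform(RADIUS, BOX - RADIUS)
        if all((x - px) ** 2 + (y - py) ** 2 >= (2 * RADIUS) ** 2 for px, py in placed):
            placed.append((x, y))
    return placed

disks = rsa_disks()
area_fraction = len(disks) * 3.14159 * RADIUS ** 2 / BOX ** 2
print(f"accepted {len(disks)} disks, area fraction {area_fraction:.2f}")
```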

  18. Stabilization Algorithms for Large-Scale Problems

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg

    2006-01-01

    The focus of the project is on stabilization of large-scale inverse problems where structured models and iterative algorithms are necessary for computing approximate solutions. For this purpose, we study various iterative Krylov methods and their abilities to produce regularized solutions. Some......-curve. This heuristic is implemented as a part of a larger algorithm which is developed in collaboration with G. Rodriguez and P. C. Hansen. Last, but not least, a large part of the project has, in different ways, revolved around the object-oriented Matlab toolbox MOORe Tools developed by PhD Michael Jacobsen. New...

  19. New Mathematical Model and Algorithm for Economic Lot Scheduling Problem in Flexible Flow Shop

    Directory of Open Access Journals (Sweden)

    H. Zohali

    2018-03-01

    Full Text Available This paper addresses the lot sizing and scheduling problem for a number of products in a flexible flow shop with identical parallel machines. The production stages are in series and separated by finite intermediate buffers. The objective is to minimize the sum of setup and inventory holding costs per unit of time. The mathematical model available for this problem in the literature suffers from huge complexity in terms of size and computation. In this paper, a new mixed integer linear program is developed to deal with the huge dimensions of the problem. A new metaheuristic algorithm is also developed for the problem. The results of the numerical experiments demonstrate a significant advantage of the proposed model and algorithm compared with the available models and algorithms in the literature.

  20. Using fuzzy mathematics for decision making in economics

    Directory of Open Access Journals (Sweden)

    Pavkov Ivan

    2012-01-01

    Full Text Available Traditionally, economic models are based on classical mathematics and Aristotelian two-valued logic. Nevertheless, fuzzy mathematics, as a tool for modeling some types of uncertainty and incomplete phenomena, is a more appropriate framework for modeling in economics. This new approach has resulted in approximate reasoning and fuzzy control systems, which have proved to be an efficient tool for decision making in a fuzzy environment.

  1. Simulated annealing approach for solving economic load dispatch ...

    African Journals Online (AJOL)

    user

    thermodynamics to solve economic load dispatch (ELD) problems. ... evolutionary programming algorithm has been successfully applied for solving the ... concept behind the simulated annealing (SA) optimization is discussed in Section 3.
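
    A minimal simulated annealing sketch for a small economic load dispatch instance, with invented quadratic fuel-cost data, a power-balance penalty and a simple geometric cooling schedule; it follows the general SA recipe discussed in this record, not the specific formulation of the paper.

```python
import math
import random

random.seed(0)

# Assumed data: three units with cost a + b*P + c*P^2 and limits (P_min, P_max); 500 MW demand
units = [(100, 2.0, 0.010, 50, 250), (120, 1.8, 0.012, 50, 200), (90, 2.2, 0.008, 50, 300)]
DEMAND = 500.0

def total_cost(p):
    cost = sum(a + b * x + c * x * x for (a, b, c, lo, hi), x in zip(units, p))
    return cost + 1000.0 * abs(sum(p) - DEMAND)          # penalty for power-balance violation

def simulated_annealing(temp=100.0, cooling=0.98, steps=5000):
    p = [(lo + hi) / 2 for (_, _, _, lo, hi) in units]    # start from mid-range outputs
    best = list(p)
    for _ in range(steps):
        cand = list(p)
        i = random.randrange(len(units))
        lo, hi = units[i][3], units[i][4]
        cand[i] = min(hi, max(lo, cand[i] + random.gauss(0, 5)))   # perturb one unit
        delta = total_cost(cand) - total_cost(p)
        if delta < 0 or random.random() < math.exp(-delta / temp): # Metropolis acceptance
            p = cand
            if total_cost(p) < total_cost(best):
                best = list(p)
        temp *= cooling                                            # geometric cooling
    return best, total_cost(best)

dispatch, cost = simulated_annealing()
print("Dispatch (MW):", [round(x, 1) for x in dispatch], " cost:", round(cost, 1))
```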

  2. A high-performance spatial database based approach for pathology imaging algorithm evaluation

    Directory of Open Access Journals (Sweden)

    Fusheng Wang

    2013-01-01

    Full Text Available Background: Algorithm evaluation provides a means to characterize variability across image analysis algorithms, validate algorithms by comparison with human annotations, combine results from multiple algorithms for performance improvement, and facilitate algorithm sensitivity studies. The sizes of images and image analysis results in pathology image analysis pose significant challenges in algorithm evaluation. We present an efficient parallel spatial database approach to model, normalize, manage, and query large volumes of analytical image result data. This provides an efficient platform for algorithm evaluation. Our experiments with a set of brain tumor images demonstrate the application, scalability, and effectiveness of the platform. Context: The paper describes an approach and platform for evaluation of pathology image analysis algorithms. The platform facilitates algorithm evaluation through a high-performance database built on the Pathology Analytic Imaging Standards (PAIS) data model. Aims: (1) Develop a framework to support algorithm evaluation by modeling and managing analytical results and human annotations from pathology images; (2) Create a robust data normalization tool for converting, validating, and fixing spatial data from algorithm or human annotations; (3) Develop a set of queries to support data sampling and result comparisons; (4) Achieve high performance computation capacity via a parallel data management infrastructure, parallel data loading and spatial indexing optimizations in this infrastructure. Materials and Methods: We have considered two scenarios for algorithm evaluation: (1) algorithm comparison, where multiple result sets from different methods are compared and consolidated; and (2) algorithm validation, where algorithm results are compared with human annotations. We have developed a spatial normalization toolkit to validate and normalize spatial boundaries produced by image analysis algorithms or human annotations. The

  3. New Search Space Reduction Algorithm for Vertical Reference Trajectory Optimization

    Directory of Open Access Journals (Sweden)

    Alejandro MURRIETA-MENDOZA

    2016-06-01

    Full Text Available Burning the fuel required to sustain a given flight releases pollution such as carbon dioxide and nitrogen oxides, and the amount of fuel consumed is also a significant expense for airlines. It is desirable to reduce fuel consumption to reduce both pollution and flight costs. To increase fuel savings in a given flight, one option is to compute the most economical vertical reference trajectory (or flight plan. A deterministic algorithm was developed using a numerical aircraft performance model to determine the most economical vertical flight profile considering take-off weight, flight distance, step climb and weather conditions. This algorithm is based on linear interpolations of the performance model using the Lagrange interpolation method. The algorithm downloads the latest available forecast from Environment Canada according to the departure date and flight coordinates, and calculates the optimal trajectory taking into account the effects of wind and temperature. Techniques to avoid unnecessary calculations are implemented to reduce the computation time. The costs of the reference trajectories proposed by the algorithm are compared with the costs of the reference trajectories proposed by a commercial flight management system using the fuel consumption estimated by the FlightSim® simulator made by Presagis®.
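
    The record states that the trajectory algorithm relies on Lagrange interpolation of a numerical performance model. The sketch below shows generic Lagrange interpolation of a small table; the fuel-burn-versus-weight values are invented and merely stand in for the actual performance data.

    ```python
    def lagrange_interpolate(xs, ys, x):
        """Evaluate the Lagrange interpolating polynomial through (xs, ys) at x."""
        total = 0.0
        for i, (xi, yi) in enumerate(zip(xs, ys)):
            weight = 1.0
            for j, xj in enumerate(xs):
                if j != i:
                    weight *= (x - xj) / (xi - xj)
            total += yi * weight
        return total

    # Hypothetical fuel-burn table (kg per nautical mile) versus gross weight (tonnes):
    weights = [180.0, 200.0, 220.0, 240.0]
    fuel_burn = [10.1, 11.0, 12.2, 13.7]
    print(lagrange_interpolate(weights, fuel_burn, 210.0))
    ```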

  4. Finite Element Simulation of Sheet Metal Forming Process Using Local Interpolation for Tool Surfaces

    International Nuclear Information System (INIS)

    Hama, Takayuki; Takuda, Hirohiko; Takamura, Masato; Makinouchi, Akitake; Teodosiu, Cristian

    2005-01-01

    Treatment of contact between a sheet and tools is one of the most difficult problems to deal with in finite-element simulations of sheet forming processes. In order to obtain more accurate tool models without increasing the number of elements, this paper describes a new formulation for contact problems using the interpolation proposed by Nagata for tool surfaces. A contact search algorithm between sheet nodes and the interpolated tool surfaces was developed and introduced into the static-explicit elastoplastic finite-element method code STAMP3D. Simulations of a square cup deep drawing process with a very coarsely discretized punch model were carried out. The simulated results showed that the proposed algorithm gave the proper drawn shape, demonstrating its validity.

  5. Analysis of Behavioral Economics in Crowdsensing: A Loss Aversion Cooperation Model

    Directory of Open Access Journals (Sweden)

    Deng Li

    2018-01-01

    Full Text Available The existing incentive mechanisms for crowdsourcing construct the expected utility function based on the rational-agent assumption of traditional economics. A large number of studies in behavioral economics have demonstrated the shortcomings of the traditional utility function and introduced a parameter called the loss aversion coefficient to compute an individual's utility when it suffers a loss. In this paper, a payment algorithm that combines behavioral economics with loss aversion is proposed. In contrast to the usual incentive mechanisms, the node utility function is redefined using the loss aversion characteristic of the node. Experimental results show that the proposed algorithm achieves a higher rate of cooperation at a lower payment price and has good scalability compared with the traditional incentive mechanism.
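
    The abstract does not give the redefined node utility function, so the sketch below only illustrates the loss-aversion idea itself with a prospect-theory-style value function; the exponents and the loss aversion coefficient are the commonly cited Tversky-Kahneman estimates, not values from the paper.

    ```python
    def loss_averse_value(x, alpha=0.88, beta=0.88, lam=2.25):
        """Prospect-theory style value function: gains are evaluated by a concave
        power function, while losses are amplified by the loss-aversion
        coefficient lam (lam > 1 means losses loom larger than equal gains)."""
        return x ** alpha if x >= 0 else -lam * ((-x) ** beta)

    print(loss_averse_value(100.0))   # subjective value of a gain of 100
    print(loss_averse_value(-100.0))  # the equal-sized loss weighs about 2.25x more
    ```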

  6. THE CONCEPT OF USING EVOLUTIONARY ALGORITHMS AS TOOLS FOR OPTIMAL PLANNING OF MULTIMODAL COMPOSITION IN THE DIDACTIC TEXTS

    Directory of Open Access Journals (Sweden)

    Marek A. Jakubowski

    2014-11-01

    Full Text Available At the beginning we provide a short description of connectivism, the new theory of learning in the digital age. It integrates principles explored by chaos, network, complexity and self-organization theories. Next, we briefly describe new visual solutions for the teaching of writing, the so-called multimodal literacy 5–11. We define and describe the notions of multimodal text and the original theory known as NOS (non-optimum systems methodology) as a basis for new methods of visual solutions in the classroom and applications of audiovisual texts. In particular, we emphasize the tremendous usefulness of the evolutionary algorithms VEGA and NSGA as tools for optimal planning of multimodal composition in didactic texts. Finally, we give some examples of didactic texts for classrooms, which provide a deep insight into the learning skills and tasks needed in the Internet age.

  7. InfoRoute: the CISMeF Context-specific Search Algorithm.

    Science.gov (United States)

    Merabti, Tayeb; Lelong, Romain; Darmoni, Stefan

    2015-01-01

    The aim of this paper was to present a practical InfoRoute algorithm and applications developed by CISMeF to perform a contextual information retrieval across multiple medical websites in different health domains. The algorithm was developed to treat multiple types of queries: natural, Boolean and advanced. The algorithm also generates multiple types of queries: Boolean query, PubMed query or Advanced query. Each query can be extended via an inter alignments relationship from UMLS and HeTOP portal. A web service and two web applications have been developed based on the InfoRoute algorithm to generate links-query across multiple websites, i.e.: "PubMed" or "ClinicalTrials.org". The InfoRoute algorithm is a useful tool to perform contextual information retrieval across multiple medical websites in both English and French.

  8. Identifying persistent and characteristic features in firearm tool marks on cartridge cases

    Science.gov (United States)

    Ott, Daniel; Soons, Johannes; Thompson, Robert; Song, John

    2017-12-01

    Recent concerns about subjectivity in forensic firearm identification have motivated the development of algorithms to compare firearm tool marks that are imparted on ammunition and to generate quantitative measures of similarity. In this paper, we describe an algorithm that identifies impressed tool marks on a cartridge case that are both consistent between firings and contribute strongly to a surface similarity metric. The result is a representation of the tool mark topography that emphasizes both significant and persistent features across firings. This characteristic surface map is useful for understanding the variability and persistence of the tool marks created by a firearm and can provide improved discrimination between the comparison scores of samples fired from the same firearm and the scores of samples fired from different firearms. The algorithm also provides a convenient method for visualizing areas of similarity that may be useful in providing quantitative support for visual comparisons by trained examiners.

  9. Accelerated bridge construction (ABC) decision making and economic modeling tool.

    Science.gov (United States)

    2011-12-01

    In this FHWA-sponsored pool funded study, a set of decision making tools, based on the Analytic Hierarchy Process (AHP) was developed. This tool set is prepared for transportation specialists and decision-makers to determine if ABC is more effective ...

  10. CAMPAIGN: an open-source library of GPU-accelerated data clustering algorithms.

    Science.gov (United States)

    Kohlhoff, Kai J; Sosnick, Marc H; Hsu, William T; Pande, Vijay S; Altman, Russ B

    2011-08-15

    Data clustering techniques are an essential component of a good data analysis toolbox. Many current bioinformatics applications are inherently compute-intense and work with very large datasets. Sequential algorithms are inadequate for providing the necessary performance. For this reason, we have created Clustering Algorithms for Massively Parallel Architectures, Including GPU Nodes (CAMPAIGN), a central resource for data clustering algorithms and tools that are implemented specifically for execution on massively parallel processing architectures. CAMPAIGN is a library of data clustering algorithms and tools, written in 'C for CUDA' for Nvidia GPUs. The library provides up to two orders of magnitude speed-up over respective CPU-based clustering algorithms and is intended as an open-source resource. New modules from the community will be accepted into the library and the layout of it is such that it can easily be extended to promising future platforms such as OpenCL. Releases of the CAMPAIGN library are freely available for download under the LGPL from https://simtk.org/home/campaign. Source code can also be obtained through anonymous subversion access as described on https://simtk.org/scm/?group_id=453. kjk33@cantab.net.

  11. Heterogenous Agents Model with the Worst Out Algorithm

    Czech Academy of Sciences Publication Activity Database

    Vácha, Lukáš; Vošvrda, Miloslav

    -, č. 8 (2006), s. 3-19 ISSN 1801-5999 Institutional research plan: CEZ:AV0Z10750506 Keywords : efficient market hypothesis * fractal market hypothesis * agents' investment horizons * agents' trading strategies * technical trading rules * heterogeneous agent model with stochastic memory * Worst out algorithm Subject RIV: AH - Economics

  12. The economic security of power plants

    Directory of Open Access Journals (Sweden)

    Niedziółka Dorota

    2017-01-01

    Full Text Available Currently, power plants in Poland have to work in a very uncomfortable situation. Unstable market conditions and frequent changes in the law may have serious adverse consequences for their economic security. Power plants play a very important role in the economy. The effectiveness of their performance affects the activity of all other businesses. Therefore, it is very important to provide a definition of economic security for the power plants’ sector and the factors determining its level. Maintaining economic security will allow energy generation companies to grow in a sustainable way as well as limit operational risk. A precise definition can also be used to create analytical tools for economic security measurement and monitoring. Proper usage of such tools can help energy generation companies sustain their economic security and properly plan their capital expenditures. The article focuses on the definition of economic security in the “micro” context of a separate business unit (enterprise). We also present an analytical model that measures the economic security of a company engaged in the production of energy - a company of strategic importance for the national economy. The model uses macroeconomic variables, variables describing prices of raw material and legal / political stability in the country, as well as selected financial indicators. Applying the conclusions resulting from the model’s implementation will help provide economic security for companies generating energy.

  13. Software tool for data mining and its applications

    Science.gov (United States)

    Yang, Jie; Ye, Chenzhou; Chen, Nianyi

    2002-03-01

    A software tool for data mining is introduced, which integrates pattern recognition (PCA, Fisher, clustering, hyperenvelop, regression), artificial intelligence (knowledge representation, decision trees), statistical learning (rough set, support vector machine), and computational intelligence (neural network, genetic algorithm, fuzzy systems). It consists of nine function models: pattern recognition, decision trees, association rules, fuzzy rules, neural network, genetic algorithm, Hyper Envelop, support vector machine, and visualization. The principle and knowledge representation of some function models of data mining are described. The software tool is realized in Visual C++ under Windows 2000. Nonmonotony in data mining is dealt with by concept hierarchy and layered mining. The software tool has been satisfactorily applied in the prediction of regularities of the formation of ternary intermetallic compounds in alloy systems, and in the diagnosis of brain glioma.

  14. Economic Statistical Design of Variable Sampling Interval X̄ Control Chart Based on Surrogate Variable Using Genetic Algorithms

    Directory of Open Access Journals (Sweden)

    Lee Tae-Hoon

    2016-12-01

    Full Text Available In many cases, an X̄ control chart based on a performance variable is used in industrial fields. Typically, the control chart monitors the measurements of the performance variable itself. However, if the performance variable is too costly or impossible to measure, and a less expensive surrogate variable is available, the process may be more efficiently controlled using surrogate variables. In this paper, we present a model for the economic statistical design of a VSI (Variable Sampling Interval) X̄ control chart using a surrogate variable that is linearly correlated with the performance variable. We derive the total average profit model from an economic viewpoint, apply the model to a Very High Temperature Reactor (VHTR) nuclear fuel measurement system, and derive the optimal result using genetic algorithms. Compared with the control chart based on a performance variable, the proposed model gives a larger expected net income per unit of time in the long run if the correlation between the performance variable and the surrogate variable is relatively high. The proposed model was confined to the sample mean control chart under the assumption that a single assignable cause occurs according to a Poisson process. However, the model may also be extended to other types of control charts using single or multiple assignable cause assumptions, such as the VSS (Variable Sample Size) X̄ control chart, EWMA and CUSUM charts, and so on.

  15. Improved Ant Colony Clustering Algorithm and Its Performance Study

    Science.gov (United States)

    Gao, Wei

    2016-01-01

    Clustering analysis is used in many disciplines and applications; it is an important tool that descriptively identifies homogeneous groups of objects based on attribute values. The ant colony clustering algorithm is a swarm-intelligent method used for clustering problems that is inspired by the behavior of ant colonies that cluster their corpses and sort their larvae. A new abstraction ant colony clustering algorithm using a data combination mechanism is proposed to improve the computational efficiency and accuracy of the ant colony clustering algorithm. The abstraction ant colony clustering algorithm is used to cluster benchmark problems, and its performance is compared with the ant colony clustering algorithm and other methods used in existing literature. Based on similar computational difficulties and complexities, the results show that the abstraction ant colony clustering algorithm produces results that are not only more accurate but also more efficiently determined than the ant colony clustering algorithm and the other methods. Thus, the abstraction ant colony clustering algorithm can be used for efficient multivariate data clustering. PMID:26839533

  16. Linear programming mathematics, theory and algorithms

    CERN Document Server

    1996-01-01

    Linear Programming provides an in-depth look at simplex based as well as the more recent interior point techniques for solving linear programming problems. Starting with a review of the mathematical underpinnings of these approaches, the text provides details of the primal and dual simplex methods with the primal-dual, composite, and steepest edge simplex algorithms. This then is followed by a discussion of interior point techniques, including projective and affine potential reduction, primal and dual affine scaling, and path following algorithms. Also covered is the theory and solution of the linear complementarity problem using both the complementary pivot algorithm and interior point routines. A feature of the book is its early and extensive development and use of duality theory. Audience: The book is written for students in the areas of mathematics, economics, engineering and management science, and professionals who need a sound foundation in the important and dynamic discipline of linear programming.

  17. Deconvolution algorithms applied in ultrasonics

    International Nuclear Information System (INIS)

    Perrot, P.

    1993-12-01

    In a complete system of acquisition and processing of ultrasonic signals, it is often necessary at one stage to use some processing tools to get rid of the influence of the different elements of that system. By that means, the final quality of the signals in terms of resolution is improved. There are two main characteristics of ultrasonic signals which make this task difficult. Firstly, the signals generated by transducers are very often non-minimum phase. The classical deconvolution algorithms are unable to deal with such characteristics. Secondly, depending on the medium, the shape of the propagating pulse is evolving. The spatial invariance assumption often used in classical deconvolution algorithms is rarely valid. Many classical algorithms, parametric and non-parametric, have been investigated: the Wiener-type, the adaptive predictive techniques, the Oldenburg technique in the frequency domain, the minimum variance deconvolution. All the algorithms have been firstly tested on simulated data. One specific experimental set-up has also been analysed. Simulated and real data has been produced. This set-up demonstrated the interest in applying deconvolution, in terms of the achieved resolution. (author). 32 figs., 29 refs
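
    Wiener-type deconvolution is the first of the classical algorithms the record lists. The sketch below applies a frequency-domain Wiener filter to a toy 1-D trace; the impulse response, sparse reflectivity sequence and signal-to-noise ratio are invented and merely stand in for real ultrasonic data.

    ```python
    import numpy as np

    def wiener_deconvolve(y, h, snr=100.0):
        """Frequency-domain Wiener deconvolution of a 1-D signal.

        y   : observed signal (true signal convolved with h, plus noise)
        h   : assumed impulse response, zero-padded to len(y)
        snr : assumed signal-to-noise power ratio (regularises the inversion)
        """
        n = len(y)
        H = np.fft.rfft(h, n)
        Y = np.fft.rfft(y, n)
        G = np.conj(H) / (np.abs(H) ** 2 + 1.0 / snr)  # Wiener filter
        return np.fft.irfft(G * Y, n)

    # Toy example: a sparse reflectivity sequence blurred by a short pulse.
    rng = np.random.default_rng(0)
    x = np.zeros(256)
    x[[40, 90, 91, 180]] = [1.0, -0.6, 0.8, 0.5]
    h = np.array([0.2, 1.0, 0.6, -0.3, 0.1])
    y = np.convolve(x, h)[:256] + 0.01 * rng.standard_normal(256)
    x_hat = wiener_deconvolve(y, h, snr=200.0)
    print(np.argsort(np.abs(x_hat))[-4:])  # indices of the strongest recovered spikes
    ```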

  18. Multiperiod hydrothermal economic dispatch by an interior point method

    Directory of Open Access Journals (Sweden)

    Kimball L. M.

    2002-01-01

    Full Text Available This paper presents an interior point algorithm to solve the multiperiod hydrothermal economic dispatch (HTED. The multiperiod HTED is a large scale nonlinear programming problem. Various optimization methods have been applied to the multiperiod HTED, but most neglect important network characteristics or require decomposition into thermal and hydro subproblems. The algorithm described here exploits the special bordered block diagonal structure and sparsity of the Newton system for the first order necessary conditions to result in a fast efficient algorithm that can account for all network aspects. Applying this new algorithm challenges a conventional method for the use of available hydro resources known as the peak shaving heuristic.

  19. Real-time algorithm for acoustic imaging with a microphone array.

    Science.gov (United States)

    Huang, Xun

    2009-05-01

    Acoustic phased arrays have become an important testing tool in aeroacoustic research, where the conventional beamforming algorithm has been adopted as a classical processing technique. The computation, however, has to be performed off-line because of its high cost. An innovative algorithm with real-time capability is proposed in this work. The algorithm is similar to a classical observer in the time domain, extended for array processing to the frequency domain. The observer-based algorithm is beneficial mainly for its capability of operating over sampling blocks recursively. The expensive experimental time can therefore be reduced extensively, since any defect in a test can be corrected instantaneously.

  20. Dynamic Uniform Scaling for Multiobjective Genetic Algorithms

    DEFF Research Database (Denmark)

    Pedersen, Gerulf; Goldberg, David E.

    2004-01-01

    Before Multiobjective Evolutionary Algorithms (MOEAs) can be used as a widespread tool for solving arbitrary real world problems there are some salient issues which require further investigation. One of these issues is how a uniform distribution of solutions along the Pareto non-dominated front c...

  1. Dynamic Uniform Scaling for Multiobjective Genetic Algorithms

    DEFF Research Database (Denmark)

    Pedersen, Gerulf; Goldberg, D.E.

    2004-01-01

    Before Multiobjective Evolutionary Algorithms (MOEAs) can be used as a widespread tool for solving arbitrary real world problems there are some salient issues which require further investigation. One of these issues is how a uniform distribution of solutions along the Pareto non-dominated front can...

  2. Semer: a simple calculational tool for the economic evaluations of reactor systems and associated innovations

    Energy Technology Data Exchange (ETDEWEB)

    Nisan, S. [CEA Cadarache, Nuclear Reactor Directorate, DRN, Dept. of Reactor Studies, DER, Reactor and Innovative Systems Service, SERSI, 13 - Saint Paul lez Durance (France); Rouyer, J.L. [Electricite de France (EDF), Pole Industrie, Div. Ingenierie et Services, 93 - Saint-Denis (France)

    2001-07-01

    This paper summarises part of our on-going investigations on the economic evaluation of various nuclear and fossil energy systems and related innovations. These investigations are principally concerned with the development of the code system SEMER and its validation. SEMER has been developed to furnish top management and project leaders with a simple tool for cost evaluations enabling a choice between competing technological options. The cost evaluation models currently integrated in the SEMER system already cover a very wide range of electricity producing systems and, where relevant, their associated fuel cycles: the 'global models', allowing rapid but relatively approximate overall cost estimations (about 15% error). These include almost all the electricity producing systems using fossil energies (oil, coal, gas, including gas turbines with combined cycles) and nuclear reactor systems including all the French PWRs, HTRs, compact PWRs, and PWRs for nuclear propulsion systems. (author)

  3. Quantum algorithms for topological and geometric analysis of data

    Science.gov (United States)

    Lloyd, Seth; Garnerone, Silvano; Zanardi, Paolo

    2016-01-01

    Extracting useful information from large data sets can be a daunting task. Topological methods for analysing data sets provide a powerful technique for extracting such information. Persistent homology is a sophisticated tool for identifying topological features and for determining how such features persist as the data is viewed at different scales. Here we present quantum machine learning algorithms for calculating Betti numbers—the numbers of connected components, holes and voids—in persistent homology, and for finding eigenvectors and eigenvalues of the combinatorial Laplacian. The algorithms provide an exponential speed-up over the best currently known classical algorithms for topological data analysis. PMID:26806491

  4. Combined algorithms in nonlinear problems of magnetostatics

    International Nuclear Information System (INIS)

    Gregus, M.; Khoromskij, B.N.; Mazurkevich, G.E.; Zhidkov, E.P.

    1988-01-01

    To solve boundary problems of magnetostatics in unbounded two- and three-dimensional regions, we construct combined algorithms based on a combination of the method of boundary integral equations with grid methods. We study the substantiation of the combined method for the nonlinear magnetostatic problem without preliminary discretization of the equations and give some results on the convergence of the iterative processes that arise in nonlinear cases. We also discuss economical iterative processes and algorithms for solving boundary integral equations on certain surfaces. Finally, examples of numerical solutions of magnetostatic problems that arose when modelling the fields of electrophysical installations are given. 14 refs.; 2 figs.; 1 tab

  5. The behavioral economics of health and health care.

    Science.gov (United States)

    Rice, Thomas

    2013-01-01

    People often make decisions in health care that are not in their best interest, ranging from failing to enroll in health insurance to which they are entitled, to engaging in extremely harmful behaviors. Traditional economic theory provides a limited tool kit for improving behavior because it assumes that people make decisions in a rational way, have the mental capacity to deal with huge amounts of information and choice, and have tastes endemic to them and not open to manipulation. Melding economics with psychology, behavioral economics acknowledges that people often do not act rationally in the economic sense. It therefore offers a potentially richer set of tools than provided by traditional economic theory to understand and influence behaviors. Only recently, however, has it been applied to health care. This article provides an overview of behavioral economics, reviews some of its contributions, and shows how it can be used in health care to improve people's decisions and health.

  6. THE WASTE REDUCTION (WAR) ALGORITHM: ENVIRONMENTAL IMPACTS, ENERGY CONSUMPTION, AND ENGINEERING ECONOMICS

    Science.gov (United States)

    A general theory known as the WAste Reduction (WAR) algorithm has been developed to describe the flow and the generation of potential environmental impact through a chemical process. This theory defines potential environmental impact indexes that characterize the generation and t...

  7. Production engineering jig and tool design

    CERN Document Server

    Jones, E J H

    1972-01-01

    Production Engineering: Jig and Tool Design focuses on jig and tool design as part of production engineering and covers topics ranging from inspection and gauging to multiple and consecutive tooling, tool calculation and development of form tools, deep-hole boring, and grinding-wheel form-crushing. Air and oil operated fixtures, negative rake machining, and the economics of jig and fixture practice are also discussed. This text comprises 22 chapters, the first of which provides an overview of the function and organization of the jig and tool department. Attention then turns to the subjec

  8. CiSE: a circular spring embedder layout algorithm.

    Science.gov (United States)

    Dogrusoz, Ugur; Belviranli, Mehmet E; Dilek, Alptug

    2013-06-01

    We present a new algorithm for automatic layout of clustered graphs using a circular style. The algorithm tries to determine optimal location and orientation of individual clusters intrinsically within a modified spring embedder. Heuristics such as reversal of the order of nodes in a cluster and swap of neighboring node pairs in the same cluster are employed intermittently to further relax the spring embedder system, resulting in reduced inter-cluster edge crossings. Unlike other algorithms generating circular drawings, our algorithm does not require the quotient graph to be acyclic, nor does it sacrifice the edge crossing number of individual clusters to improve respective positioning of the clusters. Moreover, it reduces the total area required by a cluster by using the space inside the associated circle. Experimental results show that the execution time and quality of the produced drawings with respect to commonly accepted layout criteria are quite satisfactory, surpassing previous algorithms. The algorithm has also been successfully implemented and made publicly available as part of a compound and clustered graph editing and layout tool named CHISIO.

  9. Quality control of the documentation process in electronic economic activities

    Directory of Open Access Journals (Sweden)

    Krutova A.S.

    2017-06-01

    Full Text Available It is argued that the main tool for providing adequate information resources in electronic economic activity, as part of social and economic relations, is quality control of documentation processes, which forms the basis of the global information space. Two directions are identified for evaluating information resources in the documentation process: the development of tools to assess the efficiency of system components (qualitative assessment) and the development of mathematical modeling tools (quantitative assessment). A qualitative assessment of the electronic documentation of economic activity covers execution performance, communication efficiency, document management efficiency, the effectiveness of flow control operations, and the effectiveness of relationship management. The concept of quality control of the documentation process in electronic economic activity comprises components that include: the level of workflow; the adequacy of information forms; the consumer quality of documents; quality attributes; the type of incoming data; condition monitoring systems; and the organizational level of process documentation. The components of the control system for electronic documents of economic entities are substantiated. The components of an IT audit of the management system of economic activity are identified: compliance audit; audit of internal control; detailed multilevel analysis; and corporate risk assessment methodology. The stages and methods of processing electronic transactions of economic activity during condition monitoring of electronic economic activity are also outlined.

  10. Evolutionary algorithm based optimization of hydraulic machines utilizing a state-of-the-art block coupled CFD solver and parametric geometry and mesh generation tools

    Science.gov (United States)

    Kyriacou, S; Kontoleontos, E; Weissenberger, S; Mangani, L; Casartelli, E; Skouteropoulou, I; Gattringer, M; Gehrer, A; Buchmayr, M

    2014-03-01

    An efficient hydraulic optimization procedure, suitable for industrial use, requires an advanced optimization tool (EASY software), a fast solver (block coupled CFD) and a flexible geometry generation tool. EASY optimization software is a PCA-driven metamodel-assisted Evolutionary Algorithm (MAEA (PCA)) that can be used in both single- (SOO) and multiobjective optimization (MOO) problems. In MAEAs, low cost surrogate evaluation models are used to screen out non-promising individuals during the evolution and exclude them from the expensive, problem specific evaluation, here the solution of Navier-Stokes equations. For additional reduction of the optimization CPU cost, the PCA technique is used to identify dependences among the design variables and to exploit them in order to efficiently drive the application of the evolution operators. To further enhance the hydraulic optimization procedure, a very robust and fast Navier-Stokes solver has been developed. This incompressible CFD solver employs a pressure-based block-coupled approach, solving the governing equations simultaneously. This method, apart from being robust and fast, also provides a big gain in terms of computational cost. In order to optimize the geometry of hydraulic machines, an automatic geometry and mesh generation tool is necessary. The geometry generation tool used in this work is entirely based on b-spline curves and surfaces. In what follows, the components of the tool chain are outlined in some detail and the optimization results of hydraulic machine components are shown in order to demonstrate the performance of the presented optimization procedure.

  11. Evolutionary algorithm based optimization of hydraulic machines utilizing a state-of-the-art block coupled CFD solver and parametric geometry and mesh generation tools

    International Nuclear Information System (INIS)

    Kyriacou S; Kontoleontos E; Weissenberger S; Mangani L; Casartelli E; Skouteropoulou I; Gattringer M; Gehrer A; Buchmayr M

    2014-01-01

    An efficient hydraulic optimization procedure, suitable for industrial use, requires an advanced optimization tool (EASY software), a fast solver (block coupled CFD) and a flexible geometry generation tool. EASY optimization software is a PCA-driven metamodel-assisted Evolutionary Algorithm (MAEA (PCA)) that can be used in both single- (SOO) and multiobjective optimization (MOO) problems. In MAEAs, low cost surrogate evaluation models are used to screen out non-promising individuals during the evolution and exclude them from the expensive, problem specific evaluation, here the solution of Navier-Stokes equations. For additional reduction of the optimization CPU cost, the PCA technique is used to identify dependences among the design variables and to exploit them in order to efficiently drive the application of the evolution operators. To further enhance the hydraulic optimization procedure, a very robust and fast Navier-Stokes solver has been developed. This incompressible CFD solver employs a pressure-based block-coupled approach, solving the governing equations simultaneously. This method, apart from being robust and fast, also provides a big gain in terms of computational cost. In order to optimize the geometry of hydraulic machines, an automatic geometry and mesh generation tool is necessary. The geometry generation tool used in this work is entirely based on b-spline curves and surfaces. In what follows, the components of the tool chain are outlined in some detail and the optimization results of hydraulic machine components are shown in order to demonstrate the performance of the presented optimization procedure

  12. Optimal Selection of Clustering Algorithm via Multi-Criteria Decision Analysis (MCDA for Load Profiling Applications

    Directory of Open Access Journals (Sweden)

    Ioannis P. Panapakidis

    2018-02-01

    Full Text Available Due to the high implementation rates of smart meter systems, a considerable amount of research is devoted to machine learning tools for data handling and information retrieval. A key tool in load data processing is clustering. In recent years, a number of studies have proposed different clustering algorithms in the load profiling field. The present paper provides a methodology for addressing this selection problem through Multi-Criteria Decision Analysis (MCDA), namely the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS). A comparison of the algorithms is carried out. Next, a single test case on the selection of an algorithm is examined. User-specific weights are applied and, based on these weight values, the optimal algorithm is selected.
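
    Since the record names TOPSIS but does not reproduce its steps, the following is a generic TOPSIS sketch: vector normalisation, weighting, ideal and anti-ideal points, and relative closeness. The decision matrix, criteria and weights are invented purely for illustration and are not the paper's data.

    ```python
    import numpy as np

    def topsis(scores, weights, benefit):
        """Rank alternatives with TOPSIS.

        scores  : (n_alternatives, n_criteria) decision matrix
        weights : criteria weights, summing to 1
        benefit : True for criteria to maximise, False for criteria to minimise
        Returns the relative closeness to the ideal solution (higher is better).
        """
        scores = np.asarray(scores, dtype=float)
        norm = scores / np.linalg.norm(scores, axis=0)           # vector normalisation
        v = norm * np.asarray(weights)                            # weighted matrix
        benefit = np.asarray(benefit)
        ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))   # best per criterion
        anti = np.where(benefit, v.min(axis=0), v.max(axis=0))    # worst per criterion
        d_plus = np.linalg.norm(v - ideal, axis=1)
        d_minus = np.linalg.norm(v - anti, axis=1)
        return d_minus / (d_plus + d_minus)

    # Hypothetical comparison of three clustering algorithms on three criteria:
    # validity index (maximise), run time in seconds and memory in MB (minimise).
    matrix = [[0.82, 35.0, 300.0],
              [0.78, 12.0, 220.0],
              [0.85, 60.0, 410.0]]
    closeness = topsis(matrix, weights=[0.5, 0.3, 0.2], benefit=[True, False, False])
    print(closeness, "best =", int(np.argmax(closeness)))
    ```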

  13. Advanced metaheuristic algorithms for laser optimization

    International Nuclear Information System (INIS)

    Tomizawa, H.

    2010-01-01

    A laser is one of the most important experimental tools. In the synchrotron radiation field, lasers are widely used for experiments with pump-probe techniques. Especially for X-ray FELs, a laser has important roles as a seed light source or as the photocathode-illuminating light source used to generate a high-brightness electron bunch. Control of laser pulse characteristics is required for many kinds of experiments. However, the laser usually has to be tuned and customized for each requirement by laser experts. Automatic laser tuning therefore needs to be realized with sophisticated algorithms. Metaheuristic algorithms are useful candidates for finding solutions that are as close to optimal as acceptable. A metaheuristic laser tuning system is expected to save human resources and time during laser preparation. I have shown successful results with a metaheuristic algorithm based on a genetic algorithm to optimize spatial (transverse) laser profiles, and with a hill climbing method extended with fuzzy set theory to choose one of the best laser alignments automatically for each experimental requirement. (author)

  14. Complexity of Economical Systems

    Directory of Open Access Journals (Sweden)

    G. P. Pavlos

    2015-01-01

    Full Text Available In this study new theoretical concepts are described concerning the interpretation of economic complex dynamics. In addition, a summary of an extended algorithm of nonlinear time series analysis is provided, which is applied not only to economic time series but also to other physical complex systems (e.g. [22, 24]). In general, the economy is a vast and complicated set of arrangements and actions wherein agents—consumers, firms, banks, investors, government agencies—buy and sell, speculate, trade, oversee, bring products into being, offer services, invest in companies, strategize, explore, forecast, compete, learn, innovate, and adapt. As a result, economic and financial variables such as foreign exchange rates, gross domestic product, interest rates, production, stock market prices and unemployment exhibit the large-amplitude and aperiodic fluctuations evident in complex systems. Thus, economics can be considered a spatially distributed non-equilibrium complex system, for which new theoretical concepts, such as Tsallis non-extensive statistical mechanics and strange dynamics, percolation, and non-Gaussian, multifractal and multiscale dynamics related to fractional Langevin equations, can be used for modeling and understanding economic complexity locally or globally.

  15. IEA SHC Task 42/ECES Annex 29 – A Simple Tool for the Economic Evaluation of Thermal Energy Storages

    DEFF Research Database (Denmark)

    Rathgeber, Christoph; Hiebler, Stefan; Lävemann, Eberhard

    2016-01-01

    Within the framework of IEA SHC Task 42 / ECES Annex 29, a simple tool for the economic evaluation of thermal energy storages has been developed and tested on various existing storages. On that account, the storage capacity costs (costs per installed storage capacity) of thermal energy storages...... have been evaluated via a Top-down and a Bottom-up approach. The Top-down approach follows the assumption that the costs of energy supplied by the storage should not exceed the costs of energy from the market. The maximum acceptable storage capacity costs depend on the interest rate assigned...

  16. Techno-economic optimisation of energy systems

    International Nuclear Information System (INIS)

    Mansilla Pellen, Ch.

    2006-07-01

    The traditional approach currently used to assess the economic interest of energy systems is based on a defined flow-sheet. Some studies have shown that the flow-sheets corresponding to the best thermodynamic efficiencies do not necessarily lead to the best production costs. A method called techno-economic optimisation was proposed. This method aims at minimising the production cost of a given energy system, including both investment and operating costs. It was implemented using genetic algorithms. This approach was compared to the heat integration method on two different examples, thus validating its interest. Techno-economic optimisation was then applied to different energy systems dealing with hydrogen as well as electricity production. (author)

  17. E-PLE: an Algorithm for Image Inpainting

    Directory of Open Access Journals (Sweden)

    Yi-Qing Wang

    2013-12-01

    Full Text Available Gaussian mixture is a powerful tool for modeling the patch prior. In this work, a probabilistic view of an existing algorithm, piecewise linear estimation (PLE), for image inpainting is presented, which leads to several theoretical and numerical improvements based on an effective use of Gaussian mixtures.

  18. Optimal design and management of chlorination in drinking water networks: a multi-objective approach using Genetic Algorithms and the Pareto optimality concept

    Science.gov (United States)

    Nouiri, Issam

    2017-11-01

    This paper presents the development of multi-objective Genetic Algorithms to optimize chlorination design and management in drinking water networks (DWN). Three objectives have been considered: the improvement of chlorination uniformity (a health objective), and the minimization of the number of chlorine booster stations and of the injected chlorine mass (economic objectives). The problem is decomposed into a medium-term and a short-term one. The proposed methodology was tested on hypothetical and real DWNs. Results proved the ability of the developed optimization tool to identify relationships between the health and economic objectives as Pareto fronts. The proposed approach was efficient in computing solutions ensuring better chlorination uniformity while requiring the lowest injected chlorine mass when compared to other approaches. For the real DWN studied, chlorination optimization led to a great improvement in free-chlorine-dosing uniformity and to a meaningful reduction in chlorine mass, in comparison with conventional chlorination.
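
    To make the Pareto-optimality idea in the record concrete, here is a minimal non-dominated filter over candidate solutions; the objective triples (injected chlorine mass, number of boosters, a non-uniformity measure, all to be minimised) are invented for illustration and are not results from the paper.

    ```python
    def dominates(a, b):
        """True if objective vector a Pareto-dominates b (all objectives minimised)."""
        return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

    def pareto_front(points):
        """Return the non-dominated subset of a list of objective vectors."""
        return [p for p in points
                if not any(dominates(q, p) for q in points if q is not p)]

    # Hypothetical (chlorine mass, booster count, non-uniformity) candidates:
    candidates = [(12.0, 3, 0.20), (10.5, 4, 0.25), (15.0, 2, 0.18), (16.0, 5, 0.30)]
    print(pareto_front(candidates))
    ```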

  19. Alignment of Custom Standards by Machine Learning Algorithms

    Directory of Open Access Journals (Sweden)

    Adela Sirbu

    2010-09-01

    Full Text Available Building an efficient model for automatic alignment of terminologies would bring a significant improvement to the information retrieval process. We have developed and compared two machine-learning-based algorithms whose aim is to align two custom standards built on a 3-level taxonomy, using kNN and SVM classifiers that work on a vector representation consisting of several similarity measures. The weights utilized by the kNN were optimized with an evolutionary algorithm, while the SVM classifier's hyper-parameters were optimized with a grid search algorithm. The database used for training was semi-automatically obtained using the Coma++ tool. The performance of our aligners is shown by the results obtained on the test set.

  20. Control of baker’s yeast fermentation : PID and fuzzy algorithms

    OpenAIRE

    Machado, Carlos; Gomes, Pedro; Soares, Rui; Pereira, Silvia; Soares, Filomena

    2001-01-01

    A MATLAB/SIMULINK-based simulator was employed for studies concerning the control of baker’s yeast fed-batch fermentation. Four control algorithms were implemented and compared: classical PID control, two discrete versions (the modified velocity and position algorithms), and a fuzzy law. The simulation package was seen to be an efficient tool for simulating and testing control strategies for the non-linear process.
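
    The record contrasts the classical PID law with its discrete position and velocity (incremental) forms; below is a generic textbook sketch of those two discretizations, with invented gains, sampling time and error sequence rather than the tunings used in the paper.

    ```python
    def pid_position(errors, kp, ki, kd, dt):
        """Discrete position form: u_k = Kp*e_k + Ki*dt*sum(e) + Kd*(e_k - e_{k-1})/dt."""
        u, integral, prev_e = [], 0.0, errors[0]
        for e in errors:
            integral += e * dt
            u.append(kp * e + ki * integral + kd * (e - prev_e) / dt)
            prev_e = e
        return u

    def pid_velocity(errors, kp, ki, kd, dt, u0=0.0):
        """Discrete velocity (incremental) form: the controller accumulates increments
        du_k = Kp*(e_k - e_{k-1}) + Ki*dt*e_k + Kd*(e_k - 2*e_{k-1} + e_{k-2})/dt."""
        u, prev_u = [], u0
        e1 = e2 = errors[0]
        for e in errors:
            du = kp * (e - e1) + ki * dt * e + kd * (e - 2 * e1 + e2) / dt
            prev_u += du
            u.append(prev_u)
            e2, e1 = e1, e
        return u

    errors = [1.0, 0.8, 0.5, 0.3, 0.1, 0.0]  # hypothetical setpoint errors
    print(pid_position(errors, kp=2.0, ki=0.5, kd=0.1, dt=1.0))
    print(pid_velocity(errors, kp=2.0, ki=0.5, kd=0.1, dt=1.0))
    ```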

  1. DNA Microarray Data Analysis: A Novel Biclustering Algorithm Approach

    Directory of Open Access Journals (Sweden)

    Tewfik Ahmed H

    2006-01-01

    Full Text Available Biclustering algorithms refer to a distinct class of clustering algorithms that perform simultaneous row-column clustering. Biclustering problems arise in DNA microarray data analysis, collaborative filtering, market research, information retrieval, text mining, electoral trends, exchange analysis, and so forth. When dealing with DNA microarray experimental data for example, the goal of biclustering algorithms is to find submatrices, that is, subgroups of genes and subgroups of conditions, where the genes exhibit highly correlated activities for every condition. In this study, we develop novel biclustering algorithms using basic linear algebra and arithmetic tools. The proposed biclustering algorithms can be used to search for all biclusters with constant values, biclusters with constant values on rows, biclusters with constant values on columns, and biclusters with coherent values from a set of data in a timely manner and without solving any optimization problem. We also show how one of the proposed biclustering algorithms can be adapted to identify biclusters with coherent evolution. The algorithms developed in this study discover all valid biclusters of each type, while almost all previous biclustering approaches will miss some.

  2. The JPSS Ground Project Algorithm Verification, Test and Evaluation System

    Science.gov (United States)

    Vicente, G. A.; Jain, P.; Chander, G.; Nguyen, V. T.; Dixon, V.

    2016-12-01

    The Government Resource for Algorithm Verification, Independent Test, and Evaluation (GRAVITE) is an operational system that provides services to the Suomi National Polar-orbiting Partnership (S-NPP) Mission. It is also a unique environment for Calibration/Validation (Cal/Val) and Data Quality Assessment (DQA) of the Joint Polar Satellite System (JPSS) mission data products. GRAVITE provides fast and direct access to the data and products created by the Interface Data Processing Segment (IDPS), the NASA/NOAA operational system that converts Raw Data Records (RDRs) generated by sensors on the S-NPP into calibrated, geo-located Sensor Data Records (SDRs) and generates Mission Unique Products (MUPs). It also facilitates algorithm investigation, integration, checkout and tuning, instrument and product calibration and data quality support, monitoring, and data/product distribution. GRAVITE is the portal for the latest S-NPP and JPSS baselined Processing Coefficient Tables (PCTs) and Look-Up Tables (LUTs) and hosts a number of DQA offline tools that take advantage of proximity to the near-real-time data flows. It also contains a set of automated and ad hoc Cal/Val tools used for algorithm analysis and updates, including an instance of the IDPS called the GRAVITE Algorithm Development Area (G-ADA), which has the latest installation of the IDPS algorithms running on identical software and hardware platforms. Two other important GRAVITE components are the Investigator-led Processing System (IPS) and the Investigator Computing Facility (ICF). The IPS is a dedicated environment where authorized users run automated scripts called Product Generation Executables (PGEs) to support Cal/Val and data quality assurance offline. This data-rich and data-driven service holds its own distribution system and allows operators to retrieve science data products. The ICF is a workspace where users can share computing applications and resources and have full access to libraries and

  3. A New DG Multiobjective Optimization Method Based on an Improved Evolutionary Algorithm

    Directory of Open Access Journals (Sweden)

    Wanxing Sheng

    2013-01-01

    Full Text Available A distributed generation (DG) multiobjective optimization method based on an improved Pareto evolutionary algorithm is investigated in this paper. The improved Pareto evolutionary algorithm, which introduces a penalty factor in the objective function constraints, uses an adaptive crossover and mutation operator in the evolutionary process and combines a simulated annealing iterative process. The proposed algorithm is utilized to optimize DG injection models to maximize DG utilization while minimizing system loss and environmental pollution. A revised IEEE 33-bus system with multiple DG units was used to test the multiobjective optimization algorithm in a distribution power system. The proposed algorithm was implemented and compared with the strength Pareto evolutionary algorithm 2 (SPEA2), a particle swarm optimization (PSO) algorithm, and the nondominated sorting genetic algorithm II (NSGA-II). The comparison of the results demonstrates the validity and practicality of utilizing DG units in terms of economic dispatch and optimal operation in a distribution power system.

  4. SIMULATION TOOLS FOR ELECTRICAL MACHINES MODELLING ...

    African Journals Online (AJOL)

    Dr Obe

    ABSTRACT. Simulation tools are used both for research and teaching to allow a good ... The solution provide an easy way of determining the dynamic .... incorporate an in-built numerical algorithm, ... to learn, versatile in application, enhanced.

  5. Developing Programming Tools to Handle Traveling Salesman Problem by the Three Object-Oriented Languages

    Directory of Open Access Journals (Sweden)

    Hassan Ismkhan

    2014-01-01

    Full Text Available The traveling salesman problem (TSP) is one of the most famous problems. Many applications and programming tools have been developed to handle the TSP. However, it seems essential to provide easy programming tools based on state-of-the-art algorithms. Therefore, we have collected and programmed new, easy-to-use tools in three object-oriented languages. In this paper, we first present the ADT (abstract data type) of the developed tools; then we analyze their performance through experiments. We also design a hybrid genetic algorithm (HGA) using the developed tools. Experimental results show that the proposed HGA is comparable with recent state-of-the-art applications.
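
    The abstract does not detail the tools or the HGA, so the sketch below only illustrates two standard TSP building blocks that such toolkits typically expose, nearest-neighbour construction and 2-opt local improvement, on a set of random points; it is not the authors' library.

    ```python
    import math
    import random

    def tour_length(tour, pts):
        return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
                   for i in range(len(tour)))

    def nearest_neighbour(pts, start=0):
        """Greedy construction: always visit the closest unvisited city next."""
        unvisited, tour = set(range(len(pts))) - {start}, [start]
        while unvisited:
            last = tour[-1]
            nxt = min(unvisited, key=lambda j: math.dist(pts[last], pts[j]))
            tour.append(nxt)
            unvisited.remove(nxt)
        return tour

    def two_opt(tour, pts):
        """Local improvement: reverse segments while that shortens the tour."""
        improved = True
        while improved:
            improved = False
            for i in range(1, len(tour) - 1):
                for j in range(i + 1, len(tour)):
                    cand = tour[:i] + tour[i:j][::-1] + tour[j:]
                    if tour_length(cand, pts) < tour_length(tour, pts):
                        tour, improved = cand, True
        return tour

    random.seed(0)
    pts = [(random.random(), random.random()) for _ in range(30)]
    tour = two_opt(nearest_neighbour(pts), pts)
    print(round(tour_length(tour, pts), 3))
    ```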

  6. Economic impact of reduced mortality due to increased cycling.

    Science.gov (United States)

    Rutter, Harry; Cavill, Nick; Racioppi, Francesca; Dinsdale, Hywell; Oja, Pekka; Kahlmeier, Sonja

    2013-01-01

    Increasing regular physical activity is a key public health goal. One strategy is to change the physical environment to encourage walking and cycling, requiring partnerships with the transport and urban planning sectors. Economic evaluation is an important factor in the decision to fund any new transport scheme, but techniques for assessing the economic value of the health benefits of cycling and walking have tended to be less sophisticated than the approaches used for assessing other benefits. This study aimed to produce a practical tool for estimating the economic impact of reduced mortality due to increased cycling. The tool was intended to be transparent, easy to use, reliable, and based on conservative assumptions and default values, which can be used in the absence of local data. It addressed the question: For a given volume of cycling within a defined population, what is the economic value of the health benefits? The authors used published estimates of relative risk of all-cause mortality among regular cyclists and applied these to levels of cycling defined by the user to produce an estimate of the number of deaths potentially averted because of regular cycling. The tool then calculates the economic value of the deaths averted using the "value of a statistical life." The outputs of the tool support decision making on cycle infrastructure or policies, or can be used as part of an integrated economic appraisal. The tool's unique contribution is that it takes a public health approach to a transport problem, addresses it in epidemiologic terms, and places the results back into the transport context. Examples of its use include its adoption by the English and Swedish departments of transport as the recommended methodologic approach for estimating the health impact of walking and cycling. Copyright © 2013 World Health Organization. Published by Elsevier Inc. All rights reserved.
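
    As a rough, hedged illustration of the calculation the record describes (apply a published relative risk for all-cause mortality to a cycling population, then value the deaths averted with the value of a statistical life), the numbers below are entirely hypothetical; the real tool additionally scales the risk reduction with cycling volume and discounts benefits over time.

    ```python
    # Back-of-the-envelope version of the calculation with invented inputs.
    cyclists = 50_000                        # people cycling regularly
    baseline_mortality = 0.004               # annual all-cause death rate in that age band (assumed)
    relative_risk = 0.90                     # mortality among regular cyclists vs non-cyclists (illustrative)
    value_of_statistical_life = 2_000_000.0  # in euros (assumed)

    deaths_expected = cyclists * baseline_mortality
    deaths_averted = deaths_expected * (1.0 - relative_risk)
    annual_value = deaths_averted * value_of_statistical_life

    print(f"deaths averted per year: {deaths_averted:.1f}")
    print(f"economic value per year: {annual_value:,.0f} EUR")
    ```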

  7. A Coupling Tool for Parallel Molecular Dynamics-Continuum Simulations

    KAUST Repository

    Neumann, Philipp

    2012-06-01

    We present a tool for coupling Molecular Dynamics and continuum solvers. It is written in C++ and is meant to support the developers of hybrid molecular - continuum simulations in terms of both realisation of the respective coupling algorithm as well as parallel execution of the hybrid simulation. We describe the implementational concept of the tool and its parallel extensions. We particularly focus on the parallel execution of particle insertions into dense molecular systems and propose a respective parallel algorithm. Our implementations are validated for serial and parallel setups in two and three dimensions. © 2012 IEEE.

  8. The algorithms and principles of non-photorealistic graphics

    CERN Document Server

    Geng, Weidong

    2011-01-01

    ""The Algorithms and Principles of Non-photorealistic Graphics: Artistic Rendering and Cartoon Animation"" provides a conceptual framework for and comprehensive and up-to-date coverage of research on non-photorealistic computer graphics including methodologies, algorithms and software tools dedicated to generating artistic and meaningful images and animations. This book mainly discusses how to create art from a blank canvas, how to convert the source images into pictures with the desired visual effects, how to generate artistic renditions from 3D models, how to synthesize expressive pictures f

  9. A General Event Location Algorithm with Applications to Eclipse and Station Line-of-Sight

    Science.gov (United States)

    Parker, Joel J. K.; Hughes, Steven P.

    2011-01-01

    A general-purpose algorithm for the detection and location of orbital events is developed. The proposed algorithm reduces the problem to a global root-finding problem by mapping events of interest (such as eclipses, station access events, etc.) to continuous, differentiable event functions. A stepping algorithm and a bracketing algorithm are used to detect and locate the roots. Examples of event functions and the stepping/bracketing algorithms are discussed, along with results indicating performance and accuracy in comparison to commercial tools across a variety of trajectories.
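
    Under the assumption that the event function is scalar and continuous, the sketch below shows the generic pattern the record describes: a fixed-step scan that brackets sign changes, followed by bisection refinement of each bracket. The step size, tolerance, and the toy sinusoidal "shadow" function are invented for illustration and are not the authors' implementation.

    ```python
    import math

    def locate_events(f, t0, t1, step=60.0, tol=1e-6):
        """Find times in [t0, t1] where the event function f changes sign.

        A fixed-step scan detects sign changes (event brackets); bisection then
        refines each bracket to the requested tolerance.
        """
        roots, t, prev = [], t0, f(t0)
        while t < t1:
            t_next = min(t + step, t1)
            curr = f(t_next)
            if prev == 0.0:
                roots.append(t)
            elif prev * curr < 0.0:                      # sign change: event inside
                lo, hi = t, t_next
                while hi - lo > tol:
                    mid = 0.5 * (lo + hi)
                    if f(lo) * f(mid) <= 0.0:
                        hi = mid
                    else:
                        lo = mid
                roots.append(0.5 * (lo + hi))
            t, prev = t_next, curr
        return roots

    # Toy event function: "shadow entry/exit" modelled as a sinusoid crossing zero.
    events = locate_events(lambda t: math.sin(2 * math.pi * t / 5400.0), 0.0, 11000.0)
    print([round(e, 1) for e in events])
    ```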

  10. A General Event Location Algorithm with Applications to Eclipse and Station Line-of-Sight

    Science.gov (United States)

    Parker, Joel J. K.; Hughes, Steven P.

    2011-01-01

    A general-purpose algorithm for the detection and location of orbital events is developed. The proposed algorithm reduces the problem to a global root-finding problem by mapping events of interest (such as eclipses, station access events, etc.) to continuous, differentiable event functions. A stepping algorithm and a bracketing algorithm are used to detect and locate the roots. Examples of event functions and the stepping/bracketing algorithms are discussed, along with results indicating performance and accuracy in comparison to commercial tools across a variety of trajectories.

  11. Cost reduction improvement for power generation system integrating WECS using harmony search algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Ngonkham, S. [Khonkaen Univ., Amphur Muang (Thailand). Dept. of Electrical Engineering; Buasri, P. [Khonkaen Univ., Amphur Muang (Thailand). Embed System Research Group

    2009-03-11

    A harmony search (HS) algorithm was used to optimize economic dispatch (ED) in a wind energy conversion system (WECS) for power system integration. The HS algorithm was based on a stochastic random search method. System costs for the WECS system were estimated in relation to average wind speeds. The HS algorithm was implemented to optimize the ED with a simple programming procedure. The study showed that the initial parameters must be carefully selected to ensure the accuracy of the HS algorithm. The algorithm demonstrated that total costs of the WECS system were higher than costs associated with energy efficiency procedures that reduced the same amount of greenhouse gas (GHG) emissions. 7 refs., 10 tabs., 16 figs.
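
    The record only states that HS, a stochastic random-search method, was applied to ED; the sketch below is therefore a generic harmony search over a toy three-unit dispatch with quadratic fuel costs and a demand-balance penalty. The harmony memory size, HMCR, PAR, bandwidth and unit coefficients are all invented.

    ```python
    import random

    UNITS = [  # (a, b, c, Pmin, Pmax): quadratic fuel costs a + b*P + c*P^2
        (400.0, 6.0, 0.005, 100.0, 400.0),
        (300.0, 6.5, 0.008, 80.0, 300.0),
        (150.0, 7.0, 0.012, 50.0, 200.0),
    ]
    DEMAND = 650.0

    def fitness(p):
        fuel = sum(a + b * x + c * x * x for (a, b, c, _, _), x in zip(UNITS, p))
        return fuel + 1e3 * abs(sum(p) - DEMAND)  # penalise demand imbalance

    def harmony_search(hms=10, hmcr=0.9, par=0.3, bw=5.0, iters=5000, seed=2):
        rng = random.Random(seed)
        bounds = [(lo, hi) for (_, _, _, lo, hi) in UNITS]
        memory = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
        for _ in range(iters):
            new = []
            for d, (lo, hi) in enumerate(bounds):
                if rng.random() < hmcr:                   # memory consideration
                    x = rng.choice(memory)[d]
                    if rng.random() < par:                # pitch adjustment
                        x += rng.uniform(-bw, bw)
                else:                                     # random selection
                    x = rng.uniform(lo, hi)
                new.append(min(max(x, lo), hi))
            worst = max(range(hms), key=lambda i: fitness(memory[i]))
            if fitness(new) < fitness(memory[worst]):
                memory[worst] = new                       # replace the worst harmony
        best = min(memory, key=fitness)
        return best, fitness(best)

    print(harmony_search())
    ```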

  12. CONSIDERATIONS ON FISCAL POLICY AS A TOOL OF ECONOMIC RECOVERY

    Directory of Open Access Journals (Sweden)

    Stoichin Elena Mădălina

    2012-03-01

    Full Text Available One of the most important components of social and economic life is public finances, with direct implications for the formation and distribution of gross domestic product. In order to establish its own funds, the state applies the principle according to which any natural or legal person who earns an income or owns an asset in the category of taxable assets owes the state a tax or duty. Starting from these considerations, the paper analyses, on the one hand, the influencing factors and effects of increasing fiscal pressure and, on the other hand, the role of fiscal policy in economic recovery.

  13. Search algorithms, hidden labour and information control

    Directory of Open Access Journals (Sweden)

    Paško Bilić

    2016-06-01

    Full Text Available The paper examines some of the processes of the closely knit relationship between Google’s ideologies of neutrality and objectivity and global market dominance. Neutrality construction comprises an important element sustaining the company’s economic position and is reflected in constant updates, estimates and changes to utility and relevance of search results. Providing a purely technical solution to these issues proves to be increasingly difficult without a human hand in steering algorithmic solutions. Search relevance fluctuates and shifts through continuous tinkering and tweaking of the search algorithm. The company also uses third parties to hire human raters for performing quality assessments of algorithmic updates and adaptations in linguistically and culturally diverse global markets. The adaptation process contradicts the technical foundations of the company and calculations based on the initial PageRank algorithm. Annual market reports, Google’s Search Quality Rating Guidelines, and reports from media specialising in the search engine optimisation business are analysed. The Search Quality Rating Guidelines document provides a rare glimpse into the internal architecture of search algorithms and the notions of utility and relevance which are presented and structured as neutral and objective. Intertwined layers of ideology, hidden labour of human raters, advertising revenues, market dominance and control are discussed throughout the paper.

  14. Nonlinear Economic Model Predictive Control Strategy for Active Smart Buildings

    DEFF Research Database (Denmark)

    Santos, Rui Mirra; Zong, Yi; Sousa, Joao M. C.

    2016-01-01

    Nowadays, the development of advanced and innovative intelligent control techniques for energy management in buildings is a key issue within the smart grid topic. A nonlinear economic model predictive control (EMPC) scheme, based on a branch-and-bound tree search used as the optimization algorithm, is proposed. The controller is shown to be very reliable, keeping the comfort levels in the two considered seasons and shifting the load away from peak hours in order to achieve the desired flexible electricity consumption.

  15. An integrated environment for fast development and performance assessment of sonar image processing algorithms - SSIE

    DEFF Research Database (Denmark)

    Henriksen, Lars

    1996-01-01

    The sonar simulator integrated environment (SSIE) is a tool for developing high performance processing algorithms for single or sequences of sonar images. The tool is based on MATLAB, providing a very short lead time from concept to executable code and thereby assessment of the algorithms tested. A prerequisite for testing the algorithms is the availability of sonar images. To accommodate this problem, the SSIE has been equipped with a simulator capable of generating high fidelity sonar images for a given scene of objects, sea-bed, AUV path, etc. In the paper the main components of the SSIE are described and examples of different processing steps are given.

  16. EVOLUTION OF THEORETICAL AND METHODOLOGICAL FOUNDATIONS OF COMPARATIVE ECONOMICS

    OpenAIRE

    N. Grazhevska

    2014-01-01

    The article reveals the evolution stages of theoretical and methodological foundations of comparative economics. The author highlights algorithms of comparative analysis as well as theoretical and methodological limitations of four research programs of the new comparative economics. The article justifies the necessity of a comprehensive comparative study of the major trends and contradictions in the development of national economies in the era of globalization.

  17. EVOLUTION OF THEORETICAL AND METHODOLOGICAL FOUNDATIONS OF COMPARATIVE ECONOMICS

    Directory of Open Access Journals (Sweden)

    N. Grazhevska

    2014-12-01

    Full Text Available The article reveals the evolution stages of theoretical and methodological foundations of comparative economics. The author highlights algorithms of comparative analysis as well as theoretical and methodological limitations of four research programs of the new comparative economics. The article justifies the necessity of a comprehensive comparative study of the major trends and contradictions in the development of national economies in the era of globalization.

  18. Investments Portfolio Optimal Planning for industrial assets management: Method and Tool

    International Nuclear Information System (INIS)

    Lonchampt, Jerome; Fessart, Karine

    2012-01-01

    The purpose of this paper is to describe the method and tool dedicated to optimizing investment planning for industrial assets. These investments may be preventive maintenance tasks, asset enhancements or logistic investments such as spare parts purchases. The three methodological points to investigate in such an issue are: 1. the measure of the profitability of a portfolio of investments; 2. the selection and planning of an optimal set of investments; 3. the measure of the risk of a portfolio of investments. The measure of the profitability of a set of investments in the IPOP® tool is synthesised in the Net Present Value (NPV) indicator. The NPV is the sum of the differences of discounted cash flows (direct costs, forced outages, etc.) between the situations with and without a given investment. These cash flows are calculated through a pseudo-Markov reliability model representing independently the components of the industrial asset and the spare parts inventories. The component model has been widely discussed over the years, but the spare part model is a new one based on some approximations that will be discussed. This model, referred to as the NPV function, takes as input an investment portfolio and gives its NPV. The second issue is to optimize the NPV. If all investments were independent, this optimization would be an easy calculation; unfortunately there are two sources of dependency. The first one is introduced by the spare part model: even though components are independent in their reliability models, the fact that several components use the same inventory induces a dependency. The second dependency comes from economic, technical or logistic constraints, such as a global maintenance budget limit or a precedence constraint between two investments, making the aggregation of individual optima not necessarily feasible. The algorithm used to solve such a difficult optimization problem is a genetic algorithm.
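
    A highly simplified sketch of the idea described above: a genetic algorithm selecting a subset of investments to maximize total NPV under a budget constraint handled by a penalty. The NPVs, costs and budget are hypothetical, and the real IPOP model's coupling of investments through reliability and spare-part inventories is deliberately not reproduced.

    ```python
    import random

    # Hypothetical investments: (net present value, upfront cost); budget limit.
    INVEST = [(120.0, 40.0), (80.0, 25.0), (60.0, 30.0), (200.0, 90.0),
              (45.0, 10.0), (150.0, 70.0), (30.0, 15.0), (95.0, 50.0)]
    BUDGET = 180.0

    def fitness(bits):
        npv = sum(v for b, (v, _) in zip(bits, INVEST) if b)
        cost = sum(c for b, (_, c) in zip(bits, INVEST) if b)
        return npv - 1e3 * max(0.0, cost - BUDGET)        # penalize budget overruns

    def genetic_portfolio(pop_size=40, gens=200, p_mut=0.05):
        pop = [[random.randint(0, 1) for _ in INVEST] for _ in range(pop_size)]
        for _ in range(gens):
            new_pop = []
            for _ in range(pop_size):
                a, b = random.sample(pop, 2)               # tournament selection
                parent1 = max(a, b, key=fitness)
                a, b = random.sample(pop, 2)
                parent2 = max(a, b, key=fitness)
                cut = random.randrange(1, len(INVEST))     # one-point crossover
                child = parent1[:cut] + parent2[cut:]
                child = [1 - g if random.random() < p_mut else g for g in child]
                new_pop.append(child)
            pop = new_pop
        return max(pop, key=fitness)

    best = genetic_portfolio()
    print(best, round(fitness(best), 1))
    ```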

  19. ECONOMIC INTELLIGENCE - THEORETICAL AND PRACTICAL ASPECTS

    Directory of Open Access Journals (Sweden)

    VIRGIL - ION POPOVICI

    2014-12-01

    Full Text Available Economic Intelligence (EI) may be a solution in knowledge management, as it involves the collection, evaluation, processing, analysis and dissemination of economic data within organizations. The ultimate goal of EI is to develop and improve methods for identifying relevant information sources, analysing and processing the information collected, and giving the user everything necessary for decision-making. The scope of Economic Intelligence focuses on information available outside the organization, covering wide areas from technology to market or legal issues. EI is closely related to other approaches to information management, such as knowledge management and business intelligence, excelling in the use of software tools.

  20. Multi-area economic dispatch with tie-line constraints employing ...

    African Journals Online (AJOL)

    user

    The economic dispatch problem is frequently solved without considering ... programming algorithm was proposed for the MAED solution with tie-line constraints ..... are the difference between two randomly chosen parameter vectors, a concept.

  1. A Hybrid Genetic-Algorithm Space-Mapping Tool for the Optimization of Antennas

    DEFF Research Database (Denmark)

    Pantoja, Mario Fernández; Meincke, Peter; Bretones, Amelia Rubio

    2007-01-01

    A hybrid global-local optimization technique for the design of antennas is presented. It consists of the sequential application of a genetic algorithm (GA) that employs coarse models in the simulations and a space mapping (SM) stage that refines the solution found in the previous stage.

  2. Cost-effective analysis of different algorithms for the diagnosis of hepatitis C virus infection

    Directory of Open Access Journals (Sweden)

    A.M.E.C. Barreto

    2008-02-01

    Full Text Available We compared the cost-benefit of two algorithms, recently proposed by the Centers for Disease Control and Prevention, USA, with the conventional one, the most appropriate for the diagnosis of hepatitis C virus (HCV) infection in the Brazilian population. Serum samples were obtained from 517 ELISA-positive or -inconclusive blood donors who had returned to Fundação Pró-Sangue/Hemocentro de São Paulo to confirm previous results. Algorithm A was based on the signal-to-cut-off (s/co) ratio of ELISA anti-HCV samples, using an s/co ratio predictive of ≥95% concordance with immunoblot (IB) positivity. For algorithm B, reflex nucleic acid amplification testing by PCR was required for ELISA-positive or -inconclusive samples and IB for PCR-negative samples. For algorithm C, all positive or inconclusive ELISA samples were submitted to IB. We observed a similar rate of positive results with the three algorithms: 287, 287, and 285 for A, B, and C, respectively, and 283 were concordant with one another. Indeterminate results from algorithms A and C were elucidated by PCR (expanded algorithm), which detected two more positive samples. The estimated cost of algorithms A and B was US$21,299.39 and US$32,397.40, respectively, which were 43.5 and 14.0% more economical than C (US$37,673.79). The cost can vary according to the technique used. We conclude that both algorithms A and B are suitable for diagnosing HCV infection in the Brazilian population. Furthermore, algorithm A is the more practical and economical one since it requires supplemental tests for only 54% of the samples. Algorithm B provides early information about the presence of viremia.
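
    A back-of-the-envelope sketch of how the three testing strategies above translate into total costs. The structure of the branching follows the abstract (A: high s/co ELISA accepted directly, remainder to IB; B: reflex PCR with IB only for PCR-negatives; C: IB for all reactive samples), but the unit test costs and the PCR-negative fraction are hypothetical placeholders, not the study's figures; only the 46%/54% s/co split loosely echoes the abstract.

    ```python
    # Hypothetical unit costs in US$ per test (placeholders, not from the study).
    COST = {"ELISA": 5.0, "IB": 50.0, "PCR": 30.0}

    def algorithm_A(n_samples, frac_high_sco=0.46):
        """High s/co ELISA accepted as positive; remaining samples go to IB."""
        n_ib = n_samples * (1.0 - frac_high_sco)
        return n_samples * COST["ELISA"] + n_ib * COST["IB"]

    def algorithm_B(n_samples, frac_pcr_negative=0.45):
        """Reflex PCR for all ELISA-reactive samples; IB only for PCR negatives."""
        n_ib = n_samples * frac_pcr_negative
        return n_samples * (COST["ELISA"] + COST["PCR"]) + n_ib * COST["IB"]

    def algorithm_C(n_samples):
        """Conventional strategy: IB for every ELISA-reactive sample."""
        return n_samples * (COST["ELISA"] + COST["IB"])

    for name, total in [("A", algorithm_A(517)), ("B", algorithm_B(517)),
                        ("C", algorithm_C(517))]:
        print(f"Algorithm {name}: US$ {total:,.2f}")
    ```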

  3. Efficient Use of Behavioral Tools to Reduce Electricity Demand of Domestic Consumers

    Directory of Open Access Journals (Sweden)

    Elbaz Shimon

    2016-12-01

    Full Text Available Purpose: The present study investigated the main literature on methods and policies for reducing the electricity demand of domestic consumers, in order to identify the place of behavioral tools. Methodology: We used secondary sources, performing a literature review together with analysis and synthesis. Findings: Policy makers prefer to use tools offered by neoclassical economics, such as various forms of taxation, fines and financial incentives, in order to make domestic electricity consumers save electricity, on the assumption that consumers will make rational decisions while maximizing their personal benefit. However, studies conducted in recent years in the field of behavioral economics, which are based on the assumption that consumers’ decisions are not rational and are affected by cognitive biases, showed that behavioral tools, such as detailed online information (feedback), social comparison information, information on varying rates (dynamic pricing) and general information (advertising campaigns), are no less appropriate than the ones neoclassical economics offers, mainly because electricity is an invisible product and consumers are unable to assess it by normal cognitive measures. Using an interdisciplinary combination of behavioral tools drawn from a wide variety of academic fields, it is possible to achieve efficient results in reducing electricity demand. Implications: Although neoclassical economics remains the fundamental theory used by policymakers, it is recommended to consider behavioral economics as a complementary approach and to include behavioral tools in the policymakers’ toolbox, especially when those tools do not require a significant financial investment, thus efficiently maximizing the reduction of electricity demand among domestic consumers.

  4. Restoration and economics: A union waiting to happen?

    Science.gov (United States)

    Alicia S.T. Robbins; Jean M. Daniels

    2012-01-01

    In this article, our objective is to introduce economics as a tool for the planning, prioritization, and evaluation of restoration projects. Studies that develop economic estimates of public values for ecological restoration employ methods that may be unfamiliar to practitioners. We hope to address this knowledge gap by describing economic concepts in the context of...

  5. A cross-species alignment tool (CAT)

    DEFF Research Database (Denmark)

    Li, Heng; Guan, Liang; Liu, Tao

    2007-01-01

    BACKGROUND: The two main sorts of automatic gene annotation frameworks are ab initio and alignment-based, the latter splitting into two sub-groups. The first group is used for intra-species alignments, among which are successful ones with high specificity and speed. The other group contains more sensitive methods which are usually applied in aligning inter-species sequences. RESULTS: Here we present a new algorithm called CAT (for Cross-species Alignment Tool). It is designed to align mRNA sequences to mammalian-sized genomes. CAT is implemented using C scripts and is freely available on the web at http://xat.sourceforge.net/. CONCLUSIONS: Examined from different angles, CAT outperforms other extant alignment tools. Tested against all available mouse-human and zebrafish-human orthologs, we demonstrate that CAT combines the specificity and speed of the best intra-species algorithms, like BLAT.

  6. Economic vulnerability of timber resources to forest fires.

    Science.gov (United States)

    y Silva, Francisco Rodríguez; Molina, Juan Ramón; González-Cabán, Armando; Machuca, Miguel Ángel Herrera

    2012-06-15

    The temporal-spatial planning of activities for a territorial fire management program requires knowing the value of forest ecosystems. In this paper we extend the economic valuation principle to the concept of economic vulnerability and apply it, presenting a methodology for the economic valuation of forest production ecosystems. Forest vulnerability is analyzed from criteria intrinsically associated with the forest characterization and with the potential behavior of surface fires. Integrating a mapping process of fire potential with analytical valuation algorithms facilitates the implementation of fire prevention planning. The availability of cartography of the economic vulnerability of forest ecosystems is fundamental for budget optimization and for supporting the decision-making process. Published by Elsevier Ltd.

  7. Economic justification of robotic systems using graphical simulation as a tool

    International Nuclear Information System (INIS)

    Bennett, P.C.

    1995-01-01

    This paper outlines the simulation and analysis approach taken to address radiation dose reduction using robotic automation from the operational and economic standpoints for the DOE Civilian Radioactive Waste Management system and for the transuranic waste loading facilities within the DOE complex. Simulations of the robotic operations using validated software are described. These simulations provide throughput, capital and operating costs for an economic benefit-cost analysis. Benefit-cost analysis results are also presented.

  8. Energy Storage Economics

    Energy Technology Data Exchange (ETDEWEB)

    Elgqvist, Emma M [National Renewable Energy Laboratory (NREL), Golden, CO (United States)

    2017-09-07

    This presentation provides an overview on energy storage economics including recent market trends, battery terminology and concepts, value streams, challenges, and an example of how photovoltaics and storage can be used to lower demand charges. It also provides an overview of the REopt Lite web tool inputs and outputs.

  9. 3D TRUMP - A GBI launch window tool

    Science.gov (United States)

    Karels, Steven N.; Hancock, John; Matchett, Gary

    3D TRUMP is a novel GPS and communications-link software analysis tool developed for the SDIO's Ground-Based Interceptor (GBI) program. 3D TRUMP uses a computationally efficient analysis method that provides key GPS-based performance measures for an entire GBI mission's reentry vehicle and interceptor trajectories. Algorithms and sample outputs are presented.

  10. Optimization of Straight Cylindrical Turning Using Artificial Bee Colony (ABC) Algorithm

    Science.gov (United States)

    Prasanth, Rajanampalli Seshasai Srinivasa; Hans Raj, Kandikonda

    2017-04-01

    Artificial bee colony (ABC) algorithm, that mimics the intelligent foraging behavior of honey bees, is increasingly gaining acceptance in the field of process optimization, as it is capable of handling nonlinearity, complexity and uncertainty. Straight cylindrical turning is a complex and nonlinear machining process which involves the selection of appropriate cutting parameters that affect the quality of the workpiece. This paper presents the estimation of optimal cutting parameters of the straight cylindrical turning process using the ABC algorithm. The ABC algorithm is first tested on four benchmark problems of numerical optimization and its performance is compared with genetic algorithm (GA) and ant colony optimization (ACO) algorithm. Results indicate that the rate of convergence of the ABC algorithm is better than GA and ACO. Then, the ABC algorithm is used to predict optimal cutting parameters such as cutting speed, feed rate, depth of cut and tool nose radius to achieve good surface finish. Results indicate that the ABC algorithm estimated a comparable surface finish when compared with real coded genetic algorithm and differential evolution algorithm.
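
    A compact sketch of the core ABC loop (employed, onlooker and scout bee phases) on a generic continuous objective, here the sphere benchmark used in the paper's first step. The surface-roughness model of the turning process itself is not reproduced, and all parameter values are illustrative.

    ```python
    import random

    def abc_minimize(f, bounds, n_sources=15, limit=30, iters=300):
        """Bare-bones artificial bee colony for minimizing a non-negative f."""
        dim = len(bounds)
        rand = lambda: [random.uniform(lo, hi) for lo, hi in bounds]
        sources = [rand() for _ in range(n_sources)]
        trials = [0] * n_sources

        def neighbour(i):
            j = random.randrange(dim)
            k = random.choice([s for s in range(n_sources) if s != i])
            cand = sources[i][:]
            cand[j] += random.uniform(-1, 1) * (sources[i][j] - sources[k][j])
            cand[j] = min(max(cand[j], bounds[j][0]), bounds[j][1])
            return cand

        def try_improve(i):
            cand = neighbour(i)
            if f(cand) < f(sources[i]):             # greedy selection
                sources[i], trials[i] = cand, 0
            else:
                trials[i] += 1

        for _ in range(iters):
            for i in range(n_sources):              # employed bee phase
                try_improve(i)
            fits = [1.0 / (1.0 + f(s)) for s in sources]
            total = sum(fits)
            for _ in range(n_sources):              # onlooker bee phase
                r, acc, i = random.uniform(0, total), 0.0, 0
                for i, w in enumerate(fits):
                    acc += w
                    if acc >= r:
                        break
                try_improve(i)
            for i in range(n_sources):              # scout bee phase
                if trials[i] > limit:
                    sources[i], trials[i] = rand(), 0
        return min(sources, key=f)

    # Benchmark check on the sphere function, as done in the paper's first step.
    best = abc_minimize(lambda x: sum(v * v for v in x), [(-5.0, 5.0)] * 4)
    print([round(v, 3) for v in best])
    ```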

  11. Complex-based OCT angiography algorithm recovers microvascular information better than amplitude- or phase-based algorithms in phase-stable systems.

    Science.gov (United States)

    Xu, Jingjiang; Song, Shaozhen; Li, Yuandong; Wang, Ruikang K

    2017-12-19

    Optical coherence tomography angiography (OCTA) is increasingly becoming a popular inspection tool for biomedical imaging applications. By exploring the amplitude, phase and complex information available in OCT signals, numerous algorithms have been proposed that contrast functional vessel networks within microcirculatory tissue beds. However, it is not clear which algorithm delivers optimal imaging performance. Here, we investigate systematically how amplitude and phase information have an impact on the OCTA imaging performance, to establish the relationship of amplitude and phase stability with OCT signal-to-noise ratio (SNR), time interval and particle dynamics. With either repeated A-scan or repeated B-scan imaging protocols, the amplitude noise increases with the increase of OCT SNR; however, the phase noise does the opposite, i.e. it increases with the decrease of OCT SNR. Coupled with experimental measurements, we utilize a simple Monte Carlo (MC) model to simulate the performance of amplitude-, phase- and complex-based algorithms for OCTA imaging, the results of which suggest that complex-based algorithms deliver the best performance when the phase noise is sufficiently low. Experimentally, the complex-based algorithm delivers better performance than either the amplitude- or phase-based algorithms for both the repeated A-scan and the B-scan imaging protocols, which agrees well with the conclusion drawn from the MC simulations.
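
    The sketch below computes generic per-pixel angiography contrasts from two repeated complex-valued B-scans, using common textbook formulations to illustrate the amplitude-, phase- and complex-based families compared in the paper; these are not the authors' exact algorithms, and the synthetic data are purely illustrative.

    ```python
    import numpy as np

    def octa_contrasts(scan1, scan2):
        """Per-pixel OCTA contrasts from two repeated complex-valued B-scans."""
        a1, a2 = np.abs(scan1), np.abs(scan2)
        # Amplitude-based: normalized amplitude decorrelation.
        amplitude = 1.0 - 2.0 * a1 * a2 / (a1**2 + a2**2 + 1e-12)
        # Phase-based: magnitude of the inter-scan phase difference.
        phase = np.abs(np.angle(scan1 * np.conj(scan2)))
        # Complex-based: magnitude of the complex signal difference.
        complex_diff = np.abs(scan1 - scan2)
        return amplitude, phase, complex_diff

    # Toy example: static background with a small "flow" region that decorrelates.
    rng = np.random.default_rng(0)
    base = rng.normal(size=(64, 64)) + 1j * rng.normal(size=(64, 64))
    scan1, scan2 = base.copy(), base.copy()
    scan2[20:30, 20:30] = rng.normal(size=(10, 10)) + 1j * rng.normal(size=(10, 10))
    amp, pha, cplx = octa_contrasts(scan1, scan2)
    print(amp[25, 25] > amp[5, 5], cplx[25, 25] > cplx[5, 5])
    ```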

  12. The design of the public transport lines with the use of the fast genetic algorithm

    Directory of Open Access Journals (Sweden)

    Aleksander Król

    2015-09-01

    Full Text Available Background: The growing role of public transport and the pressure of economic criteria require new optimization tools for the public transport planning process. These problems are computationally very complex, so it is preferable to use approximate methods that lead to a good solution within an acceptable time. Methods: One such method is the genetic algorithm, which mimics the processes of evolution and natural selection in nature. In this paper, different variants of the public transport line layout are subjected to artificial selection. The essence of the proposed approach is a simplified method of calculating the value of the fitness function for a single individual, which brings relatively short computation times even for large jobs. Results: It was shown that despite the introduced simplifications the quality of the results is not worsened. Using data obtained from KZK GOP (Communications Municipal Association of the Upper Silesian Industrial Region), the described algorithm was used to optimize the layout of the network of bus lines within the borders of Katowice. Conclusion: The proposed algorithm was applied to a real, very complex public transportation network, and the possibility of a significant improvement of its efficiency was indicated. The obtained results give hope that the presented model, after some improvements and further development, can find practical application.

  13. On distribution reduction and algorithm implementation in inconsistent ordered information systems.

    Science.gov (United States)

    Zhang, Yanqin

    2014-01-01

    As one part of our work on ordered information systems, distribution reduction is studied in inconsistent ordered information systems (OISs). Some important properties of distribution reduction are studied and discussed. The dominance matrix is restated for reduction acquisition in dominance-relation-based information systems. A matrix algorithm for distribution reduction acquisition is presented step by step, and a program implementing the algorithm is provided. The approach provides an effective tool for theoretical research on, and practical applications of, ordered information systems. For more detailed and valid illustration, cases are employed to explain and verify the algorithm and the program, which shows the effectiveness of the algorithm in complicated information systems.

  14. Real-time Distributed Economic Dispatch for Distributed Generation Based on Multi-Agent System

    DEFF Research Database (Denmark)

    Luo, Kui; Wu, Qiuwei; Nielsen, Arne Hejde

    2015-01-01

    The distributed economic dispatch for distributed generation is formulated as an optimization problem with equality and inequality constraints. An effective distributed approach based on a multi-agent system is proposed in this paper for solving the economic dispatch problem. The proposed approach consists of two stages. In the first stage, an adjacency average allocation algorithm is proposed to ensure generation-demand equality. In the second stage, a local replicator dynamics algorithm is applied to achieve a Nash equilibrium for the power dispatch game. The approach is implemented in a fully distributed manner.
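
    A minimal illustration of the same underlying idea, not the authors' adjacency-average/replicator-dynamics scheme: demand is first split equally among agents so generation matches demand, and neighbouring agents then repeatedly trade small amounts of power from the unit with the higher marginal cost to the cheaper one, driving the marginal costs toward equality. All cost coefficients, limits and the communication graph are invented for illustration.

    ```python
    import random

    # Quadratic generator costs C_i(p) = a + b*p + c*p^2 and limits (illustrative).
    AGENTS = [(0.0, 20.0, 0.050, 10.0, 120.0),   # (a, b, c, Pmin, Pmax)
              (0.0, 18.0, 0.080, 10.0, 100.0),
              (0.0, 22.0, 0.060, 10.0, 150.0),
              (0.0, 19.0, 0.070, 10.0, 110.0)]
    EDGES = [(0, 1), (1, 2), (2, 3), (3, 0)]     # communication graph (a ring)
    DEMAND = 300.0

    marginal = lambda i, power: AGENTS[i][1] + 2.0 * AGENTS[i][2] * power

    # Stage 1 stand-in: split the demand equally so generation matches demand.
    p = [DEMAND / len(AGENTS)] * len(AGENTS)

    # Stage 2 stand-in: neighbours trade power from high to low marginal cost.
    for _ in range(20000):
        i, j = random.choice(EDGES)
        if marginal(i, p[i]) < marginal(j, p[j]):
            i, j = j, i                             # i is the more expensive unit
        lo_i, hi_j = AGENTS[i][3], AGENTS[j][4]
        move = min(0.01, p[i] - lo_i, hi_j - p[j])  # respect generation limits
        p[i] -= move
        p[j] += move

    print([round(x, 1) for x in p], "marginal costs:",
          [round(marginal(i, x), 2) for i, x in enumerate(p)])
    ```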

  15. A Constrained Algorithm Based NMFα for Image Representation

    Directory of Open Access Journals (Sweden)

    Chenxue Yang

    2014-01-01

    Full Text Available Nonnegative matrix factorization (NMF) is a useful tool for learning a basic representation of image data. However, its performance and applicability in real scenarios are limited because of the lack of image information. In this paper, we propose a constrained matrix decomposition algorithm for image representation which contains parameters associated with the characteristics of image data sets. In particular, we impose label information as additional hard constraints on the α-divergence-NMF unsupervised learning algorithm. The resulting algorithm is derived using the Karush-Kuhn-Tucker (KKT) conditions as well as the projected gradient, and its monotonic local convergence is proved using auxiliary functions. In addition, we provide a method to select the parameters of our semisupervised matrix decomposition algorithm in the experiments. Compared with state-of-the-art approaches, our method achieves the best classification accuracy on three image data sets.
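
    For background, a sketch of standard multiplicative-update NMF for the Frobenius objective, the unconstrained baseline that label-constrained variants such as the one above build on; the KKT-derived, α-divergence-based updates of the paper are not reproduced, and the toy data are synthetic.

    ```python
    import numpy as np

    def nmf(V, rank, iters=200, eps=1e-9):
        """Basic multiplicative-update NMF minimizing ||V - W H||_F^2."""
        rng = np.random.default_rng(0)
        n, m = V.shape
        W = rng.random((n, rank))
        H = rng.random((rank, m))
        for _ in range(iters):
            H *= (W.T @ V) / (W.T @ W @ H + eps)    # update coefficients
            W *= (V @ H.T) / (W @ H @ H.T + eps)    # update basis vectors
        return W, H

    # Toy "image" matrix: columns are vectorized images built from 3 hidden parts.
    rng = np.random.default_rng(1)
    parts = rng.random((100, 3))
    weights = rng.random((3, 40))
    V = parts @ weights
    W, H = nmf(V, rank=3)
    print("reconstruction error:", round(float(np.linalg.norm(V - W @ H)), 4))
    ```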

  16. Prediction of insemination outcomes in Holstein dairy cattle using alternative machine learning algorithms.

    Science.gov (United States)

    Shahinfar, Saleh; Page, David; Guenther, Jerry; Cabrera, Victor; Fricke, Paul; Weigel, Kent

    2014-02-01

    When making the decision about whether or not to breed a given cow, knowledge about the expected outcome would have an economic impact on profitability of the breeding program and net income of the farm. The outcome of each breeding can be affected by many management and physiological features that vary between farms and interact with each other. Hence, the ability of machine learning algorithms to accommodate complex relationships in the data and missing values for explanatory variables makes these algorithms well suited for investigation of reproduction performance in dairy cattle. The objective of this study was to develop a user-friendly and intuitive on-farm tool to help farmers make reproduction management decisions. Several different machine learning algorithms were applied to predict the insemination outcomes of individual cows based on phenotypic and genotypic data. Data from 26 dairy farms in the Alta Genetics (Watertown, WI) Advantage Progeny Testing Program were used, representing a 10-yr period from 2000 to 2010. Health, reproduction, and production data were extracted from on-farm dairy management software, and estimated breeding values were downloaded from the US Department of Agriculture Agricultural Research Service Animal Improvement Programs Laboratory (Beltsville, MD) database. The edited data set consisted of 129,245 breeding records from primiparous Holstein cows and 195,128 breeding records from multiparous Holstein cows. Each data point in the final data set included 23 and 25 explanatory variables, respectively, and 1 binary insemination outcome; the best-performing algorithm achieved classification performance of 0.756 ± 0.005 and 0.736 ± 0.005 for primiparous and multiparous cows, respectively. The naïve Bayes algorithm, Bayesian network, and decision tree algorithms showed somewhat poorer classification performance. An information-based variable selection procedure identified herd average conception rate, incidence of ketosis, number of previous (failed) inseminations, days in milk at breeding, and mastitis as the most influential explanatory variables.
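
    A minimal sketch of the prediction task described above, using scikit-learn on synthetic data standing in for the herd records; the feature meanings, effect sizes and data are fabricated for illustration and do not represent the study's data set or its exact models.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import roc_auc_score

    # Synthetic stand-in for breeding records: a few numeric management and
    # physiology features and a binary insemination outcome (1 = conception).
    rng = np.random.default_rng(42)
    n = 5000
    X = np.column_stack([
        rng.normal(35, 5, n),      # e.g. herd average conception rate (%)
        rng.integers(0, 4, n),     # e.g. number of previous failed inseminations
        rng.normal(120, 30, n),    # e.g. days in milk at breeding
        rng.integers(0, 2, n),     # e.g. ketosis recorded (0/1)
    ])
    logit = 0.04 * X[:, 0] - 0.5 * X[:, 1] - 0.01 * X[:, 2] - 0.6 * X[:, 3] - 0.5
    y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
    print("AUC:", round(roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]), 3))
    ```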

  17. The economics of climate change and the change of climate in economics

    International Nuclear Information System (INIS)

    Marechal, Kevin

    2007-01-01

    Economics is an unavoidable decision-making tool in the field of climate policy. At the same time, traditional economics is being challenged both empirically and theoretically by scholars in different fields. Its non-neutrality in dealing with climate-related issues, which is illustrated by the controversy over the 'no-regret potential', would thus call for an opening of economics to insights from other disciplines. Within that context, we show that an evolutionary-inspired line of thought coupled with a systemic and historical perspective of technological change provides a very insightful alternative to traditional economics. More particularly, it follows from that framework that the picture of the climate challenge ahead looks very different from what traditional economic analyses would suggest. For instance, the lock-in process makes it unlikely that traditional cost-efficient measures (such as carbon taxation or tradable emission rights) will be sufficient to bring about the required radical changes in the field of energy, as they fail to address structural barriers highlighted in our approach.

  18. An efficient technique to solve combined economic and emission ...

    Indian Academy of Sciences (India)

    Combined economic emission dispatch (CEED); optimization algorithms; power demand; Ant ... and at the same time the necessary equality and inequality constraints should also be fulfilled.

  19. Application of cultural algorithm to generation scheduling of hydrothermal systems

    International Nuclear Information System (INIS)

    Yuan Xiaohui; Yuan Yanbin

    2006-01-01

    The daily generation scheduling of hydrothermal power systems plays an important role in the operation of electric power systems for economics and security; it is a large-scale, dynamic, non-linear constrained optimization problem that is difficult to solve using traditional optimization methods. This paper proposes a new cultural algorithm to solve the optimal daily generation scheduling of hydrothermal power systems. The approach takes the water transport delay time between connected reservoirs into consideration and can conveniently deal with the complicated hydraulic coupling simultaneously. An example is used to verify the correctness and effectiveness of the proposed cultural algorithm, compared with both the Lagrange method and the genetic algorithm. The simulation results demonstrate that the proposed algorithm has rapid convergence speed and higher solution precision. Thus, an effective method is provided to solve the optimal daily generation scheduling of hydrothermal systems
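
    A bare-bones cultural algorithm sketch on a generic continuous test function: a population evolves while a belief space stores normative knowledge (per-dimension intervals spanned by good solutions) that biases new candidates. The hydrothermal scheduling constraints and water-delay coupling of the paper are not modelled, and all parameter values are illustrative.

    ```python
    import random

    def cultural_algorithm(f, bounds, pop_size=30, gens=150, accepted=5):
        """Minimize f over box bounds with a population plus a normative belief space."""
        dim = len(bounds)
        rand_ind = lambda: [random.uniform(lo, hi) for lo, hi in bounds]
        pop = [rand_ind() for _ in range(pop_size)]
        belief = [list(b) for b in bounds]            # normative knowledge: intervals

        for _ in range(gens):
            pop.sort(key=f)
            elite = pop[:accepted]
            for d in range(dim):                      # update belief space from elite
                vals = [ind[d] for ind in elite]
                belief[d] = [min(vals), max(vals)]
            new_pop = pop[:accepted]                  # keep the accepted individuals
            while len(new_pop) < pop_size:
                child = []
                for d, (lo, hi) in enumerate(bounds):
                    span = max(belief[d][1] - belief[d][0], 1e-6)
                    x = random.uniform(belief[d][0], belief[d][1])   # influence step
                    x += random.gauss(0.0, 0.1 * span)               # small variation
                    child.append(min(max(x, lo), hi))
                new_pop.append(child)
            pop = new_pop
        return min(pop, key=f)

    best = cultural_algorithm(lambda x: sum(v * v for v in x), [(-10.0, 10.0)] * 5)
    print([round(v, 3) for v in best])
    ```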

  20. Simple educational tool for digital speckle shearography

    International Nuclear Information System (INIS)

    Schirripa Spagnolo, Giuseppe; Martocchia, Andrea; Papalillo, Donato; Cozzella, Lorenzo

    2012-01-01

    In this study, an educational tool has been prepared for providing short-term and more economical training on digital speckle shearography (DSS). Shearography non-destructive testing (NDT) has gained wide acceptance over the last decade, providing a number of important and exciting inspection solutions in aerospace, electronics and medical device manufacturing. For these reasons, it is important to develop didactic tools for understanding the potential of digital shearography through training and didactic courses in the field of NDT. In this paper we describe a simple tool for familiarizing students with the potential of DSS in the area of education and training. The system is realized with a simple and economical optical setup and a virtual instrument based on LabVIEW™ and a DAQ device. (paper)

  1. Life as Thermodynamic Evidence of Algorithmic Structure in Natural Environments

    Directory of Open Access Journals (Sweden)

    David A. Rosenblueth

    2012-11-01

    Full Text Available In evolutionary biology, attention to the relationship between stochastic organisms and their stochastic environments has leaned towards the adaptability and learning capabilities of the organisms rather than toward the properties of the environment. This article is devoted to the algorithmic aspects of the environment and its interaction with living organisms. We ask whether one may use the fact of the existence of life to establish how far nature is removed from algorithmic randomness. The paper uses a novel approach to behavioral evolutionary questions, using tools drawn from information theory, algorithmic complexity and the thermodynamics of computation to support an intuitive assumption about the near optimal structure of a physical environment that would prove conducive to the evolution and survival of organisms, and sketches the potential of these tools, at present alien to biology, that could be used in the future to address different and deeper questions. We contribute to the discussion of the algorithmic structure of natural environments and provide statistical and computational arguments for the intuitive claim that living systems would not be able to survive in completely unpredictable environments, even if adaptable and equipped with storage and learning capabilities by natural selection (brain memory or DNA.

  2. Caesy: A software tool for computer-aided engineering

    Science.gov (United States)

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.

  3. Life cycle and sustainability of abrasive tools

    CERN Document Server

    Linke, Barbara

    2016-01-01

    This monograph focuses on abrasive tools for grinding, polishing, honing, and lapping operations. The book describes the life cycle of abrasive tools from raw material processing of abrasive grits and bonding, manufacturing of monolithic or multi-layered tools, tool use to tool end-of-life. Moreover, this work highlights sustainability challenges including economic, environmental, social and technological aspects. The target audience primarily comprises research and industry experts in the field of manufacturing, but the book may also be beneficial for graduate students.

  4. Proportionate Minimum Error Entropy Algorithm for Sparse System Identification

    Directory of Open Access Journals (Sweden)

    Zongze Wu

    2015-08-01

    Full Text Available Sparse system identification has received a great deal of attention due to its broad applicability. The proportionate normalized least mean square (PNLMS algorithm, as a popular tool, achieves excellent performance for sparse system identification. In previous studies, most of the cost functions used in proportionate-type sparse adaptive algorithms are based on the mean square error (MSE criterion, which is optimal only when the measurement noise is Gaussian. However, this condition does not hold in most real-world environments. In this work, we use the minimum error entropy (MEE criterion, an alternative to the conventional MSE criterion, to develop the proportionate minimum error entropy (PMEE algorithm for sparse system identification, which may achieve much better performance than the MSE based methods especially in heavy-tailed non-Gaussian situations. Moreover, we analyze the convergence of the proposed algorithm and derive a sufficient condition that ensures the mean square convergence. Simulation results confirm the excellent performance of the new algorithm.
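
    The paper replaces the MSE cost used in proportionate adaptive filters with an error-entropy cost. The sketch below shows only the classical PNLMS update that serves as the starting point, applied to identifying a synthetic sparse FIR system; the MEE-based update of the proposed PMEE algorithm is not reproduced, and all step-size and system parameters are illustrative.

    ```python
    import numpy as np

    def pnlms_identify(x, d, L=32, mu=0.5, rho=0.01, delta_p=0.01, eps=1e-6):
        """Classical PNLMS adaptive filter identifying a length-L FIR system."""
        w = np.zeros(L)
        for n in range(L - 1, len(x)):
            u = x[n - L + 1:n + 1][::-1]            # current input regressor
            e = d[n] - w @ u                        # a priori error
            gamma = np.maximum(rho * max(delta_p, np.max(np.abs(w))), np.abs(w))
            g = gamma / (np.mean(gamma) + eps)      # proportionate step-size gains
            w += mu * e * g * u / (u @ (g * u) + eps)
        return w

    # Sparse unknown system: only a few active taps, as in the sparse-ID setting.
    rng = np.random.default_rng(0)
    h = np.zeros(32)
    h[3], h[10], h[25] = 0.8, -0.5, 0.3
    x = rng.normal(size=4000)
    d = np.convolve(x, h)[:len(x)] + 0.01 * rng.normal(size=len(x))
    w = pnlms_identify(x, d)
    print("tap error:", round(float(np.linalg.norm(w - h)), 4))
    ```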

  5. Impact of population and economic growth on carbon emissions in Taiwan using an analytic tool STIRPAT

    Directory of Open Access Journals (Sweden)

    Jong-Chao Yeh

    2017-01-01

    Full Text Available Carbon emission has increasingly become an issue of global concern because of climate change. Unfortunately, Taiwan was listed among the top 20 countries for carbon emissions in 2014. In order to provide appropriate measures to control carbon emissions, there is an urgent need to address how factors such as population and economic growth impact the emission of carbon dioxide in developing countries. In addition to total population, the percentage of the population living in urban areas (i.e., the urbanization percentage) and the non-dependent population may also serve as limiting factors. On the other hand, the total energy-driven gross domestic product (GDP) and the percentage of GDP generated by the manufacturing industries are assessed to see their respective degrees of impact on carbon emissions. Therefore, based on national data for the period 1990–2014 in Taiwan, the analytic tool Stochastic Impacts by Regression on Population, Affluence and Technology (STIRPAT) was employed to see how well the aforementioned factors describe their individual potential impact on global warming, measured by the total amount of carbon emitted into the atmosphere. Seven scenarios of the STIRPAT model were proposed and tested statistically for significance. As a result, two models were suggested to predict the impact on carbon emissions of population and economic growth by the year 2025 in Taiwan.
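
    The STIRPAT model regresses the logarithm of environmental impact on logarithms of population, affluence and technology terms, ln I = a + b ln P + c ln A + d ln T + e. A minimal sketch of fitting such a model with ordinary least squares on fabricated series is shown below; a real analysis would use the 1990-2014 national data mentioned above.

    ```python
    import numpy as np

    # STIRPAT: ln I = a + b*ln P + c*ln A + d*ln T + e, fitted by least squares.
    rng = np.random.default_rng(0)
    years = 25
    P = 21.0e6 * np.cumprod(1.0 + rng.normal(0.005, 0.002, years))   # population
    A = 12000.0 * np.cumprod(1.0 + rng.normal(0.03, 0.01, years))    # GDP per capita
    T = 0.50 + np.cumsum(rng.normal(0.002, 0.002, years))            # industry share
    I = np.exp(1.0 + 0.9 * np.log(P) + 0.6 * np.log(A)               # synthetic emissions
               + 0.4 * np.log(T) + rng.normal(0, 0.01, years))

    X = np.column_stack([np.ones(years), np.log(P), np.log(A), np.log(T)])
    coef, *_ = np.linalg.lstsq(X, np.log(I), rcond=None)
    a, b, c, d = coef
    print(f"elasticities: population {b:.2f}, affluence {c:.2f}, technology {d:.2f}")
    ```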

  6. Decision support tool for used oil regeneration technologies assessment and selection.

    Science.gov (United States)

    Khelifi, Olfa; Dalla Giovanna, Fabio; Vranes, Sanja; Lodolo, Andrea; Miertus, Stanislav

    2006-09-01

    Regeneration is the most efficient way of managing used oil. It saves money by preventing costly cleanups and liabilities that are associated with mismanagement of used oil, it helps to protect the environment and it produces a technically renewable resource by enabling an indefinite recycling potential. There are a variety of processes and licensors currently offering ways to deal with used oils. Selecting a regeneration technology for used oil involves "cross-matching" key criteria. Therefore, the first prototype of spent oil regeneration (SPORE), a decision support tool, has been developed to help decision-makers to assess the available technologies and select the preferred used oil regeneration options. The analysis is based on technical, economical and environmental criteria. These criteria are ranked to determine their relative importance for a particular used oil regeneration project. The multi-criteria decision analysis (MCDA) is the core of the SPORE using the PROMETHEE II algorithm.
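
    A compact PROMETHEE II sketch ranking a few hypothetical regeneration technologies against technical, economic and environmental criteria with a simple linear preference function; the alternative names, scores, weights and thresholds are invented for illustration and are not SPORE's actual data or criteria set.

    ```python
    import numpy as np

    # Hypothetical alternatives (rows) scored on criteria (columns), all "more is
    # better" after sign adjustment: [technical performance, -cost, environmental].
    ALTS = ["acid/clay", "vacuum distillation", "hydrotreating", "solvent extraction"]
    scores = np.array([[0.55, -320.0, 0.40],
                       [0.70, -450.0, 0.65],
                       [0.85, -600.0, 0.80],
                       [0.75, -500.0, 0.70]])
    weights = np.array([0.4, 0.3, 0.3])
    pref_threshold = np.array([0.2, 150.0, 0.25])    # linear preference thresholds

    n = len(ALTS)
    pi = np.zeros((n, n))                            # aggregated preference indices
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            d = scores[a] - scores[b]
            p = np.clip(d / pref_threshold, 0.0, 1.0)   # linear preference function
            pi[a, b] = float(weights @ p)

    phi_plus = pi.sum(axis=1) / (n - 1)              # positive outranking flow
    phi_minus = pi.sum(axis=0) / (n - 1)             # negative outranking flow
    net = phi_plus - phi_minus                       # PROMETHEE II net flow
    for name, flow in sorted(zip(ALTS, net), key=lambda t: -t[1]):
        print(f"{name:20s} net flow = {flow:+.3f}")
    ```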

  7. Macro-micro interlocked simulation algorithm: an exemplification for aurora arc evolution

    Energy Technology Data Exchange (ETDEWEB)

    Sato, Tetsuya [University of Hyogo, Kobe 650-0044 (Japan); Hasegawa, Hiroki; Ohno, Nobuaki [Japan Agency for Marine-Earth Science and Technology, Yokohama 236-0001 (Japan)], E-mail: sato@hq.u-hyogo.ac.jp

    2009-01-01

    Using an innovative holistic simulation algorithm that can self-consistently treat a system that evolves as cooperation between macroscopic and microscopic processes, the evolution of a colorful aurora arc is beautifully reproduced as the result of cooperation between the global field-aligned feedback instability of the coupled magnetosphere-ionosphere system and the ensuing microscopic ion-acoustic instability that generates electric double layers and accelerates aurora electrons. These results are in agreement with rocket and satellite observations. This shows that the proposed holistic algorithm could be a reliable tool to reveal complex real dramatic events and become, in the near future, a viable scientifically secure prediction tool for natural disasters such as earthquakes, landslides and floods caused by typhoons.

  8. Algorithm Building and Learning Programming Languages Using a New Educational Paradigm

    Science.gov (United States)

    Jain, Anshul K.; Singhal, Manik; Gupta, Manu Sheel

    2011-08-01

    This research paper presents a new concept of using a single tool to associate syntax of various programming languages, algorithms and basic coding techniques. A simple framework has been programmed in Python that helps students learn skills to develop algorithms, and implement them in various programming languages. The tool provides an innovative and a unified graphical user interface for development of multimedia objects, educational games and applications. It also aids collaborative learning amongst students and teachers through an integrated mechanism based on Remote Procedure Calls. The paper also elucidates an innovative method for code generation to enable students to learn the basics of programming languages using drag-n-drop methods for image objects.

  9. Turkish University Students’ Perceptions of the World Wide Web as a Learning Tool: An Investigation Based on Gender, Socio-Economic Background, and Web Experience

    Directory of Open Access Journals (Sweden)

    Erkan Tekinarslan

    2009-04-01

    Full Text Available The main purpose of the study is to investigate Turkish undergraduate students’ perceptions of the Web as a learning tool and to analyze whether their perceptions differ significantly based on gender, socio-economic background, and Web experience. Data obtained from 722 undergraduate students (331 males and 391 females) were used in the analyses. The findings indicated significant differences based on gender, socio-economic background, and Web experience. The students from higher socio-economic backgrounds indicated significantly higher attitude scores on the self-efficacy subscale of the Web attitude scale. Similarly, the male students indicated significantly higher scores on the self-efficacy subscale than the females. Also, the students with higher Web experience in terms of usage frequency indicated higher scores on all subscales (i.e., self-efficacy, affective, usefulness, Web-based learning). Moreover, the two-way ANOVA results indicated that the students’ PC ownership has significant main effects on their Web attitudes and on the usefulness, self-efficacy, and affective subscales.

  10. State regulation as a tool for improving the economic security of the regions

    Directory of Open Access Journals (Sweden)

    Yu. M. Sokolinskaya

    2017-01-01

    Full Text Available Providing economic security for the development of regions, increasing their competitiveness, and ensuring risk-free and sustainable activity are the main tasks of the regional program of social and economic development, which occupies a special place in the system of instruments for public management of these processes. The program of social and economic development is a unique strategy of the region aimed at securing and optimizing the spatial structure and the relations between the center and the regions, in order to ensure economic security and growth by making the most effective use of existing internal and external factors. The institutional influence of the state on the economic security of regions and enterprises is exercised directly, through subsidies, and more often indirectly, through compliance with the laws and regulations of the Russian Federation and the region, on the principles of institutional and market synergy. Adaptation of the region's enterprises to the market is difficult when specific socio-organizational, economic, technical, technological, scientific and informational activities must function, in their interrelations, within the framework of Russian law. The search for ways to improve the economic security of the Russian Federation, its regions and enterprises takes place in the context of global integration, through the improvement of the mechanism of state regulation. An important task at the current stage is the construction of a system of institutional organization of economic security that can balance the levers of government with the opportunities of private enterprises and provide businesses with an adequate level of protection from terrorism, predation, financial and legal risks, competition, etc.

  11. Rapid mental computation system as a tool for algorithmic thinking of elementary school students development

    OpenAIRE

    Ziatdinov, Rushan; Musa, Sajid

    2013-01-01

    In this paper, we describe the possibilities of using a rapid mental computation system in elementary education. The system consists of a number of readily memorized operations that allow one to perform arithmetic computations very quickly. These operations are actually simple algorithms which can develop or improve the algorithmic thinking of pupils. Using a rapid mental computation system allows forming the basis for the study of computer science in secondary school.

  12. Jane: a new tool for the cophylogeny reconstruction problem.

    Science.gov (United States)

    Conow, Chris; Fielder, Daniel; Ovadia, Yaniv; Libeskind-Hadas, Ran

    2010-02-03

    This paper describes the theory and implementation of a new software tool, called Jane, for the study of historical associations. This problem arises in parasitology (associations of hosts and parasites), molecular systematics (associations of organisms and genes), and biogeography (associations of regions and organisms). The underlying problem is that of reconciling pairs of trees subject to biologically plausible events and costs associated with these events. Existing software tools for this problem have strengths and limitations, and the new Jane tool described here provides functionality that complements existing tools. The Jane software tool uses a polynomial time dynamic programming algorithm in conjunction with a genetic algorithm to find very good, and often optimal, solutions even for relatively large pairs of trees. The tool allows the user to provide rich timing information on both the host and parasite trees. In addition the user can limit host switch distance and specify multiple host switch costs by specifying regions in the host tree and costs for host switches between pairs of regions. Jane also provides a graphical user interface that allows the user to interactively experiment with modifications to the solutions found by the program. Jane is shown to be a useful tool for cophylogenetic reconstruction. Its functionality complements existing tools and it is therefore likely to be of use to researchers in the areas of parasitology, molecular systematics, and biogeography.

  13. Reactive power dispatch considering voltage stability with seeker optimization algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Chaohua; Chen, Weirong; Zhang, Xuexia [The School of Electrical Engineering, Southwest Jiaotong University, Chengdu 610031 (China); Zhu, Yunfang [Department of Computer and Communication Engineering, E' mei Campus, Southwest Jiaotong University, E' mei 614202 (China)

    2009-10-15

    Optimal reactive power dispatch (ORPD) has a growing impact on the secure and economical operation of power systems. This issue is well known as a non-linear, multi-modal and multi-objective optimization problem where global optimization techniques are required in order to avoid local minima. In recent decades, computational intelligence-based techniques such as genetic algorithms (GAs), differential evolution (DE) algorithms and particle swarm optimization (PSO) algorithms have often been used for this aim. In this work, a seeker optimization algorithm (SOA) based method is proposed for ORPD considering static voltage stability and voltage deviation. The SOA is based on the concept of simulating the act of human searching, where the search direction is based on an empirical gradient obtained by evaluating the response to position changes, and the step length is based on uncertainty reasoning using a simple fuzzy rule. The algorithm's performance is studied through comparisons with two versions of GAs, three versions of DE algorithms and four versions of PSO algorithms on the IEEE 57-bus and 118-bus power systems. The simulation results show that the proposed approach performed better than the other listed algorithms and can be efficiently used for the ORPD problem. (author)

  14. The Challenge of Promoting Algorithmic Thinking of Both Sciences- and Humanities-Oriented Learners

    Science.gov (United States)

    Katai, Z.

    2015-01-01

    The research results we present in this paper reveal that properly calibrated e-learning tools have the potential to effectively promote the algorithmic thinking of both science-oriented and humanities-oriented students. After students had watched an illustration (by a folk dance choreography) and an animation of the studied sorting algorithm (bubble sort)…

  15. LFQC: a lossless compression algorithm for FASTQ files

    Science.gov (United States)

    Nicolae, Marius; Pathak, Sudipta; Rajasekaran, Sanguthevar

    2015-01-01

    Motivation: Next Generation Sequencing (NGS) technologies have revolutionized genomic research by reducing the cost of whole genome sequencing. One of the biggest challenges posed by modern sequencing technology is economic storage of NGS data. Storing raw data is infeasible because of its enormous size and high redundancy. In this article, we address the problem of storage and transmission of large FASTQ files using innovative compression techniques. Results: We introduce a new lossless non-reference based FASTQ compression algorithm named Lossless FASTQ Compressor. We have compared our algorithm with other state of the art big data compression algorithms namely gzip, bzip2, fastqz (Bonfield and Mahoney, 2013), fqzcomp (Bonfield and Mahoney, 2013), Quip (Jones et al., 2012), DSRC2 (Roguski and Deorowicz, 2014). This comparison reveals that our algorithm achieves better compression ratios on LS454 and SOLiD datasets. Availability and implementation: The implementations are freely available for non-commercial purposes. They can be downloaded from http://engr.uconn.edu/rajasek/lfqc-v1.1.zip. Contact: rajasek@engr.uconn.edu PMID:26093148

  16. An algorithm for seismic analysis of low-rise structural walls

    International Nuclear Information System (INIS)

    Jost, S.D.; Mo, Y.L.

    1991-01-01

    Although structures with elastic response are fairly well understood, structures with inelastic response are more difficult to analyze. Furthermore, in studies of inelastic response, attention has generally been paid to the flexural response of reinforced concrete structures with relatively little attention being given to shear response. In this paper, an algorithm is described for computing the shear force-deflection relationship for orthogonally reinforced concrete low-rise structural walls. In this situation, the inelastic response relationship forms hysteresis loops which depend on the cracking shear force and direction of wall movement in addition to the loading history, so an algorithm which accounts for the continually varying stiffness and energy absorbing characteristics of such walls is needed. This algorithm is used together with the linear step-by-step method for numerically solving differential equations to analyze low rise structural walls during a seismic disturbance. This combination forms a useful tool for predicting the seismic response of low-rise structural walls. Using this tool, two examples are analyzed: a single shearwall in which cracking occurs and a shearwall which interacts seismically with a steel structure. (orig.)

  17. Economic impacts study

    Energy Technology Data Exchange (ETDEWEB)

    Brunsen, W.; Worley, W.; Frost, E.

    1988-09-30

    This is a progress report on the first phase of a project to measure the economic impacts of a rapidly changing U.S. target base. The purpose of the first phase is to designate and test the macroeconomic impact analysis model. Criteria were established for a decision-support model. Additional criteria were defined for an interactive macroeconomic impact analysis model. After a review of several models, the Economic Impact Forecast System model of the U.S. Army Construction Research Laboratory was selected as the appropriate input-output tool that can address local and regional economic analysis. The model was applied to five test cases to demonstrate its utility and define possible revisions to meet project criteria. A plan for EIFS access was defined at three levels. Objectives and tasks for scenario refinement are proposed.

  18. Survey of Non-Rigid Registration Tools in Medicine.

    Science.gov (United States)

    Keszei, András P; Berkels, Benjamin; Deserno, Thomas M

    2017-02-01

    We catalogue available software solutions for non-rigid image registration to support scientists in selecting suitable tools for specific medical registration purposes. Registration tools were identified using non-systematic search in PubMed, Web of Science, IEEE Xplore® Digital Library, Google Scholar, and through references in identified sources (n = 22). Exclusions are due to unavailability or inappropriateness. The remaining (n = 18) tools were classified by (i) access and technology, (ii) interfaces and application, (iii) living community, (iv) supported file formats, and (v) types of registration methodologies emphasizing the similarity measures implemented. Out of the 18 tools, (i) 12 are open source, 8 are released under a permissive free license, which imposes the least restrictions on the use and further development of the tool, 8 provide graphical processing unit (GPU) support; (ii) 7 are built on software platforms, 5 were developed for brain image registration; (iii) 6 are under active development but only 3 have had their last update in 2015 or 2016; (iv) 16 support the Analyze format, while 7 file formats can be read with only one of the tools; and (v) 6 provide multiple registration methods and 6 provide landmark-based registration methods. Based on open source, licensing, GPU support, active community, several file formats, algorithms, and similarity measures, the tools Elastix and Plastimatch are chosen for the platform ITK and without platform requirements, respectively. Researchers in medical image analysis already have a large choice of registration tools freely available. However, the most recently published algorithms may not be included in the tools, yet.

  19. Homo Oeconomicus and Behavioral Economics

    Directory of Open Access Journals (Sweden)

    Justyna Brzezicka

    2014-12-01

    Full Text Available Recent years have witnessed a growing interest in behavioral trends in both economic theory and practical applications. As a science with vast potential for explaining complex market behaviors, behavioral economics is drifting away from the classical model of homo oeconomicus deployed by mainstream economics. This paper discusses the significance and role of the homo oeconomicus model in light of behavioral economics. It analyzes the direction of changes affecting homo oeconomicus, examines the definition of anomalies within the context of behavioral economics and discusses the anomalous status of homo oeconomicus. The paper proposes a hypothesis that the attitude characterizing homo oeconomicus is unique and incidental. The presented interdisciplinary analysis relies on economics, behavioral economics, economic psychology, behavioral finance and the methodology of science to discuss the homo oeconomicus model. The paper reviews change trends in economics, which are largely propelled by advancements in behavioral economics. The key methodological tools deployed in this paper are theoretical analysis and a compilation of extensive research findings. The results were used to formulate new theories advocating the development of a modern approach to the homo oeconomicus model, recognizing its significance and the growing importance of behavioral economics.

  20. CATCHprofiles: Clustering and Alignment Tool for ChIP Profiles

    DEFF Research Database (Denmark)

    G. G. Nielsen, Fiona; Galschiøt Markus, Kasper; Møllegaard Friborg, Rune

    2012-01-01

    To compare ChIP-profiling data and detect potentially meaningful patterns, the areas of enrichment must be aligned and clustered, which is an algorithmically and computationally challenging task. We have developed CATCHprofiles, a novel tool for exhaustive pattern detection in ChIP profiling data. CATCHprofiles is built upon a computationally efficient implementation for the exhaustive alignment and hierarchical clustering of ChIP profiling data. The tool features a graphical interface for examination and browsing of the clustering results. CATCHprofiles requires no prior knowledge about functional sites and detects known binding patterns, which makes it an invaluable tool for explorative research based on ChIP profiling data. CATCHprofiles and the CATCH algorithm run on all platforms and are available for free through the CATCH website: http://catch.cmbi.ru.nl/. User support is available by subscribing to the mailing list catch-users@bioinformatics.org.

  1. Climate economics in progress 2011; Climate economics in progress 2011

    Energy Technology Data Exchange (ETDEWEB)

    De Perthuis, Christian [Paris-Dauphine University (France); Jouvet, Pierre-Andre [Paris-Ouest University (France); Trotignon, Raphael; Simonet, Gabriela; Boutueil, Virginie [Climate Economics Chair, Paris-Dauphine University (France)

    2011-10-01

    Climate Economics in Progress offers a global overview of the present status of action on climate change. Drawing on the most recent data, it analyzes the development of carbon markets in Europe and other parts of the world. It also examines the conditions for including major players such as China and new sectors such as agriculture, forestry and transport in the fight against global warming. The book is essential reading for anyone wishing to understand current advances in climate control, which could pave the way for a new form of economic growth. The book brings together a group of researchers whose goal is to make the link between academic research on the economics of climate change and the implementation of operational tools, thereby allowing the climate issue to be integrated into the functioning of the real economy.

  2. Projected 21st century coastal flooding in the Southern California Bight. Part 2: Tools for assessing climate change-driven coastal hazards and socio-economic impacts

    Science.gov (United States)

    Erikson, Li; Barnard, Patrick; O'Neill, Andrea; Wood, Nathan J.; Jones, Jeanne M.; Finzi Hart, Juliette; Vitousek, Sean; Limber, Patrick; Hayden, Maya; Fitzgibbon, Michael; Lovering, Jessica; Foxgrover, Amy C.

    2018-01-01

    This paper is the second of two that describes the Coastal Storm Modeling System (CoSMoS) approach for quantifying physical hazards and socio-economic hazard exposure in coastal zones affected by sea-level rise and changing coastal storms. The modelling approach, presented in Part 1, downscales atmospheric global-scale projections to local scale coastal flood impacts by deterministically computing the combined hazards of sea-level rise, waves, storm surges, astronomic tides, fluvial discharges, and changes in shoreline positions. The method is demonstrated through an application to Southern California, United States, where the shoreline is a mix of bluffs, beaches, highly managed coastal communities, and infrastructure of high economic value. Results show that inclusion of 100-year projected coastal storms will increase flooding by 9–350% (an additional average 53.0 ± 16.0 km2) in addition to a 25–500 cm sea-level rise. The greater flooding extents translate to a 55–110% increase in residential impact and a 40–90% increase in building replacement costs. To communicate hazards and ranges in socio-economic exposures to these hazards, a set of tools was collaboratively designed and tested with stakeholders and policy makers; these tools consist of two web-based mapping and analytic applications as well as virtual reality visualizations. To reach a larger audience and enhance usability of the data, outreach and engagement included workshop-style trainings for targeted end-users and innovative applications of the virtual reality visualizations.

  3. Evaluation of the performance of different firefly algorithms to the ...

    African Journals Online (AJOL)

    To solve the economic load dispatch problem, traditional and intelligent techniques were applied. Researchers have shown interest in utilizing metaheuristic methods to solve complex optimization problems in real life applications. In this paper, three alternatives of firefly algorithms are applied to solve the nonlinear ELD ...
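    For readers unfamiliar with the method, the sketch below shows the standard firefly attractiveness-and-move update applied to a toy quadratic-cost economic load dispatch problem. The cost coefficients, unit limits and algorithm parameters are illustrative assumptions, not data or settings from the paper.

```python
import numpy as np

# Toy quadratic-cost ELD: three generating units, an 800 MW demand, and a
# quadratic penalty enforcing the power balance. All numbers are illustrative.
rng = np.random.default_rng(0)
a = np.array([0.004, 0.006, 0.009])     # quadratic cost coefficients
b = np.array([5.3, 5.5, 5.8])           # linear cost coefficients
c = np.array([500.0, 400.0, 200.0])     # fixed costs
pmin = np.array([100.0, 50.0, 50.0])
pmax = np.array([450.0, 350.0, 225.0])
demand = 800.0

def cost(p):
    """Total fuel cost plus a penalty for violating the power balance."""
    return np.sum(a * p**2 + b * p + c) + 1e3 * (p.sum() - demand) ** 2

# Standard firefly parameters (illustrative): attractiveness beta0, light
# absorption gamma (scaled for MW-sized distances), random step alpha.
n_fireflies, beta0, gamma, alpha = 20, 1.0, 1e-5, 5.0
pop = rng.uniform(pmin, pmax, size=(n_fireflies, 3))

for _ in range(200):
    brightness = np.array([cost(p) for p in pop])
    for i in range(n_fireflies):
        for j in range(n_fireflies):
            if brightness[j] < brightness[i]:          # j is brighter (cheaper)
                r2 = np.sum((pop[i] - pop[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)     # attractiveness decays with distance
                pop[i] += beta * (pop[j] - pop[i]) + alpha * rng.uniform(-0.5, 0.5, 3)
                pop[i] = np.clip(pop[i], pmin, pmax)
                brightness[i] = cost(pop[i])

best = min(pop, key=cost)
print(best, best.sum())   # dispatch should end up close to the 800 MW demand
```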

  4. Robust reactor power control system design by genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yoon Joon; Cho, Kyung Ho; Kim, Sin [Cheju National University, Cheju (Korea, Republic of)

    1998-12-31

    The H{sub {infinity}} robust controller for the reactor power control system is designed by use of the mixed weight sensitivity. The system is configured into the typical two-port model with which the weight functions are augmented. Since the solution depends on the weighting functions and the problem is nonconvex, the genetic algorithm is used to determine the weighting functions. The cost function applied in the genetic algorithm permits direct control of the power tracking performance. In addition, actual operating constraints such as rod velocity and acceleration can be treated as design parameters. Compared with the conventional approach, the controller designed by the genetic algorithm achieves better performance under realistic constraints. It is also found that the genetic algorithm can be used as an effective tool in robust design. 4 refs., 6 figs. (Author)

  5. Robust reactor power control system design by genetic algorithm

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Yoon Joon; Cho, Kyung Ho; Kim, Sin [Cheju National University, Cheju (Korea, Republic of)

    1997-12-31

    The H{sub {infinity}} robust controller for the reactor power control system is designed by use of the mixed weight sensitivity. The system is configured into the typical two-port model with which the weight functions are augmented. Since the solution depends on the weighting functions and the problem is nonconvex, the genetic algorithm is used to determine the weighting functions. The cost function applied in the genetic algorithm permits direct control of the power tracking performance. In addition, actual operating constraints such as rod velocity and acceleration can be treated as design parameters. Compared with the conventional approach, the controller designed by the genetic algorithm achieves better performance under realistic constraints. It is also found that the genetic algorithm can be used as an effective tool in robust design. 4 refs., 6 figs. (Author)

  6. Quantitative Methods in Supply Chain Management Models and Algorithms

    CERN Document Server

    Christou, Ioannis T

    2012-01-01

    Quantitative Methods in Supply Chain Management presents some of the most important methods and tools available for modeling and solving problems arising in the context of supply chain management. In the context of this book, “solving problems” usually means designing efficient algorithms for obtaining high-quality solutions. The first chapter is an extensive optimization review covering continuous unconstrained and constrained linear and nonlinear optimization algorithms, as well as dynamic programming and discrete optimization exact methods and heuristics. The second chapter presents time-series forecasting methods together with prediction market techniques for demand forecasting of new products and services. The third chapter details models and algorithms for planning and scheduling with an emphasis on production planning and personnel scheduling. The fourth chapter presents deterministic and stochastic models for inventory control with a detailed analysis on periodic review systems and algorithmic dev...

  7. Advancing Research in Second Language Writing through Computational Tools and Machine Learning Techniques: A Research Agenda

    Science.gov (United States)

    Crossley, Scott A.

    2013-01-01

    This paper provides an agenda for replication studies focusing on second language (L2) writing and the use of natural language processing (NLP) tools and machine learning algorithms. Specifically, it introduces a range of the available NLP tools and machine learning algorithms and demonstrates how these could be used to replicate seminal studies…

  8. EAGLE: 'EAGLE Is an Algorithmic Graph Library for Exploration'

    Energy Technology Data Exchange (ETDEWEB)

    2015-01-16

    The Resource Description Framework (RDF) and SPARQL Protocol and RDF Query Language (SPARQL) were introduced about a decade ago to enable flexible schema-free data interchange on the Semantic Web. Today data scientists use the framework as a scalable graph representation for integrating, querying, exploring and analyzing data sets hosted at different sources. With increasing adoption, the need for graph mining capabilities for the Semantic Web has emerged. Today there are no tools to conduct "graph mining" on standard RDF data sets. We address that need through implementation of popular iterative graph mining algorithms (triangle count, connected component analysis, degree distribution, diversity degree, PageRank, etc.). We implement these algorithms as SPARQL queries, wrapped within Python scripts, and call our software tool EAGLE. In RDF style, EAGLE stands for "EAGLE 'Is an' algorithmic graph library for exploration." EAGLE is like 'MATLAB' for 'Linked Data.'
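    The record describes graph-mining measures expressed as SPARQL queries wrapped in Python. The fragment below sketches that idea for a triangle count on a tiny in-memory graph; rdflib and the example data are assumptions made here for illustration and are not EAGLE's actual code, queries or datasets.

```python
# Sketch of the "graph mining as a SPARQL query wrapped in Python" idea;
# rdflib and the tiny example graph are assumptions, not EAGLE's own code.
from rdflib import Graph, Namespace

EX = Namespace("http://example.org/")
g = Graph()
edges = [("a", "b"), ("b", "c"), ("a", "c"), ("c", "d")]
for s, o in edges:
    g.add((EX[s], EX.knows, EX[o]))
    g.add((EX[o], EX.knows, EX[s]))          # treat the graph as undirected

# Each undirected triangle is matched 6 times (3 rotations x 2 directions).
query = """
SELECT (COUNT(*) AS ?n) WHERE {
  ?x <http://example.org/knows> ?y .
  ?y <http://example.org/knows> ?z .
  ?z <http://example.org/knows> ?x .
  FILTER (?x != ?y && ?y != ?z && ?x != ?z)
}
"""
n = next(iter(g.query(query)))[0].toPython()
print(n // 6, "triangle(s)")                  # expect 1 for this toy graph
```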

  9. Development of CAD implementing the algorithm of boundary elements’ numerical analytical method

    Directory of Open Access Journals (Sweden)

    Yulia V. Korniyenko

    2015-03-01

    Full Text Available Until recently, the algorithms for the numerical-analytical boundary elements method had been implemented as programs written in the MATLAB environment language. Each program had a local character, i.e. it was used to solve a particular problem: calculation of a beam, frame, arch, etc. Constructing matrices in these programs was carried out "manually" and was therefore time-consuming. The research was aimed at a reasoned choice of a programming language for developing a new CAD system that implements the algorithm of the numerical-analytical boundary elements method and provides visualization tools for the initial objects and the calculation results. The research conducted shows that, among the wide variety of programming languages, the most efficient one for developing a CAD system employing the numerical-analytical boundary elements method algorithm is Java. This language provides tools not only for developing the calculating part of the CAD system, but also for building the graphical interface used to construct geometrical models and interpret calculation results.

  10. Benchmark Framework for Mobile Robots Navigation Algorithms

    Directory of Open Access Journals (Sweden)

    Nelson David Muñoz-Ceballos

    2014-01-01

    Full Text Available Despite the wide variety of studies and research on mobile robot systems, performance metrics are not often examined. This makes it difficult to establish an objective comparison of achievements. In this paper, the navigation of an autonomous mobile robot is evaluated. Several metrics are described. Collectively, these metrics provide an indication of navigation quality, useful for comparing and analyzing navigation algorithms of mobile robots. This method is suggested as an educational tool, allowing students to optimize algorithm quality with respect to aspects important in science, technology and engineering teaching, such as energy consumption, optimization and design.
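    Two of the simplest trajectory-level metrics of the kind such benchmarks aggregate, path length and cumulative heading change, can be computed directly from a recorded path. The snippet below is an illustrative sketch; the trajectory and the exact metric definitions are assumptions, not the paper's.

```python
import numpy as np

# Two simple trajectory metrics: total path length and cumulative heading
# change (a rough proxy for smoothness/energy). Data are illustrative.
path = np.array([[0.0, 0.0], [1.0, 0.1], [2.0, 0.3], [3.0, 0.2], [4.0, 0.0]])

steps = np.diff(path, axis=0)
path_length = np.sum(np.linalg.norm(steps, axis=1))

headings = np.arctan2(steps[:, 1], steps[:, 0])
turn = np.abs(np.diff(headings))
turn = np.minimum(turn, 2 * np.pi - turn)      # wrap angle differences
cumulative_heading_change = turn.sum()

print(f"path length = {path_length:.2f} m, "
      f"cumulative heading change = {np.degrees(cumulative_heading_change):.1f} deg")
```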

  11. A Recommendation Algorithm for Automating Corollary Order Generation

    Science.gov (United States)

    Klann, Jeffrey; Schadow, Gunther; McCoy, JM

    2009-01-01

    Manual development and maintenance of decision support content is time-consuming and expensive. We explore recommendation algorithms, e-commerce data-mining tools that use collective order history to suggest purchases, to assist with this. In particular, previous work shows corollary order suggestions are amenable to automated data-mining techniques. Here, an item-based collaborative filtering algorithm augmented with association rule interestingness measures mined suggestions from 866,445 orders made in an inpatient hospital in 2007, generating 584 potential corollary orders. Our expert physician panel evaluated the top 92 and agreed 75.3% were clinically meaningful. Also, at least one felt 47.9% would be directly relevant in guideline development. This automated generation of a rough-cut of corollary orders confirms prior indications about automated tools in building decision support content. It is an important step toward computerized augmentation to decision support development, which could increase development efficiency and content quality while automatically capturing local standards. PMID:20351875
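    The core of the item-based collaborative filtering step can be illustrated with a cosine-similarity computation over a binary order-by-item matrix, as sketched below. The toy matrix and the scoring rule are assumptions for illustration; the study additionally weights suggestions with association-rule interestingness measures mined from real order history.

```python
import numpy as np

# Toy binary order-by-item matrix: rows are orders, columns are orderable items.
# (Illustrative data only; the study mined 866,445 real inpatient orders.)
orders = np.array([
    [1, 1, 0, 0],
    [1, 1, 1, 0],
    [0, 1, 1, 1],
    [1, 0, 0, 1],
], dtype=float)

# Item-item cosine similarity.
norms = np.linalg.norm(orders, axis=0)
sim = (orders.T @ orders) / np.outer(norms, norms)
np.fill_diagonal(sim, 0.0)

def suggest(ordered_items, top_k=2):
    """Score candidate corollary items by summed similarity to ordered items."""
    scores = sim[:, ordered_items].sum(axis=1)
    scores[ordered_items] = -np.inf            # do not re-suggest existing items
    return np.argsort(scores)[::-1][:top_k]

print(suggest([0]))   # items most often co-ordered with item 0
```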

  12. Methods and Algorithms for Computer-aided Engineering of Die Tooling of Compressor Blades from Titanium Alloy

    Science.gov (United States)

    Khaimovich, A. I.; Khaimovich, I. N.

    2018-01-01

    The article provides calculation algorithms for blank design and die tooling to produce compressor blades for aircraft engines. The design system proposed in the article generates drafts of trimming and reducing dies automatically, leading to a significant reduction in production preparation time. A detailed analysis of the features of the blade's structural elements was carried out; the adopted limitations and technological solutions made it possible to formulate generalized algorithms for shaping the parting face of the die over the entire contour of the impression for different configurations of die forgings. The authors developed algorithms and programs to calculate the three-dimensional point locations describing the configuration of the die cavity.

  13. UTV Expansion Pack - Special-Purpose Rank Revealing Algorithms (version 1.0 for Matlab 6.5)

    DEFF Research Database (Denmark)

    Fierro, Ricardo D.; Hansen, Per Christian

    This collection of Matlab software supplements and complements the package UTV Tools from 1999, and includes implementations of special-purpose rank-revealing algorithms developed since the publication of the original package. We provide algorithms for computing and modifying symmetric rank...

  14. An efficient technique to solve combined economic and emission ...

    Indian Academy of Sciences (India)

    GPSO was a derivative of the Standard Particle Swarm Optimization (SPSO) and .... ing path which makes it more attractive for future ants to follow. ..... algorithm applied to power economic dispatch of generators with multiple fuel options.

  15. Real-Time Optimization for Economic Model Predictive Control

    DEFF Research Database (Denmark)

    Sokoler, Leo Emil; Edlund, Kristian; Frison, Gianluca

    2012-01-01

    In this paper, we develop an efficient homogeneous and self-dual interior-point method for the linear programs arising in economic model predictive control. To exploit structure in the optimization problems, the algorithm employs a highly specialized Riccati iteration procedure. Simulations show...

  16. Introduction to health economics and decision-making: Is economics relevant for the frontline clinician?

    Science.gov (United States)

    Goeree, Ron; Diaby, Vakaramoko

    2013-12-01

    In a climate of escalating demands for new health care services and significant constraints on new resources, the disciplines of health economics and health technology assessment (HTA) have increasingly been turned to as explicit evidence-based frameworks to help make tough health care access and reimbursement decisions. Health economics is the discipline of economics concerned with the efficient allocation of health care resources, essentially trying to maximize health benefits to society contingent upon available resources. HTA is a broader field drawing upon several disciplines, but which relies heavily upon the tools of health economics and economic evaluation. Traditionally, health economics and economic evaluation have been widely used at the political (macro) and local (meso) decision-making levels, and have progressively had an important role even at informing individual clinical decisions (micro level). The aim of this paper is to introduce readers to health economics and discuss its relevance to frontline clinicians. Particularly, the content of the paper will facilitate clinicians' understanding of the link between economics and their medical practice, and how clinical decision-making reflects on health care resource allocation. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Performance comparison of attitude determination, attitude estimation, and nonlinear observers algorithms

    Science.gov (United States)

    MOHAMMED, M. A. SI; BOUSSADIA, H.; BELLAR, A.; ADNANE, A.

    2017-01-01

    This paper presents a brief synthesis and a useful performance analysis of different attitude filtering algorithms (attitude determination algorithms, attitude estimation algorithms, and nonlinear observers) applied to a Low Earth Orbit satellite, in terms of accuracy, convergence time, memory footprint, and computation time. The latter is calculated in two ways, using a personal computer and also using the On-Board Computer 750 (OBC 750) that is used in many SSTL Earth observation missions. This comparative study can serve as a design aid when choosing among attitude determination, attitude estimation and attitude observer algorithms. The simulation results clearly indicate that the nonlinear observer is the most logical choice.
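    As a concrete example of the attitude determination class compared in the paper, the sketch below implements the classical TRIAD algorithm from two vector observations. The sun and magnetic-field vectors are illustrative assumptions; this is not the authors' implementation.

```python
import numpy as np

# Minimal sketch of the TRIAD attitude-determination algorithm, one member of
# the "determination" class compared in the paper. The reference and body
# vectors below are illustrative, not mission data.

def triad(b1, b2, r1, r2):
    """Attitude matrix A such that b ~= A @ r, from two vector observations."""
    b1, b2, r1, r2 = (v / np.linalg.norm(v) for v in (b1, b2, r1, r2))
    tb1 = b1
    tb2 = np.cross(b1, b2); tb2 /= np.linalg.norm(tb2)
    tb3 = np.cross(tb1, tb2)
    tr1 = r1
    tr2 = np.cross(r1, r2); tr2 /= np.linalg.norm(tr2)
    tr3 = np.cross(tr1, tr2)
    return np.column_stack([tb1, tb2, tb3]) @ np.column_stack([tr1, tr2, tr3]).T

# Body-frame measurements (e.g. sun sensor, magnetometer) and the same vectors
# expressed in the reference frame:
b_sun, b_mag = np.array([0.0, 0.0, 1.0]), np.array([0.0, 1.0, 0.0])
r_sun, r_mag = np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])
A = triad(b_sun, b_mag, r_sun, r_mag)
print(A @ r_sun)   # should reproduce b_sun (up to measurement noise)
```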

  18. Performance analysis of manufacturing systems : queueing approximations and algorithms

    NARCIS (Netherlands)

    Vuuren, van M.

    2007-01-01

    Performance Analysis of Manufacturing Systems: Queueing Approximations and Algorithms. This thesis is concerned with the performance analysis of manufacturing systems. Manufacturing is the application of tools and a processing medium to the transformation of raw materials into finished goods for sale.

  19. Economic Report of the President

    National Research Council Canada - National Science Library

    1998-01-01

    .... To reverse this course, we took a new approach, putting in place a bold economic strategy designed to bring down the deficit and give America's workers the tools and training they need to help...

  20. Controlling greenhouse gas emissions: which economic instruments?; Maitriser les emissions de gaz a effet de serre: quels instruments economiques?

    Energy Technology Data Exchange (ETDEWEB)

    Lepeltier, Serge [Senat, Paris (France)

    2000-06-09

    Climate change represents the most severe threat to sustainable world development, public health and future prosperity. This document on greenhouse gas emissions is a report of the Senate Planning Delegation on the economic and fiscal instruments envisaged for abating greenhouse gas emissions. These issues are presented in four chapters, titled as follows: 1. Since the scientific evidence emerged, the need to control greenhouse gas emissions has been unanimously recognized at the Rio (1992) and Kyoto (1997) summits; 2. Economic theory suggests instruments for reducing greenhouse gas emissions at minimum cost; 3. Challenges and avenues of international cooperation in the field of climate change; 4. Combining political will with the pragmatic use of economic instruments at the national scale. The document contains a synthesis of proposals directed towards the following goals: international negotiations on climate change; creating a Community framework for controlling greenhouse gas emissions; establishing national measures for controlling greenhouse gas emissions; and actions to be undertaken by the territorial authorities.

  1. Processing: A Python Framework for the Seamless Integration of Geoprocessing Tools in QGIS

    Directory of Open Access Journals (Sweden)

    Anita Graser

    2015-10-01

    Full Text Available Processing is an object-oriented Python framework for the popular open source Geographic Information System QGIS, which provides a seamless integration of geoprocessing tools from a variety of different software libraries. In this paper, we present the development history, software architecture and features of the Processing framework, which make it a versatile tool for the development of geoprocessing algorithms and workflows, as well as an efficient integration platform for algorithms from different sources. Using real-world application examples, we furthermore illustrate how the Processing architecture enables typical geoprocessing use cases in research and development, such as automating and documenting workflows, combining algorithms from different software libraries, as well as developing and integrating custom algorithms. Finally, we discuss how Processing can facilitate reproducible research and provide an outlook towards future development goals.
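    From Python, Processing algorithms are typically invoked through a single run call. The fragment below illustrates that call pattern as it appears in the current QGIS 3 Python console; the algorithm id, parameter names and file paths are assumptions for this example (and postdate the interface described in the 2015 paper), so they should be checked against the installed provider.

```python
# Illustrative Processing call from the QGIS 3 Python console. The algorithm
# id ("native:buffer"), parameter names and paths are assumptions for this
# example and should be verified against the installed Processing provider.
import processing

result = processing.run(
    "native:buffer",
    {
        "INPUT": "/data/roads.shp",         # hypothetical input layer
        "DISTANCE": 100.0,                  # buffer distance in layer units
        "SEGMENTS": 8,
        "DISSOLVE": False,
        "OUTPUT": "memory:buffered_roads",  # keep the result as a memory layer
    },
)
buffered_layer = result["OUTPUT"]
print(buffered_layer.featureCount())
```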

  2. The Economic Crisis and Sustainable Development

    DEFF Research Database (Denmark)

    Lund, Henrik; Hvelplund, Frede

    of sustainable energy solutions involves the replacement of imported fossil fuels by substantial investments in energy conservation and renewable energy. In such situation, it becomes increasingly essential to develop economic thinking and economic models that can analyse the concrete institutions in which......This paper presents Concrete Institutional Economics as an economic paradigm to understand how the wish for sustainable energy in times of economic crisis can be used to generate jobs as well as economic growth. In most countries, including European countries, the USA and China, the implementation...... the market is embedded. This paper presents such tools and methodologies and applies them to the case of the Danish heating sector. The case shows how investments in decreasing fossil fuels and CO2 emissions can be made in a way in which they have a positive influence on job creation and economic development...

  3. Is prophetic discourse adequate to address global economic justice?

    African Journals Online (AJOL)

    Test

    2011-02-15

    Feb 15, 2011 ... of moral discourse adequately addresses issues of economic injustice. ... plays an indispensable role in addressing issues of global economic justice, but ...... governance in their business practices, to provide a tool for a.

  4. INCORPORATING ENVIRONMENTAL AND ECONOMIC CONSIDERATIONS INTO PROCESS DESIGN: THE WASTE REDUCTION (WAR) ALGORITHM

    Science.gov (United States)

    A general theory known as the WAste Reduction (WAR) algorithm has been developed to describe the flow and the generation of potential environmental impact through a chemical process. This theory integrates environmental impact assessment into chemical process design. Potential en...

  5. DANNP: an efficient artificial neural network pruning tool

    KAUST Repository

    Alshahrani, Mona

    2017-11-06

    Background Artificial neural networks (ANNs) are a robust class of machine learning models and are a frequent choice for solving classification problems. However, determining the structure of the ANNs is not trivial as a large number of weights (connection links) may lead to overfitting the training data. Although several ANN pruning algorithms have been proposed for the simplification of ANNs, these algorithms are not able to efficiently cope with intricate ANN structures required for complex classification problems. Methods We developed DANNP, a web-based tool, that implements parallelized versions of several ANN pruning algorithms. The DANNP tool uses a modified version of the Fast Compressed Neural Network software implemented in C++ to considerably enhance the running time of the ANN pruning algorithms we implemented. In addition to the performance evaluation of the pruned ANNs, we systematically compared the set of features that remained in the pruned ANN with those obtained by different state-of-the-art feature selection (FS) methods. Results Although the ANN pruning algorithms are not entirely parallelizable, DANNP was able to speed up the ANN pruning up to eight times on a 32-core machine, compared to the serial implementations. To assess the impact of the ANN pruning by DANNP tool, we used 16 datasets from different domains. In eight out of the 16 datasets, DANNP significantly reduced the number of weights by 70%–99%, while maintaining a competitive or better model performance compared to the unpruned ANN. Finally, we used a naïve Bayes classifier derived with the features selected as a byproduct of the ANN pruning and demonstrated that its accuracy is comparable to those obtained by the classifiers trained with the features selected by several state-of-the-art FS methods. The FS ranking methodology proposed in this study allows the users to identify the most discriminant features of the problem at hand. To the best of our knowledge, DANNP (publicly
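    The simplest of the pruning ideas that such tools build on, magnitude-based weight removal on a single layer, is sketched below purely for illustration. The layer size and keep fraction are assumptions; DANNP itself implements several parallelized pruning algorithms on top of a modified C++ Fast Compressed Neural Network library.

```python
import numpy as np

# Illustration of magnitude-based weight pruning on one dense layer.
# Layer size and keep fraction are made up; DANNP's algorithms differ.
rng = np.random.default_rng(1)
weights = rng.normal(size=(64, 32))           # hypothetical layer weights

def prune_by_magnitude(w, keep_fraction=0.1):
    """Zero out all but the largest-magnitude weights."""
    k = int(w.size * keep_fraction)
    threshold = np.sort(np.abs(w), axis=None)[-k]
    mask = np.abs(w) >= threshold
    return w * mask, mask

pruned, mask = prune_by_magnitude(weights, keep_fraction=0.1)
print(f"kept {mask.mean():.1%} of weights")   # roughly 10%
```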

  6. The Economic Value of Water

    Directory of Open Access Journals (Sweden)

    Pedro Arrojo Agudo

    1999-10-01

    Full Text Available The economic term of water is seen from the perspective of an ecological economy, an Aristotelian sense that integrates social values, environmental considerations and financial issues. Water should thus be conceptualized as an “ecosocial” good and not merely as a simple factor of production. Therefore, the focus of water management should not limit itself to managing a scarce resource. Rather the focus should be to articulate an institutional framework that would allow for the use of management tools based on the financial value of water (pricing policies, fiscal incentives, economic penalties for inefficiency...), fixed to a somewhat interventionist market, or which answers to administration mechanisms, with constraints setting the conditions of sustainability that the sound management of water requires in each territory. This approach brings to the table a profoundly territorial and contextualized view of water management within the paradigm of Sustainable Development. Having said this does not imply disregarding the classical economic science tools of cost/benefit analysis, though. Quite the contrary: today, economic science can provide highly useful, multiple concepts and traditional techniques to the creation of a new model of the economic management of water. At bottom, the challenge is to take advantage of the previous conceptual and methodological body of work, refining the work in some cases, contextualizing it in others, and above all, complementing the previous work with other value-based perspectives to develop a multi-criteria decision-making model for the management and financial assessment of water policies.

  7. Parallel grid generation algorithm for distributed memory computers

    Science.gov (United States)

    Moitra, Stuti; Moitra, Anutosh

    1994-01-01

    A parallel grid-generation algorithm and its implementation on the Intel iPSC/860 computer are described. The grid-generation scheme is based on an algebraic formulation of homotopic relations. Methods for utilizing the inherent parallelism of the grid-generation scheme are described, and the implementation of multiple levels of parallelism on multiple-instruction multiple-data machines is indicated. The algorithm is capable of providing near orthogonality and spacing control at solid boundaries while requiring minimal interprocessor communications. Results obtained on the Intel hypercube for a blended wing-body configuration are used to demonstrate the effectiveness of the algorithm. Fortran implementations based on the native programming model of the iPSC/860 computer and the Express system of software tools are reported. Computational gains in execution time speed-up ratios are given.

  8. New HIV Testing Algorithm: Promising Tool in the Fight Against HIV

    Centers for Disease Control (CDC) Podcasts

    2016-09-21

    In this podcast, CDC’s Dr. Phil Peters discusses the new HIV testing algorithm and how this latest technology can improve the diagnosis of acute HIV infection. Early detection of HIV is critical to saving lives, getting patients into treatment, and preventing transmission.  Created: 9/21/2016 by National Center for HIV/AIDS, Viral Hepatitis, STD and TB Prevention (NCHHSTP), • Division of HIV/AIDS Prevention (DHAP).   Date Released: 9/21/2016.

  9. Life Cycle Assessment Studies of Chemical and Biochemical Processes through the new LCSoft Software-tool

    DEFF Research Database (Denmark)

    Supawanich, Perapong; Malakul, Pomthong; Gani, Rafiqul

    2015-01-01

    requirements have to be evaluated together with environmental and economic aspects. The LCSoft software-tool has been developed to perform LCA as a stand-alone tool as well as integrated with other process design tools such as process simulation, economic analysis (ECON), and sustainable process design...

  10. Nature-inspired novel Cuckoo Search Algorithm for genome ...

    Indian Academy of Sciences (India)

    compared their results with other methods such as the genetic algorithm. ... It is a population-based search procedure used as an optimization tool, in ... In this section, the problem formulation, fitness evaluation, flowchart and implementation of the ..... Machine Learning 21: 11–33 ... Numerical Optimization 1: 330–343.

  11. Integrating spatial support tools into strategic planning-SEA of the GMS North-South Economic Corridor Strategy and Action Plan

    International Nuclear Information System (INIS)

    Ramachandran, Pavit; Linde, Lothar

    2011-01-01

    The GMS countries, supported by the Asian Development Bank, have adopted a holistic, multidimensional approach to strengthen infrastructural linkages and facilitate cross-border trade through (i) the establishment of a trans-boundary road connecting two economic nodes across marginalised areas, followed by (ii) facilitation of environmentally and socially sound investments in these newly connected areas as a means to develop livelihoods. The North-South Economic Corridor is currently in its second phase of development, with investment opportunities to be laid out in the NSEC Strategy and Action Plan (SAP). It targets the ecologically and culturally sensitive border area between PR China's Yunnan Province, Northern Lao PDR, and Thailand. A trans-boundary, cross-sectoral Strategic Environmental Assessment was conducted to support the respective governments in assessing potential environmental and social impacts, developing alternatives and mitigation options, and feeding the findings back into the SAP writing process. Given the spatial dimension of corridor development, with regard to both opportunities and risks, particular emphasis was put on the application of spatial modelling tools to help geographically locate and quantify impacts as a means to guide interventions and set priorities.

  12. Research of cartographer laser SLAM algorithm

    Science.gov (United States)

    Xu, Bo; Liu, Zhengjun; Fu, Yiran; Zhang, Changsai

    2017-11-01

    Indoor spaces are relatively closed and small, so total station, GPS and close-range photogrammetry technologies struggle to achieve fast and accurate indoor three-dimensional reconstruction. LIDAR SLAM technology does not rely on a priori knowledge of the external environment; it uses only its own portable lidar, IMU, odometer and other sensors to build a map of the environment independently, which solves this problem well. This paper analyzes the Google Cartographer laser SLAM algorithm in terms of point cloud matching and closed-loop detection. Finally, the algorithm is demonstrated in the 3D visualization tool RViz, from data acquisition and processing to creation of the environment map, completing the SLAM workflow and realizing indoor three-dimensional space reconstruction.

  13. Techno-economic design optimization of solar thermal power plants

    OpenAIRE

    Morin, G.

    2011-01-01

    A holistic view is essential in the engineering of technical systems. This thesis presents an integrative approach for designing solar thermal power plants. The methodology is based on a techno-economic plant model and a powerful optimization algorithm. Typically, contemporary design methods treat technical and economic parameters and sub-systems separately, making it difficult or even impossible to realize the full optimization potential of power plant systems. The approach presented here ov...

  14. Modeling Algorithms in SystemC and ACL2

    Directory of Open Access Journals (Sweden)

    John W. O'Leary

    2014-06-01

    Full Text Available We describe the formal language MASC, based on a subset of SystemC and intended for modeling algorithms to be implemented in hardware. By means of a special-purpose parser, an algorithm coded in SystemC is converted to a MASC model for the purpose of documentation, which in turn is translated to ACL2 for formal verification. The parser also generates a SystemC variant that is suitable as input to a high-level synthesis tool. As an illustration of this methodology, we describe a proof of correctness of a simple 32-bit radix-4 multiplier.

  15. Optimising a shaft's geometry by applying genetic algorithms

    Directory of Open Access Journals (Sweden)

    María Alejandra Guzmán

    2005-05-01

    Full Text Available Many engineering design tasks involve optimising several conflicting goals; these types of problem are known as Multiobjective Optimisation Problems (MOPs). Evolutionary techniques have proved to be an effective tool for finding solutions to these MOPs during the last decade. Variations on the basic genetic algorithm have been proposed by different researchers for rapidly finding optimal solutions to MOPs. The NSGA (Non-dominated Sorting Genetic Algorithm) has been implemented in this paper for finding an optimal design for a shaft subjected to cyclic loads, the conflicting goals being minimum weight and minimum lateral deflection.
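    The ranking step at the heart of NSGA is non-dominated sorting of candidate designs. The sketch below implements that step for two minimization objectives, such as shaft weight and lateral deflection; the objective values are illustrative assumptions rather than outputs of a real shaft model.

```python
# Minimal sketch of non-dominated sorting (the ranking step used by NSGA)
# for two minimization objectives, e.g. shaft weight and lateral deflection.
# Objective values are illustrative; a real run would evaluate a shaft model.

def dominates(a, b):
    """a dominates b if a is no worse in every objective and better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_fronts(objectives):
    remaining = list(range(len(objectives)))
    fronts = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(objectives[j], objectives[i])
                            for j in remaining if j != i)]
        fronts.append(front)
        remaining = [i for i in remaining if i not in front]
    return fronts

designs = [(2.1, 0.40), (1.8, 0.55), (2.5, 0.30), (2.0, 0.45), (3.0, 0.60)]
print(non_dominated_fronts(designs))   # first front holds the Pareto-optimal designs
```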

  16. Radwaste volume reduction economics: an overview

    International Nuclear Information System (INIS)

    Naughton, M.D.

    1984-01-01

    Today, utilities are faced with mounting charges related to the disposal of radioactive waste from their nuclear power plants. Numerous factors complicate economic analysis of radwaste processing options. This paper details two recent key EPRI studies bearing upon radwaste operations and economics. The first study, RP1557-3, characterizes low-level wastes from nuclear power plants during the period 1978 to 1982. This paper presents information on the quantity of waste by type, waste composition, specific activities and major isotopes, and radiation fields of final disposal packages. The second study, RP1557-11,12,13, involved the development of a computer code for evaluating radwaste disposal economics. Capital and operating cost estimates were prepared for 11 different processing-disposal options. These costs are utilized along with a burial site pricing algorithm in VRTECH, a computer radwaste economic assessment program. This paper discusses the VRTECH code and the results of the generic analyses conducted in the study.

  17. RNA-SSPT: RNA Secondary Structure Prediction Tools.

    Science.gov (United States)

    Ahmad, Freed; Mahboob, Shahid; Gulzar, Tahsin; Din, Salah U; Hanif, Tanzeela; Ahmad, Hifza; Afzal, Muhammad

    2013-01-01

    The prediction of RNA structure is useful for understanding evolution in both in silico and in vitro studies. Physical methods like NMR studies to predict RNA secondary structure are expensive and difficult. Computational RNA secondary structure prediction is easier. Comparative sequence analysis provides the best solution, but secondary structure prediction of a single RNA sequence remains challenging. RNA-SSPT is a tool that computationally predicts the secondary structure of a single RNA sequence. Most RNA secondary structure prediction tools do not allow pseudoknots in the structure or are unable to locate them. The Nussinov dynamic programming algorithm has been implemented in RNA-SSPT. The current study shows that only the energetically most favorable secondary structure is required, and a modification of the algorithm is also available that produces base pairs to lower the total free energy of the secondary structure. For visualization of the RNA secondary structure, NAVIEW in the C language is used, modified in C# to meet the tool's requirements. RNA-SSPT is built in C# using .NET 2.0 in Microsoft Visual Studio 2005 Professional edition. The accuracy of RNA-SSPT is tested in terms of sensitivity and positive predictive value. It is a tool which serves both secondary structure prediction and secondary structure visualization purposes.
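    The Nussinov recursion mentioned in the abstract maximizes the number of complementary base pairs by dynamic programming. The sketch below is a minimal illustration of that recursion with an assumed minimum hairpin-loop length; RNA-SSPT itself is a C# tool that additionally refines structures by free energy and visualizes them.

```python
# Minimal sketch of the Nussinov dynamic-programming recursion for base-pair
# maximization (illustrative only; the minimum loop length is an assumption).

def nussinov(seq, min_loop=3):
    """Return the maximum number of complementary base pairs for seq."""
    pairs = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):            # subsequence length - 1
        for i in range(n - span):
            j = i + span
            best = dp[i + 1][j]                     # base i left unpaired
            if (seq[i], seq[j]) in pairs:
                best = max(best, dp[i + 1][j - 1] + 1)   # i pairs with j
            for k in range(i + 1, j):               # bifurcation into two subproblems
                best = max(best, dp[i][k] + dp[k + 1][j])
            dp[i][j] = best
    return dp[0][n - 1]

print(nussinov("GGGAAAUCC"))   # prints 3 for this toy sequence
```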

  18. Demystifying the role of copyright as a tool for economic ...

    African Journals Online (AJOL)

    RV

    ... regard is one of creating a conducive environment through political (and economic) stability, and not one of actually ..... Coach 2010 ..... influenced by the civil law tradition, have incorporated bad civil law elements into English copyright law ...

  19. Health economics education in undergraduate medical training: introducing the health economics education (HEe) website

    Science.gov (United States)

    2013-01-01

    In the UK, the General Medical Council clearly stipulates that upon completion of training, medical students should be able to discuss the principles underlying the development of health and health service policy, including issues relating to health economics. In response, researchers from the UK and other countries have called for a need to incorporate health economics training into the undergraduate medical curricula. The Health Economics education website was developed to encourage and support teaching and learning in health economics for medical students. It was designed to function both as a forum for teachers of health economics to communicate and to share resources and also to provide instantaneous access to supporting literature and teaching materials on health economics. The website provides a range of free online material that can be used by both health economists and non-health economists to teach the basic principles of the discipline. The Health Economics education website is the only online education resource that exists for teaching health economics to medical undergraduate students and it provides teachers of health economics with a range of comprehensive basic and advanced teaching materials that are freely available. This article presents the website as a tool to encourage the incorporation of health economics training into the undergraduate medical curricula. PMID:24034906

  20. Health economics education in undergraduate medical training: introducing the health economics education (HEe) website.

    Science.gov (United States)

    Oppong, Raymond; Mistry, Hema; Frew, Emma

    2013-09-13

    In the UK, the General Medical Council clearly stipulates that upon completion of training, medical students should be able to discuss the principles underlying the development of health and health service policy, including issues relating to health economics. In response, researchers from the UK and other countries have called for a need to incorporate health economics training into the undergraduate medical curricula. The Health Economics education website was developed to encourage and support teaching and learning in health economics for medical students. It was designed to function both as a forum for teachers of health economics to communicate and to share resources and also to provide instantaneous access to supporting literature and teaching materials on health economics. The website provides a range of free online material that can be used by both health economists and non-health economists to teach the basic principles of the discipline. The Health Economics education website is the only online education resource that exists for teaching health economics to medical undergraduate students and it provides teachers of health economics with a range of comprehensive basic and advanced teaching materials that are freely available. This article presents the website as a tool to encourage the incorporation of health economics training into the undergraduate medical curricula.