Li, Jingpeng; Burke, Edmund
2008-01-01
Nurse rostering is a complex scheduling problem that affects hospital personnel on a daily basis all over the world. This paper presents a new component-based approach with adaptive perturbations for a nurse scheduling problem arising at a major UK hospital. The main idea behind this technique is to decompose a schedule into its components (i.e. the allocated shift pattern of each nurse) and then mimic a natural evolutionary process on these components to iteratively deliver better schedules. The worthiness of every component in the schedule must be continuously demonstrated in order for it to remain there. This demonstration employs a dynamic evaluation function that measures how well each component contributes towards the final objective. Two perturbation steps are then applied: the first eliminates a number of components deemed not worthy of staying in the current schedule; the second may also throw out, with a low probability, some worthy components. The eli...
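The two perturbation steps can be sketched in a few lines of Python. This is a hedged illustration only, not the authors' implementation: the `worthiness` scores stand in for the paper's dynamic evaluation function, and the threshold and probability values are invented for the example.

```python
import random

def perturb(schedule, worthiness, threshold=0.5, p_random=0.05, rng=random):
    """Apply the two perturbation steps to a schedule.

    schedule   -- dict mapping nurse id -> allocated shift pattern (component)
    worthiness -- dict mapping nurse id -> score from the dynamic
                  evaluation function (higher = contributes more)
    Returns the set of nurse ids whose components were removed and
    must be rebuilt in the next iteration.
    """
    removed = set()
    for nurse in schedule:
        if worthiness[nurse] < threshold:
            removed.add(nurse)        # first perturbation: drop unworthy components
        elif rng.random() < p_random:
            removed.add(nurse)        # second: occasionally drop a worthy one too
    return removed

rng = random.Random(42)
schedule = {"n1": "early", "n2": "late", "n3": "night"}
scores = {"n1": 0.9, "n2": 0.2, "n3": 0.7}
removed = perturb(schedule, scores, rng=rng)
```

The second, probabilistic removal is what keeps the evolutionary process from stagnating on a locally good but globally suboptimal schedule.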
Heuristic Based Task Scheduling In Grid
Directory of Open Access Journals (Sweden)
Manpreet Singh
2012-09-01
Full Text Available Grid computing is concerned with coordinated resource sharing and problem solving in dynamic, multi-institutional virtual organizations. Efficient scheduling of complex applications in a grid environment presents several challenges due to the grid's high heterogeneity, dynamic behavior, and space-shared utilization. The objectives of scheduling algorithms are to increase system throughput and efficiency and to reduce task completion time. The main focus of this paper is to highlight the merits of resource and task selection techniques based on certain heuristics.
Greedy heuristic algorithm for solving series of EEE components classification problems
Kazakovtsev, A. L.; Antamoshkin, A. N.; Fedosov, V. V.
2016-04-01
Algorithms based on agglomerative greedy heuristics demonstrate precise and stable results for clustering problems based on k-means and p-median models. Such algorithms are successfully implemented in the production of specialized EEE components for use in space systems, which includes testing each EEE device and detecting homogeneous production batches of the EEE components from the test results using p-median models. In this paper, the authors propose a new version of the genetic algorithm with a greedy agglomerative heuristic which allows solving a series of problems. Such an algorithm is useful for solving the k-means and p-median clustering problems when the number of clusters is unknown. Computational experiments on real data show that the precision of the result decreases insignificantly in comparison with the initial genetic algorithm for solving a single problem.
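The core of a greedy agglomerative heuristic for k-means-style clustering can be sketched as follows: start from a surplus of candidate centers and repeatedly drop the center whose removal increases the total squared error the least. This is a one-dimensional toy sketch under invented data, not the paper's genetic algorithm:

```python
def sse(points, centers):
    # total squared distance of each point to its nearest center
    return sum(min((p - c) ** 2 for c in centers) for p in points)

def greedy_agglomerative(points, centers, k):
    """Greedily drop centers one at a time, each time removing the center
    whose removal increases total SSE the least, until k centers remain."""
    centers = list(centers)
    while len(centers) > k:
        best = min(range(len(centers)),
                   key=lambda i: sse(points, centers[:i] + centers[i + 1:]))
        centers.pop(best)
    return centers

points = [0.0, 0.1, 0.2, 5.0, 5.1, 9.9, 10.0]
centers = greedy_agglomerative(points, [0.0, 0.1, 5.0, 9.9, 10.0], k=3)
```

Because the elimination is scored against the clustering objective itself, the same loop can be stopped at different values of k, which is what makes the approach useful when the number of clusters is unknown.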
Heuristic Reduction Algorithm Based on Pairwise Positive Region
Institute of Scientific and Technical Information of China (English)
QI Li; LIU Yu-shu
2007-01-01
To guarantee the optimal reduct set, a heuristic reduction algorithm is proposed which considers the distinguishing information between the members of each pair of decision classes. First, the pairwise positive region is defined, based on which the pairwise significance measure is calculated between the members of each pair of classes. Finally, the weighted pairwise significance of an attribute is used as the attribute reduction criterion, which indicates the necessity of attributes very well. By introducing a noise tolerance factor, the new algorithm can tolerate noise to some extent. Experimental results show the advantages of our novel heuristic reduction algorithm over the traditional attribute-dependency-based algorithm.
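The pairwise positive region of the paper builds on the standard rough-set positive region: the set of objects whose equivalence class under a subset of attributes is consistent in the decision. A minimal sketch of that underlying notion, on an invented toy decision table:

```python
from collections import defaultdict

def positive_region(table, attrs, decision):
    """Objects whose equivalence class under `attrs` is consistent,
    i.e. all members share one decision value."""
    classes = defaultdict(list)
    for obj, row in table.items():
        key = tuple(row[a] for a in attrs)
        classes[key].append(obj)
    pos = set()
    for members in classes.values():
        if len({table[o][decision] for o in members}) == 1:
            pos.update(members)
    return pos

# toy decision table: objects 1 and 4 are indiscernible on {a, b}
# but disagree on the decision, so they fall outside the positive region
table = {
    1: {"a": 0, "b": 0, "d": "yes"},
    2: {"a": 0, "b": 1, "d": "no"},
    3: {"a": 1, "b": 0, "d": "yes"},
    4: {"a": 0, "b": 0, "d": "no"},
}
```

The pairwise variant restricts this computation to the objects of one pair of decision classes at a time, which is where the paper's significance measure comes from.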
Hybrid Heuristic-Based Artificial Immune System for Task Scheduling
Sanei, Masoomeh
2011-01-01
Task scheduling in heterogeneous systems is the process of allocating the tasks of an application to heterogeneous processors interconnected by high-speed networks, so as to minimize the finishing time of the application as much as possible. Tasks are the processing units of an application, have precedence constraints and communication costs, and are represented by Directed Acyclic Graphs (DAGs). Evolutionary algorithms are well suited for solving the task scheduling problem in a heterogeneous environment. In this paper, we propose a hybrid heuristic-based Artificial Immune System (AIS) algorithm for solving the scheduling problem. In this regard, AIS is hybridized with some heuristics and a Single Neighbourhood Search (SNS) technique. The cloning and immune-remove operators of AIS provide diversity, while the heuristics and SNS provide convergence of the algorithm to good solutions, thus balancing exploration and exploitation. We have compared our method with some state-of-the-art algorithms. The results of the experiments...
Honda, Hidehito; Matsuka, Toshihiko; Ueda, Kazuhiro
2016-07-20
Some researchers on binary choice inference have argued that people make inferences based on simple heuristics, such as recognition, fluency, or familiarity. Others have argued that people make inferences based on available knowledge. To examine the boundary between heuristic and knowledge usage, we examine binary choice inference processes in terms of attribute substitution in heuristic use (Kahneman & Frederick, 2005). In this framework, it is predicted that people will rely on heuristic- or knowledge-based inference depending on the subjective difficulty of the inference task. We conducted competitive tests of binary choice inference models representing simple heuristics (fluency and familiarity heuristics) and knowledge-based inference models. We found that a simple heuristic model (especially a familiarity heuristic model) explained inference patterns for subjectively difficult inference tasks, and that a knowledge-based inference model explained subjectively easy inference tasks. These results were consistent with the predictions of the attribute substitution framework. Issues concerning the usage of simple heuristics and the underlying psychological processes are discussed.
Improving Multi-Component Maintenance Acquisition with a Greedy Heuristic Local Algorithm
2013-04-01
heuristic based on a genetic algorithm was applied in train maintenance scheduling problems by Sriskandarajah, Jardine, and Chan (1998), primarily… interdependent. OR Spectrum, 27(1), 63–84. Sriskandarajah, C., Jardine, A., & Chan, C. (1998). Maintenance scheduling of rolling stock using a genetic
Automated Detection of Heuristics and Biases among Pathologists in a Computer-Based System
Crowley, Rebecca S.; Legowski, Elizabeth; Medvedeva, Olga; Reitmeyer, Kayse; Tseytlin, Eugene; Castine, Melissa; Jukic, Drazen; Mello-Thoms, Claudia
2013-01-01
The purpose of this study is threefold: (1) to develop an automated, computer-based method to detect heuristics and biases as pathologists examine virtual slide cases, (2) to measure the frequency and distribution of heuristics and errors across three levels of training, and (3) to examine relationships of heuristics to biases, and biases to…
Heuristic Based Adaptive Step Size CLMS Algorithms for Smart Antennas
Directory of Open Access Journals (Sweden)
Y Rama Krishna
2013-05-01
Full Text Available A smart antenna system combines multiple antenna elements with signal processing capability to optimize its radiation and/or reception pattern automatically, in response to the signal environment, through complex weight selection. Selecting weights that yield a suitable array factor with low Half Power Beam Width (HPBW) and Side Lobe Level (SLL) is a complex process. The aim of this work is to design a new approach for smart antennas that minimizes noise and interference effects from external sources with the least number of iterations. This paper presents a heuristics-based adaptive step size Complex Least Mean Square (CLMS) model for smart antennas to speed up convergence. In this process, the Benveniste and Mathews algorithms are used as heuristics with CLMS, and the improvement in performance of the smart antenna system in terms of convergence rate and array factor is discussed and compared with the performance of the CLMS and Augmented CLMS (ACLMS) algorithms.
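The idea of an adaptive step size CLMS can be sketched as a complex LMS filter whose step size grows while successive errors remain correlated, loosely in the spirit of the Mathews recursion. This is a simplified sketch, not the paper's exact model; the channel coefficients and all parameter values are invented:

```python
import numpy as np

def adaptive_clms(x, d, n_taps=4, mu0=0.01, rho=1e-4, mu_min=0.005, mu_max=0.1):
    """Complex LMS with a gradient-adaptive step size (simplified sketch).
    x: complex input samples, d: desired response."""
    w = np.zeros(n_taps, dtype=complex)
    mu = mu0
    prev_e, prev_u = 0j, np.zeros(n_taps, dtype=complex)
    err = []
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]     # tap vector [x[n], x[n-1], ...]
        e = d[n] - np.vdot(w, u)              # a-priori error, y = w^H u
        # step-size adaptation: increase mu while successive errors correlate
        mu += rho * float(np.real(e * np.conj(prev_e) * np.vdot(prev_u, u)))
        mu = float(np.clip(mu, mu_min, mu_max))
        w += mu * np.conj(e) * u              # CLMS weight update
        prev_e, prev_u = e, u
        err.append(abs(e))
    return w, err

rng = np.random.default_rng(0)
x = rng.standard_normal(500) + 1j * rng.standard_normal(500)
h = np.array([0.5, -0.2 + 0.1j, 0.1, 0.05])   # hypothetical unknown channel
d = np.convolve(x, h)[:len(x)]
w, err = adaptive_clms(x, d)
```

A large step size early on speeds convergence; shrinking it near the optimum reduces steady-state misadjustment, which is the trade-off the heuristic step-size rules target.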
Heuristics-based query optimisation for SPARQL
P. Tsialiamanis (Petros); E. Sidirourgos (Eleftherios); I. Fundulaki; V. Christophides; P.A. Boncz (Peter)
2012-01-01
textabstractQuery optimization in RDF Stores is a challenging problem as SPARQL queries typically contain many more joins than equivalent relational plans, and hence lead to a large join order search space. In such cases, cost-based query optimization often is not possible. One practical reason for
SVM multiuser detection based on heuristic kernel
Institute of Scientific and Technical Information of China (English)
Yang Tao; Hu Bo
2007-01-01
A support vector machine (SVM) based multiuser detection (MUD) scheme in code-division multiple-access (CDMA) system is proposed. In this scheme, the equivalent support vector (SV) is obtained through a kernel sparsity approximation algorithm, which avoids the conventional costly quadratic programming (QP) procedure in SVM. Besides, the coefficient of the SV is attained through the solution to a generalized eigenproblem. Simulation results show that the proposed scheme has almost the same bit error rate (BER) as the standard SVM and is better than minimum mean square error (MMSE) scheme. Meanwhile, it has a low computation complexity.
A heuristic-based practical tool for casting process design
Energy Technology Data Exchange (ETDEWEB)
Nanda, N.K.; Smith, K.A.; Voller, V.R.; Haberle, K.F. [Univ. of Minnesota, Minneapolis, MN (United States). Dept. of Civil Engineering
1995-12-31
The work in this paper reports on a heuristic-based computer tool directed at casting process design, in particular key design parameters such as part orientation, location of sprues, feeding rates, etc. The underlying principle is that a given casting can be represented by identifying and classifying its critical features. The input to the system consists of the attributes of the features, and the graphical output provides semi-quantitative information on key design parameters. Results on real castings match those of expert casting designers, and in some cases potential design improvements have been suggested by the system.
Heuristic-based scheduling algorithm for high level synthesis
Mohamed, Gulam; Tan, Han-Ngee; Chng, Chew-Lye
1992-01-01
A new scheduling algorithm is proposed which uses a combination of a resource utilization chart, a heuristic algorithm that estimates the minimum number of hardware units based on operator mobilities, and a list-scheduling technique to achieve fast and near-optimal schedules. The schedule time of this algorithm is almost independent of the length of the operators' mobilities, as can be seen from the benchmark example presented (a fifth-order digital elliptical wave filter) when the cycle time was increased from 17 to 18 and then to 21 cycles. The algorithm is implemented in C on a SUN3/60 workstation.
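The combination of mobility (ALAP minus ASAP time) and list scheduling can be sketched on a toy dataflow graph. This is an illustrative sketch with an invented graph and a single operation type, not the paper's algorithm:

```python
def list_schedule(ops, deps, limit):
    """List-schedule `ops` (in topological order) with at most `limit`
    units per cycle, prioritising operations with the smallest mobility."""
    t_asap = {}
    for o in ops:
        t_asap[o] = max((t_asap[p] + 1 for p in deps[o]), default=0)
    horizon = max(t_asap.values())
    t_alap = {}
    for o in reversed(ops):
        succs = [s for s in ops if o in deps[s]]
        t_alap[o] = min((t_alap[s] - 1 for s in succs), default=horizon)
    mobility = {o: t_alap[o] - t_asap[o] for o in ops}
    sched, cycle = {}, 0
    while len(sched) < len(ops):
        ready = [o for o in ops if o not in sched
                 and all(p in sched and sched[p] < cycle for p in deps[o])]
        ready.sort(key=lambda o: mobility[o])     # least mobility first
        for o in ready[:limit]:
            sched[o] = cycle
        cycle += 1
    return sched

ops = ["a", "b", "d", "c", "e"]
deps = {"a": [], "b": [], "d": [], "c": ["a", "b"], "e": ["c", "d"]}
```

Operations on the critical path have zero mobility and are scheduled first, so adding resource units shortens the schedule without re-deriving priorities.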
A Modularity Degree Based Heuristic Community Detection Algorithm
Directory of Open Access Journals (Sweden)
Dongming Chen
2014-01-01
Full Text Available A community in a complex network can be seen as a subgroup of nodes that are densely connected. Discovery of community structure is a basic research problem with applications in various areas, such as biology, computer science, and sociology. Existing community detection methods usually try to expand or collapse node partitions in order to optimize a given quality function. These optimization-function-based methods share the same drawback of inefficiency. Here we propose a heuristic algorithm (the MDBH algorithm) based on network structure which employs modularity degree as a measure function. Experiments on both synthetic benchmarks and real-world networks show that our algorithm achieves competitive accuracy with previous modularity optimization methods while having lower computational complexity. Furthermore, due to the use of modularity degree, our algorithm naturally improves on the resolution limit in community detection.
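The modularity-degree measure of the paper is derived from Newman's standard modularity, which for a partition compares each community's internal edge fraction with the expected fraction under random wiring. A minimal sketch of that base quantity, on an invented two-triangle graph:

```python
def modularity(edges, community):
    """Newman modularity Q of a partition; edges are undirected pairs,
    community maps node -> community label."""
    m = len(edges)
    internal, degree_sum = {}, {}
    for u, v in edges:
        for node in (u, v):
            c = community[node]
            degree_sum[c] = degree_sum.get(c, 0) + 1
        if community[u] == community[v]:
            c = community[u]
            internal[c] = internal.get(c, 0) + 1
    return sum(internal.get(c, 0) / m - (degree_sum[c] / (2 * m)) ** 2
               for c in degree_sum)

# two triangles joined by one bridge edge
edges = [(1, 2), (1, 3), (2, 3), (4, 5), (4, 6), (5, 6), (3, 4)]
part = {1: "A", 2: "A", 3: "A", 4: "B", 5: "B", 6: "B"}
q = modularity(edges, part)
```

Splitting the graph at the bridge yields a positive Q, while merging everything into one community gives Q = 0, which is why greedy heuristics use the change in this quantity to decide merges.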
A Genetic Algorithm-based Heuristic for Part-Feeding Mobile Robot Scheduling Problem
DEFF Research Database (Denmark)
Dang, Vinh Quang; Nielsen, Izabela Ewa; Bocewicz, Grzegorz
2012-01-01
This study deals with the problem of sequencing the feeding tasks of a single mobile robot with a manipulation arm which is able to provide parts or components for the feeders of machines in a manufacturing cell. The mobile robot has to be scheduled in order to keep machines within the cell produci....... A genetic algorithm-based heuristic is developed to find a near-optimal solution for the problem. A case study is implemented at an impeller production line in a factory to demonstrate the results of the proposed approach....
Wahid, Juliana; Hussin, Naimah Mohd
2016-08-01
The construction of a population of initial solutions is a crucial task in population-based metaheuristic approaches for solving the curriculum-based university course timetabling problem, because it can affect the convergence speed and also the quality of the final solution. This paper presents an exploration of combinations of graph heuristics in the construction approach for the curriculum-based course timetabling problem to produce a population of initial solutions. The graph heuristics were set as single heuristics and as combinations of two heuristics. In addition, several ways of assigning courses to rooms and timeslots are implemented. All heuristic settings are then tested on the same curriculum-based course timetabling problem instances and compared with each other in terms of the number of initial solutions produced. The results show that the combination of saturation degree followed by the largest degree heuristic produces the largest population of initial solutions. The results from this study can be used in the improvement phase of algorithms that use a population of initial solutions.
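The winning combination (saturation degree, ties broken by largest degree) is essentially the DSatur ordering from graph coloring. A minimal sketch on an invented conflict graph, where courses that share students cannot share a timeslot:

```python
def order_courses(conflicts, n_slots):
    """Greedy timeslot assignment: repeatedly pick the unassigned course
    with the fewest remaining feasible timeslots (saturation degree),
    breaking ties by the most conflicts (largest degree)."""
    courses = list(conflicts)
    assigned = {}

    def feasible(c):
        used = {assigned[n] for n in conflicts[c] if n in assigned}
        return [s for s in range(n_slots) if s not in used]

    while len(assigned) < len(courses):
        pending = [c for c in courses if c not in assigned]
        c = min(pending, key=lambda c: (len(feasible(c)), -len(conflicts[c])))
        slots = feasible(c)
        if not slots:
            raise ValueError(f"no feasible timeslot for {c}")
        assigned[c] = slots[0]
    return assigned

# three mutually conflicting courses need three distinct timeslots
tri = {"A": {"B", "C"}, "B": {"A", "C"}, "C": {"A", "B"}}
```

Placing the most constrained course first is what lets the constructor reach a complete, feasible assignment more often, and hence produce more members for the initial population.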
Using Heuristic Task Analysis to Create Web-Based Instructional Design Theory
Fiester, Herbert R.
2010-01-01
The first purpose of this study was to identify procedural and heuristic knowledge used when creating web-based instruction. The second purpose of this study was to develop suggestions for improving the Heuristic Task Analysis process, a technique for eliciting, analyzing, and representing expertise in cognitively complex tasks. Three expert…
Veermans, K.H.; Jong, de T.; Joolingen, van W.R.; Mason, L.; Andreuzza, S.; Arfè, B.; Favero, del L.
2003-01-01
Learners are often reported to experience difficulties with simulation-based discovery learning. Heuristics for discovery learning (rules of thumb that guide decision-making) can help learners to overcome these difficulties. In addition, the heuristics themselves are open for transfer. One way to in
New Meta-Heuristic for Combinatorial Optimization Problems: Intersection Based Scaling
Institute of Scientific and Technical Information of China (English)
Peng Zou; Zhi Zhou; Ying-Yu Wan; Guo-Liang Chen; Jun Gu
2004-01-01
Combinatorial optimization problems are found in many application fields, such as computer science, engineering, and economics. In this paper, a new efficient meta-heuristic, Intersection-Based Scaling (IBS), is proposed which can be applied to combinatorial optimization problems. The main idea of IBS is to scale the size of the instance based on the intersection of some local optima, and to simplify the search space by extracting the intersection from the instance, which makes the search more efficient. The combination of IBS with local search heuristics for different combinatorial optimization problems such as the Traveling Salesman Problem (TSP) and the Graph Partitioning Problem (GPP) is studied, and comparisons are made with some of the best heuristic and meta-heuristic algorithms. It is found that IBS significantly improves the performance of existing local search heuristics and significantly outperforms the best known algorithms.
Archer, Charles J.; Blocksome, Michael A.; Heidelberger, Philip; Kumar, Sameer; Parker, Jeffrey J.; Ratterman, Joseph D.
2011-06-07
Methods, compute nodes, and computer program products are provided for heuristic status polling of a component in a computing system. Embodiments include receiving, by a polling module from a requesting application, a status request requesting status of a component; determining, by the polling module, whether an activity history for the component satisfies heuristic polling criteria; polling, by the polling module, the component for status if the activity history for the component satisfies the heuristic polling criteria; and not polling, by the polling module, the component for status if the activity history for the component does not satisfy the heuristic polling criteria.
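The decision logic described above can be sketched in a few lines. This is an illustrative sketch only; the class, criterion, and parameter names are invented, not taken from the patent text:

```python
class HeuristicPoller:
    """Poll a component for status only when its recent activity
    history satisfies a heuristic criterion."""

    def __init__(self, min_events=3, window=60.0):
        self.min_events = min_events   # how many recent events justify a poll
        self.window = window           # seconds of history considered
        self.history = {}              # component -> timestamps of activity

    def record_activity(self, component, t):
        self.history.setdefault(component, []).append(t)

    def should_poll(self, component, now):
        recent = [t for t in self.history.get(component, [])
                  if now - t <= self.window]
        return len(recent) >= self.min_events

poller = HeuristicPoller()
for t in (0.0, 10.0, 20.0):
    poller.record_activity("node7", t)
```

The point of gating polls on activity history is to avoid wasting cycles querying idle components whose status is unlikely to have changed.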
Petri nets SM-cover based on heuristic coloring algorithm
Tkacz, Jacek; Doligalski, Michał
2015-09-01
In this paper, a heuristic coloring algorithm for interpreted Petri nets is presented. Coloring is used to determine the State Machine (SM) subnets. The algorithm reduces the Petri net in order to reduce the computational complexity and finds one of its possible State Machine covers. The proposed algorithm uses elements of the interpretation of Petri nets. The obtained result may not be the best, but it is sufficient for use in rapid prototyping of logic controllers. The SM-cover found will also be used in the development of algorithms for decomposition, and in the modular synthesis and implementation of parallel logic controllers. The correctness of the developed heuristic algorithm was verified using the Gentzen formal reasoning system.
Directory of Open Access Journals (Sweden)
Muhammad Farhan Ausaf
2015-12-01
Full Text Available Process planning and scheduling are two important components of a manufacturing setup. It is important to integrate them to achieve better global optimality and improved system performance. Numerous algorithm-based approaches exist to find optimal solutions for the integrated process planning and scheduling (IPPS) problem. Most of these approaches try to apply existing meta-heuristic algorithms to the IPPS problem. Although these approaches have been shown to be effective in optimizing the IPPS problem, there is still room for improvement in terms of solution quality and algorithm efficiency, especially for more complicated problems. Dispatching rules have been successfully utilized for solving complicated scheduling problems, but they haven't been considered extensively for the IPPS problem. This approach incorporates dispatching rules with the concept of prioritizing jobs, in an algorithm called the priority-based heuristic algorithm (PBHA). PBHA tries to establish job and machine priorities for selecting operations. Priority assignment and a set of dispatching rules are used simultaneously to generate both the process plans and the schedules for all jobs and machines. The algorithm was tested on a series of benchmark problems. The proposed algorithm was able to achieve superior results for most complex problems presented in recent literature while utilizing fewer computational resources.
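A dispatching rule reduces a scheduling decision to a simple priority comparison. The single-machine sketch below contrasts the classic SPT (shortest processing time) rule with FIFO on invented job data; it illustrates the rule mechanism only, not the PBHA algorithm itself:

```python
def dispatch(jobs, rule):
    """Sequence jobs on one machine with a dispatching rule and
    return (order, total flow time). jobs: name -> processing time."""
    if rule == "SPT":                      # shortest processing time first
        order = sorted(jobs, key=jobs.get)
    elif rule == "FIFO":                   # arrival (insertion) order
        order = list(jobs)
    else:
        raise ValueError(rule)
    t, flow = 0, 0
    for j in order:
        t += jobs[j]
        flow += t                          # completion time of job j
    return order, flow

jobs = {"J1": 5, "J2": 1, "J3": 3}
```

SPT provably minimizes total flow time on a single machine, which is why rule sets like this are cheap yet effective building blocks for integrated planning-and-scheduling heuristics.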
A personification heuristic Genetic Algorithm for Digital Microfluidics-based Biochips Placement
Directory of Open Access Journals (Sweden)
Jingsong Yang
2013-06-01
Full Text Available A personification heuristic genetic algorithm is established for the placement of digital microfluidics-based biochips, in which the personification heuristic algorithm is used to control the packing process, while the genetic algorithm is designed for multi-objective optimization of the placement results. As an example, the process of microfluidic module physical placement in multiplexed in-vitro diagnostics on human physiological fluids is simulated. The experimental results show that the personification heuristic genetic algorithm can achieve better results in multi-objective optimization compared to the parallel recombinative simulated annealing algorithm.
Directory of Open Access Journals (Sweden)
Stanimirović Ivan
2009-01-01
Full Text Available We introduce a heuristic method for the single-resource constrained project scheduling problem, based on the dynamic programming solution of the knapsack problem. This method schedules projects with one type of resource in the non-preemptive case: once started, an activity is not interrupted and runs to completion. We compare the implementation of this method with the well-known heuristic scheduling method called Minimum Slack First (also known as the Gray-Kidd algorithm), as well as with Microsoft Project.
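The Minimum Slack First baseline mentioned above can be sketched as a serial schedule generator: compute slack from a resource-free CPM pass, then repeatedly schedule the eligible activity with the least slack, delaying it until the resource is free. The activity data below are invented for illustration:

```python
def min_slack_schedule(acts, deps, capacity):
    """Serial schedule generation with the Minimum Slack First rule.
    acts: name -> (duration, resource demand), in topological order."""
    # CPM forward/backward pass (ignoring the resource) to get slack
    es = {}
    for a in acts:
        es[a] = max((es[p] + acts[p][0] for p in deps[a]), default=0)
    horizon = max(es[a] + acts[a][0] for a in acts)
    ls = {}
    for a in reversed(list(acts)):
        succs = [s for s in acts if a in deps[s]]
        ls[a] = min((ls[s] - acts[a][0] for s in succs),
                    default=horizon - acts[a][0])
    slack = {a: ls[a] - es[a] for a in acts}
    # greedy serial scheduling under the resource capacity
    start, finish, usage = {}, {}, {}
    while len(start) < len(acts):
        eligible = [a for a in acts if a not in start
                    and all(p in finish for p in deps[a])]
        a = min(eligible, key=lambda a: slack[a])   # minimum slack first
        dur, dem = acts[a]
        t = max((finish[p] for p in deps[a]), default=0)
        while any(usage.get(u, 0) + dem > capacity for u in range(t, t + dur)):
            t += 1
        for u in range(t, t + dur):
            usage[u] = usage.get(u, 0) + dem
        start[a], finish[a] = t, t + dur
    return start

acts = {"A": (2, 1), "B": (3, 1), "C": (2, 1)}
deps = {"A": [], "B": [], "C": ["A"]}
start = min_slack_schedule(acts, deps, capacity=1)
```

Critical-path activities have zero slack and are placed first, which is the behavior the knapsack-based heuristic of the paper is compared against.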
A marketing science perspective on recognition-based heuristics (and the fast-and-frugal paradigm)
John Hauser
2011-01-01
Marketing science seeks to prescribe better marketing strategies (advertising, product development, pricing, etc.). To do so we rely on models of consumer decisions grounded in empirical observations. Field experience suggests that recognition-based heuristics help consumers to choose which brands to consider and purchase in frequently-purchased categories, but other heuristics are more relevant in durable-goods categories. Screening with recognition is a rational screening rule when advertis...
Salcedo-Sanz, S.
2016-10-01
Meta-heuristic algorithms are problem-solving methods which try to find good-enough solutions to very hard optimization problems, at a reasonable computation time, where classical approaches fail or cannot even be applied. Many existing meta-heuristic approaches are nature-inspired techniques which work by simulating or modeling different natural processes in a computer. Historically, many of the most successful meta-heuristic approaches have had a biological inspiration, such as the evolutionary computation or swarm intelligence paradigms, but in the last few years new approaches based on modeling nonlinear physics processes have been proposed and applied with success. Nonlinear physics processes, modeled as optimization algorithms, are able to produce completely new search procedures, with extremely effective exploration capabilities in many cases, which are able to outperform existing optimization approaches. In this paper we review the most important optimization algorithms based on nonlinear physics, how they have been constructed from specific modeling of real phenomena, and their novelty in comparison with alternative existing algorithms for optimization. We first review important concepts on optimization problems, search spaces, and problem difficulty. Then, the usefulness of heuristic and meta-heuristic approaches for facing hard optimization problems is introduced, and some of the main existing classical versions of these algorithms are reviewed. The mathematical framework of different nonlinear physics processes is then introduced as a preparatory step to reviewing in detail the most important meta-heuristics based on them. A discussion of the novelty of these approaches, their main computational implementation and design issues, and the evaluation of a novel meta-heuristic based on Strange Attractors mutation will be carried out to complete the review of these techniques. We also describe some of the most important application areas, in
PROBLEM SOLVING IN SCHOOL MATHEMATICS BASED ON HEURISTIC STRATEGIES
Novotná, Jarmila; Eisenmann, Petr; Přibyl, Jiří; Ondrušová, Jiřina; Břehovský, Jiří
2014-01-01
The paper describes one of the ways of developing pupils' creative approach to problem solving. The described experiment is part of a longitudinal research project focusing on improving the culture of problem solving by pupils. It deals with solving problems using the following heuristic strategies: Analogy, Guess – check – revise, Systematic experimentation, Problem reformulation, Solution drawing, Way back, and Use of graphs of functions. Most attention is paid to the question whether short-te...
Structure-Based Local Search Heuristics for Circuit-Level Boolean Satisfiability
Belov, Anton
2011-01-01
This work focuses on improving state-of-the-art in stochastic local search (SLS) for solving Boolean satisfiability (SAT) instances arising from real-world industrial SAT application domains. The recently introduced SLS method CRSat has been shown to noticeably improve on previously suggested SLS techniques in solving such real-world instances by combining justification-based local search with limited Boolean constraint propagation on the non-clausal formula representation form of Boolean circuits. In this work, we study possibilities of further improving the performance of CRSat by exploiting circuit-level structural knowledge for developing new search heuristics for CRSat. To this end, we introduce and experimentally evaluate a variety of search heuristics, many of which are motivated by circuit-level heuristics originally developed in completely different contexts, e.g., for electronic design automation applications. To the best of our knowledge, most of the heuristics are novel in the context of SLS for S...
Heuristic for Critical Machine Based Lot Streaming for Two-Stage Hybrid Production Environment
Vivek, P.; Saravanan, R.; Chandrasekaran, M.; Pugazhenthi, R.
2017-03-01
Lot streaming in a hybrid flowshop (HFS) is encountered in many real-world problems. This paper deals with a heuristic approach for lot streaming based on critical machine considerations for a two-stage hybrid flowshop. The first stage has two identical parallel machines, and the second stage has only one machine; the second-stage machine is considered critical for valid reasons, and this kind of problem is known to be NP-hard. A mathematical model was developed for the selected problem. The simulation modelling and analysis were carried out in Extend V6 software. A heuristic was developed for obtaining an optimal lot streaming schedule. Eleven cases of lot streaming were considered. The proposed heuristic was verified and validated by real-time simulation experiments. All possible lot streaming strategies, and all possible sequences under each strategy, were simulated and examined. The heuristic yielded the optimal schedule consistently in all eleven cases. A procedure for identifying the best lot streaming strategy was also suggested.
Hyper heuristic based on great deluge and its variants for exam timetabling problem
Sin, Ei Shwe
2012-01-01
Today, university timetabling problems occur annually, and they are often hard and time consuming to solve. This paper describes a Hyper Heuristic (HH) method based on the Great Deluge (GD) algorithm and its variants for solving large, highly constrained timetabling problems from different domains. Generally, a hyper heuristic framework has two main stages: heuristic selection and move acceptance. This paper emphasizes the latter stage in developing the HH framework. The main contribution of this paper is that Great Deluge and its variants, Flex Deluge (FD), Non-linear Great Deluge (NLGD), and Extended Great Deluge (EGD), are used as move acceptance methods in HH, combined with Reinforcement Learning (RL). These HH methods are tested on exam benchmark timetabling problems, and the best results and a comparative analysis are reported.
Layer-layout-based heuristics for loading homogeneous items into a single container
Institute of Scientific and Technical Information of China (English)
(author not listed)
2007-01-01
The container loading problem (CLP) is a well-known NP-hard problem. Due to its computational complexity, heuristics are an often-sought approach. This article proposes two heuristics to pack homogeneous rectangular boxes into a single container. Both algorithms adopt the concept of building layers on one face of the container, but the first heuristic determines the layer face once and for all, while the second treats the remaining container space as a reduced-size container after each layer is loaded and hence selects the layer face dynamically. To handle the layout design problem at the layer level, a block-based 2D packing procedure is also developed. Numerical studies demonstrate the efficiency of the heuristics.
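The fixed-layer-face idea of the first heuristic can be sketched with a simple counting bound: pick the better of the two planar orientations for a layer on the container floor, then stack identical layers along the height. This is a simplified sketch with invented dimensions; the paper's block-based 2D procedure can mix orientations within a layer and do better:

```python
def boxes_per_layer(L, W, l, w):
    """Best of the two planar orientations for a single uniform block
    of l-by-w boxes on an L-by-W face (a simple lower bound)."""
    return max((L // l) * (W // w), (L // w) * (W // l))

def layered_count(container, box):
    """Fixed layer face: layers on the container floor, stacked
    along the height."""
    (L, W, H), (l, w, h) = container, box
    return boxes_per_layer(L, W, l, w) * (H // h)

count = layered_count((10, 8, 6), (3, 2, 2))
```

The second heuristic would instead recompute the best layer face on the shrunken residual container after each layer, trading a little computation for denser packings.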
Takahashi, Kazuhiko; Naemura, Masahide
2005-12-01
This paper proposes a human body posture estimation method based on analysis of the human silhouette and a Kalman filter. The proposed method is based on both a heuristic extraction method for estimating the significant points of the human body and contour analysis of the human silhouette. The 2D coordinates of the body's significant points, such as the top of the head and the tips of the feet, are located by applying the heuristic extraction method to the human silhouette; those of the tips of the hands are obtained from the result of the contour analysis; and the joints of the elbows and knees are estimated by introducing some heuristic rules on the contour image of the human silhouette. The estimated results are optimized and tracked using a Kalman filter. The proposed estimation method is implemented on a personal computer and runs in real time. Experimental results show both the feasibility and the effectiveness of the proposed method for estimating human body postures.
A set-covering based heuristic algorithm for the periodic vehicle routing problem.
Cacchiani, V; Hemmelmayr, V C; Tricoire, F
2014-01-30
We present a hybrid optimization algorithm for mixed-integer linear programming, embedding both heuristic and exact components. In order to validate it we use the periodic vehicle routing problem (PVRP) as a case study. This problem consists of determining a set of minimum cost routes for each day of a given planning horizon, with the constraints that each customer must be visited a required number of times (chosen among a set of valid day combinations), must receive the required quantity of product at each visit, and that the number of routes per day (each respecting the capacity of the vehicle) does not exceed the total number of available vehicles. This is a generalization of the well-known vehicle routing problem (VRP). Our algorithm is based on the linear programming (LP) relaxation of a set-covering-like integer linear programming formulation of the problem, with additional constraints. The LP relaxation is solved by column generation, where columns are generated heuristically by an iterated local search algorithm. The whole solution method takes advantage of the LP solution and applies techniques of fixing and releasing of the columns as a local search, making use of a tabu list to avoid cycling. We show the results of the proposed algorithm on benchmark instances from the literature and compare them to the state-of-the-art algorithms, showing the effectiveness of our approach in producing good quality solutions. In addition, we report results on realistic instances of the PVRP introduced in Pacheco et al. (2011) [24] and on benchmark instances of the periodic traveling salesman problem (PTSP), showing the efficacy of the proposed algorithm on these as well. Finally, we report the new best known solutions found for all the tested problems.
A Hop Count Based Heuristic Routing Protocol for Mobile Delay Tolerant Networks
Wei, Changjiang; Dai, Chenqu; Xu, Jixing; Hu, Lejuan
2014-01-01
Routing in delay tolerant networks (DTNs) is a challenge since it must handle network partitioning, long delays, and dynamic topology. Meanwhile, routing protocols of traditional mobile ad hoc networks (MANETs) cannot work well due to the failure of their assumption that most network connections are available. In this paper, we propose a hop count based heuristic routing protocol that utilizes the information carried by the peripatetic packets in the network. A heuristic function is defined to help in making the routing decision. We formally define a custom operation for square matrices so as to transform the heuristic value calculation into matrix manipulation. Finally, the performance of our proposed algorithm is evaluated by simulation; the results show the advantage of such a self-adaptive routing protocol in the diverse circumstances of DTNs. PMID:25110736
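The abstract does not specify the custom matrix operation itself. A common way to cast hop-count computation as matrix manipulation is the min-plus (tropical) product; the sketch below uses that interpretation as an assumption, and the `hop_counts` helper is hypothetical, not the paper's protocol:

```python
INF = float("inf")

def min_plus(a, b):
    """Min-plus (tropical) product of two square matrices:
    c[i][j] = min over k of a[i][k] + b[k][j]."""
    n = len(a)
    return [[min(a[i][k] + b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def hop_counts(adj):
    """Repeatedly apply the min-plus product to a hop-count adjacency
    matrix (0 on the diagonal, 1 for a direct link, INF otherwise)
    until it stabilises, yielding minimum hop counts between all pairs."""
    dist = adj
    for _ in range(len(adj)):
        nxt = min_plus(dist, adj)
        if nxt == dist:
            break
        dist = nxt
    return dist
```

On a four-node path network 0-1-2-3, repeated min-plus products converge to the all-pairs hop counts, e.g. three hops between the end nodes.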
Institute of Scientific and Technical Information of China (English)
Bo HUANG; Yamin SUN
2005-01-01
This paper proposes and evaluates two improved Petri net (PN)-based hybrid search strategies and their applications to flexible manufacturing system (FMS) scheduling. The algorithms proposed in some previous papers, which combine PN simulation capabilities with A* heuristic search within the PN reachability graph, may not find an optimum solution even with an admissible heuristic function. To remedy these defects, an improved heuristic search strategy is proposed, which adopts a different method for selecting the promising markings and preserves the admissibility of the algorithm. To speed up the search process, another algorithm is also proposed which invokes faster termination conditions and still guarantees that the solution found is optimum. The scheduling results of our algorithms and the previous methods are compared on a simple FMS. They are also applied and evaluated on a set of randomly generated FMSs with such characteristics as multiple resources and alternative routes.
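For background, a minimal generic A* routine illustrates the role of the admissible heuristic these strategies rely on; the graph interface here is a simplification for illustration, not the papers' Petri net reachability model:

```python
import heapq

def a_star(start, goal, neighbors, h):
    """Generic A* search. `neighbors(n)` yields (successor, edge_cost)
    pairs; `h` must be admissible (never overestimate the remaining
    cost) for the returned path cost to be optimal."""
    open_set = [(h(start), 0, start, [start])]
    best_g = {start: 0}
    while open_set:
        f, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return g, path
        for succ, cost in neighbors(node):
            ng = g + cost
            if ng < best_g.get(succ, float("inf")):
                best_g[succ] = ng
                heapq.heappush(open_set, (ng + h(succ), ng, succ, path + [succ]))
    return None
```

With the trivial (and always admissible) heuristic h = 0, A* reduces to Dijkstra's algorithm; a tighter admissible heuristic prunes more of the reachability graph while keeping the solution optimal.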
A Hop Count Based Heuristic Routing Protocol for Mobile Delay Tolerant Networks
Directory of Open Access Journals (Sweden)
Lei You
2014-01-01
Full Text Available Routing in delay tolerant networks (DTNs) is a challenge since it must handle network partitioning, long delays, and dynamic topology. Meanwhile, routing protocols of traditional mobile ad hoc networks (MANETs) cannot work well due to the failure of their assumption that most network connections are available. In this paper, we propose a hop count based heuristic routing protocol that utilizes the information carried by the peripatetic packets in the network. A heuristic function is defined to help in making the routing decision. We formally define a custom operation for square matrices so as to transform the heuristic value calculation into matrix manipulation. Finally, the performance of our proposed algorithm is evaluated by simulation; the results show the advantage of such a self-adaptive routing protocol in the diverse circumstances of DTNs.
A column generation-based heuristic for rostering with work patterns
DEFF Research Database (Denmark)
Lusby, Richard Martin; Dohn, Anders Høeg; Range, Troels Martin
2012-01-01
This paper addresses the Ground Crew Rostering Problem with Work Patterns, an important manpower planning problem arising in the ground operations of airline companies. We present a cutting stock-based integer programming formulation of the problem and describe a powerful heuristic decomposition...
Directory of Open Access Journals (Sweden)
Mahdi Maktabdar Oghaz
Full Text Available Color is one of the most prominent features of an image and is used in many skin and face detection applications. Color space transformation is widely used by researchers to improve face and skin detection performance. Despite the substantial research efforts in this area, choosing a proper color space in terms of skin and face classification performance that can address issues like illumination variations, various camera characteristics, and diversity in skin color tones has remained an open issue. This research proposes a new three-dimensional hybrid color space termed SKN by employing the Genetic Algorithm heuristic and Principal Component Analysis to find the optimal representation of human skin color in over seventeen existing color spaces. The Genetic Algorithm heuristic is used to find the optimal color component combination in terms of skin detection accuracy, while Principal Component Analysis projects the optimal Genetic Algorithm solution onto a lower-dimensional space. Pixel-wise skin detection was used to evaluate the performance of the proposed color space. We employed four classifiers, including Random Forest, Naïve Bayes, Support Vector Machine and Multilayer Perceptron, to generate the human skin color predictive model. The proposed color space was compared to some existing color spaces and shows superior results in terms of pixel-wise skin detection accuracy. Experimental results show that, using the Random Forest classifier, the proposed SKN color space obtained an average F-score and True Positive Rate of 0.953 and a False Positive Rate of 0.0482, outperforming the existing color spaces in terms of pixel-wise skin detection accuracy. The results also indicate that, among the classifiers used in this study, Random Forest is the most suitable classifier for pixel-wise skin detection applications.
Maktabdar Oghaz, Mahdi; Maarof, Mohd Aizaini; Zainal, Anazida; Rohani, Mohd Foad; Yaghoubyan, S Hadi
2015-01-01
Color is one of the most prominent features of an image and is used in many skin and face detection applications. Color space transformation is widely used by researchers to improve face and skin detection performance. Despite the substantial research efforts in this area, choosing a proper color space in terms of skin and face classification performance that can address issues like illumination variations, various camera characteristics, and diversity in skin color tones has remained an open issue. This research proposes a new three-dimensional hybrid color space termed SKN by employing the Genetic Algorithm heuristic and Principal Component Analysis to find the optimal representation of human skin color in over seventeen existing color spaces. The Genetic Algorithm heuristic is used to find the optimal color component combination in terms of skin detection accuracy, while Principal Component Analysis projects the optimal Genetic Algorithm solution onto a lower-dimensional space. Pixel-wise skin detection was used to evaluate the performance of the proposed color space. We employed four classifiers, including Random Forest, Naïve Bayes, Support Vector Machine and Multilayer Perceptron, to generate the human skin color predictive model. The proposed color space was compared to some existing color spaces and shows superior results in terms of pixel-wise skin detection accuracy. Experimental results show that, using the Random Forest classifier, the proposed SKN color space obtained an average F-score and True Positive Rate of 0.953 and a False Positive Rate of 0.0482, outperforming the existing color spaces in terms of pixel-wise skin detection accuracy. The results also indicate that, among the classifiers used in this study, Random Forest is the most suitable classifier for pixel-wise skin detection applications.
Pasam, Gopi Krishna; Manohar, T. Gowri
2016-09-01
Determination of available transfer capability (ATC) requires the use of experience, intuition and exact judgment in order to meet several significant aspects in the deregulated environment. Based on these points, this paper proposes two heuristic approaches to compute ATC. The first proposed heuristic algorithm integrates five methods, known as continuation repeated power flow, repeated optimal power flow, radial basis function neural network, back propagation neural network and adaptive neuro fuzzy inference system, to obtain ATC. The second proposed heuristic model is used to obtain multiple ATC values. Out of these, a specific ATC value is selected based on a number of social, economic and deregulated-environment constraints, and related to specific applications like optimization, on-line monitoring, and ATC forecasting, known as multi-objective decision based optimal ATC. The validity of the results obtained through these proposed methods is rigorously verified on various buses of the IEEE 24-bus reliable test system. The results and conclusions presented in this paper are very useful for the planning, operation and maintenance of reliable power in any power system, and for its monitoring in an on-line deregulated power system environment. In this way, the proposed heuristic methods contribute a practical approach to assessing multiple-objective ATC using integrated methods.
Qin, Junping; Sun, Shiwen; Deng, Qingxu; Liu, Limin; Tian, Yonghong
2017-06-02
Object tracking and detection is one of the most significant research areas for wireless sensor networks. Existing indoor trajectory tracking schemes in wireless sensor networks are based on continuous localization and moving object data mining. Indoor trajectory tracking based on the received signal strength indicator (RSSI) has received increased attention because it has low cost and requires no special infrastructure. However, RSSI tracking introduces uncertainty because of the inaccuracies of measurement instruments and the irregularities (unstable, multipath, diffraction) of wireless signal transmissions in indoor environments. Heuristic information includes some key factors for trajectory tracking procedures. This paper proposes a novel trajectory tracking scheme based on Delaunay triangulation and heuristic information (TTDH). In this scheme, the entire field is divided into a series of triangular regions. The common side of adjacent triangular regions is regarded as a regional boundary. Our scheme detects heuristic information related to a moving object's trajectory, including boundaries and triangular regions. Then, the trajectory is formed by means of a dynamic time-warping position-fingerprint-matching algorithm with heuristic information constraints. Field experiments show that the average error distance of our scheme is less than 1.5 m, and that error does not accumulate among the regions.
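The fingerprint-matching step described above relies on dynamic time warping; a minimal textbook DTW distance on one-dimensional sequences (without the paper's heuristic boundary constraints, which are not reproducible from the abstract) can be sketched as:

```python
def dtw(a, b):
    """Classic dynamic-time-warping distance between two sequences,
    using absolute difference as the local cost and allowing
    match, insertion and deletion steps."""
    n, m = len(a), len(b)
    INF = float("inf")
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # deletion
                                 d[i][j - 1],      # insertion
                                 d[i - 1][j - 1])  # match
    return d[n][m]
```

Because DTW warps the time axis, a sequence with a repeated sample still matches its unrepeated counterpart at zero cost, which is what makes it suitable for comparing RSSI trajectories sampled at uneven speeds.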
Tandiseru, Selvi Rajuaty
2015-01-01
The problem in this research is the lack of creative thinking skills among students. One of the learning models expected to enhance students' creative thinking skills is the local culture-based mathematical heuristic-KR learning model (LC-BMHLM). Heuristic-KR is a learning model introduced by Krulik and Rudnick (1995) that is the…
A Heuristic Algorithm for Task Scheduling Based on Mean Load on Grid
Institute of Scientific and Technical Information of China (English)
Li-Na Ni; Jin-Quan Zhang; Chun-Gang Yan; Chang-Jun Jiang
2006-01-01
Efficient task scheduling is critical to achieving high performance in a grid computing environment. In this paper, task scheduling on a grid is studied as an optimization problem. A heuristic task scheduling algorithm that satisfies resource load balancing in a grid environment is presented. The algorithm schedules tasks by employing the mean load, based on task predictive execution time, as heuristic information to obtain an initial scheduling strategy. An optimal scheduling strategy is then achieved by selecting two machines that satisfy a condition and adjusting their loads by reassigning their tasks under the heuristic of their mean load. Methods of selecting machines and tasks are given to increase the throughput of the system and reduce the total waiting time. The efficiency of the algorithm is analyzed and the performance of the proposed algorithm is evaluated via extensive simulation experiments. Experimental results show that the heuristic algorithm performs effectively, ensuring high load balancing and achieving an optimal scheduling strategy almost all the time. Furthermore, the results show that our algorithm is highly efficient in terms of time complexity.
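The paper's full two-phase method cannot be reconstructed from the abstract; as a simplified stand-in, the classic LPT (longest processing time) list-scheduling rule captures the core idea of assigning tasks by predicted execution time to the currently least-loaded machine:

```python
def schedule(task_times, n_machines):
    """Greedy load balancing: sort tasks by descending predicted
    execution time and assign each one to the machine with the
    smallest current load (LPT rule)."""
    loads = [0.0] * n_machines
    assignment = [[] for _ in range(n_machines)]
    for t in sorted(task_times, reverse=True):
        i = loads.index(min(loads))  # least-loaded machine
        loads[i] += t
        assignment[i].append(t)
    return loads, assignment
```

A second improvement phase, as in the paper, would then pick pairs of machines whose loads straddle the mean and reassign tasks between them; that step is omitted here.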
Diez, Luis E; Bahillo, Alfonso; Bataineh, Safaa; Masegosa, Antonio D; Perallos, Asier
2016-08-01
Location based services can improve the quality of patient care and increase the efficiency of healthcare systems. Among the different technologies that provide indoor positioning, inertial sensor based pedestrian dead-reckoning (PDR) is one of the more cost-effective solutions, but its performance is limited by drift problems. Regarding the heading drift, some heuristics make use of a building's dominant directions in order to reduce this problem. In this paper, we enhance the method known as improved heuristic drift elimination (iHDE) for implementation in a Step-and-Heading (SHS) based PDR system, which allows the inertial sensors to be placed in almost any location on the user's body. In particular, wrist-worn sensors are used. Tests on synthetically generated and real data show that the iHDE method can be used in an SHS-based PDR system without losing its heading drift reduction capability.
Directory of Open Access Journals (Sweden)
Gregorius Satia Budhi
2002-01-01
Full Text Available The application of the Activity Based Costing (ABC) approach to selecting the set of machines used in production in a Flexible Manufacturing System (FMS), based on technical and economic criteria, can help producers design an FMS with minimum production cost in mind. Heuristic search, on the other hand, is known for its short search times. In this research, a heuristic algorithm was designed that uses the ABC approach as the weight in the search for a solution, shortening equipment selection during the FMS design/redesign process to less than exponential time. This speed-up is useful because a faster design/redesign process improves the flexibility of the variety of parts that can be processed. Theoretical and empirical analysis of the heuristic algorithm shows that the time needed to find an appropriate set of equipment is not excessive, so the designed heuristic algorithm can be implemented in practice. Comparing the empirical results of the heuristic algorithm with an exhaustive algorithm also indicates that the heuristic algorithm using the ABC method as the search weight can optimize the FMS equipment selection problem under economic criteria as well.
Impact of Blended Learning Environments Based on Algo-Heuristic Theory on Some Variables
Directory of Open Access Journals (Sweden)
Mustafa AYGÜN
2012-12-01
Full Text Available In this study, the effects of Algo-Heuristic Theory based blended learning environments on students' computer skills in preparing presentations, their attitudes towards computers, and their motivation regarding the information technology course were investigated. The research sample was composed of 71 students. A quasi-experimental pre-test/post-test control group design was used. Research data were collected using an Academic Achievement Test, the Computer Attitude Scale for Primary School Students, and the Motivation Scale for the Information Technology Course. A one-way ANOVA conducted on the collected data revealed that the achievement and motivation levels of the students who studied in an Algo-Heuristic Theory based blended learning environment in the information technology course increased significantly.
A variable neighborhood descent based heuristic to solve the capacitated location-routing problem
Directory of Open Access Journals (Sweden)
M. S. Jabal-Ameli
2011-01-01
Full Text Available The location-routing problem (LRP) is established as a new research area in the context of location analysis. The primary concern of LRP is locating facilities and routing vehicles among established facilities and existing demand points. In this work, we address the capacitated LRP, which arises in many practical applications within logistics and supply chain management. The objective is to minimize the overall system costs, which include the fixed costs of opening depots and using vehicles at each depot site, and the variable costs associated with delivery activities. A novel heuristic based on the variable neighborhood descent (VND) algorithm is proposed to solve the resulting problem. The computational study indicates that the proposed VND based heuristic is highly competitive with existing solution algorithms in terms of solution quality.
A Novel Heuristic Algorithm Based on Clark and Wright Algorithm for Green Vehicle Routing Problem
Mehdi Alinaghian; Zahra Kaviani; Siyavash Khaledan
2015-01-01
A significant portion of Gross Domestic Production (GDP) in any country belongs to the transportation system. Transportation equipment, on the other hand, is a major consumer of oil products. Many efforts have been directed at vehicles to cut down greenhouse gas (GHG) emissions. In this paper a novel heuristic algorithm based on the Clark and Wright Algorithm, called Green Clark and Wright (GCW), is presented for the vehicle routing problem with respect to fuel consumption. The objective function ...
A Heuristic Clustering Algorithm for Intrusion Detection Based on Information Entropy
Institute of Scientific and Technical Information of China (English)
Anonymous
2006-01-01
This paper studies the clustering problem for intrusion detection using the theory of information entropy. It shows that the clustering problem for exact intrusion detection based on information entropy is NP-complete, and therefore designs a heuristic algorithm to solve the clustering problem for intrusion detection. The algorithm has the characteristic of incremental development and can handle databases with large numbers of connection records from the Internet.
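The abstract leaves the algorithm's details open; one plausible sketch of incremental entropy-based clustering assigns each record to the cluster whose total entropy grows least, opening a new cluster when every possible increase exceeds a threshold. The threshold value and the per-attribute entropy definition below are assumptions, not the paper's formulation:

```python
import math
from collections import Counter

def entropy(records):
    """Total Shannon entropy of a cluster of categorical records,
    summed attribute by attribute."""
    if not records:
        return 0.0
    n, h = len(records), 0.0
    for attr in range(len(records[0])):
        for count in Counter(r[attr] for r in records).values():
            p = count / n
            h -= p * math.log2(p)
    return h

def cluster(records, threshold=1.0):
    """Incrementally place each record into the cluster whose entropy
    increases least; start a new cluster if every increase would
    exceed `threshold`."""
    clusters = []
    for r in records:
        best, best_inc = None, threshold
        for c in clusters:
            inc = entropy(c + [r]) - entropy(c)
            if inc < best_inc:
                best, best_inc = c, inc
        if best is None:
            clusters.append([r])
        else:
            best.append(r)
    return clusters
```

Homogeneous connection records keep cluster entropy low, while an anomalous record forces a large entropy increase and ends up isolated, which is the intuition behind entropy-based anomaly detection.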
Robotics Vision-based Heuristic Reasoning for Underwater Target Tracking and Navigation
Chua Kia; Mohd. Rizal Arshad
2005-01-01
This paper presents a robotics vision-based heuristic reasoning system for underwater target tracking and navigation. This system is introduced to improve the level of automation of underwater Remote Operated Vehicles (ROVs) operations. A prototype which combines computer vision with an underwater robotics system is successfully designed and developed to perform target tracking and intelligent navigation. This study focuses on developing image processing algorithms and fuzzy inference system ...
An LP-based heuristic for the fixed charge transportation problem
DEFF Research Database (Denmark)
Klose, Andreas
2007-01-01
The fixed charge transportation problem consists in finding a minimum cost network flow from a set of suppliers to a set of customers. Beside costs proportional to quantities transported, transportation costs also include a fixed charge. The paper describes a linear programming based heuristic approach for computing lower and upper bounds on the minimal cost. To this end, the LP relaxation is iteratively strengthened by means of adding cuts; in each iteration the current LP solution is then used to guide a local search heuristic. In addition to standard polyhedral cuts such as lifted cover inequalities and flow cover inequalities, the approach also employs Fenchel cuts that are based on embedded 0-1 single node flow sets. Computational results obtained for a set of standard test problem instances are reported.
Towards a Complexity Theory of Randomized Search Heuristics: Ranking-Based Black-Box Complexity
Doerr, Benjamin
2011-01-01
Randomized search heuristics are a broadly used class of general-purpose algorithms. Analyzing them via classical methods of theoretical computer science is a growing field. A big step forward would be a useful complexity theory for such algorithms. We enrich the two existing black-box complexity notions due to Wegener and other authors by the restriction that the algorithm may take into account not the actual objective values, but only the relative quality of the previously evaluated solutions. Many randomized search heuristics belong to this class of algorithms. We show that the new ranking-based model gives more realistic complexity estimates for some problems, while for others the low complexities of the previous models still hold.
A marketing science perspective on recognition-based heuristics (and the fast-and-frugal paradigm)
Directory of Open Access Journals (Sweden)
John Hauser
2011-07-01
Full Text Available Marketing science seeks to prescribe better marketing strategies (advertising, product development, pricing, etc.). To do so we rely on models of consumer decisions grounded in empirical observations. Field experience suggests that recognition-based heuristics help consumers to choose which brands to consider and purchase in frequently-purchased categories, but other heuristics are more relevant in durable-goods categories. Screening with recognition is a rational screening rule when advertising is a signal of product quality, when observing other consumers makes it easy to learn decision rules, and when firms react to engineering-design constraints by offering brands such that a high level on one product feature implies a low level on another product feature. Experience with applications and field experiments suggests four fruitful research topics: deciding how to decide (endogeneity), learning decision rules by self-reflection, risk reduction, and the difference between utility functions and decision rules. These challenges also pose methodological cautions.
Local search-based heuristics for the multiobjective multidimensional knapsack problem
Directory of Open Access Journals (Sweden)
Dalessandro Soares Vianna
2012-01-01
Full Text Available In real optimization problems it is generally desirable to optimize more than one performance criterion (or objective) at the same time. The goal of multiobjective combinatorial optimization (MOCO) is to optimize simultaneously r > 1 objectives. As in the single-objective case, the use of heuristic/metaheuristic techniques seems to be the most promising approach to MOCO problems because of their efficiency, generality and relative simplicity of implementation. In this work, we develop algorithms based on the Greedy Randomized Adaptive Search Procedure (GRASP) and Iterated Local Search (ILS) metaheuristics for the multiobjective knapsack problem. Computational experiments on benchmark instances show that the proposed algorithms are very robust and outperform other heuristics in terms of solution quality and running times.
Heuristic reduction of gyro drift in IMU-based personnel tracking systems
Borenstein, Johann; Ojeda, Lauro; Kwanmuang, Surat
2009-05-01
The paper pertains to the reduction of measurement errors in gyroscopes used for tracking the position of walking persons. Some of these tracking systems commonly use inertial or other means to measure distance traveled, and one or more gyros to measure changes in heading. MEMS-type gyros or IMUs are best suited for this task because of their small size and low weight. However, these gyros have large drift rates and can be sensitive to accelerations. The Heuristic Drift Reduction (HDR) method presented in this paper estimates the drift component and eliminates it, reducing heading errors by almost one order of magnitude.
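The HDR idea can be sketched as follows: when the drift-corrected rate of turn is small, the walker is presumed to be going straight, and a small integral step pulls the drift estimate toward the residual. The constants below are illustrative, not the paper's tuned values:

```python
def hdr_correct(gyro_rates, straight_threshold=0.1, gain=0.01):
    """Heuristic Drift Reduction sketch: whenever the drift-corrected
    rate of turn is small (person presumed walking straight), slowly
    integrate the residual into the drift estimate, then subtract the
    estimate from each raw reading."""
    drift = 0.0
    corrected = []
    for rate in gyro_rates:
        residual = rate - drift
        if abs(residual) < straight_threshold:
            # binary I-controller: step the drift estimate toward the residual
            drift += gain if residual > 0 else -gain
        corrected.append(rate - drift)
    return corrected
```

Fed a constant bias with no true turning, the drift estimate converges to the bias within a few dozen samples, so the corrected rate of turn settles near zero; genuine turns exceed the threshold and are passed through uncorrected.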
A Heuristics-Based Parthenogenetic Algorithm for the VRP with Potential Demands and Time Windows
Directory of Open Access Journals (Sweden)
Chenghua Shi
2016-01-01
Full Text Available We present the vehicle routing problem with potential demands and time windows (VRP-PDTW), which is a variation of the classical VRP. A homogeneous fleet of vehicles originating at a central depot serves customers with soft time windows and deliveries from/to their locations, and split delivery is considered. Besides the initial demand in the order contract, the potential demand caused by conformity consuming behavior is also integrated and modeled in our problem. The objective of minimizing the cost traveled by the vehicles and the penalty cost due to violating time windows is then constructed. We propose a heuristics-based parthenogenetic algorithm (HPGA) for successfully finding optimal solutions to the problem, in which a heuristic is introduced to generate the initial solution. Computational experiments are reported for test instances and the proposed algorithm is compared with a genetic algorithm (GA) and a heuristics-based genetic algorithm (HGA) from the literature. The comparison results show that our algorithm is quite competitive considering the quality of solutions and computation time.
Agent-based transportation planning compared with scheduling heuristics
Mes, Martijn; Heijden, van der Matthieu; Harten, van Aart
2004-01-01
Here we consider the problem of dynamically assigning vehicles to transportation orders that have different time windows and should be handled in real time. We introduce a new agent-based system for the planning and scheduling of these transportation networks. Intelligent vehicle agents schedule thei
Agent-based transportation planning compared with scheduling heuristics
Mes, Martijn R.K.; van der Heijden, Matthijs C.; van Harten, Aart
2004-01-01
Here we consider the problem of dynamically assigning vehicles to transportation orders that have different time windows and should be handled in real time. We introduce a new agent-based system for the planning and scheduling of these transportation networks. Intelligent vehicle agents schedule
Code generation based on formal BURS theory and heuristic search
Nymeyer, Albert; Katoen, Joost P.
BURS theory provides a powerful mechanism to efficiently generate pattern matches in a given expression tree. BURS, which stands for bottom-up rewrite system, is based on term rewrite systems, to which costs are added. We formalise the underlying theory, and derive an algorithm that computes all
Heuristic based data scheduling algorithm for OFDMA wireless network
Institute of Scientific and Technical Information of China (English)
Anonymous
2008-01-01
A system model based on a joint-layer mechanism is formulated for optimal data scheduling over fixed point-to-point links in OFDMA ad-hoc wireless networks. A distributed scheduling algorithm (DSA) for system model optimization is proposed that combines random subcarrier selection according to local subcarrier channel conditions with link power control to limit interference caused by subcarrier reuse among links. To improve global fairness, a global power control scheduling algorithm (GPCSA) based on the proposed DSA is presented; it dynamically allocates global power according to the difference between the average carrier-to-noise ratio of selected local links and the system link protection ratio. Simulation results demonstrate that the proposed algorithms achieve better efficiency and fairness compared with other existing algorithms.
Directory of Open Access Journals (Sweden)
Mansour Alssager
2017-02-01
Full Text Available Constructive heuristics are well-known methods for building initial trial solutions to the capacitated vehicle routing problem. The cheapest insertion heuristic is a popular construction heuristic known for being fast, producing decent solutions, being simple to implement, and being easy to extend to handle complicated constraints. However, previous work paid little attention to the diversity of initial solutions. Therefore, this study proposes an extension to the cheapest insertion heuristic that considers various combinations of seed-customer criteria (the first customer inserted on a route) to preserve solution diversity. Three seed-customer criteria are proposed, each combining two of the farthest, nearest, and random criteria. The best-performing criterion was selected, tested on a benchmark dataset, and compared with the Clarke and Wright savings heuristic. The results show that the combination of the farthest and random criteria obtained the best initial solutions, balancing quality and diversity, in less time than the Clarke and Wright savings heuristic. This approach generates diverse, good-quality starting solutions for the capacitated vehicle routing problem.
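A minimal single-route version of the cheapest insertion heuristic can be sketched as follows; the seed-selection criteria studied in the paper are omitted, and node 0 is assumed to be the depot:

```python
def cheapest_insertion(dist, seed):
    """Build one tour over all nodes of the distance matrix `dist`,
    starting from the depot (node 0) and a seed customer, and at each
    step inserting the (customer, position) pair that increases the
    tour length least."""
    n = len(dist)
    route = [0, seed, 0]
    unrouted = set(range(1, n)) - {seed}
    while unrouted:
        best = None
        for c in sorted(unrouted):
            for i in range(len(route) - 1):
                a, b = route[i], route[i + 1]
                inc = dist[a][c] + dist[c][b] - dist[a][b]
                if best is None or inc < best[0]:
                    best = (inc, c, i + 1)
        _, c, pos = best
        route.insert(pos, c)
        unrouted.remove(c)
    return route
```

Varying how `seed` is chosen (farthest from the depot, nearest, or random) is exactly the knob the paper turns to diversify its initial solutions.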
A heuristic approach based on Clarke-Wright algorithm for open vehicle routing problem.
Pichpibul, Tantikorn; Kawtummachai, Ruengsak
2013-01-01
We propose a heuristic approach based on the Clarke-Wright algorithm (CW) to solve the open version of the well-known capacitated vehicle routing problem, in which vehicles are not required to return to the depot after completing service. The proposed CW is presented in four procedures composed of Clarke-Wright formula modification, open-route construction, two-phase selection, and route post-improvement. Computational results show that the proposed CW is competitive and outperforms the classical CW in all directions. Moreover, the best known solution is also obtained in 97% of tested instances (60 out of 62).
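For reference, the classical (closed-route) Clarke-Wright savings procedure that the paper modifies can be sketched as below; node 0 is the depot, and merges join the end of one route to the start of another while respecting vehicle capacity:

```python
def clarke_wright(dist, demand, capacity):
    """Classic Clarke-Wright savings heuristic for the capacitated VRP.
    Node 0 is the depot; demand[i] is customer i's demand. Each route
    is keyed by its first customer."""
    n = len(dist)
    routes = {i: [i] for i in range(1, n)}  # route key -> customer list
    head = {i: i for i in range(1, n)}      # customer -> its route key
    # saving of serving i then j on one route instead of two depot trips
    savings = sorted(((dist[0][i] + dist[0][j] - dist[i][j], i, j)
                      for i in range(1, n) for j in range(1, n) if i != j),
                     reverse=True)
    for s, i, j in savings:
        ki, kj = head[i], head[j]
        if ki == kj:
            continue
        ri, rj = routes[ki], routes[kj]
        # only merge the end of one route with the start of another
        if ri[-1] != i or rj[0] != j:
            continue
        if sum(demand[c] for c in ri) + sum(demand[c] for c in rj) > capacity:
            continue
        routes[ki] = ri + rj
        for c in rj:
            head[c] = ki
        del routes[kj]
    return list(routes.values())
```

The open-route variant studied in the paper drops the final return leg from each route's cost, which changes the savings formula and hence the merge order.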
Improving IT project governance: A reflective analysis based on critical systems heuristics
Directory of Open Access Journals (Sweden)
David Johnstone
2017-05-01
Full Text Available IT project governance involves establishing authority structures, policies and mechanisms for IT projects. However, the way governance arrangements are implemented can sometimes exclude or marginalise important stakeholders. In this paper, we use critical systems thinking, and the notions of boundary critique and entrenched structural conflict, to inform a critical re-analysis of a case study where the governance proved relatively ineffective. We use the ‘twelve questions’ from the critical systems heuristics (CSH) approach to diagnose problems with governance arrangements and suggest solutions. Based on this, we suggest the CSH approach has theoretical and practical efficacy for improving IT project governance in general.
Yet Another Method for Image Segmentation based on Histograms and Heuristics
Directory of Open Access Journals (Sweden)
Horia-Nicolai L. Teodorescu
2012-07-01
Full Text Available We introduce a method for image segmentation that requires little computation yet provides results comparable to other methods. While the proposed method resembles known histogram-based methods, it differs in its use of the gray-level distribution. When several heuristic rules are added to the basic procedure, the method produces results that, in some cases, may outperform those of the known methods. The paper reports preliminary results. More details on the method, improvements, and results will be presented in a future paper.
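A standard histogram-based baseline of the kind this method resembles is Otsu's threshold selection (the paper's own heuristic rules are not given in the abstract); a sketch over a gray-level histogram:

```python
def otsu_threshold(histogram):
    """Otsu's method: pick the gray level that maximises the
    between-class variance of the two classes induced by thresholding
    (class 0 = levels <= t, class 1 = levels > t)."""
    total = sum(histogram)
    sum_all = sum(g * h for g, h in enumerate(histogram))
    w0 = sum0 = 0
    best_t, best_var = 0, -1.0
    for t, h in enumerate(histogram):
        w0 += h                      # pixels in class 0
        if w0 == 0:
            continue
        w1 = total - w0              # pixels in class 1
        if w1 == 0:
            break
        sum0 += t * h
        m0 = sum0 / w0               # class means
        m1 = (sum_all - sum0) / w1
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

On a bimodal histogram the chosen threshold falls in the valley between the two peaks, which is the same structure that simple valley-seeking heuristics exploit.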
A note on "A LP-based heuristic for a time-constrained routing problem"
Muter, İbrahim; Birbil, Ş. İlker; Bülbül, Kerem; Şahin, Güvenç
2012-01-01
Avella et al. (2006) [Avella, P., D'Auria, B., Salerno, S. (2006). A LP-based heuristic for a time-constrained routing problem. European Journal of Operational Research 173:120-124] investigate a time-constrained routing (TCR) problem. The core of the proposed solution approach is a large-scale linear program (LP) that grows both row- and column-wise when new variables are introduced. Thus, a column-and-row generation algorithm is proposed to solve this LP optimally, and an optimality conditi...
A heuristic two-dimensional presentation of microsatellite-based data applied to dogs and wolves
Directory of Open Access Journals (Sweden)
Foerster Martin
2007-07-01
Full Text Available Methods based on genetic distance matrices usually lose information during tree-building by converting a multi-dimensional matrix into a phylogenetic tree. We applied a heuristic method of two-dimensional presentation to achieve a better resolution of the relationships between the breeds and individuals investigated. Four hundred and nine individuals from nine German dog breed populations and one free-living wolf population were analysed with a marker set of 23 microsatellites. The result of the two-dimensional presentation was partly comparable with, and complemented, a model-based analysis that uses genotype patterns. The assignment test and the neighbour-joining tree based on the allele-sharing estimate allocated 99% and 97% of the individuals to their breed, respectively. Applying the two-dimensional presentation to distances based on the proportion of shared alleles gave comparable and further complementary insight into the population structure inferred from multilocus genotype data. We expect that the inference of population structure in domesticated species with complex breeding histories can be strongly supported by the two-dimensional presentation based on the described heuristic method.
Srinivas, B; Kulick, S N; Doran, Christine; Kulick, Seth
1995-01-01
There are currently two philosophies for building grammars and parsers -- Statistically induced grammars and Wide-coverage grammars. One way to combine the strengths of both approaches is to have a wide-coverage grammar with a heuristic component which is domain independent but whose contribution is tuned to particular domains. In this paper, we discuss a three-stage approach to disambiguation in the context of a lexicalized grammar, using a variety of domain independent heuristic techniques. We present a training algorithm which uses hand-bracketed treebank parses to set the weights of these heuristics. We compare the performance of our grammar against the performance of the IBM statistical grammar, using both untrained and trained weights for the heuristics.
Directory of Open Access Journals (Sweden)
Yun Tian
2016-01-01
Full Text Available The segmentation of coronary arteries is a vital process that helps cardiovascular radiologists detect and quantify stenosis. In this paper, we propose a fully automated coronary artery segmentation method for cardiac data volumes. The method is built on statistical region growing together with a heuristic decision. First, the heart region is extracted using a multi-atlas-based approach. Second, the vessel structures are enhanced via a 3D multiscale line filter. Next, seed points are detected automatically through threshold preprocessing and a subsequent morphological operation. Based on the set of detected seed points, statistics-based region growing is applied with conservative parameters. Because the region-growing parameters vary between patients and the segmentation must be fully automatic, a heuristic decision method is then used to obtain the desired result. The experiments are carried out on a dataset of eight-patient, multivendor cardiac computed tomography angiography (CTA) volume data. The DICE similarity index, mean distance, and Hausdorff distance metrics are employed to compare the proposed algorithm with two state-of-the-art methods. Experimental results indicate that the proposed algorithm is capable of performing complete, robust, and accurate extraction of coronary arteries.
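The core statistics-based region-growing step can be illustrated with a generic 2D stand-in (the paper's exact acceptance rule and parameters are not public here; the k-sigma criterion, tolerance floor, and all names below are assumptions):

```python
from collections import deque

def region_grow(img, seed, k=2.5):
    """Statistics-based region growing sketch: a neighbor joins the region
    while its intensity lies within k standard deviations of the running
    region mean (with a small tolerance floor for tiny regions)."""
    h, w = len(img), len(img[0])
    region = {seed}
    vals = [img[seed[0]][seed[1]]]
    mean, m2 = float(vals[0]), 0.0   # running mean / sum of squared devs
    q = deque([seed])
    while q:
        r, c = q.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w and (nr, nc) not in region:
                v = img[nr][nc]
                n = len(vals)
                std = (m2 / n) ** 0.5 if n > 1 else 0.0
                # accept if close to the running mean (tolerance floor of 1)
                if abs(v - mean) <= max(k * std, 1.0):
                    region.add((nr, nc))
                    vals.append(v)
                    # Welford update of mean and squared deviations
                    d = v - mean
                    mean += d / (n + 1)
                    m2 += d * (v - mean)
                    q.append((nr, nc))
    return region
```

On a toy image with a homogeneous patch next to a bright band, the region stops at the intensity jump.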
Directory of Open Access Journals (Sweden)
Yahong Zheng
2014-05-01
Full Text Available Purpose: This paper focuses on a classic optimization problem in operations research, the flexible job shop scheduling problem (FJSP), to discuss methods for dealing with uncertainty in a manufacturing system. Design/methodology/approach: In this paper, condition-based maintenance (CBM), a kind of preventive maintenance, is suggested to reduce the unavailability of machines. Unlike the simultaneous scheduling algorithm (SSA) used in the previous article (Neale & Cameron, 1979), an inserting algorithm (IA) is applied, in which a pre-schedule is first obtained through a heuristic algorithm and maintenance tasks are then inserted into the pre-schedule. Findings: Encouragingly, a new best solution for an instance in the FJSP benchmark is obtained in this research. Moreover, the SSA used in the literature for solving the normal FJSPPM (FJSP with preventive maintenance) proves unsuitable for the dynamic FJSPPM. Application to the benchmark of the normal FJSPPM shows that although IA obtains inferior results compared with SSA, it executes much faster. Originality/value: Unlike traditional FJSP scheduling, machine uncertainty is taken into account, which increases the complexity of the problem. An inserting algorithm (IA) is proposed to solve the dynamic scheduling problem. The quality of the final result depends strongly on the quality of the pre-schedule obtained while solving a normal FJSP. To find the best solution of the FJSP, a comparative study of three heuristics is carried out: an integrated GA, ACO, and ABC. In this comparison, GA performs best of the three heuristic algorithms.
Heuristic evaluation of paper-based Web pages: a simplified inspection usability methodology.
Allen, Mureen; Currie, Leanne M; Bakken, Suzanne; Patel, Vimla L; Cimino, James J
2006-08-01
Online medical information, when presented to clinicians, must be well-organized and intuitive to use, so that the clinicians can conduct their daily work efficiently and without error. It is essential to actively seek to produce good user interfaces that are acceptable to the user. This paper describes the methodology used to develop a simplified heuristic evaluation (HE) suitable for the evaluation of screen shots of Web pages, the development of an HE instrument used to conduct the evaluation, and the results of the evaluation of the aforementioned screen shots. In addition, this paper presents examples of the process of categorizing problems identified by the HE and the technological solutions identified to resolve these problems. Four usability experts reviewed 18 paper-based screen shots and made a total of 108 comments. Each expert completed the task in about an hour. We were able to implement solutions to approximately 70% of the violations. Our study found that a heuristic evaluation using paper-based screen shots of a user interface was expeditious, inexpensive, and straightforward to implement.
Mugunthan, Pradeep; Shoemaker, Christine A.; Regis, Rommel G.
2005-11-01
The performance of function approximation (FA) methods is compared to heuristic and derivative-based nonlinear optimization methods for automatic calibration of biokinetic parameters of a groundwater bioremediation model of chlorinated ethenes on a hypothetical and a real field case. For the hypothetical case, on the basis of 10 trials on two different objective functions, the FA methods had the lowest mean and smaller deviation of the objective function among all algorithms for a combined Nash-Sutcliffe objective and among all but the derivative-based algorithm for a total squared error objective. The best algorithms in the hypothetical case were applied to calibrate eight parameters to data obtained from a site in California. In three trials the FA methods outperformed heuristic and derivative-based methods for both objective functions. This study indicates that function approximation methods could be a more efficient alternative to heuristic and derivative-based methods for automatic calibration of computationally expensive bioremediation models.
Directory of Open Access Journals (Sweden)
Amanda Rose Schenstead
2012-03-01
Full Text Available This theoretical article explores the author's experience of a heuristic, arts-based self-study, with a focus on the data analysis method that was utilized in this project and its continual development. The author refers to this method as arts-based reflexivity. A historical review of arts-based and heuristic research is provided to give context and theoretical background to support the development and use of arts-based reflexivity. This systematic method of analysing artistic data encourages researchers to pose questions to themselves and to interact with the data by creating intuitive art forms as responses to internal dialogue and feelings. A template is offered for researchers to explore and utilize in their own projects and processes. The components of arts-based reflexivity are explored using examples from the author's graduate research project as well as recent reflections upon the topic "What is My Artistic Centre?", an adapted short-form performance piece that illustrates the steps and the potential self-knowledge that can be gained through the arts-based reflexivity method.
Directory of Open Access Journals (Sweden)
Puneet Rai
2014-02-01
Full Text Available Ant Colony Optimization (ACO) is a nature-inspired algorithm based on the foraging behavior of ants, modeling how ants deposit pheromone while searching for food. ACO generates a pheromone matrix that gives the edge information present at each pixel position of the image, formed by ants dispatched on the image. The movement of the ants depends on the local variance of the image's intensity values. This paper proposes an improved heuristic-based method that assigns weights to the neighborhood. By assigning weights or priorities to the neighboring pixels, an ant decides in which direction to move. The method is applied to medical images, and experimental results are provided to support the superior performance of the proposed approach over the existing method.
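One iteration of pheromone-based edge detection of the kind described above can be sketched as follows. This is a generic illustration: the visibility definition, the optional directional weight vector standing in for the paper's neighborhood weighting, and all parameter values are our assumptions, not the authors' implementation.

```python
import random

def aco_edge_step(img, tau, ants, alpha=1.0, beta=2.0,
                  weights=None, rho=0.05, rng=random):
    """One ACO iteration sketch for edge detection: each ant moves to a
    neighboring pixel with probability proportional to
    pheromone^alpha * visibility^beta, where visibility is the local
    intensity variation; `weights` optionally biases directions (a stand-in
    for the paper's neighborhood weighting)."""
    h, w = len(img), len(img[0])
    nbrs = [(-1, 0), (1, 0), (0, -1), (0, 1),
            (-1, -1), (-1, 1), (1, -1), (1, 1)]
    if weights is None:
        weights = [1.0] * 8
    def visibility(r, c):
        # intensity variation relative to in-bounds 4-neighbors
        vs = [abs(img[r][c] - img[r + dr][c + dc])
              for dr, dc in nbrs[:4]
              if 0 <= r + dr < h and 0 <= c + dc < w]
        return sum(vs) / len(vs) + 1e-6
    new_ants = []
    for r, c in ants:
        cands, probs = [], []
        for wgt, (dr, dc) in zip(weights, nbrs):
            nr, nc = r + dr, c + dc
            if 0 <= nr < h and 0 <= nc < w:
                cands.append((nr, nc))
                probs.append(wgt * tau[nr][nc] ** alpha
                             * visibility(nr, nc) ** beta)
        pick = rng.choices(cands, weights=probs)[0] if sum(probs) > 0 else (r, c)
        # deposit pheromone proportional to the visited pixel's variation
        tau[pick[0]][pick[1]] += visibility(*pick)
        new_ants.append(pick)
    # evaporation
    for i in range(h):
        for j in range(w):
            tau[i][j] *= (1 - rho)
    return new_ants
```

After a few iterations on an image with a vertical step edge, pheromone concentrates on the edge columns, which is the raw material for the final edge map.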
Component Based Testing with ioco
van der Bijl, H.M.; Rensink, Arend; Tretmans, G.J.
Component based testing concerns the integration of components which have already been tested separately. We show that, with certain restrictions, the ioco-test theory for conformance testing is suitable for component based testing, in the sense that the integration of fully conformant components is
Siegrist, Michael; Keller, Carmen; Cousin, Marie-Eve
2006-08-01
The implicit association test (IAT) measures automatic associations. In the present research, the IAT was adapted to measure implicit attitudes toward technological hazards. In Study 1, implicit and explicit attitudes toward nuclear power were examined. Implicit measures (i.e., the IAT) revealed negative attitudes toward nuclear power that were not detected by explicit measures (i.e., a questionnaire). In Study 2, implicit attitudes toward EMF (electro-magnetic field) hazards were examined. Results showed that cell phone base stations and power lines are judged to be similarly risky and, further, that base stations are more closely related to risk concepts than home appliances are. No differences between experts and lay people were observed. Results of the present studies are in line with the affect heuristic proposed by Slovic and colleagues. Affect seems to be an important factor in risk perception.
van der Zee, D.J.
2010-01-01
Group technology exploits similarities in product and process design to effectively meet the diversity of customer demand. In this paper we consider one of the implementations of this concept-heuristics for family based dispatching. Intrinsic to family based dispatching is the grouping of similar ty
Special relativity a heuristic approach
Hassani, Sadri
2017-01-01
Special Relativity: A Heuristic Approach provides a qualitative exposition of relativity theory on the basis of the constancy of the speed of light. Using Einstein's signal velocity as the defining idea for the notion of simultaneity and the fact that the speed of light is independent of the motion of its source, chapters delve into a qualitative exposition of the relativity of time and length, discuss the time dilation formula using the standard light clock, explore the Minkowski four-dimensional space-time distance based on how the time dilation formula is derived, and define the components of the two-dimensional space-time velocity, amongst other topics. Provides a heuristic derivation of the Minkowski distance formula Uses relativistic photography to see Lorentz transformation and vector algebra manipulation in action Includes worked examples to elucidate and complement the topic being discussed Written in a very accessible style
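The light-clock argument mentioned in this abstract can be summarized in a few lines (this is the standard textbook derivation, not an excerpt from the book):

```latex
% Rest frame of a light clock of height L: the round trip takes
\Delta t_0 = \frac{2L}{c}.
% Lab frame: the clock moves at speed v, so the pulse traverses the
% hypotenuse of a triangle with legs L and v\,\Delta t/2:
\left(\frac{c\,\Delta t}{2}\right)^2 = L^2 + \left(\frac{v\,\Delta t}{2}\right)^2
\quad\Longrightarrow\quad
\Delta t = \frac{2L}{\sqrt{c^2 - v^2}}
         = \frac{\Delta t_0}{\sqrt{1 - v^2/c^2}}
         = \gamma\,\Delta t_0 .
```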
Monkman, Helen; Griffith, Janessa; Kushniruk, Andre W
2015-01-01
Heuristic evaluations have proven to be valuable for identifying usability issues in systems. Commonly used sets of heuristics exist; however, they may not always be the most suitable, given the specific goal of the analysis. One such example is seeking to evaluate the demands on eHealth literacy and usability of consumer health information systems. In this study, eight essential heuristics and three optional heuristics, subsumed from the evidence on eHealth/health literacy and usability, were tested for their utility in assessing a mobile blood pressure tracking application (app). This evaluation revealed a variety of ways the design of the app could both benefit and impede users with limited eHealth literacy. This study demonstrated the utility of a low-cost, single-evaluation approach for identifying both eHealth literacy and usability issues based on existing evidence in the literature.
The Memory State Heuristic: A Formal Model Based on Repeated Recognition Judgments
Castela, Marta; Erdfelder, Edgar
2017-01-01
The recognition heuristic (RH) theory predicts that, in comparative judgment tasks, if one object is recognized and the other is not, the recognized one is chosen. The memory-state heuristic (MSH) extends the RH by assuming that choices are not affected by recognition judgments per se, but by the memory states underlying these judgments (i.e.,…
Yan, Jerry C.
1987-01-01
In concurrent systems, a major responsibility of the resource management system is to decide how the application program is to be mapped onto the multi-processor. Instead of using abstract program and machine models, a generate-and-test framework known as 'post-game analysis' that is based on data gathered during program execution is proposed. Each iteration consists of (1) (a simulation of) an execution of the program; (2) analysis of the data gathered; and (3) the proposal of a new mapping that would have a smaller execution time. These heuristics are applied to predict execution time changes in response to small perturbations applied to the current mapping. An initial experiment was carried out using simple strategies on 'pipeline-like' applications. The results obtained from four simple strategies demonstrated that for this kind of application, even simple strategies can produce acceptable speed-up with a small number of iterations.
A Novel Heuristic Algorithm Based on Clark and Wright Algorithm for Green Vehicle Routing Problem
Directory of Open Access Journals (Sweden)
Mehdi Alinaghian
2015-08-01
Full Text Available A significant portion of Gross Domestic Product (GDP) in any country belongs to the transportation system, and transportation equipment, in turn, is a great consumer of oil products. Many efforts have therefore been directed at vehicles to cut down greenhouse gas (GHG) emissions. In this paper a novel heuristic algorithm based on the Clarke and Wright algorithm, called Green Clarke and Wright (GCW), is presented for the vehicle routing problem with respect to fuel consumption. The objective function comprises fuel consumption, driver costs, and vehicle usage. Compared with exact-method solutions for small-sized problems and with Differential Evolution (DE) algorithm solutions for large-scale problems, the results show the efficient performance of the proposed GCW algorithm.
Yuan, Haiying; Wang, Xiuyu; Sun, Xun; Ju, Zijian
2017-06-01
Bearing fault diagnosis collects massive amounts of vibration data about a rotating machinery system, whose fault classification largely depends on feature extraction. Features reflecting bearing work states are directly extracted using time-frequency analysis of vibration signals, which leads to high dimensional feature data. To address the problem of feature dimension reduction, a compressive sensing-based feature extraction algorithm is developed to construct a concise fault feature set. Next, a heuristic PSO-BP neural network, whose learning process perfectly combines particle swarm optimization and the Levenberg-Marquardt algorithm, is constructed for fault classification. Numerical simulation experiments are conducted on four datasets sampled under different severity levels and load conditions, which verify that the proposed fault diagnosis method achieves efficient feature extraction and high classification accuracy.
A Dynamic Programming-Based Heuristic for the Shift Design Problem in Airport Ground Handling
DEFF Research Database (Denmark)
Clausen, Tommy
We consider the heterogeneous shift design problem for a workforce with multiple skills, where work shifts are created to cover a given demand as well as possible while minimizing cost and satisfying a flexible set of constraints. We focus mainly on applications within airport ground handling, where the demand can be highly irregular and specified on time intervals as short as five minutes. Ground handling operations are subject to a high degree of cooperation and specialization that require workers with different qualifications to be planned together. Different labor regulations or organizational rules can apply to different ground handling operations, so the rules and restrictions can be numerous and vary significantly. This is modeled using flexible volume constraints that limit the creation of certain shifts. We present a fast heuristic for the heterogeneous shift design problem based on dynamic programming.
Dong, Lu; Zhong, Xiangnan; Sun, Changyin; He, Haibo
2016-04-08
This paper presents the design of a novel adaptive event-triggered control method based on the heuristic dynamic programming (HDP) technique for nonlinear discrete-time systems with unknown system dynamics. In the proposed method, the control law is only updated when the event-triggered condition is violated. Compared with the periodic updates in the traditional adaptive dynamic programming (ADP) control, the proposed method can reduce the computation and transmission cost. An actor-critic framework is used to learn the optimal event-triggered control law and the value function. Furthermore, a model network is designed to estimate the system state vector. The main contribution of this paper is to design a new trigger threshold for discrete-time systems. A detailed Lyapunov stability analysis shows that our proposed event-triggered controller can asymptotically stabilize the discrete-time systems. Finally, we test our method on two different discrete-time systems, and the simulation results are included.
Impact of variable body size on pedestrian dynamics by heuristics-based model
Guo, Ning; Hu, Mao-Bin; Jiang, Rui
2017-01-01
In the real world, pedestrians can arch their shoulders or actively rotate their bodies to pass through narrow spaces, which helps reduce the effective size of the body. In this paper, the impact of variable body size on direction choice is investigated with an improved heuristics-based model, in which the cost of adjusting body size is assumed to be a factor in evaluating the optimal direction. In a typical simulation scenario, a pedestrian reluctant to adjust body size will bypass the blocks, whereas a pedestrian who cares little about body size will traverse through the exit. A change in direction choice between bypassing and traversing is observed, depending on the block width and the initial location of the pedestrian.
Benders' Decomposition Based Heuristics for Large-Scale Dynamic Quadratic Assignment Problems
Directory of Open Access Journals (Sweden)
Sirirat Muenvanichakul
2009-01-01
Full Text Available Problem statement: The Dynamic Quadratic Assignment Problem (DQAP) is NP-hard. A Benders decomposition based heuristic method is applied to the equivalent mixed-integer linear programming formulation of the original DQAP. Approach: Approximate Benders Decomposition (ABD) generates an ensemble of a subset of feasible layouts for Approximate Dynamic Programming (ADP) to determine a sub-optimal solution. A Trust-Region Constraint (TRC) for the master problem in ABD and a Successive Adaptation Procedure (SAP) were implemented to accelerate the convergence rate of the method. Results: The sub-optimal solutions of large-scale DQAPs from the method and its variants compared well with other metaheuristic methods. Conclusion: The overall performance of the method is comparable to that of other metaheuristic methods for large-scale DQAPs.
Formalization in Component Based Development
DEFF Research Database (Denmark)
Holmegaard, Jens Peter; Knudsen, John; Makowski, Piotr;
2006-01-01
We present a unifying conceptual framework for components, component interfaces, contracts and composition of components by focusing on the collection of properties or qualities that they must share. A specific property, such as signature, functionality behaviour or timing, is an aspect. Each aspect … by small examples, using UML as concrete syntax for various aspects, and is illustrated by one larger case study based on an industrial prototype of a complex component-based system.
Directory of Open Access Journals (Sweden)
Zheng Wang
2016-01-01
Full Text Available This paper presents a saving-based heuristic for the vehicle routing problem with time windows and stochastic travel times (VRPTWSTT). One of the basic ideas of the heuristic is to advance the latest service start time of each customer by a certain period of time; the reserved time can then be used to cope with unexpected travel time delays when necessary. Another important idea is to transform the VRPTWSTT into a set of vehicle routing problems with time windows (VRPTW), each defined by a given percentage used to calculate the reserved time for customers. Based on these two key ideas, a three-stage heuristic is developed that comprises a “problem transformation” stage, a “solution construction” stage, and a “solution improvement” stage. After the problem transformation in the first stage, the next two stages first construct an initial solution for each transformed VRPTW by improving the idea of the classical Clarke-Wright heuristic and then further improve that solution. Finally, a number of numerical experiments are conducted to evaluate the efficiency of the methodology under different uncertainty levels.
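The problem-transformation step can be illustrated in a few lines. The assumption that the reserved time is the given percentage of the time-window width is ours (the abstract does not specify the base quantity), and the field names are illustrative:

```python
def transform_time_windows(customers, p):
    """Problem-transformation sketch: advance each customer's latest
    service start time l_i by a fraction p of its window width [e_i, l_i],
    reserving that slack to absorb stochastic travel-time delays."""
    out = []
    for c in customers:
        e, l = c["earliest"], c["latest"]
        reserved = p * (l - e)           # reserved buffer time
        out.append({**c, "latest": l - reserved, "reserved": reserved})
    return out
```

Each choice of p yields one deterministic VRPTW instance; solving the family of instances and keeping the best robust solution is the spirit of the three-stage heuristic described above.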
Nash, Mark S; Cowan, Rachel E; Kressler, Jochen
2012-09-01
Component and coalesced health risks of the cardiometabolic syndrome (CMS) are commonly reported in persons with spinal cord injuries (SCIs). These CMS hazards are also co-morbid with physical deconditioning and elevated pro-atherogenic inflammatory cytokines, both of which are common after SCI and worsen the prognosis for all-cause cardiovascular disease. This article describes a systematic procedure for individualized CMS risk assessment after SCI, and emphasizes evidence-based and intuition-centered countermeasures to disease. A unified approach will propose therapeutic lifestyle intervention as a routine plan for aggressive primary prevention in this risk-susceptible population. Customization of dietary and exercise plans then follow, identifying shortfalls in diet and activity patterns, and ways in which these healthy lifestyles can be more substantially embraced by both stakeholders with SCI and their health care providers. In cases where lifestyle intervention utilizing diet and exercise is unsuccessful in countering risks, available pharmacotherapies and a preferred therapeutic agent are proposed according to authoritative standards. The over-arching purpose of the monograph is to create an operational framework in which existing evidence-based approaches or heuristic modeling becomes best practice. In this way persons with SCI can lead more active and healthy lives.
Zulai, Luis G. T.; Durand, Fábio R.; Abrão, Taufik
2015-05-01
In this article, an energy-efficiency mechanism for next-generation passive optical networks is investigated through heuristic particle swarm optimization. Ten-gigabit Ethernet-wavelength division multiplexing optical code division multiplexing-passive optical network next-generation passive optical networks are based on the use of a legacy 10-gigabit Ethernet-passive optical network with the advantage of using only an en/decoder pair of optical code division multiplexing technology, thus eliminating the en/decoder at each optical network unit. The proposed joint mechanism is based on the sleep-mode power-saving scheme for a 10-gigabit Ethernet-passive optical network, combined with a power control procedure aiming to adjust the transmitted power of the active optical network units while maximizing the overall energy-efficiency network. The particle swarm optimization based power control algorithm establishes the optimal transmitted power in each optical network unit according to the network pre-defined quality of service requirements. The objective is controlling the power consumption of the optical network unit according to the traffic demand by adjusting its transmitter power in an attempt to maximize the number of transmitted bits with minimum energy consumption, achieving maximal system energy efficiency. Numerical results have revealed that it is possible to save 75% of energy consumption with the proposed particle swarm optimization based sleep-mode energy-efficiency mechanism compared to 55% energy savings when just a sleeping-mode-based mechanism is deployed.
Phase Selection Heuristics for Satisfiability Solvers
Chen, Jingchao
2011-01-01
In general, a SAT solver based on conflict-driven DPLL consists of variable selection, phase selection, Boolean constraint propagation, conflict analysis, clause learning and clause-database maintenance. Optimizing any of these components can enhance the performance of a solver. This paper focuses on optimizing phase selection. Although the ACE (Approximation of the Combined lookahead Evaluation) weight is applied in lookahead SAT solvers such as March, so far no conflict-driven SAT solver has successfully applied the ACE weight, since computing it is time-consuming. Here we apply the ACE weight to partial phase selection in conflict-driven SAT solvers. This can be seen as an improvement of the heuristic proposed by Jeroslow and Wang (1990). We incorporate the ACE heuristic and the existing phase selection heuristics in the new solver MPhaseSAT, and select a phase heuristic in a way similar to portfolio methods. Experimental results show that adding the ACE heuristic can improve the conflict-driven so...
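The Jeroslow-Wang baseline that the paper improves on is simple to state: each literal l gets the score J(l) = sum over clauses C containing l of 2^(-|C|), and the preferred phase of a variable is the sign of its higher-scoring literal. A minimal sketch (representation details are ours):

```python
from collections import defaultdict

def jeroslow_wang_phase(clauses):
    """Jeroslow-Wang scoring: J(l) = sum over clauses C containing
    literal l of 2^-|C|, favoring literals in many short clauses.
    Clauses are lists of nonzero ints, DIMACS-style (-v negates v)."""
    score = defaultdict(float)
    for clause in clauses:
        w = 2.0 ** -len(clause)
        for lit in clause:
            score[lit] += w
    phase = {}
    for lit in list(score):
        v = abs(lit)
        if v not in phase:
            # True if the positive literal scores at least as high
            phase[v] = score[v] >= score[-v]
    return phase
```

For example, with clauses {(x1 v x2), (x1 v -x2 v x3), (-x1 v x2), (-x3)}, the unit clause -x3 dominates x3's score, so x3 is assigned the negative phase.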
A simple heuristic for Internet-based evidence search in primary care: a randomized controlled trial
Directory of Open Access Journals (Sweden)
Eberbach A
2016-08-01
Full Text Available Andreas Eberbach,1 Annette Becker,1 Justine Rochon,2 Holger Finkemeler,1 Achim Wagner,3 Norbert Donner-Banzhoff1 1Department of Family and Community Medicine, Philipp University of Marburg, Marburg, Germany; 2Institute of Medical Biometry and Informatics, University of Heidelberg, Heidelberg, Germany; 3Department of Sport Medicine, Justus-Liebig-University of Giessen, Giessen, Germany Background: General practitioners (GPs) are confronted with a wide variety of clinical questions, many of which remain unanswered. Methods: In order to assist GPs in finding quick, evidence-based answers, we developed a learning program (LP) with a short interactive workshop based on a simple three-step heuristic to improve their search and appraisal competence (SAC). We evaluated the LP's effectiveness with a randomized controlled trial (RCT). Participants (intervention group [IG] n=20; control group [CG] n=31) rated acceptance and satisfaction and also answered 39 knowledge questions to assess their SAC. We controlled for previous knowledge in content areas covered by the test. Results: Main outcome – SAC: within both groups, the pre–post test shows significant (P=0.00) improvements in correctness (IG 15% vs CG 11%) and confidence (32% vs 26%) in finding evidence-based answers. However, the SAC difference was not significant in the RCT. Other measures: Most workshop participants rated “learning atmosphere” (90%), “skills acquired” (90%), and “relevancy to my practice” (86%) as good or very good. The LP recommendations were implemented by 67% of the IG, whereas 15% of the CG already conformed to LP recommendations spontaneously (odds ratio 9.6, P=0.00). After the literature search, the IG showed a (not significantly) higher satisfaction regarding “time spent” (IG 80% vs CG 65%), “quality of information” (65% vs 54%), and “amount of information” (53% vs 47%). Conclusion: Long-established GPs have a good SAC. Despite high acceptance, strong
Example-Based Sequence Diagrams to Colored Petri Nets Transformation Using Heuristic Search
Kessentini, Marouane; Bouchoucha, Arbi; Sahraoui, Houari; Boukadoum, Mounir
Dynamic UML models like sequence diagrams (SD) lack sufficient formal semantics, making it difficult to build automated tools for their analysis, simulation and validation. A common approach to circumvent the problem is to map these models to more formal representations. In this context, many works propose a rule-based approach to automatically translate SD into colored Petri nets (CPN). However, finding the rules for such SD-to-CPN transformations may be difficult, as the transformation rules are sometimes difficult to define and the produced CPN may be subject to state explosion. We propose a solution that starts from the hypothesis that examples of good transformation traces of SD-to-CPN can be useful to generate the target model. To this end, we describe an automated SD-to-CPN transformation method which finds the combination of transformation fragments that best covers the SD model, using heuristic search in a base of examples. To achieve our goal, we combine two algorithms for global and local search, namely Particle Swarm Optimization (PSO) and Simulated Annealing (SA). Our empirical results show that the new approach allows deriving the sought CPNs with at least equal performance, in terms of size and correctness, to that obtained by a transformation rule-based tool.
Robotics Vision-based Heuristic Reasoning for Underwater Target Tracking and Navigation
Directory of Open Access Journals (Sweden)
Chua Kia
2008-11-01
This paper presents a robotics vision-based heuristic reasoning system for underwater target tracking and navigation. The system is introduced to improve the level of automation of underwater Remote Operated Vehicle (ROV) operations. A prototype which combines computer vision with an underwater robotics system was successfully designed and developed to perform target tracking and intelligent navigation. This study focuses on developing image processing algorithms and a fuzzy inference system for the analysis of the terrain. The vision system developed is capable of interpreting an underwater scene by extracting subjective uncertainties of the object of interest. Subjective uncertainties are further processed as multiple inputs of a fuzzy inference system that is capable of making crisp decisions concerning where to navigate. An important part of the image analysis is morphological filtering. The applications focus on binary images with the extension of gray-level concepts. An open-loop fuzzy control system is developed for classifying the traverse of the terrain. A notable achievement is the system's capability to recognize and track the object of interest (a pipeline) in perspective view based on perceived conditions. The effectiveness of this approach is demonstrated by computer and prototype simulations. This work originated from the desire to develop a robotics vision system able to mimic a human expert's judgement and reasoning when maneuvering an ROV across underwater terrain.
Building a Spammer Monitoring System Using Heuristic Rule-Based Approach
Directory of Open Access Journals (Sweden)
Adewole Kayode S
2012-10-01
Spam is a major problem of the electronic mail system that has enjoyed extensive discourse. E-mail has been greatly abused by spammers to disseminate unwanted messages and spread malicious content. Several anti-spam systems developed so far have themselves been greatly abused, as is evident in the proliferation of spammers' activities. Observing this fact, a protective mechanism to counter the ever-growing spam problem is indeed inevitable. In this paper, a heuristic approach is proposed which employs a standard, normalized set of spammer language harvested from the Google and Yahoo spam-language data sets to build the knowledge base. The spam terms were prioritized based on their frequency of occurrence in the two global data sets. A threshold of 5% was established for a user without a spamming history, while 3% was set for a suspected spammer. A platform-independent system was designed and implemented to monitor users' mail in real time. As soon as the threshold is reached, the user is alerted and the suspected mail is cancelled. The developed model was evaluated for accuracy and effectiveness using three composed e-mail messages. It is recommended, among other things, that this spam-prevention model be incorporated in the architecture of every Internet Service Provider.
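The thresholding rule described above (5% for users with no spamming history, 3% for suspected spammers) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the token-level matching, and the example term list are assumptions.

```python
def spam_ratio(message, spam_terms):
    """Fraction of tokens in `message` that appear in the spam-term set."""
    tokens = message.lower().split()
    if not tokens:
        return 0.0
    hits = sum(1 for t in tokens if t in spam_terms)
    return hits / len(tokens)

def flag_message(message, spam_terms, has_spam_history):
    # 5% threshold for users with no spamming history, 3% for suspected spammers
    threshold = 0.03 if has_spam_history else 0.05
    return spam_ratio(message, spam_terms) >= threshold
```

A message whose spam ratio falls between the two thresholds is flagged only for users already suspected of spamming, which mirrors the stricter treatment of repeat offenders.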
Othman, Muhammad Murtadha; Abd Rahman, Nurulazmi; Musirin, Ismail; Fotuhi-Firuzabad, Mahmud; Rajabi-Ghahnavieh, Abbas
2015-01-01
This paper introduces a novel multiobjective approach for capacity benefit margin (CBM) assessment taking into account tie-line reliability of interconnected systems. CBM is the imperative information utilized as a reference by the load-serving entities (LSE) to estimate a certain margin of transfer capability so that a reliable access to generation through interconnected system could be attained. A new Pareto-based evolutionary programming (EP) technique is used to perform a simultaneous determination of CBM for all areas of the interconnected system. The selection of CBM at the Pareto optimal front is proposed to be performed by referring to a heuristic ranking index that takes into account system loss of load expectation (LOLE) in various conditions. Eventually, the power transfer based available transfer capability (ATC) is determined by considering the firm and nonfirm transfers of CBM. A comprehensive set of numerical studies are conducted on the modified IEEE-RTS79 and the performance of the proposed method is numerically investigated in detail. The main advantage of the proposed technique is in terms of flexibility offered to an independent system operator in selecting an appropriate solution of CBM simultaneously for all areas.
A dichotomous search-based heuristic for the three-dimensional sphere packing problem
Directory of Open Access Journals (Sweden)
Mhand Hifi
2015-12-01
In this paper, the three-dimensional sphere packing problem is solved by using a dichotomous search-based heuristic. An instance of the problem is defined by a set of n unequal spheres and an object of fixed width and height and unlimited length. Each sphere is characterized by its radius, and the aim is to minimize the length of the object containing all spheres without overlapping. The proposed method is based upon beam search, in which three complementary phases are combined: (i) a greedy selection phase which determines a series of eligible search subspaces, (ii) a truncated tree search, using a width-beam search, that explores some promising paths, and (iii) a dichotomous search that diversifies the search. The performance of the proposed method is evaluated on benchmark instances taken from the literature, and its results are compared with those reached by some recent methods from the literature. The proposed method is competitive and yields promising results.
Zittersteijn, M.; Vananti, A.; Schildknecht, T.; Dolado Perez, J. C.; Martinot, V.
2016-11-01
Currently several thousands of objects are being tracked in the MEO and GEO regions through optical means. The problem faced in this framework is that of Multiple Target Tracking (MTT). The MTT problem quickly becomes an NP-hard combinatorial optimization problem. This means that the effort required to solve the MTT problem increases exponentially with the number of tracked objects. In an attempt to find an approximate solution of sufficient quality, several Population-Based Meta-Heuristic (PBMH) algorithms are implemented and tested on simulated optical measurements. These first results show that one of the tested algorithms, namely the Elitist Genetic Algorithm (EGA), consistently displays the desired behavior of finding good approximate solutions before reaching the optimum. The results further suggest that the algorithm possesses a polynomial time complexity, as the computation times are consistent with a polynomial model. With the advent of improved sensors and a heightened interest in the problem of space debris, it is expected that the number of tracked objects will grow by an order of magnitude in the near future. This research aims to provide a method that can treat the association and orbit determination problems simultaneously, and is able to efficiently process large data sets with minimal manual intervention.
Improving the Ranking Capability of the Hyperlink Based Search Engines Using Heuristic Approach
Directory of Open Access Journals (Sweden)
Haider A. Ramadhan
2006-01-01
To evaluate the informative content of a Web page, the Web structure has to be carefully analyzed. Hyperlink analysis, which is capable of measuring the potential information contained in a Web page with respect to the Web space, is gaining more attention. The links to and from Web pages are an important resource that has largely gone unused in existing search engines. Web pages differ from general text in that they possess external and internal structure. The Web links between documents can provide useful information in finding pages for a given set of topics. Making use of the Web link information would allow the construction of more powerful tools for answering user queries. Google has been among the first search engines to utilize hyperlinks in page ranking. Still, two main flaws in Google need to be tackled. First, all the backlinks to a page are assigned equal weights. Second, less content-rich pages, such as intermediate and transient pages, are not differentiated from more content-rich pages. To overcome these pitfalls, this paper proposes a heuristic-based solution that differentiates the significance of various backlinks by assigning a different weight factor to each, depending on its location in the directory tree of the Web space.
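The backlink-weighting idea can be illustrated with a small sketch: links from pages near the root of a site's directory tree count more than links from deeply nested pages. The depth-based formula and function names below are hypothetical illustrations, not the paper's actual weight factors.

```python
from urllib.parse import urlparse

def backlink_weight(url, max_depth=6):
    """Weight a backlink by its position in the site's directory tree:
    shallower paths (closer to the root) receive larger weights."""
    path = urlparse(url).path
    depth = len([seg for seg in path.split("/") if seg])
    return max(max_depth - depth, 1) / max_depth

def weighted_backlink_score(backlinks):
    """Aggregate score of a page: sum of its backlinks' weights
    (instead of counting every backlink as 1, as plain PageRank-style
    counting would)."""
    return sum(backlink_weight(u) for u in backlinks)
```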
Simplifying Hill-based muscle models through generalized extensible fuzzy heuristic implementation
O'Brien, Amy J.
2006-04-01
Traditional dynamic muscle models based on work initially published by A. V. Hill in 1938 often rely on high-order systems of differential equations. While such models are very accurate and effective, they do not typically lend themselves to modification by clinicians who are unfamiliar with biomedical engineering and advanced mathematics. However, it is possible to develop a fuzzy heuristic implementation of a Hill-based model, the Fuzzy Logic Implemented Hill-based (FLIHI) muscle model, that offers several advantages over conventional state-equation approaches. Because a fuzzy system is oriented by design to describe a model in linguistic terms rather than ordinary-differential-equation-based mathematics, the resulting fuzzy model can be more readily modified and extended by medical practitioners. It also stands to reason that a well-designed fuzzy inference system can be implemented with a degree of generalizability not often encountered in traditional state-space models. Taking the electromyogram (EMG) as one input to muscle, FLIHI is tantamount to a fuzzy EMG-to-muscle-force estimator that captures dynamic muscle properties while providing robustness to partial or noisy data. One goal behind this approach is to encourage clinicians to rely on the model rather than assuming that muscle force as an output maps directly to smoothed EMG as an input. FLIHI's force estimate is more accurate than assuming force equal to smoothed EMG because FLIHI provides a transfer function that accounts for muscle's inherent nonlinearity. Furthermore, employing fuzzy logic should provide FLIHI with improved robustness over traditional mathematical approaches.
Mathematical models and a constructive heuristic for finding minimum fundamental cycle bases
Directory of Open Access Journals (Sweden)
Liberti Leo
2005-01-01
The problem of finding a fundamental cycle basis with minimum total cost in a graph arises in many application fields. In this paper we present some integer linear programming formulations and compare their performance in terms of instance size, CPU time required for the solution, and quality of the associated lower bound derived by solving the corresponding continuous relaxations. Since only very small instances can be solved to optimality with these formulations, while very large instances occur in a number of applications, we present a new constructive heuristic and compare it with alternative heuristics.
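For context, the object being optimized can be sketched as follows: once a spanning tree is fixed, each non-tree edge closes exactly one fundamental cycle, and together these cycles form a fundamental cycle basis. The sketch below extracts the basis for an arbitrary BFS tree; the constructive heuristics discussed in the paper go further by choosing the tree so that the total cycle cost is small.

```python
from collections import deque

def fundamental_cycles(n, edges):
    """Fundamental cycles of an undirected graph (vertices 0..n-1,
    edges as (u, v) pairs) induced by a BFS spanning tree rooted at 0."""
    adj = {v: [] for v in range(n)}
    for i, (u, v) in enumerate(edges):
        adj[u].append((v, i))
        adj[v].append((u, i))
    parent = {0: (None, None)}          # vertex -> (parent, tree edge index)
    queue = deque([0])
    while queue:
        u = queue.popleft()
        for v, i in adj[u]:
            if v not in parent:
                parent[v] = (u, i)
                queue.append(v)
    tree = {i for (_, i) in parent.values() if i is not None}

    def to_root(v):                     # path v -> root as a vertex list
        path = [v]
        while parent[v][0] is not None:
            v = parent[v][0]
            path.append(v)
        return path

    cycles = []
    for i, (u, v) in enumerate(edges):
        if i in tree:
            continue                    # each non-tree edge closes one cycle
        pu, pv = to_root(u), to_root(v)
        lca = None
        while pu and pv and pu[-1] == pv[-1]:
            lca = pu.pop()              # strip the shared suffix up to the LCA
            pv.pop()
        cycles.append(pu + [lca] + pv[::-1])
    return cycles
```

A graph with m edges and n vertices always yields m - n + 1 fundamental cycles; minimizing their total cost over all spanning trees is the hard part.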
A CDT-Based Heuristic Zone Design Approach for Economic Census Investigators
Directory of Open Access Journals (Sweden)
Changixu Cheng
2015-01-01
This paper addresses a special zone design problem for economic census investigators that is motivated by a real-world application. It presents a heuristic multi-kernel growth approach via Constrained Delaunay Triangulation (CDT). The approach not only solves the barrier problem but also handles polygon data in the zoning procedure. In addition, it uses a new heuristic method to greatly speed up the zoning process while preserving the required zoning quality. Finally, two special economic-census instances were solved, highlighting the performance of the approach.
Institute of Scientific and Technical Information of China (English)
[No author listed]
2007-01-01
A modified bottleneck-based (MB) heuristic for large-scale job-shop scheduling problems with a well-defined bottleneck is suggested, which is simpler but more tailored than the shifting bottleneck (SB) procedure. In this algorithm, the bottleneck is first scheduled optimally while the non-bottleneck machines are subordinated around the solutions of the bottleneck schedule by some effective dispatching rules. Computational results indicate that the MB heuristic can achieve a better tradeoff between solution quality and computational time than the SB procedure for medium-size problems. Furthermore, it can obtain a good solution in a short time for large-scale job-shop scheduling problems.
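The decomposition above (schedule the bottleneck first, subordinate the other machines with dispatching rules) can be illustrated on a toy two-machine flow. The SPT sequencing and FIFO dispatching below are illustrative stand-ins, not the paper's exact rules.

```python
def mb_schedule(jobs):
    """Toy two-machine flow: each job (b_time, m_time) visits the bottleneck
    machine, then a second machine. The bottleneck is sequenced by SPT
    (optimal for mean flow time on a single machine); the second machine is
    served FIFO in order of bottleneck completion, a simple dispatching rule.
    Returns the bottleneck sequence and each job's finish time."""
    order = sorted(range(len(jobs)), key=lambda j: jobs[j][0])
    t_b = 0.0
    done_b = {}
    for j in order:                 # schedule the bottleneck machine first
        t_b += jobs[j][0]
        done_b[j] = t_b
    t_m = 0.0
    finish = {}
    for j in order:                 # subordinate machine: FIFO dispatching
        t_m = max(t_m, done_b[j]) + jobs[j][1]
        finish[j] = t_m
    return order, finish
```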
A novel energy-aware multi-task dynamic mapping heuristic of NoC-based MPSoCs
Zhao, Xibin; Gu, Ming
2013-05-01
Task mapping is an important issue in network-on-chip (NoC)-based multiprocessor systems-on-chips (MPSoCs) design. The dynamic characteristic of application execution enforces the use of dynamic task mapping. In this article, a hybrid energy-aware dynamic mapping strategy is proposed. The strategy consists of an off-line part and an on-line part. In the off-line part, optimisation tools are used to extract information that helps to reduce the energy consumption in the on-line mapping, while the on-line mapping heuristic makes use of that information. Experimental results show that the energy consumption is reduced by 21%, on average, compared to the best-neighbour heuristic.
Energy Technology Data Exchange (ETDEWEB)
Yim, Ho Bin [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Gu Seong Dong, KAIST, N7-1, 2416, Daejeon 305-701 (Korea, Republic of); Seong, Poong Hyun, E-mail: phseong@kaist.ac.k [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, Gu Seong Dong, KAIST, N7-1, 2416, Daejeon 305-701 (Korea, Republic of)
2010-12-15
Research highlights: Augmented reality (AR) instructions were built for NPP maintenance personnel. 4-5 pieces of information at a time were optimum for AR instructions in this study; a large variance in mode no. 5 implies this was also close to the critical amount. Heuristic guidelines are suggested to make AR instructions more effective. - Abstract: As industrial plants and factories age, their maintenance requirements increase. Because maintenance mistakes directly increase the operating costs of a power plant, maintenance quality is a significant concern to plant management. By law, all personnel working with nuclear technology in Korea must be re-trained every three years; however, as the statistical data show, the number of shutdown accidents at nuclear power plants (NPPs) due to maintenance failure is still high and needs to be reduced. Industries have started to adopt various technologies to increase the speed and accuracy of maintenance. Among those technologies, augmented reality (AR) is the latest multimedia presentation technology to be applied to plant maintenance, and it offers superior intuitiveness and user interactivity over other conventional multimedia. This empirical study aims to measure the optimum amount of information to be delivered at a time, to identify what types of information enhance the learning ability of novices, and to suggest heuristic guidelines by which to make effective AR training instructions. In the first experiment, the optimum amount of information in an AR learning environment for novices was found to be 4-5 pieces of information in a chunk, by comparing results between a pre-test and a post-test. This result implies that intentionally made chunks help novices learn more effectively. In the second experiment, AR training instruction based on the suggested heuristic guidelines was slightly more effective than other AR training instructions. Maintenance in nuclear power plants can be more reliable and accurate by
Multiple-Goal Heuristic Search
Davidov, D; 10.1613/jair.1940
2011-01-01
This paper presents a new framework for anytime heuristic search where the task is to achieve as many goals as possible within the allocated resources. We show the inadequacy of traditional distance-estimation heuristics for tasks of this type and present alternative heuristics that are more appropriate for multiple-goal search. In particular, we introduce the marginal-utility heuristic, which estimates the cost and the benefit of exploring a subtree below a search node. We developed two methods for online learning of the marginal-utility heuristic. One is based on local similarity of the partial marginal utility of sibling nodes, and the other generalizes marginal-utility over the state feature space. We apply our adaptive and non-adaptive multiple-goal search algorithms to several problems, including focused crawling, and show their superiority over existing methods.
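A marginal-utility-style ordering can be sketched as a best-first search whose frontier is ranked by estimated goals per unit path cost. The score below is a crude hand-supplied stand-in for the learned marginal-utility heuristic described in the abstract, and all names are illustrative.

```python
import heapq

def multi_goal_search(graph, subtree_goals, root, goals, budget):
    """Expand up to `budget` nodes, preferring nodes whose subtree promises
    many goals per unit of path cost (a crude marginal-utility score).
    graph[u] = [(child, step_cost)]; subtree_goals[v] estimates the number
    of goals below v. Returns the set of goals reached."""
    found = set()
    frontier = [(0.0, 0, root)]   # (negated utility, path cost, node)
    expanded = 0
    while frontier and expanded < budget:
        _, cost, node = heapq.heappop(frontier)
        expanded += 1
        if node in goals:
            found.add(node)
        for child, step in graph.get(node, []):
            c = cost + step
            utility = subtree_goals.get(child, 0) / c
            heapq.heappush(frontier, (-utility, c, child))
    return found
```

With a small expansion budget, branches that promise no goals are starved in favor of goal-dense subtrees, which is the behavior the marginal-utility heuristic is designed to induce.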
Improving the Bin Packing Heuristic through Grammatical Evolution Based on Swarm Intelligence
Directory of Open Access Journals (Sweden)
Marco Aurelio Sotelo-Figueroa
2014-01-01
In recent years Grammatical Evolution (GE) has been used as a representation of Genetic Programming (GP), which has been applied to many optimization problems such as symbolic regression, classification, Boolean functions, constructed problems, and algorithmic problems. GE can use a diversity of search strategies, including Swarm Intelligence (SI). Particle Swarm Optimisation (PSO) is an SI algorithm that has two main problems: premature convergence and poor diversity. Particle Evolutionary Swarm Optimization (PESO) is a recent and novel algorithm which is also part of SI; PESO uses two perturbations to avoid PSO's problems. In this paper we propose using PESO and PSO in the frame of GE as strategies to generate heuristics that solve the Bin Packing Problem (BPP); it is possible, however, to apply this methodology to other kinds of problems using another grammar designed for that problem. A comparison between PESO, PSO, and BPP heuristics is performed through the nonparametric Friedman test. The main contribution of this paper is proposing a grammar to generate online and offline heuristics depending on the test instance, trying to improve on the heuristics generated by other grammars and by humans; it also proposes a way to implement different algorithms, such as PESO, as search strategies in GE to obtain better results than those obtained by PSO.
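For reference, two of the classic BPP heuristics that such grammars try to improve upon, first fit and best fit, can be written compactly:

```python
def first_fit(items, capacity):
    """Place each item into the first bin with enough room,
    opening a new bin when none fits."""
    bins = []
    for it in items:
        for b in bins:
            if sum(b) + it <= capacity:
                b.append(it)
                break
        else:
            bins.append([it])
    return bins

def best_fit(items, capacity):
    """Place each item into the feasible bin that leaves the least slack,
    opening a new bin when none fits."""
    bins = []
    for it in items:
        best = None
        for b in bins:
            slack = capacity - sum(b) - it
            if slack >= 0 and (best is None or slack < capacity - sum(best) - it):
                best = b
        if best is None:
            bins.append([it])
        else:
            best.append(it)
    return bins
```

Grammar-evolved heuristics compete against exactly these kinds of placement rules, instance by instance.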
A typology of evidence based practice research heuristics for clinical laboratory science curricula.
Leibach, Elizabeth K; Russell, Barbara L
2010-01-01
A typology of EBP research heuristics was defined relative to clinical laboratory science levels of practice. Research skills requisite for the CLS baccalaureate level are associated mainly with quality control of analytic processes. Research skills at the master's level are associated with pre- and post-analytic investigations as well. Doctoral-level CLS practice is expected to utilize research skills facilitating quality investigations at the systems level.
Design and usability of heuristic-based deliberation tools for women facing amniocentesis.
Durand, M.A.; Wegwarth, O.; Boivin, J.; Elwyn, G.
2012-01-01
BACKGROUND: Evidence suggests that in decision contexts characterized by uncertainty and time constraints (e.g. health-care decisions), fast and frugal decision-making strategies (heuristics) may perform better than complex rules of reasoning. OBJECTIVE: To examine whether it is possible to design
Solving the Tractor and Semi-Trailer Routing Problem Based on a Heuristic Approach
Directory of Open Access Journals (Sweden)
Hongqi Li
2012-01-01
We study the tractor and semi-trailer routing problem (TSRP), a variant of the vehicle routing problem (VRP). In the TSRP model considered in this paper, vehicles are dispatched on a trailer-flow network with only one main depot, and all tractors originate and terminate at the main depot. Two types of decisions are involved: the number of tractors and the route of each tractor. Heuristic algorithms have seen widespread application to various extensions of the VRP; however, this approach has not been applied to the TSRP. We propose a heuristic algorithm to solve the TSRP. The proposed algorithm first constructs the initial route set under the limitation of a driver's on-duty time. The candidate routes in the initial set are then filtered by a two-phase approach. The computational study shows that our algorithm is feasible for the TSRP. Moreover, the algorithm takes relatively little time to obtain satisfactory solutions. The results suggest that our heuristic algorithm is competitive in solving the TSRP.
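The two-phase idea (filter candidate routes by the on-duty limit, then select routes until every trip is covered) can be sketched as a greedy covering procedure. The greedy selection rule and data shapes below are illustrative assumptions, not the paper's exact filtering method.

```python
def select_routes(candidates, trips, limit):
    """candidates: list of (route_trips, duration) pairs. Phase 1 drops
    routes that violate the driver's on-duty limit; phase 2 greedily picks
    the feasible route covering the most still-uncovered trips until every
    trip is served. Each chosen route corresponds to one tractor."""
    feasible = [(set(r), d) for r, d in candidates if d <= limit]
    uncovered = set(trips)
    chosen = []
    while uncovered:
        best = max(feasible, key=lambda rd: len(rd[0] & uncovered), default=None)
        if best is None or not (best[0] & uncovered):
            raise ValueError("some trips cannot be covered by feasible routes")
        chosen.append(best)
        uncovered -= best[0]
    return chosen
```

The number of routes returned is the number of tractors dispatched, which is the first of the two decisions the abstract mentions.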
Ding, Zhe; Xu, Zhanqi; Zeng, Xiaodong; Ma, Tao; Yang, Fan
2014-04-01
By adopting orthogonal frequency division multiplexing technology, spectrum-sliced elastic optical path (SLICE) networks can offer flexible bandwidth to each connection request and utilize spectrum resources efficiently. The routing and spectrum assignment (RSA) problems in SLICE networks are solved by using heuristic algorithms in most prior studies and addressed by intelligent algorithms in a few investigations. The performance of RSA algorithms can be further improved by combining these two types of algorithms. Therefore, we propose three hybrid RSA algorithms: DACE-GMSF, DACE-GLPF, and DACE-GEMkPSF, which combine a heuristic algorithm with coevolution based on a distance-adaptive policy. In the proposed algorithms, we first groom the connection requests, then sort them using the heuristic algorithm (most subcarriers first, longest path first, or extended most k paths' slots first), and finally search for an approximately optimal solution with the coevolutionary policy. We present a model of the RSA problem using integer linear programming, and key elements of the proposed algorithms are addressed in detail. Simulations under three topologies show that the proposed hybrid RSA algorithms can save spectrum resources efficiently.
Formal Component-Based Semantics
Madlener, Ken; van Eekelen, Marko; 10.4204/EPTCS.62.2
2011-01-01
One of the proposed solutions for improving the scalability of semantics of programming languages is Component-Based Semantics, introduced by Peter D. Mosses. It is expected that this framework can also be used effectively for modular meta theoretic reasoning. This paper presents a formalization of Component-Based Semantics in the theorem prover Coq. It is based on Modular SOS, a variant of SOS, and makes essential use of dependent types, while profiting from type classes. This formalization constitutes a contribution towards modular meta theoretic formalizations in theorem provers. As a small example, a modular proof of determinism of a mini-language is developed.
2015-01-01
How can we advance knowledge? Which methods do we need in order to make new discoveries? How can we rationally evaluate, reconstruct and offer discoveries as a means of improving the ‘method’ of discovery itself? And how can we use findings about scientific discovery to boost funding policies, thus fostering a deeper impact of scientific discovery itself? The respective chapters in this book provide readers with answers to these questions. They focus on a set of issues that are essential to the development of types of reasoning for advancing knowledge, such as models for both revolutionary findings and paradigm shifts; ways of rationally addressing scientific disagreement, e.g. when a revolutionary discovery sparks considerable disagreement inside the scientific community; frameworks for both discovery and inference methods; and heuristics for economics and the social sciences.
Delay-Constrained Multicast Routing Algorithm Based on Average Distance Heuristic
Ling, Zhou; Yu-xi, Zhu; 10.5121/ijcnc.2010.2212
2010-01-01
Multicast is the ability of a communication network to accept a single message from an application and deliver copies of the message to multiple recipients at different locations. With the development of the Internet, multicast is widely applied in all kinds of real-time multimedia applications: distributed multimedia systems, collaborative computing, video-conferencing, distance education, etc. In order to construct a delay-constrained multicast routing tree, the average distance heuristic (ADH) algorithm is analyzed first. Then a delay-constrained algorithm called DCADH (delay-constrained average distance heuristic) is presented. Using ADH, a least-cost multicast routing tree can be constructed; if the path delay cannot meet the delay upper bound, a shortest-delay path computed by Dijkstra's algorithm is merged into the existing multicast routing tree to meet the bound. Simulation experiments show that DCADH performs well in achieving a low-cost multicast routing tree.
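The DCADH fallback logic can be sketched with two runs of Dijkstra's algorithm, one minimizing cost and one minimizing delay. This is a per-destination path sketch under assumed data structures, not the full tree-merging algorithm of the paper.

```python
import heapq

def dijkstra(graph, src, dst, key):
    """graph[u] = [(v, cost, delay)]; key in {"cost", "delay"} selects the
    metric to minimize. Returns (total_cost, total_delay, path) or None."""
    best = {src: (0.0, 0.0, [src])}
    pq = [(0.0, src)]
    seen = set()
    while pq:
        _, u = heapq.heappop(pq)
        if u in seen:
            continue
        seen.add(u)
        if u == dst:
            break
        for v, c, dl in graph.get(u, []):
            cost, delay, path = best[u]
            nc, nd = cost + c, delay + dl
            w = nc if key == "cost" else nd
            cur = best.get(v)
            if cur is None or w < (cur[0] if key == "cost" else cur[1]):
                best[v] = (nc, nd, path + [v])
                heapq.heappush(pq, (w, v))
    return best.get(dst)

def delay_constrained_path(graph, src, dst, delay_bound):
    """DCADH-style fallback: prefer the least-cost path; if its delay breaks
    the bound, switch to the least-delay path (Dijkstra on delay)."""
    cheap = dijkstra(graph, src, dst, "cost")
    if cheap and cheap[1] <= delay_bound:
        return cheap
    fast = dijkstra(graph, src, dst, "delay")
    if fast and fast[1] <= delay_bound:
        return fast
    return None          # no path satisfies the delay bound
```

In the full algorithm, the fallback path would be merged into the existing ADH tree rather than used in isolation.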
Trajectory Tracking Control for a GMM Actuator Based on a Heuristic ILC Method
Institute of Scientific and Technical Information of China (English)
SONG Zhao-qing; ZHOU Shao-lei; SHI Xian-jun
2006-01-01
A heuristic iterative learning control (ILC) method is presented and applied to the trajectory tracking control of a giant magnetostrictive material (GMM) actuator. A GMM actuator is used as the experimental equipment for micro-displacement trajectory tracking control. The advantage of the presented approach is that it does not require a model of the GMM actuator. The experimental results attest to the high efficiency of the presented method for micro-displacement trajectory tracking control.
Familiarity and recollection in heuristic decision making.
Schwikert, Shane R; Curran, Tim
2014-12-01
Heuristics involve the ability to utilize memory to make quick judgments by exploiting fundamental cognitive abilities. In the current study we investigated the memory processes that contribute to the recognition heuristic and the fluency heuristic, which are both presumed to capitalize on the byproducts of memory to make quick decisions. In Experiment 1, we used a city-size comparison task while recording event-related potentials (ERPs) to investigate the potential contributions of familiarity and recollection to the 2 heuristics. ERPs were markedly different for recognition heuristic-based decisions and fluency heuristic-based decisions, suggesting a role for familiarity in the recognition heuristic and recollection in the fluency heuristic. In Experiment 2, we coupled the same city-size comparison task with measures of subjective preexperimental memory for each stimulus in the task. Although previous literature suggests the fluency heuristic relies on recognition speed alone, our results suggest differential contributions of recognition speed and recollected knowledge to these decisions, whereas the recognition heuristic relies on familiarity. Based on these results, we created a new theoretical framework that explains decisions attributed to both heuristics based on the underlying memory associated with the choice options.
Mashkina, Elena; Bond, Alan M
2011-03-01
Sinusoidal large-amplitude ac voltammetric techniques generate very large data sets. When analyzed in the frequency domain, using a Fourier transform (FT), band filtering, inverse FT sequence, the data may be resolved into the aperiodic dc, fundamental, second, and higher-order ac harmonics. Each of these components exhibits a different level of sensitivity to electrode kinetics, uncompensated resistance, and capacitance. Detailed simulations illustrate how the heuristic approach for evaluation of each data subset may be implemented and exploited in the assessment of the electrode kinetics for the fast Fc → Fc+ + e- (Fc = ferrocene) oxidation process at a glassy carbon macrodisk electrode. The simulations presented in this study are based on the Butler-Volmer model and incorporate consideration of the uncompensated resistance (R(u)), double-layer capacitance (C(dl)), rate constant (k(0)), and charge transfer coefficient (α). Error analysis of the heuristically evaluated simulation-experiment comparison is used to assist in establishing the best fit of data for each harmonic. The result of the heuristic pattern-recognition-type approach for analysis of the oxidation of ferrocene (0.499, 0.999, and 5.00 mM) at a glassy carbon macrodisk electrode in acetonitrile (0.1 M Bu(4)NPF(6)) implies that k(0) ≥ 0.25 cm s(-1) on the basis of analysis of the first four harmonics, and plausibly lies in the range of 0.25-0.5 cm s(-1) with α = 0.25-0.75 when analysis of the next four harmonics is undertaken. The k(0) value is significantly faster than indicated in most literature reports based on dc cyclic voltammetry under transient conditions at a glassy carbon macrodisk electrode. The data analysis with a sinusoidal amplitude of 80 mV is conducted at a very low frequency of 9 Hz to minimize contributions from electrode heterogeneity, frequency dispersion, and adsorption, all of which can complicate the response for the oxidation of Fc in acetonitrile.
Energy Technology Data Exchange (ETDEWEB)
Garcia-Martin, F.J.; Camacho, E.F. [Universidad de Sevilla (Spain). Escuela Superior de Ingenieros; Berenguel, M. [Universidad de Almeria (Spain). Dpto. de Lenguajes y Computacion; Valverde, A. [Plataforma Solar de Almeria (Spain)
1999-08-01
The paper presents the development and implementation of a heuristic knowledge-based heliostat control strategy optimizing the temperature distribution within a volumetric receiver at the Plataforma Solar de Almeria (PSA) power tower plant. Experience in operating the plant has been used to develop an automatic control strategy that provides an appropriate flux distribution within the volumetric receiver in order to obtain a desired temperature profile, and allows for operation without continuous operator intervention, the need for which is one of the main characteristics and drawbacks in the exploitation of these kinds of plants. Experimental results are included and discussed in the paper. (author)
Rosburg, Timm; Mecklinger, Axel; Frings, Christian
2011-12-01
Humans can make fast and highly efficient decisions by using simple heuristics that are assumed to exploit basic cognitive functions. In the study reported here, we used event-related potentials (ERPs) to disclose the psychological mechanisms underlying one of the most frugal decision rules, namely, the recognition heuristic. According to this heuristic, whenever two objects have to be ranked by a specific criterion and only one object is recognized, the recognized object is ranked higher than the unrecognized object. Using a standard recognition-heuristic paradigm, we predicted participants' decisions by analyzing an ERP correlate of familiarity-based recognition occurring 300 to 450 ms after stimulus onset. The measure remained a significant predictor even when later ERP correlates were taken into account. These findings are evidence for the thesis that simple heuristics exploit basic cognitive processes. Specifically, the findings show that familiarity--that is, recognition in the absence of recollection--contributes to decisions made on the basis of such heuristics.
Directory of Open Access Journals (Sweden)
Jorge A. Ruiz-Vanoye
2012-07-01
In this paper, we present a survey of meta-heuristic algorithms for the Traveling Salesman Problem that are based on the grouping of animals by social behavior, and propose a new classification of such meta-heuristic algorithms (not based on swarm intelligence theory) into swarm, school, flock, and herd algorithms: (a) swarm algorithms (inspired by insect swarms and zooplankton swarms): the Ant Colony Optimization algorithm (ACO, inspired by research on the behavior of ant colonies), the Firefly Algorithm (based on fireflies), the Marriage in Honey Bees Optimization algorithm (MBO, inspired by the honey bee), the Wasp Swarm Algorithm (inspired by parasitic wasps), the Termite Algorithm (inspired by termites), the Mosquito Swarms Algorithm (MSA, inspired by mosquito swarms), the Zooplankton Swarms Algorithm (ZSA, inspired by zooplankton), and the Bumblebees Swarms Algorithm (BSA, inspired by bumblebees); (b) school algorithms (inspired by fish schools): the Particle Swarm Optimization algorithm (PSO, inspired by the social behavior and movement dynamics of fish schooling); (c) flock algorithms (inspired by bird flocks): the flocking algorithm, and the Particle Swarm Optimization algorithm (inspired by the dynamics of birds); (d) herd and pack algorithms (inspired by mammal herds and packs): the bat algorithm (inspired by bats), the wolf pack search algorithm (WPS, inspired by wolves), the Rats Herds Algorithm (RATHA, inspired by rats), the Dolphins Herds Algorithm (DHA, inspired by dolphins), and the feral-dogs herd algorithm (FDHA, inspired by feral-dog herds).
Gallo, David A; Bell, Deborah M; Beier, Jonathan S; Schacter, Daniel L
2006-08-01
People often use recollection to avoid false memories. At least two types of recollection-based monitoring processes can be identified in the literature. Recall-to-reject is based on the recall of logically inconsistent information (which disqualifies the false event from having occurred), whereas the distinctiveness heuristic is based on the failure to recall to-be-expected information (which is diagnostic of non-occurrence). We attempted to investigate these hypothetical monitoring processes in a single task, as a first step toward delineating the functional relationship between them. By design, participants could reject familiar lures by (1) recalling them from a to-be-excluded list (recall-to-reject) or (2) realising the absence of expected picture recollections (the distinctiveness heuristic). Both manipulations reduced false recognition in young adults, suggesting that these two types of monitoring were deployed on the same test. In contrast, older adults had limited success in reducing false recognition with either manipulation, indicating deficits in recollection-based monitoring processes. Depending on how a retrieval task is structured, attempts to use one monitoring process might interfere with another, especially in older adults.
Support vector classifier based on principal component analysis
Institute of Scientific and Technical Information of China (English)
(No author listed)
2008-01-01
Support vector classifiers (SVCs) have significant advantages for small-sample learning problems with high dimensionality, in particular better generalization ability. However, there is redundancy among the high dimensions of the original samples, and the main features of the samples can be extracted first to improve the performance of an SVC. Principal component analysis (PCA) is employed to reduce the feature dimensions of the original samples and to pre-select the main features efficiently, and an SVC is constructed in the selected feature space to improve the learning speed and identification rate. Furthermore, a heuristic genetic algorithm-based automatic model selection is proposed to determine the hyperparameters of the SVC and to evaluate the performance of the learning machines. Experiments performed on the Heart and Adult benchmark data sets demonstrate that the proposed PCA-based SVC not only reduces the test time drastically but also improves the identification rate effectively.
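The dimension-reduction step described above can be sketched as follows. PCA is implemented here via the SVD of the centered data; as a hedge against heavy dependencies, a nearest-centroid classifier stands in for the SVC, and the genetic hyperparameter search is omitted. The synthetic two-class data set is purely illustrative:

```python
import numpy as np

def pca_reduce(X, k):
    """Project samples onto the top-k principal components."""
    Xc = X - X.mean(axis=0)
    # Right singular vectors of the centered data are the principal axes.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:k].T

rng = np.random.default_rng(0)
# Two classes separated along one direction, padded with noise dimensions.
X0 = rng.normal(0.0, 1.0, (50, 10)); X0[:, 0] += 4.0
X1 = rng.normal(0.0, 1.0, (50, 10)); X1[:, 0] -= 4.0
X = np.vstack([X0, X1]); y = np.array([0] * 50 + [1] * 50)

Z = pca_reduce(X, 2)  # 10 features -> 2 features before training the classifier

# Nearest-centroid stand-in for the SVC in the reduced space.
c0, c1 = Z[y == 0].mean(axis=0), Z[y == 1].mean(axis=0)
pred = (np.linalg.norm(Z - c1, axis=1) < np.linalg.norm(Z - c0, axis=1)).astype(int)
acc = (pred == y).mean()
```

Because the class-separating direction carries most of the variance, the first principal component preserves it and the classifier in the 2-D space remains accurate.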
QuickVina: accelerating AutoDock Vina using gradient-based heuristics for global optimization.
Handoko, Stephanus Daniel; Ouyang, Xuchang; Su, Chinh Tran To; Kwoh, Chee Keong; Ong, Yew Soon
2012-01-01
Predicting binding between a macromolecule and a small molecule is a crucial phase in the field of rational drug design. AutoDock Vina, one of the most widely used docking programs, released in 2009, uses an empirical scoring function to evaluate the binding affinity between the molecules and employs an iterated local search global optimizer, achieving significantly improved speed and better accuracy of binding mode prediction compared with its predecessor, AutoDock 4. In this paper, we propose a further improvement to the local search algorithm of Vina by heuristically preventing some intermediate points from undergoing local search. Our improved version of Vina, dubbed QVina, achieved a maximum acceleration of about 25 times, with an average speed-up of 8.34 times compared to the original Vina when tested on a set of 231 protein-ligand complexes, while keeping the optimal scores mostly identical. Using our heuristics, a larger number of different ligands can be quickly screened against a given receptor within the same time frame.
GPU-Based Heuristic Solver for Linear Sum Assignment Problems Under Real-time Constraints
Roverso, Roberto; El-Beltagy, Mohammed; El-Ansary, Sameh
2011-01-01
In this paper we modify a fast heuristic solver for the Linear Sum Assignment Problem (LSAP) for use on Graphical Processing Units (GPUs). The motivating scenario is an industrial application for P2P live streaming that is moderated by a central node which is periodically solving LSAP instances for assigning peers to one another. The central node needs to handle LSAP instances involving thousands of peers in as near to real-time as possible. Our findings are generic enough to be applied in other contexts. Our main result is a parallel version of a heuristic algorithm called Deep Greedy Switching (DGS) on GPUs using the CUDA programming language. DGS sacrifices absolute optimality in favor of low computation time and was designed as an alternative to classical LSAP solvers such as the Hungarian and auctioning methods. The contribution of the paper is threefold: First, we present the process of trial and error we went through, in the hope that our experience will be beneficial to adopters of GPU programming for...
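The deep-greedy-switching idea, greedy construction followed by local "switching" repairs, can be illustrated with a toy single-threaded sketch; the real DGS algorithm and its CUDA parallelization differ in detail, and this version only explores pairwise swaps:

```python
import itertools

def greedy_switching_lsap(cost):
    """Greedy assignment followed by pairwise 'switching' improvement.

    A toy sketch of the deep-greedy-switching idea: trade optimality
    guarantees for speed. cost[i][j] is the cost of assigning row i to
    column j; the matrix is assumed square."""
    n = len(cost)
    assigned, cols = [None] * n, set(range(n))
    for i in range(n):                       # greedy construction
        j = min(cols, key=lambda c: cost[i][c])
        assigned[i] = j
        cols.remove(j)
    improved = True
    while improved:                          # switching phase: try 2-swaps
        improved = False
        for i, k in itertools.combinations(range(n), 2):
            a, b = assigned[i], assigned[k]
            if cost[i][b] + cost[k][a] < cost[i][a] + cost[k][b]:
                assigned[i], assigned[k] = b, a
                improved = True
    return assigned, sum(cost[i][assigned[i]] for i in range(n))

assigned, total = greedy_switching_lsap([[4, 1, 3], [2, 0, 5], [3, 2, 2]])
```

Unlike the Hungarian method, 2-swap switching can stop at a local optimum; on this small instance it happens to reach the optimal total cost of 5.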
Path-Wise Test Data Generation Based on Heuristic Look-Ahead Methods
Directory of Open Access Journals (Sweden)
Ying Xing
2014-01-01
Full Text Available Path-wise test data generation is generally considered an important problem in the automation of software testing. In essence, it is a constraint optimization problem, which is often solved by search methods such as backtracking algorithms. In this paper, the backtracking algorithm branch and bound and state-space search from artificial intelligence are introduced to tackle the problem of path-wise test data generation. The former is utilized to explore the space of potential solutions and the latter is adopted to construct the search tree dynamically. Heuristics are employed in the look-ahead stage of the search. Dynamic variable ordering is presented with a heuristic rule to break ties, values of a variable are determined by monotonicity analysis of branching conditions, and path consistency is maintained through analysis of the results of interval arithmetic. An optimization method is also proposed to reduce the search space. Empirical experiments show that the search is conducted in an essentially backtrack-free manner, which ensures test data generation with promising performance and coverage superior to some existing static and dynamic methods. The results also demonstrate that the proposed method is applicable in engineering.
Heuristics for Multidimensional Packing Problems
DEFF Research Database (Denmark)
Egeblad, Jens
for a minimum height container required for the items. The main contributions of the thesis are three new heuristics for strip-packing and knapsack packing problems where items are both rectangular and irregular. In the first two papers we describe a heuristic for the multidimensional strip-packing problem...... for a three-dimensional knapsack packing problem involving furniture is presented in the fourth paper. The heuristic is based on a variety of techniques including tree-search, wall-building, and sequential placement. The solution process includes considerations regarding stability and load-bearing strength...... paper. Ensuring that a loaded consignment of items is balanced throughout a container can reduce fuel consumption and prolong the life-span of vehicles. The heuristic can be used as a post-processing tool to reorganize an existing solution to a packing problem. A method for optimizing the placement...
Heuristics for the Hodgkin-Huxley system.
Hoppensteadt, Frank
2013-09-01
Hodgkin and Huxley (HH) discovered that voltages control ionic currents in nerve membranes. This led them to describe electrical activity in a neuronal membrane patch in terms of an electronic circuit whose characteristics were determined using empirical data. Due to the complexity of this model, a variety of heuristics, including relaxation oscillator circuits and integrate-and-fire models, have been used to investigate activity in neurons, and these simpler models have been successful in suggesting experiments and explaining observations. Connections between most of the simpler models had not been made clear until recently. Shown here are connections between these heuristics and the full HH model. In particular, we study a new model (Type III circuit): It includes the van der Pol-based models; it can be approximated by a simple integrate-and-fire model; and it creates voltages and currents that correspond, respectively, to the h and V components of the HH system.
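A leaky integrate-and-fire model, one of the heuristics mentioned above, is short enough to show directly. The parameter values below are arbitrary illustrative units, not fitted HH constants:

```python
def lif_spike_times(i_ext, dt=0.1, t_max=100.0, tau=10.0,
                    v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Leaky integrate-and-fire neuron, forward-Euler integration.

    Membrane equation: dV/dt = (-(V - v_rest) + i_ext) / tau.
    The neuron 'fires' when V crosses v_thresh, then resets to v_reset.
    Returns the list of spike times."""
    v, t, spikes = v_rest, 0.0, []
    while t < t_max:
        v += dt * (-(v - v_rest) + i_ext) / tau
        if v >= v_thresh:
            spikes.append(t)
            v = v_reset
        t += dt
    return spikes
```

With a drive above threshold (e.g. `i_ext=1.5`) the model fires periodically; with a sub-threshold drive (`i_ext=0.5`) the voltage saturates below threshold and no spikes occur, which is the simplest caricature of the HH excitability threshold.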
Institute of Scientific and Technical Information of China (English)
Pei-Chann Chang; Wei-Hsiu Huang; Zhen-Zhen Zhang
2012-01-01
In this research, we introduce a new heuristic approach using the concept of ant colony optimization (ACO) to extract patterns from the chromosomes generated by previous generations for solving the generalized traveling salesman problem. The proposed heuristic is composed of two phases. In the first phase, the ACO technique is adopted to establish an archive consisting of a set of non-overlapping blocks and a set of remaining cities (nodes) to be visited. The second phase is a block recombination phase where the set of blocks and the remaining cities are combined to form an artificial chromosome. The generated artificial chromosomes (ACs) are then injected into a standard genetic algorithm (SGA) to speed up convergence. The proposed method is called the "Puzzle-Based Genetic Algorithm", or "p-ACGA". We demonstrate that p-ACGA performs very well on all TSPLIB problems which have been solved to optimality by other researchers. The proposed approach can prevent the early convergence of the genetic algorithm (GA) and lead the algorithm to explore and exploit the search space by taking advantage of the artificial chromosomes.
An Heuristic Drift-Based Model of the Power Scrape-Off Width in H-Mode Tokamaks
Energy Technology Data Exchange (ETDEWEB)
Robert J. Goldston
2011-02-28
A heuristic model for the plasma scrape-off width in H-mode plasmas is introduced. Grad-B and curv-B drifts into the SOL are balanced against sonic parallel flows out of the SOL, to the divertor plates. The overall mass flow pattern posited is a modification for open field lines of Pfirsch-Schlüter flows to include sinks to the divertors. These assumptions result in an estimated SOL width of 2aρp/R. They also result in a first-principles calculation of the particle confinement time of H-mode plasmas, qualitatively consistent with experimental observations. It is next assumed that anomalous perpendicular electron thermal diffusivity is the dominant source of heat flux across the separatrix, investing the SOL width, defined above, with heat from the main plasma. The separatrix temperature is calculated based on a two-point model balancing power input to the SOL with Spitzer-Härm parallel thermal conduction losses to the divertor. This results in a heuristic closed-form prediction for the power scrape-off width that is in remarkable quantitative agreement, both in absolute magnitude and in scaling, with recent experimental data. Further work should include full numerical calculations, including all magnetic and electric drifts, as well as more thorough comparison with experimental data.
Heuristic Drift-based Model of the Power Scrape-off width in H-mode Tokamaks
Energy Technology Data Exchange (ETDEWEB)
Robert J. Goldston
2011-04-29
A heuristic model for the plasma scrape-off width in H-mode plasmas is introduced. Grad-B and curv-B drifts into the SOL are balanced against sonic parallel flows out of the SOL, to the divertor plates. The overall particle flow pattern posited is a modification for open field lines of Pfirsch-Schlüter flows to include sinks to the divertors. These assumptions result in an estimated SOL width of ~2aρp/R. They also result in a first-principles calculation of the particle confinement time of H-mode plasmas, qualitatively consistent with experimental observations. It is next assumed that anomalous perpendicular electron thermal diffusivity is the dominant source of heat flux across the separatrix, investing the SOL width, defined above, with heat from the main plasma. The separatrix temperature is calculated based on a two-point model balancing power input to the SOL with Spitzer-Härm parallel thermal conduction losses to the divertor. This results in a heuristic closed-form prediction for the power scrape-off width that is in reasonable quantitative agreement, both in absolute magnitude and in scaling, with recent experimental data from deuterium plasmas. Further work should include full numerical calculations, including all magnetic and electric drifts, as well as more thorough comparison with experimental data.
Fazlollahtabar, Hamed
2010-12-01
Consumer expectations for automobile seat comfort continue to rise. It is therefore evident that the current automobile seat comfort development process, which is only sporadically successful, needs to change. In this context, there has been growing recognition of the need to establish theoretical and methodological foundations for automobile seat comfort. Seat producers, in turn, need to know the comfort that customers require so that production can reflect their interests. Current research methodologies apply qualitative approaches based on anthropometric specifications; their most significant weakness is the inexactness of the extracted inferences. Despite the qualitative nature of consumer preferences, there are methods that transform qualitative parameters into numerical values, which can help seat producers improve or enhance their products. Such an approach would also help automobile manufacturers to source their seats from the best producer with regard to consumer opinion. In this paper, a heuristic multi-criteria decision-making technique is applied to express consumer preferences as numerical values. The technique is a combination of the Analytic Hierarchy Process (AHP), the entropy method, and the Technique for Order Preference by Similarity to an Ideal Solution (TOPSIS). A case study is conducted to illustrate the applicability and effectiveness of the proposed heuristic approach.
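The TOPSIS step of such a combined technique can be sketched as follows. The seat criteria, weights, and scores in the example are hypothetical, and the AHP/entropy weighting that would normally produce the weight vector is omitted:

```python
import math

def topsis(matrix, weights, benefit):
    """Rank alternatives by TOPSIS: relative closeness to the ideal solution.

    matrix[i][j]: score of alternative i on criterion j.
    benefit[j]:   True when larger is better for criterion j.
    weights:      assumed already normalized (e.g. from AHP or entropy)."""
    m, n = len(matrix), len(matrix[0])
    # Vector normalization, then weighting.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(m))) for j in range(n)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)] for i in range(m)]
    # Ideal (best) and negative-ideal (worst) points per criterion.
    ideal = [max(col) if benefit[j] else min(col) for j, col in enumerate(zip(*v))]
    nadir = [min(col) if benefit[j] else max(col) for j, col in enumerate(zip(*v))]
    scores = []
    for row in v:
        d_pos, d_neg = math.dist(row, ideal), math.dist(row, nadir)
        scores.append(d_neg / (d_pos + d_neg))   # closeness coefficient in [0, 1]
    return scores

# Toy seat example with hypothetical criteria: support, softness, cost.
scores = topsis([[7, 8, 300], [9, 6, 450], [6, 9, 250]],
                [0.5, 0.3, 0.2], [True, True, False])
```

The alternative with the largest closeness coefficient is ranked first; cost is marked as a non-benefit criterion so that lower values pull toward the ideal.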
Efficient Heuristic Variable Ordering of OBDDs
Institute of Scientific and Technical Information of China (English)
(No author listed)
2000-01-01
An efficient heuristic algorithm for variable ordering of OBDDs, the WDHA (Weight-and-Distance-based Heuristic Algorithm), is presented. The algorithm is based on heuristics implied by the circuit structure graph. To quantify these heuristics, pi-weight, node-weight, average-weight and pi-distance in the circuit structure graph are defined. As no single heuristic is a panacea for all circuits, several sub-algorithms are proposed to cope with various cases. One is a direct method that uses pi-weight and pi-distance. The others are based on depth-first-search (DFS) traversal of the circuit structure graph, each focusing on one of the heuristics. An adaptive order-selection strategy is adopted in WDHA. Experimental results show that WDHA is efficient in terms of BDD size and run time, and that dynamic OBDD variable ordering is more attractive when combined with WDHA.
Heuristics-Based Trust Estimation in Multiagent Systems Using Temporal Difference Learning.
Rishwaraj, G; Ponnambalam, S G; Loo, Chu Kiong
2016-12-20
The application of multiagent systems (MASs) is becoming increasingly popular, as they allow agents in a system to pool resources to achieve a common objective. A vital part of a MAS is teamwork: cooperation through the sharing of information and resources among agents to optimize their efforts in accomplishing given objectives. A critical part of the teamwork effort is the ability of agents to trust each other when executing a task, to ensure efficient and successful cooperation. This paper presents the development of a trust estimation model that can empirically evaluate the trust of an agent in a MAS. The proposed model is developed using temporal difference learning, incorporating the concept of Markov games and heuristics to estimate trust. Simulation experiments are conducted to test and evaluate the performance of the developed model against some recently reported models in the literature. The experiments indicate that the developed model performs better in terms of accuracy and efficiency in estimating trust.
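The core temporal-difference update behind such a trust model fits in a few lines. The step size, the deterministic 80%-reliable outcome stream, and the omission of the Markov-game machinery are all simplifications of this sketch:

```python
def update_trust(trust, outcome, alpha=0.1):
    """TD-style update: move the trust estimate toward the observed
    interaction outcome (1 = success, 0 = failure) by step size alpha."""
    return trust + alpha * (outcome - trust)

# An agent that succeeds in 4 out of every 5 interactions;
# the trust estimate should settle near its 0.8 success rate.
trust = 0.5
outcomes = [1, 1, 1, 1, 0] * 100
for o in outcomes:
    trust = update_trust(trust, o)
```

Because the update is an exponential moving average, the estimate tracks recent behavior: a formerly reliable agent that starts failing loses trust at a rate controlled by `alpha`.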
A Meta-Heuristic Regression-Based Feature Selection for Predictive Analytics
Directory of Open Access Journals (Sweden)
Bharat Singh
2014-11-01
Full Text Available Selecting an optimal feature subset from a very large number of features is an NP-complete problem. Because conventional optimization techniques are unable to tackle large-scale feature selection problems, meta-heuristic algorithms are widely used. In this paper, we propose a particle swarm optimization technique that utilizes regression techniques for feature selection. We then use the selected features to classify the data. Classification accuracy is used as the criterion for evaluating classifier performance, and classification is accomplished through the use of k-nearest neighbour (KNN) and Bayesian techniques. Various high-dimensional data sets are used to evaluate the usefulness of the proposed approach. Results show that our approach gives better results when compared with other conventional feature selection algorithms.
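A binary PSO for feature selection can be sketched as below. To keep the example self-contained, a toy fitness function (reward relevant features, penalize subset size) stands in for the KNN classification accuracy the paper uses, and the planted "relevant" features are an assumption of the sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
n_feat = 12
relevant = np.zeros(n_feat, dtype=bool)
relevant[:4] = True  # planted ground truth for the toy objective

def fitness(mask):
    """Toy stand-in for classifier accuracy: reward selecting relevant
    features, penalize subset size (real use: KNN accuracy on the subset)."""
    return (mask & relevant).sum() - 0.2 * mask.sum()

n_particles, iters = 20, 60
pos = rng.random((n_particles, n_feat)) < 0.5           # binary positions
vel = rng.normal(0.0, 1.0, (n_particles, n_feat))
pbest, pbest_f = pos.copy(), np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmax()].copy()

for _ in range(iters):
    r1, r2 = rng.random((2, n_particles, n_feat))
    # Velocity pulled toward bits that differ from personal/global bests.
    vel = 0.7 * vel + 1.5 * r1 * (pbest ^ pos) + 1.5 * r2 * (gbest ^ pos)
    # Sigmoid of velocity gives the probability of setting each bit.
    pos = rng.random((n_particles, n_feat)) < 1.0 / (1.0 + np.exp(-vel))
    f = np.array([fitness(p) for p in pos])
    better = f > pbest_f
    pbest[better], pbest_f[better] = pos[better], f[better]
    gbest = pbest[pbest_f.argmax()].copy()
```

After the run, `gbest` is the best feature mask found; in the real pipeline its fitness would be the cross-validated accuracy of a KNN or Bayesian classifier trained on the selected columns.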
Ziemer, Benjamin P; Sanghvi, Parag; Hattangadi-Gluth, Jona; Moore, Kevin L
2017-07-21
plan quality improvements were quantified by calculating the difference between SRS quality metrics (QMs): ΔQM = QM_clinical − QM_KBP. In addition to GM, investigated QMs were: volume of brain receiving ≥10 Gy (V10Gy), volume of brain receiving ≥5 Gy (V5Gy), heterogeneity index (HI), dose to 0.1 cc of the brainstem (D0.1cc), dose to 1% of the optic chiasm (D1%), and interlesion dose (D_IL). In addition to this quantitative analysis, overall plan quality was assessed via blinded comparison of the manual and KBP treatment plans by SRS-specializing physicians. A dose combination factor of n = 8 yielded an integrated dose profile RMS difference of 2.9% across the 41-patient cohort. Multimet dose predictions exhibited ΔGM = 0.07 ± 0.10 cm against the clinical sample, implying either that further normal tissue sparing was possible or that the dose predictions slightly overestimated achievable dose gradients. The latter is the more likely explanation, as this bias vanished when dose predictions were converted to deliverable KBP plans (ΔGM = 0.00 ± 0.08 cm). The remaining QMs were nearly identical or showed modest improvements in the KBP sample. Equivalent QMs included ΔV10Gy = 0.37 ± 3.78 cc, ΔHI = 0.02 ± 0.08 and ΔD_IL = −2.22 ± 171.4 cGy. The KBP plans showed a greater degree of normal tissue sparing, as indicated by brain ΔV5Gy = 4.11 ± 24.05 cc, brainstem ΔD0.1cc = 42.8 ± 121.4 cGy, and chiasm ΔD1% = 50.8 ± 83.0 cGy. In blinded review by SRS-specializing physicians, KBP-generated plans were deemed equivalent or superior in 32/41 (78.1%) of the cases. Heuristic KBP-driven automated planning in linac-based, single-isocenter treatments for multiple brain metastases maintained or exceeded overall plan quality. © 2017 American Association of Physicists in Medicine.
Directory of Open Access Journals (Sweden)
Markowski Marcin
2017-09-01
Full Text Available In recent years elastic optical networks have been perceived as a prospective choice for future optical networks, due to better adjustment and utilization of optical resources than is the case with traditional wavelength division multiplexing networks. In the paper we investigate the elastic architecture as the communication network for distributed data centers. We address the problems of optimizing routing and spectrum assignment for large-scale computing systems based on an elastic optical architecture; in particular, we concentrate on optimizing anycast user-to-data-center traffic. We assume that the computational resources of the data centers are limited. For this offline problem we formulate an integer linear programming model and propose several heuristics, including a meta-heuristic algorithm based on a tabu search method. We report computational results, presenting the quality of approximate solutions and the efficiency of the proposed heuristics, and we also analyze and compare some data center allocation scenarios.
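A minimal tabu search skeleton of the kind underlying such a meta-heuristic might look as follows; the bit-flip neighborhood and the toy cost function are illustrative stand-ins, not the paper's routing and spectrum assignment model:

```python
from collections import deque

def tabu_search(n_bits, cost, iters=200, tenure=7):
    """Minimal tabu search over bit vectors: flip one bit per move, forbid
    re-flipping a bit for `tenure` iterations unless the move improves on
    the best solution found so far (aspiration criterion)."""
    cur = [0] * n_bits
    best, best_c = cur[:], cost(cur)
    tabu = deque(maxlen=tenure)
    for _ in range(iters):
        candidates = []
        for i in range(n_bits):
            nb = cur[:]
            nb[i] ^= 1
            c = cost(nb)
            if i not in tabu or c < best_c:   # aspiration overrides tabu
                candidates.append((c, i, nb))
        if not candidates:
            continue
        c, i, cur = min(candidates)           # best admissible neighbor
        tabu.append(i)
        if c < best_c:
            best, best_c = cur[:], c
    return best, best_c

# Toy instance (hypothetical): cost = Hamming distance to a target assignment.
target = [1, 0, 1, 1, 0, 0, 1, 0]
best, best_c = tabu_search(8, lambda s: sum(a != b for a, b in zip(s, target)))
```

The tabu list forces the search away from recently changed decisions, which is what lets the method escape local optima in the real allocation problem.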
Heuristics for Hierarchical Partitioning with Application to Model Checking
DEFF Research Database (Denmark)
Möller, Michael Oliver; Alur, Rajeev
2001-01-01
for a temporal scaling technique, called “Next” heuristic [2]. The latter is applicable in reachability analysis and is included in a recent version of the Mocha model checking tool. We demonstrate performance and benefits of our method and use an asynchronous parity computer and an opinion poll protocol as case...... that captures the quality of a structure relative to the connections and favors shallow structures with a low degree of branching. Finding a structure with minimal cost is NP-complete. We present a greedy polynomial-time algorithm that approximates good solutions incrementally by local evaluation of a heuristic...... function. We argue for a heuristic function based on four criteria: the number of enclosed connections, the number of components, the number of touched connections and the depth of the structure. We report on an application in the context of formal verification, where our algorithm serves as a preprocessor...
Agnisarman, Sruthy; Narasimha, Shraddhaa; Chalil Madathil, Kapil; Welch, Brandon; Brinda, Fnu; Ashok, Aparna; McElligott, James
2017-04-24
Telemedicine is the use of technology to provide and support health care when distance separates the clinical service and the patient. Home-based telemedicine systems use such technology for medical support and care, connecting patients with clinicians from the comfort of their homes. For such systems to be used extensively, it is necessary to understand the issues faced not only by patients but also by clinicians. The aim of this study was to conduct a heuristic evaluation of four telemedicine software platforms (Doxy.me, Polycom, Vidyo, and VSee) to assess possible problems and limitations that could affect the usability of the systems from the clinician's perspective. Five experts individually evaluated all four systems using Nielsen's list of heuristics, classifying the issues based on a severity rating scale. A total of 46 unique problems were identified by the experts. The heuristics most frequently violated were visibility of system status and error prevention, each accounting for 24% (11/46) of the issues. Esthetic and minimalist design was second, contributing 13% (6/46) of the total errors. Heuristic evaluation coupled with a severity rating scale was found to be an effective method for identifying problems with the systems. Prioritization of these problems based on the rating provides a good starting point for resolving the issues affecting these platforms. There is a need for better transparency and a more streamlined approach to how physicians use telemedicine systems. Visibility of system status and speaking the users' language are key to achieving this.
Formalization in Component Based Development
DEFF Research Database (Denmark)
Holmegaard, Jens Peter; Knudsen, John; Makowski, Piotr
2006-01-01
We present a unifying conceptual framework for components, component interfaces, contracts and composition of components by focusing on the collection of properties or qualities that they must share. A specific property, such as signature, functionality behaviour or timing is an aspect. Each aspect...
Liu, Weibo; Jin, Yan; Price, Mark
2016-10-01
A new heuristic based on the Nawaz-Enscore-Ham algorithm is proposed in this article for solving a permutation flow-shop scheduling problem. A new priority rule is proposed by accounting for the average, mean absolute deviation, skewness and kurtosis, in order to fully describe the distribution style of processing times. A new tie-breaking rule is also introduced for achieving effective job insertion with the objective of minimizing both makespan and machine idle time. Statistical tests illustrate better solution quality of the proposed algorithm compared to existing benchmark heuristics.
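The four distribution statistics behind the proposed priority rule can be computed directly. The weights combining them below are illustrative assumptions, since the paper's exact rule is not reproduced here:

```python
import statistics as st

def moments(times):
    """Mean, mean absolute deviation, skewness and kurtosis of one job's
    processing times across machines (population definitions)."""
    mu = st.fmean(times)
    mad = st.fmean(abs(t - mu) for t in times)
    sd = st.pstdev(times)
    skew = st.fmean(((t - mu) / sd) ** 3 for t in times) if sd else 0.0
    kurt = st.fmean(((t - mu) / sd) ** 4 for t in times) if sd else 0.0
    return mu, mad, skew, kurt

def priority_order(jobs, w=(1.0, 1.0, -0.5, -0.5)):
    """Sort job indices for NEH-style insertion by a weighted sum of the
    four moments, largest priority first. The weight vector is a guess
    for illustration; the paper's tie-breaking rule is also omitted."""
    def priority(j):
        return sum(wi * mi for wi, mi in zip(w, moments(jobs[j])))
    return sorted(range(len(jobs)), key=priority, reverse=True)

# jobs[j][m] = processing time of job j on machine m (toy data).
order = priority_order([[3, 3, 3], [9, 1, 2], [5, 5, 8]])
```

In the full NEH procedure, jobs would then be taken in this order and inserted one at a time into the partial sequence position that minimizes makespan and idle time.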
de Jong, Menno D.T.; van der Geest, Thea
2000-01-01
This article is intended to make Web designers more aware of the qualities of heuristics by presenting a framework for analyzing the characteristics of heuristics. The framework is meant to support Web designers in choosing among alternative heuristics. We hope that better knowledge of the
DEFF Research Database (Denmark)
Vlachogiannis, Ioannis (John); Lee, KY
2009-01-01
In this paper an improved coordinated aggregation-based particle swarm optimization (ICA-PSO) algorithm is introduced for solving the optimal economic load dispatch (ELD) problem in power systems. In the ICA-PSO algorithm each particle in the swarm retains a memory of its best position ever...... and the particles search the decision space with accuracy up to two digit points resulting in the improved convergence of the process. The ICA-PSO algorithm is tested on a number of power systems, including the systems with 6, 13, 15, and 40 generating units, the island power system of Crete in Greece...... and the Hellenic bulk power system, and is compared with other state-of-the-art heuristic optimization techniques (HOTs), demonstrating improved performance over them....
Institute of Scientific and Technical Information of China (English)
Deepak KAPUR
2006-01-01
A method using quantifier-elimination is proposed for automatically generating program invariants/inductive assertions. Given a program, inductive assertions, hypothesized as parameterized formulas in a theory, are associated with program locations. Parameters in inductive assertions are discovered by generating constraints on parameters by ensuring that an inductive assertion is indeed preserved by all execution paths leading to the associated location of the program. The method can be used to discover loop invariants-properties of variables that remain invariant at the entry of a loop. The parameterized formula can be successively refined by considering execution paths one by one; heuristics can be developed for determining the order in which the paths are considered. Initialization of program variables as well as the precondition and postcondition, if available, can also be used to further refine the hypothesized invariant. The method does not depend on the availability of the precondition and postcondition of a program. Constraints on parameters generated in this way are solved for possible values of parameters. If no solution is possible, this means that an invariant of the hypothesized form is not likely to exist for the loop under the assumptions/approximations made to generate the associated verification condition. Otherwise, if the parametric constraints are solvable, then under certain conditions on methods for generating these constraints, the strongest possible invariant of the hypothesized form can be generated from most general solutions of the parametric constraints. The approach is illustrated using the logical languages of conjunction of polynomial equations as well as Presburger arithmetic for expressing assertions.
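The parameter-solving idea can be illustrated on a tiny example: hypothesize a quadratic invariant for a summation loop, derive linear constraints on its parameters from a few program states, solve them exactly, and check the candidate on a held-out state. Exact linear algebra over rationals stands in for the quantifier-elimination machinery here, and the loop is an invented toy:

```python
from fractions import Fraction as F

def solve3(A, b):
    """Exact Gauss-Jordan elimination for a 3x3 system over Fractions."""
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for col in range(3):
        piv = next(r for r in range(col, 3) if M[r][col] != 0)
        M[col], M[piv] = M[piv], M[col]
        piv_val = M[col][col]
        M[col] = [x / piv_val for x in M[col]]
        for r in range(3):
            if r != col and M[r][col] != 0:
                factor = M[r][col]
                M[r] = [x - factor * y for x, y in zip(M[r], M[col])]
    return [M[r][3] for r in range(3)]

# Execute the loop  `while i < n: s += i; i += 1`  and record entry states.
states, s = [], 0
for i in range(4):
    states.append((i, s))   # (i, s) at loop entry
    s += i

# Hypothesize the invariant  s = a*i^2 + b*i + c  and solve for a, b, c
# from the first three states.
A = [[F(i * i), F(i), F(1)] for i, _ in states[:3]]
rhs = [F(sv) for _, sv in states[:3]]
a, b, c = solve3(A, rhs)

# Check the candidate invariant on a state not used for fitting.
i3, s3 = states[3]
assert a * i3 * i3 + b * i3 + c == s3
```

The recovered parameters give s = i(i-1)/2, the familiar closed form; in the paper's method the constraints come from verification conditions over all paths rather than from sampled executions, so the result there is sound by construction.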
A heuristic-based approach for reliability importance assessment of energy producers
Energy Technology Data Exchange (ETDEWEB)
Akhavein, A., E-mail: a_akhavein@azad.ac.i [Department of Engineering, Science and Research Branch, Islamic Azad University, Tehran (Iran, Islamic Republic of); Fotuhi Firuzabad, M., E-mail: fotuhi@sharif.ed [Department of Electrical Engineering, Sharif University of Technology, Tehran (Iran, Islamic Republic of)
2011-03-15
Reliability of energy supply is one of the most important issues of service quality. On the one hand, customers usually have different expectations for service reliability and price. On the other hand, providing different levels of reliability at load points is a challenge for system operators. In order to take reasonable decisions and overcome difficulties in implementing reliability, market players need to know the impact of their assets on system and load-point reliability. One tool to specify the reliability impact of assets is the criticality or reliability importance measure, by which system components can be ranked based on their effect on reliability. Conventional methods for determining reliability importance are essentially based on risk sensitivity analysis and hence impose a prohibitive calculation burden in large power systems. An approach is proposed in this paper to determine the reliability importance of energy producers from the perspective of consumers or distribution companies in a composite generation and transmission system. In the presented method, while avoiding an immense computational burden, the energy producers are ranked based on their rating, unavailability and impact on power flows in the lines connecting to the considered load points. Study results on the IEEE reliability test system show successful application of the proposed method. Research highlights: (1) The required reliability level at load points is a concern in modern power systems. (2) It is important to assess the reliability importance of energy producers or generators. (3) Generators can be ranked based on their impact on power flow to a selected area. (4) Ranking of generators is an efficient tool to assess their reliability importance.
Component Based Electronic Voting Systems
Lundin, David
An electronic voting system may be said to be composed of a number of components, each of which has a number of properties. One of the most attractive effects of this way of thinking is that each component may have an attached in-depth threat analysis and verification strategy. Furthermore, the need to include the full system when making changes to a component is minimised and a model at this level can be turned into a lower-level implementation model where changes can cascade to as few parts of the implementation as possible.
Gigerenzer, Gerd; Gaissmaier, Wolfgang
2011-01-01
As reflected in the amount of controversy, few areas in psychology have undergone such dramatic conceptual changes in the past decade as the emerging science of heuristics. Heuristics are efficient cognitive processes, conscious or unconscious, that ignore part of the information. Because using heuristics saves effort, the classical view has been that heuristic decisions imply greater errors than do "rational" decisions as defined by logic or statistical models. However, for many decisions, the assumptions of rational models are not met, and it is an empirical rather than an a priori issue how well cognitive heuristics function in an uncertain world. To answer both the descriptive question ("Which heuristics do people use in which situations?") and the prescriptive question ("When should people rely on a given heuristic rather than a complex strategy to make better judgments?"), formal models are indispensable. We review research that tests formal models of heuristic inference, including in business organizations, health care, and legal institutions. This research indicates that (a) individuals and organizations often rely on simple heuristics in an adaptive way, and (b) ignoring part of the information can lead to more accurate judgments than weighting and adding all information, for instance for low predictability and small samples. The big future challenge is to develop a systematic theory of the building blocks of heuristics as well as the core capacities and environmental structures these exploit.
Modeling reproductive decisions with simple heuristics
Directory of Open Access Journals (Sweden)
Peter Todd
2013-10-01
Full Text Available BACKGROUND Many of the reproductive decisions that humans make happen without much planning or forethought, arising instead through the use of simple choice rules or heuristics that involve relatively little information and processing. Nonetheless, these heuristic-guided decisions are typically beneficial, owing to humans' ecological rationality - the evolved fit between our constrained decision mechanisms and the adaptive problems we face. OBJECTIVE This paper reviews research on the ecological rationality of human decision making in the domain of reproduction, showing how fertility-related decisions are commonly made using various simple heuristics matched to the structure of the environment in which they are applied, rather than being made with information-hungry mechanisms based on optimization or rational economic choice. METHODS First, heuristics for sequential mate search are covered; these heuristics determine when to stop the process of mate search by deciding that a good-enough mate who is also mutually interested has been found, using a process of aspiration-level setting and assessing. These models are tested via computer simulation and comparison to demographic age-at-first-marriage data. Next, a heuristic process of feature-based mate comparison and choice is discussed, in which mate choices are determined by a simple process of feature-matching with relaxing standards over time. Parental investment heuristics used to divide resources among offspring are summarized. Finally, methods for testing the use of such mate choice heuristics in a specific population over time are then described.
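The aspiration-level heuristic for sequential mate search can be sketched as a simulation. The pool size, learning-phase length, and uniform quality distribution are illustrative choices, and mutual interest and relaxing standards are omitted from this sketch:

```python
import random

def aspiration_search(qualities, learning_phase):
    """Sequential search with an aspiration level: calibrate on the first
    `learning_phase` candidates (rejecting all of them), then accept the
    first later candidate whose quality exceeds the aspiration level.
    Returns the index of the chosen candidate."""
    aspiration = max(qualities[:learning_phase], default=float("-inf"))
    for idx in range(learning_phase, len(qualities)):
        if qualities[idx] > aspiration:
            return idx
    return len(qualities) - 1  # settle for the last candidate

random.seed(7)
pool = [random.random() for _ in range(100)]
chosen = aspiration_search(pool, learning_phase=12)
```

Running this over many simulated pools yields a distribution of stopping times, which is the kind of output that can be compared against demographic age-at-first-marriage data as described above.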
Directory of Open Access Journals (Sweden)
I PUTU SUDANA
2011-01-01
Full Text Available The heuristics principle cannot be regarded as a non-rational decision-making approach, because its unconscious or subtle-mind application cannot be considered irrational. For this reason, there are sufficient grounds to argue that the classification of decision approaches should use the terms analytical and experiential, rather than rational and non-rational as is commonly done. Applications of the heuristics approach can be found in various disciplines, including business and accounting. The topic of heuristics deserves broad attention from researchers in accounting. The field of behavioral research in accounting offers many avenues of inquiry, because the heuristics principle is closely tied to the human aspect of decision making.
A Variable-Selection Heuristic for K-Means Clustering.
Brusco, Michael J.; Cradit, J. Dennis
2001-01-01
Presents a variable selection heuristic for nonhierarchical (K-means) cluster analysis based on the adjusted Rand index for measuring cluster recovery. Subjected the heuristic to Monte Carlo testing across more than 2,200 datasets. Results indicate that the heuristic is extremely effective at eliminating masking variables.
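The adjusted Rand index used above to measure cluster recovery has a standard pair-counting formula; the following is a minimal self-contained implementation (the example labelings are hypothetical, and degenerate single-cluster inputs are not handled):

```python
from math import comb

def adjusted_rand_index(labels_a, labels_b):
    """Chance-corrected agreement between two partitions of the same items."""
    pairs, rows, cols = {}, {}, {}
    for a, b in zip(labels_a, labels_b):
        pairs[(a, b)] = pairs.get((a, b), 0) + 1  # contingency table cells
        rows[a] = rows.get(a, 0) + 1
        cols[b] = cols.get(b, 0) + 1
    n = len(labels_a)
    sum_ij = sum(comb(v, 2) for v in pairs.values())
    sum_a = sum(comb(v, 2) for v in rows.values())
    sum_b = sum(comb(v, 2) for v in cols.values())
    expected = sum_a * sum_b / comb(n, 2)   # expected index under chance
    max_index = (sum_a + sum_b) / 2
    return (sum_ij - expected) / (max_index - expected)
```

A value of 1 indicates perfect recovery of the true partition (up to relabeling); values near 0 indicate chance-level agreement.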
Institute of Scientific and Technical Information of China (English)
Xiao LIU; Jia-wei YE
2011-01-01
We present a new algorithm for nesting problems. Many equally spaced points are set on a sheet, and a piece is moved to one of the points and rotated by an angle. Together, the point and the rotation angle constitute the packing attitude of the piece. We propose a new algorithm named HAPE (Heuristic Algorithm based on the principle of minimum total Potential Energy) to find the optimal packing attitude, at which the piece has the lowest center of gravity. In addition, a new technique for polygon overlap testing is proposed which avoids the time-consuming calculation of the no-fit polygon (NFP). The detailed implementation of HAPE is presented and two computational experiments are described. The first experiment is based on a real industrial problem and the second on 11 published benchmark problems. Using a hill-climbing (HC) search method, the proposed algorithm performs well in comparison with other published solutions.
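HAPE's selection rule, choosing the feasible packing attitude with the lowest center of gravity, can be sketched as below. This is an illustrative sketch only: `feasible` stands in for the paper's polygon overlap test and is assumed to be supplied by the caller, and the centroid is the standard shoelace-based area centroid.

```python
import math

def rotate(poly, angle_deg):
    """Rotate a polygon (list of (x, y) vertices) about the origin."""
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    return [(x * c - y * s, x * s + y * c) for x, y in poly]

def centroid_y(poly):
    """y-coordinate of a simple polygon's area centroid (shoelace formula)."""
    area2 = cy = 0.0
    n = len(poly)
    for i in range(n):
        x0, y0 = poly[i]
        x1, y1 = poly[(i + 1) % n]
        cross = x0 * y1 - x1 * y0
        area2 += cross
        cy += (y0 + y1) * cross
    return cy / (3.0 * area2)

def best_attitude(poly, positions, angles, feasible):
    """Among feasible (position, angle) pairs, pick the attitude giving the
    lowest center of gravity, mimicking HAPE's minimum-potential-energy rule."""
    best = None
    for (px, py) in positions:
        for ang in angles:
            placed = [(x + px, y + py) for x, y in rotate(poly, ang)]
            if not feasible(placed):
                continue
            h = centroid_y(placed)
            if best is None or h < best[0]:
                best = (h, (px, py), ang)
    return best  # (centroid height, position, angle)
```

In the paper the candidate positions are the equally spaced sheet points and the angles a fixed discrete set; the sketch simply enumerates whatever candidates it is given.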
Energy Technology Data Exchange (ETDEWEB)
Liu Huanxiang [Department of Chemistry, Lanzhou University, Lanzhou 730000 (China); Yao Xiaojun [Department of Chemistry, Lanzhou University, Lanzhou 730000 (China)]. E-mail: xjyao@lzu.edu.cn; Liu Mancang [Department of Chemistry, Lanzhou University, Lanzhou 730000 (China); Hu Zhide [Department of Chemistry, Lanzhou University, Lanzhou 730000 (China); Fan Botao [Universite Paris 7-Denis Diderot, ITODYS 1, rue Guy de la Brosse, 75005 Paris (France)
2006-02-03
Based on calculated molecular descriptors from the solutes' structure alone, the micelle-water partition coefficients of 103 solutes in micellar electrokinetic chromatography (MEKC) were predicted using the heuristic method (HM). At the same time, in order to show the influence of different molecular descriptors on the micelle-water partition of solutes and to better understand the retention mechanism in MEKC, HM was used to build several multivariable linear models using different numbers of molecular descriptors. The best 6-parameter model gave the following results: the square of the correlation coefficient R² was 0.958 and the mean relative error was 3.98%, which proved that the predicted values were in good agreement with the experimental results. From the built model, it can be concluded that the hydrophobic, H-bond, and polar interactions of the solutes with the micellar and aqueous phases are the main factors that determine their partitioning behavior. In addition, this paper provides a simple, fast and effective method for predicting the retention of solutes in MEKC from their structures and gives some insight into the structural features related to their retention.
Directory of Open Access Journals (Sweden)
Vinicius Amorim Sobreiro
2013-06-01
Full Text Available The definition of the product mix determines the allocation of productive resources in the manufacturing process and the optimization of the productive system. However, defining the product mix is an NP-complete problem, in other words, one that is hard to solve. With this in mind, and with the aid of the Theory of Constraints (TOC), several constructive heuristics have been proposed to help solve this problem. The objective of this paper is therefore to propose a new heuristic that provides better solutions than the main heuristic presented in the literature, the TOC-h of Fredendall and Lea. For this comparison, simulations were carried out with the objective of identifying the product mix with the best throughput, considering CPU time and the characteristics of the production environment. The results show that the proposed heuristic performed better than TOC-h and produced good solutions when compared with the optimum. This highlights the value of the proposed heuristic for defining the product mix.
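TOC-style constructive heuristics for the product mix typically rank products by contribution margin per minute on the constraint (bottleneck) resource and load the bottleneck greedily in that order. A minimal sketch of this ranking rule, with hypothetical product data (the exact TOC-h procedure differs in its details):

```python
def toc_product_mix(products, capacity):
    """Greedy TOC-style product-mix heuristic.

    products: list of (name, margin_per_unit, bottleneck_minutes_per_unit, demand)
    capacity: available minutes on the bottleneck resource
    """
    # Rank by throughput (margin) per bottleneck minute, best first.
    ranked = sorted(products, key=lambda p: p[1] / p[2], reverse=True)
    mix, throughput, remaining = {}, 0.0, capacity
    for name, margin, minutes, demand in ranked:
        units = min(demand, int(remaining // minutes))
        if units > 0:
            mix[name] = units
            throughput += units * margin
            remaining -= units * minutes
    return mix, throughput

# Hypothetical example: product P earns 45 per unit using 15 bottleneck
# minutes; Q earns 60 per unit using 30 minutes; 2400 minutes available.
mix, total = toc_product_mix([("P", 45, 15, 100), ("Q", 60, 30, 50)], 2400)
print(mix, total)
```

Ranking by margin per bottleneck minute (not by margin alone) is the point of the TOC rule: P's ratio of 3 per minute beats Q's 2, so P fills the constraint first.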
Comparison of Heuristics for Inhibitory Rule Optimization
Alsolami, Fawaz
2014-09-13
Knowledge representation and extraction are very important tasks in data mining. In this work, we propose a variety of rule-based greedy algorithms able to extract the knowledge contained in a given dataset as a set of inhibitory rules, which contain an expression “attribute ≠ value” on the right-hand side. The main goal of this paper is to determine, based on the rule characteristics of length and coverage, whether the proposed rule heuristics differ statistically significantly; if so, we aim to identify the best-performing heuristics for minimizing rule length and maximizing rule coverage. The Friedman test with the Nemenyi post-hoc test is used to compare the greedy algorithms statistically against each other for length and coverage. The experiments are carried out on real datasets from the UCI Machine Learning Repository. For the leading heuristics, the constructed rules are compared with optimal ones obtained by a dynamic programming approach. The results are promising for the best heuristics: the average relative difference between the length (coverage) of constructed and optimal rules is at most 2.27% (7%, respectively). Furthermore, the quality of classifiers based on sets of inhibitory rules constructed by the considered heuristics is compared, and the results show that the three best heuristics from the point of view of classification accuracy coincide with the three best heuristics from the point of view of rule length minimization.
The artifacts of component-based development
Qureshi, M Rizwan Jameel
2012-01-01
The idea of component-based development was first floated at a 1968 conference under the name "Mass Produced Software Components" [1]. Since then, engineering and scientific libraries have been developed to reuse previously developed functions. This concept is now widely used in software development as component-based development (CBD). Component-based software engineering (CBSE) is used to develop and assemble software from existing components [2]. Software developed using components is called componentware [3]. This paper presents different architectures for CBD, such as ActiveX, the Common Object Request Broker Architecture (CORBA), Remote Method Invocation (RMI) and the Simple Object Access Protocol (SOAP). The overall objective of this paper is to support the practice of CBD by comparing its advantages and disadvantages. This paper also evaluates the object-oriented process model and adapts it for CBD.
Heuristic Synthesis of Reversible Logic – A Comparative Study
Directory of Open Access Journals (Sweden)
Chua Shin Cheng
2014-01-01
Full Text Available Reversible logic circuits were historically motivated by theoretical research in low-power design, and have recently attracted interest as components of quantum algorithms, optical computing and nanotechnology. However, due to the intrinsic properties of reversible logic, traditional irreversible logic design and synthesis methods cannot be applied, so a new set of algorithms has been developed to correctly synthesize reversible logic circuits. This paper presents a comprehensive literature review, with a comparative study, of heuristic-based reversible logic synthesis. It reviews a range of heuristic-based reversible logic synthesis techniques reported by researchers (BDD-based, cycle-based, search-based, non-search-based, rule-based, transformation-based, and ESOP-based). All techniques are described in detail and summarized in a table based on their features, limitations, the libraries used and the metrics they consider. Benchmark comparisons of gate count and quantum cost are analysed for each synthesis technique. Comparing the outputs of the synthesis algorithms over the years, it can be observed that different approaches have been used for the synthesis of reversible circuits, yet the improvements are not significant: quantum cost and gate count have improved, but debates continue on certain issues, such as garbage outputs, that remain unresolved. This paper gathers information on all the heuristic-based reversible logic synthesis methods proposed over the years. All techniques are explained in detail, making the paper informative for new reversible logic researchers and bridging the knowledge gap in this area.
BASIC COMPONENTS OF PARETO EFFICIENCY
Directory of Open Access Journals (Sweden)
Daniela POPESCU
2011-01-01
Full Text Available This study discusses the problem of substantiating decisions, which are particularly complex and topical, rest on a large volume of information, and require a considerable amount of work. From our investigations, we conclude that some of these inconveniences can be avoided by also using other concepts that apply to this kind of information. To this end, the study examines how decisions are grounded, paying special attention to the concept of Pareto efficiency, which ultimately rests on two fundamental elements used in decision making: final benefit and opportunity cost. We therefore present an extended analysis of the concept of Pareto efficiency, with the main emphasis on the quantitative evaluation of the elements that characterize it. The analysis of the decision-making process is then developed in greater depth, underlining the close link between different economic concepts and their great usefulness in practice.
Graphene-based spintronic components
Zeng, Minggang; Shen, Lei; Su, Haibin; Zhou, Miao; Zhang, Chun; Feng, Yuanping
2010-01-01
A major challenge of spintronics is in generating, controlling and detecting spin-polarized current. Manipulation of spin-polarized current, in particular, is difficult. We demonstrate here, based on calculated transport properties of graphene nanoribbons, that nearly ±100% spin-polarized current can be generated in zigzag graphene nanoribbons (ZGNRs) and tuned by a source-drain voltage in the bipolar spin diode, in addition to magnetic configurations of the electrodes. This unusual transpor...
Component Based Dynamic Reconfigurable Test System
Institute of Scientific and Technical Information of China (English)
LAI Hong; HE Lingsong; ZHANG Dengpan
2006-01-01
In this paper, a novel component-based framework for test systems is presented to meet the new requirements of dynamically changing test functions and reconfiguring test resources. The complexity of dynamic reconfiguration arises from the scale, redirection, extensibility and interconnection of components in the test system. The paper starts by discussing the component-assembly-based framework, which provides an open platform for deploying components; a script interpreter model is then introduced to dynamically create the components and build the test system by analyzing XML-based descriptions of the test system. A pipeline model is presented to provide data channels and behavior reflection among the components. Finally, a dynamically reconfigurable test system is implemented on the basis of COM and applied in a remote test and control system for CNC machines.
Study on vehicle routing problem based on heuristic ant colony optimization
Institute of Scientific and Technical Information of China (English)
刘晓勇; 付辉
2011-01-01
When the Ant Colony Optimization (ACO) algorithm is applied to the vehicle routing problem, it tends to converge slowly and to produce poor solutions. This paper applies ACO with a heuristic method to the capacitated vehicle routing problem (CVRP). The heuristic combines the distance matrix with the route-savings matrix to construct the initial pheromone matrix. Three benchmark data sets are chosen to verify the performance of the new algorithm. Experiments show that ACO with this heuristic information finds better solutions and spends less time than the basic ACO.
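The key idea, folding distance and route-savings information into the initial pheromone matrix, can be sketched as follows. The seeding formula below is an illustrative assumption, not the authors' exact one: it computes the Clarke-Wright savings s(i, j) = d(depot, i) + d(depot, j) - d(i, j) and gives edges with larger savings proportionally more starting pheromone.

```python
import math

def savings_pheromone(coords, depot=0, tau0=0.1, weight=1.0):
    """Seed an ACO pheromone matrix from Clarke-Wright route savings.

    coords: list of (x, y) customer/depot coordinates; index `depot` is the depot.
    Returns an n x n pheromone matrix where high-savings edges start above tau0.
    """
    n = len(coords)
    d = [[math.dist(coords[i], coords[j]) for j in range(n)] for i in range(n)]
    max_s = max(
        (d[depot][i] + d[depot][j] - d[i][j]
         for i in range(n) for j in range(n)
         if i != j and depot not in (i, j)),
        default=1.0,
    ) or 1.0  # guard against an all-zero savings matrix
    tau = [[tau0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i != j and depot not in (i, j):
                s = d[depot][i] + d[depot][j] - d[i][j]
                tau[i][j] = tau0 * (1.0 + weight * s / max_s)
    return tau
```

Ants constructing routes then preferentially try merges that the savings criterion already rates well, which is what speeds up early convergence.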
DEFF Research Database (Denmark)
Bloch, Søren; Christiansen, Christian Holk
versa. This, however, is often neglected in the existing literature. We solve the TSLAP simultaneously for the reserve area and the forward area. Based on randomly generated test instances we show that the solutions of TSLAP compare favorably to solutions found by other algorithms proposed...
Research on Heuristic Feature Extraction and Classification of EEG Signal Based on BCI Data Set
Directory of Open Access Journals (Sweden)
Lijuan Duan
2013-01-01
Full Text Available In this study, an EEG signal classification framework is proposed. The framework contains three feature extraction methods based on optimization strategies. First, we select optimal electrodes based on single-electrode classification performance and combine the data of all optimal electrodes as the feature. Then, we discuss the contribution of each time span of the EEG signal for each electrode and join the data of all optimal time spans for classification. In addition, we further select useful information from the original data using a genetic algorithm. Finally, the performance is evaluated with Bayes and SVM classifiers on BCI Competition 2003 data set Ia, where the genetic-algorithm-based method reaches an accuracy of 91.81%. The experimental results show that our methods offer better performance for reliable classification of the EEG signal.
Nafezi, Nima
2013-01-01
In this dissertation, we discuss a type of vehicle routing problem called the vehicle routing problem with intermediate facilities, with consideration of the impact of adding intermediate facilities (IFs) to the problem. To study how IFs change the result of the problem, we first present a simple model based on a clustering algorithm, together with finding the shortest route between clusters and implementing Clarke and Wright's algorithm within each cluster. Then we determine a set of design of experiments w...
A Heuristic Model for the Active Galactic Nucleus Based on the Planck Vacuum Theory
Directory of Open Access Journals (Sweden)
Daywitt W. C.
2009-07-01
Full Text Available The standard explanation for an active galactic nucleus (AGN is a "central engine" consisting of a hot accretion disk surrounding a supermassive black hole. Energy is generated by the gravitational infall of material which is heated to high temperatures in this dissipative accretion disk. What follows is an alternative model for the AGN based on the Planck vacuum (PV theory, where both the energy of the AGN and its variable luminosity are explained in terms of a variable photon flux emanating from the PV.
Directory of Open Access Journals (Sweden)
Stefania Costantini
2013-01-01
multi-issue negotiation between two autonomous competitive software agents proposed by Cadoli. This model is based on the view of negotiation spaces (or “areas”) representing the admissible values of the goods involved in the process as convex regions. However, in order to speed up the negotiation process and guarantee convergence, potential agreements were restricted to vertices included in the intersection of the two areas. We present and experimentally assess an extension of Cadoli's approach in which, for both participating agents, interaction is no longer vertex-based, or at least not necessarily so. This eliminates the asymmetry between the parties and the limitation to polyhedral negotiation areas. The extension can be usefully integrated into Cadoli's framework, yielding an enhanced algorithm that can be effective in many practical cases. We present and discuss a number of experiments aimed at assessing how parameters influence the performance of the algorithm and how they relate to each other. We discuss the usefulness of the approach in relevant application fields such as supply chain management in the fashion industry, a field of growing importance in economy and e-commerce.
Laser based refurbishment of steel mill components
CSIR Research Space (South Africa)
Kazadi, P
2006-03-01
Full Text Available Laser refurbishment capabilities were demonstrated and promising results were obtained for the repair of distance sleeves, foot rolls, descaler cassettes and idler rolls. Based on the cost projections and the results of the in-situ testing, components which...
Muders, D.; Boone, F.; Wyrowski, F.; Lightfoot, J.; Kosugi, G.; Wilson, C.; Davis, L.; Shepherd, D.
2007-10-01
The Atacama Large Millimeter Array / Atacama Compact Array (ALMA / ACA) Pipeline Heuristics system is being developed to automatically reduce data taken with the standard observing modes such as single fields, mosaics or on-the-fly maps. The goal is to make ALMA user-friendly to astronomers who are not experts in radio interferometry. The Pipeline Heuristics must capture the expert knowledge required to provide data products that can be used without further processing. The Pipeline Heuristics system is being developed as a set of Python scripts using as the data processing engines the Common Astronomy Software Applications (CASA[PY]) libraries and the ATNF Spectral Analysis Package (ASAP). The interferometry heuristics scripts currently provide an end-to-end process for the single field mode comprising flagging, initial calibration, re-flagging, re-calibration, and imaging of the target data. A Java browser provides user-friendly access to the heuristics results. The initial single-dish heuristics scripts implement automatic spectral line detection, baseline fitting and image gridding. The resulting data cubes are analyzed to detect source emission spectrally and spatially in order to calculate signal-to-noise ratios for comparison against the science goals specified by the observer.
Directory of Open Access Journals (Sweden)
Yingni Zhai
2014-10-01
Full Text Available Purpose: A decomposition heuristic based on multi-bottleneck machines for large-scale job shop scheduling problems (JSP) is proposed. Design/methodology/approach: In the algorithm, a number of sub-problems are constructed by iteratively decomposing the large-scale JSP according to the process route of each job; the solution of the large-scale JSP can then be obtained by iteratively solving the sub-problems. In order to improve the efficiency of solving the sub-problems and the solution quality, a detection method for multi-bottleneck machines based on the critical path is proposed, whereby the unscheduled operations can be divided into bottleneck operations and non-bottleneck operations. Following the principle in the Theory of Constraints (TOC) that the bottleneck leads the performance of the whole manufacturing system, the bottleneck operations are scheduled by a genetic algorithm for high solution quality, and the non-bottleneck operations are scheduled by dispatching rules to improve solving efficiency. Findings: In the construction of the sub-problems, some operations in the previously scheduled sub-problem are moved into the successive sub-problem for re-optimization; this strategy improves the solution quality of the algorithm. When solving the sub-problems, evaluating a chromosome's fitness by predicting the global scheduling objective value also improves the solution quality. Research limitations/implications: This research makes some assumptions that reduce the complexity of the large-scale scheduling problem: the processing route of each job is predetermined, the processing time of each operation is fixed, there are no machine breakdowns, and no preemption of operations is allowed. These assumptions should be reconsidered if the algorithm is used in an actual job shop. Originality/value: The research provides an efficient scheduling method for the
Outlier Mining Based on Principal Component Estimation
Institute of Scientific and Technical Information of China (English)
Hu Yang; Ting Yang
2005-01-01
Outlier mining is an important aspect of data mining, and outlier mining based on the Cook distance is the most commonly used approach. However, when the data exhibit multicollinearity, the traditional Cook method is no longer effective. Given the strengths of principal component estimation, we use it in place of least squares estimation and derive a Cook distance measure based on principal component estimation, which can be used in outlier mining. We also investigate the related theory and application problems.
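For reference, the classical least-squares Cook distance that the principal-component variant replaces can be computed directly from the OLS hat matrix. This is a sketch of the ordinary version only (the paper's principal-component estimator would replace the least-squares fit):

```python
import numpy as np

def cooks_distance(X, y):
    """Classical Cook's distance for OLS:
    D_i = r_i^2 * h_ii / (p * s^2 * (1 - h_ii)^2),
    where h_ii are the leverages on the diagonal of the hat matrix."""
    X = np.column_stack([np.ones(len(y)), X])   # add intercept column
    n, p = X.shape
    H = X @ np.linalg.inv(X.T @ X) @ X.T        # hat (projection) matrix
    h = np.diag(H)
    resid = y - H @ y
    s2 = resid @ resid / (n - p)                # residual variance estimate
    return resid**2 * h / (p * s2 * (1 - h) ** 2)
```

Planting a single outlier in otherwise linear data makes its Cook distance dominate, which is what the outlier-mining procedure exploits.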
Zhang, Huaguang; Song, Ruizhuo; Wei, Qinglai; Zhang, Tieyan
2011-12-01
In this paper, a novel heuristic dynamic programming (HDP) iteration algorithm is proposed to solve the optimal tracking control problem for a class of nonlinear discrete-time systems with time delays. The novel algorithm contains state updating, control policy iteration, and performance index iteration. To get the optimal states, the states are also updated. Furthermore, the "backward iteration" is applied to state updating. Two neural networks are used to approximate the performance index function and compute the optimal control policy for facilitating the implementation of HDP iteration algorithm. At last, we present two examples to demonstrate the effectiveness of the proposed HDP iteration algorithm.
Directory of Open Access Journals (Sweden)
Amir Abbas Najafi
2009-01-01
Full Text Available The resource investment problem with discounted cash flows (RIPDCF) is a class of project scheduling problem. In the RIPDCF, the availability levels of the resources are decision variables, and the goal is to find a schedule that optimizes the net present value of the project cash flows. In this paper, we consider a new RIPDCF in which tardiness of the project is permitted, subject to a defined penalty. We formulate the problem mathematically and develop a heuristic method to solve it. The results of the performance analysis show that the proposed method is an effective solution approach to the problem.
A refinement driven component-based design
DEFF Research Database (Denmark)
Chen, Zhenbang; Liu, Zhiming; Ravn, Anders Peter;
2007-01-01
to integrate sophisticated checkers, generators and transformations. A feasible approach to ensuring high quality of such add-ins is to base them on sound formal foundations. This paper summarizes our research on the Refinement of Component and Object Systems (rCOS) and illustrates it with experiences from...... the work on the Common Component Modelling Example (CoCoME). This gives evidence that the formal techniques developed in rCOS can be integrated into a model-driven development process and shows where it may be integrated in computer-aided software engineering (CASE) tools for adding formally supported...
Lightfoot, J.; Wyrowski, F.; Muders, D.; Boone, F.; Davis, L.; Shepherd, D.; Wilson, C.
2006-07-01
The ALMA (Atacama Large Millimeter Array) Pipeline Heuristics system is being developed to automatically reduce data taken with the standard observing modes. The goal is to make ALMA user-friendly to astronomers who are not experts in radio interferometry. The Pipeline Heuristics system must capture the expert knowledge required to provide data products that can be used without further processing. Observing modes to be processed by the system include single field interferometry, mosaics and single dish `on-the-fly' maps, and combinations of these modes. The data will be produced by the main ALMA array, the ALMA Compact Array (ACA) and single dish antennas. The Pipeline Heuristics system is being developed as a set of Python scripts. For interferometry these use as data processing engines the CASA/AIPS++ libraries and their bindings as CORBA objects within the ALMA Common Software (ACS). Initial development has used VLA and Plateau de Bure data sets to build and test a heuristic script capable of reducing single field data. In this paper we describe the reduction datapath and the algorithms used at each stage. Test results are presented. The path for future development is outlined.
A HYBRID HEURISTIC ALGORITHM FOR THE CLUSTERED TRAVELING SALESMAN PROBLEM
Directory of Open Access Journals (Sweden)
Mário Mestria
2016-04-01
Full Text Available This paper proposes a hybrid heuristic algorithm, based on the metaheuristics Greedy Randomized Adaptive Search Procedure (GRASP), Iterated Local Search and Variable Neighborhood Descent, to solve the Clustered Traveling Salesman Problem (CTSP). The hybrid heuristic algorithm uses several variable neighborhood structures, combining intensification (using local search operators) and diversification (a constructive heuristic and a perturbation routine). In the CTSP, the vertices are partitioned into clusters and all vertices of each cluster have to be visited contiguously. The CTSP is NP-hard, since it includes the well-known Traveling Salesman Problem (TSP) as a special case. Our hybrid heuristic is compared with three heuristics from the literature and an exact method. Computational experiments are reported for different classes of instances. The experimental results show that the proposed hybrid heuristic obtains competitive results within reasonable computational time.
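The overall GRASP skeleton, greedy randomized construction from a restricted candidate list followed by local search, can be sketched for a plain TSP tour. This is illustrative only: the paper's hybrid additionally uses Iterated Local Search, Variable Neighborhood Descent and the cluster-contiguity constraints, none of which appear here, and the `alpha` candidate-list parameter is an assumption.

```python
import random

def grasp_tsp(dist, iters=50, alpha=0.3, seed=1):
    """Minimal GRASP sketch: randomized greedy construction + 2-opt."""
    rng = random.Random(seed)
    n = len(dist)

    def tour_len(t):
        return sum(dist[t[i]][t[(i + 1) % n]] for i in range(n))

    def construct():
        tour, left = [0], set(range(1, n))
        while left:
            cand = sorted(left, key=lambda j: dist[tour[-1]][j])
            rcl = cand[:max(1, int(alpha * len(cand)))]  # restricted candidate list
            pick = rng.choice(rcl)                       # randomized greedy step
            tour.append(pick)
            left.remove(pick)
        return tour

    def two_opt(t):
        improved = True
        while improved:
            improved = False
            for i in range(1, n - 1):
                for j in range(i + 1, n):
                    new = t[:i] + t[i:j + 1][::-1] + t[j + 1:]  # reverse a segment
                    if tour_len(new) < tour_len(t):
                        t, improved = new, True
        return t

    best = None
    for _ in range(iters):
        t = two_opt(construct())
        if best is None or tour_len(t) < tour_len(best):
            best = t
    return best, tour_len(best)
```

Each GRASP iteration is independent, so the restart loop diversifies while 2-opt intensifies, the same division of labor the hybrid heuristic organizes with its richer neighborhood structures.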
Component-Based Cartoon Face Generation
Directory of Open Access Journals (Sweden)
Saman Sepehri Nejad
2016-11-01
Full Text Available In this paper, we present a cartoon face generation method that stands on a component-based facial feature extraction approach. Given a frontal face image as an input, our proposed system has the following stages. First, face features are extracted using an extended Active Shape Model. Outlines of the components are locally modified using edge detection, template matching and Hermite interpolation. This modification enhances the diversity of output and accuracy of the component matching required for cartoon generation. Second, to bring cartoon-specific features such as shadows, highlights and, especially, stylish drawing, an array of various face photographs and corresponding hand-drawn cartoon faces are collected. These cartoon templates are automatically decomposed into cartoon components using our proposed method for parameterizing cartoon samples, which is fast and simple. Then, using shape matching methods, the appropriate cartoon component is selected and deformed to fit the input face. Finally, a cartoon face is rendered in a vector format using the rendering rules of the selected template. Experimental results demonstrate the effectiveness of our approach in generating life-like cartoon faces.
A Heuristic Approach for International Crude Oil Transportation Scheduling Problems
Yin, Sisi; Nishi, Tatsushi; Izuno, Tsukasa
In this paper, we propose a heuristic algorithm to solve a practical ship scheduling problem for international crude oil transportation. The problem is modeled as a vehicle routing problem with split deliveries. The objective is to find an optimal assignment of tankers, a visiting sequence and loading volumes simultaneously, so as to minimize the total distance while satisfying the capacity of the tankers. A savings-based meta-heuristic algorithm with lot-sizing parameters and a volume assignment heuristic is developed. The proposed method is applied to a case study with real data. Computational results demonstrate the effectiveness of the heuristic algorithm compared with that of human operators.
Directory of Open Access Journals (Sweden)
Wei Tu
2015-10-01
Full Text Available Vehicle routing optimization (VRO designs the best routes to reduce travel cost, energy consumption, and carbon emission. Due to non-deterministic polynomial-time hard (NP-hard complexity, many VROs involved in real-world applications require too much computing effort. Shortening computing time for VRO is a great challenge for state-of-the-art spatial optimization algorithms. From a spatial-temporal perspective, this paper presents a spatial-temporal Voronoi diagram-based heuristic approach for large-scale vehicle routing problems with time windows (VRPTW. Considering time constraints, a spatial-temporal Voronoi distance is derived from the spatial-temporal Voronoi diagram to find near neighbors in the space-time searching context. A Voronoi distance decay strategy that integrates a time warp operation is proposed to accelerate local search procedures. A spatial-temporal feature-guided search is developed to improve unpromising micro route structures. Experiments on VRPTW benchmarks and real-world instances are conducted to verify performance. The results demonstrate that the proposed approach is competitive with state-of-the-art heuristics and achieves high-quality solutions for large-scale instances of VRPTWs in a short time. This novel approach will contribute to spatial decision support community by developing an effective vehicle routing optimization method for large transportation applications in both public and private sectors.
Deep Web result pattern extracting based on heuristic information
Institute of Scientific and Technical Information of China (English)
李明; 李秀兰
2011-01-01
Extracting schema information is a necessary step in Deep Web data research. To address the loss of Deep Web result schema information, this paper proposes a novel approach that extracts Deep Web result patterns based on heuristic information. By parsing the data in Deep Web result pages and adding correct attribute names to the data using heuristic information, the approach obtains the corresponding Deep Web result pattern. It then resolves structural inconsistencies between the result patterns of different data sources through a normalization step. Experimental results show that the method can effectively extract Deep Web result pattern information.
Intuition and Heuristics in Mathematics
Directory of Open Access Journals (Sweden)
Sultanova L. B.
2013-01-01
Full Text Available The article is devoted to the philosophy of mathematics. Its main subject is mathematical heuristics, a complex of methods for solving non-standard mathematical problems (problems for which no algorithm is known). As the specific mechanism of thought that generates the elements of guesswork underlying mathematical heuristics, the author considers intuition. The work draws on the findings of Descartes, Poincaré, Hadamard and Piaget. Based on Descartes's concept of rational intuition, the author develops a concept of heuristic intuition. The author then turns to the question of whether intuitively derived mathematical statements can be completely translated into discourse, that is, whether a mathematical proof can be maximally deepened and rationalized. For this purpose it is necessary to engage intuition again, since it is able to transform intuitive elements into discursive ones; from this point of view, the justification of an intuitively derived mathematical proof is a “multilayer” creative process. Drawing on Poincaré's research, the author argues that the essence of mathematical creativity is not to “sort out” and “choose”. Using illustrative examples, the author reveals moments where intuition “interferes” even in the solving of school problems. It is therefore currently impossible to ignore the phenomenon of intuition and the results that the theory of knowledge has historically derived in studying creative mechanisms.
Proposing New Heuristic Approaches for Preventive Maintenance Scheduling
Directory of Open Access Journals (Sweden)
majid Esmailian
2013-08-01
Full Text Available The purpose of preventive maintenance management is to perform the series of tasks that prevents or minimizes production breakdowns and improves the reliability of production facilities. An important objective of preventive maintenance management is to minimize the downtime of production facilities. In order to accomplish this objective, personnel should efficiently allocate resources and determine an effective maintenance schedule. Gopalakrishnan (1997) developed a mathematical model and four heuristic approaches to solve the preventive maintenance scheduling problem of assigning skilled personnel to tasks that require a set of corresponding skills. However, there are several limitations in the prior work in this area of research. The craft combination problem has not been solved because the craft combination is assumed to be given. The craft combination problem concerns the computation of all combinations of assigning multi-skilled workers to the accomplishment of a particular task. In fact, determining craft combinations is difficult because of the exponential number of possible craft combinations. This research provides a heuristic approach for determining the craft combination and four new heuristic approaches for the preventive maintenance scheduling problem with multi-skilled workforce constraints. In order to examine the new heuristic approaches and compare them with those of Gopalakrishnan (1997), 81 standard problems were generated based on the criteria suggested by Gopalakrishnan (1997). The average solution quality (SQ) of the new heuristic approaches is 1.86%, versus 8.32% for the old heuristic approaches. The new heuristic approaches are also faster: their average solution time is 0.78 seconds, against 6.43 seconds for the old heuristic approaches and 152 seconds for the mathematical model of Gopalakrishnan (1997).
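The exponential blow-up the abstract mentions is easy to see by brute-force enumeration of craft combinations. The worker names and skills below are hypothetical, and this is only one illustrative reading of the subproblem:

```python
from itertools import combinations

def craft_combinations(workers, skills_needed):
    """Enumerate 'craft combinations': subsets of multi-skilled workers
    that jointly cover a task's required skill set. The number of
    subsets grows exponentially with workforce size, which is why the
    paper resorts to a heuristic instead of exhaustive enumeration."""
    out = []
    names = sorted(workers)
    for r in range(1, len(names) + 1):
        for team in combinations(names, r):
            covered = set().union(*(workers[w] for w in team))
            if skills_needed <= covered:       # team covers all required skills
                out.append(team)
    return out

workers = {"w1": {"electrical"}, "w2": {"mechanical"},
           "w3": {"electrical", "mechanical"}}
teams = craft_combinations(workers, {"electrical", "mechanical"})
print(teams)   # all covering teams, from the single multi-skilled w3 upward
```

Even this three-worker example yields five feasible teams; with tens of workers the subset count is astronomically larger.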
Quantifying Heuristic Bias: Anchoring, Availability, and Representativeness.
Richie, Megan; Josephson, S Andrew
2017-07-28
Construct: Authors examined whether a new vignette-based instrument could isolate and quantify heuristic bias. Heuristics are cognitive shortcuts that may introduce bias and contribute to error. There is no standardized instrument available to quantify heuristic bias in clinical decision making, limiting future study of educational interventions designed to improve calibration of medical decisions. This study presents validity data to support a vignette-based instrument quantifying bias due to the anchoring, availability, and representativeness heuristics. Participants completed questionnaires requiring assignment of probabilities to potential outcomes of medical and nonmedical scenarios. The instrument randomly presented scenarios in one of two versions: Version A, encouraging heuristic bias, and Version B, worded neutrally. The primary outcome was the difference in probability judgments for Version A versus Version B scenario options. Of 167 participants recruited, 139 enrolled. Participants assigned significantly higher mean probability values to Version A scenario options (M = 9.56, SD = 3.75) than Version B (M = 8.98, SD = 3.76), t(1801) = 3.27, p = .001. This result remained significant analyzing medical scenarios alone (Version A, M = 9.41, SD = 3.92; Version B, M = 8.86, SD = 4.09), t(1204) = 2.36, p = .02. Analyzing medical scenarios by heuristic revealed a significant difference between Version A and B for availability (Version A, M = 6.52, SD = 3.32; Version B, M = 5.52, SD = 3.05), t(404) = 3.04, p = .003, and representativeness (Version A, M = 11.45, SD = 3.12; Version B, M = 10.67, SD = 3.71), t(396) = 2.28, p = .02, but not anchoring. Stratifying by training level, students maintained a significant difference between Version A and B medical scenarios (Version A, M = 9.83, SD = 3.75; Version B, M = 9.00, SD = 3.98), t(465) = 2.29, p = .02, but not residents or attendings. Stratifying by heuristic and training level, availability maintained
Energy Technology Data Exchange (ETDEWEB)
Cruz Castrejon, J. A; Islas Perez, E; Espinosa Reza, A; Garcia Mendoza, R [Instituto de Investigaciones Electricas, Cuernavaca, Morelos (Mexico)]. E-mails: adrian.cruz@iie.org.mx; eislas@iie.org.mx; aer@iie.org.mx; rgarcia@iie.org.mx
2013-03-15
In this paper we present a proposed solution to the problem of searching for restoration alternatives after faults in radial distribution networks of electric power systems. The solution uses a deterministic method based on the definition of heuristics, whose main objectives are to improve execution time and solution quality. The search is based on the alternating repetition of two stages: one stage that attempts to restore the disconnected areas, and another that attempts to shed load from the overloaded areas.
Data collection for the Sloan Digital Sky Survey - A network-flow heuristic
Energy Technology Data Exchange (ETDEWEB)
Lupton, R.; Miller Maley, F. [Princeton Univ., NJ (United States); Young, N. [Dartmouth College, Hanover, NH (United States)
1996-12-31
This note describes a combinatorial optimization problem arising in the Sloan Digital Sky Survey and an effective heuristic for the problem that has been implemented and will be used in the Survey. The heuristic is based on network flow theory.
New Hoopoe Heuristic Optimization
El-Dosuky, Mohammed; EL-Bassiouny, Ahmed; Hamza, Taher; Rashad, Magdy
2012-01-01
Most optimization problems in real life applications are often highly nonlinear. Local optimization algorithms do not give the desired performance. So, only global optimization algorithms should be used to obtain optimal solutions. This paper introduces a new nature-inspired metaheuristic optimization algorithm, called Hoopoe Heuristic (HH). In this paper, we will study HH and validate it against some test functions. Investigations show that it is very promising and could be seen as an optimi...
Heuristic of radiodiagnostic systems
Energy Technology Data Exchange (ETDEWEB)
Wackenheim, A.
1986-12-01
In the practice of creating expert systems, the radiologist and his team are considered as the expert who guides the work of the cognitician. Different kinds of expert systems can be imagined. The author describes the main characteristics of heuristics in redefining semiology, semantics and the rules of picture reading. Finally, it is the experience of the expert-cognitician pair which will, in the future, guarantee the success of expert systems in radiology.
A Heuristic Hierarchical Scheme for Academic Search and Retrieval
DEFF Research Database (Denmark)
Amolochitis, Emmanouil; Christou, Ioannis T.; Tan, Zheng-Hua
2013-01-01
We present PubSearch, a hybrid heuristic scheme for re-ranking academic papers retrieved from standard digital libraries such as the ACM Portal. The scheme is based on the hierarchical combination of a custom implementation of the term frequency heuristic, a time-depreciated citation score...
Industrialisation of flyash based building components
Energy Technology Data Exchange (ETDEWEB)
Rajkumar, C.; Lal, R. [National Council for Cement and Building Materials (India)
1996-12-31
There is an acute shortage of housing in India and the prevailing backlog of housing is increasing every year, as the rate of construction has not kept pace with population growth. One way of partially meeting the increasing demand for building materials is to make use of non-conventional materials and technologies based on the use of industrial by-products. Studies conducted have shown that manufacture of building materials or components, particularly bricks and blocks, is the most promising direction of fly ash utilization. 3 figs., 3 tabs.
Shahar, Golan; Davidson, Larry
2009-01-01
We propose Participation-Engagement (PAR-EN) as a philosophically based heuristic for prioritizing interventions in comorbid, complex, and chronic psychiatric conditions. Drawing from 1) the sociologist Talcott Parsons, 2) the continental-philosophical tradition, and 3) our own previous work (Davidson & Shahar, 2009; Shahar, 2004, 2006), we argue that participation in personally meaningful life goals represents a hallmark of mental health. Symptoms and vulnerabilities that impede such participation should therefore be targeted vigorously, whereas others which do not pose such imminent threats should assume a secondary focus, if at all. Winnicott's (1987) notion of the spontaneous gesture, the importance of daily activities as reflecting patients' participation, and the dialectics of interpersonal relatedness and self-definition, are introduced as guidelines for implementing PAR-EN. Implications for clinical assessment and the therapeutic relationship are discussed.
A systematic approach for component-based software development
Guareis de farias, Cléver; van Sinderen, Marten J.; Ferreira Pires, Luis
2000-01-01
Component-based software development enables the construction of software artefacts by assembling prefabricated, configurable and independently evolving building blocks, called software components. This paper presents an approach for the development of component-based software artefacts. This
Heuristic approach to image registration
Gertner, Izidor; Maslov, Igor V.
2000-08-01
Image registration, i.e. the correct mapping of images obtained from different sensor readings onto a common reference frame, is a critical part of multi-sensor ATR/AOR systems based on readings from different types of sensors. In order to fuse two different sensor readings of the same object, the readings have to be put into a common coordinate system. This task can be formulated as an optimization problem in the space of all possible affine transformations of an image. In this paper, a combination of heuristic methods is explored to register gray-scale images. A modification of the Genetic Algorithm is used as the first step in the global search for the optimal transformation. It covers the entire search space with (randomly or heuristically) scattered probe points and helps significantly reduce the search space to a subspace of potentially most successful transformations. Due to its discrete character, however, the Genetic Algorithm in general cannot converge once it comes close to the optimum. Its termination point can be specified either as some predefined number of generations or as achievement of a certain acceptable convergence level. To refine the search, potential optimal subspaces are searched using the Tabu Search and Simulated Annealing methods, which are more delicate and efficient for local search.
Motor heuristics and embodied choices: how to choose and act.
Raab, Markus
2017-08-01
Human performance requires choosing what to do and how to do it. The goal of this theoretical contribution is to advance understanding of how the motor and cognitive components of choices are intertwined. From a holistic perspective, I extend simple heuristics that have been tested in cognitive tasks to motor tasks, coining the term motor heuristics. Similarly, I extend the concept of embodied cognition, which has been tested in simple sensorimotor processes that change decisions, to complex sport behavior, coining the term embodied choices. Together, motor heuristics and embodied choices explain complex behavior such as that studied in sport and exercise psychology. Copyright © 2017 Elsevier Ltd. All rights reserved.
A Tutorial on Heuristic Methods
DEFF Research Database (Denmark)
Vidal, Rene Victor Valqui; Werra, D. de; Silver, E.
1980-01-01
In this paper we define a heuristic method as a procedure for solving a well-defined mathematical problem by an intuitive approach in which the structure of the problem can be interpreted and exploited intelligently to obtain a reasonable solution. Issues discussed include: (i) the measurement of the quality of a heuristic method, (ii) different types of heuristic procedures, (iii) the interactive role of human beings and (iv) factors that may influence the choice or testing of heuristic methods. A large number of references are included.
Electric Vehicle based on standard industrial components
Fernández Ramos, José; Aghili Kathir, Foroohar
2013-01-01
The aim of this paper is to presents the complete design of an electric vehicle by using standard industrial components as VRLA batteries, AC induction motors and standard frequency converters. In comparison with dedicated components, the use of standard components has the following advantages: higher reliability, low price, broad range of products and suppliers, and high availability and technological independence. Besides this, we show that these components allow to ...
Semantic network based component organization model for program mining
Institute of Scientific and Technical Information of China (English)
王斌; 张尧学; 陈松乔
2003-01-01
Based on the definition of a component ontology, an effective component classification mechanism and a facet named component relationship are proposed, and an application-domain-oriented, hierarchical component organization model is established. A hierarchical component semantic network (HCSN), described by the Ontology Interchange Language (OIL), is then presented and its function described. Using the HCSN in cooperation with other component-retrieval algorithms based on component descriptions, information on other components and their assembly or composition modes related to a key component can be found. Based on the HCSN, a component directory library is catalogued and a prototype system is constructed. The prototype system demonstrates that a component library organized on this model guarantees the reliability of component assembly during program mining.
Reexamining Our Bias against Heuristics
McLaughlin, Kevin; Eva, Kevin W.; Norman, Geoff R.
2014-01-01
Using heuristics offers several cognitive advantages, such as increased speed and reduced effort when making decisions, in addition to allowing us to make decisions in situations where missing data do not allow for formal reasoning. But the traditional view of heuristics is that they trade accuracy for efficiency. Here the authors discuss sources…
Heuristics, biases and traps in managerial decision making
Directory of Open Access Journals (Sweden)
Peter Gál
2013-01-01
Full Text Available The aim of the paper is to demonstrate the impact of heuristics, biases and psychological traps on the decision making. Heuristics are unconscious routines people use to cope with the complexity inherent in most decision situations. They serve as mental shortcuts that help people to simplify and structure the information encountered in the world. These heuristics could be quite useful in some situations, while in others they can lead to severe and systematic errors, based on significant deviations from the fundamental principles of statistics, probability and sound judgment. This paper focuses on illustrating the existence of the anchoring, availability, and representativeness heuristics, originally described by Tversky & Kahneman in the early 1970’s. The anchoring heuristic is a tendency to focus on the initial information, estimate or perception (even random or irrelevant number as a starting point. People tend to give disproportionate weight to the initial information they receive. The availability heuristic explains why highly imaginable or vivid information have a disproportionate effect on people’s decisions. The representativeness heuristic causes that people rely on highly specific scenarios, ignore base rates, draw conclusions based on small samples and neglect scope. Mentioned phenomena are illustrated and supported by evidence based on the statistical analysis of the results of a questionnaire.
Lifestyles Based on Health Components in Iran
Directory of Open Access Journals (Sweden)
Babaei
2016-07-01
Full Text Available Context Lifestyle is a way of living employed by people, groups and nations, and is formed in specific geographical, economic, political, cultural and religious contexts. Health depends on lifestyle, so preserving and promoting health requires improving lifestyle. Objectives The present study aimed to investigate lifestyle based on health-oriented components in Iran. Data Sources The research was conducted through e-banks including the Scientific Information Database (SID), the Iran medical science databank (IranMedex), the Iran journal databank (Magiran) and other databases such as Elsevier, PubMed and the Google Scholar meta search engine, covering the period from 2000 to 2014. Official Iranian statistics and information were also used. The search terms included lifestyle, health, health-promoting behaviors, health-oriented lifestyle and lifestyle in Iran. Study Selection The primary search returned many papers, of which 157 (120 in Farsi and 37 in English) were selected. Data Extraction After careful study of these papers and exclusion of the unqualified ones, 19 papers with thorough information and higher relevance to the research purpose were selected. Results After examining articles based on the selected keywords and search strategies, 215 articles (134 in Farsi and 81 in English) were obtained. Publications on the components of lifestyle and health have increased in recent years: 8 (42%) and 11 (58%) articles were published during 2005 - 2010 and 2011 - 2014, respectively. Among them there were 3 (16%) literature reviews, 8 (42%) descriptive-analytic, 2 (10.5%) qualitative, 2 (10.5%) analytic and no purely descriptive articles. Conclusions Given the positive effect of a healthy lifestyle on health promotion, it would be better for the government to provide comprehensive programs and policies in the society to enhance awareness of people about positive effects of a health-oriented lifestyle on life and
Heuristic Chinese Sentence Compression Algorithm Based on Hot Words
Institute of Scientific and Technical Information of China (English)
韩静; 张东站
2014-01-01
Since the parallel sentence/compression corpora on which most traditional methods rely are not easy to obtain, a linguistically motivated heuristic Chinese sentence compression algorithm is proposed after a study of traditional methods. By analyzing human-produced compressions and linguistic knowledge, two sets of rules are proposed: one at the word level and the other at the clause level. The two rule sets, based on the parse tree and inter-word dependencies, are used to compress sentences, and the algorithm is enhanced with word "hotness" (topical popularity) to preserve its flexibility and accuracy. In a final step the compression result is cleaned and grammatically repaired. Human-produced compression, the rule-only algorithm and the hot-word-enhanced algorithm are compared, and the results are evaluated on compression rate, grammaticality, informativeness and hotness. The experimental results show that the heuristic Chinese sentence compression algorithm based on hot words can improve the hotness of compression results with little loss in compression rate, grammaticality and informativeness.
Heuristics in Composition and Literary Criticism.
McCarthy, B. Eugene
1978-01-01
Describes the "particle, wave, field" heuristic for gathering information, and shows how students can apply that heuristic in analyzing literature and in using procedures of historical criticism. (RL)
BCI Control of Heuristic Search Algorithms
Cavazza, Marc; Aranyi, Gabor; Charles, Fred
2017-01-01
The ability to develop Brain-Computer Interfaces (BCI) to Intelligent Systems would offer new perspectives in terms of human supervision of complex Artificial Intelligence (AI) systems, as well as supporting new types of applications. In this article, we introduce a basic mechanism for the control of heuristic search through fNIRS-based BCI. The rationale is that heuristic search is not only a basic AI mechanism but also one still at the heart of many different AI systems. We investigate how users' mental disposition can be harnessed to influence the performance of a heuristic search algorithm through a mechanism of precision-complexity exchange. From a system perspective, we use weighted variants of the A* algorithm, which have the ability to provide faster, albeit suboptimal, solutions. We use recent results in affective BCI to capture a BCI signal that is indicative of a compatible mental disposition in the user. It has been established that Prefrontal Cortex (PFC) asymmetry is strongly correlated to motivational dispositions and results anticipation, such as approach or even risk-taking, and that this asymmetry is amenable to Neurofeedback (NF) control. Since PFC asymmetry is accessible through fNIRS, we designed a BCI paradigm in which users vary their PFC asymmetry through NF during heuristic search tasks, resulting in faster solutions. This is achieved by mapping the PFC asymmetry value onto the dynamic weighting parameter of the weighted A* (WA*) algorithm. We illustrate this approach through two different experiments, one based on solving 8-puzzle configurations, and the other on path planning. In both experiments, subjects were able to speed up the computation of a solution through a reduction of the search space in WA*. Our results establish the ability of subjects to intervene in heuristic search progression, with effects which are commensurate to their control of PFC asymmetry: this opens the way to new mechanisms for the implementation of hybrid
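The precision-complexity exchange rests on weighted A*, where inflating the heuristic weight w speeds the search at the cost of optimality. A minimal sketch follows; the grid domain and parameters are illustrative, and in the paper w would be driven by the fNIRS signal rather than fixed:

```python
import heapq

def weighted_astar(start, goal, neighbors, h, w=1.5):
    """Weighted A*: f = g + w*h. With w > 1 the search expands fewer
    nodes but may return a suboptimal path (cost at most w times
    optimal), mirroring the paper's precision-complexity exchange."""
    open_heap = [(w * h(start, goal), 0, start, [start])]
    best_g = {start: 0}
    while open_heap:
        f, g, node, path = heapq.heappop(open_heap)
        if node == goal:
            return path
        for nxt, cost in neighbors(node):
            ng = g + cost
            if ng < best_g.get(nxt, float("inf")):
                best_g[nxt] = ng
                heapq.heappush(open_heap,
                               (ng + w * h(nxt, goal), ng, nxt, path + [nxt]))
    return None

# demo: 4-connected 5x5 grid with a Manhattan-distance heuristic
def grid_neighbors(p):
    x, y = p
    return [((x + dx, y + dy), 1) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
            if 0 <= x + dx < 5 and 0 <= y + dy < 5]

manhattan = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])
path = weighted_astar((0, 0), (4, 4), grid_neighbors, manhattan, w=2.0)
print(len(path) - 1)   # path cost; 8 is optimal here even with w = 2
```

In the paper's setup, the analogue of `w` is updated dynamically from the user's PFC asymmetry during neurofeedback.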
Component-Based Software Reuse on the World Wide Web
Institute of Scientific and Technical Information of China (English)
[No author listed]
2000-01-01
Component-based software reuse (CBSR) has been widely used in software development practice and has an even more brilliant future with the rapid extension of the Internet, because the World Wide Web (WWW) makes large-scale component resources from different vendors available to software developers. In this paper, an abstract component model suitable for representing components on the WWW is proposed, which plays important roles both in achieving interoperability among components and among reusable component libraries (RCLs). Some necessary changes that the WWW brings to many aspects of component management are also discussed, such as the classification of components and the corresponding searching methods, and the certification of components.
The Component-Based Application for GAMESS
Energy Technology Data Exchange (ETDEWEB)
Peng, Fang [Iowa State Univ., Ames, IA (United States)
2007-01-01
GAMESS, a quantum chemistry program for electronic structure calculations, has been freely shared by high-performance application scientists for over twenty years. It provides a rich set of functionalities and can be run on a variety of parallel platforms through a distributed data interface. While a chemistry computation is sophisticated and hard to develop, resource sharing among different chemistry packages will accelerate the development of new computations and encourage the cooperation of scientists from universities and laboratories. The Common Component Architecture (CCA) offers an environment that allows scientific packages to dynamically interact with each other through components, which enables the dynamic coupling of GAMESS with other chemistry packages, such as MPQC and NWChem. Conceptually, a computation can be constructed with "plug-and-play" components from scientific packages, and this requires more than componentizing the functions/subroutines of interest, especially for large-scale scientific packages with a long development history. In this research, we present our efforts to construct components for GAMESS that conform to the CCA specification. The goal is to enable fine-grained interoperability between three quantum chemistry programs, GAMESS, MPQC and NWChem, via components. We focus on one of the three packages, GAMESS; we delineate the structure of GAMESS computations, followed by our approaches to its component development. Then we use GAMESS as the driver to interoperate integral components from the other two packages, and show the solutions for interoperability problems along with preliminary results. To justify the versatility of the design, the Tuning and Analysis Utility (TAU) components have been coupled with GAMESS and its components, so that the performance of GAMESS and its components may be analyzed for a wide range of system parameters.
Directory of Open Access Journals (Sweden)
B. Thamaraikannan
2014-01-01
Full Text Available This paper studies in detail the background and implementation of a teaching-learning-based optimization (TLBO) algorithm with a differential operator for the optimization of a few mechanical components that are essential to most mechanical engineering applications. Like most other heuristic techniques, TLBO is a population-based method and uses a population of solutions to proceed to the global solution. A differential operator is incorporated into TLBO for a more effective search for better solutions. To validate the effectiveness of the proposed method, three typical optimization problems are considered: first, minimizing the weight of a belt-pulley drive; second, minimizing the volume of a closed coil helical spring; and finally, minimizing the weight of a hollow shaft. Simulation results on these mechanical-component optimization problems reveal the ability of the proposed methodology to find better optimal solutions than other optimization algorithms.
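The underlying TLBO scheme can be sketched as the basic teacher/learner iteration on a one-dimensional toy objective; the paper's differential operator and the real mechanical objectives are not reproduced here:

```python
import random

def tlbo_iteration(pop, f, lo, hi):
    """One iteration of basic Teaching-Learning-Based Optimization
    (minimization). The paper's variant layers a differential operator
    on top of this scheme; that refinement is omitted."""
    clamp = lambda v: max(lo, min(hi, v))
    n = len(pop)
    mean = sum(pop) / n
    teacher = min(pop, key=f)                  # best learner acts as teacher
    new_pop = []
    for x in pop:
        # teacher phase: shift toward the teacher, away from the class mean
        tf = random.randint(1, 2)              # teaching factor, 1 or 2
        cand = clamp(x + random.random() * (teacher - tf * mean))
        if f(cand) < f(x):                     # greedy acceptance
            x = cand
        # learner phase: move toward a better classmate, away from a worse one
        other = pop[random.randrange(n)]
        direction = (other - x) if f(other) < f(x) else (x - other)
        cand = clamp(x + random.random() * direction)
        new_pop.append(cand if f(cand) < f(x) else x)
    return new_pop

random.seed(7)
f = lambda x: (x - 3.0) ** 2                   # toy stand-in for, e.g., shaft weight
pop = [random.uniform(-10.0, 10.0) for _ in range(20)]
for _ in range(100):
    pop = tlbo_iteration(pop, f, -10.0, 10.0)
print(round(min(pop, key=f), 3))               # converges near the optimum x = 3
```

The mechanical problems in the paper replace the toy objective with weight/volume formulas over several design variables, but the iteration structure is the same.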
The development of component-based information systems
Cesare, Sergio de; Macredie, Robert
2015-01-01
This work provides a comprehensive overview of research and practical issues relating to component-based development information systems (CBIS). Spanning the organizational, developmental, and technical aspects of the subject, the original research included here provides fresh insights into successful CBIS technology and application. Part I covers component-based development methodologies and system architectures. Part II analyzes different aspects of managing component-based development. Part III investigates component-based development versus commercial off-the-shelf products (COTS), includi
Institute of Scientific and Technical Information of China (English)
李颖浩; 郭瑞鹏
2012-01-01
Unit commitment (UC) in power systems is a high-dimensional, nonlinear, mixed-integer engineering optimization problem. To solve it, a heuristic algorithm based on generalized Benders decomposition is proposed. On the one hand, the algorithm fully exploits the characteristics of the load curve in the study period to decouple the problem and reduce its scale; on the other hand, it uses the effectiveness of the Benders algorithm in solving mixed-integer programming problems to improve solution efficiency. Results of a calculation example show that the proposed algorithm is efficient, produces stable results, and has good practical value.
Algorithm of Topic-oriented Crawling Based on Heuristic Search
Institute of Scientific and Technical Information of China (English)
刘欣宇; 唐学文; 邓一贵
2012-01-01
To overcome the drawbacks of traditional topic crawlers in crawling speed and topic-prediction precision, and to improve the precision and recall of general search-engine results, a new topic-oriented crawling algorithm is put forward based on the features of current topic crawling strategies. By introducing a page radiation space, the method combines link-analysis-based and content-analysis-based topic strategies, with a heuristic algorithm embedded. The experimental results show that this algorithm crawls more efficiently than the usual algorithms.
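A best-first frontier ordered by a combined content/link score captures the flavor of such a crawler. The 0.8/0.2 scoring mix and the toy site below are invented for illustration; the paper's page-radiation-space computation is not reproduced:

```python
import heapq

def topical_crawl(graph, pages, seed, relevance, budget=5):
    """Best-first topical crawler sketch: a frontier priority queue whose
    heuristic score mixes the parent page's content relevance with a
    fixed link-based prior (a stand-in for combining content analysis
    and link analysis)."""
    frontier = [(-1.0, seed)]                  # negate scores: heapq is a min-heap
    visited, harvest = set(), []
    while frontier and len(harvest) < budget:
        _, url = heapq.heappop(frontier)
        if url in visited:
            continue
        visited.add(url)
        rel = relevance(pages[url])
        if rel > 0.5:                          # on-topic page harvested
            harvest.append(url)
        for link in graph.get(url, []):
            if link not in visited:
                # child inherits parent relevance, damped by a link prior
                heapq.heappush(frontier, (-(0.8 * rel + 0.2), link))
    return harvest

# toy site; relevance = fraction of topic words present in the page text
graph = {"A": ["B", "C"], "B": ["D"], "C": ["E"], "D": [], "E": []}
pages = {"A": "grid search heuristic", "B": "heuristic crawler topic",
         "C": "cooking recipes", "D": "topic heuristic search crawler",
         "E": "gardening"}
topic = {"heuristic", "crawler", "topic", "search"}
rel = lambda text: len(topic & set(text.split())) / len(topic)
print(topical_crawl(graph, pages, "A", rel))   # on-topic branch found first
```

The on-topic branch A→B→D is exhausted before the off-topic branch through C, which is the behavior a topical crawler's heuristic ordering is meant to produce.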
Component-based Discrete Event Simulation Using the Fractal Component Model
Dalle, Olivier
2007-01-01
In this paper we show that Fractal, a generic component model coming from the Component-Based Software Engineering (CBSE) community, meets most of the functional expectations identified so far in the simulation community for component-based modeling and simulation. We also demonstrate that Fractal offers additional features that have not yet been identified in the simulation community despite their potential usefulness. Eventually we describe our ongoing work on such a new simulation architec...
Heuristic Methods for Security Protocols
Directory of Open Access Journals (Sweden)
Qurat ul Ain Nizamani
2009-10-01
Full Text Available Model checking is an automatic verification technique to verify hardware and software systems. However it suffers from state-space explosion problem. In this paper we address this problem in the context of cryptographic protocols by proposing a security property-dependent heuristic. The heuristic weights the state space by exploiting the security formulae; the weights may then be used to explore the state space when searching for attacks.
Study of engine noise based on independent component analysis
Institute of Scientific and Technical Information of China (English)
HAO Zhi-yong; JIN Yan; YANG Chen
2007-01-01
Independent component analysis was applied to analyze acoustic signals from a diesel engine. First, the basic principle of independent component analysis (ICA) is reviewed. The diesel engine acoustic signal was decomposed into several independent components (ICs); the Fourier transform and the continuous wavelet transform (CWT) were then applied to analyze the independent components. Different noise sources of the diesel engine were separated based on the characteristics of the different components in the time-frequency domain.
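The separation idea can be illustrated with a deliberately tiny, stdlib-only sketch: two synthetic signals are mixed by a rotation and recovered by scanning for the most non-Gaussian projection. This is only the one-unit intuition behind ICA contrast functions, not the full multi-channel procedure used on engine noise:

```python
import math

def excess_kurtosis(xs):
    """Sample excess kurtosis; far from 0 means far from Gaussian."""
    n = len(xs)
    m = sum(xs) / n
    var = sum((x - m) ** 2 for x in xs) / n
    return sum((x - m) ** 4 for x in xs) / (n * var ** 2) - 3.0

def most_nongaussian_angle(x1, x2):
    """One-unit ICA sketch: scan projection angles of two mixed channels
    and return the direction maximizing |excess kurtosis|, i.e. maximum
    non-Gaussianity (the contrast behind FastICA-style separation).
    Real engine-noise work would whiten multi-channel data first."""
    best_k, best_t = -1.0, 0
    for t in range(180):
        c, s = math.cos(math.radians(t)), math.sin(math.radians(t))
        k = abs(excess_kurtosis([c * a + s * b for a, b in zip(x1, x2)]))
        if k > best_k:
            best_k, best_t = k, t
    return best_t

# two synthetic "sources": a sine (sub-Gaussian) and sparse impulses
# (super-Gaussian, a crude stand-in for combustion knocks)
s1 = [math.sin(0.1 * i) for i in range(2000)]
s2 = [((-1) ** i) * (3.0 if i % 25 == 0 else 0.1) for i in range(2000)]
c, s = math.cos(math.radians(30)), math.sin(math.radians(30))   # 30° mixing
x1 = [c * a - s * b for a, b in zip(s1, s2)]    # mixed channel 1
x2 = [s * a + c * b for a, b in zip(s1, s2)]    # mixed channel 2
print(most_nongaussian_angle(x1, x2))           # near 120°, recovering s2
```

Projecting the mixtures at 120° undoes the 30° mixing for the impulsive source; the paper's time-frequency analysis (FFT, CWT) is then run on each recovered component.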
Heuristic Algorithm for Attribute Reduction Based on Dominance Relation
Institute of Scientific and Technical Information of China (English)
廖帆; 膝书华; 邵世雷
2011-01-01
A new uncertainty measure, the dominance degree, is proposed for ordered objective information systems based on the dominance principle; it has an explicit theoretical meaning in rough set theory and can be used to measure the inconsistency of an objective information system. The granulation monotonicity of the dominance degree is proved, and on this basis a new measure of attribute importance is designed. A heuristic reduction algorithm for objective information systems based on the dominance relation is then provided. An example illustrates the validity of the algorithm, and the results show that it has the same theoretical foundation as the classical reduction algorithm in rough set theory and is easy to understand. The algorithm provides an important theoretical basis for knowledge discovery in ordered objective information systems.
AN EVEN COMPONENT BASED FACE RECOGNITION METHOD
Institute of Scientific and Technical Information of China (English)
Anonymous
2005-01-01
This paper presents a novel face recognition algorithm. To provide additional variations to the training data set, even-odd decomposition is adopted, and only the even components (half-even face images) are used for further processing. To tackle the shift-variance problem, the Fourier transform is applied to the half-even face images. To reduce the dimension of an image, PCA (Principal Component Analysis) features are extracted from the amplitude spectrum of the half-even face images. Finally, a nearest neighbor classifier is employed for the task of classification. Experimental results on the ORL database show that the proposed method outperforms, in terms of accuracy, the conventional eigenface method which applies PCA to the original images, as well as the eigenface method which uses both the original images and their mirror images as the training set.
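The even-odd decomposition used above is simple to state: mirroring a face image about its vertical axis and averaging gives the even (symmetric) component, while half the difference gives the odd component. A minimal sketch, with plain Python lists standing in for grayscale images (the function name is illustrative, not from the paper):

```python
def even_odd_decompose(img):
    """Split an image into even and odd components with respect to a
    horizontal mirror.  img: 2-D list (rows of pixel values)."""
    mirror = [row[::-1] for row in img]  # flip about the vertical axis
    even = [[(a + b) / 2 for a, b in zip(r, m)] for r, m in zip(img, mirror)]
    odd = [[(a - b) / 2 for a, b in zip(r, m)] for r, m in zip(img, mirror)]
    return even, odd
```

By construction the even part is mirror-symmetric and the two parts sum back to the original image, which is why discarding the odd part still leaves a usable, pose-robust representation.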
Keshavarz, Behrang; Campos, Jennifer L; DeLucia, Patricia R; Oberfeld, Daniel
2017-04-01
Estimating time to contact (TTC) involves multiple sensory systems, including vision and audition. Previous findings suggested that the ratio of an object's instantaneous optical size/sound intensity to its instantaneous rate of change in optical size/sound intensity (τ) drives TTC judgments. Other evidence has shown that heuristic-based cues are used, including final optical size or final sound pressure level. Most previous studies have used decontextualized and unfamiliar stimuli (e.g., geometric shapes on a blank background). Here we evaluated TTC estimates by using a traffic scene with an approaching vehicle to evaluate the weights of visual and auditory TTC cues under more realistic conditions. Younger (18-39 years) and older (65+ years) participants made TTC estimates in three sensory conditions: visual-only, auditory-only, and audio-visual. Stimuli were presented within an immersive virtual-reality environment, and cue weights were calculated for both visual cues (e.g., visual τ, final optical size) and auditory cues (e.g., auditory τ, final sound pressure level). The results demonstrated the use of visual τ as well as heuristic cues in the visual-only condition. TTC estimates in the auditory-only condition, however, were primarily based on an auditory heuristic cue (final sound pressure level), rather than on auditory τ. In the audio-visual condition, the visual cues dominated overall, with the highest weight being assigned to visual τ by younger adults, and a more equal weighting of visual τ and heuristic cues in older adults. Overall, better characterizing the effects of combined sensory inputs, stimulus characteristics, and age on the cues used to estimate TTC will provide important insights into how these factors may affect everyday behavior.
Heuristic space diversity control for improved meta-hyper-heuristic performance
CSIR Research Space (South Africa)
Grobler, J
2015-04-01
Full Text Available This paper expands on the concept of heuristic space diversity and investigates various strategies for the management of heuristic space diversity within the context of a meta-hyper-heuristic algorithm in search of greater performance benefits...
Heuristic space diversity management in a meta-hyper-heuristic framework
CSIR Research Space (South Africa)
Grobler, J
2014-07-01
Full Text Available This paper introduces the concept of heuristic space diversity and investigates various strategies for the management of heuristic space diversity within the context of a meta-hyper-heuristic algorithm. Evaluation on a diverse set of floating...
Face Recognition Based on Principal Component Analysis
Directory of Open Access Journals (Sweden)
Ali Javed
2013-02-01
Full Text Available The purpose of the proposed research work is to develop a computer system that can recognize a person by comparing the characteristics of the face to those of known individuals. The main focus is on frontal two-dimensional images taken in a controlled environment, i.e. with constant illumination and background. Other methods of identification and verification, such as iris or fingerprint scans, require high-quality and costly equipment, whereas face recognition requires only a normal camera giving a 2-D frontal image of the person to be recognized. The Principal Component Analysis technique has been used in the proposed face recognition system. The purpose is to compare the results of the technique under different conditions and to find the most efficient approach for developing a facial recognition system.
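The eigenface pipeline this abstract refers to, mean-centering, projection onto the principal axes, and nearest-neighbour matching, can be sketched as follows. This is a generic PCA recognizer, not the authors' exact system; the function names and the synthetic test data are illustrative:

```python
import numpy as np

def pca_fit(X, k):
    """X: (n_samples, n_pixels) flattened images.
    Returns the mean image and the top-k principal axes."""
    mu = X.mean(axis=0)
    # SVD of the centered data: rows of Vt are the principal axes
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]

def pca_project(X, mu, comps):
    """Project (centered) images onto the principal axes."""
    return (X - mu) @ comps.T

def nearest_neighbor(train_feats, train_labels, feat):
    """Classify by the nearest training sample in PCA space."""
    d = np.linalg.norm(train_feats - feat, axis=1)
    return train_labels[int(np.argmin(d))]
```

A typical use is to fit PCA on the training gallery once, project every gallery image, and then project each probe image and match it against the gallery features.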
Directory of Open Access Journals (Sweden)
Fanrong Kong
2017-09-01
Full Text Available To alleviate greenhouse gas emissions and the dependence on fossil fuel, Plug-in Hybrid Electric Vehicles (PHEVs) have gained increasing popularity in recent decades. Because electricity prices fluctuate in the power market, the charging schedule strongly influences driving cost. Although next-day electricity prices can be obtained in a day-ahead power market, a driving plan is not easily made in advance. PHEV owners can input a next-day plan into a charging system (e.g., an aggregator) a day ahead, but doing so every day is a tedious task, and the driving plan may not be very accurate. To address this problem, in this paper we analyze energy demands according to a PHEV owner's historical driving records and build a personalized statistical driving model. Based on this model and the electricity spot prices, a rolling optimization strategy is proposed to help make a charging decision in the current time slot. On the one hand, by employing a heuristic algorithm, the schedule is made according to the situations in the following time slots. On the other hand, after the current time slot, the schedule is remade according to the next tens of time slots. Hence, the schedule is produced by a dynamic rolling optimization, but it only fixes the charging decision for the current time slot. In this way, both the fluctuation of electricity prices and the driving routine are involved in the scheduling, and PHEV owners need not input a day-ahead driving plan. Simulation results demonstrate that the proposed method helps owners save charging costs while meeting their driving requirements.
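The rolling idea, optimize over the remaining lookahead window but commit only the current slot's decision, can be illustrated with a toy greedy heuristic: charge in the cheapest slots of the window until the energy need is covered. This is an illustrative stand-in for the paper's heuristic, not its actual algorithm; all parameters below are assumptions:

```python
def rolling_charge_decision(prices, energy_needed, max_per_slot):
    """Greedy sketch: mark the cheapest lookahead slots for charging until
    the energy need is covered, then commit only slot 0's decision."""
    order = sorted(range(len(prices)), key=lambda t: prices[t])
    charge = [0.0] * len(prices)
    remaining = energy_needed
    for t in order:
        if remaining <= 0:
            break
        amount = min(max_per_slot, remaining)
        charge[t] = amount
        remaining -= amount
    return charge[0]  # only the current slot's decision is committed

def simulate(prices, energy_needed, max_per_slot):
    """Roll the horizon forward one slot at a time, re-planning each slot."""
    plan, remaining = [], energy_needed
    for t in range(len(prices)):
        x = rolling_charge_decision(prices[t:], remaining, max_per_slot)
        plan.append(x)
        remaining -= x
    return plan
```

With prices [5, 1, 4, 2], a need of 2 units and at most 1 unit per slot, the rolling plan charges in the two cheapest slots (prices 1 and 2), which is the cost-optimal choice for this toy instance.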
SAT-based verification for timed component connectors
Kemper, S.
2011-01-01
Component-based software construction relies on suitable models underlying components, and in particular the coordinators which orchestrate component behaviour. Verifying correctness and safety of such systems amounts to model checking the underlying system model. The model checking techniques not o
The development of a culture of problem solving with secondary students through heuristic strategies
Eisenmann, Petr; Novotná, Jarmila; Přibyl, Jiří; Břehovský, Jiří
2015-12-01
The article reports the results of a longitudinal research study conducted in three mathematics classes in Czech schools with 62 pupils aged 12-18 years. The pupils were exposed to the use of selected heuristic strategies in mathematical problem solving for a period of 16 months. This was done through solving problems where the solution was most efficient when heuristic strategies were used. The authors conducted a two-dimensional classification of the use of heuristic strategies based on the work of Pólya (2004) and Schoenfeld (1985). We developed a tool that allows for the description of a pupil's ability to solve problems. Named the Culture of Problem Solving (CPS), this tool consists of four components: intelligence, text comprehension, creativity and the ability to use existing knowledge. The pupils' success rate in problem solving and the changes in some of the CPS factors pre- and post-experiment were monitored. The pupils appeared to improve considerably in the creativity component. In addition, the results indicate a positive change in the students' attitude to problem solving. As far as the teachers participating in the experiment are concerned, a significant change occurred in their teaching style towards a more constructivist, inquiry-based approach, as well as in their willingness to accept a student's non-standard approach to solving a problem. Another important outcome of the research was the identification of the heuristic strategies that can be taught via long-term guided solution of suitable problems and those that cannot. Those that can be taught include systematic experimentation, guess-check-revise and introduction of an auxiliary element. Those that cannot be taught (or can only be taught with difficulty) include the strategies of specification and generalization and analogy.
Independent component analysis based on adaptive artificial bee colony
National Research Council Canada - National Science Library
Shi Zhang; Chao-Wei Bao; Hai-Bin Shen
2016-01-01
.... An independent component analysis method based on adaptive artificial bee colony algorithm is proposed in this paper, aiming at the problems of slow convergence and low computational precision...
Component-based event composition modeling for CPS
Yin, Zhonghai; Chu, Yanan
2017-06-01
In order to combine the event-driven model with component-based architecture design, this paper proposes a component-based event composition model to realize CPS event processing. Firstly, the formal representations of components and attribute-oriented events are defined. Every component consists of subcomponents and the corresponding event sets. The attribute "type" is added to the attribute-oriented event definition so as to describe the responsiveness to the component. Secondly, the component-based event composition model is constructed. A concept-lattice-based event algebra system is built to describe the relations between events, and the rules for drawing the Hasse diagram are discussed. Thirdly, as there are redundancies among composite events, two simplification methods are proposed. Finally, a communication-based train control system is simulated to verify the event composition model. Results show that the constructed event composition model can express composite events correctly and effectively.
Directory of Open Access Journals (Sweden)
Supatchaya Chotayakul
2013-01-01
Full Text Available This research studies a cash inventory problem in an ATM network to satisfy customers' cash needs over multiple periods with deterministic demand. The objective is to determine the amount of money to place in Automated Teller Machines (ATMs) and cash centers for each period over a given time horizon. The algorithms are designed as a multi-echelon inventory problem with single-item capacitated lot-sizing to minimize the total cost of running the ATM network. In this study, we formulate the problem as a Mixed Integer Program (MIP) and develop an approach based on reformulating the model as a shortest path formulation for finding a near-optimal solution of the problem. This reformulation is the same as the traditional model, except that the capacity constraints, inventory balance constraints and setup constraints related to the management of the money in ATMs are relaxed. The new formulation has more variables and constraints, but a much tighter linear relaxation than the original, and is faster to solve for short-term planning. Computational results show its effectiveness, especially for large-sized problems.
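The shortest-path view of lot-sizing is easiest to see in the classic uncapacitated setting (Wagner-Whitin): node t represents "start of period t with zero stock", and an arc (i, j) represents placing one order in period i that covers demand through period j-1. A sketch with illustrative cost parameters; the paper's capacitated, multi-echelon model is richer than this:

```python
def wagner_whitin(demand, setup, holding):
    """Shortest path over T+1 nodes for uncapacitated lot sizing.
    demand:  per-period demands
    setup:   fixed cost per order
    holding: cost per unit held per period"""
    T = len(demand)
    INF = float("inf")
    f = [INF] * (T + 1)  # f[j] = cheapest way to cover periods 0..j-1
    f[0] = 0.0
    for j in range(1, T + 1):
        for i in range(j):
            # Arc (i, j): order in period i, hold units for later periods
            arc = setup + holding * sum((t - i) * demand[t] for t in range(i, j))
            f[j] = min(f[j], f[i] + arc)
    return f[T]
```

For demands [10, 10, 10] with setup cost 15 and holding cost 1, ordering in periods 0 (covering two periods) and 2 costs 40, cheaper than either ordering every period (45) or ordering everything up front (45).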
Li, Xuejun; Xu, Jia; Yang, Yun
2015-01-01
Cloud workflow system is a kind of platform service based on cloud computing that facilitates the automation of workflow applications. Among the factors distinguishing cloud workflow systems from their counterparts, the market-oriented business model is one of the most prominent. The optimization of task-level scheduling in cloud workflow systems is a hot topic. As the scheduling is an NP problem, Ant Colony Optimization (ACO) and Particle Swarm Optimization (PSO) have been proposed to optimize the cost. However, they suffer from premature convergence in the optimization process and therefore cannot effectively reduce the cost. To solve these problems, a Chaotic Particle Swarm Optimization (CPSO) algorithm with a chaotic sequence and an adaptive inertia weight factor is applied to task-level scheduling. The chaotic sequence, with its high randomness, improves the diversity of solutions, and its regularity assures good global convergence. The adaptive inertia weight factor depends on the estimated cost; it lets the scheduling avoid premature convergence by properly balancing global and local exploration. Experimental simulation shows that the cost obtained by our scheduling is always lower than that of the two representative counterparts.
Directory of Open Access Journals (Sweden)
Xuejun Li
2015-01-01
Full Text Available Cloud workflow system is a kind of platform service based on cloud computing that facilitates the automation of workflow applications. Among the factors distinguishing cloud workflow systems from their counterparts, the market-oriented business model is one of the most prominent. The optimization of task-level scheduling in cloud workflow systems is a hot topic. As the scheduling is an NP problem, Ant Colony Optimization (ACO) and Particle Swarm Optimization (PSO) have been proposed to optimize the cost. However, they suffer from premature convergence in the optimization process and therefore cannot effectively reduce the cost. To solve these problems, a Chaotic Particle Swarm Optimization (CPSO) algorithm with a chaotic sequence and an adaptive inertia weight factor is applied to task-level scheduling. The chaotic sequence, with its high randomness, improves the diversity of solutions, and its regularity assures good global convergence. The adaptive inertia weight factor depends on the estimated cost; it lets the scheduling avoid premature convergence by properly balancing global and local exploration. Experimental simulation shows that the cost obtained by our scheduling is always lower than that of the two representative counterparts.
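The two ingredients named in the abstract, a chaotic sequence replacing uniform random draws and an adaptive inertia weight, can be sketched on a toy continuous minimization problem. The logistic map, the coefficient values, and the simple fitness-based inertia rule below are illustrative assumptions, not the paper's exact formulas:

```python
def chaotic_pso(f, dim, n=20, iters=400, lo=-5.0, hi=5.0, vmax=1.0):
    """PSO sketch using a logistic-map chaotic sequence for the random
    coefficients and an inertia weight adapted to particle fitness."""
    state = [0.7]  # logistic-map state; avoid the fixed points 0, 0.25, 0.5, 0.75

    def chaos():
        state[0] = 4.0 * state[0] * (1.0 - state[0])
        return state[0]

    pos = [[lo + (hi - lo) * chaos() for _ in range(dim)] for _ in range(n)]
    vel = [[0.0] * dim for _ in range(n)]
    pbest = [p[:] for p in pos]
    pbest_f = [f(p) for p in pos]
    gi = min(range(n), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[gi][:], pbest_f[gi]
    for _ in range(iters):
        avg = sum(pbest_f) / n
        for i in range(n):
            # Adaptive inertia: worse-than-average particles explore more
            w = 0.9 if pbest_f[i] > avg else 0.4
            for d in range(dim):
                r1, r2 = chaos(), chaos()
                v = (w * vel[i][d]
                     + 2.0 * r1 * (pbest[i][d] - pos[i][d])
                     + 2.0 * r2 * (gbest[d] - pos[i][d]))
                vel[i][d] = max(-vmax, min(vmax, v))  # velocity clamping
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], fi
                if fi < gbest_f:
                    gbest, gbest_f = pos[i][:], fi
    return gbest, gbest_f
```

Because the logistic map is deterministic, the whole run is reproducible, which makes the diversity contribution of the chaotic sequence easy to study in isolation.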
Leveraging Component-Based Software Engineering with Fraclet
Rouvoy, Romain; Merle, Philippe
2009-01-01
International audience; Component-based software engineering has achieved wide acceptance in the domain of software engineering by improving productivity, reusability and composition. This success has also encouraged the emergence of a plethora of component models. Nevertheless, even if the abstract models of most lightweight component models are quite similar, their programming models can still differ a lot. This drawback limits the reuse and composition of components implemented using di...
Java Applications Development Based on Component and Metacomponent Approach
Danijel Radošević; Mario Konecki; Tihomir Orehovački
2008-01-01
Component based modeling offers a new and improved approach to the design, construction, implementation and evolution of software applications. Such development is usually represented by an appropriate component model/diagram; UML, for example, offers the component diagram for representing this kind of model. On the other hand, the use of metacomponents offers some new features which could hardly be achieved by using generic components. Firstly, implementation of ...
Energy Technology Data Exchange (ETDEWEB)
Stevens, James E.; Nordquist, Christopher Daniel; Baker, Michael Sean; Fleming, James Grant; Stewart, Harold D.; Dyck, Christopher William
2005-01-01
Radio frequency microelectromechanical systems (RF MEMS) are an enabling technology for next-generation communications and radar systems in both military and commercial sectors. RF MEMS-based reconfigurable circuits outperform solid-state circuits in terms of insertion loss, linearity, and static power consumption and are advantageous in applications where high signal power and nanosecond switching speeds are not required. We have demonstrated a number of RF MEMS switches on high-resistivity silicon (high-R Si) that were fabricated by leveraging the volume manufacturing processes available in the Microelectronics Development Laboratory (MDL), a Class-1, radiation-hardened CMOS manufacturing facility. We describe novel tungsten and aluminum-based processes, and present results of switches developed in each of these processes. Series and shunt ohmic switches and shunt capacitive switches were successfully demonstrated. The implications of fabricating on high-R Si and suggested future directions for developing low-loss RF MEMS-based circuits are also discussed.
Neural basis of scientific innovation induced by heuristic prototype.
Directory of Open Access Journals (Sweden)
Junlong Luo
Full Text Available A number of major inventions in history have been based on bionic imitation. Heuristics that apply biological systems to the creation of artificial devices and machines might be among the most critical processes in scientific innovation. In particular, the prototype-heuristic view proposes that innovation may engage automatic activation of a prototype, such as a biological system, to form novel associations between the prototype's function and problem solving. We speculated that the cortical dissociation between this automatic activation and the forming of novel associations is a critical point for heuristic creativity. In the present study, novel and old scientific innovations (NSI and OSI) were selected as experimental materials in a learning-testing paradigm to explore the neural basis of scientific innovation induced by heuristic prototypes. College students were required to resolve NSI problems (to which they did not know the answers) and OSI problems (to which they knew the answers). In two fMRI experiments, our results showed that the subjects could resolve NSI when provided with heuristic prototypes. In Experiment 1, the lingual gyrus (LG; BA18) appeared to be related to prototype heuristics when college students resolved NSI after learning a relevant prototype. In Experiment 2, the LG (BA18) and precuneus (BA31) were significantly activated for NSI compared to OSI when college students learned all prototypes one day before the test. In addition, the mean beta-values of these brain regions for NSI were all correlated with behavioral accuracy on NSI. As our hypothesis indicated, the findings suggest that the LG might be involved in forming novel associations using heuristic information, while the precuneus might be involved in the automatic activation of the heuristic prototype during scientific innovation.
FUZZY PRINCIPAL COMPONENT ANALYSIS AND ITS KERNEL BASED MODEL
Institute of Scientific and Technical Information of China (English)
Anonymous
2007-01-01
Principal Component Analysis (PCA) is one of the most important feature extraction methods, and Kernel Principal Component Analysis (KPCA) is a nonlinear extension of PCA based on kernel methods. In real world, each input data may not be fully assigned to one class and it may partially belong to other classes. Based on the theory of fuzzy sets, this paper presents Fuzzy Principal Component Analysis (FPCA) and its nonlinear extension model, i.e., Kernel-based Fuzzy Principal Component Analysis (KFPCA). The experimental results indicate that the proposed algorithms have good performances.
How do people judge risks: availability heuristic, affect heuristic, or both?
Pachur, Thorsten; Hertwig, Ralph; Steinmann, Florian
2012-09-01
How does the public reckon which risks to be concerned about? The availability heuristic and the affect heuristic are key accounts of how laypeople judge risks. Yet, these two accounts have never been systematically tested against each other, nor have their predictive powers been examined across different measures of the public's risk perception. In two studies, we gauged risk perception in student samples by employing three measures (frequency, value of a statistical life, and perceived risk) and by using a homogeneous (cancer) and a classic set of heterogeneous causes of death. Based on these judgments of risk, we tested precise models of the availability heuristic and the affect heuristic and different definitions of availability and affect. Overall, availability-by-recall, a heuristic that exploits people's direct experience of occurrences of risks in their social network, conformed to people's responses best. We also found direct experience to carry a high degree of ecological validity (and one that clearly surpasses that of affective information). However, the relative impact of affective information (as compared to availability) proved more pronounced in value-of-a-statistical-life and perceived-risk judgments than in risk-frequency judgments. Encounters with risks in the media, in contrast, played a negligible role in people's judgments. Going beyond the assumption of exclusive reliance on either availability or affect, we also found evidence for mechanisms that combine both, either sequentially or in a composite fashion. We conclude with a discussion of policy implications of our results, including how to foster people's risk calibration and the success of education campaigns.
Component-Based Approach in Learning Management System Development
Zaitseva, Larisa; Bule, Jekaterina; Makarov, Sergey
2013-01-01
The paper describes component-based approach (CBA) for learning management system development. Learning object as components of e-learning courses and their metadata is considered. The architecture of learning management system based on CBA being developed in Riga Technical University, namely its architecture, elements and possibilities are…
Heuristics for Hierarchical Partitioning with Application to Model Checking
DEFF Research Database (Denmark)
Möller, Michael Oliver; Alur, Rajeev
2001-01-01
Given a collection of connected components, it is often desired to cluster together parts of strong correspondence, yielding a hierarchical structure. We address the automation of this process and apply heuristics to battle the combinatorial and computational complexity. We define a cost function...
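The clustering-by-cost idea above, repeatedly merging the pair of components whose merge is cheapest under a user-supplied cost function, can be sketched as a greedy bottom-up loop. The single-linkage cost used here is only an illustrative stand-in for the paper's cost function:

```python
def agglomerate(items, dist, k):
    """Greedy bottom-up clustering sketch: repeatedly merge the two
    clusters whose merge has the lowest cost, until k clusters remain.
    Here the merge cost is single-linkage (closest cross-cluster pair)."""
    clusters = [[i] for i in range(len(items))]
    while len(clusters) > k:
        best = None
        for a in range(len(clusters)):
            for b in range(a + 1, len(clusters)):
                cost = min(dist(items[i], items[j])
                           for i in clusters[a] for j in clusters[b])
                if best is None or cost < best[0]:
                    best = (cost, a, b)
        _, a, b = best
        clusters[a] = clusters[a] + clusters[b]
        del clusters[b]
    return clusters
```

Recording the sequence of merges instead of stopping at k clusters yields the full hierarchy (a dendrogram), which is the structure a hierarchical partitioning heuristic exposes to the model checker.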
Heuristic Search Theory and Applications
Edelkamp, Stefan
2011-01-01
Search has been vital to artificial intelligence from the very beginning as a core technique in problem solving. The authors present a thorough overview of heuristic search with a balance of discussion between theoretical analysis and efficient implementation and application to real-world problems. Current developments in search such as pattern databases and search with efficient use of external memory and parallel processing units on main boards and graphics cards are detailed. Heuristic search as a problem solving tool is demonstrated in applications for puzzle solving, game playing, constra
Regarding Chilcott's "Structural Functionalism as a Heuristic Device" Heuristically.
Blot, Richard K.
1998-01-01
The heuristic value of Chilcott's essay lies less in its support for structural functionalism and more in its concern to reexamine theory in the work of earlier educational anthropologists for what earlier theories and practices can add to current research. (SLD)
Hyper Heuristic Approach for Design and Optimization of Satellite Launch Vehicle
Institute of Scientific and Technical Information of China (English)
Amer Farhan RAFIQUE; HE Linshu; Ali KAMRAN; Qasim ZEESHAN
2011-01-01
Satellite launch vehicle design lies at the crossroads of multiple challenging technologies, and its design and optimization present a typical example of a multidisciplinary design and optimization (MDO) process. The complexity of the problem demands a highly efficient and effective algorithm that can optimize the design. A hyper-heuristic approach (HHA) based on meta-heuristics is applied to the optimization of an air-launched satellite launch vehicle (ASLV). A non-learning random function (NLRF) is proposed to control low-level meta-heuristics (LLMHs), which increases the certainty of a global solution, an essential ingredient in the product conceptual design phase of aerospace systems. A comprehensive empirical study is performed to evaluate the performance advantages of the proposed approach over popular non-gradient-based optimization methods. The design of the ASLV encompasses aerodynamics, propulsion, structure, stage layout, mass distribution, and trajectory modules connected by a multidisciplinary feasible design approach. This approach formulates explicit system-level goals and then hands the design optimization process entirely over to the optimizer. This distinctive approach to launch vehicle system design relieves engineers from tedious, iterative tasks and enables them to improve their component-level models. Mass drives vehicle performance and cost, and so it is considered the core of the vehicle design process; therefore, gross launch mass is minimized in the HHA.
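The NLRF idea, selecting a low-level heuristic uniformly at random at each step with no learning, can be illustrated on a toy continuous minimization problem. The move pool, parameters and acceptance rule below are illustrative assumptions, not the paper's ASLV design modules:

```python
import random

def hyper_heuristic_minimize(f, x0, iters=500, seed=42):
    """Non-learning random hyper-heuristic sketch: each step picks one
    low-level move heuristic uniformly at random and keeps improvements."""
    rng = random.Random(seed)
    dim = len(x0)

    def big_jump(x):     # restart-like move over the whole search box
        return [rng.uniform(-5, 5) for _ in range(dim)]

    def small_tweak(x):  # local refinement of a single coordinate
        y = x[:]
        y[rng.randrange(dim)] += rng.uniform(-0.1, 0.1)
        return y

    def medium_step(x):  # medium-range perturbation of all coordinates
        return [xi + rng.uniform(-1, 1) for xi in x]

    llh_pool = [big_jump, small_tweak, medium_step]
    best, best_f = x0[:], f(x0)
    for _ in range(iters):
        move = rng.choice(llh_pool)  # non-learning random selection
        cand = move(best)
        cand_f = f(cand)
        if cand_f < best_f:          # accept only improvements
            best, best_f = cand, cand_f
    return best, best_f
```

Mixing exploratory jumps with local tweaks is what gives the random selector its robustness: no single low-level heuristic needs to be good everywhere, only somewhere.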
Edalati, Sh; Houshangi far, A.; Torabi, N.; Baneshi, Z.; Behjat, A.
2017-02-01
Poly(3,4-ethylenedioxythiophene):poly(styrene sulfonate) (PEDOT:PSS) was deposited on a fluorine-doped tin oxide glass substrate using a heuristic method to fabricate platinum-free counter electrodes for dye-sensitized solar cells (DSSCs). In this heuristic method a thin layer of PEDOT:PSS is obtained by spin coating the PEDOT:PSS on a Cu substrate and then removing the substrate with FeCl3. The characteristics of the deposited PEDOT:PSS were studied by energy dispersive x-ray analysis and scanning electron microscopy, which revealed the micro-electronic specifications of the cathode. The aforementioned DSSCs exhibited a solar conversion efficiency of 3.90%, which is far higher than that of DSSCs with pure PEDOT:PSS (1.89%). This enhancement is attributed not only to the micro-electronic specifications but also to the HNO3 treatment in our heuristic method. The results of cyclic voltammetry, electrochemical impedance spectroscopy (EIS) and Tafel polarization plots show that the modified cathode has a dual function, combining excellent conductivity and electrocatalytic activity for iodine reduction.
The planning of order picking in a warehouse by heuristic algorithms
Uršič, Jakob
2014-01-01
Planning of order picking is an essential process in every warehouse. In this thesis, we developed a simple warehouse simulator which allows us to perform various path-finding searches for a given set of items and for one or more robots, using the A* algorithm. The search is guided mainly by heuristic evaluation. We implemented five different heuristic estimates, which we tested experimentally on examples with different warehouse configurations and with different numbers of...
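An A* search of the kind described can be sketched on a grid warehouse, with `#` cells standing for shelving. The Manhattan distance is one of the admissible heuristic estimates such a simulator might use (the thesis tests five; this is only the most common one):

```python
import heapq

def astar_grid(grid, start, goal):
    """A* on a 4-connected grid ('#' = obstacle) with the Manhattan-distance
    heuristic; since the heuristic is admissible, the returned path length
    is optimal.  Returns None if the goal is unreachable."""
    rows, cols = len(grid), len(grid[0])

    def h(p):  # Manhattan distance to the goal
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    open_heap = [(h(start), 0, start)]   # (f = g + h, g, position)
    best_g = {start: 0}
    while open_heap:
        _, g, cur = heapq.heappop(open_heap)
        if cur == goal:
            return g
        if g > best_g.get(cur, float("inf")):
            continue  # stale queue entry
        r, c = cur
        for nr, nc in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] != '#':
                ng = g + 1
                if ng < best_g.get((nr, nc), float("inf")):
                    best_g[(nr, nc)] = ng
                    heapq.heappush(open_heap, (ng + h((nr, nc)), ng, (nr, nc)))
    return None
```

Routing a picker around a shelf simply means the returned length exceeds the straight-line Manhattan estimate, which is exactly the gap a better-informed heuristic tries to close.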
Advances in resonance based NDT for ceramic components
Hunter, L. J.; Jauriqui, L. M.; Gatewood, G. D.; Sisneros, R.
2012-05-01
The application of resonance based non-destructive testing methods has been providing benefit to manufacturers of metal components in the automotive and aerospace industries for many years. Recent developments in resonance based technologies are now allowing the application of resonance NDT to ceramic components including turbine engine components, armor, and hybrid bearing rolling elements. Application of higher frequencies and advanced signal interpretation are now allowing Process Compensated Resonance Testing to detect both internal material defects and surface breaking cracks in a variety of ceramic components. Resonance techniques can also be applied to determine material properties of coupons and to evaluate process capability for new manufacturing methods.
3D face recognition algorithm based on detecting reliable components
Institute of Scientific and Technical Information of China (English)
Huang Wenjun; Zhou Xuebing; Niu Xiamu
2007-01-01
Fisherfaces algorithm is a popular method for face recognition. However, there exist some unstable components that degrade recognition performance. In this paper, we propose a method based on detecting reliable components to overcome the problem and introduce it to 3D face recognition. The reliable components are detected within the binary feature vector, which is generated from the Fisherfaces feature vector based on statistical properties, and is used for 3D face recognition as the final feature vector. Experimental results show that the reliable components feature vector is much more effective than the Fisherfaces feature vector for face recognition.
Victimized by Phishing: A Heuristic - Systematic Perspective
Directory of Open Access Journals (Sweden)
ZHENGCHUAN XU
2012-12-01
Full Text Available Phishing has become an ever-present, ever-increasing threat to information security, yet theory-based, systematic research on the behavioral aspect of this phenomenon is rather limited. In this paper, we propose the Heuristic-Systematic Model (HSM) as an overarching theory to solidify the theory base for this line of research. We note the theoretical synergy between HSM and other theories used in previous research, and illustrate how HSM can be used to develop a research model investigating the psychological mechanism underlying the effectiveness of phishing attacks.
Component-based Control Software Design for Flexible Manufacturing System
Institute of Scientific and Technical Information of China (English)
周炳海; 奚立峰; 曹永上
2003-01-01
This paper describes a new method for designing and implementing component-based, distributed and hierarchical flexible manufacturing control software using the component concept. The proposed method aims at improving the flexibility and reliability of the control system. After describing the concepts of component-based software and distributed object technology, the architecture of the component-based control system software is suggested with the Common Object Request Broker Architecture (CORBA). We then propose a design method for a component-based distributed and hierarchical flexible manufacturing control system. Finally, to verify the software design method, a prototype flexible manufacturing control system software has been implemented in Orbix 2.3c and VC++ 6.0 and tested in connection with the physical flexible manufacturing shop at the WuXi Professional Institute.
Optimization of Component Based Software Engineering Model Using Neural Network
Directory of Open Access Journals (Sweden)
Gaurav Kumar
2014-10-01
Full Text Available The goal of Component Based Software Engineering (CBSE) is to deliver high-quality, more reliable and more maintainable software systems in a shorter time and within a limited budget by reusing and combining existing quality components. A high-quality system can be achieved by using quality components, a framework and an integration process, each of which plays a significant role; hence, the techniques and methods used for quality assurance and assessment of a component-based system differ from those of traditional software engineering methodology. In this paper, we present a model for optimizing Chidamber and Kemerer (CK) metric values of component-based software. A deep analysis of a series of CK metrics of software component design patterns is done and metric values are drawn from them. Using an unsupervised neural network, the Self-Organizing Map, we propose a model that provides optimized reusability for a software component engineering model depending on CK metric values. Average, standard-deviation and optimized values for the CK metrics are compared and evaluated to show the optimized reusability of the component-based model.
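A Self-Organizing Map of the kind used above can be sketched in a few lines: units on a 1-D lattice compete for each input (metric vector), and the winner and its lattice neighbours move towards it, with learning rate and neighbourhood width shrinking over time. The parameters and synthetic data below are illustrative, not the paper's CK-metric setup:

```python
import math
import random

def train_som(data, n_units=4, iters=500, seed=1):
    """1-D SOM sketch: units on a line, Gaussian neighborhood that
    shrinks over time.  data: list of equal-length numeric tuples."""
    rng = random.Random(seed)
    dim = len(data[0])
    w = [[rng.uniform(0, 1) for _ in range(dim)] for _ in range(n_units)]
    for t in range(iters):
        x = data[rng.randrange(len(data))]
        # Best-matching unit = closest weight vector
        win = min(range(n_units),
                  key=lambda u: sum((w[u][d] - x[d]) ** 2 for d in range(dim)))
        lr = 0.5 * (1 - t / iters)                      # decaying learning rate
        sigma = max(0.5, (n_units / 2) * (1 - t / iters))  # shrinking radius
        for u in range(n_units):
            h = math.exp(-((u - win) ** 2) / (2 * sigma ** 2))
            for d in range(dim):
                w[u][d] += lr * h * (x[d] - w[u][d])
    return w

def bmu(w, x):
    """Index of the best-matching unit for input x."""
    return min(range(len(w)),
               key=lambda u: sum((w[u][d] - x[d]) ** 2 for d in range(len(x))))
```

After training on two well-separated groups of vectors, inputs from different groups map to different units, which is the property the paper exploits to group components with similar metric profiles.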
Methods for reliability based design optimization of structural components
Dersjö, Tomas
2012-01-01
Cost and quality are key properties of a product, possibly even the two most important. One definition of quality is fitness for purpose. Load-bearing products, i.e. structural components, lose their fitness for purpose if they fail. Thus, the ability to withstand failure is a fundamental measure of quality for structural components. Reliability based design optimization (RBDO) is an approach for development of structural components which aims to minimize the cost while constraining the probabili...
Directory of Open Access Journals (Sweden)
Tibor Tot
2011-01-01
A unique case of metaplastic breast carcinoma with an epithelial component showing tumoral necrosis and neuroectodermal stromal component is described. The tumor grew rapidly and measured 9 cm at the time of diagnosis. No lymph node metastases were present. The disease progressed rapidly and the patient died two years after the diagnosis from a hemorrhage caused by brain metastases. The morphology and phenotype of the tumor are described in detail and the differential diagnostic options are discussed.
Efficient heuristics for the Rural Postman Problem
Directory of Open Access Journals (Sweden)
GW Groves
2005-06-01
A local search framework for the (undirected) Rural Postman Problem (RPP) is presented in this paper. The framework allows local search approaches that have been applied successfully to the well-known Travelling Salesman Problem to be applied to the RPP as well. New heuristics for the RPP, based on this framework, are introduced; these are capable of solving significantly larger instances of the RPP than have been reported in the literature. Test results are presented for a number of benchmark RPP instances in order to compare efficiency and solution quality against known methods.
Advances in heuristic signal processing and applications
Chatterjee, Amitava; Siarry, Patrick
2013-01-01
There have been significant developments in the design and application of algorithms for both one-dimensional signal processing and multidimensional signal processing, namely image and video processing. The recent focus has changed from a step-by-step procedure of designing the algorithm first and following up with in-depth analysis and performance improvement, to instead applying heuristic-based methods to solve signal-processing problems. In this book the contributing authors demonstrate both general-purpose algorithms and those aimed at solving specialized application problems, with a spec...
Implementing Quality Assurance Features in Component-based Software System
National Research Council Canada - National Science Library
Navdeep Batolar; Parminder Kaur
2016-01-01
The increasing demand for the component-based development approach (CBDA) gives software developers the opportunity to increase the speed of the software development process and lower its production cost...
Water quality assessment using SVD-based principal component ...
African Journals Online (AJOL)
Water quality assessment using SVD-based principal component analysis of hydrological data. ... value decomposition (SVD) of hydrological data was tested for water quality assessment.
High-extensible scene graph framework based on component techniques
Institute of Scientific and Technical Information of China (English)
LI Qi-cheng; WANG Guo-ping; ZHOU Feng
2006-01-01
In this paper, a novel component-based scene graph is proposed, in which all objects in the scene are classified into different entities, and a scene can be represented as a hierarchical graph composed of instances of entities. Each entity contains basic data and its operations, which are encapsulated into the entity component. The entity possesses certain behaviours, which are responses to rules and interaction defined by the high-level application. Such behaviours can be described by scripts or behaviour models. The component-based scene graph in this paper is more abstract and higher-level than traditional scene graphs. The contents of a scene can be extended flexibly by adding new entities and new entity components, and behaviour modification can be obtained by modifying the model components or behaviour scripts. Its robustness and efficiency are verified by many examples implemented in the Virtual Scenario developed by Peking University.
Cloud Based Big Data Infrastructure: Architectural Components and Automated Provisioning
Demchenko, Yuri; Turkmen, Fatih; Blanchet, Christophe; Loomis, Charles; Laat, Cees de
2016-01-01
This paper describes the general architecture and functional components of the cloud based Big Data Infrastructure (BDI). The proposed BDI architecture is based on the analysis of the emerging Big Data and data intensive technologies and supported by the definition of the Big Data Architecture Framework (BDAF) that defines the following components of the Big Data technologies: Big Data definition, Data Management including data lifecycle and data structures, Big Data Infrastructure (generical...
Issues in Component-Based Development: Towards Specification with ADLs
Directory of Open Access Journals (Sweden)
Rafael González
2006-10-01
Software development has been coupled with time and cost problems throughout its history. This has motivated the search for flexible, trustworthy, time- and cost-efficient development. In order to achieve this, software reuse appears fundamental, and component-based development is the way towards reuse. This paper discusses the present state of component-based development and some of the critical issues for its success, such as the existence of adequate repositories, component integration within a software architecture, and an adequate specification. This paper suggests ADLs (Architecture Description Languages) as a possible means for this specification.
Software component composition based on ADL and Middleware
Institute of Scientific and Technical Information of China (English)
Anonymous
2001-01-01
How to compose prefabricated components is a key issue in component-based reuse. Research on Software Architecture (SA) and Component-based Software Development (CBSD) provides two hopeful solutions from different perspectives. SA provides a top-down approach to realizing component-based reuse. However, it pays insufficient attention to the refinement and implementation of the architectural descriptions, and does not provide the necessary capability to automate the transformation or composition to form a final executable application. CBSD provides a bottom-up way by using existing middleware infrastructures. However, these technologies do not take into account the systematic methodology that can guide the CBSD process, especially the component composition at higher abstract levels. We argue that it is a natural solution to combine these two approaches. In this paper, an architecture-based component composition approach is presented. In this way, SA description, using mapping rules and mini-tools to narrow the gap between design and implementation, is used as the blueprint and middleware technology as the runtime scaffold for component composition. Our approach presents an ADL, which supports user-defined connectors and has an extensible framework, to specify software architectures. To map a SA description into implementation, it is necessary to map it first to an OO design model described in UML, then to the final implementation. The architectural description can be mapped into source code or executable code by using some ORB conforming to CORBA standard. Also a toolkit is provided to support this approach efficiently.
Two Effective Heuristics for Beam Angle Optimization in Radiation Therapy
Yarmand, Hamed
2013-01-01
In radiation therapy, mathematical methods have been used for optimizing treatment planning for delivery of sufficient dose to the cancerous cells while keeping the dose to critical surrounding structures minimal. This optimization problem can be modeled using mixed integer programming (MIP) whose solution gives the optimal beam orientation as well as optimal beam intensity. The challenge, however, is the computation time for this large scale MIP. We propose and investigate two novel heuristic approaches to reduce the computation time considerably while attaining high-quality solutions. We introduce a family of heuristic cuts based on the concept of 'adjacent beams' and a beam elimination scheme based on the contribution of each beam to deliver the dose to the tumor in the ideal plan in which all potential beams can be used simultaneously. We show the effectiveness of these heuristics for intensity modulated radiation therapy (IMRT) and stereotactic body radiation therapy (SBRT) on a clinical liver case.
Least Dependent Component Analysis Based on Mutual Information
Stögbauer, Harald; Kraskov, Alexander; Astakhov, Sergey A.; Grassberger, Peter
2004-01-01
We propose to use precise estimators of mutual information (MI) to find least dependent components in a linearly mixed signal. On the one hand this seems to lead to better blind source separation than with any other presently available algorithm. On the other hand it has the advantage, compared to other implementations of 'independent' component analysis (ICA), some of which are based on crude approximations for MI, that the numerical values of the MI can be used for: (i) estimating residual dependencies between the output components; (ii) estimating the reliability of the output, by comparing the pairwise MIs with those of re-mixed components; (iii) clustering the output according to the residual interdependencies. For the MI estimator we use a recently proposed k-nearest-neighbor based algorithm. For time sequences we combine this with delay embedding, in order to take into account non-trivial time correlations. After several tests with artificial data, we apply the resulting MILCA (Mutual Information based ...
Java Applications Development Based on Component and Metacomponent Approach
Directory of Open Access Journals (Sweden)
Danijel Radošević
2008-12-01
Component-based modeling offers a new and improved approach to the design, construction, implementation and evolution of software applications. Such software development is usually represented by an appropriate component model/diagram; UML, for example, offers the component diagram for representing this kind of model. On the other hand, the use of metacomponents offers some new features that could hardly be achieved using generic components alone. First, it allows the implementation of program properties that are dispersed across different classes and other program units, i.e. aspects. This implies an automated process of assembling components and interconnecting them to build applications, according to the model offered in this paper, which also supports the use of generic components. The benefits of this hybrid process are higher flexibility achieved by the automated connection process, optimization through selective feature inclusion, and easier application maintenance and development. In this paper we offer an approach to application development based on a hybrid component/metacomponent model. The component model is given by UML diagrams, while the metacomponent model is given by a generator scripting model. We explain this hybrid approach on an example of Java Web application development.
Investigation on Supply Chain Management Based on Component Configuration
Institute of Scientific and Technical Information of China (English)
张洁; 陈淮莉; 马登哲
2004-01-01
From supply-push mode to demand-pull mode, SCM systems will face four main points: (1) real time visibility that covers the whole supply chain, (2) agility for choice of supply and source, (3) response to diverse customer demands and short delivery deadlines, and (4) rapid occurrence of new products following the market trends and new designs. Component-based SCM has become a hot spot in research areas. A multi-layer framework is set up, including a database server layer, an application server layer, a kernel component layer and a user interface layer. Some function components are designed, which are optimal planning arithmetic components, controller components and evaluation indexes components, in order to suit both discrete and continuous manufacturing. This paper studies a three-dimensional SCM configuration method based on the types of enterprise, manufacturing and products, provides powerful tools for SCM system implementations, and adopts an object-oriented technology to construct component-based distributed information system to assure right time, right materials, right place, right quantity and right customers.
Integration of Simulink Models with Component-based Software Models
DEFF Research Database (Denmark)
Marian, Nicolae
2008-01-01
Model based development aims to facilitate the development of embedded control systems by emphasizing the separation of the design level from the implementation level. Model based design involves the use of multiple models that represent different views of a system, having different semantics ... constructs and process flow, then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. A software component-based system aims to organize system architecture and behaviour as a means of computation ... constraints. COMDES (Component-based Design of Software for Distributed Embedded Systems) is such a component-based system framework developed by the software engineering group of Mads Clausen Institute for Product Innovation (MCI), University of Southern Denmark. Once specified, the software model has ...
Meta-heuristic algorithms as tools for hydrological science
Yoo, Do Guen; Kim, Joong Hoon
2014-12-01
In this paper, meta-heuristic optimization techniques and their applications to water resources engineering, particularly in hydrological science, are introduced. In recent years, meta-heuristic optimization techniques have emerged that can overcome the problems inherent in iterative simulations. These methods are able to find good solutions while requiring limited computation time and memory use, and without requiring complex derivatives. Simulation-based meta-heuristic methods such as Genetic Algorithms (GAs) and Harmony Search (HS) have powerful searching abilities, which can overcome several drawbacks of traditional mathematical methods. For example, the HS algorithm is conceptualized from a musical performance process used to achieve better harmony; such optimization algorithms seek a near-global optimum determined by the value of an objective function, providing a more robust determination of musical performance than can be achieved through typical aesthetic estimation. In this paper, meta-heuristic algorithms and their applications (with a focus on GAs and HS) in hydrological science are discussed by subject, including a review of the existing literature in the field. Recent trends in optimization are then presented, and a relatively new technique, the Smallest Small World Cellular Harmony Search (SSWCHS), is briefly introduced, with a summary of promising results obtained in previous studies. Overall, previous studies have demonstrated that meta-heuristic algorithms are effective tools for the development of hydrological models and the management of water resources.
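To make the HS idea concrete, here is a minimal, hedged sketch of the basic Harmony Search loop in Python. The parameter names (HMCR, PAR, bandwidth) follow common HS descriptions; the objective function below is a stand-in for illustration, not a hydrological model:

```python
import random

def harmony_search(f, bounds, hms=10, hmcr=0.9, par=0.3, bw=0.1,
                   iters=2000, seed=1):
    """Minimal Harmony Search sketch for a box-constrained objective f."""
    rng = random.Random(seed)
    # harmony memory: hms random candidate solutions and their scores
    hm = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    scores = [f(x) for x in hm]
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if rng.random() < hmcr:            # take a value from memory ...
                v = hm[rng.randrange(hms)][d]
                if rng.random() < par:         # ... and maybe pitch-adjust it
                    v += rng.uniform(-bw, bw)
            else:                              # or improvise a random value
                v = rng.uniform(lo, hi)
            new.append(min(hi, max(lo, v)))    # clamp to the bounds
        s = f(new)
        worst = max(range(hms), key=scores.__getitem__)
        if s < scores[worst]:                  # replace the worst harmony
            hm[worst], scores[worst] = new, s
    best = min(range(hms), key=scores.__getitem__)
    return hm[best], scores[best]
```

In a hydrological setting, `f` would typically wrap a simulation model (e.g. a rainfall-runoff calibration error), which is exactly why the derivative-free search matters.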
Yoshizawa, Akira; Nisizima, Shoiti; Shimomura, Yutaka; Kobayashi, Hiromichi; Matsuo, Yuichi; Abe, Hiroyuki; Fujiwara, Hitoshi
2006-03-01
A new methodology for the Reynolds-averaged Navier-Stokes modeling is presented on the basis of the amalgamation of heuristic-modeling and turbulence-theory methods. A characteristic turbulence time scale is synthesized in a heuristic manner through the combination of several characteristic time scales. An algebraic model of turbulent-viscosity type for the Reynolds stress is derived from the Reynolds-stress transport equation with the time scale embedded. It is applied to the state of weak spatial and temporal nonequilibrium, and is compared with its theoretical counterpart derived by the two-scale direct-interaction approximation. The synthesized time scale is justified through the agreement of the two expressions derived by these entirely different methods. The derived model is tested in rotating isotropic, channel, and homogeneous-shear flows. It is extended to a nonlinear algebraic model and a supersonic model. The latter is shown to succeed in reproducing the reduction in the growth rate of a free-shear layer flow, without causing wrong effects on wall-bounded flows such as channel and boundary-layer flows.
Availability Allocation of Networked Systems Using Markov Model and Heuristics Algorithm
Directory of Open Access Journals (Sweden)
Ruiying Li
2014-01-01
It is a common practice to allocate the system availability goal to reliability and maintainability goals of components in the early design phase. However, networked system availability is difficult to allocate due to complex topology and multiple down states. To solve these problems, a practical availability allocation method is proposed. Network reliability algebraic methods are used to derive the availability expression of the networked topology on the system level, and a Markov model is introduced to determine availability on the component level. A heuristic algorithm is proposed to obtain the reliability and maintainability allocation values of components. The principles applied in the AGREE reliability allocation method, proposed by the Advisory Group on Reliability of Electronic Equipment, and in the failure-rate-based maintainability allocation method persist in our allocation method. A series system is used to verify the new algorithm, and the result shows that the allocation based on the heuristic algorithm is quite accurate compared to the traditional one. Moreover, our case study of a Signaling System No. 7 shows that the proposed allocation method is quite efficient for networked systems.
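As a toy illustration of the allocation idea (not the paper's heuristic), the sketch below apportions a series-system availability goal equally across components and converts each per-component goal into an MTBF requirement via the steady-state relation A = MTBF/(MTBF + MTTR); the equal-apportionment rule and the MTTR inputs are assumptions of this example:

```python
def allocate_series_availability(a_sys, mttrs):
    """Equal apportionment of a series-system availability goal.

    For a series system, A_sys is the product of component
    availabilities, so an equal split gives each component
    A_i = A_sys ** (1/n). Given each component's MTTR, the steady-state
    relation A = MTBF / (MTBF + MTTR) yields the required MTBF.
    """
    n = len(mttrs)
    a_i = a_sys ** (1.0 / n)                        # per-component availability goal
    mtbfs = [mttr * a_i / (1.0 - a_i) for mttr in mttrs]
    return a_i, mtbfs
```

For example, a 0.99 system goal over two components requires each component to reach roughly 0.995; components with longer repair times then need proportionally larger MTBFs.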
Component-based Systems Reconfigurations Using Graph Grammars
Directory of Open Access Journals (Sweden)
O. Kouchnarenko
2016-01-01
Dynamic reconfigurations can modify the architecture of component-based systems without incurring any system downtime. In this context, the main contribution of the present article is the establishment of correctness results proving component-based systems reconfigurations using graph grammars. New guarded reconfigurations allow us to build reconfigurations based on primitive reconfiguration operations using sequences of reconfigurations and the alternative and the repetitive constructs, while preserving configuration consistency. A practical contribution consists of the implementation of a component-based model using the GROOVE graph transformation tool. Then, after enriching the model with interpreted configurations and reconfigurations in a consistency compatible manner, a simulation relation is exploited to validate component systems' implementations. This sound implementation is illustrated on a cloud-based multitier application hosting environment managed as a component-based system.
How to make a greedy heuristic for the asymmetric traveling salesman problem competitive
Goldengorin, B.; Jäger, G.
2005-01-01
It is widely confirmed by many computational experiments that greedy-type heuristics for the Traveling Salesman Problem (TSP) produce rather poor solutions except for the Euclidean TSP. The selection of arcs to be included by a greedy heuristic is usually done on the basis of cost values. We ...
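The cost-based arc selection mentioned above can be sketched as follows; this is the generic greedy construction for the asymmetric TSP on a cost matrix, not the authors' improved variant:

```python
def greedy_atsp(cost):
    """Greedy arc-selection heuristic for the asymmetric TSP.

    Arcs are scanned in order of increasing cost; an arc (i, j) is kept
    unless it would give a city a second outgoing or incoming arc, or
    close a subtour before all n cities are connected.
    """
    n = len(cost)
    arcs = sorted((cost[i][j], i, j)
                  for i in range(n) for j in range(n) if i != j)
    succ, pred = {}, {}
    for c, i, j in arcs:
        if i in succ or j in pred:
            continue                 # out-/in-degree already used
        # walk forward from j; reaching i would close a premature cycle
        k = j
        while k in succ:
            k = succ[k]
        if k == i and len(succ) + 1 < n:
            continue
        succ[i], pred[j] = j, i
        if len(succ) == n:
            break
    # read off the tour starting from city 0
    tour, k = [0], succ[0]
    while k != 0:
        tour.append(k)
        k = succ[k]
    return tour
```

On an instance whose cheapest arcs happen to form a tour the heuristic is optimal; on general asymmetric instances it can be far from optimal, which is precisely the weakness the abstract refers to.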
Managing Heuristics as a Method of Inquiry in Autobiographical Graphic Design Theses
Ings, Welby
2011-01-01
This article draws on case studies undertaken in postgraduate research at AUT University, Auckland. It seeks to address a number of issues related to heuristic inquiries employed by graphic design students who use autobiographical approaches when developing research-based theses. For this type of thesis, heuristics as a system of inquiry may…
A new iterative heuristic to solve the joint replenishment problem using a spreadsheet technique
Nilsson, A.; Segerstedt, A.; van der Sluis, E.
2007-01-01
In this paper, a heuristic method is presented which gives a novel approach to solving joint replenishment problems (JRP) with strict cycle policies. The heuristic solves the JRP in an iterative procedure and is based on a spreadsheet technique. The principle of the recursion procedure is to find a ba...
Memorability in Context: A Heuristic Story.
Geurten, Marie; Meulemans, Thierry; Willems, Sylvie
2015-01-01
We examined children's ability to employ a metacognitive heuristic based on memorability expectations to reduce false recognitions, and explored whether these expectations depend on the context in which the items are presented. Specifically, 4-, 6-, and 9-year-old children were presented with high-, medium-, and low-memorability words, either mixed together (Experiment 1) or separated into two different lists (Experiment 2). Results revealed that only children with a higher level of executive functioning (9-year-olds) used the memorability-based heuristic when all types of items were presented within the same list. However, all children, regardless of age or executive level, implemented the metacognitive rule when high- and low-memorability words were presented in two separate lists. Moreover, the results of Experiment 2 showed that participants processed medium-memorability words more conservatively when they were presented in a low- than in a high-memorability list, suggesting that children's memorability expectations are sensitive to list-context effects.
Component-based integration of chemistry and optimization software.
Kenny, Joseph P; Benson, Steven J; Alexeev, Yuri; Sarich, Jason; Janssen, Curtis L; McInnes, Lois Curfman; Krishnan, Manojkumar; Nieplocha, Jarek; Jurrus, Elizabeth; Fahlstrom, Carl; Windus, Theresa L
2004-11-15
Typical scientific software designs make rigid assumptions regarding programming language and data structures, frustrating software interoperability and scientific collaboration. Component-based software engineering is an emerging approach to managing the increasing complexity of scientific software. Component technology facilitates code interoperability and reuse. Through the adoption of methodology and tools developed by the Common Component Architecture Forum, we have developed a component architecture for molecular structure optimization. Using the NWChem and Massively Parallel Quantum Chemistry packages, we have produced chemistry components that provide capacity for energy and energy derivative evaluation. We have constructed geometry optimization applications by integrating the Toolkit for Advanced Optimization, Portable Extensible Toolkit for Scientific Computation, and Global Arrays packages, which provide optimization and linear algebra capabilities. We present a brief overview of the component development process and a description of abstract interfaces for chemical optimizations. The components conforming to these abstract interfaces allow the construction of applications using different chemistry and mathematics packages interchangeably. Initial numerical results for the component software demonstrate good performance, and highlight potential research enabled by this platform.
Reo: A Channel-based Coordination Model for Component Composition
Arbab, F.
2004-01-01
In this paper, we present Reo, which forms a paradigm for composition of software components based on the notion of mobile channels. Reo is a channel-based exogenous coordination model in which complex coordinators, called connectors, are compositionally built out of simpler ones. The simplest conne...
A channel-based coordination model for component composition
Arbab, F.
2002-01-01
In this paper, we present ρεω, a paradigm for composition of software components based on the notion of mobile channels. ρεω is a channel-based exogenous coordination model wherein complex coordinators, called connectors, are compositionally built out of simpler ones.
Isocyanide based multi component reactions in combinatorial chemistry.
Dömling, A.
1998-01-01
Although usually regarded as a recent development, the combinatorial approach to the synthesis of libraries of new drug candidates was first described as early as 1961, using the isocyanide-based one-pot multicomponent Ugi reaction. Isocyanide-based multi-component reactions (MCRs) markedly differ f...
CCPA: Component-based communication protocol architecture for embedded systems
Institute of Scientific and Technical Information of China (English)
DAI Hong-jun; CHEN Tian-zhou; CHEN Chun
2005-01-01
To meet the increased and varied communication requirements of modern applications on embedded systems, general-purpose protocol stacks and protocol models are not efficient because they are fixed to execute in a static mode. We present the Component-Based Communication Protocol Architecture (CCPA) to make communication dynamic and configurable. It can develop, test and store customized components for flexible reuse. The protocols are implemented by component assembly and supported by configurable environments. This leads to smaller memory footprints, more flexibility, better reconfiguration ability, better concurrency, and multiple data channel support.
Secure Wireless Embedded Systems Via Component-based Design
DEFF Research Database (Denmark)
Hjorth, Theis S.; Torbensen, R.
2010-01-01
This paper introduces the method secure-by-design as a way of constructing wireless embedded systems using component-based modeling frameworks. This facilitates design of secure applications through verified, reusable software. Following this method we propose a security framework with a secure...... communication component for distributed wireless embedded devices. The components communicate using the Secure Embedded Exchange Protocol (SEEP), which has been designed for flexible trust establishment so that small, resource-constrained, wireless embedded systems are able to communicate short command messages...
Algorithms for Synthesizing Priorities in Component-based Systems
Cheng, Chih-Hong; Chen, Yu-Fang; Yan, Rongjie; Jobstmann, Barbara; Ruess, Harald; Buckl, Christian; Knoll, Alois
2011-01-01
We present algorithms to synthesize component-based systems that are safe and deadlock-free using priorities, which define stateless-precedence between enabled actions. Our core method combines the concept of fault-localization (using safety-game) and fault-repair (using SAT for conflict resolution). For complex systems, we propose three complementary methods as preprocessing steps for priority synthesis, namely (a) data abstraction to reduce component complexities, (b) alphabet abstraction and #-deadlock to ignore components, and (c) automated assumption learning for compositional priority synthesis.
A Heuristic Hierarchical Scheme for Academic Search and Retrieval
DEFF Research Database (Denmark)
Amolochitis, Emmanouil; Christou, Ioannis T.; Tan, Zheng-Hua
2013-01-01
We present PubSearch, a hybrid heuristic scheme for re-ranking academic papers retrieved from standard digital libraries such as the ACM Portal. The scheme is based on the hierarchical combination of a custom implementation of the term frequency heuristic, a time-depreciated citation score...... and a graph-theoretic computed score that relates the paper’s index terms with each other. We designed and developed a meta-search engine that submits user queries to standard digital repositories of academic publications and re-ranks the repository results using the hierarchical heuristic scheme. We evaluate...... in Information Retrieval including Normalized Discounted Cumulative Gain (NDCG), Expected Reciprocal Rank (ERR) as well as a newly introduced lexicographic rule (LEX) of ranking search results. In particular, PubSearch outperforms ACM Portal by more than 77% in terms of ERR, by more than 11% in terms of NDCG...
Bio-Inspired Meta-Heuristics for Emergency Transportation Problems
Directory of Open Access Journals (Sweden)
Min-Xia Zhang
2014-02-01
Emergency transportation plays a vital role in the success of disaster rescue and relief operations, but its planning and scheduling often involve complex objectives and search spaces. In this paper, we conduct a survey of recent advances in bio-inspired meta-heuristics, including genetic algorithms (GA), particle swarm optimization (PSO), ant colony optimization (ACO), etc., for solving emergency transportation problems. We then propose a new hybrid biogeography-based optimization (BBO) algorithm, which outperforms some state-of-the-art heuristics on a typical transportation planning problem.
Deriving a Set of Privacy Specific Heuristics for the Assessment of PHRs (Personal Health Records).
Furano, Riccardo F; Kushniruk, Andre; Barnett, Jeff
2017-01-01
As personal health record (PHR) platforms become more widely available, this research focused on the development of privacy heuristics for assessing the privacy of PHRs. Existing sets of heuristics are typically not application-specific and do not address patient-centric privacy as a main concern prior to PHR procurement. A set of privacy-specific heuristics was developed based on a scoping review of the literature. An Internet-based, commercially available, vendor-specific PHR application was evaluated using the derived set of privacy-specific heuristics. The proposed set of derived privacy-specific heuristics is explored in detail in relation to ISO 29100. The assessment of the PHR application indicated numerous violations, which were noted within the study. It is argued that the new derived privacy heuristics should be used in addition to Nielsen's well-established set of heuristics. Privacy-specific heuristics could be used to assess PHR portal system-level privacy mechanisms during the procurement of a PHR application and may prove to be a beneficial form of assessment to prevent the selection of a PHR platform with a poor privacy-specific interface design.
Heuristic-based Reform of Visual C++ Programming Teaching (基于启发式的Visual C++程序设计教学改革)
Institute of Scientific and Technical Information of China (English)
孙娜
2011-01-01
This paper describes the problems existing in the traditional teaching of Visual C++ programming courses at domestic universities. Based on the characteristics of the Visual C++ programming course, and drawing on our own teaching practice, we introduce heuristic concepts into the teaching process and propose a heuristic-based reform of programming teaching, which has achieved good results in teaching practice.
A Geographical Heuristic Routing Protocol for VANETs
Directory of Open Access Journals (Sweden)
Luis Urquiza-Aguiar
2016-09-01
Full Text Available Vehicular ad hoc networks (VANETs) leverage the communication system of Intelligent Transportation Systems (ITS). Recently, Delay-Tolerant Network (DTN) routing protocols have gained popularity among the research community for use in non-safety VANET applications and services, such as traffic reporting. Vehicular DTN protocols use geographical and local information to make forwarding decisions. However, current proposals only consider the selection of the best candidate based on a local search. In this paper, we propose a generic Geographical Heuristic Routing (GHR) protocol that can be applied to any DTN geographical routing protocol that makes forwarding decisions hop by hop. GHR incorporates adaptations of the simulated annealing and Tabu-search meta-heuristics, which have long been used to improve local-search results in discrete optimization. We include a complete performance evaluation of GHR in a multi-hop VANET simulation scenario for a reporting service. Our study analyzes all of the meaningful configurations of GHR and offers a statistical analysis of our findings by means of MANOVA tests. Our results indicate that the use of a Tabu list improves the packet delivery ratio by around 5% to 10%. Moreover, if Tabu is used, the simulated annealing routing strategy performs better than the default operation of selecting the best node with carry-and-forward.
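As a rough sketch of this hop-by-hop idea (function and node names are illustrative, not from the paper's actual protocol), a greedy geographic next-hop choice can be restricted by a Tabu list and relaxed by a simulated-annealing acceptance step:

```python
import math
import random

def dist(a, b):
    """Euclidean distance between two (x, y) positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def ghr_next_hop(neighbors, dest, tabu, temperature=0.0, rng=random):
    """Pick the next forwarding hop among `neighbors` (id -> position):
    greedy geographic choice restricted by a Tabu list, optionally
    relaxed by a simulated-annealing acceptance step."""
    candidates = [n for n in neighbors if n not in tabu]
    if not candidates:
        return None  # no admissible neighbor: carry the packet (store-carry-forward)
    best = min(candidates, key=lambda n: dist(neighbors[n], dest))
    if temperature > 0 and len(candidates) > 1:
        other = rng.choice([n for n in candidates if n != best])
        delta = dist(neighbors[other], dest) - dist(neighbors[best], dest)
        if rng.random() < math.exp(-delta / temperature):
            return other  # occasionally accept a worse hop to escape local optima
    return best
```

With temperature 0 the function reduces to plain greedy geographic forwarding; returning `None` models the DTN decision to carry the packet.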
HEURISTIC DISCRETIZATION METHOD FOR BAYESIAN NETWORKS
Directory of Open Access Journals (Sweden)
Mariana D.C. Lima
2014-01-01
Full Text Available Bayesian Network (BN) is a classification technique widely used in Artificial Intelligence. Its structure is a Directed Acyclic Graph (DAG) used to model the association of categorical variables. However, in cases where the variables are numerical, a prior discretization is necessary. Discretization methods are usually based on a statistical approach using the data distribution, such as division by quartiles. In this article we present a discretization using a heuristic that identifies events called peaks and valleys. A genetic algorithm was used to identify these events, with the minimization of the error between the average estimated by the BN and the actual value of the numeric output variable as the objective function. The BN was modeled from a database of drill-bit rate-of-penetration measurements from the Brazilian pre-salt layer, with 5 numerical variables and one categorical variable, using both the proposed discretization and the division of the data by quartiles. The results show that the proposed heuristic discretization has higher accuracy than the quartile discretization.
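The peak/valley events the heuristic relies on can be illustrated with a minimal sketch (variable names are ours; the paper couples this detection with a genetic algorithm, which is omitted here):

```python
def peaks_and_valleys(values):
    """Label each interior point as a 'peak' (local maximum) or a
    'valley' (local minimum) of the numeric series."""
    events = []
    for i in range(1, len(values) - 1):
        if values[i] > values[i - 1] and values[i] > values[i + 1]:
            events.append((i, 'peak'))
        elif values[i] < values[i - 1] and values[i] < values[i + 1]:
            events.append((i, 'valley'))
    return events

def discretize(values, cut_points):
    """Map each numeric value to the index of the interval it falls into,
    e.g. using detected event values as cut points."""
    cuts = sorted(cut_points)
    return [sum(v > c for c in cuts) for v in values]
```

The detected event values (or a genetic-algorithm-tuned subset of them) then serve as the cut points that turn a numeric variable into a categorical one for the BN.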
Sabar, Nasser R; Ayob, Masri; Kendall, Graham; Qu, Rong
2015-02-01
Hyper-heuristics are search methodologies that aim to provide high-quality solutions across a wide variety of problem domains, rather than developing tailor-made methodologies for each problem instance/domain. A traditional hyper-heuristic framework has two levels, namely, the high-level strategy (heuristic selection mechanism and acceptance criterion) and the low-level heuristics (a set of problem-specific heuristics). Due to the different landscape structures of different problem instances, the high-level strategy plays an important role in the design of a hyper-heuristic framework. In this paper, we propose a new high-level strategy for a hyper-heuristic framework. The proposed high-level strategy utilizes a dynamic multi-armed bandit with an extreme-value-based reward as an online heuristic selection mechanism, selecting the appropriate heuristic to apply at each iteration. In addition, we propose a gene expression programming framework to automatically generate the acceptance criterion for each problem instance, instead of using human-designed criteria. Two well-known, and very different, combinatorial optimization problems, one static (exam timetabling) and one dynamic (dynamic vehicle routing), are used to demonstrate the generality of the proposed framework. Compared with state-of-the-art hyper-heuristics and other bespoke methods, empirical results demonstrate that the proposed framework generalizes well across both domains. We obtain competitive, if not better, results when compared to the best known results obtained by other methods presented in the scientific literature. We also compare our approach against the recently released hyper-heuristic competition test suite, and again demonstrate the generality of our approach when compared against other methods that have utilized the same six benchmark datasets from this test suite.
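A hedged sketch of the selection mechanism (the window size, exploration constant and UCB-style bonus below are our illustrative choices, not the paper's exact formulation):

```python
import math
from collections import deque

class ExtremeValueBandit:
    """UCB-style selector that rewards each low-level heuristic with the
    largest (extreme) improvement it produced recently, kept in a
    sliding window."""

    def __init__(self, n_heuristics, window=10, c=2.0):
        self.windows = [deque(maxlen=window) for _ in range(n_heuristics)]
        self.counts = [0] * n_heuristics
        self.total = 0
        self.c = c

    def select(self):
        """Return the index of the heuristic to apply next."""
        self.total += 1
        scores = []
        for i, w in enumerate(self.windows):
            if self.counts[i] == 0:
                return i  # try every heuristic at least once
            reward = max(w) if w else 0.0
            bonus = math.sqrt(self.c * math.log(self.total) / self.counts[i])
            scores.append(reward + bonus)
        return scores.index(max(scores))

    def update(self, i, improvement):
        """Record the improvement heuristic i achieved (negative => 0)."""
        self.counts[i] += 1
        self.windows[i].append(max(0.0, improvement))
```

Heuristics that recently produced large improvements keep being selected; the exploration bonus guarantees that temporarily unproductive heuristics are revisited as the landscape changes.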
Modeling Component-based Bragg gratings Application: tunable lasers
Directory of Open Access Journals (Sweden)
Hedara Rachida
2011-09-01
Full Text Available The principal function of a Bragg grating is filtering. Bragg gratings can be used in fiber-based components and in active or passive semiconductor-based components, as well as in telecommunication systems. They are ideally used with fiber lasers, fiber amplifiers or laser diodes. In this work, we present the principal results obtained from the analysis of various types of Bragg gratings by the coupled-mode method. We then present the operation of tunable DBR lasers. The use of Bragg gratings in a laser provides single-mode, wavelength-agile sources, and the use of sampled gratings increases the tuning range.
Towards a Component Based Model for Database Systems
Directory of Open Access Journals (Sweden)
Octavian Paul ROTARU
2004-02-01
Full Text Available Due to their effectiveness in the design and development of software applications and their recognized advantages in terms of reusability, Component-Based Software Engineering (CBSE) concepts have been arousing a great deal of interest in recent years. This paper presents and extends a component-based approach to object-oriented database systems (OODB) introduced by us in [1] and [2]. Components are proposed as a new abstraction level for database systems: logical partitions of the schema. In this context, the scope is introduced as an escalated property for transactions. Components are studied from the integrity, consistency, and concurrency control perspectives. The main benefits of our proposed component model for OODB are the reusability of the database design, including the access statistics required for proper query optimization, and a smooth information exchange. The integration of crosscutting concerns into the component database model using aspect-oriented techniques is also discussed. One of the main goals is to define a method for the assessment of component composition capabilities. These capabilities are restricted by the component's interface and measured in terms of adaptability, degree of composability and acceptability level. The above-mentioned metrics are extended from database components to generic software components. This paper extends and consolidates into one common view the ideas previously presented by us in [1, 2, 3]. [1] Octavian Paul Rotaru, Marian Dobre, Component Aspects in Object Oriented Databases, Proceedings of the International Conference on Software Engineering Research and Practice (SERP'04), Volume II, ISBN 1-932415-29-7, pages 719-725, Las Vegas, NV, USA, June 2004. [2] Octavian Paul Rotaru, Marian Dobre, Mircea Petrescu, Integrity and Consistency Aspects in Component-Oriented Databases, Proceedings of the International Symposium on Innovation in Information and Communication Technology (ISIICT
Verifying Embedded Systems using Component-based Runtime Observers
DEFF Research Database (Denmark)
Guan, Wei; Marian, Nicolae; Angelov, Christo K.
Formal verification methods, such as exhaustive model checking, are often infeasible because of high computational complexity. Runtime observers (monitors) provide an alternative, lightweight verification method, which offers a non-exhaustive yet feasible approach to monitoring system behavior against formally specified properties. This paper presents a component-based design method for runtime observers, which are configured from instances of prefabricated reusable components: a Predicate Evaluator (PE) and a Temporal Evaluator (TE). The PE computes atomic propositions for the TE; the latter is a reconfigurable component processing a data structure representing the state transition diagram of a non-deterministic state machine, i.e. a Buchi automaton derived from a system property specified in Linear Temporal Logic (LTL). Observer components have been implemented using design models and design patterns, and the method has been experimentally validated in an industrial case study: a control system for a safety-critical medical ventilator unit.
Context sensitivity and ambiguity in component-based systems design
Energy Technology Data Exchange (ETDEWEB)
Bespalko, S.J.; Sindt, A.
1997-10-01
Designers of component-based, real-time systems need to guarantee the correctness of software and its output. The complexity of a system, and thus its propensity for error, is best characterized by the number of states a component can encounter. In many cases, large numbers of states arise where the processing is highly dependent on context. In these cases, states are often missed, leading to errors. This paper proposes compact specifications of system states that allow complex components to be factored into a control module and a semantic processing module. Further, the need for methods that allow the explicit representation of ambiguity and uncertainty in the design of components is discussed. Examples are presented of real-world problems that are highly context-sensitive or inherently ambiguous.
Fast Automatic Heuristic Construction Using Active Learning
Ogilvie, William; Petoumenos, Pavlos; Wang, Zheng; Leather, Hugh
2015-01-01
Building effective optimization heuristics is a challenging task which often takes developers several months, if not years, to complete. Predictive modelling has recently emerged as a promising solution, automatically constructing heuristics from training data. However, obtaining this data can take months per platform. This is becoming an ever more critical problem, and if no solution is found we shall be left with out-of-date heuristics which cannot extract the best performance from modern mach...
Maximum flow-based resilience analysis: From component to system
Jin, Chong; Li, Ruiying; Kang, Rui
2017-01-01
Resilience, the ability to withstand disruptions and recover quickly, must be considered during system design, because any disruption of the system may cause considerable loss, both economic and societal. This work develops analytic maximum flow-based resilience models for series and parallel systems using Zobel's resilience measure. The two analytic models can be used to quantitatively evaluate and compare the resilience of systems with the corresponding performance structures. For systems with identical components, the resilience of the parallel system increases with the number of components, while the resilience of the series system remains constant. A Monte Carlo-based simulation method is also provided to verify the correctness of our analytic resilience models and to analyze the resilience of networked systems based on that of their components. A road network example is used to illustrate the analysis process, and the resilience comparison among networks with different topologies but the same components indicates that a system with redundant performance is usually more resilient than one without. However, not all redundant component capacities improve system resilience; the effectiveness of capacity redundancy depends on where the redundant capacity is located. PMID:28545135
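The series/parallel observations can be illustrated with a small sketch; the `resilience` formula below is our paraphrase of a Zobel-style triangular loss-and-recovery measure, not the paper's exact model:

```python
def series_flow(capacities):
    """Max flow through components in series is limited by the weakest one."""
    return min(capacities)

def parallel_flow(capacities):
    """Parallel components carry flow jointly, so capacities add."""
    return sum(capacities)

def resilience(flow_disrupted, flow_nominal, t_recovery, horizon):
    """Zobel-style resilience: fraction of nominal performance preserved
    over the analysis horizon, assuming a triangular loss-and-recovery
    profile (initial loss fraction recovered linearly over t_recovery)."""
    initial_loss = 1.0 - flow_disrupted / flow_nominal
    return 1.0 - initial_loss * t_recovery / (2.0 * horizon)
```

Adding a redundant component raises the parallel system's flow (and thus its post-disruption performance), while a series system stays pinned at its bottleneck, matching the paper's observation that redundancy only helps where the bottleneck is.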
Heuristic Model Of The Composite Quality Index Of Environmental Assessment
Khabarov, A. N.; Knyaginin, A. A.; Bondarenko, D. V.; Shepet, I. P.; Korolkova, L. N.
2017-01-01
The goal of this paper is to present a heuristic model of a composite environmental quality index based on the integrated application of elements of utility theory, multidimensional scaling, expert evaluation and decision-making. The composite index is synthesized in a linear-quadratic form, which more adequately reflects the assessment preferences of experts and decision-makers.
A global heuristically search algorithm for DNA encoding
Institute of Scientific and Technical Information of China (English)
Zhang Kai; Pan Linqiang; Xu Jin
2007-01-01
A new efficient algorithm is developed to design DNA words of equal length for DNA computing. The algorithm uses a global heuristic optimizing search and converts constraints to a carry number to accelerate convergence, generating a set of DNA words that satisfies thermodynamic and combinatorial constraints. Based on the algorithm, software for DNA word design has been developed.
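A toy stand-in for the word-design task (an exhaustive greedy scan rather than the paper's global heuristic search; the GC-content and Hamming-distance constraint choices below are illustrative):

```python
from itertools import product

def gc_content(word):
    """Fraction of G/C bases in a DNA word."""
    return sum(b in 'GC' for b in word) / len(word)

def hamming(a, b):
    """Number of positions at which two equal-length words differ."""
    return sum(x != y for x, y in zip(a, b))

def design_words(length=4, min_dist=2, gc=0.5, limit=8):
    """Greedily collect DNA words of a given length whose GC content
    matches `gc` and whose pairwise Hamming distance is >= min_dist."""
    words = []
    for cand in product('ACGT', repeat=length):
        w = ''.join(cand)
        if abs(gc_content(w) - gc) > 1e-9:
            continue  # combinatorial constraint: fixed GC content
        if all(hamming(w, u) >= min_dist for u in words):
            words.append(w)
        if len(words) == limit:
            break
    return words
```

Real DNA-word design adds thermodynamic constraints (melting temperature, secondary structure) that this sketch ignores; the point is only the shape of the constrained search.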
Can Component/Service-Based Systems Be Proved Correct?
Attiogbe, Christian
2009-01-01
Component-oriented and service-oriented approaches have generated strong enthusiasm in industry and academia, with a particular interest in service-oriented approaches. A component is a software entity with given functionalities, made available by a provider, and used to build other applications within which it is integrated. The service concept and its use in web-based application development have a huge impact on reuse practices. Accordingly, a considerable part of software architectures is influenced; these architectures are moving towards service-oriented architectures. Therefore applications (re)use services that are available elsewhere, and many applications interact, without knowing each other, using services available via service servers and their published interfaces and functionalities. Industry proposes, through various consortia, languages, technologies and standards. More academic work is also undertaken concerning the semantics and formalisation of component- and service-based systems. We consider...
Nominal and Structural Subtyping in Component-Based Programming
DEFF Research Database (Denmark)
Ostermann, Klaus
2007-01-01
We analyze structural subtyping and different flavors of nominal subtyping from the perspective of component-based programming, where issues such as blame assignment and modular extensibility are important. Our analysis puts various existing subtyping mechanisms into a common frame of reference...
Management of Globally Distributed Component-Based Software Development Projects
J. Kotlarsky (Julia)
2005-01-01
Globally Distributed Component-Based Development (GD CBD) is expected to become a promising area, as increasing numbers of companies are setting up software development in a globally distributed environment and at the same time are adopting CBD methodologies. Being an emerging area, the
RECEIVE ANTENNA SUBSET SELECTION BASED ON ORTHOGONAL COMPONENTS
Institute of Scientific and Technical Information of China (English)
Lan Peng; Liu Ju; Gu Bo; Zhang Wei
2007-01-01
A new receive antenna subset selection algorithm with low complexity for wireless Multiple-Input Multiple-Output (MIMO) systems is proposed, which is based on the orthogonal components of the channel matrix. Larger capacity is achieved compared with the existing antenna selection methods. Simulation results of quasi-static flat fading channel demonstrate the significant performance of the proposed selection algorithm.
Reliability-Based Design of Wind Turbine Components
DEFF Research Database (Denmark)
Toft, Henrik Stensgaard; Sørensen, John Dalsgaard
2010-01-01
Application of reliability-based design for wind turbines requires a definition of the probabilistic basis for the individual components of the wind turbine. In the present paper, reliability-based design of structural wind turbine components is considered, together with a framework for the relevant uncertainties. In wind turbine design, a deterministic design approach based on partial safety factors is normally used. A numerical example demonstrates how information from tests with wind turbine blades can be used to establish a probabilistic basis for reliability-based design. It is also demonstrated how partial safety factors can be derived for reliability-based design and how the partial safety factors change depending on the uncertainty in the test results.
Diagnostic Problem Solving Using First Principles and Heuristics
Institute of Scientific and Technical Information of China (English)
Shen, Yidong; Tong, Fu; et al.
1996-01-01
This paper proposes an approach to diagnostic reasoning with the following distinct features: (1) the diagnostic system is formulated in first-order logic with equality, particularly in the form of program clauses; (2) the abnormality of system components is determined from either the experiential knowledge of domain experts or the behavioral descriptions of the components; (3) heuristics are fully used, not only to assist in judging the abnormality of system components but also to guide the diagnosis; (4) a unique diagnosis is computed for a given observation, provided that certain essential I/O information is supplied when demanded.
Component-based software for high-performance scientific computing
Alexeev, Yuri; Allan, Benjamin A.; Armstrong, Robert C.; Bernholdt, David E.; Dahlgren, Tamara L.; Gannon, Dennis; Janssen, Curtis L.; Kenny, Joseph P.; Krishnan, Manojkumar; Kohl, James A.; Kumfert, Gary; Curfman McInnes, Lois; Nieplocha, Jarek; Parker, Steven G.; Rasmussen, Craig; Windus, Theresa L.
2005-01-01
Recent advances in both computational hardware and multidisciplinary science have given rise to an unprecedented level of complexity in scientific simulation software. This paper describes an ongoing grass roots effort aimed at addressing complexity in high-performance computing through the use of Component-Based Software Engineering (CBSE). Highlights of the benefits and accomplishments of the Common Component Architecture (CCA) Forum and SciDAC ISIC are given, followed by an illustrative example of how the CCA has been applied to drive scientific discovery in quantum chemistry. Thrusts for future research are also described briefly.
Lifting a Butterfly – A Component-Based FFT
Directory of Open Access Journals (Sweden)
Sibylle Schupp
2003-01-01
Full Text Available While modern software engineering, with good reason, tries to establish the idea of reusability and the principles of parameterization and loosely coupled components even for the design of performance-critical software, Fast Fourier Transforms (FFTs) tend to be monolithic and of a very low degree of parameterization. The data structures that hold the input and output data, the element type of these data, the algorithm for computing the so-called twiddle factors, the storage model for a given set of twiddle factors: all are unchangeably defined in the so-called butterfly, restricting its reuse almost entirely. This paper shows a way to a component-based FFT by designing a parameterized butterfly. Based on the technique of lifting, this parameterization includes algorithmic and implementation issues without violating the complexity guarantees of an FFT. The paper demonstrates the lifting process for the Gentleman-Sande butterfly, i.e., the butterfly that underlies the large class of decimation-in-frequency (DIF) FFTs, shows the resulting components, and summarizes the implementation of a component-based, generic DIF library in C++.
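The Gentleman-Sande butterfly and its parameterization by a twiddle "component" can be sketched in a few lines (Python rather than the paper's C++, and without the paper's storage-model and element-type parameters):

```python
import cmath

def gentleman_sande(a, b, w):
    """DIF butterfly: the sum passes straight through, the difference is twiddled."""
    return a + b, (a - b) * w

def twiddle(k, n):
    # one possible twiddle component; the paper makes this choice pluggable
    return cmath.exp(-2j * cmath.pi * k / n)

def dif_fft(x, twiddle=twiddle):
    """Recursive decimation-in-frequency FFT built from the butterfly,
    with the twiddle computation passed in as a parameter."""
    n = len(x)
    if n == 1:
        return list(x)
    half = n // 2
    top, bottom = [], []
    for k in range(half):
        a, b = gentleman_sande(x[k], x[k + half], twiddle(k, n))
        top.append(a)
        bottom.append(b)
    evens = dif_fft(top, twiddle)     # X[2r]
    odds = dif_fft(bottom, twiddle)   # X[2r+1]
    out = [0] * n
    out[0::2] = evens  # interleaving restores natural output order
    out[1::2] = odds
    return out
```

Swapping in a different `twiddle` function (e.g. a precomputed table lookup) changes the implementation without touching the butterfly, which is the essence of the lifting idea.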
Real Time Engineering Analysis Based on a Generative Component Implementation
DEFF Research Database (Denmark)
Kirkegaard, Poul Henning; Klitgaard, Jens
2007-01-01
The present paper outlines the idea of a conceptual design tool with real-time engineering analysis which can be used in the early conceptual design phase. The tool is based on a parametric approach using Generative Components with embedded structural analysis. Each of these components uses the geometry, material properties and fixed-point characteristics to calculate the dimensions and subsequent feasibility of any architectural design. The proposed conceptual design tool allows the architect to work with both the aesthetic and the structural aspects of architecture simultaneously and in real time, without jumping between aesthetic and structural digital design tools. The engineering level of knowledge is incorporated at a conceptual-thinking level, i.e. qualitative rather than quantitative information is used. An example with a statically determinate roof structure modelled by beam components is given. The example outlines the idea of the tool for conceptual design in the early phase of a multidisciplinary design process between architecture and structural engineering.
A probabilistic model for component-based shape synthesis
Kalogerakis, Evangelos
2012-07-01
We present an approach to synthesizing shapes from complex domains, by identifying new plausible combinations of components from existing shapes. Our primary contribution is a new generative model of component-based shape structure. The model represents probabilistic relationships between properties of shape components, and relates them to learned underlying causes of structural variability within the domain. These causes are treated as latent variables, leading to a compact representation that can be effectively learned without supervision from a set of compatibly segmented shapes. We evaluate the model on a number of shape datasets with complex structural variability and demonstrate its application to amplification of shape databases and to interactive shape synthesis.
Heuristic evaluation on mobile interfaces: a new checklist.
Yáñez Gómez, Rosa; Cascado Caballero, Daniel; Sevillano, José-Luis
2014-01-01
The rapid evolution and adoption of mobile devices raise new usability challenges, given their limitations (in screen size, battery life, etc.) as well as the specific requirements of this new interaction. Traditional evaluation techniques need to be adapted in order for these requirements to be met. Heuristic evaluation (HE), an inspection method based on evaluation conducted by experts over a real system or prototype, relies on checklists which are desktop-centred and do not adequately detect mobile-specific usability issues. In this paper, we propose a compilation of heuristic evaluation checklists taken from the existing bibliography but readapted to new mobile interfaces. Selecting and rearranging these heuristic guidelines offers a tool which works well not just for evaluation but also as a best-practices checklist. The result is a comprehensive checklist which is experimentally evaluated as a design tool. This experimental evaluation involved two software engineers without any specific knowledge about usability, and a group of ten users who compared the usability of a first prototype designed without our heuristics against a second one designed after applying the proposed checklist. The results of this experiment show the usefulness of the proposed checklist for avoiding usability gaps even with non-trained developers.
Component Based Effort Estimation During Software Development: Problematic View
Directory of Open Access Journals (Sweden)
VINIT KUMAR
2011-10-01
Full Text Available Component-based software development (CBD) is an emerging discipline that promises to take software engineering into a new era. Building on the achievements of object-oriented software construction, CBD aims to deliver software engineering from a cottage industry into an industrial age for Information Technology, wherein software can be assembled from components, in the manner that hardware systems are currently constructed from kits of parts. Component-based development is a branch of software engineering that emphasizes the separation of concerns in respect of the wide-ranging functionality available throughout a given software system. This practice aims to bring about an equally wide-ranging degree of benefits in both the short term and the long term for the software itself and for organizations that sponsor such software. Software engineers regard components as part of the starting platform for service-orientation. Components play this role, for example, in Web services and, more recently, in service-oriented architectures (SOA), whereby a component is converted by the Web service into a service and subsequently inherits further characteristics beyond those of an ordinary component. Components can produce or consume events and can be used in event-driven architectures (EDA).
Effective Heuristics for New Venture Formation
Kraaijenbrink, Jeroen
2010-01-01
Entrepreneurs are often under time pressure and may only have a short window of opportunity to launch their new venture. This means they often have no time for rational analytical decisions and rather rely on heuristics. Past research on entrepreneurial heuristics has primarily focused on predictive
SA BASED SOFTWARE DEPLOYMENT RELIABILITY ESTIMATION CONSIDERING COMPONENT DEPENDENCE
Institute of Scientific and Technical Information of China (English)
Su Xihong; Liu Hongwei; Wu Zhibo; Yang Xiaozong; Zuo Decheng
2011-01-01
Reliability is one of the most critical properties of a software system. System deployment architecture is the allocation of system software components on host nodes. Software Architecture (SA) based software deployment models help to analyze the reliability of different deployments. Though many approaches for architecture-based reliability estimation exist, little work has incorporated the influence of system deployment and hardware resources into reliability estimation. There are many factors influencing system deployment. By translating these multi-dimensional factors into a degree matrix of component dependence, we provide a definition of component dependence and propose a method of calculating the system reliability of deployments. Additionally, the parameters that influence the optimal deployment may change during system execution. The existing software deployment architecture may be ill-suited for the given environment, and the system then needs to be redeployed to improve reliability. An approximate algorithm, A*_D, to increase system reliability is presented. When the number of components and host nodes is relatively large, experimental results show that this algorithm can obtain better deployments than stochastic and greedy algorithms.
An Improved Greedy Heuristic for Solving the Traveling Salesman Problem
Institute of Scientific and Technical Information of China (English)
Rao, Weizhen; Jin, Chun
2012-01-01
This paper analyzes the characteristics of the classic greedy heuristic (GR) for the Traveling Salesman Problem (TSP) and finds that the main factor limiting GR's solution quality is that the edges added in the late stage of tour construction are too long. Borrowing from the idea of the Held-Karp model, a new Transforming Distance Matrix (TDM) method is constructed. The improved greedy algorithm GR-TDM, obtained by combining GR with TDM, effectively overcomes the traditional GR's weakness of adding long edges late in the construction. Computational results on 40 instances from the TSPLIB benchmark library and the TSP World Challenge website show that GR-TDM takes only 0.5-2% more time than GR, while improving solution quality over traditional GR by 43%. Moreover, comparison with current construction heuristics shows that GR-TDM's performance reaches the level of first-class construction algorithms.
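The classic GR construction referred to in this record can be sketched as follows (plain greedy edge selection with a union-find cycle check; the TDM transformation of the distance matrix itself is not reproduced here):

```python
def greedy_tour(dist):
    """Classic greedy TSP heuristic (GR): repeatedly add the shortest edge
    that keeps every node's degree <= 2 and closes no premature sub-tour."""
    n = len(dist)
    parent = list(range(n))  # union-find to detect sub-tours

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path compression
            x = parent[x]
        return x

    degree = [0] * n
    edges = sorted((dist[i][j], i, j) for i in range(n) for j in range(i + 1, n))
    tour_edges = []
    for d, i, j in edges:
        if degree[i] < 2 and degree[j] < 2 and find(i) != find(j):
            tour_edges.append((i, j))
            degree[i] += 1
            degree[j] += 1
            parent[find(i)] = find(j)
            if len(tour_edges) == n - 1:
                break
    # close the tour between the two remaining degree-1 endpoints
    ends = [v for v in range(n) if degree[v] < 2]
    tour_edges.append((ends[0], ends[1]))
    return tour_edges
```

The final closing edge is exactly where GR tends to commit to long edges; the TDM idea is to reshape `dist` beforehand so that late choices are penalized less severely.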
Heuristics-Guided Exploration of Reaction Mechanisms
Bergeler, Maike; Proppe, Jonny; Reiher, Markus
2015-01-01
For the investigation of chemical reaction networks, the efficient and accurate determination of all relevant intermediates and elementary reactions is indispensable. The complexity of such a network may grow rapidly, in particular if reactive species are involved that might cause a myriad of side reactions. Without automation, a complete investigation of complex reaction mechanisms is tedious and possibly infeasible. Therefore, only the expected dominant reaction paths of a chemical reaction network (e.g., a catalytic cycle or an enzymatic cascade) are usually explored in practice. Here, we present a computational protocol that constructs such networks in a parallelized and automated manner. Molecular structures of reactive complexes are generated based on heuristic rules and subsequently optimized by electronic-structure methods. Pairs of reactive complexes related by an elementary reaction are then automatically detected and subjected to an automated search for the connecting transition state. The results are...
A Hybrid Heuristics for Irregular Flight Recovery
Institute of Scientific and Technical Information of China (English)
ZHAO Xiu-li; ZHU Jin-fu; GAO Qiang
2010-01-01
Adverse weather conditions, congestion at airports, and mechanical failures often disrupt regular flight schedules. The irregular flight recovery problem aims to recover these schedules through reassignments of flights and cancellations. In this article, we develop the classic resource assignment model for the irregular flight recovery problem, and a new hybrid heuristic procedure based on the greedy randomized adaptive search procedure (GRASP) and the simulated annealing algorithm is presented to solve this problem. Compared with the original GRASP method, the proposed algorithm demonstrates a high global optimization capability. Computational experiments on large-scale problems show that the proposed procedure is able to generate feasible revised flight schedules of good quality in less than five seconds.
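The GRASP-plus-annealing structure can be sketched generically (the parameter values and the restricted-candidate-list rule below are illustrative, not the paper's flight-recovery model):

```python
import math
import random

def grasp_sa(candidates, cost, neighbors, iters=20, alpha=0.3,
             t0=1.0, cooling=0.9, seed=0):
    """GRASP construction (start from a restricted candidate list of the
    best alpha-fraction) followed by a simulated-annealing improvement
    phase, echoing the hybrid heuristic's two-stage structure."""
    rng = random.Random(seed)
    best, best_cost = None, float('inf')
    ranked = sorted(candidates, key=cost)
    rcl = ranked[:max(1, int(alpha * len(ranked)))]  # restricted candidate list
    for _ in range(iters):
        current = rng.choice(rcl)  # greedy-randomized construction
        t = t0
        for _ in range(50):  # SA improvement phase
            nxt = rng.choice(neighbors(current))
            delta = cost(nxt) - cost(current)
            # accept improvements always, worsenings with Boltzmann probability
            if delta < 0 or rng.random() < math.exp(-delta / t):
                current = nxt
            if cost(current) < best_cost:
                best, best_cost = current, cost(current)
            t *= cooling
    return best, best_cost
```

In the flight-recovery setting, "candidates" would be partial reassignment decisions and "neighbors" would swap or cancel flights; here the skeleton is shown on an abstract cost function.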
Tracy Tomlinson; Julian N. Marewski; Michael Dougherty
2011-01-01
The recognition heuristic assumes that people make inferences based on the output of recognition memory. While much work has been devoted to establishing the recognition heuristic as a viable description of how people make inferences, more work is needed to fully integrate research on the recognition heuristic with research from the broader cognitive psychology literature. In this article, we outline four challenges that should be met for this integration to take place, and close with a call ...
State Inspection for Transmission Lines Based on Independent Component Analysis
Institute of Scientific and Technical Information of China (English)
REN Li-jia; JIANG Xiu-chen; SHENG Ge-hao; YANG Wei-wei
2009-01-01
Monitoring transmission towers is of great importance to prevent severe thefts on them and to ensure the reliability and safety of power grid operation. Independent component analysis (ICA) is a method for finding underlying factors or components in multivariate statistical data based on dimension reduction, and it is applicable to extracting non-stationary signals. In this paper, FastICA based on negentropy is presented to effectively extract and separate the vibration signals caused by human activity. A new method combining the empirical mode decomposition (EMD) technique with an adaptive threshold method is applied to extract the vibration pulses and suppress interference signals. Practical tests demonstrate that the proposed method is effective in separating and extracting the vibration signals.
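The negentropy-based FastICA step named in this abstract can be sketched from scratch as a simplified one-unit deflation scheme with the tanh contrast. This is a generic textbook formulation, not the paper's implementation, and the EMD/adaptive-threshold stages are not reproduced:

```python
import numpy as np

def fastica(X, n_components, iters=200, seed=0):
    """One-unit FastICA with deflation and the tanh (negentropy-based) contrast.

    X: (n_samples, n_signals) mixed observations.
    Returns estimated sources, shape (n_samples, n_components).
    """
    rng = np.random.default_rng(seed)
    X = X - X.mean(axis=0)                       # center
    # Whiten via eigendecomposition of the covariance matrix.
    cov = np.cov(X, rowvar=False)
    d, E = np.linalg.eigh(cov)
    Z = X @ E @ np.diag(1.0 / np.sqrt(d)) @ E.T

    W = np.zeros((n_components, Z.shape[1]))
    for k in range(n_components):
        w = rng.standard_normal(Z.shape[1])
        w /= np.linalg.norm(w)
        for _ in range(iters):
            wx = Z @ w
            g, g_prime = np.tanh(wx), 1.0 - np.tanh(wx) ** 2
            # Fixed-point update: w <- E[z g(w'z)] - E[g'(w'z)] w
            w_new = (Z * g[:, None]).mean(axis=0) - g_prime.mean() * w
            # Deflation: stay orthogonal to previously found components.
            w_new -= W[:k].T @ (W[:k] @ w_new)
            w_new /= np.linalg.norm(w_new)
            if abs(abs(w_new @ w) - 1.0) < 1e-9:  # converged (up to sign)
                w = w_new
                break
            w = w_new
        W[k] = w
    return Z @ W.T
```

On a mixture of a sine and a square wave this recovers both sources up to sign and scale, which is the inherent ambiguity of ICA.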
Education Knowledge System Combination Model Based on the Components
Institute of Scientific and Technical Information of China (English)
CHEN Lei; LI Dehua; LI Xiaojian; WU Chunxiang
2007-01-01
Resources are the base and core of education information, but current web education resources are unstructured, which makes them difficult to reuse, self-assemble, and develop continually. According to the knowledge structure of courses and texts, and the relations among knowledge points, knowledge units, and media materials across three levels, we can build education resource components and construct TKCM (Teaching Knowledge Combination Model) on top of them. Builders can construct and assemble the knowledge system structure so that knowledge units can be self-assembled and then developed and improved continually. Users can likewise reassemble and renew knowledge units, building an education knowledge system that satisfies their own demands.
Component-based analysis of embedded control applications
DEFF Research Database (Denmark)
Angelov, Christo K.; Guan, Wei; Marian, Nicolae
2011-01-01
… configuration of applications from validated design models and trusted components. This design philosophy has been instrumental for developing COMDES—a component-based framework for distributed embedded control systems. A COMDES application is conceived as a network of embedded actors that are configured from instances of reusable, executable components—function blocks (FBs). System actors operate in accordance with a timed multitasking model of computation, whereby I/O signals are exchanged with the controlled plant at precisely specified time instants, resulting in the elimination of I/O jitter. The paper presents an analysis technique that can be used to validate COMDES design models in SIMULINK. It is based on a transformation of the COMDES design model into a SIMULINK analysis model, which preserves the functional and timing behaviour of the application. This technique has been employed to develop…
Heuristic Scheduling Algorithm Oriented Dynamic Tasks for Imaging Satellites
Directory of Open Access Journals (Sweden)
Maocai Wang
2014-01-01
Full Text Available Imaging satellite scheduling is an NP-hard problem with many complex constraints. This paper studies the scheduling problem for dynamic tasks oriented to emergency cases. After analyzing the dynamic properties of satellite scheduling, an optimization model is proposed. Based on the model, two heuristic algorithms are proposed to solve the problem. The first, named the IDI algorithm, arranges new tasks by inserting or deleting them, then reinserting them repeatedly according to priority from low to high. The second, called ISDR, adopts four steps: insert directly, insert by shifting, insert by deleting, and reinsert the deleted tasks. Moreover, two heuristic factors, the congestion degree of a time window and the overlapping degree of a task, are employed to improve the algorithms' performance. Finally, a case is given to test the algorithms. The results show that the IDI algorithm is better than ISDR in running time, while the ISDR algorithm with heuristic factors is more effective in solution quality. The results also show that our method performs well for larger sets of dynamic tasks in comparison with two other methods.
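The insert-directly / insert-by-deleting steps can be sketched in a deliberately simplified setting: a single imaging resource and tasks with fixed time windows and priorities. This toy setting and the data layout are assumptions for illustration, not the paper's model:

```python
def schedule_dynamic(existing, new_tasks):
    """Greedy insert-or-delete heuristic in the spirit of the IDI algorithm.

    Tasks are tuples (name, start, end, priority). New tasks are tried in
    decreasing priority; conflicting scheduled tasks are deleted only when
    every one of them has lower priority than the incoming task.
    """
    plan = list(existing)

    def conflicts(task, current):
        _, s, e, _ = task
        # Two windows conflict unless one ends before the other starts.
        return [t for t in current if not (e <= t[1] or t[2] <= s)]

    for task in sorted(new_tasks, key=lambda t: -t[3]):
        clash = conflicts(task, plan)
        if not clash:
            plan.append(task)                           # insert directly
        elif all(c[3] < task[3] for c in clash):
            plan = [t for t in plan if t not in clash]  # insert by deleting
            plan.append(task)
        # otherwise the new task is rejected
    return sorted(plan, key=lambda t: t[1])
```

The real algorithms additionally shift tasks within their visibility windows and reinsert deleted tasks, which this sketch omits.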
How cognitive heuristics can explain social interactions in spatial movement.
Seitz, Michael J; Bode, Nikolai W F; Köster, Gerta
2016-08-01
The movement of pedestrian crowds is a paradigmatic example of collective motion. The precise nature of individual-level behaviours underlying crowd movements has been subject to a lively debate. Here, we propose that pedestrians follow simple heuristics rooted in cognitive psychology, such as 'stop if another step would lead to a collision' or 'follow the person in front'. In other words, our paradigm explicitly models individual-level behaviour as a series of discrete decisions. We show that our cognitive heuristics produce realistic emergent crowd phenomena, such as lane formation and queuing behaviour. Based on our results, we suggest that pedestrians follow different cognitive heuristics that are selected depending on the context. This differs from the widely used approach of capturing changes in behaviour via model parameters and leads to testable hypotheses on changes in crowd behaviour for different motivation levels. For example, we expect that rushed individuals more often evade to the side and thus display distinct emergent queue formations in front of a bottleneck. Our heuristics can be ranked according to the cognitive effort that is required to follow them. Therefore, our model establishes a direct link between behavioural responses and cognitive effort and thus facilitates a novel perspective on collective behaviour.
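The 'stop if another step would lead to a collision' rule can be illustrated with a deliberately minimal toy: pedestrians on a one-dimensional corridor of integer cells. This setup is an assumption for illustration and is far simpler than the paper's spatial model, but it reproduces the queuing behaviour the rule implies:

```python
def step(positions, target):
    """One synchronous-style update of a collision-avoiding step heuristic.

    positions: list of integer cells, one per pedestrian, all moving toward
    `target`. Each pedestrian advances one cell unless that cell is taken,
    which yields simple queue formation in front of a blocked pedestrian.
    """
    occupied = set(positions)
    new_positions = []
    # Update the pedestrian closest to the target first, as those behind
    # can only move into cells freed ahead of them.
    for p in sorted(positions, key=lambda x: abs(target - x)):
        nxt = p + (1 if target > p else -1) if p != target else p
        if nxt in occupied and nxt != p:
            nxt = p                      # stop: a step would collide
        occupied.discard(p)
        occupied.add(nxt)
        new_positions.append(nxt)
    return sorted(new_positions)
```

When the lead pedestrian is stuck at the target cell, followers pile up behind it, a one-dimensional analogue of the emergent queues the paper discusses.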
Hendricks, Robert C.; Zaretsky, Erwin V.
2001-01-01
Critical component design is based on minimizing product failures that result in loss of life. Potential catastrophic failures are reduced to secondary failures when components are removed for cause or for operating time in the system. Issues of liability and the cost of component removal then become of paramount importance. Deterministic design with factors of safety and probabilistic design address, but lack, the essential characteristics for the design of critical components. In deterministic design and fabrication there are heuristic rules and safety factors developed over time for large sets of structural/material components. These factors did not come without cost: many designs failed, and many rules (codes) have standing committees to oversee their proper usage and enforcement. In probabilistic design, not only are failures a given, the failures are calculated; an element of risk is assumed based on empirical failure data for large classes of component operations. Failure of a class of components can be predicted, yet one cannot predict when a specific component will fail. The analogy is to the life insurance industry, where very careful statistics are bookkept on classes of individuals. For a specific class, life span can be predicted within statistical limits, yet the life span of a specific member of that class cannot be predicted.
Heuristic Algorithms Applied to Train Station Parking using information of Transponders
2013-01-01
Train station parking has received increasing attention as platform screen doors (PSDs) are widely used in urban rail transit. Aiming to enhance the accuracy and robustness of train station parking, we propose three algorithms, namely the Newton Dynamics based Algorithm (NDA), the Heuristic Learning based Algorithm (HLA), and the Heuristic Algorithm based on deceleration deviation Sequences (HAS), using the information of transponders, essential locating equipment in subways. Then we verify the ...
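The Newton-dynamics idea behind such parking algorithms reduces, in its simplest form, to constant-deceleration kinematics: predict the stopping point from the current speed and braking rate, and re-solve the braking rate each time a transponder provides an exact position. The following is a minimal sketch of that kinematic core only, under the (assumed) idealization of constant deceleration:

```python
def predicted_stop(v, decel):
    """Stopping distance from kinematics: d = v^2 / (2a)."""
    return v * v / (2.0 * decel)

def braking_decel(v, dist_to_stop):
    """Deceleration needed to stop exactly at the platform: a = v^2 / (2d).

    Each time a transponder fixes the train's position, the remaining
    distance is corrected and the braking rate re-solved from this formula.
    """
    return v * v / (2.0 * dist_to_stop)
```

The heuristic-learning variants in the paper instead adjust for systematic deviations between the commanded and realized deceleration, which this idealized sketch does not model.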
QoS Guided Min-Min Heuristic for Grid Task Scheduling
Institute of Scientific and Technical Information of China (English)
HE XiaoShan(何晓珊); SUN XianHe(孙贤和); Gregor von Laszewski
2003-01-01
Task scheduling is an integrated component of computing. With the emergence of Grid and ubiquitous computing, new challenges appear in task scheduling based on properties such as security, quality of service, and lack of central control within distributed administrative domains. A Grid task scheduling framework must be able to deal with these issues. One of the goals of Grid task scheduling is to achieve high system throughput while matching applications with the available computing resources. This matching of resources in a non-deterministically shared heterogeneous environment leads to concerns over Quality of Service (QoS). In this paper a novel QoS-guided task scheduling algorithm for Grid computing is introduced. The proposed algorithm is based on a general adaptive scheduling heuristic that includes QoS guidance. The algorithm is evaluated within a simulated Grid environment. The experimental results show that the new QoS-guided Min-Min heuristic can lead to significant performance gain for a variety of applications. The approach is compared with others based on the quality of the prediction formulated by inaccurate information.
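The QoS-guided Min-Min idea can be sketched as follows: schedule high-QoS tasks first, restricted to QoS-capable hosts, and then let ordinary Min-Min handle the remaining tasks. The two-level QoS model and variable names are simplifying assumptions for illustration:

```python
def qos_min_min(etc, qos_task, qos_host):
    """QoS-guided Min-Min sketch (simplified from the paper's description).

    etc[i][j]: expected time to compute task i on host j.
    qos_task[i]: True if task i requests high QoS; qos_host[j]: True if host
    j can provide it. Returns (assignment, ready) with assignment[i] = host
    chosen for task i and ready[j] = final ready time of host j.
    """
    n_tasks, n_hosts = len(etc), len(etc[0])
    ready = [0.0] * n_hosts              # host ready times
    assignment = [None] * n_tasks

    def min_min(tasks, hosts):
        tasks = set(tasks)
        while tasks:
            # Min-Min rule: among each task's minimum completion time over
            # the allowed hosts, pick the task whose minimum is smallest.
            ct, i, j = min(
                (ready[j] + etc[i][j], i, j) for i in tasks for j in hosts
            )
            assignment[i] = j
            ready[j] = ct
            tasks.remove(i)

    high = [i for i in range(n_tasks) if qos_task[i]]
    low = [i for i in range(n_tasks) if not qos_task[i]]
    min_min(high, [j for j in range(n_hosts) if qos_host[j]])
    min_min(low, range(n_hosts))
    return assignment, ready
```

Scheduling the QoS-constrained tasks first prevents low-QoS tasks from occupying the scarce QoS-capable hosts, which is the core advantage the paper reports over plain Min-Min.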
A NONLINEAR FEASIBILITY PROBLEM HEURISTIC
Directory of Open Access Journals (Sweden)
Sergio Drumond Ventura
2015-04-01
Full Text Available In this work we consider a region S ⊂ ℝⁿ given by a finite number of nonlinear smooth convex inequalities and having nonempty interior. We assume a point x0 is given which is close, in a certain norm, to the analytic center of S, and that a new nonlinear smooth convex inequality is added to those defining S (the perturbed region). It is constructively shown how to obtain a shift of the right-hand side of this inequality such that the point x0 remains close (in the same norm) to the analytic center of this shifted region. Starting from this point and using the theoretical results shown, we develop a heuristic that allows us to obtain the approximate analytic center of the perturbed region. Then, we present a procedure to solve the problem of nonlinear feasibility. The procedure was implemented and we performed some numerical tests for the quadratic (random) case.
Space cryogenics components based on the thermomechanical (TM) effect
Yuan, S. W. K.; Frederking, T. H. K.
1988-01-01
He II vapor-liquid phase separation (VLPS) is discussed, with emphasis on fluid-related transport phenomena. The VLPS system has been studied for both linear and nonlinear regimes, demonstrating that well-defined convection patterns exist in porous plug phase separators. In the second part, other components based on the thermomechanical effect are discussed in the limit of ideal conditions. Examples considered include the heat pipe transfer of zero net mass flow, liquid transfer pumps based on the fountain effect, mechanocaloric devices for cooling purposes, and He II vortex refrigerators.
A generalized GPU-based connected component labeling algorithm
Komura, Yukihiro
2016-01-01
We propose a generalized GPU-based connected component labeling (CCL) algorithm that can be applied to both various lattices and to non-lattice environments in a uniform fashion. We extend our recent GPU-based CCL algorithm without the use of conventional iteration to the generalized method. As an application of this algorithm, we deal with the bond percolation problem. We investigate bond percolation on the honeycomb and triangle lattices to confirm the correctness of this algorithm. Moreover, we deal with bond percolation on the Bethe lattice as a substitute for a network structure, and demonstrate the performance of this algorithm on those lattices.
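The label-equivalence core of connected-component labeling can be sketched on the CPU with union-find over an arbitrary bond list, which is exactly the lattice-agnostic setting the abstract describes; the GPU version resolves the same merges in parallel. The function names are illustrative:

```python
def label_components(edges, n_sites):
    """Connected-component labeling via union-find (CPU sketch of the
    label-equivalence idea; applicable to lattices and networks alike).

    edges: iterable of (a, b) bonds between site indices;
    n_sites: total number of sites.
    Returns a list mapping each site to its component's root label.
    """
    parent = list(range(n_sites))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]      # path halving
            x = parent[x]
        return x

    def union(a, b):
        ra, rb = find(a), find(b)
        if ra != rb:
            parent[max(ra, rb)] = min(ra, rb)  # smaller index wins

    for a, b in edges:
        union(a, b)
    return [find(x) for x in range(n_sites)]
```

For bond percolation, one generates the open bonds at probability p and counts distinct labels to obtain cluster statistics.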
Production Planning and Control Method Based on TOC and Heuristic Rules
Institute of Scientific and Technical Information of China (English)
芮剑锋; 杨建军; 石艳
2012-01-01
To solve the problem of multi-variety, small-batch production planning and control in a complex product manufacturing system, a production planning and control method based on TOC and heuristic rules is proposed, with reference to the APS planning model. The method integrates traditional production planning and control methods, TOC theory, and the characteristics of heuristic rules at different planning and dispatch layers, helping the workshop rapidly and effectively realize production control. The method is implemented on an MES (Manufacturing Execution System) platform, and an application in a manufacturing enterprise verifies its feasibility and effectiveness.
Likelihood-based CT reconstruction of objects containing known components
Energy Technology Data Exchange (ETDEWEB)
Stayman, J. Webster [Johns Hopkins Univ., Baltimore, MD (United States). Dept. of Biomedical Engineering; Otake, Yoshito; Uneri, Ali; Prince, Jerry L.; Siewerdsen, Jeffrey H.
2011-07-01
There are many situations in medical imaging where there are known components within the imaging volume. Such is the case in diagnostic X-ray CT imaging of patients with implants, in intraoperative CT imaging where there may be surgical tools in the field, or in situations where the patient support (table or frame) or other devices are outside the (truncated) reconstruction FOV. In such scenarios it is often of great interest to image the relation between the known component and the surrounding anatomy, or to provide high-quality images at the boundary of these objects, or simply to minimize artifacts arising from such components. We propose a framework for simultaneously estimating the position and orientation of a known component and the surrounding volume. Toward this end, we adopt a likelihood-based objective function with an image volume jointly parameterized by a known object, or objects, with unknown registration parameters and an unknown background attenuation volume. The objective is solved iteratively using an alternating minimization approach between the two parameter types. Because this model integrates a substantial amount of prior knowledge about the overall volume, we expect a number of advantages including the reduction of metal artifacts, potential for more sparse data acquisition (decreased time and dose), and/or improved image quality. We illustrate this approach using simulated spine CT data that contains pedicle screws placed in a vertebra, and demonstrate improved performance over traditional filtered-backprojection and penalized-likelihood reconstruction techniques. (orig.)
Intensified crystallization in complex media: heuristics for crystallization of platform chemicals
Urbanus, J.; Roelands, C.P.M.; Verdoes, D.; Horst, J.H. ter
2012-01-01
This paper presents heuristics for the integration of fermentation with the appropriate crystallization-based in-situ product recovery (ISPR) technique. Techniques such as co-crystallization (CC), evaporative crystallization (EC), template-induced crystallization (TIC), cooling crystallization…
A Component Based Approach to Scientific Workflow Management
Le Goff, Jean-Marie; Baker, Nigel; Brooks, Peter; McClatchey, Richard
2001-01-01
CRISTAL is a distributed scientific workflow system used in the manufacturing and production phases of HEP experiment construction at CERN. The CRISTAL project has studied the use of a description-driven approach, using meta-modelling techniques, to manage the evolving needs of a large physics community. Interest from such diverse communities as bio-informatics and manufacturing has motivated the CRISTAL team to re-engineer the system to customize functionality according to end-user requirements but maximize software reuse in the process. The next-generation CRISTAL vision is to build a generic component architecture from which a complete software product line can be generated according to the particular needs of the target enterprise. This paper discusses the issues of adopting a component-based product-line approach and our experiences of software reuse.
Integration of Simulink Models with Component-based Software Models
DEFF Research Database (Denmark)
Marian, Nicolae; Top, Søren
2008-01-01
A software component-based system aims to organize system architecture and behaviour as a means of computation of abstract system descriptions. Usually, in mechatronics systems, design proceeds by iterating model construction, model analysis, and model transformation. Constructing a MATLAB/Simulink model, a plant and controller behavior is simulated using graphical blocks to represent mathematical and logical constructs and process flow; then software code is generated. A Simulink model is a representation of the design or implementation of a physical system that satisfies a set of requirements. The paper presents the transformation of MATLAB/Simulink blocks to COMDES software components, both for continuous and discrete behaviour, and the transformation of the software system into the S-functions. The general aim of this work is the improvement of multi-disciplinary development of embedded systems with the focus on the relation…
Heuristic Evaluation on Mobile Interfaces: A New Checklist
Rosa Yáñez Gómez; Daniel Cascado Caballero; José-Luis Sevillano
2014-01-01
The rapid evolution and adoption of mobile devices raise new usability challenges, given their limitations (in screen size, battery life, etc.) as well as the specific requirements of this new interaction. Traditional evaluation techniques need to be adapted in order for these requirements to be met. Heuristic evaluation (HE), an Inspection Method based on evaluation conducted by experts over a real system or prototype, is based on checklists which are desktop-centred and do not adeq...
Combined heuristic with fuzzy system to transmission system expansion planning
Energy Technology Data Exchange (ETDEWEB)
Silva Sousa, Aldir; Asada, Eduardo N. [University of Sao Paulo, Sao Carlos School of Engineering, Department of Electrical Engineering Av. Trabalhador Sao-carlense, 400, 13566-590 Sao Carlos, SP (Brazil)
2011-01-15
A heuristic algorithm that employs fuzzy logic is proposed for the power system transmission expansion planning problem. The algorithm is based on the divide-and-conquer strategy, which is controlled by the fuzzy system. The algorithm provides high-quality solutions through fuzzy decision making, which uses nondeterministic criteria to guide the search. The fuzzy system provides a self-adjusting mechanism that eliminates the manual adjustment of parameters for each system being solved. (author)
A Variable Depth Sequential Search Heuristic for the Quadratic Assignment Problem
Paul, Gerald
2009-01-01
We develop a variable depth search heuristic for the quadratic assignment problem. The heuristic is based on sequential changes in assignments analogous to the Lin-Kernighan sequential edge moves for the traveling salesman problem. We treat unstructured problem instances of sizes 60 to 400. When the heuristic is used in conjunction with robust tabu search, we measure performance improvements of up to a factor of 15 compared to the use of robust tabu alone. The performance improvement increases as the problem size increases.
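As background for the variable-depth idea, the simplest QAP local search is a best-improvement pair-exchange descent; the paper's heuristic chains such exchanges into longer sequences in the spirit of Lin-Kernighan. The following baseline sketch (not the paper's algorithm) shows the exchange move it generalizes:

```python
def qap_cost(F, D, p):
    """Cost of permutation p for flow matrix F and distance matrix D."""
    n = len(p)
    return sum(F[i][j] * D[p[i]][p[j]] for i in range(n) for j in range(n))

def pairwise_exchange(F, D, p):
    """Best-improvement pair-exchange descent for the QAP (illustrative
    baseline; a variable-depth heuristic chains such swaps into sequences).
    """
    p = list(p)
    improved = True
    while improved:
        improved = False
        base = qap_cost(F, D, p)
        best_delta, best_swap = 0, None
        n = len(p)
        for a in range(n):
            for b in range(a + 1, n):
                p[a], p[b] = p[b], p[a]            # try the swap
                delta = qap_cost(F, D, p) - base
                p[a], p[b] = p[b], p[a]            # undo
                if delta < best_delta:
                    best_delta, best_swap = delta, (a, b)
        if best_swap:
            a, b = best_swap
            p[a], p[b] = p[b], p[a]
            improved = True
    return p, qap_cost(F, D, p)
```

Recomputing the full cost per trial swap is O(n^2); practical implementations use an O(n) incremental delta, which this sketch trades away for clarity.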
Combined heuristics for determining order quantity under time-varying demands
Institute of Scientific and Technical Information of China (English)
无
2008-01-01
The time-varying demands for a certain period are often assumed to be less than the basic economic order quantity (EOQ), so most heuristics consider the total replenishment quantity rather than the economic order quantity. This article focuses on a combined heuristic method for determining order quantity under generalized time-varying demands. The independent policy (IP), the abnormal independent policy (AIP), and dependent policies are studied and compared. Using the concepts of normal/abnormal periods and the properties of dependent policies, a dependent-policy-based heuristic (DPH) is proposed for solving order quantity problems with a time-varying demand pattern in which the first period is normal. By merging the Silver-Meal (S-M) heuristic and the DPH, a combined heuristic (DPH/S-M) is developed for solving order quantity problems with generalized time-varying demands. Experiments show that (1) for problems with one normal period, wherever that period stands, the DPH/S-M is not guaranteed to outperform the S-M heuristic, although it is superior when the demands in the abnormal periods are in descending order, and (2) the DPH/S-M is superior to the S-M heuristic for problems with more than one normal period, and the more normal periods there are, the greater the improvement.
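The Silver-Meal component of the combined heuristic is a standard lot-sizing rule: extend each order's coverage while the average cost per period keeps decreasing. A minimal sketch of that S-M step alone (not the DPH/S-M combination) under the usual constant-holding-cost assumption:

```python
def silver_meal(demand, setup_cost, holding_cost):
    """Silver-Meal heuristic for time-varying demand.

    demand: per-period demands; setup_cost: fixed cost per order;
    holding_cost: cost per unit held per period.
    Returns {period_index: order_quantity} (0-based periods).
    """
    orders = {}
    t, n = 0, len(demand)
    while t < n:
        best_avg = setup_cost          # average cost covering only period t
        T = 1                          # number of periods this order covers
        cum_hold = 0.0
        while t + T < n:
            # Holding cost if period t+T's demand joins this order:
            cum_hold += holding_cost * T * demand[t + T]
            avg = (setup_cost + cum_hold) / (T + 1)
            if avg >= best_avg:        # average cost stopped decreasing
                break
            best_avg, T = avg, T + 1
        orders[t] = sum(demand[t:t + T])
        t += T
    return orders
```

With demands [50, 60, 90, 70], a setup cost of 100, and unit holding cost 1, the rule orders 110 units in period 0 (covering periods 0-1) and 160 in period 2 (covering periods 2-3), since adding period 2 to the first order would raise the average cost per period from 80 to about 113.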
A heuristic algorithm for scheduling in a flow shop environment to minimize makespan
Directory of Open Access Journals (Sweden)
Arun Gupta
2015-04-01
Full Text Available Scheduling 'n' jobs on 'm' machines in a flow shop is an NP-hard problem and holds a prominent place in the area of production scheduling. The essence of any scheduling algorithm is to minimize the makespan in a flow-shop environment. In this paper an attempt has been made to develop a heuristic algorithm, based on the reduced weightage of machines at each stage, to generate different combinations of 'm-1' sequences. The proposed heuristic has been tested on several benchmark problems of Taillard (1993) [Taillard, E. (1993). Benchmarks for basic scheduling problems. European Journal of Operational Research, 64, 278-285.]. The performance of the proposed heuristic is compared with three well-known heuristics, namely Palmer's heuristic, Campbell's CDS heuristic, and Dannenbring's rapid access heuristic. Results are evaluated against the best-known upper-bound solutions and found better than the above three.
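Any such comparison rests on two standard ingredients: evaluating the makespan of a permutation, and a reference heuristic such as Palmer's slope index. Both can be sketched briefly (the sign convention for the slope index varies across textbooks; the one below orders jobs with times skewed toward later machines first):

```python
def makespan(p, order):
    """Completion time of the last job in a permutation flow shop.

    p[j][k]: processing time of job j on machine k; order: job sequence.
    """
    m = len(p[0])
    finish = [0.0] * m                 # per-machine completion times
    for j in order:
        for k in range(m):
            # A job starts on machine k when both the machine is free and
            # the job has finished on machine k-1.
            start = max(finish[k], finish[k - 1] if k else 0.0)
            finish[k] = start + p[j][k]
    return finish[-1]

def palmer_order(p):
    """Palmer's slope-index heuristic: sequence jobs by decreasing slope."""
    m = len(p[0])

    def slope(j):
        return sum((2 * (k + 1) - m - 1) * p[j][k] for k in range(m))

    return sorted(range(len(p)), key=slope, reverse=True)
```

For the two-job, two-machine instance with times [[3, 2], [1, 4]], Palmer's rule sequences job 1 first, which here coincides with Johnson's optimal order.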
Institute of Scientific and Technical Information of China (English)
Jian-Wan Ding; Li-Ping Chen; Fan-Li Zhou
2006-01-01
Object-oriented modeling with declarative, equation-based languages often unconsciously leads to structural inconsistencies. Component-based debugging is a new structural analysis approach that addresses this problem by analyzing the structure of each component in a model to locate faulty components separately. The analysis procedure is performed recursively following the depth-first rule. It first generates fictitious equations for a component to establish a debugging environment, and then detects structural defects by using graph-theoretical approaches to analyze the structure of the system of equations resulting from the component. The proposed method can automatically locate components that cause structural inconsistencies and show the user detailed error messages. This information can be a great help in finding and localizing structural inconsistencies, and in some cases pinpoints them immediately.
A new heuristic for the quadratic assignment problem
Zvi Drezner
2002-01-01
We propose a new heuristic for the solution of the quadratic assignment problem. The heuristic combines ideas from tabu search and genetic algorithms. Run times are very short compared with other heuristic procedures. The heuristic performed very well on a set of test problems.
Component-based assistants for MEMS design tools
Hahn, Kai; Brueck, Rainer; Schneider, Christian; Schumer, Christian; Popp, Jens
2001-04-01
With this paper a new approach for MEMS design tools is introduced. An analysis of the design tool market leads to the conclusion that most designers work with large and inflexible frameworks. Purchasing and maintaining these frameworks is expensive, and they provide no optimal support for the MEMS design process. The concept of design assistants, realized through interacting software components, denotes a new generation of flexible, small, semi-autonomous software systems that are used to solve specific MEMS design tasks in close interaction with the designer. The degree of interaction depends on the complexity of the design task to be performed and the possibility of formalizing the respective knowledge. In this context the Internet, as one of today's most important communication media, provides support for new tool concepts on the basis of the Java programming language. These modern technologies can be used to set up distributed and platform-independent applications. Thus the idea emerged to implement design assistants using Java. According to the MEMS design model, new process sequences have to be defined anew for every specific design object. As a consequence, assistants have to be built dynamically depending on the requirements of the design process, which can be achieved with component-based software development. Componentware offers the possibility to realize design assistants in areas like design rule checks, process consistency checks, technology definitions, graphical editors, etc., that may reside distributed over the Internet, communicating via Internet protocols. At the University of Siegen a directory for reusable MEMS components has been created, containing a process specification assistant and a layout verification assistant for lithography-based MEMS technologies.
Nonlinear Process Fault Diagnosis Based on Serial Principal Component Analysis.
Deng, Xiaogang; Tian, Xuemin; Chen, Sheng; Harris, Chris J
2016-12-22
Many industrial processes contain both linear and nonlinear parts, and kernel principal component analysis (KPCA), widely used in nonlinear process monitoring, may not offer the most effective means for dealing with these nonlinear processes. This paper proposes a new hybrid linear-nonlinear statistical modeling approach for nonlinear process monitoring by closely integrating linear principal component analysis (PCA) and nonlinear KPCA using a serial model structure, which we refer to as serial PCA (SPCA). Specifically, PCA is first applied to extract PCs as linear features, and to decompose the data into the PC subspace and residual subspace (RS). Then, KPCA is performed in the RS to extract the nonlinear PCs as nonlinear features. Two monitoring statistics are constructed for fault detection, based on both the linear and nonlinear features extracted by the proposed SPCA. To effectively perform fault identification after a fault is detected, an SPCA similarity factor method is built for fault recognition, which fuses both the linear and nonlinear features. Unlike PCA and KPCA, the proposed method takes into account both linear and nonlinear PCs simultaneously, and therefore, it can better exploit the underlying process's structure to enhance fault diagnosis performance. Two case studies involving a simulated nonlinear process and the benchmark Tennessee Eastman process demonstrate that the proposed SPCA approach is more effective than the existing state-of-the-art approach based on KPCA alone, in terms of nonlinear process fault detection and identification.
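The serial PCA-then-KPCA structure can be sketched from scratch with NumPy: linear PCA extracts the linear PCs, and kernel PCA with an RBF kernel is then run on the residual part. Parameter names and the kernel choice are illustrative assumptions following the structure described in the abstract, not the authors' code:

```python
import numpy as np

def spca_features(X, n_linear, n_kernel, gamma=1.0):
    """Serial PCA sketch: linear PCA first, then RBF kernel PCA on the
    residual subspace. Returns (linear_scores, kernel_scores).
    """
    X = X - X.mean(axis=0)
    # Linear PCA via SVD; the columns of P span the PC subspace.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    P = Vt[:n_linear].T
    linear_scores = X @ P
    R = X - linear_scores @ P.T          # residual-subspace part

    # Kernel PCA on the residuals with an RBF kernel.
    sq = np.sum(R * R, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * R @ R.T))
    n = K.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J                       # center the kernel matrix
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_kernel]
    # Normalize eigenvectors so projections have unit variance per PC.
    alphas = vecs[:, idx] / np.sqrt(np.maximum(vals[idx], 1e-12))
    kernel_scores = Kc @ alphas
    return linear_scores, kernel_scores
```

For monitoring, T^2 and SPE-type statistics would then be built from `linear_scores` and `kernel_scores` jointly, which is the step the paper's two monitoring statistics formalize.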
NONLINEAR DATA RECONCILIATION METHOD BASED ON KERNEL PRINCIPAL COMPONENT ANALYSIS
Institute of Scientific and Technical Information of China (English)
无
2003-01-01
In the industrial process situation, principal component analysis (PCA) is a general method in data reconciliation. However, PCA is sometimes unfeasible for nonlinear feature analysis and limited in application to nonlinear industrial processes. Kernel PCA (KPCA) is an extension of PCA and can be used for nonlinear feature analysis. A nonlinear data reconciliation method based on KPCA is proposed. The basic idea of this method is that the original data are first mapped to a high-dimensional feature space by a nonlinear function, and PCA is implemented in the feature space. Then nonlinear feature analysis is implemented and the data are reconstructed using the kernel. The data reconciliation method based on KPCA is applied to a ternary distillation column. Simulation results show that this method can filter the noise in measurements of a nonlinear process and that the reconciled data represent the true information of the nonlinear process.
A Direct Heuristic Algorithm for Linear Programming
Indian Academy of Sciences (India)
S K Sen; A Ramful
2000-02-01
An O(n³), mathematically non-iterative heuristic procedure that needs no artificial variables is presented for solving linear programming problems. An optimality test is included. Numerical experiments depict the utility and scope of such a procedure.
Heuristic attacks against graphical password generators
CSIR Research Space (South Africa)
Peach, S
2010-05-01
Full Text Available In this paper the authors explore heuristic attacks against graphical password generators. A new trend is emerging to use user clickable pictures to generate passwords. This technique of authentication can be successfully used for - for example...
A Heuristic and Hybrid Method for the Tank Allocation Problem in Maritime Bulk Shipping
DEFF Research Database (Denmark)
Vilhelmsen, Charlotte; Larsen, Jesper; Lusby, Richard Martin
… and strength as well as other operational constraints. The problem of finding a feasible solution to this tank allocation problem has been shown to be NP-Complete. We approach the problem on a tactical level where requirements for computation time are strict while solution quality is less important than simply finding a feasible solution. We have developed a heuristic that can efficiently find feasible cargo allocations. Computational results show that it can solve 99% of the considered instances within 0.4 seconds and all of them if allowed longer time. We have also modified an optimality-based method from the literature and have created a hybrid method that first runs the heuristic and, if the heuristic fails to solve the problem, then runs the modified optimality-based method on the parts of the problem that the heuristic did not solve. This hybrid method cuts between 90% and 94% of the average running times compared…
An Improved Heuristic Ant-Clustering Algorithm
Institute of Scientific and Technical Information of China (English)
Yunfei Chen; Yushu Liu; Jihai Zhao
2004-01-01
An improved heuristic ant-clustering algorithm (HAC) is presented in this paper. A device called a 'memory bank' is proposed, which can produce heuristic knowledge guiding the ants' movement in the two-dimensional grid space. The algorithm was tested on real and synthetic data sets. The results demonstrate that HAC is superior to the classical algorithm in misclassification error rate and runtime.
Intelligent Heuristic Construction with Active Learning
Ogilvie, William; Petoumenos, Pavlos; Wang, Zheng; Leather, Hugh
2015-01-01
Building effective optimization heuristics is a challenging task which often takes developers several months if not years to complete. Predictive modelling has recently emerged as a promising solution, automatically constructing heuristics from training data. However, obtaining this data can take months per platform. This is becoming an ever more critical problem and if no solution is found we shall be left with out-of-date heuristics which cannot extract the best performance from modern machines. I...
Heuristic Trojan Identification System Based on Network Communication Fingerprint
Institute of Scientific and Technical Information of China (English)
唐彰国; 李换洲; 钟明全; 张健
2011-01-01
This paper compares the principles and characteristics of traditional trojan detection techniques and, to meet the need for trojan detection based on network data flow, proposes a trojan identification method based on analysis of network communication features. The concept of a communication fingerprint is introduced to extend the scope of the communication features. Through experiments, the fingerprint information exhibited by trojans in each phase, such as connection, control and file transfer, is summarized. On that basis, a heuristic identification system for trojans based on network communication fingerprints is designed and implemented. Test results indicate that the system runs efficiently and its detection results are accurate.
Put a limit on it: The protective effects of scarcity heuristics when self-control is low
Cheung, T.T.L.; Kroese, F.M.; Fennis, Bob; de Ridder, D.T.D.
2015-01-01
Low self-control is a state in which consumers are assumed to be vulnerable to making impulsive choices that hurt long-term goals. Rather than increasing self-control, the current research exploits the tendency for heuristic-based thinking in low self-control by employing scarcity heuristics to
An Evaluation of Component-Based Software Design Approaches
Puppin, Diego; Silvestri, Fabrizio; Laforenza, Domenico
2004-01-01
There is growing attention for a component-oriented software design of Grid applications. Within this framework, applications are built by assembling together independently developed software components. A component is a software unit with a clearly defined interface and explicit dependencies. It is designed to be integrated with other components, but independently from them. Unix filters and the pipe composition model, the first successful component-oriented model, allowed more complex appli...
New Heuristic Distributed Parallel Algorithms for Searching and Planning
Institute of Scientific and Technical Information of China (English)
无
1995-01-01
This paper proposes new heuristic distributed parallel algorithms for searching and planning, which are based on the concepts of wave concurrent propagations and competitive activation mechanisms. These algorithms are characterized by simple and clear control strategies for searching, and by distinguished abilities in many aspects, such as high-speed processing, wide suitability for searching AND/OR implicit graphs, and ease of hardware implementation.
HEURISTICAL FEATURE EXTRACTION FROM LIDAR DATA AND THEIR VISUALIZATION
Ghosh, S.; Lohani, B.
2012-01-01
Extraction of landscape features from LiDAR data has been studied widely in the past few years. These feature extraction methodologies have been focussed on certain types of features only, namely the bare earth model, buildings principally containing planar roofs, trees and roads. In this paper, we present a methodology to process LiDAR data through DBSCAN, a density based clustering method, which extracts natural and man-made clusters. We then develop heuristics to process these clu...
Directory of Open Access Journals (Sweden)
D. A. Viattchenin
2009-01-01
Full Text Available A method for constructing a subset of labeled objects, for use in a heuristic algorithm of possibilistic clustering with partial supervision, is proposed in the paper. The method is based on data preprocessing by the heuristic possibilistic clustering algorithm using a transitive closure of a fuzzy tolerance. The method's efficiency is demonstrated by way of an illustrative example.
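The transitive closure of a fuzzy tolerance mentioned above can be computed by repeated max-min composition until the relation stops changing. The sketch below is a minimal illustration of that standard construction; the example relation is hypothetical and not taken from the paper.

```python
# Transitive closure of a fuzzy tolerance via repeated max-min
# composition: T <- max(T, T o T) until a fixed point is reached.

def maxmin_compose(a, b):
    n = len(a)
    return [[max(min(a[i][k], b[k][j]) for k in range(n))
             for j in range(n)] for i in range(n)]

def transitive_closure(t):
    closure = [row[:] for row in t]
    while True:
        comp = maxmin_compose(closure, closure)
        nxt = [[max(closure[i][j], comp[i][j]) for j in range(len(t))]
               for i in range(len(t))]
        if nxt == closure:          # fixed point: relation is transitive
            return nxt
        closure = nxt

# A reflexive, symmetric fuzzy tolerance on three objects (made up).
T = [[1.0, 0.8, 0.0],
     [0.8, 1.0, 0.6],
     [0.0, 0.6, 1.0]]
closed = transitive_closure(T)
# Transitivity lifts the (0, 2) entry to min(0.8, 0.6) = 0.6.
```

The closure is the smallest fuzzy similarity relation containing the original tolerance, which is what makes it usable for cluster extraction.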
Modelling raster-based monthly water balance components for Europe
Energy Technology Data Exchange (ETDEWEB)
Ulmen, C.
2000-11-01
The terrestrial runoff component is a comparatively small but sensitive and thus significant quantity in the global energy and water cycle at the interface between landmass and atmosphere. As opposed to soil moisture and evapotranspiration, which critically determine water vapour fluxes and thus water and energy transport, it can be measured as an integrated quantity over a large area, i.e. the river basin. This peculiarity makes terrestrial runoff ideally suited for the calibration, verification and validation of general circulation models (GCMs). Gauging stations are not homogeneously distributed in space. Moreover, time series are not necessarily continuously measured, nor do they in general have overlapping time periods. To overcome these problems with regard to the regular grid spacing used in GCMs, different methods can be applied to transform irregular data to regular, so-called gridded runoff fields. The present work aims to directly compute the gridded components of the monthly water balance (including gridded runoff fields) for Europe by application of the well-established raster-based macro-scale water balance model WABIMON used at the Federal Institute of Hydrology, Germany. Model calibration and validation are performed by separate examination of 29 representative European catchments. Results indicate a general applicability of the model, delivering reliable overall patterns and integrated quantities on a monthly basis. For time steps of less than two weeks, further research and structural improvements of the model are suggested. (orig.)
Institute of Scientific and Technical Information of China (English)
徐培德; 王建江; 许语拉
2012-01-01
When multiple satellites execute reconnaissance tasks in cooperation, clustering of multiple-satellite imaging reconnaissance tasks can improve the integrated reconnaissance efficiency of the satellites. Based on the clustering relations and constraint conditions among satellite imaging reconnaissance atomic tasks, this paper establishes a mathematical programming model of multiple-satellite imaging reconnaissance task clustering, and then presents a heuristic clustering algorithm based on atomic task insertion to solve the model. Finally, the algorithm is validated by an example.
High Q, Miniaturized LCP-Based Passive Components
Shamim, Atif
2014-10-16
Various methods and systems are provided for high Q, miniaturized LCP-based passive components. In one embodiment, among others, a spiral inductor includes a center connection and a plurality of inductors formed on a liquid crystal polymer (LCP) layer, the plurality of inductors concentrically spiraling out from the center connection. In another embodiment, a vertically intertwined inductor includes first and second inductors including a first section disposed on a side of the LCP layer forming a fraction of a turn and a second section disposed on another side of the LCP layer. At least a portion of the first section of the first inductor is substantially aligned with at least a portion of the second section of the second inductor and at least a portion of the first section of the second inductor is substantially aligned with at least a portion of the second section of the first inductor.
Nonlinear fault diagnosis method based on kernel principal component analysis
Institute of Scientific and Technical Information of China (English)
Yan Weiwu; Zhang Chunkai; Shao Huihe
2005-01-01
To ensure that a system runs in working order, fault detection and diagnosis play an important role in industrial processes. This paper proposes a nonlinear fault diagnosis method based on kernel principal component analysis (KPCA). In the proposed method, essential information about the nonlinear system extracted by KPCA is used to construct a KPCA model of the system under normal working conditions. New data are then projected onto the KPCA model; when new data are incompatible with the model, it can be concluded that the nonlinear system is out of its normal working condition. The proposed method was applied to fault diagnosis of rolling bearings. Simulation results show that it provides an effective means of fault detection and diagnosis for nonlinear systems.
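A minimal sketch of this style of KPCA monitoring, assuming an RBF kernel and a squared-prediction-error (SPE) type fault index; the training data, kernel width and number of components are illustrative choices, not those of the paper.

```python
import numpy as np

def rbf(x, y, gamma=0.5):
    return float(np.exp(-gamma * np.sum((x - y) ** 2)))

class KPCAMonitor:
    def __init__(self, X, n_comp=2, gamma=0.5):
        self.X, self.gamma = X, gamma
        n = len(X)
        K = np.array([[rbf(a, b, gamma) for b in X] for a in X])
        J = np.ones((n, n)) / n
        Kc = K - J @ K - K @ J + J @ K @ J        # center in feature space
        vals, vecs = np.linalg.eigh(Kc)
        order = np.argsort(vals)[::-1][:n_comp]    # keep largest eigenvalues
        self.K = K
        self.alphas = vecs[:, order] / np.sqrt(np.maximum(vals[order], 1e-12))

    def spe(self, x):
        """Residual distance of x from the KPCA model (large => fault)."""
        k = np.array([rbf(x, xi, self.gamma) for xi in self.X])
        kc = k - self.K.mean(axis=0) - k.mean() + self.K.mean()
        z = self.alphas.T @ kc                     # scores on retained PCs
        phi2 = rbf(x, x, self.gamma) - 2 * k.mean() + self.K.mean()
        return phi2 - z @ z

# Normal operating data (made up), then one point far outside it.
Xtrain = np.array([[0.0, 0.0], [0.2, 0.1], [0.1, 0.3], [0.3, 0.2], [0.2, 0.4]])
mon = KPCAMonitor(Xtrain)
normal_spe = mon.spe(np.array([0.15, 0.2]))
fault_spe = mon.spe(np.array([5.0, 5.0]))
```

In practice a threshold on the SPE (and often a Hotelling T² statistic) would be calibrated from the training scores.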
Heuristic Portfolio Trading Rules with Capital Gain Taxes
DEFF Research Database (Denmark)
Fischer, Marcel; Gallmeyer, Michael
We study the out-of-sample performance of portfolio trading strategies when an investor faces capital gain taxation and proportional transaction costs. Under no capital gain taxation and no transaction costs, we show that, consistent with DeMiguel, Garlappi, and Uppal (2009), a simple 1/N trading strategy is not dominated out-of-sample by a variety of optimizing trading strategies, except the parametric portfolios of Brandt, Santa-Clara, and Valkanov (2009). With dividend and realization-based capital gain taxes, the welfare costs of the taxes are large, with the cost being as large as 30% of wealth… For medium to large transaction costs, no trading strategy can outperform a 1/N trading strategy augmented with a tax heuristic, not even the most tax- and transaction-cost efficient buy-and-hold strategy. Overall, the best strategy is 1/N augmented with a heuristic that allows for a fixed deviation in absolute portfolio weights. Our results show that the best trading…
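A rough sketch of the kind of heuristic the abstract describes: hold the 1/N target, but trade only those assets whose weight has drifted more than a fixed absolute band from 1/N. The returns, band width and final renormalization below are illustrative assumptions; taxes and transaction costs are not modelled here.

```python
def drift(weights, returns):
    """Let weights evolve passively with one period of returns."""
    grown = [w * (1.0 + r) for w, r in zip(weights, returns)]
    total = sum(grown)
    return [g / total for g in grown]

def rebalance_1_over_n(weights, delta):
    """Reset only the weights outside the band, then renormalize."""
    target = 1.0 / len(weights)
    banded = [target if abs(w - target) > delta else w for w in weights]
    total = sum(banded)
    return [w / total for w in banded]

w = drift([0.25] * 4, [0.30, 0.02, -0.10, 0.01])  # one asset rallies, one falls
w = rebalance_1_over_n(w, delta=0.03)             # only assets 0 and 2 trade
```

The band reduces turnover (and hence realized gains and costs) relative to full rebalancing every period.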
A novel heuristic algorithm for capacitated vehicle routing problem
Kır, Sena; Yazgan, Harun Reşit; Tüncel, Emre
2017-02-01
The vehicle routing problem with capacity constraints was considered in this paper. It is quite difficult to achieve an optimal solution with traditional optimization methods because of the high computational complexity of large-scale problems. Consequently, new heuristic and metaheuristic approaches have been developed to solve this problem. In this paper, we constructed a new heuristic algorithm based on tabu search and adaptive large neighborhood search (ALNS), with several specifically designed operators and features, to solve the capacitated vehicle routing problem (CVRP). The effectiveness of the proposed algorithm was illustrated on benchmark problems. The algorithm performs better on large-scale instances and gains an advantage in terms of CPU time. In addition, we solved a real-life CVRP using the proposed algorithm and found encouraging results in comparison with the company's current practice.
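The paper's method combines tabu search with ALNS; as a minimal, hypothetical baseline, the sketch below is a capacity-respecting nearest-neighbour construction for the CVRP, the kind of initial solution such metaheuristics typically start from and then improve. The coordinates and demands are invented for illustration.

```python
import math

def nearest_neighbor_cvrp(depot, customers, demands, capacity):
    """Build routes greedily: always visit the nearest feasible customer."""
    unserved = set(customers)
    routes = []
    while unserved:
        load, pos, route = 0, depot, []
        while True:
            feas = [c for c in unserved if load + demands[c] <= capacity]
            if not feas:                       # vehicle full (or none left)
                break
            nxt = min(feas, key=lambda c: math.dist(pos, c))
            route.append(nxt)
            load += demands[nxt]
            pos = nxt
            unserved.remove(nxt)
        routes.append(route)                   # vehicle returns to depot
    return routes

depot = (0.0, 0.0)
customers = [(1.0, 0.0), (2.0, 0.0), (0.0, 3.0)]
demands = {(1.0, 0.0): 4, (2.0, 0.0): 4, (0.0, 3.0): 5}
routes = nearest_neighbor_cvrp(depot, customers, demands, capacity=8)
```

Destroy-and-repair operators in ALNS would then remove customers from these routes and reinsert them more cheaply.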
Heuristic Portfolio Trading Rules with Capital Gain Taxes
DEFF Research Database (Denmark)
Fischer, Marcel; Gallmeyer, Michael
We study the out-of-sample performance of portfolio trading strategies when an investor faces capital gain taxation and proportional transaction costs. Under no capital gain taxation and no transaction costs, we show that, consistent with DeMiguel, Garlappi, and Uppal (2009), a simple 1/N trading strategy is not dominated out-of-sample by a variety of optimizing trading strategies, except the parametric portfolios of Brandt, Santa-Clara, and Valkanov (2009). With dividend and realization-based capital gain taxes, the welfare costs of the taxes are large, with the cost being as large as 30% of wealth in some cases. Overlaying simple tax trading heuristics on these trading strategies improves out-of-sample performance. In particular, the 1/N trading strategy's welfare gains improve when a variety of tax trading heuristics are also imposed. For medium to large transaction costs, no trading strategy can…
An Efficient Heuristic Approach for Irregular Cutting Stock Problem in Ship Building Industry
Directory of Open Access Journals (Sweden)
Yan-xin Xu
2016-01-01
Full Text Available This paper presents an efficient approach for solving a real two-dimensional irregular cutting stock problem in the ship building industry. The cutting stock problem is a common cutting and packing problem that arises in a variety of industrial applications. A modification of the selection heuristic Exact Fit is applied in our research. In the case of irregular shapes, a placement heuristic is more important for constructing a complete solution, and a placement heuristic related to bottom-left-fill is presented. We evaluate the proposed approach using generated instances with only convex shapes from the literature, as well as instances with nonconvex shapes based on a real problem from the ship building industry. The results demonstrate that the proposed approach is significantly more effective and efficient than some conventional heuristics.
Unified heuristics to solve routing problem of reverse logistics in sustainable supply chain
Anbuudayasankar, S. P.; Ganesh, K.; Lenny Koh, S. C.; Mohandas, K.
2010-03-01
A reverse logistics problem, motivated by many real-life applications, is examined in which bottles/cans in which products are delivered from a processing depot to customers in one period are available for return to the depot in the following period. The picked-up bottles/cans must be accommodated in place of the delivered load. This problem is termed the simultaneous delivery and pick-up problem with constrained capacity (SDPC). We develop three unified heuristics, based on an extended branch and bound heuristic, a genetic algorithm and simulated annealing, to solve SDPC. These heuristics are also designed to solve the standard travelling salesman problem (TSP) and the TSP with simultaneous delivery and pick-up (TSDP). We tested the heuristics on standard, derived and randomly generated datasets of TSP, TSDP and SDPC and obtained satisfactory results with fast convergence in reasonable time.
Madni, Syed Hamid Hussain; Abd Latiff, Muhammad Shafie; Abdullahi, Mohammed; Abdulhamid, Shafi'i Muhammad; Usman, Mohammed Joda
2017-01-01
Cloud computing infrastructure is suitable for meeting the computational needs of large tasks. Optimal scheduling of tasks in a cloud computing environment has been proved to be an NP-complete problem, hence the need for heuristic methods. Several heuristic algorithms have been developed and used to address this problem, but choosing the appropriate algorithm for a task assignment problem of a particular nature is difficult, since the methods were developed under different assumptions. Therefore, six rule-based heuristic algorithms are implemented and used to schedule autonomous tasks in homogeneous and heterogeneous environments, with the aim of comparing their performance in terms of cost, degree of imbalance, makespan and throughput. First Come First Serve (FCFS), Minimum Completion Time (MCT), Minimum Execution Time (MET), Max-min, Min-min and Sufferage are the heuristic algorithms considered for the performance comparison and analysis of task scheduling in cloud computing.
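Of the heuristics compared above, Min-min can be sketched in a few lines: repeatedly pick the task whose minimum completion time over all machines is smallest, and assign it to that machine. The execution-time matrix below is a made-up example, not data from the study.

```python
def min_min(etc):
    """Min-min scheduling. etc[t][m] = execution time of task t on machine m."""
    n_tasks, n_machines = len(etc), len(etc[0])
    ready = [0.0] * n_machines            # time each machine becomes free
    unassigned = set(range(n_tasks))
    schedule = {}
    while unassigned:
        best = None                        # (completion_time, task, machine)
        for t in unassigned:
            for m in range(n_machines):
                ct = ready[m] + etc[t][m]
                if best is None or ct < best[0]:
                    best = (ct, t, m)
        ct, t, m = best
        schedule[t] = m
        ready[m] = ct
        unassigned.remove(t)
    return schedule, max(ready)            # assignment and makespan

etc = [[4.0, 8.0],
       [5.0, 3.0],
       [12.0, 6.0]]
schedule, makespan = min_min(etc)
```

Max-min differs only in picking the task whose minimum completion time is largest, which tends to balance long tasks earlier.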
Performance of nickel base superalloy components in gas turbines
DEFF Research Database (Denmark)
Dahl, Kristian Vinter
2006-01-01
The topic of this thesis is the microstructural behaviour of hot section components in the industrial gas turbine…
Institute of Scientific and Technical Information of China (English)
臧天磊; 钟佳辰; 何正友; 钱清泉
2012-01-01
To implement service restoration of a distribution network quickly and optimally, a two-stage service restoration algorithm based on the combination of heuristic rules and entropy weight is proposed. In the first stage, heuristic rules are used to generate a set of candidate service restoration schemes. In the second stage, considering the objectives of service restoration, five evaluation indices are introduced: the quantity of restored load, the quantity of transferred load, the margin of load capacity, the rate of load balancing, and the number of switching operations of circuit breakers. The evaluation indices of the candidate schemes are calculated and the schemes are assessed by entropy weight; by additionally introducing subjective and objective weights, the schemes are synthetically assessed to obtain preferred and alternative service restoration schemes. Calculation results for a distribution network with six feeders show that the proposed method is effective.
Parallel Heuristics for Scalable Community Detection
Energy Technology Data Exchange (ETDEWEB)
Lu, Howard; Kalyanaraman, Anantharaman; Halappanavar, Mahantesh; Choudhury, Sutanay
2014-05-17
Community detection has become a fundamental operation in numerous graph-theoretic applications. It is used to reveal natural divisions that exist within real world networks without imposing prior size or cardinality constraints on the set of communities. Despite its potential for application, there is only limited support for community detection on large-scale parallel computers, largely owing to the irregular and inherently sequential nature of the underlying heuristics. In this paper, we present parallelization heuristics for fast community detection using the Louvain method as the serial template. The Louvain method is an iterative heuristic for modularity optimization. Originally developed by Blondel et al. in 2008, the method has become increasingly popular owing to its ability to detect high modularity community partitions in a fast and memory-efficient manner. However, the method is also inherently sequential, thereby limiting its scalability to problems that can be solved on desktops. Here, we observe certain key properties of this method that present challenges for its parallelization, and consequently propose multiple heuristics that are designed to break the sequential barrier. Our heuristics are agnostic to the underlying parallel architecture. For evaluation purposes, we implemented our heuristics on shared memory (OpenMP) and distributed memory (MapReduce-MPI) machines, and tested them over real world graphs derived from multiple application domains (internet, biological, natural language processing). Experimental results demonstrate the ability of our heuristics to converge to high modularity solutions comparable to those output by the serial algorithm in nearly the same number of iterations, while also drastically reducing time to solution.
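The quantity the Louvain method greedily optimizes is Newman's modularity Q. As a minimal sketch of what is being computed, the code below evaluates Q for an undirected graph and a community assignment; the toy graph is illustrative, not one of the paper's test networks.

```python
def modularity(edges, community):
    """Newman modularity: in-community edge fraction minus its null-model
    expectation, summed over communities."""
    m = len(edges)
    deg = {}
    for u, v in edges:
        deg[u] = deg.get(u, 0) + 1
        deg[v] = deg.get(v, 0) + 1
    # fraction of edges falling inside a community
    q = sum(1.0 for u, v in edges if community[u] == community[v]) / m
    # subtract expected in-community fraction under the configuration model
    q -= sum((sum(d for n, d in deg.items() if community[n] == c)
              / (2.0 * m)) ** 2
             for c in set(community.values()))
    return q

# Two triangles joined by a single bridge edge.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
good = {0: 'a', 1: 'a', 2: 'a', 3: 'b', 4: 'b', 5: 'b'}
one_block = {n: 'a' for n in range(6)}
```

Louvain's local-move phase accepts a vertex move exactly when it increases this Q, which is why the parallel heuristics above must cope with moves whose gains depend on each other.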
Choosing the best heuristic for seeded alignment of DNA sequences
Directory of Open Access Journals (Sweden)
Buhler Jeremy
2006-03-01
Full Text Available Abstract Background Seeded alignment is an important component of algorithms for fast, large-scale DNA similarity search. A good seed matching heuristic can reduce the execution time of genomic-scale sequence comparison without degrading sensitivity. Recently, many types of seed have been proposed to improve on the performance of traditional contiguous seeds as used in, e.g., NCBI BLASTN. Choosing among these seed types, particularly those that use information besides the presence or absence of matching residue pairs, requires practical guidance based on a rigorous comparison, including assessment of sensitivity, specificity, and computational efficiency. This work performs such a comparison, focusing on alignments in DNA outside widely studied coding regions. Results We compare seeds of several types, including those allowing transition mutations rather than matches at fixed positions, those allowing transitions at arbitrary positions ("BLASTZ" seeds), and those using a more general scoring matrix. For each seed type, we use an extended version of our Mandala seed design software to choose seeds with optimized sensitivity for various levels of specificity. Our results show that, on a test set biased toward alignments of noncoding DNA, transition information significantly improves seed performance, while finer distinctions between different types of mismatches do not. BLASTZ seeds perform especially well. These results depend on properties of our test set that are not shared by EST-based test sets with a strong bias toward coding DNA. Conclusion Practical seed design requires careful attention to the properties of the alignments being sought. For noncoding DNA sequences, seeds that use transition information, especially BLASTZ-style seeds, are particularly useful. The Mandala seed design software can be found at http://www.cse.wustl.edu/~yanni/mandala/.
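The behaviour of a spaced seed can be sketched in a few lines: positions marked '1' must match, while '0' positions are don't-cares, so a mismatch (e.g. a transition) at a don't-care position does not destroy the hit. The seed and sequences below are illustrative, not those benchmarked in the paper.

```python
def seed_hits(seed, s1, s2):
    """All offset pairs (i, j) where s1 and s2 agree at every '1' position."""
    span = len(seed)
    hits = []
    for i in range(len(s1) - span + 1):
        for j in range(len(s2) - span + 1):
            if all(s1[i + k] == s2[j + k]
                   for k, c in enumerate(seed) if c == '1'):
                hits.append((i, j))
    return hits

# Weight-3 spaced seed: the G/T mismatch at the '0' position is tolerated,
# whereas the contiguous seed '1111' of the same span finds no hit at all.
spaced = seed_hits('1101', 'ACGTAC', 'ACTTAC')
contiguous = seed_hits('1111', 'ACGTAC', 'ACTTAC')
```

This is the effect the paper quantifies: for a fixed weight, spreading the match positions raises sensitivity to alignments containing isolated mismatches.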
Efficiency Improvements in Meta-Heuristic Algorithms to Solve the Optimal Power Flow Problem
Reddy, S. Surender; Bijwe, P. R.
2016-12-01
This paper proposes efficient approaches for solving the Optimal Power Flow (OPF) problem using meta-heuristic algorithms. Mathematically, OPF is formulated as a non-linear optimization problem with equality and inequality constraints. The main drawback of meta-heuristic algorithm based OPF is the excessive execution time required, due to the large number of power flows needed in the solution process. The proposed approaches use lower and upper bounds on the objective function values. With this approach, the number of power flows to be performed is reduced substantially, resulting in a solution speed-up. The efficiently generated objective function bounds can lead to faster solutions of meta-heuristic algorithms. The original advantages of meta-heuristic algorithms, such as the ability to handle complex non-linearities, discontinuities in the objective function, discrete variables, and multi-objective optimization, are still available in the proposed approaches. The proposed OPF formulation includes active and reactive power generation limits, and the Valve Point Loading (VPL) and Prohibited Operating Zones (POZs) effects of generating units. The effectiveness of the proposed approach is examined on the IEEE 30, 118 and 300 bus test systems, and the simulation results confirm the efficiency and superiority of the proposed approaches over other meta-heuristic algorithms. The proposed efficient approach is generic enough to be used with any type of meta-heuristic algorithm based OPF.
Evaluating the hydrological consistency of satellite based water cycle components
Lopez Valencia, Oliver M.
2016-06-15
Advances in multi-satellite based observations of the earth system have provided the capacity to retrieve information across a wide-range of land surface hydrological components and provided an opportunity to characterize terrestrial processes from a completely new perspective. Given the spatial advantage that space-based observations offer, several regional-to-global scale products have been developed, offering insights into the multi-scale behaviour and variability of hydrological states and fluxes. However, one of the key challenges in the use of satellite-based products is characterizing the degree to which they provide realistic and representative estimates of the underlying retrieval: that is, how accurate are the hydrological components derived from satellite observations? The challenge is intrinsically linked to issues of scale, since the availability of high-quality in-situ data is limited, and even where it does exist, is generally not commensurate to the resolution of the satellite observation. Basin-scale studies have shown considerable variability in achieving water budget closure with any degree of accuracy using satellite estimates of the water cycle. In order to assess the suitability of this type of approach for evaluating hydrological observations, it makes sense to first test it over environments with restricted hydrological inputs, before applying it to more hydrological complex basins. Here we explore the concept of hydrological consistency, i.e. the physical considerations that the water budget impose on the hydrologic fluxes and states to be temporally and spatially linked, to evaluate the reproduction of a set of large-scale evaporation (E) products by using a combination of satellite rainfall (P) and Gravity Recovery and Climate Experiment (GRACE) observations of storage change, focusing on arid and semi-arid environments, where the hydrological flows can be more realistically described. Our results indicate no persistent hydrological
Bonding and Integration Technologies for Silicon Carbide Based Injector Components
Halbig, Michael C.; Singh, Mrityunjay
2008-01-01
Advanced ceramic bonding and integration technologies play a critical role in the fabrication and application of silicon carbide based components for a number of aerospace and ground based applications. One such application is a lean direct injector for a turbine engine to achieve low NOx emissions. Ceramic to ceramic diffusion bonding and ceramic to metal brazing technologies are being developed for this injector application. For the diffusion bonding, titanium interlayers (PVD and foils) were used to aid in the joining of silicon carbide (SiC) substrates. The influence of such variables as surface finish, interlayer thickness (10, 20, and 50 microns), processing time and temperature, and cooling rates was investigated. Microprobe analysis was used to identify the phases in the bonded region. In bonds that were not fully reacted, an intermediate phase, Ti5Si3Cx, formed; its incompatible thermal expansion caused thermal stresses and cracking during the processing cool-down. Thinner titanium interlayers and/or longer processing times resulted in stable and compatible phases that did not contribute to microcracking and yielded an optimized microstructure. Tensile tests on the joined materials gave strengths of 13-28 MPa, depending on the SiC substrate material. Non-destructive evaluation using ultrasonic immersion showed well formed bonds. For the technology of brazing Kovar fuel tubes to silicon carbide, preliminary development of the joining approach has begun. Various technical issues and requirements for the injector application are addressed.
Directory of Open Access Journals (Sweden)
Kristian M. Lien
1990-01-01
Full Text Available This paper presents a new algorithm based on the heuristic tearing algorithm by Gundersen and Hertzberg (1983). The basic idea in both the original and the proposed algorithm is sequential tearing of strong components, which are identified by an algorithm proposed by Tarjan (1972). The new algorithm has two alternative options for the selection of tear streams, and alternative precedence orderings may be generated for the selected set of tear streams. The algorithm has been tested on several problems. It has identified minimal (optimal) tear sets for all of them, including the four problems presented in Gundersen and Hertzberg (1983) where the original algorithm could not find a minimal tear set. A Lisp implementation of the algorithm is described, and example problems are presented.
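Tearing starts from the strong components of the flowsheet digraph, which Tarjan's (1972) algorithm finds in linear time. Below is a compact recursive sketch on a hypothetical stream graph; the tear-stream selection itself is not shown.

```python
def tarjan_scc(graph):
    """Tarjan's algorithm: return the strongly connected components of a
    directed graph given as {node: [successors]}."""
    index, low, on_stack, stack = {}, {}, set(), []
    sccs, counter = [], [0]

    def strongconnect(v):
        index[v] = low[v] = counter[0]
        counter[0] += 1
        stack.append(v)
        on_stack.add(v)
        for w in graph.get(v, []):
            if w not in index:
                strongconnect(w)
                low[v] = min(low[v], low[w])
            elif w in on_stack:
                low[v] = min(low[v], index[w])
        if low[v] == index[v]:            # v is the root of a component
            comp = []
            while True:
                w = stack.pop()
                on_stack.discard(w)
                comp.append(w)
                if w == v:
                    break
            sccs.append(comp)

    for v in list(graph):
        if v not in index:
            strongconnect(v)
    return sccs

# Units A, B, C form a recycle loop; D is downstream on its own.
flowsheet = {'A': ['B'], 'B': ['C'], 'C': ['A', 'D'], 'D': []}
sccs = tarjan_scc(flowsheet)
```

Each nontrivial component (here {A, B, C}) is a recycle that must be torn; singleton components can be solved sequentially.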
Directory of Open Access Journals (Sweden)
Zeeshan Ali Siddiqui
2016-01-01
Full Text Available Component-based software system (CBSS) development is an emerging discipline that promises to take software development into a new era. Just as hardware systems are presently constructed from kits of parts, software systems may also be assembled from components, and it is more reliable to reuse software than to create it anew. It is the reliability of the glue code and of the individual components that contributes to the reliability of the overall system. Each component contributes to overall system reliability according to the number of times it is used; this is known as the usage frequency of a component, and some components are of critical usage. The usage frequency determines the weight of each component, and according to these weights each component contributes to the overall reliability of the system. A ranking of components may therefore be obtained by analyzing their reliability impact on the overall application. In this paper, we propose the application of fuzzy multi-objective optimization on the basis of ratio analysis, Fuzzy-MOORA. The method helps find the most suitable alternative (software component) from a set of available feasible alternatives. It is an accurate and easy-to-understand tool for solving multi-criteria decision making problems that have imprecise and vague evaluation data. By the use of ratio analysis, the proposed method determines the most suitable alternative among all possible alternatives, and a dimensionless measurement performs the ranking of components for estimating CBSS reliability in a non-subjective way. Finally, three case studies illustrate the use of the proposed technique.
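The crisp ratio analysis underlying Fuzzy-MOORA can be sketched briefly: normalize each criterion by its vector norm, score each alternative as the weighted sum of benefit criteria minus cost criteria, and rank by score. The components, criteria and weights below are hypothetical, and the fuzzy extension (triangular fuzzy numbers and defuzzification) is omitted.

```python
import math

def moora_rank(matrix, weights, benefit):
    """Rank alternatives (rows) by the MOORA ratio: benefit criteria add to
    the score, cost criteria subtract, after vector-norm normalization."""
    n_crit = len(matrix[0])
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix))
             for j in range(n_crit)]
    scores = [sum((1 if benefit[j] else -1) * weights[j] * row[j] / norms[j]
                  for j in range(n_crit))
              for row in matrix]
    return sorted(range(len(matrix)), key=lambda i: scores[i], reverse=True)

# Hypothetical criteria: [reliability, usage frequency, defect density (cost)]
matrix = [[0.95, 120, 2.0],
          [0.80, 300, 5.0],
          [0.99, 40, 1.0]]
weights = [0.5, 0.3, 0.2]
benefit = [True, True, False]
ranking = moora_rank(matrix, weights, benefit)   # best component first
```

In the fuzzy variant, each matrix entry would be a fuzzy number capturing the vagueness of the assessments, with the same ratio structure.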
Heuristics-Guided Exploration of Reaction Mechanisms.
Bergeler, Maike; Simm, Gregor N; Proppe, Jonny; Reiher, Markus
2015-12-08
For the investigation of chemical reaction networks, the efficient and accurate determination of all relevant intermediates and elementary reactions is mandatory. The complexity of such a network may grow rapidly, in particular if reactive species are involved that might cause a myriad of side reactions. Without automation, a complete investigation of complex reaction mechanisms is tedious and possibly unfeasible. Therefore, only the expected dominant reaction paths of a chemical reaction network (e.g., a catalytic cycle or an enzymatic cascade) are usually explored in practice. Here, we present a computational protocol that constructs such networks in a parallelized and automated manner. Molecular structures of reactive complexes are generated based on heuristic rules derived from conceptual electronic-structure theory and subsequently optimized by quantum-chemical methods to produce stable intermediates of an emerging reaction network. Pairs of intermediates in this network that might be related by an elementary reaction according to some structural similarity measure are then automatically detected and subjected to an automated search for the connecting transition state. The results are visualized as an automatically generated network graph, from which a comprehensive picture of the mechanism of a complex chemical process can be obtained that greatly facilitates the analysis of the whole network. We apply our protocol to the Schrock dinitrogen-fixation catalyst to study alternative pathways of catalytic ammonia production.
Multiobjective hyper heuristic scheme for system design and optimization
Rafique, Amer Farhan
2012-11-01
As system design is becoming more and more multifaceted, integrated, and complex, traditional single-objective optimization approaches to optimal design are becoming less and less efficient and effective. Single-objective optimization methods produce a unique optimal solution, whereas multiobjective methods produce a Pareto front. The foremost intent is to predict a reasonably distributed Pareto-optimal solution set, independent of the problem instance, through a multiobjective scheme. A further objective of the intended approach is to improve the worthiness of the outputs of the complex engineering system design process at the conceptual design phase. The process is automated in order to give the system designer the possibility of studying and analyzing a large number of possible solutions in a short time. This article presents a Multiobjective Hyper Heuristic Optimization Scheme based on low-level meta-heuristics, developed for application in engineering system design. Herein, we present a stochastic function to manage the low-level meta-heuristics in order to increase the certainty of reaching the global optimum solution. A Genetic Algorithm, Simulated Annealing and Swarm Intelligence are used as low-level meta-heuristics in this study. The performance of the proposed scheme is investigated through a comprehensive empirical analysis, yielding acceptable results. One of the primary motives for performing multiobjective optimization is that current engineering systems require the simultaneous optimization of multiple, conflicting objectives. Randomized decision making makes the implementation of this scheme attractive and easy. Injecting feasible solutions significantly alters the search direction and also adds population diversity, resulting in the accomplishment of the pre-defined goals set in the proposed scheme.
Automatic Choice of Scheduling Heuristics for Parallel/Distributed Computing
Directory of Open Access Journals (Sweden)
Clayton S. Ferner
1999-01-01
Full Text Available Task mapping and scheduling are two very difficult problems that must be addressed when a sequential program is transformed into a parallel program. Since these problems are NP-hard, compiler writers have opted to concentrate their efforts on optimizations that produce immediate gains in performance. As a result, current parallelizing compilers either use very simple methods to deal with task scheduling or simply ignore it altogether. Unfortunately, the programmer does not have this luxury: the burden of repartitioning or rescheduling, should the compiler produce inefficient parallel code, lies entirely with the programmer. We were able to create an algorithm (called a metaheuristic) which automatically chooses a scheduling heuristic for each input program. The metaheuristic produces better schedules in general than the heuristics upon which it is based. This technique was tested on a suite of real scientific programs written in SISAL and simulated on four different network configurations. Averaged over all of the test cases, the metaheuristic outperformed all eight underlying scheduling algorithms, beating the best one by 2%, 12%, 13%, and 3% on the four separate network configurations. It is able to do this not always by picking the best heuristic, but rather by avoiding the heuristics when they would produce very poor schedules. For example, while the metaheuristic only picked the best algorithm about 50% of the time for the 100 Gbps Ethernet, its worst decision was only 49% away from optimal. In contrast, the best of the eight scheduling algorithms was optimal 30% of the time, but its worst decision was 844% away from optimal.
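The idea of a metaheuristic that picks among underlying scheduling heuristics can be illustrated with a minimal sketch. The example below uses two simple list-scheduling priority rules (LPT and SPT) on identical machines, not the eight heuristics or SISAL programs of the study; it only shows the "evaluate each, keep the best schedule" selection step that avoids a heuristic's worst cases:

```python
import heapq

def list_schedule(durations, machines, key):
    """Greedy list scheduling: assign tasks (ordered by `key`) to the least-loaded machine."""
    heap = [(0.0, m) for m in range(machines)]  # (current load, machine id)
    heapq.heapify(heap)
    for d in sorted(durations, key=key):
        load, m = heapq.heappop(heap)
        heapq.heappush(heap, (load + d, m))
    return max(load for load, _ in heap)        # makespan

def metaheuristic(durations, machines):
    """Run every underlying heuristic and keep whichever gives the shortest makespan."""
    rules = {"LPT": lambda d: -d, "SPT": lambda d: d}
    return min((list_schedule(durations, machines, k), name) for name, k in rules.items())

tasks = [7, 7, 6, 6, 5, 4, 4, 2]  # hypothetical task durations
best_makespan, best_rule = metaheuristic(tasks, 3)
```

On this instance LPT wins (makespan 15 versus 17 for SPT); a metaheuristic built this way can never do worse than the best of its constituents on the evaluated instance.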
WEB SERVICE SELECTION ALGORITHM BASED ON PRINCIPAL COMPONENT ANALYSIS
Institute of Scientific and Technical Information of China (English)
Kang Guosheng; Liu Jianxun; Tang Mingdong; Cao Buqing
2013-01-01
Existing Web service selection approaches usually assume that users have provided their preferences in a quantitative form. However, due to the subjectivity and vagueness of preferences, it may be impractical for users to specify quantitative and exact preferences. Moreover, because Quality of Service (QoS) attributes are often interrelated, existing Web service selection approaches that employ a weighted summation of QoS attribute values to compute the overall QoS of Web services may produce inaccurate results, since they do not take correlations among QoS attributes into account. To resolve these problems, a Web service selection framework considering the user's preference priority is proposed, which incorporates a searching mechanism with QoS range setting to identify services satisfying the user's QoS constraints. For the identified service candidates, an algorithm for Web service selection named PCA-WSS (Web Service Selection based on Principal Component Analysis) is proposed, which can eliminate the correlations among QoS attributes and compute the overall QoS of Web services accurately. After computing the overall QoS for each service, the algorithm ranks the Web service candidates by their overall QoS and recommends the services with the top QoS values to users. Finally, the effectiveness and feasibility of the approach are validated by experiments: the Web services selected by the approach receive higher average evaluations from users than others, and the time cost of the PCA-WSS algorithm is not affected acutely by the number of service candidates.
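The PCA aggregation step can be sketched with NumPy. The QoS matrix below is invented (all attributes scaled so lower is better), and the sign-fixing and variance-explained weighting are one plausible reading of "eliminate correlations and compute the overall QoS", not the paper's exact formulation:

```python
import numpy as np

# Hypothetical QoS matrix: rows are service candidates, columns are QoS
# attributes (e.g., response time, cost, failure rate), lower is better.
qos = np.array([
    [0.2, 0.3, 0.1],
    [0.8, 0.9, 0.7],
    [0.4, 0.5, 0.3],
    [0.6, 0.7, 0.5],
])

centered = qos - qos.mean(axis=0)
cov = np.cov(centered, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)
order = np.argsort(eigvals)[::-1]            # principal components, largest variance first
eigvecs = eigvecs[:, order]

# Fix the arbitrary sign of each component so its dominant attribute keeps its direction.
signs = np.sign(eigvecs[np.abs(eigvecs).argmax(axis=0), np.arange(eigvecs.shape[1])])
eigvecs = eigvecs * signs

weights = eigvals[order] / eigvals.sum()     # variance-explained weights
scores = centered @ eigvecs                  # decorrelated per-component scores
overall = scores @ weights                   # aggregated overall QoS (lower = better here)
ranking = list(np.argsort(overall))
```

Because the projected scores are decorrelated, the weighted combination no longer double-counts interrelated attributes the way a naive weighted sum of raw values does.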
Meta Heuristic Algorithms for Vehicle Routing Problem with Stochastic Demands
Directory of Open Access Journals (Sweden)
Geetha Shanmugam
2011-01-01
Full Text Available Problem statement: The shipment of goods from manufacturer to consumer is a focal point of distribution logistics. In reality, the demand of consumers is not known a priori. This kind of distribution is dealt with by the Stochastic Vehicle Routing Problem (SVRP), which is NP-hard. In this work, the VRP with stochastic demand is considered; a probability distribution serves as the random variable for the stochastic demand of each customer. Approach: In this study, the VRPSD is solved using meta-heuristic algorithms, namely the Genetic Algorithm (GA), Particle Swarm Optimization (PSO) and Hybrid PSO (HPSO). Dynamic Programming (DP) is used to find the expected cost of each route generated by GA, PSO and HPSO. Results: The objective is to minimize the total expected cost of an a priori route. The fitness value of an a priori route is calculated using DP. In the proposed HPSO, the initial particles are generated based on the Nearest Neighbor Heuristic (NNH), and elitism is used for updating the particles. The algorithms were implemented in MATLAB 7.0 and tested on problems with different numbers of customers. The results obtained are competitive in terms of execution time and memory usage. Conclusion: The computational time is polynomial, O(nKQ), and the memory required is O(nQ). An ANOVA test is performed to compare the proposed HPSO with the other heuristic algorithms.
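The expected cost of an a priori route under the classic restocking recourse can also be estimated without the paper's DP recursion. The Monte Carlo sketch below uses invented coordinates and two-point demand distributions purely to illustrate what "total expected cost of an a priori route" means:

```python
import math, random

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def route_cost(depot, stops, demands, capacity):
    """Cost of one demand realization under the restocking recourse policy:
    if a customer's realized demand exceeds the remaining load, the vehicle
    makes a round trip to the depot to restock."""
    cost, load, pos = 0.0, capacity, depot
    for stop, d in zip(stops, demands):
        cost += dist(pos, stop)
        if d > load:
            cost += 2 * dist(stop, depot)   # restocking detour
            load = capacity
        load -= d
        pos = stop
    return cost + dist(pos, depot)

def expected_cost(depot, stops, demand_dists, capacity, samples=2000, seed=7):
    """Monte Carlo estimate of the expected a priori route cost."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(samples):
        demands = [rng.choice(support) for support in demand_dists]
        total += route_cost(depot, stops, demands, capacity)
    return total / samples

depot = (0.0, 0.0)
stops = [(0.0, 3.0), (4.0, 3.0), (4.0, 0.0)]
demand_dists = [[1, 2], [1, 2], [1, 2]]     # each customer demands 1 or 2, equally likely
cost = expected_cost(depot, stops, demand_dists, capacity=5)
```

For this instance the exact expectation is 15 (base tour of 14 plus a restocking detour of 8 with probability 1/8), so the estimate should land close to that value.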
When less is more: evolutionary origins of the affect heuristic.
Directory of Open Access Journals (Sweden)
Jerald D Kralik
Full Text Available The human mind is built for approximations. When considering the value of a large aggregate of different items, for example, we typically do not summate the many individual values. Instead, we appear to form an immediate impression of the likeability of the option based on the average quality of the full collection, which is easier to evaluate and remember. While useful in many situations, this affect heuristic can lead to apparently irrational decision-making. For example, studies have shown that people are willing to pay more for a small set of high-quality goods than for the same set of high-quality goods with lower-quality items added [e.g. 1]. We explored whether this kind of choice behavior could be seen in other primates. In two experiments, one in the laboratory and one in the field, using two different sets of food items, we found that rhesus monkeys preferred a highly-valued food item alone to the identical item paired with a food of positive but lower value. This finding provides experimental evidence that, under certain conditions, macaque monkeys follow an affect heuristic that can cause them to prefer less food. Conservation of this affect heuristic could account for similar 'irrational' biases in humans, and may reflect a more general complexity reduction strategy in which averages, prototypes, or stereotypes represent a set or group.
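The contrast between summation and the affect heuristic's averaging is easy to make concrete. The quality ratings below are invented, but they reproduce the qualitative pattern reported: adding lower-value items raises the total while lowering the average:

```python
def total_value(items):
    """Normatively rational valuation: sum the individual values."""
    return sum(items)

def affect_value(items):
    """Average-quality impression: the heuristic the study attributes to the choosers."""
    return sum(items) / len(items)

premium = [9, 8, 9]              # hypothetical quality ratings of high-value foods
bundled = premium + [2, 3]       # the same items plus lower-value additions

# Rationally, the bundle is worth more in total...
assert total_value(bundled) > total_value(premium)
# ...but the affect heuristic scores it lower, predicting the observed preference.
assert affect_value(bundled) < affect_value(premium)
```

With these numbers the bundle totals 31 versus 26 but averages 6.2 versus about 8.7, so an average-based chooser prefers less food, as the monkeys did.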
Recipient design in human communication: simple heuristics or perspective taking?
Blokpoel, Mark; van Kesteren, Marlieke; Stolk, Arjen; Haselager, Pim; Toni, Ivan; van Rooij, Iris
2012-01-01
Humans have a remarkable capacity for tuning their communicative behaviors to different addressees, a phenomenon also known as recipient design. It remains unclear how this tuning of communicative behavior is implemented during live human interactions. Classical theories of communication postulate that recipient design involves perspective taking, i.e., the communicator selects her behavior based on her hypotheses about beliefs and knowledge of the recipient. More recently, researchers have argued that perspective taking is computationally too costly to be a plausible mechanism in everyday human communication. These researchers propose that computationally simple mechanisms, or heuristics, are exploited to perform recipient design. Such heuristics may be able to adapt communicative behavior to an addressee with no consideration for the addressee's beliefs and knowledge. To test whether the simpler of the two mechanisms is sufficient for explaining the "how" of recipient design we studied communicators' behaviors in the context of a non-verbal communicative task (the Tacit Communication Game, TCG). We found that the specificity of the observed trial-by-trial adjustments made by communicators is parsimoniously explained by perspective taking, but not by simple heuristics. This finding is important as it suggests that humans do have a computationally efficient way of taking beliefs and knowledge of a recipient into account.
Recipient design in human communication: Simple heuristics or perspective taking?
Directory of Open Access Journals (Sweden)
Mark Blokpoel
2012-09-01
Full Text Available Humans have a remarkable capacity for tuning their communicative behaviors to different addressees, a phenomenon also known as recipient design. It remains unclear how this tuning of communicative behavior is implemented during live human interactions. Classical theories of communication postulate that recipient design involves perspective taking, i.e., the communicator selects her behavior based on her hypotheses about beliefs and knowledge of the recipient. More recently, researchers have argued that perspective taking is computationally too costly to be a plausible mechanism in everyday human communication. These researchers propose that computationally simple mechanisms, or heuristics, are exploited to perform recipient design. Such heuristics may be able to adapt communicative behavior to an addressee with no consideration for the addressee's beliefs and knowledge. To test whether the simpler of the two mechanisms is sufficient for explaining the "how" of recipient design we studied communicators' behaviors in the context of a non-verbal communicative task (the Tacit Communication Game, TCG). We found that the specificity of the observed trial-by-trial adjustments made by communicators is parsimoniously explained by perspective taking, but not by simple heuristics. This finding is important as it suggests that humans do have a computationally efficient way of taking beliefs and knowledge of a recipient into account.
Architectures: Design patterns for component-based systems
Bliudze, Simon
2014-01-01
Architectures depict design principles, paradigms that can be understood by all, allow thinking on a higher plane and avoiding low-level mistakes. They provide means for ensuring correctness by construction by enforcing global properties characterizing the coordination between components. An architecture can be considered as an operator A that, applied to a set of components B, builds a composite component A(B) meeting a characteristic property P. A theory of architectures must address sever...
Zheng, Jun-Xi; Zhang, Ping; Li, Fang; Du, Guang-Long
2016-09-01
Although the sequence-dependent setup times flowshop problem with the total weighted tardiness minimization objective exists widely in industry, work on the problem has been scant in the existing literature. To the authors' best knowledge, the NEH-EWDD heuristic and the Iterated Greedy (IG) algorithm with descent local search have been regarded as the best-performing heuristic and the state-of-the-art algorithm for the problem; both are based on insertion search. In this article, an efficient backtracking algorithm and a novel heuristic (HPIS) are first presented for insertion search. Accordingly, two heuristics are introduced: one is NEH-EWDD with HPIS for insertion search, and the other combines NEH-EWDD with both methods. Furthermore, the authors improve the IG algorithm with the proposed methods. Finally, experimental results show that both the proposed heuristics and the improved IG (IG*) significantly outperform the originals.
Directory of Open Access Journals (Sweden)
G. Esteve-Asensio
2009-01-01
Full Text Available We propose and compare three novel heuristics for the calculation of the optimal cell radius in mobile networks based on Wideband Code Division Multiple Access (WCDMA) technology. The proposed heuristics solve the problem of load assignment and cell radius calculation. We have tested our approaches with experiments in multi-service scenarios, showing that the proposed heuristics maximize the cell radius while providing the optimum load factor assignment. The main application of these algorithms is strategic planning studies, where an estimate of the number of Node Bs of a mobile operator at a national level is required for economic analysis. In this case, due to the large number of different scenarios considered (cities, towns, and open areas), methods other than simulation need to be considered. As far as we know, there is no similar method in the literature, and these heuristics may therefore represent a novelty in strategic network planning studies. The proposed heuristics are implemented in a strategic planning software tool, and an example of their application to a case in Spain is presented. The proposed heuristics are used for telecommunications regulatory studies in several countries.
Laser repairing surface crack of Ni-based superalloy components
Institute of Scientific and Technical Information of China (English)
王忠柯; 叶和清; 许德胜; 黄索逸
2001-01-01
Surface cracks in components of cast nickel-base superalloy were repaired with twin laser beams under proper technological conditions. One laser beam was used to melt the substrate material of the crack, and the other to feed powder material into the crack region. The experimental results show that surface cracks with widths of 0.1–0.3 mm could be repaired at a laser power of 3 kW and a scanning speed of 6–8 mm/s. The repaired depth of the crack region is below 6.5 mm. The microstructure of the repaired region consists of cellular crystals, columnar crystals and dendritic crystals from the transition region to the top filled layer. The phases in the repaired region mainly consist of supersaturated α-Co with plenty of Ni, some Cr and Al, together with Cr23C6, Co2B, Co-Ni-Mo, Ni4B3, TiSi and VSi. The hardness of the filled layer in the repaired region ranged from HV0.2 450 to HV0.2 500, and the hardness decreases gradually from the filled layer to the joined zone.
Service oriented architecture assessment based on software components
Directory of Open Access Journals (Sweden)
Mahnaz Amirpour
2016-01-01
Full Text Available Enterprise architecture, with its detailed descriptions of the functions of information technology in the organization, tries to reduce the complexity of technology applications, resulting in tools with greater efficiency in achieving the objectives of the organization. Enterprise architecture consists of a set of models describing the performance of this technology in its different components, as well as various aspects of its applications in any organization. In this way, information technology development and maintenance management can perform well within organizations. This study suggests a method for identifying the different types of services in the service-oriented architecture analysis step, applying some previous approaches in an integrated form and, based on the principles of software engineering, providing a simpler and more transparent approach through the expression of analysis details. Advantages and disadvantages of proposals should be evaluated before implementation and cost allocation. Evaluation methods can better identify the strengths and weaknesses of the current situation, help select the appropriate model from several suggestions, and clarify this technology development solution for organizations in the future. By converting the output of the model to colored Petri nets, we are able to simulate data and process flow within the organization and to evaluate and test the architecture against various inputs, in terms of reliability and response time, before it is implemented. An application of the proposed model has been studied, and the results can be used to describe and design an architecture for data.
Biological agent detection based on principal component analysis
Mudigonda, Naga R.; Kacelenga, Ray
2006-05-01
This paper presents an algorithm, based on principal component analysis for the detection of biological threats using General Dynamics Canada's 4WARN Sentry 3000 biodetection system. The proposed method employs a statistical method for estimating background biological activity so as to make the algorithm adaptive to varying background situations. The method attempts to characterize the pattern of change that occurs in the fluorescent particle counts distribution and uses the information to suppress false-alarms. The performance of the method was evaluated using a total of 68 tests including 51 releases of Bacillus Globigii (BG), six releases of BG in the presence of obscurants, six releases of obscurants only, and five releases of ovalbumin at the Ambient Breeze Tunnel Test facility, Battelle, OH. The peak one-minute average concentration of BG used in the tests ranged from 10 - 65 Agent Containing Particles per Liter of Air (ACPLA). The obscurants used in the tests included diesel smoke, white grenade smoke, and salt solution. The method successfully detected BG at a sensitivity of 10 ACPLA and resulted in an overall probability of detection of 94% for BG without generating any false-alarms for obscurants at a detection threshold of 0.6 on a scale of 0 to 1. Also, the method successfully detected BG in the presence of diesel smoke and salt water fumes. The system successfully responded to all the five ovalbumin releases with noticeable trends in algorithm output and alarmed for two releases at the selected detection threshold.
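A PCA-based detector of this general kind, modeling the background with a few leading components and alarming on large residuals, can be sketched as follows. The particle-count profile, bin counts, and threshold logic are all synthetic stand-ins, not 4WARN data or the 4WARN algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical background: fluorescent particle-count histograms (8 size bins)
# drawn around a fixed background profile, one row per one-minute average.
profile = np.array([50, 40, 30, 20, 10, 5, 2, 1], dtype=float)
background = profile + rng.normal(0, 1.0, size=(200, 8))

# Model normal background variation with the leading principal components.
mean = background.mean(axis=0)
u, s, vt = np.linalg.svd(background - mean, full_matrices=False)
basis = vt[:3]                       # top-3 components span typical background activity

def anomaly_score(sample):
    """Residual left after projecting onto the background subspace."""
    r = sample - mean
    return float(np.linalg.norm(r - basis.T @ (basis @ r)))

# A simulated release shifts counts toward agent-bearing particle sizes.
release = profile + np.array([0, 0, 0, 0, 25, 20, 15, 10], dtype=float)
bg_scores = [anomaly_score(x) for x in background[:20]]
release_score = anomaly_score(release)
```

A pattern that fits the learned background subspace produces a small residual (suppressing false alarms from obscurant-like variation the model has seen), while a genuinely different count distribution stands out.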
CNN-Based Retinal Image Upscaling Using Zero Component Analysis
Nasonov, A.; Chesnakov, K.; Krylov, A.
2017-05-01
The aim of the paper is to obtain high-quality image upscaling for noisy images that are typical in medical image processing. A new training scenario for a convolutional neural network based image upscaling method is proposed. Its main idea is a novel dataset preparation method for deep learning. The dataset contains pairs of noisy low-resolution images and corresponding noiseless high-resolution images. To achieve better results at edges and textured areas, Zero Component Analysis is applied to these images. The upscaling results are compared with other state-of-the-art methods like DCCI, SI-3 and SRCNN on noisy medical ophthalmological images. Objective evaluation of the results confirms the high quality of the proposed method. Visual analysis shows that fine details and structures like blood vessels are preserved, the noise level is reduced, and no artifacts or non-existing details are added. These properties are essential in establishing a retinal diagnosis, so the proposed algorithm is recommended for use in real medical applications.
Internet MEMS design tools based on component technology
Brueck, Rainer; Schumer, Christian
1999-03-01
The micro electromechanical systems (MEMS) industry in Europe is characterized by small and medium sized enterprises specialized in products that solve problems in specific domains like medicine, automotive sensor technology, etc. In this field of business the technology-driven design approach known from microelectronics is not appropriate. Instead, each design problem calls for its own specific technology to be used for the solution. The variety of technologies at hand, like Si-surface, Si-bulk, LIGA, laser, and precision engineering, requires a huge set of different design tools to be available. No single SME can afford to hold licenses for all these tools. This calls for a new and flexible way of designing, implementing and distributing design software. The Internet provides a flexible manner of offering software access along with methodologies of flexible licensing, e.g. on a pay-per-use basis. New communication technologies like ADSL and TV cable or satellites as carriers promise to offer a bandwidth sufficient even for interactive tools with graphical interfaces in the near future. INTERLIDO is an experimental tool suite for process specification and layout verification for lithography-based MEMS technologies to be accessed via the Internet. The first version provides a Java implementation, including a graphical editor for process specification. Currently, a new version is brought into operation that is based on JavaBeans component technology. JavaBeans offers the possibility of realizing independent interactive design assistants, like a design rule checking assistant, a process consistency checking assistant, a technology definition assistant, a graphical editor assistant, etc., that may reside distributed over the Internet, communicating via Internet protocols. Each potential user is thus able to configure his own dedicated version of a design tool set dedicated to the requirements of the current problem to be solved.
Modeling QoS Parameters in Component-Based Systems
2004-08-01
deployed components, begins with the system developer, willing to build a system, by presenting a query to the system generator. The query describes... is built using the system generator. If some of the components are not found, then the system integrator can modify the system query by adding more
Quantum Heuristics of Angular Momentum
Levy-Leblond, Jean-Marc
1976-01-01
Discusses the quantization of angular momentum components, Heisenberg-type inequalities for their spectral dispersions, and the quantization of the angular momentum modulus, without using operators or commutation relations. (MLH)
Intelligent System Design Using Hyper-Heuristics
Directory of Open Access Journals (Sweden)
Nelishia Pillay
2015-07-01
Full Text Available Determining the most appropriate search method or artificial intelligence technique to solve a problem is not always evident and usually requires implementation of the different approaches to ascertain this. In some instances a single approach may not be sufficient, and hybridization of methods may be needed to find a solution. This process can be time consuming. The paper proposes the use of hyper-heuristics as a means of identifying which method or combination of approaches is needed to solve a problem. The research presented forms part of a larger initiative aimed at using hyper-heuristics to develop intelligent hybrid systems. As an initial step in this direction, this paper investigates this for classical artificial intelligence uninformed and informed search methods, namely depth first search, breadth first search, best first search, hill-climbing and the A* algorithm. The hyper-heuristic determines the search or combination of searches to use to solve the problem. An evolutionary algorithm hyper-heuristic is implemented for this purpose and its performance is evaluated in solving the 8-Puzzle, Towers of Hanoi and Blocks World problems. The hyper-heuristic employs a generational evolutionary algorithm which iteratively refines an initial population, using tournament selection to select the parents to which the mutation and crossover operators are applied for regeneration. The hyper-heuristic was able to identify a search or combination of searches to produce solutions for the twenty 8-Puzzle, five Towers of Hanoi and five Blocks World problems. Furthermore, admissible solutions were produced for all problem instances.
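A generational evolutionary-algorithm hyper-heuristic of the kind described, tournament selection plus one-point crossover and mutation over combinations of search methods, can be sketched on a toy fitness function. The "target combination" fitness below is a stand-in for actually running the searches on 8-Puzzle-style problems:

```python
import random

METHODS = ["DFS", "BFS", "BestFS", "HillClimb", "A*"]
TARGET = ["A*", "BestFS", "A*"]   # hypothetical best combination for some problem

def fitness(chromosome):
    """Toy stand-in for 'how well this combination of searches solves the problem'."""
    return sum(g == t for g, t in zip(chromosome, TARGET))

def tournament(pop, rng, k=3):
    """Tournament selection: best of k randomly drawn individuals."""
    return max(rng.sample(pop, k), key=fitness)

def evolve(generations=30, pop_size=20, seed=1):
    rng = random.Random(seed)
    pop = [[rng.choice(METHODS) for _ in TARGET] for _ in range(pop_size)]
    best = max(pop, key=fitness)
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            a, b = tournament(pop, rng), tournament(pop, rng)
            cut = rng.randrange(1, len(TARGET))        # one-point crossover
            child = a[:cut] + b[cut:]
            if rng.random() < 0.2:                     # mutation: swap in another method
                child[rng.randrange(len(child))] = rng.choice(METHODS)
            nxt.append(child)
        pop = nxt
        best = max(pop + [best], key=fitness)          # track best-so-far
    return best

best = evolve()
```

The chromosome is a sequence of search-method labels, so the evolved individual directly names the search (or combination of searches) the hyper-heuristic recommends.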
Cryptanalysis of optical encryption: a heuristic approach
Gopinathan, Unnikrishnan; Monaghan, David S.; Naughton, Thomas J.; Sheridan, John T.
2006-10-01
The Fourier plane encryption algorithm is subjected to a heuristic known-plaintext attack. The simulated annealing algorithm is used to estimate the key, using a known plaintext-ciphertext pair, which decrypts the ciphertext with arbitrarily low error. The strength of the algorithm is tested by using the key to decrypt a different ciphertext encrypted using the same original key. The Fourier plane encryption algorithm is found to be susceptible to a known-plaintext heuristic attack. It is found that phase-only encryption, a variation of the Fourier plane encoding algorithm, successfully defends against this attack.
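The attack's core loop, simulated annealing over candidate keys scored by decryption error, can be sketched generically. The example below substitutes a toy binary key and a Hamming-distance error for the Fourier-plane phase key and plaintext-ciphertext error, so only the annealing schedule and acceptance rule carry over:

```python
import math, random

def simulated_annealing(error, n_bits, iters=5000, t0=2.0, seed=3):
    """Generic SA key search: flip one bit at a time, accept uphill moves
    with Boltzmann probability, and keep the best key seen."""
    rng = random.Random(seed)
    key = [rng.randrange(2) for _ in range(n_bits)]
    best, best_err = key[:], error(key)
    cur_err = best_err
    for i in range(iters):
        t = t0 * (1 - i / iters) + 1e-9            # linear cooling schedule
        cand = key[:]
        cand[rng.randrange(n_bits)] ^= 1           # perturb one key element
        e = error(cand)
        if e <= cur_err or rng.random() < math.exp((cur_err - e) / t):
            key, cur_err = cand, e
            if e < best_err:
                best, best_err = cand[:], e
    return best, best_err

# Toy stand-in for the decryption error of a known plaintext-ciphertext pair:
# Hamming distance between the trial key and a hidden "true" key.
secret = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]
err = lambda k: sum(a != b for a, b in zip(k, secret))
key, residual = simulated_annealing(err, len(secret))
```

In the real attack the error function would be the mismatch between the known plaintext and the trial decryption; here the annealer recovers the hidden key exactly because the toy error landscape has no deceptive structure.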
Independet Component Analyses of Ground-based Exoplanetary Transits
Silva Martins-Filho, Walter; Griffith, Caitlin Ann; Pearson, Kyle; Waldmann, Ingo; Biddle, Lauren; Zellem, Robert Thomas; Alvarez-Candal, Alvaro
2016-10-01
Most observations of exoplanetary atmospheres are conducted when a "Hot Jupiter" exoplanet transits in front of its host star. These Jovian-sized planets have small orbital periods, on the order of days, and therefore a short transit time, making them more amenable to observations. Measurements of Hot Jupiter transits must achieve a 10^-4 level of accuracy in the flux to determine the spectral modulations of the exoplanetary atmosphere. In order to accomplish this level of precision, we need to extract systematic errors, and, for ground-based measurements, the effects of Earth's atmosphere, from the signal due to the exoplanet, which is several orders of magnitude smaller. Currently, the effects of the terrestrial atmosphere and some of the time-dependent systematic errors are treated by dividing the host star by a reference star at each wavelength and time step of the transit. More recently, Independent Component Analyses (ICA) have been used to remove systematic effects from the raw data of space-based observations (Waldmann 2012, 2014; Morello et al. 2015, 2016). ICA is a statistical method born from the ideas of blind-source separation studies, which can be used to de-trend several independent source signals of a data set (Hyvarinen and Oja, 2000). One strength of this method is that it requires no additional prior knowledge of the system. Here, we present a study of the application of ICA to ground-based transit observations of extrasolar planets, which are affected by Earth's atmosphere. We analyze photometric data of two extrasolar planets, WASP-1b and GJ3470b, recorded by the 61" Kuiper Telescope of Steward Observatory using the Harris B and U filters. The presentation will compare the light curve depths and their dispersions as derived from the ICA analysis to those derived by analyses that ratio the host star to nearby reference stars. References: Waldmann, I. P. 2012 ApJ, 747, 12; Waldmann, I. P. 2014 ApJ, 780, 23; Morello, G. 2015 ApJ, 806
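The baseline treatment the abstract describes, dividing the target star's flux by a simultaneous reference star to remove shared telluric and instrumental variations, is simple enough to show directly (the ICA step itself needs considerably more machinery). All values below are synthetic:

```python
# Minimal sketch of the reference-star ratio used to remove the shared
# atmospheric/instrumental signal before (or instead of) an ICA de-trend.
atmosphere = [1.00, 0.97, 1.03, 0.95, 1.02, 0.99]    # common transparency variations
transit    = [1.00, 1.00, 0.99, 0.99, 1.00, 1.00]    # true exoplanet light curve (1% dip)

# Both stars see the same atmosphere; only the target carries the transit.
target_flux    = [a * t for a, t in zip(atmosphere, transit)]
reference_flux = atmosphere[:]

# Ratioing cancels the common-mode signal and recovers the transit shape.
detrended = [f / r for f, r in zip(target_flux, reference_flux)]
```

Because the atmospheric term is perfectly common-mode in this toy model, the ratio recovers the transit exactly; in real data the cancellation is imperfect, which is what motivates blind-source methods like ICA.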
Knowledge-based System Prototype in Structural Component Design Based on FM
Institute of Scientific and Technical Information of China (English)
JIANG Tao; LI Qing-fen; LI Ming; FU Wei
2002-01-01
A knowledge-based system for structural component design based on fracture mechanics is developed in this paper. The system consists of several functional parts: a general inference engine, a set of knowledge bases and databases, an interpretation engine, a bases administration system and the interface. It can simulate a human expert in producing analyses and design schemes, mainly for four kinds of typical structural components widely used in the shipbuilding industry: pressure vessels, large rotating constructions, pump rods and welded structures. It is an open system which may be broadened and perfected to cover a wider range of engineering applications through the modification and enlargement of the knowledge bases and databases. It has a natural and friendly interface that is easy to operate, and an on-line help service is also provided.
A flexible framework for sparse simultaneous component based data integration
Directory of Open Access Journals (Sweden)
Van Deun Katrijn
2011-11-01
Full Text Available Abstract 1 Background High throughput data are complex, and methods that reveal the structure underlying the data are most useful. Principal component analysis, frequently implemented as a singular value decomposition, is a popular technique in this respect. Nowadays the challenge is often to reveal structure in several sources of information (e.g., transcriptomics, proteomics) that are available for the same biological entities under study. Simultaneous component methods are most promising in this respect. However, the interpretation of the principal and simultaneous components is often daunting because contributions of each of the biomolecules (transcripts, proteins) have to be taken into account. 2 Results We propose a sparse simultaneous component method that makes many of the parameters redundant by shrinking them to zero. It includes principal component analysis, sparse principal component analysis, and ordinary simultaneous component analysis as special cases. Several penalties can be tuned that account in different ways for the block structure present in the integrated data. This yields known sparse approaches such as the lasso, the ridge penalty, the elastic net, the group lasso, the sparse group lasso, and the elitist lasso. In addition, the algorithmic results can be easily transposed to the context of regression. Metabolomics data obtained with two measurement platforms for the same set of Escherichia coli samples are used to illustrate the proposed methodology and the properties of different penalties with respect to sparseness across and within data blocks. 3 Conclusion Sparse simultaneous component analysis is a useful method for data integration: First, simultaneous analyses of multiple blocks offer advantages over sequential and separate analyses and second, interpretation of the results is highly facilitated by their sparseness. The approach offered is flexible and allows the block structure to be taken into account in different ways. As such
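The basic mechanism behind such sparse component methods, shrinking small loadings exactly to zero with a lasso-style penalty, can be sketched as soft-thresholding the loadings of an ordinary first component. The two simulated "blocks" below stand in for the transcriptomics/proteomics setting; this ignores the block-specific penalties the paper tunes:

```python
import numpy as np

def soft_threshold(v, lam):
    """Lasso-style shrinkage: pulls loadings with magnitude below `lam` exactly to zero."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

rng = np.random.default_rng(2)

# Two simulated data blocks measured on the same 50 samples
# (e.g., transcripts | proteins), driven by one shared component.
score = rng.normal(size=(50, 1))
block1 = score @ np.array([[2.0, 1.8, 0.05, 0.04]])   # 2 informative + 2 noise variables
block2 = score @ np.array([[1.5, 0.03]])              # 1 informative + 1 noise variable
data = np.hstack([block1, block2]) + rng.normal(0, 0.05, size=(50, 6))

# Ordinary first (simultaneous) component via SVD, then sparsified loadings.
u, s, vt = np.linalg.svd(data - data.mean(axis=0), full_matrices=False)
loadings = vt[0]
sparse_loadings = soft_threshold(loadings, lam=0.05)
```

After thresholding, only the three informative variables retain nonzero loadings, which is exactly the interpretability gain the sparseness is meant to deliver.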
An Ensemble Algorithm Based Component for Geomagnetic Data Assimilation
Directory of Open Access Journals (Sweden)
Zhibin Sun and Weijia Kuang
2015-01-01
Full Text Available Geomagnetic data assimilation is one of the most recent developments in geomagnetic studies. It combines geodynamo model outputs and surface geomagnetic observations to provide more accurate estimates of the core dynamic state and provide accurate geomagnetic secular variation forecasting. To facilitate geomagnetic data assimilation studies, we develop a stand-alone data assimilation component for the geomagnetic community. This component is used to calculate the forecast error covariance matrices and the gain matrix from a given geodynamo solution, which can then be used for sequential geomagnetic data assimilation. This component is very flexible and can be executed independently. It can also be easily integrated with arbitrary dynamo models.
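The gain-matrix computation at the heart of such a sequential assimilation component can be sketched with NumPy. The state size, observation operator, and error covariances below are small synthetic stand-ins, not a geodynamo configuration:

```python
import numpy as np

rng = np.random.default_rng(4)

n_state, n_obs, n_ens = 8, 3, 40
ensemble = rng.normal(size=(n_state, n_ens))   # forecast ensemble (one state per column)

H = np.zeros((n_obs, n_state))                 # observation operator: observe first 3 coefficients
H[np.arange(n_obs), np.arange(n_obs)] = 1.0
R = 0.25 * np.eye(n_obs)                       # observation-error covariance

# Forecast-error covariance estimated from ensemble anomalies.
anomalies = ensemble - ensemble.mean(axis=1, keepdims=True)
P = anomalies @ anomalies.T / (n_ens - 1)

# Kalman gain used to blend observations into the forecast.
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)

# Analysis update for one synthetic observation vector.
y = rng.normal(size=n_obs)
x_f = ensemble.mean(axis=1)
x_a = x_f + K @ (y - H @ x_f)
```

The analysis state is pulled toward the observations in the observed subspace, with the unobserved state components updated through the covariances carried by P.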
Design and Fabrication of SOI-based photonic crystal components
DEFF Research Database (Denmark)
Borel, Peter Ingo; Frandsen, Lars Hagedorn; Harpøth, Anders;
2004-01-01
We present examples of ultra-compact photonic crystal components realized in silicon-on-insulator material. We have fabricated several different types of photonic crystal waveguide components displaying high transmission features. This includes 60° and 120° bends and different types of couplers and splitters. Recently, we have designed and fabricated components with more than 200 nm bandwidths. Design strategies to enhance the performance include systematic variation of design parameters using finite-difference time-domain simulations and inverse design methods such as topology optimization.
Heuristic detection model of covert channel based on quantum neural network
Institute of Scientific and Technical Information of China (English)
唐彰国; 李焕洲; 钟明全; 张健
2012-01-01
In order to improve the detection rate of covert channels, this paper discusses covert channel detection techniques and gives a detailed comparative study of their characteristics. Through induction and analysis of the characteristics of network covert channels, the paper identifies attributes that can describe them and proposes an identification method based on network communication fingerprints. The communication fingerprints of covert channels are summarized in terms of protocol fields, statistical regularities and behavioral characteristics. On that basis, a heuristic detection model of covert channels based on a quantum neural network is designed and implemented. Test results indicate that the system runs efficiently and its detection results are comparatively accurate.
DEFF Research Database (Denmark)
Vlachogiannis, Ioannis (John); Lee, K. Y.
2010-01-01
In this paper an improved coordinated aggregation-based particle swarm optimization (ICA-PSO) algorithm is introduced for solving the optimal economic load dispatch problem in power systems. In the ICA-PSO algorithm each particle in the swarm retains a memory of its best position ever encountered, and is attracted only by other particles with better achievements than its own, with the exception of the particle with the best achievement, which moves randomly. The ICA-PSO algorithm is tested on a number of power systems, including the systems with 6, 13...
Directory of Open Access Journals (Sweden)
David Glenn Clark
2012-05-01
Full Text Available Background: Aphasic individuals exhibit greater difficulty understanding complex sentences, but there is little consensus regarding what makes one sentence more complicated than another. In addition, aphasic individuals might make use of heuristic strategies for understanding sentences. This research is a comparison of specific predictions derived from two approaches to the quantification of sentence complexity, one based on the hierarchical structure of sentences (trees), and the other based on Dependency Locality Theory (DLT). Complexity metrics derived from these theories are evaluated under various assumptions of heuristic use. Method: A set of complexity metrics was derived from each general theory of sentence complexity. Each metric was paired with assumptions of heuristic use. Probability spaces were generated that summarized the possible patterns of performance across 16 different sentence structures. The maximum likelihood of the comprehension scores of 42 aphasic individuals was then computed for each probability space, and the expected scores from the best-fitting points in the space were recorded for comparison to the actual scores. Predictions were then compared using measures of fit quality derived from linear mixed effects models. Results: All three of the metrics that provide the most consistently accurate predictions of patient scores rely on storage costs based on the DLT. Patients appear to employ an Agent-Theme heuristic, but vary in their tendency to accept heuristically generated interpretations. Furthermore, the ability to apply the heuristic may be degraded in proportion to aphasia severity. Conclusion: The results suggest that storage (i.e., allocation of cognitive resources for anticipated syntactic constituents) is a key resource degraded by aphasia, but aphasic individuals may vary in their tendency to use or accept heuristically generated interpretations.
Combining heuristic and statistical techniques in landslide hazard assessments
Cepeda, Jose; Schwendtner, Barbara; Quan, Byron; Nadim, Farrokh; Diaz, Manuel; Molina, Giovanni
2014-05-01
As a contribution to the Global Assessment Report 2013 - GAR2013, coordinated by the United Nations International Strategy for Disaster Reduction - UNISDR, a drill-down exercise for landslide hazard assessment was carried out by entering the results of both heuristic and statistical techniques into a new but simple combination rule. The data available for this evaluation included landslide inventories, both historical and event-based. In addition to the application of a heuristic method used in the previous editions of GAR, the availability of inventories motivated the use of statistical methods. The heuristic technique is largely based on the Mora & Vahrson method, which estimates hazard as the product of susceptibility and triggering factors, where classes are weighted based on expert judgment and experience. Two statistical methods were also applied: the landslide index method, which estimates weights of the classes for the susceptibility and triggering factors based on the evidence provided by the density of landslides in each class of the factors; and the weights of evidence method, which extends the previous technique to include both positive and negative evidence of landslide occurrence in the estimation of weights for the classes. One key aspect during the hazard evaluation was the decision on the methodology to be chosen for the final assessment. Instead of opting for a single methodology, it was decided to combine the results of the three implemented techniques using a combination rule based on a normalization of the results of each method. The hazard evaluation was performed for both earthquake- and rainfall-induced landslides. The country chosen for the drill-down exercise was El Salvador. The results indicate that highest hazard levels are concentrated along the central volcanic chain and at the centre of the northern mountains.
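The combination rule itself is not reproduced in the record; a plausible sketch of the stated idea, normalizing each method's per-cell hazard scores before a weighted combination (all scores and weights below are invented for illustration):

```python
def normalize(scores):
    """Rescale one method's hazard scores to [0, 1]."""
    lo, hi = min(scores), max(scores)
    return [(s - lo) / (hi - lo) if hi > lo else 0.0 for s in scores]

def combine(method_scores, weights=None):
    """Combine per-cell hazard scores from several methods after normalization."""
    normed = [normalize(m) for m in method_scores]
    k = len(normed)
    w = weights or [1.0 / k] * k          # equal weights unless specified
    ncells = len(normed[0])
    return [sum(w[m] * normed[m][i] for m in range(k)) for i in range(ncells)]

# Three methods (heuristic, landslide index, weights of evidence) over 4 map cells.
heuristic = [10, 40, 20, 30]
index     = [0.1, 0.9, 0.2, 0.5]
woe       = [-2.0, 1.5, -1.0, 0.5]
hazard = combine([heuristic, index, woe])
```

Normalizing first is what makes the three methods commensurable, since their raw scores live on entirely different scales.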
Institute of Scientific and Technical Information of China (English)
张媛; 张立民; 刘文彪; 陈洁
2012-01-01
Starting from an analysis of the task principles of multi-aircraft cooperative air combat under the command and control of an early-warning aircraft, a tactical decision-making model based on bidirectional heuristic attribute reduction in an incomplete information system is proposed, aimed at the uncertainty and incompleteness of the decision-making process in beyond-visual-range cooperative air combat. First, since decision attributes may be missing or uncertain in rough-set decision making, a complete selection of the incomplete decision information system is established according to the concept of extended incomplete information. Second, based on attribute reduction with a discernibility matrix, the attribute frequency is used as heuristic information to perform bidirectional selection for attribute reduction of the decision information system, yielding an optimal reduct. The optimal selection of the decision information system is then derived according to the principle of giving decisions maximum probability, and decision-making rules are extracted from it. Finally, according to the tentative combat plan for a CGF entity in beyond-visual-range cooperative air combat, in which both soft- and hard-kill weapons are applied, a synthesized tactical decision-making model for the CGF entity is established. A decision-making example illustrates the decision-making process and verifies its correctness and validity. The results show that the method can accurately produce synthesized tactical actions for a CGF entity under incomplete combat information.
Heuristic Decision Making in Network Linking
M.J.W. Harmsen - Van Hout (Marjolein); B.G.C. Dellaert (Benedict); P.J.J. Herings (Jean-Jacques)
2015-01-01
Network formation among individuals constitutes an important part of many OR processes, but relatively little is known about how individuals make their linking decisions in networks. This article provides an investigation of heuristic effects in individual linking decisions for
Fast heuristics for a dynamic paratransit problem
Cremers, M.L.A.G.; Klein Haneveld, W.K.; van der Vlerk, M.H.
2008-01-01
In a previous paper we developed a non-standard two-stage recourse model for the dynamic day-ahead paratransit planning problem. Two heuristics, which are frequently applied in the recourse model, contain many details, which lead to large CPU times when solving instances of relatively small size. In thi
Teaching a Heuristic Approach to Information Retrieval.
Ury, Connie Jo; And Others
1997-01-01
Discusses lifelong learning and the need for information retrieval skills, and describes how Northwest Missouri State University incorporates a heuristic model of library instruction in which students continually evaluate and refine information-seeking practices while progressing through all levels of courses in diverse disciplines. (Author/LRW)
A Heuristic for the Teaching of Persuasion.
Schell, John F.
Interpreting Aristotle's criteria for persuasive writing--ethos, logos, and pathos--as a concern for writer, language, and audience creates both an effective model for persuasive writing and a structure around which to organize discussions of relevant rhetorical issues. Use of this heuristic to analyze writing style, organization, and content…
The Heuristic Interpretation of Box Plots
Lem, Stephanie; Onghena, Patrick; Verschaffel, Lieven; Van Dooren, Wim
2013-01-01
Box plots are frequently used, but are often misinterpreted by students. The area of the box, especially, is often misinterpreted as representing the number or proportion of observations, while it actually represents their density. In a first study, reaction time evidence was used to test whether heuristic reasoning underlies this…
Pattern Matching, Searching, and Heuristics in Algebra.
Lopez, Antonio M.
1996-01-01
Presents a methodology designed to strengthen the cognitive effects of using graphing calculators to solve polynomial equations using pattern matching, searching, and heuristics. Discusses pattern matching as a problem-solving strategy useful in the physical, social, political, and economic worlds of today's students. (DDR)
Structural Functionalism as a Heuristic Device.
Chilcott, John H.
1998-01-01
Argues that structural functionalism as a method for conducting fieldwork and as a format for the analysis of ethnographic data remains a powerful model, one that is easily understood by professional educators. As a heuristic device, functionalist theory can help in the solution of a problem that is otherwise incapable of theoretical…
Institute of Scientific and Technical Information of China (English)
林小峰; 赵立
2016-01-01
Due to the fact that the power control of permanent magnet synchronous generators (PMSG) is usually based on linear control theory, its dynamic performance cannot fully satisfy the operation requirements of wind farms. In this paper, dual heuristic programming (DHP) is adopted to control the output power of a PMSG. After the introduction of the working principle of the PMSG, its dynamic model is established. The DHP algorithm is written in the C programming language, and DHP control blocks replace the PI control blocks in the dynamic model of the PMSG, which is incorporated into the IEEE 10-generator 39-node system to perform time-domain simulation with adjusted parameters. Simulation results show that the output power of the PMSG can be effectively controlled with DHP, and a desired dynamic performance can be obtained.
Institute of Scientific and Technical Information of China (English)
张希翔; 李陶深
2012-01-01
Regression analysis is often used for filling and predicting incomplete data, but it has some flaws when constructing the regression equation: the form of the independent variables is fixed and limited. In order to solve this problem, the paper proposes an improved multivariate regression analysis method based on heuristically constructed variables. Firstly, optimized combinations of the existing variables are found by means of a greedy algorithm; then new constructed variables are chosen for the multivariate regression analysis to obtain a better goodness of fit. Results of calculating and estimating incomplete data on the mechanical strength of wheat stalks prove that the proposed method is feasible and effective, and that it achieves a better goodness of fit when predicting incomplete data.
Heuristic Diagrams as a Tool to Teach History of Science
Chamizo, Jose A.
2012-01-01
The graphic organizer called here heuristic diagram, an improvement of Gowin's Vee heuristic, is proposed as a tool to teach the history of science. Heuristic diagrams have the purpose of helping students (or teachers, or researchers) to understand their own research, considering that asking questions and problem solving are central to scientific activity. The…
Indian Academy of Sciences (India)
Sangeetha S; S Jeevananthan
2015-12-01
Genetic Algorithms (GA) have always done justice to the art of optimization. One such endeavour employs the roots of GA in a most proficient way to determine the switching moments of a cascaded H-bridge seven-level inverter with equal DC sources. Evolutionary techniques have proved themselves efficient at solving such problems. GA is one of the methods that achieve the objective through biological mimicking. The extraordinary property of crossover is exploited using Random 3-Point Neighbourhood Crossover (RPNC) and Multi Midpoint Selective Bit Neighbourhood Crossover (MMSBNC). This paper deals with solving the selective harmonic elimination (SHE) equations using a binary-coded GA with a knowledge-based neighbourhood multipoint crossover technique, which is directly related to the switching moments of the multilevel inverter under consideration. Although previous root-finding techniques such as N-R or resultant-based methods attempt the same, the proposed approach offers faster convergence, better program reliability and a wide range of solutions. With an algorithm developed in Turbo C, the switching moments are calculated offline. The simulation results closely agree with the hardware results.
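The RPNC and MMSBNC operators are specific to the paper; for orientation, a generic binary-coded multi-point crossover can be sketched as follows (a simplified stand-in, not the paper's operators):

```python
import random

def multipoint_crossover(parent_a, parent_b, points, rng):
    """Swap the segments between randomly chosen cut points, alternating the source."""
    cuts = sorted(rng.sample(range(1, len(parent_a)), points))
    child_a, child_b = [], []
    src_a, src_b = parent_a, parent_b
    prev = 0
    for cut in cuts + [len(parent_a)]:
        child_a.extend(src_a[prev:cut])
        child_b.extend(src_b[prev:cut])
        src_a, src_b = src_b, src_a      # alternate the source after each cut
        prev = cut
    return child_a, child_b

# Illustrative 8-bit parents (in SHE, bits would encode switching angles).
rng = random.Random(1)
a, b = [0] * 8, [1] * 8
ca, cb = multipoint_crossover(a, b, points=3, rng=rng)
```

Any multi-point crossover preserves the combined multiset of genes across the two children, which the assertion below checks.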
A four-component organogel based on orthogonal chemical interactions.
Luisier, Nicolas; Schenk, Kurt; Severin, Kay
2014-09-14
A thermoresponsive organogel was obtained by orthogonal assembly of four compounds using dynamic covalent boronate ester and imine bonds, as well as dative boron-nitrogen bonds. It is shown that the gel state can be disrupted or reinforced by chemicals which undergo exchange reactions with the gel components.
Industrial Component-based Sample Mobile Robot System
Directory of Open Access Journals (Sweden)
Péter Kucsera
2007-12-01
Full Text Available The mobile robot development can be done in two different ways. The first is to build up an embedded system; the second is to use 'ready to use' industrial components. With the spread of industrial mobile robots there are more and more components on the market which can be used to build up a whole control and sensor system of a mobile robot platform. Using these components, electrical hardware development is not needed, which speeds up the development time and decreases the cost. Using a PLC on board, 'only' constructing the program is needed and the developer can concentrate on the algorithms, not on developing hardware. My idea is to solve the problem of mobile robot localization and obstacle avoidance using industrial components, and to focus this topic on mobile robot docking. In factories, mobile robots can be used to deliver parts from one place to another, but there are always two critical points. The robot has to be able to operate in a human environment, and also reach the target and get to a predefined position where another system can load it or get the delivered product. I would like to construct a mechanically simple robot model, which can calculate its position from the rotation of its wheels, and when it reaches a predefined location with the aid of an image processing system it can dock to an electrical connector. If the robot succeeded it could charge its batteries through this connector as well.
A Component-Based Dataflow Framework for Simulation and Visualization
Telea, Alexandru
1999-01-01
Reuse in the context of scientific simulation applications has mostly taken the form of procedural or object-oriented libraries. End users of such systems are however often non software experts needing very simple, possibly interactive ways to build applications from domain-specific components and t
National Research Council Canada - National Science Library
Chen, Jeng-Fung; Hsieh, Ho-Nien; Do, Quang
2014-01-01
.... In this study, an approach to the problem based on the artificial neural network (ANN) with the two meta-heuristic algorithms inspired by cuckoo birds and their lifestyle, namely, Cuckoo Search (CS...
2013-11-14
... COMMISSION Certain Vision-Based Driver Assistance System Cameras and Components Thereof; Institution of...-based driver assistance system cameras and components thereof by reason of infringement of certain... assistance system cameras and components thereof by reason of infringement of one or more of claims 1, 2,...
Facing the grand challenges through heuristics and mindfulness
Powietrzynska, Malgorzata; Tobin, Kenneth; Alexakos, Konstantinos
2015-03-01
We address the nature of mindfulness and its salience to education generally and to science education specifically. In a context of the historical embeddedness of mindfulness in Buddhism we discuss research in social neuroscience, presenting evidence for neuronal plasticity of the brain and six emotional styles, which are not biologically predetermined, but are responsive to adaptation through life experiences. We raise questions about the role of science education in mediating the structure and function of the brain. Also, we discuss interventions to increase Mindfulness in Education, including meditation and heuristics, that act as reflexive objects to heighten awareness of characteristics of mindfulness and increase the likelihood of changes in the conduct of social life—increasing the mindfulness of those who engage the characteristics included in the heuristic. We present mindfulness and the development of a toolkit for ameliorating emotions when and as necessary as a component of a science curriculum that orientates toward wellness and sustainability. We advocate for changes in the nature of science education to reflect the priorities of the twenty first century that relate to sustainability of the living and nonliving universe and wellness of sentient beings.
Understanding topological symmetry: a heuristic approach to its determination.
Contreras, M L; Alvarez, J; Guajardo, D; Rozas, R
2008-03-01
An algorithm based on heuristic rules for topological symmetry perception of organic structures having heteroatoms, multiple bonds, and any kind of cycle and configuration is presented. This algorithm identifies topological symmetry planes and sets of equivalent atoms in the structure, named symmetry atom groups (SAGs). This approach avoids both the need to explore the entire graph automorphism groups and the need to encompass cycle determination, resulting in very effective computer processing. Applications to several structures, some of them highly symmetrical such as dendrimers, are presented.
Heuristical Feature Extraction from LIDAR Data and Their Visualization
Ghosh, S.; Lohani, B.
2011-09-01
Extraction of landscape features from LiDAR data has been studied widely in the past few years. These feature extraction methodologies have been focussed on certain types of features only, namely the bare earth model, buildings principally containing planar roofs, trees and roads. In this paper, we present a methodology to process LiDAR data through DBSCAN, a density based clustering method, which extracts natural and man-made clusters. We then develop heuristics to process these clusters and simplify them to be sent to a visualization engine.
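DBSCAN itself is standard; a compact pure-Python sketch of the clustering step on 2-D points follows (the heuristics the paper applies to the resulting clusters are not reproduced, and the sample coordinates are invented):

```python
import math

def dbscan(points, eps, min_pts):
    """Label each point with a cluster id, or -1 for noise (naive O(n^2) version)."""
    labels = [None] * len(points)

    def neighbours(i):
        return [j for j in range(len(points))
                if math.dist(points[i], points[j]) <= eps]

    cluster = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        nbrs = neighbours(i)
        if len(nbrs) < min_pts:
            labels[i] = -1                      # provisionally noise
            continue
        labels[i] = cluster                     # i is a core point: start a cluster
        queue = [j for j in nbrs if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster             # former noise becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            jn = neighbours(j)
            if len(jn) >= min_pts:              # j is also core: expand from it
                queue.extend(k for k in jn if labels[k] is None)
        cluster += 1
    return labels

# Two dense clumps and one isolated point (LiDAR-like 2-D toy data).
pts = [(0, 0), (0.1, 0), (0, 0.1), (5, 5), (5.1, 5), (5, 5.1), (10, 10)]
labels = dbscan(pts, eps=0.5, min_pts=3)
```

Real LiDAR processing would use a spatial index for the neighbourhood queries rather than the quadratic scan shown here.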
Heuristic algorithm for off-lattice protein folding problem
Institute of Scientific and Technical Information of China (English)
CHEN Mao; HUANG Wen-qi
2006-01-01
Inspired by the law of interactions among objects in the physical world, we propose a heuristic algorithm for solving the three-dimensional (3D) off-lattice protein folding problem. Based on a physical model, the problem is converted from a nonlinear constraint-satisfaction problem to an unconstrained optimization problem which can be solved by the well-known gradient method. To improve the efficiency of our algorithm, a strategy was introduced to generate the initial configuration. Computational results showed that this algorithm could find states with lower energy than the previously proposed ground states obtained by the nPERM algorithm for all chains with lengths ranging from 13 to 55.
An Analysis of a Heuristic Procedure to Evaluate Tail (In)dependence
Directory of Open Access Journals (Sweden)
Marta Ferreira
2014-01-01
Full Text Available Measuring tail dependence is an important issue in many applied sciences in order to quantify the risk of simultaneous extreme events. A usual measure is given by the tail dependence coefficient. The characteristics of events behave quite differently as these become more extreme, depending on whether we are in the class of asymptotic dependence or in the class of asymptotic independence. The literature has emphasized the asymptotically dependent class, but wrongly inferring tail dependence will result in the overestimation of extreme value dependence and consequently of the risk. In this paper we analyze this issue through simulation based on a heuristic procedure.
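For reference, the tail dependence coefficient has a standard non-parametric estimate at a finite threshold, sketched below (this is the usual textbook estimator, not the paper's heuristic procedure; the toy samples are invented):

```python
def empirical_tail_dependence(xs, ys, k):
    """Fraction of the k largest xs whose partner y is also among the k largest ys."""
    n = len(xs)
    x_thr = sorted(xs)[n - k]          # k-th largest value of x
    y_thr = sorted(ys)[n - k]          # k-th largest value of y
    joint = sum(1 for x, y in zip(xs, ys) if x >= x_thr and y >= y_thr)
    return joint / k

# Perfectly dependent pairs should give 1; pairs with opposite tails give 0.
xs = list(range(100))
ys = [2 * x for x in xs]
lam_dep = empirical_tail_dependence(xs, ys, k=10)
lam_ind = empirical_tail_dependence(xs, list(reversed(ys)), k=10)
```

The choice of k is exactly where the asymptotic-dependence versus asymptotic-independence distinction bites: the estimate is only meaningful in the limit of extreme thresholds.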
Development of a knowledge-based system for the design of composite automotive components
Moynihan, Gary P.; Stephens, J. Paul
1997-01-01
Composite materials are comprised of two or more constituents possessing significantly different physical properties. Due to their high strength and light weight, there is an emerging trend to utilize composites in the automotive industry. There is an inherent link between component design and the manufacturing processes necessary for fabrication. To many designers, this situation may be intimidating, since there is frequently little available understanding of composites and their processes. A direct result is high rates of product scrap and rework. Thus, there is a need to implement a systematic approach to composite material design. One such approach is quality function deployment (QFD). By translating customer requirements into design parameters through the use of heuristics, QFD supports the improvement of product quality during the planning stages prior to actual production. The purpose of this research is to automate the use of knowledge pertaining to the design and application of composite materials within the automobile industry. This is being accomplished through the development of a prototype expert system incorporating a QFD approach. It will provide industry designers with access to knowledge of composite materials that might not otherwise be available.
Action-based distribution functions for spheroidal galaxy components
Posti, Lorenzo; Nipoti, Carlo; Ciotti, Luca
2014-01-01
We present an approach to the design of distribution functions that depend on the phase-space coordinates through the action integrals. The approach makes it easy to construct a dynamical model of a given stellar component. We illustrate the approach by deriving distribution functions that self-consistently generate several popular stellar systems, including the Hernquist, Jaffe, Navarro, Frenk and White models. We focus on non-rotating spherical systems, but extension to flattened and rotating systems is trivial. Our distribution functions are easily added to each other and to previously published distribution functions for discs to create self-consistent multi-component galaxies. The models this approach makes possible should prove valuable both for the interpretation of observational data and for exploring the non-equilibrium dynamics of galaxies via N-body simulation.
A New Image Steganography Based On First Component Alteration Technique
Directory of Open Access Journals (Sweden)
Amanpreet Kaur
2009-12-01
Full Text Available In this paper, a new image steganography scheme is proposed which is a kind of spatial domain technique. In order to hide secret data in a cover image, the first component alteration technique is used. Techniques used so far focus only on two or four bits of a pixel in an image (at most five bits at the edge of an image), which results in a lower peak signal-to-noise ratio and a high root mean square error. In this technique, the 8 bits of the blue component of pixels are replaced with secret data bits. The proposed scheme can embed more data than previous schemes and shows better image quality. To prove this scheme, several experiments are performed, and the experimental results are compared with related previous works. Keywords: image; mean square error; peak signal-to-noise ratio; steganography.
A two-component NZRI metamaterial based rectangular cloak
Islam, Sikder Sunbeam; Faruque, Mohammd Rashed Iqbal; Islam, Mohammad Tariqul
2015-10-01
A new two-component, near zero refractive index (NZRI) metamaterial is presented for electromagnetic rectangular cloaking operation in the microwave range. In the basic design, a pi-shaped metamaterial was developed and its characteristics were investigated for wave propagation along the two major axes (x and z) through the material. For z-axis wave propagation it shows more than 2 GHz of bandwidth, and for x-axis wave propagation it exhibits more than 1 GHz of bandwidth of NZRI property. The metamaterial was then utilized in designing a rectangular cloak where a metal cylinder was cloaked perfectly in the C-band area of the microwave regime. Experimental results are provided for the metamaterial and the cloak, and are compared with the simulated results. This is a novel and promising design for its two-component NZRI characteristics and rectangular cloaking operation in the electromagnetic paradigm.
A New Image Steganography Based On First Component Alteration Technique
Kaur, Amanpreet; Sikka, Geeta
2010-01-01
In this paper, a new image steganography scheme is proposed which is a kind of spatial domain technique. In order to hide secret data in a cover image, the first component alteration technique is used. Techniques used so far focus only on two or four bits of a pixel in an image (at most five bits at the edge of an image), which results in a lower peak signal-to-noise ratio and a high root mean square error. In this technique, the 8 bits of the blue component of pixels are replaced with secret data bits. The proposed scheme can embed more data than previous schemes and shows better image quality. To prove this scheme, several experiments are performed, and the experimental results are compared with related previous works.
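The embedding described, replacing the full 8 bits of the blue component with secret data, can be sketched on RGB tuples (the helper names and sample pixels are mine):

```python
def embed(pixels, secret):
    """Hide one secret byte per pixel by overwriting the full 8-bit blue component."""
    if len(secret) > len(pixels):
        raise ValueError("cover image too small for the secret")
    out = list(pixels)
    for i, byte in enumerate(secret):
        r, g, _ = out[i]
        out[i] = (r, g, byte)          # red and green components are left untouched
    return out

def extract(pixels, length):
    """Read the secret back from the blue components."""
    return bytes(b for _, _, b in pixels[:length])

cover = [(120, 200, 33), (10, 50, 99), (255, 0, 128), (7, 7, 7)]
stego = embed(cover, b"hi")
message = extract(stego, 2)
```

Capacity is one byte per pixel, higher than few-bit LSB schemes, at the cost of visibly altering the blue channel where data is stored.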
QUALITY CONTROL OF SEMICONDUCTOR PACKAGING BASED ON PRINCIPAL COMPONENTS ANALYSIS
Institute of Scientific and Technical Information of China (English)
无
2007-01-01
Five critical quality characteristics must be controlled in the surface mount and wire-bond processes in semiconductor packaging, and these characteristics are correlated with each other. So principal components analysis (PCA) is first used in the analysis of the sample data, and then the process is controlled with a Hotelling T² control chart for the first several principal components, which contain sufficient information. Furthermore, a software tool is developed for this kind of problem. With sample data from a surface mounting device (SMD) process, it is demonstrated that the T² control chart with PCA reaches the same conclusion as without PCA, but the problem is transformed from a high-dimensional one to a lower-dimensional one, i.e., from 5 dimensions to 2 in this demonstration.
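A sketch of the idea on two correlated characteristics, monitoring Hotelling T² on the first principal component only (a real application would use all five characteristics and a proper PCA routine; the sample data and function names are invented):

```python
import math

def pca_first_axis(data):
    """Mean, leading eigenvalue and unit eigenvector of a 2-variable sample."""
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    sxx = sum((x - mx) ** 2 for x, _ in data) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in data) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in data) / (n - 1)
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    l1 = tr / 2 + math.sqrt(tr * tr / 4 - det)   # leading eigenvalue (closed form, 2x2)
    vx, vy = sxy, l1 - sxx                       # eigenvector for l1 (assumes sxy != 0)
    norm = math.hypot(vx, vy)
    return (mx, my), l1, (vx / norm, vy / norm)

def t2_first_component(point, mean, l1, axis):
    """Hotelling T^2 monitored on the first principal component only: score^2 / l1."""
    score = (point[0] - mean[0]) * axis[0] + (point[1] - mean[1]) * axis[1]
    return score * score / l1

# Two strongly correlated quality characteristics (illustrative sample data).
data = [(1.0, 1.1), (2.0, 2.1), (3.0, 2.9), (4.0, 4.2), (5.0, 4.9)]
mean, l1, axis = pca_first_axis(data)
in_control = t2_first_component((3.0, 3.0), mean, l1, axis)
shifted = t2_first_component((9.0, 9.0), mean, l1, axis)
```

Points far from the training mean along the principal axis get a large T², which is what the control chart flags against its upper control limit.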
Directory of Open Access Journals (Sweden)
Tracy Tomlinson
2011-02-01
Full Text Available The recognition heuristic assumes that people make inferences based on the output of recognition memory. While much work has been devoted to establishing the recognition heuristic as a viable description of how people make inferences, more work is needed to fully integrate research on the recognition heuristic with research from the broader cognitive psychology literature. In this article, we outline four challenges that should be met for this integration to take place, and close with a call to address these four challenges collectively, rather than piecemeal.
An approach to software development based on heterogeneous component reuse and its supporting system
Institute of Scientific and Technical Information of China (English)
杨芙清; 梅宏; 吴穹; 朱冰
1997-01-01
Software reuse is considered a practical approach to solving the software crisis. BD-HCRUS, a software development supporting system based on heterogeneous component reuse, is introduced. The system has a reusable component library as its kernel, in charge of the organization, storage and retrieval of heterogeneous components; an object-oriented integrated language for the specification and composition of heterogeneous components; and program comprehension tools for reverse engineering, extracting reusable components from source code, and then re-engineering the components. Systematic support is therefore provided for the acquisition, specification, organization, storage, retrieval and composition of reusable components.
Directory of Open Access Journals (Sweden)
Vinícius Vilar Jacob
2016-01-01
Full Text Available This paper addresses a single-machine scheduling problem with sequence-dependent family setup times. In this problem the jobs are classified into families according to their similarity characteristics. Setup times are required on each occasion when the machine switches from processing jobs in one family to jobs in another family. The performance measure to be minimized is the total tardiness with respect to the given due dates of the jobs. The problem is classified as NP-hard in the ordinary sense. Since the computational complexity associated with the mathematical formulation of the problem makes it difficult for optimization solvers to deal with large-sized instances in reasonable solution time, efficient heuristic algorithms are needed to obtain near-optimal solutions. In this work we propose three heuristics based on the Iterated Local Search (ILS) metaheuristic. The first heuristic is a basic ILS, the second uses a dynamic perturbation size, and the third uses a Path Relinking (PR) technique as an intensification strategy. We carry out comprehensive computational and statistical experiments in order to analyze the performance of the proposed heuristics. The computational experiments show that the ILS heuristics outperform a genetic algorithm proposed in the literature. The ILS heuristic with dynamic perturbation size and PR intensification has a superior performance compared to other heuristics.
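A skeleton of the basic ILS variant for this problem can be sketched as follows (the paper's neighbourhoods, perturbation sizes and Path Relinking step are not reproduced; the job data, setup model and parameters are illustrative):

```python
import random

def total_tardiness(seq, jobs, setup):
    """jobs: {id: (family, processing_time, due_date)}; setup paid on family switches."""
    t, tard, prev_fam = 0, 0, None
    for j in seq:
        fam, p, due = jobs[j]
        if prev_fam is not None and fam != prev_fam:
            t += setup
        t += p
        tard += max(0, t - due)
        prev_fam = fam
    return tard

def local_search(seq, jobs, setup):
    """First-improvement search over adjacent swaps until no move helps."""
    best = total_tardiness(seq, jobs, setup)
    improved = True
    while improved:
        improved = False
        for i in range(len(seq) - 1):
            seq[i], seq[i + 1] = seq[i + 1], seq[i]
            cost = total_tardiness(seq, jobs, setup)
            if cost < best:
                best, improved = cost, True
            else:
                seq[i], seq[i + 1] = seq[i + 1], seq[i]   # undo
    return seq, best

def ils(jobs, setup, iters=200, seed=0):
    rng = random.Random(seed)
    seq = sorted(jobs, key=lambda j: jobs[j][2])          # earliest-due-date start
    seq, best = local_search(seq, jobs, setup)
    best_seq = seq[:]
    for _ in range(iters):
        cand = best_seq[:]
        i, j = rng.sample(range(len(cand)), 2)            # perturbation: one random swap
        cand[i], cand[j] = cand[j], cand[i]
        cand, cost = local_search(cand, jobs, setup)
        if cost < best:                                   # accept only improvements
            best, best_seq = cost, cand[:]
    return best_seq, best

jobs = {0: (0, 3, 4), 1: (1, 2, 5), 2: (0, 4, 10), 3: (1, 1, 6)}
seq, tard = ils(jobs, setup=2)
```

The dynamic-perturbation variant in the paper would vary the number of swaps per perturbation instead of using a fixed single swap.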
Directory of Open Access Journals (Sweden)
Nur Ariffin Mohd Zin
2012-01-01
Full Text Available This paper presents a comparative study of three techniques for solving the Travelling Salesman Problem: exhaustive search, heuristic search and a genetic algorithm. Each solution finds an optimal path through the 25 contiguous cities available in England, and each is written in Prolog. Comparisons were made with emphasis on time consumed and closeness to the optimal solution. Based on the experiments, we found that the heuristic is very promising in terms of time taken, while the genetic algorithm stands out on large numbers of traversals by producing the shortest path among the three.
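The exhaustive and heuristic approaches can be contrasted on a toy instance (the genetic algorithm is omitted; the paper uses Prolog, and the city coordinates below are invented):

```python
import itertools, math

def tour_length(tour, pts):
    """Length of the closed tour visiting pts in the given order."""
    return sum(math.dist(pts[tour[i]], pts[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

def exhaustive(pts):
    """Optimal tour by brute force; only feasible for a handful of cities."""
    cities = list(range(1, len(pts)))        # fix city 0 as the start (WLOG for a cycle)
    best = min(itertools.permutations(cities),
               key=lambda p: tour_length((0,) + p, pts))
    return (0,) + best

def nearest_neighbour(pts):
    """Greedy heuristic: always visit the closest unvisited city next."""
    unvisited, tour = set(range(1, len(pts))), [0]
    while unvisited:
        nxt = min(unvisited, key=lambda c: math.dist(pts[tour[-1]], pts[c]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tuple(tour)

pts = [(0, 0), (1, 0), (1, 1), (0, 1), (2, 2), (0, 2)]
opt = tour_length(exhaustive(pts), pts)
greedy = tour_length(nearest_neighbour(pts), pts)
```

The greedy tour can never beat the exhaustive optimum, which captures the time-versus-quality trade-off the paper measures.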
Real-time method of UAV path planning based on heuristic predictive window
Institute of Scientific and Technical Information of China (English)
王强; 张安; 张艳霞
2015-01-01
Considering moving targets and moving threats, and using the real-time target and threat information acquired by airborne sensors, a real-time method of unmanned aerial vehicle (UAV) path planning based on a heuristic predictive window is proposed. The predictive window is estimated from the friend-or-enemy situation, and the states of the target and threats are predicted by Kalman filtering. An objective function based on the vector-angle principle is established, which evaluates threat and voyage costs while satisfying the UAV's manoeuvring constraints. A series of UAV heading angles is obtained by online optimization with negative-gradient (steepest) descent, completing the path planning. Simulation results show that the method can pursue a moving target and avoid moving threats efficiently, realizing real-time path planning for the UAV.
Authentication Scheme Based on Principal Component Analysis for Satellite Images
Directory of Open Access Journals (Sweden)
Ashraf. K. Helmy
2009-09-01
Full Text Available This paper presents a multi-band wavelet image content authentication scheme for satellite images that incorporates principal component analysis (PCA). The proposed scheme achieves higher perceptual transparency and stronger robustness. Specifically, the developed watermarking scheme can successfully resist common signal processing such as JPEG compression and geometric distortions such as cropping. In addition, the proposed scheme can be parameterized, resulting in greater security: an attacker cannot extract the embedded watermark without knowing the parameter. To meet these requirements, the host image is transformed to YIQ to decrease the correlation between different bands. A multi-band wavelet transform (M-WT) is then applied to each channel separately, obtaining one approximate sub-band and fifteen detail sub-bands. PCA is then applied to the coefficients corresponding to the same spatial location in all detail sub-bands. The last principal component band is an excellent domain for inserting the watermark, since it represents the least correlated features in the high-frequency area of the host image. One of the most important aspects of satellite images is the spectral signature, i.e. the behavior of different features in different spectral bands; the results of the proposed algorithm show that the spectral stamp of different features is not tainted after inserting the watermark.
Heuristic rule for constructing physics axiomatization
Moldoveanu, Florin
2010-01-01
Constructing the Theory of Everything (TOE) is an elusive goal of today's physics. Goedel's incompleteness theorem seems to forbid physics axiomatization, a necessary part of the TOE. The purpose of this contribution is to show how physics axiomatization can be achieved guided by a new heuristic rule. This will open up new roads into constructing the ultimate theory of everything. Three physical principles will be identified from the heuristic rule and they in turn will generate uniqueness results of various technical strengths regarding space, time, non-relativistic and relativistic quantum mechanics, electroweak symmetry and the dimensionality of space-time. The hope is that the strong force and the Standard Model axiomatizations are not too far out. Quantum gravity and cosmology are harder problems and maybe new approaches are needed. However, complete physics axiomatization seems to be an achievable goal, no longer part of philosophical discussions, but subject to rigorous mathematical proofs.
A heuristic for the minimization of open stacks problem
Directory of Open Access Journals (Sweden)
Fernando Masanori Ashikaga
2009-08-01
Full Text Available We suggest a fast and easy-to-implement heuristic for the minimization of open stacks problem (MOSP). The problem is modeled as a traversal problem in a graph (Gmosp) with a special structure (Yanasse, 1997b). It was observed in Ashikaga (2001) that, in the mean experimental case, Gmosp has large cliques and high edge density. This information was used to implement a heuristic based on the extension-rotation algorithm of Pósa (1976) for the approximation of Hamiltonian circuits. Additionally, an initial path for Pósa's algorithm is derived from the vertices of an approximately maximum clique in order to accelerate the process. Extensive computational tests show that the resulting simple approach dominates, in both time and mean error, the fastest previously known heuristic for the problem, that of Yuen (1991, 1995).
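The extension-rotation idea attributed above to Pósa (1976) can be sketched as follows: grow a path greedily, and when the endpoint has no unvisited neighbor, "rotate" the path about a chord from the endpoint to expose a new endpoint. This is a generic sketch of the classical heuristic, not the authors' MOSP-specific implementation; the K6 instance is invented:

```python
import random

def posa_hamiltonian_path(adj, n, seed=1, max_steps=10000):
    """Extension-rotation heuristic: extend the path at its endpoint
    when possible; otherwise rotate the path about a chord from the
    endpoint, which reverses the tail and exposes a new endpoint."""
    rng = random.Random(seed)
    path, in_path = [0], {0}
    for _ in range(max_steps):
        if len(path) == n:
            return path
        end = path[-1]
        fresh = [v for v in adj[end] if v not in in_path]
        if fresh:                                  # extension step
            v = rng.choice(fresh)
            path.append(v)
            in_path.add(v)
        else:                                      # rotation step
            v = rng.choice(adj[end])
            i = path.index(v)
            path[i + 1:] = reversed(path[i + 1:])
    return None

# Hypothetical instance: the complete graph K6 (trivially Hamiltonian).
adj = {u: [v for v in range(6) if v != u] for u in range(6)}
path = posa_hamiltonian_path(adj, 6)
```

The abstract's clique-based acceleration would replace the single starting vertex with a path through an approximately maximum clique.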
Meta-Heuristic Combining Prior Online and Offline Information for the Quadratic Assignment Problem.
Sun, Jianyong; Zhang, Qingfu; Yao, Xin
2014-03-01
The construction of promising solutions for NP-hard combinatorial optimization problems (COPs) in meta-heuristics is usually based on three types of information: a priori information, a posteriori information learned from visited solutions during the search procedure, and online information collected in the solution construction process. Prior information reflects our domain knowledge about the COPs. Extensive domain knowledge can surely make the search effective, yet it is not always available. Posterior information can guide the meta-heuristics to globally explore promising search areas, but it lacks local guidance capability. On the contrary, online information can capture local structures, and its application can help exploit the search space. In this paper, we studied the effects of using this information on the algorithmic performance of meta-heuristics for COPs. The study was illustrated by a set of heuristic algorithms developed for the quadratic assignment problem. We first proposed an improved scheme to extract online local information, then developed a unified framework under which all types of information can be combined readily. Finally, we studied the benefits of the three types of information to meta-heuristics. Conclusions were drawn from the comprehensive study, which can be used as principles to guide the design of effective meta-heuristics in the future.
Quantum heuristic algorithm for traveling salesman problem
Bang, Jeongho; Lim, James; Ryu, Junghee; Lee, Changhyoup; Lee, Jinhyoung
2010-01-01
We propose a quantum heuristic algorithm to solve the traveling salesman problem by generalizing Grover search. Sufficient conditions are derived to greatly enhance the probability of finding the tours with extremal costs, reaching almost unity, and these conditions are shown to be characterized by statistical properties of the tour costs. In particular, for a Gaussian distribution of tours along the cost axis, we show that the quantum algorithm exhibits a quadratic speedup over its classical counterpart, similarly to Grover search.
Heuristics for the economic dispatch problem
Energy Technology Data Exchange (ETDEWEB)
Flores, Benjamin Carpio [Centro Nacional de Controle de Energia (CENACE), Mexico, D.F. (Mexico). Dept. de Planificacion Economica de Largo Plazo], E-mail: benjamin.carpo@cfe.gob.mx; Laureano Cruces, A.L.; Lopez Bracho, R.; Ramirez Rodriguez, J. [Universidad Autonoma Metropolitana (UAM), Mexico, D.F. (Mexico). Dept. de Sistemas], Emails: clc@correo.azc.uam.mx, rlb@correo.azc.uam.mx, jararo@correo.azc.uam.mx
2009-07-01
This paper presents GRASP (Greedy Randomized Adaptive Search Procedure), Simulated Annealing (SAA), Genetic (GA), and Hybrid Genetic (HGA) Algorithms for the economic dispatch problem (EDP), considering non-convex cost functions and dead zones as the only restrictions, and shows the results obtained. We also present parameter settings that are specifically applicable to the EDP, and a comparative table of results for each heuristic. It is shown that these methods outperform the classical methods without the need to assume convexity of the objective function. (author)
Cut Size Statistics of Graph Bisection Heuristics
Schreiber, G. R.; Martin, O. C.
1998-01-01
We investigate the statistical properties of cut sizes generated by heuristic algorithms which solve the graph bisection problem approximately. On an ensemble of sparse random graphs, we find empirically that the distribution of the cut sizes found by ``local'' algorithms becomes peaked as the number of vertices in the graphs becomes large. Evidence is given that this distribution tends towards a Gaussian whose mean and variance scale linearly with the number of vertices of the graphs. Given...
Heuristics and Biases in Military Decision Making
2010-10-01
In the whole range of human activities, war most closely resembles a game of cards. —Clausewitz, On War. [Only fragments of this abstract survive: footnotes citing Judgment Under Uncertainty: Heuristics and Biases, ed. Daniel Kahneman and Amos Tversky (New York: Cambridge University Press, 1982), 156-57, and noting that the problem can be solved using Bayesian inference.]
A Heuristic Molecular Model of Hydrophobic Interactions
Hummer, G; Garde, S; Garcia, A.E.; Pohorille, A; Pratt, L.R.
1995-01-01
Hydrophobic interactions provide driving forces for protein folding, membrane formation, and oil-water separation. Motivated by information theory, the poorly understood nonpolar solute interactions in water are investigated. A simple heuristic model of hydrophobic effects in terms of density fluctuations is developed. This model accounts quantitatively for the central hydrophobic phenomena of cavity formation and association of inert gas solutes; it therefore clarifies the underlying physics...
Component-Based Approach for Educating Students in Bioinformatics
Poe, D.; Venkatraman, N.; Hansen, C.; Singh, G.
2009-01-01
There is an increasing need for an effective method of teaching bioinformatics. Increased progress and availability of computer-based tools for educating students have led to the implementation of a computer-based system for teaching bioinformatics as described in this paper. Bioinformatics is a recent, hybrid field of study combining elements of…
Teacher Perceptions Regarding Portfolio-Based Components of Teacher Evaluations
Nagel, Charles I.
2012-01-01
This study reports the results of teachers' and principals' perceptions of the package evaluation process, a process that uses a combination of a traditional evaluation with a portfolio-based assessment tool. In addition, this study contributes to the educational knowledge base by exploring the participants' views on the impact of…
The recognition heuristic: A decade of research
Directory of Open Access Journals (Sweden)
Gerd Gigerenzer
2011-02-01
Full Text Available The recognition heuristic exploits the basic psychological capacity for recognition in order to make inferences about unknown quantities in the world. In this article, we review and clarify issues that emerged from our initial work (Goldstein and Gigerenzer, 1999, 2002), including the distinction between a recognition and an evaluation process. There is now considerable evidence that (i) the recognition heuristic predicts the inferences of a substantial proportion of individuals consistently, even in the presence of one or more contradicting cues, (ii) people are adaptive decision makers in that accordance increases with larger recognition validity and decreases in situations where the validity is low or wholly indeterminable, and (iii) in the presence of contradicting cues, some individuals appear to select different strategies. Little is known about these individual differences, or how to precisely model the alternative strategies. Although some researchers have attributed judgments inconsistent with the use of the recognition heuristic to compensatory processing, little research on such compensatory models has been reported. We discuss extensions of the recognition model, open questions, unanticipated results, and the surprising predictive power of recognition in forecasting.
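The recognition heuristic itself is simple to state in code: if exactly one of two objects is recognized, infer that it has the larger criterion value; otherwise defer to further knowledge. The city-size task and fallback rule below are illustrative assumptions, not taken from the article:

```python
def recognition_heuristic(a, b, recognized, knowledge):
    """If exactly one object is recognized, infer that it scores
    higher on the criterion; otherwise fall back on knowledge."""
    if recognized(a) and not recognized(b):
        return a
    if recognized(b) and not recognized(a):
        return b
    return knowledge(a, b)

# Hypothetical city-size task: 'recognition' is membership in a small
# memory, and the fallback compares stored populations (in millions).
populations = {"London": 8.8, "Leeds": 0.8}

def recognized(city):
    return city in populations

def knowledge(a, b):
    return a if populations.get(a, 0.0) >= populations.get(b, 0.0) else b

pick = recognition_heuristic("Obscureville", "Leeds", recognized, knowledge)
```

Here the unrecognized "Obscureville" loses to the recognized "Leeds" regardless of their actual sizes, which is exactly the non-compensatory behavior the article discusses.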
Face Detection Using Adaboosted SVM-Based Component Classifier
Valiollahzadeh, Seyyed Majid; Nazari, Mohammad
2008-01-01
Recently, Adaboost has been widely used to improve the accuracy of any given learning algorithm. In this paper we focus on designing an algorithm that combines Adaboost with support vector machines (SVMs) as weak component classifiers for the face detection task. To obtain a set of effective SVM weak-learner classifiers, the algorithm adaptively adjusts the kernel parameter in the SVM instead of using a fixed one. The proposed combination outperforms SVM in generalization on the imbalanced classification problem. The method proposed here is compared, in terms of classification accuracy, to other commonly used Adaboost methods, such as decision trees and neural networks, on the CMU+MIT face database. Results indicate that the performance of the proposed method is overall superior to previous Adaboost approaches.
Action-based distribution functions for spheroidal galaxy components
Posti, Lorenzo; Binney, James; Nipoti, Carlo; Ciotti, Luca
2015-03-01
We present an approach to the design of distribution functions that depend on the phase-space coordinates through the action integrals. The approach makes it easy to construct a dynamical model of a given stellar component. We illustrate the approach by deriving distribution functions that self-consistently generate several popular stellar systems, including the Hernquist, Jaffe, and Navarro, Frenk and White models. We focus on non-rotating spherical systems, but extension to flattened and rotating systems is trivial. Our distribution functions are easily added to each other and to previously published distribution functions for discs to create self-consistent multicomponent galaxies. The models this approach makes possible should prove valuable both for the interpretation of observational data and for exploring the non-equilibrium dynamics of galaxies via N-body simulations.
Component Thermodynamical Selection Based Gene Expression Programming for Function Finding
Directory of Open Access Journals (Sweden)
Zhaolu Guo
2014-01-01
Full Text Available Gene expression programming (GEP), an improvement on genetic programming (GP), has become a popular tool for data mining. However, like other evolutionary algorithms, it tends to suffer from premature convergence and a slow convergence rate when solving complex problems. In this paper, we propose an enhanced GEP algorithm, called CTSGEP, which is inspired by the principle of minimal free energy in thermodynamics. CTSGEP employs a component thermodynamical selection (CTS) operator to quantitatively keep a balance between selective pressure and population diversity during the evolution process. Experiments are conducted on several benchmark datasets from the UCI machine learning repository. The results show that the performance of CTSGEP is better than that of conventional GEP and some GEP variations.
Directory of Open Access Journals (Sweden)
Mohammad Hossein Zarei
2016-01-01
Full Text Available Job selection and scheduling are among the most important decisions for production planning in today's manufacturing systems. However, studies that take both problems into account together are scarce. Given that such problems are strongly NP-hard, this paper presents an approach based on two heuristic algorithms for simultaneous job selection and scheduling. The objective is to select a subset of candidate jobs and schedule them in such a way that the total net profit is maximized. The cost components considered here include jobs' processing costs and weighted earliness/tardiness penalties. Two heuristic algorithms, namely scatter search (SS) and simulated annealing (SA), were employed to solve the problem for single-machine environments. The algorithms were applied to several examples of different sizes with sequence-dependent setup times. Computational results were compared in terms of quality of solutions and convergence speed. Both algorithms were found to be efficient in solving the problem. While SS could provide solutions of slightly higher quality for large problems, SA could achieve solutions in a more reasonable computational time.
Feature-Based TAG in place of multi-component adjunction Computational Implications
Hockey, B A
1994-01-01
Using feature-based Tree Adjoining Grammar (TAG), this paper presents linguistically motivated analyses of constructions claimed to require multi-component adjunction. These feature-based TAG analyses permit parsing of these constructions using an existing unification-based Earley-style TAG parser, thus obviating the need for a multi-component TAG parser without sacrificing linguistic coverage for English.
A Hierarchical Reinforcement Learning Method Based on Heuristic Reward Function
Institute of Scientific and Technical Information of China (English)
刘全; 闫其粹; 伏玉琛; 胡道京; 龚声蓉
2011-01-01
Reinforcement learning is about controlling an autonomous agent in an unknown environment, often called the state space. The agent has no prior knowledge about the environment and can only obtain knowledge by acting in it. Reinforcement learning, and Q-learning in particular, encounters a major problem: learning the Q-function in tabular form may be infeasible, because the amount of memory needed to store the table is excessive and the Q-function converges only after each state has been visited many times. The "curse of dimensionality" is thus inevitably produced by large state spaces, which grow exponentially with the number of features and slow down convergence. A hierarchical reinforcement learning method based on a heuristic reward function is proposed to address this problem. The method can greatly reduce the state space and speed up learning: actions are chosen purposefully and efficiently so as to optimize the reward function and quicken convergence. The method is applied to a simulation platform for the game of Tetris. Analysis of the algorithm and the experimental results show that it can partly solve the "curse of dimensionality" and converges notably faster.
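A flat (non-hierarchical) sketch of the core idea, tabular Q-learning whose update adds a heuristic shaped reward on top of the raw environment reward, might look like this. The corridor task, shaping bonus, and all parameters are invented; the paper's hierarchical decomposition and Tetris application are not reproduced here:

```python
import random

def q_learning(states, actions, step, shaped_reward, episodes=200,
               alpha=0.5, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning in which the update uses a heuristic shaped
    reward added on top of the raw environment reward."""
    rng = random.Random(seed)
    Q = {(s, a): 0.0 for s in states for a in actions}
    for _ in range(episodes):
        s = states[0]
        for _ in range(100):                       # cap episode length
            if rng.random() < eps:                 # epsilon-greedy choice
                a = rng.choice(actions)
            else:
                a = max(actions, key=lambda b: Q[(s, b)])
            s2, r = step(s, a)
            r += shaped_reward(s, s2)              # heuristic bonus
            future = 0.0 if s2 is None else gamma * max(Q[(s2, b)] for b in actions)
            Q[(s, a)] += alpha * (r + future - Q[(s, a)])
            if s2 is None:                         # terminal state reached
                break
            s = s2
    return Q

# Toy corridor: states 0..4, goal at 4; the shaping bonus rewards
# any move toward the goal (an invented heuristic, not the paper's).
states, actions = list(range(5)), [-1, 1]

def step(s, a):
    s2 = min(max(s + a, 0), 4)
    return (None, 1.0) if s2 == 4 else (s2, 0.0)

def shaped_reward(s, s2):
    return 0.1 if (s2 is None or s2 > s) else -0.1

Q = q_learning(states, actions, step, shaped_reward)
```

The shaping term steers exploration toward the goal early on, which is the convergence-speed effect the abstract claims for its heuristic reward function.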
Directory of Open Access Journals (Sweden)
Vinay
2014-03-01
Full Text Available Component Based Software Engineering (CBSE) provides an approach to develop high-quality software systems at lower cost by using new and existing software components. The quality of a software system depends on the quality of the individual software components integrated. The application developer wants the best-fitting components to assemble in order to improve the quality of the software product, and specifies the criteria and requirements of the software system to use in selecting fit components. Component classification and selection is a practical problem that requires complete and predictable input information, which is often missing due to uncertainty in judgment and imprecision in calculations. Hence, component fitness evaluation, classification, and selection are critical, multi-faceted, fuzzy, and vague problems. Many component selection approaches exist, but these lack repeatable, usable, flexible, multi-faceted, and automated processes for component selection and filtration, and they do not fulfill the objectives of the software industry in terms of cost, quality, and precision. There is therefore a need for an intelligent approach to multi-faceted component fitness evaluation, classification, and selection. In this study, a fuzzy-synthetic approach is proposed for multi-criteria fitness evaluation, classification, and selection of software components. For validation of the proposed framework, fifteen black-box calculator components are used. The framework helps the application developer select fit, high-quality components, reducing cost and enhancing the quality and productivity of software systems.
Fast deterministic algorithm for EEE components classification
Kazakovtsev, L. A.; Antamoshkin, A. N.; Masich, I. S.
2015-10-01
Authors consider the problem of automatic classification of the electronic, electrical and electromechanical (EEE) components based on results of the test control. Electronic components of the same type used in a high-quality unit must be produced as a single production batch from a single batch of the raw materials. Data of the test control are used for splitting a shipped lot of the components into several classes representing the production batches. Methods such as k-means++ clustering or evolutionary algorithms combine local search and random search heuristics. The proposed fast algorithm returns a unique result for each data set. The result is comparatively precise. If the data processing is performed by the customer of the EEE components, this feature of the algorithm allows easy checking of the results by a producer or supplier.
Directory of Open Access Journals (Sweden)
Dnyanesh G Lad
2013-01-01
Conclusions: A significantly improved placement of the component was found in the coronal and sagittal planes of the tibial component by CAS. The placement of the components in the other planes was comparable with the values recorded in the jig-based surgery group. Functional outcome was not significantly different.
Heuristic for Task-Worker Assignment with Varying Learning Slopes
Directory of Open Access Journals (Sweden)
Wipawee Tharmmaphornphilas
2010-04-01
Full Text Available The fashion industry has a wide variety of products, so multi-skilled workers are required to improve flexibility in production and assignment. Generally the supervisor assigns tasks to workers based on the skills and skill levels of the workers. Since new product styles in the fashion industry are launched frequently and order sizes tend to be small, workers are continually learning as raw materials and production processes change; consequently they require less time to produce succeeding units of a task, according to their learning ability. Because the workforce includes both experienced and inexperienced workers, each worker has a different skill level and learning ability, and an assignment that assumes constant skill levels is inappropriate. This paper proposes a task-worker assignment that considers worker skill levels and learning abilities. The processing time of each worker changes over the production period due to the worker's learning ability. We focus on task-worker assignment in a fashion industry where tasks are ordered in series and the number of tasks is greater than the number of workers, so workers can perform multiple assignments following the precedence restrictions, as in an assembly line balancing problem. The problem is formulated as an integer linear programming model with the objective of minimizing makespan. A heuristic is proposed to determine the lower bound (LB) and the upper bound (UB) of the problem, and the best assignment is determined. The performance of the heuristic method is tested by comparing the quality of its solutions and its computational time to optimal solutions.
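The learning effect described above is commonly modeled with Wright's learning curve, under which each doubling of cumulative output multiplies the unit time by a fixed rate; whether the paper uses exactly this model is an assumption here, and the numbers are illustrative:

```python
import math

def unit_time(t1, n, rate):
    """Time to produce the n-th unit under Wright's learning curve:
    each doubling of cumulative output multiplies unit time by `rate`
    (e.g. rate=0.8 for an '80% learner')."""
    return t1 * n ** math.log2(rate)   # exponent is negative for rate < 1

def batch_time(t1, units, rate):
    """Total time for the first `units` units of a task."""
    return sum(unit_time(t1, n, rate) for n in range(1, units + 1))

# A 10-hour task performed by an 80% learner: the 2nd unit takes
# 8 hours, the 4th takes 6.4 hours.
t2 = unit_time(10.0, 2, 0.8)
t4 = unit_time(10.0, 4, 0.8)
```

In a makespan model like the paper's, each worker would carry their own `t1` and `rate`, so the time a task takes depends on who performs it and how many units they have already completed.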
"The Gaze Heuristic:" Biography of an Adaptively Rational Decision Process.
Hamlin, Robert P
2017-02-21
This article is a case study that describes the natural and human history of the gaze heuristic. The gaze heuristic is an interception heuristic that utilizes a single input (deviation from a constant angle of approach) repeatedly as a task is performed. Its architecture, advantages, and limitations are described in detail. A history of the gaze heuristic is then presented. In natural history, the gaze heuristic is the only known technique used by predators to intercept prey. In human history the gaze heuristic was discovered accidentally by Royal Air Force (RAF) fighter command just prior to World War II. As it was never discovered by the Luftwaffe, the technique conferred a decisive advantage upon the RAF throughout the war. After the end of the war in America, German technology was combined with the British heuristic to create the Sidewinder AIM9 missile, the most successful autonomous weapon ever built. There are no plans to withdraw it or replace its guiding gaze heuristic. The case study demonstrates that the gaze heuristic is a specific heuristic type that takes a single best input at the best time (take the best(2) ). Its use is an adaptively rational response to specific, rapidly evolving decision environments that has allowed those animals/humans/machines who use it to survive, prosper, and multiply relative to those who do not.
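A minimal simulation of the constant-bearing reading of the gaze heuristic described above: at each step the pursuer reads one input, the bearing to the target, and steers so that it stays constant, which puts it on a collision course with a constant-velocity target. The scenario, names, and parameter values are all illustrative assumptions:

```python
import math

def constant_bearing_intercept(pursuer, target, target_vel, speed,
                               dt=0.05, steps=400):
    """Steer each step so the bearing (line-of-sight angle) to the
    target stays constant: the heading is chosen to cancel the
    target's cross-line-of-sight motion.  Returns the closest
    approach distance observed over the simulation."""
    px, py = pursuer
    tx, ty = target
    vx, vy = target_vel
    closest = math.hypot(tx - px, ty - py)
    for _ in range(steps):
        los = math.atan2(ty - py, tx - px)
        # target velocity component perpendicular to the line of sight
        cross = -math.sin(los) * vx + math.cos(los) * vy
        heading = los + math.asin(max(-1.0, min(1.0, cross / speed)))
        px += speed * math.cos(heading) * dt
        py += speed * math.sin(heading) * dt
        tx += vx * dt
        ty += vy * dt
        closest = min(closest, math.hypot(tx - px, ty - py))
    return closest

# Invented scenario: target crossing at speed 1, pursuer twice as fast.
d = constant_bearing_intercept((0.0, 0.0), (10.0, 0.0), (0.0, 1.0), 2.0)
```

Note the single repeated input: only the line-of-sight angle (and its implied rate) is sensed, which is what makes the heuristic cheap enough for animals and early missile seekers alike.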
A Fast Method for Heuristics in Large-Scale Flow Shop Scheduling
Institute of Scientific and Technical Information of China (English)
Anonymous
2006-01-01
Fast computation methods are needed for the heuristics of flow shop scheduling problems in practical manufacturing environments. This paper describes a generalized flow shop model, an extension of the classical model in which not all machines are available at time zero. The general completion-time computing method is used to compute completion times in generalized flow shops. The transform-classical-flow-shop-to-generalized-shop (TCG) method is used to transform classical schedules into generalized schedules with fewer jobs. INSERT and SWAP, extended from the job-insertion and pair-wise exchange procedures that are fundamental to most heuristics for classical flow shops, reduce the CPU time by 1/2 and 1/3, respectively. The CPU times of 14 job-insertion and pair-wise exchange-based heuristics are analyzed with and without the TCG method. The results show that TCG considerably reduces the CPU time.
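The completion-time computation mentioned above follows the standard permutation flow-shop recursion, extended here (as a sketch under stated assumptions, not the paper's exact formulation) with machine availability times for the generalized model:

```python
def completion_times(p, avail=None):
    """Completion times C[j][k] for jobs in sequence order.
    p[j][k] is the processing time of job j on machine k; avail[k]
    is the time machine k becomes available (all zero in the
    classical model, possibly positive in the generalized model)."""
    n, m = len(p), len(p[0])
    avail = avail or [0] * m
    C = [[0] * m for _ in range(n)]
    for j in range(n):
        for k in range(m):
            ready_machine = C[j - 1][k] if j > 0 else avail[k]
            ready_job = C[j][k - 1] if k > 0 else 0
            C[j][k] = max(ready_machine, ready_job) + p[j][k]
    return C

# Two jobs on two machines, classical vs. machine 1 available at t=4.
classical = completion_times([[3, 2], [2, 4]])
generalized = completion_times([[3, 2], [2, 4]], avail=[0, 4])
```

The only change from the classical recursion is that the "previous job on this machine" term for the first job becomes the machine's availability time, which is what lets classical schedules be re-evaluated in the generalized model.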
Review on use of Swarm Intelligence Meta heuristics in Scheduling of FMS
Directory of Open Access Journals (Sweden)
Hamesh babu Nanvala,
2011-04-01
Full Text Available Due to the high complexity of the Flexible Manufacturing System (FMS) scheduling problem, approaches that guarantee finding the optimal solution are feasible only for small instances of the problem, and then only with a lot of computational effort and time. In contrast, approaches based on meta-heuristics are capable of finding good, near-optimal solutions to problem instances of realistic size in a generally smaller computation time. This work provides a review of the use of swarm intelligence meta-heuristics for scheduling flexible manufacturing systems. The two main areas of swarm intelligence that appear prominently in the literature relevant to this problem are ant colony optimization (ACO) and particle swarm optimization (PSO). We review the literature on the use of swarm intelligence meta-heuristics for the FMS scheduling problem and comment on the basis of this review.
Directory of Open Access Journals (Sweden)
Washington Alves de Oliveira
Full Text Available ABSTRACT In this work we propose a heuristic algorithm for the layout optimization of disks installed in a rotating circular container. This is an unequal circle packing problem with additional balance constraints. It has been proved NP-hard, which justifies heuristic methods for its resolution in larger instances. The main feature of our heuristic is the selection of the next circle to be placed inside the container according to the position of the system's center of mass. Our approach has been tested on a series of instances with up to 55 circles and compared with the literature. Computational results show good performance of the proposed algorithm in terms of solution quality and computational time.
Heuristic algorithm for RCPSP with the objective of minimizing activities' cost
Institute of Scientific and Technical Information of China (English)
Liu Zhenyuan; Wang Hongwei
2006-01-01
The resource-constrained project scheduling problem (RCPSP) is an important problem in research on project management, but little attention has been paid to the objective of minimizing activities' cost under resource constraints, a critical sub-problem in partner selection for construction supply chain management, because the capacities of the renewable resources supplied by the partners affect the project schedule. Its mathematical model is presented first, and analysis of the problem's characteristics shows that the objective function is non-regular and the problem is NP-complete, after which the basic idea for the solution is clarified. Based on a definition of the preposing-activity cost matrix, a heuristic algorithm is put forward. Analysis of the complexity of the heuristic and the results of numerical studies show that the heuristic algorithm is feasible and relatively effective.
A nuclear heuristic for application to metaheuristics in-core fuel management optimization
Energy Technology Data Exchange (ETDEWEB)
Meneses, Anderson Alvarenga de Moura, E-mail: ameneses@lmp.ufrj.b [COPPE/Federal University of Rio de Janeiro, RJ (Brazil). Nuclear Engineering Program; Dalle Molle Institute for Artificial Intelligence (IDSIA), Manno-Lugano, TI (Switzerland); Gambardella, Luca Maria, E-mail: luca@idsia.c [Dalle Molle Institute for Artificial Intelligence (IDSIA), Manno-Lugano, TI (Switzerland); Schirru, Roberto, E-mail: schirru@lmp.ufrj.b [COPPE/Federal University of Rio de Janeiro, RJ (Brazil). Nuclear Engineering Program
2009-07-01
The In-Core Fuel Management Optimization (ICFMO) is a well-known problem of nuclear engineering characterized by complexity, a high number of feasible solutions, and a complex evaluation process with high computational cost; it is thus prohibitive to perform a great number of evaluations during an optimization process. Heuristics are criteria or principles for deciding which among several alternative courses of action is more effective with respect to some goal. In this paper, we propose a new approach to the use of relational heuristics for the search in the ICFMO. The heuristic is based on the reactivity of the fuel assemblies and their positions in the reactor core. It was applied to random search, resulting in less computational effort in terms of the number of evaluations of loading patterns during the search. The experiments demonstrate that it is possible to achieve results comparable to those in the literature, for future application to metaheuristics in the ICFMO. (author)
Directory of Open Access Journals (Sweden)
Suwicha Jirayucharoensak
2014-01-01
Full Text Available Automatic emotion recognition is one of the most challenging tasks. To detect emotion from nonstationary EEG signals, a sophisticated learning algorithm that can represent high-level abstraction is required. This study proposes the utilization of a deep learning network (DLN) to discover unknown feature correlations between input signals that are crucial for the learning task. The DLN is implemented with a stacked autoencoder (SAE) using a hierarchical feature learning approach. Input features of the network are power spectral densities of 32-channel EEG signals from 32 subjects. To alleviate the overfitting problem, principal component analysis (PCA) is applied to extract the most important components of the initial input features. Furthermore, covariate shift adaptation of the principal components is implemented to minimize the nonstationary effect of EEG signals. Experimental results show that the DLN is capable of classifying three different levels of valence and arousal with accuracies of 49.52% and 46.03%, respectively. Principal-component-based covariate shift adaptation enhances the respective classification accuracies by 5.55% and 6.53%. Moreover, the DLN provides better performance compared to SVM and naive Bayes classifiers.
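The PCA preprocessing step described above, projecting the input features onto their leading principal components, can be sketched with a covariance eigendecomposition. This is a generic sketch, not the study's pipeline, and the tiny input matrix stands in for the 32-channel spectral features:

```python
import numpy as np

def pca_project(X, k):
    """Project zero-mean data onto its k leading principal components
    (eigenvectors of the sample covariance matrix)."""
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)        # eigenvalues in ascending order
    lead = np.argsort(vals)[::-1][:k]       # indices of the k largest
    return Xc @ vecs[:, lead]

# Hypothetical 4-sample, 2-feature input standing in for the paper's
# power-spectral-density features.
X = np.array([[1.0, 1.0], [2.0, 2.1], [3.0, 2.9], [4.0, 4.0]])
Z = pca_project(X, 1)
```

The covariate shift adaptation in the study would then re-estimate the statistics of these components over time to track the nonstationarity of the EEG.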
Relationship between the modified due date rule and the heuristic of Wilkerson and Irwin
Directory of Open Access Journals (Sweden)
J.C. Nyirenda
2014-01-01
Full Text Available In this paper, we consider the problem of scheduling N jobs on a single machine to minimise total tardiness. Both the modified due date (MDD) rule and the heuristic of Wilkerson and Irwin (W-I) are very effective in reducing total tardiness. We show that the MDD rule and the W-I heuristic are in fact strongly related, in the sense that both are based on the same local optimality condition for a pair of adjacent jobs, so that a sequence generated by these methods cannot be improved by any further adjacent pairwise interchange.
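The MDD dispatching rule discussed above can be sketched concretely: at each decision point t, schedule next the unscheduled job with the smallest modified due date max(d_j, t + p_j). The following is an illustrative sketch, not the authors' code; tie-breaking by job index is an assumption:

```python
def mdd_schedule(jobs):
    """Sequence jobs on one machine by the Modified Due Date (MDD) rule.

    jobs: list of (processing_time, due_date) tuples.
    At each decision point t, the unscheduled job with the smallest
    modified due date max(d_j, t + p_j) is appended next.
    Returns the job order (as indices) and the total tardiness.
    """
    remaining = list(range(len(jobs)))
    t, order, tardiness = 0, [], 0
    while remaining:
        # pick the job minimizing max(d_j, t + p_j); ties broken by index
        j = min(remaining, key=lambda i: max(jobs[i][1], t + jobs[i][0]))
        remaining.remove(j)
        t += jobs[j][0]
        tardiness += max(0, t - jobs[j][1])
        order.append(j)
    return order, tardiness
```

An adjacent pairwise interchange on the resulting sequence would then test the same local optimality condition that relates the MDD rule to the W-I heuristic.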
Heuristics for Multidimensional Packing Problems
DEFF Research Database (Denmark)
Egeblad, Jens
In this thesis we consider solution methods for packing problems. Packing problems occur in many different situations, both directly in industry and as sub-problems of other problems. High-quality solutions for problems in the industrial sector may reduce transportation and production costs significantly. In a packing problem, in general, we are given a set of items and one or more containers. The items must be placed within the container such that some objective is optimized and the items do not overlap. Items and container may be rectangular or irregular (e.g. polygons and polyhedra) and may be defined in any number of dimensions. Solution methods are based on theory from both computational geometry and operations research. The scientific contributions of this thesis are presented in the form of six papers and a section which introduces the many problem types and recent solution...
Using problem-based learning in web-based components of nurse education.
Crawford, Tonia R
2011-03-01
Problem-based learning (PBL) is a student-centred method of teaching, and is initiated by introducing a clinical problem through which learning is fostered by active inquisition (Tavakol and Reicherter, 2003). Using this teaching and learning strategy for web-based environments is examined from the literature for potential implementation in a Bachelor of Nursing program. In view of the evidence, students accessing online nursing subjects would seem to benefit from web-based PBL as it provides flexibility, opportunities for discussion and co-participation, encourages student autonomy, and allows construction of meaning as the problems mirror the real world. PBL also promotes critical thinking and transfer of theory to practice. It is recommended that some components of practice-based subjects such as Clinical Practice or Community Health Nursing, could be implemented online using a PBL format, which should also include a discussion forum to enable group work for problem-solving activities, and tutor facilitation.
Ryan, Jason C; Banerjee, Ashis Gopal; Cummings, Mary L; Roy, Nicholas
2014-06-01
Planning operations across a number of domains can be considered as resource allocation problems with timing constraints. An unexplored instance of such a problem domain is the aircraft carrier flight deck, where, in current operations, replanning is done without the aid of any computerized decision support. Rather, veteran operators employ a set of experience-based heuristics to quickly generate new operating schedules. These expert user heuristics are neither codified nor evaluated by the United States Navy; they have grown solely from the convergent experiences of supervisory staff. As unmanned aerial vehicles (UAVs) are introduced in the aircraft carrier domain, these heuristics may require alterations due to differing capabilities. The inclusion of UAVs also allows for new opportunities for on-line planning and control, providing an alternative to the current heuristic-based replanning methodology. To investigate these issues formally, we have developed a decision support system for flight deck operations that utilizes a conventional integer linear program-based planning algorithm. In this system, a human operator sets both the goals and constraints for the algorithm, which then returns a proposed schedule for operator approval. As a part of validating this system, the performance of this collaborative human-automation planner was compared with that of the expert user heuristics over a set of test scenarios. The resulting analysis shows that human heuristics often outperform the plans produced by an optimization algorithm, but are also often more conservative.
Heuristic space diversity management in a meta-hyper-heuristic framework
CSIR Research Space (South Africa)
Grobler, J
2014-07-01
Full Text Available Heuristic Space Diversity Management in a Meta-Hyper-Heuristic Framework. Jacomine Grobler (Department of Industrial and Systems Engineering, University of Pretoria, and Council for Scientific and Industrial Research; jacomine.grobler@gmail.com) and Andries P. Engelbrecht.
Research on the Component-based Software Architecture
Institute of Scientific and Technical Information of China (English)
Anonymous
2002-01-01
Computer software has been becoming more and more complex with the development of hardware. Thus, how to efficiently develop extensible, maintainable and adaptable software has become an urgent problem. The component-based software development technique is a better method to solve this problem. In this paper, we first discuss the concept, description method and some familiar styles of software architecture, and then analyze the merits of using the software architecture to guide the software developm...
A Component-Based Software Configuration Management Model and Its Supporting System
Institute of Scientific and Technical Information of China (English)
梅宏; 张路; 杨芙清
2002-01-01
Software configuration management (SCM) is an important key technology in software development. Component-based software development (CBSD) is an emerging paradigm in software development. However, to apply CBSD effectively in real-world practice, supporting SCM in CBSD needs to be further investigated. In this paper, the objects that need to be managed in CBSD are analyzed and a component-based SCM model is presented. In this model, components, as the integral logical constituents in a system, are managed as the basic configuration items in SCM, and the relationships between/among components are defined and maintained. Based on this model, a configuration management system is implemented.
A heuristic approach to automated nipple detection in digital mammograms.
Jas, Mainak; Mukhopadhyay, Sudipta; Chakraborty, Jayasree; Sadhu, Anup; Khandelwal, Niranjan
2013-10-01
In this paper, a heuristic approach to automated nipple detection in digital mammograms is presented. A multithresholding algorithm is first applied to segment the mammogram and separate the breast region from the background region. Next, the problem is considered separately for craniocaudal (CC) and mediolateral-oblique (MLO) views. In the simplified algorithm, a search is performed on the segmented image along a band around the centroid and in a direction perpendicular to the pectoral muscle edge in the MLO view image. The direction defaults to the horizontal (perpendicular to the thoracic wall) in case of CC view images. The farthest pixel from the base found in this direction can be approximated as the nipple point. Further, an improved version of the simplified algorithm is proposed which can be considered as a subclass of the Branch and Bound algorithms. The mean Euclidean distance between the ground truth and calculated nipple position for 500 mammograms from the Digital Database for Screening Mammography (DDSM) database was found to be 11.03 mm and the average total time taken by the algorithm was 0.79 s. Results of the proposed algorithm demonstrate that even simple heuristics can achieve the desired result in nipple detection thus reducing the time and computational complexity.
A Heuristic Task Scheduling Algorithm for Heterogeneous Virtual Clusters
Directory of Open Access Journals (Sweden)
Weiwei Lin
2016-01-01
Full Text Available Cloud computing provides on-demand computing and storage services with high performance and high scalability. However, the rising energy consumption of cloud data centers has become a prominent problem. In this paper, we first introduce an energy-aware framework for task scheduling in virtual clusters. The framework consists of a task resource requirements prediction module, an energy estimation module, and a scheduler with a task buffer. Secondly, based on this framework, we propose a virtual machine power efficiency-aware greedy scheduling algorithm (VPEGS). As a heuristic algorithm, VPEGS estimates task energy by considering factors including task resource demands, VM power efficiency, and server workload before scheduling tasks in a greedy manner. We simulated a heterogeneous VM cluster and conducted experiments to evaluate the effectiveness of VPEGS. Simulation results show that VPEGS effectively reduced total energy consumption by more than 20% without producing large scheduling overheads. With a similar heuristic ideology, it outperformed Min-Min and RASA with respect to energy saving by about 29% and 28%, respectively.
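A minimal sketch of the greedy, power-efficiency-aware idea behind VPEGS follows. This is not the published algorithm; the energy estimate (demand divided by efficiency) and the load-penalty weight are illustrative assumptions:

```python
def greedy_energy_schedule(tasks, vms):
    """Greedily assign each task to the VM with the lowest estimated energy.

    tasks: list of resource demands (arbitrary work units).
    vms: list of dicts with 'efficiency' (work units per joule) and 'load'.
    Energy estimate (illustrative): demand / efficiency, plus a small
    penalty proportional to the VM's current load, mirroring VPEGS's use
    of VM power efficiency and server workload as scheduling factors.
    Returns a list of (task_index, vm_index) assignments.
    """
    assignment = []
    for ti, demand in enumerate(tasks):
        def energy(vi):
            vm = vms[vi]
            return demand / vm["efficiency"] + 0.1 * vm["load"]
        vi = min(range(len(vms)), key=energy)  # greedy: cheapest VM now
        vms[vi]["load"] += demand
        assignment.append((ti, vi))
    return assignment
```

The load penalty is what keeps a highly efficient VM from absorbing every task once it is saturated.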
Optimal Rapid Restart of Heuristic Methods of NP Hard Problems
Institute of Scientific and Technical Information of China (English)
侯越先; 王芳
2004-01-01
Many heuristic search methods exhibit remarkable variability in the time required to solve particular problem instances. Their cost distributions are often heavy-tailed. It has been demonstrated that, in most cases, the rapid restart (RR) method can prominently suppress the heavy-tailed nature of the instances and improve computational efficiency. However, it is usually time-consuming to check whether an algorithm on a specific instance is heavy-tailed or not. Moreover, if the heavy-tailed distribution is confirmed and the RR method is relevant, an optimal RR threshold should be chosen to facilitate the RR mechanism. In this paper, an approximate approach is proposed to quickly check whether an algorithm on a specific instance is heavy-tailed or not. The method is realized by calculating the maximal Lyapunov exponent of the algorithm's generic running trace. A statistical formula to estimate the optimal RR threshold is then derived. The method is based on common nonparametric estimation, e.g., kernel estimation. Two heuristic methods are selected to verify our method. The experimental results are perfectly consistent with the theoretical considerations.
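The restart mechanism that the RR threshold governs can be sketched as a generic wrapper. This is an illustrative sketch only; the `solve_once` interface and the seeding scheme are assumptions, and picking a good `threshold` is exactly what the paper's statistical formula addresses:

```python
import random

def rapid_restart(solve_once, threshold, max_restarts, seed=0):
    """Restart a randomized solver whenever it exceeds a step budget.

    solve_once(rng, threshold) should return a solution, or None if the
    step budget 'threshold' was exhausted. Restarting with fresh random
    seeds truncates the heavy tail of the runtime distribution: a run
    that got stuck is abandoned instead of being allowed to run forever.
    Returns (solution, number_of_restarts_used), or (None, max_restarts).
    """
    for r in range(max_restarts):
        rng = random.Random(seed + r)  # fresh randomness per restart
        result = solve_once(rng, threshold)
        if result is not None:
            return result, r
    return None, max_restarts
```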
Theoretical Analysis of Heuristic Search Methods for Online POMDPs.
Ross, Stéphane; Pineau, Joelle; Chaib-Draa, Brahim
2008-01-01
Planning in partially observable environments remains a challenging problem, despite significant recent advances in offline approximation techniques. A few online methods have also been proposed recently, and proven to be remarkably scalable, but without the theoretical guarantees of their offline counterparts. Thus it seems natural to try to unify offline and online techniques, preserving the theoretical properties of the former, and exploiting the scalability of the latter. In this paper, we provide theoretical guarantees on an anytime algorithm for POMDPs which aims to reduce the error made by approximate offline value iteration algorithms through the use of an efficient online searching procedure. The algorithm uses search heuristics based on an error analysis of lookahead search, to guide the online search towards reachable beliefs with the most potential to reduce error. We provide a general theorem showing that these search heuristics are admissible, and lead to complete and ε-optimal algorithms. This is, to the best of our knowledge, the strongest theoretical result available for online POMDP solution methods. We also provide empirical evidence showing that our approach is also practical, and can find (provably) near-optimal solutions in reasonable time.
Identifying multiple influential spreaders by a heuristic clustering algorithm
Energy Technology Data Exchange (ETDEWEB)
Bao, Zhong-Kui [School of Mathematical Science, Anhui University, Hefei 230601 (China); Liu, Jian-Guo [Data Science and Cloud Service Research Center, Shanghai University of Finance and Economics, Shanghai, 200133 (China); Zhang, Hai-Feng, E-mail: haifengzhang1978@gmail.com [School of Mathematical Science, Anhui University, Hefei 230601 (China); Department of Communication Engineering, North University of China, Taiyuan, Shan' xi 030051 (China)
2017-03-18
The problem of influence maximization in social networks has attracted much attention. However, traditional centrality indices are suitable only for the case where a single spreader is chosen as the spreading source. Often, the spreading process is initiated by simultaneously choosing multiple nodes as the spreading sources. In this situation, choosing the top-ranked nodes as multiple spreaders is not an optimal strategy, since the chosen nodes are not sufficiently scattered in the network. The ideal situation for the multiple-spreaders case is therefore that the spreaders themselves are not only influential but also dispersively distributed in the network, though it is difficult to meet both conditions together. In this paper, we propose a heuristic clustering (HC) algorithm based on a similarity index to classify nodes into different clusters; the center nodes of the clusters are then chosen as the multiple spreaders. The HC algorithm not only ensures that the multiple spreaders are dispersively distributed in the network but also avoids selecting nodes that are “negligible”. Compared with traditional methods, our experimental results on synthetic and real networks indicate that the performance of the HC method on influence maximization is more significant. - Highlights: • A heuristic clustering algorithm is proposed to identify the multiple influential spreaders in complex networks. • The algorithm not only guarantees that the selected spreaders are sufficiently scattered but also avoids selecting “insignificant” nodes. • The performance of our algorithm is generally better than that of other methods, on both real and synthetic networks.
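The two goals stated above, influential yet dispersed spreaders, can be illustrated with a toy selection heuristic. This mimics the intent of the HC algorithm, not its actual similarity-based clustering procedure:

```python
def multi_spreaders(adj, k):
    """Pick k dispersed, high-degree spreader nodes (illustrative only).

    adj: dict mapping node -> set of neighbours (undirected graph).
    Repeatedly take the highest-degree node that is neither an already
    chosen spreader nor adjacent to one. Blocking neighbours keeps the
    chosen spreaders scattered across the network instead of clustered
    around the same hub.
    """
    chosen = []
    blocked = set()
    for node in sorted(adj, key=lambda n: len(adj[n]), reverse=True):
        if node in blocked:
            continue
        chosen.append(node)
        blocked.add(node)
        blocked |= adj[node]  # exclude direct neighbours to stay dispersed
        if len(chosen) == k:
            break
    return chosen
```

On a graph with two loosely connected communities, this picks one central node per community rather than the two top-degree nodes from the same community.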
Heuristic RNA pseudoknot prediction including intramolecular kissing hairpins
Sperschneider, Jana; Datta, Amitava; Wise, Michael J.
2011-01-01
Pseudoknots are an essential feature of RNA tertiary structures. Simple H-type pseudoknots have been studied extensively in terms of biological functions, computational prediction, and energy models. Intramolecular kissing hairpins are a more complex and biologically important type of pseudoknot in which two hairpin loops form base pairs. They are hard to predict using free energy minimization due to high computational requirements. Heuristic methods that allow arbitrary pseudoknots strongly depend on the quality of energy parameters, which are not yet available for complex pseudoknots. We present an extension of the heuristic pseudoknot prediction algorithm DotKnot, which covers H-type pseudoknots and intramolecular kissing hairpins. Our framework allows for easy integration of advanced H-type pseudoknot energy models. For a test set of RNA sequences containing kissing hairpins and other types of pseudoknot structures, DotKnot outperforms competing methods from the literature. DotKnot is available as a web server under http://dotknot.csse.uwa.edu.au. PMID:21098139
Matija Novak; Ivan Švogor
2016-01-01
Component-based software development has become a very popular paradigm in many software engineering branches. In the early phase of Web 2.0's appearance, it was also popular for web application development. The analyzed papers show that, between that period and today, the use of component-based techniques for web application development slowed down somewhat; however, recent developments indicate a comeback, most of all apparent in the W3C's web components working group. In this article we wa...
A Heuristic Algorithm for Resource Allocation/Reallocation Problem
Directory of Open Access Journals (Sweden)
S. Raja Balachandar
2011-01-01
Full Text Available This paper presents a 1-opt heuristic approach to solve the resource allocation/reallocation problem known as the 0/1 multichoice multidimensional knapsack problem (MMKP). The intercept matrix of the constraints is employed to find optimal or near-optimal solutions of the MMKP. This heuristic approach is tested on 33 benchmark problems taken from the OR-Library, of sizes up to 7000, and the results have been compared with optimum solutions. The computational complexity of solving the MMKP heuristically using this approach is proved to be (2. The performance of our heuristic is compared with the best state-of-the-art heuristic algorithms with respect to the quality of the solutions found. The encouraging results, especially for relatively large-size test problems, indicate that this heuristic approach can successfully be used for finding good solutions for highly constrained NP-hard problems.
A rescheduling heuristic for the single machine total tardiness problem
Directory of Open Access Journals (Sweden)
JC Nyirenda
2006-06-01
Full Text Available In this paper, we propose a rescheduling heuristic for scheduling N jobs on a single machine in order to minimise total tardiness. The heuristic is of the interchange type and constructs a schedule from the modified due date (MDD) schedule. Unlike most interchange heuristics, which consider interchanges involving only two jobs at a time, the newly proposed heuristic uses interchanges that may involve more than two jobs at any one time. Experimental results show that the heuristic is effective at reducing total tardiness, producing schedules that are either similar to or better than those produced by the MDD rule alone. Furthermore, when applied to a set of test problems, the heuristic found optimal schedules for all of them.
Knowledge Based Components of Expertise in Medical Diagnosis.
1981-09-01
PAPVC. In fact, one variant of PAPVC, "scimitar syndrome", derives its name from its presentation of such a finding on x-ray (Lucas & Schmidt, 1977). ... just sort of guessing right now. I would say just Scimitar Syndrome (PAPVC), primarily based on the chest x-ray, and ah, I'm not really sure whether...
Directory of Open Access Journals (Sweden)
Jana Matrková
Full Text Available BACKGROUND: Carotenoid plumage is of widespread use in bird communication. Carotenoid-based feather colouration has recently been shown to be dependent on both pigment concentration and feather structure. If these two components are determined differently, one plumage patch may potentially convey different aspects of individual quality. METHODOLOGY/PRINCIPAL FINDINGS: We evaluated the effects of genetic and environmental factors on carotenoid-based yellow breast colouration of Great Tit (Parus major) nestlings. By partial cross-fostering, we separated the genetic and pre-natal vs. post-natal parental effects on both the structural and the pigment-based component of carotenoid-based plumage colouration. We also simultaneously manipulated the post-hatching environment by brood size manipulation. The structural component of nestling colouration reflected features of female colouration. On the other hand, the pigment-based component was more affected by rearing conditions presumably representing food quality. While the structural component was related to both origin- and environment-related factors, the pigment-based component seemed to be environment-dependent only. These results support the notion that pigment-based and structural components of feather colouration are determined differently. CONCLUSIONS/SIGNIFICANCE: Chromatic and achromatic components of carotenoid-based feather colouration reflected different aspects of individual quality and history, and thus may potentially form a multicomponent signal.
Complex Chemical Reaction Networks from Heuristics-Aided Quantum Chemistry
Rappoport, Dmitrij; Galvin, Cooper J.; Zubarev, Dmitry; Aspuru-Guzik, Alan
2014-01-01
While structures and reactivities of many small molecules can be computed efficiently and accurately using quantum chemical methods, heuristic approaches remain essential for modeling complex structures and large-scale chemical systems. Here, we present a heuristics-aided quantum chemical methodology applicable to complex chemical reaction networks such as those arising in cell metabolism and prebiotic chemistry. Chemical heuristics offer an expedient way of traversing high-dimensional reacti...
A Heuristic Approach for Aeromedical Evacuation System scheduling and Routing
1988-12-16
can be used to assign AES aircraft to the six medical regions. Additionally, a heuristic algorithm is developed and applied to the AES in order to...These works were used to form the foundation for developing a heuristic algorithm that can be applied to the AES, or to other systems in which vehicles...his destination. Another possible research effort involving the daily routing problem involves formulating a heuristic algorithm that quickly and
Cobweb Heuristic for solving Multi-Objective Vehicle Routing Problem
Okitonyumbe Y.F., Joseph; Ulungu, Berthold E.-L.; Kapiamba Nt., Joel
2015-01-01
Abstract: Solving a classical vehicle routing problem (VRP) by exact methods presents many difficulties for large-dimension problems. Consequently, in a multi-objective framework, heuristic or metaheuristic methods are required. Due to the particular VRP structure, it seems that a dedicated heuristic is more suitable than a metaheuristic. The aim of this article is to combine different heuristics for solving the classical VRP and adapt them to solve the multi-objective vehicle routing problem (MOVRP)...
High performance coated board inspection system based on commercial components
Barjaktarovic, M; Radunovic, J
2007-01-01
This paper presents a vision system for defect (fault) detection on coated board, developed using three industrial FireWire cameras and a PC. The application for image processing and system control was realized with the LabVIEW software package. The defect detection software is based on a variation of an image segmentation algorithm; the standard steps in image segmentation are modified to match the characteristics of the defects. Software optimization was accomplished using the SIMD (Single Instruction Multiple Data) technology available in Intel Pentium 4 processors, which provided real-time inspection capability. The system provides benefits such as improvement of the production process, higher quality of the delivered coated board, and reduction of waste. This was proven during successful operation of the system for more than a year.
Phase Change-based Fixturing for Arbitrarily Shaped Components
Institute of Scientific and Technical Information of China (English)
LI Bei-zhi; YANG Jian-guo; ZHOU Li-bing; XIANG Qian
2002-01-01
Issues in the industrialization of RFPE (Reference Free Part Encapsulation) are discussed in this paper. A new method, an adaptable location system (ATLS), is presented. The ATLS consists of an array of pins which are controlled manually or automatically by an actuator; the actuation force comes from a shape memory alloy (SMA). The material properties of the filler are very important for RFPE. Experiments have shown that machining error can be reduced by using conservative cutting parameters. Based on finite element analysis, the relationship between the deformation of the workpiece, the filler and the machining parameters has been obtained. A new approach, partial cage with active side wall (PCASW), allows machine tools to easily access any feature of the workpiece from different directions. It is convenient for every new setup.
An Approach to Protein Name Extraction Using Heuristics and a Dictionary.
Seki, Kazuhiro; Mostafa, Javed
2003-01-01
Proposes a method for protein name extraction from biological texts. The method exploits hand-crafted rules based on heuristics and a set of protein names (dictionary). The approach avoids use of natural language processing tools so as to improve processing speed. Evaluation experiments were conducted in terms of: accuracy, generalizability, and…
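The dictionary-plus-rules idea can be illustrated with a toy sketch. The single regular-expression rule here is a hypothetical stand-in for the paper's hand-crafted heuristics, and the example names are invented:

```python
import re

def extract_protein_names(text, dictionary):
    """Toy sketch of dictionary + heuristic-rule protein name extraction.

    dictionary: a set of known protein names (matched as exact substrings).
    The one surface rule used here (letters optionally followed by a
    hyphen and digits, e.g. 'p53' or 'IL-2') is an illustrative stand-in
    for the paper's hand-crafted rules; no NLP tools are used, which is
    what keeps this style of approach fast.
    """
    found = {name for name in dictionary if name in text}
    # heuristic rule: alphanumeric tokens ending in digits look protein-like
    found.update(re.findall(r"\b[A-Za-z]+-?\d+\b", text))
    return found
```

The rule component catches names absent from the dictionary, which is how such hybrid systems improve generalizability.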
Dafu, Shen; Leihong, Zhang; Dong, Liang; Bei, Li; Yi, Kang
2017-07-01
The purpose of this study is to improve the reconstruction precision and better copy the color of spectral image surfaces. A new spectral reflectance reconstruction algorithm based on an iterative threshold combined with weighted principal component space is presented in this paper, and the principal component with weighted visual features is the sparse basis. Different numbers of color cards are selected as the training samples, a multispectral image is the testing sample, and the color differences in the reconstructions are compared. The channel response value is obtained by a Mega Vision high-accuracy, multi-channel imaging system. The results show that spectral reconstruction based on weighted principal component space is superior in performance to that based on traditional principal component space. Therefore, the color difference obtained using the compressive-sensing algorithm with weighted principal component analysis is less than that obtained using the algorithm with traditional principal component analysis, and better reconstructed color consistency with human eye vision is achieved.
Runtime analysis of search heuristics on software engineering problems
Institute of Scientific and Technical Information of China (English)
Per Kristian LEHRE; Xin YAO
2009-01-01
Many software engineering tasks can potentially be automated using search heuristics. However, much work is needed in designing and evaluating search heuristics before this approach can be routinely applied to a software engineering problem. Experimental methodology should be complemented with theoretical analysis to achieve this goal. Recently, there have been significant theoretical advances in the runtime analysis of evolutionary algorithms (EAs) and other search heuristics in other problem domains. We suggest that these methods could be transferred and adapted to gain insight into the behaviour of search heuristics on software engineering problems while automating software engineering.
A Graph Search Heuristic for Shortest Distance Paths
Energy Technology Data Exchange (ETDEWEB)
Chow, E
2005-03-24
This paper presents a heuristic for guiding A* search for finding the shortest distance path between two vertices in a connected, undirected, and explicitly stored graph. The heuristic requires a small amount of data to be stored at each vertex. The heuristic has application to quickly detecting relationships between two vertices in a large information or knowledge network. We compare the performance of this heuristic with breadth-first search on graphs with various topological properties. The results show that one or more orders of magnitude improvement in the number of vertices expanded is possible for large graphs, including Poisson random graphs.
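One standard way to realize such a heuristic with "a small amount of data stored at each vertex" is the landmark (ALT-style) construction: precompute each vertex's distance to a fixed landmark L; then h(v) = |d(v, L) − d(goal, L)| is admissible by the triangle inequality. The sketch below uses that assumption and is not necessarily the paper's exact heuristic:

```python
import heapq

def astar(adj, h, start, goal):
    """A* over an undirected graph with unit edge weights.

    adj: dict node -> iterable of neighbours; h: admissible heuristic.
    Returns the shortest distance from start to goal (None if unreachable).
    """
    dist = {start: 0}
    pq = [(h(start), start)]
    while pq:
        f, u = heapq.heappop(pq)
        if u == goal:
            return dist[u]
        if f > dist[u] + h(u):  # stale queue entry; a cheaper path was found
            continue
        for v in adj[u]:
            nd = dist[u] + 1
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd + h(v), v))
    return None

def landmark_heuristic(dist_to_landmark, goal):
    """h(v) = |d(v, L) - d(goal, L)| is admissible by the triangle inequality."""
    return lambda v: abs(dist_to_landmark[v] - dist_to_landmark[goal])
```

With an informative landmark, vertices leading away from the goal get larger f-values and are expanded less often than under plain breadth-first search, which is the effect the paper measures.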
A Heuristic Molecular Model of Hydrophobic Interactions
Hummer, G; García, A E; Pohorille, A; Pratt, L R
1995-01-01
Hydrophobic interactions provide driving forces for protein folding, membrane formation, and oil-water separation. Motivated by information theory, we investigate the poorly understood interactions of nonpolar solutes in water. A simple heuristic model of hydrophobic effects in terms of density fluctuations is developed. This model accounts quantitatively for the central hydrophobic phenomena of cavity formation and association of inert gas solutes; it therefore clarifies the underlying physics of hydrophobic effects and permits important applications to conformational equilibria of nonpolar solutes and hydrophobic residues in biopolymers.
A Generalized Assignment Heuristic for Vehicle Routing
1979-08-01
1977), 517-524. 20. Shuster, K. A. and D. A. Schur, "A Heuristic Approach to Routing Solid Waste Collection Vehicles," U.S. Environmental Protection... this problem by (VRP). Formulation of the Vehicle Routing Problem (VRP): min Σ_{i,j,k} c_{ij} x_{ijk} (1), s.t. Σ_i a_i y_{ik} ≤ b_k, k = 1, ..., K (2), ... developing a sophisticated solution theory for the traveling salesman and generalized assignment models embedded within (VRP). By contrast
Paraphrase Identification using Semantic Heuristic Features
Directory of Open Access Journals (Sweden)
Zia Ul-Qayyum
2012-11-01
Full Text Available The Paraphrase Identification (PI) problem is to classify whether or not two sentences are close enough in meaning to be termed paraphrases. PI is an important research dimension with practical applications in Information Extraction (IE), Machine Translation, Information Retrieval, Automatic Identification of Copyright Infringement, Question Answering Systems and Intelligent Tutoring Systems, to name a few. This study presents a novel approach to paraphrase identification using semantic heuristic features, with the aim of improving accuracy over state-of-the-art PI systems. Finally, a comprehensive critical analysis of misclassifications is carried out to provide insightful evidence about the proposed approach and the corpora used in the experiments.
Heuristics for container loading of furniture
DEFF Research Database (Denmark)
Egeblad, Jens; Garavelli, Claudio; Lisi, Stefano
2010-01-01
We consider a container loading problem that occurs at a typical furniture manufacturer. Each furniture item has an associated profit. Given container dimensions and a set of furniture items, the problem is to determine a subset of items with maximal profit sum that is loadable in the container. In the studied company, the problem arises hundreds of times daily during transport planning. Instances may contain more than one hundred different items with irregular shapes. To solve this complex problem we apply a set of heuristics successively that each solve one part of the problem. Large items...
Rapid heuristic projection on simplicial cones
Ekárt, A; Németh, S Z
2010-01-01
A very fast heuristic iterative method of projection on simplicial cones is presented. It consists in solving two linear systems at each step of the iteration. Extensive experiments indicate that the method furnishes the exact solution in more than 99.7 percent of the cases. The average number of steps is 5.67 (we have not found any example which required more than 13 steps), and the relative number of steps with respect to the dimension decreases dramatically. Roughly speaking, for high enough dimensions the absolute number of steps is independent of the dimension.
Condition Based Monitoring of Gas Turbine Combustion Components
Energy Technology Data Exchange (ETDEWEB)
Ulerich, Nancy; Kidane, Getnet; Spiegelberg, Christine; Tevs, Nikolai
2012-09-30
The objective of this program is to develop sensors that allow condition-based monitoring of critical combustion parts of gas turbines. Siemens teamed with innovative, small companies that were developing sensor concepts that could monitor wearing and cracking of hot turbine parts. A magnetic crack monitoring sensor concept developed by JENTEK Sensors, Inc. was evaluated in laboratory tests. Designs for engine application were evaluated. The inability to develop a robust lead wire to transmit the signal long distances resulted in a discontinuation of this concept. An optical wear sensor concept proposed by K Sciences GP, LLC was tested in proof-of-concept testing. The sensor concept depended, however, on optical fiber tips wearing with the loaded part. The fiber tip wear resulted in too much optical input variability; the sensor could not provide adequate stability for measurement. Siemens developed an alternative optical wear sensor approach that used a commercial PHILTEC, Inc. optical gap sensor with an optical spacer to remove fibers from the wearing surface. The gap sensor measured the length of the wearing spacer to follow loaded-part wear. This optical wear sensor was developed to a Technology Readiness Level (TRL) of 5. It was validated in lab tests and installed on a floating transition seal in an F-Class gas turbine. Laboratory tests indicate that the concept can measure wear on loaded parts at temperatures up to 800°C with uncertainty of < 0.3 mm. Testing in an F-Class engine installation showed that the optical spacer wore with the wearing part. The electro-optics box located outside the engine enclosure survived the engine enclosure environment. The fiber optic cable and the optical spacer, however, both degraded after about 100 operating hours, impacting the signal analysis.
Case-Based Reasoning Topological Complexity Calculation of Design for Components
Institute of Scientific and Technical Information of China (English)
(no author listed)
2001-01-01
Directly calculating topological and geometric complexity from a STEP file (Standard for the Exchange of Product model data, ISO 10303) is a huge task. A case-based reasoning approach is therefore presented that calculates the topological and geometric complexity of a new component from its similarity to previously analysed components. To support indexing and retrieval in a historical component database, a new component representation is introduced, and an algorithm is given to extract a component's topological graph from its STEP file. A mathematical model describing how similarity is compared is then discussed. Finally, an example illustrates the result.
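A minimal sketch of the case-based reasoning idea described above: each stored case pairs a coarse topological signature (here, counts of faces, edges, and vertices extracted from a STEP model) with its known complexity score, and a new component reuses the score of its most similar case. The feature set, similarity measure, and scores below are illustrative assumptions, not the paper's actual model.

```python
# Case-based reasoning sketch: retrieve the most similar historical component
# and reuse its complexity score. Signatures and scores are illustrative.
from math import dist

case_base = [
    # (face_count, edge_count, vertex_count) -> known complexity score
    ((6, 12, 8), 1.0),    # block-like part
    ((7, 15, 10), 1.4),   # block with one slot
    ((20, 48, 30), 3.8),  # heavily featured part
]

def similarity(a, b):
    """Similarity in (0, 1]: inverse of the Euclidean distance between signatures."""
    return 1.0 / (1.0 + dist(a, b))

def estimate_complexity(signature):
    """Retrieve the nearest stored case and reuse its complexity score."""
    best = max(case_base, key=lambda case: similarity(signature, case[0]))
    return best[1]
```

In practice the paper compares topological graphs rather than plain count vectors, but the retrieve-and-reuse loop has the same shape.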
Methodology and Implementation on DSP of Heuristic Multiuser DS/CDMA Detectors
Directory of Open Access Journals (Sweden)
Alex Miyamoto Mussi
2010-12-01
Full Text Available The growing number of mobile network users and the scarcity of the electromagnetic spectrum give diversity and efficient detection/decoding techniques, such as multiple antennas at the transmitter and/or receiver and multiuser detection (MuD), an increasingly prominent role in telecommunications. This paper presents a design methodology based on digital signal processors (DSPs) for implementing heuristic multiuser detectors in DS/CDMA (Direct Sequence Code Division Multiple Access) systems. Heuristic detection techniques achieve near-optimal performance, approaching that of maximum-likelihood (ML) detection. This work employed the C6713 DSK development platform, based on the Texas Instruments TMS320C6713 processor; the heuristic techniques implemented are based on well-established algorithms from the literature. The efficiency of the algorithms implemented on the DSP was evaluated numerically by measuring the bit error rate (BER), and the feasibility of the DSP implementation was verified by comparing results from Monte Carlo simulations in Matlab with those obtained on the DSP. The results also demonstrate the effective increase in performance and capacity of DS/CDMA systems when heuristic multiuser detection is implemented directly on the DSP.
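As a hedged illustration of the kind of heuristic multiuser detector the paper evaluates, the sketch below runs a 1-opt local search over bit vectors: it starts from the conventional (matched-filter) decision and greedily flips bits that improve the ML metric of a synchronous DS/CDMA model y = R·b + n, where R is the user cross-correlation matrix. The algorithm choice and all parameters are illustrative, not the authors' exact implementation.

```python
# Local-search heuristic multiuser detector for synchronous DS/CDMA.
# Model: y = R @ b + n, with R the cross-correlation matrix of the users'
# spreading codes and b the vector of transmitted +/-1 bits.
import numpy as np

def ml_metric(b, R, y):
    """ML log-likelihood up to constants: larger is better."""
    return 2.0 * b @ y - b @ R @ b

def local_search_detector(y, R, max_sweeps=10):
    b = np.sign(y)            # conventional matched-filter decision as start
    b[b == 0] = 1.0
    for _ in range(max_sweeps):
        improved = False
        for k in range(len(b)):
            cand = b.copy()
            cand[k] = -cand[k]  # try flipping user k's bit
            if ml_metric(cand, R, y) > ml_metric(b, R, y):
                b, improved = cand, True
        if not improved:        # local optimum reached
            break
    return b

# Noiseless check: three users with pairwise cross-correlation 0.2.
R = np.array([[1.0, 0.2, 0.2], [0.2, 1.0, 0.2], [0.2, 0.2, 1.0]])
b_true = np.array([1.0, -1.0, 1.0])
detected = local_search_detector(R @ b_true, R)
```

A BER curve would then be obtained, as in the paper's methodology, by repeating such detections over many Monte Carlo noise realizations and counting bit errors against the transmitted vector.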
Research and Implementation of Distributed Virtual Simulation Platform Based on Components
Institute of Scientific and Technical Information of China (English)
SUN Zhi-xin; WANG Ru-chuan; WANG Shao-di
2004-01-01
This paper proposes combining systems-theoretic simulation methodology with virtual reality technology as the basis for a powerful component-based virtual simulation framework. The resulting universal framework can be used in different fields, such as driver training and air-combat training. After briefly introducing the concepts and principles of distributed component objects, the paper describes a component-based software development method. A method of component-based virtual simulation system modeling is then proposed, and the integrated framework supporting distributed virtual simulation, together with its key technologies, is discussed at length. Our experiments indicate that the framework can be widely used in simulation fields such as weapons antagonism and driving simulation.
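The core of a component-based simulation framework like the one described can be sketched as a common component interface plus a host that registers components and steps them together each tick. All class and method names below are illustrative assumptions, not the paper's API, and the single-process loop only mimics the tick a distributed-component framework would broadcast.

```python
# Minimal component-based simulation sketch: entities implement a shared
# interface, register with a framework, and advance together each tick.

class SimulationComponent:
    """Common interface every pluggable simulation component implements."""
    def step(self, dt: float) -> None:
        raise NotImplementedError

class Vehicle(SimulationComponent):
    """Example component, e.g. for a driving-simulation scenario."""
    def __init__(self, speed: float):
        self.speed, self.position = speed, 0.0
    def step(self, dt: float) -> None:
        self.position += self.speed * dt

class SimulationFramework:
    """Hosts components and advances them in lockstep each tick."""
    def __init__(self):
        self.components = []
    def register(self, component: SimulationComponent) -> None:
        self.components.append(component)
    def run(self, steps: int, dt: float) -> None:
        for _ in range(steps):
            for c in self.components:
                c.step(dt)

framework = SimulationFramework()
car = Vehicle(speed=2.0)
framework.register(car)
framework.run(steps=5, dt=0.1)   # advances car.position by 2.0 * 0.5
```

Because every component is addressed only through the shared interface, new domains (driver training, combat simulation) plug in by adding component classes, without changing the framework's loop.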